A new year, a new DNS provider: A review of DNS Made Easy

As mentioned previously, on December 31, 2013 my previous DNS provider (Zerigo) decided to change their entire business model and raise prices for me and many (probably all) of their customers, a move they try to defend with this cute little note: http://www.zerigo.com/news/notice-zerigo-dns-change-of-plans

As I had not slept much that day I was somewhat impulsive, and instead of attempting to comprehend Zerigo’s logic I began searching for a different DNS provider. My first thought was to promote my backup DNS provider, XpertDNS, to primary. One obvious flaw is that this would leave me without a secondary DNS provider. It also became apparent, on reviewing XpertDNS’s control panel more closely, that it isn’t very user friendly, which mattered since I had only a few days to transfer my domains to a new service before the new semester began.

After eliminating my current providers as options I performed a web search for premium DNS providers that offered vanity nameservers as well as secondary master (AXFR) options for redundancy. One of the top results was DNS Made Easy. The name was familiar: last year, when there were problems with Zerigo, DNS Made Easy responded to my tweets with an invitation to join them, which was followed by an encouraging tweet from one of their enthusiastic clients.

Going into the search I expected that I would most likely have to re-enter all of my hosts manually and move my domains over one at a time, which would probably mean setting up my nameservers on another domain along the way and switching each domain over as I got it entered. I also expected that I would have to commit to a provider before I could learn much about them. DNS Made Easy surprised me on all of these points. First, I created an account for free, with the ability to set up 3 domains for 30 days at no charge. I began by transferring a trivial secondary domain (pcfire.org) to their service, and I was very impressed that as soon as I started the process I was asked if I would like to use AXFR to transfer the records for the domain. Umm… yeah, sure. I had never quite thought of AXFR being used that way before, but it’s a very cool idea. In just a few seconds the domain was on the account and fully set up.

The next step was establishing vanity nameservers to test that functionality, so I created them on the domain itself. It was a little difficult at first because every provider has its own model for how vanity servers should work; DNS Made Easy treats it as a template process, so I set up the template. One of the coolest things about DME is that the account panel links to the help documents for any feature you haven’t used yet, so getting into the vanity DNS features was pretty easy. I love that the thing adapts. One thing that caught me off guard is that even with the vanity server configuration the SOA (Start of Authority) record still has to be changed, though this too can be done with a template.

Speaking of templates, that is another feature of DME that is actually useful. Zerigo had templates, but they were difficult to use and didn’t seem all that helpful in managing domains. On DME it is possible to template as much or as little of a domain’s configuration as is needed.

After only about 2 hours of work (spread over about 8 hours, giving myself time to learn and adapt to the DME panel) I was able to move the configuration for 19 domains into their system. At some point I decided to give in and upgrade my subscription; I didn’t feel I had much to risk. I don’t recall feeling stressed or annoyed at any point during the process. It was simple and painless, and the transfer of records with AXFR was the big key to it being so easy.

Some small things I learned after getting settled in:

  • I initially set up my backup DNS servers in the zone configuration by adding NS delegations (via a template), but this isn’t needed; although the documentation doesn’t state it, backup servers can be added in the vanity nameserver configuration alongside the main servers.
  • SOA records are not assumed from the vanity configuration and must be specified separately.
  • The DNS Made Easy app is very helpful for monitoring queries or making quick changes to a domain or template.
  • Using templates it is possible to manage multiple domains that are similar but have very distinct needs.


If you are in need of a new DNS provider, I highly recommend DNS Made Easy.

This blog is a mess

This blog has been a lot of things over time. It started as my public-facing journal and in many ways still reflects that role, but as discussed in previous posts, the need for that role is narrowing. In August this blog will be 10 years old. I have documented my experiments in ceramic arts and in computer science here. I have also engaged in pseudo-philosophical discussion through several of my posts. There have even been a few posts regarding my relationships. The point is, this blog does not have a clear focus, and I suppose it was never supposed to. Many aspects of my Internet presence are focused on very narrow areas, whereas this one has always been the catch-all.

I find myself contemplating sorting through the badly categorized pages to give them a bit of order and perhaps give this blog a clear mission. As quickly as I consider that notion, I question whether I should. This blog is a historical record of sorts for myself. It points in so many directions because its role has changed and evolved over time. I also have to face that my interests are very diverse and span many areas, so it is hard to imagine dividing the content into neat and orderly categories.

What makes a decision to keep this disorganized mess problematic is that in the modern world a blog is supposed to be designed for its readers, not its writer. This blog was created at a time when MySpace was just beginning to set and Facebook was just beginning to rise, so those rules didn’t exist then. I am therefore left to decide whether to stick with the “modern” thing to do, or to respect the history of this blog. The decision that seems most appropriate is to archive this blog and start a new one, but that seems a waste since so much recent material follows along with my current intentions for the blog. For the moment I’m making no sudden moves, merely considering possible futures for this blog.

This topic only comes up now because, as a component of an upcoming philosophy course (Feminist Theory and Its Applications), I will need to write a weekly blog entry in response to the weekly topic, and I don’t feel as though I should create a new blog for those posts; it just seems redundant. Viewing my blog as a catch-all publication allows this without any sort of concern, but if I view it as a personal journal, those posts really don’t fit the theme of the blog.

DNS Problems Again!

A few months ago (May 2013) I had problems with Zerigo; their servers were being constantly threatened by denial of service attacks. They overcame those issues by partnering with network infrastructure companies to provide a more distributed system. That solution has worked well, but apparently it wasn’t the end of the changes for Zerigo. I received a notification today that Zerigo is moving all of their services to the Akamai platform. This change, in general, should be invisible to end users like me. Instead of making the change invisible, Zerigo has decided to fundamentally change their DNS hosting plans in an attempt to position themselves as a “world-class DNS provider.” Unfortunately for me, that move is a bit too optimistic given their history over the previous year. Their new market position would also raise my rates by a minimum of 650% ($480/yr, compared to $79/yr). I can handle a little cost increase for the reliability of my domains, but $500 per year for something I can get for free from OpenSRS (or for $60 through XpertDNS or DNS Made Easy) is absolutely insane. I am not operating a portfolio of corporate domains and don’t need their services in the same capacity as the type of customer they are targeting.

The problems with this transition get even worse: it happens on January 31, 2014, a mere 30 days away! At that time my account will be converted and the prorated difference (based on the value of the service remaining, not the time remaining) will be applied to whatever “new” package they decide to stick my account in; whether that’s based on the 65 domains I’m allowed or the 20 I actually have in my account, they have given me no way of knowing.

Now, after a little research, I have selected what I hope will be my DNS provider for a long time: DNS Made Easy. I’m not crazy about the name, but many reviews include empirical data showing that they are very reliable and have some of the best server response times in their particular niche. It took me about 2 hours to copy all of my domains using AXFR and to set up custom records for everything that needed to differ (web redirects don’t translate in DNS, obviously). My domains will make the transition between now and January 6th.

Posted from Huntersville, North Carolina, United States.

Fall 2013 Paper Available

Normally at the conclusion of a semester I have multiple academic papers to release, but this semester there is only one, as only one of my courses had a major written component appropriate for publication. The paper is now available for download.


Commodified Gender Performance and Influence in Pornography
Philosophical Methods & Analysis (PHIL 6120)
B8WZ2P1V

What might a user want to put on ice?

To follow up on my previous post: what sort of applications might a user find unnecessary or unwanted?

Below is a list of the things I currently have frozen. These apps have not been uninstalled, just prohibited from running until they are first thawed and authorized. The list obviously does not include the applications I decided to uninstall completely.

My primary reason for freezing an application is that it duplicates an app that I prefer over the included one. Email, GMail and “Exchange services” are frozen because I prefer MailDroid. This particular device originally had Chrome and “Internet,” and then I installed FireFox. That’s three web browsers at over 50MB each (Chrome and FF are MUCH more). I never use Chrome on my devices, so I uninstalled it. I sometimes need the built-in “Internet” browser for system tasks, so it is allowed to remain live, but it is hidden and given no default operations.

Some of the frozen apps are associated with cool features (like Samsung Link and WatchOn), but I don’t currently need or use them, so they sit there, frozen, until I do. The next most frequent reason apps end up on this list: I have no clue what they are! If I don’t know what something is, I don’t want it running, especially if it doesn’t directly correlate to an app in my launcher. Finally there are apps that are “problem children” or represent a security issue for me. This includes things like Google+ (I don’t want to suddenly check in somewhere without being aware of it) and Google Hangouts (it is a pain to make it stop running, even when it is doing nothing).


Frozen

The Root of Android

Lately a lot of phone and tablet manufacturers have become more determined to prevent users from accessing some of the lower-level functions of their devices. Things like Samsung KNOX or the overly protective Amazon Kindle Fire OS really irritate me. First of all, the device belongs to the user, not the manufacturer or carrier (once the subsidy is satisfied, anyway). Second, it’s Linux! The manufacturers defend their protection of commands requiring elevated access (root) by stating that they are protecting users from destroying their devices (and voiding their warranty). Most of us who root or otherwise mod our devices know the risks we take, and we accept those risks. Generally it isn’t possible to modify a device without being aware of the risks, as the device’s recovery software and download mode issue warnings to that effect.

I personally find that Android devices are nearly unusable without root access. This is mostly due to the large amounts of garbage software installed by the manufacturers and, in the case of phones, the carriers. Having root access to a device allows the end user to freeze or remove applications that may slow down device performance or otherwise be unneeded or unwanted. There is of course the risk of accidentally uninstalling your launcher or some other critical component, but if you are rooting your device, don’t you know how to use ADB to push it back, or perhaps use ODIN to restore an image? As an end user I find it offensive that anyone would try to tell me what I can and cannot do with a device that I own. Yes, I know that it voids my warranty, but I don’t care; it’s my device.

To my next point: “It’s Linux.” The Android OS is built on the work of many power users and others, some compensated for their work and many not, who were exactly the type of people who now wish to modify their Android software and have administrative control of their devices. The operating system evolved from an open platform, but it is now being forced closed and unjustly made inaccessible to that same group of people. Beyond the openness aspect, I have a problem with any variation of Linux/Unix being restricted in such a manner, because it severely limits the usefulness of the software. Of course there is always “an app for that,” or it is possible to develop applications based on that hidden functionality, but sometimes it is nice to access it without having to go through the app mess. Android devices are designed to be lightweight, and as such they cannot handle an infinite number of apps. Of course we can just pop in a larger SD card… or can we? A lot of apps will not install to the SD card of a phone, because the OS restricts what those applications can do there; the card is perceived as transient storage that cannot be relied on for critical operations. For those of us whose phones have the SD slot located below the battery, I can say that there is no great risk of the SD card being removed while the device is active.

There is one aspect of most stock Android implementations that completely annoys me: the recovery. The name itself implies that it is a mechanism for restoring the device to service in the event that something bad happens, right? Not in stock, apparently! Stock recovery in most cases can be used to install updates or new versions of the operating system, but it does not provide the ability to directly back up and restore the device. Most “custom” recoveries, such as Clockwork or TWRP, allow for backing up the entire device and restoring it if needed. That is a cool feature whether you are modifying your device or not, but of course with Samsung KNOX, attempting to change the recovery software results in a bricked (non-functional) device. Way to go, Samsung!


So, what do I use root for?

The first thing I use it for is disabling and removing applications that I don’t use or that are redundant next to my preferred applications. I use an application called Link2SD (which requires root-level access) to disable applications. As the name implies, I also use it to move applications to my SD card (via symbolic links) on devices where space matters. There are many root-level apps that do the same thing; Link2SD is just the one I prefer.

The next thing I like having root access for is installing a beefier set of system-level commands. BusyBox installs Linux components that have been stripped out of most implementations of Android. One that is especially useful for me is WHOIS. I would be lost without the ability to just pop into Terminal and perform a WHOIS query.

Finally, system file access. This one is just so simple. There are areas of the file system on Android devices that we are just “not supposed to see” because we are “end users.” Applications like Root Browser will let us see (and modify) those sections, given root access. One of my favorite things to do here is delete the phone camera shutter sounds; they are just so annoying and fake (and can be distracting, depending on your subject). I have also used this feature to drop in large numbers of new ringtones.

Some might say that I really don’t NEED root access and that I don’t use all of the nuances of having root access. Those people would be absolutely correct, but root does let me take control of my devices in the way that I want without interference from the OS itself. I am the only user of most of my devices, so I feel as though I should have administrative access to the device.


How do you root a device?

The answer depends on your device. A good starting place is to pop into Google and type root %device name% (replacing %device name% with the EXACT model number of your device).

As a general method, though, rooting involves installing a custom recovery on your device (Clockwork if available for your device, otherwise TWRP). This is usually accomplished through a tool like ADB or ODIN using the device’s download mode (Google that too; you will need to know it). Once in the recovery you will flash SuperSU.zip (or SuperSU-signed.zip, or whatever root package you use) to establish root access to your device. Once that’s done you will typically restart your device and open SuperSU to make sure everything is in place.


Is rooting safe?

No. Absolutely not. It carries great risks, but mostly because of things like Samsung KNOX. There is now greater risk in the act of rooting, because of efforts to restrict root access, than in having root access itself.

There are some areas where it has become safer, though. In earlier versions of SuperSU it was possible for applications to quietly request root access without the user’s knowledge. This is no longer the case; root access must now be requested on a per-use basis by the application and then confirmed by the user.

There are also some types of devices that are VERY easy and safe to root. Any device carrying the “Nexus” brand falls into this category. These devices start with an easy-to-unlock bootloader and generally have no mechanisms to deter the user from “interacting” with their device at whatever level of access or intimacy the user prefers. As of this writing there is a small Windows executable application that can “one-click” unlock and root any released Nexus device.

The first precaution I take to remove unnecessary risk from rooting is grabbing a copy of the factory firmware image for the device. For Nexus devices you can get a clean image from the Google Developers website; for most Samsung devices you can get one from samsung-updates.com. I prefer the factory images to making a backup because, as mentioned before, a custom recovery is required to make a full backup. The images should not be stored on the device’s internal storage or its SD card. In a lot of brick situations it is still possible to “unbrick” using a factory image or backup (we call these soft-bricks).


My Rooting Philosophy

It is my belief that every Android user has the right to root and modify their device in any way they wish. I also believe that the carriers/manufacturers have the right to void the warranty of users who root. The voided warranty represents the level of risk we accept as root users (this does not, however, extend to accidental damage warranties; users don’t drop a phone because it happens to be running BusyBox!). Android is designed to be open, and users should have the ability to experiment and learn with their devices, not be locked into a carrier or manufacturer’s vision of how the operating system should work.

Apple users have NO RIGHT to modify or “jailbreak” their devices. You bought into an ecosystem that is intentionally closed and hindered. This is not necessarily a bad thing. This prevents carriers from adding garbage software (mostly), and it means that you are on a platform with a clear vision and direction.

Comma Separated Volumes

In my professional work I have had to do a lot of custom coding, just because the people I work with are “special” and generally don’t do things in a standard way. I’m fine with that; actually, I love it. There are some things I feel I shouldn’t have to “custom code” for, though, and reading CSV (comma separated values) files is one of them.

For the project I’m currently working on, I need to read data from a CSV file (a plain text file; columns are separated by commas and there is a header row). The file is about 27,000 lines long. The biggest problem is that the columns aren’t always in the same order, and sometimes different (additional) columns are present. I thought for sure that going into the world of open source I would find some sort of pre-written tool, class or thingy to read the file for me and let me get the columns I want from each row. So far, my quest has turned up nothing, since apparently these files are always predictable and you are just supposed to Split() the lines as they are read in and capture the indexes you want. That seems simple enough, except that my file is somewhat unpredictable (one iteration may have 26 columns, the next may have 28).

My problem is that I am caught slightly off guard by the fact that there is nothing pre-written. How is it that such a simple thing has not been written? I suppose once I have finished my coding it will be my responsibility to clean up the class and release it for the masses.
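
Until such a class exists, the core idea can be sketched in a few lines (shown here in JavaScript for illustration; the function and sample data are my own invention). Split the header row once, then map each data row to an object keyed by header name, so column order and extra columns stop mattering. This sketch assumes simple fields with no quoted commas, like the file described above.

```javascript
// Minimal CSV reader: maps each data row to an object keyed by the
// header names, so column order and column count can vary per file.
// Assumes simple fields (no commas inside quoted values).
function parseCsv(text) {
  const lines = text.split(/\r?\n/).filter(line => line.length > 0);
  const headers = lines[0].split(",").map(h => h.trim());
  return lines.slice(1).map(line => {
    const cells = line.split(",");
    const row = {};
    headers.forEach((name, i) => { row[name] = (cells[i] || "").trim(); });
    return row;
  });
}

// Two files with different column orders and counts; lookups by
// column name work the same way on both.
const fileA = "id,name,email\n1,Ann,ann@example.com";
const fileB = "email,id,notes,name\nbob@example.com,2,n/a,Bob";
console.log(parseCsv(fileA)[0].name); // "Ann"
console.log(parseCsv(fileB)[0].name); // "Bob"
```

Missing cells simply come back as empty strings, which is usually the sane default when one iteration has 26 columns and the next has 28.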

Selling False Security

http://ss.cuku.us/DA2xqi.png

For the past two weeks the above ad has been appearing on my Facebook timeline. I would like to air my concerns regarding this ad.

There are some problems with the claims in this ad that I find very obvious. There is a flat assumption that storing your passwords in the cloud is more secure than writing them down. I would hope that no one would believe that without question.

First, if you write your passwords down, it matters how you write them down. If you write them on sticky notes on your desk or monitor, they are revealed for everyone to see. If you write your passwords in a notebook (like the one depicted in the ad), that is a little more secure; if someone has prying eyes, the snooping is a little more obvious. Also, sticky notes are easily lost and can go missing without your knowledge, which increases your vulnerability. If you maintain a notebook of passwords and keep it in a locked drawer to which only you have the key, then it’s probably pretty secure (relative to any digital solution).

A notebook of passwords has certain advantages. The biggest is a bit of obscurity. If you keep your passwords in a notebook of a distinctive color mixed in with other notebooks, it’s pretty easy for you to pick it out, but not so easy for someone who doesn’t know which notebook holds your passwords, or who doesn’t even know that you keep your passwords in a notebook. On the other hand, if someone grabs your phone when you aren’t looking and sees the icon of a well-known password management app, there is no obscurity and your passwords could be easily compromised.

Storing passwords in the cloud is more convenient; you don’t have to look them up, and often you don’t have to enter them on websites supported by the app. There are some major concerns with this method, though. All of your passwords are protected by a single master password, so you have no more security than if you used the same password for everything. It also means that one compromised password compromises all of your passwords, and losing one password means losing all of the others. In my opinion, cloud-based password management is a bit too all-or-nothing. The same risk exists with losing the notebook, but the notebook is a physical object that can be lost and then easily found again. Also, if the notebook goes missing, you will know that your passwords may have been compromised and can change them as soon as you notice.

That brings me to my next point: there is no such thing as secure software or secure computing, at least not in the absolute sense. There is always the chance for some problem, some bug, some error to result in compromised data. Keep in mind that software is written by humans, and it only takes one programmer not being thorough (or being intentionally malicious) to create a security breach. Large amounts of data are stolen every day from various “cloud-based” services. Usually they are trivial pieces of data or trivial accounts, but major breaches can and do occur. If your data is stolen electronically, it may take a long time for the breach to be caught. If your notebook is stolen, you will know the next time you look up a password.

Just because something is “cloud based” (for whatever that really means), does not mean it is the best option.

The best solution is to use neither the app nor the notebook: create passwords and then memorize them. I use neither and somehow seem to function reasonably well. I don’t have a distinct password for absolutely everything, but I have enough variation and other modes of obscurity that it would be difficult for any automated attack to use password similarity alone to breach multiple accounts.

Here are my suggestions for passwords:

  • Use a Javascript (client side scripting) or desktop application to generate a random password.
  • Use a combination of letters, numbers and symbols as well as a variation of capitalization.
  • Make your password 10-20 characters in length.
  • Don’t try to use a different password everywhere. This is contrary to the most common advice, but it isn’t humanly possible to remember that many passwords; with a smaller set you won’t need to write them down and can use more complex passwords.
  • Do use different passwords for different “levels” of accounts. Use a distinct password for your banks, and use different passwords for online shopping than for more trivial accounts (such as Twitter).
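
As a concrete example of the first suggestion, here is a minimal client-side JavaScript generator along the lines described above (the function name and character set are my own choices; Math.random keeps the sketch simple, though crypto.getRandomValues would be the stronger source of randomness for real use):

```javascript
// Generate a random password of the given length from a pool of
// lowercase, uppercase, digits and symbols, per the suggestions above.
function generatePassword(length) {
  const chars =
    "abcdefghijklmnopqrstuvwxyz" +
    "ABCDEFGHIJKLMNOPQRSTUVWXYZ" +
    "0123456789" +
    "!@#$%^&*()_=+-";
  let password = "";
  for (let i = 0; i < length; i++) {
    // Pick one character at random from the pool.
    password += chars.charAt(Math.floor(Math.random() * chars.length));
  }
  return password;
}

console.log(generatePassword(16)); // prints a random 16-character password
```

Run it a few times and pick a result you can live with typing for a few days.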

A random 10-20 character password may seem impossible to remember at first, but try using it for a few days (don’t use a save-password function; force yourself to type it) and see if you can remember it. I find that new passwords are sometimes hard to remember at first, but after a few days I can recall them pretty easily. My biggest issue is remembering which password is assigned to which account.

PDF for Academics

I have made numerous negative critiques of things done badly in academia regarding the use of PDF scans. Perhaps it’s time that I document those critiques more seriously and provide a few suggestions on how to have a happy PDF experience.

First of all, avoid scanning if you can. If the resource is available as a web page or a pre-made PDF provided by the publisher, it will save you some time and effort scanning and provide a better experience for your students. Also, if you have a cooperative ILL librarian or other library staff that will scan for you, let them do it. They have experience with making professional quality scans.

Second, scan one page at a time. It is more time consuming for you, but if you scan one page at a time, your students are more likely to read the result on an electronic device rather than print it out, which makes environmental and economic sense. It may take you a little longer to scan, but it will save your students the time of printing the pages, or spare them the frustration of constantly panning on a PC or tablet. It may be tempting to scan more than one page at a time, but it just isn’t worth it, unless of course you intend to split the pages electronically later.

This brings me to my next point, in sort of an odd way. If you are scanning a page at a time, scan your document straight. This is a special annoyance for me, as I have received documents scanned at a 35° angle. It’s annoying, messy and unprofessional. Just don’t do it. If you can’t get your book or document straight one page at a time, then put the whole thing on the platen and select the area that you need for each page.

Finally, crop it. Do not use the entire platen scan and then disseminate it raw. Either before or after scanning, use the scanner software to select the region of the document you wish to save.

When you have finished scanning, check the pages and ensure that you have actually gotten all of the text you meant to get. Are there pages missing? Does the text fall off into the dark area at the inside edge, or perhaps fall off the edge of the scan?

Now, other than the lost-words or lost-pages issues, if you have failed to do everything noted above, there is a way to correct it. You must use Adobe Acrobat Pro (not Reader, but the full professional Acrobat application). Most academic libraries have access to Adobe Acrobat, and some departments do, but it costs the school for each license, so it is unlikely you will be able to have it on your personal office computer (unless you have other needs for it).

Using the tools pane in Acrobat you can do a lot.

The first and last thing to do is OCR. Select “Recognize Text, In This Document” from the tools pane. Use this not only to produce machine-readable text from your document, but also to straighten it. The tool runs through and detects text on each page, stores it in the file’s background data, and then aligns the page so the text appears straight (on average). It is a good idea to do this at the beginning, to straighten all of the pages for cropping, and again at the end, to ensure the spatial mapping of the recognized text is correct.

The other task that needs to be done, if it wasn’t done during scanning, is cropping the pages. Use Pages > Crop. This option allows you to select a region of the page to keep and discard the rest. Turn on the crop tool, select the region, then press Enter to finish the process.

If you scanned double pages, you will need to break them up:

  • Start with the even pages (the pages on the right). Select the even side of the first page in the document, press Enter, apply the crop to all pages in the document (even and odd), then save the document (usually DocumentName_even.pdf).
  • Once saved, undo the crop (Edit > Undo). Now select the odd (left) side of the page and again apply the crop to all pages.
  • Rejoin the even pages to the document by selecting Pages > Insert from File. Select the _even.pdf file and add the pages AFTER the LAST page in the document.
  • Finally, move the pages into the correct order. You can turn on the thumbnail panel and drag them manually, or you can set up an automated task (see the end of this post for the code). To automate it, set up a task by selecting “Action Wizard,” then “Create a New Action.” From the action creation wizard, select “More Tools,” then “Execute JavaScript,” and select “Options” to insert the code. You only need to do this once; afterwards you can reuse the existing action.

It should be noted that the code only works for documents with an even number of pages. If your document has an odd number of pages, go to the Pages tool, Insert Pages, More Insert Options, and add a blank page AFTER the LAST page in the document.

After you have finished working on your document, go to File> Properties and give your document a title, author information and perhaps a few keywords. This metadata will make your file easier to locate and index. Once that’s done, save the file.

Code for putting a document back together:

(if the code below causes an error, use the version at https://cmkularski.net/special/Interleave.txt instead)

function InterleavePages() {
    // Assumes the document holds all odd (left) pages first,
    // followed by all even (right) pages, with an even total count.
    var n = this.numPages;
    var nOdd = Math.floor(n / 2);
    var nEven = n - nOdd;
    var x;
    var y;
    var i;
    for (i = 0; i < nEven; i++) {
        x = nOdd + i;        // next even page to move (0-indexed)
        y = i * 2;           // odd page to place it after
        this.movePage(x, y); // movePage(nPage, nAfterPage)
    }
}
InterleavePages();