Feb 01

Applying availability and disaster recovery principles to your home computing environment

In my day job, I work with large enterprise customers to keep 
systems continuously available.  
We focus on making sure systems can withstand component failures, and that 
if a disaster strikes, users still have access to their systems and data.  
Although our requirements at home are not as critical in terms of 
availability, losing our data can be a major event and should never happen.  
This post recommends a multi-tier backup strategy, both local and off-site, 
and discusses archiving data.  

Backup:  First the obvious.  Back up your computer on a regular basis.  
Most people do this with USB hard drives.  These drives are inexpensive and 
hold vast amounts of data; a 1 terabyte drive can be purchased for around $70.  
When you own multiple computers, it is sometimes best to have a network attached 
storage (NAS) device.  
The difference is that a NAS device becomes another virtual computer on your network, 
and all the other computers have access to this storage device.  The drive can be 
segmented for different functions, such as sharing data within your family, 
with each computer having its own backup area.  
I use the Western Digital My Book Duo device, which was about $300.  With this 
drive you can choose to either use both drives and have 4 TB of space, or use 
the RAID 1 capability and have 2 TB of usable space, with the data mirrored to the 
second drive in the event of a single drive failure.  
I use mine in the RAID 1 configuration.  You can also make your data available 
over the internet if you would like.  The actual process of backing up 
is very simple, as modern operating systems such as Windows 7 and Mac OS X 
have built-in features to perform the backups.  You can set the frequency 
of backup based on what each user thinks is appropriate.  
Another strategy I use is to let the OS take a full image backup in addition 
to backing up my data.  
This can make moving to a new computer easier when it is time to upgrade. 
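
If you are comfortable with a little scripting, you can also supplement the built-in tools with your own copy job to the NAS share.  The sketch below is a minimal Python example; the source folder and the NAS mount point are placeholders you would replace with your own paths.

```python
import os
import shutil

# Hypothetical paths -- replace with your own folder and your NAS mount point.
SOURCE = os.path.expanduser("~/Documents")
DESTINATION = "/Volumes/MyBookDuo/backups/documents"

def backup(source, destination):
    """Copy files that are new or have changed since the last run."""
    for root, _dirs, files in os.walk(source):
        rel = os.path.relpath(root, source)
        target_dir = os.path.join(destination, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            src = os.path.join(root, name)
            dst = os.path.join(target_dir, name)
            # Copy only when the file is missing or newer than the existing copy.
            if not os.path.exists(dst) or os.path.getmtime(src) > os.path.getmtime(dst):
                shutil.copy2(src, dst)  # copy2 preserves timestamps

if __name__ == "__main__":
    backup(SOURCE, DESTINATION)
```

For most people the built-in OS tools remain the simpler option; a script like this is just a way to add an extra, scheduled copy of the folders you care about most.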

Off-site:  Much like enterprise computing, having a second set of backups, 
kept in a different location, is critical.  
In the enterprise world we call this geographic diversity.  For home computing, 
we do this in case something happens to our home, such as a fire or a tornado.  
Storing our most critical data off site should be part of our home computing 
strategy.  In my case I use Carbonite to back up the most critical data.  
Keep in mind the data is already backed up, as previously stated, to the network 
drive, but our most important data should have a second backup off site.  
Carbonite is a great solution.  For $59/yr you can back up an entire single 
computer.  
You can also access the data using other devices such as tablets and phones.  
I have also used Amazon S3 in the past.  There are several reputable offerings 
that perform this service.  
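
For those comfortable with a few lines of code, a service like Amazon S3 can also be scripted directly.  The sketch below uses the boto3 library to push a folder of critical files to a bucket; the bucket name and source folder are hypothetical, and it assumes your AWS credentials are already configured.

```python
import os
import boto3  # pip install boto3; assumes AWS credentials are already configured

# Hypothetical names -- substitute your own bucket and folder.
BUCKET = "my-home-offsite-backup"
SOURCE = os.path.expanduser("~/Documents/critical")

s3 = boto3.client("s3")

for root, _dirs, files in os.walk(SOURCE):
    for name in files:
        path = os.path.join(root, name)
        key = os.path.relpath(path, SOURCE)  # the object key mirrors the folder layout
        s3.upload_file(path, BUCKET, key)
        print("uploaded", key)
```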

The cloud:  Everyone is talking about "the cloud".  There are many good offerings 
out there which can be used as both a backup and off-site option.  These offerings 
are also used to share data between your various devices such as sharing a 
single document between your tablet and home computer.  I have used several of 
them including Dropbox, Microsoft SkyDrive, Apple iCloud, and Box.  All are very 
similar and selecting the best one will be driven by which computing ecosystem 
you are using.  As an example, iCloud works well within the Apple ecosystem.  
Almost every offering includes a small amount of free storage and a minimal cost to 
upgrade to higher levels of storage.  Google Drive is another good option 
to consider; I use it for documents shared among multiple people.  
The ability for multiple people to read and edit a document in a single 
location is a great computing capability.  In the 
old model, we would attach documents to e-mail and send them from person to person.
An excellent solution for backing up photos is Flickr.  It offers 1 TB of space 
for your photos for free and an app that automatically uploads pictures 
from your smartphone as you take them, so the backup is automatic.  
All you need to do is sign up.  

Archive:  Even sophisticated technologists often do not understand the 
difference between backup and archive.  A backup is simple: it takes 
the systems and data we use regularly and copies them to a secondary location 
in case of some sort of failure.
Archive is very different.  When we archive data, we take data we no longer 
routinely need, move it out of our regular computing environment, and 
store it elsewhere in case we need it in the future. 
A good example is our tax records.
Once we file our taxes we no longer need routine access to that 
data, but we are required to keep it for seven years in the 
event of an audit.  Maintaining that data in our regular computing 
environment does not make sense, but an 
archive solution for that type of data works well.  
An archive could be data burned to CDs or DVDs, or simply sent to a 
cloud solution.  Keeping all data together uses up space in your 
primary computing environment, whereas archivable data can be 
moved out of it.  
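
If you want to automate that sweep, a small script can move anything that has not been touched in a while out of your primary environment.  The sketch below is a minimal example; the folder locations and the two-year cutoff are assumptions you would adjust to your own retention needs (tax records, for instance, warrant seven years).

```python
import os
import shutil
import time

# Hypothetical locations and cutoff -- adjust to your own retention needs.
ACTIVE = os.path.expanduser("~/Documents/taxes")
ARCHIVE = "/Volumes/MyBookDuo/archive/taxes"
CUTOFF_YEARS = 2

cutoff = time.time() - CUTOFF_YEARS * 365 * 24 * 3600

os.makedirs(ARCHIVE, exist_ok=True)
for name in os.listdir(ACTIVE):
    path = os.path.join(ACTIVE, name)
    # Move (rather than copy) files not modified since the cutoff,
    # so the data actually leaves the primary environment.
    if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
        shutil.move(path, os.path.join(ARCHIVE, name))
        print("archived", name)
```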

Our home computing environments are a very important part of our lives.  
It is critical that we create an environment where we don't lose our 
valuable data or the memories stored in that data.  

Hopefully this helps you build a strategy to keep your data safe.
Dec 18

Infrastructure does matter

I just spent a very interesting day at the IBM Innovation Center in Boston and left with a few impressions. First, traveling to Boston in mid-December with snow in the forecast is never a good idea. Second, the team in our Cambridge facility that focuses on user experience and user interface design is amazingly talented, creating the next generation of client experience for the systems we all will use. We have all used applications with both good and bad user interfaces, and the experience we get from the UI design gives us an overall impression of the quality of the application's workmanship. Years ago as a developer, having my applications reviewed by what was then known as the "human factors team" was always a humbling experience, but it gave me a perspective on the importance of usability of the application interface, even back in the days of 3270 interfaces.

But the most important thing I left with is that, as critical as the user interface is, the infrastructure that supports these systems is just as critical. As an architect who generally considers himself an infrastructure guy, that may seem like a natural reaction and almost self-serving. But when someone uses a web site or an application, as important as the user interface is, if the system cannot scale to the required number of users, or is not available when it is needed, the user experience will be based on those non-functional attributes and, sadly, the user interface becomes irrelevant.

Using the home construction analogy: when we design and build our homes we love the aesthetics, and it is very important that our living space is both beautiful and functional, but we can't forget about the electrician and the plumber. We can have the most beautiful living space in the world, but if the toilets don't flush and the lights fail to work when you need them, you will have a very undesirable living experience.

Nov 28

Working with (Technology) Bigots

This post is not about bigots in the traditional sense, but the views espoused by technology bigots are just as closed-minded and irrational as the views of the more traditional bigots.
This is about "my technology is better than your technology, and there is no place for yours in the industry."  I am specifically talking about hardware platforms: mainframe, traditional midrange UNIX, and Intel-based servers.  If you have spent any time in this industry you have either seen it or, perhaps (but hopefully not), it describes you.
The time I spend with large enterprise customers that have large legacy systems running in mainframe environments allows me to see this on a regular basis.  The view that large systems are a thing of the past and the world can run on commodity infrastructure alone is where you see this the most.  Most of it originates with individuals who don't fully appreciate high-volume traditional transaction processing, where the ACID properties of transactions are critical.  Likewise, the teams that have spent their entire careers on the "big iron" fail to appreciate or understand that a significant amount of workload is not only appropriate for distributed systems but is actually better suited to a scale-out architecture.  The "mainframe guys" think the distributed team is a bunch of cowboys with a completely undisciplined approach to information technology.  The distributed team is convinced the mainframe team is too slow to respond to customer needs.  Traditional UNIX still has a place: large scale-up SMT environments where CPU performance is critical will remain relevant well beyond the careers of all of us.
Quoting Rodney King, "can't we all just get along?"  If you want to have a long, fulfilling career in this industry, having a breadth of skills, understanding all the available options, and selecting the option that best meets the requirements is the right thing to do for our customers.  It is always possible to pound a square peg into a round hole with enough force, but using the round peg is better for everyone involved.

Sep 23

Collecting stuff, beach sand

Humans love collecting things.  Some people collect the more typical items such as coins or stamps.  Others invest in things such as Hummel figurines, crystal animals, and the like.  My favorite collection is a bit different.

I have been lucky to visit many beach locations all over the world.  When I do, I like bringing some of the beach back with me in one of the local water bottles.  Over the years I have acquired many.  The variation is significant, from the fine sand of the Caribbean and Hawaii to the pebbles of the southern coast of France and the large stones of the southern United Kingdom.

Each bottle is labeled with the beach name, the date and the latitude and longitude of where I got it.

Sand Collection

Jul 28

Chesapeake Bay Swim

Chesapeake Bay Bridge

On June 9th, 2013, I did my second Chesapeake Bay swim.  The 4.4-mile swim starts on the Annapolis side of the bay, and you swim across between the bay bridges to the Eastern Shore side of Maryland.  This year the conditions were much more challenging because of the storms the week prior to the event and the currents they caused.  My 2013 time was 28 minutes slower than my 2012 time of 2 hours 30 minutes.  Also, swimming closer to the south span, where the currents tend to funnel, turned out to be a bad idea.  Each time I do this event I learn something, and I am sure I will continue to do it each year.