Kill the data loss monster once and for all

Summary: What's scary this Halloween - or anytime? Putting your personal and business computing continuity at risk. With the inexpensive data resiliency and backup solutions available, you don't need to be the victim of a bad horror movie.

What's scary this Halloween? Putting your personal and business computing continuity at risk. This morning, the editors at ZDNet posed a question to our blogging crew for Halloween:

“What IT product/service/development this year gave you the chills?”

Over a cup of really strong Puerto Rican coffee, sitting here in my timeshare in Humacao and staring out into the Caribbean, I pondered the possibilities. The tempting, obvious dig -- and the easy way to get blog hits -- would be to take the usual stab at Vista with a wooden stake, shoot silver bullets at other proprietary and DRM-practicing witches such as Apple or Amazon, or wave fiery torches at Google to drive the monsters back. But that would be too easy, and frankly, I don't think any of the usual suspects merit a scary enough rating -- they don't even rank up there with Plan 9 From Outer Space as far as I'm concerned.

What's really scary is the kind of thing that nails you through your own inability or unwillingness to act, not through the actions of another vendor or the behavior of a particular product. I'm talking about neglecting your Business Continuity and Resiliency, be it on a personal or an enterprise level. These monsters are extremely easy to defeat, but if you don't make adequate preparations beforehand, they will strike you dead. You can lose valuable data, customer confidence and a lot of money. Now THAT is truly scary.

On the personal computing side, I can't tell you how many times this year I've been called by a friend or family member to help fix a computer that has "crashed" or is unrecoverable. My first question is the obvious one - "So, how do you handle your backups?" I frequently get blank stares as a response, and that's when I know we're in big trouble.

If your primary storage medium really is unrecoverable -- in other words, if you can't get at the data via something like System Rescue CD or successfully mount the drive on another computer to retrieve your files -- you need some sort of backup mechanism that lets you either restore your critical data to a freshly rebuilt OS or do an image-based restore. Much of how this is accomplished depends on what we in the Business Continuity and Recovery Services business call your RTO and your RPO -- your Recovery Time Objective and your Recovery Point Objective.

For your average end user, the RPO is the most important consideration -- you want to be able to recover your files and data at a point in time as close as possible to when you incurred the actual data loss. For a home user or small business, whether it takes an hour or a day to get the data back (your RTO) is probably unimportant. For an enterprise, that's an entirely different matter.

Some companies require RTOs of less than four hours; others I've seen run as high as 72 hours or more, depending on the criticality of the system or tier. Very low RTOs and RPOs require some very sophisticated solutions such as SAN snapshotting, replication technologies, and Disaster Recovery (DR) sites and protocols. But most end users are usually happy just to get their data back, period, and don't require anything nearly as infrastructure-intensive.
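To make the objectives concrete, here's a minimal sketch (purely illustrative) of the arithmetic behind an RPO: with periodic backups, the worst case is losing a full interval's worth of work.

```python
from datetime import datetime, timedelta

def worst_case_rpo(backup_interval: timedelta) -> timedelta:
    """With periodic backups, the worst-case data-loss window equals the
    interval between backups: a failure just before the next backup loses
    everything written since the last one."""
    return backup_interval

def data_lost(last_backup: datetime, failure: datetime) -> timedelta:
    """Data-loss window for a specific failure."""
    return failure - last_backup

# Nightly backups mean you can lose up to a full day of work...
print(worst_case_rpo(timedelta(hours=24)))
# ...and a 6 PM crash after a 2 AM backup loses 16 hours of changes.
print(data_lost(datetime(2008, 10, 31, 2, 0), datetime(2008, 10, 31, 18, 0)))
```

Shrinking that window (say, hourly instead of nightly backups) is exactly what the pricier continuous-backup and replication products buy you.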

How can you as an end-user kill the data loss monster for good? Well, in most cases it's as inexpensive as buying a secondary USB hard disk for less than $100, attaching it to your personal computer, and installing a free piece of software such as Cucku Backup, which can be set up in less than five minutes to automatically back up your system to that new device.
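Tools like Cucku handle the scheduling and bookkeeping for you, but the core idea is nothing exotic. Here's a minimal Python sketch of an incremental file backup to an attached drive -- the paths are hypothetical, and a real product adds scheduling, versioning and error handling on top of this:

```python
import shutil
from pathlib import Path

def backup(source: Path, dest: Path) -> int:
    """Copy every file under source that is missing from dest or newer
    than the backed-up copy. Returns the number of files copied."""
    copied = 0
    for src in source.rglob("*"):
        if not src.is_file():
            continue
        target = dest / src.relative_to(source)
        if not target.exists() or src.stat().st_mtime > target.stat().st_mtime:
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, target)  # copy2 preserves timestamps
            copied += 1
    return copied

# Hypothetical usage: your documents folder and a USB drive's mount point.
# backup(Path.home() / "Documents", Path("/mnt/usb-backup/Documents"))
```

Because `copy2` preserves modification times, unchanged files are skipped on the next run, which is what makes repeated backups cheap.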

If you don't want to keep a backup drive onsite, or want a secondary backup mechanism that would protect you in the event of a true disaster such as a flood or a fire, you can look into solutions such as Carbonite, which will automatically back up your critical data over the Internet for a small yearly fee depending on the amount of offsite storage you buy. Iron Mountain, a high-end service usually reserved for Fortune 500 companies, also has an Internet-based backup solution for small businesses, but it's a lot pricier.

As with any service, weigh the maintenance fees against what a real failure would actually cost you, and decide which combination of services makes the most sense and which data is more critical than the rest. Obviously, your MP3 collection and your family photos from last Christmas probably aren't as important as your Quicken/QuickBooks files or your Office documents, so burn your non-critical or static data to cheap storage such as DVDs or a backup hard drive instead of using net-based backups, which charge for storage and bandwidth by the gigabyte.

Note to Google, Yahoo, Apple, Microsoft, IBM, HP and Amazon -- here's an area where you could really clean up and gain some serious customer loyalty: by providing affordable, easy-to-use Internet backup services.

In the event of a complete hard disk failure, you'll still need to reinstall your OS, your apps and your backup software package -- no doubt a time-intensive process fraught with headaches -- but you'll get your important data back if you've been doing daily backups. To bring back your entire system with OS, apps, data and all with minimum stress, you might also want to look into image-based solutions such as Acronis or Symantec Ghost, which can be combined with the aforementioned USB backup drives and traditional file-based backups to recover from incremental data loss.

Linux users should definitely look at System Rescue CD for a great open source system imaging solution. As with any backup solution, image-based backups are only as good as how recent they were taken, so be sure to combine this with a file-based backup solution.
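Since a backup is only as good as its age, it's worth automating the freshness check itself. A small, purely illustrative Python sketch -- point it at whatever directory your backup or imaging tool writes to:

```python
import time
from pathlib import Path

def backup_age_days(backup_dir: Path) -> float:
    """Age in days of the most recently modified file under backup_dir."""
    newest = max(
        (p.stat().st_mtime for p in backup_dir.rglob("*") if p.is_file()),
        default=None,
    )
    if newest is None:
        raise FileNotFoundError(f"no backup files found in {backup_dir}")
    return (time.time() - newest) / 86400.0

def check_backup(backup_dir: Path, max_age_days: float = 7.0) -> bool:
    """Warn if the newest backup is older than max_age_days."""
    age = backup_age_days(backup_dir)
    if age > max_age_days:
        print(f"WARNING: newest backup in {backup_dir} is {age:.1f} days old")
        return False
    return True
```

Run something like this from a scheduled task and you'll find out that your backups silently stopped working before a disk failure does it for you.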

For those of you who want a completely transparent redundancy solution, you might want to consider putting in a second internal hard disk and configuring your system for RAID-1. RAID used to be for enterprises only, but software mirroring has been built into the NT-based business and server editions of Windows for ages, and many current desktop editions support it out of the box as well.

For Windows RAID, simply install a second hard disk, partition it to be the exact size of your existing hard disk partitions, open Disk Management in the Microsoft Management Console (MMC) and create a software RAID-1 mirrored volume -- no expensive RAID controller is needed, though your CPU will incur a little extra overhead from mirroring writes to both drives.

If your primary drive fails while you're using software mirroring, simply swap the cabling with your secondary drive and you're good to go. Be aware, though, that in some rare circumstances you can destructively write or erase data on both drives simultaneously -- I've seen this happen with things like database apps, where no actual OS or hardware "crash" occurs but the application itself misbehaves and causes data loss -- so you should always have a secondary backup/restore method handy.

Linux also supports software-based drive mirroring, but the setup is a little more complicated and you'll want to consult the 'HOWTO' guides on the Internet if you want to head down that route.

If you need higher disk performance and a no-brainer setup for Windows, Mac and Linux, you might want to look at a hardware-based solution from AMCC 3Ware or from Adaptec, both of which sell desktop caching SATA RAID controllers in the $200-$300 range depending on what features you need.

Some motherboards on higher-end PCs also include RAID controller chipsets. With RAID controllers, you set up the mirroring in the controller BIOS and the RAID chipset does all the work -- the host OS sees just one physical hard disk, even though you might have two (RAID-1) or three or more (RAID-5) disks installed. When a drive fails, the controller does all the hard work of re-syncing the data once you install a replacement disk.

Finally, let's address another monster that a lot of users ignore - antivirus and anti-malware protection. With so many inexpensive and well-designed programs on the market today, there's no excuse for not having one installed or for letting your subscription expire.

On Windows I continue to be impressed with Symantec's Norton Internet Security 2009 - it's an excellent all-in-one antivirus/antispyware/firewall package and is far less resource-intensive than previous versions. For a free antivirus I like Avast! Home Edition, and for anti-malware and routine system maintenance tools, IObit's Advanced SystemCare 3, CCleaner.com and Safer-Networking.org's Spybot Search and Destroy should be on everyone's download list.

With the inexpensive data resiliency and backup solutions available on the market today, you don't need to be the victim of a bad horror movie. Got another great solution? Talk Back and let me know.

Topics: Hardware, Data Centers, Data Management, Storage

About

Jason Perlow, Sr. Technology Editor at ZDNet, is a technologist with over two decades of experience integrating large heterogeneous multi-vendor computing environments in Fortune 500 companies. Jason is currently a Partner Technology Strategist with Microsoft Corp. His expressed views do not necessarily represent those of his employer.

Talkback

35 comments
  • another...

    You might like this one better for backups:
    http://www.2brightsparks.com/syncback/
    ridingthewind
    • Also a SyncBack fan

      I deploy SyncBackSE (the commercial version) on every laptop we build, where the user's network share is the target for the backups/syncs. For home users, a USB drive (with its own power supply) makes a great target for the free SyncBack version.
      alan.douglas
    • Jungle Disk Amazon

      Cheap as chips, easy as can be, secure too -- Jungle Disk front end to auto backing up essential files in the clouds on Amazon's spare storage space.
      peterpills
  • RE: Kill the data loss monster once and for all

    For Windows systems I'm surprised you didn't mention Windows Home Server. It not only covers data backup but also covers complete system restores.
    SamYeager
  • RE: Kill the data loss monster once and for all

    Don't forget Rebit for backup and recovery. All you have to do is plug it in - and it backs up the ENTIRE OS. www.rebit.com
    mruthk
  • SATA drives are the biggest reason...

    for data loss today. They flake on a regular basis. I've been told by a major vendor insider that their drives are approaching failure rates of 20%. This is absurd.
    bjbrock
    • Fundamentally, the same old technology as ATA

      However, I think the failures could be due to more component cramming and inferior ventilation and thermal management in cheap computer cases. It could also be that manufacturing standards have dropped since the trend of outsourcing components to China, Korea and Indonesia began, though most of those plants are ISO 9000 or better. I think a lot of it can be chalked up to hitting MTBF much quicker due to more demanding apps (multimedia, bigger files, etc.) and more severe operating conditions. When we had big giant tower cases with huge fans, cooled power supplies and lots of breathing room, this wasn't as much of an issue.

      This is why I'm saying it's probably a good idea, at today's prices, to put in a RAID-1 as well as an external USB backup drive on your home desktop.
      jperlow
      • I am glad you mentioned the

        external USB backup. The RAID solution is simple and reliable, I have all the systems in the house setup that way.

        The only problem is that if an external agent takes out your components (lightning strike, fire, etc.), both drives are in the same physical location and would most likely suffer similar fates.

        Backing up to an external drive that regularly gets placed in a drawer (or someplace else) really adds another level of safety to keeping your data intact.
        GuidingLight
      • A note on motherboard RAID . . .

        I would recommend [i]against[/i] motherboard RAID, though. My experience has been that the RAID controllers on motherboards are notoriously unreliable. And oh, yeah - they use proprietary disk formats, so you can't just dump the drive into a new motherboard if you buy a new computer. I personally just use automated backups.
        CobraA1
      • External storage is still a must for critical data

        One of the drawbacks of RAID is that it does not fully protect you from data corruption. RAID 5 offers no protection against corruption, so if the data gets corrupted, that's all she wrote. While mirroring with RAID 10 (RAID 1+0) and RAID 15 (RAID 1+5) can appear, on the surface, to be a satisfactory resolution, it shouldn't be relied on as a sole solution when dealing with critical data. Mirroring drives will duplicate <i>any</i> data changes, including data corruption, which can have several causes. Also, the RAID controller itself can become defective and cause unrecoverable data loss on both drives in an array. While RAID offers a nice zero-downtime solution, periodic backups to external storage should be used as a bare minimum, with or without RAID. A USB hard drive or larger-capacity flash drive used with regular backups is a relatively cheap insurance policy against critical data loss for a home user. It's also a pretty good idea to keep a recent backup at an offsite location, such as a trusted friend or relative's house, or a secure place at work, in case of a fire or theft.
        Flying Pig
        • External storage and backup is a must period

          There isn't a lot of 'critical data' on my parents' PC, but it gives me a good feeling to know that if their hard drive is 'borked' -- because of something little old me does, or just from the drive breaking -- all I would have to do to restore it is restore from the backup on the external, or go to a store, buy a new hard drive, and restore from the backup.
          Lerianis10
    • What is absurd is the percentage you are being quoted

      It's closer to 1% when it comes down to it, except for Seagate drives... they have gotten VERY flaky in the past two years or so, since they bought one of the 'poor' drive companies.
      Lerianis10
  • Backup timing

    I've had some experience with centralized backups such as Veritas BackupExec (before it was bought by Symantec) and Symantec LiveState Recovery (before it was integrated with BackupExec). One of the problems was: when do you run a backup? The systems I use are for process control at a manufacturer, which require 24/7 availability. The agents employed by both of these schemes hammered the NIC on the machine and made it unresponsive to any clients trying to connect. So I've had to re-think this, and I now try to go with internal SATA RAID-1 plus an external SCSI RAID-5 setup. I've also been looking into SSD drives for the internal RAID-1, since this drive pair is where I put the OS, which has the hardware-dependent drivers. Still, I can't take it offline to sync unless it has a failure, since that takes the entire process down.
    yet_another
    • In this case you should be thinking mainframe

      The only system I know of that can be backed up while it is completely live is an IBM zSeries SYSPLEX, where you bring half of the plex's processes down and back up off of the SAN-replicated system. In a SYSPLEX you also do upgrades the same way, with each side of the plex done in a "rolling" upgrade.

      With midrange/UNIX systems you might be able to set up some sort of SAN-based replication or snapshotting, depending on what storage vendor you use, what the UNIX vendor itself supports, and how you are clustering the filesystems. But the snapshot will still be at a point in time, so you will still lose transactions until the next backup window -- unless you are writing the transactions in parallel to some sort of replayable log which is also clustered, and you have constant referential integrity with the backup. To some extent you can also do this with x86 systems, but the OSes aren't as resilient.

      http://en.wikipedia.org/wiki/Sysplex
      jperlow
      • Or...

        server clustering... but no matter how you look at it, you have to fork out some money to get a system that can be backed up and upgraded while online. Not to mention the IT nightmare of keeping it all working together. As much as I dislike IBM... they do have the better solutions for mainframe and clustered server systems these days.

        I'm now experimenting with running clustered servers in virtual machines. I haven't been fast on taking in new technology lately... getting lazy in my older age.
        ShadowGIATL
  • Software we also use: ViceVersa and R-Studio

    Of course we have tape backups, RAID and redundancy in the network. But some tools are easier to use.

    For laptops, I frequently use ViceVersa Pro to store mobile data on a USB drive. An alternative could be SyncBackPro, but I have not tested it.

    For the network, a newer tool I have started to use is R-Studio. It takes disk snapshots over the network and restores files without touching the desktop itself. The first tests I have done are very promising.
    [GZ]
  • Big fan of on demand backups!

    I have about 80GB that needs reliable backups since it's my work, personal, and school information that I'd really not like to lose.

    I used to do local backups to external hard drives, and I managed to break 4 of them, although 1 seems to mostly work still. They were all Maxtors and after the $600 1.5 TB RAID drive died, I had had it with external storage.

    I tried Carbonite and fell in love. I now have 2 computers on it at home, as well as 3 others including my church file server.

    It's great and a bargain! $50 per year compared to $200/yr that I was blowing on unreliable backups. I also never have to worry about purging old backups or even being at home (I travel a lot -- even backs up in the hotel). I also get emailed if any of the computers stop backing up -- helpful for grandma's computer.

    I've also used Jungledisk Plus (works with Amazon S3) but I prefer the client interface of Carbonite, and if you have more than about 30GB to backup, then Carbonite is a cheaper option than Jungledisk/S3. Although Jungledisk also gives you additional storage for things you can't fit on the laptop.

    Google, where's my GDrive? Until it's ready, I'll stick with Carbonite for backups and Jungledisk/S3 for offsite storage. Great stuff!
    jt_lovell
  • Other methods

    For many people, a simple alternative is to use a ~2 gig memory stick to back up data files. That's plenty of space for most people, and it's cheap and easy. I also use Memeo for auto-backups on my external USB hard drive - it works flawlessly. I plug it in for daily/weekly back-ups and disconnect it at other times to protect it from surges.
    m38817
  • RE: Time Machine?

    Any good or bad opinions about using Time Machine under OS X?
    I mean, are there good reasons to search for an alternative solution, or is Time Machine simply the right way to go for Mac owners?
    Thanks.
    Eeem
    • I really like it

      I really like it - it is very easy to use and I use it every day. However, there is one advantage to full-blown products: in case of a fatal failure (like when you lose a hard drive), with Time Machine you'll need to first install the OS and applications and only then restore your files. It will take more time than if you had a full clone of your drive. Also, Time Machine does not back up to the internet, only to a physical drive on your network.


      my comments at http://www.commentino.com/orim
      orimata