
Where's Your Backup System?

On Aug. 14, companies that thought they were safe found out otherwise because their backups weren't far enough away.
Written by Alex Salkever, Contributor

When the lights went out in Gotham on Thursday afternoon, Jim Simmons got busy. He's the CEO for availability systems at SunGard Data Systems. Headquartered in Wayne, Pa., SunGard helps companies survive disasters. That means providing services ranging from extensive business-continuity planning to specialized data backup and recovery services to wheeling in a mobile command center loaded with tech gear that can jump-start a corporate network and keep operational interruptions to a minimum.

Around the country, SunGard has dozens of business centers with empty seats and data hookups ready and waiting to host employees of stricken companies chased from their offices by disasters natural, man-made, or just plain mysterious -- like the Aug. 14 power meltdown. As of 10 p.m. that night, 34 SunGard customers had called to activate their disaster plans, and an additional 100 put SunGard on notice. It was the biggest demand Simmons had seen since Hurricane Floyd pounded the East Coast in September, 1999.

This time, even with no high winds or crashing waves, not to mention exploding buildings, Simmons says, "We're getting calls from all over and enacting plans in New York, Toronto, Montreal, and Detroit." In all, says Simmons, SunGard has about 7,000 customers in North America.

TOO CLOSE FOR COMFORT. Of course, many companies thought they had already covered business continuity with post-September 11 plans aimed at minimizing disruptions. Turns out some were wrong. Sure, most had backed up their data centers with generators and were ready for all sorts of attacks on their servers. But where to put the people if the air conditioning goes down within a radius of hundreds of miles? Or the phones don't work on the backup trading floors anywhere between Philly and Boston?

"The information I'm getting from our crisis-management-control centers is clients are saying, 'I need to enact this portion of the plan because my backup was five miles away, and I am having the same problems there as well," says Simmons.

In fact, the timing of the outage was, if nothing else, fortunate. It hit at around 4:10 p.m., just minutes after Wall Street finished trading for the day, giving the financial sector a full night and much of the next morning to recover and get ready for Friday morning's opening bell -- rung by New York Mayor Michael Bloomberg.

DOUBLE BLOW. Perhaps more important, earlier in the week, info-tech staffs had toiled to patch systems either stricken or still vulnerable to the nasty MS Blaster worm, which caused PCs running Microsoft's Windows 2000 and XP to shut down without warning. The worm also clogged many corporate intranets with bogus traffic as it scanned for other vulnerable machines to infect.

Had the power outage hit at the same time the worm was wreaking havoc on networks, IT staffs would have been hard-pressed to regain control of the servers that crashed when the power got cut off. What condition those servers would have been in at restart if they crashed while still infected is unknowable, but it seems safe to say it wouldn't have been pretty.

While the blackout clearly could have been worse in those respects, some people see a connection between the mysterious outage and the latest worm. Most power-grid control systems run on specialized hardware and software that don't talk to the Internet, but some utilities use Windows machines to control and monitor those control systems through graphical interfaces. And a digital chain is only as strong as its weakest link.

"WHAT HAPPENED?" So, if a utility network had a Blaster infection, it could have rippled out to other areas. "I have no doubt that many of the systems involved in governing the distribution of power use Windows somewhere," says Russ Cooper, a Windows security expert with security consultancy TruSecure, which has done programming for utilities in Canada. "The whole grid was designed for a really hot day," he says. "Today wasn't even as hot as it was earlier this week. What happened?"

As of Friday morning, no one had an answer to that question. Everyone from President Bush to Mayor Bloomberg swore the blackout had nothing to do with terrorism. That's probably true.

However, several sobering lessons seem clear. Backup systems spread out for security purposes need to be separated by hundreds, if not thousands, of miles. Building a backup facility across the river just doesn't cut it. Companies will need to rethink what it means to create truly redundant business operations.

STILL AT RISK. While security analysts have painted a fearful picture of what would happen if terrorists combined a major attack with a well-timed and well-executed cyber-assault, a similarly nasty one-two punch may be just as likely to occur without any human intervention at all.

Finally, the outage underscored a year-old warning from the National Research Council that the North American power grid remains at risk and that severe outages could result from relatively minor attacks. Just look at what happened when no attack at all was involved.

BusinessWeek Online originally published this article on 15 August 2003.
