
Daily Fix: The continuing drama

Written by Mitch Ratcliffe, Contributor

A spate of recent stories in the press, abetted by the CBS program 60 Minutes, which ran a technically naïve and outdated report last Sunday, has re-ignited concerns about widespread power outages and major corporate failures. Unfortunately, these reports play on speculation, often uninformed by technical realities, about the worst that could happen, while ignoring growing evidence of progress.

Part of the problem, of course, is that disclosure of Y2K status in this country is by no means perfect. Far from it, and this imperfection has resulted both in an underestimation of the actual cost of Y2K and in an overstatement of the distance remaining between today's functional economy and a Y2K-ready infrastructure. First, let me address the latter point, that poor disclosure has produced an amplified picture of the Y2K problem. And let me interject that the concerns these reports generated have been critical to the improvement in corporate, governmental, and personal preparation for Y2K.

Many Y2K managers and corporate executives I have spoken with have said quite plainly that they have neither the resources nor the patience to answer all the Y2K queries they get. Usually, they receive a form letter that does not ask questions relevant to their business or their Y2K project milestones. Many companies send their own form letter in response, instead of answering the survey's questions.

In the opinion of many surveyors, such as the Weise organization, a corporate credit-rating company that inexplicably developed Y2K technology expertise late last year, these corporate form letters constitute a non-response, and such companies are classified as poorly prepared for Y2K. Last year, Weise rated bank preparedness and came up with a "poorly prepared" percentage more than three times higher than the Federal Deposit Insurance Corp. and the Federal Reserve reported. It turns out that any bank that did not reply was assumed to be behind the curve - but those same banks, replying to the agencies that regulate their business, provided proof that they are preparing adequately. In other words, responding organizations' bias against Weise's survey translates into Weise's bias against those companies. This same company recently announced that 22 percent of large companies will not be prepared for Y2K, a story widely reported in the press.

In fact, companies are very busy preparing for Y2K, as well as preparing documents for regulators and well-known ratings organizations. In California, for example, companies are required to file their contingency plans with the state this month - the IT managers I speak with are confident they will make these deadlines and coast through Y2K.

This overstatement phenomenon has repeated itself around the world: early surveys of readiness assume that 100 percent of companies will be affected, that each of those companies must be 100 percent compliant to be ready, and so on, producing very low estimates of readiness. Later, when companies doing Y2K repairs and surveyors realize that far fewer systems will be hit by mission-critical Y2K failures, and acknowledge that organizations need not be 100 percent compliant to operate normally, the perceived Y2K threat is greatly diminished. Cases in point: early U.S. estimates, as well as British, Japanese and German Y2K threat levels, have all been revised downward.

Even in the last couple of weeks, we've seen overstatement of Y2K problems at European airports, which The Times of London said were "headed for millennium computer crashes." This alarming headline stems from the way reporters put the question about preparedness to the airports - "Have you tested all your equipment?" Well, you don't need to test all your equipment. Less than four percent of all embedded systems worldwide are thought to be susceptible to Y2K problems, and, on average, less than one-half of one percent of embedded systems actually prove to have Y2K problems at companies that have talked openly about their systems.

However, because these airports have not tested their systems completely, The Times of London assumes they are "at risk." The paper backed up this contention with claims that these airports are believed to be on a list of risky airports being compiled by the International Air Transport Association, but there is no confirmation of this - it's purely speculative reporting.

Likewise, several blue chip banking stocks were downgraded by analysts recently based on concerns about Y2K.

The lowered ratings were not the result of problems with the banks' preparedness for Y2K; in fact, Credit Suisse analyst Michael Mayo said he wasn't concerned about the banks' internal risk, but about their susceptibility to third-party risk from overseas banks and, more importantly, the impact of financial panic on bank growth. Notice, there is merely the fear of slower growth, not of failure. Not one example of an overseas bank with potential problems was actually cited.

Mayo fails to recognize that overseas banking is still largely manual, and that the leading international banks and financial networks are making good progress. Redundancy in financial networks and the availability of Internet-based transaction systems provide fallbacks for currency transactions as well.

In the meantime, U.S. banks are eagerly lending to business, according to Reuters. The volume of banking activity, in loans and consolidations, has not abated as Y2K approaches.

The crux of the argument, then, becomes the question of whether the number of IT problems resulting from Y2K will exceed the capacity of organizations to cope with those problems. The press seldom approaches the Y2K challenge from this perspective, despite the fact that there is ample evidence in this vein.

The best evidence that Y2K is far more manageable than anticipated is the sheer number of Y2K problems that have already occurred, and critical dates that have already passed, without an appreciable impact on the public consciousness. According to CapGemini, 72 percent of companies and agencies had experienced a Y2K problem as of the end of April 1999.

More than 80 percent of Y2K problems will occur this year - meaning we are already in the midst of Y2K headaches, yet they are invisible to the consumer and have had little effect on the economy, which is actually on the verge of overheating.

The GartnerGroup's Lou Marcoccio expects that just eight percent of Y2K problems will actually occur in the December 1999 to January 2000 timeframe. In fact, as many as half of all the Y2K problems that will ever happen have already happened. We're on the downside of the failure curve.

So, why do we continue talking about this nonsensical "January 1 and the lights go out" scenario?

Because it makes a good, simple story. It sells gold coins; it sells survival gear. It makes headlines. Unfortunately, it doesn't lend much in the way of clarity about the impact of Y2K on the reader's life.
