A couple of weeks ago Cringely used his PBS pulpit to announce that IBM planned to lay off 150,000 people in the United States. Since IBM Global Services doesn't employ that many people, the report was quickly discredited; whereupon Cringely fell back on arguing that, whatever the number, something evil is going on.
So, is there?
I see the putative U.S. layoffs, whatever the real number, as a case of IBM trying to lower its hourly chargeable cost by laying off workers in high-cost labour markets and hiring roughly the same number of replacements in low-cost markets. In a more perfect world that would be economically positive: labour in the U.S. wouldn't be expensive if the people being laid off couldn't get better jobs, and the low-cost market is cheap precisely because the people there see these jobs as personal opportunities.
Unfortunately reality isn't that simple.
The source of the problem is that most of the IBM Global Services revenue affected comes from IBM loyalists deeply embedded in the corporate data processing culture still characteristic of many of America's larger and older companies - companies in critical infrastructure industries like banking, telecommunications, transportation, and insurance. These are the people still using System/360 technologies - including CICS/IMS and the multi-million line COBOL (with embedded assembler) programs whose "maintenance" is considered prime off-shoring fodder.
Outside that market the out-sourcing trend has, I think, started to reverse; but the Global Services people now working in it have nowhere else to apply those skills, and may have no salable technical skills at all outside it - meaning that many will either be unemployable or have only the sales skills needed to land jobs in which they make net negative economic contributions by trying to turn back the technology clock for their new employers.
That kind of outcome would be bad for the people affected and, ultimately, for the economy as a whole, but there's a much nastier unknown here too - and that is simply that every time an American company passes hands-on responsibility for critical code to a foreign coder, American national security - and therefore the world's economic security - is reduced.
Most of us don't remember, but back in about 1982 the CIA used a Calgary company to enable a software theft by the Russians - software that had been rigged to disrupt their gas transmission infrastructure. Here's a summary - taken from a 2004 Washington Post article by David E. Hoffman:
In January 1982, President Ronald Reagan approved a CIA plan to sabotage the economy of the Soviet Union through covert transfers of technology that contained hidden malfunctions, including software that later triggered a huge explosion in a Siberian natural gas pipeline, according to a new memoir by a Reagan White House official....
"In order to disrupt the Soviet gas supply, its hard currency earnings from the West, and the internal Russian economy, the pipeline software that was to run the pumps, turbines, and valves was programmed to go haywire, after a decent interval, to reset pump speeds and valve settings to produce pressures far beyond those acceptable to pipeline joints and welds," Reed writes.
"The result was the most monumental non-nuclear explosion and fire ever seen from space," he recalls, adding that U.S. satellites picked up the explosion. Reed said in an interview that the blast occurred in the summer of 1982.

A lot of the stuff companies like IBM hire locals to produce in places like Lahore gets used by key American companies in finance, transportation, and manufacturing. Knock those companies out through the co-ordinated destruction of their information systems from the inside, and the resulting chaos could drive worldwide economic disaster while preventing an effective American response to something like a communist invasion of Taiwan.
And here's the fun part: having once written more than my share of an 850,000 line COBOL program, I can testify that it's a lot easier to write the stuff than to debug it. The plodding pace of typing out one long-winded microstep at a time means you can write it while dreaming about the receptionist, but you can't defang it that way: understanding even a tiny number of off-sequence interactions means keeping thousands of lines of code, lots of fun JCL, and potentially hundreds of library calls straight in your head at the same time.
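To give a sense of what "one long-winded microstep at a time" looks like, here's a minimal, hypothetical COBOL-85 fragment - the program name, field names, and values are mine, invented for illustration, not drawn from any real system. It does nothing more than multiply two numbers and display the result:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. PAYCALC.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      * Every variable gets a declared picture: digits, implied
      * decimal point, and initial value, all spelled out.
       01 WS-HOURS     PIC 9(3)V99 VALUE 37.50.
       01 WS-RATE      PIC 9(3)V99 VALUE 42.25.
       01 WS-GROSS     PIC 9(7)V99 VALUE ZEROS.
       PROCEDURE DIVISION.
       MAIN-PARA.
      * One arithmetic step, one verb - with its own overflow clause.
           MULTIPLY WS-HOURS BY WS-RATE GIVING WS-GROSS
               ON SIZE ERROR DISPLAY "OVERFLOW"
           END-MULTIPLY.
           DISPLAY "GROSS PAY: " WS-GROSS.
           STOP RUN.
```

One multiplication takes a paragraph; a real payroll or settlement system strings hundreds of thousands of such paragraphs together across many programs and JCL steps. Each step is trivially easy to write, but a deliberately misbehaving one buried among them looks exactly like all the honest ones around it.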
In fact, with COBOL, even simple debugging is often much harder than writing - and looking through a million lines of working code for something that's not supposed to be there? I'd call that next to impossible for humans and far beyond the reach of any of today's automated scanners.