The forgotten story behind IBM's 'first mainframe'

IBM is celebrating the 40th anniversary of its S/360, the first commercially successful mainframe. But another design deserves most of the laurels

It's always a pleasure to celebrate the anniversary of a technology that's older than me. Thus, small sweet sherries all round for the birthday of IBM's S/360 mainframe, launched 40 years ago today.

The venerable machine is being feted around the world as the grandfather of modern computing: it brought such innovations as lookahead, pipelining, branch prediction, multitasking, memory protection, generalised interrupts, and the 8-bit byte to the commercial market. For those of us who've been brought up on a diet of microprocessor roadmaps, it's a welcome reminder that the latest, greatest chips depend on inventions dating back to the days when the Beatles still wanted to hold your hand.

But while IBM is keen to fly the flag for the S/360, an architecture that made the company emperor of the world for two decades, the real star has been nearly forgotten. All of those good ideas came from a computer that truly deserves the crown of first mainframe; one that can trace its origins to 10 years earlier and, plausibly, to a decade before that, when a remote valley in New Mexico was briefly illuminated by the light of the first atomic blast.

It is no coincidence that the end of the Second World War saw the start of digital computing. As well as the now-famous work done by Turing and others at Bletchley Park, atomic weaponry research in the US had proved two things -- that nuclear and thermonuclear bombs would define the course of the rest of the century, and that designing the things required more sums to be done than was humanly possible. The push for high-powered computation was on.

By 1955, the University of California Radiation Lab was looking for a computer faster than any before. IBM bid but lost to Univac -- then the biggest computer company -- and IBM hated to lose. The company came back a year later with a proposal to the Los Alamos laboratory for a computer with "a speed at least a hundred times greater" than existing machines. It won that one, and had four years to deliver the beast. The project was officially called the 7030, but was far better known as Project Stretch -- it would stretch every aspect of computing.

The innovations began right at the start. Stretch would be built with a brand-new invention, the transistor, and it was the first design to rely on a simulator. This was built by John Cocke and Harwood Kolsky early on, and let the designers try out new ideas before committing them to the final machine -- a method of working that has since become universal.

It's hard to list all the ideas that Stretch embodied and that have since become canon law in processor design. It could fetch and decode multiple instructions simultaneously -- remember the superscalar hype of the late 90s? -- and pipeline them, decoupling decoding from execution. It could predict the results of calculations and speculatively execute code depending on its best guess, and could look ahead to unexecuted instructions to make the best use of its internal resources.
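To see why prediction matters, here is a minimal sketch of the branch-prediction idea Stretch pioneered -- a toy two-bit saturating-counter predictor of the kind later machines made standard. The outcome pattern and the predictor's starting state are illustrative assumptions, not anything from the 7030 itself.

```python
class TwoBitPredictor:
    """Four states: 0-1 predict 'not taken', 2-3 predict 'taken'.
    Two wrong guesses in a row are needed to flip the prediction."""

    def __init__(self):
        self.state = 2  # start in 'weakly taken'

    def predict(self):
        return self.state >= 2

    def update(self, taken):
        # Saturate the counter at the ends of the 0..3 range.
        if taken:
            self.state = min(3, self.state + 1)
        else:
            self.state = max(0, self.state - 1)


def run(outcomes):
    """Feed a sequence of actual branch outcomes; count correct guesses."""
    p = TwoBitPredictor()
    hits = 0
    for taken in outcomes:
        if p.predict() == taken:
            hits += 1
        p.update(taken)
    return hits


# A typical loop branch: taken nine times, then falls through once.
history = ([True] * 9 + [False]) * 10
print(run(history), "of", len(history), "predicted correctly")  # 90 of 100
```

The payoff is that a loop branch like this is guessed right 90 per cent of the time, so the processor can keep fetching down the predicted path instead of stalling -- the same bet Stretch was making in discrete transistors.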

These are ideas still essential to the best that Intel produces, and Stretch did them all with around 170,000 transistors. The last chip that Intel made with so few active components was 1982's 80286, a part remembered these days -- if at all -- as needing not a heat sink so much as a mop to get rid of the drool.

Alas, none of the above proved enough to match the promises made by IBM's marketing department. Stretch was delivered a year late and ran substantially slower than promised. It was still the fastest computer in the world, and remained so until 1964, but by then the market for high-performance computers had changed. Ten were made, going to atomic research institutes and spooks: the S/360's first customers were banks and airlines. There was a plan to build directly on the Stretch architecture, but it was canned. Although a commercial failure, Stretch was nonetheless IBM's best investment in research and development -- an exercise that Steve Jobs was to repeat 30 years later with the proto-Macintosh, Lisa.

All of which left some of the chief designers free to do other things. John Cocke, one of the brains behind the simulator and a true genius in many fields, took what he'd learned and the ideas he hadn't been able to get into the machine, and eventually ended up with an idea called RISC. IBM adopted that wholeheartedly with the PowerPC architecture, as did others such as Acorn, with its ARM processor. If you own a modern IBM mainframe, a mobile phone, a set-top box, an Apple or even an iPod, you'll possess some of the spirit of Stretch. The modern PC, needless to say, would simply never have happened otherwise.

So, while you toast the success of the S/360 -- another small sherry? -- remember that it and almost everything else you'll touch with a chip inside is the inheritor of a burst of unmatched innovation, one that flowered years before, in the unholy light of Trinity.
