The forgotten story behind IBM's 'first mainframe'

Summary: IBM is celebrating the 40th anniversary of its S/360, the first commercially successful mainframe. But another design deserves most of the laurels

TOPICS: Tech Industry
It's always a pleasure to celebrate the anniversary of a technology that's older than me. Thus, small sweet sherries all round for the birthday of IBM's S/360 mainframe, launched 40 years ago today.

The venerable machine is being feted around the world as the grandfather of modern computing: it brought such innovations as lookahead, pipelining, branch prediction, multitasking, memory protection, generalised interrupts, and the 8-bit byte to the commercial market. For those of us who've been brought up on a diet of microprocessor roadmaps, it's a welcome reminder that the latest, greatest chips depend on inventions dating back to the days when the Beatles still wanted to hold your hand.

But while IBM is keen to fly the flag for the S/360, an architecture that made the company emperor of the world for two decades, the real star has been nearly forgotten. All of those good ideas came from a computer that truly deserves the crown of first mainframe; one that can trace its origins to 10 years earlier and, plausibly, to a decade before that, when a remote valley in New Mexico was briefly illuminated by the light of the first atomic blast.

It is no coincidence that the end of the Second World War saw the start of digital computing. As well as the now-famous work done by Turing and others at Bletchley Park, atomic weaponry research in the US had proved two things -- that nuclear and thermonuclear bombs would define the course of the rest of the century, and that designing the things required more sums to be done than was humanly possible. The push for high-powered computation was on.

By 1955, the University of California Radiation Laboratory was looking for a computer faster than any yet built. IBM bid but lost to Univac -- then the biggest computer company -- and IBM hated to lose. The company came back a year later with a proposal to Los Alamos for a computer with "a speed at least a hundred times greater" than existing machines. It won that one, and had four years to deliver the beast. The project was officially called the 7030, but was far better known as Project Stretch -- it would stretch every aspect of computing.

The innovations began right at the start. Stretch would be built with a brand-new invention, the transistor, and it was the first design to rely on a simulator. This was built by John Cocke and Harwood Kolsky early on, and let the designers try out new ideas before committing them to the final machine -- a method of working that has since become universal.

Rupert Goodwins

  • A very interesting article.

    Of course I'd love to see something on the *real* fathers of computing. Maurice Wilkes, John Pinkerton and John Simmons deserve to have the story of LEO told.

    Unlike the IBM guys that Rupert describes, they really were innovating in both technology AND its application.
  • Often, in an industry so full of hype, where all of us are pressured to have the latest and greatest products, we lose sight of the fact that many of the ideas being implemented in microprocessor-based systems go back much further. It pleases me to see credit given to some of the original developers and thinkers. I found this to be a well-written piece on computing history. How often do people upgrade when what they are currently using can still do the job?