We're fifty years into the future

Fifty years ago today, a revolution started. Let's not stop the party just yet.
Written by Rupert Goodwins, Contributor

Ladies and gentlemen – please raise your glasses and toast the Regency TR-1. On 18 October 1954, this revolutionary device was announced in America. Fifty years later, it has been blamed for rock and roll, the death of the US consumer electronics industry, the relentless rise of IBM and the shocking state of modern manners. Not a bad score for a transistor radio.

It wasn't just a transistor radio, of course. It was the first. In fact, it was the first transistorised mass-market device, and it symbolised the central role that technology was taking in the post-war world. Never underestimate the power of such symbols – Thomas Watson Jr., head of IBM, gave his senior managers a TR-1 apiece to kick-start the company's transition from valves. That symbolism had a different flavour ten years later as outfits like Sony and Toshiba used the same technology to smoothly wrest control of the market from its inventors. Outsourcing fears are nothing new.

A lot has changed. The TR-1 had four transistors and cost $50; last week I bought a 256MB SD card – for a radio, appropriately enough – at about the same price. That has two billion transistors in it, or four thousand times as many as were used in the entire production run of the Regency. Factoring in inflation, each transistor now costs around one four-billionth as much. We're living through an industrial revolution of unparalleled speed and reach – and it's all borne aloft on a massive tsunami of transistors.
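
For the terminally curious, here's that back-of-the-envelope sum as a rough Python sketch. The TR-1 production run and the inflation multiplier are my own illustrative assumptions, not figures from the paragraph above.

# Rough check of the transistor arithmetic above.
# The production run and inflation multiplier are assumed, illustrative figures.
tr1_transistors = 4                       # per Regency TR-1
tr1_price_usd = 50.0                      # 1954 launch price
tr1_units_built = 125000                  # assumed production run
card_bytes = 256 * 10**6                  # the 256MB SD card
card_transistors = card_bytes * 8         # roughly one flash cell (transistor) per bit
card_price_usd = 50.0                     # about the same sticker price in 2004
inflation_factor = 7.0                    # assumed: $1 in 1954 worth roughly $7 in 2004

print(card_transistors / (tr1_transistors * tr1_units_built))   # ~4,000x the whole TR-1 run
cost_then = tr1_price_usd * inflation_factor / tr1_transistors  # per transistor, in 2004 dollars
cost_now = card_price_usd / card_transistors
print(cost_then / cost_now)                                     # a few billion times cheaper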

Where will it go? Let's skip forward to 2054. For the same effective price as a TR-1, a straightforward extrapolation promises a memory card with two yottabytes – one yottabyte being around 10 to the power of 24 bytes (if you don't like those numbers, feel free to substitute your own). Artificial intelligence will have evolved by then, because there's simply nothing else to do with those sorts of numbers – our hypothetical memory chip will have the same number of transistors as the synapses of around ten billion people. That's practically a planetful.
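
Here's one loose way of getting to those numbers, again as a hedged sketch: the doubling period, the one-cell-per-bit figure and the synapse estimate are all assumptions you can argue with, exactly as the paragraph invites.

# A rough version of the 2054 extrapolation. All the constants below are
# assumed for illustration; substitute your own numbers as you see fit.
card_2004_bytes = 256 * 10**6             # the $50 SD card of 2004
doubling_period_years = 0.95              # assumed: bytes per dollar doubling just under yearly
years_ahead = 50

card_2054_bytes = card_2004_bytes * 2 ** (years_ahead / doubling_period_years)
print(card_2054_bytes)                    # ~2e24 bytes, i.e. around two yottabytes

card_2054_transistors = card_2054_bytes * 8   # assume one flash cell per bit
synapses_per_brain = 1e15                     # estimates run from about 1e14 to 1e15
world_population = 1e10                       # a plausible 2054 headcount
print(card_2054_transistors / (synapses_per_brain * world_population))  # on the order of 1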

There are some small problems to overcome on the way. Nobody knows how the brain works, although there's lots of fascinating work being done in cognitive neuroscience – last week, researchers at the University of Rochester announced that adult ferrets used 80 percent of their brain's processing power to think about things after being shown clips from The Matrix. Nobody knows what this means (although the figures may be substantially lower for the sequels), but the fact remains that we are developing some very powerful tools to peer into the workings of mind.

It may be coincidence that the human brain appears to be made up of a number of discrete processing units bound together by a complex system of buses – the same direction processor designers are taking in an attempt to use their cornucopia of transistors. Certainly, the commercial pressures are all in place to make chips better at analysing the same sense data with which humans are most comfortable: vision, speech, even touch, smell and taste are areas receiving substantial funding. We know the way we think is intricately linked with the way we sense: it is inconceivable that research here won't feed into more general theories of artificial thought.

Such considerations may seem a thousand light years away from the concerns of today's IT, where we struggle to keep our dull brutes going on a diet of buggy software and tainted data streams. Security and management in large, interconnected systems are hard enough without dreams of hyper-intelligent silicon AIs. Yet evolution created mind because it is a powerful way to analyse threats, co-ordinate groups of individually weak creatures into strong societies, and map the world around us to reflect and control danger and opportunity. If you've been listening to the promises of the major business software vendors, you may recognise these ideas.

It would be trite at this point to say that effective evolution demands an environment where modifications can be freely made to existing systems: you're a smart kid. You draw the analogies. But it's not overly simplistic to point out that if fifty years of lightly moderated innovation can take a four-transistor radio capable of playing Elvis Presley and relaying sports results and turn it into a global network that consumes and generates our culture and businesses, it would be rather short-sighted to put the brakes on just as it's getting interesting.

Other notable dates in computing's history:

14 December 1994: First W3C meeting

1 January 1983: ARPANET officially switches from NCP to TCP/IP

22 May 1973: Bob Metcalfe writes a memo at Xerox PARC describing Ethernet

18 July 1968: Intel incorporated in California under the name NM Electronics

7 April 1964: IBM launches S/360, the first modern mainframe

1962: Spacewar!, one of the first computer games, runs on a Digital Equipment Corporation PDP-1
