
IDF: Tech, show business--a strange mix

What really went on at the Intel Developer Forum? ZDNetUK's technology editor describes what it was like to be on the floor at Intel's school assembly, religious revival and talk show.
Written by Rupert Goodwins, Contributor

Commentary: You'll already have read the report about the marketing and technical thrust of the Intel Developer Forum's first keynote. What doesn't make the news is the ritual nature of the affair, and all the peculiar things that happen when an engineering company gets involved in show business.

It's a mixture of school assembly, religious revival and TV talk show, with a rather chummy sense that audience and showmen are all in it together.

Imagine a huge auditorium, more shed than meeting space, with thousands of chairs beneath TV studio style lights, speakers and special effects projectors. At the front is the stage, which this time looks not unlike one of the later Dr Who sets: there are a number of featureless ten-foot-high towers at the back made from curved panels that reflect camera flashes from the audience in a distracting shower of light whenever something interesting happens. We discover later that the towers are not what they seem.

Pat Gelsinger, Intel's chief technology officer, kicks things off. He's been to 12 of the 13 IDFs to date, he reminds us, and perhaps it shows. He normally has the enthusiastic, no-nonsense demeanor of a youthful teacher at a minor public school who's good at his subject and knows it, carrying the boys along by charisma underpinned with a strict no-fools policy. This time, he's looking a little gaunt, his cheekbones unhealthily prominent under the lights. But then, how bouncy can you be when you have to start off an IDF? You have to say things like "It's all about computing" to thousands of engineers, who may already have guessed. You then have to say "Convergence. It's great!" and mean it. The theme of this IDF, Gelsinger continues, is CCC--Computing, Communications, Convergence.

Unfortunately, most engineers are keen on science fiction and would know Harry Harrison's masterpiece Space Rats Of The CCC, a satirical take on every Independence Day/Starship Troopers cliché under the sun. CCC stands for Cosmic Camel Corps, and it was with this image still firmly in my head that I watched Gelsinger introduce Intel's biggest cheese, Craig Barrett. There is a short burst of gentle ragging--Barrett has a horse called NASDAQ, on which you probably wouldn't put your money just at the moment--and then it's into the main event.

Most reports of Intel keynotes will pull out the carefully honed key phrases. Intel predicts growth, new technology sectors, continued importance of research, and so on. That's why the phrases are there. But the tone is harder to get across, and this was a very low-key affair.

"We are still optimistic about the future," Barrett said, as if there was some doubt that he might be. Graphs appeared on the video screens at the back of the set, one showing IT spending since 1962. A slow climb turns into a mighty surge in the 90s, and then collapses in 2000. It heads towards the deck for another couple of years. And then, as if by magic, it resumes its upwards climb at the end of 2003 as if nothing had happened. A small note in brackets says (predicted) at this point. Hmm.

More figures appear from analysts, forecasting growth of 4-7 percent, "but I'm not subscribing to any of these," said Barrett. "When I get up in the morning," he said, in a tone of voice that just hinted it wasn't his favorite activity any more, "I'm sometimes optimistic, sometimes pessimistic. But it's mostly optimism." If this is the key idea that Intel wants us all to carry around during our days at the forum, it's rather unsettling.

The savior of us all will be the Internet. Oh, and mobile phones. There follows a snappy video of lots of people in far-off places like Thailand, Vietnam, Australia and France saying "Technology, it's super," to a retro 70s synthesizer soundtrack. There's always a short video in the middle of the first keynote, and I've never worked out why. This is a classic of its kind: relentlessly optimistic, feel-good and empty. It has a computer graphic that looks like the Earth being bombarded with meteors, which can't be right.

Barrett then warmed to his theme that the Internet is fab. Never mind the details, feel the numbers! A billion users by 2006! The Japanese can get 100 megabits per second delivered to their home! The Russian railway system has a billion items in its inventory! For a happy moment, we're all back in 1998 where made-up numbers alone guarantee future prosperity. Everyone's incredibly excited, said Barrett. The only place where technology is considered passé is the United States.

And what will rescue us, and bring the global sense of excitement back to its home? Here, the story gets even fuzzier. A picture of a planar transistor pops up, next to an original IBM PC and a screenshot of the first Yahoo! home page. "Each led to the other, but the inventors of each had no idea that would happen." The cellphone is as revolutionary as the PC. The unspoken conclusion is that Intel has no plan beyond making the bits and seeing what happens: we don't know what will come about, but trust us. It'll be wonderful.

The ritual now demands the solemn invocation of Moore's Law--which, given that it lies at the very heart of Intel, is fair enough. Up pop the graphs of prediction versus actual results, and once again the sheer magic of the numbers is considered enough. Silicon technology is anti-inflationary, said Barrett. It does more for less each year. Given that the economists are currently biting their nails over anti-inflation--the three Ds of debt, deflation and depression--this little bon mot left over from the wonder years of the 90s sits uncomfortably in the mind.

But don't worry. If we've had Moore's Law then it must be time for the practical demonstrations of fab new technology! It's now that the strange pillars at the back of the set come alive--the one on the far left silently turns, revealing a lab bench packed with goodies. I half expect Craig Barrett to say "You could have won this!" in the manner of Ted Rogers teasing Beryl from Crewe, but instead he invites a couple of Intel lab guys onto the stage and they show off their wares. It's good, meaty stuff involving phase-shift amplitude modulation of a laser, and they then shove a digitized Pat Gelsinger, shown riding last year's Segway, through a large roll containing five kilometers of optical fiber. Round of applause, on to the next one.

It gets better. A Computational Nanovision Researcher pops up--impressive title, notes Barrett--who turns out to be an even more impressive Teuton with an impeccable Dr Strangelove accent. Horst Haussecker clearly loves his work and takes it very seriously: he's involved in building machines that render Intel's very tiniest components visible. These distant cousins of the electron microscope have the problem that at the level where you can see nanometer-sized features there's a huge amount of video noise obstructing the picture. "Looks like the Milky Way," said Craig about one very noisy picture of a tiny device. "Yes," said Horst, "but much smaller."

Next up were demos of Newport, the new mobile platform for 2004. It was a tablet with a rather nice removable keyboard, but it also had what Intel called 'closed-lid' computing: a small LCD set into the outside of the case, together with a couple of buttons. The idea is that if you get e-mail via wireless you can read it and send stuff back without having to open up the computer. Why you wouldn't want to do that on your cellphone via Bluetooth--and why anyone would leave their portable computer on all the time--was not covered.

Onwards! Here's Marble Falls, a desktop technology that does dual independent audio and video channels. That's two screens at once, in layman's language. The demo for this was rather gruesome, showing how medical experts from around the world had collaborated on head surgery for the separation of conjoined twins. The most interesting bit--on the screen without the detailed model of the inside of a skull--was a rather fractal knowledge model. You clicked on an idea, represented as a point on a circle of connected ideas, and it expanded into its own circle, bringing with it a more detailed breakdown of the underlying concepts. These had names like "Multiple Birth", "Heart Physiology" and "Children's Hospitals"--although I still can't work out what "House of Lords" and "Robbery" were doing there.

The final part of an Intel keynote demonstration is the ritual humiliation of a very bright, very young engineer. Our fall guy this time was showing off real-time video editing, and threw together an MTV clip of him and some dancers doing their stuff in front of tumbling multicolored polygons. And look, you can put it on a handheld video player, or send it down an ADSL line.

The show's closing number was a sneak peek at the trailer for the new Matrix videogame, which apparently bridges the gap between the Matrix II and Matrix III movies. It looked very swish, and was all done on Intel processors. So that's nice. But no, we couldn't take the video file away with us. I was once again reminded of the Cosmic Camel Corps, and walked out into the San Jose sunlight with a head full of surreal images and the deep desire to get stuck into some real design engineering.

Rupert Goodwins is the technology editor for ZDNetUK.
