It's Day 0 at the Intel Developer Forum, the traditional pre-show press briefing. I got in to San Francisco on Saturday to find that the Marriott, the hotel where the press is usually billeted, is in the middle of being dramatically remodelled. I wander around a once familiar place feeling curiously disconsolate, like an old farmer who comes back to his village to find it levelled by an earthquake.
The broadband has got better – a solid two megabits a second back to the UK – but the in-room coffee machines have been dramatically downgraded to devices that dispense a paper cup of liquid so weak the caffeine in it is as undetectable as interstellar dark matter. You know it's there in theory, but CERN will have to be upgraded again before anything shows up on the scanner.
And the (rather nice otherwise) chopped salad I have for Sunday lunch is served in a scooped shiny white porcelain bowl with a bulge at the back. The effect is like eating from a urinal. Not slightly like – dramatically like.
But following the usual weekend activities – buying books at City Lights, having a nice pint outside the San Francisco brewpub while reading said books, fending off the hookers who see you having a nice pint outside the San Francisco brewpub while reading said books – it's time to buckle down. The massed hacks assemble in a gloomy basement ballroom, and await trial by PowerPoint.
The day starts inauspiciously, with Intel director of research Andrew Chien doing an introduction to Intel Research itself. This kicks off with a light shower of marketing phrases, which bounce harmlessly off like spring rain on a greenhouse roof. There is engaging talk of “Driving off-roadmap innovation”, which brings up a Clarksonian image of engineers getting stuck in muddy tracks with their wheels spinning. There is hackle-raising mention of technology “enabling me to achieve the goals I value most” by acting as a proxy, which sounds like giving computers powers of attorney. And as for “easily form and enrich relationships” – it's taken me half my life to learn not to form certain classes of enriching relationship too easily.
Then we get the first tempting dollop of actual stuff – a UbiFit. Not only does this sound like something from Philip K Dick's imagination (he did live just down the road, after all), it acts like one too. It's a belt-mounted device that monitors your various vitals and lets you know whether you're doing enough of the right sort of exercise. I briefly fantasise about one tuned for journalistic life – “You have now had your... eighth... gin and tonic and... twenty five... Marlboro Reds. Daily target achieved.” And here's a bracelet RFID reader that lets you know about various household objects as you go to pick them up, and a mobile phone that grows a mixed field of flowers on its screen when you achieve the right mix of exercises and dietary choices. The machines are beginning to infiltrate our quotidian activities.
The same thought has occurred to others. Chien is going on about how different sorts of sensor will monitor different parts of everyday life to our advantage, when he is interrupted from the floor.
“Any good for shagging?” asks 'Mad' Mike Magee, the Inquirer's maverick master hack.
“Pardon?” says Chien. It is the only possible response.
Magee backs off a bit: “Aren't these things a bit intrusive?” he amplifies.
Chien gratefully swipes this one over the boundary, explaining that they'll only work when they're not too intrusive, how Intel is working all this out, and so on. My laptop – a rather gorgeous Sony VAIO VGN-TZ11XN I've borrowed for the trip, and more on that later – bleeps quietly, and a friend pops up on videoconference. It's pleasingly surreal, having Intel Research talking about how machines will help mediate relationships in the future while I'm pulling faces at my pal back in Europe right now.
And there's more. Intel has got the mashup bug. Wait, wait, I know – just two years after everyone else, right? The interesting part is that Intel thinks that this is the start of something much bigger, and that it's been working on tools to create complex compound sites basically just by browsing. You can see for yourself what's going on at Mash Maker — it may get renamed, though.
The Intel Research talk ends in a rush – Chien is out of time, so skips at warp speed through the most interesting bit: Intel's work on biosensors that can electrically detect a large range of interesting gunk coursing through your blood. I can't begin to describe a world where ubiquitous machines can detect what's going on in our bodies – well, I could, but only if I took six months off with the Philip K Dick Memorial Amphetamine Jar to write a dystopian novel about the death of dyspepsia. And I'm sure it's pure coincidence that Chien used to work on supercomputer design when he was a professor at the University of Illinois at Urbana-Champaign: you know, the place that built HAL 9000.
OK – time for the next speaker. After such a blast of futurology, you'd normally hope to come back down to earth with something dry about chipsets. But this is IDF, the only industry event that spends half its time in orbit. Mario Paniccia, director of the Photonics Technology Lab, gets up and announces the world's best performing silicon germanium photodetector. That's 40 gigabits per second with a dark current of less than 200 nanoamps, built in such a way that it can be integrated onto a chip alongside all the other stuff – 40Gbps modulators, multiplexers and electrically pumped silicon lasers.
There'll be more on what that all means later – the mixture of silicon and germanium to create waveguides and detectors on a chip is exceptionally clever and quite difficult, and deserves an article all by itself – but Paniccia points out a few salient areas of interest. Prior to 2004, he said, the fastest modulation a silicon optical device could manage was a megabit a second. In 2004, that was raised to a gigabit, then ten gigabits, and now they're at 40 gigabits per second. That's more than four orders of magnitude in a handful of years. “We're not arguing now whether 100 gigabits a second is possible,” he said, “but whether it'll be good enough.”
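If you want to check the scale of that jump yourself, it's a one-liner – plugging in only the figures quoted in the talk (a megabit a second before 2004, 40 gigabits a second now):

```python
import math

# Silicon optical modulator speeds quoted in Paniccia's talk, in bits per second
pre_2004_ceiling = 1e6   # one megabit a second, the pre-2004 limit
today = 40e9             # forty gigabits a second now

# Orders of magnitude between the old ceiling and today's rate
orders = math.log10(today / pre_2004_ceiling)
print(f"{orders:.2f} orders of magnitude")  # about 4.60
```

Which is why “three orders of magnitude” would actually be underselling it.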
Another illustration: the world speed record for fibre optic data transmission is held by NEC, and clocks in at 25 terabits a second. All in all, said Paniccia, that took enough equipment to fill half of this room – a room currently filled with a couple of hundred journalists, or about as big as Screen 4 at your local multiplex cinema. “We can take twenty five of our lasers, twenty five modulators, and so on, and put them all on one chip – total throughput, one terabit per second. That's on a sliver of silicon the size of a fingernail.”
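Paniccia's arithmetic checks out, too – twenty five channels, each at the 40Gbps rate he announced earlier (a quick sanity check using only the numbers in the talk):

```python
# Figures quoted in the talk
channels = 25     # lasers and modulators integrated on one chip
rate_gbps = 40    # per-channel modulation rate, in Gbps

# Aggregate throughput of the single-chip device
total_gbps = channels * rate_gbps
print(f"{total_gbps} Gbps = {total_gbps / 1000} Tbps")  # 1000 Gbps = 1.0 Tbps
```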
All this is by way of supporting the Terascale project: if you've got a chip doing a million megaflops, it's going to need a lot of data. So if you can integrate an optical link on the chip capable of supporting that, it's a Good Thing.
“We'll commercialise this by the end of the decade”. Y'know, that's around two years away.
Phew. At this point, my own time runs out – there'll be more later, including stuff on how Intel is pushing software development tools for multicore systems (précis: slow advance in a number of areas, no one big breakthrough achieved or expected; the industry educating itself is key). Have a look at WhatIf, a community-flavoured area where Intel is delivering prototype programming tools in exchange for feedback on how to make them better. Best bit is probably a mixed-mode debugger that knows about Java, C and C++.
And how about an accelerator exoskeleton? Intel has one. But I don't know what it is. Yet.