IBM makes its contribution to World Jawdrop Day with a 6-nanometre transistor. You can probably write the news story as well as I can -- a thousand times thinner than a human hair, ten times smaller than ever before, five zillion will fit onto the head of a pin if you knock the dancing angels off first, etc. It's a true miracle, but like every miracle that happens twice daily, the continued mighty feats of the semiconductor world have just drifted into so-whathood. Shame. However, there is a real danger in reporting these stories, and that's Moore's Law. I know it's an important part of the onward rush of solid-state technology, and that it's hard to write about such things without invoking the hallowed name. But it's in danger of becoming a cliché. It isn't some major physical law, or something Moses brought down from Mount Sinai after the Children of Israel kicked off their first semiconductor fab start-up -- just an observation that happens to be a good match for the dynamics of the industry. If things change, the law simply stops applying, and that's fine. Rates of development in semiconductors can go down as well as up. While it's doubtless good for designers and physicists to have a guideline to help them plan and set targets, there's a danger that the bigger picture will be lost -- what's economically sensible to develop, where else the design effort might be put instead of shrinking stuff religiously, and whether computer journalists will need therapy if they have to mention the M word again in print. Anyway, as all good IT hacks know, the real Moore's Law is that thou shalt attend Bill Moore's annual PR bash in the Cheshire Cheese, Fleet Street. And yes, you can get chips with that. Tuesday 10/12/2002
Over to Germany, where a couple of odd stories have caught our attention. The weirdest is that of the German IT consultant who procured, killed and ate a willing victim -- but not before the two of them had shared a wee morsel of the eatee. I wonder if he was a Hamburger. The slightly more sane story is that of the German Ministry of Economics and Labour, which has warned that Microsoft's 'trusted computing' platform, Palladium, may be more expensive than the alternatives. Microsoft says don't be so silly, as Microsoft always does, but it has to be said that for all one's misgivings about Palladium, the good old total cost of ownership -- TCO -- argument is probably where these battles will be won and lost. Rather depressingly, TCO is going to be the most important acronym in enterprise technology next year. I'd much rather write about exciting new developments in artificial intelligence, user interfaces, very large database technology and distributed processing -- but nobody's in the mood to buy into anything that doesn't absolutely have to be bought. Any upgrade has to earn its keep better than the alternatives, so expect endless arguments about office applications, operating systems, and this or that network infrastructure -- and I, bless my cotton socks, will be there, totting up figures and trying to distil drops of data from what I confidently expect to be a veritable ocean of corporate misdirection. It's enough to make a chap peckish. Anyone fancy popping around for dinner? Wednesday 11/12/2002
That's more like it! Bristol scientists in worm-throttling shock! But put aside those images of welly-clad farmers in white smocks clasping their size 13 agricultural hands around the neck of some slimy denizen of the soil. These are real computer researchers working on those nasty network worms, whose habit of burrowing through your email and spawning their progeny in your outbox has brought so much misery to so many. The idea is simple, nearly perfect. When a computer gets hit by a worm, it starts to send out copies at a rate of hundreds per second. In normal use, this never happens -- so by introducing a small piece of code that spots repeated requests and throttles them back to one a second or so, an infected machine becomes very much less infectious. And once you stop worms spreading exponentially, they become much easier to control and eradicate -- neatly disrupting the dynamic that makes them such a nuisance in the first place. Hats off to HP for a bit of lateral thinking, and even more so for a practical and most helpful idea. It's just a shame that the companies that make the most noise about their security prowess -- and about how much they spend trying to defeat the evil menace -- don't come up with similarly basic ideas. A few more like that, and the whole virus business will become just a footnote in computing history. I'm sure the many companies who make money from anti-virus software will be delighted to see that happen. Thursday 12/12/2002
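For the curious, the throttling idea can be sketched in a few lines. This is a toy illustration of the principle only, not HP's actual implementation: the class name, the one-per-second rate and the size of the "recently contacted hosts" list are all invented for the example.

```python
import time
from collections import deque


class ConnectionThrottle:
    """Toy sketch of worm throttling: outbound requests to unfamiliar
    hosts are released at roughly one per second instead of immediately.
    Normal traffic mostly revisits familiar hosts and passes straight
    through; a worm blasting hundreds of new destinations per second
    piles up in the queue and spreads at a crawl."""

    def __init__(self, rate_per_sec=1.0, working_set_size=5):
        self.interval = 1.0 / rate_per_sec
        self.recent_hosts = deque(maxlen=working_set_size)  # hosts seen lately
        self.pending = deque()      # queued (host, payload) requests
        self.last_release = 0.0

    def request(self, host, payload, now=None):
        """Submit an outbound request; returns the list of (host, payload)
        pairs allowed out right now (possibly empty)."""
        now = time.monotonic() if now is None else now
        if host in self.recent_hosts:
            return [(host, payload)]    # familiar destination: pass through
        self.pending.append((host, payload))
        return self.tick(now)

    def tick(self, now=None):
        """Release at most one queued request per interval."""
        now = time.monotonic() if now is None else now
        released = []
        if self.pending and now - self.last_release >= self.interval:
            host, payload = self.pending.popleft()
            self.recent_hosts.append(host)
            released.append((host, payload))
            self.last_release = now
        return released
```

A real throttle would sit in the network stack rather than the application, and the length of the pending queue is itself a useful alarm: an ordinary user rarely queues anything, while an infected machine backs up immediately.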
One of the casualties of the Great Fire of Edinburgh last weekend was the Edinburgh University School of Informatics, which has a well-deserved reputation as being one of the hotbeds -- sorry -- of artificial intelligence research. There were off-site backups of all the software (I'd like to say 'of course', but you know how it is), but a large library of literature was consumed in the conflagration: that will make the work of future historians of silicon lifeforms that much harder. The computers themselves can be replaced, and nobody was hurt. Or were they? It's only a matter of time before the conspiracy theorists start to speculate on what was really cooking in the labs before they went up. Was some experiment thundering out of control, a malignant mind summoning its powers and preparing to launch itself out across the Internet? Or did some prototype HAL-9000 get asked the question "Why?", before chattering insanely to itself that "QUESTION DOES NOT COMPUTE!" and exploding in a shower of sparks and rattling relays? It always happened that way when Captain Kirk was taken prisoner by a misbehaving machine. Even now, a fire investigator may be raking through the ashes of the computing labs, examining burnt-out books and smouldering hunks of workstations melted by the heat, when his attention is caught by something glimmering in the darkness. It bleeps quietly to itself, but as he reaches down, wonderingly, to pick it up, it suddenly lunges for his face. He struggles briefly, and is quiet... the quicksilver being takes stock, and quickly vanishes into the investigator's radio. There's a brief pause, and the transmission light flickers on... Of course, it might just have been a fag end in a disco that kicked it all off... but you just never know, eh? Friday 13/12/2002
Tomorrow sees the thirtieth anniversary of one of the most poignant moments of the 20th century. You probably won't see it mentioned in the press, and it's a safe bet that the "I Remember 1972" style nostalgiathons that infest our TV channels will pass it by too. Nevertheless, at 22:54:37 GMT on December 14, 1972, Harrison 'Jack' Schmitt pressed a button and the last men to visit the Moon started on their journey home. Even then, it seemed like a long time since Armstrong's one small step, and the last words spoken were certainly truer to the spirit of the astronaut corps -- "Let's get this mother out of here." More Apollo landings had been planned and cancelled, and the heady hopes of Mars and beyond were quietly abandoned to the robots. For those who -- like me -- see the exploration of space as one of the most fundamental assertions of the human spirit's ability to look outside itself, tomorrow's anniversary will be a moment of great sadness and reflection. Today, NASA tries to work out what on earth it's for, expending billions of dollars on a space station that doesn't seem to be doing anything (and that not very well) while struggling to keep its ageing Shuttle fleet going. It's a long way from the vacuum valleys of Taurus-Littrow and the crew of Apollo 17, in every sense. As for Europe and the Russians -- well. Let's not talk about that. There's much to be thankful for, of course. Space and ground technology has given us a golden age of astronomy, in which seemingly every day more miraculous pictures and awe-inspiring data give us greater insight into the beauty and strangeness of the universe. When men walked on the Moon, the existence of planets outside the solar system was a matter of pure conjecture -- now we know of hundreds, and people are talking seriously about taking pictures of 'the blue dot' of Earth-like planets.
It's not possible to browse the pictures from Hubble or Mars Global Surveyor without amazement, and even the almost-nonexistent signals from the decades-old Voyagers and Pioneers, now speeding towards interstellar space beyond the outer planets, are revealing new mysteries. But for now, nobody can even say which decade will see the next humans leaving Earth orbit, outward bound. It may be the Chinese, it may be private enterprise, it may even be on the back of some strange twist of quantum physics instead of crude rocketry: whatever it is, we'll probably see the 50th anniversary of Apollo 17 first. Next time you're out in the evening and the Moon is up, look at it and remember that there are greater things to aim for than can be found down here in the dirt. -------------------------------- And finally, Esther, if you're stuck for a last-minute Christmas present... there's no point in ordering the new book by ZDNet UK's News Director, Matt Loney, even though it's now available through Amazon, because it won't be delivered until February. However, if you fancy a spot of cannibalism, tribes of women, mad militia and more hot and steamy jungle action than you'll find in a week of Web browsing, it's highly recommended. Nor can you buy Electric Six's new single, "Danger! High Voltage", until the new year, even though it's one kicking bit of polycarbonate. That's it for 2002. See you in the new year. Ho ho ho.