It's not the greatest computer in the world, but it's mine. An anonymous model cobbled together from various bits left over from other projects, most of it would have been state of the art a couple of years ago — but now it's just a 2.4GHz P4 with 512MB of RAM (which, being Rambus, I cannot afford to upgrade), a graphics card that's good enough for Google Earth and thus good enough full stop, and about 350GB of hard disk.
And now, it's deader than Scotty. I'm used to it being sick — it's a Windows machine, after all, and I've lost count of the number of times some random piece of software has infested the registry with the binary prions that lead to Mad Owner Disease — but this is different. Without warning, the screen instantaneously froze solid in a way that mere software can never quite manage. Flicking the power switch proved only that there was something very badly wrong with either the motherboard or the processor: I might as well have been praying at a cargo cult shrine made of sand and cow dung.
There's not much you can do in this situation. Reseat all the expansion cards, memory and processor, check the power connections… but it's a hot day. I know what the problem's going to be. Still, go through the motions. Charging! Stand clear!
The corpse doesn't even twitch. The CPU fan is working, though, which scotches my number one theory; the last two dead PCs I've had have been due to the failure of this particular quid-fifty part. Still, I carefully dismantle the heatsink assembly in a search for clues.
Aha. Beneath the fan sits a thick, fuzzy ring of dust, blocking the air. The last time I had this problem was on an Austin 1100 when the radiator clogged with oily gunk, leading to persistent boiling and cursing. Without a doubt, the processor has been cooked to perfection. At least my ancient Austin had a red light that came on when things got too hot — an art that appears to have been lost. Short of a regular inspection of the innards of my PC, it's hard to see what could be done to prevent this; that mere dust can bring down the cream of modern technology seems a bit ironic.
Now what? New processor, and carry on? That's simplest. New motherboard? That implies reloading XP, which is never pleasant — or switching to something else, which means such a wholesale investment in faffery that my soul sinks at the prospect.
It's at times like this that I sigh after my fantasy domestic thin client hosted service — a virtual PC sitting on someone else's properly redundant machinery. If my local PC gives up the ghost, then it doesn't really matter what I do to fix it — even if my flat spontaneously combusts, I can be back in business half an hour later at a friend's place.
Still, that ain't happening. It's time to don the spotless breastplate of investigative technical journalism, take up the trusty sword of fearless faith in the future and launch myself into the bold new dual-core future. Steps are underway to obtain and deploy a new mobo with fancy multiminded silicon from Intel, and I shall report on what pleasures and pain result. And I trust the company's involvement in the 'Hot Chips' conference at Stanford next month is no pointer to future immolations on the altar of the almighty watt.
Contrarianism makes for vivid journalism. [Omnes: Oh no it doesn't! Goodwins: Oh yes it does!] The fine art of selecting a sacred cow and saying it's a pig can inject a useful jolt of surrealism into the veins to unblock the blood clots of assumption, but it's not without its risks. Used too often, it can generate an unhealthy addiction to the rush of attention it brings — and as that rush diminishes on repeated application, the urge to up the dose can lead one into a state of advanced dissociated irony.
It was thus with some interest that I watched two of my favourite contrarianists get to grips with one of my favourite heavenly bovines. First to shout "Hey, Bull!" was John Dvorak, with whom I shared a magazine for five years in the paper-based 1990s. For most of that time, he stuck to his side of the Atlantic and I to mine, but there were plenty of occasions when we found ourselves at the same event. My advice to anyone similarly blessed is to stick tight: the man has a psychic ability to find the oldest and most unfeasibly expensive brandy within twenty miles, and thence to locate the only PR within thirty who can pay for it. I owe my only experiences of spirits older than myself to Dvorak, and that buys a considerable amount of affection.
But what on earth was he on when he ladled out the boo juice over Creative Commons? A bunch of misguided idealists poisoning copyright? A licence-like entity that does nothing but prevent commercialisation of ideas? A power-grabbing middleman appropriating the right to mediate otherwise unhindered transactions of intellectual effort? Where did that lot come from?
I use CC for my photographs on Flickr, not because I have some ill-formed wish to appear hip and trendy (if I want to do that, I tend to adopt my own contrarian inverted snobbery — usually with unintended, self-defeating results) but because I find it a very convenient way to express how I wish my copyright to be used. If you can think of something fun and good to do with my IP then go ahead, you have my explicit and well-defined permission to do so freely, but if it's intended to make money then talk to me first. CC breaks down the mystique of copyright and makes it useful; it's anti-FUD. Convenience amplifies utility, and copyright is supposed to be useful.
Following on from Dvorak's peculiar cookery, Andrew Orlowski applies the sprinklies to the top of the cake. Orlowski and I have sat opposite each other in practice but rarely in spirit; yet oh boy, can he be hard to keep up with. While the rest of us are pootling around in three dimensions he has some sort of intellectual warp drive that lets him twist the fabric of spacetime and deliver context from outside any known frame of reference. Often this is as invigoratingly unique as a snapshot from Hubble: this time he seems to be overdosing on Dvorakian radiation from a contrarian event horizon.
CC is bad, he says, because it restricts and distorts the potential of creativity to a particular and limiting materialistic and technological context, it doesn't solve the big problem of getting paid, and it doesn't produce much worth having anyway. Bunch of old hippies who don't even like good music. Pah.
Which are odd sins to lay at the door of an idea that, if you don't like it, you don't have to use — one which explicitly helps people get their ideas out there with at least some chance of a cultural or material return. It's true that most CC stuff is uninspiring, but I don't think Sturgeon's Law has been repealed yet, and you can hardly blame Professor Lessig for the Grateful Dead. And yes, it is a big fat digital thing, part of the same big fat digital thing that lets Orlowski (and me) lay out our wares in front of hundreds of millions of potential readers, instead of the few tens of thousands who read the trade press back in the day. Is that not a significant difference?
Still, if it gets people talking…
It’s getting increasingly difficult to divine the fate of Itanium. There’s a steady trickle of news about it — Intel launching a new version with a faster bus, SGI producing a new server based on the chip, and HP giving away a $3,250 Itanium server to anyone who attends a $2,000 developer workshop — but also a fair dollop of anti-news. That selfsame HP (Intel’s closest partner in Itania) is keen for us to think of Itanium as the chip of choice for mammoth servers with thirty-two processors or more, yet has just sold a whopping great 1,024-processor supercomputer to the United States Department of Defence — built out of AMD Opterons. Curious stuff.
The new version with a faster bus is also mildly confusing. Although Intel will happily sell you one, it won’t sell you the supporting chipset needed to make it do anything other than paperweight duty, nor a board to plug it into. The only people who do make such a chipset — and thus the only customers for the chip — are those crazy guys at Hitachi. No idea how many servers they ship but this time last year their quarterly sales — when combined with those from four other Asian companies — were pegged at seventy. Let nobody say that Intel doesn’t care for its smaller customers.
The faster bus is, of course, nicked from Montecito — the forthcoming and widely trailed dual-core Itanium that is due to appear in a blaze of glory (and a whirr of fans) later this year. It’s being bigged up something rotten, with Intel claiming that it will blow the socks off any other chip for floating point performance, with a single four-way, two-core device clocking in at 45 gigaflops. Twenty-odd of these should be able to reach a teraflop. For comparison, the first Cray-1 supercomputer in 1976 ran at 160 megaflops — about three hundred times slower than a single Montecito — and required around a thousand times more power from its own 150 kilowatt generator. And no, it didn’t run Windows.
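Never one to take vendor arithmetic on trust, I did the sums on the back of an envelope. Note that the per-chip power draw for Montecito is my own ballpark assumption — Intel hadn't put a firm figure on it at the time — so treat the final ratio as a sketch rather than gospel:

```python
import math

# Figures as reported: Intel's Montecito claim and the Cray-1 record books.
montecito_flops = 45e9      # Intel's claim: 45 gigaflops per device
cray1_flops = 160e6         # Cray-1, 1976: 160 megaflops
cray1_power_watts = 150e3   # the Cray-1's own 150 kilowatt generator
montecito_watts = 100       # ASSUMED ballpark for one Montecito package

teraflop = 1e12

# How many Montecitos to reach a teraflop? (rounding up)
chips_for_teraflop = math.ceil(teraflop / montecito_flops)  # twenty-odd

# How much slower was the Cray-1 than one Montecito?
speed_ratio = montecito_flops / cray1_flops  # "about three hundred times"

# And the power gap, under that assumed 100 W figure:
power_ratio = cray1_power_watts / montecito_watts  # "around a thousand times"
```

The sums hold up, give or take the usual marketing rounding: twenty-three chips gets you over the teraflop line, and the Cray-1 comes out a little over 280 times slower than one Montecito.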
I know we’re supposed to worry about the commercial side of things: just look at Cray if you want to know where the pursuit of power for power’s sake gets you. And I don’t honestly know what the global market is for teraflop-sized commercial systems — nor, I suspect, does anyone else. It’s almost impossible to imagine what on earth would happen if the Silicon Fairy turned up, waved its magic wand and gave us all teraflop computers sitting on our desks — what sort of software would appear, and would it really look that different to what we have now?
But would I like one? Hell, yeah. Even if it was just a paperweight.
Another one bites the stardust — James Doohan (don't even think of pretending you don't know who I'm talking about) has begun his final slow dematerialisation. He lasted a long time for someone in a red shirt, eventually turning into the ultimate method actor: in the 80s and 90s he was often to be found hanging around rocket scientists, talking avidly about ion engines and future projects, having decided very early on that if you were going to be typecast, you might as well enjoy it.
Already, Planet Earth is planning to mark his passing. According to the relevant folklore, his character was — will be — born in Linlithgow, whose best-known offspring to date has been Mary, Queen of Scots, and the town is toying with the idea of putting up a blue plaque or something similar. That would be nicely weird, a memorial that starts "On or near this site in the year 2222…".
It would be traditional at this point to talk about the influences he and his buddies had on a generation of techies, the curious resilience of the mid-sixties catchphrases and icons which surrounded him, and perhaps throw in some historical context about exactly why the Scots became the engineers of Empire and gathered around them a culture of confidence where machines were concerned. Perhaps we can take all that as read, while neatly avoiding any delving into the idea of what things would have been like if he'd swapped places with that other screen Scotsman of the 60s, Sean Connery. How would they have played each other's roles? No, let's not.
I must say, I'm not looking forward to the great Galactic ghoul picking off the rest of the team one by one in some terrible drip-fed reminder of the loss of youth — they're probably not looking forward to it that much either. But with the Shuttle still having problems with its wiring — the ghost of that Austin 1100 certainly gets around a bit — and outer space still seemingly populated solely by the wandering hive-mind of Orlowski, it's worth keeping some of that fantasy alive, just so we can remind ourselves that dreams can drive reality. And, along the way, spawn some really cool mobile phones.
And so London slips into the weekend in a rather dazed state of mild shock. Reports reach me that tea is no longer seen as sufficiently bracing to cope with the events of this febrile summer, and citizens all over the capital are being forced to resort to Pimm's. The latest news as I type this is that armed police are surrounding an Internet café on the Harrow Road, which seems a bit much even if someone was seen reading the Inq without due care and attention.
There has been a curious shift in the way information flows to the curious. It used to be a common (if illegal) practice for certain naughty people to keep an ear on big events in the City by use of radio scanners tuned to the various police frequencies. Now that everyone in blue is equipped with encrypted digital walkie-talkies, all you can get is dispatch riders and taxicabs moaning about roadblocks.
But tune to a local radio station, and caller after caller is reporting in with eyewitness accounts from everywhere across London. Whether this will eventually collide with the blooming bloggers, phonecam Flickristas and other unmediated digital links between the street and the screen, I cannot tell. It would certainly be possible to set up an automated phone gateway that would let anyone file a voice report on a web server, and for people to go through, identify and broadcast the most interesting and trustworthy messages. Semi-automatic talk radio, or mobile podcast aggregator?
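The plumbing for such a thing is almost embarrassingly simple: reports pile into a queue, someone trustworthy flags the good ones, and only the flagged ones reach the broadcast feed. A toy sketch of that idea — every name, number and field in it is made up, and the hard bits (telephony, audio, the actual humans) are waved away entirely:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class VoiceReport:
    caller: str       # phone number or handle of whoever rang in
    location: str     # where the report was filed from
    audio_url: str    # where the recording landed on the web server
    approved: bool = False

class ReportGateway:
    """Toy model: reports queue up, editors approve, the feed shows approved only."""

    def __init__(self):
        self._queue: List[VoiceReport] = []

    def file_report(self, caller: str, location: str, audio_url: str) -> VoiceReport:
        # Anyone can file; everything starts unapproved.
        report = VoiceReport(caller, location, audio_url)
        self._queue.append(report)
        return report

    def approve(self, report: VoiceReport) -> None:
        # A human editor vouches for this one.
        report.approved = True

    def broadcast_feed(self) -> List[VoiceReport]:
        # Only vetted reports go out over the air.
        return [r for r in self._queue if r.approved]
```

The queue, of course, is the trivial part; the interesting and expensive part is the people doing the identifying and the trusting.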
Assuming I don't get shot, blown up, mown down by a speeding police car (by far the greatest danger to Holloway Road denizens at the moment — they are numberless as the stars at night, and as difficult to see), bowled out or hit on the head by a chunk of falling Shuttle, I shall return — clear, confident and connected, bringing clarity to your world — next week.
That's if I can resist the temptation to move to Dow Jones Newswires, which is seeking a London-based energy reporter "to report and break news on Europe's power, natural gas and emissions markets." When it comes to natural gas and emissions, I bow to no man (it's far too [No, no, no. Not on my watch. - Ed] dangerous) in my ability to break news like the very wind.