
Intel's platform games

IDF: Craig Barrett's swansong keynote gave some strong hints about the chipmaker's future plans
Written by Rupert Goodwins, Contributor

At his swansong IDF keynote, Craig Barrett was fulsome in his praise of the industry he had joined thirty years earlier. He'd been attracted to Intel by the speed at which technology was advancing, he said, but worried at first that this frantic rate of change would slow down. He was wrong: if anything, it's got faster.

This was seemingly backed up by an impressive exponential graph that flashed briefly onto the overhead screens. According to the company's predictions, by 2008 we'll see a tenfold increase in performance over the first Pentium 4. We're currently enjoying three times the performance of those chips from the year 2000, due mostly to clock increases and on-chip parallelisation by way of hyperthreading, so multicore will have to engender the further improvement.

Glossing over the fact that a tenfold improvement by 2008 is only another threefold increase in roughly the same time as the last one, the prediction highlights one fact: we'll have to take Intel's word for it. Yet just because we can’t measure them, we shouldn't dismiss Intel's predictions of a bold new parallel world -- even if it sneaks in a naughty graph or two. As Microsoft’s Jim Allchin pointed out, an average laptop running XP has something like five hundred threads on the go at once. There is no shortage of opportunity for more CPU grunt to make a difference.

As the company completes its transition from a chips and bits organisation to a platform-focused outfit concentrating more on markets than megahertz -- a regeneration symbolised nicely by physicist Barrett’s replacement, the economist Paul Otellini -- the simultaneous sea change in underlying technologies means that we won’t be able to directly compare what happened before to what happens next.

This sense of shifting foundations has been exacerbated at the Developer Forum by a sudden surge in codenames that more than matches the promised increase in performance and has threatened to swamp even seasoned Intel savants. Pat Gelsinger himself, the consummate Intel insider, stumbled over some of the barrage of new names as he outlined the Enterprise Server Platform roadmap. This is because while old-school Intel was happy to promote processors by themselves, the new way is to wrap them up with chipsets and technologies and give the whole schmear its own identity as a platform -- and, unfortunately, yet another layer of semantically empty codewords.

Centrino blazed the trail here. It's the only Intel brand that non-techies know, apart from the Pentium and the blandly universal 'Intel Inside'. That is a small triumph of marketing, given the complexities beneath the Centrino badge -- wireless networking, mobile processor, integrated chipset -- and it helped Intel to focus on the market just as much as it gave that market stability and coherence.

You can see this policy in everything Intel says and does at Spring IDF 2005. The Digital Enterprise Group has ruthlessly divided its world into client, server, storage and communications infrastructure platforms -- each of which could, perhaps even will, adopt a public brand that parallels the mobile division's Centrino. Although new technologies will still span platforms, they won't in general be considered as individual entities. You want to talk about multicore? For now, you can -- but Intel would rather focus on what it means for Napa, Lyndon and Bensley, the codewords for the portable, desktop and enterprise dual-processor server platforms.

This approach has its problems. Intel may have launched Sonoma -- its next-generation Centrino system -- but in the absence of a simultaneous refresh of the main brand itself, customers, retailers and even industry watchers seem only vaguely aware that anything's happened. Platforms and brands are more complicated to maintain than pure products: long term, Intel has a lot to learn.

But this simultaneous change in marketing and engineering focus is no coincidence. Both reflect the way that technologies have stopped being interesting for their own sake and are now judged ruthlessly on what they deliver and how well they deliver it. This makes the job of Intel watching harder: as the things Intel sells become more abstract, measuring their success or otherwise becomes more subjective. Some things won't change: in the maxed-out, muscle-bound world of high-end servers, raw statistics translate directly into measures of goodness. But how do you measure Intel's Active Management Technology against its competitors? If that troubles you, think what it must look like to AMD.

Whether the intelligence in the systems is adequate to efficiently manage huge numbers of independent execution units, and whether software designers will find good ways to exploit these resources in applications, are questions that only time will answer. Likewise, it'll take a while to find out whether Intel's platform approach will bring it into conflict with its OEMs, who have been used to making those platform pitches for themselves. One thing is undeniable, though: after a couple of shaky years, Intel has found the discipline to buy itself that time.
