How Intel turned failure into success

Summary: Intel's executive reshuffle has its roots in an unfortunate decision nearly 10 years ago, when the chipmaker went with a system-on-a-chip design for Timna

TOPICS: Processors

Intel's shake-up of top-level management marks the final stage in a major transformation of the company's strategy.

It is one that started nine years ago with a failure that threatened to derail the career of Dadi Perlmutter, the very same Intel executive who has just been promoted to manage product development for all processors. Indeed, he is now seen as a leading contender for chief executive once Paul Otellini steps down.

That failure was Timna, a design that evolved under Perlmutter's leadership at the turn of the century in Intel's development centre in Haifa, Israel. Perlmutter hoped Timna would put the centre on the map. It did, though at first not in a good way.

Timna was a system-on-a-chip (SoC) design, which combined a processor with some of the other components necessary to build a PC. SoC in itself is no bad thing, and the key integrated component of Timna, a memory controller, is a very good thing to include on the same silicon as the processor itself.

How a processor uses memory is absolutely key to its performance: AMD has made a lot of hay by integrating the memory controller on the same die as the processor, alongside its HyperTransport interconnect. There are a lot of performance benefits if that integration is done right. It is also cheaper to make single chips, and cheaper to design them into computers.
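The cost that on-die memory controllers attack can be seen even from a high-level language. The sketch below (purely illustrative, unrelated to any Intel or AMD design) chases a chain of indices through the same data laid out sequentially and in shuffled order; the shuffled walk defeats the CPU's caches and prefetchers, so each hop pays something closer to full memory latency.

```python
import random
import time

N = 1 << 21  # ~2M elements: the index array alone far exceeds typical CPU caches

# A "linked list" stored as an index array: nxt[i] is the node after i.
# Sequential layout: node i points to node i + 1, so the walk is cache-friendly.
seq = list(range(1, N)) + [0]

# Random layout: one big cycle visiting the same nodes in shuffled order,
# so nearly every hop lands on a cold cache line.
perm = list(range(N))
random.shuffle(perm)
rand = [0] * N
for a, b in zip(perm, perm[1:] + perm[:1]):
    rand[a] = b

def chase(nxt, hops):
    """Follow the chain for `hops` steps and return the final node."""
    i = 0
    for _ in range(hops):
        i = nxt[i]
    return i

def timed(nxt):
    t0 = time.perf_counter()
    end = chase(nxt, N)   # one full trip around the cycle
    assert end == 0       # both layouts form a single cycle through node 0
    return time.perf_counter() - t0

print(f"sequential walk: {timed(seq):.3f}s  random walk: {timed(rand):.3f}s")
```

On a typical machine the random walk is noticeably slower even though both loops do identical work per hop; in a compiled language the gap is far wider still, since interpreter overhead masks much of the difference here.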

Timna, however, had issues. There were three major flaws with the design — two conceptual, one practical.

The biggest one was a gamble that went wrong: Intel predicted that a new high-performance memory technology, RDRAM, would become popular and therefore cheap. The full story of RDRAM is for another day: it is enough to note that while it had plenty of technical promise, some very questionable business decisions by the technology's owner left it high and dry and overpriced long enough for other, more mainstream memory designs to catch up.

By the time Timna was coming to fruition, it was becoming clear that RDRAM was not going to be anything like competitive enough for the low-cost sector.

Then there was Timna's positioning. Timna was intended to strengthen Intel's hand in the low-end PC market, where it would compete mostly with Intel itself. That is a decision which is very hard to make, and just as hard to stick to. In general, it takes an outside force to make such moves happen, even in a company such as Intel that has some history of making market-changing moves on its own initiative.

If RDRAM had proved to be a potent, cheap technology, and Timna had given Intel a unique foothold in cheap, high-performing PCs, then the decision would have been seen as a bold move to forestall the external competition. Without those results, it became an exercise in killing the company's own margins.

Which brings us to the third terror of Timna: it did not work properly. The memory controller was reputedly flawed, and flawed in a way that could not easily be fixed. The systems worked after a fashion — by the time Timna was canned in October 2000, there were already motherboards being shown off, and preparations for launch were well underway.

Intel has never talked about exactly what happened: the most persuasive rumour is that with RDRAM failing to take off, the system had to work well with more standard parts, and part of the design just wasn't cutting the mustard. It was designed to be high performance and thus took a lot of power — but did not deliver. Instead, the company looked at the market, looked at the chip, and decided that enough was enough.

Which was bad news for Haifa and Perlmutter, who suddenly saw their future as just another satellite of Santa Clara. But Intel is at heart an engineering company, and good engineers learn from their mistakes.

The design team mulled over the lessons from Timna, in particular those about picking a high-margin market and designing for high performance at low power, and made some bold predictions.

Then they went into negotiations with Otellini, a process vividly described four years ago by Intel manager Shmuel Eden, also part of the push. "We did it the Israeli way; we argued our case to death," Eden is quoted by Bloomberg as saying. "You know what an exchange of opinions is in Israel? You come to the meeting with your opinion, and you leave with mine."

Two years later, they delivered Banias, the first proper notebook processor and the building block for the hideously successful Centrino platform. It was also the start of Intel's strategic move from speed demon to a company with an appreciation of efficiency. This move had long been mooted, but was harder to act on than to accept.

The move now looks like a company lifesaver. The lessons learned from Centrino have been absorbed into the mainstream Core 2 design and have seen their most extreme evolution in the Atom. They are now key to all of Intel's target sectors, from embedded systems through netbooks to high-performance server parts. And just in time: with ARM starting its assault on servers, and the mobile revolution gaining pace, the world is mutating faster than Intel's roadmaps — and it's low power all the way.

No wonder Perlmutter's star is in the ascendant.

Rupert Goodwins

About Rupert Goodwins

Rupert started off as a nerdy lad expecting to be an electronics engineer, but having tried it for a while discovered that journalism was more fun. He ended up on PC Magazine in the early '90s, before that evolved into ZDNet UK - and Rupert evolved with them into an online journalist.

Talkback

2 comments
  • Although...

    What we have now in terms of CPU power and efficiency is very good from both Intel and AMD, but things like the Atom make me cringe a bit because, yes, it's good, but it's not that much to shout about. When they can make it twice as powerful as it is now for the same energy consumption, then they can shout about it.

    Things like netbooks are great and all, but they are underpowered even in things like web technologies; that is plain to see when trying to do anything extensive with them.

    Again, the progress made so far is all good, but it's not the be-all and end-all.
    CA-aba1d
  • Yep, one of the ten key moments that shaped IT.

    Ten key moments that shaped IT:

    Intel's debut of the 'Centrino' Pentium-M processor

    I think the development of the Pentium 4 processor into the optimised, low-power 'Centrino' Pentium-M mobile processor is quite important. Its design allowed battery life to be extended to 3-4 hours on laptops, and its low power meant less heat dissipation too.
    Intel produced chips with different power requirements too: low-power variants, and ultra-low-power variants.
    The ultra-low-power variants of this processor still have one of the lowest power requirements, at around 10W.
    Their instruction pipeline design is also the forerunner and basis of both the Core Duo and the subsequent Core 2 Duo, the mainstay of computing today.
    These single processors can still run Windows XP and Office apps pretty well, more than five years after their debut.
    The Centrino Pentium-M design was renamed and used in the Celeron-M 900 chips, used in the first Eee PC.
    adamjarvis