
AMD fires new volley in chip core war

Advanced Micro Devices has begun laying the groundwork for its next battle with Intel.
Written by John G. Spooner, Contributor


Advanced Micro Devices has begun laying the groundwork for its next battle with Intel. Collectively, we might call these the core wars.
AMD on Dec. 14 unveiled more details behind Fusion, its vision for the future of the x86 processor, in an effort to get customers—both PC makers and end users—behind the idea of processors that contain graphics and other elements, not just ever-greater numbers of processing cores.
Under Fusion, which is due in the 2009 timeframe, AMD plans to blend technology from its ATI graphics arm with its own x86 processor cores, creating hybrid processors. Fusion chips will thus contain multiple AMD x86 cores along with graphics processing cores at the outset. AMD reserves the option to blend in additional cores, however. Among them could be accelerators designed to speed up the processing of things like Java. The result would be processors better suited to given environments, including notebook PCs, desktop PCs and servers. AMD says the approach will boost the performance of computers over a range of computing applications by knitting in features that specifically address the needs of those applications. AMD could create higher-performing Fusion chips for servers or slim them down for use in consumer electronics—just to use two examples from opposite ends of the spectrum—by pulling from a broader set of silicon building blocks, the company said.

For the desktop PC market, AMD Fusion chips could help lower hardware costs for emerging markets. For servers, AMD could add an onboard Java processing engine or build additional image processing capabilities for medical imaging into its future Opteron chips. Of course, whatever gets added directly into a given AMD processor has to make sense for a broad range of its customers. Graphics capabilities, for one, do make sense for a broad set of customers, particularly given current trends toward graphical user interfaces—think Windows Vista and gaming—and the industry’s interest in lowering costs for emerging markets. However, AMD doesn’t appear to be doing anything to prevent the pairing of Fusion processors with discrete graphics. Its Torrenza plan will also allow for discrete accelerator chips to be added to servers. Torrenza will cover the bases for a number of technologies that might not make sense to add directly into Opteron processors immediately. By the way, there’s no reason I see that Torrenza couldn’t be brought to the desktop, where it could help out gamers or workstation users with add-on chips.
Thus while AMD is eyeing 2009 for Fusion chips, it’s already begun laying the groundwork. Aside from its Torrenza strategy, AMD has unveiled plans to introduce a dual-graphics technology that allows notebooks to use both integrated and discrete graphics. Due in 2007, this Dynamic Graphics Power Switching technology will allow a notebook to switch to integrated graphics when running on battery power, but use its more power-hungry discrete graphics processor while plugged in. The promised result is extended battery life for mobile users. Ultimately, however, it appears to be a predecessor to a Fusion graphics strategy.
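
To make the switching idea concrete, here is a minimal, purely illustrative sketch in Python of the kind of policy described above; the names and wattage figures are my own assumptions for the example, not details of AMD's actual technology.

    # Illustrative sketch of a dynamic graphics power-switching policy.
    # Device names and power figures are assumptions, not AMD's design.
    from dataclasses import dataclass

    @dataclass
    class GraphicsDevice:
        name: str
        power_draw_watts: float

    INTEGRATED = GraphicsDevice("integrated", power_draw_watts=5.0)
    DISCRETE = GraphicsDevice("discrete", power_draw_watts=25.0)

    def select_gpu(on_ac_power: bool) -> GraphicsDevice:
        """Favor the discrete GPU on wall power, the integrated GPU on battery."""
        return DISCRETE if on_ac_power else INTEGRATED

    print(select_gpu(on_ac_power=True).name)   # -> discrete
    print(select_gpu(on_ac_power=False).name)  # -> integrated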

Overall, what you see here are the battle lines being drawn for the end of this decade and beyond. AMD, for its part, is going on the offensive with a theme that states AMD equals smart chip integration. Intel, on the other hand, has been fairly mum about its plans past the four-core stage. (As an aside: Intel has announced its first four-core chips already, whereas AMD is scheduled to announce its first four-core processor in mid-2007. Their respective four-core strategies, including the difference between multi-chip packages and so-called native arrangements, are a good subject for a different post.)

Intel has, for the most part, been on the same path toward multi-core, integrated processors. Its Tera-Scale Computing research project is clearly looking at the idea of putting tens and tens of cores into a single piece of silicon, and such designs would include specialized cores as well. What isn’t so clear, right now, is how Intel plans to get from its four-core chips to many-core, integrated chips in the future. It’s a little too soon to declare victory in this battle.
