eWeek's Chris Preimesberger has a funny yet enlightening reporter's notebook entry from the Core Duo launch party that was held on Intel's campus last week. The party may have been for Core Duo's coming out, but reporters were apparently just as interested in hearing from Intel CEO Paul Otellini on other issues, such as what AMD's acquisition of ATI meant for Intel and whether or not there would be any more layoffs at Intel. According to Preimesberger's account, Otellini was noticeably flustered (later "irked") when Intel's Sales and Marketing senior veep Sean Maloney rose to his defense at one point and said, "Listen, this is a big day for us here... what kind of question is that?"
Among the questions that Otellini would rather not have answered at the Core Duo launch (and to which he gave only short answers) was an inquiry from one reporter about whether or not Intel had plans to integrate graphics functionality into its processors the way AMD is clearly planning to do now that it has acquired ATI. Wrote Preimesberger:
"I'm just wondering," the reporter asked Otellini. "Since your main competitor [AMD] has just announced it is acquiring [graphics processor maker] ATI Technologies and said it intends to put graphics functionality right in the processor, will this mean that Intel will also put graphics in its chips?"
Somewhat flustered, Otellini glared and said: "The short answer is 'yes.' That's the only answer I'm giving. Next question."
Had I been standing there, I probably would have said, "What about memory controllers, networking, or anything else?" Or, "Paul, is that the last draft of the Intel-acquires-NVidia press release in your breast pocket there?" But I wasn't there. AMD, as most know by now, has long believed that the benefits of integrating memory controller functionality right into the processor outweigh the flexibility afforded by Intel's modular approach (something that AMD's director of commercial solutions Margaret Lewis and I talked about in a recent podcast interview). As AMD's acquisition of ATI was being announced, AMD CEO Hector Ruiz made it absolutely clear that ATI's graphics technology will be integrated directly into some of AMD's processor offerings.
Intel's integration of anything into the microprocessor obviously raises some interesting discussion points, the first of which is how many more times Intel will begin by downplaying some approach taken by AMD (as it did with 64-bit extensions to the x86 instruction set) only to copy that approach eventually. The other, perhaps more serious, issue is the divergence in the two companies' instruction sets that is already underway (something else Lewis and I spoke about). Thanks to cross-licensing agreements between Intel and AMD, the chips coming from the two companies have maintained a high level of compatibility. Instructions that worked on one worked on the other. When Intel introduced specialized multimedia instructions in the '90s (anyone remember MMX?), AMD followed suit with compatible instructions. When AMD introduced its 64-bit extensions (dubbed AMD64), Intel eventually followed suit with compatible instruction sets in its chips.
But once virtualization started to become a hot topic and there were opportunities to include special virtualization hooks at the microprocessor level, Intel and AMD parted ways. The virtualization technologies being integrated into the microprocessors from both companies are incompatible with each other, thereby placing the burden on software developers who work at the chip level to branch their code with case structures and if-then-else statements based on the ID of the processor. Throw in a few more integrated technologies like graphics and things could really get hairy as processors from the two companies become less and less compatible with each other over time.
The bottom line? If Intel plans to integrate graphics into some of its processors, the news itself is only half the story. The next question is... what graphics technology? Intel has its own graphics technologies, but they're viewed as laggards when compared to those that come from ATI and NVidia. Might Intel have an appetite for NVidia?