More than a year ago, a crescendo of rumors talked up the idea that Apple would shift the Mac platform to its ARM-based A-series processors. The idea appeared to go into hibernation, but with last week's release of the iPhone 5s and its 64-bit A7 processor, the rumor has regained traction.
Many are impressed with the continuing results of Apple's microprocessor division following its acquisition of P.A. Semi in 2008. However, Apple has long produced a variety of ASICs and cores, including various bridge chips and, during the PowerPC days in the 1990s, work on the vector instruction set known as AltiVec, VMX, or the Velocity Engine. AltiVec was also implemented in P.A. Semi's PWRficient processors.
The A7 processor received a rave review from Anand Lal Shimpi at AnandTech. He compared the processor not only to past generations of the A-series but also to Intel's Bay Trail FFRD and Qualcomm's MSM8974 Snapdragon 800 MDP/T.
Shimpi also addressed the topic of "larger machines."
Before I spent time with the A7 I assumed the only reason Apple would go 64-bit in mobile is to prepare for eventually deploying these chips into larger machines. A couple of years ago, when the Apple/Intel relationship was at its rockiest I would've definitely said that's what was going on. Today, I'm far less convinced.
Apple continues to build its own SoCs and invest in them because honestly, no one else seems up to the job. Only recently do we have GPUs competitive with what Apple has been shipping, and with the A7 Apple nearly equals Intel's performance with Bay Trail on the CPU side. As far as Macs go though, there's still a big gap between the A7 and where Intel is at with Haswell. The deficiency that Intel had in the ultra mobile space simply doesn't translate to its position with the big Core chips. I don't see Apple bridging that gap anytime soon. On top of that, the Apple/Intel relationship is very good at this point.
Although Apple could conceivably keep innovating to the point where an A-series chip ends up powering a Mac, I don't think that's in the cards today.
In his Monday Note blog, Jean-Louis Gassée offers some interesting analysis of the 64-bit transition for Apple and its competition. In the post titled "64 bits. It's nothing. You don't need it. And we'll have it in 6 months," he points out a number of problems for the Android infrastructure in the transition — not so much on the hardware side but for the operating system and app developers.
In addition, Gassée examines the chances for A7-based Macs.
Can we see a split in the Mac product line? The lower, more mobile end would use Apple’s processors, and the high-end, the no-holds-barred, always plugged to the wall desktop devices would still use x86 chips. With two code bases to maintain and OS X applications to port? Probably not. Apple could continue to cannibalize its (and others') PC business by producing "desktop-class" tablets. Such speculation throws us back to a well-known problem: How do you compose a complex document without a windowing system and a mouse or trackpad pointer?
We've seen the trouble with Microsoft’s hybrid PC/tablet approach: the dual Windows 8 UI, which has been criticized as "confusing and difficult to learn (especially when used with a keyboard and mouse instead of a touchscreen)."
There are many issues with input and performance on tablets. An iPad isn't a true substitute for a "real" computer such as a MacBook Pro, an iMac, or the forthcoming Mac Pro workstation. Yes, an iPad is a fantastic piece of mobile technology and very capable at single tasks. I bring mine on short trips. But it can't really handle a complex content-creation workflow that requires the use of multiple applications and mass quantities of data.
According to Gassée, the timing for ARM-based Macs is based on input, performance and workflow.
If Apple provides a real way to compose complex documents on a future iPad, a solution that normal humans will embrace, then it will capture desktop-class uses and users.
Until such time, Macs and iPads are likely to keep using different processors and different interaction models.
Now, I find the entire proposition of an ARM-based Mac to be insanity, as I wrote a year ago.
First off, Apple has an existing, successful desktop strategy, part of which is based on a unique multiplatform value proposition: Macs are the only machines in the world that can natively run OS X, Windows and Linux. This capability is dependent on Intel logic.
The move to a proprietary ARM chip might return Apple to its PowerPC days, when Macs required special "sole-source" justifications in enterprise and government for purchasing a machine that couldn't run Windows with adequate performance. With a modern Mac, users can run almost any commercially available program in a number of ways: Mac native apps, Windows programs with a WINE wrapper, Windows in Boot Camp or in a virtualized environment, or native Linux. However, for all of this goodness to work, there needs to be an Intel processor.
What the performance and success of the 64-bit A7 bring is greater leverage with Intel when buying processors. I wrote last year that what Apple wants are price breaks from Intel.
So does Intel. The issue for Intel is likely all about margins — the company doesn't want to lower its gross margins. Intel could no doubt make its chips score better on power consumption, simply by lowering their performance. However, Intel wouldn't be able to charge as much for these chips (hint: performance is more the driver for price than power consumption).
Certainly, the best ARM processors, even through 2014, can't come close to providing the performance necessary for MacBook Air-class laptops. Still, perhaps in a few years, Apple might have ARM-based processors good enough for a MacBook.
So, Apple's dance with its partner Intel continues. Apple can keep up the pressure in negotiations: make the lower-power and, more importantly, lower-cost processors Apple wants, or Cupertino will phase out Intel. And Apple no longer has to point Intel to stories about rumored future products; it can now point at the A7 and let Intel guess about the A-series roadmap. That won't take a clairvoyant.