Was Intel's x86 the "gateway drug" for Apple's ARM?

Summary: Apple's move to the x86 Intel architecture for the Macintosh in 2005 may have only been a temporary stop on the way to its logical end-state: the acquisition of PA Semi and the creation of ARM-based personal computers.


I have been told that I am someone who speculates a great deal. However, like anyone who tries to make predictions about this industry, such speculation is based upon observing historical behavior and analyzing current trends in order to develop a vision of a future state. Other friends of mine like to call this "pulling stuff out of my ass". I'll meet them halfway.

If you closely examine the history of Apple, you will see that time and time again, the company makes strategic choices which allow it to increasingly take control of its customers, its ecosystem and its intellectual property. Indeed, Apple has always isolated itself from the rest of the industry, but as it has matured, it has become even more of a locked-down ecosystem.


The History

The Macintosh, Apple's flagship computer product, has undergone quite a few changes since its launch in 1984. Originally, it was based on Motorola's 68000 architecture and used custom firmware along with its proprietary operating system. Ten years later, in order to keep pace with technology and performance, the Macintosh hardware architecture was changed to PowerPC and CHRP, along with other relevant OS changes.

In 1997, Apple acquired NeXT, the company that Steve Jobs founded after his ouster from Apple in 1985, and NeXT's remaining intellectual property -- the OpenStep operating system and APIs -- became the foundation of Mac OS X.

In 2005, when Apple could no longer extract more performance out of the desktop-class PowerPC chips and started to fall considerably behind the PC in technology, it went to the only other architecture it could viably pursue -- the Intel x86. Which brings us to where we are today, in 2010.

In 2010, the Mac faces a number of problems that can only be resolved by yet another paradigm shift. One of these problems is that although the x86 Mac uses a different type of firmware than the standard Intel PC -- the Extensible Firmware Interface (EFI) rather than the legacy BIOS -- hackers have been successful in tricking the operating system into running on much less expensive clone hardware, using software-based EFI emulation on top of the PC BIOS and modified Darwin bootloaders.

One of these hackers, Rudy Pedraza, started up a mail order business in South Florida and sold what amounted to glorified PCs running Apple's Mac OS X. That company, Psystar, was litigated into oblivion.

While Apple, through the force of its financial might, was able to litigate a tiny American company into shutting down its cloning operations, it still faces the real possibility that other nations with less favorable legal systems may be able to sustain businesses based on cloned Macs. And while Psystar is dead, the technology it used to build its systems continues to be heavily developed by the clandestine Hackintosh community.

Additionally, and probably most importantly, further advances in x86 virtualization technology, which permits abstraction of the OS from the hardware, could potentially allow a consumer in the near future to install Mac OS on their own PC without a whole lot of fuss. Apple has been resisting implementing virtualization on Mac OS X, and for good reason -- it doesn't want to enable the people who could damage its cash cow.

Based on Apple's pattern of 10-year technology refresh cycles and the company's increasingly isolationist behavior, all of this points to one thing -- another paradigm shift for the company is due. If 2005 and the move to x86 was the last paradigm shift, then the next one is due in 2014 or 2015. However, just like any Silicon Valley earthquake, you always get a few tremors and smaller quakes before the Big One hits.

The Future

While the iPod was the first little "tremor" signaling a trend toward becoming more of a consumer electronics company than a computer company, the introduction of the iPhone in 2007 was the first "quake" indicating another massive change was in store for the company.

With the iPhone, Apple ported much of its core BSD-based operating system, Darwin, to the ARM architecture, along with its Objective-C development platform from Mac OS X. While it must have seemed logical to many to re-use existing assets in order to facilitate the development of the iPhone on ARM, what Apple really did was stage its transition/migration plan according to what it would actually be doing with its next generation of desktop and portable computers -- multi-core ARM-based Macs.

Apple's $278 million purchase of Palo Alto Semiconductor (PA Semi) in 2008 gave the company the final piece of the puzzle it needed to become fully independent of Intel and any other microprocessor vendor, and would allow it to return to the completely closed system it enjoyed in the 1980s and 1990s.

The first fruit of Apple's labor with PA Semi is the first-generation iPad, which uses custom-designed silicon based on the ARM Cortex-A8 -- the A4 processor.

While the 1GHz A4 isn't powerful enough to run a Mac today, I believe that the next logical step is for Apple to continue evolving the silicon toward more cores and higher clock speeds. With the iPad 2, we might very well see two cores and certainly a higher clock speed.

The next step would be to move to 4 cores and larger amounts of cache, which may present enough computing power to form the basis of the next generation Macbooks or iMacs.

It is not implausible that within five years, six-, eight- or even sixteen-core Apple ARM chips could be released. Large numbers of cores on lower-power chips are not out of the question, as this is where Intel and AMD are both going, and where Sun was going until it went down the path of acquisition.

Given that there are now more applications for the iPhone/iPad ecosystem than there are for the Mac, and that App Store software distribution is completely controlled by Apple, it makes perfect sense that Apple would move the Mac to a 100 percent proprietary platform, now that the ecosystem has been seeded with many developers and many applications.

It is also notable that, given the number of chips shipped in cell phones and other devices, the ARM ecosystem rivals the x86 desktop ecosystem -- or may even exceed it in the near future, depending on whose figures you look at. Intel itself is already examining this market very closely, particularly with its most recent acquisition, Wind River, which it purchased in June of 2009.

Wind River creates software development and hypervisor stacks for embedded systems architectures, of which TI's OMAP and Qualcomm's Snapdragon, both ARM chips, are among the most popular used in smartphones today.

Intel also continues to manufacture the ARMADA (formerly Intel XScale) embedded processor for Marvell. Given this heavy trend towards embedded, I believe that Intel may follow Apple's lead and decide to purchase an ARM/embedded asset, such as Marvell, Freescale Semiconductor or possibly even Texas Instruments.

It is not that much of a stretch to imagine a beefed-up iPad with a larger screen, keyboard and mouse, with multiple processor cores and back-end connectivity to Apple's massive datacenters running Cloud services. You can call this the Macintosh TNG, or the "Cloudintosh", but I already gave this computer a name.

The Screen

While I believe there will be Google/Linux "Screens" and even Microsoft "Screens" (as evidenced by developments in Android, Chrome OS, Ubuntu, MeeGo and Windows Phone 7 Series), it actually makes sense for Apple to be the first company to pioneer "Screen" technology.

Effectively, the iPad is the first Screen or the Proto-Screen. The next logical step is to scale up the size of the display to full 1080p with a faster multi-core CPU, more powerful graphics processing with multi-tasking and windowing, with tons of Cloud horsepower to back it up -- a synthesis between iPhone OS and Mac OS where the entire means of production, the systems architecture and the software/content delivery mechanism to the device is entirely Apple-controlled.

Indeed, it is entirely possible that everything I have said is pure conjecture, and I could be inferring far too much from Apple's activities in the past three to five years.

When I do revisit this subject in 2015, I'm curious as to how close or how far off my predictions will be. Talk Back and Let Me Know what you think.



Jason Perlow, Sr. Technology Editor at ZDNet, is a technologist with over two decades of experience integrating large heterogeneous multi-vendor computing environments in Fortune 500 companies. Jason is currently a Partner Technology Strategist with Microsoft Corp. His expressed views do not necessarily represent those of his employer.



Comments
  • I'll meet you halfway.

    I agree this is the way that Apple is heading with its
    "consumer" range; iPod through iPad. Complete control
    over the package to ensure a good user experience and to
    protect the profits. Smart business.

    But, while full-powered laptop and desktop computers are
    still needed, and Windows is still king of that environment,
    Apple has little choice but to stay with an x86 chipset. It
    wasn't until it changed to x86 that the Mac really took
    off. Many Mac users still need Windows support
    for archaic programs that rely on ActiveX and other
    Windows-only protocols.

    So, the only time I see that Apple will be able to move
    completely to ARM, if that is indeed what it wishes to do, is
    when Windows can run unhindered on it or the world loses
    its Windows dependence. I'll dig you up when it happens.
    A Grain of Salt
    • Agree.

      If Macs can't run Windows or all Windows apps then their sales will nosedive.
      Sleeper Service
      • Apple won't move their laptops and desktops to ARM until Microsoft ...

        ... supports ARM too.

        Why? Microsoft and Apple need each other as viable competitors and sometimes as awkward bedfellows.

        Without credible competition, Microsoft will face yet another barrage of attacks from those claiming that it has an unfair monopoly.

        Without Microsoft, Apple doesn't have a strong sales story of "use your Mac for all your new stuff ... and when you need to run legacy apps, you can always boot into Windows or even run Windows simultaneously!".

        Don't discount the Mac's ability to run Windows - for most business people, it's an absolutely necessary evil, and without that capability, the Mac is not a viable business platform.

        What WOULD be interesting is if future Macs were to have an Apple ARM CPU embedded on the motherboard, enabling the x86 core(s) to be powered up/down as necessary.
        • hope you're right

          the best thing about my MacBook Pro is its
          ability to run the two best OSes out there seamlessly
          without any stupid hacks.
        • Twin processors, ARM and Intel - it's been done!

          Acorn Computers (UK) recognised the need for Windows compatibility in the early days (late 80s) and built a twin processor machine called the RISC PC. The 80486 chip had to have a fan to keep it cool. The ARM chip was just cool. So yes, that is a possibility, and maybe Apple could go that route and demonstrate the superiority of the ARM processor. And who knows, maybe Microsoft might start writing Windows for ARM based PCs?
          • Dude - you're way off.

            Acorn's BBC Micro had "The Tube" interface (underneath) to which you could attach a Z-80 or 6502 co-processor.

            Acorn designed and built the Acorn RISC Machine processor - the world's first commercially available RISC CPU, beating others to the market by a few months.

            The ARM processor first appeared in the Archimedes A305/310 in 1987, which is where I learned to write ARM assembly code :) The barrel-shifter ROCKED!

            Recognize the A.R.M. abbreviation - yep, Acorn's ARM processor was spun off into Advanced RISC Machines, which owns the IP & most of the core CPU designs implemented by the ARM chips in most of your cellphones, MP3 players, etc. today, and in iPad/iPhone/WP7S and other devices in the future.

            And, no, the ARM-PC did NOT contain an x86 CPU - the ARM6 CPU it used could emulate an x86 CPU of the day fast enough to run most apps. You *COULD* add an x86 co-processor card to speed up your x86 apps if you wanted, but that was an optional add-on.

            Microsoft already ported Windows to ARM (codenamed StrongARM) once before during the ill-fated Longhorn project (the cancelled pre-cursor to Vista). There was no market for Windows on ARM back then but who knows, perhaps there is one now? We can but hope ;)
          • Re: Dude...

            [i]Acorn designed and built the Acorn RISC Machine processor - the
            worlds' first commercially available RISC CPU, beating others to the
            market by a few months.[/i]

            And trailing the CDC 6600 by about 20 years.
        • Been done before.

          [i]"Don't discount Mac's ability to run Windows
          - for most business people, its an absolutely
          necessary evil and without that capability, Mac
          is not a viable business platform.

          "What WOULD be interesting is if future Mac's
          were to have an Apple ARM CPU embedded on the
          motherboard, enabling the x86 core(s) to be
          powered up/down as necessary."[/i]

          The Quadra line and I think maybe even the
          early PowerPC line had an Intel CPU on an
          expansion card to run Windows stuff about 15
          years ago. That's an expensive solution in my
          opinion. It was then and is now, plus requires
          a beefier power supply. I wouldn't consider it
          on a laptop in any way shape or form.

          And there's no room to shoe horn expansion
          cards into the current footprint of iMac
          desktops or Mac Minis. Macs aren't going to get
          any bigger.

          Your opening premise doesn't make sense at all.
          Apple went for almost 30 years without
          supporting any hardware architecture compatible
          with MS operating systems (unless you consider
          Xenix, but they had A/UX and AIX instead). And
          why would Apple care if moving to ARM caused MS
          to have a legal problem?

          The more likely prerequisite is Apple won't go
          100% ARM until they can support virtualization
          of the x86 [i]and[/i] x64 instructions sets on
          the ARM cpu and fast enough to not frustrate
          end-users. A cross-licensing from AMD along
          with a manufacturing contract with Global
          Foundries could solve those problems. I have a
          suspicion that we may soon find out why AMD
          caved in so quickly to the 1 billion settlement
          from Intel.
    • ... but have you ever met the ARM?

      The ARM was designed by Acorn Computers in Cambridge, UK because neither the Intel 8086 nor the Motorola 68000 (the 2 major 16 bit processors available at the time) could match the 8 bit 6502 processor in certain respects, such as interrupt latency, and Acorn was not prepared to replace its 8 bit 6502 based machine with a 16 bit computer with inferior performance in any respect.

      Intel took the compatibility-at-register-level approach to designing a 16 bit processor with the limitations that such an approach necessitates.
      Motorola (chosen by Apple) used the compatibility-at-assembly-language-level approach which still had limitations (and it meant all software had to be recompiled to run on the 68000) but was significantly less limited than Intel's approach.
      Acorn decided to forget all about compatibility at any level in the design and came up with a processor so fast and efficient that a software emulator of the 6502 could run 6502 programs on the ARM and perform as well or better than the original 6502 chip. This approach had no design limitations yet retained full backward compatibility via the emulator.

      If the ARM could do that in its early days with the limitations that the technology of the time imposed, I see no reason why an ARM processor today couldn't outperform anything that Intel or anyone else could design - given access to the latest technology. So come on ARM, show Apple what you can do and maybe in 10 years time we'll be looking back on X86 based computers with amusement at the huge amounts of power they consumed to do 'simple' computing tasks.
      • "I see no reason why an ARM processor today couldn't outperform anything ..

        ... that Intel or anyone else could design"

        No matter how much you might like it to be true, it won't be true!

        ARM CPU's are amazing - I've been programming ARM on and off since they were first released in the Acorn Archimedes A305. But make no mistake - Intel's x86 architecture has some major benefits that ARM just cannot match:

        1) Instruction density: x86 CPUs can fetch one word at a time from memory, and each fetch may return one or more instructions. ARM, on the other hand, has fixed-width, word-wide instructions that permit only one instruction per fetch. ARM introduced the variable-length THUMB and THUMB2 subsets of the ARM instruction set to try to improve ARM's density, but they still aren't as efficient as x86.

        2) ARM is a load-store architecture requiring all data to be read into the CPU, operated upon and results stored back in memory. x86, on the other hand, can usually operate directly on memory itself.

        Of course, ARM CPUs can be highly efficient in terms of code execution (esp. techniques to minimize branches), but these benefits are often offset by the enormous impedance mismatch of the memory bus.

        Where ARM does excel is in thermal and electrical efficiency due to the relative simplicity of their CPU's core implementation. That is why ARM is the primary choice for powerful portable electronic devices today.

        Intel is getting MUCH better at creating highly efficient CPU's and has the benefit of an ENORMOUS market of software that runs on their CPU's.

        Essentially, ARM and Intel are on a course to collide as ARM increases computational power and Intel improves its chips' thermal and electrical efficiencies. It's going to be fascinating to watch these two duke it out.
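        As a rough illustration of the load-store point above -- the assembly mnemonics in the comments are simplified and purely illustrative, and actual compiler output varies by compiler, flags, and CPU generation:

        ```c
        #include <stdio.h>

        /* Incrementing a counter that lives in memory.
         *
         * x86 can express this as a single read-modify-write instruction,
         * roughly:
         *     add dword ptr [counter], 1
         *
         * ARM, being a load-store architecture, needs three instructions:
         *     ldr r0, [r1]        @ load the value from memory into a register
         *     add r0, r0, #1      @ operate on the register
         *     str r0, [r1]        @ store the result back to memory
         */
        void increment(int *counter) {
            *counter += 1;
        }

        int main(void) {
            int counter = 41;
            increment(&counter);
            printf("%d\n", counter);  /* prints 42 */
            return 0;
        }
        ```

        The C source is identical either way; the density difference the poster describes shows up only in the instruction stream the compiler emits.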
      • BS walks, performance talks

        That may have been true back in the old days, but I'd like to see actual benchmarks between modern machines, not reminiscing.

        I mean, the Z-80 blew the 6502's doors off on numerical tasks and some software benchmarks, too - which, following your reasoning, would mean Intel architecture should still rule, because the x86 instruction set is a superset of the Z-80's and the chips are a clear evolution from the Z-80 design.

        Heredity means nothing in silicon systems. Performance in the user environment determines market share. When Pentiums matured to the point where they could run complex GUI-based software, Macs lost their edge in the marketplace, because of the intense price competition in the WinTel market.

        Apple needs, more than a new proprietary hardware environment, someone to compete with them on their own turf, to keep them on their toes. Well, actually, whenever someone (like HRC or Psystar) tries to compete with Apple on something they consider to be proprietary to them (like desktops running their OS, or smartphones that work like iPhones) they respond with lawsuits instead of competing back at their competitors and developing even cooler stuff the wannabees can't make.

        Ah, the "look and feel" lawsuit - again. That is typical Apple behavior - suing over the graphical user interface, something they copied from Xerox PARC, and something that Palm and Handspring have priority on in hand-held devices, if anyone does.
    • The article may have a point, but poorly worded.

      The fact is that Apple is no longer interested in any computer applications sold after the computer purchase. They don't care about "pro" apps anymore; they want to be able to tell computer shoppers that yes, they can look at pictures, slap together a home video, and surf the Web. Then they're out the door and gone.

      The only apps Apple cares about are the ones it can skim revenue from, and that means the App Store. To take the App Store to larger devices, then yes, limiting it to ARM would deter hackers because no legitimate PCs run on ARM chips.
      • The virtual authorized Apple dealer....

        ...is the App store (and the AT&T shop where iPhones are sold here in the US). Top-down control over merchandising, OS, hardware and software.

        Apple would be wise to confine their ARM adventure to their niche apps - iPhones, iPods, iPads, "screens" - which are primarily consumer items with great brand loyalty attached to them.

        No IT managers who have to justify massive corporate buys, no need for the product to work and play well with other computing gear, a distinctive look-and-feel which Apple thinks is worth suing over, despite the fact that Palm and Handspring have priority on most of the things that an iPhone can do.

        In THAT environment, an ARM-based PC is plausible. Plenty of folks now insist that their personally-owned iMacs and iBooks are just incredible (which, taken literally, might be true) and outperform Intel iron. They're entitled to their opinion. I own several Macs from many points in Apple history; I use a WinTel laptop running Vista.

        In a business environment, though, ARM is an imponderable. Will it, with a proprietary Apple OS, work with the customer's files? How much work will it be to interface an ARM Mac to the corporate LAN, to legacy minicomputers and mainframes? Will 3rd party vendors even be interested in writing interfacing apps like Rumba for the new ARM PCs? And how many new hardware specifications will arise because Apple doesn't feel like supporting (say) USB any more?
  • Well stated points.

    I would like to add that the development of chips is best left to the (few) chip developers... and these days there are really only a handful of contenders that can pony up the billions of dollars necessary to create a viable foundry (Intel/AMD/IBM) to then mass produce them. For example, IBM built a new foundry in Fishkill, NY for approx $3.5 billion (and the price keeps going up).

    While I understand that it *may* be in Apple's interest to invest in its own hardware, I don't see the manufacturing costs of owning a foundry, PLUS god-knows-how-much in development costs, not to mention development D.E.L.A.Y.S., making sense. By the time they finally get up to the current speeds of AMD/Intel chips, the PCs will be on 8 cores running at 5 GHz. It's a moving target, and unless you at least have them in your scope you'll never catch up. It makes no business sense whatsoever - unless they have some other trump card and are keeping quiet about it. But let's face it, ARMs belong in cell phones and PDAs like the iPad, not desktops, servers (EGADS), and laptops.
    • ever heard of fabless ?

      You have two separate business in chips, design
      and manufacture.
      • Precisely

        Samsung is making Apple's A4s today, but Apple could shift its manufacturing to TSMC or other chip fabs almost at the drop of a hat.
  • Amen! [nt]

  • They'd do it if it made sense, but it's about profit not "control"

    I'd also look at the hackintosh market from the other way. The market
    sector that wants to fiddle around with their computers so as to get
    the operating system (!) to work is truly small. Even if hackintoshing
    were snap easy, it would require opening a manual and be fraught with
    peril: will the next update mean I have to tweak stuff to get back to
    what I have today, which may be a compromise to begin with?

    And I come back to this, if people in general, were willing to be
    adventurous so as to save a few hundred dollars, we'd have a lot more
    Linux desktops out there.

    Now, where Mr. Perlow makes a major error is in his emphasis on
    Apple's alleged control-freakery. Apple's big lesson, and it goes back
    to the 1980s, is that consumers look at a device in a holistic
    way, and primarily for what the device does. The manufacturer
    implements the device with software and hardware and the best
    devices make the underlying engineering invisible. While there are
    those people who will want to super-charge their toasters, for most
    folks, something that looks nice on the counter and which toasts the
    bread is enough. You can only toast two pieces of bread in a single
    thread and there is only one thread? Okay.

    If the user experience is the dog that wags the tail of profitability,
    well, it is fully understandable that Apple makes sure that
    independent people don't screw up the experience and tarnish their
    brand.

    So, I imagine the PA Semiconductor acquisition is not about
    transitioning the Mac back to non-Intel, though, if Intel became
    unresponsive to Apple's requests, it is a possible leverage point. It is
    about having more control on the hardware side of their consumer
    devices, which are only incidentally computers, and which will be sold
    as a personal music player, a smart phone, an internet media display,
    and so on in the future.

    I mean, after all, Apple's success with the iPhone didn't mean they
    went back and put a telephone in the Mac. I suspect that the Mac will
    remain a personal computer and be as "open" as it is now. Being able
    to say to a customer "... and you can run Windows if you want." is a
    sales pitch that is very useful, as has been pointed out elsewhere.
  • Just a Couple of Things

    You hit the nail on the head for your first two points.

    As to point number 3, I agree in essence. Just a bit of a technicality here. Hackintosh users are not "pirates" unless they don't buy the OS. Yes, Apple would love to keep organized efforts like Psystar from existing, but Psystar made it too easy for them by not purchasing OSX for every system they sold. That made them clearly in violation of copyright.

    I think their justification was that one OSX license was good for more than one machine when purchased by a Mac user. However, they failed to grasp the subtleties of copyright law or the license for OSX. The license can't be split between households. You can use it on more than one computer of your own, you can't load it on one of yours and those of two friends.

    The key to your fourth point isn't any lack in the ARM architecture. It's the question of who will push ARM in the right direction to power desktops. It's the same issue Apple faced with PowerPC chips. Nothing in the design of the PowerPC chip prevented it from being pushed as far as x86, but nobody was pushing in the right direction. Motorola's research and development was going toward low-power chips for handheld devices. IBM wasn't concerned about economies of scale and making a powerful enough chip cheaply enough. The question is, will anyone push ARM in the direction it needs to go to once again conquer the desktop?

    Point number 5 I don't see being an issue as much for ARM as for x86. Just because you develop a quad core ARM for the desktop doesn't mean you have to use it for portable devices. If anything, ARM still has an advantage for power consumption over anything based on x86 architecture.
    • Ok...

      Re: Point 3 - Point taken. Wasn't aware that Psystar wasn't shipping a fresh copy of OSX with every system. Silly move on their part.

      Point 4 - I was under the assumption that IBM/Motorola couldn't meet the demand Apple had for the G5 chip and a mobile G5 chip was just not happening - and that is why they switched to Intel.

      Point 5 - It may be a concern if it takes that quad core ARM chip to do the equivalent of a dual core X86 based chip. In which case, it would be an issue.