Rise of ARMs: How changing Mac’s processor could change the world

Some say once all a ship’s parts have been replaced, after years of service, it’s no longer the same ship. Mac may not be Macintosh any more, but Apple’s revival of an old idea suggests history may not have changed as much as we think.

ARM processors everywhere: What this means for the tech economy

The return of the ARM processor and RISC technology to the public discussion of the desktop computer market is like stumbling upon an episode of "Friends" when flipping through the "Favorite Channels" on your TV remote. First, you get a little giggly smile, a moment of heart warmth, watching those formerly young folks with their rainbow umbrellas dancing. You remember the sillier moments when characters did things you didn't expect, back before TV became saturated with such moments on a daily basis. 

Then, for the next five minutes of the show, the characters recite a script you've seen played out so often you can almost picture the director's handwritten margin notes. You know where it's going already. So you consider bowing out and hunting for something that won't lead you down Exactly the Same Tired Sitcom Path.

Also: Arm-based Macs: Smart move for Apple, but irrelevant to the future of Windows PCs

Since the 1980s, the most successful (i.e., most widely read) technology news stories have been about competing methods: MS-DOS vs. DR DOS, GEM vs. Windows, Apple vs. IBM, Motorola vs. Intel, RISC vs. CISC, Linux vs. Windows, Europe vs. Windows, Apple vs. Windows, Apple vs. the music industry, Apple vs. ___. Most of my career has been about covering technology issues that were best symbolized with a fulcrum.

So the fact that Apple's recent news resuscitates the 30-plus-year-old RISC vs. CISC debate should be welcomed by any tech news publisher that appreciates the heart-warming power of reruns. During the pandemic, when no manufactured story can hit harder than the everyday troubles of just going outside, we can use all the reruns we can get.

Apple's ARM Processor Moment, as a few dozen amateur YouTube historians will inevitably call it, is a signal of the impending collapse of a very old (at least, as old as old gets in this industry) and very prevalent logjam. In late June, Apple announced it was shifting the Mac's CPU from Intel x86 over to its own, ARM-based "Apple Silicon." Using design and manufacturing methods honed on its existing ARM-based chips for iPhone and iPad, the company will build custom processors for Mac around the ARM64 instruction set it licenses from Arm Ltd.; the Developer Transition Kit it is already shipping to developers runs on the existing A12Z, an iPad Pro chip. (Apple was actually a co-founder of Advanced RISC Machines, the original ARM processor maker, and a long-time stockholder in ARM Holdings.)

I know you're tempted. You've reached that point of realization. The Back button is just within reach. Yet I ask you this once, bear with me. Yes, this column is about the same garden path you've been down before, I admit it. Yes, it's a pandemic, and it's hard to care about technology when more than 1,000 Americans are dying every day, and secret police are taking out their frustrations on suburban moms with batons. But no, this won't lead you to the same destination you've seen a million times.

The tail end of Moore's Law

Until a few years ago, Intel's formula for business success was Moore's Law, named for Intel co-founder Gordon Moore. There was a virtuous economic cycle in cramming more components onto integrated circuits at a predetermined rate. Put another way, there is a market rate for appeasing the consumer's appetite for bigger and better processors, even if "bigger" isn't something the consumer can physically see. If you can produce and sell componentized electronics at that market rate, you can be assured of a comfortable margin.

RISC's counter-argument to Moore's Law, as best demonstrated by ARM processors, has always been a David vs. Goliath story (you've got to love that "vs."). There's an efficiency gain to be realized if you can perform the job of one huge instruction with eight or ten smaller, logically connected instructions. And in controlled circumstances, those gains can outweigh the performance advantages of cramming in more components. It's why the smartphone in your purse or pocket could perform the functions of a full-powered PC if you wanted it to, even though its CPU doesn't need a fan to cool it.
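
The trade-off can be sketched with a toy example (purely illustrative, not real ISA semantics): the classic memory-increment operation, which a CISC machine can express as a single read-modify-write instruction, while a load/store RISC machine breaks it into three simple steps, each of which is cheap to decode and pipeline.

```python
# Toy illustration of CISC-style vs. RISC-style instruction decomposition.
# "mem" stands in for memory, "regs" for the register file. Names and
# structure here are invented for the sketch; real ISAs are far richer.

def cisc_inc(mem, addr):
    # One complex instruction: read-modify-write memory in a single step,
    # roughly like x86's  INC [addr].
    mem[addr] += 1

def risc_inc(mem, regs, addr):
    # The same work as three simple instructions, each touching memory
    # at most once -- roughly like ARM's  LDR / ADD / STR  sequence.
    regs["r0"] = mem[addr]        # LDR r0, [addr]
    regs["r0"] = regs["r0"] + 1   # ADD r0, r0, #1
    mem[addr] = regs["r0"]        # STR r0, [addr]

mem_a, mem_b = {0: 41}, {0: 41}
cisc_inc(mem_a, 0)
risc_inc(mem_b, {}, 0)
assert mem_a[0] == mem_b[0] == 42  # identical result, different decomposition
```

The RISC bet is that three trivially simple steps, fed through a deep pipeline, can beat one complicated step that requires elaborate decoding hardware, which is where the efficiency gains described above come from.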

Up to now, x86's competitive advantage, at least in the markets for larger computers, has been sustained with a bit of leverage, supplied in due course by a series of fortunate circumstances. The software base for nearly all large-scale computing has been compiled for x86 processors. Most folks who still use PCs still need Windows. (Yes, there is a Windows 10 for ARM processors. But have you seen it?) The accelerator industry, the GPU industry, and all the interfaces we've built for computers presume the omnipresence of Industry Standard Architecture, whose last go-round in the "vs." arena came just before Iraq invaded Kuwait.

It's not that x86 processors are the same gas-guzzling giants they were in the Pentium days. Intel engineers are responsible for many of the greatest efficiency gains data centers have seen over the last six years. At the start of the Obama administration, the world's data centers were projected to consume as much as 12% of the world's total electric power by the end of its second term. Today, researchers including Jonathan Koomey (by my gauge, the Anthony Fauci of electricity) have shown the actual figure to be closer to 1%.

So, we breathe a sigh of relief, for now. The unsustainability of our environment, along with that of our government and our culture, is something we're able to reasonably ignore, within limits, as long as economies of scale -- such as the one Gordon Moore discovered -- chug right along and don't collapse on us. We know we live today with a technology infrastructure that is unsustainable, in and of itself, for the long term -- not just in x86 processors, but the entire infrastructure network that supports them. AT&T has been sounding this clarion call for years and repeats this message wherever it can.

The business end of the brave new world

Economies of scale have limits. Moore's Law proved that competitive advantage and commoditization could co-exist, and that the former could comfortably outpace the latter. All you needed was 1) a relatively stable global economy, 2) a nominally functional supply chain, 3) a secure stash of disposable income equitably distributed among consumers, and 4) the full and uninterrupted cooperation of the laws of physics.

At this moment in history, we're one for four, and the one is hanging by a gossamer thread. One bad phone call from China, and it's all lost.

In any technology market since the invention of the rock, two forces are simultaneously at work. The supply side seeks to obtain a competitive advantage, and then to lock it in. The demand side drives effectively every product and service toward mass commoditization, to ensure availability and affordability. Every effort to automate the process -- or, like Moore's Law, to declare it automated for us -- is a balancing act between these two forces. A market will tolerate their coexistence, within limits, as long as we play like everything's peaceful and copacetic, and the "vs." in our headlines is no more meaningful than a pro wrestling match.

Also: Apple silicon: Why developers don't need to worry TechRepublic

We've reached those limits -- indeed, we've surpassed them. We can no longer cram things on top of precarious platforms and expect another 18 or 24 months of uninterrupted profitability. Apple shifting its Macs from x86 to ARM, with the promise of greater efficiency and of translated x86-based programs still running at acceptable speed, should be a red flag warning for all of us in the technology business, like watching everyone in some country somewhere suddenly wearing masks. Our free ride is up. The days when Linux vs. Windows even mattered have passed. Thirty-year-old reruns of old architectural squabbles may as well be the Lincoln-Douglas debates.

Our next moves in this world must be bold -- much bolder than even Apple's -- lest we leave behind no more than the decayed remains of a rerun of our society's downfall.

Told you.