OK, let's get the rules of engagement out of the way. I posted that I was working on this topic to Facebook and that posting blossomed into 137 replies and some fascinating in-depth discussions. Here's a link to that thread, which most of you should be able to read. If you comment, please be polite and constructive.
Also, please note: The following article aims to be a tactical analysis that explores technology adoption alternatives. It's not meant to be a religious screed. I use Macs, Windows machines, and Linux pretty much on a daily basis. Your preferences and your mileage may vary. I'm not saying you have to make a change. I'm simply exploring the question of whether a change is in order, and why that might be.
Keep in mind that any major architectural shift in technology presents a natural opportunity to explore alternatives, and that's exactly what I'll do here.
Finally, I want to point you to a hyperbole-free, zero-snark and comprehensive look at why you might want to choose each of the big three operating systems.
OK, here we go...
Within the next year or so, some of us will start thinking about buying new machines to replace our current ones.
This is not a new phenomenon. Every year, Apple releases a new version of MacOS and every year, some machines are left behind. Those machines will not be able to upgrade to the new OS release and, eventually, will become vulnerable to security issues and no longer able to run certain applications.
In my case, I always wait six months to a year before upgrading operating systems. Only two of my current Macs have been updated to Mojave and I'll probably wait even longer to upgrade the rest. But there is a limit to how long I can wait. As I showed a few years ago, when the apps you rely on no longer function on the OS version you're running, time's up. You have to upgrade.
So, while many of us don't need to consider replacing hardware this week, it is an inevitability. I can probably keep running my fleet of newly obsolete Macs well into 2021, but by 2022, I will have to replace them.
And that raises the question: Replace them with what?
When hardware needs to be replaced, it becomes a natural time to consider alternatives. That, in turn, opens the door to the question of switching -- not to a new Mac, but to Windows.
My Facebook correspondents had a lot to say on this topic, but for now I'll mention one issue: cost. It will be damned expensive replacing all those Macs, even with Windows machines. It'll be doubly-damned expensive replacing them with new Macs.
In a down economy reeling from the pandemic's impact, cost is a factor.
Okay, let's move on.
Beyond the year-by-year obsolescence that comes when operating systems no longer support the underlying hardware, the entire installed base of Macs will become obsolete within 2-5 years. There will come a time, probably in 2024 or 2025, but possibly as early as 2023, when Intel Macs will no longer get operating system updates.
At that time, owners of Intel-based Macs will face the same question I mentioned above: Replace them with what?
As with year-by-year obsolescence, cost will be an issue. The Mac mini I bought in 2018 was a bit over $2,000. While I wouldn't be able to get exactly the same features in a PC, I did price out a roughly equivalent PC and it came to about half the price.
So, here's where I'll leave this part of the discussion. Within the next three to four years, all Mac users will have to face the decision of whether to upgrade their hardware with new Macs, or with non-Apple PCs.
The Linux question
Before I move on, I need to address the Linux elephant in the room. When I posted the title of this article on Facebook, a not-even-slightly-surprising firestorm erupted around whether Linux makes as good a desktop as, or a better one than, Windows or Mac.
Technology editors Christine Hall and Scott Mace had an interesting back-and-forth about the Linux desktop. Christine opened with, "Why squander a perfectly good opportunity to finally make the move to Linux by making a stop in the Windows world first?"
Scott responded, "Because the Linux desktop UI remains fairly impenetrable, even to Windows and Mac veterans such as myself?"
Christine objected, saying, "Have you tried something like Linux Mint recently? Almost everybody I've set up with it finds it easier to use than Windows. And which Linux UI? Gnome is much different than Windows, yes, but KDE, Xfce, and Cinnamon are not much different than the traditional Windows UI. Linux Mint is pretty much plug and play anymore. My HP all-in-one installed itself in Mint in about two minutes time with no input from me."
There is a lot of common ground between Linux and both Macs and Windows. MacOS and Linux are both UNIX-like: MacOS is derived from BSD, while Linux is a from-scratch reimplementation of UNIX. Even so, they share a lot of under-the-hood functionality. If you're a command-line whiz on the Mac, for example, you'll be immediately at home on the Linux command line.
But since Linux was built from the ground up to run on off-the-shelf PC hardware, Linux shares its entire hardware base with Windows PCs. Virtually every Windows PC (or PC made from Windows PC components) can run Linux. Linux drivers have improved tremendously. As Steven said in the Facebook thread, "I haven't had a device driver problem with Linux in the last five years."
Here at Camp David, I have two old laptops running Mint, and five or six Raspberry Pi single-board computers running various specialty versions of Linux, most of them running OctoPi to control my 3D printers. But... as much as I truly enjoy Linux, I could never move to it for my desktop use, because the applications I rely on -- namely Office and Creative Cloud, as well as Final Cut and a wide variety of vertical Mac apps -- don't run on Linux.
What about Windows?
This is going to be a difficult question for those more focused on their pocketbooks than their desktop UIs. I've admitted how frustrated I've been with Apple's upgrade pace -- although we now know more about the limits Apple hit with Skylake processors, which made a change in architecture direction all but inevitable.
Additionally, the wide variety of innovative PC implementations has often made Mac users jealous. There is no Surface Studio for Mac. There is no touchscreen Mac. Beyond the insanely expensive Mac Pro, there is no way to build a big tower filled with components.
We extreme pro users, admittedly a very small percentage of Mac buyers, have often felt hemmed in by the limited hardware choices Apple offers. While Hackintoshes seemed like a moderately viable option, it was the vast selection of Windows configurations and options that inspired real envy.
But, if all those Intel-based Macs are entering obsolescence, isn't this a great time for some wish fulfillment? Why not get exactly the PC design and configuration you want, and just run Windows?
After all, after five years on the market, Windows 10 has grown into a quite fine operating system. So, it would seem that as Apple drains the life out of older Intel-based Macs, Windows is a viable substitute. Especially since the machines are considerably less expensive.
But are they? Really?
That depends on whether you price them based on how much it costs to transfer them from the factory to your hot little hands or how long you get to use them once you pony up your pennies. I'll give you an example.
For most of the 2000s, my primary machines were Windows -- Windows XP, Windows Vista, and Windows 7. I really liked XP and Windows 7. But beyond the issue of which apps I needed to use (more on that later) was my Windows upgrade cycle.
The pattern went like this: I'd build/buy a new Windows PC with as much power as I could get. I'd spend a month or so getting it configured the way I liked it. I'd love it for about a year or so. By month 15, I'd discover it was just no longer up to the task, or Windows cruft had set in, or drivers broke something critical, or a component failed, or there was some unfixable incompatibility with some component or another. By month 18, it became apparent that if I were to keep turning out my work product on a daily basis, I'd need to buy another PC.
Every 18 months. Wash. Rinse. Repeat.
And these were not cheap Windows machines. The Sager PC I bought in July 2012, for example, cost me $3,000. It had as much memory as I could get, the fastest processor I could get, the fastest media I could get, and it cost a ton. It also failed. Constantly. By month 15, I wanted to fling it out the window.
Now, let's look at my fleet of Macs. The machine I'm using right now to write this article is a 2013 iMac. Way back in 2016, I wrote about how this iMac broke my 18-month upgrade cycle, and I'm still using it. While it's not quite powerful enough for multicam 4K editing, it's great for coding. That's seven years -- and I'm probably going to use it until 2021, so figure eight years, not 18 months.
We're rocking a 2011 Mac mini and three 2012 Mac minis. These will also probably run through 2021, so we're looking at a nine- or ten-year life -- not 18 months. My 2015 MacBook Pro not only has all the ports, but it still has spooky-fast storage. That's a mere five years old. My newest Mac acquisition was a highly-spec'd 2018 Mac mini, purchased in November 2018. That was 19 months ago and far from needing to be refreshed, it still feels as new as the day I got it.
So while Windows machines have more options, are more varied in spec, and are cheaper to buy, I've found total cost of ownership to be vastly less expensive with my Macs. A typical Windows PC cost me roughly $1,000 to $1,500 a year to run. This wasn't a one-off situation. I measured this over the course of about 15 years, and probably 10 main machines.
To be fair, my wife's been using the same Samsung Ultrabook for seven or eight years now, so if you're not as power hungry as my work requires me to be, your cost-of-ownership might not be as high.
That said, by dividing cost of purchase by the number of years in service, my nearly-decade-old-and-still-in-active-use Mac minis have cost me $100 to $150 a year to run. My MacBook Pro has cost $600 per year so far, but by end-of-life that will be down to about $450 per year. My very-expensive-to-purchase iMac will also have cost about $450 per year to run. Even the newest Mac mini, purchased 18 months ago and probably good for four more years of MacOS upgrades, will come in under $400 per year.
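The arithmetic behind these per-year figures is nothing fancy: purchase price divided by years of service. Here's a minimal sketch using the article's own numbers (the five-year Mac lifespan in the second example is an assumption for illustration, not a figure from my accounting):

```python
# Rough annualized total-cost-of-ownership comparison.
# All prices and lifespans are estimates from the article, not measured data.

def cost_per_year(purchase_price, years_in_service):
    """Spread the purchase price over the machine's service life."""
    return purchase_price / years_in_service

# A ~$3,000 Windows PC replaced every 18 months (1.5 years):
windows_annual = cost_per_year(3000, 1.5)  # 2000.0 per year

# A ~$2,000 Mac mini kept for an assumed five years:
mac_annual = cost_per_year(2000, 5)  # 400.0 per year
```

The point of the exercise is that sticker price and cost of ownership diverge sharply once service life enters the equation.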
There are many reasons why Macs tend to have fewer "entropy" issues than PCs, but a big part of the reliability story is the vertical integration of hardware and software. If you look at a typical PC, you'll find components designed and built by many different companies, controlled by a motherboard designed by another, running an operating system from yet another. At design time, engineers try to make sure these components will work together, but because Microsoft's developers can never know the exact configuration you're running, it's a guessing game. By contrast, MacOS developers always know the configuration you're running, because it's one of only ten or so possibilities.
On one hand, that's a huge lack of flexibility that extreme pros tend to chafe under, but it's also a formula for much more reliable engineering. Add to that the generally higher-quality components due to Apple's supply chain rigors, and you get a more reliable machine with a longer life.
From a cost perspective, then, it's hard to say that Windows machines cost less. I have the careful accounting to prove that's not the case. I'm not alone in that observation. Even IBM determined that Macs have a lower cost of ownership than PCs.
All that is to say that my Intel Macs will have paid for themselves by the time they go to the great parts recycler in the sky. And, it's likely that the new Apple Silicon-based Macs will offer similar cost and lifecycle benefits.
But that's nothing compared to...
Apps and the user experience
Here's where my Facebook correspondents weren't the slightest bit shy.
Renowned technology editor Esther Schindler says, "I don't use a Mac (or other equipment) because of the hardware inside the box. I use it because of the user interface. In all these years, for me Windows has always violated the principle of least astonishment. That is, when I don't know how to do something... using a Mac, my first or second guess is usually right. With Windows, I often spend 15 minutes trying to accomplish it. I truly don't care what hardware is under the hood. I only care about my UX."
Our own Jason Perlow posted, "Bottom line is you use the tools that run the apps you need. You shouldn't make a platform decision based on any other reason. Nobody should be deciding to migrate to anything now; the process of app porting and getting all the new systems changed over to new chips is going to take two years. That's a long time." He continued, "I'm not sure why an architecture change necessitates moving to Windows or Linux. It's still the same UX, and you're still bound to certain apps in certain verticals."
That's the case for me. There's no way I'm going back to the pain of Adobe Premiere after three years of Final Cut Pro X success. I have a whole bunch of apps I run that don't exist on any platform other than the Mac.
And, as technology editor Swapnil Bhartiya posted, "People don't use OSes, they use applications. People stay on any platform due to apps. As much as I use all three, for VR I rely on Windows, Linux is for my servers but everything else is on Mac. I can run Windows apps on... Mac. With macOS I get best of all three worlds (it's native UNIX so all Linux tools are at my disposal)."
For me, this is the most compelling benefit beyond cost of ownership. I can run Windows, Linux, and MacOS on one machine. As I mentioned last week, I get great performance running Windows in a Parallels VM, and the ability to cut and paste (and drag and drop) between Windows and Mac applications is a huge win.
Open source attorney Mike Godwin lays down the law on this, saying "Honestly, I can't imagine why this would make a difference to anyone, one way or the other. I don't expect VMWare or Parallels even to slow down. The problem set for virtualization is different now from what it was 10 or 20 years ago."
We'll let Jason and Steven sum up the applications-are-king argument.
Steven says, "Getting to the question at hand. I can't see Mac users moving to Windows. If the apps they use now will be on an Arm Mac, they won't leave. They'll have no reason. And, as I've said many times before, Microsoft is heading to a Desktop-as-a-Service model as fast as they can. Windows as a standalone OS will only be used by developers, content-creators, gamers, and power-users. The Mac and its users will live on very happily just as they did after the 68K to PowerPC and the PowerPC to Intel architecture migrations. If MacOS looks and works the same, and it has the same apps, Mac users aren't going anywhere."
Jason added, "They aren't moving to Windows or Linux. They will move to another Mac when and if they need to. Mac people keep their systems for a long time and Apple supports OS releases for a long time. They won't abandon their hardware that quickly."
That pretty much sums it up. Even though the prospect of buying new, expensive Macs is scary, the total cost of ownership is considerably lower than with Windows. Beyond that, Mac users are Mac users for a reason (or, more accurately, many individual reasons). Like me, they will migrate to the new hardware when it becomes necessary.
For most Mac users, the move to Apple Silicon will be something of a non-event, given Apple's skill in architecture migrations. Most of us will simply move when it's time to buy a new Mac.
But what about you? Do you have Macs now? Are you expecting to move to Arm or will you consider switching to Windows or Linux? Let us know in the comments below.
You can follow my day-to-day project updates on social media. Be sure to follow me on Twitter at @DavidGewirtz, on Facebook at Facebook.com/DavidGewirtz, on Instagram at Instagram.com/DavidGewirtz, and on YouTube at YouTube.com/DavidGewirtzTV.