Cloud computing is the future, but keep an eye on those monthly bills
There was a time when all computing was work computing. Back in the days before personal computers, nearly all computer technology was for some sort of business use. Yes, there was also educational use, but the educational use of that primordial computing technology was either to solve scientific problems, manage the school (so, business), or teach computing technology.
Even then, there were glimmers of the recreational in computing technology. Some of the earliest games, like the ancient Tic-Tac-Toe and Spacewar, ran on minicomputers and mainframes. In fact, early versions of many of the recreational computing activities we perform now existed back in the ancient times of the 1960s and 1970s.
Early forms of social networking can be traced back to Usenet, along with early forums, and even early MUDs. Many of these recreational computing activities originated in universities, but they were played after hours at work. The thing is, until people were able to bring computers home, recreational computing rarely occurred at home.
The personal computer changed all that. It doesn't matter whether we're talking about an Altair 8800, Apple II, Macintosh, or Windows-based machine, the one-computer-one-person model helped personalize computing activities. People began using computers at home to manage finances, communicate with friends and family, play games, and support creative endeavors.
What made personal computers interesting was that they were useful for both work and home activities. The Apple II became the poster child of home computers, but its true popularity blossomed because of VisiCalc and how it improved productivity at work.
Also: From paper tape to a patched-together Altair 8800, the story of my first computers
The same was true of what we now call the PC. Initially, it was just a display, processor, and keyboard, like the early IBM PC and PC XT models. These were capable of running Lotus 1-2-3. Even though they were far more expensive in 1980s money than an iMac Pro is in 2018 dollars, they were worth it, again for the productivity benefits.
When good gaming-quality graphics came along, PCs blossomed. This was the era when Windows ruled all. PCs were Intel-based, had a graphics display, mouse, and keyboard. Gamers and power users would buy the most powerful PCs possible, while regular clerical users made do with hand-me-downs from previous generations or less powerful machines.
This was the endpoint model of computing that carried us until sometime between 2007 and 2010. Even servers were based on essentially the same architecture as home machines. PCs (and Macs) got more and more powerful, cheaper and cheaper, and less and less distinguishable from each other. It got so bad that to stand out, some PCs were equipped with glowing lights, just for kicks.
Also: In 2018, Windows died at home and nobody cared
Today, we generally call these machines desktop machines, even though notebooks actually stormed the market. Although the lightest notebooks didn't run triple-A games as well as some would have liked, mobile computing shared the same fundamental mouse/keyboard/Intel architecture as the desktop machines.
For developers, power users, and computer producers, this model was both good and bad. The prevalence of this usage model meant that the market was huge. Even very specialized add-ons could support a business because the total available market was astronomical.
Developers could develop for Windows and be reasonably assured of a market, and while the Mac was only about a tenth of the overall market share, the lack of as much competition among developers and Mac users' willingness to spend a little more often made it worth it for developers to create Mac-only or cross-platform products.
For edge-case users (as Jason Perlow often describes me), this was an ideal market. No matter what you wanted to accomplish, there was a way to patch together a solution. Customization was common, and many enthusiasts happily built their own rigs and entire setups.
The only problem was all the muggles, those folks who wanted to use computing technology on occasion, but didn't want to learn how to build a box, install Windows, or, as we did back in the day, deal with driver conflicts. In fact, most muggles didn't even know how to find applications, let alone install them.
Remember, this was in the days before cloud computing and mobile apps. It was also in the days before high-speed broadband and fast mobile networking were available to just about everyone. Yes, the web was out there, but it was very Web 1.0.
Also: WWDC 2018: Why the Mac you know has no future
In 1997, when I started ZATZ, if you wanted to publish an article online, you often had to know how to code HTML. There were few blogging platforms. There was no Facebook. If you had something to say, you often had to know how to get a hosting provider (or run a network to your closet), how to create accounts, how to set up a web server, how to code HTML, how to upload it, and so on.
But that soon changed. A lot more computing activities became turnkey in the early 2000s. We also started seeing more and more security threats, in large part because users worldwide were now permanently connected to each other.
SaaS (Software-as-a-Service) applications (like Salesforce, Gmail, QuickBooks, and so on) became the norm. In 2002, when my company was based in New Jersey and my top salesperson got married and wanted to move to California with her new husband, I had to say goodbye. We were running ACT on our local LAN, and there was no reasonably convenient way she could get into the system remotely to do her job.
Had Salesforce (or the hundreds of other relationship management tools like that) been around back then, it would simply have been a matter of her logging into her account from wherever she happened to be. Because of cloud computing and mobile apps, when I evacuated from Florida to Oregon last fall, there was barely a blip in my productivity.
With SaaS apps, installing a program is really a matter of signing up. The hard work is digging out your credit card number, entering your email address, and choosing some password that's not "password."
While there can be considerable tweaking done for some cloud-based applications, getting started as a new user is about as friction-free (and expertise-free) as it can be.
The same is now true of app store apps. Before the app store model, finding and installing apps was often a task that required some level of expertise. In fact, they were called applications, not just apps. You had to order the disk (or, later, download the file), unzip it, run the installer, and so on.
Now, the only thing most mobile users need to do when they decide to run an app is click install (and sometimes provide their fingerprint). Apps install and uninstall with as much ease as pressing a Like button on a Facebook picture of a cute kitten.
The proliferation of high-quality web-only applications means that students and educators can conduct their entire educational curricula online, through a browser window. All the cost, complexity, and overhead of traditional desktop PCs or laptops is no longer necessary.
This is why the Chromebook has taken over with such force in education. It's cheap, configures instantly, and does the job.
There's no need at the beginning of a semester (or throughout) to clear hard drives, reinstall images, and otherwise fight with getting students and faculty up and running. Just boot up and log in. No muss. No fuss.
For much home use, phones and tablets are not just good enough, they're actually better than traditional desktops and laptops. There's an incredible depth of capable, inexpensive, easy-to-use apps that tightly integrate with the mobile device environments.
Whether it's reading, writing, editing and curating photos and videos, or many other creative and social activities, the apps on mobile devices do the job quite well. For those who need to type, pairing a keyboard with an iPad or other tablet works so well that it's the default behavior for many. Others simply dictate what they have to say, conducting all of their messaging and online social activities through their phones alone.
Gaming, of course, is huge on mobile devices. For those with a desire for more serious games, consoles do the trick. While PC gaming is still a powerhouse, the bulk of gaming consumers can get by with a phone, a tablet, and/or a PS4 or Xbox.
The bottom line is simple. Consumers (and I'm including students here) no longer need traditional desktop or laptop computing. Everything they need can be accomplished by a mobile device, apps, and cloud-based applications.
This reality can be seen in the dwindling PC sales numbers. If machines based on the keyboard/mouse model are no longer needed by consumers, fewer machines will be sold.
But that's only part of the story, because computing is used for more than just consumer activities. People doing serious work need computing power, and for them, the limits of mobile technology are sometimes just that: limiting.
Developers, for example, often need big screens, as do video editors. A little 9.7-inch or even 12-inch screen just won't do. I use three computers in an odd combination of connections to run my live video studio. Most video producers need a ton of technology to manage all the feeds and switching necessary to produce production-quality video.
CNET: Best Desktops for 2018
Many work-based projects require huge, fast storage, lots and lots of RAM, lots of screen real estate, custom peripherals, and more. Even your basic spreadsheet jockey, who has to manage and work with spreadsheets all day, benefits from a mouse, because holding your arm up to a touchscreen all day is fatiguing and can strain shoulder muscles.
The point of all this is that while consumers and students can get by using alternatives to the traditional PC (or Mac), workers (and enthusiasts) often can't. These so-called pro users need power, flexibility, and options.
The problem is that power, flexibility, and options do not a mass market make. It's far more cost effective for vendors like Apple to crank out millions of similar machines that will appeal to a large mass audience than it is to produce a small volume of custom machines or tweaker-level parts.
Those products still exist, especially in the form of the wide array of motherboards, cases, and components available from PC component makers. There is still demand. It's just not necessarily a demand big enough for an Apple, or a demand that's universal enough to support large scale manufacturing build-outs.
That's the challenge we edge cases are facing. Whether we're scientists in a large corporation, creatives in a movie or TV studio, or home-based workers using technology to magnify and augment our otherwise small-scale operations, we pros definitely need serious power and customizability.
TechRepublic: Want to build a home lab for containers and virtualization? Consider mini PCs
We are still traditional PC and Mac power users. That desktop and laptop, keyboard and mouse, Windows and macOS model is still in demand. It's just no longer universal.
On one hand, that's something of a blessing, in that we're spending a lot less time troubleshooting Windows installs on cousin Bob's old XP machine. Yes, we're still trying to explain routers and Wi-Fi passwords, but it's not the same.
On the other hand, those of us who need specialized solutions (like my need for a powerful headless Mac or Mac mini) are struggling to find answers. The selection that used to exist because consumers contributed to the purchasing power has shrunk. Because we're pros, power users, and enthusiasts, we will find solutions to meet our needs. But it's more of a challenge in some ways than it was before.
The architectural model of computing, and therefore the business model of computing products, has split. Workers do use consumer products (and demand consumer-level ease of use), but they also need flexible power. Some consumers use traditional machines for gaming or other specialized use. But in the main, workers and consumers are now taking two different computing paths, with a little crossover.
We'll see where that takes us. It used to be that home computer use prepared one for on-the-job computer use. That may no longer be the case. It used to be that large companies like Apple made a wide range of products suitable for pro users. That may also no longer be the case.
What will be interesting to see is where it all goes. Does consumer technology get more flexible? Will there be 27-inch or 36-inch iPad displays? Will there be a pro hub that supports a wide variety of peripherals and uses mobile devices as its screen or its basic computing power?
If Jason is to be believed (and he's one of the wisest and most accurate prognosticators I've ever met), the days of Windows- and Mac-based PCs are numbered. We kind of have a view of what will take their place at home. But the big question, at least in the mind of this pro user, is how pro computing needs will be met if Windows and Mac fade into the desert's dust.
You can follow my day-to-day project updates on social media. Be sure to follow me on Twitter at @DavidGewirtz, on Facebook at Facebook.com/DavidGewirtz, on Instagram at Instagram.com/DavidGewirtz, and on YouTube at YouTube.com/DavidGewirtzTV.