If you're reading this article here on ZDNet, you're also probably not most people.
We are a unique breed of human known variously as technologists, geeks, nerds, or the guy down the street who can fix your printer. We do not think like most people. We do not act like most people. And as such, we -- I include the entire computer industry -- don't understand, at a fundamental level, what drives most people.
The Apple II was introduced on June 10, 1977, a whopping 38 years ago. While there were other personal computers introduced around that time, it's a convenience for us to mark June 10, 1977 as the beginning of the personal computer age.
The idea of the personal computer is interesting, because there were certainly business computers before June 10, 1977. There were even computers used by people for personal use before June 10, 1977. But right around that time is when people started using computers for home stuff, for things that didn't involve managing an office or managing processes.
I bought my parents their first personal computer, a dual floppy-drive CP/M machine, in the early 1980s. My dad used it to keep track of his stones; he was some sort of gem and rock collector. My mom used it to write. She wrote poems and short stories, and even a pretty darn good (but sadly unpublished) novel.
In the 1980s, personal computers were truly personal, because they didn't connect to other computers. With the exception of a few dial-up BBSs that we non-normal people used, personal computers were standalone islands. And yet, they were personal. They were used for simple activities, like balancing checkbooks, writing novels, and playing games.
Sure, some people used personal computers to write programs. We got Prince of Persia (and the entire Prince of Persia franchise) because Jordan Mechner wrote Karateka and it made a few bucks, giving him the opportunity to write more games. But Jordan isn't normal people, either. Normal people play games, they don't write them.
For nearly three decades, from roughly 1977 to 2007, consumer computing ran on two tracks: PCs and video game consoles. Those consumers who just wanted to play video games bought Nintendos or Xboxes or PlayStations (yes, I had a TurboGrafx-16 and a Dreamcast, and they both rocked!). Consumers who wanted to do a little more bought a PC (or a Mac, which is still a PC).
This was a very big business. Yes, there was also the very big business of PCs for business, but that's almost a different universe. Sure, because PCs are general-purpose devices, some people might buy a PC and use it in the office and some other people might buy the very same PC and use it at home, but that was because -- at least until about 2007 -- the only way you got most of the home functionality was to use a PC.
If you wanted to use email, get a PC. If you wanted to go onto CompuServe or AOL, get a PC. If you wanted to share baby pictures with Grandma, get Grandma a PC. Consumers didn't buy PCs because they wanted PCs, they bought PCs because they wanted to see kitten and puppy and baby pictures, talk about their favorite sports teams, keep track of their favorite recipes, and write poetry.
This is why, as John Morris described here in ZDNet last week, "things went from bad to worse for the PC industry."
One of the people who most influenced my understanding of business strategy is Michael Porter of Harvard Business School. Porter's text, Competitive Strategy, has been essential to my development as a strategic thinker. I can't recommend it highly enough.
Porter describes what he calls the "five forces" model. These are the forces that drive competition: new entrants into a market, the bargaining power of customers, the bargaining power of suppliers, rivalry among existing firms, and the threat of substitute solutions.
The PC market is being decimated on the consumer side by substitute solutions.
Earlier, I designated June 10, 1977 as a convenient start date for the personal computer. Let's bookend that by designating June 29, 2007 as the end date of the dominance of the personal computer.
What happened on June 29, 2007? I'd give you three guesses, but you only need one: the iPhone was released. Now, to be clear, the dominance of the personal computer didn't end because of the iPhone. The dominance of the personal computer ended because of the smartphone. But the iPhone was the launching point for smartphones, and so June 29, 2007 is the date we can put on the tombstone of the dominance of the PC industry.
It actually took about a year before the PC industry started to lose consumers. It wasn't until the app store model showed up that the writing was truly on the wall.
Think about what app stores did for consumers. Before app stores, you had to install software. That often required downloading zip files, unzipping them, running an installer, and on and on and on. Or popping open a disk drive and physically inserting disks, one after the other.
Managing software was more complex than the typical consumer was able to handle, so once again, the weird geeky guy down the street was the go-to guy.
But app stores are easy. One click, and the app is installed. There aren't even any license codes to type in. Just push a button and pay your money: no codes, no configuration, no compatibility issues. It just works.
Consumers don't care about the things we geeks care about. The world evolved and consumers discovered Facebook and YouTube. Sharing puppy and kitten and baby pictures got even easier. Talking smack among friends about sports scores got easier.
For those who wanted a little more screen real estate but still the same ease of use, tablets became commonplace. My mom could have very easily written her novel on an iPad or an Android or a Fire tablet. She certainly didn't need WordStar anymore.
Windows updates. Viruses. Driver compatibility. All of these things were ugly thoughts that consumers just didn't want or need to hear. And so the PC market collapsed. Sure, PCs are still sold to power users and business people. I write code and build enormous presentations and produce videos, and so I have a PC (well, actually an iMac) with four monitors on it. I have another Mac on the side of the couch with a swing-out arm, and an incredibly powerful gaming rig powering my 65-inch TV.
But I'm not normal.
The point of all this is that real-world people were finally able to take control of their access to technology, because smartphones and tablets eliminated much of the technical skill and hassle that had been natural barriers to computer use -- which today really means Internet access.
The smartphone has been around for almost a decade now, and we've seen the damage it has done to sales of traditional PCs. But while tablets and smartphones were incredibly viable substitute solutions for PCs, they did not remove all the barriers to entry for consumers to gain access to the Internet.
There's one more barrier, one that smartphones and tablets actually make more difficult: cost.
I have an iPhone 6s Plus with 128GB of storage. This is a $950 phone. As such, it's far more expensive than many PCs and laptops. With AppleCare+, it's $1,150. I pay it monthly, in $45 increments, and next October, I'll get an iPhone 7.
At $45 a month, I don't really feel the impact of the cost of the device, but if you do the math, I'll be spending $540 for 12 months' use of a phone -- and that's not counting the cost of the phone service itself. That's a hell of a lot of money. It's far more than many people can afford (and it's also far more than many smart people are willing to pay).
When you combine data plans with phone costs, most people with smartphones have to pay nearly $100 a month. That's a lot of money to many people.
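The arithmetic above is easy to check. Here's a minimal sketch; the $55 service figure is my assumption, chosen only to illustrate how device payments plus a data plan land near $100 a month:

```python
# Back-of-the-envelope check of the phone-cost figures above.

device_monthly = 45          # monthly iPhone payment, per the article
months_per_year = 12
device_yearly = device_monthly * months_per_year
print(device_yearly)         # 540 -- a year of payments on the device alone

# Hypothetical service-plan cost (an assumption for illustration);
# combined with the device payment, it reaches about $100 a month.
plan_monthly = 55
total_monthly = device_monthly + plan_monthly
print(total_monthly)         # 100
```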
Ever since the personal computer became personal, sociologists have talked about something called "the digital divide." The digital divide is the series of socio-economic barriers that separate the digital haves from the digital have-nots.
In 1997, not having access to digital technology was unfortunate, an inconvenience. In 2015, not having access to digital technology is a life-sucking handicap. Today, if you want a job, the odds are you'll have to apply online. If you need to get support for most services, you need to go online. If you need access to government services, you're often guaranteed faster service online. If you want government-supported health care, you need to go online.
If you can't go online, you are a second-class citizen. You will suffer considerably compared to those with online access. Heck, if you can go online, you can take college courses for free.
Back in the days when consumer PCs were the thing, we in the industry used to talk about how transformative it would be to get PCs and laptops to below $1,000. The original Apple II was $1,298, which is more than $5,000 in today's money. The first Mac was $2,495 -- which would be more than $5,500 in today's money.
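Those inflation conversions can be sanity-checked with rough CPI multipliers. The multipliers below are my approximations for 1977-to-2015 and 1984-to-2015 dollars, not official figures:

```python
# Rough inflation check for the launch prices above.
# The CPI multipliers are approximate assumptions, not official BLS data.

apple_ii_1977 = 1298     # Apple II launch price, 1977 dollars
mac_1984 = 2495          # original Mac launch price, 1984 dollars

cpi_1977_to_2015 = 3.9   # assumed multiplier, 1977 -> 2015 dollars
cpi_1984_to_2015 = 2.3   # assumed multiplier, 1984 -> 2015 dollars

print(apple_ii_1977 * cpi_1977_to_2015)   # roughly 5062: over $5,000 today
print(mac_1984 * cpi_1984_to_2015)        # roughly 5738: over $5,500 today
```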
Think about the barrier of entry for consumers. Those of us who, today, can afford $5,500 for a computer or $1,150 for a phone are in a completely different socio-economic place than those who live on a fixed income, live off Social Security, or work a basic, minimum-wage job.
In fact, my iMac cost pretty near that $5,500 -- but I use it for work and configured it to basically be the Helicarrier of desktop PCs. It's a brute because I need brute capabilities for what I do.
Think back though to the launch of the Apple II and the Mac. Consumers then were expected to have the buying power equivalent to spending $5,500 on a PC today. That's some big money. And that's why there is such a digital divide.
Now, imagine what would happen if the barrier of entry to digital technology wasn't so high. As soon as laptops dropped from $5,000 to $500, a lot more people bought laptops. In fact, it's not tower PCs but laptops that are the dominant form of PC in today's world. It makes sense. They're portable, convenient, and use less power.
What would happen to the digital divide if access to the Internet got cheaper still? What would happen if consumers could get their hands on a tablet for as little as fifty bucks? Not fifty bucks a month. But just fifty bucks.
Fifty bucks is affordable to almost anyone. Fifty bucks has the potential to close the digital divide. It has the potential to be life-changing for some people out there.
That's why Amazon's new $49.99 Fire tablet is so interesting. No, it's not a top-of-the-line machine. Yes, it's a bit clunky. Yes, it feels like a device from 2012.
But who cares? It's fifty bucks. At fifty bucks, anyone can have a window into the Internet.
There are, however, some challenges. The $50 tablet gets us part of the way to closing the digital divide, but it's not a complete solution. There are still two major barriers for the have-nots that a $49.99 Fire tablet doesn't overcome.
First, there's getting the tablet. To order online from Amazon, you need online access and you need a credit card. Many consumers who could derive life-changing benefit from a fifty dollar tablet don't have online access and certainly don't have a credit or debit card.
This is where retail comes in. For the $50 tablet to erase the digital divide, it has to be available in WalMart. It has to be available somewhere where a consumer can go and pay cash to buy the device. Until it's accessible to those who can't go online, the tablet still remains unreachable.
Update: It's been pointed out that WalMart does offer a $50 tablet (although not in stores near me). That opens up one important door to Internet access, but it still leaves a big problem that's not getting better...
Second, there's the Internet service. There are many open WiFi points throughout the country, and if you had a $50 tablet, it's relatively easy to find an open WiFi point to slurp up some Internet juice. But putting WiFi into homes is still costly. Internet access (along with cable and satellite bills) is among the only categories of consumer goods and services that have gone up in price over the past two decades.
These two barriers, inaccessibility for purchase and the cost of Internet access, are what prevent the $49.99 Fire tablet from transforming the digital divide.
But there is reason for hope. The biggest challenge from a technical perspective was cost-reducing the technology enough that producing it in volume at a vastly reduced price was possible. Getting distribution, while something of a challenge (because the more middlemen involved, the less profit to be applied to product cost), is possible.
But if we can open up universal WiFi, even just in certain communities where the residents themselves can't easily afford Internet access, we may have a way to open doors previously closed to those who couldn't afford a $500 or $5,000 PC.
So the next time you hear about the PC business being decimated, remember that it's happening because consumers were never really comfortable with the PC model to begin with, and remember, too, that there are many other opportunities out there to meet and exceed the needs of customers.
Because of the limiting reasons I described above, I can't predict whether the fifty buck Fire tablet will be a winner for Amazon. But I can tell you this: it's a harbinger of a future where the digital divide might not be so divided anymore. And that's a really optimistic and hopeful thought.
By the way, I'm doing more updates on Twitter and Facebook than ever before. Be sure to follow me on Twitter at @DavidGewirtz and on Facebook at Facebook.com/DavidGewirtz.