Update: Special thanks to TalkBacker denisrs for posting this link describing official Mac App store guidelines.
To me, at its most fundamental, software means freedom. Hardware is generally fixed, in that it's physical, and, well, hard. But software is malleable; the very same RAM, CPU, and hard drive can be made to do wildly different things based on the arrangement of bits into different sequences of ones and zeros.
To be fair, I'm a programmer, so I look at a computer differently than regular users do. When I look at a computer, I always factor into my thinking whether I could do better: whether the CMS I'm using is good enough or I'd be happier writing my own code, whether the CRM system I'm using is good enough or I'd be better off rolling my own, and so on.
Programmers can do that. Regular users can't. Of course, we programmers are often so busy we wind up using the same off-the-shelf (there's an anachronistic term, eh?) software everyone else uses, but we know we have the freedom to bust out our development environment at the drop of a hat and code up whatever we'd like, however we'd like it.
Apple's App store approach
I think that's why Apple's App store approach has always given me such a set of the willies, first for the iPhone, then the iPad, and now for the Mac.
I don't mind that Apple takes its 30% for software sold through the store. Actually, that's a great deal. Back in the olden days, when I wrote boxed commercial software and sold it through brick-and-mortar stores like Egghead, the distribution channel wound up taking closer to 60-70%. Apple's share is a lot fairer to developers.
I also appreciate that Apple is essentially providing a warehouse and fulfillment function in electronic form. I think that's why developers have so taken to the App store concept. Rather than having to set up my own online store with download capability (easy enough, but nice that I didn't need to for iPhone apps), Apple does that part of the hard work.
Developers also don't have to make sure the cart is working, that the credit card gateway is working, and all the moving parts are in good working order.
In fact, I wrote 40 quite silly iPhone apps in September of 2008 and I haven't had to touch them in two years. I just let the (rather small) deposits accumulate in my bank account and occasionally use them to make a car payment. They've been completely maintenance free.
Fundamentally, it's the gatekeeper factor that I don't like.
That was then
When it came to the iPhone, the gatekeeping at least made some sense. Even so, Apple runs a completely unpredictable application approval process: you submit your app for approval and, if you're lucky, sometime in the next 4-8 weeks, it'll be approved for sale.
But there's no predicting Apple. We all know the stories of Apple's capricious denial of apps for all sorts of reasons, including no reason at all.
In fact, that's one reason I never developed any iPhone applications bigger than my silly little apps. I didn't want to put six months or a year of coding into something (in a programming environment and language that didn't work anywhere else) only to have Apple decide that, oh, my email app duplicates its own minimal email functionality, or my launcher app touches other applications in the system.
I just didn't want to lose a year of work to the random whims of Apple's developer-unfriendly policies.
But that was on a phone. It made some sense for the handset maker to have some restrictions because the device had to work on AT&T's network. So it made some sense for Apple to restrict, say, a podcast player application because Apple didn't want to stream video early on over AT&T's network.
When the iPad came out, I was deeply curious. Would Apple relax some of its restrictions and simply allow anything to run on the iPad?
In fact, Apple did back off some restrictions, including their insistence that programmers only use Xcode for development. Of course, there's no telling whether they'll reverse that almost-reasonable behavior out of the blue, because, you know, it's Apple.
Apple still doesn't allow anything to run on the iPad without restriction. iPad apps have to go through the same random, who-the-heck-knows app review process as iPhone apps, and unless you want to jailbreak your iPad, you're stuck with just the apps that Apple allows.
Now, we all know there are a lot of apps, so why complain, right? The reason is freedom. As long as Apple restricts what apps can be run, the device isn't free. It's not a computer, it's an appliance.
Some of you might argue that the iPad isn't meant to be a computer and it is, in fact, an appliance, and you're good with that. I'm not thrilled with that view, but I can accept it.
This is now
But what about the Mac? Windows fans have long maintained (incorrectly, I might add) that the Mac is a toy. It's not. It's a full-fledged, quite powerful computer. A computer. Not an appliance.
What happens, though, when Apple introduces its App store concept to the Mac? Initially, it'll just be a distribution option. You'll still be able to install non-Apple-approved software on the Mac.
But what about the iteration after that? Will Apple eventually lock down the Mac, so the only software allowed to run is Apple approved? Then what?
Then there's the question of how Apple's approval process will work when it has to test real, big, or special-purpose applications. Angry Birds is one thing. But a process control application? Or, oh, I don't know, Firefox?
Apple doesn't allow plug-ins in the App store. That shoots down essential applications like Firefox and even Photoshop.
Then, what happens to all those Adobe applications, now that Apple seems to hate Adobe? What happens to all those great email applications and Finder-tweaker applications? What happens to any application that doesn't fit Apple's political, moral, and ethical worldview?
What happens is, at that point, the Mac becomes just an appliance.
Even then, I'm not too concerned. The Mac still has only about a 10% market share, which puts it squarely into the not-particularly-relevant-to-the-real-world category. Heck, that's about half of what Ross Perot got in the 1992 general election, and no one besides old politics junkies like me even remembers Ross Perot (or 1992).
So, essentially, in the real world, the Mac isn't particularly relevant.
Most computer users have already voted, and they've resoundingly voted the Mac down. It's a marginal, fringe product, at best.
The problem is that while customers don't really care about the Mac -- I know, those of you commenting below are going to go off the edge about that statement, but numerically, it's true -- Apple does tend to be a trend setter in the computer industry.
That means that other companies are likely to set up their own App stores. Other companies like Microsoft.
To be fair, I honestly can't see Microsoft going out and filtering every application they're willing to allow to run on Windows. Microsoft has never, really, cared all that much how people use their wares, as long as they sell.
But what if Microsoft does decide to care? What if, suddenly, for Windows 8, say, Microsoft decides it, too, is going to filter the software you're allowed to run?
At that point, most computers will cease to be computers, we'll lose an absolutely essential set of freedoms, and the world will be that much worse off for it.
If anything, this could be the Mac's legacy.
Not that it's a machine with an undersold OS that has a user interface still stuck in the 1980s. Lower-right corner-only window resizing, they're singing your song.
Not that it's a machine built and marketed to insane fanbois who buy into it because of their sad, desperate belief that if they love their Macs enough, they'll be cool enough to be loved by real people.
Not that it's a machine priced not on the quality of the components inside (which are the same commodity components inside Windows machines), but at double what it's worth because of the silly Apple logo and the lemmings to whom that logo means everything.
No. Instead it's that OS X with a Mac App store could inspire other vendors to shut down software freedom and finally, drunk on DRM, make the nightmare of Orwell's 1984 into a reality.
Of course, there will always be Linux.