Why Apple's iPhone is like a 1981 IBM PC

The iPhone is inching towards the enterprise. So why has Apple limited it so badly?
Written by Rupert Goodwins, Contributor
Commentary--The iPhone: what is it good for?

When the iPhone was launched, it was a consumer device with very limited potential for third-party developers. Then there was either a change of heart or the delicate evolution of a secret master plan, and Apple unveiled a proper software-development environment--and started to talk about proper enterprise capabilities. Proper in all but one sense. While application designers are free to do almost anything they like, they can't create background tasks: software either runs on the screen or it doesn't run at all.

That's a limitation nobody's had to contend with for a couple of decades, and one that's particularly keenly felt in a phone. After all, smartphones, almost by definition, have to do lots of things at once. They play music and pick up calls. They browse the web and they run IM. They have calendars and they sync your email.

But not with an iPhone, or at least not unless you're using Apple's own software.

There are good arguments for this limitation, say Apple's defenders: you don't want hundreds of independent apps firing off network and phone requests willy-nilly--the battery would be dead in no time; other system resources, such as memory and CPU, are also limited--giving the user the chance to load too many items at once is a recipe for a terrible experience; there's no way for a user to safely interact with lots of programs going off in their own time, given the limitations of the iPhone's user interface; or perhaps it's that, with every new combination of resident programs, it becomes harder to test for unwanted interactions and other potential causes of unreliability; the iPhone is more like an iPod than a computer--it is an information appliance. Users and developers have to be educated to accept this.

These are good arguments. They're also wrong.

Let's deal with unwanted interactions between applications. Long-term survivors of IT will remember the nightmare days of MS-DOS, with battling terminate-and-stay-resident programs and conflicting expanded and extended memory managers, all layered on top of a medieval operating system with the manners and style of a hung-over syphilitic warthog. The result was a fabulously unstable computing environment that took a lot of time and expertise to keep alive. Nobody wants that for the iPhone. But this isn't 1981--it's 2008. Modern processors have memory-management hardware. Modern operating systems, especially those which, like OS X, sit at the end of decades of continual development, are robust. We know how to shield applications from each other.

Then there's the resource crunch. It's true that the iPhone has a mere 128MB of RAM and likes to keep its processor slow; it isn't a top-of-the-range desktop monster. But it isn't an information appliance either; that's what simple phones are for. Again, at this point in the history of software development, we know how to put a lot of functionality into a small space.

Users have reasonable expectations and know that, if you pile in too many programs at once, then things won't work. That's something which can be underlined by disciplined reporting of program requirements and a modicum of sensible management. Moreover, with thin-client design methodologies, you can cram a very great deal of use into very tiny code stubs. You write code that works within the restrictions.
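To make the thin-client point concrete, here is a minimal sketch of what such a code stub might look like. Everything here--the function names, the wire format, the "calendar.today" action--is invented for illustration; the point is only that the device-side code does nothing but format a request and render the reply, leaving the heavy logic on a server.

```python
import json

def build_request(action, params):
    """Everything the handset sends: a tiny, fixed-shape JSON message.
    The real work (search, sync, filtering) happens server-side."""
    return json.dumps({"action": action, "params": params})

def render_reply(raw):
    """Everything the handset does with the answer: unpack it and
    produce lines ready to draw on screen."""
    reply = json.loads(raw)
    return "\n".join(f"{item['title']}: {item['value']}"
                     for item in reply["items"])
```

A stub like this fits in a few kilobytes, holds almost no state, and so consumes very little of the scarce memory and CPU the defenders worry about.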

Perhaps the best arguments against Apple allowing background tasks are that they take up too much airtime, draining the battery, and that there's no way for them to communicate to the user when they need attention. If either of these two things were a given for background tasks, then Apple would have a point. But they're not, and it doesn't.

If the design of the iPhone precludes proper always-on connectivity--which wouldn't be the first time the company has gone for form over function--then have a decent scheduler, which understands the metrics of wireless access and makes intelligent decisions about when to allow what to connect. This does put the onus on application designers to understand the limitations and capabilities of such a channel and to create software accordingly, but then that is their job. Likewise, if there is a limited user interface, then create a common alert mechanism which mediates requests and interactions. There are good ways to do this; it takes cleverness, a feel for usability and a good understanding of design principles. Last time I looked, Apple had some form here.
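The scheduler idea above can be sketched in a few lines. This is a hypothetical illustration, not Apple's design: background requests are queued with a stated tolerance for delay, and the scheduler releases them in batches so the radio powers up once per window instead of once per app--which is precisely how such a scheme would save airtime and battery.

```python
import heapq

class NetworkScheduler:
    """Coalesces deferrable background requests into periodic batch
    windows, so many apps share one radio wake-up. Illustrative only."""

    def __init__(self, window_seconds=300):
        self.window = window_seconds   # how often the radio may wake
        self.queue = []                # heap of (deadline, seq, request)
        self._seq = 0                  # tie-breaker for equal deadlines

    def submit(self, request, max_delay):
        """An app asks to sync and states how long it can wait."""
        deadline = request["now"] + max_delay
        heapq.heappush(self.queue, (deadline, self._seq, request))
        self._seq += 1

    def due(self, now):
        """Return every request whose deadline falls within the next
        window; sending them together amortises the radio's power cost."""
        batch = []
        while self.queue and self.queue[0][0] <= now + self.window:
            batch.append(heapq.heappop(self.queue)[2])
        return batch
```

A real implementation would also weigh signal strength and whether the radio is already up for a foreground task, but the shape of the decision--defer, batch, release--is the same.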

To some extent, all these arguments are otiose. Background tasks clearly run well on the iPhone; Apple's own software uses them, as do the products of some of its closest friends. OS X is a modern operating system with all the capabilities needed, even in a restricted, real-time environment. Even the most rabid "fanboys" won't argue that background processing will never come. Instead, they say, we must trust Apple and let it deliver what it likes when it likes.

I don't know why Apple hasn't let anyone else have the keys to that particular kingdom. Perhaps it really can't make the technology work properly. Perhaps it wants to limit the amount of work it has to do to approve applications for distribution--after all, if you can't run any background tasks, you never have to worry about unforeseen interactions--and that 30 percent of retail price just won't pay for enough testing. Perhaps it doesn't trust application designers or users very much. Perhaps it wants the best software for itself, where it can limit what it can do in order not to upset its telco friends.

Whatever the reason, it reflects badly on Apple. It's either not as clever as it makes out, greedier than it likes to admit, more hemmed in by its design decisions than it wishes to make apparent or just determined to force its vision on the world regardless of what the world wants. Think different?

But it leaves the company vulnerable to the competition and to a loss of luster. The phone is not an iPod; it's a smartphone connecting to a universe of fast-changing data on behalf of innovation-hungry users. The sooner it stops pretending to be a 1981 IBM PC, the better it will be for everyone.
