In hindsight, Apple's announcement yesterday that it was pulling out of the Macworld Expo should have come as no surprise. Ever since the 2007 Macworld where Jobs announced the iPhone, Apple's emphasis has clearly been shifting toward consumer electronics and away from personal computers. To get the iPhone out the door, they pulled developers off the OS X team and delayed Leopard by six months. They even took the word "Computer" out of the company's name. At WWDC this year the process accelerated -- it was all iPhone, all the time. And now, after 2009, Apple won't even bother showing up at the Macworld conference. What does this mean for Mac developers?
The answer is: not a whole lot, actually. Thanks to the iPhone, the Mac OS X and NeXTSTEP legacy lives on. If you followed my series on the iPhone Development Bootcamp, you'll know that Apple carried pretty much its entire software stack over to the iPhone (and iPod Touch, of course). So if you're a Mac developer, your skills will not go to waste. Even if Macs go away, there are plenty of iPhones out there to keep you busy. The form factor is different, the pricing is different, the marketing is different, but love it or hate it, the Mac's unique programming environment lives on in the iPhone.
So what happens to the Mac itself as a platform? Let's face it -- personal computers are a commodity and, increasingly, an anachronism. Apple may continue making Mac hardware for years, as long as enough of the faithful pony up the money to keep it profitable. If making Mac boxes becomes unprofitable, Apple will stop. In those waning days Apple may even release Mac OS X as open source in an effort to keep it going. But whether or when that happens isn't really important, as long as the application software built on top of the platform lives on.
Even if you're "a PC", you have to admit that some of the software created exclusively or originally for the Mac is amazingly creative and innovative. Photoshop, OmniGraffle, Keynote, GarageBand... the list goes on and on. I believe one reason is that Mac development teams are much more artist-heavy than other teams. How do we keep this winning combination going?
I think what we need is a Mac run-time layer -- let's call it "Cider" -- that lets those great Mac programs work well on other operating systems. For inspiration, look no further than the Wine project, which lets Windows programs run on Linux, BSD, and Mac OS X. Recently one of the keepers of the Ubuntu software repositories said that Wine was working well enough to include with Ubuntu.
I'm not talking about virtualization, which would require you to buy a separate copy of Mac OS X. Cider would be a translation layer, or program loader, for Mac OS X programs. Mac programs running in Cider would behave like native programs, without the performance or memory penalties of an emulator.
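To make the translation-layer idea concrete, here is a minimal sketch in C of what a single Wine-style shim looks like. Everything in it is hypothetical: `cider_log` is an invented stand-in for a Cocoa-style logging routine, not any real Apple API. The point is that the guest program's code runs natively on the host CPU, and only its library calls get redirected into re-implementations built on the host OS -- no instruction emulation involved.

```c
#include <stdarg.h>
#include <stdio.h>

/* Illustrative sketch, not Apple's API: a Wine-style translation layer
 * re-implements the guest platform's library calls on top of the host
 * OS, so guest code runs natively and only its calls are redirected.
 * "cider_log" is a hypothetical shim standing in for a Cocoa-style
 * logging routine; a real "Cider" would need shims like this for
 * thousands of Cocoa and Core Foundation entry points. */
int cider_log(char *out, size_t cap, const char *fmt, ...)
{
    va_list args;
    va_start(args, fmt);
    /* Translate the guest call into the host's native facility --
     * here, plain C string formatting into a caller-supplied buffer. */
    int n = vsnprintf(out, cap, fmt, args);
    va_end(args);
    return n; /* number of characters the formatted message needs */
}
```

A real implementation would also need a loader that maps the Mac program's binary format and resolves its framework imports against shims like this one; the hard part, as with Wine, is sheer coverage, not any single function.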
Wine has been in progress for 15 years, but according to its web site it "is not yet suitable for general use". Why has it taken so long? One reason is that it was developed without the cooperation of Microsoft (and sometimes under the threat of lawsuits). Windows programs do lots of weird things and rely on API behaviors that are not documented. Apple could shave a lot of time off the development of "Cider" by encouraging and helping it along. Still, it might take years to build given the Mac's huge API surface. Better get started now.