Mac hacker Charlie Miller discovered a security hole in the way Apple digitally signs apps for the App Store, and used this information to create a 'legitimate' app of his own that passed all of Apple's checks, but which could download and run unsigned, unauthorized code on users' iOS devices.
The app was interesting in that Miller could choose what payload was sent to it. He could make it open a YouTube video, make the handset vibrate, or even gain direct access to the file system and grab files such as the address book database.
Now, is Apple doing the right thing by banning Miller's developer account and removing the app? Yes, it is. The app, while not containing any malicious code itself, still deliberately leverages a serious security loophole and can download malicious payloads to the handset. This sort of behavior violates Apple's developer terms and conditions, and as such is more than enough reason for Apple to give Miller the shove.
Note: The app had been in the App Store since September.
So, what worries me isn't that Apple kicked Miller and his app off the developer program, it's that Apple didn't spot what this app was doing in the first place. Miller had to talk about it before Apple realized what was going on. That's what I find very worrying.
Note: Given his reputation, the fact that Charlie Miller had submitted an app should have set alarm bells ringing at Cupertino!
So, what happens when a developer (even if that developer is a well-known hacker) submits an app that leverages a vulnerability to Apple for approval? Apple approves it, none the wiser. Apple only yanked Miller's app from the App Store because he talked about it. Bad guys don't do that sort of thing, so apps exploiting this hole could go unnoticed for a very long time.
I thought Apple's iOS ecosystem was supposed to be a walled garden. Seems to me like it has a low fence at best, one that's quite easy to step over, and once you're over, there's little chance that Apple will find out what you've done.