
When will Apple get serious about security?

The tech community (and beyond) is in an uproar over the recently revealed iOS and OS X SSL/TLS code flaw. Apple developers have questions about the flaw itself and about Apple's commitment to quality.
Written by David Morgenstern, Contributor

Apple on Friday pushed out an iOS fix for the SSL/TLS bug, and the Mac community's concerns shifted to the still-missing patch for OS X. An Apple spokesperson said the fix was due very soon. That "soon" didn't arrive over the weekend. Maybe Monday.

When will Apple get serious about security?

Several Apple developer bloggers offered comments about the issue. Michael Tsai pointed out that this is a Mavericks problem, and, for developers, a problem in the various seeds of upcoming OS versions as well.

You can test whether your device is affected at gotofail.com or imperialviolet.org:1266. At this writing, Mac OS X 10.9, including current seeds, is still vulnerable. iOS 5 and Mac OS X 10.8 never had the bug. It’s fixed in iOS 6.1.6 and iOS 7.0.6.

The problem line is an extra goto fail; that appears to be a simple copy/paste error. My guess is that the author may have used a multiple copy command and forgotten about the extra goto. Tsai points to Randall Munroe's very funny comic about goto on his webcomic XKCD ("a webcomic of romance, sarcasm, math, and language"). Tsai blames the testing process more than the programmer.
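For readers who haven't seen it, here is a condensed sketch of the pattern, with simplified stand-in names rather than Apple's verbatim Secure Transport source. Each check jumps to a common cleanup label on error, but the duplicated goto runs unconditionally, so every check after it is skipped while err still holds 0 (success):

    /* Condensed sketch of the "goto fail" pattern (stand-in names,
       not Apple's verbatim source). */

    static int update_hash(int step)  { (void)step; return 0; }  /* stand-in: always succeeds */
    static int check_signature(void)  { return -1; }             /* stand-in: would report a forgery */

    int verify_key_exchange(void)
    {
        int err;

        if ((err = update_hash(1)) != 0)
            goto fail;
        if ((err = update_hash(2)) != 0)
            goto fail;
            goto fail;                        /* the extra line: always taken */
        if ((err = check_signature()) != 0)   /* never reached */
            goto fail;

    fail:
        /* buffer cleanup would go here */
        return err;   /* returns 0: the forged signature is accepted */
    }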

The offending line of code is a single extra goto in SSLVerifySignedServerKeyExchange(). In my view, this is not an improper use of goto. The code follows a standard C error handling style. I’m also unpersuaded by the argument that the bug should be blamed on brace format preferences.

Any of us could have written a bug like this, especially when merging changes from different sources. But a flaw in process is what let the bug ship. If ever there were code that should be unit tested, it’s Secure Transport. Landon Fuller shows that it would have been easy to write a test to detect this regression.
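To make that concrete, here is a minimal regression-test sketch against the hypothetical verify_key_exchange() stand-in above; Fuller's actual test drives Apple's real Secure Transport code, not this toy. The idea is the same: hand the verifier a handshake whose signature check should fail and assert that it is rejected.

    /* Minimal regression-test sketch (hypothetical harness).
       Compile together with the verify_key_exchange() sketch above. */
    #include <assert.h>
    #include <stdio.h>

    int verify_key_exchange(void);   /* the stand-in from the sketch above */

    int main(void)
    {
        int err = verify_key_exchange();

        /* With the duplicated goto in place this assertion fires, because
           the forged signature comes back as err == 0 (success). */
        assert(err != 0 && "invalid ServerKeyExchange signature must be rejected");

        printf("regression test passed\n");
        return 0;
    }

With the extra goto present, the test fails immediately; with it removed, the bad signature is rejected and the test passes.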

However, Lloyd Chambers at The Mac Performance Guide said it's continuing evidence of "core rot." He's had a special report up on the subject for quite a while. Chambers says that Apple appears to have plenty of engineers for "eye candy," as well as for screwing up usability, but not for security and testing.

In an age where millions of always-on devices are at risk, you don’t screw up fundamentally critical things like this. It’s one reason I abhor gatekeeper type services like the Apple App Store: one screwup and the entire system is at risk worldwide for tens or hundreds of millions of devices. I wrote about this months ago, and while some readers pooh-poohed my remarks as alarmist, I repeat that warning even more emphatically now.

To Chambers, an advocate of open-sourcing, the answer is to open the code up for wider inspection. The more eyes the better. But that's not the Apple way. He's also concerned about Apple's lack of quick communication to its user base. For example, what about non-Apple browsers running on OS X?

It’s not clear at all if use of Google Chrome or Mozilla Firefox avoids the security issue, but Apple kicks Mac users in the teeth by not IMMEDIATELY making that point clear (so users can avoid Safari). Apple should be on paid television telling users exactly how to safeguard their internet use, how to play it safe. It’s unconscionable. The core rot extends to ethics apparently.

Good points. However, I suggest that Apple's top brass and corporate culture haven't caught up to the demands of its new role as a market leader. A number of years ago, I noted that Apple's software engineering team was stretched thin by the overlapping release cycles of Mac OS and iOS: engineers spent their energy on one "side" (iOS) while bugs went unfixed on the Mac side. I was told that engineers with considerable experience on critical APIs were redirected to other projects, while their previous work was left fallow or in the hands of inexperienced replacements.

Apple's closed system keeps most OS X and iOS users safe. And there's still a modicum of safety in the relative neglect of malware writers: most phishing and malware attacks are aimed at Windows users. Still, Apple's strategy depends on its ability to always execute on its OSes and applications. If it doesn't, then we all sink together.
