Like laws and sausage, perhaps one should never watch or hear about how protocols are made.
It’s often a messy business given the intelligence, dedication, egos, passion and allegiances in the kitchen.
Last week, the original editor of the OAuth 2.0 specification and the author of OAuth 1.0, Eran Hammer, pulled the curtain back on a process that has gone on for the past few years – and it wasn’t pretty.
Today, he clarified his remarks and threw a few more people under the bus.
OAuth 2.0, which many hope will help secure APIs and native mobile applications, should be finalized this week by the Internet Engineering Task Force (IETF).
But, according to Hammer, it’s a failure out of the gate.
He says OAuth 2.0 was poisoned with complexity and security flaws through an IETF process that had too many cooks with too many corporate interests. The title of Hammer’s blog summed it up, “OAuth 2.0 and the Road to Hell.”
With today’s blog, “On Leaving OAuth,” he answered critics and skewered developers and others who agreed with his road-to-hell assessment but failed to speak up while OAuth 2.0 was being built.
“Now, that’s just sad and pathetic,” he wrote.
But some were having none of Hammer’s outburst.
Among those who disagreed was Dick Hardt, one of the original authors of OAuth 2.0 and the new editor of the IETF spec, who shot back at Hammer’s assessment and his withdrawal from the work.
A Google representative said the protocol is working fine in production.
OAuth 2.0 is already used by the likes of Facebook, Google, Salesforce.com and others. A handful of vendors already support it in their infrastructure software and gateways. And a number of other identity and privacy specs are using OAuth as a foundation.
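In practice, most of these deployments rely on OAuth 2.0’s bearer tokens, the mechanism that was split out into its own spec during the standards fight. The sketch below shows the basic pattern a client follows once it holds an access token; the token value is a placeholder, not a real credential.

```python
# Minimal sketch of how an OAuth 2.0 client presents a bearer token
# when calling a protected API: the token goes in the Authorization
# header, prefixed with the scheme name "Bearer".

def bearer_header(access_token: str) -> dict:
    """Build the Authorization header for an OAuth 2.0 bearer token."""
    return {"Authorization": f"Bearer {access_token}"}

# The token string here is a made-up placeholder for illustration.
headers = bearer_header("example-access-token")
print(headers["Authorization"])  # Bearer example-access-token
```

The simplicity of that one header is both the draw and the controversy: with no signature, the token’s security rests entirely on TLS and careful token handling.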
Hammer distanced himself from the completed standard, detailed a pattern of shortcomings, picked at its security or lack thereof, lamented its crossover focus from consumer to enterprise, blamed the outcome on the IETF and its members, and then apologized for his failure and his perceived failure of the group.
“The standards making process is broken beyond repair,” Hammer wrote. “The web does not need yet another security framework. It needs simple, well-defined, and narrowly suited protocols that will lead to improved security and increased interoperability.”
Although he praised the intelligence and capabilities of those who participated in creating OAuth, he said, “most of them show up to serve their corporate overlords, and it’s practically impossible for the rest of us to compete.”
On Monday, his follow-up comments nipped at OAuth derivatives, explained his demeanor, ripped the IETF again and lambasted those who did not help him.
Hammer’s blogs drew a number of comments, most agreeing with his assessment of OAuth’s enterprise focus or sharing his frustration with the IETF.
Hardt, however, fired back.
Writing in the comments section of Hammer’s original blog, Hardt said:
“Wow. You insisted on total editorial control, restructured the document several times, dragged your heels on submitting changes, ripped out the bearer token to a separate spec because you don’t like that mechanism, started the MAC spec for a signed token, but then resigned from that spec. Now you resign and blame enterprise use cases for a spec you [sic]. Herding the cats to end up with a simple specification is hard, and it is the job of the editor.”
Hammer acknowledged Hardt’s frustration, but he criticized Hardt’s conduct within the group: “you bailed out early, as soon as your contract ended. I have spend [sic] the last year working on this on my own dime,” Hammer wrote. “It’s also telling how you are the only person who turned this into a personal attack.”
Despite the back and forth between the two, OAuth 2.0 is one IETF procedural step from completion.
With the spec finalized, more end-users and developers will get their hands on OAuth 2.0, which should expand the real-world testing environment. The sink-or-swim test should be more telling than any IETF in-fighting.
Hanging in the balance are a number of OAuth derivatives. OAuth 2.0 is the foundation for OpenID Connect, which adds, among other features, an authentication mechanism, and for User-Managed Access (UMA), which gives end-users control over their personal data.
OpenID Connect is being created by the OpenID Foundation and UMA was incubated by the Kantara Initiative and is now at the IETF.
“I hope Eran and others are keen to look at what UMA is doing,” said Eve Maler, chair of the UMA working group. “I hope folks don’t think UMA messes with OAuth. We are using OAuth in a vanilla fashion and being inspired by it in our other flows.”
Maler, who is also an analyst for Forrester Research, said, “I have been advocating OAuth for a year now, well before it was on the radar of enterprise IT, and I stand by that.” She characterized Hammer’s blog as a deliberate attempt to stop OAuth 2.0. “It is not complexity, it is value. OAuth is driving value by providing an opportunity to build applications on top of it.”
Count on more blogs and comments to follow.
Is your enterprise or mobile service using OAuth 2.0? What is your assessment of OAuth? Is it too complex? Is it working for you?
(Disclosure: My employer is supporting OAuth 2.0 in its products.)