Social Data Sharing Standard Nears Finish Line

Facebook, Google have already backed the OAuth 2.0 authorization scheme. What's holding up this "valet key" for social data?

David F Carr, Editor, information Government/Healthcare

August 31, 2011

7 Min Read
Slideshow: Nimble Contact Integrates Connections Across Mail, Social Networks

The social media security protocol OAuth 2.0 soon ought to be fully baked--even if the editor of the specification is left grumbling that there have been too many cooks in the kitchen.

OAuth 2.0 is already one of the most important social software development standards, thanks to its incorporation into the Facebook authentication scheme used with the Open Graph API. As implemented by Facebook, it provides those popup dialog boxes asking you to grant an application the right to access your personal data and activity stream. Thereafter, OAuth provides the mechanism an application uses to prove to Facebook that you have granted it the right to access those resources.

Yahoo, Google, and Web 2.0 pioneers like 37signals have also implemented some version of OAuth without waiting for the specification to be final. Yet when the OpenSocial 2.0 specification was published last week, it referenced OAuth as an "incubating" standard because it was still in the process of being finalized through an Internet Engineering Task Force (IETF) working group.

What distinguishes OAuth from more primitive mechanisms for making Web applications talk to each other is that you don't have to fork over your password to allow one application to log into another on your behalf, nor do you have to allow the kind of open-ended access associated with letting the application log in as if it were you. Instead, you authenticate yourself to a trusted application and grant whatever permissions you are willing to share. Rather than your password, you give the other application a token--an encoded credential the guest application presents to prove you have given it permission--each time it wants to access your profile or private data.
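The flow described above can be sketched in a few lines. This is a minimal illustration of the authorization-code pattern OAuth 2.0 uses; the endpoint URLs, client credentials, and scope name are hypothetical placeholders, not any provider's real API.

```python
# Sketch of the OAuth 2.0 authorization-code flow: the user approves
# access at the provider, and the app exchanges a short-lived code for
# a token instead of ever seeing the user's password.
from urllib.parse import urlencode

AUTH_ENDPOINT = "https://provider.example/oauth/authorize"  # assumed endpoint
TOKEN_ENDPOINT = "https://provider.example/oauth/token"     # assumed endpoint

def build_authorization_url(client_id, redirect_uri, scope):
    """Step 1: send the user to the provider to approve access.
    The user authenticates with the provider, never with our app."""
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": scope,
    }
    return AUTH_ENDPOINT + "?" + urlencode(params)

def build_token_request(client_id, client_secret, code, redirect_uri):
    """Step 2: after the provider redirects back with a one-time code,
    exchange it for the access token the app will use from then on."""
    return {
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": redirect_uri,
        "client_id": client_id,
        "client_secret": client_secret,
    }

url = build_authorization_url("mugshop-app", "https://mugshop.example/cb", "photos.read")
```

The password never leaves the user's session with the provider; only the token crosses over to the third-party application.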

"It's sometimes compared to a valet key for the Web," said Layer 7 Technologies CTO Scott Morrison, whose firm sells cryptographic middleware that includes OAuth support. Just as you may want to let the valet park your car without having access to what's in the trunk, providing an OAuth token to a photo-sharing site lets you grant access to view your photos but not to delete them--for example, to the website of a company that will print your photos on the side of a coffee mug. The coffee mug website says to the photo website, "I have permission to access these resources, and here is the token to prove it," Morrison said.
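The valet-key idea comes down to scoped tokens: the resource server checks the scopes attached to a token before honoring each request. A toy sketch, with made-up token values and scope names:

```python
# Toy resource server illustrating scoped access: this token grants
# photo *viewing* only, so a delete request is refused--the trunk stays
# locked. Token values and scope names are illustrative.
ISSUED_TOKENS = {
    "tok-abc123": {"user": "alice", "scopes": {"photos.read"}},
}

def check_access(token, required_scope):
    """Allow the request only if the presented token carries the scope."""
    grant = ISSUED_TOKENS.get(token)
    return grant is not None and required_scope in grant["scopes"]

can_view = check_access("tok-abc123", "photos.read")      # viewing allowed
can_delete = check_access("tok-abc123", "photos.delete")  # deletion refused
```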

Later, a user can cut off access by invalidating the token without needing to change passwords.
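Revocation is equally simple on the provider's side: invalidating one token cuts off one application while every other grant, and the password itself, stays intact. A sketch with an illustrative in-memory token store:

```python
# Sketch of server-side revocation: dropping a single token entry cuts
# off one third-party app without the user changing any password.
# The store and token values are illustrative.
active_tokens = {"tok-xyz789": {"user": "alice", "app": "mugshop-app"}}

def is_valid(token):
    """The resource server checks each presented token against the store."""
    return token in active_tokens

def revoke(token):
    """Removing the entry is all it takes; other apps' tokens are untouched."""
    active_tokens.pop(token, None)

before = is_valid("tok-xyz789")
revoke("tok-xyz789")
after = is_valid("tok-xyz789")
```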

OAuth 2.0 is significant because it simplifies the cryptographic requirements of setting up these transactions, because it addresses more enterprise requirements, and because (assuming it clears a few more hurdles) it will be an official IETF standard. The original OAuth specification was more of an ad hoc creation of a community of Web developers with a specific problem to solve, Morrison said. "Standardization of it kind of came second, and it got a lot of credibility that way because it was created to solve a specific problem."

However, some Web developers stumbled over implementing the cryptographic request signatures required by OAuth 1.0 and 1.0a. OAuth 2.0 eliminates that burden, giving developers the option of protecting tokens in transit with the same sort of SSL encryption that protects credit card transactions, without the need to sign or encrypt the token itself.
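In practice that simplification means a token can ride in a plain HTTP header, with SSL doing all the protecting. A sketch, with a made-up token value:

```python
# A bearer-style OAuth 2.0 request: the token goes in an ordinary
# Authorization header and the SSL/TLS channel keeps it confidential
# in transit; the client computes no signature. Token value is made up.
def bearer_request_headers(access_token):
    """Build the headers for a resource request carrying the token."""
    return {"Authorization": "Bearer " + access_token}

headers = bearer_request_headers("tok-2YotnFZFEjr1zCsicMWpAA")
```

Compare this with OAuth 1.0, where the client had to canonicalize the request and compute an HMAC signature over it for every call.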

Currently at draft 20 within the IETF working group, OAuth 2.0 is mostly down to arguments over individual words like "must" versus "should" in its requirements and recommendations, according to Eran Hammer-Lahav, editor of the specification. "If it was up to me, it would be done by now," he said. The big Web players like Facebook have long since lost interest in the standardization debate and moved forward with usable implementations based on an earlier draft, he said. Meanwhile, the debate has been hijacked by enterprise vendors trying to make sure the specification matches all the requirements of their software, he said.

Already at the "last call" stage for changes, OAuth 2.0 might go to one more draft before being voted out of the working group, Hammer-Lahav said. Barring a veto by higher-ups in the IETF, it ought to be recognized as an Internet standard by the end of the year, he said. "Best case, it could be considered final in two to three months. Really, the only problem that could hold it up now would be security issues--if somebody identifies a significant security flaw that can only be remedied by a protocol change. That's possible, but very unlikely. This has been reviewed for about three years now."

That is not to say every OAuth 2.0 implementation will be secure. OAuth 2.0 will provide developers with the building blocks for a secure system, but it won't guarantee that they will put them together the right way, Hammer-Lahav said. "The main issue the working group is a little bit stuck on right now is, just how far do we need to go to identify, describe, and possibly close every Web security issue this could possibly touch. The problem is, OAuth operates on the Web using the browser to do its work, and that means it's subject to every single attack vector that's possible on a Web browser."

Someday soon, he anticipates there will be "sloppy reporting" in the news implying that there are fundamental flaws in OAuth when really the problem is a poor implementation, he said.

The working group participants also hit an "unbridgeable disagreement," he said, over the right way to create and protect OAuth tokens, and so two alternate methods are specified in companion documents that will have to be approved separately from the core OAuth 2.0 specification. The method that is most widely implemented so far is called a bearer token, which places all responsibility for encryption on the SSL connection between browser and server. The other is based on a Message Authentication Code (MAC) technique.

Hammer-Lahav was on the MAC token side of this argument, calling it "a super-simplified version of the OAuth 1.0 token, where it takes you like two lines of JavaScript to implement it," but then you get some encryption of the token itself, making it harder for an attacker to steal. Bearer tokens, in his view, have all the same problems as session cookies. But website operators find that argument unconvincing because session cookies, for all their shortcomings, are so pervasive in Web applications that taking a similar approach with OAuth tokens strikes them as no big deal, he said.

Maybe so, Hammer-Lahav said, "but I wouldn't write a new protocol that keeps encouraging that sort of irresponsible behavior." On the other hand, if you want to do things his way, there is nothing stopping you, he said. In a recent presentation at the OSCON open source conference, he discussed the use of OAuth 2.0 and MAC tokens on Sled.com, an experimental collaboration website he has been working on for his employer, Yahoo. (Hammer-Lahav stressed he is not authorized to speak on behalf of Yahoo and his comments on OAuth should not be interpreted that way.)
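The MAC approach he favors can be illustrated in a few lines: the client signs each request with a shared key, so a captured token value alone cannot be replayed against a different request. This is a simplified sketch, not the exact normalization rules of the IETF MAC-token draft; the key, nonce, and host values are invented.

```python
# Simplified illustration of the MAC-token idea: the client proves
# possession of a shared secret by computing an HMAC over the request
# details. Changing any detail (here, the HTTP method) changes the
# signature, so a stolen signature doesn't transfer to other requests.
import hmac, hashlib, base64

def sign_request(mac_key, timestamp, nonce, method, uri, host):
    """Sign a normalized request string with HMAC-SHA1 (sketch only)."""
    base = "\n".join([str(timestamp), nonce, method, uri, host]) + "\n"
    digest = hmac.new(mac_key.encode(), base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

sig = sign_request("shared-secret", 1314748800, "xyz123", "GET", "/photos", "photos.example")
tampered = sign_request("shared-secret", 1314748800, "xyz123", "DELETE", "/photos", "photos.example")
```

The trade-off the working group argued over is visible here: this is more work for the client than a bearer token, in exchange for a token that is useless to an eavesdropper by itself.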

While Hammer-Lahav grumbles about enterprise requirements adding to the "bloat" of OAuth 2.0, working group member Dick Hardt said it addresses issues he faced while working at Microsoft on cloud computing requirements, like how to handle API calls into Windows Azure. Hardt has since left Microsoft and said he is working on a new venture. OAuth 2.0 is based on a proposal he submitted to the IETF, originally known as OAuth WRAP.

"In a cloud environment, one of the things you'd like to do is have a resource up in the cloud get a token that is self-contained, and you can tell just from that whether or not it should allow access to the resource," Hardt said. "That's why the guys at Salesforce implemented it--it solves some of the issues you need to solve with an enterprise use case for the cloud."
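A self-contained token of the kind Hardt describes carries its own claims and a signature, so the resource server can verify it with a shared key and no callback to the issuer. The encoding below is invented purely for illustration; real deployments would use a standardized token format, and the signing key is a placeholder.

```python
# Sketch of a self-contained token: the claims travel inside the token,
# signed with a key shared between issuer and resource server, so access
# can be decided "just from that" with no lookup. Format is illustrative.
import hmac, hashlib, json, base64

SIGNING_KEY = b"issuer-shared-key"  # assumed shared out of band

def issue(claims):
    """Issuer side: encode the claims and append an HMAC signature."""
    body = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
    sig = hmac.new(SIGNING_KEY, body.encode(), hashlib.sha256).hexdigest()
    return body + "." + sig

def verify(token):
    """Resource side: recompute the HMAC; return the claims, or None
    if the token was tampered with."""
    body, sig = token.rsplit(".", 1)
    expected = hmac.new(SIGNING_KEY, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None
    return json.loads(base64.urlsafe_b64decode(body))

tok = issue({"sub": "alice", "scope": "storage.read"})
claims = verify(tok)
```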

Hardt believes OAuth 2.0 does a good job of addressing the security issues of this sort of resource sharing, even though he agrees it can be subverted by a sloppy implementation. "All the obvious attacks have been thought through, and the non-obvious ones are the ones we're working on now," he said.



About the Author

David F Carr

Editor, information Government/Healthcare

David F. Carr oversees information's coverage of government and healthcare IT. He previously led coverage of social business and education technologies and continues to contribute in those areas. He is the editor of Social Collaboration for Dummies (Wiley, Oct. 2013) and was the social business track chair for UBM's E2 conference in 2012 and 2013. He is a frequent speaker and panel moderator at industry events. David is a former Technology Editor of Baseline Magazine and Internet World magazine and has freelanced for publications including CIO Magazine, CIO Insight, and Defense Systems. He has also worked as a web consultant and is the author of several WordPress plugins, including Facebook Tab Manager and RSVPMaker. David works from a home office in Coral Springs, Florida. Contact him at [email protected] and follow him at @davidfcarr.
