Users can't opt out of the program, called "Facebook Beacon," altogether. Instead, they have to opt out on a case-by-case basis when they use one of the outside sites.
Chris Kelly, chief privacy officer of Facebook, said Facebook is transparent in communicating to users what it is tracking. When a user visits an outside site and completes an action like buying a movie ticket, a box shows up in the corner of the user's browser saying that the outside Web site is sending that information to Facebook. The user can opt out by clicking on text that reads "No, thanks." If the user doesn't, then the next time they visit Facebook, they will see a message from Facebook asking for permission to show the information to their friends. If the user declines, the information won't be shared.
The problem seems to be that, as with everything else on Facebook, users are expected to shoulder the burden of sorting through all the various bits of information and deciding what to do with each.
At lunch yesterday, I was discussing the problem of ad-supported Web sites. There's surely an upper bound on how much revenue advertising can bring in, but what's worse is that advertising distorts everything around it. Web sites start paying more attention to "eyeballs" instead of readers or participants, and they're suddenly annoying rather than friendly and helpful.
Facebook realizes that simply relying on the targeted ads of the past won't garner much attention and that they have a tremendous asset in the social graph within their system. Facebook Beacon is an attempt to capitalize on that by using the social graph to make advertising more useful for the customer and more profitable for Facebook.
Unfortunately, they got it wrong. Instead of advertising, they should have focused on recommendations. No one is going to say "please show me more ads based on what my friends like." But plenty of people will ask a friend to recommend digital cameras or books to them.
This may seem like two sides of the same coin, but there are subtle and important differences. First, there's the asking. Asking someone for recommendations puts both parties in a position of giving permission. That changes the feel of the transaction.
Second, and more importantly, most of the people who are my friends on Facebook are probably complete bozos when it comes to buying cameras or LCD TVs. I'm not dissing them; it's just a fact of life that we trust certain people for certain kinds of things. I may trust one friend's judgment on clothes and another's on music.
Facebook has missed out on a tremendous opportunity to use recommendation permissioning to annotate their social graph with trust information--that's an order of magnitude more valuable than the graph itself. I hope they don't figure it out--then I can do it.
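To make the idea concrete, here's a minimal sketch of what trust annotation might look like: every time a user asks a particular friend for recommendations on a particular topic, that request becomes a trust signal attached to the edge between them. The class and method names here are my own illustration, not anything Facebook has actually built.

```python
from collections import defaultdict

class TrustGraph:
    """Hypothetical sketch: a social graph whose edges carry
    per-topic trust weights, accumulated from recommendation requests."""

    def __init__(self):
        # edges[user][friend][topic] -> trust score
        self.edges = defaultdict(
            lambda: defaultdict(lambda: defaultdict(float)))

    def record_request(self, user, friend, topic):
        # Asking a friend for a recommendation on a topic is a
        # signal of trust; each request bumps that edge's weight.
        self.edges[user][friend][topic] += 1.0

    def trusted_friends(self, user, topic):
        # Rank this user's friends by accumulated trust on the topic.
        scored = [(friend, topics[topic])
                  for friend, topics in self.edges[user].items()
                  if topics[topic] > 0]
        return [friend for friend, _ in
                sorted(scored, key=lambda pair: -pair[1])]

g = TrustGraph()
g.record_request("rafe", "alice", "cameras")
g.record_request("rafe", "alice", "cameras")
g.record_request("rafe", "bob", "music")
print(g.trusted_friends("rafe", "cameras"))  # -> ['alice']
print(g.trusted_friends("rafe", "music"))    # -> ['bob']
```

The point of the per-topic breakdown is exactly the camera-bozo problem above: a flat friend list tells you nothing about who to ask, while topic-scoped trust weights tell you whose judgment matters for which purchase.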