Loosely based on PICS -- the controversial content rating system that flopped in 1998 -- P3P uses a similar idea to put information collection practices in a strict language that can be read by browsers.
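The idea the article describes -- a site publishing its data practices in a form a browser can check against a user's preferences -- can be sketched roughly as follows. This is an illustrative simplification in Python, not the actual P3P vocabulary or syntax; the policy fields and preference names are invented for the example.

```python
# Illustrative sketch only -- not real P3P syntax. A site publishes a
# machine-readable summary of its data practices; the user's software
# "agent" compares it against the user's stated preferences.

SITE_POLICY = {
    "collects": {"clickstream", "email"},   # data categories the site gathers
    "purpose": {"admin", "marketing"},      # what the data is used for
    "shares_with_third_parties": True,
}

USER_PREFERENCES = {
    "blocked_purposes": {"marketing"},      # user refuses marketing use
    "allow_third_party_sharing": False,
}

def policy_acceptable(policy, prefs):
    """Return True if the site's stated practices satisfy the user's preferences."""
    if policy["purpose"] & prefs["blocked_purposes"]:
        return False
    if policy["shares_with_third_parties"] and not prefs["allow_third_party_sharing"]:
        return False
    return True

print(policy_acceptable(SITE_POLICY, USER_PREFERENCES))  # False: marketing use is blocked
```

In a real user agent, the "acceptable or not" decision would drive notice to the user rather than silently blocking a site.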
Several privacy advocates, including the Center for Democracy and Technology, participated in the brainstorming that laid the foundations of what would become P3P in 1995 and 1996. Yet it wasn't until the W3C got involved that PICS and privacy came together.
"PICS was a way to label Web content that never really got off the ground," said Lorrie Craner, senior technical staff member at AT&T Labs-Research and the chair of the P3P Specification Working Group at the W3C. "Initially a lot of the applications people envisioned labeling sites (by) attaching meta-data. Then it occurred to us that it could be information about a Web site's privacy practices."
Cranor and others worked on the specification in 1997 and produced the first reports on P3P that October. The original vision included a way of negotiating the terms of any information sharing between a Web site's policy and a consumer's software "agent." In essence, the W3C hoped to build choice into the technology as well.
Yet the problems with implementing such a powerful feature put the negotiation part of the technology on hold. "It would have made Web sites less likely to adopt it and make it harder technically and legally for them to use P3P," said Cranor.
Scuttling the complex negotiation functions also helped the W3C working group get the technology out the door quickly. Last week, 10 companies showed off their implementations of user agents and policy generators for the technology and made certain each worked with the others. That "bake-off" went surprisingly well, said participants.
Still, P3P has a number of other hurdles to leap before it becomes a standard.
For one, many privacy advocates -- including the Electronic Privacy Information Center (EPIC), Junkbusters and Computer Professionals for Social Responsibility -- have lambasted the technology as a false start for consumers.
Privacy advocates worry that the technology will give consumers notice of a company's policy, but little choice in how it's used, said Karen Coyle, a spokeswoman for Computer Professionals for Social Responsibility and a librarian by trade, during a conference call. "There are some assumptions built in that are not well-founded," she said. "One of them is that consumers will have a choice. Consumer data is the coin of the realm, and that means there won't be a lot of sites that offer great privacy."
The report concluded that the technology may actually act as camouflage for companies to avoid regulation and continue to collect information.
"No one thinks this is a silver bullet," said Ron Perry, co-founder of privacy utility maker IDcide, "but it should help consumers understand what is going on at each Web site."
In addition, the privacy settings promised by Web sites are legally binding, said Perry. Furthermore, the technology finally gives consumers an easy way to discriminate between Web sites. Essentially, good privacy protections would have some marketing value.
In fact, at the bake-off, developers talked about enhancing search engines with the ability to rank sites according to how consumer-friendly their privacy protections are. Want a good bookseller that won't sell your information? Search for one, and the sites with the best privacy practices will appear at the top.
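A privacy-ranked search of that kind might look something like the sketch below. The scoring heuristic and the example store names are invented for illustration; nothing like this is defined in the P3P specification itself.

```python
# Hypothetical privacy-aware ranking. The scoring rule here is an
# invented heuristic, not part of P3P; it simply rewards sites whose
# published policies avoid third-party sharing and marketing use.

def privacy_score(policy):
    """Higher score = more consumer-friendly stated practices."""
    score = 0
    if not policy["shares_with_third_parties"]:
        score += 2
    if "marketing" not in policy["purpose"]:
        score += 1
    return score

# Two hypothetical booksellers and their published practice summaries.
booksellers = [
    ("books-a.example", {"purpose": {"admin", "marketing"},
                         "shares_with_third_parties": True}),
    ("books-b.example", {"purpose": {"admin"},
                         "shares_with_third_parties": False}),
]

# Sort results so the most privacy-friendly stores come first.
ranked = sorted(booksellers, key=lambda item: privacy_score(item[1]), reverse=True)
print([name for name, _ in ranked])  # ['books-b.example', 'books-a.example']
```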
"P3P gives people information, where today they don't have a lot of information," said Martin Presler-Marshall, co-author of the P3P specification and so-called P3P champion for IBM's AlphaWorks Division.
"Any time consumers have more information, they have more power, and that's a good thing."