Here's Technology Review on a fascinating proposal. The US Patent Office is working with several Silicon Valley companies to seek out groundbreaking new ways to deal with the problem of backlog and prior art.
The United States Patent and Trademark Office (USPTO) is in a pretty tight spot. The entire office is buckling under the weight of more than 600,000 backlogged applications.... Today, patent examiners must peck and hunt through an almost infinite amount of information to determine prior art for software patents. ...."There's a lot within the open source community that's valid prior art, but because of how it's stored, it's not accessible to examiners," says John Doll, commissioner of patents with the USPTO. "We have a hope that if we have a standardized system, we can find it in the future."
The proposal calls for three systems: a centralized, searchable repository of open source code and project documentation; an indexing system; and a third that would:
tap into the greater community's intelligence when reviewing patent applications (something organizations such as Wikipedia have done for years, albeit toward a different goal).
Metatagging may come into play with the patent office's repository and search tool, although opening up the repository to even a limited number of people may prove troubling for some. In addition to creating a centralized repository for all open-source code and related materials (diagrams, documentation), the project group is considering creating a taxonomy so that open-source developers can "label" their code to help patent examiners and other interested parties understand what it is. "The public could use it as well," says Mark Webbink, deputy general counsel for Red Hat. Then, a partner such as Google or IBM could create a search tool that would combine all the data and allow examiners to hunt the repositories for prior examples -- as simply as someone might search for an online recipe.
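To make the labeling idea concrete, here is a minimal sketch of how a taxonomy "label" might travel with an open source project and feed a simple search index. Everything here is invented for illustration: the field names, the taxonomy terms, and the index structure are assumptions, not anything the project group has published.

```python
# Hypothetical sketch: a project manifest carrying taxonomy labels,
# plus a toy index of the kind a search partner could query.
# All field names and category values are invented for illustration.

# A manifest tagging the code with taxonomy categories.
manifest = {
    "project": "example-compression-lib",
    "version": "1.2",
    "taxonomy": ["data-compression", "dictionary-coding"],
    "documentation": ["docs/design.md", "docs/api.md"],
}

# Map each taxonomy term to the projects labeled with it.
index = {}
for term in manifest["taxonomy"]:
    index.setdefault(term, []).append(manifest["project"])

def search(term):
    """Return the projects labeled with a given taxonomy term."""
    return index.get(term, [])

print(search("data-compression"))  # ['example-compression-lib']
```

In practice the "index" would span millions of files rather than one manifest, but the shape is the same: developer-supplied labels on one side, an examiner-facing query on the other.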
"[They] need a tool that will enable sifting through the code in such a way that's useful to the patent examiner," says Manny Schecter, an associate general counsel for IBM. "We should be able to have it done this year."
The social networking aspect came from NYU law prof Beth Noveck, who blogged about the need to tap into collective knowledge when it comes to the patent search process. What was needed, she wrote, was something like a wiki, where people could contribute their expertise on various matters. "We're at a critical moment," said Noveck. "We have the social software available with collaborative filtering, social reputation systems, so that we can do online peer review. There is so much dissatisfaction with the patent process; this is a ripe opportunity to move to peer review."
Not long after her posting in July, Noveck was contacted by IBM about the idea. The company had been considering something similar. Since then, Noveck has written a draft proposal for the plan and, in good form, has launched a wiki for people to contribute thoughts on the proposal. The system could work as follows: vetted experts in various fields sign up for RSS feeds and receive alerts whenever a new patent application that falls within their expertise is posted online (such applications are available to the public). The experts could then advise the appropriate examiner on whether prior art exists, assisting in the patent process. "What might take an examiner 15 to 20 hours to research and determine might take an expert 15 minutes," says Noveck.
There are legal questions, though, and the USPTO is looking to outside experts to help vet the idea:
Meanwhile, Noveck is launching a nationwide tour in the early spring to colleges and intellectual property think-tanks to help vet the idea; and the companies involved in the repository and indexing projects are already at work. Interested parties can attend a public meeting on the various proposals at the patent office on February 16.