Bug hunters face online-apps dilemma

Security holes in online applications may go unreported because well-intended hackers don't want to risk going to jail.
Written by Joris Evers, Contributor

Web applications pose a dilemma for bug hunters: how do you test their security without going to jail? Hackers who probe traditional software such as Windows or Word can do so on their own PCs. That isn't true for Web applications, which run on servers operated by others. Testing security there is likely illegal and could lead to prosecution.

"There are more legal dangers to testing an application that is hosted on somebody else's system. That is a real challenge of this new application model," said Wendy Seltzer, an assistant professor specializing in Internet law at New York's Brooklyn Law School.

As a consequence of the legal threat, well-intended "white-hat" hackers often credited with finding bugs in traditional software are hesitant to audit Web applications. This means that online applications don't face the same scrutiny as traditional software, and serious security holes could be left for unscrupulous criminal hackers to find.

"We're losing the Good Samaritan aspect of security," said Jeremiah Grossman, chief technology officer at Web security company WhiteHat Security. "If it's illegal to find vulnerabilities in Web sites, it means only bad guys know where the vulnerabilities are. This is one of the big issues in information security as we shift to a Web 2.0 world."

Caleb Sima, chief technology officer at rival Web security firm SPI Dynamics, agreed that the legal threats effectively make Web applications less secure. "If a vulnerability existed, it would be the black hat hacker that would find it because they don't care. That causes Web apps to be less secure," he said.

The onset of what's become known as Web 2.0 is causing a splash, as it stretches the boundaries of what Web sites can do. But as sites become rich with new features, offering an experience akin to desktop applications, the security risks also increase, experts have said.

Bug hunting has long been a legal gray area for people who probe desktop software: they may be breaking the law when they take apart, or reverse-engineer, software sitting on a PC. But the law is clear-cut when it comes to Web sites, said Jonathan Zittrain, professor of Internet governance and regulation at the Oxford Internet Institute at Oxford University.

"The venerable Computer Fraud and Abuse Act in the U.S., and corresponding laws in other countries, criminalizes unauthorized access to a machine, including 'exceeding authorized access.' The point of a hack to expose a security vulnerability (in a Web application) is usually to do just that," Zittrain said.

Prosecutors could use several laws to go after security researchers who break into an online application, but the Computer Fraud and Abuse Act is the primary one. It provides for a fine or up to a year in prison for somebody who "intentionally accesses a protected computer without authorization, and as a result of such conduct, causes damage."

"It is a problem for people who do have the public interest in mind and who are trying to expose flaws that are putting people's privacy or information at risk," Seltzer said.

A case in point: Eric McCarty, a security professional, was sentenced in January to six months of house arrest and three years of probation and was ordered to pay $36,761.26 in restitution to the University of Southern California. McCarty pleaded guilty to hacking USC's online application system, but argued he was acting to get the system secured.

In the U.K., Daniel Cuthbert was ordered to pay about $1,750 for breaking into a Web site collecting donations for victims of the 2004 Asian tsunami. Cuthbert said he decided to check the security of the site because he feared he had fallen for a phishing scam.

But not all Web site owners will report security researchers to law enforcement.

"White-hat hackers are generally doing us a service," said Christopher Blum, security director at NetSuite, a San Mateo, Calif.-based provider of online business applications. He offered a caveat, however: those hackers are providing a service only if the vulnerability is reported privately, the company's operations aren't disrupted, and customer data isn't exposed.

There have been two instances in which a researcher reported a vulnerability to NetSuite, Blum said. In both cases, the problems were fixed and the individual wasn't prosecuted. "Responsible disclosure is a good practice and leads to better quality software," he said.

Others, including Google and Yahoo, also support "responsible disclosure" of vulnerabilities. Under this approach, advocated by software and Web companies alike, researchers who uncover a flaw will not publicly disclose the problem. Instead, they contact the maker of the affected product and share details, so that the company can fix it.

The best way for a Web bug hunter to hack without fear is to ask a target company for permission, legal experts said. NetSuite's Blum said his company would likely grant such a request, though with some strings attached. Many other companies, however, may not be inclined to allow somebody to poke around, said Sima of SPI Dynamics.

"Security through obscurity is helpful," he said. "I am not just going to open things up and give hackers the ability to go through my application."

Web companies could set up a copy of their applications for ethical hackers to probe. That way the main system wouldn't be disrupted and real customer data would not be at risk. While Sima's objections also apply to this approach, some security researchers do like it.

"This is a great idea," said Billy Hoffman, a lead researcher at SPI Dynamics. "A properly isolated mirror doesn't expose them to a larger security risk. The costs to the company are reasonably small and potential gains are huge. The smart ones would even offer a bounty on bugs that were found and properly disclosed."
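The isolated-mirror idea rests on the mirror holding no real customer data. As a minimal sketch of that one piece (the field names and the hashing scheme here are illustrative assumptions, not anything NetSuite or SPI Dynamics described), sensitive fields could be tokenized before records are copied into the test environment:

```python
import hashlib

# Hypothetical set of fields that must never reach the test mirror in real form.
SENSITIVE_FIELDS = {"email", "name", "card_number"}

def mask_record(record: dict) -> dict:
    """Replace sensitive values with stable, non-reversible tokens so the
    mirror behaves realistically without exposing any real customer."""
    masked = {}
    for key, value in record.items():
        if key in SENSITIVE_FIELDS:
            # SHA-256 keeps the token stable across runs, so joins and
            # deduplication in the mirror still work, but the original
            # value cannot be recovered from it.
            digest = hashlib.sha256(str(value).encode()).hexdigest()[:12]
            masked[key] = f"masked-{digest}"
        else:
            masked[key] = value
    return masked

customer = {"id": 42, "email": "jane@example.com", "plan": "pro"}
print(mask_record(customer))  # id and plan survive; email becomes a token
```

Because the tokens are deterministic, two records sharing an email address still match in the mirror, which keeps the copy realistic enough for security testing while removing the privacy risk Blum described.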

Web companies, like traditional software companies, do hire outside firms to audit their security. NetSuite, for example, uses Fortify Software's tools to scan its code for bugs and pays Ambiron Trustwave for a monthly scan of its Web site, Blum said. There are always more bugs to be found, however, experts noted.

While one set of laws works against security researchers, other laws are helping security, Seltzer noted. Data breach notification laws in particular are forcing organizations to tighten their security, she said.

"Companies start to realize that part of what they are selling to users of their applications is trust," Seltzer said. "Things like data breach notification laws will shine more light on security problems and make security along with privacy an element of what users are considering when they are evaluating sites to do business with."

Still, because security researchers can't freely probe Web applications, consumers are at risk, Seltzer said.

"If the only people who can investigate security are the gangs of foreign teenagers stealing credit cards and not those in the U.S. who would like to help the credit card holders shield themselves against these thefts, there is an imbalance," she said.
