
Another massive Net attack looming?

Governments and companies in the US and internationally are bracing themselves for the worst
Written by Bob Sullivan, Contributor

Six months after devastating attacks took down the Web's biggest sites, MSNBC has learned of new evidence that indicates it could easily happen again: a security researcher has found 125,000 networks with the same flaw that allowed the attacks.

In addition, MSNBC has learned that the White House, insurance companies and the security industry are considering quasi-government regulation to try to compel Internet firms to take basic security steps.

Top-level White House officials have been meeting with representatives of the insurance and security industries in an attempt to work out an agreement, according to Alan Paller, director of the SANS Institute, a non-profit computer security organisation. Details of the plan are still under discussion, but Paller said it would mirror the strategy the federal government has used with other developing industries, such as electricity.

"It will be a situation where insurance companies will say 'If you meet these minimum standards, we'll give you insurance... then government agencies will say 'You can't connect to our computers unless you have insurance'," Paller said.

More important, Paller said, would be getting major e-commerce Web sites to agree to the standards, and then having them force their partners to meet those minimums.

Representatives from the White House and other federal agencies met with computer security researchers and privacy advocates Thursday evening to hammer out details of the strategy, Paller said.

Tentatively, the group plans to publish its recommendations by the end of this summer, then allow a 90-day review period. The plan is to create a new non-profit or cooperative agency by December that would act as a central repository for security information and minimum-standards guidelines.

But while those discussions proceeded this week, other security professionals were disappointed -- but not surprised -- by new research showing that thousands of networks remain vulnerable to the same kind of "smurf" attacks used six months ago.

In a so-called "smurf" attack, a computer vandal sends a specially crafted data packet to a computer network's router. The packet tells the router to ask for a response from every computer on its network -- a "broadcast ping". The network thus becomes an unwitting amplifier for the attack, because a single packet sent to a router connected to a network of 300 computers can result in 300 responses. What's worse, the responses can be tricked into heading toward another computer on the Internet -- the attacker's real target. Attackers can then repeat the process in rapid fire, generating enough traffic to shut down major networks.
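For a rough sense of the arithmetic involved, the short Python sketch below estimates the traffic a single low-bandwidth attacker could generate through such an amplifier. The modem speed, packet size and amplification factor are illustrative assumptions, not figures reported from the attacks.

```python
# Back-of-the-envelope smurf arithmetic. The modem speed, packet size and
# amplification factor below are illustrative assumptions, not reported figures.

PING_BITS = 64 * 8        # one small ICMP echo request, in bits
UPLINK_BPS = 28_800       # assumed attacker uplink: a 28.8kbps dial-up modem
AMPLIFICATION = 300       # hosts on the network answering each broadcast ping

pings_per_sec = UPLINK_BPS / PING_BITS
flood_bps = pings_per_sec * AMPLIFICATION * PING_BITS  # = UPLINK_BPS * AMPLIFICATION

print(f"{pings_per_sec:.0f} pings per second leave the modem")
print(f"~{flood_bps / 1e6:.2f} Mbit/s of replies converge on the victim")
```

Under these assumptions, one dial-up modem becomes roughly 8.6 Mbit/s of flood traffic -- which is why a handful of misconfigured networks is enough to threaten even a major site.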

In February, attackers who were likely accessing the Internet from a single dial-up connection were able to trick university computers into sending millions of "pings" at major sites, shutting down sites such as Yahoo, Amazon.com and eBay.

Solutions to prevent networks from being used as amplifiers exist: broadcast ping capability can be turned off at the router, or the flood of "ping" replies can be filtered before it leaves the network.

Still, the security researcher who discovered the vulnerable networks was able to generate over 10,000 responses from one network with a single "ping". Five other networks offered amplification factors of over 1,000. And the researcher, who requested not to be identified, said he has found more than 125,000 networks that allow at least some amplification.
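The measurement technique is conceptually simple, and the Python sketch below illustrates the idea: send one ICMP echo request to a network's broadcast address and count the distinct hosts that reply. It assumes a Unix-like machine with root privileges for the raw socket, and the broadcast address shown is a documentation placeholder -- this is a reconstruction of the approach, not the researcher's actual tool.

```python
import socket
import struct
import time

def inet_checksum(data: bytes) -> int:
    # Standard Internet checksum (RFC 1071) over 16-bit words.
    if len(data) % 2:
        data += b"\x00"
    total = sum(struct.unpack("!%dH" % (len(data) // 2), data))
    total = (total >> 16) + (total & 0xFFFF)
    total += total >> 16
    return ~total & 0xFFFF

def amplification_factor(broadcast_addr: str, wait: float = 2.0) -> int:
    # Send one ICMP echo request ("ping") at the broadcast address; every
    # distinct host that answers adds one to the amplification factor.
    sock = socket.socket(socket.AF_INET, socket.SOCK_RAW,
                         socket.getprotobyname("icmp"))
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.settimeout(0.2)

    header = struct.pack("!BBHHH", 8, 0, 0, 0x1234, 1)  # type 8 = echo request
    packet = header[:2] + struct.pack("!H", inet_checksum(header)) + header[4:]
    sock.sendto(packet, (broadcast_addr, 0))

    responders = set()
    deadline = time.time() + wait
    while time.time() < deadline:
        try:
            _, (addr, _) = sock.recvfrom(1024)
            responders.add(addr)  # count each answering source address once
        except socket.timeout:
            continue
    sock.close()
    return len(responders)

if __name__ == "__main__":
    # 192.0.2.255 is a placeholder from the TEST-NET-1 documentation range.
    print(amplification_factor("192.0.2.255"), "hosts answered one ping")
```

A properly configured network returns zero or one reply to such a probe; the 10,000 replies the researcher observed from a single network show how far some configurations are from that baseline.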

"Everybody knew it, but nobody quantified it," said Chris Rouland, director of Internet Security System's research team. "Nothing has really changed since February."

The list of vulnerable networks was published earlier this week on the Web to embarrass network administrators and pressure them into fixing their networks -- a practice that has been adopted by many security professionals. Advocates of such "full disclosure" suggest that computer vandals might already have access to the information, so airing vulnerabilities in public at least gives the "good guys" a fair shot at fixing the problem before it's exploited.

But, Rouland said, the list is also a "great cookbook" that could be used by a computer attacker, who could "pop it right into a denial-of-service tool".

Part of the reason so many vulnerable networks remain is poor communication between network administrators and security experts, says security consultant Joel de la Garza of Securify.com.

"There isn't a good distribution site for contact information within various organisations, and even if there were, there very rarely is any attempt on the part of the person to notify the sites," he said. "There is also no formal notification process for third parties that discover problems with other peoples' networks."

Creation of some kind of central agency will be one element of the solution, Paller said. Still, while tools to reconfigure and protect networks are readily available, many network administrators are simply too busy to deploy them.

"There is good guidance but poor motivation," said Paller. "We're asking users who are not network wizards to change something that already works."

There is always a fear when making adjustments to network software that something will go wrong, and the network will crash.

"The idea is, 'If it works, don't touch it'," said Russ Cooper, who administers the NTBugtraq mailing list. Routers are so low-maintenance, Cooper said, that some administrators would probably say "I don't even remember what closet it's in."

February's attacks highlighted the interdependent nature of Internet security -- sloppy security at one network not only puts that network at risk, but endangers anyone else connected to the Internet.

"Your security depends on my security," said Kevin Houle, Incident Response Team leader the federally-funded Cert Coordination Centre at Carnegie Mellon University. "If an intruder can compromise my network, my network and my systems can be used to attack yours. It's an important social issue that needs increased awareness."

But awareness efforts are clearly not solving the problem: several warnings issued in December and January by government and private organisations, including Cert, did little to stop the February attacks. Publication of this week's list is also unlikely to prevent another attack.

That's why Paller says high-ranking White House officials are very interested in some kind of institutional solution for the problem.

"There's a half-life to these events of perhaps 45 days," he said. "Melissa hits and everybody updates their antivirus software, but 45 days later everybody's virus detection is out of date again. I don't want to blame anybody. I don't think we can pick on people who haven't [stopped broadcast pings]."

He favours creation of a central, non-profit, non-governmental organisation to act as standards bearer for the Internet community.

"There are two problems, really. First, when people say they want to do more... there's no consensus on what needs to be done," Paller said. "And second, how much is enough?.. Everybody's telling people different things."

He thinks the yet-to-be-named group could act as the central repository for security information. Acceptance of minimum standards would stop common flawed practices; at the same time, it would help stop security consultants from using scare tactics to overcharge security-conscious companies.

"Within the group, there wasn't a sense that it's going to be easy, but there is a sense that it needs to be done," Paller said.
