Google's anti-malware team comes out of the shadows

Summary: Google's anti-malware team has emerged from the shadows with a new blog, a widely discussed research paper and enough clues about its ambitious drive to put a roadblock between dirty Web sites and end users.

TOPICS: Google, Browser, CXO

Google's anti-malware team has emerged from the shadows with a new blog, a widely discussed research paper (.pdf) and a few clues about its ambitious drive to put a roadblock between dirty Web sites and end users.

Over the last year, Google has quietly invested in several efforts to flag malicious sites that appear in its search results. Last month, at the HotBots '07 conference in Boston, these efforts came to light when staff engineer Niels Provos released the "Ghost in the Browser" paper with hard numbers on the extent of the malware-on-the-Web problem.

In the analysis, Provos and the Google anti-malware team investigated about 12 million suspicious URLs and found that about 1 million of those sites were launching drive-by downloads.

In the paper, Niels dropped a major hint at what's to come from Google:

[We] have started an effort to identify all web pages on the Internet that could potentially be malicious.

The plan has raised eyebrows in some quarters but, as Google's Matt Cutts explained, the company has been working on different ways to warn users about potentially malicious sites. These include an interstitial warning, annotations to listings that a site may be harmful and badware notifications to help Webmasters.

Provos, via e-mail, declined to discuss future plans but there are enough clues to suggest Google is working on some sort of tool to identify hijacked Web servers -- and block drive-by exploits from infecting end-users.

This would put the company up against McAfee's SiteAdvisor, Trend Micro's TrendProtect and Exploit Prevention Lab's LinkScanner, three browser add-ons that slap graphical warning signs (red, yellow or green labels) next to search results.

Provos himself has created SpyBye, an open-source utility that helps Web masters determine if their web pages are hosting browser exploits.

SpyBye operates as a proxy server and gets to see all the Web fetches your browser makes. It applies very simple rules to each URL that is fetched as a result of loading a Web page. These rules classify a URL into one of three categories: harmless, unknown or dangerous. Although there is a great margin of error, the categories allow a webmaster to look at the URLs and determine whether they should be there. If a URL you would not expect is being fetched, that's a good indication the site has been compromised.
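The classification step described above can be sketched in a few lines of Python. This is only an illustration of the rule-based approach, not SpyBye's actual rules; the host list and URL patterns below are hypothetical placeholders a webmaster would tailor to their own site.

```python
import re

# Hosts the webmaster expects their pages to fetch from (hypothetical).
HARMLESS_HOSTS = {"www.example.com", "cdn.example.com"}

# URL shapes that suggest an injected exploit (illustrative patterns only).
DANGEROUS_PATTERNS = [
    re.compile(r"\.exe($|\?)"),         # direct executable download
    re.compile(r"/counter\.php\?id="),  # exploit-kit-style redirector URL
]

def classify(url: str) -> str:
    """Return 'harmless', 'dangerous', or 'unknown' for a fetched URL."""
    host = re.sub(r"^https?://", "", url).split("/")[0]
    if any(p.search(url) for p in DANGEROUS_PATTERNS):
        return "dangerous"
    if host in HARMLESS_HOSTS:
        return "harmless"
    # Everything else is flagged for the webmaster to review by hand.
    return "unknown"
```

The "unknown" bucket is the interesting one: as the description notes, an unexpected fetch during page load is the tell-tale sign of a compromised page, even when no rule marks it dangerous outright.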

Provos told me he created SpyBye on his own time (it's not one of those Google "twenty percent time" projects) to provide a tool for web masters to verify their sites on their own and find out what is wrong with them.

"I wanted to make a tool available that would help web masters discover if their web pages had been compromised to infect users with malware. Many web masters know how to set up and maintain a site, but don't really understand why or how they got compromised. I hope that SpyBye will allow them to get a better understanding of the problem and also allow them to verify if their web pages are still malicious or if the problem has been fixed, Provos said.

Google also has a serious click-fraud problem directly linked to botnets of hijacked PCs, so it figures that the aggressive anti-malware push will also target bots and Trojans.

It sounds very much like Google could emerge as a player in the anti-virus space. Can a big acquisition be far away?



Talkback

6 comments
  • Why index them at all?

    I just finished writing a <a href="http://whatis.blogs.techtarget.com/2007/05/20/web-based-malware-driveby-downloads-on-the-information-superhighway/">quiz about web-based malware</a> and when I was done, showed it to some colleagues. They asked me a question I couldn't answer -- why doesn't Google refuse to index "dangerous sites"?
    hrhsoleil
    • because

      What if a 'dangerous' site is legit and has been hijacked by malware authors (like the Superbowl stadium site)?

      _r
      Ryan Naraine
      • No longer legit

        If a "legit" site can be compromised or hijacked then it still is a "dangerous" website - in fact if the Superbowl Stadium website can be hijacked then that would make it more dangerous than a link to a website that no-one had heard of!
        coopermi
  • Desktop might be the answer?

    If click fraud is being conducted on the web, could Google work with clicks by integrating with the native environment of the PC? That way, it would register what you click through your browser rather than relying on IP addresses or other tracking features. If it could register a physical 'click' instead of a fetch (a bot would have to be pretty clever to hijack a mouse!), it could eliminate automated click fraud.

    This sounds far-fetched, but Google already has integration with toolbars and 'Gears. Could it be that in order to use Google searches in future, you have to install a tool? As distasteful as this sounds, it would make a lot of sense as a general movement. Hackers can always be cleverer because they focus on narrow fields of development and the payoff can be immense, according to the latest edition of New Scientist (http://www.newscientisttech.com/channel/tech/mg19426087.500-trade-in-software-bugs-plays-into-hackers-hands.html)

    If the reason malware is so prominent and difficult to track is because the very nature of surfing is insecure, maybe a more fundamental change in the way the WWW is interfaced would make it more secure for users?

    -Just a thought...
    http://www.zachbeauvais.com
    zbeauvais
  • What about the web site hosts?

    A program that checks web sites for dangerous links is a good idea.

    Programs that let web site owners and web site host companies check their portfolio of web sites for malware would also be nice. When malware was found on a site they could substitute that particular page with a clean backup.
    Thore
  • Interesting

    I would've thought that rather than include these malware-serving sites in search results at all, Google would instead permanently delist the involved websites (or alternatively provide some sort of mechanism for affected webmasters to resubmit their site once they remove the malware, where the mechanism would automatically check the website to see if the malware still existed and act on the results accordingly).

    - John Musbach
    John Musbach