
Google: The search is never over

There is still much to improve in search, notes Google CIO Douglas Merrill, who says the company's role is not to censor information but to be "a picture of the Internet".
Written by Eileen Yu, Senior Contributing Editor
Deaf as a child and coping with dyslexia, Douglas Merrill felt an inherent desire to find tools and use technology to help him--and others like him--achieve success in spite of the odds.

That is the tenet Merrill still swears by today, earning his keep as Google's chief information officer and vice president of engineering. It is probably also why he holds a PhD in Psychology as well as a degree in sociology and politics.

In an exclusive interview with ZDNet Asia, Merrill sheepishly admits he was wrong to hack into a bulletin board years ago simply because it had published white supremacy material.

He explains why this recognition of a past wrongdoing reflects Google's stance on censorship today, noting that the search giant cannot assume the role of a self-righteous gatekeeper.

Merrill also relishes the notion that Google is comparable to its nemesis Microsoft, but dismisses the suggestion that Google will venture beyond its search domain and become a security vendor--at least, not at the moment. He also defends the company's privacy stance, despite the fact that it was identified by Privacy International as having the worst privacy practices.

Founded in 1998, Google has an all-encompassing aim: to "make the world's information universally accessible and useful", or what Merrill describes as "the democratization of information", where everyone has the ability to tell their story.

And the company would have everyone believe that search is the "oxygen" without which no one would be able to find the information they need, because there is simply too much data in the world today.

How does a guy who majored in social studies end up in a tech company?
Merrill: The problems I think are interesting are the places where technology, people and organizations overlap. We all use tools to get work done. The organizations in which we sit constrain what we do, and I find that area very interesting. What I quote a lot, so much so that it's sometimes described as Merrill's law, is: There are no lasting technical solutions to social problems.

A lot of interesting problems in the world today are not technical per se; they are a combination of the technical and of how users use technology in the world in which they live. Those problems are fascinating, and that's how a guy with a PhD in Psychology and an undergraduate degree in Computer Science, Economics and Sociology ends up here--because it all kind of plays on the same set of rules.

So you grew up with an interest in tech, while recognizing it's not just about technology but also about the human psyche?
I was deaf as a child and had a learning disability called dyslexia. Those two things got me interested in how tools can help people be more successful. It's quite challenging to not be able to hear well, and dyslexia creates challenges in reading, some of which I still struggle to overcome today. Those things got me interested in the combination of technology and the human psyche. How can I get tools to help me?

My whole life has been organized around trying to find a problem I cannot be 100 percent sure I can solve, and going to try it anyway. Failure is okay. Learning from it is important. In fact, Google's core ethos is that we are a living experiment. At any given moment, there are between five and 200 experiments related to search going on. We do the same thing with our products: we release things in beta, get feedback and iterate very rapidly to try to learn what our users want. We're scientists and we experiment, so it's okay to fail.

Do you think Google is becoming such a big and successful company that you're becoming like Microsoft--a company that people love to hate and want to pick on?
What a tremendous compliment you just paid us. You just said we can be legitimately compared to Microsoft, a company with 80,000 employees that has been a tech company for 30 years. And more than that, you said we're such a strong brand that people spend time thinking about it, finding things they see as inconsistencies and talking back to us. That's great!

I want our users to talk to us. I want them, when they think we're doing the wrong thing, to ask us about it. Sometimes we'll respond, sometimes we won't. Sometimes we'll do different things, and so on. We'll try to do what's right for the users in the broadest possible case.

But there will be organizations like Privacy International, for example, that rank you lowest in terms of privacy practices. A little unfair, would you say, since you are a search company and may need information to provide better results?
First, let's talk about user value. You can do a lot of different things for users without knowing much about them. For example, you can look out the window, down to the street below, and notice traffic jams. We can write a useful traffic report based on what we see through the window, but we won't know anything about the drivers--where they're going, their dreams, fears and hopes. We just know there's slow traffic on this street and that people might want to avoid it. So there are a lot of things we can add value to without knowing anything about the searchers themselves.

Then there are products where we need to know something about you, but in exchange, we give you value. For example, a personalized homepage where you have to log in for us to give you what you want. And, if you want, you can sign out and we will forget who you are. And if we start building products that you don't like, you can take your data and go; we won't hold your data hostage. That's the critical point.

If you're going to give me some data, I need to protect it as much as I can, with the highest possible levels of security. And I need to recognize that it's yours and give it back to you when you want it. I can't lock you in, because that would be wrong.

So, I actually think we have a good privacy stance. We're very public about when we take data from you. We're very public about giving you ways to opt out, and actually in most cases, you have to opt in. And we're pretty clear about how you can take your data and leave.

Obviously, we can always learn more in this area and engage in the conversation, because user data is the cornerstone of our business. Our users' trust is our business. So we continue to engage in conversation, but overall, I'm actually pretty proud of our privacy and security stance.

Can you discuss some of the security practices Google observes?
First, every piece of code that's built on the site is reviewed by a second engineer. So I write the code and you review it, and we train reviewers to look for common security problems. Additionally, we have automated tools that check for a bunch of security vulnerabilities, and so on. So we do a bunch of things in the process to prevent vulnerabilities.

There is also a lot of code you can't write. You're not allowed to write code around authentication, encryption or credit card handling. You must use our shared libraries, which are designed very carefully. We put a huge amount of security around where we have customer data, because these Web applications are new ways of writing software, and we're going to find a lot of bugs in them.
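Merrill does not name Google's internal libraries, but the principle he describes is a familiar one: sensitive operations go through a small set of carefully audited libraries rather than ad hoc code. A minimal sketch of that idea in Python, using the open-source cryptography package purely as a stand-in for such a shared library:

```python
# Generic illustration of the shared-library principle (these are NOT
# Google's internal libraries): call a vetted package such as `cryptography`
# instead of writing encryption code yourself.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in production, load this from a secrets store
cipher = Fernet(key)

token = cipher.encrypt(b"card ending 4242")   # authenticated encryption
assert cipher.decrypt(token) == b"card ending 4242"  # raises InvalidToken if tampered with
```

Concentrating encryption behind one audited interface means a single fix covers every caller, which is the payoff Merrill is pointing at.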

And this leads to the second way we do security. We have a really rich interaction with security researchers, to the point where we will credit them on our Web site. So we have lots of people who have found interesting bugs in our code; they reach out to us, we fix the bugs, and they get to publicize them. They might be from security forums, engineering shops, and so on. We're a huge supporter of responsible disclosure: if you find a problem, tell us and we'll fix it, and I don't mind you getting publicity for it.

And then there is the work we do with partners and the industry. For example, we are an active participant in and sponsor of StopBadware.org, a cross-industry effort to attack malware sites and Trojans. We also recently released our Safe Browsing API (application programming interface), with which companies building Web applications can ask us to check whether URLs are safe.
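The Safe Browsing API has been revised several times since this interview; as a rough illustration only, here is how a Web application might query today's v4 Lookup endpoint (the API key and client name below are placeholders, not anything from the interview):

```python
# Hedged sketch: ask the Safe Browsing v4 Lookup API whether a URL is flagged.
import requests  # third-party HTTP client

API_KEY = "YOUR_API_KEY"  # hypothetical placeholder; issued via Google Cloud
ENDPOINT = f"https://safebrowsing.googleapis.com/v4/threatMatches:find?key={API_KEY}"

payload = {
    "client": {"clientId": "example-app", "clientVersion": "1.0"},
    "threatInfo": {
        "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING"],
        "platformTypes": ["ANY_PLATFORM"],
        "threatEntryTypes": ["URL"],
        "threatEntries": [{"url": "http://example.com/"}],
    },
}

resp = requests.post(ENDPOINT, json=payload, timeout=10)
resp.raise_for_status()
matches = resp.json().get("matches", [])  # an empty response means no known threats
print("unsafe" if matches else "no known threats")
```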

Does this mean Google could become a security vendor?
I have no idea what will happen in the future, but we have no plans in that regard today. We're busy making sure our products work and that security stays up there. We're excited about continuing to work with our partners to make Internet security better as a whole.

So Google, the search company, remains a description you're happy with?
We are a search company; that's what we do. I think search is a critical component, has been for a long time and will continue to be. Google CEO Eric Schmidt has said that search is a 300-year problem, and I agree with that. There's still so much room to grow: so many more languages to do a better job on, so much machine translation to do, and so much more work on ranking to make sure that you always get the results you meant to ask for at the top. There's so much room to improve still. We're very good, but we see all this opportunity to improve.

I read about how you once disabled an online bulletin board that was spouting white supremacy ideologies. Obviously you see a role technology can play in solving some of the world's biggest social problems today. Is that a maxim you live your life by? And how does that carry forward into your role at Google?
Some interesting parallelism you drew here! I was wrong to do what I did.

Wrong? How many years ago was that?
Ages ago! It was pre-Internet; these were bulletin boards. And I didn't really disable them; what I did was publish their user list. I was wrong. What I should have done was publish information about why some of the stuff they said was wrong. I should have published a contrasting vision. And that's kind of what we do today.

Google is not the Internet; we are simply a picture of the Internet, and sometimes you can find unpleasant things on the Internet. And maybe that's okay, particularly if, in the same search, you can find some unpleasant stuff, some controversial stuff and some stuff with a different perspective. Because the problem is, you don't know what truth is, particularly since history is usually written by the winners.

Sometimes, there are things that are morally repugnant to me that might be perfectly fine to that other person. It's not my job at Google to censor that because I find it morally repugnant. It's my job to try and reveal the entire panoply of different perspectives and let my users decide what they believe, and what they want to understand and contrast and compare.

Google's job is to algorithmically find all the world's information and present this so that our users can learn and understand the world they live in.

But Google is a global company, with operations in countries that work in different ways from one another. Can that be difficult to balance?
Google has to follow the law. In the United States, for instance, there are laws protecting against child pornography. If we find such a site on the Internet, we have to take it out of our index. We can't take the content down because it's not ours; we just make it such that you can't find it through us anymore. So there are hundreds of examples like these where we have to remove things from the index. And, honestly, there are countries in which there's more removal than in others.

What we try to do is publish what we're doing, specifically in cases where we removed a large number of results. So, if we return a set of results from which we had to remove some to adhere to local laws, or for whatever reason, we'll flag it and say we had to remove some results here.

It's not because we want to or don't want to; it's because that's the way the world works. We believe it's more important to provide 99.9 percent of the world's results than zero. So if our choice is 99.9 versus zero, our choice is 99.9, but we're going to tell the user that we did something here. We won't capriciously remove things.
