Right to be forgotten: The failure is in us, not Google

Summary: There isn't a technical fix for the problem we have with empathy.

TOPICS: Privacy, Government, Legal, EU

We are more than the sum of our search results. We know that about ourselves without being told — how can the fragments of information about us displayed online ever possibly define the complicated, paradoxical, fascinating beings that we are?

And yet we don't apply the same logic when considering other people; we cheerfully judge them based on what we can find in a quick Google search. That means that someone who made a stupid joke a decade ago is still defined by it, or that someone who committed a crime years ago can never put it behind them despite thousands of good deeds which go unrecorded by the internet.

That's because whereas previously embarrassing stories about an individual would have been printed in newspapers and then forgotten (existing only in a yellowing copy of an old paper, or in our own fallible memories), the internet means these stories are visible every time someone searches for their name.

This is what Europe's right to be forgotten tries to remedy — to take the undeserved sting out of these ancient stories. It goes some way towards creating a half-life for information in an age when digital technology allows us to retain everything forever.

There are some very limited scenarios – such as those involving spent convictions which would not have to be disclosed normally – where a right to be forgotten makes sense. But to me, beyond that, it's very hard to see why information which is fair and accurate should be removed from view.

That's because the right to a private life — which the right to be forgotten tries to protect — bumps up against other rights that are necessary for a fully functioning society, such as freedom of expression and freedom of the press.

While it doesn't reduce a journalist's ability to write a story about someone, the right to be forgotten makes it harder for others to find that story through a search engine, which is of course how most people navigate the web. That raises the question, can you really have freedom of speech if no one can hear what you are saying? Freedom of speech implicitly includes the freedom to be heard, and that's what we could be putting at risk here.

As such, the right to be forgotten is too big and too complicated to leave to search engines (who don't really want to police it) and the individuals who want links removed. Many of the decisions made so far on delinking (some of them later reversed) seem hard to defend. We need a better understanding of what the right to be forgotten means before we start turning search indexes — our outsourced collective memory — into Swiss cheese.

The right to be forgotten embodies one of the most profound challenges we face. Humans are by design forgetting machines; our fallible grey matter urges us on by helping us to forget old pains, and by preventing us from perfectly replaying happy memories over and over again. But now we have to deal with the consequences of having the capability to remember almost everything for all time.

The search engine provides the information, but we are the ones that make judgements based on it. It's not a failure on the part of the search engine if we judge someone wrongly based on a scrap of information that might be years or decades old. We need to make more informed decisions, not knee-jerk responses to old information.

The problem with the right to be forgotten is that it takes that choice away from us. It means we don't have the information to make those judgements at all — right or wrong. We can't make intelligent decisions about how to respond to this information if it's hidden from us. Denial and obfuscation are not the right answer to the challenges ahead.

We need to see our fellow humans as we see ourselves — not as a collection of search results but as confused, inconsistent and changeable human beings. Our failure is not one of technology but of empathy, one that no amount of meddling with the search indexes can fix.

ZDNet's Monday Morning Opener is our opening salvo for the week in tech. As a global site, this editorial publishes on Monday at 8am AEST in Sydney, Australia, which is 6pm Eastern Time on Sunday in the US. It is written by a member of ZDNet's global editorial board, which is comprised of our lead editors across Asia, Australia, Europe, and the US.

  • Right on target.

    It is always up for interpretation... and people are not that good at it.
  • so what im hearing from you is..

    when the right to privacy bumps against your right to free press, free press should always win. you talk about "the right to be forgotten takes choice away from us". What about the people it is centered on? what about their choice to be left alone and lead a private life?

    what if someone's "freedom of expression" is to just be left alone and forgotten? why do they have to be forced to be linked to say a mistake they made when they were 19, thirty years ago when they were still maturing?

    there is no simple answer. I wouldn't want, say, all criminal convictions to disappear. But people who don't choose a lifestyle that puts them in the spotlight, like a celebrity or an athlete, should be entitled to privacy
    • The "right" to silence others

      This "right", particularly as created by European judges, is an absurdity. They accept that the actual source of information has a right to keep it online - so instead, they invent a "right" to forbid third parties from revealing the presence of that information when asked.

      I could have accepted a right to complain about old, obsolete media stories staying online, as we already have for inaccurate ones; if and when the story gets taken down, routine crawled updates will then remove it from Google's index anyway. The idea of ordering Google "even though that page is still online and mentions this name, you have to pretend it doesn't", though? A terrible idea.
      • I like the way you phrased your point.

        "...instead, they invent a "right" to forbid third parties from revealing the presence of that information when asked."

        That is, in effect, what this legislation aims to do. Search engines are technically third parties that reveal the presence of information that falls in the public domain when asked by its users. This legislation aims to censor that information based on a private party's comfort on having public (not private) information revealed.
    • It is really disturbing

      that you believe celebrities or athletes (is there a difference?) do not have the same right to privacy that the rest of us do. The press was never free to report on matters of public interest... and no matters of public interest are not determined by the publics interest in a celebrities private life.

      Right to privacy does not bump against right to free press, as you put it. This "right to be forgotten" is not about the right to privacy. It is about the right to obfuscate information that is not protected under right-to-privacy laws. The information is, in fact, legally considered public information; were it not, that information could actually be removed from publication rather than just from search results.

      You cannot realistically, through legal or technological means, take public information that has already been published in the public domain and then make it private again any more than you can wipe the memories of people who are already aware of an event. When the world was smaller, in a social sense, there was never any question about a person's right to have their actions forgotten. People remembered and they judged based upon their own discretion. Eventually, the world became much larger, socially speaking, so that a person and their actions became somewhat anonymous and they could then escape from judgments of people who had knowledge of their previous actions. That this occurred does not indicate that a person has any more "right" to do so than they did in the past when it was harder to do so.

      The current technology of communications is such that news can spread far and fast, good news and bad news. A person can use such technology to improve or damage their reputation. This legislation allows people the ability to escape a damaging reputation with no consideration for what it means to the public good.
      • grumble... ZDNet! WHY CAN'T WE EDIT POSTS ANY LONGER!??

        First paragraph should have read...

        It is really disturbing
        that you believe celebrities or athletes (is there a difference?) do not have the same right to privacy that the rest of us do. The press was never free to report on matters of private interest... and no, matters of public interest are not determined by the public's interest in a celebrity's private life.
  • Anybody looks further than the 2nd page of results?

    "...we cheerfully judge them based on what we can find in a quick Google search"

    That is the point and the vast majority of journalists do not act differently (although they should).

    The problem is that the news about, for instance, an accusation (front page) can be displayed at the top of the search results while the other news, about its dismissal (published for the sake of truthfulness but buried on a back page...), might appear on the 5th or 10th page of results. Hands up, those who currently care to look further than the 2nd one.

    I agree the fault is on people, not Google. But Google is a tool and should make a point of honor to implement measures so that inadequate use is kept to a minimum.

    The EU "right to be forgotten" is a small but necessary step. Perhaps in a future Google iteration one may find a box of links next to each result pointing to other context-filtered results (for instance, other published news), graded by date or any other indicator. At least ignorance would not be an excuse.
    • "...the vast majority of journalists do not act differently..."

      I would think that it would be the responsibility of the news media and journalists, first and foremost, to ensure that any report of an accusation of a crime would be updated with any relevant links to an acquittal. That would solve the problem nicely without shifting responsibility from those publishing information to those who merely index it.
  • There is no such thing as a right to be forgotten...

    By the logic of this ruling, we would have no history. Textbooks, microfiche, old newspapers, old newsreels, etc. all maintain records of information that I'm sure one side of the story would prefer be forgotten. I'm quite certain that Bill Buckner would love to forget his infamous error. Whether online or not, do you think there's a Sox fan who doesn't know about him?

    The thing about search engines is that you need to input something to search for. If you're searching, then by definition, have you forgotten?
  • forget me not

    a young person I know of was charged with and found guilty of assaulting a bouncer. on appeal the prosecution case was thrown out, but not before the judge had gotten on their high horse, which gained a fair amount of coverage. 7 years on, a generic search on the suburb he was from brings his name and the conviction up on either the first or second page of search results. as previously mentioned, there is no sign of the conviction being overturned. I wonder how many job opportunities he has missed out on as a result
    • sure, we all believe in everything we read on the internet

      I do not understand. Are you saying that this is the media's fault and not the hiring agency or the companies that have idiots in their HR department who make their decisions based on some stupid articles they found on the internet that may or may not be relevant to the person who applied?
      • Were I an employer...

        ...I might ask about it in the interview, and if I hired the applicant, I'd make it clear that on the job violence will not be tolerated; but that's about as far as it would go. But if someone like that is really trustworthy, he'll explain what happened and why, if asked; and provide some assurance that it won't be allowed to affect his performance on the job (it's hard/impossible to do one's job from a jail cell).

        If I had been George Steinbrenner, I wouldn't have fired Billy Martin for brawling (he was still a great manager); but I would have fined him a large amount every time it happened; and insisted that he get whatever counseling/treatment he needed for the alcoholism (which ultimately killed him).
        John L. Ries
      • it will and does happen

        Whilst an employer interested in a person's skills might investigate this information fairly, in a country where there are a couple of hundred applicants for each job some junior might simply weed out any applications which have any doubt about them, long before the decision maker (who only wants to read a dozen applications or less) gets to see them.
        Whilst there is a problem with reporting, the key to this problem is that search results turn things up disproportionately in relation to their proper place in life. It should be remembered here that it is not the information that is expunged but only the ridiculous random access to it. I personally feel it is reasonable for a search for, say, "crime committed by Joe Smith in 1970" to come up with associated articles if they exist. Should they however feature in the first page of a search for "Joe Smith", or maybe even the small town he was born in? No one knows exactly how Google's search engine works, but we do know that businesses and organisations can manipulate results. Let's say that "smalltown local newspaper" has managed to ensure that its information will usually be near the top of search responses, and that "Joe Smith" lived there for a couple of months in 1970. Because that is the only information they hold about Joe Smith, the results may come near the top.
        The problem is not with anyone's rights as they existed before - the information always was, and still is, available in the public domain. The issue is with the way that Google's search engine works, and that is the only issue addressed by this.
        It is not only people that suffer - I searched for reviews of a restaurant that I know, and the fourth item on a Google search revealed a site with three very bad reviews. Those reviews were nearly 10 years old, from before the current owners took over and turned the restaurant around. As with information about people, you are right when you say that we should not make judgments based on things that leap out from a search - however the reality is that people believe that the first couple of pages of results are not only the most important but also logically the most recent items.
        Laws are made to protect people in the real world and not the world as you imagine it should be (however right that imagining may be) - therefore it is perfectly reasonable to take into account how results will generally be interpreted in this real world.
        • That requires a cultural fix...

          Censorship will do nothing to fix that aspect of the culture, but will provide a weapon for those who can afford the right legal help to suppress any and all negative information about themselves.
          John L. Ries
  • I think you're presenting a flawed argument...

    "Freedom of speech implicitly includes the freedom to be heard, and that's what we could be putting at risk here."

    The trick to this sentence is the concept of 'freedom to be heard' and what it means. For example, people are free to say things, but it's not consequence free. Ignoring legal issues (and that's so not black and white), people will judge you by what you say and what you have said. More to the point, it can *never* be consequence free because speech has consequences. The whole POINT of speech is to induce consequences.

    So, in a similar way, speech can be harmful. Even factual speech can be harmful. This is the whole core, for example, of secrecy in medical information. What you have affects what you can do. If you're a man who's slept with another man - even once - even protected - the consequences of that can be pretty staggering and unfair. Your health insurance may go up - you can't donate blood - you may suddenly find that it's harder to get work.

    And factual speech can affect people not directly related to it. If you knew I was a cereal killer who killed people and ate them for breakfast (ok ok - serial), it's not unlikely that you'll be wary around my son when he hasn't even bitten anyone. I may want to protect him.

    People do not have a right to know everything about everyone. There just isn't such a right - and the right to free speech doesn't create such a right.
  • Just let the internet search users do the filtering, not Google

    There is absolutely no need to depend on any search engine providers to decide what is relevant or necessary in a search database.

    It is a simple fix that is beyond the comprehension of the technology's gatekeepers and would streamline and increase the efficiency of the entire search load on the internet.


    1. Add the original inclusion date of the data to the search database, and a date for any relevant content update (if any, say 5 to 20% increase).

    2. Tier search results by the number of retrieval hits to the search results.

    3. Relegate data with no retrieval hits after 6 months to a separate limited-life search archive, stop robots from re-including it in the active database, then dispose of it after 1 year with no retrieval hits.

    Internet search users who are not interested in old information or data older than a certain set point would cull the database by its use or non-use; irrelevant data would be automatically disposed of.

    The internet will then clean itself of personal data that has limited general interest. The internet would have less garbage in the search results and no one would ever know about your plagiarizing CliffsNotes for your master's degree.
    Makes Things