
Trapped in a filter bubble: How Google and Facebook only feed you what you want to hear

Author Eli Pariser on the way the web's warped lens is keeping us from serendipitous discoveries...
Written by Nick Heath, Contributor

No one likes to think of themselves as a victim of propaganda - of being fed a diet of information chosen to colour their views and mould their behaviour.

Yet every time we use a search engine or check updates from our friends on a social network, a digital profile of who we are and what we like is used to decide what information these services will show us.

These profiles - used by the likes of Google and Facebook - are a work in progress, continually revised to reflect the latest links we clicked on or the last location we logged on from.

Author Eli Pariser believes web filtering means people are increasingly living inside filter bubbles where their own views are reflected back at them. Photo: PopTech

It might seem like a win-win situation: Google and Facebook get to maximise clicks and advertising revenues and we get the links we seemingly want.

However, according to Eli Pariser, author of The Filter Bubble, there is a pernicious effect to this filtering, one that means each person is only exposed to information that reinforces their beliefs and behavioural patterns, and keeps them from stumbling across facts or opinions that contradict their point of view.

This filter bubble - the web's unseen information echo chamber - is altering our behaviour and could damage the free exchange of ideas and information that is the bedrock of modern democracies, Pariser believes.

"[US politician and sociologist] Daniel Patrick Moynihan famously said, 'We're all entitled to our own opinions but not our own facts'," Pariser told silicon.com.

"In the filter bubble you do get your own facts, and if you don't believe in climate change then you can live in a world where your search results confirm that view.

"What's missing is the ability to put yourself in someone else's shoes. Increasingly, it's like your own shoes follow you around wherever you go."

Profiling is big business for search engines and social networks. Every time we perform a search, Google reportedly collects myriad signals of user behaviour - from the time it takes to complete a search to the browser used to perform it - that it uses to build and update user profiles. That's without taking into account the user information Google extracts from the suite of apps it provides: Gmail, Google Docs and the like.

Facebook, on the other hand, captures the wants of each of its 750-million-plus users based on their activity and friends on the site and, more recently, their browsing habits across the web, using stats from the widely adopted Facebook Like button to track what pages they visit.
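
In rough terms, such profiling amounts to folding a stream of behavioural signals into a running per-topic score. The Python sketch below is purely illustrative - the signal names and weights are assumptions for the example, not Google's or Facebook's actual code - but it captures the mechanic: recent clicks, Likes and page visits nudge a profile that then decides what you see.

```python
# A minimal, illustrative sketch of behavioural profiling - not
# Google's or Facebook's actual code. Signal names and weights
# are assumptions for the example.
from collections import defaultdict

class UserProfile:
    def __init__(self):
        self.interests = defaultdict(float)  # topic -> interest score

    def record_signal(self, topic: str, weight: float = 1.0) -> None:
        """Fold one observed signal (a click, a Like, a page visit,
        a search) into the running profile."""
        # Decay existing scores slightly so the profile tracks
        # *recent* behaviour - continually revised, as the article says.
        for t in self.interests:
            self.interests[t] *= 0.99
        self.interests[topic] += weight

    def top_interests(self, n: int = 3):
        return sorted(self.interests, key=self.interests.get, reverse=True)[:n]

profile = UserProfile()
profile.record_signal("celebrity-gossip", weight=1.0)  # a quick click
profile.record_signal("football", weight=3.0)          # a Like is a stronger signal
profile.record_signal("climate-change", weight=0.5)    # a brief page visit
print(profile.top_interests())  # what the service now thinks you want
```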

It was on Facebook that Pariser first noticed the distorting effect of search engines and social networks, when he realised Facebook was omitting updates from his News Feed - Facebook's live stream of posts and updates from a user's list of friends.

Facebook's News Feed is a prime example of filtering, showing only results it thinks you are interested in based on what you click most. Image: Facebook

"I'd gone out of my way to befriend people who had different political views from me and Facebook was editing them out of the News Feed," he said.

"It was watching what I was doing and saying, 'You say you want to hear from these people but you're not clicking on their links as much as you're clicking on the other links', and so their links disappeared from my view.

"That got me thinking about what it meant that this medium, that one in 11 people use, has this mechanism that is letting some information through and not others."

The ostrich society

This behind-the-scenes editing creates individuals who are blind to issues outside of their narrow scope of interest, according to Pariser - a state of being he believes could damage social cohesion and the electorate's ability to make informed choices at the ballot box.

The filtering mechanism rewards our shallowest interests - throwaway clicks to stories about Lady Gaga's latest fashion crime or video highlights from the big match - bringing these topics to our attention at the expense of issues that are less accessible but ultimately more consequential, such as geopolitics.

"It's all about the stuff you'll click on first. What these systems are designed to do is increase page views," Pariser said.

"The unintended consequence is that there are a whole bunch of things that don't make it to a broad audience because they don't fall under those categories for anyone.

"The war in Afghanistan, for example, because it doesn't get a lot of clicks or Likes on Facebook, won't easily penetrate the filter bubble."

If left unchecked, Pariser said, the filter bubble will blind society to the issues it most needs to face up to, creating "an ostrich society, where everyone's head is buried in the sand" and our thoughts are consumed by the trivia we feel compelled to click on.

"It feeds back into democratic decision-making, as our political priorities come from our media to a large extent. There's very strong psychological research that when people hear a lot of conversation about a topic they prioritise it higher politically," Pariser said.

"Whether it's climate change, the wars we are fighting or other issues, they drop precipitously in the public view and they become less and less of a concern.

"The end point is that we entirely lose sight of wider society and the problems in it, and that is a challenge because those problems don't lose sight of us."

To understand how the filter bubble might affect individuals and society, we can look at studies of how traditional media warps a person's beliefs and behaviour, according to Pariser.

In The Filter Bubble, he writes about research by political scientist Shanto Iyengar, who in the 1980s asked a group of people to watch a set of news reports, which had been doctored to focus on a particular topic.

After six days of watching the skewed news reports, the test subjects rated the topic highlighted in those reports as being more important than they had at the start of the study - pollution, for example, rose from being ranked as the second-lowest priority topic to the second highest.

Trapped in a bubble of our shallowest interests, we lose sight of the issues that are important to society. Photo: Chris Guise

However, Pariser believes the effects of the internet filter bubble are more pernicious than the editorial choices taken by traditional media, as he argues that the majority of people don't understand the choices that are shaping what they see online.

"When you open up The Economist, you understand what the editorial viewpoint is, and therefore you can make a guess about what you're missing," he said.

Citing an observation by digital commentator Clay Shirky in the book, Pariser points out that in the case of newspapers, the average citizen who skips political stories 99 per cent of the time will periodically read the big political scandal of the day. In the filter bubble, however, that person won't even know the political scandal exists because they will never see it.

"Because you don't know who Google News thinks you are or who Yahoo News thinks you are, you don't know what you're missing. The consequence, especially as it develops more, is that you get a distorted view of the world and you don't even see how it's distorted, you don't know that there's editing at work."

How the filter bubble can be used against us

Of course, these personal profiles are not just useful to search engines and social networks - businesses and government organisations are beginning to use such data to decide how to treat us as individuals.

"The same pool of data that can be used for doing this personalised targeting can also be used to make decisions about people.

"For example, banks are starting to look at Facebook data as a way of judging whether somebody is credit-worthy or not.

"If your friends have good credit then you are likely to. Conversely, if your friends don't have good credit then you are not likely to as well. That's a very dangerous thing because that is essentially guilt by association.

"The challenge is that all of that happens entirely opaquely - it happens without any of us knowing it. The ways this data can be used to discriminate against us is never...

...made transparent to us."
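
To make the danger concrete, here is a deliberately naive sketch of the sort of "guilt by association" scoring Pariser warns about - a hypothetical model for illustration, not any bank's real system - in which a person's creditworthiness is inferred purely from their friends' scores.

```python
# Hypothetical "guilt by association" scoring - not any real
# bank's model. Scores run from 0 (bad risk) to 1 (good risk).
def inferred_credit_score(friends_scores):
    if not friends_scores:
        return 0.5  # no social data: fall back to a neutral prior
    return sum(friends_scores) / len(friends_scores)

# Two applicants with identical finances get very different scores
# purely because of who they know - and neither is ever told why.
print(inferred_credit_score([0.9, 0.85, 0.8]))  # ~0.85: approved
print(inferred_credit_score([0.3, 0.4, 0.35]))  # ~0.35: declined
```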

What filtered reality means for our future

Digital profiling is not just being used to second-guess a person's interests - work is taking place to allow organisations to tailor messages to a person's temperament and even their thought processes.

In The Filter Bubble, Pariser talks about the practice of "persuasion profiling" - where marketers try to work out what tone of message a person is most likely to respond to: is this individual the type of person who will jump at a vanilla 'Buy one, get one free' message or do they respond best to a more subtle communiqué?
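
One plausible way to implement persuasion profiling - an assumption of this sketch, not a description of any marketer's production system - is an epsilon-greedy loop: keep a per-user tally of how each message style performs, mostly serve the best-performing one, and occasionally try the others.

```python
import random

# Hypothetical persuasion-profiling sketch: serve each user the
# message style they have responded to best, with occasional
# exploration. Variant names are made up for the example.
VARIANTS = ["buy-one-get-one-free", "subtle-recommendation"]

def choose_message(shown, clicked, epsilon=0.1):
    """shown/clicked: per-user dicts of variant -> counts."""
    if random.random() < epsilon:
        return random.choice(VARIANTS)  # occasionally explore
    # ...otherwise exploit the style this user responds to best
    return max(VARIANTS,
               key=lambda v: clicked.get(v, 0) / max(shown.get(v, 0), 1))

# A user who ignores blunt offers but clicks subtle ones will now
# mostly be shown subtle messaging - tailored to their temperament.
print(choose_message(shown={"buy-one-get-one-free": 10, "subtle-recommendation": 10},
                     clicked={"buy-one-get-one-free": 1, "subtle-recommendation": 6}))
```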

Just as the digital world is filtered today, tomorrow the physical world could be shaped by the advent of augmented reality. Photo: Layar Augmented Reality

And just as the digital world is filtered today, tomorrow the physical world could be shaped by the advent of augmented reality (AR).

AR allows digital information to be superimposed in real time over video of the real world or over transparent displays - for example, giving information about real-world objects that are in view. Today, AR is mainly used to layer digital information over video feeds on smartphones or over heads-up displays used by the military, but in the future it may become possible to combine AR systems with wearable transparent displays.

In such a scenario, the way we perceive the world around us could be coloured by the digital information that is layered on top of it.

For The Filter Bubble, Pariser interviewed the founder of a dating website who speculated that one day people wearing AR displays could walk into a bar and immediately see the people they are most compatible with, based on information from a dating site's database.

"Augmented reality represents the end of the naïve empiricism, of the world as we see it, and the beginning of something far more mutable and weird: a real-world filter bubble that will become increasingly difficult to escape," Pariser writes in his book.

With all of these techniques cajoling and corralling our behaviour, Pariser questions how much room there will be for individuals to make their own choices.

"As far as dystopias go, I'm more of an Aldous Huxley person than a George Orwell - it's less that something will be forcibly telling us what to do, and more that it will get good enough at hitting our psychological hot buttons that we'll think we're doing what we want to do, and actually we will be being manipulated," he said.

"To really be free you have to...

...have a sense of where your choices are - when you lose that sense, you lose something profound because it's hard to say that someone who only believes they have one choice is actually choosing that path."

How to burst the filter bubble

The secret to freeing people from the grip of the filter bubble, in Pariser's view, is to expose the public to its inner workings.

Google and its compatriots should show their users their working, making clear the underlying decisions that shape the results of a web search or a social media news feed.

Websites like Google need to make users aware of exactly how their information and online behaviour are being used to profile them and filter their content. Image: Google

"It requires more transparency and scrutiny," Pariser said.

"Not just transparency on the algorithm because, as I say, that will become increasingly inscrutable but transparency on the outcomes.

"What you really want is Google to open up a lot of the data surrounding how people are using the web search results they are getting, so you can look at it and say, for example, 'Is Google biased towards itself?' and answer that question authoritatively."

As people come to understand how companies are using their personal data and the implications for their lives, Pariser believes it may be in the commercial interests of companies like Facebook and Google to come clean.

"As people get more literate with these tools they'll be more aware of which companies are treating them with respect and which ones aren't," he said.

"Facebook is at a precarious moment and part of it is to do with the feeling that you can't really trust these guys."

Besides users objecting to being manipulated, personalisation presents them with a fairly shallow and ultimately boring pool of content to choose from, he said.

"I think that part of the reason Facebook is getting this down-tick is because the Facebook News Feed isn't a very rich media experience - it's fairly shallow, and after spending a year hooked on it you start to go, 'I'm not actually learning that much about the world or getting much value out of it'," he said.

Pariser is "fundamentally an optimist" and believes that as people learn how their personal information is being collected, analysed and used to control their behaviour, they will demand that the companies doing the filtering change their ways.

"I haven't talked to many people who like the feeling of being manipulated by tools they feel are supposed to be serving them," he said.

"That's what the book is trying to do - to raise people's awareness so we can make a decision to embrace a different path."
