Your brain vs technology: How our wired world is changing the way we think

Our brains adapt to how we use them - so what does our love affair with tech mean for our minds?
Written by Natasha Lomas, Contributor

In the first of a series of articles examining the impact of technology on our society and ourselves, silicon.com's Natasha Lomas investigates the effects of technology on the human brain.

Technology is frequently accused of being the root cause of a raft of social problems.

Texting, social networking, googling - they've all been in the dock in recent years, accused of causing a range of social and behavioural problems.

'Authorities blame games for sword attack', 'Video game cause of murder of great-grandmother by teen', 'Facebook hurts grades, creates more narcissistic tendencies for teens' are just a few recent headlines, while earlier this year, the finger was pointed at social media services such as Twitter and BBM for apparently amplifying civil disorder in August's riots.

An MRI brain scan: Is technology changing how we think? Photo: Shutterstock

Of course, fearmongering and tabloid journalism make natural bedfellows, but could there be some substance to concerns that technology is affecting the way we behave - and even the way we think?

In the past decade, Baroness Susan Greenfield, professor of Synaptic Pharmacology at Lincoln College, Oxford, has publicly questioned how the digital world might be affecting our brains, our intellect and our ability to form meaningful real-world relationships.

In 2006, Greenfield was quoted in a Guardian article warning of the impact of electronic media use on children's ability to learn, saying: "I am not proposing that we become IT Luddites, but rather that we could be stumbling into a powerful technology, the impact of which we understand poorly at the moment."

Similarly, in 2009, she penned an article for the Daily Mail targeting social networks as an area of particular concern: "By the middle of this century, our minds might have become infantilised - characterised by short attention spans, an inability to empathise and a shaky sense of identity," she wrote.

She has variously called for more research on whether there is a link between ADHD and internet use, and has questioned whether increases in autism might be down to too much screen time. The latter suggestion caused fellow Oxford University academic Dorothy Bishop, a neuropsychology professor, to post an open letter criticising her comments as specious, erroneous and uninformed.

Greenfield's public expressions of concern have frequently caused controversy, as Bishop's response illustrates - not least because of the scaremongering headlines her words tend to attract: 'Society should wake up to harmful effects of internet' and 'How Facebook addiction is damaging your child's brain: A leading neuroscientist's chilling warning' are just two examples.

Such headlines have added momentum to a populist notion that internet use is somehow bad for your health - or in Daily Mail-speak, it 'rots the brain'.

Speaking to silicon.com, Greenfield is quick to distance herself from such headlines: "What I'm concerned about more than anything is the way sometimes I'm misquoted," she says. "People keep saying I've said [computers are] bad for the brain. I've never said that. I'm sure that it makes better, more sensationalist copy but what I've always said, and what I still say, is if the human brain is exquisitely adapted to the environment, which it is, if the environment is changing, which it is, then it's a given the brain will change.

"What we need to do is discuss how it's going to change, and whether some things are good, some things are bad. That's what I've said - not that computers are bad for you because it depends what you would regard as bad anyway and I wouldn't dream of giving value judgements. I think my concern has been that something that I think is very important has been derailed slightly by hype and sensationalism."

Susan Greenfield, professor of Synaptic Pharmacology at Lincoln College, Oxford, giving a speech in 2006. Creative Commons: Andy Miah

Nevertheless, in Greenfield's view there is more reason to be concerned now than when she first started voicing worries about the impact of the pervasive digital world on the neuroplasticity of the human brain.

Her fear is that the brain's ability to adapt itself to its environment is so good there's a risk of it effectively shrinking to fit a digital life at the expense of the world outside our Windows.

Greenfield sketches a possible future where the human mind becomes underdeveloped through a reliance on virtual experience - and people lose emotional depth and range as a result of living vicariously online.

What scientific evidence is there now to support the notion that internet use is bad for the brain? Greenfield points to Chinese research published in the journal PLoS ONE in June this year that she says shows "a very striking correlation between internet addiction and atrophy in the brain in young people".

The researchers studied the brains of university-age students who characterised themselves as internet addicted - finding that regions of their brains had decreased grey matter volume.

There were also changes in brain structure: "Our results suggested that long-term internet addiction would result in brain structural alterations, which probably contributed to chronic dysfunction in subjects with IAD [internet addiction disorder]," the study concludes.

Greenfield also cites a US study of empathy in college students, published in Personality and Social Psychology Review, as relevant to her concerns about digital interactions. The study reported a sharp drop in empathy levels among US students, in particular over the past 10 years.

Critics have accused Greenfield of being selective in the studies she cites but she rejects this - although she says there is not yet a scientific consensus backing her concerns.

"Evidence is starting to come out, but the issue with evidence is that this kind of work is obviously longitudinal. I think it's dangerous to hide behind the mantra there's no evidence, thinking absence of evidence is evidence of absence," she says.

"I know one swallow doesn't make a summer [but] those kinds of papers [cited above] are the kind of things I think we should be paying attention to."

"The human brain is so exquisitely adapted to be perfectly adapted to whatever environment it's in. If it's in an environment where it communicates mainly with computers, it too will become...

...like a computer - with both the good and bad things [associated with that]," she adds.

Greenfield is not the only public figure raising concerns about the effects our online behaviour is having on our real-world abilities. The US writer Nicholas Carr penned an essay in 2008 entitled 'Is Google Making Us Stupid?'. Carr's contention was that using the internet had degraded his ability to concentrate - that, by accustoming him to access to so much data, the internet had made it harder for him to learn and think deeply by the age-old method of study and quiet contemplation.

"Media are not just passive channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the net seems to be doing is chipping away my capacity for concentration and contemplation," he wrote. "My mind now expects to take in information the way the net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski."

Carr's view is that the very fabric of the web teaches our brains to be distracted. He argues that a fundamental characteristic of the online experience - the hyperlink - actually disrupts our ability to concentrate on a piece of text because it forces us to make decisions about whether or not to click on the link. Add in ads, flashy graphics and other multimedia elements and the web is more eye candy than study library, he says. And little wonder when the businesses that dominate the online world have an interest in keeping us clicking, not contemplating.

"Most of the proprietors of the commercial internet have a financial stake in collecting the crumbs of data we leave behind as we flit from link to link - the more crumbs, the better. The last thing these companies want is to encourage leisurely reading or slow, concentrated thought. It's in their economic interest to drive us to distraction."

Flagging up the influence of commercial interests online is one thing but proving that thoughtful reading and the internet are somehow mutually exclusive - and that reading online erodes our ability to read offline - is much more difficult.

Another piece of research that Oxford's Greenfield points to in support of her concerns is a study published in 2000, which identified navigation-related structural change in the hippocampi of London taxi drivers - the apparent result of the specialist type of learning that the cabbies engage in.

Learning The Knowledge changes the structure of cabbies' brains, according to a UK study. Creative Commons: Garry Knight

The research implied that learning The Knowledge - committing to memory the entire map of London's streets - has an impact on the relative size of different portions of the taxi drivers' brains: the cabbies' posterior hippocampi were larger than those of control subjects, while the control subjects had a larger anterior hippocampal region than the cabbies.

The study notes that the data is in line with the idea that those who use their navigational skills a lot see an expansion in the area of the brain involved in supplying the spatial representation of the environment.

"It seems there is a capacity for local plastic change in the structure of the healthy adult human brain in response to environmental demands," the study says.

The research suggests the brain acts almost like a muscle - bulking up in regions required to perform oft-repeated mental tasks but diminishing in regions used for less common types of thinking. Or to put it another way: do a lot of mental arithmetic, and your brain will get better at mental arithmetic.

Like Greenfield, Carr also cites the taxi driver study. He goes on to suggest there is now a body of evidence that indicates the human brain adapts to suit how we use it. The question that follows is whether our technologies are making the best use of our grey matter.

"We now know that even as adults our brains are constantly adapting to circumstance and to the tools we use.

"What seems to be true is that our brains want to be efficient, so they dedicate more resources in terms of neurons and circuits etc to the types of thinking we engage in a lot. And then what weakens are the types of thinking that we don't practice," Carr tells silicon.com.

"So then the question becomes, exactly what's going on? How are we adapting? It's very hard to scientifically get the answer to that question because you can't go into healthy people's brains and fiddle with their neurons. [But] there are starting to be actual scanning studies and psychological studies that indicate there are changes going on."

He also believes evidence is building up through psychological studies to indicate that the online environment tends to reduce comprehension and understanding - making it harder to retain information or acquire new knowledge.

"At a behavioural level, I think there's a lot of evidence that we tend to read and think more superficially when we're taking in information the way it's delivered online versus, say, reading a printed book where there aren't distractions and interruptions and you're really able to focus on the text," he says.

"There have been a lot of studies that compare the reading of hypertext on a screen with the reading of straight text without links in it and when you look across all those studies you see that in fact when people read with hyperlinks their comprehension goes down.

"That's just one example about how when you start distancing people from their main focus even in very subtle ways you reduce their understanding, you reduce their learning, you reduce their attention and when you start to think about how people read or take in information in any way online there are huge numbers of distractions."

He discusses another study conducted in a lecture class where half the students were allowed to have their laptops open and the other half had to keep them shut. At the end of the lecture the two groups were tested on retention of information - the group without laptop access did "a significantly better job" of recalling what they'd just been taught.

"When you think about it, it's not a big surprise - you're not distracted. On the other hand, despite that, we're seeing more and more schools embrace the idea of allowing their students to have their laptops with them and have rich wi-fi connections in classrooms, so it's kind of interesting how...

...in many ways we assume that more technology's better - even though common sense would tell you sometimes it's not."

Carr cites a US study published earlier this year that investigated the effect of using Google on memory. The psychologists reported that study participants were less likely to remember information if they believed they could find it online - remembering instead where they could find the data, rather than the data itself. They concluded the internet has become a "primary form of external or transactive memory, where information is stored collectively outside ourselves".

"We've always had external sources of information that supplemented our memory but it seems to me the danger here is that if we, in effect, train our brains to forget rather than to remember you may still be able to find the individual bits of information when you need them but what you lose is the personal associations that happen when you actually go through the process of remembering something," says Carr.

Big conceptual knowledge is built not on individual facts, he argues, but on the connections and the associations made in personal memory. "If we bypass that whole process of creating those connections just because we can find stuff out on the internet so simply then I think, for all the benefits of having this huge store of information we do risk losing some of the foundations of individual knowledge," he says.

"Ultimately it seems to be that individual personality and the individual self comes from those connections as well, so I'd be at least a little worried that we're too quick to assume that a computer database is a substitute for personal memory, whereas I think they're very different things."

Carr says we're too quick to assume that a computer database is a substitute for personal memory. Creative Commons: sander.duivestein

Yet technology does not stand still. Since Carr published his original essay in 2008, there has arguably been a movement towards digital technologies that counter the 'too much information' problem by simplifying the user experience - developments such as the rise of apps, which offer a few choice functions in an easy-to-use package, and e-readers such as the Kindle, which strip away digital distractions by mimicking the printed page.

Don't these developments suggest that worrying about the online world as it stands now might be a bit premature - because the digital world is already adapting itself to suit our needs?

"All of these are positive signs and encouraging signs, and at the very least I think they testify to the fact that, among at least some group of internet users, there is this yearning to escape the distractions and be more attentive. The big question is, is that a major trend?" says Carr.

"We don't know yet whether these new technologies are really changing the way people use the internet or whether they're fairly trivial in the greater scheme of things. Even as we have these kinds of more attention-prompting trends, we also have the kind of opposite trend - which is...

...the explosion in social networking that is an extraordinarily powerful way to keep people distracted by constantly bombarding you with small updates and tweets."

It's the uncertainty of how technology is affecting us that is also driving Oxford's Greenfield to speak out. Not knowing how something will affect us should not stop us from thinking about how we want it to shape us, she says.

"What we do know about the human brain is that we occupy more ecological niches than any other species on the planet because we are good at adapting - therefore the risk of us adapting in a way that we don't like is quite high. So what we need to do is discuss, what do you want your kids to be, what kinds of talents do you want them to have, what kind of lives do you want them to lead?" says Greenfield.

Of course, technology use is not without its perils for teenagers - disrupting circadian rhythms, for example. Using small, bright screens - smaller than TV screens - at night has been linked to a delay in melatonin secretion, which can affect sleep patterns, according to Dr Paul Howard-Jones, senior lecturer in education at the University of Bristol.

Using small, bright screens in the evening has been linked to insomnia. Creative Commons: Bashar Al-Ba'noon

And insomnia is not the only possible risk. The lure of always-on technology, or something funny just a click away, can encourage tech users to keep browsing long past bedtime. Disturbed sleep isn't just a problem for energy levels, as Howard-Jones notes, it can also affect memory. He cited one study that showed 13-year-olds who were allowed to play video games in the early evening experienced disturbed sleep and also impaired short-term memory.

"Their memory of what they had been doing before the video game was disrupted - so even playing a video game after your homework quite early in the evening may have some consequences. But that's one study and I think more research is needed in that area."

But it's not all bad where video games are concerned. There is evidence that action video games boost performance on visual motor tasks, says Howard-Jones - helping to hone skills that can be beneficial in certain types of jobs.

"In lounges, front rooms and bedrooms all over the world, thanks to a type of technology, young people and many adults have been experiencing increased performance on many visual motor tasks, ones that require visual decisions, visual attention and response times, switching of their visual attentions from one thing to the next, improvements in their ability to suppress distracting visual influences, improvements in their ability to infer an action's probable outcome in a very short period of time, and improvements in their contrast sensitivity - that's a visual quality to your sight which is actually the primary factor that limits it," says Howard-Jones.

"It appears that just 10 hours of play for non-gamers, people who wouldn't normally touch the video games, can generate transferable benefits to other tasks.

"These effects even transfer to some professional activities. The likelihood is you will be a better laparoscopic surgeon if you...

...enjoy your action video games and have been practicing in your Nintendo Wii."

What's going on in the brain here? The reason video games can be such good teachers, according to Howard-Jones, is they have rapid schedules of rewards - not easily found in non-digital activities - which stimulate the mid-brain regions.

Receiving a reward from onscreen activity, especially when a player does not expect to get a reward, leads to the release of a significant amount of dopamine - which plays a key role in learning.

"That type of mid-brain dopamine that's generated by games may help explain their engagement but it would also explain why games can be very good teachers, because mid-brain dopamine also predicts learning. We think this occurs via increased synaptic plasticity - the actual connections between the neurons are made in a more efficient way, so it speeds up the process of learning," says Howard-Jones.

Games, it seems, are excellent teachers - and can teach a wide spectrum of skills, both good and bad.

Games are excellent teachers, owing to onscreen rewards triggering dopamine releases. Creative Commons: Jeff Nelson

"One of the things they can teach is effective response and there is converging data that violent video games teach aggression. But it all depends what you do with it - [it's not a] fact that gaming technology is bad," says Howard-Jones.

"We have a similar set of evidence to show that pro-social games teach empathy and games that involve getting points for healing can actually teach their player to be pro-social and empathetic."

Nevertheless, it is often the 'violent video games cause violence' angle that makes the headlines, skewing our understanding of technology's impact upon us and our brains in a single direction.

A recent review of the science researching the internet's impact on the brain, authored by Howard-Jones, looked at some 180 different studies to try to ascertain whether evidence is emerging to support populist fears about technology.

Speaking at the RSA earlier this year, Howard-Jones suggested current research does not generally support the notion that internet use takes something away from us on a neural level.

He cited a 2007 study by UCLA professor of psychiatry Gary Small, who examined brain activity in two types of internet users - experienced and inexperienced - and found different regions of the brain were active in the different groups when they used the web.

Small also compared the brain activity of the two groups when reading a book and conducting a Google search on the internet.

His study found that experienced internet users showed activity in more regions of the brain than inexperienced web users. Small's conclusion was that internet use is altering our brains. However, Howard-Jones pointed out that all learning experiences alter our brains - meaning Small's results should not surprise or panic us.

"It would appear [from Small's study] that experience with a search engine has changed these people's brains. I would certainly not panic because this is probably just about learning.

"Whenever we learn, there are changes that occur in our brains - the brain is plastic - and I think a lot of the concerns we have when we find out that things are rewiring our brains arise from a misconception that...

...our brain is in many respects hardwired."

The biggest issue that dogs research in this field is that science still does not have a perfect understanding of the human brain, which leaves neurological research open to interpretation. Howard-Jones' interpretation of Small's study, for example, is not negative but rather that the people who have learned to use Google are probably using the search engine in a different way to internet newbies.

MRI scans don't give us a perfect insight into the brain - the scans have to be interpreted. Photo: Shutterstock

"These experienced users are probably using more search strategies. They have different types of targets, a set of targets, more complex decision-making processes are in order, and this additional [brain] activity we see corresponds quite nicely to the idea that there are additional decisions being made and additional reasoning processes being employed," he noted.

"So, yes, using the internet rewires our brains. Should we be concerned? No, I don't think so. All experience, when it becomes a learning experience, rewires our brains to some extent."

Tanya Goldhaber, a PhD student at Cambridge University's Engineering Design Centre who has worked on a project researching the impact of communications technology on society, also highlights the way all learning alters the brain.

"If you learn to do anything, that rewires your brain," she told silicon.com. "You can't learn without your brain becoming rewired, that's the definition of learning on a neurological level - so I don't find that particular phrasing helpful.

"When you learn to use a new piece of technology your brain accommodates that new knowledge but there's not necessarily this huge shift in how you're thinking about things.

"The studies I've seen have not convinced me that there's any sort of massive shift going on in terms of the structural organisation of your brain," she adds.

There is also a lack of consensus among neuroscientists when it comes to interpreting brain scans. Forget technology: the science itself is in dispute here.

"There is a huge amount of contention in the field right now," Goldhaber says. "Even in terms of a single method. So take FMRI [functional magnetic resonance] imaging, where you're trying to see which parts of the brain are more active for certain tasks.

"There's huge disagreement in the neuroscience community about the best way to carry out those studies, and the best control conditions and the best way to analyse that data because you're having a huge amount of numerical data that's coming out of those studies that needs to be very carefully analysed.

"And so in terms of studying the effects on the brain, I do think it's...

...a challenging area and that it's going to be a while before we have the full picture."

With brain scan science a work in progress, a more objective way to assess whether technology's impact on the brain is positive or negative is to look at a person's overall well-being and skillsets, according to Goldhaber.

"We can say definitively the brain will change in response to learning a new piece of technology because that's what the brain does.

"Now what we're really concerned about is not if the brain is changing but whether or not that change is positive, negative or neutral. I think there are secondary effects - in terms of behaviour, skills, psychological well-being - that you can absolutely measure and that's the place you have to start if you want to understand if these changes are really positive or negative."

Here, Goldhaber also believes the debate has become skewed, with the talk all too often centred on skill loss. We also have to consider the skills being gained, she says.

"You see that sort of concern about skill loss going all the way back to the invention of the written word when you had people like Socrates saying, 'Oh no, people can write things down, they won't be able to memorise entire books anymore'.

"I don't think anyone today would say that memorising an entire book is necessarily the skill you should be focusing on. Because we've got books, we've been able to develop other skills like critical thinking and comparison of knowledge.

"When we look at technology we can't just look at loss, we also have to look at gain, and we also have to look at skillsets in the context of the modern world - our grandparents' skillset is not the skillset that will serve us the best."
