DeepMind's AI spots early signs of eye disease

Initial results of DeepMind's partnership with Moorfields Eye Hospital in London suggest that a scanning method that uses artificial intelligence could provide quicker diagnoses and help prevent sight loss.
Written by Jonathan Chadwick, Contributor

Alphabet's DeepMind says it has used artificial intelligence (AI) to spot signs of eye disease as effectively as world-leading expert doctors.

The results of the partnership with Moorfields Eye Hospital in London, detailed in Nature Medicine, suggest that the scanning technology could provide quicker diagnoses of sight-threatening eye diseases, ensuring that patients don't have to wait as long for urgent treatment.

The British company's AI program "trained" with diagnostic data from almost 15,000 patients, learning how to spot signs of eye disease from optical coherence tomography (OCT), an imaging technique that uses light waves to produce 3D images of the back of the eye.

Currently, eye-care professionals use OCT scans to help diagnose eye conditions, DeepMind explained in a blog post, but the scans are hard to read and require expert analysis to interpret.

DeepMind's system can automatically detect features of eye diseases from the OCT scan "in seconds", prioritise patients most in need of urgent care, and cut the time between patient scans and their resulting treatment.

"These are early results, but they show that our system could handle the wide variety of patients found in routine clinical practice," DeepMind said in the post. "In the long term, we hope this will help doctors quickly prioritise patients who need urgent treatment -- which could ultimately save sight."

Further, DeepMind said its solution tackles one of the key barriers to AI in clinical practice: understanding the system's reasoning and why it arrived at a particular decision.

DeepMind's AI combines two neural networks, which it describes as "segmentation" and "classification" networks. The first analyses the OCT scan to produce a "map" of the eye tissue and any features of disease; the second analyses that map to give clinicians diagnoses and a referral recommendation, expressed as a percentage.
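The two-stage design described above can be sketched in miniature. The following is a hypothetical illustration only, not DeepMind's actual models: the `segment` and `classify` functions are trivial placeholders standing in for trained neural networks, the tissue classes and referral categories are assumed for the example, and the classifier weights are random.

```python
import numpy as np

TISSUE_CLASSES = 3  # assumed classes, e.g. healthy tissue, fluid, other pathology
REFERRALS = ["observation", "routine", "semi-urgent", "urgent"]  # assumed categories

def segment(oct_volume: np.ndarray) -> np.ndarray:
    """Stage 1: label each voxel with a tissue class (stand-in for a segmentation net)."""
    # Placeholder: bucket raw intensities in [0, 1) into three tissue classes.
    return np.digitize(oct_volume, bins=[0.33, 0.66]).astype(np.int64)

def classify(tissue_map: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Stage 2: map the tissue map to referral probabilities (stand-in for a classifier net)."""
    # Placeholder features: the fraction of voxels in each tissue class.
    freqs = np.bincount(tissue_map.ravel(), minlength=TISSUE_CLASSES)
    feats = freqs / freqs.sum()
    logits = feats @ weights                   # (TISSUE_CLASSES,) @ (TISSUE_CLASSES, 4)
    exp = np.exp(logits - logits.max())        # softmax over referral categories
    return exp / exp.sum()

rng = np.random.default_rng(0)
scan = rng.random((8, 64, 64))                 # toy stand-in for an OCT volume
dummy_weights = rng.normal(size=(TISSUE_CLASSES, len(REFERRALS)))
probs = classify(segment(scan), dummy_weights)
print({r: round(float(p), 2) for r, p in zip(REFERRALS, probs)})
```

The intermediate tissue map is the point of the design: because clinicians can inspect the segmentation output directly, the system's final recommendation is easier to interrogate than a single end-to-end black box.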

DeepMind said that the tech would need to undergo clinical trials and gain regulatory approval before being used in practice. If and when it is, clinicians at Moorfields will be able to use it for free across their 30 UK hospitals and community clinics for an initial five-year period.

The technology could then be applied to different types of eye scanners, not just the device it was trained on at Moorfields, meaning it could be applied globally.

"We don't just want this to be an academically interesting result -- we want it to be used in real treatment," DeepMind added. "We're confident that, in time, this system could transform the diagnosis, treatment, and management of eye disease."

Since being bought by Google in 2014, DeepMind has expanded into areas such as gaming and healthcare. The Royal Free teaching hospital in London was one of DeepMind's first healthcare partners; in 2015, the two began working on an app called Streams, which alerts clinicians to acute kidney injury in patients, allowing them to respond more quickly.

Last year, DeepMind announced a research initiative with the Cancer Research UK Centre at Imperial College London to use machine learning to improve the detection of breast cancer, analysing X-rays in the hope of spotting signs of cancerous tissue more effectively than current screening techniques.

It also applied machine learning to radiotherapy planning for head and neck cancer at University College London Hospitals NHS Foundation Trust in 2016.

DeepMind's AI rose to prominence in 2016 after its AlphaGo program beat professional Go player Lee Se-dol over the course of a five-game match. The computer program won all but the fourth game against the Korean player, earning itself an honorary 9-dan rank, the highest possible in the game.

DeepMind later said it had created the best Go player in the world because it was able to do away with human knowledge and start with a blank slate.

In June this year, the UK enlisted DeepMind's founder Demis Hassabis to advise its new government office for AI.


<="" p="" rel="follow">

    <="" p="" rel="follow"> <="" p="" rel="follow">

<="" p="" rel="follow">

<="" p="" rel="follow"> <="" p="" rel="follow">Newcastle University develops 3D-printed replacement corneas

3D-printed 'bio-ink' consisting of stem cells, alginate, and collagen, could be used in the future to ensure an unlimited supply of corneas for people requiring surgery.

Why swallowable robots could be the future of healthcare

A collaboration by Caltech researchers could pave the way for new therapies by creating tiny devices to monitor the inside of your body.

Google's DeepMind and the NHS: A glimpse of what AI means for the future of healthcare

The Google subsidiary has struck a series of deals with organisations in the UK health service -- so what's really happening?

DeepMind and the NHS: What it's really like to use Google's kidney health app

The Royal Free was one of Google's first healthcare partners. Two years on, how is the product of their partnership working out?

DeepMind research shows AI can make itself more human, and businesses should take notice(TechRepublic)

The findings of a recent study by DeepMind showed that the AI system spontaneously began moving similar to mammals.
