Google AI can predict your heart disease risk from eye scans

Google's deep-learning algorithm could offer a simpler way to identify factors that contribute to heart disease.
Written by Liam Tung, Contributing Writer

A new study by Google and its health-focused Alphabet sibling, Verily Life Sciences, has shown that deep-learning algorithms can accurately predict heart disease risk by analyzing photographs of an individual's retina.

Scientists from the firms detail their findings in a new paper published in Nature Biomedical Engineering: 'Prediction of cardiovascular risk factors from retinal fundus photographs via deep learning'.

The retinal fundus image includes the eye's blood vessels, which the paper shows can be used to accurately predict cardiovascular risk factors, including blood pressure, age, gender, smoking status, and whether a person has had a heart attack. The algorithm was also able to infer a person's ethnicity, which is also a factor in cardiovascular disease.

"Using deep-learning algorithms trained on data from 284,335 patients, we were able to predict cardiovascular risk factors from retinal images with surprisingly high accuracy for patients from two independent datasets of 12,026 and 999 patients," writes Google Brain Team product manager Lily Peng.

The dataset consisted of 48,101 patients from the UK Biobank database and 236,234 patients from the EyePACS database.

As the paper points out, there are other ways of assessing cardiovascular risk from a patient's history and blood samples, but sometimes key information is missing, such as cholesterol levels.

The retinal image scans could offer a quick, cheap and non-invasive way of generating signals for heart disease.

Given that the algorithm could accurately predict risk factors, the scientists also trained it to predict the onset of a major cardiovascular event, such as a heart attack, within five years.

"Our algorithm could pick out the patient who had the CV event 70 percent of the time. This performance approaches the accuracy of other CV risk calculators that require a blood draw to measure cholesterol," wrote Peng.

The researchers also used attention maps to look at how the algorithm was making its predictions, such as whether it was focusing on blood vessels to predict age, smoking status and blood pressure.

As Peng notes, opening the black box to explain how predictions are made should give doctors more confidence in the algorithm.
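The paper's attention maps are derived from the trained neural network itself. As a rough illustration of the underlying idea, here is a minimal finite-difference saliency sketch: score each pixel by how much nudging it changes the prediction, so the regions the model relies on stand out. The toy model and data below are invented stand-ins, not Google's method.

```python
import numpy as np

def saliency_map(predict, image, eps=1e-3):
    """Finite-difference saliency: score each pixel by how strongly
    nudging it changes the model's prediction."""
    base = predict(image)
    sal = np.zeros_like(image)
    for idx in np.ndindex(image.shape):
        bumped = image.copy()
        bumped[idx] += eps  # perturb one pixel at a time
        sal[idx] = abs(predict(bumped) - base) / eps
    peak = sal.max()
    return sal / peak if peak > 0 else sal  # normalise to [0, 1]

# Toy "model" that only looks at the left half of the image.
rng = np.random.default_rng(0)
image = rng.random((8, 8))          # stand-in for a retinal photograph
predict = lambda img: float((img[:, :4] ** 2).sum())

sal = saliency_map(predict, image)
# Pixels the model actually uses (left half) get nonzero saliency;
# the ignored right half scores zero.
```

Gradient-based attention maps in real systems work on the same principle, but compute the sensitivities analytically through the network rather than pixel by pixel.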

Verily's head of cardiovascular health innovations, Michael McConnell, said it is promising but early research.

"More work must be done to develop and validate these findings on larger patient cohorts before this can arrive in a clinical setting," he notes.

However, if further studies do validate the findings, the use of retinal images could lower the barrier to doctors discussing preventative measures with a patient.


In the gray retinal image used by the deep-learning algorithm, blood pressure is highlighted in shades of green.

Image: Google

Previous and related coverage

Project Baseline: Alphabet's five-year plan to map the entire journey of human health

An ambitious collaboration between US universities and Google-offshoot Alphabet's life sciences arm is aiming to map the factors that contribute to good health, and illness.

DeepMind and the NHS: What it's really like to use Google's kidney health app

The Royal Free was one of Google's first healthcare partners. Two years on, how is the product of their partnership working out?
