Video: DeepMind and healthcare: How is the NHS using artificial intelligence?
Back in 2014, Google bought UK artificial intelligence outfit DeepMind for a rumoured £400m. Since then, DeepMind has been expanding its AI capabilities into new areas including gaming and, most notably, healthcare.
The Royal Free, a teaching hospital located in the Hampstead area of London, was one of DeepMind's first healthcare partners. The pair began working on an app called Streams in 2015, and the app has been in the hands of clinicians since January 2017.
While DeepMind is best known as an AI company, the Streams app doesn't at present have any artificial intelligence elements: think of it more as BI for healthcare. The rationale behind the app is simple: to rapidly alert clinicians to acute kidney injury (AKI) in patients, allowing them to respond more quickly.
Acute kidney injury is sudden damage to the kidneys, usually as a result of another serious illness or injury. When the kidneys stop working properly, the body's toxic waste products can build up in the blood and harm other organs. If an AKI becomes serious enough, it can prove fatal. According to the Royal Free, AKIs are linked to 40,000 deaths a year, and £1bn is spent on treating the condition.
The idea of the Streams app is to make sure that the right information about AKIs finds the right members of the hospital team at the right time.
It does that by analysing information, such as details about blood and liver function, from patients on certain wards including obstetrics and those leaving intensive care. It then sends an alert to a clinician's phone to let them know that a patient needs their attention.
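To make the detection-and-alert step concrete, here is a deliberately simplified sketch. The staging thresholds follow the published national AKI guidance (a serum creatinine ratio against baseline of roughly 1.5, 2 and 3 for stages 1 to 3), but the function names and logic below are illustrative assumptions, not Streams's actual implementation:

```python
# Illustrative only: a toy creatinine-ratio AKI check in the spirit of the
# national AKI detection algorithm. NOT the actual Streams code.

def aki_stage(current_creatinine: float, baseline_creatinine: float) -> int:
    """Return an AKI stage (0 = no alert) from the ratio of the latest
    serum creatinine to the patient's baseline value."""
    ratio = current_creatinine / baseline_creatinine
    if ratio >= 3.0:
        return 3
    if ratio >= 2.0:
        return 2
    if ratio >= 1.5:
        return 1
    return 0

def should_alert_nurse_lead(stage: int) -> bool:
    """Per the article, the nurse lead's phone is alerted for the more
    serious level two and three AKIs."""
    return stage >= 2
```

In this sketch, a patient whose creatinine has tripled against baseline would trigger a stage-three alert to the clinician's phone, while a smaller rise would be staged lower or not flagged at all.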
In the Royal Free, around six alerts are sent out per day, four of which are dealt with by the renal team and two by specialist nurses.
Sarah Stanley is a clinical nurse lead at the Royal Free, and one of the clinicians who use Streams in their daily work. The app is held on a dedicated smartphone that alerts her to level two and three AKIs (the more serious stages of acute kidney injury) that need her attention.
The app crunches the data to detect in-patients suffering from acute kidney damage; what happens next is down to the doctors and nurses who get the alert. By opening the app, the clinician can get a snapshot of the patient's condition. By scanning which blood results are out of whack, they can start to build up a picture of what might be behind the AKI -- a low haemoglobin and raised urea might be indicative of blood loss, while a raised white cell count might suggest infection, for example.
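The pattern-spotting the article describes can be pictured as a simple lookup from out-of-range markers to clinical hints. The marker names, reference ranges and mappings below are illustrative assumptions for the article's two examples, not anything Streams actually surfaces:

```python
# Hypothetical helper: flag blood markers against reference ranges and map
# the article's example patterns to clinical hints. Ranges are illustrative.

NORMAL_RANGES = {
    "haemoglobin_g_l": (130.0, 170.0),
    "urea_mmol_l": (2.5, 7.8),
    "white_cells_10e9_l": (4.0, 11.0),
}

def flag_markers(results: dict) -> dict:
    """Classify each blood marker as 'low', 'normal' or 'high'."""
    flags = {}
    for marker, value in results.items():
        low, high = NORMAL_RANGES[marker]
        flags[marker] = "low" if value < low else "high" if value > high else "normal"
    return flags

def possible_causes(flags: dict) -> list:
    """Mirror the article's examples: low haemoglobin plus raised urea may
    indicate blood loss; a raised white cell count may suggest infection."""
    hints = []
    if flags.get("haemoglobin_g_l") == "low" and flags.get("urea_mmol_l") == "high":
        hints.append("possible blood loss")
    if flags.get("white_cells_10e9_l") == "high":
        hints.append("possible infection")
    return hints
```

The point of the sketch is that the app flags the anomalies; interpreting them and deciding on treatment remains the clinician's job.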
The app also fills in other information about the patient: text summaries of their X-rays, for example, and details of previous hospital admissions. "If you can see in under a minute they have had 20 visits to the elderly care ward, then that gives you quite a good clinical picture," Stanley said.
Older people are particularly at risk of an AKI. Not only does kidney function diminish naturally as people age, but those over 65 also have an increased likelihood of having another serious long-term health condition, and tend to take longer to recover from knocks to their health. It's common for an elderly patient to come in with a pneumonia that can go on to cause an AKI, or for a patient to have been kept nil by mouth ahead of an operation and, being disinclined to drink afterwards, get so dehydrated their kidneys suffer as a result.
Streams allows such AKIs to be detected earlier -- a matter of several hours, according to Stanley. "They would have been picked up [before], but probably not until the next day on the routine bloods," she said. Rather than have to find a working PC and sort through the patient's electronic record to find all the information needed to decide if they've had an AKI, the information is surfaced straight into the clinician's hand. "It's a massive time-saver. You save one to two hours a day just through filtering information," Stanley added.
Finding an AKI earlier can mean that it can be treated sooner, and so the damage to the kidney and other organ systems is minimised. It can even be of benefit for patients that are terminally ill: "If the patient is imminently dying and we're getting to see them a day sooner, then we can make plans with them and their family about what could happen," Stanley said.
In the future, the Streams app could be more of a two-way affair when it comes to data flow. Rather than just surfacing analysed data to clinicians, it could be used to study performance of clinical teams -- recording how long it takes to respond to an AKI alert, for example, and patient outcomes related to different clinical activities. "Pulling data back is a massive plus... you are only learning from the data you collect," Stanley said.
There are currently two Streams phones in use, one by the day team and one by the night team. One of the devices is always on charge while the other is in use. While consultants can access the app over a VPN from home, for most staff, the app won't work when it's not connected to the Royal Free's wi-fi, keeping patient data accessible only within the bounds of the hospital.
The app is, according to Stanley, "very user friendly". That's perhaps not surprising given the work that DeepMind has put into the design of the UI -- every element of NHS workflows and UX has been carefully scrutinised and refined.
Even the model of phone chosen as the test device for Streams had thought behind it: it needed to be precisely the right size for the pocket of hospital scrubs, so it wouldn't fall out while users go about their work on the ward. (Interestingly, considering DeepMind's ownership, it's an iPhone, but an Android version of the setup is planned.)
For the design of the app itself, DeepMind spent time trying to make the app intuitive: for example, staff were shown icons without context and asked what action -- a tap, a swipe -- each made them want to take, to create a UI that's as simple to use as possible. "It's always the way of IT things -- people are worried they will create lots of work, but when they use [Streams], they instinctively know how it works," Stanley said.
DeepMind even created a custom alert noise for Streams, to make sure that it couldn't be confused with any other alert sound a clinician might hear, and so would always get their ear.
The design had to not only work for nurses and doctors, it had to suit patients too. Stanley said that clinicians can use the app to help explain to patients about changes in their kidney health. Because the Streams app shows changes through graphs and trend lines, patients can clearly see if certain markers of kidney health are spiking or falling, and by how much, making the progress of their condition easier to grasp.
But not all patients whose data passed through Streams were aware of it. Earlier this year, a year-long investigation by the Information Commissioner's Office found that the 1.6 million patients whose data was used in testing weren't sufficiently informed that their information was being used in the pilot. The health trust that the Royal Free is part of, according to the ICO, wasn't as transparent as it should have been, and was advised of several measures it had to take to bring its handling of patient data into line with the Data Protection Act (there is now an opt-out form on the Royal Free's website). However, the research project itself was allowed to continue.
Though the two organisations have signed a five-year deal to work together, there's no projected end date to the DeepMind research going on at the Royal Free. Stanley is hoping that the pace of the implementation picks up. It would be a "tragedy" if the trust doesn't roll out Streams further, she said. "We are not embracing it quickly enough."