
AI, MD: Five unexpected ways that artificial intelligence will make your healthcare better

In some areas, artificial intelligence can spot disease just as well as human doctors. But it's not the only way AI could really make a difference to healthcare.
Written by Jo Best, Contributor

Could artificial intelligence ever match up to a doctor's skills in diagnosis? Could a piece of software ever outstrip the abilities of the nation's overworked medics? One study concluded it could; but as ever, the reality may be somewhat more complicated.

The report, published in The Lancet Digital Health, was a meta-analysis -- a study of studies -- of how well doctors and AI systems compared when diagnosing particular conditions.

After reviewing 82 studies, the report concluded: "Our review found the diagnostic performance of deep-learning models to be equivalent to that of health-care professionals". One nil to the machines? Perhaps not yet. The paper's authors noted that few of the studies comparing doctors and 'deep-learning algorithms' tested them on the same sample of data, or had their results validated by an external body. They also noted that 'poor reporting' made it difficult to determine the diagnostic accuracy of those involved.


The overall rate of correctly diagnosing a disease was pegged at 87 percent for the AIs and 86.4 percent for healthcare professionals. Should we go ahead and sack a few docs and install Hal in their consulting rooms instead? It's interesting that the results for man and machine should be so closely tied: are AIs replicating the successes and failures of the doctors whose decisions make up their training data, perhaps? It's also worth noting the limitations of what AI is being asked to do in the medical field. Almost exclusively, it's being asked to read pictorial scans – ultrasound, X-ray, CT and beyond.

The AIs are there to spot what are often easily identifiable conditions – a fracture, a mass, changes to the retina. Having clearly defined criteria for a disease makes it easier for AIs, and trainee doctors, to learn.

But scans are rarely that simple. X-rays are often returned to doctors with 'possible evidence of condition X' or 'it could be disease Y, if that fits with the clinical picture'. Scans are never interpreted on their own: they're analysed alongside blood results, historical data, prescriptions from GPs and previous hospital admissions, referral letters, the patient's history (taken once and then taken again), what the nurse told you before they went on their lunch break, and any number of other sources of information.

Interpreting a scan alone will only take you so far.

Similarly, there's a saying in medicine that 'you don't treat the X-ray, you treat the patient'. Doctors will tell you they've seen scans of people they would expect to be in great pain or severely disabled who are managing fine without any help from the medical community; equally, they will have seen scans showing only minor disease or injury where the patient is in far more pain than what's on the screen would suggest. Interpreting scans is useful, but it's only part of the picture – sitting down with a patient will give you far more information. There's another saying in medicine, too: 'listen carefully enough to the patient, and they'll tell you the diagnosis' – that is, if you talk enough with the patient, you won't necessarily need to order a battery of scans and tests in the first place.

Nevertheless, having an AI that can read 800 scans – easily identifiable or otherwise – before a radiologist has had their morning coffee makes a doctor's life easier: if an AI can separate those who need a biopsy from those who need reassurance, it will spare doctors a lot of waiting around. It can take days for the results of a scan to reach the doctor managing a patient's care, meaning a delay until some clinical decisions can be made. AIs would help alleviate those problems. 

But why do human doctors take so much longer to turn around a scan than AIs? It's not a question of competence – it's usually a question of staffing. There simply aren't enough staff who can turn the scan around (staff shortages in the healthcare sector are significant and damaging due to a number of factors, not least constrained budgets). And while AI companies using NHS datasets may offer free or low-cost access to the products that result, it's difficult to imagine that successful diagnostic AIs won't end up making a hole in the budget over the longer term.


While the report may have pointed out the numerous flaws in much of the research comparing human and machine on diagnostic accuracy, I'd be surprised if you could find a medic who doesn't see AI writ large in their future. The idea – and it's an appealing one – is that by taking on tasks such as interpreting scans, the AIs will free up more time for doctors to spend with their patients.

Doctors in the UK would love to spend more time with their patients – it's just that the demands put upon them are so vast that, even when the burdens of reading scans (or waiting for them to be reported, and so on) are removed, there's still far more work that needs doing.

With that in mind, here are my suggestions for AIs that could really make a difference in the health service:

Prescription monitoring

Depending on whose statistics you use, a small percentage of all prescriptions written will contain an error. Sometimes that error will make no difference to the patient; sometimes it can cause them serious harm. Given that a lot of hospital drug charts are still written on paper, you can see where errors creep in – maybe the nurse reading a scrawled prescription mistakes a U for a zero in the dose, maybe a sudden decline in organ function makes a medicine unsuitable, maybe a doctor at the end of a 12-hour shift doesn't notice a drug-drug interaction. A multifunction AI that could analyse a prescription for likely errors, check test results for changes that affect which drugs are suitable, pull in useful information from external sources such as care homes and GPs, spot when repeat prescriptions have served their purpose and should be stopped, and reconcile medications when the patient leaves hospital would not only make patients' lives easier – it could even save a few of those lives too.
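
To make that concrete, here's a minimal sketch, in Python, of the kind of rule-based checks such a system might run over a drug chart. The drug names, dose limits, interaction pairs and thresholds are invented for illustration; a real product would sit on a maintained formulary and live test results.

```python
# A minimal sketch of rule-based prescription checks, using made-up drug data.
# The drug names, dose limits and interaction pairs below are illustrative only.

MAX_DAILY_DOSE_MG = {"drugA": 4000, "drugB": 40}      # hypothetical limits
INTERACTING_PAIRS = {frozenset({"drugA", "drugB"})}   # hypothetical interaction
RENALLY_CLEARED = {"drugB"}                           # review if kidney function drops


def check_prescription(prescriptions, egfr):
    """Return a list of warnings for a patient's drug chart.

    prescriptions: list of dicts like {"drug": "drugA", "daily_dose_mg": 3000}
    egfr: estimated glomerular filtration rate, a rough marker of kidney function
    """
    warnings = []
    drugs = {p["drug"] for p in prescriptions}

    # 1. Dose sanity check, e.g. a 'U' misread as a zero inflating the dose tenfold
    for p in prescriptions:
        limit = MAX_DAILY_DOSE_MG.get(p["drug"])
        if limit is not None and p["daily_dose_mg"] > limit:
            warnings.append(f"{p['drug']}: {p['daily_dose_mg']} mg/day exceeds {limit} mg/day")

    # 2. Drug-drug interaction check
    for pair in INTERACTING_PAIRS:
        if pair <= drugs:
            warnings.append(f"Interaction: {' + '.join(sorted(pair))}")

    # 3. Organ-function check: renally cleared drugs need review if eGFR falls
    if egfr < 30:
        for drug in drugs & RENALLY_CLEARED:
            warnings.append(f"{drug}: review dose, eGFR is {egfr}")

    return warnings


if __name__ == "__main__":
    chart = [{"drug": "drugA", "daily_dose_mg": 6000}, {"drug": "drugB", "daily_dose_mg": 20}]
    for warning in check_prescription(chart, egfr=25):
        print(warning)
```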

Finding those lost to follow-up

'Lost to follow-up' is the phrase used when a person who should have had a repeat appointment with a doctor never did. There are a number of reasons why that might happen – the patient may have moved house and never received their appointment letter, or the doctor forgot to book them back in – but in every case, it means that someone who should have been seen hasn't been. An AI that could spot appointments that should have happened but didn't, and correct the cause – find the patient's new address, remind the doctor of their oversight – could make sure more people get the right treatment.
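
As an illustration, here's a minimal sketch that assumes clinic letters record when a follow-up was requested and that the booking system can be queried for who actually has an appointment; the data structures and field names are invented.

```python
# A minimal sketch of flagging patients lost to follow-up, using invented data.
from datetime import date, timedelta

# Hypothetical records: patient ID, date a follow-up was requested, interval in weeks
followups_due = [
    ("patient-001", date(2024, 1, 10), 6),
    ("patient-002", date(2024, 2, 1), 12),
]

# Hypothetical booking-system extract: patient IDs that have a slot booked
booked = {"patient-002"}


def lost_to_followup(followups, booked_ids, today=None, grace_weeks=2):
    """Return patients whose follow-up is overdue and has never been booked."""
    today = today or date.today()
    overdue = []
    for patient_id, requested, interval_weeks in followups:
        due = requested + timedelta(weeks=interval_weeks + grace_weeks)
        if patient_id not in booked_ids and due < today:
            overdue.append((patient_id, due))
    return overdue


for patient_id, due in lost_to_followup(followups_due, booked):
    print(f"{patient_id}: follow-up was due by {due} and never booked")
```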

Putting an end to DNAs

On a related note, DNAs – short for 'did not attend' – are the bane of many medics' professional lives. When patients book an appointment and don't turn up for it, they waste a slot that could have gone to someone in need. While many surgeries rely on text messages to remind patients of the date and time of their appointment, a more sophisticated AI could reduce non-attendance even further. By consulting a person's socials, calendar, health-tracker and other data, it could work out who's at risk of missing an appointment and prompt them to cancel or rebook.
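
Here's a minimal sketch of what such a risk score might look like, with made-up features and hand-picked weights; in practice the weights would be learned from historical attendance data rather than hard-coded.

```python
# A minimal sketch of a no-show (DNA) risk score, with invented features and weights.

def dna_risk(patient):
    """Return a rough 0-1 risk that the patient will miss their appointment."""
    score = 0.0
    score += 0.3 * min(patient["previous_dnas"], 3) / 3           # past no-shows
    score += 0.2 if patient["days_since_booking"] > 30 else 0.0   # long lead times
    score += 0.2 if not patient["has_mobile_number"] else 0.0     # can't receive reminders
    score += 0.1 if patient["early_morning_slot"] else 0.0
    return min(score, 1.0)


patients = [
    {"id": "A", "previous_dnas": 2, "days_since_booking": 45,
     "has_mobile_number": False, "early_morning_slot": True},
    {"id": "B", "previous_dnas": 0, "days_since_booking": 5,
     "has_mobile_number": True, "early_morning_slot": False},
]

# Patients over a threshold might get a phone call or an easy rebooking link,
# rather than just the standard text reminder.
for p in patients:
    if dna_risk(p) > 0.4:
        print(f"Patient {p['id']}: high DNA risk, send extra reminder")
```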

Developing a virtual secretary

There's nothing like good eye contact to help establish good doctor-patient rapport – and in a lot of consultations, there's nothing like good eye contact, because the doctor is hunched over their keyboard, frantically typing what the patient's saying into their electronic medical record. It seems a bit impersonal at best, rude at worst. Researchers and companies alike are working on AIs that could extract useful information from a doctor-patient consultation and enter it directly into the health record while the two have a much more human conversation. And if, having heard the doctor talk about organising a test or referral, the AI could then book it in, it would be the virtual secretary of GPs' dreams.
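
The hard parts are the speech recognition and the clinical language understanding, but the end product is structured actions pulled out of free text. Here's a deliberately crude sketch of that last step, using simple pattern matching over an invented transcript; a real system would use a speech-to-text pipeline and a proper clinical NLP model rather than regular expressions.

```python
# A crude sketch of pulling actionable items out of a consultation transcript.
# The transcript and patterns are invented for illustration.
import re

TRANSCRIPT = (
    "Your blood pressure looks fine. I'd like to order a chest X-ray to be safe, "
    "and I'll refer you to the cardiology clinic for that murmur."
)

# Very rough patterns for 'order a <test>' and 'refer you to <clinic>'
ORDER_PATTERN = re.compile(r"order an? ([\w\s-]+?)(?:,|\.| to\b| and\b)", re.IGNORECASE)
REFER_PATTERN = re.compile(r"refer you to (?:the )?([\w\s-]+?)(?: clinic)?(?:,|\.| for\b)", re.IGNORECASE)

actions = []
for match in ORDER_PATTERN.finditer(TRANSCRIPT):
    actions.append(("order_test", match.group(1).strip()))
for match in REFER_PATTERN.finditer(TRANSCRIPT):
    actions.append(("refer", match.group(1).strip()))

# These structured actions could then be written to the record or queued for booking
for kind, detail in actions:
    print(kind, "->", detail)
```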

Finding those missing supplies

There's always that one thing on a ward that's missing. The printer might be empty, there's a shortage of vacutainers, or all the blood culture bottles have vanished – and there's nothing in the storeroom. Nurses and doctors then embark on a magical mystery tour of the hospital, hunting down the elusive supplies, bargaining with a sister for a few spares until the next batch comes in. An AI that could intelligently predict demand for medical supplies, place orders for replacements ahead of time, or at the very least tell you which ward has the most plentiful resources, could really save some legwork – not to mention speed up treatment. 
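
Even a simple forecast would go a long way here. Below is a minimal sketch that estimates demand with a moving average over invented usage figures and suggests a reorder quantity; the stock levels, lead times and safety margin are all hypothetical.

```python
# A minimal sketch of demand forecasting and reordering for ward supplies,
# using a simple moving average over invented usage figures.
from statistics import mean

# Hypothetical daily usage of blood culture bottles over the last two weeks
daily_usage = [14, 12, 18, 15, 11, 16, 20, 13, 17, 15, 19, 14, 16, 18]
current_stock = 60
lead_time_days = 3        # how long a replacement order takes to arrive
safety_margin = 1.5       # buffer for busy days


def reorder_quantity(usage, stock, lead_time, margin, window=7):
    """Suggest how many units to order so stock covers the delivery lead time."""
    forecast_per_day = mean(usage[-window:])          # moving average of recent demand
    needed = forecast_per_day * lead_time * margin    # expected demand until delivery
    return max(0, round(needed - stock))


order = reorder_quantity(daily_usage, current_stock, lead_time_days, safety_margin)
if order > 0:
    print(f"Order {order} blood culture bottles now")
else:
    print("Stock should cover the next delivery cycle")
```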
