Ever since the fine staff at Poole General Hospital looked after me when I suffered an arm fracture a few years back, I have always kept an extra ear open (I generally like to have at least three) for news of software applications that sit within the medical field.
When my morphine drip ran out, it took the night nurse (a burly chap from Bournemouth called Brian) a good half an hour to get to me with more happy juice – and I sat there thinking: ‘why wasn’t this system linked to a pager that he could have been wearing so that he knew the minute I needed attention?’
Actually that’s a lie – I sat there thinking: ‘ow, agh, ow that hurts!’
So, leaving my morphine habit to the side for a moment and taking this idea up a notch or two – how critical does software testing become when building for really important “life-critical” (I’m not sure whether I made that up or heard it somewhere) applications?
I rant thusly because I’ve been mailed a few times this last week about a company called Klocwork that has recently been involved with testing embedded software in medical devices like defibrillators. Not the kind of unit you can wait around for a reboot on really – right?
The USA has a body called the Food and Drug Administration (FDA) that regulates drug companies and medical device manufacturers to make sure their products are safe and effective.
NOTE: The FDA is also responsible for stipulating that consumers know about all side effects of prescription drugs. You know those wacky ads on US TV? “This product may leave you unable to walk, speak or eat without the use of a tube. Some patients have also reported nausea, vomiting and extreme hallucination etc.”
On a serious note, the FDA has issued guidance for proper validation of medical device software in the form of the General Principles of Software Validation. The guidance applies to any "…software used as components in medical devices, to software that is itself a medical device, and to software used in production of the device or in implementation of the device manufacturer's quality system." The FDA's guidance covers all aspects of software development – everything from requirements and design reviews to software maintenance and retirement.
Do we do the same thing here in the UK? I’m sure we do – but I don’t think we’re quite as vocal about it, are we? Is that because there are fewer indemnifying caveats to pay lip service to? Possibly. Is it because our medical software development work is inherently better? There’s no real reason to suggest that it is.
The company that started my train of thought on this subject (as I said earlier) is Klocwork. CTO Gwyn Fisher was recently quoted on his approach to complex testing, and the comment below may go some way to explaining why a special approach is needed for critical medical system software.
“If you have a very large system, it’s made up of a whole bunch of functions, methods and modules all linked together into large system images. The problem, whether you’re using code review, runtime testing, or whatever, is finding enough combinations so you can trace through, from source to sink point, exactly what’s going to go on with a pointer, buffer, array, or whatever it happens to be. You need to point out across multiple different function boundaries, that this pointer you’re grabbing from over here, you’re using inappropriately over here, or this memory you’ve allocated here, you’re never releasing on this particular thread.”
So there you have it. Well, there you have a very small dose (please excuse the terrible pun!) of the kind of approach needed for apps to survive in the programmer’s Petri dish if they are to migrate to life-critical healthcare deployment.
As for me, my nurse recoiled from me when I told her I was a journalist. I think she thought I was from the Daily Mail and about to slag off the health service – nothing could have been further from the truth.