How best to treat brain trauma? Analyze big data

What do the stock market and brain trauma treatment have in common? Allow me to explain.
Written by Charlie Schick, Contributor

Charlie Schick, PhD, is Director, Big Data Solutions, Healthcare and Life Sciences, at IBM, driving solution development, sales consulting, and go-to-client activities. You can follow his ramblings on the IBM Big Data Hub and IBM Healthcare Industry blogs or on Twitter as @molecularist. He’ll also keep you up to date with all things big data and healthcare at @IBMBigDataHLS.


Every year, about 1.7 million Americans sustain a traumatic brain injury, according to the Centers for Disease Control and Prevention. And these brain injuries, whether from a common fall, a concussion during a game, or a gunshot wound, contribute to nearly one-third of all injury-related deaths.

Understanding how to minimize the debilitating effects of brain injury, especially how these can lead to death, is crucial. Indeed, the Obama administration initiated a decade-long, multi-billion dollar initiative to examine the workings of the human brain. At the same time, the National Football League is funding research on concussions.

Furthermore, the Brain Injury Association of America, the country’s oldest and largest nationwide brain injury advocacy organization, notes that "brain injuries don't discriminate" and encourages "early and equal access" to treatment.

Yet as widespread as brain injuries are, when it comes to tracking dangerous changes in a patient lying in a hospital bed, there is often little nurses and doctors can do today other than react. A bedside monitor constantly tracks vital signs and alerts the staff only when brain pressure crosses a critical threshold. At that point, the staff must make an instant decision: is the alarm false, or is the condition life-threatening and in need of immediate action?

Tapping into that big data stream
But now, UCLA is teaming up with IBM to figure out how to predict potential problems before they occur. Together, we’re applying big data analytics to give doctors and nurses the advance warning they need to predict changes in a patient’s condition so they can take preventive measures.

UCLA is testing an intracranial pressure monitoring system in the intensive care unit of the Ronald Reagan UCLA Medical Center. It builds on the UCLA department of neurosurgery’s work to implement predictive systems, funded by a $1.2 million grant from the National Institute of Neurological Disorders and Stroke.

This early warning system is designed to detect trending increases in brain pressure, which can turn deadly very quickly, before a threshold has been passed. By analyzing in real time the thousands of vital signs being collected by the bedside monitor and spotting subtle changes in the patient’s pulse, blood and intracranial pressure, heart activity, and respiration, the system can signal that high-risk brain pressure is building.
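The article does not describe UCLA’s actual algorithms, but the idea of flagging a trending increase before a hard threshold is crossed can be sketched in a few lines. The following is a purely illustrative example, with made-up threshold and window values: it computes a least-squares slope over a sliding window of intracranial pressure (ICP) readings and emits a warning when the pressure is climbing steadily, before the conventional alarm fires.

```python
from collections import deque

# Illustrative constants (assumptions, not UCLA's parameters):
ALARM_MMHG = 20.0   # hard ICP alarm threshold
WINDOW = 10         # number of recent samples to examine
SLOPE_MMHG = 0.3    # per-sample rise treated as a dangerous trend

def rising_trend(samples, window=WINDOW, slope=SLOPE_MMHG):
    """Return True if the last `window` ICP samples rise steadily."""
    if len(samples) < window:
        return False
    recent = list(samples)[-window:]
    # Least-squares slope over the window, with x = 0 .. window-1.
    n = window
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(recent) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, recent))
    den = sum((x - mean_x) ** 2 for x in xs)
    return (num / den) >= slope

def monitor(stream):
    """Yield early warnings for upward trends, and alarms at the threshold."""
    buf = deque(maxlen=WINDOW)
    for t, icp in enumerate(stream):
        buf.append(icp)
        if icp >= ALARM_MMHG:
            yield (t, "ALARM: threshold crossed")
        elif rising_trend(buf):
            yield (t, "WARNING: pressure trending up")

# Example: pressure creeps from 12 mmHg toward 21.5 mmHg.
readings = [12 + 0.5 * i for i in range(20)]
events = list(monitor(readings))
```

On this synthetic stream, the warning appears several samples before the threshold alarm, which is the whole point: staff get advance notice instead of a last-second alert. A production system would of course fuse many signals (pulse, blood pressure, respiration) rather than a single one.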

Another dimension opens
New big data tools are evolving that can tap into new kinds of data sources, such as medical monitors. As a result, we’re able to capture and act on insights in ways that simply weren’t possible before.

What’s interesting is that UCLA’s medical innovation is based on the same big data tools that are being used by other organizations to sift through huge masses of high velocity data at rates of up to petabytes a day. The New York Stock Exchange, for instance, relies on the same tools to ferret out irregularities in massive amounts of trading data.

Stock brokers already understand implicitly the importance of analyzing large volumes of fast-moving market data. We hope that the work we are doing with UCLA inspires medical innovators to look at the large volumes of fast-moving data streaming off ICU monitors and other sensors, and to come up with new predictive models that give us new insight into health and patient care. This will open up a whole new dimension in healthcare that wasn’t accessible before, a dimension driven by big data.

This post was originally published on Smartplanet.com
