AI makes inroads in life sciences in small but significant ways: Lantern Pharma’s quest

Lantern Pharma, a tiny startup in Dallas, is helping to rescue failed cancer drugs with the application of neural networks. It's not deep learning, but it shows that very basic machine learning can be a valuable tool for life sciences.
Written by Tiernan Ray, Senior Contributing Writer

Artificial intelligence is everywhere in life sciences, and yet, it's hard to know just what it all means. Plenty of headlines tout AI as something that is already making breakthrough diagnoses, an automated diagnostician that is already replacing radiologists. There's a lot of hype, but what can AI really do?

Lantern Pharma, a five-year-old, privately held biotech startup based out of twin headquarters in Dallas and Kearny, New Jersey, is taking existing drugs and seeking to secure new uses for them as more finely tailored cancer-fighting agents. 

ZDNet interviewed the chief executive, a seasoned entrepreneur named Panna Sharma, and subsequently followed up with specific questions, such as how the company's approach differs from plain old statistics. 

What resulted was an eight-page FAQ, emailed to ZDNet as a PDF file, with details about how machine learning is used to simulate some of the testing that happens in human drug trials. It's an enlightening view of science that goes beyond the headlines. 

It turns out machine learning forms of AI are making inroads in the work of medical diagnostics in small ways, but ways that may make a big contribution at some point down the road. 

"We are not inventing AI, we are leveraging AI, leveraging AI for a very specific question," Sharma told ZDNet. 

That question is whether drugs that never won approval might still offer a cure if the right patient population can be found for them.  

Drug re-use, whether for the original indication of a drug, or re-purposing for new use cases, is a very hot topic in medicine because the whole process is so expensive. The average development cost for cancer drugs these days is $2 billion, according to data gathered by Lantern, and most drugs fail to win approval, so starting from scratch is a big deal.

Also: The subtle art of really big data: Recursion Pharma maps the body

The appeal of re-using drugs is that they may still be promising therapeutic agents; they just may not have been administered to the right population of patients. Unlike de-novo drug candidates, failed drugs have often already passed safety screening, so they are at least past certain hurdles. By re-assessing a drug that wasn't approved, Lantern may be able to salvage all that development money and streamline its own outlay. 

The challenge with these drug trials is mostly enrollment, according to Lantern, meaning, who is selected for the trials. Because cancer can have many varieties, it matters on whom the drug is being tested. Lantern is using technology to identify the patients whose genetic makeup may make them most susceptible to the drug. It is really the advent of personalized medicine, at least in simplified form. 


Lantern is working on three different therapies that had been left aside after showing some progress in clinical trials but not winning approval. They address a variety of cancers, including prostate, ovarian, and non-small-cell lung cancer.  

Lantern is a tiny company with just nine full-time employees and three contractors. They are not going to run massive, multimillion-dollar drug trials on their own. But what they can do is test drugs in simulation before a trial happens and then partner with larger firms. 

The company's software platform is called "RADR," which stands for "Response Algorithm for Drug Positioning and Rescue." Not all of this is artificial intelligence. The process starts with choosing which of thousands of genes are likely responsive based on historical statistics about those genes, what is known as "feature selection."
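The feature-selection step can be illustrated with a small sketch in Python using scikit-learn. Everything here — the synthetic gene-expression data, the gene counts, and the use of an F-test filter — is an illustrative assumption, not Lantern's actual RADR pipeline:

```python
# Illustrative sketch only: a statistical feature-selection pass that
# narrows thousands of genes to a shortlist. Data are synthetic.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_regression

rng = np.random.default_rng(0)
n_samples, n_genes = 120, 5000                 # thousands of candidate genes
X = rng.normal(size=(n_samples, n_genes))      # expression level per gene, per sample
# Simulated drug response driven by a handful of genes plus noise
y = X[:, :5] @ np.array([2.0, -1.5, 1.0, 0.8, -0.6])
y += rng.normal(scale=0.5, size=n_samples)

# Score every gene against response and keep the 30 strongest
selector = SelectKBest(score_func=f_regression, k=30)
X_short = selector.fit_transform(X, y)
shortlist = np.flatnonzero(selector.get_support())

print(X_short.shape)      # (120, 30)
print(shortlist[:5])      # indices of some selected genes
```

In this toy setup, the genes that actually drive the simulated response land on the shortlist because their F-statistics dwarf those of the noise genes; a real pipeline would rank genes against historical response data instead.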

That process leads to a shortlist of tens of genes that may be responsive. Lantern takes tissue samples from prospective patients and tests the individuals' unique genetic profiles, looking for the combination of genes that represents a "signature" that may be predictive of drug response. 

Here is where machine learning comes in. The company uses the "neuralnet" package for the R statistics language to build a neural network of multiple layers and train it to predict drug response based on the presence of the genes picked out in step one. 


Lantern's "RADR" software operation involves multiple stages, starting with a database search that narrows down candidate genes to a short list, a process of feature selection, and ending with a prediction phase that models how the combination of candidate genes affects drug response, via the use of neural networks. 

Lantern Pharma
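Lantern's prediction step runs in R; a rough equivalent can be sketched in Python with scikit-learn's MLPRegressor. The data, layer sizes, and response function below are synthetic illustrations, not Lantern's model:

```python
# Sketch of the prediction phase: a small multi-layer network trained to
# predict a non-linear drug response from shortlisted genes. Synthetic data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_samples, n_genes = 200, 30                   # shortlisted genes from step one
X = rng.normal(size=(n_samples, n_genes))
w = rng.normal(size=n_genes)
y = np.tanh(X @ w) + rng.normal(scale=0.1, size=n_samples)  # non-linear response

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(16, 8),  # two hidden layers
                   max_iter=5000, random_state=0)
net.fit(X_tr, y_tr)
print(round(net.score(X_te, y_te), 2))          # R^2 on held-out samples
```

The point of the sketch is the shape of the problem: tens of input features, a non-linear mapping to a continuous response, and a held-out set to check whether the network generalizes.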

In the FAQ that Sharma prepared for ZDNet, the company details how it compared the neural network with other machine learning approaches, including "K nearest neighbors," "support vector machine," and "random forests," all of which had established themselves before neural nets gained prominence in the past decade. The company didn't even try standard linear regression analysis, it said, because linear regression doesn't capture the non-linear response patterns that arise in the interactions of drugs and genes. 

After testing these methods, the company found that the neural network "provides faster and more accurate predictions than others," it states in the white paper. 

"From our research and experiments, it is evident that ANN [artificial neural network] is the most optimal algorithm capable of capturing non-linear relationships as well as providing higher prediction accuracy even for small or limited datasets when compared to other models."
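The kind of head-to-head comparison the FAQ describes can be sketched with scikit-learn. The models below match the families the company names, but the data are synthetic, so the scores say nothing about Lantern's own results:

```python
# Illustrative comparison of KNN, SVM, random forest, and a neural net
# on synthetic non-linear data, scored by cross-validated R^2.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(150, 10))
y = np.sin(X[:, 0]) * X[:, 1] + rng.normal(scale=0.1, size=150)  # non-linear

models = {
    "knn": KNeighborsRegressor(n_neighbors=5),
    "svm": SVR(kernel="rbf"),
    "random_forest": RandomForestRegressor(n_estimators=200, random_state=0),
    "neural_net": MLPRegressor(hidden_layer_sizes=(32, 16),
                               max_iter=5000, random_state=0),
}
scores = {name: cross_val_score(m, X, y, cv=3).mean()
          for name, m in models.items()}
for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name:14s} {s:.2f}")
```

Which model wins depends heavily on the dataset; the article's claim that the neural net came out ahead applies to Lantern's gene-response data specifically.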

Lantern's scientists had to do several things to make possible the use of neural networks, including creating the train, validation, and test data sets; performing "stratified three-fold cross-validation" (to avoid over-fitting); using "grid search" to tune the hyper-parameters of the neural net; and a variety of evaluations for things such as sensitivity, specificity, mean absolute error and root mean square error.
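The tuning and evaluation steps listed above can be sketched as follows. The parameter grid, data, and network are illustrative assumptions; only the general machinery (grid search, three-fold cross-validation, MAE and RMSE) mirrors what the company describes:

```python
# Sketch of hyper-parameter tuning via grid search with three-fold
# cross-validation, plus MAE and RMSE evaluation. Data are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.metrics import mean_absolute_error, mean_squared_error

rng = np.random.default_rng(3)
X = rng.normal(size=(150, 10))
y = np.tanh(X @ rng.normal(size=10)) + rng.normal(scale=0.1, size=150)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

grid = GridSearchCV(
    MLPRegressor(max_iter=5000, random_state=0),
    param_grid={"hidden_layer_sizes": [(8,), (16, 8)],  # candidate architectures
                "alpha": [1e-4, 1e-2]},                 # L2 regularization strength
    cv=3,                                               # three-fold cross-validation
)
grid.fit(X_tr, y_tr)

pred = grid.predict(X_te)
mae = mean_absolute_error(y_te, pred)
rmse = np.sqrt(mean_squared_error(y_te, pred))
print(grid.best_params_, round(mae, 3), round(rmse, 3))
```

Holding out a final test set, as here, is what keeps the cross-validated tuning honest: the reported errors come from samples the grid search never saw.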

This is not deep learning research of the kind that gets gigantic headlines like DeepMind's AlphaZero program. The RADR software is not teaching itself biology, oncology, and pharmacology from scratch. Instead, RADR rests upon years of scientific information that has been gathered about genetic pathways in the body, the "transcriptome." It also rests upon knowledge of what kinds of drug response levels to look for, as the object of prediction.

Also: This algorithm can predict your risk of ending up in hospital

It may not be as flashy as AlphaZero, but the Lantern work shows a moment in time at which neural networks have become a valuable implement in the toolkit of scientists who need to test for complex interactions. 

Consider the detective work Lantern is performing for a drug called "LP-300." It failed in Phase III clinical trials for non-small-cell lung cancer. But it turned out to be effective with one particular group: female patients who aren't smokers. 

"We know it works very well in a targeted population, which happen to be female non-smokers," explains Sharma. "Why is that? Why do they respond? We know that it's a totally different disease," he says. "That wasn't understood seven years ago when the trial failed."

Using the neural network prediction technology, "we have been able to model out a signature for never-smoking females that we are putting into models and gearing new data to show why this drug works really well in this type of patient population." Lantern is in talks with the FDA and authorities in the UK to get the drug back into the system of approvals, says Sharma. 

Another effort, underway in Derry, Northern Ireland, is testing tissue samples of individuals for their response to a drug called LP-184, a so-called "DNA damage repair inhibitor," which is akin to the drug Irofulven. It is targeted at prostate cancer, but prostate cancer can take twelve different forms, so it's "not just one disease," as Sharma told the local Derry newspaper this summer.

Marketing, distributing, and further developing drugs, of course, can take billions of dollars. Tiny Lantern doesn't have that kind of budget, with just $8 million in funding to date. But Sharma has learned some hard lessons about how to build a biotech. He previously was chief executive of a biotech called Cancer Genetics, ticker "CGIX," a post he resigned in February 2018, after seven years, as the stock price cratered. That company, says Sharma, was initially focused on diagnostic tools, and he believes it could have had a bright future as a developer of drugs, but it never got the capital it needed up-front to develop its pipeline of drugs. 

"We couldn't capitalize all this genetic information we were sitting on," recalled Sharma. "It's really about the portfolio. You have to have a period to incubate the portfolio."

This time around, he is keeping things much simpler. "One of the things we have done is keep internal operations lean," he said. "We outsource almost all lab work, and keep as our core competence the strategy for the drugs and the mechanism and engine." In that sense, Lantern is very different from Recursion Pharmaceuticals, another biotech profiled by ZDNet this year, which has much more funding and insists on maintaining a wet lab in-house. 



The small steps with neural networks by Lantern point toward potential for AI down the road that could be much more significant. Only one part of the development process is AI at this time: the prediction of drug response based on a narrow shortlist of candidate genes. 

In the future, one can imagine Lantern and other companies developing neural networks that can search the entire transcriptome of tens of thousands of genes. AI might be used to cull the candidate genes from the whole list, or, rather than creating a shortlist, AI might just run the prediction network on thousands of genes. Something similar to that is taking shape at the US Department of Energy's Argonne National Lab supercomputing center, using new, powerful computers from startup Cerebras Systems. 

Another possibility is that the connections that are happening in the hidden layers of the neural network will reveal important new "features" and representations that will bring greater insight into genetics and biology. Today, the neural net, as used by Lantern, is producing predictions, but its greater value may lie in what it ultimately articulates about the nature of patterns of genes that affect drug response.

So, yes, AI has a place in medicine. It may start small, but the implications of what's already being done are provocative. 
