Something like a pandemic can speed up scientific research.
Case in point: neural networks are turning up new kinds of signals in SARS-CoV-2, the virus that causes COVID-19, that may change how scientists look for cures.
Argonne National Laboratory, one of nine giant supercomputing centers of the U.S. Department of Energy, has for several months been employing massive computing power, including novel machines from Silicon Valley startup Cerebras Systems, to model how existing drugs "dock" with proteins of the virus: that is, how well a drug attaches to the functional part of the virus.
If a drug can be found to dock well, that drug can in theory disable the virus. Predicting such docking scores in a computer simulation can guide scientists to a short list of drugs to try in the lab with live virus.
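The short-listing step can be sketched in a few lines. The compound names and scores below are invented for illustration; by the convention of docking programs such as AutoDock, a lower (more negative) score means tighter binding.

```python
# Hypothetical predicted docking scores for five made-up compounds.
# Lower (more negative) score = tighter predicted binding to the viral protein.
predicted_scores = {
    "compound_a": -9.2,
    "compound_b": -5.1,
    "compound_c": -8.7,
    "compound_d": -3.4,
    "compound_e": -8.9,
}

def short_list(scores, top_k=3):
    """Return the top_k compounds with the most favorable (lowest) scores."""
    return sorted(scores, key=scores.get)[:top_k]

print(short_list(predicted_scores))
```

In practice the scored library is billions of compounds rather than five, but the principle is the same: rank by predicted score, send only the head of the list to the wet lab.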
"That work is showing some promising initial signs, things that are docking and showing some inhibition of the virus," said Rick Stevens, associate laboratory director at Argonne for computing, environment and life sciences, in an interview with ZDNet by phone.
Also: Startup Tenstorrent shows AI is changing computing and vice versa
It's a pace of discovery well outside the norm, even for a high-powered lab like Argonne that turns out research at a rapid clip.
"We are doing in a few months what would normally take a drug development process years to do," said Stevens.
Machine learning has been used for years to predict docking scores, but the present work goes significantly beyond previous efforts: Argonne created a novel setup of three different kinds of deep learning neural networks whose combined score is better than any one network on its own.
But, "that's not the interesting part," said Stevens.
The interesting part, according to Stevens, is that the scientists have found they can make predictions based on images of a molecule. That's unusual because docking is usually done with chemical models of the molecules, known as "fingerprints" and "descriptors." Such representations carry explicit structural information about chemical bonds, in contrast to the undifferentiated mass of pixels in an image.
"The most interesting thing about this isn't the architecture" of the neural networks, said Stevens. "It's the fact that molecular images actually work as a way of representing the molecules."
"We think that there's some underlying reason for why it works so well, and we're in the process of writing that up in several different papers," he said.
"It has more to do with the fact that the inherent image-based representation of a molecule contains information that's much harder to extract from the fingerprints or descriptor-based information."
Neural networks that operate on graphs -- in which a molecule's atoms and bonds are represented as the nodes and edges of a network -- can also provide some of the same information, he noted, "but not in an easily accessible way for programs necessarily."
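To make the fingerprint idea concrete, here is a deliberately toy version. Real fingerprints (such as Morgan/ECFP fingerprints computed with RDKit) hash atom environments from the molecular graph; this sketch merely hashes character n-grams of a SMILES string into a fixed-length bit vector, which shows the general shape of the representation without any real chemistry.

```python
import hashlib

def toy_fingerprint(smiles, n_bits=64, max_fragment=3):
    """Hash substrings of a SMILES string into a bit vector.

    A crude stand-in for a real molecular fingerprint: each character
    n-gram (up to max_fragment long) sets one bit of the vector.
    """
    bits = [0] * n_bits
    for size in range(1, max_fragment + 1):
        for i in range(len(smiles) - size + 1):
            fragment = smiles[i:i + size]
            h = int(hashlib.md5(fragment.encode()).hexdigest(), 16)
            bits[h % n_bits] = 1
    return bits

# SMILES string for aspirin, a well-known small molecule.
aspirin = "CC(=O)OC1=CC=CC=C1C(=O)O"
fp = toy_fingerprint(aspirin)
print(sum(fp), "of", len(fp), "bits set")
```

An image representation, by contrast, is just a grid of pixel intensities; the surprise Stevens describes is that a convolutional network can recover useful chemistry from that grid anyway.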
The point of Argonne's work is to cut the time it takes to get to a promising drug. Starting from scratch presents an incredible combinatorial problem of searching through more than 10-to-the-60th-power possible small-molecule drugs, more than the number of stars in the universe, each of which would have to be tested across some 70 targets on the virus.
Instead, starting with a short list of known molecules of interest, scientists can potentially shave years off of the drug development process, which takes a decade, on average, if starting from scratch.
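The scale of that reduction is easy to check with back-of-the-envelope arithmetic using the article's own figures: more than 10^60 candidate small molecules and roughly 70 viral targets for an exhaustive search, versus about 9,000 existing drugs plus roughly 4 billion library compounds for the short-list approach.

```python
# Figures from the article: >10**60 possible small molecules, ~70 targets,
# ~9,000 existing drugs, ~4 billion compounds in commercial libraries.
exhaustive_tests = 10**60 * 70
short_list_tests = (9_000 + 4_000_000_000) * 70

# How many times larger the exhaustive search space is.
reduction = exhaustive_tests // short_list_tests
print("exhaustive search is larger by a factor of", reduction)
```

The ratio works out to roughly 10^50, which is why nobody attempts the exhaustive search.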
Also: AI on steroids: Much bigger neural nets to come with new hardware, say Bengio, Hinton, and LeCun
"What we're trying to do is produce the best possible list of lead compounds," said Stevens, "including all existing drugs, of which there are only about 9,000 or so, but also streaming through about 4 billion compounds that are articulated in the various libraries that companies have." Testing those 9,000 drugs is a process that is increasingly common in drug development, known as drug repurposing.
This search procedure at Argonne has its roots in cancer research that has been going on for years at the lab, but that has been sped up by the virus. "We just pivoted a lot of our cancer work towards the COVID-19 work," said Stevens.
It's not limited to Argonne itself: the project involves 280 people across twenty institutions, including nine DoE laboratories, but also academic labs, including University of Chicago, University of Michigan, University College London, University of Tennessee, University of Texas, and University of California at San Diego. "It's a big list," said Stevens.
Even with a smaller universe to search, Argonne must make use of massive computing power. To combine the descriptors and fingerprints with the images, Argonne has been making use of a novel computer from Cerebras, based in Los Altos in Silicon Valley.
Cerebras stunned the chip world last August when it unveiled the first computer chip ever that takes up almost an entire wafer of silicon, making it the biggest chip in the world by far. Argonne was the first customer announced for the computer Cerebras built to run the chip, called the CS-1.
The key, said Stevens, is that a CS-1 can "iterate" on combinations of many neural networks without the trouble of tying together numerous CPUs or GPUs.
"[The] Cerebras [machine] is very fast, and in the kind of iterative design workflow that we're doing, we need to test lots of ideas," he told ZDNet.
Also: Cerebras did not spend one minute working on MLPerf, says CEO
"Having the ability to run this network on the Cerebras many times faster than we can run on a conventional GPU has been extremely helpful in refining both the algorithm, but then actually doing the large-scale training that we need to do," said Stevens.
Training is the term used for the initial phase of machine learning, where scientists experiment with modifications to their neural networks to see which versions work best.
The Argonne docking model is actually three different neural networks, arranged in an upside-down tree structure. At the top is a "fully connected" layer into which the three different neural networks feed their individual predictions, like branches merging into a trunk. Two of the three networks are multi-layer perceptrons, each one having what's known as an attention layer. One of those multi-layer perceptrons is for the molecular fingerprints, the other for the descriptors.
The third neural network is based on the ubiquitous convolutional neural network architecture that is widely used in image analysis, the most popular of which has been ResNet. It is this third network, based on convolutions, that has produced the surprising new sources of information about molecules that Stevens finds most interesting. But it is the combination of the three networks that adds up to the greatest predictive accuracy.
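The fusion step described above can be sketched in miniature. In this pure-Python sketch, the three trained branches (attention MLPs and a ResNet-style CNN) are replaced by stand-in random linear maps, and all dimensions are invented; only the shape of the computation, with three feature vectors concatenated and fed to a shared fully connected head, reflects the article's description.

```python
import random
random.seed(0)

def linear(in_dim, out_dim):
    """Random weight matrix standing in for a trained layer."""
    return [[random.gauss(0, 0.1) for _ in range(in_dim)] for _ in range(out_dim)]

def forward(weights, x):
    """Matrix-vector product followed by ReLU."""
    return [max(0.0, sum(w * xi for w, xi in zip(row, x))) for row in weights]

# Stand-ins for the three branches: fingerprint MLP, descriptor MLP, image CNN.
fp_branch   = linear(1024, 8)   # fingerprint bit vector -> 8 features
desc_branch = linear(200, 8)    # descriptor vector      -> 8 features
img_branch  = linear(4096, 8)   # flattened image        -> 8 features
head        = linear(24, 1)     # fused features         -> one docking score

fingerprints = [random.random() for _ in range(1024)]
descriptors  = [random.random() for _ in range(200)]
image_pixels = [random.random() for _ in range(4096)]

# Concatenate the three branch outputs, then apply the fully connected head.
fused = (forward(fp_branch, fingerprints)
         + forward(desc_branch, descriptors)
         + forward(img_branch, image_pixels))
score = sum(w * f for w, f in zip(head[0], fused))
print("combined docking score:", score)
```

The point of the tree shape is that the head can learn how much to trust each branch, which is how the combination beats any single network.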
That trifecta can run on any kind of computer, but the enormous size of such networks would mean "you can spend a lot of time trying to tune that across large numbers of GPUs in a cluster," said Stevens.
The enormous amount of on-board memory in the Cerebras WSE chip plays an important role in the image manipulation, said Jessica Liu, principal product manager at Cerebras, who has been working with Argonne on the project. The WSE has 18 gigabytes of on-chip memory, whereas GPUs typically carry only megabytes' worth. The small memory of each GPU means data must go back and forth to off-chip memory, which raises the problem of partitioning and distributing workloads.
Also: Cerebras's first customer for giant AI chip is supercomputing hog U.S. Department of Energy
"On a standard GPU, you couldn't go to a very large image size before you fill up all of the SRAM" on the chip, said Liu. "And so you would have to go to distributed training, and you might have to go to distributed model parallelism, which is very difficult." Liu was referring to the technique of breaking a neural network into many components that each run in parallel on a separate chip.
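Rough arithmetic illustrates Liu's point. The 18 GB figure for the WSE comes from the article; the GPU SRAM budget, channel count, and batch size below are invented round numbers for illustration.

```python
def activation_bytes(height, width, channels=64, batch=32, bytes_per_value=4):
    """Memory needed to hold one conv layer's float32 activations."""
    return height * width * channels * batch * bytes_per_value

gpu_sram = 40 * 2**20   # assume ~40 MB of on-chip SRAM on a high-end GPU
wse_sram = 18 * 2**30   # 18 GB on the Cerebras wafer, per the article

for side in (64, 256, 1024):
    need = activation_bytes(side, side)
    print(f"{side}x{side} image: {need / 2**20:.0f} MB, "
          f"fits GPU SRAM: {need <= gpu_sram}, fits WSE: {need <= wse_sram}")
```

Because activation memory grows with the square of the image side, a resolution that fits comfortably on the wafer overflows GPU SRAM quickly, forcing the distributed model parallelism Liu describes.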
"And so we're going to make it possible for them [Argonne] to run very quickly on larger-resolution images," said Liu.
"Collaborating with this class of researcher is what we built the company for," Cerebras co-founder and CEO Andrew Feldman told ZDNet.
"They're running 30, 50 days on machines that cost a quarter of a billion dollars to do the work that we're doing in a single machine that's the size of a dorm fridge," said Feldman. "And that's enormously exciting."
In addition to the Cerebras system, Argonne will also be doing COVID-19 work on the new "A100" chips from Nvidia, announced a week ago.
"We're one of the first DGX 3 customers," said Stevens, referring to the multi-processor system Nvidia has designed for the A100. "It's been running now a couple weeks, and we're using that, again, on COVID-19."
"There's a ton of computing that we have to do on this problem," he said.
Asked about competition from the new Nvidia device, Cerebras's Feldman remarked, "I think it was interesting that Rick chose to be out talking about us knowing that that was coming," meaning, the Nvidia A100 announcement.
As to whether Argonne has any results to share, Stevens remarked, "some of those molecules are showing inhibition, so that's good."
There is a long chain of steps beyond the computer work to bring any molecule from pure speculation to promising drug candidate. Compounds that show promise are tested against live virus assays, and then go on to be tested in animals. Any molecules that are not existing drugs will need to be synthesized by chemists.
Also: Cerebras has at least a three-year lead on competition with its giant AI chip, says top investor
"A handful of hits have progressed down the experimental pipeline, and we're trying to validate whether the computational work actually holds up in the experimental work, and a number of those are progressing to wholesale assays to test" against the virus, he said.
Argonne is moving on multiple other fronts as well, including identifying antibody design through machine learning, understanding the protein structure of the virus itself, and looking for targets in humans, meaning, "drug targets in the human that would interrupt the interaction between the virus proteins and human proteins," said Stevens.
Further background can be found in a virtual presentation that Stevens and colleagues made this week. Another resource is the DoE's National Virtual Biotechnology Laboratory.
As for publication, papers are in the process of being written up, said Stevens, and should start to appear "in the next few weeks."
"All this data has to go through peer review," he noted, "we're not trying to rush to press with something half-baked."