
Glaxo’s biology research with novel Cerebras machine shows hardware may change how AI is done

A novel computer system built by Cerebras Systems of Los Altos, California, is already leading to changes in the way drug giant Glaxo builds large neural nets for drug development.
Written by Tiernan Ray, Senior Contributing Writer

Artificial intelligence research has experienced a renaissance over the last twenty years, thanks in part to greater computing power, particularly the rise of the graphics processing unit, or GPU.

Now, novel AI computer systems may be poised to have a similarly large impact. They may change not just the speed of AI work but the kinds of experiments that are done in the field.

AI is changing the entire nature of computing, and as part of an inevitable feedback loop, computing will end up changing the nature of AI.

An example of that showed up this week in new work being done by GlaxoSmithKline, the big British drug maker.

Glaxo this week talked up its new AI "hub" in St. Pancras Square in London. The company's head of AI R&D, Kim Branson, took time to talk with ZDNet about the company's partnership with a challenger to Nvidia, Cerebras Systems.


Cerebras's "CS-1" computer, here shown in an exploded view to reveal its components, is the equivalent of a supercomputer for AI calculations in a cabinet the size of a dormitory fridge. 

Cerebras Systems

The work with Cerebras, to hear Branson describe it, takes advantage of new kinds of neural network models that can be developed because of new hardware capabilities, such as much larger memory and throughput.

That would be a break with the traditional practice in the field of AI of designing neural networks without regard to the underlying hardware.

"Typically, we never think physically about the chip," Branson told ZDNet.

"The Cerebras system is pretty unique in that when you even put your neural network architecture on the chip, you actually have to lay it out physically where the data flows through it," said Branson. "So there's some really unique computational changes and experiences you can do with that."

The Cerebras computer, a pint-size supercomputer in a cabinet the size of a dorm fridge, came out last fall and contains the largest computer chip ever built, one covering almost the entire surface of a silicon wafer. The chip is unique, with 400,000 individual computing "cores" that operate in parallel. The Cerebras computer includes software that optimizes how the basic operations of a neural net, known as matrix multiplications, are assigned to each core.
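
The details of how Cerebras's software maps work onto those cores are not public, but the general idea of splitting matrix multiplications across many parallel workers can be sketched in a few lines of Python. The sketch below is purely illustrative; the tile count and thread pool are assumptions made for the example, not the CS-1's actual mechanics.

    # Illustrative only: splitting a large matrix multiplication -- the basic
    # operation of a neural network -- into tiles that run in parallel.
    # This is NOT Cerebras's software; tile and worker counts are arbitrary.
    import numpy as np
    from concurrent.futures import ThreadPoolExecutor

    def matmul_tile(args):
        """Multiply one horizontal tile of A by the full matrix B."""
        a_tile, b = args
        return a_tile @ b

    def parallel_matmul(a, b, n_tiles=8):
        """Split A into row tiles and compute each tile's product independently,
        the way a parallel machine might assign work to separate cores."""
        tiles = np.array_split(a, n_tiles, axis=0)
        with ThreadPoolExecutor(max_workers=n_tiles) as pool:
            results = list(pool.map(matmul_tile, [(t, b) for t in tiles]))
        return np.vstack(results)

    if __name__ == "__main__":
        a = np.random.rand(1024, 512)
        b = np.random.rand(512, 256)
        assert np.allclose(parallel_matmul(a, b), a @ b)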

For Branson and team, the main appeal is the data throughput of the system, the ability to move values of a neural network at incredible speed across those 400,000 cores. 

"The thing about Cerebras is, it has extremely large memory throughput, there's 100 petabits per second you can push through the chip," explained Branson. "Because of the sheer size of the memory we have on a single chip, we can build things we can't build as easily on other distributed architectures."


GlaxoSmithKline's new artificial intelligence "hub," center, in St. Pancras Square in London, is expected to house 100 or more AI researchers once lockdown restrictions are lifted.

Image courtesy of TOG (The Office Group)

That includes very large models, said Branson: "sort-of, stacked encoders, they're very much like natural language processing models, but they have a huge amount of training data you have to flow through them." He was alluding to state-of-the-art neural networks known as "encoder-decoders," such as Google's Transformer, or OpenAI's GPT-3.
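
Branson did not say which framework Glaxo uses, but a "stacked encoder" of the kind he describes can be sketched with off-the-shelf PyTorch components. The depth and dimensions below are placeholder values chosen for illustration, not GSK's actual model.

    # A minimal sketch of a stacked Transformer encoder of the sort Branson
    # alludes to. Sizes and depth are placeholders, not GSK's configuration.
    import torch
    import torch.nn as nn

    embed_dim, n_heads, n_layers = 512, 8, 12  # assumed values, for illustration

    encoder_layer = nn.TransformerEncoderLayer(d_model=embed_dim, nhead=n_heads,
                                               batch_first=True)
    encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)

    # A batch of 4 sequences, 128 tokens each, already embedded.
    tokens = torch.randn(4, 128, embed_dim)
    representation = encoder(tokens)  # shape: (4, 128, 512)
    print(representation.shape)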

But Glaxo is also experimenting with combining individual networks into assemblages. "We often care a lot about models where we have a generative model that studies chemical structure," like the encoder-decoder, "and then we also have to have other ways to understand and feed back into that."

"Often if you have one part on one chip, and another part on another chip, you can't run the whole thing together at wire speed," said Branson.

"One of the cool things about the Cerebras chip is you can put that all together on one piece of hardware." 

Branson's comments suggest that hardware innovation may be moving to the forefront of AI research after many years in which machine learning work simply ran on ever-faster GPUs.

Experts in the field expect new kinds of computers to change the nature of the work done. In a talk last year, Facebook's chief AI scientist, Yann LeCun, who has himself designed AI chips, remarked that "hardware capabilities and software tools both motivate and limit the type of ideas that AI researchers will imagine and will allow themselves to pursue."

There will also be novel neural networks, Branson indicated, that are more of Glaxo's own creation: "some things that are very specific to cellular architecture, and that's inferring gene expression, and to think across different cellular systems."

The ultimate goal, for Glaxo, is to integrate machine learning more intimately within the company's experiments for drug development. 

"There are some very particular use cases that we had in mind for the Cerebras system," said Branson. 

One focus is finding out much earlier in the drug development process whether a particular gene is actually expressed in human tissue. "We understand what the genetic variants are, what we call the distance, but we then have to solve the problem of which gene is influenced by this variance," Branson explained. "That's where we are talking about really, really large data sets."


The "wafer scale engine" in the Cerebras computer, the largest chip ever made, has 400,000 individual computer "cores" to perform the basic mathematical operations of neural networks in parallel. 

Cerebras Systems

"We're about using machine learning systems to solve this target identification problem; we build a model, and make a prediction, and then we can use functional genomics to test that prediction." Branson was referring to CRISPR gene-editing, where knock-outs of particular genes can be made and the effect measured. The CRISPR work takes the role of validation for the neural network. 

"Uniquely at GSK, we now do experiments with the express purpose of improving machine learning models, which are quite different from the experiments a traditional biologist would do," said Branson.

Branson expects his team will be publishing scholarly papers about the work done on the Cerebras computer, perhaps by January, he said. 

All code for Glaxo's neural net models ultimately ends up getting posted on the AI site, Branson noted. 

The work between Cerebras and Glaxo has taken shape over the course of eight to ten months, said Cerebras co-founder and CEO Andrew Feldman. It began with Glaxo approaching Cerebras. "They found us," he said.

"They get great credit," added Feldman. "Big companies don't historically search the landscape for something innovative." Glaxo, he said, "came along with hard and interesting problems, huge data sets and extremely interesting models, fascinating things."

Cerebras has competition in the AI chip market from other startups that are also seeking to take chunks of Nvidia's market, including Graphcore of Bristol, England, Tachyum of Santa Clara, California, and Tenstorrent of Toronto, Ontario, to name just a few.

The Cerebras deal, however, is one of the biggest wins yet for an Nvidia challenger, consisting of "multi-year, multi-million-dollar contracts" between Cerebras and Glaxo, said Feldman.

"They wanted to build relationships, they wanted to stay on the cutting edge over years," he said of Glaxo. "It's great that it's not a one-off."

The Glaxo win is Cerebras's first announced commercial deal. Previously announced deals have all involved academic and government installations, such as Argonne National Laboratory.

Feldman intimated there are more commercial deals on the way, and there is work underway with the military and the intelligence communities.

"We sell to the hyperscalers, we sell to the supercompute world, we sell to the largest of the enterprise, where pharma lives," said Feldman. "That's where oil and gas lives, and we've got announcements down the road in multiple categories there, and, finally, in military intelligence."

Glaxo is also working extensively with the latest chips from Nvidia, and Branson blogged about that work this week. He said the ability to count on Cerebras's forthcoming products was a key consideration in also going with a smaller, less-established firm.

"Hardware is hard, right? And companies go away, or get acquired," observed Branson. "We want to make sure that this is the group that really knows what they're doing, that has something that you can buy today, and for which there are other uses as well, and that will be around."
