Technology is one of the most cumulative practices in the world, in the sense that every invention builds upon the successes and failures that came before it.
On Wednesday, AI startup Cerebras Systems was honored for carrying on that tradition in a ceremony at the Computer History Museum in Mountain View, California. The Museum has put up a display featuring the "Wafer-Scale Engine 2," or WSE-2, the second generation of the company's AI chip and the biggest computer chip ever made. The chip was introduced last year to power new versions of Cerebras's AI supercomputer, the CS-2.
"It is the honor of a lifetime to be inducted into the Computer History Museum's world-renowned collection," said Andrew Feldman, co-founder and CEO of Cerebras, in an interview with ZDNet via Zoom.
"The scale of what you've done is very powerful," said Dan'l Lewin, who is President and CEO of the Computer History Museum, in the same interview with Feldman. "This is a milestone in a journey forward," added Lewin, "the implications are stunning."
Humanity, said Lewin, is at a turning point where computer technology can either help solve big problems such as climate, or lead to a form of enslavement.
"The whole purpose, from my accumulated experience in this industry — if you go back to [Douglas] Engelbart [inventor of the computer mouse] and the rationale for the mother of all demos — well, we the human species created problems that are exponentially impacting us.
"We do have a bunch of real problems on a global scale," said Lewin. Technology can help or make things worse, he suggested.
"Being able to optimize compute in this manner," he said of Cerebras's chip, "will hopefully teach us that there are more potential positive uses of these technologies than the unfortunate things that have occurred as a result of some business models that have surfaced that are, effectively, programming the population without them really being aware of it."
The WSE-2 chip, and its predecessor, introduced three years ago, mark an epochal achievement in the history of fabricating transistors, the building blocks of all electronics, as an integrated part. The first integrated circuits, transistors fabricated together as a single manufactured object, were a breakthrough achieved in the late 1950s independently by Texas Instruments engineer Jack Kilby and Robert Noyce, then at Fairchild Semiconductor and later a co-founder of Intel. Those first parts combined just a handful of transistors.
Then, in 1965, Gordon Moore, who would go on to co-found Intel with Noyce, hypothesized that refined manufacturing approaches would lead to an exponential increase in the number of transistors integrated on a single silicon chip. His prediction proved right, coming to be known as "Moore's Law." That sustained growth in transistor counts made possible the digital age, from minicomputers to personal computers to smartphones, data networking, electronics embedded in vehicles, and the Internet of Things.
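The growth Moore described is simple compounding. A minimal sketch, using the commonly cited two-year doubling period and the Intel 4004's roughly 2,300 transistors (1971) as an illustrative baseline, not exact figures:

```python
# Moore's Law as commonly stated: transistor counts double roughly
# every two years. Baseline values here are illustrative.
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count, assuming steady doubling."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1991, 2011, 2021):
    print(year, f"{transistors(year):,.0f}")
```

Fifty years of doubling every two years is a factor of about 33 million, which is how a 2,300-transistor chip becomes a tens-of-billions-of-transistors one.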
The WSE-2 chip packs 2.6 trillion transistors, almost fifty times as many as the biggest GPU chip today from Nvidia, into a silicon substrate measuring 46,225 square millimeters, almost the entirety of a twelve-inch semiconductor wafer from which numerous chips typically are cut.
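The "almost fifty times" figure is straightforward to check against public vendor specifications, here assuming Nvidia's A100, at roughly 54.2 billion transistors, as the comparison GPU:

```python
# Rough arithmetic behind "almost fifty times". Both figures are
# publicly stated vendor specs at the time of the comparison.
wse2_transistors = 2.6e12   # Cerebras WSE-2
a100_transistors = 54.2e9   # Nvidia A100 (assumed comparison chip)

ratio = wse2_transistors / a100_transistors
print(f"{ratio:.0f}x")  # roughly 48x
```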
The chip contains 850,000 individual "cores" to process AI instructions in parallel.
The WSE technology realized a quest that had been going on for decades in the chip world, to make a single chip that would make use of an entire wafer. Cerebras's success came in part from going back to past failures and finding a new way to approach the problem.
"It means a lot to us that this institution has recognized the scope of the effort that had crashed and burned previously in other incarnations," said Cerebras's Feldman, "even by some of the founding fathers of our industry — even Gene Amdahl couldn't get it to work."
Mainframe computer pioneer Gene Amdahl had tried and failed in the 1980s to make such a monolithic part. The widespread impression the chip industry took from his attempt was that making a wafer-sized single chip was so difficult as to be practically impossible.
"When we went back and re-examined, in part it was the discipline to not take received wisdom that this can't be done, and look at in fact what couldn't be done and when, and what progress had been made," said Feldman of Cerebras's approach.
"When you look back and say Gene Amdahl failed at wafer scale, he was building it on a two-inch wafer," explained Feldman. "His wafer was smaller than something everybody makes today, and nobody thinks about that, and the tools that they had."
Advances since that time in silicon manufacturing processes, and chip design software tools, meant that the wafer-scale attempt was more feasible when Cerebras arrived at the problem thirty years later.
"The number of elements that we use that others have invented in order to take a big step forward is enormous," said Feldman.
As to the implications, Feldman, a serial entrepreneur in realms of networking and computing technology, is also mindful that the ramifications of their achievements often elude inventors and entrepreneurs.
"In the early part of my career, we built the early switches and routers that made IP switching approximately free, and all of us — Cisco, and Juniper and 3Com — we never once thought that something like WhatsApp would change the world," he said, referring to Meta Platforms' free Internet communications application.
"We knew good things would happen if you made communication almost free," he said. "When you embark on that trajectory, all sorts of other stuff steps up and builds on top of you, and others do things you couldn't imagine."
The Cerebras breakthrough, said Lewin, has in one sense to do with broadening the access of people to digital tools that were previously the captive realm of specialists.
"Industries go through transitions from vertical to horizontal," said Lewin. "There used to be these huge industries that were very vertical that were targeted at automating rational tasks like the word processing industry, and CAD/CAM as an industry," he said, referring to computer-aided design. The progress of microprocessors led to horizontal apps such as Microsoft Office that made those previously vertical industries "approachable by so many people."
The Museum's award, said Lewin, is a peek into the future as much as a rumination on the past. "History is not about the past, it's about the present having a conversation with the past," said Lewin.
He cited an example of the creative dialectic between hardware and software.
In an award ceremony some years ago, programming pioneer Grady Booch was honored for his many inventions such as Unified Modeling Language. "It's software, software, software," Booch told the audience, Lewin recalled.
"Gordon Moore got up and smiled, and said, 'software's interesting, but it's gotta run on something,'" Lewin recalled.
"And so, this handshake and accelerated opportunity, by taking this immense capability and optimizing it in this way, will help drive these changes, these shifts from horizontal to vertical — they're being compressed in time."
As a milestone along the journey, Feldman said the WSE-2 technology should be good for a while. "I think this is the best we have until quantum [computing] arrives, and I am very comfortable that will be some time," he said.
Much like a lifetime achievement award, the honor of a place in a museum could almost seem like the end of things. Feldman expressed the conviction that the Museum's recognition is, instead, a beginning.
"Museums are often warehouses of the past," observed Feldman. "We are now building a great company on the back of some pathbreaking technology, and we don't forget that: both are hard."