Technology is among the most cumulative of human practices, in the sense that every invention builds upon the successes and failures that came before it.
On Wednesday, AI startup Cerebras Systems was honored for carrying on that tradition in a ceremony held at The Computer History Museum in Mountain View, California. The museum has put up a display featuring the Wafer-Scale Engine 2, or WSE-2, the second generation of the company's AI chip and the biggest computer chip ever made. The chip was introduced last year to power new versions of Cerebras' supercomputer, the CS-2.
"It is the honor of a lifetime to be inducted into the Computer History Museum's world-renowned collection," said Andrew Feldman, co-founder and CEO of Cerebras, in an interview with ZDNet via Zoom.
"The scale of what you've done is very powerful," said Dan'l Lewin, president and CEO of the Computer History Museum, in the same interview with Feldman. "This is a milestone in a journey forward," added Lewin. "The implications are stunning."
Humanity, Lewin said, is at a turning point where computer technology can either help solve big problems such as climate change or lead to a form of enslavement.
"The whole purpose, from my accumulated experience in this industry — if you go back to [Douglas] Engelbart [inventor of the computer mouse] and the rationale for the mother of all demos — well, we the human species created problems that are exponentially impacting us.
"We do have a bunch of real problems on a global scale," Lewin said. Technology can help or make things worse, he suggested.
"Being able to optimize compute in this manner," he said of Cerebras's chip, "will hopefully teach us that there are more potential positive uses of these technologies than the unfortunate things that have occurred as a result of some business models that have surfaced that are, effectively, programming the population without them really being aware of it."
The WSE-2 chip and its predecessor, introduced three years ago, mark an epochal achievement in the history of fabricating transistors, the building blocks of all electronics, as a single integrated part. The first integrated circuits, in which a handful of transistors were fabricated together as a single manufactured object, were a breakthrough achieved separately in the late 1950s by Texas Instruments engineer Jack Kilby and by Robert Noyce, then at Fairchild Semiconductor and later a co-founder of Intel.
Then in 1965, Gordon Moore, a Fairchild colleague of Noyce's who would also go on to co-found Intel, hypothesized that refined manufacturing approaches would lead to an exponential increase in the number of transistors integrated on a single silicon chip. His guess proved right and came to be known as Moore's Law. That sustained growth in transistor counts made the digital age possible, from minicomputers to personal computers to smartphones, data networking, electronics embedded in vehicles, and the Internet of Things.
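Moore's observation is often restated as a fixed doubling period. A minimal sketch of how that compounding works (the function, the two-year period, and the 1971 starting point of roughly 2,300 transistors for the Intel 4004 are illustrative assumptions, not figures from the article):

```python
def projected_transistors(start_count, start_year, target_year, doubling_years=2):
    """Project a transistor count forward assuming a fixed doubling period."""
    periods = (target_year - start_year) / doubling_years
    return start_count * 2 ** periods

# Roughly 2,300 transistors in 1971 (Intel 4004 era); fifty years of
# doubling every two years compounds into the tens of billions.
print(round(projected_transistors(2300, 1971, 2021)))  # → 77175193600, ~77 billion
```

Twenty-five doublings over fifty years is what turns thousands of transistors into tens of billions, which is why the growth is described as exponential rather than merely steady.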
The WSE-2 chip packs 2.6 trillion transistors, almost 50 times as many as the biggest GPU chip today from Nvidia, onto a silicon substrate measuring 46,225 square millimeters, almost the entirety of a 12-inch semiconductor wafer from which numerous chips typically are cut.
The chip contains 850,000 individual "cores" to process AI instructions in parallel.
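Those figures can be sanity-checked with back-of-envelope arithmetic. The transistor count, area, and core count are from the article; the Nvidia comparison point (roughly 54 billion transistors, an A100-class GPU of the time) is an assumed ballpark:

```python
# Figures from the article, except gpu_transistors, which is an assumed
# ballpark for Nvidia's biggest contemporary GPU (A100-class, ~54 billion).
wse2_transistors = 2.6e12
gpu_transistors = 54e9
wse2_cores = 850_000

ratio = wse2_transistors / gpu_transistors
print(f"{ratio:.0f}x the GPU's transistor count")  # ~48x, i.e. "almost 50 times"

# 46,225 mm^2 works out to about 215 mm on a side, carved from a
# 300 mm (12-inch) wafer with the square's corners clipped to the circle.
side_mm = 46_225 ** 0.5
print(f"about {side_mm:.0f} mm per side")
```

The arithmetic bears out the article's "almost 50 times" comparison and shows why a single die of this size consumes essentially the whole wafer.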
The WSE technology realized a quest that had been going on for decades in the chip world: to make a single chip that would make use of an entire wafer. Cerebras's success came in part from going back to past failures and finding a new approach to the problem.
"It means a lot to us that this institution has recognized the scope of the effort that had crashed and burned previously in other incarnations," Cerebras' Feldman said, "even by some of the founding fathers of our industry — even Gene Amdahl couldn't get it to work."
Mainframe computer pioneer Gene Amdahl had tried and failed in the 1980s to make such a monolithic part. The general impression his attempt left on the chip industry was that making a wafer-sized single chip was so difficult as to be practically impossible.
"When we went back and re-examined, in part it was the discipline to not take received wisdom that this can't be done, and look at in fact what couldn't be done and when and what progress had been made," Feldman said of Cerebras' approach.
"When you look back and say Gene Amdahl failed at wafer scale, he was building it on a two-inch wafer," Feldman explained. "His wafer was smaller than something everybody makes today, and nobody thinks about that, and the tools that they had."
Advances since that time in silicon manufacturing processes and chip design software tools meant that the wafer-scale attempt was more feasible when Cerebras arrived at the problem 30 years later.
"The number of elements that we use that others have invented in order to take a big step forward is enormous," Feldman said.
As to the implications, Feldman, a serial entrepreneur in the realms of networking and computing technology, is mindful that inventors and entrepreneurs often fail to foresee the ramifications of their own achievements.
"In the early part of my career, we built the early switches and routers that made IP switching approximately free, and all of us — Cisco and Juniper and 3Com — we never once thought that something like WhatsApp would change the world," he said, referring to Meta's free internet communications application.
"We knew good things would happen if you made communication almost free," he said. "When you embark on that trajectory, all sorts of other stuff steps up and builds on top of you, and others do things you couldn't imagine."
The Cerebras breakthrough, Lewin said, has to do, in one sense, with broadening people's access to digital tools that were previously the captive realm of specialists.
"Industries go through transitions from vertical to horizontal," Lewin said. "There used to be these huge industries that were very vertical that were targeted at automating rational tasks like the word processing industry, and CAD/CAM as an industry," he said, referring to computer-aided design. The progress of microprocessors led to horizontal apps such as Microsoft Office, which made those previously vertical industries "approachable by so many people."
The museum's award, Lewin said, is a peek into the future as much as a rumination on the past. "History is not about the past. It's about the present having a conversation with the past," Lewin said.
He cited an example of the creative dialectic between hardware and software.
In an award ceremony some years ago honoring programming pioneer Grady Booch for contributions such as the Unified Modeling Language, which he co-created, Booch told the audience, "It's software, software, software," Lewin recalled.
"Gordon Moore got up and smiled and said, 'Software's interesting, but it's gotta run on something,'" Lewin recounted.
"And so, this handshake and accelerated opportunity, by taking this immense capability and optimizing it in this way, will help drive these changes, these shifts from horizontal to vertical — they're being compressed in time."
As a milestone along the journey, the WSE-2 technology should be good for a while, Feldman said. "I think this is the best we have until quantum [computing] arrives, and I am very comfortable that will be some time," he said.
Much like receiving a lifetime achievement award, the honor of a place in a museum could almost seem like the end of things. Feldman, by contrast, expressed conviction that the museum's recognition is a beginning.
"Museums are often warehouses of the past," Feldman observed. "We are now building a great company on the back of some pathbreaking technology, and we don't forget that: Both are hard."