
As Moore's Law slows, high-end applications may feel the effect, MIT scientist warns

"The gains from Moore's Law were so large that, in many application areas, other sources of innovation will not be able to compensate."
Written by Joe McKendrick, Contributing Writer

For decades, compute power has consistently tracked Moore's Law, the observation that the number of transistors on a chip -- and, with it, processing power -- doubles approximately every two years. Lately, however, that growth has been slowing: transistor scaling is running up against physical limits, and processors can't get much more densely packed than they already are. Quantum computing may eventually pick up the pace again, but until then, the ability to support increasingly sophisticated applications may be at risk.
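To make that doubling cadence concrete, here is a back-of-the-envelope sketch in Python -- an illustration of the compounding arithmetic only, not a figure from Thompson's study; the two-year doubling period is the standard rule of thumb:

    # Growth factor implied by Moore's Law: one doubling every two years.
    def moores_law_factor(years: float, doubling_period: float = 2.0) -> float:
        """Multiplicative growth after `years` of steady doubling."""
        return 2.0 ** (years / doubling_period)

    # Five doublings over a decade yields a 32x increase ...
    print(f"10 years: {moores_law_factor(10):,.0f}x")   # 32x
    # ... and twenty doublings over four decades compound to roughly a millionfold gain.
    print(f"40 years: {moores_law_factor(40):,.0f}x")   # 1,048,576x

That compounding is why even a modest stretch in the doubling period, sustained over decades, translates into an enormous shortfall in expected computing power.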

Neil Thompson, an MIT research scientist at the Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Sloan School of Management, recently sounded a warning about the waning of Moore's Law, and how it may hold back innovation in critical areas. "We are already seeing increased interest in designing specialized computer chips as a way to compensate for the end of Moore's Law," he says. "But the problem is the magnitude of these effects. The gains from Moore's Law were so large that, in many application areas, other sources of innovation will not be able to compensate." Areas that may feel the crunch more immediately include high-computation applications such as weather forecasting, oil exploration, and protein folding for drug discovery, he says. 

While large, number-crunching systems are the first to feel the effects of diminishing growth in processing power, the impact may eventually filter down to more commonly used systems, which are also hungry for processing power to feed a growing stable of analytics-intensive applications, such as artificial intelligence, machine learning, and deep learning. These may depend on massive cloud-based services or high-powered edge devices.

In today's digital economy, all attention is on the applications that can deliver insights and capabilities at blazing speeds to users or customers all across the globe. But what often gets forgotten is the hardware underneath that makes it all possible.   

Industry leaders have been anticipating the slowing of Moore's Law with trepidation. "Keeping up with Moore's law has become much harder than ever before," Lieven Eeckhout noted in IEEE's Computer journal a few years back. "And maybe at some point companies will have to start pushing back next-generation transistor technologies." While the slowdown won't be felt as acutely at the low end of the computing scale, such as in consumer applications, it is having a profound impact on high-end computing and data center capabilities, Eeckhout said.

The implications of slowing processing power growth "are quite worrisome," says Thompson. "As computing improves, it powers better weather prediction and the other areas we studied, but it also improves countless other areas we didn't measure but that are nevertheless critical parts of our economy and society. If that engine of improvement slows down, it means that all those follow-on effects also slow down."

Software innovators have grown accustomed to continuous, rapid growth in processing power, and have designed applications around it. Thompson's research shows that improvements in applications stem largely from the ability to exploit new processors: he estimates that between 49 and 94 percent of the improvements in the high-end computing areas he studied are directly attributable to growth in computing power.

"This is not someone just taking an old program and putting it on a faster computer; instead users must constantly redesign their algorithms to take advantage of 10 or 100 times more computer power," he says. "There is still a lot of human ingenuity that has to go into improving performance, but what our results show is that much of that ingenuity is focused on how to harness ever-more-powerful computing engines." 

For example, "with weather prediction, we found that there has been a trillionfold increase in the amount of computing power used for these models," Thompson points out. "That puts into perspective how much computing power has increased, and also how we have harnessed it."
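For a sense of scale, a trillionfold increase corresponds to roughly 40 doublings -- about 80 years' worth at Moore's Law's two-year cadence, which weather centers reached sooner in part by spreading work across ever-larger parallel machines. The arithmetic, as a back-of-the-envelope illustration rather than a figure from the study:

    import math

    # A trillionfold increase expressed as a number of doublings.
    increase = 1e12
    doublings = math.log2(increase)   # log2(10^12) ~= 39.9
    print(f"{doublings:.1f} doublings")

    # At one doubling every two years, that is roughly 80 years of Moore's Law.
    print(f"~{doublings * 2:.0f} years at a two-year doubling cadence")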
