For decades, compute power has consistently followed Moore’s Law, the observation that computer processing power doubles roughly every two years. Lately, however, that growth appears to be slowing down: processors simply can’t get more densely packed than they are now. Perhaps quantum computing will speed things up again at some point in the future, but until then, the ability to support increasingly sophisticated applications may be at risk.
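The compounding implied by Moore’s Law is easy to underestimate. As a rough illustration (my own sketch, not from the article), here is what a clean doubling every two years works out to:

```python
# Back-of-the-envelope: relative processing power under an idealized
# Moore's Law, assuming a clean doubling every two years.
def moores_law_factor(years: float, doubling_period: float = 2.0) -> float:
    """Relative growth in processing power after `years` years."""
    return 2.0 ** (years / doubling_period)

# Twenty years at that pace is ten doublings: 2**10 = 1024x.
print(moores_law_factor(20))  # 1024.0
```

Real hardware has never tracked this curve perfectly, but the exponential shape is why even a modest slowdown compounds into large shortfalls over a decade.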
Neil Thompson, an MIT research scientist at the Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Sloan School of Management, recently sounded a warning about the waning of Moore’s Law, and how it may hold back innovation in critical areas. “We are already seeing increased interest in designing specialised computer chips as a way to compensate for the end of Moore’s Law,” he says. “But the question is the magnitude of these effects. The gains from Moore’s Law were so large that, in many application areas, other sources of innovation will not be able to compensate.” Areas that may feel the crunch more immediately include high-computation applications such as climate forecasting, oil exploration, and protein folding for drug discovery, he says.
While large, number-crunching systems are the first to feel the effects of diminishing growth in processing power, the effects may ultimately filter down to more commonly used systems, which are also hungry for enhanced processing power for a growing stable of analytic-intensive applications, such as artificial intelligence, machine learning, and deep learning. These may depend on massive cloud-based services or high-powered edge devices.
In today’s digital economy, all attention is on the applications that can deliver insights and capabilities at blazing speeds to users or clients all across the globe. But what often gets forgotten is the hardware underneath that makes it all possible.
Industry leaders have been anticipating the potential slowing of Moore’s Law with trepidation. “Keeping up with Moore’s law has become much harder than ever before,” Lieven Eeckhout noted in IEEE’s Computer Journal a few years back. “And maybe at some point companies will have to start pushing back next-generation transistor technologies.” While the impact won’t be felt as acutely at the low end of the computing scale, such as with consumer applications, it is having a profound impact on high-end computing and data center capabilities, Eeckhout said.
The implications of slowing processing power growth “are quite worrisome,” says Thompson. “As computing improves, it powers better climate prediction and the other areas we studied, but it also improves countless other areas we didn’t measure but that are nevertheless critical parts of our economy and society. If that engine of improvement slows down, it means that all those follow-on effects also slow down.”
Software innovators have grown accustomed to the continuous, rapid growth of processing power over the years, and have designed applications around such increases in power. Thompson’s research shows that improvements in applications are primarily due to being able to take advantage of new processors — estimating that between 49 and 94 percent of improvements in high-end computing areas are directly attributable to growth in computing power.
“This is not somebody just taking an old program and putting it on a faster computer; instead users have to constantly redesign their algorithms to take advantage of 10 or 100 times more computer power,” he says. “There is still a lot of human ingenuity that has to go into improving performance, but what our results show is that much of that ingenuity is focused on how to harness ever-more-powerful computing engines.”
For example, “with climate prediction, we found that there has been a trillionfold increase in the amount of computing power used for these models,” Thompson points out. “That puts into perspective how much computing power has increased, and also how we have harnessed it.”
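To put the trillionfold figure in perspective, a quick piece of arithmetic (my own illustration, assuming the idealized two-year doubling period): a factor of 10^12 is about 40 doublings, which single-chip improvement alone would take roughly 80 years to deliver at Moore’s-Law pace.

```python
import math

# A trillionfold increase expressed as doublings, and how long that many
# doublings would take at the idealized Moore's Law pace of one doubling
# every two years (illustrative arithmetic only).
doublings = math.log2(1e12)           # ~39.9 doublings
years_at_moores_pace = 2 * doublings  # ~79.7 years
print(round(doublings, 1), round(years_at_moores_pace, 1))  # 39.9 79.7
```

The gap between that figure and the actual timeline of climate modeling reflects the other lever available to these fields: running many processors in parallel rather than waiting for each one to get faster.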