By 2040 our computers will use more power than we can produce

The breathtaking speed at which our computers evolve is perfectly summarized in Moore’s Law: the observation that the number of transistors in an integrated circuit doubles roughly every two years. But this kind of exponential growth in computing power also means that our chips need more and more energy to function, and by 2040 they will gobble up more electricity than the world can produce, scientists predict.
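To get a feel for the scale involved, the short Python sketch below projects what doubling every two years does over a few decades. It is a back-of-the-envelope illustration written for this point, not a calculation from the SIA report, and the 2016 baseline transistor count is an assumed figure.

```python
# Rough illustration of Moore's Law-style doubling.
# The 2016 baseline of ~7 billion transistors is an assumed figure
# (roughly a high-end chip of that era), not a number from the SIA report.

BASE_YEAR = 2016
BASE_TRANSISTORS = 7e9            # assumed starting transistor count per chip
DOUBLING_PERIOD_YEARS = 2         # Moore's Law: a doubling every two years

def projected_transistors(year: int) -> float:
    """Project the per-chip transistor count, assuming a doubling every two years."""
    doublings = (year - BASE_YEAR) / DOUBLING_PERIOD_YEARS
    return BASE_TRANSISTORS * 2 ** doublings

for year in (2020, 2030, 2040):
    print(f"{year}: ~{projected_transistors(year):.1e} transistors per chip")

# By 2040 the projection is roughly 4,000 times the 2016 figure, which is the
# scale of growth that strains a flat energy budget if efficiency gains stall.
```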

Image via Pixabay

The projection originally appeared in a report released last year by the Semiconductor Industry Association (SIA), but it has only recently made headlines, as the group issued its final assessment of the semiconductor industry. The basic idea is that as computer chips become more powerful and incorporate more transistors, they’ll require more power to function unless efficiency can be improved.

Power that we may not have. The SIA predicts that unless we significantly change how we design our computers, by 2040 we won’t be able to power all of them. But there’s a limit to how much we can improve with current methods:

“Industry’s ability to follow Moore’s Law has led to smaller transistors but greater power density and associated thermal management issues,” the 2015 report explains.

“More transistors per chip mean more interconnects – leading-edge microprocessors can have several kilometres of total interconnect length. But as interconnects shrink they become more inefficient.”

So in the long run, the SIA estimates that under current conditions “computing will not be sustainable by 2040, when the energy required for computing will exceed the estimated world’s energy production.”

Total energy used for computing. Image credits: SIA

This graph shows the problem. The energy required by computing under today’s systems (the benchmark line) is shown in orange, while total world energy production is shown in yellow. The point where the two meet, predicted to fall somewhere between 2030 and 2040, is where the problems start. Today, chip engineers stack ever-smaller transistors in three dimensions to improve performance and keep pace with Moore’s Law, but the SIA says that approach won’t work forever, given how much energy will be lost in future, progressively denser chips.
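The crossover logic behind the graph can be sketched with two exponential curves: computing demand growing quickly and world energy production growing slowly. The baseline values and growth rates below are illustrative assumptions rather than the SIA’s actual data, but they show how the intersection year falls out of the arithmetic.

```python
import math

# Sketch of the crossover behind the SIA graph: exponentially growing energy
# demand for computing eventually meets slowly growing world energy production.
# Every number below is an illustrative assumption, not the SIA's data.

BASE_YEAR = 2015
COMPUTING_ENERGY = 1.0    # computing's energy use in the base year (arbitrary units)
WORLD_ENERGY = 5.0        # world energy production in the base year (same units)
DEMAND_GROWTH = 1.10      # assumed: computing energy demand grows 10% per year
SUPPLY_GROWTH = 1.02      # assumed: world energy production grows 2% per year

def crossover_year() -> float:
    """Solve demand * g_d**t == supply * g_s**t for t, then offset by the base year."""
    years = math.log(WORLD_ENERGY / COMPUTING_ENERGY) / math.log(DEMAND_GROWTH / SUPPLY_GROWTH)
    return BASE_YEAR + years

print(f"Under these assumptions, the curves cross around {crossover_year():.0f}")

# With these numbers the crossover lands in the mid-2030s, inside the 2030-2040
# window the article cites; steeper demand growth pulls the date earlier.
```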

“Conventional approaches are running into physical limits. Reducing the ‘energy cost’ of managing data on-chip requires coordinated research in new materials, devices, and architectures,” the SIA states.

“This new technology and architecture needs to be several orders of magnitude more energy efficient than best current estimates for mainstream digital semiconductor technology if energy consumption is to be prevented from following an explosive growth curve.”

The roadmap report also warns that beyond 2020 it will become economically unviable to keep improving performance through simple scaling. Future gains in computing power will have to come from sources other than increased transistor counts.

“That wall really started to crumble in 2005, and since that time we’ve been getting more transistors but they’re really not all that much better,” computer engineer Thomas Conte of Georgia Tech told IEEE Spectrum.

“This isn’t saying this is the end of Moore’s Law. It’s stepping back and saying what really matters here – and what really matters here is computing.”

4 thoughts on “By 2040 our computers will use more power than we can produce”

  1. safetynet2razorwire

    Read Malthus on horses and manure. Then check his prediction against reality. Then reconsider this prediction. Just a thought.

  2. BL4S7ER

    Given the last 20 years, my PC now consumes far less power and is far more powerful, so we must be doing something right.

    And even if this prediction holds some truth, the next breakthrough, which will no doubt arrive well before 2040, will shatter it.

  3. Bob Colwell

    First, where is the evidence that total world energy production is flat and will remain so into the indefinite future? (We can argue separately about whether that might actually be a good goal.) Second, "energy will be lost?" Quick, somebody go look for it, it's gotta be here somewhere. Third, energy consumption related to computing is already on an exponential rise. Is that "explosive?" How about we don't use vague terminology, please? Fourth, this article seems to be lumping all computing into one big pile. C'mon, that creates more confusion than illumination. At least separate out mobile computing, servers, and supercomputers — even if the overall energy consumed by all computers attracts the kind of gov't regulation that will stifle further growth, it doesn't mean that all computers will stop getting better. Fifth, the problem does not start where the lines intersect; no industry can start consuming more than around 10% of the nation's energy budget (or the world's, by extension) without serious attention by gov'ts everywhere, who must apportion that finite resource among a lot of contenders (transportation, agriculture, communications, military, etc.). Don't pretend we'll approach 100%. A crisis would hit way, way earlier than that, and will prevent the scenario outlined above. And finally, it IS about the end of Moore's Law. When we can no longer profitably cram more transistors onto a chip, the game is over, and the steady stream of marvelous improvements in computing will dry up. Presto, no more upward pressure on the world's energy budget.

  4. Pingback: Cryptocurrency mining comes at a great health and environmental cost – BTC News Paper
