Cloud Computing’s Energy Consumption

About 1 percent of all electricity generated today goes to cloud computing. By the end of this decade, that share could reach 8 percent or more. How much energy can we afford to dedicate to all this computing?
Software and hardware engineering will no doubt need to reorient their design practices around power efficiency.
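As a back-of-the-envelope check, growing from a 1 percent share to an 8 percent share implies a steep compound growth rate. The sketch below assumes a nine-year horizon (e.g. 2021 to 2030); the exact timeframe is not stated in the article.

```python
# Back-of-the-envelope: what annual growth rate takes cloud computing's
# share of global electricity from ~1% today to ~8% by decade's end?
years = 9            # assumed horizon, e.g. 2021 -> 2030
start_share = 0.01   # ~1% of global electricity today
end_share = 0.08     # ~8% projected

# Implied compound annual growth rate of the share
cagr = (end_share / start_share) ** (1 / years) - 1
print(f"Implied growth of cloud's electricity share: {cagr:.1%} per year")
# Since 8^(1/9) == 2^(1/3), this works out to roughly 26% per year
```

Even if total electricity generation also grows over the period, the share itself would need to compound at roughly a quarter per year to hit that projection.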


Read more:
Cloud Computing’s Coming Energy Crisis
https://spectrum.ieee.org/computing/hardware/cloud-computings-coming-energy-crisis

3 Comments

  1. Tomi Engdahl says:

    For comparison:

    On the face of it, the question about energy use is a fair one. According to the Cambridge Center for Alternative Finance (CCAF), Bitcoin currently consumes around 110 terawatt-hours per year, or 0.55% of global electricity production.
    https://hbr.org/2021/05/how-much-energy-does-bitcoin-actually-consume

    The most reputable such estimate comes from the University of Cambridge Bitcoin Electricity Consumption Index, according to which the global bitcoin network currently consumes about 80 terawatt-hours of electricity annually, roughly equal to the annual output of 23 coal-fired power plants, or close to what is consumed by the nation of Finland.
    https://qz.com/2023032/how-much-energy-does-bitcoin-use/

  2. Tomi Engdahl says:

    Computers, data centers and networks consume 10% of the world’s electricity. 30% of this electricity goes to power terminal equipment (computers, mobiles and other devices), 30% goes to data centers and 40% goes to the network.
    https://en.wikipedia.org/wiki/IT_energy_management
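    (The 30/30/40 split of the quoted 10% figure translates into absolute shares of world electricity as in this small illustrative sketch:

```python
# Rough split of the "10% of world electricity" figure quoted above,
# using the 30/30/40 breakdown from the comment.
it_share = 0.10  # ICT's share of global electricity (quoted figure)
breakdown = {"terminal equipment": 0.30, "data centers": 0.30, "network": 0.40}

for part, frac in breakdown.items():
    print(f"{part}: {it_share * frac:.0%} of global electricity")
# terminal equipment: 3%, data centers: 3%, network: 4%
```

    i.e. about 3% of global electricity for end-user devices, 3% for data centers, and 4% for networks.)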

