Data Centre Efficiency

Investment Note #12 - 26th April 2024

Synopsis

  • Since technology is ultimately about pushing electrons around, it is always worth considering the energy implications.

  • The importance of this is magnified when the world is concurrently galloping towards the AI Age and embarking on the most challenging energy transition ever undertaken.

  • Following years of efficiency gains that have left data centre power consumption barely increasing – despite huge growth in Internet traffic and cloud computing – Large Language Models (LLMs) now appear to be causing a step-change in power demand.

  • This presents huge challenges, the scale of which will only become fully apparent over time. However, given the pace of AI adoption and the lead time on large energy projects, the luxury of waiting is not an option. As a result, tech companies are embarking on energy initiatives that would have seemed unlikely a few years ago.

Data Centre Efficiency

  • The improvement in data centre efficiency has been truly impressive. Between 2010 and 2018, data centres grew compute capacity 6x and storage 25x, while internet traffic grew 10x. Yet, thanks to Moore’s law and the application of machine learning to data centre operations, power consumption grew only 6% over that period (source: NZS Capital); the rough arithmetic after this list shows what that implies for energy intensity.

  • Furthermore, in its early stages AI did not look like it would alter the situation. Indeed, in 2016, Google’s DeepMind noted how AI itself was helping to reduce energy consumption, delivering a 40% reduction in the energy used to cool its data centres (source: Wired). This is important because data centres are big consumers not just of energy, but also of water to cool hot semiconductors.

  • Unfortunately, it appears we have now reached a tipping point where efficiency gains can no longer offset the insatiable growth in compute demand.
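
To make the scale of those 2010–2018 gains concrete, here is a quick back-of-envelope calculation in Python. It simply divides the growth in power consumption by the growth in compute and in traffic, a simplification that assumes the headline multiples can be compared directly.

```python
# Back-of-envelope: what the 2010-2018 figures imply for energy intensity.
# Assumes the headline multiples can be compared directly (a simplification).

compute_growth = 6.0    # data centre compute capacity grew ~6x
traffic_growth = 10.0   # internet traffic grew ~10x
power_growth = 1.06     # power consumption grew only ~6%

energy_per_compute = power_growth / compute_growth   # ~0.18 of the 2010 level
energy_per_traffic = power_growth / traffic_growth   # ~0.11 of the 2010 level

print(f"Energy per unit of compute fell to {energy_per_compute:.0%} of the 2010 level "
      f"(~{1 / energy_per_compute:.1f}x improvement)")
print(f"Energy per unit of traffic fell to {energy_per_traffic:.0%} of the 2010 level "
      f"(~{1 / energy_per_traffic:.1f}x improvement)")
```

In other words, roughly a 5–6x improvement in energy per unit of compute and a 9–10x improvement per unit of traffic in just eight years.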

Step Increase in Energy Demand

  • LLMs and generative AI models appear to have taken data centre energy demand to a new level; the thirst for power and water to run generative AI is on a different scale from anything we have seen historically.

  • A University of California study estimates that a conversation with ChatGPT might consume around 500 ml of water. The study also noted that Microsoft’s global water usage surged 34% from 2021 to 2022, to nearly 1.7 billion gallons, while Google’s water consumption increased by 20% over the same period, to 5.6 billion gallons (source: Business Today).

  • According to the International Energy Agency (IEA), global power demand from data centres could surpass 1,000 terawatt hours by 2026. That would be more than double 2022 levels — an increase equivalent to Germany’s total power demand in the space of four years (source: Financial Times). 
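
The arithmetic behind the IEA comparison is roughly as follows. The ~460 TWh baseline for 2022 and the ~500 TWh figure for Germany’s annual electricity demand are approximations added here for illustration; they are not figures quoted above.

```python
# Rough arithmetic behind the IEA comparison (illustrative assumptions).

demand_2026_twh = 1000     # IEA: global data centre demand could surpass ~1,000 TWh by 2026
demand_2022_twh = 460      # approximate 2022 baseline (assumption for illustration)
germany_demand_twh = 500   # Germany's annual electricity demand, very roughly (assumption)

increase_twh = demand_2026_twh - demand_2022_twh
print(f"Implied increase 2022->2026: ~{increase_twh} TWh "
      f"(vs. ~{germany_demand_twh} TWh for all of Germany)")
print(f"Growth multiple: ~{demand_2026_twh / demand_2022_twh:.1f}x")
```

On those assumptions the increase alone is comparable to adding another Germany to the grid within four years.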

Next Wave of Innovation

  • It would be wrong to assume that efficiency gains in data centres have come to an end. Significant efforts are underway to make AI models more efficient, with the MIT Technology Review reporting on DeepMind’s work combining AI with external memory (a kind of external cheat sheet for LLMs containing previous answers) that could lead to a 25x efficiency gain; a toy sketch of the general idea follows this list.

  • As AI rewrites, optimises and deploys its own software, it is entirely reasonable to expect a significant downward shift in how much power is required to run an AI query in the future. Indeed, just recently, Nvidia announced that updated software gives its newest chip 8x the performance of the equivalent predecessor chip running the earlier software release (source: Nvidia Developer).

  • There is further reason for hope if we broaden our thinking to include the energy savings from AI replacing more cumbersome human effort. Researchers in a recently published paper (The Carbon Emissions of Writing and Illustrating Are Lower for AI than for Humans) found that AI emits 130 to 1,500 times less CO2 per page of writing and 310 to 2,900 times less CO2 per image than humans.

  • Notwithstanding historic efficiency gains and those still to come, the near- to medium-term challenges for the electricity grid look significant.
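
As a very rough illustration of the "external cheat sheet" idea referenced above, the toy sketch below caches previous answers keyed by a query embedding and only calls the expensive model when no sufficiently similar query has been seen before. This is not DeepMind’s actual system; embed(), expensive_llm() and the similarity threshold are hypothetical stand-ins, but the sketch shows why reusing an external store of prior answers can save compute.

```python
# Toy illustration of the "external cheat sheet" principle: cache previous answers
# and only invoke the expensive model for queries not already covered.
# Not DeepMind's actual system; embed() and expensive_llm() are hypothetical stand-ins.

from typing import Callable, List, Tuple


def make_cached_llm(embed: Callable[[str], List[float]],
                    expensive_llm: Callable[[str], str],
                    threshold: float = 0.9) -> Callable[[str], str]:
    cache: List[Tuple[List[float], str]] = []   # (query embedding, previous answer)

    def cosine(a: List[float], b: List[float]) -> float:
        dot = sum(x * y for x, y in zip(a, b))
        norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
        return dot / norm if norm else 0.0

    def answer(query: str) -> str:
        q_vec = embed(query)
        # Cheap path: reuse the stored answer if a sufficiently similar query was seen before.
        for vec, prev_answer in cache:
            if cosine(q_vec, vec) >= threshold:
                return prev_answer
        # Expensive path: run the full model, then remember the result.
        result = expensive_llm(query)
        cache.append((q_vec, result))
        return result

    return answer
```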

The Grid Will Become Increasingly Challenged

  • Blackstone and Prologis alone are working on $75bn in new data centres (source: Business Insider). Ireland’s data centres are set to account for 32% of national electricity demand in 2026, according to the IEA (source: Financial Times). However, it is in the larger economies that the bigger energy decisions will need to be made.

  • Nuclear fusion is, of course, at least a decade or two away, absent a surprise breakthrough. Hydrogen has many interesting applications, but it is likely to be a local solution rather than an across-the-board replacement for fossil fuels. Green energy attracts a lot of attention and will certainly form part of the solution, but it is being deployed too slowly to keep pace with growing energy demand, especially if AI adoption progresses as expected.

  • It’s eerily similar to the dotcom and fibre/telecom equipment spending boom. Of course, the Internet proved to be far larger than anyone could have imagined, and we still continually need more bandwidth. It is reasonable to assume the same will be true of AI, only probably orders of magnitude more interesting and more unpredictable.

Splitting the Atom is Back in Fashion

  • One potential solution attracting more attention is Small Modular Reactors (SMRs), which could provide a dependable nuclear baseload supply. Standard Power, a provider of infrastructure as a service to advanced data processing companies, has announced plans to develop two SMR-powered facilities that will together produce nearly 2 GW of clean, carbon-free energy (source: Businesswire); the rough sizing sketch after this list puts that figure in context. SMRs are a technology that both Bill Gates and Sam Altman (head of Microsoft’s partner OpenAI) have been investing in for years.

  • Indeed, Microsoft is reported to be planning to use next-generation nuclear reactors to power its data centre and AI ambitions and is seeking a principal program manager to lead its nuclear energy strategy (source: The Verge). In March, Bill Gates’ TerraPower announced plans to build its first US SMR nuclear plant in Wyoming (source: Financial Times). Also in March, Amazon announced it was acquiring a data centre in Pennsylvania for $650 million that is already powered by conventional nuclear reactors (source: Electrek).
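
For context on the Standard Power figure, a simple sizing calculation is sketched below. It idealises the plants as running at full output all year, which real reactors do not quite achieve, so the result is an upper bound.

```python
# Rough sizing of ~2 GW of SMR capacity over a year (idealised: continuous full-power output).

capacity_gw = 2.0
hours_per_year = 24 * 365
annual_twh = capacity_gw * hours_per_year / 1000   # GW * h = GWh; /1000 -> TWh

print(f"~{annual_twh:.1f} TWh per year at full output")
# ~17.5 TWh/year: meaningful for individual data centre campuses, but small
# next to the several-hundred-TWh global increase discussed above.
```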

Summary

  • A very slow transition away from fossil fuels appears to be colliding with significant, long-term growth in energy demand. Absent new disruptive technologies that provide abundant new sources of energy, this presents a real challenge.

  • The physical world of energy generation and transmission, which seems impossible to accelerate, appears to be on a collision course with AI’s huge appetite for energy. The digital world is in direct conflict with the material world.

  • So, while we might wish for the future to arrive now, the power to push all those electrons around may not be here for a while. In the meantime, we suspect many topics involving energy creation and usage will continue to attract attention and, somewhat ironically, AI could provide the foundation for a nuclear renaissance.
