Godlike intelligence may need godlike energy production. Generated with ChatGPT

On the Horizon #7: AI & Energy

AI and energy are inextricably linked: information processing requires energy. This follows from Landauer's principle, which establishes that erasing a bit of information must dissipate a minimum amount of energy as heat. This physical constraint means that advances in AI are inherently tied to questions of energy consumption and heat dissipation, making energy not just an engineering challenge but a fundamental physical bound on expanding both AI capability and capacity.
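As a rough illustration of how far today's hardware sits from this physical bound, the Landauer limit can be computed directly. A minimal sketch - the constants are standard physics values, and the function name and the 10^18 bits/s figure are our own illustrative assumptions:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit_joules(temp_kelvin: float) -> float:
    """Minimum heat dissipated per erased bit at the given temperature: k_B * T * ln(2)."""
    return K_B * temp_kelvin * math.log(2)

# At room temperature (~300 K), the limit is ~2.9e-21 J per bit.
e_bit = landauer_limit_joules(300.0)

# A hypothetical chip erasing 1e18 bits per second at the Landauer limit
# would dissipate only ~3 milliwatts. Real accelerators draw hundreds of
# watts for comparable throughput - many orders of magnitude above the bound,
# which is why efficiency gains still have enormous theoretical headroom.
power_watts = e_bit * 1e18
```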
Energy has been the single most important commodity geopolitically for at least the last fifty years. In the Information Age, it is joined by advanced semiconductors and foundation models - the three key ingredients that will determine geopolitical leadership in the 21st century.
The compute and networking hardware in AI servers consumes 2-5x more power than traditional cloud servers, and more still when additional cooling requirements are accounted for.
In Situational Awareness, Leopold Aschenbrenner (an ex-safety researcher at OpenAI) extrapolates the current exponential curve in AI model scaling through 2030, leading to $1 trillion datacentres, each large enough to consume 20% of US electricity generation. In this hypothetical, the race for superintelligence becomes a geopolitical issue, compelling the US and China to mobilise their full industrial capacity in order to win it.
We take a more measured view, especially in light of recent research into scaling the post-training and inference stages of AI models. These new directions, along with architectural breakthroughs that may allow things like memory and context to be scaled sub-quadratically, may spare big tech from embarking on a 10 or 100GW cluster to scale pre-training.
Nevertheless, AI is improving and being deployed faster than any other transformational technology in history, and it will soon meet bottlenecks in much slower-moving physical industries such as power generation. Nation states respond more slowly still, but the wheels are starting to turn in government bureaucracies, and we expect AI policy to rise up the priority lists of countries around the world.
The UK government recently unveiled its 'AI Opportunities Action Plan', an honourable effort to wring efficiencies from the public sector and jumpstart economic growth. Central to this plan is a target to increase the UK's 'compute' capacity by 20x by 2030. This assumes an 8x rise in compute (FLOPs) per pound by then, and a 4x increase in energy efficiency. Given the rate of progress in AI accelerators, led by NVIDIA's relentless annual product iteration, these assumptions seem reasonable. However, even with the forecasted efficiency gains, this implies the UK's datacentre capacity growing from 1.5GW (2.5% of electricity consumption) to 7.5GW (12.5% of electricity consumption) in the next 5-6 years.
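The implied power draw follows directly from the figures above: 20x the compute at 4x the energy efficiency requires 5x the power. A minimal sketch (variable names are ours):

```python
# Back-of-envelope check of the UK plan's implied datacentre power draw,
# using the figures quoted in the text.
current_capacity_gw = 1.5   # UK datacentre capacity today
compute_multiple = 20.0     # 20x compute target by 2030
efficiency_gain = 4.0       # assumed 4x improvement in FLOPs per watt

# 20x the compute at 4x the efficiency needs 5x the power: 1.5GW -> 7.5GW.
implied_power_gw = current_capacity_gw * compute_multiple / efficiency_gain

# The share of UK electricity consumption scales the same way: 2.5% -> 12.5%.
implied_share_pct = 2.5 * compute_multiple / efficiency_gain
```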
Firstly, is this plausible? Electricity generation capacity in the UK has been declining for two decades. Electricity consumption has been falling too over this period, in both households and industry, but a continuation of this can't be relied on going forward, given the electrification trend (especially the transition to electric vehicles). Roughly speaking, the UK has to deliver a +350bps swing in electricity supply growth, from a -1.5% annual decline to +2.0% annual growth, based on the datacentre assumptions above. The UK government plans to set up a publicly owned company, Great British Energy, to accelerate the renewable energy transition, but it is not clear how this will be paid for. The incremental 6GW of datacentres would consume 52.56TWh of electricity per year at maximum utilisation. Based on typical load factor ranges for wind and solar in the UK, 30-50GW of nameplate capacity would be needed to supply this level of power, at a cost in the tens of billions. Adding a few days of battery storage to guard against intermittency could send costs north of £100 billion.
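The nameplate sizing above follows directly from load factors. A minimal sketch, assuming capacity factors of 12-20% to bracket a UK solar-heavy versus wind-heavy mix (these bracketing values are our assumptions):

```python
# Sizing wind/solar nameplate capacity to supply 6GW of additional
# datacentre load running flat out, all year round.
HOURS_PER_YEAR = 8760

demand_gw = 6.0
annual_twh = demand_gw * HOURS_PER_YEAR / 1000  # 52.56 TWh per year

def nameplate_gw(annual_demand_twh: float, capacity_factor: float) -> float:
    """Nameplate capacity needed to generate the annual demand on average."""
    return annual_demand_twh * 1000 / (HOURS_PER_YEAR * capacity_factor)

# At a blended capacity factor of 20% (wind-heavy): ~30 GW of nameplate.
gw_at_cf20 = nameplate_gw(annual_twh, 0.20)

# At a blended capacity factor of 12% (solar-heavy): ~50 GW of nameplate.
gw_at_cf12 = nameplate_gw(annual_twh, 0.12)
```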

UK electricity generation has declined in recent years
Source: Department for Energy Security & Net Zero; Green Ash Partners

Secondly, can the UK compete at the frontier? The next generation of models from the likes of Google (Gemini 2.0), Meta (Llama 4), OpenAI (Orion), xAI (Grok 3) and Anthropic (Claude 4) were likely trained on clusters on the order of 100k NVIDIA H100 equivalents last year - around 150MW in power terms, and single-digit billions in investment terms. The UK government's plan targets a 100MW datacentre, to be scaled to 500MW in partnership with the private sector - but how long will this take? We note that 1GW+ datacentres have been under construction in the US for at least a year.
We take the UK example as a case study highlighting the issues that many countries will be wrestling with, not least in the EU, which has been quick to pass AI regulations before having any domestic industry to regulate (President Trump rescinded President Biden's AI regulations by executive order on the day of his inauguration). The numbers being contemplated by European countries are orders of magnitude smaller than those in the US - for example, French cloud provider Scaleway recently announced its ambition to build domestic AI infrastructure capacity, with... 5k GPUs. To be competitive at the frontier, countries will need to deploy millions of GPUs, both for training clusters and to serve inference when models are deployed.
Silicon Valley is the long-established global headquarters of big tech, so it is to be expected that it is also home to the frontier of AI research. Microsoft, Amazon, Alphabet and Meta are spending a quarter of a trillion dollars a year on AI infrastructure, on top of the last decade of public cloud infrastructure investment. Every key enabler of the infrastructure stack is based there, from NVIDIA and Broadcom, to the hyperscale cloud providers, to the largest software companies and frontier AI labs. This dense agglomeration of knowledge, capability and talent has created a prolific hub of AI research and deep capital markets for AI start-ups.

Only the energy transition can match the scale of capex hyperscalers have deployed on AI infrastructure since the launch of ChatGPT
Source: Zeta Alpha; Green Ash Partners.

The US dominates AI research, with China a distant second. DeepMind's integration with Google Brain was enough to drop the UK's ranking from 3rd to 8th in 2023
Source: Zeta Alpha; Green Ash Partners. Adds up to >100% due to co-authorship

It's a similar story when we look at where compute is located - the US has 3x as many datacentres as Europe and the UK combined...

Source: Statista; Apollo; Green Ash Partners.

... and AI start-ups in the US have received more than 4x as much VC funding as the rest of the world combined

Source: Dealroom.co; Green Ash Partners. As of 31/05/24

Even in energy, the US has an advantage over other developed markets, enabled by the shale revolution which began around 2005. It provided extremely cheap natural gas, with prices falling from an average of around $10/MMBtu in 1995-2005 to $3.5/MMBtu in 2015-2025. In some oil-producing regions (such as Waha, Texas), natural gas prices regularly go negative, as gas is produced as a by-product of shale oil and companies lack the offtake capacity to send it anywhere. By contrast, the EU is almost wholly reliant on foreign gas, importing 90% of its requirements (the UK imports 63% of its natural gas, with the rest coming from the North Sea). European gas prices averaged >3x US prices last year, and peaked at 12.5x during the height of the Ukraine war (August 2022).

US natural gas production stagnated for four decades, until the shale revolution started in 2005
Source: EIA; Green Ash Partners

The shale revolution greatly expanded the role of natural gas in the US electricity generation mix (from 19% to 43% between 2005 and 2023), displacing a substantial amount of coal (which declined from 50% to 16%). Total US electricity generation capacity rose only +5% over those 18 years

Source: EIA; Green Ash Partners

Shale enabled the US to become energy independent, and even a net exporter. About half of the EU's Russian gas imports have now been replaced by US LNG

Source: Apollo; Green Ash Partners

US electricity prices are ~40% lower than the European average. Countries with significantly cheaper electricity than the US are mostly subject to export controls on AI semiconductors

Source: GlobalPetrolPrices.com; Statista; Green Ash Partners. As of March 2024. N.B. these are household electricity prices

But despite these advantages, energy is becoming a bottleneck in the US as AI scales. New model generations arriving every 1-2 years, and giant new AI datacentre clusters taking 2-3 years to build, are starting to butt up against available baseload capacity, with utilities more accustomed to 5-10 year timelines for new power generation projects.
One issue is the proximity of datacentre demand to baseload power. 35% of the world's hyperscale datacentres reside in Northern Virginia and consume 26% of the state's electricity. They already consume more than twice the power generated by the state's largest nuclear power plant, and at the current rate of datacentre growth, power demand could triple over the next 15 years.

35% of the world's hyperscale datacentres are located in Virginia
Source: Apollo; Green Ash Partners

In the US as a whole, datacentres consume about 4% of total electricity supply, which may double to 8% by 2030 (GS estimate). The big question is where this incremental ~170TWh of power will come from:
- Natural gas - gas-fired power plants typically take 2-3 years to build, and about 30-40GW of capacity would be needed to meet this level of demand, at a cost of $36-48 billion. Supplying gas to the new power plants would require a +3-4% increase in US gas production
- Nuclear - it would take about 22GW of nuclear nameplate capacity to generate 170TWh, but capital costs could be much higher, at >$130-220 billion (recent cost overruns could imply even higher numbers). Large reactor projects take much longer to complete - 5-10 years, including several years of permitting. Small Modular Reactor (SMR) designs aim to speed up permitting (and reduce upfront capital costs), but are as yet unproven
- Renewables - using mid-range capacity factors, 25GW of wind and 40GW of solar could meet datacentre demand growth by 2030, at a combined cost of $62-93BN. The main issue is intermittency. Adding 5 days of battery storage to this combination would take 2TWh of lithium batteries - double current global production - at a cost of $500 billion to $1 trillion!
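The capacity figures in the options above can be reproduced with back-of-envelope arithmetic. A sketch, with capacity factors that are our own assumptions (roughly 55% for gas-fired plants, 90% for nuclear, 35% for wind, 25% for solar), not precise utility figures:

```python
# Rough capacity arithmetic for supplying ~170TWh/yr of incremental US
# datacentre demand by 2030.
HOURS_PER_YEAR = 8760
DEMAND_TWH = 170.0

def capacity_gw(demand_twh: float, capacity_factor: float) -> float:
    """GW of nameplate capacity needed to produce demand_twh per year."""
    return demand_twh * 1000 / (HOURS_PER_YEAR * capacity_factor)

gas_gw     = capacity_gw(DEMAND_TWH, 0.55)        # ~35 GW of gas plants
nuclear_gw = capacity_gw(DEMAND_TWH, 0.90)        # ~22 GW of nuclear
wind_gw    = capacity_gw(DEMAND_TWH * 0.5, 0.35)  # half from wind: ~28 GW
solar_gw   = capacity_gw(DEMAND_TWH * 0.5, 0.25)  # half from solar: ~39 GW

# Five days of full-demand battery backup against intermittency, in TWh:
storage_twh = DEMAND_TWH / 365 * 5  # ~2.3 TWh of lithium batteries
```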
An impediment to all of these options is the interconnection queue in the US, which has been growing steadily every year.

Total (cumulative) active capacity in interconnection queues is now nearly 2,600GW; new (annual) capacity entering the queues has increased every year since 2014
Source: Lawrence Berkeley National Laboratory; Green Ash Partners

The Scale of the Challenge May Need All of the Above
We have been conservative with our assumptions in this essay - the scale of the challenge is much larger, as datacentres are not the only driver of rising electricity demand over the next ten years. The Center for Strategic & International Studies estimates US electricity demand could rise +19% over the next decade, versus just 2.4% over the previous one. On their longer time horizon, their estimates assume nearly double the growth in datacentre power demand versus our numbers above, with electric vehicles adding a similar amount again. Onshoring of semiconductor and battery supply chains, and electrification of buildings and industrial energy use, contribute another 145TWh annually. Furthermore, our rough cost calculations do not factor in ageing grids, grid expansion, and the numerous other land, construction and equipment costs associated with raising the growth rate of a very large proportion of US infrastructure by +2ppts a year for a decade.
Factoring all of this in could easily push costs into the trillions, and, even with the funding, major overhauls of permitting and regulation are required to make these timescales realistic. Of course, all of this should be achieved in the most sustainable way possible from an emissions standpoint, but it is very likely natural gas will have to play a role in the solution.
We will be following up shortly with some thoughts on electrification more generally, and our outlook for the energy transition.

Key sources of electricity demand growth over the next decade
Source: CSIS Energy Security and Climate Change Program; Green Ash Partners