AI’s Energy Myth: Why Data Centres Aren’t About to Break the Grid

In recent months, headlines have warned of a looming electricity crisis driven by artificial intelligence. Breathless forecasts suggest that AI-fuelled data centres could soon consume half the world’s electricity, forcing us to build hundreds of new fossil-fuelled power plants just to keep the servers humming. But we’ve heard this tune before. And if history’s taught us anything, it’s that sensationalism makes for poor energy policy.

As someone who co-founded the CIX data centre in Cork, Ireland, and who now spends his time analysing energy and sustainability trends, I’m compelled to cut through the noise. What we need right now is not another hype cycle but a serious conversation about the future of data centre energy use – one grounded in evidence, efficiency, and economics.

Déjà Vu All Over Again

Let’s rewind the tape. In 1999, the U.S. coal industry claimed that the internet would need half the country’s electricity by 2020 [PDF]. Spoiler alert: it didn’t. In fact, between 2010 and 2018, global efficiency gains in data centres were so substantial that a 550% surge in computing services nudged electricity consumption up by only 6%.
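
As a rough sanity check, here’s what those two numbers imply when you put them side by side (a back-of-envelope sketch in Python; the 550% and 6% figures come from the paragraph above, the rest is simple arithmetic):

```python
# Back-of-envelope: what a 550% rise in computing alongside a 6% rise in
# electricity use implies for energy per unit of compute (2010-2018).
compute_ratio = 1 + 5.50   # +550% -> compute ends at 6.5x its 2010 level
energy_ratio = 1 + 0.06    # +6%   -> energy ends at 1.06x its 2010 level

energy_per_compute = energy_ratio / compute_ratio
print(f"Energy per unit of compute fell to {energy_per_compute:.0%} "
      f"of its 2010 level (~{1 / energy_per_compute:.1f}x more efficient).")
# -> roughly 16% of the 2010 level, i.e. about a 6x efficiency gain
```

In other words, the sector absorbed more than six times the workload for essentially the same energy bill.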

We’re seeing the same pattern today. A handful of data centre hubs, such as Virginia’s “Data Center Alley”, are experiencing explosive local growth, and that local growth is being extrapolated to justify nation-scale expansions in power infrastructure. But as Rocky Mountain Institute co-founder Amory Lovins outlines in his new paper, “Artificial Intelligence Meets Natural Stupidity,” national data centre electricity use in the U.S. grew in 2024 by an amount equal to only 0.1% of total grid demand.

The Mirage of AI Energy Explosions

According to the IEA, AI accounted for just 11% of global data centre power in 2025 [PDF]. Even with aggressive growth, the IEA projects that data centres as a whole, with AI as the main driver, will use just under 3% of global electricity by 2030. That’s hardly the apocalypse.

What’s more, advances in computing efficiency are astonishing. NVIDIA reports a 45,000x improvement in AI inference efficiency since 2016. Chips like Blackwell now offer 30x the performance of their predecessors while using a fraction of the power. Meanwhile, software-level gains, from smarter algorithms to leaner inference, are shaving energy use even further. Google’s AlphaChip, an AI that designs better AI chips, is pushing the efficiency envelope further still. There’s enormous room for optimisation that most forecasts simply ignore.
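
To put that 45,000x figure in perspective, here’s a quick annualisation (the assumption that the gain is spread evenly across 2016 to 2024 is mine, not NVIDIA’s):

```python
# Rough annualisation of NVIDIA's claimed 45,000x inference-efficiency gain.
# Assumption (mine, not NVIDIA's): the gain is spread evenly over 2016-2024.
total_gain = 45_000
years = 2024 - 2016          # 8 years

annual_factor = total_gain ** (1 / years)
print(f"Implied average improvement: ~{annual_factor:.1f}x per year")
# -> roughly 3.8x per year, far ahead of the classic Moore's Law pace
```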

The Real Risk: Misallocated Capital

This matters because inaccurate projections have consequences. Utilities are greenlighting billions in new fossil-fuel generation based on speculative AI growth that may never materialise. These are long-term investments with lifespans measured in decades. If the demand doesn’t follow through, consumers will be left footing the bill through higher rates, stranded assets, and a dirtier grid.

Just look at Texas. ERCOT has received requests for 99 GW of new load – more than its entire peak demand in 2023 (roughly 85 GW). Yet even its own analysts concede much of that is likely to evaporate before a single watt is drawn.

Smart Regulation, Not Panic

So what’s the alternative? Smarter regulation. Require data centre developers to guarantee payment for power supply, backed by insurance or performance bonds. Stop shifting risk onto everyday ratepayers. Pair data centres with renewables in what RMI calls “Power Couples” – colocating flexible digital loads with clean energy at existing grid interconnects.

And above all, stop using AI as a Trojan horse to justify more gas, coal, or nuclear plants that the market has already rejected.

Flexibility Is the Future

We need to stop thinking of data centres as monolithic, inflexible energy hogs. Emerging practices like carbon-aware computing and demand shifting show that workloads can be intelligently scheduled to align with clean energy availability.
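
To make that concrete, here’s a minimal sketch of carbon-aware scheduling: a deferrable batch job (say, an overnight model-training run) is started in the lowest-carbon hour before its deadline. The forecast numbers and function names below are illustrative assumptions, not any particular operator’s API:

```python
# Minimal sketch of carbon-aware scheduling: run a deferrable workload in
# the hour with the lowest forecast grid carbon intensity before its deadline.
# The forecast values are made up for illustration.

# Forecast carbon intensity (gCO2/kWh) for the next 24 hours; hour 0 = now.
forecast = [420, 410, 395, 380, 360, 340, 300, 240,
            180, 140, 120, 110, 115, 130, 170, 230,
            310, 380, 430, 450, 460, 455, 440, 430]

def pick_greenest_hour(intensity_forecast, deadline_hours):
    """Return the start hour (offset from now) with the lowest carbon
    intensity among the hours before the job's deadline."""
    window = intensity_forecast[:deadline_hours]
    return min(range(len(window)), key=lambda hour: window[hour])

start = pick_greenest_hour(forecast, deadline_hours=16)
print(f"Start the job in {start} hours: {forecast[start]} gCO2/kWh "
      f"versus {forecast[0]} gCO2/kWh right now.")
# -> hour 11 in this example, when solar output is at its peak
```

The same logic extends to shifting work between regions, not just between hours, something some of the large cloud operators are already experimenting with.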

Even more importantly, many data centres already include significant on-site generation and battery energy storage, enabling them not just to manage their own consumption but to contribute to grid stability. By participating in demand response programmes, they can act as firming resources for stressed grids rather than burdens.
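
Here’s an equally minimal sketch of that demand-response behaviour: when the grid operator signals stress, the site pauses deferrable work and leans on its batteries instead of drawing more from the grid. The thresholds and figures are illustrative assumptions, not any real programme’s rules:

```python
# Minimal sketch of a demand-response decision for a data centre with
# on-site batteries: under grid stress, shed deferrable load and discharge
# batteries rather than drawing more from the grid. Figures are illustrative.

GRID_STRESS_THRESHOLD = 0.8   # 0..1 stress signal from the grid operator

def respond(grid_stress, site_load_mw, deferrable_mw, battery_mw_available):
    """Return (grid_draw_mw, action) for the current dispatch interval."""
    if grid_stress < GRID_STRESS_THRESHOLD:
        return site_load_mw, "normal operation"
    # Stress event: pause deferrable work, cover what we can from batteries.
    firm_load_mw = site_load_mw - deferrable_mw
    from_battery_mw = min(battery_mw_available, firm_load_mw)
    return firm_load_mw - from_battery_mw, "shed deferrable load, discharge battery"

draw, action = respond(grid_stress=0.92, site_load_mw=100,
                       deferrable_mw=30, battery_mw_available=50)
print(f"Grid draw this interval: {draw} MW ({action})")
# -> 20 MW instead of 100 MW, an 80 MW contribution to grid stability
```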

In fact, the U.S. grid could unlock nearly 200 GW of load flexibility by 2030, saving $15 billion annually. Flexible data centre operations could serve AI demand from existing power plants, without building a single new gas turbine.

AI Can Save Energy Too

Ironically, AI might be more important as a tool for reducing energy use than increasing it. AI-driven building management systems, grid optimisation, and materials discovery are already demonstrating significant energy savings. And with 98% of global electricity going to things other than AI, the potential for AI-enabled efficiency is vast.

But there’s a caveat. AI is also being deployed to optimise fossil fuel extraction, potentially unlocking reserves that would otherwise remain unburned. If AI ends up enabling more oil and gas than it offsets through efficiency, we’re heading in the wrong direction.

Lessons from the Dot-Com Bubble

The parallels with the early 2000s are uncanny. Back then, speculative fibre builds and data centre overcapacity left billions in losses. We’re at risk of repeating the same mistakes.

Already, AI-linked stocks have seen steep losses, with NVIDIA suffering the largest single-day loss of market value in corporate history. DeepSeek’s breakthrough in efficient, small-scale AI models sent shockwaves through the industry. Suddenly, the trillion-dollar bet on ever-larger hyperscale data centres looks shaky.

Resilience Through Clean Power

If AI demand does scale as some expect, we need to power it with renewables. Fortunately, that’s increasingly feasible. Projects like Portugal’s Sines 4.0 and Apple’s all-renewable data centres prove the model. So does South Australia, now 82% powered by wind and solar with grid stability to match.

In 2025 alone, renewables will add 730 GW globally, with batteries contributing another 74 GW. Nuclear, by contrast, will add just 4 GW net. Let’s build where the market already is: fast, clean, modular.

Grounding AI Hype in Data

Ultimately, the best way to support AI is not by throwing money at power plants, but by building a smart, flexible, decarbonised electricity system. That requires nuance, not narratives.

Yes, AI might transform our economy. But it won’t do it by guzzling half the grid. And if it can’t save us energy or money, what’s the point? We didn’t build the internet to burn coal. Let’s not build AI that way either.

If you’d like to dig deeper into these insights, I highly recommend reading Amory Lovins’ full paper, available here [PDF].

Or better yet, subscribe to the Climate Confident podcast where we regularly tackle these thorny, high-impact topics at the intersection of technology and sustainability.

Let’s get this right. The future depends on it.

