Is AI Really the Energy Villain?
If you hang out on tech Twitter, Threads, or LinkedIn long enough, you’ll eventually see the same refrain:
“AI is an environmental disaster. These models are boiling the oceans.”
First, zoom out: data centres vs the whole grid
According to the International Energy Agency (IEA), data centres worldwide used about 415 terawatt-hours (TWh) of electricity in 2024, roughly 1.5% of global electricity consumption. That load is growing fast and is expected to more than double to around 945 TWh by 2030.
So yes, the data-centre footprint is real, and AI is a big part of why those projections are climbing.
But that 1.5% is all data-centre workloads lumped together: social media, video streaming, gaming, SaaS, classic web hosting, plus AI training and inference.
How much of that is actually AI?
Recent analysis reviewed in Carbon Brief suggests that AI currently accounts for roughly 5–15% of data-centre power use, with a plausible path to 35–50% by 2030 if generative AI keeps scaling as projected.
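To put those percentages into absolute terms, here's a quick back-of-envelope sketch in Python. The 415 and 945 TWh figures are the IEA numbers above, the share ranges are the Carbon Brief estimates, and the arithmetic is illustrative rather than a forecast.

```python
# Back-of-envelope: how many TWh does the AI slice actually represent?
# Figures from the IEA and Carbon Brief ranges cited above; illustrative only.

DATA_CENTRE_TWH_2024 = 415   # IEA estimate for all data centres, 2024
DATA_CENTRE_TWH_2030 = 945   # IEA projection for 2030

ai_share_2024 = (0.05, 0.15)   # AI as a fraction of data-centre power today
ai_share_2030 = (0.35, 0.50)   # plausible range if generative AI keeps scaling

ai_twh_2024 = [round(DATA_CENTRE_TWH_2024 * s) for s in ai_share_2024]
ai_twh_2030 = [round(DATA_CENTRE_TWH_2030 * s) for s in ai_share_2030]

print(f"AI today:   ~{ai_twh_2024[0]}-{ai_twh_2024[1]} TWh/year")
print(f"AI in 2030: ~{ai_twh_2030[0]}-{ai_twh_2030[1]} TWh/year")
# Roughly 21-62 TWh/year today, and 331-472 TWh/year in 2030
```

Even the top of today's range is a fraction of one percent of global electricity; it's the 2030 range where the growth argument starts to bite.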
So today, the majority of data-centre energy is still going to the “boring” stuff: serving video, running social feeds, ad tech, cloud storage, and regular enterprise workloads.
Think of it as the newest tenant in a building that was already over-air-conditioned.
Streaming and social: the original energy hogs
This is the part that rarely makes the headlines.
Work from researchers like Andy Masley has shown that streaming Netflix and YouTube consumes far more energy overall than services like ChatGPT, once you scale to global usage. In one comparison, he estimated that the annual energy use associated with ChatGPT was comparable to that of a small U.S. region, while video streaming as a whole matched the electricity use of all of New England plus New York.
Older IEA work on streaming puts one hour of video on a smartphone over Wi-Fi at about 0.037 kWh of electricity, most of which goes to data transmission and the device, not the data centre alone. That sounds small until you multiply it by billions of hours of video per day.
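To see how quickly 0.037 kWh per hour adds up, here's a tiny illustrative calculation. The per-hour figure is the IEA estimate above; the one billion daily streaming hours is an assumption for scale (YouTube alone has publicly cited numbers of that order), not a measured global total.

```python
# Illustrative only: scale the IEA per-hour streaming estimate up to global volume.
KWH_PER_STREAMING_HOUR = 0.037       # IEA estimate: 1 hour of video, smartphone over Wi-Fi
ASSUMED_DAILY_HOURS = 1_000_000_000  # assumption: one billion streaming hours per day

daily_kwh = KWH_PER_STREAMING_HOUR * ASSUMED_DAILY_HOURS
annual_twh = daily_kwh * 365 / 1e9   # kWh -> TWh

print(f"~{annual_twh:.1f} TWh per year per billion daily streaming hours")
# ~13.5 TWh/year -- and real-world viewing is a multiple of one billion hours a day
```

Stack a few platforms on top of each other, each at billions of hours per day, and you're quickly in the same territory as today's entire AI slice.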
On top of that, Canadian research has suggested that streaming video alone contributes over 1% of global greenhouse gas emissions, driven by sheer volume and a classic rebound effect: the easier it is to stream, the more we do it.
So when we talk about “the internet’s energy problem,” we’re really talking about an attention economy problem:
Doomscrolling
Infinite video feeds
Autoplay everything
Cloud gaming
And, increasingly, AI on top of all that
So why does AI feel like the villain?
Part of it is visibility. Large language models are tangible. You type a prompt, something happens, and journalists can point at a single query and say: “This uses 10× a Google search.” They’re not wrong; estimates put a typical ChatGPT query at roughly that order of magnitude.
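For a sense of scale, here's a rough per-query comparison. The 0.3 Wh and 3 Wh figures are commonly cited estimates (and contested), so treat them as assumptions; only the roughly 10× ratio comes from the reporting above.

```python
# Rough per-query comparison; both per-query figures are commonly cited estimates,
# not measurements.
GOOGLE_SEARCH_WH = 0.3   # often-quoted estimate for one Google search
CHATGPT_QUERY_WH = 3.0   # often-quoted estimate for one ChatGPT query (~10x)

STREAMING_HOUR_WH = 37   # the IEA per-hour streaming figure from above, in Wh

print(f"ChatGPT / Google ratio: {CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH:.0f}x")
print(f"One hour of streaming ≈ {STREAMING_HOUR_WH / CHATGPT_QUERY_WH:.0f} ChatGPT queries")
# ~10x per query, and one streamed hour is on the order of a dozen queries
```

The exact numbers are disputed, but the order of magnitude is the point: a single query is tiny, and one streamed hour already buys you about a dozen of them.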
But most people don’t think of five hours of TikTok or Netflix as “energy use.” It’s just Tuesday.
AI becomes the villain because it’s new, concentrated, and visible, while streaming and social are just “background noise” even though they still dominate the energy pie in absolute terms.
If you only blame AI, you’re not doing climate policy. You’re doing narrative management.
Where AI is a real grid problem
Now for the part where I agree with the critics.
Even if AI is a minority slice today, it’s driving where and how fast data-centre demand grows. The IEA, Nature, and others all converge on the same point: data-centre electricity use is likely to at least double by 2030, largely because of AI workloads.
BloombergNEF projects U.S. data-centre power demand reaching about 106 gigawatts by 2035, a sharp jump from earlier forecasts. Pew Research estimates data centres already account for about 4% of U.S. electricity use, with demand expected to more than double by 2030.
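To turn those projections into comparable units, here's a sketch that converts BloombergNEF's 106 GW into annual energy. The 70% average utilisation and the roughly 4,200 TWh U.S. total are assumptions for illustration, not part of either forecast.

```python
# Convert a projected power demand (GW) into annual energy (TWh) and a rough share.
# The 70% average utilisation and the ~4,200 TWh U.S. total are assumptions for scale.
PROJECTED_GW_2035 = 106          # BloombergNEF projection for U.S. data centres
ASSUMED_UTILISATION = 0.70       # assumption: average load as a fraction of peak demand
US_ANNUAL_TWH_TODAY = 4200       # assumption: rough current U.S. electricity generation

annual_twh = PROJECTED_GW_2035 * 8760 * ASSUMED_UTILISATION / 1000  # GWh -> TWh
share_of_todays_grid = annual_twh / US_ANNUAL_TWH_TODAY

print(f"~{annual_twh:.0f} TWh/year, ~{share_of_todays_grid:.0%} of today's U.S. generation")
# ~650 TWh/year, roughly 15% of today's grid
```

Nudge the utilisation assumption and that share moves a few points either way, but the direction is the same.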
That doesn’t mean “AI breaks the grid globally,” but it does mean local pain:
Stressed regional grids
Higher prices near hyperscale clusters
Awkward conversations about who gets power priority
This is where my inner infrastructure nerd and my inner pragmatist meet.
We’ve solved this kind of problem before:
Specialized AI accelerators and ASICs can deliver order-of-magnitude efficiency gains over general-purpose hardware. Some surveys put performance-per-watt improvements in the 10–50x range for certain workloads compared to classic CPUs and GPUs. Google’s TPU v4 shows roughly 2.7x better performance per watt than its previous generation. Startups like Positron claim their inference ASICs can beat Nvidia’s H200 systems on throughput while using about one-third of the power.
That doesn’t magically erase training footprints or embodied emissions, but it does mean the “AI will eat the grid” narrative is not a law of physics. It’s an engineering problem.
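A toy model makes that framing concrete: net energy growth is roughly workload growth divided by efficiency gain. The multipliers below are illustrative, not forecasts; the 2.7x and 10x figures echo the hardware examples above.

```python
# Toy model: net energy growth = workload growth / efficiency gain.
# Both multipliers are illustrative; the point is the shape of the trade-off.
workload_growth = 4.0        # e.g. AI compute demand quadruples
efficiency_scenarios = {
    "status quo hardware": 1.0,
    "TPU-v4-style step (2.7x)": 2.7,
    "aggressive ASIC shift (10x)": 10.0,
}

for name, gain in efficiency_scenarios.items():
    net = workload_growth / gain
    print(f"{name:28s} -> net energy x{net:.1f}")
# 4.0x, ~1.5x, 0.4x: the same demand curve can mean very different grids
```

That's the sense in which this is an engineering problem: the outcome depends as much on the denominator as on the demand curve.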
So what the hell are we actually arguing about?
If you care about the energy impact of AI, here are the more useful questions:
How fast can we push the transition from general-purpose GPUs to more efficient accelerators?
How do we stop building data centres in places where the grid is already on life support?
How do we account for all high-bandwidth attention platforms (video, social, gaming, AI) instead of picking a single villain of the week?
And how do we design policy that rewards genuine efficiency, not just better PR?
AI is not innocent. But it’s also not the only one at the buffet.
If we only yell at AI while ignoring streaming, social and the rest of the modern internet stack, we’re not saving the planet. We’re just doing climate cosplay.
And to borrow from a metaphor I used elsewhere: that would be like James Bond taking out No. 3 and ignoring the rest of SPECTRE.
