The AI Climate Panic Is Built on Bad Math
- Rich Washburn



There's a conversation happening right now that sounds informed but isn't. It goes something like this: AI is destroying the planet. One query uses a pound of water. AI is consuming hundreds of millions of gigawatts of electricity. We're cooking the Earth so you can ask ChatGPT what to make for dinner. Let me stop you right there. These claims aren't just overstated — they're factually wrong, and the units alone tell you everything you need to know about who's driving the narrative.
Let's Talk Units First
When someone measures water consumption in pounds, that's your first red flag. Water is measured in gallons, liters, or cubic meters — not pounds. That framing is designed to feel heavier than it is. Literally.
The most-cited actual figure comes from a 2023 UC Riverside study estimating that ChatGPT uses roughly 500 ml of water per 20–50 queries. That works out to about 10–25 ml per query: half a liter covers a sustained conversation, not a single query, and it's nowhere near a pound. Context matters. Units matter more.
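Taking the study's range at face value, the per-query arithmetic is trivial to check yourself (a quick sketch using only the figures quoted above; the pound-of-water conversion is a standard one, not from the study):

```python
# Per-query water use implied by the UC Riverside estimate:
# ~500 ml of water per 20-50 ChatGPT queries (figures from the text above).
WATER_ML = 500
QUERIES_LOW, QUERIES_HIGH = 20, 50

ml_per_query_high = WATER_ML / QUERIES_LOW   # worst case: 25 ml per query
ml_per_query_low = WATER_ML / QUERIES_HIGH   # best case: 10 ml per query

# A pound of water is ~454 ml (1 g of water is ~1 ml), so even the worst
# case is a small fraction of the "a pound per query" claim.
POUND_ML = 453.6
print(f"{ml_per_query_low:.0f}-{ml_per_query_high:.0f} ml per query")
print(f"worst case = {ml_per_query_high / POUND_ML:.1%} of a pound of water")
```

Even the pessimistic end of the range is about a twentieth of a pound per query.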
The Electricity Numbers Don't Add Up Either
"Hundreds of millions of gigawatts used by AI" — this one isn't in the same zip code as reality. The IEA's most recent analysis puts all data centers at roughly 1–2% of global electricity consumption today, projected to rise to around 3% by 2030 as AI workloads grow. That's real, it's worth monitoring, and infrastructure investment needs to keep pace. But hundreds of millions of gigawatts would exceed the entire electricity output of Earth by orders of magnitude. That's not hyperbole — it's a number that was never fact-checked before it got shared.
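A back-of-the-envelope check makes the absurdity concrete. Assuming global electricity generation of roughly 30,000 TWh per year (a widely cited ballpark, not a figure from this article), the world's average continuous output is only a few thousand gigawatts:

```python
# Sanity-check "hundreds of millions of gigawatts" against world electricity supply.
# Assumption: global generation of ~30,000 TWh/year, a commonly cited ballpark.
HOURS_PER_YEAR = 8760
world_twh_per_year = 30_000
world_avg_gw = world_twh_per_year / HOURS_PER_YEAR * 1_000  # ~3,400 GW average

claimed_gw = 200_000_000  # "hundreds of millions of gigawatts"
print(f"World average output: ~{world_avg_gw:,.0f} GW")
print(f"The claim exceeds it by roughly {claimed_gw / world_avg_gw:,.0f}x")
```

The claimed figure overshoots the planet's entire generating output by a factor of tens of thousands — not a rounding error, a category error.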
[Source: IEA, Energy and AI Report — iea.org/reports/energy-and-ai]
Here's What the Comparisons Actually Look Like
You want context? Here's context. A typical ChatGPT query uses about 0.3 watt-hours of electricity. A Google search uses roughly the same — about 0.3 Wh. We've been fine with Google for 25 years.
[Source: Epoch AI — epoch.ai/gradient-updates/how-much-energy-does-chatgpt-use]
Streaming one hour of Netflix uses approximately 0.077–0.24 kWh — that's 77 to 240 watt-hours, depending on device and network. One hour of Netflix can equal hundreds of ChatGPT queries in energy terms. Nobody's canceling their subscription over it.
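The "hundreds of queries per Netflix hour" framing follows directly from the two figures cited above:

```python
# How many ~0.3 Wh ChatGPT queries fit in one hour of Netflix streaming?
# Figures from the sources cited above: 0.077-0.24 kWh (77-240 Wh) per hour.
QUERY_WH = 0.3
netflix_wh_low, netflix_wh_high = 77, 240

queries_low = netflix_wh_low / QUERY_WH    # ~257 queries
queries_high = netflix_wh_high / QUERY_WH  # 800 queries
print(f"One Netflix hour ≈ {queries_low:.0f}-{queries_high:.0f} ChatGPT queries")
```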
[Source: IEA Carbon Footprint of Streaming — iea.org/commentaries/the-carbon-footprint-of-streaming-video-fact-checking-the-headlines]
Golf uses more water than AI. The USGA estimates U.S. golf courses use approximately 2.08 billion gallons of water per day for irrigation. The entire U.S. data center industry — running everything from AI to streaming to banking — uses far less. Nobody is calling for abolishing golf in the name of climate justice. Though personally, I have other reasons to consider it.
[Source: USGA Water Resource Center — usga.org/content/usga/home-page/course-care/water-resource-center]
The Real Conversation Worth Having
None of this means AI infrastructure has zero footprint. It has one, and it's growing fast. The IEA projects data center electricity demand could double by 2030. There are specific facilities — particularly in water-stressed regions like the American Southwest — where cooling decisions were made without adequate local resource planning. Those are legitimate, addressable problems. But "AI is causing global warming" isn't an address. It's a headline. And bad headlines drive bad policy, bad investment decisions, and a public that's scared away from the most productive technology we've built in a generation.
The conversation that actually moves things forward:
- Which specific data centers are in water-stressed areas?
- What cooling technologies (air cooling, closed-loop systems, direct liquid cooling) reduce consumption?
- How do we co-locate AI infrastructure with renewable energy sources?
- How do we standardize reporting so comparisons mean something?
That conversation is happening — among engineers, operators, and serious policymakers. It doesn't need manufactured panic. It needs accurate data.
The Bottom Line
If your argument relies on measuring water in pounds and citing electricity figures that exceed the output capacity of Earth, you're not making a climate argument. You're making noise. AI uses power. So does everything. The question is whether the value created justifies the cost — and by that measure, AI is among the most productive tools humanity has ever deployed.
Now if you'll excuse me, I'm going to use ChatGPT. It'll consume less energy than the lamp I'm sitting under.



