9 Comments
Tanner Janesky:

Thanks for the analysis, Hannah. Good research. Even though energy use per query is very low and will most likely get lower, we need to consider the rebound effect. By making AI efficient, easy, and cheap, more and more systems will rely on it. Many already do. As more users and back-end systems use it, overall energy requirements will increase. Lighting followed the same path: even though LEDs are vastly more efficient than the first electric lightbulbs, we now use far more electricity for lighting overall because it has become efficient, cheap, and ubiquitous. Human civilization tends to increase in complexity and energy intensity over time, even as any individual technology gets more efficient.

Jasmine Wolfe:

This doesn't take into account the electricity and water used to COOL THE DATA CENTERS. This is industry propaganda.

Rich Miller:

Thanks for the detailed analysis. It's interesting to see the amount of debate every time an AI company releases an energy data point.

What's clear, and important, is that Google is very rapidly reducing the energy and carbon impact of its AI infrastructure. This should push its rivals to do the same.

Jasmine Wolfe:

How much electricity and water are being used to COOL the DATA CENTERS? That's what's important.

Moments of Wonder:

Amazingly detailed research, thank you. In other words, AI burns off brain cells as well as electricity!

Jonathan Irons:

Thank you so much for following up with this. Your reports are valuable when discussing this, and there's a lot of heat, especially in climate-conscious circles, around the subject.

That said, I would take Google's numbers with a huge pinch of salt. They have been notoriously and consistently nebulous about their emissions, and wilfully conflate Scopes 1, 2, and 3.

The real question is not the carbon footprint of *me* using ChatGPT; it's the carbon footprint *increase* from this huge surge towards AI. As ever, Jevons keeps looking over our shoulder here.

John McGrath:

For a point of comparison, the carbon footprint of a hamburger is about 4 kg CO2e. At an average US emissions factor, that's the CO2e from roughly 11 kWh of electricity generation. Comparing 11 kWh to 0.3 Wh, one hamburger is equivalent to about 36,700x the electricity of one LLM query, on average.

Forgo the occasional hamburger, and use AI to your heart's content.
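The arithmetic in that comparison can be sketched in a few lines (the ~4 kg CO2e per hamburger, ~11 kWh grid-electricity equivalent, and 0.3 Wh per query are the comment's own assumptions, not measured values):

```python
# Back-of-envelope comparison, using the figures assumed in the comment above.
burger_kwh = 11.0  # hamburger's ~4 kg CO2e expressed as kWh of average US grid electricity
query_wh = 0.3     # per-query energy figure cited in the post

# Convert kWh to Wh, then divide by the per-query energy.
ratio = burger_kwh * 1000 / query_wh

print(f"One hamburger ~= {ratio:,.0f} LLM queries' worth of electricity")
# prints: One hamburger ~= 36,667 LLM queries' worth of electricity
```

The exact multiplier shifts with the assumed grid emissions factor, but the order of magnitude (tens of thousands of queries per hamburger) is robust to that choice.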

Chris Schuck:

Thanks for all this research. Is it possible to clarify the degree to which these figures already incorporate the total energy and water costs of building, maintaining and growing the data center infrastructure as a whole? To my mind this was always the biggest concern - the total network effects - not the per capita marginal cost of an individual user.

Also: if a user dumps an entire full-length document/article into the prompt and asks AI to summarize or process, that's much longer than the "average" prompt referenced here, correct? Because that's how many people I know use it.

Matt Ball:

As always, thank you for doing the research and using your platform to help bring focus back to where it should be (at least if we want to make a real difference).
