Discussion about this post

Tanner Janesky

Thanks for the analysis, Hannah. Good research. Even though energy use per query is very low and will most likely get lower, we need to consider the rebound effect. By making AI efficient, easy, and cheap, more and more systems will rely on it; many already do. As more users and back-end systems use it, overall energy requirements will increase. The first electric lightbulbs were horribly inefficient, yet we now use far more electricity for lighting because LEDs have become efficient, cheap, and ubiquitous. Human civilization tends to increase in complexity and energy intensity over time, even if any individual technology gets more efficient.
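(A toy back-of-the-envelope sketch of that rebound dynamic. All numbers here are made up purely to show the arithmetic, not estimates of actual AI energy use: if per-query energy falls 10x but query volume grows 50x, total energy still rises 5x.)

```python
# Toy illustration of the rebound effect: total energy is per-query energy
# times query volume, so efficiency gains can be swamped by usage growth.
# The figures below are entirely hypothetical.

def total_energy_wh(energy_per_query_wh: float, queries_per_day: float) -> float:
    """Total daily energy (Wh) = per-query energy (Wh) x queries per day."""
    return energy_per_query_wh * queries_per_day

# Per-query energy falls 10x, but daily query volume grows 50x.
before = total_energy_wh(energy_per_query_wh=3.0, queries_per_day=1e8)
after = total_energy_wh(energy_per_query_wh=0.3, queries_per_day=5e9)

print(f"Before: {before / 1e6:,.0f} MWh/day")  # 300 MWh/day
print(f"After:  {after / 1e6:,.0f} MWh/day")   # 1,500 MWh/day
```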

Jonathan Irons

Thank you so much for following up on this. Your reports are valuable in these discussions, and there's a lot of heat around the subject, especially in climate-conscious circles.

That said, the pinch of salt I would take when processing Google's numbers is huge. They have been notoriously and consistently nebulous about their emissions, and wilfully conflate Scope 1, 2, 3 etc.

The real question is not the carbon footprint of "me" using ChatGPT; it's the carbon footprint *increase* from this huge surge towards AI. As ever, Jevons keeps looking over our shoulder here.

