28 Comments

Andy Masley:

Huge fan of your work so the shoutout means a lot, thanks so much for sharing the posts and adding context!

Anna Ratkai:

Thanks for this article, Hannah. After reading it, I have a few thoughts; I would be interested to hear your take on them.

1) Maybe one individual LLM use does not have a crazy emissions/water impact, but that doesn't mean we should downplay our collective impact. For example, I could argue that my individual fast fashion consumption has a low impact: only a couple of thousand liters of water, plus a few kilos of textile waste per year. Who cares? Maybe true; my individual actions, looked at in a bubble, don't have the biggest impact. But the thing is, I'm not the only one doing it. We do it collectively, and when everyone is buying fast fashion / using LLMs, our impacts add up. And it's not just the environmental impact: using and engaging with AI signals to companies that there is demand, so they will do even more of it. (I want to mention that I agree there are good use cases for AI, but when it is shoved into every single product and lets us do useless things (generate funny images for no reason) or even harmful ones (spew out misinformation shared on social media), it is a misuse of our resources. And we definitely don't need more AI that does these useless / harmful things.)

2) There is also an indirect environmental cost of using AI, in the form of mining, toxic chemical use, and the environmental destruction involved in digging out all the raw material needed to produce the hardware AI runs on. I would love to see that accounted for.

So I'm not anti-AI, but I'm sceptical & cautious, and I believe both individuals and companies should be more intentional with their AI use.

Jadzania:

Very good point and something that concerns me, too!

Andy Masley:

I talk about both of these points in a lot of detail in the cheat sheet post fwiw

AVS:

Hi there. I really agree that some uses of AI seem completely useless and detrimental; misinformation, for instance, is bad, and AI-driven misinformation is bad. I want to point out that these are valid arguments on their own. We do not need to make it about climate change. The goal of human societies is not only to stop climate change; human well-being is an important goal, for instance. There is no need to frame everything that is bad for society as something bad for the climate or the environment.

Jonah Golden:

How can you just ignore the most energy-intensive parts of the process: model training, data centers, etc.?

WallFlamingo:

"Including the cost of training raises the energy cost per prompt by 10%"

You should read the linked cheat sheet to answer your questions.

Sineira:

Looking at the numbers, that statement seems completely wrong.

It's the query that's 10% of the training. And the query is 10 times higher than a search. So...

Andy Masley:

Where are you getting the number implying the query is 10% of the training? I’ve never seen anything that implied that. It’d mean ChatGPT’s been used 100x less than I assumed.

Sineira:

"An informal online estimate for ChatGPT indicates that it produces 0.382 g CO2e per query".

"Assuming that ChatGPT undergoes a full re-training of the model once per month and continues with an estimated 10,000,000 queries per day, the 552 metric tons divided by 300,000,000 queries equates to 1.84 g CO2e per query for the amortized training cost."

Andy Masley:

Can you share the link to what you're citing?

Sineira:

It was linked in the article as the "best estimates".

https://www.nature.com/articles/s41598-024-54271-x

Richard:

I read the original piece by Andy Masley and agreed with the analysis, but over time I've become unsure again, mainly triggered by watching the Studio Ghibli fad unfold.

Image generation (and video generation) must have a higher impact than text-only, and we should probably consider the utility of it all as well. I mean, what was the point of the Ghibli thing, really? If you eat a burger it has high utility - you stay alive for another day or two. If you generate a bunch of images (probably three for every one you actually want, given how the output isn't usually exactly what you hoped for), it has low utility. Millions of people generating pointless bullshit collectively could have a negative impact and shouldn't really be compared with the amount of water used by the food supply chain or the amount of electricity used by aircon.

Andy Masley:

What’s the point of fan art or video games? Both also aren’t necessary but make people happy. The Ghibli trend was fun for people and used tiny amounts of energy for each image. When other goofy trends like this happen online people never bring up the carbon cost, even though they always also have one. I suspect people are upset at AI for other reasons and are using the carbon cost to criticize it, but imo the point would make more sense if they just criticized AI directly instead of this roundabout “oh it uses 3Wh” thing

Sineira:

This is only taking into account the processing on the servers, not the infrastructure needed for the delivery.

And a basic sanity check tells me this is wrong. The added electricity consumption is too high to be waved off as nothing more than parts of a light bulb. It doesn't compute.

https://www.scientificamerican.com/article/the-ai-boom-could-use-a-shocking-amount-of-electricity/

David Steigerwald:

Yes, we are having huge controversies over proposed data centers, each of which proposes to use the energy of thousands of households, throwing the state’s sustainability plans out the window. If it isn’t AI, what is consuming all that energy?

https://cardinalnews.org/2025/04/11/energy-demand-will-outstrip-supply-in-virginia-as-data-centers-proliferate/

Andy Masley:

I go into a lot of detail about this in the cheat sheet post. AI more broadly is an environmental problem. Chatbots are an extremely tiny fraction of AI energy use.

Sineira:

It also ignores the energy wasted during idle time.

The total usage of the servers needs to be accounted for, not just the usage for each query.

Andy Masley:

Do you think that'd add that much? I also ignore the idle time of servers running YouTube videos. This applies to everything we do online, so if I include those, the energy costs of everything else will go up too, and ChatGPT won't really stand out.

Sineira:

I do think it adds a lot. These servers are dedicated AI servers.

The amount of power installed for them and these calculations don't add up.

Andy Masley:

They don't add up because almost all energy used for AI isn't used for chatbots.

Jake Carroll:

As with most things, it also comes down to trade-offs and intention. If I’m using an LLM to help build my impact business, I’d imagine the net benefit far outweighs the cost.

This is an incredible piece. As someone who has the ChatGPT guilt often this helps a lot. Thank you for the work you do!

quinibuzz:

I can easily save 5 Google searches if I use ChatGPT instead.
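As a rough check of that trade-off, here is a minimal sketch using the ballpark figures mentioned elsewhere in this thread (~3 Wh per ChatGPT prompt, with a search taken as roughly a tenth of that); both numbers are informal estimates, not measurements.

```python
# Back-of-the-envelope comparison using rough figures cited in this thread.
chatgpt_wh_per_prompt = 3.0   # the "~3 Wh" figure mentioned above (a rough estimate)
search_wh = 0.3               # assumes a search is ~1/10 of a prompt, per the claim above

searches_replaced = 5
net_wh = chatgpt_wh_per_prompt - searches_replaced * search_wh
print(f"Net energy of one prompt vs. {searches_replaced} searches: {net_wh:+.1f} Wh")
# ~+1.5 Wh: under these assumptions one prompt still uses a bit more than the
# 5 searches it replaces; it would only break even at roughly 10 searches replaced.
```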

Tom Mikulka:

Great that I don't have to feel bad about my personal use, but I do feel bad about those AI data centers that will be powered by gas or coal. I also have big problems with the dumbing down of humanity. The recent NY Times article on AI hallucinations had many comments about students relying on AI rather than developing writing and critical thinking skills. Lawyers are submitting AI-generated briefs riddled with mistakes. Bill Gates can brag that AI will replace doctors and teachers, but I am reminded of Oppenheimer's comment: "Now I am become death, the destroyer of worlds." AI is being used for evil purposes as I write. Pandora's box has been opened.

Anneke Hobson:

Thank you so much Hannah! This is incredibly helpful information.

Grant Mowry:

Which specific model are we talking about here? I use the advanced reasoning models a lot for writing code, which probably consumes much more power than your analysis suggests, but it is also worth considering the amount of human work/time saved. I’d be curious what the carbon footprint is of a person spending a couple of extra minutes Google-searching because they didn’t get as useful an answer as they were looking for.

rahul razdan:

Nice work... it is nice to get this perspective
