90 Comments
Anna Ratkai:

Thanks for this article, Hannah. After reading, I have a few thoughts; I'd be interested to hear your take on them.

1) Maybe one individual LLM use does not have a crazy emission/water impact, but that doesn't mean we should downplay our collective impact. For example, I could argue that my individual fast fashion consumption has a low impact: only a couple of thousand litres of water plus a few kilos of textile waste per year. Who cares? Maybe true; my individual actions, looked at in a bubble, don't have the biggest impact. But the thing is, I'm not the only one doing it. We do it collectively, and when everyone buys fast fashion or uses LLMs, our impacts add up. And it's not just the environmental impact: using and engaging with AI is a signal to companies that there is demand, so they will do even more of it. (I want to mention that I agree there are good use cases for AI, but when it is shoved into every single product and lets us do useless things (generating funny images for no reason) or even harmful ones (spewing out misinformation shared on social media), it is a misuse of our resources. And we definitely don't need more AI that does these useless or harmful things.)

2) There is also an indirect environmental cost of using AI, in the form of the mining, toxic chemical use, and environmental destruction involved in digging out all the raw materials needed to produce the hardware AI runs on. I would love to see that accounted for.

So I'm not anti-AI, but I'm sceptical & cautious, and I believe both individuals and companies should be more intentional with their AI use.

AVS:

Hi there. I really agree that some uses of AI seem completely useless and detrimental. For instance, misinformation is bad. I want to point out that these are valid arguments on their own. Misinformation is bad; AI-driven misinformation is bad. We do not need to make it about climate change. The goal of human societies is not only to stop climate change; human well-being is an important goal too, for instance. There's no need to frame everything bad for society as something bad for the climate or the environment.

Artur Neland:

It can also be about wasting time discussing things you already know or telling jokes to it.

Jadzania:

Very good point and something that concerns me, too!

Andy Masley:

I talk about both of these points in a lot of detail in the cheat sheet post fwiw

Tim Bushell:

... but does 'text' AI add to 'old style' Google searches, or is it 100% new and extra?

Not sure of the split, but I 'feel' (as I haven't yet tried to measure it) that my Google searching has reduced, and within tools like Elicit and Research Rabbit an AI question gets me to a better answer faster. So off...

... I wander, to think about how to measure that better.

Alexander Schalk:

I believe it also comes down to the question of how much more positive impact can be achieved with AI compared to working without it. For example, if a project that serves the world would take me twice as long (or more) without AI, the energy/water consumption of the LLM would be worth it in my opinion.

So I don't think you can really compare it to consumer products that get thrown away after a couple of months or years; in the end, LLMs just make things more efficient in the workplace. I also believe the "easy" Google-style questions aren't asked as often as work-related ones, and even when they are, I'd guess the easy questions don't need as many resources.

Andy Masley:

Huge fan of your work so the shoutout means a lot, thanks so much for sharing the posts and adding context!

Richard:

I read the original piece by Andy Masley and agreed with the analysis but then over time I've become unsure again, mainly triggered by watching the Studio Ghibli fad unfold.

Image generation (and video generation) must have a higher impact than text-only, and we should probably consider the utility of it all as well. I mean, what was the point of the Ghibli thing, really? If you eat a burger it has high utility - you stay alive for another day or two. If you generate a bunch of images (probably three for every one you actually want, given how the output isn't usually exactly what you hoped for), it has low utility. Millions of people generating pointless bullshit collectively could have a negative impact and shouldn't really be compared with the amount of water used by the food supply chain or the amount of electricity used by aircon.

Andy Masley:

What’s the point of fan art or video games? Neither is necessary, but both make people happy. The Ghibli trend was fun for people and used tiny amounts of energy per image. When other goofy trends like this happen online, people never bring up the carbon cost, even though those trends always have one too. I suspect people are upset at AI for other reasons and are using the carbon cost to criticize it, but imo the point would make more sense if they just criticized AI directly instead of this roundabout “oh, it uses 3 Wh” framing.

Richard:

Perhaps we should think about the carbon cost of the other stuff too, though. One thing that struck me from your article was the high cost of YouTube, with the broad implication of "you don't care much about that, do you?" I definitely had never thought about it, and I'm wondering whether we should care. Just because something makes people happy is not a slam-dunk moral argument; people used to nail cats to trees for fun, but we kind of outgrew that. Weighing the environmental cost of everything is probably a new level of sophistication in understanding the world that the generations coming through will naturally have a better handle on.

Buzen:

How much carbon was emitted transmitting your banal points to my device? Excuse me while I go nail a cat to a tree, in atonement (purring emits gigatons of carbon globally, and for what? Making pet owners happy).

Lori Williamson:

. . . and we're back to throwing rocks at the moon.

As a species, we seem hell-bent on proving that evolution works backwards, too.

Concetta:

The Ghibli trend was horrifying for other reasons - namely, the ethics of taking something from a studio that is dedicated to human art and commodifying it into a "prompt and generate throwaway media" situation.

Reading the papers in Nature and elsewhere, the calculation is that one image takes about 4.5 Wh to generate. Given that most people don't usually accept the first image, and that Midjourney, Copilot, and other tools produce FOUR images at once, the energy use of image generation is definitely larger. We just don't know how much larger, as none of the companies will publish stats on how many images people are actually producing.

And as the image generators, and now video generators, get higher resolution AND more popular, the amount of energy needed is only going to grow.

I'm not anti-use of it, but these "trendy" image generations feel like wasting energy on something unnecessary. It makes much more sense to generate an image when you can't find what you're looking for already, or if you're really into making digital art, not to churn out an image that looks like a million others on social media. In a click or two it's generally easy to find a free image to use in Canva or SharePoint (for my usual needs), and we don't need generative AI for that. That's where the energy use, even if it is as small per person as argued, feels wasteful.
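
Rough arithmetic on the figures quoted here, as a sketch only: the 4.5 Wh-per-image number above, the four-images-per-generation behaviour mentioned, and the earlier guess in this thread of roughly three generations per image actually kept are all assumptions from the discussion, not measured values.

```python
# Back-of-envelope energy per image you actually keep, using figures quoted in this thread.
# All inputs are assumptions from the discussion, not measured values.

wh_per_image = 4.5          # Wh per generated image (figure cited above)
images_per_generation = 4   # tools like Midjourney return four candidates per prompt
generations_per_keeper = 3  # rough guess from earlier in the thread: ~3 tries per kept image

wh_per_kept_image = wh_per_image * images_per_generation * generations_per_keeper
print(f"~{wh_per_kept_image:.0f} Wh per image you actually keep")  # ~54 Wh under these assumptions
```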

Andy Masley:

4.5 (or 12) Watt-hours is still pretty crazy small. For context, if you're searching online for an image using your laptop, you'd need to make sure to spend less than 8 minutes searching to find the exact image you want to beat the energy efficiency of the AI image generator. I don't have a problem with people spending energy to search for bland or goofy images, so I don't have a problem with AI either. I'm a huge fan of Miyazaki, but I've also never been bothered by cheap or bad fan art of Miyazaki movies. I see AI Miyazaki art as basically a generator that creates okay fan art. I think a lot of people actually got a ton of value out of that. Several people I know actually found drawings of themselves with their partners or friends very moving in an unexpected way. Ultimately I want to be pluralistic about other people's taste. Even if I think people's taste is sometimes goofy or bad, it's totally fine if they want to spend very small amounts of energy to pursue that goofy taste. Bad fan art also uses energy and is "unnecessary" but that seems like a draconian way of thinking about how to spend energy. Let people like what they like as long as it's not adding much to their energy budgets.
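
A minimal sketch of the break-even comparison made above; the laptop power draw is an assumed figure (roughly 35 W while browsing), not a number from the post.

```python
# Break-even time for manually searching for an image vs. generating one.
# The laptop wattage is an assumed figure for illustration.

image_generation_wh = 4.5   # Wh per AI-generated image (figure discussed above)
laptop_watts = 35           # assumed average laptop power draw while searching

break_even_minutes = image_generation_wh / laptop_watts * 60
print(f"Searching for more than ~{break_even_minutes:.0f} minutes uses more energy "
      f"than generating one image")  # ~8 minutes under these assumptions
```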

Buzen:

And what about all the energy wasted making music? And how much oil goes into a modern painting?

We should all just sit quietly in cold dark rooms silently contemplating our coming climate apocalypse. And no crying or heavy sighing, as we know those also emit carbon.

zoe:

The difference is that those are beautiful works of human art vs. a machine that simply copies work already done by past artists. One has emotion & meaning and one is a replica done by a computer. This is such a bad straw man argument.

Sineira:

This is only taking into account the processing on the servers, not the infrastructure needed for the delivery.

And a basic sanity check tells me this is wrong. The added electricity consumption is too high to be waved off as a fraction of a light bulb's worth. It doesn't compute.

https://www.scientificamerican.com/article/the-ai-boom-could-use-a-shocking-amount-of-electricity/

David Steigerwald:

Yes, we are having huge controversies over proposed data centers, each of which proposes to use the energy of thousands of households, throwing the state's sustainability plans out the window. If it isn't AI, what is consuming all that energy?

https://cardinalnews.org/2025/04/11/energy-demand-will-outstrip-supply-in-virginia-as-data-centers-proliferate/

Andy Masley:

I go into a lot of detail about this in the cheat sheet post. AI more broadly is an environmental problem. Chatbots are an extremely tiny fraction of AI energy use.

Sineira:

It also ignores the energy wasted during idle time.

The total usage of the servers needs to be accounted for, not just each query.

Andy Masley:

Do you think that'd add that much? I also ignore the idle time of servers running YouTube videos. This applies to everything we do online, so if I include those, the energy costs of everything else will go up too, and ChatGPT won't really stand out.

Sineira:

I do think it adds a lot. These servers are dedicated AI servers.

The amount of power being installed for them and these calculations don't add up.

Andy Masley:

They don't add up because almost all energy used for AI isn't used for chatbots

Concetta:

It would add a lot to it. But I think it also highlights just how much we don't know. We don't know whether a server is dedicated to a particular LLM or also serves other purposes, or how many servers are dedicated to a particular LLM. We just know that OpenAI or Google or Anthropic is opening a new data center.

I also think your calculation misses the fact that while each individual typed prompt might use only a small amount of energy, that leaves out the energy used to create and train the model. That cost comes down with every additional query, but it's not nothing. And now that LLMs are reaching out to the internet and pulling in information, there's a cost to that as well. The queries don't exist in a vacuum that discounts everything used to create them.

Buzen:

When you pop down to the pub for a pint with friends, the energy used to make that beer was more than a ChatGPT query, and you may not be accounting for the energy you and your friends used to get to the pub, or the lighting in the pub or the lorry to bring the beer to the pub.

Also, if you play trivia at the pub that’s a few hundred grams of CO2 per play.

Jonah Golden:

How can you just ignore the most energy intensive parts of the process; model training, data centers, etc?

Keshav:

"Including the cost of training raises the energy cost per prompt by 10%"

You should read the linked cheatsheet to answer your questions.

Sineira:

Looking at the numbers, that statement seems completely wrong.

It's the query that's 10% of the training. And the query is 10 times higher than a search. So ...

Andy Masley:

Where are you getting the number implying the query is 10% of the training? I’ve never seen anything that implied that. It’d mean ChatGPT’s been used 100x less than I assumed

Sineira:

"An informal online estimate for ChatGPT indicates that it produces 0.382 g CO2e per query".

"Assuming that ChatGPT undergoes a full re-training of the model once per month and continues with an estimated 10,000,000 queries per day, the 552 metric tons divided by 300,000,000 queries equates to 1.84 g CO2e per query for the amortized training cost."

Andy Masley:

Can you share the link to what you're citing?

Sineira:

It was linked in the article as the "best estimates".

https://www.nature.com/articles/s41598-024-54271-x

James Martin:

Huge fan of your book, Hannah, but concerned about this post. As I told Andy on LinkedIn: 1) we have no idea what ChatGPT’s real impact is, because OpenAI won’t tell us (why, if it’s so small?); 2) there are many less energy-consuming alternatives, it’s just that people don’t know about them; and 3) many macro-trends suggest that (generative/agentic) AI is a net negative for the climate, notably the fact that US coal power stations previously set to be shut down are now being kept running, partly to meet AI’s soaring electricity demand. So in this sense, AI is actually slowing down the energy transition. Everything else we know for sure is here: https://bettertech.blog/2025/04/19/ais-impacts-how-to-limit-them-and-why/ - TL;DR: let’s look at the bigger picture before saying "this is fine". It’s not fine at all.

jcb (May 12, edited):

Well said. What are you referring to with less energy-consuming alternatives?

Such a great point about the net negative climate impact. We can do math all day on the relatively small personal usage of ChatGPT but the fact remains that massive AI data centers are being built all over the world. That's a real impact.

I get that Hannah is comparing ChatGPT usage to our other everyday behaviors, but everything is additive and there is infrastructure behind it.

James Martin:

There are smaller LLMs that consume 30-60 times less energy than the bigger ones (like ChatGPT) with comparable performance. More on that here: https://www.linkedin.com/posts/jamesmartin75_so-youd-like-to-use-a-less-impactful-ai-activity-7327759658528497664-yC-T?utm_source=share&utm_medium=member_ios&rcm=ACoAAAB-X90BvzgQBovzuvLli_vQurS4maIr-f8 - but essentially, whatever ChatGPT’s impact is, it’s excessive and unnecessary. Though indeed the impact sits more at the infrastructure level than the individual level (as with streaming and a lot of other things).

Boris Zlatopolsky:

I appreciate this analysis, and maybe I don't need to shame myself and others for using AI tools because of the climate impacts. However, as others have pointed out, collectively this adds up. We're currently trying to clean up the atmosphere, yet at the same time we're adding new, unnecessary energy consumption that slows us down.

The bigger issue with LLMs, which many people using them happily ignore, is the material they're trained on, i.e. the IP theft and the inherent biases and misinformation. That is a far better reason not to use them.

Tim Bushell:

... "it is only one straw," said 7 billion users, etc., etc.

But I am still looking for the whole life cycle: the design, infrastructure, and construction of new data centres, rather than just the previously planned upgrades of older-style Google search centres.

jcb:

Yeah, I'm surprised this article focuses so much on percentages and relative personal use. It is often touted that the most impactful way to combat climate change at an individual level is through improved efficiency, e.g. better light bulbs, going all-electric at home, etc. Every percentage counts when we take a macro view.

I'm not saying this analysis is wrong or that we shouldn't focus on bigger efficiencies, but we are effectively ADDING more usage here so let's acknowledge that.

Also, a separate argument is about the validity of even using AI for the average person. It's one thing to need food and transportation; it's another to use an energy-wasteful tool that is in most cases inferior to actual googling, research, and critical thinking.

Lori Williamson:

Critical thinking? Perish the thought!

Tim Bushell:

Quiet... stats are 'nice' but so 'fakable'...

I am liking the recent (to me) one on the US average income:

$74,500... really?

$65,000 excluding the top 10 Americans... really?

$48,000 excluding the top 50 Americans... really?

$35,000 excluding the top 1,000 Americans...

Yet Walmart and McDonald's make multi-billion-dollar profits, while well over 50% of their employees are on benefits, paid not by the State or states but by "my tax dollars". : )))))))

Buzen:

I wonder how much energy a TikTok post (shaming people for their ungreen ChatGPT queries) uses, including the energy to show it to millions of viewers.

Maybe if the electricity grid managers in Spain had used ChatGPT to learn how a lack of inertial stability from over-reliance on solar power can cause grid failures, they could have prevented the blackout and saved a lot of energy and trouble. But they saved 3 Wh by not making that query.

quinibuzz:

I can easily save 5 google searches if I use ChatGPT instead.

jcb:

You're assuming what ChatGPT tells you is correct. It is NOT another Google, despite looking that way. Treat it like an assistant/tool, not a guru. I've gotten many incorrect results from it on basic questions, even on how to remove a type of stain from a shirt.

Due diligence through proper research and knowing your sources is very important. Humans are losing critical thinking skills with this; it's super depressing.

Concetta:

I would strongly suspect that is incorrect, unless you are searching extremely ineffectively. Unless I'm asking something really general or asking it to code something for me, I find much more information quickly and easily on Google (or even Bing, when I'm forced to) than I do on any of the LLMs.

Boris Zlatopolsky:

Every single time?

Jake Carroll:

As with most things, it also comes down to trade-offs and intention. If I'm using an LLM to help build my impact business, I'd imagine the net benefit far outweighs the cost.

This is an incredible piece. As someone who has the ChatGPT guilt often this helps a lot. Thank you for the work you do!

Tam Hunt:

This article looks just at the estimated energy demand of each search, or individual searches in the aggregate for each person, and argues that they’re negligible compared to other energy demand. But then he says “I am not saying that AI energy demand, on aggregate, is not a problem. It is, even if it’s “just” of a similar magnitude to the other sectors that we need to electrify, such as cars, heating, or parts of industry.”

And therein lies the rub: a lot of small things combined do, of course, lead to big impacts, and this can't be ignored. It's like saying "well, a little bit of oil leaking from my rig off the coast of Santa Barbara is a tiny fraction of the total oil used in the world." That's true, but it ignores the aggregate impact of leaks from all of those rigs.

And we know already that aggregate AI demand is massive, at a few % of total electricity use.

But the true rub is the exponential growth of that demand. When it's growing by roughly 100% every six months, it soon exceeds everything else. Literally. That's the point of my analysis here, "How AI and Bitcoin will eat the world": https://tamhunt.medium.com/how-ai-and-crypto-will-eat-the-world-189d242210e1
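
A toy projection of that claim, purely illustrative: the starting share and the doubling time below are assumptions (the real growth rate is contested), but they show how quickly a six-month doubling would overtake everything else.

```python
# Toy projection: demand doubling every six months soon exceeds total electricity use.
# Starting share and doubling period are assumptions for illustration only.

share = 0.03          # assume AI is roughly 3% of total electricity use today
doubling_months = 6   # assumed doubling time, per the claim above

months = 0
while share < 1.0:    # until the share exceeds today's total electricity use
    share *= 2
    months += doubling_months

print(f"At that rate the share would exceed 100% of today's total in ~{months} months")
# ~36 months under these assumptions
```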

Linda:

There is a reason why AI companies try to force users to use AI rather than searches, and it's not for our benefit.

Right now, all the AI companies are losing money. What might that investment have gone into, if not the AI we are being forced to use online?

Some of us avoid using plastic bottles, besides trying to avoid using AI when we don't need it. Every little bit helps. Oh, I forgot, we can bury the bottles, another cost, and then they don't matter.

Juniper Nichols:

This gives me a lot to think about re: chatbots, but on the other hand, there are coal plants which were scheduled for retirement which will keep running specifically to support data centers for AI.

“Additional demand from new datacenters will double in just a year, to 47,448 GWh between 2024 and 2025, and rise more than eightfold by 2030 to 199,982 GWh, according to a forecast from S&P Global Commodity Insights. That could be a lifeline for coal power.

"There is certainly a strong chance for many of the existing coal [plants] out there to run longer than what was expected prior to the now-explosive growth forecasts in datacenter electricity demand forecasts/electrification," CreditSights analyst Nick Moglia told Commodity Insights.”

Whether or not those projections prove true, they have already built the political will to continue, if not accelerate, fossil fuel usage.

https://www.spglobal.com/commodity-insights/en/news-research/latest-news/electric-power/110524-us-power-generators-pump-the-brakes-on-coal-plant-retirements
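
Just arithmetic on the two quoted S&P figures (the forecast itself may or may not hold):

```python
# Implied average annual growth rate between the two quoted S&P forecast figures.

demand_2025_gwh = 47_448    # additional data-center demand forecast for 2025 (quoted above)
demand_2030_gwh = 199_982   # forecast for 2030 (quoted above)
years = 5

cagr = (demand_2030_gwh / demand_2025_gwh) ** (1 / years) - 1
print(f"Implied average growth: ~{cagr:.0%} per year from 2025 to 2030")  # ~33% per year
```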

Tom Mikulka:

Great that I don't have to feel bad about my personal use, but I do feel bad about those AI centers that will be powered by gas or coal. I also have big problems with the dumbing down of humanity. The recent NYTimes article on AI hallucinations had many comments about students relying on AI rather than developing writing and critical thinking skills. Lawyers are submitting AI-generated briefs riddled with mistakes. Bill Gates can brag that AI will replace doctors and teachers, but I am reminded of Oppenheimer's comment: "Now I am become death, the destroyer of worlds." AI is being used for evil purposes as I write. Pandora's box has been opened.

Anneke Hobson:

Thank you so much Hannah! This is incredibly helpful information.

Grant Mowry:

Which specific model are we talking about here? I use the advanced reasoning models a lot for writing code, which probably consumes much more power than your analysis suggests, but it is also worth considering the amount of human work and time saved. I'd be curious what the carbon footprint is of a person spending a couple of extra minutes Google searching because they didn't get as useful an answer as they were looking for.
