What's the carbon footprint of using ChatGPT?
Very small compared to most of the other stuff you do.
A couple of months ago I wrote an article looking at the energy use of artificial intelligence. Those numbers focused on the macro picture: how much energy data centres and AI use in aggregate, for training, querying, and everything else the world uses them for.
It was not a guide to what this means for individuals and their use of AI tools. But that is something people want to know (and I have been asked about it many times).
My sense is that a lot of climate-conscious people feel guilty about using ChatGPT. In fact it goes further: I think many people judge others for using it, because of the perceived environmental impact.
If I’m being honest, for a while I also felt a bit guilty about using AI. The common rule-of-thumb is that ChatGPT uses 10 times as much energy as a Google search (I think this is probably now too high, but more on that later). How, then, do I justify the far more energy-hungry option? Maybe I should limit myself to only using LLMs when I would really benefit from the more in-depth answer.
But after looking at the data on individual use of LLMs, I have stopped worrying about it and I think you should too.
For these analyses, I usually go back to the drawing board and crunch all of the numbers myself. But I recently stumbled upon several articles by another Substack writer, Andy Masley, who covered all of this in great detail. So, rather than repeating the entire exercise, I’d like to draw even more attention to these articles and ask you to go there for the in-depth story.
Now, I’m not sharing these articles blindly. I have checked the assumptions, and they rely on many of the heuristics that I would have used, too.
Here, instead, I want to give a brief summary of the results, my takeaways and what they mean for your use of ChatGPT. I’ll also look at the possibility that our current “best estimates” for its energy use are at least 10 times too high. That would make this argument — that personal use of ChatGPT is not that bad for the environment — even stronger.
Energy footprint
The key number in Andy’s analysis — and what you’d get from previous ChatGPT vs. Google search comparisons — is 3 Wh (watt-hours). That’s the amount of electricity used when you ask ChatGPT a question.
On its own, that number is meaningless. So let’s give it some perspective.
The UK generates around 4,500 kilowatt-hours (kWh) — or 4,500,000 Wh — of electricity per person per year, which covers all of our household, services, and domestic industrial activities.1 That means that one ChatGPT search is equal to 0.00007% of our annual per capita electricity footprint.
Or to put it another way: average per capita electricity usage in the UK is around 12,000 Wh per day. A ChatGPT prompt is 3 Wh. 3 Wh!
Of course, people don’t just use ChatGPT once. Let’s assume that you’re doing 10 searches per day. That would be equal to 0.2% of per capita electricity use. Ramp it up to 100 searches per day — which I expect very few people are doing — and it gets to around 2%.
Per capita electricity use in the United States is about three times as high as in the UK, so ChatGPT prompts are an even smaller piece of the pie. Ten searches per day would come to 0.09% of per capita electricity generation, while 100 searches would be 0.9%.
Unless you’re an extreme power user, asking AI questions every day is still a rounding error on your total electricity footprint.
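As a sanity check, the arithmetic behind those percentages is easy to reproduce. This is a quick sketch using the estimates quoted above (3 Wh per query, 4,500 kWh per person per year in the UK), not measured values:

```python
# Back-of-the-envelope check of the percentages quoted above.
# Both figures are the article's estimates, not measured values.
WH_PER_QUERY = 3                  # Wh per ChatGPT query (rule-of-thumb)
UK_KWH_PER_PERSON_YEAR = 4_500    # UK per capita electricity, kWh/year

uk_wh_per_day = UK_KWH_PER_PERSON_YEAR * 1_000 / 365  # ~12,300 Wh/day

for queries_per_day in (1, 10, 100):
    share = queries_per_day * WH_PER_QUERY / uk_wh_per_day * 100
    print(f"{queries_per_day:>3} queries/day = {share:.2f}% of daily electricity use")
```

Running it gives roughly 0.02% for a single daily query, 0.24% for ten, and 2.4% for a hundred — the "around 0.2%" and "around 2%" figures above, before rounding.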
The reason we often think that ChatGPT is an energy guzzler is because of the initial statement: it uses 10 times more energy than a Google search. Even if this is accurate, what’s missing is the context that a Google search uses a really tiny amount of energy. Even 10 times a really tiny number is still tiny.
Carbon footprint
What about your impact on the climate?
Of course, this question depends on how “clean” the electricity powering the data centres is.
Some of our best estimates are that one query emits around 2 to 3 grams of CO2. That includes the amortised emissions associated with training.
We’ll take the higher number. If you did 10 searches every day for an entire year, your carbon footprint would increase by 11 kilograms of CO2.2 Let’s just be clear on how small 11 kilograms of CO2 is. The UK average footprint — just from energy and industry alone — is around 7 tonnes per person.3
That means a moderate use of ChatGPT increases a Brit’s emissions by 0.16%. That’s similar to the percentages we saw for electricity consumption above.
For the average American — who has a higher carbon footprint — it would be 0.07%.
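The carbon arithmetic is just as simple. A minimal sketch using the article's figures (3 g of CO2 per query, and the roughly 7-tonne UK per capita footprint from energy and industry):

```python
# Carbon arithmetic from the article's estimates: 3 g CO2 per query
# (amortised training included), 10 queries a day for a year.
G_CO2_PER_QUERY = 3
QUERIES_PER_DAY = 10

annual_kg = G_CO2_PER_QUERY * QUERIES_PER_DAY * 365 / 1_000  # ~11 kg CO2/year
print(f"Annual footprint: {annual_kg:.1f} kg CO2")

UK_FOOTPRINT_KG = 7_000  # UK per capita footprint (energy and industry)
print(f"Share of UK per capita footprint: {annual_kg / UK_FOOTPRINT_KG:.2%}")
```

That is where the 11 kilograms and the 0.16% above come from.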
To illustrate this point, Andy Masley made the following chart (based on the original from Founders Pledge). It compares the tonnes of CO2 avoided by different behavioural changes with the savings from asking ChatGPT 50,000 fewer questions (which is about 14 years’ worth of asking it 10 times a day, every day).
It saves less than even the “small stuff” that we can do, like recycling, reusing plastic bags and replacing our lightbulbs. These are worth doing, by the way, but not at the expense of the big stuff like diet, cars, home heating, and flights, which can often save tonnes of CO2 a year. This is even more true for ChatGPT: if we’re fretting over a few queries a day while having a beef burger for dinner, heating our homes with a gas boiler, and driving a petrol car, we will get nowhere.

Maybe ChatGPT uses 10 times less energy than we think?
All of the comparisons and conclusions above rest on the assumption that one search using ChatGPT uses around 3 Wh of electricity. Again, that comes from the statement that “ChatGPT is ten times as energy-intensive as a Google Search”.
That’s what Andy assumes. It’s also the rule-of-thumb that I quoted in my previous article on the energy use of AI.
But there is good reason to believe that we’re being incredibly conservative by using that number. I expect that energy use is now much lower than 3 Wh based on efficiency improvements in the last few years.
More up-to-date analyses suggest that a ChatGPT query now uses just 0.3 Wh — ten times less. In fact, the analysts at Epoch AI think they are still being conservative with that 0.3 Wh estimate, so it could be even lower.
That would mean that our already small environmental impact numbers from above are ten times too high. 10 queries a day would not be equal to 0.2% of a Brit’s electricity consumption, but 0.02% instead.
I cannot say for certain that 0.3 Wh is the best “updated” number. But I’d bet that the real number is closer to 0.3 than to 3 Wh.
I mentioned this in my previous article, but let me say again how crazy I think it is that we’re left debating the order-of-magnitude energy use of LLMs. We’re not just talking about whether it’s 3, 3.5 or 4 Wh. We’re talking about whether our current calculations are ten times too high. Of course, tech companies do know what the right number is; it’s just that a lack of transparency means the rest of us are left bumbling around, wasting time.
If you were to use the 0.3 Wh estimate instead, here is how ChatGPT queries compare to other activities, shown for different lengths of query inputs. Most people are using a “typical” query, which is less than 100 words: a simple question. There are then longer (7,500-word) and maximum-length (75,000-word) queries, which use more energy, but I don’t know anyone giving ChatGPT an entire book to read.
A typical query uses far less energy than a standard lightbulb, or even just running your laptop for 5 minutes.
A standard text-based search with ChatGPT uses a tiny amount of energy. We are not going to make a dent in climate change by stigmatising it or making people feel guilty.

What I am not saying
Let me again be clear about what I’m saying and not saying here.
For the regular or even relatively high user of text-based LLMs: stop stressing about the energy and carbon footprint. It’s not a big deal, and restraining yourself from making 5 searches a day is not going to make a difference. In fact, it might have a net negative impact because you’re losing out on some of the benefits and efficiencies that come from these models.
This is not necessarily the case for power users who generate lots of high-quality videos and audio. Apparently, generating pictures has a similar energy cost to text-based queries, so the above still applies there. But I don’t have the numbers on video and audio, and I expect the footprint to be significantly larger.
I am not saying that AI energy demand, on aggregate, is not a problem. It is, even if it’s “just” of a similar magnitude to the other sectors that we need to electrify, such as cars, heating, or parts of industry. It’s just that individuals querying chatbots is a relatively small part of AI's total energy consumption. That’s how both of these facts can be true at the same time.
Again, Andy Masley covered this in much more detail, so if you want to dig deeper then check his articles out:
1. Note that this doesn't just include your own household electricity use. It also includes the electricity for public transport, services, and industry that make up our entire economy.
2. That's 3 grams, multiplied by 10 per day, multiplied by 365 days.
3. These are consumption-based emissions, so they adjust for the carbon footprint of goods that are imported into the UK and ultimately "consumed" there.