33 Comments
Tanner Janesky

Thanks for the analysis, Hannah. Good research. Even though energy use per query is very low and will most likely get lower, we need to consider the rebound effect. By making AI efficient, easy, and cheap, more and more systems will rely on it. Many already do. As more users and back-end systems use it, overall energy requirements will increase. Just as the first electric lightbulbs were horribly inefficient, we now use much more electricity for lighting than in the early days of electric light because LEDs have become efficient, cheap, and ubiquitous. Human civilization tends to increase in complexity and energy intensity over time, even if any individual technology gets more efficient.

Jasmine Wolfe

This isn't taking into account the electricity and water used to COOL THE DATA CENTERS. This is industry propaganda.

Buzen

The statement about lighting isn't correct. In 2001 (according to the EIA) the US used 766 TWh of electricity for commercial and residential lighting, and by 2015 that had fallen to 374 TWh, which isn't surprising since LEDs are 90% more efficient than incandescents (over 100 lumens per watt for LED vs 10 lumens per watt for incandescent). If you think lighting applications are using more energy now with LEDs, then that would mean there are 10x the number of lights being kept on now, or lights being kept on 10 times longer, neither of which I've witnessed anywhere, even in Las Vegas.

Efficiency will increase usage of a resource to an extent, as Jevons postulated, but it doesn't necessarily increase usage so much that efficiency should be discouraged.
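A quick sanity check of the figures quoted in this comment (the TWh and lumens-per-watt numbers are the commenter's, not independently verified; the snippet just reproduces the arithmetic):

```python
# Check the decline in US lighting electricity and the LED efficacy
# advantage implied by the figures in the comment above.
us_lighting_2001_twh = 766   # EIA figure, as quoted in the comment
us_lighting_2015_twh = 374   # EIA figure, as quoted in the comment

decline = 1 - us_lighting_2015_twh / us_lighting_2001_twh
print(f"US lighting electricity fell by {decline:.0%}")  # about 51%

led_lm_per_w = 100           # typical LED efficacy, per the comment
incandescent_lm_per_w = 10   # typical incandescent efficacy, per the comment
print(f"LED efficacy advantage: {led_lm_per_w / incandescent_lm_per_w:.0f}x")
```

The roughly 51% decline is consistent with a large efficiency gain outweighing any growth in the number of lights over that period.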

Tanner Janesky

The point was about the first lightbulbs, a few years after they became available, which is comparable to where we are now with AI. Of course, anyone can cherry-pick a short period of change that deviates from the long-term trend. A 10x efficiency gain of LEDs over incandescents is sure to have a short-term effect. Compare world electricity use for lighting in 1890 to today.

The point is, when new technology becomes cheaper and more efficient, more and more people use it. It's not just electric lighting; it's air conditioning, smartphones, photovoltaic panels, laptops, dishwashers, etc. Classic Jevons paradox.

Rich Miller

Thanks for the detailed analysis. It's interesting to see the amount of debate every time an AI company releases an energy data point.

What's clear, and important, is that Google is very rapidly reducing the energy and carbon impact of its AI infrastructure. This should push its rivals to do the same.

Jasmine Wolfe

How much electricity and water are being used to COOL the DATA CENTERS? That's what's important.

msxc

Cooling water usage is a rather confusing argument. The water is not "destroyed" or contaminated, whether it's a nuclear power plant or a data center. Cool water comes in, warmer water goes out (and the warm water could be a very useful resource too). That may cause some issues for the environment, if the heat is damaging, say, a river habitat during dry seasons, but it is nowhere near as scary as contaminating water with fertilizers, salts, or other chemicals.

Jasmine Wolfe

Water is becoming a very scarce resource. It's so bad in the UK that they want people to delete old emails and photos. AI is all fucking bad and it's utter trash 🚮 Google will lie through its teeth to try to keep it afloat.

https://www.404media.co/uk-asks-people-to-delete-emails-in-order-to-save-water-during-drought/

msxc

Yes, potable fresh water is quite a scarce resource. But cooling can be done in closed loops, and warm water has real uses (heating spaces, residential hot water). Also, water for cooling doesn't have to be "fresh" (salt water is fine with heat exchangers). There are different scales of "problems", and solutions for some of them.

Jonathan Irons

Thank you so much for following up with this. Your reports are valuable when discussing this, and there's a lot of heat, especially in climate-conscious circles, around the subject.

That said, the pinch of salt I would take when processing Google's numbers is huge. They have been notoriously and consistently nebulous about their emissions, and wilfully conflate Scope 1, 2, 3 etc.

The real question is not what the carbon footprint of "me" using ChatGPT is, it's what is the carbon footprint *increase* of this huge surge towards AI. As ever, Jevons keeps looking over our shoulder here.

Paul Magnall

"The concept of the 'carbon footprint' was developed by the advertising firm Ogilvy & Mather for British Petroleum (BP) in the early 2000s, aiming to shift responsibility for climate change onto individuals rather than the oil industry"

Is calculating the AI footprint for individual texts not following in the same footprints? 8-)

John McGrath

This is useful information. But if the energy use is so low, why is Google (along with all the rest) pushing so hard to build so many data centers? A plausible answer is that, low as the per-query cost is, the number of queries is exploding, in which case AI-driven load is still a problem.

I'm skeptical of these self-serving analyses until they include the meaningful part too: overall AI-driven load growth. That's what needs to decline. Cloud computing electricity consumption did flatten out about a decade ago, countering expectations, so we have precedent that it's possible.

Buzen

Why do you think AI-driven load growth needs to decline? If it's valuable, as many investors think it is, then it will continue to grow, and more electricity will be needed, which can be cleanly made with nuclear or solar or wind plus batteries.

John McGrath

Because right now we're keeping coal stations in use and building new gas generation while the current administration tries to kill renewables, and nuclear takes decades to build.

Iโ€™m not anti-AI, but it can be made much more efficient.

msxc

One benefit of AI and the need for data centers: it stopped silly arguments like "we don't need more power," "solar and wind are ready to power the world," or "grids are not necessary." Now the question to ask, and to demand action on, is how to build nuclear in 4 years, as was still possible in Japan about 20 years ago (modern, efficient, safe Gen III+ power plants that should last 60+ years). There are clear ways to go from "decades and ballooned costs" to what it should be.

John McGrath

I agree. And we should be slamming in renewables and batteries at the same time. Instead of killing enormous near-complete projects.

Laura

Interesting analysis, Hannah. My issue would be that, regardless of the energy per enquiry and your comment that "for most people, even moderate use of ChatGPT is a very small part of their footprint", these companies are 'encouraging' use of LLMs EVERYWHERE in their relationship with their customers. Just look at the new phone launches which laud their AI credentials. WhatsApp has a (wholly irrelevant, in my opinion) search option front and centre of their app now. The amount people are using LLMs without even knowing that's what they're doing, with every search engine query producing an LLM summary at the top, really needs to be quantified. I expect I trigger an LLM enquiry pretty much every time I pick up my phone, and every time it's not because I've asked for the help or even been particularly aware of it. So their scaling down of energy use may be great, but their scaling up of individual uses surely outweighs the savings they're claiming many times over. In my opinion!

Ben Lane

My main issue with this discussion is the VOLUME... it's not enough to estimate the unit metric of an activity; you then need to multiply by the total number of events. Comparisons with TVs or microwaves are weak: with the number of inference prompts now in the billions per day, a small unit metric may end up having global impacts. Doh!

Also, most of these numbers seem on the low side to me. Fortunately, an excellent paper was recently published by the University of Rhode Island which calculated that GPT-4.1 nano is the most efficient at 0.45 Wh for long prompts, with other models consuming significantly more; o3, for example, can consume as much as 39 Wh per query!

If this was powered using UK electricity (av 0.177 gCO2/Wh, DESNZ 2025 data) then the CO2e emissions would be in range 0.08g - 6.9g per prompt.

If this was powered using US electricity (av 0.350 gCO2/Wh, EPA 2024 data) then the CO2e emissions would be in range 0.16g - 13.7g per prompt.

For more info, see my original post: https://www.linkedin.com/posts/ecolane_how-hungry-is-ai-benchmarking-energy-water-activity-7342802353852456961-CLkk?utm_source=share&utm_medium=member_desktop&rcm=ACoAAAKh49MB6vw8E79Mxk6I1qsWdkTow7rgW_0
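The per-prompt emissions ranges in this comment follow from multiplying energy per query by grid carbon intensity. A minimal sketch, using the per-query energy figures and grid intensities quoted above (all values are the commenter's, not independently verified):

```python
# Reproduce the per-prompt CO2e ranges quoted in the comment above.
energy_wh = {
    "GPT-4.1 nano, long prompt": 0.45,  # most efficient case cited
    "o3, worst case": 39.0,             # highest case cited
}
grid_g_per_wh = {
    "UK (DESNZ 2025, as quoted)": 0.177,
    "US (EPA 2024, as quoted)": 0.350,
}

for grid, intensity in grid_g_per_wh.items():
    low = energy_wh["GPT-4.1 nano, long prompt"] * intensity
    high = energy_wh["o3, worst case"] * intensity
    print(f"{grid}: {low:.2f} g to {high:.1f} g CO2e per prompt")
```

Running this reproduces the roughly 0.08 g to 6.9 g (UK) and 0.16 g to 13.7 g (US) ranges stated in the comment.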

John McGrath

For a point of comparison, the carbon footprint of a hamburger is about 4 kg CO2e. At an average US emissions factor, that's the CO2e from a little over 11 kWh of electricity generation. Comparing 11 kWh to 0.3 Wh, one hamburger's footprint is equivalent to roughly 36,700 LLM queries, on average.

Forgo the occasional hamburger, and use AI to your heart's content.
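The hamburger arithmetic can be sketched as follows (the 4 kg CO2e footprint and 0.3 Wh per query are the comment's figures; the roughly 0.35 kg CO2e/kWh US grid factor is an assumption consistent with the "a little over 11 kWh" in the comment):

```python
# Express one hamburger's CO2e as equivalent US electricity generation,
# then compare with the energy of one LLM query.
burger_kg_co2e = 4.0        # per the comment
us_grid_kg_per_kwh = 0.35   # assumed average US emissions factor
query_kwh = 0.3 / 1000      # 0.3 Wh per query, per the article

burger_equiv_kwh = burger_kg_co2e / us_grid_kg_per_kwh  # ~11.4 kWh
ratio = burger_equiv_kwh / query_kwh
print(f"One burger ~= {ratio:,.0f} LLM queries' worth of electricity")
```

The result lands slightly above the comment's 36,666x because 4 / 0.35 is a bit more than the rounded 11 kWh used there; the order of magnitude is the point.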

Laura

Except that we actually have hamburgers 'occasionally', but do you know how many times you're using an LLM without even knowing it, while being encouraged to do so even more in every interaction you have with the digital world?

Buzen

This has implications for the future of AI and robots taking jobs from us: most workers will consume at least one hamburger a day on average, while AI computers and robots working in dark factories use hundreds of times less energy. Added benefit: the robots don't exhale CO₂ all day long.

Chris Schuck

Thanks for all this research. Is it possible to clarify the degree to which these figures already incorporate the total energy and water costs of building, maintaining and growing the data center infrastructure as a whole? To my mind this was always the biggest concern - the total network effects - not the per capita marginal cost of an individual user.

Also: if a user dumps an entire full-length document/article into the prompt and asks AI to summarize or process, that's much longer than the "average" prompt referenced here, correct? Because that's how many people I know use it.

Buzen

Well, it depends on where the data centers are located. If we build most of them in Iceland, which has fiber optic connections to both North America and Europe, all-renewable energy (geothermal and hydro), lots of water, and cold temperatures, then they would emit little CO2.

Or you could put them in Northern Canada, which has all of the same (except geothermal) along with efficient CANDU nuclear power.

Moments of Wonder

Amazingly detailed research, thank you. In other words, AI burns off brain cells as well as electricity!

Jonathan McLachlan

Loved your book Not the End of the World, and this morning I read in The Guardian that you have another book, Clearing the Air. Congratulations! I have worked with Climate Fresk, Carbon Literacy, Protect Our Winters and other climate change organisations, as well as delivering a number of my own talks about climate change. I completely agree with your comment that we must abandon the slogan "keep 1.5 alive"; you can almost hear the groans whenever it is quoted. I believe the undermining of any message about climate change, and the abandonment of climate initiatives, is being led by big corporations and manufacturers as they dilute and drop their own commitments. The consequence is that society is beginning to push back, following the lead of these businesses, who can exert enormous influence in the markets. I have felt the pushback and cynicism of the audiences I speak to first hand; it is becoming more noticeable.

Sustainability on the Inside

Thanks Hannah for this thoughtful analysis. As someone who works in tech, this is a conversation I have every other day. For me, the overarching message is that AI is here to stay and what we all need to be focused on is making it as energy efficient as possible. In order to do this, we need as much data as we can get our hands on and should be encouraging transparency.

Niclas Bertelsen

What in the world of degrowth is this corner of Substack lol

Rien

2,700 kWh of electricity per year for the average UK household. 28.4 million households in the UK, 69.23 million people. (2700 × 28.4 / 69.23) / 365 days = 3.03 kWh per person per day, not 12 kWh…
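The per-person arithmetic in this comment checks out as a daily figure (household and population numbers are the commenter's; note this covers household electricity only, not industrial or commercial use):

```python
# Per-person daily household electricity implied by the comment's UK figures.
kwh_per_household_year = 2700   # average UK household, per the comment
households_millions = 28.4
population_millions = 69.23

kwh_per_person_year = kwh_per_household_year * households_millions / population_millions
kwh_per_person_day = kwh_per_person_year / 365
print(f"{kwh_per_person_day:.2f} kWh per person per day")  # about 3.03
```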

Petra Sweden

I think the world still needs this piece, because no one else is telling them exactly this. Most do not even know the basics. Can you please use my sentence, Hannah? So many need to know 😩

Buzen

So, from the numbers in the article, if you make an AI query from a browser on your laptop, most of the energy used is just in having the laptop on while you type the question and read the answer, not in the data center processing itself. Not worth even worrying about.
