Huge fan of your work so the shoutout means a lot, thanks so much for sharing the posts and adding context!
Thanks for this article, Hannah. After reading, I have a few thoughts; I'd be interested to hear your take on them.
1) Maybe one individual LLM use doesn't have a crazy emission/water impact, but that doesn't mean we should downplay our collective impact. For example, I could argue that my individual fast fashion consumption has such a low impact: only a couple of thousand liters of water, plus a few kilos of textile waste per year. Who cares? Maybe true: my individual actions, looked at in a bubble, don't have the biggest impact. But the thing is, I'm not the only one doing it. We do it collectively, and when everyone is buying fast fashion / using LLMs, our impacts add up. And it's not just the environmental impact: using and engaging with AI is a signal to companies that there is demand, so they will do even more of it. (I want to mention that I agree there are good use cases for AI, but when it is shoved into every single product and lets us do useless things (generate funny images for no reason) or even harmful ones (spew out misinformation shared on social media), it is a misuse of our resources. And we definitely don't need more AI that does these useless / harmful things.)
2) There is also an indirect environmental cost of using AI, in the form of mining, toxic chemical use, and the environmental destruction involved in digging out all the raw materials needed to produce the hardware AI runs on. Would love to see that accounted for.
So I'm not anti-AI, but I'm sceptical & cautious, and I believe both individuals and companies should be more intentional with their AI use.
Hi there. I really agree that some uses of AI seem completely useless and detrimental. I want to point out that these are valid arguments: misinformation is bad, and AI-driven misinformation is bad. But we don't need to make it about climate change. The goal of human societies is not only to stop climate change; human well-being is an important goal too, for instance. There's no need to frame everything that's bad for society as something bad for the climate or the environment.
It can also be about wasting time discussing things you already know or telling jokes to it.
Very good point and something that concerns me, too!
I talk about both of these points in a lot of detail in the cheat sheet post fwiw
... but does 'text' AI add to 'old style' Google searches, or is it 100% 'new and extra'?
Not sure of those splits, but I 'feel' (as I haven't yet tried to measure it) that my Google searching has reduced, and with tools like Elicit and Research Rabbit an AI question gets me to a better answer faster, so I'll...
... wander off to think about how to measure that better.
I believe it also comes down to the question of how much more positive impact can be achieved with AI compared to working without it. For example, if AI lets me accomplish a project that serves the world twice as fast (or faster), the energy/water consumption of the LLM would be worth it in my opinion.
So I don't think you can really compare it to consumer products that get thrown away within a couple of months or years. In the end, LLMs just make things more efficient in the workplace, and I believe the "easy" Google questions aren't asked as often as work-related questions; if they are, I would guess the easy questions don't need as many resources.
I read the original piece by Andy Masley and agreed with the analysis but then over time I've become unsure again, mainly triggered by watching the Studio Ghibli fad unfold.
Image generation (and video generation) must have a higher impact than text-only, and we should probably consider the utility of it all as well. I mean, what was the point of the Ghibli thing, really? If you eat a burger it has high utility - you stay alive for another day or two. If you generate a bunch of images (probably three for every one you actually want, given how the output isn't usually exactly what you hoped for), it has low utility. Millions of people generating pointless bullshit collectively could have a negative impact and shouldn't really be compared with the amount of water used by the food supply chain or the amount of electricity used by aircon.
What’s the point of fan art or video games? Both also aren’t necessary but make people happy. The Ghibli trend was fun for people and used tiny amounts of energy for each image. When other goofy trends like this happen online people never bring up the carbon cost, even though they always also have one. I suspect people are upset at AI for other reasons and are using the carbon cost to criticize it, but imo the point would make more sense if they just criticized AI directly instead of this roundabout “oh it uses 3Wh” thing
Perhaps we should think of the carbon cost of the other stuff too, though. One thing that struck me from your article was the high cost of YouTube, with an implication of broadly "you don't care much about that, do you?" I definitely had never thought about it, and I'm wondering whether we should care. Just because something makes people happy is not a slam dunk moral argument - people used to nail cats to trees for fun but we kind of outgrew that. The environmental cost of everything is probably a new level of sophistication in understanding the world that the new generations coming through will just naturally have a better handle on
How much carbon was emitted transmitting your banal points to my device? Excuse me while I go nail a cat to a tree, in atonement (purring emits gigatons of carbon globally, and for what? Making pet owners happy).
. . . and we're back to throwing rocks at the moon.
As a species, we seem hell-bent on proving that evolution works backwards, too.
And what about all the energy wasted making music? And how much oil goes into a modern painting?
We should all just sit quietly in cold dark rooms silently contemplating our coming climate apocalypse. And no crying or heavy sighing, as we know those also emit carbon.
The difference is that those are beautiful works of human art vs. a machine that simply copies work already done by past artists. One has emotion & meaning and one is a replica done by a computer. This is such a bad straw man argument.
This is only taking into account the processing on the servers, not the infrastructure needed for the delivery.
And a basic sanity check tells me this is wrong. The added electricity consumption is too high to be waved off as nothing more than parts of a light bulb. It doesn't compute.
https://www.scientificamerican.com/article/the-ai-boom-could-use-a-shocking-amount-of-electricity/
Yes, we are having huge controversies over proposed data centers, each of which proposes to use the energy of thousands of households, throwing the sustainability plans of the state out the window. If it isn't AI, what is consuming all that energy?
https://cardinalnews.org/2025/04/11/energy-demand-will-outstrip-supply-in-virginia-as-data-centers-proliferate/
I go into a lot of detail about this in the cheat sheet post. AI more broadly is an environmental problem. Chatbots are an extremely tiny fraction of AI energy use.
It also ignores the energy wasted during idle time.
The total usage of the servers needs to be accounted for, not just each query.
Do you think that'd add that much? I also ignore the idle time of servers running YouTube videos. This applies to everything we do online, so if I include those, the energy costs of everything else will go up too, and ChatGPT won't really stand out.
I do think it adds a lot. These servers are dedicated AI servers.
The amount of power being installed for them and these calculations don't add up.
They don't add up because almost all energy used for AI isn't used for chatbots
When you pop down to the pub for a pint with friends, the energy used to make that beer was more than a ChatGPT query, and you may not be accounting for the energy you and your friends used to get to the pub, or the lighting in the pub or the lorry to bring the beer to the pub.
Also, if you play trivia at the pub that’s a few hundred grams of CO2 per play.
How can you just ignore the most energy-intensive parts of the process: model training, data centers, etc.?
"Including the cost of training raises the energy cost per prompt by 10%"
You should read the linked cheatsheet to answer your questions.
Looking at the numbers that statement seems completely wrong.
It's the query that's 10% of the training. And the query is 10 times higher than a search. So...
Where are you getting the number implying the query is 10% of the training? I’ve never seen anything that implied that. It’d mean ChatGPT’s been used 100x less than I assumed
"An informal online estimate for ChatGPT indicates that it produces 0.382 g CO2e per query".
"Assuming that ChatGPT undergoes a full re-training of the model once per month and continues with an estimated 10,000,000 queries per day, the 552 metric tons divided by 300,000,000 queries equates to 1.84 g CO2e per query for the amortized training cost."
Can you share the link to what you're citing?
It was linked in the article as the "best estimates".
https://www.nature.com/articles/s41598-024-54271-x
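For anyone checking the arithmetic, here is a minimal back-of-the-envelope sketch that simply reproduces the amortisation in the figures quoted above, using only the numbers given there and reading the 552 metric tons as the CO2e of one full re-training run spread over a month of queries, as the quote implies:

```python
# Back-of-the-envelope reproduction of the amortised-training figures quoted above.
# All numbers come from the quote; the 552 metric tons is read as the CO2e of one
# full re-training run, spread over one month of queries, as the quote implies.

QUERIES_PER_DAY = 10_000_000
DAYS_PER_MONTH = 30
TRAINING_CO2E_G = 552 * 1_000_000      # 552 metric tons -> grams
INFERENCE_CO2E_G_PER_QUERY = 0.382     # informal per-query inference estimate

queries_per_month = QUERIES_PER_DAY * DAYS_PER_MONTH            # 300,000,000
training_g_per_query = TRAINING_CO2E_G / queries_per_month      # ~1.84 g CO2e

print(f"Queries per month:            {queries_per_month:,}")
print(f"Amortised training per query: {training_g_per_query:.2f} g CO2e")
print(f"Inference per query:          {INFERENCE_CO2E_G_PER_QUERY:.3f} g CO2e")
print(f"Combined per query:           {training_g_per_query + INFERENCE_CO2E_G_PER_QUERY:.2f} g CO2e")
```

Under those particular assumptions the amortised training share comes out several times larger than the inference estimate, which is presumably why the numbers in this sub-thread look so different from the "training adds 10% per prompt" figure quoted above: they rest on different assumptions about usage volume and re-training frequency.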
Huge fan of your book, Hannah, but concerned about this post. As I told Andy on LinkedIn: 1/. We have no idea what ChatGPT's real impact is because OpenAI won't tell us (why, if it's so small?). 2/. There are many less energy-consuming alternatives; it's just that people don't know about them. 3/. Many macro-trends suggest that (generative/agentic) AI is a net negative for the climate, notably the fact that US coal power stations previously set to be retired are now being kept running, partly to meet AI's soaring electricity demand. So in this sense, AI is actually slowing down the energy transition. Everything else we know for sure is here: https://bettertech.blog/2025/04/19/ais-impacts-how-to-limit-them-and-why/ - TL;DR: let's look at the bigger picture before saying "this is fine". It's not fine at all.
Well said. What are you referring to with less energy-consuming alternatives?
Such a great point about the net negative climate impact. We can do math all day on the relatively small personal usage of ChatGPT but the fact remains that massive AI data centers are being built all over the world. That's a real impact.
I get that Hannah is comparing ChatGPT usage to our other everyday behaviors, but everything is additive and there is infrastructure behind it.
There are smaller LLMs that consume 30-60 times less energy than the bigger ones (like ChatGPT) with comparable performance. More on that here - https://www.linkedin.com/posts/jamesmartin75_so-youd-like-to-use-a-less-impactful-ai-activity-7327759658528497664-yC-T?utm_source=share&utm_medium=member_ios&rcm=ACoAAAB-X90BvzgQBovzuvLli_vQurS4maIr-f8 - but essentially, whatever ChatGPT's impact is, it's excessive and unnecessary. Though indeed the impact is more at the infrastructure level than the individual level (like streaming and a lot of other things).
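Purely as a rough illustration of what a 30-60x difference would mean at the per-query level, assuming the ~3 Wh per ChatGPT query figure discussed elsewhere in this thread (both numbers are estimates, not measurements):

```python
# Rough illustration only: per-query energy of a smaller LLM if it really uses
# 30-60x less energy than ChatGPT, taking ~3 Wh per ChatGPT query as the baseline.

CHATGPT_WH_PER_QUERY = 3.0        # rough figure cited in this thread
REDUCTION_FACTORS = (30, 60)      # claimed range for smaller models

for factor in REDUCTION_FACTORS:
    small_model_wh = CHATGPT_WH_PER_QUERY / factor
    print(f"{factor}x less energy -> ~{small_model_wh:.2f} Wh per query")
```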
I appreciate this analysis and maybe I don't need to shame myself and others for using AI tools due to climate impacts. However as others have pointed out, collectively this adds up. We're currently trying to clean up the atmosphere but at the same time are adding new unnecessary energy consumption that's slowing us down.
The bigger issue with LLMs that many people using them happily ignore is the material they're trained on, i.e. the IP theft and the inherent biases and misinformation. That is a far better reason not to use them.
... it is only one straw, 7 billion users said, etc., etc.
But I am still looking for the whole life cycle (design, infrastructure and construction) of new data centres, rather than just the previously planned upgrades of older-style Google search centres.
Yea I'm surprised this article focuses so much on percentages and relative personal use. It is often touted that the most impactful way to combat climate change at an individual level is through improved efficiency, e.g. better light bulbs, going all electric in your home, etc. Every percentage counts when we take a macro view.
I'm not saying this analysis is wrong or that we shouldn't focus on bigger efficiencies, but we are effectively ADDING more usage here so let's acknowledge that.
Also, a separate argument is on the validity of even using AI for the average person. It's one thing to need food and transportation, it's another to be using an energy-wasteful tool that is in most cases inferior to actual googling and research and critical thinking.
Quite... stats are 'nice' but so 'fakable'...
I am liking the recent (to me) one on US average income:
$74,500... really?
$65,000 excluding the top 10 Americans... really?
$48,000 excluding the top 50 Americans... really?
$35,000 excluding the top 1,000 Americans...
Yet Walmart and McD's make multi-billion-dollar profits, and well over 50% of their employees are on benefits, not from the State or states, but from 'my tax dollars'. : )))))))
Critical thinking? Perish the thought!
I wonder how much energy a TikTok post (shaming people for their ungreen ChatGPT queries) uses, including the energy to show it to millions of viewers.
Maybe if the electricity grid managers in Spain had used ChatGPT to learn about how a lack of inertial stability from over-reliance on solar power can cause grid failures, they could have prevented the blackout and saved lots of energy and trouble. But they saved 3 Wh by not making that query.
I can easily save 5 google searches if I use ChatGPT instead.
You're assuming what ChatGPT tells you is correct. It is NOT another Google, despite it looking that way. Treat it like an assistant/tool, not a guru. I've gotten many incorrect results from it on basic questions, even on how to remove a type of stain from a shirt.
Due diligence through proper research and knowing your sources is very important. Humans are losing critical-thinking skills with this; it's super depressing.
Every single time?
As with most things, it also comes down to trade off and intention. If I’m using an LLM to help build my impact business I’d imagine the net benefit far outweighs the cost.
This is an incredible piece. As someone who has the ChatGPT guilt often this helps a lot. Thank you for the work you do!
This article looks just at the estimated energy demand of each search, or individual searches in the aggregate for each person, and argues that they’re negligible compared to other energy demand. But then he says “I am not saying that AI energy demand, on aggregate, is not a problem. It is, even if it’s “just” of a similar magnitude to the other sectors that we need to electrify, such as cars, heating, or parts of industry.”
And therein lies the rub, a lot of small things combined of course do lead to big impacts and this can’t be ignored. It’s like saying “well, a little bit of oil leaking from my oil rig off the coast of Santa Barbara is a tiny bit of the total oil used in the world.” That’s true but it ignores the aggregate of the impacts of oil leaks from all of those rigs.
And we know already that aggregate AI demand is massive, at a few % of total electricity use.
But the true rub is in the exponential growth of that demand. When it’s growing 100% roughly every six months it soon exceeds everything else. Literally. That’s the point of my analysis here, "How AI and Bitcoin will eat the world": https://tamhunt.medium.com/how-ai-and-crypto-will-eat-the-world-189d242210e1
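To make the "soon exceeds everything else" point concrete, here is a tiny sketch that simply takes the comment's assumptions at face value (a starting share of a few per cent of total electricity, assumed here to be 3%, and 100% growth every six months); it is illustrative compounding, not a forecast:

```python
# Illustrative compounding only: start at an assumed ~3% of total electricity use
# and double every six months, per the growth rate described in the comment.

share = 0.03      # assumed starting share of today's total electricity use
months = 0

while share < 1.0:
    share *= 2    # 100% growth every six months
    months += 6

print(f"Starting at 3% and doubling every 6 months exceeds today's total "
      f"electricity use after about {months} months ({months / 12:.0f} years).")
```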
There is a reason why AI companies try to force users to use AI rather than searches, and it's not for our benefit.
Right now, all the AI companies are losing money. What might that investment be in, if it were not in AI that we are forced to use online?
Some of us avoid using plastic bottles, besides trying to avoid using AI when we don't need it. Any little bit helps. Oh, I forgot, we can bury the bottles, another cost, and then they don't matter.
This gives me a lot to think about re: chatbots, but on the other hand, there are coal plants which were scheduled for retirement which will keep running specifically to support data centers for AI.
“Additional demand from new datacenters will double in just a year, to 47,448 GWh between 2024 and 2025, and rise more than eightfold by 2030 to 199,982 GWh, according to a forecast from S&P Global Commodity Insights. That could be a lifeline for coal power.
"There is certainly a strong chance for many of the existing coal [plants] out there to run longer than what was expected prior to the now-explosive growth forecasts in datacenter electricity demand forecasts/electrification," CreditSights analyst Nick Moglia told Commodity Insights.”
Whether or not those projections prove true, they have already built the political will to continue, if not accelerate, fossil fuel usage.
https://www.spglobal.com/commodity-insights/en/news-research/latest-news/electric-power/110524-us-power-generators-pump-the-brakes-on-coal-plant-retirements
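A quick consistency check on the quoted S&P Global figures, under the reading that "double in just a year, to 47,448 GWh" implies a 2024 baseline of roughly half that (only the numbers in the quote are used; the baseline is an inference):

```python
# Sanity check of the quoted S&P Global Commodity Insights figures, reading
# "double in just a year, to 47,448 GWh" as implying a ~23,724 GWh 2024 baseline.

additional_demand_2025_gwh = 47_448
additional_demand_2030_gwh = 199_982
implied_2024_gwh = additional_demand_2025_gwh / 2

growth_2024_to_2030 = additional_demand_2030_gwh / implied_2024_gwh
print(f"Implied 2024 additional demand: ~{implied_2024_gwh:,.0f} GWh")
print(f"Growth from 2024 to 2030:       ~{growth_2024_to_2030:.1f}x ('more than eightfold')")
```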
Thank you so much Hannah! This is incredibly helpful information.
Which specific model are we talking about here? I use the advanced reasoning models a lot for writing code, which probably consumes much more power than your analysis suggests, but it is also worth considering the amount of human work/time saved. I’d be curious what the carbon footprint is of a person spending a couple extra minutes google searching because they didn’t get as useful an answer as they were looking for.
I've read it, but I'm really unsure about the number of prompts used. I checked a few interactions I've had at work, looking for a solution to something or getting help rewriting a note into something else.
The shortest interaction I've had included 15 prompts to get the result I wanted. And that's less than average for me. I'm often somewhere between 25-50 prompts, and often up to a hundred if it's something intricate I want help with.
And that's not taking image generation into account; those most often take 50+ prompts to get exactly right.
Maybe we're just unlucky with the version I use, but I have friends who use it for work and D&D, and they often have several interactions going at the same time, often with 100+ prompts each.