I use AI every day, and yet, there have been at least three times when I couldn’t tell that a video I watched was entirely AI-generated.
While the debate on how AI will transform science is ongoing, its energy consumption is already a pressing issue.
So, let’s explore how AI impacts the environment and just how significant those effects are.
Today's Lesson: The True Impact of AI
Exploring how big its energy consumption really is
Number Of The Day
60,000
NewsCatcher and the AI detection tool from Pangram Labs found that more than 60,000 AI-generated articles are published every day. Note that this excludes other content, such as AI-generated videos published on YouTube. They used their own API and AI detection technology to scan daily published global news articles from over 75,000 sources. While less than 4% of politics or sports articles were AI-generated, more than 10% of science and finance articles were.
AI now assists with everything from explaining scientific papers to refining manuscripts—even writing entire books.
According to the International Energy Agency, electricity consumption of global data centers has surged by 20–40% annually in recent years, now accounting for 1–1.3% of global electricity demand.
Projections estimate that by 2030, data centers could consume more than 9% of U.S. electricity generation annually—up from an estimated 4% today.
This graph stems from Our World in Data, with the underlying data from the Energy Institute's Statistical Review of World Energy (2024) and Smil (2017). Sources such as Kiessling et al. or the International Energy Agency estimate only about 25,000 TWh, but with similar distributions.
To fully understand AI’s impact, we need to break it down into different lifecycle stages: training, inference, and data storage.
Training an AI Model
AI training is the process of turning a model, initially just code with random parameters, into a quasi-intelligent system.
It involves feeding massive datasets (both proprietary and scraped from websites) into AI models so they can learn to generate accurate responses.
Training a 1-billion-parameter model requires ≈15 MWh of electricity, emitting roughly 5,700 kg of CO₂e.
A 6-billion-parameter model consumes ≈103 MWh, roughly the annual electricity use of ten U.S. households, and releases about 39,000 kg of CO₂e.
GPT-3, with 175 billion parameters, used approximately 1,287 MWh.
OpenAI hasn’t disclosed how many parameters GPT-4 has.
These data reportedly stem from Ark Invest; still, take them with a grain of salt. You can find more discussion of AI training costs in our free Slack channel. The article that included this graph also stated that training GPT-4 probably cost around $100 million.
However, DeepSeek R1—a model that performs at least as well as ChatGPT—has 671 billion parameters (more in our free Slack).
Grok 3 allegedly has 2.7 trillion.
Of note, the precise energy consumption depends on training methods, model type, settings, and infrastructure; these factors can shift the figures by double-digit percentages.
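To see how the figures above relate, here is a minimal back-of-envelope sketch in Python. It assumes the roughly linear scaling implied by the numbers above (about 15 MWh per billion parameters) and an average grid intensity of about 0.4 kg CO₂e per kWh; both values are simplifying assumptions for illustration, not measurements.

```python
# Back-of-envelope training estimate. Assumes linear scaling
# (~15 MWh per billion parameters) and an average grid carbon
# intensity of ~0.4 kg CO2e/kWh. Both numbers are assumptions.

MWH_PER_BILLION_PARAMS = 15    # from the 1B-parameter figure above
KG_CO2E_PER_KWH = 0.4          # assumed grid carbon intensity

def training_footprint(params_billions: float) -> tuple[float, float]:
    """Return (energy in MWh, emissions in kg CO2e) for one training run."""
    energy_mwh = params_billions * MWH_PER_BILLION_PARAMS
    emissions_kg = energy_mwh * 1000 * KG_CO2E_PER_KWH  # MWh -> kWh
    return energy_mwh, emissions_kg

for size in (1, 6, 175):  # 175B corresponds to GPT-3
    mwh, kg = training_footprint(size)
    print(f"{size:>4} B params: ~{mwh:,.0f} MWh, ~{kg:,.0f} kg CO2e")
```

Note that the rule of thumb overshoots for GPT-3: 175 billion parameters would predict about 2,625 MWh, well above the reported 1,287 MWh, which illustrates just how much training methods and infrastructure shift these figures.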
These numbers seem staggering—but only comprise a fraction of the impact.
Inference: The Cost of Every AI Response
Once an AI model is trained, it must generate responses—a process known as inference.
While training is energy-intensive, inference is an even larger concern because it happens continuously as users interact with AI models.
This graph stems from a white paper, which in turn took it from a report-like paper, so take it with a grain of salt. On average, though, the figures hold up: a Google search takes about 0.3 Wh, while an average ChatGPT query takes about 2.9 Wh.
Tech giants like NVIDIA and Amazon Web Services (AWS) estimate that inference accounts for 80–90% of AI’s total energy consumption.
Internal studies by Meta and Google suggest it makes up one- to two-thirds of their total machine learning (ML) energy use.
So what is the cost of using ChatGPT?
A single query to a large language model (LLM) consumes 0.5–50 Wh, releasing 1–5 grams of CO₂.
This is about 10 times the energy required for a Google search.
The energy cost of image generation is even higher, likely around 20 times that of text generation.
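To make these per-query numbers concrete, here is a small sketch of a hypothetical day of usage. The per-action figures come from the estimates above; the usage counts (50 searches, 20 prompts, 2 images per day) are invented purely for illustration.

```python
# Hypothetical daily usage; per-action energy from the estimates above.
GOOGLE_WH = 0.3              # Wh per Google search
CHATGPT_WH = 2.9             # Wh per average ChatGPT query
IMAGE_WH = CHATGPT_WH * 20   # image generation, ~20x text (rough estimate)

searches, prompts, images = 50, 20, 2  # assumed daily counts

daily_wh = searches * GOOGLE_WH + prompts * CHATGPT_WH + images * IMAGE_WH
print(f"Daily:  {daily_wh:.0f} Wh")                 # ~189 Wh
print(f"Yearly: {daily_wh * 365 / 1000:.1f} kWh")   # ~69 kWh
```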
Research firm SemiAnalysis estimated that ChatGPT required 28,936 GPUs, implying an energy demand of 564 MWh per day.
In other words, ChatGPT’s inference surpasses its total training footprint in under three days.
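The break-even claim follows directly from these two figures; a two-line calculation makes the arithmetic explicit (both inputs are the third-party estimates cited above, not measured values):

```python
TRAINING_MWH = 1287           # GPT-3 training estimate (Patterson et al.)
INFERENCE_MWH_PER_DAY = 564   # SemiAnalysis fleet estimate for ChatGPT

days = TRAINING_MWH / INFERENCE_MWH_PER_DAY
print(f"Inference matches the entire training footprint after ~{days:.1f} days")  # ~2.3
```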
And it’s not just ChatGPT. With Perplexity, Claude, Grok, and Gemini competing, and Microsoft integrating AI into nearly all applications, AI’s energy consumption will only continue to rise.
Storage: The Cloud’s Carbon Footprint
From emails and lab notes to microscopy images and research papers, we increasingly rely on cloud storage for convenience.
According to some estimates, cloud storage can consume up to 1,000,000 times more energy than local storage on drives.
You can check out the white paper the graph is taken from right here. More on data centers in our Slack.
While it’s difficult to quantify how much of this is specifically due to AI, it’s clear that ChatGPT saves conversations by default, and most image-generation platforms require users to store generated images unless they pay for premium features. The cumulative impact is likely significant.
Already in the early 2020s, cloud storage alone was estimated to contribute 1.5% of the entire global carbon footprint.
Applying The Knowledge
Although each prompt to ChatGPT takes only a few Wh, the impacts add up. Talking to the ChatGPT app means processing your voice, transcribing it, running inference, searching the web, and rendering a voice to talk back to you.
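To illustrate how a single voice exchange stacks up, here is a sketch that simply sums the pipeline stages. Every per-stage value is a made-up placeholder (only the 2.9 Wh inference figure echoes the estimate above); the point is the shape of the pipeline, not the exact numbers.

```python
# Hypothetical per-stage energy costs (Wh) for one voice exchange.
# These values are illustrative placeholders, not measurements.
stages = {
    "speech processing": 0.2,
    "transcription":     0.3,
    "LLM inference":     2.9,   # the per-query estimate above
    "web search":        0.3,
    "voice rendering":   0.5,
}

for name, wh in stages.items():
    print(f"{name:<18} {wh:.1f} Wh")
print(f"{'total':<18} {sum(stages.values()):.1f} Wh per exchange")  # ~4.2 Wh
```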
Before using a large language model (LLM), ask yourself:
Is AI necessary for my task?
If you need the molecular weight of a protein, searching UniProt or Google is probably not only more reliable but also much more sustainable.
Also, model size matters. Energy consumption scales roughly linearly with parameter count, meaning a model with 1 billion parameters could require about ten times the energy of one with 100 million.
These data stem from a preprint by Luccioni et al., which averaged the carbon dioxide emitted per 1,000 inferences.
While OpenAI doesn’t offer ChatGPT in different sizes, smaller, task-specific models often consume significantly less energy. The same applies to mixture-of-experts models, which selectively activate only portions of the network (see our Slack discussion).
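Under the linear-scaling rule of thumb above, picking a smaller model translates into proportional savings. Here is a toy sketch, assuming a purely hypothetical baseline of 100 g CO₂ per 1,000 inferences for a 1-billion-parameter model (in the spirit of, but not taken from, Luccioni et al.):

```python
# Toy comparison under the linear-scaling assumption stated above.
# The baseline emission figure is a hypothetical placeholder.
BASELINE_G_CO2_PER_1000 = 100.0   # g CO2 per 1,000 inferences at 1B params
BASELINE_PARAMS_B = 1.0

def grams_per_1000_inferences(params_billions: float) -> float:
    return BASELINE_G_CO2_PER_1000 * params_billions / BASELINE_PARAMS_B

for size in (0.1, 1.0, 10.0):
    g = grams_per_1000_inferences(size)
    print(f"{size:>5.1f} B params: ~{g:,.0f} g CO2 per 1,000 inferences")
```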
Finally, we did not talk about embodied carbon, i.e., the carbon footprint of producing the GPUs, CPUs, and servers. Luccioni et al. estimated that, for their small 9-billion-parameter model, another 11.2 tons of CO₂eq can be added.
Upcoming Lesson:
Saving Energy In Your Studies
How We Feel Today
References
Hintemann, R., Hinterholzer, S., 2022. Data centers 2021: Data center boom in Germany continues—Cloud computing drives the growth of the data center industry and its energy consumption. Tech. Rep., Borderstep Institute for Innovation and Sustainability. doi:10.13140/RG.2.2.31826.43207.
Luccioni, A.S., Jernite, Y., Strubell, E., 2024. Power hungry processing: Watts driving the cost of AI deployment? Proc. ACM Conf. Fairness, Accountability, and Transparency (FAccT '24), 85–99. Assoc. Comput. Mach., New York, NY, USA. doi:10.1145/3630106.3658542.
Patterson, D., Gonzalez, J., Le, Q., Liang, C., Munguia, L.-M., Rothchild, D., So, D., Texier, M., Dean, J., 2021. Carbon emissions and large neural network training. arXiv, 2104.10350. doi:10.48550/arXiv.2104.10350.
de Vries, A., 2023. The growing energy footprint of artificial intelligence. Joule, 7(10), 2191–2194. doi:10.1016/j.joule.2023.09.004.
Luccioni, A.S., Viguier, S., Ligozat, A.-L., 2023. Estimating the carbon footprint of BLOOM, a 176B parameter language model. J. Mach. Learn. Res., 24, 1–15.
If you have a wish or a question, feel free to reply to this email. Otherwise, we wish you a beautiful week! See you again on the 10th : )
Edited by Patrick Penndorf Connection@ReAdvance.com Lutherstraße 159, 07743, Jena, Thuringia, Germany Data Protection & Impressum If you think we do a bad job: Unsubscribe