Hi Reader, how long is your average conversation with ChatGPT?
Last week, we saw that a single query to a large language model (LLM) consumes 0.5–50 Wh, releasing 1–5 grams of CO₂.
That’s about 10 times the energy required for a Google search.
Whether it’s AlphaFold or ChatGPT, let’s look at how we can reduce our footprint when using these models.
Today's Lesson: Lowering Impacts of AI
How we can calculate more sustainably
Number Of The Day
9 Billion
Far more than 9 billion hours of computation are spent every year on scientific inquiry. Although this number is just an estimate, the INCITE program alone allows scientists to use 5.95 billion computing hours and 100 trillion bytes of data storage at the Department of Energy’s (DOE) leadership computing facilities. And with the ever-broader use of large language models and applications like AlphaFold, this number will probably grow exponentially.
Reducing the Impact of AI
According to Lannelongue et al., training AlphaFold took 11 days and emitted approximately 3.92 tons of CO₂e.
And each time it predicts the structure of a protein with 2,500 residues, it adds another 3 kg CO₂e.
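To put those two numbers in perspective, here is a quick back-of-the-envelope calculation (using only the figures quoted above) showing after how many structure predictions the cumulative inference emissions overtake the one-off training cost:

```python
# Figures from Lannelongue et al. (see references)
TRAINING_CO2E_KG = 3920      # one-off training cost: ~3.92 t CO2e
PER_PREDICTION_CO2E_KG = 3   # ~3 kg CO2e per 2,500-residue prediction

# Number of predictions after which cumulative inference
# emissions exceed the training emissions
break_even = TRAINING_CO2E_KG / PER_PREDICTION_CO2E_KG
print(f"Inference overtakes training after ~{break_even:.0f} predictions")
```

In other words, after roughly 1,300 predictions the day-to-day use of the model matters more than its training ever did.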
Grealey et al. developed a tool to measure the impact of various bioinformatic tasks. The numbers above the boxes represent mean values. Although none of the tasks involved AI support, many advanced analyses—such as those used in phylogenetics—already feature machine learning approaches.
This isn’t just limited to biology. For comparison, the Canada–France–Hawaii Telescope produces an estimated 749 tons of CO₂e every year, mostly for research operations. So how can we reduce our footprint?
Reduction – Handle AI with Care
A single ChatGPT prompt uses, on average, 10 times more energy than a Google search.
Generating a picture with DALL·E or Midjourney can consume between 0.5–1 kWh, translating to 100–400 grams of CO₂ emissions.
This image is from a retracted paper and was one of the first—and most hilarious—cases of AI-generated scientific figures (I dug out the original paper for you because the other picture is even worse). While clearly exaggerated, it illustrates a growing trend: the use of AI in both science communication (e.g., on social media) and in scientific research itself. Notably, as AI continues to improve in generating and editing text within images, its misuse becomes increasingly concerning—particularly in the context of predatory publishing, paper mills, and scientific misconduct. So far, however, AI is seldom helpful for science communication.
Therefore, when it comes to simple questions—like protein sizes or chemical formulas—you might be better off searching in UniProt or Wikipedia.
If you use Large Language Models (LLMs) like ChatGPT, optimizing your prompts can make a difference.
For example, under standardized testing conditions:
Using “justify” in your prompt leads to 20–60% higher energy consumption than using “explain.”
Replacing “explain” with “list” can save another 10–30% of electricity.
Although a study by Adamska et al. suggests that prompt length has limited influence on energy use, response length is linearly correlated with consumption.
Scatter plots from Adamska and colleagues, showing the relation between token count and energy consumption. A token is the “unit” in which LLMs process language; it can span multiple characters yet still be shorter than a word. The model they used had 7 billion parameters.
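That linear relationship lends itself to a first-order estimate. The sketch below assumes an illustrative cost of 2 joules per generated token plus a fixed per-request overhead—both coefficients are placeholders, not measurements from the paper—but it shows why a terse “list”-style answer is cheaper than a long-winded “justify”-style one:

```python
# Hedged first-order model: energy grows linearly with response length.
# The coefficients below are illustrative assumptions, NOT measured values.
JOULES_PER_TOKEN = 2.0      # assumed marginal cost per generated token
JOULES_OVERHEAD = 50.0      # assumed fixed cost per request

def estimated_energy_wh(response_tokens: int) -> float:
    """Rough energy estimate (Wh) for one LLM response."""
    joules = JOULES_OVERHEAD + JOULES_PER_TOKEN * response_tokens
    return joules / 3600  # 1 Wh = 3600 J

# A terse answer vs. a long-winded one:
short = estimated_energy_wh(50)    # e.g. a "list"-style answer
long = estimated_energy_wh(500)    # e.g. a "justify"-style answer
print(f"{short:.3f} Wh vs {long:.3f} Wh")
```

Under these toy assumptions the ten-times-longer response costs roughly seven times more energy—the fixed overhead dampens, but does not remove, the penalty for verbosity.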
Here are three habits that make your AI usage more efficient:
Be specific from the start. Give ChatGPT all relevant information upfront—style, tone, structure—so you don’t need multiple refinements. For other models, double-check settings for data fidelity, boundary conditions, or accuracy.
Reuse what you've already created. Provide ChatGPT with previous content or data to build upon instead of starting from scratch.
Interrupt bad generations early. If the response doesn’t match your intent, stop it immediately. Also, clean up conversations you no longer need.
Using Services & Storing Data
In March 2020, DE-CIX in Frankfurt—the world’s largest internet exchange point—recorded a throughput peak of 9.16 terabits per second.
That’s equivalent to more than two million HD videos being transmitted simultaneously.
Fluctuation of the mean power drawn by 16 Nvidia A100 GPUs running a 176-billion-parameter AI model. Importantly, even when not in use (i.e., idle), the energy consumption is not zero. The red line indicates the mean power consumption of 1,664 W.
Storing 1,000 GB of data in data centers can generate 100–300 kg of CO₂ per year. That’s 20–50 watts per terabyte—continuously. (You can read the nitty-gritty of how optimized networks reduce energy use by 30–95% in our free Slack channel.)
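The per-terabyte wattage translates into annual emissions like this. The grid intensity of 0.4 kg CO₂/kWh below is an assumed average, and real data centers add replication and cooling overheads on top, which is why published figures land somewhat higher:

```python
HOURS_PER_YEAR = 8760
GRID_KG_CO2_PER_KWH = 0.4   # assumed average grid intensity

def storage_co2_kg_per_year(terabytes: float, watts_per_tb: float) -> float:
    """Annual CO2 from continuously powered storage."""
    kwh = terabytes * watts_per_tb * HOURS_PER_YEAR / 1000
    return kwh * GRID_KG_CO2_PER_KWH

# 1 TB at the quoted 20-50 W/TB range:
low = storage_co2_kg_per_year(1, 20)
high = storage_co2_kg_per_year(1, 50)
print(f"{low:.0f}-{high:.0f} kg CO2 per TB-year")
```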
Therefore:
Edit and store documents locally. If you don’t need real-time collaboration, use a local text editor instead of cloud-based tools like Google Docs or iCloud Pages.
Save AI outputs locally. For example, if you use Copilot or ChatGPT to summarize a paper, paste the summary into your Word file or citation manager. This avoids repeating the same energy-intensive queries.
Download and organize data. After generating AI responses or images, save them on your device rather than leaving them in the cloud.
As mentioned last week, there are several tools that help you, from analyzing papers to explaining figures or writing entire books. Tools like SciSpace Copilot can be great for understanding papers from other fields; however, when it comes to details, I found it less reliable.
Knowing the Limits of AI
ChatGPT performs well at explaining statistical concepts but is often unreliable for actually performing calculations. For that, you’re better off using dedicated statistical software.
How to test and validate:
A) Start small. Run test prompts or calculations on limited datasets to evaluate performance.
B) Break tasks into steps. Divide large requests into manageable chunks—so if something fails, you don’t need to rerun everything.
C) Reuse working prompts. When something works well, save it. From there, develop a prompt-engineering strategy (aka remember what patterns work 😄).
Getting the Time Right
Electricity grids fluctuate in carbon intensity throughout the day, depending on demand and the availability of renewables like solar and wind.
According to Dodge et al., if you time your bioinformatics jobs strategically:
Shifting start times by just a few hours can cut emissions by up to 80% for short jobs (<30 minutes). For longer jobs (>1 day), shifting often saves less than 1.5%.
Suspend and resume. For long jobs, pausing during carbon-intensive hours and resuming during greener periods can reduce emissions by up to 25%.
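The “flexible start” idea boils down to a window search over a carbon-intensity forecast. A minimal sketch, assuming you have an hourly forecast in gCO₂/kWh from a service such as Electricity Maps (the toy numbers below are illustrative):

```python
def best_start_hour(forecast, duration_h):
    """Return the start index minimizing average carbon intensity
    over a job of `duration_h` hours. `forecast` is gCO2/kWh per hour."""
    best, best_avg = 0, float("inf")
    for start in range(len(forecast) - duration_h + 1):
        avg = sum(forecast[start:start + duration_h]) / duration_h
        if avg < best_avg:
            best, best_avg = start, avg
    return best, best_avg

# Toy 12-hour forecast: greener around midday (solar peak)
forecast = [420, 400, 380, 300, 220, 180, 170, 200, 280, 360, 410, 430]
start, avg = best_start_hour(forecast, 3)
print(f"Start at hour {start} (avg {avg:.0f} gCO2/kWh)")
```

For short jobs this kind of shift is nearly free; for multi-day jobs the averaging washes the benefit out, which matches the numbers from Dodge et al. above.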
You can find some additional actions to reduce impacts in our free Slack if you like.
Applying The Knowledge
Don’t shy away from AI for fear of its environmental impact. AI is here to stay—and it can play a key role in improving science.
But when using AI, ask yourself:
What | Which model or service are you using?
How | Can you optimize your prompt or setup?
When | Can you run jobs during low-carbon hours?
Where | Can you store data locally?
Why | Do you really need AI for this task?
A good rule of thumb to reduce your impact: keep computation times as short as possible. That means optimizing prompts, double-checking settings, and planning AI use as part of your experimental design.
R: Use CarbonR to estimate emissions from statistical models and simulations.
Upcoming Lesson:
Exciting Innovations For A Sustainable Future
How We Feel Today
References
Today we have quite a list :D
Although it took a while to go through them all, it was pleasant to see that people are publishing on this topic.
Grealey, J., et al., 2022. The carbon footprint of bioinformatics. Mol. Biol. Evol. 39(3), msac034. doi:10.1093/molbev/msac034.
Posani, L., et al., 2019. The carbon footprint of distributed cloud storage. Tech. Rep., Cornell University. arXiv:1803.06973. doi:10.48550/arXiv.1803.06973.
Flagey, N., et al., 2020. Measuring carbon emissions at the Canada–France–Hawaii Telescope. Nat. Astron. 4, 816–818. doi:10.1038/s41550-020-1190-4.
Gröger, J., et al., 2021. Green Cloud Computing: Lebenszyklusbasierte Datenerhebung zu Umweltwirkungen des Cloud Computing. Tech. Rep., Umweltbundesamt. Forschungskennzahl 3717 34 348 0.
Luccioni, A.S., Viguier, S., Ligozat, A.-L., 2022. Estimating the carbon footprint of BLOOM, a 176B parameter language model. Tech. Rep., Cornell University. arXiv:2211.02001. doi:10.48550/arXiv.2211.02001.
Dodge, J., et al., 2022. Measuring the carbon intensity of AI in cloud instances. Proc. ACM Conf. Fairness, Accountability, and Transparency (FAccT '22), 1877–1894. Assoc. Comput. Mach., New York, NY, USA. doi:10.1145/3531146.3533234.
Chien, A. A., et al., 2023. Reducing the carbon impact of generative AI inference (today and in 2035). Proc. 2nd Workshop on Sustainable Computer Systems (HotCarbon '23), Article 11, 1–7. Assoc. Comput. Mach., New York, NY, USA. doi:10.1145/3604930.3605705.
Hanafy, W. A., et al., 2023. CarbonScaler: Leveraging cloud workload elasticity for optimizing carbon-efficiency. Proc. ACM Meas. Anal. Comput. Syst. 7(3), Article 57, 28 pages. doi:10.1145/3626788.
Adamska, M., et al., 2025. Green prompting. Tech. Rep., Lancaster University. arXiv:2503.10666. doi:10.48550/arXiv.2503.10666.
Lannelongue, L., Inouye, M., 2023. Environmental impacts of machine learning applications in protein science. Cold Spring Harb. Perspect. Biol. 15(12), a041473. doi:10.1101/cshperspect.a041473.
If you have a wish or a question, feel free to reply to this email. Otherwise, I wish you a beautiful week! See you again on the 17th : )
Edited by Patrick Penndorf Connection@ReAdvance.com Lutherstraße 159, 07743, Jena, Thuringia, Germany Data Protection & Impressum If you think we do a bad job: Unsubscribe