Green Education - Reducing AI Footprints


Personal Note From Patrick, The Editor

Hi Reader, how long is your average conversation with ChatGPT?

Last week, we saw that a single query to a large language model (LLM) consumes 0.5–50 Wh, releasing 1–5 grams of CO₂.

That’s about 10 times the energy required for a Google search.

Whether it’s AlphaFold or ChatGPT, let’s look at how we can reduce our footprint when using these models.


Today's Lesson: Lowering Impacts of AI

How we can calculate more sustainably


Number Of The Day

9 Billion

Far more than 9 billion hours of computation are spent every year on scientific inquiry. Although this number is just an estimate, the INCITE program alone grants scientists 5.95 billion computing hours and 100 trillion bytes of data storage at the Department of Energy’s (DOE) leadership computing facilities. And with the ever-broader use of large language models and applications like AlphaFold, this number will probably grow exponentially.


Reducing the Impact of AI

According to Lannelongue et al., training AlphaFold took 11 days and emitted approximately 3.92 tons of CO₂e.


And each time it predicts the structure of a protein with 2,500 residues, it adds another 3 kg CO₂e.

This isn’t just limited to biology. For comparison, the Canada–France–Hawaii Telescope produces an estimated 749 tons of CO₂e every year, mostly for research purposes. So how can we reduce our footprint?

Reduction – Handle AI with Care

A single ChatGPT prompt uses, on average, 10 times more energy than a Google search.


Generating a picture with DALL·E or Midjourney can consume between 0.5–1 kWh, translating to 100–400 grams of CO₂ emissions.

Therefore, when it comes to simple questions—like protein sizes or chemical formulas—you might be better off searching in UniProt or Wikipedia.

If you use Large Language Models (LLMs) like ChatGPT, optimizing your prompts can make a difference.

For example, under standardized testing conditions:

  • Using “justify” in your prompt leads to 20–60% higher energy consumption than using “explain.”
  • Replacing “explain” with “list” can save another 10–30% of electricity.

Although a study by Adamska et al. suggests that prompt length has limited influence on energy use, response length is linearly correlated with consumption.
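Since response length drives consumption roughly linearly, you can sketch the trade-off in a few lines of Python. The per-token figure below is an illustrative assumption for the sake of the calculation, not a measured constant:

```python
# Rough estimate of energy cost from expected response length.
# WH_PER_TOKEN is an illustrative assumption, not a measured value.
WH_PER_TOKEN = 0.03

def estimated_energy_wh(response_tokens: int) -> float:
    """Energy scales roughly linearly with the number of generated tokens."""
    return response_tokens * WH_PER_TOKEN

# A terse "list"-style answer vs. a long "justify"-style answer:
short_answer = estimated_energy_wh(100)
long_answer = estimated_energy_wh(500)
print(f"short: {short_answer:.1f} Wh, long: {long_answer:.1f} Wh")
```

Whatever the true per-token figure is on a given model, the takeaway holds: prompts that invite shorter answers cost proportionally less energy.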

Here are three habits that make your AI usage more efficient:

  1. Be specific from the start. Give ChatGPT all relevant information upfront—style, tone, structure—so you don’t need multiple refinements. For other models, double-check settings for data fidelity, boundary conditions, or accuracy.
  2. Reuse what you've already created. Provide ChatGPT with previous content or data to build upon instead of starting from scratch.
  3. Interrupt bad generations early. If the response doesn’t match your intent, stop it immediately. Also, clean up conversations you no longer need.

Using Services & Storing Data

In March 2020, DE-CIX in Frankfurt—the world’s largest internet exchange point—recorded a throughput peak of 9.16 terabits per second.

That’s equivalent to more than two million HD videos being transmitted simultaneously.

Storing 1,000 GB of data in data centers can generate 100–300 kg of CO₂ per year. That’s a continuous draw of 20–50 watts per terabyte.
(You can read the nitty-gritty of how optimized networks reduce energy use by 30–95% in our free Slack channel.)
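The storage figures above follow from simple arithmetic. Assuming a grid intensity of roughly 0.6 kg CO₂ per kWh (a rough value for fossil-heavy grids; real intensities vary widely by country), 20–50 W per terabyte lands near the quoted 100–300 kg range:

```python
# Continuous power draw -> annual emissions, for 1 TB of cloud storage.
HOURS_PER_YEAR = 24 * 365           # 8760 h
GRID_KG_CO2_PER_KWH = 0.6           # assumed grid intensity (varies by country)

def annual_storage_co2_kg(watts_per_tb: float) -> float:
    """Convert a continuous wattage into kg CO2 emitted per year."""
    kwh_per_year = watts_per_tb * HOURS_PER_YEAR / 1000  # W -> kWh/year
    return kwh_per_year * GRID_KG_CO2_PER_KWH

low = annual_storage_co2_kg(20)     # ~105 kg CO2/year
high = annual_storage_co2_kg(50)    # ~263 kg CO2/year
print(f"{low:.0f}-{high:.0f} kg CO2 per TB per year")
```

On a cleaner grid the numbers shrink accordingly, but the point stands: a terabyte left idling in the cloud emits year-round.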

Therefore:

  • Edit and store documents locally. If you don’t need real-time collaboration, use a local text editor instead of cloud-based tools like Google Docs or iCloud Pages.
  • Save AI outputs locally. For example, if you use Copilot or ChatGPT to summarize a paper, paste the summary into your Word file or citation manager. This avoids repeating the same energy-intensive queries.
  • Download and organize data. After generating AI responses or images, save them on your device rather than leaving them in the cloud.

Knowing the Limits of AI

ChatGPT performs well at explaining statistical concepts but is often unreliable for actually performing calculations. For that, you’re better off using dedicated statistical software.

How to test and validate:

  • A) Start small. Run test prompts or calculations on limited datasets to evaluate performance.
  • B) Break tasks into steps. Divide large requests into manageable chunks—so if something fails, you don’t need to rerun everything.
  • C) Reuse working prompts. When something works well, save it. From there, develop a prompt-engineering strategy (aka remember what patterns work 😄).
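Steps B and C can be sketched in code: split a long input into chunks so that a failure only costs one chunk, and keep prompts that worked in a reusable template. All names and the template text here are illustrative:

```python
# Illustrative sketch: chunk a long text and reuse a saved prompt template,
# so a failed step wastes one chunk instead of the whole job.
SAVED_PROMPTS = {
    # Prompts that worked well once, saved for reuse (step C).
    "summarize": "List the key findings of the following text:\n{chunk}",
}

def chunk_text(text: str, max_words: int = 200) -> list[str]:
    """Split text into word-count-limited chunks (step B)."""
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]

def build_requests(text: str, task: str = "summarize") -> list[str]:
    """Fill the saved template once per chunk."""
    template = SAVED_PROMPTS[task]
    return [template.format(chunk=c) for c in chunk_text(text)]

requests = build_requests("word " * 450)  # 450 words -> 3 chunks
print(len(requests))
```

Each chunk becomes one small, testable request, which also makes step A (start small) the natural default.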

Getting the Time Right

Electricity grids fluctuate in carbon intensity throughout the day, depending on demand and the availability of renewables like solar and wind.

According to Dodge et al., if you time your bioinformatics jobs strategically:

  • Shifting start times by just a few hours can cut emissions by up to 80% for short jobs (<30 minutes). For longer jobs (>1 day), shifting often saves less than 1.5%.
  • Suspend and resume. For long jobs, pausing during carbon-intensive hours and resuming during greener periods can reduce emissions by up to 25%.
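The shifting idea can be sketched as a toy scheduler: given an hourly carbon-intensity forecast for the grid, pick the start hour that minimizes the job's summed intensity. The forecast values below are made up for illustration; in practice they would come from a grid-data provider:

```python
# Toy carbon-aware scheduler: choose the start hour with the lowest
# summed grid carbon intensity over the job's duration.
# Forecast values (gCO2/kWh per hour) are illustrative, not real data.
FORECAST = [450, 430, 400, 320, 250, 180, 160, 200,   # night -> morning
            300, 380, 420, 440]                        # midday -> evening

def best_start_hour(forecast: list[int], job_hours: int) -> int:
    """Return the start index whose window has the lowest total intensity."""
    windows = {
        start: sum(forecast[start:start + job_hours])
        for start in range(len(forecast) - job_hours + 1)
    }
    return min(windows, key=windows.get)

print(best_start_hour(FORECAST, 2))  # -> 5 (hours 5-6: 180 + 160 gCO2/kWh)
```

For short jobs the window choice dominates, which is exactly why Dodge et al. see the biggest savings there.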

You can find some additional actions to reduce impacts in our free Slack, if you like.

Applying The Knowledge

Don’t shy away from AI for fear of its environmental impact.
AI is here to stay—and it can play a key role in improving science.

But when using AI, ask yourself:

  • What | Which model or service are you using?
  • How | Can you optimize your prompt or setup?
  • When | Can you run jobs during low-carbon hours?
  • Where | Can you store data locally?
  • Why | Do you really need AI for this task?


A good rule of thumb to reduce your impact: keep computation times as short as possible.

That means optimizing prompts, double-checking settings, and planning AI use as part of your experimental design.

Want to track your own emissions?

  • Python: Try Carbontracker or CodeCarbon, which integrate directly into your pipeline.
  • R: Use CarbonR to estimate emissions from statistical models and simulations.

Upcoming Lesson:

Exciting Innovations For A Sustainable Future


How We Feel Today


References

Today we have quite a list :D


Although it took a while to go through them all, it was pleasant to see that people are publishing on this topic.

Grealey, J., et al., 2022. The carbon footprint of bioinformatics. Mol. Biol. Evol. 39(3), msac034. doi:10.1093/molbev/msac034.

Posani, L., et al., 2019. The carbon footprint of distributed cloud storage. Tech. Rep., Cornell University. arXiv:1803.06973. doi:10.48550/arXiv.1803.06973.

Flagey, N., et al., 2020. Measuring carbon emissions at the Canada–France–Hawaii Telescope. Nat. Astron. 4, 816–818. doi:10.1038/s41550-020-1190-4.

Gröger, J., et al., 2021. Green Cloud Computing: Lebenszyklusbasierte Datenerhebung zu Umweltwirkungen des Cloud Computing. Tech. Rep., Umweltbundesamt. Forschungskennzahl 3717 34 348 0.

Luccioni, A.S., Viguier, S., Ligozat, A.-L., 2022. Estimating the carbon footprint of BLOOM, a 176B parameter language model. Tech. Rep., Cornell University. arXiv:2211.02001. doi:10.48550/arXiv.2211.02001.

Dodge, J., et al., 2022. Measuring the carbon intensity of AI in cloud instances. Proc. ACM Conf. Fairness, Accountability, and Transparency (FAccT '22), 1877–1894. Assoc. Comput. Mach., New York, NY, USA. doi:10.1145/3531146.3533234.

Chien, A. A., et al., 2023. Reducing the carbon impact of generative AI inference (today and in 2035). Proc. 2nd Workshop on Sustainable Computer Systems (HotCarbon '23), Article 11, 1–7. Assoc. Comput. Mach., New York, NY, USA. doi:10.1145/3604930.3605705.

Hanafy, W. A., et al., 2023. CarbonScaler: Leveraging cloud workload elasticity for optimizing carbon-efficiency. Proc. ACM Meas. Anal. Comput. Syst. 7(3), Article 57, 28 pages. doi:10.1145/3626788.

Adamska, M., et al., 2025. Green prompting. Tech. Rep., Lancaster University. arXiv:2503.10666. doi:10.48550/arXiv.2503.10666.

Lannelongue, L., Inouye, M., 2023. Environmental impacts of machine learning applications in protein science. Cold Spring Harb. Perspect. Biol. 15(12), a041473. doi:10.1101/cshperspect.a041473.


If you have a wish or a question, feel free to reply to this Email.

Otherwise, I wish you a beautiful week!
See you again on the 17th : )

To find the previous lesson, click - here -


Edited by Patrick Penndorf
Connection@ReAdvance.com
Lutherstraße 159, 07743, Jena, Thuringia, Germany
Data Protection & Impressum

If you think we do a bad job: Unsubscribe

ReAdvance

Here to share how we can make labs greener - based on my personal experience and those from labs all around the world
