Hi Reader, do you think we should include AI in our everyday work?
It could make our world much more sustainable — at least according to some authors.
Today’s piece is a bit longer, as it tells an unbelievable story about how misleading data is generated... and published in a Nature journal.
Believe me, reading it will be worth it:
Today's Lesson: Discussing Misleading Data
How publications can support questionable claims
Number Of The Day
94.1
AI and machine learning approaches hold great potential for environmental monitoring and prediction. For example, Castelli et al. were able to predict air quality indices with an accuracy of 94.1%—an important achievement, given that pollution-related illnesses cost the U.S. over $30 billion each year. Similarly, Jaafari et al. developed AI models capable of predicting wildfires with over 98% accuracy.
Is AI More Sustainable Than Humans?
Recently, I came across a paper making a bold claim:
“Our findings reveal that AI systems emit between 130 and 1500 times less CO₂e per page of text generated compared to human writers...”
Now, those numbers certainly grab attention. But it continues:
", while AI illustration systems emit between 310 and 2900 times less CO₂e per image than their human counterparts.”
The authors claim to have followed “best practices in life cycle assessment” to arrive at these results.
Where These Numbers Come From
To estimate AI’s emissions, the authors used two language models (ChatGPT and BLOOM) and two image generators (Midjourney and DALL·E 2).
They combined:
The estimated CO₂e of a single query,
The amortized training footprint,
And the embodied carbon of the hardware.
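Roughly speaking, that accounting boils down to the sketch below. Every number in it is a placeholder I picked for illustration; none of them are values from the paper.

```python
# Sketch of the per-page accounting for an AI system: operational emissions per query
# plus an amortized share of training and hardware. All values are illustrative placeholders.

def ai_gco2e_per_page(
    inference_kwh_per_query=0.003,  # assumed electricity per query (kWh)
    grid_gco2e_per_kwh=400,         # assumed grid carbon intensity (g CO2e per kWh)
    training_gco2e=5e8,             # assumed total training footprint (~500 t CO2e)
    embodied_gco2e=2e8,             # assumed embodied carbon of the hardware (~200 t CO2e)
    lifetime_queries=1e10,          # queries over which training and hardware are amortized
    queries_per_page=1,             # queries needed to produce one page of text
):
    operational = inference_kwh_per_query * grid_gco2e_per_kwh
    amortized = (training_gco2e + embodied_gco2e) / lifetime_queries
    return (operational + amortized) * queries_per_page

print(f"{ai_gco2e_per_page():.2f} g CO2e per page")  # ~1.27 g with these placeholders
```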
BLOOM is an AI model created by researchers who also assessed the energy consumption of its training and inference (the paper is still a preprint).
This graphic shows the BLOOM architecture. I added it because the important point here is that ChatGPT and BLOOM share a similar architecture: a so-called decoder-only autoregressive Transformer. In simple terms, it generates text one token at a time, like writing word by word, with each prediction based only on what has already been processed (although for GPT-4, not all architectural details have been disclosed). Interestingly, more than 200 scientists contributed to developing BLOOM. You can access the preprint (published Nov 2022) here.
For humans, they took the total annual carbon footprint of an average person, then estimated what fraction of that should be accounted for, based on assumed time spent writing or illustrating.
I.e., a US resident emits approximately 15 metric tons of CO2e per year; add the embodied carbon and energy usage of a laptop, and scale the total down to the time needed for the task.
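To make that concrete, here is a minimal sketch of the human-side arithmetic as the paper describes it; the writing time and laptop figure are assumptions I made up purely for illustration.

```python
# Sketch of the human-writer accounting: total annual footprint spread over the year,
# then scaled to the time spent on the task. Writing time and laptop emissions are
# assumptions for illustration only.

annual_footprint_g = 15_000_000   # ~15 metric tons CO2e per year for a US resident
hours_per_year = 365 * 24
hours_per_page = 0.8              # ASSUMPTION: time needed to write one page
laptop_g_per_hour = 20            # ASSUMPTION: laptop energy use + embodied carbon

per_page = (annual_footprint_g / hours_per_year + laptop_g_per_hour) * hours_per_page
print(f"{per_page:.0f} g CO2e per page")  # ~1386 g with these assumptions
```

Notice how strongly the result hinges on the assumed writing speed: double the hours per page and the human figure doubles with it.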
The Results:
A single page of human-written text supposedly emits ~1400 g CO₂e in the US, compared to ~2.2 g CO₂e for ChatGPT.
An illustration by a US-based human emits ~5500 g CO₂e, versus ~1.9 g CO₂e for Midjourney.
At first glance, AI seems vastly superior. But this illusion starts to crumble when you examine the methods.
What’s Wrong With the Approach?
Up until the day before yesterday, I had deliberately not checked who wrote the paper or where it came from. I tried to evaluate it as independently as possible; I only knew it was published in Scientific Reports (a Nature Portfolio journal).
The authors did not declare any conflict of interest, apart from one of them owning NVIDIA stock (which, given how the stock has performed since 2023, has hopefully worked out well for him).
In essence, my conclusion was the following:
The paper isn’t particularly rigorous, to say the least.
It reads more like a hybrid between a blog post and a journal article.
And the same can be said for its impact calculations. Here's where it falls short:
Unreliable Sources: Many estimates were pulled from internet articles, blog posts, or statements without references. For example, their estimate for ChatGPT’s energy use comes from an unverified blog post—and no rationale is given for dividing daily usage by 10 million queries.
Oversimplified Assumptions: The human estimates simply assume that someone spends a certain number of hours on a creative task and then divide that person's total footprint accordingly. That’s all they did.
No Sensitivity Analysis: There’s no discussion of uncertainty, no error ranges, and no scenario testing. This makes the study’s conclusions unreliable at best (see the sketch after this list for what even a crude check could look like).
Lack of Technical Rigor: The term “inference”—the correct term for running an AI model—isn’t mentioned once.
Misleading Framing: Suggesting that AI is “x-times” more sustainable than a human oversimplifies a highly contextual and dynamic topic.
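As promised above: a minimal sketch of what even a crude sensitivity check could have looked like. Every range below is invented for illustration; the point is the method, not the numbers.

```python
# Crude Monte Carlo sensitivity check: sample each uncertain input from a range
# and look at the spread of the resulting human-to-AI emission ratio.
# All ranges are invented for illustration.
import random

def human_g_per_page(rng):
    annual_g = rng.uniform(10e6, 20e6)        # annual footprint: 10-20 t CO2e
    hours_per_page = rng.uniform(0.5, 3.0)    # time to write one page
    return annual_g / (365 * 24) * hours_per_page

def ai_g_per_page(rng):
    kwh_per_query = rng.uniform(0.001, 0.01)  # electricity per query (kWh)
    grid = rng.uniform(200, 700)              # grid intensity (g CO2e per kWh)
    amortized = rng.uniform(0.01, 1.0)        # training + hardware share per query (g)
    return kwh_per_query * grid + amortized

rng = random.Random(42)
ratios = sorted(human_g_per_page(rng) / ai_g_per_page(rng) for _ in range(10_000))
print("5th-95th percentile of the ratio:", round(ratios[500]), "to", round(ratios[9500]))
```

Even this toy version shows the headline ratio moving by an order of magnitude depending on the inputs, which is exactly the kind of range the paper never reports.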
Yes, this is really where our authors got their data. In their paper, they state the following: “Holz [CEO of Midjourney, an image creation AI] stated, with regard to Midjourney’s computer usage, that “[e]very image is taking petaops ... So 1000s of trillions of operations. I don’t know exactly whether it’s five or 10 or 50. But it’s 1000s of trillions of operations to make an image... [W]ithout a doubt, there has never been a service before where a regular person is using this much compute”. From this offhand remark, our authors simply converted 50 petaoperations into g CO₂e.
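For context, turning a raw operation count into emissions requires assumptions about hardware efficiency and grid carbon intensity at every step. Here is a rough sketch of that chain with placeholder values of my own, not the paper's:

```python
# Rough conversion chain from an operation count to g CO2e.
# Hardware efficiency and grid intensity are my own placeholder assumptions.

operations = 50e15          # "50 petaoperations" per image, taken from the quote above
ops_per_joule = 5e11        # ASSUMPTION: effective accelerator efficiency (ops per joule)
grid_gco2e_per_kwh = 400    # ASSUMPTION: grid carbon intensity

energy_kwh = operations / ops_per_joule / 3_600_000   # joules converted to kWh
print(f"{energy_kwh * grid_gco2e_per_kwh:.1f} g CO2e per image")  # ~11 g with these values
```

Shift either assumption by a factor of a few, which is easily within the plausible range, and the per-image figure moves just as much. That is why building on an interview quote is such shaky ground.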
They also assumed that text-to-image generation takes as much energy as text-to-text generation... We discussed in a previous piece that it does not (and here is a publication on the topic).
So, Is AI More Sustainable?
Without doubt, AI systems can generate content using relatively little electricity, especially when compared to all the indirect emissions tied to human labor, like commuting, heating offices, and maintaining infrastructure.
On top of that, these systems are trained on almost all of human knowledge, giving them extremely broad context and thus allowing them to generate adept output.
However, directly comparing humans and AI in terms of environmental impact is deeply misleading.
Why? Because the meaningful comparison isn’t:
Human vs AI
To my mind, it is:
Human using AI vs human using previous tools
That includes the full picture:
The time humans spend generating and editing the output until publication,
The emissions from AI training and inference,
The energy used for data storage and transfer,
The hardware needed to access and operate AI or conventional tools.
And most importantly: the purpose of the task. Are we using AI to write critical research or just generate cat memes? Are we saving time or simply producing more volume with no added value?
That said, analyses like those conducted by Wang et al. have shown that, under certain conditions, AI adoption can lead to an overall environmental benefit, given that AI promotes innovation and enables enhanced technology to analyze and interact with our environment (see Olawade et al. for an amazing review).
This is the basic model used by Wang et al. For those with a lot of curiosity and stamina: “EF, CE, and ETR are the explained variables, representing ecological footprint, carbon emission, and energy transition respectively. AI is the explanatory variable, representing the level of artificial intelligence development. C represents the control variables vector, which is kept the same in the three models. u_i is the individual fixed effect, and ε_it is the disturbance term. q_it is the threshold variable and γ is the corresponding threshold value. I(·) is an indicator function that takes 1 when the condition is satisfied and 0 otherwise.”
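Based purely on that variable description, the setup looks like a standard fixed-effects panel model combined with a panel threshold specification. The equations below are my own reconstruction of that structure, not copied from Wang et al.:

```latex
% My reconstruction based on the variable description above; not copied from Wang et al.
% Baseline fixed-effects models, one each for Y in {EF, CE, ETR}:
\begin{align}
  Y_{it} &= \alpha + \beta\,\mathrm{AI}_{it} + \theta' C_{it} + u_i + \varepsilon_{it} \\
% Threshold specification: the effect of AI switches once q_{it} crosses the threshold gamma
  Y_{it} &= \alpha + \beta_1\,\mathrm{AI}_{it}\, I(q_{it} \le \gamma)
                   + \beta_2\,\mathrm{AI}_{it}\, I(q_{it} > \gamma)
                   + \theta' C_{it} + u_i + \varepsilon_{it}
\end{align}
```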
But these studies include:
Detailed mathematical models,
Sensitivity analyses,
Robust data sources.
In other words, they did what the other paper should have.
Applying The Knowledge
This example highlights an uncomfortable reality: peer-reviewed does not mean flawless. We’ve seen journals like Nature retract high-profile papers.
Therefore, always ask:
Where did the data come from (garbage in, garbage out)?
Are the assumptions realistic?
Under which circumstances does the data become invalid?
For the paper we discussed, it is more like concluding that electric cars are better than motorbikes because they do not use gas, while ignoring that walking emits even less.
An Example of Citation Gone Wrong
If you think I’m being harsh, here’s an example from the same paper that highlights how flawed citations can shape a narrative.
The authors cite a statement claiming that:
“One barrel of oil provides the work equivalent of 11 hours of human manual labor.”
But the actual source of this claim?
At first, the authors cite a paper that actually speaks of one barrel of oil being equivalent to 11 years of human work. That paper, in turn, links to a reference that links to a blog post summarizing a public talk, with a recording on Soundcloud that has since been deleted.
And when you finally arrive at the blog post... you see the claim is basically non-existent. The word "oil" is not even mentioned there. PS: I did not include all arrows in the graphic since that simply looked awful; the journey goes from the upper left to the lower right.
This is how misinformation spreads—even in peer-reviewed literature. And once it’s out there, it gets cited again and again.
In The End
The danger is that Google or Microsoft could now point to a peer-reviewed paper when negotiating with politicians about how far they should support the development of AI…
The paper we discussed is widely circulated. On the website it reads: “This article is in the 99th percentile (ranked 477th) of the 422,728 tracked articles of a similar age in all journals and the 99th percentile (ranked 13th) of the 3,693 tracked articles of a similar age in Scientific Reports”. In our free Slack, I share some more insights into the odd statements one of the authors made in an interview.
This isn’t about bashing AI or defending humans. It’s about being honest with how we generate and use data.
I guess the question to ask is "How can we optimize AI use while maximizing value to make the emissions worth it?"
Upcoming Lesson:
How Much Energy Do Your Lab Instruments Consume?
How We Feel Today
A Little Announcement
Don’t forget: today, My Green Lab is hosting their webinar on the new ACT 2.0! This label tells you about the environmental impact of your lab equipment. See you there, if you register right here.
References
Scao, T. L. et al., Bloom: A 176b-parameter open-access multilingual language model. arXiv preprint arXiv:2211.05100 (2022). https://arxiv.org/abs/2211.05100
Luccioni, S. et al., Power hungry processing: watts driving the cost of AI deployment? Proceedings of the 2024 ACM Conference on Fairness, Accountability, and Transparency (FAccT '24), 85–99 (2024). https://doi.org/10.1145/3630106.3658542
Wang, Q. et al., Ecological footprints, carbon emissions, and energy transitions: the impact of artificial intelligence (AI). Humanities and Social Sciences Communications 11, 1043 (2024). https://doi.org/10.1057/s41599-024-03520-5
Olawade, D. B. et al., Artificial intelligence in environmental monitoring: advancements, challenges, and future directions. Hygiene and Environmental Health Advances 12, 100114 (2024). https://doi.org/10.1016/j.heha.2024.100114
Of course, I do not list the publication we discussed as a reference...
If you have a wish or a question, feel free to reply to this Email. Otherwise, I wish you a beautiful week! See you again on the 15th : )
Edited by Patrick Penndorf Connection@ReAdvance.com Lutherstraße 159, 07743, Jena, Thuringia, Germany Data Protection & Impressum If you think we do a bad job: Unsubscribe