Hi Reader, laboratories can save plastics, reagents, and time through robotic automation.
We discussed these advantages last time, and I recently posted about how you might finance your purchase - but everything has a flip side.
That means if we implement automation incorrectly, we might end up with a much higher footprint.
So, how do we do it safely?
Today's Lesson: The Risks of Automation
The potential downsides of modernizing labs
Number of the Day
The average AI prompt produces about 2 g of CO2e, depending on the model, task, and location. While this is not a large footprint in itself, the key problem is that almost every software application nowadays offers AI support. That means almost every Google search, every PDF we open, and every analysis we run is AI-supported. What previously only required electricity to run your computer now adds significant energy consumption for servers, as well as for the training and inference of AI models...
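To make the scale tangible, here is a back-of-the-envelope sketch. The ~2 g CO2e per prompt comes from the text above; the daily task counts are purely illustrative assumptions for a single researcher, not measured values.

```python
# Rough estimate of the extra CO2e from AI-assisted everyday tasks.
# CO2E_PER_PROMPT_G is taken from the text; task counts are assumptions.

CO2E_PER_PROMPT_G = 2.0  # g CO2e per AI prompt (average, from the text)

# Hypothetical AI-backed actions per researcher per day:
daily_tasks = {
    "searches": 30,
    "document_summaries": 5,
    "data_analyses": 3,
}

daily_g = sum(daily_tasks.values()) * CO2E_PER_PROMPT_G
yearly_kg = daily_g * 365 / 1000

print(f"Daily:  {daily_g:.0f} g CO2e")   # 76 g
print(f"Yearly: {yearly_kg:.1f} kg CO2e")
```

Even with these modest assumptions, the invisible per-prompt gram adds up to tens of kilograms per person per year.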
Scaling Footprints through Automation
The Jevons paradox describes how efficiency gains can lead to higher overall consumption: because each use becomes cheaper, the perceived savings are used to justify more frequent use.
This rebound is also the main challenge with automation.
Faster and broader screens save time and miniaturize individual tests, but they also increase throughput.
The issue is that by running more experiments without investing accordingly in experimental design and original thinking, we will eventually end up with larger footprints.
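The rebound effect above can be put into numbers. This is a minimal sketch with invented figures: automation halves the footprint per assay, but the saved time and cost justify tripling throughput.

```python
# Illustration of the rebound (Jevons) effect in an automated lab.
# All numbers below are invented for illustration, not measured data.

baseline_assays_per_week = 100
baseline_footprint_per_assay = 10.0   # arbitrary footprint units

# Automation miniaturizes each assay (-50% per assay)...
automated_footprint_per_assay = baseline_footprint_per_assay * 0.5
# ...but the savings justify running 3x as many experiments.
automated_assays_per_week = baseline_assays_per_week * 3

baseline_total = baseline_assays_per_week * baseline_footprint_per_assay
automated_total = automated_assays_per_week * automated_footprint_per_assay

print(baseline_total)   # 1000.0
print(automated_total)  # 1500.0 -> higher total despite "greener" assays
```

The per-assay footprint drops, yet the weekly total rises by half, which is exactly the trap described above.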
Scaling Unnecessary Data
Larger datasets don’t compensate for insufficient approaches. There are three aspects to this:
More data won’t automatically help us generate new, intelligible insights.
Of course, there are many opportunities to automate and increase throughput, as Fraunhofer demonstrates. However, just because we can measure and process things does not mean we should. Moreover, large amounts of data present their own challenges. For example, we are currently more limited by data storage capacity for genomic data than by sequencing speed. At the same time, many organizations struggle with data protection and IT infrastructure.
Similarly, investigating with a flawed approach or on the wrong scale can’t be fixed simply with more data.
Finally, just because we can measure something doesn’t mean it needs to become a new default - whether for research or clinical diagnostics.
For example, many instruments collect and process data during acquisition that is never used.
Whether in fMRI studies or NMR analyses (from my own experience), turning off unnecessary data processing saves meaningful amounts of energy.
This graph is adapted from Souter et al., who showed that simply disabling unnecessary processing can, in one case, save up to 48% of energy without sacrificing data quality.
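To give a feel for what a 48% reduction could mean at scale, here is a hedged sketch: the saving fraction is from Souter et al., while the per-dataset energy and dataset count are assumptions for illustration.

```python
# What a 48% processing-energy reduction (Souter et al., best case)
# could mean at scale. Per-dataset energy and dataset count are assumed.

energy_per_dataset_kwh = 0.5   # assumed energy to preprocess one dataset
datasets = 1000                # assumed study/archive size
saving_fraction = 0.48         # from Souter et al., best case

saved_kwh = energy_per_dataset_kwh * datasets * saving_fraction
print(saved_kwh)  # 240.0 kWh saved across the archive
```

Switching off a default processing step once can thus pay off thousands of times over as it propagates through every run.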
In essence, as automation reaches the laboratory, we must ensure that we don’t measure just because we can.
The Impact of Instruments
Creating advanced technology comes with a footprint.
This is not just due to the building materials, but especially because of the metals and rare earth elements needed for the chips.
Another factor is that these instruments are often built across various factories around the world.
While there is little data on where the parts of our instruments come from, we can expect that it looks similar to those of our phones, as visualized by Lifewire. In essence, parts for advanced technology are shipped globally, sometimes multiple times, before final assembly.
Furthermore, these instruments consume energy.
While it is true that modern machines are more efficient and save energy by being faster overall, we still face a familiar issue:
> Leaving instruments running unnecessarily.
By the same token, we have to consider that we may replace them more frequently than needed.
Staying with phones for a second: planned obsolescence refers to manufacturers intentionally designing weaknesses into their products that limit their lifetime. As outlined in this blog, Apple was sued in 2017 in a case that eventually became known as “Batterygate.” It was found that Apple had been purposefully throttling the performance of older iPhones (notably the iPhone 6 and 7), officially to preserve battery health. This led to a settlement of up to $500 million distributed among affected iPhone users. Apart from hardware, the more software our instruments carry, the easier it is to force hardware updates (just think of Windows support).
Finally, the more complex instruments become, the more there is to break, potentially requiring more maintenance:
When Things Go Wrong
Speaking of replacements, malfunctions become more dangerous the more integrated systems become.
And having human personnel step in as a workaround is rarely feasible when entire processes are automated.
Problematic downtime in clinical, biotechnological, or pharmaceutical settings can be devastating, especially because these systems are designed for large-scale operations. Not only could critical medicines be delayed, but reagents and upstream resources may go to waste because they cannot be processed.
Moreover, IT safety becomes a concern. Collecting data, especially sensitive biological or patient data, also means we must invest in keeping it safe.
There have been cases where data breaches or ransomware attacks have shut down entire departments for months.
Finally, while working at the nanoscale can save resources, overlooking an issue while producing data at scale means generating flawed, wasted data at scale too.
Dependence
When high-tech systems enter laboratories, we may see increasing differentiation between labs.
It is not only about having the financial resources...
Although this super-slim plate reader looks very cool and might save resources, innovations always come with a price tag. That is not to say we should not adopt them, but it means we need to be aware that we might increase the gap between labs depending on their funding.
Requiring more specialized instruments from specific manufacturers to conduct or reproduce experiments may further fragment the research field.
This also leads to a strong dependence on specific manufacturers. In fully automated labs, it may not be possible to replace one component with another from a different supplier.
As shown here, automated cell culture systems or integrated plate reader systems can save a lot of time. However, once a lab is set up, it might become harder to switch to another manufacturer - just like replacing an employee. And as shown on the far right, if you have established a tracking or tagging system, change becomes a hassle. Nothing impossible, but something that gives providers more leeway.
Of course, these systems must also be optimized to work sustainably, but optimization requires highly skilled personnel - or once again, the manufacturer.
By the same token, troubleshooting complex systems is difficult - and in the case of some AI solutions, impossible, as we do not understand their exact workings.
Applying The Knowledge
Automation engineers and well-trained scientists will be critical to ensure automated systems are integrated and optimized properly.
For example, Croxatto et al. have shown that one can save time and improve sensitivity, but doing so required specific expertise.
In projects I have been involved in as an advisor, one challenge was that robotic pipetting systems are less flexible, meaning that they require either all pipette tips to be reused or none at all.
All of these risks don’t mean we should reject automation. As Cain-Hom et al. have reported, they were able to improve data quality and Cq consistency by 75% while reducing assay reagents by roughly 50% in their qPCR workflow (NTC: no-template control). We just have to make sure we transition consciously - for example, while optimization can require stopping processes and reconfiguring systems, the associated savings can be greater thanks to the leverage of automation.
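A quick sketch of what a ~50% reagent reduction, as reported by Cain-Hom et al., can amount to over a year. The reaction volume and plate throughput below are assumed values, not figures from the paper.

```python
# Reagent savings from ~50% smaller qPCR reactions (Cain-Hom et al.).
# Reaction volume and plate throughput are assumptions for illustration.

reaction_volume_ul = 10.0   # assumed manual reaction volume in microliters
reactions_per_plate = 384
plates_per_year = 200       # assumed lab throughput

manual_ml = reaction_volume_ul * reactions_per_plate * plates_per_year / 1000
automated_ml = manual_ml * 0.5   # ~50% reduction, from the text

print(manual_ml)     # 768.0 ml of master mix per year
print(automated_ml)  # 384.0 ml after automation
```

Under these assumptions, the lab would save hundreds of milliliters of master mix a year, along with the tips and plates that go with it.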
As we discussed, to avoid scaling the wrong solutions, we must establish and follow best practices.
Part of this means that we have to think long term. AI systems need to be trained on appropriate datasets, including negative results.
At the same time, data should include sufficient metadata and be broadly available to enable transparency and reproducibility.
All in all, this means that we need to reflect on how we aim to investigate and reinvest the time and capacity gained through automation into thoughtful experimental design.
References
Souter, N.E., et al., 2024. Measuring and reducing the carbon footprint of fMRI preprocessing in fMRIPrep. Human Brain Mapping, 45(12), e70003. doi:10.1002/hbm.70003.
Croxatto, A., et al., 2015. Comparison of inoculation with the InoqulA and WASP automated systems with manual inoculation. Journal of Clinical Microbiology, 53(7), pp.2298–2307. doi:10.1128/JCM.03076-14.
Cain-Hom, C., et al., 2016. Mammalian genotyping using acoustic droplet ejection for enhanced data reproducibility, superior throughput, and minimized cross-contamination. Journal of Laboratory Automation, 21(1), pp.37–48. doi:10.1177/2211068215601637.
If you have a wish or a question, feel free to reply to this email. Otherwise, I wish you a beautiful week! See you again on the 26th : )
Edited by Patrick Penndorf Connection@ReAdvance.com Lutherstraße 159, 07743, Jena, Thuringia, Germany Data Protection & Impressum If you think we do a bad job: Unsubscribe