Writing credit: Emilia Coverdale, Marketing Lead at Asperitas
In today's digital age, artificial intelligence (AI) is omnipresent, transforming industries and daily life with unprecedented speed. From generative AI integrated into Apple's hardware to Nvidia's soaring stock value driven by AI advancements, it's clear that AI is everywhere and in everything. However, this rapid proliferation of AI comes with a significant environmental cost, particularly through the energy-intensive datacentres that power these technologies. As we delve deeper into the intersection of AI and sustainability, it becomes crucial to understand the “green cost” of AI and explore sustainable thermal management to help mitigate the global environmental impact.
AI applications require substantial computational power, primarily provided by datacentres. These facilities, which house the servers and infrastructure needed to process and store data, consume vast amounts of electricity. According to the Tony Blair Institute for Global Change, datacentres will consume up to 13% of global electricity by 2030 if current trends continue (source). The energy use (and associated carbon emissions) of datacentres now rivals that of the aviation industry and is expected to double by 2026. Rutger de Haij, Asperitas CEO, comments:
“As the demand for AI-powered services continues to surge, so does the energy consumption of datacentres. At Asperitas, we’re dedicated to providing sustainable cooling solutions that not only meet increasing compute demands but do so with a significantly reduced environmental impact. Our thermal management expertise is helping datacentres transition to greener operations. By embracing sustainable cooling solutions, our customers ensure success in their growth as well as environmental stewardship.”
While many users might not perceive their devices as power-hungry, the reality is that the processes running AI applications are not localized. When we use AI-powered services like virtual assistants, recommendation systems and healthcare applications, the computations occur in energy-intensive datacentres rather than on our devices. This disconnect obscures the true environmental cost of our digital habits. But how do we lower our green cost and impact per scroll?
We see that governments and regulatory bodies are addressing the environmental impact of datacentres with stringent regulations. The Corporate Sustainability Reporting Directive (CSRD) in the European Union is one such initiative, aiming to enhance transparency and accountability for environmental, social, and governance (ESG) metrics. The CSRD mandates that companies, including datacentres with an energy consumption above 500kW, report annually on their environmental performance. This regulation will impact around 50,000 EU companies, and the target is to lower the EU's final energy consumption by 11.7% by 2030, a legally binding goal (source).
For datacentre operators, this means they must meticulously monitor and report on various aspects of their environmental performance, including energy consumption, cooling efficiency, and water usage. The updated thresholds will not only affect European datacentres but will also set a precedent for global operations, urging them to adopt more sustainable practices.
The rapid expansion of datacentres to support AI, particularly hyperscale datacentres, necessitates a robust and sustainable approach. Companies like Microsoft and Google are investing heavily in new datacentre infrastructure, with plans to build facilities worldwide, including a $1billion datacentre in Hertfordshire, UK. However, these investments must be coupled with a commitment to sustainable practices and clean energy solutions.
Governments play a crucial role in this transition. Policies mandating energy usage reporting and incentivizing clean energy investments are essential. The Tony Blair Institute for Global Change's report on "Greening AI" outlines a policy agenda that includes mandating reporting and promoting green certifications for datacentres and AI models to avoid greenwashing (source).
Let's talk about carbon in terms of trees – not to make an unverified green product claim, but to highlight the significant environmental savings for those who don't work in the datacentre landscape. This could be a conversation that takes place over dinner…
We know that carbon emissions contribute significantly to climate change, a crisis that threatens ecosystems worldwide. There is no doubt that immersion cooling lowers the carbon footprint of datacentres. Let's now translate these carbon savings into trees. For instance, a 1MW datacentre using immersion cooling instead of air cooling achieves carbon savings equivalent to the annual absorption of approximately 7,766 trees. The figure is symbolic, since no trees are literally being saved, but it helps contextualize the carbon emissions reductions. This helps us understand the green cost and impact per scroll.
Through the Asperitas TCO model, we can calculate the amount of carbon emissions saved through immersion cooling, and from that the equivalent number of trees needed to absorb that carbon. The calculation is based on the amount of carbon dioxide (CO2) a tree absorbs over a year: on average, a single tree absorbs about 21 kilograms of CO2 annually. A reduction of approximately 163,086 kilograms of CO2 per year through immersion cooling therefore equates to 7,766 trees (163,086 kg ÷ 21 kg/tree), a tangible way to understand the environmental impact.
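For readers who want to check the arithmetic, the tree-equivalence conversion above can be sketched in a few lines. The 21 kg/tree absorption rate and the 163,086 kg savings figure come from the article; the function name is illustrative and not part of any Asperitas tool.

```python
CO2_ABSORBED_PER_TREE_KG = 21  # average CO2 a single tree absorbs per year (kg)

def tree_equivalent(co2_saved_kg: float) -> int:
    """Convert annual CO2 savings (kg) into an equivalent number of trees."""
    return round(co2_saved_kg / CO2_ABSORBED_PER_TREE_KG)

# Savings cited for a 1MW datacentre moving from air to immersion cooling
annual_co2_saved_kg = 163_086
print(tree_equivalent(annual_co2_saved_kg))  # → 7766
```

The same function works for any annual CO2 savings figure, so the conversion scales directly with facility size.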
"Immersion cooling technology is the game-changer for datacentre operations as it drastically reduces energy consumption and operational costs.” Says Andy Young, Asperitas CTO.
“At Asperitas, we've engineered our solutions to optimize thermal load management, resulting in lower electricity usage and maintenance expenses. This directly translates to a reduced TCO, providing a compelling financial incentive for businesses to adopt sustainable practices."
The bottom line is this: the Asperitas TCO model calculates over €1 million in cost savings per year for a single 1MW datacentre that moves from air cooling to immersion cooling.
Asperitas champions sustainability through thermal management expertise and immersion cooling technology. The Perpetual Natural Convection (PNC) and Direct Forced Convection (DFC) platforms not only optimize cooling efficiency but also align with global green initiatives. By minimizing energy consumption and utilizing environmentally friendly fluids, Asperitas contributes significantly to reducing the carbon footprint of your datacentre.
Asperitas are the only datacentre thermal management experts to provide the full range of datacentre cooling solutions. The PNC platform offers ultra-low energy overhead (circa 100W overhead out of 44kW capacity per tank) and is the most efficient immersion cooling solution on the market. The DFC platform is designed for enhanced thermal performance and scalability, with high power cooling exceeding 2kW per processor. These solutions not only reduce the energy consumption of cooling systems but also allow for higher server density and better performance, aligning with the increasing computational demands of AI applications. This is particularly valuable for datacentres needing to comply with regulatory reporting requirements like the CSRD and Energy Efficiency Directive (EED). Here's how:
Cooling efficiency: Immersion cooling drastically reduces the energy required for cooling compared to traditional air-cooling systems. This translates to better Power Usage Effectiveness (PUE) metrics, a critical factor in environmental reporting.
Energy consumption: The improved efficiency of immersion cooling reduces overall energy consumption, allowing datacentres to meet and exceed energy performance targets set by regulations.
Water usage: Unlike many traditional cooling systems that rely heavily on water, immersion cooling systems use dielectric liquids, significantly cutting down water usage — a crucial aspect of sustainable operations.
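To make the PUE point above concrete, here is a rough illustration of how cooling overhead feeds into Power Usage Effectiveness (total facility energy divided by IT energy). It uses the PNC figures quoted earlier (circa 100W cooling overhead on a 44kW tank); the 40% air-cooling overhead is a hypothetical comparison value for illustration only, and a real PUE also includes power distribution, lighting and other facility loads, which are ignored here.

```python
def partial_pue(it_load_w: float, cooling_overhead_w: float) -> float:
    """Cooling-only PUE contribution: (IT load + cooling overhead) / IT load."""
    return (it_load_w + cooling_overhead_w) / it_load_w

# PNC immersion tank: ~100 W overhead on 44 kW of IT load
print(round(partial_pue(44_000, 100), 4))     # → 1.0023

# Hypothetical air-cooled equivalent at 40% cooling overhead (illustrative)
print(round(partial_pue(44_000, 17_600), 4))  # → 1.4
```

The gap between the two figures is why cooling efficiency dominates the PUE metrics that regulations like the CSRD ask operators to report.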
The energy demands of AI are immense and multifaceted. Training AI models, especially large language models (LLMs) like ChatGPT, involves processing terabytes of data and can take millions of hours of computation. Once trained, these models require substantial ongoing energy to handle queries and provide outputs. Putting this into context, training a single AI model can emit as much carbon as five cars over their lifetimes (source).
The disparity in energy use between traditional search engines and AI-powered search engines is stark. Studies suggest that AI-powered queries can use up to 30 times more energy than traditional text-based searches. More complex AI tasks, such as generating high-resolution images, can consume as much energy as half a smartphone charge per image, highlighting the significant energy costs of advanced AI operations.
Trees, car emissions, phone charges – more food for thought at our dinner table chat about green cost and impact per scroll. Now we understand.
We are in the middle of a technological revolution driven by AI, so it's imperative to ensure that this transformation is both sustainable and responsible. Tech doesn't run for free. The environmental impact of AI and the energy demands of datacentres are significant, with more and more IT equipment needing to be cooled in our datacentres. Effective thermal load management and regulatory frameworks can help mitigate these effects. Datacentres can do more to lower the green cost and impact per scroll and not be limited by cooling.
“We must demystify AI, including its climate impact, and hold both corporations and governments accountable for sustainable practices” says de Haij. “Just as we scrutinize the environmental impact of the cars we drive and the conditions in which our clothes are made, we must also consider the environmental footprint of our digital consumption. By prioritizing green technologies and policies, we can harness the power of AI while safeguarding our planet for future generations."