The hidden cost of AI: Unpacking its energy and water footprint

This blog post is cross-published with the IEEE.
On 12 February 2025, the OECD and IEEE co-organised an event on the margins of the French AI Action Summit, bringing together diverse experts to discuss AI’s growing environmental challenges. The event’s three main sessions covered critical sustainability concerns: the environmental cost of inference, the impact of data centres on the electricity grid, and AI’s water footprint.
The sessions’ main takeaways will feed into the IEEE P7100 Working Group’s work to develop a technical standard to measure AI’s environmental impact. Here is a summary of the day’s discussions.
Session 1: The environmental cost of inference
While AI has revolutionised countless industries, its environmental impact is a pressing issue. Energy-intensive training of AI models receives a lot of attention, but it is during the inference stage, when trained AI models are put into use, that the significant long-term environmental cost is becoming more apparent. Discussions focused on how inference, which at scale could account for more energy consumption over time than training, presents a growing challenge for sustainability.
Growing energy consumption from AI inference
Panellists recognised AI inference’s substantial energy requirements, especially as models scale and become more complex. One speaker pointed out that, for some models, the cost of inference now outweighs that of training after 50 million uses, and that current incentives do not encourage companies to optimise AI’s energy use. This lack of incentive leads companies to prioritise financial gain over sustainability, underscoring the need for stronger regulatory frameworks to balance technological progress with environmental responsibility. The moderator stressed that optimising the inference process could minimise energy consumption by reducing, for instance, unnecessary computations at the model level.
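The training-versus-inference trade-off above is essentially a break-even calculation. The sketch below illustrates it with hypothetical placeholder figures (the energy values are assumptions for illustration, not measurements cited by the panel):

```python
import math

# All figures are illustrative placeholders, not panel data.
TRAINING_ENERGY_KWH = 1_500_000   # assumed one-off training energy cost
ENERGY_PER_QUERY_KWH = 0.03       # assumed energy per inference query


def break_even_queries(training_kwh: float, per_query_kwh: float) -> int:
    """Number of queries after which cumulative inference energy
    matches the one-off training energy."""
    return math.ceil(training_kwh / per_query_kwh)


print(break_even_queries(TRAINING_ENERGY_KWH, ENERGY_PER_QUERY_KWH))
# With these placeholder figures: 50,000,000 queries
```

The point of the sketch is that the break-even threshold depends entirely on per-query efficiency: halving the energy per query doubles the number of uses before inference dominates, which is why optimisation at the inference stage matters so much at scale.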
The role of developers in optimising AI inference to reduce the carbon footprint
Developers play a crucial role in minimising the environmental impact of AI. One speaker advocated for critical assessments of software development choices and prioritisation of smaller, more efficient AI models. One panellist said this ties into the growing demand for “green skills” in AI-related professions. Skills such as energy management and environmental, social, and governance (ESG) policy competence are becoming increasingly valuable in reducing AI’s carbon footprint.
Evaluating energy efficiency claims
One speaker also noted that while some models, such as DeepSeek, are more energy efficient during training, uncertainties remain about their efficiency during inference, especially when they spend longer reasoning over queries. There is still no research on the environmental impact of these recent developments, and transparent evaluations are needed to gauge AI’s environmental cost realistically.
Session 2: The impact of data centres on the electricity grid
As AI expands, data centres, the backbone of AI systems, are increasing pressure on the world’s electricity grids. These massive facilities consume vast amounts of energy, which raises concerns about sustainability, grid stability and reliability.
Data centres as major energy consumers
The moderator highlighted the difficulty of obtaining reliable estimates of data centres’ energy consumption, and of determining how much of that consumption can be specifically attributed to AI, making it challenging to fully understand the magnitude of the problem. However, speakers noted that while AI contributes to growing energy demand, it is not the most significant driver of global electricity consumption. For example, the electrification of the transportation sector outpaces AI in electricity use. One panellist mentioned the importance of real-time cloud energy tracking to reduce uncertainty in emissions calculations.
Renewable energy and carbon offsetting efforts
Some companies, like Google, have committed to clean electricity in data centres by 2030. However, one speaker warned that even with these efforts, renewable energy supplies may not be sufficient to meet demand by 2025. Given AI’s 24/7 energy needs, it may continue to rely on fossil fuels unless there are comprehensive policy changes. One participant noted that we often overlook the role nuclear energy, including small modular reactors (SMRs), could play in AI sustainability. Another broader initiative focuses on reusing heat generated by data centres, such as AWS’s projects in Ireland. However, scaling these efforts responsibly remains a significant challenge.
Modulating energy demand
Other options were also mentioned, including the siting of data centres and the use of mobile devices for inference, which could help offload energy demand, since mobile phones typically charge overnight when grid demand is lower.
Session 3: AI water usage
AI-driven data centres don’t just consume massive amounts of energy: they also require vast quantities of water, both for cooling and in other parts of the AI lifecycle, including producing AI-specific hardware and water-intensive electricity generation. In regions where water is scarce, this increases the strain on the local water supply and raises serious concerns. The moderator highlighted that water consumption hasn’t received the same level of attention as energy consumption, that calculations are complex and lack commonly accepted metrics, and that water use is not included in regulatory compliance requirements, which may explain why it remains under-measured.
AI’s hidden water footprint
One speaker explained that AI models, particularly large-scale ones, require significant amounts of water to cool servers. Data centres often use fresh water for cooling, which leads to substantial water loss through evaporation. This loss affects the environment and poses risks to the communities that depend on this water for drinking and agriculture. To further exacerbate the issue, AI has an indirect water footprint: semiconductor manufacturing consumes water, as does generating electricity for AI operations.
Measuring AI’s water footprint
A key challenge in addressing AI’s water usage is the lack of standardised tools for measuring water consumption. One speaker explained that the use of water in data centres can be divided into embodied use and operational use. Embodied use, the water consumed in manufacturing hardware, represents about 30% of AI-related water consumption; operational use, the water used to cool data centres during AI model processing, accounts for the remaining 70%. However, one speaker pointed out that many companies use energy consumption as a proxy for water use, and these proxy methods can have an error margin of up to 500%, often leaving companies unaware of the true scale of their water consumption.
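The embodied/operational split and the proxy-based uncertainty described above can be made concrete with a small sketch. Water usage effectiveness (WUE, litres per kWh) is an established data-centre metric, but the numeric values below are illustrative assumptions, not figures from the panel:

```python
# Shares from the speaker's breakdown of AI-related water consumption.
OPERATIONAL_SHARE = 0.70   # cooling and AI model processing
EMBODIED_SHARE = 0.30      # hardware manufacturing


def total_water_litres(it_energy_kwh: float, wue_l_per_kwh: float) -> float:
    """Estimate total AI-related water use from operational energy,
    scaling up so the embodied share is also accounted for."""
    operational = it_energy_kwh * wue_l_per_kwh
    return operational / OPERATIONAL_SHARE


def uncertainty_band(estimate_litres: float, max_error: float = 5.0):
    """A proxy estimate with up to 500% error spans a wide band
    around the central value."""
    return (estimate_litres / max_error, estimate_litres * max_error)


# Hypothetical example: 1,000 kWh at an assumed WUE of 1.8 L/kWh.
estimate = total_water_litres(1_000, 1.8)
low, high = uncertainty_band(estimate)
```

The band returned by `uncertainty_band` spans a 25-fold range, which illustrates why direct water metering, rather than energy-based proxies, is needed before standardised reporting becomes meaningful.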
Geographical and ethical considerations
One speaker signalled that global water allocation is already unsustainable: we are currently not on track to meet the Sustainable Development Goal 6 water targets, and AI is an exacerbating factor. AI water consumption is a social justice issue. Many data centres are in water-scarce regions, which can worsen local water scarcity and create competition for resources. All speakers emphasised that water is a local issue.
The Pricing Paradox: Water as an undervalued resource
One participant raised an important question: should water be priced differently to incentivise conservation? Speakers argued that water is often treated as a “free” resource, but this mindset fails to account for the risks associated with water scarcity. As a result, there is a gap between the cost of water and the value attributed to it. By adjusting water pricing to reflect its true cost, companies and consumers could be incentivised to use it more efficiently.
Transparency and accountability
To address the issue of AI’s water usage, one speaker suggested that companies disclose their water consumption as part of their sustainability efforts. An industry representative noted that some new data centres use non-potable water sources, such as canal water or seawater, to reduce their reliance on freshwater, while others treat wastewater on-premises to cut freshwater use.
The path forward: Collaboration is key
The event had a recurring call for collaboration across all sectors. The OECD emphasised the need for multi-stakeholder cooperation involving (local) governments, industries, academia, citizens, indigenous populations, and standard-setting bodies like IEEE to ensure that AI development aligns with sustainability goals. The IEEE echoed that sentiment, stressed the importance of standardised metrics to measure and manage AI’s environmental impact comprehensively, and invited participants to join the IEEE P7100 standardisation effort.
The discussion closed with participants recognising both the progress made and the work that remains. The complexity of AI’s environmental, energy, and water challenges requires ongoing research, policy innovation, and technological advances. The good news is that as AI continues to evolve, so does our understanding of its environmental footprint. By working together, we can create a future where AI is both innovative and sustainable.
The environmental impact of AI is an urgent issue that requires collective efforts. As it continues to shape the future, we must prioritise sustainability at every stage of its lifecycle, from model development to deployment. By aligning our efforts and sharing knowledge, we can mitigate the environmental costs of AI and build a greener, more sustainable future for all.