AI’s Environmental Impact Drives New Guidelines for Data Centres
At a glance
- UNEP published guidelines to reduce data centre energy and water use in June 2025
- Global data centres used about 415 TWh of electricity in 2024
- AI systems could emit up to 79.7 million tons of CO₂ annually in 2025
Artificial intelligence is increasingly recognized for its potential in climate solutions, but its expanding use also brings environmental challenges related to energy and water consumption.
The United Nations Environment Programme (UNEP) released Sustainable Procurement Guidelines for Data Centres and Servers on 12 June 2025. These guidelines are intended to help reduce the energy and water demands of facilities that support AI and digital infrastructure.
According to UNEP, global data centres consumed approximately 415 terawatt-hours of electricity in 2024, representing about 1.5% of worldwide electricity use. Martin Krause, Director of UNEP’s Climate Change Division, stated that data centres require substantial amounts of energy and water, and that this consumption is expected to rise.
Estimates for 2025 indicate that AI systems could be responsible for emitting between 32.6 and 79.7 million tons of carbon dioxide each year. Water use by these systems is projected to reach between 312.5 and 764.6 billion liters annually, highlighting the scale of resource demands associated with AI technologies.
What the numbers show
- Global data centres used about 415 TWh of electricity in 2024, or 1.5% of global supply
- AI systems in 2025 could consume up to 764.6 billion liters of water annually
- U.S. data centres used 66 billion liters of water in 2023, with potential to quadruple by 2028
In the United States, AI data centres are estimated to emit 24 to 44 million metric tons of carbon dioxide per year between 2024 and 2030. Water usage by these centres is projected at 731 to 1,125 million cubic meters annually during the same period.
Research indicates that AI servers in the U.S. could add 200 to 300 billion gallons to annual water consumption and 24 to 44 million metric tons of CO₂-equivalent emissions by 2030, reflecting the growing environmental footprint of AI infrastructure.
Data centres in the U.S. consumed around 66 billion liters of water in 2023. Projections suggest that this figure could double or even quadruple by 2028 as demand for AI and cloud services expands.
Analysis of AI model queries shows that a single short GPT-4o query uses about 0.43 watt-hours of electricity. At a scale of 700 million queries per day, this results in electricity consumption comparable to that of 35,000 U.S. homes and freshwater evaporation equivalent to the annual drinking needs of 1.2 million people.
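The per-query figure above can be scaled up with simple arithmetic. The sketch below multiplies the two numbers cited in the text (0.43 Wh per short query, 700 million queries per day) to get daily and annual electricity totals; it does not attempt the household or drinking-water comparisons, which depend on additional assumptions not given here.

```python
# Back-of-the-envelope scale-up of per-query energy, using only the
# figures cited in the text.

WH_PER_QUERY = 0.43        # watt-hours per short GPT-4o query (from the text)
QUERIES_PER_DAY = 700e6    # queries per day (from the text)

daily_wh = WH_PER_QUERY * QUERIES_PER_DAY
daily_mwh = daily_wh / 1e6           # Wh -> MWh
annual_gwh = daily_mwh * 365 / 1e3   # MWh/day -> GWh/year

print(f"{daily_mwh:.0f} MWh per day")     # ~301 MWh/day
print(f"{annual_gwh:.1f} GWh per year")   # ~109.9 GWh/year
```

At roughly 301 MWh per day (about 110 GWh per year), even a fraction of a watt-hour per query becomes a utility-scale load at this volume.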
While AI is often presented as a tool to address climate challenges, its resource requirements present new environmental considerations for data centre operators and policymakers. The UNEP guidelines aim to provide a framework for more sustainable management of these facilities as AI adoption continues to grow.
* This article is based on publicly available information at the time of writing.
Sources and further reading
- UNEP
- [2601.06063] The Environmental Impact of AI Servers and Sustainable Solutions
- [2505.09598] How Hungry is AI? Benchmarking Energy, Water, and Carbon Footprint of LLM Inference
- [2512.11863] Expert Assessment: The Systemic Environmental Risks of Artificial Intelligence