[475 Cover] From Innovation to Exhaustion: How AI growth deepens climate and water risks
Artificial intelligence (AI) has rapidly transformed modern life. Large language models (LLMs) derive their capabilities from extensive training on massive, diverse datasets. This foundational training enables them to process and generate human-like language in real time, delivering unprecedented efficiencies and conveniences across a spectrum of industries, from finance to healthcare and media. The launch of OpenAI’s ChatGPT sparked a global surge in AI adoption. However, this rapid wave of innovation carries an environmental cost that users often overlook. While AI is easily accessible online, few users grasp the enormous energy and water consumption of the data centers that quietly power these sophisticated systems. As discussed below, the global expansion of AI is driving a dramatic increase in data center water consumption and severely straining water resources, making this an urgent environmental concern.
AI data centers emerge as new environmental hotspots
It is well known that AI consumes tremendous amounts of electricity. Models like ChatGPT and Gemini rely on high-performance GPUs and other accelerators to train on and process massive volumes of data. These processors consume vast amounts of electricity and generate intense heat, requiring constant cooling. To prevent the chips from overheating and failing, data centers run power-intensive cooling systems nonstop.
Traditional web servers have long required cooling, but the surge of generative AI has pushed data center energy consumption to unprecedented heights. Since ChatGPT’s public release in late 2022, the scale of computing infrastructure and the energy required to sustain it have grown exponentially. Global demand for AI computing capacity has triggered a race among tech giants to build more data centers, often in regions with cheaper electricity and abundant water resources. However, this expansion raises serious concerns about sustainability and environmental impact.
A single AI prompt on services like Gemini or ChatGPT uses 10 to 40 times more power than a typical web search. This stark disparity is rooted in the underlying process: unlike a search engine, which merely retrieves and ranks existing data, generative AI must run the request through a massive neural network to synthesize a completely novel response. This involves activating billions of parameters simultaneously, each contributing complex mathematical calculations in real time. The cumulative effect of these computations translates into extraordinary energy consumption, even for seemingly simple tasks such as writing a paragraph or generating an image.

In the past, simple fan-based air cooling was sufficient, but AI facilities now rely on massive quantities of water for cooling. Modern data centers circulate water through heat exchangers and cooling towers to maintain stable temperatures, often evaporating thousands of liters daily. The environmental consequences are particularly acute in drought-prone regions where water scarcity is already a pressing issue.

New research highlights the steep energy price of generative AI inference. According to a 2024 analysis by The Washington Post and University of California researchers, generating a simple 100-word email with ChatGPT demands approximately 0.14 kWh of electricity, the equivalent of powering 14 LED bulbs for an hour. The water footprint is equally staggering: the same request consumes roughly 519 mL of water, slightly more than a standard bottle of drinking water. Multiplied across the billions of queries processed daily worldwide, the cumulative environmental footprint becomes enormous.
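To make the scale of these figures concrete, the short sketch below multiplies the per-request numbers cited above by an assumed daily query volume. The one-billion-requests figure is a hypothetical round number chosen for illustration, not a measured statistic.

```python
# Back-of-the-envelope footprint estimate using the per-request figures
# cited above (0.14 kWh and 519 mL per ~100-word ChatGPT email, per the
# Washington Post / University of California analysis). The daily query
# volume is an assumed round number, not a measured statistic.

ENERGY_PER_REQUEST_KWH = 0.14     # electricity per 100-word email
WATER_PER_REQUEST_L = 0.519       # water per 100-word email
QUERIES_PER_DAY = 1_000_000_000   # assumed: one billion requests per day

daily_energy_gwh = ENERGY_PER_REQUEST_KWH * QUERIES_PER_DAY / 1e6    # kWh -> GWh
daily_water_megaliters = WATER_PER_REQUEST_L * QUERIES_PER_DAY / 1e6  # L -> ML

print(f"Energy: {daily_energy_gwh:,.0f} GWh per day")          # 140 GWh/day
print(f"Water:  {daily_water_megaliters:,.0f} ML per day")     # 519 ML/day
```

Under these assumptions, a billion such requests would draw roughly 140 GWh of electricity and evaporate about 519 million liters of water every day; the point is not the exact totals but how quickly small per-query costs compound.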
As AI continues to integrate into daily life, from education and entertainment to healthcare and finance, the question of sustainability grows increasingly urgent. The challenge now lies in balancing innovation with responsibility, ensuring that the technology powering the future does not quietly drain the planet’s most vital resources.
Data centers spark local outrage over water use in the southwestern U.S.
In the arid U.S. states of Arizona, Nevada, and California, data centers are intensifying local water shortages. The Washington Post has reported growing public anger as residents link major tech facilities to the declining water levels of the Colorado River, the main water source for some 40 million Americans. After two decades of drought, parts of the river have dropped to record lows.
Mesa, a city roughly 320 km east of the Colorado River, relies heavily on the river and its tributaries for most of its water supply. Droughts are natural, but the construction of new data centers by companies like Meta has deepened fears of scarcity. Data centers consume enormous amounts of energy and use millions of liters of cooling water daily. According to The Washington Post, a single large data center requires an estimated one million to five million gallons of water per day. This staggering volume is comparable to the daily consumption of a mid-sized town of 10,000 to 50,000 residents, which works out to roughly 100 gallons per person per day, in line with typical U.S. residential use.
The city of Mesa, home to approximately 500,000 residents, faces a severe impending crisis: when Meta completes its three new data centers by 2026, they are expected to use up to 80% of the city’s entire current water supply. Similar issues have surfaced across the country. In The Dalles, Oregon, Google’s data center is reportedly responsible for consuming a staggering 25% of the city’s total water supply. In Los Lunas, New Mexico, local farmers have staged protests against government officials over the approval of Meta’s data center expansion plans.
Cooling and water conservation strategies in AI infrastructure
To achieve sustainable growth, developers are turning to advanced water-saving technologies such as closed-loop cooling, immersion cooling, and air-based cooling systems that reuse or minimize water. Closed-loop systems recycle both wastewater and freshwater, enabling multiple cycles of use. Cooling towers use outside air to lower water temperatures, reducing freshwater consumption by as much as 70%. These systems are increasingly being adopted in hyperscale data centers operated by companies like Microsoft and Meta, where even minor efficiency gains can translate into enormous resource savings. By keeping water and coolant within a sealed circuit, closed-loop designs also help reduce contamination and maintenance costs. Some facilities integrate sensors and AI algorithms to monitor flow rates and temperature in real time, optimizing cooling performance while minimizing waste. This combination of automation and sustainability demonstrates how digital intelligence can improve the efficiency of the very systems that support it.
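As an illustration of the kind of sensor-driven control just described, here is a minimal, hypothetical sketch of a feedback loop that nudges coolant pump flow toward a temperature setpoint. Real facilities use far more sophisticated controllers, often model-based or machine-learned; the sensor read here is simulated, and every threshold and name below is an assumption for demonstration only.

```python
import random
import time

# Hypothetical simplified control loop for a closed-loop cooling circuit:
# read a coolant temperature sensor, then adjust pump flow up or down to
# hold a setpoint. This only illustrates the feedback idea described above.

SETPOINT_C = 30.0      # target coolant return temperature (assumed)
TOLERANCE_C = 1.0      # dead band to avoid constant pump adjustments
FLOW_STEP_LPM = 5.0    # flow change per adjustment, liters per minute
MIN_FLOW, MAX_FLOW = 50.0, 400.0

def read_coolant_temp_c() -> float:
    """Stand-in for a real sensor read; returns a simulated value."""
    return random.uniform(27.0, 34.0)

flow_lpm = 150.0
for _ in range(10):                          # ten control ticks for the demo
    temp = read_coolant_temp_c()
    if temp > SETPOINT_C + TOLERANCE_C:      # too hot: increase flow
        flow_lpm = min(MAX_FLOW, flow_lpm + FLOW_STEP_LPM)
    elif temp < SETPOINT_C - TOLERANCE_C:    # cool enough: save pump energy
        flow_lpm = max(MIN_FLOW, flow_lpm - FLOW_STEP_LPM)
    print(f"temp={temp:.1f} C  flow={flow_lpm:.0f} L/min")
    time.sleep(0.1)                          # real loops run on slower cycles
```

The dead band is the key design choice: by tolerating small deviations from the setpoint, the controller avoids wasting pump energy on constant micro-adjustments, the same trade-off production systems tune at much larger scale.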
Free cooling methods use cold outdoor air to lower equipment temperatures, a practice effective mainly in cooler climates. Air-cooling systems use ventilation ducts to dissipate heat from chips and circuits, making them suitable for areas where power is inexpensive and water is scarce. This method has gained traction in regions like Northern Europe and Canada, where natural air temperature remains low for most of the year. In these environments, operators can drastically reduce the need for mechanical chillers. Some data centers even take advantage of underground or coastal locations, channeling naturally cool air through heat exchangers. Although air-cooling is limited by seasonal variations, it serves as an important model for sustainable infrastructure design, showing that geography can be leveraged to offset environmental costs.
Immersion cooling submerges servers and chips in non-conductive liquid contained within sealed tanks, efficiently dissipating heat. The heated fluid flows to a heat exchanger, cools, and then recirculates through the system. Although immersion systems cost more upfront than conventional liquid cooling, they deliver major energy savings and greater space efficiency while using far less water overall.
Shifting toward renewable energy and smarter efficiency
Data centers powered by renewable sources like solar and wind consume far less water than those relying on fossil fuels. Nevertheless, about 56% of the electricity consumed by data centers globally is still generated from fossil fuels, and expanding the use of clean energy could sharply reduce overall water consumption. However, transitioning to renewable energy on a global scale is not a simple process. Solar and wind power are intermittent by nature, meaning their output fluctuates with weather and daylight. To ensure a consistent supply for data centers that must operate 24 hours a day, companies are now investing in large-scale battery storage systems and hybrid grids that combine renewable sources with traditional backup power. In regions with abundant sunlight, such as the southwestern United States or southern Europe, data centers are experimenting with on-site solar farms, while northern countries like Finland and Sweden are capitalizing on wind energy and naturally cold climates to reduce cooling needs.
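A rough sketch of why the generation mix matters for water follows. Thermoelectric fossil plants consume cooling water for every kilowatt-hour they generate, while wind and solar photovoltaics consume almost none during operation. The water-intensity values and the annual demand figure below are assumed round numbers for illustration, not authoritative data; only the roughly 56% fossil share comes from the text above.

```python
# Illustrative only: water embedded in data center electricity under two
# grid mixes. Intensity values are assumed round numbers, not measurements.

WATER_L_PER_KWH = {
    "fossil_thermal": 1.8,   # assumed consumptive cooling use, L per kWh
    "renewable": 0.05,       # assumed near-zero operational use (wind/PV)
}

ANNUAL_DEMAND_KWH = 4.0e11   # hypothetical annual data center demand

def water_for_mix(fossil_share: float) -> float:
    """Liters of water embedded in electricity for a given fossil share."""
    return ANNUAL_DEMAND_KWH * (
        fossil_share * WATER_L_PER_KWH["fossil_thermal"]
        + (1 - fossil_share) * WATER_L_PER_KWH["renewable"]
    )

today = water_for_mix(0.56)      # the ~56% fossil share cited above
greener = water_for_mix(0.20)    # hypothetical cleaner future mix
print(f"Saved: {(today - greener) / 1e9:,.0f} billion liters per year")
```

Even with these placeholder figures, shifting the mix from 56% to 20% fossil would cut the embedded water by hundreds of billions of liters a year, which is why the energy transition is also a water strategy.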
Tech giants such as Google and Amazon are exploring small modular reactors (SMRs) to power future data centers. SMRs are viewed as a potential breakthrough because they can provide stable, carbon-free energy in compact facilities that are easier to deploy than conventional nuclear plants. Proponents argue that these reactors could make large-scale computing both sustainable and resilient, though questions remain about safety regulation, radioactive waste management, and public acceptance. Nevertheless, the concept illustrates how far major technology companies are willing to go to meet the massive power demand driven by AI and cloud computing.

Data centers also increasingly use economizer systems, which direct filtered outside air or water into the cooling process when temperatures drop, cutting energy and water consumption. In addition to economizers, some facilities are adopting advanced liquid cooling methods that circulate non-conductive fluids directly over high-performance chips, significantly improving thermal efficiency. Others are experimenting with heat reuse systems that redirect waste heat from servers to warm nearby buildings or greenhouses, turning a byproduct of computing into a useful energy source. In countries like Denmark and the Netherlands, such systems are already part of urban sustainability strategies. Collectively, these innovations signal a growing commitment within the tech industry to balance digital expansion with environmental responsibility.
The situation highlights the consequences of technological growth outpacing environmental responsibility. For AI to be genuinely transformative, sustainability must take precedence over speed. The true measure of AI’s future will lie not in its speed or power, but in its ability to operate in harmony with the planet that sustains it. Beyond innovation and efficiency, the next frontier of AI development must focus on minimizing its ecological footprint. Tech companies are racing to build faster models and larger data centers, yet few are investing equally in technologies that ensure long-term environmental balance. Integrating renewable energy, optimizing cooling systems, and designing energy-efficient algorithms are no longer optional; they are imperatives for a sustainable digital ecosystem.

The question is not whether AI can change the world; it already has. The real challenge is whether humanity can guide this technology responsibly, ensuring that progress does not come at the expense of the planet’s most vital resources. As the digital age accelerates, so too must our commitment to sustainability. The future of intelligence, both artificial and human, depends on our ability to innovate without depletion and to advance without eroding the very environment that makes progress possible.