
Key Points
- ChatGPT uses about 500 ml of water per query for data center cooling, equivalent to a small water bottle
- Two primary cooling systems, evaporative cooling and air conditioning, consume massive water volumes
- Electricity demands from millions of daily users could power a city of 100,000 residents
- Data centers in water-scarce regions like Arizona and parts of India exacerbate local water crises
- AI’s carbon footprint could reach 1% of global emissions by 2027 if powered by fossil fuels
- Tech giants are exploring liquid immersion cooling and renewable energy to reduce environmental impact
When users type a question into ChatGPT, they rarely consider the environmental resources consumed beyond electricity. However, a comprehensive report by the Washington Post and University of California, Riverside reveals that each query consumes approximately 500 milliliters of water, roughly the amount in a standard disposable water bottle. This consumption is not direct but occurs through the cooling systems that prevent the massive server arrays from overheating. For context, if ChatGPT’s 100 million weekly active users each ask just one question, that translates to 50 million liters of water consumed weekly, enough to fill 20 Olympic-sized swimming pools. The researchers based their calculations on OpenAI’s data center locations, cooling technologies, and the average computational power required for each response, providing the first detailed public assessment of AI’s water footprint.
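The weekly total above follows from simple multiplication. A minimal back-of-envelope check, using only figures from the text (500 ml per query, 100 million weekly users asking one question each) plus the standard ~2.5-million-liter volume of an Olympic-sized pool:

```python
# Back-of-envelope check of the article's weekly water figure.
LITERS_PER_QUERY = 0.5           # 500 ml per query (from the text)
WEEKLY_QUERIES = 100_000_000     # 100M weekly users, one question each (from the text)
OLYMPIC_POOL_LITERS = 2_500_000  # standard Olympic pool volume

weekly_liters = LITERS_PER_QUERY * WEEKLY_QUERIES
pools = weekly_liters / OLYMPIC_POOL_LITERS
print(f"{weekly_liters:,.0f} L/week ≈ {pools:.0f} Olympic pools")  # 50,000,000 L/week ≈ 20 pools
```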
Why AI Data Centers Require Immense Water Resources
Large AI models like ChatGPT run on thousands of specialized servers housed in massive data centers that operate around the clock. These servers, particularly those built around NVIDIA’s H100 GPUs and Google’s TPUs, generate enormous heat: each rack can draw up to 50 kilowatts of power, nearly all of which is ultimately released as heat. Two main cooling systems manage this load. First, evaporative cooling systems use water to absorb heat as it evaporates, a highly efficient method that can reduce temperatures by 15-20 degrees Celsius but loses significant water to evaporation. Second, traditional air conditioning units consume water in their cooling towers and humidity-control systems. Together, these processes account for the average half liter of water per question, though actual usage varies with query complexity, server load, and ambient temperature. In hot climates, water consumption can rise by 40% as cooling systems work harder to hold the optimal operating range of 18-27 degrees Celsius.
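The evaporative path can be sanity-checked with basic thermodynamics. The sketch below is an idealized upper bound, not a figure from the article: it assumes all of a rack’s heat is rejected by evaporation alone, and the latent-heat constant is a standard physical value.

```python
# Idealized upper bound: liters of water evaporated per hour to carry away the
# heat of one 50 kW rack (power figure from the text), assuming ALL heat is
# rejected by evaporation. Real facilities mix evaporative and dry cooling,
# so actual per-rack consumption is lower.
LATENT_HEAT_J_PER_KG = 2.45e6   # latent heat of vaporization of water near room temp
RACK_POWER_W = 50_000           # one rack's power draw ≈ its thermal output

heat_per_hour_j = RACK_POWER_W * 3600                      # joules released per hour
liters_per_hour = heat_per_hour_j / LATENT_HEAT_J_PER_KG   # 1 kg of water ≈ 1 L
print(f"≈ {liters_per_hour:.0f} L evaporated per rack-hour")
```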
Electricity Consumption Reaches City-Scale Proportions
Along with water, AI systems require staggering amounts of electricity to operate. According to reports from the International Energy Agency, a single ChatGPT query consumes approximately 2.9 watt-hours of electricity, nearly ten times more than a standard Google search. As ChatGPT’s usage grows, with millions of people using it daily for work, education, and creative tasks, its energy requirements could soon equal the electricity consumption of an entire city of 100,000 residents. A widely cited study by researchers at the University of Massachusetts Amherst found that training a single large language model can emit as much carbon as five cars over their entire lifetimes. If current growth trends continue, AI’s total electricity consumption could reach 85-134 terawatt-hours annually by 2027, comparable to the annual electricity consumption of the Netherlands. This is particularly concerning in countries where electricity generation still relies heavily on coal or other polluting sources, such as India (75% fossil fuels) and China (60% coal), where many AI data centers are located.
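The per-query figure can be annualized with simple arithmetic. In this sketch the 2.9 Wh/query value comes from the text; the 0.3 Wh search baseline and the 200-million-queries-per-day volume are illustrative assumptions, not figures from the article.

```python
# Rough annualization of query-level energy use.
WH_PER_QUERY = 2.9            # per ChatGPT query (from the text)
WH_PER_GOOGLE_SEARCH = 0.3    # commonly cited search baseline (assumption)
DAILY_QUERIES = 200_000_000   # hypothetical daily query volume (assumption)

ratio = WH_PER_QUERY / WH_PER_GOOGLE_SEARCH
annual_gwh = WH_PER_QUERY * DAILY_QUERIES * 365 / 1e9
print(f"~{ratio:.0f}x a search; ~{annual_gwh:.0f} GWh/year at this volume")
```

Note that query-serving energy alone lands far below the 85-134 TWh projection; that range also covers model training, idle capacity, and non-chat AI workloads.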
Regional Water Scarcity Exacerbated by Data Centers
The growing environmental impact of AI is creating acute problems in regions already facing water scarcity. Data centers located in drought-prone areas like Arizona, California, and parts of India are exacerbating local water crises. In Arizona, where major tech companies have built data centers to take advantage of tax incentives, each facility can consume up to 1.8 million liters of water daily, equivalent to the water usage of 6,000 households. In India, where water stress affects 600 million people, new data centers in cities like Hyderabad and Noida are competing with agriculture and residential needs for limited water resources. The constant demand for electricity puts additional strain on energy resources, and if this energy doesn’t come from renewable sources, carbon emissions increase rapidly. A recent analysis showed that Microsoft’s data centers in Arizona consumed 250 million liters of water in 2024, a 30% increase from 2023, directly correlating with the expansion of AI services like Copilot and OpenAI integration.
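The Arizona comparison can be checked by dividing the two figures from the text; the result is the per-household daily water use the comparison implicitly assumes.

```python
# Sanity check of the Arizona data-center comparison (both inputs from the text).
FACILITY_LITERS_PER_DAY = 1_800_000  # one facility's daily water use
HOUSEHOLDS = 6_000                   # households cited as equivalent

liters_per_household = FACILITY_LITERS_PER_DAY / HOUSEHOLDS
print(f"implies ≈ {liters_per_household:.0f} L per household per day")
```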
Industry Responses and Mitigation Strategies
Tech companies are beginning to acknowledge and address AI’s environmental impact. Google has pledged to use only carbon-free energy by 2030 and is investing in advanced cooling technologies, including liquid immersion cooling where servers are submerged in non-conductive liquid, reducing water consumption by 90%. Microsoft is experimenting with underwater data centers that use seawater for cooling and is building new facilities in regions with abundant renewable energy, such as Iceland and Sweden. OpenAI has partnered with renewable energy providers and is exploring the use of nuclear power for its data centers. Some companies are developing more efficient AI models that require less computational power, such as Mistral’s Mixtral and Google’s Gemini Nano, which can run on local devices. Researchers are also working on “sparsity” techniques that activate only necessary parts of neural networks for each query, potentially reducing energy consumption by 50%. However, these solutions are still in early stages, and the rapid growth in AI adoption is outpacing efficiency gains.
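The “sparsity” idea described above can be sketched as top-k expert gating, the mixture-of-experts routing used by models like Mixtral: only k of n expert networks run for each input, so per-query compute scales with k rather than n. Everything below (sizes, weights, the gating function) is illustrative, not the internals of any named model.

```python
import math
import random

# Toy mixture-of-experts forward pass: route each input to the TOP_K
# highest-scoring experts and skip the rest entirely.
random.seed(0)
N_EXPERTS, TOP_K, DIM = 8, 2, 4

def rand_matrix(rows, cols):
    return [[random.gauss(0, 1) for _ in range(cols)] for _ in range(rows)]

experts = [rand_matrix(DIM, DIM) for _ in range(N_EXPERTS)]  # n expert weight matrices
gate = rand_matrix(DIM, N_EXPERTS)                           # gating weight matrix

def matvec(m, v):
    # v^T @ m for a rows-by-cols nested-list matrix
    return [sum(m[i][j] * v[i] for i in range(len(v))) for j in range(len(m[0]))]

def sparse_forward(x):
    scores = matvec(gate, x)                                  # gating score per expert
    top = sorted(range(N_EXPERTS), key=lambda i: scores[i])[-TOP_K:]
    z = sum(math.exp(scores[i]) for i in top)
    weights = {i: math.exp(scores[i]) / z for i in top}       # softmax over top-k only
    out = [0.0] * DIM
    for i in top:                                             # skipped experts cost nothing
        y = matvec(experts[i], x)
        out = [o + weights[i] * yj for o, yj in zip(out, y)]
    return out

print(len(sparse_forward([0.5, -0.2, 0.1, 0.9])))  # 4
```

With TOP_K = 2 of 8 experts active, each query pays for a quarter of the expert compute while the model keeps the capacity of all eight.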
Future Implications and Sustainable Development
As AI continues to develop, its environmental impact will need to be taken seriously: the industry must adopt sustainability as a core design principle, not an afterthought. Policymakers are beginning to take notice. The European Union’s AI Act includes provisions requiring environmental impact assessments for large AI systems, and India is considering similar regulations. Some experts propose a “green AI” certification system that would inform users about the environmental cost of their queries, similar to carbon footprint labels on products. Others suggest time-based usage incentives, encouraging users to access AI services during off-peak hours when renewable energy is abundant. The development of more efficient hardware, such as photonic chips that use light instead of electrons, could revolutionize AI’s energy efficiency in the coming decade. For now, individual users can reduce their impact by batching queries, using AI judiciously, and choosing services powered by renewable energy. The challenge is to balance AI’s transformative potential with planetary boundaries, ensuring that the quest for artificial intelligence does not compromise natural resources that future generations will depend on.
