How Thirsty Is Your AI?
Every AI query evaporates freshwater to cool the servers that power it. One teaspoon at a time, multiplied billions of times a day, it adds up to rivers.
One teaspoon.
That's the mid-range estimate for a single AI query, accounting for both the water evaporated cooling server racks and the water used by power plants generating the electricity.
Direct + indirect water use
Not all queries drink the same.
Direct Water (On-site)
Water evaporated in cooling towers at the data center itself. This is what companies self-report: ~0.3 ml per query.
Indirect Water (Grid)
Water consumed by power plants generating the electricity. Typically 6–12× larger than direct cooling and almost never disclosed.
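A rough back-of-the-envelope sketch of how those two components combine, assuming the ~0.3 ml direct figure above and the 6–12× indirect multiplier; actual values vary by model, data center, and grid:

```python
# Rough per-query water estimate from the figures above.
# Assumptions: ~0.3 ml direct (on-site cooling, self-reported) and
# indirect (grid) water 6-12x the direct figure. Illustrative only.

DIRECT_ML = 0.3
INDIRECT_RANGE = (6, 12)  # indirect water as a multiple of direct

low = DIRECT_ML * (1 + INDIRECT_RANGE[0])   # 0.3 + 1.8 = 2.1 ml
high = DIRECT_ML * (1 + INDIRECT_RANGE[1])  # 0.3 + 3.6 = 3.9 ml

print(f"Per-query water: {low:.1f}-{high:.1f} ml")
# Roughly 2-4 ml from this simple multiplication; once grid mix and
# cooling differences are folded in, estimates land in the 3-5 ml
# ("about a teaspoon") range quoted in this article.
```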
The industry's water bill.
| Company | 2021 | 2022 | 2023 | Growth (2021–23) |
|---|---|---|---|---|
| Google | 16.3B L | 19.7B L | 24.2B L | +48% |
| Microsoft | 4.8B L | 6.4B L | 7.8B L | +63% |
| Meta | ~2.7B L | ~2.9B L | 3.1B L | +15% |
| Amazon | 29.1B L* | Not disclosed | Not disclosed | Unknown |
| xAI | N/A | N/A | ~1.8B L (est.) | No reports |
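For the curious, the Growth column falls straight out of the 2021 and 2023 figures. A quick check, using the rounded values from the rows above (billions of liters):

```python
# Recompute the Growth column from the 2021 and 2023 disclosures
# (billions of liters, as rounded in the table above).
reported = {
    "Google":    (16.3, 24.2),
    "Microsoft": (4.8, 7.8),
    "Meta":      (2.7, 3.1),
}

for company, (y2021, y2023) in reported.items():
    growth_pct = (y2023 - y2021) / y2021 * 100
    print(f"{company}: +{growth_pct:.1f}%")
# Google +48.5%, Microsoft +62.5%, Meta +14.8%; the table's percentages
# differ slightly because the underlying disclosures carry more
# precision than the rounded figures shown here.
```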
More water.
Every company has dramatically improved water efficiency per unit of compute. But demand is scaling faster than efficiency gains, a textbook Jevons Paradox.
Water efficiency: 0.49 → 0.30 L/kWh
Daily queries: 10M → 2.5B (2023–2025)
Inference is the rising tide.
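A stylized sketch of that arithmetic, using the efficiency and query-volume figures above. The per-query energy value is a hypothetical placeholder, so only the relative change between the two years is meaningful:

```python
# Jevons Paradox, stylized: per-unit water efficiency improves,
# yet total water use climbs because query volume explodes.
# ENERGY_PER_QUERY_KWH is a placeholder assumption; only the ratio
# between the two years matters here.

ENERGY_PER_QUERY_KWH = 0.0003   # assumed ~0.3 Wh per query (illustrative)

def annual_water_liters(queries_per_day: float, liters_per_kwh: float) -> float:
    """Total water per year for a given daily query volume and water intensity."""
    return queries_per_day * 365 * ENERGY_PER_QUERY_KWH * liters_per_kwh

before = annual_water_liters(10e6, 0.49)   # 2023: 10M queries/day at 0.49 L/kWh
after = annual_water_liters(2.5e9, 0.30)   # 2025: 2.5B queries/day at 0.30 L/kWh

print(f"Efficiency gain: {(1 - 0.30 / 0.49) * 100:.0f}%")
print(f"Total water growth: {after / before:.0f}x")
# Water per kWh falls by roughly 39%, but a 250x jump in daily queries
# drives total consumption up by roughly 150x.
```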
The first drop.
AI's water footprint is simultaneously smaller and larger than headlines suggest. A single query's 3–5 ml is modest compared to a cup of coffee. But 2.5 billion daily queries transform teaspoons into rivers. The trajectory, not the snapshot, is what matters most.
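Scaling the numbers in that paragraph, as a sanity check on the "teaspoons into rivers" claim:

```python
# Scale the 3-5 ml per-query range to 2.5 billion queries per day.
QUERIES_PER_DAY = 2.5e9
ML_PER_QUERY = (3, 5)

daily_liters = [ml * QUERIES_PER_DAY / 1000 for ml in ML_PER_QUERY]
yearly_liters = [d * 365 for d in daily_liters]

print(f"Daily: {daily_liters[0] / 1e6:.1f}-{daily_liters[1] / 1e6:.1f} million liters")
print(f"Yearly: {yearly_liters[0] / 1e9:.1f}-{yearly_liters[1] / 1e9:.1f} billion liters")
# Roughly 7.5-12.5 million liters a day, or 2.7-4.6 billion liters a year.
# That is in the same ballpark as Meta's entire 2023 disclosure in the
# table above, from inference queries alone.
```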
Each query has a cost. Be specific. Reduce back-and-forth. Use simpler models for simple tasks.
Only Google and Mistral have published detailed per-query disclosures. Push other providers for full-scope data.
Data centers concentrate in specific regions. From Memphis to The Dalles, communities bear costs they didn't choose.