The debunking of the AI water consumption myth starts here: AI does use water — but the viral claim that a single ChatGPT conversation "drinks" a bottle of water is almost certainly wrong, or at least wildly misleading. That figure comes from a 2023 paper by researchers at UC Riverside that estimated water consumption across entire data centre cooling systems, then divided by query volume in ways critics argue conflate very different processes. AI's water footprint is real, measurable, and worth scrutinising. But lifecycle analysis — the method scientists use to track resource use from production to disposal — is ferociously complicated, and most headlines skip the complications entirely. Understanding what's actually happening inside a data centre, and how that compares to other industries, changes the picture completely.
Where does the 'AI drinks a water bottle per query' claim actually come from?
The statistic that shocked millions originated from a preprint by researchers Pengfei Li, Jianyi Yang, and colleagues at UC Riverside, posted in 2023. Their core argument was that training large language models and running inference queries requires enormous amounts of cooling water in data centres — and when you add up the gallons evaporated to keep servers at safe temperatures, the per-query figure sounds alarming.
The number itself — roughly 500ml per short conversation — was not fabricated. But it was calculated using a specific methodology that several engineers and environmental scientists have since challenged. The figure averaged water consumption across entire data centre campuses, including water used for cooling unrelated workloads like cloud storage, video streaming, and enterprise software. Attributing all of that to the AI query running at that moment is a bit like blaming one passenger for all the fuel a plane burns.
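The allocation problem described above can be made concrete with a back-of-envelope sketch. Every number below is made up purely for illustration — none of them come from the UC Riverside paper — but the arithmetic shows how much the answer depends on what you attribute to the query:

```python
# Illustrative sketch of how allocation choices change a per-query water
# estimate. All figures are hypothetical, chosen only for demonstration.

CAMPUS_WATER_LITRES_PER_DAY = 1_500_000  # hypothetical total campus cooling water
AI_SHARE_OF_COOLING = 0.15               # hypothetical fraction of cooling driven by AI
AI_QUERIES_PER_DAY = 50_000_000          # hypothetical AI query volume

# Method A: divide ALL campus cooling water by AI queries alone
# (roughly the framing critics object to).
per_query_naive = CAMPUS_WATER_LITRES_PER_DAY / AI_QUERIES_PER_DAY

# Method B: attribute only the AI-driven share of cooling to AI queries.
per_query_attributed = (
    CAMPUS_WATER_LITRES_PER_DAY * AI_SHARE_OF_COOLING / AI_QUERIES_PER_DAY
)

print(f"Naive allocation:      {per_query_naive * 1000:.1f} ml per query")
print(f"Attributed allocation: {per_query_attributed * 1000:.1f} ml per query")
```

With these invented inputs the two methods differ by nearly an order of magnitude — which is the whole dispute in miniature: the raw data can be identical while the headline number swings wildly with the allocation rule.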
Hank Green, a science communicator with millions of subscribers, made exactly this point in a widely watched 2025 video. His argument: lifecycle and resource-use analyses are genuinely hard, and journalists — including well-meaning ones — routinely misapply their conclusions. A number that sounds precise often has a confidence interval wide enough to drive a truck through.
None of this means the researchers were wrong to raise the issue. Water stress near data centre hubs is a documented problem, particularly in arid US states like Arizona and Nevada where major facilities cluster. The question is whether the per-query framing communicates a useful truth or a misleading one.
How do data centres actually use water?
Most large data centres use one of two cooling strategies, and understanding the difference matters enormously for any honest accounting.
Air-cooled systems use chillers and fans to move heat away from servers. They consume electricity but relatively little water. Evaporative cooling towers — the dominant approach in older and larger facilities — work like industrial-scale swamp coolers: water evaporates, carrying heat with it. This is where most of the reported water consumption comes from.
The key variable is where that water goes. Evaporated water is lost to the atmosphere — it doesn't return to the local water table, which matters in drought-prone regions. But it also isn't contaminated or destroyed. Whether this counts as a serious environmental harm depends entirely on local hydrology, which varies by facility.
Here's what the narrative often misses:
- Many of the newest hyperscale data centres — including facilities built by Google and Microsoft in recent years — use closed-loop cooling systems that dramatically reduce evaporative water loss.
- Microsoft has publicly committed to being water positive by 2030, meaning it aims to replenish more water than it consumes globally.
- Google reported in its 2023 environmental report that it reduced its water usage intensity — litres per megawatt-hour — compared to previous years, even as AI workloads grew.
The infrastructure is not static. The data centres being built today look very different from the ones the 2023 UC Riverside paper was modelling.
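The intensity metric mentioned in Google's report — litres of water per unit of IT energy delivered — can be sketched as a simple ratio. The figures below are hypothetical, not from any company's disclosures; they exist only to show how absolute water use can rise while intensity falls:

```python
# Minimal sketch of a water-usage-intensity metric (litres per MWh of IT
# energy), in the spirit of the reporting described above. All numbers
# are hypothetical.

def water_usage_intensity(water_litres: float, it_energy_mwh: float) -> float:
    """Litres of cooling water consumed per megawatt-hour of IT energy."""
    return water_litres / it_energy_mwh

# A facility can use more water in total yet improve its intensity,
# if the energy it delivers grows faster than its water draw:
year1 = water_usage_intensity(water_litres=800_000, it_energy_mwh=400)
year2 = water_usage_intensity(water_litres=1_000_000, it_energy_mwh=625)

print(f"Year 1: {year1:.0f} L/MWh, Year 2: {year2:.0f} L/MWh")
assert year2 < year1  # intensity improved even as absolute water use rose
```

This is why "total water use went up" and "water efficiency improved" can both be true of the same facility in the same year — a distinction headlines routinely flatten.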
Is AI's water footprint actually large compared to other industries?
This is where the myth really starts to unravel — or at least get complicated in ways that should make anyone pause before sharing a viral infographic.
Agriculture accounts for roughly 70% of all global freshwater withdrawals, according to the Food and Agriculture Organization of the United Nations. A single kilogram of beef requires approximately 15,000 litres of water to produce, when you account for feed crops, drinking water, and processing. A cotton T-shirt takes around 2,700 litres. These aren't contested estimates — they come from widely replicated lifecycle analyses conducted over decades.
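Taking the figures quoted above — and taking the contested 500ml-per-chat claim at face value purely for scale — a quick back-of-envelope comparison looks like this:

```python
# Back-of-envelope scale comparison using the lifecycle figures quoted
# above. The 500 ml per-chat number is the contested viral claim, used
# here at face value only to illustrate proportions.

LITRES_PER_KG_BEEF = 15_000
LITRES_PER_TSHIRT = 2_700
LITRES_PER_CHAT_VIRAL = 0.5  # 500 ml, the disputed figure

chats_per_kg_beef = LITRES_PER_KG_BEEF / LITRES_PER_CHAT_VIRAL
chats_per_tshirt = LITRES_PER_TSHIRT / LITRES_PER_CHAT_VIRAL

print(f"1 kg of beef ~= {chats_per_kg_beef:,.0f} chats' worth of water")
print(f"1 T-shirt    ~= {chats_per_tshirt:,.0f} chats' worth of water")
```

Even granting the viral figure, one kilogram of beef embeds as much water as tens of thousands of chats — which is the proportionality point the next paragraph makes at the sector level.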
The entire global data centre sector, including everything from Netflix servers to banking infrastructure to AI, accounts for less than 1% of global water withdrawals by most credible estimates. That's not nothing — location and timing matter, and a data centre drawing heavily from a stressed aquifer in Phoenix causes more harm than the same facility in water-rich Norway. But the proportional scale matters when we're allocating public attention and policy energy.
Researchers who study technology's environmental footprint generally argue that electricity sourcing is a far more important variable than water. A data centre running on coal-heavy grid power causes significantly more environmental damage than one running on hydroelectric or wind power — and that trade-off barely registers in the water-centric discourse.
The framing of AI as an environmental villain also tends to ignore the counterfactual. AI tools used in climate modelling, grid optimisation, and materials science research may offset significant resource consumption elsewhere. That's genuinely hard to quantify, but so is the water figure — and critics rarely apply the same scepticism to both.
Why do the headlines get this so consistently wrong?
Lifecycle analysis is, in the words of engineers who use it professionally, inherently political. Not in a partisan sense — in the sense that every methodological choice embeds assumptions, and those assumptions determine the answer you get before you've done any arithmetic.
Do you count the water used to manufacture the server chips? The water embedded in the concrete of the data centre building? The water footprint of the electricity generation? Each decision can shift your final number by an order of magnitude. The UC Riverside researchers made reasonable choices for their purposes. But those choices aren't the only defensible ones.
News cycles reward alarming, concrete-sounding numbers. "AI uses a water bottle per query" fits in a tweet and triggers a clear emotional response. "AI's water consumption varies between 50ml and 2,000ml per query depending on facility type, location, cooling technology, grid mix, and how you define the system boundary" does not get shared.
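The reason honest estimates span such a wide range is that a per-query figure is roughly the product of energy per query and the water intensity of the facility and grid, and both factors vary enormously. A sketch with entirely hypothetical parameter values shows how the spread opens up:

```python
# Sketch of per-query sensitivity: water per query is approximately
# (energy per query) x (on-site cooling water intensity + water embedded
# in electricity generation). All parameter values are hypothetical,
# chosen only to illustrate the spread between scenarios.

def water_per_query_ml(energy_kwh: float,
                       onsite_wue_l_per_kwh: float,
                       grid_water_l_per_kwh: float) -> float:
    """On-site cooling water plus grid-embedded water, in millilitres."""
    return energy_kwh * (onsite_wue_l_per_kwh + grid_water_l_per_kwh) * 1000

# Closed-loop cooling, small efficient model, low-water grid (hypothetical):
low = water_per_query_ml(energy_kwh=0.0003,
                         onsite_wue_l_per_kwh=0.2,
                         grid_water_l_per_kwh=0.5)

# Evaporative cooling, larger model, thermoelectric-heavy grid (hypothetical):
high = water_per_query_ml(energy_kwh=0.004,
                          onsite_wue_l_per_kwh=1.8,
                          grid_water_l_per_kwh=3.0)

print(f"Per-query estimate ranges from {low:.2f} ml to {high:.1f} ml")
```

Even these invented inputs produce nearly a hundredfold spread between scenarios — which is why any single unqualified per-query number should raise an eyebrow.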
There's also a deeper asymmetry: tech companies are easy targets for environmental criticism, and often deserve it. But the specific criticism has to be accurate to be useful. Misdirected outrage can crowd out pressure on the actual levers — like pushing for renewables-backed data centres, investing in liquid cooling research, or zoning AI infrastructure away from water-stressed regions.
Science communicator Hank Green put it bluntly in his 2025 analysis: the story of AI and water is interesting and worth telling, but the way it's currently being told teaches people to distrust nuance rather than demand it.
What does the evidence actually say we should worry about?
The honest answer is: specific harms in specific places, not a global civilisational water crisis driven by chatbots.
The most credible concern is geographic concentration. When a single hyperscale campus draws millions of gallons per day from a local aquifer that's already under pressure — as has happened in parts of Arizona and the American Southwest — that's a real, documentable harm to local communities and ecosystems. Residents in Mesa, Arizona raised legitimate concerns when data centres expanded rapidly in their region during the early 2020s, coinciding with Lake Mead hitting historic lows.
This is a genuine problem. It is also a solvable one through zoning policy, water pricing reform, and the accelerating shift to advanced cooling technologies — not through individual users feeling guilty about asking an AI to draft an email.
Research consistently suggests that the three highest-impact levers for reducing AI's environmental footprint are:
- Decarbonising the electricity grid that powers data centres
- Deploying next-generation cooling technologies, including direct liquid cooling of chips
- Improving model efficiency — smaller, faster models that do the same job with fewer computations
All three are actively being pursued by researchers and, to varying degrees, by the industry itself. Model efficiency in particular has improved dramatically over the last five years: newer AI architectures achieve comparable performance to earlier models at a fraction of the compute cost, which reduces both energy and water consumption proportionally.
The story of AI and water is real. It's just not the story most people have heard.
Debunking the AI water consumption myth doesn't mean AI has no environmental footprint — it absolutely does. But a number stripped of its methodology, its geographic context, and its comparison class isn't information. It's a vibe. The real issues are local water stress near specific facilities, electricity sourcing, and model efficiency. Those are fixable problems with measurable solutions. Panic about a bottle of water per ChatGPT message is not. Demanding precision from the people measuring AI's impact is the same skill as demanding precision from AI itself — and it matters just as much.
Originally published on SnackIQ

