The cloudy issue of data center water cooling
For years, digital infrastructure lived in the background—an abstract “cloud.” Today that cloud is unmistakably physical: multi-building data center campuses spanning hundreds to thousands of acres, drawing hundreds of megawatts to gigawatts of power and, in many regions, substantial volumes of water to cool increasingly heat-dense AI processors. Communities are feeling the footprint and pushing back.
Across North America, local governments are rejecting or pausing projects over water and energy concerns. In Alberta, Canada, Rocky View County voted 6–1 against a proposed 1,100-acre AI/data center campus with a 900-MW onsite power plan, citing farmland impacts and uncertainty around future water demand. Similar concerns have stalled or cancelled billions of dollars in US developments.
Cooling is now central to a project's social license. A one-GW campus using traditional evaporative cooling can consume 2–4 million gallons of water per day, on the scale of a small city. Modern adiabatic systems can cut that to roughly 0.5–1 million gallons per day, closer to the consumption of a single golf course.
The lesson is clear: data centers will only be approved when communities understand how they will benefit and have confidence that the project's water and energy footprint is transparent, justified, and responsibly managed. Water strategy has become synonymous with social license: the path to acceptance runs not through branding or corporate pledges, but through credible, local, watershed-specific stewardship that communities can see and verify.
Transparency earns trust
Without clear accounting of withdrawals, consumption, reuse, and basin-matched replenishment, social license is difficult to gain. Operators that demonstrate hydrologically connected replenishment (as Meta now requires for high-stress basins), adopt adiabatic or reuse-driven cooling, and openly disclose water impacts face far less community pushback.
Economic and social benefits matter too. In Loudoun County, Virginia, data centers generate 38% of the county’s General Fund and nearly half its property tax revenue, sharply reducing the tax burden for residents. For every $1 of public services they consume, data centers contribute $26 in tax revenue, and each job creates 3.5 additional local jobs.
The Grid Water Paradox
Fully dry, water-free designs risk triggering the Grid Water Paradox: shifting water use upstream to thermal power plants if the local grid is fossil-dominated. Whether dry, adiabatic, or evaporative, there is always an impact that must be managed and acceptable relative to the facility's value. Local water context matters, and it is important to look beyond the security fenceline for opportunities to integrate with other water and heat users in the watershed.
Hyperscale operators have pledged to be water positive by 2030 through efficiency, reuse, rain capture, and watershed investments. Some require hydrological connection, restoring 200% in high-stress basins so replenishment occurs where withdrawals occur. But risks of “water washing” remain when replenishment happens far from the impact basin—an echo of carbon offset controversies. A “water positive” certificate in a boardroom does nothing for a farmer whose well has run dry. The credibility test is basin level: measure, mitigate, replenish locally.
A second blind spot is that hyperscalers influence roughly 41–44% of global data center capacity, but only about half is in facilities they own. The rest is in leased colocation sites where operators, not hyperscalers, set cooling and water policy. That means hyperscalers directly control roughly 20–22% of global capacity where their water pledges are fully enforceable.
Close the data gap
Only about 51% of operators monitor water use, according to surveys cited by the Uptime Institute. Standardized, auditable metrics are needed, akin to how Power Usage Effectiveness (PUE) normalized energy discussions. Key metrics should include:
- Withdrawals, consumption, discharge, reuse rate
- Water source (potable vs. reclaimed) and basin ID
- Local replenishment (basin-matched, hydrologically connected)
- Cooling mode run hours (evaporative/adiabatic/dry) tied to Water Usage Effectiveness (WUE) and PUE
With common definitions and third-party verification, communities can see where water is drawn, how much is consumed, how impacts are mitigated, and whether replenishment is local. That transparency can de-risk permitting and prevent last-minute cancellations.
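The two headline ratios behind these disclosures are straightforward to compute, which is part of their appeal as standardized metrics. A minimal sketch, using the industry definitions of PUE and WUE and purely hypothetical annual figures (the numbers below are illustrative, not drawn from any real facility):

```python
# PUE = total facility energy / IT equipment energy  (dimensionless; 1.0 is ideal)
# WUE = site water consumed (liters) / IT equipment energy (kWh)

def pue(total_facility_kwh: float, it_kwh: float) -> float:
    """Power Usage Effectiveness: overhead energy relative to IT load."""
    return total_facility_kwh / it_kwh

def wue(water_consumed_liters: float, it_kwh: float) -> float:
    """Water Usage Effectiveness, in liters per kWh of IT energy."""
    return water_consumed_liters / it_kwh

# Hypothetical annual figures for a mid-size facility:
it_energy_kwh = 100_000_000        # 100 GWh of IT load
facility_energy_kwh = 130_000_000  # includes cooling and power conversion
water_liters = 180_000_000         # 180 ML consumed on site

print(f"PUE = {pue(facility_energy_kwh, it_energy_kwh):.2f}")  # 1.30
print(f"WUE = {wue(water_liters, it_energy_kwh):.2f} L/kWh")   # 1.80
```

Reporting these alongside basin ID, source type, and cooling-mode run hours lets a reviewer tie each liter consumed to the work performed and to the mitigation claimed for that specific watershed.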
Rocky View’s rejection, following a 10-hour hearing with more than 50 community objections, was not an engineering failure; it was an alignment failure. A data center campus with basin-specific water accounting, a heat recovery plan for local greenhouses or district loops, and standardized disclosures might have earned a different outcome.
Watershed-aware, circular by design
At Hatch, we are moving beyond simple PUE and WUE to design watershed-aware, energy-circular infrastructure. Our approach integrates:
- Siting by grid-water co-optimization: Evaluating grid health (capacity, emissions, water intensity of generation) and water abundance (basin stress, reuse opportunities) before site control.
- Energy-circular design: High-temperature direct-to-chip liquid cooling, thermal storage, and district heat export to reduce evaporative hours and monetize waste heat.
- Basin-matched replenishment: If water is consumed locally, replenish locally—with hydrological connection—and disclose it under a standardized framework.
- Standardized water transparency: Uniform, auditable reporting of withdrawals, consumption, reuse, and replenishment per basin.
Done well, the conversation changes. Data centers become community energy hubs, grid assets, and watershed partners. If you’re exploring a new project, we can help you develop water, energy, and community strategies that secure durable social license, minimize pushback, and strengthen support. Contact us to learn more!
A modern data center environment showcasing advanced server systems and the expertise behind reliable, high‑capacity digital infrastructure.


