Water is extremely important in most large-scale cooling systems, whether they're swamp coolers (aka evaporative cooling) or traditional HVAC (aka chillers).
It will rain somewhere, generally in places that already get rain. If you're counting the global total, we have plenty of fresh water; we just don't have it in the places where we need it.
The reason evaporative coolers are cheap is that they use a fraction of the electricity that chillers do.
And note that the majority of data center water usage is indirect, via power generation, so using less water on site while consuming more power (and therefore more water upstream) is both more expensive and less efficient.
Unfortunately, evaporative coolers are the best way to go, for now.
When calculating water use, it's important to look not only at the water used directly to cool data centers, but also at the water used by power plants to generate that 205 TWh.
The researchers also tracked the water used by wastewater treatment plants attributable to data centers, as well as the water used by power plants to power that portion of the treatment plants' workload.
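A quick back-of-envelope version of that accounting, just to show its shape. The intensity figures below (onsite water per kWh and upstream water per kWh generated) are illustrative assumptions, not numbers from the study:

    # Back-of-envelope sketch of the direct vs. indirect water accounting
    # described above. Every intensity figure here is an illustrative
    # assumption, not a number taken from the study.
    ANNUAL_ELECTRICITY_KWH = 205e9   # 205 TWh, the figure quoted above

    ONSITE_WUE_L_PER_KWH = 1.8       # assumed liters evaporated on site per kWh of IT load
    UPSTREAM_L_PER_KWH = 4.0         # assumed liters consumed by power generation per kWh

    direct_l = ANNUAL_ELECTRICITY_KWH * ONSITE_WUE_L_PER_KWH
    indirect_l = ANNUAL_ELECTRICITY_KWH * UPSTREAM_L_PER_KWH
    total_l = direct_l + indirect_l

    print(f"direct:   {direct_l / 1e9:,.0f} billion liters/year")
    print(f"indirect: {indirect_l / 1e9:,.0f} billion liters/year")
    print(f"indirect share of total: {indirect_l / total_l:.0%}")

Under those assumed intensities the indirect share comes out well over half, which is the point being made about onsite-vs-upstream trade-offs.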
Last year, our global data center fleet consumed approximately 4.3 billion gallons of water. This is comparable to the water needed to irrigate and maintain 29 golf courses in the southwest U.S. each year.
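A rough sanity check on that comparison, dividing the fleet total across the 29 courses (the statement doesn't give a per-course figure, so the plausibility note at the end is my own guess):

    # Rough sanity check on the golf-course comparison. No per-course figure
    # is given in the statement, so this just divides the fleet total by 29.
    fleet_gal_per_year = 4.3e9
    golf_courses = 29

    gal_per_course_year = fleet_gal_per_year / golf_courses
    gal_per_course_day = gal_per_course_year / 365

    print(f"{gal_per_course_year / 1e6:.0f} million gal per course per year")
    print(f"{gal_per_course_day:,.0f} gal per course per day")
    # ~148 million gal/year (~406,000 gal/day) per course, which seems
    # plausible for an irrigated course in the desert southwest.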
From the WaPo article:
A large data center, researchers say, can gobble up anywhere between 1 million and 5 million gallons of water a day — as much as a town of 10,000 to 50,000 people.
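That comparison implies a per-capita rate, which is easy to make explicit:

    # Making the implied per-capita rate explicit. Both ends of the range
    # work out to the same assumed municipal usage figure.
    for gal_per_day, people in [(1e6, 10_000), (5e6, 50_000)]:
        print(f"{gal_per_day / people:.0f} gal/person/day")
    # -> 100 gal/person/day, roughly in line with the 80-100 gal/person/day
    #    often cited for U.S. residential use.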
For California at least, residential use is about 10% of all water usage, iirc. So if data centers are dwarfed by that... not a big concern in the big picture.
The issue, I guess, is when data center usage sucks up the local supply. Statewide and region-wide they don't use much, but they can use a lot in one small area.