The massive computer clusters powering artificial intelligence consume vast quantities of water and energy to answer the world’s queries, but how is Big Tech redressing the balance?
Writing a 100-word email using ChatGPT (GPT-4, latest model) consumes one 500 ml bottle of water. It uses 140 Wh of energy, enough for 7 full charges of an iPhone Pro Max.
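For rough context, the energy half of that claim lines up with a phone battery of around 20 Wh. Here's the back-of-envelope arithmetic as a minimal sketch; the battery capacity is an assumed figure, not something stated in the article:

```python
# Back-of-envelope check of the infographic's energy figure.
# Assumed value: an iPhone Pro Max battery holds roughly 20 Wh (not stated above).
email_energy_wh = 140     # claimed energy for one 100-word GPT-4 email
iphone_battery_wh = 20    # assumed battery capacity

charges = email_energy_wh / iphone_battery_wh
print(f"{charges:.0f} full phone charges")  # -> 7
```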
Why does the article make it sound like cooling a data center results in constant water loss? Is this not a closed loop system?
I’m imagining a giant reservoir heat sink that runs throughout a complex to pull heat out of the surrounding environment, with some liquid evaporating and needing to be replenished. But first of all, we have more efficient liquid coolants, and second, that would be a very lazy solution.
I wonder if they’ve considered geothermal for new data centers. You can run a geothermal loop in reverse and use the earth as a giant heat sink. It’s not water in the loop, it’s refrigerant, and it only needs to be replaced when you find the efficiency dropping, which can take decades.
Datacenters are usually not located where this would be useful. They're placed where space and energy are cheap, because everything they do only needs Internet access. At most they'd heat the rest of the building for whatever office space there is.
Evaporative coolers save a ton of energy compared to refrigeration-cycle closed-loop systems. Like a swamp cooler, the hot liquid that comes from cooling the servers is exposed to the atmosphere and enough evaporates off to cool the liquid by a decent percentage; then it's refrigerated before going back into the servers.
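As a rough illustration of why evaporation implies real water consumption, here's a sketch of how much water would have to evaporate to reject one megawatt-hour of server heat. It assumes all heat rejection happens via evaporation, which real towers don't quite do, and uses textbook constants rather than anything from the comments above:

```python
# Rough estimate of evaporative cooling water use.
# Assumption: all heat leaves as latent heat of vaporization (~2.26 MJ/kg of water).
latent_heat_mj_per_kg = 2.26
heat_rejected_mwh = 1.0                       # 1 MWh of server heat
heat_rejected_mj = heat_rejected_mwh * 3600   # 1 MWh = 3600 MJ

water_kg = heat_rejected_mj / latent_heat_mj_per_kg
print(f"~{water_kg:.0f} kg (roughly litres) of water evaporated per MWh of heat")  # ~1593
```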
A data centre near me is using it, and the fire service is used to being called by people concerned that the huge clouds of water vapor are smoke.
It is a closed loop, but the paper treats it as if it were an open loop: it counts all water use for the building, all the water that went into creating any equipment used, and the water that escapes power plants in powering the buildings. It also includes any other buildings that might house related services. Here is the original "study", which is about what maths could be done given the above assumptions:
In short, it has nothing to do with reality, and is more just an attempt by the authors to get their names out there (on bad science that the media is interested in publicizing for clickbait reasons).
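For what it's worth, the accounting style being described usually looks something like the sketch below: on-site cooling water plus the water evaporated at power plants for the electricity drawn. Every number here is a placeholder for illustration, not a figure from the paper, and the embodied water from manufacturing equipment (which the comment also mentions) is left out:

```python
# Sketch of the water-footprint accounting style described above.
# All values are placeholders, not numbers from the study.
energy_kwh = 1000.0            # electricity drawn by the IT equipment (placeholder)
onsite_wue = 1.8               # litres of cooling water per kWh of IT energy (placeholder)
pue = 1.2                      # total facility energy / IT energy (placeholder)
offsite_water_l_per_kwh = 3.1  # water evaporated at power plants per kWh generated (placeholder)

onsite_water = energy_kwh * onsite_wue
offsite_water = energy_kwh * pue * offsite_water_l_per_kwh
total_water = onsite_water + offsite_water
print(f"on-site: {onsite_water:.0f} L, off-site: {offsite_water:.0f} L, total: {total_water:.0f} L")
```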
It depends highly on the individual data center, but it is very likely that they do use municipal water for cooling. Maintaining a reservoir is extremely expensive for the amount of thermal mass it requires; these things kick off HEAT.
I don't know why they aren't using reclaimed water from treatment plants. I don't see why potable water is necessary as long as the substitute isn't corrosive, but I might be missing something here.
I don't know, but given that ground-source heat pumps are one of the most efficient ways to heat a building, and this suggestion is just exactly that in reverse (pumping the heat into the ground instead of out of it), I'd imagine that it will not just "reach its heat capacity". The heat would flow away just as it flows to a heat pump. If the entire earth reaches its heat capacity I think we'd have problems.
Deep Geothermal goes deeeeeepppp to where there is a heat source that is replenished.
Shallow geothermal pulls heat from ground where there is no replenishment, and you have to run it in reverse (use it as AC in the summer) to swap the heat back. You can't only pull heat out with shallow geothermal. You may be able to for a time, but also remember that heating a house is a pretty small load overall.
It's not the entire earth that is the heat sink; it's a relatively short distance from the pipe. We don't get the massive heat from the molten core at the surface.
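To put a rough number on how local that heat sink is, here's a sketch of how far heat diffuses through soil over a year by conduction alone. The soil thermal diffusivity is an assumed textbook-ish value, not something from this thread:

```python
import math

# How far does heat spread from a buried loop in a year of conduction alone?
# Assumed thermal diffusivity of moist soil: ~1e-6 m^2/s (placeholder value).
alpha = 1e-6                         # m^2/s
seconds_per_year = 365 * 24 * 3600

# Characteristic diffusion length: sqrt(alpha * t)
length_m = math.sqrt(alpha * seconds_per_year)
print(f"~{length_m:.1f} m of soil participates after a year")  # roughly 5-6 m
```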