
The ugly truth behind ChatGPT: AI is guzzling resources at planet-eating rates

Despite its name, the infrastructure used by the “cloud” accounts for more global greenhouse gas emissions than commercial flights. In 2018, for instance, the 5bn YouTube hits for the viral song Despacito used the same amount of energy it would take to heat 40,000 US homes annually.

Large language models such as ChatGPT are some of the most energy-guzzling technologies of all. Research suggests, for instance, that about 700,000 litres of water could have been used to cool the machines that trained ChatGPT-3 at Microsoft’s data facilities.

Additionally, as these companies aim to reduce their reliance on fossil fuels, they may opt to base their datacentres in regions with cheaper electricity, such as the southern US, potentially exacerbating water consumption issues in drier parts of the world.

Furthermore, while minerals such as lithium and cobalt are most commonly associated with batteries in the motor sector, they are also crucial for the batteries used in datacentres. The extraction process often involves significant water usage and can lead to pollution, undermining water security. The extraction of these minerals is also often linked to human rights violations and poor labour standards. Trying to achieve one climate goal of limiting our dependence on fossil fuels can compromise another goal: ensuring everyone has a safe and accessible water supply.

Moreover, when significant energy resources are allocated to tech-related endeavours, it can lead to energy shortages for essential needs such as residential power supply. Recent data from the UK shows that the country’s outdated electricity network is holding back affordable housing projects.

In other words, policy needs to be designed not to pick sectors or technologies as “winners”, but to pick the willing by providing support that is conditional on companies moving in the right direction. Making disclosure of environmental practices and impacts a condition for government support could ensure greater transparency and accountability.

379 comments
  • So... Absolutely need to be aware of the impact of what we do in the tech sphere, but there's a few things in the article that give me pause:

    Research suggests, for instance, that about 700,000 litres of water could have been used to cool the machines that trained ChatGPT-3 at Microsoft’s data facilities.

    1. "Could". More likely it was closed loop.
    2. Water isn't single use, so even if true how does this big number matter.

    What matters is the electrical energy converted to heat. How much was it and where did that heat go?

    Moreover, when significant energy resources are allocated to tech-related endeavours, it can lead to energy shortages for essential needs such as residential power supply. Recent data from the UK shows that the country’s outdated electricity network is holding back affordable housing projects.

    Can you say non sequitur ?

    The outdated network holding back housing is that it doesn't go to the right places with the capacity needed for the houses. Not that OpenAIUK is consuming so much that there's no power left. To use a simile, there's plenty of water but the pipes aren't in place.

    This article is well-intentioned FUD, but FUD nonetheless.

    • 700,000 litres also sounds like much more than “700 m³”. The average German citizen consumes 129 litres per day, or roughly 47 m³ annually, so that is the annual water consumption of about 15 people: less than most apartment blocks.

      Energy consumption might be a real problem, but I don't see how water consumption is that big of a problem or priority here.
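As a sanity check on the arithmetic above, here is the comparison spelled out. The 129 L/day per-capita figure is the one quoted in the comment; everything else is just unit conversion:

```python
# Putting the article's 700,000-litre estimate in per-person context.
TRAINING_WATER_L = 700_000        # litres, the estimate quoted from the article
PER_CAPITA_L_PER_DAY = 129        # litres/day, average German citizen (figure from the comment)

per_capita_annual_m3 = PER_CAPITA_L_PER_DAY * 365 / 1000  # ~47 m3 per person per year
training_m3 = TRAINING_WATER_L / 1000                     # 700,000 L = 700 m3
people_equivalent = training_m3 / per_capita_annual_m3

print(f"{training_m3:.0f} m3 ~= the annual water use of {people_equivalent:.0f} people")
# 700 m3 ~= the annual water use of 15 people
```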

    • “Could”. More likely it was closed loop.

      As I understand it, this is an estimate, thus the word “could”. It has nothing to do with using closed- or open-loop water cooling.

      Water isn’t single use, so even if true how does this big number matter.

      The point they are trying to make is that fresh water is not a limitless resource and increasing usage has various impacts, for example on market prices.

      The outdated network holding back housing is that it doesn’t go to the right places with the capacity needed for the houses. Not that OpenAIUK is consuming so much that there’s no power left. To use a simile, there’s plenty of water but the pipes aren’t in place.

      The point being made is that resources are allocated to increase network capacity for hyped tech and not for current, more pressing needs.

    • “Could”. More likely it was closed loop.

      Nope. Here's how data centres use water.

      It boils down to two things - cooling and humidification. Humidification is clearly not a closed loop, so I'll focus on the cooling:

      • cold water runs through tubes, chilling the air inside the data centre
      • the water is now hot
      • hot water is exposed to outside air, some evaporates, the leftover is colder and reused.

      Since some evaporates you'll need to put more water into the system. And there's an additional problem: salts don't evaporate, they concentrate over time, precipitate, and clog your pipes. Since you don't want this you'll eventually need to flush it all out. And it also means that you can't simply use seawater for that, it needs to be freshwater.
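To put rough numbers on the evaporation and flush-out described above, here is a minimal back-of-envelope sketch. The heat load, latent-heat constant, and cycles-of-concentration value are all illustrative assumptions, not figures from this thread:

```python
# Back-of-envelope: freshwater an evaporative cooling tower draws for a given heat load.
# All constants below are typical illustrative values, not figures from the thread.

H_VAP = 2.45e6   # J/kg, approximate latent heat of vaporisation of water near ambient temp
COC = 4          # cycles of concentration: how far salts concentrate before a flush (typ. 3-6)

def makeup_water_kg(heat_load_w: float, hours: float) -> float:
    """Water (kg) needed to replace evaporation plus the blowdown that flushes out salts."""
    heat_j = heat_load_w * hours * 3600
    evaporation = heat_j / H_VAP          # kg evaporated to reject the heat
    blowdown = evaporation / (COC - 1)    # kg deliberately drained so salts don't precipitate
    return evaporation + blowdown

# A hypothetical 1 MW heat load running for 24 hours:
print(f"{makeup_water_kg(1e6, 24) / 1000:.0f} tonnes (~m3) of freshwater per day")
# 47 tonnes (~m3) of freshwater per day
```

The blowdown term is why the salts matter: the more concentrated you let them get (higher COC), the less water you flush, but you can never drive it to zero.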

      Water isn’t single use, so even if true how does this big number matter.

      Freshwater renews at a limited rate.

      What matters is the electrical energy converted to heat. How much was it and where did that heat go?

      Mostly to the air, by driving the evaporation of the water.

      Can you say non sequitur ?

      More like non sequere than non sequitur. Read the whole paragraph:

      Moreover, when significant energy resources are allocated to tech-related endeavours, it can lead to energy shortages for essential needs such as residential power supply. Recent data from the UK shows that the country’s outdated electricity network is holding back affordable housing projects. This will only get worse as households move away from using fossil fuels and rely more on electricity, putting even more pressure on the National Grid. In Bicester, for instance, plans to build 7,000 new homes were paused because the electricity network didn’t have enough capacity.

      The author is highlighting that electricity security is already poor for you Brits, for structural reasons; it'll probably get worse due to increased household consumption; and with big tech consuming it as well, it'll get worse still.

      • Data center cooling towers can be closed- or open-loop, and even operate in a hybrid mode depending on demand and air temps/humidity. Problem is, the places where open-loop evaporative cooling works best are arid, low-humidity regions where water is a scarce resource to start.

        On the other hand, several of the FAANGs are building datacenters right now in my area, where we're in the watershed of the largest river in the country and it's regularly humid and rainy. Any water used in a given process is either treated and released back into the river, or fairly quickly condenses back out of the atmosphere as rain a few hundred miles further east (where it will eventually collect back into the same river). The only way that water is "wasted" in this environment has to do with the resources used to treat and distribute it. However, because it's often hot and humid around here, open-loop cooling isn't as effective, and it's more common to see closed-loop systems.

        Bottom line, though, I think the siting of water-intensive industries in water-poor parts of the country is a governmental failure, first and foremost. States like Arizona in particular have a long history of planning as though they aren't in a dry desert that has to share its only renewable water resource with two other states, and offering utility incentives to potential employers that treat that resource as if it's infinite. A government that was focused on the long-term viability of the state as a place to live rather than on short-term wins that politicians can campaign on wouldn't be making those concessions.

  • The forefront of technology overutilizes resources?

    Always has been.

    Edit: Supercomputers have existed for 60 years.

    • The difference is that supercomputers by and large actually help humanity. They do things like help predict severe weather, help us understand mathematical problems, understand physics, develop new drug treatments, etc.

      They are also primarily owned and funded by universities, scientific institutions, and public funding.

      The modern push for ubiquitous corpo cloud platforms, SaaS, and AI training has resulted in massive pollution and environmental damage. For what? Mostly to generate massive profits for a small number of mega-corps, high-level shareholders and ultra-wealthy individuals, devalue and lay off workers, collect insane amounts of data to aid in mass surveillance and targeted advertising, and enshittify as much of the modern web as possible.

      All AI research should be open source, federated, and accountable to the public. It should also be handled mostly by educational institutions, not for-profit companies. There should be no part of it that is allowed to be closed source or proprietary. No government should honor any copyright claims or cyber law protecting companies' rights to not have their software hacked, decompiled, and code spread across the web for all to see and use as they see fit.

      • While I absolutely agree with everything you've stated, I'm not taking a moral position here. I'm just positing that the same arguments of concern have been on the table since the establishment of massive computational power regardless of how, or by whom, it was to be utilized.

    • AI is on a completely different level of energy consumption. Consider that Sam Altman, of OpenAI, is investing in nuclear power plants to directly feed their next iterations of AI models. That's a whole ass nuclear reactor to feed one AI model, because the amount of energy we currently generate falls several orders of magnitude short of what they want. We are struggling to feed these monsters; it is nothing like how supercomputers tax the grid.

      • Supercomputers were feared to be untenable resource consumers then, too.

        Utilizing nuclear to feed AI may be the responsible and sustainable option, but there's a lot of FUD surrounding all of these things.

        One thing is certain: Humans (and now AI) will continue to advance technology, regardless of consequence.

  • This is the best summary I could come up with:


    It is hardly news that the tech bubble’s self-glorification has obscured the uglier sides of this industry, from its proclivity for tax avoidance to its invasion of privacy and exploitation of our attention span.

    The industry’s environmental impact is a key issue, yet the companies that produce such models have stayed remarkably quiet about the amount of energy they consume – probably because they don’t want to spark our concern.

    Google’s global datacentre and Meta’s ambitious plans for a new AI Research SuperCluster (RSC) further underscore the industry’s energy-intensive nature, raising concerns that these facilities could significantly increase energy consumption.

    Additionally, as these companies aim to reduce their reliance on fossil fuels, they may opt to base their datacentres in regions with cheaper electricity, such as the southern US, potentially exacerbating water consumption issues in drier parts of the world.

    In an era where we expect businesses to do more than just make profits for their shareholders, governments need to evaluate the organisations they fund and partner with, based on whether their actions will result in concrete successes for people and the planet.

    As climate scientists anticipate that global heating will exceed the 1.5C target, it’s time we approach today’s grand challenges systemically, so that the solution to one problem does not exacerbate another.


    The original article contains 766 words, the summary contains 214 words. Saved 72%. I'm a bot and I'm open source!
