The ugly truth behind ChatGPT: AI is guzzling resources at planet-eating rates

Big tech is playing its part in reaching net zero targets, but its vast new datacentres are run at huge cost to the environment, says economics professor Mariana Mazzucato

Despite its name, the infrastructure used by the “cloud” accounts for more global greenhouse gas emissions than commercial flights. In 2018, for instance, the 5bn YouTube views of the viral song Despacito used roughly the same amount of energy it would take to heat 40,000 US homes for a year.

Large language models such as ChatGPT are some of the most energy-guzzling technologies of all. Research suggests, for instance, that about 700,000 litres of water could have been used to cool the machines that trained GPT-3 at Microsoft’s data facilities.

Additionally, as these companies aim to reduce their reliance on fossil fuels, they may opt to base their datacentres in regions with cheaper electricity, such as the southern US, potentially exacerbating water consumption issues in drier parts of the world.

Furthermore, while minerals such as lithium and cobalt are most commonly associated with batteries in the motor sector, they are also crucial for the batteries used in datacentres. The extraction process often involves significant water usage and can lead to pollution, undermining water security. The extraction of these minerals is also often linked to human rights violations and poor labour standards. Trying to achieve one climate goal, limiting our dependence on fossil fuels, can compromise another: ensuring everyone has a safe and accessible water supply.

Moreover, when significant energy resources are allocated to tech-related endeavours, it can lead to energy shortages for essential needs such as residential power supply. Recent data from the UK shows that the country’s outdated electricity network is holding back affordable housing projects.

In other words, policy needs to be designed not to pick sectors or technologies as “winners”, but to pick the willing by providing support that is conditional on companies moving in the right direction. Making disclosure of environmental practices and impacts a condition for government support could ensure greater transparency and accountability.

379 comments
  • The forefront of technology overutilizes resources?

    Always has been.

    Edit: Supercomputers have existed for 60 years.

    • The difference is that supercomputers by and large actually help humanity. They do things like help predict severe weather, help us understand mathematical problems, understand physics, develop new drug treatments, etc.

      They are also primarily owned and funded by universities, scientific institutions, and public funding.

      The modern push for ubiquitous corpo cloud platforms, SaaS, and AI training has resulted in massive pollution and environmental damage. For what? Mostly to generate massive profits for a small number of mega-corps, high-level shareholders and ultra-wealthy individuals, devalue and lay off workers, collect insane amounts of data to aid in mass surveillance and targeted advertising, and enshittify as much of the modern web as possible.

      All AI research should be open source, federated, and accountable to the public. It should also be handled mostly by educational institutions, not for-profit companies. There should be no part of it that is allowed to be closed source or proprietary. No government should honor any copyright claims or cyber law protecting companies' rights to not have their software hacked, decompiled, and code spread across the web for all to see and use as they see fit.

      • While I absolutely agree with everything you've stated, I'm not taking a moral position here. I'm just positing that the same arguments of concern have been on the table since the establishment of massive computational power regardless of how, or by whom, it was to be utilized.

        • The concern is for value though. Like, if I'm going to use a massive amount of power and water to compute, I should be considering value to humanity as a whole.

          AI is being sold as that, but so far, it's actually harming instead of helping. Supercomputing was helping pretty much right away.

          I suppose you could argue that if general supercomputing were invented now, it would be put to uses just as superficial. Maybe the context of personal computing, the internet, and corpo interests shapes that.

    • AI is on a completely different level of energy consumption. Consider that Sam Altman, of OpenAI, is investing in nuclear power plants to directly feed the next iterations of their AI models. That's a whole-ass nuclear reactor to feed one AI model, because the amount of energy we currently generate falls orders of magnitude short of what they want. We are struggling to feed these monsters; it is nothing like how supercomputers tax the grid.
