While the mass adoption of AI has transformed digital life seemingly overnight, regulators have been asleep on the job when it comes to curtailing AI data centers’ drain on energy and water resources.
The annoying part is how many mainstream tech companies have ham-fistedly crammed AI into every crevice of every product. It isn't necessary, and I'm not convinced it produces a "better search result" for 90% of the crap people throw into Google. Basic indexed searches are fine for most use cases.
AI is just what crypto bros moved on to after people realized crypto was a scam. It's immature technology that burns absurd amounts of energy on a solution in search of a problem, being pushed as the future, all for the prospect of making more money. Except this time it's being backed by major corporations, because it means fewer employees they have to pay.
I skimmed the article, but it seems to assume that Google's LLM uses the same architecture as everyone else's. I'm pretty sure Google runs this on its own TPU chips instead of the regular GPUs everyone else uses, and TPUs are generally pretty energy-efficient.
That, and they don't seem to consider how many responses are simply cached for identical questions. A lot of Google queries are going to be identical anyway, because search suggestions funnel people into the same phrasing of a question.
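To make that caching point concrete, here's a toy sketch (my own illustration, assuming a hypothetical run_llm() inference call; this is not Google's actual serving stack) of how a normalized-query cache lets one expensive generation serve every identical question after it:

```python
from functools import lru_cache

def run_llm(query: str) -> str:
    # Hypothetical stand-in for the expensive inference call.
    return f"generated answer for: {query}"

def normalize(query: str) -> str:
    # Collapse trivial differences in casing and whitespace so
    # "How tall is Everest?" and "how tall is  everest?" share one entry.
    return " ".join(query.lower().split())

@lru_cache(maxsize=100_000)
def cached_answer(normalized_query: str) -> str:
    # Only a genuinely new question pays the full generation cost.
    return run_llm(normalized_query)

def answer(query: str) -> str:
    return cached_answer(normalize(query))
```

If suggestion-driven phrasing really does funnel people into the same questions, the hit rate on a cache like this could be high, which would pull the average energy per AI answer well below the worst-case per-generation figure.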
AI's rapid growth has transformed digital life, but its significant environmental impact remains largely unchecked.
AI-powered features can consume up to 10 times more electricity than traditional searches, potentially equating to a country's power usage.
The proliferation of energy-intensive data centers powering AI is outpacing the electric grid's capacity, forcing utilities to maintain fossil fuel plants for reliability.
Estimates suggest AI could account for 9% of U.S. energy demand by 2030, substantially contributing to climate change.
Lack of industry transparency and mandatory reporting makes quantifying AI's full environmental toll difficult.
Tech companies negotiate discounted utility rates, shifting costs to ratepayers and reducing incentives for energy efficiency.
Government regulation has been slow and industry-influenced, focusing on hypothetical future risks over current, tangible harms.
The burden of AI's environmental impact disproportionately falls on Global South communities where data centers are located.
Tech companies resist mandatory disclosures, prioritizing profits over sustainability while the public bears the physical costs.
The confounding part is that when I do get offered an "AI result", it's basically identical to the excerpt from the top "traditional search" result. It spent a fair amount more time and energy just to repeat what the top of the search said anyway. I've never seen the AI overview be more useful than the top snippet.
To be fair, it was never "hidden", since all of the top 5 decided that GPUs were the way to go for this monetization.
Guess who is waiting on the other side of this idiocy with a solution? AMD, with cheap FPGAs that will do all this work at 10x the speed and a similar reduction in energy, at a fraction of the cost and hassle for cloud providers.
If only they did what DuckDuckGo did: have it pop up only in very specific circumstances, drawing primarily on current summarized information from Wikipedia on top of its existing context, and let the user turn it off completely with a single settings toggle.
I find it useful in DuckDuckGo because it's out of the way, unobtrusive, and only pops up when necessary. I've tried using Google with its search AI enabled, and it was the most unusable search engine I've used in years.
I'm genuinely curious where their penny-pinching went. All of these tech companies shove ads down our throats and steal our privacy, justifying it by saying they operate at a loss and need to increase income. But suddenly they can afford to spend huge amounts on some shit that won't bring them any more income. How do they justify that?
I'm surprised it's only 10x. Running a prompt through an LLM takes quite a bit of energy, so I guess even regular searches take more energy than I thought.
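For a rough sense of scale, here's the back-of-envelope arithmetic. The per-search energy and query volume below are round numbers I'm assuming purely for illustration; only the ~10x multiplier comes from the article:

```python
# Back-of-envelope only: these per-query and volume figures are assumed
# round numbers for illustration, not measurements from the article.
TRADITIONAL_SEARCH_WH = 0.3                  # assumed Wh per classic indexed search
AI_SEARCH_WH = TRADITIONAL_SEARCH_WH * 10    # the article's ~10x multiplier
QUERIES_PER_DAY = 8_500_000_000              # assumed global daily search volume

gwh_per_day_traditional = TRADITIONAL_SEARCH_WH * QUERIES_PER_DAY / 1e9
gwh_per_day_ai = AI_SEARCH_WH * QUERIES_PER_DAY / 1e9

print(f"traditional search: ~{gwh_per_day_traditional:.1f} GWh/day")
print(f"AI-assisted search: ~{gwh_per_day_ai:.1f} GWh/day")
```

Even with these toy numbers, the gap is on the order of tens of gigawatt-hours per day, which is the kind of difference that makes the "as much as a country" comparisons plausible.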
I wonder what the total power consumption of reaching the same information with a regular search is: clicking through multiple links, finding the right page, and extracting the relevant parts, including the energy expended by the human performing the task and everything surrounding the activity.
There are real concerns surrounding AI, I wonder if this is truly one of them or if it’s just poorly researched ragebait.
I would point out that Google has been "carbon neutral" with its data centers for quite some time, unlike others who still rape the environment (ahem, AWS).
This is terrible. Why don't we build nuclear power plants, roll out a carbon tax, and create incentives for companies to generate their own energy from renewables?
You know, the shit we should have been doing before I was born.