First-of-its-kind US bill would address the environmental costs of the technology, but there’s a long way to go.
One assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes. It’s estimated that a search driven by generative AI uses four to five times the energy of a conventional web search. Within years, large AI systems are likely to need as much energy as entire nations.
Within years, large AI systems are likely to need as much energy as entire nations.
That doesn't sound like they're taking future hardware optimizations into account; we won't be using GPUs for this purpose forever (as much as Nvidia would like that to be true, lol).
Not to mention that increasing usage of AI means AI is producing more useful work in the process, too.
The people running these AIs are paying for the electricity they're using. If the AI weren't doing enough work to make it worth that expense, they wouldn't be running it. If the general goal is "reduce electricity usage," then there's no need to target AI, or any other specific use for that matter. Just make electricity in general cost more, and usage will go down. It's basic market forces.
I suspect that most people raging about AIs wouldn't want their energy bill to shoot up, though. They want everyone else to pay for their preferences.
Not that you don't have a point, but there is this theory, paradox or law or something (the name escapes me at the moment) which says that when technology advances, so do requirements. So what's going to happen is that when hardware is 100x more efficient, the fucking corporations will use 100x more, and nothing gets solved on the pollution front.
I am betting on renewable energy as the best way to combat the environmental issues we're facing.
Any power saved by hardware design improvements will be consumed by adding more transistors. You will not be seeing a power consumption decrease. Manufacturers of this hardware have been giving talks for the past two years calling for literal power plants to be built co-resident with datacenters.
That was my thought too. I heard a take that we may shift away from GPUs to purpose-built processing units as a way to keep making progress now that we're getting pretty small on the silicon scale. Neural nets may be one of these special "PUs" we end up seeing.
The original article doesn't specify a unit of time:
Most experts agree that nuclear fusion won’t contribute significantly to the crucial goal of decarbonizing by mid-century to combat the climate crisis. Helion’s most optimistic estimate is that by 2029 it will produce enough energy to power 40,000 average US households; one assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes.
Based on context clues, it's probably consumption per year.
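Either way, "the energy of 33,000 homes" is really a rate, so you can turn it into an average power draw regardless of the time unit. A rough Python back-of-the-envelope, assuming the ~30 kWh/day average US household figure cited further down the thread (all numbers here are rough assumptions, not from the article):

```python
# Rough back-of-the-envelope on what "the energy of 33,000 homes" works out to
# as a rate. Assumes ~30 kWh/day per average US household (the figure cited
# further down in this thread); treat every number as an order-of-magnitude guess.

HOUSEHOLD_KWH_PER_DAY = 30   # assumed average US household consumption
HOMES = 33_000

daily_kwh = HOUSEHOLD_KWH_PER_DAY * HOMES        # ~990,000 kWh per day
avg_power_mw = daily_kwh / 24 / 1000             # average draw in MW
annual_gwh = daily_kwh * 365 / 1_000_000         # per-year version of the same rate

print(f"~{daily_kwh:,} kWh/day, ~{avg_power_mw:.0f} MW average, ~{annual_gwh:.0f} GWh/yr")
# -> ~990,000 kWh/day, ~41 MW average, ~361 GWh/yr
```

So whether you quote it per day or per year, it's the same continuous draw of roughly 40 MW.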
Ok, I'll do it my way (though yours was interesting!):
This article says that an average U.S. house consumes about 30 kWh per day. Let's round it down to 24 kWh, so we can say that 1 household draws 1 kW on average (1 kWh per hour).
According to this article, 1 Joule = 0.238902957619 kcal. And 1 watt = 1 joule per second. So, 1 kW = 1,000 joules per second, or about 239 kcal per second. You said that 1 bagel is about 264 kcal, so to simplify things, let's round it down to 239 kcal (and yes, food calories are really kcals - go figure), so 1 kW = 1 bagel (per second).
So, 1 household consumes 1 bagel per second, or 3,600 bagels per hour. In my opinion, that sounds excessive, so maybe my math is not the best. But let's assume I did everything correctly.
So, ChatGPT consumes the equivalent of the energy consumed by 33,000 homes. So, ChatGPT consumes 3,600 bagels times 33,000 = 118,800,000 bagels per hour. That's almost 119 million bagels per hour!
You came up with 2.4 billion bagels, but we don't know if that's per hour, per day or what. Let's divide both numbers and see if that gives us a clue: 2.4 billion divided by 119 million is roughly 20, which is close-ish to 24. So chances are, your calculations are bagels per day.
Again, that's a lot of bagels!!!
Edit: As for the brain, this site says that the brain consumes 20 watts, or 0.0056 kW per hour. We established that 1 kW = 1 bagel, so 0.0056 kW is, well, 0.0056 bagels. If we multiply that by 3,600, we get 20.16. So the brain consumes 20 bagels per hour. That can't be right. I wish I could eat as many as 20 bagels per hour just to power my brain - I'd be a happy man!
But anyway, 20 bagels per hour is definitely a lot less than 119 million bagels per hour.
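For anyone who wants to retrace that chain, here's a literal transcription of the steps above in Python. It takes the stated 1 kW ≈ 239 kcal/s ≈ 1 bagel/s equivalence at face value; the replies below revisit that conversion factor.

```python
# Literal transcription of the arithmetic above, taking the stated
# "1 kW ~ 239 kcal/s ~ 1 bagel per second" equivalence at face value.
# (The replies below revisit that conversion factor.)

BAGELS_PER_KW_SECOND = 1    # premise above: 1 kW ~ 239 kcal/s, 1 bagel ~ 239 kcal
HOUSEHOLD_KW = 1            # premise above: ~24 kWh/day rounded to a 1 kW average
HOMES = 33_000

bagels_per_hour_per_home = BAGELS_PER_KW_SECOND * HOUSEHOLD_KW * 3600
bagels_per_hour_chatgpt = bagels_per_hour_per_home * HOMES

print(bagels_per_hour_per_home)          # 3600
print(f"{bagels_per_hour_chatgpt:,}")    # 118,800,000
```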
Since one kilowatt is equal to 3,412.14245 Btu per hour:
30 kWh/day x 365 days x 3,412 Btu/kWh = 37,361,400 Btu per year
That's about half the value I found for 2015. Does EcoFlow have more current data, and are houses twice as efficient? Maybe. They're also trying to sell something, so maybe it's based on data from their products. They don't mention where they got it from.
The welovecycling conversion is off by a factor of 1,000 (maybe the kilocalorie threw them off?).
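Both checks are easy to reproduce; here's a short Python sanity check of the Btu figure and the joules-to-calories conversion (using 1 cal = 4.184 J):

```python
# Sanity checks on the two conversions discussed above.

# (a) 30 kWh/day as an annual Btu figure, using 1 kWh = 3,412 Btu
annual_btu = 30 * 365 * 3412
print(f"{annual_btu:,} Btu/yr")             # 37,361,400 Btu/yr, as above

# (b) joules to calories: 1 J ~ 0.239 cal, i.e. 0.000239 kcal,
#     so quoting it as 0.239 kcal is exactly a factor-of-1000 slip
cal_per_joule = 1 / 4.184
print(round(cal_per_joule, 3))              # 0.239 (small calories, not kcal)
print(round(3.6e6 * cal_per_joule / 1000))  # ~860 kcal in one kWh
```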
Nice! I knew my math was off by three zeros somewhere, but it was late at night, and the exercise per se was fun enough so I wrapped it up.
Your corrections make everything make more sense, at least on the brain side. Considering the 2000 calories per day recommendation for the whole body, which is 83 cals per hour, the brain consuming 17.2 cals per hour, or about 20% of it, sounds about right - though from another point of view, 20% of the whole energy intake sounds like a lot! The brain weighs about 3 lbs, so in an adult male weighing 190 lbs, that's roughly 1.5%.
1.5% of the "cell population" consumes 20% of the total energy. That's some Occupy Wall Street stuff right there!
And bringing it back to bagels, 17.2 cals represents less than 10% of a bagel. So, a bagel bite. The brain consumes a bagel bite per hour. Which is wild given how complex it is (at least, complex to us.)
Finally, I said "at least on the brain side" because a household requiring just 3.6 bagels per hour sounds quite low. You mean to tell me that if I burn 4 bagels, I can power my TV, my fridge and my AC for a full hour? Now I know what to do with expired bagels!!!
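If it helps, here's the corrected bagel math in one place, as a quick Python sketch (same assumptions as before: a ~239 kcal bagel and a ~1 kW household, with 1 cal = 4.184 J):

```python
# Redone bagel math with the corrected conversion (1 J ~ 0.239 cal, not kcal).

KCAL_PER_KWH = 3.6e6 / 4184     # ~860 kcal in one kWh
KCAL_PER_BAGEL = 239            # the rounded-down bagel from earlier

household_bagels_per_hour = 1.0 * KCAL_PER_KWH / KCAL_PER_BAGEL   # ~1 kW household
brain_kcal_per_hour = 20 * 3600 / 4184                            # 20 W brain
brain_bagels_per_hour = brain_kcal_per_hour / KCAL_PER_BAGEL

print(round(household_bagels_per_hour, 1))  # 3.6 bagels per hour
print(round(brain_kcal_per_hour, 1))        # 17.2 kcal per hour
print(round(brain_bagels_per_hour, 2))      # 0.07 of a bagel -- the "bagel bite"
print(round(100 * brain_kcal_per_hour / (2000 / 24)))  # ~21% of a 2000 kcal/day intake
```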
Anyway. Thanks for humoring me! Awesome exercise and awesome discussion. It was fun!
I'm not sure future optimization wouldn't just bring more demand. At least, that's what my hardware and apps have shown over the past couple of decades. If another startup had the ability to train with an additional billion or trillion parameters, I'm sure they would. It also leads to a wider window for poor optimization.