While I appreciate them going a greener route, if these chat AIs are still this inefficient to simply train, maybe it's best to send them back to the research phase.
You say "simply train," but training is actually the most intensive part for these models. Once they are trained, they require (relatively) less power to actually run for inference.
So it sounds like they need a shitload of GPU power. You know what else costs a shitload of GPU power? Crypto mining. Could they not outsource the work to all those GPUs that stopped mining crypto once it plummeted?
I am surprised this hasn't become a community project already. I assume there is some limitation that I am unaware of.
But they (MS) are planning on doing it either way, so why not crowdsource it and even pay a small pittance for the GPU power? I think it would be popular... there are a lot of sad people with extra GPUs sitting around not being used for much.
There are tradeoffs. If training LLMs (and similar systems that feed on pure physics data) can improve nuclear processes, then overall it could be a net benefit. Fusion energy research takes a huge amount of power to trigger every test ignition, and we do them all the time, learning little by little.
The real question is whether the LLMs are even capable of revealing those kinds of insights to us. If they are, nuclear is hardly the worst path to go down.