AI model collapse is not what we paid for
In an AI model collapse, AI systems trained on their own outputs gradually lose accuracy, diversity, and reliability. This happens because errors compound across successive model generations, distorting data distributions and producing "irreversible defects" in performance. The final result? As a 2024 Nature paper put it, "The model becomes poisoned with its own projection of reality."
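A minimal sketch of that compounding, assuming nothing beyond the mechanism described above: each "generation" refits a simple Gaussian model to samples drawn from the previous generation's fitted model. The sample size and generation count here are made up for illustration; the point is that finite-sample estimation error feeds back on itself, and the fitted variance tends to drift toward zero.

```python
# Hypothetical toy simulation of recursive-training collapse (not code from
# the Nature paper): generation t fits a Gaussian to samples drawn from
# generation t-1's fitted Gaussian. Estimation error from the finite sample
# feeds back into the next generation, and the variance decays toward zero.
import numpy as np

rng = np.random.default_rng(42)

mu, sigma = 0.0, 1.0    # generation 0: the "real" data distribution
n = 10                  # assumed (small) training-sample size per generation

for gen in range(1, 201):
    samples = rng.normal(mu, sigma, n)         # "train" on the previous model's output
    mu, sigma = samples.mean(), samples.std()  # refit; errors compound across generations
    if gen % 50 == 0:
        print(f"generation {gen:3d}: mu = {mu:+.4f}, sigma = {sigma:.4f}")
```

The same dynamic shows up with richer models: the low-probability tails of the distribution get lost first, which is exactly the loss of diversity the paper describes.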
A remarkably similar thing happened to my aunt who can't get off Facebook. We try feeding her accurate data, but she's become poisoned with her own projection of reality.
It's such an easy thing to predict, too. Even done perfectly, it would at best maintain an unstable equilibrium and just keep the same output quality.
Unstable, yes. Equilibrium... no.
She sometimes maintains coherence for several responses, but at a certain point, the output devolves into rants about how environmentalists caused the California wildfires.
These conversations consume a lot of energy and provide very limited benefit. We're beginning to wonder if the trade-offs are worth it.
What all this does is accelerate the day when AI becomes worthless.
It was always worthless. Or, at least, it was always worthless to think that LLMs were a substitute for reasoning AI, which is what many people appear to have been suckered into believing.
Yeah… I have tried LLMs, and they have horrible hallucinations. For instance, when I tried to "teach" one about Hit Selecting in Minecraft, I used an example of a player who uses it (EREEN), and it kept corrupting the name to EREEEN. Even when I clarified, it kept doing it, forever.
Google Search has been going downhill for way longer than a few months. It's been close to a decade now.
TBF, SEO and other methodologies that game the rankings muddy the waters and make it harder to get to what you are looking for.
That is not the problem, though. Google used to just give you the results containing what you searched for; the problem started when they tried to be "smarter" than that.
Look at how they give results for YouTube: maybe three relevant ones, and then it's back to suggestions.
Because Google allows them to. They could easily ignore these kinds of tricks but choose not to.
But could I pay for model collapse? I'd be down for that.
Could also be the AI crawler flood & the responses from website administrators 🤔