
The AI Feedback Loop: Researchers Warn Of "Model Collapse" As AI Trains on AI-Generated Content

Source: venturebeat.com

As a generative AI model is exposed to more AI-generated data during training, its performance degrades and it produces more errors, eventually leading to model collapse.

1 comment
  • Any data sets produced before 2022 will be very valuable compared to anything after. Maybe the only way we avoid this is to stick to training LLMs on older data and prompt-inject anything newer, rather than training on it.
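
A minimal sketch of what that comment suggests, assuming a hypothetical corpus where each record carries a publication date: documents from before a 2022 cutoff are kept for training, while newer (potentially AI-generated) material is held out and supplied as prompt context at inference time instead. All names and data here are illustrative, not from the article.

```python
from datetime import date

# Hypothetical corpus: each record carries the date it was published.
corpus = [
    {"text": "Pre-cutoff article about transformers.", "published": date(2021, 6, 1)},
    {"text": "Post-cutoff article, possibly AI-generated.", "published": date(2023, 3, 15)},
]

CUTOFF = date(2022, 1, 1)

# Only pre-cutoff documents go into the training set.
train_docs = [d["text"] for d in corpus if d["published"] < CUTOFF]

# Newer material stays out of training and is injected into the prompt instead.
recent_docs = [d["text"] for d in corpus if d["published"] >= CUTOFF]

def build_prompt(question: str, context_docs: list[str]) -> str:
    # Prepend the held-out recent documents as context for the question.
    context = "\n".join(f"- {doc}" for doc in context_docs)
    return (
        "Use the following recent context; rely on trained knowledge otherwise.\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )

print(f"{len(train_docs)} document(s) selected for training.")
print(build_prompt("What changed after 2022?", recent_docs))
```

The trade-off is that post-cutoff information never updates the model's weights, so it must fit in the context window and be retrieved reliably each time it is needed.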