I've started relying more on AI-powered tools like Perplexity for many of my search use cases for this very reason - basically every result needs a round of pre-filtering before it's useful.
Unfortunately the spam arms race has destroyed any chance of search going back to the good ole days. SEO and AI content farms mean we'll need a whole new system to categorize webpages, as well as to filter out human-sounding but low-effort spam.
Point being, it's no longer enough to find a page that's relevant to the topic - it has to be relevant and actually deliver information, and currently the only feasible tech that can differentiate the two is LLMs.
It would be interesting though to use an LLM to spot AI/SEO crap and add whole domains to a search blacklist. In that case we wouldn't need AI to do the actual search: the blacklist could just be a database maintained on the search engine's side and offered to end users (kinda like explicit content filters).
I'd call that option "Bullspam filter" and leave it on "moderate" by default.
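A minimal sketch of how such a filter might work - everything here is invented for illustration (names, thresholds), and the LLM scoring call is stubbed out with a toy keyword heuristic where a real system would prompt a model:

```python
# Toy "Bullspam filter" sketch: score each crawled page for spamminess,
# then blacklist any domain where enough of its pages look like spam.
# The blacklist could be shipped to users like an explicit-content filter.
from collections import defaultdict
from urllib.parse import urlparse

SPAM_THRESHOLD = 0.7     # per-page score above which a page counts as spam
DOMAIN_SPAM_RATIO = 0.5  # fraction of spam pages that blacklists a domain

def llm_spam_score(page_text: str) -> float:
    """Placeholder for an LLM call. A real filter would send the page to
    a model and ask for a spam likelihood; here we fake it by measuring
    keyword stuffing, purely so the sketch runs end to end."""
    words = page_text.lower().split()
    if not words:
        return 0.0
    buzz = sum(w in {"best", "top", "ultimate", "2024"} for w in words)
    return min(1.0, buzz / len(words) * 5)

def build_blacklist(pages: dict[str, str]) -> set[str]:
    """pages maps URL -> page text; returns the domains to filter out."""
    hits: dict[str, int] = defaultdict(int)
    totals: dict[str, int] = defaultdict(int)
    for url, text in pages.items():
        domain = urlparse(url).netloc
        totals[domain] += 1
        if llm_spam_score(text) > SPAM_THRESHOLD:
            hits[domain] += 1
    return {d for d in totals if hits[d] / totals[d] >= DOMAIN_SPAM_RATIO}
```

Aggregating per domain rather than per page is the point: one false positive on a page doesn't nuke a site, but a domain that's mostly spam gets filtered wholesale, which is cheap to distribute as a static list.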
This is one solution to the issue, and it seems silly you are being downvoted for it.
Google became what it became, and years of SEO cat-and-mouse play have pushed the game to new heights. Spammers obviously target Google rather than its competitors, for now.
If we could have perfect search results, it would benefit Google as well.