You can even see the decline in posts and votes before GPT became mainstream.
This definitely looks more like the search engines failing to get rid of those cheap copycats.
Yeah, suddenly you need to know what the language calls the operator in the context where you want to use it.
At which point, you probably don't need to Google the symbol!
On Google and on DuckDuckGo too. On DDG you can't get rid of the over-optimized websites anymore, even if you use -"website name". Luckily -site:address still works.
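For example (w3schools here is purely an illustration, swap in whatever site you're trying to bury):

    python list comprehension -"w3schools"          <- DDG no longer reliably excludes this
    python list comprehension -site:w3schools.com   <- still filtered out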
That's crazy. Google/DDG bloat from SEO websites had already driven me out a while ago, so I hadn't noticed. I've been using Kagi for a few months now, and I find I can trust my search results again. Being able to permanently downgrade or even block a given website is an awesome feature; I'd recommend it for that alone.
Wait WHAT? I was just asking on Discord the other day whether there existed a search engine that let you blacklist websites as a user setting. I need to curate out all the AI-written garbage from my results.
Hmm, not really used to the idea of paying for search, but I understand.
Is it good at filtering AI-generated sites and sites that are clearly copy-pasted? Or do you kind of have to identify those yourself and manually block them?
I think it's worth testing it with the free 100 searches. All you need is an email address (no credit card unless you're actually subscribing).
I've only been using it a few days, but I don't think it filters out AI-generated sites. You can, however, set a ranking per site (block, lower, normal, raise, pin), so you can have Stack Overflow prioritized and Quora blocked.
They have a ranking board of top sites in each category so you can go through it and set the rank of a bunch of sites upfront.
There's no specific AI detection at the moment, as far as I can tell. But it has "listicle" detection. If you search "best lawn mower", all those "the 5 best lawn mowers of 2023" websites with affiliate Amazon links get pooled into a compact Listicle section that you can just scroll past and ignore.
I think they're trying to implement a sort of "smart prediction" thing, where it assumes that if you go back, the link you clicked wasn't relevant, and so it tries to remove closely related results. That backfires when you get two results from the same page and click the wrong one, which makes looking up technical or programming-related issues a nightmare.
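If I had to guess at the logic, it's something like this (pure speculation on my part, not Kagi's actual code, all names made up):

    from dataclasses import dataclass

    @dataclass
    class Result:
        url: str
        domain: str
        score: float

    def similar(a: Result, b: Result) -> bool:
        # Crude stand-in: treat results from the same domain as "closely related".
        return a.domain == b.domain

    def rerank_after_bounce(results: list[Result], bounced: Result) -> list[Result]:
        # The user clicked `bounced` and quickly hit back, so demote it
        # *and* anything that looks closely related to it.
        for r in results:
            if r.url == bounced.url or similar(r, bounced):
                r.score *= 0.5  # arbitrary demotion factor
        return sorted(results, key=lambda r: r.score, reverse=True)

The problem is that `similar` check: the second result from the same page gets demoted along with the one you rejected, even when it was the one you actually wanted.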