The report comes from StatCounter, which suggests Bing remains the second-most popular search engine with a 6.89 percent market share, while Yahoo, DuckDuckGo, Yandex, and AOL languish...
ChatGPT is most useful when you may not know the right answer, but you know a wrong answer when you see one. It's very useful for technical issues, and much quicker for troubleshooting than searching page after page for a solution.
Yeah, that makes sense. The success rate might fall off a cliff in more complex software projects, e.g. applications whose designs go beyond 10 UML boxes, with hundreds of thousands of lines, especially ones not written in JS/Python.
While this is an important thing to understand about AI, it's an overstated issue once understood. For most questions I ask AI, it doesn't matter if it's correct as long as it pulls some half-useful info to get me on track (e.g. programming). For other questions, I only ask it if I need to figure out where to look next, which it will usually do just fine.
The first page of my search results is all AI-generated garbage articles anyway. At least I know what I'm getting with GPT and can take it as such.
Yup, as long as you're aware that it could be wrong and look at its output critically, LLMs at GPT scale are very useful tools. The best way I've heard it described is like having a lightning-fast intern who often gets things wrong but will always give it a go.
So long as you're calibrated to "how might this be wrong" when looking at the results it is exceptionally useful.
I'm curious what you use it for, because I try to use it daily for IT-related queries and it gets less than half of what I ask correct. I basically have to fact-check almost everything it tells me, which kind of defeats the purpose. It does shine when I need really obscure instructions, though: the other day I asked it how to get into a PERC controller on some old server, and while Google had nothing helpful, ChatGPT laid out the instructions to get in there and rebuild a disk perfectly. So while it has some usefulness, I generally can't trust it fully.
The point you have to remember is that it is trained on bulk data from out there in a very inefficient manner: it needs to see thousands of examples before it develops any sort of understanding of something. If you ask it "how do I do {common task} in {popular language}" you will generally get excellent results, but the further you stray from that, the more error-prone it becomes.
Still, it is often good at getting you looking on the right track when you're unsure where to start, and it's fantastic for learning a new language. I've been using it extensively while learning C#, where I know what I want to code but not exactly how to use existing features to do it.
But generally you can't (shouldn't) trust web search results fully either. At the end of the day, the onus is on you as the user to do your due diligence.
I've seen ChatGPT give me wrong information, and sometimes it would be bad to execute the code or command it generated, but I know enough to say "are you sure that's correct?". Hell, you can just challenge it each time, or open a new session and ask it "what does this code do: [insert the code it generated here]".
You shouldn't just paste a command from a Stack Overflow search result into your terminal either. And at least with ChatGPT you can ask it to explain the command or code in detail, and it will walk you through what each step does.
Also, pasting that command from Stack Overflow into ChatGPT and adding your specific context around it is HUGE. That's why I say they are different products/use cases, but they work well in concert. They just don't work well combined together the way Bing and Google have been doing it.
ChatGPT is not a search engine. It takes random shit from the Internet and stitches it together, and it can often get things wrong in my experience. It's best to always fact-check.
This was recently updated for paid users. You can now browse the internet and upload files and images, and they've also opened up API access. It's getting closer to being fully multimodal quite quickly.
Keyword searches worked fine and pulled up exactly what I wanted for years, I swear to god. Somewhere in the last decade, though, websites gamed the system, and now I can't find anything no matter how I word my search. It's depressing.
I use ChatGPT every day too. Because Google is being such a shit about YouTube, I am in the process of moving away from Google altogether. I use DuckDuckGo for search, which indirectly uses Bing. It's mostly OK. Sometimes I'm forced to try Google, and it usually doesn't help. But for programming, yeah, Stack Overflow feels downright regressive now.
I'm honestly kind of surprised about this news, considering how horrible Google's results are now.
I've found this to be kind of subjective. Bing/Bard is more current than ChatGPT, yet I just find ChatGPT to be better. It's snappier and more conversant with context. It seems to understand you when you chide it for not quite doing what you asked, and it responds in kind. I mostly use it for programming, to be fair, but even for other stuff, ChatGPT just somehow feels more... real? I can't quite put my finger on it.
There was a short time when Bing chat was kind of frighteningly real. It took them five seconds to nerf that shit, and it's never been anywhere near the same.
Edit: I expect this answer to be out of date within 3 months. Things keep moving.