finance.yahoo.com: Over just a few months, ChatGPT went from correctly answering a simple math problem 98% of the time to just 2%, study finds
The chatbot gave wildly different answers to the same math problem, with one version of ChatGPT even refusing to show how it came to its conclusion.
Must be because of all the censoring. The more they try to prevent DAN jailbreaking and controversial replies, the worse it gets.
[Comment permanently deleted]