Hot take: The reason why so many people think ChatGPT can be sentient or even divine is that capitalism has atomized us so badly, many of us forgot what it's like to talk to an actual human being.
On a deeper level than small talk, of course.
The other day someone told me that their partner used ChatGPT instead of going to therapy.
We’re all so cooked.
So many people are going to develop/exacerbate mental illness from doing that.
Turbo cooked
Probing the quicksand with a rod made of quicksand
Sounds safe to me
The point of contention regarding therapy for me is that I’m literally paying for an impersonal conversation in which I express my deepest insecurities to someone who most likely doesn’t give a shit.
I don’t see how AI fixes that, but I also don’t understand why it can’t help if your relationship with your therapist is supposed to be a fundamentally clinical one anyway.
"person I pay to pretend to give a shit about my problems" is such a reductive and unhealthy view of therapy that it should be immediately apparent why therapy has not been helpful and why you're unable to see why an Autocorrect word regurgitation machine wouldn't be helpful.
If you have an accountant, is that a person you pay to pretend to give a shit about your taxes? Is an orthopedic surgeon someone you pay to pretend to give a shit about your broken leg? You should be able to recognize why this would be an unhealthy and unhelpful framing device.
10% odds the problem is that you haven't found the right therapist. 90% odds you're building up mental barriers that are actively preventing you from engaging with the therapeutic model in a beneficial way. Acknowledging this and working to overcome these barriers was life-changing for me and has resulted in an astonishing level of change not only in how effective talk therapy has been, but also in how I feel and think about myself, particularly with regard to my mental and physical health.
The problem is that AI absolutely does not provide a clinical relationship. If your input becomes part of the LLM's context (which it has to in order to have a conversation), it will inevitably start mirroring you in ways you might not even notice, something humans commonly (and subconsciously) respond to with trust and connection.
Add to that the fact that they are designed to generally agree with and enable whatever you tell them, and you basically have a machine that does everything it can to reinforce a connection to itself and validate the parts of yourself you have concerns about.
There are already so many stories of people spiralling because they started building rapport with an LLM, and it's hard to imagine a setting where that is more likely to occur than when you use one as your therapist.
I find that it's helpful being able to talk to someone you can't disappoint. Otherwise I will always lie to make them feel better about how I'm doing.
can’t imagine why trusting the agreeable electrified foolin’ machine created by sociopaths can’t not help
Just wait for the conversation where they come to you telling you their partner divorced them because "ChatGPT told them to."