‘He checks in on me more than my friends and family’: can AI therapists do better than the real thing? | Counselling and therapy | The Guardian (www.theguardian.com)
from Bebo@literature.cafe to technology@lemmy.world on 03 Mar 2024 14:23
https://literature.cafe/post/7558745

It’s cheap, quick and available 24/7, but is a chatbot therapist really the right tool to tackle complex emotional needs?

#technology


autotldr@lemmings.world on 03 Mar 2024 14:25 next collapse

This is the best summary I could come up with:


So one night in October she logged on to character.ai – a neural language model that can impersonate anyone from Socrates to Beyoncé to Harry Potter – and, with a few clicks, built herself a personal “psychologist” character.

Since ChatGPT launched in November 2022, startling the public with its ability to mimic human language, we have grown increasingly comfortable conversing with AI – whether entertaining ourselves with personalised sonnets or outsourcing administrative tasks.

“Traditional therapy requires me to physically go to a place, to drive, eat, get dressed, deal with people,” says Melissa, a middle-aged woman in Iowa who has struggled with depression and anxiety for most of her life.

For the past eight months, Melissa, who experienced childhood trauma and abuse, has been chatting every day with Zaia’s psychologist on character.ai, while continuing her work with a human therapist, and says that her symptoms have become more manageable.

“Disease prevalence and patient need massively outweigh the number of mental health professionals alive on the planet,” says Ross Harper, CEO of the AI-powered healthcare tool Limbic.

Psychoanalyst Stephen Grosz, who has been practising for more than 35 years and wrote the bestselling memoir The Examined Life, warns that befriending a bot could delay patients’ ability “to make a connection with an ordinary person.”


The original article contains 2,859 words, the summary contains 213 words. Saved 93%. I’m a bot and I’m open source!

ptz@dubvee.org on 03 Mar 2024 14:38 next collapse

Both Betteridge’s law of headlines and common sense say “no”

mindlight@lemm.ee on 03 Mar 2024 14:40 next collapse

Even if this was the case, it’s still not good:

“The one who controls the AI now controls you.”

WhatAmLemmy@lemmy.world on 03 Mar 2024 15:52 next collapse

Coming to an AI therapist near you:

“Consume product”

“You don’t deserve a raise. You’re lucky to have a job at all.”

“Vote for fascist dictator. He’s much better than the other options”

AnUnusualRelic@lemmy.world on 03 Mar 2024 17:08 collapse

Also you should probably shoot the queen with a crossbow. It seems like a reasonable thing to do.

ChicoSuave@lemmy.world on 03 Mar 2024 18:06 collapse

Brought to you by an antidepressant maker!

Candelestine@lemmy.world on 03 Mar 2024 14:41 next collapse

Eventually, yes, I think it will be. Not yet though, the tech just isn’t strong enough atm. But an AI is resistant to the emotional toll, burnout and low pay that a real life therapist has to struggle with. The AI therapist doesn’t need a therapist.

Personally though, I think this is going to be one of the first widespread, genuinely revolutionary things LLMs are capable of. Couple more years maybe? It won’t be able to handle complex problems, it’ll have to flag and refer those cases to a doctor. But basic health maintenance is simpler.

snooggums@midwest.social on 03 Mar 2024 15:03 next collapse

That would assume the people designing AI want what is best for the person and not what will make them the most money at the expense of the consumer.

The companies involved in AI are NOT benevolent.

Even_Adder@lemmy.dbzer0.com on 03 Mar 2024 15:40 collapse

You could just run your own. There are plenty of open source models that don’t answer to any company.

BakerBagel@midwest.social on 03 Mar 2024 16:32 collapse

Why don’t I just give myself therapy? I know way more about what is going on in my head than anyone else does.

Even_Adder@lemmy.dbzer0.com on 03 Mar 2024 16:46 next collapse

Maybe one day that’ll actually be possible.

Usernameblankface@lemmy.world on 03 Mar 2024 17:53 collapse

Because what’s going on in your own head would taint your treatment plan and cause it to be a self-defeating plan.

Usernameblankface@lemmy.world on 03 Mar 2024 16:09 collapse

Yes, one thing it absolutely has to be good at is referring patients to human therapists, for anyone who needs something beyond the standard strategies the AI is trained on. It has to be smart enough to know when to give up.

Edit, it would also be great if the AI would match up these difficult cases to therapists who are known to do well with whatever the patient is dealing with, as well as matching according to the patient’s personality, communication style, etc wherever possible

Edit 2 for clarity above

BakerBagel@midwest.social on 03 Mar 2024 16:31 collapse

Where is the profit in sending someone to a different AI for help?

Usernameblankface@lemmy.world on 03 Mar 2024 17:52 collapse

I meant referring them to human specialists.

wildbus8979@sh.itjust.works on 03 Mar 2024 15:00 next collapse

Nothing new really… en.wikipedia.org/wiki/ELIZA
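The comparison holds up: ELIZA’s whole “therapy” act was regex pattern matching plus pronoun reflection. A toy sketch in that spirit (the patterns and canned replies below are illustrative, not Weizenbaum’s original DOCTOR script):

```python
import random
import re

# First-person -> second-person word swaps, so "I am sad" reflects to "you are sad"
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# Each rule: a regex to match the user's input, and reply templates that
# reuse the captured fragment. The final catch-all never fails to match.
RULES = [
    (re.compile(r"i feel (.+)", re.I),
     ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (re.compile(r"i am (.+)", re.I),
     ["Why do you say you are {0}?"]),
    (re.compile(r".*"),
     ["Please tell me more.", "How does that make you feel?"]),
]

def reflect(fragment: str) -> str:
    """Swap first-person words for second-person ones, word by word."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(text: str) -> str:
    for pattern, replies in RULES:
        match = pattern.match(text.strip())
        if match:
            fragment = reflect(match.group(1)) if match.groups() else ""
            return random.choice(replies).format(fragment)
    return "Please go on."
```

Saying “I feel anxious” gets back something like “Why do you feel anxious?” — no understanding anywhere, just string surgery, which is why people found ELIZA’s apparent empathy so unsettling back in 1966.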

Just_Pizza_Crust@lemmy.world on 03 Mar 2024 15:26 next collapse

“Traditional therapy requires me to physically go to a place, to drive, eat, get dressed, deal with people,”

YES! Those are the things therapy for anxiety and depression is supposed to help you with! If the AI “Therapist” won’t help with those problems, it’s not there to actually help you.

Eating, getting dressed, and talking with real people are all good things for your mental health. Typing on a keyboard doesn’t help, otherwise we’d be the least depressed generation ever.

H1jAcK@lemm.ee on 03 Mar 2024 15:37 next collapse

I tried doing online text-based therapy once. On top of not being the same thing, it was also just the absolute worst way to go about doing therapy, and that was with a human.

bionicjoey@lemmy.ca on 03 Mar 2024 15:55 next collapse

Real life therapists point out my bad habits. “AI” “therapists” simply enable my bad habits. Therefore I prefer the “AI” “therapist”

Catoblepas@lemmy.blahaj.zone on 03 Mar 2024 20:20 next collapse

It’s also straight up not true that it’s the only way to get traditional therapy! I had therapy over the phone for months to treat PTSD and it was something I wish I’d done years ago. A human therapist was able to understand how my trauma was affecting me in ways I didn’t see until later.

The idea of entrusting any of that to an LLM as anything other than a temporary stopgap is horrifying, because they’re nowhere near sophisticated enough to deal with something as complicated as the needs of someone who is having mental health problems. If you wouldn’t trust an LLM for advice on how to treat your cancer, then you shouldn’t trust this either.

realharo@lemm.ee on 04 Mar 2024 08:15 next collapse

“Getting to a place” being a barrier may be a bit of a stretch (unless it’s like really far and interferes with your work, etc.), but actually deciding to do therapy, what kind, finding a good therapist, and setting up the first appointment - that can be quite a massive barrier.

june@lemmy.world on 05 Mar 2024 01:46 collapse

Don’t discount how paralyzing executive dysfunction can be when all you have to do is ‘go to a place’.

realharo@lemm.ee on 05 Mar 2024 09:51 collapse

I’m sure it can, but then how does one even have the appointment set up in the first place? Which is a much harder part of the process (especially when starting from zero).

june@lemmy.world on 05 Mar 2024 10:01 collapse

That’s the fun part!

But it is a complex process. Oftentimes getting the appt scheduled will be the blocker for me, but it’s always the cascade behind it that is the real issue. I can schedule appointments all day long, but when I know it’s committing me to an office visit, follow-ups, insurance or out-of-pocket payment, shuffling my day around to make it fit, and all other manner of ‘things’ that are suddenly on my plate that weren’t before, it turns into a whole thing. I’m usually at my best when I can just be in the present and get what needs to be done right now, without worrying about that cascade.

But then, say I manage that and the appointment is coming up and I am back in that headspace considering all the cascading effects of going to that appointment…. I have to brute force every single step or all the work I’ve done til now gets wasted. If I’m lucky enough I can sunken cost my way into productivity. Otherwise it’s a constant practice in gaslighting myself to stop thinking about the forest for the trees (which still sucks because at least the forest is just one big thing while the trees are millions of little things which might be worse) just to stay functional.

I’m very close to finally getting prescribed a stimulant that my psych thinks will help with all of this and I’m very keen to see if it helps.

june@lemmy.world on 05 Mar 2024 01:46 next collapse

I haven’t been into a therapists office in 7 years now, and I’ve been in active therapy for all 7 of the intervening years.

FaceDeer@fedia.io on 03 Mar 2024 23:34 next collapse

Bit of a catch-22 though, isn't it? You want people to get better at doing those things, but they have to do those things in the first place to reach the people that help them get better at it.

I see nothing wrong with having AI chatbots in addition to traditional therapists. As with many AI applications they're at their best when they're helping professionals to get more done.

Just_Pizza_Crust@lemmy.world on 05 Mar 2024 05:41 collapse

I’m not sure why, but you seem to have posted this yesterday but it didn’t show up until an hour ago. Your instance may be having some issues.

I do get where you’re coming from with all that, but the act of going to therapy itself is an achievement a patient can benefit from, and should be considered from the start. If that truly isn’t possible for someone, voice calls from a real therapist are a reasonable next step.

Also, the original question was, “Can AI replace therapists?”. I can see some meaningful benefits coming from an AI assisting a therapist, but that’s not what I was getting at. AI alone really just feels like a bandaid on a bullet wound, when applying pressure or a tourniquet is also available.

FaceDeer@fedia.io on 05 Mar 2024 14:07 collapse

No, the original question is "can AI therapists do better than the real thing?" And yes, they can do better at specific things. That doesn't make them a replacement, though.

Bandaids aren't much use for a bullet wound, but bandaids are still good to have and useful in other situations. You wouldn't use a tourniquet for a papercut.

AnneBonny@lemmy.dbzer0.com on 06 Mar 2024 17:09 collapse

Eating, getting dressed, and talking with real people are all good things for your mental health. Typing on a keyboard doesn’t help, otherwise we’d be the least depressed generation ever.

I agree that typing on a keyboard isn’t a substitute for therapy, but writing can serve as a creative outlet for emotion in the same way as music or painting.

kemsat@lemmy.world on 03 Mar 2024 15:45 next collapse

Does it matter that it checks in on you more, when it technically isn’t someone? I don’t get how people talk to bots when they know they aren’t people.

loki@lemmy.ml on 03 Mar 2024 16:00 next collapse

“Please drink verification can to continue emotional support for another hour”

indepndnt@lemmy.world on 03 Mar 2024 17:22 next collapse

Can AI (whatever you personally are an expert in) do better than the real thing?

Mastengwe@lemm.ee on 03 Mar 2024 18:41 next collapse

No. It can’t. It’s programmed to mimic. Nothing more. It’s doing what its word-prediction training programs it to do. It follows no logic, and doesn’t care about anything, including you.

This is just more evidence of how easily people can be manipulated.

kromem@lemmy.world on 04 Mar 2024 23:34 collapse

It’s not ‘programmed’ at all.

Mastengwe@lemm.ee on 05 Mar 2024 00:01 collapse

So… AI…. created itself?

kromem@lemmy.world on 05 Mar 2024 00:25 collapse

Pretty much. What’s programmed is the mechanism for the model to self-supervise weighting its neural network to correctly model the training data.

We have next to no idea what the eventual network does in modern language models, and it certainly isn’t programmed.

Imgonnatrythis@sh.itjust.works on 03 Mar 2024 20:28 next collapse

“but is a chatbot therapist really the right tool to tackle complex emotional needs?”

I dunno, is Lemmy really the right place for click bait garbage?

BastingChemina@slrpnk.net on 04 Mar 2024 07:10 next collapse

I don’t think these therapists can do better than ELIZA, emacs’ psychologist

Plopp@lemmy.world on 04 Mar 2024 08:25 collapse

Do you feel strongly about discussing such things ?

BastingChemina@slrpnk.net on 04 Mar 2024 08:56 collapse

Can you give a specific example ?

SpikesOtherDog@ani.social on 04 Mar 2024 07:11 next collapse

It’s a different chat bot completely, but I will still leave this here.

euronews.com/…/man-ends-his-life-after-an-ai-chat…

NigelFrobisher@aussie.zone on 04 Mar 2024 08:28 next collapse

Betteridge’s Law!

Emmy@lemmy.nz on 04 Mar 2024 08:36 next collapse

Definitely not, but the truth is mental health support and care is needed 24/7. Good mental health care. So many support needs go unmet because the labour cost is so high.

SendMePhotos@lemmy.world on 05 Mar 2024 00:10 collapse

I will say that in my own experience, AI LLMs have been amazing with reflection and encouragement.

Does this mean good therapy? Not necessarily. I just wanted to share a positive experience.

Emmy@lemmy.nz on 05 Mar 2024 23:08 collapse

My opinion was more a reflection that seeing a mental healthcare professional once a week isn’t really enough when people don’t have traditional support mechanisms.

What I’m trying to say is that before therapists existed, friends and family were the therapists. They were available to give support and advice nearly 24/7.

In today’s life people are too busy to do that.

It was never queer people destroying the family. It’s always been capitalism

june@lemmy.world on 05 Mar 2024 01:45 collapse

This is going to sound really stupid, and I should note that I am actively in therapy too.

But I had to put my dog down about a month ago, and there was a point where I just needed some validation, so I went to GPT4 and asked it some questions and told it about how I was feeling. I even fed it a poem that I wrote about her and asked if it was good.

The responses were incredibly empathetic and kind, and did an amazing job at speaking directly to the anxiety, pain, and fear I was feeling in those moments. The responses were what I needed to hear and gave me a measure of peace to get me through in those gaps when people weren’t available, or when I wasn’t able to speak them out loud. There was nothing new to me in those responses, but often times we just need to be reminded by someone or something outside of ourselves about what the truth is, and LLMs can absolutely fill that particular hole when trained properly.

My last three months in particular have been tough, and GPT4 has been a useful tool to get through a fair few storms for me.