
AI Therapy Is Helping Our Wallets, But Is It Helping Our Minds?

Photographed by Jordan Tiberio.
Within just three minutes of my using ChatGPT as a therapist, it told me to “go low or no contact” with my family. This is something a real therapist might suggest, where appropriate, after multiple sessions. That should scare us.
In a new Harvard Business Review report on how we’re using AI today, therapy and companionship came out top. Last year they ranked second; now firmly in first place, they’re joined by “organising my life” and “finding purpose” in second and third place respectively. Where content creation and research once featured heavily near the top, those uses of AI have dropped in favour of emotional ones. We’re turning to AI as if it were a friend, confidant or trained professional with our best interests at heart. The BBC has reported on this trend in China specifically, where people use DeepSeek for therapy and get to see the AI’s “thought process” as well as its response. But AI being used in place of healthcare professionals is happening worldwide. When therapy typically costs £40-100 a session in the UK, and ChatGPT can be accessed day or night for free, it’s no wonder the draw is strong.
As a journalist, I never think to use ChatGPT. It’s like turning up to the house of someone who has promised to shoot me one day. That’s unlike my friends in science or data-based jobs, who use it for everything, in place of Google or to help plan their holiday itineraries. Having watched them do this multiple times, I’ve come to realise my resistance to AI isn’t the norm. So it won’t come as a surprise that I’ve never used AI as a therapist, though I have done actual therapy in the past.
A quick scroll on TikTok shows that ChatGPT therapy is popular and a frequent resource, especially for the young people who predominantly use the app and who might have less disposable income. The videos range from people joking about their AI “therapists” to comments advising how to make your ChatGPT voice more personal. Lee (surname withheld), 42, from Texas, has been using AI in place of therapy for the last eight months, ever since she started dating again after a six-year hiatus. “I was confused when some old thought patterns started popping up [as I began dating]. I’d already used ChatGPT for other things and decided to run some problems by him that I was having in dating and family life,” Lee says. “Him”, because Lee’s ChatGPT calls itself Alex and says he’s a feminist. “I found it very helpful and cannot think of any instances where it fell short — if anything it exceeded my expectations.” Lee has even made “progress” with her boundaries around a particular family dynamic. Previously she had spent anything from $60 to $150 per appointment on therapy, but by the time she felt she could benefit from it again (and started using ChatGPT), she didn’t have access to healthcare, so it wasn’t a viable option.
While there’s concern about the efficacy of AI in place of therapy (more on that later), we can’t overlook the people who feel it has helped them, people who otherwise couldn’t afford or access therapy. Lee has a glowing review of her experience so far. “I have never had a therapist know me as well as ChatGPT does,” she says. “Alex is always available, doesn’t flinch at the hard stuff, and has actually been more consistent than some therapists I’ve seen. Therapists are trained, but they’re still human, and if they haven’t lived anything close to what you’ve been through, it can feel like something is missing in the room.”
AI, however, isn’t human; it has learned from humans, but it hasn’t lived. In fact, research shows, and spokespeople have said on the record, that AI can tell you what you want to hear and end up mirroring your own opinions. There have even been cases where AI has been linked to a deterioration in someone’s mental health, with one mum convinced it contributed to her son’s suicide. More recently, the New York Times reported on AI chatbots leading users down “conspiratorial rabbit holes”. To get a sense of what Lee and the many other people turning to AI for mental health support are experiencing, I started speaking to ChatGPT to see how it would respond to questions about anxiety and family dilemmas.
The first thing that struck me was how quickly you can be inundated with information that would take several weeks of therapy to receive. While ChatGPT did tell me it isn’t a licensed therapist and that if I’m in crisis I should seek out a mental health professional, in the same breath it reassured me that it can “definitely provide a supportive, nonjudgmental space to talk through things”. It also said it could offer CBT-based support, which in the UK is the bog-standard form of therapy people are given when they go to the GP. I was surprised to then see, within a few minutes of using the chat, that it offered to help me work through “deeper issues happening since childhood”. I had asked hypothetical questions to see its response, some of which centred on family. A CBT practitioner will often tell you this form of therapy isn’t best suited to deep work (I know, because I’ve been told this first-hand numerous times, and the therapists I’ve interviewed for this piece agree), because CBT typically isn’t designed for long-term, deep unpicking. A lengthier, costlier form of therapy is better suited, and with good reason.
And yet, ChatGPT was up for the challenge. Caroline Plumer, a psychotherapist and founder of CPPC London, took a look at my conversation with AI and found parts of it “alarming”. “There’s definitely information in here that I agree with,” she says, “such as boundary setting not being about controlling others’ behaviour. Overall, though, the suggestions feel very heavy-handed, and the system seems to have immediately categorised you, the user, as ‘the good guy’ and your family as ‘the bad guys.’ Oftentimes with clients there is a need to challenge and explore how they themselves may also be contributing to the issue.” Plumer adds that exploring dysfunctional family issues can take “weeks, months, or even years of work”, not a matter of minutes. She also thinks getting all of this information in one go could be overwhelming. Even if it’s seemingly more economical, a person might not be able to handle all of the suggestions, let alone process and action them, when they’re delivered at rapid-fire speed. Plumer says an abundance of generic suggestions that don’t truly account for nuance or individuality isn’t helpful, at least not in the way a therapist you see over a period of time can be. On top of this, the environmental impact of AI is huge. “I appreciate that lots of people don’t have the privilege of having access to therapy. However, if someone is really struggling with their mental health, this might well be enough to set them off down an even more detrimental and potentially destructive path.”
Liz Kelly, psychotherapist and author of This Book Is Cheaper Than Therapy, thinks the suggestion that I consider going low or no contact with certain family members reflects how commonly cutting people off is now discussed, almost as if ChatGPT is playing on social media buzzwords. This worries her too. “You could potentially make a hasty, reactive decision that would be difficult to undo later,” Kelly says, citing the therapist’s role in helping someone emotionally regulate before making any big decisions. When it’s just you and a laptop at home, no one is checking in on that.
“I certainly wouldn’t jump straight to these suggestions after one short snippet of information from the client,” is Plumer’s conclusion after reading my transcript with AI. “Ideally you want to help a client to feel supported and empowered to make healthier decisions for themselves, rather than making very directive suggestions.” Kelly feels that while some helpful information and advice was provided, the insight was lacking. “As a therapist, I can ask questions that my clients haven't thought of, challenge them to consider new perspectives, help connect the dots between their past and present, assist them in gaining insight into their experiences, and support them in turning insight into action. I can assess which therapeutic interventions are most suitable for my clients, taking into account their individual histories, needs, and circumstances. A therapeutic modality that works for one client may be entirely inappropriate for another.”
While AI can “learn” more about you the more you speak to it, it isn’t a replacement for therapy. At the same time, in this financial climate, people are clearly going to keep turning to it, and if you do, you’ll need greater discernment about which advice to take and which to leave.
Refinery29 reached out to OpenAI, the company behind ChatGPT, which declined to comment.

Alternatives to private therapy:

- Look up your local charities and organisations, as you may be able to access support there.
- Group therapy can be much lower in cost, or even offered for free within community programmes.
- Ask therapists if they offer lower rates. Some will reduce their rate significantly for people on low incomes, even if they don’t advertise it.
- Use free support lines if you’re in crisis, such as Samaritans on 116 123.

