With therapy waitlists stretching for months and mental health support unaffordable for many, more and more people are turning to Artificial Intelligence (AI) platforms for emotional support.
On social media, users are openly sharing their experiences using chatbots like ChatGPT to cope with mental health struggles.
“So we all using ChatGPT as a therapist now? I thought it was just me,” a TikTok user said.
“My therapist actually said there’s a lot of studies where they’re testing real doctors & therapists vs AI and AI gets way better reviews bc they actually listen to the patients & their feelings,” another user on TikTok commented.
Yet some users are wary of using AI as a substitute for a human therapist.
“But there is something so sinister about society being so individualistic that people are now opting to open up to a robot instead of lean on their loved ones,” one TikTok user wrote in a video caption, acknowledging that in countries where mental health care is expensive, AI “may even be the only option,” but adding that the trend “does showcase something ugly about the world at large.”
“Be careful if you like to use ChatGPT as a therapist – if you’re getting sucked into a validation loop, set boundaries & step away,” another user warned in a caption.
“Talking to AI is honestly sad, plus they always agree with you, it feels weird and forced. I don’t want something to glaze my opinions, I need people to be honest,” a user said.
But can a bot trained on internet text truly replace the human experience of therapy?
Rob Pintwala is the founder and CEO of First Session, a Canadian platform that helps people find the right therapist. While he doesn’t believe AI can replace human therapy, he understands why so many are turning to it for support.
His journey began when his wife’s frustrating experience with a local therapist prompted him to create a better matchmaking process for clients.
“She built up all this courage only to be disappointed by bad conversations,” Pintwala said. “She didn’t try again for some time, and that experience motivated me to start the [First Session] platform.”
Launched in 2019, First Session focuses on what Pintwala calls the “therapeutic alliance” — the trust and rapport between client and therapist that is often the key to a successful outcome.
With over 150 therapists and more than 10,000 clients served, the platform allows users to browse therapist videos and bios to get a sense of fit.
WHERE DOES AI FIT INTO THE MENTAL HEALTH SPACE?
“I definitely think [AI systems] can provide a lot of value. It is so well programmed in terms of the support and tone that it responds to the user, and it just provides the sort of gentle, compassionate, optimistic tone,” Pintwala said.
For people struggling to normalize their experiences or simply feel heard, AI systems like ChatGPT can offer low-barrier reassurance.
Still, Pintwala said the risks are real.
“A lot of folks now are self-diagnosing themselves, and I think if they come to ChatGPT with a perceived diagnosis, I don’t think it is programmed to validate that diagnosis,” Pintwala cautioned.
Another risk, he added, is isolation.
“It’s one of the biggest issues, particularly with young people. And it’s like the double-edged sword of social media is isolation and lack of true community, empathy and compassion online,” he said.
“Even though ChatGPT can provide that empathy and compassion as a machine, it’s not like humans. Humans need meaningful human connections to thrive, and if it’s giving you just enough, and it’s your companion, you know that might be better than nothing, for sure.”
Pintwala also flags concerns about commodified therapy platforms that underpay therapists, leading to burnout.
“What’s happening is these therapists on these therapy platforms, they are being overworked, burnt out, and so sometimes they’re not able to fully show up as they wish to for clients and provide them that kind of really present, grounded, judgment-free zone,” Pintwala explained.
“That could create more negative experiences that people have with these kinds of commoditized therapy platforms, the more mistrust for the industry in general. And no, ChatGPT doesn’t have judgment. It’s not human.”
THE AI PERSPECTIVE
Dr. Ozgur Turetken, an AI expert and associate dean and business professor at Toronto Metropolitan University (TMU), agreed that AI has potential but warned it must be approached with caution.
He likened generative AI to a “very smart baby”: brilliant, but only as good as what it is taught.
“I would be very careful using a tool that is trained with that kind of heterogeneous data to give sensitive kind of advice, such as what might happen in a therapy session. My guess is, just like everything else, 95 per cent of what may have gone in that kind of a discourse between the tool and the individual might be benign and most likely useful, but there might be things that also might be kind of ill-advised or in some cases even dangerous,” Turetken said.
Turetken emphasized that AI doesn’t understand suffering or emotions; it just mimics empathy based on patterns it has seen in data.
“It is all about pattern recognition. They [AI] will be more empathetic if the training data comes from sources that are empathetic.”
Turetken noted that while humans develop empathy through a mix of nature and nurture, AI has no “nature” of its own. It is entirely shaped by the data it’s trained on. This fundamental difference, he said, makes AI inherently unlike humans, even though it can mimic human emotion so convincingly that many people may not be able to tell the difference.
“For everyone, it may be different, whereas there’s no nature for AI, everything is what you nurture it with. So, in that sense, it is different. It’s always going to be different. But again, it can come very, very close that for most people, it may be impossible to differentiate whether you’re actually interacting with a super intelligent but artificial agent or a person,” Turetken explained.
That convincing mimicry, he warned, also raises the stakes in high-risk mental health scenarios.
“For less risky situations, AI would do the job for the most part, it would even probably pass as a decent therapist. But if somebody is high risk, let’s say they’re suicidal or severely depressed or hallucinating, it may lead to very severe results. So, that might not be a good idea.”
A DIGITAL STARTING POINT, NOT A DESTINATION
As both Pintwala and Turetken suggest, AI tools like ChatGPT may have a role to play, particularly as a first step for those unable or unwilling to access therapy.
They can offer comfort, normalize emotions, and even nudge users toward healthier habits, but they are no substitute for trained professionals or real human relationships.
“We see AI coming, therapists are using AI, but a lot of them are concerned about it. I’m doubling down on human connection, meaningful human connection is the answer,” Pintwala said.