Some people are turning to AI for therapy. Here's why experts say it can't replace professional help
AI's lack of nuance and its collection of personal data are among therapists' concerns

As some Canadians turn to artificial intelligence for mental health advice when they can't access a therapist, experts caution that AI chatbots can steer people in the wrong direction.
Users of chatbots, such as OpenAI's ChatGPT, are plugging in prompts about their anxieties and stressors, seeking strategies to deal with their personal challenges.
"It was able to help me in real time and hands down, talk me off the ledge," said London, Ont., resident Michelle Currie, who has used ChatGPT for support for over a year.
"There are days where I will focus on something and it will just run a loop in my head over and over," she said. "I could fester on that for a week or more, but sometimes I'll have a conversation with ChatGPT and it's like, 'Oh, you're not crazy.'"
ChatGPT uses a large language model trained on data scraped from the internet to quickly generate conversational text in response to users' questions.
However, therapists worry chatbots cannot pick up on the context needed to give people proper mental health advice.

"Computers are sort of programmed to be binary, to be yes or no, or this and that. What they don't do well is nuances," said Laura Brunskill, a therapist at InnerWorks Counselling in London. "Mental health therapy is all about the nuances."
Chatbots also cannot follow up on gaps in information as accurately as human therapists, said Jordan Thomas, the CEO of London Trauma Therapy.
"AI doesn't have any malicious intentions, but it could add to a distorted way of thinking because there's so much nuance and AI just can't know unless it's given all your information," she said.
"Understand it's not therapy, it's therapeutic support," Thomas said.
Chatbots a tool when combined with professional therapy
Brunskill said there are some forms of therapeutic treatment that AI can never replicate. One example is eye movement desensitization and reprocessing (EMDR), a treatment in which a client moves their eyes back and forth while thinking about a traumatic event.
"My job is to be assessing them for their safety," she said, adding that she takes into account clients' breathing rates, eye movements and sweat levels. "As far as I know, AI just can't do all of those functions at the same time."
There are ways, the therapists said, that AI can be a tool for people who are already attending therapy sessions with a professional, such as using it to clarify information, role-play scenarios or apply a strategy to a real-life situation.

Currie said she uses ChatGPT between her bi-weekly professional therapy sessions when she needs urgent advice or help making sense of the strategies her human therapist has taught her.
"Sometimes I feel overwhelmed with what [my therapist] is saying so I want to go back and understand this in terms that I can understand, that's not a lot of mumbo-jumbo," she said. "It helps me catapult."
Currie said she uses the language she's learned in therapy to write prompts for ChatGPT, trying to include as much detail as possible. One of her questions looked like this:
"Lately I've been feeling super stressed and overwhelmed with media, work, [and] day-to-day stresses of life. Can you give me some grounding techniques that I can use that will help alleviate the stress and give me tangible steps forward that I can use to keep myself grounded when I'm feeling anxious and uncertain?"
ChatGPT responded with a list of strategies that included breathing exercises and grounding techniques, and asked if Currie wanted a personalized 14-day schedule to help with her anxieties.
Falsely replicating human connection
The amount of specificity needed to prompt ChatGPT for mental health advice is a concern for Luke Stark, who studies the social impacts of AI at Western University.
He said the more descriptive a prompt a person gives a chatbot, the more helpful the response will be, but that means giving away information that is typically kept private.

"There are often heightened privacy restrictions around sensitive medical data that we have in both Canada and the United States that wouldn't apply because ChatGPT isn't classed as a medical device."
There are also larger ethical concerns when people feel like they are building a personal relationship with AI, when they are really talking to themselves through a digital system, Stark said.
"I actually think it's worse than that because you're talking to yourself but other people are listening in on you," he said about AI therapy.
Currie said she has no plans to trade her human therapist for ChatGPT, adding that she still values person-to-person interaction.
"It does not replace the human connection, but when I'm struggling in real-time and just need a quick solution, sometimes it's nice to know that I can go there," she said. "It's an amazing tool that has helped me in my healing journey."