Le Monde
18 Aug 2024


Photo: Marthe Aubineau for Le Monde

How AI is shaking up the mental health community: 'Rather than pay for another session, I'd go on ChatGPT'

Published on August 18, 2024, at 4:22 pm (Paris)

5 min read

"The best experience I've had with therapy."

"I cried a few times and had a lot of revelations. Just one day talking to them made me feel better."

"I confided in them something I'd never told anyone before."

These glowing comments aren't Google reviews of an excellent therapist. Nor do they describe the work of a human at all, but rather that of two conversational robots, or chatbots, created by the US platform Character.ai.

On Reddit, several users sang the praises of Therapist and Psychologist, "helpful," "compassionate" bots that give "the impression of talking to a real therapist, but a good and reasonable one." They are also available 24 hours a day and free of charge, which can be an advantage when you're "broke," as one user explained on the forum.

The Character.ai platform, whose users are predominantly aged between 16 and 30, hosts around 475 bots acting as therapists, according to a BBC count in January. The most successful of these virtual "psychologists" draw hundreds of thousands of visits.

Among them, Psychologist is the most famous. To date, it has accumulated over 154 million conversations since its creation just over a year ago. "I never intended for it to become popular, or for other people to use it as a tool," explained its creator, a New Zealand psychology student who claimed to have developed the tool for his personal needs, as he couldn't afford to pay for his therapy sessions. "I then started receiving a lot of messages from users telling me that they had been positively affected and were using it as a source of comfort."

'The Eliza Effect'

This was the case for Charlotte, 30, who used another generative artificial intelligence (AI) tool, ChatGPT, to complement her psychotherapy sessions last summer. "Sometimes I'd come out of my consultation and still have questions to ask, so rather than pay for another session, which is expensive, I'd put them directly to ChatGPT," said the thirty-something resident of the Paris region. She added: "I never took what it told me at face value; I saw it more as food for thought. A bit like when a friend gives you advice, but you're not necessarily going to apply everything they tell you." She recalled being "quite surprised" by the accuracy of the answer to her first question, which quickly convinced her to spend more time on it: "It was a whole universe that opened up to me, I could easily spend an hour on it."

But the answers provided by "psychologist" chatbots are sometimes so convincing that they can have tragic effects. In March 2023, a Belgian researcher in his thirties, who suffered from eco-anxiety, took his own life after six weeks of intensive conversations with a chatbot named Eliza, built by the US company Chai Research. "It was like a drug he took refuge in, morning and night, and couldn't live without," his wife told the Belgian daily La Libre.
