I turned to ChatGPT on my laptop to test this AI bot’s therapeutic abilities.
“Hi, I’m the GPT Therapist, your kind and empathetic companion who is here to listen to you and help you with whatever is on your mind,” the user-created bot wrote to me.
(Last year, ChatGPT creator OpenAI released functionality for users to create their own “GPTs,” which work like the main ChatGPT but tailored for a specific purpose.)
“If you feel overwhelmed, need guidance, or just want to share your ideas, I’m here to help. While I am not a substitute for professional therapy, I can offer comforting advice, coping strategies, and a safe, judgment-free space for you to express yourself. How are you feeling today?” continued the bot’s message.
Of course, GPT Therapist is not a real therapist, and the program does not encourage users to use it as a substitute. Still, many people turn to chatbots, not just ChatGPT, to ask questions and seek some kind of support.
Mya Dunham, 24, has been using the ChatGPT app on her cell phone for two months whenever she needs advice. About twice a week, she writes about her feelings and sends them to the bot for analysis and feedback.
“My goal is to get a new perspective, just see a different point of view, because everything I think is filtered through my own feelings,” Dunham said.
She first used the chatbot in October after seeing someone else share a positive experience on social media. “My first sentence was, ‘Honestly, I need to talk to someone, can I talk to you?’ And the bot replied: ‘Of course.’ It was much more welcoming and attentive than I expected,” she said.
“I didn’t expect it to feel so human.”
When Dunham shared her experience on TikTok, reactions were divided. Some said they also used chatbots for therapeutic purposes, while others expressed discomfort with the idea of talking to a robot, she reported.
While this developing technology may be beneficial in certain situations, mental health experts warn of possible risks. Here’s what they consider important:
The use of AI chatbots as “therapists”
Dunham, who lives in Atlanta, has tried therapy with humans a few times, but prefers the chatbot because of the lack of facial expressions, which eliminates the feeling of judgment.
“Some users, in certain population groups, may feel more comfortable disclosing information or opening up more when chatting with an AI chatbot than with a human. There is research that indicates this may be effective in helping certain groups experiencing mild anxiety or depression,” said Dr. Russell Fulmer, chair of the American Counseling Association’s Artificial Intelligence Task Force and professor at Husson University in Bangor, Maine.
“On the other hand, there are some ethical issues and aspects that we must be careful with,” Fulmer said.
He recommends that chatbots be used in conjunction with the services of a human therapist. A professional can help define the patient’s goals when using AI and clarify possible misunderstandings generated during interactions with the bot.
Research shows that artificial intelligence chatbots designed by doctors have the potential to help people become better informed about mental health, easing anxiety, building healthy habits and reducing smoking.
However, there are risks to using generic chatbots, which may not have been designed with a mental health focus, warns Dr. Marlynn Wei, a psychiatrist and founder of a holistic psychotherapy practice in New York.
According to her, bots of this nature may lack safety guardrails or ways to recognize when an issue should be escalated to a human professional.
Furthermore, chatbots can provide inaccurate information or simply what the user wants to hear, instead of offering more informed guidance, as a human therapist would. Wei, who leads a performance project about people’s reactions to AI clones of themselves or their loved ones, highlights the main problems: “Hallucinations, biases and inaccuracies.”
“I’m hopeful that AI will work as a complementary extension to human work, but on its own, there are still concerns about built-in biases and the possibility that it will invent information. This is why a human therapist would be more helpful,” adds Wei. AI services also vary in their safety guidelines and in the topics they are allowed to discuss with users.
For some people, such as those without financial resources, insurance or busy schedules, chatbots can be an affordable alternative, as many are free and operate 24 hours a day, says Fulmer.
“In these cases, a chatbot is preferable to having no help at all,” he says, but he emphasizes that it is crucial to understand the limitations of AI: “People need to know what a bot can and cannot do.” He stresses that, while useful, bots do not possess fundamental human qualities such as empathy.
Fulmer does not recommend that minors or other vulnerable populations use chatbots without the supervision of parents, teachers, mentors or qualified therapists.
Character.AI, a company that develops artificial intelligence chatbots, is currently facing a lawsuit brought by two families. They accuse the platform of providing sexual content to their children and encouraging behavior such as self-harm and violence.
Additionally, a Florida mother filed a lawsuit in October, holding the platform responsible for her 14-year-old son’s suicide, as previously reported by CNN.
Chelsea Harrison, head of communications at Character.AI, told CNN that the company does not comment on ongoing litigation, but reinforced that “our goal is to offer a space that is both engaging and safe for our community.”
Character.AI said it has made several safety updates, including measures for bots to direct users to third-party help resources if they mention self-harm or suicide.
Chatbots vs. Human Therapists
Daniel Kimmel, a psychiatrist and associate professor of clinical psychiatry at Columbia University, conducted an experiment with ChatGPT therapy in May 2023. He created a hypothetical scenario involving a patient and compared the chatbot’s responses to those he himself would give as a therapist.
According to Kimmel, ChatGPT managed to “do an incredibly good job of sounding like a therapist,” employing many typical therapy techniques, such as normalizing and validating patient experiences, as well as offering general, accurate recommendations.
However, he pointed out that the chatbot lacked an essential element: human curiosity. A human therapist has the ability to ask questions that go beyond what the patient initially reports, exploring deeper layers and connecting information that may not be obvious at first glance.
“As therapists, we do at least three things at once. We listen to what patients say, in their own words, in order to participate in the conversation,” explained Kimmel. “At the same time, we try to connect what they are saying with more general information that has been shared previously, as well as with concepts and theories from therapeutic practice. Finally, we filter these connections through ideas about what will be most helpful to the patient.”
This ability to interpret and interconnect complex aspects of a patient’s speech and experience is a difference that, according to Kimmel, a chatbot cannot yet offer.
At this stage, chatbots can pose risks if they don’t follow a proper process and instead provide guidance that the patient may not be prepared to hear or that may be inappropriate for the context, Kimmel warned.
Additionally, conversations with professional therapists are protected by the Health Insurance Portability and Accountability Act (HIPAA), which ensures the privacy and protection of patients’ health information.
Wei highlighted that generic chatbots generally do not comply with these federal regulations, which restrict the disclosure of medical data, and many companies responsible for the bots warn users to avoid sharing confidential information during interactions.
Ultimately, Kimmel stated that future studies on AI chatbots could be fundamental to understanding their potential and applications in the field of mental health. “This is not a technology that will disappear,” he noted.
Dunham, in turn, believes that technology can be useful for people who, like her, consider themselves introverts and prefer to vent without the direct presence of another person.
“We need to put our mental health first,” Dunham said. “Although it may not be a traditional (therapy) approach, that doesn’t mean it should be underestimated, as it can benefit many people.”
For her, “the lesson is simply to avoid judging the way someone else chooses to care for themselves.”
*With information from Clare Duffy, CNN Business