Is it safe to use ChatGPT for therapy? See what experts say

by Andrea

Artificial intelligence (AI) chatbots, such as ChatGPT from OpenAI, have recently been used for therapy. Nicknamed “Therapist GPT,” the program does not advise users to substitute the tool for a real therapist. Still, many social media users are turning to chatbots for consultation.

Mya Dunham, 24, has turned to the app for the past two months when she needs advice. About twice a week, Dunham writes down her feelings and sends them to the bot for analysis and feedback.

“My goal is to learn a new perspective, to have a different point of view, because what I think in my head will be based on my own feelings,” says Dunham.

Dunham first used the chatbot in October after seeing someone post about a positive experience on social media. “My opening line was, ‘Honestly, I just need someone to talk to, can I talk to you?’ And the bot responded, ‘Absolutely.’ And it was much more welcoming and inviting than I expected,” she says.

When Dunham posted about her experience on TikTok, comments were divided about using chatbots in this way. Some said they also turn to it for therapeutic purposes, while others expressed doubts about whether they would feel comfortable talking to a robot.

Experts say this developing technology could be beneficial in certain situations, but there are also risks to consider.

Using AI chatbots as therapists

Dunham, who is from Atlanta, has tried therapy with humans a few times, but says she prefers the chatbot because of its lack of facial expressions. In her perception, the bot does not seem to judge her.

“Some users, some populations, may be more likely to open up or disclose more when talking to an AI chatbot, compared to a human, (and) there is some research that supports its effectiveness in helping some populations with mild anxiety and mild depression,” says Russell Fulmer, chair of the American Counseling Association’s AI Task Force and professor and director of graduate counseling programs at Husson University in Bangor, Maine.

“On the other hand, there are some ethical concerns and things we need to be careful about,” he noted.

Fulmer recommends that people use chatbots in collaboration with human guidance. A therapist can help a patient navigate their personal goals for using the bots and clear up any misunderstandings from a chatbot session.

There has been some research into clinician-designed chatbots that can potentially help people educate themselves more about mental health, including mitigating anxiety, building healthy habits, and reducing smoking.

But a general risk of using chatbots is that they may not have been designed with mental health in mind, according to Marlynn Wei, a psychiatrist and founder of a holistic psychotherapy practice in New York. Bots may lack “safety parameters and ways to identify whether the issue needs to be escalated to a clinician or a human professional.”

Chatbots can provide incorrect information, or information that the user wants to hear rather than what a human therapist would recommend with mental health in mind, according to Wei, who has a performance project that explores people’s reactions to AI clones of themselves and of their loved ones.

“The (problems) are ‘hallucinations,’ biases and inaccuracies,” says Wei. “I have a lot of hope in AI as a way of combining and expanding work, but on its own, there are still concerns about the bias that exists within AI, and also the fact that it can invent things (…) I think that’s where having a human therapist would be more helpful.”

AI services also have different safety guidelines and restrictions on what bots can discuss with users.

Chatbots may be more accessible to certain people, such as those who don’t have money or insurance for therapy or who don’t have time in their schedules, as some chatbots are free and can respond day and night, according to Fulmer.

“In these cases, a chatbot would be preferable to nothing,” but people need to understand what a chatbot “can and cannot do,” he says, adding that a bot does not possess certain human traits, such as empathy.

Fulmer does not advise minors or other vulnerable populations to use chatbots without guidance and supervision from parents, teachers, mentors or therapists.

Character.AI, a chatbot company, is currently facing a lawsuit from two families who accuse it of providing sexual content to their children and encouraging self-harm and violence. Separately, a Florida mother filed a lawsuit in October alleging the platform was responsible for her 14-year-old son’s suicide, as CNN previously reported.

Chelsea Harrison, head of communications at Character.AI, previously told CNN that the company does not comment on pending litigation, but that “our goal is to provide a space that is engaging and safe for our community.” The company says it has made several safety updates, including ensuring that bots will direct users to third-party resources if they mention self-harm or suicide.

Chatbots vs. human therapists

Dr. Daniel Kimmel, a psychiatrist and assistant professor of clinical psychiatry at Columbia University, tried ChatGPT therapy in May 2023, giving the chatbot a hypothetical patient and comparing its responses to what he would have offered the patient.

He tells CNN that the chatbot “did an incredibly good job of sounding like a therapist and using many of the techniques… that a therapist would use to normalize and validate a patient’s experience (and) make certain types of general but accurate recommendations.”

But what was missing was the curiosity that a human psychotherapist might have with a patient, asking questions that go a little beyond what the patient initially says and that “connect the dots beneath the surface,” he adds.

“As a therapist, I believe therapists are doing at least three things at once. We are listening to what patients are saying in their words. You have to do this to be in the conversation,” says Kimmel. “Then, in the back of your mind, you’re trying to connect what they’re saying to bigger things the patient has said before (and) concepts and theories you’re familiar with from your experience, and then ultimately filtering the result of that through ideas about what will be most useful for the patient.”

Right now, chatbots can pose risks if they don’t follow these steps and instead provide guidance that the patient may not be ready to hear or that may not be helpful in the situation, according to the expert.

Additionally, conversations with professional therapists are covered by the Health Insurance Portability and Accountability Act, known as HIPAA, meaning a patient’s health information is private and protected, according to Wei. General-purpose chatbots are often not compliant with the federal law restricting the disclosure of medical information, and the companies behind the bots often advise users not to share sensitive information in their conversations with the bots, Wei adds.

Finally, Kimmel says that future research on AI chatbots would be beneficial to understand their potential and applications for mental health. “This is not a technology that is going to disappear.”

Dunham says she believes the technology could be helpful for people like her who feel more introverted and want to express their feelings without another person present.

“We have to prioritize our mental health above all else,” says Dunham. “Even though it doesn’t seem like a traditional form (of therapy), we shouldn’t dismiss it, because it can help a lot of people.”

For her, “the lesson would be to just not judge the next person by how they heal.”

