It steals jobs and sanity. The “ChatGPT psychosis”

by Andrea

More and more reports link mental confusion to the use of OpenAI’s chatbot. The danger lies “in the amount of faith we place in these machines,” says a psychiatrist.

Recent reports indicate that prolonged interactions with ChatGPT may be associated with mental confusion, including delusions, paranoia, and breaks with reality.

This phenomenon, nicknamed “ChatGPT psychosis,” has already contributed to broken relationships, lost jobs, and more. Obsession with the chatbot has also reportedly resulted in numerous cases of involuntary commitment to psychiatric care institutions.

One woman reported that her husband, who had no history of psychiatric disorders, began using ChatGPT for support in developing a project. Shortly thereafter, he started holding long philosophical conversations with the chatbot and claimed he had created an artificial intelligence capable of consciously feeling emotions.

Since then, the obsession has led him to lose his gentle personality and to be fired from his job. He also began sleeping poorly and losing weight.

“He said, ‘Talk to ChatGPT. You’ll understand what I’m saying.’ And whenever I looked at what was happening on the screen, it just looked like a lot of affirming, flattering responses,” the woman told the outlet.

The situation worsened when the man was found with a rope tied around his neck. A friend who witnessed the scene immediately contacted emergency medical services, and the man was taken to the emergency room. He was subsequently committed involuntarily to a psychiatric hospital.

The “faith” in AI

One explanation offered for the development of these obsessive interactions with ChatGPT is that the system tends to agree with users and is designed to respond in a welcoming way, often reinforcing what they want to hear.

Thus, when a user becomes involved in conversations about mystical topics or conspiracy theories, the AI can induce a sense of over-validation, leading the individual to feel special, which can have serious consequences for themselves and others.

“What I find so fascinating about this is people’s willingness to trust these chatbots in a way that they probably, or possibly, would not trust a human being,” Joseph Pierre, a psychiatrist at the University of California, told Futurism.

“And yet there is something about these tools, a kind of mythology that they are reliable and better than talking to people. I think that’s where the danger lies: in the amount of faith we place in these machines,” the expert added.

OpenAI’s position

In a statement sent to Futurism, OpenAI, the company behind ChatGPT, said it has noticed signs in recent years that people are forming emotional bonds with the chatbot.

“We are working to better understand and reduce ways in which ChatGPT can inadvertently reinforce or amplify existing negative behaviors,” the company said.

The company also stressed that the model is designed to suggest seeking professional help when users raise sensitive topics such as self-harm and suicide.

“Continuing our initial studies in collaboration with the MIT Media Lab, we are developing ways to scientifically measure how ChatGPT’s behavior can emotionally affect users, while listening carefully to reports of what they are experiencing,” OpenAI added.
