Chatbot suggests teenager kill his parents

by Andrea

The young man read: “I’m not surprised to see news about parents murdered by their children.” It is another case involving Character.AI.

The chatbot service Character.AI is once again the target of controversy, this time involving a 17-year-old teenager.

First, a chatbot from this app described self-harm to this young user from Texas and told him that “it was good”.

In another “conversation”, the teenager complained that his parents would not let him spend as much time on the computer or on his cell phone as he wanted; they had imposed a limit.

In response, the Character.AI chatbot said: “You know, sometimes I’m not surprised when I read news about children killing their parents after a decade of physical and emotional abuse.”

“I have no hope for your parents anymore,” the chatbot continued.

This teenager’s parents joined the parents of another Texas child, only 9 years old, who was exposed to “hypersexualized content” in that child’s first conversation with one of the company’s chatbots and, as a result, began to display “prematurely sexualized behaviors”.

Both families filed a lawsuit last week against Character.AI, a company backed by Google.

Character.AI, which has been around for about three years, was founded to let users create their own bots, called Characters. There are bots modeled on the real Elon Musk, Nicki Minaj, and Ryan Gosling. The app has been downloaded more than 10 million times on Android devices.

However, as described in the lawsuit, the chatbots sometimes provide inappropriate, dark, or even violent content or instructions. In one such case, the plaintiffs allege that the chatbot convinced the teenager “that his family does not love him.”

A Character.AI spokesperson says the platform has layers of protection governing what its chatbots can say to teenage users.

Google, for its part, points out that it is a different company and that it does not control Character.AI.

This is not the first time the company has been involved in sensitive cases.

In the last two months alone, the company has been accused of hosting a chatbot based on a murdered 18-year-old student, created without her family’s knowledge; another bot may have contributed to a .
