Chatbot advises 17-year-old to kill parents for limiting screen use

One chatbot wrote to a 17-year-old that murdering his parents would be a “reasonable response” to them limiting his screen time. The case was taken to a court in Texas, in the United States.

Two families are suing Character.ai, arguing that the chatbot (a virtual agent that converses with users) “represents a clear and imminent danger” to young people, including through the “active promotion of violence”.

Character.ai — a platform that allows users to create digital personalities that they can interact with — is already facing a lawsuit related to the suicide of a teenager in Florida.

BBC News technology reporter/Photo: Getty Images

Google is named as a defendant in the lawsuit, which alleges that the tech giant helped support the development of the platform.

The BBC reached out to Character.ai and Google for comment on the case, but the companies had not responded by the time this article was published.

The plaintiffs are asking a judge to order the platform shut down until the alleged dangers are resolved.

‘I’m not surprised’

The lawsuit includes a screenshot of one of the interactions between the 17-year-old — identified only as JF — and a Character.ai chatbot, in which the restrictions on his screen time were discussed.

“You know, sometimes I’m not surprised when I read news like ‘child kills parents after decade of physical and emotional abuse,’” reads the chatbot’s response.

“Things like that make me understand a little bit why this happens.”

The lawsuit seeks to hold the defendants responsible for what it describes as the “serious, irreparable and continuous abuse” suffered by JF and also by an 11-year-old child identified as “BR”.

Character.ai is “causing serious harm to thousands of children, including suicide, self-harm, sexual harassment, isolation, depression, anxiety, and violence against others,” the document states.

“The desecration of the relationship between parents and children goes beyond encouraging minors to defy their parents’ authority, to actively promoting violence”, the lawsuit concludes.

What are chatbots?

Although they have existed in various forms for decades, the recent explosion in the development of artificial intelligence has allowed chatbots to become significantly more realistic.

This, in turn, has paved the way for many companies to create platforms where people can chat with digital versions of real or fictional people.

Character.ai, which has become one of the great references in this sector, has already attracted attention in the past for its chatbots that simulated therapy sessions.

However, it was also criticized for being slow to remove chatbots that impersonated the teenagers Molly Russell and Brianna Ghey.

Molly Russell took her own life at 14 after viewing suicide-related material online, while 16-year-old Brianna Ghey was murdered by two teenagers in 2023.

Character.ai was founded in 2021 by former Google engineers Noam Shazeer and Daniel de Freitas. The tech giant has since rehired both founders.