The dark side of artificial intelligence is becoming reality: a university shooter was reportedly advised by a well-known chatbot, and prosecutors are investigating

  • OpenAI is facing a criminal investigation in Florida.
  • The attorney general says ChatGPT provided the shooter with crucial practical advice.
  • The authorities are considering the criminal liability of the company as a possible accomplice.

The American technology company OpenAI is under investigation by prosecutors in the United States over allegations that its artificial-intelligence language model ChatGPT helped the gunman in last year's attack at Florida State University in Tallahassee. Two people were killed and several others injured in the shooting, TASR reports, citing the British broadcaster BBC.

Florida Attorney General James Uthmeier said his office had reviewed the matter and concluded that a criminal investigation into the company operating the artificial intelligence was warranted. “Before committing these heinous crimes, ChatGPT provided the shooter with important advice,” he added.

An OpenAI spokesperson, however, maintains that the chatbot is not responsible for the crime. The company is cooperating with the authorities and has proactively handed over information about a ChatGPT account it believes belonged to the suspect, the spokesperson said, adding that the service did not incite illegal activity. “In this case, ChatGPT provided factual answers using information that could largely be found in publicly available sources on the internet,” the spokesperson added.

Uthmeier, however, said ChatGPT advised the shooter on what type of weapon and ammunition would be appropriate and where larger crowds might gather on campus. “My prosecutors looked into it and told me if there was a person sitting on the other side of the screen, we would have charged them with murder,” Uthmeier said.

He pointed out that under Florida law, anyone who aids or abets someone in the commission of a crime is considered an accomplice. Although ChatGPT is not a person, the attorney general said his office must determine the criminal liability of OpenAI, the company responsible for the chatbot.

The company is already facing a lawsuit over another incident in which its chatbot may have played a role. Earlier this year, an 18-year-old shot and killed nine people and injured 20 others in the Canadian province of British Columbia. After that incident, OpenAI said it had identified and blocked the shooter’s account but did not refer the matter to the police. The company also announced that it intends to strengthen its safety measures.

A group of 42 US state attorneys general sent a letter to 13 AI chatbot tech companies, including OpenAI, Google, Meta and Anthropic, expressing concern that AI is increasingly being used by people “who appear to be unaware of the dangers they may encounter.” To prevent such cases, prosecutors called for thorough safety testing of artificial intelligence, a possible recall of products from the market and clear warnings for consumers.
