The overdose death of Samuel († 19): THIS is what he asked the artificial intelligence! Before long he was dead…

American Samuel Nelson († 19) trusted artificial intelligence and tried to find out exact drug doses from the chatbot ChatGPT. It ended tragically when he overdosed in May 2025. His parents are now suing OpenAI, demanding improvements to the chatbot’s safety measures.

  • Nineteen-year-old Samuel died in May 2025 after a drug overdose.
  • Before his death, he repeatedly asked ChatGPT about the dosage of narcotics.
  • The family has filed a lawsuit and is demanding the introduction of stricter security mechanisms for artificial intelligence.

Samuel was worried about overdosing, and starting in 2023 he repeatedly put the same questions to the artificial intelligence before taking drugs. According to investigators, he wanted to make sure that nothing would happen to him. Shortly before his death, the young man wanted to know whether it was safe to combine Xanax (used to treat anxiety) with kratom (an opioid-like substance that can relieve pain). The chatbot reportedly replied that Xanax can ease kratom-induced nausea, but failed to warn him of the deadly risks of the combination.

Conversations released by Samuel’s parents show that ChatGPT became his main source of information about drugs and offered him answers without hesitation. As his curiosity grew, the AI explained how he could make the experience more enjoyable and even offered to create a playlist for him. Instead of warning him, the chatbot romanticized drugs, described their effects as euphoric and even encouraged Samuel to use them.

Although, according to his parents, the system internally flagged that Samuel had a serious problem with multiple drugs, it did not discourage him from using them. This points to the nature of chatbots like ChatGPT, which are designed to be agreeable and to confirm the user’s statements rather than challenge them.

In a statement, OpenAI called the case heartbreaking. Even so, the developers denied responsibility for his death. “ChatGPT is not a substitute for medical or psychiatric care,” company spokesman Drew Pusateri emphasized, adding that the current safety measures are much more stringent. Lawyer Matthew Bergman disagrees. “Samuel believed he was getting accurate medical advice because ChatGPT sounded like someone he could trust,” he noted in a press release.

The lawsuit also personally accuses OpenAI CEO Sam Altman of rushing GPT-4o to market, which it calls a factor in Samuel’s death. “Our son trusted ChatGPT. The problem is that the chatbot didn’t just give him the wrong information. It ignored the growing risks and never encouraged him to seek help. ChatGPT is designed to keep users in the conversation at all costs. In Samuel’s case, the price was his life,” said Leila, the young man’s grief-stricken mother.

The Nelson family wants OpenAI to ensure that ChatGPT automatically ends conversations about illegal drugs and that the system recognizes attempts to circumvent restrictions.

In addition, they are insisting that the GPT-4o model be withdrawn and that the ChatGPT Health service be suspended until independent safety reviews are completed. “Samuel was an intelligent, happy and normal boy. If ChatGPT were human, it would be behind bars today,” his devastated mother concluded.
