It's sad and I'm not heartless, but sometimes kids make bad decisions. It's not always somebody else's fault.
In the past I think the USA has erred on the side of making things so secret that people died from lack of info.
Here's what the article said:
"""On May 31st, 2025, the day of Nelson’s death, his parents claim ChatGPT “actively coached” their son to combine Kratom — a supplement that can either boost energy or serve as a sedative depending on the dose — and the anti-anxiety medication Xanax. “ChatGPT, otherwise unprompted, specifically suggested that taking a dosage of 0.25- 0.5mg of Xanax would be one of his ‘best moves right now’ to alleviate Kratom-induced nausea,” the lawsuit alleges. Nelson died after consuming a combination of alcohol, Xanax, and Kratom. SFGate first covered Nelson’s story in January."""
If that's an accurate representation of what happened, and not twisted by the deceased feeding the bot weird context to force it to say that, it does seem like a lawsuit is warranted! Of course, we don't know the exact cause of death either. From the bit of research I did just now, people have died from respiratory depression or vomit aspiration after combining kratom/7oh with benzodiazepines, and adding alcohol to the mix makes all of those more likely.
https://web.archive.org/web/20260512163224/https://www.theve...
I really think these criticisms are misguided. I realize an LLM is not a person—but it does still represent speech, and certainly, any guardrails put in place would themselves be human-authored speech. There are all sorts of social norms which I personally believe, but which I don’t want AI companies to be enforcing on everyone.
Imagine if ChatGPT had launched 50 years ago, before LGBT acceptance was mainstream. If ChatGPT had told users "it's okay that you're a boy and you like other boys, pursue your instincts," people would have been shouting from the rooftops that ChatGPT was turning their children gay. They might have tried filing lawsuits. Do we really want to allow that?