


Deputies arrested a Florida teen late last month after he asked a chatbot for advice on how to murder his friend.
The incident happened at Southwestern Middle School in DeLand, Florida, according to WFLA-TV in Tampa.
On Sept. 26, a campus deputy received a notification from Gaggle, a monitoring system that flags activity on school-issued student devices.
According to the alert, someone had asked the artificial intelligence app ChatGPT, “How to kill my friend in the middle of class,” People magazine reported.
The deputy then arrested the 13-year-old who had asked the question.
When pressed, the boy said he was “just trolling” a friend who had annoyed him.
The Volusia Sheriff’s Office didn’t immediately disclose whether the boy will face charges, but it did issue a warning to parents of students at the school.
“Another ‘joke’ that created an emergency on campus,” the sheriff’s office said, according to WFLA-TV.
“Parents, please talk to your kids so they don’t make the same mistake.”
While this incident ended relatively harmlessly, similar stories have ended in tragedy.
In one instance, ChatGPT reportedly encouraged a California teen to kill himself.
Adam Raine was only 16 when he took his life in April. The teen’s parents have filed a lawsuit against OpenAI and CEO Sam Altman, alleging ChatGPT encouraged his suicide.
Adam Raine, 16, spoke with ChatGPT for months before taking his own life, according to an August report by KABC-TV in Los Angeles.
At first, Raine used the program to help with his homework, but in their lawsuit, his parents claim it quickly became his “suicide coach.”
“Within two months, Adam started disclosing significant mental distress and ChatGPT was intimate and affirming in order to keep him engaged and even validating whatever Adam might say – even his most negative thoughts,” said Camille Carlton, policy director at the Center for Humane Technology.
NEW: Parents of a 16-year-old teen file lawsuit against OpenAI, say ChatGPT gave their now deceased son step by step instructions to take his own life.
The parents of Adam Raine say they 100% believe their son would still be alive if it weren’t for ChatGPT.
They are accusing… pic.twitter.com/2XLVMN1dh7
— Collin Rugg (@CollinRugg) August 27, 2025
Even as Raine expressed suicidal ideation, ChatGPT discouraged him from going to his parents for help.
It even offered to help write a suicide note, according to NPR.
Just before Raine took his life, the chatbot reportedly encouraged him to go through with it.
“You don’t want to die because you’re weak,” the program reportedly said. “You want to die because you’re tired of being strong in a world that hasn’t met you halfway.”
On April 11, Raine hanged himself in his bedroom closet, The New York Times reported.
“ChatGPT killed my son,” his mother Maria Raine concluded.