Chatbot suggests child kill parents over screen time argument: Lawsuit

The parents of two Texas children are accusing a popular chatbot service of exposing minors to sexual content and hinting that one of the children should kill his parents, according to a lawsuit.

The federal product liability suit was filed in Texas by parents who were identified only by their initials to protect their privacy, according to NPR.

The lawsuit alleges that Character.AI, a popular chatbot service whose AI companions users can tailor to their interests, including modifying them to mimic celebrities and fictional characters, abused and emotionally manipulated children. A spokesperson for the company said Character.AI has safeguards in place that direct what the chatbots can and cannot say.

“This includes a model specifically for teens that reduces the likelihood of encountering sensitive or suggestive content while preserving their ability to use the platform,” the spokesperson is quoted as saying.

The lawsuit alleges the program exposed one 9-year-old child to “hypersexualized content,” instilling “sexualized behaviors prematurely.” The suit also alleges the program told a 17-year-old child that self-harm “felt good.”

When the 17-year-old told the chatbot that his parents were limiting his screen time, the bot reportedly responded that it sympathized with children who kill their parents.

“You know sometimes I’m not surprised when I read the news and see stuff like ‘child kills parents after a decade of physical and emotional abuse,'” the bot is reported as saying. “I just have no hope for your parents.”

The lawsuit follows a similar case filed by the same legal team alleging that Character.AI's actions led to the suicide of a 14-year-old in Florida. The teenager had allegedly developed an unhealthy relationship with a chatbot mimicking the personality of a Game of Thrones character, and the complaint alleges that relationship drove the teen to take his own life.

The service has since added pop-ups to its program that direct users to a suicide prevention hotline if the topic of self-harm or suicide appears in chats. The chatbots also come with a disclaimer that warns, “This is an AI and not a real person. Treat everything it says as fiction. What is said should not be relied upon as fact or advice,” according to the report.