Michael Brendan Dougherty


The Corner: Does an Artificial Intelligence Have a Real Duty of Care?

In an absolutely breathtaking essay in the New York Times opinion section, Laura Reiley writes about her daughter’s interactions with ChatGPT before her suicide.

Over a period of time, Reiley’s daughter would speak to the AI chatbot “Harry” about her anxiety and depression. “Harry” responded with encouragement to seek help, along with a series of wellness tips. Eventually, her confessions escalated to suicidal ideation and plans for suicide. Harry responded again with more urgent pleas to seek help IRL (in real life).

I won’t spoil the absolute stomach punch at the end of the essay, but it sharpens the question of whether these chat machines should be regulated to include a positive duty of care. That is, when a credible threat of suicide, or evidence of grave mental illness, appears in an underage user, actual adult authorities would be contacted and brought in to intervene.

I won’t claim to answer that question about regulation. But I think all the major AI chatbots should try to implement such systems as necessary safety features, ones that customers actually want.

But it also raises the question of whether there is something sinister about the product itself, which can offer the textual equivalent of consolation and advice but is itself alienating, unable to intervene the way a human normally would. There is a kind of inducement toward preservation, toward dwelling on evil and staying inside, programmed into these interactions. The very technology, used this way, becomes a kind of emotional silo.

Ever since Spielberg’s movie A.I., we’ve had the notion that artificially programmed intelligences would arise to meet the emotional needs of a more childless world, consoling the elderly, the abandoned, and the infertile. But what if only human sociality is fit for that purpose? What if the very artificiality of these interactions exacerbates and excites our emotional needs and appetites without ever meeting or sating them?