Aug 11, 2025
AI Should Not Be Your Therapist

Andrew Gondy

Last week, Illinois took a major step to curb the erosive uses of artificial intelligence, as Governor JB Pritzker signed a law limiting citizens’ use of AI for therapy. The law will prohibit AI apps from making therapeutic decisions or generating supposed treatment plans for users.

The American Psychological Association’s senior director of innovation, Vaile Wright, denounced the therapeutic use of generative technology, which is not licensed in any clinical sense. She stated, “You’re putting the public at a level of risk when you imply there’s a level of expertise that isn’t really there.”

Some chatbots, such as Abby and Earkick, have been designed specifically for therapy. These systems, which appear to users to be a quick fix for their mental health problems, supposedly deliver cognitive behavioral therapy. Clinically, that therapy is used to address depression, PTSD, and even schizophrenia. (RELATED: AI Won’t Terminate Us. It Will Just Render Us Irrelevant.)

In many ways, people’s experimentation with this technology convinces them that artificial intelligence is the perfect therapeutic aid, because systems like ChatGPT seem to have an answer for everything. AI can be an incredibly useful technology in our world; however, the more we look to a tool for the answers, the more we chip away at our own human connections and experiences. (RELATED: A Generation So Lonely, It Fell in Love With Furniture)

When artificial intelligence is brought into an issue such as therapy, the problems abound. How can a mentally struggling individual find true help in a system designed, in effect, to tell the user what they want to hear? How can a person find solace in a calculating machine that lacks the ability to understand or feel human emotion? They cannot.

Some chatbots are designed to simulate emotional support in the face of grief, loneliness, or the loss of a loved one. One bizarre example was on display Monday, when Jim Acosta aired an interview with an AI avatar of Parkland shooting victim Joaquin Oliver. The interview was roundly seen as strange and wrong, even drawing criticism from left-wing media outlets. (RELATED: Mom, Meet My New AI Girlfriend)

We innately recognize something unsettling about the attempt to reanimate the personalities of deceased loved ones with technology that is altogether incapable of simulating humanity. Regardless of how staunchly one might support artificial intelligence, we can all agree that Jim Acosta did not actually speak with Joaquin Oliver, but with a generative model designed to respond as his parents believed he would. 

While these products are the inevitable result of an AI market that has exploded in recent years, they mark a move by some away from connection with other humans toward a cheap apparition of it. If we do not curb this heavy reliance on generative models designed to agree with us, we will regress in tandem with the progress of this technology. 

READ MORE from Andrew Gondy:

What We Know About the Army Sergeant Who Shot Five Soldiers

Young Conservatives Cannot Afford to Be Neutral on Family