Arizona Mom Claims AI Cloned Daughter's Voice in Terrifying $1 Million Ransom Scam

By Steve Straub

Artificial intelligence (AI) scams have reached a new level of sophistication: a mother in Arizona reports that scammers used AI to mimic her daughter's voice while demanding a $1 million ransom in a frightening new voice-cloning scheme.

Jennifer DeStefano, the concerned mother, told WKYT that she “never doubted for one second it was her” daughter’s voice, describing the incident as “freaky” and deeply unsettling.

This revelation comes amid an increase in "caller ID spoofing" schemes, in which scammers pretend to have kidnapped a family member and threaten harm unless a ransom is paid.


Scottsdale resident DeStefano recalled receiving a call from an unfamiliar number, which she initially considered ignoring but decided to answer since her 15-year-old daughter, Brie, was on a ski trip.

Upon answering the call, DeStefano heard her daughter’s voice sobbing, exclaiming, “Mom!” and “I messed up.”

She then heard a man’s voice telling “Brie” to put her “head back” and “lie down.”

The man proceeded to threaten DeStefano, saying, “You call the police, you call anybody, I’m going to pop her so full of drugs, I’m going to have my way with her, and I’m going to drop her off in Mexico.”

Brie’s voice could be heard in the background, pleading for help.

Initially demanding a $1 million ransom, the faux kidnapper lowered the amount to $50,000 when DeStefano insisted she didn’t have the funds.

The ordeal finally ended when another mother at DeStefano's other daughter's studio called 911 and confirmed that Brie was safe on her ski trip.

However, DeStefano remains convinced that the voice she heard was her daughter’s, stating, “It was completely her voice. It was her inflection. It was the way she would have cried.”

The identity of the scammer remains unknown, but computer science experts say voice cloning technology has advanced to the point where someone’s tone and manner of speaking can be recreated from only a few seconds of audio.

Subbarao Kambhampati, a computer science professor and AI authority at Arizona State University, explained that it now takes just “three seconds” of a person’s voice to create a near-perfect imitation, including their “inflection” and “emotion.”

DeStefano found the voice simulation particularly concerning, given that Brie does not have any public social media accounts featuring her voice.

Still, the few public interviews she has given for sports and school events could have provided a large enough sample of her voice for the AI to imitate.

FBI experts caution that fraudsters often find their targets on social media, with Assistant Special Agent Dan Mayo of the FBI's Phoenix office advising people to ask the scammers questions about the supposed abductee that the callers would not be able to answer.

He also suggested looking for red flags, such as unfamiliar area codes or international numbers.

DeStefano warned on Facebook that people should inform authorities if they or someone they know experiences a similar scam, and urged families to establish an emergency code word or question they can use to verify that a caller is genuine.

This advice comes amid a recent surge in fake kidnapping schemes, with multiple instances reported on social media platforms like TikTok.