


Authored by Javier Simon via The Epoch Times (emphasis ours),
Artificial intelligence (AI) has revolutionized the way we complete tasks, and it’s becoming a part of our everyday lives. But like many technological innovations, AI can be a double-edged sword.
It can make life easier. And it can open the door to a new generation of scammers and fraudsters who can steal everything from your money to your identity. So it’s important to know what you’re up against in the modern world.
Believe it or not, you may get a call from what sounds like a loved one claiming they’re in a desperate situation and need money. But before you reach for your debit card, understand that this may be a scam.
It could be tied to what’s called voice cloning. Scammers gather a clip of someone speaking from almost anywhere, including social media. They then use voice synthesis technology to generate new speech that sounds identical to the voice they analyzed.
These tricks are also known as grandparent scams, because thieves often dupe older individuals into thinking they are their grandchildren in urgent need of money.
And that’s key. Voice-cloning scammers often create an extreme sense of urgency. If the call feels too intense, it’s okay to hang up. Then call the person directly at a number you know to verify whether there’s any real situation.
Some experts also recommend using secret phrases among your loved ones to confirm their identity. But be sure to make the phrase as obscure as possible. Don’t make it something anyone could find on your social media pages. And don’t share it through email or anywhere else online, as these channels can be compromised as well. It should be shared only by word of mouth among your loved ones.
But what happens when it’s not a loved one, but someone you admire? Voice-cloning scammers have been known to duplicate the voices of celebrities and public officials to generate robocalls that trick people into donating to a cause or investment scheme.
You should immediately hang up on these types of calls. And if you’re really curious, check the official websites or verified social media profiles of these individuals to see whether they are actually involved with any such organization.
Deepfake video calls are similar to voice-cloning calls, but they add another convincing layer: video. Scammers use AI-generated video of fake people, or of real people like your loved ones, to make video calls. In these situations, they may also create a sense of urgency and ask for money. Or they may direct you to a malicious website where you’re tricked into providing sensitive financial information that the scammer can steal.
The rule of thumb is that if it seems incredibly urgent and requires money, a red flag should go up. Hang up the call and contact the person in question directly if possible. And beware of these videos elsewhere. They can appear in online advertisements and across social media—often involving celebrities and news anchors.
Scammers may create AI-generated malicious websites. In some cases, these websites serve as fake stores or marketplaces. You may find the links to these malicious sites on social media or via text or email.
In some cases, they’ll list a popular product you’ve been searching for at an exceptionally discounted price. But if it sounds too good to be true, it probably is. The scammer takes your financial information to “process the order,” and you never receive the product. The bad actor instead keeps your sensitive financial information and either sells it on the dark web or uses it to make fraudulent purchases.
But these go beyond popular products. They could also involve listings for apartments and houses.
If you believe you’ve been a victim of this type of scam, it’s important that you contact your financial institution immediately to see how you may proceed. In some cases, you may be able to get your money back.
But if you feel you’ve been a victim of any type of scam, it’s also important to report it to the Federal Trade Commission at ReportFraud.ftc.gov.
Phishing has long been a favorite of scammers. Phishing scams often involve emails disguised to look like they come from legitimate sources such as your employer, a loved one, or a government agency. The scammer uses these emails to trick you into sending money or divulging sensitive information like your financial details, passwords, and Social Security number. In some cases, the emails also contain links that lead to a fraudulent website or install malware designed to compromise your device and steal sensitive information.
Back in the day, you could spot a phishing email by looking for typos, grammatical errors, illegitimate sender addresses, and the like. But AI has muscled up these emails, and they now seem more convincing than ever. So if you see an email claiming to be from a legitimate source and asking for money or sensitive information, close it. Don’t click on any links. Delete it.
Reach out to the organization in question directly via an official phone number or through their official website.
And keep in mind that financial institutions and businesses will never ask for your password via email, phone call, or text.
AI has made completing tasks easier for many. But that also applies to scammers and fraudsters. Criminals are now armed with the most sophisticated technology, and they’re using it to prey on unsuspecting victims. You may fall for tricks like deepfake phone or video calls, as well as AI-assisted phishing scams. As a result, you could lose everything from your bank account to your retirement savings—even your own identity.
This is why you must remain vigilant. Beware of any type of communication that involves giving up money or sensitive information like your passwords, financial information, and Social Security number. Take a deep breath and carefully analyze what’s in front of you. It may also help to contact a trusted friend. Sometimes, an extra set of eyes can help identify a scam in progress.