Avoid AI Scams

Know how cybercriminals are targeting people so you can run the other way.



Thanks to artificial intelligence (AI), scammers are getting smarter. These swindlers are using AI technology to clone the voices of family members and then call people, who believe the voice on the line is that of a loved one in desperate need of help. The technology is so good that it's hard to tell the difference between a real call and a fake one.

In March 2023, the Federal Trade Commission (FTC) issued a warning about scammers using AI to “enhance their family emergency schemes.” But the warning may have come a little bit late. 

A McAfee study published in May 2023 revealed that 1 in 4 people surveyed said they had experienced an AI scam. 

These instances of voice-cloning fraud were originally called grandparent scams because they often involved grandparents being contacted by their supposed grandchildren, who needed help but didn't want to involve their parents. But today, with the increased availability of AI voice cloning, these scams have spread to all ages and demographics.

What's the best way to avoid these scams? Have an open discussion with each family member, from grandparents to children, and explain AI, voice fakes and scams. Then establish a code word that family members can ask for over the phone to determine whether a call is real or fake. This is an easy, low-tech way to stop a scam in its tracks.


What you need to know about AI 

We’ve all heard of AI, but what is it, and what does it do? The U.S. Department of State explains that AI is a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations or decisions influencing real or virtual environments.

What does that look like in our day-to-day lives? AI is used in a variety of common technologies:

  • Voice assistants like Alexa and Siri.
  • Facial recognition used to unlock our mobile phones.
  • Language models like ChatGPT, accessed through its app and website.

Common AI scams

With AI technology becoming more prevalent, how can we avoid falling for an AI-generated scam? Below are the most common AI scams and how to avoid them.

Voice-cloning scams

As we mentioned earlier, you've probably heard stories about voice-cloning AI scams on the news, or maybe someone you know has been scammed. In this situation, a person receives a distressed call from a close friend or family member who claims to need urgent help. They've been in a car accident, they've been arrested and need bail money, or, in a worst-case scenario, they've been kidnapped. It sounds exactly like that person's voice, all the way down to their inflection, and they're terrified.

Naturally, people often go into overdrive to render aid. They’ll do anything, even when their loved one requests that money be sent in an unusual way, like via cryptocurrency, wire transfer or gift card. And that is exactly what scammers want. 

The purpose of this type of scam is to send your emotions into overdrive so that you'll do anything to get the person on the phone out of trouble. But how do criminals get the voice on the line to sound just like the voice of your daughter, best friend or grandmother? All they have to do is find a clip of someone's voice online and upload it to an AI voice-cloning program that replicates the voice. This software used to be rare and expensive, but now scammers can subscribe to voice-replicating services for as little as $15 a month.

The prevalence of this technology has led to a sharp uptick in these types of scams since the beginning of 2023.

How can we best avoid AI voice-cloning scams? According to the FTC, if you receive a call like this, the first step is to call the person who supposedly called you, using a phone number you already know, and verify the story using your predetermined code word. If you can't get in touch with the person, look out for requests to send money in atypical ways (more on that below).

Gift cards, wire transfers and other money demands

A warning sign of AI scams is the demand for sending or transferring money in ways that are hard to trace, such as wire transfers, cryptocurrency and gift cards. 

Gift cards

Gift cards are scammers' preferred method of money transfer because they are nearly untraceable once the scammer has the card number and the PIN (the personal identification number, which scammers will always ask for, another red flag). But it may not be just any sort of gift card that's requested. The FTC reports that in the first nine months of 2021, Target gift cards were used in twice as many gift card scams as other card options, followed by Google Play and Apple gift cards.

One in three Americans has been targeted by gift card scams, and about one in four adults who were targeted (approximately 13 million people) said they bought the gift cards. If you receive a request for payment via a gift card, it is most likely a scam and should be reported to the FTC immediately.


Cryptocurrency

Cryptocurrency scams are on the decline after several years of explosive growth. According to reports through June 2023, crypto scams are down $3.3 billion from 2022. This form of payment is popular with scammers because cryptocurrency payments don't come with legal protections or government assurances: Once a payment is made via cryptocurrency, there's no governing body to help you recover your money, though it is worth contacting local authorities and your bank to see what they can do for you.

Wire fraud

Wire fraud is another popular avenue for scammers due to a loophole in the Electronic Fund Transfer Act, passed in 1978. Several types of consumer credit transactions, like car loans, home mortgages, and debit and credit card transactions, are protected. But wire transfers are not, and scammers use this to their advantage.

If unknown people are requesting payment via wire transfer, this should tip you off that you’re being scammed, and you need to report the incident to law enforcement and your bank immediately.

Chatbot scams

AI chatbots are among the newest trends in AI technology, and fraudulent chatbots give scammers an easy way to phish sensitive information from you.

After the introduction of ChatGPT in November 2022, a flurry of competing AI chatbots was released into this new market. So how do you know if the chatbot you are using is legitimate and not a scam? Make sure you use an AI chatbot from a company that you've heard of before, or one that you can easily research and find reputable information about.

Some to try are ChatGPT from OpenAI, Bing from Microsoft and Bard from Google. 

How are chatbots used for scams? Many are used in texting scams in which they masquerade as your bank sending a fraud protection alert, as a recognizable store offering free prizes, or even as a shipper having issues with package delivery. These technologies have gotten very smart, and it may be hard to tell the difference between fake and real text messages. The FTC recommends avoiding this type of scam by never responding to or clicking the links in unexpected texts.

If you receive a fraudulent text like this, forward it to 7726 (SPAM) to help your wireless provider spot and block these types of messages. You should also report it to the FTC.  


Protecting yourself from more than just AI scams

While AI scams are on the rise, there are still many other ways scammers are using technology to swindle you out of your hard-earned money. What might be most surprising is that scams are not aimed only at the often less-tech-savvy older generation; younger generations are being targeted as well.

The FTC reports that in 2021, Gen Xers, millennials and Gen Zers (ages 18–59) were 34% more likely than older adults to report losing money to fraud. The most common type of fraud for this group was online shopping fraud tied to an ad on social media, in which they never received the merchandise.  

People ages 18–59 were also four times more likely to report a loss on an investment scam. Most of these incidents were tied to fake cryptocurrency investment opportunities.

So how do we protect our information from scammers? The consensus among cybersecurity experts is not to let your guard down. Only trust texts and phone calls from known numbers, never send money in an untraceable form and listen to your gut. If something seems too good to be true, it probably is. Technology is going to make criminals smarter and more advanced, but if you feel like something isn't right, it probably isn't.

