
How To Spot (and Avoid) AI Voice Scams

Fraudsters need just three seconds of audio to “clone” a voice and use it to scam you. Avoid the latest AI voice scams by learning the warning signs.



      Are Scammers Targeting You With AI Voice Scams? 

      Despite its promises of precision and efficiency, artificial intelligence (AI) has also become a massive threat to your finances, family, and identity. In particular, AI voice scams are on the rise, as fraudsters use advanced technology to clone your loved ones’ voices, automate phone scams, and trick you into giving up money and sensitive information. 

      According to the latest research [*]:

      Scammers need only three seconds of audio to “clone” a person’s voice for use in scam calls. Even worse, 77% of AI voice scam victims lose money. 

      News clips, videos on social media, and even your voicemail greeting can give scammers all they need to target you with an AI voice scam. 

      In this guide, we’ll explain how AI voice scams work, the warning signs to look out for, and how you can stay one step ahead of the latest scams. 


      What Are AI Voice Cloning Scams? How Do They Work?

      AI voice scams occur when scammers use generative AI software to analyze a recording of someone’s voice and create a cloned version to use in a scam. 

      Fraudsters can find an audio clip online (or via your voicemail) and then use this clone to activate voice-controlled devices, produce deepfake videos, run voice phishing scams to ask your loved ones for money or personal information, and even commit virtual kidnapping.

      In recent years, we’ve seen a rapid evolution in voice cloning technology. With new deep-learning algorithms, it’s possible to create realistic and convincing voice clones that imitate accents, pauses, and other subtle vocal nuances. 

      Between caller ID spoofing and voice cloning, answering unknown phone calls has become extremely dangerous. If you fall for an AI voice scam, someone could get access to your devices, or exploit your credit cards, bank accounts, and personal identification.

      Here’s how AI voice scams typically play out:

      • Fraudsters research victims online or on social media. Most AI voice scams are highly targeted. Cybercriminals look for information they can use about a victim’s family or friends via TikTok, Instagram, and other social media accounts or online sources. For example, scammers cloned the voice of an Arizona family’s daughter while she was away on a skiing trip and then claimed she had been kidnapped [*].
      • Next, they choose a voice to “clone.” The con artists find audio or video clips to use in their scam — often pulled from social media videos. If your family members are active online, they could be easy targets.
      • The caller claims to be your friend or family member. Using AI voice cloning technology, they call and trick you into believing you’re talking with a loved one, like a grandchild.
      • They claim to be in trouble. The scammer often says there’s an emergency (such as a car accident). Sometimes, an accomplice may join the call to impersonate a third party, like a lawyer or police officer. The goal is to make you panic. 
      • They ask for money. As the scammers gain your trust — and create a sense of urgency — your “family member” asks for your help. Typically, they want you to send money, often through a gift card or wire transfer. 
      🏆 Get advanced protection against the latest scams and threats. Aura’s award-winning digital security solution uses advanced technology to block scam calls, warn you of phishing attacks, and protect your identity and finances. Try Aura free for 14 days.

      How To Quickly Identify an AI Voice Scam

      Although it can be difficult to tell whether you’re talking to an imposter, several tell-tale signs can tip you off that you’re dealing with an AI voice scam.

      Here are five warning signs of an AI voice scam:

      • You only briefly “hear” your loved one’s voice. Fraudsters know that the longer they use a cloned voice, the more likely you will catch on. In most cases, you'll only briefly hear your family member or friend — usually in a distraught tone or crying — as they explain the situation or say, "I'm in trouble" or "I messed up." 
      • They can’t answer simple questions. Fraudsters can clone a voice, but not a person’s memories or personality. If the caller can’t give you straight answers, or hesitates when you ask basic questions, these are red flags. 
      • You’re called from an unknown number. Most AI voice scams start with an unsolicited call from a number you don’t recognize. Quite often, the call comes from another country, particularly known phone scam hotspots such as Nigeria, Mexico, or India.
      • Someone else quickly takes over the call. Scammers often start the call by using the cloned voice before passing the phone over to another person who pretends to be a kidnapper, attorney, or law enforcement officer.
      • You’re told to pay a ransom via cryptocurrency or gift cards. Scammers prefer to use difficult-to-trace payment methods so that police and victims won’t be able to recover funds or track the criminals. 

      💡 Resource: How To Identify a Scammer On The Phone [With Examples]

      The 5 Most Common AI Voice Cloning and Deepfake Scams

      1. Fake kidnapping phone scams
      2. Grandparent scam calls
      3. Fake celebrity endorsement videos
      4. Scammers cloning your voice to access accounts
      5. Calls from friends who desperately need money

      AI voice scams pose a growing concern, with “scam likely” calls a common sight on today’s smartphones and caller IDs. 

      Let’s take a closer look at five of the most common AI voice scams you need to watch out for in 2024:

      1. Fake kidnapping phone scams

      Scammers target families, especially those with a large online presence, by cloning a child’s voice and then calling one of the child’s parents to claim that the child has been kidnapped. To create a sense of urgency, fraudsters make it seem as if the child is crying and begging for help — before quickly taking over the call to act as the “kidnapper” and demand a ransom.

      An FBI special agent in Chicago reported that families in the USA lose an average of $11,000 in every fake kidnapping scam [*]. 

      Here’s what to do to avoid these AI voice scams:

      • Discuss a plan with friends and family. A proactive approach helps; for instance, you, your family, and friends can decide on a code word to use on the phone to verify that you are speaking with each other.
      • Always contact your loved one before acting. Scammers pressure you to act quickly, and try to keep you on the phone to ensure that you don’t call the police. Before taking action, find a way to contact the “kidnapping victim” on another phone to verify or disprove the story.
      • Ask questions. Even though it’s an emergency, you must remain calm and figure out if you’re really talking to your family member or friend. Ask questions about specific memories, events, or things that only the real person will be able to answer. If you detect any hesitation or get wrong answers, it’s probably a scam.
      🛡 Fight back against online scammers. Fraudsters are constantly using new methods to target their victims. Keep up with their schemes by signing up for Aura's AI-powered, all-in-one digital security solution. Try Aura for free today.

      2. Grandparent scam calls

      Grandparent scams occur when fraudsters pose as family members in danger and persuade elderly victims to share sensitive information or pay bogus fees, fines, or ransoms. AI voice cloning technology makes these scams more convincing than ever before — which is a serious risk for vulnerable elderly people.

      In one example, several grandmothers in Canada lost thousands of dollars to these AI voice scams last year. Fraudsters pretended to be a grandchild who was arrested, and convinced victims to send money urgently — and not tell anybody because there was a supposed gag order in place [*]. 

      Here’s what to do to avoid these AI voice scams:

      • Be skeptical of urgent requests. If you receive unsolicited calls from someone in an emergency situation, proceed with caution. Before acting, try to verify the person’s identity by asking questions or contacting other family members.
      • Avoid sharing personal information. If an unknown caller asks for your personally identifiable information (PII), such as your PIN or credit card numbers, hang up immediately.
      • Be wary of payment methods. Consider it a red flag if someone asks you to send money via wire transfers or gift cards. Unlike credit card payments, these methods offer no protections if it’s a scam, and you’ll find it almost impossible to get your money back.

      💡 Related: How To Avoid the 12 Worst Scams Targeting Seniors in 2024 

      3. Fake celebrity endorsement videos

      Scammers use AI to create convincing videos that appear to feature real celebrities endorsing products or services — like those from Apple, McAfee, and Amazon — which can then trick consumers into buying illegitimate products.

      In January 2024, The New York Times revealed that some Taylor Swift fans fell for an AI video scam in which the singer appeared to endorse Le Creuset cookware [*]. 

      Here’s what to do to avoid these AI voice scams:

      • Don’t believe everything you see and hear online. Scammers use sophisticated AI image generators and voice cloning to create convincing ads and websites. But if it seems too good to be true, it probably is — so tread carefully!
      • Look for signs that a video is fake. You can spot deepfakes if you look closely at videos featuring celebrities. These videos often have blurry spots, changes in video quality, and sudden transitions in the person's movement, background, or lighting.
      • Research companies before making purchases. Scammers impersonate celebrities to win your trust and persuade you to buy the products they’re trying to sell. But you should do your due diligence on any company by checking out third-party reviews on reputable customer review sites like Trustpilot.


      4. Scammers cloning your voice to access accounts

      Scammers can use AI voice scams to contact financial institutions and fool bank employees into divulging information or making transfers. With enough samples of you speaking, scammers can create an AI-generated voice to access your bank account details and steal your savings.

      Florida investor Clive Kabatznik had a lucky escape when fraudsters used AI voice cloning to impersonate him in order to transfer money to another account. Thankfully, Bank of America representatives spotted the scam and hung up [*]. But many other victims are not so lucky.

      Here’s what to do to avoid these AI voice scams:

      • Create complex security questions. Your bank will ask any caller security questions before discussing your account on the phone, so make sure you create answers that only you will know (and never share them with anyone else).
      • Use two-factor authentication (2FA). This additional security layer makes it harder for fraudsters to access your account. You can use various methods of 2FA, including SMS codes, biometrics, or a hardware security key.
      • Set up bank alerts. In your online bank app settings, you can set up alerts to get notifications any time someone logs in (or tries to make changes to your account). This simple step can make all the difference.

      💡 Related: How To Get Your Money Back If Your Bank Account Was Hacked

      5. Calls from friends who urgently need money

      Fake emergency scams are similar to grandparent scams. However, fraudsters target anyone — not just elderly people. These scams start with a spoofed phone call claiming to be from your family member or friend. As imposters gain your trust, they explain that they are in urgent need of money (due to a supposed emergency). 

      The scammer often insists on secrecy, plays on your emotions, and pressures you into sending money immediately. An Ontario man thought he was sending $8,000 to his friend, who claimed he had been in a serious accident — but it was a scam [*].

      Here’s what to do to avoid these AI voice scams:

      • Verify the caller's identity. If you receive a call from a friend or family member claiming to be in trouble (especially if the person asks for money), resist the pressure to act immediately and ask lots of specific, personal questions.
      • Offer to meet in person or do a video call. An actual friend or relative shouldn’t take issue with these requests, but a fraudster will be evasive and come up with excuses. If the caller doesn’t want to do either, regard this as a red flag. 
      • Refuse to use wire transfers, cryptocurrency, or gift cards. These methods don't offer the same payment protections as credit cards. If you want to help but the caller insists on these methods, it’s a scam — hang up.

      💡 Related: How To Identify a Scammer On the Phone

      What To Do If You Receive a Suspicious Phone Call From Someone You Know

      Even if you end up on the phone with a scammer, it’s not too late to prevent a bad situation from getting worse.

      Here are the essential steps you need to take to protect your finances and identity:

      • Never give out personal information. Sharing something as simple as your address or email may give crooks enough to steal your identity.
      • Hang up and verify. If the caller asks for sensitive information or pressures you on the phone, end the call and use a known number to directly contact the person who claimed to be calling you. Find out if they’re really in trouble. If you can’t get in touch, call other friends and family members to verify the story.
      • Call all financial institutions immediately. If you share sensitive information or believe the scammers can access your financial accounts, you must notify your bank and credit card issuers of potential identity fraud as soon as possible. Cancel your cards, and attempt to reverse any fraudulent transfers. It’s also wise to place fraud alerts on your accounts to get notifications about further attempts to steal your money.
      • Block the scammer. It's essential to protect yourself from future scams by blocking the scammer's phone number or email address. Aura uses AI to help you block unwanted calls, spam text messages, and vishing attempts.
      • Secure your accounts. To protect your online accounts against financial fraud, change your passwords and enable two-factor authentication (2FA).
      • Place a credit freeze. You can prevent scammers from opening new accounts in your name by freezing your credit. Contact each of the three credit bureaus individually — Experian, Equifax, and TransUnion.
      • Report the call. If you suspect a scam, report the call to the Federal Trade Commission (FTC) either online or by calling 1-877-382-4357. If you gave up sensitive information, you should also file an official identity theft report at IdentityTheft.gov. You can bring this report with you to your local law enforcement to file a police report. 
      • Warn others. If you’ve been impersonated online, warn your friends and contacts about the scam, as they could be next.
      🏆 Don’t settle for second-best online and scam protection. Aura’s all-in-one digital security solution has been rated #1 by Forbes, TechRadar, and more. Try Aura for free today.

      How To Protect Yourself and Your Family From Phone Scams

      Phone scams are a growing risk for Americans — and AI will only make them harder to avoid as these scams become more sophisticated. 

      Here are seven ways to keep yourself and your family safe from phone scams:

      • Create a family “safe word” to use on the phone. Choose a password or phrase that only your family members know. If the person on the other line doesn’t respond to the question correctly, you’ll know it’s an AI voice scam. 
      • Ask someone else to call 911 while you’re on the call. If the call is indeed a scam (or an emergency), having a second person call 911 helps ensure the safety of you and those around you.
      • Make sure you contact your loved ones. By directly contacting the loved ones involved, you can confirm if the information you received over the phone is accurate and not a result of a potential phone scam.
      • Limit what you share on social media and online. Scammers can only use the information that’s available to them. The fewer videos, voice recordings, and personal details cybercriminals can find about you online, the harder it will be to target you with AI voice scams.
      • Screen phone calls with an AI scam blocker. Aura’s AI-powered tools can analyze call patterns, voice characteristics, and other indicators to determine the likelihood of a call being a scam.
      • Use strong passwords and 2FA on all accounts. Create strong, unique passwords for each account by combining letters, numbers, and special characters in order to make your passwords difficult to crack. Adding 2FA with facial recognition or an authenticator app makes your accounts far harder to hack.
      • Sign up for an AI-powered digital security tool. These tools offer proactive defenses against emerging cyber threats. In addition to AI voice scams, you can combat malware, ransomware, phishing attacks, DDoS attacks, and zero-day vulnerabilities.

      💡 Related: How Much Does LifeLock Cost For Seniors? (2024 Price Breakdown)

      The Bottom Line: AI Is Empowering Scammers — Aura Can Help

      As AI technology advances, more criminals use it to create convincing scams, which makes it harder to spot the warning signs. If you fall for these elaborate cons, the perpetrators can quickly access your personal information, trick you into sending them money, or even steal your identity.

      Signing up for an all-in-one digital security service is the best way to protect yourself against AI voice scams and almost every type of cyber fraud in 2024. 

      Aura safeguards you and your family with award-winning identity theft and credit protection, including 24/7 fraud monitoring across all of your financial accounts, devices, and credit files. 

      Aura’s easy-to-use cybersecurity suite includes AI spam and scam call blocking to stop scammers from getting on the phone with you. And in the worst-case scenario, you’ll get peace of mind knowing you have 24/7 access to Aura’s dedicated team of U.S.-based White Glove Fraud Resolution Specialists — along with up to $5 million in insurance coverage for eligible losses due to identity theft.

      Use AI to fight fire with fire against scammers. Try Aura free for 14 days.