AI Voice Cloning Scams: How They Work and How to Protect Your Family

A call from your child begging for help — but it is not really them. AI can now clone any voice from just a few seconds of audio. These scams are surging. Here is how to protect your family.

Updated: March 2026 · FTC consumer alert · Silent Security Research Team

What Are AI Voice Cloning Scams?

AI voice cloning scams use artificial intelligence to replicate a person's voice and impersonate them over the phone. The technology behind these scams has advanced rapidly — modern AI voice synthesis tools can create a convincing clone of someone's voice from as little as three seconds of recorded audio. That short clip from a TikTok video, a YouTube upload, or even a voicemail greeting is enough.

The cloned voice can then be used in real time to make phone calls that sound exactly like a loved one, a boss, or a trusted authority figure. The FTC issued a consumer alert warning that AI is now being used to "supercharge" impersonation scams and that these attacks are increasing at an alarming rate. The FBI's IC3 reported that in 2022, business email compromise (BEC) — a category that increasingly includes AI voice impersonation — caused losses exceeding $2.7 billion, with AI voice cloning contributing to a growing share of those cases.

How These Scams Work

AI voice cloning scams follow a predictable pattern:

  • Step 1 — Harvest a voice sample. Scammers find audio of their target on social media platforms, podcasts, YouTube videos, company websites, or voicemail recordings. Even a few seconds of clear speech is sufficient for modern cloning tools.
  • Step 2 — Clone the voice. Using commercially available AI tools, the scammer feeds the audio sample into a voice synthesis model. Within minutes, the AI can generate new speech in the cloned voice, saying anything the scammer types or speaks.
  • Step 3 — Call the victim. The scammer calls a family member, colleague, or friend of the person whose voice was cloned. Caller ID is often spoofed to display the real person's phone number. The cloned voice delivers an urgent, emotional scenario designed to trigger panic and immediate action.
  • Step 4 — Demand money. The scammer pressures the victim to send money immediately — typically via wire transfer, gift cards, cryptocurrency, or payment apps. They insist on secrecy and urgency to prevent the victim from verifying the situation.

Real-World Examples

The Grandparent Scam 2.0

The classic grandparent scam has been supercharged by AI. A grandparent receives a call that sounds exactly like their grandchild, crying and claiming they have been arrested, are in the hospital, or have been in a car accident. The caller begs for money and pleads with the grandparent not to tell anyone. In 2023, the FTC reported cases where families lost tens of thousands of dollars to these calls because the voice was indistinguishable from their real grandchild.

Fake Kidnapping Calls

In one of the most disturbing variations, scammers call a parent using a cloned version of their child's voice, screaming and crying. A second scammer then takes over the call, claiming to have kidnapped the child and demanding ransom. These calls have been reported across the United States, and the FBI has warned families that AI makes these scenarios far more convincing than they were even two years ago.

CEO Fraud and Business Email Compromise

AI voice cloning is also used in business contexts. Scammers clone the voice of a company executive and call an employee in the finance department, instructing them to wire funds to a specific account for a "confidential deal" or "urgent vendor payment." One widely reported case involved a UK-based energy company that lost $243,000 after an employee received a call from what sounded exactly like their CEO.

How to Verify a Suspicious Call

If you receive an urgent call from someone who sounds like a loved one or colleague, take these steps before doing anything else:

  • Use your family safe word. If your family has established a code word for emergencies, ask for it. An AI clone cannot provide information it was never trained on.
  • Hang up and call back on a known number. Do not call back the number that contacted you — it may be spoofed. Instead, call the person's real phone number from your contacts. If they do not answer, try another family member or friend who would know their status.
  • Ask a question only they would know. Ask about a private detail — what you had for dinner last night, the name of a childhood pet, or something from a recent conversation. Scammers cannot answer these questions because the AI only clones the voice, not the person's memories.
  • Listen for audio artifacts. AI-generated speech may have subtle irregularities: unnatural pauses, slight robotic quality, inconsistent background noise, or a brief delay before responses. These clues are becoming harder to detect but can still be present.
  • Do not act under pressure. Scammers create urgency specifically to prevent you from thinking clearly. A real emergency can wait the two minutes it takes to verify the call.

How to Protect Your Family

Establish a Family Code Word

Choose a secret word or phrase that every family member knows. It should be uncommon, unrelated to public information, and never shared on social media. If anyone calls in an emergency, they must provide the code word before you take action. The FTC specifically recommends this strategy as a frontline defense against AI voice impersonation.

Limit Voice Exposure Online

Every public video, voice message, or audio clip is a potential source for voice cloning. Consider the following:

  • Set social media accounts to private, especially those with video content.
  • Avoid posting videos where your voice is clearly audible to public audiences.
  • Use a generic voicemail greeting instead of recording your own voice.
  • Be cautious about voice-based social media platforms and audio chat rooms.

Use Call Screening Tools

Call screening apps can block known scam numbers and flag suspicious calls before you answer:

  • Truecaller uses AI-powered caller identification and spam detection to screen incoming calls. It maintains a database of known scam numbers updated by its global user community and can automatically block suspected fraud calls.
  • Google Call Screen (available on Pixel phones and select Android devices) uses Google Assistant to answer unknown calls, ask the caller's purpose, and provide a real-time transcript so you can decide whether to pick up.
  • Carrier tools such as T-Mobile Scam Shield, AT&T ActiveArmor, and Verizon Call Filter offer free and premium scam-blocking features built into your phone plan.

Monitor Your Family's Identity

AI voice scams are often part of broader identity fraud operations. A family identity monitoring service like Aura can alert you if personal information — such as phone numbers, Social Security numbers, or financial accounts — appears in data breaches or on the dark web. Early detection gives you time to lock accounts before scammers can use stolen data to make their impersonation more convincing.

What to Do If You Receive an AI Voice Scam Call

  • Do not send money. No matter how real the voice sounds, do not wire money, send gift cards, or transfer cryptocurrency until you have independently verified the situation.
  • Verify independently. Hang up and contact the person directly through a known phone number, text message, or in person.
  • Document the call. Note the phone number that called you, the time of the call, what was said, and any identifying details. If possible, record the call (check your state's recording laws first).
  • Report to the FTC. File a report at ReportFraud.ftc.gov. The FTC tracks AI impersonation scams and uses reports to pursue enforcement actions.
  • Report to the FBI IC3. Submit a complaint at ic3.gov. The FBI's Internet Crime Complaint Center aggregates reports to identify patterns and build cases against organized fraud rings.
  • Alert your bank. If you did send money, contact your bank or payment provider immediately. Time is critical for recovering wire transfers and reversing fraudulent transactions.
  • Warn your family. If scammers targeted you using a family member's cloned voice, alert the rest of your family. The same voice clone may be used to target other relatives.

Protecting Elderly Family Members

Older adults are disproportionately targeted by AI voice scams. The FTC reports that adults over 60 lose more money per incident in impersonation scams than any other age group. Here is how to help protect elderly family members:

  • Have the conversation. Explain that AI can now clone voices and that a call from a "grandchild in trouble" may not be real. Many older adults are unaware this technology exists.
  • Set up a family safe word and practice using it. Make sure your elderly family member is comfortable asking for it and understands why it matters.
  • Install call screening. Set up Truecaller or your carrier's scam-blocking service on their phone. Configure it to block high-risk calls automatically so suspicious calls never ring through.
  • Establish a verification buddy. Designate a trusted family member whom your elderly relative can call immediately to verify any urgent request before acting. Make sure this person's number is saved and easily accessible.
  • Reduce their voice footprint. Help them change their voicemail to a generic greeting and review their social media privacy settings to limit public access to audio or video content.
  • Consider identity monitoring. An Aura family plan can monitor an elderly family member's personal information and financial accounts, providing early warning if their data is compromised.

Bottom Line

AI voice cloning scams exploit the most powerful trigger scammers have: the sound of someone you love in danger. The technology is only getting better and cheaper. Your best defenses are low-tech — a family code word, a policy of always hanging up and calling back, and healthy skepticism toward any urgent call demanding money. Set these up with your family today, before the call comes.

Frequently Asked Questions

How do scammers clone someone's voice with AI?

Modern AI voice cloning tools can create a convincing replica of a person's voice from as little as three seconds of audio. Scammers harvest voice samples from social media videos, TikToks, YouTube clips, voicemail greetings, and even recorded phone calls. The AI analyzes the speech patterns, tone, and cadence, then generates new speech that sounds nearly identical. The FTC has warned that these tools are now cheap, widely available, and increasingly difficult to distinguish from real voices.

What should I do if I get a call that sounds like a family member in distress?

Hang up immediately and call the person back on their known phone number — not the number that called you. Do not send money, gift cards, or cryptocurrency while on the original call, no matter how urgent it sounds. If you cannot reach the person, contact another family member or friend who may know their whereabouts. The FBI's Internet Crime Complaint Center (IC3) advises that legitimate emergencies can always be verified through a separate call.

Can AI voice cloning be detected?

AI-generated voices are becoming harder to detect, but there are still clues. Listen for unnatural pauses, robotic cadence, background noise inconsistencies, or a slight metallic quality. Some calls may have a brief delay as the AI processes responses. However, the technology is improving rapidly, so behavioral verification — like a family safe word or calling back on a known number — is far more reliable than trying to detect AI audio by ear.

What is a family safe word and how does it protect against voice cloning scams?

A family safe word is a secret code word or phrase that only your family members know. In any urgent call requesting money or action, you ask the caller for the safe word before proceeding. Because the AI clone only replicates the voice — not the person's private knowledge — a scammer cannot provide the correct safe word. Choose something uncommon and unrelated to public information. The FTC recommends this as one of the most effective defenses against AI impersonation scams.

Where do I report an AI voice cloning scam?

Report the scam to the FTC at ReportFraud.ftc.gov and to the FBI's Internet Crime Complaint Center at ic3.gov. If you lost money, also contact your bank or payment provider immediately to attempt recovery. File a report with your local police department as well. Reporting helps law enforcement track these scams and warn others. If the scam involved impersonation of a specific company or government agency, report it to that organization directly.