Scammers using artificial intelligence (AI) to enhance family-emergency schemes

You get a call. There’s a panicked voice on the line. It’s your son. He says he’s in deep trouble — he wrecked the car and landed in jail. But you can help by sending money. You take a deep breath and think. You’ve heard about these scams. But darn, it sounds just like him. How could it be a scam? Voice cloning, that’s how.

Artificial intelligence is no longer a far-fetched idea out of a sci-fi movie. We’re living with it, here and now. A scammer could use AI to clone the voice of your loved one. All he needs is a short audio clip of your family member’s voice — which he could get from content posted online — and a voice-cloning program. When the scammer calls you, he’ll sound just like your loved one.
So how can you tell if a family member is in trouble or if it’s a scammer using a cloned voice?

Don’t trust the voice. Call the person who supposedly contacted you and verify the story. Use a phone number you know is theirs. If you can’t reach your loved one, try to get in touch with them through another family member or their friends.

Scammers ask you to pay or send money in ways that make it hard to get your money back. If the caller says to wire money, send cryptocurrency, or buy gift cards and give them the card numbers and PINs, those could be signs of a scam.

If you spot a scam, report it to the FTC at ReportFraud.ftc.gov.

Want to stay up to date on the latest scams? Check out the FTC’s consumer alerts: https://consumer.ftc.gov/consumer-alerts

We encourage you to share this information with family members and friends.