FBI Warns of Fake Video Scams

The FBI is warning of AI-assisted fake kidnapping scams:

Criminal actors typically will contact their victims through text message claiming they have kidnapped their loved one and demand a ransom be paid for their release. Oftentimes, the criminal actor will express significant claims of violence towards the loved one if the ransom is not paid immediately. The criminal actor will then send what appears to be a genuine photo or video of the victim’s loved one, which upon close inspection often reveals inaccuracies when compared to confirmed photos of the loved one. Examples of these inaccuracies include missing tattoos or scars and inaccurate body proportions. Criminal actors will sometimes purposefully send these photos using timed message features to limit the amount of time victims have to analyze the images.

Images, videos, audio: It can all be faked with AI. My guess is that this scam has a low probability of success, so criminals will be figuring out how to automate it.

Posted on December 10, 2025 at 7:05 AM • 4 Comments

Comments

mw December 10, 2025 7:17 AM

I don’t think the probability of success is low. It depends on the audience. Any educated scientist or engineer might detect this as a scam, but many people without those skills will be trapped.

Clive Robinson December 10, 2025 9:34 AM

@ Bruce, ALL,

The moment I read,

“The FBI is warning of AI-assisted fake kidnapping scams”

My brain said

“Another logical extension”

Then,

“Why did it take so long…”

My first thought coincides with your after quote comment of,

“Images, videos, audio: It can all be faked with AI.”

But a further thought was OK “Images, videos, audio” but “What will be next?”

Will the AI be capable of being “interactive”, such that it can realistically beg/plead for their life?

Which makes me further think people should have “fail-safe words” that they can use to act as an “authenticator”.

It’s one of the things used in certain military circles to stop “fake rescue” interrogation tricks.

KC December 10, 2025 10:30 AM

Nice PSA. Report kidnapping scams to Liam Neeson, er, uh IC3.gov.

Many news stories. Here’s one about sisters.

YouTube: ‘AI scam tricks woman into thinking her sister was kidnapped’ (2:45)

BCS December 10, 2025 6:06 PM

I’d like to see this scam prosecuted at a similar level as if it were a real kidnapping for ransom.

If the success rate is 0.1% and the odds of getting caught and spending a decade or three in prison are about the same, then it’s a losing proposition.
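
A rough back-of-the-envelope sketch of that trade-off, for illustration only. Every number below is a hypothetical assumption, not a figure from the post or the FBI notice:

```python
# Expected-value sketch of the deterrence argument above.
# All numbers are hypothetical assumptions chosen for illustration.

p_success = 0.001        # assumed scam success rate (0.1%)
p_caught = 0.001         # assumed chance of arrest and conviction (same order)

ransom = 5_000           # assumed payout per successful scam, in dollars
years_in_prison = 20     # assumed sentence if convicted ("a decade or three")
value_per_year = 50_000  # assumed dollar value the scammer places on a free year

expected_gain = p_success * ransom
expected_loss = p_caught * years_in_prison * value_per_year

print(f"Expected gain per attempt: ${expected_gain:,.2f}")
print(f"Expected loss per attempt: ${expected_loss:,.2f}")
print("Losing proposition" if expected_loss > expected_gain else "Still profitable")
```

Under these assumptions the expected loss dwarfs the expected gain, which is the point of the comment: if the chance of conviction is on the same order as the chance of success, the scam only pays when the ransom vastly exceeds the cost of the prison time.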
