Scammers are targeting parents by using AI to clone their children’s voices and then calling with fake emergencies and ransom demands. The rapid rise of artificial intelligence technology has allowed scammers to create sophisticated scams with as little as three seconds of a victim’s voice.
Scammers are using artificial intelligence to mount terrifying scams in which social media users’ voices are cloned; the fraudsters then call their targets’ parents, pretend to be in trouble, and beg for money to get out of it.
So-called imposter scams, in which a fraudster impersonates someone in order to steal money, are the most common type of scam in the US, costing Americans $2.6 billion in 2022 alone, the Federal Trade Commission reported.
Artificial intelligence has now supercharged the ‘family emergency’ scam, in which the criminal convinces the victim that a family member is in distress and in immediate need of cash.
One in four respondents to a McAfee survey in April 2023 said they had some experience with an AI voice scam, and one in ten said they had been targeted personally.
Scammers need as little as three seconds of audio, which can easily be pulled from a social media clip, to clone a person’s voice, McAfee’s study found.