With the rise of messaging apps (such as WhatsApp) and video-based social networks (such as TikTok), it is now commonplace for people to share their voice recordings online. In fact, a recently published McAfee study (PDF) reveals that 53% of adult users share their voice online at least once a week… and that 49% do so up to 10 times in the same period.
It may seem harmless, but those few seconds we share in a voice note are enough for scammers to obtain a credible duplicate of our voice with which they can manipulate our friends and family to their advantage.
What do scammers take advantage of?
Humans use mental shortcuts every day to solve problems, a tendency that psychology calls heuristics. Scams of this kind exploit precisely those mental mechanisms…

…scammers know that our brains are likely to take the shortcut and believe that the voice we are hearing really belongs to the loved one it pretends to be. A 'near perfect' match between the real voice and the synthesized voice may therefore not even be necessary.
McAfee also sees a clear tendency to target older people and/or people with little technological knowledge: in most known cases of this kind of scam, the victims are parents or grandparents who report that a cybercriminal has cloned the voice of a son or grandson.
Ignorance in the face of danger
However, despite the rise and danger of some of the scams based on AI voice cloning (such as fake kidnappings), in which 77% of victims have lost money (more than a third lost over $1,000, while 7% lost between $5,000 and $15,000)…

…36% of adults surveyed say they have never heard of this kind of fraud, which, according to McAfee, “indicates the need for further education and awareness in the face of the rise of this new threat.”
In the words of Steve Grobman, CTO of McAfee,
“Targeted scams based on impersonation are not new, but the availability and access to advanced AI tools is. Now, instead of just making phone calls or sending emails or SMS, with very little effort a cybercriminal can impersonate another person using AI voice cloning technology, thus playing on your emotional connection and sense of urgency to increase the likelihood that you will fall for the scam.”
Two ways to limit the likelihood of being impersonated in a scam:
- Think before you click and share: Who will have access to that audio or video? Do you really know (and trust) all your contacts? Adjust the privacy settings of your posts on those networks where this is possible.
- Use identity monitoring services: They can alert you if your personally identifiable information is available on the Dark Web. Such data lends credibility to impersonation attempts in cases where voice cloning alone is not enough.
And four ways to prevent yourself or your loved ones from falling victim to this kind of scam:
- Set a “key phrase/word” with your children, family or trusted friends, and agree with them to ask for it whenever you call or send a message for help, especially if they are elderly or vulnerable.
- Be skeptical and explicitly ask for information to verify the identity of your interlocutor. For example: “When is your father’s birthday?” With luck, we will catch the scammer by surprise.
- Do not get carried away by your emotions; think before you respond. Does this really sound like the person who claims to be talking to you? Hang up and call that person directly, or verify the information before answering.
- In general, it is good advice not to answer calls from strangers. If they leave a voicemail, you will have time to reflect and contact your loved ones on your own to confirm that they are safe.