A tearful voice warns: “They kidnapped me, Dad. I need you to help me.” Desperation seizes the father, who does everything he can to keep his child from being harmed. But there is no kidnapping: scammers cloned the supposed victim’s voice using Artificial Intelligence.
Cases like this are becoming more and more common, and as the technology advances, the outlook is not encouraging.
In the United States, the Senate Special Committee on Aging sent a letter to the chair of the Federal Trade Commission to express concern about the “growing threats posed by fraud and scams related to Artificial Intelligence.”
According to the senators, there is “an increase in cases of scammers using AI to impersonate loved ones and scam people, often the elderly.”
However, there are ways to avoid voice cloning scams.
How to prevent Artificial Intelligence voice cloning scams
The first thing to know is that the more audio and video of ourselves we share on social networks, the greater the chance that our voice will be cloned. Cutting back on this kind of sharing, if not stopping it altogether, is always recommended.
But if your voice has already been cloned, what can you do?
Niklas Myhr, a professor at Chapman University, recommends two important steps to prevent fraud.
The first: agree on a secret word with your loved ones. “You need to somehow verify that this is the real person,” Myhr says.
And the second: always be skeptical in these kinds of situations, not only of voices but also of videos. “You don’t have to give in to pressure. Today there are voice impersonations, and soon there will be more people trying to scam with video impersonations. You have to be vigilant,” he stressed.