Voice deepfakes: watch out for scammers’ new favorite trap – Business AM

Scammers are always one step ahead, and they often have an affinity for the most innovative technologies. After photo and video deepfakes come fake phone calls from people whose voices you are sure to recognize. The technique is almost untraceable, and it can cost you dearly.

It’s a technique as old as the web, or even as old as writing itself: you receive a message from someone close to you who is in a difficult situation, whether due to health problems, an administrative issue abroad, or a bit of all of this at once. To get out of it, he or she begs you to send money. You will of course be reimbursed very quickly, promise. Either way, you trust this person.

And of course, you’ll never see your money again: it was an impersonation scam. This kind of trick by email or via social networks is fairly well known by now. But you could still be fooled if the same scenario plays out over the phone, and you recognize the voice of your loved one.

On the phone with a copycat robot

It’s a perfectly believable scenario, according to Fast Company. After photo and video, deepfake technology has been refined to the point where it can imitate voices convincingly, thanks to the convergence of a few advanced techniques that, taken separately, already have strong dystopian potential. First, synthetic voice generators, increasingly convincing in the hands of a good audio technician. But also ChatGPT-style chatbots, which are starting to generate realistic scripts with real-time adaptive responses. Enough to hold a convincing phone conversation with a human being and make them believe they are really talking to a loved one.

Every year, thousands of people fall victim to voice scams, which are inherently much harder to document than a written message that can at least be screenshotted. Tracing the calls, identifying the scammers (who may operate from abroad) and recovering the funds is far more difficult.

Simulating the boss’s voice to hit the jackpot

The sums extorted this way were estimated at a total of 11 million dollars in 2022, according to Ars Technica, spread across some 5,000 known victims in the USA alone. There is little doubt, however, that many of these scams fly under the radar of the Federal Trade Commission.

And these scams can be much more imaginative than “simple” fake calls for help that prey on victims’ panic and solidarity. In 2019, an energy company was defrauded of $243,000 when criminals faked the voice of its parent company’s boss to order an employee to transfer funds to a supplier. In theory, no business is immune to this kind of audacious theft. And if it means investing in the necessary voice-synthesis equipment, the criminals might as well aim big.

Welcome to a world where nothing is certain and where absolutely any fact, any interaction, can conceal a forgery. Enough to thrill the most cynical authors and dystopian screenwriters. The only way to protect ourselves is to rely on our human intelligence: keep a cool head, contact our loved ones again through another channel, or even adopt a rule that any call must be preceded by an SMS. But that is easier said than done.
