The Ferrari scam: Deepfakes that almost tricked an executive
Sep 11, 2024
It was all too real.
Ferrari is among the companies hit by the recent surge in deepfake scams targeting businesses. Fortunately, the attempt failed thanks to the vigilance of one executive.
On a seemingly ordinary Tuesday morning, a Ferrari executive began receiving unexpected WhatsApp messages that appeared to come from CEO Benedetto Vigna. The messages, sent from an unfamiliar number, featured a profile picture of Vigna in a suit and claimed the executive was needed for an important acquisition deal. The executive was also instructed to sign a non-disclosure agreement.
Though the messages seemed suspicious, they were initially quite convincing.
The sophistication of the scam:
The messages were detailed and referenced internal company matters, which made them seem more authentic
The scammers also called the executive and imitated Vigna's southern Italian accent almost perfectly. Imitating someone's accent adds sophistication, because we usually associate deepfakes with flat, mechanical tones
WhatsApp, a channel where authenticity is hard to verify, was used alongside uncannily realistic audio
Thankfully, the executive noticed subtle differences in intonation and immediately flagged the communication. To verify that it was really Vigna speaking, the executive asked a question only Vigna could answer: about a book he had recently recommended. When the caller could not answer, the impersonation was confirmed.
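The executive's question is a classic challenge-response check: verify identity with a shared secret the impostor cannot know. A minimal sketch of that idea in Python (the question and answers here are purely illustrative, not Ferrari's actual process):

```python
def verify_challenge(answer_given: str, expected_answer: str) -> bool:
    """Return True only if the caller's answer matches the shared secret.

    Whitespace and letter case are normalized so a correct answer is not
    rejected over trivial formatting differences.
    """
    def normalize(s: str) -> str:
        return " ".join(s.lower().split())

    return normalize(answer_given) == normalize(expected_answer)

# An impostor who dodges or cannot answer the question fails the check.
print(verify_challenge("I don't recall", "the book I suggested"))      # False
# The real person, answering in their own words and casing, passes.
print(verify_challenge("The Book  I Suggested", "the book I suggested"))  # True
```

The key design point is out-of-band knowledge: the secret must come from a prior shared interaction, not from anything a scammer could scrape from public sources or internal leaks.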
Why should we be concerned?
CEOs do not talk to most employees every day. After a tiring day at the office, if you received a deepfake call, would you still be vigilant enough to pick up subtle cues from someone you rarely speak with?
Accents, idiosyncrasies in speech, and other quirks are what make a voice sound human.
Very little additional data is needed to fine-tune widely available large models to sound exactly like someone, imitating their intonation, mannerisms, and accent. Remote work and offices spread across different geographical regions will make impersonations even harder to identify.
As deepfake technology becomes more sophisticated, it is important to have safeguards in place. Something as simple as a security question that must be answered can help protect your business operations. Tools like Karna detect deepfakes in real time and are an effective way to protect your business from deepfake attacks. Book a demo with us soon!
2024 © Project Karnā Inc.