Please Don’t Put Words in My Mouth

“Well… now you’re just putting words in my mouth.”

Adobe recently announced Project VoCo at the November Adobe Max conference. The software is purported to take recordings of someone’s voice and then generate new audio that sounds like that person; in a nutshell, it’s Photoshop for audio. Television professionals, book narrators, and podcast creators may be rejoicing, as this would mean less time spent redoing mistakes made in the studio. On the other hand, it could be a malicious social engineer’s dream tool. According to Adobe, the software needs only about twenty minutes of someone’s voice, and then it can recreate that voice exactly. It doesn’t just find recorded words and patch them together; the demo shows it can create speech the person never said. Couple that with the fact that spear phishing of C-suite employees is a growing problem, and you’ve got a volatile mixture. It’s not hard to find twenty minutes of audio of most CEOs and other high-level employees, considering many of them participate in press conferences, speeches, podcasts, and interviews.

How Could This Attack Work in the Real World?

Once the audio is acquired through open-source intelligence (OSINT) and loaded into the program, producing the fake speech could be as simple as typing in what you want the program to say. A malicious social engineer’s attack vector might go like this:

  1. Perform OSINT and find that a CEO will be spending a week overseas for a conference.

  2. Create a fake voicemail from the CEO to the head of finance stating, “Hi, Sue. I’m in London this week, so I need you to talk to our new vendor, Phil Jones, tomorrow to transfer some funds for a critical purchase. He’ll call you around 9 a.m.”

  3. When Sue gets to work and hears this voicemail from her CEO, the pretext has been primed and she’s now expecting a phone call.

  4. The following morning, the malicious visher calls posing as Phil Jones and gives Sue instructions to initiate a wire transfer for a large sum of money.

Isn’t That Scenario Too Implausible?

It may sound far-fetched, but phishing and vishing fraud like this occurs daily without the help of VoCo. The FBI recently reported a 270% increase in “CEO Fraud” since 2015, with an estimated $2.3 billion lost to these attacks over the last three years; adding VoCo to the mix could drive that figure significantly higher.

How Else Might This Be Exploited?

With the increasing use of voice-activated assistants like the Amazon Echo and Google Home, VoCo could also be used to attack these devices. Many of them integrate with IoT devices such as garage doors and home security systems. Imagine if a criminal could pick the lock on a residence and then play an audio file in the homeowner’s voice saying, “Alexa, disable home alarm.” Some systems also require a spoken PIN to disarm, but that could be gathered by phishing the target.

Finally, VoCo could be used as a smear-campaign tactic to sway elections or cause severe controversy. Imagine if multiple “leaked recordings” emerged of a CEO leaving sexually harassing voicemails. The recordings could go viral, causing the company’s stock to plummet. Unlike a doctored photo, a recording made with VoCo would be much harder to disprove, giving the attacker plenty of time to make several lucrative stock trades. How could the CEO prove they never left the incriminating voicemails?

What Can I Currently Do to Mitigate the Threat?

VoCo is still in the development phase and appears to have some limitations, but that will change as the technology is tweaked and improved. The best defense is to continually train employees to be vigilant against vishing and spear-phishing attacks, so they can keep fending them off as the attacks grow more sophisticated with tools like VoCo. You may also want to consider what you allow a voice-activated assistant to control in your home and, if available, set up a disarm PIN on your voice-controlled security system.

Written By: Laurie V.

Sources:
https://www.youtube.com/watch?v=I3l4XLZ59iw&feature=youtu.be&list=PLD8AMy73ZVxVLnQh5m-qK0efH3rKIYGx2
https://krebsonsecurity.com/2016/04/fbi-2-3-billion-lost-to-ceo-email-scams/
https://www.cnet.com/news/scouting-out-a-security-system-that-talks-to-amazons-alexa/