09 Sep

AI Voice-Mimicking Software used in $240K Theft

Cyber security attacks are getting more and more advanced, moving from phishing emails to now voice-mimicking software. Yes, you read that right: voice-mimicking software. News broke this month about a $240K theft carried out with voice-mimicking software, and while we are shocked, we are also reminded to be much more aware and thorough with our security standards. Researchers are calling it the first publicly reported artificial-intelligence heist.

Here’s exactly how the breach happened: the managing director of a British energy company believed his boss was on the phone when he was ordered to send hundreds of thousands of dollars to a secret account. The voice-mimicking software imitated the company executive’s speech and tricked the employee into sending the large sum of money. The director noted in an email that the request was “rather strange,” but the voice was so lifelike that he felt he had no choice but to send the money. It was later reported that a follow-up email was also sent to the tricked employee to make the request seem even more legitimate.

With the expansion of technology, it is inevitable that criminals will use any new tool that accelerates their scams. Voice-synthesis software from AI start-ups copies the rhythms and intonations of a person’s voice and is used to produce convincing speech. Google and smaller tech firms such as the “ultrarealistic voice cloning” start-up Lyrebird have helped refine the resulting fakes and have made the tools more widely available for free and unlimited use.

Synthetic audio and AI-created videos, otherwise known as “deepfakes,” have contributed to growing fears that they will violate public trust, empower criminals, and leave business, personal, and even governmental communication vulnerable in the cyber security world.

But it doesn’t stop there: we’re seeing more AI apps being created that, unfortunately, will be used for criminal manipulation. Take, for example, the Chinese app ZAO, which can put your face in place of a celebrity’s, such as Leonardo DiCaprio’s, across a series of videos. Now, it’s scary to even think about, but imagine breaking news hits your television and it’s the President of the United States on your screen, making what sounds like an absurd demand that all citizens transfer their funds to a special account to regulate spending. It’s a ridiculous request, but at the same time the President of the United States is on your television screen, and, might we add, he looks and sounds exactly like himself.

It’s getting even harder for us to detect whether a voice over a phone call is legitimate. Thieves will often claim that a delayed response is due to being in a car or rushing to catch a business flight. Beyond explaining away delayed audio responses and other strange moments, criminals will use deadlines to pressure a victim. If the owner of a company calls you to transfer money and it seems strange, but they note they’re in a rush and need this done now, it makes you think to yourself, “OK, they’re in a tight situation; that’s why they’re calling me directly to do this.”

With the rise of new technology, we have no choice but to adapt. AI developers are working toward building systems that can detect and combat fake audio, but we still need to recognize that, with such evolving technology, criminals are becoming more agile with their hacking tactics. At CHIPS, we always recommend reviewing your security policies, including educating your team members on what to look out for when it comes to suspicious emails and phone calls.
