Written by: Guest | Best Company Editorial Team
Last Updated: July 8th, 2020
Guest Post by Kayla Matthews
Voice hacking is a relatively new phenomenon in which cybercriminals use advanced technologies, including artificial intelligence (AI), to impersonate people. This emerging threat can play out in a couple of ways.
If hackers create sound clips that mimic your voice, they could use them to gain access to sensitive information. Some banks use voice verification to let users into their accounts, and there are smart speaker apps that let you check your balance or perform other financial tasks by uttering the correct voice commands.
There was also a case in which a company executive received multiple calls, seemingly from his boss, instructing him to transfer a substantial sum of money, which he did. Exactly how the fraudsters fooled him is not known, but they recreated his boss's voice so convincingly that someone who knew it well believed it.
Given that voice-activated gadgets are increasingly common, what can you do to prevent hackers from using your recordings?
Periodically delete the stored recordings of your voice
Google, Apple, and Amazon all store clips of what you say to your smart speaker or voice-activated assistant. However, you can take steps to delete the recordings those companies keep. Amazon also lets you opt out of having human contractors review your voice recordings to improve the Alexa service.
Being able to retrieve and hear the recordings the major smart tech brands hold offers some peace of mind, but the stored clips themselves are a liability: if an unauthorized person gains access to months or years of recordings, they have plenty of material to exploit for voice hacking. Deleting the content frequently reduces that risk.
Beware of urgent-sounding calls
Google made headlines recently when it showed off its Duplex service. That offering can call places like hair salons or restaurants to make reservations on behalf of the people using the technology. The computer-generated voice was so realistic that it even included aspects of casual speech, like a person saying "um" during the conversation.
Some computer-made sounds are not so innocent, though. Many Americans get bombarded with robocalls. Statistics show that 60 percent of people block numbers to avoid them. Although the topics associated with these calls vary, many of them try to make recipients reveal sensitive information.
Analysts familiar with robocalls say that creating realistic-sounding voice spoofing currently takes a lot of work. However, as the technology improves and criminals find faster or more effective ways to use voice hacking, the range of possible scams will only grow.
You might get a call from someone who sounds like your spouse's colleague, saying that something happened to your partner on a business trip and you need to wire bail money. You could hear from a supposed administrator at your child's school telling you to come right away because your child is sick. Once your house is empty, the hackers might rob it.
Real-life examples like these are still rare; the only widely reported one so far is the case mentioned earlier of the executive who thought he was talking to his boss. That deception cost his company the equivalent of $243,000. The best defense against a voice-spoofed call is to hang up and get in touch with someone who can confirm the legitimacy of what you heard.
Change your passwords and take other recommended measures after breaches
Another thing that makes voice hacking more dangerous is that criminals can use genuine information to get you to act in certain ways. Research from First Orion found that 75 percent of scam victims reported that the criminals already had personal information about them and used it for financial gain.
Unfortunately, we live in a time when you cannot necessarily trust a caller just because they seem to have the right information. It's better to be safe than sorry: end the call and contact the company that supposedly reached out to verify the request.
First Orion's data also indicated hackers were more successful because they used information seized during massive data breaches. Do your best to stay informed about attacks as they happen. Then, if you're impacted by one of them, change your passwords and take any other actions the breached company recommends.
First Orion examined instances where scammers impersonated actual companies, not cases in which they faked voices. Still, as the earlier example shows, it is not out of the question that criminals will combine the two tactics more widely.
Keep device software updated
Cybersecurity researchers warn that smart speakers can be hijacked into carrying out commands without the owner's knowledge. Part of the problem is that the speech-recognition systems behind them can decode inputs humans can't hear. Lab demonstrations have shown that such inaudible commands could make a smart speaker do things like unlock a door or deactivate a security camera.
The most straightforward safeguard is to install software updates as soon as they become available. Device manufacturers release them as needed, often to patch known security flaws.
Weigh convenience against security
Voice-enabled devices are handy and increasingly common, but they can leave you vulnerable to several kinds of voice hacking. Beyond the specific tips above, ask yourself whether the convenience of performing actions by voice is worth the risk.
Kayla Matthews, a tech and security journalist, has written articles for sites including WIRED, Information Age, Security Boulevard, and the National Cyber Security Alliance. To see more of her work, follow her on Twitter @KaylaEMatthews or check out her tech blog, Productivity Bytes.