Are others listening in?

Your smart gadgets may be vulnerable to voice-hackers

Amazon Echo. File picture. Image: Flickr

The sound of homeowners yanking their Amazon Echo from the wall socket could be heard across Britain this week, after one device was caught listening to a family conversation and sending it to another person.

The recipient, a colleague of the owner, called to tell them to switch off Alexa immediately.

It is not the first gaffe for the smart speaker.

Last year, a six-year-old girl became an internet sensation after a YouTube clip showed her ordering a $170 (R2 137) doll’s house and a packet of biscuits.

Days later a broadcaster on California’s CW-6 uttered the words: “I love when the little girl says ‘Alexa: order me a doll’s house’,” kicking a number of Amazon Echos across the state into action. Amazon promised to refund the transactions.

Then there was the Burger King advert that deliberately tried to activate viewers’ Google Home speakers and make them read out a description of its Whopper burger.

It is not just the paranoid who are growing concerned that hackers might listen in or clone their voices. A photograph of Mark Zuckerberg, the king of technology himself, revealed that he tapes over his laptop’s microphone jack.

The newfound ability to control our lives using voice alone is liberating, but there are risks associated with it.

Simon Edwards, of global cyber security company Trend Micro, says it is a legitimate concern.

“A voice is just zeros and ones to a computer. That means we are able to manipulate voices all the time – like pop stars with auto-tune. So why wouldn’t someone use this manipulation for nefarious purposes?”

Edwards, a security expert at one of the world’s largest enterprise security companies, refuses to put an Amazon Echo in his own home, yet he is aware of the potential such devices have to help our day-to-day lives.

“My wife is disabled, she has multiple sclerosis and would find one of those things quite useful for controlling the environment around her,” he says.

“She uses voice-activated banking because her hand shakes, making it hard to type. However, you could easily replay that.

“You would still need the more traditional security elements to make it completely safe.”

Here lies the problem. While voice-activated gadgets may seem like mere novelties, the way we speak is fast becoming a hot trend in security.

Software company Nuance, which is valued at $4-billion (R50.2-billion), specialises in algorithms that can detect when someone is not who they say they are.

The company’s resident voice biometrics expert, Brett Baranek, says businesses are under pressure to switch to more secure forms of authentication.

The average Briton encounters several security measures a day, from PINs or fingerprint ID to unlock smartphones, to security questions and passwords when logging into online or telephone banking.

“The consumer today is still using a very insecure method of verifying their identity, a mechanism that was conceived in a world where there was no internet,” Baranek says.

Voice recognition does not always get it right, however.

Last year a BBC journalist managed to fool HSBC’s voice system using his twin.

On the surface this appears to be a disaster, but it could also be seen as a success.

Most fraud is committed by the victim’s family or friends, who are likely to have access to personal details and can readily pass a PIN check or security questions, so voice recognition creates a bigger obstacle.

There will be instances where voice hackers crack into accounts, but the scale will be much smaller.

In a world where fraud levels are growing by double digits, it is clear why businesses might opt for voice recognition.

However, there has been much talk about “deepfakes” and Russian trolls potentially manipulating voices and videos to make it appear as if high-profile politicians are saying sensational things.

“I don’t want to make it sound like voice biometrics are invincible,” Edwards says.

“Alexa’s ability to understand accents shows you how good a system is becoming.

“But what this machine is doing is converting your voice to numbers. You wouldn’t need to steal somebody’s voice: just the pattern recognition algorithm that can detect them.”
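To give a rough sense of what “converting your voice to numbers” means in practice, here is a minimal sketch. It is purely illustrative and is not Nuance’s, Amazon’s or any bank’s actual method: the feature dimension, threshold and function names are invented, and real systems derive these numbers from recorded audio with trained models rather than the random stand-ins used here.

    # Minimal, illustrative sketch of a voice-biometric check: enrol a
    # "voiceprint" (a fixed-length vector of numbers) and accept a caller
    # only if a new recording produces a similar vector. Real systems
    # extract these vectors from audio with trained models; here random
    # numbers stand in for extracted features, and the dimension and
    # threshold are invented for the example.
    import numpy as np

    FEATURE_DIM = 128        # hypothetical size of a voiceprint vector
    MATCH_THRESHOLD = 0.85   # hypothetical acceptance threshold

    def make_voiceprint(feature_frames: np.ndarray) -> np.ndarray:
        """Collapse per-frame feature vectors into one fixed-length voiceprint."""
        voiceprint = feature_frames.mean(axis=0)
        return voiceprint / np.linalg.norm(voiceprint)

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        """Similarity between two voiceprints: 1.0 means identical direction."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def verify(enrolled: np.ndarray, claimed: np.ndarray) -> bool:
        """Accept the caller only if their voiceprint is close to the enrolled one."""
        return cosine_similarity(enrolled, claimed) >= MATCH_THRESHOLD

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Stand-ins for feature frames from two recordings of the same speaker...
        speaker_a = rng.normal(size=(200, FEATURE_DIM))
        speaker_a_again = speaker_a + rng.normal(scale=0.1, size=(200, FEATURE_DIM))
        # ...and from a different speaker.
        speaker_b = rng.normal(size=(200, FEATURE_DIM))

        enrolled = make_voiceprint(speaker_a)
        print("same speaker accepted:", verify(enrolled, make_voiceprint(speaker_a_again)))
        print("other speaker accepted:", verify(enrolled, make_voiceprint(speaker_b)))

The sketch also shows why, as Edwards suggests, the stored numbers and the matching code are themselves attractive targets: anyone who obtains the enrolled voiceprint, or can feed the system input that produces a matching one, gets in, which is why more traditional security elements are still needed alongside it.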
