Jul 6 2020
Scientists at the Technical University of Darmstadt, together with partners from France and the United States, have designed a novel device that can detect when smart home devices stream audio recordings to the Internet without the users' consent.
Manufacturers of smart home devices are increasingly adding voice assistant capabilities to a wide range of products such as doorbells, security systems, thermostats, televisions, and smart speakers.
As a result, many of these devices are fitted with microphones, which raises considerable privacy concerns. For example, users are not always informed when audio recordings are sent to the cloud, nor do they know who may have access to those recordings.
Wrong Wake Words
The researchers from the Profile Area Cybersecurity (CYSEC) of the Technical University of Darmstadt and their collaborators were able to demonstrate that several devices with built-in voice assistants can accidentally listen in on conversations.
Usually, a wake word activates voice assistants, as in the case of Amazon’s Echo, which is typically woken up by sentences beginning with “Alexa.” The researchers performed a wide range of experiments with voice assistants such as Alexa and identified numerous English words that Alexa incorrectly interprets as wake words.
Such false wake words activate the voice assistant, triggering an unintended transmission of audio. Examples include everyday words like “mixer” or “letter.”
In their experiments, the scientists used terms spoken both by human voices and by synthesized speech. To address this issue, they designed a proof-of-principle device that identifies such anomalies and built a fully functional prototype.
Dubbed LeakyPick, the device can be placed in a user’s smart home, where it routinely probes nearby voice assistants with audio commands.
The resulting network traffic is monitored for statistical patterns that indicate audio transmission. LeakyPick then identifies which devices are recording and uploading audio without the user’s knowledge and notifies the user.
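The article describes this probe-and-monitor approach only at a high level. A minimal sketch of the idea, assuming a Python host on the same network with scapy installed, a prerecorded probe word, and a known device IP address, might look like the following (the IP, filename, window length, and 3-sigma threshold are illustrative assumptions, not LeakyPick’s actual parameters):

```python
# Hypothetical sketch of a probe-and-monitor test (not the authors' code).
import statistics
import subprocess
from scapy.all import sniff

DEVICE_IP = "192.168.1.42"      # assumed IP of the voice assistant under test
PROBE_WAV = "probe_mixer.wav"   # assumed recording of a candidate wake word
WINDOW_S = 10                   # observation window in seconds

def outbound_bytes(window_s: int) -> int:
    """Sum the sizes of packets the device sends during one observation window."""
    pkts = sniff(filter=f"src host {DEVICE_IP}", timeout=window_s)
    return sum(len(p) for p in pkts)

def baseline(n_windows: int = 6) -> tuple[float, float]:
    """Estimate mean and standard deviation of idle (benign) outbound traffic."""
    samples = [outbound_bytes(WINDOW_S) for _ in range(n_windows)]
    return statistics.mean(samples), statistics.pstdev(samples)

def probe_and_check() -> bool:
    """Play the probe word, then flag a statistically unusual traffic burst."""
    mean, std = baseline()
    subprocess.run(["aplay", PROBE_WAV], check=False)   # play probe through a speaker
    observed = outbound_bytes(WINDOW_S)
    # Simple z-score test: a large positive deviation suggests audio is being uploaded.
    return observed > mean + 3 * max(std, 1.0)

if __name__ == "__main__":
    if probe_and_check():
        print(f"{DEVICE_IP} appears to transmit audio after the probe word.")
    else:
        print(f"No unusual upload observed from {DEVICE_IP}.")
```

The key design choice is that no packet contents need to be inspected: a burst of outbound traffic immediately after a probe word is itself the signal that the device started streaming audio.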
In addition, LeakyPick could help defend against a sophisticated attack on voice assistants such as Alexa, in which an adversary transmits wake words and commands in the ultrasonic range, inaudible to humans, to place orders with Amazon, for instance.
Although the human ear cannot pick up these ultrasonic commands, Alexa still interprets them. LeakyPick detects this activity because the resulting network traffic deviates from the statistical patterns of benign behavior.
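At a high level, spotting such silent triggers reduces to continuous anomaly detection against a device’s benign traffic baseline. A rough sketch, reusing a per-window byte counter like the outbound_bytes function from the previous example (window size, history length, and z-score threshold are again assumptions), could look like this:

```python
# Hypothetical sketch of continuous monitoring for silent (e.g., ultrasonic) triggers.
from collections import deque
import statistics

def monitor(byte_counter, window_s: int = 10, history: int = 30, z_thresh: float = 3.0):
    """Compare each window's outbound bytes against a rolling benign baseline."""
    recent = deque(maxlen=history)
    while True:
        observed = byte_counter(window_s)
        if len(recent) >= 5:
            mean = statistics.mean(recent)
            std = max(statistics.pstdev(recent), 1.0)
            if observed > mean + z_thresh * std:
                print(f"Unexpected upload burst: {observed} bytes (baseline ~{mean:.0f})")
                continue  # do not fold anomalous windows into the benign baseline
        recent.append(observed)
```

Unlike the active probe above, this passive mode raises an alert whenever the device starts uploading audio without any audible cause, which is exactly the symptom of an inaudible command injection.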
At present, the device is in the prototype phase and is yet to become commercially available.