Near Ultrasound Inaudible Trojan – Smart Assistants At Risk From “NUIT”
“Near Ultrasound Inaudible Trojan” (NUIT) is the name of a new form of attack discovered by researchers from the University of Texas. NUIT is designed to deliver malicious commands to voice assistants remotely via the internet.
Impacted assistants include Siri, Alexa, Cortana, and Google Assistant.
This attack abuses the high sensitivity of the microphones found in these devices: they can pick up the “near-ultrasound” frequency range (16kHz – 20kHz), which is largely inaudible to humans, and this is where NUIT lurks.
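To get a feel for the band involved, here is a minimal Python sketch (assuming numpy is available) that synthesises a pure tone in the near-ultrasound range. At 18kHz most adults hear nothing, yet a microphone sampling at the common rate of 44.1kHz captures the tone cleanly, since it sits below the 22.05kHz Nyquist limit. This is purely illustrative; an actual NUIT signal modulates commands, which is not shown here.

```python
import numpy as np

SAMPLE_RATE = 44_100  # Hz; Nyquist limit of 22,050 Hz covers the 16-20 kHz band


def near_ultrasound_tone(freq_hz: float, duration_s: float) -> np.ndarray:
    """Generate a pure sine tone restricted to the 16-20 kHz near-ultrasound band."""
    if not 16_000 <= freq_hz <= 20_000:
        raise ValueError("frequency outside the near-ultrasound band")
    t = np.arange(int(SAMPLE_RATE * duration_s)) / SAMPLE_RATE
    return np.sin(2 * np.pi * freq_hz * t)


# A 100 ms tone at 18 kHz: inaudible to most people, trivially recordable.
tone = near_ultrasound_tone(18_000, 0.1)
print(len(tone))  # → 4410 samples
```

The key point the sketch makes is that nothing exotic is needed on the receiving side: any ordinary consumer microphone sampling at 44.1kHz or above will faithfully record this band.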
A NUIT sound clip can be played on the targeted device’s speaker, allowing the voice assistant to be attacked on the device itself or even another device altogether.
This attack can be launched in two different ways. In the first, the NUIT audio plays on the targeted device itself, delivered via, for example, a rogue app or an audio file. Below is a video where the NUIT attack results in an unlocked door.
In the second, a first device containing a speaker is used to attack a second device containing a microphone. This is the daisy-chain style approach, where all of the cool technology in your devices slowly comes back to haunt you.
According to the researchers, a smart TV containing a speaker and a quick blast of YouTube could be all that’s needed. Even unmuting a device on a Zoom call could be enough to send the attack signal to a phone sitting next to the computer while the meeting takes place.
Social engineering plays a large part in making a NUIT attack successful. Entry points for voice assistant shenanigans could be bogus websites, apps, and audio files.
Once access to a device is gained, the attacker lowers its volume so the owner cannot hear the assistant responding to commands. At the same time, for the attack to actually work, the speaker must still be above a certain volume level, and the bogus command must be shorter than 77 milliseconds, or it won’t work.
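As a rough illustration of those two constraints, here is a hedged Python sketch that checks whether an audio clip is both short enough to fit the 77 millisecond budget and loud enough to register. The loudness threshold here is a made-up value for demonstration, not a figure from the researchers.

```python
import numpy as np

SAMPLE_RATE = 44_100    # Hz, a common microphone sampling rate
MAX_COMMAND_S = 0.077   # commands longer than 77 ms reportedly fail
MIN_RMS_LEVEL = 0.05    # hypothetical loudness floor, illustration only


def could_trigger(clip: np.ndarray) -> bool:
    """True if `clip` meets both reported conditions: under the 77 ms
    budget and loud enough (by our assumed RMS floor) to register."""
    duration_s = len(clip) / SAMPLE_RATE
    rms = float(np.sqrt(np.mean(clip ** 2)))
    return duration_s < MAX_COMMAND_S and rms > MIN_RMS_LEVEL


short_loud = 0.5 * np.ones(3000)  # ~68 ms at 44.1 kHz: within budget
long_loud = 0.5 * np.ones(5000)   # ~113 ms: over the budget
print(could_trigger(short_loud), could_trigger(long_loud))  # → True False
```

The tight time budget is why the muted-volume trick matters: the attacker only needs the speaker loud for a fraction of a second, too brief for most owners to notice.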
Regarding current impact, the researchers say Siri devices “need to steal the user’s voice”, while the other 16 devices tested can be activated using a robot voice or, indeed, any other voice at all.
The NUIT attack will be presented at the upcoming USENIX Security Symposium in August, with a complete overview of how it works. For now, to stay on the safer side, follow the researchers’ advice on possible defences against this new form of attack, which includes the following:
- Use earphones. If the microphone can’t receive malicious commands, the compromise can’t occur.
- Awareness is key. Be careful around links, apps, and microphone permissions.
- Make use of voice authentication. If you’re on an Apple device, now is the time to fire that up.