Go ahead, I'm Listening
What if your smartphone started making calls, sending texts, and visiting malicious internet sites without you asking or even knowing? Unfortunately, it seems that hackers can make this a reality by using your phone or other mobile device's personal assistant, such as Siri or Alexa.
 
A team of researchers from China's Zhejiang University has discovered a way to activate mobile device voice recognition systems without speaking an audible word, by exploiting a vulnerability that is apparently common across all major voice assistants. Dubbed "DolphinAttack", this technique works by feeding these artificial intelligence assistants commands at ultrasonic frequencies, which are too high for humans to hear but perfectly audible to the microphones on smart devices. The bad guys can silently whisper commands to hijack Siri or Alexa, forcing them to open malicious websites or other applications on your phone or device.
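The core trick reported by the researchers is amplitude-modulating a voice command onto an ultrasonic carrier: the transmitted signal sits entirely above human hearing, but imperfections in a microphone's hardware demodulate it back into the audible band, where the assistant's speech recognizer picks it up. Below is a minimal, hypothetical sketch of just the modulation step; the 25 kHz carrier and the simple tone standing in for a recorded voice command are illustrative assumptions, not the researchers' actual parameters.

```python
import numpy as np

FS = 96_000          # sample rate high enough to represent ultrasound
CARRIER_HZ = 25_000  # hypothetical ultrasonic carrier, above human hearing
DURATION = 1.0       # seconds

t = np.arange(int(FS * DURATION)) / FS

# Stand-in for a recorded voice command: a low-frequency (audible) tone.
voice = np.sin(2 * np.pi * 400 * t)

# Amplitude-modulate the "voice" onto the ultrasonic carrier. A microphone's
# nonlinearity can demodulate this back into the audible band -- the effect
# the DolphinAttack researchers exploited.
carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
modulated = (1 + 0.8 * voice) * carrier

# Confirm the transmitted signal's energy sits above 20 kHz, i.e. it is
# inaudible to humans even though a microphone can still capture it.
spectrum = np.abs(np.fft.rfft(modulated))
freqs = np.fft.rfftfreq(modulated.size, 1 / FS)
peak = freqs[np.argmax(spectrum)]
print(f"dominant frequency: {peak:.0f} Hz")
```

Running this shows the dominant frequency at the carrier (with sidebands just above and below it), all well outside the human hearing range, which is why a victim standing next to the speaker hears nothing.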
 
These attacks work on every major voice recognition platform, affecting mobile platforms including iOS and Android. The Chinese researchers were actually able to make an iPhone dial a specific number, and according to the researchers, this high-frequency (think dog whistle) attack can also cause the hijacked device to:
 
1.  Visit a malicious website
2.  Send bogus texts or emails
3.  Initiate outgoing video calls
4.  Turn on airplane mode, disconnecting all wireless communications
5.  Dim the screen and lower the volume to hide the attack
 
How can these attacks be prevented? Until the manufacturers configure the assistants to ignore high-frequency commands, the only way to avoid the issue is to turn off the voice assistant capability when not in use. Hopefully, the manufacturers will fix this issue quickly...giving us a little break before the next creative new attack.