Researchers in China and the USA are attempting to prove that AI assistants like Siri are susceptible to “hidden” commands that humans cannot detect, but that can force the assistants to perform actions against their owners’ will. These subliminal commands could be exploited for malicious purposes, such as gaining access to bank accounts or making purchases online.
A research paper published by students from American universities revealed that by embedding commands in spoken text or music recordings, they were able to issue instructions to Siri and Alexa. While the device’s owner remains entirely oblivious, unable to hear the malicious commands, the AI assistant responds to the instructions.
One of the authors of the paper, Nicholas Carlini, believes that the techniques his team used to deliver the commands may already be in use and under development by malicious groups.
“We wanted to see if we could make it even more stealthy,” said Mr. Carlini, a fifth-year Ph.D. student in computer security at U.C. Berkeley.
Mr. Carlini added that while there was no evidence that these techniques have left the lab, it may only be a matter of time before someone starts exploiting them. “My assumption is that the malicious people already employ people to do what I do,” he said.
It was only last year that researchers demonstrated that AI assistants could be activated and manipulated using frequencies outside the range of human hearing: they built a transmitter that attacked an assistant with a command that dialled a specific phone number.
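The core trick behind such ultrasonic attacks is to amplitude-modulate an ordinary voice command onto a carrier above roughly 20 kHz: non-linearities in the microphone hardware demodulate it back into the audible band, while humans standing nearby hear nothing. The sketch below is a simplified illustration of that modulation step only, using a synthetic tone as a stand-in for a recorded command; the function name, carrier frequency, and sample rate are illustrative assumptions, not details from the research paper.

```python
import numpy as np

SAMPLE_RATE = 96_000  # Hz; an ultrasonic carrier needs a high sample rate
CARRIER_HZ = 25_000   # above the ~20 kHz upper limit of human hearing

def modulate_ultrasonic(command: np.ndarray,
                        sample_rate: int = SAMPLE_RATE,
                        carrier_hz: int = CARRIER_HZ) -> np.ndarray:
    """Amplitude-modulate an audible command signal onto an ultrasonic carrier.

    Hypothetical helper: the resulting signal is inaudible to humans, but a
    microphone's non-linear response can demodulate the envelope back into
    the audible band, where the assistant's speech recogniser picks it up.
    """
    t = np.arange(len(command)) / sample_rate
    carrier = np.sin(2 * np.pi * carrier_hz * t)
    # Shift the command into a 0..1 envelope so the carrier amplitude
    # follows the command waveform without inverting the carrier.
    peak = np.max(np.abs(command))
    envelope = 0.5 * (1.0 + command / (peak if peak > 0 else 1.0))
    return envelope * carrier

# Toy stand-in for a recorded voice command: one second of a 400 Hz tone.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
fake_command = np.sin(2 * np.pi * 400 * t)
signal = modulate_ultrasonic(fake_command)
```

All of the resulting signal’s energy sits near 25 kHz (the carrier plus sidebands at ±400 Hz), which is why a bystander hears nothing even though the command’s waveform is fully encoded in the envelope.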
Carlini told The New York Times that his team would be able to launch successful attacks against “any smartphone on the market,” and that they wanted to prove to companies that this potential problem ought to be addressed. Apple does apply strong security to certain Siri commands, often requiring a password before such a command can be carried out; even so, Siri’s susceptibility to an attack remains the greatest worry for the moment.