Inaudible Attack on AI Speakers
Published: 2023-04-19
Volume: 12
Issue: 8
Page: 1928
ISSN: 2079-9292
Container-title: Electronics
Short-container-title: Electronics
Language: en
Authors:
Alchekov Seyitmammet Saparmammedovich 1, Al-Absi Mohammed Abdulhakim 1, Al-Absi Ahmed Abdulhakim 2, Lee Hoon Jae 1
Affiliations:
1. Department of Computer Engineering, Graduate School, Dongseo University, Busan 47011, Republic of Korea
2. Department of Smart Computing, Kyungdong University, Gosung 24764, Republic of Korea
Abstract
The modern world does not stand still. We used to be surprised that technology could speak, but voice assistants have now become real family members. They do not simply set the alarm clock or play music: they communicate with children, help solve problems, and sometimes even take offense. Since all voice assistants rely on artificial intelligence, when communicating with the user they take into account changes in the user's location, the time of day and day of the week, search-query history, previous orders in online stores, and so on. However, voice assistants built into modern smartphones and smart speakers pose a threat to their owner's personal data, since their main function is to capture audio commands from the user. AI assistants and smart speakers such as Siri, Google Assistant, and Google Home are generally harmless in themselves, but as voice assistants become more versatile they, like any other product, can be put to nefarious purposes, and there are many common attacks that malicious actors can use against them. In our experiments we show that a laser beam can control Google Assistant, smart speakers, and Siri. The attacker needs no physical contact with the victim's equipment and no interaction with the victim: as long as the attacker's laser can reach the smart speaker's microphone, it can deliver commands. We achieved a successful attack that transmits inaudible commands by aiming a laser at the microphone from up to 87 m away. We also discovered the possibility of attacking Android and Siri devices through the charging port, using the built-in voice assistant module.
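The laser attack summarized above works by amplitude-modulating the intensity of a laser beam with the audio waveform of a voice command; the photoacoustic response of the MEMS microphone then reproduces the command as if it had been spoken. The following minimal sketch illustrates that modulation step only. It is an assumption for illustration, not the authors' implementation: the function name `laser_drive_signal` and the parameters `dc_bias` and `depth` are hypothetical, and real attack hardware would feed such a drive signal to a laser-diode current driver.

```python
import numpy as np

def laser_drive_signal(audio, dc_bias=0.6, depth=0.4):
    """Amplitude-modulate an audio command onto a laser-diode drive level.

    Hypothetical illustration: a MEMS microphone hit by a laser responds
    to the light's intensity envelope, so encoding the voice command as
    intensity variations reproduces it at the microphone output.
    `dc_bias` keeps the diode lasing; `depth` sets how strongly the
    audio modulates the intensity (bias +/- depth must stay in [0, 1]).
    """
    audio = np.asarray(audio, dtype=float)
    peak = np.max(np.abs(audio))
    if peak > 0:
        audio = audio / peak              # normalize to [-1, 1]
    drive = dc_bias + depth * audio       # intensity tracks the waveform
    return np.clip(drive, 0.0, 1.0)       # clamp to the driver's range

# Example: modulate a 1 kHz test tone sampled at 16 kHz
t = np.arange(16000) / 16000.0
tone = np.sin(2 * np.pi * 1000.0 * t)
drive = laser_drive_signal(tone)
```

The design choice of a DC bias plus a bounded modulation depth mirrors how light-injection attacks are typically described: the laser must stay on (and above threshold) throughout, with the command carried entirely in the intensity envelope.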
Funder
National Research Foundation of Korea
Subject
Electrical and Electronic Engineering,Computer Networks and Communications,Hardware and Architecture,Signal Processing,Control and Systems Engineering