The clicking of your keyboard may be enough for AI to steal your data, even over a video call

A robot wearing headphones listening to a laptop in the background.
(Image credit: Bing Image Creator)

What you need to know

  • A team of researchers from Cornell University published a paper detailing how they trained AI to listen to audio inputs from keyboards and interpret what the user typed.
  • With specific keyboards and matching references, the AI was able to detect what was typed with 95% accuracy.
  • Typing on a touch screen keyboard reduced keystroke recognition accuracy to 40-64%, while white noise and extra keystrokes also lowered the accuracy.

The future of AI is now, and it's bringing with it some really weird cyberattacks. A team of researchers at Cornell University recently published a study detailing a hypothetical cyberattack that involved training AI to recognize a user's input from the audio of their keystrokes. Using audio surveillance to scrape data is known as an "acoustic side channel attack," and while using sound to steal sensitive information is not new, pairing it with AI is a leap that makes the technique much more efficient.

According to the research team behind the project, the attack could easily use everyday technology like a cell phone microphone or Zoom recordings to capture the training audio, which is then fed into an AI algorithm that analyzes the sound and translates it into readable text. With a model properly trained on the specific keyboard in use, the AI predicted what the user had typed with 95% accuracy, though this dropped to 93% when the training audio came from Zoom recordings.

To create the hypothetical cyberattack, the research team pressed 36 keys on a MacBook Pro 25 times each with varied amounts of pressure while recording the sound of each keystroke. The 25 audio samples were then combined into one file per key and fed to the AI algorithm for training.
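The per-key recording-and-matching idea can be sketched in a few lines. This is a deliberately toy illustration, not the researchers' method: it uses synthetic clicks with a made-up per-key resonance, a plain magnitude spectrum in place of the spectrograms real attacks use, and nearest-centroid matching in place of the study's deep-learning classifier. The sample rate, three-key alphabet, and all signal parameters are assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
SR = 8000           # sample rate in Hz (hypothetical)
DUR = 0.05          # each keystroke clip lasts 50 ms
KEYS = list("abc")  # toy alphabet; the study covered 36 keys

def keystroke(key, noise=0.1):
    """Synthesize a toy keystroke: each key gets its own resonant frequency."""
    t = np.arange(int(SR * DUR)) / SR
    freq = 400 + 300 * KEYS.index(key)                  # per-key dominant tone
    click = np.sin(2 * np.pi * freq * t) * np.exp(-60 * t)  # decaying click
    return click + noise * rng.standard_normal(t.size)

def features(signal):
    """Normalized magnitude spectrum - a crude stand-in for a spectrogram."""
    spectrum = np.abs(np.fft.rfft(signal))
    return spectrum / np.linalg.norm(spectrum)

# "Training": average the features of 25 recordings per key, as in the study.
centroids = {k: np.mean([features(keystroke(k)) for _ in range(25)], axis=0)
             for k in KEYS}

def classify(signal):
    """Label a clip with the key whose centroid it most resembles."""
    f = features(signal)
    return max(centroids, key=lambda k: float(f @ centroids[k]))

# Recover a short "typed" sequence from fresh audio clips.
typed = [classify(keystroke(k)) for k in "abcabc"]
```

The real attack replaces the hand-built spectrum features and centroid lookup with a neural network trained on mel-spectrogram images of each keystroke, but the overall shape is the same: collect labeled audio per key, learn a per-key signature, then match new audio against it.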

You don't need to throw out your favorite mechanical keyboard just yet. There were ways to thwart the AI, including adding extra keystroke sounds when possible or using third-party software to produce noise that could muddy any audio an attacker might use for training. Mixing up letter cases and randomizing your typing style could also help, though changing your very manner of typing may be easier said than done.

Swapping to biometric protections for your data is certainly going to be the easier option for most users. Those who are genuinely concerned about an acoustic side channel attack could switch to a touch screen keyboard, which the study showed could lower the AI's accuracy to as little as 40%.

Analysis: Not your biggest security concern

It's always a good idea to stay informed about cybersecurity, but when it comes to something like acoustic keyboard attacks, the average typist is unlikely to need to worry. If you're dealing with highly sensitive information, you may want to be a little more cautious than the average user, but there are certainly easier methods of data scraping than going through the hassle of training an AI on a specific person's keyboard and typing behavior.

Cole Martin
Writer

Cole is the resident Call of Duty know-it-all and indie game enthusiast for Windows Central. She's a lifelong artist with two decades of experience in digital painting, and she will happily talk your ear off about budget pen displays.