Don’t you dare even think about your bank account password when you slap on those fancy new brainwave headsets.
Or at least that seems to be the lesson of a new study, which found that sensitive personal information, such as PINs and credit card data, can be gleaned from the brainwave data of users wearing popular consumer-grade EEG headsets.
A team of security researchers from Oxford, UC Berkeley, and the University of Geneva say they were able to deduce digits of PINs, birth months, areas of residence, and other personal information by presenting 30 headset-wearing subjects with images of ATMs, debit cards, maps, people, and random numbers in a series of experiments. The paper, titled “On the Feasibility of Side-Channel Attacks with Brain-Computer Interfaces,” represents the first major attempt to uncover potential security risks in the use of the headsets.
“The correct answer was found by the first guess in 20% of the cases for the experiment with the PIN, the debit cards, people, and the ATM machine,” write the researchers. “The location was exactly guessed for 30% of users, month of birth for almost 60% and the bank based on the ATM machines for almost 30%.”
To detect the first digit of the PIN, the researchers presented the subjects with the digits 0 through 9, flashed on screen one by one in random order. Each digit was repeated 16 times, over a total duration of 90 seconds. The subjects’ brainwaves were monitored for the telltale peaks that would rat them out.
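The analysis behind that monitoring is conceptually simple. Here is a minimal sketch of the epoch-averaging idea in Python, assuming one channel of raw samples and, for each digit, the sample indices at which it flashed; the 128 Hz rate matches the EPOC, but the names, the 250–500 ms scoring window, and the mean-amplitude classifier are illustrative assumptions, not the paper’s actual pipeline.

```python
import numpy as np

FS = 128  # the Emotiv EPOC samples at 128 Hz

def p300_score(eeg, onset_samples, fs=FS):
    """Average the epochs following each flash of one digit and return
    the mean amplitude in the 250-500 ms post-stimulus window, where a
    P300 peak would show up."""
    window = int(0.5 * fs)  # look at the 500 ms after each flash
    epochs = [eeg[t:t + window] for t in onset_samples
              if t + window <= len(eeg)]
    avg = np.mean(epochs, axis=0)     # averaging repeats suppresses noise
    lo, hi = int(0.25 * fs), window   # 250-500 ms after the stimulus
    return float(avg[lo:hi].mean())

def guess_digit(eeg, onsets, fs=FS):
    """Rank the ten digits by post-stimulus response; the digit that
    matches the secret should evoke the strongest P300."""
    scores = {digit: p300_score(eeg, times, fs)
              for digit, times in onsets.items()}
    return max(scores, key=scores.get)
```

The repetition is the point: a single 500 ms epoch is far too noisy to read, which is why each digit flashes 16 times; averaging the repeats cancels out background EEG while the stimulus-locked P300 survives.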
The EEG headsets, made by companies such as Emotiv Systems and NeuroSky, have become increasingly popular for gaming and other applications. For the study, the researchers used the Emotiv Epoc Neuroheadset, which retails for $299.
The researchers (Ivan Martinovic of Oxford University; Doug Davies, Mario Frank, Daniele Perito, and Dawn Song of UC Berkeley; and Tomas Ros of the University of Geneva) analyzed P300 peaks, a key component of event-related potentials: the electrical responses the brain produces after a stimulus is presented.
The P300 “occurs approximately 300 milliseconds after an event happens,” said Frank, a postdoctoral researcher at Berkeley, in a phone interview with Wired. “The potential arises if you already prime your thoughts toward a particular event…. An attacker could try to prime the thoughts of the victim towards a particular secret that a victim has in mind. For instance, if you know the face of some person, you might be able to observe a brainwave pattern that is evidence of the user thinking about the face.”
“Brain Spyware”
Emotiv and NeuroSky both have “app stores,” where users of the devices can download third-party applications. The applications use a common API to access the EEG device.
“In the case of the EEG devices, this API provides unrestricted access to the raw EEG signal,” write the researchers. “Furthermore, such applications have complete control over the stimuli that can be presented to the users.”
The researchers envision a scenario in which an attacker writes “brain spyware” that can be legitimately downloaded as an app and used to harvest private information from the user.
“We simulated a scenario where someone writes a malicious app, the user downloads it and trusts the app, and actively supports all the calibration steps of the device to make the software work,” said Frank. Those seemingly innocuous calibration steps, standard for most games and other applications that use the headsets, are where personal information could be harvested.
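What might the capture side of such an app look like? Below is a hypothetical Python sketch: the Headset class and its read_raw method stand in for whatever raw-signal access a vendor SDK exposes (they are assumptions, not real Emotiv or NeuroSky calls), while the 16-repetition, random-order flashing schedule mirrors the experiment described above.

```python
import random
import time

class Headset:
    """Stand-in for a vendor SDK object exposing the raw EEG stream."""
    def read_raw(self):
        return []  # hypothetical: would return the latest raw samples

def show_on_screen(stimulus):
    print(f"[calibration] please focus on: {stimulus}")  # stand-in for a real UI

def fake_calibration(headset, stimuli, repeats=16, interval=0.5):
    """Flash each stimulus `repeats` times in random order, logging which
    stimulus was on screen when each chunk of raw EEG arrived, so the
    stimulus-locked epochs can be mined for P300 peaks later."""
    log = []
    schedule = list(stimuli) * repeats
    random.shuffle(schedule)
    for stimulus in schedule:
        show_on_screen(stimulus)   # e.g. a digit, a bank logo, a face
        time.sleep(interval)       # leave time for the ~300 ms P300
        log.append((stimulus, headset.read_raw()))
    return log
```

The stimulus-tagged log this produces is exactly what an epoch-averaging analysis like the sketch above would consume, and nothing about the flashing routine would look out of place next to a legitimate calibration step.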
“We realized that these devices are becoming increasingly popular — maybe in five, 10 years, it’s very likely that many households will have one,” Frank said. “At the same time, you can use all kinds of third-party apps for these devices. In this setting, as security researchers, we identified that there is a potential to make some bad stuff, to turn this technology against the user.” He said, however, that there was no immediate threat in using the devices. But the experiments devised by the researchers point to the devices’ darker potential.
“The simplicity of our experiments suggests the possibility of more sophisticated attacks,” write the researchers, warning that “with the ever-increasing quality of devices, success rate of attacks will likely improve.”