Mind Hunting: Could Your Brain Be a Target for Hackers?

(YOU CAN ALSO READ THIS ARTICLE ON INTERESTING ENGINEERING)

Imagine that someday in the near future you're minding your own business, scrolling through social media using only the power of your thoughts. Apple's new phone contains a brain-computer interface that lets you control every feature with your mind: a miracle of science when it was introduced, but by now you're used to it.

But imagine what hacking a device with that kind of technology could mean. While you browse the web or do your work, hackers could be running spyware on your mind, gathering your most private information directly from your brain signals. Your likes and dislikes. Your political preferences. Your sexuality. Your PIN.

Frankly, the idea is not all that far-fetched. The need to secure our very thoughts is quickly becoming a reality in a world where brain-computer interfaces (such as EEG devices) are becoming more prevalent, both in medicine and in other applications like personal computing and even gaming. The simple fact is that if we don't address this problem soon, it will inevitably be too late.

Your Brain as Big Data

Less than a year ago, researchers at the University of Washington Biorobotics Lab built a device to show how a brain-computer interface, coupled with subliminal messaging in a video game, could be used to extract private information about an individual.

It works like this: while wearing an EEG device, a player starts a computer game the researchers called "Flappy Whale," a simple side-scroller based on the addictive Flappy Bird. Players simply guide a flopping blue whale through the on-screen course using the keyboard arrow keys.

But occasionally, something unusual happens: logos for different companies start appearing, each flickering in the top-right of the screen for just milliseconds before disappearing again. Blink and you'd miss them.

The idea behind this is simple: Hackers could insert images like these into a dodgy game or app and record your brain's unintentional response to them through the brain-computer interface, perhaps gaining insight into which brands you're familiar with, or which images you have a strong reaction to.
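To make the mechanics concrete, here is a hypothetical sketch of the core signal-processing trick such an attack relies on (this is not the Washington team's actual code, and every name, threshold, and parameter below is illustrative): averaging many stimulus-locked EEG snippets so that a recognition response, such as the well-known P300 wave around 300 ms after a familiar stimulus, stands out from background brain activity.

```python
import numpy as np

def p300_score(epochs, fs=250):
    """Average stimulus-locked EEG epochs and measure the mean amplitude
    in the 250-450 ms window, where a P300-style recognition response
    would appear. epochs: array of shape (n_trials, n_samples), each row
    time-locked to one flashed logo."""
    avg = epochs.mean(axis=0)                       # averaging cancels unrelated activity
    window = slice(int(0.25 * fs), int(0.45 * fs))  # 250-450 ms post-stimulus
    return float(avg[window].mean())

# Synthetic demo: "familiar" logos evoke a bump ~300 ms after onset; others don't.
rng = np.random.default_rng(0)
fs, n_trials = 250, 40
n_samples = int(0.6 * fs)                # 600 ms epochs
t = np.arange(n_samples) / fs
bump = 5.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))  # Gaussian peak at 300 ms

noise = lambda: rng.normal(0, 2, size=(n_trials, n_samples))
familiar = noise() + bump                # recognition response present
unfamiliar = noise()                     # background noise only

print(p300_score(familiar, fs) > p300_score(unfamiliar, fs))  # True
```

With enough flashed logos per category, the averaged response separates "recognized" from "not recognized" even though each individual flash is buried in noise, which is exactly why subliminal presentation is enough.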

Now, you might not care who knows your weak spot for Kentucky Fried Chicken, but you can imagine where this might be going. Imagine if these "subliminal" images showed politicians, or religious icons, or sexual images of men and women. Personal information gleaned this way could potentially be used for embarrassment, coercion, or manipulation.

Where does the problem lie?

Broadly speaking, the problem with brain-computer interfaces, both today's devices and those coming down the technological pipeline, is this: when an application picks up electrical signals to control itself, it doesn't get access only to the slice of EEG needed for that control; it gets access to the whole EEG. And that whole signal contains rich information about us as individuals.

And it's not just stereotypical black-hat hackers who could take advantage. One could see police or governments misusing it: show clear evidence of supporting the opposition, or of involvement in something deemed illegal, and you could be charged with a literal thought crime. China has already taken steps in this direction with its “social credit score” system; how far, really, is that from a full-fledged Orwellian state?

The biggest misuse of brain-computer interface tech, though, could in fact be advertising, which threatens users' privacy rather than their security. You could see these interfaces as the ultimate ad-targeting tool: a direct line to consumers' brains. If you wore an EEG while browsing the web or playing a game, advertisers could serve ads based on your response to the items you see. Respond well to that picture of a burger? Here's a McDonald's promotion.

Inception and Extraction

Though I never pass up an opportunity to reference a Christopher Nolan classic, here it actually makes sense. While still largely in the realm of science fiction, the true danger of brain hacking is not just hackers reading your mind; it is the idea that they could eventually implant specific ideas into your brain, or extract them from it.

Books' worth of research already exist detailing where specific pieces of information are stored in the neural web. From there it is hardly a stretch to imagine malware in a sufficiently advanced brain-computer interface that could target a specific area to pull out, or insert, a specific piece of information for someone else's nefarious ends.

In a similarly troubling development, researchers at the University at Buffalo have developed a system that corrupts a brain-computer interface to remotely control a subject's brain. They were able to make their mouse subjects run, freeze in place, or even completely lose control of their limbs.

At the moment, their approach works through a technique called “magneto-thermal stimulation,” which still requires a minimally invasive procedure, something that would be hard to do to someone without their knowledge. That being said, the implications of this kind of research are troubling at best.

What can we do?

Most researchers in this space agree that apps using brain-computer interfaces will need some kind of privacy policy in the coming years, to ensure that people know how their EEG data could be used, or at least to limit the scope of what can legally be done with their thought-data. We usually know when we're giving up our privacy online, but this technology provides an opportunity for someone to gather information from you without you knowing about it at all.

But while policy solutions, like a congressional order or an app store certification for compliant products, would go a long way toward helping, a more technical solution might be necessary. The Washington team has detailed a prospective technology they call a “Brain-Computer Interface Anonymizer,” which would effectively "filter" signals so that apps could only access the specific data they require.

Compare this to the way smartphone apps get only limited access to the personal information stored on your phone. By never transmitting, and never storing, raw neural signals or any signal components not explicitly needed for a device's communication and control, the anonymizer prevents unintended information leakage at the source.
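A minimal sketch of that filtering idea, with assumed details (the real Anonymizer design is more involved, and the band choice here is only an example): the function below exposes a single band-power number that a hypothetical relaxation-based control scheme might need, so the raw waveform, and everything else it could reveal, never reaches the app.

```python
import numpy as np

def anonymized_feature(raw_eeg, fs=250, band=(8.0, 12.0)):
    """Toy BCI-anonymizer stage: return only the power in one frequency
    band (here 8-12 Hz alpha, often used for simple relaxation-based
    control) instead of the raw EEG window. The raw samples never leave
    this function, so the app cannot mine them for anything else."""
    spectrum = np.fft.rfft(raw_eeg)
    freqs = np.fft.rfftfreq(raw_eeg.size, d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    # Power restricted to the control band; all other content is discarded.
    return float(np.sum(np.abs(spectrum[in_band]) ** 2) / raw_eeg.size)

# The app sees one number per window, not 500 raw samples:
rng = np.random.default_rng(1)
window = rng.normal(size=500)            # 2 s of synthetic "EEG" at 250 Hz
print(anonymized_feature(window))
```

The design choice mirrors the smartphone-permissions comparison above: the app is handed the minimum it needs to function, and the sensitive raw data stays on the trusted side of the boundary.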

At the end of the day, the world at large may need to recognize the case for a new human right: one that protects people from having their thoughts and other mental information stolen, abused, or hacked. It may seem a little early to worry about brain hackers stealing our thoughts, but it is usually more effective to introduce protections for people sooner rather than later.

We cannot afford to have a lag before security measures are implemented. After all, it always seems too soon to assess a new technology until it’s suddenly too late.