During this summer’s Digital Revolution exhibition at London’s Barbican Centre, a small brainwave-influenced game sat sandwiched between Lady Gaga’s Haus of Gaga and Google’s DevArt booth. It was Not Impossible Labs’ Brainwriter installation, which combined Tobii eye tracking and an OpenBCI electroencephalography (EEG) device to let players shoot laser beams at virtual robots using only eye movement and brain waves. “Whoa, this is the future,” exclaimed one participant.
But the Brainwriter is designed for far more than just games. It’s an early attempt at using brain-computer interface (BCI) technology to create a comprehensive communication system for patients with ALS and other neurodegenerative disorders, which inhibit motor function and the ability to speak.
The brain is one of the final frontiers of human discovery. Each day it gets easier to leverage technology to expand the capabilities of that squishy thing inside our heads. Real-world BCI will be vital in reverse-engineering and further understanding the human brain.
Though BCI is in an embryonic state — with a definition that evolves by the day — it’s typically a system that enables direct communication between a brain and a computer, and one that will inevitably have a major impact on the future of humanity. BCIs encompass a wide range of technologies that vary in invasiveness, ease of use, functionality, cost, and real-world practicality. They include fMRI, cochlear implants, and EEG. Historically, these technologies have been used solely in medicine and research, but recently there’s been a major shift: As the technology becomes smaller, cheaper, and woven into the fabric of everyday life, many innovators are searching for real-world applications outside of medicine. It’s already happening, and it’s often driven by makers.
The field is expanding at an astounding rate. I learned about it two and a half years ago, and it quickly turned into an obsession. I found myself daydreaming about the amazing implications of using nothing more than my mind to communicate with a machine. I thought about my grandma who was suffering from a neurodegenerative disorder and how BCIs might allow her to speak again. I thought about my best friend who had just suffered a severe neck injury and how BCIs might allow him to walk again. I thought about the vagueness of attention disorders, and how BCIs might lead to complementary or even supplementary treatments, replacing overprescribed and addictive medications.
I went on to found OpenBCI with Joel Murphy as a way to offer access to every aspect of the BCI design and to present that information in an organized, collaborative, and educational way. I’m not the only one who sees the potential of this amazing new technology. But creating a practical, real-world BCI is an immense challenge — as the incredibly talented Murphy, who designed the hardware, says, “This stuff is really, really hard.” Many have attempted it but none have fully succeeded. It will take a community effort to achieve the technology’s potential while maintaining ethical design constraints. (It’s not hard to fathom a few not-too-far-off dystopian scenarios in which BCIs are used for the wrong reasons.)
Of the many types of BCIs, EEG has recently emerged as the frontrunner in the commercial and DIY spaces, partly because it is minimally invasive and easily translated into signals that a computer can interpret. After all, computers are complex electrical systems, and EEG is the sampling of electrical signals from the scalp. Simply put, EEG is the best way to get our brains and our computers speaking the same language.
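To make that "same language" idea concrete, here is a minimal sketch of how a computer typically interprets an EEG signal: by converting the raw scalp voltage into power within standard frequency bands. This is a generic illustration, not OpenBCI-specific code; the signal below is synthetic, and the 8–12 Hz alpha band and 250 Hz sample rate are conventional values, not measurements.

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Estimate average power in the [f_lo, f_hi] Hz band via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[mask].mean()

np.random.seed(42)                 # make the synthetic trace repeatable
fs = 250                           # sample rate in Hz, typical for consumer EEG
t = np.arange(0, 4, 1.0 / fs)      # 4 seconds of data

# Synthetic "EEG": a 10 Hz alpha rhythm (20 microvolts) buried in noise.
eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * np.random.randn(len(t))

alpha = band_power(eeg, fs, 8, 12)   # band containing the 10 Hz rhythm
beta = band_power(eeg, fs, 13, 30)   # a band with only noise
print(alpha > beta)                  # the alpha rhythm dominates
```

A real system would do the same computation on streaming data, then map changes in band power (eyes closed raises alpha, concentration raises beta) to an action on screen.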
EEG has existed for almost a hundred years and is most commonly used to diagnose epilepsy. In recent years, two companies, NeuroSky and Emotiv, have attempted to bring EEG to the consumer market. NeuroSky built the MindWave, a simplified single-sensor system and the cheapest commercial EEG device on the market. In doing so it made EEG accessible to everyone and piqued the interest of many early BCI enthusiasts, myself included. Emotiv created the EPOC, a higher-channel-count system that split the difference between NeuroSky and research-grade EEG in both cost and signal quality. While these devices have opened up BCI to innovators, there’s still a huge void waiting to be filled by those of us who like to explore the inner workings of our gadgets.
With OpenBCI, we wanted to create a powerful, customizable tool that would enable innovators with varied backgrounds and skill levels to collaborate on the countless subchallenges of interfacing the brain and body. We came up with a board based on the Arduino electronics prototyping platform, with an integrated, programmable microcontroller and 16 sensor inputs that can pick up any electrical signals emitted from the body — including brain activity, muscle activity, and heart rate. And it can all be mounted onto the first-ever 3D-printable EEG headset.
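To give a feel for what "picking up electrical signals from the body" looks like in software, here is a sketch of the decoding step a host computer might perform on raw samples streamed from a biosensing board. The packet layout, channel count, and scale factor here are illustrative assumptions (3-byte big-endian two's-complement samples and a hypothetical amplifier gain), not the board's actual firmware protocol, so treat this as a sketch rather than a driver.

```python
def to_signed_24bit(b):
    """Convert 3 big-endian bytes (two's complement) to a signed int."""
    value = (b[0] << 16) | (b[1] << 8) | b[2]
    if value & 0x800000:        # sign bit set -> negative value
        value -= 0x1000000
    return value

# Hypothetical scale factor: microvolts per ADC count for a 24-bit
# amplifier with a 4.5 V reference and a gain of 24 (varies by hardware).
SCALE_UV = 4.5 / 24 / (2**23 - 1) * 1e6

def decode_channels(payload, n_channels=16):
    """Split a payload of consecutive 3-byte samples into microvolts."""
    samples = []
    for ch in range(n_channels):
        raw = to_signed_24bit(payload[3 * ch : 3 * ch + 3])
        samples.append(raw * SCALE_UV)
    return samples

# Example: a fabricated 16-channel payload, nonzero only on channel 0.
payload = bytes([0x00, 0x01, 0x00]) + bytes(45)
print(decode_channels(payload)[0])  # small positive voltage on channel 0
```

Because the inputs are just amplified voltages, the same decoding applies whether the electrodes sit on the scalp (EEG), over a muscle (EMG), or on the chest (ECG); only the placement and the downstream signal processing change.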
In the next 5 to 10 years we will see more widespread use of BCIs, from thought-controlled keyboards and mice to wheelchairs to new-age, immersive video games that respond to biosignals. Some of these systems already exist, though there’s a lot of work left before they become mainstream applications.
This summer something really amazing is happening: Commercially available devices for interfacing with the brain are popping up everywhere. In 2013, more than 10,000 commercial and do-it-yourself EEG systems were claimed through various crowdfunded projects. Most of those devices only recently started shipping. In addition to OpenBCI, Emotiv’s new headset Insight, the Melon Headband, and the InteraXon Muse are available for preorder. As a result, countless amazing — and maybe even practical — implementations of BCI are going to start materializing in the latter half of 2014 and into 2015. But BCIs are still nascent. Despite big claims and big potential, they’re not ready; we still need makers, who’ll hack and build and experiment, to use them to change the world.
7 thoughts on “Open BCI: Rise of the Brain-Computer Interface”
For the disabled, for sensitive work, special uses where there’s no place or time for keyboards, I can see this is a great idea. For all those too lazy to type, well, they deserve to use such invasive, humiliating crap, after all, Ubuntu went Unity, and Windoze is also trying to make PCs into Pads… Next they will replace perfectly working bodies with Robots, yatta, yatta, yatta… Lots of Sci-fi books about it, none ends well.
I can’t wait to see where this goes. Is the limitation these days still on the interpretation end, making sense of the massive amount of information collected? Or is it a combination of this and accurately picking up signals?
For perspective, Stephen Hawking has ALS.
When he first started using the computer, he’d click a button/switch.
Now he’s progressed to the point where he can’t click the switch, so he has to twitch his cheek.
The ability for him to use brainwaves directly would ensure he isn’t trapped in his body when he inevitably loses even the small movement he has left.
This is a GREAT thing.
First, Stephen Hawking does not have ALS. He has a rare motor neuron disease similar and related to ALS, but not ALS.
Second, this is a great thing for all those with handicaps, not just Stephen Hawking.
Nowhere was it implied that it would only be great for Stephen Hawking.
I’m interested in this technology. Is it out yet? If it is where can I buy a kit? If not, when will it be out?
You can purchase a kit at http://www.openbci.com