Control This Robot Arm – With Your Brain

Jisoo Kim concentrates. Photo: Nathan Hurst

Jisoo Kim imagined waving her left arm, and a robot arm moved to the left. She concentrated on her right arm, and the robot arm flexed over to the right. Sort of.

Kim was wearing an electroencephalographic device for the “Make it Move” interactive display at Cognitive Technology, a new exhibition on understanding and influencing brain activity at the Exploratorium in San Francisco. The device was plugged into a computer and screen, and into a two-jointed aluminum arm built by Jon Ferran.

The neoprene cap worn by Kim was embedded with EEG sensors that read her brain waves and output them to a brain-computer interface called OpenBCI. “I think if this technology advances more, it will help a lot of disabled people who can’t move their arms,” says Kim. “Since everything is open source, people can build it themselves, so I think it will advance a lot more.”

Tomas Vega is a student majoring in computer science and cognitive science, one of the builders of the exhibit, and a member of the University of California Cognitive Technology Group. Cog Tech’s goal is to help make this technology ready to use in the real world.

“We are aiming to make a change, show the world that BCI is not something in the future, it is something that is already happening,” says Vega. “We want to enable this technology so that anyone can use it, with no experience in code or anything.”

Training to use the BCI. Photo: Anja Ulfeldt

When you think about kicking a football, explains Vega, your brain behaves as if it’s actually kicking. An EEG (electroencephalography) device can read that in the form of electrical signals on the scalp, and the information from those signals can be processed, filtered, and analyzed to provide feedback that can be recognized by a computer.
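
To make that concrete, here is a minimal sketch of the kind of filtering step involved: isolating the mu rhythm, the roughly 8–12 Hz band over the motor cortex that weakens when you imagine moving a limb. This is not the exhibit's code; the sample rate and the synthetic signal below are assumptions for illustration.

```python
# Illustrative only: band-pass a fake EEG channel into the mu band (8-12 Hz).
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # samples per second (the rate of OpenBCI's 8-channel board)

def bandpass(x, low, high, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

# One second of a single fake EEG channel: a weak 10 Hz rhythm buried in noise.
t = np.arange(FS) / FS
raw = 2e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * np.random.randn(FS)

mu = bandpass(raw, 8, 12)
print("mu-band power:", np.mean(mu ** 2))
```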

To make the exhibit, the Cog Tech team adapted an EEG board from OpenBCI. OpenBCI’s original headset measures brain activity in both hemispheres and records that data on 8 channels. But it requires electrodes pasted to the scalp, which is infeasible for an exhibit that lets many people pass through and experiment every day. So Cog Tech made a soft helmet with dry electrodes from Cognionics, which sits on the head with a velcro strap under the chin.
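
For readers who want to try something similar, here is a rough sketch of pulling raw data from an 8-channel OpenBCI (Cyton-class) board. It uses the BrainFlow library, which is an assumption on our part; the exhibit's own acquisition code isn't described here, and the serial port name is just a placeholder.

```python
# Illustrative only: stream a few seconds of 8-channel data via BrainFlow.
import time
from brainflow.board_shim import BoardShim, BoardIds, BrainFlowInputParams

params = BrainFlowInputParams()
params.serial_port = "/dev/ttyUSB0"  # placeholder; depends on your dongle

board = BoardShim(BoardIds.CYTON_BOARD.value, params)
board.prepare_session()
board.start_stream()
time.sleep(5)                        # record roughly 5 seconds
data = board.get_board_data()        # rows are channels, columns are samples
board.stop_stream()
board.release_session()

eeg_rows = BoardShim.get_eeg_channels(BoardIds.CYTON_BOARD.value)
print("EEG samples per channel:", data[eeg_rows].shape[1])
```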

“I really think BCI has the potential for an additional way to interact with computers,” says Pierre Karashchuk, another Cog Tech member. “For human computer interaction, we’ve been kind of limited.” He means mice, keyboards, and lately, touchscreens, all of which are ultimately not that different.

Vega (left) and Karashchuk explain how to use the BCI. Photo: Nathan Hurst

There’s a lot that you can do with a direct interface that you couldn’t with a keyboard. Some examples are detecting and expressing mood, like an EEG beanie (Make: Volume 43), or gauging the attention paid to a graphical user interface.

Karashchuk, a statistics and computer science major, is working on how the software interprets the information gleaned by the EEG, developing libraries that help process the signals and turn them into an output. But the signals come in very noisy, and the software has to employ machine learning to sort them out.
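
As a rough illustration of that "sort it out" step, the sketch below computes per-channel band power from already-filtered trials and trains a simple linear classifier to separate imagined-left from imagined-right. This is a generic motor-imagery pipeline with made-up data shapes and labels, not Cog Tech's actual libraries.

```python
# Illustrative only: band-power features plus a linear classifier.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def band_power_features(trials):
    """trials: (n_trials, n_channels, n_samples) of band-pass-filtered EEG.
    Returns the log band power of each channel as a feature vector."""
    return np.log(np.mean(trials ** 2, axis=2))

# Pretend training data: 40 trials, 8 channels, 2 seconds at 250 Hz.
rng = np.random.default_rng(0)
trials = rng.normal(size=(40, 8, 500)) * 1e-6
labels = np.array([0, 1] * 20)  # 0 = imagined left, 1 = imagined right

clf = LinearDiscriminantAnalysis()
clf.fit(band_power_features(trials), labels)

# At the exhibit, a prediction like this would be mapped to a
# "move left" or "move right" command for the arm.
new_trial = rng.normal(size=(1, 8, 500)) * 1e-6
print("predicted class:", clf.predict(band_power_features(new_trial))[0])
```

Band power plus a linear classifier is a common baseline for motor-imagery BCIs precisely because the signals are so noisy; fancier models help, but the pipeline looks broadly like this.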

The cap, with the OpenBCI at the back. Photo: Anja Ulfeldt

On top of that, points out Vega, it can be expensive to work with BCIs, and EEG interpretation faces skepticism from the academic community. In the setting of an exhibit, there’s the additional barrier of teaching visitors to actually control the interface. It’s an acquired skill, and one whose basics can be hard even to pick up, let alone master.

“The training interface is a big roadblock to BCI being adopted for control of different devices,” says Cog Tech’s Stephan Frey. “This is kind of like a proof of concept. If we can get people, even just some people at the Exploratorium who have never used this before, and use this pool of visitors to teach them about BCI but also show that this training interface paradigm can work on people who haven’t tried it before, then that makes a really solid case that it can be used … if you’re paralyzed, or locked in, or have some larger disability where you’re relying on the BCI.”

Historically, BCIs have been used primarily for scientific research, says Frey, who is a senior studying cognitive science. But as the tools gain precision and we gather more information, they can be used to feed data back into the science, and even to tackle more practical problems.

Karashchuk adjusts the cap. Photo: Anja Ulfeldt

It’s easy to imagine applications for a brain-controlled robot arm. A locked-in patient could perform tasks for him or herself. The Cog Tech crew has used it to control the flight of a drone. And BCI has potential applications in basic computer control. But Vega has a more personal motivation, too. “I want to be a cyborg,” he says. “That’s my long-term goal. I’m going to work all my life to make this a reality. There’s nothing that makes my heart beat faster than this dream of being enhanced by technology. This dream of being augmented, and augmenting my capabilities as a human, and trying to push the boundary.” Vega shaves his head to get a better connection with the EEG. His goal for spring semester is to learn to drive a wheelchair with the technology.

Ferran’s arm. Photo: Anja Ulfeldt

Making this happen will require a great deal more machine learning. As it is, the arm basically just waves back and forth. Ferran, who built the arm, combined an Arduino Mega with components from an old bartending arm, and gave it just one axis because the BCI could only handle one degree of freedom anyway: left and right.
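
As an illustration of how that last link in the chain might look, the sketch below pushes a decoded left/right decision to an Arduino over USB serial. The single-character protocol and the port name are hypothetical; the article doesn't describe the scheme Ferran actually used.

```python
# Illustrative only: hand a left/right decision to the arm's Arduino over serial.
import serial  # provided by the pyserial package

arduino = serial.Serial("/dev/ttyACM0", 9600, timeout=1)  # port is a placeholder

def move_arm(direction):
    """Send 'L' or 'R'; the Arduino firmware would sweep its one axis accordingly."""
    arduino.write(b"L" if direction == "left" else b"R")

move_arm("left")   # e.g. the BCI decoded an imagined left-arm movement
move_arm("right")
```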

“It was pretty difficult,” says Kim, of making the arm move. “The most difficult part was to think the way that can control the arm; imagining moving my left or right arm is different from moving it.”

Still, Kim says she could feel herself getting better. Training for hours or days could make a big difference in how effectively a person could control something.

“Your brain actually learns to generate these waves,” adds Karashchuk. Taken that way, the BCI is training us just as much as we are training it.


Editor’s note: Photos from Anja Ulfeldt.


Nathan Hurst is an editor at Make. He loves anything having to do with science or bicycling. He tweets as @nathanbhurst.
