Recently I’ve seen some popular videos online showing these really neat toys called quadrocopters. They’re basically just small flying toys with four rotors, one at each corner of the shell. They can hover and maneuver quite easily and are pretty interesting to watch. The other day I was in Metrix, and to my surprise I saw a couple of people working with one of them! I wanted to see what they were up to, and found out that this particular model of copter could be controlled directly from an iPhone! It also had a camera attached to the front, allowing the phone user to see exactly what the copter sees. How cool! Of course, it wouldn’t be Metrix if someone wasn’t tweaking it to do something awesome. These two guys were attempting to get the copter working directly with the power of… your brain.

How were they going to accomplish that? Well, one of the guys, Andrew Becherer, brought the copter, and the other, Adam Smith-Kipnis, brought a brain-wave-measuring device. I’ve seen these used in conjunction with toys before, but this one seemed a bit more sophisticated. It’s basically a headset with headphones and a sensor that rests against the forehead. This sensor can detect brainwaves (raw EEG values) and output them as numerical values. This digital data can then be used to play some games that come with the device, but it also allows its users to be more creative.

Both the copter and the headset come with development tools for multiple platforms. This allows them (and anyone else who owns one) to write software that uses the devices in new and fun ways. The copter is pretty straightforward to understand and control: by varying the speed of each fan, the craft can move in a variety of directions. Brain waves, on the other hand, are much more abstract. The two modes the headset can understand are “attention” and “meditation”. But what does that mean? Adam told me that one of the challenges of this project was figuring out what those words mean to the headset, how to consistently reproduce those feelings, and how to translate a feeling into code that operates another device. They’re currently working on writing the interface (in C++), and hope to then be able to use that to allow the two devices to interact.
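
To give a sense of how those fan speeds turn into motion, here’s a rough sketch (in C++, since that’s the language they’re writing their interface in) of the standard quadrocopter “mixing” idea. The motor layout, names, and sign conventions here are my own assumptions for illustration, not anything from their actual code:

```cpp
#include <array>

// Illustrative quadrocopter "mixer": each of the four rotor speeds is a
// combination of the desired throttle, pitch, roll, and yaw. Motors are
// numbered front-left (0), front-right (1), rear-right (2), rear-left (3);
// the signs assume the usual counter-rotating pairs.
std::array<double, 4> mix(double throttle, double pitch, double roll, double yaw) {
    return {
        throttle + pitch + roll - yaw,  // front-left
        throttle + pitch - roll + yaw,  // front-right
        throttle - pitch - roll - yaw,  // rear-right
        throttle - pitch + roll + yaw,  // rear-left
    };
}
```

Raise all four speeds together and the craft climbs; raise one pair relative to the other and it tilts or spins instead.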

He said that one of the reasons it was so hard to understand what the feelings of attention and meditation mean to the headset is that the manufacturer has decided to keep that information proprietary. However, with the stream parser that comes with the headset, they hope to be able to translate other feelings into usable instructions as well. Could feeling happy allow the copter to move up and down? What about feeling worried? Understanding other feelings is a bit down the road, though, so for now they’ve been learning how to train their bodies to feel in a way the headset already understands. Adam said that meditation got the best response when he was in a half-asleep, half-awake state, and that attention felt like focusing on something really hard. He said it takes a lot of training to be able to put out the right kind of brain waves.
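
Their proof of concept maps attention to altitude, and a first cut of that translation might look something like this. To be clear, this is just my own sketch of the idea: the 0–100 scale, the neutral level, and the dead zone are all assumptions, not details of their code or the headset’s actual output:

```cpp
#include <algorithm>

// Hypothetical mapping from an "attention" reading (assumed 0-100 scale)
// to a vertical-speed command for the copter (-1.0 = descend, +1.0 = climb).
// Readings near the assumed neutral level fall in a dead zone so that small
// fluctuations don't make the craft bob up and down.
double attentionToClimbRate(int attention) {
    const int kNeutral = 50;    // assumed "hold altitude" attention level
    const int kDeadband = 10;   // ignore fluctuations this close to neutral
    int delta = attention - kNeutral;
    if (delta > -kDeadband && delta < kDeadband) return 0.0;  // hold altitude
    // scale what's left of the range to [-1, 1]
    double rate = (delta > 0 ? delta - kDeadband : delta + kDeadband)
                  / double(50 - kDeadband);
    return std::clamp(rate, -1.0, 1.0);
}
```

So a reading of 100 (focusing really hard) would mean full climb, a reading near 50 would hold altitude, and drifting toward 0 would bring the copter back down.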

Andrew mentioned that the headset is also trained to ignore facial motion, except for blinking. He thought that if they’re able to accurately control their brain-wave output, maybe they could also incorporate blinking: one blink could mean one command, two blinks could mean another. That way they would have at least four distinct control inputs. He said they were also thinking of bringing in another set of controls, such as a Wiimote, joystick, or iPhone, to work in combination with the body. To get a more sophisticated response from the copter, you have to be able to tilt and spin at the same time, which would be a lot easier with an external device.
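
Once you can count blinks reliably, turning a blink count into a command is the easy part. Here’s a toy version of the idea; the command names are made up for illustration and aren’t from their project:

```cpp
#include <string>

// Toy mapping from the number of blinks detected within a short time
// window to a copter command. The commands themselves are hypothetical.
std::string blinkCommand(int blinkCount) {
    switch (blinkCount) {
        case 1:  return "takeoff";  // one blink: take off
        case 2:  return "land";     // two blinks: land
        case 3:  return "hover";    // three blinks: hold position
        default: return "none";     // anything else: ignore
    }
}
```

The harder part, of course, is deciding when a group of blinks has ended, so that two quick blinks register as “land” rather than as “takeoff” twice.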

Another cool part about this story is how these two guys came to be working on their project. When I first saw them sitting there, I thought for sure they had been friends for a while. Turns out, they only met two weeks ago! They got to talking about the neat stuff they wanted to hack together and came upon the idea for this project right then. Even though they are currently only working on software and don’t actually need to use any of Metrix’s tools, they still chose to stop by. Why? They told me that Metrix had good creative energy and was a good place to work. I certainly agree!

They plan on meeting once a week at Metrix to finish their proof of concept (controlling altitude with attention), and then hope to get it fully functioning. Once that’s done, they might move on to controlling a robot Andrew owns with brain waves as well, or even building their own quadrocopter. Their code will be released open source, so perhaps you could make something too!

These kinds of projects really make me think about the future. I’ve already seen videos of amputees controlling robotic arms with just their brains, and computers seem to be moving toward more intuitive, organic interaction. Are we on a path to controlling all our devices with our brains? Imagine never having to use a mouse or keyboard again. I also wonder what making art with direct brain output (instead of using hands as a middleman) would look like. If our brains can talk to our devices and our devices can talk to each other, is telepathy in our future? I’m probably stretching a bit, but it’s still pretty cool to see a piece of the future in action today.