The brain-controlled Mars rover

It’s cool enough that people are making robots to crawl around Mars, but what if we could control those robots with nothing but our minds?! Well, that’s pretty much exactly what Northeastern grad student Umut Orhan, of assistant professor Deniz Erdogmus’ cognitive systems lab, did on Saturday.

Except instead of Mars, the robot was in Worcester. But still.

So…what in the heck is she talking about? you may wonder. Here’s the deal:

Erdogmus’ lab focuses on signal processing, machine learning, and their applications to contemporary problems in biology and biomedical engineering. In one project, Erdogmus has teamed up with fellow assistant professors of electrical and computer engineering Gunar Schirner and Kaushik Chowdhury, as well as Worcester Polytechnic Institute robotics engineering professor Taskin Padir, to develop a brain-computer interface (BCI) for functionally locked-in individuals, who are unable to interact with the physical world through movement or speech.

An alien life form hanging out behind a rock at the RASC-AL competition

A BCI is a program that turns electrical signals from the brain (captured through electrodes strategically placed on the head using the cap you see in the photo) into computer commands. There are several ways of doing this, but one of the most popular relies on stimulating the visual cortex: a patient looks at a screen showing a set of checkerboards, each flashing at its own frequency and each generating a distinct electrical signal in the brain. By focusing attention on one checkerboard, the patient effectively selects a command. Padir will develop robots that can respond to these signals for both communication and self-feeding, two of the most important priorities for the locked-in community.
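To make the idea concrete, here’s a minimal sketch of how a frequency-tagged BCI like this could turn a short window of EEG into a command: look for which checkerboard’s flicker frequency dominates the signal’s spectrum. The flicker frequencies, sample rate, and command names below are illustrative assumptions, not the lab’s actual setup.

```python
import numpy as np

# Hypothetical flicker frequencies (Hz) for four on-screen checkerboards,
# each mapped to a command. The real system's frequencies and commands
# may differ; this only illustrates the general frequency-tagging idea.
TARGETS = {8.0: "forward", 10.0: "backward", 12.0: "left", 15.0: "right"}
SAMPLE_RATE = 256  # EEG samples per second (assumed)

def classify_command(eeg_window):
    """Guess which checkerboard the user is attending to.

    eeg_window: 1-D numpy array of EEG samples from an electrode over the
    visual cortex. The attended checkerboard's flicker frequency shows up
    as a peak in the spectrum, so pick the target frequency with the most
    spectral power.
    """
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / SAMPLE_RATE)
    powers = {f: spectrum[np.argmin(np.abs(freqs - f))] for f in TARGETS}
    return TARGETS[max(powers, key=powers.get)]

# Example: a noisy 2-second window dominated by a 12 Hz component
t = np.arange(2 * SAMPLE_RATE) / SAMPLE_RATE
fake_eeg = np.sin(2 * np.pi * 12.0 * t) + 0.5 * np.random.randn(t.size)
print(classify_command(fake_eeg))  # most likely "left"
```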

Another of Padir’s robotics projects is Oryx, a “planetary exploration mobility platform” (aka a planet-crawling robot, or rover). For two years in a row, Oryx has won the national student rover design competition held by NASA and the National Institute of Aerospace. RASC-AL Robo-Ops, as the competition is called, asks teams of undergraduate and graduate students to design and build a rover and then control it from a remote location to pick up rocks and “alien life forms.” Traditionally, Oryx is controlled with standard keyboard and mouse commands over a wireless computer link.

But the research teams came up with an even better idea. Over the weekend, Erdogmus’ and Padir’s labs participated in WPI’s TouchTomorrow festival of science, technology, and robotics by combining the BCI with Oryx. Instead of controlling a robot that brings food to a locked-in patient’s mouth, they used the same technology to drive the rover. This particular project was spearheaded by grad student Hooman Nezamfar.

Orhan and one of Erdogmus’ postdoctoral researchers, Murat Akcakaya, holed up in a dimly lit room in the cognitive systems lab on Saturday morning while the other half of the team enjoyed the sunshine over in Worcester. With streaming video from a camera mounted atop Oryx, the Boston duo could see what the robot saw. Once they connected the BCI to Oryx over a 4G network, stimulation of Orhan’s visual cortex translated into movement commands for the rover.
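In rough terms, the remote-control loop might look something like the sketch below: classify each EEG window on the Boston side, then ship the resulting command string to the rover’s control computer over the network. The address, port, and message format are assumptions made for illustration, and `classify_command` is the function from the earlier sketch, not the team’s actual protocol.

```python
import socket

# Hypothetical glue between the BCI classifier and the rover. The host,
# port, and newline-delimited text commands are illustrative assumptions.
ROVER_ADDR = ("oryx.example.org", 9000)

def drive_rover(eeg_windows):
    """Send one movement command per EEG window to the remote rover."""
    with socket.create_connection(ROVER_ADDR, timeout=5) as conn:
        for window in eeg_windows:
            command = classify_command(window)  # from the earlier sketch
            conn.sendall((command + "\n").encode("utf-8"))
```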

Oryx in Worcester. The dudes in the background are Skyping with Orhan and Akcakaya, trying to remedy the network connection failure.

There were a few glitches: the network was all sorts of clogged, so there was a significant delay between Orhan’s visual commands and the robot’s movements. But otherwise it was a pretty stellar thing to watch. Very slight shifts in Orhan’s visual attention (even in his peripheral vision) caused the robot 50 miles away to do its little dance for the kids and adults watching at the festival.

Orhan said that in addition to therapeutic applications, BCI could be useful for military purposes. “It could allow a pilot to control a plane if his hands were indisposed or incapacitated,” he said.