A key to communication for locked-in syndrome patients

For the past seven years, one man receiving care from LifeStream, a Massachusetts-based human services organization, has been able to communicate only by blinking his eyes in response to yes-or-no questions. He is cognitively aware but paralyzed in nearly all of his voluntary muscles due to severe injuries he suffered in a car accident.

His condition is called locked-in syndrome, which affects more than 50,000 people each year as a result of traumatic brain injuries, strokes, or other afflictions. Over the past year, six Northeastern electrical and computer engineering seniors have been developing a better communication system for individuals such as the man at LifeStream, who tested the group’s device himself.

“Existing systems in which there’s eye-tracking through infrared can be extremely expensive,” said Colin Sullivan, E’14. “One of the systems we looked at cost upwards of $10,000. Sometimes health insurance won’t cover it, and even if it does there’s a large co-pay.”

For their engineering design capstone project, the students created an alternative device that costs a mere $223. And that’s before optimization and commercialization, both of which could decrease the price even further, said group member Robin Yohannan, E’14.

Instead of detecting the location of the pupil or using expensive brain-computer interfaces, the team’s EOG Assisted Communication Device (EOG is short for electrooculography) takes advantage of a unique characteristic of the human eyeball. “The eye has an intrinsic property where the front of the eye is more positively charged than the back of the eye,” Yohannan said. When the eyes move to the left or the right, electrodes taped to either side of the head detect a voltage differential based on that polarity, he explained.

Team member Ryan Whyte, E’14, worked on the hardware of the system, building on a circuit he learned about in a biomedical electronics course taught by electrical and computer engineering assistant professor Mark Niedre. The circuit filters and amplifies the signals coming from the electrodes and turns them into suitable inputs for the Arduino microcontroller, for which Yohannan and Jeffrey Mui, E’14, wrote the code.

“This allows us to translate the digital signals into reliable eye movements on the screen,” Mui explained.
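
The article doesn’t include the team’s code, but the firmware’s basic job, turning the filtered electrode voltage into discrete gaze events for the computer, can be illustrated with a minimal Arduino-style sketch. The pin assignment, thresholds, sample rate, and one-character serial protocol below are assumptions for illustration, not the team’s actual implementation.

    // Minimal Arduino-style C++ sketch of the idea described above: sample the
    // amplified, filtered EOG signal and classify each reading as a left gaze,
    // right gaze, or centered eyes, sending one character per sample over serial.
    // Pin choice, thresholds, and the protocol are illustrative assumptions.
    const int EOG_PIN = A0;     // horizontal EOG channel after the filter/amp stage
    const int CENTER = 512;     // midpoint of the Arduino's 10-bit ADC range
    const int THRESHOLD = 150;  // counts away from center treated as a deliberate gaze

    void setup() {
      Serial.begin(9600);       // stream events to the host computer
    }

    void loop() {
      int reading = analogRead(EOG_PIN);        // 0-1023
      if (reading > CENTER + THRESHOLD) {
        Serial.write('R');                      // eyes moved right
      } else if (reading < CENTER - THRESHOLD) {
        Serial.write('L');                      // eyes moved left
      } else {
        Serial.write('C');                      // eyes roughly centered
      }
      delay(20);                                // ~50 Hz is ample for EOG signals
    }

A full implementation would likely also sample a vertical electrode channel, where blinks appear as large, brief spikes.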

Group leader Spencer Wenners, E’14, worked on the graphical user interface, which takes the digital output from the Arduino and converts it into a cursor moving around a computer screen. The user can then control that cursor using nothing but eye movements.

The user looks at an on-screen keyboard, moves the cursor to a letter, then blinks for two seconds or more to select that letter. In this way, the device allows users to type out messages to their loved ones and caregivers.
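
Again as a sketch rather than the team’s actual GUI code, the cursor and blink-to-select logic can be expressed in standard C++. The program below reads the hypothetical event stream from the previous sketch on standard input (standing in for the serial port), assumes an added ‘B’ event for samples where the eyes are closed, and selects the highlighted letter once a blink has been held for two seconds at the assumed 50 Hz sample rate.

    // Host-side sketch (standard C++) of the cursor and blink-to-select logic.
    // Event codes ('L', 'R', 'C', 'B'), the 50 Hz rate, and the flat keyboard
    // layout are illustrative assumptions, not the team's implementation.
    #include <iostream>
    #include <string>

    int main() {
        const std::string keyboard = "ABCDEFGHIJKLMNOPQRSTUVWXYZ ";
        const int SAMPLES_PER_SEC = 50;                // must match the sampler
        const int BLINK_SELECT = 2 * SAMPLES_PER_SEC;  // blink >= 2 s selects

        int cursor = 0;       // index of the currently highlighted key
        int blinkRun = 0;     // consecutive blink samples seen so far
        std::string message;  // letters the user has typed

        char ev;
        while (std::cin.get(ev)) {
            switch (ev) {
                case 'L':                     // gaze left: move cursor left
                    if (cursor > 0) --cursor;
                    blinkRun = 0;
                    break;
                case 'R':                     // gaze right: move cursor right
                    if (cursor + 1 < (int)keyboard.size()) ++cursor;
                    blinkRun = 0;
                    break;
                case 'B':                     // eyes closed this sample
                    if (++blinkRun == BLINK_SELECT) {
                        message += keyboard[cursor];
                        std::cout << "typed so far: " << message << '\n';
                    }
                    break;
                case 'C':                     // eyes open and centered
                    blinkRun = 0;
                    break;
                default:                      // ignore newlines and noise
                    break;
            }
        }
        return 0;
    }

The same dwell principle extends naturally to a two-dimensional keyboard once vertical eye movement is decoded.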

Sullivan, who also works at the Center for Research Innovation, worked on testing and debugging the system, as well as moving the technology through the provisional patent process.

Associate professor Waleed Meleis, the capstone team’s faculty adviser, has high hopes for the future of the device. The project will live on in the newly developed student group Enabling Engineering, which Meleis started in an effort to extend the reach and lifespan of promising technologies that aim to help the disabled community. LifeStream, where the locked-in patient lives, is an official supporter of the group.

The six seniors, for their part, have big post-graduation plans. Wenners, for instance, will be taking a job as a product development engineer at Intel, and Mui is getting more seriously involved in a nonprofit group that develops prosthetic hands for children. Even though they’re moving on from their capstone project, the students said they’ve learned lessons from their experience that will stay with them throughout their engineering careers.

“We all took four or five years of engineering classes and transformed that into something that can actually help the community,” Wenners said.