The positions we take in our sleep can have implications for our health, including affecting the symptoms of conditions such as sleep apnea or carpal tunnel syndrome. But most of us sleep under blankets in dark rooms, where it’s hard to study our natural sleeping poses.
Researchers at Northeastern are combining different sensing technologies with machine learning techniques to monitor a sleeper’s position even under the covers in total darkness. Their work could make this monitoring easier for doctors and less invasive for patients.
“As healthy adults, we spend almost one third of our life in bed, if not more,” says Sarah Ostadabbas, an assistant professor of electrical and computer engineering at Northeastern. “For patients that are in hospitals, the elderly, and young kids, it can go up to 100 percent of the time. We can bring the power of computer vision and artificial intelligence to make the process of understanding human behavior in bed easier.”
Ostadabbas and her team in the Augmented Cognition Laboratory received a grant from the National Science Foundation to build a data set of sleep positions and use it to train algorithms to recognize and identify the pose of a sleeping person. They recently shared both the data set and their algorithms on their website, to allow other researchers to use and expand on their work.
“There were no data sets out there that we could use to teach our algorithm how to understand, predict, and diagnose some of these sleep behaviors or abnormalities,” Ostadabbas says. “So we decided to collect our own data and release it to the public.”
To build the data set, the team recruited more than 100 people from the Northeastern community to come to their lab and lie down in various positions. The researchers collected several types of data, with and without a sheet or blanket: A mat under the person took pressure readings, an overhead sensor mapped the depth changes across the bed, and two cameras took standard and infrared images.
Each session took about two hours. At the end of it, the researchers had compiled more than 14,000 position samples and labeled each one.
In a recent paper, Ostadabbas and her team described how they used the infrared data to train an algorithm. Currently, using a pressure mat is the least invasive way to monitor how a person is sleeping. But pressure mats are large and expensive, which limits their use. Infrared cameras, which detect a person’s body heat, could prove to be a cheaper and less cumbersome option.
“If the patient is covered by a sheet, the heat will transfer to the surface to form a profile quite similar to the patient,” says Shuangjun Liu, a doctoral student who is the student lead of this project. “We wanted to use this as a clue to recognize the poses. And from the result, we believe it works well.”
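The idea of matching a heat profile to a pose can be sketched with a toy classifier. This is purely illustrative: the team's actual system is trained on real labeled infrared images, and the 8×8 thermal grids, pose names, and nearest-centroid method below are invented for demonstration, not taken from their paper.

```python
# Illustrative sketch only: a nearest-centroid classifier on synthetic
# "infrared" heat maps. The real system uses labeled infrared images;
# these toy 8x8 grids and pose labels are assumptions for demo purposes.
import random

random.seed(0)
SIZE = 8  # toy 8x8 thermal grid

POSES = ["supine", "left_side", "right_side"]

def make_profile(pose):
    """Generate a synthetic heat map: a warm vertical band whose
    position loosely encodes the pose (supine = center, sides = offset),
    plus a little sensor noise."""
    col = {"supine": 4, "left_side": 2, "right_side": 6}[pose]
    grid = []
    for r in range(SIZE):
        for c in range(SIZE):
            heat = max(0.0, 1.0 - abs(c - col) / 3.0)
            grid.append(heat + random.gauss(0, 0.05))  # sensor noise
    return grid  # flattened to a 64-value vector

# "Training": average 20 noisy samples per pose into one centroid each.
centroids = {
    p: [sum(vals) / 20 for vals in zip(*(make_profile(p) for _ in range(20)))]
    for p in POSES
}

def classify(sample):
    """Assign the pose whose centroid is nearest in squared distance."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(POSES, key=lambda p: dist(sample, centroids[p]))

print(classify(make_profile("left_side")))  # prints "left_side"
```

Even this crude sketch shows why a sheet is not fatal to the approach: as long as the transferred heat pattern stays distinct between poses, a classifier can separate them.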
Their algorithm, relying exclusively on infrared data, accurately captured the pose of a sleeping person 96 percent of the time. But this is just the first step, Ostadabbas says. Her team is inviting the medical and machine-learning communities to add to the data set and expand on their work.
“Better algorithms and better data sets will lead us to a toolbox that is easy to use, easy to tune, and can work in different applications,” Ostadabbas says. “At the end of the day, we are hoping to have better quality of life for patients that are suffering from some of these sleep-related or bed-related disorders.”