Providing context for perceiving emotion

It’s long been said, “Don’t judge a book by its cover.” According to new research by Lisa Feldman Barrett, Distinguished Professor of Psychology at Northeastern University, that adage is certainly true when it’s applied to our ability to read emotions in a person’s facial expression.

Barrett explains that for more than 50 years, scientists have commonly thought that there are at least six basic facial expressions that indicate how a person is feeling. Barrett, however, found that many more factors — including body language, visual scenery, other voices and even a person’s cultural orientation — are essential to perceiving emotion accurately.

“This challenges the long-held belief in the science of psychology that faces are the main event,” Barrett says.

The findings were published in the October 2011 issue of the review journal Current Directions in Psychological Science. Barrett is the lead author, and collaborated with Northeastern graduate student Maria Gendron, and Batja Mesquita, a researcher at the University of Leuven in Belgium.

Barrett says, for instance, that a scowl could indicate anger, fear or disgust. She points to an iconic image of tennis player Serena Williams after she had just defeated her sister, Venus, at the 2008 U.S. Open. Focusing only on Williams's face, she appears to be screaming in anger. Zoom out to the entire picture, however, and she appears ecstatic, clenching her fist in victory.

The research also found that language provides context that aids facial perception. Barrett cited a study in which participants determined emotion more accurately when asked to choose an emotion term from a predetermined set of words than when they were asked to come up with the appropriate term themselves.

These examples, Barrett says, illustrate that facial expressions by themselves do not broadcast feelings like words on a page. When the context is stripped away, a person’s face can broadcast whether they are in a positive or negative state or should be approached or avoided — but it doesn’t indicate whether that person is angry, sad or afraid.

Barrett says these findings will have major implications across a broad spectrum of research areas dealing with emotion. For example, security training methods for law enforcement and airport screeners may have to be reevaluated, given that training is often based on the idea that emotions can be read on a person’s face. She also says the findings could lead to greater understanding of the feelings of people suffering from mental illnesses.

“This has real-world implications for people to understand the limits of their own perception,” Barrett said. “You are an active architect in the way that you perceive the world. You’re not just a sounding board, simply receiving information from the world and detecting what’s there.”
