Moral Hypocrisy is Deliberative, Finds Northeastern University Researcher

Study finds bias toward self disappears under cognitive constraint

Boston, Mass. – Moral hypocrisy is an antisocial behavior familiar to most of us: people tend to judge their own moral transgressions more leniently than the exact same transgressions committed by others. Yet, until recently, the origin of this bias was unknown. Northeastern University researchers Piercarlo Valdesolo and David DeSteno have now found that, at heart, the mind is just as sensitive to our own transgressions as to those of others, and that the bias toward protecting the self actually grows out of cognitive rationalization processes. The research is discussed in the latest issue of the Journal of Experimental Social Psychology.

In their study, Valdesolo and DeSteno demonstrated not only that participants viewed their own transgressions as significantly more “fair” than the same transgressions enacted by others, but also that this bias was eliminated under conditions of cognitive constraint. They found that hypocrisy readily emerged under normal processing conditions, but disappeared under a cognitive load, which “ties up” the mind’s ability to engage in higher order rationalization and reasoning.

“Our findings support the view that hypocrisy emerges from deliberative processes,” said David DeSteno, Associate Professor of Psychology at Northeastern University. “It stems from volitionally guided justifications. At a more basic level, humans possess a fundamental negative response to violations of fairness norms, whether enacted by themselves or by others.”

In their studies, the authors gave participants the option to assign fun and onerous tasks to themselves and others either randomly or by personal choice. Other participants did not make the choice themselves, but watched other individuals assign themselves the more enjoyable task. When the authors asked individuals to judge the fairness of these actions, those who assigned the preferable task to themselves judged this action to be more fair than did those who watched another person assign the easy task to him- or herself.

However, when these judgments were made under cognitive constraint (i.e., while remembering a random digit string), “participants experiencing cognitive load judged their own transgressions to be as unfair as the same behavior enacted by another,” said Piercarlo Valdesolo, graduate student of psychology at Northeastern University. “It is also clear that when contemplating one’s own transgressions, motives of rationalization and justification temper the mind’s initial negative response to fairness transgressions and lead to more lenient judgments.”

This study provides strong evidence that moral hypocrisy is governed by a dual-process model of moral judgment, wherein the prepotent negative reaction to a fairness transgression operates in tandem with higher-order processes to mediate decision making.

“In light of our findings, future work should aim to further define the conditions that temper hypocrisy and ultimately suggest ways in which humans can better translate moral feelings into moral actions,” added DeSteno.

For more information, please contact Renata Nyul at 617-373-7424 or at

About Northeastern

Founded in 1898, Northeastern University is a private research university located in the heart of Boston. Northeastern is a leader in interdisciplinary research, urban engagement, and the integration of classroom learning with real-world experience. The university’s distinctive cooperative education program, where students alternate semesters of full-time study with semesters of paid work in fields relevant to their professional interests and major, is one of the largest and most innovative in the world. The University offers a comprehensive range of undergraduate and graduate programs leading to degrees through the doctorate in six undergraduate colleges, eight graduate schools, and two part-time divisions. For more information, please visit
