Social psychologists have documented that people believe they’re more likely than others to donate blood, give to charity, treat another person fairly, or give up their seat on a crowded bus for a pregnant woman.
But research from University of Chicago postdoctoral researcher Nadav Klein and Chicago Booth’s Nicholas Epley suggests that people don’t necessarily believe they are holier than others. Instead, people simply believe they are less evil than others. Understanding this fine distinction could help guide behavior.
The researchers conducted a series of experiments to explore how people think of their own and others’ virtue. In one, Klein and Epley called people into a lab and told some participants (“targets”) that they would be paired up with other participants (“actors”). Each pair had a chance of winning $10 at the end of the experiment, but the actors got to decide how this $10 would be split between themselves and the targets.
Klein and Epley instructed some actors to keep $9 of the $10 for themselves (a selfish action) and instructed others to keep only $1 (a generous action). The researchers then asked all participants whether the actors would have done the same thing had they been free to choose.
The actors who were told to selfishly keep $9 were self-righteous—they were less likely than the targets to believe that this selfish action represented their true character. However, the actors told to keep only $1 were no more likely than targets to believe that this generous action represented their true character. Actors saw themselves as less evil than targets thought them to be—but they didn’t see themselves as more ethical.
Other experiments revealed the same “asymmetric self-righteousness,” as the researchers call it: people consistently reported they were less evil but no more moral than others. In one case, participants predicted that they’d feel worse after carrying out a selfish act (giving away just $1 of $6), but not any better than others after carrying out an unselfish act (giving away $5 of $6). In another case, participants believed they’d give more money to another person than others would in their most selfish moment—but no more than others in their most generous moment.
And people felt they were less likely than others to engage in immoral behaviors, such as stealing a $20 tip left for a waiter—but neither more nor less likely to engage in moral behaviors such as returning a lost wallet.
When it comes to ethical behaviors, people judge themselves based on their own positive intentions, but they judge others based on their actions, the researchers write. And when it comes to unethical behaviors, people justify their own actions but are less understanding of others’.
The findings could benefit people writing policies that try to encourage particular behaviors. Take, for example, a company that wants to initiate policies that promote ethical practices, such as avoiding gender bias in job applications. If people believe they are unlikely to engage in unethical behaviors, saying these policies are aimed at preventing unethical behavior might be ineffective “since in that case people might think, ‘Oh, that doesn’t apply to me!’” says Klein. “So the better way to frame these policies may be as ones aiming to promote ethical—rather than discourage unethical—behavior.”