We live in an era of environmental awareness, corporate social responsibility, and financial crises and corporate scandals such as Enron. In this climate, the mission of business schools to understand the ethical decision making of managers and leaders is more relevant than ever.

Adding a unique perspective to this field is Benoît Monin, an expert on moral psychology. The associate professor of organizational behavior at Stanford’s Graduate School of Business probes what shapes people’s ethical decision making and behavior in everyday situations. He finds that lofty ethical principles matter much less than we think.

“At the end of the day, morality is an important currency by which we evaluate ourselves and others. Many of our choices are guided less by abstract principles than by whether we presently feel like a good or bad person, and whether we see other people as good or bad,” says Monin, who has taught ethics and management at the business school, and is also an associate professor of psychology in Stanford’s School of Humanities and Sciences.

Monin’s research reveals the complex interplay between self-image and morality. His key idea: Your sense of self-worth greatly affects your behavior and how you judge the behavior of others. If your self-worth is threatened, you’ll likely defend it and try to compensate, rationalizing your behavior by judging yourself to be more moral or by disparaging the morality of others. On the other hand, if you are confident that you are a good person, you can actually feel less compelled to act ethically. “The choices we make are influenced by how confident we are that we’re a good person. If you’re confident in your self-worth, you’re not as sensitive to threats,” says Monin.

Using techniques from experimental social psychology, Monin has identified psychological concepts and everyday situations where such moral dynamics come into play.

Under a phenomenon he calls the “sucker to saint” effect, people tend to cast themselves as morally superior when another person’s behavior makes them feel naïve or foolish. For instance, if a colleague outperforms you by cutting corners while you have dutifully followed every rule, you will be motivated to derogate the co-worker as unethical and to elevate yourself as a paragon of morality to justify your inferior performance.

In one of Monin’s experiments, Stanford undergraduates were asked to do a tedious task — write the words for numbers (one, two, three, etc.) as quickly as possible until told to stop. Another person playing the role of “rebel” entered the lab and was asked to do the same thing, but quit after just one minute. Participants were then asked to rate themselves and the rebel on numerous traits. Those who completed the chore and witnessed the other person quit rated themselves more moral than the rebel. “People claim to be saints rather than feel like suckers when they see others take shortcuts that they didn’t think of themselves. Rather than uphold abstract principles of justice, moral judgment may sometimes just help people feel a little less foolish,” noted Monin and coauthor Alexander H. Jordan in a 2008 research paper.

Similarly, people often resent “moral rebels,” or peers who do the right thing by refusing to go along with a questionable status quo. Their principled stand implicitly puts you on lower moral ground by pointing out your own failure to do the right thing. You might get annoyed and dismiss the do-gooder as “just goody two shoes,” says Monin, who calls the phenomenon “do-gooder derogation.” This is especially problematic in organizations, where people who take a stand against a practice or request they find ethically questionable must face resentment from peers who feel threatened by their stance.

In one study by Monin and his coauthors, undergraduates were asked to pick the most likely suspect in a burglary. They were shown three photographs and given information implicating the sole African American suspect over the other two, both white. Participants were later asked to rate fictitious participants who had supposedly been given the same task. The fictitious people were either “obedient” ones who picked the black suspect or “rebels” who refused to circle a face, saying the task was “obviously biased.” A separate, neutral group rated the fictitious participants before doing the task themselves, so they were not yet implicated in it. Participants who had already implicated themselves by choosing the black face reported disliking the moral rebels, while those who hadn’t yet made a selection liked the rebels and called them more moral. “But once you’ve done it yourself, the rebel makes you feel like a schmuck, and you resent the person that you would have otherwise embraced as a hero,” observes Monin.

In earlier research, Monin and Dale Miller, The Class of 1968/Ed Zschau Professor of Organizational Behavior at the business school, developed the concept of “moral credentialing.” That is, if you do something “good,” you feel licensed to then do something “bad.” You’re off the hook. Monin and Miller’s research demonstrated that people who chose a black or female applicant for one job were more likely to prefer a white or male candidate in a second recruiting task. That’s because once they established moral credentials as nonprejudiced, they felt freer to express views that could appear biased. “If I’m confident I’m a good person, that’s going to liberate me,” Monin says.

Monin believes there are lessons for business in the dynamics of moral psychology. Managers might make wiser hiring and firing decisions if they are keenly aware of how their sense of self-worth affects their judgment. Companies embarking on green initiatives should be vigilant not to let themselves off the hook in other areas of social responsibility. Marketers selling green products should be aware of the risk of consumer backlash if the green aspect is promoted too conspicuously. And managers promoting good practices should anticipate that people who do the right thing may face resentment from their peers.

Monin joined Stanford’s psychology faculty in 2001, becoming associate professor in both the GSB and psychology department in 2008. A native of France, he arrived in the United States in the late 1990s and became interested in “political correctness” in American society. This led to his fascination with the “moral components” of racism and to his overall pursuit of moral psychology. Monin says his approach to morality is not based on “big principles” and “philosophical orientations,” but on how moral decision making “relates to the self.”
