It’s not just character. We also need to avoid ‘moral overconfidence.’
Whenever we see examples of leaders who suffer an ethical or moral lapse, our knee-jerk response is to say there was a flaw in their character – that deep down, they were basically bad people. That view is part of our larger tendency to sort the world into good people, who are stable, enduringly strong, and blessed with positive character, and bad people, who are inherently weak, frail, or malicious.
In this way of thinking, character is an immutable trait. It’s largely formed during childhood and adolescence, with parents playing a key role. People who adhere to this worldview believe character is something hidden inside each of us, waiting to be revealed during adversity or through careful testing.
In my view, this is really the wrong way to think about it.
Instead, I look at character development in the same way we consider the development of intelligence, wisdom, or subject expertise – as a lifelong process. It’s also not as binary as some observers like to think. The world is not divided into “good people” and “bad people.” Most people have the potential to behave well or poorly, depending on the context.
Most people also overestimate their strength of character – a problem I’ve come to think of as “moral overconfidence.” Consider two famous experiments that highlight how situation and context affect moral behavior.
The first is called the Good Samaritan Experiment, and it was conducted by Darley and Batson at Princeton University. They took 67 divinity students and divided them into groups to deliver short talks to other students. Half the students would talk about career prospects for divinity school graduates. The other half would discuss the parable of the Good Samaritan. The speakers were given maps marking the location of the classroom where they would give the talk. Some were told they were running late; some were told they were on schedule; some were told they had extra time.
On the way to the distant classroom, the students encountered a man lying in a doorway, eyes closed, groaning. The question: would these students stop to help the injured man, or keep walking? Remember, these weren’t just any students—they were divinity students, who aimed to spend their careers helping people. Also remember, half of them were about to deliver a speech on the Good Samaritan, who stopped by the road to help an injured stranger—the exact scenario they were now encountering. Overall, just over 40% of the students stopped to offer aid. Among those who were running late, just 10% stopped. Of those who were preparing to talk about the Good Samaritan parable, 53% helped the man. The lesson most people draw from this: even in a group of students you might expect to possess above-average “character,” a high number behaved poorly when placed in the moderately stressful situation of running late for an appointment.
The second is the set of Milgram experiments, conducted at Yale University in the early 1960s. Subjects were placed in the role of a teacher who was asked to help a learner – really a paid actor – memorize sets of word pairs. Teacher and learner were placed in separate rooms, but they could hear each other via a speaker. Each time the learner made a mistake, the teacher was instructed to turn a dial and administer an electric shock to the learner. With each mistake, the voltage was increased. In reality no shock was being administered, but as the voltage went up, the actor began screaming in pain and banging on the wall. Many of the teachers expressed discomfort about giving the shocks, but they were told it was an essential part of the process and that they should continue. Despite their misgivings, 65% of the subjects kept administering the fake shocks up to the maximum 450-volt level. Milgram’s research, which was motivated in part by his desire to understand why so many lower-level soldiers helped implement the Nazi Holocaust, offers powerful evidence that obedience to authority – in this case, the person in charge of the experiment – is a powerful force. People will do abhorrent things if they’re directed to and made to feel like they’re just following orders.
While both the Good Samaritan and Milgram experiments highlight humans’ propensity to engage in poor behavior in specific circumstances, it’s worth pointing out that in each experiment, a fair proportion of people did the right thing, even under pressure. The trouble is, people wildly overestimate their own strength of character, and far more people assume that they’d be in this “good” group than really would be. When I teach MBA students or executives about the Milgram experiments, I always ask them to raise their hands if they think they’d have stopped delivering shocks, despite the admonition to continue. At least two-thirds of the people say they’d have had the courage to stop – and in some groups, 80% believe they’d have behaved well. In reality, the experiments suggest that only one-third would behave responsibly.
Behavioral finance has taught us that most people are far too confident about their investing prowess, and the same is true in ethics: most people exhibit moral overconfidence, overestimating their own strength of character in the face of pressure or temptation.
Source: IIPM Editorial, 2012.