Unintended bias

Research shows that well-meaning managers aren't quite as unbiased and ethical as they think they are.

As an IT manager, you may not be as ethical as you think you are, says Mahzarin R. Banaji, a professor of social ethics at Harvard University. In the December issue of Harvard Business Review, she and co-authors Max H. Bazerman and Dolly Chugh write that most managers are burdened with unconscious biases that often result in unintentionally unethical decisions. Banaji discussed these biases with Kathleen Melymuka and offered suggestions for minimizing their impact.

You say most managers aren't as ethical as they think they are. Why not? Most people are not as ethical as they think they are because there are constraints on ethics that are not visible to the conscious mind and that, even when they are visible, may not be easily controlled. All behavior, including behavior that has ethical implications, can be guided by thoughts and feelings that reside in unconscious form. Managers are no different, except that their attitudes and behavior have greater impact on others because of their role.

You talk about a number of sources of unintentionally unethical decision-making. Let's start with implicit prejudice. Can you give me an example in an IT setting? An implicit bias that may flourish in an IT setting concerns gender. If the association is that men are better at IT than women are, that may lead to men being selected for such positions more often than women, given positions of higher responsibility and retained with greater confidence.

Yet you say implicit prejudice is rooted in the fundamental mechanics of thought. Can you explain? I refer to the type of bias we study as "ordinary prejudice" to distinguish it from conscious bias. It's ordinary in two ways. First, prejudice -- seemingly a thing of the heart -- is rooted in the fundamental mental mechanisms of perceiving, categorizing and remembering. To categorize, for example, is a basic skill without which we could not tell apart things that belong to different families -- say apples and oranges. To be able to do this is central to thinking. And this mechanism is involved in seeing that people belong to different social groups -- male and female, young and old, rich and poor, etc. It is in this sense that I say that prejudice has its origins or roots in the ordinary mechanics of thought. The second sense in which I might use the term ordinary is related. If (implicit prejudice) is rooted in the fundamental mechanics of thought, then it should be visible in all of us, not just a minority that may also be consciously biased.

Another source of bad decisions is in-group favoritism. What's wrong with putting in a good word for a friend? It's not that putting in a good word for a friend is wrong; it's the simultaneous failure to put in a good word for others that makes the playing field unequal. It's the relatively greater access to economic and social benefits that accrues from such favoritism that makes it problematic. If the benefit of being recommended were equally distributed, such that all people got recommended by all others, we would have nothing to say here. It is because this benefit is very much a function of where in society one sits that the simple act of recommending becomes discriminatory.

You also say people subconsciously give themselves too much credit. What's the danger of that in an IT setting? To the extent that much of IT work is collaborative -- teams of people working together -- taking more credit for work than is warranted can leave other team members feeling disaffected and unrecognized.

Finally, there's conflict of interest. That seems pretty unethical. How can it be unintentional? We are not speaking about intentional conflict of interest, that is, where people explicitly use the power they have in one context to gain favor in another. We are instead concerned about conflict of interest that may occur more subtly, for example, where a person stands to gain because of more distant affiliations that play a role in who does favors for whom. Here, identifying situations that may lead to conflict of interest, even when they are not "required" to be treated that way, is the progressive way to think.

How can an IT manager deal with these ethical lapses if he's not even conscious of them? Act affirmatively (to combat discrimination), not because the group has been wronged in the past, but because the harm from implicit bias is a thing of the present. We also say some more specific things about shaping one's conscious attitudes -- which can trickle down to the implicit level with practice -- and about shaping the environment.

Can you explain and give me some advice on the strategy of shaping an IT environment? A diverse workforce does this in the most obvious way. It allows daily associations of good and bad to be made to many different types of people, blurring the association of any single attribute with a group. If Firm A has 20 Asian women and Firm B has two, the employees of Firm A are more likely to have their stereotypes of Asian women disconfirmed.

Ultimately, you say, "vigilance even more than intention is a defining characteristic of the ethical manager." What do you mean by that? Good intentions don't always protect us. Vigilance is an effective way to begin the process of change.

Evaluate your ethics

Mahzarin R. Banaji and her colleagues say that implicit, unintentional bias is all around us. The Implicit Association Test has been designed by researchers to measure this unconscious bias by asking test-takers to make split-second associations between words with positive or negative connotations and images of different types of people. More than 2.5 million people have taken these tests -- many online. Among the findings:

-- Seventy-five percent of test-takers favor young, rich and white people over old, poor and nonwhite people.

-- Those who show bias on tests are more likely to act that way in face-to-face interactions and decisions.

-- The desire not to be biased doesn't eliminate bias.

You can begin to deal with your own hidden biases by discovering them at www.implicit.harvard.edu and www.tolerance.org/hidden_bias.
