In measurement science, “bias” refers to the systematic error component of the measurement system. Unlike other types of measurement error that are randomly distributed, a bias predictably shifts a measurement in the same direction.
For instance, I recently facilitated a “round robin” measurement correlation study with two other companies, where we compared the outputs of our hardness testers using the same set of test samples. While preparing for that study, I realized that one of our hardness testers, on average, tested 1.1 Rockwell B points higher than the reference sample. It wasn’t testing exactly 1.1 points over, but instead ranged from 0.8 to 1.4 points over across a series of tests. In other words, we had two error components: a bias of +1.1 points and a random error of ±0.3 points. To compensate for this bias, we shifted the tester’s output reading down by 1.1 points, leaving only the random error component in its output values.
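This decomposition is easy to sketch numerically. The readings below are hypothetical, chosen only to mirror the +1.1-point bias and ±0.3-point spread described above; a real study would use many more measurements:

```python
# Hypothetical Rockwell B readings from one tester on a certified
# reference sample (values are illustrative, not real data).
readings = [61.2, 60.9, 61.4, 60.8, 61.3, 61.0]
reference = 60.0  # certified hardness of the reference sample

# Error of each reading relative to the reference.
errors = [r - reference for r in readings]

# Bias: the systematic component, estimated as the mean error.
bias = sum(errors) / len(errors)

# Random error: the spread remaining around the bias,
# here taken as half the range of the observed errors.
random_spread = (max(errors) - min(errors)) / 2

# Compensate by shifting every reading down by the bias;
# only the random component remains.
corrected = [r - bias for r in readings]
```

With these illustrative numbers, `bias` comes out to +1.1 points and `random_spread` to 0.3 points, and the corrected readings center on the reference value.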
While bias is easy to identify in the world of measurement, systematic errors in our decision-making processes are more difficult to detect. Research from psychological science shows that, despite our best intentions, our behavior is often influenced by biases operating outside our conscious awareness.1 These unconscious biases act like the current in a river, consistently shifting our thinking in the same direction and producing a predictable pattern of errors. But because these biases generally operate below our conscious reasoning, they’re difficult to see in ourselves yet easier for a trained observer to see in others.
Everyone has met a “glass is half-empty” pessimist. For these people, everything is bad, everything is too expensive, and Murphy is just over the next hill. But when confronted with their persistent negativity, pessimists usually claim that they’re not pessimists, but realists. They genuinely believe they’re seeing the world as it is, yet the people around them can readily see their negative bias.
A pessimism bias in a decision-maker isn’t much different from the tendency of my hardness tester to consistently report samples harder than they actually are. No matter what the pessimist is presented with, they always see it worse than it truly is.
As professionals, we like to think that we can see the world as it is; that we can observe data objectively. When we approach a problem, we like to imagine that we’re starting with a blank slate, free of preconceived ideas. But the truth is, we don’t. No one does. We all make mistakes, of course. But many of our mistakes aren’t simply random misjudgments; they’re biases. And we all have these biases to some degree, repeatedly steering our thinking in the same erroneous direction.
One of the most pervasive unconscious biases is called the “confirmation bias”. This is the tendency for people to filter out feedback that doesn’t support their established assumptions. In essence, we tend to favor data that supports our beliefs and ignore data that refutes them.
One popular mantra in today’s culture goes something like, “Kids these days don’t want to work … They’re lazy. All they want to do is play video games.” And sure enough, people who espouse this belief are quick to point out their nephew or neighbor’s kid who doesn’t seem to be accomplishing as much as they think they should. But in their quest to spot underachieving teenagers, they unconsciously filter out the young man washing dishes after school to help support his family, or the young lady who’s mowing her elderly grandfather’s lawn.
As a result, people with a strong confirmation bias tend to jump to conclusions, offer over-simplified explanations for observations that contradict their prejudices, and possess unwarranted confidence in the veracity of their claims. After all, everywhere they look is evidence supporting their view.
Another well-studied bias is called the affinity bias. This is the unconscious tendency to gravitate toward people like ourselves. How we manifest this tendency is straightforward: people with similar educational backgrounds, demographics, or experiences require less effort to connect with. They’re easier for us to understand. And this streamlined process of personal connection, combined with our limited time and energy, leads us to cultivate relationships with people like ourselves.
The affinity bias, sometimes called the similarity bias, can have severe consequences on organizations. In the hiring process, for instance, managers often interview candidates with an eye for things like “cultural fit” or “chemistry”. Placing your answer to the question “How well will this person fit into our organization?” above “How well-qualified is this person to excel in the position?” opens the door wide to the negative effects of affinity bias.
One final unconscious bias for consideration is termed the anchoring bias. This is where we tend to over-value the first piece of information we are given about a topic. In a popular study of the anchoring bias2, independent groups of people were asked two questions each. One group was asked the following two questions:
– Is the population of Turkey greater than 35 million?
– What’s your best estimate of Turkey’s population?
The second group was asked these two questions:
– Is the population of Turkey greater than 100 million?
– What’s your best estimate of Turkey’s population?
Invariably, the second group estimated the population of Turkey many millions of people higher than the first.
Anchoring bias can unduly influence our decisions in a variety of ways, and as a result, “anchoring” is widely used by marketers, attorneys, and negotiators to sway their intended audiences toward their first offer.
Now that we understand some of the common unconscious biases, the obvious question is, “What can we do to prevent these from interfering with our decision-making processes?”
A good first step is simply to recognize that we all have unconscious biases; no one can see everything objectively. Possessing biases doesn’t make us a bad person; it makes us human. Starting with some knowledge of our flawed framework may propel us to improve it.
A second step toward minimizing the effects of your biases is educating yourself on these tendencies. Pessimism, confirmation, affinity and anchoring are a few of the more commonly discussed biases. But psychologists have identified several others shown to have profound negative effects in the workplace. These include the halo effect, the sunk cost trap, the fundamental attribution error, and the conformity bias. Understanding how these biases work will help you spot them sooner.
Including more people in decision-making also helps decrease the impact of biases because it brings more perspectives into the analysis. Sharing your knowledge about these subtle tendencies and their potential consequences with your colleagues is another effective step toward improving your organization’s collective decision-making skills. As you and your team weigh options and exchange ideas about how unconscious biases may be affecting your organization’s decisions, everyone has the potential to improve their decisions.
And lastly, try dismissing the thought that you fully understand another’s opinions or experiences. Catching yourself in the act of reverting to what you already understand, and doing the work of learning someone else’s perspective, takes practice, but it’s doable. Allow yourself to receive constructive criticism. Resist the urge to immediately defend your ideas; instead, let others’ perspectives challenge your thinking. These practices will not only help you weed out the biases in your own thinking, but also help you build connections and acquire knowledge that would otherwise have been out of reach.
1. “US Supreme Court Recognizes Role of Unconscious Bias in Disparate Treatment”. Association for Psychological Science. July 2015. https://www.psychologicalscience.org/news/releases/us-supreme-court-recognizes-role-of-unconscious-bias-in-disparate-treatment.html
2. “The Hidden Traps in Decision Making”. Hammond, John; Keeney, Ralph; and Raiffa, Howard. Harvard Business Review. September–October 1998. https://hbr.org/1998/09/the-hidden-traps-in-decision-making-2