The Problem with Hidden Biases and a Strategy to Reduce Bias
Have you ever been in a volunteer or board meeting where one person suggests a course of action and everyone just goes with it, without really questioning the viability of the solution? Conversely, have you ever been in a meeting where the group sees a need to discuss a decision until it is long dead and buried and refuses to decide until more “research” can be done?
Frustrating, right? Luckily, there are ways to prevent these situations, increase the quality of decisions being made, and inspire volunteers to be more mindful, all at the same time!
In the cases above, a cognitive bias is a key barrier to success. Cognitive bias is a mental error caused by simplified information processing, sometimes known as "snap decision-making." Bias is the result of subconscious habits and applies to everyone, regardless of education, background, or upbringing.
Bias is Innate, But Bad for Business
The notion of cognitive bias was first introduced in 1972 by Daniel Kahneman and his colleague Amos Tversky. In 2002, Kahneman won the Nobel Prize in Economics for research showing how humans take mental shortcuts that depart from basic principles of probability and rationality when judging a situation, work that culminated in prospect theory.
Researchers have found that the brain uses snap decision-making to protect us from a perceived threat. Over the course of history, organisms that placed more urgency on avoiding threats than they did on maximizing opportunities were more likely to survive and pass on their genes. Over time, the prospect of loss has become a more powerful motivator of behavior than the promise of gains.
In addition, our perceptions are formed by our experiences, culture, upbringing, and messages received through mass media. The challenge for us today is that these unconscious decisions distort our judgment and can lead to stereotyping and bad decision making in our volunteer programs.
Bias isn’t just annoying, though. It has real business consequences. A McKinsey study of more than 1,000 major business investments showed that when organizations worked at reducing the effect of bias in their decision-making processes, they achieved returns up to seven percentage points higher.
7 Common Cognitive Biases
Below are several types of biases that commonly affect our decisions. They operate mostly unconsciously, and exhibiting any of them doesn’t make us bad people; they are simply part of being human. What’s important is to recognize and challenge them so that you make the best decisions and judgments you can.
Have you seen any of these in action in your personal or professional life? (Note: When I present these at trainings, most people start to chuckle knowingly. We’ve all been there!)
- Conservatism Bias — We believe prior evidence even when new, contradictory evidence has emerged
- Confirmation Bias — We listen only to info that confirms our preconceptions
- Halo Effect — We assume that if a person, organization or approach is successful in one area it will be in another
- Information Bias — We tend to seek more information even when it is not relevant or will not affect our actions
- Anchoring Bias — We weigh one piece of information (usually the first) more heavily than another in decision-making
- Affect Heuristic — We let our current emotions influence our decisions and focus
- Loss Aversion — We prefer to avoid loss instead of acquiring gains (similar to Status Quo Bias)
How to Meet Bias Head On
Although we know they exist, individual biases will not simply disappear because we become aware of them. We need to create situations where our decisions can be challenged before they become final. Working as a team, versus as an individual, can help us unearth our biases and make better decisions.
Here are a few specific ways to address bias head on to make sure it isn’t affecting the decision making of you or your volunteer team.
- Acknowledge that we all have innate biases
- Clarify why & how decisions are made to slow down decision making
- Notice how you relate to certain people & whether you make reflexive judgments about them
- Consciously collaborate with people who are not like you
- Seek out regular feedback on your own behaviors & actions
- Challenge your own beliefs & check your own biases
- Be courageous!
A Strategy to Reduce Bias in Meetings
Designing “mindful meetings” can help address bias and create team processes to flush out differing perspectives. Here are some ways to reduce bias in a proactive way in your next team meeting:
- Invite the right people to the table
  - Focus on expertise, not rank.
  - Make sure those who are most affected by the decision are present.
- Assign homework
  - Have people individually consider alternatives, challenge hypotheses, & research facts before the meeting.
- Create the right atmosphere
  - Encourage dissent and “depersonalized” debate.
  - Allow for ambiguity and uncertainty.
- Manage the debate
  - Lay out a decision-making process at the beginning of the meeting.
  - Communicate assumptions made about uncertainties.
  - Ask people to write down their initial positions and create balance sheets of pros and cons.
  - Assign “devil’s advocates” for the meeting.
- Follow up
  - Commit to the decision and cease debate about it after the meeting.
  - Conduct a post-mortem on the decision once the outcome is known.
Two Great Resources
As part of your strategy to reduce bias, you will want to educate volunteers about bias. Share both of these resources and facilitate a discussion around them during a volunteer training.
- Taking the Bias Out of Meetings Fact Sheet & Podcast (McKinsey)
- The Big Idea: Before You Make That Big Decision Article (Harvard Business Review)
What specific steps do you take to reduce bias at your organization? Add them to the comments below.
Great article! Would love to hear this concept applied to the use of consultants in nonprofits. Are consultants hired to actually effect change, or are they hired to confirm a bias? Do consultants, in order to gain more work, tell organizations what they want to hear? Thanks!
Great question @ Meridian — As a consultant myself, I consider it my ethical duty to challenge my own assumptions as well as to provide my clients my candid observations — the good, the bad and the ugly. In addition, my job is to put myself out of work — in other words to help nonprofits build enough capacity so that they can manage on their own when I’m long gone. That said, as I note above, everyone has blind spots. So, it’s a good idea to ask any prospective consultant what they do to uncover any hidden cognitive bias BEFORE you give them the job. In my case, I often involve volunteer and/or staff advisory groups in the projects I work on and encourage robust participation. That way, we develop solutions that include a variety of perspectives — which makes the solution that much stronger.