What Can We Do About Our Bias?

A 4-step roadmap for developing an always-on, honest relationship to bias.

The original post is here: https://medium.com/better-humans/what-ca...

In 2016, I published the Cognitive Bias Cheat Sheet on Medium, and as of today it’s been read over 1.3 million times and has inspired a book titled Why Are We Yelling? The Art of Productive Disagreement, coming out this November. In the meantime, I’ve continued trying to simplify things in the hopes of getting to the true core of what biases are, how they help us, and how we can best manage our relationship to them.

TL;DR —

We can’t avoid our biases. The best we can do is maintain an honest dialogue with our blind spots and commit to identifying and repairing inadvertent damage caused by them as efficiently as possible.

WHAT THIS POST COVERS

  1. Four steps to developing honest bias
  2. The 3 Conundrums and 13 strategies that generate biases
  3. Example: the 2020 presidential election
  4. How to apply this to real life
  5. Some things to help you remember

There are many obstacles to seeing things clearly.


🌀 Four Steps to Developing Honest Bias

Step 1: Opt-in. Developing honest bias requires us first and foremost to wake up to our own blindness and to stop trying to pretend it doesn’t exist. Only you can decide if you’re up for the challenge of taking it on.

Step 2: Observe (Beginner level). Take steps to reduce the amount of time and energy you spend trying to hide or ignore your biases and blind spots. For example: read articles like this to get familiar with the variety of biases. Notice when your defenses are triggered and check whether A) you’re really in existential danger right now, or B) there’s an opportunity to learn from a new perspective (even in a small way).

Step 3: Repair (Intermediate level). Take steps to reduce the time and energy it takes for you to identify and begin to repair inadvertent damage caused by your biases and blind spots. For example: when you notice a blind spot, look into it and identify people and ideas that may have been undervalued or harmed by you and others. Look for ways to reverse that trend and repair damage.

Step 4: Normalize (Advanced level). Take steps to reduce the time and energy others have to spend challenging your blind spots and recruiting you to address the damage that you’ve contributed to. For example: actively seek out information and perspectives that challenge your own. Invite the best representatives of positions you don’t agree with to productive disagreements. Actively attempt to falsify your own beliefs.

🙅‍♂️ What not to do 🙅‍♀️

There’s no way to become completely unbiased. All of the steps to develop honest bias are about continuous maintenance rather than one-time, permanent fixes. The temptation to seek permanent fixes is great (believe me, I’ve looked for them too), but the 3 conundrums don’t have permanent fixes. If you think you’ve found one, or are on its trail and will catch it any day now, check yourself. There’s a good chance that it’s intended to resolve your anxiety about the problem rather than fix the problem itself. See the shortcut “treat experience as reality” below. Focus on openness, responsiveness, and maintenance instead.


The 3 Conundrums & 13 Strategies That Generate Biases

No matter what we do, we can’t escape these conundrums, but 13 strategies help us think within their constraints.

3 CONUNDRUMS

  1. 🧠 There’s too much information (so we must filter it).
  2. 🧡 There’s not enough meaning (so we use stories to make sense).
  3. 🖐 There’s not enough time (so we motivate towards action).

Each of the following strategies helps us compensate for one of the 3 conundrums by overvaluing a certain kind of possibility and undervaluing everything else. Each strategy is a collection of cognitive biases, mental shortcuts, and energy savers that help us filter information, make sense of things and get stuff done.

13 STRATEGIES

🧠 1-5 HELP US FILTER INFORMATION

  1. We depend on the context to figure out what to notice and remember.
  2. We accept what comes to mind, and don’t worry much about what doesn’t come to mind.
  3. We amplify bizarre things.
  4. We notice novelty.
  5. We seek takeaways to remember and toss the rest.

🧡 6-10 HELP US MAKE SENSE OF THINGS

  6. We fill in the gaps with stereotypes and generalities.
  7. We favor familiar things over the non-familiar.
  8. We treat experience as reality.
  9. We simplify mental math.
  10. We are overconfident in everything we do.

✋ 11-13 HELP US GET THINGS DONE

  11. We stick with things we’ve started.
  12. We protect existing beliefs.
  13. We will opt to do the safe thing, all other things being equal.

Example: The 2020 Presidential Election

Let’s use the example of the upcoming 2020 presidential election to see how each of these strategies could impact how we decide who to vote for.

Strategy 1: Depend on the context

🧠 Helps us filter information by overvaluing options that fit your current context or mindset (but undervaluing options that don’t).

What we notice, and what we remember, is influenced a great deal by the context that we’re in at the time. We’ll tend to look for candidates that correct for the current President’s flaws. When we’re anxious, we’re more likely to want someone to come in and flip tables; and when we’re happy with things, we’re more likely to want someone who avoids rocking the boat. It’s worth just making note of the different contexts that might influence you, because whatever context you’re in at the moment you make a final decision will override all the past contexts you might have experienced.

Related biases: Cathedral effect, Focalism, Generation effect, Levels of processing effect, Magic number 7+-2, Misattribution of memory, Next-in-line effect, Picture superiority effect, Self-relevance effect, Spacing effect, Testing effect, Tip of the tongue phenomenon. Learn more…

Strategy 2: Accept what comes to mind

🧠 Helps us filter information by overvaluing options that come readily to mind (but not even considering options that don’t).

Who you end up considering, and who you eventually vote for, is limited to the candidates that you’re even aware of. Being a household name like Joe Biden is a huge advantage because he’s at least on the list of people you could vote for. If you haven’t heard of Mike Gravel, Seth Moulton, or John Delaney, you won’t vote for them because they don’t come to mind as options. For important decisions with a finite set of options, like an election, you can reduce this strategy’s influence by actively researching the full set of options before deciding who to pick.

Related biases: Attentional bias, Availability heuristic, Change blindness, Cryptomnesia, Cue-dependent forgetting, Frequency illusion, Source confusion, Survivorship bias. Learn more…

Strategy 3: Amplify the bizarre

🧠 Helps us filter information by overvaluing loud abnormalities that stand out (but undervaluing quiet things that blend in).

Candidates are often remembered by a signature issue or characteristic that makes them stand out. If that happens to be something you value, you’ll notice it and boost their importance. Elizabeth Warren is promoting bold policies, Pete Buttigieg is the gay candidate, Andrew Yang supports universal basic income, Bernie Sanders wants Medicare for all and free college tuition, Tulsi Gabbard is the first Hindu member of Congress, etc. By promoting one big thing that makes them special, they stay in our attention more easily. Candidates are well aware of this effect and actively use it to influence our choices. It’s worth asking what boring, totally-not-weird things make a candidate stronger, and proactively weighing those qualities as well.

Related biases: Bizarreness effect, Negativity bias, Publication bias, Von Restorff effect. Learn more…

Strategy 4: Notice novelty

🧠 Helps us filter information by overvaluing shiny new things (but undervaluing stuff that’s been around a while and has lost its shine).

Who are the up-and-coming political stars that we should pay attention to? Newcomers have an advantage because they get noticed and can be blank slates for hopes and dreams to be projected onto. Obama, Trump, and many of the candidates under consideration this time around have a novelty bump in their favor. New is sometimes better. New is sometimes worse. This is another strategy that campaigns are well aware of and will invest in promoting if it’s to their advantage. But what’s new is also unknown, so rather than assuming new is good, when you find your attention gravitating to the new, spend some time actually looking at what’s there instead of filling it in with your hopes and dreams.

Related biases: Anchoring, Appeal to novelty, Contrast effect, Decoy effect, Distinction bias, Framing effect, Weber–Fechner law. Learn more…

Strategy 5: Seek takeaways

🧠 Helps us filter information by overvaluing things that present themselves as takeaways (but undervaluing things that don’t seem relevant in the moment).

What we experience in the moment is often different from what we take away. This is because our brains tend to latch on to certain kinds of details more than others. If there’s a long list of candidates, you’ll remember the ones at the beginning and end of the list better than the ones in the middle. When remembering past experiences, we often undervalue how long we were unhappy and only remember the peak of unhappiness. Choosing who we vote for requires us to reflect on the past few years to evaluate what should change, but by default we’ll neglect duration and misremember a lot of the details. If you’re experiencing something that you want to remember and reflect on later, it’s better not to rely on your memory alone.

Related biases: Duration neglect, Leveling and sharpening, Memory inhibition, Misinformation effect, Modality effect, Part-list cueing effect, Peak–end rule, Primacy effect, Recency effect, Serial position effect, Serial recall effect. Learn more…

Strategy 6: Fill in the gaps

🧡 Helps us make sense by overvaluing how things conform to perceived generalities (but undervaluing how things differ from perceived generalities).

Imagine if you had 500 words to describe your life’s work — so much would necessarily get left out. And yet, reading 500 words about each of the 26 candidates is quite a task. Some will do it. Most will rely on a few items highlighted in previous steps (amplify the bizarre, notice novelty, seek takeaways) and then fill in the rest of the picture for candidates based on what we think is likely. This means, realistically, that most of what we think of when we think of, say, Amy Klobuchar, is stuff we project onto her from our stereotypes and generalities. It’s worth double checking some of those assumptions with an honest, open inquiry if there’s an opportunity to do so.

Related biases: Anthropomorphism, Apophenia, Argument from fallacy, Clustering illusion, Confabulation, Conjunction fallacy, Essentialism, Functional fixedness, Gambler’s fallacy, Group attribution error, Hot-hand fallacy, Identifiable victim effect, Illusion of validity, Illusory correlation, Implicit stereotypes, Insensitivity to sample size, Just-world hypothesis, Murphy’s Law, Pareidolia, Placebo effect, Prejudice, Recency illusion, Self-licensing, Stereotyping. Learn more…

Strategy 7: Favor the familiar

🧡 Helps us make sense by overvaluing things that you associate with (but undervaluing things that you struggle to associate with).

Election example: Kamala Harris is a California senator, and there’s a mural of her painted at my son’s elementary school (she attended the same one). Her name is familiar to me, and I’ve spent enough time hearing about her to know she has strengths and weaknesses, a track record, etc. The fact that I have greater resolution into her character gives her an advantage relative to lesser-known candidates. Think about which candidates you’re already familiar with, and know that you’re going to give them the benefit of the doubt more often than you give candidates you’re less familiar with.

Related biases: Armchair fallacy, Attribute substitution, Bandwagon effect, Cheerleader effect, Conservatism, Continued influence effect, Cross-race effect, Declinism, Defensive attribution hypothesis, Extrinsic incentive error, Fading affect bias, Fundamental attribution error, Halo effect, Hyperbolic discounting, Illusion of asymmetric insight, Illusion of external agency, Illusory superiority, Illusory truth effect, In-group bias, Law of the instrument, Mere exposure effect, Not invented here, Omission bias, Optimism bias, Out-group homogeneity bias, Pessimism bias, Positivity effect, Pseudocertainty effect, Reactive devaluation, Rosy retrospection, Sapir Whorf Korzybski hypothesis, Suggestibility, Ultimate attribution error, Well-traveled road effect. Learn more…

Strategy 8: Treat experience as reality

🧡 Helps us make sense by overvaluing stories that match your experience (but undervaluing stories that don’t match your experience).

The world is big, and there are many different ways people will experience it. The days of pretending that there’s a single unifying “common sense” that connects us all are over. Therefore, we can’t assume that people who come to different conclusions than us are by default less intelligent or less worthy of representation. And yet, we each exist within a dramatic narrative about the country — what’s good and what’s bad about it, what feels objectively true — and candidates will adopt these narratives to connect with you. But it’s worth remembering that we’ve filtered out a lot of information, and filled a lot of gaps in with our own perspective, and the reality we experience might not be what others experience. If you have even a 1% doubt about whether your experience is the reality for others, you can start with a question about other people’s experiences and avoid immediately escalating to battle.

Related biases: Abilene paradox, Affective forecasting, Bias blind spot, Curse of knowledge, Egocentric bias, Empathy gap, False consensus effect, Illusion of control, Illusion of transparency, Immune neglect, Impact bias, Moral luck, Naïve cynicism, Naïve realism, Pro-innovation bias, Projection bias, Self-consistency bias, Spotlight effect, Time discounting. Learn more…

Strategy 9: Simplify mental math

🧡️ Helps us make sense by overvaluing things of extremely high or low probability (but undervaluing things with probabilities in the middle of the spectrum).

We have a hard time working with fuzzy uncertainty, and tend to assume things are either “definitely going to happen” or “very unlikely to happen.” Trump won when predictions gave him roughly a 30% chance, and many people incorrectly assumed that meant the prediction had been wrong. It wasn’t necessarily wrong: unlikely things happen all the time. Polling and prediction are a complicated science, and we tend to take that information and collapse it down to absolutes anyway. When you feel your mental math taking on a particularly extreme position of 100% or 0%, know that you’re creating a blind spot for a possibility that likely exists. It might be worth restating the position in fuzzier terms.
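
If it helps to see that in numbers, here’s a minimal sketch in Python (the 30% figure and the whole simulation setup are illustrative assumptions, not anything from a real forecasting model):

```python
import random

random.seed(42)    # reproducible illustration

forecast = 0.30    # hypothetical "30% chance" forecast (made-up number)
trials = 100_000   # simulate many runs with identical odds

# Count how often the "unlikely" outcome actually happens.
hits = sum(random.random() < forecast for _ in range(trials))

print(f"The ~30% outcome occurred in {hits / trials:.1%} of simulated runs.")
# Prints roughly 30%: a single real-world occurrence doesn't prove the
# forecast wrong; that's exactly what a well-calibrated 30% means.
```

The exact numbers don’t matter; the point is that a 30% forecast is expected to come true about 3 times out of 10, so collapsing it to “won’t happen” is how the blind spot gets created.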

Related biases: Appeal to probability, Base rate fallacy, Denomination effect, Extension neglect, Hindsight bias, Masked man fallacy, Mental accounting, Money illusion, Neglect of probability, Normalcy bias, Outcome bias, Planning fallacy, Subadditivity effect, Swimmer’s body illusion, Telescoping effect, Time-saving bias, Zero sum bias, Zero-risk bias. Learn more…

Strategy 10: Be overconfident

🧡 Helps us make sense by overvaluing our ability to control everything (but undervaluing the interdependent nature of things).

Imagine you had all the information you needed to make the objectively best choice about who to vote for (maybe you are a time traveler and have seen the future timelines of every possibility). Now, compare that to what you actually know. Our biases for overconfidence are there to prevent us from hesitation and delay, but one misconception about uncertainty is thinking that you need to have high confidence in order to act. In cases where action is required, like an election, or with an urgent decision, it’s okay to admit that you’re acting with low confidence and to still remain open to new information and changing your mind as long as possible. Hedging bets by supporting multiple candidates is one way to act without deciding who is your top pick.

Related biases: Barnum effect, Dunning-Kruger effect, Hard-easy effect, Lake Wobegon effect, Overconfidence effect, Restraint bias, Risk compensation, Self-serving bias, Social desirability bias, Third-person effect, Trait ascription bias. Learn more…

Strategy 11: Stick with it

🖐️ Helps us get things done by overvaluing the status quo (but undervaluing a shift in direction).

The drive for answers, productivity, and growth pushes us in the direction of convergence (narrowing down), and is a lot less interested in divergence (expanding possibility). This is 10x truer when we’ve already converged on an answer to something ambiguous like “who should I vote for?” Once we settle on a position, these biases kick in to keep us committed to that position and discourage us from re-opening the question. One way around this is to delay convergence until you absolutely need to act on a single option. Another is to practice re-entering an undecided phase about things that are less intimidating: trying a food that you think you don’t like, asking for the best recommendations in a genre of music you typically don’t like, etc. Imagine scenarios where your favorite candidate is eliminated in the primaries, and walk through the process of reconsidering one or more candidates you had previously decided not to vote for.

Related biases: Chesterton’s fence, Disposition effect, Effort justification, Endowment effect, IKEA effect, Information bias, Loss aversion, Social comparison bias, Status quo bias, Sunk cost fallacy, System justification, Unit bias. Learn more…

Strategy 12: Protect existing beliefs

🖐️ Helps us get things done by overvaluing winning (but undervaluing truth).

These biases become active when we put winning over being right. There are some situations where this is extremely valuable — for example, when fighting for survival. If a tiger is chasing you, it doesn’t matter whether the tiger is right to think you’re its lunch, because you’d rather live! If two candidates are debating, your candidate’s success will impact your well-being and chances for survival (especially when issues of health care, women’s health, and gun control are on the ballot), but the downstream impact of a particular candidate on those policies is probably less certain than it feels in the moment. We already know that overconfidence will increase our certainty that a threat will lead to negative outcomes. All the more reason to take steps to be open to being wrong, even if it means losing the current argument. Reducing the time and energy it takes to correct incorrect beliefs is a better way to run from the tiger in today’s world.

Related biases: Backfire effect, Belief bias, Bucket error, Choice-supportive bias, Confirmation bias, Congruence bias, Escalation of commitment, Law of narrative gravity, Observer-expectancy effect, Ostrich effect, Post-purchase rationalization, Reactance, Selective perception, Semmelweis reflex, Subjective validation. Learn more…

Strategy 13: Do the safe thing

🖐️ Helps us get things done by overvaluing agreement (but undervaluing criticism).

The safe thing is to vote how everyone around you votes. Nobody can call you out for doing that, even if the consensus leads everyone in the wrong direction in hindsight. These biases are all about going with the flow, prioritizing not upsetting the flow over contributing to the exploration and discovery of better answers. You can do better than that.

Related biases: Ambiguity effect, Authority bias, Automation bias, Law of triviality, Less-is-better effect, Occam’s razor, Rhyme as reason effect. Learn more…


How to Apply All of This to Real Life

I’ve talked to a lot of people about this and have found that the main challenge isn’t about information recall. It’s about shifting the way you think about blind spots and bias from “1-time quick fixes” to “always-on repair and maintenance”. This is a tough switch to flip in our heads because, as you’ve seen from the strategies, we overvalue easy takeaways and undervalue inconvenient truths. You can see the same pattern with dieting and exercise. The best way to be healthy is to eat whole foods, exercise frequently, get enough sleep, and be kind to yourself and others. But that’s not the answer we want! We want a new gimmicky diet and exercise plan that will provide instant results, and we keep falling for marketing that promises these things even though we know they don’t really exist.

Step 1: Opt-in to honest bias

Applying “honest bias” requires us first and foremost to wake up to our own blindness and to stop trying to pretend it doesn’t exist. This means letting go of the false belief in permanent fixes and being open to answers that aren’t in the initial set of easy answers you had in mind. Only you can decide if you’re up for the challenge of taking it on.

Step 2: Observe (beginner level)

Take steps to reduce the amount of time and energy you spend trying to hide or ignore your biases and blind spots.

It’s easier to see that we have blind spots in isolated moments than to always remember this in our day-to-day. The 13 strategies are useful ways to overcome the 3 conundrums, but they systemically undervalue the following things:

  • options that aren’t elevated by the current context
  • options that don’t readily come to mind
  • options that blend in and don’t squeak
  • options that have been available for a while
  • options that don’t feel immediately relevant
  • options that don’t match perceived stereotypes
  • options that we don’t identify with
  • options that don’t match our personal experience
  • options that aren’t certain
  • options outside our sphere of control
  • options that require changing course from previous decisions
  • options that challenge our existing beliefs
  • options that require challenging consensus

These undervalued options accumulate over time. It’s very likely that genuinely good options have been prematurely routed to one of these 13 undervalued pools. If you revisit big decisions from the past, you may see in hindsight that an option which was actually valuable had been dismissed for one of these 13 reasons. Many times, it’s other people who are adversely affected by these blind spots, but sometimes you suffer too. All that’s required for this step is to be aware that this happens and to come up with some way to remember it. Two ways to do this:

  • Create an always-on reminder system. We can design the environment around us to consistently re-orient us towards blind spots instead of away from them. This can be a repeating calendar event, a custom phone lock screen, a poster on a wall you see a lot, etc.

  • Broaden your social circle. Each of us has our own set of natural biases that systemically overvalue and undervalue information based on our own motivations. If our social circles are diverse, no single blind spot will be shared by everyone, and someone in the group can bring it to our attention, as the rough sketch below illustrates.
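
Here’s a tiny back-of-the-envelope sketch in Python of that idea (the 80% miss rate and the independence assumption are made up purely for illustration):

```python
def chance_everyone_misses(p_miss: float, group_size: int) -> float:
    """Probability that a given blind spot goes unnoticed by every member,
    assuming (unrealistically) independent and equally likely misses."""
    return p_miss ** group_size

# Hypothetical numbers purely for illustration.
for n in (1, 3, 5, 10):
    print(f"group of {n:2d}: {chance_everyone_misses(0.8, n):.1%} chance nobody catches it")
# group of  1: 80.0% ... group of 10: 10.7%
```

The exact numbers aren’t the point; the point is that each additional, genuinely different perspective makes it less likely that any one blind spot survives untouched.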

Step 3: Repair (intermediate level)

Take steps to reduce the time and energy it takes for you to identify and begin to repair inadvertent damage caused by your biases and blind spots.

When your always-on reminder system and/or your broadened social circle calls attention to a potential area of blindness or inadvertent damage caused by biases and blind spots, you commit to acting on it as authentically and spontaneously as possible.

This is all about practicing the art of revisiting options that have been inadvertently missed, ignored, or dismissed.

Here are 13 questions that can help us peek into our own blind spots:

13 BLIND SPOTS TO QUESTION:

  1. Out of context: What have I missed because options were hidden by my particular circumstances and context in the moment?
  2. Out of mind: What options have I not considered because they just didn’t come to mind at the time?
  3. Lackluster: What have I missed because something else immediately grabs my attention when I think about this?
  4. Expired: What options have I neglected because they didn’t present themselves as shiny and new?
  5. Irrelevant: What have I undervalued because it didn’t fit my expectations as a proper takeaway?
  6. Untypical: What options have I not seen in their true light because I projected stereotypes and generalities onto them?
  7. Unfamiliar: What have I passed over simply because it didn’t feel familiar to me?
  8. Unrelatable: What options have fallen by the wayside because they didn’t match my own personal experiences?
  9. Ambiguous: What options have I dismissed because they felt less certain or more risky in the moment?
  10. Underestimated: What have I ignored or neglected because I overestimated my ability to control certain situations?
  11. Costly: What options have I dismissed because they would require changing course from previous decisions?
  12. Threatening: What have I dismissed because I wasn’t ready to accept that I might be wrong?
  13. Unpopular: What options have been pushed aside because I felt going against the consensus would put me at risk in some way?

Step 4: Normalize (advanced level)

Take steps to reduce the time and energy others have to spend challenging your blind spots and recruiting you to address damage that you’ve contributed to.

If you can get to the point where you’re able to respond in a healthy way to incoming information that reveals your blind spots, then this step is all about turning that reactive stance into a proactive one. Instead of relying on reminders and a broadened social circle to bring this information to you, you can go out and seek it directly. This only works if you have enough surplus attention and energy to repair damage from blind spots and biases at a higher rate than before, so being mindful of that is critical.

  • Create neutral spaces for productive disagreement to become a cultural norm. This means identifying difficult problems, building relationships with people and ideas that are volatile by nature, and helping them fuse into a cultural norm that can sustain itself over time.

  • Normalize honest bias: Help make the idea of honest bias the default response to systemic problems. Instead of trying to avoid, ignore, or permanently fix these problems, introduce the idea of always-on maintenance and repair to every niche discipline.


Some Things to Help You Remember

If you decide to go with this roadmap, here are a few ideas for how to get it to stick.

Learn about my book: This article is an adaptation of ideas that are more fully explored in my forthcoming book titled Why Are We Yelling? The Art of Productive Disagreement. Developing honest bias is 1 of 8 “things to try” that I explore in greater depth in this book. It’s available for pre-order and will be published on November 19th, 2019.

Bookmark these references: Additionally, I’ve created a small website and mobile-friendly resource for those who’d like a way to look up one of the 13 strategies or 180+ biases at any time. Check it out.

Get notified: For bias and disagreement-related news, subscribe to my newsletter.
