We have the best ideas.
I mean, okay, actually, I have the best ideas.
Unfortunately, that’s sort of the way our minds work: when we believe ourselves to be right, we seek out information that, consciously or not, affirms our ‘rightness’, even when failing to check that belief against reality could be, in an advocacy arena, fairly epically bad.
That’s one of the most alarming insights I gleaned from Decisive: confirmation bias means that even our most diligent research may fail to illuminate weaknesses in our proposed policy solutions, or even in our framing of the problem, because we’re wired to discount that which disagrees with our way of seeing the world, and to home in on anything that affirms it.
Today’s patterns of media consumption, of course, accelerate and exacerbate this.
In my own life, I start my mornings with NPR streaming on the treadmill, see print stories specifically selected by my Facebook friends over breakfast, and scan through blog posts highlighted by the people I follow on Twitter, all sources explicitly selected by me because they echo my concerns.
I think we mostly know this, by now, but what struck me from Decisive is that, even when we think that we are intentionally accounting for this, we’re still not very good at overcoming confirmation bias.
Just knowing that we have this tendency does not, in other words, protect us.
And, of course, we’re not the only ones thus susceptible; those we are trying to convince/lobby have their own confirmation bias at work, and it influences how they experience the arguments we present, as well.
Not incidentally, confirmation bias is particularly a concern for folks like us, since it tends to be the strongest in emotion-laden spheres, including politics (p. 95), although, certainly, some high-profile failures suggest that even such ‘technical’ fields as engineering are not immune to the dangers of seeing things as you believe them to be, instead of how they really are.
But all is not lost.
What we need, in addition to this basic awareness of our vulnerability to confirmation bias (because it’s really not enough for us to just believe that we are right, however sincerely and vehemently we believe it), are concrete steps to counteract it, and to shape our advocacy so as to help overcome others’ confirmation biases, too.
Some ideas from Decisive that I think apply particularly well to policy advocacy:
- Intentionally reality-test our assumptions, ideally with some small-scale experiments
- Seek out partnerships and mentors with decidedly different ways of seeing the world, explicitly to challenge our thinking when necessary–I have seen, in my own work in immigration advocacy, how important this is, where our messages and tactics are markedly improved through our collaborations (delicate as they are) with business groups and others who approach immigration reform slightly (or more than slightly) differently
- Develop processes designed to lead us to the right questions–one of my favorites is a sort of counterfactual that asks ‘what would have to be true?’ for a given position to hold, or for a particular approach to be desirable. This can help us to explore alternative possibilities and test our own assumptions, but it can also expose ways in which slight changes in our factual assumptions could surface new options from which we can then choose (p. 100). For example, prior to the deinstitutionalization of people with mental illnesses, what would have had to be true for it to be possible to close most of the institutions serving them? It would have to be possible for people to manage their symptoms effectively with outpatient treatment. With the arrival of sophisticated pharmaceuticals, that set of facts emerged, and a radically new option became viable, in ways unimagined by those closest to the issue.
- Doubt our own knowledge and question our own process–what if we asked, as a part of any policy research, “What’s the most likely way I could fail to get the right information in this situation?” What if we used this same thinking to point out to policymakers (gently) that they may not be getting the information they need, either, as a way of easing the path towards their acceptance of some of our information, over the objections of their own confirmation bias?
Where do you see, once you’re looking for it, confirmation bias in your own policy advocacy? What alternatives do you disregard out of hand, because they don’t fit your way of seeing the world, or at least your issue? How do you account for this tendency in your own analysis? How do you break through others’ confirmation bias, in your advocacy?