There’s a brilliant piece by Ezra Klein on Vox that discusses the problem of “identity-protective cognition,” or reasoning in a way that bends the facts in front of you to fit your own worldview, and its effect on politics (“How politics makes us stupid”). Yale Law professor Dan Kahan explains: “As a way of avoiding dissonance and estrangement from valued groups, individuals subconsciously resist factual information that threatens their defining values.”
What’s perhaps most frightening about identity-protective cognition is that the smarter the person, the better they are at twisting the evidence in front of them to say what they want it to say. Let me repeat: being smarter actually makes you more susceptible to identity-protective cognition on the issues you care about. For example, Kahan ran an experiment in which participants were asked to look at some (fake) numbers about crime and gun control legislation. The numbers were set up so that the problem had a clear mathematical answer, and when the same numbers were framed as a problem about the effectiveness of a beauty product, unsurprisingly those with stronger math skills were more likely to get the right answer. However, when the word problem was about gun control, “Partisans with weak math skills were 25 percentage points likelier to get the answer right when it fit their ideology. Partisans with strong math skills were 45 percentage points likelier to get the answer right when it fit their ideology.” YIKES.
Obviously this creates a thorny problem when it comes to an evidence-driven search for religious truth, because if there’s one thing more polarizing and more likely to make you want to stick with your “tribe” than politics, it’s got to be religion. Klein sums up the problem we’re facing:
“If the work of gathering evidence and reasoning through thorny, polarizing political questions is actually the process by which we trick ourselves into finding the answers we want, then what’s the right way to search for answers? How can we know the answers we come up with, no matter how well-intentioned, aren’t just more motivated cognition? How can we know the experts we’re relying on haven’t subtly biased their answers, too? How can I know that this article isn’t a form of identity protection? Kahan’s research tells us we can’t trust our own reason. How do we reason our way out of that?”
So… how do we reason our way out of that? Klein concludes, “If American politics is going to improve, it will be better structures, not better arguments, that win the day.” Maybe that’s the right answer for politics, but I’m not sure what the parallel would be for religion. When I think of “structures” in the religious world, I think of religious institutions, which are more akin to political parties, inherently preaching one view over another, than to structures that enable rational, unbiased decision-making. If anyone has any ideas for structures that could help solve the problem of bias in the examination of religion – other than a blog that attempts to look at all sides – I’m all ears.