We aren’t treating tribalism as a basic human drive, but that’s what it is. Fast food lowered the cost to satisfy a basic drive, and we grew fat. Social media lowered the cost to exhibit tribal behaviors, and we are growing apart.
... with political scientist Lilliana Mason and psychologist Dan Kahan, two researchers exploring how our tribal tendencies are scrambling public discourse and derailing so many of our best efforts at progress — from science communication, to elections, to our ability to converge on the facts and go about the grind of building a better democracy.
... we explore how some incorrect beliefs can be changed with facts alone, with evidence. For those kinds of beliefs, like “It’s going to rain on Sunday,” when we learn new information, we update our priors.
To manage our beliefs in this way is to think, as they say in some circles, like a Bayesian, a term that doffs its hat to the 18th-century statistician Thomas Bayes, who used pen and paper to scribble out a formula describing how that kind of reasoning works. To think like a Bayesian, you imagine your beliefs as percentages of confidence instead of as simply true or false. So, instead of saying “I believe my hamster is alive and well,” you would say, “I am 70 percent sure that my hamster is alive and well, based on the evidence available to me at this time.”
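To make that concrete, here is a minimal sketch of a Bayesian update in Python, using the hamster example with made-up numbers (the 90/10 likelihoods below are hypothetical, chosen only for illustration): a prior confidence is combined with the strength of a new piece of evidence to produce a revised confidence.

```python
def bayesian_update(prior, likelihood_if_true, likelihood_if_false):
    """Return the posterior probability of a belief after new evidence.

    prior: confidence in the belief before the evidence (0 to 1)
    likelihood_if_true: chance of seeing this evidence if the belief is true
    likelihood_if_false: chance of seeing this evidence if the belief is false
    """
    numerator = likelihood_if_true * prior
    denominator = numerator + likelihood_if_false * (1 - prior)
    return numerator / denominator

# Hypothetical numbers: you start 70 percent sure the hamster is alive and well.
# Then you hear the wheel squeaking -- very likely if it's alive, unlikely if not.
posterior = bayesian_update(prior=0.70, likelihood_if_true=0.9, likelihood_if_false=0.1)
print(f"Updated confidence: {posterior:.0%}")  # roughly 95%
```

The point isn’t the particular numbers; it’s the shape of the move: new evidence shifts your confidence up or down by a principled amount, rather than flipping a belief wholesale from true to false.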
If we were driven by the pursuit of accuracy above all else, Bayesian reasoning would be how we updated all of our beliefs, but we aren’t and it isn’t. That’s because humans are motivated reasoners. We interpret facts in ways that best meet our goals, and our goals are not always the pursuit of the truth.
In professional domains like medicine, science, academia, and journalism, people are trained to pursue accuracy, to operate within a framework that helps them overcome other motivations. But we are not always motivated by such empirically lofty goals. Outside of fact-based professions, we are often more motivated to maintain our social support networks, or prevent the decoherence of our identities, or keep our jobs, our family bonds, or our ties to our churches. If being wrong about climate change or the moon landing or gun control is what it takes, that’s an acceptable price to pay to reach those goals.
... the latest evidence coming out of social science is clear: Humans value being good members of their tribes much more than they value being correct, so much so that we will choose to be wrong if it keeps us in good standing with our peers.
Once an issue becomes politicized, it leaves the realm of evidence-based reasoning and enters the realm of tribal signaling. Tribal signaling has always been a challenge to progress, but the power of modern media, and modern social media in particular, has allowed humans to signal their tribal loyalties on a scale that has never, ever been possible, and this one thing might just be what is driving our intense, modern polarization problem.
Compromise and agreement on policies and laws and decisions and judgments and notions of what is and is not true will naturally become more and more difficult as our ability to signal to others which tribes we belong to increases.