Last week I was tagged several times in the comment section of a Facebook post I didn’t remember liking. Puzzled, I scrolled through the comments and realized that people were repeatedly tagging me to demand justification for why I had liked the post. A woman had asked whether it was important to preserve women’s spaces in efforts to de-gender bathrooms. I vaguely remembered thinking this was an important topic presented in an open-minded way, so I liked it. This, apparently, was a transphobic thing for me to do, and the woman who posted the original comment was deemed even more transphobic. The commenters were excoriating her. No one asked this woman about her question, her perspective, her experiences. No one responded to her honest and humble inquiry with empathy or respect. They just jumped down her throat with accusations of prejudice and evil. She commented “BURN THE WITCH” and left the group.
We live in what has been shown to be one of the most polarized eras in American history, and pretty much everyone knows it. CBS Sunday Morning, PBS, and MSNBC have all aired recent segments on the topic. The Pew Research Center, one of the highest-quality institutes for polling and statistics on American public life, calls the growing gap between liberals and conservatives “a defining feature of American politics today.” In 2014 the center reported that Republicans have moved right (92% now sit to the right of the median Democrat, up from 64% in 1994) and Democrats have moved left (94% sit to the left of the median Republican, up from 70% in 1994). In February 2019 Pew reported that Republicans and Democrats have grown significantly further apart on what the nation’s top priorities should be.
But we are not just polarized. We are also obstinately and angrily entrenched in our viewpoints, exhibiting what Pew calls a “substantial increase” in partisan animosity. In each political party, the share with a highly negative view of the opposing party more than doubled between 1994 and 2014. Most of these people believe the opposing party’s policies “are so misguided that they threaten the nation’s well-being.” It is not just difference we now experience but contempt, and it is nearly identical on both sides of the aisle. “Very unfavorable” views of the opposite party rose from 16% to 38% among Democrats and from 17% to 43% among Republicans. These figures were gathered in 2014, when Obama was president and Republicans might be expected to harbor the stronger partisan antipathy. Under Trump today, Democratic antipathy has grown to 44%, nearly equal to Republican antipathy under Obama.
Political behavior is no longer thought to be motivated by loyalty or hope; it is instead mostly motivated by what political scientists Shanto Iyengar and Masha Krupenkin describe as “out-group animus.” Alan Abramowitz and Steven Webster simply call it anger. In 2012, 33% of Democrats and 43% of Republicans were angry at the other party’s candidate; in 2016, those levels rose to 73% and 66% respectively. What I saw and experienced last week was not an exception. It was the rule.
One of the most commonly diagnosed culprits for this shift is social media. Two particular phenomena are often highlighted in the discussion: filter bubbles and echo chambers. The term filter bubble refers to the ways in which our preferences shape our online experiences. Artificially intelligent algorithms quickly and efficiently figure out what we like based on the words we search and the content we engage with; they then provide experiences to us that are tailored to our own filters. The term echo chamber refers to the back and forth that happens between like-minded people in messages and comment sections, reinforcing mutually shared views.
But a recent study at the Oxford Internet Institute (corroborated by an independent report from the Knight Foundation) has demonstrated that we are actually not as isolated on social media as was once thought. According to its authors, Grant Blank and Elizabeth Dubois, our fears about filter bubbles and echo chambers are exaggerated, and may actually be unfounded. People encounter different perspectives in their social media feeds on a regular basis—in fact, perhaps more often than before the rise of social media. And people do apparently attempt to rein in their own biases by seeking out multiple sources of information. According to the study, only eight percent of UK adults are predicted to be at risk of being in an echo chamber, a statistic the authors see preliminarily mirrored in their USA data.
The problem isn’t necessarily that we aren’t exposed to alternative viewpoints. We clearly are. All of us know from personal experience that this is at least a little bit true. We all encounter difference on our feeds: we have in our circles people we knew a long time ago, friends “from back home,” casual acquaintances we met at a conference just once or twice. We know different opinions exist, and we encounter them on a regular basis.
The real problem isn’t lack of exposure to different viewpoints. It’s that we don’t listen.
We encounter people who are different from us all the time. But we don’t listen to them. We don’t set aside our immediate reactions; we don’t try to understand their perspectives; we don’t ask questions for clarification before making assumptions. Benefit of the doubt and intellectual humility have all but vanished from our collective psyches. But why? Why don’t we listen?
There are many reasons we don’t listen. Social media is still implicated. Our education is implicated. Our parenting styles are implicated. And most certainly basic group psychology is implicated: humans are well-known for attaching self-worth and identity to that of a group, a phenomenon that makes us fiercely defend our own groups while denigrating others.
But there is another insidious factor not often discussed. It is subtle, stealthy, elusive, hard to detect and hard to address. But it is immensely important for understanding our current cultural moment and what we can do to address it. It is that we are desperately attached to certainty, while at the same time living in the most uncertain era in human history.
I don’t have the space in this article to enumerate all the ways in which our society is more uncertain than any other society, past or present. But just imagine briefly the contrast. Humans were once tribal hunter-gatherers who cherished inherited narratives, favored the memorization of facts and stories, and had little impetus to question metaphysical or epistemological presuppositions. Today the picture is different: writing has been invented and disseminated; philosophy and science have developed into widespread institutions that encourage systematic analysis and doubt; the media saturates us with threats to our safety; and the world has technologized and globalized such that we are in close contact with millions of other ways of making sense of things. We sit with far less presumptive ease about our places in the world than we once did. One way to defend ourselves against this deeply felt and pernicious uncertainty is to close our minds and entrench ourselves in our worldviews as vociferously as possible.
While it may seem crass to assert that our current partisan crisis is caused by something so intangible as the status of certainty, it is actually anything but. In 2016 Archy de Berker and colleagues published the results of a study of uncertainty: participants played a computer game in which they had to overturn rocks, and some of the rocks had snakes under them. When there was a snake, participants received a shock. When there wasn’t, nothing happened. A sophisticated algorithm was built into the game that could make the appearances of the snakes more or less predictable for the participants. Based on a wide variety of measurements of stress (skin conductance, cortisol levels, pupil diameter, subjective reporting), the researchers determined that the greatest level of stress occurred not when participants could anticipate that a shock was coming, but rather when they had the hardest time predicting whether a shock was coming or not. It wasn’t pain but uncertainty about pain that stressed people out the most.
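The study’s central finding—that stress tracked unpredictability rather than pain itself—mirrors a basic property of information theory: a binary outcome is hardest to predict when its probability sits near 50%, not when it is certain. The sketch below is my own illustration, not the researchers’ actual model; it uses Shannon entropy as a stand-in for how irreducibly uncertain a given shock probability is.

```python
import math

def outcome_entropy(p_shock):
    """Shannon entropy (in bits) of a Bernoulli outcome.

    This quantifies how unpredictable a shock is at a given probability:
    0 bits means the outcome is fully certain, 1 bit means a coin flip.
    """
    if p_shock in (0.0, 1.0):
        return 0.0  # certain outcomes carry no uncertainty
    q = 1.0 - p_shock
    return -(p_shock * math.log2(p_shock) + q * math.log2(q))

# Unpredictability peaks when a shock is a coin flip, not when it is guaranteed.
for p in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"p(shock) = {p:.2f} -> uncertainty = {outcome_entropy(p):.2f} bits")
```

On this toy measure, a guaranteed shock (p = 1.0) carries zero uncertainty, while a 50/50 shock carries the maximum—which is exactly the condition under which de Berker’s participants showed the most stress.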
Psychologists have long known how potent uncertainty can be. In 1972 Jerome Kagan argued that uncertainty resolution is one of the primary determinants of human behavior. It is intrinsic to human functioning, said Kagan, to be averse to uncertainty and do everything possible to avoid feeling it. This is simply, said Kagan, because humans crave the comfort and safety that come from knowing the shape of their world.
This idea has some truth to it, though more recently a leading social psychologist named Arie Kruglanski has added an important level of nuance to the conversation (2004). It isn’t necessarily intrinsic to humanity to prefer certainty at all times, says Kruglanski. The preference is, instead, highly conditioned by cultural context and personal circumstance. Indeed, according to Kruglanski, people exist across a wide spectrum with respect to what he calls the generalized need for certainty: at one end of the spectrum, people need certainty, and at the other, people need uncertainty.
The extent to which we need certainty is influenced by both long-term and short-term conditions, so people tend to be generally situated somewhere along the spectrum but can shift based on the more immediate moment. We all tend to be pushed from where we are towards needing more certainty when we are unsafe and need to make a plan, when we have a lot on our minds we need to figure out, when there is time pressure, or when we are tired or otherwise emotionally or energetically low on fuel. Pretty much any circumstances that take up emotional, mental, or physical resources can tilt us towards closing off uncertainty more quickly than we would otherwise. These facts are most certainly worth paying attention to in today’s Xanaxed and coffee-addicted hustle.
Long-term conditions, such as how much a society values precision and answers over ambiguity and open-mindedness, also play a significant role. Sociologist Donald Levine has conducted analyses of various cultures’ attitudes towards ambiguity and concluded that the modern West is uniquely interested in precise and certain answers. Levine demonstrates, for example, that in Java communication “that is open and to the point” is rude (1985: 23), and that the Somali language encourages and permits a richness of metaphors and poetic allusions that can be interpreted any number of ways by the receiver (24). Even in the West before the seventeenth century, says Levine, human affairs weren’t necessarily approached with precise propositions but were best understood in terms of experience, travel, and worldliness; this may not have erased the West’s obsession with absolute truth, but it may very well have meant the psychological impact of ambiguity was not so harshly felt as it is today.
Because make no mistake: the West is unequivocally and relentlessly obsessed with certain, absolute truth, and it has been for hundreds if not thousands of years.
In 1641, René Descartes published his Meditations on First Philosophy, arguably one of the most influential philosophical texts in Western history. One thing that was unique about the Meditations was the lengths to which Descartes was willing to doubt everything he had ever known. In this text Descartes famously invented a hypothetical all-powerful demon that attempted to deceive him at every turn. Descartes discarded—or at least thought he discarded—every single assumption and presupposition he had ever held, so that he could construct with genuine confidence a belief system that consisted only of certainly true beliefs.
Descartes sought absolute truth. Everything had to be doubted so that certain truth—the real goal of inquiry—could be achieved. Ultimately, he was able to keep absolute truth alive with the idea of an omniscient God, which is actually where the West’s obsession with truth comes from in the first place. We could know things, said Descartes, because we have a link to God, and God knows all things. God is all powerful, all good, and all knowing. The world can be known, if only through the perfect lens of God’s vision.
Philosopher Donald Crosby describes the Western view of truth as one that is precise, perfectionist, absolutist, and really only possible if we can have what he calls a “God’s-eye” view of reality (1988: 137). The standards of this view are incredibly high: truth can only be considered truth if it is perfect—as if it is being looked upon by an omniscient God. Knowledge is absolute, certain, and grounded in God’s infinite power and wisdom. Up until about the time of Descartes, thinkers knew that their knowledge was flawed because they were simple, sinful humans contingent upon God. But whatever access to truth they did have, they had through God. God was the anchor of truth, and in being so, the anchor of certainty. The West has been spiritually and emotionally reliant on God’s perfect truth for millennia (for more see Nicholas Wolterstorff 1984: 30); Descartes is just one example.
But there is a problem with this reliance: it is reliance. Take away the anchor and the whole web suffers. As philosophy began to part ways with the dogma promulgated by the Church in the sixteenth and seventeenth centuries, people began to shed their belief in an omniscient God. Modern philosopher Mary Midgley calls the process God surgery (2002); God was literally excised from Western culture. As this happened, the webs of meaning, experience, and solace that Western thinkers had hooked onto God suddenly frayed, lacking an anchor. People suddenly had to look elsewhere for a ground to the knowledge whose certainty had always been presumed. Rationalists such as René Descartes, Gottfried Leibniz, and Baruch Spinoza sought foundations in deductive reasoning. Empiricists such as John Locke, George Berkeley, and David Hume sought foundations in the sensory apprehension of the world.
In 1781 Immanuel Kant argued that even if you perfectly apprehend the phenomena of the world, you can never know for sure whether they correspond to the actual reality—the noumena—of existence. Basically, you can’t know reality no matter how much you want to. Since then no one has ever really gotten around the problem. Philosophers have continued trying to figure out how to properly interpret the world, how to read texts, and how to conduct science—and sure, various answers to these questions have been proposed, and some people have even claimed they’ve discovered firm foundations to knowledge—but by and large, the quest for firm foundations to knowledge failed over and over again. Somewhere in the mid-twentieth century philosophers began to take very seriously the possibility that they may have been on a fool’s errand all along.
Thus rushed in the era of postmodernism, which has been very contentious, but which has also made some excellent points about the perspectival and political nature of knowledge. “Truth” is deeply culturally and psychologically embedded. Foucault used the phrase “power/knowledge” to demonstrate just how inseparable power and knowledge actually are. Pretty much everything we think we know, we know only because of our specific context and our specific position within social and political hierarchies. Our own perspectives and interests are so inescapable that we can never actually be objective, and maybe the quest for truth is nothing more than a quest to enhance our own power anyway. Some people think science is the answer to this problem of subjectivity, and in many ways it is about the best we can do. But it doesn’t provide a final solution. Objectivity is dead, and relativism reigns.
In 1983 philosopher Richard Bernstein coined the term “Cartesian Anxiety” to describe a longing for firm foundations to knowledge that don’t actually exist. More recently a scholar named William Stahl has demonstrated precisely how this can affect us by applying the concept to the entrenched battle between New Atheists and Christian Fundamentalists.
These two groups, says Stahl, may seem very different on the surface. But both New Atheists and Christian Fundamentalists are characterized by a significant need for the authority of truth. For the Fundamentalists, this authority comes in the form of the Bible. For the New Atheists, it’s science. These are very different epistemologies, but attached to each of them is an entrenched defensiveness. Subconscious knowledge that your vision of an ordered world is subject to attack can lead to a deep underlying anxiety that manifests as bellicose rage. Cartesian anxiety can also be applied to modern liberals and conservatives. Partisan animus, in this sense, is a defense against instability, ambiguity, and the lack of control over cultural ideas and norms.
All in all, the death of truth has not landed on us lightly.
But we are not without hope.
As problematic as our entrenched and deep needs for certainty are, we know from Kruglanski’s work that they can be mitigated. We can actually reduce people’s need for certainty by adjusting both their short-term and long-term conditions. In the short term, we can work towards many changes that are crucial though regrettably outside the scope of this article: create a world that both is and feels safer with safety nets, honest governments, media accountability, environmental care, and international cooperation, to name a few.
In the long-term, we can shift our culture to value intellectual humility, open-mindedness, and even uncertainty itself—to love and respect the phrase “I don’t know”—so much that we have deeply embedded at the root of our mentality and relationality a powerful inclination to refrain from tribalistic close-mindedness.
Let’s get rid of absolute truth. It’s dead anyway; all we need to do is accept it. And if we do, we will be able to understand without fear or resentment that all of our positions are ineluctably incomplete. We are perspectival. We do know truths only through our own worlds and experiences. Postmodernism in this sense is certainly right, and we must listen to it.
But this doesn’t mean we have to erase truth altogether. It simply means we must understand that we only have access to pieces of it, and that we can only get closer to the real truth by asking questions, exercising empathy, and seeking to grow together in global community. So far as scholars can best tell, even after postmodernism, the world remains intelligible (Crosby 1988: 366). It is knowable. We simply only have access to slices of it, and need other slices from other people and sources of information if we are going to actually find real solutions to real problems and answers further along on the road to truth. We can have conversations with one another from our unique positions if only we turn to available evidence, discuss it rationally, acknowledge our own limitations, and proceed with a quest for truth in mind.
Our angry partisanship is in part a reaction against the crumbling of our claims to truth and authority. We don’t like it, so we are rejecting it violently and covering it up with bravado. But who are we helping when we do this? What cause are we serving? Most people quite obviously believe that they are serving a great cause when they shout at others. If they believe in climate change, or the importance of Black Lives Matter, or in cutting taxes, or anything else, and they encounter people who don’t, they’ll probably feel justified, even important—socially significant—if they try to teach the nonbelievers a lesson. But minds are rarely changed by shouting. Minds are changed by finding common ground, by experiences of mutual respect, and by taking an attitude of seeking truth together (Shelton and Rogers 1981, Perloff 1993, Limbu 2015, Dunn and Goodnight 2016), as opposed to bludgeoning one another into submission.
If we can manage to change our vision of what truth is and how we can access it, then we’ll no longer have unrealistic expectations about what it can provide us. We won’t have to seek solace in being absolutely correct and smashing our ideological enemies; we’ll be able to seek solace in the connections we can make and the shared humanity we can experience in endeavoring to find answers together. Instead of fighting for dominance in the question of who is right, we can relax with mutually shared humility in front of the enormity of our quest for a better world. Human love, fraternity, community—these are the things that can be gained by letting go of our attachments to our own sense of rightness and authority, all on top of having a better understanding of the world, of truths, and of the people who believe them.
Is shifting culture in this way easier said than done? Yes. But Kruglanski’s work has demonstrated that culture matters. It matters because values matter: those who uphold the “cool” values are deemed more socially significant, and social significance is arguably the most basic human priority (###). So let’s make intellectual humility, empathetic dialogue, and a willingness to be unsure cool again. Speak up about what it means to have valuable discourse and why it’s important. Make friends not on the basis of similar views but on a willingness to talk about them respectfully. Point out how small, close-minded, and unproductive it is to lead with shouting instead of listening.
We live in arguably the most uncertain era in human history. This has the potential to wreck us if we don’t handle it well. But it is also, in a sense, the most honest era of human history, because we are being forced to confront the limits of our knowledge. It’s not that truth itself has died; it’s that our epistemological hubris has, or needs to. We may be burdened with uncertainty, but this is also a fantastic opportunity. If we can learn to live without dogmatic adherence to our own beliefs, we will be able to survive our modern crises, and maybe even grow beyond them.