Just before his death in 1543, Nicolaus Copernicus argued that the Earth revolves around the Sun in a book called On the Revolutions of the Celestial Spheres. This moment is often heralded as the birth of modern science.
Now, theories and experiments that we would today likely recognize as scientific existed before Copernicus. There was, for example, the sixth-century monk Dionysius Exiguus, who studied and taught lunar cycles; there was also a group of fourteenth-century scholars known as the Oxford Calculators, who developed a physical principle called the Mean Speed Theorem.
But these proto-“scientists” differed from Copernicus in one important respect: they operated solely within, and subordinate to, the dominant ideas of their time. The monks who studied and established lunar calendars, for example, did so in order to compute the date of Easter and help distinguish Christianity from Judaism and paganism (McCluskey 1998: 85-87). The Oxford Calculators were only secondarily mathematicians; they were first and foremost theologians or religious officials. The most influential Calculator, Thomas Bradwardine, wrote treatises on Original Sin and viewed his mathematical studies as explorations of the world God created out of the “infinite void” (Kaiser 1997: 116). Before 1543, people who carried out experiments and analysis did so with inherited wisdom as a conceptual frame within which they ensured all of their results would cohere.
Copernicus was the man who changed all this. He tied an anchor to inherited wisdom and flung it into the deep, never to resurface again. He wasn’t trying to be a revolutionary. By all appearances he thought controversy was a bit of a drag, and he was on quite good terms with Pope Clement VII (Repcheck 2008).
But he nevertheless shook the world with a revolutionary act: he didn’t try to make his new idea fit with old ones. Copernicus did not prioritize tradition, harmony with doctrine, or believing what he’d been told. Instead, he prioritized mathematical elegance, theory, and simplicity. Much as Copernicus envisioned himself working safely within the parameters of his Christian world, what he was actually doing was discarding dogma as a first principle, and turning towards the authority of his own experience, observation, data, and mathematics.
In effect, he was inaugurating an age of science.
In the centuries following Copernicus, scientists (then called natural philosophers) continued to follow his lead, often in similarly spectacular, Earth-shaking ways. One such figure was Isaac Newton. Like Copernicus, Newton had no overt intention of disrupting the Church. He was a deeply committed if unconventional Christian who devoted at least half of his enormous body of 10 million written words to theology (Mann 2014). However, also like Copernicus, Newton both excelled at and had an affinity for mathematics. He believed that truth was to be found in “simplicity”—specifically the simplicity of mathematics unified with experimentation—and not “in the multiplicity of things” (Fragments from a Treatise on Revelation?). To this end he accomplished two remarkable feats: he formulated a law of universal gravitation, and he established the laws of motion. He was convinced that these breakthroughs left room for God in the system; he found solace in the fact that the laws described how the universe worked but not “who set the planets in motion” (Tiner 1975). But by forcing his religious convictions to conform to his calculations, and not the other way around, Newton followed directly in Copernicus’s scientific footsteps.
Another such revolutionary figure was Charles Darwin. Darwin’s relationship with Christianity was complicated. He identified as a member of the Anglican Church for some time and was once even destined for the cloth; his nominal religiosity, however, lasted only until the death of his daughter, at which point he renounced his faith (Moore 1989). It is not altogether clear what he thought of Christian dogma even before her death. In 1838 he wrote in one of his notebooks that belief in God is an evolutionary artifact—“love of the deity [is the] effect of organization, oh you Materialist!” (Darwin 1838: 166). Evidently, Darwin was happy to probe far beyond the strictures of faith; this helped him to develop a theory that neatly unified biological, geological, and ecological observation. But he was tortured by the power he knew this theory had, so much so that he was plagued with chronic migraines, vomiting, and anxiety for much of his life. No longer could people think of themselves as intentionally crafted by a supernatural personality who made them in His image. No longer could they imagine themselves and other creatures as intentional pieces of a divine plan. Evolution, in George Levine’s words, “notoriously decentered the human” (150). Darwin severed truth from the traditional worldview just as Copernicus and Newton had, this time by turning humans into apes.
Today, we celebrate Darwin, we celebrate Newton, and we celebrate Copernicus. We love their triumph and their rebellion. We envision them as courageous adventurers who dared to trust in what they observed and hypothesized. Consider, as an example, the Google Doodle that celebrated the 540th anniversary of Copernicus’s birth on 19 February 2013. In the synopsis of this Doodle, its artist Jennifer Hom writes that “his heliocentric theory rocked convention.” Forbes finds Copernicus worth celebrating for similar reasons: an article by Alejandro Jenkins on the site describes On the Revolutions of the Celestial Spheres as “a great conceptual leap, courageously made in the face of the common-sense view… in the face of the enormous prestige of Ptolemy… and in the face of Christian authorities.”
Copernicus dared to defy convention by prioritizing his intellect over received wisdom. Newton and Darwin did the same. So have many others—so have many thousands of others—in the intervening centuries. The result of their courage is a radically new world built on testable, empirical data. The result is technology, efficiency, progress. The result is having answers to our greatest questions and problems dance seductively at our fingertips. The result, it often seems, is truth.
The flip side
All of this may be well and good, but we misconstrue the scientific enterprise if we focus only on what is won, as opposed to what is lost. As science brings new knowledge into play, old knowledge goes out. Of course, this may seem a no-brainer: science establishes theories built on empirical observation. In doing so, it may dislodge various superstitions, faiths, cultural narratives, mores, or what scholar Wesley Wildman calls “cognitive defaults”—things it comes naturally for humans to believe.
We understand and celebrate that this was the case in specific instances with world-shattering figures like Copernicus, Newton, and Darwin. But we tend to underestimate how this process of dislodging and deconstructing is an inherent part of the scientific process as a whole. Science is as much about unlearning as it is about learning. Woven into the very fabric of science is destruction.
In this sense, the story of science—and perhaps even the story of all humanity—is the story of realizing that we were once wrong. History is littered with the carcasses of old ideas. Sometimes these ideas are of only marginal importance, as with, for example, the seventeenth-century phlogiston theory of air and combustion. Yet often—especially when we’re talking about large-scale, paradigm-shifting discoveries—these ideas carry great emotional weight. Often they are cornerstones of our wellbeing or sanity. Often they are the kinds of ideas on which we hang our salvation.
As human animals, it is a basic and necessary part of our being that we form attachments to ideas, such as about how the world works, or the foundations of a moral life. We cannot really help it. If we didn’t form these attachments, the vast majority of us would not be able to make sense of the world or to do so in ways that feel good or provide a semblance of order. This means that when we do not pay careful attention to the ways in which science burns through received ideas—which is actually all of the time—there can be significant psychological, sociological, or political fallout.
One of the clearest examples of negative political consequences arising from a scientific theory is the development of American fundamentalism in the wake of Darwinism in the late nineteenth century. Starting around 1870, roughly a decade after the publication of The Origin of Species, a group of American Protestants began to feel threatened by accelerating liberalization—that is, by the gradual encroachment of science and biblical criticism on their faith. Liberal churches (concentrated in the North) moved to interpret scripture less literally; conservative churches (concentrated in the South) recoiled at the idea of losing their station as God’s favored species on His chosen planet, and generally resisted the continual degradation of their faith by philosophy, textual criticism, and science (Woodberry and Christian 1998: 25-56). Importantly, political tensions between Northern and Southern churches ran high. Sociopolitical sources of conflict are not to be underestimated. But there were also centuries of deconstructive philosophy and science bearing down on people who were raised to prioritize scripture. Something had to give.
In 1910, it did. Protestantism fractured. Pastor Reuben Torrey edited and released a collection of essays under the name The Fundamentals and began to distribute them widely among American Protestants of all denominations. Torrey and others continued to disseminate this twelve-volume series over the next several years; eventually three editors, sixty-four authors, and three million volumes reached more than two hundred and fifty thousand religious professionals and communities. The essays were intended to promote biblical literalism and discourage external criticism, including any that might come from the sciences. In essence, they encouraged the prioritization of scripture and faith as conceptual schemes, much as the medieval thinkers before Copernicus had done.
In 1920, the Baptist editor Curtis Lee Laws coined and popularized the term “fundamentalism” to designate Protestants who were ready to “do battle royal” for the fundamentals (Laws 1920: 834; see also Straub 2018). William Jennings Bryan, a paramount example of one of Laws’s fundamentalists, went to court in 1925 to try to keep evolution out of schools in the famed Scopes trial. The rest, as they say, is history. Since Bryan’s time, America has been plagued by an increasingly polarized spiritual and political landscape.
I am not the first person to note the often devastating potency of Darwin’s theory of evolution. In 1995, philosopher Daniel Dennett published a Pulitzer Prize-nominated book titled Darwin’s Dangerous Idea: Evolution and the Meanings of Life. In it, Dennett describes evolution as a “universal acid”: a liquid so corrosive that it eats through anything with which it comes into contact, even a potential container. One such container could be Christianity, or dualism, or any traditional doctrine concerning who we are as a species or what the meaning of life is. Darwinism, writes Dennett, “eats through just about every traditional concept, and leaves in its wake a revolutionized world-view, with most of the old landmarks still recognizable, but transformed in fundamental ways” (1996: 63). Evolution makes it intellectually incoherent to believe in a literally interpreted Bible, to believe in a personal God, to think of humans as having souls or going to heaven. At least in the West, it corrodes traditional ideas and structures everywhere we look.
And yet, it is not evolution per se but science itself that drives this kind of destruction. As the process of science is ongoing, we have to move forward knowing that no particular idea we may hold or cherish is safe. Even ideas that many secularists or lovers of science hold dear are subject to erosion. In recent decades, for example, cognitive science has brought into serious question what exactly it means to be a “self.” Consciousness can be divided and reunited without the person experiencing it being any the wiser. This may sound obscure, but the problem is essentially that neuroscientists cannot seem to find any reason to believe that consciousness is one unified entity—that is, that the “you” that you’ve thought you were since you were a child actually exists (see e.g. Damasio 2010; Flanagan 2002). This can be a startling or even terrifying revelation. The wild popularity of Robert Wright’s book Why Buddhism Is True suggests that many Westerners are turning to Buddhism to cope with these changes science has forced on their worldviews. They may assert it’s for other reasons, but I would argue that they have been unknowingly pushed there by the incompatibility of their culture’s religious and philosophical milieu with science.
Depending on which side of history you’re on, a look at science from the time of Copernicus until now reveals one of two things: either a series of triumphs in the production of empirical knowledge, or a series of battles in the fight for traditional, cherished narratives and beliefs. In truth, it is actually both, and the lines are increasingly blurred as science continues to demonstrate that we were wrong about just about everything—not just God, but ourselves, and the universe, too. We will probably, as Thomas Kuhn has made abundantly clear, continue to find out that we were wrong about many things in which we have great confidence today. It is for this reason that we can best think of science as eroding more than it establishes. This is the nature and the beauty (if often also the tragedy) of the scientific endeavor: to question, to investigate, to churn through that which is known or hypothesized or assumed. Dennett called Darwinism the universal acid, but in reality the universal acid is science itself.
Kuhn was in a sense onto this when he published his seminal 1962 book The Structure of Scientific Revolutions. From this book, college freshmen all over the world learn that science advances through what Kuhn calls paradigm shifts. Science, Kuhn argues, is not a linear progression towards some transcendent, ultimate truth; it is much messier than that. Theories are tested, become established, are defended, and then eventually—often by the next generation of scientists—are discarded in favor of what seem to be increasingly accurate depictions of reality. But what we need to do, in our understanding of how popular culture interfaces with the sciences, is include cultural norms, mores, and ideas in our analysis. Science eats away at itself, and it eats away at that which is on its periphery—its horizon—not just venturing into the unknown or tweaking its own conceptions of what is known, but also deconstructing concepts we may hold dear, concepts on which we may hang the hat of our existential sanity: not just God and the afterlife but also free will, the self, absolute values, and morality.
Science opens up more questions than it closes. According to theoretical physicist Marcelo Gleiser in his book The Island of Knowledge, the Theory of Everything of which Hawking and Dawkins and Weinberg dream—in which humanity is predicted to come to a complete understanding of why and how everything in the cosmos works—is a fantasy. He calls it the “fallacy of final answers.” Rather than an expanding pool that eventually swallows the unknown until there is none left, knowledge is an island adrift in an ocean of infinite questions. As soon as science closes up one area of knowledge—at least for the time being—it opens up at least two more. When you discover a proton, for example, you don’t hit the end of the road; you face multiplying questions. What is a proton made of? How much does it weigh? How does it stay in such close proximity to a neutron in the nucleus of an atom?
Anyone who thinks that science is going to provide the same kind of fullness and complete answers that religion once did is kidding themselves; science has not delivered unto us an age of final answers. It has delivered us into an age of increasing ambiguity and uncertainty, and many of the things that have become uncertain are those that matter most: Who are we? Why are we here? What does it mean to live a moral life? These are questions that once had unambiguous answers. They served as pillars of what I consider our existential sanity. They were bedrocks of our understanding of who we were and how we fit into the world.
Science has severely disrupted our ability to have these kinds of robust answers, to relax into safety and certainty regarding our deepest concerns and most ultimate questions. If we do hold onto these answers—as fundamentalist Christians in the US do—then when we participate in discourse with secularists, we become social pariahs: defensive, embattled, and embittered. If instead we are able to let go of the kinds of firm foundations humans so naturally enjoy, we must constantly do damage control, making sure we have filled in the gaps in our metaphysical and emotional landscapes as best we can. It is not easy. And it is not over. Those of us who sit loftily alongside science without having been obviously disturbed by its destructive churning should not be so complacent. Just because we are fine now does not mean we always will be, nor does it mean that we haven’t been unknowingly affected by what it’s wrought (we have).
The problem is that science is indifferent to our humanity. We don’t do well with uncertainty, and we don’t do well when our beliefs (even perfectly “rational” ones) are challenged or displaced. But the data doesn’t care. We don’t get to hold certain notions just because we’d like to, or because they come naturally to us as human beings, or because we have been cultured to believe them. We only get to believe things if they make the most sense of our observations and corresponding theories. This is just the way things are today, and it has been this way ever since Copernicus took the first steps down this path five hundred years ago.
Science is ruthless. We may revere science as the wonder that brought us indoor plumbing and Prozac, and we may even appreciate its relentless rigor, but we also know, even if we do not know that we know, that science has robbed us of certainty, that science has irrevocably changed religion, that science has disrupted our ways of experiencing meaning and purpose, and that science has forced us to confront ourselves in our most naked humanity. These are not necessarily bad things, and I imagine many people reading this article (myself loosely included) would consider them good, but we must understand that they are powerful ones—ones that have caused great pain to many, inflicted deep psychological wounds on our culture, and opened great divides among now-polarized constituencies. We tend to believe that science is our great creator, and in many ways the source of our salvation. But science is at the same time the Shiva of the West. As science giveth, so it also taketh away.
With this view of science in mind, there are two important steps we need to take moving forward. The first is that we must work intentionally to take care of the people who hold dear the ideas that science displaces. This is necessary because while our attachments are sometimes inconsequential, at other times they most assuredly are not. If we had known, back at the turn of the twentieth century, how people form attachments to ideas and how to gently change them, American fundamentalism might not have become such a problematic part of the political landscape. Today, we still lack this ability. But we have the potential to develop it.
The first step towards developing systemic strategies for coping with change and attachment is to conduct more research into what precisely those strategies might be. How do we become as receptive as possible to self-criticism, to change, to instability, to evolving with the times? Our efforts should focus on two primary levels: the psychological and the sociological. We need to know in greater detail, psychologically, what humans require in order to relax attachments and become less dependent upon specific beliefs for emotional stability. We also need to know how such events have tended to unfold, so historical and sociological analyses will be important as well.
While developing more systemic approaches to this problem, we can still make an impact by integrating these insights into our personal behavior. We should move forward with more compassion and empathy for others, understanding how precarious worldviews are and what can happen to us when they are shaken.
Consider, as an example of what not to do, the work and sentiments of the famed New Atheist Sam Harris. Harris is deeply concerned about the political downsides of fundamentalism, so much so that he says things like: “The danger of religious faith is that it allows otherwise normal human beings to reap the fruits of madness and consider them holy” (2005: 73). But if he wishes to change the world for the better, should he actually use rhetoric like this? Not in the slightest. Calling a whole community mad usually doesn’t endear you to them or make them charitable to your arguments. Your opponents become defensive, angry, and only further entrenched in their viewpoints. And people on your own side become more vitriolic, disdainful, and further entrenched in theirs. All Harris does is create warzones. He adds fuel to the polarizing fire, and in this sense is a scourge on communication, exchange, and progress in the West.
Compassion, empathy, and listening, on the other hand, have been demonstrated time and time again to facilitate productive conversations (e.g., Shelton and Rogers 1981; Perloff 1993; Limbu 2015; Dunn and Goodnight 2016). When you get to know a person and understand where they are coming from, not only do they come to trust you, but they will listen to your ideas and suggestions more readily. The same of course goes for you: as you learn to trust somebody, you become more willing to hear their perspective and see how it might fit with your own. In any case, whether you are trying to persuade someone of something or not, the right approach is compassion. Let us not fight people who have been, in one way or another, victims of the zeitgeist. Let us instead help them, by being kind and attentive neighbors, providing educational resources, and offering support where needed.
Generally speaking, one of the most important tasks with which we are faced today as a species is learning to live with ambiguity. This may sound trite, but it is anything but. Knowledge is constantly shifting, and we have to be able to adapt to it, welcome the uncertainty of it, and refrain from becoming too emotionally attached to one idea or another. If human beings can be said to be characterized by any quality, it might be that we easily become attached to things—including ideas—and then fall on our swords when change comes, as it inevitably does.
We need to be able to live with change, live with uncertainty, live with the possibility that we don’t have all the answers, and bow our heads before the great mystery of the cosmos and the pulsing dance between knowledge that is “known” and knowledge that is or must be lost. This is Gleiser’s preferred solution. In The Island of Knowledge, he encourages us to embrace uncertainty even while we feel so compelled to burrow ourselves in safe havens. He writes: “We may crave certainty but must embrace uncertainty in order to grow” (2015: 280). In a personal correspondence with me he once referred to the Tom Stoppard play Arcadia, in which a character says that “it’s wanting to know that makes us matter.” Knowledge, for Gleiser, is not so much a ground on which we rest as a journey that we take. It is a quest, and in allying ourselves with the search for understanding, as opposed to the literal understandings we may temporarily achieve, we embrace the dynamic nature of our inhabitation of the world and live fully as creatures capable of change and growth.
To embrace uncertainty is certainly not an easy task. Humans have been shown to secrete stress hormones in times of uncertainty (de Berker et al. 2016), and, importantly, to secrete the most stress hormones when faced with the greatest uncertainty. Uncertainty is arguably the greatest stressor humans face. Most of the time, we are not aware of the effect it is having on us; we notice it only when we confront specific decisions we are trying to make. But what if we feel uncertain about our view of the world, our morals, our futures, our experiences of loss and death and suffering and how to make sense of it all? Uncertainty is uncomfortable, and it affects us always.
Because uncertainty is so stressful for us, it is also something we avoid at all costs. We formulate answers to questions that we might rationally know have no good answers. We ally ourselves with political parties and form attachments to ideals and causes, without checking how biased we are by our emotional attachments. We cling to the certainty and simplicity promised to us by politicians, demagogues, fascists. We form concrete opinions and become defensive, even aggressive, when our safe havens are threatened. We find things in which to feel certain and hang our existential sanity on them. This is the human way, perhaps especially in our tumultuously uncertain and frenetic modern world.
But we cannot afford to do this any longer. As a species, we must learn to feel comfortable in the spaces in between. As the sciences advance, we might think that we are marching towards greater knowledge and surety, but it is actually the opposite. We must be ready for our worlds to be changed, lost, shattered, and to care for one another as we live courageously in the spaces between.