Sometimes It Is Best To Say, “We Don’t Know”
In what was without a doubt a scientific triumph, scientists sequenced the genome of the new virus causing a rapidly spreading pandemic and then harnessed a new technology involving messenger RNA (mRNA) to bring us vaccines that dramatically reduce the risks of our getting severely ill or dying from it.
The above statement is the accurate way of looking at what happened in the first year of the coronavirus pandemic. How is it possible, then, that a Pew Research Center report issued on February 28 tells us that “fewer U.S. adults are confident that scientists are acting in the public’s best interest than before the pandemic began”? The report showed that between April 2020 and December 2021 the percentage of Americans who have a great deal of confidence that medical scientists act in the best interests of the public declined from 43% to 29%.
There is clearly a huge disconnect here between what scientists have accomplished during the pandemic and how people view their work. That means there is major work ahead of us to understand what caused that lack of trust and to repair it. Fortunately, many scholars and policymakers are examining the problem and recommending steps for us to take to improve the situation.
Uncertainty Drives Mistrust
A driving force behind the growing lack of trust in medical scientists seems to stem from the discomfort some of us have with uncertainty. James Dillard, a communication scientist at the Pennsylvania State University who studies warning messaging, recently explained that “Pro-vaccine messaging is taking place in a highly competitive message environment—one that involves active efforts to undermine public health advocacy…The fact that scientific knowledge evolves and always possesses a degree of uncertainty explains why health agencies change—and continue to change—their messaging. Regrettably, this inconsistency also undercuts the impact of health messaging in a public that wants simple, consistent answers.”
It turns out that we vary in how much uncertainty we can stand and that many factors determine that variability. “As intolerance of uncertainty has begun to be studied as a separate trait from a tendency to worry,” writes Francine Russo in Scientific American, “psychologists have identified typical behaviors—often unconstructive ones—that people use to tamp down their distress at not knowing…Some collect every bit of information they can find online and offline.” If scientists appear uncertain, in part because they change their recommendations based on evolving information, then people with high uncertainty avoidance will seek certainty elsewhere, putting their trust in sources that seem confident even when those sources are misleading or incorrect.
Russo describes a study conducted at the University of Illinois Chicago two years before the coronavirus pandemic in which neural activity during an uncertainty test was measured using functional magnetic resonance imaging. Two years later, participants in the Chicago study who had shown high activity in a brain region called the anterior insula during the uncertainty test reported increased anxiety, depression, or emotional distress during the pandemic. The anterior insula is associated with unpleasant emotions and fear, and this finding suggests that there is a biological trait linked to discomfort with uncertainty.
Lack of trust in health authorities may also have deep roots. A recent study from the U.K. showed that a history of childhood trauma predicted COVID-19 vaccine hesitancy and was independently associated with low levels of trust in National Health Service information about COVID-19. Any attempt to improve our trust in medical science must therefore take into consideration these traits, some of which are inherent in our makeup, some of which are cultural, and others acquired by the often traumatic experiences we have through our lives.
Borrowing from their research in child development, scientists Tamar Kushnir, David Sobel, and Mark Sabbagh recently wrote that “Research suggests that people are more likely to follow advice delivered with confidence and to reject advice delivered with hesitancy or uncertainty.” When, however, that advice changes, as it has frequently and inevitably during the pandemic, the uncertainty created in people’s minds leads to decreased trust in the advice-givers. Their work suggests a solution. “Our research suggests that, in many cases people trust those who are willing to say ‘I don’t know.’” They go on to assert that:
“The good news is that, based on our research, we believe the human mind doesn’t balk at hearing communicated uncertainty – quite the opposite. Our minds and brains are made to handle the occasional ‘I think so,’ ‘I’m not sure’ or ‘I don’t know.’ In fact, our ability to do this emerges early in child development and is a cornerstone of our ability to learn from others.”
It may not be uncertainty alone, then, that unnerves us, but rather the breach of an illusion of certainty created by overly confident health officials. What follows from this is that we need a massive campaign to train scientists and health authorities in how to communicate uncertain science without undermining confidence in science. Just as important, we need to teach people how to live with that uncertainty. It will be crucial to ensure that this work is culturally sensitive, because the ability to tolerate uncertainty differs not only among individuals but also among cultural groups. Scientists need to be able to say “this is what appears to be the case from the work we have done so far, and therefore we are making recommendations based on what we know now. More research will be done, and these recommendations may change as new information emerges. We’ll be sure to let you know right away when there is a new development.”
The effort to improve trust in scientists will take time. A Yale University research group led by Tauhid Zaman has developed a method of moving people away from online inflammatory ideas that “begins with an attempt to build trust over time,” a tactic he calls “pacing and leading” that is played out over months. “I come close to you,” Zaman explains about his work with people expressing uncivil attitudes about things like immigration on Twitter, “and then I take a step. I start to pace you and lead you somewhere else.” He has expressed the belief that his slow and steady approach to building trust with people online will also work in the public health arena. About COVID-19 vaccines, for example, he says:
“I think public officials’ approach to persuasion about the vaccine has been totally misguided. There are people who don’t want to get the vaccine, and the response has been to tell them it’s good and jam that message down their throats. That’s just going to create more antagonism, more skepticism, more denialism. Persuasion takes time; it’s not instantaneous.”
A similar approach to regaining trust is followed by astronomer Katie Mack of North Carolina State University. “What I try to do is share the real information, and I try to share it in a nonjudgmental way. When people ask me a question, and they have a misunderstanding, I try to correct that misunderstanding in a way that doesn’t imply that they’re dumb to believe it in the first place. I try to start from the point of, ‘it’s great you’re interested in learning more about this. Here’s how you can learn more about it.’”
Last February New York Times columnist Ezra Klein put the issue bluntly. “Public health is rooted in the soil of trust. That soil has thinned in America.” Klein cited a recent paper in the journal The Lancet that examined the relationship between trust in government and fellow citizens and COVID-19 infection rates in 177 countries. “This yields the paper’s most striking finding,” Klein wrote. “Moving every country up to the 75th percentile in trust in government…would have prevented 13 percent of global infections. Moving every country up to the 75th percentile of trust in their fellow citizens…would have prevented 40 percent of global infections.”
Mistrust in government, public health authorities, and medical scientists has a real effect on our individual and collective health. It stems in large part from an inability to accept the uncertainty inherent in science and is worsened by communications from health authorities that are overly confident and that do not prepare us for the inevitable changes in guidance that result from newly emerging research. We need to think of trust as a public good, something that governments are in large part responsible for maintaining and, when necessary, repairing. Repairing it will take an understanding of the many factors that contribute to intolerance of uncertainty and of the ways we form and maintain trust in institutions and authorities. It will also take time, patience, and an empathic stance. Ignoring the growing issue of mistrust in public health institutions and medical science would put the public’s health in ever graver peril.