Oversimplified messages about the COVID-19 pandemic hurt compliance with mask-wearing and vaccinations, researchers say, and need to be addressed to restore public confidence in science.
While the University of Minnesota’s annual Research Ethics Conference on Wednesday addressed broader issues, it often turned to the challenges of conducting research amid COVID-19 and of recommending protective measures during an unfolding emergency.
Boiled-down statements like “vaccines are safe” might have been well-intended, but they had consequences later on and needed to be nuanced, said Kathleen Hall Jamieson, director of the University of Pennsylvania’s Annenberg Public Policy Center.
“When we say ‘safe,’ it’s heard by people as categorically safe under all circumstances,” she said in her conference presentation. “And then you get a side effect, which we know do exist, and now people say, ‘Wait a minute, you lied.’”
Public confidence in science as a means of discovery remains high, but it has wavered in opinion polls — something that rarely happened over the past half-century. An inability to replicate results in select oncology and psychology studies had already raised questions before COVID-19, Hall Jamieson said, but the pandemic exacerbated issues by forcing policymakers to make decisions based on initial or limited scientific evidence.
Numerous studies found that masks reduced the odds of coronavirus transmission when they were worn properly, used in combination with other prevention measures, and worn both by infected people and those around them at risk. But the message that often filtered to the public was just that “masks work,” said Brian Nosek, director of the University of Virginia’s Center for Open Science.
“We just have to represent the uncertainty,” he said. The message around mask-wearing, he said, should have been: “We don’t yet know all the evidence about masks, but here are the reasons that we favor masking right now based on the available evidence, and these are the things we’re going to figure out.”
Many scientists had remained open to the theory that the pandemic started with a viral leak from a lab in China, but the overriding message to the public was that scientists believed that explanation was a hoax. Nosek said that perception proved costly now that the idea is gaining more credibility — with the U.S. Department of Energy offering a low-confidence finding that a lab leak might have been at the root of the pandemic.
“The lab leak theory is now a viable theory,” he said. “It’s not certain. Having stated with confidence that it is a conspiracy theory is a reputational cost that we’ve earned as scientists.”
Nosek said solutions for building trust in research include disclosing not only a study’s limitations but also the process by which findings are established and then verified with follow-up studies.
Survey results showed that people trusted the ability and ethics of researchers when their findings were confirmed by follow-up research. But they viewed the researchers as equally capable and more ethical when their results were contradicted and they conducted new studies to find out why.
People “do not judge researchers’ ability and ethics solely on whether they were right or wrong, but rather on how they pursued the truth,” Nosek said.
Public confidence was lower in scientists who reported initial findings and then didn’t conduct follow-up research to confirm them. Nosek and Hall Jamieson agreed that more U.S. research spending needs to be diverted toward confirmatory studies, rather than just toward the next innovations.
“If we don’t know if our innovation is accurate, then it’s not an actual innovation,” Nosek said. “It’s not helping us.”
Hall Jamieson said America’s 20 million scientists might need to become better ambassadors in their communities for their work, because more people are starting to view them as capable but not as sharing their values. Scientists also need to embrace research as a fluid process and not fear contradictory findings or corrections to their work when new evidence emerges.
Scientists are “stigmatized” by the term retraction, because it has come to describe both research withdrawn or amended for honest mistakes and studies pulled from publication because of outright fraud, she said: “We need to find a way to incentivize finding one’s own mistakes … and incentivize more self-correction.”
Jeremy Olson is a Pulitzer Prize-winning reporter covering health care for the Star Tribune. Trained in investigative and computer-assisted reporting, Olson has covered politics, social services, and family issues.
© 2023 StarTribune. All rights reserved.