Truth, trust, and hope. Where to start?
Last week I was invited to the Nobel Prize Summit on information integrity at the National Academy of Sciences. I’ve attended quite a few of these types of events lately—discussions on mis- and disinformation’s impact on truth, trust, and hope. I keep noticing the same themes bubbling up. Here are a few thoughts.
Mis/disinformation is a major problem.
Truth is now debatable. The major challenge in scientific communication is that truth is now negotiated across peer networks rather than handed down from institutions. Because of this, disinformation and misinformation are eroding public trust in science, becoming a threat to the planet, and costing lives. And it goes beyond the pandemic—climate change, routine vaccinations, gun violence, reproductive health. Everyone—the private sector, government, researchers, and communities nationally and internationally—is rightfully worried.
AI will make it worse.
The problems are just getting started. Even before AI, false news spread six times faster than the truth on social media. AI will accelerate the speed, reach, complexity, and innovation of disinformation. It will create more effective content that plays on human emotion and, thus, goes viral and changes human behavior.
A prime example emerged last week, when an AI-generated image depicting an explosion at the Pentagon briefly rattled the stock market.
AI will diminish safety for those of us on the front lines of communication, too. AI-generated deepfakes will make it increasingly difficult to discern what is true and what is not. The threats and smear campaigns will only get worse. In one case, the president of another country shared fabricated pornographic images of a reporter covering an important story.
What is trustworthy? And who is trustworthy? These questions are going to get more difficult to answer with time.
Too much talk. Too little action.
There is a lot of talk about the problem. There is a lot of doom and gloom from leadership.
I don’t think this is useful. We know what to do, but I’m getting increasingly frustrated with inaction. We need to coordinate and mount a proactive and reactive response that treats this as the biosecurity threat it is:
Proactive. Prevent information voids from forming in the first place. The public, rightfully so, has many questions about health. Often they can’t find answers in a timely or digestible manner; cue health mis- and disinformation. In public health, this is where we start, given that we have very few resources on the ground. Anticipate concerns and drive national conversations. Listen. Build a community of practice, invest in training, and communicate in plain language. There is SO much we can do, but we just aren’t moving. For example, what are we doing right now to anticipate and get ahead of RSV vaccine misinformation this fall?
Reactive. Providing accurate, timely, empathetic information isn’t enough. It needs to be supplemented with action to combat mis- and disinformation through monitoring, training, and support.
There is really fantastic work being done on the ground by volunteers, trusted messengers, and entrepreneurs in both of these spaces. But the support is suboptimal—to say the least—casting doubt on its sustainability.
Everyone has a role.
There is a lot of finger-pointing. Everyone thinks someone else should be doing something. And, of those who are doing something, little of their work is supported. Institutions are needed for the long-term solution:
Governments. Congressional courage is needed. In the U.S., other government entities have a role, too: the National Institutes of Health (train scientists to communicate and translate; prioritize funding more research in this space), the FDA and CDC (anticipate the information needs of the public), Health and Human Services (create, engage, and support communication networks), the Department of Defense (create a robust, well-funded surveillance system to understand where, how, and what health misinformation is circulating in real time), and the Department of Education (strengthen STEM integration). State governments have a role through medical boards and local action, too.
Academic institutions. Reward scientific communication and knowledge translation through training. Elevate this soft skill through the tenure process, for example. Where does this go on a CV?
Private industry. Some of the biggest problems aren’t the people, but rather the reward structures on social media platforms. Private industry needs to get its act together: Is this truly the future we want? The lowest-hanging fruit is transparency: content moderation, algorithm impacts, data processing, and integrity policies.
We have a problem, and we are moving at a snail’s pace—on the backs of effective volunteers who find themselves shouting for support into the abyss. We aren’t helpless, but inaction is hindering progress. Government, private industry, academic institutions—just do something. That’s how we move toward truth, trust, and hope.
“Your Local Epidemiologist (YLE)” is written by Dr. Katelyn Jetelina, MPH PhD—an epidemiologist, data scientist, wife, and mom of two little girls. During the day she works at a nonpartisan health policy think tank and is a senior scientific consultant to a number of organizations, including the CDC. At night she writes this newsletter. Her main goal is to “translate” the ever-evolving public health science so that people will be well equipped to make evidence-based decisions. This newsletter is free thanks to the generous support of fellow YLE community members. To support this effort, subscribe below: