Battling the Tides of Social Media Misinformation

As social media platforms unleash a torrent of content, the battle against inaccuracies becomes increasingly daunting. Algorithms, touted as gatekeepers, have not worked. Can crowdsourced corrections stem the tide of misinformation?

There can be little doubt that social media is becoming a more significant source of information for Americans. Nor is there doubt that much of that information is inaccurate or deliberately misleading – it should come as no surprise that it is difficult to convey complex medical and biological concepts to the general public in 280 characters, or roughly 56 words. (I’ve used 381 in just these two sentences.) As Mark Twain wrote:

"I didn't have time to write a short letter, so I wrote a long one instead."

Faced with pressure from advocacy groups and the government, social media platforms have tried to moderate and flag inaccurate or misleading posts with varying degrees of success. TikTok users post 280 short videos every second, or roughly 16,800 a minute; X users post, on average, around 6,000 tweets per second, or more than 350,000 a minute. As a result, most moderation on social media platforms is algorithmic, making moderation more of a sieve than a wall.

Crowdsourcing to Identify Misinformation

In 2021, X brought the wisdom of the crowd to the fight against misinformation with what is now called Community Notes. Anonymous volunteers independently create “notes” identifying posts they find misleading and offering corrections. Because of the sheer volume of tweets, these notes, themselves posted as tweets, are displayed alongside the tweet being corrected only after algorithmic vetting. The algorithm assesses the helpfulness of a contributor’s prior notes to develop a Rating Impact score and, in some manner, places each contributor on the left or right of the political spectrum. Ratings from both sides of the political divide are necessary for a Note to post and fulfill its function to debunk a claim or “reduce interaction with biased content.”
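X’s actual scoring model is more elaborate than this article describes; purely as an illustration of the “both sides must agree” requirement, here is a minimal Python sketch in which every name, label, and threshold is hypothetical.

from dataclasses import dataclass

@dataclass
class Rating:
    helpful: bool        # did this rater find the note helpful?
    rater_leaning: str   # crude stand-in for the algorithm's left/right estimate: "left" or "right"

def note_should_display(ratings, min_ratings=5, min_helpful_share=0.6):
    # Toy rule: a note is shown only if it has enough ratings AND raters on
    # *both* sides of the estimated political spectrum mostly rate it helpful.
    if len(ratings) < min_ratings:
        return False
    for side in ("left", "right"):
        side_ratings = [r for r in ratings if r.rater_leaning == side]
        if not side_ratings:
            return False  # no cross-spectrum agreement is possible
        helpful_share = sum(r.helpful for r in side_ratings) / len(side_ratings)
        if helpful_share < min_helpful_share:
            return False
    return True

# A note rated helpful by raters on both sides clears the bar; one praised
# only by a single ideological camp would not.
sample = [Rating(True, "left"), Rating(True, "left"), Rating(True, "right"),
          Rating(False, "right"), Rating(True, "right")]
print(note_should_display(sample))  # True under these toy thresholds

The point of the cross-spectrum check is that a note cannot post on the strength of one ideological camp alone.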

Does Crowdsourcing Work?

A new study reported in JAMA Network Open looked at the Notes generated between December 2022 and December 2023 for one of the usual hotbeds of misinformation: COVID vaccination. During that interval, 657 Notes were related to COVID vaccination, and they were categorized in terms of their veracity as

“entirely (scientifically supported), partially (scientifically debated), or not (scientifically unsupported) accurate.”

The notes were also characterized by topic and frequency: adverse events (51%), conspiracy theories (37%), vaccine recommendations (7%), and vaccine effectiveness (5%).

  • 97% of the Notes were entirely accurate, 2% partially accurate, and 1% inaccurate
  • 49% cited highly credible sources, e.g., primary sources such as peer-reviewed journals or government websites
  • 44% cited moderately credible sources, e.g., major news outlets or fact-checkers
  • 7% cited low-credibility sources, e.g., blogs or tabloids. 

Of course, those characterizations of credibility can be questioned. Peer review doesn’t guarantee accuracy, nor does being a major news outlet. On the other hand, some blogs can be quite credible, especially when linking to multiple primary sources.

As the researchers note,

“A sample of Community Notes added to posts on X containing COVID-19 vaccination misinformation primarily addressed adverse events and conspiracy theories, were accurate, cited moderate and high credibility sources, and were attached to posts viewed hundreds of millions of times.”

The researchers demonstrated that these Notes, designed to counter misleading or inaccurate information, were seen; but, as they acknowledged among the study’s limitations, they could not measure the Notes’ efficacy in changing minds. They then make a statement that, while sounding quite definitive, hides a great deal of wishful assumption.

“Higher credibility yields greater persuasiveness.”

If only that were true! I prefer the assumptions made by Australian science communicator Craig Cormick, who has written extensively on the science of communicating science. [1] These two quotes are particularly salient.

“When we are time poor, overwhelmed with data, uncertain, driven by fear or emotion, we tend to assess information on mental shortcuts or VALUES, not FACTS. …

Attitudes that were not formed by logic and facts, cannot be influenced by logic and facts.”

We are time-poor, if for no other reason than that social media platforms shrink our attention span as we jump from shiny object to shiny object. As is becoming increasingly clear, algorithms optimized to serve up ads capture attention, primarily through our inborn fear response, while overwhelming us with choices of what to ponder next. While Notes may indeed present countervailing evidence that immediately debunks misleading information, we are more influenced by how the evidence “feels.”

“People most trust those whose values they feel mirror their own. …

People … are less happy when science conflicts with a deeply held view, when they are being asked to believe something that goes against what they feel is true.” [emphasis added]

It is this “feeling,” rather than some “cherry-picked” facts, that underlies confirmation bias. As a result, Cormick believes that you cannot presume to “change people’s minds with more information.” This contradicts the view of one of the study’s co-authors, Eric Leas, Ph.D.

“Rather than censoring misleading content, Community Notes fosters a learning environment where users can glean insights from corrections to misinformation to prevent similar misunderstandings in the future. By providing context and credible sources alongside contentious posts, the platform empowers users to discern fact from fiction, a skill they will find useful as they navigate all claims.”

Whether the assumptions I share with Craig Cormick or those of Dr. Leas are more “correct” requires further study. Navigating social media’s disinformation requires a deeper understanding of human psychology and a commitment to fostering meaningful connections. While Community Notes is a testament to collective action, its efficacy may not lie in conveying factual information but in signaling a commitment to truth and the pursuit of knowledge – the real scientific method.

[1] The Science of Communicating Science: The Ultimate Guide

 

Source: Characteristics of X (Formerly Twitter) Community Notes Addressing COVID-19 Vaccine Misinformation, JAMA Network Open. DOI: 10.1001/jama.2024.480