The old adage that a lie can traverse the globe before the truth puts its shoes on has never been more relevant than it is today. Because of social media, lies can literally spread around the world in mere seconds. This has enormous implications for public health, as we have seen with the anti-vaccine movement.
According to a new report in the Wall Street Journal, Facebook plans to crack down on "false criticism of vaccines" and "posts promising miracle cures or flogging health services." While this is a good idea in principle, Facebook's track record of policing the content of its social media platform is poor.
Facebook's Dubious Track Record of Self-Policing
First, it's not clear how Facebook will determine which content is peddling pseudoscientific health claims. The WSJ article said that YouTube is using medical doctors to help identify such content. However, the article remained vague about Facebook's strategy, and a press release that the company issued didn't clarify matters.
It seems that Facebook is placing an emphasis on pages that sell products. That's a good start, but it's hardly sufficient. Some of the most influential people or news outlets on social media aren't selling products (other than website clicks for ad revenue). Memes spreading false information routinely go viral. Overall, the problem is so bad that a recent report suggested that a large majority of the health information shared on Facebook is wrong or misleading. (Worse, the "fake news" often comes from supposedly credible outlets.)
Second, Facebook has inappropriately censored content on its platform. For instance, it shut down a pro-biotechnology page called "We Love GMOs and Vaccines" for about a week because anti-science activists complained about it. Facebook also blocked us from running an ad that criticized a shoddy health story -- the very thing the company now says it wants to combat. When we approached Facebook about the decision, it twice provided bizarre, nonsensical reasons before finally reinstating the ad.
Finally, some health news which, at first glance, sounds a little wacky might actually be true. Zinc lozenges (but not vitamin C) may shorten the duration of colds. Acupuncture doesn't work, but the placebo effect does. Electromagnetic fields don't cause cancer, but strategically deployed electric fields actually may help fight it. Who at Facebook will determine which of these stories (or products) is allowed on their platform?
The answer most likely will involve some combination of algorithms and human intervention. Unfortunately, Facebook has already demonstrated a gross inability to block the spread of fake political news. How on Earth will it thwart phony health claims, which are much more difficult to analyze?
If it has not already done so, Facebook should follow YouTube's path and seek outside expertise. May we suggest the American Council on Science and Health?