After eight years and billions of dollars spent, Meta is removing fact-checking services from its apps. The feature was originally introduced to combat the rise of misinformation on social media after the 2016 election, when it became clear that social media's public forum and rapid publication allowed misinformation to spread more effectively than traditional media could.
The decision to end the fact-checking program came after concerns about bias among fact-checkers. Joel Kaplan, Meta's chief global-affairs officer, said, "Experts, like everyone else, have their own biases and perspectives. This showed up in the choices some made about what to fact-check and how."
This reasoning likely stems from the outcry over fact-checking targeting right-wing misinformation more often than left-wing misinformation. Cornell University psychologist Gordon Pennycook worked on a study measuring "politically asymmetric sanctions": the perception that certain political groups are the "victims" of more fact-checking and punishment for misinformation. The study reached conclusions similar to Van Der Linden's: there are inequalities in which ideologies spread misinformation, so the consequences fall more heavily on certain political or social groups.
In the study's conclusion, the authors suggest that transparency about fact-checking and the use of politically balanced fact-checkers would help the public understand why right-wing posters are punished for spreading misinformation more often.
In the announcement marking the end of the fact-checking program, Zuckerberg suggested that Meta may be turning to a "community notes" program, in which posts and information are reviewed by fellow users. Similar systems are already in place on social media platforms such as X, formerly Twitter.
According to a 2024 analysis by the Center for Countering Digital Hate, or CCDH, the community notes program on X is too slow to act against misleading claims. By the center's criteria, 74% of "accurate community notes" never reach all users reading the posts because of these delays.
Community notes programs can also deepen media polarization. Professional fact-checkers are bound to a commitment to truth without bias; volunteer contributors are not. According to Megan Duncan, a communications expert at Virginia Tech, "politically ambivalent audiences" don't feel the need to weigh in on accuracy, and as a result, "the results of crowdsourced credibility labels are politically skewed."
So where do we stand? Is the truth abandoning us, or are we abandoning the truth? As student media creators and journalists, we are saddened by what we consider a step backward for truth in the media we consume. Professional fact-checkers were a resource that let us consume media believing that what we saw online was true.
Meta's choice of inaction shifts the responsibility for truth onto educated people and media consumers. We must support those in our communities, especially those more prone to believing misinformation. That may mean having difficult conversations with the people we love and overcoming a fear of confrontation.
Misinformation heightens emotional responses, so we must think rationally and critically in online spaces. A "community notes" approach will work only if a community is ready to put in the work that previously belonged to professionals. Those with both knowledge and resources must debunk myths, or "prebunk" by warning groups about the misinformation they may encounter.
It is our duty to combat misinformation and save our communities when organizations such as Meta fail us. It will be our honor to look back on the society we have shaped.