Struggling for Truth in an Age of Misinformation
As a person who enjoys political debate, I often find myself engaged in lively discussions. Generally, even if we end up agreeing to disagree, I leave these conversations feeling like I’ve learned something. Indeed, in an era of increasing polarization, these kinds of healthy debates are crucial; they serve to crack open a window, to unsettle the stagnant air and let new ideas flow.
The problem I’ve noticed recently is that we no longer seem to be debating interpretations or opinions; we are fundamentally disagreeing on what is true. This erosion of a shared reality poses a significant challenge to meaningful dialogue, and Meta’s move to drop third-party fact-checking feels like a dangerous step towards further uncertainty and division.
In a recent announcement, Zuckerberg revealed plans to replace third-party fact-checkers with a user-driven system modelled on X’s Community Notes. With this move, which will begin in the US, Meta has essentially shirked responsibility for the misinformation posted on its platforms. While the approach is supposedly driven by a vision of social media as the ‘town square’ of the internet, the system has fundamental flaws that inhibit effective self-regulation.
Firstly, the Community Notes system requires users to register as contributors and flag potentially misleading posts, but suggested notes only become publicly visible once they are approved by enough contributors with diverse viewpoints. We have already seen the shortcomings of this approach play out on X, where an independent investigation revealed that only 20 of 283 posts containing false election claims received publicly visible Community Notes. While Zuckerberg suggests change is needed because fact-checkers are making ‘too many mistakes’, it seems likely that this approach will make far more mistakes of its own by missing misinformation altogether.
The danger of misinformation can, in part, be understood through a psychological lens: confirmation bias makes us more likely to believe information that confirms what we already hold to be true. On social media, this tendency sends us down rabbit holes and polarizes us, because everyone can find some ‘truth’ to latch on to. Naomi Klein describes how social media exacerbates groupthink, acting as a ‘one way mirror’ that fractures society: everyone feels their reality is grounded in truth, while the ‘others’ are plagued by lies.
Additionally, the process of approving notes is slow: it takes an average of 11 hours for a public note to be added, by which point the content may have been seen by millions of users. When we consider the brain’s anchoring bias, the tendency to rely heavily on the first information we encounter even when new information becomes available, we can appreciate the significant risk this delay poses.
Further, the impact of misinformation is not abstract or conceptual. When doubts are cast on truth, the consequences can be fatal. We saw this most clearly during the COVID-19 crisis, which some have called a ‘deadly infodemic’: the spread of misinformation contributed to vaccine avoidance and mask refusal, increasing morbidity rates. While the impact of false news in other areas may be harder to quantify, we see its insidious effects everywhere. The misinformation spread on X about the attack on a dance class in Southport sparked riots and fueled hate crimes across the UK. In the US, claims that the 2020 presidential election was stolen sowed seeds of democratic doubt, inciting an insurrection that saw violent mobs march on Capitol Hill. Indeed, nothing is ‘just’ discourse; (mis)information has corrosive impacts that play out in the ‘real world’ in more ways than there is space to discuss here.
Finally, while this issue centers on the idea of truth, its political dimensions cannot be ignored.
Primarily, with this policy Zuckerberg claims to be combatting ‘politically biased censorship’. While free speech is not an inherently partisan value, there is a grain of truth in the idea of a partisan divide. Research analyzing 9,000 politically active Twitter users suggests that pro-Trump accounts were disproportionately suspended during the 2020 election. However, these accounts were also found to rely more heavily on low-quality news sources reporting misinformation. As such, this isn’t a free speech issue: right-wing users aren’t being silenced for sharing truths but for circulating falsehoods. The problem, therefore, lies not with ‘unfair’ fact-checkers, but with constant exposure to unreliable information.
Additionally, this move sees Zuckerberg, by no coincidence, follow in the footsteps of the richest man in the world, who is particularly cozy with the incoming President Trump. While Meta frames the policy shift as reflecting the ‘cultural tipping point’ indicated by the recent presidential election, what it really shows is that the dial of political discourse has swung to the right, and the profits of platforming such speech are too sweet for Meta to pass up. Here we see interest convergence theory in practice: while liberal values were in favour, restricting harmful speech was profitable; now, there’s money to be made from opening the floodgates to misinformation. Indeed, this move to deregulate discourse, particularly on subjects of ‘immigration, gender identity and gender’, poses a significant threat to women, LGBTQI+ people and other minority groups.
Social media platforms undoubtedly have a difficult task in hand, but Meta’s decision to dismantle third-party fact-checking represents a dangerous step in the wrong direction. In an era of misinformation, robust mechanisms to discern fact from fiction are more vital than ever. To resist the global spread of these policies, we need to put pressure on British policymakers to strengthen online safety laws, regulate social media companies and introduce prompts that remind readers to apply a critical eye. We should also take this moment as a reminder to support independent journalism and be vigilant about the information we consume. The quest for truth is not merely an academic exercise; what is at stake is nothing less than the cohesion of society and the protection of justice.