Meta permits ads claiming 2020 election rigging
The policy was quietly implemented in 2022 after the US midterm primary elections, as reported by the WSJ
Meta now permits political ads on Facebook and Instagram claiming the 2020 election was rigged. The policy change, quietly introduced in 2022 after the US midterm primary elections, was revealed by the Wall Street Journal, citing informed sources. Under the previous policy, Republican candidates were barred from running ads challenging the legitimacy of the 2020 election; Meta will now allow advertisers to claim that past elections were “rigged” or “stolen,” although it still prohibits ads questioning the legitimacy of ongoing or future elections. With the 2024 presidential election approaching, social media platforms are adjusting their policies in anticipation of a surge in political messaging.
In August, X (previously Twitter) announced the reversal of its 2019 ban on political ads. In June, YouTube also changed its approach, deciding to no longer remove content falsely asserting fraud in the 2020 or previous US presidential elections. This shift aimed to preserve the capacity for “openly debating political ideas, even those that are controversial or based on disproven assumptions.”
Meta also weighed free-speech concerns in its decision. According to the Wall Street Journal, Nick Clegg, Meta’s president of global affairs, argued that the company should not be in the position of deciding whether elections are legitimate.
According to The Wall Street Journal, Donald Trump ran a Facebook ad in August, taking advantage of the new rules. In the ad, he falsely claimed, “We won in 2016. We had a rigged election in 2020 but got more votes than any sitting president.”
The Tech Oversight Project expressed strong disapproval of the change in a statement: “We now understand that Mark Zuckerberg and Meta are willing to deceive Congress, jeopardize the American people, and continuously undermine the future of our democracy,” said Kyle Morse, the group’s deputy executive director. “This announcement provides a disturbing glimpse into what we might encounter in 2024.”
Combined with Meta’s recent efforts to reduce the amount of organically shared political content on Facebook, the change could make campaign ads challenging past elections far more visible in 2024.
Gina Pak, CEO of Tech for Campaigns, a digital political marketing organization that works with Democrats, told the Journal, “Today you can create hundreds of pieces of content in the snap of a finger and you can flood the zone.”
Over the past year, Meta has laid off approximately 21,000 employees, many of whom worked on election policy.
Facebook was accused of negatively influencing the 2016 US presidential election by failing to curb the spread of misinformation ahead of the vote, which Trump won over Hillary Clinton. False stories, such as articles portraying Clinton as a murderer or claiming the Pope had endorsed Trump, spread across the platform as non-journalists, including a group of teenagers in Macedonia, built fake pro-Trump websites to collect advertising revenue when those stories went viral.
Subsequently, Trump adopted the term “fake news” to disparage credible reporting of his own untruths.