
Facebook, Instagram will allow political ads that claim the 2020 election was stolen

By Clare Duffy and Donie O’Sullivan, CNN

New York (CNN) — Meta will allow political ads on its platforms to question the outcome of the 2020 US presidential election, part of a broader rollback in election-related content moderation by major social media platforms over the past year, ahead of the 2024 US presidential contest.

The policy means that Meta, the parent company of Facebook and Instagram, will be able to directly profit from political ads that boost false claims about the legitimacy of the 2020 election. While the company will allow political advertisements to claim that past elections, including the 2020 presidential race, were rigged, it will prohibit those that “call into question the legitimacy of an upcoming or ongoing election.”

The change, part of a year-old policy update, had not been widely reported until The Wall Street Journal covered Meta’s ads policy change earlier Wednesday.

Meta says the policy allowing 2020 election denialism in political ads was part of an August 2022 announcement about its approach to last year’s midterm elections, when the company said it would prohibit ads targeting users in the United States, Brazil, Israel and Italy that discourage people from voting, call into question the legitimacy of an upcoming or ongoing election, or prematurely claim an election victory. The same month, Meta told The Washington Post that it would not remove posts from political candidates or regular users claiming voter fraud or asserting that the 2020 election was rigged.

Meta’s broader electoral misinformation policy continues to prohibit content that could interfere with people’s ability to participate in voting or the census, such as false claims about the timing of an election, according to the company.

President Joe Biden’s reelection campaign blasted Meta on Thursday, saying the company was “choosing to profit off of election denialism.”

“We wish we could say we were surprised Meta is choosing to profit off of election denialism, but it seems to be a feature of theirs, not a bug,” TJ Ducklo, communications advisor to the Biden campaign, said in a statement provided to CNN Thursday.

“They amplified the lies behind the ‘stop the steal’ movement. Now they’re coming for its cash. Joe Biden won the election in 2020 clearly, unequivocally, and fairly – no matter what Meta chooses to promote,” Ducklo added.

Meta did not immediately respond to a request for comment on the Biden campaign’s statement.

Pressure on tech companies to combat election misinformation ramped up following the January 6, 2021, attack on the US Capitol, which was fueled by baseless claims about 2020 election fraud.

The 2020 election was not rigged or stolen. Dozens of lawsuits challenging the 2020 presidential election results were dismissed at the state and federal levels across the country after the push to overturn the outcome began in November 2020.

But more recently, social platforms have shifted in how they handle election advertisements and misinformation related to the 2020 election.

Meta, YouTube and X, the platform formerly known as Twitter, have all reinstated accounts belonging to former US President Donald Trump since last fall. Meta clarified following Trump’s reinstatement in January that it would not punish the former president for attacking the results of the 2020 election but said he would be prohibited from casting doubt on an upcoming election.

X also said earlier this year that it would once again allow political advertisements, lifting an earlier ban.

YouTube said in June that it would no longer remove content featuring false claims that the 2020 US presidential election was stolen, reversing a policy instituted more than two years ago. However, the company says it will still prohibit content that misleads users about how and when to vote, promotes false claims that could discourage voting or otherwise “encourages others to interfere with democratic processes.”

YouTube’s policy change allowing 2020 election denialism does not apply to its ad policies, YouTube spokesperson Michael Aciman confirmed Wednesday. YouTube’s ad policy continues to prohibit claims that are “demonstrably false and could significantly undermine participation or trust in an electoral or democratic process.”

Separately, Meta said earlier this month that it would require political advertisers around the world to disclose any use of artificial intelligence in their ads, starting next year, as part of a broader move to limit “deepfakes” and other digitally altered misleading content. It also said it would prohibit political advertisers from using the company’s new artificial intelligence tools that help brands generate text, backgrounds and other marketing content.

–CNN’s Brian Fung contributed to this report.
