Facebook says it will ban fresh political ads in week before US elections to prevent disinformation
Washington: Facebook says it will ban new political advertising in the week before the US election, one of its most sweeping moves against disinformation yet, as CEO Mark Zuckerberg warned of a “risk of civil unrest” after the vote.
Under the new measures, Facebook says it will prohibit politicians and campaigns from running new election ads in the week before the election. However, they can still run existing ads and change how they are targeted.
The social media giant also vowed to fact-check any premature claims of victory, stating that if a candidate tries to declare himself the winner before the final votes are tallied, “we’ll add a label to their posts directing people to the official results.”
And it promised to “add an informational label” to any content seeking to delegitimise the results or claim that “lawful voting methods” will lead to fraud.
Posts with obvious misinformation about voting policies and the coronavirus pandemic will also be removed. On Messenger, Facebook’s messaging app, users will be limited to forwarding articles to a maximum of five people at a time. The company will also work with Reuters to provide official election results, making the information available both on its platform and through push notifications.
“I’m concerned about the challenges people could face when voting. I’m also worried that with our nation so divided and election results potentially taking days or even weeks to be finalised, there could be an increased risk of civil unrest across the country,” Zuckerberg said in a post.
Democrats have long warned that President Donald Trump and his supporters may try to sow chaos with false claims on November 3, when the vote will take place amid unprecedented health and economic crises, social unrest and protests for racial justice.
The US remains the epicentre of the world’s worst coronavirus outbreak, and voters are expected to shift to mail-in voting, with an estimated three-quarters of the population eligible to do so.
As a result, officials are warning that the final tally may not be revealed until well after voting day—spurring fears that paranoia and rumour-mongering could hit an all-time high.
Trump—a prolific user of social media who is trailing Democratic challenger Joe Biden in the polls—has recently hurtled down a rabbit hole of conspiracy theories, claiming that he is the victim of a coup or of plans to rig the polls.
Almost daily, he claims that increased mail-in voting is a gambit to “rig” the election against him, and he has refused to say whether he will accept the results.
He has also opposed more funding for the cash-strapped US Postal Service (USPS), acknowledging the money would be used to help process ballots.
And he has refused to condemn the presence of armed vigilantes in the streets during a wave of social justice protests across America this summer, spurring fears of unrest if there is not a clear result immediately after November 3.
Opponents say Trump’s increasingly extreme resistance to expanded mail-in voting—a method already used widely in the United States—is an attempt to suppress voter turnout, while setting up an excuse to challenge the result if he is defeated.
“This election is not going to be business as usual,” said Zuckerberg, who has come under increasing pressure to do more to combat conspiracy theories on Facebook.
Facebook has long been criticised for not fact-checking political ads or limiting how they can be targeted at small groups of people.
After being caught off-guard by Russia’s efforts to interfere in the 2016 US presidential election, Facebook, Google, Twitter and other companies put safeguards in place to prevent it from happening again. These include taking down posts, groups and accounts that engage in “coordinated inauthentic behavior” and strengthening verification procedures for political ads. Last year, Twitter banned political ads altogether.
Zuckerberg said Facebook had removed more than 100 networks worldwide engaging in such interference over the last few years.
“Just this week, we took down a network of 13 accounts and two pages that were trying to mislead Americans and amplify division,” he said.
But experts and Facebook’s own employees say the measures are not enough to stop the spread of misinformation—including from politicians and in the form of edited videos.
Facebook had previously drawn criticism for its ads policy, which cited freedom of expression as the reason for letting politicians like Trump post false information about voting.