Facebook may ban all US political advertising after the polls close on Nov. 3

Facebook, which has been under fire for the misuse of its platform to promote politically motivated and fake news, said it would take more preventive measures to keep candidates from using it to manipulate the election's outcome and its aftermath. Tensions grew amid Donald Trump's evasive comments about a peaceful transfer of power

By Mike Isaac
Published: Oct 8, 2020

Mark Zuckerberg, Facebook’s chief executive, testifies on Capitol Hill in Washington, April 10, 2018. In 2016, Zuckerberg had said it was a “pretty crazy idea” that the social network could have a serious role in altering the outcome of the election. Political ads will be banned indefinitely after polls close on Nov. 3, 2020, and the company plans new steps to limit misinformation about the results. Image: Gabriella Demczuk/The New York Times

SAN FRANCISCO — Over the past few weeks, Mark Zuckerberg, Facebook’s chief executive, and his lieutenants have watched the presidential race with an increasing sense of alarm.

Executives have held meetings to discuss President Donald Trump’s evasive comments about whether he would accept a peaceful transfer of power if he lost the election. They watched Trump tell the Proud Boys, a far-right group that has endorsed violence, to “stand back and stand by.” And they have had conversations with civil rights groups, who have privately told them that the company needs to do more because Election Day could erupt into chaos, Facebook employees said.

That has resulted in new actions. On Wednesday, Facebook said it would take more preventive measures to keep political candidates from using it to manipulate the election’s outcome and its aftermath. The company now plans to prohibit all political and issue-based advertising after the polls close on Nov. 3 for an undetermined length of time. And it said it would place notifications at the top of the News Feed notifying people that no winner had been decided until a victor was declared by news outlets.

“This is shaping up to be a very unique election,” Guy Rosen, vice president for integrity at Facebook, said in a call with reporters on Wednesday.

Facebook is doing more to safeguard its platform after introducing measures to reduce election misinformation and interference on its site just last month. At the time, Facebook said it planned to ban new political ads for a contained period — the week before Election Day — and would act swiftly against posts that tried to dissuade people from voting. Zuckerberg also said Facebook would not make any other changes until there was an official election result.


But the additional moves underscore the sense of emergency about the election, as the level of contentiousness has risen between Trump and his opponent, Joe Biden. On Tuesday, to help blunt further political turmoil, Facebook also said it would remove any group, page or Instagram account that openly identified with QAnon, the pro-Trump conspiracy movement.

For years, Facebook has been striving to avoid another 2016 election fiasco, when it was used by Russian operatives to spread disinformation and to destabilize the American electorate. Zuckerberg has since spent billions of dollars to hire new employees for the company’s “integrity” and security divisions, who identify and clamp down on interference. He has said the amount of money spent on securing Facebook exceeded its entire revenue of roughly $5.1 billion during its first year as a public company in 2012.

“We believe that we have done more than any other company over the past four years to help secure the integrity of elections,” Rosen said.

Yet how successful the efforts have been is questionable. The company continues to find and take down foreign interference campaigns, including three Russian disinformation networks as recently as two weeks ago.

Domestic misinformation has also mushroomed, as Facebook has said it will not police speech from politicians and other leading figures for truthfulness. Zuckerberg, who supports unfettered speech, has not wavered from that position as Trump has posted falsehoods and misleading comments on the site.

For next month’s election, Facebook has gamed out almost 80 scenarios — what technology and security workers call “red teaming” exercises — to figure out what could go wrong and to protect against the situations. It also updated its policies to outlaw certain types of statements and threats from elected officials, capped by last month’s sweeping set of changes.

But after weeks of Trump declining to say he would accept the election’s outcome, while also directing his supporters to “watch” the polls, Facebook decided to ramp up protective measures.

Asked why the company was acting now, Facebook executives said they were “continuing to evaluate and plan for different scenarios” with the election.

Representatives from the Trump and Biden campaigns did not immediately respond to requests for comment.

Vanita Gupta, president and chief executive of the Leadership Conference on Civil and Human Rights, said Facebook’s moves were “important steps” to “combat disinformation and the premature calling of election results before every vote is counted.”

The open-ended ban on political advertising is especially significant, coming after Facebook resisted calls to remove the ads for months. Last month, the company had said it would stop accepting only new political ads in the week before Election Day, so existing political ads would continue circulating. New political ads could have resumed running after Election Day.

But Facebook lags other social media companies in banning political ads. Jack Dorsey, Twitter’s chief executive, banned all political ads from the service a year ago because, he said, they could rapidly spread misinformation and had “significant ramifications that today’s democratic infrastructure may not be prepared to handle.” Last month, Google said it, too, would ban all political and issue ads after Election Day.

Zuckerberg has said that ads give less well-known politicians the ability to promote themselves, and that eliminating those ads could hurt their chances at broadening their support base online.

Facebook also said it would rely on a mix of news outlets, including Reuters and The Associated Press, to determine whether a candidate had secured the presidency. Until those news organizations called the race, Facebook said, it would place notifications in the News Feed to say no candidate had won. That buttresses what the company had said it would do last month, when it announced that it would attach labels to posts redirecting users to Reuters if Trump or his supporters falsely claimed an early victory.

To tamp down on potential intimidation at ballot boxes, Facebook also plans to remove posts that call for people to engage in poll watching “when those calls use militarized language or suggest that the goal is to intimidate, exert control, or display power over election officials or voters.”

The company said that it wouldn’t shy away from eliminating more posts as the election approaches. On Tuesday, it took down a post from Trump where he falsely claimed the flu was more deadly than the coronavirus.

“I want to underscore that we remove this content regardless of who posts it,” said Monika Bickert, head of global policy management at Facebook. “That includes the president.”

©2020 New York Times News Service
