Facebook has announced significant changes to its advertising and misinformation policies, saying it will stop running political ads in the United States after polls close on 3 November, for an undetermined period.
The changes, announced on Wednesday, come in an effort to “protect the integrity” of the upcoming election “by fighting foreign interference, misinformation and voter suppression”, the company said in a blogpost.
Facebook’s chief executive officer, Mark Zuckerberg, had previously defended the controversial decision not to factcheck political advertising on the platform, but in recent weeks Facebook has begun to remove political ads that feature dangerous and misleading claims.
In early September, the company pledged to stop running new political ads one week before 3 November, the day of the United States elections, to prevent last-minute misinformation. Now it will also disallow political advertising entirely following election day “to reduce opportunities for confusion or abuse”.
In other words, Facebook will not allow new political advertisements starting one week before 3 November, and immediately after polls close it will stop running all political advertisements indefinitely. The company did not say if or when political advertising would return.
The new policies mark important progress toward protecting elections, said Vanita Gupta, the president and chief executive officer of the Leadership Conference on Civil and Human Rights, a coalition of dozens of nonprofits and human rights groups advocating for democracy.
“We are seeing unprecedented attacks on legitimate, reliable and secure voting methods designed to delegitimize the election,” Gupta said. “These are important steps for Facebook to take to combat disinformation and the premature calling of election results before every vote is counted.”
Others said the change is too little, too late. Senator Elizabeth Warren called the changes “performative”. The internet freedom group Fight for the Future said in a tweet the change “isn’t going to fix the problem at all”. The group noted that Facebook’s recent decision to allow content from private groups to appear in newsfeeds will increase misinformation and negate any positive changes that come from an advertising ban.
“Facebook is banning political ads but at the same time they’re tweaking their algorithm to go into overdrive recruiting people into groups where they’ll be spoon-fed manipulation and misinformation,” Fight for the Future said.
Facebook is seeking to avoid another political disaster after the platform was found to have been used by Russian operatives in 2016 to manipulate the United States elections.
Since then, Facebook has hired thousands of people working on safety and security surrounding elections and has worked on more than 200 elections around the globe, “learning from each” and making “substantial progress”, the company said.
Executives at Facebook, including Zuckerberg, reportedly became increasingly alarmed at language from Donald Trump suggesting the president would not participate in a peaceful transfer of power. Trump has also been accused of encouraging violence when he told white supremacists to “stand back and stand by” and encouraged supporters to “go to the polls” and “watch very carefully” at the first presidential debate.
The company also said it will be removing calls for people to engage in poll watching that use “militarized language” or suggest the goal is to intimidate voters or election officials.
Zuckerberg has previously expressed concern about challenges posed by the surge in mail-in ballots this year due to the pandemic.
“I’m also worried that with our nation so divided and election results potentially taking days or even weeks to be finalized, there could be an increased risk of civil unrest across the country,” he said.
Facebook said it would respond to candidates or parties making premature claims of victory, before races have been called by major media outlets, by adding labels and notifications about the state of the race.