The world’s biggest social network is working out what steps to take should President Trump use its platform to dispute the vote.
By Mike Isaac and Sheera Frenkel
New York Times, August 21, 2020
SAN FRANCISCO — Facebook spent years preparing to ward off any tampering on its site ahead of November’s presidential election. Now the social network is getting ready in case President Trump interferes once the vote is over.
Employees at the Silicon Valley company are laying out contingency plans and walking through postelection scenarios that include attempts by Mr. Trump or his campaign to use the platform to delegitimize the results, people with knowledge of Facebook’s plans said.
Facebook is preparing steps to take should Mr. Trump wrongly claim on the site that he won another four-year term, said the people, who spoke on the condition of anonymity. Facebook is also working through how it might act if Mr. Trump tries to invalidate the results by declaring that the Postal Service lost mail-in ballots or that other groups meddled with the vote, the people said.
Mark Zuckerberg, Facebook’s chief executive, and some of his lieutenants have started holding daily meetings about minimizing how the platform can be used to dispute the election, the people said. They have discussed a “kill switch” to shut off political advertising after Election Day since the ads, which Facebook does not police for truthfulness, could be used to spread misinformation, the people said.
The preparations underscore how rising concerns over the integrity of the November election have reached social media companies, whose sites can be used to amplify lies, conspiracy theories and inflammatory messages. YouTube and Twitter have also discussed plans for action if the postelection period becomes complicated, according to disinformation and political researchers who have advised the firms.
The tech companies have spent the past few years working to avoid a repeat of the 2016 election, when Russian operatives used Facebook, Twitter and YouTube to inflame the American electorate with divisive messages. While the firms have since clamped down on foreign meddling, they are reckoning with a surge of domestic interference, such as from the right-wing conspiracy group QAnon and Mr. Trump himself.
In recent weeks, Mr. Trump, who uses social media as a megaphone, has sharpened his comments about the election. He has questioned the legitimacy of mail-in voting, suggested that people’s mail-in ballots would not be counted and avoided answering whether he would step down if he lost.
Alex Stamos, director of Stanford University’s Internet Observatory and a former Facebook executive, said Facebook, Twitter and YouTube faced a singular situation where they “have to potentially treat the president as a bad actor” who could undermine the democratic process.
“We don’t have experience with that in the United States,” Mr. Stamos added.
Facebook may be in an especially difficult position because Mr. Zuckerberg has said the social network stands for free speech. Unlike Twitter, which has flagged Mr. Trump’s tweets for being factually inaccurate and glorifying violence, Facebook has said that politicians’ posts are newsworthy and that the public has the right to see them. Taking any action on posts from Mr. Trump or his campaign after the vote could open Facebook up to accusations of censorship and anticonservative bias.
In an interview with The New York Times this month, Mr. Zuckerberg said of the election that people should be “ready for the fact that there’s a high likelihood that it takes days or weeks to count this — and there’s nothing wrong or illegitimate about that.”
A spokesman for Facebook declined to comment on its postelection strategy. “We continue to plan for a range of scenarios to make sure we are prepared for the upcoming election,” he said.
Judd Deere, a White House spokesman, said, “President Trump will continue to work to ensure the security and integrity of our elections.”
Google, which owns YouTube, confirmed that it was holding conversations on postelection strategy but declined to elaborate. Jessica Herrera-Flanigan, Twitter’s vice president of public policy, said the company was evolving its policies to “better identify, understand and mitigate threats to the public conversation, both before or after an election.”
Facebook had initially focused on the run-up to the election — the period when, in 2016, most of the Russian meddling took place on its site. The company mapped out almost 80 scenarios, many of which looked at what might go wrong on its platform before Americans voted, the people with knowledge of the discussions said.
Facebook examined what it would do, for instance, if hackers backed by a nation-state leaked documents online, or if a nation-state unleashed a widespread disinformation campaign at the last minute to dissuade Americans from going to the polls, one employee said.
To bolster the effort, Facebook invited those in government, think tanks and academia to participate and conduct exercises around the hypothetical election situations.
An idea that came up during one exercise — that Facebook label posts from state media so users know they are reading government-sponsored content — was put into effect in June, said Graham Brookie, director of the Atlantic Council’s Digital Forensic Research Lab, who joined the session.
“We can see that their policy decisions are being affected by these exercises,” he said.
But Facebook was less decisive on other issues. If a post suggested that mail-in voting was broken, or encouraged people to send in multiple copies of their mail-in ballots, the company would not remove the messages if they were framed as a suggestion or a question, one person who advised the company said. Under Facebook’s rules, it takes down only voting-related posts that are statements with obviously false and misleading information.
In recent months, Facebook turned more to postelection planning. That shift accelerated this month when Mr. Trump said more on the issue, two Facebook employees said.
On Aug. 3, Mr. Trump questioned whether the Democratic primary in New York’s 12th Congressional District should be rerun because of long delays in counting mail-in ballots.
“Nobody knows what’s happening with the ballots and the lost ballots and the fraudulent ballots, I guess,” he said.
The next day, Mr. Trump broadened his attack, falsely stating that mail-in ballots lead to more voter fraud nationwide. “Mail ballots are very dangerous for this country because of cheaters,” he said. “They go collect them. They are fraudulent in many cases.”
Mr. Trump’s comments alarmed Facebook employees who work on protecting the site during the U.S. election. On the group’s internal chat channels, many wondered whether Mr. Trump would launch even more attacks against mail-in voting, one employee who saw the messages said. Some asked whether the president was violating Facebook’s rules against disenfranchising voters.
Those questions were ultimately sent to Mr. Zuckerberg, as well as top executives including Joel Kaplan, the global head of public policy, the employee said.
In a staff meeting later that week, Mr. Zuckerberg told employees that if political figures or commentators tried declaring victory in an election early, Facebook would consider adding a label to their posts explaining that the results were not final. Of Mr. Trump, Mr. Zuckerberg said the company was “in unprecedented territory with the president saying some of the things that he’s saying that I find quite troubling.” The meeting was reported earlier by BuzzFeed News.
Since then, executives have discussed the “kill switch” for political advertising, according to two employees, which would turn off political ads after Nov. 3 if the election’s outcome was not immediately clear or if Mr. Trump disputed the results.
The discussions remain fluid, and it is unclear if Facebook will follow through with the plan, three people close to the talks said.
In a call with reporters this month, Facebook executives said they had removed more than 110,000 pieces of content between March and July that violated the company’s election-related policies. They also said there was a lot about the election that they didn’t know.
“In this fast-changing environment, we are always sort of ‘red teaming’ and working with partners to understand what are the next risks?” said Guy Rosen, vice president of integrity at Facebook. “What are the different kinds of things that may go wrong?”