Google, Facebook, Twitter and other major tech companies met with U.S. government officials on Wednesday to discuss their plans to counter disinformation on social media in the run-up to the November election.
In a joint statement, the companies said that this was the latest in a series of meetings "on what we're seeing on our respective platforms and what we expect to see in the coming months." The statement said Wednesday's meeting focused on the upcoming political conventions and "scenario planning relating to election results."
Tech companies are under enormous pressure from government officials to prevent their platforms from being used by foreign actors and others to disrupt the 2020 election, as occurred in 2016.
In a statement earlier this month, the director of the National Counterintelligence and Security Center, William Evanina, said that Russia, China, Iran and other "foreign actors" are attempting to "sway U.S. voters' preferences and perspectives, shift U.S. policies, increase discord in the United States, and undermine the American people's confidence in our democratic process."
In an interview with NPR's Morning Edition, Facebook's head of cybersecurity policy, Nathaniel Gleicher, said the company is working harder than ever to combat such efforts, with the goal of making sure voters receive accurate information.
"I think I actually want to make the act of trying to tell a lie, or misleading people, more difficult," Gleicher said.
In March, Facebook announced that it removed 49 Facebook accounts, 69 pages and 85 Instagram accounts that were engaging in "foreign interference." More recently, the social network has removed about two dozen other accounts, some linked to Russia and Iran.
"We've shared information with independent researchers and then we've publicized context about it so that users can see what's happening on the platform," Gleicher said.
Facebook says it has a team of fact-checkers scouring the platform for misleading or false content, but under the company's rules, political advertising is exempt from that scrutiny.
Gleicher defended this hands-off approach to political ads, which has allowed President Trump to spread falsehoods to millions of users about his presumptive Democratic opponent, Joe Biden.
"So we know right now in this election, there are massive debates about all of the ways that people vote and people engage. We want to make sure that people hear what elected officials are saying and what they think about voting," Gleicher told NPR.
"But frankly, I think that information is an important factor in how some people will choose to vote in the fall. And so we want to make sure that information is out there and people can see it, warts and all," he said.
Facebook has, however, on occasion overridden its policy. In June, for example, it removed Trump campaign advertisements containing an upside-down red triangle symbol, which had been used to identify political prisoners in German concentration camps during the Nazi regime.
Both Twitter and Facebook have also removed posts shared by Trump that contained false and misleading information related to the coronavirus pandemic.
Social media companies haven't always been successful at these efforts. An investigation by The Guardian found that groups promoting the conspiracy movement QAnon were rapidly growing worldwide on Facebook. The newspaper identified 170 QAnon groups, pages and accounts across Facebook and Instagram with more than 4.5 million aggregate followers.
Gleicher said that Facebook has been enforcing its guidelines against QAnon consistently. When asked about The Guardian's findings, he acknowledged that staying on top of conspiracy campaigns has been challenging.
"This is a place where I think we have work to do," Gleicher said. "I think that the boundaries around what constitutes a QAnon site or not are pretty blurry and that becomes a challenge in all of this. It's why we're looking at this and we're exploring some additional steps that we can take."
Some have raised the possibility of Trump rejecting the results of the November election should he lose. In July, Trump declined in an interview on Fox News to say whether he would accept the outcome.
What would Facebook do if Trump falsely said on the platform that he was the winner of the presidential election?
Gleicher dodged the question, declining to say directly whether Facebook would take action against such a post. Instead, he said the final result is widely expected not to be settled on election night, and that Facebook officials are gaming out how the company might handle chaos that emerges during a prolonged vote count.
"We're not going to know the results immediately. There's going to be a period of uncertainty as counting is still happening. That's something that we've been particularly focused on," Gleicher said.
"In the wake of this period where votes are coming in and we don't know the results, we can expect candidates to be making claims about who's won," Gleicher said. "We could expect claims about whether the results were fair or not. These are things that we've seen before around elections but I think are going to be particularly critical this time."
"How do you accurately report on the claims that are being made," he continued, "but also provide the context to make sure that people understand and can weigh and judge these things?"
Gleicher also touted the company's goal of helping register 4 million voters through "voting information centers" on Facebook and Instagram, which provide up-to-date information on how to register, how to obtain an absentee or mail-in ballot, and where to vote.
"The reason for this is voting is fundamentally about voice, and that's critical to our efforts and where we are as a company," he said. "It's the best way to hold our leaders accountable and to address important issues."
He added: "As someone who works on security, ensuring that voters have accurate information about an election is critical to protecting that election. Disinformation flourishes in uncertainty, and we've seen people take advantage of that uncertainty to drive influence operations and other types of deceptive campaigns."
Editor's note: Facebook is among NPR's sponsors.
STEVE INSKEEP, HOST:
Will voters who rely on Facebook be any better informed this fall than in 2016? We've been talking with a man who says they can be. Nathaniel Gleicher is Facebook's head of cybersecurity policy, and he has a goal.
NATHANIEL GLEICHER: I think I actually want to make the act of trying to tell a lie or misleading people more difficult.
INSKEEP: Facebook provides financial support to NPR. We cover it the same as any company, which includes criticism of Facebook. It faces demands for congressional regulation. It's also under pressure for false statements and conspiracy theories on the platform. Russian disinformation and domestic deception spread on Facebook in the last election.
Nathaniel Gleicher spoke with us as the company announced a new effort. Users will now see voting information centers, which provide what the name says.
GLEICHER: The voting information centers are going to include information about how to register to vote, how to check your registration status, how to vote by mail, how you request an absentee or a mail-in ballot. There will be election alerts from state election authorities about changes to the voting process. And there'll be facts about voting to help prevent or address any confusion about the process itself.
INSKEEP: This all sounds really useful, but are you concerned at all that what you have here is a small boat of accurate information in an entire sea of disinformation?
GLEICHER: First, I think it's important to know that this is only one piece of our election protection strategy. We have core teams that are hunting for deception campaigns. When they find them, we announce them publicly and we remove them from the platform. We also have automated systems that help tackle fake accounts and other types of deceptive behaviors and partnerships with fact-checking organizations to ensure that there's accurate information and context on posts.
But a piece of the puzzle is giving people accurate information. So it will be one piece, but it will be prominently displayed, and we'll be layering it on top and into News Feed to make sure people can see it as they're engaging with the election.
INSKEEP: I want to grant (ph) what you've said. You have made announcements of pages you've taken down, groups that you've banned. And yet, ProPublica in the last few days has published an investigation having to do specifically with false claims about voting, about mail-in voting - 350,000 views for a video on Facebook saying that if you mail in your ballot, Barack Obama is going to burn it - obviously, a false claim. A false claim that you can only vote by mail in California - hundreds of thousands of people saw that false claim. Now, Facebook did delete them, ProPublica says, but only after ProPublica called them out. Why do you think that happened?
GLEICHER: So we work with third-party organizations to identify any type of voter suppression or voter interference. A claim that misrepresents how you vote, where you vote or when you vote violates our community standards, and we will take that down.
INSKEEP: But in these cases, the posts were up and were seen by hundreds of thousands of people and had to be reported in the media before you found out about them.
GLEICHER: I'd like to get to every single one of these posts as fast as possible. Large amounts of this we find proactively and remove ourselves. We also work with state elections officials because often, for this type of thing, they might see it first, particularly if it's localized. And so what I've found is that you don't want to rely on just one tool.
INSKEEP: You know, I suppose you had a kind of trial run of this recently when the president of the United States made a false claim that was carried on Facebook - a false claim about mail-in balloting leading to a corrupt election. Of course, millions of people have voted by mail. There's no evidence of that whatsoever. Facebook did add a label with more information about voting, which sounds approximately like what you're talking about, but did not call the claim false. How come?
GLEICHER: So we want to make sure that people hear what elected officials are saying and what they think about voting. Quite frankly, I think that information is an important factor in how some people will choose to vote in the fall. And so we want to make sure that information is out there and people can see it, warts and all.
INSKEEP: If the president just says something false and nowhere do you say it's false, where's the context for that?
GLEICHER: I completely agree. I think accurate context is very important. That's why having that link right there provides as much accurate context as we can provide so that people can see, here's what the experts are saying, here's how the process works. And they can weigh that against what elected officials are saying.
INSKEEP: I want to ask about a couple of political movements that are based on false conspiracy theories, one of them being QAnon, which, roughly speaking, is the idea of a deep state of child predators who are attacking the president - no evidence for that whatsoever. But there is a Republican QAnon believer who won a congressional primary in Georgia the other day, and it appears to be a movement that is growing a lot on Facebook. Why would that be?
GLEICHER: Enforcing against QAnon is something that we've been doing pretty consistently, actually. Just last week, we removed a large group with QAnon affiliations for violating our content policies. We've also removed networks of accounts for violating our policies against coordinated inauthentic behavior. We have teams that are assessing sort of our policies against QAnon as they currently exist and are exploring additional actions that we can take.
INSKEEP: And yet, The Guardian was able to find the other day 170 groups, pages and accounts on Facebook and Instagram with 4.5 million followers and even adds - this is a quote from The Guardian - "Facebook's recommendation algorithm has continued to promote QAnon groups to users, and some groups have experienced explosive growth." Do you deny that?
GLEICHER: This is exactly the type of adversarial behavior we'd expect to see. We consistently see actors on the platform as we put tools and controls in place looking for ways to get around them.
INSKEEP: I'm looking for a yes or no there. The question is, are your algorithms actually promoting QAnon sites in spite of your best efforts?
GLEICHER: I think I would want to talk about specific instances to weigh in on that. It really depends on the particular site that we're talking about and the post that they're making.
INSKEEP: So you're not sure if that's true or not, with 170 different groups cited by The Guardian.
GLEICHER: This is a place where I think we have work to do. I think that the boundaries around what constitutes a QAnon site or not are pretty blurry, and that becomes a challenge in all of this. It's why we're looking at this and we're exploring some additional steps that we can take.
INSKEEP: There's another conspiracy political movement, the boogaloo movement, which wants a new civil war. The Tech Transparency Project has been studying them, identified 110 new Facebook boogaloo groups created just since June 30, which is when you said you'd crack down. And it was a well-intentioned effort, I'm sure, to crack down, but 110 new groups since then. And the Project says that the boogaloo adherents just avoid using the word boogaloo, and so they seem to escape your search engines. Is it that easy?
GLEICHER: We don't rely on any one particular keyword for exactly the reason you said. We've seen these actors constantly change their terms. We knew and expected that sites like this would come back as we removed them. And what I can tell you is we've been taking aggressive action against not just sites discussing boogaloo or pages discussing boogaloo, but pages and groups where we see links between those discussions and intent or efforts to engage in physical activity or push further towards real-world harm. And what that does over time is it drives these actors to spend more and more of their effort evading the controls we're putting in place as opposed to coordinating for the ends that they're seeking.
INSKEEP: I'm thinking about the fact that a very large part of the 2020 election is going to play out on your platforms. That seems like a basic reality. And I'd like to know your level of confidence. Are you willing to say with some confidence that most voters who rely on Facebook will be mostly well-informed if they do so? Are you willing to say that?
GLEICHER: I think one of the things we've learned is that protecting an election is a whole-of-society challenge. And what I can tell you is that we have teams that have been focused on finding and exposing these types of campaigns for years. And I think that the tactics we saw in 2016 will be much less effective today.
INSKEEP: Do you think that most voters, if they rely on Facebook, will be mostly well-informed?
GLEICHER: I think people are going to read on our platform based on what they're looking for and the people that they're talking to. What I can tell you is every single person in the United States is going to have at the very top of their feed and on posts that they read about voting accurate information about how to vote, how the process works and what the experts in state government and elections officials are saying about how the processes function.
INSKEEP: I think I hear you saying that you believe everybody is going to have an option to find accurate information, but it's going to be up to them to find it.
GLEICHER: Well, we're going to put that accurate information in front of people as many ways as we can.
INSKEEP: Nathaniel Gleicher is Facebook's head of cybersecurity policy. Transcript provided by NPR, Copyright NPR.