Facebook’s Ex-Security Chief Sounds Off on 2018 Midterms
Former Facebook Chief Security Officer Alex Stamos has warned on the eve of the midterm elections that “very little has been done” to protect the United States during the two years since Russian agents used online misinformation to influence the 2016 presidential race.
“We’ve had two years since the main part of the Russian attack against the 2016 election, and very little has been done as a country, as a government, to protect ourselves,” Stamos told FRONTLINE. “We have signaled to the rest of the world that interfering in our elections is something that we won’t really punish or react to.”
Stamos was interviewed by FRONTLINE filmmaker James Jacoby for the upcoming documentary The Facebook Dilemma. Because of the timeliness of Stamos’ comments, FRONTLINE is publishing an excerpt of the interview in advance of the film’s broadcast.
Stamos said that technology companies, including Facebook, have altered their policies and dedicated more employees to the problem, while the U.S. government has created an FBI task force on foreign influence. But he said those measures were not enough to deter foreign adversaries from attempting to interfere with elections.
“You’re not talking about techniques that are incredibly difficult to recreate, so one of my big fears is that we’re going to see other U.S. adversaries — Iran, North Korea, China — jump into the information warfare space in 2018, and especially in 2020,” Stamos said.
Stamos stepped down as Facebook’s security chief in August. He was in charge of security on the platform during the 2016 elections and helped direct Facebook’s response to attempts to use misinformation on the platform to manipulate voters. He is now teaching and conducting research at Stanford University.
Stamos called for new standards that would enable the public to see all ads published by a campaign or political action committee in one place. He also said rules are needed that would prevent targeting political ads at the individual level, in essence creating personal political messaging crafted for a single voter.
Stamos cautioned that placing the entire burden of monitoring and controlling political speech onto technology companies risked creating even bigger problems than it would solve.
“These are very, very powerful corporations. They do not have any kind of traditional democratic accountability,” he said. “And while I personally know a lot of people making these decisions, if we set the norms that these companies need to decide who does and does not have a voice online, eventually that is going to go to a very dark place.”
Since the 2016 election, Facebook has set up new rules around online political advertising as well as an ad archive. The company has also doubled the number of employees working on safety and security, from 10,000 to 20,000, said Nathaniel Gleicher, Facebook’s head of cybersecurity policy, in a separate interview with FRONTLINE.
“We have been able to move in the case of safety and security to being proactive, to getting ahead of threats, to taking down bad actors, to finding more bad content,” Facebook’s Vice President of Product Management Guy Rosen told FRONTLINE in an interview. “And this is a huge investment.”
Facebook CEO Mark Zuckerberg wrote in a post to Facebook last month that the company has launched systems to block fake accounts and taken down more than one billion of them in a period of six months.
In 2016, Facebook was ready for conventional cyber attacks and already had systems in place to detect and remove child pornography, nudity and terrorism-related material. But the site was caught unawares when foreign entities started using fake accounts to sow discord and disseminate misinformation, he wrote.
Now, the company is running drills on how to respond if foreign-run pages pushing political content were to surge suddenly, and has established a “war room” at its Menlo Park headquarters.
“Today, Facebook is better prepared for these kinds of attacks,” Zuckerberg wrote.
Stamos said Facebook has moved to address threats, but that there is no guaranteed defense.
“I think Facebook has taken reasonable steps based upon what happened in 2016,” he said. “There’s two issues. One, we are always going to be vulnerable to some type of disinformation as long as we live in a free society…The other issue is, we’re not really sure what’s going to happen in 2018 to 2020.
“So while everybody’s been focused on the exact Russian activity in 2016, the goals of the Russians have changed, and I suspect the mixture of countries that are going to get involved has actually broadened,” he said. “If you’re talking about a bunch of different adversaries all with different goals, we might see very different techniques to manipulate the election.”