Alex Stamos spent three years as Facebook’s chief security officer. He oversaw a counterterrorism team that, he says, prevented multiple terrorist attacks without taking public credit.
He also says that this approach to communications may have added to the criticism of the company following the Russian interference scandal in 2016.
Let’s not create a communications moment where we’re making a controversy ourselves…
Alex Stamos: When I started, the number one content safety issue on Facebook was not disinformation. It was that ISIS, unlike some of their predecessors, was digitally native. They had young millennials who often lived in western countries — people who we called “Jihobbyists” — who would spend all day creating and spreading content on social media to recruit people to come fight for them in Syria, or trying to celebrate attacks, or in some cases, threatening the lives of service members and their family members. And so that was kind of the big thing. And it struck me as we were dealing with this problem, how far over our skis we were. In that, we built a dedicated counterterrorism investigations team. We caught a number of terrorist plots. There have been several terrorist attacks that have been stopped where the FBI, or some other law enforcement agency, takes credit in a press release, but it was really Facebook. It was really our investigations team that found it and turned it over to law enforcement.
…Because to admit that bad things are happening was just so far out of how communications work at these companies. That was kind of shocking to me. And then that really hit the fan with the Russia stuff, obviously, because we knew about the GRU activity in 2016. We had turned it over to the government. That was kind of the standard thing. That’s how you handle it. And looking back, clearly if we had come out and said, “These are the kinds of things we’re seeing” in 2016, it would have been incredibly politically controversial, but it also would have massively inoculated the company against what ended up happening in 2017 and 2018. And so there was a lot of disagreement on that. The famous example is our team wrote a white paper about what we knew about GRU activity that we released in the spring of 2017. And there was a big back and forth on whether we would name Russia or not. And at the time, the policy team at Facebook was trying to live with the reality of a Trump presidency and did not want to be pulled into this. The term that’s bandied about inside of Facebook a lot is, “let’s not break into jail.”
Laurie Segall: Explain that. What do you mean?
Alex Stamos: That like, let’s not create a communications moment where we’re making a controversy ourselves, right? Now our argument on the security team was, we’re already deep into this controversy…
There’s no way to avoid it, so we might as well just be honest. And so we ended up publishing what we knew, but the word Russia was removed, which was really controversial. …The compromise was a footnote in which we said that the Director of National Intelligence report was compatible with ours, which is us saying, “Yes, it’s Russia.” And then privately, when I briefed Congress on that report, I told them, “Yeah. It’s Russia. We know it’s Russia.” But those are the kinds of situations in which the world’s really changed. The expectation, at that time, was the US government is run by competent people who are good actors. If we give them this information, they will eventually release it in a way that’s appropriate. And in a situation where you no longer trust the administration, or the administration may be tied into some of this activity, it blows away all the assumptions of the companies. And so now they’re becoming much more independent.