Episode 13: Ex-Facebook Security Chief Speaks Out on Russia & Spies in Silicon Valley

First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeartRadio.

Alex Stamos: I expect that every major US tech company has at least several people that have been turned by at least China, maybe Russia, probably Israel, and a couple other US allies.

Laurie Segall: Did you ever uncover anybody that had been turned?

Alex Stamos: I’m not gonna comment on that. 

Laurie Segall: Can you comment a little bit?

Alex Stamos: What I'll say is there's, there's a lot of weird things that happen at the companies and it's very difficult to figure out why they happened.

Imagine being at the center of an event so big that it changed the way all of us think about social media and democracy. 

Alex Stamos was the chief security officer at Facebook for 3 years. I met him two months into the job. 

His job started with fighting ISIS and ended with an exploration into Russia, disinformation, and society at large questioning the responsibility of technology in our democracy.

But before we get into that, I want you to get to know Alex for a second. He has spent his career dealing with the darkest the Internet has to offer. At the same time, he has been learning how to balance family, his career, and this larger sense of responsibility. Oftentimes he's felt tension between those characteristics that define him as a person.

Can all three co-exist peacefully? What do we sacrifice to do the work we believe in? And is there a point in our lives where we realize that that work isn't actually possible in the role we're in, even though optics may say otherwise?

These are all questions Alex navigated at his job at Facebook. Now that his journey there is over, here are some of his takeaways from the other side.

I’m Laurie Segall and this is First Contact.

Laurie Segall: So Alex, do you remember our first contact?

Alex Stamos: I've been… 'cause I've been listening to your podcast so I've been trying to figure it out, and embarrassingly I can't remember. Was it at South by Southwest?

Laurie Segall: Nope.

Alex Stamos: No. Okay.

Laurie Segall: It was not. I’ll do, I’ll do you one better. It was in Las Vegas.

Alex Stamos: Oh okay.

Laurie Segall: It was right when you started at Facebook.

Alex Stamos: Mm-hmm (affirmative).

Laurie Segall: And it was when Facebook was, throwing parties for security-

Alex Stamos: Oh yes.

Laurie Segall: At Black Hat, which is this hacker conference, like a security conference in Las Vegas. So I remember it was like a really fancy party and I think there was like a pool and like lots of like-

Alex Stamos: Yes.

Laurie Segall: Nerdy people. And like someone ushered you over to me, uh, me the journalist ’cause I was with CNN at the time. And they were like, “You have to meet Alex Stamos. He’s the new head of security at Facebook.” And there were like little cabanas like with like people inside-

Alex Stamos: Yeah.

Laurie Segall: Having conversations. And I remember that was my first contact with you. And you were new on the job, right? It must have been like, what? 2015?

Alex Stamos: Yeah. So that would have been August 2015. I, I started in June of 2015 at Facebook. Uhm.

Laurie Segall: Yeah.

Alex Stamos: And that was the last Facebook party we threw there, 'cause actually I killed it after that one.

Laurie Segall: I was about to say. Now Facebook is putting a lot of money into security-

Alex Stamos: Yeah.

Laurie Segall: And election integrity, but I don't think any of it is going to parties in Las Vegas for Black Hat.

Alex Stamos: No. I, it’s actually, it was a bit of a controversial call in the team, but after that party, I, I got rid of it, because it … There was just a lot of behavior. I mean the problem is, is if you hold a big party in Vegas, you incentivize people to act in a way that’s not very professional.

Laurie Segall: Right.

Alex Stamos: And there's some things that happened there that I didn't think were compatible with the, uh, kind of, inclusive brand we were trying to build at Facebook security.

Laurie Segall: Right. And, and for folks who don't know you, I mean you have a really strong reputation in the security community. And you're also known for speaking your mind, which I think probably makes it very interesting when you kind of wade into big corporations. And before Facebook you were at Yahoo.

Alex Stamos: That’s right.

Laurie Segall: Um, where there was a whole big security breach, and you have testified before. And so you were also at Facebook, and it was your team who discovered Russian influence. So you just kind of step into it, right? Like you're in these extraordinary historic moments when it comes to security.

Alex Stamos: Yeah. A friend of mine called me the Forrest Gump of InfoSec. That wherever I go, there happens to be some kind of, uh, interesting geopolitical disaster unfolding.

Alex Stamos: I’m not sure it was totally a compliment by him, but it’s not completely inaccurate.

Laurie Segall: You must have a high tolerance for stressful situations.

Alex Stamos: Well, apparently not because I got out.

Laurie Segall: Right. That's very fair. Um, and actually speaking of how you got out, that was August 2015. Let's fast forward to our, I would say, almost last contact, which was August 2018.

Alex Stamos: Mm-hmm.

Laurie Segall: And it was, it was around your last day at Facebook.

Alex Stamos: Yeah.

Laurie Segall: And I remember I was doing a documentary on Facebook at 15, and you had made the decision, as the Chief Security Officer, to leave the company. And, I just remember, how do I say this? Like we had known each other long enough that you kind of let us in, and we did some interviews around it. Um, and it was a really emotional moment. You had decided to leave the company, amidst everything going on. And my last contact with you, really, like, on camera, was that last day where you were like heading in for that last moment.

Alex Stamos: Yeah. It was emotional. I, you know, one of the interesting things about, Silicon Valley is people really do bring their whole selves to work, but then work also comes home with them and becomes their whole self. And, you know, leaving a place like Facebook is a little bit like you’re dying a bit, right? ‘Cause you, you end up, after a couple years there, that most of your social interactions are with your coworkers, that so much of your identity is tied up in the work you’re doing and so when you quit the job it’s a little bit like being excommunicated from the family. Not just that you’re gonna have to change things around. Now I think, being gone, I’ve, I’ve been very happy to be able to be in a situation where now I can work on some of these problems and step back from the myopic view that you get inside the company.

Laurie Segall: Sure.

Alex Stamos: And so I think now that I’ve had a little more than a year to think about it, that all encompassing feeling that you get from inside the companies is one of the reasons that they’ve failed to react properly in some of these cases, because you end up really living in a bubble. But yeah. It was a very emotional time. And it was also, you know, it’s a little scary to walk away from a, you know, a well compensated corporate job to go take a, an academic gig. Right? When you’re supporting a family. And so, you know, there’s, there’s a little bit of trepidation there, but it, it’s all worked out in the end.

Laurie Segall: Yeah. I remember like sitting on your back porch and being like, “This is a big pay cut you’re taking.” I was like god, Laurie, are you a jerk for like saying that on his last day? But I, I was like going through the numbers or whatever. I was like, I was like this is a very big pay cut.

Alex Stamos: Right.

Laurie Segall: Like as-

Alex Stamos: As you’re like looking up my mortgage in Zillow.

Laurie Segall: No. I didn't look at your mortgage in Zillow. No, I, I didn't do that. But, you know, I think it says a lot about the conviction, and the idea that you weren't leaving to go take another chief security officer job somewhere else, or a security job somewhere else. Like that's a big deal. You know?

Alex Stamos: Yeah. I, you know, I think one of the reasons I was interested in leaving is that there's some really interesting big problems that we're dealing with right now about the relationship between tech and people's lives. The relationship between tech companies and governments. Who's in charge of online speech? How do we respect the rights of individuals while also protecting people from disinformation, bullying, and harassment? Like these are really humongous problems. And most of the problems are being solved inside the companies right now. And they're being solved by thoughtful people, but people who are living within a bit of a bubble, and who are incentivized to think about these problems only in the context of their specific company's position. Right? Their position in the media, their regulatory risk, you know, what is good for their shareholders. And very, very few people have been able to leave that environment and then come outside and kind of talk about the base problems. And so there, I think, was a unique opportunity from a timing perspective to be able to do that. And I found that really attractive. You know, there's just not a lot of ex-CSOs, and certainly not people who have worked at a company that has control over the speech of two billion, two and a half billion people.

Laurie Segall: Yeah.

Alex Stamos: Very few of those kinds of people can step out into a position where they can kind of truthfully talk about what the issues involved are, which are numerous and, from my perspective, much more complicated than how people have generally been talking about it in the media. Which is, it was nice to be able to put myself in a position where I can kind of honestly criticize the company, but then also push back in situations where people are pretending that some of these problems are actually quite simple. Because dealing with them like they’re simple is going to end up in a situation where we, we’re, we’re going to create more problems than we fix.

Laurie Segall: And I want to get into a lot of that stuff.

Alex Stamos: Yeah.

Laurie Segall: You know, for folks who don't know, can you just explain: it was your team that discovered Russian influence. And you were kind of on the front lines of what has now become this larger fight over democracy and social media, and this more fundamental question about the role of tech in society. It was your team that really discovered it.

Alex Stamos: Right. And, and I can't take a lot of credit here. I was very lucky to work with some really incredible people. We had a threat intelligence team whose entire job it was to study what governments were doing on Facebook's platforms to possibly cause harm. One of the core issues going into 2016 is our focus was not on things like disinformation. Our focus was on traditional cyber warfare. And during the run-up to the election, we had people who were dedicated to tracking activity by APT28, AKA Sofacy, AKA Fancy Bear. So this is the group of hackers that work inside of the GRU, the main intelligence directorate of the Kremlin. And the people who were tracking that group saw mysterious activity related to the US election, and we ended up turning that information over to the FBI. Now we know that the FBI and the White House had knowledge of a number of things going on, and some of that didn't make it to the people who really needed to know about it, but, you know, that was kind of our part in the run-up to the election. And then after the election I was part of a couple of rounds of investigation of what is the fake news crisis, and then, is there any evidence of Russian interference, especially within the advertising system on Facebook. And so I did get to participate in those and supervise the team, and spent most of 2017 kind of trying to answer those questions.

Laurie Segall: I mean and you take that home with you too, right? Like I mean-

Alex Stamos: Yeah.

Laurie Segall: And to a degree like, ’cause I, I was at your house as part of this. I mean you had like, in your office are like biographies of like the Russian Kre- I mean like a Putin biography. Didn’t you say you were like waking up in the middle of the night. I, I mean like you, you take that home. Like you take that kind of stuff home with you I can imagine.

Alex Stamos: Yeah. I, you know, one of the interesting things about that kind of job is you're just dealing with the downsides of your products. Right? So in Silicon Valley overall, the feeling inside these companies, the perfect distillation of it, is Steve Jobs' famous keynote where he introduced the iPhone. Right? Like that feeling of limitless possibility. Of technology being good, technology bringing wonder into people's lives. That is the feeling that pervades these companies. And then you've got me, and my team, and a couple of other teams at Facebook who just wallow in misery all day. All right. Like dealing with, not just disinformation, but people trying to attack the platform, people trying to send malware to each other, defraud each other, to sexually abuse children, to perform human trafficking, you know, terrorists who are trying to use the platform to either celebrate or organize terrorist attacks. Like we would just spend all of our time on the most horrible parts of humanity as reflected on the products that our companies build. And so it kind of separates you a little bit from the rest of the company, right? Because the rest of the company is about, "We're making the world open and connected. And we're bringing people together."

Laurie Segall: Right.

Alex Stamos: “And it is only a good thing that more people have used our product.” And, and, and then you’re the person that comes in as a Debbie Downer of like, “Well if another 100 million people use our product, this is how much child exploitation is gonna go up.” Right?

Laurie Segall: Yeah.

Alex Stamos: And everybody's kind of like, "Oh my God, who invited Stamos to this meeting?" I get that a lot. Like I would walk into a room and you could see their faces fall.

Laurie Segall: Right.

Alex Stamos: Of like, “Oh my God, this can’t be good.” Right?

Laurie Segall: It’s like, have you seen that SNL? Uh, there’s like this SNL Debbie Downer-

Alex Stamos: Yes.

Laurie Segall: Famous thing. It’s like you’re the, you’re the person, uh, only like Facebook’s 

Alex Stamos: Yes. Yes.

Laurie Segall: Not really Disneyland anymore.

Alex Stamos: Right. Yes. Feline AIDS is the number one killer-

Laurie Segall: Yeah.

Alex Stamos: Of household cats.

Laurie Segall: Yeah. (laughs)

Alex Stamos: Yeah. That’s exactly what it was like being me.

Laurie Segall: Yeah.

Alex Stamos: Um, and so this cloud follows you, and it does follow you home, right? 'Cause it's just hard to … The thing that really followed me home was the child stuff, right? So, you know, I had two areas of responsibility. One was the traditional information security, right? People trying to break into Facebook, steal data, steal money, that kind of stuff. But then we also had the safety responsibility, and that included a dedicated Child Safety Investigations team, with some incredible, very dedicated, very skilled people that I was very lucky to work with. But working with them, and supporting them, and seeing their output really makes you kind of … It's very hard to do that during the day and then to go home and hug your kids, right?

Laurie Segall: Yeah.

Alex Stamos: To see the incredible depths of depravity and horribleness that happen, um-

Laurie Segall: Yeah.

Alex Stamos: In the rest of the world and then go home and your kids are like, “Hey daddy. How’s it going?” Like it’s just, it’s very hard not to bring that with you. And so that’s the kind of thing that makes you wake up at 3:00 AM and check your phone. It took me about a year to detox, right?

Laurie Segall: Yeah.

Alex Stamos: It took me about a year before I could sleep through the night without worrying that if my phone vibrated, that somebody had died, or there was, you know, an attack that could be disastrous for, for our users.

Laurie Segall: Did your children ever, um … Did they ever sense anything? Did they ever say anything?

Alex Stamos: Um, yeah. I mean I think, I think they’ve sensed the difference since then. I think my son-

Laurie Segall: Mm-hmm.

Alex Stamos: Um, my eldest, who's 12, said something along the lines of, "Dad, you're a lot happier now."

Laurie Segall: That’s good.

Alex Stamos: Uh, which I cried when he said that. Um, you know, it's tough. It's tough to deal with that kind of stuff, but it's also important, and I feel, you know, a little bit like a failure for not being able to do more of it. Right? My hope is I can still have impact, even though I'm not directly putting my hands on those problems now.

Laurie Segall: I remember the one thing you said to me, right before we like left on your last day. And I go back to that because I think it … I don’t think people get to see how human you guys all are.

Alex Stamos: Yeah.

Laurie Segall: Uh, in some of these, in some of these moments. People can yell about Facebook is doing this and that, but like when you’re at your home and you have your children playing with their legos or playing piano and like, and you’re about to walk away from a big job-

Alex Stamos: Yeah.

Laurie Segall: And for this thing that you believe in. And, you know, that, that, I, I can’t, I guess I kind of go back to that because I, I want people to understand what that looked like.

Alex Stamos: Yeah.

Laurie Segall: Uhm. You know, I think people don't understand, like, you know, you have this small, not-scary dog. Which I actually thought was very ironic, that the chief security officer who's protecting, like, billions of people has a not very scary dog.

Alex Stamos: That we have a Golden Doodle.

Laurie Segall: Yeah.

Alex Stamos: Instead of a German Shepherd.

Laurie Segall: Yeah.

Alex Stamos: Yes.

Laurie Segall: I was like, I was like, “That’s, that’s your protection?”

Alex Stamos: There’s-

Laurie Segall: That was funny.

Alex Stamos: There's some kind of metaphor in the way she sits there barking and then is completely ineffective at stopping things. I think one of my critics, uh, will point that out. Yeah.

Laurie Segall: Right. But I remember you saying something about how, um, when the lights went out once or something, your kids thought it was Russia or something like that?

Alex Stamos: Yeah. No, we just had a power outage in our neighborhood. And one of my kids asked, "Dad, is this the Russians?" And I realized I was perhaps bringing a little too much home. You know, 'cause-

Laurie Segall: Yeah.

Alex Stamos: You're never really off, right? So even if I got home at 7:00 PM, the odds of having two conference calls that night were pretty good. And so they would overhear a lot more than I knew. Especially that kind of intense summer of 2017, when we were investigating these issues and had yet to announce it. You know, and we were kind of on the knife's edge of what's our level of responsibility here. What do we tell the world? How do we tell the world? How are they going to react? It was a very tense summer, and we effectively canceled all our summer vacations, and you know, I think the kids definitely noticed that one.

Laurie Segall: And I think you’re speaking about it broadly, but I also think, you know, it sounds like you had wanted to be a little bit more transparent about some of these things and I’m sure there was some internal conflict-

Alex Stamos: Right.

Laurie Segall: For you, as the person who likes to say things out loud and who's kind of, like, a little bit of a bulldog in this. Not being able to say fully what was happening, and this, and that. And I can imagine that summer itself was tense.

Alex Stamos: Yeah. A lot of things have changed in the last couple of years-

Laurie Segall: Yeah.

Alex Stamos: About the expectations of what these companies should do and what they should say publicly. So, when I started the job in 2015, one of the things that struck me was the huge gap in how much proactive policing is done by tech giants and how much the public understands that that’s happening. And, you know, in 2015 the way that, that was really, becoming obvious was through our work fighting ISIS. Right?

Laurie Segall: Mm-hmm.

Alex Stamos: So like when I started, kind of the number one content safety issue on Facebook was not disinformation. It was that ISIS, unlike some of their predecessors, was digitally native. They had young millennials who often lived in Western countries, people who we called jihobbyists, who would spend all day creating and spreading content on social media to recruit people to come fight for them in Syria, or to celebrate attacks, or, you know, in some cases, threaten the lives of service members and their family members and such. And so that was kind of the big thing. And it struck me, as we were dealing with this problem, how far over our skis we were. In that, you know, we built a dedicated counterterrorism investigations team. We caught a number of terrorist attacks. You know, there have been several terrorist attacks that have been stopped where the FBI, or some other law enforcement agency, takes credit in a press release, but it was really Facebook. It was really our investigations team that found it and turned it over to law enforcement. And the fact that we were kind of hiding all of that-

Laurie Segall: Yeah.

Alex Stamos: Because to admit that bad things are happening was just so far out of kind of how communications work at these companies. That was kind of shocking to me. And then that really hit the fan with the Russia stuff obviously, because we knew about the GRU activity in 2016. We had turned it over to the government. That was kind of the standard thing. That's how you handle it. And looking back, clearly if we had come out and said, "These are the kinds of things we're seeing," in 2016, it would have been incredibly politically controversial, but it also would have massively inoculated the company against what ended up happening in 2017 and 2018. And so yes, there were a lot of disagreements on that. Kind of the famous example is our team wrote a white paper about what we knew about GRU activity that we released in the spring of 2017. And there was a big back and forth on whether we would name Russia or not. And at the time, the policy team at Facebook was trying to kind of live with the reality of a Trump presidency and did not want to be pulled into this. Like the term that's bandied about inside of Facebook a lot is, "let's not break into jail."

Laurie Segall: Explain that. What do you mean?

Alex Stamos: Right. That like, let’s not create a, a communications moment where we’re making a controversy ourselves. Right? Now our argument on the security team was we’re already deep into this controversy. Right?

Laurie Segall: Yeah.

Alex Stamos: Like there's no way to avoid it, so we might as well just be honest. And so we ended up publishing what we knew, but the word Russia was removed, which was really controversial. And the compromise we ended up with was a footnote in which we said that the Director of National Intelligence report was compatible with ours, which is us saying, "Yes, it's Russia." And then privately, when I briefed Congress on that report, I told them, "Yeah. It's Russia. We know it's Russia." But, you know, those are the kinds of situations in which the world's really changed. Right, like the expectation at that time was, you know, the US government is run by competent people who are good actors. If we give them this information, they will eventually release it in a way that's appropriate. And in a situation where you no longer trust the administration, or the administration may be tied into some of this activity, it blows away all of, kind of, the old ways the companies operated. And so now they're becoming much more independent-

Laurie Segall: Yeah.

Alex Stamos: Actors who are now independent actors on the geopolitical stage. They are pointing fingers. They are saying, “These countries are doing bad things.”

Laurie Segall: Yeah.

Alex Stamos: And perhaps changing history. And it’s kind of a crazy place to be, but that’s where we’re at now.

We're going to take a quick break to hear from our sponsors, but when we come back, Alex explains why he's got a list of countries he won't go to. Also, if you like what you're hearing, make sure you hit subscribe to First Contact in your podcast app so you don't miss another episode.

Laurie Segall: So there’s life on the outside. And now you’re on the outside, and you kind of have this perspective, how are you feeling now?

Alex Stamos: I'm feeling good, but I, I also … What happens inside the companies is you end up with this view that is completely reactive, that is all about the last emergency-

Laurie Segall: Yeah.

Alex Stamos: And you never have the ability to kind of take a step back and think about how did we get here, right? And, you know, I am sure this is true inside all the big tech companies, inside of Google, inside of Twitter: you're constantly dealing with this media cycle, and this regulatory cycle, and this emergency. And now that I've been out, it's become much clearer to me the kinds of long-term issues that the companies have allowed to exist and that they haven't tackled proactively. Both in their relationships with their users, but especially their relationships with governments. And it feels good to be able to kind of talk about that a little bit more.

Laurie Segall: Yeah. And so, tell me a little bit about what you’re doing here and what kind of things you’re teaching? 

Alex Stamos: Yeah. And so my work here I kind of break into three areas: I've got teaching, research, and then policy work. So on the teaching, our group is teaching two classes. One is an introduction to cybersecurity for non-CS majors. I wanted to call it Hacking for Poets, but the registrar kicked that back. Uh, but it's called the Hack Lab and it's offensive security techniques for lawyers, and MBAs, and people in the international policy master's here. And it's to give people who are going to be in the business world, going to be in government, hands-on experience hacking stuff. And the reason I'm doing that class comes out of an experience I had. Uh, and I can't not sound like a jerk telling this anecdote, so I'm just gonna accept that. But I was in a meeting in the White House.

Laurie Segall: Mm-hmm.

Alex Stamos: Uh, you know, 'cause there was something going on and I was summoned to the White House, along with some other folks, during the Obama administration. And we're having a discussion with people in the National Security Council, and all of the technologists had come from Silicon Valley, and every single person on the other side of the table was a lawyer. And it was almost impossible for us to talk to one another. And so I kind of realized that, you know, there are lots of people who are now in a position of responsibility where they have to do this work, where it is impractical for them to ever have gotten real good hands-on experience in cyber. So is there a way we can bridge that gap? And so that's one of the classes I'm teaching. The other is called Trust and Safety Engineering. And so that's a CS class for computer scientists. It effectively deals with the fact that Stanford keeps on graduating 22-year-olds, mostly guys, who go out and build products and don't understand all the bad things that have happened in the past, right? They might think, "Oh yeah. I'm gonna build this mobile app and you're gonna be able to anonymously send photos to an infinite number of women. What could possibly go wrong?" Right? And you're like, well, here's the list of the 20 things that have gone wrong every time somebody's-

Laurie Segall: Right.

Alex Stamos: Ever built an app that sends photos, right?

Laurie Segall: Right.

Alex Stamos: Um, and so that class, we cover lots of pretty heavy topics, right? So we talk about disinformation. We talk about hate speech, bullying and harassment. We talk about suicide. We have two whole lectures on child sexual abuse, on the different kinds of child sexual abuse. Not just the trading of child pornography, but sextortion, which is a whole class of issues.

Laurie Segall: Yeah.

Alex Stamos: We talk about terrorism and how terrorists use the internet. So it's not an uplifting class.

Laurie Segall: But you basically talk about all the bad things that could happen when you build out technology?

Alex Stamos: Exactly. Yes.

Laurie Segall: This is like the class … Is it like a class of unintended consequences?

Alex Stamos: Exactly. Yeah. So that they will not make the same mistakes my generation made. Right?

Laurie Segall: Yeah.

Alex Stamos: When they build these apps, that they understand the kinds of things that have happened in the past. Because also, for all these issues, there are responses. You don't just have to rub gravel in your hair and cry every night. Like there actually are things you can do around these issues, but if you're not thinking about that proactively, what'll end up happening is you won't deal with them until after you ship. Like we're seeing this right now with TikTok, right? Which is like, you know, this new emergent social network. They are speedrunning every mistake Facebook made in its first 10 years, right? Like-

Laurie Segall: Yeah.

Alex Stamos: Every problem they're dealing with is something that one of the bigger social networks has had to deal with, but because there's no good repository of knowledge of all these issues, it's effectively impossible for them to preventively design their systems to stop it.

Laurie Segall: Hmm. Um, did I see that there’s a, a list of countries you’re banned from? Or was that a joke that you said?

Alex Stamos: Uh. It’s kind of a joke. I mean nobody’s keeping one list, but, I, uh, I-

Laurie Segall: Mm-hmm.

Alex Stamos: I do joke that, you know, you see those RVs that’ll have like fans of like the Nebraska Cornhuskers?

Laurie Segall: Mm-hmm.

Alex Stamos: They’ll have a map of every state they’ve seen the Cornhuskers play in? I have the opposite map of, of countries that I probably should not travel to just ’cause I’ve, I’ve either been involved through Yahoo or Facebook, in dealing with them or, or we’ve published something, you know, at Stanford.

Laurie Segall: Does that ever scare you?

Alex Stamos: Uh.

Laurie Segall: I've gotten like one call in my career, after going to too many Black Hat and DEF CON things, from someone with like a modulated voice, and I was scared for like weeks. I can't even imagine. You must get all sorts of crazy stuff. Like have you ever worried about your safety?

Alex Stamos: Much less now. Yeah, when I was in the corporate jobs I got death threats and sometimes physical letters and stuff. So, you know, my mail was being opened at the office, and we had the physical security people, the executive protection people, check out our house and try to help scrub mentions of our address and where our kids go to school and stuff. So that's definitely a problem. And, this hasn't happened to me, but there's now a big problem in Silicon Valley of swatting of executives.

Laurie Segall: Yeah.

Alex Stamos: So a couple of my friends have had the cops called, SWAT teams arrive at their house, because, you know, somebody called with a fake ID and said, "I'm so-and-so. I've murdered my family." Uh, and one right here in Palo Alto, where unfortunately Palo Alto PD actually did not handle it very well. But that's an issue that a lot of people have been facing. And now it seems that some of that's being driven by some of the kind of white supremacist groups who are not happy with the companies cracking down, and they're taking it out on individual executives. So-

Laurie Segall: It’s interesting because this has been going on for a while. Like, um, you know, there’s been a lot of anger over them cracking down on speech and all sorts of stuff, but, but-

Alex Stamos: Yeah.

Laurie Segall: This, the latest of Silicon Valley executives being … And for people who don’t know what swatting is, it’s just, it’s someone calling in a fake threat and then-

Alex Stamos: Yeah.

Laurie Segall: The police show up at your door thinking you’ve done something terrible, almost ready to shoot you essentially. Like and it’s a, it’s a psychological thing.

Alex Stamos: Yeah.

Laurie Segall: Right?

Alex Stamos: Right. It, it, they’re trying to say you’re not safe in your own home.

Laurie Segall: Yeah.

Alex Stamos: Um, and, you know there’s at least one example of somebody dying of a swatting, uh, that was related to a video game dispute, where the police ended up shooting an unarmed person. 

Laurie Segall: Yeah.

Alex Stamos: So, yes, it is. You know, that's the kind of thing you're dealing with if you're at the companies right now.

Laurie Segall: Yeah. We had Adam Mosseri, the CEO of Instagram, on and it’s happened to him.

Alex Stamos: Yeah.

Laurie Segall: Um, and interesting to hear that it’s happening to other folks you know. It just-

Alex Stamos: Y- Yeah.

Laurie Segall: It seems like there’s a lot of anger, towards Silicon Valley right now and towards the executives here. And that is manifesting itself offline in some capacity.

Alex Stamos: Yeah. And in my case, the worst threats I got were after a story about ISIS that listed three people, like it had quotes from three people. And it was me, and Mark, and Sheryl. And only one of those people does not have full-time security.

Laurie Segall: Yeah.

Alex Stamos: So, you know, so some stuff happened around that time, but it, it’s-

Laurie Segall: Like what kind of stuff?

Alex Stamos: Uh, I’d rather not go into it.

Laurie Segall: Okay.

Alex Stamos: Yeah.

Laurie Segall: Scary stuff, though?

Alex Stamos: Yeah. I mean messages and, and such, but nothing-

Laurie Segall: Yeah.

Alex Stamos: That we thought was actually a high risk. I mean, that's something that ISIS kind of specialized in, trying to intimidate people remotely when there was actually very little risk of something happening. But if you look at the tech companies, you know, there was also a mass shooting at YouTube that could have ended up much worse.

Laurie Segall: Yeah.

Alex Stamos: And so since that incident, the physical security at the tech companies has become pretty significant, at all of them.

Laurie Segall: Well, it certainly seems like there's this moment we're in, in the time I've covered technology, where it just almost feels like the tension is reaching a peak. And maybe it's because of everything else happening, you know, what's happening in society and what's happening, um, with the power of Facebook, and Twitter, and Google. And I increasingly see there's like some tension between the media and tech. I mean it-

Alex Stamos: Yeah.

Laurie Segall: It really feels like there’s just this moment where there’s some real cracks. And it’s manifesting itself in a pretty ugly way.

Alex Stamos: Yeah. I think just like a lot of these other issues, this is a situation where divisions in society are being reflected online, right? That if you end up in a really polarized world where people dehumanize their political opponents, the place where that is easiest and where you can get the most amplification is gonna be online, which then puts these platforms in the place of being the refs of what is allowable speech. And for a while they were trying to be as hands-off as possible, they were not seen as active players, but we've clearly crossed the Rubicon. Both the media and, in the US, both political parties, and political parties throughout the world, now consider working the refs to be a critical part of their plan. And so you see, a lot of the criticism is legit, and a lot of it is specifically about trying to change the behavior of these incredibly powerful platforms. Which is why you also see the platforms trying to find ways that they can get out of that, right?

Laurie Segall: Right.

Alex Stamos: So if you look at like Mark Zuckerberg's big announcement of moving as much of Facebook to encrypted end-to-end communication as possible, that is a huge uplift for privacy. It is clearly a reaction to the privacy laws that have been passed. It is also a move that would take Facebook out of the job of moderating people's speech in many cases.

Laurie Segall: Right.

Alex Stamos: Um, and so I think you’re going to continue to see companies look for ways that they can effectively lock themselves out of having to make these decisions.

Laurie Segall: Is there an unintended consequence there? Like, I understand that there's this debate of we don't want our, you know, tech companies … And I don't really want Mark Zuckerberg in particular deciding what should stay and what should go, but-

Alex Stamos: Yeah.

Laurie Segall: I'm noticing this trend too that you're talking about, which is that, you know, now they're, as fast as they possibly can, going to say, "Oh, it's this. We're installing this board to do this. And we're doing this and this."

Alex Stamos: Yeah.

Laurie Segall: But it’s still, the problem is on the platform. So like is there going to be an unintended consequence for this move of kind of saying like, “Oh. This isn’t on us. This is on X, Y, and Z.”

Alex Stamos: Yeah, there'll be many unintended consequences for sure. This is actually one of the research projects we're running in our group at Stanford: trying to understand what impact moving all this to distributed or encrypted networks has on safety, and whether there are things you can do to mitigate some of that risk. So, for example, you know, WhatsApp … The day that WhatsApp turned on end-to-end encryption was a very proud day. It was probably the largest uplift in privacy in human history. Never before have so many people been given the ability to communicate with one another without powerful corporations or governments seeing their communication. So a huge privacy win. That privacy also directly contributes to WhatsApp being used to spur ethnic violence in places like India or Sri Lanka. It also prevents WhatsApp from policing the sexual abuse of children. It stops WhatsApp from policing malware. And so there has been a real impact, and this is one of the things I'm glad to be able to do on the outside, now I can talk about this balance of equities: you can't both ask these companies to make the platforms totally safe and then also ask them to respect everybody's privacy. There's a really hard trade-off here. And that is one of the problems with our current debate, that people kind of want it all. And when you try to build incentive structures for the companies that aren't attached to reality, like what's happening right now, then you end up with the companies looking for these radical solutions. So for Facebook it's end-to-end encryption. For Twitter, it's distributed systems. Right? Jack's announced that Twitter might become some distributed thing, which is another way of saying, "We won't be able to control what people tweet."

Laurie Segall: Right. I mean it's kind of a scary place to be when, um … You know, we had Yancey Strickler on, who was the founder of Kickstarter, and he talked about this dark forest theory, where, uh, a dark forest is where all these animals are there, but no one can see them, and they don't come out 'cause they're so afraid.

Alex Stamos: Yeah.

Laurie Segall: And then the only room that's left is for predators or something. And he talks about, you know, as social networks get more private and this and that, the town square's going to get less and less crowded, and you're only going to have these voices that are more extreme who remain. I thought that was an interesting metaphor. And then I also think some of the stuff you're talking about with like … I don't know where … It does seem like Silicon Valley in general is becoming more libertarian or some … Like this push right now feels more and more, um, I don't know, reactionary. And it does seem like there's this inability to have some kind of debate, of where we fall, and how we fall, because it doesn't feel like any of the sides actually understand or speak to each other in a way that is coherent.

Alex Stamos: Right.

Laurie Segall: Understandably so because Silicon Valley hasn’t exactly done the best job of explaining themselves. You’ve been inside one of these tech companies. You’ve tried to explain yourself and you couldn’t. And the government hasn’t exactly done the best job of understanding technology as we all saw when Zuckerberg testified for the first time. So it certainly seems like a bit of a mess.

Alex Stamos: It does. And you're right, the companies have never really talked honestly about the trade-offs here. Right? Like we have promised people that you can have it all. That we'll respect your privacy and we'll stop all the bad guys. That you can have this product for free, you don't have to pay for it, and therefore there's no downside to that. Which obviously then means you have to do things like run ads and have ad platforms.

Laurie Segall: Yeah.

Alex Stamos: And so the fact that the companies don't talk about the trade-offs is important, because we've ended up in this situation where it is this incredibly important area of public policy discussion where the people who have the loudest voices don't actually understand the equities involved. You know, people understand what the trade-offs are in tax policy, or healthcare policy, and the like, but they don't understand the trade-offs in tech policy. And so as a result they end up asking for things that are completely contradictory. So like a great example of this is GDPR, the big European privacy law, which both tells the companies that they have responsibilities to protect user data, which is something I agree with, and also tells companies that user data belongs to the users and they have to let the users take the data with them anywhere, which is a thing that sounds good. But it turns out those two things are completely in conflict. Right? The current ability to take data out of a Google or Facebook ends up with 100 times more sensitive data than anything that was leaked out of the Cambridge Analytica scandal. And that is required by the European Union for the companies to operate there. And eventually somebody will figure out how to monetize that feature. And there will be a massive privacy scandal, but in this case the companies are gonna say, "Well, sorry. The Europeans made us do it." And so that kind of not understanding how the rubber hits the road and the equities involved started in Europe, but we're starting to see it in the US as well.

Laurie Segall: What do you think about … I, I mean I know the debate is right now all about, how they’re treating political ads.

Alex Stamos: Yeah.

Laurie Segall: And misinformation. What is your take, having been on the inside?

Alex Stamos: Yeah. And so the things you're trying to balance here: you're trying to balance not allowing the platform to be abused with how much power you want these companies to have to control political speech. Right? So first off, a lot of the public debate about this is really twisted by people's fascination with CDA230, the law that protects internet platforms from intermediary liability for speech. CDA230 is effectively irrelevant in the political speech debate, 'cause in the United States political speech is protected by the First Amendment. Even incorrect political speech, or outright lies. And so since there's no underlying civil liability there, CDA230 is irrelevant. Um, so I just want to dispense with all of that discussion. There's been a couple of celebrities who keep on talking out of turn about stuff they don't really understand, and it's really mixing up the conversation. But on the political ads debate, I think from my perspective, the reasonable way forward is, one, to put limits on how much targeting you can do, on how finely grained you can cut up the electorate with your political ads. When you think about what we want out of online political ads: obviously we want them to be honest. We probably don't want them to be super negative. We want them to be about the issues, not about personalities, and we want them to be as universal as possible, right? And this is one of the big differences between, say, a television ad, which you can show on one TV station but you're probably hitting 40, 50,000 people, and an online ad, which you can show to 50 people. With the online ad, because they're cheap, because there's things you can generate automatically, and because of the targeting capability, you end up with political actors creating thousands and thousands of ads that say different things to different groups of people. And I think that is a really dangerous direction for us to go, for the political class to break up America into all these tiny micro-segments and to be different people to all these different segments. And that means, you know, the candidates, but also the political action committees, and the dark money groups who can do this. And so I think to address that, the best thing would be a floor on the fewest number of people you can show an ad to at any time. And in a presidential campaign that might be like a congressional district, so that's roughly 600,000 people. Or you might, you know, have some arbitrary number: 10, 15, 20,000. You know, there's no good empirical evidence about what number it should be. I think we should just try something and then we can iterate, right? But the important thing is just to make it so that you're not cutting people up into tiny … And this isn't just a partisan issue. One of the things I really don't like about the discussion is it's all about Trump, but like we've got to be thinking about-

Laurie Segall: Yeah.

Alex Stamos: How do we want politicians to act for a long period of time, in the post-Trump era? And even in this election it looks like Michael Bloomberg is building the most impressive online advertising capability in the history of mankind. Right? So this isn't just a partisan issue. It shouldn't be seen as such. We should think about how we want both parties, and eventually both candidates, to act. Um, and so that's one limit. And then on the content of the ads, I don't like the idea of Facebook or any other trillion-dollar tech company deciding what is correct or incorrect political speech. I think a reasonable standard would be that if you're making a claim about an opponent, that has to be based in fact. Right? But we shouldn't police claims about positions in themselves. So if Donald Trump says, "I'll make Mexico pay for the wall," then that's something that doesn't get fact-checked. But if he says, "My opponent is a criminal and is about to get arrested," then that's something the campaign has to show some kind of factual basis for, or the ad is banned. And I think that would encourage the campaigns to run ads that are about their own positions, that are about the topics at hand, instead of just calling names and pointing fingers. And it would also reduce some of this kind of risk in the election. Um, but I don't think we should move to a world where things are totally fact-checked.
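To make the audience-floor idea concrete, here is a minimal sketch in Python of the kind of check Alex is proposing. The floor value and the function name are illustrative assumptions, not any platform's actual policy or API.

    # Hypothetical sketch of Alex's proposed audience floor for political ads.
    # The number is one of the arbitrary values he floats; he notes there is
    # no good empirical evidence yet for what it should be.
    POLITICAL_AD_AUDIENCE_FLOOR = 20_000

    def can_run_political_ad(matched_audience_size: int) -> bool:
        # Allow the ad only if it targets at least the floor number of people,
        # so campaigns cannot cut the electorate into tiny micro-segments.
        return matched_audience_size >= POLITICAL_AD_AUDIENCE_FLOOR

    print(can_run_political_ad(600_000))  # True: roughly a congressional district
    print(can_run_political_ad(50))       # False: the 50-person ad Alex describes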

We're gonna take another quick break, but when we come back: we've talked about disinformation, but what about actual spies inside of tech companies? We ask Alex about spies inside of Facebook. And if you have questions about the show, comments, really anything, you can text me on my new Community number: (917) 540-3410.

Laurie Segall: So a lot of the other tech companies have taken a much tougher stance on political ads, whether it's banning them or doing different types of things. I think the issue that I'm really interested in is, like, the micro-targeting.

Alex Stamos: Yeah.

Laurie Segall: I just think like the micro-targeting thing is really important to kind of dig into. And I think that was the one thing after Cambridge Analytica: who knows just how effective Cambridge Analytica actually was, but the larger issue was this idea of the ability to micro-target people and manipulate people at this grand scale, without kind of this outside world to be like, "No, that's wrong." You know, that space seems really dangerous. And so what's good when it comes to a political ad … What's good when it comes to selling a product, when you micro-target to try to sell someone deodorant or something-

Laurie Segall: Might not be good when you're trying to micro-target, and manipulate, and change someone's mind when it comes to democracy. Which is, I think, kind of this other whole debate that doesn't seem to be getting as much attention when it comes to this.

Alex Stamos: Yeah. And the technique that everybody is thinking about, that we should be really worried about, is this thing called custom audiences, right? So this is the ability that exists on many different online platforms to effectively upload a spreadsheet of these are the people I want to show this ad to, right? So let's say you're the Trump campaign and you have all of these lists. And you want to advertise to people who you think are pro-gun in Michigan, and you've gotten information from the NRA. And so you upload the list of NRA members in Michigan, and then Facebook matches that list to Facebook accounts and then only shows the ad to those Facebook accounts, right?
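To make that mechanism concrete, here is a minimal sketch in Python of the matching step Alex describes. It assumes hashed email addresses are the join key; the names and the exact hashing scheme are illustrative assumptions, not Facebook's actual custom audiences API.

    import hashlib

    def normalize_and_hash(email: str) -> str:
        # Advertiser side: normalize and hash each contact before upload,
        # so the raw list is not sent as plain text.
        return hashlib.sha256(email.strip().lower().encode()).hexdigest()

    def build_custom_audience(uploaded_hashes, account_index):
        # Platform side: match the uploaded hashes against its own users'
        # hashed emails; the ad is then shown only to the matched accounts.
        return {account_index[h] for h in uploaded_hashes if h in account_index}

    # The advertiser uploads its list (e.g., NRA members in Michigan, per Alex's example):
    upload = [normalize_and_hash(e) for e in ["member1@example.com", "member2@example.com"]]
    # The platform keeps an index of hashed email -> account ID:
    index = {normalize_and_hash("member1@example.com"): "account_1001"}
    print(build_custom_audience(upload, index))  # {'account_1001'}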

Laurie Segall: Right.

Alex Stamos: So that is the, uh … there's equivalents on Google, and Twitter, and a bunch of other ad networks. That is the kind of functionality that I think we should have limits around. And to me, that is the real thing that needs attention coming out of the Cambridge Analytica scandal. So to be completely honest, Cambridge Analytica as a scandal is massively overblown. The media loves to talk about it without really understanding what happened. Cambridge Analytica was a scam. Right? So this is an advertising company-

Laurie Segall: I’m the media and I feel like I understood what happened.

Alex Stamos: Okay.

Laurie Segall: Did I not just say that, that, you know, who knows actually like how, you know, how effective Cambridge Analytica was-

Alex Stamos: Yes.

Laurie Segall: But what they did, and what they were able to do, and kind of the loopholes they were able to take advantage of.

Alex Stamos: Right.

Laurie Segall: So anyway.

Alex Stamos: No. Okay. So-

Laurie Segall: Just as the media, I politely- And respectfully push back, Alex.

Alex Stamos: So present company excepted. Uh, so Cambridge Analytica did not get Trump elected. Cambridge Analytica did not make Brexit happen, right? They were a scam based upon this crazy idea about psychometric profiling, yada yada. The only reason people care about Cambridge Analytica is they got a bunch of their data by stealing it from Facebook, right? There are a dozen or two dozen more Cambridge Analyticas that have much more thoughtful models for how they do this. And they're not dumb enough to have a huge PR campaign. Like, one of Cambridge Analytica's basic problems is they went out and they made these YouTube videos-

Laurie Segall: Right.

Alex Stamos: With their like Bond villain accents. And they took credit for Trump and took credit for Brexit.

Laurie Segall: Yeah.

Alex Stamos: And so people believed their pitch, but they're just showmen. So these other companies are silent. They don't even have websites. They don't have domains. You can't figure out who they are. And they don't steal their data from Facebook; they legally buy it from Acxiom, and Experian, and TransUnion, and a dozen other data brokers. And so they're building much better models than Cambridge Analytica did, but they're doing the same kind of idea of, "We're gonna target very small groups of people." And so I think that is what we've got to focus on: how do we limit the capability to do that, you know … If somebody wants to target the 20 people that are gonna buy a Toyota Tundra in this county and show them an ad for a Toyota dealership, I think you could argue that's a reasonable thing. It's just like direct mail, right?

Laurie Segall: Right.

Alex Stamos: Um, but, in the political aspect, we don’t care about different Toyota dealers showing different ads to different people, but we do care about one politician being a different person to thousands of different segments. And I think that is a real dangerous direction of kind of flooding the zone with hundreds of thousands of ads that are very hard for the media to check, for opponents to check, and the like.

Laurie Segall: Go with me here. So why? I mean, I think that's an important point. You know, they just want to appeal to different audiences and show different sides of themselves. So I'm only gonna push you to help people understand.

Alex Stamos: Right.

Laurie Segall: So why is that so bad?

Alex Stamos: Right, because they're not thinking, "this is a position that's important because of polling." What they're doing is having a computer auto-generate ads. So you have a bunch of different messages, a bunch of different photos, a bunch of different quotes, and you feed them into a computer, and the computer generates thousands and thousands of potential ads. And then it runs each of the ads with $15, $10, $20 of budget, right? And then it cross-tests those ads against all these different segments. And then the computer sees, "Okay, great, this ad did well. This did well with this segment." And then it puts the rest of the money behind just showing those ads. So you're effectively mass-producing propaganda that is then A/B tested against individuals. And then money is only being put behind the winners, and no human's part of it. Right? You can't say there's any part of this that's about real persuasion. It's just about what, mathematically, did people click on. And I think that's just a really dangerous direction for us to go.
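Here is a minimal sketch in Python of the generate-test-reallocate loop Alex describes. The segments, ad parts, and budget figure are illustrative assumptions, and the click-through numbers are randomized stand-ins for what an ad platform would report back.

    import itertools
    import random

    # Auto-generate candidate ads from interchangeable parts.
    messages = ["Protect your rights", "Lower taxes now"]
    photos = ["flag.jpg", "family.jpg"]
    candidates = [{"message": m, "photo": p} for m, p in itertools.product(messages, photos)]

    TEST_BUDGET_PER_AD = 15  # dollars, per Alex's $10-$20 figure

    def click_through_rate(ad, segment):
        # Stand-in for the platform's report on a cheap TEST_BUDGET_PER_AD run.
        return random.random()

    segments = ["segment_A", "segment_B"]
    winners = {}
    for seg in segments:
        # Cross-test every candidate ad against every segment...
        results = {i: click_through_rate(ad, seg) for i, ad in enumerate(candidates)}
        # ...then put the remaining budget only behind each segment's best performer.
        winners[seg] = candidates[max(results, key=results.get)]

    print(winners)  # a different "winning" ad per segment, with no human in the loop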

Laurie Segall: What percentage of Facebook's stance on political ads do you think is partisan, or is political? I mean, like, what they're doing-

Alex Stamos: Right.

Laurie Segall: You’ll hear Zuckerberg argue, “This is for free speech.” And, and-

Alex Stamos: Yeah.

Laurie Segall: And I'm a believer that Zuckerberg believes very strongly in free speech. That it's not all about ad revenue, that he really does believe in free speech, you know?

Alex Stamos: Yeah.

Laurie Segall: To whatever degree. But you know, do you believe that Facebook's stance on political ads is part political?

Alex Stamos: So I don’t think it has anything to do with money. I think the percentage of revenue that is political ads is probably in the single digits and that is even an overestimate because every time you see an ad on Facebook or Google, it is because that ad won a very, very quick auction.

Laurie Segall: Right.

Alex Stamos: And so if political ads are taken out, that space will be filled with, you know, the deodorant ads, and the car ads, and such.
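A quick sketch in Python of why that is: in a simplified second-price auction (real ad auctions are more complex than this), removing the political bidder just promotes the next-highest commercial bid into the same slot, so revenue barely moves. The bid values are made up.

    def run_auction(bids):
        # bids: {advertiser: bid}. Highest bid wins; winner pays the runner-up's bid.
        ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
        winner = ranked[0][0]
        price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
        return winner, price

    bids = {"political_campaign": 2.50, "deodorant_brand": 2.40, "car_dealer": 2.10}
    print(run_auction(bids))          # ('political_campaign', 2.4)

    bids.pop("political_campaign")    # take political ads out...
    print(run_auction(bids))          # ('deodorant_brand', 2.1): the slot still sells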

Laurie Segall: So they’ve taken an extreme, a pretty extreme stance.

Alex Stamos: Right.

Laurie Segall: So is it because of free speech or is it because they’re in a very sensitive moment politically where you have a lot of politicians coming down on Facebook-

Alex Stamos: Right.

Laurie Segall: There's a lot of sensitivity around this ahead of the election, you know. W- what do you think it is?

Alex Stamos: So this is, I mean, this is my read from the outside. I think it is political in that, you know, there is a long history of the Democratic Party being close to Silicon Valley, right? The vast majority of executives at these companies are very socially progressive. They are naturally aligned with Democrats. And during the Obama administration there was this movement of people between DC and the companies, and a kind of close relationship. And since 2017, the relationship has soured significantly.

Laurie Segall: Yep.

Alex Stamos: And now they end up in a situation where they're seen as hated by both major political parties. The Democrats hate the companies because they believe they're monopolies, that they're too powerful, that they are helping Trump get elected. They say that the companies knew Russia was doing this. I've heard kind of crazy, radical theories, effectively that Russia called Mark Zuckerberg and said, "Run our ads," which is just ridiculous. But, you know, there are all these theories blaming tech for bad things on the Democratic side, and then on the Republican side, they hate the tech companies for what they see as censorship of their views, and oppression of their views, and bias towards the Democratic side. Facebook can't afford to be hated by both sides, right? And so I think they are making an explicit decision: in a world where the Democratic demand is, "You can't exist because you're too big and powerful and we're gonna break you up," and the Republican demand is, "Carry our ads," that's the one you're gonna choose, right? And so I think there's been kind of a cynical decision to lean much more into the Republican side of the aisle, because the Republicans have been very good at working the refs. 'Cause the other thing that's been under-discussed is there's been a concerted effort by Senate Republicans and by the Trump administration to open investigations, to badmouth the companies. You know, the Trump administration has like five or six investigations open of just Facebook. They've got ones open of Google and Twitter. William Barr doesn't really care about some of the issues he's raising. This is the Trump administration throwing a brushback pitch: "We control the Department of Justice. We control the FCC. We control the FTC. We have the ability to make your life hell." And I think that has been effective. And again, I don't think that's necessary, because I don't believe the political ad issues are actually as partisan as people make them out to be, right? Because of the Bloomberg move, there's a bunch of Democratic donors putting a lot of money into digital ad technology and the like. And so I think 2020 will actually be a much closer fight than Trump versus Clinton was in 2016.

Laurie Segall: Hmm.

Alex Stamos: But that being said, that's not the perception. The perception is that any changes around political ads will only hurt Donald Trump and only help Democrats, uh, and so because of that they are moving to try to neutralize the criticism from the right. And the left has honestly walked into it, 'cause the Democratic Party making noise about things like CDA 230 is now getting amplified by Josh Hawley, and Ted Cruz, and other Republicans. You know, it should have been a warning to Elizabeth Warren when Ted Cruz endorsed her tweets about Mark Zuckerberg. Ted Cruz is not doing that because he actually has the same position on monopoly. He's doing that because he's looking for levers to try to keep the companies from enforcing their rules on speech that is seen as pro-Trump. And I think unfortunately there's probably been a very cynical decision to lean into that.

Laurie Segall: Did you feel it was political at all when you were inside?

Alex Stamos: Well, certainly. I mean, the concern about looking political was a huge deal, right? The company never wanted to make a decision that looked political.

Laurie Segall: But did it act politically?

Alex Stamos: What do you mean? Like and like, with the goal of-

Laurie Segall: But did it make decisions based on political purposes, or on not wanting to upset one side or the other?

Alex Stamos: Oh yeah. I mean I think decisions were definitely made to try to, to stay as neutral as possible. And so-

Laurie Segall: Yeah.

Alex Stamos: And that has become obvious to folks on the outside, which is why it's now seen as an effective path to work the refs, right? Like, um, the best example of this was what was called the trending topics-

Laurie Segall: Mm-hmm.

Alex Stamos: Debate. There was kind of a BS article about "editors inside of Facebook" suppressing conservative views. These were a handful of people who worked on a product that was used by barely anybody. The little trending topics box, it turns out, had almost no engagement at all.

Laurie Segall: Mm-hmm.

Alex Stamos: So their ability to impact what people were looking at on Facebook was actually tiny, but the right completely blew this up into a huge deal. And that was very smart, because in the run-up to the election there were steps the companies could have taken, around fake news, around account verification, around the amplification of stuff through advertising, that I'm sure were not taken because they did not want to look like they were putting a thumb on the scale on behalf of Clinton.

Laurie Segall: Have you guys ever worried about like spies inside Facebook?

Alex Stamos: Yes, uh, absolutely. That was definitely a concern. It continues to be a major concern in Silicon Valley. 

Laurie Segall: Increasingly so, or just always?

Alex Stamos: I mean, certainly it was my concern for my entire time, and we did a lot of work around internal security, but I think it-

Laurie Segall: Well, well, why were you-

Alex Stamos: I think it’s increasing.

Laurie Segall: I mean what, what, what kind of stuck out to you? What-

Alex Stamos: Well, the truth is that on cyber issues, at least, the tech companies are playing the same game as the major US adversaries. Right? A lot of things changed in Silicon Valley in 2009, which was when the People's Liberation Army of the People's Republic of China was caught breaking into Google and 35 other companies. We actually just had the anniversary of all of that. The people who worked on this at Google have come up with cute little souvenirs; I still have to go pick mine up. And I got to be involved in that as an incident responder at a number of companies. Since then, the companies have invested a lot into repelling attackers from Russia, from China, from Iran, and the like. And at least they're in the same category, right?

Laurie Segall: Yeah.

Alex Stamos: So if you're, say, the Chinese Ministry of State Security, or you're the GRU, or the SVR, you are taking a risk trying to break into Facebook, or Google, or Microsoft, or Amazon, or a couple other companies at that top tier. You are taking a risk by trying to break into them with cyber. On human intelligence, we are children compared to these organizations. Right? We are children compared to the FSB and the SVR, which are the descendants of the KGB, which are the descendants of czarist Russia's intelligence services. Countries have been planting spies or turning people using various forms of leverage for hundreds and hundreds of years. And tech companies are very open. They hire people of all kinds of backgrounds, with all kinds of citizenships. We don't have security clearances. We don't have-

Laurie Segall: Yeah.

Alex Stamos: We don't put people through lie detector tests. We don't have single-scope background investigations or lifestyle polygraphs, all the things the US government uses to prevent spies. And still they have spies. So if the NSA can't keep them out with all of that capability, with 18-year-old Marines with machine guns and polygraphs, then certainly the tech companies can't. And the best example of this that's come up happened at Twitter. It was fascinating: the country that got caught was the Kingdom of Saudi Arabia, right?

Laurie Segall: Mm-hmm.

Alex Stamos: So the Saudis were paying off Twitter employees to spy on people. But I expect that every major, um, US tech company has at least several people that have been turned by at least China, maybe Russia, probably Israel, and a couple other US allies.

Laurie Segall: Did you ever uncover anybody that had been turned?

Alex Stamos: Um, I’m not gonna comment on that. It’s certainly an area in which we did a lot of work and investigation.

Laurie Segall: Can you comment a little bit?

Alex Stamos: What I'll say is there are a lot of weird things that happen inside of the companies, and it's very difficult for them to figure out why those things happened. If that makes sense.

Laurie Segall: So maybe there wasn’t like a solid yes, but there definitely wasn’t a solid no. So there was-

Alex Stamos: Yeah. That would be reasonable.

Laurie Segall: Suspicious activity that you uncovered in your time-

Alex Stamos: Yeah.

Laurie Segall: Uh, at Facebook.

Alex Stamos: Right. And so at the companies, you have to design your internal systems to detect that. You have to limit user data access. You have to classify the pieces of data that are really critical, like people's exact GPS location and their private messages. Those things are more sensitive than IP addresses or public messages. And so you have to have systems that look for abusive data access internally.
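
A sketch of that classification step follows, under an assumption of mine for illustration: three sensitivity tiers, with the critical fields gated behind a task-based justification. Neither the tiers nor the field names come from any real company's schema.

```python
# Sketch of data sensitivity classification for internal access control.
# Tiers and field names are illustrative assumptions.
from enum import IntEnum

class Sensitivity(IntEnum):
    LOW = 1       # public posts, public profile data
    MEDIUM = 2    # IP addresses, coarse device info
    CRITICAL = 3  # exact GPS location, private messages

FIELD_SENSITIVITY = {
    "public_post": Sensitivity.LOW,
    "ip_address": Sensitivity.MEDIUM,
    "gps_location": Sensitivity.CRITICAL,
    "private_message": Sensitivity.CRITICAL,
}

def requires_justification(field: str) -> bool:
    """Critical data should only be readable with a ticket tying the
    access to a concrete, assigned task. Unknown fields fail closed."""
    return FIELD_SENSITIVITY.get(field, Sensitivity.CRITICAL) >= Sensitivity.CRITICAL

print(requires_justification("gps_location"))  # True
print(requires_justification("ip_address"))    # False
```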

Laurie Segall: Out of curiosity, what does suspicious activity of a potential spy look like?

Alex Stamos: Um, that's a great question. What you probably want to do is have an understanding of who the potential targets are. So, let's say I was advising a new startup social media company.

Laurie Segall: Let’s say you were at Facebook.

Alex Stamos: Let's not say I was at Facebook. Let's say you're starting up, and you believe you're gonna have a big company with a lot of interpersonal communication.

Laurie Segall: Mm-hmm.

Alex Stamos: Then I'd say one of the things you're gonna want to do is identify high-risk targets, and you're gonna want to have special monitoring of their accounts and data-

Laurie Segall: Mm-hmm.

Alex Stamos: Access. You're gonna want to look at the patterns of data access. So if you're a software engineer and you need to debug a problem, then you're gonna have to look at data that's tied to that exact problem. You're not gonna end up looking at random accounts. And so you should be able to see that somebody opened up a bug, they were assigned this task, this task is tied to this one account, they accessed that account. That's legit. If somebody just sits down and all of a sudden looks at five accounts in a row, and they're all not related to a task they normally do, that's the kind of thing that should set off an alert and a human investigation.
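
That heuristic is easy to sketch in code. The data shapes and the five-lookup threshold below are illustrative assumptions, not any company's real detection logic.

```python
# Sketch of task-based access auditing: lookups should map to accounts
# referenced by tasks assigned to the engineer; a run of unrelated
# lookups triggers a human investigation. Threshold is illustrative.

# task ticket -> (assigned engineer, account the bug is tied to)
tasks = {
    "BUG-101": ("alice", "user_555"),
    "BUG-102": ("alice", "user_777"),
}

def allowed_accounts(engineer: str) -> set:
    return {acct for owner, acct in tasks.values() if owner == engineer}

UNRELATED_THRESHOLD = 5  # five unexplained lookups in a row

def audit(engineer: str, accesses: list) -> bool:
    """Return True if this access log should trigger an investigation."""
    legit = allowed_accounts(engineer)
    streak = 0
    for acct in accesses:
        streak = 0 if acct in legit else streak + 1
        if streak >= UNRELATED_THRESHOLD:
            return True
    return False

# Debugging her own tickets: fine. Browsing five unrelated accounts: alert.
print(audit("alice", ["user_555", "user_777"]))                            # False
print(audit("alice", ["user_1", "user_2", "user_3", "user_4", "user_5"]))  # True
```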

Laurie Segall: Suspicion. Uh, interesting.

Alex Stamos: Yeah.

Laurie Segall: Very interesting. I can only imagine that's got to be a pretty big problem. Increasingly a problem, and probably always a problem, but something that people have to pay attention to now.

Alex Stamos: Yeah, absolutely. I mean, the big Silicon Valley companies are not the most powerful intelligence agencies in the world, but they're in the top 20, right? In terms of capability, if you're a Google, or Facebook, or Microsoft, or Amazon, I think I would put those four at the top.

Laurie Segall: Right.

Alex Stamos: Those four companies can know what’s going on in the world, both in the cyber domain and in the physical world based upon people’s communication.

Laurie Segall: Yeah.

Alex Stamos: As well as lots and lots of state level actors. And so that means that they have to be incredibly careful of how they use that power. And they have to protect that power from other people who are looking for some level of asymmetry.

Laurie Segall: Hmm. What do you think is the biggest security weakness that everyday people still have in their tech lives?

Alex Stamos: Oh. By far, nothing is close to this. It’s the reuse of passwords. The fact that the vast majority of people in the world use the same password or a slightly modified password on every single website.

Laurie Segall: Right.

Alex Stamos: Nothing's even close. There's nothing even close to how many privacy violations happen, how much damage to people happens, because of that problem. And so, uh, for all these things, I mean, people are worried now about getting hacked by MBS and malware on their phone.

Laurie Segall: Uh huh.

Alex Stamos: And, you know, Chinese spies and stuff. Like the, the number one thing that any normal person can do is they can download a password manager.

Laurie Segall: Mm-hmm.

Alex Stamos: There are three or four good ones. And they can go and generate random passwords for all of the different sites in their life. So you have one good password to log into your password manager, and every other site has a random one. Or if you're more old school, get a little black book and write all your passwords down in the book, 'cause a hacker can't, from 6,000 miles away, reach into your purse or your pocket and steal that book, right? Which is what's happening to people's passwords.
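
Under the hood, a password manager is doing something like the following sketch: one strong master secret guards a vault of independently random per-site passwords. Real managers also encrypt the vault at rest, which this illustration skips; it shows only the per-site randomness that defeats reuse.

```python
# Sketch of the per-site randomness a password manager provides.
# A real manager would encrypt this vault with the master password.
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def random_password(length: int = 20) -> str:
    """Generate a cryptographically random password."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

vault = {site: random_password() for site in ["email", "bank", "social"]}
for site, pw in vault.items():
    print(site, pw)  # a breach of one site reveals nothing about the others
```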

Laurie Segall: It’s so funny that like out of everything happening, you’re like just don’t use the same password.

Alex Stamos: Yeah. No.

Laurie Segall: Yeah.

Alex Stamos: It’s, it’s by far the biggest. At Facebook we caught about 500,000 accounts every single day where a bad guy came in with the right username and password and we detected this is actually a bad guy, they’re not-

Laurie Segall: Wow.

Alex Stamos: The person who's supposed to be in. Half a million a day. Of all these data breaches and stuff, there's nothing close to the amount of data breach that happens every single day just due to stolen passwords being reused.
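
Catching an attacker who shows up with the correct username and password means scoring side signals rather than the credentials themselves. Here is a hedged sketch of that idea; the signals, weights, and threshold are invented for illustration, and real systems use far richer models.

```python
# Sketch of flagging a login that has valid credentials but looks like
# credential stuffing. Signals and weights are illustrative assumptions.
def takeover_risk(login: dict) -> float:
    score = 0.0
    if login.get("new_device"):
        score += 0.4  # never-before-seen browser or device
    if login.get("new_country"):
        score += 0.3  # login from an unusual location for this account
    if login.get("ip_on_stuffing_list"):
        score += 0.5  # IP already seen spraying leaked credentials
    return score

login = {"new_device": True, "new_country": True, "ip_on_stuffing_list": True}
if takeover_risk(login) >= 0.7:
    # Correct password, but challenge the login anyway (2FA, email check).
    print("Challenge this login before granting access")
```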

Laurie Segall: Yeah. Um, I want to end on something you said to me. I want to go back to that, that drive that we took on your last day.

Alex Stamos: Mm-hmm.

Laurie Segall: Um, to Facebook. I thought you said something really interesting, because we were driving to Facebook, as it was your last day, and you said, "I think the company is lasting, but I think also that the 101 is littered with the old headquarters of companies that thought they'd be around forever. That is a constant reminder of Silicon Valley: the death and the rebirth of those products and these organizations. Nobody can get too comfortable and too arrogant about how long anybody can be around and how long you can be on top." Do you think Facebook lasts?

Alex Stamos: I don't know. I think the most accurate criticism of the big tech companies is that they've effectively locked out competition. I think the competition issue is big, and it's because they've been able to turn their access to huge amounts of cash, plus their understanding of what people want to do online, into a way to either buy or copy their smaller competitors. And so, the vision of all these dead companies... famously, part of Facebook's campus used to be Sun's campus, and the back of the Facebook sign is the Sun Microsystems logo. It has not been painted over, as kind of a reminder of the turnover in Silicon Valley. I think that's a good thing. I think it's good for these companies to grow and die. I think that's one of the reasons why Silicon Valley has been so competitive with the rest of the world: we have this ecosystem. But it does start to feel like some of the big companies have been able to block out their smaller competitors through a number of means. And so I think that is an area we have to work on.

Laurie Segall: Yeah. I go back to this idea of a little bit of hubris, and that filter bubble you spoke about at the beginning of this interview, because it'd be too simple to say all these people are bad. It'd be too simple to say this place is just here to do evil, because both of us have been in this a long time, um, because we cared about technology, we cared about people. Um, but there has been a very insular mentality and a very specific way of operating at some of these bigger companies. I've covered Facebook very closely-

Alex Stamos: Yeah.

Laurie Segall: And they missed a lot. You know, there's a lot of stuff they missed. And so I think it's really interesting that you're now on the outside, and that you thought this is where you could make the most impact. I mean, I actually think that's kind of extraordinary, sitting here and saying you had this big role within a company where you could impact over two billion people, but you thought you could make a bigger impact on the outside. That is pretty extraordinary.

Alex Stamos: Well, I think part of the root problem for the tech companies is that there are bubbles within bubbles, right? Each corporate campus is a bubble, but then the real decision makers at the top are living in an even smaller bubble, where they are surrounded by folks who have gotten where they are because they've been successful in hitting certain metrics, and those metrics are all about growth, and users enjoying the product, and advertising revenue, and all those kinds of things. And so almost all of the problems tech companies have faced over the last three, four years, almost all of them, somebody inside the companies understood what was happening; it just didn't make it to the right people. So if I had advice for my friends who are still there, it's: build a culture where people can be Cassandras and can say, "A storm is coming. This is a problem we have to deal with," and have that be taken seriously and bubbled up to the top, so it changes how you build products and how you approach the world. That is the most important thing they can do if they want to survive.

Silicon Valley is now a corporate machine, with billion-dollar implications and an incredibly human impact. But there's anger and frustration, and people retreating farther into their own corners.

You’ve heard our last two guests on this show frustrated they couldn’t speak more openly within their own companies. And there’s another layer — concern about saying anything controversial at a time when tech is in the national spotlight. 

I worry that people will go even farther into their corners, that Cassandras really can’t come forward. 

Now, of course, this isn’t just tech. Speaking hard truths to power, raising red flags — not a new concept… 

Okay, we’re gonna try something new. What do you think? Do you have something to say? What are the hard truths you’re struggling with? One of my favorite parts of the job is getting to hear from people. So text me. We just launched a community phone number for First Contact, and yes I mean this when I say it actually comes directly to my phone — so reach out to me. The phone number is 917-540-3410. That’s a US number. 

I'm Laurie Segall and this is First Contact.

For more about the guests you hear on First Contact, sign up for our newsletter. Go to firstcontactpodcast.com to subscribe. Follow me, I'm @lauriesegall, on Twitter and Instagram, and the show is @firstcontactpodcast. If you like the show, I want to hear from you: leave us a review on the Apple Podcasts app or wherever you listen, and don't forget to subscribe so you don't miss an episode.

First Contact is a production of Dot Dot Dot Media, executive produced by Laurie Segall and Derek Dodge. This episode was produced and edited by Sabine Jansen and Jack Regan. Original theme music by Xander Singh.

First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeart Radio.