First Contact Transcript

Episode 25: Free Speech – The Firestorm Tech Can No Longer Afford to Ignore

Read the transcript below, or listen to the full interview on the First Contact podcast.

First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeart Radio.

Barry Schnitt: What’s happening now is the culmination of things that have been happening for a long time. You know, for a really long time, I’ve been looking at the impact that Facebook has been having on the world, with some dismay. And actually, I’ve, I’ve, I’ve lost some sleep over, you know, laying in, in bed at night looking up at the, at the ceiling and, and kind of thinking like, “What have I done here?” 

We are living through some pretty surreal times… Most of the country is still shut down due to COVID-19. Over the last few days we’ve seen protests, and riots, across the country following the police killing of George Floyd, an unarmed black man. All while one of the biggest tech companies on the planet – Facebook – is dealing with its own turmoil. It’s not a coincidence…

It boils down to this: Many believe President Trump’s use of social networks makes situations like these worse. In this case, critics say some of his latest social media posts could incite violence and spread misinformation.

Twitter and Facebook have taken different approaches – in some cases, Twitter has chosen to restrict or add warning labels to the president’s more inflammatory posts – while Facebook CEO Mark Zuckerberg has chosen a hands-off approach. He’s said it’s not the company’s place to restrict free speech and that these posts don’t violate Facebook’s policies.

But this comes down to a larger and incredibly important conversation about the limits of free speech and the role that tech companies play in civil discourse at a time when the stakes could not be higher.

Over the last few days, employees of the company have staged virtual walkouts. They’ve voiced their opposition to the policy – publicly. Some have even quit. Now, I’ve been covering Facebook for almost a decade. I’ve interviewed Mark Zuckerberg several times. This is a company that keeps things pretty close to the vest. Employees don’t often speak out this vocally. So, what we’re seeing here, to use an overused word lately, is unprecedented.

And yes, I know, we just wrapped up season one of the show, but this is simply too important to ignore. So, my guest today on this bonus episode of First Contact is Barry Schnitt. He was Facebook’s Director of Communications for four years, from 2008 to 2012. These were pivotal years for Facebook, and he just published an article on Medium criticizing the company’s position. The company was facing very different challenges back when he worked there, but I think his perspective is interesting and important, especially as other people are voicing similar thoughts.

I’m Laurie Segall and this is First Contact.

Laurie Segall: First of all, how are you doing?

Barry Schnitt: I’m well. How are you?

Laurie Segall: I’m good. You know, look, I’m as well as can be. We call this First Contact, and we talk about my first contact with folks who we have on. Um, and I was trying to think, because you … I, I’ve covered Facebook for a really long time and you worked at Facebook during four very, very important years. So, I don’t remember if we’ve been in contact, but I feel like we must have had a contact at some point.

Barry Schnitt: Yeah. I’m-, I have to believe that at some point I sent you a statement about some privacy controversy or something at … the very least, but, you know, I’m sure you were dealing with so many people there and I was dealing …

Laurie Segall: Yeah.

Barry Schnitt: … with so many journalists. Uh, you know, um …

Laurie Segall: Right.

Barry Schnitt: But I know, I know your name and I, I know who you are, for sure.

Laurie Segall: Well, and, and so take me back to your role. I mean, you were at Facebook, um, for four very important years, right? Back, it was 2008 to 2012. Um …

Barry Schnitt: That’s right.

Laurie Segall: What, what was your role there?

Barry Schnitt: Yeah. So, uh, my boss at Google, Elliot Schrage, moved over to Facebook and, uh, he said, you know, “We’re doing some really interesting stuff here and you should come.” So I did. And my role was doing communications and some public policy work around privacy, safety, security and content issues. And so what that ended up meaning was pretty much every controversy and crisis that Facebook dealt with during that time. I was a spokesperson for it and, uh, was working behind the scenes to try and figure out not just our communications response for it, but, uh, also what our kind of substantive response was to it.

Laurie Segall: Yeah, it was almost like … Because I remember covering those days. Like Facebook was growing at lightning speed during that time, um, and there were so many things that were happening, with privacy, with the switch to mobile. Facebook was, you know …

Barry Schnitt: Right.

Laurie Segall: … going public launching a mobile app. There was just acquiring all these companies. Um, you were kind of on the front lines.

Barry Schnitt: Yeah, for some of that, for sure. Yeah.

Laurie Segall: Yeah.

Barry Schnitt: And it was a, it was a very exciting time, and lots was happening. And, uh, it was also a very stressful time, but, uh, you know, I’m, I’m proud of a lot of the work that we did then.

Laurie Segall: Yeah. And, and part of why I wanted to have you on today, you wrote this piece on Medium.

Barry Schnitt: Yeah.

Laurie Segall: You know, speaking on Facebook’s policies on free speech, and what a fascinating moment to be having this conversation, and what an important moment to have this conversation. Um, uh, a lot of this is, is happening in the news now, but this … It almost feels like we’ve been coming to this moment for a very long time. You know, I think over the last couple weeks, it was a decision Zuckerberg made to not, you know, put a warning label on, uh, on a Trump post. Uh, Jack Dorsey from Twitter …

Barry Schnitt: Right.

Laurie Segall: … made the decision to put a warning label on it. And it’s really put into focus, I think, some of these issues that a lot of folks are pretty concerned about at the heart of social media. And what’s very unique for our listeners is like having covered this company for a really long time, people don’t really speak out. Uh, uh, it’s very rare to see, um, people collectively speak out and really say things and kind of organize and come together, employees at the company. It’s a very tight lipped company.

Barry Schnitt: Right.

Laurie Segall: Um, and we’re really beginning to see almost like a sea change of behind the scenes people beginning to talk about these things. So, maybe we can start with what, you know, what was the premise of what, of what you wrote and why did you decide to write it?

Barry Schnitt: Well, yeah, uh, it was actually … There’s a lot of soul searching for me, you know, and, uh …

Laurie Segall: Yeah.

Barry Schnitt: … and, and I think you, you described it pretty accurately in that what’s happening now is the culmination of things that have been happening for a long time. Um, and it’s, it’s, it’s not just about Trump, um, at least not for me. Um, but, you know, for a really long time, I’ve been looking at the impact that Facebook has been having on the world, um, with, with some dismay. And, and actually, I’ve, I’ve, I’ve lost some sleep over, you know, laying in, in bed at night looking up at the, at the ceiling and, and kind of thinking like, “What have I done here?” Because, you know, I, I joined Facebook with the idea of, you know, changing the world for better. And I think, you know, what you’re seeing in terms of employees and former employees speaking out is because they don’t just care about Facebook, but they care about the world and that’s why they worked at Facebook. And when they’re seeing the potential that Facebook is having to negatively impact the world, they, they want to do something about it, um, and they feel like Facebook can do something about it. And that, that was part of the premise of, of my writing is that, you know, Facebook in the time that I work there, and the time before and the time since has overcome, you know, tremendous challenges. Um, and every, every single time, they rise to it and overcome, you know, whether it’s facing Google, which was a behemoth, or MySpace which at the time was a behemoth. Or, you know, you mentioned the the change to mobile. You know, these are, these are things that they took from nothing and they made a tremendous success out of them. And I think they have the opportunity to, to do that here with, you know, actually being a force for, you know, information and for understanding and for truth. but they’re not doing that right now. 
Um, and so, my, my goal was actually to try to rally them to that end because I know that they have the ability to come up with some really innovative solutions that could have an impact on the world, and I think. Don’t necessarily need to restrict free speech. I think that’s a false choice. And those are the words I use in, in my writing and I, I, I believe that strongly.

Laurie Segall: Yeah, I mean … Can you give us your argument about … Because, you know, I know Zuckerberg, really, more so than any tech founder I know, really digs his heels in when it comes to this argument on free speech, you know. When he says, um … when Trump was posting, um, it was a post that said something like, when the looting happens, the shooting happens, or something, and people were very concerned …

Barry Schnitt: Right.

Laurie Segall: … this was gonna promote violence. But Zuckerberg’s argument is that Facebook will not be the arbiter of truth, um, and that it’s a slippery slope. This has always been the company’s argument, um, and it’s not changing. Now, what you say in, in this piece is that a lot of things have changed between the time that you guys drew up those community standards and the time that this happened, and that words have more meaning and are more powerful in a different way, because a lot of things have changed. Can you explain that to us?

Barry Schnitt: You know, in 2008, you know, there were a lot of discussions about how to handle speech on, on Facebook. And, and the, the main conclusion was, you know, Facebook is gonna have a hands-off approach to it. And I think that made sense in 2008, you know. One, there were the professional arbiters of truth, and I believe the press are, are those and have been for centuries. You know, they were much stronger and had much more distribution. I think Facebook was growing, but, but still relatively small. And that’s changed, you know. And, and Facebook was not a source that people looked to for news and information. You know, it’s a place where you look for photos from your friends or, you know … funny memes, etc. And all of that has changed, you know, dramatically, in that the press has … You know, newsrooms have been decimated. You know, the economics of news have changed dramatically. Uh, Facebook is a news and information source for literally billions of people. And I don’t think a decision made with the variables in 2008 still holds in 2020, especially if you look at the consequences, you know. Um, everything from Brexit, to elections here in the US, to elections around the world, to health information during this pandemic, are all being threatened by disinformation that is found and hosted by Facebook. And I just don’t see how you can look at the consequences of the decision you made more than a decade ago, and see how dramatically bad they are, and say, “Yeah, that was definitely … That’s still the right decision today.”

Laurie Segall: Right. I mean, it’s a pretty powerful statement to be an employee, to have been an employee somewhere years ago, right? And to say that you’ve been losing sleep over decisions you made. I mean, that is a pretty powerful thing to say. If you go back like …

Barry Schnitt: Yeah.

Laurie Segall: What, what, what are you losing sleep over specifically?

Barry Schnitt: Well, I mean, I can, I can bring it to, you know, just like a few weeks ago. You know, if, if you saw this Plandemic video, which was a very slick, highly produced piece of gross misinformation about the current pandemic, and it was liked two and a half million times on Facebook, which means it was seen by many more people. And so that’s literally millions of people who were misinformed about a current health crisis. Now, I, I believe that many of those people will make a decision, based on what they saw in that video through Facebook, that will be detrimental to the health of themselves or their family. You know, I don’t think it could get more serious than people will die because of something they saw on Facebook. And, uh, yeah, uh, you’re right, I lost sleep over that, um, and I continue to.

Laurie Segall: You know, you talked about, um, when we spoke before, like, a key moment. And you mentioned, like, intent bias in this piece, um, when you’re, when you’re talking through some of these issues. Can you explain that?

Barry Schnitt: Yeah. So, I mean, when I worked at Facebook and, and other tech companies, you know, you, you build these products and you, you think you know what they’re gonna do, but you never really know until they’re out in the world and people are using them. And they always use them in ways that you didn’t intend. And sometimes that’s good, and sometimes that’s bad. But, you know, your intention is good. You know, you had the intention to give them free speech, or allow them to share, or allow them to create, etc. And so your intent was good, and then you look at the outcomes, and hopefully, there are some good outcomes and there are bad ones. But because of your intent, you focus more on the good outcomes than the bad. And I think that’s been happening at Facebook for years. And I think it’s happening today, in that you see them write the stories of success, you know, whether it’s selling products, or the organizing of groups. And I provide examples. You know, they, they, they tout The Sisterhood of Truckers, which I think is amazing. But at the same time, you know, Mark looks at all of that and then he says, “I don’t see how Facebook could have impacted an election.” Like, he’s too smart to not see that those things are the same. You can’t have all of this good and organizing and people changing their minds in a good way, and then not have the same thing happen in a bad way.

Laurie Segall: So then, having worked closely with him, and having worked closely, I would say, with the company, and being kind of the lead on communications, and it could be crisis communications. You say he’s too smart for it. Then what do you think it is?

Barry Schnitt: Well, I, you know, I, I haven’t talked to Mark in …

Laurie Segall: Yeah.

Barry Schnitt: … you know, nearly a decade. And, uh, so I can’t know what’s in his head. But, you know, the, the two things that would be my guess are, one, the intent bias. I, I think he continues to look at all of the good that Facebook is doing in the world, and I agree with him that there is a lot of good. And he says that it’s more than the bad. I would say that’s not good enough. And I would say the bad is continuing, is growing, and that, that ratio is changing in a way that I think is bad for the world. And two, you know, when I worked at Facebook, there was something that Mark used to say a lot, and he would, and he had a chart and everything, and he would, he would continually say, “We’re 1% done.” And, and that was in response to people being too conservative, and people working to protect what we’d already achieved rather than taking risks to achieve the 99% that was left. And, and I feel like maybe for him and others at the company that, that, that ratio has changed, and that maybe they’re feeling like there is more to protect than there is to achieve for Facebook.

Laurie Segall: Mm-hmm.

Barry Schnitt: And, and that Facebook maybe is too important in the world to risk. But I would argue that if, if we save Facebook but the world burns, that we’ve made the wrong trade-off.

OK, we’ve got to take a quick break to hear from our sponsors. More with my guest after the break.

Laurie Segall: It’s a nuanced argument too, right? Like, it’d be, it’d be too easy to simplify this and say, “Well, you should take this down.” Right? Like, you look at Jack Dorsey. You know, he’s in the line of fire with, with the president right now.

Barry Schnitt: Right.

Laurie Segall: And- and now, having put, um, some of these labels on Trump’s tweets, uh, people are calling on Twitter to- to put labels on all these other world leaders’ tweets. You know, and we’re wondering, well, why does this get a label and this doesn’t get a label? And for me, having covered tech, um, throughout the years, it certainly seems like sometimes you get very confused about who’s making the decision, why the decision gets made. And those standards don’t seem to always apply in the same way, and they change quite a bit. So with Facebook and- and the argument of, it’s a slippery slope, can you see it from the other side? Like, how do you weigh that argument right now, in this current moment, that this could lead to, um, you know, censorship? It could lead to a lot of other unintended consequences for the platform.

Barry Schnitt: Yeah. Well, the- the slippery slope, it’s just a funny, you know, buzzword that- that I- you know, people invoke in- in all kinds of situations. And I’ve heard it in meetings, you know, for decades. And I’ve come to realize that in a lot of ways it’s a cop out. In some ways it means, yes, we know the right thing to do in this situation, but we’re not gonna do it because we don’t know the right thing to do in some of these future situations.

Laurie Segall: Right.

Barry Schnitt: And if- if that’s the case, um, you know, that, again, that feels like a cop out, you know? I- I know that the decision that I’m gonna make here is wrong, but that’s okay because I don’t know what to do about these other situations. And I make the argument that Facebook is too smart, has too many resources, you know, too much innovation, too much technology, to just use a slippery slope as a way of- of not doing the work to figure out, yes, we know the right thing to do. And let’s work towards figuring out what the right thing to do in these other situations will be, um, and make those decisions as we go. And it will be hard, and there will be some inconsistencies, and they will make mistakes. But I think all of that is less bad than the current situation. Which is rampant misinformation, rampant divisiveness, rampant incitement of violence. And I- and I think it’s worth the risk.

Laurie Segall: You know, it’s interesting, because when Zuckerberg testified in front of Congress for the first time.

Barry Schnitt: Yeah.

Laurie Segall: I remember, um, I was there. And day one was kind of the senators just asking, like, random questions about the internet.

Barry Schnitt: Right.

Laurie Segall: And it was kind of like, uh, you know, I remember everybody was kinda like, okay, the takeaway here is also that, you know, the government needs to educate themselves when it comes to technology. But day two, I remember thinking this, Barry. I- I was like, day two is really interesting. Because you had Zuckerberg in front of a lot of House members who were all asking him about taking down content. And who were all- this was, you know, I think this was a couple of years ago now, right?

Barry Schnitt: Yeah.

Laurie Segall: But who were all talking about, did they have a liberal bias? Um, and- and, you know, now we have another backdrop, which is regulation, um, and the power of these big tech companies. And before all this- this whole pandemic happened, we had the conversation about, are these companies monopolies? Which, you know, I think we’ve- we’ve pushed aside just a teeny bit, because there are so many other huge things happening in the world. But these companies are under a lot of pressure right now. And I make no judgment either way. I think, you know, Zuckerberg, uh, I think it’s- it’s, you know, having interviewed him many times and seeing some of this stuff, I- I don’t get the- the sense that it’s all just political, right? Or that he’s only just doing it all for the money. I actually think if you meet him, it’s- it’s really different. Um, but do you think that part of this could be outside forces too? I mean, and- and not just with this, but with a lot of the decisions that Facebook is- is making, and a lot of the pressure that they’re under right now.

Barry Schnitt: Yeah, yeah. No, I- I 100% think there’s outside forces at play here. And, um, that’s what I- that’s kinda what I was meaning to talk about with the company working more to protect than to- than to take risks to achieve. Is that I think those outside forces are real and scary, you know? I don’t think any company wants to draw the ire of the president. I mean, I think he’s shown that he will use whatever government levers he has at his disposal to make things difficult. I think, you know, Jeff Bezos is an example. Um, you know, the work that he’s doing against Section 230, which, you know, has been described as the 26 words that created the internet. He- he basically wrote an executive order to try to rescind that, you know? And that would be a dagger to the heart of all internet companies, I believe. And so I don’t think it’s the right answer. But it’s an example of you not wanting to get on the bad side of- of the president and this current administration, and I think that is playing very much into some of the decisions that the company is making. And I don’t think they’re being honest with themselves, or certainly with the public, that that’s what’s at play.

Laurie Segall: You know, I think it’s important to say, like, you, and a lot of, and we should mention that, like, a lot of other employees are speaking out. Some have resigned. Um, some are speaking out kind of behind the scenes. We just obtained a letter that folks wrote- a lot of early employees wrote it.

Barry Schnitt: Yeah.

Laurie Segall: Um, you know, all these employees, and I think this is a really important point, aren’t saying Facebook is terrible, Facebook is bad. I’ll read, like, a little. I thought this was so powerful. This is, um, this is from the letter that a lot of early employees, and I’m assuming many who you worked with, um, got together and they- they collectively wrote.

Barry Schnitt: For sure.

Laurie Segall: Like, this was- this was a letter, and- and correct me if I’m wrong, this was a letter to Mark, right? That they wrote, and it was published in the New York Times.

Barry Schnitt: Yeah.

Laurie Segall: It was kind of more of a petition, talking about the standards. But they wrote: “As early employees on teams across the country, we authored the original community standards, contributed code to products that gave voice to people and public figures, and helped to create a company culture around connection and freedom of expression. We grew up at Facebook, but it is no longer ours.” You know, I think that’s such a- I- I gotta say. And- and maybe this is me being a little inside baseball, as someone, um, you know, who’s, uh, looked at this company since, I would say, 2009 or, probably, around 2010. But- but that’s, you know, seeing a lot of these names, I- and I want our listeners to understand the names at the bottom of these, and even you, right? You led comms. Like, no one at Facebook spoke like this, or spoke out like this, for a very long time. So, I think it’s a really big deal that- that people are really beginning to question some of these decisions.

Barry Schnitt: I agree. Um, you know, those are, uh, I know most of those people. Um, there are- there are a lot of people smarter than me who- who signed that letter, and, uh, they’re dear friends. And, um, in fact, just coincidentally, I- I sent my, you know, draft post to one of them and said, hey, I’d love your feedback. And they said, well, that’s a coincidence, we happen to be working on something of our own. But yeah, I think, collectively, my experience of- of lying awake at night thinking, you know, what have I done, is not unique. And I think, you know, seeing the impact that Facebook has had on the world and- and being proud of it for a very long time, and then having that gradually become, you know, forms of- of shame and dismay, is- is pretty powerful. And, you know, these- these are not, you know, people who are cranky critics, you know? The kind of people who always, uh, criticize Facebook about both doing too much and too little. Being too far to the right, and too far to the left. These are people who joined the company willingly and poured their hearts and souls into it for years. And are really shocked and dismayed about, not what it’s become, but the impact it’s having on the world. And- and I say I have a distinction there. Because I- I don’t think it’s become evil. I, you know, I- I got that response from some people, you know, Facebook is evil, and I said, well, you know, that’s not a solution. You know, I’m all ears, you know? Let’s- let’s provide some action. But I- I do think Facebook is not understanding the negative impact that it’s having on the world. And I think it is not paying enough attention to potential solutions, and certainly not putting enough effort towards developing those solutions. And- and I wish I knew the- the exact answer. But I don’t. But I do know that Facebook has the ability to come up with those answers.
And I think that’s part of what’s so dismaying to- to people who used to work at the company, is that we know that Facebook can- can rise to these challenges. And has, you know, limitless possibilities. And- and for some reason it feels like they’re not trying to do that, and we just don’t understand why.

Laurie Segall: What- what do you think a solution is? I know they’ve created projects for journalists. They have an outside, almost editorial board for content now. Um, so they have done quite a bit over the last few years in response to some of the criticism they’ve- they’ve received on content decisions, with everything happening in journalism. I mean, what do you think is the solution?

Barry Schnitt: Well, I’ll answer the opposite question first. I’ll tell you what I don’t think is the solution, you know? Like, I- I know Facebook somewhat. And I’ll- I’ll tell you one thing that I believe, which is that Facebook doesn’t outsource things that are really important to the company. Never has, never will. And everything that you mentioned is effectively an outsourcing. The journalism project: hey, here’s some money, you guys go do some interesting work. Fact checking: hey, third parties, here’s some money, go do some interesting work. The board: hey, third parties, why don’t you make some decisions for us, because we can’t make them. And, you know, the- the amounts of money that they put towards these efforts sound big, but they are rounding errors in the- in the Facebook universe. And I think if they were serious about a solution to misinformation, to the incitement of violence, and just to coming up with a new way to treat content, that they would do something internal. They would devote engineers to it. And- and that’s the number- you know, having worked at Facebook, that’s the number one signal for- for whether Facebook thinks something is important. How many top engineers are working on it? And for everything that you described, I would argue the number is roughly zero. And that’s- that’s probably a little hyperbole. Because there’s definitely engineers working on these related problems. But the things that they’re touting as potential solutions to this are not actual solutions. They are- they are band-aids, and, you know, as I write, I think we’re actually hemorrhaging truth and civility on- on Facebook. And these are- these are a start, but they’re just at the margins. And, uh, and I think they need to devote significant resources. I- I propose, you know, kind of a symbol in my writing: that they just suspend their stock buyback, which they’ve committed another 14 billion to doing. That’s the kind of resources that this is gonna take.
And- and I don’t think they literally need to find 14 billion dollars; they have the money. Mostly what they need to find is the will. And I don’t know exactly what they need to do. But I know they need to commit to doing it. And- and that’s not even something they are willing to do thus far.

Laurie Segall: You know, it’s interesting. I remember interviewing Facebook’s former head of security, Alex Stamos, um, who was there, for context, during- it was their team that discovered Russian influence. He was there for the election interference.

Barry Schnitt: Right.

Laurie Segall: And something he said to me, this was for a documentary I did on Facebook. Something he said to me was, you know, for a very long time the growth team- they had more engineers on the growth team, and it was bigger, and the- the building was bigger, than for the security team. So I think that’s an interesting point you make about engineers. And- and- and to give a sense, like, these are human problems, right?

Barry Schnitt: Yeah.

Laurie Segall: And so, like, we’re looking at- you’re talking about technology. But- but the real problem, and this is not just Facebook, this is maybe beyond Facebook, for a lot of the bigger tech companies that, in the time I covered them, grew into these huge companies, is I don’t think they anticipated- when you talk about intent bias, I think there was an inability to look at the messy, complicated human problems that would happen in some capacity.

Barry Schnitt: Right, yeah. And- and, you know, the intent is important, because I think their intention is good. And their- and their hypotheses were not crazy, right? I mean, what Mark says is, you know, we think people should decide, right? But, you know, I think we can agree there are- there is truth and there are lies, you know? There is civil discourse and there is inciting violence. And I think we would all agree that truth should get more distribution and attention than lies, and we should agree that civil discourse should get more attention than inciting of violence. And what Mark would say is, “Yes, and, and people will figure that out, and they will decide for themselves.” But A, that’s not what’s happening, and B, some people aren’t equipped to discern and see, you know, there are very powerful forces that are deliberately trying to trick them into thinking that one is the other. And for Facebook to see all of that, and throw up its hands and say, “No, we’re just free speech, and we’ll let the people decide,” I think is, is wrong. And that’s just what I’m trying to get them to realize.

Laurie Segall: Right. Looking at this letter that a lot of the early employees, and, you know, like, some of the early employees, these are early architects of Facebook. But there are all sorts of people who signed this, um-

Barry Schnitt: Yeah.

Laurie Segall: Who co-created this letter. And we’ll put it in our show notes. I would suggest people read it, just because, whether you agree with it or not, it’s really an interesting look at, I think, how people are viewing this moment in time. And, and the implications.

Barry Schnitt: I agree 100%.

Laurie Segall: And the implications. Um, you know, what some of these folks said, um, in the letter was, since Facebook’s inception, researchers have learned a lot more about group psychology and the dynamics of mass persuasion. You know, um, we understand the power words have to increase the likelihood of violence. You know, I, I remember being, um, like, at CNN. Um, I was outside when there was the bomb that was pulled out of the, the building. This was, like, uh, I would say a year ago or something. And I remember someone had sent this, this bomb, and it ended up in the mail room. Thank God, uh, the incredible security at the time had found this. But you know, um, it had, it had stemmed from, I think, posts and tweets. And I remember thinking, like, “Oh my God. Like, this is actually happening.” You know, and it started there. And, and the threats had started there. And then, you know, I was watching a bomb pulled out of our building where I had been for 10 years. And it was such a, maybe as someone who, uh, has spent my adult career covering tech, it was just such a moment for me, thinking, like, “Wow. The implications, like, can be very real life.” Um, so I thought that line in, in this piece was really interesting.

Barry Schnitt: Yeah. You know, the, it’s a good point, you know. This is not academic or theoretical. Um, you know, it’s happening every day. People are being radicalized, you know, based on what they’re seeing on Facebook. And by the way, it’s not just Facebook. I wrote about Facebook because I worked there. I kn- I know more about the company, you know. The same could be said for pretty much ev-, every technology company that hosts user generated content, you know. I think Twitter’s come up with unique solutions for, uh, Trump’s tweets. But there’s a lot of work that they need to do, too. Um, you know, around abuse and around misinformation, et cetera. You know, they, and so I just don’t know how to tell them what to do.

Laurie Segall: Right.

Barry Schnitt: Um, but yeah. This is, this is not happening in a far away place. This is not some, uh, you know, dystopian future that we can imagine, you know. People are walking into a pizza parlor with a, with an assault rifle, because they’re, they believe that it is a child sex ring. You know, with, with a presidential candidate. You know, the Pizzagate. You know, it’s, uh, it’s absurd, but it’s literally happening. People are planting bombs at CNN because they believe that, you know, based on what they’ve seen on Twitter and Facebook, that you know, you guys are the root of all evil. Um, and, uh, and that’s something that we really need to take more seriously.

Ok we’ve got to take a quick break to hear from our sponsors, more with my guest after the break.

Laurie Segall: I guess the, the question for me is, I mean, this is such a tight-lipped company, right? Um, you know, even covering Facebook, it’s a fascinating company. And, and it has so much impact. And it has completely transformed the world. But this is not a company where employees, like, generally freely tweet about how they feel. Or Facebook, I’m sorry, I should say post on Facebook about how they feel. That is, you know, that’s not something that we’ve seen. We’re in the midst of a global pandemic. We have protests and real, real anger. And you know, rightfully so, in this, in this country and around the world, given what’s happening with the racial divisions and, and racism. And, and I think, looking at the fact that people at Facebook, and, and hearing what I’m hearing from kind of former employees and employees about people really kind of, that turmoil behind the scenes, what do you think it is about now that’s, you know, causing people to maybe take the risk? To say something when maybe they wouldn’t have before?

Barry Schnitt: Well, I think it’s two things, right? One, that it’s, it’s not theoretical, you know, anymore. It’s not academic, you know. We are, we are seeing, it’s not isolated. You know, it’s not just, you know, one incident of, you know, a crazy person that you can dismiss. You’re seeing it, you know, simmering across the country and around the world. Just people being incited to violence and radicalized. And two, you know, you, you hit on it. The stakes are so high. We, we are literally in the middle of a global pandemic, you know? Nearly half a million people have died. The smart people who know about viruses say many more people will die, likely in the fall. And in the meantime, Facebook is providing health, uh, misinformation to them. You know? It’s, I don’t, I, as I write, you know, the only way that the stakes could be higher is if we were on the brink of a world war. And I don’t think we are right now. But I don’t see the logical conclusion of this being a lasting peace, right? You know, it, it will, it will be, there will be some violent outcome of all of this if it is not checked in some way. And in, in the meantime, in the short term, a lot of people will make health decisions that will be detrimental to their life, and uh, and Facebook will be complicit in it. And, and I think that’s, that’s just something that’s got to change.

Laurie Segall: You know, Facebook has always come under fire throughout, uh, and by the way, as someone who’s been on the other side of it as a journalist who’s asked very hard questions, interviewing Zuckerberg, right?

Barry Schnitt: Right.

Laurie Segall: In the midst of Cambridge Analytica. And during some of the harder moments in the company. The company has always, you know, the, I feel like they, they’ve played defense for a very long time. And so there is, I think, a certain mentality around that. And knowing that you’re going to get criticized, but you kind of, like, you just keep going if you have the mission. I think that’s in the DNA of Facebook, if I could kind of define it in any way.

Barry Schnitt: For sure.

Laurie Segall: Do you think that this time is any different? That there, that they’ll listen to some of the former employees, or, or some of the, you know, some of the, the, it, it, maybe because it’s more people behind the scenes? I know how much Zuckerberg does value the people he works with.

Barry Schnitt: The short answer is I don’t know. 

Laurie Segall: Yeah.

Barry Schnitt: I am, have, having said all the things that I say and believe about Facebook, um, I am, you know, almost a decade removed from the company.

Laurie Segall: Sure.

Barry Schnitt: I still have a lot of friends there. Um, but you know, I do think the opinion of the employees is, is very highly valued. And that is something that, in the past, has moved the company. There was a transcript that I read of an all company meeting earlier this week, and it, it seems like, you know, for the most part, the, at least the vocal people are very against the, the current stance of the company. I am sure that is weighing on the, the, the leadership. Whether it makes a difference, I don’t know. Whether the external pressure will make a difference, I don’t know. The, the one thing that’s unique about the, the external pressure right now is that it is, it is so divided. In, in most cases, when I was at, at Facebook, there were cases where people were, you know, on both sides of an issue, telling us w-we were wrong. But mostly, it was, uh, a united front telling Facebook it was wrong. You know, “You’re doing the wrong thing on privacy. You’re being too open. Um, you’re, you know, you’re not, um, taking down this objectionable content.” Um, but this is a case where people on the right are saying, “You’re censoring too much and you’re taking too active a role in content,” and people on the left saying, “You’re not taking an active enough role.” And when it’s divided like that, I don’t know how you make the calculation for, you know, which is the path of least resistance.

Laurie Segall: Right.

Barry Schnitt: And I do think Facebook has made that challenge, I mean, made that calculation in the past. Um, and right now, because the forces of, um, you know, “leave the content alone” are in power, I, I worry that they will make the decision that that is the path of least resistance.

Laurie Segall: In this, this letter, too, that these employees sent, they said “Facebook isn’t neutral, and it never has been. Making the world more open and connected, strengthening communities, giving everyone a voice, these are not neutral ideas. Fact checking is not censorship, labeling a call to violence is not authoritarianism.” I mean, covering a lot of these companies, it was, for so long, um, “Hands off, we’re neutral, we’re not responsible, we’re not media companies.” There’s always been this tension for the last decade. And a lot of this, um, a lot of these companies, as you talk about, those 26 words that, that, uh, saved the internet, based off of Section 230, right, which makes it so these companies, to a degree, don’t have to be liable for certain content. But it certainly seems like we’re seeing a shift, in that words have more meaning and have, consequences that are far reaching. And the stakes seem incredibly high, in that, I, you know, I, I think the debate is, is certainly open of, of, you know. And you look at, I, I keep looking at Jack Dorsey at what he’s kind of walked into as well. And now all the calls to do all these other things, and where are they going to draw the line? So it certainly, uh, it certainly feels complicated.

Barry Schnitt: Yeah. It is. And I, and I tried to address that, um, in that, you know, and let them know, let Facebook know, you know, I, I know this is not an easy problem. I know it’s going to be hard. But, but to try to give them the courage to do it, and do something about it. And, and I think the, the writing in the, in the letter from my former colleagues and my friends is, is brilliant. And I, I, it- Those insights are, are so spot on. And uh, and I hope, I hope Facebook listens.

Laurie Segall: I thought what you said about Facebook’s strengths are its weaknesses as well, which is, you know, there’s always been that story of technology, right? Which is, it can do such incredible things, and it can also do such terrible things, too. And, and we always just walk this fine line. And, and it can get incredibly murky, you know? And I think what we’ve seen over the last couple years are some of, and maybe even the last not even couple years, even before then, really, you know, these ethical issues that, that I, I think a lot of these people, are working through. And sometimes better than others.

Barry Schnitt: Yeah. I, I agree. And uh, you know, for, for Facebook, you know, the, their, their biggest strength is the, the connections that they’ve created, and, and fostered and facilitated between literally billions of people. And uh, there’s a lot of benefit to that. But you know, we’re also realizing that the, it creates some vulnerabilities. And there are evil forces in the world that are exploiting them. Um, and I just think there’s more that Facebook can and should do about it. And uh, and I hope they will.

Laurie Segall: Um, from like, uh, on a personal note, you’ve, you have been at Facebook, uh, you’ve been at Pinterest. You’ve been a part of these companies that, uh, that really have kind of, uh, shaped people and behaviors and whatnot. Um, what is, what is your takeaway on people?

Barry Schnitt: Wow. That’s a really broad question. What’s my takeaway on people? Uh, well, I, I-

Laurie Segall: Yeah. You’ve kind of been in the fray. You’ve been in it. You’ve watched, you’ve watched things buil-, you’ve built things, you’ve, you know, you were in the line of fire at Facebook. Pinterest is a more delicate company, I would say, knowing, knowing that company. It’s a more delicate culture, too. But, but what’s, you know, you, you’ve just had such extraordinary, I would say, experience kinda being, being in these places and being on the front lines.

Barry Schnitt: Well, yeah. I mean, I… What I’ve observed in those companies and what I’ve felt is that, you know, they, most of the time, the overwhelming amount of time, the people wanna do the right thing and they wanna change the world for better. Um, and that is, that is a, at least a large part of the, what’s motivating. And I, I believe it’s what, what’s motivating Facebook, I believe it’s what’s motivating Mark. Um, I, I believe unfortunately, in some cases, they are blind to the consequences of the decisions they’re making. Um, I believe they are not giving enough weight to the bad outcomes, you know, as I state. But, but I do believe these people are good and they’re trying to do good and they want the best for the world. And I think that’s why, you know, to your point, that they’re usually silent but they’re not being silent right now because they are seeing that that intention is not being realized. And in fact, quite the opposite. They, they may be actually damaging the world and, and they don’t want to and they wanna do something about it. And so my whole goal was, was to try to give them some ammunition and, and maybe put some form around their thoughts and ideas to, to move the discussion forward towards some action. Um, and I hope it gets there.

Laurie Segall: And you think action would be labeling more of this content.

Barry Schnitt: Well, I think it would be not saying we’re taking a hands-off approach to content. I think it would be taking some responsibility for the content. And, you know, again, there are lies in the world and there are truths and, you know, having a hand in making sure the truth is more seen than the lies, I, I don’t think is a bad thing, and I don’t think it’s censorship. There is civil discourse and there’s inciting violence. I don’t think taking an active hand and saying, “This civil discourse is of more value than this inciting violence,” uh, is a bad thing. And, that’s, that’s just a, a, a bridge that they haven’t been willing to cross and I’m, I’m urging them to cross it. And I, I don’t know whether that means one will be labeled. I don’t know whether that means the distribution will be throttled on one and, and surged on another. I don’t actually know the solution, but they need to cross that bridge first, uh, and, and commit to, to having those outcomes be an actual goal. You know, free speech is not an outcome. Free speech is, is a means to an end, but the end right now is, is damaging the world. So how do we get to an end that actually makes the world better? And that’s the thing that I’m hoping we’ll get to.

Laurie Segall: What do you say to the folks? You know, you, you know, um, any time some people, like, come and speak out, at Facebook or whatnot, I would say… I saw an executive. I was like, I think it was Dan Rose or something, say, like, you know… Uh, what did he say? He said something like, you know, just, “Early people who have no connection,” or something like that. He was saying, um, you know, “Don’t even know the nuances or complexities of this argument. You haven’t been there in awhile.” So to defend yourself, Barry, what do you say to the executives, or the, the people who say, “Well, you haven’t been there in awhile. You don’t know. Why do you get to say something?”

Barry Schnitt: Well, so I know Dan Rose and I, I think he was having an emotional reaction to feeling that his life’s work is being attacked and people he cares about and is loyal to are attacked. And I, I don’t think that tweet was actually reflective of him. I, I think he, he more meant, you know, kinda what you’re saying, is that, you know, you guys have not been at the company for a long time. You don’t know the, the content of the discussions. They were, they have been deep and endless, um, and you should trust the people that work there. And to that I would say I do trust the people who work there, but I think you were not realizing the consequences of those decisions. I am sure they were thoughtful, I’m sure they were endless, I’m sure they were as deep as they could possibly be, but I believe they were wrong and I believe the evidence supports my position and not yours- and you have a responsibility to do something. And even more than that, I think you have the ability to do something and I think it could be a tremendous success and opportunity for the company, um, and for the, and for the world, more importantly, um, if they would seize it. Um, and I, I urge you to do so.

Laurie Segall: And, you know, and, and also when you talk about the decisions people at the company are making, I, I feel like I couldn’t end this interview without saying, you know, it, I think it was reported that there was one person of color, one woman of color in the room when the, one of the decisions was made on the Trump post. You know? So when we talk about… And this is a larger conversation. We talk about the decisions that these, that folks are making: who are the people making the decisions, and are they diverse, with a diverse group of perspectives? I think that’s something we have to hold onto. That, I, I think we’re seeing it come to a head now in this moment. You know, I mean, there’s just not enough different voices and perspectives. And this isn’t just Facebook. I mean, all of Silicon Valley has a massive problem.

Barry Schnitt: Yeah, I agree with that and I, and as you say it, I wish, I wish I had brought it up in, in my writing but, you know, and it’s, it’s diversity across every dimension: race, religion, socio-economic status, education. You know, the, the people who work at Facebook and are making the decisions are highly educated, highly sophisticated. They’re not seeing this stuff in their feeds because they’re not posting it, and the people that they associate with, uh, know the difference. But lots of other people are seeing it in their feeds and they don’t know that it’s a lie. They don’t know that it’s inciting violence in some cases, you know. And I think if more of the leadership team had the perspective of the people who were seeing all of the content on Facebook based on their race, their religion, their economic status, you know, where they come from, I think they would have a different opinion of the impact. ‘Cause they’re not seeing it themselves ’cause they’re not exposed to it.

Laurie Segall: Um, and then I guess last question: Have you had other folks from inside the company reach out to you after kinda speaking out and, and writing? Do other people share your feelings?

Barry Schnitt: Yeah. I mean, it was, uh… You know, the, the response has been fairly overwhelming. You know, I didn’t know if anyone would listen or care. Uh, it seems like people are and do, which, which is, it’s just gratifying. And, and I would say the most common response that I got from current employees, uh, former employees, and actually even people who never worked at Facebook but feel a connection to it in, in some way, probably because they love the product and have been on it for years, is some form of, “Thank you for writing this. You encapsulated some feelings and thoughts that I’ve had for awhile.” And, uh, and so I, I am, I am one person who, who, who, you know, had my own ideas, but it, it seems like they are shared by a lot of people and, and I hope that gives them some weight with, with Facebook, and that they, and that they do something. ‘Cause I, I want them to. I know they can. I think they should. And, and if, and if they did, I think it could be, uh, tremendously valuable to the world.

Laurie Segall: And last question, I promise.

Barry Schnitt: Sure.

Laurie Segall: You, um, you know, you were leading the charge in, in comms during some of these very intense situations. Privacy was a huge one back when you were there.

Barry Schnitt: Right.

Laurie Segall: So you kinda went into the, the line of fire. Um, what advice would you give Mark right now?

Barry Schnitt: Oh, man. Um, what advice would I give Mark right now? I would… Well I, I tried to give some of it, you know, without naming him in, in my writing and, and I think it is to pay attention to the outcomes, not just your intent. And to have courage against the, the critics who, who have power to, to limit and damage your business and your company. And to have faith in your ability to do something really remarkable for the world, in a new way and in a, in a way that you haven’t before. And not just enabling free expression, but an outcome of actually informing people, and improving the, their knowledge of the world and their understanding of the world and enabling them to make the right decisions about it. Um, and, and I think that’s something that he has the ability to do and I, I urge him to do it.

First Contact is a production of Dot Dot Dot Media, executive produced by Laurie Segall and Derek Dodge. This episode was produced and edited by Sabine Jansen and Jack Regan. The original theme music is by Xander Singh.

First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeart Radio.