Yael Eisenstat spent six months as Facebook’s global head of elections integrity operations. But before that, she spent more than a decade in the CIA. She also served as a national security advisor to Vice President Biden.
During her time working for the government, she was frequently in hostile situations. Sometimes she found herself sipping tea across the table from suspected extremists. Her job was to assess the situation and report back. To gather intelligence. Here’s her assessment of Facebook after a brief time there.
Everything that they seemed to be doing was about pushing responsibility onto others.
Laurie Segall: Now I want you to give me your assessment of the situation, having been in [Facebook] for six months, of what’s going on there and what you think needs to happen. Where do we stand on national security?…What’s the assessment?
Yael Eisenstat: I think there are lots of people [at Facebook] who actually do, in their core, believe that what they are doing is the right thing, and want to do the right thing. So I want to start with that.
Laurie Segall: So you’ve started with the good news.
Yael Eisenstat: I start with the good news.
But that said, the biggest thing that I saw there is that everything that they seemed to be doing was about pushing responsibility onto others. I even think the content moderation board that they’re setting up is a bit about pushing responsibility onto someone else, instead of proactively taking responsibility for the thing you built, and for how it’s being used. So there’s a lot of whack-a-mole approaches, right?…
…A lot of it, especially in the election side of it, is whack-a-mole, right? It is, “Okay, you told us this is bad. Maybe we need to go take this down. Oh, was this hate speech? Should we take it down? Should we not take it down?” None of those things are addressing the core underlying systemic issue of this platform, and that is: It is a platform that is using our human behavioral data to segment us more and more into these tribes of who we are. Putting us in these little different buckets in order to target us with ads, and in order to do that, you have to keep us engaged. And so you’re figuring out every possible way to nudge us, to push us, to persuade us, to manipulate us, to keep our engagement there. And so all these whack-a-mole approaches, none of them are addressing the systemic issue, which is the business model.
Laurie Segall: Did you have those conversations when you were inside?
Yael Eisenstat: Never.
Laurie Segall: Did you never bring that up?
Yael Eisenstat: One time, and one time only, with someone very senior, did I. We were traveling and I said to the person “Okay, I get it. We’re doing all this whack-a-mole stuff right now and we’re trying to figure out how to deal with this election. But have you ever actually sat back and had the conversation about who does this company actually want to be? Who do we want to be in this space?” And this person admitted that, no.
So no, I don’t remember ever having those conversations. And I’m not saying these conversations don’t happen. They didn’t happen for me. But as long as your company continues to report out user engagement metrics so that Wall Street will continue to reward you, then you are not, in any way, actually listening to what so many experts on the outside are telling you is actually your core problem. And to me, that’s the biggest thing. Every conversation I was a part of was always sort of like, “What’s the quickest way that can scale?”…Well, you know what? When you’re talking about elections, and when you’re talking about election interference, and political manipulation, and all of that, it might not scale. Because the way a Russian wants to interfere in the elections in the U.S. is going to be a very different situation than the way somebody wants to inflame ethnic tensions in India. And the idea that I have to come up with a solution that scales globally is part of the problem.