First Contact Transcript

Episode 15: Inside the Minds of Trolls

First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeart Radio.

Camille Francois: I talk to trolls and hackers who, uh, made a living doing, uh, disinformation and propaganda for hire. People who said, “Initially I joined because I wanted to do campaign messaging for, you know, for my candidate to win. And then I woke up and I was just sending rape threats to women journalists using fake accounts and wondered, you know, what happened? What am I doing there?”

Who are the people who spread online disinformation? The so-called trolls that you hear about in the news whose jobs are to distort facts and create chaos? 

I want to introduce you to someone who knows them well, Camille Francois. She’s a security researcher who spends a lot of her time in the darkest corners of the Internet.

Camille is the chief innovation officer at Graphika which is a social media analytics firm hired by major companies to identify and fight online disinformation. To give you some context, her team was a big part of helping us uncover the extent of Russian influence during the 2016 election. 

And she knows a lot about trolls. Unlike most of the people talking about them and the damage they’re causing, she actually spends a lot of her time talking directly to trolls, to understand the why: who they are and why they manipulate social media to distort the truth.

I’m excited for you to get to know her. She’s unsuspecting and quick to laugh. Which I actually think is pretty powerful. To give you a sense, she just kind of roamed right into our production studio, a New York City high rise with lots of security. No one stopped her.

Camille Francois: Yeah, well no one asked me what I was doing there so …

So don’t underestimate her, she’s taking on one of the most extraordinary threats of our time. And it’s her humanity that gives her an edge. 

I had no idea when I sat across from her at a New York dinner party that I would leave asking myself a question that I want to pose during this episode: What if the key to fighting disinformation online, and some of the most alarming cyber threats coming in the future, starts with empathy?

I’m Laurie Segall and this is First Contact.

Laurie Segall: So, welcome to First Contact. I’m super excited to have you here.

Camille Francois: I’m super excited to be here. I have to say, I love the name of the podcast. I’m a huge Star Trek fan, so First Contact speaks to me. 

Laurie Segall: Oh, I love it. I love that. And, you know, um. Well, first of all, I should do kind of a quick intro to you, which will probably play into, like, why you like Star Trek, right? And just, like, your background is really, really cool and interesting. Um, you’ve spent your whole career kind of at this intersection of cybersecurity, public policy and human rights. Um, you’ve worked for the French government, advised governments, um, on- on all sorts of policy issues. And then you were a principal researcher, I’ve seen, at Jigsaw, when you worked at Google. Jigsaw, this is this think-tank and technology incubator within Google. And now you’re the Chief Innovation Officer at Graphika, right? Is that- that’s-

Camille Francois: That’s right, yeah.

Laurie Segall: So, I mean, you’ve essentially spent your whole career digging into fascinating, interesting issues around cybersecurity and human rights.

Camille Francois: Yeah, I really lucked out. You know, when I went to school, I said, “I think, uh, this is what I want to do for a living. It’s like the intersection of human rights and tech and geopolitics.” And I think back then, people were laughing, saying, “That’s not a job.”

Laurie Segall: Right.

Camille Francois: Uh, I think I really lucked out being able to work on these issues for- for so long.

Laurie Segall: And so how we start with our First Contact, uh, is we talk about our first contact. Our first contact happened at a dinner party, right? Um, I guess we should say to our listeners, like, we were at a dinner party where no-one could talk about what they did, which is, I guess, this new concept of, like, we’re not defined by what we do. Although, what you do is just the coolest thing in the world and it’s so badass. Um, but we sat across from each other and everyone guessed, we were sitting around a table and everyone was, like, guessing what you did. Do you remember what people were saying?

Camille Francois: Yeah, it was really fun. So, uh, for the dinner, I had, uh, brought a- a bottle of natural wine, a biodynamic wine and that’s kind of the only thing that- that, you know, people knew about me. And so they were like, “Well, you … Obviously you work in the wine world”, and I think I get that because I’m French. And then some people were like, “No, she works in sustainability”, which, frankly, is kind of true given what I actually do, I work on the sustainability of our conversations online. I don’t think this is where they were going with it, but yeah, that’s- that’s what they said. I don’t know what- what you got, what did people say about you?

Laurie Segall: There was this weird guy next to me that kept saying everyone was a dancer. Do you remember that?

Camille Francois: Yeah, that’s right.

Laurie Segall: That was super weird. Uhm. 

Camille Francois: I thought you were a diplomat or something like that. Like, I had something super specific for you.

Laurie Segall: I will totally take that. And I’m forgetting what I said, but I think I- I said you, like, managed chaos or something. But-

Camille Francois: Also true. 

Laurie Segall: But, I mean, what people didn’t really realize is, like, sitting around this small dinner table was like, someone who is at the forefront of fighting disinformation and nation states and who works in the dark corners of the web and sees the craziest stuff ever. And I don’t know if this is, like, the right thing to say because, like, I hate when dudes do this, but, like, you do seem really unsuspecting, right?

Camille Francois: Yeah. 

Laurie Segall: Like, you know-

Camille Francois: It helps.

Laurie Segall: Like, we were just joking about how you kind of social engineered your way into this building, like, we didn’t come get you in time and, like, you kind of, like-

Camille Francois: Yeah, well no one asked me what I was doing there so … 

Laurie Segall: Right. You’re like this … You come in with this, like, cute shirt that says facts and, like, no-one questions you. You just have this, like, a … This ability to understand and I- I have … I’m gonna go somewhere with this. Um- You have this, uh, extraordinarily human quality about you and I think that’s what a lot of people picked up on at the dinner table. And I think that might be, if I could say anything, what is kind of your competitive edge when it comes to what you do for your day job?

Camille Francois: Yeah. You know, I think, um, I think people don’t realize that empathy today is a critical component of cybersecurity. A lot of people have thought in order to do cybersecurity right, you need to be really good at, like, managing tubes, breaking tubes, you know, managing networks. I think today, we realize that if you want to secure systems, secure conversations, secure the way we live online, you really have to have an- an empathic heart and to understand the types of threats and how they evolve.

Laurie Segall: They, like … So can we get really quick into … Because I wanna go back to your background, but just so people know, like, what is your day to day? Like I just envision you in an office, like, fighting troll farms online and making sure democracy isn’t ruined. But I don’t know what that actually looks like, so I-

Camille Francois: I know, it’s so hard to say. 

Laurie Segall: I know big tech companies call on you to help with these different campaigns and, um, you’ve had some pretty high profile reports that have helped us understand the extent of Russian influence, but, like, what does your day to day look like?

Camille Francois: Yeah. Well, you know, first, I don’t do this by myself. I’m very lucky that I get to manage a really fantastic team. Yeah, people coming from, like, different types of background. And so together we analyze online conversations and we look for markers that they are being manipulated. Now what’s really fun in 2020 is, you know, we tend to think when conversations are manipulated, it’s either trolls or bots or Russians. But today, there’s- there’s quite a big diversity of actors in ways in which people manipulate public conversations.

Laurie Segall: Hmm.

Camille Francois: And so we investigate, because we want to be able to have more people do that and to have a public that’s better informed and better equipped, to tackle these threats. We also build tools to make that easier, so we have an R and D lab, they just do scientific research on how can we better detect patterns of manipulation of online conversations?

Laurie Segall: So, look, you can essentially see it coming, understand if people are being manipulated, understand kind of who’s behind it and try to stop it. So, I’m gonna put it in- in this way of how I love to look at the hacking community-

Camille Francois: Yeah.

Laurie Segall: And security researchers. And I got really fascinated in the security community years ago, I went to Black Hat and DEF CON for the first time. And for our listeners, th- those are, like, hacker conferences in- in Las Vegas. So if you could just imagine a bunch of, like, security researchers, like, who are finding vulnerabilities online and, like, and weird Vegas hotel room … I mean, it’s just such, like, a- an interesting community. And I remember my first exposure to it, I was like, “Whoa, these people are like these modern day superheroes”, right? Like, “They have the ability to fight the bad guys online, and they have this- this skill, like, they can …” As we talk about social engineering and you being able to kind of walk in here and, and understand how computer systems work. Like, you could use this skill for bad or for good, right?

Camille Francois: Yeah.

Laurie Segall: And so I remember going to Black Hat and DEF CON for the first time and everyone had code names-

Camille Francois: Yeah.

Laurie Segall: Right? Do you … I mean, right? Like, is that a thing in the- in your community, like-

Camille Francois: It’s funny that you say that because last year I was at Black Hat with Bruce Schneier and Eva Galperin and our panel was, “Please use your skills for good.”

Laurie Segall: I mean, like, back, like, six or seven, eight years ago, it was like, even when I started going to these hacker conferences, it was like if you found a vulnerability or if you started saying, “Hey, this isn’t good”, you could get into trouble for saying it.

Camille Francois: You still can. Yeah.

Laurie Segall: Right. And so … And there was something, um there was something very anti-establishment about it and being that person who was fighting the bad guys and who was calling attention to this and who was kind of saying this, I mean, do you have a codename?

Camille Francois: No, I have a … I mean, yes and no. It’s not the greatest code name. I- I go by Cam Tronics.

Laurie Segall: Okay. So, like, yes.

Camille Francois: It’s, like, pretty easy. Well, I mean, like, it’s-

Laurie Segall: Yeah.

Camille Francois: It’s not the most secret code name.

Laurie Segall: Yeah. That’s kind of cool though.

Camille Francois: Yeah.

Laurie Segall: What does it mean?

Camille Francois: Cam Tronics?

Laurie Segall: Yeah. Like, what’s behind … What was behind it?

Camille Francois: Um, you know, I can’t even remember. It’s been with me for a little while. It followed me around. Um, yeah.

Laurie Segall: That’s interesting. Um, right, so but, I- I guess what I’m getting at is there was always this spirit of the community that was … had a lot of conviction and wanted to do something and raise awareness. And so now we’re in this certain moment.

Camille Francois: Yeah.

Laurie Segall: And so, I guess, I- I’m curious about your background… Before we get into, like, all the stuff you’re doing to help, um, fight kind of this- this current moment in time, I’d be curious to know, just, like, how did you get into all this?

Camille Francois: I’m just … I’ll answer that in a second, but I’m really interested in what you’re saying about how this idea that initially was quite scary, right?

Laurie Segall: Mm-hmm.

Camille Francois: Let’s enable a bunch of kids, to poke at the systems from the outside and find the holes in it.

Laurie Segall: Yeah.

Camille Francois: Kind of became our best idea for how to do security in a complex world, right?

Laurie Segall: Right.

Camille Francois: And I think what’s really interesting is that model is going to apply to more and more questions. So I think a lot about biases in algorithms, right?

Laurie Segall: Mm-hmm.

Camille Francois: So when machine learning makes decisions that are deeply racist or deeply sexist, right? And all those problems that we’re seeing emerge and that frankly, we don’t have a lot of solutions to- to tackle. Could the model of the bias bounty also be applied there, right?

Laurie Segall: Hmm.

Camille Francois: How many more learnings can we draw from the hacker mindset and from the security community about how we do security in an age where we have more complex problems with technologies, right?

Laurie Segall: Right.

Camille Francois: I think there’s a lot of innovation and promising areas to- to explore there.

Laurie Segall: That’s interesting. So tell me about how you got into this? Were you always kind of a free spirit?

Camille Francois: I mean, who answers no to that question?

Laurie Segall: Some people and maybe …

Camille Francois: I guess, um, in French, so-

Laurie Segall: Yeah.

Camille Francois: I know like … Um, uh, no, I- I- I- I don’t know how I really got into this is the real answer.

Laurie Segall: Mm-hmm.

Camille Francois: I think I was always quite, obsessed with technology. I grew up in France, and very quickly I realized that the type of questions that I wanted to ask, I needed to, you know, to go to the US to- to- to ask and to- to- to- to- to work on. Um-

Laurie Segall: But, like, take me back there.

Camille Francois: I wasn’t-

Laurie Segall: Like, were you- were you tinkering, were you, like-

Camille Francois: Yeah.

Laurie Segall: Playing on the internet? Were you in weird chat rooms? Like-

Camille Francois: Okay, I’ll tell you the truth.

Laurie Segall: Yeah. What- what- what does that mean?

Camille Francois: I was a very optimistic person and I still am, you know, like, my team, like, often says I’m- I’m the most optimistic person looking at, like, the darkest stuff.

Laurie Segall: Right.

Camille Francois: I was really excited by the promises of the internet. I really thought that the internet was going to bring democracy, was going to bring more diversity, was going to connect, connect us to one another. I was- I was just really excited about the promise of the internet.

Laurie Segall: Why-

Camille Francois: Perhaps it’s a generational thing? 

Laurie Segall: Was it just how you were raised? Like, was it- was it just you love-

Camille Francois: No, I just-

Laurie Segall: The spirit of it?

Camille Francois: Yeah, I love the spirit of it. I loved it. I remember, you know, when my father got me my first computer, I just thought it was magic. And so I think the first, you know, projects that I worked on were just so extraordinarily optimistic. So, I remember with a close friend of mine, we had this project called Citizen Wifi and we were knocking on doors around Paris and we were asking people, “Could you remove the password on your wifi so that everyone can connect to your wifi and we’ll give you a little sticker that says Citizen Wifi?” I mean, it’s not remove the password, it’s like create, you know, create a guest network.

Laurie Segall: Mm-hmm.

Camille Francois: “So that more people can access the internet and you’re a part of the guest wifi community.” Which, you know, again, was an extraordinarily optimistic idea of just, like, let’s put more people online and things are gonna be great. Um-

Laurie Segall: Yeah.

Camille Francois: So yeah, I think I really come from a background that looks at the internet with very rosy, optimistic-

Laurie Segall: Hmm.

Camille Francois: Cheerful eyes.

Laurie Segall: I … Well, I als-

Camille Francois: Which I still do, way after all the crazy stuff I see, I still, you know-

Laurie Segall: Yeah.

Camille Francois: I still- I still have hopes for what, you know, digital technologies can bring to society.

Laurie Segall: Well, I also saw you wanted to be a space baker, which I don’t even-

Camille Francois: Yeah.

Laurie Segall: Know what that … So, what- what on earth does that mean? Because we’re going to get into Russian influence and- and fighting the bad guys, but I feel like we have to- we have to start at some point with space baker.

Camille Francois: You know, like, Neelix in Voyager or, you know, Quark on Deep Space Nine just- just have, like, a little bakery or bar and you’re on a space station and you make croissant, you know, it’s kind of nice. 

Laurie Segall: That’s-

Camille Francois: You meet people from everywhere. You’re like, “Hey, which quadrant are you from? Do you want a croissant today?” Like, just … Like, a nice little space baker.

Laurie Segall: Um, and by the way- No, no, no, it’s not that I’m, like, speechless, I’m just like, “Wow, what an interesting dream.” Like, I wanted to be a Broadway actress. Unfortunately that … the … you know, I didn’t really live into that version, but I wish I were cool enough that that was my dream to do that. I love that. Um, and so you ended up going and getting a degree in human rights and international security, uh-

Camille Francois: Yeah, that was- that was far from the space baking.

Laurie Segall: Yeah, like-

Camille Francois: The space baking route was not straight-forward.

Laurie Segall: Yeah, was there- was there a certain moment that you were … That- that that dream was crushed? Was there any- any indicator that that wasn’t gonna be it?

Camille Francois: No, I’m just really bad at cooking.

Laurie Segall: Right.

Camille Francois: That’s like the general-

Laurie Segall: Totally. I totally … Uh, you’re in safe … You had to save space for that.

Camille Francois: Yeah, I am just so bad at it.

Laurie Segall: Well, it’s good because the things you’re good at, we really need right now, which is, like, fighting really bad guys online. Um, so you ended up getting a degree in human rights and international security. And you ended up going to- to DARPA eventually. Can you tell us a little bit about your work there?

Camille Francois: Yeah. I mean, that was a long time ago. It was … It’s interesting, it was a project that I was working on at the end of my grad studies.

Laurie Segall: Mm-hmm.

Camille Francois: And it was just looking at, uh, privacy and security. So, I had, um, fairly conservative cybersecurity professor at Columbia-

Laurie Segall: Mm-hmm.

Camille Francois: Uh, who- who called me a- a hippie. And I told him, you know, I’m- I’m really not a-

Laurie Segall: Mm-hmm.

Camille Francois: Very radical hippie and what I’m telling you about digital rights, I really don’t think it’s very radical. I think it’s just really something that needs to be heard and discussed. And I was telling him, like, you know, I- I’m really concerned by the growing gap between national security discussions on one end and- and what I consider to be important digital rights and- and human rights security discussion on the other hand. And- and I think more should be done to bring digital rights, to bring human rights at- at the core of how we discuss cybersecurity. Uh-

Laurie Segall: What does that … What does that mean? Can you explain that a little bit?

Camille Francois: You know, privacy.

Laurie Segall: Yeah.

Camille Francois: Privacy is a good example, right? So you had a lot of conversations on what do we need to secure the internet from bad guys and none of these conversations … It’s- it was back- back then, right? I think we’re in a much better place now. But few of these conversations considered that privacy was a very important element of that, right?

Laurie Segall: Mm.

Camille Francois: It was, I think, back at a time where most people considered that there was a tension between privacy and security. Honestly, I think we’ve come a long way since then. And today, I think there’s a recognition that privacy and security go hand-in-hand, right? If you have a system, you know, that leaks out private information on people, that also is a system that’s easier for hackers to exploit.

Laurie Segall: Hmm.

Camille Francois: And so I was, you know, working on these- these topics and my professor was like, “Okay, fine, well- well, you know … I know a project that could use a hippie, and, well, you’ll just get to work on this project as a student.” So yeah, that was fun.

We’re gonna take a quick break to hear from our sponsors but when we come back: Camille talks about her work on the frontlines of terrorism online and how ISIS was actually really good at microtargeting. 

Also, If you like what you’re hearing, make sure you hit subscribe to First Contact in your podcast app so you don’t miss another episode.

Laurie Segall: And you also worked at, Google and specifically, Jigsaw, right?

Camille Francois: Yeah.

Laurie Segall: Which is for folks who don’t know, is kind of the … I- I think is really an interesting part of the company, which is where a lot of this technology, humanity, this, like, think-tank of sorts, where a lot of these hard problems, like AI and bias and, some of these efforts to counter extremism, a lot of the people who are working on these problems, are in that realm and you are kind of at the forefront of that. So what did you find there?

Camille Francois: Yeah, it was, um, it was really fun, uh, working on, on these issues. Um-

Laurie Segall: By the way, I love that you say it’s fun, like, you work on, like, counter-extremism and so, like, you’re, like, probably spending lots of time looking at ISIS recruiting videos.

Camille Francois: Yeah, but you work on those issues with people who really care about solving them-

Laurie Segall: Mm.

Camille Francois: And who are keenly aware of the different trade-offs and that, I think, is a very fortunate position. Working inside Google on this issue has also meant, like, working in an organization that really wanted to get to the bottom of it, and with colleagues at Jigsaw, but also, you know, frankly, all across Google in engineering and policy, who wanted to make a dent in the problem, who were willing to experiment with creative and innovative ideas, but who also had a very clear picture of the trade-offs, of the constraints, of making sure that we don’t go on the other side of the line, making sure that we protect freedom of expression as we think through these problems. So yeah, it’s- it’s a privileged position to work on these issues, for sure.

Laurie Segall: What were some of the, um … Even you worked on a research program to counter online propaganda, so, like, what … Like, can you tell me, like, what were the issues that you saw? Like, what did you kind of come up with?

Camille Francois: Yeah, so it’s a program that was called the Redirect Method.

Laurie Segall: Mm-hmm.

Camille Francois: Um, and I think what- what I was very interested in when we started, um, you know, thinking through this project is a lot of people think about, okay, terrorist propaganda, you have to just remove the content. That’s a great first step, right? Indeed, there’s content that’s harmful, sometimes illegal, it shouldn’t be online, and so you work to detect it and you work to take it down. But it doesn’t, you know, solve the whole problem, because you still have users with those questions, right? Like, there’s still users who are coming to your platforms and they’re saying, like, “Well, I’m here to- to consume, you know, a piece of content that you’ve removed.” And this is where the Redirect Method operated, right? Like, when you have users who come and who are looking for content that is no longer there, do you still have an opportunity to reach out to them and to propose something else?

Laurie Segall: Mm.

Camille Francois: Now you don’t want to trick them into something else, right? So we really do want to redirect and propose, “Hey, here’s a playlist of alternative content that we think might be interesting.” And so you want to find, um, you want to find the most transparent way to do that and you want to find the most, sort of, clever, uh, way to- to- to do that. Again, avoiding all the- all the potential traps-

Laurie Segall: Right.

Camille Francois: That- that are- that are all around this- this question.

Laurie Segall: I mean, it was such … I remember covering it, and thinking, “This is such an interesting idea.” Like, I didn’t realize that you were behind it…

Camille Francois: It really was, you know, teamwork from a lot of researchers who, as you said, were really close to the question, right? The idea was we had to sit with people and really understand when you end up in a rabbit hole of consuming terrorist propaganda, what led you there, right?

Laurie Segall: Well, by the way, I don’t think people understand. Maybe this is me having done a documentary on, like, someone from ISIS who was- was killed in a drone strike. He was kind of a hacker type and, um, one of the things I … He was actually in charge of their social media. His name was Junaid Hussain, AKA Trick. Maybe you’ve heard of him.

Camille Francois: Yeah.

Laurie Segall: But, like, I remember, um, he was in charge of their propaganda stuff and I remember, like, thinking, like, “Oh my God, like, he’s making …” This is gonna sound really weird to say, we might have to cut it, like- He makes ISIS look very human. But he makes ISIS out to be, like, punk rock. Like, you know, these … Or like these rap videos and, like, these videos that are so compelling. Um, and so I’m sure, you know, and- and very human.

Camille Francois: Yeah.

Laurie Segall: So when vulnerable people are going to- to, you know, look at these videos, like … And- and this is where I think the human thing comes in-

Camille Francois: Yeah.

Laurie Segall: It’s like we all sit here and we think, like, “These people are crazy, they’re going to join ISIS”, or “These people are crazy, they’re Russian trolls”, or whatever it is. But there’s, like, a lot of humanity behind how people end up getting into- to these spaces, right? And-

Camille Francois: It’s so much more subtle-

Laurie Segall: Yeah.

Camille Francois: Than we make it out to be.

Laurie Segall: Yeah.

Camille Francois: You know? You know who was really, really good at micro-targeting though?

Laurie Segall: Who?

Camille Francois: ISIS.

Laurie Segall: In what sense?

Camille Francois: Well, what’s really interesting is we tend to think that terrorist propaganda is just one big thing where it’s like one bucket of very clear graphic imagery.

Laurie Segall: Hmm.

Camille Francois: But what we’re observing actually is tailored narratives targeted at very specific communities, very specific people, right?

Laurie Segall: Right.

Camille Francois: Who was … You don’t- you don’t convince a young British girl the same way that you convince, uh, you know, uh, uh, uh, Chinese Muslims-

Laurie Segall: Yeah.

Camille Francois: …online. It’s just a much more nuanced and subtle and micro-targeted picture than we often imagine.

Laurie Segall: And so you left and joined Graphika and then your first assignment seemed like a pretty significant one, right? You were involved in, like, a super secret project for the US Senate Select Committee on Intelligence. Do you remember when … Can you just, like, give us the … Like, don’t give us the company line. Like, tell me, like, they came to you, what did they say? Like, what- what was the- the mission?

Camille Francois: You know, at that stage, I had been working on Russian interference for, um, for a little while already.

Laurie Segall: Mm-hmm.

Camille Francois: And so I- I was- I was already pretty, you know, pretty- pretty obsessed with it. And I remember actually when my boss called me. So the CEO of Graphika called me and said, like, “Hey Cam, what if, like, we had all the data from, you know, everything that the- the Russians have been doing across platforms and- and we could really un- untangle and understand what’s going on? Like, wouldn’t that be great?” And it was kind of like, “What’s up with John?”

Laurie Segall: Like, is he okay?

Camille Francois: And I was like, “Yeah, John, that would be great. Just give me the magic data box”-

Laurie Segall: Mm-hmm.

Camille Francois: “That’ll be just super great.” And he’s like, “Okay, well I think we’re gonna do that.” And I was like, “What?” 

Laurie Segall: Wow.

Camille Francois: And yeah, that was basically the assignment. Uh, the Senate Select Intelligence Committee really wanted to get to the bottom of what had happened. And I think we don’t often recognize how little we knew then and we still have gaps in our understanding of how really this campaign unfolded in 2016, but also before and after. And so it was extraordinarily exciting to be able to help the Senate, who had gathered all this data and really gave it to us with total free rein, right? They said, “Tell us what you see.” I think my first instinct was, you know, again, as I said, like, at this stage, I had all this already, sort of, in my heart and in my head, and so I was already looking for- for bits and pieces and- and it was IRA data. And so I remember the first thing we did is … I was like, “Oh, well, here are the things I expected to see in that data and that are not there.” And so it taught us very quickly that the IRA, the Internet Research Agency, the troll farm that’s based in St Petersburg, was one part of the problem, but it was not the full campaign. And so I knew of other campaigns, later we realized they were the GRU campaigns, right, that were lacking from the data set. And so it’s been like this for seven months before the report got public and then again, after and after and honestly, it’s, you know, it’s continuing to be an endeavor, uh, a puzzle of having to figure out what really happened. Why were the different entities involved in this campaign? How did the targeting take place? What is the exact relationship between the hacking and the trolling and the targeting? How did the platforms respond? And um, even more fun, how did the Russian trolls respond to the platforms responding?

Laurie Segall: Hmm.

Camille Francois: And so we had all of that in, sort of, millions and millions of data points.

Laurie Segall: So what does that mean, like, millions and millions of data points? Like, how are you … You do a whole analysis around it or, like-

Camille Francois: Yeah. So we-

Laurie Segall: How does that work?

Camille Francois: We wrote this, um, really long, you know, report and-

Laurie Segall: Mm-hmm.

Camille Francois: We tried to, uh, talk about the- the big trends and everything we observed and the role of the different platforms and how long this had been going on. I think, uh, the few trends that we really tried to highlight were: this was not just a campaign against the US, it was a campaign that had been waged against the Russian domestic population first, right? Against other populations in- in Eastern Europe and also a little bit in Canada and in Germany. And similarly, I think people were very focused on 2016 and we were able to demonstrate that it had been happening before, right? So, Project Lakhta, the big, uh, US-focused project of the IRA, actually started in 2014. And in those- those two years before the election, there was a lot of, uh, fascinating detail of the, you know, the Russians really learning to, uh, to play the Americans, right? Like, what are the hot button issues?

Laurie Segall: Hmm.

Camille Francois: Like, what are the triggers? What can we try? And so we also looked at those two years of experimentation in which there are really bizarre cases, right? There was that-

Laurie Segall: Like what?

Camille Francois: Oh, I think in 2015 around Thanksgiving, the Russians were trying to freak out everyone, telling people that the turkeys they would buy at Walmart would have salmonella.

Laurie Segall: What?

Camille Francois: Yeah, they did that. So there’s just, like, a bunch of weird, you know, bizarre-

Laurie Segall: What else? Let’s keep going. This is weird.

Camille Francois: There are, like, food hoaxes that are fun. There’s, of course, um, a famous case called Columbian Chemicals-

Laurie Segall: Mm-hmm.

Camille Francois: That’s even before, that one is 2014. It targets a small community in Louisiana. This one’s interesting ’cause it involves SMS and so they’re telling people, releasing video, texting officials saying a chemical plant has exploded and they’re trying to create a panic. It works to some extent as in like, you know, it- it- it- it’s reported a bit and then the message circulates. But very quickly, the local authorities say, “Actually that- that’s a hoax, and it’s not true”, and they- they kind of move on, which to be honest, you know, I understand. I think in 2014 in Louisiana, if you were to have said, “It’s a hoax and we think it’s a Russian troll farm”, I think you would have sounded insane-

Laurie Segall: Right.

Camille Francois: To anyone around you. But, you know, but they did, like, things like this for- for at least two years before the election. And, of course, they continued targeting the- the American public after the election, right? So, 2017 is a really interesting year too because, people are talking about Russian trolls in 2017, right? It’s a new reality. And so the Russian trolls themselves are making jokes about it, right? So you have fake profiles that start making messages saying, “Oh, I’m reading all these stories about Russian trolls. That is ridiculous. Next time, I’ll be accused of being a Russian troll. Ha-ha-ha”, right?

Laurie Segall: So they kind of, like, adapted to the narrative-

Camille Francois: Absolutely.

Laurie Segall: Of the Russian troll?

Camille Francois: Absolutely. And then they all start adapting to platforms responding to this activity. I worked a lot on part of that activity that targeted Black American activists in the US.

Laurie Segall: Mm-hmm.

Camille Francois: And part of this effort was to create fake activist organizations and to work with real activists on the ground to do events together and, and to really sort of, like, you know, immerse themselves in- in that community. And there was a specific group called Black, uh, Matters US. And when Facebook determined that Black Matters US was a fake group and was a Russian entity, they removed it from the platform, but they didn’t coordinate with the rest of the industry.

Laurie Segall: Hmm.

Camille Francois: And so what really happened is, the group went to Twitter-

Laurie Segall: Oh.

Camille Francois: And started complaining about having been kicked out of Facebook, saying, “We’re really upset that Facebook supports white supremacists.” And then they started going on Google and they bought a lot of ads to redirect people to their new websites ’cause they had to direct all traffic away from Facebook because they had been kicked out. And so 2017 is this, like, really sort of, you know, surreal year for the Russian trolls where they’re playing cat and mouse with the industry, which still doesn’t fully have, you know, their mechanisms well set-

Laurie Segall: Oh.

Camille Francois: And doesn’t really have their policies well set either, so it’s kind of a chaos and confusion for everyone. And then the Russian trolls start talking about Russian trolling, so it gets a bit meta.

Laurie Segall: Wow.

Camille Francois: And then, of course, in 2018, there are the midterms. In 2019, they were also showing a different … It’s- it’s just, like, kind of a-

Laurie Segall: Yeah.

Camille Francois: I think what’s interesting, from my perspective, is people often think that the Russian campaign is one year and one thing. I’ve seen it evolve, over so many years and show so many different facets.

Laurie Segall: Yeah. Do you, when you, interact with these people, out of curiosity, like, do you just sit and watch them from afar, do you go … Do you have, like, an undercover- …name or something where you’re talking to Russian trolls as someone else? Like, what’s your deal?

Camille Francois: So I have, uh, talked to a few people who have worked in troll factories-

Laurie Segall: Mm-hmm.

Camille Francois: In Russia and … Russia and others. It’s funny that you mentioned undercover, that’s not the type of work I do, but one of the reasons we know so much about specifically the- the Russian Internet Research Agency is because a young Russian journalist went undercover- And published everything she could find, uh, and she did that quite early.

Laurie Segall: Mm-hmm.

Camille Francois: I think what’s- what’s interesting … It’s an interesting reminder that- that honestly, the, the activist communities and the investigative journalists community knew about this and really went through great pains to document it before the rest of the world and Silicon Valley really cared about it.

Laurie Segall: You said, um, something that I thought was really interesting, you said this work is two parts technology and one part sociology.

Camille Francois: Yeah.

Laurie Segall: What did you mean by that?

Camille Francois: A lot of that is really about understanding socio-technical systems, right? So when you think about information operations, it’s not really like hacking, right? It’s- it’s not looking for a technical vulnerability, it’s looking for a social vulnerability, it’s looking for what’s going to play well into a society’s divisions. What’s going to fall in between two rules that a platform has, and that’s going to make them not catch me, right? A lot of this is really playing with social systems as much as it is playing with technical systems.

Laurie Segall: Speaking of the humanity of it, you talk about kind of bringing a hacker mindset to the data security problem. And, like, what I think is so interesting about you, is, I mean, you went and, like, talked to trolls, right? Like, we have this whole misconce … Who- who are these people who are doing this? Like, in America, we’re like, “Oh, the Russians are trying to mess with democracy.” And, um, you actually, uh, and maybe this is just me selfishly as, like, a journalist who, like, loves to talk to the other side and, like, loves to talk to the dark corners where people aren’t looking, uh, and hear the other side. You did that, right? Like… Take me into that. So you actually found people who were working in Russian troll farms and- and talked to them?

Camille Francois: Not- not just Russians. I think I’ve, um, I was always very interested and, I think I’ve, you know, brought that mindset to my work. I was really interested in- in understanding more from the other’s perspective, right? So yeah, I talk to trolls and- and hackers who, uh, made a living doing, uh, disinformation, and propaganda for hire.

Laurie Segall: But, like, take me into the rabbit hole. How does one decide to do that? Like, are you sitting at your desk and you go to these … How do you even get in touch with these people? Like-

Camille Francois: And again, it’s so different.

Laurie Segall: Yeah.

Camille Francois: And- you know, as a journalist, you get that, right?

Laurie Segall: Totally.

Camille Francois: Every story’s really different.

Laurie Segall: Well, that’s why I know it’s really probably challenging. So you’ve got … This is why I’m kind of sitting here being like, “Props.” Like, how do you, like … Give- give me some specific examples. Like, who … Are there any people that really stick out to you that you spoke to that just surprised you from- from this community?

Camille Francois: I mean, they’re all fascinating stories. Uh, I have to say, I’ve heard so many different stories that I would be, um, I would- I would really struggle to- to paint it with one brush.

Laurie Segall: Mm-hmm.

Camille Francois: Um, things that come to mind is I’ve talked to a hacker, who did, uh, who did, uh, propaganda for hire all across Latin America. And that was way before, uh, people were worried about Russian troll farms and, you know, it was more, yeah, the entire disinformation for hire trolls and bots and fake profiles in Latin American politics. That was quite fascinating. I’ve talked to people who-

Laurie Segall: In what sense? Like, they do … Like, why’d they do it? Just for my … Just my-

Camille Francois: Helps them win an election, right? Like-

Laurie Segall: Right.

Camille Francois: It’s just like a-

Laurie Segall: Was it patriotism? Was it just money? Like-

Camille Francois: It’s-

Laurie Segall: Yeah.

Camille Francois: Like, why do people work on political campaigns?

Laurie Segall: Right.

Camille Francois: That was- that was his shtick.

Laurie Segall: Yeah.

Camille Francois: Like, you know, you’re assembling a political campaign-

Laurie Segall: Right.

Camille Francois: You’re getting a communications specialist, do you want this guy who can bring you a little army of fake profiles and bots and trolls?

Laurie Segall: Right.

Camille Francois: He kind of, like, made a niche for himself like that. It was pretty successful until it ended sort of badly ’cause he got caught and ended up in jail. That’s, like, a-

Laurie Segall: Oh.

Camille Francois: Yeah, that’s, like, one story. Um, it’s interesting ’cause the- the campaigning angle came up a few times, right? So, talking to people who went into doing digital campaigning really by- by patriotism to support their candidates, right? And slowly saw the campaign apparatus evolve into, like, a state propaganda machine after their candidate came into power. And so there are a few, you know, a few stories like this of people who said, “Initially I joined because I wanted to do campaign messaging for, you know, for my candidate to win. And then I woke up and- and I was just sending rape threats to women journalists using fake accounts and wondered, you know, what happened? What am I doing there?”

Laurie Segall: Oh. Someone said that to you?

Camille Francois: Yeah.

Laurie Segall: Wow. What did they say? They were just-

Camille Francois: Just that, you know? Like, that- that it, that it slipped, uh, that they went in for one thing and that with the success of the candidate and the evolution of- of the machinery, they ended up just really doing something else.

Laurie Segall: Where … Can you give any details about the candidate that this person … Like, it … Was this in a-

Camille Francois: That was a story that happened in India.

Laurie Segall: Wow.

Camille Francois: Um, but- but, again, like I- I’ve heard that a few times and- and I think that the story of- of doing something for political reasons that ends up sort of, like, putting you in the middle of a machinery that’s no longer what you had- had joined is- is one that- that’s, uh, more common than we think. There’s also been other researchers who’ve done great ethnographic fieldwork, talking to trolls. Someone who specifically comes to mind is a friend who wrote a report called The Architecture of Disinformation, that looks at what happens in the Philippines.

Laurie Segall: Mm.

Camille Francois: And it’s a really fantastic report, and he’s talking to- to people who self-identify as doing this activity. They don’t say trolls, right, they don’t say, “I’m a troll”, but they say, “Yeah, I, you know, I make my living by having a lot of, uh, fake profiles and if you’re a candidate and you want to pay me for this activity, I will, uh, I will do that.”

Laurie Segall: Hmm.

Camille Francois: And, um, and I think in- in his work, what you see come through is, um, is a question on when did that become an illegitimate activity, right? Like there’s indeed a real business of people who do this for hire.

Laurie Segall: Right.

Camille Francois: And who suddenly are told, like, “You’re a troll and you’re gonna be deactivated.” And I think- I think part, you know, part of what you hear when you talk to people on the other side is, “Okay, wait a minute ’cause I’ve- I’ve been doing that for a little while and I thought it was okay.” 

Laurie Segall: Right.

Camille Francois: Yeah. “I was …”, you know. 

Laurie Segall: Um, did you ever, like, find yourself really liking these people-

Camille Francois: Yeah.

Laurie Segall: That you talk to?

Camille Francois: You know, you got- you got empathy for … And again, like, such different trajectories, right? You have empathy for someone who works for a candidate and suddenly says, like, “What am I doing here?”

Laurie Segall: Yeah.

Camille Francois: Yeah.

Laurie Segall: Did you ever learn about how they learned how to pose as American? Like, what’s-

Camille Francois: Yeah, that’s really fun. 

Laurie Segall: The secret sauce? Like, what is the secret sauce to posing as an American these days online?

Camille Francois: Yeah. 

Laurie Segall: I mean, I’m sure it’s changed over the last couple years… And it might not be rocket science, but, um-

Camille Francois: No, actually it’s fairly complicated. So we know a lot about how the IRA learned how to pose as an American, right?

Laurie Segall: Mm-hmm.

Camille Francois: And as I said, like, this is where the early days of the IRA are really fun because this is when they have to learn, right?

Laurie Segall: Mm-hmm.

Camille Francois: This is why they’re playing around with, like, “Oh, how much can we freak people out by, uh, talking about salmonella in turkeys around Thanksgiving?”, right? Like, this is-

Laurie Segall: Right.

Camille Francois: Them trying to figure out, like, where America’s hot button is. We know they were watching House of Cards, which I still think is hilarious.

Laurie Segall: They were watching House of Cards?

Camille Francois: Yeah. 

Laurie Segall: Okay. How do you know that? They would- they would just tell you this?

Camille Francois: Uh, it’s in- it’s in, uh, it’s in a- a de facto testimony. Um, but really, here, the legal indictments have a lot of- of sort of, like, crazy details on everything that the IRA did to-

Laurie Segall: Hmm.

Camille Francois: Sort of, like, learn to be American, right? So we know they took field trips, um, and part of how some of the employees ended up being indicted was that they entered the country with, uh, tourist visas and I think-

Laurie Segall: Mm-hmm.

Camille Francois: A few years after, the government was like, “I don’t think you were here for tourism.”

Laurie Segall: It’s like a troll farm field trip-

Camille Francois: Yeah.

Laurie Segall: To America.

Camille Francois: Exactly. It’s a troll farm field trip.

Laurie Segall: I wonder what they did on their field trip.

Camille Francois: You know, you’ll observe people and you’ll understand how- how they, um-

Laurie Segall: Hmm.

Camille Francois: How they act, what they talk about. They were also looking at their social media metrics very closely, right? So whenever they were trying out a new post in a group, they would take notes on what’s performing, what’s not performing. They were talking about how to target specific groups and other groups. And, of course, I think the thing that we tend to forget is they were also targeting Americans, right? They were talking to Americans.

Laurie Segall: Mm-hmm.

Camille Francois: They were using their fake personas to have long dialogues with American activists on all sides of the spectrum, saying, “Hey, what do you think about? What does the community think about? How are we gonna do an event together?”

Laurie Segall: Yeah.

Camille Francois: So, honestly, they were doing serious research.

We’ve gotta take another quick break, but when we come back: It’s not just Russia using sketchy social media tactics. Could American political candidates be using fake accounts to win your vote?

And if you have questions about the show, comments, honestly anything, you can text me on my new Community number: (917) 540-3410.

Laurie Segall: Did you ever, uh, worry … Just because I know you’re kind of in these dark corners of, like, you know, dealing with troll farms and the GRU, like, I mean, also, like, real, like, well-funded governments who are trying to influence outcomes in some of these very dangerous ways. Did you ever worry, maybe this is an extreme question, but about your safety?

Camille Francois: Yeah, that comes to mind. Um, comes to mind, yeah, of course. I, you know, try to, uh, be as safe as- as I can. I also don’t worry about it too much because I also work with a lot of people who are … I mean, it’s- it’s not a race, but, you know, thinking about people who are at much greater personal risk, it also helps both prioritize and put some relativity on it.

Laurie Segall: What does that mean?

Camille Francois: Um… So, someone I’ve worked really closely with, uh, um, along the years on- on- on these questions is, for instance, the amazing, uh, Maria Ressa-

Laurie Segall: Mm-hmm.

Camille Francois: Who is the Executive Director of Rappler in the Philippines and who’s a fantastic journalist. Um, she’s been arrested so many times, she’s been targeted so harshly by her government that, you know, sure, sometimes I worry about my own safety, but I think, more often than not, I worry about that of my friends a bit more.

Laurie Segall: Right.

Camille Francois: Yeah.

Laurie Segall: So many of these disinformation campaigns, um, the idea’s also to silence people-

Camille Francois: Yeah.

Laurie Segall: And as women, I mean, I guess it’s … A lot of it is also silencing women and silencing female journalists and- and what … It’s- it’s … You know, not to go on-

Camille Francois: Silencing is definitely a key goal… I’m glad that you’re bringing up this question because besides my own safety or that of my friends, I am really passionate about how we build technology to protect users from very well-funded and well-resourced threats, right? And when you think about it, it’s a very difficult problem, right? Like, how … What can you do when you know that a journalist is targeted by a nation state? There’s a- a little known feature, I mean outside of- of security circles, that’s a feature that’s really near and dear to my heart. It’s called the state sponsored warning and I’ve been working a lot on this and thinking a lot about this. Sometimes when a platform knows that as a user, you’re being targeted, they would actually give you a little notification-

Laurie Segall: Hmm.

Camille Francois: That says, “Hey, we think you’re being targeted by a state actor. Why don’t you go and do these 10 things”, right? “Change your password, enable two-factor authentication”, et cetera, et cetera. And I think a lot about how- how much we should, you know, celebrate these systems … they’re not much, but they’re almost the only things that exist sometimes, and how much we should invest in making sure they’re as strong and robust as possible.

Laurie Segall: Yeah. 

Camille Francois: Oh, and by the way, here’s an ask: if you’ve ever received a state sponsored warning in your inbox and you have thoughts about- Like, your experience and wanna talk about it, shoot me an email, I’d love to hear those stories.

Laurie Segall: I mean, by the way, how … First of all, can we just take a step back, like, how scary would it be if you’re just, like, checking your email and you get, like, a state sponsored war … Like, warning?

Camille Francois: You know, it’s really fun because I’ve talked- To a lot of people over the years and-

Laurie Segall: You’re so strange. I mean- I think it’s great, like, I love that you talk about, like, doing … Taking on ISIS propaganda and you’re like, “This is fun”, and you talk about someone getting a state sponsored warning and you’re like, “This is so fun.” I’m like-

Camille Francois: No, you’re right-

Laurie Segall: This is horrifying.

Camille Francois: This is terrible. It is horrifying. Um, but- but, I- I … Fun is not the right word, but I’m saying it’s … What’s really interesting is that people have such different reactions to it, right?

Laurie Segall: But it’s important, right?

Camille Francois: Sure.

Laurie Segall: I would rather know. I would 100% wanna know.

Camille Francois: It’s extraordinarily important, which is why, like, I have met users who tell me, like, “Yeah, we receive those, but, you know, we think it’s a drill. We think that really platforms tell us that just to keep us on our tippy-toes”, and I’m like, “No, it’s not a drill”, right? If you receive that warning, please know this is not a drill, you really do have to think about your security, you really do have to enable two-factor authentication, do those things, right? But you also have users, frankly, who, um, for them, it’s been so, um, so terrifying and often sometimes so tragic, that- that this becomes, a- a symbol for them, right? So, like, people have very different reactions to it and- and for some, it- it really is sort of the, you know, the- the beginning sign of a- of a journey that can be quite- quite horrifying and- and- and- and frightening and tragic, yeah.

Laurie Segall: As someone who’s spent a lot of time looking at influence in the 2016 election, who just spends so much time, like, what are we missing? I know … I watched you on C-SPAN. You know, in- in research for this, I- I took some time and watched you testify. And, you know, I just … I saw you talking about how the thing we’re missing is it’s not just the IRA, we’re talking about the GRU and, like, and how this is a very well-funded government backed campaign, so we’re not really talking about that, how also, we’re not even measuring, like, private messages of people being targeted. Like, you know … I- I just think there’s so much talk about one thing right now and my concern, as someone who covers this kind of stuff, is that we just don’t even look at other things, um, and we scream about the same things a lot, which is important, but we don’t look at other things. So, I thought those two things were really interesting, if you don’t mind getting into them a little bit. And then, like, what’s the other stuff that you think we should be talking about?

Camille Francois: Yeah. The first thing is, indeed, you know, we’ve talked so much about the IRA and that’s great. I mean, I say this as someone who’s very deep, deep down this rabbit hole.

Laurie Segall: Hmm.

Camille Francois: I would talk about the IRA day in and day out and for months and months without stopping. But it’s not the only actor in foreign influence. And a lot of people, when they say foreign influence, really, you know, their mental model is what the IRA did in 2016, which, again, doesn’t acknowledge that there were many other actors, Russian actors, right? So the GRU played an important part, there were other Russian entities who participated in foreign, uh, interference and information operations. But, of course, there are, uh, you know, other- other governments, right? So, the first campaign by the Iranian regime targeting US audiences, I think, starts in 2010, right?

Laurie Segall: Mm.

Camille Francois: Their first foreign interference on social media campaign. So, a lot of this was happening also before we kind of, like, woke up to it. So, there are a lot more actors than just the IRA and frankly, than just Russia, both on the foreign side and also as- as we talked about, right, on the domestic side, that’s one thing. And yes, a- as you said, I am worried that we still don’t have the full picture of how that specific Russian campaign worked and that there’s still a lot that’s missing from the record. In working with activists who have been targeted, we looked at the messages that they received and we never talk about those messages, right, when we think about the Russian interference. We kind of feel like yes, it’s a bunch of tweets, you had to be a little bit out of the loop to retweet a Russian troll. This is not what happened, right? Some people were targeted personally and worked with, uh, fake personas for months and months and-

Laurie Segall: Mm.

Camille Francois: Organizing events together and discussing political life and, I think we don’t talk about that nearly enough.

Laurie Segall: Hmm.

Camille Francois: I think we’re still lacking important evidence from the record. And, you know, I’ve worked with activists whose, uh, whose messages have also disappeared, right? They only have their side of the story.

Laurie Segall: Oh.

Camille Francois: So, trying to piece all of this together is still, I think, an important endeavor.

Laurie Segall: What do you think is the biggest threat going into 2020?

Camille Francois: Ourselves. 

Laurie Segall: What do you mean?

Camille Francois: You know, disinformation is really important. It is true that there’s foreign interference-

Laurie Segall: Yeah.

Camille Francois: But, um, it’s been very odd to see the pendulum swing so hard. In 2015, when I was saying I think there is such a thing as patriotic trolling, right? I think governments are actually doing these information operations on social media, I think there is such a thing as Russian trolling. It was kind of, like, “Yeah. Really?” And now every time there’s something-

Laurie Segall: Yeah.

Camille Francois: People see Russians under their beds everywhere, right? Like, everything is disinformation, everything is foreign interference, and I don’t think that’s helpful.

Laurie Segall: Hmm. And what about … I mean, on the home front, I think, like, you- you said something really interesting in one of the testimonies about kind of this gray area of campaigns and you said, “I think because of our lack of serious dialogue and what we’re willing to accept on social media or not, we’re gonna find an increasing amount of gray area situations as we head into 2020.” Candidates, parties, PR firms, like, are we gonna see troll farms from actual candidates?

Camille Francois: Yeah.

Laurie Segall: Are we allowed? Like, what- what’s happening behind the scenes, like …

Camille Francois: Um, you know, it’s like two different problems, right? The first one is people don’t have a good grounding on what is normal campaigning. I’ll give you a specific example. In the midterms in 2018, there was a candidate who, uh, had his supporters install an app, and the app would give you an auth token to access your account and then would help all the supporters sort of, like, tweet the same campaign message at the same time. But, you know, you would still have to install it on your phone and you would have to give the token to the app, right? When that happened, people completely lost it, being like, “Oh my God, look at these. This is the Russian trolls, they’re back. All these messages are doing the same thing”-

Laurie Segall: Yeah.

Camille Francois: “At the same time, they’re bots.” And it was really straight-forward to see that it was not bots and it was not Russians and it was just-

Laurie Segall: Right.

Camille Francois: People using a campaign app, right? Because we actually sort of lack, you know, serious grounding on what normal people do in- in the course of a campaign, we’re prone to overreacting. And so there’s a need for a debate on, like, what is okay for a campaign to do? Is it okay for people to download an app and sort of give their account to their candidate? Is it okay to use fake accounts? Is it okay to automate some of that activity? And on the other side, because candidates and campaigns don’t really talk about this, you do have a lot of terrible ideas that are floating around. I do see people who think it’s a great idea to have a little troll farm set up for 2020 with a lot of fake accounts that are just going to, you know, help amplify this or, uh, help to drown that out.

Laurie Segall: Like, do you think candidates now actually have, like … Some candidates could actually have troll farms on- on their own now? Knowing what we know, do you think that there could actually be troll farms here in the United States for candidates?

Camille Francois: Yes.

Laurie Segall: Any more details? 

Camille Francois: I am worried that this is not a discussion that we’re having with campaigns and parties and candidates. That being said, I think it’s slowly heading in the right direction. I was very encouraged to see Elizabeth Warren’s, uh, disinformation plan that does say to my supporters and to my campaign, these are the things we won’t do, right? Doesn’t get- doesn’t get deeply into the details, but I think we’re gonna need more of that. It’s going to be-

Laurie Segall: Did she say that they wouldn’t do that?

Camille Francois: So, we can pull up the details-

Laurie Segall: Yeah.

Camille Francois: Of the plan, but it has a section of it that-

Laurie Segall: Right.

Camille Francois: Addresses the type of behavior on social media that she discourages from her supporters. I don’t think that it specifically talks about the use of fake accounts, which is interesting.

Laurie Segall: Mm.

Camille Francois: I think a lot of other concepts were kind of, like, misunderstood, right?

Laurie Segall: Yeah.

Camille Francois: So bots is a- is a traditionally misunderstood concept. That leads to a more complex discussion that people don’t really wanna have, right? It’s like, what is the role of automation?

Laurie Segall: Right.

Camille Francois: What part of your activity can you actually automate? What part of it is legitimate automation? What part of it is undesirable automation? Um, yeah.

Laurie Segall: Are you saying, and maybe you can or can’t get into details, … In the US, like, are you seeing candidates or people associated with candidates have bots or troll farms or that kind of stuff?

Camille Francois: So far, I don’t think that we’ve seen candidates and campaigns sort of, like-

Laurie Segall: Mm-hmm.

Camille Francois: Officially do that.

Laurie Segall: Right.

Camille Francois: What we have seen is a lot of people thinking it’s a good idea to use fake profiles to do political mes- messaging, right?

Laurie Segall: Right.

Camille Francois: How much of that is at the candidates’ or the campaigns’ direction, you know, sort of, like-

Laurie Segall: Right.

Camille Francois: I- I don’t- I- I don’t … I’m hoping that- that (laughs) we won’t see more of that. But again, I think a little bit more of a clear discussion on the rules of the road in this area would be helpful.

Laurie Segall: Um, I remember I interviewed Aza Raskin for this podcast and he said he thought in the future, a threat could be, um… This is my … Maybe very Black Mirror, so just go with me- And then pull me back. But saying that, you know, we … In the future, a threat could be, um, you know, a bad actor taking … Using AI, taking a combination of the faces of the five Facebook people you use the most or you- you talk to, you interact the mo- with the most and targeting you with a face that you automatically trust. Almost like you- you just kind of trust this face because it’s a face that you’re almost more used to, your brain just kind of registers it. Do you think we could see something like that?

Camille Francois: Yeah. I think the technology’s already on the table for that, which is interesting. Um, we recently did a report, we called it FFS, I think Fake Face Swarm was the official name.

Laurie Segall: Mm-hmm.

Camille Francois: And it was a really interesting report because it looked at a very large campaign of fake profiles that use generative adversarial networks, which, you know, basically like AI, to create fake faces from scratch. And so all these profiles had these fake faces that were generated by AI and we realized, “Wow, this, you know, this is something that we really honestly thought was a little bit further down in our future, that we’re just seeing there.” But on the other hand, the technology was there, it’s very easy to do, it’s available to anyone. It was honestly, on this one, I think it was harder to detect than to make. There are still sort of telltale signs, right? So, something that was interesting is when you create, um … Or at least with that generation of generative adversarial networks, the symmetry of the faces-

Laurie Segall: Hmm.

Camille Francois: Was often, uh, wrong, right? So if you would have an earring on your left ear, the matching earring on the right would actually, like, not match at all, right? So if- if you had a face wearing glasses, the left branch would kind of be off if you compare it to the left branch … To the right one. So, like, there were- there were telltale signs like this, but still, I think we’re … With a lot of the AI technologies to generate these types of outputs still … It- it is still a case that it’s easier to generate them than to detect them.

Laurie Segall: People argue that privacy’s kind of a blurry concept, they say, “I have nothing to hide.” What do you say?

Camille Francois: Ah, there’s a- an entire book to be written about that. Um, yeah, that’s not the point of privacy.

Laurie Segall: What is the point?

Camille Francois: The point of privacy is the preservation of- of society and intellectual independence, right? You don’t have to have something to hide, you deserve your privacy. And, uh, no, it’s- it’s a fundamental value in democracy.

Laurie Segall: Hmm. The next threat you talk about a little bit is not just deep fakes, right? Can you just take us to the idea, it’s read fakes? It’s-

Camille Francois: Yeah. And so, you know, people think about deep fakes a lot, right? So the ability for machine learning to generate a video from scratch of an event that never happened with a limited, uh, training data set. Um, I think that’s important and interesting. I also worry a lot about how that plays out in the text space, right? So there are, uh, a series of, uh, language models, and GPT-2 is one of them, and tools today that enable you to take a short training sample and to generate a lot of believable text, uh, based on that. And I worry a lot about (laughs) what that does to the disinformation ecosystem, right?

Laurie Segall: Hmm.

Camille Francois: ‘Cause when you spend a bit of time, uh, studying troll farms and disinformation operations, they often have to produce a large amount of engaging and believable text, right, to sort of, like, put out on a various set of properties or online accounts or domains and, you know, fake profiles. And so I- I do worry a lot about that specific threat, which I, you know, jokingly call the read fakes.

Laurie Segall: Hmm.

Camille Francois: Um-

Laurie Segall: How would it play out? Like, how do you see it playing out?

Camille Francois: Well, if you ran, for instance, a disinformation ecosystem where you have 200 sites that you’re pretending have nothing to do with one another, it becomes much cheap- cheaper and easier for you to keep 200 sites sort of hydrated with fresh content.
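To make the economics concrete: even a toy first-order Markov chain, far cruder than GPT-2, can spin endless near-plausible variations out of a tiny sample. This is a hypothetical sketch with invented sample text, just to show how cheap "fresh content" generation is in principle.

```python
# Toy "read fakes" generator: a first-order Markov chain trained on a
# short sample. Vastly simpler than GPT-2, but the same economic point:
# a small input yields an endless stream of varied, plausible-ish text.
import random

def train(text):
    """Map each word to the list of words that follow it in the sample."""
    words = text.split()
    chain = {}
    for current, nxt in zip(words, words[1:]):
        chain.setdefault(current, []).append(nxt)
    return chain

def generate(chain, start, length, seed=0):
    """Walk the chain from a start word to produce a new word sequence."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        followers = chain.get(out[-1])
        if not followers:
            break
        out.append(rng.choice(followers))
    return " ".join(out)

# Invented sample text, purely for illustration.
sample = ("the candidate will fight for you and the candidate will "
          "win because the people will fight and the people will win")
chain = train(sample)
print(generate(chain, "the", 8))
```

Every generated sentence recombines only words from the sample, so each of those hypothetical 200 sites could carry a different-looking variation at essentially zero marginal cost.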

Laurie Segall: Right.

Camille Francois: Um, I- I have a- a wonderful partner who, um, is a bit cheeky and he- he teaches kids, uh, to- to use sort of, like, deep fakes and- and read fakes and all of that. And I think that’s actually sort of a good response. I think people should- should play with those tools and sort of understand what they can do and what they cannot do and have sort of a lot more familiarity with these techniques so that-

Laurie Segall: Mm.

Camille Francois: They can more easily spot them.

Laurie Segall: Um, last question before you go. Um, you said you were an optimist, or at least your Twitter bio says you’re an “optimistic mind focused on dark patterns.” Why, despite everything you’ve seen, are you so optimistic?

Camille Francois: ‘Cause people are great.

Laurie Segall: What makes you- I mean, I guess I don’t know if I have a follow-up to that. Um, why do you still think people are so great despite everything you’ve seen?

Camille Francois: Because I think that a lot of what needed to be uncovered was hard to uncover. I think people worked really hard to demonstrate that this phenomenon existed. I think people worked hard to say, “Look, there are such things as troll farm, this is how they work.” I think people worked hard to say, “Yes, you know, activists are targeted and this is what’s happening.” And I think despite the problems growing in complexity, um, and in size, there’s always been fantastic people chasing them and exposing them and, you know, coming up with creative solutions.

Laurie Segall: You work so closely with all the tech companies, so do you think they’re well-equipped to take on this next challenge?

Camille Francois: Much better off than a few years ago, for sure. We’ve come- we’ve come from far.

Laurie Segall: You’re still optimistic.

Camille Francois: I’m still optimistic. I mean, we are in a much better position sort of, like, the tech industry in general, than when we were a few years ago. Still not perfect, still a lot to do, but from, like, creating better rules, being better at implementing them and creating technology to be able to do detection faster. Um, but- but we’re in a much better position, yeah.

Laurie Segall: And maybe bring in more humans like you to the table. I would just say, like, adding in the people who actually have, uh, an understanding of humanity. Because I think, having covered tech for a long time, the thing that seems to keep going missing in the narrative is the human part. And maybe had we been paying attention a little bit more to the psychology of hacking and people and that kind of thing, and had there been more of these people in tech companies at the time, maybe that would’ve been something we could’ve caught a little bit earlier.

Camille Francois: More social scientists in tech, more diverse backgrounds in tech-

Laurie Segall: Yeah.

Camille Francois: You really can’t go wrong with that recipe, for sure.

So what do you think? We hit on empathy, trolls, the dark corners of the internet, and a little bit of optimism. I would love to hear from you. Are you liking the episodes? What do you want to hear more of? I’m trying out this new community number: 917-540-3410. Text me. It goes directly to my phone. I promise I’m not just saying that. And here’s a personal request: if you like the show, I want to hear from you. Leave us a review on the Apple podcast app or wherever you listen, and don’t forget to subscribe so you don’t miss an episode. Follow me, I’m @lauriesegall on Twitter and Instagram, and the show is @firstcontactpodcast on Instagram; on Twitter we’re @firstcontactpod.

First Contact is a production of Dot Dot Dot Media executive produced by Laurie Segall and Derek Dodge. This episode was produced and edited by Sabine Jansen and Jack Regan. Original theme music by Xander Singh. 

First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeart Radio.