Episode 28: Surveillance Tech & Biased AI: The ACLU Fights Back
Read the transcript below, or listen to the full interview on the First Contact podcast.
First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeartRadio.
Laurie Segall: There’s a great quote on the ACLU website, “The fact that technology now allows an individual to carry such information in his hand does not make the information any less worthy of the protection for which the founders fought.”
Susan Herman: Exactly. I like to point out that one of the whole points of adding the Fourth Amendment to the Constitution, which is the protection of privacy, is that they wanted to protect what was in Benjamin Franklin’s desk. Nobody should know if he was writing things that were anti-government. And we now carry that on our cell phones. That’s where I think a lot of the protection of civil liberties lies: applying our fundamental principles in different circumstances.
We are in a moment of reckoning…
As we enter an age of ubiquitous surveillance, questionable data collection practices, even algorithms that discriminate, it’s minorities – especially Black and brown communities – that are disproportionately affected.
Over the last months, as the nation has grappled with a conversation around police brutality, we’ve seen Predator drones used for aerial surveillance at protests… Facial recognition technology that wrongfully accused a Black man of a crime he didn’t commit – and it wasn’t a coincidence – reports say the tech is up to a hundred times more likely to misidentify African American and Asian people… And as COVID-19 continues to spread, there are serious questions being raised about contact tracing apps, and how the data they collect could be misused.
These issues raise ethical questions about technology and its impact on our civil liberties, equality, and the future of our country.
For Susan Herman it is an extraordinary time to be sitting in her seat as president of the ACLU.
Over the years, the American Civil Liberties Union has filed lawsuits fighting for free speech, reproductive rights, and privacy. But as technology continues to muddy the waters, the tradeoffs become more complicated… Where do we draw the line between security and privacy? And how do we prevent technological innovation from outpacing the law?
I’m Laurie Segall, and this is First Contact.
Laurie Segall: Susan, thank you for being virtually with me today.
Susan Herman: Well, thank you for inviting me, Laurie.
Laurie Segall: Yeah, you know, I always start out these interviews with our first contact – I talk to guests about how we met – and we don’t really have a first contact. We’ve never met in person, but we met on an email chain because we were going to do an interview together for something else and it fell through. So I said, “You’ve got to come on the podcast,” because you are just sitting in such an extraordinary seat at such an extraordinary moment in time. So that’s our first contact.
Susan Herman: Well, thanks. It just seems to me like our first contact was total serendipity.
Laurie Segall: Yeah, exactly. To get started, you’ve been the president of the ACLU since 2008. And I said this before, but what an extraordinary time to be sitting in your seat. How are you feeling?
Susan Herman: Oh my. It’s just sort of overwhelming. As president, I’m essentially chair of the board, so I’m not the one doing the day-to-day work, as all of the members of our staff are. But to be a member of the ACLU staff right now is mind-boggling, because we had a lot of work we were already doing before 2016, with all of the states making worse and worse laws about reproductive freedom and voting rights and immigrants’ rights and all sorts of other things. Then came the election, and since then we have brought 173 legal actions against the Trump administration for things like family separations, the travel ban, and prohibiting transgender people from serving in the military. Then in March, COVID hit. Since then, we’ve also brought over 100 lawsuits, including 100 lawsuits just about people who are incarcerated in jails, prisons, and ICE detention, who are in a hotspot and have no control over whether they can social distance. So we’ve been working very hard to get vulnerable people out of those terrible situations, out of what are basically death traps. Plus, COVID also led a number of states to opportunistically restrict things like abortion, declaring it a non-essential procedure, as if people could just wait until the pandemic is over to get an abortion. And voting rights has also been a really fraught area right now, because all the restrictions on voting and the ways in which the vote was becoming distorted have been magnified by all the difficulties of… There’s a lot to talk about there.
Laurie Segall: I was going to say, what I’m hearing from you is you’re sleeping really well at night. There’s no work to do.
Susan Herman: Yeah, almost nothing to do. The staff are just sitting around polishing their nails.
Laurie Segall: I mean, take me to March. Coronavirus hits, and you have been involved in some of these monumental cases that have shaped society and our civil liberties. Now we have a little more perspective – I don’t even think we have the luxury of perspective at this point, but a little more. Take me to March, in your role, at this extraordinary moment. What was going through your head? What were you concerned about at the time?
Susan Herman: Well, one of the first concerns is just that you have to close the office. So the first concern is: how can people do all this? It increases the work and makes it more difficult to do the work. We had to really make sure that our technology was up to the job. One thing the ACLU did was buy new laptops for some staff people who were going to be working from home. You have to worry about how the technology is working, which has been a question for us every time something really big hits. When the travel ban hit, there were so many people wanting to donate to the ACLU that our website crashed. Even things like that. That’s concern number one: how do you handle this? We have been fortunate so far that the ACLU is so well managed, and we had not spent every penny our donors had given us up until that point, so we have not had to lay people off. That’s very fortunate because, as you were saying, there’s more than enough work to do. But that’s the first concern: how do you keep the organization up to speed and ready, when staff members now need to be doing an incredible amount more work – and for some of them, it’s while they’re juggling a toddler and a dog.
Laurie Segall: Can you give me a quick run-through of some of the cases that you’ve been involved in? Correct me if I’m wrong: you started out as an intern and really just worked your way up. I can imagine – and I know – you’ve been involved in some pretty extraordinary cases. To give listeners some context, can you explain some of the cases that stick out to you?
Susan Herman: Well, I was an intern for the ACLU back in the 1970s, around the time I was in law school. And just to make sure everybody understands: I don’t actually work at the ACLU. My day job is as a law professor, and I don’t generally work on the cases. What I generally do, with the board, is run the organization. But I think it would be interesting to start with the first ACLU case I actually did work on, which was while I was a law student. One of my original connections with the ACLU was that one of my first-year law professors was connected with the New York Civil Liberties Union. He had some clients come to him who were graduate students at Stony Brook on Long Island. They had rented a house together – there were six of them – and they had just discovered they weren’t allowed to live together, because there was an ordinance in their village, a village called Belle Terre, that prohibited more than two persons unrelated by blood, marriage, or adoption from living together. So, you know, they were pretty shocked. It turned out that under the law as it stood, they were liable for all sorts of criminal fines and punishment. It was really very heavy stuff. So I started working on that case with my law professor, and we went to a federal judge to ask for a temporary restraining order – that is, until we had litigated whether it was constitutional to tell people who they could and couldn’t live with, the village should not be allowed either to kick them out of their house or to start locking them up because they owed too many fines for having been illegal residents. And the judge ended up signing the order.
And then – actually, the original way our clients had discovered they were illegal residents was that they had applied for a residents-only beach permit and were told they couldn’t have one because they were illegal residents. So the district judge we had, who was a very nice man, looked at the order we had written out and said, “Well, it’s the summer. Don’t your clients want to go to the beach while the litigation is pending? Do you mind if I write that in, that they have to be allowed to park in the parking lot at the beach?” We said, “Sure. That’s very nice.” He wrote that in. Then, as the junior member of the team, I was sent out to show our clients the order and explain what was going on. They gave me a tour of what the village looked like and the residents-only beach, and the minute the wheels of their car hit the parking lot, this very large, fierce-looking man comes striding across and says, “What are you doing here? You’re not allowed to be in this parking lot.” And they all look at me, and I’m thinking, “What am I? I’m twenty-something. I’m not very tall. And what am I supposed to do about this large man who doesn’t want us in his parking lot?” And then I remembered that I had a federal court order right on my person. So I kind of drew myself up and showed him my federal court order, and I said, “Well, I’m with the New York Civil Liberties Union, kind of, and I have a federal court order saying that these people are allowed to be in this parking lot and go to the beach.” And he melted. That was, I think, one of the points at which I thought, “Whoa, this is really powerful stuff.”
Laurie Segall: You saw the power of what the law could do.
Susan Herman: That was my first big case. Yeah, exactly.
Laurie Segall: That’s great. And I read that some of your earliest memories of speaking up to authority involved, I think, a dispute over a book at your school library.
Susan Herman: Yeah, that’s right. Even before the Belle Terre case, my first civil liberties hero was my mother. When I was in third grade, we were doing a school play based on Johnny Tremain, a story about a boy in the American Revolution. I thought the play was interesting – plays don’t have that many words – and we were told it was based on the book. So I went to my public school library and asked to take out the book, and the librarian said, “Oh, you can’t take out that book, dear. That’s in the boys’ section.” I was surprised to find this out. I’d been reading books in the girls’ section, which were all collections of fairy tales and biographies of presidents’ wives, but it had never occurred to me that I wasn’t allowed to take out a book from the boys’ section. So I went home and told my mother about this, just thinking that’s the way things are, and she just exploded. She called the library the next day and said, “How dare you tell my daughter what she’s not allowed to read.” The librarian told me that from then on I could take out any book I wanted. Not long after that, they changed the policy for everyone. So, you know, that was another example of how you can speak up to authority when they try to tell you who to be and prevent you from making your own choices.
Laurie Segall: Were you always like that?
Susan Herman: Well, that’s third grade.
Laurie Segall: So yes!
Susan Herman: I think for most of us, our values are formed when we’re pretty young. I’m sure seeing my mother do that had an impact on me.
Laurie Segall: That’s such a good story. I mean, did you always know you wanted to go into law?
Susan Herman: No, I actually really didn’t, because I grew up as a woman during that era. My father was a lawyer, and he always used to talk about the fact that law was really not a good profession for women. Why would you want to do that if you could be an English teacher, have the summers off, and take care of your children? So it took me a while. I graduated from college, spent a few years doing other things, and then decided to go to law school.
Laurie Segall: Well, it’s so interesting, now seeing where you’re at and seeing this moment. It does feel like a moment. I was looking at something you said – that this feels like a moment we can be optimistic about, because so many Americans are beginning to really understand the scope and the depth of structural racism. It certainly feels that way. I’m based in New York City, and you can just feel it on the streets, with the protests, and you hear the sirens and the helicopters.
Susan Herman: Right.
Laurie Segall: You know, as we sit here and hear your rich history of covering and caring about these issues, what is the challenge ahead for you all?
Susan Herman: Well, the challenge on that particular subject is that this is work we had already been doing. One of our top priorities for the past several years has been trying to break our addiction to mass incarceration, which, as everybody is now really coming to terms with, is a system that has disproportionately affected people on the basis of race, income, and disability. A quarter of the people who are arrested are people who are mentally ill. Our feeling is that the system has been fundamentally broken and misguided for a long time. Part of what we’re trying to do with this moment is to capitalize on the fact that people want to look at what the police do. Now we’re trying to encourage people to look beyond the police. It’s not just who the police are arresting and how they’re treating the people they arrest. Behind that is the question: what do we really want to treat as a crime? When you treat all sorts of very minor misconduct as a crime, you’re setting up a situation where there are going to be more contacts, and therefore potentially more arbitrary and discriminatory contacts. If you think about it, Eric Garner ended up dying because he was selling single cigarettes on which the tax had not been paid. With George Floyd, the basis for the encounter was that they thought he might be passing a counterfeit $20 bill. So I think you have to look at why we criminalize some of the things we criminalize. Especially when you’re talking about people who are mentally ill and having problems: do we really want the police to be the first responders to people who are having a mental health crisis?
Or is there some more effective way to deal with that – one that would avoid putting those people into the criminal justice system, which isn’t really good for anyone, and that would reallocate some of the resources we’re using on arresting people and locking them up toward actually dealing with mental health crises, with mental health treatment? Instead of viewing all dysfunction as a matter for policing, why don’t we spend more time reinvesting to prevent dysfunction in the first place? It’s like the old saying: if all you have is a hammer, everything looks like a nail. Well, not every problem in our society is a problem for the criminal justice system and an occasion to arrest people and lock them up. A lot of them really should be an occasion for thinking about public health treatment, or about how we want to approach homelessness – much deeper thought about how you prevent dysfunction, rather than answering everything with “we’re going to send in the police.”
Laurie Segall: It certainly seems like, coming out of the pandemic, I can only imagine the mental health crisis is going to be even worse.
Susan Herman: Yeah, that could well be. I think the pandemic is also showing us… Somebody asked me the other day whether the protests over policing and police brutality are related to the pandemic. I was in a webinar, and one of the smart people in the room said, “Oh no, no, they’re two entirely different things.” But I said, “What do you mean? The same people who are being disproportionately affected by policing and police brutality are the people who are being disproportionately affected by COVID.” The statistics show that people of color are much more likely to die, and there are a lot of reasons for that, having to do with underlying health and with the fact that minorities and people who are not affluent don’t get to work from home. They don’t get to work over Zoom. They’re the people who are out there on the streets, being the first responders, picking up the garbage, stocking the supermarket shelves. I feel like the virus is really amplifying so many of the inequities we’ve had in our society. I don’t know what it’s like for everyone else, but I live in Brooklyn, and in New York City it really felt like a lot of the people who were out on the street were out because they were upset about George Floyd – but more than that, they recognized that George Floyd was the tip of the iceberg, and that there was just a lot going on that they could not tolerate any longer.
More from Susan after the break. And make sure to subscribe to First Contact on Apple Podcasts or wherever you listen so you don’t miss an episode.
Laurie Segall: Putting on the tech hat: I think most people probably don’t think of tech when they think of the ACLU. But there’s quite a bit of litigation around security and privacy issues – contact tracing, surveillance, algorithmic bias. Obviously the ACLU has a hand in checks and balances on a lot of the issues emerging from the pandemic. What are some of the tech developments that you’re most concerned about?
Susan Herman: Well, since you mentioned COVID and contact tracing, I’ll start with that. The upshot is that we are neither for nor against contact tracing. If contact tracing is something that really will contribute to public health, our concern is not to say, “No, you can’t do it,” or, “Yes, go right ahead and do whatever you want.” What we’re concerned about is minimizing the damage to privacy and, again, the damage to equity. The other thing we’re concerned about is discrimination, again, because there are ways in which the technology could increase preexisting social inequities. We think people should not be coerced into participating in testing; it should be voluntary. And we also think it should be non-punitive, because if you start having the criminal justice system enforce whether people are willing to use their phone or take a test or whatever it is, you’re just creating more opportunities for police interactions that will at some point be arbitrary or discriminatory. So we don’t want to see public health rules, even if they really are good public health rules, become occasions for filling up the jails with people who aren’t complying. We’ve already seen statistics in New York showing that when you asked the police to start enforcing who was and wasn’t wearing a mask, it right away skewed racially in terms of who they were questioning and who they weren’t. So I think there are a lot of issues there, which are very much up your alley, because they’re very much ethical issues.
Laurie Segall: Yeah. One of the cases that I’m fascinated by – and honestly, I felt like it was only a matter of time until we saw this headline, and then we saw it: a man was arrested after an algorithm wrongfully identified him.
Susan Herman: Oh, yeah, yeah.
Laurie Segall: I’ve been covering this for so many years: AI is biased. AI is trained on data from online, which can be very racist. For so many years we’ve been having this conversation, but the question was always: okay, what happens when it gets into the hands of the police? What happens when it’s used for policing? So I think it’s such a fascinating case. The ACLU filed an administrative complaint with Detroit’s police department over what you’re calling the country’s first known wrongful arrest involving facial recognition technology. For context: a man was arrested because he was wrongfully identified by an algorithm. The police department believed he had stolen, I think, watches, and he was arrested. Can you talk to me about the significance of this case? I can put on my tech hat and scream, “You guys, this is a really big deal.”
Susan Herman: Yeah, it is a really big deal, and as you’re saying, Laurie, we’ve been aware of this problem for a long time, and we’ve been complaining. So, going back for a minute before getting to the case you’re talking about – Robert Williams – the National Institute of Standards and Technology says that African American and Asian people are up to 100 times as likely to be misidentified by facial recognition. That’s the background problem, and we knew that before this case came up in Michigan. And it’s not the algorithm’s fault, obviously. There’s something being put into the algorithm that has a bias. I think people tend to think algorithms are neutral, that we can rely on algorithms. That’s what I was saying about contact tracing: you start relying on algorithms or apps that you think are neutral, and you really have to be very wary of that. So, again before getting to the Robert Williams case, a staffer at the ACLU of Northern California had the really interesting idea of trying out Amazon’s facial Rekognition program – Rekognition with a K – which was being offered to police departments: this is great, it’ll help you see whether somebody matches a mugshot. What they tried to do, which I thought was very clever, was to match mugshots against the members of Congress. They got facial pictures of all the members of Congress – this was in July of 2018 – and 28 members of Congress were misidentified as matching the mugshots. 28.
Laurie Segall: Wow.
Susan Herman: There were 28 mistakes, and not only that, but the false matches were disproportionately people of color. One of the people identified as matching a mugshot – and therefore presumably a criminal – was civil rights legend John Lewis, the man who was beaten on the bridge in Selma to get us all voting rights. We know that almost 40% of the false matches were of people of color, even though people of color made up only 20% of the members of Congress. So in some ways, the Robert Williams case was completely predictable. We knew we had allowed for that to happen. It might have already happened elsewhere, but invisibly, in a way that we didn’t see. What’s amazing about the Robert Williams case is that it happened right there, visible to everybody. What happened was that they told him he was being arrested because the algorithm had said he was a match for this mugshot, and they showed him the mugshot. He said to them, “Do you guys think all Black people look alike? That looks nothing like me.” It was pretty clear, if you used your own eyes rather than trusting the algorithm, that the picture and this man’s face didn’t look alike. But nevertheless, he spent 30 hours in jail under some pretty miserable conditions, because the algorithm said it was a match. So I think that’s really important. In some ways, knowing that a problem exists doesn’t inspire people to do something about it the way seeing it does. That’s what happened with all the protests about George Floyd: people could watch that horrible video. They could see it. And here we have an actual person – not just the hypothetical person the statistics point to, but an actual person – who did get arrested and did have a miserable time.
He was arrested in front of his family, and it was really traumatizing. And again, the officers involved were trusting the science more than they were trusting their own eyes, when anybody could have seen he didn’t look like the picture.
Laurie Segall: Right. And he wrote an op-ed in the Washington Post asking, “Why is law enforcement even allowed to use this technology when it obviously doesn’t work?” So, to ask a legal scholar: police departments all around the country are using different variations of facial recognition software. What regulations should we see as we enter this era of algorithmic discrimination?
Susan Herman: Yeah, that’s a great question. Long before Robert Williams turned up, we had been urging police departments not to rely on facial recognition technology – that it was not reliable enough to hold people’s fates in its hands. I guess algorithms don’t have hands. But for people’s fates to be dependent on this technology that was being touted – look, it’s fine for a company to do something to make money. But if making money is your only consideration, and you’re not considering whether you’re unleashing something that is going to disrupt people’s lives unfairly – either because it’s just going to be wrong, or because it’s going to be wrong in a racially skewed way – that’s really a problem. So we’ve been urging police departments not to buy and use the technology, and as I’m sure you know, Amazon has withdrawn its facial recognition technology temporarily, and it’s not clear whether they’ll bring it back. The probability of a wrongful arrest is one thing, but when you draw the camera back and look at the bigger picture, another thing law enforcement agencies have been doing with facial recognition is trying to see who attends a demonstration, who’s in a crowd. That’s not about somebody being wrongly arrested, like Robert Williams, because of a false match; it starts becoming mass surveillance. An agency has the cameras on, then runs facial recognition purporting to identify all of the people in the crowd, so that it can track those people. They now know that you were at the George Floyd demonstration, and that person was at the anti-war demonstration.
And at that point, the government starts having more and more information about all of us, to the point where it feels like, instead of us controlling the government, the government controls us. So I think facial recognition is only one part of the whole tendency of technology to amplify the government’s power to watch what we do.
Laurie Segall: Yeah. It’s interesting to hear you say that this type of technology is just one part of it, especially at this moment when people are out protesting police brutality and fighting for their civil liberties. There’s all sorts of technology being built – cameras police wear that can recognize people in real time, and more. This is just the beginning of it. You mentioned Amazon put a hold on sales of its Rekognition software; Microsoft has said it’s not going to sell facial recognition software to police departments until there are federal regulations; and IBM said it was going to stop its general-purpose facial recognition work. Is that enough? What is the government’s role here? What do you think should happen, especially since this is, as you say, one small part of a larger issue we’re facing as a society?
Susan Herman: I think that’s right. There could be government regulations, but that’s not going to happen unless the public urges their representatives to start controlling this. And what we’ve seen is that an enlightened public can make something happen even without regulation. The public was becoming concerned, and that’s the reason Amazon acted to withdraw this: they started worrying that their customers were not going to be happy with them. At this point, that’s almost more effective than government regulation. And once you have that wake-up call, you can start having serious debates. Those debates have to take place in many places. They should be taking place in legislatures, where people can talk about the tradeoff between privacy and mass surveillance and whatever the government is trying to accomplish. Why do they need this technology? Is it really worth it? Are there crimes they wouldn’t be solving without it? And are they crimes we’re concerned about solving, or do they fall into the category of things we don’t think should be crimes at all? People are generally unaware that only 4 to 5% of all arrests involve crimes of violence. So when people think, “We want to enable law enforcement to catch criminals,” or worry about divesting or defunding the police – who’s going to protect us from physical harm? – almost none of what police and law enforcement do is about physical harm. It’s a tiny percentage.
Everything else they’re doing is about the whole array of other things we criminalize. And in addition to having better conversations about whether some of these technologies the government is using create a potential for arbitrary or discriminatory enforcement, I think we need to dig deeper behind that question – in the same way that you need to dig deeper beyond the George Floyd murder and ask whether there’s something systemically wrong here. Do you need to rethink the whole question? So when people say, “Oh, but we need facial recognition technology because it helps the police solve crimes” – well, okay, but what crimes? And what are the costs? Once people are educated enough, once they realize the nature of the problem and what’s being unleashed, they can be ready to have that broader conversation. It should take place in legislatures, but it also should take place – and evidently is taking place – in boardrooms, at Amazon and Facebook and Google and Microsoft. They should be talking, and they do sometimes, if people demand it.
Laurie Segall: Yeah.
Susan Herman: And it also has to take place just among people, among tech communities, people just beginning to talk about what our responsibilities are here. Is it okay for us to create products to sell to make money if we know that there are dangers that the products are going to be misused, or maybe aren’t reliable enough, or that they just feed into this enormous surveillance state? So let me compare this to an earlier moment, after 9/11. We had kind of a similar phenomenon: in order to deal with catching terrorists, we changed a lot of laws that ended up really sacrificing a lot of privacy and allowing a lot more government surveillance. And for a number of years, that went unchallenged and people kept thinking, “Oh, well, if that’s what we need in order to be safe, we’re willing to give up a little privacy.” So first of all, I think people didn’t think about the fact that they weren’t giving up their own privacy, they were giving up somebody else’s. And second of all, people didn’t realize how extensive the surveillance really was until Edward Snowden. So then after Edward Snowden came along and people realized how the government was just scooping up tons of information about people and just keeping it in government databases, and started realizing the horrifying potential of all that, what happened was that Congress made a couple of little changes to the law. But more important, Microsoft and Google and other places started to realize that their customers were concerned. And they started being a little less cooperative. At the beginning, right after 9/11, all of the telecoms, all of these companies, were just saying to the government, “You want information? Here, take it all.” Verizon was like, “Sure, here are all the records of all our customers. Take it all.
You’re keeping us safe.” And I think that to me, the most important thing is an informed public, that if people can examine for themselves whether they really think that we’re being kept safe by all of this and really examine both the costs and the benefits in an educated way, I think we get much better discussions and I think not only do you have the possibility of getting better legislation or regulation, you also have the possibility that private companies and the tech companies are not going to want to do it anymore because their customers don’t want them to.
Laurie Segall: Yeah. I mean it’s hard to have an informed public and to have these discussions, even in this current environment; to some degree, people, I think, are struggling with the idea of truth. By the way, I remember the Snowden leaks. I remember being in the newsroom, covering technology, because I rode the tech bubble all the way up, right? And thinking, this is an extraordinary moment, because we knew that we’d been sharing all our data, but we saw for the first time that the government had a lot of access to things that we had no idea they had access to. And I think it was a fundamental shift, and the lens on tech companies changed at that moment. And tech companies’ behaviors changed quite a bit after that. I wonder about this moment we’re sitting in, where we’re having these debates about surveillance, and privacy, and whatnot. These are sticky debates and they’re very politicized as we’re heading into an election, as we have misinformation spreading online, as a lot of people don’t know what to believe and what not to believe, as the media landscape has changed. It certainly seems like a harder environment to even have some of these conversations.
Susan Herman: I think in some ways it’s harder. In some ways, I think the other thing that is a catalyst for the discussions is realizing that there’s a dimension of race to all of this. In talking about artificial intelligence and facial recognition, not many people saw that as an issue of structural racism, that there’s something wrong with how we’re putting together the algorithms, so that it ends up that John Lewis is misidentified as somebody who matches a mugshot and that Robert Williams is arrested. So I think the fact that we now know that that is an additional concern enables us to have richer conversations. We’re not only talking about, is there a trade-off between security and privacy? Plus, I think the other thing is that people are feeling much more open to having that deeper conversation about, what are our goals here? And if we’re enabling all this government surveillance in order to help the government catch criminals, what do we mean by criminal? What crimes are they solving, and how is this actually being used, in service of what? So I feel like in some ways, with the election coming up, that gives people more impetus to want to talk about these issues, because elections aren’t only about the president. They’re also about local prosecutors, and sheriffs, and the people who make the decisions about whether to buy surveillance equipment and what they’re going to do with their authority over the criminal justice system. So one thing the ACLU has been doing, in addition to everything else, is we’ve been very involved in elections of prosecutors, because that’s a place people never used to pay attention to: who are these people running? Maybe they would vote for somebody without really knowing what they voted for.
So what we’re urging, and I think this is very much what we’re about, having an educated public, is for people to go to debates and campaign events, to attend them, I guess on Zoom these days, and ask the candidates questions: what would be your policy about whether or not you’re going to accept military equipment from the federal government in your police department? Are you going to buy tanks? Are you going to buy these horrible weapons that are used? Is that something you would do? Are you going to buy facial recognition software? Is that how you would use your power if we elect you? Say to the prosecutors, would you support a reduction in cash bail and increased alternatives to incarceration? So that’s a place where, without waiting for the government to do something, we can, ourselves, affect what’s happening in our communities, by encouraging candidates to think about what positions they’re taking on these different issues and letting them know that they’re going to lose votes. The more educated people are, the more they can tell candidates that they’ll lose votes, and try to encourage candidates to take a better position, which is something that’s worked in some places. They might never have thought of that, but once they commit themselves, that’s going to be better. So there are all sorts of ways that we can affect things.
More from Susan after the break. And make sure you sign up for our newsletter at dotdotdotmedia.com/newsletter, we’ll be launching this summer.
Laurie Segall: Before I move on from specifically some of the tech issues, I have to bring up predator drones. U.S. Customs and Border Protection flew a large Predator drone over the Minneapolis protests. People were protesting police brutality and the killing of George Floyd, and for many reasons, it almost felt symbolic. It was raising all these questions about aerial surveillance, about what data was being collected. Where was this going? What is your take on this?
Susan Herman: Well you know as you’re saying, Laurie, it really magnifies the opportunity to gather more information, because you don’t even have to have the helicopters or whatever. But that of course is a concern. How much information is the government gathering? What are they going to do with it? Who’s going to have access to it? Will it ever be deleted, or will it just stay there in the government databases forever? But I think the other thing that the Predator drone brings to mind is a question that people were also asking, which is about the militarization of law enforcement. We have had for years in this country a Posse Comitatus Act, as it’s called, which says you don’t want the military doing everyday law enforcement, because that’s not our country. We don’t want the military to be, quote, “dominating the streets.” And we don’t want the people who are out protesting to be considered the enemy of the United States. They’re people who are expressing their opinion. It’s one thing if the police have helicopters flying overhead, trying to keep track of who’s in the crowd and what the crowd is doing, but once you start adding an element of military helicopters, or military drones, or things that make it feel like we are being treated as the enemy of the government, instead of the people who are the government, who are supposed to be controlling the government, I think it’s a very bad paradigm.
Laurie Segall: You think it’s a slippery slope?
Susan Herman: Well it’s a slippery slope, unless we stop the slipping. As we saw with Amazon and the facial recognition, people can say, “Wait a minute,” and I think we can make that stop. But if people don’t pay attention, I think we have a very slippery slope. That’s why I’ve been saying, about most of the issues we’ve talked about, starting with the contact tracing and the surveillance and everything else, that it seems to me that what’s really important is transparency and accountability. We should know what the government is doing. Back on the issue of contact tracing, one thing that the ACLU did, together with the ACLU of Massachusetts, is we filed a lawsuit, actually a records request, demanding that the government, including the CDC, release information about the possible uses of all the location data that they would be collecting in connection with contact tracing, because if you don’t know what they’re doing, then you can’t have a discussion about what they should be doing. And one reason why I was bringing up all the post-9/11 changes of law is the whole idea that we can’t know what the government is doing, that the government has to act in secret in order to keep us safe, or else the enemy will be able to know what they’re doing and work around it, but the government can know everything that we’re doing. I think that just has democracy backwards. We have to be able to know what’s happening inside the government, and that applies to, why are they sending the Predator drone? What are they going to do with the information? What does this mean? Are they going to do it again? And it also has to do with the contact tracking and tracing. Once they get that data, what happens to it? Are they going to erase it ever? Who do they share it with? What are they going to do with it? I feel those are really important issues in a democracy, that we just have the right to know what the government is doing, so that we can talk about it.
I feel like to say, “Well, this is what the government is doing, and that’s really bad and that upsets me,” kind of misses the point. If the government is doing something bad, then it is the duty of every American to find out what they’re doing and to push back. And so at the ACLU, we have a program that we call People Power. We first invented that and used it to explain to cities and localities all over the country how they could fight back against draconian immigration rules by becoming, quote, “sanctuary cities,” and what their rights actually were. We then used it for voting rights. We’re about to use it some more for voting rights. But what we really urge, and I hope that some of your listeners will go to the ACLU website and see what People Power is doing in addition to what the ACLU is doing. Because what is the ACLU doing? That’s all the staffers at home trying to work on their new laptops while they’re trying to keep their toddlers quiet. People Power is about what every single person can, and I think should, be doing. If people really educate themselves and think about the ethical issues, the costs and benefits of all this technology, in addition to a lot of other things going on, I think we get a lot better results when people pay attention.
Laurie Segall: Yeah, I mean, it’s interesting to watch the ACLU take on issues like surveillance and facial recognition. I know the ACLU filed a lawsuit against Clearview AI, which was this very controversial company that was using biometric data. I think facial recognition technology helped them collect something like 3 billion faceprints, and they were giving access to private companies, wealthy individuals, and federal, state, and local law enforcement agencies. You know coming from the tech space, it certainly feels like sometimes with these stories, you don’t know what these companies are doing until you start peeling back the layers and seeing, well, the data went to here and here, and why did it go there, and why wasn’t this disclosed? And oftentimes it takes a watchdog to really understand where some of this can go wrong and how it’s being used in ways that can be dangerous.
Susan Herman: Yeah, I think that’s exactly right. That’s why I was saying before that our concern, before everybody jumps on the bandwagon about let’s have more contact tracing and collect all this information, is that I think we have to get a watchdog. You’re not going to have the watchdog tell you things unless you build a watchdog into the system. And if everything is just, a company has invented this and is selling it to the police, or a company has invented this and now we’re all going to buy it, if you just leave out any oversight, then you really have a tremendous potential problem.
Laurie Segall: Are there any other examples of tech that we’re not thinking about the unintended consequences for our rights or privacy yet?
Susan Herman: AI is really big altogether, as you say, across many different kinds of issues. Actually, and this is not tangential to your question, you were asking me before about cases that I had worked on. There was another case that I worked on that was about tech, where I wrote the ACLU’s brief in the supreme court. It was an amicus brief; it wasn’t about our client. It was a case called Riley versus California. And what law enforcement was saying there, the federal government, as well as the state of California, and many other jurisdictions, was that when you arrest somebody, the police get to do what is called a search incident to arrest. They get to see what you have in your pocket, which makes some sense, right? If you have a gun in your pocket, that’s a problem, or whatever.
Laurie Segall: Right.
Susan Herman: They get to do a search incident to arrest. The law had been that if they find something in your pocket that’s a container, they can search inside the container to see if there’s anything in it that could be harmful. And in fact, there was one situation where they opened up a cigarette package that somebody had, and they could find a razor blade, they could find a marijuana cigarette, whatever. So that was the law where the supreme court said, “Yes, you’re allowed to search people and search the containers that are on them.” What law enforcement said was, “Your cell phone is a container. When we arrest you, we can search your cell phone. It’s a container. We have the right to search incident to arrest.” And so we wrote a brief saying, “No. It is a container, but it’s a container that essentially is your home, it’s your library, it’s your desk.” So they were allowing the police to look in your cell phone when the justifications for the need were really very feeble and very unlikely scenarios, things that just wouldn’t happen too often. Maybe you had some remote thing that would go off and would blow something up; oh, come on. But there were other ways to deal with a lot of that. And so the supreme court actually agreed with that. They said, “Yeah, this really is a technological way of finding out what’s in all your papers, and books, and records.” It used to be they were in your desk, and now they’re in your cell phone. So that, to me, is a whole thread of what we’ve been talking about, that the challenges to civil liberties are different and in some ways greater as the technology builds up.
Laurie Segall: Yeah, there’s a great quote on the ACLU website, “The fact that technology now allows an individual to carry such information in his hand does not make the information any less worthy of the protection for which the founders fought.” – U.S. Supreme Court Chief Justice John Roberts.
Susan Herman: Exactly. I like to say that one of the whole points of adding the fourth amendment to the constitution, which is the protection of privacy, is that they wanted to protect what was in Benjamin Franklin’s desk. Nobody should know if he was writing things that were anti-government, and we now have that on our cell phones. So of course. That’s where I think a lot of the protection of civil liberties lies: applying our fundamental principles in different circumstances.
Laurie Segall: Taking a gigantic step back, what do you think is the biggest threat to civil liberties in the new world order?
Susan Herman: In the new world order? It’s hard to select just one; it’s like Sophie’s Choice, which is your favorite child? Right now, mass incarceration is one of our very top priorities, because so many people’s lives are just being totally disrupted, their families too, and often the question really has to be, for what? We’ve been working to get vulnerable people released from prison, so that they won’t get the virus and get seriously ill, possibly die. And one thing we’re hoping is that once jurisdictions see that they were able to release thousands of people from prisons and jails and it’s not going to cause a spike in the crime rate, that it really is a pretty safe thing to do, that’s going to stick, and in the long run we’ll be able to rethink, well, did we really need to put all those people in prison and jail to start with? What are we doing with the criminal justice system? So that’s really big. But the other thing that I think is really big right now is voting rights. I alluded to this at the beginning of our conversation, but the premise of democracy is that the people get to decide who should be running the government and who should be making the policy about all these things we’re talking about here. What are the regulations about technology? What are the regulations about your reproductive freedom? Everything else, LGBT rights. If the people’s vote is distorted, if people can’t vote, that’s a real problem. So we have litigation going on right now in, I think, 30 different states, trying to get people the opportunity to vote. One of the things that has happened, in addition to the methods incumbents have been using to try to protect their own seats, is that the virus has really made it dangerous for people to vote in public places.
We saw the election in Wisconsin, where people were lined up for tremendous distances, waiting for a really long time to vote, because Wisconsin would not allow them to submit absentee ballots. In fact, a study showed afterwards that at least 17 people got the virus from voting. Many, many polling places were closed, because the poll workers are generally elderly people, and the poll workers were not able and willing to staff the polling places. There are a number of states that don’t allow absentee ballots at all unless you have a particular situation, like if you’re disabled, and those states are saying, oh, well, the virus, or the risk of getting ill, that’s not a disability. Or before you can get an absentee ballot, you have to have it notarized, you have to have witnesses. How is all this going to happen? So it’s very concerning that people are going to have to choose between their health and their right to vote. We don’t think that should happen. That’s something that has to be attended to right now, because if states don’t come up with plans for enabling everyone who wants to vote to be able to vote, and for counting absentee ballots, and for administering this program, if you don’t come up right now with the plan and the resources, a lot of people are going to be left out. They’re going to find that either they can’t vote, because they’re afraid to go out to the polls, or their vote is not going to be adequately counted. So I think that right now, making democracy work is really one of our top projects.
Laurie Segall: What is the solution to some of these problems? What are your tangible solutions?
Susan Herman: One tangible solution is that more states have to make absentee balloting available to people without all these conditions and obstacles. The other solution is that, well, you were talking before about truth. A lot of the reason that’s given, the very thin veneer of justification for why we don’t want absentee ballots, or why we need voter ID, people carrying a government-approved voter ID, which means you have to go down to a government office in person and get your voter ID and show it at the polls, the excuse for a lot of this is that there could be fraud. Well, studies have shown that there’s virtually no voter fraud. It’s a real unicorn. And again, I think people need to understand that: it might sound good, but it’s not true. I think truth is another thing that we’re really fighting for these days. Can you listen to the evidence? Can you listen to the public health officials? Can you listen to what’s real?
Laurie Segall: I know for a fact that tech companies are very concerned about voter suppression and misinformation spreading online, this idea of countering truth around a lot of these very important initiatives, whether it’s absentee ballots, whether it’s showing up to the polls, all that kind of thing. You know I’d be curious to know your take. There’s a current battle happening right now. You have 750 advertisers boycotting Facebook, asking for better policing of hateful content. Are social media companies doing enough to police harmful content, especially as we head into an election where voter suppression and the spread of misinformation will most certainly be tactics used to manipulate voters?
Susan Herman: Well, let me actually break your question down into two different parts, because you started by talking about the concern about voter suppression. I think one thing that everybody should be doing is to increase awareness of what is a fair way to improve access to the ballot for everybody. And some of those are tech solutions. We’ve had tech solutions for years that are available and not widely enough used for enabling differently abled people to vote. Can blind people vote? Do they have the technology? So there are a lot of places where we need the tech community, and we need everybody, to figure out how voting could be made easier and to let people know what the rules for voting are where they live. So one thing the ACLU is doing is we have Know Your Rights information on our website: know what your voting regulations are. And that’s something that I think people really have to start thinking a lot about, to let all their communities, all their friends and family, know about the importance of voting and what they have to do to vote, and to urge them to just get out and vote in whatever form that’s going to take. So I think that’s really important. In terms of disinformation on social media, people talk about the First Amendment and whether there’s a First Amendment problem with Facebook telling you what you can’t say. Well, there isn’t, because the First Amendment only applies to the government. So you don’t have a First Amendment right to say whatever you want on Facebook. However, I have to say that we don’t regard this as altogether a simple issue, whether Facebook should be telling everybody what they can say, because even though the First Amendment does not apply to private companies, there’s still a tremendous value to free speech. And there are a number of examples we’ve come up with of people who have had speech suppressed for bad reasons. I’ll give you one example.
There was an African-American woman who posted something on Twitter, and she got all these horrible racist responses. And she posted a screenshot of the responses that she got, to show people what she was up against. And Twitter took it down because it included racist words. That, okay, kind of misses the point. Another ACLU lawyer wrote about a statue in Kansas that was a topless statue; it was a woman who was bare-breasted. And whatever the locality was in Kansas decided to take it down because they considered it inappropriate. So the ACLU lawyer who was challenging whether or not the, I think it was a city, could take it down posted a picture of the statue. It wasn’t Twitter; it was, I think, Facebook, and that was taken down on the ground that it was obscene. So she couldn’t post the picture she wanted to post. So we think that social media control is really a two-edged sword. What I liked is that at one point, Facebook had a protocol about what’s true and what isn’t true. And what they did was they gave you a flag. So if they were concerned that something that was said wasn’t true, they would have a neutral fact checker check it. And then if it didn’t check out, they would put a little flag over it and say, this has been questioned. And you could click on the flag and see why it was questioned, but they didn’t just take it down. So I agree that disinformation is a tremendous problem, but the idea that the solution is to ask the tech companies to decide what we should and shouldn’t see, I don’t think that’s so great either. And certainly they should not be doing it without transparency and accountability. If they’re going to be taking things down, they should tell us what their protocols are. And there should be more public discussion about where the balance is.
Laurie Segall: Yeah. It certainly seems like the protocols change quite a bit, especially having covered tech for this many years. Facebook changes them. Twitter changes them. And oftentimes, it depends on public pressure. I’m curious to see what happens with all these advertisers boycotting. Personally, I have a feeling it won’t impact the bottom line much, and they’ll go back to business as normal. But who knows? I do know that Zuckerberg cares deeply about his employees, but they’ve been kind of up against public scrutiny for a very long time. But it certainly is interesting, especially when the stakes get higher and disinformation can go further. Especially as we get closer to an election, it certainly feels like everyone feels more triggered around it.
Susan Herman: Yeah. Yeah. Well, one of the classic statements about the First Amendment is that in the marketplace of ideas, the best antidote to bad speech is more speech.
Laurie Segall: Right.
Susan Herman: So suppression, I think we always have to worry every time somebody is censoring and suppressing.
Laurie Segall: Yeah.
Susan Herman: Who are we giving that power to?
Laurie Segall: Nearing a close, because I know we don’t have you for too much longer: I saw that you gave a talk, “A Democrat and a Republican Walk Into a Bar.” And you were saying that it seems like these days Democrats and Republicans can’t really agree on anything, but we all need to agree on fundamental American principles like due process, equality, and freedom of conscience. So is that possible? Are you an optimist? Do you believe that in this current environment, that is possible?
Susan Herman: Well, I think that’s a great wrap-up question, Laurie. So that speech I gave at the Central Arkansas Library, and my chief point, as you’re saying, is that I think people have to be able to agree on neutral principles. The constitution was designed not to say what we’re going to do about everything. It was designed to have everybody have a fair opportunity to be part of the process of deciding what we’re going to do. So it sets up all these democratic structures where we get to vote for the people who are the policy makers, and we all get to decide. But the underlying principle is that everybody should have a fair chance, and the principles should be neutral. Everyone should get to vote. It’s not like if you’re a Democrat, your vote doesn’t count in this area, and if you’re a Republican, your vote doesn’t count in that area, because that’s not fair. And the basic ideas of freedom of speech, freedom of religion, they’re all basic manifestations of the golden rule: if I want the ability to choose my own religion and decide what religion I’m going to practice, I have to respect your right to make a different choice and have your own religion, because that’s the golden rule. If I want to say something that’s unpopular, I have to respect your right to say something that’s unpopular. And if I want to be treated fairly, and not locked away for doing something minor and never given a fair trial, I have to respect your right to the same fair treatment. And to me, all those fundamental principles are things that we really all should agree on. I think people get into arguing and assuming that they can never agree on the principles because they’re differing on what they think the results should be. And I think part of the point of civil liberties is that it’s all about process. It’s not about results. The ACLU is nonpartisan. We don’t try to get Republicans elected. We don’t try to get Democrats elected.
We don’t favor or disfavor individual politicians or individual parties. But what we favor is that there should be neutral principles that everybody can agree to, to say, okay, here’s what’s fair. And the analogy I used in that talk at the Central Arkansas Library, it was one of the nights during the World Series, but fortunately not a night when there was a game, so people were able to come. And I said, okay, so what happens before a baseball game is that everybody has agreed on the underlying rules. And everyone agrees that your umpires, your referees in any sport, should be neutral. You don’t want somebody who is partisan. If they were favoring one team, you’d get rid of them. And all sports fans could agree to that. Maybe there would be a few who would be just so Machiavellian that they would rather have the biased umpire who always rules for their side. But I think sports fans can agree that what you really want for a fair game is a fair game. You want everyone to agree on the principles beforehand. And I think that if we could sit down in small groups around the country and really talk about what the fundamental principles are, I am enough of a patriot to think we actually could agree about a lot. And let me give you an example of why I think there’s some basis for hope, maybe not optimism, but certainly hope. We were talking about voting rights. One of the major problems is gerrymandering, the way when a party is in power they try to distort all the districts and stack the deck so that their party will remain in power. Or, if the party in power in a particular state thinks it’s to their advantage to not have that many people vote, they try to make it harder for new voters to register to vote, et cetera.
The ACLU, and a number of other organizations working in coalition with us, have had a fair amount of success doing ballot initiatives, going to the people of the state in states like Michigan and Nevada and Missouri, and in Florida, where we were part of getting Amendment 4 passed, which gave the vote back to people who had been convicted of a felony at some point. And when you ask the people of the state, you can get a majority, sometimes a supermajority, of people who say, we want the rules to be fair. Who doesn’t want the rules to be fair are legislators who are incumbents and who want to keep their seats even if it takes unfair procedures to do it. So that’s a real problem we have right now, that the incumbents, the people who are trying to maintain power and not allow any sort of regime change, are pulling all the levers. But I think the chief grounds for optimism is that when you go to the American people themselves and say, well, do you want a fair system, or do you want a system where you think your side is more likely to win? When you talk to them about that, I think you’re going to get them to say they would really like to see a fair system. And that is the promise of America.
Laurie Segall: Last question. You have taught at Brooklyn Law School since 1980. What is the lesson your students will take from this moment in history?
Susan Herman: Whoa.
Laurie Segall: I know there are lots of lessons, but if you could extract it, what is the lesson your students will take from this moment in history?
Susan Herman: Well, in an individual setting, one thing I’m doing for the fall is I am preparing a course that I’m calling COVID-19 and the Constitution. What we’re going to do in this seminar is look at the ways in which the Constitution has been challenged and see how well it holds up. What does the Constitution have to say about whether you can quarantine people, and whether you can allow people to be at a religious assembly but not go to a protest, et cetera? So I think there are a lot of interesting questions there which are very much of this particular moment.

But big picture, what I would like the students to take away, the constitutional law students especially, is essentially what I just said to you: that the Constitution is about process. It’s not about results. It’s not about you’re a Republican and you’re a Democrat and we have two different countries depending on what your party is. I think that we have one country, and it’s all about a neutral process, for very good reasons. And I would like people to think more about that. After my speech at the Central Arkansas Library, two people talked to me. One guy came up to me and said, “I’m the Republican who walked into that bar.” And he said, “You’re making a lot of sense to me.” And then there was another guy who talked to me who was a Democrat. He said, you know, “I never really thought about that, but maybe it’s not right if we’re only trying to win. I never thought about the fact that that’s not what we do in sports.” And that’s what I’d like people to think about. Do you really want to do things that are only about how you think it’s going to come out, and cheat and destroy the system and put a thumb on the scale and stack the deck in order to reach your preferred result in the short run? Or, long term, is that just a really bad idea? We’ve just come from the 4th of July.
It’s totally inconsistent with the premises on which we would like to believe our country was founded.
Laurie Segall: Does technology throw a wrench in the system? It does create lots of things you can’t control. And it-
Susan Herman: But it always does. It’s always a new environment. So, a different kind of example: we were talking about technology and surveillance, where, of course, technology has enabled a whole lot of surveillance that we then have to deal with. But technology has also enabled a whole lot of new marketplaces of ideas. The ACLU did a lot of litigation a few decades ago applying First Amendment principles to the internet. Could the government censor what was on the internet because a child might see it? So with every new generation of technology, there are new challenges about how you apply our principles, like privacy and free speech, et cetera, but the principles remain the same.
I hope everyone is doing well in these strange and surreal times and adjusting to the new normal. Most important: I hope you’re staying healthy… and somewhat sane. Follow along on our social media. I’m @LaurieSegall on Twitter and Instagram. And the show is @firstcontactpodcast on Instagram, and on Twitter we’re @firstcontactpod. And for even more from Dot Dot Dot, sign up for our newsletter at dotdotdotmedia.com/newsletter. And if you like what you heard, leave us a review on Apple Podcasts or wherever you listen. We really appreciate it.
First Contact is a production of Dot Dot Dot Media, Executive Produced by Laurie Segall and Derek Dodge. This episode was produced and edited by Sabine Jansen and Jack Regan. The original theme music is by Xander Singh.
First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeart Radio.