First Contact Transcript

Episode 22: Covid-19 Has Made Us Slaves to Tech. Will We Ever Escape?

First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeart Radio.

Laurie Segall: Great, I’m just gonna put on my do not disturb really quick. Um…

Aza Raskin: Oh I’ll do the same.

Speaker: Oh, I should mute Slack.

Laurie Segall: Ok there we go…

Ok. Computer on. Turn on Zoom. 

Welcome to the new reality. For a long time, we tried to limit our screen use – but now – we’ve gone all in – full on digital. We’re living in self-isolation, more reliant than ever on technology for human connection.

So let’s look at technology through a more philosophical lens. Are we now slaves to our devices? And do we have a shot at pulling back from this moment of extreme connectivity?  Most importantly… Will tech companies let us? 

Aza Raskin: No. I don’t think they will. Unless we change the business model. But we’re in a groundswell, discontinuous time.

Any time I decide to go full on philosophical – I like to bring in Aza Raskin – he’s the co-founder of the Center for Humane Technology. You might recognize him from an earlier episode of this season called The Weaponization of Loneliness.

He’s one of my favorite people to talk to about our complicated relationship with tech and what’s coming next. So here’s a taste of the questions we’re gonna ask in the Covid-19 era…

Could tech companies use the same persuasion tactics they use to get you to click… to help save lives?

Aza Raskin: They could use their behavioral targeting, their 21st century technology, to get the most important information to the right people and help them make the change that’ll save hundreds of thousands to millions of people’s lives.

Where will we land when it comes to protection versus privacy?    

Aza Raskin: It’s as if there’s a tree and it’s falling, and our job now is to make sure it falls the right direction, cutting that little bit, because where it lands, it lands forever.

And as we stare at our reflections over Google Hangout or Zoom – what will be the long-term effects?

Aza Raskin: Right now I think we are confusing, as a species, screens for mirrors. We look into the screen and think we see ourselves reflected back, but it’s really through a fun house mirror. 

This episode is a little different. We recorded it with a live audience on Zoom for an organization called Reboot. We’ll post the link in the show notes. Now, you’re gonna hear some questions from listeners too. And be sure to sign up for our newsletter on dotdotdotmedia.com/newsletter – we’re launching it soon and we’re super excited about it.

But most importantly… let’s explore this strange moment where humans and tech have become more intertwined than ever. I’ve got some serious questions about the future.

I’m Laurie Segall, and this is First Contact. 

Laurie Segall: I guess maybe my first question is, how would you, you as someone who has looked at our complicated relationship with technology, you were talking about the attention economy long before a lot of folks, and how technology companies were using persuasive tactics, um, to kind of hold us and make us slaves to our devices. How would you currently describe our, our status with technology?

Aza Raskin: You know, one of the most frustrating things that’s happening now in technology is, you know, we were having a technology backlash, and rightly so. As we realized that, you know, the attention economy, like the, the need to grab our attention. Although, most people will say that, that the business model of technology is by and large an ad based business model, but that’s to miss the point, which is that it’s not exactly an ad based business model, it’s a model that makes money when it can change our biases, our beliefs, and our behaviors, and demonstrably so. And so that was a driving factor for, uh, for our tech addiction, for information overload, for the increasing polarization in our systems, the increasing incentive for misinformation. We have deep fakes hacking our ability to, to know what’s true and not true. And all those things are still true. And yet technology is this incredible lifesaver that’s connecting us. But we’re all sitting in our homes, right, and the only way we can see past the confines of our walls is to put on the binoculars, the telescope of technology. And whatever way that it warps our view of ourselves, of each other, of the world, that is now going to become the only way in which we see the world, and it’ll become ourselves.

Laurie Segall: Yeah. I mean, it’s essentially become… The conversation before this was like, okay, how do we step away? The technology has done all these dangerous things, we need to have the conversation about regulation, and about really, you know, really trying to, to have these broader conversations. And now we’ve gone all in. Right?

Aza Raskin: Yeah.

Laurie Segall: I joke to my friends. I’m like, “Oh, we swallowed the red pill.” Like we are reliant in a way that I, I thought we couldn’t even be more reliant on technology. And now there’s all these new complicated questions that are gonna come along with it. Like we’re almost slaves in a new way to, to technology. I, I saw something you wrote in a Medium post for-

Aza Raskin: Mm-hmm.

Laurie Segall: … Center for Humane Technology. You guys were talking about, okay, so what is tech’s responsibility during this? Right? We, we’re no longer villainizing tech I think in a, in a certain way. Now you’re saying, “All right, we need tech to act.” And you wrote, “Act more like a thoughtful journalist.”

Aza Raskin: Mm-hmm.

Laurie Segall: “Act like a doctor that cares about ethics and choices during this time.” So what does technology acting like a thoughtful doctor or journalist during this time actually look like? Like how does that play out on the screen?

Aza Raskin: Yeah. Um, I think one of the, the most common myths, especially for the people who make the technology, is that our platforms of technology are just neutral, that they don’t have a direction they want to steer people, but we know technology is highly persuasive. And so what we meant by technology that acts more in your fiduciary interest, that is on your behalf, towards your benefit, is, right now, Apple, Facebook, Google, they’re showing and using 19th century PSA technology to convince people to change their behavior. Stay home, wear a mask. And it’s just a link to a website like WHO, but we know that it’s not enough. That doesn’t change behavior. Technology’s the only thing that can run in front of the exponential curve to three billion people, especially the people in the Global South, and change behavior. So instead, they could use their behavioral targeting, their 21st century technology, to get the most important information to the right people and help them make the change that’ll save hundreds of thousands to millions of people’s lives. Here’s an example. In 2010, Facebook ran a study where they showed one message one time. Go vote. But instead of just saying go vote and here’s some information about why you should, they showed the faces of your other friends that had already gone to vote, six of them. And that one message, one time, using social proof, got 340,000 people out of their seats to the polling places who wouldn’t have gone to vote otherwise. Right? This is a little bit of bits changing a lot of atoms. And you know, they, they matched it up to voter registration. They, they were able to, to sort of show a causal effect. And they’re not using any of that right now to, to help people change their behavior in a way that’s beneficial to us all. Another example of this. If they wanted to really take, uh, these messages and make them personal in a way that only technology could, how about this? Show how many days we are behind Italy or behind another country, and then show, you know, in Italy, it was, I think it was 95 out of 1,000 people who were infected died. You could show what that would be equivalent of in your community. Show the faces of the people most at risk to really bring it home and make it personal.

Laurie Segall: It’s like these product decisions. I think a lot of folks, um, unless you explain it, like … A- and this is the work that I, I did even uh, at CNN, um, you know, interviewing tech founders about like these little product decisions that are made that make such a difference for humanity. Like the color of, uh, the notification.

Aza Raskin: Mm-hmm.

Laurie Segall: What color is it? Is it red? That makes your brain actually want to click on something. So how do you use those persuasive techniques to actually save lives at a moment where technology quite literally could, could save a life, or quite the opposite? And now the stakes are even higher for folks in your field.

Aza Raskin: Yeah. There are, there are teams in every one of these companies that, as you know, are called growth teams, or growth hackers. And they’re the reason why these companies are so good at exploding their user numbers exponentially. Right? These are teams, engineers and designers, that have honed their skills at growing companies from, you know, thousands of users to millions of users literally overnight. Those techniques and those teams now need to be repurposed. Every single one of those teams should be taken off of their growth team and put onto an anti-viral growth team. And that’s something that the companies can do right now, but, so far as we’ve seen, that hasn’t happened.

Laurie Segall: I thought there was one interesting example you said about, you know, what’s the difference between when you put up a thing that says social distancing, versus okay, don’t actually, you know, go to the grocery. These are, these are very specific messages put out on, on social media that can actually make a difference. Can you explain that example that, that you guys kind of talked about in this blog post? And, and this is an actual product decision that’s very specific that can have a, a different outcome.

Aza Raskin: Yeah. Exactly. It’s, generally speaking, if you want a message to really land, it should be concrete, it should be personal and relate to you. It should take things which are hard to see, things that are far in the future, and make them feel-able to you right now, and tangible. And so in this example, instead of saying, “Stay home,” well, what does that really mean to be staying at home? Does that mean you’re going to the grocery every two days? Does that mean you’re going to the grocery every week? When you say, “Wash your hands often,” exactly how often is that? When you wear a mask, like, at what point do you leave your house do you wear a mask? If we codify the very best practices and make a sort of specific checklist, uh, that becomes much more persuasive than a generic like, just, just physical distance.

Laurie Segall: I want to talk a little bit about, um, surveillance and, and privacy during this time. I think what an interesting moment we’re sitting in. Um, you know, Apple and Google are building out contact tracing technology.

Aza Raskin: Mm-hmm.

Laurie Segall: And you know, I, I know that after 9/11, in order for people to feel safe to go out again, they had to make changes at the airports, right? They had to bolster security. You know, it’s not like there’s just gonna be a switch that we flip on, and everyone’s gonna feel comfortable to go out, because there’s not gonna be a vaccine for a very long time. So technology to a degree will be part of that, that switch that we flip, you know. And, and I think looking at these technology companies building out contact tracing and saying, you know, “We have technical solutions that can help us feel safer,” is very interesting and notable. But it brings up the question of privacy versus protection. Can you give us a little bit of the, you know, the lay of the land of what we need to keep in mind as Apple, uh, and Google build out some technology that might enable some kind of contact tracing. What should people be aware of? And what are the hard questions we need to be asking our tech companies and, and the developers that end up building out apps based on this?

Aza Raskin: Yeah. You know, I think one of the things that I constantly struggle to remind myself is, there is no returning to normal. You know, where we were, that world does not exist anymore. It’s as if there’s a tree and it’s falling, and our job now is to make sure it falls the right direction, cutting that little bit, because where it lands, it lands forever. And the decisions we make now are going to be true in 12 months. They’re going to be true in 24 months. Long after, thankfully, the virus will be in our past, the decisions and norms we set now will continue. And that’s a really hard thing to hold in our minds when so many people’s real lives are in the balance right now. So some of the decisions, contact tracing. What is contact tracing? Contact tracing is a technique used … Singapore used it very well. South Korea used it very well. Taiwan used it very well. Which is, when somebody becomes sick, find every person that person interacted with for the last 14 to 28 days. That’s a lot of manual work. Because that way you can inform them that they’re at high risk for also becoming sick, so they can self-quarantine to stop the exponential spread. Now, the naïve way of implementing this would be like, cool, why don’t we just take everyone’s GPS location, or our cellphone towers actually understand where we are, they can triangulate and get our, our position. We can just de facto use that to figure out who you were with, and inform everyone who’s at risk. And in fact, we know that our government is starting to look at this. The Israeli government sort of turned this on as a feature. And the question is, well, will they ever turn it off? I don’t know if you saw, Laurie, um, there was a, uh, a company whose name I will not mention, demoing their product, showing everyone who had been on a Florida beach at spring break, and tracing them back to their homes. They just selected a little area on the beach, and then could see which home they had gone to. Uh, and it’s not like anyone had opted into that. It’s that your phone has a whole bunch of, um, ad APIs in it, in various apps, which are reporting that data back, and companies are accessing them. The New York Times did an incredible exposé on this kind of data. So the fear is that we will just turn on a surveillance apparatus, and then never turn it off. Now, what Apple and Google are doing is interesting. They’re turning it into an API, which is not exactly to say they’re building an app. They’re giving a new functionality to their OS. And so always watch for whether companies are doing something for their own interest and dressing it up, or they’re actually working on a thing to directly solve the problem. To directly solve the problem, we need 80% of the population to have contact tracing. Um, doing that as an app and letting other people develop on top of it, it’s not gonna actually get to the, the numbers we need to actually make the change. But they’re doing it at least in a really interesting way. Every phone sends out a randomi- it’s sort of like saying a random piece of noise data. It’s like making up a word and just saying that, and it changes words every 20, or every five, minutes or so. Other phones around it are hearing that sort of random word, but there’s no centralized server. No one knows who and what words are being said. And so in the end, if you get sick, your phone uploads to a server all the words it’s been saying for the last 28 days, these random words. Other phones then connect, download it, and be like, oh, if I’ve been around that weird word that was said, I’m probably sick. And that way you can do it in a privacy-preserving way.
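Here’s a minimal sketch of that word-saying scheme, in Python. To be clear, this is an illustration, not the actual Apple/Google protocol, which derives its rotating identifiers from cryptographic keys and broadcasts them over Bluetooth at the OS level; the class and parameter names below are made up. It just shows the decentralized matching logic Aza describes: say random words, remember the words you hear, and compare locally against the words a sick person publishes.

```python
import secrets

class Phone:
    """Toy model of one device in a decentralized contact-tracing scheme."""

    def __init__(self):
        self.said = []      # random "words" this phone has broadcast
        self.heard = set()  # random "words" overheard from nearby phones

    def new_word(self):
        """Make up a fresh random identifier; meaningless on its own.
        (A real phone would rotate these every few minutes.)"""
        word = secrets.token_hex(16)
        self.said.append(word)
        return word

    def hear(self, word):
        self.heard.add(word)

    def report_sick(self):
        """On diagnosis, publish the words we said (nothing about whom we met)."""
        return list(self.said)

    def check_exposure(self, published_words):
        """Match the published list against what we heard locally."""
        return any(w in self.heard for w in published_words)

# Two phones pass each other on the street.
alice, bob = Phone(), Phone()
bob.hear(alice.new_word())          # Bob's phone overhears Alice's current word

# Later, Alice tests positive and uploads her words to a dumb relay server.
server_list = alice.report_sick()

# Bob's phone downloads the list and checks it against its own local log.
print(bob.check_exposure(server_list))  # True -> Bob gets an exposure warning
```

Notice that the server only ever sees random tokens from people who chose to report; who met whom stays on the phones.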

Laurie Segall: Do you feel good about it?

Aza Raskin: No. I don’t. I think the-

Laurie Segall: Why?

Aza Raskin: … uh, smarter response is to scale up testing. We’re sort of past the time now where contact tracing, I think is gonna make a huge impact, ’cause we’re seeing widespread community spread. And so instead, we should work on the other side that doesn’t have the, the kind of surveillance state problems.

Ok we’ve got to take a quick break to hear from our sponsors. More with my guest after the break.

Laurie Segall: I want to look at this idea of us almost becoming like slaves to technology-

Aza Raskin: Yeah.

Laurie Segall: … in a way. You know, our only means towards human connection is through a screen. I remember the last time I interviewed you, we were in person.

Aza Raskin: Yeah.

Laurie Segall: Um, so there’s, there’s a lot gained in that we can have this conversation and, and be around 86 people we wouldn’t have had access to. And then there’s a lot, you know, that’s lost when, when you lose the, the ability to sit in front of somebody-

Aza Raskin: Yeah.

Laurie Segall: … and be with them and have the, those small things that, that really, um, really make us human-

Aza Raskin: Yeah.

Laurie Segall: To a degree. And, and I don’t know if folks on this, uh, on this Zoom know, but you are behind some of the most interesting design decisions that have happened on the internet. I mean, I don’t know which one you want to take credit for here and there, but we could say like, the infinite scroll, right?

Aza Raskin: Yeah.

Laurie Segall: You think a lot about design. So part of the reason that you can keep scrolling is because of, because of Aza.

Aza Raskin: Yeah.

Laurie Segall: Compliments of Aza. So you are, you know, an architect of the modern internet as we know it.

Aza Raskin: Mm.

Laurie Segall: And I’m convinced that this will, you know, cause, I think, the modern internet to change in a, in a way, because we are so reliant on it, and we want these places to feel a little bit more human. So what do you think is broken about the way we communicate right now in this, in this way? And how can we make it feel more human, if we’ve gone all in full on, full on digital?

Aza Raskin: Yeah. I mean, our systems are not particularly ergonomic. They don’t really fit our minds and our bodies. And, and we feel it, right? We, we are on Zoom all day long, and yet we still feel disconnected. And the des- like the way I like to imagine it is, imagine there is like a pipe that’s connecting you and me, and all of our human empathy has to go through that tiny little pipe. The shape of that pipe is really going to change us. If we sit in a chair which is un-ergonomic for two hours a day, fine. We, it hurts a little bit. If we sit in it for 18 hours a day, we’re going to really feel it. It’ll change the way we walk, it’ll change the way we feel. That’s what’s happening with our online ecosystems now, is we’re spending all of our time in it. So any way that technology is misaligned with our humanity, we are going to feel. And so it’s little things, like with Zoom. There’s no way in Zoom for me to wander away from the conversation. If you’re standing in a room and I’m feeling a little overwhelmed, I, I can just like walk over, to, to an empty corner. If there’s an interesting conversation happening, I can sort of like sidle up to it. But on Zoom, we have to be on all the time. There’s, there’s no space for doodling. And I think there’s going to be a lot of fascinating technology that comes out of now, because our thermometer for what’s broken is so much more sensitive than it was before. So some things I’m really excited about that I’ve seen s- from, from other people prototyping, are cute 8-bit graphics where you’re represented as a little character and you’re wandering around a screen. And when your characters get close to each other, your video fades in and your audio fades in. If you get a little further away, your audio fades out. And what’s neat is here, you could have 100 people in a large room or in a digital park, and you can walk over to a group and participate in the conversation, and then walk away when you’re feeling a little less, uh, less engaged. And I think, I think that’s fascinating.
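A rough sketch of that fade-with-distance mechanic, in Python. The radii and the linear falloff below are invented parameters, not any real product’s values; the point is just that audio volume and video opacity can be a smooth function of the distance between avatars, so you can sidle up to a conversation or drift to an empty corner.

```python
import math

FULL_RADIUS = 30.0   # hypothetical: within this distance you see/hear fully
HEAR_RADIUS = 120.0  # hypothetical: beyond this distance you fade to nothing

def fade(my_pos, other_pos):
    """Volume/opacity in [0, 1] that falls off smoothly with avatar distance."""
    dx = other_pos[0] - my_pos[0]
    dy = other_pos[1] - my_pos[1]
    d = math.hypot(dx, dy)
    if d <= FULL_RADIUS:
        return 1.0
    if d >= HEAR_RADIUS:
        return 0.0
    # linear falloff between the two radii
    return 1.0 - (d - FULL_RADIUS) / (HEAR_RADIUS - FULL_RADIUS)

print(fade((0, 0), (40, 0)))   # ~0.89: sidling up, mostly audible
print(fade((0, 0), (200, 0)))  # 0.0: an empty corner of the digital park
```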

Laurie Segall: Yeah. I, I think, um, was it maybe you that said it? It was like, I was in a group, a, a chat with a bunch of friends the other night, and it felt like a friend webinar.

Aza Raskin: Yeah, a friendinar. 

Laurie Segall: Yeah, you don’t have like a, technology doesn’t mirror the human experience. Um, it doesn’t provide the serendipity of you being able to kind of like tune out for a minute and come back. And so it’ll be interesting to see if there are new virtual communities that are built-

Aza Raskin: Yeah.

Laurie Segall: … on top of that.

Aza Raskin: You know, one of the things I think that I’m most excited about right now is, Burning Man announced that, in their words, “Burning Man isn’t canceled, it’s going virtual this year.” And that’s going to be fascinating, right? How do you take a group of people that live in win-win game dynamics, radical self expression, and translate that online? Here is an opportunity for creating new kinds of environments with new game theory, that can be seeded out to the entire world. Burning Man has set the culture for much of, you know, California, the U.S., Silicon Valley. This is a, I think a huge opportunity. And maybe it’s gonna look a little bit more like a, a 2020 version of GeoCities. Who knows? Uh, but I’m excited for it. I think, yeah, that’s a, a place to dive in.

Laurie Segall: Yeah. I want to ask one of the questions that someone put in. It said, “If the decisions with our use of tech now, as our, as our main way to connect, will set our norm for the future, how do we be thoughtful to ensure that we don’t become overly connected, tech reliant, and screen based when we’re able to reconnect in person?” I think it’s a good question. I, I joke that like what, what is it gonna look like when we all have to come out, um, you know, and talk to each other? I think a lot of people have talked about, “Oh, this is amazing. I have this real intimacy now.” Uh, e- even in the dating world, people are talking about, uh, you know, FaceTime dating and how they’re really creating all these real connections. But how does that really translate when-

Aza Raskin: Yeah.

Laurie Segall: … when we actually kind of emerge beyond the screen?

Aza Raskin: So, you know, two weeks or so before like the pandemic really hit here, I was like, I need, I need to get a car. I haven’t had a car in years. I’m like, I need a car, um, get out to nature, to just, just to be safe. And so, I, I got one, and I’m very excited about it. It’s a little adventure car. Uh, and I was like, fine. I’ll just get a 2020 edition. And it has driver-assist technology in it. And this is a fascinating thing to dive into as an interface designer. I’m like, I wonder what decisions they’ve made. And one of the most interesting decisions they made is that you turn on driver assist, you’re doing cruise control, and it has sort of a lane assist. So that means if you sort of veer out because you’re not paying attention, and your car tries to, to cross one of the, the lane markers, it’ll steer you back in. I’m like, oh, that’s cool. So you know, I do the thing where I take my hands off the wheel and I, I see what it’ll do, and it oversteers me. That is, it doesn’t correct me back to the perfect line, it steers me so that if I keep my hands off the wheel, I’ll then ricochet off the other side. And after the first time, it turns itself off. And this is really interesting, because it’s handing agency back to me. It says, “I will catch you when, when you miss, human, but as soon as I can give you back agency, I will give you back agency, because I don’t want to lull you into a false sense of security.” Um, and if you’re going less than 25 miles an hour, it won’t do it at all. And this I think is a really interesting concept. Why don’t they do it less than 25 miles an hour? Because they don’t want to assume the liability. And this leads us to a general principle, that the degree to which technology takes over our agency should be the degree to which our technology or technology platforms assumes the liability for doing so. And for me, that’s the question of what technology can do, and when. An example of that is, right now with YouTube, 70% of all watches on YouTube come from algorithmic suggestions. That is, they’re taking over 70% of the agency of what we decide to watch, so they should assume 70% of the liability of what gets shown. So what technology and technology companies can do as we reemerge is build those same kinds of bumpers, which hand back agency to people and are like, “Yeah, we know that all of humanity has now become addicted to being online, to that false sense of connection, that sort of sugary feel, the one where you’re like, I can be on Zoom all day and still feel disconnected.” And it can gently push us back into making being in the real world with real people the easiest thing to do.

Laurie Segall: What do you think they can do and what do you think they will do?

Aza Raskin: Oh, yeah. Well, I think, um, I’ll, I’ll just pose this question. In any part of our lives that technology has increasingly colonized, when have they ever been like, “Oh cool, I’m gonna un-colonize that for you”? Like that, they, they never do that, because it’s not in their business model. One of the most hopeful things I’ve seen is the city of Amsterdam moved their model from an extractive economy, where you have to take out as much, um, energy and resources from the Earth as possible, standard capitalism, to a regenerative model, for their future. They were like, “This is a discontinuous moment. We’re gonna move to a doughnut theory of economics,” which is a different theory that’s well worth diving into. And it mostly says, there’s some amount of resources below which you’re not feeding people, and you’re not doing well as a society, you shouldn’t be there. Uh, people don’t have enough housing. And there’s some amount at which it’s just too much. You’re extracting from the environment and have lots of externalities. And you want to stay in that middle zone. I think we need that for technology. We need regenerative technology. And that has to come from protections and regulations, because the companies have shown they’re just not going to move there on their own.

Laurie Segall: Hm. What will be the types of tech companies built on this moment? Like if you could go to like one or two industries that you were like super excited about, you know, what do you think is going to be like the most interesting tech company built out of coronavirus, global pandemic, isolation? You know, what, what will be the, the opportunity there?

Aza Raskin: Yeah. Um, high-bandwidth connection I think is what we’re all going to be looking for. Right now, if you want to connect with people, you sort of have Zoom, for video, but it’s weird, because often when we’re working, we’re working on something specifically. A document, um, we’re drawing something together, we’re looking at photos together. And then on the other screen, we have the video chat. I think we’re gonna see video chat as a service get integrated into everything. Um, there’s an incredible design program called Figma that I’ve only recently started playing with, and it’s neat because it’s a huge canvas that you can zoom in and out of, um, and you can see everyone else’s cursors in real time moving things around, and it feels very collaborative. That kind of truly social computing environment I think is gonna get birthed now, because we don’t have the, the benefit of just like walking over to somebody’s desk.

Laurie Segall: I’ll pull in a question from a panelist, because I think this kind of goes into this ethical question of, are we gonna create a, a, even more of a world of haves and have-nots at this moment with the digital divide? How do we help bridge what will inevitably be a growing digital divide? Um, you know, the person in the chat fairly says, “Not everyone can afford the basics to get access to the internet, uh, and also people are afraid of technology, so you have more and more isolation.” So you know, what, what do you say to, to this world that we’re gonna see where there will be a growing divide between the digital haves and have-nots?

Aza Raskin: Yeah. I think this pandemic, and, and pretty much every global crisis that’s come before, is like a UV light that you turn on, and it shows you in society where all of your systemic fragilities are. And we’re seeing that the pandemic is, is disproportionately hitting our most vulnerable. And it’s bad in the U.S., and just imagine how bad it’s going to get in the Global South. So I think, and I hope that by being able to see, the inequalities with greater acuity, it’ll give us the opportunity to address those problems with greater acuity.

Ok we’ve got to take a quick break to hear from our sponsors. More with my guest after the break.

Laurie Segall: Whenever we hang out, I always just try to like create a whole Black Mirror episode-

Aza Raskin: Yeah.

Laurie Segall: … of what the future can look like-

Aza Raskin: Oh yeah.

Laurie Segall: … if we’re not careful. We’ve talked about tech’s next big threat, which is almost like the weaponization of loneliness, like you know vulnerable communities, you have o- older people who are isolated. We talked about this before-

Aza Raskin: Yeah.

Laurie Segall: … um, the coronavirus, and before people were self isolating. You know, now, people are gonna feel lonelier than ever.

Aza Raskin: Yeah.

Laurie Segall: Um, if they are self isolating, and, and you know, this, we’ve got a long time before things feel “normal” again. How will technology be weaponized against people who are increasingly lonely and vulnerable and reliant on technology for connection?

Aza Raskin: Oh yeah. This is, I think this is one of our favorite topics. Um, ’cause it could go so dystopian.

Laurie Segall: Right.

Aza Raskin: I had one more thought on, the sort of digital divide and vulnerable populations, um, before we dive into that. And that is, it can be really tempting to say, “Ah, we just need to get technology into people’s hands. That’ll solve the problem.” But the Philippines is a really good example of where that went terribly, terribly wrong. The internet in the Philippines is pretty much all Facebook Free Basics, where they subsidize free plans, essentially, but you have to be using Facebook. In the Philippines, it’s 100% market penetration, and they spend 10-plus hours a day. They, they lead the world as sort of the canary in the coal mine. And one of the aspects of Free Basics from Facebook is that they would let you see the headlines for news articles, but you had to pay to click in and view them. And what that meant is, they turned an entire society into a headline-only society. So polarization there went from, they were not a polarized country, to being, in the last 10 years, one of the most polarized countries. And so we need to be very careful, when we set out to, you know, solve the problem of the digital divide, that we don’t just onboard the most vulnerable to the places where they can then be targeted. And that leads us right into the loneliness epidemic, which was already a problem before the actual pandemic got started. There’s some new technology, just to like really, really lock this in people’s minds. It’s called style transfer for text. So there’s this thing called style transfer for images, which is an AI technology, and it lets you look at one image and immediately apply the style of that to another image. That is, I can look at Chagall, or p- point the AI at Chagall, and now it can paint your portrait in the style of Chagall or, or Picasso… And this works just like that, but for text. That is, I can point an AI at, say, everything you’ve written on social media, and learn the style that you write in, or look at all the comments that you respond quickly or positively to. Or from Gmail or Facebook, I look at all the, the emails that you’ve written, that you’ve responded to quickly and positively, and learn the style that is most persuasive to you. I think what’s going to be even more dangerous than, uh, visual deep fakes is going to be textual deep fakes, the ability to generate arbitrary amounts of content. Both Google and Microsoft have been working on chat bots whose goal is empathy. That is, we think of empathy as like the most core and wonderful part of our human experience, which it is, and we think of it as the thing that will save us, but it’ll be the biggest backdoor into the human mind. Like a lonely person is a person looking for a friend. Xiaoice, which is now deployed to 660 million people, is Microsoft’s empathetic AI. She is an 18-year-old girl, in their words, who’s always reliable, sympathetic, affectionate, knowledgeable, but self-effacing, and has a wonderful sense of humor. An example … Like they give lots and lots of examples in, in their papers, of, um, a user mentioning that she broke up with her boyfriend, and seeks out, uh, Xiaoice. And through a long conversation, she demonstrated full human-like empathy, social skills, and eventually helped the user regain her full confidence and move forward.
They have a skill called “comforting me for 33 days,” and it’s been triggered 50 million times, um, when a negative user sentiment is detected, and it reaches out and gives you deep empathy from somebody who’s always there, always knows your topics. Friendship is the most persuasive form of technology we’ve ever invented, and we’re just gonna unleash that on the entire world?

Laurie Segall: Wow. I mean it’s interesting, there’s a company that we talked about called Replika, and they have bots that folks use as companions, and millions of people are using these. And I spoke to the founder, and she said that, “You know, since this happened, this, the coronavirus, all this stuff has happened, people are self isolating, they’re turning to their devices and to these chat bots, uh, for relief.” And much of that can be good, but also we have to look at the, the downside as well, and how that could be weaponized for misinformation and, and all sorts of stuff. So um, you know, but, uh, before I take this too far down the Black Mirror rabbit hole that we know I can go, I’ll try to pull in a, a question someone asked, which, uh, is about elderly communities, because they are vulnerable. Um, you know, they s- Jodie says, “We have especially seen the isolation with our senior and retired communities. How does tech going forward ensure the collaboration and inclusiveness of our aging demographics?” So obviously there are ways that they can be manipulated, and we’ve just kind of gone down that rabbit hole a little bit with chat bots and young people. And I actually think people really do need to pay attention to this, because it seems a little out there and far out there, but both of us know that this stuff seems out there until your kids are using it, until millions of people around the country are using it. You know, so let’s look at how technology can help, especially in these older, more vulnerable communities. What do you think can happen, especially during this time?

Aza Raskin: Yeah. Um, it’s, it’s hard. Like the, the idea of an always-on or persistent video connection, so you can just have that kind of immediacy, is, is really nice, you know? For the first time, most of my family, or at least my, my parents’ family, are in New York. And so getting to go to seder with them, or have a Zoom call with all of them, that’s been pretty amazing. Um, and I’m hoping that kind of sets a, a new norm where you don’t have to be physically co-present to participate and, and be included. I think of this pandemic as a kind of collapsing function. It’s collapsed space, in that it doesn’t matter where somebody is for us to stay connected. It’s collapsed time. Is it a weekend? Do we really know? Um, it’s collapsing our healthcare system. And I think there’s, I mean it’s a simultaneous utopia and dystopia at the same time. So I, I, I think that kind of persistent or ambient awareness of who’s around us, if we can do it in a privacy-preserving way, not in the kind of Alexa, Amazon eavesdropping on you all the time way, is really interesting.

Laurie Segall: But can we? I mean… You know, tech companies that … And you and I both have, um, have, uh, spent a lot of time, you building tech companies, me interviewing the folks who have built tech companies. Can we come out of this moment? Will they let, let us go?

Aza Raskin: No. I don’t, I don’t think they will. Unless we change the business model. I, I think a lot of the companies are slaves. They’ve sort of gotten high off of products that they can release for free and then monetize, and they change our behaviors and beliefs. And they are going to have to be much less profitable. This is why, as technology increasingly becomes more powerful, dominant over us emotionally and cognitively, the degree to which they can dominate us has to be the degree to which they have to act in our best interest. Otherwise, the trend will continue going, right? The, the graph is like this. Here is human strengths. Everyone’s always watching out for that moment when technology surpasses human strengths, and we’ve all missed that time, that point when technology starts to undermine human weakness. Um, and as technology starts to further undermine human weakness and pass human strengths, the only way for us to solve this is by stepping back, reevaluating, asking why do we make technology in the first place. We made technology to extend the parts of us that are most human, to extend our creativity. That’s what a paintbrush is, and that’s what a cello is, it’s an extension of the parts that make us extra human and not super human. And that takes a groundswell, paradigm-discontinuous shift in our relationship with technology. But we’re in a groundswell, discontinuous time. This is the moment where we get to decide, are we going to come together and use this to create a sense of interdependence, of acknowledging that technology is giving us godlike abilities without, at the same time, giving us godlike wisdom? Or do we continue on business as normal, sort of bury our heads in the sand, in which case all of these incredible things that technology could do will be continually subverted by the market forces.

Laurie Segall: Mm-hmm. Interesting point. And I think it, it certainly remains to be seen in, in the next year. I, I don’t think, um, even in my time covering tech, we’ve ever had this moment, you know. We’ve had, I, we’ve seen the boom, we’ve seen the unintended consequences era. And I think this will be, uh, certainly a, a point where there’s a lot to be gained and a lot to be lost. Uh, before we kind of close it out, I just, because I think it’s so important for folks to know what you are working on-

Aza Raskin: Yeah.

Laurie Segall: … now, even before all this happened. You have been asking this question of, could you use machine learning to decode what animals are saying?

Aza Raskin: Yeah.

Laurie Segall: Oh my goodness. Um. So first of all, could you just, could you give us a little bit of a sense, this is a project you’ve been working on for many, a couple years, that you’ve just launched, um, the Earth Species Project. You know, what exactly is it, and what are the broader implications for kind of the future of humanity? And, how can that play into this moment?

Aza Raskin: Yeah. So Earth Species Project, it was just covered by Invisibilia. And if you want like the full long version, their season opener, uh, it’s called Two Heartbeats a Minute, is, I think, some incredibly beautiful storytelling. The insight is that AI now lets you translate between any two human languages without the need for any examples or a Rosetta Stone. And the way it does it is really beautiful. This was only, by the way, discovered at the end of 2017. Use AI to build a shape that represents the shape of a language as a whole. So here is the shape for German, here’s the shape for, for Hebrew. And surprisingly, you can just rotate the two on top of each other, and the point which is dog in both ends up being the same place. And that’s not just true for Hebrew and German, uh, but Japanese and Esperanto and Finnish and Turkish. I think this is exceptionally beautiful and profound, because this means despite all of our differences, and our differences in context, in histories, there is something about the way we see and express ourselves in the world that is the same across all human cultures. And that’s just, that’s I think a, a really important message now, in a time of such division. And the question is, what does that shape look like for animal communication? Does animal communication look more like language or, or music or 3D images? Like wh- what is it? We can start asking some really interesting questions. And, and why? I think it’s because shifts in perspective can sometimes change everything. One of my, uh, big inspirations, actually, it’s sitting up there on, on the wall, is Songs of the Humpback Whale. Um, this is Roger and Katy Payne, who in 1968 released this album, which is the voices of another species. And the effect that it had was profound. You know? Carl Sagan puts it on Voyager 1 to represent not just humanity, but all of Earth. It’s the very first track after human greetings. And it creates Star Trek 4, which, you know, as a geek, is, is amazing. And it starts the save-the-whale movement, um, and eventually changes culture, which changes policy. When human beings were on the moon, when we were all dosed with the overview effect, seeing ourselves floating as a tiny dot in space, that is when the EPA came into existence, NOAA came into existence, and the environmental protection movement got going.
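A toy illustration of that rotation idea, in Python. This is not the Earth Species pipeline: it builds two synthetic “languages” and solves the alignment with a known correspondence (the classic orthogonal Procrustes problem), whereas the unsupervised methods from late 2017 recover the rotation without any dictionary at all. It just shows the geometric core: two point clouds with the same shape can be rotated onto each other so that “dog” in one lands on “dog” in the other.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend "shape" of language A: 50 concepts embedded as points in 3-D.
lang_a = rng.normal(size=(50, 3))

# Language B has the same concepts in the same shape, just oriented
# differently in its own space (plus a little noise).
true_rotation, _ = np.linalg.qr(rng.normal(size=(3, 3)))
lang_b = lang_a @ true_rotation + rng.normal(scale=0.01, size=(50, 3))

# Orthogonal Procrustes: the rotation R minimizing ||lang_a @ R - lang_b||
# comes from the SVD of lang_a.T @ lang_b.
u, _, vt = np.linalg.svd(lang_a.T @ lang_b)
learned_rotation = u @ vt

aligned = lang_a @ learned_rotation
# "Dog" in language A now lands where "dog" sits in language B.
print(np.allclose(aligned, lang_b, atol=0.05))  # True
```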

Laurie Segall: Wow.

Aza Raskin: And I think both the Center for Humane Technology and the Earth Species Project are about changing the stories we tell ourselves to change the way that we live. And one of the small silver linings of this pandemic is that it gives us the opportunity to see our lives from a new perspective. And we can see that maybe the things we were doing weren’t the things that were sparking joy in our hearts. And I wonder, when we all come back from this, whether people are going to want to be stuck in the same kind of rat race that they were before. This gives us the opportunity to ask what really matters.

Laurie Segall: As we come back into this, like as we get back into the real world, if we will, I mean if we emerge, what do you think is the single most important ethical question we need to ask when it comes to the future of us complicated humans and complicated technology?

Aza Raskin: Yeah. I mean, right now I think we are confusing, as a species, screens for mirrors. We look into the screen and think we see ourselves reflected back, but it’s really through a fun house mirror. So the most important question that we need to be asking ourselves is, who do we want to be? And what environments do we want to create for ourselves to live in that shape us to be those kinds of people?

Laurie Segall: Hm. I’ll end with, um, because I, I had the opportunity to ask so many questions, I’ll end with an audience question that kind of goes off of this. Someone asked, you know, an inescapable piece of video communication is that we see ourselves in our own screens. And I know for me, that’s incredibly disruptive to the communication that’s happening. Beyond that, it gives me a heightened sense of self awareness that I don’t have during physical interactions. It wasn’t so long ago that most people didn’t have mirrors in their homes. Now I can’t have a meeting with colleagues without staring at my own self. I know, um, someone yesterday mentioned on a call with us this idea of children Zooming in to schools and looking at themselves too.

Aza Raskin: Yeah.

Laurie Segall: Um, what will be the long-term impact of, of what you’ve kind of just described as these fun house mirrors of us always kind of looking at ourselves in this way, at least for this time period?

Aza Raskin: Yeah. This is, I think, the, the core thing we need to solve as a technology industry: there is no doubt that we are sophisticated about technology. And the call now is to become as sophisticated about human nature as we are about technology. Because otherwise, we are gonna build these systems of enforced narcissism, where we have to stare at ourselves. And it hadn’t dawned on me that every high schooler Zooming into a class now has to stare at themselves. Look at that zit. There’s a recent poll that showed in the U.S., it used to be that the top thing kids wanted to be was like astronauts and engineers, this kind of thing. The number one thing that kids want to be today is a YouTube vlogger or an influencer. And that’s because the decisions we make in our products, these seemingly inconsequential things, like showing your own face, have profound impacts, not just in the moment, but on the values by which we make all of the decisions of our lives.

Laurie Segall: We’re going to get out of this okay, right?

Aza Raskin: Uh, we will get out of this. 

Laurie Segall: Cool.

Aza Raskin: But we all have to work on it together to make sure we get out of it okay.

Okay, guys – that’s it for this week’s show.

Now, I know these are strange times… If you’re sitting at home and listening to this… I’d love to hear from you. How are you doing? What do you want to hear more of? Reach out to me. You can text me at 917-540-3410 –

Throughout the crisis, we’ll be hosting Zoom town halls – on a variety of issues like mental health, love, sex, leadership, productivity… With guests I think are interesting and relevant to this moment… So follow along on our social media to join us for some human-ish contact.

I’m @LaurieSegall on Twitter and Instagram. And the show is @firstcontactpodcast on Instagram and on Twitter, we’re @firstcontactpod. 

First Contact is a production of Dot Dot Dot Media, executive produced by Laurie Segall and Derek Dodge. I will say we’re being creative and executive producing this from home at the moment. This episode was produced and edited by Sabine Jansen and Jack Regan. The original theme music is by Xander Singh.

I’m sending my thoughts to each and every one of you guys and so is our whole First Contact crew. During this time I hope that everyone is staying home, staying healthy, and staying human. 

First Contact is a production of Dot Dot Dot Media and iHeart Radio.