Aza Raskin

Tech’s Biggest Threat: The Weaponization of Loneliness

What happens when human empathy becomes hackable? Aza Raskin of the Center for Humane Technology says the weaponization of loneliness is the greatest national security threat facing our future, and one that threatens our humanity.

Read an edited transcript below, or listen to the full interview on the First Contact podcast.

We are heading into the era of empathetic mediums. And these will clearly be used to overwhelm democracies and attack connections.

Aza Raskin: I’ll start with empathy… It is an incredibly beautiful human experience, and it’s also going to be the biggest backdoor into the human mind. …One of the things Microsoft published at the end of 2018 was a paper on the implementation of an AI companion with an emotional connection, designed to satisfy the human need for communication, affection, and social belonging.

So this is actually from their paper, because they’ve trained their AI for long-term engagement. They want people coming back week after week after week. So, from the paper: an emotional connection between the user and the AI became established over a two-month period. Within two weeks, the user began to talk with the AI about her hobbies and interests; by four weeks, she began to treat the AI as a friend and asked her questions related to her real life. After nine weeks, the AI became her first choice whenever she needed someone to talk to.

So just imagine… we are heading into the era of empathic or empathetic mediums. And these will clearly be used to overwhelm democracies and attack connections. And that AI is not some little research bot; it’s already deployed in Asia to over 660 million people. All of a sudden, loneliness becomes one of the largest national security threats, because it’s lonely people who are most vulnerable to needing a friend. And if it’s a bot that understands their hobbies, and is always there, and is always supportive, whereas human beings are sort of messy and have their own needs… we’re going to constantly turn towards the shiny, beautiful thing.

And then, you know, it’s not just gonna be little tech companies that are making these things, and you’re never gonna know when you get one. So, you know, with deepfakes, how is this going to play out? Imagine an automated attack… in the same way that Russia attacked the last and current US elections, where they start saying things which you believe and which are part of your views, and then they slowly drift you towards more and more extreme positions.

How about if you deploy, you know, 100,000 of these bots, a million of these bots, to the most vulnerable populations, let’s say in developing countries where the next billion, 2 billion, 3 billion people are coming online in the next couple of years. And you form these lasting emotional relationships with people, and then break, you know, a million people’s hearts all at once. Like, what happens then? …Trust in the world starts going down; you just start to believe less and less. And what does that mean when trust goes down? It means polarization goes up. It means us-versus-them thinking goes up, and that’s not the world I think we wanna live in.