This AI Could Spot Depression Before Friends and Family… And Act On It

Oftentimes our friends, family, and partners only see one side of us. They see what we want them to see. But what if an AI could see *all* of our communication… and get to know us better than any one person? Could it determine how we’re feeling? Even offer guidance when we need help…?

Es Lee is a data scientist and founder of a company called Mei. He hopes his technology will be widely used to monitor mental health. Right now, Mei analyzes text messages…

Read an edited transcript below, or listen to the full interview on the First Contact podcast.

It doesn’t matter if you have Mother Teresa in your phone. If she doesn’t reply to your text messages, she’s not gonna be able to help you.

Laurie Segall: If Mei is looking through a lot of these messages and seeing people at these very vulnerable moments… You must really see some people in pain, saying some pretty sad things about depression or suicide. What’s the responsibility of the AI?

Es Lee: Yeah. I mean, that’s a good question. I don’t think anybody really has a good answer for that… That was one of the first things that we turned to when we go, “Hey. If the AI is able to look at your conversations with everybody, your parents, your girlfriend, your best friend, it’s able to kinda understand things about you from a perspective that really nobody else has. Right?” You only get to see somebody in a certain dimension, whereas we, you know, we have a 360 view. And we kind of theorized that, hey, maybe we can actually, you know, figure out patterns. And so we started looking into conversations. And for the longest time, I was looking for another start-up in mental health or some data set where I… It’s kinda morbid to say, but could I get the text messaging history of somebody who killed themselves? Because we have the analytics tools to kind of comb through the history. And what if we were able to use that to figure out patterns that preceded the actual act?

And so, I talked to many, many psychologists and start-up companies, and the answer was always “We don’t know,” or, “We don’t have the data, and even if we did have the data, privacy regulation is keeping us from doing anything.” And then it dawned on me, well, we have conversations of 150,000 people. What if we just search through these messages and try to look for the phrase “tried to kill myself”? We actually found about 10 occurrences of that where it was from the user and not a contact. So we had a lot more information on them.
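In practical terms, the search Lee describes is straightforward: scan each user’s message history for a phrase and keep only the messages the user sent, not the ones they received. Below is a minimal sketch of that idea in Python; the message format (dictionaries with "sender" and "text" fields) and the function name are assumptions for illustration, not Mei’s actual data model or code.

```python
# Hypothetical sketch: scan a message history for a phrase and keep only
# hits written by the user (not by a contact). The schema is assumed.
from typing import Iterable


def find_user_mentions(messages: Iterable[dict], phrase: str) -> list[dict]:
    """Return messages containing `phrase` that were sent by the user."""
    phrase = phrase.lower()
    return [
        m for m in messages
        if m.get("sender") == "user" and phrase in m.get("text", "").lower()
    ]


# Toy example
history = [
    {"sender": "contact", "text": "How have you been?"},
    {"sender": "user", "text": "Sorry I've been MIA, going through a rough patch."},
]
hits = find_user_mentions(history, "rough patch")  # -> the second message
```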

Those 10 actually included a date. So one of them was “Hey, you know, sorry I’ve been MIA. I’ve been going through a rough patch and I actually tried to kill myself two weekends ago. But all I managed to do was down a bottle of pills and send myself to the hospital.” So you know that person tried. Right? So with that data, we were now able to look at the patterns. And some of the things that we found were actually really, really interesting.

Laurie Segall: …How do you feel when you’re looking at messages and someone is saying, “On this date, I tried to kill myself”? Like, I don’t know really what the question is other than… as a founder and with your responsibility and your technology, how does that make you feel?

Es Lee: …By the way, we don’t know the identity of the people ’cause we never ask them for their name. Right? I think it was an exercise in empathy, and if everybody had the opportunity to go through it, I would suggest it. Because going through the interactions of some of these people with everybody in their life, it felt like a soap opera… It was exhausting because imagine you know that this person was gonna go through hardship a month from now when you’re reading the conversations of them crying out for help to people. Right? It was kinda heartbreaking. And there were some conversations where this person came in and just started listening to them, checking in with them, and saying, “Hey, how are you doing?” I just remember mentally cheering at that point, like, “Thank God for you.”

…If we could use these algorithms to find that person that we could recognize cared about you the most but, you know, also was depressive, or could be empathetic to you… And it doesn’t matter if you have Mother Teresa in your phone. If she doesn’t reply to your text messages, she’s not gonna be able to help you. So we needed the algorithms to find somebody who tries a lot harder to communicate with you than you do with them. And so the algorithm goes through all that and finds the person that might be right for you and suggests that you reach out to this person…
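One way to read “somebody who tries a lot harder to communicate with you than you do with them” is as an asymmetry score over each contact’s message counts: how much more they reach out to you than you reach out to them. The sketch below illustrates that reading; it is an assumed formulation, not Mei’s actual algorithm, and the message format is hypothetical.

```python
# Hypothetical sketch: rank contacts by how much more they message the
# user than the user messages them, then suggest the top contact.
from collections import Counter
from typing import Optional


def suggest_contact(messages: list[dict]) -> Optional[str]:
    """Suggest the contact whose outreach most exceeds the user's replies.

    Each message is assumed to look like
    {"contact": "Alex", "direction": "incoming" or "outgoing"}.
    """
    incoming = Counter(m["contact"] for m in messages if m["direction"] == "incoming")
    outgoing = Counter(m["contact"] for m in messages if m["direction"] == "outgoing")

    best, best_score = None, 0
    for contact, received in incoming.items():
        sent = outgoing.get(contact, 0)
        score = received - sent  # how much harder they try than you do
        if score > best_score:
            best, best_score = contact, score
    return best


# Toy example: Alex reaches out twice and hears back once, so Alex is suggested.
history = [
    {"contact": "Alex", "direction": "incoming"},
    {"contact": "Alex", "direction": "incoming"},
    {"contact": "Alex", "direction": "outgoing"},
]
print(suggest_contact(history))  # -> "Alex"
```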

Laurie Segall: But doesn’t that feel like playing God, rolling the dice?

Es Lee: I think… It’s normal to see a person in need and say, “Hey, um, I think I could help that person.” I feel like it’s a responsibility, in a way, then. Right? Because I think in society, when you know somebody is in need, a responsibility grows out of that to help them. Right? So I actually do think that it is the responsibility of the people who have access to something like this.