First Contact Transcript

Episode 11: Sam Altman: Growing Up Silicon Valley

First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeartRadio. 

Sam Altman: One thing where I do think Silicon Valley gets a little bit dishonest is, saying like, “Okay. We’re going to build these incredibly powerful systems. They’re going to do all these wonderful things for us on this incredible exponential curve of technology. And then right when we want it to stop, it’s gonna stop.”

Tech years are like dog years – a lot happens in a short time.

I want to take you back ten years ago, this one is personal for me. 

One of the first people I met when I started covering technology around 2010 was Sam Altman. 

We were both in our early twenties. It was before tech exploded and left us with some pretty complicated questions about the future.

Back then everyone was hustling. We were coming out of the recession, the App Store had launched, and a new creative class was emerging… It wasn’t cool to go to work on Wall Street anymore. You could just have an idea and code it into the hands of millions…

I met Sam at a conference called South by Southwest. He had a startup at the time called Loopt, a location-based social networking app for your phone. That feels so long ago if you know his trajectory. 

Sam for me represents so much of what I find fascinating about tech – Success and failure. Big ideas. Blind ambition. 

Sam became a fixture in Silicon Valley. He ran Y Combinator, which is one of the most valuable incubators in Silicon Valley. His next act is called OpenAI, an initiative he started with Elon Musk.

Sam personifies what it was like to come of age in Silicon Valley.

His first company, Loopt, was ahead of its time. Technically, it failed – and sold in 2012. But Sam went on to become one of the most prominent voices in the Bay Area. It’s like his failure was a springboard for success, but he’s also someone driven by an inability to stay in the lines. He isn’t afraid to stand up and say things that might get him into trouble. He has a history of taking a stand under bright lights at a podium. You’ll hear that. 

That was always the spirit of Silicon Valley. But that spirit has been compromised. And there were some glaring omissions in the exponential curve that drove us to this moment in tech. 

I’m Laurie Segall and this is First Contact 

Laurie Segall: Okay.

Sam Altman: Okay.

Laurie Segall: Let’s start. So, the show is called First Contact. And what I love to do, given my history of covering technology, especially when I have people that I’ve known for many years, is go back and talk about our first contact. Do you remember our first contact?

Sam Altman: South by Southwest in 2008, ’09? ’07?

Laurie Segall: I think it was, I want to say it must have been 2009 or 2010.

Sam Altman: Okay.

Laurie Segall: You, Sam Altman, who are now kind of royalty of Silicon Valley, you know, President of Y Combinator and now, um, running OpenAI. You had founded, uh, Loopt.

Sam Altman: Yeah.

Laurie Segall: And I remember, our first contact. I don’t even know if you realized this but I was just starting at CNN and I was just interested in technology. And I was a production assistant pretending to be a producer who had convinced someone to let me bring my best friend to South by Southwest with a camera and put-

Sam Altman: But didn’t you like buy your own plane ticket or something?

Laurie Segall: I, yeah and I like slept in like a bed with my friend. Like-

Sam Altman: Right.

Laurie Segall: … We literally, like a- and I w- I was hustling.

Sam Altman: I remember being very impressed by this.

Laurie Segall: I mean, I can’t believe I told you. Did I tell you that?

Sam Altman: You did.

Laurie Segall: I’m super upset by that. Um, yeah. But, but it was like this certain moment where I would have done anything-

Sam Altman: Yeah.

Laurie Segall: Fast forward, I became our senior technology correspondent for CNN. I was on camera for a decade but like, I wasn’t even on camera then. I would have done anything including pay my way to South by Southwest, and put people like you on camera because it’s such an extraordinary moment.

Sam Altman: There was this extraordinary moment, there was a legitimate new platform, that kicked off a ton of startups. At the same time, startups were easier to start by less experienced people than ever before. So you had like the intersection of the iPhone kicking off this new platform of mobile of which we still really haven’t had a new platform of that level of significance since.

And then a sort of technological and cultural moment where you had things like AWS where all of a sudden, people could easily start startups and a cultural moment where they wanted to-

Laurie Segall: Right.

Sam Altman: … You know, kind of coming out of the financial crisis. And it was this magic period that I’m super thankful to have like witnessed and been in and been a part of, where it was sort of, the world changed very fast. And it mostly got changed by people deciding they were going to start companies.

Laurie Segall: Yeah. And, and I think what, what’s really interesting to me is, you know, fast forward, I think it was maybe that same year, you were my first on camera interview. It was in Bryant Park and we were talking about Loopt and it was like, the iPad had come out.

Sam Altman: Yeah.

Laurie Segall: I mean, by the way, I feel like 100 years old even saying these words, um, it ages me. ‘Cause I think you have to go back to the history in order to go to the future. And I think this is such like a crazy moment. And I remember being so nervous because it’s my first-

Sam Altman: 13 years ago, no iPhone.

Laurie Segall: … I feel like crazy. Um.

Sam Altman: It’s like in living, in easily in living memory, there were no smartphones that mattered.

Laurie Segall: Yeah. Ah-

Sam Altman: And now, it’s like very difficult to imagine life without that.

Laurie Segall: And that’s the power of technology, right?

Sam Altman: It’s a, but it’s an incredible, real time example, yes.

Laurie Segall: Yeah. And, and so for you, it’s, it’s always very special. Just because you very much represented for me a first, first entrance into technology, a first entrance into being on camera, uh, which helped shape my career. So not, not to be whatever, I-

Sam Altman: No. It’s really-

Laurie Segall: … You know.

Sam Altman: … Yeah. That’s really sweet.

Laurie Segall: Um, it was really interesting to, to me. And, and it’s been fascinating to watch you. Because now that was a decade ago.

Sam Altman: Yeah.

Laurie Segall: I would show you the video but like, I’m appalled by the fa- I just kept wearing this black blazer over and over and over again, which I’m just like appalled by. Um, so-

Sam Altman: That’s a good reason not to look at old videos.

Laurie Segall: Yeah, totally. And, you know, fast forward, you went on to, so Loopt was the-

Sam Altman: Yeah.

Laurie Segall: The first company funded by Y Combinator. You became the President of Y Combinator, which is like the most prominent incubator in Silicon Valley responsible for Reddit, Dropbox, Airbnb. I mean, the history kind of tells itself. But I remember this moment with you also became like this startup Yoda. Like-

Sam Altman: Y-

Laurie Segall: … You have this blog that like everyone like asks for your advice and you like became this massively influential startup person just like at the center of all of it. Everyone wanted you to fund their startups. Everyone wanted your advice. You know, that kind of happened in the time I knew you. And then I couldn’t really go to coffee shops with you anymore. You became like startup famous or something. (laughs)

Sam Altman: Uh, yeah. Extremely famous but in an extremely narrow world.

Laurie Segall: Right but super interesting. So that’s the, I guess-

Sam Altman: Well, startups really became this cultural moment for a brief period of time, and I think now it’s shifted. It became the thing that the most ambitious, you know, people wanted to do. Whether or not they cared about technology, it was like, this is where it’s happening. This is how I can sort of most quickly have a big impact on the world. And Y Combinator was, as you just said, sort of at the center of that. And so I had that reflected, whatever.

Laurie Segall: I want to kind of go back to your experience in Y Combinator to kind of go and look towards the future. I mean, you became the President of Y Combinator. And you just like saw everything, right?

Sam Altman: Yeah.

Laurie Segall: Like, tell me, like, you write all of these posts about like the most successful people you’ve seen. Like, you, you are like the key of like, it’s like you do like pattern recognition. Like what, what would be like your biggest takeaway having watched success and failure and raw emotion of building?

Sam Altman: Um, well I think a lot has gone wrong, which everyone talks about, but a lot has gone right as well. And that’s kind of, you’re not really supposed to like say that anymore. And one of the things that I think is really great about Silicon Valley is people, people don’t have to have a lot of credentials. They can have failed many times in many previous startups. And as long as they’re really right once, as long as they eventually do something where they build a product that people really love, that somehow is useful to people in an important way.

That still drives the world forward. And people can still have sort of great careers that way. And I’ve never seen anywhere else where that happens. And so, looking for people who are determined and visionary and bold and willful. Willful is, like, I think the most important word that I’ve learned to describe startup founders, if you have to just pick one.

Laurie Segall: Why?

Sam Altman: Because if you have a way to bend the world towards your will, most of the time, you will eventually be successful. You may get knocked down a lot of times. Like most people do. I certainly had one quite difficult and long failure of a startup. Um, but if you just keep going and if you just figure out how to make things happen and don’t, and don’t give up and have this sort of strong vision of what you want to do in the world and how you want to see the world be, you can eventually accomplish great things.

Laurie Segall: Where do you think that comes from for you? So I, I was looking back, and every person I’ve ever interviewed, it’s like, someone went through something. (laughs) Right?

Sam Altman: (laughs)

Laurie Segall: Anyone successful who I know has like gone through some kind of like, you didn’t fit when you were younger. What do you think it was for you-

Sam Altman: Yeah. I mean, the-

Laurie Segall: … To pinpoint it?

Sam Altman: … The, the question is always like, uh, are people like that? Is it genetic or is it from childhood trauma?

Laurie Segall: Mm-hmm (affirmative)

Sam Altman: Probably, the answer is both. Um, I had, all things considered, a- Like, I can’t point to like, oh, like, “I was always doing this to like get my father’s approval or whatever.” Like, I, I had a wonderful childhood. Um, I, I don’t have like a single moment I can point to where like, “This traumatic thing happened to me. And like that’s why I work super hard.”

Laurie Segall: Right.

Sam Altman: Um, maybe it did and I just don’t see it.

Laurie Segall: But there’s something about you. Like, I was reading, I mean, I think it’s extraordinary, uh, that you, you grew up in Missouri.

Sam Altman: Yeah.

Laurie Segall: Right? I think it’s extraordinary. There’s, um, something I was reading that, in high school, uh, I think there’s something I go back to that I, I read, um, that you did that I think was very defining of who you are as kind of-

Sam Altman: Yeah.

Laurie Segall: … A person and what kind of divides the people who kinda stand by and someone who actually like says something in an extraordinary way. Can you tell us that story?

Sam Altman: So, there had been one sort of semi openly gay guy in my high school before me. Um, and he had, I think, had an okay time but was happy to leave. We were in a school that was relatively conservative by current standards but, for St. Louis at that time, pretty open-minded. And somehow, someone had invited like a mother of a gay guy to come give a talk for like some national coming out day, or something. And some students really objected to that. I think mostly on a religious basis but also just like a, gay people are bad, basis.

And so, kind of like made a lot of noise about how they weren’t going to go to assembly that day. And it was disgusting the school’s doing this and all of that. And I was at the time the only sort of like gay person I knew of at my high school, at least openly. I think it then changed real quickly after that. And so, we had this thing that you could do where sort of anyone who wanted could give like a five-minute speech to the whole school any morning when we’re all together for assembly.

So, I went home and I wrote the speech that I wanted to give about like, this is a different time. Sure, like you can like think gay people are like, bad news. But, uh, if we can’t talk about these issues without, every time someone feels like they don’t like something, they’re going to like organize a protest and talk about how we have this like awful, unsafe environment of decay or whatever. I mean, the, the language was just egregious. Um, that that was like not the kind of environment we should hold ourselves to.

And now, it’s like the kind of speech that anyone could sort of give at probably any high school anywhere in the US, even in pretty rough parts of it-

Laurie Segall: Mm-hmm (affirmative)

Sam Altman: … And would be okay. But at the time, it felt like a very scary thing to do. And I remember sort of like being up all night. I think I just fitfully dozed a couple of hours the night before. But, I was so nervous about it. I don’t really like get nervous for stuff and I was so nervous to do this. Because I was like, mostly out. Like most people knew about it but it was not the kind of school where you would really stand up and like talk about being gay and that was okay. And I almost didn’t go through with it.

I almost like changed the speech to like not put my own personal story in there.  I remember like very clearly sitting in the hallway outside of assembly with this pen, changing like the section where I told my own story back and forth like three times.

Laurie Segall: Wow.

Sam Altman: ‘Cause I just couldn’t do it. And then I don’t even know why but I finally was like, “You know what? Like, I’ll just tell everybody I’m gay. Whatever. Like, what’s going to happen to me?” And then like, it was sort of this moment where you kind of just go on autopilot. But I remember, I was like 17. This is like half of my lifetime ago. So, uh, it’s amazing how clearly I still remember this. But I remember sort of like walking out to the podium and there’s sort of these bright lights. And thankfully, you can’t really see individual faces out there.

And I hadn’t really told the administrators what I was going to talk about. They probably had a guess. They were clearly nervous. I remember like the head of school was like, “So, you’re going to give a sound off today, huh?” And I was like, “Yeah, I am.” And he’s like-

Laurie Segall: Mm-hmm

Sam Altman: … He’s basically like, “Please don’t make a meltdown for me.” We were close, I was very close to him. And I gave this speech. And, uh, I don’t say this out of like self-deprecation or anything but I’m not a good public speaker. I have a lot of other talents. That’s not, that’s not one of them. But I felt so passionate about this one that it went like much better than it would normally go if I stood up and gave a speech. It surprised me how well it went. And then the audience are my classmates and people in the younger grades.

Sam Altman: Like, I got a long standing ovation out of it and sort of all day at school that day, people telling me like how much it meant to them. And that they really thought I was right. I had some like younger students come up to me like in tears, and one had been like almost suicidal from the thing earlier, sort of the protest. And then after that, I think a bunch of people came out and it was sort of a different environment. So it was this really great moment. I’m happy I did it but it was, uh, it was so terrifying to do at the time.

Laurie Segall: How did you feel?

Sam Altman: When it was done, I felt relief. Uh, I was like, whether that was a good or a bad thing, I was so nervous and now it’s over. And I did the thing that I think is right and now whatever happens happens.

Laurie Segall: You remember like the first lines of it at all?

Sam Altman: I don’t. I don’t. Um, I remember the, the last lines were just about like, either you have a tolerant, open community or you don’t and you don’t get to pick and choose-

Laurie Segall: Mm-hmm. Wow.

Sam Altman: … Which, which topics you’re tolerant on.

Laurie Segall: Mm-hmm (affirmative) Wow.

Sam Altman: Like again, I still feel like, you know what? Um, if people have like a problem with gay people, that’s fine. Like they can go somewhere else. Like the (laughs) this thing that Condoleezza Rice said that her mother told her once has really stuck with me, which is like, “If people don’t want to sit next to you because you’re black, that’s fine as long as they’re the ones that get up and move.” And so, you know, still, like if people have a problem, I’m sure people do, with me for being gay, like whatever.

Sam Altman: They can go do their thing. But, but you don’t get to create an intolerant community and I still feel really strongly about that.

Laurie Segall: Do you feel like here in Silicon Valley, it’s pretty tolerant?

Sam Altman: Yeah. I mean, I, I’m sure there are these like subtle ways in which it’s still not. I, I’ve seen some of them. Um, but I don’t like, I don’t believe that you should live your life obsessing over all of the things that have gone wrong or sort of ways you’ve been slighted. I think you just sort of move forward in whatever way you can and um-

Laurie Segall: Mm-hmm

Sam Altman: … I think that’s just a better way to live. But yes, there are definitely ways in which it’s, I think, still not as good as, as we’d like. I mean, this like, this same impulse has gotten me in, uh, several internet wars where I say the things that I believe and you know, Twitter doesn’t like it.

Laurie Segall: Mm-hmm

Sam Altman: Um, but I do believe you have to at some point either stand up for what you believe in and have the courage of your convictions, or you just let the world get worse. And that goes from matters of justice, which is one that I thought about here, to things like saying, um, I believe that AI is going to transform human society and you’re not supposed to say that out loud. And if you talk too much about the future, in the community of AI researchers you’re kind of viewed as somewhere between a little bit of a scam and, like, actively quite evil.

And I may be wrong but what I actually believe is we’re going to create this technology that is more transformative than any technology humans have ever created and almost no one is paying attention or talking about it. And you can either do what most people do, which is say, “Okay. The societal norm is I’m supposed to just put my head down and not talk about it.” Or say, “I really believe, I may be wrong but I really genuinely believe that this is going to change the world in unrecognizable ways.”

It’s going to make what we talked about happening earlier with the iPhone look like a warmup. And we’ve got to talk about that.

Laurie Segall: Yeah. And so, I mean, when we met, the iPhone had changed everything. So, now, you having spent, how many years did you spend at Y Combinator?

Sam Altman: Um, I ran it for like six but I was there helping out-

Laurie Segall: Yeah.

Sam Altman: … For basically since the beginning.

Laurie Segall: So you said even before when we came in here, you’ve seen something like 3500 startups?

Sam Altman: Something like that.

Laurie Segall: Right? Come through and completely transform societies. 

Laurie Segall: So you left Y Combinator and you went to start OpenAI. So can you explain a little bit about what that is?

Sam Altman: Well actually they, they overlapped. So while I was at YC, one of the areas that I was most interested in that I still am. It’s, it’s still the category of startups that I have the most passion for is how can we have more of these sort of moon shot startups that work on a difficult piece of technology that has massive societal implications if it works? Nuclear fusion, any of the other climate change efforts, artificial intelligence, synthetic biology, space colonization, these huge things. And we either helped fund or helped start a number of them.

While I was at YC, one of them was OpenAI. I wish I could tell the version of the story which is, OpenAI was always the one that I knew was going to work phenomenally well. Uh, that’s not true. It was, you know, like, many others that we tried failed. Some worked and this one really worked. What is true is this was the one I was most passionate about. Um, when I was 18, I wrote down a list of the problems that I most wanted to work on. It was like a college assignment and the first one was build artificial intelligence.

Laurie Segall: Mm-hmm

Sam Altman: So, the passion had long been there but I had no idea it would work out as well as it has. And then as it kept going, it operated for a long time without any CEO. And as I realized, like, this is going to have implications for our society and collective humanity that I still can’t fully grasp but are going to be unimaginably huge, that was what I wanted to spend my time working on and that still is. And, and over time, it was sort of a fairly gradual transition where I did a little bit of that and then both kind of 50/50 and then sort of just moved over.

But the goal of OpenAI is to build general artificial intelligence. A computer that can think like a human in every way and use that for the maximal benefit of humanity.

Laurie Segall: It’s interesting. It just seems to me that you’re still to a degree, the guy in the gym with the bright lights on you, just saying something. And putting yourself out there in some way hoping and, and not necessarily caring but hoping that people are going to just understand or feel less alone or something. There’s like some person in you that’s still-

Sam Altman: (laughs)

Laurie Segall: … Out there, putting yourself out there in some capacity saying, “This is how it should be. Or this is, you know, this is who we are.” And to a degree.

Sam Altman: Yeah. And now the stakes feel really high.

Ok we’ve got to take a quick break to hear from our sponsors. More with my guest, after the break. 

Laurie Segall: So what do you think the next thing is that we need to be thinking about that we’re not? And, and I also want to get into this moment where, I think both of us feel this where when we started, there was this optimism.

Sam Altman: Optimism, yeah.

Laurie Segall: And everyone is like super excited about tech. You’re talking to the girl that paid her own way to South by Southwest and would have done anything to put these people on camera. And now, it’s like the pendulum has come all the way to the other side.

Sam Altman: Yeah. I mean, I know it’s true. Intellectually, I know it’s true because I remember it and I was there when everyone was optimistic about this. But it feels, it feels like it can’t really have been true relative to now when it’s like, everything is like, “Tech is awful.” And I do think the industry as a whole made some real mistakes of commission, right? People did things that they kind of knew were bad.

Laurie Segall: Like what?

Sam Altman: Um, I would say the worst was services that were willing to optimize for engagement at the cost of huge side effects.

Laurie Segall: Like, companies like Facebook and the business model.

Sam Altman: Um, in my head, what I was thinking of is like Twitter-

Laurie Segall: Right.

Sam Altman: … Seems to feed on outrage.

Laurie Segall: Right.

Sam Altman: And you could probably do a lot of things to change Twitter that would, again, I think the people working in all of these companies are actually good people and you’re not, you’re not supposed to say that either. I fundamentally believe that and I assume these, all of these services are going to change. I think we’re facing a moment where we just got through this explosion of growth and transformation of society. And we need new rules and new antibodies and new norms and new regulation. And it will catch up and we’ll get there.

But you know, like I think along the way, people were definitely like, “Well, I can do this thing and it’ll make us grow. And if I don’t do it, we’ll lose to some competitor.” And that is a bug with capitalism, that’s true. But I think we’re at this moment where the world is pretty great by a lot of metrics. Um, if you just look at the fall of extreme poverty or regular poverty globally in the last 30 years, um, we should all be celebrating. And if you look at what’s available, there’s a lot that we should be really happy about.

And we’re not and I think technology has a lot of blame there. But it’s not just the actions of the companies. It’s that as we said, in the last 13 years, the world has fundamentally transformed and society hasn’t caught up. And this happens, when technological revolutions happen. I assume it would have happened if we had been around for the previous ones. This is just the one that we get to live through.

And as hard as this has been, and as big as the changes have been, and as weird as the behavior is, like one thing that astonishes me is watching people who I think of as truly progressive saying that Facebook or other private sector companies should decide what the rules of free speech are. It’s like, what happened to the American spirit, the American values? I’m totally fine with there being an asterisk on free speech. You can’t yell fire in a crowded theater and social media’s a new kind of theater.

But I’d like the government to set those rules and the fact that we’re now in this world where there are just calls for the companies to do it themselves terrifies me. Anyway, as hard as all these issues are, I think what we’ve just been through is a small warm up for what we’re now on the brink of. And we didn’t even get it right on the warm up. And-

Laurie Segall: What is that? That, that’s not comforting, right?

Sam Altman: N- no. Um, and, and some-

Laurie Segall: Especially for someone who has like an insane instinct as to which companies (laughs) are going to be correct. That warning doesn’t, doesn’t leave me with, you know-

Sam Altman: Um.

Laurie Segall: … The warm and fuzzy.

Sam Altman: Well, it shouldn’t! Again, I don’t think it does a service for someone like me to say, “Oh, like, you know, there’s no more technological change on the horizon. Like, we had this big transition. That’s it.” You know, I think kind of what happens now is there’s like one news story about someone editing the germline of babies in China and everyone really stresses out for two days and then they forget about it. Um, or you see an example of powerful AI technology being used for something. That’s sort of an example of what’s to come and then they forget about it.

And this is somehow or other, we are on the brink of being the first species ever to design our own descendants. Maybe we do it by, something like Elon Musk’s Neuralink. Maybe we do it by editing the genome with Crispr. Maybe we do it by creating artificial digital intelligence. But this is not, this is like not a small thing. Like most things that people really get stressed about are sort of in, you know, yesterday’s newspaper, uh, not in history books. And this is one that’s going to be in the history books.

Laurie Segall: But, and I, I agree with you that these are the things that are coming down the pipeline.

Sam Altman: Yeah.

Laurie Segall: This is the long term view. The thing I worry about, having covered this for the last decade, it’s like, so let’s go with Elon Musk for an example and you’re close with Elon. You know him. Um, when we’re thinking about Neuralink, right? Like, and you’re thinking about these things are coming down the pipeline, you’re going to have a chip implanted in your brain. It’s going to make us smarter. This is going to be amazing. Like, are we thinking already about the unintended consequences? Will your thoughts be hacked? Will we create a super human species?

Like, what are the human costs of the technology? Because for me, having been a big cheerleader for you guys for a long time, and, and really caring about this technology, the thing that always to me seems to, to get lost in the conversation is what’s coming next. And these are the conversations happening behind closed doors. I know you’re at all the-

Sam Altman: Yeah.

Laurie Segall: … Silicon Valley dinner tables that everyone wants a ticket to, right? I don’t think people are talking about that stuff as the technology is being built. Are they?

Sam Altman: I, I, no. I, I, I think this is a huge, I think this is a huge deal. Um, and I think like one thing that happened is there is a relationship now between tech and the media, which I would describe as, uh-

Laurie Segall: Increasingly contentious?

Sam Altman: … Uh, I was going to just say like not that fun to be on the tech side of. And one way to respond to that is people just say, “Well, you know, I’m not going to keep talking to the media.” Um, and this means the conversation drops out of the sort of public view. You know, I think it’s personally the wrong approach. I, but I have sympathy for why people feel like, “I can’t get fair treatment of complex issues.” It’s certainly been frustrating for me at OpenAI.

No matter how careful we try to be to not hype a result of ours or how we try to talk about something, um, when the story runs, it always runs with, with just killer robots.

Laurie Segall: (laughs)

Sam Altman: And that’s frustrating to be honest. And you know, and yet we keep doing it.

Laurie Segall: Speaking of the media, I read this like, nice paragraph that I was going to quote to you about how you and Elon met at, um, the Rosewood Hotel. Is that?

Sam Altman: Oh, kind of the first dinner of-

Laurie Segall: The first dinner-

Sam Altman: … OpenAI getting started?

Laurie Segall: … Of OpenAI.

Sam Altman: Yeah.

Laurie Segall: Like, set the scene. It’s like the Rosewood Hotel for folks who don’t know is like this very fancy hotel where it has nice views and plentiful drinks. Um, and people in high heels and lots of like VCs. (laughs) Um.

Sam Altman: We just picked it because I think Elon was staying there that night.

Laurie Segall: And it was like you guys talking about like Armageddon. Is, this is, this was the article.

Sam Altman: Yeah.

Laurie Segall: Or like Armageddon could happen when it comes to the future of artificial intelligence and like, we have to do something. So like, take us, I, because-

Sam Altman: Yeah.

Laurie Segall: Take us to the table with Elon in the Rosewood Hotel, where we’re on Sand Hill Road.

Sam Altman: Yeah.

Laurie Segall: Um, what was that conversation?

Sam Altman: I mean, this is like one of many. Like, none of these, people, people love the narrative of like, there was this one conversation-

Laurie Segall: (laughs)

Sam Altman: … And then this company happened.

Laurie Segall: Okay. Well, that was in the public. So give us one that’s not out there.

Sam Altman: So, (laughs) um, I mean, I, I think the conversations that people, that sort of don’t, uh, they don’t have quite as good of a narrative but they’re the ones that matter are the 30 small conversations that happen in groups of two or three, sort of, you know, late at night. Where people are just like, of all the things we could do in the world, should we do this? And what does this even mean? Should it be a company like A, or like B? What is the research direction we would go after? What do we even think it’s going to take to build artificial intelligence successfully?

Like, what are our theories around that? And then, and then you at some point, um, say, "Okay. It is really important to do this. This is going to have a transformational effect on society." And in fact, that was back in, like, you know, started in 2016. Now it's the beginning of 2020. Uh, even what's possible four years later is, is, breaks my mental model of how fast progress can be. So we got that part right. But then the details of how we were going to make all this progress, we got largely wrong and so we've had to adapt and do new things along the way.

And the way that this really happens and, and the way that I think it matters is, some people meet each other or know each other and start talking and say, "Should we do this or should we not?" And at some point, you make a decision and say, "We're going to go ahead." And then you sort of jump off the cliff and you try to build the airplane on the way down. And whether we're successful or not remains to be seen. But the progress we've made has been faster than I thought it would be.

Laurie Segall: And what is the, the dream with OpenAI?

Sam Altman: Um, the medium term dream is that we figure out how to build intelligent systems. Systems that can learn, systems that can think and that can be useful to humans, and vastly increase everyone's standard of living. And I think, I think it is astonishing that we now have computers that can learn. Like we don't talk about this much because we've sort of all gotten used to it. But the one thing that I think really makes humans special and really makes sort of life special is this ability to, to learn and to think.

And once you get one algorithm that can do that, and it can do more and more of that as you scale it up and make it better, then the world is, is really going to change a lot. And in the medium term, I think it can change a lot in all of these ways where we have computers that can think and really do things that we need them to do. One thing where I do think Silicon Valley gets a little bit dishonest is, saying like, "Okay. Uh, we're going to build these incredibly powerful systems.

Sam Altman: They’re going to do all these wonderful things for us on this incredible exponential curve of technology. And then right when we want it to stop, it’s going to stop.”

Laurie Segall: Right.

Sam Altman: That seems unlikely. Not the way the world-

Laurie Segall: Right.

Sam Altman: … Usually goes. But you’re not supposed to talk about what happens when it keeps going.

Laurie Segall: Right. And-

Sam Altman: And I think we have a responsibility to talk about what happens when it keeps going. And I think we have some big societal decisions to make about what happens when it gets smarter than we are.

Laurie Segall: When do you think that’ll happen?

Sam Altman: In our lifetimes.

Laurie Segall: When, I, I mean, so what does the future look like? I mean, I know that you guys kind of, people were like, “Oh, they’re fear mongering when they talk about, uh, OpenAI and this and that.” And I thought it was interesting something you said. Um, you were speaking to, to someone about, if people had talked about Facebook like this, they would have been criticized. Like, you know, and, and, and brought up all this stuff like what if Facebook did all this like back in the day? And that’s kind of what you’re trying to do with artificial intelligence is like trying to like go and anticipate some of these things.

Sam Altman: Yeah.

Laurie Segall: What do you think worst case scenario looks like?

Sam Altman: Well, what I would say on the fear mongering point is, I think you can never make, you can never make all the people happy all the time. And so, what you'd better do is just the thing that you think is morally right. And you might be wrong, but you act with your best sort of intention. And I think that history, on this one, now I feel more confident history is going to be on our side, which is talking about how to make sure we get to the good future. Not saying, like, the future is going to be awful with killer robots or whatever. We don't actually say that.

Laurie Segall: (laughs)

Sam Altman: But saying that we have to do work to get to the good future.

Laurie Segall: Right.

Sam Altman: And that we have to think about this ahead of time. I think history is going to prove us right. And I think that people who are like, you know, it's all like, it's all always great. Don't worry about it. Look the other way. Uh, I think that's what kind of got some of the current social platforms into trouble. And had they thought a little bit more ahead of time, we may have avoided some of these mistakes. Yeah. I mean, like, long, long, long term.

I think if we do a good job and if the world goes the way we want, you kind of have this transition from purely biological humans to some sort of hybrid, merged humans and artificial and, and digital intelligence. And not everyone’s going to choose to do that and I think there will be some sort of world for people who don’t want-

Laurie Segall: What does all that mean? You just described like, so what does that, so what does that look like?

Sam Altman: Uh, the opt out part or the rest?

Laurie Segall: All of it. Like, just explain it a little bit more.

Sam Altman: Um, I mean, here's like one version of the world I could imagine, which is that, a lot of people, actually not a lot. Let's say, like, a small percentage of people. Uh, a lot in absolute numbers but not by percentage, say, "You know, I'm going to go all in on the future. I'm going to plug my brain in via Neuralink or whatever." That seems very scary to me, honestly, but I think a lot of people would-

Laurie Segall: Mm-hmm

Sam Altman: … Choose that and maybe I would at that point, too. And I’m going to merge with a copy of this AI and we’re going to, we like whatever this new thing is, this sort of, this hybrid, we’re going to go off exploring space. And sort of just be to a human, unimaginably smart and powerful and capable. In a way that a human today with all of the capabilities of an iPhone would seem like a magician to a human from just a few hundred years ago.

I think that exponential difference, we should expect to see even more powerfully on this exponential curve of technology.

Laurie Segall: Uh-huh.

Sam Altman: And I think what that means in terms of, like, power and capabilities is difficult for you and I to sit here and clearly imagine. But it's pretty unbounded. And then I think there will be some people who say, like, "You know what? I am opting out of that whole thing. I want to live out my life as a, you know, regular human." And there will be some way to do that where you live in some, maybe the whole world is just like an AI-free zone, I don't know. And AI goes off and takes the rest of the universe.

Laurie Segall: How about, um, I’ve been particularly interested. Well, this line is going to sound weird when I say it. I’ve been particularly interested in death. How do you think death is going to change and this idea of, you know, I’ve, I’ve done a lot on like bots through, I mean, this is very basic the like bots where we create a digital-

Sam Altman: Yeah.

Laurie Segall: … Version of ourselves. But I know that it’s been an obsession here in Silicon Valley. You know, the idea of replacing your blood with the blood of young people. That’s one thing but, but even beyond that. Like, what do you think death means in like 20 or 30 years?

Sam Altman: I think it’s a really interesting question. I- i- if you had a perfect copy of your brain, like if you got the Neuralink implant and downloaded every thought process, every memory, every emotion, say if you could like make a perfect copy of Laurie in a computer that was going to live-

Laurie Segall: Can I make some tweaks, Sam?

Sam Altman: Let’s say you can make some tweaks.

Laurie Segall: Okay. (laughs)

Sam Altman: Um, so there’s now this like tweaked version.

Laurie Segall: Mm-hmm

Sam Altman: And you know that you, your body is going to die but that copy of Laurie, which has all your memories, all your thoughts, it acts exactly like you because it is the surrogate of, in the extreme, a molecule-by-molecule copy simulated in software. Do you count that as you living forever? Do you care?

Laurie Segall: I mean, I do think that’s me living on in some way. I mean, you know, I think, it’s so weird. Not, not to bring like the human stuff, my mom was sick recently and I, and I was thinking like, um, that, I don’t know. There’s something so visceral about it. Maybe having like the connection with technology of like, looking at our text messages, looking at everything, like, we have so much life data, too that we’re leaving out here.

Sam Altman: Yeah.

Laurie Segall: Right? Like, I don’t know.

Sam Altman: I, I think I have a different opinion here than most and I'm sorry, I'm going to go full Silicon Valley tech bro for a minute.

Laurie Segall: Okay.

Sam Altman: But-

Laurie Segall: Just hit me with it.

Sam Altman: The mo- (laughs)

Laurie Segall: I’m, I’m prepped. Hit me with it.

Sam Altman: Um, one of the most valuable perspective shifts that has come out of what has now been sort of a long term and pretty intensive meditation practice for me, that’s the, that’s the cringe. I’m sorry.

Laurie Segall: Everyone, okay. Do it-

Sam Altman: Um-

Laurie Segall: … Hold on, wait. Are our listeners still with us? Okay. Go ahead. (laughing) I’m just kidding. Go ahead. (laughs)

Sam Altman: … Um, ha- has really been this, uh, arrival towards this certainty that, that I don't, I don't feel a separate self anymore. Um, I, I sort of, like, I view me as this system that takes input, runs it around in my brain and produces output. But when I really deeply look at that, I cannot find an egoic me anywhere in there. It's like, you know, I'm part of my environment. There's this model of the world in my head. Photons come in, action comes out. But, but there is no, there is no me.

There’s like this body, this mind, these thoughts but there’s no sort of like separate me outside of reality at the controls. And, if there is no you, there’s no other and then kind of, all of this sort of, all of the versions in the different philosophical traditions of the world of non-dualism end up being true in this weirdly basic way. Um, and if you, if that’s your operating model of how you think about the world, which is there’s a body. There’s a mind but there’s no entity at the controls of that.

There is no self, then I think you think differently about what it means to die or not. Or, like, if the thing in the computer is you or not.

Laurie Segall: Right.

Sam Altman: And so, all of these I think deep and old philosophical questions are newly relevant in a world where we actually can connect our brains to computers.

Laurie Segall: How would you feel about it?

Sam Altman: Um, it’d be something but like, as I said, if you don’t feel like there’s a you in the first place, it’s like, “Okay. There’s like another copy of this agent that-

Laurie Segall: Oh.

Sam Altman: … Sees, that reacts.”

Laurie Segall: Mm-hmm

Sam Altman: That takes input from the world and reacts.

Okay, we've got to take a quick break to hear from our sponsors. More with my guest after the break.

Laurie Segall: I ask every founder that I interview this question just because I think it's an important question. What do you think is the single most important ethical question we need to ask ourselves when it comes to the future of, like, us, tech and humans?

Sam Altman: Well, I think it’s this question about in a world where we’re going to have computers that can think like humans, what is the society we want to design?

Laurie Segall: Right.

Sam Altman: Um, I think there’s a lot of short term important questions.

Laurie Segall: Mm-hmm (affirmative)

Sam Altman: Empathy, one we talked about, I think is a big one. I think the absolute catastrophic failure of San Francisco to empathize with people who need help, for example, is a, is a big one. I'm, I'm ashamed in the deepest sense of the word about how San Francisco has dealt with homelessness, mental health and drug crises.

But I think longer term, measured on the sort of like geologic time scales, this question of what do we want the role of humans to be in the world and how do we make sure the world is good for humans in the broadest sense, uh, I think that is the biggest ethical question of our lifetime.

Laurie Segall: What do you think, uh-

Sam Altman: Bigger than even the inequality questions and everything else that feels huge today.

Laurie Segall: Wow, I mean, what do you think having spent this last decade and what do you think the next, as Silicon Valley, like, what, what happens next? Let's look at the current mess. We talked about the town square being-

Sam Altman: Yeah.

Laurie Segall: … Overrun. We talk about, you know, the, the days of technology being loved and all the great press around founders. Uh, you know, all of this has changed. We have people truly questioning Facebook, truly questioning Twitter. Truly questioning the business model of Silicon Valley. Where do you think we land?

Sam Altman: Um, on the whole, this industry cares more about doing the right thing for the world and making things better than say like, the finance industry.

But I think it is also true that there’s plenty of people here who are between somewhat and entirely motivated by making a bunch of money and don’t think about the consequences of doing so.

Laurie Segall: I want to go back a little bit to this idea that you are kind of like this startup whisperer. I think that a lot of the things you help people with are, are not just startup things that are just like universal human things. Um, and reading through your blog, you can just like write out things. You, you were saying like successful founders have almost too much self-belief. And you were kind of-

Sam Altman: Almost, the almost word there is really important.

Laurie Segall: … Right. Successful founders have-

Sam Altman: Right.

Laurie Segall: … Almost-

Sam Altman: Right.

Laurie Segall: … Too much self-belief. And I think you gave the example of Elon Musk showing you around SpaceX and being like, "I'm going to-

Sam Altman: Yeah.

Laurie Segall: … Or something like that.

Sam Altman: That was a long time ago and I remember it like step by step. It was such a visceral example of someone who has almost too much self-belief.

Laurie Segall: Explain that to me.

Sam Altman: But almost, almost. Um, it's very hard to do a startup. It's hard in a way that it's difficult to explain to someone who hasn't done it. In fact, I'd say the thing that is almost universal from talking to founders who have started companies that have been successful is, this has taken over my life, all of my life force has gone into this to a degree that I had no framework for. You hear that again and again and again. And there are so many times where it's tempting to give up.

Where you just feel like there’s nothing left to go wrong but if one more thing does, I’m just going to like collapse on the floor and that’s it. I can’t do this any longer. And you have so many people telling you that you’re going to fail. In addition to so much direct evidence that you are in fact failing. That the, that the self-belief that it takes to get through that and say, “Against all these odds, against all this evidence that this isn’t working, against all these smart people telling me that it’s not working, I am going to keep going.”

That takes an unusual kind of person. In fact, the personality traits that make one good at that are not good probably in sort of other careers and maybe not even in the rest of someone’s life. But there is something about it where if you’re really trying to do something new in the world, you’ve got to be able to keep going in the face of incredible doubt.

Laurie Segall: And you give the example. I mean, g- give us a specific. So one was Elon Musk showing you around SpaceX. Right?

Sam Altman: Yeah. Um. So, I forget what year this was. Uh, but let's say it's like around 2012. So this is well before they'd been as successful as they are now. And it was like, you know, a Friday afternoon in Hawthorne, California. It's just the end of a, um, a long work week for me and I'm sure a very, very long one for him. And we were meeting, I don't remember what we talked about but we met in, like, sort of a little conference room for a little while. And then he was like, "Do you want a tour of the factory?"

Sam Altman: And I was like, "Uh, sure." And I assumed we had, like, some tour guide. And then he spent, like, three hours himself, like, showing me around. I would have thought he was busy but he did.

Laurie Segall: (laughs)

Sam Altman: And, you know, there's these, like, little vignettes that stick out in memory where it's sort of funny, like, whenever he would walk somewhere with, like, me in tow, the people would just, like, scatter.

Laurie Segall: (laughs)

Sam Altman: And any detailed question about like, “Oh, what does like this piece of a turbo pump on the engine do?” He would have like a, a 30-minute answer for. And so it was impressive like the level of, of just technical detail of how the whole thing fit together. And it was also sort of impressive to hear him talk about why he viewed it as such a moral imperative to get humans off of earth-

Laurie Segall: Mm-hmm

Sam Altman: … Living in other places, and it was just, like, very clear how genuine his motivation is, that making humans multi-planetary relatively soon is critical if we want humanity to be robust and survive. But yeah. The thing that stuck out, sort of thinking about it much later, was not the technical depth, not the intensity of how much the mission mattered to him. But the certainty that he could do it.

Laurie Segall: Mm-hmm (affirmative)

Sam Altman: And, when you talk about, sort of, like, well, this part seems really hard. And man, like, you know, establishing life on Mars. Like, think about all the prep work you have to do to make it human-habitable, and, like, just take all the stuff that we have on earth that we've built up with billions of people over thousands of years to make human society function. And just, like, getting all that machinery, figuring out new governments, new countries.

Sam Altman: Everything you have to do to establish society on Mars, almost everyone would just say like, “That’s too much work. That’s not actually possible.”

Laurie Segall: Yeah.

Sam Altman: And he was like, "We have to do it and so we're going to, no matter what it takes. We'll figure it out."

Laurie Segall: Mm-hmm

Sam Altman: And that spirit of, like, we'll figure it out. I will make it happen. That's just an extremely powerful thing.

Laurie Segall: Do people ever come to you since you're kind of the startup Yoda and ask you about, like, I'm assuming, because to be a good founder, you do have to really have a lot of self-belief. You have to be obsessed with something.

Sam Altman: Yeah.

Laurie Segall: You have to live and breathe something. You have to be a little insane. Those are at least the, the traits of people I’ve interviewed that are successful.

Sam Altman: Yeah.

Laurie Segall: Um, you need to be really resilient. I can imagine that’s really difficult for personal life. I can imagine that it’s difficult for mental health.

Sam Altman: Totally.

Laurie Segall: And, and so, with those highs and with that intensity must come this whole other aspect of what it is to be that kind of human, to be able to be capable of this. And, and I think that, that there is a cost, in some capacity. So I wonder if you, if you, if you hear that and, and what advice you give.

Sam Altman: Absolutely. So I, I should mention that I don’t advise startup founders much anymore just because-

Laurie Segall: Mm-hmm

Sam Altman: … Running a company is not a part-time-

Laurie Segall: Yeah. Right.

Sam Altman: … Job that like really keeps you-

Laurie Segall: Right.

Sam Altman: … Exceptionally busy. And so, I don’t do this as much but when I was doing it, I would say, half my conversations with founders, even if they weren’t explicitly asking about this sort of stuff, which they rarely do, that was the real question. Like I, I learned over time that when a founder comes asking for sort of vague, non-specific help, what it’s really about is, I’m having a hard time with myself. Or I’m, you know, in, in one of these, these many moments of despair that come to all founders.

Or not uncommonly, my company is going fine while my personal life is collapsing around me. What should I do? And I do think that one thing that's gotten a lot better is founders take mental health much more seriously. Founders prioritize getting sleep, but on the other hand, like, it's easy to get sleep when things aren't stressful. And then when you're just, like, facing incredible stress and anxiety at work, it's hard for anyone to sleep.

Or you have founders say, like, "My company is going great. And, but I just realized that I have been working 80 or 90 hours a week for the last seven years. And I've neglected all of my friends and they don't call me anymore. And what do I do about that?" And I think one of the things that people don't like to talk about in Silicon Valley, and probably in other industries too, is that to really succeed at sort of the highest levels, uh, that requires a prioritization decision that means you sacrifice a lot of other things.

And I sort of think that most people can have, if you're sort of like in a privileged enough place to be able to do this, most people can have whatever one thing they most want. You can prioritize work and work 80 hours a week and have a financially successful career. You can prioritize your friends and family and have a sort of super rich social life. Or many other things as well.

It's very hard to have it all and I think it's a disservice when people pretend that you can, and people pretend that you can run a very large and complex organization and that there are no personal life trade-offs that come with that.

Laurie Segall: What’s the thing that you want most?

Sam Altman: Um, at this point, the two things that I really care about are working as hard as I can to make this transition, as we move to this technologically enhanced species, go well. And then spending time with the people that I really love, which honestly, that's become a smaller and smaller list. I think like many people, my 20s were about success and #CrushingIt and having, like, tons of friends and going to parties. And now, there's like a small, narrow kind of work that is really important to me.

And a small set of people that are really important to me. And other than those two things, I don’t have a ton of room in my life for other stuff. And I’ve gotten okay at just saying no.

Laurie Segall: It’s a hard thing though. I mean, I guess you’ve had to get really good at that. Probably because of the influx of people who want things from you. But I, I think that’s a, that’s much harder.

Sam Altman: It’s, well, I don’t know if it’s hard for most people. It’s terribly hard for me and I was horrible at it.

Laurie Segall: Right.

Sam Altman: And then I realized that like, I am letting my life get filled up by what other people want me to do.

Laurie Segall: Right.

Sam Altman: Or these things that I've been working on. You know, like, I could have kept being a startup investor forever and it's like very addictive because it's not very hard work, you make a ton of money, and there's a lot of glory.

Laurie Segall: Right.

Sam Altman: And I think I could have easily gotten pulled onto that path forever, or I could have gotten pulled onto a path of just saying, like, yeah. It's, like, really fun to have a bunch of relatively shallow relationships with tons of interesting people. And it's easy to get pulled into that. But, uh, I do feel like it's been this effort for me to prioritize. To really think about w- how I want to spend my time and, and really try to prioritize that.

And, and the hard part is as I said earlier, you can have like, maybe one or two things that you really want if you’re willing to let all the other things go-

Laurie Segall: Um-

Sam Altman: … And letting other things go has been hard.

Laurie Segall: I remember, um, when the, there was a New Yorker profile that came out on you.

Sam Altman: Ugh.

Laurie Segall: (laughs) And you still, I mean, that, that reaction, right? I reread it, uh-

Sam Altman: Oh, no.

Laurie Segall: … Before I interviewed you. I know. There's a lot. I mean, we could, there's a lot we could take into the bunker. You have, like, a bunker. There's so many things we could, we could get into. Like, um, I, I had a section in my notes called "it's the end of the world as we know it."

Sam Altman: In the spirit of the shortness of life-

Laurie Segall: (laughs)

Sam Altman: … Is there anything else you’d like to talk about?

Laurie Segall: No, no, no. We don't have to talk about it fully. What I wanted to talk about was, I heard you say something about that profile. I thought that was interesting, which was, you know, you reread it and it was okay. But it just, like, you didn't feel like it captured you.

Sam Altman: Yeah.

Laurie Segall: And so, now that we're sitting here and you are sitting with someone who's known you for 10 years-

Sam Altman: Yeah.

Laurie Segall: How would you describe yourself?

Sam Altman: Um …

Laurie Segall: What didn’t it capture? I mean-

Sam Altman: What didn’t it capture?

Laurie Segall: Yeah. Like what, what-

Sam Altman: So I read it twice.

Laurie Segall: Uh-huh.

Sam Altman: Once right when it came out and it was like, quite annoying and then I read it once later and I thought it was sort of more fair but, but still did, didn’t feel like looking in a mirror. Um.

Laurie Segall: When you look in the mirror, what do you see, Sam?

Sam Altman: Um, something more human than what emerged from the scaffolding of that piece. I like the guy that wrote it. And I think if he, you know, didn't have the space constraints, he probably would have put more in there that I think more fully captured the picture. And it's, it's always sort of, in any characterization of anything, you want to highlight the things that sort of most differentiate someone from, like, the average human.

Sam Altman: And so in that sense, you want to, like, play up the things about me that are sort of most different from the composite average person.

Laurie Segall: I, you know, did learn that you have a bu- or like a bunker-

Sam Altman: Yeah, sure.

Laurie Segall: … Or something. (laughs)

Sam Altman: So there’s like all these weird things you can point out about me.

Laurie Segall: (laughs)

Sam Altman: But I don't think, like, those are the ones that define me.

Laurie Segall: So what are the ones that define you?

Sam Altman: Um, I mean, they're the boring things, right? Like, they're the same things, the things that I think make most people who they are: the people that they love, how they spend their time, what they're like to interact with. Like, how you treat a friend in a crisis, how you treat a family member in a crisis. How you sort of, like, bring joy and happiness to the people around you. What you believe in, what you work on, and how you, how you sort of, like, how you spend your time. How you feel. And, like, most of that for me is just like anybody else.

Laurie Segall: Right.

Sam Altman: And then you can like point to these weird edge cases.

Laurie Segall: Looking back, like, what do you think is your most memorable investment? I mean, because I think you've had such interesting specific experiences. I don't know, like, if folks listening who aren't as inside baseball understand that you've invested in probably many of the, like, uh, maybe are the reason they're using some of the products they're using on their phone. Right? It's like, but what would you say is, um, one of the most memorable investments you made?

Sam Altman: I think I could point to, like, all of these specific cases where I invested in this company and it went really well or, sort of more painfully, I had the opportunity to invest in this company and passed and it went on to do really well. This is going to sound more like a dodge than an answer. But for me, the most memorable thing is how well the strategy of investing in startups at the earliest stage overall has worked. It still amazes me that I or anyone else who does this reasonably well can sit down with two or three founders and an idea.

And say, in aggregate, you know, you get it wrong in many specific cases. But in aggregate, if you do it enough, say with enough accuracy that these are special founders, they're going to be successful, and decide that after a 10-minute or a 60-minute meeting and be right, on aggregate. You can sort of crush almost every other investment class of opportunities. Now this may not be as true now as it used to be. In fact, I suspect it's not, because valuations have gone up so much.

Laurie Segall: Mm-hmm

Sam Altman: But the thing that amazes me that is most memorable to me is not any specific investment but it’s that the entire strategy works.

Laurie Segall: But when you say you can sit with someone and like, five, 10 minutes or something and I think you said five at another thing but you’re not like allowed to do it in five-

Sam Altman: Right.

Laurie Segall: … Because it’s like not socially acceptable.

Sam Altman: Right.

Laurie Segall: Um.

Sam Altman: Sometimes, you do change your mind in the second five-

Laurie Segall: Right.

Sam Altman: But not very often.

Laurie Segall: But, like, what would it be? Like, what would you put, like, what, what would it be that you'd be look- like, what is it?

Sam Altman: Um, evidence of willfulness. You know, in, in whatever circumstances in life a person has faced, have they outperformed what should have been possible at every stage? Have people found a, a way, relative to where they are, to bend the world to their will? Vision and courage, raw intelligence, the ability to be an effective evangelist. One of the biggest jobs of being a company founder is to look at, well, it, it's to continually convince many different kinds of people that you meet that they should help you.

So you have to be able to convince people to come join your company, for them to keep working there while they get incredible opportunities. People to invest in your company, customers to buy your product while it’s still in development. Many other things like that and so, one of the most important skills of a founder is to be really convincing that this startup is good. The best way to do that, of course is to really believe deep in your heart that it’s good. And that comes through fast or not.

Laurie Segall: Mm-hmm. All of this kind of like goes back to the location days, right?

Sam Altman: Yeah.

Laurie Segall: And I interviewed all of you guys back then. What do you think is the biggest difference between the Sam that I interviewed in 2010 and the Sam that I’m sitting in front of in 2020?

Sam Altman: Um, it’s funny. I, I feel like very much the same person but playing a very different game. Um, like, you know, at that time I was running a sort of forgettable startup that I really cared about. Um, but it didn’t turn out to matter that much. Then I had this intermediate period where I ran this thing that I think had a huge impact on the world in a lot of different ways.

Y Combinator was, and maybe even more so then, this kind of voluntary flag bearer of the startup movement, at a time when the startup movement was not what it is now but still kind of an insurgency. And that was a formative experience and a lot of fun. Uh, to now working on this thing that I, I, again, I may be wrong, but I believe will be the most important work I ever do. There’s that quote that I love, that your 20s are always an apprenticeship for the work you do in your 30s. But you never know what that’s going to be.

Sam Altman: And so I felt like in my 20s, I was learning and practicing for the work I’m doing now. And I hope I can do a really good job, because now I think it actually matters. So it’s nice to have had a practice one.

Laurie Segall: A good warmup, right?

Sam Altman: Yeah.

Laurie Segall: Well and it, and you can relate that to the tech stuff that you were talking about like, this whole last decade was a warmup to, to these real-

Sam Altman: Totally.

Laurie Segall: … Massive issues that we’re now facing, um, that are fundamental and human and impacted us at, such a human level.

Sam Altman: Yeah. I, I feel very grateful to have gotten to watch that warmup. I wish it had gone differently. But just watching what happens when people are not as conscientious as they should be up front, not because they’re bad people but because the incentive system is what it is. And also because technology can get so powerful so fast that, if you don’t make yourself stop and think about what’s going to happen, not next year but in 10 years as this compounds, it’s just really easy to do the wrong thing.

Laurie Segall: And I, I just want to push you on this because like you sit at a really important table. Like, do you think that the people in your community are going to “do the right thing?”

Sam Altman: I think they, they would to the degree they’re able to and they know what to do. But I think if it is a few hundred people at one company, plus a few hundred thousand in a broader community, that are deciding what the future of the world is going to look like, that is not okay. They could do as well as they can at predicting what the rest of the world wants and would think. But the only way to do this and have it be just and good is to include a very broad representation of the world in making decisions about how we want to coexist with this technology.

What human rights look like, what the role of humanity looks like, what the new socioeconomic contract with your government looks like. These are, these are questions that everyone deserves to weigh in on, and that we should make a collective decision about, not leave to the will of the people who write the software.

If the last decade was tech’s warm-up, and we didn’t really get it right, what does the main event look like? Will Silicon Valley do better?

Is the self-reflection there? The incentive structure? Honestly, I spend a lot of time there and I’m not sure.

But I think Sam is right – the stakes now are even higher. 

I’ve watched a lot of founders like Sam grow up in Silicon Valley.

I think the next phase requires more people at the podium, under bright lights, standing for something and opening up the door to people and new voices who don’t think in terms of code but who specialize in humanity.

I’m Laurie Segall and this is First Contact. 

For more about the guests you hear on First Contact, sign up for our newsletter. Go to firstcontactpodcast.com to subscribe. Follow me, I’m @lauriesegall on Twitter and Instagram, and the show is @firstcontactpodcast. If you like the show, I want to hear from you: leave us a review on the Apple Podcasts app or wherever you listen, and don’t forget to subscribe so you don’t miss an episode.

First Contact is a production of Dot Dot Dot Media, executive produced by Laurie Segall and Derek Dodge. Original theme music by Zander Singh. Visit us at firstcontactpodcast.com.

First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeart Radio.