
February 3, 2020 60 mins

According to investor and entrepreneur Sam Altman, the last decade of tech was the warm up... and Silicon Valley didn’t exactly get it right. The main event is even more pressing. Now the stakes are higher.


First Contact’s Laurie Segall met Sam ten years ago. Back then, he had a startup called Loopt. It was a location-based social networking app for your phone. Since then, Sam became a fixture in the tech world. Loopt didn’t take off, but he went on to run Y Combinator — one of the most valuable incubators in Silicon Valley. And his next act is OpenAI — an initiative he started with Elon Musk.


Sam is someone who’s driven by an inability to stay in the lines. He isn’t afraid to stand up and say things that might get him into trouble, and he has a history of taking a stand under bright lights at a podium.


In this episode of First Contact, Sam opens up about what it was like to come out as gay in a St. Louis high school in the early 2000s, the possibility of human/AI hybrids, and why the next ten years in tech will be more disruptive than the last.




Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
First Contact with Laurie Segall is a production of Dot
Dot Dot Media and iHeartRadio. One thing where
I do think Silicon Valley gets a little bit dishonest
is saying like, Okay, we're going to build these incredibly
powerful systems. They're gonna do all these wonderful things for
us on this incredible exponential curve of technology, and then right

(00:22):
when we want it to stop, it's gonna stop. Tech years
are like dog years. A lot happens in a short time.
I want to take you back ten years ago. This
one is personal for me. One of the first people

(00:44):
I met when I started covering technology was Sam
Altman. We were both in our early twenties. It
was before tech exploded and left us with some pretty
complicated questions about the future. Back then, everyone was hustling.
We were coming out of the recession, the App Store had launched,
and a new creative class was emerging. It wasn't cool

(01:06):
to go work on Wall Street anymore. You could just
have an idea and code it into the hands of millions.
I met Sam at a conference called South by Southwest.
He had a startup at the time called Loopt.
It was a location-based social networking app for your phone.
That feels so long ago. If you know his trajectory, Sam,
for me represents so much of what I find fascinating

(01:28):
about tech success and failure, big ideas, blind ambition. Sam
became a fixture in the Bay Area. He ran Y Combinator,
which is one of the most valuable incubators in Silicon Valley.
His next act is called OpenAI. It's an initiative
he started with Elon Musk. Sam personifies what it was

(01:50):
like to come of age in Silicon Valley. His first company, Loopt,
was ahead of its time technically. It failed and sold,
but Sam went on to become one of the
most prominent voices in the Bay Area. It's like his
failure was a springboard for success. But he's also someone
driven by an inability to stay in the lines. He

(02:12):
isn't afraid to stand up and say things that might
get him into trouble. He has a history of taking
a stand under bright lights and a podium. You'll hear
that that was always the spirit of Silicon Valley, but
that spirit has been compromised and there were some glaring
omissions in the exponential curve that drove us to this
moment in tech. I'm Laurie Segall, and this is First Contact. Okay, okay,

(02:41):
let's start. So the show is called First Contact. And
what I love to do, given my history of covering technology, is,
especially when I have people that I've known for many years,
is go back and talk about our first contact. Do
you remember our first contact? South by Southwest, two
thousand eight, two thousand nine? I want to say it must have

(03:03):
been two thousand nine or ten. You, Sam Altman, who
are now kind of royalty of Silicon Valley, you know,
president of Y Combinator and now running OpenAI. You
had founded Loopt, and I remember our first contact.
I don't even know if you realize this, but I
was just starting at CNN and I was just interested

(03:26):
in technology, and I was a production assistant pretending to
be a producer who would convince someone to let me
bring my best friend to South by Southwest with a
camera. But then you had to like buy your own
plane ticket or something. Yeah, and I like slept in
a bed with my friend, like, we literally, like, and
I was hustling. I remember being very impressed by this.

(03:47):
I mean, I can't believe I told you. Did I
tell you that? I'm super upset by that. Yeah, but
it was like this certain moment where I would have
done anything. Fast forward, I became the senior technology correspondent
for CNN. I was on camera for a decade, but like
I wasn't even on camera then. I would have done anything,
including paying my way to South by Southwest to put
people like you on camera, because it was such an extraordinary

(04:08):
There was this extraordinary moment. There was a legitimate new
platform that kicked off a ton of startups. At the
same time, startups were easier to start by less experienced
people than ever before. So you had like the intersection
of the iPhone kicking off this new platform of mobile,
of which we still really haven't had a new platform
of that level of significance since, and then a sort

(04:32):
of technological and cultural moment where you have things like
AWS, where all of a sudden people could easily start startups,
and a cultural moment where they wanted to, you know,
kind of coming out of the financial crisis. And it
was this magic period that I'm super thankful to have
like witnessed and been in and been a part of,

(04:52):
where it was sort of the world changed very fast,
and it mostly got changed by people deciding they were
gonna start companies. Yeah, And I think what was really
interesting to me is, you know, fast forward. I think
it was maybe that same year you were my first
on camera interview. It was in Bryant Park and we
were talking about Loopt, and it was like the iPad

(05:13):
had come out. I mean, by the way, I feel
like a hundred years old even saying these words. Um,
it ages me. But I think you have to go
back to the history in order to go to the future.
And I think this is such like a crazy moment.
And I remember being so nervous. It's, like, crazy. Um,
it's like, easily in living memory, there
were no smartphones that mattered. Yeah, and now it's like

(05:35):
very difficult to imagine life without that. And that's the
power of technology, right? But it's an incredible real
time example. Yes, yeah. And so for you, it's
always very special, just because you very much represented
for me a first entrance into technology, a first
entrance into being on camera, um, which helped shape my career.
So, not to be whatever, you know. It's really, yeah,

(05:56):
that's really sweet. Um, it was really interesting to me,
and it's been fascinating to watch you because now that
was a decade ago. I would show you the video,
but like, I'm appalled by it. I just kept wearing
this black blazer over and over and over again, which
I'm just like appalled by. Um, that's a good reason
not to get out all the videos. Yeah, totally, and
fast forward, you went on. So Loopt was the first

(06:18):
company funded by Y Combinator. You became the president of
Y Combinator, which is like the most prominent incubator in
Silicon Valley, responsible for Reddit, Dropbox, Airbnb. I mean the
history kind of tells itself, but I remember this moment
with you. You also became like this startup Yoda,
Like you have this blog that like everyone like asked
for your advice, and you're like became this massively influential

(06:40):
startup person, just like at the center of all that.
Everyone wanted you to fund their startups. Everyone wanted
your advice. You know. That kind of happened in the
time I knew you, and then I couldn't really go
to coffee shops with you anymore. You became like startup
famous or something. Yeah, extremely famous, but in an extremely
narrow world, right, but super interesting. So that's the thing.
Startups really became this cultural moment for a brief period

(07:03):
of time, and I think now it shifted. It became
the thing that the most ambitious young people wanted to do,
whether or not they had studied technology. It was like,
this is where it's happening. This is how I can
sort of most quickly have a big impact on the world.
And Y Combinator was, as you said, sort of at
the center of that, and so I had that reflected, whatever.
I want to kind of go back to your
experience at Y Combinator to kind of go and look

(07:25):
towards the future. I mean, you became the president of
Y Combinator, and you just like saw everything, right? Like,
tell me. Like, you write all of these posts about
like the most successful people you've seen. Like, you're
like the keeper of that. It's like you do like
pattern recognition. Like, what would be like your biggest
takeaway, having watched success and failure and the raw emotion of building? Um, well,

(07:49):
I think a lot has gone wrong, which everyone talks about,
but a lot has gone right as well, And that's
kind of, you're not even really supposed to like say
that anymore. And one of the things that I think
is really great about Silicon Valley is people don't
have to have a lot of credentials. They can have
failed many times at many previous startups, and as long
as they're really right once, as long as they eventually

(08:12):
do something where they build a product that people really
love that somehow is useful to people in an important
way, that still drives the world forward, and people
can still have sort of great careers that way. And
I've never seen anywhere else where that happens. And so
looking for people who are determined and visionary and bold

(08:32):
and willful. Willful is like I think the most important
word that I've learned to describe startup founders, if you
have to just pick one, because if you have a
way to bend the world towards your will, most of
the time, you will eventually be successful. You may get
knocked down a lot of times, like most people do.
I certainly had one quite difficult and long failure of

(08:52):
a startup um. But if you just keep going and
if you just figure out how to make things happen
and don't give up, and have this sort
of strong vision of what you want to do in
the world and how you want to see the world
be you can eventually accomplish great things. Where do you
think that comes from for you? So, I was looking
back and every person I've ever interviewed, it's like someone

(09:15):
went through something. Right, anyone successful who I know has
like gone through some kind of, like, you didn't fit
in when you were younger. What do you think it was?
The question is always like are people like that? Is
it genetic or is it from childhood trauma? Probably the
answer is both. Um, I had all things considered a
like I can't point to like, oh, like I was

(09:37):
always doing this to get my father's approval or whatever,
like I had a wonderful childhood. Um, I don't have
like a single moment I can point to where like
this traumatic thing happened to me, like that's why I
work super hard. Um, maybe it did and I just
don't know about it. Like, I was reading, I mean, I
think it's extraordinary that you grew up in Missouri, right?

(10:00):
I think it's extraordinary. There was something I was reading
that in high school. I think there's something I go
back to that I read. Um that you did that.
I think it was very defining of who you are
as kind of a person and what kind of divides
the people who kind of stand by and someone who
actually says something in an extraordinary way. Can you tell
us that story? So there had been one sort of

(10:23):
semi openly gay guy in my high school before me um,
and he had, I think, an okay time,
but was happy to leave. We were in a
relatively conservative by current standards, but for St. Louis at
the time pretty open minded school. And somehow someone had

(10:45):
invited like a mother of a gay guy to come
give a talk for like some National Coming Out Day
or something, and some students really objected to that, I think,
mostly on a religious basis, but also just a like gay people
are bad basis, and so kind of like made a

(11:06):
lot of noise about how they weren't going to go
to assembly that day, and how disgusting the school
was for doing this and all of that. And I was
at the time the only sort of like gay person
I knew of at my high school, at least openly.
I think it then changed real quickly after that. And
so we had this thing that you could do where
you could sort of anyone who wanted could give like

(11:28):
a five minute speech to the whole school any morning
when we were all together for assembly. So I went
home and I wrote the speech that I wanted to
give about like this is a different time. Sure, like
you can like think gay people are like bad news,
but if we can't talk about these issues without every
time someone feels like they don't like something, they're going

(11:50):
to like organize a protest and talk about how we
have this like awful, unsafe environment or whatever.
I mean, the language was just egregious. Um, that that
was like not the kind of environment we should hold
ourselves to. And now it's like the kind of speech
that anyone could sort of give at probably any
high school anywhere in the US, even in pretty rough

(12:11):
parts of it, and would be okay. But at the
time it felt like a very scary thing to do.
And I remember sort of like being up all night,
and really, I don't think I slept. I mean, yeah, I
like fitfully dozed a couple of hours the night before.
But I was so nervous about it. I don't really
get nervous for stuff, and I was so nervous to
do this because I was like mostly out, like most

(12:32):
people knew about it, but it was not the kind
of school where you would really stand up and like
talk about being gay, and that was okay, and I
almost didn't go through with it. I almost like changed
the speech to like not put my own personal story
in there. I remember, like very clearly sitting in the
hallway outside of assembly with this pen, changing like the
section where I told my own story back and forth

(12:53):
like three times because I just couldn't do it. And
then I don't even know why, but I finally was like,
you know it, like I'll just tell everybody I'm gay whatever,
like what's going to happen to me? And then like
it was sort of this moment where you kind of
just go on autopilot. But I remember I was like seventeen,
This is like half my lifetime ago, So it's amazing
how clearly I still remember this. But I remember sort

(13:17):
of like walking out to the podium and there's sort
of these bright lights and thankfully you can't really see
individual faces out there, and I I hadn't really told
the administrators what I was going to talk about. They
probably had a guess. They were clearly nervous. I remember
like the head of school was like, so you're gonna
give a sound off today, huh? And I was like, yeah,
I am. And he's basically like, please don't

(13:40):
make a meltdown for me. It was close. I was
very close to him, and I gave the speech. And
I don't say this out of like self-deprecation or anything,
but I'm not a good public speaker. I have a
lot of other talents. That's not one of them.
But I felt so passionate about this one that
it went like much better than it would normally go
if I got up and gave a speech. Like, it surprised

(14:02):
me how well it went. And then the audience were
my classmates and people in the younger grades, and I
got a long standing ovation out of it. And sort
of all day at school that day people telling me
like how much it meant to them and that they
really thought I was right. I had some like younger
students like come to me, like in tears, who had

(14:22):
been like almost suicidal from
the thing earlier, about sort of this protest, and
then after that, I think a bunch of people came
out and it was sort of a different environment. So
it was this really great moment. I'm happy I did it,
but it was ah it was so terrifying to do
at the time. How did you feel when it was done?

(14:43):
I felt relief. I was like whether that was a
good or a bad thing. I was so nervous. And
now it's over and I did the thing that I
think is right, and now whatever happens happens. Do you
remember like the first lines of it at all? I don't.
I don't, um. I remember that the last lines were
just about like either you have a tolerant, open community

(15:04):
or you don't, and you don't get to pick and choose, hm,
which topics you're tolerant of. Mhm. Wow. Like again, I
still feel like, you know what, um, if people have
like a problem with gay people, that's fine, Like they
can go somewhere else. Like, there's this thing that Condoleezza
Rice said that her mother told her once, it's

(15:25):
really stuck with me, which is like if people don't
want to sit next to you because you're black, that's
fine as long as they're the ones that get up
and move. And so you know, still like if people
have a problem I'm sure people do with me for
being gay, like whatever, they can go do their thing,
But you don't get to create an intolerant community, and

(15:45):
I still feel really strongly about that. Do you feel
like here in Silicon Valley it's pretty tolerant? Yeah, I mean,
I'm sure there are these like subtle ways in which
it's still not I've seen some of them. Um, but
I don't like, I don't believe that you should live
your life obsessing over all of the things that have
gone wrong or sort of ways you've been slighted. I

(16:07):
think you just sort of move forward in whatever way
you can, and um, I think that's just a better
way to live. But yes, there are definitely ways in
which it's I think still not as good as
we'd like. Yeah, I mean, this same impulse
has gotten me in uh several Internet wars where I

(16:27):
say the thing that I believe and you know, Twitter
doesn't like it. Um, but I do believe you have
to at some point either stand up for what you
believe in and have the courage of your convictions or
you just let the world get worse. And that goes
from matters of justice, which is one that I
thought about here, to things like saying, um, I believe

(16:52):
that AI is going to transform human society and you're
not supposed to say that out loud, and if you
talk too much about the future, you're kind of, in
the community of AI researchers, viewed between a little bit
askance and, like, actively quite evil. And I may
be wrong, but what I actually believe is we're going
to create this technology that is more transformative than any

(17:12):
technology humans have ever created, and almost no one is
paying attention or talking about it. And you can either
do what most people do, which is say, okay, the
societal norm as I'm supposed to just put my head
down and not talk about it, or say, I really believe.
I may be wrong, but I really genuinely believe that
this is going to change the world in unrecognizable ways.

(17:32):
It's going to make what we talked about
happening earlier with the iPhone look like a warm up
and we got to talk about that. Yeah. So, I
mean when we met, it was the iPhone had changed everything.
So now, you having spent, how many years did you
spend at Y Combinator? Um, I ran it for like six,
but I was there helping out basically since the beginning. So
you said, even before when we came in here, you've

(17:53):
seen something like this, right, come through and completely transform society.
So you left Y Combinator and you went to start
OpenAI. So can you explain a little bit about
what that is? Well, actually they overlapped. So while I
was at YC, one of the areas that I
was most interested in, and that I still am, it's

(18:14):
still the category of startups that I have the most
passion for, is how can we have more of these
sort of moonshot startups that work on a difficult piece
of technology that has massive societal implications if it works,
nuclear fusion, any of the other climate change efforts, artificial intelligence,
synthetic biology, space colonization, these huge things, and we either

(18:39):
helped fund or helped start a number of them while
I was at YC. One of them was OpenAI.
I wish I could tell the version of the story
which is, OpenAI was always the one that
I knew was gonna work phenomenally. Well, that's not true.
It was, you know, like many others that we tried. Some failed,
some worked; this one really worked. What is true is

(19:00):
this was the one I was most passionate about. When
I was eighteen, I wrote out a list of the
problems that I most wanted to work on. It was
like a college assignment, and the first one was build
artificial intelligence. So the passion had long been there, but
I had no idea it would work out as well
as it has. And then as it kept going,
it operated for a long time without any CEO, and

(19:22):
as I realized, like, this is going to have implications
for our society and collective humanity that I still can't
fully grasp, that are going to be unimaginably huge, that
was what I wanted to spend my time working on,
and that still is. And over time it was a sort
of fairly gradual transition where I did a little bit
of that and then both, kind of, and then sort
of just moved over. But the goal of OpenAI

(19:46):
is to build general artificial intelligence, a computer that can
think like a human in every way, and use that
for the maximal benefit of humanity. It's interesting. It just
seems to me that you're still, to a degree, the
guy in the gym with the bright lights on,
just saying something and putting yourself out there in some way,

(20:07):
hoping and not necessarily caring, but hoping that people are
going to just understand, or feel less alone or something.
There's like some person in you that's still out there
putting yourself out there in some capacity saying this is
how it should be here, this is you know, this
is who we are. And to a degree, yeah, and
now the stakes feel really high. Okay, we've got to

(20:28):
take a quick break to hear from our sponsors more
with my guest after the break. So what do you
think the next thing is that we need to be
thinking about that we're not. And I also want to

(20:49):
get into this moment where I think both of us
feel this where when we started there was this optimism.
Everyone is like super excited about tech. You're talking to
the girl that paid her own way to South by
Southwest and would have done anything to put these
people on camera. And now it's like the pendulum has
come all the way to the other side. Yeah, I mean,
I know it's true intellectually. I know it's true because

(21:10):
I remember it and I was there when everyone was
optimistic about this, But it feels it feels like it
can't really have been true relative to now, when it's
like everything is like tech is awful. And I do
think the industry as a whole made some real mistakes
of commission. People did things that they
kind of knew were bad. Um. I would say the

(21:33):
worst was services that were willing to optimize for engagement
at the cost of huge side effects, like companies like
Facebook and the business model, and um, in my head,
what I was thinking of is like, Twitter seems to
feed on outrage, and you could probably do a
lot of things to change Twitter that would Again, I
think the people working at all of these companies are

(21:54):
actually good people, and you're not really supposed to say
that either. I fundamentally believe that, and I assume all of
these services are going to change. I think we're facing
a moment where we just got through this explosion of
growth and transformation of society, and we need new rules
and new antibodies and new norms and new regulation and

(22:14):
it will catch up and we'll get there. But you know,
like I think along the way, people were definitely like, well,
I can do this thing and it'll make us grow,
and if I don't do it, we'll lose to some competitor.
And that is a bug with capitalism, that's true. But
I think we're at this moment where the world is
pretty great by a lot of metrics. Um, if you

(22:35):
just look at the fall of extreme poverty or regular
poverty globally in the last thirty years, um, we
should all be celebrating. And if you look at what's available,
there's a lot that we should be really happy about
and we're not. And I think technology has a lot
of blame there. But it's not just the actions of

(22:56):
the companies. It's that, as we said, in the last
thirteen years, the world has fundamentally transformed and society
hasn't caught up. And this happens when technological revolutions happen.
I assume it would have happened if we had been
around for the previous ones. This is just the one
that we get to live through. And as hard as
this has been, and as big of a change as
this has been, and as weird as the behavior is, like.

(23:19):
One thing that astonishes me is watching people who I
think of as truly progressive saying that Facebook or other
private sector companies should decide what the rules of free
speech are. It's like what happened to sort of the
American spirit, in the American values. I'm totally fine with
there being an asterisk on free speech. You can't yell fire
in a crowded theater, and social media is a

(23:39):
new kind of theater. But I'd like the government to
set those rules. And the fact that we're now in
this world where there's calls for the companies to do that
themselves terrifies me. Anyway, as hard as all these issues are,
I think what we've just been through is a small
warm up for what we're now on the brink of.
And we didn't even get it right on the warm up.
What is that? That's not comforting. No. Um, and

(24:02):
and so especially for someone who has like an insane
instinct as to companies that are going to be correct,
that warning doesn't, that doesn't leave me with much. Well,
it shouldn't. Again, I don't think it does a service
for someone like me to say, oh, like, you know,
there's no more technological change on the horizon, like we
have this big transition, that's it, you know. I think
kind of what happens now is there's like one news
story about someone editing the germ line of babies in China,
(24:24):
story about someone editing the germ line of babies in China,
and everyone really stresses out for two days and then
they forget about it. Um Or you see an example
of powerful AI technology being used for something that's sort
of an example of what's to come, and then they
forget about it. And somehow or other, we

(24:46):
are on the brink of being the first species ever
to design our own descendants. Maybe we do it by
something like Elon Musk's Neuralink. Maybe we do it by
editing the genome with CRISPR. Maybe we do it by
creating artificial digital intelligence. But this is
like not a small thing. This is not like most
things that people really get stressed about, which are sort of

(25:09):
in, you know, yesterday's newspaper, not in history books, and
this is one that's going to be in the history books.
And I agree with you that these are
the things that are coming down the pipeline. This is
the long term view, the thing I worry about, having
covered this for the last decade, it's like, so let's
go with Elon Musk for an example. And you're close
with Elon, you know him. Um, when we're thinking about Neuralink,

(25:30):
right like, and you're thinking about these things are coming
down the pipeline. You're going to have a chip
implanted in your brain that's gonna make us smarter, this
is gonna be amazing. Like, are we thinking already about
the unintended consequences? Will your thoughts be hacked? Will we
create a superhuman species? Like, what are the human costs
of the technology? Because for me, having been a big
cheerleader for you guys for a long time and really

(25:53):
caring about this technology, the thing that always to me
seems to get lost in the conversation is, this is
what's coming next. And these are the conversations happening
behind closed doors. I know you're at all the Silicon
Valley dinner tables that everyone wants a ticket to, right?
I don't think people are talking about that stuff as
the technology is being built. I know. I think
this is a huge deal. Um.

(26:16):
And I think like one thing that's happened is
there is this relationship now between tech and the media,
which I would describe as, uh, increasingly contentious. I was
going to just say, like, not that fun to be
on the tech side of. And one way to respond
to that is people just say, well, you know, I'm
not going to keep talking to the media. Um. And

(26:37):
this means the conversation drops out of the sort of
public view. You know. I think it's personally the wrong approach,
But I have sympathy for why people feel like I
can't get fair treatment of complex issues. It's certainly been
frustrating for me at Open Eye. No matter how careful
we try to be to not hype a result of

(26:58):
ours or how we try to talk about something, um,
when the story runs, it always runs with images of killer robots.
And that's frustrating, to be honest. And you know, and
yet we keep doing it. Speaking of the media, I
read this like nice paragraph that I was going to
quote to you about how you and Elon met at, um,
the Rosewood Hotel. Is that, oh, kind of the first
dinner of OpenAI? Like, set the scene.

(27:20):
It's like, the Rosewood Hotel, for folks who don't know,
is like this very fancy hotel where it has nice
views and plentiful drinks, um, and people in high heels
and lots of like VCs. Um. We just picked it
because I think Elon was staying there that night, and
it was like you guys talking about, like, armageddon.
This was the article: like, armageddon could happen

(27:40):
when it comes to the future of artificial intelligence, and
like we have to do something so like take us
to the table with Elon in the Rosewood Hotel. We're
on Sand Hill Road. I mean, this was like one of many.
People love the narrative that
like there was this one conversation and then this company. Okay,
well that was in the public, so give us one.

(28:01):
It's not out there. Um. I mean, I think
the conversations, the ones that sort of don't, uh, they
don't have quite as good of a narrative, but they're
the ones that matter, are the thirty small conversations that
happen in groups of two or three sort of you know,
late at night, where people are just like, of all

(28:23):
the things we could do in the world, should we
do this? And what does this even mean? Should it
be a company, like A or like B? What is
the research direction we would go after? What do we
even think it's going to take to build artificial intelligence successfully?
Like what are our theories around that? And then and
then you at some point, um say, okay, it is
really important to do this. This is going to have

(28:44):
a transformational effect on society. And in fact, that was
back, like, you know, when we started, at the beginning. Even
what it's capable of four years later, it breaks my mental
model of how fast progress can be. So we got
that part right, but then the details of how we
were going to make this progress we got largely wrong.
So we've had to adapt and do new things along

(29:04):
the way. And the way that this really happens, in
the way that I think it matters, is some people
meet each other or know each other and start talking
and say should we do this or should we not?
And at some point you make a decision and say
we're going to go ahead, and then you sort of
jump off the cliff and you try to build the
airplane on the way down. And whether we're successful or

(29:25):
not remains to be seen. But the progress we've made has been faster than I thought it would be. And what is the dream with OpenAI? Um, the medium-term dream is that we figure out how to build intelligent systems, systems that can learn, systems that can think, and that can be useful to humans and vastly increase

(29:49):
everyone's standard of living. And I think it is astonishing that we now have computers that can learn. Like, we don't talk about this much because we've sort of all gotten used to it. But the one thing that I think really makes humans special, really makes sort of life special, is this ability to learn and

(30:10):
to think. And once you get one algorithm that can do that, and it can do more and more of that as you scale it up and make it better, then the world is really gonna change a lot. And in the medium term, I think it can change a lot in all of these ways, where we have computers that can think and really do things that we need

(30:31):
them to do. One thing where I do think Silicon
Valley gets a little bit dishonest is saying like, Okay,
we're going to build these incredibly powerful systems. They're gonna
do all these wonderful things for us on this incredible
exponential curve of technology, and then right when we want it to stop, it's gonna stop. That seems unlikely, not the

(30:51):
way the world usually goes. But you're not supposed to talk about what happens when it keeps going. And I think we have a responsibility to talk about what happens when it keeps going. And I think we have some big societal decisions to make about what happens when it gets smarter than we are. When do you think that will happen? In our lifetimes? I mean, so what does the future look like? I mean, I know that you

(31:12):
guys... kind of... people were like, oh, they're fearmongering when they talk about OpenAI and this and that. And I thought it was interesting, something you said. Um, you were speaking to someone about how if people had talked about Facebook like this, they would have been criticized, you know, and brought up all this stuff, like, what if Facebook did all this, back in the day. And that's kind of what you're trying to do with artificial intelligence, like trying to go and anticipate

(31:34):
some of these things. What do you think worst case
scenario looks like. Well, what I would say on the fearmongering point is, I think you can never make all the people happy all the time, and so what you'd better do is just the thing that you think is morally right. And you might be wrong, but you act with your best sort of intention. And I think that, history on this one... now, I feel more

(31:55):
confident history is going to be on our side, which is talking about how to make sure we get to the good future, not saying, like, the future is going to be awful, killer robots, whatever. We don't actually say that, but saying that we have to do work to get to the good future, and that we have to think about this ahead of time. I think history is going to prove us right. And I think the people who are like, you know, it's always great,

(32:15):
don't worry about it, look the other way. I think
that's what kind of got some of the current social
platforms into trouble. And had they thought a little bit
more ahead of time, we may have avoided some of
these mistakes. Yeah, I mean like long term, long, long, long,
long term. I think if we do a good job
and if the world goes the way we want, you
kind of have this transition from purely biological humans to

(32:38):
some sort of hybrid merged humans and artificial and digital
intelligence and not everyone's going to choose to do that,
and I think there will be some sort of world
for people who don't want that. What does all that mean, what you just described? Like, so what is that? What does that look like, the opt-out part and the rest? Like,

(32:58):
just explain it a little. I mean, here's, like, one version of the world I could imagine, which is that a lot of people... actually not a lot, let's say, like, a small percentage of people, a lot in absolute numbers but not by percentage... say, you know, I'm gonna go all in on the future. I'm gonna plug my brain in via Neuralink or whatever. It seems very scary to me, honestly, but I think a lot of

(33:18):
people would choose that, and maybe I would at that point too. And I'm gonna merge with a copy of this AI, and, like, whatever this new thing is, this sort of hybrid is going to go off exploring space and just be, to a human, unimaginably smart and powerful and capable, in a way that

(33:39):
a human today with all of the capabilities of an
iPhone would seem like a magician to a human from
just a few hundred years ago. I think we should expect to see that exponential difference even more powerfully on this curve of technology. And I think what that means in terms of, like, power and capabilities is

(34:04):
difficult for you and me to sit here and clearly imagine,
but it's pretty unbounded. And then I think there will
be some people who say, like, you know what, I'm
opting out of that whole thing. I want to live
out my life as, you know, a regular human, and there will be some way to do that, where you live in some... maybe the whole world is just, like, an AI-free zone. I don't know, and AI goes off and takes the rest of the universe. How about...

(34:26):
I've been particularly interested... this line is going to sound weird when I say it... I've been particularly interested in death. How do you think death is going to change? And this idea of, you know... I've done a lot on, like, bots... I mean, this is very basic, but, like, bots where we recreate versions of ourselves. But
I know that it's been an obsession here in Silicon Valley.
You know, the idea of replacing your blood with the

(34:47):
blood of young people. That's one thing. But even beyond that, like, what do you think death means? That's a really interesting question. If you had a perfect copy of your brain... like, if you got the Neuralink implant and downloaded every thought process, every memory, every emotion,
Say if you could like make a perfect copy of

(35:07):
Laurie in a computer that was going to live forever in the computer. And let's say you can't make any tweaks, um, so there's not this, like, tweaked version. And you know that your body is going to die. But that copy of Laurie, which has all your memories, all your thoughts... it acts exactly like you, because it is, in the extreme, this sort of molecule-by-molecule copy simulated in software.

(35:28):
Do you count that as you living forever? Do you care? I mean, I do think that's me living on in some way. I mean, you know, I think it's so weird... not to bring in the human stuff, but, like, my mom was sick recently, and I was thinking, like... I don't know, there's something so visceral about it. Maybe having the connection with technology of, like, looking at our text messages, looking at everything... like, we have so much life data too that we're leaving out there, right? Like,

(35:52):
I don't know. I think I have a different opinion here than most. And I'm sorry, I'm going to go full Silicon Valley tech bro for a minute, but bear with me. Hit me with it. I'm prepped, hit me with it. One of the most valuable perspective shifts has come out of what has now been a sort of long-term and pretty intensive meditation practice for me. That's the... that's the cringe part. Okay, are our listeners still with us? Okay,

(36:15):
go ahead, I'm just kidding. It has really been this arrival at a certainty that I don't feel a separate self anymore. Um, I sort of... I view me as the system that takes input, runs it around in my brain, and produces output. But when

(36:35):
I really deeply look at that, I cannot find an
egoic me anywhere in there. It's like, you know, I'm
part of my environment. There's this model of the world
in my head. Photons come in, action comes out, but there is no... there's no me. There's, like, this body, this mind, these thoughts, but there's no sort of separate me outside of reality at the controls. And if

(36:58):
there is no you, there's no other. And then all of the versions of nondualism in the different philosophical traditions of the world end up being true in this weirdly basic way. Um. And if that's your operating model of how you think about the world, which is: there's a body, there's a mind, but there's no entity at the controls

(37:20):
of that, there is no self, then I think you
think differently about what it means to die or not,
or like if the thing in the computer is you
or not? And so all of these I think deep
and old philosophical questions are newly relevant in a world
where we actually can connect our brains to computers. How

(37:41):
would you feel about it? Um, it'd be something. But, like, as I said, if you don't feel like there's a you in the first place, it's like: okay, there's, like, another copy of this agent that sees, that takes input from the world, and that reacts. Okay, we've
got to take a quick break to hear from our

(38:02):
sponsors. More with my guest after the break. I ask

(38:24):
every founder that I interview this question, just because I
think it's an important question. What do you think is
the single most important ethical question we need to ask
ourselves when it comes to the future of, like, us, tech, and humans? Well, I think it's this question about
in a world where we're going to have computers that
can think like humans, what is the society we want
to design. I think there's a lot of short term

(38:45):
important questions. Empathy, which we talked about, I think is a big one. I think the absolute catastrophic failure of San Francisco to empathize with people who need help, for example, is a big one. I'm ashamed, in the deepest sense of the word, about how San Francisco has dealt with homelessness, mental health, and drug crises.

(39:06):
But I think longer term, measured on sort of geologic time scales, it's this question of what we want the role of humans to be in the world, and how we make sure the world is good for humans in the broadest sense. I think that is the biggest ethical question of our lifetime. What do you think,

(39:29):
bigger than even the inequality questions and everything else that feels huge today? I mean, what do you think, having spent this last decade... what do you think the next one looks like for Silicon Valley? Like, what happens next? Let's look at the current mess. We talk about the town square being overrun. We talk about, you know, the days of technology being loved and all the great press around the founders. You know, all of this has changed.

(39:52):
We have people truly questioning Facebook, truly questioning Twitter, truly
questioning the business model of Silicon Valley. Where do you think we land? Um, on the whole, this industry cares more about doing the right thing for the world and making things better than, say, like, the finance industry. But

(40:14):
I think it is also true that there's plenty of
people here who are between somewhat and entirely motivated by
making a bunch of money and don't think about the
consequences of doing so. I'm want to go back a
little bit too, this idea that you are kind of
like this startup whisper. I think that a lot of
the things you help people with are not just start
up things, that are just like universal human things. And

(40:35):
reading through your blog, you can just, like, write out things you were saying, like: successful founders have almost too much self-belief. And the almost word there is really important: successful founders have almost too much self-belief. And I think you gave the example of Elon Musk showing you around SpaceX and being like, I'm going to... something like that. That was a long time ago, and I remember it, like, step by step.

(40:58):
It was such a visceral example of someone who has almost too much self-belief. But almost... almost. Um, it's very hard to do a startup. It's hard in a way that is difficult to explain to someone who hasn't done it. In fact, I'd say the thing that is almost universal from talking to founders who have started companies that have been successful is: this has taken over my life.

(41:22):
All of my life force has gone into this, to a degree that I had no framework for. You hear that again and again and again. And there are so many times where it's tempting to give up, where you just feel like there's nothing left, and if one more thing goes wrong, I'm just gonna, like, collapse on the floor and that's it, I can't do this any longer. And you have so many people telling you

(41:47):
that you're going to fail, in addition to so much
direct evidence that you are in fact failing, that the
self-belief that it takes to get through that and say: against all these odds, against all this evidence that this isn't working, against all these smart people telling me that it's not working, I am going to keep going... that takes an unusual kind of person. In fact, the personality

(42:10):
traits that make one good at that are not good
probably in sort of other careers, and maybe not even
in the rest of someone's life. But there is something
about it where if you're really trying to do something
new in the world, you've got to be able to
keep going in the face of incredible doubt. And you

(42:31):
give the example... I mean, give us some specifics. So one was Elon Musk showing you around SpaceX, right? Um,
so I forget what year this was, uh, but let's say it's, like, around... it's well before they'd been as successful as they've been now. And it was, like, a Friday afternoon in Hawthorne, California, at the

(42:54):
end of a long work week for me, I'm sure
a very very long one for him. And we were
meeting, and I don't remember what we talked about. But we met in, like, sort of a little conference room for a little while, and then he was like, do you want a tour of the factory? And I was like, uh, sure. And I assumed he had, like, some tour guide. And

(43:14):
then he spent, like, three hours himself showing me around. I would have thought he was busy, but he did. And, you know, there are these little vignettes that stick out in my memory that are sort of funny, like: whenever he would walk up somewhere with me, the people there would just, like, scatter. And any detailed question about, like, oh, what does this piece of the turbopump on the engine do, he would have, like, a thirty-minute answer for. And

(43:38):
so it was impressive, like the level of just technical
detail of how the whole thing fit together. And it
was also sort of impressive to hear him talk about
why he viewed it as such a moral imperative to
get humans off of Earth, living in other places. And it was, like, very clear how genuine his motivation was: that

(43:58):
making humans multiplanetary relatively soon is critical if we want
humanity to be robust and survive. But yeah, the thing
that stuck out sort of thinking about it much later
was not the technical depth, not the intensity of how
much the mission mattered to him, but the certainty that

(44:20):
he could do it. And, you know, when you talk about, sort of, like, well, this part seems really hard... and, man, like, you know, establishing life on Mars... like, think about all the prep work you have to do to make it human-habitable. And, like, just
take all the stuff that we have on Earth that
we've built up with billions of people over thousands of

(44:41):
years to make human society function, and just, like, getting all that machinery, figuring out new governments, new countries, everything you have to do to establish society on Mars... almost everyone would just say, like, that's too much work, that's not actually possible. And he was like: we have to do it, and so we're going to, no matter what it takes. We'll figure it out. And that spirit of, like, we'll

(45:03):
figure it out, I will make it happen... that's this extremely powerful thing. Do people ever come to you, since you're kind of the startup Yoda, and ask you about... like, I'm assuming, because to be a good founder, you do have to really have a lot of self-belief. You have to be obsessed with something. You have to live and breathe something. You have to be a little insane. Those are at least the traits of people I've interviewed that are successful. Um, and to

(45:25):
be really resilient. I can imagine that's really difficult for
personal life. I can imagine that's difficult for mental health.
And so with those highs and with that intensity must come this whole other aspect of what it is to be that kind of human, to be capable of this. And I think that there is a cost in some capacity. So I wonder if

(45:46):
you hear that, and what advice you give. Absolutely. So I should mention that I don't advise startup founders much anymore, just because running a company is not a part-time job; it keeps you exceptionally busy, and so I don't do this as much.
But when I was doing it, I would say half

(46:08):
my conversations with founders, even if they weren't explicitly asking
about this sort of stuff, which they rarely do, that
was the real question. Like I learned over time that
when a founder comes asking for sort of vague, non
specific help, what it's really about is I'm having a
hard time with myself or I'm, you know, in one

(46:29):
of these many moments of despair that come to all founders. Or, not uncommonly: my company is going fine, while my personal life is collapsing around me, what should I do? And I do think that one thing that's
gotten a lot better is founders take mental health much

(46:49):
more seriously. Founders prioritize getting sleep. But on the other hand, like, it's easy to get sleep when things aren't stressful, and then when you're just, like, facing incredible stress and anxiety at work, it's, like, hard for anyone to sleep. Or you have founders say, like, my company is going great,
but I just realized that I have been working eighty

(47:11):
or ninety hours a week for the last seven years,
and I've neglected all of my friends and they don't
call me anymore. And what do I do about that?
And I think one of the things that people don't
like to talk about in Silicon Valley and probably in
other industries too, is that to really succeed at sort of the highest levels requires a prioritization decision that

(47:34):
means you sacrifice a lot of other things. And I sort of think that most people can have... if you're in a privileged enough place to be able to do this... most people can have whatever one thing they most want. You can prioritize work, and work eighty hours a week, and have a successful career. You can prioritize your friends and family and have a sort of super-rich social life, or many other things

(47:55):
as well. It's very hard to have it all, and
I think it's a disservice when people pretend that you can.
When people pretend that you can run a very large and complex organization and that there are no personal-life trade-offs that come with that. What's the thing that you want most? Um, at this point, the two things that I really care about are working as hard as I

(48:17):
can to make this transition as we move to this
technologically enhanced species go well, and then spending time with
the people that I really love, which, honestly, has become
a smaller and smaller list. I think, like many people,
my twenties were about success and hashtag crushing it and

(48:39):
having like tons of friends and going to parties. And
now there's like a small, narrow kind of work that
is really important to me and a small set of
people that are really important to me. And other than
those two things, I don't have a ton of room
in my life for other stuff. And I've gotten okay
at just saying no. That's a hard thing, though. I mean,

(49:00):
I guess you've had to get really good at that,
probably because of the influx of people who want things
from you. But I think that's... that's much harder. Well, I don't know if it's hard for most people; it is terribly hard for me. And I was horrible at it. And then I realized that, like, I am
letting my life get filled up by what other people
want me to do or these things that I've been
working on, you know, like I could have kept being

(49:20):
a startup investor forever, and it's, like, very addictive, because it's not very hard work, you make a ton of money, and there's a lot of glory. And I think I could have easily gotten pulled
onto that path forever, or I could have gotten pulled
onto a path of just saying like, yeah, it's like
really fun to like have a bunch of relatively shallow
relationships with tons of interesting people, and it's easy to

(49:40):
get pulled into that. But I do feel like it's
been this effort for me to prioritize, to really think
about how I want to spend my time and really
try to prioritize that. And the hard part is, as I said earlier, you can have, like, maybe one or two things that you really want, if you're willing to let all the other things go. And letting things go has always been hard. I remember when there was a New

(50:03):
Yorker profile that came out on you... still, I mean, that reaction, right? I reread it before I interviewed you. I know, there's a lot. I mean, there's a lot we could dig into. The bunker... you have, like, a bunker. There are so many things we could get into. Like, um, I had a section of my notes called It's the End of the World as We Know It. In the spirit of the shortness of life, is there anything else you'd like to talk about? We don't

(50:25):
have to talk about it. Fully, what I wanted to talk about was this idea... I heard you say something about that profile that I thought was interesting, which was, you know, you reread it and it was okay, but it's just like you didn't feel like it captured you. Yeah. And so now that we're
sitting here and you are sitting with someone who's known
you for ten years, how would you describe yourself? Um,

(50:49):
what didn't it capture? I mean, yeah, like... I read it twice: once right when it came out, and it was, like, quite annoying. And then I read it once later and I thought it was sort of more fair, but still it didn't feel like looking in a mirror. Um, when you look in the mirror, what do you see? Um,
something more human than what emerged from the scaffolding of

(51:14):
that piece. I like the guy that wrote it, and
I think if he didn't have the space constraints, he
probably would have put more in there that I think
more fully captured the picture. And it's always sort of... when you write any caricature of anything, you want to highlight the things that most differentiate someone from, like, the average human. And so in

(51:35):
that sense, you want to point out the things about me that are sort of most different from the composite average person. I mean, I did learn that you have... things. Sure, so there are, like, all these weird things you can point out about me, but I don't think those are the ones that define me. So what are the ones that define you? Um, I mean, they're the boring things, right? Like, they're the same things. Like, the things that I

(51:55):
think make most people who they are are the people
that they love, how they spend their time, what they're
like to interact with, Like how you treat a friend
in a crisis, how you treat a family member in
a crisis, how you sort of bring joy and happiness to the people around you, what you believe in, what you work on, and

(52:17):
how you spend your time, how you feel. And, like, most of that, for me, is just like anybody else.
And then you can, like, point to these weird edge cases. Looking back, what do you think is your most memorable investment? I mean, because I think you've had such interesting, specific experiences. I don't know if folks listening who aren't as inside baseball understand that you've invested in probably many of the... like, maybe you are the reason

(52:40):
they're using some of the products they're using on their phone, right? It's like... what would you say is, um, one of the most memorable investments you made? I think I
could point to, like, all of these specific cases where I invested in this company and it went really well, or, sort of more painfully, I had the opportunity to invest in this company and passed, and it went on to do really well. This is going to sound like a

(53:02):
dodge of an answer, but for me, the most memorable
thing is how well the strategy of investing in startups
at an early stage has worked overall. It still amazes me that I, or anyone else who does this reasonably well, can sit down with two or three founders and an idea and say... you know, you get it wrong

(53:27):
in many specific cases, but in aggregate, if you do it enough, say with enough accuracy that these are special founders, they're going to be successful... and decide that after a ten-minute or a sixty-minute meeting, and be right, where in aggregate you can sort of crush almost every other class of investment opportunities. Now this may not be
as true now as it used to be. In fact,

(53:47):
I suspect it's not, because valuations have climbed so much.
But the thing that amazes me that is most memorable
to me is not any specific investment, but that the entire strategy works. But you say you can sit with someone for, like, five or ten minutes or something... and I think you said five at another thing... but you're not, like, allowed to do it in five because it's, like, not socially acceptable. Sometimes you do change your mind

(54:10):
in the second five, but not very often. Okay, but, like, what would it be? Like, what would you be looking at? Like, what is it? Um, evidence of willfulness. You know, in whatever circumstances in life a person has faced, have they outperformed what should have been possible at every stage?
Have people found a way relative to where they are

(54:33):
to bend the world to their will? Vision and courage, raw intelligence, the ability to be an effective evangelist. One of the biggest jobs of being a company founder is to, well, continually convince many different kinds of people that you meet that they should help you.

(54:55):
So you have to be able to convince people to come join your company, and for them to keep working there while they get incredible opportunities elsewhere; people to invest in your company; customers to buy your product while it's still in development;
many other things like that, And so one of the
most important skills of a founder is to be really
convincing that this startup is good. The best way to

(55:17):
do that, of course, is to really believe deep in your heart that it's good, and that comes through fast, or it doesn't. All of this kind of goes back to the Loopt days, right? And I interviewed all of you guys back then. What do you think is the biggest difference between the Sam that I interviewed then and the Sam that I'm sitting in front of now? Um.

(55:42):
It's funny, I feel like very much the same person, but playing a very different game. Um, like, you know, I went from at that time running a sort of forgettable startup that I really cared about, um, but that didn't turn out to matter that much, to then having this

(56:07):
intermediate period where I sort of ran this thing that I think had a huge impact on the world in a lot of different ways, Y Combinator, which just touched... it is, and was certainly even more then, maybe, this kind of voluntary flag-bearer of the startup movement, at the time when the startup movement was not what it is now, but still kind of the insurgency.

(56:30):
And that was a formative experience and a lot of fun.
Ah, to now working on this thing that, again, I may be wrong, but I believe will be the most important work I ever do. There's that quote that
I love that your twenties are always an apprenticeship for
the work you do in your thirties, but you never
know what that's going to be. And so I felt

(56:51):
like in my twenties I was learning and practicing for
the work I'm doing now, and I hope I can
do a really good job because now I think it
actually matters. So it was nice to have a practice one.
It's good warm up, right, Yeah. Well, and if you
relate that to the tech stuff that you were talking about,
like this whole last decade was a warm up to
these feel massive issues that we're now facing um that

(57:15):
are fundamental and human and impact us at such a
human level. Yeah, I feel very grateful to have gotten to watch that warm-up. I wish it had gone differently, but just watching what happens when people are not as conscientious as they should be up front, not because they're bad people,

(57:37):
but because the incentive system is what it is, and also because technology can get so powerful so fast that if you don't make yourself stop and think about what's going to happen, not next year but in ten years, as this compounds, it's just really easy to do the wrong thing. And I just want to push you on this, because, like, you sit at a really important table. Like,

(57:59):
do you think that the people in your community are
going to, quote, do the right thing? I think they would, to the degree they're able to and they know what to do. But I think if it is a few hundred people at one company, plus a few hundred thousand in a broader community, that are deciding what the future of the world is going to look like, that

(58:20):
is not okay. They could do as well as they can at predicting what the rest of the world wants and would think. But the only way to do this and
have it be just and good is to include a
very broad representation of the world in making these decisions
about how we want to coexist with this technology. What

(58:42):
human rights look like, what the role of humanity looks like, what the new socioeconomic contract with your government looks like. These are questions that everyone deserves to weigh in on, and that we should make a collective decision on, not sort of the will of the people who wrote the software.

(59:03):
If the last decade was tech's warm-up and we didn't really get it right, what does the main event look like? Will Silicon Valley do better? Is the reflection there? The incentive structure? Honestly, I spend a lot of time out there, and I'm not sure. But I think Sam is right. The stakes now are even higher. I've

(59:24):
watched a lot of founders like Sam grow up in
Silicon Valley. I think the next phase requires more people
at the podium under bright lights, standing for something and
opening up the doors to people and new voices who
don't think in terms of code, but who specialize in humanity.
I'm Laurie Segall, and this is First Contact. For more

(59:47):
about the guests you hear on First Contact, sign up for our newsletter. Go to First Contact podcast dot com to subscribe. Follow me, I'm at Laurie Segall on Twitter and Instagram, and the show is at First Contact Podcast. If you like the show, I want to hear from you: leave us a review on the Apple Podcasts app or wherever you listen, and don't forget to subscribe so you don't miss an episode. First Contact is a production of

(01:00:09):
Dot Dot Dot Media. Executive produced by Laurie Segall and Derek Dodge. Original theme music by Xander Sang. Visit us at First Contact podcast dot com. First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeartRadio.