
June 8, 2023 33 mins

Just as Facebook was on the verge of becoming Meta Platforms, Inc. in late 2021, a scathing series of articles was published by the Wall Street Journal. The reporting was based on internal documents that detailed the ways Facebook’s platforms “are riddled with flaws that cause harm, often in ways that only the company fully understands.” The source for these internal documents — some tens of thousands of pages — became known as The Facebook Whistleblower.  The name behind these revelations is ex-Facebook product manager Frances Haugen. 

On this episode, Haugen reveals why she came forward, what she hopes to accomplish with her new book, The Power of One, and what she sees as the perils — and promise — of an ever-changing technology landscape that requires transparency to keep itself honest.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Hi everyone, I'm Katie Couric, and this is Next Question.
In September of twenty twenty one, The Wall Street Journal
began to roll out a series of eleven stories that
would have a major impact on the way people think
about technology companies. The series was called The Facebook Files,
and the source for the series, maybe one of the biggest journalistic treasure troves of the century, was a former

(00:27):
Facebook product manager named Frances Haugen. Frances had only been
at Facebook for a couple of years, but during that
time she became increasingly alarmed by a disturbing pattern.

Speaker 2 (00:39):
The only data that we get out of these companies
is how many users do they have, how much time do they spend, how many ads do they look at, what's the ad revenue? You don't get the societal costs that
come as a consequence.

Speaker 1 (00:51):
It seemed that Facebook was prioritizing their own profits over
public safety and putting people's lives at risk. So she
blew the whistle. She made tens of thousands of pages
of internal documents available to The Wall Street Journal. And
what happened next, testifying before the US Congress, the UK

(01:11):
and EU parliaments, and filing a complaint with the SEC, exposed the depths that the tech company would go to to mislead the public and grow its bottom line. Now, Frances has written a book. It's called The Power of One: How I Found the Strength to Tell the Truth and Why I Blew the Whistle on Facebook, and I'm very

(01:33):
glad that it's brought her to Next Question. First of all, Frances, it's great to finally meet you. I am a huge admirer of yours.

Speaker 2 (01:42):
So excited to be here. Thank you so much for inviting me.

Speaker 1 (01:45):
Of course, it's been quite a ride for you since
you became the Facebook whistleblower in October of twenty twenty one. Now, a year and a half or so later, how are you feeling? Are you happy that you came forward?
Do you have any regrets? Would you do it all
over again?

Speaker 2 (02:04):
You know, when I originally came forward, I had very
very basic goals, like I wanted to not have to
carry a secret that I thought had the potential to
really impact the lives of others. I came forward because
I was concerned about how Facebook was operating in African countries and in Southeast Asia, and I then, and still, genuinely
believe that if we continue to operate the way we do,

(02:26):
there are millions of lives on the line from things
like ethnic violence. But the world has changed a lot
since I came out. I was really shocked last week
when the Surgeon General issued his advisory on social media
and mental health for kids. I've been amazed at how
just knowing that these companies knew these harms were real
across a wide variety of harms has really galvanized the

(02:49):
activist community. It's caused legislative conversations around the world that
were pretty stalled for a long time. And so if
I could do it again, I would totally do it again.
I've been incredibly fortunate in how smoothly it's gone, and
it's exceeded my wildest expectations.

Speaker 1 (03:05):
You've written a book about your experiences. Why did you
want to take pen to paper, fingers to laptop and
share your story with the world.

Speaker 2 (03:16):
One of the things that kind of baffles tech journalists when I talk to them (because tech journalists live in a little bit of an echo chamber; you know, our classic criticism of tech is that tech lives in an echo chamber, but tech journalists also live in an echo chamber a little bit) is when I say to them, you know, when I take flights. You know, I'm a friendly person.
I'm one of those annoying people that talks to their

(03:37):
seatmates.

Speaker 1 (03:38):
I did that too.

Speaker 2 (03:39):
Yeah, and it's amazing. At least half the people I sit next to have never heard of the Facebook whistleblower. And so it's one of these things where culture change, and that's the thing that we really need, we need to reset our relationship with these companies, takes a long time. And this book, I'm hoping, helps a much

(03:59):
much larger set of people, a much more diverse set of people, get a seat at the table by kind of laying out, like, what are the conversations we need to be having, what are the choices we get to
make in the next few years, because we are in
a moment of inflection and we need to have as
many informed people at the table as possible.

Speaker 1 (04:17):
You know, you talk about Vivek Murthy, the Surgeon General, issuing a warning about the dangers of social media for kids, other people becoming more aware, and I'm just
curious if you feel this way. But for me, I'm
kind of like, what took so long? This doesn't take
a brain surgeon, a rocket scientist, or a tech expert

(04:39):
to know that people have become incredibly addicted
to social media. Tristan Harris was sounding the alarm after
he left Google. All kinds of experts were saying, this
is dangerous. Why did it take so long for this
to really become headline news?

Speaker 2 (05:00):
One of the things I talk about in my book is,
you know, what's the difference between, say, the car industry, like the automotive industry, and social media when it comes to our ability to hold it accountable or our ability to understand it. Back in nineteen sixty five, and it's going to sound shocking, there were no seatbelts in cars, no airbags. I remember that. Yeah, like I listened to

(05:23):
my parents tell stories about the kids all jumbling all over each other in the back of the station wagon, and it's like, really? Wow, a different world.
We now put eight year olds in car seats, right.
But the world changed very suddenly when a guy named
Ralph Nader came out with a book called Unsafe at
Any Speed. And what really changed was that people didn't
realize that there was the ability to live in a

(05:46):
different world. You know, our fatality rate today is way less per mile driven for cars because of a long series of actions. But the thing that people need to understand is, when Ralph Nader published that book, you know, there were one hundred thousand automotive engineers in the world. When I came forward, I think there were on the order of probably three hundred or four hundred people

(06:07):
in the world who really understood how systems like Facebook's work.
And of those people, you know, we are educated in
such narrow ways, I think a lot of those people
didn't understand the larger societal consequences of those choices and decisions.
And so Ralph Nader could have a chorus of automotive
engineers all say this is happening. When it comes to

(06:30):
social media, each of us sees a different world. You know,
for many, many, many people who would be the ones
asking those questions. When they open social media, they see
their friends and family who are likely relatively similar to themselves.
You know, the idea that Facebook could be radically different,
radically more dangerous in a place like an African country

(06:51):
or in Southeast Asia, it sounds foreign to us. We're like, social media is about looking at pictures of cats. And so I think that's a big part of it.
Like we need to be able to have the right
to study social media, We need to have the right
to be able to get independent data off these systems,
because then we can have definitive conversations.

Speaker 1 (07:08):
Is that because we're not having a universal experience? It's a highly, deeply personalized experience for everyone. So it's not as if we're all driving cars; we're all in different vehicles,
if you will. So it's not unifying people to realize
that they have to demand change.

Speaker 2 (07:31):
It's really important for people to understand just how different
those worlds are in terms of transparency. And that's part
of why I wrote the book. You know, I've gotten to live a lot of that arc of how
we write software or like what does it mean to
have experiences online? And I wanted to walk people through,
you know, this is what changed from step to step
to step so that more people have that context.

Speaker 1 (07:52):
I'm just curious, Frances, I know you weren't super psyched
to go to Facebook. When did you realize we're not
in Kansas anymore? Something is awry.

Speaker 2 (08:03):
It was interesting, like when they reached out to me,
I was like the only thing I work on is misinformation.
You know, it's interesting. I got there, and I think one of the first moments where I was like, wow, this is chaotic, is this: the role I had is something called a product manager. So product managers
are responsible for helping articulate what is the problem we're

(08:24):
trying to solve, how might we solve that, and then
once we come to consensus on a solution, what's this
series of engineering tasks that will allow us to execute
that solution. I had a role of being a product manager.
And Facebook understood that they were a different enough company
that they had seen that if people came in from

(08:44):
the outside, they didn't succeed at a very high rate,
like there was a lot of churn, and so they
established a boot camp for two full weeks to just
give kind of a basic level of, like, here's how Facebook works. And my manager pulled me out of it after like three days. He was like, we're, you know, things are too on fire, like we have to come up with a plan for the next six months, even

(09:06):
though you know nothing about the problem or what's going on. Like, we need you to articulate a plan now. That was
kind of my first warning that I was like, oh, wow,
like the house is on fire. The house is on fire,
and people are running around, even having the self-awareness to be like, oh, we know that if people don't get at least a certain amount of bootstrapping, Facebook is
very hard to figure out, even internally, how it works.

(09:28):
And it was interesting. I showed up for that first meeting,
you know, the one that my manager was, like, urgently pushing me to prepare for, and we spent twenty minutes basically discussing, should I have a job? So imagine you show up, you've just gotten hired for this thing, and all the leadership is saying, like, why do you have a team? Why does this team exist? Think
about that for a moment. You know, because activists have

(09:51):
told you, because the UN told you, hey, in Myanmar, your negligence around misinformation killed twenty four thousand people. You know of a problem. And yet I could sit in a room full of the leadership of safety having them be like, should this team exist? Right? You can
imagine that first six months was a little stressful.

Speaker 1 (10:10):
What was Facebook doing in the lead-up to the events of January sixth? We'll talk about that right after this. We're back with Frances Haugen, Facebook whistleblower and author of The Power of One. Frances, did Facebook really care about

(10:33):
misinformation, or did the company just feel like dealing with it was an exercise in futility?

Speaker 2 (10:39):
I think part of the problem was that Facebook had
taken the most obvious path to deal with misinformation, which was,
let's hire experts, let's hire journalists to help us assess
what's true and false. But that kind of approach, you know,
fixing safety after the fact, like, oh, we've already hyper-amplified, you know, extreme content. Now we're going to pluck
out the dangerous parts. Those strategies don't scale. You know,

(11:03):
Facebook has three billion users. There were maybe a few
thousand fact checks being done a month, maybe a few thousand.
You can see why you actually need to take a
different kind of approach, which is coming in and saying,
why are the algorithms rewarding extreme content? How do we
change our operations to deal with that fact?

Speaker 1 (11:23):
As they say, lies make it around the world before
the truth has the chance to tie its shoes. And so what was the alternative? What would that approach have been? I mean, what is the answer?

Speaker 2 (11:35):
Let's take, for example, something as simple as should you
have to click on a link before you reshare it. So, you hit the nail on the head. One of
the problems with third party fact checking is journalism takes time.
At Facebook, on average, it was like two or three
days for someone to write a fact check, and they
put a huge amount of effort into trying to build
prediction systems to guess which are the pieces of content

(11:58):
that might go viral, because we have to give the
journalists a head start. Alternatives are things like if you
require people to click on a link before they reshare it,
that reduces misinformation by like ten or fifteen percent, just
because people have to pause and think for a moment.
You know, there's one or two billion people on Facebook
who live in places where Facebook is the internet. You know,

(12:20):
they might have become literate to use Facebook, and as a result, in those places, in some countries, thirty five percent of everything you see on your news feed is a reshare. And so Facebook wasn't willing to take the hit of, you know, zero point one, zero point two percent less profit from reducing the amount of content that was moving through the ecosystem as a whole. So those are the kinds

(12:42):
of product design ways of dealing with misinformation.
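
A minimal sketch may help picture the product change Frances describes here. The names below (User, Post, can_reshare) and the friction flag are hypothetical, invented only to illustrate the click-before-you-reshare idea; the ten to fifteen percent figure in the comment is simply the number she cites in this conversation, not a measured result of this code.

```python
# Illustrative sketch only: a hypothetical reshare flow with the
# "click before you reshare" friction described above. All names
# (User, Post, can_reshare) are invented for this example.

from dataclasses import dataclass, field

@dataclass
class User:
    # Post IDs whose links this user has actually opened.
    opened_links: set = field(default_factory=set)

@dataclass
class Post:
    post_id: str
    link_url: str | None = None  # None if the post carries no external link

def can_reshare(user: User, post: Post, friction_enabled: bool = True) -> bool:
    """Gate the reshare button: if the post carries a link and the friction
    feature is on, require that the user opened the link first. Haugen's
    claim is that this pause alone reduces misinformation resharing by
    roughly ten to fifteen percent."""
    if post.link_url is None or not friction_enabled:
        return True
    return post.post_id in user.opened_links

# Example: the user has not clicked the link yet, so the reshare is blocked.
user = User()
post = Post(post_id="p1", link_url="https://example.com/article")
assert can_reshare(user, post) is False
user.opened_links.add("p1")   # the user clicks through and reads
assert can_reshare(user, post) is True
```

The point of the design is friction rather than removal: the content can still be shared, but only after the person sharing it has at least opened it.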

Speaker 1 (12:46):
Are third-party fact-checkers still the primary way Facebook is ostensibly trying to combat mis- and disinformation?

Speaker 2 (12:54):
You know, I don't know, because we have no transparency. You know, right now, we don't have any transparency into how Facebook operates. We know that Mark fired
lots and lots of safety people during his year of efficiency,
so it's possible that things have changed, but given his
recent behavior, it's unlikely things have materially changed.

Speaker 1 (13:13):
Let's talk about sort of the public facing explanation from Facebook.
I interviewed Sheryl Sandberg in twenty nineteen when you were
still working at Facebook. I asked her if she felt
like Facebook was doing enough to invest in its security.
Let's take a listen and let's hear your reaction to

(13:34):
what Sheryl told me.

Speaker 3 (13:35):
We've put in tremendous engineering resources, and we're doing things like
red teams, asking what do we think the bad guys
would do and how would we do it? So we're
never going to fully be ahead of everything. But if you want to understand what companies care about, you look at where they invest their resources, and if you look back three to five years and you look at today, we've totally changed where we invest

(13:56):
our resources. And my job has changed too. If I look at it, I've been at Facebook eleven and a half years; for the first eight or so, I spent most of my time growing the company and some time protecting the community.
We always did some protection, but now that's definitely flipped.
My job is a majority building the systems that protect and a minority grow. And so we're definitely changing as a company.

(14:19):
We're in a different place across the board on all
of these things.

Speaker 1 (14:22):
Do you think you're changing enough fast enough?

Speaker 3 (14:25):
I hope so. We're trying.

Speaker 1 (14:27):
What's your reaction to that, Frances?

Speaker 2 (14:29):
It's so interesting when we listen to media, you know, a
film clip, an audio clip from the past, you know,
you can often hear the emotional echoes of that moment.
And I think back in twenty nineteen, she's quite earnest,
like the only part of Facebook that was growing was
the Civic Integrity team. I think twenty nineteen was the year that the UN report on Myanmar came out and

(14:50):
you know, firmly placed blame on Facebook. They were still
living right in the immediate echoes of Cambridge Analytica. For context, right about when you interviewed her, that would have been when the FTC fined Facebook five billion dollars because of privacy violations from Cambridge Analytica. But over the course
of the next two years or even the next year,
I think Facebook began to realize that having a big

(15:13):
safety team, having people with PhDs asking questions was putting
Facebook in a quite awkward position because the more people
dug in, the more people having the ability to ask questions,
they found things, and they found things that were quite disturbing.
I think they did a pretty good job in the
run up to the twenty twenty election, but as soon

(15:36):
as the twenty twenty election passed, they fired that team,
the Civic Integrity team.

Speaker 1 (15:41):
One month after the election, yeah, one month, and of
course five weeks later was when the Trump supporters, many
of whom organized on Facebook, stormed the US Capitol.

Speaker 2 (15:54):
And I think part of what happened was, there are papers in the Facebook Files. They say, like, we saw all
this building up. But I think because now no single
person was responsible, no one felt like they had the
authority to go in there and intervene.

Speaker 1 (16:10):
Where do you draw the line? And nobody has a
crystal ball to say these people are going to do that.

Speaker 2 (16:16):
Facebook talks about the difference between movements and what are known as adversarial movements. So an adversarial movement knows that they're violating Facebook's policies and actively does countermeasures to try to get around Facebook. That's how we differentiate, like,
does this movement think they're doing something wrong? Right?

Speaker 1 (16:37):
Right?

Speaker 2 (16:37):
And you saw that extensively with Stop the Steal.

Speaker 1 (16:40):
If the Facebook Civic Integrity Team had not been disbanded
less than one month after the election, what would they
have seen and what would they have done with the
activity they witnessed going on on the platform?

Speaker 2 (16:54):
In the run-up to the twenty twenty election, there were a number of things in place where Facebook said, hey, we know we have vulnerabilities in our system. For example,
live videos. This is where you know, I can film
something on my phone and Facebook will put a little
announcement at the top of people's feeds. Facebook knew that
live video was a particularly big vulnerability for the company
because video is harder to monitor than audio or text,

(17:19):
for sure. So you either can deal with that after
the fact, or you can say, hey, what's leading to
that video going viral? And in the case of live video,
Facebook said, hey, you know, every piece of content on
Facebook earns a score based on how relevant it is
to you, Katie, or to your listener. You know, is it
similar to other things that they've seen before. Does this

(17:40):
person generally produce content that people would like to engage with?
You know, there's a bunch of factors. You earn a score,
and that gives you a priority in the news feed.
When it came to live video, they would give a boost.
They'd say, that score, we're going to multiply it by
eight hundred and fifty times to make sure that it
will show up at the top of your feed. They said, hey,
we know this is dangerous. We're going to only boost

(18:02):
it sixty five times in the run-up to the election. It's a little tiny detail, but when they stormed the Capitol, the
rioters actively used live video to coordinate, and so there's
these little things where they could have had the safety
measures on that were on election day, but because no
one felt they had the authority to say we're in
a situation, no one turned those on until the day

(18:24):
after they stormed the Capitol. These are little, tiny details, but you have to remember, when people interviewed the rioters after January sixth, they said it seemed real. It seemed real.
It seemed like everyone was saying, like, we're about to
experience a coup, like we need to like go and
save our democracy. These little product tweaks would have changed

(18:46):
the information environment that those people experienced, and who knows
what would have happened with January sixth.
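
To make the ranking mechanics she describes concrete, here is a hypothetical sketch of a feed score with a live-video multiplier. The function and constant names are invented, and the 850x and 65x factors are simply the figures cited in this conversation, not values drawn from Facebook's actual systems.

```python
# Hypothetical sketch of the boost mechanism described above: each post
# earns a relevance score, and certain content types get a multiplier
# before ranking. Names and numbers here only mirror the conversation.

# Multipliers as cited in the interview (illustrative, not sourced code):
LIVE_VIDEO_BOOST_NORMAL = 850        # the default live-video boost she mentions
LIVE_VIDEO_BOOST_BREAK_GLASS = 65    # the reduced "election safety" boost

def feed_score(relevance: float, is_live_video: bool, safety_mode: bool) -> float:
    """Compute the score used to order a post in the feed.

    relevance     -- base score from signals like similarity to past engagement
    is_live_video -- whether the post is a live broadcast
    safety_mode   -- whether break-glass election safety measures are active
    """
    if not is_live_video:
        return relevance
    boost = LIVE_VIDEO_BOOST_BREAK_GLASS if safety_mode else LIVE_VIDEO_BOOST_NORMAL
    return relevance * boost

# With safety measures off, a mediocre live video outranks a strong regular post.
print(feed_score(0.2, is_live_video=True, safety_mode=False))   # 170.0
print(feed_score(0.9, is_live_video=False, safety_mode=False))  # 0.9
# With the break-glass setting on, the gap shrinks dramatically.
print(feed_score(0.2, is_live_video=True, safety_mode=True))    # 13.0
```

The break-glass idea is that the multiplier, not the content itself, is the knob: turning it down changes what surges to the top of feeds without removing a single post.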

Speaker 1 (18:51):
So why didn't they do it? Because the team had
been dissolved.

Speaker 2 (18:55):
There was no longer a person in the company who
wore the hat of saying, let's make sure we're a
positive force in society, right, there was diffuse responsibility for
little tiny slivers. And I think after they dissolved the team,
which was on December second or December third, I don't
think there was anyone who felt like they had the
authority to say, hey, some people are going to have

(19:15):
to work over the holidays, right, Like this is a
big enough deal that someone's going to have to do something.
And I think that's why Facebook was asleep at the wheel.

Speaker 1 (19:23):
When we come back, Frances and I talk about improvements in social media that could have a positive impact on the teen mental health crisis. We're back with Frances Haugen. Do you think Mark Zuckerberg just cares about profit over everything?

(19:46):
And is there something about the broader culture of Facebook
that makes this almost a Sisyphean task to try to control, or at least even monitor or remove, really dangerous content?

Speaker 2 (20:04):
So I'm glad that you bring up Mark. So just so people understand how different the leadership of Facebook is versus other companies: Mark Zuckerberg holds about fifty five, fifty six percent of the voting shares that control Facebook, so that means he's the chairman of the board, he's the CEO.
If he wants to invest tens of billions of dollars

(20:25):
in the metaverse, no one can stop him because he
is the only voice that matters. I do think responsibility
goes to the top, right, Like part of the challenge
here is you have a man who has been CEO
since he was nineteen years old. Facebook is intimately tied
to his identity, and it's very hard for people to
admit that their life's work might be hurting other people.

(20:46):
And so unfortunately, there is an internal culture to the
company where the people who surround Mark know that being
too critical isn't going to get you very far. As I said, I think that's part of why Sheryl left. From when I came out, she left maybe six months after the Facebook Files happened. I think Sheryl was a voice that
was trying to push for responsibility, and there wasn't really

(21:08):
an appetite internal to the company to do that.

Speaker 1 (21:10):
To this point, in twenty nineteen, I spoke with her
about whether Facebook's business model ultimately rendered implementing security measures
bad for business. Let's hear what she said.

Speaker 3 (21:22):
So on this, I'm really pretty proud of our track record.
If you look a number of years ago and you listen to our earnings calls. So earnings calls are exactly what people are worried about: they're directed at investors, it's our quarterly report. If you actually watch us in earnings calls,
we are spending as much time talking about the measures
we take on safety and security as we are about
our business growth.

Speaker 2 (21:43):
Easily.

Speaker 3 (21:44):
We actually said many quarters ago, this is so important
to us that we are going to make massive investments
and change the profitability of our company by making real
resource investments. And we have to the tune of billions
and billions of dollars, and we will keep doing it.
We've taken action after action after action that is better for protecting the community than it is for our growth,

(22:05):
and we're going to continue to do that. Mark has
said it over and over again. I have said it
over and over again.

Speaker 1 (22:10):
Do you believe that, Frances?

Speaker 2 (22:11):
Oh, Katie, I'm so glad you played that clip for me,
because I am totally going to go get the transcripts
now of the investor calls just to see how things
have changed, right? Because I think back in twenty nineteen they were trying. Like, they got burned by Cambridge Analytica. They lost a huge amount of goodwill with regulators and with users.
I don't think that sentiment she expressed is still true.

(22:34):
One of the things that Elon Musk showed was that
you could fire all your safety teams and no one batted an eye, right? Because we don't have any stats.
I want to be super honest with people. Mark Zuckerberg
has fired a huge number of safety people in the
last six months, and the market has rewarded him. You know,
their stock price is going up because Facebook looks more profitable.
But he also fired their AI safety team, and then

(22:58):
they open-sourced their large language model. When people talk about existential risks from AI, allowing for mass proliferation of these technologies doesn't allow us to do thoughtful, slow, intentional development. And so I don't think what she's saying is true anymore.
We're living in a very different world.

Speaker 1 (23:17):
In fairness to Sheryl, do you think it was true at the time?

Speaker 2 (23:20):
I think in twenty nineteen, they were trying hard. If
Facebook had continued in the vein they were working in
twenty nineteen, I probably would have never been a whistleblower.
You know. I probably would have been like many people
who came before me, who've kept their head down and
kept trying, kept trying to make it safer, and eventually
burned out because the only part of Facebook that was
growing was the safety teams in twenty nineteen. By twenty twenty,

(23:43):
they had given up on that. You know, they'd said,
we're not getting acknowledged for the effort we're putting in,
and these teams are just liabilities.

Speaker 1 (23:50):
Let me ask you just what can be done. We've
heard about kids and mental health. We've heard about misinformation
and the election. We've heard about so many things that
are causing harms to society because of social media platforms like Facebook.

(24:12):
Section two thirty prevents or protects these social media platforms
from liability for the content they may carry. The Supreme
Court just made a ruling on that, and I guess
now it's up to Congress. But in the best of
all possible worlds, what would you like, Frances, to be

(24:33):
done to rein in the social media companies if you had to wave a magic wand?

Speaker 2 (24:39):
So I think it's important for people to understand kind of
what's the tool chest that's available to us. I think
the way forward is more something like what Europe did.
So Europe came in and said, hey, you need to
be honest with us about the risks, the harms you
know about. You need to publicly tell us how you're
going to reduce those risks, and you need to get

(25:00):
us enough data that we can see if you're making
progress on those things. Because for context, I think the
fundamental problem is our relationship with these companies is skewed.

Speaker 1 (25:09):
And Congress doesn't seem to really understand the rudiments of the technology that powers Facebook, or to actually want to do
something about it.

Speaker 2 (25:22):
I think the thing that's going to push Congress over
the line is actually the growing crisis around teenage mental health. Historically,
just for people's context, over the last sixty years, we've
had only a handful of Surgeon General advisories. It's things
like seat belts save lives, smoking causes cancer, breastfeeding helps

(25:42):
infant health, things that we take for granted today. But
before those advisories happened, there was ambiguity, there was controversy. Historically,
after a Surgeon General advisory is issued, usually within two
to three years, some sort of legislative action takes place.
I think it'll be really interesting to see how things
play out over the next year or two, at least

(26:03):
in the context of kids.

Speaker 1 (26:05):
And what can be done about that? Tell me how
to reverse or stop the negative impact that social media
and things like Instagram are having on young people.

Speaker 2 (26:20):
So you mentioned earlier that you know the business model
is working counter to our own well being or safety.
Let's take a look at sleep deprivation and kids. So one of the things called out by the Surgeon General was that thirty percent, thirty percent, of teenagers say
they use social media till midnight or later most weekdays.

(26:40):
That's crazy. When we look at risk factors for things like multiple kinds of mental illness, that's not just depression and anxiety, it's also things like bipolar. When we look
at risk factors for accidental death, both automotive and just
general accidents. When we look at risk factors for substance use,
uppers because they're tired, downers because they're depressed. All of
those things link back to sleep deprivation. We've known for

(27:05):
twenty years that we can influence whether or not people
use products. Imagine if for two hours before eleven, Instagram
got a little bit slower and a little bit slower,
and a little bit slower, it was like you were pushing the post a little harder. Maybe there
was a lag on TikTok between videos. Who knows. We've
known for twenty years that if you make an app

(27:26):
a little bit slower, people use it less. Imagine as
you approached your bedtime, you just got tired and went
to bed. That feature is live on Instagram today. That's
a meaningful thing that would help kids go to bed.
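
Here is a tiny illustrative sketch of that wind-down idea: progressively adding latency as bedtime approaches. The two-hour window, the bedtime hour, the delay values, and the function names are assumptions chosen only for the example, not any platform's actual behavior.

```python
# Hypothetical sketch of the "get slower near bedtime" idea described
# above: in the two hours before a chosen bedtime, add a small, steadily
# growing delay to each interaction. All parameters are illustrative.

from datetime import datetime, timedelta
import time

BEDTIME_HOUR = 23          # assume an 11 p.m. bedtime for the example
WIND_DOWN = timedelta(hours=2)
MAX_EXTRA_DELAY = 2.0      # seconds of added latency right at bedtime

def extra_delay_seconds(now: datetime) -> float:
    """Return how much artificial latency to add at time `now`.

    Zero before the wind-down window, rising linearly to MAX_EXTRA_DELAY
    as the clock approaches bedtime, then held at the maximum."""
    bedtime = now.replace(hour=BEDTIME_HOUR, minute=0, second=0, microsecond=0)
    if now >= bedtime:
        return MAX_EXTRA_DELAY
    if now < bedtime - WIND_DOWN:
        return 0.0
    progress = (now - (bedtime - WIND_DOWN)) / WIND_DOWN  # 0.0 -> 1.0
    return MAX_EXTRA_DELAY * progress

def serve_next_item(now: datetime) -> None:
    """Pretend to serve the next feed item, slowed down near bedtime."""
    time.sleep(extra_delay_seconds(now))
    # ... load and render the next post here ...

# Example: at 10:30 p.m. the app adds about 1.5 seconds per interaction.
print(extra_delay_seconds(datetime(2023, 6, 8, 22, 30)))  # 1.5
```

A linear ramp is just one possible choice; the substance of the idea is that small, growing delays nudge people off the app without any hard cutoff.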

Speaker 1 (27:38):
How about if parents come in and take their kids' phones?

Speaker 2 (27:41):
We should definitely do that, right? We ignore the fact that these technologies are extremely powerful and addictive, and they operate at a level of independence that no other consumer product
does today.

Speaker 1 (27:52):
In closing, Frances, I feel like I have to ask you about AI, which is the new boogeyman of technology. And rightfully so. It was pretty chilling when these AI
leaders said that artificial intelligence poses a threat as big
as pandemics and nuclear war, and it's sort of like,

(28:17):
holy shit. And yet you wonder, since the government has
been so impotent when it comes to figuring out how
to regulate social media, what they're going to do about
this looming threat.

Speaker 2 (28:34):
So I think it's always important to remember that these
are percentage risks, right? So this is, you know, they say there's a one percent or two percent risk, which is terrifying, right, you know, one or two percent risks of extinction. We should take those seriously. But I think
one of the things that people also need to be
honest about is we kind of let the cat out
of the bag, right. I think things like Fortune five

(28:56):
hundred companies should get together and say, hey, we will
only buy generative AI products that meet this bar of safety.
There's a code of practice, the code of conduct, where
we're like, we're not going to let our economic might
fuel development of AI unless you do it in an intentional, thoughtful,
responsible way. I think that's totally a thing that should
happen one hundred percent. I think Sam Altman talks about

(29:18):
having licenses, of saying, hey, right now there is a market disincentive to be safe, you know, move fast and break things, to quote Mark Zuckerberg. The fact that Facebook fired their AI safety team, no one's punishing them for that.
But when people talk about existential risk, to not have
that existential risk, we have to say, no one in

(29:39):
the world, that includes governments and militaries, gets to have AIs more powerful than a certain level. You know, how do we have a just, more stable world? Because if we are just escalating, the path of escalation will lead to all those existential risks.

Speaker 1 (29:56):
Is there anything you're excited about when it comes to AI, Frances, so we don't have to end on a terrifying note?

Speaker 2 (30:02):
Yeah, we need to talk about short term and long term.
The short term on generative AI, I think is transformative.
Right now around the world, there are literally billions of
people who don't have doctors. We're going to live in
a world in the next ten years where every child
in the world has a pediatrician, it just might be
a robot pediatrician. We're going to live in a world

(30:23):
in the next ten years where every child in the world is going
to have the highest quality reading instruction that has ever
existed for humanity. You know, an endlessly patient tutor that
will sit there and over and over again as long
as that kid keeps working, will help them learn to read.
That's going to be transformative. There are high probability short
term rewards that I think are almost certainly going to happen.

(30:46):
It is going to transform the world. The thing I
try to caution people on is those existential risks are
very low probability, and they're much longer off, and so
it is more important for us to try to build
a just world where the motivations and incentives for doing those
existential risks are as low as possible. So one of
the things that I am always trying to remind people

(31:09):
is we have invented new communication technologies before. When we
invented the printing press, suddenly a bunch of people learned
to read, and people start publishing pamphlets on things like
how do you know if your neighbor's a witch? What
should you do about that? And chaos ensued. We had
wars that killed huge numbers of people when we invented

(31:29):
the cheap printing press. We had wars about misinformation, things like, you know, yellow journalism. But we learned and we responded.
We developed journalistic ethics, We founded journalism schools to teach
those things, journalistic trade associations to help people self regulate.
We passed laws on media concentration to make sure that

(31:49):
you know, you got to hear from different voices. We
learned about how to live in our media environment, or information environment. It feels overwhelming right now because we're
the ones who are responsible for figuring out where we
go from here, you know, it's about how are we
going to learn, How are we going to respond, how
are we going to act? And part of why I
have faith that we're going to figure this out is

(32:11):
while it may seem impossible right now, every single
time before when we've made a new media technology, we've
learned and we've responded. So I will keep on pushing
and I just have a longer time horizon, I think
than many other people did.

Speaker 1 (32:26):
From your lips to God's ears. Frances Haugen, thank you so much for talking with me. Your new book is called The Power of One: How I Found the Strength
to Tell the Truth and Why I Blew the Whistle
on Facebook. Thank you so much. Thanks for listening everyone.
If you have a question for me, or want to
share your thoughts about how you navigate this crazy world

(32:48):
reach out. You can leave a short message at six
oh nine five one two five to five oh five,
or you can send me a DM on Instagram. I
would love to hear from you. Next Question is a production of iHeartMedia and Katie Couric Media. The executive producers are me, Katie Couric, and Courtney Litz. Our

(33:08):
supervising producer is Marcy Thompson. Our producers are Adrianna Fazzio
and Catherine Law. Our audio engineer is Matt Russell, who
also composed our theme music. For more information about today's episode,
or to sign up for my newsletter, Wake Up Call,
go to the description in the podcast app or visit
us at katiecouric dot com. You can also find me

(33:30):
on Instagram and all my social media channels. For more
podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or
wherever you listen to your favorite shows.