Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Kia ora. I'm Chelsea Daniels and this is The Front Page,
a daily podcast presented by The New Zealand Herald. Meta,
the parent company of Facebook and Instagram, has implemented some
changes across its organization since Donald Trump's election win. The
(00:26):
social media giant is set to remove independent fact checkers
from its service, replacing them with community-driven notes, similar
to what X implemented after its rebrand from Twitter. The
company has also ended various diversity, equity and inclusion measures,
while chief executive Mark Zuckerberg has called for more masculine
(00:48):
energy in the corporate world. All this came before he
was seated in the front row at Trump's inauguration alongside
other tech bosses, raising questions about how tied up these
global companies are becoming with the current US administration. Frances
Haugen is a former Facebook employee turned whistleblower over the
(01:10):
company's actions. She joins us today on The Front Page
to discuss the changes in the tech world. Frances, could
you start by giving us a quick rundown of what
you did at Meta, and what prompted you to go
public with those thousands of internal documents?
Speaker 2 (01:29):
Certainly. So when I joined Facebook, Facebook knew it had
a problem, so it had invested a great deal of
money in something called third-party fact-checking. That's the program
that they recently ended. But they knew that that program
only covered a tiny, tiny sliver of the people in
the world. Overwhelmingly the money was invested in the United States.
Beyond that, it was a little bit in Australia, a
little bit in Europe, a little bit scattered in a
(01:51):
few other places. But Facebook is the Internet for most
of the world. You know, for many, many, many people,
they live in countries where, you know, seventy percent, eighty percent,
ninety percent of the content available on the Internet in
their language will only be on Facebook. And so they said, hey,
we need to have a way to deal with misinformation
when we can't have fact-checkers. And so that was the
(02:12):
responsibility of my team, the Civic Misinformation Team. I
came forward in twenty twenty one because, in the process
of doing that job for two years, working on different
variants of it and eventually within threat intelligence at Facebook,
I realized that Facebook was serially lying.
They had a habit of lying about what their company
was doing, what the effects of their action or inaction
(02:35):
were, and they actively misled the public about what was possible.
And so I came forward because I wanted people to
have the information they needed to protect themselves. And in
the process I became known as the Facebook whistleblower, even
though very many whistleblowers have come out of Facebook,
because I'm not the only employee who knows about problems
the public needs to act on.
Speaker 1 (02:57):
Now, we've heard in recent weeks about Meta's plans to
remove independent fact-checkers and let users fact-check content.
Does this feel like a backward step to you?
Speaker 2 (03:08):
You know, it's one of these actions that on the
surface feels really dramatic. But one of the key things
that I detailed when I came forward was how superficial
the third-party fact-checking program was. You know, every
month they fact-checked maybe hundreds, and maybe fewer than,
like, five hundred. We're talking three hundred, two hundred pieces
of content globally, when Facebook was producing hundreds of millions,
(03:30):
if not billions, of pieces of content globally. So in
many ways, we're ending an era of, you know, lipstick
on the pig, if you will, when it comes to
seeing, like, third-party fact-checking as our first line
of defense. But I think the larger issue is around
how a lot of this has been framed, right? The
idea that, you know, Facebook is, quote, fighting for
freedom and fighting for freedom of speech, and yet we
(03:53):
see that they're quite willing to take down content around
issues when it may be politically expedient for them. So it'll
be interesting to see how all this evolves.
Speaker 3 (04:05):
After Trump first got elected in twenty sixteen, the legacy
media wrote nonstop about how misinformation was a threat to democracy.
We tried, in good faith to address those concerns without
becoming the arbiters of truth. But the fact checkers have
just been too politically biased and have destroyed more trust
than they've created, especially in the US. So over the
(04:26):
next couple of months, we're going to phase in a
more comprehensive community notes system.
Speaker 1 (04:36):
I know some of the revelations from your Facebook files
centered on Meta's failure to fact-check stolen election claims
prior to the January sixth insurrection. Right, does it surprise
you that the company is now, I guess, perceivably handing
the power back to its consumers? It doesn't feel like
they've actually learned any kind of lesson, does it?
Speaker 2 (04:54):
Well, I would say the larger issue in my disclosures was,
for example, what happened prior to January sixth. So before the twenty twenty election,
they actually took a lot of actions. You know, they
had one hundred plus of what they called break the
glass measures, which were little tweaks to how the company operated.
You know, for example, if you're in a group that
(05:15):
has a very small number of people who are driving
all the invites for the group, that's kind of suspicious.
You know, when you have point three percent of people
who send out invites for Stop the Steal, inviting a
third of all the members, you know, that should be
something that you would maybe take pause on or maybe
operate that group a little bit differently because you know
that someone's trying to manipulate your platform and trying
(05:36):
to hack your vulnerabilities. Facebook knew prior to the twenty
twenty election they had a bunch of problems, and they
flipped a bunch of little switches to make their systems
work in a way that they knew was safer. But
prior to January sixth, they turned them off. They left all
those switches in the dangerous position. I think one of
the key kind of differences of what's changed in the
(05:57):
last few years is this question of do companies have
a responsibility to understand their vulnerabilities, and do they have
a responsibility to act to address those vulnerabilities if those vulnerabilities
have the ability to, say, change the course of an election.
And back in twenty twenty, Facebook very clearly thought they
(06:18):
had a responsibility to do that because they had a
three-hundred-person team staffed to look for things like
voter disenfranchisement, to look for foreign operations trying to amplify
misleading information. But by the twenty twenty four election, they
had dissolved that team. And now they're going even a
step further, coming out and slandering their own efforts
(06:40):
to go and try to make the platform safer by
labeling them with words like censorship, even though many, many of
those interventions had nothing to do with content. So it's
interesting to see how far Mark has come and how
his perceptions around what his responsibilities are have shifted in
a remarkably small amount of time.
Speaker 1 (06:58):
Do you think it's because they perhaps feel more at
home now? I suppose at Trump's inauguration last week we
saw Meta's Mark Zuckerberg, X's Elon Musk, Amazon's Jeff Bezos,
and Google's Sundar Pichai in the front row ahead
of Trump's cabinet. Why are these tech CEOs so keen to,
I guess, kiss the ring and cozy up to the new administration?
Speaker 2 (07:20):
So we're in a really interesting moment where, when we
look across the world, countries are starting to say that
they want a different balance of power with Silicon Valley.
And I've heard, very openly on certain podcasts, people
who are really deeply connected into Silicon Valley say things
like, part of why people shifted, why these tech CEOs shifted
(07:43):
to support Trump, was that as those countries, as the
European Union passed the DSA, as Australia started setting boundaries
on how young children should be when they use
social media, they expected the United States government to step
up and defend their economic interests. And the Biden administration,
I think, in many ways took what is probably a
(08:04):
more appropriate position, which is to say, every country is different,
states should have autonomy to say how they want to
govern forces that shape their own countries. But if you're
a tech CEO, your job is to optimize for your company.
And so Trump's inauguration is a second chance for a
lot of these companies. You know, the AI regulations that
Europe passed, the only way Silicon Valley won't have
(08:27):
to figure out how to adapt to those things, like
they did with GDPR, the privacy law, is by having
people like Trump step in and say, we're just not
going to do that. Look at what happened with the
TikTok divestment bill, also sometimes known as the TikTok ban.
He said, I'm very popular on TikTok. I don't want
it to go away. You know, there's these interesting moments where,
(08:48):
you know, Silicon Valley sees we have someone who is lawless.
You know, he doesn't care that the Supreme Court has
ruled that Congress had the right to do this, that
this is a legal action. He believes he's above the law.
So if they can show Trump that they are useful
to him, they are going to get a second chance
at all these mechanisms of oversight. And I think that's
(09:10):
the thing that they're fighting for.
Speaker 3 (09:16):
And as of today, TikTok is back. And I said,
we need to save.
Speaker 4 (09:24):
TikTok, because we're talking about a tremendous... who in this
audience goes with TikTok? Very popular.
Speaker 1 (09:35):
And frankly, we have no choice. We have to save
it. A lot of jobs.
Speaker 2 (09:40):
We don't want to.
Speaker 1 (09:41):
Give our business to China. We don't want to give
our business to other people. You mentioned the TikTok ban,
and it's quite incredible to see that once it came back online,
there was a message shown to all users that thanked
the Trump administration. That doesn't seem like something that we've
(10:03):
seen before.
Speaker 2 (10:05):
No, well, they know flattery works. You know, this is
what foreign governments realized in the first Trump administration:
the number one thing that you can do to get
your way is just make him feel like a really
big man. And so we have the situation now where
tech CEOs are coming in and saying, as long as
we can stay on Trump's good side... And remember, in
the case of Mark, he really needs Trump. For context,
(10:28):
when we look at what's coming down the pipeline, just
this year, we're going to start seeing the cases that
are going to make up the forty-four US state
lawsuit against Meta for hurting kids. You know, this
is often compared to like the tobacco lawsuit. We're going
to see school districts suing to say, hey, you're costing
us tons of money by forcing us to have to
(10:50):
pay for tutors, to pay for more security guards, to
pay for therapists, because kids are showing up to school suicidal.
We need those costs recouped. There's a lot of these
major actions that are coming down the pipeline. More countries
are probably going to follow Australia's lead in putting in
age restrictions. Mark's only chance to subvert the democratic processes
that are happening here is to get Trump to intervene
(11:12):
in one of these Hail Marys. And so I think
it makes a ton of sense strategically for these companies
to be doing everything they can to kiss the ring.
That's the thing that's going to protect them from this
wave of oversight that democracies around the world have been
pushing forward.
Speaker 1 (11:39):
Zuckerberg recently suggested on the Joe Rogan podcast that corporate
culture needs more masculine energy, describing it as culturally neutered,
and that having a culture that celebrates the aggression a
bit more has its own merits. Now, having worked in
that Silicon Valley environment, does this ring true to you?
Speaker 2 (12:01):
You know, it's fascinating. So Google spent a huge amount
of money and a huge amount of effort, bringing in
data scientists, bringing in sociologists, doing huge amounts of work to
figure out what makes for a highly performing team in
Silicon Valley. And the thing that they found was most
important was that people felt psychologically safe. So when you
sit there and you say we need more masculine energy, like,
(12:21):
what does that mean? Does it mean violence? Like, you know,
active aggression? Does it mean more yelling? Like, what's the behavior
you want to see more of in the office? You know,
I think if I were sitting at Google, you know,
the thing they actively did after they did that research
was go and give their managers more EQ training, because,
you know, if you want to have a highly performing
(12:43):
team where people take risks, people have to feel psychologically safe.
So it feels just like marketing messaging. It doesn't feel
like, you know, a serious person actively looking at the
research that's been done on what makes teams achieve.
Speaker 1 (12:56):
Yeah. Another key thing we've learned in recent
weeks is about the end of DEI initiatives at
the company, Meta, including removing tampons from men's toilets
that were there for trans or non-binary employees. What
point are they trying to get across by doing this?
Speaker 2 (13:12):
It's purely performative, right? Like, what harm was done by
having tampons in those lavatories? Right? Like, who has been
damaged by this? I think one of the interesting trends
that we are going to start to see happen over
the next few years is AI and coding radically changing the
constraints these companies were built on. So let's be super
(13:36):
honest here. Part of why Silicon Valley companies started investing
more in these diversity initiatives is they just needed more employees.
You know, it's like the US military. The US military
cannot function today without reinstating the draft, without having women soldiers.
It's like, you can drive the women away,
you can drive the racial minorities away, but then we're
(13:57):
gonna have to bring back the draft. In the case
of Silicon value, you can't force people to work at
a tech company. But at the same time, with the
advent of AI, you don't need as many employees. Like
I think there will be companies that emerge that make
very different choices aesthetically. And I'm saying aesthetically because that's
what banning tampons, or having tampons, is in this case. Right,
(14:17):
you're saying, hey, we want to be a culture where the
way we envision happiness is being inclusive in the following
ways, or not being inclusive. It's going to
be possible for some companies to actually reach reasonable scale
by just having fewer employees. And so I think one
of the things that Facebook is probably looking at is,
you know, they purged a huge number of employees in twenty
(14:38):
twenty two. They've done it again over the last few years.
It seems like every single year they say their goal
is to decrease headcount. And they also are one of
the top companies at investing in large language models, and
so, you know, maybe they've made a decision where
they're like, we don't have to be a space anymore
that welcomes a diverse set of people, because we can
(14:59):
get away with it with AI. So I don't know if this
will become a larger trend, or if it's
going to be, like I mentioned before, that people are going
to go by the research and say, hey, if we
want people to take risks, if we want them to
act effectively at work, we're going to create spaces where
people feel safe. But who knows. It'll be an interesting
few years to see how it all plays out.
Speaker 4 (15:23):
Within days, Musk has not only moved to reshape the product,
but the company's culture itself.
Speaker 3 (15:29):
What Twitter prides itself on is a real human centric,
people first mentality.
Speaker 4 (15:35):
That's what Twitter Canada's then boss told us in twenty twenty.
Now Musk appears to be gutting the social media giant's
content moderation team, and research is already pointing to a
spike in racist slurs on the platform.
Speaker 1 (15:54):
You described Meta back in twenty twenty one as putting
profit over safety. Do you think that's the case now with
the changes we're seeing coming through, and where do you
see it going from here?
Speaker 2 (16:06):
So, I think one of the major trends that Elon Musk
unleashed on social media as a whole is, when he
reduced the size of the company by over fifty percent,
he fired many of them, many of them left of
their own accord. When he reduced the size of the
company so dramatically, he ended up showing Silicon Valley that
you could cut safety and there would be no consequences.
(16:27):
And when we look at what Mark said in twenty
twenty two, when he cut huge swaths of the safety teams,
he said Elon showed you could, quote, rip the band-aid
off, and we're going to do it too. And
so I think there's a real feeling that there's no
one forcing them to do even slightly the right thing. Right.
If they can evade the DSA, it doesn't matter how
(16:50):
little they invest in safety outside the United States. If
they can get on Trump's good side, they can evade
the consequences of the lawsuits that are stalking them. And
so I think they think that they can act any
way they want to. And part of that is there
are people on the Internet who have demonized any form
of action. They've lumped everything into one pot, even though
(17:11):
people like me have been out here since twenty twenty
one saying, hey, Facebook knows there are lots and lots of
ways to do safety that aren't content moderation. And those
interventions like designing for safety, making sure the products operate
holistically in ways that are safe, work much more effectively
than choosing which ideas are good or bad. But those
messages really get lost in a world where you have
(17:32):
so many loud voices on the Internet saying that the
word safety is synonymous with censorship. And Mark got up publicly,
you know, not more than a few weeks ago, and
said the word censorship over and over again in his announcement,
further cementing in people's minds that safety and censorship are
so intimately intertwined.
Speaker 1 (17:52):
Finally, I know you've spoken about some of Australia's measures
against the likes of Facebook, but if you could speak
to New Zealand lawmakers about the dangers of social media
and algorithms, what would you tell them?
Speaker 2 (18:04):
The most important thing for lawmakers to understand is that
we are facing wildly asymmetrical forces here, that open societies
are intrinsically vulnerable to being manipulated online. You know, we
value people's opinions. If you can find ways to cheaply
and at scale influence the opinions of people, you can
(18:24):
change the course of a democracy. And what we've seen
time and time again now, either through internal forces, because,
let's be honest, you know, we've seen in places like
Brazil or in Argentina that there are people who are very
effective at spreading their messages on social media and can go
and upend a political establishment. And so demanding transparency at
(18:46):
a minimum for things that impact national security. That's things
like foreign interference and manipulation of our social media platforms.
Ensuring that at a minimum we get transparency on national
security issues is not a nice-to-have for operating
a democracy in the twenty-first century. It's a must-have
if you want to continue to be
(19:07):
a place where we can have public deliberations, where we
can all work together to define what is the best path forward,
because otherwise countries that want to make us weak, that want
to see us divided, will divide us, and we'll lose
that magic that makes our democracies so special.
Speaker 1 (19:25):
Thanks for joining us, Frances. My pleasure. That's it for
this episode of The Front Page. You can read more
about today's stories and extensive news coverage at nzherald dot
co dot nz. The Front Page is produced by Ethan
Sills and Richard Martin, who is also our sound engineer.
(19:47):
I'm Chelsea Daniels. Subscribe to The Front Page on iHeartRadio
or wherever you get your podcasts, and tune in tomorrow
for another look behind the headlines.