
November 19, 2021 81 mins

Robert is joined again by Jamie Loftus to continue to discuss the Facebook Papers.




Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
Let's do it. Start the podcast podcast. All right, well
have the let's have that be what starts the podcast.
What we just said, let's start the podcast podcast. Let's
start the podcast. Well, I'm Robert Evans. Yep, I'm so
feeling I'm in. I never introduced myself. I'm Jamie. Who

(00:23):
are you? Jamie Loftus. Anything more we need to say?
Are we done with the episode? Anderson's here? No, I
think that. Yeah, yeah, Anderson is here. Uh sure, Well,
you know what's happening in the world. Facebook is happening
to the world, and it's not great. Jamie. It's not great.
Sophie not not a fan of the Facebook. We left

(00:45):
off having gone through some of the Facebook papers, particularly
employees attacking their bosses after January 6, when it became
clear that the company they were working for was completely
morally indefensible. And attack they did. Yeah, they already knew. I wouldn't call it attacking, either. I would call it—they were pretty— I mean, there's, like, the quote there,

(01:09):
there was the guy who's like, history won't judge us kindly. Uh, the guy who said we didn't ban Trump back in twenty fifteen, and that's what caused the Capitol riot. I mean, facts are facts.
Is that really attacking if you're just like, well, I
think that, Yeah, I think stating facts can be an attack. Well, okay,
put it on a T shirt. I mean for people

(01:31):
like this, you know, Um yeah, I think I think
stating facts can be an attack. And we ended Part
one by sharing some of the blistering criticisms of Facebook
employees against you know, management in the service itself. So
as we start part two, it's only proper that we
cover how Facebook responded to all of this internal criticism.
As I stated last episode, Facebook is in the midst

(01:51):
of a years-long drought of capable engineers and other
technical employees. They are having a lot of trouble hiring
all of the people that they need for all of
the things they're trying to do. Um. So, one of
the things is for a lot of these employees, when
they say things that are deeply critical, they can't just
dismiss the concerns of their employees outright. Because, like,

(02:12):
if they were to do that, these people would get
angry and they need them, right. Facebook's not in the
strongest position when it comes to the people who are
good engineers. They have to walk a little bit of
a tight rope. However, if they were to actually do
anything about the actual meat of the concerns, it would
reduce profitability and in some cases destroy Facebook as it
currently exists. So they're not going to do anything, um,

(02:32):
which is meant that they've had to get kind of
creative with how they respond. So Mark and his fellow
bosses pivoted and argued that the damning critiques— Like, yeah, old Zucky. Zuck. So when this all comes
out and people are like, boy, it sure seems like
all of your employees know that they're working for the
fucking Death Star. Um, Zuckerberg and his, like, mouthpieces made
a statement that like, all of these damning critiques from

(02:54):
people inside the company were actually evidence of the very
open culture inside Facebook, which encouraged workers to share their
opinions with management. Um. That's exactly what a company spokesperson
told The Atlantic when they asked about comments like history
will not judge us kindly. The fact that they're saying
we'll be damned by historians means that we really have a healthy office culture. Um. Hashtag Death Star proud. Death Star

(03:16):
proud everybody. Yeah. Yeah, it's like the fact that we
removed a stigma of working for the devil, right, I mean,
come on, the devil I would be proud to work
for because he's done some cool stuff, Like have you
ever been to Vegas? Nice town? Oh, I've been to Vegas.
I've seen I saw that. I saw the Backstreet Boys
in Vegas right before two of them were revealed to

(03:37):
be in QAnon. So really caught the end of that one. Wow, I did not realize that a sizeable percentage of the Backstreet Boys had gotten into QAnon. That makes total sense. The Backstreet Boys—they're from Florida.
They're ultimately five men from Florida. So what can you do?
As the author of that article, The Atlantic article noted,
this stance allows Facebook to claim transparency while ignoring the

(04:00):
substance of the complaints and the implication of the complaints
that many of Facebook's employees believe their company operates without
a moral compass. All over America, people used Facebook to
organize convoys to d C and to fill the buses
they rented for their trips, and this was indeed done
in groups like the Lebanon, Maine Truth Seekers, where Kyle Fitzsimons posted the following quote: this election was stolen,

(04:20):
and we're being slow walked towards Chinese ownership by an
establishment that is treasonous and all too willing to gaslight
the public into believing the theft was somehow the will
of the people. Would there be an interest locally in
organizing a caravan to Washington, d C. For the electoral
College vote count on January six? Um. Yeah, and Kyle recently pleaded not guilty to eight federal charges, including

(04:40):
assault on a police officer. Mark Zuckerberg would argue that
like Facebook didn't play a significant role in organizing January six,
and couldn't have played a significant role in radicalizing this
guy and many other people. But the reality is that
part of what led Kyle Fitzsimons to go assault people on January 6, um, was the fact that he had been

(05:02):
radicalized by a social network that for years made the
conscious choice to amplify angry content and encourage anger because
it kept people on the site more right, Like all
of the anger that boiled up at January six that
came from a number of places, but one of those
places was social media. Because social media profited and specifically,
Facebook knowingly profited from making people angry. That was the business,

(05:25):
and of course it blew up in the real world.
I have a question, um, just out of your own
experience and observation, which is, how do you, um like,
if you're doing a side by side case study of
how Facebook responded to events like this versus how like
YouTube slash Google responded to radicalization? Are there like significant differences?

(05:47):
Did anyone do better, or different? Twitter has done better than probably most of them. And again, I'm not saying that Twitter has done well, um, or that YouTube has done well.
They've both done, particularly with coronavirus disinformation, a bit better
than Facebook. Um, and they were they were better in

(06:08):
general on— Not really YouTube as much, but, like, Twitter has definitely been the most responsible of the social networks around this stuff. Um. It did seem
like for a while there the various networks were kind
of like duking it out to see who could do
the absolute worst and damage the most lives, and it seems like Facebook won. Yes, that I would say. Facebook,

(06:30):
but—and again—Twitter chose to do a lot of the same toxic things Facebook did, so did YouTube, and
they did it all for profit. A number of the
things we've criticized Facebook for, you can critique YouTube and Twitter for.
I would argue Twitter certainly has done more and more
effectively than Facebook, not enough that they're not being irresponsible,

(06:51):
because I would argue that Twitter has actually been extremely
irresponsible and knowingly so. Um. But I think Facebook, in
my analysis, Facebook has been the worst. Although I'm not
as— I haven't studied as much about, like, TikTok yet, so we'll see. But that's my analysis. You've got to
get on TikTok, pivot out of podcasting and into TikTok dances. Yeah,

(07:12):
I mean, I'm it's it's not the dances that concerned
me on TikTok. It's the minute long conspiracy theory videos
that have convinced a number of people that the Kardashians
are Armenian witches and had something to do with the
collapse at Astroworld, or the deaths at Astroworld. My concern there is the dances that
go over those conspiracy videos and really marry the worst

(07:35):
of both worlds. Because I have seen dancing on there.
I have seen conspiracy videos that involve dancing. Incredible. And
skincare routine. Have you ever seen a conspiracy video where
someone's also doing their skincare routine? Because that is a
thriving genre, I'm sure, yeah. So— Well, I was like,

(07:58):
that is just a thing on many platforms. These
media companies are willfully bad at stopping radicalization because making
people angry and frightened is good for all of their
bottom lines, so they all knowingly participate in this. I
think Facebook has been the least responsible about it, but

(08:21):
that doesn't that shouldn't be taken as praise of anybody,
like saying Twitter. Saying Twitter has done the best is
saying like, well, we were all drunk driving, but John
could actually walk most of a straight line before vomiting,
so he was the least irresponsible of us who drunk
drove that night. Just to put it in terms that
I understand. I was— It sounds like Twitter is the

(08:43):
Backstreet boy that's like, look, I don't believe in QAnon, but I see their points. That's kind of, um, the vibe I'm getting. Fair enough. Um. So, when deciding which posts should show up more often in the feeds of other users, Facebook's algorithm weighs a number of factors. The end goal is always the same,
to get the most people to spend the most time

(09:04):
interacting with the site. For years, this was done by
calculating the different reactions a post got and weighting it based on what responses people had to it. Again, for years, the reaction that carried the most weight was anger—the little snarling smiley face icon you could click under a post. It was at one point being weighted five times more than just, like, a like. Really? Like the— Again,

(09:24):
when I'm saying this was all intentional: they were like, people who respond angrily to posts—that keeps them on the site more. They spend the most time engaging with things that make them angry. So, when it comes to determining, by whatever methodology, how we choose to have the algorithm present people with posts, the posts that are making people angriest are the posts

(09:44):
our algorithm will send to the most people. That's a conscious choice. That's a conscious choice. Yeah, it's so funny— I mean, not funny, it's tragic and upsetting. But just
how specific the Facebook audience is that it's like you
would have to be the kind of person who would
be like, I'd better react angry, as specific as possible, in my feedback to this post. Which is FarmVille

(10:04):
moms. And— Yeah, it's boomers. It's boomers. And yeah, um, they just kind of knowingly set off a bomb in a lot of people's fucking brains. Um.
They're addicted to telling on themselves. Yeah. Why— So, Facebook has something called the Integrity Department, and these are the
people with the unenviable task of trying to fight misinformation

(10:25):
and radicalization on the platform. They noted in July that
is so embarrassing. Yeah, just going on the first day
be like, I work for the Facebook Integrity Department, Like, yeah,
good fucking luck. Yeah. It's like—
My job is to go door to door and apologize
to people after we bomb them. We have gift baskets

(10:46):
for the survivors, you know, Like it's that that's the gig. Really. Um, yeah,
I send edible arrangements to people who have been drone-struck. Like, oh Jesus, awful. There was— One of my favorite follows on Twitter is Brooke Binkowski, um, who used to
work for Facebook and was like one of the people
early on who was trying to warn them about disinformation

(11:06):
and radicalization on the platform years ago and left because
like it was clear they didn't actually give a shit.
UM And and a lot of the Integrity department people
are actually like really good people who are a little
bit optimistic and kind of young, and come in like, okay, it's my job to make this huge and
important thing a lot safer. Um. And these people get

(11:28):
chewed up and spit out very very quickly. Um. And
the members of the integrity team, um, were kind of analyzing the impact of weighting angry content so much. And some of them noted in July the extra weight given to the anger reaction was a huge problem. They recommended the company stop weighting it extra in order to stop
the spread of harmful content. Their own tests showed that

(11:50):
dialing the weight of anger back to zero so it
was no more influential than any other reaction, would stop
rage-inducing content from being shared and spread nearly as widely. This led to a five percent reduction in hate speech, misinformation, bullying,
and posts with violent threats. And when you consider how
many billions of Facebook posts there are that's a lot
less nasty shit, um, some of which is going to

(12:11):
translate into real world violence. And again this was kind
of a limited study, so who knows how it would
have actually affected things in the long run. So Facebook
made this well less money question mark. Yeah, this actually
was kind of a win for them. Um. Facebook did
make this change. They pushed it out in September. UM,
and the employees responsible deserve real credit. Again, there's people
within— Um, there's people within Facebook who did things that

(12:34):
really actually were good. Like, changing this probably made the world a bit healthier. That said, the fact
that it had been weighted this way for years, you
don't undo that just by dialing it back now. For
one thing, anger has become such an aspect of the
culture of Facebook that even without weighting the anger emoji,

(12:55):
most of the content that goes viral is still stuff
that pisses people off, because that's just become what
Facebook is, because that's what they selected for for years.
Also, like, who knows—if they'd done this years ago, if they'd never weighted anger more, it
might be a very different platform with like a very
different impact on the brains of for example, our aunts
and uncles. Um I think that that's really interesting too,

(13:16):
because that timeline lines up pretty quickly with or pretty
exactly with where it feels like a lot of younger
people were leaving that platform and the platforms became associated
with older people. Because I feel like I don't think
I was using Facebook consistently after twenty seventeen. I want
to say it was maybe my last Facebook year. Yeah

(13:37):
I stopped. I mean I I stopped visiting it super
regularly a while back. Um, yeah, maybe around then. So in April of 2020, Facebook employees came up with another recommendation. This
one wouldn't be as successful as, you know, changing the algorithm's weighting of the angry reaction. Spurred by the lockdown and the sudden surge of QAnon,

(13:58):
Boogaloo, and anti-lockdown groups urging real-world violence,
It was suggested by internal employees that the news feed
algorithm deprioritize the showing of content based on the behavior of people's Facebook friends. So the basic idea is this. Um— What Facebook was doing was— Normally, the way you'd think it would work, right,

(14:19):
is that like your friends post something and you see
that in your news feed, right, like the posts of
the people that you've chosen to follow and say are
your friends? Right? That's how you would want it to work, and that is how it worked at one point. They made a change a few years back where they started sending you things not because someone you followed had said something, but because they'd liked a thing, um, or they'd commented—like, not even commented, just

(14:41):
liked a thing. Like, if they'd reacted to a thing, you would get that sent to your
news feed UM. And members of the Integrity Team start
to recognize, like, this has some problems, um, in it. For one thing, it results in a lot of people
getting exposed to dangerous bullshit UM. So they uh, they
start looking into like the impact of this and how

(15:04):
how just sharing the kind of things your friends are
reacting to influences what you see and what that does
to you on Facebook. The Integrity Team experimented with how changing this might work, and their early experiments found that fixing this would reduce the spread of violence-inciting content. Um.
For one thing, what they found is that like normally
if you hadn't seen someone like a post about something

(15:27):
that was maybe, like, violent or aggressive, or conspiratorial—like a flat-earth post, or a post urging the execution of an elected leader—if you hadn't seen anyone that you
knew react to that post, even if you saw it,
you wouldn't comment on it or share it. But they
found that, like if you just saw that a friend
had liked it, you were more likely to share it,
which increases exponentially the spread of this kind of violent content.
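A rough sketch of the compounding effect being described, purely as illustration: all the numbers here (the base share rate, the boost from seeing a friend's reaction, the audience reached per reshare) are made-up assumptions. Only the mechanic—visible friend reactions raising your own odds of sharing, hop after hop—comes from the episode.

```python
# Illustrative sketch only: a toy cascade model of reshare spread.
def share_probability(base: float, friend_reacted: bool, boost: float = 3.0) -> float:
    """Chance a viewer reshares; boosted if a friend's reaction is shown."""
    return min(1.0, base * (boost if friend_reacted else 1.0))

def expected_shares(seed_viewers: int, base: float, friend_reacted: bool, hops: int) -> float:
    """Expected reshares over a few hops; each reshare reaches new viewers."""
    p = share_probability(base, friend_reacted)
    total, audience = 0.0, float(seed_viewers)
    for _ in range(hops):
        shares = audience * p      # viewers who reshare at this hop
        total += shares
        audience = shares * 20     # assume each reshare reaches ~20 new people
    return total

# Same post, same starting audience; surfacing friends' reactions compounds:
print(expected_shares(1000, base=0.01, friend_reacted=False, hops=3))  # ~12.4
print(expected_shares(1000, base=0.01, friend_reacted=True, hops=3))   # ~58.8
```

Because the boost multiplies at every hop, even a modest per-viewer effect produces the kind of outsized spread the integrity team measured.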

(15:47):
And it's this idea, like how, at a certain point, people stopped being as afraid to be openly racist as they had been earlier, and it led
to this surge in real world violence. It is kind
of the same thing: by seeing their friends react to this, people felt permission to react to it too, in a way maybe they wouldn't have. Like, well—maybe I'm interested in flat-earth shit, but I'm just going to ignore this because, like, I

(16:08):
don't want to seem like a kook. That is so
fucking upsetting and and fascinating in the way that it
affects your mind. Yeah, there was a time where, if you were, you know, racist, misogynist, homophobic, whatever you were, you just didn't talk about it.
But then all of a sudden, there's this confirmation that like, hey,

(16:29):
this person you know and see all the time feels
the same fucking way you do. So why be quiet
about it. Let's discuss. Like it's just that's so dark.
It's really dark. And so the integrity team sees this and they're like, we should change this. Um, we shouldn't be showing people just, like, the reactions their friends have had to content, because
it seems to be bad for everybody. Um. And they

(16:52):
do find, in some of their, you know— Because when they experiment, they're like, we'll take this country or this city and we'll roll this change out in this limited geographic location to, like, try and see how it might affect things at scale. And they do this, and they see that, like, oh, changing this significantly reduces the spread of specifically violence-inciting content. So they're like, hey, we should roll this out service-wide. Zuckerberg himself steps in.

(17:14):
According to Frances Haugen, the whistleblower, he quote rejected this intervention that could have reduced the risk of violence in the election. From The Atlantic, quote: an internal message characterizing
Zuckerberg's reasoning says he wanted to avoid new features that
would get in the way of meaningful social interactions. But
according to Facebook's definition, its employees say engagement is considered

(17:35):
meaningful even if it entails bullying, hate speech, and re-shares of harmful content. The episode, like Facebook's response to the incitement that proliferated between the election and January 6,
reveals a fundamental problem with the platform. Facebook's mega scale
allows the company to influence the speech and thought patterns
of billions of people. What the world is seeing now
through the window provided by reams of internal documents is
that Facebook catalogs and studies the harm it inflicts on people,

(17:58):
and then it keeps harming anyway. See that that is
so that's always um so interesting to hear. And by interesting,
I mean, you know, psychologically harmful, but uh, because it's like, yes,
that is a fundamental flaw of the platform, but that's
also very entrenched into like what the DNA of the
platform always was, which was based on harshly judging other people.

(18:23):
Like that's why Mark Zuckerberg created Facebook, was to harshly
judge women in his community. So it's like, I I
know that it is you know, on a bajillion scale
at this point, but I'm always kind of stunned at
how people are like, oh, it's so weird that this
went you know, the way that it did. It's like, well,

(18:45):
to an extent um, it was always like that, and
maybe it was like cosplaying as not being like that,
and for certain people there were eras in Facebook where
your user experience wouldn't be like that. But it you know,
this goes back almost twenty years at this point, of this being in the DNA of this, um, this shitshow. Yeah,

(19:06):
and it's it's really bleak. Um. It's just really bleak.
And it also goes to show, like— It's one of the things Zuckerberg will say repeatedly when he talks about— When he does admit, like, yes,
there are problems and there have been like negatives associated
with the site, and we're aware of that, that's humbling.
but like, you know, you also have to include all
the good that we're doing, all of the meaning and

(19:27):
the way he always phrases this is like all of
the meaningful social interactions that wouldn't have happened otherwise. And
then you realize, every time he says that—meaningful social interactions that have taken place on Facebook—what he's including, as these internal documents show, is bullying and people, like, making death threats and, like, talking about their desire to murder people. Like, that's a meaningful interaction. People getting angry and trying to incite violence together is

(19:49):
a meaningful social interaction. Which, I guess— Yeah, I mean, not meaningless; it has meaning. Klan meetings were meaningful social interactions, you know. Um, you gotta give the KKK that. Uh, the Nuremberg rally was meaningful interacting. The
last meaningful interaction I had on Twitter led to like

(20:11):
a rebound I was dating coming to my grandma's funeral
blackout drunk, so you know, just man, it's been too
long since I've shown up at a funeral just too
drunk to stand. It's still one of my favorite memories with my family to this day. They're like,

(20:31):
who is this guy? And I'm like, I don't really know.
He's drunk. He came on the Megabus. He did the Megabus, getting drunk from a CamelBak on the Megabus. Yeah,
that would be when I used to do a lot
of bus trips, like when I was traveling and stuff.
That would be one of the tactics: you'd fill,

(20:52):
like, a thermos or a CamelBak with, like, cranberry juice and liquor. And just— Oh, I mean, I get that. I'm not above getting fucked up on a Megabus. You know, on your way to my grandma's funeral. That was—
That was a move. Me and my friends got like

(21:12):
wasted in San Francisco one day, just like going shopping
in broad daylight with a camel back, where we would
we would get a bottle of orange flavored Trader Joe's
Patron tequila, and we would get a half dozen lime
popsicles and you just throw the popsicles in with the
patron and the camel back, and throughout the day it
melts and you just have a constant cold margarita. It's actually fucking amazing. We were wrecked. Yeah, I recommend

(21:39):
it heavily. You will get trashed and people don't notice
dude walking around with a CamelBak in San Francisco. Nobody gives a shit. Oh my god, you're basically camouflaged. Yeah. There— You know who else is camouflaged? The products
and services that support this podcast, camouflaged to be more
likable to you by being wrapped in a package of

(22:00):
of the three of us. That's how ads work. I
thought you were saying that you were taking ads from
the U. S. Army recruitment center. Again, I mean, it's
entirely possible. Um but but but at the moment, we're
just camouflaging. I don't know whoever, whoever comes on next,
whoever comes on next, you'll feel more positively about because
of our presence here. That's how ads work. It's good stuff.

(22:29):
Oh we're back. My goodness, What a good time we're
all having today. How are you doing? You making it okay? You made that sound sarcastic. I am having
a good time. I'm glad. I'm happy that you're having
a good time. That's that's my only goal for for
this show and for you: a good time. See, now you're doubling down on it, and I'm getting insecure.
doubling down. And I'm also talking more and more like

(22:50):
an NPR talking head as I get quieter by the beat. Now I'm going to start having a panic attack. I've never heard you talk like this. I know. This is how I talk to my cats when I'm angry at them. Honestly, I feel like we do have that dynamic. I feel like I'm a cat that you get angry at sometimes. Yeah, because you jump on my desk and knock over my Zevia. It's for attention,

(23:12):
I know. But I've got to work to keep you
in expensive cat food. I only feed my cats the
nice wet food. I would rather have your attention than
really nice food. That's not what my cats say. Um. So,
there's just a shitload to say about how Facebook negatively
impacts the increasingly violent political discourse in the United States

(23:32):
and how they helped make January six happen. But I
think the way I'd like to illustrate the harm of
Facebook next is a bit less political. Um it also
occurs in a different Facebook product. I've been talking about Facebook the company generally, and about Facebook the platform. But now we're
going to talk about Instagram. In Part one, I mentioned
that young people felt that removing likes from Instagram temporarily
corresponded with a decrease in social anxiety. The impact of Instagram,

(23:55):
specifically on the mental health of kids and teens can
be incredibly significant. One of the other Facebook internal studies
that was released as part of the Facebook papers was
conducted by researchers on Instagram UH. The study, which again
almost certainly would never have seen the light of day
if a whistleblower hadn't released it, found that thirty-two percent of teen girls reported Instagram made them feel worse about their bodies. Twenty-two million teenagers in

(24:18):
the United States log onto Instagram on like a daily basis.
So that's millions of teen girls feeling worse about their
body because of Instagram. I've never been less surprised. Well,
good news, it gets worse like no fucking kidding. These
researchers released their findings internally in March of 2020, noting

(24:40):
that comparisons on Instagram can change how young women view
and describe themselves. Again not surprising. So company researchers have
been investigating the way that Instagram works for quite a while—years; um, about three years that they've been doing this seriously—and their previous findings all back up the same central issues. Photo sharing in particular is harmful to teen girls, a 2019 report concluded: we make body

(25:03):
image issues worse for one in three teen girls. Its
findings included this damning line, teens blame Instagram for increases
in anxiety and depression. This reaction was unprompted and consistent
across all groups. So like they almost always mentioned that
this app specifically makes them feel worse about their body,
and we don't have to prompt them at all, Like
this just comes up when they talk about Instagram. I mean,

(25:25):
that's— Truly, it's so— Sophie, I don't know how you feel. I mean, I truly think that—because I've been on Instagram since, what, like— I think I got on it earlier. It was around when we
were in high school. I truly think my my life
and my relationship to my body would be very different

(25:46):
had not been on that app for the better part
of a decade. Yeah, I mean, I mean, especially when
they introduced filters. Yeah, we're about to talk about that. So here's the kicker, and by kicker, I mean the bleakest part: among teens who reported suicidal thoughts, thirteen percent of teens in the UK and six percent of the teens in the United States claim their desire to kill themselves

(26:09):
started on Instagram. That's fucking disgusting and terrifying. That's pretty bleak.
I just, like— I wish I were more surprised. Yeah, but it's good to have this data. Um. The data shows that more than forty percent of Instagram users are less than twenty-two years old. Um, which means you've got twenty-two million teens logging onto

(26:31):
the service in the US every day. Six percent of
those people becoming suicidal as the result of Instagram is
one point three two million children who once started wanting
to kill themselves while using Instagram. Hey, everybody, Robert Evans here,
and I actually screwed up the math that I just cited,
which is often the case when I do math. So
anytime I do math of my own in an episode,
you're right to question me. UM. I was calculating six

(26:54):
percent of twenty two million basically, Um. But as the
study noted, it's six percent of kids who are suicidal who say that their suicidal feelings started on Instagram. So I wanted to recalculate that. Um, about seventy to seventy-six percent of American teens, kind of depending on the source, use Instagram. Um, there are about forty-two million teenagers

(27:16):
in the United States, um, so I calculated from that. And about eighteen percent—or about nineteen percent—of high school students, of, like, teenagers, um, seriously considered attempting suicide. So, if we're just counting people who seriously considered attempting suicide, that's, um, five million, seven hundred forty-five thousand,

(27:40):
six hundred teens who seriously considered suicide.
If six percent of those kids had their suicidal feelings
start on Instagram, that's three hundred and forty four thousand,
seven hundred and thirty six children in the United States
whose suicidal feelings started on Instagram. Um. And I furthermore
found that about nine percent of kids who seriously attempt

(28:00):
suicide—or seriously consider suicide—attempt it. So of those three hundred forty-four thousand, seven hundred and thirty-six American teens whose suicidal feelings started on Instagram, about thirty-one thousand and twenty-six kids attempt suicide. Um. So, about

(28:21):
thirty-one thousand kids in the United States on an annual basis attempt suicide because of suicidal feelings that started on Instagram.
So that is the more accurate look at the data.
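For anyone who wants to check the correction, here is the arithmetic as a worked calculation. The inputs are the figures cited above; the seventy-two percent Instagram-use rate is an assumption picked from within the seventy-to-seventy-six percent range he mentions, chosen because it reproduces his stated totals.

```python
# Worked version of the corrected math above; rounding follows the on-air figures.
us_teens           = 42_000_000  # roughly 42 million US teenagers
instagram_share    = 0.72        # assumed from the ~70-76% range cited
considered_suicide = 0.19        # ~19% seriously considered attempting suicide
started_on_ig      = 0.06        # 6% trace those feelings to Instagram
attempt_rate       = 0.09        # ~9% of those who seriously consider it attempt

teen_ig_users = us_teens * instagram_share            # 30,240,000
considered    = teen_ig_users * considered_suicide    # 5,745,600
ig_origin     = considered * started_on_ig            # 344,736
attempts      = ig_origin * attempt_rate              # ~31,026

print(round(considered), round(ig_origin), round(attempts))
```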
And I apologize, as always, for the error. But what's interesting is that these studies do document that Facebook is as physically harmful at scale as, like, a wide

(28:45):
variety of narcotics. Like, most narcotics probably are less harmful at scale, physically, than Instagram. Um, I think weed certainly is. Um. So— My god, if every teenager was
smoking weed instead of doom scrolling Instagram, the world would
just be so funny. If they were chain-smoking cigars instead of being on Instagram. It's so weird

(29:09):
because I think about, like how I don't know whatever.
Like I'm in my late twenties, so I feel like
I have like a little bit of memory of like
what life was like before you were constantly being encouraged
to compare yourself to every single person you've ever met
in your life, regardless of whether you know who they are,

(29:31):
how they are, whatever. Um, and—just call me nostalgic, but, uh, I liked how I felt better. Yeah. Like,
it's so absurd how much I know about people I
don't give a shit about, and how bad it makes
me feel to know about the curated lives of people
that I don't give a shit about, and how I

(29:51):
let that actively affect my daily life. And it's just yeah,
it's just fucking miserable. It is. It's horrible. It's horrible.
Um that said, I like Florida on the application, so
you know, it's— Sure. Now here's why, despite the documented harm that Instagram does, nothing's ever going to change. Um. As I stated, twenty-two million US teens use Instagram daily;

(30:13):
only five million log on to Facebook. So Instagram is
almost five times as popular among teenagers as Facebook, which kids are leaving in droves. So Facebook, Mark Zuckerberg's invention,
is now definitively just the terrain of the olds. And
Facebook knows that kids are never going to come back,
because that's not how being a kid works. Like, you don't get them back. They're going to
(30:33):
like that, You don't get them back. They're going to
continue to do new ship. Eventually they'll leave Instagram for
something else. You know, That's just the way it fucking goes.
Unless the thirty-year nostalgia cycle is like, Facebook is actually back now. It's actually— And I don't think it gave anybody a good enough experience to have that. It's not the fucking Teenage Mutant Ninja Turtles. Yeah. No,

(30:53):
one's getting dopamine hits. That's a good— Yeah, it's not like Flamin' Hot Cheetos. Nobody's thinking fondly back to scrolling Facebook when they were seven. They're thinking back to, I don't know, SpongeBob SquarePants. Um— Oh, but at
the moment, Instagram is very popular with teens, and Facebook
knows that if they're going to continue to grow and
maintain their cultural dominance, they have to keep bringing in

(31:15):
the teens. They have to keep Instagram as profitable and
as as addictive as it currently is. Um. And that's
why they bought Instagram in the first place. They only
paid like a billion dollars for it. It was an incredible investment, um. And they'd spend that in a day. Yeah,
that's that's cheap as hell for something as influential and
huge as Instagram. Yeah. I wonder do you know what

(31:38):
it's worth now? I would guess significantly more than a
billion dollars, um, but I don't entirely know the value. But Facebook's like a trillion-dollar company now. Yeah, they're very— Sucks. And it's— Well, but that includes Instagram, you know. Okay, so yeah. And, you know,
teens are one of the most valuable demographics to have
for advertisers, and Instagram is where— The number, its

(32:02):
estimated value is two billion. Yeah, that's a good investment. A good investment, if money is what you're after, yeah. Um. So, the
fact that so much is at stake with Instagram, the
fact that it's such a central part of company having
any kind of future, is part of why Mark and
company have been so compelled to lie about it. None
of this stuff that we've been talking about was released

(32:24):
when Facebook researchers got it, of course not. They wouldn't
want anyone to know this ship. In March, Mark took
to Congress, where he was criticized for his plans to
create a new Instagram service for children under thirteen. He
was asked if he'd studied how Instagram infects affects children,
and he said, I believe the answer is yes. So
not yes, I think we've studied that. He told them. Then,

(32:47):
the research we've seen is that using social apps to
connect with other people can have positive mental health benefits.
And I'm sure there's something that he's gotten paid researchers
to come up with that he can make that case
off of. Um, I'm sure in certain situations it may
even be true. There are ways you can use social
media that are good. I mean, I've legitimately smiled or
had my heart warmed by things that happened on social media.

(33:09):
It doesn't not happen. Um, And I do think that
there is a case for like I mean, and it's
you can't credit Mark Zuckerberg with it, but just I
mean going back to fucking, like, LiveJournal days of
just like friendships that have deepened as a result of
social media. That's definitely a thing. But the costs outweigh the benefits there by quite a bit. Yeah. Um,

(33:32):
it's it's great. So um. So Mark goes on to say,
you know, I I think we've got research that shows
that can have positive mental health effects. You know, I
think we've studied whether or not how it affects children. Um.
But he doesn't talk about— He leaves out all the statistics, like, about all the kids
whose suicidal ideation starts on Instagram, they had that data

(33:52):
when he went before Congress. He just didn't mention it.
They hadn't told anyone that shit. Like, he didn't say
a goddamn word about it. Yeah, he was like, yeah,
I think we've looked into it, and you know, there's
some ways in which it can be healthy. Not: and also, one point three million American kids became suicidal because of our app. Like, he did not throw that in. Why didn't he throw that in? I mean, truly, I'm, like,

(34:13):
up in the air of, like: did he not say that because he didn't want people to know? Or did he just not say it because he heard it and he didn't care and he forgot? Like, you just don't know with that guy. That is so fucking evil. Woah. It's
pretty great. Um, and we'll talk more about that later.
In May, Instagram boss Adam Mosseri told reporters that he
thought any impact on teen well being by Instagram was

(34:36):
likely quote quite small, based on the internal research he'd seen. Again,
they hadn't released this research. He's saying, oh, we have
and it says that any kind of impact on well
being is pretty small. And again, the actual research by this point showed thirteen percent of kids in the UK and six
percent of kids in the United States were moved to
thoughts of suicide by Instagram, which I would not call small.
I would not call— I wouldn't necessarily say

(34:57):
it's huge. But that is not a small impact. Um, No,
that is like thousands and thousands, and yeah, that's significant.
The Wall Street Journal caught up with Mosseri after the
Facebook papers leaked, so they were able to like drill
him on this a bit, and he said a bit more. Quote,
in no way do I mean to diminish these issues.

(35:17):
Some of the issues mentioned in this story aren't necessarily widespread,
but their impact on people may be huge, which is like,
again a perfect nonstatement. That's right. They're like, but what
about the thing we couldn't possibly gauge at all versus
the thing that we did and we're actively distancing ourselves from.
I mean those statistics that's like at least one kid
in every classroom, Like that is gigantic. And when you

(35:40):
read the responses of guys like Mosseri and compare them to the responses of people like Mark Zuckerberg and official corporate spokespeople, it's very clear that they're working from the same playbook, that they're very disciplined in their responses. Because Mosseri does try to tell the Journal, um, that he thinks Facebook was late to realizing there were drawbacks to connecting people in such large numbers. But then
he says, I've been pushing very hard for us to

(36:01):
embrace our responsibilities more broadly, which again says nothing. He
then pivots from that to stating that he's actually really
proud of the research they've done on the mental health effects on teens, which, again, they didn't share with anybody, and, I would argue, lied about by omission in front
of Congress. Um, He's proud of this because he says
that shows Facebook employees are asking tough questions about the platform.

(36:22):
Quote for me, this isn't dirty laundry. I'm actually very
proud of this research, which is the same thing Zuckerberg
said about his own employees damning the service after January 6. Right. I'm gonna say that's the same exact thing as the—as the, like, actually damning stuff, you know: talking about how working for the Death Star is bad, um, is, like, evidence of, oh, the Death Star actually

(36:42):
has a really open work culture. Like, no. I don't know,
I feel like there are a few. There are not
many CEOs that are good at flipping a narrative. But
Mark Zuckerberg is particularly bad at it. Yeah, and it's
I mean, part of why they can be bad at
it is it doesn't really matter um, or at least

(37:03):
it hasn't fucking mattered so far. Um. But the pattern's— I mean, not enough to get a better figurehead. Like— Yeah, the pattern's pretty clear here. Um. When a scandal comes out,
deny it until the information that can't be denied leaks out,
and then claim that whatever is happening at the site,
whatever, like, information you had about how harmful it is,

(37:23):
is a positive because it means that you were trying
to do stuff about it, even if you actually rejected
taking action based on the data you had and refused
to share it with anybody else. Massari and Zuckerberg were
also careful to reiterate that any harms from Instagram had
to be weighed against its benefits, which I haven't found
a ton of documentation on. In fact, as the Wall Street Journal writes: In five presentations over eighteen months, to

(37:44):
this spring, Facebook researchers conducted what they called a teen mental health deep dive and follow-up studies. They came to the conclusion that some of the problems were specific to Instagram and not social media more broadly.
This is especially true concerning so-called social comparison, which is when people assess their own value in relation
to the attractiveness, wealth, and success of others. Social comparison
is worse on Instagram, states Facebook's deep dive into teen

(38:07):
girl body image issues in 2020, noting that TikTok, a short
video app, is grounded in performance, while users on Snapchat,
a rival photo and video sharing app, are sheltered by
jokey features that keep the focus on the face. In contrast,
Instagram focuses more heavily on the body and lifestyle, the March 2020 internal research states. It warns that the Explore page,

(38:27):
which serves users photos and videos curated by an algorithm,
can send users deep into content that can be harmful.
Aspects of Instagram exacerbate each other to create a perfect storm.
the research states. Yeah, I mean, again, not a shocking revelation over here. There it is. I mean, I do think that that lets TikTok and Snapchat get

(38:50):
off easy. Like—there is absolutely, uh, toxic body image culture on there. And I feel like thinspo will thrive on any platform it fucking gloms itself onto.
But Instagram is particularly bad because it's like where so
many lifestyle people have launched and and there's so many

(39:11):
headless women on Instagram. It's it is shocking. There's so
many like not like um, not like you macheted my
head off, but like you're not encouraged to show your
head by the algorithm, which sounds weird, but it is true.
Like, it is just very focused on how
you physically look. And then there's also this tendency to

(39:33):
like tear people apart if they have edited their body
to look a certain way, when it's like, well, the algorithm rewards editing your body to look a certain way and doing all this. And you do bring up a good point, where it's, like—it's frustrating that it's important to critique Facebook in relation to its competitors like TikTok and Snapchat. Um. That can lead

(39:57):
to the uncomfortable situation of like seeming to praise them
when they haven't done a good job. They just haven't
been as irresponsible. It's kind of like attacking, like, Chevron. If you look at all of the overall harms, including, like, their impact in covering up climate change, maybe they're the worst of the big oil and gas companies. I don't know, it's debatable. But it's like, if you're criticizing Chevron specifically, you're not saying that

(40:18):
BP is great. You're just being like, well, these are
the guys specifically that did this bad thing, and they
were the leaders in this specific terrible thing. Other bad
things are going on, but we can't—like, the episode can't be about how bad everyone is. We're talking about Facebook
right now, we have these documents from inside Facebook. I'm
sure versions of this are happening everywhere else. Listeners, in
your everyday life, just don't use Facebook as a yardstick

(40:41):
for morality, you know, because that just ends up letting a lot of people off for a lot of fucked-up stuff. I would say, in your regular life, don't use Facebook. That's all the sentence we need there. Um. Wow.
So, you asked me— You were talking earlier about, like— Because Mark went up in front of Congress and was like, yeah, I think we've got research on this, and I've definitely

(41:03):
seen research that says it's good for kids. We know everything I just stated, that quote I just read—everything that's in those internal studies. Um, we know that Mark saw this. We know that it was viewed by
top Facebook leaders because it was mentioned in a presentation
that was given to Mark Zuckerberg himself. We know that
when in August, Senators Richard Blumenthal and Marsha Blackburn sent

(41:25):
a letter to Mark Zuckerberg asking him to release his
internal research on how platforms impact child mental health, we
know that he sent back a six page letter that
included none of the studies we've just mentioned. Instead, the letter said that it was hard to conduct research on
Instagram and that there was no consensus about how much
screen time is too much. Meanwhile, their own data show
that Instagram users who reported feeling unattractive said that the

(41:46):
feeling began while they were on Instagram. Facebook's own internal
reports showed that their users reported wanting to spend less
time on Instagram but couldn't make themselves. And here's a
quote that makes it sound like heroin: Teens told us
they don't like the amount of time they spend on
the app, but feel like they have to be present.
They often feel addicted and know that what they're seeing
is bad for their mental health but feel unable to
stop themselves. That's Facebook writing about Instagram like that's that's

(42:11):
their own people saying this. Like, this is not some activists getting in here, you know. That's— So, I mean, I guess good on them, regardless of the level of self-awareness going on there. Um. That's— I mean.
And what I was thinking about earlier when it comes
to any time Zuckerberg is in front of Congress or
in front of political officials, I feel like for a

(42:32):
lot of people, the takeaway and the thing that gets
trending is how little political officials and members of Congress
understand about how the Internet works. And that's like the
funny story is like, oh, Mark Zuckerberg talked about an algorithm,
and they and you know, like this is this comes
up all the time, It comes up on v became

(42:54):
up on Succession—of just, like, how not-internet-literate the majority of people who guide how the Internet works are. And it's like, it almost becomes like a hee-hee, ha-ha, old guy doesn't know how an algorithm works.
But it's like, well, the consequence of that is that
it ends up making Mark Zuckerberg look way cooler than
he is and it also doesn't address the problem at

(43:14):
all of like, no, Mark Zuckerberg is omitting something gigantic here,
and the majority of our you know, lawmakers in Congress
don't have the fucking, you know, cultural vocabulary to even
understand that. And that is like and and I guess
it's like it makes for a couple of memes, but
it's just like, no, this is bad. Can you commit

(43:38):
to cancel finsta? Do you remember that? Who was that? So sad, right? "Cancel finsta." I mean, I think that's the most recent one, where it's like, okay, yeah, that is, you
objectively funny. But but like the consequence of that is
I mean, that's ultimately a win for Instagram. And that's
a win for Facebook because it makes them look like

(44:00):
they're operating on a level of the fucking government doesn't understand.
And meanwhile, you know, one kid in every classroom is
suicidal, as a result of the inability of, like, lawmaking officials to understand the effect that this has.
It's just it makes me real mad, Robert. And one
of the nutty things about this is that, while these

(44:21):
lawmakers don't understand and sound like idiots talking to Mark
Zuckerberg, his own employees—these researchers who are part of the Integrity team, studying the impact of Instagram
on teens know exactly how harmful it is, and they
are grappling in real time with like the damage their
product is doing to children. Members of these teams reported
frustration at the fact that their colleagues often refused to

(44:44):
take their findings seriously. One former researcher told The Wall
Street Journal that we're standing directly between people and their
bonuses when they try to reduce the harmful aspects of Instagram,
because like anything that reduces the harm is going to
reduce its popularity. It's going to cut down on time on site, it's going to cut down on users. And everyone else at Facebook but the Integrity team gets paid by

(45:05):
how much they increase engagement on the site. That's where
the bonuses come from. So if you're trying to make
Facebook less dangerous, a huge number of the most influential
employees have a vested financial interest in you failing.
So that's yeah, And what that boils down to is
like there is a financial incentive for you convincing a
thirteen year old girl that she should get fucking fillers

(45:28):
in her— Like, what the fuck? It's just— Now I'm going to try to sell the teens something, because it's ad break time, Jamie, and I would like to— Okay, okay,
if you're if you're a teenager, you don't need Instagram.
That ship is bad for you. You certainly don't need Facebook.
That's where the olds are. But you know what's come
back around to being cool and what I think your

(45:50):
generation needs to get on board with smoking huge comical cigars.
I'm talking the ones that are like nine inches long
and like three inches wide. Massive. Teens listening right now, go out to your local cigar shop. Bribe a sad adult—the worst-looking one you find outside the cigar shop. Pay him for huge, cheap cigars, and trade

(46:13):
Instagram for smoking horrible cigars. This is— Well, this is what's going to bring teenagers back. Hook the zoomers. Zoomers, the air is not getting any cleaner. Right? You're all
gonna we're all gonna choke to death on wildfires. You
might as well burn down a big fat Macanudo. Robert,
can I tell you I bought a teenager a white

(46:33):
claw the other day. Good for you. Thank you, I felt good.
I felt like I did a public service. Yeah, teens,
go buy those big, fat, ridiculous lunatic cigars. Bribe—bribe someone for it. You're just preparing yourself for climate change. All right. Also, White Claws, teens. Okay, I mean, White Claw goes great with a huge shitty cigar, Jamie. No,

(46:55):
it does. Absolutely. Smoking is bad for you. Eh—with a cigar, you puff it, so it's healthy. All right, here's some ads. All right, we're back. We

(47:15):
are— We all just enjoyed a couple of really comically large cigars. Um, we did not have those ridiculously long cigars. It was great. Why are you fixated on this? What is happening? Because I find that sketch from I Think You Should Leave, where the little girls are talking about smoking five Macanudos to unwind at the end of the day— Actually, I

(47:38):
mean, yeah. But, like, I love when you reveal yourself to be a basic bitch. I am a basic bitch. All right? That's why I'm thinking about cigars. I love that.
I love that we're in the middle of a podcast,
and, uh, you can't get off that. Well, I also think making children do things that are bad for them is funny,

(47:58):
but not this way. Not the— Dan Flashes. I mean, also, I think the teens are rejecting NFTs pretty widely, Jamie. So when Facebook
does try to make the case that their products are benign,
they like to bring up studies from the Oxford Internet Institute,
which is a project of Oxford University, which show minimal

(48:20):
or no correlation between social media use and depression. Uh.
The Wall Street Journal actually reached out to the Oxford
researcher responsible for some of these studies, who right away
was not like, oh, yes, they're right, everything's fine. Um,
he was like, actually, Facebook needs to be much more
open with the research that they're doing because they have
better data than we can get, than researchers can get,

(48:41):
and so our actual information that they're citing is hampered
by the fact that they're not sharing what they're finding.
And who knows how things can change and our conclusions
could change if we had access to all of that data. Um.
He even told the Wall Street Journal: People talk
about Instagram like it's a drug, but we can't study
the active ingredient, which you'll notice is not him saying
it's fine, it's him being like, yeah, I really wish

(49:03):
we could actually study this better. Um, it's difficult, right.
And also he's referring to it like drugs, which is a comparable scale of how it manifests. Okay. Yeah, he's certainly not being like, everything's fine. Um, I think that's clear. He's truly, like, constantly "Mr. Policeman, I gave you all the clues" in this situation, and just no one gives a shit.

(49:25):
It is very funny, and, like, that movie— And that's what I was trying to say, that it's hilarious. Yeah.
So we focused a lot on these episodes about how
Facebook has harmed people and institutions in the United States,
but as we've covered in past episodes, the social network
has been responsible for helping to incite ethnic cleansings and
mass racial violence in places like Myanmar and India. Mob

(49:46):
violence against Muslims in India, incited by viral Facebook misinformation, led one researcher in February of two thousand nineteen to create yet another fake account, to try and experience social media as a person in Kerala, India, might. Um. From the New York Times, quote: For the next three weeks, the
account operated by a simple rule: follow all the recommendations
generated by Facebook's algorithm to join groups, watch videos, and

(50:08):
explore new pages on the site. The result was an
inundation of hate speech, misinformation, and celebrations of violence, which
were documented in an internal Facebook report published later that month.
And this is from the Facebook researcher following this test
user's news feed: I've seen more images of dead people
in the past three weeks than I've seen in my
entire life total. What a great site Mark built. Facebook's

(50:31):
new tagline: the place for corpses. Oh yeah, my goodness.
I mean and so I know that we we have
discussed Facebook's role in, uh, supercharging ethnic cleansings, but that is just— That is so— Yeah. It's not great, Jamie.
Someone wrote that down, Robert. Someone wrote that down

(50:54):
and hit publish. It's not great. Because India is Facebook's biggest customer—three hundred forty million Indians use one or more Facebook products. Um, that's a shitload of people. Yeah, three hundred forty million. Um. That is something that I think is
important to remember and something that I lose sight of

(51:14):
sometimes, is, like— Facebook is not a super popular platform for people of all ages in North America. But that's not the case everywhere. Yeah, and it is just—it is
the Internet for a lot of these people. Um, Like
that is the way that that is the whole of
how they consume the Internet in a lot of cases.
I mean maybe with like YouTube or something mixed in,

(51:36):
but they're probably getting a lot of their YouTube links
from their Facebook feed. Um. Now, the fact
that India is the number one customer in terms of
like number of people for Facebook, I'm sure the United
States is still more profitable just because of like differences
in income and whatnot. Um, but this is a huge
part of their business. But despite that fact, they have
failed to invest very much in terms of meaningful resources

(51:57):
into having employees who speak the language—or, as is more the problem, the languages—of India. See, India is a super mixed country, right, in terms of different, like, ethnic groups and
religious groups. They have twenty two officially recognized languages in
the country, and there's way more languages than that in
India that significant numbers of people speak. There's twenty two
officially recognized languages. Anyone who has traveled there, and I've

(52:19):
spent a lot of time in India can tell you
that being able to effectively say hello and ask basic
questions of people can require a lot of research if
you're traveling a decent amount. But Facebook aren't twentysomething tourists on the prowl for good tandoori and bhang lassis.
They have effectively taken control of the primary method of
communication and information distribution for hundreds of millions of people,
and they've failed to hire folks who might know

(52:40):
if some of those people are deliberately inciting genocide against
other people in the country. Eighty-seven percent of Facebook's global budget for identifying misinformation is spent on the United States. The rest of the planet shares the remaining thirteen percent of their misinformation budget. You want to guess what percentage of Facebook users North Americans make up? Ten percent. So nearly ninety percent of their budget goes on

(53:03):
ten percent of their users, of dealing with disinformation specifically. Now, when this
leaked out, Facebook's response was that the information cited was incomplete and did not include third party fact checkers.
They're like, well, this doesn't include all of the people at the third party companies we hire. Except the data they show suggests that the majority of the effort and

(53:25):
money spent on third party fact checkers is for fact checking stuff in the United States. Um. And of course they did not elaborate on how including this information might have changed the overall numbers, so my guess is: not by much, if at all.
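Just to put those two leaked figures side by side, here's the back-of-the-envelope arithmetic. The 87 percent and 10 percent numbers are from the reporting; the ratio is simply computed from them.

```python
# 87% of the misinformation budget covers the US, roughly 10% of users.
us_budget, us_users = 0.87, 0.10
row_budget, row_users = 1 - us_budget, 1 - us_users  # rest of world: 0.13, 0.90

us_spend_per_user_share = us_budget / us_users       # 8.7
row_spend_per_user_share = row_budget / row_users    # ~0.14

print(round(us_spend_per_user_share / row_spend_per_user_share))  # ~60x per-user gap
```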
Internal documents do show that Facebook attempted to create changes to their platform to stop the spread of disinformation during the November election in Myanmar. Those changes also halted the spread of

(53:46):
disinformation put out by the military, which was a big deal; it was the military inciting ethnic cleansings and trying to incite violence in order to lock down political power ahead of this election. Um, so Facebook cut this significantly prior to the election; they saw it as a problem. They instituted changes similar to the changes they talked about putting up in the U.S. if things went badly with the election, and these worked. It

(54:07):
dropped dramatically. Yeah, and again, that is good. I'm glad that was done. But they only respond to, exclusively... Give me a second, give me a second, Jamie. Because prior to the election, they instituted
these changes which are significant. It reduces the number of
inflammatory posts by twenty five point one percent and reduces
the spread of photo posts containing disinformation by forty eight

(54:29):
point five percent. This is huge. That's really significant. Um.
As soon as the election was done, Facebook reversed those changes,
presumably because they were bad for money. Three months after
the election, the Myanmar military launched a vicious coup. Violence
there continues to this moment. In response, Facebook created a
special policy to stop people from praising violence in the country,

(54:49):
one which presumably reduces the spread of content by freedom
fighters resisting the military as much as it reduces content
spread by the military. It's obviously too much to say
that Facebook caused the coup in Myanmar.
I mean, there's a lot going on there. I'm not
gonna... I'm not pretending that it's just Facebook, but it was a major contributing factor. It was not an insignificant one. Um,

(55:10):
and the fact that they knew how much their policies
were helping, and reversed them after the election, reversing this
effect and leading to an increase in inflammatory content because
it profited them more, is damning, right? That's the thing that's damning. Um. Around the world, Facebook's contribution to violence
may be greatest in places where the company has huge reach

(55:30):
but pays little attention. In Sri Lanka, people were able
to automatically add hundreds of thousands of users to Facebook
groups that spread violent content. In Ethiopia, a nationalist militia
coordinated calls for violence openly on the app. The company
claims that it has reduced the amount of hate speech
people see globally by half this year. But even if
that is true, how much hate was spread during the
years when they ignored the rest of the world? How

(55:52):
many killings, how many militant groups seeded with new recruits,
how many pieces of exterminationist propaganda spread while Facebook
just wasn't paying attention? The actual answer is likely incalculable.
But here's The New York Times again reporting on that
test account in Kerala, India. Perfect turn of phrase. Yeah, yeah.
Ten days after the researcher opened the fake account to
study misinformation, a suicide bombing in the disputed border region

(56:15):
of Kashmir set off a round of violence and a spike in accusations, misinformation, and conspiracies between Indian and Pakistani nationals.
After the attack, anti Pakistan content began to circulate in
the Facebook recommendation groups that the researcher had joined. Many
of the groups she noted had tens of thousands of followers.
A different report by Facebook published in December two thousand nineteen,
found Indian Facebook users tended to join large groups, with

(56:36):
the company's median group size at a hundred and forty
thousand members. In a separate report produced after the elections,
Facebook found that over forty percent of top views, or impressions, in the Indian state of West Bengal were fake or inauthentic, with one inauthentic account having amassed more than
thirty million impressions. A report in March showed that many
of the problems cited during the two thousand nineteen elections persisted.

(56:58):
In the internal document called Adversarial Harmful Networks: India
Case Study, a Facebook researcher wrote that there were groups
and pages replete with inflammatory and misleading anti Muslim content
on Facebook. The report said that there were a number
of dehumanizing posts comparing Muslims to pigs and dogs, and
misinformation claimed that the Koran, the Holy Book of Islam,
calls for men to rape their female family members. So

(57:20):
that's significant. Like, the scale at which this shit spreads is huge. And I mean, I feel like I know the answer, if the hate is existing on that scale unmitigated, but who is working to... Like, how many people does Facebook

(57:42):
have working on it? Is there an integrity team for this region? Like, technically yes; the question is how many of them, and how many of the languages, are represented by the team. And it's not many. Exactly,
like, it's not many. You can't have a global company and not have global representation, or shit like this is going

(58:03):
to happen. Like, it's just... Actually, you know what it kind of reminds me of, Jamie? I was looking
at this, and I was thinking about the East India
Trading Company. Um, when the East India Company took over large chunks of India, um, they took it over from
a regime. The government, the monarchical government that had been
in charge in that area prior, was not a good government,

(58:24):
right, because, number one, they lost that war. But while they weren't a very good government, they were a government, so they did do things like provide aid in famines and disasters, and have people whose job it was to handle stuff like that, to make sure that stuff was getting where it needed to go during calamities and whatnot, and to do things specifically that helped people but were not

(58:44):
profitable because a big chunk of what a government does
isn't directly profitable. It's just helping to like keep people
alive and keep the roads open and whatnot, right? Sustain humanity.
When the East India Company took over, they were governing
and in control of this region, and Bengal was actually, I think, the first place they took over. But they
don't have any responsibility. They don't have teams who are
dedicated to making sure people aren't starving. They don't have

(59:06):
people who are dedicated to actually keeping the roads open
in any way that isn't directly necessary for the trade
that profits them. They don't do those things because they're
not... They're governing, effectively, but they're not a government. And
there's been a lot of talk about how Facebook is
effectively like a nation, a digital nation of
like three billion people, and Mark Zuckerberg has the power
of a dictator. And one of the problems with that

(59:28):
is that, for all of their faults, governments have a responsibility to do things for people that are necessary to sustain them, like dealing with calamities and whatnot.
Facebook has no such responsibility, and so when people were
not paying attention to Sri Lanka, to West Bengal, to, um, to Myanmar, they didn't do anything. Um, and

(59:48):
as we know, in
a region where there are millions and millions of people,
forty percent of the views were fake and inauthentic content, you know, because they don't give a shit what's spreading
because they don't have to, because they don't have to
deal with the consequences unless it pisses people off, as
opposed to a government where it's like, well, yeah, we

(01:00:09):
are made up of the people who live here and
if things go badly enough, it can't not affect us.
I'm not trying to be again, not like with TikTok,
I'm not trying to praise the concept of governance. But
it is better than what Facebook's doing right right, It's
it's... I think that that is, like, a very... I'd never considered looking at it that way, but viewing

(01:00:29):
it as this kind of digital dictatorship, a colonial dictatorship. It's colonized people's information, um, really, their information streams; it's
colonized the way people communicate, but it has no responsibility
to them if they aren't white and wealthy. Well yeah,
and, yeah, it marginalizes people in the same ways
that actual dictatorships do in terms of how much attention

(01:00:52):
is being given. Are people being hired to support and represent this area? And of course the answer is no. And of course the result of that is extreme human consequences and harm. And it's like,
it's just so striking to me that it still feels

(01:01:12):
like... in terms of the laws that exist that control, that even attempt to address, the amount of influence and control that a gigantic digital network like Facebook has... Um, you know, Facebook, I mean, unless people are yelling at them, and unless their

(01:01:34):
bottom line is threatened, they're never going to respond to
stuff like this. Like, that's been made clear for decades at this point. It's great. I love it. So... well, I'm all worked up. Yeah. A great deal of the disinformation that goes throughout India on Facebook comes
from the RSS, which is an Indian fascist organization closely

(01:01:56):
tied to the BJP, which is the current ruling right wing party. And when I say fascist, I mean, like, some of the founders of the RSS were actually, like, friends with Nazis, and they were heavily influenced by that shit in, like, the thirties. Both organizations are profoundly anti Muslim, and the RSS's propaganda has been tied to numerous acts of violence.

(01:02:16):
Facebook refuses to designate them a dangerous organization because of
quote political sensitivities that might harm their ability to make
money in India. Facebook is the best friend many far
right and fascist political parties have ever had. Take the
Polish Confederation Party. They're your standard right wing extremists: anti immigrant,
anti lockdown, anti vaccine, anti LGBT. The head of their
social media team, Tomasz Grabarczyk, uh, sorry, Tomasz, told

(01:02:41):
The Washington Post that Facebook's hate algorithm, in his words,
had been a huge boon to their digital efforts. Like, he calls it a hate algorithm and says this is great for us. Um, expanding on it, he's like, I think we're good with emotional messages, and thus their shit spreads well on Facebook. Quote from The Washington Post: In one April two thousand nineteen
document detailing a research trip to the European Union, a

(01:03:02):
Facebook team reported feedback from European politicians that an algorithm
change the previous year, billed by Facebook as chief executive Mark Zuckerberg's effort to foster more meaningful interactions on the platform, had changed politics for the worse. This change, Mark claimed, was meant to make interactions more meaningful, but
it was really just a tweak to the algorithm that
made comments that provoked anger and argument even more viral.

(01:03:23):
Um, and I'm gonna quote from the Post again here: In two thousand and eighteen, Facebook made a big change to that formula to promote meaningful social interactions. These changes were billed as designed to make the news feed more focused on posts from family and friends and less from brands, businesses, and the media. The process weighted the probability that a post would produce an interaction, such as a like, emoji, or comment, more heavily than other factors,

(01:03:44):
but that appeared to backfire.
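For anyone who wants the mechanics, here is a toy sketch of what engagement-weighted ranking of this kind looks like. The weights below are invented for illustration only; reporting on the leaked values had reactions, comments, and reshares counting several to dozens of times as much as a plain like.

```python
# Toy illustration of "meaningful social interactions"-style ranking.
# Weights are hypothetical; the catch is that scoring every strong reaction
# highly rewards outrage, since anger reliably drives comments and reshares.

WEIGHTS = {"like": 1.0, "love": 5.0, "angry": 5.0, "comment": 15.0, "reshare": 30.0}

def msi_score(predicted):
    # predicted: {"like": 0.30, "angry": 0.20, ...} per-post interaction estimates
    return sum(predicted.get(kind, 0.0) * w for kind, w in WEIGHTS.items())

def rank_feed(posts):
    # posts: list of (post_id, predicted_interactions); highest score gets reach
    return sorted(posts, key=lambda post: msi_score(post[1]), reverse=True)
```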
Haugen, who this week took her campaign against her former employer to Europe, voiced a concern that Facebook's algorithm amplifies the extreme. Anger and hate is the easiest way to grow on Facebook, she told British lawmakers, many of whom have their jobs because of how easy it is to make people's shit go viral. I mean, that that's not

(01:04:06):
true for all of them, yes. Um, and again, we're focusing on
Facebook here in part because I do think it's more
severe in a lot of ways there, but also just
because like they're the ones who had a big leak,
and so we have this data. So we're not just saying, yeah,
look at Facebook, obviously hate is spreading. We're saying, no, no,
we have numbers. We have their numbers about how fucking
bad the problem is. I guess that is the difference. Yeah, yeah,

(01:04:29):
and, yeah, we have evidence that the system is well aware of the... Yeah. I would love to be
talking about Twitter too. It's just maybe Twitter just never
bothered to get those kind of numbers. Who knows. Um.
This caused what experts describe as a social civil war
in Poland. Of this change, one internal report concluded: we
can choose to be idle and keep feeding users fast food,

(01:04:49):
but that only works for so long. Many have already caught onto the fact that fast food is linked to obesity, and therefore its short term value is not worth the long term cost. So he's being like, we're
poisoning people and it's addictive, like you know McDonald's, but
like people are going to give it up in the
same way that McDonald's started to suffer a couple of

(01:05:09):
years back, because, like, they don't like the
way this makes them feel. Actually, it's fun for a moment.
We just got to get a Morgan Spurlock for Facebook. Baby,
we just got to get... where's the Supersize Me for Facebook? Our entire society is the Morgan Spurlock for Facebook. January 6th is ours. Yeah,
I was gonna say it was like I feel like

(01:05:31):
it's I mean, whatever, not to say that McDonald's isn't
a hell of a drug, but like this is not
the same. I mean, it's stronger because it's your
fucking brain and self image and the view of yourself.
And I feel like that is the strongest manipulation

(01:05:54):
that any given system, person, whatever can have on you
is controlling the way that you see yourself. It's not the same in terms of, like, involuntariness. I feel
like it's something that you very much participate in. Yeah,
um it's bad. Yeah, um it's good. I think it's good.
That's what I think, Jamie. Facebook has been aggressive. You've

(01:06:17):
called me today to say, good, actually; to read all this and then say, so that's fine, let's never talk
of it again. Um. Anyway, Facebook has been aggressive at
rebutting the allegations that their product leads to polarization. Their
spokeswoman brought up a study which she said shows that
academic research doesn't support the idea that Facebook or social
media more generally is the primary cause of polarization. Let's

(01:06:39):
ignore for the moment that not the primary cause doesn't mean isn't a significant cause, and let's look at this study. The spokeswoman was referencing Cross-Country Trends in Affective Polarization, an August study from researchers at Stanford and Brown University.
This study opens by noting it includes data for only
twelve countries and that all but Britain and Germany exhibited

(01:07:00):
a positive trend towards more polarization. So right off
the bat, there's some things to question about this study. Um,
which is number one, they're saying that, like, oh, Britain
hasn't gotten more polarized, which is like, have you been there? Have you talked to... I, yeah, I don't live there, but that's not what I've been hearing. And here's the thing: when you look

(01:07:21):
at it, Facebook is basically citing this as, like, evidence that, look, we're fine, social media is not the problem; this very credible study says that we're not the cause of polarization, so everything's good. The study
doesn't quite back them up on this um right off
the bat. One of the authors, like, notes this, and this is from a write-up by one of

(01:07:42):
the authors of the study, on a website called tech policy dot com, where he's talking about the study and
what it says: A flat or declining trend
over the forty years of our sample does not rule
out the possibility that countries have seen rising polarization in
the most recent years. Britain, for example, shows a slight
overall decline, but a clear increasing trend post two thousand
and Brexit. So he's saying that, like, we don't have

(01:08:02):
as much data from like more recent polarization, and that
may be a reason why this study is less accurate,
why some of our statements do not conform with, like, what people have observed. He goes on
to note the data do not provide much support for
the hypothesis that digital technology is the central driver of
affective polarization. The Internet has diffused widely in all the
countries we looked at, and under simple stories where this

(01:08:24):
is the key driver, we would have expected polarization to
have risen everywhere as well. In our data, neither diffusion
of internet nor penetration of digital news are significantly correlated
with increasing polarization. Similarly, we found little association with changes
in inequality or trade. One explanatory factor that looks more
promising is increasing racial diversity. The non white share of
the population has increased faster in the US than in

(01:08:45):
almost any other country in our sample, and other countries like New Zealand and Canada, where it has risen sharply, have seen rising polarization as well. So I have
some significant arguments with him here, including the fact that,
as he notes here, his study only looks at Western nations.
With the exception of Japan, all of the nations in
the study are European, or the United States and Canada.

(01:09:06):
Um, and so they have all had, prior to two thousand, higher penetrations of the Internet and non-Internet
mass media. Like, outside of this, if you're trying to determine the impact of social media: elements of what social media has done were present in places like Fox News in the United States years before Facebook ever existed. Um, and that was not the case in places like Myanmar and India, which are not a part of

(01:09:26):
this study. So right off the bat, it's problematic to
try and study the impact of social media on polarization
only in countries that already had robust mass media before
social media came onto the scene. Which is not to
say that I agree with their conclusion, because I think
there's other flaws with this study, but one of the flaws is just that, like, hundreds of millions of their users exist in countries where the

(01:09:47):
study was not done, um, where they were not looking at these places, which is a flaw. And that's dependent on most readers just conflating, you know, North America and Europe with the center of the fucking world. And again, I have
issues about like, Okay, well you're saying that racial diversity
is more of a thing. But like, where is the propaganda, where is the hate speech about

(01:10:09):
racial diversity spreading? Is it spreading on social media? Like, yes,
it is. I can say that as an expert. Um.
It's also just like again, not that this study is
even bad or not useful. It is one study, um.
And again we have internal Facebook studies that make claims
that I would say throw some of this into question.
But again, this is just how a corporation is going

(01:10:30):
to react. They're going to find a study that they
can simplify in such a way that they can claim there's not a problem, because none of the people who they're going to be arguing with on Capitol Hill, and precious few of the journalists, are going to actually drill into this and then talk to other experts. Nobody's gonna reach out to members of
that study and be like, how fair is this phrasing?
How does it deal with this information

(01:10:52):
and this information? As we saw earlier with the last study, when the Wall Street Journal, to
their credit, reached out to that scientist, he was like, well,
actually they have better data than me. And I'd love
to see it because maybe that'll change our conclusions. Um anyway,
yeah uh. Mark Zuckerberg has been consistent in his argument
that deliberately pushing divisive and violent content would be bad
for Facebook. Quote. We make money from ads, and advertisers

(01:11:14):
consistently tell us they don't want their ads next to
harmful or angry content. While I was writing this article,
I browsed over to one of my test Facebook accounts.
The third ad on my feed was for a device
to illegally turn a Glock handgun into a fully automatic weapon. Um, just as a heads up. Yeah, I
have a couple of test feeds and it was like, hey,
this button will turn your Glock automatic, which is so

(01:11:37):
many felonies, Jamie. If you even have that thing and a Glock in your home, the FBI can put you
away forever. I have to laugh. I have to laugh
because that is really really scary. But yeah, it is
like Mark being like, look, oh, no advertiser wants this
to be a violent place. Buy a machine gun on Facebook, you know, next to ads that are, like, t-shirts about killing liberals and stuff. Like, a machine gun

(01:12:00):
advertiser maybe would be one that wouldn't take issue with that.
Holy fuck, the hang-the-media shirts advertised to me on Facebook, like, my god. Like, fuck you, Mark. Um. Well, when I
quit Facebook a couple of years ago, I was getting normie advertisements. I was getting, good for you,

(01:12:22):
those really scary ones, like those custom t-shirts that say, it's a Jamie Loftus thing, you wouldn't understand. And you wouldn't. I would not know. And you know,
the only time I can think of recently that Facebook actually anticipated something I wanted is they keep showing me, on all of the accounts that I've used, videos of

(01:12:43):
hydraulic presses crushing things, and I do love those videos.
Those are pretty fun. And that's
the meaningful social interactions that Mr Mark Zuckerberg was talking about,
was the hydraulic press videos, and those are very comforting.
On the good old Internet, which also wasn't all that great,
but on the old Internet, which was a

(01:13:04):
lot more fun, though, there would have just been a whole website that was just like, here's all the videos of hydraulic presses crushing things. Come watch this shit. There wouldn't have been any algorithm necessary. You could just scroll through videos. There's no friend function, it's just hydraulic press shit. Yeah,
that's all I need, baby, all I need. So back to

(01:13:24):
the point, it is undeniable that any service on the
scale of Facebook, again like three billion users, is going
to face some tough choices when it comes to the
problem of regulating the speech of political movements and thinkers.
As one employee wrote in an internal message, I am
not comfortable making judgments about some parties being less good
for society and less worthy of distribution based on where
they fall in the ideological spectrum. That's true. Um, This

(01:13:47):
is again part of the problem of not regulating them
like a media company, like a newspaper or something. Um,
Because by not making any choices, they're making an editorial choice,
which is to allow this stuff to spread. Presumably, if you were actually being held to some kind of legal standard, which again most of our media isn't anymore, you would at least have to be like, well, let's evaluate the truthfulness of some of these basic statements before

(01:14:10):
publishing. And I would say that's where the judgment should come in. But that's expensive. If Facebook is saying,
we won't judge based on politics, but we will judge based on whether or not something is counterfactual, that I think is morally defensible. But that's
expensive as shit and they're never going to do that.
It costs more. Moral decisions are famously not cheap, and

(01:14:30):
that is a lot of the reason why people do
not do them. Yeah. Um, it is true: that is not a profitable venture. Yeah, no, of course not. Um.
And the other thing that's true is that Facebook already
makes a lot of decisions about which politicians and parties
are worthy of speech, and they make that decision based
mostly on whether or not said public figures get a

(01:14:51):
lot of engagement. Midway through last year, they deleted, like, all of the different anarchist media groups and a lot of anti-fascist groups that had accounts on Facebook. Just across the board, they deleted CrimethInc and, uh, um, they kicked off It's Going Down, like, a rapper I know, Sole. Yeah.
I mean, nobody ever complains when bad shit happens

(01:15:13):
to anarchists except for anarchists. But yeah, they nuked a bunch of anarchist content, um, just kind of blanket saying it was dangerous. And I think it was because they had just nuked the Proud Boys and they had to be seen to be fair. Um. But it has now come
out that they have a whole program called x check
or cross check, which is where they decide which political
figures get to spread violent and false content without getting banned. Um,

(01:15:37):
they had a... yeah, based on engagement. They've claimed for
years that everybody's accountable to site rules. But again, the Facebook Papers have revealed that, like, that's explicitly a lie, and it's a lie Facebook has told to people at high levels of Facebook. And I'm gonna quote from the
Wall Street Journal here. The program, known as cross check or XCheck, was initially intended as a quality control measure
for actions taken against high profile accounts, including celebrities, politicians,

(01:16:00):
and journalists. Today, it shields millions of VIP
users from the company's normal enforcement process. The documents show
some users are whitelisted, rendered immune from enforcement actions,
while others are allowed to post rule violating material pending
Facebook employee reviews that often never come. At times, the
documents show XCheck has protected public figures whose posts
contain harassment or incitement to violence, violations that would typically

(01:16:22):
lead to sanctions for regular users. In two thousand nineteen,
it allowed international soccer star Neymar to show nude
photos of a woman who had accused him of rape
to tens of millions of his fans before the content
was removed by Facebook. Whitelisted accounts shared inflammatory claims that Facebook's fact checkers deemed false, including that vaccines are deadly, that Hillary Clinton had covered up pedophile rings,
and that then president Donald Trump had called all refugees

(01:16:45):
seeking asylum animals. According to the documents, a two thousand nineteen review of Facebook's whitelisting procedures, marked attorney-client privileged, found favoritism to those users to be both widespread
and not publicly defensible. We are not actually doing what
we say we do publicly, said the confidential review. It
called the company's actions a breach of trust, and added,

(01:17:05):
unlike the rest of our community, these people violate our standards without any consequence.
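As a rough illustration of what the Journal is describing, an enforcement pipeline with a cross-check-style carve-out looks something like the sketch below; the names and structure here are ours, not Facebook's.

```python
# Flagged posts from listed accounts skip automatic action and sit in a
# second-tier review queue that, per the documents, often never gets worked.

WHITELISTED_IDS = set()   # "shielded" VIP accounts
REVIEW_QUEUE = []         # the review that often never comes

def enforce(author_id, post, violates_policy):
    if not violates_policy(post):
        return "no action"
    if author_id in WHITELISTED_IDS:
        REVIEW_QUEUE.append((author_id, post))  # post stays up pending review
        return "deferred"
    return "removed"  # ordinary users get normal enforcement
```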
And they lied to, like, their board members about this. They didn't lie
about whether or not it was a thing. They said
it was very small. And, just really, I think the initial claim was, like, we have to have something like this in place for people like President Trump. But it's
a tiny number of people, and it's because they occupy
some political position where we can't just as easily, you know,

(01:17:26):
delete their account because it creates other problems because they're
not as fringe as they need to be for this
conduct to be... That was their justification. One sec, Jamie.
That was their justification on the level of you can
be unethical and still be legal. And well, here's the thing: they told their board they only did this for

(01:17:46):
a small number of users. You want to guess how
small that small number was? Oh, I love when Facebook says there's a small number. What is it? What is it? Eight million? That's so many. Yeah, oh dear,
it's very funny. It's very funny. It's all good. I

(01:18:07):
that is so... I mean, yeah, they're just... Can I say something controversial? Please? I don't like
this company one bit. You don't? Well, I feel like that's going a bit, uh, that's going a bit far. Sorry. And I'm famously, you know, I don't like making harsh
judgments on others, but I'm starting to think that they

(01:18:29):
might be doing some bad stuff over there. M hmmm, yeah,
I would. You know, I don't like these people. I
don't like these people at all. You know what, I
do like Jamie ending podcast episodes. M m oh, I
actually do like that. Yeah, that's the thing I'm best at.
Do you want to plug your pluggables? Um? Yeah,

(01:18:50):
sure. You can find... I'm gonna just open by plugging
my Instagram account, a famously healthy platform that I'm addicted to,
and I don't really have any concerns about it. I
don't really think it's affecting my mental health at all. Um.
So I'm over there, and that's Jamie Christ Superstar. I'm
also on Twitter, which Robert can't stop saying is the

(01:19:12):
healthiest of the platforms. Of all of the people who
are drunk driving through intersections filled with children, Twitter has
the least amount of human blood and gore underneath the
grill of the car. Like, Robert's saying, for all you Backstreet Boys heads, that Twitter is the
Kevin Richardson of social media. I'm there as well. I'm

(01:19:35):
saying the drunk driving Twitter car made it a
full fifteen feet further than the Facebook car before the
sheer amount of blood being churned up into the engine flooded the air intakes. But at the end of
the day, we're all fucked. I'm on Twitter as well,
at Jamie. Listen to my podcast, Yeah, you know. You

(01:19:57):
can listen to my podcast, The Bechdel Cast. You can listen to Aack Cast, that's about the Cathy comics. You can listen to My Year in Mensa. You can listen to Lolita Podcast. You can listen to... You know what never led to a genocide in any country, as far as I'm aware, Jamie? Uh, the Cathy comics.
We'll see. Then you haven't listened to the whole series, really,

(01:20:17):
is it? Oh? You know what? Yeah, that's why the last episode is your live report from Sarajevo in episode eleven. Yeah, Irving, really, his politics were not good. Yeah, he was. He was, like, weirdly into the Serbian nationalism. Um. Irving is, like, for the Cathy comics, he's like, Okay,

(01:20:39):
I'm about to make a wild parallel. But Irving is
like the Barefoot Contessa's husband, in that he looks so innocent, but then when you google him, you're like, wait a second.
This man is running on dark money. This guy, like, was on Wall Street in the eighties. This is a bad man. He's basically like Jeffrey, the Barefoot

(01:21:00):
Contessa's husband. The Barefoot Contessa is run on dark money.
I know people don't like to hear it. They love her,
but it's just true. It's objectively true, and that's what
I would like to say at the end of the episode.
I've never heard of the Barefoot Contessa and I don't
know what you're talking about. I am not even one percent surprised, but that's okay. But you know what I

(01:21:20):
do know about? I know about podcasts, and this one is done. Great ending.
