
February 22, 2021 52 mins

Bridget Todd joins the show to talk about the role social networking platforms play with radicalization and extremism. From a hands-off approach to algorithms that exacerbate a dangerous problem, we look at what we do know and establish what we don't.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Welcome to TechStuff, a production from iHeartRadio.
Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech, and I had to rush through that intro, ladies, gentlemen, everybody, because I have to
talk to my good friend Bridget, host of There Are

(00:26):
No Girls on the Internet, who has joined the show
today to talk about a very serious topic. But before
we get into that, Bridget, it is so nice to
have you on the show. It's so nice to be here.
I'm so excited. You know, we connect every week just
via phone, and it's nice to be connecting in people's
earbuds in podcast land. Yeah. So, Bridget has been

(00:50):
hosting an incredible show, There Are No Girls on the Internet.
If you haven't heard it already, you absolutely need to
seek it out. And specifically, I wanted to bring her
on the show to talk about a season two theme
that you've really done a lot of hard work on, Disinformed,

(01:10):
because we're going to be talking about misinformation, disinformation and
the promotion of that through algorithms. But Bridget, in your own words, why don't you talk a bit about There Are No Girls on the Internet in general and Disinformed in particular? Yeah, so There Are No Girls on
the Internet was really born from this idea that we

(01:30):
know that marginalized people, underrepresented people, women, communities of color,
other folks from underrepresented backgrounds have had such a huge
impact on technology and honestly, just what makes the Internet
a fun, great place? Like how many times is there
a hilarious hashtag or a hilarious Vine or a hilarious video
on TikTok that goes viral that you're seeing everywhere and

(01:52):
the person behind it is a woman or a person
of color. And so I really wanted to carve out
a space where we could really highlight and celebrate
and amplify all the amazing impacts that people from underrepresented
backgrounds have had on the Internet. And so that's why
I started There Are No Girls on the Internet. It's
been a really fun journey to really talk about some
of the ways that underrepresented folks either show up or

(02:14):
don't show up online. And from making that show, one
of the things that came up for me again and
again was the role that disinformation and distorted media stories,
fake news, things like online harassment and radicalization. How these same underrepresented people who are showing up, making great content online, making the Internet such a fun place,

(02:38):
those same people are really at the forefront of a
lot of this stuff, right, You know, when it comes
to disinformation, communities of color are the most impacted, women
are the most impacted. But we are the same ones
who are doing a lot of the research and a
lot of the work to fight back. And I was
really fascinated by that. I was fascinated by the human
center of disinformation. I think with a lot of tech,

(03:00):
it can be hard to remember that it's really about people.
You know, that people are at the other end of screens or the other end of phones. And disinformation was a situation where it really was important to me to center those people and to make sure that their
stories were at the forefront. You bring up a point
that I think is good for us to lay out
right at the very beginning, because while we're going to

(03:22):
be talking about social media platforms, specifically platforms like Facebook, YouTube, Twitter,
those sort of things, and to another extent, some of
the more extreme social networks that are out there
that are heavily promoting things like radicalization within the confines

(03:43):
of their own communities. While we're going to talk about that,
we have to remember that's one piece of a very big,
very complicated puzzle. The pathway to extremism or radicalization is not a one-lane highway where everybody goes
the exact same way. There's usually a lot of different

(04:05):
factors at play, and if we're being completely honest, we
still don't really have a full understanding of which weighted factors are the most important. Obviously, being in person with groups that are proselytizing extremist views

(04:29):
and are reinforcing, when new people come into the community, that those extremist views are legitimate, and reinforcing whenever someone expresses an extremist view, that's clearly very important. And when it happens in a real space, meatspace,
it can have a huge impact. But we also know
that online interactions also can play a big part, especially

(04:54):
as we have embraced them more wholeheartedly in the last decade,
and clearly in a situation like we are in now
where a lot of us are at home and the only interactions we're mostly having with other people are online, we're seeing that online radicalization is in fact

(05:15):
a factor as well, but to what extent it is
as effective or ineffective as others. That's something that's under
academic debate. So we're gonna have this discussion, but we
do want to make it clear that there are no
absolutes that we know of yet. These are all things
that are under study and consideration. It's more that we're

(05:38):
trying to kind of take stock of what's going on
and what elements could be playing a part in the
in the process of someone encountering extremist views and the
possibility of getting wrapped up in that. So with
that in mind, let's talk about social media. One thing

(05:59):
I noticed. I decided to do a quick search, Bridget, before we jumped on this call, and I found a Pew Research survey from a couple of years ago that was looking at what percentage of people were getting
some or most of their news through social networking sites,

(06:20):
primarily Facebook. And at this point, it's around two-thirds of adults who get either some of their news through social media or practically all of their news through social media. So obviously a social platform plays a big part in the way we access information that we then try to incorporate into our lives. Yeah,

(06:43):
I think so. I know that study. I'm glad that you grounded our conversation in that study. You know, as someone who tracks and talks and thinks a lot about the way that platforms operate and the role they have in our society: for a long time, platforms like Facebook would say that they were just neutral platforms,
you know, they did not need to take any kind
of editorial strategy because they're just neutral platforms. And I

(07:06):
think that Pew study, that adults get some aspect of their news from these platforms, really shows how that's not always the case. You know, I think that tech leaders should be thinking about their
platform policies based on the role they actually play in
their users lives. And if most users are saying, hey,
I use your platforms to get my news, then maybe

(07:29):
you do need to have a little bit of editorial strategy, right,
Maybe you do need to have policies in place to
make sure that it's being used in safe and responsible ways. I'm glad that we have moved away from that thinking that's like, oh, I'm just a platform, it is not my responsibility to
have any kind of policing or policies around what is
on that platform, which I think was the conversation for

(07:49):
so long. Yeah, I think the interesting thing to me
is that you will hear a lot of debate in America about Section 230, which
is a specific piece of legislation that provides online platforms
some legal immunity from the sort of stuff that their
users might post to those platforms, saying that they are

(08:13):
not responsible for the things that their users share. And
the hope of the people who actually drafted that legislation
was that it was going to be a two-pronged thing.
It would allow platforms to establish themselves, because otherwise they
would constantly be under legal threat and nothing would ever
gain any traction, because this legislation was proposed in the

(08:35):
nineties when the World Wide Web was very young. And the second part was that they were really hoping it was going to lead to moderation practices on these platforms,
that the platforms would moderate the material that was being
posted to them without fear of legal action against them
for doing so, that they would be given that freedom.

(08:58):
And the problem or one of the problems is that
a lot of these platforms embraced the first part where
they're not held responsible for the stuff that people post
to them, but they didn't go so hard on the
second part, where they actually moderate the material. And we've
seen that time and again because we've seen it on Facebook,

(09:20):
we've seen it a lot on YouTube. I mean, the whole controversy a couple of years ago about the various bizarre and sometimes very disturbing videos that were showing up on YouTube's children-oriented service shows that
they weren't really looking at it from that perspective for
multiple reasons. And we'll get into some of those a

(09:43):
little bit later, because it does tie into the algorithm
side too. The other Pew study that I saw that was interesting and disheartening is a separate study that found that people who gained most of their news from social media sites were in general less engaged and

(10:03):
less knowledgeable about the subject matter than those who were
consuming it from multiple sources. So if you were someone who, yeah,
you've got some of your news from social platforms, but
you also got some of your news from journals or
newspapers or television, or radio or whatever. You generally had
a better understanding of subject matter. And this to me

(10:26):
almost sounds like the problem of people reading a headline,
never clicking through to read and digest the content, and
basing their entire response on the headline and maybe comments
that are placed underneath an article that's been posted to Facebook,
let's say, which is really upsetting as someone who has

(10:48):
an English background. I mean, especially for us, like we've worked in media, we know there's a special skill in creating headlines that doesn't necessarily relate back to the content of the piece that it's attached to, or it's some minor element that's elevated in order to gain attention.

(11:10):
So this, to me is another issue. And this is
before we even get into purposeful misinformation. This can just
be a misunderstanding because you didn't take the time to
read the full material. Oh absolutely, I can say so much about this. So, one of the places that I've worked in my

(11:31):
life was MSNBC. I was on their digital team and
so our job was essentially framing stories on social media
in ways that were going to get them clicks, and
you know, attention online, and so we also had access
to metrics, so we could actually see how people were
engaging with this content. And the amount of times that
people share stories on Facebook that we can see on

(11:54):
the back end that they have not actually clicked in
to read would frighten you. But it was our job
to kind of explore that, and so we learned that,
you know, oh, people like to share articles where it
seems like they're trying to indicate something about themselves, like
this article makes them look thoughtful or interesting or unique or well read or really informed, and so

(12:16):
they want to project that to the rest of their followers,
so they share it. But those same stories are the
ones that people aren't clicking in to read. And so
it really is not surprising to me that people who
get their media and news from social media primarily are
less informed and less engaged, because I think these media
ecosystems really prioritize that. I think that they built a

(12:38):
media landscape that prioritizes sharing without reading thoughtfully, right? Like, everything happens so quickly on the Internet, I feel like
that is what's really the priority. And I also think
just our media landscape has really changed. I grew up in a household where we got the newspaper. We got multiple newspapers in my home. And we have to, you know, note the ways that our journalism industry

(13:00):
has really been, you know, decimated. And so it's
not surprising to me that as small local papers that
are putting out local, relevant, timely information right on your doorstep,
as those begin to shut their doors, more and more
people are turning to social media, which is in turn
making them less informed. This is not a surprise to me. Right. I have a great example to

(13:22):
give you, Bridget. It's a very recent one and I think it's one you'll appreciate, which is that,
as I record this, I think we're like a day
out from when a trailer for a new Mortal Kombat
movie came out and over on Jezebel, a writer wrote
an article with the title, why isn't Chun-Li in

(13:44):
the Mortal Kombat movie? And for those who aren't schooled
in Mortal Kombat, Chun-Li is not a character from Mortal Kombat. She's a character from Street Fighter, a different franchise, right? But that was the headline. If you looked at the article, there were hints in the article that this was a joke, that there's a point where they say, if it's

(14:06):
the ultimate street fight, Chun-Li needs to be there, indicating, yes, the author knows that Chun-Li doesn't live in the Mortal Kombat world, that Chun-Li lives in the Street Fighter world. But it was almost like a
social experiment, and of course Twitter went nuts and everybody
started slamming this article, saying, look how stupid this person is.

(14:29):
They don't even know that Chun-Li is not in
Mortal Kombat. Which made me say, all this is telling
me is which of you guys failed to read the article,
Because if you had read the article, you would have
seen it was a joke, you would have recognized it
as satire, and you would know that this was a wind-up the whole time. You're really just falling
into the trap that the writer set at this point.

(14:52):
And it's just the perfect microcosm of this tendency. And I think that posting links to news articles has almost become a shortcut for a comment. It's almost like a comment or a GIF or an emoji. Right,
It's a way for someone to say, here is my thought,

(15:13):
on that, in headline form, even though you may not have clicked through to read the actual article underneath, or you're just looking for something that supports whatever
position you're taking on any particular topic, whether it's something
lighthearted in pop culture or it's something really serious and
it has to do with like politics or health. And

(15:35):
I see that a lot too, where people are clearly
grabbing articles. They're just doing the down-and-dirty Google search to find something that looks like it supports their position and using that in its place. So again, this is all before you even get to purposeful misinformation or disinformation.
It may be that the sources that you pull from

(15:56):
are misinformation, but it may be that they're totally legitimate, yet you just don't know the context for it. So you're just throwing things at the wall to see what sticks. And so we're doing a lot of the work ourselves. That's part of it. Like, we have created a kind of culture that supports the sharing

(16:16):
of misinformation. But the flip side of that coin is
these platforms are optimized for the sharing of misinformation and
the elevation of misinformation. And that's really the part that
we can look at more critically and say in what
ways is this happening? And how culpable

(16:40):
are these platforms for that process? Keeping in mind
that in most cases, I think we would argue that
ultimately there are other people not connected to the platforms
who are generating the content that's getting elevated there, but
the platforms are benefiting from that. So let's talk

(17:00):
a little bit about that and what is going
on with these platforms and why we want to even
talk about algorithms. So, just so you guys out there know, an algorithm is essentially just a list of instructions. You can think of it like a program. It doesn't have to be a program, but it's essentially a list
of instructions that guides some sort of process. And in

(17:21):
the case of social networks, essentially what we talk about
when we say algorithm is the bit of that
platform that decides what content you see in what order
you see it. So for YouTube, it could be the
recommendation engine where you're watching one video and then you've
got a whole bunch of other videos recommended along the side.
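To make that idea concrete, here is a minimal sketch in Python of what such "a list of instructions that decides what content you see and in what order" can look like. Every name, signal, and weight here is hypothetical, invented purely for illustration; it is not any real platform's code.

# A toy feed-ranking "algorithm" in the sense described above: a short
# list of instructions that decides what you see and in what order.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_watch_seconds: float  # model's guess at how long you'd engage
    predicted_click_prob: float     # model's guess that you'd click or share

def engagement_score(post: Post) -> float:
    # The platform's incentive is time-on-platform, so expected
    # engagement is the whole score; nothing about accuracy or harm.
    return post.predicted_watch_seconds * post.predicted_click_prob

def rank_feed(candidates: list[Post]) -> list[Post]:
    # Not reverse-chronological: sorted purely by expected engagement.
    return sorted(candidates, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Cute sloth compilation", 240, 0.4),
    Post("Local news story", 30, 0.1),
    Post("Outrage-bait conspiracy clip", 600, 0.5),
])
print([post.title for post in feed])  # the conspiracy clip ranks first

Notice that the scoring function has no notion of what the content actually is, which is exactly the "amoral" quality discussed a little later in this conversation.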

(17:42):
So for me, it's all cute animal videos. I wish I could lie and say it wasn't, that I'm a punk rock kind of guy, but it's all kitten and puppy videos, sometimes a sloth. And on Facebook, it's which posts you see and in which order. You know, everyone
gets frustrated that they're not seeing it in a reverse

(18:02):
chronological order. Well, that's by design. And as users, we think of the algorithm as the thing that shows us what is potentially next on our docket. For the platforms,
algorithms exist solely because the platform wants to keep people
on it for as long as possible, because that's how

(18:25):
the platforms make money. It's through advertising, it's through the commoditization of people's personal information that they generate
while they're on that platform. And the longer you're on it,
the more the platform makes. So the algorithms are, I would argue, amoral. They don't really care what it

(18:47):
is they're serving up as long as it keeps you there,
and that I think is the core of the problem.
Absolutely a thousand percent correct. I think that tech platforms
need to be held accountable for the ways that their
algorithms have really just taken advantage of a lot

(19:08):
of human nature's not-so-great tendencies. Humans, listen, we all know humans are terrible, right? We are lazy. We make rash, snap decisions, myself very much included. We
will smash that share button, smash that like button, smash
that comment button on an article that frankly, we have
not read, and we're just putting our uninformed opinion out there.

(19:29):
I am guilty of all of this, right? Platforms, essentially,
algorithms have rewarded that kind of behavior, that kind of
quick behavior, snap decision making, and created this media
landscape where the citizenry is less informed, less engaged, and
more primed for bad actors to disinform and misinform them.

(19:51):
And I think the way that you put it is completely correct. And you know, I say this all the time on my own podcast. One of the reasons why I'm very interested
in disinformation is because I think it's a good example
of the way that algorithmic thinking has really failed us.
You know, I want to build an internet that prioritizes
and privileges things like thoughtfulness or you know, discourse, or

(20:14):
really engaging with opinions that are different from your own.
And I feel like algorithms and platforms have given us
the opposite. It's a media landscape that rewards quick decision
making and being less engaged and less thoughtful and engaging
less and thus leaving people really able to be exploited
by people who do want to push misinformation and disinformation.

(20:36):
And I think when platforms like Facebook get rich off
of it, we really have to ask some questions. You know,
Facebook's own internal report says that 64 percent of the time, when somebody joins an extremist Facebook group, it's because Facebook itself recommended it, right? They should not
then be getting more money based on something that they
admit has such a corrosive impact on our society. Hey, guys,

(20:58):
Jonathan from the future here, coming in to say we're going to take a quick break from our conversation with Bridget Todd of There Are No Girls on the Internet, and we will be back right after this break. You
can't play both sides of the issue, saying we

(21:22):
don't have any responsibility, we're neutral, we're just a
platform that people post to, and then also reaping the
billions of dollars generated through the process of radicalization. If
you are directly profiting from that process, you are at least in some way culpable for it. You can't... Like,

(21:46):
the idea of being able to wash your hands and
walk away is alien to me. I can't imagine
being able to shed that kind of responsibility. And you know,
we talk about Facebook revenue in the billions of dollars
every single quarter. It's not billions of dollars per year. Every quarter, that company is making billions. And a

(22:09):
lot of this is because of these algorithms. You know, Facebook's value proposition is that they can put very specific groups of people in front of very specific content. And it's because the Facebook engine learns

(22:29):
so much about us and how we interact that
it can start predicting the things that we're going to
react to next and thus serve up that stuff in
front of us, and sure enough, we're bound to act
on it. And if this were a world where
radicalization wasn't a thing, that would mostly mean we would

(22:51):
all be buying more crap and that would be the
worst of it. But we're seeing the same thing is true of YouTube, right? When you log into YouTube, not if you're just browsing it without being logged in, but if you're logged into YouTube, Google is constantly tweaking your profile of the sort of

(23:12):
things that you interact with, both on and off the site,
and building on that and determining what you want to
see next. In fact, there was a study, self-published, not even peer-reviewed. I hesitate to even
bring it up, but there was one report that was
published a couple of years ago where a pair of

(23:33):
researchers, and I use the term lightly, argued that YouTube
wasn't contributing to radicalization. And they had this whole thing
where they talked about watching different styles of videos and
seeing what got recommended next. Except for one tiny little thing.
They didn't log into YouTube. They were watching it without logging in. And it's the login process

(23:55):
where Google starts to build that profile, figuring out, oh, well,
they're interacting with a lot of content that's in
say the video game space. That's another one that I
watch a lot of. So now I get puppies and
kittens and video games, not all in the same video
they're separate, but I get a good mix. But if I were to start watching videos

(24:18):
that were, let's say, a little to the far right or the far left of the spectrum, then those algorithms start working and start determining what other sorts of videos I might respond well to. And based upon my activities on the site, it tweaks those weightings, right? It's like machine learning. It says, all right, well,

(24:41):
we saw that this person spent an hour
when we shared this kind of video, so let's push
even harder on that and see if we can get
that engagement up even more. And from the platform side,
they're just thinking, how can we maximize this person's time
on our platform and make the most money. But it's the same general approach as trying to radicalize someone,

(25:06):
where you're trying to continually serve them up information from
a very specific ideology and to reinforce that over and over.
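As a rough illustration of that reinforcement loop, here is a minimal Python sketch. The topic names, update rule, and numbers are all hypothetical, invented for illustration; this is not YouTube's or any platform's actual system.

# A hypothetical sketch of the reinforcing weight update described above:
# watch time on a topic raises that topic's weight, and the recommender
# then serves more of whatever has the highest weight.
interest_weights = {"puppies": 1.0, "video_games": 1.0, "fringe_politics": 1.0}

def record_watch(topic: str, minutes_watched: float) -> None:
    # "This person spent an hour on this kind of video,
    # so let's push even harder on that."
    interest_weights[topic] += 0.1 * minutes_watched

def recommend() -> str:
    # Serve the topic with the highest learned weight.
    return max(interest_weights, key=interest_weights.get)

record_watch("fringe_politics", 60)  # one hour of watch time
print(recommend())                   # -> fringe_politics, again and again

Each recommendation invites more watch time, which raises the weight further; that closed loop is the automated reinforcement being described here.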
It just so happens that these two things are
aligned with one another. And so from that algorithmic standpoint,
we see the process of radicalization somewhat automated, and that

(25:29):
is where the real concerns are, and we know that
this is still a thing. There was a report just
earlier this month about how researchers were still finding that
extremist videos would still pop up in the recommendation sidebar if you were to watch them, whereas YouTube has been working pretty hard to cull that,

(25:52):
but it's still happening. So I can't see how, if you run a social media platform that has
an algorithm to determine what order you see stuff in,
how you can step back and say you're not responsible
for at least playing a role in leading people toward extremism.

(26:17):
I mean, I wish I had that ability to wash
my hands of responsibility for things that I don't want to be responsible for. But it's difficult to see how tech leaders cannot see their culpability in this. Just as you described, it's very difficult to make the
argument that they don't have a responsibility when these kinds
of things, by their own metrics, are happening on their platform.

(26:40):
It's very clear study after study after study, particularly on YouTube,
which frankly, you know, in all of the conversations we've been having about platform accountability, I've
been surprised the way that YouTube has really been able
to kind of skirt a lot of that heat. Like
we talk a lot about Facebook, we talk about Twitter,
but YouTube, I don't know how they've done it, but they've been able to sort of sidestep that conversation, which

(27:02):
I think is really not good considering the role they
do play in radicalization. You know, I'm very curious how they've been able to avoid that kind of responsibility for so long. Your point about platforms, I think, is such a good one, in that even if you lived in a world that did not have radical views or extremist views, even then I would want to ask questions

(27:25):
about whether or not these platforms are actually doing good
or doing harm. I remember this time. This is kind
of a weird story, but I was going through a
breakup and I was still following my ex on Facebook.
And it was one of those breakups where I was like,
you know, like it was really getting me down, and
I realized Facebook must have sensed that my relationship with

(27:45):
this other account was something, that something was going on there. So
every little update that I got about my ex, Facebook
was like shoving it in my face. And I realized
a platform can make you feel bad about yourself. If a platform has the power to shape you toward extremist ideology, toward extremist thinking or political content, it has that power to really shape how
we feel in our day to day. I wish it

(28:07):
wasn't true, but it is. And so I think even
from that perspective, it is completely fair to step back
and ask, well, how are these platforms contributing to harm?
Whether it's extremist content or just how someone feels on
their day to day, like if they feel like they're
able to step away from platforms, if platforms have prioritized
amount of time on screen or amount of time on

(28:28):
a page, or what have you. Is that kind of
thinking doing harm in society? I think it's completely fair
to ask these questions. And I think, you know, I
would like platforms. I would like tech leaders and the
people that, you know, use these technologies to have
that conversation. But I feel like getting to a point
where we're all on the same page has been
so tough. Yeah. And I would say that the

(28:50):
efforts we have seen kind of illustrate the point you're making,
Bridget, in that we see these tech companies occasionally respond when things come to a point where there's no other option, they have to respond, right? They're called before Congress. Perhaps
that's happened multiple times now, or they're under pressure because

(29:13):
of advertisers that don't want to be associated with things
that are spiraling out of control. We see those cases when things get to the extreme, but often the responses are very surface level. So
things like we now have a part of our app

(29:35):
where it will alert you if you've spent X amount
of time on the app, so it's a little screen-time alert. And you're thinking, well, I see how you're
trying to address the issue kind of, but you're not
getting at the underlying problem. You're just looking at a symptom.
It's like treating someone who is very, very sick, but

(29:58):
all you can do is alleviate the symptoms but not cure the sickness. It's the same sort of issue. And
when we're talking about platforms that have the capacity to
make massive changes in people's behavior over time, that's not
really good enough, right? That's it. We are

(30:22):
having countless people go down dark pathways where it's very
hard to turn back, and they're going into communities where there's all this reinforcement that's again supplemented by the way platforms work in the first place. Now, one of
the other things I wanted to mention is that,

(30:43):
as we talked about at the very beginning, we don't really know the full scope of the effect
of this. We know what's happening, we know that it's
a problem. The interesting thing is seeing some disagreements among
researchers who have really looked into this as to the extent.
So one of the articles I referenced with you, Bridget

(31:06):
was an opinion piece in Wired by Emma Briant. She was writing about the Oxford Internet Institute. They had released a study that was showing an increase in companies participating in influence campaigns, which is the way they word it.
I hate these words that we use

(31:29):
where we take the sting out of what's happening. It's an influence campaign. Although it does also give you
a new appreciation of what it means to be an influencer.
It makes it much more sinister, realistic, but sinister. Yeah,
campaign sounds like what an Instagram influencer does to get
you to buy coffee or something. Yeah. And now, granted

(31:54):
you could argue that the same ideas that are
used by influencers to try and get their followers to
go and buy whatever brand is supporting them are somewhat aligned with some of these other ideas. But I
think we could agree that there's a spectrum of harm
here, from, oh man, this drink that I bought

(32:14):
is real nasty, I wish I hadn't bought it, to,
Oh man, I found myself on the steps of the
Capitol on January 6th, and boy, am I sorry that I've done that. That's a spectrum right there. But what Emma Briant was writing about was that the Oxford Internet Institute comes out and says that there's been an increase year over year of this form of misinformation

(32:39):
campaigns and the role of the Internet. And Briant's
point was that we don't even understand the full scope
of this, and it's more like we're paying more attention,
so we're seeing more of this, but it doesn't tell
us that there's actually been an increase year over year.
What it tells us is that we're finally paying attention

(33:00):
to a very real problem that's been around for a while.
If I can make an analogy, it's kind of like
when people started to see what appeared to be
autism rates rise, but in reality it looked like it
was more that the definition and the way of diagnosing
autism had expanded to a point where we were just

(33:20):
realizing the number of cases that actually exist, as opposed
to its increasing. It was more that, oh no, it
was here, we just didn't recognize a lot of this
as autism. It was sort of the same argument that
Emma Briant is making, in that if you
come back and you say, well, I looked at the ocean,

(33:41):
and we took a bucket out and we filled it
up five times, so we know there's at least five
buckets of water in the ocean. And you say, well, yes,
but there's a lot else that's there that we don't
know about yet, and we haven't measured and we haven't quantified. And to me, that is interesting. It
shows an opportunity for actual, real research and analysis and

(34:04):
study into what's actually there as opposed to what's being reported,
and it's frightening. It's so scary because we don't even... Like she says, you know, they looked at a list of a couple of dozen different platforms
that were actively pushing out misinformation back in twenty nineteen,

(34:24):
I think it was. And she said, listen, Cambridge Analytica was active in six different countries by itself, and that was just one. And she says, I've just started and I've made a list of more than six hundred companies that are actively pushing out influence operations. So that's

(34:45):
why I wanted to set that ground early on. It's not to say that what we're talking about isn't relevant, but rather that we can't speak to any definitive scope, because we just don't know what that scope is. And that means that we've gotta be on the lookout. Yeah,
and I'm sorry to say this, but that Oxford

(35:07):
study that she references in that piece is incredibly influential, right? And so it's upsetting to see. I mean,
the name like Oxford, you expect a level of rigor,
you know, in the kind of research they're putting out, right? I'm not a researcher, so I can't really say. But the name Oxford, you're like, okay, that's

(35:27):
going to be a very reputable, you know, it's a reputable place. But it's got some research street cred. Yeah. Absolutely. And it's upsetting that that
study was so influential in the disinformation space when a
lot of the points that it makes are really flawed.
And she also, I have to say, she points out

(35:47):
that the study itself had a lot of typos
and like spelling errors, which I would be so embarrassed
if I put out an influential study and someone was like,
actually it was really sloppy, I would be like mortified.
Her point is that the work is largely without meaning
because it's creating a sense that we have metrics for
something where, you know, really it's such

(36:12):
a small subsection of the overall problem that her point
is that it doesn't really tell us anything, right? It just tells us the reported incidents. But that,
unfortunately is not really meaningful if you're looking at a
holistic approach to misinformation. And in fact, you know,

(36:33):
she was also pointing out that the
study looked at countries where the reporting mechanism might not
be as vigorous as in others, like countries that don't
have the same sort of perspective on things like the
freedom of the press, maybe state controlled press. You have
places like China and Russia where the state has a

(36:55):
large amount of influence on the media that goes out
in those countries. It does show that
it's dangerous to take any stance where you're making absolute
claims, because we just don't have the studies to really do that justifiably. We don't have the evidence to

(37:16):
back that up. You're exactly right, We don't know the
real scope of the problem and the sort of lay
of the land. I also think something that she points
out that's really important to keep in mind is that
a lot of folks from a media perspective just weren't
really talking about disinformation in a serious way prior to
like 2016, right? And so I think that an issue that

(37:37):
we do have is people kind of getting up to
speed with how we think about this issue in a
holistic way, because I think, you know, after 2016 and after,
you know, the insurrection, I feel almost overnight, this was
an issue that was getting more buzz and more press,
and people were talking about it more. But with that
really does come a need to take a

(37:58):
beat and analyze how we've got to this space and, you know, try to put some
more research around, you know, understanding what is actually going on.
And I agree that we haven't really gotten there yet,
and I am happy that there are more people who
are interested in this issue, but I don't want it
to become an issue where people are claiming to have

(38:20):
more knowledge than they do, or people are claiming that
we know more than we do. Because this is very much a still-developing problem that we're still sort of
trying to get a hold on. I would say, yeah,
we don't want there to be misinformation about how much
misinformation there is. Yeah. Hey, guys, Jonathan from the future again,

(38:41):
we're going to take a quick break, but we'll be
right back with more about social media platforms, algorithms and radicalization.
So this issue is getting to a point where the

(39:03):
concern is great, and yet you also look at the
tactics that are being used. You realize that the tactics
that are used are very old. I mean, you know,
the approaches to extremism, to propaganda, to misinformation, that stuff. We've got a handle on that. I mean, you know,
you can see numerous documentaries from everything from people who

(39:25):
were experts in creating the propaganda during World War Two,
on both sides of the conflict, up to, you know, the ad executives. I mean, that's what Mad Men was all about,
was the idea of framing. You know, how do you
frame information in a way to get people to do
what you want them to do. You know, the whole discussion about cigarette advertising was all

(39:49):
about that. So, like this is all old stuff. What's
new is this method of compartmentalizing communities and reinforcing
the delivery system of that material. So it's the identifying
of a potential candidate, the introduction of material to that

(40:13):
candidate that will set them potentially down this pathway, and
then the methods of reinforcing that and indoctrinating that person
into more radical views. Interestingly, probably not surprisingly, some of
the studies I was looking at suggested that this is
most effective for people who kind of have the lone

(40:34):
wolf kind of approach to radicalism, that in cases where
you're looking at groups of extremists, typically meatspace is where that kind of radicalization still happens primarily. But I would also argue that the insurrection on January 6th was in large part a lot of different

(40:56):
individuals who all just sort of kind of converged on
the same point. It wasn't as much, let's all join these online groups and then we start to plan from there. It was a bunch of
individuals who started slowly gravitating toward one another through methods

(41:18):
like this. And of course there are other communities I
mentioned before, communities like Parler, or par-LAY if you prefer, which, hey, they're back, that's great, where, you know, there's not so much the algorithm there. It's that it's
a community that is actively reinforcing beliefs. So in that case,

(41:41):
it's almost more like the traditional method of radicalization in the sense that you have this self-selected community that is following this process. So we have the whole spectrum here. We still have the meatspace stuff. We have the online communities that

(42:03):
are, if not specifically geared overtly, at least effectively geared towards radicalization. And then you've got
the stuff that everybody's using that can lead you to
that pathway. So, how are you feeling, Bridget? I mean,

(42:24):
as you were describing the issue, I almost felt this pang in the pit of my stomach, because we are up against so much. You know, we have the in-real-life meatspace organizing radicalizing people. We have these platforms that are engineered to radicalize folks. The scope

(42:44):
of the problem is quite large. And I often wonder like,
and I'm actually curious to know your thoughts. Do you think we'll ever tackle this? Do you think that we will ever get to a place where it is not just the norm for folks to be having these kinds of experiences, being radicalized online? I think, without forcing the platforms through legislation, regulations, whatever

(43:13):
it may be, to take a truly active role, and also to be incredibly transparent with how their algorithms work, we won't get there. And companies are going to be extremely resistant to that, obviously, because the algorithm is the secret sauce for making the money.

(43:35):
The companies are very resistant to making that
a transparent process because, for one thing, it could give
a competitor the opportunity to beat them at their own game,
make a better algorithm that does essentially the same thing
but in a slightly different way, and then they're no
longer king of the hill. I think it's gonna
be really tricky. I think the moves we're seeing in

(43:58):
the US government where there's the potential of breaking up
some of these companies, I don't think that that's necessarily
going to solve this problem. There will need to be additional measures put in place for that to be effective. Otherwise, all you're doing
is taking one big piece and you're making a bunch

(44:19):
of smaller pieces. But if they're all working the same
way as the big piece was, we haven't really solved
any issues. I think everyone recognizes the amount of power these companies have; that's undeniable. The question is how do we deal with that. My answer, from a personal standpoint, is not satisfying, because I just disengaged. I quit Facebook. But that's one person,

(44:45):
and I would never tell anyone else like, you've got
to quit Facebook. I might believe it really hard, but
I can't tell them that because that's the way a
lot of people stay in touch with their friends and family.
A lot of people rely on Facebook for their own businesses. I am in a luxurious position where I can disengage, and I got a dog.

(45:06):
I'm okay. I might not talk to my friends anymore, but I got a dog. And I still got all those dog videos on YouTube too, so really I'm living it up. But yeah, I mean, that's the thing, is that this is a huge problem,
and like a lot of huge problems, there may not
be a simple solution. There may not be one that

(45:27):
is completely satisfying, and it may be really messy to implement solutions that themselves could have unintended consequences that we'll have to deal with later. The important thing
is really acknowledging that problem, putting more effort into understanding
the scope and impact of that problem, and making sure

(45:50):
that our energy for solutions is directed in the
right place. Because without really understanding the scope and the
nature of the issue, the best we can do is
try random solutions and hope they work. But with a
deeper understanding, you can craft a pathway that at
least has a better chance of making a positive impact.

(46:12):
That's kind of a lame way of saying it,
But the more I looked into this, the more I thought,
we just don't have a deep enough understanding, and it's
largely because we didn't take it seriously, like you were saying,
Bridget. I mean, before Brexit, people were aware that there
are lies posted on the internet because people lie. You know,

(46:34):
wherever people are, we're gonna find falsehoods. But when Brexit happened,
and after all the fallout about accusations that the support
for Brexit was largely based off of unsupportable claims, that's
where it kind of started, the snowball effect of Wow,

(46:54):
we've really let this get to a place where we
don't have a handle on it, and we honestly
don't even know how bad it is, and we still
don't five years later. So yeah, I mean I feel
very similarly, but I am hopeful that we are finally
taking it seriously. I wish we had gotten here three
years ago, five years ago, ten years ago about understanding

(47:16):
the impact that this has. But I'm glad that we're
here now, and I think what we need to do
now is have these honest conversations, put money behind the
research to actually understand what's going on, because I think
we're tackling it late. You know, I wish we had
gotten there earlier. But I'm glad we're here now. And
you know, you talked about how you deleted Facebook. I
still have Facebook. I need it for work. But

(47:38):
I always say, you know, a lot of these issues
are big, systemic. You know, we're talking about your Mark Zuckerbergs, we're talking about policy-level things, and it's very difficult as an individual to feel like you have any impact over that. And that's true, but I think
we can all take small steps in our individual lives
to assess and be critical of the role that platforms

(47:59):
play in our own media diets and how we
use those platforms. And so you know, I would also
never tell anybody to delete Facebook, especially in a pandemic
where you might not have the ability to see your
friends every day but you want to feel connected. But
maybe you can delete Facebook from your phone so you're
not carrying around a physical reminder of its presence on
your person at all times. Right, maybe you only go

(48:19):
on the browser on your laptop
or whatever, and you're deciding when you're gonna engage with Facebook. Right,
maybe you're like, I have a new rule in my house,
no phones in the bedroom. Right. We can all find ways of having an impact
on the way that Facebook plays out in our personal
media diets and how we use it in our personal lives.

(48:40):
And so while we may not be able to have
a huge impact on the grip that platforms have on
our democracy and our discourse, those are small things that
we can all do. You know, if you're gonna share
something on Facebook, read it first. You know, before you
comment on something, make sure that you read it. You know,
just little things, little steps. Agreed, Agreed. I think also
we should point out the fact that you and I

(49:02):
also host shows where we get the opportunity to reach
out to our listeners and to suggest to them that
they incorporate qualities like critical thinking. I always partner critical
thinking with compassion. I think if you have one without
the other, things don't go so well. But when you
got both, you don't go too wrong. You can't go wrong. Yeah, you really just got to employ

(49:25):
both of those things. And I am so thankful that
you joined me for this discussion, and I hope that
our listeners really get an understanding. Like, the reason we
talk about this is because the stakes are very high,
and clearly, if you recognize there are elements of friends

(49:45):
or family of yours who appear to be maybe going
down a path, you should seek out resources to help
you and perhaps help them. It's a very delicate thing.
I certainly am not an expert to tell you how to handle a situation, and every situation is different. But I'm seeing this real impact on people I know.

(50:08):
There was a person that I worked with years ago
in a theatrical production, and several years ago I saw
him making some statements that kind of had him going
down the men's rights pathway, and that was
a big warning sign at the beginning, but it has
since gotten worse, and I just wish that I had

(50:29):
recognized things earlier and perhaps been a positive influence on his life too, and not have him go down that pathway quite so wholeheartedly. People are their own people.
They'll make their own decisions, but we can always be
supportive and helpful in certain situations. And one way you

(50:50):
can be supportive and helpful for yourself and for others
is to subscribe to There Are No Girls on the Internet.
Because it's an incredible show. I don't just say that
because I am the executive producer of that show. I
say it because, if I had no involvement whatsoever, it's still an amazing show. In fact, I should point out I have a very

(51:13):
light touch on that show. That show is amazing because
of Bridget, because of Tari, and because of the incredible
amount of work you put into it. And definitely go and check out Bridget's show, There Are No Girls on the Internet. See what all the accolades are about, because this is a
show that's received quite a few of them. Oh, I mean,

(51:34):
I couldn't do it without you, Jonathan, and Tari. This means so much coming from you, you know, the tech podcast guru. I got a poster of myself in the background. That's how important I am. I don't know if you noticed that, Bridget, but that's me back there. If you ever wondered who's the
kind of guy who hangs a poster of himself up,

(51:55):
well, it's me and apparently Donald Trump. I guess I'm not in good company, but I can't deny it. It's on the door. Thank you guys for listening. Make sure you again subscribe
to There Are No Girls on the Internet. Go check
that out. Look at the list of shows, because there
have been some really incredible episodes. You've had some amazing

(52:17):
guests on that show, and it just makes me
want to become a better interviewer. So I thank you
for that too, because that's been very inspirational to me.
And if you guys have any suggestions for future topics
for TechStuff, you can reach out to me on Twitter. Yes, I'm still active there, and the handle to use is TechStuffHSW, and I'll talk

(52:38):
to you again really soon. TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever
you listen to your favorite shows.
