All Episodes

March 12, 2024 56 mins

This week Biden said he would sign legislation into law banning TikTok if Congress passes it. TikTok is one step closer to being banned in the United States after the legislation unanimously passed a big committee vote last week.

Popular TikToker and misinformation researcher Abbie Richards and I discussed what a ban would mean for marginalized communities. 

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
If you are against misinformation and hate speech, you know
you got to be against it on all platforms and
not just on TikTok.

Speaker 2 (00:15):
There Are No Girls on the Internet.

Speaker 3 (00:16):
As a production of iHeartRadio and Unbossed Creative, I'm Bridget
Todd, and this is There Are No Girls on the Internet.
This week, President Biden said that he would sign legislation
into law banning TikTok if Congress passes it. This comes
after legislation that would force TikTok, which is owned by

(00:38):
the Chinese company ByteDance right now, to sell to
an American owner, passed a big vote last week. The
House Energy and Commerce Committee voted unanimously to pass the bill,
meaning TikTok is one step closer to being essentially banned
in the United States. Last year, TikTok CEO Shou Zi
Chew went before a congressional hearing to defend the platform.

(00:58):
Popular TikToker and disinformation researcher Abbie Richards and I discussed
what a ban would mean for marginalized communities. With the
platform being one step closer to being banned in the US,
it is really important to understand what this means for
our media landscape and for marginalized people. Now, y'all know
I love TikTok, but I also know that, like any

(01:18):
other social media app, it is not without its issues.
TikTok disinformation researcher and friend of the show, Abbie Richards,
was one of the first people calling out TikTok for
the way that mis- and disinformation and hate speech spread
on the app. But for all of her criticisms of TikTok,
Abbie still says that banning the platform is not the
right move. In a recent op-ed for Newsweek that

(01:40):
you can read in the show notes, Abbie explained why
she thinks banning TikTok will ultimately hurt marginalized communities. I
caught up with Abbie right after we both watched the hearing.
Abbie Richards, friend of the show, how are you?

Speaker 1 (01:55):
I am angry, Bridget. I am an infuriated girl right now.

Speaker 3 (02:02):
Now, as someone who has really built a lot of
your career not just on TikTok, but like calling out
TikTok for the ways that it can spread misinformation and
hate speech. What has it been like to watch elected
officials talk about banning it.

Speaker 1 (02:17):
It's been infuriating because they're gonna get it all wrong. Like,
I have literally built my career criticizing TikTok. I will
be the first to tell you about like the copious
amounts of misinformation and hate speech, extremism, harassment that happen
on that app, but to then try and ban it
without any evidence that it is a national security threat whatsoever,

(02:43):
and just using like fear mongering and pointing to other
pieces of misinformation that I have tried to debunk about
TikTok. Like, this is the type of
misinformation that emerges off the platform. So if you were
watching the hearing yesterday, you would have seen a lot
of our congressmen and women pointing to TikTok challenges

(03:06):
that are causing children tons of harm. And I have
spent so many, like countless hours debunking these exact like
moral panics around these challenges, where essentially adults try and
blame the stupid things that kids do on the newest technology.
We see it over and over and over again. But
the fact that you know that's being used to justify

(03:29):
the banning of this app too is infuriating. And I'm
deeply concerned about what the consequences will be of banning
an app that one hundred and fifty million Americans use.

Speaker 2 (03:42):
So you brought up something really interesting.

Speaker 3 (03:44):
I know that these, like, so-called deadly challenges
were a big part of the hearings and something that
I mean, I almost hate to ask this question, but
I feel like it's an important context of the conversation.

Speaker 2 (03:56):
That is like great news for Facebook.

Speaker 3 (03:58):
They're like, oh, our strategy is winning, Like our strategy
to turn the public and of course elected officials against
TikTok and make them think that like it's this like
place where all these deadly challenges emerge, even though some
at least some of those challenges actually emerged on Facebook,
if you really want to get into the nitty-gritty.
So did you see in the hearings the ways that

(04:19):
like fear mongering just sort of writ large about technology
was taking center stage in these hearings.

Speaker 1 (04:26):
It's almost like they're taking this fear that they have
of new and emerging technologies and how young people use
them and just placing it on one singular app. So
rather than actually dealing with the consequences of this like
rapidly changing society because of the expansion of technology, like
on a monthly basis, we just want to like pin

(04:49):
all of our anxiety on one app and cross our
fingers that like, if we just ban this one app,
it'll all go away. And so I think we see
that a lot with the challenges and like the
steam that they pick up among people who are not
on the platform. In particular, we know that it was
the Washington Post that reported that there was a consulting

(05:15):
It was the Washington Post that reported that there was
a marketing group that was working with Meta to place
abeds about, you know, fear mongering about TikTok. So that's
something that we have seen happened before, and it certainly
seems like this is a big win for Meta. If
you are on TikTok at all, you are probably seeing

(05:37):
this widespread popularity of almost conspiratorial language where people
are blaming the ban on Meta, and Google too
to some extent, saying that Meta and Google are lobbying

(05:57):
Congress people and encouraging them to ban their biggest competition.
I don't know how much truth there is to that,
because I think that there is a lot of geopolitics
there too, and it seems like the US government is
you know, very interested in like splitting the Internet as
much as possible right now. So it's hard to say
if it's like one cause versus another, but this does

(06:20):
definitely look like a big win for Meta, and they
used rhetoric that Meta has also pushed before.

Speaker 3 (06:28):
Yeah, so in terms of a TikTok ban, So in
your piece you write about how, you know, obviously you
are the first one to admit that TikTok has its
downfalls and the democratization of like media has its downfalls,
but you also talk about the ways that this could
actually harm marginalized communities. How do you see that working?

Speaker 1 (06:47):
So okay, well, we'll start here. Consider, like,
institutional media, legacy media, so New York Times,
Washington Post, Wall Street Journal, CNN. They are top
down information systems, which means that these newsrooms essentially look
at all of the news and then decide what is

(07:09):
newsworthy and then disseminate it to their audience. TikTok is
much more of a bottom up information system, which means
that anyone can post basically anything that they want, and
then the algorithm is working with the users to determine
what people are watching, and in that way, the users
decide what is of importance to them. So we have

(07:33):
two different information systems, and this top down one tends
to privilege white, richer, more privileged people. It tends to
be composed of people who come from more privileged backgrounds.
It speaks about their problems and things that they are

(07:53):
interested in thinking about. They in particular, like they have
to sell subscriptions or sell ads, so they have to
run stories that they know that the wealthy audience that
they pander to will be interested in the first place,
because it's really not that profitable to tell stories of

(08:15):
marginalized people who can't afford to subscribe to their newspaper.
So TikTok, being this much more bottom up infrastructure, allowed
for two major like kind of newish forms of massive
dissemination of news and information. The first would be seeing
information from first hand accounts. So that's like when, you

(08:39):
know, the invasion of Ukraine started, where we
saw all of these videos of people fleeing the country
or being in bomb shelters. So you're seeing people who
are actually experiencing the events posting online and that garners
attention without editors needing to push it out. And then
the second is through trusted messengers. And these are people
that can speak to specific communities, and they

(09:01):
have had a lot of success on TikTok. They
have built up these communities where they are trusted sources
of information and sometimes opinion. People go to them to
try and feel like they can understand certain problems. And
you know, this could range from a community of like
ten thousand people to communities of millions of people. But

(09:23):
this is particularly advantageous to marginalized groups who probably can't
afford a subscription to the New York Times, or alternatively
they don't want one because they don't trust the New
York Times. They've never seen themselves represented in the New
York Times. Like, there are so many reasons why they
might not turn to legacy media, but they will turn
to like a trusted messenger who looks like them, who

(09:44):
speaks their language, who they feel understands them in their communities.
And so if we dismantle that infrastructure like suddenly, like
flipping off a light switch, you'd be leaving all of
these communities without their information infrastructure, leaving them in the dark.
It could be really, really damaging for them. And they're

(10:05):
very resilient, all of these marginalized communities, whether it's
communities of color, queer communities, disabled communities, they're very resilient.
They will rebuild. But keeping them in a state of
like constantly rebuilding the information infrastructure stops them from being
able to do things like organize and mobilize and gain
power because they're constantly rebuilding their information infrastructure. So yeah,

(10:28):
I'm mad.

Speaker 3 (10:30):
I mean, that's such a good point. Like, where
do I even start? I think that one of the
reasons why we're even having this conversation is like a
fundamental misunderstanding of the ways that platforms like TikTok have
worked and what they have like been used to, like
what movements they've been used to build. Right, Like I
still see people being like, oh, it's a kid's dancing app,

(10:52):
which like part of my soul dies every time I
hear that. You know, looking around at, like, what was
happening to women in Iran? I have to say, I
don't think I would have known what was happening. I
don't think I would have known what was going on
if not for TikTok, if not for Iranian women saying, hey,
this is what's going on, here's how you can help,
here's what we need. I don't think I would have
been able to be plugged into what was happening if

(11:12):
not for TikTok. And so I think that it's
like the people who are legislating
this don't necessarily have that visibility into the ways that
it has functioned for communities who are marginalized. And so
if we're gonna have any conversation about just
an outright ban, they're not even including that in the conversation.

(11:33):
And I feel like it's kind of like what you said, like,
if we're going to do something about how these apps
and platforms function in our lives and in our like
media ecosystem, let's not have it be done hastily and
badly, right, where people are gonna
have the rug pulled out from under them, and
their very real issues are not even being discussed or,
like, you know, brought to the table.

Speaker 1 (11:54):
Oh yeah, there's no nuance in this conversation whatsoever. And
you would think that if we're going to have a
conversation about pulling the plug on one of the biggest
pieces of communications infrastructure in the world, that is a
conversation that we should be very careful about. Dismantling infrastructure

(12:18):
that large is not a decision that should be taken lightly.
And yet it seems like our representatives like truly have
bought into this idea that TikTok is somehow simultaneously just
a kid's dancing app and also spyware. Like it's really
confusing how it can be both, and like saying like, oh, well,

(12:39):
you could push, you know, specific
misinformation campaigns and try and sway the American populace, but
then also simultaneously ignoring the fact that like it is
a messaging center for a huge portion of the American populace.

(13:00):
If you just remove it, there will be consequences. And then when
I bring this up, I see some people just like, oh, well,
you know, national security is
more important, or it's like everyone will just go to
a new app, and like, yes, probably they will, but
that doesn't mean that it won't cause a lot of

(13:22):
harm in the process. There's still gonna be lots
of creators who lose their jobs.
There are going to be people who have built these large
audiences who now have nothing, and a lot of people
whose voices were empowered and suddenly were coming into a

(13:46):
position of discursive power who will now not have access
to that. And even if like yes, technically, if you
look at like a very big picture level, people will
migrate to new apps, sure, but there's still gonna be
a lot of consequences and we can't really ignore that.

Speaker 3 (14:06):
Yeah, I mean, even looking at what's going on with Twitter.
I there was a time where I was like, Oh,
everyone's gonna be using Mastodon or this or that.

Speaker 2 (14:15):
It hasn't.

Speaker 3 (14:16):
I know that our communities, particularly marginalized communities, as you said,
are really resilient and we always find a way.
We always make a way out of no way, right?
Like, that is what we do. But watching
Twitter become a place where people spend less and
less of their time, there hasn't been like one place
where everybody is like setting up shop, and so it

(14:38):
takes longer, it's messier, it's more energy for these communities,
many of whom, if you look at the trans community
and the queer community are up against quite a bit
right now, and so maybe having to pick up and
rebuild a new communications infrastructure elsewhere is not what they
need right now. It is maybe not gonna
be like great for those communities.

Speaker 1 (14:58):
It'll be horrible for them because now, in addition to
legislation that is threatening their right to exist, they also
don't have their communities that they can turn to for advice,
for support, for information, and they don't have access to
their broader communications network to mobilize right and plan what

(15:23):
they will do in response or try and have a
place they can turn to to raise funds if they
need surgery, or if they need you know, legal fees,
or if they need to get out of whichever state
they're in. If you remove the communications infrastructure, and we're
talking about a marginalized community that is undergoing a lot

(15:47):
of legislative challenges right now, that's putting them in a
position that just increases their marginalization.

Speaker 4 (16:00):
Let's take a quick break, and we're back.

Speaker 3 (16:14):
So let me zoom out a little bit and just
make sure that I have this right. Lawmakers are saying
that they want to ban TikTok because it's a national
security risk. This is a little bit above my pay grade.
I am no digital national security expert, and I want
to make sure that my understanding of this aligns with
your understanding. So it is true that TikTok is a
Chinese owned company, and it is true that potentially that

(16:37):
could open the door for all kinds of national security risks.
But what we don't have right now is a smoking
gun that this is actually a problem. We're very much
speaking in hypotheticals of what could happen or what potentially
might happen. So lawmakers right now are trying to legislate
TikTok because of this hypothetical threat that might be out there,

(16:57):
but has not yet revealed itself.

Speaker 2 (16:59):
Do I have that right?

Speaker 1 (17:01):
Yeah, I would say you nailed it. They haven't provided
any evidence of instances where China used TikTok in any
sort of antagonistic way towards the US. It's only in
the hypothetical of like they own it and they maybe could,

(17:23):
and even if they could, it's kind of unclear how
they would go about doing that. What we do know
and like, what feels like it's missing from this conversation
so much is that even if we were to ban TikTok,
it would not stop the Chinese government or any government
or any bad actor from just buying your data elsewhere.

(17:46):
TikTok is roughly collecting about the same amount
of data as any of the other major social media platforms,
and then they profit by selling it. Well,
within the hearing, he denied that they
sold data, so maybe they're not even selling the data.
But social media platforms in general collect your data and

(18:09):
sell it, and it is not difficult for anyone with
any sort of intention to just buy your data so
that they can target you with specific advertisements and specific messages.
So if we were actually concerned about the Chinese government
or any other government manipulating Americans, we would be implementing

(18:32):
like comprehensive data privacy legislation rather than going after
one singular app. Because even if you were to ban TikTok,
and even if TikTok were using its data in that way,
which we still have no evidence that it does, they
would still just be able to purchase your data anywhere else.
So this is like applying a used, gross, infected

(18:55):
band-aid to a wound that will really not fix the wound
at all, but will almost certainly add additional infections.

Speaker 3 (19:06):
Oh my god, I was just about to read your
quote in the Newsweek piece back to you. But yeah,
I mean, this is what like really
gets me about this conversation: like, I guess,
we have no meaningful data privacy legislation in this country, right?
And so, like, the fact that we're even having
this conversation indicates to me a complete failure of leadership,

(19:29):
governing and legislating, right, And so I feel like banning
TikTok is this big flashy gesture of like.

Speaker 2 (19:37):
See we did something.

Speaker 3 (19:39):
See, we got it together and governed and made
some kind of a gesture toward, you know, data privacy,
but it actually does nothing. As you said, like there's
nothing keeping these companies or really anybody from buying our data.
Like, the fact that our
data is so misused, like, that is the baseline.

Speaker 2 (20:00):
Shouldn't we be talking about that?

Speaker 3 (20:01):
So even if we were to ban TikTok, it would
be totally possible for the Chinese government to access Americans'
data in all other kinds of ways, and maybe they have.
And so it's like, what does it really do if
the real problem is our lack of data privacy infrastructure
in this country.

Speaker 2 (20:16):
Other than being a big, flashy gesture, what does banning
TikTok actually achieve?

Speaker 1 (20:23):
I mean, to answer your question: nothing. On a political level,
I think that it is a distraction that allows for
the American public to feel like something was done to
make them feel safer, while simultaneously not passing the legislation
that would actually protect the American public. And like, we

(20:48):
are a superpower when we demand that all of the
major social media platforms stem from us. How dare
the Chinese also have one?

Speaker 2 (20:58):
Yeah?

Speaker 3 (20:58):
I saw this great USA Today headline that was like,
how dare China try to get my data
from TikTok? Don't they know that that's for Google and Facebook?
But I think, like, we have just
lost the plot of doing anything to meaningfully address this.

Speaker 2 (21:17):
And yeah, I.

Speaker 3 (21:19):
Think like during the hearing, some of the things that
came up around like content moderation and stuff, those are
important conversations to have, but having TikTok be
the face of you know, a lack of content moderation
or hate speech and harmful content proliferating on a platform.

Speaker 2 (21:37):
As someone who has worked.

Speaker 3 (21:39):
In this space for a while, I'm deeply uncomfortable with
that because I think it creates a scapegoat that
lets these other, I would say, sometimes bad actors off
the hook, right? Like, we're not having the conversation
in a meaningful way to actually get results. We're just
making an example out of this one big bad boogeyman,
which is TikTok.

Speaker 1 (21:58):
Yeah, definitely. And I mean the examples that they were
pointing to, aside from these like moral panic challenges, the
examples of violent content on the app, it was really wild
that they were pointing to them in the first place.
Because as far as removing you know, really violent content,

(22:20):
TikTok is better than most of the other platforms. I mean,
I professionally criticize them. One of my favorite things
to do is bullying TikTok. But they will generally be
pretty good about removing anything that I flag towards them.
And they have a lot of like horrible things on
their app, like, absolutely, but all social media platforms do.

(22:42):
And so what you really want to be looking at, too,
is just how much reach are those things getting. So
one of the videos that they pulled up during the
hearing was imagery of a gun, and I think
it included one of the committee members' names, and
they were railing on TikTok because the video
had been up for, I believe, forty one days, and

(23:04):
what they were failing to account for is that the
video had like no likes. It was not a video
that had been engaged with whatsoever. So arguably, it's
actually evidence that TikTok was doing a decent job with
its moderation, that it had like siphoned this off
and contained it more than anything else. Like, yes, it

(23:25):
shouldn't be on the platform at all, but if it's
not getting any reach, really, then TikTok's moderation to an
extent is working. That's the whole point. Like,
you want these things contained, you want
them quarantined so that they don't spread. So being able
to point towards that was like, it was such a

(23:45):
strange choice. And then it was additionally strange because we
love guns in this country but apparently have problems with
images of them.

Speaker 3 (23:55):
If you watch the hearing, there were actually some pretty
good questions that came up. Representative Yvette Clarke asked whether
or not TikTok suppresses marginalized creators, and Representative Lisa Blunt
Rochester asked about whether accurate information about abortion is being
suppressed on TikTok. But what was much more abundant, sadly,
were questions that weren't just silly, but also underscored how

(24:16):
little some elected officials seem to know about the technology
that they are responsible for legislating.

Speaker 2 (24:23):
Yeah, I mean that example really.

Speaker 3 (24:25):
Brings me to something that, like it pains me, but
I can't not talk about it with you: how some
of the questions and examples that these elected officials were raising
were just embarrassing. Like, you know, Representative Richard Hudson from
North Carolina. Like, I'm sure you know where I'm going,
what I'm getting at: the, like, exchange about whether or

(24:47):
not TikTok accesses users' home Wi-Fi. So, like, do you
ever feel like the people
that we have elected to represent us and to advocate
for us and our interests up against big tech just
are making it clear that they're talking about legislating technology
that they seem to not really understand or get.

Speaker 1 (25:09):
No, it was quite clear that no one on that
committee had ever opened TikTok, like they had never used
it before. They're just making wild statements and accusations and
guesses about how it works. And if you go on
TikTok now in response to that committee, I mean, they're
just being ridiculed because the users can tell that these

(25:33):
congressmen and women have never really actually used the app
or know what they're talking about in any way, shape
or form. And it is mind bogglingly infuriating to witness
people who don't even understand how Wi-Fi works, or
like couldn't tell you the difference between a modem and

(25:53):
a router trying to legislate major social media policies, like,
and major data privacy policies. Like, they don't understand any
of it, So how are we expecting them to even
know where to begin when it comes to those bigger
legislative problems.

Speaker 3 (26:12):
It really goes back to what you said in the
beginning of our conversation of like, if we're gonna do it,
let's not do it badly. Let's not do it in
a way that like we are talking about a pretty
major communications platform. If we're gonna do it, let's not
have the people in charge make it clear time and
time again that they have no idea what they're talking about.

Speaker 1 (26:31):
Maybe, just maybe, that could be a
good place to start, is like knowing that you need
to use Wi-Fi to access TikTok. Like, oh
my god. I really liked the exchange about whether or
not TikTok tracked how much your pupils dilated in response

(26:55):
to certain videos. And it's like, no, they look at
watch time. They're not like reading your pupils to
see how big they get in response to each video.
I don't know what Black Mirror episode you think we
are in. We're in a different one. We're still in
one, just...

Speaker 2 (27:13):
A different episode. Yeah, I don't think
that TikTok.

Speaker 3 (27:17):
Is, like, logging your biometrics, dude.

Speaker 2 (27:21):
Like, it just was.

Speaker 3 (27:23):
Some of those questions were concerning. And I also think
they just, like, reveal how we just need younger people,
I think, like younger folks, both
in elected office and also being listened to.

Speaker 2 (27:39):
I think is really really important.

Speaker 3 (27:41):
People who actually have real world experience with the technology
that they're talking about, because it was, as you said,
it was just so clear that many of the folks
in Congress just had no idea what they were
talking about.

Speaker 2 (27:52):
Yeah.

Speaker 1 (27:52):
No, we would absolutely need to have younger people or
just more, you know, digitally literate people writing these laws,
because if you truly don't understand that, like the app
needs to track where your eyes are so that they
can put sunglass filters on you, like, that is
just something that a younger person would need to explain

(28:15):
and need to help them understand. And if all
of our legislators are, you know, I don't want to
say dinosaurs, but like they're acting like dinosaurs right now,
and I feel like it's called for, then we
can't expect solid digital legislation from them. Like,
they don't know what they're doing. They don't understand anything

(28:37):
beyond email at this point.

Speaker 2 (28:39):
Yeah, that is sad.

Speaker 1 (28:42):
I mean, it's really sad. It's really scary because if
you want to talk about like the American people being vulnerable,
our leaders not even understanding like the you know, bare
minimum that they need to understand about how the internet
works, like, that leaves us quite vulnerable.

Speaker 2 (28:59):
That's such a good point.

Speaker 1 (29:04):
That seems like a much bigger threat. Our leaders failing
to understand anything about technology feels like a much bigger
threat than this TikTok boogeyman scare.

Speaker 3 (29:17):
So what do you say to someone listening who is, like,
I am concerned about the national security threat that TikTok poses. Still,
what would you say to that person?

Speaker 1 (29:27):
I would say that at this point in time, we
have seen more evidence of Facebook engaging in schemes to
manipulate the American public than we have TikTok. I mean,
we have evidence of Cambridge Analytica, we know that Meta
was involved with this, we have no evidence of TikTok
doing it. So at the moment, all of our fears

(29:49):
are based around a hypothetical, and they are based
around this, like, boogeyman of China, and that might
not be the best place to create legislation from. Like,
that mentality might not lend itself to the creation of
really strong digital privacy legislation. And, I mean, also,

(30:18):
if you are personally concerned about TikTok, like, there's no
reason why you have to have it on your phone;
that's a decision that you can totally make. If
you have national secrets on your phone, like, maybe
don't have TikTok on it. But then also, like, maybe
don't use any other social media platforms, because they are
all gathering your data.

Speaker 3 (30:42):
More after a quick break, let's get right back into it.
Other platforms are also gathering your data and behaving badly
and acting in ways that cause all kinds of harm.

(31:02):
But this legislation is very much black and white, and
it seems very much based in scare mongering about TikTok
as this potential boogeyman. And I just don't think that
fear mongering is a place that will elicit good, meaningful,
substantive legislation. You know, black and white thinking on a
nuanced issue is bad enough, but black and white legislating,

(31:23):
I don't know.

Speaker 2 (31:23):
I don't like it.

Speaker 1 (31:25):
I don't like it either. I don't trust it. It's
not coming from a smart place. It's coming from a
fearful place. And that's not where we get good legislation.
And if you want to have a much bigger conversation
about social media and the future of social media and
whether it's bad for us, whether it's good for us,
what harms it does, but also you know what net

(31:45):
positives it brings to the world. I am here for
that conversation. It'll take us a few years to get
to the bottom of it. It is not like a
quick and easy conversation to have, but I do think
that that's one we should be having, and I've been
trying to push for it. It's like, okay, well, there
are all these issues with like misinformation and extremism on

(32:05):
social media, and we know that these platforms benefit from
the posting and sharing of content that appeals
to our emotions, that is often false or is, you know,
oversimplifying really complex issues, or is just, like, extremist and
hateful in nature. We know that they benefit from that
because that's the sort of content that keeps people on

(32:28):
their platforms, consuming content and therefore consuming ads. And I
would like us to be working towards the creation of
like digital communities that are better for our emotional and
social well-being. I could absolutely see that occurring in

(32:50):
the future. It's like creating these digital public spaces where
people can go share information, but that aren't driven exclusively
by profit and are built with the goal of creating
a more well informed and empathetic public. But that doesn't

(33:12):
happen just because you ban one app you don't like. Like,
that is deep, deep infrastructural change that we need to
make and I think is the only way really forward
unless we want to drive ourselves insane. But that's that's
not a light and easy task.

Speaker 3 (33:29):
It's certainly not going to be achieved by banning one app.

Speaker 1 (33:33):
Yeah, if you are against misinformation and hate speech, you
know you got to be against it on all platforms,
and not just on TikTok, because Facebook, Instagram, YouTube, Twitter,
they all have it. I mean, Twitter right now is
also such a mess, just as a like, it is
wild to watch all of this, this giant attack on TikTok. Meanwhile,

(33:59):
Twitter is just falling apart, crumbling, and has disintegrated as
like this reasonable public space, and the fact that
you know, we would be losing TikTok and essentially be
losing Twitter, that would really leave Meta with a lot

(34:23):
of power. But I also think it's interesting that like
one rich guy can just buy and ruin Twitter, but
like God forbid an app has some Chinese ownership.

Speaker 3 (34:38):
If you had to say, do you think that
TikTok is going to be banned in the United States?

Speaker 1 (34:43):
I'm pretty fifty-fifty right now. I don't like it.
If you had asked me before, I'd be like, probably not,
because I thought that the Dems were smarter than that.

Speaker 2 (34:57):
I don't know about that.

Speaker 1 (34:58):
I know, I know. In hindsight, what was I thinking?
They love losing. But I'm just I was baffled that
the Biden administration pushed for the sale versus ban because
I thought that they knew that if they lost TikTok
they would not be able to mobilize voters in twenty

(35:20):
twenty four. But apparently they don't know that and they
just are down for political suicide. It really baffles me.
So I continue to be wrong and surprised by what
they will do in the name of like this weird
singular bipartisanship over xenophobia. It's the one thing that they

(35:45):
can relate to the Republicans on right now.

Speaker 3 (35:49):
Also, you know, you made the point about mobilizing younger voters.
I said this on a panel and it got like
a laugh, and I was like, oh, I was actually serious,
I think, and this might sound it's gonna sound how
it sounds, but I truly believe this. I think if
TikTok is banned in the United States, I see that

(36:12):
going very poorly for Biden and the Democrats. Not to
make this like a horse race issue, but I think
that some of the most interesting, meaningful organizing of young
folks politically is happening on TikTok. If you look at
the conversations that younger folks are leading on when it
comes to things like climate justice, those conversations are happening
on TikTok, and they are mobilization conversations. So

(36:35):
I'm not saying that I think that young people will
be like, oh, we love TikTok and we're not going
to vote for Joe Biden just because he banned it
and we love it.

Speaker 2 (36:43):
I think it's much deeper than that.

Speaker 3 (36:44):
I think that gen Z uses it as a platform
to do political and social mobilization and organizing.

Speaker 2 (36:51):
If you show that you don't.

Speaker 3 (36:53):
Not just don't understand that, don't see it, don't value it,
ignore it, that is going to be a problem. And
so it's not an issue of it like, oh, these
whiny brats just want their dance videos. No, no, no, no.

Speaker 2 (37:03):
I think it's much deeper than that.

Speaker 3 (37:04):
And if you show that you aren't even
able to see that, it is going to be a problem.

Speaker 1 (37:10):
One hundred percent. And people keep playing this off as like, oh,
you're going to piss off gen Z, and it's like,
that's less what I'm worried about. I think a
lot of gen Z would still you know, the ones
who are going to vote anyways, will still vote for
the lesser of two evils, Like that's going to continue.
But what you will lose

(37:31):
is all of this mobilization infrastructure where people are getting
their news, where people are having like lots of political discourse.
What is helping young people to understand the world is
the discourse on TikTok. Is the trusted messengers that they
turn to on TikTok. And if you take that away,

(37:52):
like they're just going to not be as politically engaged.
But because TikTok made it really easy for them to
get politically engaged and get politically informed and deep dive
into this discourse. Like you know, the people keep saying,
TikTok is this dancing app and it's so so wrong
because like, fundamentally, it's a discourse app. It's where people
are having conversations about the world. That's why like stitching

(38:15):
was such a groundbreaking feature, because you could take one
video and respond to it and you get this like
built and constructed, constant discourse about political and social issues.
If you remove that, you know, they might still be
left leaning and like some of them might vote, but
they won't be as informed on certain issues. They won't

(38:38):
necessarily know where to be putting
their energy for certain political issues. Like they would
just be losing so much of their infrastructure and that
will have severe costs, particularly for the Dems. And it's
really mind blowing to me that they don't recognize that. Yeah,

(38:58):
I'm just I'm shocked. I shouldn't be. I shouldn't be.
I'm years into this, I shouldn't be.

Speaker 2 (39:05):
Well, here's, I mean, the big question.

Speaker 3 (39:08):
Where will Abby Richards go if there's no TikTok? Like
I think of you as the person who is like
made a platform, like one of the first people who
really made a platform from calling out and criticizing TikTok
and advocating for it to be better. Where do you
personally see yourself showing up if it's banned.

Speaker 1 (39:26):
Here's the thing. Like, on one level, I think
I experienced some panic where it's like, if I lose
my platform on TikTok, do I lose my power? Like
you know, will I still have work to do, will
people still see me as valuable. And then after, you know,
a little bit of dwelling in that panic, I got
over it because it's like, no, I'm privileged. I have

(39:46):
a lot of like institutional power at this point, like
I can pivot to other platforms if I want. And fundamentally,
I think that the deeper we go into a tech dystopia,
the more job security I actually have, which is dark.

Speaker 2 (40:01):
But that was some dark, Abby. That was some dark shit.

Speaker 3 (40:05):
But I mean you're right, right. Like you call
out TikTok, but really you're calling out like our media
ecosystem the way that it impacts our democracy, Like it's
so much more than one platform.

Speaker 2 (40:18):
Your voice is about so much more than one platform.

Speaker 3 (40:20):
I guess I'll put it that way, But I agree
with you that, Like you know, when I started this
tech podcast, I remember thinking like, oh,
maybe, what if I run out of content, what
if I run out of like things to talk about?

Speaker 2 (40:34):
Nope, I sure never will, never ever will.

Speaker 1 (40:37):
I mean my goal is to eventually put myself out
of a job. I would really like if I could
work hard enough and eradicate all of the misinformation and
hate speech online, and then one day I get to
retire and I don't need to do this job anymore.
I don't see that happening. So I think, you know,
if TikTok is banned, I will go to other platforms.
I will build out my presence on other platforms too.

(40:59):
I'm far more concerned about, like the people who don't
have the institutional backing and who don't have power outside
of their TikTok presence, Like they will have their lives
much more uprooted than I think I will.

Speaker 2 (41:17):
We are in such uncharted waters.

Speaker 1 (41:20):
Yeah, no, truly, what happens to the creators? Because
I saw one congressman went
on Politico with like, no, they'll be able to switch
to a different platform, as if they haven't spent the
last two, three, four years building a platform, so now
they can do brand deals or now they can make

(41:41):
money from the Creator Fund and now, oh no, they
can just start over, it'll be fine.

Speaker 3 (41:46):
Who cares? Like, do you know how long
that takes? But again, it's showing
that the people who are meant to be legislating this
just don't have a realistic understanding of
what they're actually talking about. And when they make
suggestions like that, it's like, have you taken a step
back to ask what that actually looks like in practice?

Speaker 1 (42:08):
Yeah, I don't think that they have.

Speaker 3 (42:10):
I will say, where there are marginalized creators on a
platform facing something that they don't like,
there's always really good pushback, and so scrolling the platform
after those hearings got to give it up to the
creators who are using their strengths creating good content to

(42:32):
you know, educate people on those hearings and sort of
what they thought about them. I've seen such hilarious TikToks
coming out of that. I saw one creator who was
pretending to be a lawmaker in Congress who was like,
I want to know, and my wife wants to know, why
every time I open TikTok, all I see is these
big-breasted women. That's making fun of, like, how silly

(42:53):
some of their questions were. So that's something I'll always
have faith in is marginalized creators being able to use
social media platforms to poke fun at legislators and to
spotlight and educate folks on something that they do not like.

Speaker 1 (43:07):
Oh, one hundred percent, the tiktoks coming out of this
have been phenomenal. I was like getting a lot of
comfort last night and just seeing the conversations that were
being had. Did you see any of the
Chew edits, the edits of the CEO? People were making
like spicy edits of him. "Stay TikTok." Those were great.

(43:30):
There are a lot of them. I just screenshotted one
this morning that was giving me life. Oh it was
the CapCut meme of Pedro Pascal eating a cracker sandwich.
And then it says me watching the CEO of TikTok

(43:53):
defend the American people and our rights from the literal
US government all while they mispronounced his name, have no
knowledge about how Wi fi works, and are saying they
are doing it for the children when there was yet
another school shooting that I didn't see on the news,
but I did see reported on TikTok.

Speaker 2 (44:10):
Doesn't that really say it all?

Speaker 1 (44:11):
That sums it up, doesn't it? So, like,
TikTok is reading this stuff for filth. But like the
rest of the internet, I don't know so much. Because
there was also that study that came out or I
don't know if it was a study, it was a report. Well,
I want to say, no, I don't even remember which
organization ran it, that found the biggest determining

(44:32):
factor in whether or not you support the TikTok ban is
whether or not you use TikTok.

Speaker 2 (44:36):
Yeah, so there you go.

Speaker 3 (44:37):
It's like people who don't who like don't have direct
familiarity with the issue, are the ones who are in
favor of an outright ban.

Speaker 1 (44:47):
Yeah, it is much easier
to convince people who are not even using the app
that the app is dangerous because they are not exposed
to it. They don't understand how it actually works. It's
much easier to fear monger about something that you're unfamiliar with.
It's much easier to hijack people's feelings of fear and distrust.
If they don't even know the other group that they're

(45:10):
dealing with. If they're completely unfamiliar with the topic,
then you can just push like fear mongering on them.
But the people on TikTok who have their own experience
with it, you know, can read that as bullshit. Yeah, I'm
totally in favor of some comprehensive, sweeping data privacy legislation

(45:30):
that would absolutely affect TikTok. Let's do that from a
place of reason and you know, national security of the
American people, Like, if we actually cared about that, let's
go write that legislation.

Speaker 3 (45:44):
In the end, TikTok and its part in our larger
digital media ecosystem is huge, so a potential ban would
have major consequences, not just for the millions of people
who use it, but for the way that information and
discourse is spread.

Speaker 1 (45:58):
Oh, this was cut from my Newsweek article, but
it was good. So, I'm a member of this coalition
of TikTokers, TikTok influencers. The coalition has over five

(46:19):
hundred members and collectively, they have over five hundred million followers,
and to put that in perspective, I believe that that
is more than the collective monthly viewership of, like, Fox,
CNN, and MSNBC, which is about five million. So not
just more, it's like one hundred times more than their

(46:40):
monthly viewership, and that's just like their following count. But
I thought that that was, you know, really
a good way to highlight just how
vast this infrastructure is, and that we're not talking about
cutting off like a couple million people or like, you know,
a handful of people. We're talking about dismantling a communications

(47:04):
infrastructure that is so vast that it is like literally
hundreds of times the reach of the biggest news networks
that we can think of. That is not a decision
that we should be taking lightly.

Speaker 2 (47:17):
I'm so glad you've added that.

Speaker 3 (47:19):
I do think that for people who don't use TikTok,
it's probably easy to think of it as
like that Arrested Development line: it's a
TikTok, Michael. How many users could there be?

Speaker 2 (47:30):
A hundred? Like, they might not necessarily know the gravity
of the way that it.

Speaker 3 (47:38):
Functions as a communications platform and its role in our
larger ecosystem of digital communications, which is vast.

Speaker 1 (47:45):
It's huge. It's huge, and people don't understand that it's
not a dancing app and that it is in fact
an information app, a discourse app, and that
there is like so much existing infrastructure which has been
built over the course of the last few years to
mobilize people, particularly for progressive causes, and if you take
that away, we're going to have to rebuild it, and

(48:06):
rebuilding takes time. Rebuilding is a way of keeping people
down right, Like, the reason why legacy news institutions are
so powerful is that they don't have to rebuild every
four to five years. It's like you build power by,
you know, having stability in your institution. And if you
take all of these young people who have finally found

(48:27):
a bit of power, and particularly marginalized young people who
have finally found a bit of power, and then you
force them to rebuild again, you're removing all of their
access to power. And like, sure they'll get it back
in a few years, but a lot can happen in
a few years.

Speaker 3 (48:43):
Yeah. And wouldn't it be cruel, and I guess it would
be, to force these folks to have to
rebuild for something that didn't even really, like, achieve much, right?
For something that, like...

Speaker 2 (48:56):
Didn't even really.

Speaker 3 (48:58):
Achieve much meaningful action in terms of protecting digital
national security.

Speaker 2 (49:03):
That, to me, is like the real cruelty
of it.

Speaker 1 (49:06):
I guess, yeah. I mean for it to be done
for such a pointless cause for which there is no
evidence of harm in the first place, is deeply
unsettling. To be, you know, really challenging our First

(49:27):
Amendment rights in the way that they are threatening to
with zero evidence. Again, we have more evidence against Facebook
than we do against TikTok, and to ban TikTok, I mean,
it's such a concerning choice for the First Amendment. And

(49:49):
like usually I'm on the misinfo side of like having
to be like, well, you know, free speech is not
free reach, but like right now it's like, yes, no,
free speech does matter, Like this is a First Amendment issue.

Speaker 2 (49:59):
Yeah, this is the government regulating a platform.

Speaker 3 (50:01):
How many times have you been in the comment section
being like, well, this isn't the government regulating something, it's
a private platform regulating something. So it's a little bit different.
Maybe don't invoke the First Amendment. This is actually the.

Speaker 2 (50:11):
Government regulating something.

Speaker 3 (50:14):
So, like, all of those free speech people, mobilize,
like, let's hear from you.

Speaker 1 (50:19):
I know, it's like so quiet from the free speech
absolutists right now. Like they cared a lot when TikTok
was censoring medical misinformation that TikTok had every right to
be removing. But if the government is actually
coming for a platform where one hundred and fifty

(50:40):
million people practice the utilization of their voices, that seems
like a free speech issue to me, that seems like
a challenge on our First Amendment rights. Really scary one too.
I was like, yeah, watching the hearing was quite scary

(51:00):
to me.

Speaker 2 (51:02):
What scared you about it?

Speaker 1 (51:04):
I think that there was this strange element of like, okay,
when I learned about the Red Scare in history class,
I didn't anticipate living through another one. That was like
weird to watch in real time. I was like, oh,
I thought that was like a historical event. I thought
we didn't do that again. So it had that element

(51:28):
to it where they were like asking the CEO of TikTok,
who I don't know his personal finances, but if he's
not a billionaire, he is a multimillionaire, just asking
him if he's a communist. It had a wild tone
to it. But on a much deeper level, the lengths

(51:57):
that they will go to to take away our free
speech in the name of just xenophobia, that was really
scary to me. I don't think that that symbolized like
massive steps towards a more progressive world. That felt like
taking steps backwards, towards, you know, stricter borders and more

(52:21):
nationalism and stricter laws, and just creating this divide between
the West and the East, and just trying to reimpose
this iron curtain of sorts, this digital iron curtain. Like
that felt really dark to me.

Speaker 3 (52:42):
I mean, when you put it that way, yeah,
I hadn't picked up on that vibe, but now that
you say it, it is scary. And I don't think
it's discourse that leads us to a more just, progressive world.

Speaker 2 (53:04):
It seems like it leads us someplace else.

Speaker 1 (53:07):
Yeah, it seems like it shuts us down and puts
us deeper into like isolationist thinking. And that's not the
type of thinking that I want to see our world practicing.
I would like to see, you know, the Internet was
always promised as this thing that would like bring global
citizens together. I don't believe that we are in any

(53:31):
sort of like technological utopia or that the Internet is perfect,
but I do see the internet's capacity for connection and
connection with people that you would otherwise like never have
spoken to or heard their stories. So earlier, you were
talking about like seeing women in Iran protesting and how

(53:53):
you feel like you wouldn't have been exposed to that
without TikTok. And I worry that if we continue down
this path of implementing these strict bans and, like, insisting
that we control all of the social media platforms, like,
it'll continue to cut us off. And that's not what

(54:15):
I want to see.

Speaker 3 (54:18):
Hear, fucking hear. Abby, thank you so much for being here.
This was informative. I'm really glad. Everyone should read
your Newsweek piece. We're gonna put it in the show notes.
But yeah, this was great.

Speaker 2 (54:28):
I feel like I learned a lot.

Speaker 1 (54:29):
I'm glad that you let me come. Right when I wrote
the piece, I was like, I know who I
need to send this to.

Speaker 2 (54:34):
Oh you have an open invite.

Speaker 3 (54:36):
Anytime you're like, I'm annoyed about this, you've got an open invite.

Speaker 1 (54:41):
Well, you know, I love coming on your show. It's
always fun. You were like the one of the first
ever podcasts that I did, and you were so nice
to me.

Speaker 2 (54:46):
Oh my god, I remember it very clearly.
You were such a pro.

Speaker 1 (54:50):
I had no idea what I was doing. I was
faking it all.

Speaker 2 (54:53):
Now you've got the mic, you've got the pop filter.
Look at you.

Speaker 1 (54:58):
Now I know what we're doing. When I first started,
I didn't know that I could, like, restart a sentence.
The first time I ever did a podcast, the pressure was
way higher. But now I know I can restart.

Speaker 2 (55:11):
Oh, I love a restart.

Speaker 3 (55:13):
I restart like I catch myself doing it in casual
conversation where I'm like, Nope, no one's gonna come in
and clean this up.

Speaker 1 (55:19):
But you know, I feel like there's no editor right now. Still,
sometimes it's good to just restart because you're like, you
know what, that sentence had a false start, a
faulty start. And it doesn't even matter if I'm being recorded,
it'll make more sense if you just let me scratch
it and start over again.

Speaker 3 (55:41):
If you're looking for ways to support the show, check
out our merch store at tangoti dot com slash store.
Got a story about an interesting thing in tech, or
just want to say hi? You can reach us at hello
at tangoti dot com. You can also find transcripts for
today's episode at tangoti dot com. There Are No Girls
on the Internet was created by me, Bridget Todd.
It's a production of iHeartRadio and Unbossed Creative, edited by Joey

(56:02):
Pat. Jonathan Strickland is our executive producer. Terry Harrison is
our producer and sound engineer. Michael Almada is our contributing producer.
I'm your host, Bridget Todd. If you want to help
us grow, rate and review us on Apple Podcasts. For
more podcasts from iHeartRadio, check out the iHeartRadio app, Apple Podcasts,
or wherever you get your podcasts.