Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
If you are against misinformation and hate speech, you know
you gotta be against it on all platforms, and not
just on TikTok. There are No Girls on the Internet.
As a production of iHeartRadio and Unbossed Creative, I'm Bridget
(00:23):
Todd and this is there are No Girls on the Internet.
Let's talk about TikTok, the super popular social media app
owned by ByteDance, a Chinese-based company. Now a
growing collection of elected officials want TikTok banned in the
United States because they say it represents a national security risk.
(00:45):
The Biden administration demanded that TikTok be sold or it
could face a ban in the US. Congress has also
rolled out a bipartisan bill allowing a nationwide TikTok ban,
called the Restrict Act, which would allow the Secretary of
Commerce to ban apps that pose a risk to US
national security. Last week, members of Congress held a hearing
where they grilled TikTok CEO Shou Zi Chew. Now, y'all know,
(01:09):
I love TikTok, but I also know that, like any
other social media app, it is not without its issues.
TikTok disinformation researcher and friend of the show, Abby Richards
was one of the first people calling out TikTok for
the way that mis- and disinformation and hate speech spread
on the app. But for all of her criticisms of TikTok,
Abby still says that banning the platform is not the
(01:30):
right move. In a recent op-ed for Newsweek that you
can read in the show notes, Abby explained why she
thinks banning TikTok will ultimately hurt marginalized communities. I caught
up with Abby right after we both watched the hearing.
Abby Richards, friend of the show, how are you? I
am angry, Bridget. I am an infuriated girl right now,
(01:56):
as someone who has really built a lot of your
career not on TikTok, but like calling out TikTok for
the ways that it can spread misinformation and hate speech?
What is it like to watch elected officials talk
about banning it? It's been infuriating because they're getting
it all wrong. It's like I have literally built my
(02:16):
career criticizing TikTok. I will be the first to tell
you about like the copious amounts of misinformation and hate speech, extremism,
harassment that happened on that app. But to then try
and ban it without any evidence that it is a
national security threat whatsoever, and just using like fearmongering and
(02:39):
pointing to other pieces of misinformation that I have tried
to debunk about TikTok, like that. And this is like
the types of misinformation that emerge off the platform. So
if you were watching the hearing yesterday, you would have
seen a lot of our congressmen and women pointing to
TikTok challenges that are causing children tons of harm. And
(03:02):
I have spent so many, like countless hours debunking these
exact like moral panics around these challenges, where essentially adults
try and blame the stupid things that kids do on
the newest technology. We see it over and over and
over again. But the fact that you know that's being
used to justify the banning of this app too is infuriating,
(03:28):
and I'm deeply concerned about what the consequences will
be of banning an app that one hundred and fifty
million Americans use. So you brought up something really interesting.
I know that these so-called, like, deadly challenges were
a big part of the hearings, and something that I mean,
I almost hate to ask this question, but I feel
like it's an important context of the conversation that is
(03:49):
like great news for Facebook. They're like, oh, our strategy
is winning, Like our strategy to turn the public and
of course elected officials against TikTok and make them think
that like it's this like place where all these deadly
challenges emerge, even though some at least some of those
challenges actually emerged on Facebook. If you really want to
get into the nitty gritty. So did you
(04:10):
see in the hearings the ways that, like, fear mongering
just sort of writ large about technology was taking center stage
in these hearings? It's almost like they're placing this fear
that they have of new and emerging technologies and how
young people use them and just placing it on one
singular app. So rather than actually dealing with the consequences
(04:32):
of this like rapidly changing society because of the expansion
of technology, like on a monthly basis, we just want
to like pin all of our anxiety on one app
and cross our fingers that like if we just ban
this one app, it'll all go away. And so I
think we see that a lot with the challenges and
like, the steam that they pick up among people
(04:56):
who are not on the platform. In particular, we know
that it was the Washington Post that reported that there
was like a consulting group, a marketing group, that was
working with Meta to place op-eds about, you know,
(05:16):
fearmongering about TikTok. So that's something that we have seen
happen before, and it certainly seems like this is a
big win for Meta. If you are on TikTok at all,
you are probably seeing this widespread popularity of almost conspiratorial
language where people are blaming the ban on Meta
(05:43):
and saying that, like, Meta is lobbying, and Google too
to some extent, blaming it on Meta and
on Google for lobbying Congresspeople and encouraging them to
ban their biggest competition. I don't know how much truth
there is to that, because I think that there is
a lot of geopolitics there too, and it seems like
the US government is you know, very interested in like
(06:04):
splitting the Internet as much as possible right now. So
it's it's hard to say if it's like one cause
versus another, but this does definitely look like a big
win for Meta, and they used rhetoric that Meta has
also pushed before. Yeah, so in terms of a TikTok ban,
(06:25):
So in your piece you write about how you know,
obviously you are the first one to admit that
TikTok has its downfalls and the democratization of like media
has its downfalls, but you also talk about the ways
that this could actually harm marginalized communities. How do you
see that working? So, okay, well, we'll start with what
you'd consider, like, institutional media, legacy media. So
(06:49):
New York Times, Washington Post, Wall Street Journal, CNN. They
are top down information systems, which means that these newsrooms
essentially look at all of the news and then decide
what is newsworthy and then disseminate it to their audience.
TikTok is much more of a bottom up information system,
(07:09):
which means that anyone can post basically anything that they want,
and then the algorithm is working with the users to
determine what people are watching, and in that way, the
users decide what is of importance to them. So we
have two different information systems, and this top down one
tends to privilege whiter, richer, more privileged people. It tends
(07:37):
to be composed of people who come from more privileged backgrounds.
It speaks about their problems and things that they are
interested in thinking about. They in particular like they have
to sell subscriptions or sell ads, so they have to
(07:57):
run stories that they know that the wealthy, the audience
that they pander to will be interested in the
first place, because it's really not that profitable to tell
stories of marginalized people who can't afford to subscribe to
their newspaper. So TikTok being this much more bottom up
infrastructure allowed for two major like kind of newish forms
(08:22):
of massive dissemination of news and information. The first would
be seeing information from firsthand accounts. So that's like when,
you know, at the start of the invasion of Ukraine, where
we saw all of these videos of people leaving the
country or being in a bomb shelter. So you're seeing people
who are actually experiencing the events posting online and that
(08:43):
garners attention without editors needing to push it out. And
then the second is through trusted messengers, and these are
people that can speak to specific communities, and
they've had a lot of success on TikTok. They
have built up these communities where they are trusted sources
(09:04):
of information and sometimes opinion. People go to them to
try and feel like they can understand certain problems. And
you know, this could range from a community of like
ten thousand people to communities of millions of people. But
this is particularly advantageous to marginalized groups who probably can't
afford a subscription to The New York Times, or alternatively,
(09:24):
they don't want one because they don't trust the New
York Times. They've never seen themselves represented in the New
York Times. Like, there are so many reasons why they
might not turn to legacy media, but they will turn
to, like, a trusted messenger who looks like them, who
speaks their language, who they feel understands them in their communities.
And so if we dismantle that infrastructure like suddenly, like
(09:46):
flipping off a light switch, you'd be leaving all of
these communities without their information infrastructure, and that could be
leaving them in the dark. It could be really really
damaging for them. And they're very resilient. All of these
marginalized communities, whether it's communities of color, queer communities, disabled communities,
they're very resilient, they will rebuild, but keeping them
(10:07):
in a state of, like, constantly rebuilding the information infrastructure
stops them from being able to do things like organize, mobilize,
and gain power, because they're constantly rebuilding their information infrastructure.
So yeah, I'm mad. I mean that's such a good point, Like,
Oh, where do I even start? I think that
(10:30):
one of the reasons why we're even having this conversation
is like a fundamental misunderstanding of the ways that platforms
like TikTok have worked and what they have been used to,
like what movements they've been used to build, right, Like
I still see people being like, oh, like it's a
kid's dancing app, which like part of my soul dies
every time I hear that, you know, looking around, Like
what was happening to women in Iran? I have to say,
(10:53):
I don't think I would have known what was happening.
I don't think I would have known what was going
on if not for TikTok, if not for Iranian women
saying hey, this is what's going on, here's how you
can help, here's what we need, I don't think I
would have been able to be plugged into what was
happening if not for TikTok, And so I think that
it's like the people who
are legislating this don't necessarily have that visibility into the
(11:15):
ways that it has functioned for communities who are marginalized.
And so if we're going to have any conversation
about just an outright ban, they're not even including that
in the conversation. And I feel like it kind of
like what you said, like, if we're going to do
something about how these apps and platforms function in our
lives and in our like media ecosystem, let's not have
(11:36):
it be done hastily and badly, right, where people
are going to have the rug pulled out
from under them and their very real issues are
not even being discussed or like you know, brought to
the table. Oh yeah, there's no nuance in this conversation whatsoever.
And you would think that if we're going to have
a conversation about pulling the plug on one of the
(11:57):
biggest pieces of communications infrastructure in the world, that is
a conversation that we should be very careful about. Dismantling
infrastructure that large is not a decision that should be
taken lightly. And yet it seems like our representatives like
(12:19):
truly have bought into this idea that TikTok is somehow
simultaneously just a kid's dancing app and also spyware. Like
it's really confusing how it can be both, and like
saying, like, oh, well, you could push, you know,
specific misinformation campaigns and try and sway
(12:40):
the American populace, but then also simultaneously ignoring the fact
that like it is a messaging center for a huge
portion of the American populace. If you just remove it,
there will be consequences. And then when I bring this up,
I see some people like, oh, well, you know, at
(13:01):
the like you know, national security is more important, or
it's like everyone will just go to a new app,
and like, yes, probably they will, but that doesn't mean
that it won't cause a lot of harm in the process.
It doesn't mean that people will you know, there's still
going to be lots of creators who lose their jobs.
(13:22):
There's going to be people who have built these large
audiences who now have nothing, and a lot of people
whose voices were empowered and suddenly were coming into a
position of discursive power, who will now not have access
(13:43):
to that? And even if like yes, technically, if you
look at like a very big picture level, people will
migrate to new apps, sure, but there's still going to
be a lot of consequences and we can't really ignore that. Yeah,
I mean, even looking at what's going on with Twitter.
(14:03):
There was a time where I was like, oh,
everyone's going to be using Mastodon or this or that.
It hasn't happened. And I know that our communities, particularly marginalized communities,
as you said, are really resilient, and we always find
a way. We always make a way out
of no way, right? Like, that is what we do.
But watching Twitter become a place where
(14:25):
people spend less and less of their time. There hasn't
been like one place where everybody is like setting up shop,
and so it takes longer, it's messier, it's more
energy for these communities, many of whom I look at,
the trans community and the queer community are up against
quite a bit right now. And so maybe having to
pick up and rebuild a new communications infrastructure elsewhere is
(14:47):
not what they need right now. It is
maybe not going to be, like, great for those communities.
It will be horrible for them because now, in addition
to legislation that is threatening their right to exist, they
also don't have their communities that they can turn to
for advice, for support, for information, and they don't have
(15:09):
access to their broader communications network to mobilize right and
and plan what they will do in response or try
and like have a place they can turn to to
raise funds if they need surgery, or if they need
legal fees, or if they need to get out of
whichever state they're in. If you remove the communications infrastructure,
(15:34):
and we're talking about a marginalized community that is undergoing
a lot of legislative challenges right now, that's putting them
in a position that just increases their marginalization. Let's take
(15:54):
a quick break and be right back. So let me zoom out
a little bit and just make sure that I have
this right. Lawmakers are saying that they want to ban
TikTok because it's a national security risk. This is a
(16:16):
little bit above my pay grade. I am no digital
national security expert, and I want to make sure that
my understanding of this aligns with your understanding. So it
is true that TikTok is a Chinese owned company, and
it is true that potentially that could open the door
for all kinds of national security risks. But what we
don't have right now is a smoking gun that this
(16:37):
is actually a problem. We're very much speaking in hypotheticals
of what could happen or what potentially might happen. So
lawmakers right now are trying to legislate TikTok because of
this hypothetical threat that might be out there, but it's
not yet revealed itself. Do I have that right? Yeah,
I would say you nailed it. Okay. They haven't provided
(16:59):
any evidence of instances where China used TikTok in any
sort of antagonistic way towards the US. It's only in
the hypothetical of like they own it and they maybe could,
and even if they could, it's kind of unclear how
they would go about doing that. What we do know
(17:24):
and, like, what feels like it's missing from this
conversation so much, is that even if we were to
ban TikTok, it would not stop the Chinese government or
any government or any bad actor from just buying your
data elsewhere. TikTok is roughly collecting about the same
amount of data as any of the other major social
(17:45):
media platforms, and then they profit by selling it. Well,
in the hearing he denied that they sell
data. So maybe they're not even selling the data.
But social media platforms in general collect your data and
sell it, and it is not difficult for anyone with
(18:06):
any sort of intention to just buy your data so
that they can target you with specific advertisements and specific messages.
So if we were actually concerned about the Chinese government
or any other government manipulating Americans, you would be implementing
like a comprehensive data privacy legislation rather than going after
(18:30):
one singular app. Because even if you were to ban TikTok,
and even if TikTok were using its data in that way,
which we still have no evidence that it does, they
would still just be able to purchase your data anywhere else.
So this is like applying a used, gross, infected band-aid
to a wound that will really not fix the
(18:53):
wound at all, but will almost certainly add additional infections.
Oh my god, I was just about to read your
quote in the newsweek piece. Back to you, but yeah,
I mean this is this is what really gets me
about this conversation is that, like, I guess, we have
no meaningful data privacy legislation in this country, right, and
(19:15):
so like that, the fact that we're even having this
conversation indicates to me a complete failure of leadership, governing
and legislating, right, And so I feel like banning TikTok
is this big flashy gesture of like, see we did something,
See we got it together and governed and made some
(19:36):
kind of a gesture toward you know, data privacy, but
it actually does nothing. As you said, like there's nothing
keeping these companies or really anybody from buying our data,
Like, there are so many others. The fact that our
data is so misused, like, that is the baseline.
Shouldn't we be talking about that? So even if we
were to ban TikTok, it would be totally possible for
(19:59):
the Chinese government to access Americans' data in all
kinds of other ways, and they maybe already have. And so
it's like, what does it really do if the real
problem is our lack of data privacy infrastructure in this country,
other than being a big flashy gesture, what does banning
TikTok actually achieve? I mean, to answer your question, nothing.
On a political level, I think that it is a
(20:25):
distraction that allows for the American public to feel like
something was done to make them feel safer, while simultaneously
not passing the legislation that would actually protect the American public,
And like, we are a superpower and we demand that
all of the major social media platforms be stemming from us.
(20:48):
How dare the Chinese also have one? Yeah? I saw
this great USA Today headline that was like,
how dare China try to get my data from TikTok?
Don't they know that's for Google and Facebook? Like,
I think we have just,
like, lost the plot of doing anything to meaningfully address this.
(21:10):
And yeah, I think like during the hearing, some of
the things that came up around like content moderation and stuff,
those are important conversations to have, but having them having
TikTok be the face of, you know, a lack of
content moderation or hate speech and harmful content proliferating on
a platform. As someone who has worked in this
(21:32):
space for a while, I'm deeply uncomfortable with that because
I think it creates a scapegoat that lets these other
I would say, sometimes bad actors off the hook, right, Like,
we're not having the conversation in a meaningful
way to actually get results. We're just making an example
out of this one big bad boogeyman, which is TikTok. Yeah, definitely,
And I mean the examples that they were pointing to
(21:54):
aside from these like moral panic challenges, the examples of
violent content on the app were really wild that they
were pointing to it in the first place. Because as
far as removing you know, really violent content, TikTok is
better than most of the other platforms. I mean, I
(22:17):
professionally criticized them. It's one of my favorite things to
do is bullying TikTok. But they will generally be pretty
good about removing anything that I flagged towards them. And
they have a lot of like horrible things on their apps,
like absolutely, but all social media platforms do. And so
what you really want to be looking at, too, is
(22:38):
just how much reach are those things getting. So one
of the videos that they pulled up during the hearing
was imagery of a gun, and I think it
included one of the committee members' names, and
they were railing on TikTok because the
video had been up for, I believe, forty-one days,
and what they were failing to account for was that the
(23:00):
video had, like, no likes. It was not a video
that had been engaged with whatsoever. So arguably it's actually
evidence that TikTok was doing a decent job with
its moderation, but that it had like siphoned this off
and contained it more than anything else. Like, yes, it
shouldn't be on the platform at all, but if it's
(23:21):
not getting any reach, really, then TikTok's moderation to an
extent is working. That's the whole point. It's like, you
want these things contained, you want them
quarantined so that they don't spread. So being able to
point towards that was like, it was such a strange choice.
And then it was additionally strange because we love guns
(23:43):
in this country but apparently have problems with images of them.
If you watch the hearing, there were actually some pretty
good questions that came up. Representative Yvette D. Clarke asked
whether or not TikTok suppresses marginalized creators, and Representative Lisa
Blunt Rochester asked about whether accurate information about abortion is
being suppressed on TikTok. But what was much more abundant, sadly,
(24:06):
were questions that weren't just silly, but also underscored how
little some elected officials seem to know about the technology
that they are responsible for legislating. Yeah, I mean that
example really brings me to something that like it pains me,
but I can't not talk about it with you. How
some of the questions and examples that these elected officials
(24:29):
were raising, were just embarrassing. Like, you know, Representative Richard
Hudson from North Carolina, like, I'm sure you know where
I'm going, what I'm getting at: the, like, exchange
about whether or not TikTok accesses users' home WiFi. Like,
do you ever feel like
the people that we have elected to represent us and
(24:49):
to advocate for us and our interests up against
big tech just are making it clear that they are
talking about legislating technology that they seem to not really
understand or get. No, it was quite clear that no
one on that committee had ever opened TikTok like they
had never used it before. They're just making wild statements
(25:12):
and accusations and guesses about how it works. And if
you go on TikTok now in response to that committee,
I mean, they're just being ridiculed because the users can
tell that these congressmen and women have never really actually
used the app or know what they're talking about in
any way, shape or form. And it is mind bogglingly
(25:37):
infuriating to witness people who don't even understand how Wi
Fi works, or like couldn't tell you the difference between
a modem and a router trying to legislate major
social media policies and major data privacy policies, like,
they don't understand any of it, So how are we
(25:58):
expecting them to even know where to begin when it comes
to those bigger legislative problems. It really goes back to
what you said in the beginning of our conversation of like,
if we're gonna do it, let's not do it badly.
Let's not do it in a way that like we
are talking about a pretty major communications platform. If we're
(26:19):
going to do it, let's not have the people in
charge make it clear time and time again that they
have no idea what they're talking about. Maybe just maybe,
just maybe that could maybe be a good place to start.
Is like knowing that you need to use Wi Fi
to access TikTok. Like, oh my god, I really liked
the exchange about whether or not TikTok tracked how much
(26:45):
your pupils dilated in response to certain videos. And it's like, no,
they look at watch time. They're not like reading your
pupils to see how big they get in response to
each video. I don't know, what Black Mirror episode do you
think we are in? We're in a different one. We're
still in one, but it's different, just a different episode. Yeah,
(27:08):
I don't think that TikTok is, like,
logging your biometrics, dude. It's just,
some of those questions were concerning. And I also
think they just like reveal how we just need younger people.
I think, like, yeah, like younger folks,
(27:28):
both in elected office and also being listened to. I
think is really really important people who actually have real
world experience with the technology that they're talking about. Because
it was, as you said, it was just so clear
that many of the folks in Congress just had
no idea what they were talking about. Yeah. No, we
would absolutely need to have younger people or just more
(27:50):
you know, digitally literate people writing these laws, because if
you truly don't understand that, like the app needs to
track where your eyes are so that they can put
sunglass filters on you, like, that is just something
that a younger person would need to explain and
need to help them understand. And if all of our
(28:13):
legislators are you know, I don't want to say dinosaurs,
but like they're acting like dinosaurs right now, and I
feel like it's called for, then we can't expect
solid digital legislation from them. Like they don't they don't
know what they're doing. They don't understand anything beyond email
at this point. Yeah, that is sad. I mean it's
(28:36):
really sad. It's really scary because if you want to
talk about like the American people being vulnerable, our leaders
not even understanding like the you know, bare minimum that
they need to understand about how the internet works like
that leaves us quite vulnerable. That's such a good point.
(28:57):
That seems like a much bigger threat our leaders failing
to understand anything about technology feels like a much bigger
threat than this TikTok boogeyman scare. So what do you
say to someone listening who was, like, I am concerned
about the national security threat that TikTok poses. Still, what
(29:18):
would you say to that person? I would say that
at this point in time, we have seen more evidence
of Facebook engaging in schemes to manipulate the American public
than we have TikTok. I mean, we have evidence of
Cambridge Analytica, we know that Meta was involved with this,
we have no evidence of TikTok doing it. So at
(29:39):
the moment, all of our fears are based around a hypothetical,
and that they are based around this like boogeyman of China,
and that that might not be the best place to
create legislation from, Like that mentality might not lend itself
(30:01):
to the creation of like really strong digital privacy legislation,
And that I mean also, if you are personally concerned
about TikTok, like, there's no reason why you have
to have it on your phones. That's a decision that
you can totally make. If you have
national secrets on your phones, like maybe don't have TikTok
(30:24):
on it, But then also like maybe don't use any
other social media platforms because they are all gathering your data.
More after a quick break, let's get right back into it.
(30:49):
Other platforms are also gathering your data and behaving badly
and acting in ways that cause all kinds of harm.
But this legislation is very much black and white, and
it seems very much based in scaremongering about TikTok
as this potential boogeyman. And I just don't think that
fear mongering is a place that will elicit good, meaningful,
substantive legislation. You know, black and white thinking on a
(31:13):
nuanced issue is bad enough, but black and white legislating,
I don't know. I don't like it. I don't like
it either. I don't trust it. It's not coming from
a smart place. It's coming from a fearful place. And
that's not where we get good legislation. And if you
want to have a much bigger conversation about social media
and the future of social media and whether it's bad
(31:33):
for us, whether it's good for us, what harms it does,
but also, you know, what net positives it brings to
the world. I am here for that conversation. It'll take
us a few years to get to the bottom of it.
It is not like a quick and easy conversation to have.
But I do think that that's one we should be having,
and I've been trying to push for it. It's like, Okay, well,
(31:55):
there are all these issues with like misinformation and extremism
on social media, and we know that these platforms benefit
from the posting and sharing of content that
appeals to our emotions, that is often false, or is
oversimplifying really complex issues, or is just, like, extremist and
(32:15):
hateful in nature. We know that they benefit from that
because that's the sort of content that keeps people on
their platforms, consuming content and therefore consuming ads. And I
would like us to be working towards the creation of
like digital communities that are better for our emotional and
(32:38):
social well-being. I could absolutely see that occurring in
the future, like creating these digital public spaces where
people can go share information, but that aren't driven exclusively
by profit and are built with the goal of creating
a more well informed and empathetic public. But that doesn't
(33:05):
happen just because you ban one app you don't like.
Like that is deep, deep infrastructural change that we need
to make and I think is the only way really
forward unless we want to drive ourselves insane. But that's
not a light and easy task. It's certainly not going
to be achieved by banning one app. Yeah, if
(33:27):
you are against misinformation and hate speech, you know you
gotta be against it on all platforms, and not just
on TikTok, because Facebook, Instagram, YouTube, Twitter, they all have it.
I mean, Twitter right now is also such a mess
just, like, it is wild to watch all
(33:48):
of this, the giant attack on TikTok. Meanwhile, Twitter
is just falling apart, crumbling and has disintegrated as like
this reasonable public space. And the fact that, you know,
we thought we would be losing TikTok and essentially be
(34:11):
losing Twitter. That would really leave Meta with a lot
of power. But I also think it's interesting that like
one rich guy can just buy and ruin Twitter, but
like, god forbid, an app has some Chinese ownership. If
you had to say, do you think that TikTok
(34:34):
is going to be banned in the United States? I'm
pretty fifty fifty right now. I don't like it. If
you would ask me before, I'd be like, probably not
because I thought that the Dems were smarter than that.
I don't, I know, I know. In hindsight,
(34:54):
what was I thinking? They love losing, um. But I
just I was baffled that the Biden administration pushed for
the sale versus ban because I thought that they knew
that if they lost TikTok, they would not be able
to mobilize voters in twenty twenty four. But apparently they
(35:15):
don't know that and they just are down for political suicide.
I really it baffles me. So I continue to be
wrong and surprised by what they will do in the
name of like this weird singular bipartisanship over xenophobia. It's
(35:37):
the one thing that they can relate to the Republicans
on right now. Also, you know, you made the point
about mobilizing younger voters. I said this on a panel
and it got like a laugh and I was like, oh,
that was actually serious, I think. And it might
sound — it's gonna sound how it sounds, but I truly
(35:58):
believe this. If TikTok is banned in the
United States, I see that going very poorly for Biden
and the Democrats. Not to make this like a horse
race issue, but I think that some of the most interesting,
meaningful organizing of young folks politically is happening on TikTok.
(36:18):
If you look at the conversations that younger folks are
leading on when it comes to things like climate justice,
those conversations are happening on TikTok, and they are mobilization conversations.
So I don't I'm not saying that I think that
young people will be like, oh, we love TikTok and
we're not going to vote for Joe Biden just because
he banned it and we love it. I think it's
much deeper than that. I think that gen z uses
(36:40):
it as a platform to do political and social mobilization
and organizing. If you show that you not just
don't understand that, but don't see it, don't value it, ignore it,
that is going to be a problem. And so it's
not an issue of like, oh, these whiny brats just
want their dance videos. No, no, no no, no. I think
it's much deeper than that. And if you show that
you aren't even able to see that it is going
(37:02):
to be a problem one hundred percent. And people keep
playing this off as like, oh, you're going to piss
off gen Z, and it's like, that's less what
I'm worried about. I think a lot of gen Z
would still you know, the ones who are going to
vote anyways, will still vote for the lesser of two evils,
Like that's going to continue.
(37:22):
But what you will lose is all of this mobilization
infrastructure where people are getting their news, where people are
having, like, lots of political discourse. What is helping young
people to understand the world is the discourse on TikTok,
the trusted messengers that they turn to on TikTok.
(37:42):
And if you take that away, like, they're just
going to not be as politically engaged, because TikTok made
it really easy for them to get politically engaged and
get politically informed and deep dive into this discourse. Like,
you know, people keep saying TikTok is this dancing
app, and it's so, so wrong, because like, fundamentally, it's
a discourse app. It's where people are having conversations
(38:06):
about the world. That's why like stitching was such a
groundbreaking feature, because you could take one video and respond
to it and you get this like built and constructed,
constant discourse about political and social issues. If you remove that,
you know, they might still be left leaning and like
(38:27):
some of them might vote, but they won't be as
informed on certain issues. They won't necessarily know
where to be putting their energy for
certain political issues. Like, they would just be losing so
much of their infrastructure, and that will have severe costs,
particularly for the Dems. And it's really mind-blowing
to me that they don't recognize that. Yeah, I'm just
(38:52):
I'm completely shocked. I shouldn't be. I shouldn't be. I'm
years into this, I shouldn't be. Well, here's — I mean,
big question: where will Abby Richards go if there's no TikTok?
Like, I think of you as the person who has,
like, made a platform — like one of the first people
who really made a platform from calling out and criticizing
(39:13):
TikTok and advocating for it to be better. Where do
you personally see yourself showing up if it's banned? Here's
the thing. Like, on one level, I think I
experienced some panic where it's like, if I lose my
platform on TikTok, do I lose my power? Like, you know,
will I still have work to do? Will people still
see me as valuable? And then I, after you know,
(39:34):
a little bit of dwelling in that panic, I got
over it because it's like, no, I'm privileged. I have
a lot of like institutional power at this point, like
I can pivot to other platforms if I want. And fundamentally,
I think that the deeper we go into a tech dystopia,
the more job security I actually have, which is dark.
But I think — that was some dark, Abby. That was
(39:57):
some dark shit. But I mean, you're right, right? Like,
you call out TikTok, but really you're calling out,
like, our media ecosystem, the way that it impacts our democracy.
Like, it's so much more than one platform. Your voice
is about so much more than one platform — I guess
I'll put it that way. But I agree
with you that, like, you know, when I started this
(40:18):
tech podcast, I never knew — like, I remember thinking,
oh, maybe I'll — what if I run out of content?
What if I run out of, like, things to talk about? No,
I sure never will, never ever will. I mean my
goal is to eventually put myself out of a job.
I would really like if I could work hard enough
and eradicate all of the misinformation and hate speech online
(40:39):
and then one day I get to retire and I
don't need to do this job anymore. I don't see
that happening. So I think, you know, if TikTok is banned,
I will go to other platforms. I will build out
my presence on other platforms too. I'm far more concerned about,
like the people who don't have the institutional backing and
(41:00):
who don't have power outside of their TikTok presence, Like
they will have their lives much more uprooted than I
think I will. We are in such uncharted waters. Yeah, no,
truly, what happens to the creators? Because Congress — I
saw one congressman, like, went on Politico
(41:21):
with like, no, they'll be able to switch to a
different platform, as if they haven't spent the last, like, two,
three, four years building a platform so now they can
do brand deals or now they can make money from
the creator fund. And now, oh no, they can just
start over, it'll be fine, who cares? Like, it's —
(41:42):
do you know how long that takes? Yeah, it's —
but again, it's showing that the people who are
meant to be legislating this don't have a
realistic understanding of, like, what they're actually talking about.
And when they make suggestions like that, it's like, have
you taken a step back to ask what
that actually looks like in practice? Yeah, I don't think
that they have. I will say, where there are marginalized
(42:06):
creators on a platform facing something that
they don't like, there's always really good pushback, and so
scrolling the platform after those hearings got to give it
up to the creators who are using their strengths creating
good content to educate people on those hearings and sort
(42:27):
of what they thought about them. I've seen such hilarious
TikTok's coming out of that. I saw one creator who
was pretending to be a lawmaker in Congress. It was like,
I want to know, and my wife wants to know, why
every time I open TikTok, all I see is these
big-breasted women. That's making fun of, like, how silly
some of their questions were. So that's something I'll always
(42:49):
have faith in is marginalized creators being able to use
social media platforms to poke fun at legislators and to
spotlight and educate folks on something that they do not like. Oh,
one hundred percent, the TikTok's coming out of this have
been phenomenal. I was like getting a lot of comfort
last night and just seeing the conversations that were being had.
(43:11):
Even — did you see any of the edits? The
edits of the CEO people were making, like, spicy
edits of him, save TikTok? Uh, those were great. There
are a lot of, um — I want to see if
I can pull one up. I know I just screenshotted one this
morning that was giving me life. Oh it was the
(43:36):
CapCut meme of Pedro Pascal eating a cracker sandwich,
and then it says, me watching the CEO of TikTok
defend the American people and our rights from the literal
US government, all while they mispronounce his name, have no
knowledge about how Wi-Fi works, and are saying they
are doing it for the children, when there was yet
(43:57):
another school shooting that I didn't see on the news
but instead saw reported on TikTok. Doesn't that really
say it all? That sums it up, doesn't it?
So, like, TikTok is reading this stuff for filth. But
like the rest of the internet, I don't know so much,
because there was also that study that came out — or,
I don't know if it was a study, it was a report.
(44:18):
Well, I want to say — no, I don't even remember
which organization ran it. The biggest determining
factor in whether or not you support the TikTok ban
is whether or not you use TikTok. Yeah, so there
you go. It's like people who don't who like don't
have direct familiarity with the issue, are the ones who
are in favor of an outright ban. Yeah, they're much —
(44:41):
it is much easier to convince people
who are not even using the app that the app
is dangerous, because they are not exposed to it.
They don't understand how it actually works. It's much easier
to fearmonger about something that you're unfamiliar with. It's much
easier to hijack people's feelings of fear and distrust if they
don't even know the other group that they're dealing with.
(45:03):
If they're completely unfamiliar with the topic, then you
can just push, like, fearmongering on them. But the people
on TikTok, who have their own experience with it,
you know, can read that as bullshit. Yeah. I'm totally in
favor of some comprehensive, sweeping data privacy legislation that would
absolutely affect TikTok. Let's do that from a place of
(45:27):
reason and, you know, the national security of the American people. Like,
if we actually cared about that, let's go write that legislation.
In the end, TikTok and its part in our larger
digital media ecosystem is huge. So a potential ban would
have major consequences, not just for the millions of people
who use it, but for the way that information and
(45:49):
discourse is spread. Oh, this was cut from my Newsweek article,
but it was good. So, like, I'm a member
of this coalition of TikTokers, TikTok influencers. The coalition
(46:11):
has over five hundred members, and collectively they have over
five hundred million followers. To put that in perspective,
I believe that is more than the collective monthly
viewership of, like, Fox, CNN, and MSNBC, which is
about five million. So not just more, it's like one
hundred times more than their monthly viewership, and that's just
(46:34):
like their following count. But I thought that was,
you know — really, I thought that was a good
way to highlight just how vast this infrastructure is,
and that we're not talking about cutting off, like, a
couple million people or, like, you know, a handful of people.
Like, we're talking about dismantling a communications infrastructure that
(46:58):
is so vast that it is like literally hundreds of
times the reach of the biggest news networks that we
can think of. That is not a decision that we
should be taking lightly. I'm so glad you added that.
I do think that for people who don't use TikTok,
it's probably easy to think of it as,
like — it's like that Arrested Development line:
(47:20):
it's a TikTok, Michael, how many users could there be,
a hundred? Like, they might not necessarily know the gravity
of the way that it functions as a
communications platform and its role in our larger ecosystem of
digital communications, which is vast. It's huge, it's huge, and
people don't understand that it's not a dancing app,
(47:42):
and that it is in fact an information app,
a discourse app, and that there is, like, so
much existing infrastructure which has been built over the course
of the last few years to mobilize people, particularly for
progressive causes, and if you take that away, we're going
to have to rebuild it. Rebuilding takes time. Rebuilding is
(48:02):
a way of keeping people down, right? Like, the reason
why legacy news institutions are so powerful is that they
don't have to rebuild every four to five years. It's
like you build power by you know, having stability in
your institution. And if you take all of these young
people who have finally found a bit of power, and
particularly marginalized young people who have finally found a bit
(48:24):
of power, and then you force them to rebuild again,
you're removing all of their access to power. And like,
sure they'll get it back in a few years, but
a lot can happen in a few years. Yeah. And
wouldn't it be — I guess it would be cruel to
force these folks to have to rebuild for
(48:45):
something that didn't even really achieve much, right? For something
that, like, didn't even really achieve much meaningful action in
terms of protecting digital national security. That, to me,
is like the real cruelty of it, I guess. Yeah.
I mean for it to be done for such a
(49:06):
pointless cause, for which there is no evidence of harm
in the first place, is deeply unsettling — to be, like,
you know, really challenging our First Amendment rights in the
way that they are threatening to, with zero evidence. Again,
(49:28):
we have more evidence against Facebook than we do against TikTok,
and to ban TikTok — I mean, it's such a
concerning choice for the First Amendment. And like, usually I'm
on the misinfo side of, like, having to be like, well,
you know, free speech is not free reach, but like
(49:48):
right now it's like, yes, no, free speech does matter,
and like this is a First Amendment issue. Yeah, this
is the government regulating. Like, how many times have
you actually been in the comments being like, well, this
isn't the government regulating something, it's a private platform regulating something,
so it's a little bit different, maybe don't invoke the
First Amendment? This is actually the government regulating something. So,
like, all those free speech people, like, mobilize —
(50:11):
let's hear from you. I know, it's like so quiet
from the free speech absolutists right now. Like, they
cared a lot when TikTok was censoring medical misinformation that
TikTok had every right to be removing. But
if the government is actually coming for a platform where
(50:32):
one hundred million, one hundred and fifty million people practice
the utilization of their voices, that seems like a free
speech issue to me, that seems like a challenge on
our First Amendment rights. A really scary one too. Yeah,
watching the hearing was quite scary to me. What scared
(50:56):
you about it? I think that there was this strange
element of, like, okay — when I learned
about the Red Scare in history class, I didn't anticipate
living through another one. That was, like, weird to watch
in real time. I was like, oh, I thought that
was, like, a historical event. I thought we didn't do
(51:18):
that again. So it had that element to it where
they were like asking the CEO of TikTok, who I
don't know his personal finances, but if he's not a billionaire,
he is a multimillionaire. Just asking him if he's a
communist had a wild tone to it, but on a
(51:43):
much deeper level, the lengths that they will go to
to take away our free speech in the name of
just xenophobia — that was really scary to me. I don't
think that that symbolized, like, massive steps towards a more
(52:07):
progressive world. That felt like taking steps backwards towards, you know,
stricter borders and more nationalism and stricter laws, and just
creating this divide between the West and the East, and
just trying to reimpose this iron curtain of sorts, this
(52:30):
digital iron curtain, Like that felt really dark to me.
I mean, when you put it that way, yeah, it's
I hadn't picked up on that vibe, but now that
you say it, it is scary, and I don't think —
I don't think it's discourse that
(52:53):
leads us to a more just, progressive world. It seems
like it leads us someplace else. Yeah, it seems like
it shuts us down and puts us deeper into like
isolationist thinking, and that's not the type of thinking that
I want to see our world practicing. I would like
(53:14):
to see — you know, the Internet was always promised as
this thing that would like bring global citizens together. I
don't believe that we are in any sort of like
technological utopia or that the Internet is perfect, but I
do see the internet's capacity for connection and connection with
people that you would otherwise never have spoken to or
(53:38):
heard their stories. So earlier, you were talking about like
seeing women in Iran protesting and how you feel like
you wouldn't have been exposed to that without TikTok. And
I worry that if we continue down this path of
implementing these strict bans and, like, insisting that we control
(54:01):
all of the social media platforms that like, it'll continue
to cut us off. And that's not what I want
to see. Hear, fucking hear. Abby, thank you so much
for being here. This was informative. I'm really glad — everyone
should read your Newsweek piece. We're gonna put it
in the show notes. But yeah, this was great. I
feel like I learned a lot. I'm glad that you
(54:23):
let me come rant. I wrote the piece and I
was like, I know who I need to send this to. Oh,
you have an open invite. Anytime you're like, I'm annoyed
about this, you have an open invite. Well, you know, I
love coming on your show. It's always fun. You're like
one of the first ever podcasts that I did,
and you were so nice to me. Oh my god,
I remember it very clearly. You were such a pro.
I had no idea what I was doing. It was
(54:44):
all — I was faking it. Now you've got the mic,
you've got the pop filter. Look at you. Now I
know what we're doing. I know that I can — like,
when I first started, I didn't know that I could,
like, restart a sentence. The first time
I ever did a podcast, the pressure was way higher.
But now I know I can. I love a restart.
(55:06):
I restart — like, I catch myself doing it in casual
conversation where I'm like, nope, no one's gonna come in
and clean this up. But, you know, I feel like
there's no editor right now. Still, sometimes it's good to
just restart, because you're like, you know what, that sentence
had a false — it was a faulty start, and it
doesn't even matter if it's being recorded, it will make
(55:27):
more sense if you just let me scratch it and
start over again. If you're looking for ways to support
the show, check out our merch store at tangodi dot com
slash Store. Got a story about an interesting thing in tech,
or just want to say hi, You can reach us
at Hello at tangodi dot com. You can also find
transcripts for today's episode at tangodi dot com. There Are
(55:49):
No Girls on the Internet was created by me, Bridget Todd.
It's a production of iHeartRadio and Unbossed Creative, edited by
Joey Pat. Jonathan Strickland is our executive producer. Tari Harrison
is our producer and sound engineer. Michael Amato is our
contributing producer. I'm your host, Bridget Todd. If you want
to help us grow, rate and review us on Apple Podcasts.
For more podcasts from iHeartRadio, check out the iHeartRadio app,
(56:11):
Apple Podcasts, or wherever you get your podcasts.