Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Hey, this is Annie and Samantha, and welcome to Stuff Mom Never Told You, a production of iHeartRadio. And today we are thrilled once again to be joined by the delightful, the winsome, the dessert expert, maybe. We were talking off
(00:28):
mic about our favorite and least favorite desserts. Yeah, very
strongly about meringue. Oh my goodness. Some of us feel
very, very, very strongly about their dislike of meringue. Not pointing
to myself or anything. I love it. I love these
strong favorite opinions. I still maintain we should have like
(00:49):
a mini podcast where we just talk about desserts. I figure the three of us can each bring desserts and then, like, somehow try to share each other's favorite desserts with each other. That would be the way to go. Somehow we, like, post it or something, I don't know. Okay, we could make that happen. I think we can make that. The funniest thing is I'm not a huge dessert person, so I'd be the one to co-host a food podcast.
(01:13):
How are you not into desserts? They're fine, I'm just... it's not the thing that I go for a lot. Okay, savory thing, right. Also, you'll, you'll try almost anything even if you're allergic to it. So therefore, I think she may be, like, the expert, because she's willing to do all the things just to try, and she will never let anything go to waste. So I feel like she's the better one. Yeah,
(01:36):
even with mint, which I don't like. Do you not like mint? I hate it. I don't like mint in desserts. I'm learning a lot about your taste. No, I love it. I'm just allergic to it. I'm intolerant to it. Yeah, what do you do for toothpaste? I suffer. It's miserable. There
(02:00):
are toothpastes... I could solve this problem. This is an ongoing issue with me. I could solve this, and I continue to not do it because she refuses to buy something new. I'm going to send you some cinnamon flavored toothpaste. That's what I... I don't like mint, I love cinnamon. See, this is gonna be an easy, easy fix. Such an easy fix. It's such a solvable problem. Well, you are back in Washington,
(02:25):
D.C., Bridget, and happy belated birthday. My birthday, Pi Day, which I feel like, as a geeky nerdy person, of course it is, three fourteen. Did you have any pie? Did you do anything fun? I didn't really do anything fun. I had just got back from my trip to Mexico City, and so I was in that weird time
(02:47):
where you're like, I've just been out of the country for a long time. It's like, this is not my beautiful house, this is not my beautiful life. Like, I was really having a weird re-entry back to my regular life. So my birthday was spent just, like, on the couch watching television, which actually was good.
Was fine, yeah, yeah, lower key. Yes. Well, Samantha and
(03:13):
I are very, very grateful you're here to talk about this today, because we were talking off mic before you came on. We don't really know what's going on here, and, uh, it's confusing, but it's in the news, it's everywhere. Everyone's talking about it. It's true. So I'll do my best. So I should say right up front, there are parts of this conversation that I'm not the expert on,
(03:34):
but I will do my best to break it down. And that is the congressional hearings and conversations around banning TikTok, which I'm sure folks have been hearing about, even if you're not a TikTok user. You've probably cursorily seen, like, Congresspeople asking goofy questions to TikTok's CEO. So yeah, I wanted to break it down to the best of my ability. Thank goodness, yes, because it is a lot.
(03:58):
It is a lot. It is confusing, and I was
trying to think about this. It feels to me very new,
in that it's not that we haven't had conversations like this before, but to have a serious, like, congressional thing about, like, let's just get rid of this thing right
in this country feels unique to me. I don't know
if that's entirely true, but that's how it feels to me. Yeah,
(04:20):
I'm struggling to think of another social media app where we have had conversations that rise to, like, the president, you know, and Congress, about just an outright ban. I don't know, maybe I'm wrong, if there is something. So if someone is, like, driving right now and they're, like, screaming an example at the top of their lungs, please
(04:42):
let me know. But I can't think of a time
where that has happened. And I feel like this is
different because it's happened so quickly. I feel like TikTok
really rose in popularity in the last few years, and
I think going from it not really being a thing,
to it being a thing that people were like, oh,
it's just an app for kids, blah blah blah, to
it being ubiquitous, to now the conversation being about it
(05:04):
being banned, all happening within the span of a few years.
That feels different to me as well, right, I guess
I have a lot of questions just because of the
So we know that Zuckerberg had to go and testify
in front of the courts in Congress as well because
of disinformation and misinformation of being allowed to be posted
onto Facebook, but there was no real conversation about it
(05:25):
going away or him doing anything wrong other than do
you take responsibility? How are you going to change to this,
which ended up being nothing. They're like, yeah, his fine,
it's not his first amendment whatever. And then having when
Trump was president, him coming in and threatening and banning TikTok,
saying that TikTok was a spy for China, which was
all about the xenophobia which happens with COVID and everything
(05:48):
else has happened in the last five seven years, so like,
this has been a bigger job, and I guess which
was really interesting. And what I don't quite understand, I
don't understand at all is who's in the right and
who's in the wrong. That's a great question. That is
not a question that I can answer definitively. I'll try
(06:08):
to break it down and then I'll give some of
my like lingering thoughts towards the end. But yeah, I
should say that this is a conversation that many people
that I trust and respect have different takes on. I
did an interview on my podcast, There Are No Girls on the Internet, with this TikTok disinformation researcher, Abbie Richards,
(06:29):
who I'll refer to later in the episode today, but she is a fervent TikTok user. Her academic and professional work happens on TikTok, where she was one of the first people to call TikTok out for the way that it can spread hate speech, conspiracy theories, mis- and disinformation. But she believes that a TikTok ban would harm marginalized
(06:51):
people the most. And so there are people like that
who were like, absolutely no, do not ban it. And
then there are other people who I also respect and
trust in the space who are like, oh, well, banning TikTok
might actually be a good thing, because you know, TikTok
is the most relevant social media platform out there right now.
Twitter is really in this like weird, nebulous space. If
(07:13):
Twitter is in this weird space and TikTok goes down,
we might actually have a chance to sort of restructure
our social media ecosystem without the current giant that is TikTok.
I don't know if I agree with that, but I
guess what I'm saying all of this is to say
that I don't have the definitive answer. But I first
(07:34):
started researching this topic, I was like, I don't really
know where I stand. Having done that research, I think
I am coming down in favor of the idea that
banning TikTok is not the best solution. But again, that's where I am. That is just mine. That's just one person's take.
I've seen many other interesting takes that I'm like, oh,
(07:55):
that makes sense to me. I respect that, Does that
make sense? I feel like I'm rambling. No, no, no,
I mean, I feel like, just, the whole thing of TikTok is kind of trying to reason out what is the most plausible decision or what's the most responsible, because there is no good or bad in this.
It's all kind of like, oh, this could be bad,
this also could be bad, but also this could be good.
(08:15):
So it seems like that's that kind of conversation in itself. Again,
I'm a TikTok fan, and I do love what I
get to see, but again, yeah, how does it even start? Yeah, well,
so let's let's get into it. So last week, folks
might have seen that there were congressional hearings where members
of Congress grilled TikTok CEO Shou Zi Chew. So the
(08:35):
big thing to know here is that TikTok's parent company is called ByteDance, and they are a Chinese company, and the people who want it banned are essentially worried that its meteoric rise represents a national security threat because its parent company, ByteDance, is based in China, a country with whom the United States has had, like, a chilly diplomatic relationship. And so this
(08:56):
comes at a time when more and more state governments and institutions are cracking down on TikTok on state-issued devices. Like, there's a growing number of universities and state agencies where, if they give you a device, or if you have a state-run device, you're not allowed to have TikTok on that device, and lawmakers are proposing bans of TikTok.
Two weeks ago, the Biden administration demanded that TikTok be
(09:19):
sold or that it would face a ban in the
United States. And this is not just like hot air
or posturing. Congress has also rolled out a bipartisan bill
allowing a nationwide TikTok ban, called the RESTRICT Act, which
would allow the Secretary of Commerce to ban apps that
pose a risk to the US's national security. A sale
of TikTok would require the Chinese government to go along
(09:40):
with it and agree, and perhaps unsurprisingly, they're saying like, no,
we will not authorize or approve a sale of TikTok.
In response, TikTok has committed to spend one point five
billion dollars on a plan that they're calling Project Texas,
which would essentially enact a stronger firewall between TikTok and
employees of its Chinese-based parent company, ByteDance. It
(10:02):
would all be set up through a US tech firm called Oracle as kind of, like, a watchdog organization that's meant to scrutinize TikTok's source code and act as kind of a third-party, unbiased, United States-based monitor to monitor for, like, potential security risks. And so that's kind of the zoomed-out conversation about what exactly is going on
(10:25):
and the context behind how we got to these hearings. Yes,
and you know, it is like I've used the word
confusing a lot. It is confusing. But it's been one
of those things where we've heard a lot of lawmakers
ask these questions that make clear they don't really understand
(10:49):
any of this. The congressmen be like, what's the internet? So if I'm on Wi-Fi... and people making memes, there's been so many memes of the congresspeople asking the most ridiculous questions with Chew's face going, huh. Yes, I mean it reminds me of when
I go visit my parents, and you know how when
(11:11):
you visit, I mean, if your parents are anything like my parents, you have a nice meal, whatever whatever, and then the tech support part of the evening starts, where it's like, what... like, can you do this? Can you delete that? Like, what's our Wi-Fi password? You know, like, two remote controls? Like, so many. Oh my god,
(11:32):
here's a PSA for anyone listening. Do your parents a favor: change their Wi-Fi password to something, like, easier for them to remember, so it's not just a random collection of numbers and letters that they have to read out to you while you're there. You know, that's exactly it to me. The few bits of the clips that I
(11:53):
saw literally felt like watching my parents asking, what is this tickety-talk? Totally, you're so right. So there
were some good questions that came up. Like, I said, one lawmaker asked about whether or not marginalized creators are suppressed on TikTok by the algorithm. Another one asked if the app was suppressing accurate content about abortion. So there
(12:14):
were some like good questions where I'm like, oh, that's
a good question, But there were so many more lawmakers
asking questions that revealed that they have no idea what
they're talking about. Probably the one that got the most
play was Rep. Richard Hudson of North Carolina asking about
like whether or not TikTok I think I think he
was trying. I think the substance of the question was
(12:35):
if you're using Wi-Fi on your phone and are using TikTok, can TikTok access other devices that are on that same network via your Wi-Fi connection? I think that was the spirit of the question. But the way that he was asking it, like, does TikTok access the home Wi-Fi of a user, made it seem as though he was not clear on, like, the
(12:57):
relationship between Wi-Fi and TikTok, you know what I mean? Like, it wasn't a question that inspired a lot of confidence,
is what I'm saying. And you know, it's one
of those things that you really get a sense of
the fact that so many elected officials are meant to
be legislating technology that they just perhaps don't even really understand.
And I think we're seeing that with so many different
(13:19):
types of tech like conversations around AI. I think is
another one where it's like stuff is moving quickly and rapidly,
and we really need elected officials and people with power
and institutions to be advocating for the best interests of
the public. If you aren't able to do that, it's
actually like that is actually a pretty big national security
(13:40):
risk in my book, right right, Yeah, And I get
that we have this conversation about national security threats and
(14:00):
invasion of privacy, as we are constantly having to update our privacy notices for everything electronic, whatever it may be. My phone just recently did it; our Audacity, which we use to record on, the, like, terms and conditions are changing,
and we have already had this conversation about the fact
that our phones are listening to us because you can
(14:22):
say a store and the next thing you know, it pops up as an ad for you. And we know that in China at this point, I believe Facebook has been banned and they use their own technology, which is kind of on that same line of what's happening with the concern about TikTok. But I have to ask, how is
this so different from any other apps that we would
(14:43):
use for social media. That's a great question, and I
would have to say it's not really that different other
than the fact that TikTok's parent company, byte Dance, is
a Chinese company. Right, There's not many differences, Like most
if not all, of the harms that TikTok is responsible
for that came up in that hearing. Aren't true about
(15:03):
every other major social media platform as well? And so that was something that I didn't love about what came up in that hearing, is that, like, if we're going to be making TikTok the poster child for harms that all social media platforms are responsible for, we should have a clear reason why we're doing that. There should be some kind of smoking gun, some kind of evidence, some
(15:24):
kind of something that's like, well, here's why. And so
the big question of the hearings is whether or not
TikTok is actually a national security risk. This is a
little bit above my pay grade. I am not a
digital national security expert, so you know, just take that
for what it's worth. So during the hearing, TikTok CEO
was grilled about his relationship with the Chinese Communist Party
(15:44):
and whether Project Texas was going to be enough of
a solution. I am concerned that what you're proposing with
Project Texas just doesn't have the technical capacity of providing
us the assurance that we need. This is from California Republican Jay Obernolte, a congressman and software engineer. I should
also say that like TikTok does not have the like
(16:05):
squeaky-cleanest record when it comes to privacy and how they handle your data; of course, like, most social media platforms do not. And this definitely came up in the hearing. Neal Dunn, a Republican from Florida, asked pretty bluntly whether or not ByteDance has, quote, spied on American citizens. And we actually know that the answer to this is probably yes. There were reports last December that TikTok accessed
(16:26):
journalists' information in an attempt to identify which employees had been leaking information to those journalists, and TikTok actually admitted to this, according to an internal email. When asked about this directly, TikTok's CEO responded that spying is not the right way to describe it, which, I don't know, I mean, it does kind of sound like spying to me. Like,
if there's some sort of nuance in like, you know,
(16:47):
maybe maybe from his perspective, if the thing that you're
trying to sniff out is which of your employees is
leaking information to a journalist, maybe that's the distinction that
he's making. But you know, it doesn't sound great. And
so when I say that, like, I don't want to
make it seem like I am suggesting that TikTok is
a perfect platform where things like this never happened, because
(17:09):
that's not the case at all. We know that, like, this is an example of, you know, perhaps TikTok doing some things that aren't great, but this is, I hate to say it, pretty in step with how social media platforms behave at large. Like, I wish that wasn't the case, but that is the case. I mean, it's kind of... we've seen it play out live with Twitter as it's breaking down, in which they are going after people
(17:32):
or people that they don't like or disagree with, and not necessarily doxxing them, but definitely cutting them out and making sure that they know, they being the people whoever did this, that they are being watched, quote, or any of those things are being seen by Twitter themselves, the company. So it's not like this is anything new. Once again, it's very concerning, but that's kind of that
(17:54):
acknowledgment. Us having a phone that is connected to Wi-Fi, that is connected to any type of internet, or any of the data, they're getting our information. That's... I would assume that we all understood this. Totally. So that
is really my biggest point that I always come back
to in this conversation. We don't have any kind of
(18:16):
meaningful data privacy legislation in this country whatsoever. All of
our information is for sale to whoever wants it, and I mean that literally. I've done a whole episode about doxxing and how people do get doxxed on There Are No Girls on the Internet, and essentially, if you've ever done
anything like voted, or turned on the electricity in your apartment,
(18:40):
or paid a parking ticket, your information is for sale
on the internet. Oftentimes it is put there by our
state agencies. That information is just available widely for whoever
wants to spend a small amount of money to buy it.
I wish that wasn't the case, but that is the reality.
And so the fact that we are talking about banning
TikTok when if the whole national security threat is China
(19:05):
having access to American data, all of that data is
for sale. So like if you banned TikTok, that would
still be the case. China would still have access to
American data. It wouldn't be from the TikTok app, but it's just so widely available. And so I
feel like banning TikTok is this flashy scapegoat of like,
see we did something when in reality you've done nothing.
(19:29):
The analogy I use is, it's like putting bars on your
window when you don't have a front door. Right, It's like,
we desperately, desperately do need meaningful legislation that protects user
data and user privacy. But banning TikTok will not get
us there. All of our information will still be widely available to whoever wants it. That's the problem. Right.
(19:49):
I mean, let's be honest too. I think Chew was not wrong when he said that the hearing felt like it was a xenophobic attack on, oh my god, a whole group of people rather than just a social media platform. Absolutely. So, um, there is a great piece on this by CNN's Brian Fung, who I actually know IRL, shout out to Brian Fung, where he talked about how some of
(20:12):
the rhetoric coming out of those hearings just felt very xenophobic. And so, TikTok CEO Chew, he is Singaporean, right, and so accusing him of working with the Chinese government and trying to associate him with the Chinese Communist Party just doesn't feel... it doesn't feel like helpful rhetoric. And to me, it did feel
(20:33):
rooted in xenophobia. It felt rooted in this idea of making this connection to China feel other and thus, like, nefarious or suspect or something. And so if there was some kind of, like, smoking gun, right, that the Chinese government is using TikTok to spy on Americans en masse,
(20:56):
and there was some sort of evidence or a smoking
gun to illustrate that, then we would have that conversation. But right now it is just so based in, like, hypotheticals, like, well, they could, you know, that is a potential risk. And it just feels like
adding this assumption of nefarious behavior simply because we're talking
(21:17):
about a Chinese company, does that make sense? Yeah? No,
Like, this is kind of the big question in this rise of Asian hate, in this level of discrimination when it comes to anything that's coming out of China. It feels like they're playing into... well, we're already kind of scared and we have a base of people who are going to blame the Chinese for everything and anything.
So let's go ahead and start this when in actuality,
(21:39):
this type of privacy and data collecting has been happening. I'm trying to think back, didn't Congress actually bypass a data privacy law in order to get more money out of, like, companies a while ago? They straight up don't care, like, that's the thing. It's like,
this is just me kind of like tinfoil hatting a
little bit, So take that for what it's worth. But like,
(22:03):
I think that when there is a foreign boogeyman, you
see elected officials acting differently, but when given the chance
to legislate and govern and act on these very same issues,
when there isn't some foreign boogeyman to blame, they do nothing.
They've had the opportunity to act and they have done nothing.
(22:24):
So for me, it's like, is this really about security
and harm and risk? Because you have done jack up
until now. So if it truly is about protecting the
public from harm, you need to then explain to me
why at every opportunity that you've had to act, you've
done nothing. Like that needs to be explained to me
because I don't get it or actively voted against it,
(22:45):
which we saw again like I remember this coming about
and people talking about we need more laws, and so
lawmakers did and then it just went nowhere. And you're like, wait,
so do you actually care about my privacy or not?
I mean, if you were to again, so much of
this sounds like I'm, I'm it sounds like a conspiracy theory.
(23:06):
But I know that part of this conversation is the
immense money and energy that Facebook has put into lobbying
elected officials into seeing TikTok as harmful to take the
heat off of them, and in some cases, you know,
I would wonder if elected officials have some sort of
financial connection to Meta right, Like during that hearing, I
(23:30):
heard specific talking points that I know came from Facebook's
massive lobbying company called Targeted Victory, where they assign blame
for you know, these challenges that they say start on
TikTok that were where kids end up doing them and
and then dying or getting hurt, harming themselves. But
those some of those challenges actually originated on Facebook, and
(23:51):
so Facebook actually had a huge pr campaign to make
the public and elected officials associate TikTok with harmful challenges.
You know, misinformation, all kinds of bad stuff that they
themselves are are also pushing, right, And I want to
come back to that point in a second, but to
(24:12):
go back to sort of the challenge, the challenge aspects
of social media and young people, I think one of
the other big pieces of this conversation that also relates
to those sort of funny and distressing soundbites, is that
older people don't use TikTok, whereas they might use Facebook,
or they might use something else, and therefore are very
(24:35):
dismissive of it, as like, oh, this is young folk hooligans using this one. Totally. So I still hear people
talk about TikTok as a kid's dancing app, and I
always bristle at that because that is just not true,
Like TikTok is a discourse app, like it is not
just for kids dancing. But it is absolutely true that
it has a huge young fan base. According to The Guardian,
(24:58):
a majority of teens in the US say that they
use TikTok, with sixty seven percent of people aged thirteen
to seventeen saying that they use the app almost constantly
according to Pew, and so it definitely is an app
where young people are congregating. But that doesn't mean that
it's not an app where serious discourse is taking place,
because it absolutely is. And so I think that you're
(25:19):
exactly right that that is something that takes center stage
in conversations around things like content moderation is the fact
that it does have a very young user base. And
so during the hearings, we saw lawmakers pointing out that like,
children can have access to content around guns, Like there
was an image of a gun that lawmakers pointed to
on TikTok that they found very distressing, which is almost
(25:41):
humorous to me that it's like, oh, this image of
a gun on social media, on TikTok, bad. Guns in real life, like in schools, hurting actual kids? Nah, like, all good. So that was kind of interesting. And so
something that Chew said was that harmful content making its
way to minors on social media apps is an industry-wide challenge, which is true, right. So I think
(26:04):
every major social media platform is struggling to make sure
that young users are not having access to content that's
going to be harmful for them. But the thing that
I am like really passionate about is how apps like
TikTok promote things like disordered eating or self harm content
and medical misinformation. So right now TikTok is facing like
lawsuits over young people who have gotten hurt or died
(26:28):
because of content that they came into contact with allegedly
on TikTok. And so, you know, I would say this... like, this is part... I do this for a living. It is like, I meet with social media platforms and the leadership at these platforms to advocate for them, you know, making platforms safer. And I will say that, like, harmful, downright dangerous content working its way onto platforms
(26:53):
is a problem for all social media platforms. I wouldn't
necessarily say that TikTok is performing worse than any other
social media platform out there. It might be even performing
like above average when you compare it to things like
Twitter or Facebook. But this is a very real issue
and all social media platforms across the board need to
(27:13):
be doing better. And so the fact that that came
up in the hearing, I was like, oh, well, that
is actually something that we need to be talking about,
But we don't necessarily need to be talking about it
in a way that just posits that TikTok is the
only bad actor in the space, or is the only
platform in the space where this harm is happening, because
the reality is it's happening on Facebook, it's happening on Twitter,
(27:33):
it's happening on Reddit, it's happening across social media. Yeah,
it's one of those things that feels so strange that we've just accepted it. But it's difficult to fight it, I guess, but you kind of know. You're like, well, if I'm using my phone, then they're going to know that I looked up this thing or this thing or this. And I also read that if you don't
(27:55):
even have whatever app, it can still somehow get access
to your data. So it's just kind of helpless, I guess, like, yeah,
well I get that, and I feel the same way,
and I think it makes me sad that, just as
you said, Annie, we've like just accepted that this is
(28:15):
how it is right that, of course social media platforms
are going to be making all my data available, whether
or not I even use them or have them on
my phone. Of course they are going to be listening
to my conversations and you know, serving me up ads
based on that. Of course, if I am interested in
obtaining an abortion, they will share that information with law enforcement.
(28:37):
Like I think that we should be able to expect better.
I think that we deserve better. We deserve to have
better digital tools and digital platforms. We deserve to have
our online experiences not just be marketplaces for harm and
exploitation and scams. We deserve for them to be places
(28:57):
where you can have meaningful discourse, get accurate information in
a way that is safe and private. And the fact
that we're just like, oh well, nope, that the baseline
experience that we can expect is quite literally the opposite.
I don't accept that. That's unacceptable to me. Oh my god,
you just completely put it into my mind. I had forgotten about it already.
I can't believe there's so many bad things in the world,
(29:19):
but I forgot about the case in which Facebook allowed for information to be gathered on a young woman and her mom for getting access to an abortion, and it actually being obtained by law enforcement to go after these people. It's like, if you want to talk about... to me, what a security threat that would be. And that
(29:39):
was from Facebook not too long ago. Exactly. So that's
such a good point that, like, there are specific and
current harms that other social media platforms like Facebook are responsible for today. Right, we know about that. We know about things like Cambridge Analytica. Right? Like, these are things, these are known things where it's like, yeah,
(30:01):
we know that Facebook has admitted responsibility for literal genocide.
These are things that we know. They're not hypotheticals, they're
not potential risks or potential harms. They're things that are
that have happened. And so for me, it's a little bit rich to be having this, like, breathless
conversation about the potential harms maybe down the line that
TikTok could be responsible for potentially, and then not having
(30:25):
the conversation about the laundry list of documented, admitted harms
that Facebook has been responsible for already in reality, Like,
I don't understand how they got to this point where
the hypothetical risk takes precedence over the actual documented and
oftentimes admitted harm of platforms like Facebook. Right. And you
(30:46):
were saying earlier because it sounded like a whole movie.
It feels like a spy movie, as you were saying,
because you were talking about how Facebook has actually been
kind of behind the scenes in doing a whole smear campaign on TikTok. Can you talk about that? Yeah. So
this whole conversation, the whole hearing feels like a real
win for Facebook. We already know that Facebook has paid
(31:06):
lots of money to try to make people and more importantly,
lawmakers dislike TikTok, and we definitely saw the impacts of
this on display at the hearings. Right. So, those deadly
challenges that I was talking about, several of those challenges
actually originated on Facebook, not TikTok. And the reason that
lawmakers might be associating those with TikTok is because of
(31:27):
this coordinated smear campaign orchestrated by Facebook. The New York
Times found that Facebook paid Targeted Victory, which is one
of the biggest Republican consulting firms in the country, to
orchestrate a nationwide campaign seeking to turn the public against TikTok.
This campaign included placing op-eds and letters to the
editor in major regional news outlets, promoting dubious stories about
(31:47):
alleged TikTok trends that actually originated on Facebook, and pushing
to draw political reporters and local politicians into helping take
down its biggest competitor. This is from an email that The New York Times found: Targeted Victory needs to, quote, get the message out that while Meta is the current punching bag, TikTok is the real threat, especially as a foreign-owned app that is number one in sharing data that young teens
(32:09):
are using. A director for Targeted Victory wrote in a
February email, So I really don't like that Facebook is
at least pulling some strings behind the scenes of this
conversation about TikTok to kind of take the heat off
of the massive wrongdoing and public harms that they have
been responsible for, like the whole hearing. I'm sure that
(32:29):
Mark Zuckerberg is like, oh, this is taking the heat
off of me. I can just stay over here doing, like, evil while TikTok gets all the heat. Yeah, that
was my Mark Zuckerberg impression. It reminds me of when
I was in college. I had an internship in China
(32:50):
and I was working for this big company that will remain nameless. But they made me get a different laptop, and they made me get a different phone, because they were like, once you go through in China, they just take all your data. And I remember thinking, like, I'm an intern, you're not sending me any useful information or emails. Like, sure,
I'll do it, but like, I don't know what you
(33:11):
think they're gonna get from me. And I love how
they kind of point out, like, they're spying on the US, not that the US isn't doing anything, but it's sort of like, I think it's all about money. At least a part of it is just like, we want Meta to make money because it's in the US, sell the data all the time to them. But TikTok, no,
(33:31):
Oh my god. I saw this USA Today headline that was like, TikTok wants my data? Don't they know that's reserved for Google and Meta? Like, very much like we want the West to be where social media platforms and data are our marketplace. Like, I have to wonder how much of the conversation is exactly that, Annie. Yeah,
(34:05):
and so I guess this brings us to another big question,
what would happen if TikTok was banned? Like if we
play out this scenario, what would happen? Such a great question. I had an interview with TikTok disinformation researcher Abbie Richards this week, who wrote a great piece for Newsweek which
folks should read. In it, she writes, I understand the
(34:25):
privacy concerns stemming from reporting that TikTok has been weaponized by the Chinese Communist Party to gather data from Americans. But banning TikTok is like applying a dirty, used band-aid
to the gaping wound that is our broken digital privacy
status quo. It would do little to protect the data
of Americans, but it would cause a whole host of
new problems. To address this problem at its core, we
must regulate the use of data. Why should Google, Meta
(34:47):
and Twitter get a free pass because they are not
Chinese-owned? If we ban TikTok, the channels of communication
that have been steadily established over the last half of
this decade will cease to exist, leaving some of the
most marginalized in our country suddenly in the dark. The
US is at a crossroads. We could dismantle a massive
piece of communications infrastructure used by young people, LGBTQ plus
people and people of color, exacerbating existing inequalities in information access.
(35:11):
Or alternatively, Congress could implement legislation that serves to protect
the digital privacy and safety of all Americans on all platforms.
And so the thing that Abbie is really getting at here, that I hadn't really thought about until reading her piece and talking to her, is that we're at a time where marginalized folks, like trans communities, queer communities,
communities of color, women are facing a lot of attacks.
(35:34):
And so if you dismantle a platform where a lot
of these communities have built up a voice and built
up a platform for themselves, you would be setting those
communities, and their, you know, their work for equality and justice, back quite a bit. And I hadn't really thought about that, because I don't think I had fully thought through how big some of these spaces on TikTok
(35:57):
are in terms of creating discourse. Like I'll be straight
up with you, when all that stuff was happening in Iran,
I don't think I would have known what was going
on with women and girls in Iran if not for TikTok.
That was the first place that I saw it. That
was where I saw like conversations about how folks in
the United States could help amplify. That was how I
got connected to have guests from Iran on my podcast,
(36:19):
Like if not for TikTok, I would not have been
able to do any of that. And so, you know,
the ways in which marginalized communities have been able to
use this platform to create discourse and power for themselves
really is pretty vast. And so if we ban it
outright, according to Abbie, that could have really drastic
consequences for marginalized communities online. Oh yeah, you say that.
(36:41):
We literally just had a guest who was Iranian, Elka, who has a big following on TikTok. But yeah, in the same way, I would have not known about any
of the things going on, even knowing more updated information,
because I definitely don't see it on the news. It's
very rare that I get to see personal takes on
how it's affecting individuals and families that are imprisoned. So
(37:01):
but yeah, it is all because of TikTok, and I
would have no clue except because of that. And honestly,
it's helped me connect even deeper into my Korean heritage,
which I feel so lost about, like I can't. I'm
not big on TikTok. I don't post things on TikTok.
I follow a lot of different people, but some of
the connections that I made without them knowing, my parasocial connections, really have brought me to a deeper understanding
(37:25):
or trying to understand myself or like myself a little
better in my ethnicity. And I know that's a whole
other conversation, but it truly has made me feel a
little more connected to a community I felt so ostracized
by through TikTok, and I cannot imagine. I'm sure there'll
be other platforms. I'm not gonna sit here and say
this is going to be an end all if it
(37:46):
goes away, but it does seem very targeted for something
that I think has done a lot of good, I
know for me and for a lot of people. Whether
it is sending out my information to everybody in China,
I assure you that you can have it. It's sad and lonely in itself, but, like, it really does... there's
this level of me being a marginalized person and being
more marginalized in a community that's already marginalized, being adopted
(38:09):
and being very isolated to that felt so much more
connected through creators on TikTok that are willing to share
their life so that I feel more a part of
that community. That makes me sad. Yeah, And I mean,
I think I think that you've really said it. I
think that all social media platforms have their ups and downs,
(38:30):
but we can't discount those experiences, like the one
that you just shared about the way that this particular
app has enabled folks to really build community, explore their
identities and who they are. And if we just blanket
get rid of it, where will that conversation happen? I
don't see it happening on Twitter. I don't necessarily see
(38:50):
it happening on Facebook, right, I'm with you. I think
that marginalized people always are able to make a way
out of no way, like, build power and communities
and feel seen online even in places that are hostile
to us. But it takes time. It doesn't happen overnight.
And I think we're in a weird social media landscape
(39:11):
where I wonder, like, where would those kinds of conversations that you just described take place if we didn't have TikTok? You know, it would certainly take a long time to rebuild them, at a time when a lot of these communities don't really have a ton of time because they're
being attacked. Yeah, and I think those are those are
great points to make because I know we talked about
this in the most recent episode on What's going On
(39:33):
with Twitter. A lot of people will tell me, well, I don't use it, and I think, well, good for you. But
it's like very very meaningful for a lot of people.
And there is especially if you're in any way marginalized
or in a small town for instance, and you don't
have a lot of people you can talk to, you
go to these social media platforms and you find that
(39:55):
and there's something very powerful about that. And these movements
have happened on there, information has happened on there that
has shifted how people think. That has shifted movements. So
it's like I just encourage people always like, don't just
say oh, I don't use it, so it doesn't matter.
It matters to so many people. It matters to a lot
of people, and it is powerful. That's such a good point.
(40:16):
I've only been on TikTok for a couple like two years.
In those two years, I have learned more useful information
that I didn't know than my entire time on Facebook
and Twitter. Absolutely, hands down. I'm like, did you know that you're supposed to be boiling your wooden spoons, like, that's the way to clean them? Who
(40:37):
knew this? Like, so, I don't want
to sound super biased, but I do love TikTok, and
I have personally seen the ways that it
can be really a really important place for discourse and
information and communication and community. And I think that given
all of that, having people who don't necessarily understand it,
(40:58):
who've never necessarily used it, making a blanket ban of it, coming from a place of fearmongering, and
you know, in some cases xenophobia is not the move.
I am all for regulating social media. I am all for,
you know, meaningful, comprehensive legislation that protects the
(41:18):
American public's privacy online. Yes, give it to me, we'll
take it. Let's have that conversation. I don't see this
as a conversation that will bear that fruit that I
know that we need. I see this as posturing, scaremongering, fearmongering, making a boogeyman of one platform, when what
we need is something much more meaningful and much more comprehensive. Right,
(41:42):
I completely agree, there's so much confusion because you do
partially agree that things need to change on TikTok, but
that partial agreement is overarching to all the social media. Like you said, unless you say that you're doing this for all of social media and the internet and all of that and our phones, then we're not going to
(42:03):
believe that you're doing this for the well being of
a nation. You're doing this for your profit. Exactly. Abbie had a point, she said, data privacy, misinformation, hate speech, you got to care about it on all platforms, not just TikTok, not just the one that you can make a foreign boogeyman out of. Right. And I would
love to keep, like, pitching these ideas that we'll probably
(42:23):
never do, but I would love to come back to talk about... Also, it's interesting to me that they're talking about this when we've had so many high-profile instances lately of, like, our whole flight system falling apart here in the US because we haven't updated our technology in years. Genuinely scares me. Right, there's so much infrastructure,
(42:50):
but also tech and digital infrastructure, but we've just accepted
it, it's like, yeah, it's janky. Yeah, well, basically can't use it, I don't know, I'll just be stuck in the airport for a couple of days. Good luck. Yeah,
we just we deserve better. We deserve better. We do
we do? Oh well, thank you so much as always,
(43:13):
Bridget, for coming on and helping us to understand this whole thing. Oh,
the pleasure is all mine. Thanks for helping me. I
feel like I was very ranty, but thank you for
helping me understand it better too. I always feel like
I get clarity from connecting with these issues with you all.
That's what I feel like too. I feel like we have good conversations. Yes, it's good in these episodes. Yes. Well,
(43:39):
where can the good listeners find you? Bridget? You can
find me on my podcast, There Are No Girls on the Internet, on iHeartRadio. You can find me on Twitter at Bridget Marie. And you can find me on Instagram at Bridget Marie DC. And you can find me on TikTok at Bridget Makes Podcasts. Easy enough. Like I said before,
on these it's so strange. I mean, it's important to
(44:02):
be critical of the things that you use, right, but
it's always so strange at the end to be like
and you can find me on... Exactly. Yes, well, thanks again as always, Bridget, can't wait for next time. In the meantime, listeners, if you would like to contact us, you can. Our email is Stephania mom Stuff at iHeartMedia dot com. You can find us on Twitter at mom Stuff Podcast,
(44:22):
or on Instagram and TikTok at Stuff Mom Never Told You. Thanks as always to our super producer Christina, our executive producer Maya, and our contributor Joey. Yes. Thank y'all, and thanks to you for listening to Stuff Mom Never Told You, a production of iHeartRadio. For more podcasts from iHeartRadio, you can check out the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.