
August 1, 2025 70 mins

This week Bridget recaps the tech stories you might have missed with longtime friend of the show, indispensable Internet advocate Abbie Richards.

She is the Steve Martin of TANGOTI. If you're not following Abbie on TikTok, you're missing out! www.tiktok.com/@tofology

US labor hero and friend of the show Chris Smalls, co-founder of the Amazon Labor Union, was beaten and choked by IDF soldiers while trying to deliver aid to Gaza. https://www.theguardian.com/us-news/2025/jul/29/chris-smalls-amazon-labor-union-gaza

Substack sent a push notification for an openly Nazi blog, continuing their streak of promoting Nazis. https://www.usermag.co/p/substack-sent-a-push-alert-promoting-nazi-white-supremacist-blog

The Discourse Is Broken: How did a jeans commercial with Sydney Sweeney come to this? https://www.theatlantic.com/technology/archive/2025/07/sydney-sweeney-american-eagle-ads/683704/?utm_source=reddit&utm_medium=social&utm_campaign=the-atlantic&utm_content=edit-promo

First YouTube and now LinkedIn have changed their policies to allow discrimination against trans people: https://www.advocate.com/business/youtube-scraps-gender-identity-protection and https://www.advocate.com/news/linkedin-transgender-deadnaming-misgendering-policy

TikTok adds footnotes in an attempt to add context to misinformation and fake content. We hope it helps! https://newsroom.tiktok.com/en-us/rolling-out-tiktok-footnotes-in-the-us

Community Notes and its Narrow Understanding of Disinformation: https://www.techpolicy.press/community-notes-and-its-narrow-understanding-of-disinformation/

Trump Admin uses songs without permission to create cruelty porn. These people have no shame. https://newrepublic.com/post/198600/white-house-jet2-holiday-meme-deportation

There's Already a Class Action Lawsuit Against the Viral 'Tea' App: https://lifehacker.com/tech/tea-app-class-action-lawsuit

Spotify threatens to delete accounts that fail age-verification: https://www.telegraph.co.uk/business/2025/07/30/spotify-threatens-to-delete-accounts-unless-users-prove-the/

Not just YouTube: Google is using AI to guess your age based on your activity - everywhere: https://www.zdnet.com/article/not-just-youtube-google-is-using-ai-to-guess-your-age-based-on-your-activity-everywhere/

If you’re listening on Spotify, you can leave a comment there or email us at hello@tangoti.com!

Follow Bridget and TANGOTI on social media! Many vids each week.

instagram.com/bridgetmarieindc/

tiktok.com/@bridgetmarieindc

youtube.com/@ThereAreNoGirlsOnTheInternet 

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet is a production of iHeartRadio and Unbossed Creative. I'm Bridget Todd, and this is There Are No Girls on the Internet. Welcome to another episode of There Are No Girls on the Internet, where we explore the intersection of social media, technology, and identity. This is another iteration of our weekly news roundup where

(00:26):
we talk through the stories that you might have seen on the Internet so you don't have to. I am so thrilled. So welcome back to the show, friend of the show, I think maybe our longest-running guest. Like, you know how Steve Martin was the host who hosted SNL the most? I think you're the Steve Martin of There Are No Girls on the Internet. Oh my god, what a claim to fame. It's an honor and a privilege.

(00:48):
We are so thrilled to be joined by Abbie Richards, prolific TikTok user, misinformation expert, Media Matters analyst. How do you describe yourself these days? You're a woman who wears many hats.

Speaker 2 (01:00):
Yeah, I think I describe myself as a woman who wears many hats. I'm still going with, like, misinformation researcher and content creator, as like a blanket, you know, just covers all the bases.

Speaker 1 (01:14):
Yeah, there's a lot more to Abbie. She has multitudes. Abbie, the last time that we saw each other, I think, was in Ireland, question mark? Yeah, it was in Dublin. Dublin, that's right. We were there with Mozilla. And one of the folks that you and I were lucky enough to spend some time with and really get to know when we were in Ireland was Chris Smalls. We've talked about

(01:35):
Chris Smalls on the show before. He is one of my personal heroes. You might know him best as one of the main leaders in the fight to unionize Amazon workers. He was the co-founder and the former president of the Amazon Labor Union, which was the very first unionized Amazon worker coalition ever recognized by the National Labor Relations Board.
When we first got on, we were talking about the

(01:55):
fact that Chris was part of this Freedom Flotilla Coalition, which is a grassroots international collective that has really been trying to get aid to folks in Gaza. They were intercepted by the IDF, who boarded their vessel, and, you know, I'm sure that nobody was having a great time once that happened. But all the reporting I saw was that Chris was being uniquely targeted, and

(02:17):
I have to imagine that has something to do with
him being I think the only black man as part
of that coalition. Yeah, he was the only black activist
on board.

Speaker 2 (02:26):
And he was reportedly, like, kicked in the head, and they were, like, beating at his, like, knees and legs. It was like seven big people in uniforms that were assaulting him. And reportedly, then, when his lawyers went to go, like, meet with him, they

(02:47):
had, like, six special police guards on him, and then his conditions when he was being held were terrible, I think, as with the other activists. But it did seem like he was specifically targeted, and it was really, really concerning for us who are, like, his friends and want to make sure

Speaker 1 (03:06):
that he's safe. Yeah, and it sounds like he might have actually been released quite recently. Do I have that right?

Speaker 2 (03:11):
Yeah. So, as far as I'm aware at the time of recording, I think he's, like, in Jordan, like, boarding a plane, or maybe even, like, on the plane right now, and is expected to land tomorrow morning, which, like, thank god. I'm really glad that he's made it out. Sounds like conditions were terrible. They were saying that they were, like, infested with bed bugs and overheated, really unclean, overpacked, and also that the activists

(03:37):
were on, like, day five of their hunger strike. So the conditions were terrible and it sounded absolutely miserable. But he posted a video, I think at an airport, like, ready to leave, and seemed in good spirits, which is nice.

Speaker 1 (03:52):
And it's a good reminder of something that we get into sometimes, which is that these issues, at least from my perspective, are very much all connected. The fact that Chris Smalls was, you know, the co-founder of the first Amazon labor union, I don't think is unrelated to what's happening with Palestine. Like, I think these are all tech issues, they're all justice issues, they're all race issues.

(04:12):
The fact that a black activist would be uniquely targeted by the IDF when coming to provide aid and coming to advocate for the cause. I think all of these issues are connected in ways that sometimes can be difficult to see, but I think that what happened with Chris really does demonstrate that. And so I think a lot of folks might think, like, oh, well, what does this issue have

(04:33):
to do with technology? Why would you be talking about this on a tech podcast? I see it as very linked. The fact that someone would see what's happening with Palestine and have the orientation of, like, well, the same kind of forces of injustice that are at play with big tech companies like Amazon are certainly at play here. Yeah.

Speaker 2 (04:53):
I mean, I think it is connected from so many different angles. There's, like, the, you know, technological colonialism of it all, but there's also, like, the social media activism of it, of, like, how he even got there was from a place of, like, knowing how to utilize social media to get eyes on Gaza. Because it's not like

(05:13):
the Freedom Flotilla was bringing, like, tremendous amounts of aid in. I think they were very aware, as with the first round of it too, that they would likely get boarded and taken into custody and that the aid wasn't going to get through. But it's activism in that performance of, like, letting themselves

(05:35):
get arrested, right? A lot of activism revolves around that, and using social media to get that many eyes on it, right, the types of activists that were on that boat.

Speaker 1 (05:45):
That is such a technological issue.

Speaker 2 (05:48):
Of course it is. And those things are so connected and intersecting, and I think it's really interesting that we have to, like, account for that in, you know, twenty-first century resistance efforts.

Speaker 1 (05:59):
Absolutely. Well, Chris, if you are listening, our hearts are with you, we're thinking about you. We love you. Yeah, good vibes, so proud. Okay. So I want to talk about something that I know you probably have lots of thoughts about, and that is TikTok adding footnotes. Did you see this, this new kind of, like, program they're rolling out that'll, like, let a select group of TikTokers add

(06:21):
context and background information to some videos on the app.

Speaker 2 (06:25):
I saw, yeah, I saw. I saw the article, like, saying that they were planning to roll it out. I would like to be in that first group, please.

Speaker 1 (06:32):
Yeah, you would be, let me tell you. I can't think of somebody who uses the platform in a better way to call out some of the misinformation that is there. Like, there's nobody really doing it the way that you're doing it. And what I love about your work is how you're able to... like, you use the platform. You're a prolific user of the platform, but you also critique the way that misinformation is able to spread on

(06:54):
that platform. And so it's not like you're someone who is only rah-rah, this platform that I've built up can do no wrong. You just come at it from such a place of transparency. Thank you. I mean, like, I am a TikTok enjoyer.

Speaker 2 (07:08):
There are things I really enjoy about that app, but also there are things that, like, infuriate me, and I think that we can hold space for, you know, two things to exist at the same time, where, like, I have found great, great joy there, and also I have been, like, lied to prolifically. So yeah, I think that the community notes function that they're going to try and

(07:29):
add, I'm actually pretty in favor of it, as long as it's not also replacing other fact-checking, which it sounds like it isn't.

Speaker 1 (07:37):
So it's similar to the way that community notes functions on X. But notably, TikTok has said they are not going to abandon in-house fact-checking. X and Meta, when they rolled out community notes, that was all they were going to be doing, and it really, really did seem like a way to offload that burden from paid staff at these companies and just put the burden

(07:59):
on users, right? And so I think that you're exactly right. The fact that TikTok is saying, hey, this is not going to be replacing our in-house fact-checkers, we're going to continue to have that too, I think is really a key difference, and why, to me, I'm not, like, opposed to this. Yeah, no, I'm not opposed.

Speaker 2 (08:14):
And I also think that it's kind of community-oriented, which I like. I like that there can be, like, discussions around the accuracy of something. Like, I'm very interested in that. Again, it doesn't replace, like, journalists and fact-checkers, so as long as it's just, like, an added thing of, like, I think this is AI, I can

(08:35):
recognize that this bunny disappeared on a trampoline.

Speaker 1 (08:39):
Oh, the bunny. So the bunny, it got me. I sent that to somebody, and then I had to send a follow-up when I saw the piece in 404 Media. Well, that got a lot of people. That got millions of people. So the piece in 404 Media breaking down this viral TikTok that purported to be backyard Ring cam security footage of a

(09:01):
bunch of bunnies adorably jumping on a trampoline. They were so cute. It was so cute. And the reason why it got me was because Ring camera footage is always blurry, so I think that you're not trained to be thinking, is this real or is this not? You're just expecting to see some low-res video, and I think that makes it easier to, like, slip AI in. Yeah,

(09:22):
I think the blurry footage also... there's been a few of those videos of, like, animals jumping on a trampoline at night, and it's always at night, because if it were during the day, I think it would be a lot more obvious.

Speaker 2 (09:35):
But that, like, nighttime Ring footage, it's just easier for it to look a little shitty.

Speaker 1 (09:42):
Yes. And when it comes to mis- and disinformation and the way that it spreads in video form, it usually adheres to some sort of worldview that you hold. And I have the worldview that when night falls, animals are doing all kinds of cute things, and wouldn't it be great if we could see them? And so I was like, oh, I want to believe. I want to believe in a world where bunnies are jumping on a trampoline at

(10:04):
night in a coordinated way when we're all asleep.

Speaker 2 (10:07):
That's the ideology that I want to have. My core
belief is that when I go to sleep, bunnies get together.

Speaker 1 (10:17):
And just jump. Freedom of belief. Bunny nighttime mischief.

Speaker 3 (10:20):
Yeah.

Speaker 1 (10:20):
Yeah. So this is something interesting about community notes. So I like the fact that TikTok is not replacing their in-house fact-checking with community notes. But when platforms like X are using community notes only as fact-checking, the studies on whether or not that, and that alone, is an effective way of handling incorrect information are a

(10:42):
little bit mixed. There's a really good analysis in Tech Policy Press written by Nadia Jude, a PhD researcher within the Digital Media Research Centre at the Queensland University of Technology, and Dr. Ariadna Matamoros-Fernández, associate professor at University College Dublin. They found that studies on the effectiveness of community notes are mixed. For example, there is a debate about whether or not the tool reduces or heightens engagement with misleading posts. Notes are also slow, and the public sees less than twelve point five percent of all submitted notes. Partisanship often motivates volunteers to participate, and the system struggles to fact-check divisive issues, content from influential accounts, and hard-to-verify claims that include, for example, sarcasm.

(11:24):
These challenges call into question whether or not prioritizing consensus in moderation systems designed to address mis- and disinformation is a desirable or worthy aim. And I found that to be so interesting, because, you know, on a platform like X, where that really is the only thing that anybody is doing to prevent the spread of false information, you really see how it might not be something that's genuinely effective

(11:48):
at keeping incorrect information off the platform. On TikTok, I like the idea of it being something where context is added, or, you know, background information, or sort of, like, let me put this in context for you, as opposed to being, like, yes it is true or no it is not true. I almost wonder if it's that false binary of true versus not true. It's like

(12:08):
it's just not working on platforms like X when that's
the only thing they have.

Speaker 2 (12:12):
I think there are so many other contexts on TikTok where context would be so helpful. I mean, the amount of times that, like, I'm fully lost trying to find the original video that someone is, like, mocking and has, like, done a play on, or, like, I'm completely lost and don't have context on, like, what this discourse I've stumbled into is, or what account posted about it first,

(12:36):
and, like, what their username is. Like, it's so confusing, and, like, just given the nature of TikTok, you get dropped into, like, these conversations on your For You page, and you're like, I don't know what's happening. So I think, like, yeah, focusing on context and adding that, I think that can actually be, like, a beautiful thing for community. I'm interested to see how

(12:58):
it plays out. I certainly don't think it can replace fact-checking at all, ever. And I don't think it can replace true, like, content moderation. And I certainly worry about companies who just, like, don't want to deal with content moderation, so they just, like, kind of give that burden to users and they're like, you figure out what's

(13:19):
real. Exactly.

Speaker 1 (13:21):
Yeah, none of this would be a replacement for expert fact-checkers who know what they're doing, who are trained to address things, and particularly, like, culturally informed content moderators. Because so much of the content moderation... like, if people aren't culturally informed, it can be very difficult to know what people are talking about. I mean, you're

(13:42):
so right about TikTok. It's a platform where it's really difficult to follow drama, because, god, you're like, I'm out of the loop. Like, but who said what? What started it? I can never keep up.

Speaker 2 (13:54):
And then, like, somebody will just post a username that's, like, scrambled, and then you go to, like, that person's old videos and you're going through, and it's just... it's a lot of investigative work to try and understand drama you've gotten, like, dropped into the middle of due to your algorithm. And there's, like, a lot of times where I look at something and I, like, just have to make a decision

(14:16):
of just like I have to let this go.

Speaker 1 (14:18):
I don't... like, I can't invest any more of my time to try to figure out who started this random dispute. Yeah, I understand that the Wordle community is having discourse and that they're fighting, but, like, I need to not put any energy into understanding why. Ooh, I actually would be interested to know what the Wordle community is up in

(14:40):
arms about. That was a while ago, but, like, there was some beef in the Wordle community. I honestly felt like it was kind of manufactured for attention.

Speaker 3 (14:47):
But that was just me. Let's take a quick break, and we're back.

Speaker 1 (15:08):
Okay. Well, speaking of TikTok... this is, like... I mean, I'll just say, like, I try very hard to not focus on what the Trump administration is doing unless I absolutely have to, and so I blocked all of their social accounts, because they're all abhorrent. But when I peeked back in, I was surprised to

(15:28):
see that, you know, the viral TikTok "Hold My Hand," "Nothing beats a Jet2 holiday" meme... so everybody was on that meme. They were using that song from the Jet2 holiday commercial to illustrate, you know, vacations gone awry or bad things happening while they're on trips. Well,

(15:48):
the White House got in on that. They posted on X videos of handcuffed, allegedly undocumented people being escorted by blurred-out ICE enforcement officers onto a GlobalX flight, which is, like, an airline provider used by ICE, with that viral TikTok sound, and the post is captioned, when

(16:09):
ICE books you a one-way Jet2 holiday to deportation. Nothing beats it. Yeah, I just think... I know there's a lot of other horrible stuff in the world, but I do think that we should be talking about how abhorrent the official comms channels out of this administration have gotten. I remember earlier they were doing the

(16:31):
Studio Ghibli, like, AI-generated cartoons. They made one of a woman being deported, like, a cartoon version of her crying. And I just feel like they are putting out stuff where, even for their base, you have to be a depraved sicko to want to see this kind of thing. You

(16:51):
have to be like a depraved person to get enjoyment
out of this. And it really reminds me of like
our country's history with things like lynchings, right, like big
public displays, big spectacles being made out of the suffering
of others. And I just think like in the scheme
of things that might not be that big of a deal.
But the way that the administration uses their official comms

(17:15):
channels to signal to the most depraved members of their
base just really I mean it cannot be overstated.

Speaker 2 (17:22):
Yeah. No, that meme was, like, truly sickening. Like, I felt that in my stomach.

Speaker 1 (17:27):
I mean, same with, like... they have, like, the deportation.

Speaker 2 (17:29):
ASMR too. And it very much is an instance of, like, the cruelty is the point, right? Like, they're trying to be as cruel as possible, because they know that it creates media attention.

Speaker 1 (17:43):
They know it works people up.

Speaker 2 (17:45):
So it's also just about, I think, triggering the left to some extent, doing things that they know will make the left upset. And I think it's also partially about desensitization. So, like, the more your, like, eye sockets are assaulted with abhorrent content like that, you just get used to it.

(18:05):
You get desensitized to it, and I think it encourages
people to like emotionally shut down. It's absolutely bonkers propaganda,
but it does seem to be working for them and
getting the attention that they crave.

Speaker 1 (18:20):
Yeah, I think you're so right about the purpose of this kind of content. I mean, this is why I have these channels all blocked, because you can only see this stuff so much. You can only engage with this so much until either you become desensitized to it, as you said, or you simply can't go

(18:44):
about your day. I mean, Elaine Welteroth had this great TikTok where she was like, the experience of scrolling social media right now is: horrible tragedy, horrible, despicable comms coming out of an official White House channel, and then somebody's makeup tutorial. And I firmly believe that this is not an environment that we as humans were meant to

(19:04):
be in. It's like, oh no, we should not have this. This just should not be normal.

Speaker 2 (19:08):
No, our brains were not meant to be assaulted with this much, like, horrendous content, and then simultaneously sold as many things to consume as we possibly can. And that's what going on social media really feels like right now. It's just, like:

Speaker 1 (19:25):
Horrible thing, horrible thing, buy this, horrible thing, horrible thing, purchase this. And, like, it's really bleak. And I think
that there are people who like see that and recognize it,
like that's the emotional experience that like a lot of
us are having and are happy to add on to
it and try to like emotionally wear us out and

(19:47):
wear us down. Yeah. And I think the fact that they're picking these things that are symbols of joy... "joy" might be too strong, but the Studio Ghibli cartoons, like, those cartoons are all about celebrating nature and love and innocence. Like, they're very kind of pure, and you

(20:09):
know, they're... for me, they are a symbol of joy. So then seeing that perverted, and seeing that taken to illustrate cruelty, I think is a definite, like, intentional choice.

Speaker 2 (20:22):
Can we talk about how they did the Jet2 holiday meme wrong, though? Like, that bothers me, because the whole point of the Jet2 holiday meme is that it's supposed to be something going terribly wrong.

Speaker 1 (20:33):
Yeah. And they're like, oh, we love this, so look at us, deporting so well. And it's almost like they almost

Speaker 2 (20:40):
Kind of got themselves with the fact that they don't
know how the meme works and they just used a
trending audio. But like the whole point of that meme
is that like fun upbeat audio with like something going
just like demonstrably like wrong.

Speaker 1 (20:55):
Like, yes, they've kind of misunderstood, like, what makes it a meme, what makes it funny. And in a roundabout way, they're kind of saying these deportations are something that has gone wrong. But I know that's not what they want to say. No, that's not what they want.

Speaker 2 (21:11):
What they want is just, like, cruelty and supremacy, and to, like, trigger... I think they want to, like, trigger the left, and then on the right, they want to trigger feelings of superiority and, like, give people that kind of, like, emotional hit of other people's cruelty.

Speaker 1 (21:28):
Yeah. The person who sings the song is Jess Glynne. The song is called "Hold My Hand," and she actually posted a statement on Instagram basically saying that she was horrified to see the White House using her song in this way, even though she was happy to see it take off on TikTok initially. She wrote, this post honestly makes me sick. My music is about love, unity, and

(21:49):
spreading positivity, never about division or hate. And then the voiceover artist, the one that's saying "nothing beats a Jet2 holiday," said, what can be done about the White House using the Jet2 sound and my voiceover to promote their nasty agenda? And I do feel like... I mean, it sucks. Like, I can't imagine what it

(22:10):
must feel like to have the White House take your work and your words and use them to promote something in this way, especially after kind of celebrating that this Jet2 holiday ad that you made, people did find joy in sort of recontextualizing it. Like, I don't know. It's like, with the White House, we cannot have any small moment of joy or
(22:31):
wonder without them being like, how can we pervert this
but make it awful? It's so annoying, like let us
have fun, let us like enjoy things, but no, like
let's like they just come in here and completely ruin
the vibes and.

Speaker 2 (22:48):
And it's not like this is the first artist that this has happened to. It was like... wasn't it Olivia Rodrigo had a song?

Speaker 1 (22:55):
Oh my god! Which song was it?

Speaker 3 (22:56):
Was it?

Speaker 1 (22:56):
"Deja Vu"?

Speaker 2 (22:58):
That, yeah, the Trump administration used as an audio, and she was just like, take this down, or, like, never use my audio again. Like, don't use my sound. And they deleted her comments, so then she just, like, removed the sound off of TikTok.

Speaker 1 (23:12):
I think, yeah. And I mean, it also says a lot that they have to steal from artists and use their voices and their work without permission, because who would want to be associated with this? Like, they're not able to... I mean, part of me wonders, like, why not just work with the artists who voluntarily would sign on for this kind

(23:33):
of thing? It's telling that it's not, like, Kid Rock doing the soundtrack, somebody who probably would jump at that chance. No, it has to be somebody who doesn't want

Speaker 2 (23:40):
To be associated with you, Bridget, Bridget, you know, they
don't have good artists on their side. You know the
answer to this. They don't have good artists. They have
to steal ours.

Speaker 1 (23:50):
They have to. They're not making, like... what, is Carrie Underwood going to produce the really top audio right now? Like, that's what they're working with. Oh my god, this is such a throwback, that Carrie Underwood thing. I found that to be... that was like something out of Veep. Carrie Underwood, for folks who don't know, this is a little bit of my personal Roman Empire,

(24:11):
signed on to perform at the inauguration. Really shouldn't have signed up. When she went to perform, there was a problem with her stage, and her performance ended up being really janky, and she was panned. And it just was one of those things where it's like, dang, don't you wish you had never gotten mixed up with these people? What a bad choice.

Speaker 2 (24:29):
It's... oh my god. We always forget how incompetent they are, too. They're incompetent, and they don't have any good artists on their side, and that sucks, and I do feel bad for them.

Speaker 1 (24:39):
I do absolutely.

Speaker 3 (24:44):
Let's take a quick break, and we're back.

Speaker 1 (25:02):
Okay, so we have to talk about what is going on at Substack. I don't have a Substack. I, for the longest time, was like, oh, I should get a Substack, I should get a Substack, and then... I'm kind of glad I don't have one. We talked about this before. Back in twenty twenty four, Substack agreed to remove Nazi content after hundreds of writers, including Casey Newton, who runs this
(25:22):
very prominent tech publication, platform or on substack. They all
signed an open letter threatened to quit the platform if
the platform did not remove Nazi content. The Atlantic reported
that prominent white supremacists, people like Richard Spencer were using
the platform to earn money. Spencer was likely at least
making nine thousand dollars a year and potentially more than
that from his substack, and because substack got a cut,

(25:45):
they got ten percent. Essentially, substack is profiting from white
supremacists who are on their platform. So Substack back in
twenty twenty four was like, Okay, we'll do something, and
I guess the problem has not been solved because Taylor
the Rens reported that substack sent a pushed notification promoting
a Nazi publication on the platform to an unidentified number

(26:07):
of their users. I don't use the term Nazi publication lightly. The blog literally included a swastika icon. So that's, like, pretty cut and dry. I feel pretty confident saying this is a Nazi publication. They were definitely...

Speaker 2 (26:22):
I think they self-identified as a National Socialist publication, exactly.

Speaker 1 (26:26):
This is not something where I'm just throwing the word
around like they would probably agree that that's how they
would They would self identify Yeah, and so substack apologized.
They said, we discovered an error that caused some people
to receive pushed notifications they never should have received. In
some cases, these notifications were extremely offensive and disturbing. Y'll say,
this was a serious error, and we apologize for the

(26:48):
distress caused. We have taken the relevant system offline, diagnosed the issue, and are making changes to ensure it does not happen again. So I feel like, for as bad as it is to send a push notification that includes a swastika telling people to check out this Nazi blog, that statement, to me, is like, throw that in the garbage. It doesn't mean anything. Ars Technica spoke to

(27:10):
Joshua Fisher-Birch, a terrorism analyst at a nonprofit NGO, the Counter Extremism Project, who's been monitoring Substack and their role in helping far-right movements spread propaganda online. He makes a very good point that what should be happening now is more transparency, and then outlining exactly what happened and exactly what steps Substack is taking to prevent

(27:33):
it from happening again. Just saying, oopsie, we sent you a swastika, is really not good enough when you consider how big of a fuck-up that is.

Speaker 2 (27:43):
Yeah, no, that's, like, a really big fuck-up. And being like, oh, like, that system has been diagnosed... what do you mean? Yes, what do you mean? Can you tell me more about what that system was?

Speaker 1 (27:55):
Like, how did that even get scooped up there?

Speaker 2 (27:57):
Like, when we say transparency, we don't mean, like, a two-sentence statement from your company. We mean, like, true accountability. Like, what went wrong here? And, like, how can you explain it to us so that we understand and can have faith that, like, you're taking actions and, like, steps to make sure it doesn't go wrong again? But, like, I don't feel like

(28:18):
we're getting real transparency here.

Speaker 1 (28:20):
No, absolutely not. And Fisher-Birch made a
really good point that all of these far-right
Nazi extremist types have gotten the impression that their content
is far less likely to be removed from Substack, and
they're not wrong. And importantly, they see Substack as quote,
a legitimizing tool for sharing content, specifically because the Substack brand,

(28:43):
which is widely used by independent journalists and top influencers
and sherish content creators, can help them convey the image
of a thought leader. And so I thought that was
such an important point. That it's not just that substack
is hosting this kind of content, which that is a problem,
it's also that the people who make this kind of
content rightly understand that me being on substack next to

(29:06):
all of these important influencers and content creators legitimizes whatever
it is I have to say. So it's not just
that it's not being taken down. It's that I'm hobnobbing with
all these other important voices, so it seems like my
voice is important too.

Speaker 2 (29:19):
Yeah, it just lends them legitimacy exactly. And hasn't
Substack like talked about how they don't want
to take that content down because they don't want to
like overly censor that content, and they believe
that like it'll just.

Speaker 1 (29:34):
Go into the shadows. Am I right? That's like
also part of their thing. I would so much rather that content be in
the shadows than on Substack next to, you know, legitimate journalists,
like actual high-quality Substacks. Like it's okay for
bad things to be in the shadows where we can't

(29:54):
see them. Actually that's where bad things like should be
and should stay. I'm okay with that, and I think
that's such a good point. And you know,
it's one of those things where I do think that,
like so many prominent journalists who are doing their own
thing and like have their own you know, media outlets
are on substack. And it goes back to that old

(30:16):
adage about a Nazi bar, right? Like, if you're hanging
out at a Nazi bar, at what point are people gonna start thinking
that you're the Nazi? I think that if Substack doesn't
do something, people who are prominent
on the platform, who don't want to be associated with
this kind of thing, and don't want this
thing to be out of the shadows, who
don't want it to be like next to their piece
or something, I think they might have to make

(30:39):
some changes. Angry Black Lady Imani Gandy, who is a
journalist and a lawyer who I follow online, she wrote
on Bluesky that basically it sounds like Substack is
unsustainable and that people either have to move now or
move in shame later. Those are really the only two
options at this point, And I thought that was such
a good point of like, at a certain point people

(30:59):
might have to decide, I don't want to be on
a platform like this, and won't it be embarrassing to
have to do that in shame after you've been like, Oh,
they're gonna fix the problem.

Speaker 2 (31:11):
But also at the same time, it puts like creators
of that content in such a shitty position where like
they need that like hosting service and their audience uses it,
and it's not like the audience is that good at
following people off platform onto different platforms, and so like
if they're financially dependent on substack and they've gone independent

(31:34):
and they like rely on it, then like it's a
big ask to be like, can you not use this
platform that you rely on and that you can't really
transfer your audience to anywhere else because you don't like
other content that's being hosted here. It's really hard.
It puts them in a really, yeah, between a rock

(31:54):
and a hard place. That's a really good point.

Speaker 1 (31:56):
They're really doing a disservice to the people that use
their platform, but do they think about this at all?
And I get it, a lot of people are on Substack.
Like, with the state of journalism and media, everyone I know
has either been laid off or wakes up worried
about being laid off. And the people who are laid
off often use platforms like Substack to do their own
thing while they figure out what's next. But I think

(32:18):
I think Substack is putting a lot of people
who are already oftentimes operating from a place of precarity,
not always, but often, I think that they
are putting this on them in a way that
I don't think is acceptable.

Speaker 2 (32:32):
No, I mean, that's what happens on every platform where
like they refuse to engage in content moderation, because
arguably I do the same thing on TikTok, where
like I'm forced to be on TikTok to
reach my audience, who I can't transfer over anywhere else.
And it's also on a platform that is like you know,
consumed with like misinformation and like a lot of you know,

(32:54):
like I just published about all the racist AI
generated content, and I'm forced to like live there, and
it's not fair to me. I don't think it's fair.
I mean, I think that the platform in general would
be a lot better if it focused instead on like
creators who made like original, high quality content, but that's
not where they want to invest all of their operations.

Speaker 1 (33:17):
You know, the last time that you and I spoke
on the podcast, we were talking about a potential TikTok
ban in the United States. If TikTok was banned, what
would that look like for your voice and
your platform?

Speaker 2 (33:31):
I mean, on one hand, I would lose, like, you know,
half a million followers, which is like brutal. But
on the other hand, my TikTok reach right now is terrible,
absolutely horrendous.

Speaker 1 (33:45):
Uh, like I can barely reach like a tenth of
those followers. I just never have guaranteed.

Speaker 2 (33:54):
Access to the people who did choose to opt into
my content, and like routinely see them in the comments being
like, I haven't seen you on my For You page in
years, or like whatever. And I feel like on TikTok,
especially right now, my content is just being flooded out
with ads instead. I mean, the place is just becoming
like a glorified Amazon. I got eleven sponsored posts in

(34:19):
a row like last month between like TikTok shop ads
and like actual ads.

Speaker 1 (34:25):
You got eleven in a row?

Speaker 2 (34:27):
I screen recorded it, eleven in a row. It was
crazy, and like it's barely usable. I mean, I loved
TikTok for a time.

Speaker 1 (34:36):
I stopped using it precisely because of just the
deluge of ads. And I don't mind ads that much
in my content. However, there is something, particularly about
the user-generated TikTok Shop ads alongside the regular ads.

(35:00):
There's something about that that just feels dystopian. You know,
you've probably seen this thing on there where it will
be someone who says they're going through a horrible trauma
and it's like, can you just let this video play
for seven seconds or so, so that I can get
the payout, so that I can pay my bills
or whatever. Yeah, I'm very sympathetic to people who need

(35:25):
to use platforms like TikTok to raise money to get
themselves out of tough situations. But it just it just
makes me sad. It just doesn't feel good to show
up on a platform where this is feeling more and
more like the norm and less and less like a
like an unusual experience, you know what I'm saying.

Speaker 2 (35:46):
Yeah, It's almost like we now are not just consuming
the ads, but we also produce ads for ourselves to
consume alongside other content of people just being like I
need money so badly and you have to sit there
and be like, my attention is currency. I have to
pay you in attention so that you can pay rent.

(36:07):
And it's just so dystopian, and it feels like
it never ends, and it's brutal, because I remember
a time where TikTok was fun and not flooded with ads,
and now I go on there and it just feels
like I'm reminded of what.

Speaker 1 (36:23):
Like a capitalist dystopia we live in. Oh my god.
I was seeing this song on TikTok everywhere and I
was like, Oh, it's weird that this song I've never
heard of by an artist I have never heard of
and don't know at all, is in all these tiktoks
at the same time. And it turned out that you
probably know this. I learned this for the first time,
that TikTok had a program where you could get paid

(36:45):
to use artists' songs because they're trying to promote the
songs with like, oh, if you get x views on
this song, you might get a payout. And I realized,
is nothing organic anymore? Like, if I
like a song, I'll use that song on social media.
But the idea that someone would be being paid to

(37:05):
amplify a song just so that it becomes an earworm, yeah,
I guess it's just what you're saying. It's it's just
so clearly an attention economy where they're just trying to
get whatever value they can out of me, my eyeballs paying
attention to something for a couple of seconds. And it's
like they're not asking, did anybody actually like the song?
Would anybody have actually organically used this song

(37:28):
in their content? No, they were just trying to get
a couple of cents. Okay.

Speaker 2 (37:32):
Yeah, now really, the moment that broke me was
one time I had an influencer tell me that they
were paid by Jojo Siwa's team to talk about her negatively.

Speaker 1 (37:43):
I absolutely believe this because something's going on with Jojo
Siwa. There's always something going
on with her, and I absolutely believe her.
I believe that influencer.

Speaker 2 (37:52):
I mean, the influencer showed me the video, like I
saw it, and like told me about the money
they took, and that like their team was very
much just like, say whatever you've got to say, like anything,
anything you want to say. And yeah, I really do
believe that like very little is organic at this point,
and just so much of it is just like discourse

(38:14):
all the way down and at the end of the day,
it's just like meaningless distraction.

Speaker 1 (38:19):
Oh my gosh, I mean I might even cut this.
But when my producer was putting together the different stories
that we might talk about, one of the potential things
was the Sydney Sweeney controversy around the American Eagle
jeans ad, that's like, oh, I have great genes. And
the only thing I have to say about that is
that it just demonstrated to me how annoying discourse is

(38:41):
on the Internet, because it just became clear
how easily our attention is captured
and how easily we're all sort of played, in a
kind of way, that, you know, I think that company
was like, oh, this is going to be a controversial
ad that gets people talking. I think that one

(39:01):
of the executives from American Eagle posted on LinkedIn basically
saying that, oh, Sydney Sweeney really wanted to, you know,
make this something everybody was talking about. And I
just, I'm just so sick of feeling like
we are all so reactive and that our attention is
just so easily gamified. And they're not wrong, right, like,

(39:23):
like that's not incorrect, it's very effective.

Speaker 2 (39:25):
It's so effective, and it like distracts us from like
actual real problems with like surface-level representations of those problems.
Like did you read Charlie Warzel's piece? I think
it was in The Atlantic?

Speaker 1 (39:38):
I sure did.

Speaker 2 (39:39):
About Sydney Sweeney. I literally copied and pasted his last
paragraph, because I was like, if we talk about this, I need
to read you this last paragraph, because like it
had some great stuff on discourse.

Speaker 1 (39:49):
I just read it. Oh my god, hit us with it. Okay.

Speaker 2 (39:51):
Discourse suggests a process that feels productive, maybe even democratic,
but there's nothing productive about the end result of our
information environment. What we're consuming isn't discourse. It's algorithmic grist
for the mills that power the platforms we've uploaded our
conversations onto. The grist is made of all our very
real political and cultural anxieties, ground down until they start

(40:12):
to feel meaningless. The only thing that matters is that
the machine keeps running, the wheel keeps turning, leaving everybody
feeling like they've won and lost at the same time.

Speaker 1 (40:22):
That is exactly how it feels. That literally
sums it up exactly.

Speaker 2 (40:28):
Literally, Like that's exactly it, and like it's just so
mindless and it's about these real things, like they are
these very real political and cultural anxieties, and yet it's
just a hamster.

Speaker 1 (40:42):
Wheel where it takes us nowhere at all. Yeah, And
I mean I checked back in on the discourse around
the ad and it was like liberals turn on Sydney
Sweeney because they hate hot women, and it's like, who
hates hot people? No, Like, I just really love you
hot women. Yeah, I like talking about.

Speaker 2 (41:04):
I will speak for the left and be like I
love hot women. And they're like they hate they hate
her boobs, and I'm like, no, I love her boobs.

Speaker 1 (41:10):
And also, like, I don't think we need to
act like good-looking women with big boobs are an
oppressed class, you know what I mean?

Speaker 3 (41:18):
They don't.

Speaker 1 (41:18):
They don't understand the plight of a hot
blonde with big tits, Bridget. We don't know the struggle, yes,
but yeah, it's just just something about that particular story
I found very exhausting, and I think that piece really
sums it up that it it feels like a hamster

(41:40):
wheel where we kind of get nowhere. Like, you're right
that there are real anxieties, real political and social anxieties
that they kind of represent, but then we're not actually
having a substantive conversation. It just feels like when you
eat a bunch of junk food, a bunch of McDonald's,
and then you feel good for a minute, but then
you're hungry again later because you didn't get any nourishment.
I feel like the discourse is not

(42:02):
nourishing me. It's like making me feel empty.

Speaker 2 (42:06):
Yeah, it reminds me of the discourse around
Sabrina Carpenter's album. It was the exact same thing of just
like, this is so meaningless and it truly doesn't matter,
and it's not really representative of all of feminism.

Speaker 1 (42:22):
Or like all women, and like we.

Speaker 2 (42:25):
Project so much meaning onto things, and I think it
represents like our feelings of like lack of control, and
it just, oh my god, that discourse drove me absolutely insane.
They were like, she's setting the feminist movement back, or like,
you know, she's like a feminist icon, and it's just like,
I think she's just horny.

Speaker 1 (42:44):
Yeah, I think it's so random. Have you ever seen
the movie This Is Spinal Tap? No? So it's
such a classic. I'm like outing myself as a big
nerd here. But it's a satirical movie about a rock band.
It's like a mockumentary. And they put out an album
cover where it's a woman sniffing a glove.

Speaker 4 (43:04):
You put a greased, naked woman on all fours with
a dog collar around her neck, and a leash, and
a man's arm extended out up to here, holding on
to the leash and pushing a black glove in her
face to sniff it. You don't find that offensive? You
don't find that sexist?

Speaker 3 (43:25):
That's nineteen eighty two. Get out of the sixties.

Speaker 1 (43:28):
We don't have this mentality anymore.

Speaker 4 (43:30):
They wanted to do. I don't care what they want
to believe.

Speaker 1 (43:33):
When I first saw that Sabrina Carpenter album, I
was like, oh, is this a Smell the Glove reference? A really
niche joke. This is a joke for
just me specifically. That's so funny.

Speaker 2 (43:45):
I also think this is just like, this is like
new research that just came out from Media Matters, but
is like interesting with the Sydney Sweeney stuff. They just
published that, like, since Monday, Fox News spent eighty-five
minutes talking about Sydney Sweeney and three minutes talking about Epstein.

Speaker 1 (44:03):
Certainly Sydney Sweeney's new American Eagle jeans, that is more
important, has like bigger impacts for the country.

Speaker 2 (44:11):
No. And it just feels so representative of
how our own discourse cycle, that we
feel like is somehow scratching an itch of our
political issues, actually just becomes a tool to distract us
from real political power and how it's manipulated. Like

(44:33):
it just it feels so representative of that phenomenon.

Speaker 1 (44:37):
Definitely.

Speaker 3 (44:38):
Oh more, after a.

Speaker 1 (44:42):
Quick break, let's get right back into it. LinkedIn is
quietly stripping their explicit protections for trans and non-white

(45:04):
users from its English-language hate speech rules. The changes,
which were flagged by the nonprofit Open Terms Archive and
then independently confirmed by The Advocate, involved edits to LinkedIn's
Professional Community Policies, specifically the Hateful and Derogatory Content and
Harassment and Abusive Content sections. In both, references to protections
for trans people and people of color were either weakened

(45:26):
or removed entirely. This is something I have
a bee in my bonnet about, because it was
work that I worked on when I was at UltraViolet,
like getting platforms, I will say TikTok was one
of the first platforms, to be like, oh, of course
we will add deadnaming and misgendering to our hateful
conduct policies.

Speaker 3 (45:42):
Yeah.

Speaker 1 (45:42):
Great. But when Elon Musk bought Twitter, one of his
first acts was removing Twitter's deadnaming and misgendering
policy from their hate speech policy. And I remember thinking, oh, well,
I think that platforms are going to be watching to
see if he is able to do this without much
fanfare and without much pushback. People did push back against it.

(46:04):
It's not like nobody said anything, but I think the
fact that he was able to just like unilaterally make
that choice, and you know, people kept using the platform.
At that point, I think signaled to a lot of
other platforms that it was okay to roll back these
kinds of policies. So when Trump got into office in January,
like Meta did the same thing, YouTube did the same thing.
And I do think you know, platforms, they see how

(46:28):
other CEOs behave, they see what other things that other
platforms can get away with, and I think it informs
what they think they can get away with. So I
do really think that in part we have Elon Musk
to blame for LinkedIn, of all places, feeling
as if it's fine for them to
roll back these policies that added protections for trans

(46:48):
folks and people of color.

Speaker 2 (46:50):
I think in general, like there's group think when it
comes to stuff like you know, content moderation policies on platforms,
and they are kind of all looking at each other,
because like, they don't want to moderate. It's
time intensive, it's money intensive, it requires them to hire people

(47:10):
to do that work. Like the less moderation that they
have to do, like, the more cost effective it is
for them.

Speaker 1 (47:16):
So I think that they're always going to look for
where they can trim that and just like not.

Speaker 2 (47:20):
Have to worry about it. I don't understand it.
I don't understand why you would just like not care
if your platform becomes like a toxic wasteland. I really
can't wrap my head around it, especially a platform like LinkedIn.

Speaker 1 (47:37):
Yeah, it's really meant to be for professionals,
job seekers, that kind of thing. I think not giving
trans folks and folks of color an equal playing field
in a digital arena that's all about job seeking,
it feels especially cruel, especially
as more and more people are getting laid off that

(47:57):
and it seems like nobody's hiring right now.
It just feels like an extra added burden that cis
people and non-Black people are not asked to
deal with. If you can't show up to
a platform that's all about talking about professional accomplishments
and professional thought leadership and like

(48:17):
finding a job, if you can't show up there without
being deadnamed and misgendered, I just think it's
a clear way where these platforms
are not equitable and people are not having equitable experiences
on them.

Speaker 2 (48:30):
Yeah, and like we've known that, and we've known that
they also aren't very good at enforcing those policies. But
like watching them get taken away, and like take away
even the pretense that they were going to
try, is just like really sad, and I think speaks
to like the current political moment and the apathy that
these companies feel towards like a need to serve historically

(48:55):
like marginalized groups. Like that
was very much, I think, a performance for when it
was popular, and now that they feel like it's not popular,
they're not.

Speaker 1 (49:04):
Going to do that performance. Like it was never coming
from a place of an actual, well-meaning desire to help.
Oh, I know it wasn't, because they had to be
cajoled into it. It can't be real if
it only happens because you were cajoled.

Speaker 2 (49:19):
I just imagine being like, I'm gonna build a
platform that's for professionals to discuss their professional work, and
I don't care if people deadname each other.
Like, I hope there's more deadnaming on this platform. Yeah, why?
Like, it's a professional platform. It makes no sense.

Speaker 1 (49:38):
No, I'm with you, and even the way that they
dealt with it is really wild. So the Advocate reported
on this and they got a statement from LinkedIn where
a spokesperson initially defended the platform's stance against identity-based abuse,
asserting, quote, we regularly update our policies. Personal attacks, intimidation,
or hate speech towards anyone based on their identity, including misgendering,
violates our harassment policy and is not allowed

(50:00):
on our platform. Then, less than an hour later, the
company wrote back to The Advocate and asked to revise
that statement, removing the phrase hate speech and instead mirroring
the new policy language, saying personal attacks or intimidation towards
anyone based on their identity, including misgendering, violates our
harassment policy and is not allowed on our platform. So
it's like they can't even keep their own whack policy

(50:23):
straight internally.

Speaker 2 (50:25):
Well, also, at a certain point, these words become
meaningless, because like, how is that different from hate speech?
And then also you think about the fact that,
like, you know, dozens of people were involved in that
discussion, and there's a whole email thread about like, is
hate speech allowed but discrimination isn't, and.

Speaker 1 (50:42):
Like like what are we doing? Like this?

Speaker 2 (50:46):
It's kind of mind-boggling and like hard to process,
but like there are so many people involved, like,
having those conversations of, is hate speech
a phrase that we're not
going to be using in our policy anymore?

Speaker 1 (51:01):
And it's just like, what is this? What are we doing?
Why can't we have like a group meeting and maybe
like revise our goals as a species? Because what
are we, truly? What are we doing? I think this
all the time. The last thing I want to talk
about is an update on a story that we did
earlier this week, and that is about the Tea app.

(51:23):
Certainly you have heard about this, right? Oh, I
have heard a lot. Yes, it's very interesting.
So for folks that don't know, the Tea app was
designed for women to spill tea on men. The creator
of this app said it was designed to help women
feel more safe while dating men, so that
women could share their experiences with specific men. I

(51:44):
guess the idea being that if a man was abusive
or, you know, they had a bad experience with a
man, they could safely and anonymously share that.
Well, we know it was not anonymous. For a time,
that app required women to give copies
of their driver's licenses to confirm they were women. The
app eventually scrapped that requirement, but did require women to
send selfies. Last week, thirteen hundred driver's license images

(52:09):
of those women, along with seventy-two thousand selfies of women,
were all found to be essentially publicly accessible thanks to
the app's atrocious security practices. So the app said, oh,
as soon as you submit your license or your selfie,
we delete it as soon as you're confirmed. Obviously that wasn't
the case, because images of that stuff were all over
4chan. Additionally, private DMs were also left accessible. So

(52:31):
if folks want the deep dive into what happened there, check
out the episode that we put out about it this week.
But we have a slight update. Ooh, okay. And that
is, the app, probably unsurprisingly, has been hit with two lawsuits.
Two lawsuits have been filed in the Northern District of
California alleging negligence, breach of implied contract, and other claims.

(52:52):
So one was filed on behalf of Griselda Reyes, who
said that she submitted a photo while signing up for
the app that was included in the breach. She seeks
an injunction requiring Tea to encrypt all data and
purge private information, as well as monetary damages as determined
by the court. I actually think she has that case.
Like, the app made it clear as day and

(53:12):
said, we delete your images, and they didn't do that.
They just lied.

Speaker 2 (53:16):
That's wild. It's absolutely wild how negligent that is.

Speaker 1 (53:20):
Yeah, I mean, it almost isn't fair to call it
a hack, or even breach feels a little
bit strong, because these things were essentially publicly accessible.
To hack something, it
has to be behind some sort of security protocol. This
was not, so you can't even call it a hack.
It was the level to which I mean the analogy

(53:42):
that, when I spoke to a cybersecurity professional about it,
the analogy that they gave me was, imagine if your
doctor told you that your private health information that they
collected was private, but actually they stored it in an
open crate in the alley behind the clinic. That is
the level of exposure that this information had. Oh.

Speaker 2 (54:02):
My god. Wait, so the Tea app, it's
all like for women. It's for women to discuss like
men and, you know, rant about their interactions with men.

Speaker 1 (54:13):
But wasn't it like founded and created by a man? Yeah.
The creator, businessman and tech capitalist Sean Cook, formerly of Shutterfly,
said that he created this app because of, like, what's in
their marketing. Honestly, it does just sound like marketing to me.
In the marketing materials for the app, he said, Oh,
I watched my mother date and get catfished and like
get mixed up with men who had criminal records. So

(54:33):
I wanted an app that would keep women safe and
that women could like have safer dating experiences. That may
very well be true, and I'm not denying that
are women who maybe use this app to vet a
man or to find out information about somebody who genuinely
was abusive. I'm not saying that that did not happen
on this app. I am saying that when you actually

(54:54):
look at the kind of stuff that was posted on
the app, it was also women being like, you know,
this guy has a bad vibe, or like saying
things that were not necessarily rooted in trying to keep
women safe from abuse. And so I think it's interesting
that this man who created this app for women got
to enjoy talking about how they were all about women's

(55:14):
safety while doing something that put those same women so
clearly at risk.

Speaker 2 (55:20):
Yeah, no, it feels kind of poetic. It's also
kind of what I've started to expect from men, especially
when they like perform feminism, Like I have pretty like
low expectations from them when they like make their whole
persona about like protecting women, Like I just think that

(55:45):
I need to see them go above and beyond and like, actually,
you know, walk that walk.

Speaker 1 (55:52):
But yeah, the fact that it was that
careless, and that easily, not even hacked, I guess
stumbled upon, is crazy. And the women, so
not only were these women's driver's
licenses, which have their addresses on them, posted online, also

(56:14):
some of these dms that were that became accessible were
very serious in nature. The second lawsuit, which was brought
on behalf of an anonymous Jane Doe, says that she
joined the tea app because she wanted to anonymously warn
other women in her northern California community about a man
who sexually assaulted at least two other women. The app
promised her that anonymity, and promised her safety, and promised

(56:34):
to delete her verification data. Tea broke every one of
those promises. And so that lawsuit really demonstrates the
fact that this genuinely could put her in an unsafe situation.
If she is reporting this person attacked me,
harmed me, having her conversations about that be accessible
to anybody who wants to see it is putting her

(56:55):
deeply at risk. And so, yeah, this app, like I
knew there were going to be lawsuits. I'm actually
surprised it was this quick, because this just happened last week. Yeah,
but you know it really it really I think these
women have a right to some justice here.

Speaker 2 (57:11):
Yeah, no, one hundred percent. I kind of am interested.
I'm very interested to see how this unfolds. I think
the whole conversation around it is fascinating. How did they
make money?

Speaker 1 (57:21):
Oh, so the app. You get up to five free
searches on the app. So how it works: let's say
that I met Joe Blow on the street, and I
want to see if anybody's talking about him. I could
search him on the app to see, you know, if
he appears in any Tea posts. After those five searches,
you then have to buy a paid subscription, which is
fifteen dollars a month, or you could share. You could

(57:43):
have like five girlfriends sign up and keep using it
for free. Okay, so it was gonna be paid, so
they weren't just giving it away for free and then
like selling data? I mean, I'm
not willing to say, I'm not able to say that.
Would it surprise me? I don't know for sure. Okay,
that makes sense.

Speaker 2 (58:01):
I just think everything about that app is completely fascinating
to me, and like the fact that so many women
flocked to it.

Speaker 1 (58:10):
As a place of safety and then were betrayed is interesting.

Speaker 2 (58:12):
But also, like, the discourse that was had there, and
like what it means for privacy and safety today, is interesting.
And then the fact that, like, you know, all of
a sudden, thousands and thousands of women are doxxed, like,
the fact that we live in that context too,
where suddenly you become essentially like a public figure.

Speaker 1 (58:32):
Who is getting harassed.

Speaker 2 (58:33):
Is like, I mean, it's such a
rich text for twenty twenty-five society and technology.

Speaker 1 (58:41):
I talked about this in the episode, but genuinely, when
I first heard about this, I thought, this has to
be some kind of a setup. There's no way. I
mean, that was my initial gut feeling. It
was such a kind of morality play, in a
kind of way, where it just seemed so on the
nose that I thought, this has to be,

(59:02):
there has to be more to this story. I'm I
don't know that to be true, but that was my
initial just it just felt too just too much of
an on the nose gender war story to have to
have actually happened the way that it seemed to have happened.

Speaker 2 (59:16):
Also, there's something to be said culturally about the
way women engage in sleuthing as a means of
self-defense, and it almost felt like
that on steroids, of just like, let's bring sleuthing into,
like, let's give it a whole structure, let's give
it a system, when it's already something that so

(59:39):
many women do in the name of, I think,
really an anxiety and trying to protect themselves.

Speaker 1 (59:45):
Oh yeah, I think that sleuthing and gossip and whisper networks,
I don't want to stigmatize those things, I think. I
think those things exist for a reason, and they have
forever, because those are how women keep
ourselves safe. I don't think that there's anything wrong with
women sharing information with each other. I think the problem
is when this app promises to systematize that in a

(01:00:08):
way that is safe and anonymous, and so cruelly betrays
the women who flocked to it thinking that it would
be safe and anonymous.

Speaker 2 (01:00:18):
Yeah, it's like if a whisper network has that much
official structure, like, at what point is it no longer
a whisper network? Exactly, exactly.

Speaker 1 (01:00:30):
Like I'm not sure.

Speaker 2 (01:00:31):
I am pro whisper network, and I am a huge
proponent of gossiping. I think that it is like an
evolutionary tactic. I think gossiping is like one of the
best things we've ever developed.

Speaker 1 (01:00:42):
I love it. Uh.

Speaker 2 (01:00:43):
But also if you're gossiping, like with you know, one
hundred thousand people, like, is that still gossiping?
Is that still protection? Or, like, I don't know,
I find it really interesting.

Speaker 1 (01:00:59):
Same. And you know, one of the reasons why I
found the Tea app breach so concerning was that so
much of the Internet is being restricted by age, and
age verification sometimes requires a copy of your government ID,
and we already know that platforms that do this
have been compromised before. So a version

(01:01:20):
of what happened to the women with the Tea app
could conceivably happen when all of us are submitting our
IDs to verify that we're old enough to access certain
pockets of the Internet. And it's already happening in some
places like the UK. So on Wednesday, Spotify said that
they were going to be rolling out the use of
Yoti, which is a smartphone app that uses face scanning
to estimate a person's age. If you appear to be

(01:01:41):
underage, then you need to submit your government ID
in order to not be age restricted on Spotify. And
also, YouTube announced this week that it was rolling out
technology that was going to use AI to quote interpret
a variety of signals based on the kinds of videos
that someone searches for and watches and the longevity

(01:02:02):
of their account to determine if they are under eighteen.
So if their AI senses, because of your viewing habits,
that you might be under eighteen, you will
then have to submit your government ID to bypass automatic
age restrictions. And so the fact that this is becoming
more commonplace really concerns me. Yeah, that's gonna be a
problem for me.

Speaker 2 (01:02:20):
On YouTube, I've been watching so many clips from Zombies
4. They're gonna be like, why is this child also
watching so many video essays?

Speaker 3 (01:02:32):
I think.

Speaker 1 (01:02:32):
I think about my sister-in-law, who, when
I look at her YouTube, it's one hundred percent Bluey,
because they've got toddlers. Yeah, or
you know, she's got great taste.

Speaker 3 (01:02:42):
Yeah.

Speaker 1 (01:02:44):
Bluey is a fantastic show and I am going to
stand ten toes down on Bluey. Yeah, no,
no shame for liking Bluey here. But all this age
verification stuff, it's often janky and doesn't work, and there's
been lots of studies about it being kind of hit
or miss. And I think as we sort of move
to a place where more of the Internet is

(01:03:06):
age restricted and we might have to show
our government IDs to get in, you know, I think
the Tea app breach really shows how problematic and concerning
that can be, because we really don't know that much
about these platforms. And while any company can say, oh,
we're going to delete your driver's license picture as soon
as you submit it, they don't necessarily have to do that.

Speaker 2 (01:03:25):
It's also interesting that, like, truly, the thing that drives
all of this is porn. Like it's really interesting to
me because it's like, you know, both of us have
been deep in this world of like content moderation and
people who make anonymous accounts and can get away with
posting the most horrendous things and there's no verification, Like

(01:03:45):
you know, part of the reason why you're able to
go online and create a hate account and just troll
people and make everybody's lives worse is because, like, you're
making that account completely unconnected to a government ID, right?
Like, that's part of how the Internet
has always worked and part of what's made our lives

(01:04:06):
in particular very difficult.

Speaker 1 (01:04:09):
But the thing that is.

Speaker 2 (01:04:11):
Driving this, it's always porn. It's always porn and sex,
and like a panic around it rather than, like, actual
well-being conversations. I think it's really interesting.

Speaker 1 (01:04:23):
And let's be real, if we actually wanted to keep
young people safe and keep them away from things that
are harmful, there are a million things we could be
doing other than restricting the internet so heavily in the
name of keeping children safer. Like, it's interesting to me
when we do and when we don't appear to care
about the well-being of children and keeping them safe.

(01:04:45):
I argue that we are a deeply anti-child society,
that we don't care about children, we don't care about
their safety at all. So what's interesting is when
we're being told we all have to start showing our
IDs to listen to music on Spotify "for the children," it's
just immediate red flag, red flag, red flag. Yeah.

Speaker 2 (01:05:06):
Of like, yeah, or like, you know, the sex toy
legislation in Texas, trying to make sure
that if you want to buy a
sex toy online, you need to show like a government
ID to prove your age. And it's just like, first
of all, which children are buying sex toys online? Find
me one.

Speaker 1 (01:05:28):
Second?

Speaker 3 (01:05:28):
Of all?

Speaker 2 (01:05:29):
I just, I mean, maybe this is a hot take. I
just, I don't really care about a kid having a
sex toy. If a kid wants to have a sex toy
and do whatever, like, they're gonna use something to
masturbate. They're children.

Speaker 1 (01:05:41):
Well, yeah. I mean, so in Australia, they
had done this whole study on, like, age
restricting the Internet and harm against kids, and one of
the things they found was that when you try to
keep sexual content and sexual things away from kids,
a lot of times those kids are actually seeking out

(01:06:03):
that content because they're trying to learn about their bodies
and their sexuality and all of that. And so by
trying to legislate that kind of stuff away from them,
you might actually be harming them because you're creating an
environment where it's harder for them to get that education
and they will seek it elsewhere. Yes, which I thought
was such an interesting perspective on how the
way that we restrict things sometimes actually ends up introducing

(01:06:25):
new harms.

Speaker 2 (01:06:26):
Yeah, because well, I mean we're doing this and simultaneously
not providing sex ed right. Like part of the reason
why I think young people might flock to porn is
because they're not having those conversations like at home or
in school. They don't understand like how sex works, and
like they can't make sense of like the feelings that

(01:06:46):
they're experiencing in their body, and they don't have access
to that information, so of course they're going to like
gravitate towards porn, and then porn is going to like
pull you deeper into some sort of Internet rabbit hole
because it's, like, algorithmically going to send you somewhere.
But it seems like, really, if we cared
about kids and taking care of them and respected them

(01:07:07):
as like coming into their own sexuality before the age
of eighteen, which they are, like they're going through puberty,
Like we would make sure to have like resources for
them and like teach them about the things that they're experiencing.
But instead, it just
makes us uncomfortable, so we just shut it down
and say, like, no, no sex
for them, nothing sexual.

Speaker 1 (01:07:30):
Yes, nothing sexual for them. And also, show your ID
when you want to watch something other than Bluey.

Speaker 2 (01:07:36):
But also, the second you turn eighteen, you can be
in that porn, like, the day of. Yes, the day.

Speaker 1 (01:07:43):
You're eighteen, go wild. Abbie, it's always such a pleasure
talking to you. You bring such a light to these
sometimes tough topics. What are you up to?
Where can people keep up with what you're up to?
Tell us all the things. What am I up to?

Speaker 2 (01:08:01):
I'm just back from vacation and like reassessing some kind
of bigger projects that I want to do some deep
dives in. So we're gonna see where those take me.
What I've been up to for the last few months
is a big like fascist propaganda series, so doing a
lot of breakdowns on different types of fascist propaganda and
like how they work and the themes that they typically

(01:08:23):
cover and like use to manipulate people into buying into.

Speaker 1 (01:08:28):
The fascist cause. And you can find that work on
my TikTok at tofology, or on Instagram at abbieasr.
I'm also on YouTube, which I think is just Abbie
Richards, and you can read my written stuff at
Media Matters. Abbie is one of my favorite follows on TikTok.
It will turn your

(01:08:49):
TikTok experience up a notch to follow Abbie. Thank you,
that's so kind. I love your TikTok videos. I like
to see you. For me, I've talked about this, it's
kind of embarrassing: I am an audio
person because I like the idea of being a voice
in somebody's ears. And then when you get on video
you're like, oh, that's how I look.

Speaker 2 (01:09:06):
I'm trying. Well, you're a delightful voice in
my ears, and you look great on screen.

Speaker 1 (01:09:13):
I'll take it. I'm trying to be like you. Oh,
I want to be like you. I want to be
a voice in people's ears. I want to start a
little podcast. Sounds fun. We gotta Freaky Friday it, where
I do the short form video content and you do
the long form podcast. That sounds so fun. Sign me up. Yeah.
Thank you all for listening and hanging out and unpacking

(01:09:33):
these stories. If you want to follow me, you can
follow me on Instagram at Bridget Marie in DC, on
TikTok at Bridget Marie in DC, or on YouTube at There
Are No Girls on the Internet. Leave us a comment.
If you're listening on Spotify, I hope you're over eighteen,
and I will talk to you soon. Got a story

(01:09:53):
about an interesting thing in tech, or just want to
say hi? You can reach us at hello at tangoti
dot com. You can also find transcripts of today's episode
at tangoti dot com. There Are No Girls on the
Internet was created by me, Bridget Todd. It's a production
of iHeartRadio and Unbossed Creative. Jonathan Strickland is our executive producer.
Tari Harrison is our producer and sound engineer. Michael Almado
is our contributing producer. I'm your host, Bridget Todd. If

(01:10:16):
you want to help us grow, rate and review us
on Apple Podcasts. For more podcasts from iHeartRadio, check out
the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.