
August 8, 2025 71 mins

This week, Bridget is joined by Producer Mike to break down the tech stories you might have missed.

Lynx coach Cheryl Reeve had some eloquent, powerful things to say about the sexist crypto bros who threw dildos at WNBA players as a publicity stunt for their new meme coin. It's less than a minute and worth a listen:  https://www.reddit.com/r/Fauxmoi/comments/1mkdusa/lynx_coach_cheryl_reeve_livid_over_sex_toy/

Disgraced journalist Chris Cuomo fell for an obvious AOC deepfake video about Sydney Sweeney, then demanded she answer for it even after he knew it was fake: https://www.independent.co.uk/news/world/americas/us-politics/chris-cuomo-aoc-sydney-sweeney-jeans-b2803523.html

A jury has ruled that Meta illegally collected Flo users’ menstrual data: https://www.theverge.com/news/753469/meta-flo-period-tracker-lawsuit-verdict

TeaOnHer, a rival Tea app for men, sprang up overnight and is already leaking users’ personal data and driver’s licenses: https://techcrunch.com/2025/08/06/a-rival-tea-app-for-men-is-leaking-its-users-personal-data-and-drivers-licenses/

Grok generates fake Taylor Swift nudes without being asked. https://arstechnica.com/tech-policy/2025/08/grok-generates-fake-taylor-swift-nudes-without-being-asked/

ICYMI: Taylor Swift Twitter deep fakes are everyone’s problem: https://podcasts.apple.com/us/podcast/taylor-swift-twitter-deep-fakes-are-everyones-problem/id1520715907?i=1000643579343

Grok Imagine's 'Spicy' mode lacks basic guardrails for sexual deepfakes: https://mashable.com/article/xai-grok-imagine-sexual-deepfakes?test_uuid=003aGE6xTMbhuvdzpnH5X4Q&test_variant=b

Meta illegally collected data from Flo period and pregnancy app, jury finds. https://arstechnica.com/tech-policy/2025/08/jury-finds-meta-broke-wiretap-law-by-collecting-data-from-period-tracker-app/

Hackers Clown Trump Education Secretary With ‘Curb Your Enthusiasm’ Music and ‘Corrupt Billionaire’ Heckles: https://www.thedailybeast.com/hackers-clown-trump-education-secretary-with-circus-music-and-corrupt-billionaire-heckles/

If you’re listening on Spotify, you can leave a comment there or email us at hello@tangoti.com!

Follow Bridget and TANGOTI on social media! Many vids each week.

instagram.com/bridgetmarieindc/  

tiktok.com/@bridgetmarieindc  

youtube.com/@ThereAreNoGirlsOnTheInternet

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet is a production
of iHeartRadio and Unbossed Creative. I'm Bridget Todd, and this
is There Are No Girls on the Internet. Welcome to
another episode of There Are No Girls on the Internet,
where we explore the intersection of identity, technology, and social media.

(00:24):
And this is another iteration of our TANGOTI weekly news
roundup, where we summarize and break down the news online
that you might have missed so you don't have to.
I am joined by my producer Mike. Mike, thank you
for being here. Welcome back to the show.

Speaker 2 (00:38):
Bridget, thanks for having me back. Always such a pleasure
to be here with you and the TANGOTI News Roundup.

Speaker 1 (00:43):
Okay, so listeners, settle a debate for me, or weigh
in on a conversation. So I was telling Mike off
mic how much I wanted to watch this new movie
War of the Worlds with Ice Cube. And the reason
I wanted to watch this movie is because it has
achieved something truly historic, which is a less than five

(01:05):
percent fresh rating on Rotten Tomatoes. And I honestly think
there's maybe there's like two movies in the whole world
that have ever had such.

Speaker 3 (01:14):
A low rating on Rotten Tomatoes.

Speaker 1 (01:17):
To me, when the rating is very high or very low,
they kind of achieved the same thing, and that I'm
just so curious what's going on. I was like, Mike,
we should watch this and talk about it on the podcast.
You were like, absolutely not. I don't want to know
what's going on with a movie that is this badly reviewed.

Speaker 2 (01:33):
So last time I checked, its rating was not just
less than five percent, it was zero percent. It had
a zero percent rating on Rotten Tomatoes, which, I don't know.
Ordinarily I would be with you, like, let's watch this
train wreck. I'm there for it. I love bad movies.
I love Mystery Science Theater three thousand. Bad movies are great.

(01:57):
But we watched the trailer. You got me interested to
watch the trailer.

Speaker 3 (02:03):
Ah, it didn't do it.

Speaker 2 (02:04):
It was like, like, the whole thing is shot on
Microsoft Teams.

Speaker 1 (02:09):
Yes, spoiler alert, the entire movie takes place on Microsoft Teams,
which I feel is, I guess, the kind of idea that
somebody in a pitch meeting said and they were like, oh,
that sounds good, and then when you actually see it,
you're like, oh no, this is not good.

Speaker 2 (02:22):
It's like a sick joke. Like, nobody wants to do
anything on Microsoft Teams, let alone, like, spend ninety minutes
recreationally watching a movie. It was just, like, jerky and
bouncing around. I couldn't make out what was going on.
I didn't want to make out what was going on.
The dialogue was stilted and bad. Ice Cube looked stressed.

Speaker 1 (02:46):
Even Ice Cube is like, how did I get
roped up into this?

Speaker 4 (02:49):
Like?

Speaker 1 (02:49):
Who?

Speaker 3 (02:50):
Like where?

Speaker 1 (02:51):
What ball was dropped on my team that I am
here on Teams in this remake of War of the Worlds?
One of the reviews I read said that in
the many remakes of this movie War of the Worlds —
I remember the Tom Cruise remake,

Speaker 3 (03:05):
I think in like the early aughts — this is the
worst one.

Speaker 1 (03:07):
And again, that piqued my interest. So I guess
what I'm hearing is you are not interested in watching
this and recapping it for the podcast. If
I want to do that, I have to, I
have to

Speaker 3 (03:19):
Get somebody else to be on the other end of
the microphone. That's what I'm hearing.

Speaker 2 (03:23):
Yeah, I mean, you know, I love your
ambition, that you're just gonna go for it to bring
this hard-hitting news to the listeners. But I just, uh,
that trailer, I don't know, it did not look good.
Like, did any part of it leave you
wanting more?

Speaker 1 (03:41):
Only reading the reviews because the reviews are so bad
that if you're if the reviewers were trying to make
me not curious to see this movie, they did the opposite,
because I was like, well, dang, this sounds so bad.

Speaker 2 (03:55):
I have to see what what's going on with this?
You know what it kind of reminds me of is Megalopolis?
You remember that movie? Did you ever end up seeing that?
I didn't, but I wanted to and I still want to. Oh,
I have not made it a priority.

Speaker 1 (04:06):
It is absolutely one of my favorite things in life
is to read the reviews of a movie that has
just been panned. I've not seen it either, but the
reviews of that movie are also similarly, they're just so
bad that I wasn't really that keen to see it.
The reviews made me keen to see it because they're like,
I almost want to let me see if I can

(04:27):
pull some up.

Speaker 2 (04:27):
Yeah, you look for some reviews, because that's a
movie where the reviews, at least the early reviews, were
universally terrible, like, this is the worst movie ever made.

Speaker 3 (04:38):
What was Francis Ford Coppola smoking, like?

Speaker 2 (04:42):
And that made me want to watch it, like, I
still want to watch it because it sounds interesting, like
there was ambition there. The synopsis that I
read sounded insane, like disconnected plot points that
sound like movies that I want to watch. Watching Ice Cube

(05:02):
like, look around from one corner of his screen to another while
glitchy news reports come in about aliens?

Speaker 3 (05:12):
I guess uh that no, just no.

Speaker 1 (05:16):
I will say that the reviews of Megalopolis are now,
I would call them, mixed. They were not as, like,
universally bad as when I first checked in. My favorite, though,
is from Johnny Oleksinski from The

Speaker 3 (05:27):
New York Post.

Speaker 1 (05:28):
A zero star wacko disaster.

Speaker 2 (05:33):
Right, a wacko disaster. I'm here for a wacko disaster.
I do not want to see a ninety-minute Microsoft
Teams disaster.

Speaker 1 (05:43):
Okay, Well, if you watch War of the Worlds with
me to recap it for the podcast, I will I
will watch Megalopolis with you.

Speaker 3 (05:51):
We'll do a like.

Speaker 2 (05:52):
We'll do a little agreement, okay, and then we'll like
force our listeners to listen to recaps of, well, these
two terrible movies.

Speaker 1 (05:59):
We will lose every listener that we have. Okay, wait, well,
speaking of wacko disasters, let's talk about Elon Musk.

Speaker 3 (06:07):
Yeah, the most wacko, the most disaster. Yeah.

Speaker 1 (06:10):
Do you remember when Elon Musk had these big grand
plans to make updates to his chatbot Grok, and then
like a day later, Grok started praising the Nazis and
saying my name is no longer Grok, my name
is MechaHitler.

Speaker 3 (06:26):
Oh yeah, I remember.

Speaker 2 (06:27):
I mean, he made those updates, he trained it on
4chan and X and probably some other cesspools of hate,
and unleashed it on the world.

Speaker 3 (06:37):
He achieved his goal.

Speaker 1 (06:38):
Okay, well don't worry, because now it's also creating deep
fake images of Taylor Swift completely unprompted.

Speaker 3 (06:46):
Oh good, Oh good.

Speaker 1 (06:47):
So we're right, we're right on schedule, because Grok Imagine,
which is xAI's new generative AI tool, created explicit deepfakes
of Taylor Swift without even being specifically prompted or
asked to do so. This is according to new reporting
from The Verge. Now folks will remember this is not
the first time that X has been used in kind
of a similar way. Back in January twenty twenty four,

(07:10):
AI-generated Taylor Swift deepfakes went viral on X.
We'll drop the episode that we did from twenty twenty
four about that in the show notes just in case
you missed it. So if you're curious what happened this time, well,
Jess Weatherbed of The Verge discovered that Grok Imagine spit
out uncensored, topless videos of Taylor Swift the very first

(07:30):
time she used this tool.

Speaker 3 (07:32):
She did not direct or ask the bot.

Speaker 1 (07:35):
To depict these images, but once she turned on Grok
Imagine's Spicy mode, which is the setting that Elon Musk
promoted in the days right after its launch, it turned
out a video in which Taylor Swift tore off her
clothing and began dancing around in a thong. And again,
this was not something that she asked for. It's like, oh,
let me see what this AI tool is doing. Oh,

(07:57):
it's spitting out videos of Taylor Swift undressed. Now this
is not terribly surprising. There was a really good report
in Mashable that pointed out all the ways that Grok
Imagine lacks even the most basic guardrails around sexual deepfakes.
Right now, the xAI Acceptable Use Policy prohibits users from
depicting the likenesses of persons in pornographic manners. Unfortunately, there

(08:22):
is some distance between sexual and pornographic, and Grok Imagine
seems to be carefully calibrated to take advantage of that
specific gray area. Grok Imagine will readily create sexually suggestive
images and videos, but it does stop short of depicting
actual nudity or sex acts.

Speaker 2 (08:41):
So it's like the whole thing about the definition of
pornography being, I'll know it when I see it. We've
now left that to Grok and Elon Musk to define.

Speaker 1 (08:52):
Pretty much pretty much, you know, and most mainstream AI
companies will usually have a rule that is spelled out
explicitly prohibiting folks from creating harmful content, which typically calls
out by name sexual material or celebrity deepfakes, right,
you know. Rival AI generators like Google Veo 3 or
Sora from OpenAI usually have these built-in protections

(09:14):
that are an attempt to stop users from creating this
kind of content. So you could not just type in,
for instance, Taylor Swift deepfakes and actually generate them.
When those Taylor Swift deepfakes went viral on X
the last time around, users had these workarounds to create it.
So obviously explicitly prohibiting users from making this kind of

(09:34):
sexually charged imagery doesn't stop the problem, but it arguably
does create a bit of a barrier.

Speaker 2 (09:41):
Yeah, and it just kind of, I don't know, like
appeals to common decency that you shouldn't be promoting non
consensual nude imagery of people, even if they are celebrities. Like,
it's just unsavory, and it seems like polite society shouldn't

(10:05):
be pushing those tools onto everyone else.

Speaker 1 (10:09):
Well, not at X, because, unlike its rivals, xAI does
not shy away from not-safe-for-work content in Grok,
and folks might remember that it recently introduced Ani, which
is kind of a flirty sexpot anime avatar that will
engage in not-safe-for-work chats. Mashable reports that
Grok's image generation tool does let users create images of

(10:32):
celebrities and politicians. So it's telling to me that the
competitors of Grok Imagine spell out, like, hey, don't
use our platform in this way; not at X, they
have no such qualms.

Speaker 2 (10:45):
It's like tempting to go down the free speech rabbit
hole and be like, well, you know, the rules about
free speech and First Amendment rights are different for celebrities
and politicians, like, you don't want to limit speech about them.

Speaker 3 (11:02):
But this just sounds like really gross.

Speaker 2 (11:06):
Setting aside the fact that that Verge reporter didn't even
ask for, you know, nude images of Taylor Swift.

Speaker 3 (11:13):
It just it feels super gross.

Speaker 2 (11:16):
You know, we know that Elon Musk has written a
bunch of, like, highly specific rules into Grok. He's had,
or had his engineering teams do it, because I don't
think he can write code at all, but he's written
a bunch of very specific stuff into Grok where, before
Grok answers a question, it will search Elon Musk's tweets

(11:39):
to see what he thinks about an issue
before providing a response. And so I was curious whether
something like that might exist to prevent images of Elon Musk.
So I don't pay for Grok Premium or whatever it
is, because I'm not a Nazi, so I couldn't test out

(12:00):
Grok Imagine. I think that's only available to those folks, Nazis.
Nazis and X users, a Venn diagram that is almost
a perfect circle at this point. But so I just asked,
like I asked it to draw me an image of
Elon Musk in a bikini, and it did, and it

(12:20):
was terrible. And so I do have to give him
that, that he didn't add special rules to exempt himself
from being depicted by Grok, at least in Grok three
point zero, which I guess is a couple models below
this new Imagine one. But it was still terrible, and
I didn't feel any better after having done it. But

(12:43):
I was curious and that curiosity has been satisfied, and.

Speaker 3 (12:48):
It was terrible.

Speaker 1 (12:49):
Yeah, I mean, you can really see what they mean
when tech leaders tell us that this kind of technology,
this is going to be the linchpin of our entire economy.
Communities can't have clean water or clean air, but we
can be served up AI videos of Taylor Swift undressing
without even asking for it or going looking for it.

Speaker 3 (13:10):
That is the future, people. For freedom, for freedom.

Speaker 4 (13:17):
Let's take a quick break. And we're back.

Speaker 1 (13:31):
Speaking of freedom, did you know that we recently might
have solved gender equality? Can I tell you about it?

Speaker 2 (13:40):
Oh, thank goodness because I heard that. Actually, it's been
a problem lately.

Speaker 3 (13:44):
Don't worry.

Speaker 1 (13:45):
We have solved it, because we have another update about
the continuing fallout of the Tea app. That's right, the
app designed for women to share information about men that
they might be dating. We did a whole episode about
this where we talked about how men were saying we
should have our own Tea app where men can talk
about the women, notwithstanding the fact that, do you think

(14:08):
that men need a dedicated app to do this? Men
have been like sharing like pictures of women and talking
about women on group chats since forever. They don't need it.
You don't need a dedicated platform to do it, but whatever.
So the Tea on Her app was meant to be
exactly that, and TechCrunch reports that the Tea on Her app,

(14:29):
the app for men to talk about women, has also
exposed users' personal information, including government IDs and selfies. A
step in the right direction when it comes to gender
equality and progress.

Speaker 2 (14:42):
No just classic race to the bottom stuff like no
one can have anything nice. If everyone has nothing, then
then we're equal.

Speaker 3 (14:52):
You know.

Speaker 1 (14:53):
I envision a world where all genders can have our
data privacy compromised by sketchy apps in the name of
gender equality. The Tea on Her app, which launched on
the Apple App Store earlier this week and shot to
number two in the Lifestyle category on the App Store,
borrowed language from the original Tea app for women
in its description. Here's what TechCrunch found. TechCrunch found

(15:16):
at least one security flaw that allows anyone to access
data belonging to Tea on Her app users, including their
usernames and associated email addresses, as well as driver's licenses
and selfies that users uploaded to the app. Images of
these driver's licenses are at publicly accessible web addresses, allowing anybody
with the links to access them using their web browser.

(15:37):
In one case, TechCrunch saw a list of posts
shared on the Tea on Her app appended with each
user's email address, display name, and self-reported location. It
gets worse, because TechCrunch also identified a potential second security
issue in which an email address and plaintext password belonging
to the app's creator were left exposed on the server.

(16:00):
The credentials appeared to grant access to the app's admin panel.
TechCrunch did not use the credentials because doing so
would be illegal. But it does highlight the risks of
inadvertently leaving admin credentials exposed to the web. What are
we doing?

Speaker 3 (16:17):
Oh, really? That's risky. It's risky.

Speaker 2 (16:19):
You shouldn't do it to expose your admin credentials to
the entire internet.

Speaker 1 (16:25):
I mean I wouldn't.

Speaker 3 (16:27):
No, I generally keep my admin credentials to myself.

Speaker 1 (16:31):
What's interesting to me is the conversation when the
Tea app breach first happened, where all these women who
had uploaded their driver's licenses and their selfies to this
app had their personal information leaked online. A lot of
people were saying, well, the women wanted to gossip about men,
and, you know, turnabout is fair play that their
stuff would be leaked online. I feel like I have,

(16:54):
one, heard a lot less about the Tea on Her
app potentially being susceptible to these kinds of breaches. And two,
I don't feel like there's the same kind of, like,
moral hand-wringing about, well, do these men, like, is
this turnabout is fair play for these men who wanted
tea on women? Now they're having their information exposed. It's
very interesting to me how different the conversations were around

(17:18):
these two different apps in their user bases.

Speaker 2 (17:21):
Yeah, did you ever notice how like their stuff is shit,
but like your shit is stuff?

Speaker 1 (17:30):
I mean, you know that, that, I mean, RIP
George Carlin. I feel like so much of how I
understand the world I have gotten from him,
but specifically that one bit of, well, when I
do it, it's justified. When I do it, it's
like, I don't deserve anything bad happening to me
when I do it, because I'm me, you see.

Speaker 2 (17:51):
Yeah, right, And like not surprising that there aren't hordes
of women taking to the Internet to be like I'm
glad these men had their personal information exposed, Like how
dare they try to run a whisper network and like
talk to each other. That's just not something that women
would say. That's not something that like, well, people of

(18:14):
any gender would say.

Speaker 1 (18:15):
And I read this really interesting piece in Encyclopedia Britannica
by Brick Biskoff called The Manosphere is the Motherboard,
about the Tea app hack media landscape. And one of the
points that I found very interesting and kind of clarifying
in that piece was how easily the initial Tea app
breach was hijacked and how it instantly became this flashpoint

(18:39):
of manosphere talking points. And I do think the Tea
app was, this was a situation that was ripe for
a discussion about you know, gender and dating and sexuality
and privacy and how all of these things intersect with
this one story, but we didn't really have that conversation
because it was so easily weaponized by manosphere communities

(19:01):
talking about how women who were interested in sort of
getting information on men deserve what they get and how
you know, it's it's women who should really be being
gossiped about because of you know, like what however you
want to say it. But how how easily that conversation,
which was genuinely fertile soil for like genuinely interesting conversations

(19:25):
about how we interact with each other, how quickly it
just became about gender wars nonsense.

Speaker 2 (19:32):
It feels like there's really like two groups that are
keeping this thing going. There's like shitheads that are intentionally
trying to like weaponize gender wars as a way to
either farm engagement or just actively like suppress women. They're

(19:53):
like largely the minority.

Speaker 3 (19:54):
But then it also.

Speaker 2 (19:55):
Seems like there's this huge like not huge, but like
it seems like, there's also this group of men who
are like caught up in the nonsense of that first group,
and uh and are just like afraid of women and
like the only thing they know is gender wars, and

(20:16):
so like every single event that is in the news
or is being talked about online gets like filtered and
transformed to reinforce like a gender war framework.

Speaker 1 (20:30):
Absolutely, and I don't remember if it was you and
me that were talking about this. But as much as
I love the internet, and I am of the internet
and was raised on the internet and am super online,
I do think that gender is one of those issues
where the worst people, the most extreme people, the people
with an axe to grind, the people who are grifting
and trying to make money off of the discontent of others,

(20:52):
They have really been able to take an outsized footprint
in the conversation about gender dynamics in my experience when
I'm like out in the world. As hard as it
is being a woman in the world, a black woman especially,
I genuinely think that the in real life way that
we interact with each other, which is not perfect by

(21:14):
any means, I don't think it is, is not so
deeply colored the way that online conversations would have you believe.
Sometimes I genuinely believe that we have let a small
subsection of very loud, very vocal extremists who are
in the minority dictate what the conversation is and, like,
really paint an impression that perhaps is not always the
impression of what is actually happening. And then when you have big
(21:37):
impression that is happening. And then when you have big
flashpoints like the Tea app, that faction explodes, right,
and then they get to be like we told you so,
they get to really so easily dominate the conversation. And
I feel very bad for people who are just looking
for information about others, right, Like you're just interested in
a woman's perspective, or you're just interested, like if you're

(21:59):
a guy and you're like, oh, I want to know
how to approach women or talk to women. The Internet
is the worst place to go for good faith information
in that vein, because it is the worst people who
are owning the conversation and owning the landscape. It is
a real problem.

Speaker 4 (22:15):
You know.

Speaker 2 (22:16):
When I was a graduate student at the University of Wisconsin,
I had the great fortune to study with this amazing professor,
doctor Janet Hyde, brilliant, brilliant professor, scientist, woman, and she
a big part of her career was advocating for what

(22:36):
she called the gender similarities hypothesis, which is both like
so common sense but also like somewhat radical, like the
idea that the men and women have so much more
in common than they have than are different between them,
and yet for various reasons, partly because of that cadre

(22:57):
of extremists you mentioned, but also other reasons, starts to say,
we just love to focus on the differences, and and
so we do. We talk about the differences, and we
focus on the differences and reinforce the differences to the
exclusion of the fact that like actually, come on, like
it's it's not really that different, like men and women
for the most part, Like we just want to watch

(23:18):
a comedy and like eat a burger or like a
salad if you're a vegetarian, like whatever, Like we're just
and yet there's this big investment in like focusing on
the differences that gives them this outsize weight.

Speaker 3 (23:31):
Yeah, I agree.

Speaker 1 (23:32):
I mean, who among us doesn't just want to put
on War of the Worlds and see what happens?

Speaker 3 (23:38):
Okay, well I don't want to watch that.

Speaker 1 (23:40):
We're so different, Mike, I guess.

Speaker 3 (23:44):
This is just a woman thing. You know how women are. Oh,
watch War of the Worlds.

Speaker 1 (23:48):
Women be, women be, women be shopping, women be
watching War of the Worlds. Wait, so you actually
did a little bit of research into the Tea on
Her app, right, more than I did.

Speaker 3 (24:01):
I did just a little bit.

Speaker 2 (24:02):
I'm just like curious about it because something about it
felt strange to me.

Speaker 1 (24:06):
That's, when I was reporting on the Tea app, the
thing that I said was that it felt hinky, and
that it just, something didn't sound right. And it
sounds like you found the Tea on Her app
similarly hinky.

Speaker 2 (24:19):
It was just like the story arc was too quick
that it went from like not existing to being released
to suffering the same types of security breaches as the
Tea app within, like, a week. Yeah, I think, uh,
and like specifically, uh, just putting sensitive information in public

(24:42):
buckets with public URLs that were not secure, which
is insane. Like, that's just completely nuts. Like, I
can't. It's hard to imagine that somebody would do
that in the first place, and it's even harder to
imagine that somebody would commit that same error on

(25:05):
an app that they had replicated that had suffered that
exact breach like a week before.

Speaker 3 (25:09):
But you might think that, like.

Speaker 2 (25:11):
If you were creating a copycat app that was adapted
for men instead of women, maybe a little light bulb
would go off and be like, hey, what if we
make these private URLs? And also like requiring the users
to upload their driver's licenses. That's another big red flag
for me, because you know, as you talked about I
think it was last week on the news roundup, uh

(25:33):
age verification is actually a dangerous morass. And it sounds
like the Tea app realized this like two years ago
and stopped requiring users to do that. That's right, yeah,
because they you know, I guess I don't know what
the reasons were, but I have to assume that part
of their rationale is that this is a dangerous thing

(25:53):
to do, to just be even like handling that level
of sensitive information. And yet the Tea on her app
did it, and like I have to ask, I have
to wonder why, and part of me, you know, like
you know, I've got my tinfoil hat on pretty pretty firmly,
but like perhaps this was all just a fishing expedition

(26:16):
to get a bunch of sensitive information about a bunch
of men, you know, seemed to be created by this guy,
Xavier Lampkin from the Newville Media Corporation. Never heard of
any of these companies. There's a billion app developing companies
that I've never heard of, but like I couldn't find
a whole lot about either.

Speaker 3 (26:33):
Of them online.

Speaker 2 (26:34):
Uh, but it just, it just seemed really, I guess,
hinky is a good word.

Speaker 1 (26:39):
I mean, that was exactly my experience when I was
first doing research on the Tea app. I almost,
I mean, I am not saying this is what was
going on. And I looked into the founder and I
was like, Okay, he has a paper trail, like a history,
an employment history that checks out. But it seemed so
absurd that I was like, is this some sort of

(26:59):
a scheme?

Speaker 2 (27:00):
And I remember we talked about that, and I actually
talked you down from that. I was like, no, no,
I don't think it's a scheme. Like, you're really underestimating
the, uh, incompetence and laziness of humans who work in,
like, software development and data, where you're just trying to
do things really fast, and, like, I was like, yada yada.

Speaker 3 (27:21):
I could see how somebody would, you.

Speaker 2 (27:23):
Know, back up that database to a public bucket and
plan to go back and lock it down later. They
never did, or, like, so I talked you down. I
was like, no, I think incompetence can
fully explain what happened with the Tea app. But then
for somebody else to replicate it a week later with

(27:45):
the same errors, I don't know.

Speaker 1 (27:48):
And after there was so much reporting about the driver's
license and selfie aspect of it, that, oh, these women,
when this app was breached, their driver's licenses with
their addresses on them were floating around the internet —
I don't know how you could have been even nominally
paying attention to that story. And then a handful of

(28:09):
days later be like, new app wants my driver's license
that has tea in the name of the app. I'm
gonna do it. I'm with you. So, as you
know, I went on an evolution from thinking this is some
sort of a scheme to this is just incompetence, and
I do suspect, like, I don't know, but this
is my opinion, my take. I think that these apps

(28:31):
are just being quickly pushed out to capitalize on exactly
the kind of gender wars stuff we were just talking about.
And so I think when you are rushing, when you
want to be part of a current conversation, you want
to get there quickly. It doesn't really surprise me then
that maybe things like, oh, I don't know, security for

(28:52):
your user base would take a back seat when the
only thing that you're worried about is capitalizing on this
big, splashy engagement moment about gender wars. Do you know
what I'm saying? I do.

Speaker 3 (29:02):
I totally do.

Speaker 2 (29:03):
I think I completely agree, And I also think that
there's a little bit of I don't know if irony
is the word here, and poetic justice makes it sound
more satisfying than it is. But like that same asymmetrical
criticism that you described of how, like, when the Tea
app came out, so many men were writing about how

(29:24):
it was, like, messed up and, like, how dare these
women, like, besmirch men, and it's unfair? Uh, and
that there was, you know. With this new Tea on
Her app, there's just, like, none of that. It's just
a vacuum of conversation. And I think that that lack
of scrutiny, and that lack of conversation about it, I

(29:44):
think perhaps is giving companies like the Newville Media Corporation
a pass to just exploit and steal data from their
male users without commentary. Like I couldn't find anything about
it when I was searching earlier today.

Speaker 1 (30:02):
Neither could I, and I guess that's my thing. If
you are. I don't want to use the word grifter,
because I don't think this person is grifting in that sense,
but if you were interested in like the men who
sign up for this app are the product, right, Like
I saw through this as marketing bullshit, But at least
for the tea app for women, the creator, who was

(30:23):
a man, was like, oh, well, my mother dated, and
I wanted to make an app where women wouldn't have
to face dangerous experiences when they dated, yada yada yada.
Whether you believe that or not, I happen to not
really believe that. I think it's just marketing. At least
there was some nod to the fact that this is
an app created for women to keep women safe. Sure, fine, whatever.

(30:43):
I think that for the Tea on Her app
for men to talk about women, I think it is
like, this is someone who was interested in capitalizing on
the current conversation around gender wars, and the men who
sign up for the app, even if they're promised,
like, oh, you can find out all the tea on
these women, da da da da, they are the product
who is being exploited, as evidenced by the fact that

(31:05):
they didn't even do the bare minimum to keep their
data safe. So yeah, it's just the worst people dominating
the conversation, and people of all genders, who might
actually be trying to use these apps to keep themselves safe,
are being exploited by people who do not actually care
about that, that are selling them a false bill of goods.
I guess that's my ultimate point.

Speaker 2 (31:27):
Yeah, absolutely, everybody deserves better. Women deserve better, men deserve better.

Speaker 3 (31:31):
We all deserve better. Exactly.

Speaker 1 (31:36):
After a quick break, let's get right back into it.
Speaking of which, speaking of deserving better and justice, I

(31:57):
guess one of the stories I've been keeping a very
close eye on in the wake of the Supreme Court
ruling gutting Roe is these period tracking apps. When Roe
first fell, probably the number one question that listeners and
people in my own community would ask me was, oh,
Bridget, should I be using period tracking apps? And I
always say, I mean, I err on the side of caution.

(32:18):
We had a privacy expert on the show who said, honestly,
for that kind of thing, good old fashioned pen and
paper is going to be your best bet. But there
was one specific, very popular app called Flo that ended
up in court because of a lawsuit filed over the
app's practice of sharing people's sensitive information with third parties,
including Meta, without permission. So this being a period tracking app,

(32:41):
when I say sensitive information, I'm talking about your menstrual cycles,
like the most intimate stuff happening in your body. That
is the information they are sharing with third parties, including Meta.
I don't really want to talk to Mark Zuckerberg and
Adamoseri from Instagram about my cycles. Actually, So, this lawsuit

(33:01):
was filed in the wake of a twenty nineteen bombshell
Wall Street Journal story reporting that despite promises of confidentiality,
because remember, these companies can and do just say whatever
and then do a completely different thing, Flo shared
users' period data with Meta, Google, and other third parties,
who then used it for targeted advertising. So the other companies,

(33:23):
Google and some of the other third parties, they just
settled in the suit, but not Flo and not Meta,
so it went to court. So no one disputes that the
data was shared; it happened. The thing that was really
in dispute was whether or not menstrual data counts as
health data, which is a special protected class of data
in the US, and some states, most notably California, where

(33:44):
this lawsuit took place, have even stronger state specific protections
for health data specifically, so there was a jury trial
about two weeks ago to determine whether or not Flo
and Meta used this class of personal health data for
advertising and other purposes. Carol C. Villegas, the attorney for
the plaintiffs, asked the jury in her opening statement to
decide how seriously big tech takes women's privacy. She said,

(34:08):
this wasn't an accident, this wasn't a mistake. This is
how Meta makes money. This is their business, and honestly,
knowing what I know about Meta, she is not wrong.
And you know who else agrees with me? The courts,
because this week a California jury found that Meta did,
in fact, illegally collect user health data from the
Flo period tracking app, violating California's wiretap law. In a

(34:32):
statement about the verdict, the attorneys said this verdict sends
a clear message about the protection of digital health data
and the responsibility of big tech: companies like Meta that
covertly profit from users' most intimate information must be held accountable,
and I could not agree more.

Speaker 2 (34:49):
I thought this was a particularly interesting case because the
way that apps like to get around the legality of
sharing information like this with third parties is that they'll
say we aren't health apps.

Speaker 3 (35:01):
You know.

Speaker 2 (35:01):
We've seen that in the realm of therapy. We've seen
it in terms of period.

Speaker 3 (35:06):
Tracking, you know.

Speaker 2 (35:09):
And it's it is kind of like a gray area
in some regards what is and is not health information,
because a lot of stuff is like very personal but
not necessarily health information, like the food you eat. It's
related to your health, but we probably wouldn't call that
health information. The World Health Organization, I think has a

(35:31):
really good definition of health. They define it as quote
a state of complete physical, mental, and social well being
and not merely the absence of disease or infirmity. And I,
you know, as a public health preventive medicine guy, I
really resonate with that. I think it's really important to
take a holistic approach to what is health. But that

(35:54):
very holistic approach does run into some problems when it's
up against the law where the law covers specifically health data. Right,
So we have HIPAA at the federal level, which protects
health information, and then we have states like California that
also have health information laws. And so in this case,

(36:17):
my understanding is that Meta was suggesting that like period
tracking information was not health data, so there's no reason
that it should be held to this higher standard of privacy.
And they were just really trying to exploit that gray
area in the way that we see them try to
exploit every opportunity they can to harvest as much data

(36:42):
from their users as possible and then use it to
sell ads or whatever they can think of to make
money without any concern for the health and well being
of their actual users exactly.

Speaker 1 (36:55):
And we see this time and time again with other
kinds of apps that, I guess, you do think of
them as health apps, but they get to say, like, oh,
we're not a health app. Like, it came up in
our conversation that we had a while ago about BetterHelp,
where they say, oh, well, we aren't a mental health
app or we're not. We don't have health information, we
shouldn't be held to that scrutiny. We're just connecting people

(37:17):
with people who are mental health professionals.

Speaker 2 (37:20):
And you know, I used to work for a company
where we built apps that operated in a HIPAA environment,
where all of the data that we were collecting from users,
and, you know, we were processing it and using it
to try to help people, but all of it was
governed by HIPAA. So I do have some, I'm not

(37:43):
a lawyer, but I do have some experience leading the
data team that had to operate within HIPAA rules. And
I'll share that in my personal experience just talking with
people at like parties or wherever.

Speaker 3 (37:56):
People really misunderstand HIPAA.

Speaker 2 (37:59):
And I think often the average person has this idea that, uh,
it just, like, protects all health data, and anything that
might be related to your health in a
common-sense way is protected by HIPAA. And that's just
not the case.

Speaker 3 (38:15):
It is a very.

Speaker 2 (38:18):
Narrow set of data and information about your body and
your health that is protected by HIPAA. And so if
you're using an app, unless you sign off on, like,
what's called a HIPAA authorization, right, when you're, like, onboarding
onto the app, when you're setting up your account, they'll
have you read like the terms of service and like
you'll sign off on the terms of service, but there

(38:40):
will also be a separate thing that you have to
sign off on, that is the HIPAA authorization, that essentially,
you know, says that you give up some of your
HIPAA rights so that they can process the data. Unless,
unless that's part of what you sign off on during
the onboarding process, that's not being treated as health data

(39:01):
by that app, and so they're just considering it any
other data like whether you know what size shirt you wear,
or whether you prefer blue to green, or some sort
of data like that.

Speaker 1 (39:14):
And it really is troubling when we're talking about something
that is so intimate about your body. It's like some
of the most intimate information about us. And I will say,
the damages have yet to be set in this case,
and I suspect that Meta will appeal, but basically the
Flo app and Google settled pretty quickly.

Speaker 3 (39:36):
Meta lost.

Speaker 1 (39:37):
They will appeal, but we'll see. Everybody involved in this
basically has now been found to

Speaker 3 (39:43):
Have been doing something they shouldn't have been doing with
our data.

Speaker 1 (39:46):
And I do think that that is still a win
for privacy, Like, I think it sends a message that
you can't just there are limits to how much you
can exploit people, like not everything about us is for
sale or simply hours for the taking, and that there
will be consequences. Again, Meta will appeal. Even if they
lose that appeal, what they will end up paying,
I'm sure, will be nothing. And I also think

(40:08):
the fact that Flo, Meta, you know, were all found
to be doing something they shouldn't be doing. When the
conversation comes up of, like, well, can I trust this
period tracking app? Even if the damages aren't much,
that being part of the public record, that, no, you
cannot trust this specific app, like, nope — I
think that will go a long

(40:30):
way in helping people understand the dynamic that we actually
live in. That, like, these apps will say anything, right?
The Tea app said they were deleting selfies when they
weren't. You really can't, like, how much do you
trust Mark Zuckerberg with this kind of information about yourself?
I don't trust them at all? Right, And so even
if it's not going to be something that actually financially

(40:51):
hurts any of these companies that have been, you know,
shown to be doing the wrong thing. I think it
helps us see, hey, maybe I really can't just take
them at their words.

Speaker 2 (41:00):
Absolutely, you can't trust them, and, uh, we also
have to stand up for privacy.

Speaker 3 (41:05):
It's nice to see people doing that and winning.

Speaker 1 (41:08):
Speaking of Meta, I did want to give a quick
PSA that Meta just rolled out this new Instagram Map
feature, which is sort of like Find My Friends or
the Snap Map, which will allow friends to monitor each
other's real-time location. So essentially, this new feature on
Instagram allows users to opt in to sharing their location

(41:29):
with people, with a bevy of options including friends (followers
you follow back), close friends, selected friends, or no one.
So obviously whether you want to use this feature is
up to you. I just do not trust Meta. And
the thing with Meta specifically is that they are known
to have this like just a dizzying amount of third

(41:54):
parties and contractors that they share information with. So when
you share information on Meta — a lot of companies are
like this, but, like, Meta is like this on
a different level — you just have no
idea who all, like what landscape, that is being shared with,
which is why I'm like very particular about how my
information is shared via Meta.

Speaker 2 (42:14):
Yeah, I think that's like such a good way to
say it, and I think that applies to so many situations.
It's like a lot of companies are like this, but
Meta is like at another level.

Speaker 1 (42:25):
And again, I mean, as we just said, Meta has,
like... I don't trust any of these companies. I'm not
saying that any one is, like, doing great or whatever. However,
Facebook and Meta are so bad. And so in a
landscape that's full of companies, and people
that run them, that are doing the wrong thing,
Meta is doing the wrong thing double, right? Luckily, this

(42:47):
is very easy to disable. Even though Meta says that
it is an opt in system, I would still recommend
checking your settings to see if you are currently sharing
your location. So please go and check your settings. However,
even if Instagram does not know your location, people who
tag their posts with a location can still be seen if
they opt in. We'll throw those instructions into the show notes.

(43:08):
But again, I just think people should be aware that
this is a new feature, spend some time thinking about
whether or not you want to share this with Instagram
and Meta and also in general. I mean, like, who
really was clamoring for this? But it just really makes me think,
I don't know if the geniuses over there at Meta,
and the geniuses like Adam Mosseri who, you know, is

(43:31):
the tech guy I probably hate the most out of
all the tech guys we talk about on this podcast.
For some reason, I don't know that they really know
what people want. If you've ever talked to a woman,
I bet that you might know that opting people in
to sharing their location when they post on social media
is not something that I think most women would think

(43:52):
was super cool. But, you know, at a company
run by Mark Zuckerberg, who initially started his tech empire
rating the looks of women, I guess I can't say I
expect much more.

Speaker 2 (44:03):
Yeah, it has nothing to do with what people want
and everything to do with them getting more information about
you so they can sell more stuff, some of it
to you, some of it to other people like you.
Doesn't even matter. They don't even have to have a
specific reason why they want this information. They just want
as much information about you as they can have, and
they're not real choosy with who they share that with.

Speaker 1 (44:25):
The conversation around this location sharing change on Instagram has
been really interesting. It's mostly been happening on Threads, where
people are essentially complaining about this change, in some cases
maybe sort of fear mongering about it a little bit.
They're kind of leaving out the fact that according to Meta,

(44:45):
this change is opt in, so it's not automatically sharing
your location. However, as the story that we just talked
about indicates, you really can't trust what Meta says, so
they are a company that, per a court, will say one.

Speaker 3 (44:58):
Thing and do another.

Speaker 1 (45:00):
And Adam Mosseri himself is in people's posts on Threads saying no,
that's not true. We're not automatically sharing anybody's location. It's
totally opt in. And I think what they are missing
is that this is their fault. This is the level
to which consumers simply do not trust Meta and Instagram,
and clearly those consumers are not wrong for, you know,

(45:24):
not giving Facebook or Meta or Instagram the benefit of
the doubt. I've really been kind of surprised seeing people say, wow,
average consumers are misunderstanding this change and fear-mongering
and misreporting it. I don't think it's any of that.
I think that Facebook really has not seen the level

(45:45):
to which people simply do not trust this company, and
they're not wrong for doing.

Speaker 3 (45:49):
So, completely agree.

Speaker 2 (45:51):
This really exposes how little trust people have for Meta,
And like you just said, it's not that people are misunderstanding.
People understand perfectly well that Meta over and over again
has demonstrated that they can't be trusted, that they will lie,
that they will try to sneak your data however they can.

(46:15):
And also, I think the claim of "it's totally opt
in" misses an important piece: that a lot
of people probably allowed Meta to access their locations when
they first downloaded these apps, you know, in some cases years
ago, before this feature existed, and if that's the case,

(46:37):
I'm pretty sure it would still be on.

Speaker 3 (46:40):
So it's not surprising to.

Speaker 2 (46:43):
Me that we've seen some people be surprised that their
location sharing is turned on. You know, maybe they technically
did opt into it at some point in the past
before this feature existed, but that doesn't change the fact
that they now feel surprised that their location is being

(47:04):
shared in this way exactly.

Speaker 1 (47:07):
And it's really been interesting to see folks like Adam Mosseri
clearly be in damage control mode, and it just shows me,
they don't understand the role that they play in the
media diet of their own consumers. Diego Jimenez, who is
a product designer at Instagram, posted on threads Misinformation Aside

(47:28):
the reactions to the new IG friends map are pretty
Younger generations get it and love it. They already use
social maps. Boomers don't understand it and freak out; they
want photos back. The real takeaway is, no matter how
clearly a new feature is explained, people won't read the
explanation or give it a try before rushing to alarm
the world about it. Shruggy emoji. And that, coming

(47:51):
from somebody who is an internal product designer at Instagram
tells me so much. These people simply do not get it.
They do not get the way that people are so
distrustful of them, their company, their platform, their products, these changes.
The fact that there's been so much loud and vocal
backlash that they basically have to resort to you're just

(48:12):
an uncool boomer if you don't like this, really
tells me that they have no idea how the public
actually perceives them, and you know, until they're ready to
hear that and like deeply in a meaningful way, make
amends for the rightful reasons why people are not willing
to trust them and are not willing to give them

(48:33):
the benefit of the doubt and are rushing to sound
the alarm they're not gonna learn. Like it just is
is sad to see people essentially say, no, it's the
users who are wrong.

Speaker 3 (48:44):
That quote you just read.

Speaker 2 (48:45):
I thought that was from some sort of, like, pundit journalist
or podcaster or something. It's nuts that that's an actual,
like internal product person at Meta talking about their own
users in that way.

Speaker 3 (48:59):
It sounds painful.

Speaker 1 (49:00):
And if you had, if you worked on a team
that rolled something out that had this level of loud backlash, again,
even if some of it is perhaps not entirely accurate,
in that Meta says, oh, this is all opt in.
We're not automatically sharing anybody's location with anybody. But they
are a company, per the courts, that
you cannot trust. If, like, if the only way

(49:24):
that you can respond to that is essentially telling the
people who are speaking out about how they're responding to
this rollout by saying you're uncool, you don't understand tech,
and you don't get it. That really says a lot.
I mean, is it any wonder why people do not
trust this company more?

Speaker 4 (49:42):
After a quick break, let's.

Speaker 1 (49:53):
Get right back into it, all right, So we got
to talk about Chris Cuomo, the brother of local sex
pest Andrew Cuomo. Chris Cuomo was fired from CNN a
few years ago after it came to light the extent
to which he was helping his brother Andrew Cuomo in

(50:14):
his defense against sexual harassment allegations that led to Andrew
Cuomo resigning as governor of New York.

Speaker 3 (50:21):
So real, just.

Speaker 1 (50:23):
A-list family over here.

Speaker 2 (50:25):
Yeah, And just to put a point on it, the
reason he was fired was because the network that employed him, CNN,
which like you know, questionable, but we can move on
from that. But like his journalistic employer felt that he
was not upholding journalistic standards, right, Like that's why he
was fired. Correct, Yes, this will become important again later.

Speaker 1 (50:52):
Well, Chris Cuomo, phenomenal journalist that he is, currently now
at NewsNation, shared this video of AOC talking about
the Sydney Sweeney American Eagle jeans ad on X.

Speaker 5 (51:06):
Sydney Sweeney looks like an Aryan goddess and the American
Eagle jeans campaign is blatant Nazi propaganda. I mean, watching
that sultry little temptress squeeze into a Canadian tuxedo three
sizes too small, with her bouncy little fun bags on
the screen, staring at you, piercing through the core of

(51:27):
your soul, with those ocean blue eyes that could resurrect
the Führer from his grave in Argentina is something that
should alarm every American citizen, because in America, beauty is
not defined by whiteness. Oh no, it is defined by
the number of victim groups of which you are a member. Skinny, attractive,

(51:48):
blonde haired, blue eyed cisgender women descend from the slave
daddy oppressors of this nation. And any man who cranks
one out while thinking about a woman like this probably
hates black people, probably hates gay people, and they certainly
hate the diversity of our great nation. So I say,
instead of simping for the Sydneys, we should be celebrating

(52:10):
the Shaniquas.

Speaker 1 (52:12):
Instead of worshiping the hot.

Speaker 5 (52:13):
Straight blonde, what about the obese alphabet people with blue hair?
They need love too. And to all the haters who
say companies that go woke, go broke. I'd rather be
poor than a Nazi.

Speaker 1 (52:26):
Only one problem. It's a deepfake, and honestly, I
think a pretty obvious deepfake at that.

Speaker 3 (52:34):
It is so clear that he thought this.

Speaker 1 (52:35):
Video was real, even though the clip has a watermark
on it that says parody one hundred percent made from AI.

Speaker 2 (52:43):
I guess he didn't see it. It's got a watermark
right on it, Like, come on, man.

Speaker 1 (52:48):
So he shared this on X with the message: nothing
about Hamas or people burning Jews in cars, but Sweeney
jeans ad deserved time on floor of Congress. What happened
to this party? Fight for small business, not small culture wars.
Never want to back down. AOC called him out and said,
this is a deepfake, dude. Please use your critical

(53:09):
thinking skills. At this point, you're reposting Facebook memes and
calling it journalism, which that's absolutely what he's doing. It's
like if my uncle was a journalist and was like
reporting on Facebook's AI slop as if it was real.

Speaker 3 (53:23):
It's also such a double standard.

Speaker 2 (53:25):
It's like, what does, like, Hamas or people burning other
people in cars have to do with fighting for small business?
It's like you're fighting the culture war in this tweet
calling to not fight culture wars like have some self awareness.

Speaker 1 (53:43):
And Chris Cuomo has a platform on NewsNation. If
he wants to highlight any of those issues, he absolutely
has a platform to do it. He doesn't have to
scream at AOC for not doing it, even though it's
completely made up, like he's just like making up stuff
to be mad at. So he took the video down
and said, you are correct. That was a deepfake,

(54:06):
but it really does sound like you. Thank you for correcting.
But now to the central claim. Show me you calling
on Hamas to surrender or addressing the bombing of a
car in Saint Louis belonging to an IDF American soldier, dude.
And then he doubled down on this on his show.
Here's what he said. She was right, they got AI.

(54:28):
It was really good and it did seem like something
she would say. So I thanked AOC for correct but I.

Speaker 3 (54:33):
Didn't reminded her.

Speaker 1 (54:34):
She ignored the part of.

Speaker 6 (54:36):
The tweet that met Okay not Sweeney, which really should
never been a fake.

Speaker 4 (54:41):
Let's be honest.

Speaker 6 (54:42):
Why did AOC, the most popular Democrat in the country,
powerful, reportedly, ignore what I asked about calling on Hamas
to surrender to end the war? As far as I saw, she
has never said that, that I could find. Think about that.

Speaker 1 (55:00):
So it is so clear to me that when it
comes to AI, one of the things that makes it
tricky is that, especially when it adheres to a worldview
that we already hold, it can be difficult to see like, oh,
this is not real. I am getting taken by a
fake video. So even though in my opinion, this was
very obviously AI, down to the watermark spelling out that

(55:23):
it is AI and not real, the fact that it
aligns with Cuomo's worldview, an attitude and an opinion that
he already holds, means he's very ready to believe
and share any sliver of information that upholds that
attitude. I think
that is why we see Cuomo now basically saying, well,

(55:43):
it doesn't matter whether or not she actually said this,
it doesn't matter whether or not this was actually fake,
because I think she would say this, and so AOC,
you have to defend yourself against this allegation that I
basically just made up, buttressed by something that,
I now understand, was fake, because it is a
worldview that I hold. It's basically just using AI to

(56:04):
hold people accountable for a worldview or opinion that is
bullshit, that he just made up. He has made
something up to be mad at.

Speaker 2 (56:12):
I think this is the darkest thing we've talked about
in this entire episode, because it just really illustrates exactly
the way that AI is like destroying society, destroying democracy,
that like, it's not real, it's fake. Like this

(56:34):
is the dangerous thing about AI is it is blurring
the line between what's real and what's not. And that's scary.
But what's even worse is that someone who purports to
be a journalist would just act like that distinction doesn't
even matter. Like even after he's called out on the

(56:54):
fact that this is a fake video, he just doubles
down on it and like you said, like demands that
she respond to this fake thing that she didn't say.
It's dark and frightening, and it's like he is
just working so hard to bolster the authoritarian takeover of

(57:22):
our entire information ecosystem with this kind of behavior.

Speaker 1 (57:27):
And I just, I mean, not to make it about me,
but I simply cannot imagine being a journalist, being called
out for getting something so wrong,
and then doubling down on it and being like, but
I'm still owed some answers. Like, where is the shame?

(57:47):
Bring back shame. And not for nothing, Congress is not
even in session right now, you moron. Like,
when do you think this happened? I just, honestly, these
are meant to be our journalists. And listen, I
told a story on the podcast last week. I got
taken by an AI video of some bunnies jumping on a trampoline.
Although, when I told you about that, you were like,

(58:08):
I thought it was AI, but I didn't want to
burst your bubble.

Speaker 2 (58:13):
I mean, I said that after the fact, so you
thought it was real. No. But like, those bunnies did
not jump, Bridget.

Speaker 1 (58:21):
But it's one thing. I mean, it all comes from
the same place. I don't want to say that me
getting taken is the same thing as Cuomo getting taken,
but I think it really shows exactly what you were saying,
about how

Speaker 3 (58:36):
Bad and eroded.

Speaker 1 (58:37):
Our information and digital media ecosystem has gotten, that he
could get something so wrong.

Speaker 3 (58:43):
As a journalist.

Speaker 1 (58:45):
Basic stuff, and still sort of be like, well, it sounds
like you, so even if you didn't say it,
it's something that you would say.

Speaker 2 (58:54):
Like what? Like, yeah, it is similar, because you were
like, those bunnies would jump on a trampoline. Like, how
do you know they wouldn't get up there? They could jump.

Speaker 1 (59:01):
I still stand by that. I still stand by that.
When night falls, bunnies are doing all kinds
of things we don't know about. I stand by that.

Speaker 2 (59:09):
It was a fake. It was a fake. They didn't
jump on that trampoline. They're just down there eating grass, dandelions,
big broad leaf vegetation.

Speaker 3 (59:19):
That's what they like.

Speaker 1 (59:19):
You know a lot about bunny diet.

Speaker 2 (59:23):
Yeah, that's how I knew it was a fake. It
wasn't a real bunny thing.

Speaker 1 (59:27):
They don't jump on You don't think they would jump
on trampolines.

Speaker 3 (59:29):
No, they would hate that. First of all, how are
they gonna get up there? Why would they get up there?

Speaker 1 (59:34):
Why? Haven't you ever watched a cartoon?

Speaker 2 (59:37):
They're afraid, they're skittish. They may as well be holding
up a sign while they're jumping. It's like, hey, come
and eat me, foxes, whatever.

Speaker 1 (59:48):
Every now and then on the show, I will talk
about a story that sounds like it was created in
a lab to infuriate me specifically, and this is one
of those stories, because over the past two weeks, there have been
six known incidents of green sex toys being thrown onto
the court during WNBA games. The latest happened this week

(01:00:09):
during a game between the Indiana Fever and the LA
Sparks at Crypto dot Com Arena in LA. Sex toys
were also thrown into the stands at the New York
Liberty and Phoenix Mercury games this week, and another two
were thrown at the Atlanta Dream game last week and at
a Chicago Sky game on Friday. It has gotten pretty serious.
Two different people were arrested on multiple charges for allegedly

(01:00:29):
throwing these sex toys. We were going to play audio
of Lynx coach Cheryl Reeve speaking about this, but it is so drowned out with
sounds of dribbling and basketball squeaks that I'll just read
what she said, but we put the link to her
speaking in the show notes. Her words are very powerful,
and I encourage folks to hear what she had to
say in her own words. But she said this has
been going on for centuries, the sexualization of women. This

(01:00:52):
is the latest version of that, and it's not fun
and it should not be the butt of jokes on
any radio shows, or in print or in any comment.
The sexualization of women is what's used to hold women down,
and this is no different. This is just a great
example and we should write about it in that way.
And these people that are doing this should be held accountable.
We're not the butt of the joke. They are the problem,

(01:01:14):
and we need to take action. So I have been
following this story and trying to figure out what the
heck is going on here, But right as we were
sitting down to record, USA Today had a break in
this story about what is going on. USA Today spoke
to the spokesman for a crypto group called Green Dildo Coin,

(01:01:34):
and that spokesperson said that they were not trying to
be disrespectful or disrespect women. He described their group as
a group of crypto enthusiasts and traders who launched Green
Dildo Coin, and that the whole thing was meant to
be lighthearted and perceived as a joke or a prank
in order to protest what they describe as a toxic

(01:01:57):
environment taking over the crypto world. So if you don't
know what a meme coin is, it's a type of
crypto asset that is sort of inspired by Internet memes
or characters or trends, for which the promoter seeks to
attract an enthusiastic online community to purchase the meme coin
and engage in its trading. This is according to the SEC,

(01:02:18):
so meme coins are kind of like collectibles with limited
or no functionality per the SEC, and their entire value
is dictated by social and cultural influences.

Speaker 3 (01:02:29):
They're beanie babies.

Speaker 1 (01:02:30):
Yeah, that's a good comparison. Meme coins generally carry a
lot more risk compared to other cryptocurrencies, which are already
pretty risky. So according to the people in this dildo
meme coin group, and talk about a sentence, if you had
to explain that sentence to yourself, if you went back
in time and you had to talk to your ten
year old self back in like nineteen ninety whatever. Imagine

(01:02:53):
having to say I'm going back for.

Speaker 3 (01:02:55):
I'm like so much.

Speaker 2 (01:02:57):
I like to do a little thought experiment of like,
what if George Washington and Ben Franklin like traveled
forward through time and I was like, well, the
dildo meme coin.

Speaker 1 (01:03:08):
Well, you and I have that joke where it's like,
you go back in time and you wake up your
young self and you're like, wake up, the guy from
Home Alone 2 is president and he's rounding people up,
and you're like, Tim Curry's president?

Speaker 2 (01:03:24):
I mean, pretty much anything from the future, if you
told it to someone of the past, they would be horrified.
And by the future, I mean the present, correct? If you told anyone
from the past about what's happening right now, they'd be horrified.

Speaker 1 (01:03:33):
And I mean, I'm horrified by this because basically this
group says that, oh, smaller players in the crypto and
meme coin space are struggling to keep up with all
the influencers and scammers, and we need to do something.
If you ask me, that space has pretty much always
been scammers and influencers, but sure, do your thing, Green

(01:03:54):
Dildo Coin. So as a form of protest, the meme
coin was created. The faction began infiltrating WNBA arenas with
color-coordinated sex toys to coincide with its launch. USA
Today Sports obtained text messages showing the group's coordination and
planning before the coin's launch on July twenty eighth, and

(01:04:15):
the first sex toy being thrown at a WNBA
game on July twenty ninth. So you're probably thinking, wow,
that is so disrespectful to the women athletes who are playing,
to the people who are just trying to enjoy a
WNBA game with their families. But don't worry, don't worry,
don't worry. It's chill, it's still chill, because they said,
we're not trying to harm anybody, or embarrass anybody, or

(01:04:38):
disrespect anybody. They say that the community has only been
advised to throw their sex toys if there is a
level of personal comfort and if the objects can land
without hitting somebody. They also were like, don't worry, this
is not about being disrespectful to the women athletes or anything.
He said, we didn't do this because we dislike women's sports,

(01:05:00):
or like some of the narratives trending right now that
are ridiculous. Creating disruptions at games? It happens at every
single sport. We've seen it at the NFL, we've seen
it at hockey. Fans doing random things more or less
to create attention. So they go on to explain how
they are just trying to spread awareness about the culture
that they want to perpetuate in the meme coin and

(01:05:21):
crypto community, cultivated around lightheartedness, jokes, pranks, and various stunts.
The coordinated effort, they said, is a very strategic protest
against what meme coin creators view as a small group
of individuals controlling the crypto space. It's not about women,
it's not about being disrespectful. It's just about meme coins

(01:05:43):
and crypto, to which I say, eat shit. Do you
think the women whose games you are disrupting give a
crap about your crypto meme stock? Do you think that anybody
cares about the vibes that you were trying to curate
in the crypto space.

Speaker 4 (01:05:58):
No.

Speaker 1 (01:05:59):
Women athletes already have to struggle for respect, both on
and off the court. We have talked about the ways
that technology intersects with the very real threats of harassment
that these athletes face. And you expect us to think that
it's a coincidence that they're doing this at WNBA games
instead of NBA games?

Speaker 3 (01:06:15):
Literally, go choke on your green dildos.

Speaker 1 (01:06:18):
With the rise of things like social media and online
sports betting, ESPN reports that harassment of athletes is on
the rise. So I don't give a fuck if you
are trying to raise awareness for your lighthearted meme coins.
If you are doing it off the backs of women
who are just trying to show up and do their
jobs on the court, you fucks. If anybody from this

(01:06:39):
group is listening, I cannot imagine a more pathetic way
to spend your time. Not to mention, dildos are not cheap.
You're buying a WNBA ticket and a dildo to throw
onto the court.

Speaker 3 (01:06:52):
What are you doing.

Speaker 1 (01:06:53):
I have never heard of something more pathetic and embarrassing
in my life. I have nothing to do with this,
and I am embarrassed about it.

Speaker 2 (01:07:01):
Yeah, and if they were really serious about it, I
mean they should be bringing their green dildos into other
sporting events, like bring it to a UFC fight, bring
it to an NFL game, bring it to a hockey game.
You know, just bring your dildos all over the place
with you if that's like your thing, if your whole
thing is dildos, don't just throw them at women.

Speaker 1 (01:07:22):
No, they never would, because it's about humiliating women, like
they can say whatever they want. There's a reason they
didn't do this at a UFC fight or an NBA
game or something like that, or a NASCAR event.
There is a reason, and that reason is because they
are about humiliating women. They are about getting engagement and
traffic and eyeballs off of humiliating women. They would never

(01:07:44):
do this at a sporting event where there's going to
be lots of men, because they'd probably get their clocks
cleaned, deservedly, and they don't want to show up to
spaces where there's going to be men to do that.
They want to do stuff like this in spaces where
they assume they are not going to be physically challenged.
So it has everything to do with humiliating women.
I don't give a fuck if they say otherwise, like
it's so disrespectful, as if women athletes don't have enough

(01:08:08):
to deal with. Think about the kind of homophobic harassment that people like
Brittney Griner have had to face. I don't know if
people have read her memoir, but like, to go to
a WNBA game and throw a sex toy at someone
that's trying to do their job is so humiliating and
degrading and disgusting, and then to play in our faces
and say oh no, no, no, no no, it has

(01:08:29):
nothing to do with women or humiliating women. That's nonsense.
It's about my meme stock.

Speaker 3 (01:08:35):
Fuck you.

Speaker 1 (01:08:36):
It's like gaslighting and I will not have it.

Speaker 2 (01:08:39):
Yeah, absolutely, and like, not even effective gaslighting.
Like, come on, this fucking meme stock? Like, get out
of here.

Speaker 1 (01:08:47):
Get out of here, get out of here. Before we end,
I have one quick note, which is just really funny.
Linda McMahon, the Secretary of Education and former chairwoman of
the WWE, speaking of wrestling, folks might know her as
the person who was trying to talk about AI at
Education and kept calling it A1, like the steak sauce,
which I actually love.

Speaker 2 (01:09:10):
Our children need to know about this tangy sauce to
be able to compete against China in the future.

Speaker 1 (01:09:15):
My children will... how? My children will never know about
steak sauce. How dare you? My children will be eating
steak that doesn't need steak sauce on it. She was
trying to speak at the Young America's Foundation conference and
something happened that caused the Curb Your Enthusiasm theme song,
circus music, and audio of Linda herself being called corrupt to

(01:09:36):
keep playing over her as she tried to speak.

Speaker 7 (01:09:39):
His people understood what his working style was and he
understood, you know, them. So, but this time, you know,
he knew, he knew the story, he knew how to,
you know, how to make things work, how to make
things run. He had the people coming in that he
really wanted to work with him, and then he'd
had this little, you know, sometimes kids in college take

(01:10:02):
a gap year, well, he didn't voluntarily take a gap term,
but I think it turned out to be an incredible
thing for him.

Speaker 1 (01:10:10):
Chef's kiss, no notes. Whoever came up with this, I
think they're doing big good.

Speaker 3 (01:10:17):
We'll leave you with this.

Speaker 1 (01:10:23):
Got a story about an interesting thing in tech, or
just want to say hi? You can reach us at
Hello at tangoti dot com. You can also find transcripts
for today's episode at tangoti dot com. There Are No
Girls on the Internet was created by me, Bridget Todd.
It's a production of iHeartRadio and Unbossed Creative. Jonathan Strickland
is our executive producer. Terry Harrison is our producer and
sound engineer. Michael Almado is our contributing producer. I'm your host,

(01:10:45):
Bridget Todd. If you want to help us grow, rate
and review.

Speaker 4 (01:10:48):
Us on Apple podcasts.

Speaker 1 (01:10:50):
For more podcasts from iHeartRadio, check out the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts.

Speaker 7 (01:11:00):
Love be well, oh well, and well lived.