
May 18, 2024 51 mins

The weekly news roundup is back!

Bridget’s piece of Elizabeth Holmes’ hair: https://www.instyle.com/hair/elizabeth-holmes-white-privilege-messy-hair

Zuck's new style was on full display at his birthday: https://ca.news.yahoo.com/rapper-swag-zuck-gone-too-085234346.html

#Blockout2024: Why people are blocking celebrities on social media: https://mashable.com/article/tiktok-blockout-2024-celebrity-met-gala

About 800,000 BetterHelp online therapy customers receive refund notices for privacy violations: https://www.nbcnews.com/business/consumer/betterhelp-online-therapy-customers-receive-refunds-over-data-misuse-rcna151546

On Instagram, a Jewelry Ad Draws Solicitations for Sex With a 5-Year-Old: https://www.nytimes.com/2024/05/13/us/instagram-child-safety.html

Women no longer have to make the first move on Bumble. Will it make the app better? https://www.npr.org/2024/05/06/1249296671/bumble-dating-apps-women-opening-moves

Bumble apologizes for its mean anti-celibacy ad fumble: https://www.theverge.com/2024/5/14/24156746/bumble-dating-app-anti-celibacy-billboard-ad-apology 

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet is a production of iHeartRadio and Unbossed Creative. I'm Bridget Todd, and this is There Are No Girls.

Speaker 2 (00:14):
On the Internet.

Speaker 1 (00:18):
Welcome to There Are No Girls on the Internet, where we explore the intersection of identity, technology, and social media. And
this is another installment of our weekly news roundup, where
we break down and explain all the different news stories
from the Internet that you might have missed. Yes, it
has been a while since we've done one of these.
I took a little bit of a break from the news,
but now we are back and better than ever. Mike,

(00:39):
you are also back. Thanks for being here.

Speaker 3 (00:41):
Thanks for having me, Bridget. It's great to be back.
It has indeed been a minute. There's been a lot
of tech news in that time, and it's great to
be here talking with you again.

Speaker 1 (00:50):
I'm a little bit distracted because I'm scrolling through these
like polished Mark Zuckerberg pictures. Have you seen these?

Speaker 4 (01:00):
I have. Yeah, it's like a whole new Zuck.

Speaker 1 (01:02):
Someone from Mark Zuckerberg's or Meta's PR team is working overtime to try to convince us, the public, that Mark Zuckerberg is a more polished, more stylish, more refined kind of guy. And far be it from me to spend a lot of time talking about the sartorial choices of a tech leader. However,
we know there is a ton of precedent for tech

(01:24):
leaders using their fashion and style and hair to try
to project things at us, the public. So it does
kind of seem like fair game to be talking about
Mark Zuckerberg rocking a chain at his fortieth birthday party,
et cetera, et cetera, clearly trying to signal to us
that he's a little more polished, a little more refined,
a little more stylish.

Speaker 3 (01:44):
Yeah, right. You know, you've talked about this before. Didn't you write a whole op-ed about Elizabeth Holmes's hair and her active choices to have questionably nourished-looking hair?

Speaker 1 (01:58):
Good memory, I definitely did. I wrote a whole piece about Elizabeth Holmes's hair choices. If you remember her, she definitely was known for a certain type of hairstyle that I think was with intention. It did not look nourished or conditioned, and I think that was by choice.
We'll throw the article in the show notes if you
want more information, but she was definitely somebody who knew

(02:20):
how to use her fashion and hair to signal a
certain kind of thing to us.

Speaker 3 (02:27):
Yeah, I think you're right that, you know, all of these tech leaders do that. You know, they get up on a stage, there's a lot of photos of them. It's all, I think, pretty deliberate. And so for this new Zuckerberg rebrand, do you think they brought
in the Rick Perry guy, the guy who brought us
the smart Rick Perry, the one who wears glasses.

Speaker 1 (02:45):
Oh, that was my favorite, like, rebrand, when they put him in glasses and it was like, guess what, y'all, Rick Perry is smart now. From the team that brought you Smart Rick Perry comes Zuckerberg.

Speaker 4 (02:58):
Glow Up, Zucker glow up, the Zuck blow up.

Speaker 1 (03:01):
The Zuck glow. So basically, in these pictures, there's one where he made an announcement about Meta AI and he was
wearing like a silver like a silver Cuban link chain
and everyone was like, oh, nice chain. And then somebody
like used AI deep fakes to put a little like
one of those kind of thin beards on him. So
if you saw that picture, the chain that was legit,

(03:23):
the beard that was not legit. But even though that picture wasn't real, it really highlights my belief that what
contouring makeup is for some women, beards are for some men.
Like a beard can completely change the way a man's
face looks. It can like it is like magic.

Speaker 3 (03:43):
Yeah, absolutely, a beard really changes a man's face.

Speaker 1 (03:47):
So it sounds like over the weekend, Zuckerberg had a
birthday party where for the party, his wife recreated all
of the places that he lived when he was growing up,
so like his Harvard dorm room where he created Facebook,
and there are all these pictures of him wearing like
a fitted black tee and a chain, like in a

(04:09):
fake dorm room kind of diorama, like adult sized diorama.
The whole thing is like, I don't know, it's clear
to me that somebody wants us to think of Mark
Zuckerberg as this affable guy with friends and family who
love him, you know what I mean? It's clear to me that that's what we're all supposed to be taking away from this. God.

Speaker 3 (04:28):
I have to look for those photos of him in
the dorm rooms with a chain, because that was definitely
a type of guy in Boston in like the early
two thousands when he was there. That's when I was
in dorms in Boston, and like that was definitely a
type of guy, like fitted black shirt, chain. Usually not, like, you know, a forty-something-

Speaker 4 (04:46):
year-old Zuckerberg. What a nice and strange gift from
his wife.

Speaker 1 (04:55):
The pictures are unreal. I did see someone comparing it to Marie Antoinette. Apparently, Marie Antoinette had
built an entire fake village in Versailles that was equipped
with like a working farm and fake people to work it,
so that whenever she wanted to pretend to be a

(05:17):
peasant girl on the farm, she could do that. And
somebody compared Mark Zuckerberg's kind of fake set pieces of
his college dorm and like crappy first apartment to that,
which I thought was kind of funny. That's actually a
pretty good segue into what I want to talk about first,
which is the digital guillotine movement, aka the digitine. If you

(05:38):
don't know what that is. Basically, people on social media
are using hashtags like hashtag let them eat cake, hashtag blockout twenty twenty four, and hashtag celebrity blocklist to rally people to boycott celebrities like Zendaya, Taylor Swift, and the Kardashians.
Here's how Mashable describes it: labeled quote Blockout twenty twenty four and digitine, aka the Digital Guillotine, the movement

(06:00):
is a protest against celebrity culture, specifically blocking people of
influence who have not yet used their power or privilege
to take a stance on the humanitarian crisis that has
devastated millions. So this all seems to have really been
sparked by the Met Gala last week. The night of the Met Gala was also the night that airstrikes targeted Rafah in Gaza, which is an enclave on the border

(06:21):
of the Gaza Strip, which is really key for transporting
aid and supplies to Palestinians. So there has been a
lot of chatter about the Met Gala as this symbol
for out of touch, wealthy celebrities really reveling in their
privilege and opulence, while just a few blocks away protesters
are on the streets calling for a ceasefire in Palestine.

(06:43):
People were even comparing this to the Hunger Games, and honestly,
from like an optics perspective, it is kind of hard
to disagree. I will say that for what it's worth.
I don't really feel like I have a clear take here.
On the one hand, I totally get how it looks
very out of touch to have these celebrities, like,
flaunting their wealth while so much is going on in

(07:04):
the world. I don't say that to say that I'm
like above paying attention to the Met Gala. In fact, I own a T-shirt that says Rihanna is my Pope, which is a reference to the time that Rihanna dressed as the Pope at the Met Gala. But I didn't even
watch it this year, and I didn't even really pay
attention to any of the coverage or any of the
like online posts about it, in part because Condé Nast

(07:24):
staff were boycotting the Met Gala, so I was like, I'm
not going to pay attention. And also it just seemed
like kind of boring. Not in a, like, I'm above this kind of way, more in a, this just doesn't seem like a good use of my time to be paying attention to, kind of way. I also really get the argument around how much money is spent

(07:44):
at the Met Gala. I've seen people say things like, oh, well, the Met Gala is really expensive, but celebrities aren't actually paying to attend, and of course all the proceeds from the Met Gala go to the Metropolitan Museum of Art's Costume Institute, which is like a worthy cause. And that's true. But I also feel that bringing that up feels like a bit of a dodge of what people are

(08:06):
pointing out when they are talking about the Met Gala, which is this dissonance between the Met Gala and what is going on in the world, which I think is
a valid point, and I don't think people should just
be dismissing that. However, I also don't want this to
be a conversation where women are just being scolded for
enjoying things deemed trivial, right? And it's so easy to fall into that. This could sort of be a bit

(08:29):
of an unpopular opinion, but I do feel a little
bit weird about the entire thing. Like, truthfully, there's been
a humanitarian crisis in Palestine for many, many years, right,
and in all of those years, there were met galas
and rich celebrities doing rich celebrity things, you know. I
think there's an element of people scolding celebrities for partying

(08:50):
while there's a big human rights crisis going on, but
it seems to me that has been the dynamic for
a very long time, and like, I don't know, it
almost sort of reads like, how dare you go to
a party when I have just started being more tuned
into this situation in Palestine that has been ongoing for
many years, which I'm not sure is a useful stance.
But that said, counterpoint, some of these people seem freaking insufferable,

(09:14):
and it kind of almost sounds like they were asking
for it. So I'm talking about people like influencer Haley Kalil, who goes by Haley Baylee on TikTok, who posted a video of herself at a Met Gala pre-event lip syncing to the alleged Marie Antoinette quote let Them
Eat Cake from the Sofia Coppola two thousand and six
movie Marie Antoinette, which is a masterpiece, go see it. However,

(09:38):
I just learned this. Did you know that Marie Antoinette
might not have actually even ever said let them eat cake?

Speaker 4 (09:44):
I did not know that. I thought it was an
actual quote.

Speaker 1 (09:46):
No, historians don't even think she ever actually said that.

Speaker 4 (09:49):
Huh.

Speaker 1 (09:51):
Fascinating. So basically, this influencer, and this is really not something I would have recommended she do: she posted a video from the Met Gala lip syncing Marie Antoinette's Let Them Eat Cake, and
people unsurprisingly did not like this. And so this whole
celebrity blockout thing started with TikTok user Ray, who goes

(10:13):
by Lady from the Outside, who really kicked off the movement,
specifically asking for TikTok users to unfollow Kalil, the influencer who made that Let Them Eat Cake video from the Met Gala, asking for users to block her in particular. But now it has led to people making lists of celebrities to block more generally who have not used their platforms

(10:34):
to speak up. Here's what Ray said about the celebrity
blockout movement on TikTok. She said, we gave them their platforms.
It's time to take it back. Take our views away,
our likes, our comments, our money. Now, there has been
a suggestion that this movement could be making a real
dent in some celebrities' social media followings. Middle East Monitor reported that, according to Social Blade, a US-based social

(10:55):
media analytics website, it has caused some celebrities
to lose followers. They reported that Selena Gomez lost one
million followers on Instagram and one hundred thousand on X. Zendaya lost one hundred and fifty-three thousand followers on Instagram and forty thousand on X. Kim Kardashian lost seven hundred and eighty thousand followers on Instagram, and her sister Kylie Jenner lost five hundred and forty thousand followers on

(11:17):
Instagram and fifty-three thousand followers on X. So I
would say those are like decent chunks of followings that
they have lost. However, if you have millions and millions and millions of followers, it seems like a drop in the bucket. I don't know that they're really going to feel this in their wallets, I guess is what I'm saying, if that is the intent of this movement.

Speaker 4 (11:37):
Yeah, I mean.

Speaker 3 (11:40):
And you said the hashtag was digitine, like a mashup of digital and guillotine?

Speaker 1 (11:47):
That's right.

Speaker 3 (11:48):
Yeah, that's like pretty intense, you know, like you could
be mad at celebrities. But I don't know, I feel
like maybe invoking the French Revolution and the Reign of
terror specifically is like not what we need in twenty
twenty four.

Speaker 4 (12:05):
Maybe these people feel differently.

Speaker 1 (12:06):
Mike doesn't think it's time to roll out the gallows?

Speaker 4 (12:09):
Yeah, like maybe not.

Speaker 3 (12:13):
Yeah, I mean, I'm not necessarily saying that, you know,
who knows who's going to be in power when you know,
I'm not trying to be some sort of counter revolutionary here.

Speaker 1 (12:26):
I mean, I guess I'm a
little bit conflicted. I do think this whole thing represents
where we are when it comes to celebrity right now.
I think that unless you are somebody who is a
hardcore stan of somebody, I don't think that in general
people are really looking to celebrities the same way that
maybe they once were. Ultimately, I personally just do not

(12:48):
have any kind of faith in celebrity whatsoever. Like obviously,
I think it's like nice and good or whatever when
they use their platforms for good, and they should do
that and all of that. But I just think that
we should be looking to celebrities less in general. And truly, if there was one moment in the recent past that I think solidified how useless celebrity is and

(13:09):
how we should also be looking to celebrity less for all things:
It was the twenty twenty Imagine video when it was
the early days of COVID, and celebrities were like, you
know what everyone is in their houses. People are scared.
You know, we can offer the world us singing John
Lennon's Imagine. Do you remember this?

Speaker 4 (13:31):
I do?

Speaker 3 (13:32):
Yeah, maybe you're right. Maybe that was, like... I don't know that that caused the end of celebrity, but it definitely does seem like a

Speaker 4 (13:38):
good indicator that

Speaker 3 (13:42):
things are not like they were, right? Like, uh, yeah, celebrities do not have that place in culture that they used to. They're just like us, right? It's like Page Six everywhere.

Speaker 1 (13:56):
Yeah, I really think of that video. That Imagine video is my Roman Empire. I always wonder, do those celebrities go, well, I only got roped into this because Gal Gadot asked me, or Will Ferrell asked me, he roped me into it? Like, do celebrities look at who convinced them to do that, and think, why did I get mixed up in that? And I really see it in conjunction

(14:18):
with I know you're not gonna know what this is,
so bear with me. But the twenty fourteen selfie that
Ellen took at the Oscars where it's like Ellen and
Bradley Cooper and Jennifer Lawrence and Kevin Spacey, I think
like it's it's like a bunch of celebrities all crowded
into a selfie that Ellen is taking. And I remember

(14:41):
when it came out, everybody was like losing their mind, like,
oh my god, all these celebrities together, Oh, oh my god,
Oh my god, oh my god. And it's just funny now how, like, five years later, six years later, in twenty twenty, collectively we were like, we don't need
celebrities right now. We don't want to see this.

Speaker 4 (14:57):
Yeah, not like that. Yeah, I guess, is it, is it losers now? Is it nobody?

Speaker 1 (15:01):
Is it?

Speaker 4 (15:01):
Mister Beast?

Speaker 1 (15:03):
You keep trying to sprinkle Mister Beast into conversation, and then you will be like, I don't want to talk about him, I don't want any follow-up.
I want to reference him, and I want to move on.

Speaker 4 (15:12):
I guess I'm curious. Yeah, I don't get it.

Speaker 1 (15:17):
You've not done this on the mic, so listeners have no idea what I'm talking about. But in non-recorded conversation you will reference Mister Beast, like, who is this Mister Beast? Why is he everywhere? Why am I reading about him in the New York Times? And I'll start to, like, tell you, and you're like, I don't actually want to know, I just want to, like, raise the question.

Speaker 4 (15:36):
I'm just asking questions, Bridget.

Speaker 1 (15:38):
Yeah, you're just asking questions that need answers about Mister Beast.

Speaker 2 (15:47):
Let's take a quick break. And we're back.

Speaker 1 (16:03):
So you know who else is just asking questions? The FTC, of BetterHelp. And if you get your therapy and mental health services from BetterHelp, you might have gotten notice that you will be getting an insultingly low refund. Soon, around eight hundred thousand customers of the online therapy platform BetterHelp will start getting refund notices related to a
settlement with the Federal Trade Commission because last year Better

(16:26):
Help agreed to pay seven point eight million dollars to
settle the FTC charges that it co opted user data,
including personal health questions, for advertising purposes, sharing sensitive information
with social media platforms like Facebook and Snapchat. So the
FTC alleges that BetterHelp not only failed to get
consent before sharing this data, they also misled users by

(16:48):
promising they would keep their data private but then sharing
it with advertising companies like Meta and Snapchat. Now, BetterHelp did not admit to any of this, although they are paying the settlement. They said in a statement this week
that they were quote deeply committed to the privacy of
our members and we value the trust people put in
us by using our services. So I don't know how
deeply committed to the privacy of their members they can

(17:09):
be if they're just giving their sensitive information to whoever
wants it for money.

Speaker 3 (17:15):
Yeah, and after telling people that they wouldn't do that,
that they would keep their information private. Yeah, those are some
pretty damning allegations.

Speaker 1 (17:22):
And this is something that you have particular expertise in, right, Mike.

Speaker 4 (17:28):
Yeah.

Speaker 3 (17:28):
I mean it's not like, you know, rocket science, the
idea that when you are dealing with people's sensitive health
information and you tell them that it will be kept private,
they could reasonably expect that it would be kept private
and not shared with advertising companies to market goods to them.

Speaker 1 (17:49):
You know.

Speaker 3 (17:49):
I think I was reading the FTC complaint here, and it seemed like that misleading aspect was a big, core part of their complaint.

Speaker 1 (18:04):
So it is not just BetterHelp. Eligible refund customers include anybody who paid for services on BetterHelp or any of its affiliated websites that cater to certain communities, like youth or queer people: places like Teen Counseling, Faithful Counseling,
or Pride Counseling. If you use those services from August first,
twenty seventeen to December thirty first, twenty twenty, you are

(18:26):
eligible for a refund. Okay, so you are a member
of a vulnerable community, right? Your therapy app has just
gotten pinched for selling your deepest, most intimate details that
you thought you were sharing with your mental health professional
and them alone, right? So now BetterHelp is like, okay, maybe we did this. We need to compensate you for

(18:48):
this wrongdoing. What do you think is a dollar amount that is commensurate with what happened there?

Speaker 3 (18:57):
For, like, what is the value of my privacy?
Like how much would it be worth to me to
not unwittingly share personal information about me and my health?
How much would that be worth? Yes, I don't know.
I mean that seems like some pretty pretty big stuff.
It's hard to put a dollar amount on that kind

(19:18):
of personal information, you.

Speaker 1 (19:20):
Know, how about less than ten of them.

Speaker 4 (19:24):
Dollars?

Speaker 1 (19:25):
How about less than ten dollars? People got less than
ten dollars. When I saw that, it's just, I'm
a big proponent of like take your little money, like
even if it's a dollar, get your coffee whatever. Ten
dollars is like insultingly low for what they have been

(19:45):
alleged to have done wrong, selling the most intimate data
and then telling people like don't worry, it's going to
be between you and a doctor and Snapchat and Mark
Zuckerberg and his stylist team. I'm like, whoa, what? Here's how Ella Unchained put it on Threads: BetterHelp sold my data to Facebook, and all I got was

(20:07):
this nine dollar and seventy-two cent refund. Corporate accountability is a joke. And I completely agree. When I saw that amount of money, I was shocked at how low the refunds were. And I think that thread, or whatever we call a post on Threads, really nails how I feel about it: that tech platforms and

(20:28):
lacks policy and lack of accountability have created this dynamic
where nothing private or intimate about us is not for sale.
Even if they tell you that it's not for sale,
it is still for sale, even the stuff that you
talk about with your mental health professional. A dynamic that
we once really understood as private and sacred and intimate,

(20:48):
even that is for sale. And when they sell it
in a way that the government says breaks the law,
they will barely face any consequences for it. And the compensation that you, as the person who was exploited, will get from that will be not just low but insultingly low, embarrassingly low.

Speaker 4 (21:06):
Yeah, you're right.

Speaker 3 (21:07):
It really speaks to the expectation we have that people
don't have privacy, that all of our private information is
available to be sold and traded and used to market
us goods. And I think that's a really destructive norm
to have. And you know, in some ways,

(21:32):
it's just endemic to everything being online, right, Like it's
just a fact of the digital world that every product,
every service is just like one business entity away from
an advertising company who's trying to use your eyeballs and
your attention to sell you stuff, and so there's just

(21:52):
like so much pressure there and incentive to violate people's
privacy and use personal information to surface them advertising. That
just wasn't there in previous eras of like health in
you know, personal health information, when it was all kept
in paper files at a hospital. They just didn't have

(22:16):
the opportunity or the incentive to try to use that
for advertising. Whereas now that's you know, for companies like
Better Help. I think probably a big part of their
business model is advertising.

Speaker 1 (22:29):
Yeah, And I guess, just philosophically, I think that we
should all be stopping to wonder, like, do we want
the most intimate parts of who we are, the parts
that we share when we're vulnerable, the parts that we
share when we're told no one else will hear them
except for this one health professional. Do we want that
to be fair game to be taken from us and

(22:50):
used to exploit us? Like, you know, I went to Catholic school. It would be like if
you did confession and you confessed something, something you were told was between you and your religious leader, and then the next thing you know, it's like, oh, did you cheat on your spouse? Here's an advertisement for cheating on your spouse. You're like, what, wait a minute.

(23:11):
We would not accept that. So, like,
I think it's really like a philosophical thing of like,
do we want these very intimate things that we share
to just be fodder to exploit us further and for
some tech company to make money from us.

Speaker 4 (23:26):
And now, this indulgence has been brought to you by Ashley Madison.

Speaker 1 (23:33):
I'm sure somebody somewhere is working on if it doesn't
exist already, somebody somewhere is working on like a confessional
app where when you make the confession, how they make
money is selling the data about the confession about you.
Someone has that app. You can't convince me that does
not exist.

Speaker 3 (23:52):
Yeah, I totally agree, there need to be more, I don't know, safeguards against this, right? Like, we need a federal law protecting privacy. We don't have that, right? So we're left with companies like BetterHelp. They can do things like this, and, you know,

(24:12):
may point to the fact that it's kind of a
new space and there aren't really rules, so maybe they can
you know, write a check and not have to admit
to any wrongdoing. What we need are some like clear
standards around privacy where you know, as a society and

(24:35):
you know, backed up by laws, we've clearly stated that
this sort of thing is not okay. Privacy is worth something,
and hopefully that is more than nine dollars and seventy
two cents.

Speaker 1 (24:49):
Yes. So the reason that I wanted to talk about
this is because we actually did an episode about BetterHelp in the wake of the Astroworld crowd crush incident that left a bunch of people, including children, dead. I was really shaken up by that story, especially when,
in the wake of that the organizers announced that they
were going to be offering free, like a free trial

(25:09):
of BetterHelp mental health services. That whole thing left a terrible taste in my mouth, from the fact that it was BetterHelp to begin with, and not, like, go talk to a licensed mental health professional in your community or whatever, to the fact that I think they were offering like
three months or something. I'm like, oh, yeah, if you
almost died in a crowd crush incident, you might need

(25:29):
more than a free BetterHelp trial to help you
navigate that. And it just felt to me like, Oh,
these people were harmed, and now to compensate for that harm,
they are going to be served up to another platform
that will only further exploit and take from them. I
found it really despicable. And so one of the things
that came up in that episode that we did about

(25:50):
BetterHelp was the relationship that BetterHelp has with
the podcasting industry. If you listen to podcasts, which I
guess you obviously do because you're hearing me say this
right now, or I guess maybe you're my upstairs neighbor who is, like, listening to me record this, I bet you know: if you listen to podcasts, you've probably heard an ad for BetterHelp or two. According

(26:10):
to NBC, in March BetterHelp spent eight point three million dollars on podcast advertising, nearly double the next biggest sponsor, which is Amazon. And also, just to put that in context, they spent that eight point three million dollars on podcast advertising in March alone. What they had to pay for
misusing the data of almost a million people was seven

(26:33):
point eight million dollars. So really, it's like, when
I say that, it feels like a slap on the
wrist or a drop in the bucket to them. That
is what I mean. They could just pull their advertising
from podcasts for a month and it would still be
more than they paid out to these people whose data
they misused. And despite the fact that we have made
episodes talking about BetterHelp and their sometimes not so

(26:55):
great policies, they have reached out to us to advertise
on this podcast. And it kind of goes back to
something that came up in our Andrew Huberman episodes, that,
you know, podcasting is this weird space where it's a business,
and that in a lot of cases you just have
to take ads to make sure that the people who
make the show can continue to be paid. But you

(27:15):
walk this line of not wanting to take money from
companies whose practices are not on the level with wanting
to keep the lights on. And so I don't want
to make it seem like I'm gloating that, like, oh,
podcasters who made ads for BetterHelp are bad and
they should be doing XYZ, because I recognize, like, as
a podcaster, I completely get that it is complicated, but

(27:38):
because BetterHelp took up so much space in the podcasting landscape, I just think it's important to talk about this on a podcast: if you've heard BetterHelp ads and you've thought about using BetterHelp, this is just information and context that you should know about how the government says they have been exploiting people who pay for their services. Okay, so quick heads up that this

(28:01):
story does involve sex crimes against children. A New York
Times piece just added a lot more context to a
story that we've been talking about a ton on the podcast,
and that is the way that Instagram has been knowingly
connecting children to adult men who are sexually interested in kids.
So people who run brands related to children, for products

(28:22):
like children's jewelry or children's clothing, will run paid ads
on Instagram, and they'll tell Instagram like, I want to
show these ads to people who are interested in things
related to kids, like obviously trying to reach moms who
might buy kids clothing or kids jewelry, which makes a
ton of sense. But then they will look at their
back end data, and that data will show them that
Instagram is actually serving those ads, which are pictures of

(28:46):
kids using these products, to adult men. This whole thing
started when a mom happened to reach out to the
New York Times after seeing their previous reporting. As The New York Times writes: When a children's jewelry maker began
advertising on Instagram, she promoted photos of a five year
old girl wearing a sparkly charm to users interested in
parenting children, ballet, and other topics identified by Meta as

(29:09):
appealing mostly to women. But when the merchant got the
automated results of her ad campaign from Instagram, the opposite
had happened. The ads had gone almost entirely to adult men.
Perplexed and concerned, the merchant contacted The New York Times,
which had recently published multiple articles about the abuse of
children on social media platforms. In February, the Times investigated
Instagram accounts run by parents for their young daughters and

(29:32):
the dark underworld of men who have sexualized interactions with
those accounts. With the photos from the jewelry ads in hand,
the Times set out to understand why they attracted an unwanted audience. Test ads run by The Times using the
same photos with no text not only replicated the merchant's experience,
they drew the attention of convicted sex offenders and other

(29:52):
men whose accounts indicated a sexual interest in children or
who wrote sexual messages. So The Times was like, we're
going to get to the bottom of this. They did an investigation
where their reporters opened two Instagram accounts and promoted posts
showing a five-year-old girl, her face turned away from the camera, wearing a tank top and a charm. Then separate posts showed the clothing and jewelry without the

(30:13):
child model or with a black box kind of covering
her up. All of the paid ads were promoted to
people interested in topics like childhood, dance and cheerleading, which
Meta's audience tools estimated as predominantly women. What they found
from this experiment is horrifying. Aside from reaching a surprisingly
large proportion of men, the ads got direct responses from

(30:34):
dozens of Instagram users, including phone calls from two accused
sex offenders, offers to pay the child for sexual acts,
and professions of love.

Speaker 3 (30:45):
Ugh. That is gross. That seems like the sort of thing that law enforcement should be involved in.

Speaker 1 (30:53):
Well yeah, and I guess what makes it even more
horrifying is that all of this really suggests that the
platform's algorithms are playing a role in directing these men to photos of children to sexualize and DM and reach out to. And so previously we had talked about this
bombshell New York Times investigation about how Instagram was using
a loophole to allow grown adult men to connect directly

(31:17):
with children as young as, like, five, six, seven
on the platform by allowing men to subscribe to these
like Mommy Run accounts that were set up on behalf
of the children, because adults are not meant to be
able to subscribe to the accounts of kids. But if
the account is technically like a mom run account, that
is a loophole where adults can subscribe to the content

(31:41):
of kids. And like this New York Times investigation showed
that people who worked at Facebook were like, oh, this
seems to be a loophole that is allowing grown men
to connect with children. So Facebook was certainly aware this
was happening, like their own engineers were telling them this
was happening. And so this new report says there is
a lot of overlap between the men who are connecting

(32:03):
to children via these mom run accounts and the men
for whom the algorithm is surfacing this paid ad content
involving kids. The New York Times writes: an analysis of
the users who interacted with these ads posted by the
Times found an overlap between these two worlds. About three
dozen of the men followed child influencer accounts that were
run by parents and were previously studied by The Times.

(32:25):
One followed one hundred and forty of them. In addition,
nearly one hundred of the men followed accounts featuring or
advertising adult pornography, which is barred under Instagram's rules. So
all of this is like, pretty horrifying, pretty terrible. Do
you want to take a guess at what Instagram's response to all of this was?

Speaker 4 (32:44):
Was it like a snappy sweater and a silver chain?

Speaker 1 (32:48):
It was! Have you seen Mark Zuckerberg's new look? People
are saying he has a hashtag glow up.

Speaker 4 (32:55):
No.

Speaker 1 (32:55):
Their response was it's really not that big of a deal.
Danny Lever, a spokesperson for Meta, dismissed the Times ad
tests as quote a manufactured experience that failed to account
for the many factors that contribute to who ultimately sees
an ad, and suggested that it was flawed and unsound
to draw conclusions from limited data. This really reminds me
of how Twitter responded when Media Matters and other watchdog

(33:18):
organizations were pointing out these dark pockets where nefarious people were using the platform to gather, and they were like, oh, well,
you would only see that if you met xyz circumstances.
That's manufactured data. But that is exactly the point, Like
pedophiles are taking advantage of these unexamined corners of the

(33:38):
platform to connect with kids. So even if they're saying, well,
most people wouldn't get these results and most people wouldn't
have this experience, that is exactly what these people are saying,
is that pedophiles are taking advantage of the fact that
this is a little bit under the radar to operate
in plain sight. So the way that Facebook is responding makes me feel that they don't really understand what the actual threat is.

Speaker 3 (34:00):
Yeah, I totally agree. It sounds a lot like the way Twitter responded to that Media Matters investigation of, like, how easy it was to get to hateful content through specific searches. But
this actually feels substantially different from that. When you were

(34:23):
describing what The Times did, it sounded like a quite
good experiment that was set up using paid ads to
see what the algorithm would do and, you know, who it would surface this ad to.

Speaker 4 (34:36):
So, you know, this spokesperson can say
that the methodology was flawed and unsound, but when you
described it to me, it sounded pretty sound.

Speaker 1 (34:49):
Well, what's interesting is that the New York Times talked
to like former Facebook engineers, and then they talked to
Peter Sapienski, a research scientist at Northeastern University who specializes
testing online algorithms, and he basically said that it sounds
like these algorithms are working as intended and as designed
by Facebook. So what's really interesting to me is that

(35:12):
the potential algorithmic reason that this is happening, which is
that most ads on Instagram are designed to reach women
because women are more likely to be buying stuff from
Instagram ads, and so they're a more competitive audience. However, reaching men is a lot easier. Piotr Sapiezynski, that research scientist at Northeastern, put it this way. He said that

(35:32):
advertisers compete with one another to reach women because they
dominate US consumer spending. He said, as a result, the
algorithm probably focused on highly interested, easier-to-reach men who had interacted with similar content. The men do engage, he said. The machine is doing exactly what you want
it to do.

Speaker 3 (35:50):
Yeah, Like, on one level, that is what the machine
is supposed to do. On another level, it seems pretty
bad that that is what the machine is supposed to do, Like,
there should probably be something to prevent that from happening.

Speaker 1 (36:07):
So, horrifying as all of this is, The Times also looked into the kind of men who were DMing
and engaging with these accounts, some of whom use their
real names or link to their real identities from their
Facebook pages, and some of them have convictions for sex
crimes against children. Now, per Meta's rules, people who are
sex offenders or have been convicted of sex crimes against

(36:28):
kids are not allowed to have Instagram accounts. However, these
people did have Instagram accounts, and it's not clear how. They're meant to be registering their email addresses, and Meta has a program that
is meant to keep them from having these accounts. I
don't know why or how, but that did not work.
So when The New York Times used Meta's internal tool

(36:50):
to flag these accounts and tell Facebook like, hey, these
people have been convicted of sex crimes against kids, or hey,
these people are on sex offender registries. Per your rules,
they should not have Instagram accounts. They used Meta's own
internal tools to flag these accounts, and Meta did not
remove them. According to The New York Times, the accounts
remained online for about a week until the Times flagged

(37:12):
them to a company spokesperson. And so I guess this is
my point. If you have these rules in place but
then do nothing to make sure that they are followed,
in what way is it a meaningful rule? The fact
that your own tool that you internally used to flag
accounts that should not be on your platform, that The
New York Times could be flagging people who have been

(37:33):
convicted of sex crimes against kids that you know are
then using your platform to try to connect with children sexually,
and you just leave them up until the New York
Times has to directly reach out to a spokesperson. I
would say that something is very broken, but per the
algorithm expert at Northeastern, things are working as intended. And so,

(37:55):
you know, it's really hard to say that something is
broken when it actually kind of seems like things are
working the way they were designed to work.

Speaker 4 (38:04):
Yeah, well said.

Speaker 1 (38:10):
More after a quick break.

Speaker 2 (38:10):
Let's get right back into it.

Speaker 1 (38:27):
Okay, so we have to talk about what is being called on the internet the Bumble fumble. So there's a ton going on in the dating app scene right now, namely that a lot of women are just fed up with dating apps. As y'all might recall, Bumble, the dating app founded by Whitney Wolfe Herd, used to have this
thing where their whole sort of gimmick was that the

(38:48):
women on the dating app would have to message first,
and you couldn't talk to a woman without her being
the first person to initiate conversation. But last week they
announced they are not doing that anymore. According to NPR, women can now add prompts to their profiles for men to respond to. For same-sex and non-binary users, either person can set and respond to these prompts, called opening moves.

(39:11):
This is all happening against a backdrop of I guess
what you could call dating app fatigue. People are sort
of sick of dating apps, of the idea that you would have to put so much work in to actually find a suitable mate. I also think there's some screen-time fatigue happening, the idea that you would be on your phone, on your screen, all the

(39:33):
time and that that is the key to finding your mate.
I think a lot of people are just fed up
with that. So listen, I have to say, like, I
am fascinated by the dating app Bumble as a company.
When it first started, the creator, Whitney Wolfe Herd, was
working at Tinder, and in twenty fourteen, she was part
of a team that launched Bumble after this like pretty

(39:55):
tumultuous departure from Tinder, which she also helped launch. According
to NPR, she had this like string of bad relationships,
one of which even involved a sexual harassment lawsuit against
one of Tinder's co-founders, which was later settled. So
she used all of this experience to build a platform
that put more power in the hands of women. So

(40:16):
all of that sounds great, right? Like, more power, and the women messaging first, that gives women a lot more autonomy, right? Sounds great. However, NPR spoke to Damona Hoffman, an online dating coach and the author of the book F the Fairy Tale, who basically said this whole women-message-first thing just leaves a lot of women feeling

(40:39):
like they have to do the bulk of the work
on dating sites, and like the bulk of the work
to keep the conversation going, keep things moving, is on them,
which if you are already working a full time job,
who wants dating to feel like another job? Like dating
is supposed to be fun, it's supposed to be enjoyable.
Who wants something that's like, oh yeah, more work piled

(41:00):
on? And because it's dating apps and they're, like, algorithmically generated, at the end of the day, they're only surfacing you the duds.

Speaker 4 (41:08):
Not the duds! Oh man, it's a rough group to
be in the dud group.

Speaker 1 (41:15):
Yeah, I mean, your mileage may vary on what a dud is to you, but you know a dud when
you see one. And algorithms know who the duds are,
and those are the only profiles they surface to
you unless you are willing to pay for premium. Then
they might show you somebody that you might actually want
to have sex with, But until you pay up, it's.

Speaker 4 (41:35):
Duds, poor broke duds.

Speaker 1 (41:39):
So I think all of this is kind of adding
to this backdrop of dating app fatigue that people, particularly women,
are feeling with these apps. However, Whitney Wolfe Herd, the founder of Bumble, actually has suggested an answer.
Your AI could just go on a date for you.
Here's what she said. If you want to get really

(42:01):
out there, there's a world where your AI dating concierge could go on a date for you with other dating concierges, truly,
and then you don't have to talk to six hundred people.
It will scan all of San Francisco for you and
say these are the three people you really ought to meet. So,
for example, you could in the near future be talking
to your AI dating concierge and share your insecurities. I've

(42:22):
just come out of a breakup, I've got commitment issues,
and it could help you train yourself into a better
way of thinking about yourself.

Speaker 3 (42:29):
Sounds nice but also creepy and also like, come on,
that's that's not gonna work.

Speaker 1 (42:35):
I would be remiss if I did not mention there
was literally a Black Mirror episode about two people's AIs dating each other within the framework of a dating app,
and like that's how they find each other and know
they're compatible. So, given the story that we talked about
earlier about BetterHelp, about how they just give the
most intimate data and information about us to Facebook and

(42:57):
Snapchat and whoever else for money, I think we really
should be wary of this kind of AI integration into
our romantic and sexual lives. So I know that the
idea of an AI dating concierge is very different from
like an AI enabled chat bot or love bot or
sex bot or romantic partner. But Mozilla Foundation looked into

(43:19):
it and found in an analysis of eleven romantic chatbot
apps released in February that nearly every app tested sells
user data, shares it for advertising, or doesn't provide adequate
information about any of these points in its privacy policies.
So basically it's just the wild wild West already when

(43:40):
it comes to AI and romance, and I think these
kinds of things become even more concerning for like queer
or gender expansive people, like are you someplace where it's
safe to have AI offering up intimate information about who
you are and your identity to a potential date that
you yourself have not personally vetted, Like are you safe

(44:01):
if that information has pretty much no policies around who
that platform is going to share it with, how they
will share it with them, what third parties will have
access to it? You know, I have said this before that,
when you're talking about connecting with people in ways that
are intimate and happening IRL, safety and privacy isn't just

(44:23):
a nice to have, Like it genuinely does matter. And
I'm just not sure that I would trust AI designed
by, like, mostly white straight cis men that we know
is like buggy and full of problems to do that
for me if the stakes really were high.

Speaker 4 (44:38):
Yeah, the stakes definitely do seem pretty high.

Speaker 1 (44:41):
Yeah, especially if you're dating someplace where it's not safe
to be open about your identity, if you're queer or trans or gay. Like, this is real. So we have to
talk about those Bumble ads, which are sort of being
referred to as the Bumble fumble. When I first saw this reported, I was like, well, they're really missing an opportunity to, like, make Bumble fumble a thing. But we

(45:02):
got there eventually. So, the Bumble fumble: basically, Bumble released these billboards saying things like, you know full well a vow of celibacy is not the answer, and, thou shalt not give up on dating and become a nun. The ads were
criticized and Bumble apologized, saying we made a mistake. Our
ads referencing celibacy were an attempt to lean into a

(45:25):
community frustrated by modern dating, and instead of bringing joy
and humor, we unintentionally did the opposite. So I actually
saw a lot of the criticism of these billboards and
people were saying like, oh, they're knocking celibacy as a
lifestyle all of that. Totally get that, but I actually
have a different take. So I think that if anything,

(45:48):
the problem that Bumble was actually like making light of
in a way that was in poor taste is that
Bumble is just no longer serving women, who they say
are their main intended user base. So more
and more women are just getting off of these platforms,
and on top of that, more and more bots are

(46:09):
on these platforms too. Like I recently watched the Ashley
Madison documentary on Netflix, and it is so wild how
they're basically like, oh yeah, on top of the like
cheating most of the time, we were just like charging
stupid men twenty dollars a month to chat with bots
and other men and themselves. Right, So, like, dating apps

(46:31):
are kind of a like house of cards where women
have been like this sucks, We're leaving, and these platforms
are like, oh god, we have to charge men who
want to be in conversation with women for something, and
a lot of that is bots. And so I actually
see these ads as almost this, like, weird tone-deaf

(46:51):
plea to women to come back to these platforms, even
though these platforms are kind of acknowledging that they failed
them because the platforms need women on them to make money.
And so I totally see what people who are saying like, oh,
this was a dig on celibacy are saying, but I
think it's actually more insulting than that. I think what
these platforms are saying is like, listen, women, we need

(47:15):
you on this platform to make money. So like, what
are you gonna do? Be celibate? No way, come back
to our platform.

Speaker 3 (47:22):
Yeah, it does feel a little like desperate and like
trying to be edgy, but just coming off as desperate,
which is definitely not a good look if you've ever
been on a dating

Speaker 1 (47:33):
platform. Especially if you are the dating platform. And I
think desperate is the right word for it. And I
just feel like I wish we lived in a world
where what these platforms were offering was functionally and meaningfully
just something better, something that feels like respect and fun

(47:54):
and exploration and excitement, not just fatigue. Like, I don't know, it mirrors a lot of the different ways that people are feeling about digital experiences these days, that it's
just fatigue. So Bumble really fumbled on this one. I'm
glad they apologized, But I think the problem is deeper
than just these billboards. I think that they are not

(48:17):
serving women. Women are smart enough to not stick around
where they are not having good experiences and where it's
clear that they, you know, aren't wanted or aren't valued. And I think that's a real problem, not just for the women that they're failing to serve, but also for their business model. Like, you can't just have a dynamic where women feel like they're

(48:37):
being mistreated and not served and expect them to continue
showing up for more of the same.

Speaker 3 (48:44):
Yeah. And maybe the problem is even broader and deeper, you know. The problem they're trying to solve is people lacking human connection in their lives. And there's
just the fundamental tension of offering people a digital tool

(49:05):
to solve that problem, where the financial success of that
digital tool depends on people continuing to use that tool
and continuing to engage with it and pay subscription fees
month after month.

Speaker 4 (49:19):
Yes, you know you can. You can have one, but
you can't have them both.

Speaker 1 (49:22):
And I fundamentally believe... you know, later on in this conversation with Bloomberg that Whitney Wolfe Herd had
about the future of online dating, she was talking about
this loneliness epidemic that we're all in, and that's very real.
But I'm not sure that the people who have given

(49:44):
us our current tech and digital landscape are the same
ones that I would trust or want to be using
technology to solve our current loneliness crisis. Like the people
that got us into our current situation are not the
people that I would trust to get us out of it.
Like if you drove my car into a ditch, I
don't think I would trust you to drive it out

(50:05):
of the ditch. I might say it's time for you
to sit in the back seat let somebody else take
a turn at the wheel. Yeah, well, Mike, thanks for
being in the passenger seat on this journey through this
week's tech stories.

Speaker 3 (50:17):
I appreciate it, Bridget. It's always a pleasant ride, and
let's do it again sometime soon.

Speaker 1 (50:24):
And thanks to all of you for listening. I will
see you on the Internet. If you're looking for ways
to support the show, check out our merch store at
tangoti dot com slash store. Got a story about an
interesting thing in tech, or just want to say hi,
You can reach us at hello at tangoti dot com.

(50:45):
You can also find transcripts for today's episode at tangoti
dot com. There Are No Girls on the Internet was
created by me Bridget Todd. It's a production of iHeartRadio
and Unbossed Creative, edited by Joey Patt. Jonathan Strickland is our executive producer. Tari Harrison is our producer and sound engineer.
Michael Amado is our contributing producer. I'm your host, Bridget Todd.
If you want to help us grow, rate and review.

Speaker 2 (51:05):
Us on Apple Podcasts.

Speaker 1 (51:07):
For more podcasts from iHeartRadio, check out the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts.