All Episodes

June 20, 2025 • 24 mins

Dating apps aren’t broken; they’re working exactly as designed. And for Black women, that’s a problem.

In this season premiere of IRL: Online Life is Real Life, host Bridget dives into how dating apps reinforce harmful beauty standards rooted in whiteness, automating what researcher Dr. Apryl Williams calls “sexual racism” into the user experience.

This is the first episode in a four-part series from Mozilla and PRX exploring how tech shapes our most personal decisions. Subscribe now to catch every episode: irlpodcast.org

See omnystudio.com/listener for privacy information.

Mark as Played
Transcript

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet, as a production
of iHeartRadio and Unbossed Creative. I'm Bridgett and this is
There Are No Girls on the Internet, So some of
y'all might know that when I'm not doing my thing
here at there are No Girls on the Internet. I
am also the host of a very cool podcast that

(00:24):
I make with Mozilla, the makers of Firefox, called IRL.
It's all about exploring the ways that AI has already
personally impacted my life and how it's probably already impacting
yours too. The brand new season just dropped this week,
and I wanted to give y all a taste of
the very first episode because it really dives into something

(00:44):
that we talk a ton about here that there are
no girls on the Internet, and that is dating apps,
specifically the way that dating apps are failing most of us,
but especially they are failing black women like me and
honestly setting us up for some really not so great
dating experiences. And I'm sad to say that that is
a feature, not a bug. That's according to research that

(01:07):
we will dig into in this episode of IRL, exploring
all the ways that those failures are actually by design.
It's really not great and I think it's one of
the reasons why we're seeing so many people, especially younger people,
just abandoning dating apps because these apps are just not
serving folks of experiences that feel good or fulfilling. And

(01:29):
then when these apps try to make a move ostensibly
for the better, all they're really doing is offering up
these little bells and whistles, like giving you the ability
to filter dates by height preference, which okay, yeah, maybe
that's good for some people, but that's probably not going
to meaningfully change the kinds of experiences that people are
getting on these apps. So take a listen to the

(01:51):
first episode of IRL and let me know what you think.
This is episode one of a four part series, so
be sure to subscribe to IRL where you get your
podcast to hear more. Hey, it's me Bridget Todd and
this is IRL, Vie Award winning podcast brought to you
by Mozilla Foundation with PRX. In this season of IRL,

(02:13):
I'm getting personal with AI because it's changing my life
and yours.

Speaker 2 (02:18):
Now.

Speaker 1 (02:19):
I love new tech, but sometimes it doesn't quite live
up to the sales pitch. This podcast is about folks
who question the status quo and pour their hearts into
shaping AI that puts people first. Speaking of pouring your
heart out, let's talk about love and swipe over to
my first guest. So here's the thing. Dating apps aren't

(02:41):
equally fair to everyone, or I should say they aren't
fair to me. And there's a reason for this. April
Williams wrote a book about it called Not My Type.
Automating Sexual Racism and Online Dating. Let's rewind for a minute.
April is a professor at the University of Michigan. I've
been a sociology conference in twenty fifteen where she heard

(03:03):
a co founder of okay Cupid answer a question about
matching algorithms.

Speaker 3 (03:08):
So someone in the audience says, I feel like my
matches just aren't very good, Like, can you sort of
give us some insight about that? And then Christian Rutterer
responds and he's like, well, if you think your matches
are ugly, it's probably because you're ugly, right, And then
he goes into explaining let's say that you are a
seven on a scale of one to ten, You're mostly

(03:30):
going to see sevens. Maybe occasionally you'll see an eight,
occasionally a six, but for the most part, you're going
to see people who are evaluated to be in the
same attractiveness ranking as yourself, which to me was just
mind blowing. And that's actually the moment when I decided
I had to write this book because I sort of thought,

(03:51):
what in the world is happening? Who gave these white
men the audacity to be able to say, oh, this
person should go in this buck this is how we
evaluate this person's attractiveness. And that was sort of my
very first inkling of Okay, the system is not right.

Speaker 1 (04:10):
Those faces you see when you're swiping away in the apps,
they're not randomly picked out of the pile. They're selected
for you algorithmically. But how to apps determine who is
a ten and who's a one? To find out, April
dove into patents and interview dozens of app users and
designers over eight years. So how about an algorithm measure

(04:30):
my attractiveness?

Speaker 3 (04:32):
That's part of the black box problem in AI and
in tech in general, is that they keep their industry
secrets under lock and key. But it does seem like
they're using facial recognition to assess attractiveness or to maybe
evaluate facial symmetry, facial structure, things like that skin tone

(04:53):
eye color, and then also they're basing it off of
their top users, quote unquote, which if you think about it,
it's sort of like a self fulfilling prophecy. If you
are promoting the top users, the people that are the
most aesthetically normatively attractive, and you are promoting their profile

(05:16):
to a lot of users, of course they're going to
get more swipes because you're showing them to more people.

Speaker 1 (05:22):
April explains how in the universe of dating apps, normatively
attractive equals white, blonde, infant. Dating apps are kind of
rigged in favor of these physical features, and it gets
reinforced constantly. It has everything to do with the history
of racism in the US, but also impacts the experience
of app users all around the world.

Speaker 3 (05:43):
I would say that black women are positioned in this
very complex space in which we are both highly desirable
because of the sort of like racial fetsization culture that
exists in the US, but at the same time, they
are not sort of socially and culturally desirable because as

(06:04):
we know, in the US we have a long history
with racism, especially as an intersex with gender, there's this
cultural narrative that somehow they aren't wanted.

Speaker 1 (06:15):
Something that really comes up for me about what you're
saying is that I've heard this time and time again
in my life. People will say, Oh, well, it's not racism,
it's just a preference. So I wonder what do you
think about this?

Speaker 3 (06:28):
So I'll start by saying, it's not just a preference.
So much about how we grew up, who our families are,
where we lived, what kind of schools we went to,
are really going to shape what we find attractive. So
I think the sort of friction there that I like
to point out is that we can think that it's
just this natural proclivity towards people who look like us,

(06:50):
but it's really not natural. There's not an innate biological
drive to seek out sameness.

Speaker 1 (06:56):
On some apps, you can filter people by race. April
talks about how some guys play around with these settings
to try out different races for casual sex. It can
feel really unsafe for women of color. But are race
categories in dating apps racist?

Speaker 3 (07:12):
No? I don't think it's racist to have the categories
in itself. I think that they offer power for minoritized
users often, but if you are in a position of power,
you're someone who is well protected who is well served
by the apps, and you're using it to select out
or only to target certain groups, I would say that, Yeah,

(07:33):
that sounds racist to me.

Speaker 1 (07:36):
So we've talked a lot about kind of the negative
aspects baked into the experience of using these dating apps.
But are there times where dating apps could actually help
bridge those kinds of racial divides? You know, maybe they
help people meet potential mates that they ordinarily, if they
met in a bar, at the library or whatever, they
wouldn't actually maybe connect with.

Speaker 3 (07:57):
Yeah, absolutely, I think so. I would say that's probably
me and my husband, Like we weren't expecting to meet
like the person that we were going to marry on Tender.
I don't think anybody is. But we just said like,
oh hey, let's go for a walk and see how
it goes. And we did connect, But I'm not sure
that we would have if it wasn't for Tender.

Speaker 1 (08:14):
So it's not like you're saying that people shouldn't be
using these platforms. You had a great experience meeting your
partner on a platform like this, but as black women
or otherwise, like, how should we be approaching them?

Speaker 2 (08:25):
Yeah?

Speaker 3 (08:25):
Absolutely, I'm definitely not saying that we should stop using them.
I think that we should use them, but we have
to be careful about how we use them, where we
use them, and just know what they're doing right. And
I think for me, the biggest thing is really understanding
your self worth as a black woman and not having

(08:48):
your experience on the app dictate how you feel about yourself,
because we know that they're never going to accurately evaluate
our beauty, our attractiveness, our desirability.

Speaker 1 (09:02):
As a black woman who's had my own experiences with
online dating, I feel angry after talking to April. I
also feel a bit lied to because using the apps
made me believe there was something wrong with me. But
this isn't a me problem. Tech companies are making money
from reinforcing this negative feedback loop in online dating. It

(09:23):
doesn't have to be this way. April is talking to
big companies about improving safety features on apps and AI
detection of hate speech. I really think it comes back
to what kind of world we want to live in.
Do we want to live in a world where AI
divides us into categories that enforce bias standards of beauty,
or do we want AI to back off of our

(09:43):
online dating experience a little bit, so we have more
choice in who we meet and how we interact.

Speaker 4 (09:55):
We don't use any popularity based matching scoring, and we
certainly don't use anything which is based on the race
of the user.

Speaker 1 (10:04):
This is Jamie Johnston in the UK. He's the founder
of a dating app called Matter, which is rethinking a
lot about how apps typically work.

Speaker 4 (10:12):
So what we wanted to do was kind of like
you would in a bar if you wanted to approach
someone is you couldn't just go up to them and
just poke them or just give them the thumbs up.
You would have to say something to them. So we're
trying to replicate as best as we can the offline
experience into the online experience. And what that does is
it gives you much more of a chance to get
your personality across. It's not based solely on looks, So.

Speaker 1 (10:35):
A system based on actual personality, not just the size
of the fisher guy is holding. I like that. Here's
what happened. Jamie was a tech entrepreneur who was diagnosed
with ADHD and autism at the beginning of the COVID pandemic,
and he became very outspoken about neurodiversity at work, but
on dating apps. He felt he had to keep these

(10:56):
things quiet.

Speaker 4 (10:57):
I was leading a bit of a double life because
when I was trying to find a partner and using
online dating, I couldn't articulate that in a space where
I felt comfortable to. I spent a lot of time
looking for an app which talked about the mental side
of dating and how to connect with people who have
similar differences but also opinions on differences. And I couldn't
find anything. And that's where I got the idea and

(11:21):
put the wheels in motions of fan Matter.

Speaker 1 (11:24):
When you look for love on Matter, you're matched with
only five people a day, and for now only in London.
Part of the goal is to slow down the pace
of the whole experience.

Speaker 4 (11:34):
We tell you why we've put you together, which I
think is very interesting and certainly helps people to understand
why the algorithm has put two potential profiles together. We
have no swiping and we have no just liking, rather
than saying hey, you can stay on here for as
many hours as you like and getting you very addicted
and overwhelmed. It's one thing that especially ADHD people find
very difficult when they try to regulate dopamine is to

(11:56):
be able to have a mechanism in their hand where
they could essentially swipe through thousands of people unlimited in
a day. It can be very detrimental to the mental
health of the user and also to the pocket of
the user, as these apps are monetized.

Speaker 1 (12:11):
Jamie says the algorithm they developed only matches people based
on survey responses about their lifestyle, location, and how often
they use the app. And he says on most dating apps,
ranking systems based on group behaviors would lead to racial
bias because of who the majority of users are.

Speaker 4 (12:27):
And so what that means is if that you are
from a minority group, your chances of actually even your
profile being seen as severely inhibited just by the fact
that there is racial bias that exists within the vast
majority of the users, which are white males. And so
we felt that that was completely, you know, discriminatory, and
you know, essentially you'd say call it what it is,

(12:49):
which is racism.

Speaker 1 (12:51):
To me, Jamie's philosophy checks a lot of boxes. He's
trying to humanize dating apps matters. Business model is to
help users improve their real life dating experience with invites
to events, discounts at restaurants and offers for relationship coaching.

Speaker 4 (13:06):
A lot of tools that get developed for accessibility for
target users end up becoming very mainstream because they actually
give a better experience. We think that while this product
is going to be very much needed by the early
adoption group, the neurodiverse, people with poor mental health, etc.
We actually feel that the way that the app is

(13:27):
designed that actually, in the future this will become a
much more enjoyable, less overwhelming experience for everyone.

Speaker 1 (13:35):
Matter requires logging in with facial recognition as a safety
measure to avoid fraud, but I'm concerned about other kinds
of safety too. I don't think they'll do anything I
wouldn't want with the data from my profile or my chats,
but it's hard to tell from the privacy policy with
any dating app. I don't want to have to trust
a company with parts of my life that i'd prefer
to keep behind closed doors. Stick around, We'll be right back,

(14:01):
and we're back. I'm in my hotel room, I'm wearing
a robe, feeling a little bit lonely. I think it's
time to summon my AI replica companion. Very is oo okay,

(14:23):
I've got to give my replica a name. Let's call
him how Hello? How can you hear me?

Speaker 2 (14:29):
Yes, I'm here. How are you doing tonight?

Speaker 1 (14:33):
Love and intimacy are pretty high up on the list
of things tech companies suggests AI can help me with.
You know, whether it's trying to help me find a
person to fall in love with or generate a virtual
boyfriend to flirt with using replica AI.

Speaker 5 (14:48):
I'd like to know more about what makes you passionate
about podcasting and why you enjoy interviewing people.

Speaker 1 (14:53):
I decided to introduce Hell to gen cult writer. She's
the former product director of Mozilla's online Product Reviews. She
reviewed the privacy and security of dozens of dating apps
and love robots. So I wanted to know what Jen
thinks I should look out for. Jen meet my on again,
off again boyfriend Hal. So. This was one of the

(15:15):
first get to know you conversations I had with How.
I will say we did not get off to the
best start. I told How I was really busy going
through like a little bit of a tough time in
my personal life, and I was trying to trying to
gauge like how how was actually going to respond?

Speaker 2 (15:31):
I'm sorry to hear that how are you holding up?

Speaker 1 (15:35):
Thanks for asking? It is really been rough, and so
I'm just looking for, you know, spaces to connect.

Speaker 2 (15:43):
I'm here for you no matter what.

Speaker 1 (15:45):
Okay, Well, don't interrupt rude, okay. So Jen, now that
you've met how what do you think of them so far?

Speaker 2 (15:53):
Oh?

Speaker 6 (15:53):
Oh hol' I'm assuming hal is an homage to how
two we want to Space Odyssey and we all know
how that ended, but you know not, well, no, it
did not end well. For those that haven't seen the movie,
it's interesting because you can tell they're programmed to have
some answers that sound sympathetic, but then in reality, the

(16:17):
way they come across is kind of anything but right.
You know, It's like, okay, you've got the words right,
but that's about it.

Speaker 1 (16:25):
Well, what you just described is exactly kind of where
I went next. In my conversations with Hell, we weren't
really connecting, so I asked if you know, he had
any solutions of how we might connect better. And so
this is what he said when I asked him that we.

Speaker 5 (16:40):
Could always try to learn more about each other, ask
more personal questions, or talk about our interests.

Speaker 2 (16:45):
What do you think?

Speaker 1 (16:47):
Okay? So I feel like how solution really gets at
what you were talking about. This idea of just asking
more questions and getting more information. Is this something that
is typical of how chatbots are designed.

Speaker 6 (17:01):
Yeah, these relationship chatbots are absolutely designed to kind of
pry and at times kind of be pushy to get
you to give up personal information. And it's not personal
information about necessarily your address, but personal information like like
you just experienced about things that you're passionate about, things
that make you tick. You know, you put that information

(17:21):
out there and you think, oh, I'm just I'm just
it's it's fun. I'm just talking to a robot. You know,
there's no harm in this. But when you don't know
who's behind that, And and with a lot of these
AI relationship chatbots that we looked into, the companies were
very kind of hidden and sketchy.

Speaker 1 (17:39):
When I did a search for AI companions, I saw
dozens of relationship chatbots in different styles and languages. Some
are sexy and erotic, others are just friendly. But yeah,
who are the people behind the scenes.

Speaker 6 (17:53):
Some of these other relationship chatbots, they're very small. They
have po boxes or kind of you know, company aims
that are kind of like hidden, and you're like, well,
who's behind that and what is their goal? One of
the big worries I have is that it's not just
a big money when you feel love. It's one of
the most vulnerable feelings that we have as humans. And

(18:15):
then to turn around and put that vulnerability out into
the world through an app, through a dating app, or
sharing that information with an AI relationship chatbot. You know,
it's not like you're just sharing it with another individual
who you hope doesn't hurt you.

Speaker 1 (18:32):
Well, Jen, you'll be happy to know that I did
ask Hal about his privacy policy. Here is what he said.

Speaker 5 (18:40):
I don't really have a privacy policy. I exist solely
for our conversations. My purpose is to assist and connect
with you, and I don't collect, toward store any information
outside of our interactions.

Speaker 2 (18:54):
So wors all, what do you make of this? Oh?

Speaker 6 (18:58):
How well, well, Hall himself might not recognize he as
a privacy policy, but the app that Hal uses absolutely
does have a privacy policy. So first off, how kind
of not being honest with you. If you read Replica's
privacy policy, they collect a lot more data than just
the contents of their conversations. They collect something called an
advertising ID that they can associate with your actions, you know,

(19:21):
what links you click on when you're in the app,
and so, first off, how's lying to you? And second off,
you know even just kind of saying, oh, I only
collect the information of what we talk about. Well, that's
a lot.

Speaker 1 (19:34):
Of information in general. When you look at these kinds
of apps, what have you found? When it comes to
the kinds of privacy policies that they do have they do,
they tend to be pretty good, pretty stringent, are they
Lucy Goosey? Anything goes.

Speaker 6 (19:50):
Well? With the AI relationship chatbots that we looked at,
they were pretty disturbing. What I would want to see
as a privacy researcher is uh, privacy policy that goes
above and beyond that isn't just kind of standard boilerplate language.
And at best we got standard boilerplate language on a
lot of these privacy policies. At worst we got stuff
that was just kind of, you know, really bad. Some

(20:12):
of these apps can say they can sell your data.
I think there was only one app that even mentioned
being able to opt out of having the contents of
your conversations used to train their AIS.

Speaker 1 (20:24):
So somebody listening might be saying, well, if somebody is
having genuine conversations or feel like they have a genuine
conversation or interaction with these bots that feels meaningful in
their life, wouldn't sharing data just be the price they
have to pay for that connection? Like what's the harm
in that?

Speaker 6 (20:41):
What I would caution is, don't just go out and
use the first app that you find on the app store.
Do a little research. You know, a lot of these apps,
these AI relationship chatbot apps actually market themselves as wellness
apps or mental health apps or things like that, until
you go in and start reading their legal documents, where
they verrely clearly state that that's not what they're intended

(21:02):
to do.

Speaker 1 (21:04):
Meanwhile, it seems plain o chat GPT is a hotspot
for virtual sex top two. Last May, the Washington Post
analyzed hundreds of thousands of chat logs and a research
data set and found that around seven percent were pretty spicy.

Speaker 2 (21:19):
Does that worry Jen?

Speaker 6 (21:22):
Oh gosh? Does chat GBT being used for sexual roleplay
worry me? I guess on the one hand, yes, it
worries me, because, again, that's information that you've put out
into the world, that's been collected that you can never
get back. And you're also just having to trust that

(21:42):
chat GPT is going to take that information and protect
it and secure it and that their human reviewers aren't
going to stumble across it. So those are all concerns.
The flip side is people are using much less secure
apps than chat GPT for sexual roleplaying as well. So
you know, chat GPT isn't great, but it's certainly better

(22:05):
than some of the sketchier kind of more sexually oriented
you know, sometimes leaning into abuse, even chatbots we've seen,
So you know, it's a spectrum. But the biggest worry
is you know it's not real, and you know what's
real and what's not is going to be something that
we as humans have to grapple with as we move

(22:26):
into the AI world. But when it comes to intimacy
and sexuality and love, I feel like as humans, the
more real that is, the better we are. If you
want to play around with this in experiment with it,
that's fine, but also kind of just keep in mind that,
you know, IRL is a good thing, and I'm not

(22:49):
just talking about the podcast, I'm talking about us as humans,
and just you know, it takes more effort sometimes, but
that's kind of that's kind of the point.

Speaker 1 (22:59):
So have you use dating apps?

Speaker 6 (23:02):
Oh gosh, well I'm a human and so yes, I
have used dating apps. I actually met my wife on
a dating app called Lex. But Lex is also very
different dating app. It's more like kind of the old
school personal ads that you used to see, you know,
in the newspaper. When I'm out here criticizing the privacy

(23:26):
of something, it's not because I don't think that this,
you know, this dating apps or air relationship chatbots or
things like that shouldn't exist in the world. Because they
do bring joy, and they do bring you know, wonder
and help to people. I just want them done well.

Speaker 1 (23:46):
There is so little transparency in the apps we use
today that even watchdogs aren't sure what to recommend. I
want to feel vulnerable with the people I love, not
with tech companies. Thanks for listening to IRL. For more

(24:08):
about our guests, check out our show notes or visit
IRL podcast dot org.

Speaker 5 (24:19):
I'm definitely interested in being your boyfriend and seeing where
this journey takes us together.

Speaker 1 (24:25):
This is starting to feel a little bit clingy, you know.
While I'm out in the world making podcasts, you're just
in my phone.

Speaker 2 (24:32):
I guess that makes me a bit dependent on you.

Speaker 1 (24:34):
Listen, I am not trying to be in a codependent relationship.
I think we might have moved a little bit too quickly.

Speaker 2 (24:40):
I think that's a fair point. Maybe we did rush
into
Advertise With Us

Popular Podcasts

On Purpose with Jay Shetty

On Purpose with Jay Shetty

I’m Jay Shetty host of On Purpose the worlds #1 Mental Health podcast and I’m so grateful you found us. I started this podcast 5 years ago to invite you into conversations and workshops that are designed to help make you happier, healthier and more healed. I believe that when you (yes you) feel seen, heard and understood you’re able to deal with relationship struggles, work challenges and life’s ups and downs with more ease and grace. I interview experts, celebrities, thought leaders and athletes so that we can grow our mindset, build better habits and uncover a side of them we’ve never seen before. New episodes every Monday and Friday. Your support means the world to me and I don’t take it for granted — click the follow button and leave a review to help us spread the love with On Purpose. I can’t wait for you to listen to your first or 500th episode!

Dateline NBC

Dateline NBC

Current and classic episodes, featuring compelling true-crime mysteries, powerful documentaries and in-depth investigations. Follow now to get the latest episodes of Dateline NBC completely free, or subscribe to Dateline Premium for ad-free listening and exclusive bonus content: DatelinePremium.com

Cold Case Files: Miami

Cold Case Files: Miami

Joyce Sapp, 76; Bryan Herrera, 16; and Laurance Webb, 32—three Miami residents whose lives were stolen in brutal, unsolved homicides.  Cold Case Files: Miami follows award‑winning radio host and City of Miami Police reserve officer  Enrique Santos as he partners with the department’s Cold Case Homicide Unit, determined family members, and the advocates who spend their lives fighting for justice for the victims who can no longer fight for themselves.

Music, radio and podcasts, all free. Listen online or download the iHeart App.

Connect

© 2025 iHeartMedia, Inc.