November 2, 2025 45 mins

Screen names with your graduation year are out, but Ring cams, smart fridges, and “accept all” cookies seem to be in. This lab is all about how we’re using the internet! Titi and Zakiya sit down with Bridget Todd, host of There Are No Girls on the Internet, to talk about surveillance disguised as convenience, the rage-bait economy, and why our feeds feel worse. From doorbells to teaching the next generation, this lab is about taking our power and joy back online.

Dope Labs is where science meets pop culture. Because science is in everything and it’s for everybody.

Stay up to date with Dope Labs, Titi, and Zakiya on Instagram and at DopeLabsPodcast.com

Joining Lemonada Premium is a great way to support our show. Subscribe today at bit.ly/lemonadapremium. 

Click this link for a list of current sponsors and discount codes for this show and all Lemonada shows: lemonadamedia.com/sponsors

To follow along with a transcript, go to lemonadamedia.com/show/ shortly after the air date.

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
What was your AOL instant messenger name? Do you remember it?

Speaker 2 (00:04):
My closest friend at the time, her name was Tiffany Lindsay.
Shout out to her, and she gave me the nickname Teeter, and so my screen name back then was Teeter Totter zero five, for the year that I graduated high school. What was the obsession with the year that we graduated high school? We were obsessed. I'm Titi and I'm Zakiya,

(00:29):
And this is Dope Labs. Welcome to Dope Labs, a
weekly podcast that mixes hardcore science with pop culture and
a healthy dose of friendship. We've gone from anonymous screen
names to streaming our lives in four K. Every gadget

(00:51):
from our phones to our refrigerators is online.

Speaker 3 (00:55):
But do we really know what that means? No, we don't.

Speaker 2 (01:00):
That's the problem, and I think that's exactly what this
lab is all about.

Speaker 1 (01:04):
So let's start with what we do know.

Speaker 2 (01:06):
What I know is we share almost everything online. And
when I say we, I mean the collective we. Most
people that I know are putting a lot of their
lives on the Internet. I think the other thing we know,
and if you don't know, you are increasingly learning is
that these smart devices are always listening. I'll be the
first to say, before, I used to be like, who

(01:29):
has time to do extra manufacturing? Like, nobody's including a microphone in a device that's not marketed as having a
microphone or a speaker. But I'm not so sure about that, right.
And with that in mind, and knowing that the Internet
connects all of us, it also exposes us, you know,

(01:50):
we become very vulnerable because of it. Yes, and
that's not even just what you share. That feels like
where you've been, like a physical history, location services from
your phone. Yeah, people know where you are. Somebody knows where you are. Oh yeah, but TT, what

Speaker 1 (02:11):
Do we want to know?

Speaker 2 (02:13):
I want to know how we went from dial up
innocence to total surveillance, because I mean, that's within our lifetime, Zee,
like to go from dial up internet to now it's
like pervasive. It's ubiquitous to have all of these like
location services and things listening to you, and your emails

(02:33):
are sending you emails of things that you searched on
the Internet, and.

Speaker 3 (02:36):
It's just wild, like how did we get there? And
why did it happen so fast?

Speaker 2 (02:40):
And now that we're there and we know that some
of these platforms are breaking our trust and putting cookies
on our phone and tracking what you do. Why do
we still trust the platforms that are doing that? Why
do we still use them? And now it's starting to
shift to like it just being par for the course,
I guess. And so with that thinking about that, I'm like,

(03:04):
is privacy now just a.

Speaker 1 (03:05):
Thing of the past? Like is it a nostalgic thing?

Speaker 2 (03:08):
Yeah, it's like, oh remember the good old days, like
when nobody could figure out where you lived by just
doing a simple Google search, or people couldn't find out
where you worked, Like people couldn't find pictures of you
unless you know, you put them out there, Like it's
just wild. And you're talking about privacy being nostalgic, girl,
what about fun on the internet?

Speaker 1 (03:31):
Nostalgic?

Speaker 2 (03:32):
The Internet is not fun no more. I don't like it.
It just feels everything is so mean, judgmental, like fake.
Everybody's just showing like the good stuff, Like I'm just like,
oh my gosh, show me your messy bedside table. That's
my favorite thing to see.

Speaker 1 (03:51):
Yep. To answer all of.

Speaker 2 (03:52):
Our burning questions about the Internet, where it used to be,
how it got to how it is and what we
can do about it. We're talking to Bridget Todd, host
of There Are No Girls on the Internet. She's a
digital activist, podcaster, and professional Internet explainer. You can find
her on Instagram at Bridget Marie in DC.

Speaker 3 (04:13):
The Bridget.

Speaker 2 (04:14):
We want to start out by reflecting on just how much
the Internet has changed. It's gone from the scary thing
where your parents said don't talk to strangers to all
of us sharing everything.

Speaker 3 (04:24):
Back in the.

Speaker 4 (04:25):
Day, if you met on a dating app or an
online dating site, if there was a stigma to it.
I remember a woman in my school I met her
husband on an online dating like ooh scandal. And now
that's how the majority of people meet their partners.

Speaker 2 (04:38):
Mm hm, that's how the majority of people are meeting friends.
These days, people are meeting their friends online and then
meeting up and saying, hey, I'm meeting irl in real life,
and it's just wild. I mean, I remember back in
the day putting your card information into anything online. It
was just like, might as well just broadcast it to

(04:58):
everybody in the world because it just didn't feel safe.
And now I'm like, if my card information is not
stored on my phone and I'm able to just tap, tap, use my face to buy the thing, I am not.

Speaker 4 (05:10):
What is the point of living in a surveillance state
if I still have to type in my credit card information by hand every single time.

Speaker 2 (05:16):
There has to be perks. There has to be perks
to living in the surveillance state. I always say it.

Speaker 1 (05:21):
I think we're all hostage to convenience.

Speaker 2 (05:26):
Like I think convenience is what has eroded some of
the arm's length that we had to keep us kind
of distant from the Internet. And it feels like convenience
is like the carrot that they dangled, and now we're
all just all in. Yeah, And I don't want to
act like I'm immune to it or above it, because
only recently that I started really doing a deep dive
into the ways that I have let the convenience of things.

Speaker 3 (05:49):
Like Uber Eats, you know, things like.

Speaker 4 (05:51):
Amazon really shape and rebuild my life in ways I
hadn't even ever really noticed or thought about. And I've
been trying to make intentional efforts to not just resist
that but just be more aware of it. But I
do think this grip that novelty and convenience has on
us is directly responsible for where we are when it

(06:13):
comes to our current online and social media landscape.

Speaker 3 (06:16):
It really has us in a choke hold.

Speaker 2 (06:18):
It's such a good point. I mean, I think the
amount of trust that we put into the Internet, the cloud,
all those things is really mind boggling, because I mean
that's why when people are like, oh, I don't want
them to take my picture here, because I'm like, hey, yeah,
they already have it. I hate to break

(06:40):
the news to you, they already have it. They're like, oh, you can opt out of those pictures at TSA. Hey, you have a driver's license? TSA has your photo, whether it's a live photo with your hair up in a ponytail on top of your head and crust in your eyes, or your license photo, either or, they have your photo.

(07:01):
So like we have to stop convincing ourselves that we
aren't living in a surveillance state. And so, I mean,
it just kind of is what it is.

Speaker 4 (07:12):
I have these conversations with my uncles and cousins where
they're like, oh, don't do X, Y, Z. That's how they get you. And I'm like, well, I think they have you. You're on social media.

Speaker 1 (07:20):
I think they have you.

Speaker 4 (07:22):
I don't think... I think that ship has sailed. This
is sort of what I always preach on the show.
I completely am aligned with you that like they've got us.
But I think it also is easy to sort of
let that very true attitude be like, oh.

Speaker 3 (07:38):
Well, then there's no need.

Speaker 4 (07:39):
For digital security practices. You have to find a
balance between kind of the reality that we live in
this surveillance state and not falling into like digital nihilisms,
like so it doesn't matter, there's no need to try
to be secure, there's no need for two-factor authentication, there's no need to take common sense, you know, free digital security practices, actually, right now.

Speaker 3 (08:01):
But it is. It is a very weird balance.

Speaker 4 (08:04):
And I just think you're so right that I would
never have thought that we, and I'm including myself very
much in this, would give up so much for things
like convenience and novelty.

Speaker 3 (08:16):
And I think the question we should be asking is,

Speaker 4 (08:19):
Is the convenience that we're getting back worth those trade-offs?

Speaker 3 (08:22):
Because I think the answer is not always a hard

Speaker 1 (08:25):
Yes, right.

Speaker 2 (08:26):
I think this is so interesting because we had these
two women on our show. They talked about science denial,
and they talked about it not being that you blanket
deny science, but that you cafeteria style pick what you want.
I'm doing that with my digital privacy and trust. Like
TT and I have a difference. I'm gonna say TT is pressing accept on all those cookies.

Speaker 1 (08:48):
Ooh, TT. I just don't want to wait.

Speaker 2 (08:52):
I'm like, I was like, if it's taking me from Instagram,
I'm like, ooh, that bag looks so good, and I
hit open and then it's like, do you accept? I'm like, yes, accept, I want to see the bag. I'm not doing that.
But there's something recently that you posted, Bridget that made
me think about something I am doing. I'm against a
ring doorbell camera, but I have a camera in my

(09:14):
kitchen which points to like my back door. Now, I know,
TT shocking, right, Like I'm saying no to the cookies,
but that camera is connected to Wi-Fi, you know.
And I'm like, there's so many like people have refrigerators
and baby monitors and all these things that connect to
Wi-Fi, and I am not in the position to
be protecting these devices and what they're connected to. And
why does my thermostat at my house have a speaker

(09:35):
on it? I just found that on accident the other
day trying to connect my Apple TV to an output,
and I was like, you don't know what's going on.

Speaker 3 (09:41):
Girl.

Speaker 2 (09:41):
You have these devices in here, but you just talked
about Wendy on Real Housewives of Potomac. Oh yeah, I want you to give our audience a rundown
of what's happening and how these devices can betray you.
Oh my gosh, so thank you for even asking me
this question, because this is just such a cross section
of my specific interests. So, Dr. Wendy Osefo, she is one of the stars of The Real Housewives of Potomac, and I guess,

(10:04):
like if I were to classify her on like a
type on the show, she is educated.

Speaker 3 (10:09):
She will never let you forget that she has four degrees.
She's a PhD. Like she's a smart cookie.

Speaker 4 (10:14):
And she and her husband were both arrested because police
say they had staged a burglary in their home while
they were on vacation out of the country, and then
they said, oh, all of these luxury goods were stolen.
They filed claims for those goods with their insurance. They were attempting to,
according to police, scam their various insurance companies almost for

(10:35):
a half a million dollars, so like a pretty you know,
this is.

Speaker 1 (10:38):
A big chunk.

Speaker 4 (10:40):
We're talking about felony level charges here.

Speaker 3 (10:43):
And when I heard this, I was shook.

Speaker 4 (10:46):
I don't know if you guys watched the show, but
of all the people probably across the Bravo universe, of
the shows I watched, I can't think of somebody that
I was least expecting this kind of news from. Then,
when there are a million other people, I would have thought, oh, well,
she probably she might go to jail, she might get arrested.
Wendy is not high on that list. So I poured

(11:08):
through the police report.

Speaker 3 (11:09):
I was like, I need to know. I want some information.

Speaker 4 (11:11):
And when I got to the point of the police
report where they said, oh, well, they had a ring
camera that was turned on that they were monitoring remotely
while they were on vacation in Jamaica, and that ring
camera showed Amazon delivery drivers coming up to their house
dropping off packages and leaving, but nobody entering. And further,

(11:31):
they had an ADT security system that was also turned
on while they were on vacation, supposedly getting robbed, that
had motion detectors that did not log any motion from
within their home at the time when they said they
were being robbed.

Speaker 3 (11:44):
So, Wendy's a smart.

Speaker 4 (11:47):
Cookie, but I don't know that, in our current sort of surveillance state, she might have known: oh, these systems that I voluntarily put in my home aren't just recording bad guys or suspicious characters, whatever. They're recording me. And you know, innocent until proven guilty. I don't know what
actually happened there. We'll see, but it just goes to

(12:09):
show that I don't think people really understand these are
surveillance devices, and oftentimes Ring cameras especially, they can share
information with police without a warrant and without your consent.

Speaker 3 (12:24):
I believe ADT, don't quote me on this.

Speaker 4 (12:26):
I believe ADT will share information with police if there
is an active investigation. So I don't think that
police need a warrant, But if the police ask, they're
not necessarily protecting your privacy. And I just think with
things like Ring cameras, I think there is an attitude
that you get the camera in your house so that
you surveil others, and you don't understand necessarily that like, well,

(12:49):
that also means you are the one being surveilled and
you brought this on yourself.

Speaker 3 (12:53):
And so with Wendy, to me, it does kind of.

Speaker 4 (12:56):
Seem like she paid to voluntarily put an op in her home, and if she was indeed doing scams and crimes, that might not be the best thing to do. But I think it illustrates that we're not necessarily thinking about technology like this in that way. Like when you buy it, you think this is for me to surveil others. Certainly I will not be the surveilled,

(13:16):
And yeah, in a surveillance state, everyone is surveilled.

Speaker 2 (13:34):
I could say the same thing for every device that
we have, like I mean even your location services on
your phone, that data is being collected. That's the reason
why when someone goes missing, they're like, find their cell phone, which towers it's pinging off of, and things like that.
So if every device is collecting data, who is that
data really for? The company, or law enforcement, or for

(13:57):
us or all three?

Speaker 4 (13:58):
And it goes back to what you were saying about the various items that you have in your home.
I think people don't realize how hard it is to
find common household electronics that do not have some sort
of recording device in them. And you know, your fridge, your ice maker, your toaster, like the way that everything

(14:21):
is smart now. I recently wanted to buy a fitness tracker
and I had to really do a lot of research
to find one that was quote dumb enough for me,
because yeah, I want to log my steps and get
some information about my sleep. I don't want my location
being tracked everywhere. I don't want my menstrual cycles being

(14:43):
information that Amazon collects about me. Some of this stuff
is intimate, and I actually don't really consent. There's actually
a really great gift guide that Mozilla Foundation put out
called the Privacy Not Included Guide, where you can look
at different consumer tech and say, like, okay, which one
is surveilling me the most and which one is surveilling me the least, just so that you can make informed
decisions about how the stuff fits in with your life.

Speaker 2 (15:06):
I use AT&T, and I think it really clicked for me. There's a thing where you can log in and see how many devices are trying to connect to the Wi-Fi.

Speaker 1 (15:16):
I was like, what are the thirty seven items? What
are they?

Speaker 3 (15:21):
What are they? It is that?

Speaker 1 (15:23):
Man, it was blowing my mind.

Speaker 2 (15:25):
You got people in there? It's me and the devices.
And I was like, you have to turn this stuff off.
Another thing that I have a feeling about. And I'm like,
Bridget, tell me if I'm tripping. Sometimes I turn off Wi-Fi while I'm out, and same thing for Bluetooth, and then I feel like I look again and the Wi-Fi is back on, and I'm like, I know, I feel like I have to actively monitor

(15:47):
my device to keep myself safe.

Speaker 1 (15:49):
Yeah, Am I making that up? Or is that nothing?

Speaker 4 (15:51):
I think our devices will default to those kinds of settings,
and even if you tell it, hey, I don't really
want this, it will always go back to that default. You're not tripping. And I actually just learned this recently, it's got to be a year or two ago. I was
reading something where someone said, oh, you shouldn't connect to
public Wi-Fi if you go to an airport or something.

Speaker 3 (16:09):
Because that's how they get you. And I said, oh, that's not true. That's just a lie. I looked into it. It's not a lie.

Speaker 2 (16:17):
Wow, I use a personal hotspot. I was trying to
trust a Delta lounge and I said, I don't know.
I don't know the way they're trying to treat me
with these ticket categories. They can't be trusted anymore either.

Speaker 1 (16:28):
But those snacks are.

Speaker 3 (16:30):
Very fa Yeah.

Speaker 4 (16:32):
I do a lot of personal hotspotting too, because there
are these things that you think of as commonplace or
that you think of as well. They wouldn't be offering
this amenity if it was if there was a risk
to me. And that's not always true, but even that
line of thinking, I think.

Speaker 3 (16:46):
Exposes how easy it is to make.

Speaker 4 (16:49):
Decisions that you don't even really think about, where you're trading on these risks and benefits to you.
And yeah, I think it just behooves us to have more intentional conversations about these things as more of our devices become smart and Wi-Fi enabled.

Speaker 2 (17:06):
Yeah, right, right. If you had a smart toilet, it's gonna be telling all your poos?

Speaker 1 (17:10):
Well, she was not happy last night.

Speaker 2 (17:16):
But you've talked about what it's like to be a black woman who lives online. So we talked about surveillance, and adding that additional layer of being a black woman, what kind of risks do you feel like come with that?

Speaker 4 (17:28):
Oh my gosh, I mean, if you would have asked me this
question maybe two years ago, I would have had a
lot to say.

Speaker 3 (17:34):
Right, I would have said, Oh, it's you know, the.

Speaker 4 (17:36):
Idea of being visible online, making your career online, being
somebody who shows up on the internet as a black
woman with opinions, with things to say.

Speaker 3 (17:46):
You know, it can be tough but also can be good.

Speaker 4 (17:49):
You know, there's community there, there's joy there, there is
resistance there. There is so much connection there. But it's
a double edged sword where there's also a tax. There's
also you know, very disingenuous people where no matter what
you say, there's gonna be somebody waiting in the wings
to take it out of context. That is what I
would have said probably two years ago. Today in twenty
twenty five, if I'm being honest, I barely show up

(18:13):
online anymore.

Speaker 3 (18:14):
I just think that it's I don't know. I don't
want to sound.

Speaker 4 (18:19):
Like a like a doom and gloomer, because I love
the Internet and I'm forever an optimist about technology and
the Internet and all of those things. Those things are
so important to me. But I think it started when
Elon Musk bought Twitter and I was just sort of
really mourning the loss of that platform, because that was
an important platform for me.

Speaker 3 (18:38):
If you were a part of OG Black Twitter,

Speaker 4 (18:40):
You remember what it felt like to be there and
like the genuine joy and connection that was to be
found there. I was like, Oh, I know it's gonna
be trash, but let me just stick around and see
what this platform is like. And it was just it
felt like day by day, little by little being chipped away.
And it wasn't just the over-the-top racism and

(19:01):
the amplification of.

Speaker 3 (19:02):
Conspiracy theories and all of that.

Speaker 4 (19:03):
It was a lot of that, but also it felt
like wading through spam much more often. And this is not to say that the early days of Twitter were
because I had a lot of problems with those days
as well. But after a while I just thought, what
am I getting out of this? Like it is not
bringing me any kind of joy to sign on here
day after day. And I do think that the decisions

(19:26):
that Elon Musk made about Twitter, I think, reverberated
across a lot of social media platforms, and so I
think it wasn't just Twitter. I think a lot of
platforms where I used to spend a lot of time
became a lot less pleasant and a lot less, I guess, worth it for me to show up. And so yeah,
these days, I hate to say it, I am not

(19:47):
really showing up online a ton, because there's just not a lot of places that feel like genuine
connection and joy anymore. I don't know, where do you all fall on this?

Speaker 3 (19:57):
I'm curious.

Speaker 2 (19:59):
I agree with you, like so, I think one of
the things that's really hard is I have those same sentiments.
But like, even for us, we don't show up with
a video podcast, right? We are doing audio only. We
don't do as much engagement as we used to because
you get the trolls. Like you, I love the potential. I'm always like, oh, it could be so wonderful.

Speaker 1 (20:20):
We could do a live we could have these things.

Speaker 2 (20:21):
We could talk to, you know, our friends, our Dope Labs friends, we could do all these things. I'm like,
I don't know how many more internet platforms falling off I can take.

Speaker 1 (20:32):
When Google Reader died, I was sick.

Speaker 2 (20:34):
Okay, then there was Twitter, and then I tried to get on Spill.

Speaker 1 (20:39):
I don't know if you tried Spill. I did try it.

Speaker 2 (20:42):
I couldn't really find my group there. I did find some Pokemon players there, which is also an op. You should not be playing Pokemon Go. It's monitoring your location as you do it. I had to stop playing that. But you need all that for that Charizard. I do, and so I had to give it up.

Speaker 1 (20:58):
I had to quit cold turkey.

Speaker 2 (21:00):
But I also found like even on Instagram, the engagement
was just different and it felt like you had to
be like selling something all the time and not.

Speaker 1 (21:08):
Just putting something super state together.

Speaker 2 (21:11):
And I was like, I don't even know. I just found myself in the Stories having a good old time. And even after a while that became like that too. And so I feel like I kind of check in a little bit and then I'm out of there. Threads, it was good at first, now it's going the way of Twitter with
people reposting the same thing over and over again. And
then even Pinterest, I was like, at least I can
look at pretty pictures. Now it's all ads and all

(21:33):
AI as well. Yes, and so I'm like, what's left? I guess Substack, I just keep reading people's Substacks. I don't know. Yes, honestly, like that's how I feel too.
I was just talking to one of my sisters about this yesterday. I have all the same feelings about Instagram as I had about Twitter. Like, once Twitter changed, I was like, this is no

(21:53):
longer a place for me. So I just don't even bother with Instagram. It's to a point with AI where I feel like I'm an old Auntie, where I'm like, wow, this is amazing. And then my nephew, who's eight, is like, Auntie TT, that's AI, and I'm like.

Speaker 3 (22:10):
Oh, are you sure?

Speaker 2 (22:14):
I'm never really sure, Like the AI is getting so
good that I really am falling victim to it, and I'm just like, this isn't fun. Like, I want to see
like real stuff. And it's just, I agree with Zakiya,
where it's like the more you put on there, the
more it's like they're just people who are so emboldened

(22:35):
by the anonymity of social media that it kind of
just like tainted the experience a lot. Where it's just
like if you make an account that's purely just to troll, like, come on, it just makes the whole experience feel different. Where it's like now I'm overthinking.

(22:56):
Now I'm like, okay, should I put this? What if
somebody says this, and I don't want to feel like that,
and especially when it's something that's you know, tied to
something that I feel like is way bigger than me,
like Dope Labs, and Zakiya's involved. I'm like, I
really have to think about it because we're attached to
this thing, and so we're trying to keep this a
safe space for not just us, Like I'm thinking about

(23:16):
the other people who really enjoy our show. I don't
want them to be subject to, you know, all these
different things, and so, you know, but I also feel
like if I do remove myself too much, then the
trolls are right, you know what I mean. Like, I
feel like we got to keep showing up, and it's
just a matter of figuring out ways that we can
show up in a way that also we can practice

(23:38):
some self preservation so that we can find the spaces
where you know, we fit. And it's not like, oh,
I'm just throwing myself to the wolves and things like that,
you know what I mean.

Speaker 4 (23:48):
This is exactly the kind of balance I'm trying to find.
And I think to your point about not wanting to
let the trolls and bad actors and all those folks
and voices win. People love talking about free speech on
social media platforms, but when someone stops showing up and
stops using their voice and really just goes quiet on

(24:11):
these platforms because of these other voices, that is.

Speaker 3 (24:14):
Also a free speech issue.

Speaker 4 (24:15):
When you feel like you can't use these platforms effectively
or equitably to build a platform for yourself or put
your voice out there, or you know, engage with your
community or whatever you're trying to do.

Speaker 3 (24:26):
That is also a free speech issue.

Speaker 4 (24:27):
But one kind of saving grace that I have really
been kind of reconnecting with lately against the backdrop of
all of this, that we all have in common is
the form of audio. I have said spicy stuff on
the podcast, stuff that if I put it in a

(24:48):
tweet or a thread post, I would have to be
back and forth all day. I think there is something
about long form audio content where if somebody wants to
take my words out of context or misconstrue them,
or you know they're committed to misunderstanding me, they have
to listen to an hour long podcast to do that.

Speaker 3 (25:07):
The barrier is quite high. So you gotta really want to do

Speaker 1 (25:11):
That to me.

Speaker 4 (25:12):
You got to really. I was just talking to a friend of mine that makes podcasts, and his podcast
is quite big, it's one of the biggest podcasts.

Speaker 1 (25:20):
In the world.

Speaker 4 (25:21):
And he said that, you know, weirdly, he doesn't really get a lot of trolls and stuff because.

Speaker 3 (25:29):
They're not out here doing short form.

Speaker 4 (25:32):
They're not on Twitter really or Bluesky really, they're
not in that sort of live streamer debate kind of circle.

Speaker 3 (25:39):
And that.

Speaker 4 (25:39):
Yeah, audio can be kind of protective, and I think
it's because it's a medium that is so you can
be authentic there. You can explain what you need to
say in your own words. One of my favorite things
about listening to audio content is listening to somebody work
their way or think their way through a concept they
haven't fully.

Speaker 1 (25:58):
Fleshed out yet.

Speaker 3 (25:59):
You get to hear the wheels turning in real time.

Speaker 4 (26:02):
I think that's a medium that lends itself to thoughtfulness against a media backdrop where I feel like
so much of it is going the other way. Short
form videos, you know, AI slop that just feeds into
the worst stereotypes and whatever. Like, Yeah, against that backdrop,
I think audio is really where I'm finding a lot.

Speaker 3 (26:21):
Of my comfort.

Speaker 2 (26:23):
Yeah, I think it's definitely been. I mean, it was the gateway for me and TT, and I think it's where we feel the best and safest.

Speaker 1 (26:33):
And I'm like, make a Reel? I don't know. Every now and.

Speaker 2 (26:37):
Every now and then we get one together and TT
gets on me because I won't cross post it to
Dope Labs because.

Speaker 1 (26:42):
I'm like, it might not be right.

Speaker 3 (26:43):
I don't want to bring Dope Labs into this.

Speaker 1 (26:45):
Yeah.

Speaker 4 (26:45):
I love that you've both expressed that. Now, this is kind of lovely.

Speaker 3 (26:49):
You both expressed, oh, well.

Speaker 4 (26:51):
I want to be accountable to Dope Labs and I
want to be accountable to my co host. And that's
a beautiful thing that you're both kind of like really
feel a responsibility both to your audience that you have
built and to each other as partners in this.

Speaker 3 (27:04):
That is really beautiful.

Speaker 2 (27:05):
Yes, because Dope Labs can go away. That is my
friend in real life and I would not trade her
for anything, and so anything that I feel like would
be a bad look for her, I'm like, nah, not
doing it.

Speaker 3 (27:31):
Now.

Speaker 2 (27:31):
I have another question for you, Bridget, because you've talked
a little bit about like we've talked about audio being
this kind of protective space. I've seen people talk about building your
own email list and not really relying on these platforms.
TT and I sometimes talk about it like the democratization
of media, like it's neutral and it's open and everybody

(27:52):
can have access. But I think that's like just the
base layer when you paint the picture. I think there
are so many other layers to this, and I'm curious about,
like what would it look like to clean up the
Internet or to make it better? Like, and I know
that's a big question, but like, what are some of

(28:13):
the you think the first steps.

Speaker 4 (28:15):
Oh, I think one of the first steps is that
Mark Zuckerberg has to resign. No, I'm kidding, uh, but
you know, I really think the first step has to
be building an entire... I mean, this is a tall order, but we really need a digital
media ecosystem that has things like care and empathy at
the center instead of things like scale and growth at

(28:37):
all costs. Think of every experience that you have
online where you're like, oh, this used to be
fun. You know, you used to be able to read
articles on your phone, and you could click on
a link and you didn't have to worry that so
many things were gonna pop up that
you couldn't even read the article. It
didn't always used to be like that. Some
sites were like that, but not all. Every time that

(28:58):
you have an experience online that seems worse than the
way it used to be back in the day, somebody
made a decision to prioritize money or growth or something
like that over the experience of care or thoughtfulness or
something else. And so I think really getting back to
what it is that we are designing our tech landscape

(29:22):
for and around I think is key. I also would
like to see, and this is something that we don't
have to wait to get systemic change on. I think
that everybody, everybody listening, all of us, could really use
rethinking our relationship with technology, right. I think for so
long we have bought into this lie that says, oh,

(29:44):
the Elon Musks and the Mark Zuckerbergs of the world,
they are special boy geniuses that have it all figured
out in a way that none of us could ever
hope to And whatever future they are designing is going
to be a good version of the future and we
just gotta, you know.

Speaker 3 (30:00):
Get on board with it.

Speaker 4 (30:01):
No, no. Elon Musk is not smarter
than anybody listening to this podcast.

Speaker 3 (30:05):
I can assure you of that, right. And so I
think, exactly, no more of this framing.

Speaker 4 (30:11):
This idea that they just get to make all
the decisions and have all the power and we have
no say. Because you're an expert in your experience. The
way that you feel about the technology in your life
that you've paid for, that impacts so much
of our day to day, that is valid and that's real.
And you know, I think we have to make some
shifts that shift that balance of power back to us,

(30:33):
because none of these people would even have jobs if
not for us, None of these people, they would be
designing technology for nobody if not for us.

Speaker 3 (30:39):
And so I really want all of.

Speaker 4 (30:41):
Us to sort of take our power back a little bit,
and remember we do have some power and some agency
in this dynamic. It doesn't have to just be all
them taking from us.

Speaker 2 (30:50):
Absolutely, I think that that's such a great point, and
I think that was something that we talked about when
you had me on your show. There are no girls
on the internet was that you know, it's important for
us to understand this technology, like AI and things
like that, so that we can have the power because
the more that we're just like no, no, no, I'm
not going to engage. You can choose to engage in
different ways. But it is very important to know how

(31:11):
these things work, because you'll be able to
see the ways that people, you know,
these entities, are trying to control or
take control and shape things for their own
financial benefit. And so it's like power to the people,
all power to the people. You need to know how
these things work. And so all of this makes me
think of the like the next generation of Internet users.

(31:35):
How do you think we should be socializing the next
generation when it comes to the Internet, like how they
would use the Internet as a tool in their lives.

Speaker 4 (31:46):
So I really appreciate this question, and I just want
to say I appreciate what you all are doing, especially for
people who are so often left out of conversations about
tech. I think it behooves us all
to demystify this, to, you know, not be afraid of it,
even if you're somebody who is a little bit of
a technophobe, or you're not down with certain kinds of technology.

(32:07):
I want all of our people to be comfortable with it,
to understand it, to know how to use it, even
if they're like, I don't like this technology. I think
that it's so easy to just put your head in
the sand and say, therefore, I don't need to know
anything about it.

Speaker 3 (32:21):
And I think that is a mistake.

Speaker 4 (32:23):
And so I'm so grateful for your voices in doing
the work of correcting that mistake for us. But when
it comes to the next generation, this might sound
Pollyanna: media literacy.

Speaker 3 (32:36):
I really I worry about.

Speaker 4 (32:38):
This so deeply, and hey, I'm not a young person,
and I struggle with it. I sent all my friends
this video of what I thought was bunnies jumping on
a trampoline, and they all had to be like, honey,
it's AI. And I realized, oh, I think
I have a blind spot when it comes to cute
animal videos that are AI.

Speaker 3 (32:58):
It's like, I want to believe so badly.

Speaker 4 (33:00):
So I am not immune to this, but I do
think I am of the generation that was really taught
to not trust what you.

Speaker 3 (33:10):
Saw online, and I think that those skills.

Speaker 4 (33:14):
They're not always perfect, but I think that being socialized
that way on the Internet has been tremendously helpful. That
when I see something that makes my heart beat faster
or gives me a quick, you know, feeling of you know,
anger or outrage, I've kind of trained myself to be like,
let's read the whole article, first of all. And yes,
if I don't recognize the sources, throw it into Google.

Speaker 3 (33:36):
Let's you know, let's take a minute.

Speaker 4 (33:38):
Don't just rage share because it made my heart beat
a little bit faster. And I worry that we are
sort of losing that when it comes to our current
Internet experience. And I think
that means we're leaving behind a less thoughtful
Internet landscape for the next generation when we're all just
so susceptible to the trigger of algorithms that are financially

(34:02):
incentivized to keep us locked into these very emotionally
triggering loops.

Speaker 1 (34:09):
Yeah.

Speaker 2 (34:10):
Yeah, oh, I think I've talked about it on the
show TT Knows It.

Speaker 1 (34:14):
We talk about it.

Speaker 2 (34:16):
This is something I am constantly practicing with my parents,
with my mom, who's a little bit more on the
Internet than my dad, and I think it's so important
for TT and me, and for experts like you,
to share like, hey, there's no level you reach where
nothing infiltrates, where nothing gets through and you're just,
you know, Lil' Kim dodging everything the Internet is

(34:39):
throwing at you. I think that doesn't happen, and I
think it's unrealistic to think it happens. But it does
require this constant vigilance, and because we've gotten so comfortable
with convenience and the perception of safety and novelty,
I think we are like, I don't want to do

Speaker 1 (34:55):
This other thing.

Speaker 4 (34:56):
Yes, and it does suck, because we deserve a landscape
where you're not constantly being tricked and you don't
have to have your guard up all the time.

Speaker 3 (35:04):
There are times where.

Speaker 4 (35:05):
I've just woken up, I'm pre morning coffee sip, my
contact lenses are not even in yet, and I'm scrolling
social media and I shouldn't have to have my thinking
cap on twenty four seven just to engage and wade
through a media landscape. So we deserve a different media landscape,
but unfortunately we don't have that. You do sort of

(35:26):
have to, you know, have your thinking cap on even
when you're just casually scrolling.

Speaker 3 (35:31):
And I know we're talking a lot about like.

Speaker 4 (35:34):
Manipulated content or AI generated content, but also, I think
it's about knowing your triggers. And for me,
my triggers are AI generated animal videos, but
it's also non-AI generated content that is inflammatory.
I used to be very invested

(35:54):
in content that involved real people, sometimes real,
sometimes not, that would just get me angry.
You know, things that relate to like gender wars stuff, I
could be locked in all day. And one day I
just said, why? Like, what is this getting me? Getting
me riled up about a situation that might even be

(36:16):
a skit that somebody has made to get engagement. Like,
I don't need to be locked into

Speaker 3 (36:22):
This all the time. I don't need to be participating
in this.

Speaker 4 (36:24):
And so I say that to say people should really
take the time to understand their triggers and their own
tension points we all have them, and then be a
little intentional about how you let that kind of content
show up in your digital diet.

Speaker 2 (36:40):
Right, I'm the exact same way where it's just like,
rage bait? No way, I'm not engaging. Like all of
these people that are like, oh, we're
gonna put these two people at a table who have
very different opinions. I'm like, these aren't different opinions. You
have someone who is a racist, a sexist, a
xenophobe, talking to someone who is educated, and we should

(37:03):
not be intellectualizing someone who is just a racist, full stop.
And then the other part that I refuse to ingest
is these relationship shows,
like Love is Blind. I'm like, now I know you
all are doing this stuff on purpose, whether it's
Love is Blind or Married at First Sight.

(37:25):
Like I felt like traumatized by some of these seasons
because the men that were on these seasons were predatory
and misogynists and all these things like that, and I
was just like my blood pressure by the end of
each episode was so high. I was like, this is
actually like ruining me. I can't engage with these anymore.

(37:45):
And I'm like I see on social media like clips
going around from like the most recent Love is Blind,
and I'm like, yeah, they're absolutely doing this on purpose.
Now I feel like you all are getting these unstable
people on this show to create good clips, and you're
getting the clips you want, but it's to the detriment usually.

Speaker 1 (38:07):
Of black and brown women.

Speaker 2 (38:10):
I don't think that they would purposely do that to
a white woman. And it's I feel like it's always
a black woman who is suffering at the hands of
a black man on these shows and treating her poorly
and having all of these like really disgusting mentalities, and
I'm just like, no more.

Speaker 3 (38:27):
No more, no. We don't have to sign up for that.
We don't.

Speaker 4 (38:32):
We don't have to, we don't. I
could not agree more with everything that you laid out. And
they're doing it for money because it is effective, yes,
and they will keep doing it as long
as we let them. If we keep rewarding them with our
eyeballs and our ear holes, they're gonna keep doing it,
and so we need to break that loop by retraining them.

Speaker 3 (38:52):
I don't want to see this. I don't want you
to make money from this.

Speaker 4 (38:55):
They're only gonna stop when it stops being financially incentivized
for them to do it, and that kind of starts
with us. I really have a healthy relationship with a
lot of online spaces now, after doing a little bit of
work of being like, yeah, I need to just divest
from this. It's only making me upset. Otherwise I'm training
my algorithms that I want to engage with content like this.
We already know that these algorithms are more likely to

(39:17):
show us stuff that makes.

Speaker 3 (39:18):
Us angry and have big emotional reactions.

Speaker 4 (39:21):
I have people in my life to make me angry
and give me big emotional reactions. I'm not okay with Mark Zuckerberg doing that,
and I don't even know him. We were talking about
the future generation. One of my nephews recently, he was
just taking these little digs, and I said, hey, all right,
what's wrong with everybody?

Speaker 3 (39:36):
Enough?

Speaker 2 (39:36):
He was like, ah, we rage baited her. And I
was like, oh, first of all, I'm not even mad yet.
You haven't even seen me mad. But I also was like,
is this what the kids are going for?

Speaker 3 (39:49):
You know?

Speaker 1 (39:49):
Like, is this the desire?

Speaker 2 (39:51):
Because we're rewarding the anger and the outrage because they
want to see that. And I'm like, how do we
correct these things?

Speaker 4 (39:58):
But that's what I mean. Depending on your nephew's age,
that could be what he is seeing. I bet
content like that probably is a part of
his media diet. I remember seeing some video of a
social studies teacher and she was talking to her class,
and she was trying to explain to her class that
enslaved black people during slavery did not get paid.

Speaker 3 (40:19):
And the kids are like, what are you talking about?
They got paid!

Speaker 4 (40:22):
So none of the kids
believe her, which is already a problem, but one of
the more vocal kids is like, debate me, debate me
about it. And I really thought, yeah,
this is what I think young people are internalizing as
discourse: debating, and whether or not you can make

(40:42):
somebody flustered or make somebody have an emotional reaction.
Whether or not the objective fact that enslaved black people were
not paid is correct is irrelevant, if you
feel like you can, you know, hit the broad strokes
of what winning a debate about it looks like.

Speaker 3 (40:59):
And I think.

Speaker 4 (40:59):
That's really a lingering thread of the culture that I'm
very concerned about. Like, think about what you were talking about, TT,
these videos where it's like, we put one reasonable
person at a table, surrounded by people who are
gonna scream at her for an hour.

Speaker 3 (41:16):
Yes, that is what passes for discourse these days.

Speaker 4 (41:19):
And I really think, you know, when we talk about
things like media literacy and anti-intellectualism, the way the
Internet has amplified a completely wrong version of what discourse
and debate and knowledge and being learned and being

Speaker 3 (41:33):
Curious looks like I think is a real problem.

Speaker 2 (41:35):
Yes, absolutely, and I think part of it is because
we are now highlighting contrarians as intellectuals when they're not.
I'm like, just because you're saying the opposite of what
everybody else is saying, does not mean you're an intellectual.
Like you have not put any real thought into that.
You just saying no, and that's not how that works.
That does not make an intellectual. I need you to

(41:57):
come with facts, figures, and, you know, something to stand on,
not just I want to be different. It's like, okay,
we get it. You so different, But that's not an intellectual.

Speaker 1 (42:08):
It's not a new position to take. It's just being amplified. Exactly, exactly.

Speaker 2 (42:13):
Bridget I think you've given us a lot to think about,
and we want to thank you for doing that because
you've helped us move through how we got to
where we are now on the Internet, what the real
threats are, and a reminder that the Internet isn't shaped
without us. With the pressure of what we choose
to engage with, what we choose to give our attention to, we

(42:34):
are able to push back and say no, we don't
want this or we don't want it this way. And
so I think you've shared quite a bit for us,
and I want to know if there are any parting
thoughts you'd like to share with our audience. So, first
of all, it has been a blast, you all. This
show and the platform that you've built, it's so
important and so valuable and also so freaking fun and accessible,
and I'm just happy to be here. I'm just very

(42:57):
interested in talking about the intersection of the Internet and
technology and social media and identity, particularly the ways that
marginalized people, people who are so often left out of
the conversation, do show up in very real ways in
technology and online, the sort of the bad and the good,
the, you

Speaker 4 (43:15):
Know, the whole, the whole you know shebang of what
that looks like. I host a few podcasts. One is
called There Are No Girls on the Internet. I also
host Mozilla Foundation's podcast about ethics and AI called IRL. Yeah,
I'm just really interested in how we show up online.

Speaker 3 (43:31):
And I would say if there's one.

Speaker 4 (43:33):
Parting word, it's that it matters how and whether we
are able to show up online.

Speaker 3 (43:38):
I love the Internet.

Speaker 4 (43:39):
The Internet is so important to me, not just because
it's where I grew up and where I spend a
lot of my time, but because I think it is
an indicator of the health and well-being of so
many things, of our democracy, of our civic life, our
civic world. And so if everybody is not able to
show up equitably on our online platforms and tech landscape,

(44:00):
we will never have the equitable representative democracy that we
all deserve. And so care about the internet. Be a
good steward of the Internet. Show up on the Internet.
And yeah, you could check out my podcast if that resonates.

Speaker 2 (44:21):
You can find us on X and Instagram at Dope Labs
Podcast. TT is on X and Instagram at dr underscore
t sho, and you can find Zakiya at z said so.

Speaker 1 (44:32):
Dope Labs is a production of Lemonada Media.

Speaker 2 (44:35):
Our supervising producer is Keegan Zimma and our producer is
Issara Asevez. Dope Labs is sound designed, edited, and mixed
by James Farber. Lemonada Media's Vice President of Partnerships and
Production is Jackie Danziger. Executive producer from iHeart Podcasts
is Katrina Norvell.

Speaker 1 (44:53):
Marketing lead is Alison Kanter.

Speaker 2 (44:55):
Original music composed and produced by Taka Yasuzawa and
Alex Sugiura, with additional music by Elijah Harvey. Dope
Labs is executive produced by us, Titi Shodiya
and Zakiya Whatley.