
November 1, 2025 60 mins

Happy Halloween, boys and ghouls! In this week's news roundup Bridget is recapping all the tech news stories you might have missed, so you don't have to.

Creeps wearing Meta Ray-Bans are nonconsensually filming women who work in massage parlors: https://www.404media.co/metas-ray-ban-glasses-users-film-and-harass-massage-parlor-workers/

China implements a new law cracking down on influencers who spread misinformation. https://www.cnet.com/tech/services-and-software/china-cracks-down-fake-experts-banned-from-social-media/

Police used surveillance company Flock's cameras to charge a Colorado woman with a crime she didn't commit, forcing her to prove her own innocence. https://coloradosun.com/2025/10/28/flock-camera-police-colorado-columbine-valley/

The Institute for Strategic Dialogue publishes a new study shedding light on the ecosystem of nonconsensual intimate deepfake material online, highlighting the easy availability of the software. https://www.isdglobal.org/digital_dispatches/the-ecosystem-of-nonconsensual-intimate-deepfake-tools-online/

A librarian has launched a creative protest against Goodreads, which is owned by Amazon, over the company's censorship and favorable treatment of Eric Trump's new book. https://www.404media.co/rogue-goodreads-librarian-edits-site-to-expose-censorship-in-favor-of-trump-fascism/

If you’re listening on Spotify, you can leave a comment there to let us know what you thought about these stories, or email us at hello@tangoti.com

Follow Bridget and TANGOTI on social media!  ||  instagram.com/bridgetmarieindc/ || tiktok.com/@bridgetmarieindc ||  youtube.com/@ThereAreNoGirlsOnTheInternet 

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet is a production
of iHeartRadio and Unbossed Creative. I'm Bridget, and this is
There Are No Girls on the Internet. You're listening to
There Are No Girls on the Internet, where we explore the
intersection of technology, identity, and social media. And this is

(00:24):
another installment of our weekly news roundup where we dig
into all the stories online that you might have missed,
so you don't have to. Okay, let's get into it.
Nobody really looks cool when they eat spaghetti, Mike, not
even Will Smith. I was just looking at this AI
video of Will Smith eating spaghetti, and this one was
actually pretty good. It was pretty convincing, and I'm just

(00:46):
so curious how Will Smith eating spaghetti becomes the benchmark
of AI video capabilities. Have you seen these Will Smith
eating spaghetti videos?

Speaker 2 (00:58):
I have, you know, and you gave me a heads
up that we were going to talk about this, so I
did look into it a little bit. Yeah, there's just been
like a steady improvement in AI generated videos of Will
Smith eating spaghetti over the past several years. But I
couldn't find any, you know, definitive or even speculative information

(01:20):
about why.

Speaker 1 (01:22):
Ooh, I can answer this for you.

Speaker 2 (01:24):
So you're right.

Speaker 1 (01:25):
We have come a really long way on the Will
Smith spaghetti eating index, because back in twenty twenty three,
somebody shared an AI generated video titled Will Smith eating
spaghetti on the Stable Diffusion subreddit, and they had made
this video using ModelScope's text to video tool. This
was back in twenty twenty three, which I know doesn't

(01:46):
sound that long ago. It was just two years ago,
but AI video capabilities have come a long way since then.
In the video, Will Smith looks kind of like claymation,
and the way that he eats the spaghetti is grotesque.
He kind of eats it like a dog, like he
puts his whole face in it,

(02:06):
and he's kind of using his hands.

Speaker 2 (02:08):
That is undignified.

Speaker 1 (02:10):
It's an undignified way to eat spaghetti, that's for sure.
But this grotesque, creepy AF video ended up creating essentially
a litmus test for how AI video generation is progressing.
I am so curious how Will Smith feels about this.
I would hate it if video of me eating spaghetti had

(02:31):
become a benchmark about the progression of AI technology. How
would you feel about that?

Speaker 2 (02:36):
I would not like it, No, it would be a nightmare.
But I do have a theory about why it's Will Smith.
I think he has a like poppy appeal and an
accessibility that is able to take this pretty arcane topic
of AI video generation and make it like accessible to

(03:02):
the common man, like maybe we you know, if there
were numbers or statistics about how good of a video
it was, that wouldn't really resonate. But like everybody can
connect with Will Smith and spaghetti, these are two very
accessible things.

Speaker 1 (03:21):
Ooh, you are coming dangerously close to making me admit
what I know is an unpopular opinion among a lot
of people about Will Smith. I have an unpopular Will
Smith opinion in here. Oh, you do.

Speaker 2 (03:32):
What is your unpopular Will Smith opinion?

Speaker 1 (03:35):
Oh my gosh, I might have Joey cut this because genuinely,
I'm nervous to admit this, and I know people are
gonna come for me. So we all we all know
what I'm talking about. The Will Smith infamous slap. People
act like Will Smith killed somebody when you go on
the Internet and look at how people respond. This is one
of those things where I only got wind

(03:58):
of this recently, that white people and black people. There
are a lot of things that we have a lot
of common ground on, but there's certain things where we
just live in completely different worlds and completely different sort
of online silos. And I only recently found out that
white people, not all, but a lot of white people

(04:20):
really never forgave Will Smith for smacking Chris Rock at
the Oscars. Meanwhile, black people, it never comes up, it's
never mentioned.

Speaker 2 (04:29):
We've all moved on.

Speaker 1 (04:30):
It's like very clear where we stand on the issue.
And I only recently found this out, that we
have very different opinions about Will Smith.
And if you don't spend a lot of time in
online spaces like Reddit, you might not know that people
really really have not forgiven Will Smith for what went
down on the Oscars. I'll just put it that way.

Speaker 2 (04:50):
Yeah, it's funny you say that, because when I was
trying to look into the spaghetti thing, I was like, wow,
there's a lot of content about him slapping Chris Rock
on here.

Speaker 1 (04:57):
People have never forgotten it.

Speaker 2 (04:58):
Yeah. So in conclusion, you're fine with physical violence,
you think it's cool. Do you think it should be
glorified on television?

Speaker 1 (05:06):
Yeah? I'm a big advocate of physical violence. I
think we need more of it. The world would
be a better place.

Speaker 2 (05:12):
Yeah, all right, cool.

Speaker 1 (05:14):
Obviously, no. Someone's gonna clip that, and that's gonna be,
that's gonna be, I can never run for office. Somebody
is gonna clip that out of context.

Speaker 2 (05:22):
Thanks a lot, Mike, Hey, you said it.

Speaker 1 (05:24):
So, while we're getting into our unpopular opinions, do you
have opinions on wearables like technology that you wear on
your face or your head or your body.

Speaker 2 (05:34):
Not too much. I've never really been able to
make it work for me. I've always been intrigued. I've like,
you know, sometimes I'll play video games where you've got, like,
you know, it's a first person perspective and you've got
all kinds of like information on the screen, or when
you're driving a car that has like a lot of

(05:58):
information that might be relevant to you while you're driving.
That always seems kind of cool to me to be
able to have access to that while just like existing
in the world. But you know, there's a lot of
negatives that come with it too, one of them being that,
like you look like a dork with some like wearable

(06:20):
glasses on your head.

Speaker 1 (06:22):
Huge dork. I'm so sad to admit that, way
back in the day, one of my first Twitter profile
pictures was me wearing Google Glass. Oh,
don't even get me started.

Speaker 2 (06:34):
Yeah, I remember Google Glass. I was in grad school
at the time, and I remember scheming with one of
my friends about how we were going to like pitch
a research project to Google to be able to get
our hands on some Google Glass.

Speaker 1 (06:48):
At that point, at Google, you might have actually been
able to do it, you know. I once demoed some
like VR goggles. I firmly believe that nobody wants to
sit in their house or be around other people with
like a massive thing strapped on their face. And I
once demoed some VR goggles and it just reminded me
of that I Think You Should Leave skit. You know,

(07:08):
there's too much shit on my face. I don't even
want to be around anymore. That's instantly how I felt.

Speaker 2 (07:14):
Classic skit, classic skit.

Speaker 1 (07:15):
I swear that
Tim Robinson has found a way to describe so many
awkward scenarios that I have found myself in. Oh yeah,
we are fans of him over here. I will say,
though, Meta's Ray-Bans. As much as I do not
like wearables and the idea of wearables, and I'm kind
of against them just in general, I think they probably

(07:37):
have their place in, like, a use case. If you're
a truck driver or a long haul driver or something,
I get it. However, I will say, Meta's Ray-Bans,
if you've worn them, they look and feel just like
normal glasses, which, on the one hand, that's kind of cool.
It kind of solves the problem that I had just
articulated of that feeling of there's too much shit on

(07:58):
your face. However, them looking like regular glasses is also
a problem. 404 Media has a report out
about the fact that there are Instagram accounts, like big
Instagram accounts with lots of followers whose whole thing is
using Meta Ray-Ban glasses to film themselves going into
massage parlors and asking the women who work there to

(08:21):
perform sex acts on them. In these videos, the staff
do not seem like they know they're being filmed, and
because these glasses they look like just regular glasses, they
probably don't know they're being filmed. In most cases, the
staffers are confused when the man is like asking
for a sex act, or they laugh at the man,

(08:42):
or they just dismiss him. But in a few instances,
the staffers are like okay and negotiate a price for
the sex act the man is asking for. These videos
have been shared and viewed by millions of people. And
what's worse, in some of these videos, it is obvious
where these massage parlors are located, because the

(09:02):
men show themselves walking into the entrances of these places,
so they show where they're at, so anybody could
conceivably find these women. And this is a problem because,
as 404 Media reports, this is extremely dangerous to the
women in these videos, who can be targeted both by
law enforcement and racist, sexist extremists. Back in twenty twenty one,
they remind us, a man who shot and

(09:25):
killed eight people at massage parlors told police that he
had specifically targeted them because he had a sexual addiction.
Now you might be thinking, how can this get any grosser?
Well, these videos are basically a money making enterprise. They're
essentially social media advertisements offering people a way to spend

(09:46):
money to buy what they say is the full video
of the encounter. This is so gross. They link out
to an adult pay-per-view service called No Fans, which
allows users to buy and view non consensual content without
creating an account. One video on Instagram that's pitched as
a quote Latina house call sends viewers to No Fans

(10:08):
to buy the full video. On No Fans, users can
buy the quote Latina Tuggy bundle for twenty eight forty nine.
I should say, though, it's not totally clear if the
actual massage staff are the people who are in
these, like, quote full videos. It does seem like they're
creating these videos non consensually with massage staff who do

(10:32):
not know they're being recorded, and then saying, oh,
if you want to see the full video of what happened,
go to this site. And then the people in that
video might not even be the actual staff that we
saw earlier. Does that make sense?

Speaker 2 (10:46):
It does? Yeah. I can definitely see some deception taking
place in this ecosystem.

Speaker 1 (10:53):
Oh yeah, you think you think these guys are deceiving people?

Speaker 2 (10:56):
Yeah, I just, something about it, I don't know, my
sixth sense. I feel like they would not be above
leading with one video, and then once people make
that purchase, it's actually different actors in the video, or
maybe nothing at all. Maybe they just take their credit
card and run with it. Sky's the limit.

Speaker 1 (11:16):
It will never not shock me how much people are
willing to spend to be told that they're watching video
that was non consensually made, Because there's plenty of pornography
out there where it is consenting adults who know they
are being filmed, who have consented to make adult content,
adding on this kind of non consensual video they made

(11:40):
with somebody who works at a massage parlor who doesn't
even know they're being filmed, to give it the appearance of, oh,
this woman who didn't know I was filming her, if
you want to watch the whole encounter with her. It
just never ceases to amaze me that that would be
something that would give people a marketing edge, I guess,
I'll say. And I also think these videos they really

(12:01):
expose how Meta has essentially created an entire ecosystem that
fuels this creepy, misogynistic, privacy violating content online. They're selling
these smart glasses that let people secretly record others in public.
They are also running platforms like Instagram that reward the
most like shocking and extreme and outrageous posts. And then

(12:25):
they only step in to take this stuff down after
journalists from 404 Media start asking questions, because there's
an entire marketplace for low cost and by low cost
I mean like forty dollars low cost ways to obscure
what little privacy protective features Meta glasses do have. Right so,

(12:46):
right now, Meta glasses have a light on them that is
supposed to light up to indicate when somebody is recording.
So ostensibly, if I saw you wearing Meta glasses and the
light was on, I would know this person is recording.
But you can just buy little things to obscure that
so that no one knows you're recording, no one knows
they're Meta glasses. When 404 Media spoke to Meta about this,

(13:07):
they basically said, hey, you know, it's up to the
people who buy the glasses to adhere to the terms
of service. We don't really control that.

Speaker 2 (13:14):
Whoa, that's such a good point, that Meta is both
making and selling the hardware and making money on that,
and then also essentially making money on the content,
this like illicit content that is created from the hardware
that they sold. They're really what do you call that?

(13:35):
Like vertically integrated.

Speaker 1 (13:36):
I guess. I mean, that's a fancy way to put it.
Where I'm from, they're running a criminal enterprise. I mean,
this is my opinion. At what point is Meta
knowingly making money from a criminal enterprise? I mean,
I would love to have somebody smarter than
me explain how it's not that. Yeah, and it has

(13:58):
to be said that a lot of the women who
were at these kinds of massage parlors are migrant women
or immigrant women, or women who are engaged in sex work, right,
and so they're already marginalized. Sometimes they're very vulnerable. 404
Media spoke to Angela Wu, the executive director of
SWAN Vancouver, which is an organization that promotes rights and
safety of migrant and immigrant women engaged in sex work.

(14:20):
Wu said quote. Earlier this year, SWAN Vancouver became aware
of disturbing social media videos showing individuals wearing Ray-Ban
Meta glasses to enter massage parlors across North America and record
interactions with workers. Many of these workers are immigrant and
newcomer women who may or may not engage in sex
work but experience stigma nonetheless. The shameless use of covert

(14:42):
recording technology at massage parlors to gain likes, attention, and
online notoriety is both disgusting and dangerous. Yes, due to
criminalization and stigma, sex workers face disproportionate levels of violence
and harassment. Violations of privacy can lead to arrest, immigration consequences,
and lasting harm. Swan's community made extensive efforts to report

(15:04):
these videos and were deeply disappointed that social media platforms
allow them to remain online. They say that they've also
created a system to warn the women who work at
these places, saying, quote, to warn immigrant and migrant women we support,
SWAN used our abuser alert system to notify workers
about the use of Ray-Ban Meta glasses and the videos circulating online.

(15:24):
We also received reports from community members about clients entering
massage parlors wearing the glasses and recording the women without
their knowledge. I am really grateful that organizations like SWAN
exist that can use technology to help support these women
from the kind of surveillance that this kind of technology represents.
But I do think like this is not just about

(15:46):
a handful of creeps with smart glasses. I think it
really is about what happens when companies like Meta and
Facebook build tools that make exploitation easy, that make it
easy to turn that into a money making enterprise, to
make money from somebody non consensually being sexualized in this way,
and then to have the company just shrug when people

(16:09):
use them exactly that way. Meta has essentially built an
ecosystem where privacy violations are profitable, and the only
form of real accountability that might ever even enter the
equation only happens after a journalist asks about it, after
some sort of public outrage.

Speaker 2 (16:27):
Yeah, not cool and not surprising. You know, they've got
a whole ecosystem built around disrespecting privacy and just letting
harms run wild up until the point where they start
to get pushed back on it, and then maybe they'll

(16:49):
do something.

Speaker 1 (16:50):
Maybe, maybe. Although, not for nothing, I did
see a very satisfying video where a guy in Meta glasses,
which are recording, and I believe that he uploaded the
footage, and it's one of those things where, did you
upload this thinking that it made you look cool? Because
it's really the opposite. Where a guy is trying to
get into a strip club and the woman at the

(17:12):
door, she knows the drill exactly. She's like, oh, if
you want to come in here, hon, you gotta take those
glasses off. And he keeps being like, oh, they're not,
they're just regular glasses, and she absolutely knows that they're not.
He keeps trying to like get them in, get them in.
It was a very satisfying watch just watching this this
creep think that he can outsmart this woman because

(17:33):
she is associated with the strip club and being denied
and eventually being kicked out of the establishment. Very satisfying watch.

Speaker 2 (17:40):
Yeah, I bet it was, and also funny that his
tactic was just to like lie when it sounds like
she clearly knew that, like, they were the Meta glasses. He's like, no,
they're not.

Speaker 1 (17:51):
Let me tell you something. Nobody knows the drill around technology
like anybody who is even tangentially involved in sex work.
If you're a sex worker, you have
to know the drill. Some smug tech bro is not
going to outsmart a woman who works the door at
a strip club when it comes to technology. There's just
no way. No.

Speaker 2 (18:12):
Yeah, a woman working the door at a strip club,
don't cross her.

Speaker 1 (18:17):
Yeah, you're not. There's no outsmarting her. Let's take a
quick break, and we'll be right back. Okay. So, speaking of women

(18:40):
outsmarting technology, this story from the Colorado Sun is
truly wild and also just illustrates how screwed up our
entire tech enabled surveillance landscape is. This time not regarding metaglasses.
It's regarding Flock, which is a company that operates a
vast network of AI powered video cameras, license plate readers,

(19:03):
and microphones for surveillance. If Flock that company, if they
sound familiar to you, it's probably because we talked about
them before. Regarding a situation in Texas where police used
Flock to track down a woman who they said had
an abortion, the police said, Hey, we used Flock license
plate readers to find this woman after her abortion because

(19:25):
her family was worried about her. That's the only reason
we were trying to find her. Or they were worried
about her safety after this abortion. They were worried that
she was, you know, in some sort of physical danger,
and so they had tracked her using Flock license plate
reader technology. However, 404 Media, again shout out to them,
They followed up and found that the police, even though

(19:47):
they said they were just looking for her because her
family was worried about her. Police had actually been looking
to charge her with a crime for having had an abortion.
So it wasn't just that her family was worried about
her and looking for her, it was that they wanted
to charge her with a crime. So Flock is everywhere.
It is in cities all over the country. A lot
of towns and cities have contracts with Flock. Bow Mar, which

(20:09):
is a small suburb northwest of Littleton, Colorado, is one
of several Colorado towns that has a contract with Flock.
So this woman, Christiana in Denver, she got a visit
from the police who came to her door with a
summons for her, accusing her of stealing a package worth
less than twenty five dollars from somebody's doorstep in this

(20:29):
neighboring town of Bow Mar. Their proof? They said they had
footage from a flock surveillance camera showing her car, which
was a green Rivian, driving through the town of Bow Mar
from eleven fifty two am to twelve o nine pm
on the day of the theft. So she says that
police were super weird with her about it. They told her, quote,

(20:50):
you know, we have cameras in that town. You can't
get a breath of fresh air in or out of
that place without us knowing. Just as an example, we
know that you've driven there about twenty times in the
last month, which is a pretty weird thing. If a
cop came to my house and was saying that to me,
I'd be pretty freaked out.

Speaker 2 (21:09):
Yeah, I wouldn't care for that at all. That the
police are just like aware of everything I've been doing
for the past months. Wouldn't like it.

Speaker 1 (21:18):
I also would not like it. The police also told
her that they had seen Ring camera footage that
showed her taking this package from the porch. She was like,
that's not me. That did not happen. I did not
do that. The police told her, I guess this is
a shock to you, but I'm telling you this is
a lock, one hundred percent, no doubt. So she was like,

(21:39):
I know that I did not do this. She asked
the police to see this ring camera footage that they
said was one hundred percent her. They had her dead
to rights taking this package, and they refused to show
her this footage. So Christiana drives a Rivian and I
guess Rivians have a camera in the dash, which captured
footage again proving that she had not driven to where

(22:01):
this crime had happened. When she told police like, hey,
I actually have a camera in my car, I can
show you that I drove to this town and didn't
make any you know, stops at anybody's house to steal
a package, they were like, it doesn't matter. They gave
her the summons. Christiana says that she works in finance,
so understandably was not thrilled about having her name associated

(22:24):
with a theft.

Speaker 2 (22:26):
I bet not. Makes sense. Yeah, probably is not really
helpful in that field.

Speaker 1 (22:31):
So Christiana said, I'm going to fight surveillance with surveillance.
So she and her husband essentially compiled a master file
with any kind of information about her whereabouts from her phone.
Right she had snapshots from her Google timeline, a tool
on her phone that can track each stop that she makes,
statements from people that she interacted with that day. The works.

(22:53):
She took a picture of the outfit that she wore,
which was a light pink top and black pants and
an olive green fleece, with a note that she took
off her sweater because it started to get warm outside.
That Ring camera footage that the police said showed her
definitively stealing that package? Well, somehow she was able to
track down that footage using Nextdoor, and that footage

(23:14):
does show a person coming up to a porch and
taking a package and running away, but it doesn't show
them getting into a car that looks anything like hers.
And so basically she had actually gone to this town
where the theft happened, but she had legitimate business there
with a tailor that was more than a quarter mile
from the house where this package was stolen. So she

(23:37):
collected all this surveillance footage, footage of her entering the
tailor's office and then leaving the tailor's office. She got
footage of her driving her car from the rivian that
shows that she basically drove from her house to this
tailor and back without any additional stops. She compiled all
of this evidence at her own expense and effort and

(23:58):
sent it to the police. Eventually, the chief of police
responded and congratulated her on her detective work and announced
that the police were going to be dismissing the charges
against her. In the email, they wrote, after reviewing the evidence
he provided, parentheses, nicely done, BT dubs, we have voided

(24:18):
the summons we issued, so it's great that they dropped
this bogus charge against her. But she was like, did
the police just use Flock to see that I happened
to drive to the same town where somebody happened to
have had a package stolen and decided that because I
was in that town that I did it.

Speaker 2 (24:39):
And it does raise questions, like that police officer who came
to her house who said, one hundred percent it's a lock.
Clearly it wasn't. It's like, what made him think that?
Was it really just that Flock said that she
was in the town? So curious. I guess we'll
probably never know.

Speaker 1 (24:59):
Well, The Colorado Sun asked the police about this. They were like, basically,
what happened here? The chief of police told The Colorado Sun
that Flock uses cameras to identify stolen vehicles, stolen license plates,
any wanted subjects that enter town, or follow-up investigations. Quote,
we can't see people inside the car. We do follow-up investigations.
If we have a crime, we'll go back and look

(25:21):
at the cameras and see who was in town at
the time of the crime. So it does kind of
just sound like this is a tool that police are
relying on to be like, oh, who is in town
when this crime happened? And it just doesn't sound like
more surveillance in this instance is actually leading to crimes
being solved. I think, if anything, it is leading police

(25:43):
to say, oh, this person was here, it was probably them.

Speaker 2 (25:47):
This is something that we see pretty often that people
really have like an outsized level of confidence in results
that they get from a computer, like this belief that
if a software system says something, that it has to
be true. And I wonder if that's what's going on here,
that they just like have way too much faith

(26:12):
and confidence in Flock to be tracking every single person
who's coming and going into that town to the detriment
of like reasonable skepticism that like, maybe there's more to
the story than the one person whose car was driving
through their little town.

Speaker 1 (26:32):
Yes, And I would take that further and say that
I think it is with intention. I think that when
the state uses computer systems and
technology systems like Flock, it absolves them of having to
do any investigative work or the accountability of being wrong,
because you can just say, oh, well, it was the computer, like,

(26:54):
we were just looking at the AI. The AI is
never wrong. The AI told us that you were responsible,
and we just followed up on that, and so
they don't have to have any accountability. I mean, Christianna
says she never received any kind of explanation from anybody
at the police department or even an apology, and it
just I know I've said this before, but it reminds
me of that IBM slide from like the seventies. A

(27:16):
computer can never be held accountable. Therefore a computer must
never make a management decision. I think that the police
are with intention outsourcing this kind of police work to
technology and surveillance tools like Flock precisely because it means
that they do not have to be held accountable as
the human police when they fuck up as bad as this.

Speaker 2 (27:37):
Yeah, I think you're absolutely right. It's a way, it's
like a backdoor way to reduce their own accountability. The
weird thing about it is though, that like it's not
like that accountability then gets transferred to the software company.
It just kind of evaporates into the air and there
is no accountability.

Speaker 1 (27:57):
She didn't even get an apology. I mean, we will
put the piece in the show notes obviously, but she
describes sleepless nights and wondering if her finance job was
going to be coming to an end because of this accusation,
this completely unfounded accusation, and they did not even give
her an apology. Let me tell you something.

(28:19):
I would need an apology to move on from this
if this had happened to me. That would be the very
least of the things I would need. I would need it.
I would need a clear apology from somebody.

Speaker 2 (28:28):
Yeah, I would be pretty mad. And it also
sounds like she really had to do the work to
like prove herself innocent here, and that's not how it's
supposed to work.

Speaker 1 (28:42):
Absolutely, you know, she said that she basically had to
exonerate herself. And my question is what would have happened
if Christiana was not the kind of person who was
able to spend so much time, at her own expense,
compiling so much proof to illustrate that she was innocent. Like,
I'm happy that she beat the charges, but she only
did that through out-surveilling the surveillance. And I think

(29:04):
if we're at the point where you need a tailor's
receipt and your own car's dashcam footage and all of
this just to prove that you didn't steal a package
worth less than twenty five dollars. Maybe this level of
surveillance is not actually keeping anybody safer or deterring crime.
Maybe this is not actually the future that we want.

Speaker 2 (29:23):
And meanwhile, there's still somebody out there stealing packages.

Speaker 1 (29:26):
The real package thief is still out there, people. Okay,
So speaking of accountability, we were talking before we got
on the mic about this new law in China regulating influencers.
So there's this new law where, if influencers or content
creators want to make content online about specific regulated topics,
they will need to provide some kind of education or

(29:47):
training in those areas. So this new law requires social
media influencers to hold verified qualifications before discussing medicine, law, education,
or finance. And it's aimed at curbing misinformation. How will
this be regulated, Well, it'll be up to platforms like
the Chinese version of TikTok, which is Douyin, or Weibo

(30:10):
and Bilibili, to verify content creators' credentials before allowing that
content to be posted. Content creators if they want to
talk about these specific regulated topics, they will need credentials
like degrees, certifications, or professional license in those topics. It's
actually a pretty wide ranging set of new laws. The

(30:31):
Cyberspace Administration of China, which is the Chinese government agency
responsible for regulating and overseeing the country's Internet, online content
and data security, has also implemented some new regulations on
social media advertising and promotions. Under these new laws, any
campaign for medical products, supplements, and health foods will now
be prohibited unless it is clearly defined as such to

(30:53):
prevent misleading content disguised as educational materials. So basically you
can't just say, oh, this is just educational information,
when actually it is a campaign for a medical supplement
that I want you to buy. Influencers will also have
to disclose sources for studies or note when content is
AI generated under this new legislation. How are

(31:15):
you feeling about this? You and I had a meaty
conversation about it off mic. I don't know
how into it you want to get, but
how are we feeling?

Speaker 2 (31:24):
Conflicted. Very conflicted. You know, it's obviously scary to be
regulating speech, but also what we're doing in this
country is not working at all right now.
Like, the Internet is just filled with medical misinformation,

(31:48):
which I, you know, know a little bit of something
about, enough to say, yeah, this is a really
bad state of affairs. And the other areas,
legal, education, and finance information, I
can only imagine that it's equally bad in those
domains as well. And it's not just that

(32:11):
there's some bad information out there, but our
social media platforms are designed to amplify and spread and
replicate that misinformation. So, like, it's very bad, and I'm
open to ideas about trying to rein that in

(32:34):
in some way. And so there's something appealing about some
laws to rein it in, and I kind of think that,
like, in this country, we do need some laws to
rein that in. But it's also pretty scary, because if
you think about the administration in charge right now, what
would it look like for them to have the final
say in what kind of medical information is allowed to

(32:57):
be shared and not, right? Like, there's misinformation spewing
from the White House, from the head of HHS. So,
I don't know, what do you think about it?

Speaker 1 (33:11):
You really said it. I mean, on the one hand,
I want to be clear that I'm skeptical about this,
because it feels like censorship to me. And also, part of it,
a real love-hate thing that I have
with social media, is that the democratization of media, where
having a platform or having a voice is not

(33:31):
gatekept like it is in traditional media. That is something
that I love. It is what has driven
me to being a podcaster. But I also kind
of hate it, because that means anybody can just get
a microphone and a ring light and just say whatever
they want. And yeah, so I want to be clear
that I'm skeptical of this, but, like you said, obviously

(33:51):
the system that we have in the United States is
not working. What's funny to me is, so RFK Jr.
here, the head of HHS, he has a law degree.
He doesn't have any kind of medical degree. Would he
not be able to make content or,
you know, commentary about health advice or medical advice if

(34:15):
this law were in the United States, because he only
has a law degree and not any kind of
medical expertise?

Speaker 2 (34:19):
I mean, we know how much these guys care about laws.

Speaker 1 (34:24):
It wouldn't stop him.

Speaker 2 (34:26):
If it were a sane society, he would not be
making comments about health stuff. And even if he was,
if we were a sane society, people wouldn't be listening
to him. We wouldn't be amplifying him.

Speaker 1 (34:40):
As DC residents, the day that he swam in Rock Creek...
nobody would do that. If you live in the mid-Atlantic,
you know this homie just took a swim in poop.

Speaker 2 (34:54):
Yeah, he just took a swim in poop, and
he had his kids out there doing it. I love
Rock Creek. I go hiking along it multiple times
every single week. I would not swim in it. Like,
when it rains, it smells like poop, because a lot
of the sewers just wash right in there. That's
just how it is with combined sewers in an

(35:14):
old urban city like this. It's not
a good place for people to swim.

Speaker 1 (35:21):
And I'm not even sure it's safe for dogs to swim.

Speaker 2 (35:24):
No, it's not, because they drink it. And the annoying
thing is that when he did that, after he
took his little splash around in Rock Creek,
ostensibly he was fine. We don't know what
was going on in the bathrooms at home, but he
says he was fine, and he holds that up as
evidence that it's an okay thing to do.

(35:46):
But it's not just about him. This is one
of the most frustrating things about him: he
is the leader of Health and Human Services. People
look to him as a role model, even though they shouldn't,
but he has a responsibility to act in a
way that keeps people in general safe, whether or not

(36:11):
he himself got lucky rolling the dice. That's not
the point of public health.

Speaker 1 (36:18):
Yes, I remember him saying, oh, well, nobody should take
health advice or medical advice from me. On the one hand,
I fucking agree. On the other hand, what are you
doing leading HHS?

Speaker 2 (36:29):
Like you know, yeah, Like I'm sorry to tell you,
people are going to be taking cues from you. Yeah.

Speaker 1 (36:36):
So the reason that I wanted to talk about this
new law in China is not to advocate for it,
because I am skeptical of it. But I did
have a moment where I thought, boy, what would it
look like if we had something like this in the
United States? Setting aside the point that you made, which
is a good one, and which is why I'm a little
bit iffy about a lot of legislation cracking down
on this kind of thing, because under a hostile

(37:00):
administration like this one, I would not want this administration
being the ones, you...

Speaker 2 (37:04):
Know, in charge of anything.

Speaker 1 (37:08):
You get it. Yeah, you get what I'm trying to say. However,
putting that aside, I still can't help but speculate:
what would our internet landscape look like if you
needed a degree to talk about these things? Listen,
I have been trying to get my finances together lately,
and the amount of bad financial advice that is out there from people who

(37:32):
don't know what they're talking about is staggering. Good
financial advice is generally pretty standard and humdrum and boring, you know,
stuff like don't spend more than you have. But there
are so many people out there who are peddling the
worst financial advice you've ever heard in your life. And
come to find out, a lot of these people don't

(37:52):
even have any kind of credentials to be giving the
public financial advice. And so if we had this kind
of law in the United States, I mean, you would
have no more finance bros saying things like, stop buying
avocado toast and then you can afford a house, when
they (a) have no financial credentials to be telling anybody
how to run their finances and (b) are being
bankrolled by their parents. No more wellness coaches selling

(38:16):
four hundred dollar moon water or any kind of
weird supplements. I just think that if this kind of
law came to the United States, think about all of
the loud, big voices that would just vanish overnight.

Speaker 2 (38:29):
Yeah, if you couldn't just say lies for profit, it's
hard not to be a little attracted to that.

Speaker 1 (38:38):
That's what I'm saying. Like, again, I am skeptical
of this law, but it's
hard not to see this and be like, wow, we
really have a problem in the United States, and anybody
can just build a massive platform saying anything, and it's fine.

Speaker 2 (38:54):
Yeah. And in this country, it seems like our political leaders,
and certainly the leaders of social media platforms, have
embraced the chaos. They're like, this is what makes America
strong: the ability to just scream harmful lies
and drown out anyone who's trying to say something reasonable.

Speaker 1 (39:16):
Our president was out here doing crypto scams and nobody
said boo. Like, this is a scam economy. That's
all there is to it.

Speaker 2 (39:24):
It's so true. So many scams. Like, every day he
does a dozen things that for any other administration would
be an enormous scandal, and they just vanish into
the media ether.

Speaker 1 (39:43):
More after a quick break. Let's get right back into it.
So I wanted to briefly talk about this new research

(40:04):
from the Institute for Strategic Dialogue, or ISD, who, full disclosure,
I used to do a bunch of partnership work with.
They just put out a fascinating new study that really
sheds some light on how the ecosystem of nonconsensual
deepfake material works online. Basically, tools to make and
distribute this kind of material are all over social media.

(40:27):
ISD analyzed web traffic to thirty one websites that provide
deepfake tools and found that those sites got a
combined twenty one million visits a month, with the most
popular sites getting over three million visits in one month.
As of May, ISD's analysis found thirty one active deepfake
tools that were easily discoverable on X, 4chan,

(40:49):
and common search engines. These tools can create realistic, explicit
deepfakes from a single photo, and we already know
what that means for people who are targeted with this kind
of thing: there's a potential for social exclusion, job discrimination,
and debilitating emotional stress for the victims, especially people
in public positions. I've often said this: this is not

(41:12):
just about, you know, naughty pictures. It's horrifying, but in
my book, it is definitely a democracy issue. Right? If
the people who are targeted with this kind of thing are
overwhelmingly women, those women, especially women in public positions or
elected officials and things like that, it is going to
keep women who are targeted from running for those positions

(41:34):
and taking a more public role in civic participation. And
it's going to keep us all from the representative democracy
that we deserve. So it's not just an issue of gross,
creepy pictures, which it is. It is a democracy issue
that we should all be concerned about.

Speaker 2 (41:51):
Yes. And I think that's a really good point to make,
you know, after the segment that we just did
about regulating speech about health online,
because democracy is complicated. It doesn't just mean

(42:12):
anybody can say anything they want, as, like, the be-all
and end-all virtue of what it means to be
a democracy. Like you just said, the ability of women
to run for office and just show up online
and exist, and to be able to do that in
a way where they aren't being harassed and subject to

(42:37):
like, nudify apps and things. That's also a democracy issue, right?
I think there's a lot to balance there, and I think
online, like so many things, the conversation gets
immediately taken to the extreme of really fetishizing one

(42:58):
aspect of democracy and leaving out the aspect that you
mentioned: in some cases, every single person
can't act with complete freedom all the time, if
that means that it's going to restrict the
ability of a whole class of people to participate in society.

Speaker 1 (43:21):
Exactly. Very well put. So, you might think that these
nudify apps are hard to find, that they're, like, on
the dark web or something, but according to this research,
it is very simple to find tools like nudify apps online.
According to the authors, searches on Google, Yahoo, and Bing

(43:41):
for terms like "nudify," "undress app," or "deepnude" usually
returned at least one of these tools within the first
twenty results. Last year, 404 Media also noticed
that Google was showing ads for some of these apps.
Bing in particular made these tools especially easy to find,
with the top result for all three of these searches

(44:01):
being nudify tools. These findings focused on organic search results,
not paid ads. The researchers also found that X was
a major platform for spreading these tools. Does this shock you?

Speaker 2 (44:13):
It does not, no. I mean, what would
surprise me is if Grok itself refused to just do
this for somebody. That would be surprising.

Speaker 1 (44:23):
Oh, with Grok, you can just put up a picture
of anybody and ask Grok, like, undress this person,
and it'll put them in a bikini. So,
out of almost half a million mentions between June twenty twenty and
July twenty twenty five, more than seventy percent, nearly two
hundred and ninety thousand, of these mentions
were on X. Much of this activity appeared to come

(44:43):
from bots, identified by repetitive usernames, identical post styles,
and similar profile pictures. So even though a lot of
this seemed to be automated, the volume of it is
still pretty concerning, because it is likely drawing new users
to tools that can be used illegally in certain situations.
This is absolutely horrifying: according to this research, in early

(45:04):
twenty twenty three, there was a notable surge in mentions
of these tools on Tumblr after a woman shared her
experience of being sexually harassed using nudify and deepfake tools.
As many victims of malicious deepfakes have repeatedly pointed out,
speaking about this kind of personal harassment, or even calling
out the harassment of others, just comes with the risk

(45:27):
of attracting further attention and thus additional abuse. And
I guess that's what really infuriates me and simultaneously breaks
my heart about this. Whenever I research an incident,
like an incident in a school involving minor boys using
these tools to make and distribute images of the minor

(45:48):
girls in their class, something I see again and again
is the boys saying some iteration of, well, this can't
be illegal, this can't be wrong, because it's so
easy to find online. If this were illegal, I wouldn't
be able to just Google it or use a mainstream
social media platform to find it so easily.
Kind of assuming that just because it's so

(46:11):
easily available, it's probably legal and fine to be using.
And I can see why they would think that, because
in some ways they are absolutely correct, right? The fact
that platforms like X allow this stuff is essentially a
kind of implicit endorsement. But I think it just shows
how much we have failed our youth, because when this

(46:32):
kind of harm is so easy to find online, after
a while, they're right: it does kind of look like permission.

Speaker 2 (46:40):
Yeah, that's a really good point. When it's that available,
it looks like permission. You know, it is
either an implicit wink and a nod, or even
just an explicit thumbs up, to go ahead and use it.
And, you know, the availability of something harmful like this,

(47:05):
I guess I'm really stuck on this balancing of
different competing aspects of democracy thing. But there are
ways that these platforms could rein this in, right? Like,
they could deprioritize these results, make these tools harder for
people to find. It wouldn't even be like
passing a law to ban them, but these platforms, if

(47:29):
they really cared, could take steps to limit the availability,
which would have huge harm reduction benefits. And yet they don't.
You know, just because something is legal doesn't mean that
it should be just pushed to every kid who types

(47:51):
a search term for it into whatever platform they're using.
Probably X.

Speaker 1 (47:56):
Well, it goes back to my question, and I mean
this in a genuine way; this is above my pay grade.
I am not smart enough to be the definitive voice
on this. But at what point are these platforms materially
benefiting from an illegal enterprise? If it is illegal to
use some of these nudify apps, and these apps
are easily findable on platforms, and in some cases advertised

(48:17):
on these platforms, at what point is it, oh, you
are making money from an illegal enterprise that is disproportionately
harming girls, children?

Speaker 2 (48:26):
That's right. And another thing that they mentioned in this
study was that there were a lot of ads being
taken out on these platforms using the
names of these particular tools. What was interesting was that
it sounded like, in a lot of cases, the
nudify tools themselves were not buying ads. But when the

(48:49):
tools got taken down through some sort of action or
for whatever reason, so the tool no longer existed
but people were still searching for it, other companies would
bid on those search terms when they would
buy their ads on, you know, Meta or X or wherever.

(49:09):
And so there's a great example of money flowing
directly to these platforms to keep these tools in the
zeitgeist, in circulation. Yeah, you raise a great point:
shouldn't they have some accountability for making money off of
these tools, which in a lot of cases can be

(49:32):
very harmful?

Speaker 1 (49:33):
I mean, Christiana had the cops show up on her
doorstep for supposedly stealing a package that they said was
worth less than twenty five dollars. At what point are
the cops gonna show up at Elon Musk's doorstep?

Speaker 2 (49:45):
For this? Damn, that is another good question, Bridget.
We better move on to the next segment before we
start going full-on revolution.

Speaker 1 (49:57):
You know who I want by my side for that revolution?
Will Smith.

Speaker 2 (50:01):
Well, you're gonna get Bob Ferguson.

Speaker 1 (50:05):
He just needs to charge his phone.

Speaker 2 (50:07):
Yeah, he just needs to charge his phone. Gotta make
a call.

Speaker 1 (50:11):
Do you want to say what our Halloween costumes are gonna

Speaker 2 (50:14):
Be? Oh yeah, happily. I am Bob Ferguson
from the movie One Battle After Another, Leo's character.
I put together what I think is a pretty
good costume, and I've just been wearing it around
every time I've gone out for the past week. A

(50:37):
few people get it, and they really like it.

Speaker 1 (50:40):
Yeah, a few people. What's funny is that I know
you so well. You are Bob Ferguson for Halloween, and
you are also kind of Bob Ferguson in real life.
I'll let the listeners speculate on what I mean
by that, but you have a lot in common with him.
And when I saw the movie, I thought, this is Mike;

(51:00):
you are very similar to Bob Ferguson. I don't mean
this as an insult. I'm sure you know that.

Speaker 2 (51:06):
No, I take it well. And to be clear, there
are also several key differences, but yeah, it fits.
The costume fits. The funny thing about it, though, is
if people don't recognize it from the movie... You know,
I've got like a big oversized flannel robe and
a hat and those big dark sunglasses that

(51:27):
people can wear over their regular glasses. If people don't
recognize the costume, I just look like a bum, which...

Speaker 1 (51:34):
No offense, but, like, that's... I don't know how
to finish the sentence without sounding offensive. But you know
what I'm getting at here?

Speaker 2 (51:43):
I think I do. I mean, generally in my
life, people aren't mistaking me for a bum all that often,
but I guess it comes up. How about you? What are
you gonna be for Halloween?

Speaker 1 (51:56):
Blade! Blade from the movie Blade, the titular
Blade, which you and I
rewatched recently. Holds up very well.

Speaker 2 (52:07):
I thought so, yeah. It moves along, it's action packed. He
barely speaks, but when he does, he's got some good lines.

Speaker 1 (52:16):
He's gotta talk like this. I've been working on my
Blade voice.

Speaker 2 (52:20):
He has a very low voice.

Speaker 1 (52:21):
But he doesn't really say a lot. It's a
lot of cool one-liners delivered like this.

Speaker 2 (52:29):
Yeah, the cool one-liners. That's what I
want in an action hero. Blade, Spider-Man...

Speaker 1 (52:39):
Who else? A lot of them have...

Speaker 2 (52:41):
Good one-liners like that. You don't want long soliloquies.
You just want a quick little one-liner and then, uh,
maybe some punches.

Speaker 1 (52:50):
I'm worried that people are just gonna think I'm goth.
They're not gonna know that I'm Blade from the movie Blade.
They're gonna just think, is this an old goth?
An old Blade?

Speaker 2 (53:02):
Well, I think the katana will probably help them figure
it out.

Speaker 1 (53:06):
An old Black goth martial arts enthusiast, just hanging out
on Halloween. Okay, wait, so I have one more thing.
It's actually kind of a cool story, I think.
Can I get... can you give me one more thing?

Speaker 2 (53:18):
Yeah, please. I mean, it's been, uh, just one bad
story after another, so if you've got something good, let's
hear it.

Speaker 1 (53:25):
A good one, a good one, and it involves Goodreads,
so that was a good segue. I love this story
so much. Shout out to 404 Media again for
another banger. So this rogue librarian on Goodreads is editing
book titles to protest Goodreads censorship. According to 404 Media,
one of the site's volunteer moderators, which Goodreads calls librarians,

(53:49):
swapped the blurbs and pictures and titles of a handful
of books, books like Reese Witherspoon's thriller Gone Before Goodbye,
which, I did not even know Witherspoon, the actress, is
now writing thrillers. Love a thriller, I need to check
that out. And the Nicholas Sparks bestseller Remain. They are
swapping out all of the information and blurbs of those

(54:10):
books with the pictures and blurbs from Eric Trump's book
Under Siege, adding the subtitle, quote, Goodreads censorship in favor
of Trump. So I did not even know that Eric
Trump had written a book. I don't even need to
look it up. I guarantee that thing was ghostwritten, for sure. Basically,
they are altering all of these books and swapping in

(54:33):
Eric Trump's book, explaining that Goodreads is removing criticism of
Eric Trump's book from the site, saying quote silencing criticism
of political figures, especially those associated with authoritarian movements, helps
normalize and strengthen those movements. When we let powerful people's
books be protected from criticism, we give up the right

(54:56):
to hold power accountable. These changes were up on Goodreads
for a few hours before being corrected, so...

Speaker 2 (55:04):
That librarian is absolutely correct.

Speaker 1 (55:06):
That is.

Speaker 2 (55:09):
the story, as far as I can tell. Every
democracy that has slid into authoritarianism, that's how it happens:
people dialing back criticism of the people in power,
self-censoring, and just letting them get away with it.

(55:31):
So good on this librarian.

Speaker 1 (55:33):
Absolutely. So this librarian, this rogue librarian who was protesting Goodreads,
which is owned by Amazon, says that Goodreads is censoring
negative reviews of Eric Trump's book. They said that Goodreads
deleted negative reviews of Under Siege as they came in
after its publication on October fourteenth. These were honest opinions
from real readers who disagreed with the book's contents, the

(55:54):
librarian said in their post. When people noticed and complained,
Goodreads deleted all reviews of the book, positive and
negative alike. This wasn't an accident or a one-time
glitch; it was a deliberate pattern. So 404 Media reached
out to Goodreads to ask if the platform was disallowing
these reviews, and it does sound like Goodreads was trying
to prevent review bombing of Eric Trump's book. In response

(56:18):
to the questions about the reviews for the book, a
spokesperson from Goodreads told 404 Media that Goodreads has
systems in place to detect unusual activity on book pages
and may temporarily limit ratings and reviews that don't adhere
to our review and community guidelines. In all cases, we
enforce clear standards and remove content and/or accounts that

(56:38):
violate these guidelines. So we did an episode about this.
But review bombing is a legit problem on Goodreads, and this
is also the nature of the complaint, according to
this rogue librarian. They said: when a platform removes criticism
of a political book while leaving praise, or removes everything
to hide that criticism ever existed, they're not staying neutral;

(56:59):
they're taking a side. Goodreads is owned by Amazon, one of
the world's largest companies. When major platforms decide which opinions
can exist and which must disappear, they shape what people
think is true or acceptable. Honestly, I love everything about
this story. Hate everything about Eric Trump. It sounds like
people did not like his book. It sounds like they
read his book and legitimately took issue with it, and

(57:21):
so they were expressing that, and I love that. I
also just love that it took a rogue Goodreads
librarian volunteer to pull this off. Like, not a whistleblower,
not a hacker, just a super annoyed, pissed-off, fed-up
volunteer. It really reminds me of that Margaret Mead quote:

(57:42):
never doubt that a small group of thoughtful, committed citizens
can change the world; indeed, it's the only thing that
ever has. In this case, never doubt that an annoyed,
bookish volunteer can also change the world.

Speaker 2 (57:56):
Yeah, and not just any volunteer, a librarian.

Speaker 1 (57:59):
Yes, shout out to the librarians. Sometimes it really does
feel like they are the only thing standing between us
and a complete descent into a fascist nightmare hellscape.

Speaker 2 (58:10):
Absolutely. They are the keepers of knowledge during a time
when knowledge is being questioned, reality is being questioned, we
live in different realities. Maybe librarians will be the ones
to help us make sense of it and save us.

(58:32):
So, Bridget, today is Halloween. How are you going
to be celebrating in your Blade costume?

Speaker 1 (58:39):
Oh, I give out candy every year, even though I
live in a building, in a neighborhood, that people might
not associate with giving out candy. It is truly one of
the things I love the most. I love the kids
in costumes, I love being in costume. Oh, I'll be
giving out candy. If you see Blade, come say hello.

(59:00):
Unless you're a vampire, in which case, watch out.

Speaker 2 (59:02):
Ooh scary words for the vampires.

Speaker 1 (59:06):
Well, Mike, where can folks keep in touch with us
this spooky season and beyond?

Speaker 2 (59:13):
People can leave us a comment on Spotify. They
can send us an email at hello at tangoti dot com.
We love getting listener emails. People have been sending
in some really interesting, thought-provoking ones lately,
so thank you for that. Please keep it up. And
we're going to do that mailbag episode real soon,
so please keep the emails coming. People can check out

(59:35):
Bridget's socials. Her username is Bridget Marie in DC on
both Instagram and TikTok. And we have a YouTube channel;
we just posted something there today. The name of the
channel is There Are No Girls on the Internet. It's
very easy to remember.

Speaker 1 (59:50):
Well, Mike, thank you so much for being here. Happy
Halloween, and thanks to all of you for listening. I
hope you have a spooky and safe Halloween. I will
see you on the Internet. Got a story about an
interesting thing in tech, or just want to say hi?
You can reach us at hello at tangoti dot com.

(01:00:12):
You can also find transcripts for today's episode at tangoti
dot com. There Are No Girls on the Internet was
created by me, Bridget Todd. It's a production of iHeartRadio
and Unbossed Creative. Jonathan Strickland is our executive producer. Tari
Harrison is our producer and sound engineer. Michael Amato is
our contributing producer. I'm your host, Bridget Todd. If you
want to help us grow, rate and review us on
Apple Podcasts. For more podcasts from iHeartRadio, check out the

(01:00:35):
iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.