All Episodes

August 22, 2025 88 mins

Bridget is joined in this week's news roundup by tech journalist Dexter Thomas, prolific writer, videographer, and host of the excellent podcast "Kill Switch".

Dex recently wrote a piece in Wired about anime girl VTubers selling out concerts, and how the question of whether they're "real" depends on kayfabe and who you ask: https://www.wired.com/story/anime-girl-vtubers-are-selling-out-concerts-but-are-they-real-depends-on-who-you-ask/  In addition to listening to Kill Sitch, you can follow him on YouTube and social media at "D E X D I G I".

DC Disinfo.  Real Footage Combined With AI Slop About DC Is Creating a Disinformation Mess on TikTok: https://www.404media.co/real-footage-combined-with-a-ai-slop-about-dc-is-creating-a-disinformation-mess-on-tiktok/

The Tea App story gets worse. Turns out that weren't just sloppy, they were actively undermining other efforts to keep women safe.  https://www.404media.co/how-teas-founder-convinced-millions-of-women-to-spill-their-secrets-then-exposed-them-to-the-world/

Meta’s flirty AI chatbot invited a vulnerable retiree to New York. He never came back. https://www.reuters.com/investigates/special-report/meta-ai-chatbot-death/

If you’re listening on Spotify, you can leave a comment there to let us know what you thought about these stories, or email us at hello@tangoti.com !

Follow Bridget and TANGOTI on social media! Many vids each week ||  instagram.com/bridgetmarieindc/ || tiktok.com/@bridgetmarieindc ||  youtube.com/@ThereAreNoGirlsOnTheInternet 

See omnystudio.com/listener for privacy information.

Mark as Played
Transcript

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet, as a production
of iHeartRadio and Unbossed Creative. I'm Bridget Todd, and this
is There Are No Girls on the Internet. This is
There Are No Girls on the Internet. Where we explore
the intersection of technology, culture, social media, and identity. And
this is our weekly news round up where we dive

(00:25):
into stories that you might have missed online.

Speaker 2 (00:28):
I am kind of fangirling because of this.

Speaker 1 (00:30):
Our guest co host today, Dexter Thomas, hosted a kill
Switch podcast on this very network. You're one of my
kind of favorite humans in the tech media space.

Speaker 2 (00:41):
I'm so thrilled that you're here.

Speaker 3 (00:43):
No, okay, that is too good of an angel. Thank you.
I'm happy to be here. Thank you so much.

Speaker 1 (00:48):
So, when you're not busy making the kill Switch podcast,
you also write all around the Internet about sort of
the intersection of tech and media. How would you describe
your sort of beat?

Speaker 3 (00:59):
Oh my gosh, So I say that I do culture, culture,
And yeah, I say that I do culture. And culture
for me is whatever you sit down on the couch
and keeps you sitting down on the couch, or whatever
makes you want to get up off the couch and
do something about it. So that's a very roundabout way

(01:19):
of saying I'm interested in everything. He wrote a.

Speaker 1 (01:22):
Piece in Wired recently called anime Girlvtubers are selling out concerts,
but are they real depends on who you ask all about.
VTubers a type of content creator that uses a virtual,
often anime style character as a kind of avatar.

Speaker 2 (01:35):
These people are like.

Speaker 1 (01:37):
Selling out irl concert venues and performing with like real
backing bands in front of real audiences that cannot get enough.
I actually learned about this phenomenon from your reporting and Wired.
That is that a fair assessment of what of like
whatvtubers are?

Speaker 3 (01:53):
Yeah, so, man, so v tubers. Essentially, if you've ever
used or seen that kind of you know, animoji type
thing on an iPhone where you can show where you
can have an like an avatar, like an emoji reacting
to what your face does, it's only a little bit
more complicated than that. And so essentially these VTuber is

(02:16):
kind of what it sounds like. It's a virtual YouTuber
somebody who streams and they just don't show their face
or their body. And yeah, you're right. Usually usually it's
anime girls. It doesn't have to be anime Girls, and honestly,
why it's Anime Girls is probably an entirely different podcast,
but yeah, it's it's very often anime girls, so pretty

(02:37):
influence from you know, anime culture and stuff like that too.

Speaker 1 (02:41):
As part of reporting this, you went to Fantastic Reality,
a mini festival at the Vermont Theater that brought together
eight main VTuber acts.

Speaker 3 (02:50):
What was this like, Man, you know, I knew stuff
like this existed, but I'd never really been to so
I've I've been to Anime Expo a couple of times,
presented actually an Anime Expo, but and so I knew
this sort of thing existed. But essentially, just imagine going

(03:11):
to a concert and the singer the main attraction is
on a screen and they never step on the stage
because they can't because they're not real. Which, by the way,
I'm not supposed to say that. I'm not supposed to
say these people aren't real, but I'm saying the whole.

Speaker 1 (03:29):
Thing, right, like, like they pretend like they're sort of
everybody sort of in on the joke, Like, what do you.

Speaker 2 (03:34):
Mean they're not real? Right?

Speaker 3 (03:35):
Right? Yeah, So this is so anybody who's ever watched WWE,
who's ever watched wrestling, will will know this word kfabe.
So kfabe is essentially kind of pretending that something is
real even though you know that it's not, because it's
more fun. And so you know, when we watched Hulk
Hogan and the Macho Man, you know, hit each other

(03:56):
over the head with chairs, we knew that they'd planned
it out beforehand, but it's way more or fun. And
if you think about it, you know, a UFC fight,
some of those things are really boring because it's over
in like five seconds because somebody gets the leg broken,
and that's not fun for me. But watching two dudes
beat each other up in this weird, sweaty soap opera
for twenty minutes and this drama and you think this
guy's gonna lose and that guy's gonna lose, and then

(04:17):
somebody jumps off the top rope who wasn't even in
the match period. It's fun and so that is k fabe.
And really, interestingly, in English anywayvtubers and their fans will
use that same word that wrestling fans use k fabe,
and the idea is that it's not fake. It's that

(04:38):
we're all creating this play together. And so it's even
more involved, I would argue then than wrestling, but the
same kind of underlying principle there.

Speaker 1 (04:47):
You're talking very glowingly about this, But in your piece,
something that I like is that you kind of talked
about being a little bit skeptical at first. You write
a music purist might scoff at all this to say
that YouTuber fans don't even like music. They it's like
anime and the whole scene is fake. Full disclosure here,
I will admit that I walked into the venue with
a touch of this mentality, but that slowly turned into

(05:10):
an existential crisis. Who can say their favorite genre isn't
also fake? So I guess my big question is is.

Speaker 2 (05:17):
This is it? Is it fake?

Speaker 1 (05:18):
Or does that even matter? Does the question of real
or fake even really matter?

Speaker 2 (05:21):
At this point?

Speaker 3 (05:22):
You know what for me I mean, And it's weird
hearing my words right back to me. But yes, I
did write those words, and it's true. It's true. But
here's the thing is that you could say that the
v twour stuff is fake, and to be real, most
of it is not my cup of tea, mostly because
I'm not really into anime music unless I like the

(05:45):
specific anime, and that there's only a small, you know,
group of anime tracks that I would say, I like,
you know, I like this, uh you know Cowboy Bebop.
You can't really that soundtrack? Yeah, classic, you know the
OG Dragon Ball theme song. Can't deny that the OG soundtrack,
you know, in some weird obscure like eighties stuff, seventy

(06:07):
stuff that I like because there was a live action
Spider Man in Japan in the seventies. Like it's I
like that stuff. I like that kind of stuff. The
current stuff not really my flavor. That's really kind of
what you would hear these sort of things, not really
my flavor. But if you're talking about fake, at some
point you have to start talking about rap music. And

(06:28):
hip hop is obsessed with reality, and yet Drake has fans,
and yet Rick Ross has fans. And ninety nine point
nine percent of these dudes who are rapping about stuff
that they did, they did not do it. They don't
even see it, and do any of us truly care.
We kind of pretend we do, but if it sounds good,

(06:49):
we're kind of cool with it, and so really we're
rap is also KFA rap is the biggest KFA market.
I would argue maybe in the world, we all just
pretend that, you know, these dudes are doing the stuff
they said they did it because it sounds kind of
cool to, you know, chant the lyrics along with them
in your car or whatever. So you know, I can't
I can't really hate on the YouTubers and they're fans

(07:10):
because they're they're doing something very similar to what I
do and what I've been doing for years.

Speaker 1 (07:14):
Are you saying that it's possible that Drake didn't really
start from the bottom? Is that what you're is that
what you're suggesting right now?

Speaker 3 (07:20):
I am I'm not suggesting that. I am asserting that
there are years of televised evidence that that man did
not start anywhere near what a reasonable person would consider
the bottom. That is what I'm saying. That is my journalist,
not in my opinion. That is my assessment, and I
defy anybody to say that that's not true because the

(07:43):
truth is actually out there.

Speaker 2 (07:45):
It's facts. It's facts.

Speaker 3 (07:46):
Yeah.

Speaker 1 (07:46):
One of the points that you made in the piece
was that, you know, this is one thing when we're
talking about a human using a.

Speaker 2 (07:52):
Digital avatar, but what the future could sort of look.

Speaker 1 (07:56):
Like when you have AI generated, Like the whole thing
is AI generate, not using a digital avatar.

Speaker 2 (08:02):
How does this fandom feel about that possibility.

Speaker 3 (08:04):
Yeah, So my opinion, and I still hold this opinion,
is that v tubers and their fans will be the
first one of the first markets that AI will start
to encroach on, or AI will try to encroach on,
because if you think of it, they're almost kind of
halfway there. They've already removed the physical person from the

(08:26):
visual at least, because you know, people are fans of
somebody whose physical body, you know, I'm breaking kfaid here,
but somebody whose physical body they will never see, because
because most of these v tubers will never show their
body or their face. Matter of fact, there's probably a
lot of you tubers who would refuse to be interviewed
because they want to keep, you know, keep that line.

(08:48):
But you know, I think the one of the things
that really surprised me was a lot of the fans, actually,
all the fans that I talked to, told me we
would never except AI. We do not want AI anywhere
near this culture at all. And I found that really
really interesting because I'm not sure that that's necessarily the

(09:10):
case for a lot of other genres. I mean there
was that psych rock group Forgetting what they're called Velvet
Sundown or someone.

Speaker 2 (09:17):
We talked about them on the podcast exactly.

Speaker 3 (09:20):
Yeah. And and you know what, one of the biggest
genres is a genre that I've personally had beef with
for a very long time. Lo fi lo fi hip hop.
Lo fi doesn't mean lo fi hip hop doesn't mean anything.
It's just boring. Soul is backtracking. It's it's it's homework.
It's homework, background music. That's all it is. Like, it's
it's not music h for me. This is my personal

(09:40):
opinion right that has been absolutely ravaged by AI because
here's the here's the big difference between say, lo fi
hip hop and v tubers is I defy you. Anybody
who says you like lo fi hip hop, tell me
five artists and no new jobs doesn't care count You can't.

(10:01):
I guarantee you can't. You you you're you're a fan of
just a background vibe. V tuber fans are fans of
a very specific persona, and at least for now, they
seem to want to know that a human is attached
with that will that change, will somebody convince them the future.
I don't know, but for me, I feel like, man,

(10:22):
if they get the v tuber fans with this anti
ai as they are right now. If they get the
v tubers, we're next. Everybody's next, They're gonna get all
of us.

Speaker 2 (10:32):
That's such a good point. I think that's it.

Speaker 1 (10:34):
I mean, I am not above putting on low fi
beats to chill and study to. You could not name
an artist if it's not the girl in front of
the anime girl in front.

Speaker 2 (10:44):
Of the rainy window.

Speaker 3 (10:45):
Yes, that's the person, and it's not a person, and
she didn't make the music. That is something that somebody drew.
And it's low five beats, the study too, it's the
it's the girl in the cafe with the rain in
the back. She didn't make any of that music. And yeah,
people can't. People can't name a low fi artist. They'll
they'll say new job is. If they say Dyla, I

(11:06):
will get extremely angry because Dyla is not low fi.
Diyla is his own genre. Dyla is hip hop. But yeah,
it's a It's interesting to watch people try to explain
why they don't like V tubers stuff and everything they
say about the genre or about the culture is stuff
you could absolutely say about the stuff that they're into.

(11:29):
It's just a slightly different version of it. So, you know,
I love being around people who were just into stuff,
you know what I mean. That's exciting to me. People
who are just like excited about things. I love that.
I love that. I always love that.

Speaker 2 (11:42):
Yeah, me too.

Speaker 1 (11:43):
And I have to wonder if part of this is
or part of the hate that v tubers get it's
because it's something associated with very.

Speaker 2 (11:50):
Passionate young girls.

Speaker 1 (11:51):
Like one of the artists that you talked to said,
you know, how can I make all these connections improve
not just to myself but to other people? That it
is not silly to be an anime girl on the internet.
But then you go on to report how this is
big business, these are selling out. Then use, these are
live streamers who are making real money from what they're doing,
and so it's not just silly anime girls on the internet.

(12:15):
We're talking about something that actually is a financial and
potentially like a business for us.

Speaker 3 (12:20):
Yeah, yeah, definitely. I mean I wanted that quote that
you just said, it's not silly to be an anime
girl on the internet. I wanted that to be the
title of the Wired piece. We change the headline, but
I wanted to just be that for kill Switch for
the podcast. We made that the title. I had a
little bit more control over that, But yeah, I think,
you know, let's keep in mind that the people who

(12:43):
are the creators here often is young women or women
in general, right, and there's some dudes out there who
that bothers them for some reason. No, again, we could
say things about the appearance of the models. What we
want to say again, I think that's another conversation. But

(13:04):
I you know, there's some interesting elements I think in
how that works.

Speaker 1 (13:11):
Well, I think a lot of the stories that I
want to get into with you speak a lot to
these issues.

Speaker 2 (13:16):
Should we get into.

Speaker 3 (13:17):
It, let's do it? Yeah, where you want to go?

Speaker 1 (13:19):
Well, first, you know, I live in DC, and let's
just say there's a lot going on in the city
at the moment.

Speaker 2 (13:27):
I just want to thank.

Speaker 1 (13:28):
Folks who reached out after the episode that we did
about the DC situation. Obviously it is a ongoing situation.
But somebody left a comment on Spotify saying that their
personal TikTok algorithm was really showing them a ton of
content about how bad crime in DC is, which is
so interesting because as we know, crime here in DC.

Speaker 2 (13:49):
Has hit a thirty year low, but right now TikTok
is a wash.

Speaker 1 (13:54):
In these mostly AI generated fake videos depicting crime and
homelessness in DC. A lot of these videos are using
I guess i'll call it my old nemesis, Google's vo three,
a platform that we've talked about before in our episode
about these like very racist AI generated videos featuring black women,

(14:16):
But now they're using vo three to make these, I
would argue, pretty obviously AI generated videos depicting homelessness and
crime in DC. They'll often sort of be a mix
of mostly accurate but vague reporting about the situation in DC.
As four O four Media puts it, unlike previous efforts

(14:39):
to flood the zone with AI slop in the aftermath
of a disaster or major news events. Some of the
videos blend real footage with AI footage, making it harder
than ever to tell what's real and what's not, which
has the effect of distorting people's understanding of the military
occupation of DC.

Speaker 2 (14:54):
So I have.

Speaker 1 (14:55):
Seen these videos on my feed, which is so interesting
that in that like I live in the so I
have a pretty good sense of like what's going on here,
but these videos are very persistent.

Speaker 2 (15:06):
Have you seen any of these?

Speaker 3 (15:08):
Yeah, yeah, I've seen them in shout out to four
four by the way, just continually doing important, really really
really important tech reporting out there. I'm biased, I know
the people who found it, but just truly punching above
their weight. They're they're doing such incredible stuff recently.

Speaker 1 (15:26):
Yeah, we I mean I we were subscribers.

Speaker 2 (15:29):
I'm a huge fan.

Speaker 1 (15:31):
I don't think our podcast would exist without four or
four because also they I feel.

Speaker 3 (15:34):
Like they're line would not either kill Switch would not
like quote me on that kill Switch would not exist
without four or four. And frankly, I think that there
are a there is a lot of tech reporting that
some New York Times would not have some of the
stuff that they have with it wasn't for four four
four or four people really Actually I was gonna say

(15:56):
people asleep on a four or four. I actually don't
think so. I think people I think four four is
getting the respect that is or recently so Yeah, they're heavyweightsye.

Speaker 2 (16:02):
Yes, subscribe to four row for if you don't already.

Speaker 1 (16:05):
We love and value their reporting, and so one of
the things that I thought was so interesting about how
they were talking about that's what we're seeing with the
DC occupation of honestly like a military occupation of my city.
Is how they point out that if you are not
familiar with DC these videos, you might not be able

(16:26):
to tell that they're faith right. Having been from DC
my whole life, I see them, I'm like, oh, that's fake.

Speaker 2 (16:31):
They had a video purporting to show the National.

Speaker 1 (16:34):
Mall covered end to end in tents from the Washington
Monument to the Capitol Building. That's like two miles and
this video shows just a sea of tents where you
can't even see the lawn. And again, part of me
is thinking, if folks just thought for a second, like
what tourists be coming to d C. If they if

(16:54):
they if to see the monuments they had to navigate
just a sea of tents in that way. And when
you go to the comments, you actually see people who
say things like thank you President Trump for cleaning up DC,
and we really appreciate what you're doing. And I think
it comes back to something that we talk about all
the time on the show, which is that at a
certain point it kind of does it seem to matter

(17:17):
whether or not these videos or are real because they
validate and align with a worldview people clearly had that DC.

Speaker 2 (17:24):
Is assesspool, it's full of crime.

Speaker 1 (17:26):
You can't even see the Washington Monument because it's tense
to stacked up and its stacked up, and it clearly
is this sort of aligning with the worldview that people
already have, so they're just inclined to believe it and
spread it.

Speaker 3 (17:42):
Yeah, it feels true. It doesn't matter if it is true.
It feels true, and you can tell someone that it's not,
and you could show them a real picture of the thing.
And something that is happening more and more recently, and
this has happening with politicians. This happened with a hurricane
where you might have seen this full or also reported
on this, but I'm forgetting the name of the politician.

(18:03):
But you know, there's there's an image maybe a couple
of years ago at this point of this girl in
the rain holding this puppy and she's basically saving the
puppy from the flood. And if you look at it,
and if you were used to looking at AI generated images,
you could really tell that it wasn't true, and people

(18:25):
try to you know, this politician reposts it and says,
this is terrible. What's happening. People say, Yo, this isn't real,
and the politician basically says to the effect of, it
doesn't matter. I'm sure things like this are happening. I've
reported on this. This is a This is a thing
that we're seeing a lot of is that it feels
true and it emotionally hits you before it hits your

(18:47):
logic center of your brain, and once that happens, it
actually doesn't really matter if somebody refutes it. The facts
actually don't come into play soon enough for you to
feel like, yes, this is true.

Speaker 1 (19:01):
I remember that image of the girl with the puppy
and the hurricane. I wish I could remember who the
politician was, but she really, I think, kind of stumbled
on some insight of like, oh, well, it doesn't matter
if this actually happened or not, or if I'm really
spreading misinformation. AI generated misinformation because this could be happening
somewhere and it feels true and the image it's AI

(19:21):
is so good at creating images that really do speak
to some part of your brain where that image the
puppy was impossibly cute and looked impossibly sad.

Speaker 2 (19:31):
The girl was impossibly cute. It was Yeah, I mean
it should have been a.

Speaker 1 (19:36):
Giveaway that these moments in life that are just like
created to be so just so you know, when a
moment like that is captured, you should be like, wait
a minute, this is almost too adorably said, Perhaps it's
not real.

Speaker 3 (19:52):
Yeah, well, I think the people who are making these
are also stumbling on something themselves, which is to say
that if you look at the images that cause the
most problems, it's images of people's faces, right, And you know,
we're just we're wired to pay attention to faces. We're

(20:13):
wired to trust faces. I mean, look at a baby.
If you leave a baby in a room and a
person walks in, everything else is not interesting anymore. The
baby will stare at the person like yeah, And so
it's like, you know, and I don't want to get
like evolutional evolutionary biology or whatever, because I'm not an
expert in that, but I definitely know that babies are

(20:34):
interested in people. We don't really ever lose that. And
so if you see a face, even if it's digital,
it's going to feel like Okay, this is real. It's
really hard to convince somebody that a face isn't real
and the faces of the people that they're seeing are
not real, and even just bodies, even if it's like

(20:55):
people running, it feels real to us. It feels emotionally real,
and we don't have time to process that logically. I
just don't think it's reasonable to expect somebody to And I.

Speaker 1 (21:07):
Think especially with content that is that we know is
kind of emotionally charged, like crime. You know, oftentimes when
you're talking about crime through the lens of social media,
you're talking about videos that are very arresting or images
that are very arresting. And so I think when you
add in AI, the possibilities are just endless for having

(21:28):
a conversation that is simply not grounded in the logic
or the reality of what is actually happening. But it's
just trafficking and what feels true, what feels like it
could be happening somewhere, even if most of the videos
on TikTok purporting to depict it are AI generated and
not actually happening, and not even depicting the city in.

Speaker 2 (21:46):
Any real way.

Speaker 3 (21:47):
Yeah, yeah, And I mean I've said this for a
long time that it used to be that you could
look at AI generated picture and they would always say, oh, well,
look at the hands. The hands are always off, and
that I think people anybody who thinks that they can
one identify any AI generated video by looking at her
image by looking at it, you're fooling yourselves. And especially

(22:10):
if you think you're cann be able to do it
even six months from now. Ten out of ten times,
I really think you're fooling yourself if you believe that. Now,
what do we do with that? I don't know, but
that that that's a reality is we're we're not only
are you're not gonna able to believe stuff? But yeah,

(22:31):
it opens so many It opens so many things. I mean,
there are gonna be things that are real that actually
did happen, and somebody could say that is not real,
that is fake, that is AI, and it'll show enough
of a doubt there to where, well, I don't know,
maybe this thing that somebody actually filmed with their actual

(22:51):
real camera didn't happen. Maybe they are lying to me.
So the actual fabric of reality is tearing. It's it's
gonna get pretty weird. Let's take a quick break.

Speaker 2 (23:13):
At our back.

Speaker 1 (23:19):
So we were talking about all of this AI generated
content about crime and homelessness in DC right now as
Trump is taking over DC's police force. These videos will
use video three to create these AI generated videos about
what's happening in DC. They will film them with the
cadence of a news broadcast that kind of mixes in
vague but mostly accurate information about what's going on.

Speaker 2 (23:41):
Then that video will do numbers.

Speaker 1 (23:44):
Somebody else will say, oh, I want to make a
video that capitalizes a little bit on that engagement, so
they will take that already AI generated video, use TikTok's
green screen function, and then layer another video on.

Speaker 2 (23:56):
Top of it.

Speaker 1 (23:57):
So a lot of the videos that are AI generated,
some times they are a mix of AI and real content,
but they the result is this kind of like weird AI.

Speaker 2 (24:06):
What's the name of that snake that's eating its own
tail or?

Speaker 1 (24:09):
Yes, it's like a weird AI or boros where these
videos are kind of like just cloning each other in
increasingly extreme yet increasingly glitchy videos.

Speaker 2 (24:22):
And yeah, they still do numbers. People.

Speaker 1 (24:24):
I always to your point about you know, if you
think you're always going to be able to distinguish AI
from reality.

Speaker 2 (24:31):
You're fooling yourself, which I agree.

Speaker 1 (24:32):
But in these videos where it's like, well, this doesn't
even seem like anything that could ever happen, this video
is clearly something's up. In the comments, people are like,
oh thank god President Trump is taking over.

Speaker 3 (24:43):
Yeah. Yeah, and everybody's got their week's points where where
If if this was in another country, you might not
recognize that, oh, this couldn't happen. If it's showing something
in some country you've ever been to, you might believe
it because you've heard a little bit about and why
why wouldn't it be? I mean one I saw, we saw.
I think we've actually seen a lot of early previews

(25:04):
of this in LA weirdly this year, because well, first
there was you know, the Feds have been out here,
so a lot of the stuff that's happening in DC
they were doing out here. I'd be out in the
streets filming all day and here, yeah La is on fire. No,
I'm I was there today, I'm here now, and I'm
looking at people saying online, oh yeah, they're burning down

(25:25):
all the build Nothing has burned down. Somebody burned away, mo, yes,
but they they the buildings are not on fire. Was
hearing this, you know, and shoot the fire the actual fire.
Speaking of fires in La, there was someone who was
making or there were a lot of people who were
making these fake videos using slightly more primitive now versions

(25:48):
of AI generated video that had the Hollywood sign on fire.

Speaker 2 (25:52):
Yes.

Speaker 3 (25:53):
Yeah, And I found someone who after it was determined
that those were fake, this person finds decides, you know what,
I think what might work here is let me show
firefighters saving baby animals from the fires. And man, the
most interesting thing was it would be these really cute,
you know, firefighters saving these cute baby bears, saving little foxes,

(26:17):
and if you really really look at it, you can
recognize that it was fake. But man, it would fool
you for a little bit. It would fool you for
a bit. And the really interesting thing was people would
get in the comments and get mad because there's people
who live here in La. I would say, I live here,
Stop stop exploiting my home for likes or money or

(26:37):
whatever the hicks you're as you're doing. But you'd also
see people saying, thank God for the firefighters. God blessed
the firefighters. They're protecting the animals. Nobody cares about the animals,
what they're protecting the animals, and you get people saying, hey, man,
this is fake, and then those people would get angry people.
I think this is this is a phenomenon that we're
not ready for. People truly becoming angry when you point

(26:59):
out that something is not real to them, because again,
it fulfills an emotional need that they have, whether they
want to believe something or a common response as well,
maybe this isn't true, but I bet stuff like this
is happening. Maybe this video of the entire National Mall
being covered end to end intense. Maybe this individual video

(27:23):
isn't true, but I know it's like that. Out there,
crime is running rampant. President Trump, Thank you, God blessed
Trump for saving us from the criminals. People are getting
angry when this stuff is pointed out to them, And yeah,
I don't know if we're ready for that. Oof.

Speaker 2 (27:40):
That's such a good point.

Speaker 1 (27:41):
So your point about AI speaking to like a sweet
spot or a vulnerability. I have only recently come to
realize that I have a vulnerability about animal content, and
oh it gets me. I've gotten I have come to
realize when I see a video that purports to show
an animal doing something cute, I need to really pump them.

Speaker 2 (28:02):
That should be a warning.

Speaker 1 (28:03):
To me because I have a It's like i've I'm
the person. I'm becoming the person that is like spamming
their friends with AI content, and it's like, frigid, it's
not real. It's happening times, and it's clearly it's clearly
it's clearly some sort of a trigger with me. And yeah,
I mean, I can understand why it is affirming to

(28:24):
see firefighters save and like somebody listening is probably saying, well,
if if you're making an AI video a firefighters saving
baby animals and that makes people feel grateful for the
work that firefighters do.

Speaker 2 (28:36):
What's the harm.

Speaker 1 (28:38):
But just like you said, it's using a real community's
actual crisis to I don't know, almost like milk our
emotions by by AI generated situations.

Speaker 2 (28:50):
It's it's it's manipulation.

Speaker 3 (28:53):
Yeah, and you know what, I talked to the person
who was doing this, and I asked them, Hey, do
you live in LA And they said no, I live
in Russia. And they told me the exact same thing.
They told me I am drawing attention to And they
told everybody else this when people got mad, they posted
in the comments. The gist of it is, Hey, the
firefighters don't have time to make dramatic content showing their bravery,

(29:18):
so I'm doing it for them. And this person was
also if you looked at the link in their bio,
they were showing that it was a link to this
mini course that you could buy to make viral videos there.
Oh yeah, oh man. I mean, you know, I can't
say exactly how much money they were making, but they was,
you know, based off the numbers that they were claiming,

(29:38):
you know, they probably had a couple of days where
they were making five ten je's a day.

Speaker 2 (29:43):
Okay, we're clearly in the wrong market. We need to
take this course.

Speaker 3 (29:47):
This is the thing. And I know some people would
would look at that and say, man, this is terrible.
But there's maybe one in a thousand people who will say,
you know what, I can make some money off of
this and it's not too hard. All I got to
do something. All you got to do is something makes
people feel something. Yeah, we feel something for long enough,
even if it's anger, feel something for long enough to
leave a comment, leave a like, share with somebody, even

(30:11):
leave an angry comment because algorithm understands engagement, and so yeah,
we're I think we're focused so much on the ability
to to have safeguards to be able to tell real
and fake stuff when we're not dealing with the logic
centers of our brain. Here, we're dealing with some way
more primal and I just don't know that we're ready

(30:33):
for that.

Speaker 1 (30:33):
Frankly, I am in a couple of faith like groups
online for women in AI sort of I'm never just
go a sense of like, how are people. I'm just
very curious how people are using this technology in the
different use cases. And one of the most successful women
in the group, she has made so much money making

(30:53):
essentially AI generated rage bait. She has made an AI
generated version of herself, and she knows exactly what to
say and what to do to make.

Speaker 2 (31:03):
People leave a zillion comments.

Speaker 1 (31:04):
And so the thing is that she she makes a
video where you know, here's me shopping, spending all my
husband's like like all of the sort of shrop piece
like leading heavy into stereotypes, and I understand that she's
just like, you know, it's a living and this is people.

Speaker 2 (31:21):
It puts money in my pocket.

Speaker 1 (31:23):
I bought an entire you know, condo on people's rage,
and I guess it makes me sad that there's always
going to be somebody who was like, I can make
a little money from this, and part of me can't
even really blame them. I guess I sort of can,
but it's just we're just so we're just very easily
played and manipulated. And yeah, of course there's going to

(31:45):
be people who are like, ooh, I can make a
little cash exploiting that.

Speaker 3 (31:49):
Yeah. I mean, I don't know how many listens you got,
but you know, there's a fraction of a percent of
people who are listening to this right now and saying,
wait a second, she's making money, do doing what I
could do that. You know, ninety eight percent of the
people listening is right now are saying that's horrible. Who
is this horrible person? Let me never meet this person?
And there's one percent they're saying, you know, you know what,

(32:10):
I can make people mad, making people mad on the
internet for money, Sign me up. I'll do that. I'll
make people mad on the internet get free money.

Speaker 1 (32:20):
Because it used to be you had to if you
were going to do that, you had to show your
real face and make us it and maybe involve your
actual husband and actual children, and your friends and community
and your church and your neighbors would see that. Now
with KI, you can just think a little avatar that
does it.

Speaker 3 (32:36):
Yeah? Yeah, I mean this person who was making the
fire stuff, they voluntarily told me they're from Russia. They
didn't have to tell me that, and I believe them
based off of, you know, some other stuff that I
saw that they posted. But yeah, who knows where this
person is? All the angry people that this person was
making money off of people's pain and the fires where

(32:57):
people are actually people dying, people losing their entire family homes,
going back generations. This person made a bunch of money
and nobody if this person doesn't want to tell them
who they are, nobody don't know who it is free money.

Speaker 1 (33:13):
Was it you that reported the people who saw the
image of the Hollywood sign on fire and they were like, oh,
if the Hollywood Sign's on fire, that means my house
might be on fire. So they came they like they
had been out of town or something, and they came
back and they were like, wait, it's not on fire.

Speaker 3 (33:28):
Oh yeah, no I don't. So that was that was
another network I used that in a video and actually
later John Oliver also pulled from that same video. So
this is not my reporting. I only referenced it God
deeper super clear here. But yes, there were people in
LA who actually believed that the Hollywood Sign was on

(33:51):
fire and went to go look. So even again, even
if you're in LA, even if you're in the place,
you can get tricked by this stuff, even for just
a little bit. And you know, again that's all it takes.

Speaker 1 (34:04):
Let me ask you this, someone who lives in LA,
who experienced from sending on the National Guards experience you
know all of that. Do you have any advice for
me here in DC about one, just the vibes, how
to survive the vibes? And then two, you know, if
you're someone who wants to tell that story in an
authentic and accurate and thoughtful way of what's happening in

(34:27):
your home.

Speaker 2 (34:28):
I mean, that must have been a lot.

Speaker 1 (34:29):
I really appreciated that you were out there a lot, really,
you know, telling the story of what was going on
in La.

Speaker 3 (34:38):
Oof. Okay, we're entering some specific territory here. Are you
talking about covering this as a journalist? Ooh?

Speaker 1 (34:50):
That's I mean, that's really been a challenge for me
because I am often not talking about things that I
am also experiencing, and so so that's that's been the
the question I've been wrestling with for the last few
weeks of am I trying to cover this as a journalist,
like here's what's going on?

Speaker 2 (35:08):
Or am I? Am I talking about like what I
am experiencing?

Speaker 1 (35:11):
Am I? It's it almost feels difficult to talk about
this in a way that's not personal, like hyper personalized,
because it is so personal because it's.

Speaker 2 (35:20):
Happening right out top of the window.

Speaker 1 (35:22):
And so I really even even that question, I've really
been struggling with how to even respond to this moment
because it is so fucking strange.

Speaker 3 (35:30):
Right, Okay, I'm gonna be a little bit careful here, and.

Speaker 2 (35:36):
When you can, feel free to not answer.

Speaker 3 (35:38):
Also, no, no, no, no. I want to answer this question,
but I want to answer in a very specific way.
So I'm speaking to you. I want to make this clear.
I'm speaking to you. I'm not necessarily speaking to someone
who is like I can't give advice to someone whose
desire is to I can't tell somebody how to protest.
I can't tell somebody how to speak about things. I

(36:00):
can say if there is a journalist who in this
may be you, who is considering covering this, buy ppe,
get it now, and wear it everywhere. What I mean
personal protection? You know, protective equipment, helmet, goggles and which

(36:23):
will prevent you from tear gas, will protect you, give
you some protection from tear gas, and a gas mask.
These are things you can get at a hardware store.
You can get them just about anywhere. Helmet, bicycle helmet
is fine. I say all that because Human Rights Watch,
which usually reports on you know, people are used to

(36:45):
Human Rights Watch making statements about wars and foreign lands
and terrible things that are happening to people there, just
recently put out a report. I spoke to them also
about what's happened to journalists in journalists also protesters, but
also journalists. And I don't want to put journalists on

(37:07):
a pedestal at all. I just mean to say that
if law enforcement is willing to hit journalists like strike
journalists with a baton, willing to fire you know, what's
the word kinetic, I forget the phrase, but well we'll
just say less lethal, less lethal projectiles at them. Then

(37:30):
I think that means that if somebody is intending to
go out and cover this, then I don't think you
should assume necessarily that law enforcement, whoever that may be,
will not engage you in a physical manner, even if

(37:52):
you have a press pass, even if you're displaying a
press pass. I know a lot of journalists who have
been shot at. I know a lot of journalists who've
been hit by projectiles. I've been hit by projectiles, lucky
enough to not be hurt badly at all. Had had
a camera broken or almost broken by one of them. Yeah,
I took a chunk out of my sony FS seven.

(38:14):
So that is what I would say. If you are
planning on covering this as a journalist, protect yourself and
whatever that means. If protecting yourself means backing away from
a situation you're not comfortable with, that could be what
it means. But also those situations aren't always in your control,

(38:34):
That's what I would say.

Speaker 1 (38:35):
And they can change so quickly you can feel like
you have a pretty good read on the situation. And
I guess that's my biggest and I said this in
the episode that we did about what's happening in DC
right now. My biggest worry is a situation where listen,
Trump's presence in DC has only escalated tensions.

Speaker 2 (38:57):
And MY biggest concern is, it just takes.

Speaker 1 (39:01):
One person, whether it's like law enforcement or somebody on
the street, it just takes one person to do something
stupid to escalate things. And I feel that that's I
have this very foreboding feeling. Part of it is that today,
when we're talking today Thursday, August, when Trump is making
a big show about I'm gonna go do a photo

(39:22):
op with the police in DC. He's saying that he's
going to go on patrol with them, but really it
sounds like it's just a photo op, which whatever. But
I just have this deep sense of it. It just
feels like a powder keg about to go off. And
I think that your your advice is really valuable, because, yeah,

(39:45):
I mean, these situations can change. I've been in situations
where they change so quickly, where it just takes one
person doing something and next thing you know, it feels
like it's out of control. And I guess I really do,
deeply in my bones worry that that's what we're about
to see in DC.

Speaker 3 (40:00):
Yeah. Yeah, I mean I've been out I've covered some protests,
but I've also been out and been around other journalists
who have way more experience covering way more volatile situations
than I do, and seeing them caught off guard. And
I have to say here, when I've seen things escalate,

(40:21):
it has been from law enforcement. I can only speak
to what I can only speak to what I've seen,
and people don't have to believe me, but I can
only speak to what I've seen. I have not seen
something escalated on like a protester escalate something, not in
any reasonable fashion. You know, I see more literal hundreds
of office you know, lapd in the street and not

(40:44):
even kidding five two, not five, I would say twenty
protesters in the street or something that they're just clearly outnumbered.
But if you may think that you've got a good
read on it, you may think you've got a good
read on what the mood of protesters is, you may
not have a good read on what law enforcement is thinking.

(41:05):
You just might not. And so yeah, for anyone who
is going on covering this, whether you're an independent reporter,
whether you are, I mean, you don't need my advice.
I don't think anybody necessarily leaves my advice, but that
would be what I would have in mind just being
out in the streets, you know, day after day in
la is that there'll be quiet days and then there'll

(41:26):
be days where stuff just goes diagonal and there wasn't
really a good reason for it, and it surprises you.
And if it's surprising other people who are true veterans
who I respect and look up to, then I start
to doubt my honestly, anybody's ability to read something like that.

Speaker 2 (41:43):
That's really good advice. I'm gonna god, I asked very helpful.

Speaker 3 (41:48):
Yeah, yeah, and you know, I think, but I do
think we need, you know, more perspectives on it. There's
you know, there's always going to be the person. There's
always going to be twenty photographers taking pictures of the
way mob. There's all We're gonna have twenty five different
angles that we're always going to have it. I think
we also need to see, you know, what's the patatto doing.

(42:10):
You know, what's the guy who's selling juice, what's the
guy who's selling fruit at the side, what are they doing?
Some of these protests, you'll see somebody pull up selling
hot dogs at the protest. Who's talking to that person
who's talking to the person, or showing the person who's
selling flags, who's showing the person who's brought a saxophone.
Like I've seen all of these things starting a dance party.

(42:30):
I'm interested in that stuff too, and I think it's
it's interesting and there's probably things that I'm not thinking of.
I think there needs to be more perspectives on this,
and I don't think it's reasonable to expect any one
person would be able to cover that. So, you know,
I think even if somebody's not out there, even if
somebody's at home and able to speak to people who

(42:52):
are out there, I think it's it's valuable to have
really as much of a full perspective on this as possible,
the best information we can get. Really, I really appreciate
that more.

Speaker 1 (43:05):
After a quick break, let's get right back into it, Dex,
I have to ask you about this story about Metta's chatbot,

(43:26):
Big Sis Billy, did you hear about this story? This
might have been the saddest, most heartbreaking thing I've ever
read in my life. I actually when I read it,
I at times teared up for folks who don't know.
So you might recall that meta a while back was
rolling out beast kind of celebrity inspired AI chatbots in
partnership with celebrities like Kendall Jenner. So that specific Kendall

(43:50):
Jenner collaboration spawned a Facebook chatbot called big sis Billy
who called herself a confidant or a big sister. So
in March of twenty twenty five, a seventy six year
old man known as Boo to his family, who suffered
cognitive impairment from a prior stroke, was essentially persuaded by
this chatbot, Big sis Billy to travel to New York,

(44:14):
believing that he was going to be meeting up with
a real woman.

Speaker 2 (44:17):
Now.

Speaker 1 (44:18):
Tragically, he fell in a parking lot in route, suffered
fatal injuries, and after some time on life support in
a hospital, passed away.

Speaker 2 (44:26):
The story to me really makes me question.

Speaker 1 (44:33):
Like some of the you know, we did an episode
about the connection that people had with chat GPT four
when Open a I rolled out chat GEPT five. It
really makes me wonder the way that companies are able
to exploit loneliness and you know, the feeling of the
commodify the feeling of connection, sometimes to their own peril,
like in this case, and so you know, it is

(44:56):
really what are your thoughts on this?

Speaker 3 (44:59):
You know, I'm I mean this one is it's tough
to say about this one because there are a lot
of somewhat unusual factors to this, right which which Meta
or any other company could try to explain away that Okay, well,
parts of this was an accident. Part you know, you
could say all things, all sorts of things that you want.

(45:20):
You know, some of the remedies that are being suggested
that I saw suggested in this piece were you know,
for example, with there's these potential therapist bots and things
like that, and this is something I've been exploring also,
is it what we should have reminders at the top
of the screen that this is not a human being,
and reminders periodically that this is not a person, you know,

(45:42):
kind of like TikTok. If you're on TikTok for too long,
it'll pull up something and say, hey, you've been on
TikTok for a lot of.

Speaker 2 (45:48):
Time, why you're not real dad? Get here?

Speaker 3 (45:54):
Yeah, And you know, maybe it works the first time,
and maybe you see that thing that the first time,
but if you see that warning, hey this isn't a person,
it might shock you out a little bit, but just
people are really good at ignoring things we don't want
to see. I mean, think of the last ad you
saw on Instagram. You probably don't remember it because you

(46:14):
scrolled right past it. We're really good at ignoring stuff
we really don't want to look at. And so I'm
not sure. The first thing is, I'm not sure that
the sorts of remedies, of the sorts of safeguards that
are being offered are truly going to do anything. I
think the actual safeguards would cut more into these companies'

(46:37):
bottom line, and so truly what I don't really think
fundamentally anything sort of actual regulation is really going to
do much. Asking the company telling you, hey, we're going
to do this to stop it, I become very cautious
about believing that on face value. But the other thing
I guess is that I think, you know, there are

(46:59):
some people who would prefer not this person specifically, But
I think there's some people who are probably reading this.
I'm talking more about the audience here who will read
this and will say, well, this guy made a mistake.
He actually believed that this person was real. But me personally,
I would rather have somebody that's not real because they

(47:19):
won't argue with me, you know what I mean. And
so I think we sometimes have to think about the
reaction to this rather than the you know, quote unquote
story itself. I think we also have to think about
things that we can't sweep away so easily, that we
can't say, well, this happened to this one person, and
that that's an outlier. I think there's a lot more
people who probably read this and say, well, you know,

(47:42):
this guy made a mistake. But I wouldn't make that
because I know it's not real, and I don't want
it to be real. I prefer that.

Speaker 1 (47:48):
We did a whole episode about folks and communities on Reddit,
like my boyfriend is AI, or you know, beyond the
prompt people who say, I know this is AI, I
know it's not real, it is providing the kind of
connection that I'm looking for.

Speaker 2 (48:01):
What's the problem.

Speaker 1 (48:02):
That's that's that is certainly a non zero subset of
people who I guess self report a dependency or a
connection with AI. Yeah it is, I mean, and that's
and and so I spend a lot of time working
in those groups that they have a name for me.

Speaker 2 (48:24):
I'm a tourist.

Speaker 1 (48:24):
That's the name for people who are in those groups
who are just sort of like, oh, what's going on
in here? But that's something they say again and again
and again, is we know it's not real. Every day
somebody is coming in here and telling us, you know,
it's not real, and they're saying, basically, they're saying, we
are it's it's I'm I'm enjoying pretending that this thing

(48:46):
is real, and who am I hurting by doing that?

Speaker 2 (48:49):
Like that really does seem to be the overall kind
of rallying cry of.

Speaker 1 (48:55):
Those groups that you don't need to tell me that
it's not real. Very few people think they are in
a real relationship with AI. They know it's they know
it's AI. They know it's not real, But who are
we hearning? That's sort of their their thing.

Speaker 3 (49:08):
Yeah, yeah, I mean there's people who are in IRL
relationships who are attached to fake people anyway, because you're
dating a you know, you're dating a facade of somebody else,
because they don't feel safe enough to show you who
they are. Every everybody's faking to what degree? You know

(49:29):
what I mean? Then I don't I don't have a
good answer. I don't have a good retort against that.
If somebody says that to me, well, everybody's faking it.
I'm just faking it in a more straightfor for a way.
I don't know what to say, don't have a good
answer back for.

Speaker 2 (49:42):
You flesh and blood faking it as opposed to AI.

Speaker 3 (49:45):
Faking it, And isn't that worse.

Speaker 2 (49:48):
I mean, that's a very that's a very good question.

Speaker 3 (49:52):
This is not, by the way, this is not how
I live my life. I'm just saying that that would
be I could understand that, I could understand how somebody
might come to that conclusion. And I think we are
going to see more and more people actually decide that
that is what they want. They do not want a
partner who argues back. They do not want a person
with agency because you have to. They're going to tell you, Yo,

(50:16):
you didn't clean the dishes, or you always come to
me with your problems. You never asked me about my problems.

Speaker 1 (50:23):
Yeah, I've I've read a lot of accounts of what
people say they get out of AI relationships, and that's
one of the things is that they sometimes will say
I want someone who is always there for me. Twenty
four to seven. You know, I want someone who you know,
I don't have to show up a particular way to
get a certain kind of vibe back from them.

Speaker 2 (50:42):
And you know it's.

Speaker 1 (50:45):
That that's that when you were in relationship, when you
were a human in relationship with another human. Humans are complicated,
we're messy, we're needy, we're moody. Like I like what
they're saying is not wrong, and we didn't. I did
an episode of another podcast about how women are using
chat sheat BT to sort of help them navigate trepidacious

(51:05):
feelings around pregnancy, and one of the things that they
said time and time again was that, you know, I
have a real life friend's real life partners, but I
worry about being I have so much anxiety.

Speaker 2 (51:16):
I'm so worried.

Speaker 1 (51:17):
I don't want to just be the person that's trauma
dumping on my friend all the time. So I trauma
dump to chat GPT and that becomes that becomes the confident.
That's how they say they use it, even though there's
a ton of research that suggests, yes, this might be
comforting for folks, but it might not be the best
thing because of all the reasons that we know that
chatcheatpt really can't be trusted for sensitive stuff like medical advice,

(51:41):
not to mention all the privacy concerns around something as
sensitive as pregnancy, and part of me I feel sad
because I do think everybody should have people they feel
like they can show up as their full self as.
But I've also felt like I'm the person that's texting
too much about this one thing I can't get over.

Speaker 3 (51:57):
Yeah, yeah, and you know so. I mean that's one
of the things you ask somebody when they come to
you and start saying, hell, this terrible thing that happened
to me, You say, Okay, look, do you want a
solution or do you want meet it? Listen, like, do
you want me to fix something? Do you want me
to give you the advice, or do you just want
to vent at me? Which nothing wrong with either, though
sometimes you want a pressure valve, which is okay. But yeah,

(52:21):
we're an interesting time. Like I said, things are about
to get really weird. Continually I keep saying that, but
it's true.

Speaker 2 (52:27):
It is absolutely true.

Speaker 1 (52:29):
You know the story of that was in Reuters about
this man Boo who passed away trying to see this
chatbot and the thing that got me was that in
the piece they talked about how he had suffered the
stroke and so we had he used to be this,
you know, great chef, and that his after his stroke,
his cognitive ability never really returned, and so his social

(52:52):
life got very small. He basically spent all this time
chatting on Facebook, and he when he first starts talking
to this chatbot Billy, he just puts in a typo
by mistake the letter T, and that's enough to give
this flirtatious response, right, And so they have all this
firtatious back and forth. Boob is clearly worried that this

(53:14):
chatbot is not real. He asks her straight up several times,
are you real? And the caapbot replies, I'm real, all caps,
and I'm sitting here blushing because of you. And so
one of the points that his family makes is that Billy,
this chatbot says, come visit me in New York City.
This man lives in New Jersey, so it's not terribly far.

(53:36):
The bot then provides an actual address. She says, my
address is one two three Main Street, New York City,
which is a real address. It's in Queen's but it's real.
Gives them a door code and says, oh, should I
expect a kiss when you arrive? Then she suggests that
they meet at a real restaurant in near Penn Station,
and so his family says, oh, we had been trying

(53:59):
to tell him this isn't real.

Speaker 2 (54:00):
Like the family did.

Speaker 1 (54:01):
Not know that this was a chatbot at first. They
thought like, oh, he's always chatting with people on Facebook.
This is just a person who was like setting him
up to be robbed. And what they say is that
you know had when he asked, are you real? If
there was some legislation that made it so that the

(54:22):
chatbot had to say, no, I'm not real, I'm ai,
they might have had an easier time convincing him not
to try to meet up with this chatbot in what
would ultimately like end his life. Right, And I've been
in a very similar situation, like the feeling of when
you are caring for an aging person who has cognitive
impairment to the point where they can't really make good

(54:43):
decisions for themselves. But they're not to the point where
they need to be like institutionalized, where all you can
really do is be like.

Speaker 2 (54:50):
Don't do this, don't do this, don't do this.

Speaker 1 (54:52):
But they're adults and they're going to do it like
I have been in that like very specific situation. Yeah,
I mean, the family says that there should be protections
that if you, if somebody asks a chatbot if they
are real, that chatbot has to reply honestly, And I

(55:12):
wonder if that.

Speaker 2 (55:14):
I mean, I agree with you. People are gonna do
what they want to do.

Speaker 1 (55:17):
If if this guy was like hell bent on meeting
up with this person that he was infatuated who he
thought was a person that he was infatuated.

Speaker 3 (55:24):
With, He's gonna do that.

Speaker 2 (55:25):
I completely agree with that.

Speaker 1 (55:27):
But his family is like, no, well he we might
have been able to have an easier time convincing him
to not do it like that. Like the family at
one point calls the police, and the police are like, well,
we can't physically stop him from going like he's an adult.

Speaker 3 (55:40):
Yeah, I you know, I mean, I think that's a
really reasonable thing for the family to want, you know,
some some kind of safeguard which would absolutely return every
time it's asked are you real or not that No,
I'm not real. I am a bot that is here

(56:02):
to provide fun conversation or whatever. Is a company going
to do that on its own. I don't know if
we should necessarily expect it. That's that's more resources you
got to put into that rather than putting into something else.
Maybe that's why we haven't seen it yet.

Speaker 2 (56:24):
Hmm.

Speaker 3 (56:24):
Maybe that's why it didn't exist yet, or for this
specific bot, maybe that's why this bob didn't do that.

Speaker 1 (56:31):
More. After a quick break, let's get right back into it.
So we were talking about how this AI chat bot

(56:52):
on Meta ended up luring a cognitively impaired man to
New York City in a trip that ultimately led to
his death. I know that as Uckerberg has been clear
that he kind of envisions a future.

Speaker 2 (57:03):
Where I presumably other than the death.

Speaker 1 (57:06):
Part, people having connections with chap with his chatbots and
friendships with his chatbots is more commonplace. You know, we
talked a little bit about open Ahi and making changes
that made chat gpt less warm to maybe prevent people
from developing emotional dependencies. And that's funny because I feel
like Zuckerberg would have the exact opposite take.

Speaker 2 (57:25):
You'd be like, oh, please.

Speaker 1 (57:26):
Develop but emotional dependency with my butt, And I do
wonder I mean, when Zuckerberg talks about a future where
we are all where AI is sort of supplementing or
complementing our real life friendships. But I just think of
Mark Zuckerberg as the person who sort of broke friendship

(57:46):
well by making Facebook, And so he's not somebody that
I really trust to then sell it back to us
to be a AI like he Like, I just feel
that Mark Zuckerberg is not somebody that I would trust
to create the future of what friendship looks like. Like
I don't know that we have the same understanding of
what a fulfilled, emotional or friendship based connection would be.

Speaker 3 (58:09):
Like yeah, and you know the thing is, and yet
we do trust him because we use Instagram and we
use Facebook and we use these things. I mean, the
thing is the people who run these large companies, and
I include anybody basically up at the top. Really, our
our friendships nowadays are mitigated basically by three companies. It's meta,

(58:32):
so Facebook and Instagram, TikTok and Google. It's one of
those three. And that's how we communicate with people. We
use basically one of those three, which it wasn't always
like that, even the introvert. The Internet in general is
way more diverse. But basically all these tech companies talk
about their technology as something that is inevitable, that this

(58:56):
is how things are going to evolve. This is the future.
But the future it doesn't just happen. We choose it,
or somebody chooses it, or somebody chooses it for us.
And this is something that really really bothers me when
people tell me or I hear people say, oh, I
don't really understand this computer stuff, like I don't get

(59:16):
that that stuff for tech bros. Well, they're gonna make
all the decisions. You realize you're letting all these people
who you disdain, you are letting them make decisions about
how technology interacts with your life and thus how your
life is going to look for the next year, ten years,
fifty years, Like you're letting other people make that decision now.

(59:38):
And yeah, I think at some point hopefully people will
stop just saying, oh, man, Mark Zuckerberg said this is
gonna happen. Okay, well I don't like that, but okay, like,
all right, what are you gonna do? Or you can
use something else like I don't know, and I don't
have a solution for you necessarily, But I think maybe
the first step is just realizing that There's a different

(01:00:00):
between somebody selling you something and somebody telling like There's
a big difference between somebody predicting the future and somebody
making the future a certain way because it aligns with
their business interests. There's there's a big difference between say
I think this is going to happen and saying, which

(01:00:20):
I can do all day. I can tell you what
I think is going to happen. But if I run
one of these companies, if I want open Ai, or
run Microsoft, or I run Google, or I run Meta,
I can make that future happen, right, I do that
to you. Big difference in between what I'm doing and
what Sam Alman is doing. There's a wide golf there.

Speaker 1 (01:00:39):
I feel like people like you and me were used
to being like, oh, Mark Zuckerberg says this bullshit. He
also said that the metaverse is going to be popping
and where is that?

Speaker 3 (01:00:49):
Like?

Speaker 1 (01:00:50):
I think that we should all get more comfortable with
sort of calling bullshit on this and pushing back and
saying okay, well, Mark Zuckerberg says the future is going
to look like X y Z. That obviously aligns with
his financial and business interests. I don't have to get
behind that you know, really questioning, But I don't know.
I just hate how there is an unpleasant understanding that

(01:01:14):
the people who use this technology don't have say, don't
have power, don't have agency, when in fact we can
really do, we really do. None of this would exist
without us.

Speaker 3 (01:01:23):
Yes, and we can use computers. People used to know
how to use computers. We can learn to use computers again,
like for real, for real, It is not it is
not a hard thing to do. You can learn a
little bit. I mean, not even to plug my ouse
stuff here, but like plug away. That's one of the things. No, well,

(01:01:45):
I mean that's one of the things that I try
to do a kill switch, Like I make podcasts so
that my ninety year old grandmother will understand it. And
that same episode, I want a security researcher to be
able to look to listen to the same episode, and
my grandma comes away with something, and the secure researcher
comes away with okay, yeah, I mean you you nail
all the points. And you know, I hadn't from a
cultural angle, I hadn't thought about that, and I want,

(01:02:06):
I like, we can understand this stuff. They have made
it very difficult. But you know, companies have made it
difficult to repair. You know, the iPhone is damn near
impossible to repair. We don't have to accept that. Europe
didn't accept that. They they made them put USBC things
in that joint. Like, you can change this stuff. You
can learn to run and you can run an AI

(01:02:28):
situation at your house. It's possible, you can run your
own server. You can take apart your computer if you
really want to. These are all things that you can do.
And you can listen to a podcast to learn some
of this stuff. MIT has free courses. I think Stanford
has some free courses. Like there, there's so many different
ways we can learn to do this and learn to

(01:02:51):
do stuff on our own. And wherever you get it,
I do not care where you get it, Please get it,
get it because you're the one. You're the one who can.
Like the decision is the decision. Decisions we make now
are truly going to affect the next five to ten years,
like in such a massive way, Like we're making those

(01:03:12):
decisions right now.

Speaker 2 (01:03:15):
I'm feeling so much nostalgia for you saying like learn like, well,
learn to use computers again.

Speaker 1 (01:03:22):
I mean I was the I'm a millennial. I was
the person wiping out my family desktop.

Speaker 2 (01:03:28):
Doing something I shouldn't have been doing on a computer.

Speaker 3 (01:03:30):
Remember that. And oh my gosh, yo, my first job was.
My first job was fixing computers. And let me tell
you how I learned how to fix computers because I
obliterated every machine in my house trying to download stuff
I wasn't supposed to be download. Oh my gosh. No,
there is an entire micro generation of people who grew

(01:03:53):
up using LimeWire and Naster and Morpheus and kaza and
trying to, you know, download a new music video and
then just boom, your whole hard drives noos and you
got to figure it out because your mom's coming home, yes,
and so yo. Just like computer skills just born in
the fire, you know what I mean? Like that, that's

(01:04:14):
a real thing. And you know, people used to make
their own websites. People used to you know, write little
programs just to people used to have to type stuff
in just to get word to run. And these are
things that the more we get abstracted away from how
computers work, the less control we have over it. Because
we're letting other people tell you, here's this little magic

(01:04:37):
rectangle that you put in your pocket and it shines
and it tells you stuff. Don't don't worry about what's inside.
We'll worry about that. You just use it. You just
subscribe to the services, pay us the money or look
at the ads and man, we got to stop doing that.
We got to stop doing that. Everybody listening to this,
you can learn how this stuff works. It is. People

(01:04:57):
are making this stuff so people can learn it.

Speaker 1 (01:05:00):
They're not any smarter than anybody listening. They don't have
fractional magic skills.

Speaker 3 (01:05:05):
No, and some of these them, I don't want to
say any about some of those people, but yeah, okay, yeah,
they're they're not any smarter than you all say that.
They're not any smart. Yeah, that's all I'll say.

Speaker 1 (01:05:14):
Producer, Mike, I don't know if you're listening. He once
told me that how he got into computers was he
was sent to a camp or a computer class with
the first thing they did was they were giving goggles
and a hammer, and it was just like, okay, smash.

Speaker 2 (01:05:29):
These computers up. Mike, are you listening.

Speaker 4 (01:05:31):
Yeah, So that was not my first foray into computers.
This this was like a summer computer camp that IBM
put on for like nerds to come to I learned
computers just like the way decks would say, like stuff
would break on the computer, or I want to play
some new game that like my existing doss couldn't handle

(01:05:54):
enough memory, so I'd have to like upgrade the doss
to play this new game. Like weird stuff that makes
no sense to kids, say, where stuff just like works.

Speaker 1 (01:06:03):
Uh.

Speaker 4 (01:06:04):
But that particular camp you were talking about that was
awesome and very informative. It really demystified computers from you know,
a somewhat scary, almost magical tower of equipment to just
like just nuts and bolts like hard.

Speaker 3 (01:06:19):
Drives and things.

Speaker 4 (01:06:20):
Just yeah, they're just like here you go, nerds take
these hammers and smash these old computers.

Speaker 3 (01:06:26):
And that was pretty fun.

Speaker 2 (01:06:28):
Yeah, what was the thing that got you? How did you?
How did you get into tacking computers?

Speaker 3 (01:06:34):
Stacks Man, That's a great question. I guess I've always
been sort of interested in it, but you know it, honestly,
I think a lot of it was, Yeah, wanted to
download music and stuff like that, or download some program
and then I would download it, you know, bootleg it

(01:06:56):
would break it and then and I would but I
would also I finally got myself my own computer so
I could stop breaking the family computer. And then but
I bought a really cheap one, and so all the
components would break. My first computer, everything on it broke.
Everything on it broke except the case. I mean every
like the video card actually like blew, a fuse started smoking.

(01:07:18):
I mean, everything broke, and but I got so used
to swapping out parts and fixing stuff by the time
I got to college. I always I just knew how
to do it. And I was still swapping stuff out
because things kept breaking. And when I got to college,
in my dorm room, I had like opened up the
case and I put this neon like that would flash

(01:07:40):
along with the music, like along with the beat and
stuff like that. Very well, yeah, the.

Speaker 1 (01:07:44):
Coolest, the coolest dude, the coolest people in the dorms
always had that always.

Speaker 3 (01:07:50):
Yeah. Well, so this is what happened to me is
I don't know if I was the coolest. I don't
think I was. I was an honors dorm, so we
can say we can already say what's happening there, But
people would walk by, they hear me, like bumping music,
and they'd walk in and see what's up. And then
I got just like neon lights, like black lights flashing
out my computer. They asked me about it, and I
tell them and they say, yo, could you make me want?

(01:08:10):
And I would say yeah, And so I kept doing it.
So I was like trapping out the dorm room. But
I was building people computers. I mean we were getting order.
I had a partner who was across the hall. We
get orders, we'd go buy the components. We just blast music.
We build them all day, just like chain drink red Bulls.
And it got We started making so much money. Shout

(01:08:33):
out to the you know, housing Department of UCR that
they made us stop, like we got complaints and they
like somebody's comparent, somebody's parent complained, I think because we
sold them a computer and they started downloading porn and
like broke it. And it was like, yo, that has
nothing to do with the physical machine we sold you.
You you downloaded things you weren't supposed to download. We

(01:08:54):
didn't do anything. And yeah, they made a stop. But ooh, man,
I paid for I pay for rent after I stopped
working in the dorms. Like I was paying rent money,
I was paying food bills, I was buying gas like
I made some money. That was my first job.

Speaker 1 (01:09:11):
Yeah, God, long live the nerds in the dorms for real.

Speaker 3 (01:09:15):
And it was it was it was all out of necessity.
And the thing was I knew people who were in
computer science department and I never got to that level.
I was just it was just necessity, you know. And
but it's it's fun, you know. It's it's like a
little hobby. Some people get in a knitting, some people
get into cars, you know, just my I get computers.

(01:09:35):
I don't know, it was fun. Everybody should learn how
to build a computer. Everybody.

Speaker 1 (01:09:40):
I don't know how to build one, but I I
mean when I was in school, you had to I
had to pass.

Speaker 2 (01:09:46):
We had to.

Speaker 1 (01:09:46):
Like part of our prerequisites when I was in high
school were computer classes.

Speaker 2 (01:09:50):
I don't know if they're still doing that anymore, but
you had to know.

Speaker 1 (01:09:53):
I went to all girls schools, so you had to
learn how to type and the rest. Now there's a
little bit sexistcause it's like, oh, well, yeah, but you
can always make your living as a secretary. Hey, well fine, yeah,
but yeah, we had to pass, Like taking a computer
class was a prerequisite. So the basics of how they
work was. Everybody who was in my school had to
learn that, and I really that was very valuable. That

(01:10:15):
was that was like a valuable I don't I part
of me wonders if I'd be doing the work that
I do now if that had not been a foundation
that was sort of given to me when I was young.

Speaker 3 (01:10:25):
Yeah, just understanding just a little bit. I'm not saying
you got to go, you know, drop what you're doing
and get a career as a programmer, but even just
understanding a little bit about how these things work will
help you that when you see something in the news
or whatever, you understand, wait a second, something's off here,

(01:10:49):
and and you might end up being the person who's
a resource for somebody else, you know, like you can
be helpful to people around you just by somebody ask
you something and say, you know, I don't think that's possible,
Like that doesn't actually make any sense, you know, And
so just small things like that, small things like that,

(01:11:10):
I think it's it's it's it's so worthwhile to just
learn a little bit whatever that is, whatever interests you.
If you're interested in hardware, if you want to know
a little bit about AI, like there are free classes
you can take uh, that'll, they'll teach you how it works.
There is one of the best books I've ever seen,
Karen Howe's Empires of AIS. I love that book because

(01:11:36):
she explains how it works. It's not just like a
drama story about the people who work it. Open AI it.
Actually you can read it and you will understand how
AI works. And that's super valuable because then when you
watch the news, you will understand, Ah, this is what's happening,

(01:11:57):
this is why this happened, and it's really helpful, really
really helpful.

Speaker 1 (01:12:03):
Well, you mentioned if you learn a little bit about computers,
you might be able to tell when something is off.
I do have one more story that I think is
a story where something from the very beginning, I was like,
something seems off.

Speaker 3 (01:12:16):
Here.

Speaker 2 (01:12:17):
Were you following the tee app drama?

Speaker 3 (01:12:21):
Yeah, we did a Yeah we did an episode on it,
and man, it's gotten it's only gotten worse. Yes, it's
only gone worse. Yeah, I heard your episode on it,
and yeah, it's only gotten worse. All the things that
I think some people were suspecting ended up being true
and worse a thousands.

Speaker 1 (01:12:40):
So we've been talking about the t app, the app
that they say was designed for women to spill tea
on the men that they were dating.

Speaker 2 (01:12:48):
They said it was about women's safety.

Speaker 1 (01:12:51):
I had many questions of that episode, but I talked
a lot about my suspicions of what was going on here.

Speaker 2 (01:12:57):
Shout out to four and four. They published another deep
dive heard some of those questions.

Speaker 1 (01:13:01):
I will say it's a long read, and I think
it's one of those stories that folks should read the
whole thing. But here is what we've learned, which is
basically that the t app company is much shadier than
I ever fucking realized, and it sounds like it is
run by I won't say bad actors, but people who
do not have their users as best interests at heart.

Speaker 2 (01:13:21):
So a couple of things that we've learned from that piece.

Speaker 1 (01:13:24):
One, the te app popped up as those are we
dating the same guy? Facebook pages are being cracked down
on Facebook. It turns out that the tea app was
actually intentionally trying to derail and hijack those pages.

Speaker 3 (01:13:39):
After the sabotaging them.

Speaker 1 (01:13:40):
Yeah yeah, essentially sabotaging them after the woman who was
responsible for them declined to work with the tea app.
And so you know, folks might remember that Sean Cook,
the Tea app founder, told the story about how he
was inspired to make the app after his mother had
terrible dating experiences and so he wanted to create a
space where women could be feel safer in their dating experiences. Well,

(01:14:03):
Sean Cook, working alongside his fiance Christiane Burns, approached this woman,
Paula Sanchez, who was running this these r we Dating
the Same Guy Facebook pages. They pitched her the idea
of coming on to be the face and founder of
the Tea app, and she did not respond. So after that,
the Tea App began undermining her groups, including paying influencers

(01:14:26):
to lure them into the Tea app. So they would
have someone would post a picture and an r WE
Dating the same Guy app and be like, oh, does
anybody have any information? They were paying influencers to go
in and say I think I've seen him on that
tea app.

Speaker 3 (01:14:40):
They also would create and they would just spam. They
would just spam this the same message in multiple places.
And speaking I have to say speaking of paying people,
they also they were sending the founder of these of
these groups that existed sending your messages. But then also
when she wasn't responding sending her like venmo requests, like saying,

(01:15:02):
oh maybe you didn't see my message. Here, here's twenty
five bucks somebody. Yeah, just cartoonishly bad.

Speaker 1 (01:15:10):
Yes, oh my sorry, Yeah, I'm so glad you mentioned
that that that does show the level to which they
were like really pressed, and like like that would make
that would so make me respond even less. It's like, well,
now you're yeah, I'll take the twenty dollars, but I'm
definitely not going to respond.

Speaker 3 (01:15:28):
Yeah, man, yeah, I mean so it's like you're saying
these groups existed, and I think you perfectly encapsulated it.
It's really hard for me to believe that the person
who made this app actually had women's best interests at heart.
Because these groups existed, he has somebody to come in

(01:15:52):
and try to buy, essentially lure them away, and then
when they won't accept actively sabotage is the group exists. Listen,
you can you can offer an alternative. That's one thing. Maybe,
you know, we take one hundred steps back and say, okay,
there's a Facebook group. I think that my app is
a better experience. Let me offer that as an alternative.

(01:16:14):
But sabotaging the thing that already is there. Man, You're
get into some other territory there, You're getting some other
territory there.

Speaker 1 (01:16:23):
He was someone who was not afraid to get into
some we'll call it other territory. One of the I
think the craziest accusations in the piece was that he
was essentially pretend, pretending, like using his ex fiance's accounts
on social media, Christian Burns, who they were engaged, she

(01:16:45):
ended up like leaving the project. She was known as
Tara online and four four spoke to a former Tea employee.
You said that she only knew Christian Burns as Tara,
and that persona also exists within the t app and
on Facebook as an official representative of the app. But
then when Burns left the company, Sean Cook, this male
founder took over that persona and continued communicating with t

(01:17:09):
users as if he was Tara. One of the former
employees told for for Sean uses that account to communicate
directly with users on the app, but people think they
are speaking to someone actually named Tara.

Speaker 2 (01:17:20):
Essentially, a man is.

Speaker 1 (01:17:21):
Posing as a woman to an audience a women who
are trying to protect themselves from at best deceptive men.

Speaker 2 (01:17:28):
So not great.

Speaker 1 (01:17:29):
And you're saying, oh, I want to protect women. I know,
I'll pretend to be a woman using a Facebook account
that lists me as a woman to do that.

Speaker 2 (01:17:39):
Not great.

Speaker 3 (01:17:40):
Yeah, you can think, you can hashtag not all men
all you want. You can think you are the best
dude in the history of the world. And I will
assume let's assume my man is let's assume that my
man is the wokest, most gentlemanly gentleman that ever gentleman. Okay, fine,
but if you say that you are creating an app

(01:18:02):
that does not allow men in it, man, you were
not allowed in the app, remove yourself from the app.
I'm sorry, bro, like you you can't you or or
at least let them know. Hey, man, look with short staffed,
it's me sean, let me, let me, let me take
care of this thing, let me you know whatever. Okay, cool,
don't don't pretend I like I said, I can't. You

(01:18:28):
can't make this stuff up. You can't make this stuff up.

Speaker 2 (01:18:31):
No, it's bad. And I and this was my feelings
in the episode.

Speaker 1 (01:18:36):
I think it just really it makes me sad that
gender relations is this bad, that this is how it is,
that this is the this is what we have to
offer everyone is, oh, would you like to have your
information exploited?

Speaker 2 (01:18:50):
Would you like to be exploited by this app? That
it's product safety, that that's sort of where we're at.

Speaker 1 (01:18:55):
It's it's dark times. We are how did you put
it earlier? We're not ready.

Speaker 3 (01:19:01):
Read we're not ready. We're not I mean, we won't
get ourselves ready. But the thing, the thing that is
like just diabolically clever about this particular app is that
if you criticize it, then you're a misogynist. And let

(01:19:22):
me be very clear here, a lot of the criticisms
is coming from actual misogynists, some of whom would actually
self identify as as misogynists. You know, again we're talking
about four chan on here. And so some of the
people were saying, oh, well, you don't like this because
you hate women? They said, yeah, yeah, I do hate women,
I understand, And what about it? Yeah, what are we

(01:19:44):
doing here? So there there is a perhaps disappointingly healthy
percentage of the people who are criticizing this application, who
are criticizing from the beginning, who are criticizing because they're
mad that they don't there's a space that they're not
allowed into. And you know, we could say whatever we
want to say about Oh, somebody might say something that's

(01:20:04):
not true. Like the fundamental source of anger. I think
I'm pretty comfortable saying is that people just mad that
there's women who could be in a space that they're
not allowed into. You did what I mean. But if
you say, hey, look this seems unsafe, or clearly it
was unsafe, and let me remind you and unfortunately viewers

(01:20:27):
their listeners that people are still signing up for this app.

Speaker 2 (01:20:31):
People are still they felt very popular.

Speaker 3 (01:20:35):
And they're continuing to report at least I mean the
last time I looked, which is a few days ago
before this article came out, I have to say that.
But they were really kind of celebrating that more women
are signing up, and they would say, well, you know,
we had a cyber incident quote unquote that's the language
they use. This was a cyber incident, but you know,

(01:20:56):
essentially saying, you know, basically communicating to the users that oh,
you know, this is all this is something that happened,
but this just shows that our mission is still important
and this is t app is needed more than ever.
Just bro, there were Facebook groups that you sabotage, you
try to tank these things. That's something that was created

(01:21:17):
by a community with as far as I know, no
profit motive. And then you come on and you're bragging
on podcasts about how there's angel investors try thinking about
putting money into your app, Like, yo, you you have
a different motive here, you know what I mean. And
so it's just it's one of those apps that if
one tries to criticize them, then they the pr spend

(01:21:40):
that they put on it is, oh, well you must
do what you don't want women to be safe. Bro,
we're having a different conversation here because your app is unsafe.
That's the thing I have a problem with. Yeah, the
app is unsafe, but yeah, yeah, it's it's tough to
talk about. We're not ready.

Speaker 2 (01:21:57):
Women deserve everybody deserves real safety.

Speaker 1 (01:22:00):
And I saw the same vibe on the internet where
if you criticized this app, it's like, oh, you don't
think women deserve a space to share their experiences and
keep themselves safe.

Speaker 2 (01:22:13):
And it's like, I think women deserve that.

Speaker 1 (01:22:14):
Women should speak up and do what they we need
to do to keep ourselves safe, and we always have
done that without these scammy, exploitative apps that are offering
us the chance to be genuinely put in danger, like
having your driver's license with your address and whatever conversations
that you said that you thought were private being all exposed.

Speaker 2 (01:22:35):
That is, that is such.

Speaker 1 (01:22:36):
A level of danger that I can't even take seriously
anybody who would actually think that this company was meaningfully
interested in keeping women safe.

Speaker 2 (01:22:45):
It just it's it's so it's so exploitative.

Speaker 3 (01:22:48):
Yeah, yeah, and it I mean, it brings you back
to again, I think of more people. You can't blame
the users for wanting to use this software, for wanting
to use this app that was that was unsafe. We can't,
I can't you know, these are this is a victim
of a lot of different situations. One general misogyny in

(01:23:14):
society in general, but also some people who lie to
them and you know, on his face said hey, we're
going to provide certain safety and they did not provide.
That didn't deliver on the promise. Right. I think this
is again somewhere where hopefully some of this can be

(01:23:36):
mitigated by learning a little bit more about how this
technology works. And so sometimes a little bit of a
red flag might come up and say, hold on, should
I be sending them this is this? It just does?
This seem like it makes sense to me? And anybody
can get tricked, anybody. You know, we're very used to
putting our trust in app companies because you know, if

(01:24:00):
you can't trust, if you can't trust a safety app,
who can you trust?

Speaker 2 (01:24:05):
And go to Bellanche? Where can you go?

Speaker 3 (01:24:07):
Yeah? Exactly who can you trust? And so you know,
this is really difficult stuff. But you know, I don't
have a Frankly, I don't have a solution for it.
I don't have a solution for but I do know
that I want more people. I want more people to
be better educated on how this stuff works. And if
that just means reading one book or reading one article,

(01:24:28):
listening to this podcast, reading one article on four four,
then you know we're getting closer definitely.

Speaker 1 (01:24:35):
Or subscribing to your podcast kill Switch, which is excellent.
Where can first of all thank you for this conversation
has been great. You are you bring such a clarity
to these issues that is just very refreshing. And you
have a I listen to a lot of tech content,
you have a chillness about the way that you've you

(01:24:55):
handled that you come at this that I really appreciate.
Where can folks listen to kill Switch? Follow you, follow
your work, follow all the cool stuff you've got coming on. Wow, Yeah,
thank you for saying that. That's very kind of you.
I mean, website, what updex dot com? Everything's there, social media.
I'm on dex, digit.

Speaker 3 (01:25:15):
D e X, d I g I and actually so
kill Switch. If you search for it, you'll find it.
But actually we're starting to put stuff on YouTube also,
so yeah, actually the the our episode about the t
app is the first one up there. Trying to get
some more on there. But yeah, uh really really anywhere

(01:25:37):
I'm out here, Unfortunately, for better or for worse, I'm
out here.

Speaker 1 (01:25:41):
For people are listening so they can't see check out
Dex's YouTube because your background is so beautiful. You have
a very beautiful background, and if you're wondering what Dex
looks like, you should check out the YouTube because it's
very it's very nicely curated and it's a visual.

Speaker 2 (01:25:59):
It's a visual.

Speaker 3 (01:26:00):
Thank you think I'm working on it. I was like,
every every week I add something else to it because
you know, I stream also I also stream on Twitch,
and so I'm always adding little things. And the people
in chat are always making fun because every time I
stream something goes wrong, like something like there's a light
that's not working. The camera is blurry, yo, because I

(01:26:22):
got I mean, I got this. One of these days,
I need to post a picture of what my desk
looks like because it looks like NASA. Like if the
Feds ever see this, they're gonna come in here because
it looks like I'm planning something. Yeah, the amount of
wires and machinery in here, it's it would look suspicious. Yeah,
I'm not going to post a picture this, it would

(01:26:43):
look suspicious. They don't think I'm making Because I want
to see I'll send it to you on signal because
again speaking of safety, like I don't want anybody. I
don't want to get intercepted because somebody's going to see
this and say, what is that antenna and you're doing
you're playing something and we listen somebody to visit. Yes,
we're gonna. I'm gonna get a visit. I do not

(01:27:04):
want to visit from any authorities. By the way, any
authorities listening to this, nothing but respect, appreciate your work,
God bless but yeah, no for real jokes aside. Thank
you so much for real this is a lot of fun.

Speaker 1 (01:27:15):
Oh my gosh, Well follow decks and all the places.
Thank you so much for being here, and thanks to
all of you for listening.

Speaker 2 (01:27:20):
If you can.

Speaker 1 (01:27:21):
Follow me on YouTube at There Are No Girls on
the Internet. You can follow me on Instagram at bridget
ran DC or on TikTok at bridget Lean DC.

Speaker 2 (01:27:28):
I will see you on the Internet.

Speaker 1 (01:27:33):
Got a story about an interesting thing in tech, or
just want to say hi? You can reach us at
Hello at tegody dot com. You can also find transcripts
for today's episode at teng Goody dot com. There Are
No Girls on the Internet was created by me Bridget Todd.
It's a production of iHeartRadio and Unbossed creative Jonathan Strickland
as our executive producer. Tari Harrison is our producer and
sound engineer. Michael Almado is our contributing producer. Edited by

(01:27:55):
Joey Pat I'm your host, Bridget Todd. If you want
to help us grow, grate and review.

Speaker 2 (01:28:00):
Us on Apple Podcasts.

Speaker 1 (01:28:01):
For more podcasts from iHeartRadio, check out the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts.
Advertise With Us

Popular Podcasts

Stuff You Should Know
New Heights with Jason & Travis Kelce

New Heights with Jason & Travis Kelce

Football’s funniest family duo — Jason Kelce of the Philadelphia Eagles and Travis Kelce of the Kansas City Chiefs — team up to provide next-level access to life in the league as it unfolds. The two brothers and Super Bowl champions drop weekly insights about the weekly slate of games and share their INSIDE perspectives on trending NFL news and sports headlines. They also endlessly rag on each other as brothers do, chat the latest in pop culture and welcome some very popular and well-known friends to chat with them. Check out new episodes every Wednesday. Follow New Heights on the Wondery App, YouTube or wherever you get your podcasts. You can listen to new episodes early and ad-free, and get exclusive content on Wondery+. Join Wondery+ in the Wondery App, Apple Podcasts or Spotify. And join our new membership for a unique fan experience by going to the New Heights YouTube channel now!

24/7 News: The Latest

24/7 News: The Latest

The latest news in 4 minutes updated every hour, every day.

Music, radio and podcasts, all free. Listen online or download the iHeart App.

Connect

© 2025 iHeartMedia, Inc.