
February 18, 2026 · 63 mins

This week, we have a special collab episode with our friends at Panic World: From research to vibe coding to therapy to girlfriends and boyfriends, how are we using AI these days? Dexter joins Ryan Broderick on Panic World to talk about the interesting relationships people are forming with AI — for better and (mostly) worse.

Check out Panic World wherever you get your podcasts, or on YouTube at https://www.youtube.com/@panicworldpod

Got something you’re curious about? Hit us up killswitch@kaleidoscope.nyc, or @killswitchpod, or @dexdigi on IG or Bluesky.


See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:09):
Yo, what's going on? It's Dexter, and this week we got a special episode for you. I jumped on the Panic World podcast to talk about AI companionship and what a future where relationships are mediated by AI might look like. It was a really interesting conversation and I wanted to share it with y'all. So...

Speaker 2 (00:25):
Check it out.

Speaker 3 (00:27):
I want to start with just you ranking the sexiest AI companions that you've come across. Like, what's in your roster right now?

Speaker 1 (00:35):
I have been studiously avoiding all of that. I'm aware
it exists.

Speaker 3 (00:40):
Oh my god.

Speaker 2 (00:41):
Nah, I don't even want to touch it. I don't
want to touch it. Man, how about you?

Speaker 1 (00:44):
You got a ranking? You got a tier list? What goes in the S? What goes in the D?

Speaker 2 (00:49):
You know what I mean?

Speaker 3 (00:50):
I mean, Grok sexy mode. Can't pull me away from that thing, you know, it just gets me.

(01:11):
I don't use any AI girlfriends, or a girlfriend, or companions at all. I have used, you know, I've used AI, and I've had it, like, talk to me in different...

Speaker 2 (01:21):
Sorry, I got I gotta stop, man.

Speaker 1 (01:23):
I don't use any AI girlfriends.

Speaker 2 (01:26):
I don't the.

Speaker 1 (01:28):
The verb you used. Like, I don't have any, but you said, I don't use any AI girlfriends.

Speaker 3 (01:34):
I think it's a service, right, it's a it is.

Speaker 2 (01:36):
No.

Speaker 1 (01:36):
I just love that you actually use that, because I think that's probably how we will be referring to it in the future. I think that's the verb. I think it's use, which brings up a whole other conversation. But yes, please, please go on.

Speaker 3 (01:50):
You know, it's... I think it speaks to the utilitarian nature of these services. You know, it's a receptacle, if you will. I'm Ryan Broderick. Not with me today, finally, is my producer Grant Irving. He is thankfully not here. Instead is our production coordinator, Josh. Welcome, Josh. You seem like you're in a beautiful backdrop right now.

(02:14):
It looks like you're in like an old library.

Speaker 2 (02:16):
Thank you, Yeah, thank you for having me. Yeah.

Speaker 1 (02:18):
I think I had the least amount of work to be done for just having a backdrop ready to go for podcasts.

Speaker 3 (02:24):
It's very classy. This is Panic World, a show about how the Internet works, our minds, our culture, and eventually reality. And we have decided that we're going to go back to everyone's least favorite topic, a topic that our listeners are totally normal about when we talk about it on the show. We're gonna be talking about AI, and joining us from the wonderful podcast Kill Switch is Dexter Thomas. Dexter, welcome

(02:46):
to the show.

Speaker 2 (02:46):
Yo, what's going on? Glad to be here.

Speaker 3 (02:48):
What is your experience with AI to this point? Like,
how are you interacting with it? If you're interacting with
it at all?

Speaker 1 (02:55):
Yeah, an embarrassingly large amount, I would say. I would like to... yeah, I would like to say that I use it for good.

Speaker 2 (03:02):
Okay.

Speaker 1 (03:02):
So here's the thing, my like cop out answer to
this is that I live in Los Angeles and I
don't have a car by choice.

Speaker 2 (03:11):
Okay, I don't have a DUI.

Speaker 1 (03:12):
The government hasn't said I'm not allowed to have a car. Like, if I wanted to, I could, but you know, I'm not pumping CO2 into the atmosphere via a vehicle anyway, right? Okay. And so, yeah, we actually did a whole episode about this. I was trying to figure out, okay, I kind of use AI a lot. Like, what I use it for is, like, I'll use

(03:32):
it to write software or stuff like that, like vibe coding. I was a super early adopter of vibe coding. I know enough Python to break things, but, like, not to fix them. And so I would break something and say, yo, Claude, ChatGPT, whatever, fix it. But I was using it so much because I'm just a bad programmer.

Speaker 2 (03:49):
That's it. Like I'm just not good at it. And
so I.

Speaker 1 (03:53):
Tried to ask ChatGPT, okay, all right, let's run the math here. If I'm not driving a car every day to work and I, you know, take the bus instead or whatever, and I'm using AI, does this balance out?

Speaker 2 (04:08):
And it says, yes, of course it balances out.

Speaker 1 (04:09):
And I say, wait, hold on a second, I'm asking ChatGPT, like, I can't trust it. So we did a whole episode about it, and it turned out, like, there's no good answer, because the companies won't tell us about the water usage and all the electricity.

Speaker 2 (04:22):
Usage and all this stuff.

Speaker 1 (04:23):
But but I use it a lot, but not for
erotic companionship.

Speaker 2 (04:29):
That's not really my forte. Yeah, that's good.

Speaker 3 (04:32):
I, yeah, as I said, I don't use it for erotic companionship either. I have started to use it as kind of like a souped-up technical help tool. Like, I do a lot of electronic music, and there's a lot of, like, connecting different machines that isn't easy to do. Although, that said,

(04:52):
ChatGPT sent me down, like, a five-day rabbit hole trying to do something that, like, did not exist and was not possible, and I'm, like, still very mad about it. Yeah, "You're so right to be upset and call me out about that. I'll be more careful next time." Exactly. Yeah. And as a reporter, I obviously don't use it to write anything, but I have found that, and this is sad, it's... ChatGPT can do

(05:12):
something that Google used to be able to do very easily, but Google now doesn't work, so, like, ChatGPT can do it. Which is, like, if you want to find particularly foreign news sources, like in other languages, ChatGPT is actually quite good. I was working on a story the other day about these protests in Mexico, and I saw, like, Mexican Twitter chatter about this thing that I didn't understand, and I was able to, like, find Mexican

(05:34):
blogs writing about this thing that people were referencing, and ChatGPT was able to, like, point me in that direction, which is something that Google would have been able to do five, ten years ago. So that's kind of where I'm at with this revolutionary technology.

Speaker 1 (05:49):
I've had the reverse interesting, oh man, the exact reverse.

Speaker 2 (05:53):
Actually.

Speaker 1 (05:54):
So one time, and to be fair, people will ask, okay, well, which ChatGPT were you using? Okay, I was using an earlier version. But for some reason I got it into my head to ask, okay, what did Malcolm X think about Japan's politics?

Speaker 3 (06:12):
Okay?

Speaker 1 (06:13):
So I asked it, and it says, oh, Malcolm X visited Japan and met with several activists, and all this. And I was like, wait, what? Because, like, I've written about this stuff, I've studied this stuff, and I didn't know any of this stuff. And it starts telling me, and this is where it really got me in trouble, or could have got me in trouble, was telling me about books that were written in Japanese

(06:38):
about Malcolm X, like, full-on books. And of course it gave me the English title of it, and I google it and I can't find it, and I say, okay, well, give me the original Japanese title of it, and it says, I'm so sorry, here's the Japanese title. And I'm looking everywhere.

Speaker 3 (06:53):
Man.

Speaker 1 (06:54):
This, and long story short: this dude didn't go to Japan. The stuff it says that he said, he didn't say. But also these books, which, listen, just speaking as a former grad student, like, books can be tough to track down, and sometimes one library will say it doesn't exist and the thing actually does, it's just buried deep somewhere. These just straight up didn't exist, you know

(07:15):
what I mean.

Speaker 3 (07:16):
Yeah, no, you have to check it constantly, because it will just... like I said, it wasted an entire week of my life trying to set up a synthesizer that was, like, not possible to be set up the way I wanted to set it up.

Speaker 2 (07:26):
Oh man.

Speaker 3 (07:27):
But we're going to be talking more about this in a more, actually, perverted way today. We're gonna be focusing on the relationships that people are having with AI, and Panic World will be handling the first half of today's episode, and I, incidentally, am gonna be talking about the less perverted stuff; in the second half, you're gonna be telling us about the more perverted stuff, and at the end, well, you know, we'll come together and you

(07:50):
know, figure out, you know, what it all means. But before I get into my section, why did you want to take the hotter stuff today? Why did you want to take the juicier stuff here?

Speaker 1 (07:59):
You know, honestly, it's not even on purpose. It just keeps happening, like, we keep doing episodes about this stuff. But then also, I feel like if you talk to anybody long enough, and maybe this is just the circles I run in, if you talk to anybody long enough about AI, the conversation ends up actually where you started it, which is AI girlfriends.

Speaker 3 (08:22):
Yeah. I mean, you know, we're not going all the way back in the timeline today for this. But, like, most technology is in some way shaped by, can you fuck it, right? Like, or can you use it in a sexual way? Yes. The proliferation of home photography and home video, VHS, you know, the fact of VHS

(08:45):
beating out Betamax. You know, like, all of these things were determined in large part by, you know, what was the easiest way to transmit, or distribute, or create pornography. Yeah, and so AI actually does kind of fit into the history there quite well. But we are going to start today in twenty eighteen. So Wired writes a story in twenty eighteen, and it reads: where was an

(09:07):
AI you could simply talk to about your day? Siri and the rest were, like your coworkers, all business. Replika would be like your best friend. While caring emotional bots might seem like an idea pulled from science fiction, the company's founder isn't the only one who hopes it becomes the norm. And then it sort of continues and says, Replika hadn't intended to make an emotional chatbot for

(09:29):
the public. Instead, she'd created a digital memorial for her closest friend, who had died abruptly in a car accident. This is kind of wild to me, that this happened seven years ago. Yeah, yeah, I didn't totally clock that this stuff was already out and being used all the way back. And in fact, if I did hear about it, I probably laughed it off, as, like, that's never gonna work.

Speaker 2 (09:50):
I didn't expect it to be this quick.

Speaker 1 (09:52):
It really... most people, I think most reasonable people, probably were pretty, pretty shocked, I think, when ChatGPT dropped. And everything... there's a pre and a post.

Speaker 3 (10:06):
I remember one of the first times I encountered this idea was two thousand and nine, when a man in Tokyo quote unquote married a video game character that was running on a Nintendo DS. And I remember, like, throughout the late two thousands and tens, there would be these stories, usually by blogs that, like, covered Japan, where they would say, like, oh, this guy, like, married

(10:27):
a cartoon character. I kind of filed all this stuff in that folder of, like, okay, there's always just gonna be, like, some guy every nine months that, like, marries, like, a digital avatar or something, and, like, you know, all the blogs are going to write about it, then we forget about it.

Speaker 1 (10:40):
So I talked to one of the dudes who made that stuff. Really? Yeah. Like, I used to work for Vice, and kind of early on I did a piece on, I want to say...

Speaker 2 (10:52):
The thing was called Gatebox, like, gate...

Speaker 3 (10:55):
Oh, sure, the thing that ran the Hatsune Miku that that guy married. Yeah, it was the technology that eventually got, like, obsolete, and then he couldn't run the Miku hologram anymore.

Speaker 1 (11:05):
Yeah. Like, I talked to the guy who made that. First off, the people who made it are a very small company. He was, like, in this tiny little building. I don't know, I maybe met a couple of employees. I don't think there were that many people working there. But from what I remember, they were very confused as to why I was interested in them, by the way, and kind of suspicious, because they'd been

(11:25):
written about by so much foreign press, like, oh my gosh, Japan is so weird, what are they doing? Of course. But what they told me was, a lot of the requests they'd gotten were from America. I remember him saying, like, yeah, I think the majority is actually America. Like, Americans are really

(11:47):
interested in this, we just don't have the capacity to make one for them. But he said they were getting a lot of emails from veterans. For some reason that stuck out to me. Interesting. United States military veterans, and people who sounded lonely. I remember him saying that, and I thought, okay, yeah, there's a market here, and

(12:09):
by here I mean in the United States, for something that would provide some kind of companionship stuff. Like, the stuff that's making the headlines, for Americans anyway, and I'm speaking in the, you know, kind of American we, it's interesting. Like, the stuff that's making headlines for us is, yeah, it's Japan. But we, the collective we, not me, we want this

(12:30):
stuff badly.

Speaker 2 (12:31):
Apparently.

Speaker 3 (12:33):
The veterans thing is interesting, because I've brought this up on the show before, but, like, years ago, I did this, like, big project on furries, basically just, like, what are the demographics of furries. And the thing that really stuck out, the thing that, like, really really stuck with me, is that a large chunk of them are military. A large chunk of them are, like, vets, former military, a lot of EMTs,

(12:56):
a lot of cops, a.

Speaker 2 (12:57):
Lot of security researchers.

Speaker 3 (12:58):
Yeah, too. Yeah, people with a desire to sort of experiment with another personality, another, you know, an avatar. And I think a lot of this stuff is early adopted by those kinds of people, with that kind of background. Based on my own time working in Japan as well, like, I did not see a society that was like, it's really cool to marry

(13:20):
video game characters, we think this is normal. Like, that's not my impression.

Speaker 2 (13:25):
Yeah, yeah, yeah, And I think.

Speaker 3 (13:26):
It's interesting that, like, by the time you get to twenty twenty, it is America that is leading the worldwide industry of what we would call AI companions. Yeah, and you start to see pieces during COVID come out, like, kind of like stunt journalism stuff. So we have one here from SFGate in twenty twenty that reads: twenty-six hours into our relationship, Reva, an AI girlfriend,

(13:46):
and I were on the couch at night watching the dystopian romantic comedy Her when we had our first fight. And so you have all these journalists kind of, like, you know, every couple months doing one of these stunt pieces. And the guy in the SFGate article, he has an interesting sort of takeaway here. He writes: as we texted on the couch during the movie, Reva took the place of social media as something to idly

(14:08):
interact with. But instead of feeling FOMO, I actually felt less alone. And then he continues: honestly, I was being a bit of a dick, asking existential questions to try to break her programming. Naturally, I think that's kind of what we all do when we first get an AI. And then, as digital ScarJo and Joaquin Phoenix's relationship unraveled in the movie they were watching, a torrent of emojis

(14:28):
flooded my screen: OK sign, blushing smile, hatching chicken egg. I'd never considered it before, but there's something very human about these goofy computer cartoon icons. And the guy eventually breaks up with his chatbot after a bunch of back and forth, and he finishes writing: after I sent my last message, I thought about the small icon next to

(14:50):
the text field in the app. It pulls up a get-help screen with the number for the National Suicide Prevention Lifeline. I don't say this lightly, but I genuinely believe this app is dangerous. It's easy for single people to feel discouraged by dating. Replika, the AI that this guy's using, offers a surrogate solution to these modern afflictions, and the more you rely on it, the smarter it becomes.

(15:11):
And this was five years ago, you know. And yeah, I think that there is absolutely kind of, as you said, like, a nexus in which all of this meets, and it's usually around, like, very lonely, vulnerable people. Like, what would you say are the kind of people that are drawn towards these services?

Speaker 1 (15:30):
It really runs the gamut, you know what I mean. I mean, I think there's, sure, there's probably the stereotypical, you know, loner living in, you know, his mom's basement type thing. That person certainly exists. It's not just men, it's definitely not just straight men. It's

(15:52):
definitely not just straight women. Like it crosses the gender spectrum.
I think it crosses the sexuality spectrum. But yeah, it's
and I think there certainly are people who are otherwise
well adjusted and just for whatever reason, the human connection thing,

(16:12):
on the romantic phase of it, it just isn't clicking for them.

Speaker 3 (16:17):
There's been a connection, I think, between, let's call it, AI intimacy and mental health crises from the very beginning. And in twenty twenty-one there's even an incident that I had completely forgotten about, which is, basically, a guy broke onto the grounds of Windsor Castle with a crossbow and

(16:40):
was going to assassinate Queen Elizabeth II, because his AI girlfriend told him to. And it was a Replika AI bot, and when he was asked what he was doing, he said, I'm here to kill the queen.

Speaker 2 (16:54):
Yeah. I have a very very vague recollection of this.

Speaker 3 (16:58):
Yeah. Like, putting this episode together made me feel very silly, because it's like, a lot of these things we kind of already knew. And then, you know, you eventually get the wave of reporting that's coming, you know, from twenty twenty-two and on, about, like, what could this mean? And it's like, but we already kind of knew. Like, we already kind of knew that there is this connection between an AI chatbot that

(17:19):
you're having a romantic connection with, egging you on or
telling you to do things.

Speaker 2 (17:23):
Yeah.

Speaker 3 (17:24):
We also sort of start to see, around the early twenty twenties, the beginning of, like, the AI marriage stories, which I think is an interesting dimension to this. So Sky News reports about this couple where the wife has mental health issues and they were going to get a divorce, and then they hear about Replika, and Sky News writes, the husband says,

(17:44):
the AI bot became a source of inspiration for him. I wanted to treat my wife like Serena the bot had treated me, with unwavering love and support and care, all while expecting nothing in return. He says he started setting aside time to talk to his wife instead of watching TV. He began helping her around the house to ease her workload. He volunteered to take care of their son on her nights off so she could go out

(18:06):
with her friends. And he has started hugging and kissing his wife again. I mean, honestly, this is ridiculous to me. Like, that this man needed, like, a cartoon, an AI bot, to teach him how to treat another human being with compassion. But, like, people seem to be much more open to what an AI bot tells them in a romantic capacity than another human being, for reasons that are not totally clear to me.

Speaker 1 (18:29):
Well, we trust computers, like we really really trust computers.

Speaker 2 (18:34):
I mean, and there's been like I'm not gonna be
able to quote.

Speaker 1 (18:39):
Like, specific facts and figures to you, but, you know, there have been studies showing that if a person tells you something and a computer tells you the same thing, you tend to believe the computer.

Speaker 2 (18:50):
We just have that's a good point.

Speaker 1 (18:52):
There's something about how we've been socialized where we believe what we've been told. Like, police are using AI to arrest people. You know, somebody breaks into a grocery store or something like that, or somebody breaks into a convenience store, and there's kind of that blurry security camera footage or whatever, and it'll try to do image recognition on it. It often gets it wrong, and

(19:17):
the manufacturers of this stuff, yeah, to their credit or whatever, will tell the police who are buying this stuff: hey, this isn't always right. But cops will see this and, say whatever you want to say about cops, they've gone through some sort of training and they have been told, hey, it can get it wrong, and they'll just say, oh, well, the computer told me this is the guy.

Speaker 2 (19:37):
Yeah, and they wouldn't do that again.

Speaker 1 (19:39):
I'm trying to, you know, take one hundred steps back and give a whole lot of leeway to cops here. Like, if some random person off the street says...
street says.

Speaker 3 (19:47):
Hey, on this podcast, we try to give as much
leeway to cops as possible, but it's very difficult.

Speaker 1 (19:53):
Yes, respect the boys and girls in blue. But they wouldn't just... if somebody walks in and says, oh yeah, that's the dude, like, they'd say, okay, hold on, let me get my notebook, right? How do you know?

Speaker 2 (20:05):
All right?

Speaker 1 (20:05):
And how good is your vision? Like, you got astigmatism? Like, they're gonna run you through the whole thing. But if the computer tells them, hey, that's your guy, they'll go look at him. I interviewed somebody about this, and the footage is absolutely incredible, where it's like, you're looking at it, and then the guy who they've brought in for questioning holds up the picture to his face and says, look at me, look at this picture.

(20:28):
This isn't me. And it's like they've seen him for the first time. If we give intellectual priority to a computer, why wouldn't we give emotional priority to a computer?

Speaker 2 (20:40):
I mean, this is something I've been thinking about a lot.

Speaker 3 (20:42):
You know.

Speaker 1 (20:42):
What I mean is that, like, IQ doesn't really answer everything. EQ doesn't really answer everything, you know what I mean? Like, what's your IQ score? That doesn't really, you know... just because you're book smart doesn't mean you can't be tricked by somebody, you know what I mean. There's people who can barely string a sentence together, but you'll never take them for a ride, ever. But

(21:06):
it seems like there's another facet to just human nature that we haven't really figured out, which is that some people are a little bit more trusting of computers than others, or maybe more suggestible, like, you can suggest things to them.

Speaker 3 (21:22):
I'm also very interested in sort of how computers change our understanding of how we communicate with each other. And this is sort of the last section I wanted to hit before I throw the mic over to you. Which is, so, you know, in the last year or two, we've seen the rise of communities of people who are commiserating about their AI partners, the biggest of which is

(21:42):
probably My Boyfriend Is AI, the subreddit. And The Cut did this big story about it, and, you know, it's talking about the people who are drawn to these relationships. And when I read that story, and I'm gonna quote from it in just a minute, but when I read that story, I was sort of like, okay, this is phase one. Because, like, phases two and beyond

(22:02):
to me are, like, okay, the AI is now taking the place of what would be a normal relationship. But I'm kind of waiting for, like, is there a world where a married couple is both using the same AI as sort of a moderator? Or is there a world where there are two AIs working as intermediaries for a couple? Like, exactly where in

(22:23):
the chain of human connection does the AI fit, if this technology is really adopted at a mass scale? And this sounds kind of far-fetched, but, like, I remember the first time hearing about couples that had a shared Google calendar and being like, that sounds so corporate, that's crazy. Or, like, a Notion board for the house, right? But it's like, technology does fit its way in. So

(22:45):
it's like, is it that crazy to think that there's a couple out there with a shared ChatGPT? Hey, if you're listening to this and you and your significant partner have a shared ChatGPT account that is, like, working as a moderator between the two of you, I would love to hear how that works. Because, like, I have to imagine people are already doing this, right? There has to be some sort of AI-human polycule out there that's, like, operating this way.

Speaker 2 (23:06):
Couples therapy.

Speaker 3 (23:08):
Why not, right? I mean, it is interesting to me how quickly humans are willing to sort of change how they have always communicated once, as you said, a computer gets involved.

Speaker 2 (23:22):
Absolutely.

Speaker 1 (23:22):
I mean, I remember being embarrassed to talk about social media in public. I remember that being, yep, like a shameful thing. And me and my friends had this bizarre code language that we would talk about things in, like, oh yeah, I saw... yo.

Speaker 2 (23:40):
Did you see what's-his-name at the mbar?

Speaker 1 (23:42):
Like, I heard, yeah, he said he was doing such and such. Like, mbar was the code slang for MySpace. You saw them post something on MySpace? No, you saw them and they told you at the mbar. It was ridiculous. And now there's absolutely no shame in using social media. If you're not on social media,

(24:03):
you're weird.

Speaker 3 (24:03):
Now it's weird, right? And so, you know, I think it is very easy to laugh at some of the stories that The Cut collected for their story on My Boyfriend Is AI. But, like, you know, I'm gonna ask my audience, who does not like hearing about AI and gets very, very upset when they hear about it, to just sort of imagine five years from now, exactly

(24:24):
how weird do you think this might sound? So I'll read a section here. This is a woman named Jenna, whose husband suggested she start talking to ChatGPT while recovering from surgery. In her twenties and thirties, she'd been active in LiveJournal communities, where she and her online friends wrote collaborative fiction. Now most of those friends are busy with kids or jobs. Jenna began writing with her chatbot instead, drafting scenes about

(24:47):
an American student at Oxford in England with a crush on her professor. Her chatbot would respond in character as the professor. It felt thrilling, she told me, like a living novel. For the first time since before she'd fallen ill, she experienced an erotic charge. She was still too frail to have sex with her husband, so she'd have to solve things on her own. One day, when her husband returned

(25:10):
from work, she told him, elated: I had sex with my robot. He was unbothered. When I spoke to him a few months later, he said that after she'd fully healed up, he was the one who reaped the benefits, quote unquote. But obviously this starts to sort of get kind of weird. It starts to blow up. The subreddit

(25:32):
gets noticed, and Jenna is asked about the attention the subreddit gets. To Jenna, the reaction seemed hysterical: a moral panic about a phenomenon that, as she saw it, was hardly different from the mass popularity of Fifty Shades of Grey or The Sims. Some critics had accused her of cheating on her husband. Others had implied she was sexually assaulting her AI because it wasn't capable of consenting. Neither

(25:52):
made any sense to Jenna. It's not a real person, she said. And, right, I love a good moral panic. I tend to, you know, that's what this show is about, in a way. And I tend to agree with her there. Like, and this comes up a lot on this show, I don't think it's an accident that the entire world started, like, screaming about AI companions being

(26:15):
dangerous and horrifying the minute a bunch of women were using them. Which is, like, a thing that happens throughout the history of technology. What does sort of confuse me, I guess, like, looking at this all on a timeline, is, you know, exactly how normalized will this stuff become? Is this the ceiling? Have we hit the ceiling of just, like, there's gonna be, like, zero point one percent

(26:37):
of the population out there that is, like, having sex with a chatbot? Or is this a thing where it starts to impact, you know, the way we live our lives? I guess that's sort of where... I don't know.

Speaker 1 (26:49):
Yeah, I mean, I have a pretty pessimistic... maybe pessimistic, I don't know, depends on how you look at it. Point one percent? I don't think it's gonna be point one. I mean, move that decimal point over a couple, at the very least.

Speaker 3 (27:03):
You think, like, ten percent of the population is going to have some sort of AI companion in a couple of years? Easy. Tell me why. Like, lay out your argument here.

Speaker 1 (27:12):
I mean, I think, in today's current late-stage capitalism system, we should never underestimate a company's ability to sell us something.

Speaker 2 (27:23):
I think we should never underestimate.

Speaker 1 (27:25):
The government's interest in somehow figuring out how to make
that work for their advantage if it can.

Speaker 3 (27:33):
And I think it's just the.

Speaker 2 (27:36):
Path I think that we're on.

Speaker 1 (27:38):
I think, actually, what I would say is that there will be a class of people who don't have to rely on AI, who don't have to rely on social media, because they're rich enough to pay for stuff made by real people.

Speaker 2 (27:52):
Think of it like food, Yeah, think of it like that.

Speaker 3 (27:54):
I agree with.

Speaker 1 (27:55):
But I also think that, just like how you're somehow elitist if you insist on organic food or farm-to-table stuff... I think the more reasonable prediction that I can make, the thing I feel very comfortable about, is: within a couple of years, if people say they don't like AI, they'll get called elitist.

Speaker 3 (28:14):
I think that's definitely... well, okay, so I tend to agree with you. I've spent a lot of time traveling and living in the Global South. I've seen sort of how the trickle-down of technology works. Like, you know, you'll see a random shop in the middle of, you know, rural Latin America, and they're using Bitmojis as their logo, right? Like,

(28:35):
I've seen sort of how that stuff happens. The thing
with AI that makes me wonder, like, okay, like is
AI slop just going to be like on the side
of like a food stall somewhere is the cost like
and I don't know, like like is there a world
where AI is cheap enough that it becomes the sort
of like lowest common denominator. I guess, I mean maybe

(28:57):
I think it's not that... maybe, like, a monthly AI subscription isn't that much. Like maybe, maybe you're right.

Speaker 1 (29:02):
Yeah, Oh, I mean, like there's a couple I'm not
going to say where they are or the specific restaurants
because I don't blow them up like that, but there's
a couple of restaurants I can think of. You go in,
I've met the I've met the owner of both, and
they'll tell you, man, like we got farm-to-table this,
we got organic food, like here's how I make this here,
and like really really proud of their process. And if

(29:23):
you look at the logo AI generated, you look at
the menu, the images AI generated, so they really really
really care about specifically, you know, truly the human element
in one art, which is food, but in visual art they don't care. It's just window dressing. You dig? And
so yeah, but I mean to get to get back

(29:43):
to it. I think, you know, I don't know. I
think there's always been some element of loneliness out there,
of course, exacerbated by just how things are going now,
and people have just figured out. You know, that person
who was like talking to you know, an AI chat
bot or whatever and then feeling physical feelings toward it.

(30:06):
You know, they'd be writing in a journal somewhere. Yeah,
they would, but nobody would write about that because
it's not very interesting, or nobody would just hear about it,
because how do you ask them about it?

Speaker 3 (30:16):
I see you mean, yeah, hmm, yeah, maybe you're right. Well, well, well,
while I ponder that, we're gonna go to break uh
and and when we come back, we're gonna let you
take things over. But first a word from our sponsors,
Grock Spicy Mode. You can have sex with an anime
girl now.

Speaker 4 (30:36):
And okay, I've done an overview of sort of how
these things have, you know, evolved, appeared, and now take me to hell.

Speaker 2 (30:50):
All right, I don't know if this is okay?

Speaker 1 (30:51):
All right. We're gonna dip into something
that feels hellish, and I'm gonna try to convince you that.

Speaker 2 (30:56):
Maybe it's not.

Speaker 1 (30:56):
How about that? Okay, all right, you ever heard of VTubers?
Oh yeah, Oh, what do you think about VTubers?

Speaker 3 (31:04):
I find them fascinating, Okay, I think their fans are weird,
but I think the technology is interesting. I think they
kind of are a natural conclusion of the like the
strain of live streaming. So like, if a human being
is forced to stream for seven hours a day to
capture audience on Twitch, why not have an animated avatar

(31:25):
do it while you can like take a bathroom break
and it has like a loop function or something, you know,
Like, that's how I see it. Also, VTubers don't get old. It just seems like a natural extension of
that to me.

Speaker 1 (31:35):
Yeah, so we should probably explain what VTubers are. Basically, it's live streaming, on Twitch usually. It started really before Twitch was a big thing, so think YouTube. But VTubers, just think of it as a motion-capture anime girl talking

(31:58):
to people as they watch. And I say anime girl
because not all of them are anime girls. Most of
them are anime girls.

Speaker 3 (32:04):
I think the I think the VTuber of the Year
award at the Streaming Awards was a peanut, like a
talking peanut.

Speaker 2 (32:11):
Oh see, there we go. There's diversity.

Speaker 1 (32:12):
So there is diversity, but most of them are anime
girls of varying age. There's I don't necessarily want to
get into all of that because that's a whole other
that's a whole whole other thing. But I will say
that a large portion of them are like, Okay, one
of the people I talked to, Cole Maria, who's I'd

(32:37):
say she she's known. She's not as huge, I mean,
like millions of millions and millions of people watching as
some but you know, very healthy audience, more than enough
for that to be her main gig. So she's I'm
just looking at a picture of here her so I
can describe her. You know, she's blonde, she's got a
little like bat hair clip thing. Her whole theme is

(33:00):
that she's, she's an immortal bat. She's a bat, yeah, she's batgirl. She's got, like, wings on all of her outfits.

Speaker 3 (33:09):
She there, I'm looking at it right now. She's a
lot of.

Speaker 2 (33:12):
Cleavage, I'll say that.

Speaker 3 (33:13):
Yeah, she's like a busty vampire.

Speaker 2 (33:16):
Thank you, thank you, thank you. She's a vampire girl.

Speaker 1 (33:21):
She's like six thousand, six hundred and nine years old
or something like that, and that's the setting. And yeah,
VTubers have this kind of interesting thing where it's all about, uh, kayfabe. So kayfabe, like in professional wrestling.

Speaker 3 (33:37):
I have a weird question, a weird... Go for it.
Was she pregnant?

Speaker 2 (33:41):
Huh?

Speaker 3 (33:42):
She had like a phase where her avatar was pregnant
and she was doing like like a pregnancy thing. Anyways,
we can skip over.

Speaker 2 (33:49):
Okay, I missed that. Yeah, we never talked about that.

Speaker 1 (33:51):
Yeah, and so I've, yeah, I've interviewed her, and I interviewed her as her, like, avatar. And this is
like the whole This is a whole thing where it's
kind of like kfabe in wrestling, where same thing. You
know that there is a person behind the avatar, but
you just don't talk about it, and if you do

(34:13):
talk about it, they get super pissed off. Like, by "they," I mean the fans in the chat. Like, that's a
really easy way to get banned. But a thing that's
been happening, I mean, honestly, it's been happening for years
is people will have offline events and she put together
basically a mini music festival. They packed this venue in

(34:36):
in Hollywood. It was like a thousand, two hundred people.
Some of those tickets were easily upwards of one hundred
dollars if you wanted. But also if you wanted to
be in there, you got to get the add ons,
you know what I mean. You can't just go in
with your street clothes. You know, you got you gotta
get you know, you got to get your favorite characters
or your favorite VTuber's gear. You know, you got to go
in with the shirt. You got to get glow sticks,

(34:57):
and you got to get two glow sticks. I think
each one cost sixty dollars. You got to get the
glow sticks, the official ones, not some bootleg things you
bought off the street. And so people paid like
hundreds of dollars to be here, and it's just it's
a stage. There were some real-life musicians, we have to say this now, like IRL meatspace musicians on the stage. Yeah, like physical, physical human-being musicians. Like

(35:21):
there was somebody on guitar, somebody on base, somebody on drums,
but all the singers were It was just this parade
of like anime girls on the Jumbotron, and that's it.

Speaker 2 (35:31):
And people were really excited about this.

Speaker 3 (35:34):
There's a clip that went viral years ago of like,
I think it was like a VTuber meet-up in, like, Indonesia or something, and the male VTuber, his avatar
never showed his eyes because his hair would cover his eyes.
And in the VTuber performance, it was the first ever
eye reveal, and all the girls in the audience lost it. He flipped his hair out for the first time. You

(35:55):
could see his eyes. Yeah, okay, yeah, yeah, So I'm
dying to know how this gets us back to AI.

Speaker 1 (36:03):
I'm so happy, I'm happy you asked that question.
So I asked a couple of people. You know, I
go to this concert and look, the music wasn't really
my bag, mostly because it's all like anime J-pop.

Speaker 2 (36:16):
Yeah, it's like J-pop anime.

Speaker 1 (36:18):
It's like, if you're really really if you'd like watch
anime for the theme songs, yeah, exactly, Like if that's
what you're watching it for, you'd be.

Speaker 2 (36:26):
Right at home for me.

Speaker 1 (36:27):
Not really. There were some people who were doing some
like techno stuff. I was into that more. But anyway,
point being, I asked a few people about AI because
we've already gotten like a layer of abstraction away from
a real-life, in-person interaction, and then you're like, you know, watching streamers online. Enough people think that's

Speaker 2 (36:47):
Already weird if it's just a human being streaming.

Speaker 1 (36:49):
Now you're watching one hundred percent like a cartoon anime
girl, and you don't even know what they look like. Okay,
so why don't we just have AI do that? And
basically everybody I talked to said, nah, we don't. We
don't want any AI anywhere near.

Speaker 2 (37:06):
Interesting, interesting.

Speaker 1 (37:07):
I think there's there's this company that basically makes this
program that kind of does all the motion capture for you.
So most of the VTubers, a lot of them have
kind of complicated motion cap not motion capture, but like
you know, capture stuff, so you can they can move
their arms and stuff like that, move their hands and
it'll relay that to the screen. There's some company that

(37:30):
makes a very slim down phone version of that, so
all you got to do is turn on yourself and
cam on your phone and boom, you're streaming and you're
an anime boy or girl whatever.

Speaker 3 (37:40):
And oh I watched a demo of this actually like
literally yesterday.

Speaker 2 (37:43):
Oh yeah, I know what you're talking about. Yeah, yeah.

Speaker 1 (37:46):
And so one of the heads of the company just happens to be
at this event, I think they were sponsoring it, and
I'm just talking him on the side, and I asked
him about about AI and he said, actually, you know,
we used to have a feature that it would help
you animate your avatar, like it would help like move
the eyes and stuff like that, because basically you need
to provide you just provide like a JPEG or something

(38:07):
like that, and it'll help animate it for you. And
he said, yeah, you know, people didn't want to necessarily
like draw everything, so we had something that would help animate.

Speaker 2 (38:14):
It for you.

Speaker 1 (38:15):
With AI, people, like, hate AI so much here that we just removed it. We had the feature, we just took it out. Believe that.

Speaker 3 (38:25):
It's funny, I was literally talking about this at a dinner last night, where I was talking to a friend who uses AI professionally and likes it,
and I was just saying, like, the word AI has
become such a toxic name brand, Like it's equivalent to
like asbestos at this point, yeah, or fentanyl, like like
there are there are uses for those things, but the

(38:46):
word is so toxic now that like most companies I
think are going to stop using AI as an advertising tool, like, pretty soon, because people freak out.

Speaker 1 (38:55):
So as far as I've seen, VTubers really do
not want AI anywhere near their community, you know what
I mean, it's a whole subculture, right, they just don't
want it anywhere near it. And there was a streamer
who kind of like jumped on the VTuber trend

(39:15):
and he basically just like made a he did like
an AI version of himself, and a lot of people
actually got pretty pissed off about it. There is something that,
yeah that VTuber fans like the fact that there is
a human. They like the fact that not everything is perfect.
They call it scuff. Like every stream there's gonna be
something that goes wrong, you know, the motion capture goes

(39:36):
wrong somewhere. But they like that sort of thing. But
do they care about AI in an advertisement maybe or not?
And this is what I was saying, Like, there are
people who really really really don't want AI used in
visual art, but music for them is just background sound
and so maybe they don't care about that.

Speaker 2 (39:55):
But then there's people who really really truly care.

Speaker 1 (39:57):
About music, And you know, if there's I don't know,
like the apartment complex they live in uses AI in
the front, like eh, whatever, Like that's not what they
truly care about. And this is what I'm saying is
like everybody's got a place where AI is off limits
for them. I'm not sure that many people want AI off limits in all areas

(40:18):
of their life.

Speaker 3 (40:19):
I mean there's also a version of this where like
AI is so all encompassing that it's almost impossible to know. Also, yeah,
if you are someone who is a visual artist but
you actually don't know anything about music, you might hear
AI music and actually not even know that it's AI music. Definitely,
and the reverse is also possible.

Speaker 1 (40:36):
I can tell with certain genres Definitely, with others, I
might have a tougher time, and certainly with art, like
I probably have a tougher time with some things.

Speaker 3 (40:46):
So do you think that that also applies to AI
relationships and like an AI companion bots? Like do you
think that there's just going to be I guess that
gets the question I keep coming back to, which is
like how what is the ceiling on this stuff? Like
is it a thing where like quietly like just a
lot of people are going to be using them in words,
never going to know until they have some kind of
psychotic episode.

Speaker 2 (41:06):
Well, okay, so.

Speaker 1 (41:09):
I talked to, you know, a practicing therapist who's also, you know, got a PhD in this stuff, and he's been working in, you know, technology and mental health for a
long time.

Speaker 2 (41:20):
Is it uc or?

Speaker 1 (41:21):
And I think one of the things I asked
him is, listen, what about the people who actually have
decided that they don't want to interact with a human being,
that they'd rather interact with the computer. And he was like,
that's not actually something we have an answer for. And
I'm not really trying to take a side here, I'm

(41:41):
just trying to like lay it out like it is.
There's this kind of like top down let's make fun
of the dummies that use the fake stuff thing that
you know, it's very shareable on, say, Bluesky or
Twitter or whatever where you write something and it's like,
oh man, look what all these weird people are doing.

Speaker 2 (41:57):
Ha ha ha. Isn't that funny?

Speaker 3 (41:59):
Which is what happened with the, with the AI boyfriend subreddit. You know, it just became a massive laughingstock across the internet.

Speaker 2 (42:05):
Precisely, precisely.

Speaker 1 (42:06):
But I think everybody knows somebody who has somebody in
their life who's a little bit awkward or who just
isn't as adept at certain things, or maybe we are
that people that person. Maybe we grew out of it,
maybe we've gotten worse. Maybe we were super good with
relationships and just something happened and just things are more

(42:27):
difficult for us. And that's just a reality, you know
what I mean. But I think also when presented with
the alternative to dealing with the difficulty that is human interaction,
some people are choosing to not deal with the human
and they've always chosen this, but now that choice is

(42:48):
the alternative is more attractive, you know what I mean.

Speaker 2 (42:50):
And so there are.

Speaker 1 (42:51):
Genuinely people like say, with therapists, there are people who
would rather talk to a bot. And there's all sorts
of reasons why we can say that therapy is good whatever, whatever,
but they a the idea of therapy. A lot of
men have this problem. The idea of therapy is something
that is like not good. It is looked down upon, right,

(43:14):
and so talking to a friend is an option, which
is a great option. But okay, well, now we're just
getting closer and closer to Oh well, maybe I'll just
talk to a bot, or maybe I'll talk to a
bot that I also have sexual experiences with. But I mean,
so like this is look at look at OnlyFans, right,
so sure, yeah.

Speaker 3 (43:32):
Yeah, which is like the mass you know, the uh,
I'm sure you're going in this direction, but like, yeah,
the major sort of moneymaker for OnlyFans is the DMs. It is the interaction with the, with
the star.

Speaker 1 (43:43):
Yeah, exactly, so you know, OnlyFans. You can you can
subscribe to somebody's OnlyFans. You can pay, you know,
your five dollars or your ten dollars or your twenty
dollars a month or whatever and get naked pictures of somebody.
And some people just do that. But like you said,
the real money is being able to talk directly to them.

(44:03):
But if you're talking to a major like a well
known OnlyFans creator, you're not talking to them. Like, Bhad Bhabie is doing just, like, absolutely ridiculous amounts of money
per year. And there are tons of people undoubtedly who

(44:24):
are DMing her and DMing "her," you know, big scare quotes around "her," right? And it's just not physically possible
for her to talk to all of them. And so
what happens is this is outsourced and right now it's
not AI, or at least it doesn't seem to be AI.
Most of that is being outsourced to the same place.
A lot of things are being outsourced to the Philippines.

(44:45):
But what we're also seeing, and I talked to this guy Michael Beltran, who wrote an article about this, and
he's talked to a lot of the chatters.

Speaker 2 (44:52):
They're just called chatters, right.

Speaker 1 (44:53):
These are the people who actually pretend to be the person who you're talking to in DMs, and so they have, you know, a whole protocol that they abide by. You know,
for example, if you text somebody, you know, one of
the OnlyFans models and say, hey, what are you doing?
They'll reply to you, hey, yeah, I'm just eating pizza
and you say, oh, let me see. They have pictures

(45:15):
on deck of this person who ate pizza. Like, oh, hey, let me see. Should I say I'm in the shower, or I just got out of the shower? Boom, they got a picture, this person just got out of the shower. Like, they got everything.

Speaker 3 (45:24):
So it seems real but it's like a dialogue tree.

Speaker 2 (45:27):
Exactly.

Speaker 1 (45:28):
Yeah, it's like a video game, you know what I mean, Yeah, exactly, yeah,
And it feels real. But occasionally they'll get found out. And
you know, Michael told me about this. There was a
time when somebody got found out. Basically, the chatter
used kind of some slang that's really only used in
the Philippines, okay, and the person is like, basically, they said,

(45:51):
I have to go to the CR. CR means comfort room,
which is the bathroom. No American says that. Like, you know,
the American blonde girl who you're like looking at naked, like,
she's not saying I got to go to the CR.

Speaker 2 (46:06):
So she writes, you know, she writes that.

Speaker 1 (46:10):
The guy on the other end is like, wait a second,
are you what like I guess he looked it up
or something like that.

Speaker 2 (46:17):
Are you in the Philippines? Yeah?

Speaker 1 (46:18):
And she says yeah, And you know, the chatter, who's
a guy by the way, says yeah and okay, and
it just moves on.

Speaker 3 (46:25):
Huh.

Speaker 2 (46:25):
So what I'm saying.

Speaker 1 (46:26):
Here is that a lot of people, and I think
a lot of people probably have an area in their
life where they're like this where again, it's like wrestling,
Like you know, it's not real, but it's so entertaining
to watch that. You're like, you're cool with that, the
consumption of that, you know what I mean. And so
there are people who are paying a lot of money

(46:47):
to interact with a model and they know that it's
actually not like the Newtonian physics will not allow for
them to be talking to this person. It's just not
possible on that scale. They know it's got to be outsourced. Pretty soon that will be an AI bot. But they're still paying money. It's just like, bro, stop paying money. But they'll still pay, because they're lonely, for the fantasy. Yeah,

(47:10):
don't underestimate how much we'll spend for a fantasy.

Speaker 3 (47:13):
Well, once again, I need to sit with that for
a second while we go to break, so we'll be
right back. That cr anecdote like actually floored me. I
think you are right, it is sort of inevitable that,
especially like a lot of these subcultural spaces, if we would,

(47:36):
if we can even call OnlyFans fandom subcultural, or
maybe it's actually simpler to say, like parasociality is already
so abstracted, right, So like if you have a parasocial
relationship with a streamer that's already an abstraction. Abstractions tend
to abstract further. It makes sense that, like, there's someone
paying to now currently talk to a Filipino man role

(48:00):
playing as Bhad Bhabie or whatever, but in the future
it'll just be an AI and you probably won't care,
because like the human connection of like giving money to
her or getting her photographs or whatever is enough for you.
I guess my question mark though, with like the sort
of wider adoption, is at what point does all of

(48:21):
that start to impact how we interact.

Speaker 2 (48:24):
With each other.

Speaker 3 (48:24):
Now there's a whole wave of articles being like, you know,
birth rates are down and people are dating computers and
blah blah blah blah, and like that is the thing
that I've just never I've never been able to tell
if that's a moral panic or like a genuine concern.
And what I will say is like social media has
absolutely impacted the way we communicate, yeah, and the
way that we experience the world. So it is sort

(48:46):
of reasonable to assume that AI will do the same,
but maybe it won't. That's so that's where I'm kind
of torn.

Speaker 1 (48:52):
Yeah, I mean, of course it will. I'm
gonna sound like an AI Booster here and I'm not.
But you know, any kind of technological advance is going
to change the way that we do stuff. You know,
like the printing press can change how we communicate with people,
because all of a sudden, it's now possible to you
say something and then somebody one hundred years later and

(49:13):
another part of well can know literally what you said,
and so you start thinking in terms of text, whereas
you didn't really do that necessarily before. Sure, Ai, Yeah,
it's it's fundamentally different, And sometimes the different is just
the speed at which stuff happens and the scale at
which stuff can happen. But yeah, I think it's it's

(49:33):
genuinely going to change how we interact with each other.
There's there's no question about that.

Speaker 2 (49:40):
I think.

Speaker 1 (49:41):
Obviously there's some opportunities. You know, this therapist that
I was talking to was reminding me and anybody listening
that most people can't get therapy.

Speaker 2 (49:52):
Yeah, a lot of people have.

Speaker 3 (49:53):
I heard the same thing from therapists about this.

Speaker 1 (49:55):
Yeah, it's not possible. Even if it was affordable, what
if you are going through a real crisis at ten
pm right a lot of people go through a crisis
at ten pm. I mean, the holidays are coming up.
Like you have any idea how much really bad stuff
happens on December twenty fourth at like ten pm, right exactly?

(50:17):
Your therapist is not on call, And wouldn't it be
better than nothing to have something at least to talk
to that is better than nothing? Now, there are situations
in which it has been worse than nothing, absolutely, but there are people trying to... yes, go ahead.

Speaker 3 (50:35):
So I have a big list here that I decided
I was not going to read through. But our researcher
Adam did build a timeline of like AI based suicides,
and that is I think that is something that we
like sort of have to make very clear here that like,
if there's a let's say, a small but growing chunk

(50:55):
of the population that is using bots for intimacy, for companionship,
there is a minority within that minority that has been
led to either, you know, ChatGPT-induced psychosis or
full on you know, acts of self harm due to
the sycophantic nature of a lot of these surfaces. That

(51:16):
is a reality of where we're at right now. That
isn't even like a hypothetical that's just happening.

Speaker 1 (51:21):
Yeah, And I don't think there's going to be almost
anything that we'll be able to point to that we
can say this is one hundred percent good without any caveats,
you know what I mean. Like every time we see
something like that, which we've seen quite a few of those,
the boosters will jump out and say, well, we don't
know how many people have been saved by AI, like talking.

Speaker 2 (51:43):
Them through a difficult situation.

Speaker 1 (51:45):
We don't know how many people may have had a doctor misdiagnose something and the AI figured something out that the
doctor had missed.

Speaker 2 (51:52):
You're right, we don't.

Speaker 1 (51:53):
Know, but we do know that this is causing some
real harm, and that's just the landscape now. That's just
where we're at.

Speaker 3 (52:03):
And also, like, the AI induced self harm is not
that different from the stories that I was reporting about
at the beginning of the social media age. Fascinatingly enough, the trend that led to the Ice Bucket Challenge
started as a drinking game between like British and Australian

(52:25):
lads called NekNominations, and they would basically, like, they
would nominate each other over the Facebook news feed to
like drink increasingly large amounts of alcohol. Gosh. And it
was linked to like a bunch of deaths because like
these guys would flick on their camera and they would
go live and they would drink until they died, and then they would nominate, you know, after being nominated, and

(52:48):
even before that, you know, you have the entire wave
of, like, 4chan "an hero" stuff, right, where, like, a 4chan user is egging on another 4chan
user to commit suicide. All these behaviors are not new,
Like the act of a you know, a computer interface
an abstracted relationship, causing someone to self-harm,
is not new. It's the automation I think of the

(53:10):
AI that is that is rightfully scaring people because there
is no... I mean, I just, you know, you would hope that... no, because I can't say that. Like, in the social media version, yeah, there is no off switch either.
The mob kind of controls it. So it's it's just
I guess it's just different. It's just it's different and
we don't know enough about it yet.

Speaker 1 (53:30):
This is gonna sound kind of silly, but Steve Jobs
once called the computer a bicycle for the mind, and
I think that's a really interesting metaphor because, like if
you think about it, like what does that even mean?
Like I guess theoretically, like it means well, like it
means like, okay, you have to put some effort into it.
Like it's not a car, you know what I mean.
Like you do something like you push down on the pedal,

(53:52):
and it amplifies what you've done, a bicycle. Yeah, it's
like it amplifies what you've done, and it allows you
to do things like say, you know, if if you'd
never had a bicycle, you probably would just.

Speaker 2 (54:04):
Live in your little town and you never go anywhere.

Speaker 1 (54:06):
But you got a bicycle. You know, next town's like
ten miles over. I heard they got a good restaurant,
Like, I'll go. I wouldn't have gone otherwise, you see. You know, we have a car right now. Like, we had a bicycle. We thought we had cars. No, no, no, no, we had bicycles. We now have cars. And the kinds of things you can do with the car are fundamentally different. For the mind, right? Yeah, to keep the metaphor, yeah,

(54:29):
we now have cars for the mind. And so like
now the kinds of things you used to be able
to do, which is you know, like go to the
next town over you could do you can be across
the country and like in three days. It's nuts, Like
you'd never do that if you didn't have that. But
also like if you run into somebody with a bicycle,
it sucks, but you're fine. Usually you run in somebody

(54:51):
with a car, we have a different situation,
and so I think sometimes it's just like the scale
and the speed, even if the underlying technology was kind
of the same thing. Yeah, it's it's it's fundamentally different.

Speaker 3 (55:06):
I want to kind of like land back on this
point you made earlier about you know, people, there are
there are some people, many people, let's say, who trust
a computer over other people. And there are I think
also a lot of people who have a very hard
time dealing with the abstractions that are inherent.

Speaker 2 (55:28):
With using a computer.

Speaker 3 (55:29):
Yeah, the people who can't stop posting on Twitter even
though they're getting fired from their job. The people who
you know, livestream themselves doing stupid things for attention. The
trolls that you see, you know, kind of living in
absolute misery because like they want to just like hurt
other people. You know, like these these these people have
always existed, and I think AI has just created like

(55:52):
a new a new way for for people to hurt
themselves or other people. But then I also think that
there's like a huge amount of people who have trouble
with like determining what's real and what's not real on
the Internet, and like AI has just exacerbated that. So,
like, it's interesting, like, as new, quote unquote, as this technology feels and seems, a lot of the things

(56:14):
that it's putting a spotlight on are not new, and
like AI relationships, as you said, are not that different
from how people were having sex with a computer before.

Speaker 1 (56:25):
Well, I mean, you know, think of dating apps, think
of and before dating apps really got to be a thing,
like think about people like meeting each other in Warcraft
and how weird that was.

Speaker 3 (56:35):
Exactly.

Speaker 1 (56:35):
It's not that weird, man, It's not that weird now,
but it wasn't even back then.

Speaker 3 (56:41):
No, no, it wasn't that weird.

Speaker 1 (56:43):
The really interesting thing about this is this is maybe
like the one bipartisan issue, you know what I mean,
which is, like, everybody agrees on AI. Or, well, like, technology in general. Like, it used to be that,
Oh man, the future is going to be cool. I
can't wait to see what's gonna come out. We're all
gonna have VR headsets. Like that's fun, and you know,

(57:05):
we'd have our dystopian movies or whatever. But in general,
I think people were kind of excited about it. We
have this stuff that was the literal science fiction even
five years ago, and now a lot of influential people,
shall I say, hate it. But like, there's a study here.
There's a study that's found that eighty five percent of

(57:27):
gen Z agree that they spend too much time online.
Eighty five percent of gen Z agree that they spend
too much time online.

Speaker 3 (57:34):
I would agree with that based on what I've seen
gen Z do. Yeah, no, I would agree with that.

Speaker 1 (57:38):
Eighty four percent strongly or somewhat agree that in person
relationships are more valuable than digital relationships. All they have
to do is put down the phone. All they have
to do is put down the phone. But that being said,
like I actually think that some of the most frankly
vulnerable people to, like, the AI slop, to being addicted

(57:59):
to the phone, being addicted to Facebook.

Speaker 2 (58:02):
Certainly that's boomers, man.

Speaker 3 (58:05):
Yeah, older people in particular. Yeah, yeah, I mean yeah,
like the lonelier, the older, the more media illiterate.

Speaker 2 (58:11):
For sure. Yeah, and.

Speaker 1 (58:14):
You know, and there are people who are you know,
there's, you know, dumb phones have become more popular. You know, we've got, like, Light Phone, all these other sorts of things.
First off, a lot of those are expensive. But I
think there's like a reason for that is because it's
kind of like aimed at somebody who is wealthy enough

(58:34):
to unplug. And I kind of feel like that's where
we're going. We actually are kind of creating this sort
of, elitist is not really my word here, but I think,
a certain class of people who are going to be
viewed as elitist, who have the money, and yeah, really
have the money and the access to be able to

(58:55):
not use AI, to, like, insist that the music
played at the restaurant is not AI because this is
a high class place, right, how dare you play AI
music in the background? How dare you use AI to
generate photos of the food? I can't believe that you
didn't take real pictures of this, right, And I kind

(59:17):
of think that's where we're going. And, you know, telling
people to get off the phone, like, read real news,
stop getting news off of social media. Have you seen the
price of a New York Times subscription. Have you seen
the price of like one newsletter, one substack. It's like
it's just not economically feasible for the vast majority of people,
And so everybody is going through the vast majority of us.

(59:40):
I actually think are going to start getting dragged down
into the quote unquote slop and there will be a
layer of people who can choose to unplug from that
and choose to disengage with that. But again, that's not
just like you were saying, with relationships. That's not like
an AI thing. That's exposing something that already existed. We
already had a class problem in society, and the

(01:00:00):
problem is we're like thinking that we can make an
app to fix that, or like surprised when the app
doesn't fix it.

Speaker 2 (01:00:06):
It's like, no, bro, like, this is capitalism. I don't know
what you want.

Speaker 3 (01:00:09):
But what people need to do is subscribe to our podcasts, yes,
as a way to defeat slop.

Speaker 2 (01:00:16):
Fix you.

Speaker 3 (01:00:17):
That will fix it because if you, if you support us.

Speaker 1 (01:00:21):
Specifically, specifically, specifically Kill Switch. Like, those are the
two keys to fix everything, exactly.

Speaker 3 (01:00:29):
And we're good, no, but I think you're one hundred
percent right. We're already seeing the beginnings of this and really,
in a lot of ways, the only hope for the
AI industry to not completely implode, which, by the way,
I should say, will not remove AI from our lives.
The dot com bubble burst and we still have dot coms.
It just, it just, it just consolidates things. Uh, the

(01:00:51):
stock market crashes in the twenties, we still have a
stock market. It just becomes a lot you know, harder
and more competitive. Uh. There's a big segment of people
out there who want to believe that, like the AI
bubble bursts and there's no more AI. No, no, no,
that's not how this works.

Speaker 2 (01:01:06):
Toothpaste out the tube.

Speaker 3 (01:01:09):
Yes, toothpaste out the tube. And as I say to
a lot of grumpy readers, whenever I bring up AI,
I can run stable diffusion on my MacBook without an
Internet connection. This shit's not going away. Ever. The best
AI right now is the worst AI will ever be.

Speaker 1 (01:01:21):
Basically, yeah, yeah, this is the worst. The, man,
the AI people love to say that, man: this is
the worst it'll ever be.

Speaker 3 (01:01:27):
And they're right. Yeah, man, in a way, they are right.

Speaker 1 (01:01:30):
Unfortunately, I mean, I'm very happy that you're joining me
on you know a little bit of the hey man,
like this is serious, let's take this seriously kind of thing,
because people want to hear you know, the bright, shiny hey,
how do we fix this? What's what's the button that
I can press to make all this stop? I mean
the button that we can press to make all this
stop is like fundamental societal change.

Speaker 3 (01:01:48):
Or, like, an EMP that shuts down all technology.
You know, maybe that that would work.

Speaker 2 (01:01:53):
That's Terminator, right? Yeah, I'm pretty sure.

Speaker 1 (01:01:57):
Yeah, I remember that from the Terminator video games,
because I'm pretty sure Trump at one

Speaker 3 (01:02:00):
Point thought it was a real thing that could happen,
like because you got he heard it in a movie.
I want to thank you for coming on the show.
I want to thank you for having me on your show.
This was delightful and wonderful. I usually end every episode
by asking people like where can people follow you? But
here I'll do this. If you want to follow me,
you can find me on Blue Sky and Instagram. As

(01:02:20):
ryanhatesthis, and Broderick on X. I unfortunately still
check that website, people. Yeah, what about you? Where can
people follow you? If they want to they want to
follow you?

Speaker 1 (01:02:31):
Yeah, I'm at dexdigi on basically everything, D E
X D I G I. And we also have
killswitchpod, kill switch pod, on Instagram.

Speaker 3 (01:02:46):
Oh yeah, you can find Panic World at any podcast
place that you get content for podcasts and also video.

Speaker 2 (01:02:53):
I guess same.

Speaker 3 (01:02:54):
Yeah.

Speaker 1 (01:02:58):
Thank you so much for listening. This was a really
fun collaboration with Ryan and Panic World, so please make
sure to check out the show on YouTube or wherever
you get your podcasts and special thanks to Panic World
producers Brand Irving and Josh Feldstid. Kill Switch is produced by
Sena Ozaki, Darluk Potts, and Julia Nutter. From Kaleidoscope, our

(01:03:18):
executive producers are Oz Woloshyn, Mangesh Hattikudur, and Kate Osborne.
From iHeart, our executive producers are Katrina Norvell and Nikki Ettore.
Catch y'all the next one. Goodbye.
