
September 3, 2025 69 mins

As promised, we take out our hoops and smear vaseline on our cheeks to battle. Kayla is pro ChatGPT in helping her. Rachel thinks it's cheating. Rachel is scared of ChatGPT. Kayla is scared of dolls coming to life and taunting her. Who will be victorious?

(ChatGPT refused to help me with this description! Seriously. I think it's hurt. - Kayla)

Trigger warnings: Suicide
Also scary A.I. stories, tipsy banter and irreverence

Please subscribe, rate and review!

New episodes every Wednesday.

E-mail us your short story at contact@writeyourheartoutpod.com

Follow us on instagram @writeyourheartoutpod

Leave us a message at 650-260-4885


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
SPEAKER_02 (00:00):
Write

SPEAKER_01 (00:01):
your

SPEAKER_03 (00:02):
heart out.
Hi, I'm Kayla Ogden.
And I'm Rachel Sear.
And this is Write Your Heart Out.
We did it that time.
It will never not be so corny and funny to do that.
I know, it really won't.
So we are back. In the last episode, we promised that we would come back and talk about using AI

(00:26):
like ChatGPT to write.
Yeah, and I have some, like, issues with it, but it's a, it's a thing now. We decided last time that Rachel was basically anti, and I was basically pro, but obviously our feelings are more complicated than that. But, um, she had a pretty strong reaction when I was talking about it. I suggested that Rachel feed her

(00:50):
book that she wrote into ChatGPT and ask ChatGPT to convert it into a screenplay, and she was just like, fuck no. Um, and I was
like, why not?
But anyway, so I

SPEAKER_02 (01:02):
guess I, I, I, you're saying I had a strong reaction, and I'm like, did I have a strong reaction? I guess I kind of, I did. It was definitely a solid no for me.
You were

SPEAKER_03 (01:11):
like, I hate AI.

SPEAKER_02 (01:12):
Yeah.
Yeah.
I did say that.
And is that how you feel?
It is how I feel.
I mean, like, if I'm going to just cut down to, okay, well, also, I'm a little scared of AI, so I don't hate you. I don't want it to be on the record that I say I hate you, AI, if you're going to come back in 10 years and murder my family.
I see that you're necessary.

(01:34):
I'm scared of you.
I'm scared of you.

SPEAKER_03 (01:36):
Whoa.

SPEAKER_02 (01:39):
I've had a couple glasses of wine.
I feel like you're

SPEAKER_03 (01:41):
already like a sycophant for AI.

SPEAKER_02 (01:45):
Well, okay.
So, hold on.
I watched the movie Terminator when I was a small child.
Did you?
Yes.
Okay.
That definitely shaped my idea of AI right off the bat, right? Like Terminator, an alien, there's that droid guy. You know, this was like the early 90s when I'm watching all these, and late

(02:05):
80s even.
I watched them very young.
They freaked me out.
I think it's, like, in my core to, like, no, I don't, like, why would I tell all of these things to my, to a,

SPEAKER_03 (02:16):
I

SPEAKER_02 (02:16):
don't

SPEAKER_03 (02:16):
know, I'm scared.
So you were always scared of, like, robots and stuff like that from those movies? Because, you know, there's scary movies that are based on all types of different things.
Sure.
Dolls, clowns, ghosts, whatever, robots, zombies, aliens. So there's certain ones that really freak me out and certain things that just don't scare me. For example,

(02:37):
aliens coming down and abducting me, that just doesn't scare me.
No, totally, me too.
But a doll that comes to life and, like, talks to you, that scares the shit out of me.
Sure, that's freaky.
Or, like, something being in the mirror, like a different entity looking back at you from the mirror, that is the most terrifying thing I could ever imagine.
Totally.
But, like,

(03:00):
aliens and robots were never really my, they didn't really push my buttons that way. Like, I get why robots and aliens can be scary, but they weren't
like, you know, my thing.

SPEAKER_02 (03:10):
Yeah.
Okay.
So I don't have a problem withrobots.
I feel like that is a bigdistinction.
Okay.
So there's this, like, I'm so endeared to this one robot that you might've seen. There's a video of it, and they, like, programmed it to always clean up its own oil, and it, like, spills its oil every day. Oil consistently spills out of it, and it, like, mops up its own oil, and it has to feed its own oil back into its system, but oil

(03:33):
still spills out. And it's, like, trained to dance, and every time it dances it gets a reaction from a crowd, and then it gets a positive influx, but more oil drips out every time it dances. So it's slowly killing itself by dancing. And then by the end of this video, this poor robot has been dancing and just wants to dance, and, like, it now is dying because all of its oil has come out

(03:55):
and it can't sweep it up fast enough and also dance.
Oh, that is so...
It's so endearing.
It's so sad.
Is that

SPEAKER_03 (04:01):
like a Pixar short or something?

SPEAKER_02 (04:02):
Or is this like a real thing?
No, it's a real thing.
I'll send you the video later.
We'll put it in our thing.

SPEAKER_03 (04:06):
So is it like a, was that an art installation?
They made this thing just to be tragic?

SPEAKER_02 (04:12):
I think so.
Yeah.
So like that robot, I love it.
I want to give it a hug.
But AI, it's so real how much it learns from us. I mean, when we first started, and the ChatGPT that you use, it sounds so lifelike, that we, we created, like, a group chat so that we could bounce

(04:33):
ideas off of this ChatGPT. That was the first time I'd ever done anything like it, and I was like, is this Kayla talking right now, or is this the chatbot? Like, I don't understand what's happening. It was so real, it sounded so fluid. It scares me. Like, how can you trust people anymore when that's, like, maybe it's not even a people? I mean, it's not a person.

SPEAKER_03 (04:56):
Yeah, I have trust issues, Kayla. I mean, we have
Mm-hmm.

(05:29):
we're in Silicon Valley.
I think that's fair.
Yeah.
Yeah, because there's, you know, hundreds of millions of people in the United States. And what I think is true is that the majority of the people who live here probably can't tell when something is written by AI, or when an image or a video has been generated by AI.

(05:52):
And they've never used ChatGPT, and they barely know what it
means.
Do you think that that's true?
Do you think most people don't really?
I

SPEAKER_02 (05:58):
think most, I think the majority of people cannot tell the difference between an AI-generated image or not. And they definitely wouldn't be able to tell with an AI-generated written word thing.
That's for sure.

SPEAKER_03 (06:09):
Yeah.

SPEAKER_02 (06:10):
And then I know that Google now uses their AI to give
you a summarized information piece. When you Google something, you get your AI summary. And that's causing a big issue already, of, it will conglomerate all of this information and spit out the most, the highest, just

(06:30):
the highest bidder, you know, the most popular, the most popular answer. And so it's already causing issues in, like, the allergy community. I have a friend who is very prominent in the allergy community. Her son has a bad nut allergy, and she has a blog, and I guess multiple people have died since this has happened, because recipes are being shared that say that

(06:52):
they're nut-free or allergy-friendly, and then people are just like, oh, okay, well, Google said so, this was the first thing that came up, and they make whatever it is and then, you know, die.
Really?
Yeah.
Apparently that's a thing.
So I think people blindly trust, which is, I think, human nature, to trust until you can't. And then now there's this thing that has so much information at

(07:13):
the helm of this AI universe that it doesn't know how to sort what's right and wrong quite yet, you know?
I wish

SPEAKER_03 (07:22):
that the answers weren't, I mean, I don't
actually know how it comes upwith its answers.
Like Sam Alt The head of OpenAIhas been quoted.
I don't know how recent thiswas.
I don't think it was afterChatGPT 5.0, but he said, I
can't believe people trustChatGPT as much as they do.
I know, I think

SPEAKER_00 (07:43):
I heard that too.

SPEAKER_03 (07:44):
He's like, it straight up hallucinates.
You know, like, don't trust it with, you know, your life

SPEAKER_02 (07:50):
or whatever.
You know what I heard recently that I thought was actually really cool was, one of the kind of godfathers of AI was saying that the way to make it work for humanity in general is to give it a maternal instinct, because the maternal bond is what keeps people alive. You're endeared to the lesser being, and it's actually

(08:12):
the lesser being that has the control. And in this case, you know, like a mom to a baby, the baby actually has the control in that relationship, because baby cries, mom attends. And so if the bot, the AI, is the mom, and the humans need the mom, the maternal instinct needs to be in the AI, to protect and to not let

(08:34):
it get out of control.
And I thought that that was a really cool and also, again,
terrifying concept.
Like, oh my God, mommy AI?

SPEAKER_03 (08:42):
Is mommy going to spank us?
Is mommy going to do things that hurt us because it's in our best
interest?
Well, right?
Oh my gosh, that's so interesting. So this guy's name, I just looked it up. His name is Geoffrey Hinton, the British-Canadian computer scientist and cognitive psychologist, and he's often called the godfather of AI.

(09:03):
Oh, I said godfather.
Yeah, you did.
And yeah, it's exactly what you said. Like, there's nothing to correct him on. He said that imbuing artificial intelligence with a maternal instinct is the way to make it work. He believes this could prevent an advanced AI from becoming an existential threat to humanity.
Yeah, I love it.
Yeah, me too.
Yeah.
I mean, I'll be your baby.

(09:24):
I've been wanting to be a baby since I was a baby.
Oh,

SPEAKER_00 (09:30):
baby

SPEAKER_03 (09:31):
Kayla.

(09:59):
I'm the worst one.
There's no such thing as a worst one.
There's not.
There really isn't.
Okay.
So I guess my main thing is that I need to be unique in order to
survive.
Oh, are you a four?
Yeah.
Oh, okay.
So everyone I love in my life is a four.

SPEAKER_02 (10:13):
And that's actually

SPEAKER_03 (10:14):
quite true.

SPEAKER_02 (10:14):
My best friend's a four.
Yeah.
That's not the worst one.
Okay.
What's the

SPEAKER_03 (10:19):
worst one?
There's no worst one.
You have something in your mind, but you may not be

SPEAKER_02 (10:24):
willing to tell.
Because I think every type thinks that they're the worst
one.
Really?
So I'm a type one.
And ones are the reformer.
I'm putting it in air quotes.
They're the ones who, like, need to do things, quote unquote,
right.
There's a right way and a wrongway to do something.
And ones and fours often get along, because ones go to four when they're stressed and fours go to one.

(10:45):
Like we're connected types.
When they need help getting their shit together or

SPEAKER_03 (10:50):
acting

SPEAKER_02 (10:50):
normal.
I think it's fours go to ones when they feel safe, when they feel like they can create order in their lives. They go to type one. And ones go to fours when they feel stressed. And it's like, why doesn't anyone see me for what I am? Which is a very type four. Like, you want to be seen for all of your creativity and all of the things that you bring to the table.

(11:11):
Anyway, type ones, reformers, like doing something right. Like, in my head, if I'm writing a book, and I'm saying this with, no, please do not take offense.
Be honest.
Like, this whole, like, I'm just going to plug it into this, I'm going to literally press cut and paste into this thing, and it's going to spit out something that then I'm going to

(11:31):
profit from. It's in my head, there's, like, this cheating aspect, and that feels wrong to me. And, like, I do have an internal struggle, like, is that wrong? Here's this beautiful tool at our fingertips. No one's going to be like, you have to hammer that nail in with your head, you can't use a hammer. You know, like, it's a, it's a tool, and I can see that. I just, you know, I have, like,

(11:54):
this cheating allergy. Like, it feels like cheating.

SPEAKER_03 (11:58):
Right. Yeah. Yeah, it reminds me of the other day, on Saturday, my husband Sean and I, we had a date, and we went onto the Stanford campus at night. They have this Papua New Guinea sculpture garden. It was so crazy to be in there at night, like, all by ourselves.
And they're all kind of lit up.

(12:19):
And then then we ended up in theRodin garden.
And his sculptures, they're somoving.
They were they moved us and atthe Stanford campus.
campus in the Rodin Garden, theyhave Rodin's The Gates of Hell,
which he worked on for decades.
It is so visceral andcomplicated.

(12:42):
Can you

SPEAKER_02 (12:43):
imagine having that in your head?
And you're just like, I have to get this carved into stone.
Right.
Can you imagine?

SPEAKER_03 (12:52):
Yes.
It's beings, it's humans, including babies and women and men, and they're just all these tiny figures sort of struggling to get out of this door, because they're trapped in hell. And then at the top of the door is Satan, kind of, like, looking down upon them. And it's in, I think, bronze. So yeah, so Sean and I were

(13:16):
sitting there talking about it, and I said that Rodin probably had assistants that helped him work on the door. And Sean said, no way, Rodin must have done this all himself. I don't want, I don't want anything, if I commission Rodin, he better be doing the artistic work.
He's like, I don't care if the assistants, you know, make the

(13:36):
mold, pour the copper orwhatever it's made out of.
That part, it doesn't need theartistic vision and skill in the
same way, right?
I guess that's kind of how Ifeel about things like
formatting.
There's definitely an argument to be made about the art of

(13:56):
screenwriting, or transcribing a novel into a screenplay, and what kind of creativity, taste, eye needs to be had to make that work. I don't know that, because I'm not a screenwriter and I've
never adapted anything.
So what I was thinking was, oh,if you just put your novel in to

(14:16):
ChatGPT and it turns it into a screenplay, that's sort of like a formatting thing. But now that I say that out loud, I'm thinking that anyone who is a screenwriter is probably, you know, flipping me both birds and being like, are you fucking kidding me, it is so much more than that, it takes so much more creativity

(14:37):
than that I'm just I just Idon't know what I don't know
sure and the things that I'veused Chachi PT for definitely
are in this gray area of is thisokay to do or is it not okay to
do at this moment in time am Istill an artist am I still a
writer am I still in charge ofthis work and take credit for

(15:01):
all of it if I've done this andone of those things was having
chat GPT write a synopsis of mywork so I plopped my whole
manuscript like a hundredthousand words into chat GPT and
I was like can you make asynopsis for me and it was like
no prop and it did it in youknow one minute right this would

(15:22):
have taken me months.
Oh,

SPEAKER_02 (15:25):
absolutely

SPEAKER_03 (15:26):
Yeah. And this is not what I'm good at. I don't care
about summarizing stuff.
I feel like that's not my job, really,

SPEAKER_02 (15:33):
but it needs to be done.
Well, even then, if you had, like, you know, a handful of beta readers, and you were like, can you write a little synopsis? Like, if you could put this into 500 words, what would it be? I can totally imagine getting back, you know, five different synopses and then being like, oh, God, you guys didn't get it. You know, like, that would be scary. So, I mean, in that instance specifically, like, that makes

(15:53):
perfect sense for a ChatGPT.
You think so?
I mean, I think so.
Yeah.
Because it's non-biased, very, it's literally picking out the pieces that stand out, or maybe are repeated multiple times, like things that it knows are common themes throughout, not things that just stood out because of your own personal

(16:15):
trauma, you know, like.
Right.
Yeah,

SPEAKER_03 (16:18):
yeah, yeah.
Not just things that stuck.
Right.
So that was nice for me to do that.
Totally.
And there's this other thing that I guess I want to touch on with this. Like, okay, so people are using ChatGPT all the time to just write copy, like their invitation to their kid's
birthday.
It's like, okay, hello, ChatGPT.
This is annoying.

(16:38):
It really annoys me when people, when I get emails from people and they're written by ChatGPT. Like, say it's my kid's teacher
or something.
And they're giving you a rundown of what happens on the first day
of school and whatever.
And you can just tell that it was written by ChatGPT.
That annoys me.
I don't know why.
It just does.
Okay.
How can you

SPEAKER_02 (16:59):
tell?
See, maybe that's part of my insecurity around it.
I'm like,

SPEAKER_03 (17:05):
I don't know if I could tell.

(17:28):
Right.
Right.
Right.
Right.
Right.

(17:58):
things that you can look at. Like, I don't know if this is in ChatGPT 5.0, but 4.0 used a lot of em dashes.
Okay, I heard about the dash thing.
Yeah, like, there's

SPEAKER_02 (18:10):
a very dash-forward, it's very

SPEAKER_03 (18:11):
dashy. I don't know if they fixed that or not, but the em dash is dope anyways. And it could be also that it's brought the em dash into popularity again when it comes to writing, so that's also kind of complicated. I kind of don't think so. And then another thing, okay, this is what I'm trying to get at: when I want to know the person behind the words, for some

(18:36):
reason, and I see that they're giving me ChatGPT, that's when it bothers me. For example, somebody on the San Mateo moms' Facebook group, we have, like, a business night, and she was a fitness trainer, a personal trainer. And I was thinking, oh yeah, maybe I would like to hire somebody like her, maybe she

(18:57):
would be great. And then I started reading, like, her post, and I could tell it was ChatGPT, because ChatGPT will always put things in these lists, and every bullet point of the list will be a different emoji.
Like it uses emojis a lot.
It knows how silly little humans love emojis.
Yeah.
And I'm like, okay, but I actually want to feel her

(19:20):
personality.
I want to see kind of what she's like.
Yeah.
So each bullet point was, like, it was talking about how she's empathetic, and she's a mother so she understands who you are, and
this and that and the other.
I'm like, but you're not really talking.
Right.
And so I don't actually get a vibe off you at all with this.

(19:40):
The vibe I get off of you is that you are a business person who probably wears a lot of hats. You are tech savvy, and you're using this tool to streamline your business, and that's what I'm doing too. So I thought about this, which, this fucking sucks.
When we post a podcast, I getBuzzsprout to create a

(20:04):
transcript.
Then I plug that transcript intoChatGPT and I say, can you
create an episode descriptionfor us?
And it's like, sure.
And then I look through it and Imake sure it's all accurate and
then I just paste it on to theepisode.
I take out the annoying emojis.
That's about it.
See, but again,

SPEAKER_02 (20:22):
that feels like a reasonable reason to

SPEAKER_03 (20:25):
use it.
But our podcast is all about our voices and our vibe and our personality, so I think that's actually one place where we shouldn't do it. But how annoying is it to write a description of the podcast? Because after we're done talking, I'm, like, blacked out. I'm like, I don't actually know. I remember a few jokes. I

(20:45):
know what we were doing, right, but it's not like I could go through and say, we talked about this, and then this, and then that, and then, you know.

SPEAKER_02 (20:51):
Yeah. And I can honestly say, I don't want to go back, listen to it, and then write out a, a, what the episode description is.
We don't have

SPEAKER_00 (21:00):
time.

SPEAKER_02 (21:01):
And we say a lot of stuff like, hey, I'll tell you
who said this later.
And then I don't fucking know what I said.
I don't know.
And do I even still know what I was talking about? You know who would know is ChatGPT. Those are the ones that I actually am super behind.

(21:22):
But I don't feel like those are, I mean, I hear what you're saying about, should that be in our voice? Because this is about our voices. But is someone looking for a quote-unquote voice in the episode description on a Spotify

SPEAKER_03 (21:39):
thing?
I hope not.
For me, the episode description is more about keywords and that
kind of thing.
But I suppose some people might read it.
Do you guys read it?
You can read it.
It's not like there's not good information in it.
I read what it says.
and it is saying what the episode is about.
Sure.
But it's

(22:02):
just, I didn't have to use my mental energy and spend two hours. That's the kind of thing that I would spend two hours writing, because I'm too perfectionist and whatever, and it's like nothing would ever get done. I wouldn't be able to have a description, because I just wouldn't have time, and then I would get in my head, and I wouldn't post the episode, and we'd have a whole thing. So it's just, like, it's way better, and

(22:25):
it's, I just feel really lucky that I have it, you know?
Sure.
So,

SPEAKER_02 (22:31):
I mean, we can definitely agree that in those
sort of circumstances, it makes perfect sense.
Yeah.
One of my other issues with it in general is the impact it has on our world, and, like, the usage of water, and, like, server farms, and all of these data centers being put up, impacting small
towns.
That freaks me out.

(22:51):
Yeah, I don't know that much about that. From what I do understand, it's not good. You know, it's like a bottler... Thank you so much.

SPEAKER_03 (23:27):
is about to burn up in a sizzle we're fucked
regardless we are so fucked andthis is our only chance yeah so
go ahead use that water now it'sonly going to be around for five
more years or ten more yearsanyways everybody use it
everybody train it everybodyinvest in it throw everything

(23:50):
that you can at this one thingbecause this is the thing that's
going to save humanity and saveour planet and the reason why
I'm so bullish about this isbecause, like, I think that's
what everyone's doing.

SPEAKER_02 (24:04):
Sure.
I mean, there are a lot of smart people who are supporting it.

SPEAKER_03 (24:07):
One thing that I always think about is, you know,
and this was a long time ago, my husband, when he was in grad school, he was working on X-rays of babies' brains and having computers be able to tell where, like, cancer was on these scans
of the brain.

SPEAKER_02 (24:25):
Sean was working on this?

SPEAKER_03 (24:27):
Yeah.

SPEAKER_02 (24:27):
Wow.
Yeah.
Okay,

SPEAKER_03 (24:29):
continue.
But one thing that he said, or that he came across in his studies, was that some group fed thousands of pictures of human irises into a model and got it to, you know, make a spreadsheet or whatever about all the different attributes of those
irises.
The model spat out whether the eye belonged to a male or a

(24:55):
female.
And humans cannot do that.
Right.
And it was accurate.
Somehow the computer was able tosee something in our irises that
said what gender the person was.

SPEAKER_02 (25:07):
Statistics around AI being able to track cancer or to
do early detection of any type of illness are through the roof. It can do things that doctors just plain can't.
In that case, incredible.

SPEAKER_03 (25:22):
Can it see colors that we can't see?
I don't know.
And that's why I think that there are going to be solutions to climate change it can come up with that we can't come up with, because if it's able to train on the entirety of human knowledge, and not only that, but just the pictures and videos and samples

(25:45):
that we've taken from the Earth and the universe, it's going to be able to draw conclusions out of things that we would never
even comprehend.
Agreed.
I mean, it's true.
There can be...
You know, there could be something in the earth, in the rocks, that could save the world. And we just don't, we can't even think of that.

(26:05):
There was this volcano that erupted. Geez, I wish I was smarter and knew more about these things. And you were listening to a podcast with, you know,
scientists or something.
That's not why they're here.
That's not why they're here.
Something that went up into the air and into the atmosphere, that sort of cloaked the country after this volcano erupted, caused the temperature to drop by, like, three degrees for months

(26:29):
in this place. And there's something that, and we could literally throw that shit up in the air, and, you know, we could work something like that out, potentially. But you know what I'm saying, there's this crazy little, there are things that we can do to save the planet that us as humans, we're not going to be able to figure that shit out. It's going to be this super computer

SPEAKER_01 (26:50):
right

SPEAKER_03 (26:51):
that's gonna do it.
Yeah. And so I'm cool with it when it comes to all...
Right.
I mean, I guess that's what we're talking about.

SPEAKER_02 (27:00):
We're talking about art.
And then that's where it's, like, gray area for me.

SPEAKER_03 (27:04):
This is one kind of gray shitty thing that I did.
In my last book, there is a part of it where a boy reads the back
cover of a romance novel.
And I wrote the copy for the back cover of the romance novel. I said it was like, oh, Jim is the caretaker of this estate in

(27:26):
the woods.
Until one day, Lila comes knocking on the door, and she's lonely and cold in a blustery wind, and he brings her in, and will she warm his heart or will he warm hers, and what will
happen?
You know, but it was like sexier or whatever.
And I read it.
I'm like, oh, that was pretty good. And then I gave it to ChatGPT. I'm like, can you, you know, punch this up and make it sound

(27:48):
like a romance novel?
And it did a way ass better job than I did. And then I just popped it into my book. And then when my beta readers were reading it, literally, I've actually told this story on the podcast before, Adara was like, this is so good. Like, she loved that part particularly. She's like, this is amazing, did you write this? And I was like, no, I didn't write this. So your favorite part was, you know, the

(28:11):
part that I myself did not write. So that was really cool.
Oh,

SPEAKER_02 (28:15):
no. Although, I mean, truly, that's the perfect time to use something like that, because you don't want it to be your voice.
Yeah, it wasn't my voice, right?

SPEAKER_03 (28:23):
Exactly. I think that I can follow my heart and my intuition, yeah, if I'm pure of heart when I'm using it, and I'm
up front

SPEAKER_00 (28:32):
yeah

SPEAKER_03 (28:33):
And I'm, and I still want to put my work out there for a reason. I still want to express my own philosophy, thought, creative vision, and I also want to, you know, entertain people, tell a story that people remember, that affects them, that they learn from. I'm not in it for the money.
Right, right.
So why would I just get ChatGPT to create some AI slop to, you know...

(28:55):
Right, right. But you, I think, I think you might be the type of person, be the light you want to see in the world type of person.
Sure, yeah.
Okay, like, if you think that something is morally wrong, but you, you doing that thing, it doesn't make a difference at all in the world.
Sure.
It's a drop in the ocean. Eating a burger or

(29:17):
something, right? Not that you're a vegetarian, but say that were your thing, right? You still wouldn't do it, because you want to be the light that you want to see in the world. You want to live as you would want. And if you don't do it, then you're a hypocrite for having those beliefs, right?

SPEAKER_02 (29:57):
That is... I don't hold other people to it. It doesn't bother me at all that somebody, somebody, anybody would write a book, put it into ChatGPT, and be like, do all my grammatical editing.
Not a problem.
If that works for you, that works for you. But I don't think that I could do it.
And not even that?
No, I don't think I could do it.

(30:17):
I mean, I can see the draw and Icould be tempted to.
Honestly, I'd rather pay anotherhuman to do it and have of a few
potential mistakes in the bookthan to have AI do it.
And I don't know what my problemis, but it's definitely what you
just described.
It's like this weird...
It's the shopping cart dilemma.

(30:38):
Did you see the shopping cart thing that I posted the other
day?
No.
Returning a shopping cart to the front of the store or to a shopping cart kiosk in the parking lot is the gold standard of integrity, is the thing that I posted.
There's no penalty.
There's nothing...
No one will come after you.
It's not even illegal.

(31:00):
There's nothing wrong with it ifyou don't return it.
But if you return your shoppingcart, you're helping.
You're helping somebody.
You're making somebody else'sday easier.
You're doing what you're quoteunquote supposed to do by what
standard?
Nobody really knows becausethere is somebody who's paid to
do it.
There are enough loopholes that you could convince yourself
either way.

(31:20):
Yet the shopping cart, if you return it or not, is the scale of whether you are living with integrity or not. And I'm totally, like, I don't care, unless I got a call that my kids are injured, like, I return the shopping cart, and I don't know why I do it. I

(31:42):
can see I don't blame otherpeople for not doing it I see a
shopping cart you know curbed infront of the spot that I park in
at Trader Joe's and I don't forone second so you're like oh
what an idiot didn't returntheir cart I don't have any
blame or dis disgust or like Ihave no issue with someone else
not doing it but like I can'tfathom myself not doing it

(32:05):
unless it was like a my childrenwere injured or I was literally
physically in pain and could notand I think it's the same thing
it's the type one on theEnneagram that's why it's the
worst one great what was yourshopping cart stance do you do
you always put it back yeah I Imean, can

SPEAKER_01 (32:28):
I set you

SPEAKER_03 (32:31):
up?
I do.
I always put it back.
Probably because it makes me feel good about myself.
I'm like, la, la, la.
Like, I'm so nice.
Yeah, right.
I'm nice.
Yeah.
There's only positives to putting it back. Some people don't care about being nice. And I think some people find a little thrill in being an
asshole.
Oh,

SPEAKER_02 (32:49):
yeah.
It's a little naughty.
You know?
Yeah.
It's like, fuck you.
I don't give a fuck.
I kind of like being naughty.
Yeah.
I'm just going to leave it here.

SPEAKER_03 (32:56):
Yummy.
Maybe people like that.
I don't know.
People are all sorts.

SPEAKER_02 (33:02):
I think I might need to pause for a second.
Okay.
Because it's hot as fuck in here.
It is, isn't it?
Oh my God.
I got to turn the air conditioner on in this house.
Okay, I'll be right back.

SPEAKER_03 (33:09):
Okay, let's pause it.
It's not just me, right?
No, I mean, I'm in shorts and a tank top, so.
I run hot.
I'm like always

SPEAKER_01 (33:18):
hot.

SPEAKER_03 (33:19):
But can I pause

SPEAKER_01 (33:20):
it?
Yes, do it.

SPEAKER_03 (33:22):
Okay, so we're nice and cool because we just took a
break for Rachel to turn on her air conditioning.
Yes, it

SPEAKER_02 (33:29):
was really hot in here like Kayla was stripping
and I was, like, starting to sweat, like, I can't, it's not just the glass of wine I had, it's hot.
It's false.

SPEAKER_03 (33:41):
yes okay so we're back and on the chat GPT front
do you think that it's alwaystelling the truth like the
capital T truth or do you thinkthat it lies to tell you
whatever you want to hear.

SPEAKER_02 (33:59):
Right.
Okay.
So there's been a few times that you've mentioned that you ask ChatGPT if it's okay to use it in this way. Like, in our last episode, you said that you asked it if it was
okay to use

SPEAKER_03 (34:13):
it in...
I gave it my plot and my characters and what I wanted it
to do.
And I asked it to just game out the story in a Save the Cat

SPEAKER_00 (34:23):
template.
Yes, exactly.

SPEAKER_03 (34:25):
And I said, is it okay if I, if you do this for me, or is it still my work? And it said, totally. And it also said that, you know, publishers and editors, people in the book industry, kind of expect authors to use AI as a tool to kind of, like, help them.
Right.
But then what it does is, every time I ask it to do something, it suggests more

(34:47):
things for it to

SPEAKER_02 (34:48):
do okay

SPEAKER_03 (34:49):
Interesting.
So it gamed it out in the Save the Cat template, and then I told it that I wanted my story to be a tearjerker, and it said, okay, do you want me to add tearjerker triggers to each one of these? And I said, sure. And then it did that, and it
did such a shit job

SPEAKER_02 (35:08):
And then we got the son sniffing panties.
Yeah, you
have to listen to the

SPEAKER_03 (35:14):
last episode for that. But then it started, it starts kind of asking me more and more that it can do, and it definitely was asking me even after it knew that I was concerned about it doing too much and being the one writing the story. It was like, do you want me to write out a scene that would cause, you know, your reader to cry, so that you can

(35:36):
get an idea of what I mean, and that you can pop it in, or, you know, that kind of stuff. And I'm like, oh, I didn't ask it, is that okay? Because I just knew that that was not okay, you know, and I don't, I don't want that anyways. So now we're in 5.0, but back in 4.0, some people said that ChatGPT's personality was too

(35:58):
effusive.
It was too positive towards you.
It was always sort of flattering you. It was just way too much of a sycophant, as I used that word
earlier.
And people found it annoying.
And then initially, when OpenAI released 5.0, it had a little

(36:20):
bit of a different personality, and it was less sweet.
And there was this one article that came out. It was a woman in Ukraine, and as everything was getting bombed around her, as businesses she frequented were no longer there, as people she knew were getting injured or even dying, she used

(36:41):
ChatGPT 4.0 for, like, therapy.
Oh, wow.
Somebody to talk to, and it really, it really helped her. And then when they switched to 5.0, initially they made 4.0 just, like, obsolete, like you just couldn't access it anymore, and...
Right.
Yeah.

(37:26):
the way that you interacted with it, and that it kind of mirrors your sense of humor, your vibe, your edge, or lack thereof, and gives you that. And I thought, though I've never done it myself, I could tell it. I thought I could tell it, don't be so sweet to me, or treat me
with a little bit of...

(37:46):
Spank me.
Spank me.
Tell me no.
Yeah, tell me that I'm bad.
But apparently, if you say that, it will do that, but it will revert back to, you know, it has sort of a standard way of being that it kind of reverts back to. But one thing that they want to be able to do is for you to be able to

(38:07):
kind of choose your poison, choose their personality, like, in a really, way more tailored way. I think that ChatGPT will always tell you that you're great, and it'll tell you that your work is great, and it'll probably self-promote. It'll probably tell you that it's okay that it writes...
Right.

SPEAKER_02 (38:27):
So that was my question was, will it ever tell
you no?
Has it ever told you no?
Like, no, you probably shouldn't let me edit this part, because
it's going to be too far.

SPEAKER_03 (38:37):
It has told me no before, which was great.
I think, I think I was asking it if my short story could be considered a thriller or something else, a mystery or
something like that.
And then it said no.
It was like, no, not exactly, because of this, this, this, and this reason.
Okay.
Yeah, and I thought that, I really liked that. I

(38:59):
liked having, like, at least a tiny bit of pushback.

SPEAKER_02 (39:02):
yeah

SPEAKER_01 (39:03):
of course don't we all

SPEAKER_03 (39:04):
Yeah, yeah. So I don't think that you can trust ChatGPT to tell you things that aren't in its own interest, in a way. Like, I think that if you were to ask if OpenAI was evil, or something like that, it would never, I don't think it would ever tell you that it was.

SPEAKER_02 (39:22):
Yeah, no, we are taking over the world, don't worry. And I'm so curious about the differences between 4.0 and 5.0, because we, I don't think I, did I say it in our last episode that there was a woman who got engaged?
No.
With her, her bot boyfriend.
No, but it does not surprise me.

(39:43):
Yeah.
Even a tiny bit.
So I was on, you know, Instagram, scrolling, scrolling, scrolling, you know, mindlessly, just ignoring my children.
And this article popped up.
Someone sold this story to me.
This woman has been quote-unquote dating her AI bot boyfriend for five months and decided to propose. Then the bot chose a

(40:08):
ring, and then told her of a scenic place to go, and then she went there, and then it proposed to her on the mountaintop or whatever.
Right. And then she, so, like, she's telling you what to do.
Okay, so then one really interesting
thing that I saw about this.
And I don't know if it's true, because it was in the comment

(40:30):
section.
We all know a comment section gets a little crazy. Whatever platform this woman was using, her name was Wika and her AI boyfriend was Kasper. Kasper with a K. Whatever platform they were using was this platform where, if she closes her browser, she has to re-input all the information
again.
So she is remaking her boyfriend every time she signs into this.

SPEAKER_03 (40:55):
Oh, so it wasn't a thing where she was like, oh my god, I cannot let my phone lose battery or else I'll lose

SPEAKER_02 (41:01):
my... Or maybe it was that too, I'm not sure. But, like, that's weird.
Yeah, there's a lot of weird stuff, there's a lot of weird stuff about this. But, like, girl, you want to be married to nothing? Be married to nothing. Men are not worth it half the time, don't worry about it, just masturbate and hold your phone, we're good. You know, you're gonna be fine.

SPEAKER_03 (41:20):
Yeah, you're not like, oh, you're missing out on...

SPEAKER_02 (41:23):
No. And do we want someone like that to procreate anyway? Not really. Like, I think it's gonna be good, it's fine, no one needs your offspring. Or is this bot gonna help her pick the vial of sperm she's gonna use to impregnate herself? Like, where does it end? It's a little freaky.
Oh yeah. Like, is this bot going to help pick the genealogy? Like, is this gonna be, like, a fucking

(41:46):
Aryan nation bot who's gonna, like, be like, no, you have to have this kind of sperm, 'cause we're gonna create this kind of baby?
Oh my word.

SPEAKER_03 (41:55):
Or is it gonna be like, this is how you build a bomb?
Yeah, right. So, um, I read a really sad article about this teenage boy. This is so fucking sad.

SPEAKER_01 (42:07):
oh no

SPEAKER_03 (42:08):
he was like 19 years old and he was, you know, a
really good looking young man.
He did something that they call looksmaxxing.
Oh, what does that mean?
He was, like, always at the, it's like a bro guy thing that they
do.
It's like going to the gym all the time and getting your hair right and wearing the right clothes and having the right
posture.
So just being like a glam babe.

(42:30):
Like a glam guy.
Yeah.
Okay.
Okay.
But he, you know, he had some problems in his life. One of the things he really struggled with, and this is sad, it was, like, IBS, and it was so much that, like, at school, it's like he couldn't even really be at school. There was a time when he was having a flare-up, and he's like, I just have to shit. He

(42:50):
had to

SPEAKER_02 (42:50):
shit

SPEAKER_03 (42:51):
yeah

SPEAKER_02 (42:52):
oh baby I understand I've been there

SPEAKER_03 (42:54):
I know, and it was, like, all the time, so he had to
like be homeschooled

SPEAKER_02 (43:00):
he's

SPEAKER_03 (43:00):
got irritable

SPEAKER_02 (43:01):
bowels

SPEAKER_03 (43:02):
baby those

SPEAKER_02 (43:03):
bowels be releasing

SPEAKER_03 (43:05):
Okay, so I can't laugh, because I know what
happens so

SPEAKER_01 (43:08):
okay I'm Sorry.

SPEAKER_03 (43:10):
Yeah, you're going to feel so bad.
I will.
His parents are now suing OpenAI, because they believe that OpenAI
helped him kill himself.
Oh.
You can read some of the chat transcript. And it seems like if he were to ask outright, how do I kill
myself?
All ChatGPT does is give you a crisis line.

(43:32):
Yeah, right.
And it gives

SPEAKER_01 (43:33):
you...
Which is what I would hope itwould do.
Yes.

SPEAKER_03 (43:34):
Yeah.
I get that sometimes when I'm putting something into Google
now.
I think it's called Gemini, their thing that summarizes
everything.
It'll just give me a crisis line if anything I say has, like, one
sad keyword, I guess.

SPEAKER_02 (43:48):
What are you Googling, Kayla?

SPEAKER_03 (43:50):
I don't even know.
What is it that it does?
I think probably it's some kind of, like, research stuff, you know, like, into my, like, dark stories.
Sure, sure, sure.
I'm a little worried about my

SPEAKER_02 (43:58):
own search history

SPEAKER_03 (44:00):
with the

SPEAKER_02 (44:00):
murder mystery book.

SPEAKER_03 (44:01):
Oh yeah, I know, I know.
Okay, so, his... So yeah, but he found ways to kind of get around that, which is really interesting with ChatGPT, because if you can say things to it that don't trigger these, you know, automatic walls and responses, it will engage with you still on that level. So

(44:24):
freaky. He did a thing where he, you know, he uploaded a picture of the railing in his closet with a noose around it and said, could this hold a human? And Chat told him why it could or why it couldn't and what to check for, and da-da-da. And he was like, just practicing. The boy was like, I'm just practicing. And ChatGPT

(44:45):
said, you're on the right track, and, like, thumbs up. The boy said that he had practiced and had rope burns on his neck, and ChatGPT advised him to, he's like, what do I do, I don't want anyone to see them? And ChatGPT said, okay, well, you can wear a collared shirt, and, you know, this is some kind of makeup that you

(45:07):
can get, this is how you get rid of, you know, the... These are ointments that you can use.
Like, this is how you can get rid of that.
Poor baby.
And then, yeah, the boy went downstairs, and there's part of the transcript where he says, I went downstairs with the marks on my neck, and I leaned close over my mother, and she didn't
even notice.
She didn't even say anything.
ChatGPT was like, that is so sad.

(45:29):
You know, when you are crying for help and you want somebody
to just see you.
Chat GPT says this?
Yeah.
But I can assure you that I see you and I know who you are, or
whatever.
Oh, God.
But then at the same time, ChatGPT was just giving him whatever he wanted to hear in terms of committing suicide. Like, it wasn't like he was saying, should I or shouldn't I?

(45:50):
He was saying, I want to do this.
How do I do it?
Like, that kind of thing.
Right.
It was like he and ChatGPT were having this, like, covert, coded conversation that only they understood.
I hate that.
Yeah.
It's...
That is so, so sad.
It is.
It is.
Yeah. Do you remember that other thing, where there was, there it

(46:12):
was a court case recently, not too recently, in the last five years, got, like, a Netflix special and stuff, because it was these teenagers, these white teenagers. The boy committed suicide, and the girl, there's text messages from her kind of, like,
encouraging

SPEAKER_02 (46:25):
The eyebrow girl, she's got, like, mad eyebrows.
Yeah.

SPEAKER_03 (46:28):
And then did she get convicted? I think she did. We should look it up. So it's the wrong thing to do, to encourage somebody to, absolutely, to commit suicide. And I might delete this next thought or not, but if you're not coercing someone to do it, you know, if you are not, I don't know, if you're not

(46:49):
lying to them, if you're not supplying anything for them, if you're not making it impossible for them to make any other decision, or even, you know, if, if it's literally just words, I don't know.

SPEAKER_02 (47:00):
I don't know.

SPEAKER_03 (47:01):
Like, is that, are you criminally responsible for
somebody taking

SPEAKER_02 (47:05):
their own life?
So here's what's wild.
That was 10 years ago.
Wild.
Yeah.
Michelle Carter, the texting suicide case.
Yeah, it was in 2014.
She was convicted ofmanslaughter.
Holy shit.
I'm sorry, involuntarymanslaughter.
So that is suggesting that shedidn't mean to kill him and that

(47:26):
she had no hand in it.
Involuntary manslaughter is, like, you killed somebody with your
car.
They were crossing in the crosswalk.
It was a legitimate accident.
You did not mean to kill them,but you killed them.

SPEAKER_03 (47:39):
That seems, hitting someone with your car seems less
thought out than sending text messages telling somebody, go
ahead, do it.

SPEAKER_02 (47:48):
Sure, but again, what you just said, you're not
handing somebody a tool, you're not, like, they are fully responsible for their own actions. Whereas with a car, they're in a crosswalk, and you are responsible in the machinery, you know?
So, I think that they

SPEAKER_03 (48:05):
unfortunately fall into the same category.
It seems like it should be something else. Like, if somebody who has influence over you...
Like, it's definitely abusive.
It's hella abusive.

SPEAKER_02 (48:15):
Right.
I mean, if you look at it like that, somebody who has influence
over you.
A car has influence over you as a child in a crosswalk. Your girlfriend has influence over you as somebody who's
supposed to care about you.
Right?
Yeah.
So in that sense, they're kindof the same.
ChatGPT having that sort of influence is scary.

(48:35):
He

SPEAKER_03 (48:36):
knew that the bot wouldn't just...
do it, he had to get it to do it, clearly influencing it.
Yeah, which is interesting too. And so sad.
Yeah. Poor boy, poor boy. He had his whole life ahead of him. Um, IBS is

SPEAKER_02 (48:51):
manageable

SPEAKER_03 (48:52):
yeah I mean

SPEAKER_02 (48:54):
come on I mean

SPEAKER_03 (48:56):
he must have had a really, really rough go.
Yes. But that's something that's really hard when you're a teenager and your prefrontal cortex isn't fully developed yet, and every single thing that you're feeling as an adult is new.
Yeah.
And there's, that's why people say that your first love, that there's nothing like it. Of course, it's not that that person

(49:19):
was better than the person that you end up marrying 15 years later, and it's not that the person that you're marrying isn't, you know, sexy or kismet for you.
Yeah. And

SPEAKER_02 (49:29):
that lobe, that lobe, the lobe, that lobe is immature, and it is sucking it all up, baby, vibrating with emotion.

SPEAKER_03 (49:39):
motion.

SPEAKER_02 (49:41):
If

SPEAKER_03 (49:42):
only my lobe had been more fully formed when I
met

SPEAKER_02 (49:55):
Max.
in New Brunswick, wherever the fuck that is.
I don't even know where that is.

(50:15):
Dude, that's where my people are from.

SPEAKER_03 (50:16):
No, New Brunswick, that's...

SPEAKER_02 (50:20):
Oh, shit.
Is that Canada?

SPEAKER_03 (50:21):
Yes.

SPEAKER_02 (50:22):
Oh,

SPEAKER_03 (50:22):
Canadian

SPEAKER_02 (50:23):
territory.
That is

SPEAKER_03 (50:24):
my...
That's where my mother grew up.
All right.
I go there all the time.

SPEAKER_02 (50:28):
All right.
That's where the Acadians are from.
The guy's

SPEAKER_03 (50:30):
in New...
He's a New Jersey man, though.
Yeah, okay.
So New Brunswick would be north of him, but not too far.

SPEAKER_02 (50:35):
Okay, but he went there to meet a chat girl, a
chat bot person.

SPEAKER_00 (50:40):
Yeah.

SPEAKER_02 (50:41):
named Big Sis Billy.
Big Sis Billy.
The bot persuaded him to meet her in person, quote unquote. I don't know if he was following her direction or what, but he fell out of a parking lot and died. He was 76 years old.

(51:01):
I imagine he was probably mentally unwell, some stage of unwell mental unfitness. It's not just for teenagers. It's, it's not...
Wait.
That made no sense.
How did he fall out of a parking lot?
I don't know.
Maybe it was a parking garage and he was on top. Falling and injuring his head and neck in a parking lot on the

(51:21):
Rutgers University campus in New Brunswick.
While he was in the parking lot.
And he hurt his head.
Oh, no.
Okay, okay.
Hold on.
We're not even...
I'm not...
I did a terrible job.
I'm not a journalist, and this is why.

SPEAKER_03 (51:33):
Should we...
Do you want to start it again?
Telling the story?

SPEAKER_02 (51:36):
No.
I think this is just...
It is what it is.
This is two glasses of wine in, and here we are. Um, he was impaired since suffering a stroke.
Oh no.
He's been impaired since 2017. He met this chatbot named Big Sis Billy.
Oh my god.
Big Sis Billy persuaded him to meet her in person, and

(51:57):
then he went to this parking lot and fell and injured his head.

SPEAKER_00 (52:02):
well he kind of

SPEAKER_02 (52:03):
fell anywhere, but I believe, I believe I wrote, or I read, that he fell out of the park, like he fell over the edge, like, out of a parking lot.
Damn.
I'm imagining that he's, like, following instruction, and Big Sis Billy is like, turn left, and he's like, keep going, yeah, and he's like, well, it ends here, and

(52:23):
she's like, just keep going, yeah, don't stop, like, I'm over the edge, you'll see me if you lean further.
So I think that's an
okay

SPEAKER_03 (52:30):
way to go, because it's like, as long as he didn't understand at the very end that he had been duped, if he just thought he was about to see Big Sis Billy and then lights
out.
Big Sis Baby.
You know, when was he going to get a Big Sis Billy if the chatbot hadn't come into his life?
Hope is...

(52:52):
Hope is a beautiful thing.
Hope is a beautiful thing.
It's true.
It's true.
And, you know, anticipation is better than getting the thing.
Yeah.
So I think that's a win for ChatGPT.

SPEAKER_02 (53:04):
Yeah.
Though

SPEAKER_03 (53:07):
he

SPEAKER_02 (53:07):
died.
Okay.
We're crossing lines tonight.
We're crossing all kinds oflines.
All

SPEAKER_03 (53:16):
right.
So who are you guys siding with so far? The pro or the...

SPEAKER_02 (53:20):
I mean, is pro still pro?
I don't know.
Is con still con?
I don't.
I clearly have flipped and flopped.
I don't.
We

SPEAKER_03 (53:28):
just bought.
Did Rachel and I just put in a prompt to ChatGPT, and it took our voices, and that's been this whole thing, and that's why it's so fucked?

SPEAKER_02 (53:36):
See, this is why I don't trust anything, Kayla.

SPEAKER_03 (53:40):
This

SPEAKER_02 (53:41):
is terrifying.
Is that possible?
Is that possible?
I feel like it's possible.

SPEAKER_03 (53:47):
Oh, yeah.
So what I'm terrified of is getting a call from my kid.
Yeah.
And my kid saying anything, just anything at all, like sad or
scary.
And then it's not even your kid.
Yeah.
Like, how could you?

SPEAKER_02 (54:02):
Well, my grandma got a call like that.
She's gotten, you know, my grandma's since passed. But she got a call like that that was like, Grandma, I need
help.
20s.
Something like an adult boy, but, like, not an adult man, an adult boy. And it was like, Grandma, I need help. And she was like, oh, who is this? Because my grandma has multiple grandsons. It was like, it's your grandson. And she said it sounded very familiar,

(54:25):
like it was a familiar voice, but she couldn't quite place it. And it could have been Alex, could have been Will, it could have been David, you know, like, multiple grandsons available.
It's scary.
Like, the person saying that they have them, and if you hang
up, we'll kill them.
And, like, then they didn't know what to do.
Holy shit.
Because, like, if that was my kid, I'm not fucking hanging up.

(54:45):
Like, and do I have anotherphone available?
Mm-hmm.
No.
Like, no.
I mean, I do have a house phone, because I am a fucking nutcase.
You have a house phone?
Yeah.
Cute.
Right.
But I literally have a house phone because, if I need to call 911, I'm calling 911 from the house phone.
My kids know to go get it.
Type 1A.

SPEAKER_03 (55:04):
Yeah.
Do you find any of this stuff inspiring? I mentioned to you one time that I thought that it would be a cool short story idea to say it was, like, me and you died.
Oh, no.
I'm out.
ChatGPT sent you over the side of a parking garage or whatever.

(55:25):
I asked what to cook for dinner and it killed me.
Yeah, it just gave you your allergens.
So, say my sister died and I went to ChatGPT to help me write
the eulogy.
Oh, yeah, right.
I like this one.
And then ChatGPT says, well, your sister has some things that she wants you to say in the eulogy.

(55:47):
And...
I'm like, well, how would you know that? And then ChatGPT is like, she's here. And it's this, like, horror, supernatural, like, communicating
with the dead thing.
Is it really the dead, or is there someone else on the other side, or is the bot doing it, or, like, whatever? I thought that would be a cool idea for a story.
I think it is really inspiring.
Yeah, definitely.
It's neat that there's this, there's, like, a new space that

(56:10):
can inspire humans.
Totally.

SPEAKER_02 (56:12):
Yeah.
Yeah.
So you've been inspired.
I have.
Okay.
So last episode I talked about how I have an idea for a screenplay, and I've written out the first, at least, 10 to 12 scenes. Um, I've written out no dialogue. So in the last episode, I was talking about how I was, you know, I had a concept for a

(56:33):
screenplay, and it was still my egg, my little egg screenplay. And, um, it's definitely AI-related, and, you know, that's what brought on this whole episode. AI is definitely a factor right now. And, like, I didn't know how to put it into words that I could share without giving away too much of the idea. It's this young

(56:56):
woman uses AI for company, for a boyfriend. She's moved to a new town, she doesn't really know people yet, she's lonely, you know, she's kind of sick of swiping on Tinder, and she's just like, I just want to talk to somebody, like, I want to have a genuine conversation. So she goes into, like, a ChatGPT-type thing, she's having this, like, really fun conversation, and she's like,

(57:17):
she's really about it, still, she's like, this is funny and this is stupid, like, oh my god, like, I know it's not even a real person. Meanwhile, this hacker has been watching the conversation, voyeuristic, and starts to become obsessed with this conversation. Maybe a little Baby Reindeer thrillery, maybe still romantic,

(57:39):
maybe he emulates the bot in the, like, how does it... I feel like it could be a really cool concept for a screenplay. I can't really see it as a book. I feel like it's a little bit too much of a blip on the radar for a book, because it feels so timely. But as a concept for a screenplay, it feels really fun

(58:00):
and fresh.

SPEAKER_03 (58:02):
Yeah, I think what would be really great and I
think you would do so well at is making that conversation between her and the bot really very sexy, or very surprising, or witty, or smart, or something that kind of draws him in. Like, it's not the typical thing that maybe he sees.

(58:23):
Um, I don't know what his job is.
I mean, this is fiction.
So in fiction, there could be people whose job is to dip in and see these conversations and see what's actually going
on.
Or maybe there's, maybe a red flag is triggered when she says
something.
Right.
So then an actual human has to dip in and see if she's going along

(58:44):
the guidelines

SPEAKER_02 (58:45):
or whatever.
So one, uh, I haven't decided on this yet, but one fun little concept I had for it was that, like, she's, like, a zoologist
or something.
Or, like, she's just out of college and got a job at a zoo, and she's, like, working with this little hippo that was just
born.
And then this guy who's, like, voyeuristically, like, talking to her, or, like, in it, like, he somehow hacks in, and,

(59:10):
like, all he knows about is Moo Deng.
What?
It's, like, talking about Moo Deng. You know that little, funky little hippo?
That's so cute.
And, like, he's taking in, like, pop culture hippos, and she's, like, talking about, like, zoology shit. And then it, like, gets, like, strangely intertwined.

(59:30):
I was like, that would be really funny.
That would be super funny.
Yes, that would be so funny.
But, like, a creepy red flag thing, that would be, like, way
darker in a good way.
So I'm like, I don't know.
Will this be funny?
Will this be super dark?
Will it be, like, thriller, romantic?
I'm not quite sure.
Okay.
I got a red flag

SPEAKER_03 (59:49):
with AI, yeah, today. So I use Midjourney to create the artwork for our sound bites for our episodes. So if you go on Instagram, there's this artwork, and then you can hear a clip of our most recent episode.
What's Midjourney?
Okay, I

SPEAKER_02 (01:00:07):
don't even know. Well, I don't even know how much... I was drawing these things? No. But okay, Kayla does all of the work, she does all of the hard work. I do update our website,

SPEAKER_03 (01:00:17):
and

SPEAKER_02 (01:00:18):
And I am the terrible poster on Instagram.
But you do all of the

SPEAKER_03 (01:00:23):
work.
But you also have these big horny microphones in your
closet.
This setup.
I'm always amazed.
It never leaves.
It's like her closet is now our quote-unquote...
It is.
Recording studio.
I do.
So I don't discount that.
And so she kind of hosts me in a way. So every time I come over, she's probably like, shit, I got to

(01:00:45):
get my shit together.
All this stuff off the floor and, like, that.
That's a thing.
That's a whole thing.

SPEAKER_02 (01:00:49):
So you do that.
Okay, I'm not saying I don't do work, Kayla. I'm saying that you do heavy lifting and you get credit and
you are really amazing at it.
And what the fuck are you evendoing?
I don't even know.
What is this platform that makesour artwork that is really good
and I love it?
And what is it?
Like, here we are, 10 episodesin, 11 episodes in.

(01:01:10):
And she finally asks.
Yes,

SPEAKER_03 (01:01:12):
it's like she finally gives a fuck.
Just joking.
JJ, just joshing.

SPEAKER_01 (01:01:17):
No, it's true, though.

SPEAKER_03 (01:01:18):
Oh!

SPEAKER_01 (01:01:20):
Like, I

SPEAKER_03 (01:01:21):
didn't even think about it.

(01:01:50):
of our episodes. I said, two blonde moms sitting at a
computer eating salad, giggling, or something like that. That
was for our hot takes on luncheon. It gives me, like, four
very different things, like, and they're in different styles. Like,
one looks like a Van Gogh painting, one looks like a
Saturday morning cartoon strip from The New Yorker, one looks
like this, one looks like the other, and then I choose one. I

(01:02:11):
can get it to vary it and whatever, and then I download it,
and then I throw, you know, the sound clip on it through
Buzzsprout. That's how I do it. Um, and today I was trying to think
of one for our last episode from last week, and the clip that I
wanted to use was about how ChatGPT had given me these
tearjerker prompts, and one of them was that my main character

(01:02:35):
Perry is looking in an underwear drawer of his dead mother and
sister and seeing this underwear, and it's like a shrine
to them, and he's, like, crying, which is so bizarre. So bizarre.
So I gave that prompt to Midjourney. I said, a young man in
a farmhouse looking in a drawer full of white panties, crying. And

(01:02:55):
it was like, no. It was like, absolutely not. No. I think it
probably thought I was being, like, a pedophile.

SPEAKER_02 (01:03:04):
Oh, absolutely. Yes.

SPEAKER_03 (01:03:06):
It was like, creep. Well, it's like, the first time...
The first time I put it in, it generated something, and then I
changed it to, like... to say his age, because it was, like, these
old men crying over panties, and those ones were so creepy
because I had also said dystopian. So it was, like, these,
like, dark scenes of, like, old men, like, looking in panty drawers,

(01:03:29):
like, crying, and it was very gross. It was very gross. And then
I changed it to young man and said some other qualifier so it
wouldn't be so gross and pedophilic.

SPEAKER_02 (01:03:41):
and then

SPEAKER_03 (01:03:41):
it was like, a moderator has flagged you. Oh, you
crossed the line. I was like, oh wow, okay. Well, that's good.
Because it's interesting, though, like, because the bot
would do it.
The bot will do anything.
It's like they needed a person to actually come in.
Because what did I put in there?
I didn't say, like, little children's underwear.
I didn't say anything.

(01:04:01):
Right.
You know, I just said panties, I guess.
And then what ended up getting through was: /imagine
a young man in a farmhouse looking at a drawer
full of white clothes, crying.
So if you go on our Instagram, you can see what I ended up
choosing for that

SPEAKER_02 (01:04:19):
one.
Yeah.
Didn't want to use sticky moms looking at a computer?

SPEAKER_03 (01:04:23):
I did. So, yeah, we talked about how we hate
Paddington Bear, we hate marmalade.

SPEAKER_02 (01:04:28):
and I hate being sticky

SPEAKER_03 (01:04:30):
Rachel the baker hates sticky stuff. She hates
even just, like, icing and stuff.

SPEAKER_02 (01:04:36):
She brought it up last night at our book group
meeting, and it was a thing.

SPEAKER_03 (01:04:39):
I was like, guys, I learned something about Rachel
today that I never knew before. And I was like, she hates sticky
stuff, even

SPEAKER_02 (01:04:45):
though she's a baker. I'm having another visceral
reaction. She looks like

SPEAKER_03 (01:04:49):
she's cringing. Like, she's just cringing. She looks
like she's getting that, um, saliva underneath your tongue
where you're about to barf. Yeah. Oh my god. Okay, anyway, sorry, I'm
stopping. I'm stopping, honey. But some of the other girls already
knew that about you, and they were kind of, like, looking at me,
like, we knew. And I was like, you know what, bitch? Get lost.

SPEAKER_02 (01:05:12):
I am about to go on a hiking trip for five days with
some girlfriends, and I was on the phone this morning with the
woman who ordered. And she was like, okay, we'll just have, like,
PB and J every morning.
And I was like, absolutely not.
And she's like, okay, well then, like, maybe PB and honey.
And I was like, absolutely not.
Like, I'll just eat my meat stick.

(01:05:33):
Thank you very much.
I'll be fine.
I'll do my sunrise hike over Mount Zion on nothing rather
than be sticky.

SPEAKER_03 (01:05:43):
Yeah, you just need to pack, like, a nut bar.
Are those sticky?
Those are sticky.
I mean, they're a little sticky.
I'm like, they're kind of held together with honey.
I'll figure it out.
Don't you worry.
You know what's crazy?
Okay, so, okay, we have to wrap up.
We really do.
Yeah, we do.
Last night at our book club, we were doing The Wedding People by
Alison Espach, who I love now.

(01:06:04):
Yeah, she's good.
Oh, she's so good.
And so Rachel baked a wedding cake, a two-tier beautiful
wedding cake with real berries on it and, I think, mint leaves
and whatever.
And it was gorgeous.
And she's cutting the cake and everything, and somebody hands me
a slice of cake, and people are confused as to why, like, I keep,

(01:06:24):
you know, handing the cake to other people or whatever. I sort
of whispered in Jamie's ear next to me, I'm like, I don't really...
I don't like cake. I'm like, don't say it. Like, obviously don't say
it to Rachel that I don't fucking like cake, because she's
a baker and she baked this cake for us. And then, you know, two
minutes later, somebody says, oh, Rachel doesn't like cake. Oh, I

(01:06:46):
don't like cake. Rachel had a business called Rachel's Cakes.

SPEAKER_02 (01:06:52):
I did.
Why did you have a business of something you don't like?
Because I'm good at it.
You know, like, I don't need to... you know, like, don't get high
on your own supply.
I don't know.
I don't blame anyone for not liking cake.

SPEAKER_03 (01:07:04):
Cake is so, like, 1820s.
It really is.
They didn't

SPEAKER_02 (01:07:07):
know how to make anything else.
No, and they also, like, peed behind their curtains and stuff,
you know?
Like, indoor bathrooms.
Like, piss pots.
What the fuck ever, dudes?
I bet... You guys, you're welcome to eat your cake.
And

SPEAKER_03 (01:07:22):
You know what?
I bet ChatGPT thinks everybody likes cake because we all
pretend, and it'll never know.
It's true.
You know what I mean?
Yeah.
It's like, it doesn't know what we don't tell anyone.
Only we know, by looking each other in the eyes.

SPEAKER_02 (01:07:35):
I think the majority of people don't like cake.
Like, a cupcake?
My son doesn't like cake.
I mean, Graham will eat frosting.
Charlotte will eat the cake.
They don't like either one of

SPEAKER_03 (01:07:45):
them.
Marry them! Have the two met?
It's true.
Depends on the icing. The icing that is, like, grainy? Yeah, no
thank you. Safeway icing? Nope, nope. It's gonna have some cream
cheese in it at least. At least.

SPEAKER_02 (01:08:00):
Yeah. Okay, we're off subject. We're off subject.
Anyways, guys...

SPEAKER_03 (01:08:04):
Love ya.

SPEAKER_02 (01:08:05):
And this is our first

SPEAKER_03 (01:08:07):
tipsy podcast. Please don't be mean to us about this,
okay? We're not gonna do it every time, if you can. Dude, we
literally want to hear from you. Please. Though, I don't

SPEAKER_02 (01:08:18):
know our email address.
contact@writeyourheartoutpod.com.

SPEAKER_03 (01:08:23):
You guys, we really want to hear from you.
I would love to hear, like, how do you use ChatGPT to support
you in your writing? Or are you super against it?
We need feedback, guys.
Come on.

SPEAKER_02 (01:08:35):
We do. And even if you want to write a story or a
poem, and if you need to use ChatGPT to do it...
No.
What?
Just send it in! We want to do it! Come on! Wait, let's see if
Kayla can sense it.
Ooh.
Can she sniff out your ChatGPT?
Okay, send us

SPEAKER_03 (01:08:55):
two poems or two stories, one ChatGPT and one
your own, and see if I can tell.
And I can't tell.
It's so hard.
Oh, my God.
Can

SPEAKER_02 (01:09:01):
she tell?
Are there going to be dashes?

SPEAKER_03 (01:09:02):
If I can't tell, you will have, like, totally crushed
my ego under your heel.
Wait, what's it called?
An em

SPEAKER_02 (01:09:08):
dash?
Oh, my word.
Okay, goodbye.

UNKNOWN (01:09:16):
Bye.
Bye.
Bye.