
September 11, 2025 31 mins

This is the first 30 minutes of a longer conversation. The full conversation is available on the Human Nature Odyssey Patreon.

===

Jake Marquez and Maren Morgan are fellow podcasters, filmmakers, and new friends. Starting today, the three of us are joining forces to create monthly bonus episodes where we’ll seek to better understand this self-destructive, 10,000-year civilizational predicament we find ourselves in.

In this episode we discuss artificial intelligence. We found that despite all our philosophical overlap, we were advocating two different approaches when it came to AI: I was exploring the idea of cautious adoption, while Maren argued for more of an abstinence policy.

I’ve said it before and I’ll say it again: every good conversation ends with the same conclusion, “oh yeah, balance.” We find ourselves there eventually. But the road we take is filled with insights, questions, and jokes.

 


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Hello folks, you are listening to a special bonus
episode of Human Nature Odyssey.
Starting today, Jake Marquez and Maren Morgan from the Death in the Garden
podcast and I have joined forces to create monthly bonus episodes
where we'll delve into hot topics,

(00:21):
hit TV shows and celebrity gossip.
Just kidding.
Well, kind of. Whatever we do talk about,
it'll be in the hopes of better understanding this self-destructive,
civilizational, 10,000-year predicament we all find ourselves in.
It was so much fun talking with these guys.

(00:41):
I'm glad to have you join usfor the hangout here.
You can listen to the first 30 minutes of this two-hour conversation.
The rest is available now on the Human Nature Odyssey Patreon
or the Death in the Garden Substack.
Yes, that means it is behind
what has come to be known as the proverbial paywall.

(01:03):
But don't think of it as a paywall.
Think of it as a pay invitation.
All three of us are trying to navigate working on the things we love while,
you know, making this whole afford-to-live thing possible.
So joining the Patreon for only five bucks a month

(01:23):
really helps keep this thing going. On the Patreon,
not only will you find this full conversation,
you'll find all sorts of other bonus
episodes, writings,
additional conversations with previous guests,
and some very smart and kind fellow patrons.
Oh yeah.
And if you join the Patreon, you can see our smiling faces.

(01:45):
This is a video episode,
if you're into that kind of thing.
And if you haven't yet,
check out the Human Nature Odyssey YouTube channel.
I'll be posting clips of our conversation there.
The first one is called How to Not Lose Your Shit
During the AI Revolution, and it's on YouTube as of today.

(02:06):
You can also support Human Nature Odyssey by simply sharing
with a friend, your mom, or your dental hygienist.
Ask them to leave a rating and write a review.
It all is very much appreciated.
We'll return to our regular Human Nature Odyssey
programming in a couple of weeks, on September 25th.

(02:26):
I'll be releasing the next episode of Human Nature Odyssey, titled
Are Hunter-Gatherers Liberals or Conservatives?
My answer may surprise you.
Stay tuned.
Okay.
Without any further ado, here is the first-ever bonus episode
collaboration between Human Nature Odyssey and Death in the Garden

(02:47):
with Jake Marquez, Maren Morgan, and yours truly, Alex.

(03:10):
Do you want to, like, now,
like, do, like, kind of like an introduction, explaining to people, like,
because this is the first time, you know, Human Nature
Odyssey and Death in the Garden are, like, hanging out,
and that this is going to be, for, just to...
Yeah. Yeah.
I don't know if we want to plan that
anyway, if we just do a thing, but then just wing it, just wing an intro,
you know, it's just wing and interestingat the exact same time.

(03:31):
Okay, great. Okay.
Well, hey, everyone that I forgot.
Okay. Yeah.
Let's think about how to do this.
Well, just basically.
Hey, everyone.
I'm Alex Leff from Human
Nature Odyssey, here with two friends.
Jake Marquez.

(03:51):
And I'm Maren Morgan, and we're from Death in the Garden.
Yeah.
And these two,
entities
overlap in so many ways.
and are going on parallel but, interestingly, different paths.
We also just enjoy being in conversation with each other,

(04:12):
and we've done it in an unrecorded fashion several times now.
Once in real life, which was very exciting.
And we were talking last time:
wait a second, what if we did this,
but we recorded it?
Yeah.
And we're going to try to do this on a bunch of different topics
as often as we can, because I think it's fun.

(04:34):
And it's only for the paid subscribers of each of our individual projects.
So it's exclusive content as well.
Sorry, it's the C-word. I love that.
Yeah, that's tying it to the C-word. Yeah.
To, to put it in as gross of terms as possible:
we've, we've, we've made paid content
to add value to the lives of our subscribers

(04:56):
to boost our engagement
and grow our audiences.
I know, it's so sickening, I hate it.
The gross part of the problem.
Yeah, we could use that.
We could also try again.
We could try again.
But there's alsoyou can do your own intro.

(05:17):
We could do our own intro explaining it.
No, I'm keeping it.
But please give a quick please. Yes.
And that also explains what the purpose of this conversation is.
For the purpose of this conversation? I don't know, I have no idea.
The purpose of this conversation is
what's today we need to talk about.
Oh, okay. Okay.

(05:39):
You tricked me
today.
We wanted to talk about AI in a way.
You're a problem if you use it.
No, no, I don't know.
I don't know how to intro this.
Well, are we introducing this particular conversation?
Well, I think we have to intro this thing we're doing.

(05:59):
Yeah, yeah. That's it.
Alex, you're the best talker.
What?
You can't put that on me.
So are we redoing the whole thing?
No no no no no no no.
This is the longest,craziest intro of all time.
I might, I might cut maybe this exact thing right here, but, but, but yes.

(06:22):
No, we wanted to, like, you know... we've been, each, each of us,
you know, are working on these scripted episodes and, like,
going back into the
imaginal realm and talking to the muse and struggling with the,
the creative process, and, oh, my God, it's so hard.
When we talk with each other, so many synapses start firing.

(06:46):
All these connections are made.
It's very exciting to talk with you guys and have things come out of each of us
that we couldn't have planned.
And yeah, we've... I've always really enjoyed
our conversations, and figured that the people that
are interested in the same things we're interested in
will find those conversations interesting as well.
So today, for the first topic, we wanted to talk about

(07:11):
just something light,
and not controversial, and just, like, really easy to comprehend,
that we're all totally on the same page of.
So we decided, let's just start with, like, artificial intelligence.
Talk about something nice and fun.
Talk about AI.
It's something that I've thought about.
I made a Patreon

(07:32):
bonus episode about me rambling about it.
Maren just made this dope last video talking about AI.
We found that our perspectives, while they're, like,
similar in some respects, are, like, approaching it very differently.
So we wanted to get together and just talk about
what we think, from our civilizational systems

(07:55):
thinking viewpoint, about the dawn of artificial intelligence,
and see where we go from there.
Yeah, yeah.
Do you enjoy this?
Yeah, exactly.
And, like, as a concept, you know, us coming together,
Human Nature Odyssey and Death in the Garden coming together,
it allows us to not be so, like, in our own little worlds.

(08:16):
It allows us to communicate with each other.
It allows us to share each other's work,
and be able to have this be a more community-based project.
I mean, it's really hard in the world right now.
I feel like everybody who has a project like this sort of has to, like,
be alone in their brand, when in reality it's like we can do
a lot more work together and we can create more things that we can share

(08:36):
with people, and add value to, especially, like, our paid subscribers,
to be able to give them something like a really cool, earnest
conversation between, you know, very like-minded people. Hopefully it helps
make people feel less alone, or helps, you know, bring insight to people.
It just seems like a good way for us,
while we're working on these things that take a lot of time and energy,

(08:57):
to be able to connect with each other and not be so hermetically separate
and, you know, do some, some good, good work.
You save the world one conversation at a time.
If only that was true.

(09:33):
My impulse is always to, like,
sort of, like, over-contextualize or, like, over-educate.
And so we're, we're reading over, like, our previous script for the, like,
the movie. Like, because we're, we're going to try to just make the film Death
in the Garden now, which is, like...
we're going to continue doing all these, like, YouTube videos and stuff,
but then we realized, we're like, you know what?
We, we know what the film would be.

(09:53):
We can have everything else be, like, its own little YouTube video.
But we know what the, the,
the Death in the Garden film is.
And now we're, like, going over the old scripts that I wrote,
and it's just like, I really respect my old self for, like,
trying so hard to, like, really explain everything and every,
every train of thought that, like, led to my conclusion.

(10:15):
But it's completely unwatchable and completely, like, horrible.
It's so badly written. But it's like,
I, I respect the fact that I was trying so hard. But now I'm kind of like, okay,
I need to, like, fight that impulse to want to over-explain,
to just be a little bit more brief and be a little bit more confident
in what I'm saying,
and knowing that people are going to disagree with me at the end of the day.
Because I just find that it's like, it doesn't matter

(10:36):
what you think, like, it doesn't matter what you say.
People are going to disagree, and, and that's okay. And that's cool.
That's part of, like, debate, and part of life, and part of, part
of living in a polarized society.
But it's just funny.
It's... the creative process is funny, I think. Totally.
Okay.
What would the episode be where you're trying to say
something that every single person listening would agree with?
Hey guys, welcome to a podcast.

(11:00):
This is a podcast, and you are listening to it, and you're listening to it.
Okay. See you next time.
That was... it's been great spending time with you.
Yeah.
This has been brought to you by Stamps.com.
Maybe, like: I disagree, I disagree. This was not great.
Just been doing this.
This is recorded in outer space.
We are not on stolen land of any sort.

(11:20):
This is completely free space where this exists.
I am an ambiguously ethnic person who has no opinions about anything.
I do listen to that.
I do an interesting podcast
now. But I like to go into, like, the inherent, you know...
the inherent problem with all technology would come up no matter what.

(11:41):
Yeah. Shit.
Yeah, yeah.
Well, we're kind of just entangled in problems.
Yeah, we are.
Okay.
But that was a fun conversation.
I started with, yeah.
We don't know what to do.
Well, but you.
Well, I'm really excited to get to have an excuse

(12:03):
to talk with you guys and riff off of ideas.
I listened... I watched Maren's video.
I thought it was excellent.
And I know you listened to the piece that I made about AI, and I think it's
gonna be really interesting to discuss the entanglements of this question.

(12:23):
And we come at things in many similar ways.
And, and I think we're also, in this conversation,
going to find the ways we come at things differently.
And I think that's going to be very juicy and interesting.
And I was thinking maybe one way to start...
Well, first, I mean, we could also introduce
to people listening, like, what the heck
our intended goals with this conversation are.

(12:47):
We've never done something like this before.
And then I was thinking it would be cool to say,
like, to start with what we agree about.
Like, what's our conversation...
What's our, what's our common ground in the conversation?
And why?
Yeah, one of the things that I definitely wanted to say is, I think
we agree way more than we disagree on this topic.

(13:09):
I think that it's, it's a complicated conversation.
And I think, for me especially, it's like, it's ever-evolving;
like, there's new things that come out all the time when it comes to AI.
And so I think that probably my goal with this conversation
is, more than anything, to sort of express
that, like, everybody's going to need to draw their lines where they see fit.

(13:33):
And that's complicated in a situation where it is
entangled in all of these systems of power and systems of, like, technology.
And there's, there's a lot of ways that this can go really wrong. And
everybody ultimately... sorry, the cat is going to be... you know,
everybody ultimately is going to have to decide
what works for them and what's best for them.

(13:54):
And so I think we just sort of... at least for me, it's like, I just sort of
want to offer the perspective of, like, if you want to reject this technology
as completely as you can, acknowledging that it's really hard to avoid it,
especially now that Google is, like, so entwined with Gemini.
And there's, like, sort of
no way escaping it; even when you just, like, do a Google search, it's going to show up.
But that, that it's okay if you want to reject this technology.

(14:18):
And that there's, there's valid reasons to do so.
That's kind of what I would say.
Yeah.
And for me, it's, you know... I, I just listened to your piece on it,
and it was nice hearing you process a lot of these things.
And I really, like, respected your approach to talking about these things.
I thought you came at it
from a very intelligent place, and a very honest place,
and I really liked it, and it pushed my own understanding of it a lot.

(14:40):
And so I'm more excited to have that conversation,
because sometimes I feel very isolated, and sometimes it feels
very overwhelming to me,
where I kind of completely shut down even thinking about it.
And it's hard for me, because I don't... I don't want to not think
about these things and try to understand them,
because it's, it's here and it's not going anywhere,
and it's going to keep evolving and changing and being part of our lives.

(15:01):
And I think it's important to find like-minded people, talk about these things,
and give a space for these conversations to happen,
and also for people to tune into these things,
because they might also feel very kind of, like, terrified and scared
and unsure what to do with all of these, these thoughts and feelings.
So it was nice. It was, it was like,
that made me feel
a lot less isolated, hearing you talk about them, even if

(15:22):
I'm not currently on the same page or whatever it is. It was like, oh, sweet,
somebody else is really putting their mind to trying to understand this thing,
which is really nice.
Yeah. Likewise.
And I feel like it was interesting, like,
hearing you talk about it, and you're using the word 'approach' now.
And I think that, I mean, who knows what our conversation will be.
But it almost feels like this is a conversation about

(15:46):
how one approaches...
you know, we're living at a time...
I mean, we already were born,
or were little kids, during the time of the internet joining the world.
But, like, we're here for the ushering in
of a whole new technological era.
And...
how do we all navigate, how do we approach

(16:09):
something that is, is here, you know. And I think, like, one thing
we definitely agree on, what some of, some of the many common ground is, like...
if... and I talk about this at the beginning of my piece... it's like, if anyone asked us,
like, hey, Jake, Maren, Alex, what, what should be, like, top priority?
Give me, like,
you know, spitball, like, top ten things we need to work on as a species.

(16:31):
AI wouldn't have been on a top, like, 10,000 list for any of us.
So this is not something that we wanted.
It's definitely not something that I wanted.
No, no.
And that's what I...
I almost get angry, because it was like, oh fuck, I didn't want to have to think
like this. I did not expend...
I did not set aside time and energy to have to grapple with this.
I thought I'd just have to talk about agriculture
and how we feed ourselves and all these other things.

(16:51):
But it's... sometimes it feels like, well, that's not even, like, close
to the top of the priority list anymore, because, like,
this seems way more pertinent. But I feel like I'm almost behind on my...
where I'm thinking about these things; it's moving so fast.
And so part of my gut has a little bit of whiplash, like,
oh shit, I've been working really hard for a long time
on talking about things that I thought were the most important things,

(17:12):
and I don't know if they are the most important things at this point anymore.
Yeah, it has this way of sort of usurping everything, in a sense,
because it, because of the inherent way that the technology is built, is meant to
accelerate every technology that can be... that it can use, that can use AI.
And so it's like, if you have a problem with the military, military-

(17:33):
industrial complex,
then it's like, AI's only going to ramp that up to 1,000, to 10,000.
It's exponential. We have no idea.
And so it does sort of dwarf all of our other problems, simply
because it has the ability to accelerate the things that we're already doing badly,
and increasing the Internet of Things, increasing isolation,
atomization, increasing all of these things that, you know,

(17:55):
we've just been trying to figure out how to sort of make a case,
you know, around the problems of industrial civilization.
And if it's ramping it up to a million, then how do we even begin
to talk about it?
Because, because it's also... it isn't the case necessarily that AI...
it's, it's that AI is this accelerant
and this extension of all of these things that are the problem. Yes.

(18:18):
And so it's, so it's complicated, because...
Yeah.
Like, if I could, you know, snap my fingers, I would wish it away overnight.
Like, I would immediately...
I would wish for it to just be gone, because we don't need it.
There's just no need for it.
Sorry.
I'm getting, like, messages on my computer.
So. Yeah.
So it's like, AI messages.

(18:38):
No. From my sister. Yeah.
The machine is, like, responding to this conversation.
Well, I was just thinking, like, I completely agree.
It's an extension and an accelerant.
And I also totally agree with, like, this is what I want to be thinking about.
And it also is just, like, so disorienting.

(19:00):
The pace of it, and the unknown of where it's going, is so disorienting.
Just as we're talking right now, I'm remembering this moment
when I was, I'm going to guess, like, you know, 13,
14, and I was making movies with my video camera,
and then would put it on my desktop computer, and

(19:20):
I had this editing software on the CD-ROM that was always crashing.
It was like a complete disaster.
And shout-out to my dad:
he would spend, like, his Sunday troubleshooting with me,
like, how to not have my, like, file for the movie I'd been working on
for, like, a month,
like, just get completely, like, deleted. Like, it was...
it was crazy how bad the software was and how much time was spent

(19:45):
agonizingly, like, trying to just, like, resolve it, in
order to, you know, do the creative, fun part of the process.
And I remember it was, like, late at night; I was exporting something,
you know... I had been like, this is my passion project,
I've been working on this. And, like, it was failing and crashing,
and, like, you know, do you remember the Blue Screen of Death?

(20:05):
You know, on the old,
like, desktops, this, like, blue screen would come on. You're like, oh, shit,
okay, I lost my progress. And I was, like,
pacing back and forth in my bedroom,
really, like, freaking out,
like, feeling, like, an existential, like, overwhelming,
just, like, terror of, like, everything is terrible.
I'm like...
And I think it was because I'm like, here I am, so attached... my, like,

(20:29):
my humanity is so attached to this machine and this equipment
that is fucking with me.
And it's devastating me emotionally.
And it's, like, taking from me this creative thing I've been working on.
I might actually lose it.
And I was, like, you know, hyperventilating a little, you know. And I, and I kind of
just, like, had this moment of just, like, okay, okay, hold on, hold on, hold on.

(20:52):
And I just, like, remember, like, my two feet are on the ground, like, I'm
breathing, like... this is just,
you know, electricity and metal and noises.
Like I'm still here.
I'm still a creative person.
Like, it's incredibly frustrating.
It's slowing me down.

(21:12):
But I just didn't want it to, like, take my... sanity.
And I feel like how I want to approach
the onslaught of this totally unprecedented
new technology that is being jammed down all of our lives
is to just also try to

(21:34):
be skeptical and, and
manage it
wisely, and critique it, of course, but also, like...
yeah, just not let it destroy my equanimity.
Sounds like you're doing a good job of it.
I think I'm, I'm doing a bad job of it
all day.
Like, it's one thing to say.

(21:54):
What's the point anymore?
Well, yeah. That too.
Yeah, yeah. I, and I think that, I think that's definitely something
that I really respected about your take on this: that, you know, the willingness
and the openness to try to figure out how to coexist with it more.
And I think that my... the feeling that I get to is sort of like

(22:15):
where I get to with everything, which is that, like,
I just don't... there's so many systems that are outside of my control
that I don't get to consent to being a part of.
Yeah.
And, you know, like, being entangled in industrial civilization
is obviously the key one.
And I think, you know, you said in your podcast,
which is so true, that it's like

(22:35):
it's really impossible for us to live ethically under this civilization.
Like, if you really pay attention to the ways that we're entangled in
so many systems that are destructive and oppressive and all of these things,
like, you can only come to the conclusion that it's impossible.
And, and that's, that's, that's what we were born into.
For me, I feel like it's like AI is this sort of new frontier,

(22:57):
and we're early enough on in it that we can start
creating a parallel structure, because I think
we need to create the parallel system anyway, right?
Like, if we want humanity to not go extinct,
we need to start living a really radically different way.
And I think that's part of, like, the role of people like us in
the world, is, like, experimenting with that and trying to figure that out, trying
to talk about it and trying to implement new ideas and new ways of seeing.

(23:21):
And so for me, it's like, with AI, I don't feel...
I don't feel like it's entangled in me.
And I don't trust that if I let it in,
even, like, a little bit, that I won't start
thinking in terms of the way that the machine is sort of operating,
I won't start valuing things
in the way that it values. Because it's, because, like we said

(23:41):
before, it's an extension of our industrial civilization.
And that's, like...
that comes with all the mythologies that are associated with
that; that comes with the ways of thinking and the, you know,
primacy of efficiency, for instance, and productivity, all of those things.
That's, like, the whole point of AI is to increase productivity and efficiency.
And so, so for me, it's like, I just, I just come to this place

(24:03):
where it's like, I sort of... I'd like to, like, be, like, virginal
as much as I can with it, so that I can,
when I'm creating the...
when I'm being part of creating a new way of being, or being a part of,
you know, building some sort of apparatus that really values humans...
I, I do have... I mean, the purity thing

(24:25):
always comes up in this conversation... is definitely a word that always comes up.
But there is a... I would like to retain a level of purity around that, because
I do think that the parallel structure that I want to be a part of stewarding
doesn't... doesn't need this.
Like, there's no room for the AI in that system anyway.
And so that's, that's sort of, like, why I've, I've put the line where I have

(24:48):
is, it's kind of like, the world that I want to live in
doesn't need this and doesn't want this.
So right now, if I can kind of, like, hold fast to that,
then maybe that will be a benefit,
an asset, in the creation of that parallel structure.
If I'm trying to live within capitalism,
and that's, like, my main goal, is to try to survive, then...
Yeah. So that makes it really hard.

(25:09):
That makes it really, really hard to resist it, because I,
I'm aware of how, like, you know, inefficient I am in comparison
to these people who are able to produce an essay every single day
by using generative AI.
Like, of course
I'm not as productive as a machine. But I think that it's, like, the world
that I want to, like, embody and move toward doesn't value productivity

(25:32):
like, over what that process, the process of creation, is, if that makes sense.
So it's like,
I'm aware that I'm going to be, like, in this sort of uncomfortable
in-between, where I won't be as valuable
in sort of a capitalistic sense, in an industrial-civilization sense.
But I think on the other side of this, like, boom and bust,

(25:54):
I will be very valuable.
We will be very valuable for,
you know, being as mindful as possible with this sort of stuff.
And not just my... Maren, like, Grandma Maren: our computer doesn't
work. Could you help us?
How did they write essaysback in your day?
I need to... I need to write an essay. You're the only...
You're the last one. You're the last essay writer.

(26:15):
How is it done?
Then you hand them a pencil, and they get really, really confused.
Yeah. Why does it say number two?
Where is number one?
Number one.
Yeah. Yeah.
Well, I, I...
oh, shoot, I just lost... I distracted myself with my own joke.
Oh well I think that. Yeah.

(26:35):
Like, what are the commonalities that we share,
which is what makes a conversation about this unique and kind of funny?
It's that, like, we share a vision of, like,
what the scope of the problem is with civilization.
And we share somewhat of... I mean, I don't think it's defined

(26:58):
too much for any of us, but, like, what kind of future we could,
we should, be working towards in order for humanity to survive.
And we share the idea that, like, civilization
is inherently unsustainable,
and that we've been living outside of our means for a very long time.

(27:18):
And the future that we're imagining is not, like, one where we've,
like, solved all of these, like, natural problems with technology,
but, like, we've learned to, like, live within the means of the planet again.
And that's not going to just be by saying no to things and, like,
giving up on all the benefits of civilization.

(27:41):
But I think something that we agree on is that, like,
there's going to be so much to be gained, from communal...
from the community, like, emotionally, spiritually.
If, you know, our drinking water isn't too poisoned
from all the shit we're doing, like, our, our health will be better.
And, like, here we are,

(28:03):
you know. So, so that's, like, a perspective I share as well, too.
And the future that I am envisioning doesn't have AI in it.
And, and I, and I also... I'll say that I also, I don't know to what
extent, like, technology and industrialism has a place or not in that future.
I'm open to there being, like, creative ways for it to be there. But,

(28:28):
yeah.
Yeah, yeah, yeah.
Well, I think ultimately... this is something that Maren and I talk about a lot,
and we made a video about this... is that, you know, humanity, we're inherently a tool-
and technology-using species.
That's... it's, it's inherent to who we are.
There are no humans without some form of tool use

(28:52):
for creating our own environment, for creating our housing,
and the tools that help us procure food.
So that's part of us.
But ultimately...
so not only are we a tool-making species, inherently, we're also a culture-
and myth-making and storytelling creature. That also is so
part of us that you can't have humans without telling stories, because it's
the operating system of who we are and how we are in this world.

(29:15):
And so the stories we tell ourselves to be in the world inform the technologies
that we create to begin with, and how we utilize those technologies.
And so for me, and we're already saying this, is that, you know, AI
is the extreme extension
of the story that we're already telling ourselves.
And it's... it can't not be that thing.
And so if, if you use AI, it is inherently reinforcing

(29:39):
the story that created it, and it only knows that story.
And so, around the conversation of AI, I encourage people
to really think about that.
AI has limits to what it can do, because it's limited
to the database of information that it's given.
So it can't tell you a different story, because it doesn't have the information
to tell you a different story.

(30:00):
And it also doesn't have access to all information.
Not all information and knowledge and experiences can be put on the internet
and scrubbed through and incorporated in some prompt.
So it's very limited.
So we have to be very, very aware of that.
And I think a good example of this is, like, if you look at a lot of these
gen-video programs, these large language generated-video, generative-video programs.

(30:22):
What they're really good at right now is replicating vlog-style videos,
because there is a shitload
of vlogs on the internet that it's utilizing.
So it does that really, really well.
And I think a really good example is that right now you're seeing all these
very funny videos of Bigfoot in the forest, you know, running away.
Like, there's all these

(30:43):
kinds of videos that are popping up, very funny, because that's the bet...
that's... it has such an abundance of that type of information to train from.
And so if you're using AI for therapy, or how to do this or that,
it's limited by what it has access to.
And so it's going to keep us kind of constrained within what it has access to.
And that's my big concern: that if we keep giving
kind of our sovereignty of thought over to it, and we

(31:06):
continue to kind of deify it the way we have, like, that it is this advancement
and it's bigger and larger than us, and it's smarter than us,
then we're going to kind of give sovereignty over our minds to it,
and it's going to continue to reinforce the story
that has gotten us to this place.
Now, obviously, it can be used as a tool, and I think people are finding
the limits of its uses and where to apply it and all these things.

(31:28):
But I do see a lot of people.