
September 2, 2025 10 mins

AI is no longer a concept—it’s here, shaping the way we live, work, and even think. But with great power comes an even greater question: can technology itself be part of a divine assignment? In this thought-provoking episode of Culture Raises Us, AI expert Julian Reeves joins to explore how faith, purpose, and spiritual grounding must guide the future of innovation.

Julian shares what it feels like when building technology seems divinely led, and why discerning the difference between what’s merely possible and what’s truly purposeful is essential. Together, we examine the dangers of AI created without moral grounding—dependence, bias, and misplaced trust—and discuss how creators can anchor their work in integrity, humility, and divine intent.


From starting small with personal impact to protecting against fear-driven “drift,” this conversation is a blueprint for building technology that uplifts humanity instead of distracting it. It’s a call to align our digital creations with God’s higher purpose.


#CultureRaisesUs #JulianReeves #AIWithPurpose #FaithAndTechnology #DivineAssignment #EthicalAI #TechResponsibility #SpiritualInnovation #PurposeDrivenTech #GodInTheCode



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
AI is no longer just an idea. It's literally here,
shaping the way we live, create, and even think. But
as powerful as AI is, it's still built by human hands,
guided by human intent. So that raises a question: can
technology itself be part of a divine assignment? Has God
ever placed a deeper why beneath the code, the algorithms,

(00:25):
the content we create? In today's conversation with Julian Reeves,
AI expert, we're gonna unpack what it means to bring
divine wisdom into the design of AI and how we
as creators can make sure that what we build doesn't just
serve culture, but really helps shape it with purpose. Have

(00:46):
you ever experienced a moment, like while working with AI
or tech in general, that felt divinely guided, like
there was a why that was way deeper than just
the code?

Speaker 2 (00:56):
Yeah, for sure, especially when I'm building something and the
pieces just seem to come together seamlessly, when there seems
to be no type of friction, when you can kind of
see that you're being guided by something that's higher
than yourself. I feel like I see that often when
I know I'm building something that has a higher purpose,
that's there to enlighten consciousness and not just there to

(01:17):
look cool or to fulfill, you know, some low,
self-serving purpose, for sure.

Speaker 1 (01:24):
How do we kind of discern the difference, though, between,
like, innovation that's simply possible versus innovation that's purposeful? You know,
there's a very distinct difference between the two.

Speaker 3 (01:38):
Yeah.

Speaker 2 (01:38):
I think it's really about spiritually knowing what your goals are,
knowing what you were put on this earth to do,
your impact, because a lot of times, especially with AI,
it seems like everything is now possible. It seems like
the world is at your fingertips, which is great in
the right hands, but in the wrong hands, that can
just equal waste. To be honest, that can just equal

(01:59):
toxic software, which is never good. So in terms of
purposeful software, it's always important to understand that if you
have the power to create this almost godlike software, it's
very important to be intentional.

Speaker 1 (02:15):
Yeah, there's a couple of things there. I think you
knocked something out of the park in terms of when you
know your why, you never lose your way, right? And
if you have your why at the forefront of everything
that you do, it helps guide you in making those decisions.
I think the other thing that you just hit on
is just because you can doesn't mean you should, right?

(02:36):
And so often, because people can create something or infuse
something into something doesn't mean that it's necessarily the right
thing to do. But if you have that first piece
that you were talking about, being grounded in your why,
that will then help inform you to do
the things that you know are the right things to do,

(02:57):
not the things that you know you can do if
you wanted to.

Speaker 2 (03:00):
Absolutely it's about having that restraint with not drifting towards
what can just lead to profit, but really being intentional
and actually putting in the effort to know what can
actually change and transform lives for good.

Speaker 1 (03:13):
So what cultural dangers do you see if AI is
developed without moral grounding, or what I call spiritual grounding?

Speaker 2 (03:21):
For sure. Well, one, dependence. So this actually recently happened
with OpenAI. The company behind GPT released
GPT-5, which is like their newest thing, and
they cut off all the other ones. So they had
like GPT-3, GPT-4, et cetera, and on the
internet people were literally crying, like, you guys took
away GPT-4, like.

Speaker 3 (03:42):
Literally shedding tears and trying.

Speaker 2 (03:44):
To write, really, exactly. Like, this was a real thing
within the tech community. And so I think that's the
problem that could happen: that dependence on, you know,
this technology that's really just somebody else's random creation. I
think that's one side of it. And the other side
of it is also that those things, if unchecked, can

(04:04):
be created by people with biases. You know,
let's just say that Donald Trump was creating an AI.
Obviously, if that was influencing and impacting millions of people's lives,
there's inherent biases and evils inside of it that, when unchecked,
could lead to people making horrible decisions, because they're being
advised by somebody they think is unbiased, but really all

(04:27):
these things are in the background.

Speaker 1 (04:29):
So, but when you were giving the example of what
took place with the, you know, three and four versions going
away with five, is that not just the typical evolution
in technology and everything that we deal with, where it's
just a newer version coming out, or is this something
that I'm missing?

Speaker 2 (04:43):
Yeah, no, it was literally exactly that. GPT-5 is
just, basically, a newly iterated version. But what they did
differently is, usually when they come up with a new version,
they'll keep all the old versions so you can still
use them.

Speaker 3 (04:54):
Oh, they took all of them away except for that one.

Speaker 2 (04:57):
So that's why everybody was like, you basically took away
my best friend.

Speaker 3 (05:01):
That's how people were looking at it.

Speaker 1 (05:03):
Gosh, so yeah, that is a
little nutty.

Speaker 2 (05:08):
For sure, because it's actually been reported that teenagers, younger kids,
they're using AI like a therapist, like a best friend.
They're talking to it all day every day, so they're
forming these relationships that, if unchecked, can really be damaging.

Speaker 1 (05:25):
So, and I think you kind of nailed this, but
I'd love to get from you: what does accountability look
like for AI creators who may have the power to
influence these millions without ever being seen? Because that's the
other thing. In some instances, we have no clue
who the ones in the background creating and doing
all these things are.

Speaker 2 (05:46):
Yeah, and usually they're quote-unquote good people. Typically they're
like researchers, engineers who are just very, very passionate about
building these networks that can be infinitely intelligent. So they're
not, like, going and intentionally trying to spread bad values.
They're just like, I want to make something smarter, I
want to make something better. But sometimes the problem is

(06:07):
it's not like they have some spiritual guide inside with them.
They're not necessarily starting from the impact of, I want
to build people up spiritually, enlighten consciousness. They're more
just saying, I want to make something smart. So those
are the people behind the scenes that are creating these things,
but it's going to be important for us to have
a layer of regulation, in a sense, because if they're

(06:29):
just the ones making all these decisions, they can impact
mankind as a whole.

Speaker 1 (06:33):
So yeah. Well, if somebody's listening, then, and feels called
to build, I'll call it, a faith-based or ethically
centered AI, where should they start?

Speaker 2 (06:43):
Yeah, for sure, I love that. I think they should
start small, to be honest. Figure out what in their
life they want to improve. Like, for them, if they
had a problem with something in terms of a temptation
or distraction, figure out how to build a technology that
can cure that. There's so many people who have built
things that, like, help you spend less time on screens,
et cetera. Start with something personal to you and then

(07:05):
from there scale up, iterate, but don't try to, you know,
attack everything. Don't try to be as big as GPT
and OpenAI from the jump, because then, you know.

Speaker 3 (07:13):
You're kind of getting ahead of yourself.

Speaker 2 (07:15):
Start small, find your impact, find your why, and then
see that it's working, and then keep growing from there.

Speaker 1 (07:21):
Yeah. And on your personal level, how do you check
like your heart when you're working on technology products so
that your why doesn't drift away from your divine assignment?

Speaker 2 (07:32):
Yeah, that word you used, drift, is very, very important.
So I recently read the book Outwitting the Devil,
and I think that probably had
one of the biggest impacts on me, just because there's
so many subtle things in life that can cause you
to drift.

Speaker 3 (07:50):
For sure, fear.

Speaker 2 (07:52):
Of poverty, fear of,
really just fear as a whole.

Speaker 3 (07:58):
So if you're building with fear.

Speaker 2 (07:59):
Of, like, I don't know if people are going
to be able to use this, I don't know if
this is going to make enough money, then you're just
causing yourself to go down a path of building something
that doesn't really have any societal impact. But, I mean,
me personally, like, I keep, you know, a book like
this next to me, Jesus Calling. So for me, I
always have scripture and the word inside of me, so I

(08:20):
understand the pureness that needs to come inside of these
things that have this much leverage and impact.

Speaker 1 (08:26):
I love that you're grounding it in the word, and
you are speaking to things that we all deal with
on a regular basis. And it's not the small whisper,
because that is the spiritual one that we've
got to have a very clear, open mind to be
able to hear. But it's the fears and the filters
of the world that, obviously, the devil that you're talking

(08:48):
about works so strongly in bringing to the forefront,
clouding our ability
to stay calm and think clearly on our why.
And you know, as AI continues to evolve, I think
so does our responsibility, right? And we're not just building tools,

(09:11):
we're shaping hearts, minds, and futures, as you were just
talking about. And with that comes, I think, a calling
to create with integrity, with humility, and with a spirit
aligned to something way greater than that of ourselves, which,
when you picked up that book, that's what that's talking about.
So as we leave this conversation, for people who are

(09:32):
you know, listening, I want everybody to ask themselves, you know,
what's my divine assignment in this digital age, and how
will I carry it out?

Speaker 3 (09:42):
Like?

Speaker 1 (09:42):
What is my divine assignment in this digital age?
And how are we carrying it out? And I think
if we all have that in front of us, and
those who are in the space like yourself, I think
we're going to be in a much greater, more profound
place as we navigate this new world of AI.

Speaker 3 (10:01):
Absolutely, one thousand percent. We'll be in a much more
prosperous space if everybody's building with divine intentions versus fear.

Speaker 1 (10:07):
Thank you, man, always good having you.

Speaker 3 (10:10):
Yep.
Host

Astor Chamber