
August 21, 2024 · 35 mins
Prepare to have your perspective on artificial intelligence turned upside down by Austin Ramsey, the trailblazing entrepreneur who challenges us all to "crush the box." We uncover how AI is already shaping our daily lives, from the unseen algorithms on social media to the geospatial mapping that guides our commute. Austin also gives us a sneak peek into the upcoming iOS 18 update, which heralds a new wave of AI integration in Apple devices. The discussion underscores the pressing need for safeguarding personal data in this rapidly evolving landscape.

Our conversation doesn’t stop at awareness; it dives into actionable insights. We tackle the critical issue of data privacy, especially in the context of popular platforms like ChatGPT and Facebook. Learn how you can take control of your personal information by fine-tuning your privacy settings and understand the trade-offs between convenience and security. The episode provides practical tips for protecting your digital footprint while highlighting how AI services, like Google Maps and Waze, balance user experience with data utilization.

As we navigate the ethical maze of AI, we scrutinize its impact on sectors like educational publishing. With McGraw using AI tools like ChatGPT to craft textbooks, we raise essential questions about the accuracy and control of AI-generated content. Drawing parallels to the early days of the internet, we advocate for consumer education and legislative action to ensure responsible AI development. To end on a high note, we celebrate the positive possibilities of AI, exemplified by the creation of a new Randy Travis song, showcasing the harmonious blend of technology and creativity.

To help you navigate the home buying and mortgage process, Jonathan & Steve are currently licensed in Tennessee, Florida, Georgia, South Carolina, and Virginia. Contact us today at 423-491-5405 or visit www.jonathanandsteve.com.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
This is Benchmark Happenings, brought to you by Jonathan and Steve from Benchmark Home Loans. Northeast Tennessee, Johnson City, Kingsport, Bristol, the Tri-Cities: one of the most beautiful places in the country to live. Tons of great things to do and awesome local businesses. And on this show you'll find out why people are dying to move

(00:21):
to Northeast Tennessee. And on the way we'll have discussions about mortgages and we'll interview people in the real estate industry. It's what we do. This is Benchmark Happenings, brought to you by Benchmark Home Loans, and now your host, Christine Reed.

Speaker 3 (00:43):
Welcome back everyone to another fantastic episode of Benchmark Happenings, and today the star of our show is Austin Ramsey. Austin, thank you for being here.

Speaker 4 (00:55):
Yes, so glad to be here.

Speaker 3 (01:00):
Oh, I'll tell you what. You did such a great job on the last podcast. I thought you had to come back, because we talked, we got into some other things that we really needed to come back and talk about.

Speaker 4 (01:09):
Yes.

Speaker 3 (01:11):
And so for those of you, if you missed that podcast with Austin Ramsey, I highly encourage you to listen to that. But he also owns Point Tech. Austin is just a phenomenal entrepreneur. He tells about his life and all the great things he did, just from a young age to now, and his whole motto is crush the box.

Speaker 4 (01:34):
There's a lot to crush.
There is a lot.
Keep on crushing.

Speaker 3 (01:37):
And you are. You're just... You know, you don't think outside the box, you crush it. Let's just start over and do something else. Yes, yes, and I love your excitement, I love your ideas, Austin, and you are such a people person and so intelligent, but you truly care, and for any customer that reaches out to

(01:58):
you for any IT issues or developing their systems or writing programs, you are the person to contact, right?

Speaker 4 (02:07):
Yes, thank you.

Speaker 3 (02:08):
Yeah, absolutely, and I can tell you we have experience with Austin here at Benchmark, and me personally with my company, and Austin's come in and helped me and just done a great job. And I'm just so appreciative, for those of us who are a little older.

Speaker 4 (02:28):
Technology can be very complex, so we try to make it simple, and you do.

Speaker 3 (02:32):
You do make it simple. So, our last conversation, last time we started getting into artificial intelligence, and you know, that is something that I don't think we realize how much of that is already embedded in everything that we use. Am I right, Austin?

Speaker 4 (02:51):
Yes, no, I mean, you know, it's interesting. I was just recently at a conference, a geospatial mapping conference, so think of Google Maps, you know, site locations, where the next Starbucks is going to be, and one of the speakers was just talking about how much AI has been embedded in our lives, way before we ever heard the word.

(03:12):
You know, this word AI, and I think ChatGPT has kind of become the Walmart consumer brand. When people think of AI they say ChatGPT. I mean, there's other, you know, there's other examples of these. You know, Facebook recently are now Meta, what they call Meta.

Speaker 3 (03:30):
And I hate that on my Facebook.

Speaker 4 (03:32):
It's... and unfortunately, that's a question I hear a lot from clients, customers, you know, like, what is Meta AI? You know, what do I... how do I get rid of it? The sad part is, it's becoming a world where it's not getting rid of it, unless you just completely disassociate with that platform, which is starting to become a challenge.

(03:53):
One of the most recent announcements, for those that aren't aware: with iOS 18 on all Apple devices coming up in the fall, Apple is starting to embed AI very deeply into the platform, even to the point that they're going to actually integrate with ChatGPT. Now, their promise to their consumer is going to be that

(04:15):
we're going to protect your data.

Speaker 1 (04:17):
It's going to be on the device, right? It's not going anywhere.

Speaker 4 (04:21):
How do we really know where our data is going? I mean, because we know it's getting out. Right, we know that it's, you know... Take an example of going shopping on Amazon, looking for an item, and next thing you know, you've found it on 10 different websites, on ads, or you're talking to someone about something. Which I do think is a distinction now, where we

(04:45):
are so advanced with how these algorithms work that these algorithms really start to understand who you are. Yes, they're getting really good about it. I mean, they have your credit card data, they have your location data, they have your shopping history, they have your email if you're on a Gmail account, and companies are making

(05:06):
fortunes on that.
That is the new goal.

Speaker 3 (05:09):
They're after our time. Yes, if we spend... like, I'm a shopper, we talked about this last time, so I'm always getting these shopping notifications, and even on all of my social media, if I buy something from a particular vendor or store, then that's constantly coming up.

(05:31):
So, Austin, I want to go back to, and I really want us to talk about, to help the audience understand artificial intelligence, and I'm sure there's a lot of people that have a better understanding than I do as well. Well, I mean, how we can put some safeguards around it. So I'm just going to read what I wrote down.

(05:54):
I looked up Webster, and Webster said the capability of a computer system or algorithm to imitate human behavior. There was another one that said that making machines be able to recognize patterns, just what we talked about, make decisions, and

(06:17):
this last part really bothered me: judge humans.

Speaker 4 (06:21):
Yeah.

Speaker 3 (06:22):
So why is it that, if AI has been around for so long, Austin, and then we're just now, kind of since, I guess, a couple of years, hearing more and more about it, how did that... how did that come about? Why are we hearing about it?

Speaker 4 (06:38):
It's already been in there. So, you know, and I'll go back to a previous point I made, I think it goes back to sort of how you consumerize, that's the term I would use, how you consumerize something, right. So AI has been a term that probably four or five years ago, the average person on the street would have had no

(06:59):
reference to. They might have maybe heard the word. Today, if you walk up to someone and say, have you heard of, or do you use, ChatGPT, probably more than not, they're going to say yes. And so I think what it is is, you've got... the media has started to expand on its coverage, and

(07:22):
these other platforms like Facebook, you know, X now, or Twitter formerly, Snapchat, Instagram, all these platforms that the average consumer probably interfaces with, is using this terminology now, so it's become almost in every person's dictionary. But I do think it's a question of how do we control what goes

(07:46):
in and out of that model?

Speaker 3 (07:49):
Yes, and that's what I want us to talk about, that a little bit. Help us understand, you know, how we can do that. What are some safety parameters that we can do, or just maybe better understand?

Speaker 4 (08:02):
it. Sure, sure, I mean, I think some of the first things that come to my mind from a safety precaution is, you know, what are you allowing to be, what I sometimes call, scraped off of your accounts, right? So there's different ways that AI systems or technology or

(08:23):
different services pull that data. Some of it is generally just off of what we call web scraping, right. So, you've got a website, you've got a Facebook account, you've got a Twitter account, et cetera. If those accounts are public, they can scrape data off there and they can store that information, and so that's where a lot of your surface-level artificial intelligence data

(08:45):
comes from. So one safety precaution is, you know, make all of your social media accounts, things that you post, make those things private to your group of people. Now, that's not going to, you know, eliminate that data from that particular provider. So think of, like, let's use the Facebook example, right. If you've got a public Facebook account, any type of AI service

(09:10):
can scrape data off of there, but if you make that private, then only people that you're friends with, or that you're connected to, as well as that service, could access it. So, like in the Facebook example, only Meta could pull from that data. So I think it's kind of doing, sometimes what I call, an

(09:30):
information tech checkup, right. Looking at all of the services that you interact with and, you know, just trying to get an audit of what's public and what's private, as step one, from your own information perspective.
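
To make the web-scraping point above concrete, here is a minimal Python sketch, assuming the requests and beautifulsoup4 packages and a purely hypothetical public profile URL. It only illustrates what Austin is describing: whatever a site serves on a public page can be fetched and stored by any program, while posts locked behind privacy settings never reach a script like this.

```python
# Minimal illustration of "web scraping" a public page.
# Assumes the requests and beautifulsoup4 packages; the URL is hypothetical.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/public-profile"  # hypothetical public page

response = requests.get(url, timeout=10)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")

# Collect the visible text of every paragraph on the page.
public_text = [p.get_text(strip=True) for p in soup.find_all("p")]

# A private post behind a login never reaches this script at all;
# the request would come back as a login page or an error instead.
print(f"Scraped {len(public_text)} public paragraphs from {url}")
```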

Speaker 3 (09:46):
So that makes a good point on Facebook. So when you're getting ready to do a post, it asks you, do you want it private or public?

Speaker 4 (09:53):
Correct.
So if it's private, it's just going to go to who you're friends with, to the people you're friends with. But if you do public posts, which, and it becomes a question of, you know, if you're trying to share... Maybe you're trying to share something, maybe it's a lost item. Let's just use that as an example. A lost item, and you want everybody that can see that post to share it. You might make that post public, right.

(10:14):
But if you're posting about your family, or photographs of family and friends, or activities that you're doing, I highly suggest... all of my posts are private, unless I make them public, and so that's just something to verify. But I think, you know, one of the other sides of this whole coin is, you know,

(10:35):
you have to ask yourself, you know, what services are you going to interact with? At some point, by interacting with a particular service or company, you know, you're, to some degree, letting go of some of your data privacy, right? I mean, by having a cell phone, unfortunately, we're letting go of our trackability.

(10:56):
I mean, we can be told that this device is not following us, that it's not listening to us. Is there really a good mechanism, though, to verify that, from a general consumer perspective? Yes, I mean, you can tell what apps are using your location, what apps have control of your microphone, but does the average

(11:18):
consumer go to that depth to check that? Probably not. And so, you know, you have to ask yourself, and that's where this becomes a very kind of a two-edged sword on, you know, we want things to happen fast.

Speaker 1 (11:33):
We want things to be instant.

Speaker 4 (11:35):
When we want to look up an answer, we want to know the answer before we looked it up, right? That's what

Speaker 1 (11:40):
AI is about.

Speaker 4 (11:41):
It's about predicting and helping us understand things really before we ever even thought about it. So it's in front of us. But that comes with a cost of feeding the data that can help make those predictions. You know, think of, you know, traffic, you know, like Google Maps. You know, these map apps are starting to use AI to

(12:01):
predict where a nearest accident could happen.

Speaker 1 (12:05):
Well, how are they?

Speaker 4 (12:05):
doing that? Because they know, when you're using Google Maps, they know where you're at. And if they detect you slowing down, they can put that onto a map, and over time they can detect a pattern. So in real time they're going to update that trip time, to say it's probably going to take an extra two minutes, because we often have a slowdown at this location.
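
To illustrate the pattern Austin describes, here is a small, purely hypothetical Python sketch (not how Google Maps or Waze actually implement it): it groups speed samples by road segment, flags segments where drivers are routinely well below the posted speed, and pads the trip estimate accordingly.

```python
# Hypothetical sketch of the slowdown-detection idea described above:
# group GPS speed samples by road segment, spot segments where drivers
# are routinely slower than the posted speed, and pad the trip estimate.
from collections import defaultdict

# (segment_id, observed_speed_mph) samples reported by many phones over time.
samples = [
    ("exit-17", 28), ("exit-17", 25), ("exit-17", 30),
    ("main-st", 42), ("main-st", 44), ("main-st", 45),
]

speed_limit = {"exit-17": 55, "main-st": 45}          # mph, hypothetical
segment_length_miles = {"exit-17": 1.0, "main-st": 0.5}

by_segment = defaultdict(list)
for segment, speed in samples:
    by_segment[segment].append(speed)

extra_minutes = 0.0
for segment, speeds in by_segment.items():
    avg = sum(speeds) / len(speeds)
    if avg < 0.7 * speed_limit[segment]:  # routinely well below the limit
        slow_time = segment_length_miles[segment] / avg * 60
        normal_time = segment_length_miles[segment] / speed_limit[segment] * 60
        extra_minutes += slow_time - normal_time

print(f"Add about {extra_minutes:.1f} minutes to the trip estimate")
```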

Speaker 3 (12:24):
So that's kind of like Waze, the Waze app, because it's always doing that. Yes, you know, there's a slowdown up ahead. Well, there may not be an accident, correct, but it's just where the algorithms are predicting, where people have slowed down.

Speaker 4 (12:38):
Now, and some of it's self-reported. And actually, Waze now is owned by Google. Believe it or not, Google bought out Waze. Really? So they're tying those two together, yeah. That's the big thing. If you start to look on a large scale, you know, in the stock market we call it the big seven, but it's really the big seven that sort of is starting to control

(12:58):
our technology data footprint, right, and I think that's been the issue all along in this country and the world is...

Speaker 3 (13:08):
In certain sectors, all sectors of business, you end up with these large companies, conglomerate corporations, that are controlling everything. Yes, and the people that are overlooked and left behind? It's always us, the ones that are the backbone that makes the

(13:29):
country run, pays the taxes, and it's just like, where does it... How do you control that?

Speaker 4 (13:38):
I mean, well, you know, and that's a conversation that I've had with some people, you know, because I am definitely a, you know, a small-government person. I believe big government is not great.

Speaker 3 (13:51):
Right, it's, smaller is much better.

Speaker 4 (13:53):
Smaller is always better, yes. So much waste, so much duplication of services. It's unbelievable. We could get a whole other session on that, we could do a podcast.

Speaker 1 (14:02):
We could talk for days on that.

Speaker 4 (14:05):
But you know, it really does become a question of, at what point do you have rules and regulations that come into play? I mean, we know from a financial... there's financial rules and regulations that happen. Often they get breached. So the question is, how effective are these rules and regulations, these safeguards that get put into place?

(14:28):
I think it's more... I always say the consumer truly drives the market. In a free economy, in a free market, the consumer drives the market, and I think it's a point to where consumers as a whole have to come together and say, wait a minute, if you are monetizing

(14:48):
off of my data, you're reselling my data, and we know that happens, it happens all the time. Where is my involvement in that? I never gave you permission. Of course, in their 15,000-word agreement that you probably signed.

Speaker 3 (15:04):
that gets modified in real time. I accept. I accept. Who has time to read that? You're not

Speaker 4 (15:09):
going to send it to an attorney, because they're not going to interpret it well. But it becomes a question from the consumer to say, hey, wait a minute. It's time that we are recognized, that, yes, you're taking our data and we're not getting any type of... I mean, of course, the other side argues that, well, we're making your life better. We're making it easier, we're making it faster. So it becomes

(15:30):
that constant balance on who's really winning. Now, we know who always wins: the big corporations definitely win.

Speaker 3 (15:37):
Also, the other benefit is, like, with the business, you know, you have free advertisement for your business. You know, because everybody's on social media, and that's how we... you know, we post things, even this podcast, you know, we post it. Um, so you do get that. But so, I'll tell you, I had an interesting conversation the other day. I met a young lady who is director of AI for McGraw.

(16:00):
Oh, yes, and so we were talking, and I said, so, I'm just trying to learn a little bit about artificial intelligence. And I said, what does a director of AI actually do? So she was telling me, and it's like, you know, they publish textbooks, so they're using that, with ChatGPT, to

(16:21):
write their textbooks. And I said, well, I said, so how do you control what comes out? She said, oh well, it knows to do all these Google searches, and if it finds false information, it doesn't print it. And I said, well, wait a minute. I said, who's deciding what's false? I said, pre-COVID and COVID,

(16:41):
we've had more false information than ever on, all you know, search engines, especially Google. Yes. And she said, well, we were putting, like, roadblocks around it. I said, so at the end, are those textbooks being reviewed? Oh, yes, we review them. And I said, so if you're sending all

(17:03):
this information, I said, how does it work? How does it produce that textbook? She said, we don't know. So I'm going to give that to you, Austin. So, we don't know.

Speaker 4 (17:27):
If you don't know how something... You feed it lots of data. Help us with that. And I think that's one of the big questions too, in terms of not just in the education landscape, but in the business landscape, people that, you know, what's like ChatGPT. I could open ChatGPT right now. You can talk to ChatGPT like you're talking to a person, and there are certain concepts that are very helpful. If you're looking for... I'll give you an example.

(17:48):
You know, if you're needing to reply to an email, you can have a conversation with ChatGPT. Give it the parameters of what the conversation was about, and, give me a response. It'll respond with a response. You can tweak it and say, well, I was really looking for it to be a little more direct, or I want it to sound a little more professional, and it will revise that. The question is, are people really refining what is being

(18:14):
generated? And I think that's to your point of, if we just trust AI to produce content and we don't have that human level of control.
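
As a concrete version of the draft-and-refine loop described above, here is a minimal sketch using the OpenAI Python SDK; it assumes the openai package is installed, an OPENAI_API_KEY is set in the environment, and the model name is only an example. The point about human control still applies: the final text needs a person to read and approve it.

```python
# Minimal sketch of the draft-then-refine email loop described above.
# Assumes the openai package and an OPENAI_API_KEY environment variable;
# the model name is just an example and may differ from what's available.
from openai import OpenAI

client = OpenAI()
history = [
    {"role": "system", "content": "You draft short, polite business emails."},
    {"role": "user", "content": "A client asked to move our Tuesday call to "
                                "Thursday. Draft a reply agreeing to that."},
]

first = client.chat.completions.create(model="gpt-4o-mini", messages=history)
draft = first.choices[0].message.content

# The human stays in the loop: push back on the draft and ask for a revision.
history += [
    {"role": "assistant", "content": draft},
    {"role": "user", "content": "Make it a little more direct and professional."},
]
revised = client.chat.completions.create(model="gpt-4o-mini", messages=history)

print(revised.choices[0].message.content)  # still needs a human read before sending
```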

Speaker 3 (18:25):
Yes.

Speaker 4 (18:25):
You know, we're putting AI against other AI, right. A lot of these fact-checkers, I hear that concept, those are AI programs, or I should say, AI-powered programs. That's fact-checking, right. And what is it fact-checking, right? And so I think it becomes this constant, like a dog trying to

(18:47):
catch its tail, just a constant circle, that is...

Speaker 3 (18:53):
And it has to be trained.

Speaker 4 (18:54):
Yes, and it's trained on what it's fed Exactly.

Speaker 3 (18:57):
That's my point. Yes.

Speaker 4 (18:59):
Going back to the, you know, if you feed it one thing, it's going to learn to produce that one thing. And we do know how, I mean, there is evidence, there's Senate reports on what these large companies were censoring and protecting the information, so we know that's happening.

(19:20):
And that, I think, to me, is the greatest risk with AI, is, you know, is it really going to be open? We have this term called open source in the data world. Right, we can allow data to flow between different softwares and different industries, right, it

(19:42):
can be a great thing.
But also, that's if you allow all data to flow, not, well, we don't like anything that has this word in it, or we want to filter out anything that has these phrases in it.

Speaker 1 (19:54):
And we know that's happening.

Speaker 4 (19:55):
I mean, we've seen that happen. So I think that's where we're at this very... I feel like we're sort of at this pivotal point with AI, to where it's consumerized. People know what it is. They're familiar with it now. They know it's involved in their life. But where does it go?

(20:16):
And I think there's a lot of questions on... You know, we see companies investing millions, billions of dollars. In fact, so I'm kind of involved in some investment stuff, and I always joke with some people I'm connected to that, if you mention AI in your quarterly results, your stock's

(20:36):
going to immediately double. Not really, but I mean, it really has become this new buzzword, right? When does that buzz start to kind of soften? And when does it become, okay, we've really got to lay some foundations for how does it get used ethically, and how do we ensure that it is truly open and

(20:57):
not getting controlled by the source that's using it.

Speaker 3 (21:02):
And this reminds me, Austin, of the launch of the Internet. We had the same conversations around the Internet: how is it controlled? And now we're stepping into something that we don't really understand how it works. But we teach it, and that's what I think

(21:24):
that's the biggest concern. And for the consumer, I think you pointed out that we need to be educated on how can we, as the consumer, put our boundaries around what data can be scraped. Yes, our information. Like, number one, I would never save my credit card information

(21:46):
for another purchase. If I buy something off of a website, I never choose to save that, you know, so it makes shopping easier the next time, I don't have to put in my number. I'm like, no, I can do that every time. But I think, that's... how do we do that? And then, I hate to even say this, but it's really going to take legislation.

Speaker 4 (22:06):
Yes.

Speaker 3 (22:07):
And we know what that's going to be.

Speaker 4 (22:09):
Yes.

Speaker 3 (22:10):
I remember watching C-SPAN last year when they were
talking about TikTok.

Speaker 4 (22:13):
Yes.

Speaker 3 (22:14):
And then they had conversations about AI. Yes, you know, about safeguards and stuff. So it's really up in the air. I don't think we have...

Speaker 4 (22:23):
Well, I think in government, you know, you ultimately never have true subject matter experts in the room. Oh, that's so true.

Speaker 1 (22:32):
Making the decisions.

Speaker 4 (22:33):
Right, you have people that have been in there
for way past their time.

Speaker 3 (22:37):
Oh God, we need term limits, do we not?

Speaker 4 (22:40):
Yes, 100%.

Speaker 3 (22:42):
And so they're in here making these decisions, and they have no clue what they're even talking about, and they're being fed information from somebody else, and we know that half of them, like, they're voting on these bills, they haven't even read the information. Have you seen some of the Senate briefing packets?

Speaker 4 (22:59):
We're talking about five, six hundred thousand plus
pages.
Who has time to go through that?
No one.

Speaker 3 (23:05):
So they divvy it out.

Speaker 4 (23:07):
Just say, here's your packet. They do a quick brief on the front, and they come in, and truly, I think that's the scariest part about what big government's created. Is the people in control, which should be the people.

Speaker 3 (23:21):
Right.

Speaker 4 (23:22):
It's unfortunately turned away, and it's not in the people's hands now, it seems like. The people that are in control at government levels, they don't know, they're not reading anything of what they're doing. It's just a matter of just, you know, check, check, check. It's just a checkbox. How many pieces of legislation can we pass? What type of media exposure can I make? What type of headline can I make?

(23:42):
And so, you know, I think that's where, again, I go back to that whole comment of, we, as the consumers, we do drive the market. I mean, we control what we buy, what services we use, where we go. I mean, we control that. And so I think it's really, we're at a point where we have

(24:03):
to be in the driving seat of keeping these companies accountable for what they're doing with our data. And, you know, ultimately it's a challenge on both levels, because, yes, I think legislation could come into effect to control that. But then you have to ask the question, well, who's enforcing

(24:25):
that?

Speaker 3 (24:25):
Who's enforcing it? And it would be, like you said, it would be much better to have grassroots consumers come together to start demanding these boundaries. Because, I just, when you were talking about check the box, I thought, oh my gosh, it's going to be the same thing with artificial intelligence. We're just checking the box, we're feeding it this

(24:47):
information. Okay, check, we've done that. But how good is the information that you've given it?

Speaker 4 (24:54):
Yes.

Speaker 3 (24:55):
And are we going to be 20 years down the road, and history is going to be rewritten, and nobody's going to remember, no one's going to know?

Speaker 4 (25:02):
And that brings a whole other concept to AI, is the concept of, if I generate an AI-generated response, how do you verify now the authenticity? I mean, we've not even got into artificially produced images. These are what they call deep fake videos, where you can take someone... I mean, artificial intelligence that is actually taking a sample

(25:26):
of somebody's voice and using that voice on their behalf. Right, how, you know, and I really don't think... maybe there's someone, I highly would doubt someone has even thought about now, or I should say, thought about, but figured out, how do you verify the authenticity of that? I mean, it kind of goes back to sort of this whole...

(25:48):
I'm fairly passionate about the crypto environment, blockchain. That's all about verifying authenticity, right? You know exactly. You know, in this ledger-based system, that's almost where we're going to have to head, to understand, is this photo real?

(26:08):
I mean, think about it. I mean, we're not even... I mean, I've seen, you can go into ChatGPT and generate a photo, and right now it's hit or miss.
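
The fingerprinting idea underneath ledger-based authenticity checks can be sketched in a few lines of Python: hash the original file when it is published, record that hash somewhere trusted, and later re-hash any copy to see whether it still matches. The file names here are hypothetical, and this is only the hashing step, not a full blockchain.

```python
# Sketch of the fingerprinting idea behind ledger-based authenticity checks:
# hash the original file at publication, record that hash somewhere trusted,
# and later re-hash a copy to see if it matches. Hashing step only, not a
# full blockchain; the file names are hypothetical.
import hashlib
from pathlib import Path

def fingerprint(path: str) -> str:
    """Return the SHA-256 hex digest of a file's bytes."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

# At publication time: record the hash of the authentic photo.
registry = {"press-photo-001": fingerprint("original_photo.jpg")}

# Later: someone presents a copy claiming it is the same photo.
claimed = fingerprint("downloaded_copy.jpg")
if claimed == registry["press-photo-001"]:
    print("Byte-for-byte identical to the registered original.")
else:
    print("Does not match the registered original; it was altered or regenerated.")
```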

Speaker 1 (26:17):
Sometimes it does a good job.

Speaker 4 (26:19):
Sometimes it does a bad job.
Right.
But we're just getting into the hyper growth of where... Think about what makes AI better. It's more data, more people, more involvement. And so, now that we're at this point to where it's getting fed so much more data, it's only going to get better and better in terms

(26:40):
of generating these types of things, that I think the human's going to find hard to distinguish between.

Speaker 3 (26:48):
And that's scary, I have to admit. And I know that there's been actors and actresses with lawsuits, because AI's used their voice, even their face. Yeah, and you know, and I'm, you know, I would be, you know, upset over that too, I mean, because that's how I make my

(27:10):
living, is who I am and my voice. And so let's kind of... I think we've kind of talked a lot.

Speaker 1 (27:16):
It's overwhelming, it is a lot. It is overwhelming.

Speaker 3 (27:19):
So let's focus on the positive of AI, Austin. Let's think about the good things that we can do from it. I think, those of you listening, please educate yourself as a consumer. Remember, we drive the market. Put those boundaries around your phone and other pieces of equipment you own.

(27:39):
And so let's talk about some of the positives, Austin.

Speaker 4 (27:43):
Yeah, well, I think, I mean, one example that actually, that was... actually a lot of people were asking questions. But look at the new album, or new song, that Randy Travis released, using AI for his voice. I mean, a great, great song, using AI for someone that couldn't have produced that on his own today. I think that's really cool, to bring that back to that perspective. So there is good examples of it, and that is a great example of

(28:08):
how someone, Randy Travis, his team, worked with the record label to create these boundaries, to create an agreement that this is what they wanted, right, and it was successful. And a lot of people started to ask, you know, is this...? Yes, it was, I mean, and they released a whole statement, and that was a really cool moment to see AI shine. I think, also, in

(28:30):
people's daily lives. You know, like I said, I mean, I have ChatGPT on my phone, and you know, it's a great tool if you're trying to generate a quick email or a quick message, or you're trying to contemplate on how to respond to something, needing some quick information. What an incredible tool, that you can actually communicate

(28:53):
with, like a person, right, and kind of get that feedback. And I think, just down to how, you know, companies are starting to use AI to be more intelligent, to help curate. You know, going back to the example of understanding us, you know, it

(29:13):
helps us to eliminate some of the decision-making process. So, instead of having to hunt through 50 different things, it can narrow that down to five. I know Microsoft is coming out with their new Copilot, which is going to be rolling out with Windows 11. A lot of enterprise organizations are starting to

(29:33):
question right now on, you know, how do we bring Copilot, or this idea of AI, in a mainstream effect. But I think it's making things more accessible, and taking away some of that... Like, you know, if you've been working on your computer all day: give me a recap of what I've done today, or, I'm looking for

(29:53):
this file, I saw it yesterday, help me find it. And being able to use AI to do those things, it takes away some of that clutter.

Speaker 3 (30:02):
Right, some of that burden.

Speaker 4 (30:04):
Some of that burden, work burden. And I think that's... and there's also some things behind the scenes that we don't see as an average consumer, where AI is playing a role in helping impact our lives and, to many degrees, improve. I mean, there is, you know, I think we started off with a very, you know, there was a negative connotation to it, and I still

(30:25):
think there's concerns, there's questions. I think, as you said, the consumer needs to be aware and do their education and learn. But also, I think, being something that's been out there a lot longer than just as it's gotten so recently known, there is a lot of benefits to our daily lives that we don't

(30:47):
see.

Speaker 3 (30:48):
So what would be a good resource, Austin, for somebody listening today to educate themselves? What would they go to, to learn, and do you have any recommendations for that? You know, I don't have a direct recommendation.

Speaker 4 (31:03):
You know, I think really it comes down to looking, well... One, I would say, do an internal audit of all the services that you interact with, right, mainly your social media platforms, and verify how your information is being utilized, right. That's a big piece. Two, I think it's doing some self-education online.

(31:27):
I mean, I think, going back to the start of the internet, it's grown to the point where you can literally scroll forever, with tons of different resources, and you have to gauge those based off of what you feel like is... I mean, different resources have different connotations and different pulls, right, depending on who's bringing it out.

(31:47):
But just do some internal online searching. If you want to start with, what is AI? How is AI used? Some of these search terms, just to kind of start that. But then think of it like you're putting together a research paper, right, you're trying to get 20 different sources, right.

Speaker 3 (32:06):
Right, that's kind of what I did today, a little bit, you know, yeah. Um, well, Austin, thank you so much, this has been great, it's been great, truly. You know, I could just sit and talk with you forever.

Speaker 4 (32:19):
Likewise, likewise, and you're just... Well, it's a large topic too. Well, it is.

Speaker 3 (32:24):
And I think there's other things that we can talk about, because I really believe that we need, as consumers, we've got to get more information out there, and I think, having you come on this podcast, and this was kind of an overview, yeah, and then maybe we can kind of do a little bit more. I really think this is needed, because I don't think people are

(32:45):
really talking about this at this level, Austin. Well, and I think you have a whole other conversation about AI in the workforce.

Speaker 4 (32:50):
Yes, you've got people like Elon Musk that thinks, in the next 10 to 15 years, that AI could... I mean, think about all the, you know... and I think of that and I go, hmm, what would be a world where AI has replaced 50% of the jobs? What would that look like? Right. I mean, and things that you wouldn't think of, where AI is

(33:13):
starting. I mean, there's AI involved in construction today, right? I mean, and in 10 years, you know, you've got robots building large-scale buildings. That's coming, so we have to embrace that and we have to be ready for this. That's a whole other conversation, but it's here.

Speaker 3 (33:34):
It's growing. I think the more we can educate ourselves and be ready and prepared, so that we're not caught off guard, no matter what we're facing, as individuals, as a community, as a country. I think those are always kind of the ground rules of

(33:55):
getting in front of something.

Speaker 4 (33:56):
One of my favorite quotes is knowledge is power.

Speaker 3 (33:59):
Knowledge is power and God says my people perish
for lack of knowledge.

Speaker 4 (34:03):
Yes, yeah. Austin, thank you. Thanks for having me. Austin from Point

Speaker 3 (34:08):
Tech, an amazing entrepreneur. So I highly recommend, if you need any help with your systems, your company, whatever that has to do related to computers, you need to contact Austin Ramsey.

Speaker 4 (34:23):
Thank you.
If you've got any boxes to crush, we can crush them.

Speaker 3 (34:26):
And you're going to crush them. I know it, because you do. Thank you, Austin, thank you.

Speaker 1 (34:31):
This has been Benchmark Happenings, brought to you by Jonathan Tipton and Steve Reed from Benchmark Home Loans. Jonathan and Steve are residential mortgage lenders. They do home loans in Northeast Tennessee, and they're not only licensed in Tennessee but Florida, Georgia, South Carolina and Virginia. We hope you've enjoyed the show.

(34:52):
If you did, make sure to like, rate and review. Our passion is Northeast Tennessee, so if you have questions about mortgages, call us at 423-491-5405. And the website is www.JonathanAndSteve.com. Thanks for being with us, and we'll see you next time on

(35:14):
Benchmark Happenings.