All Episodes

August 21, 2025 • 50 mins
Mark as Played
Transcript

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Seven ten The Uncle Henry Show weekday afternoons from five
till seven.

Speaker 2 (00:11):
Hi Henry.

Speaker 3 (00:13):
Here, I am. Where are you? Message deleted.

Speaker 4 (00:50):
This is the Uncle Henry Show here on News Radio
seven ten WNTM. Thank you so much for listening
to The Uncle Henry Show, and today I have guests
in the studio back after a long break: Johnny Gwinn
of Deep Fried Studio. Johnny Gwinn, hello to.

Speaker 5 (01:10):
You, greetings and salutations. And you brought with you.

Speaker 4 (01:13):
Who did you bring with you?

Speaker 5 (01:14):
Johnny Gwinn, a good friend of mine and a business partner,
Scott Tindall.

Speaker 4 (01:19):
Scott Tindall, thank you for coming in on the Uncle Henry Show.

Speaker 6 (01:22):
Uh, truly happy to be here. This is always fun.

Speaker 4 (01:25):
Well, glad to have you both here. I asked Johnny
Gwinn and Scott Tindall to come in and talk on
the Uncle Henry Show today. One reason is because they've
started a new show of their own that you can
watch on the internet in the middle of the day.
Johnny and Scott, tell me and tell the listener about
your new show. What are you doing?

Speaker 6 (01:44):
Johnny and I have a show called Lunch Breaks, and
the whole idea is to kind of recreate that feel
you had back when you were in high school and
you'd sit around the cafeteria lunch table and talk about whatever
you found interesting, whatever was going on. And so the
show ends up being a mixed bag every day. This
is our first week doing it, but it's been a

(02:05):
lot of fun. We talked about AI. We rated some
robots the other day on their performance. Johnny's got
a nice segment where he tells people to get off
his feed like you'd tell them to get off
your yard.

Speaker 5 (02:17):
Okay, so it's a lot of online life.

Speaker 4 (02:19):
Yeah.

Speaker 5 (02:20):
We will show a lot of X feeds and things
in the news, what's happening, trending. It's an internet show
on Facebook and YouTube. It's on Twitch with no one
watches it on Twitch. I haven't found a crowd there.
But it's fun. It's Scott and I. It's most of
this stuff we talk about in our own direct messages
to each other on X most of the stories. But

(02:42):
we also have some interesting.

Speaker 6 (02:44):
Very smart contributors.

Speaker 5 (02:46):
Contributors that are kind of ne'er-do-wells, but
they give us very interesting what we call transmissions, and
they're usually about AI stuff, a lot of crypto stuff, a
lot of robotics stuff, and then some post to us
some UFO and a lot of metaphysical stuff too.

Speaker 4 (03:03):
Okay, so it's called Lunch Breaks. Lunch
Breaks with Scott and Johnny. What time does Lunch Breaks
with Scott and Johnny come on? Every day, Monday through Friday?

Speaker 6 (03:14):
So it's eleven forty five to twelve forty five Central.

Speaker 4 (03:18):
Okay?

Speaker 6 (03:19):
Then it lives on the internet forever. That's how that works. Yeah,
they can always catch it later if maybe their
lunch break is at a different time than ours.

Speaker 4 (03:25):
So people need to find it where? Deep Fried Studios
on YouTube. It's where the Uncle Henry Friday night shows
used to be on YouTube.

Speaker 5 (03:34):
And then it's on my personal Facebook page. And
there's a page, it's pretty big, that's called Positive Press.
It's got like ten thousand people on it.
It runs on there. So primarily right now, it's.

Speaker 6 (03:48):
Facebook, livestreamed from X today.

Speaker 5 (03:52):
And we had X the first day which actually had
really good viewing today.

Speaker 4 (03:55):
Yes it did. I was one of the people that
popped in and looked a little bit on X.

Speaker 5 (03:59):
That was the first day I did that.

Speaker 4 (04:00):
And is it monetized? Allegedly X rewards video.

Speaker 5 (04:04):
Well, if we ever get a penny out of it,
it'd be great, but we aren't. Look, until yesterday,
I hadn't shaved in a while, Uncle Henry.

Speaker 4 (04:13):
I noticed. I didn't want to ask about it.

Speaker 5 (04:15):
We hadn't broken over ten people watching our shows
Monday through Wednesday, and I said today on the show,
if we get over ten listeners, I will shave
for tomorrow. So we had sixty today. Oh wow. Okay,
now I've got to shave for the first time in
six weeks. Very, very good.

Speaker 4 (04:35):
Now, when I was watching on X, there were at
least eighteen people watching as it was live.

Speaker 6 (04:40):
Yeah, so I don't know exactly. It's hard to know
how many people are actually watching. I'm glad that we
I would rather not know, honestly, because it just doesn't
affect me one way.

Speaker 4 (04:51):
Or the other.

Speaker 6 (04:51):
Right right, Like I'm still most of the show is
us talking to each other about things that if the
microphone was on, we'd be having the exact same conversation.

Speaker 4 (05:00):
One of the best kind of conversations to have. In
our media, that's what people want to hear.
They want an intimate conversation, and that's
what you're giving them. Well, you know, will the ladies watch?

Speaker 5 (05:10):
My wife watches some of it. And I reviewed a
show yesterday, because I had many technical difficulties, because I'm
doing the whole thing of producer and on-air.

Speaker 4 (05:20):
I watched you do that before.

Speaker 5 (05:21):
Very difficult. And Stacey was like, this show gets really dark.
Because the AI stuff we talk about can get
into a lot of dystopian feelings and a dystopian atmosphere.

Speaker 4 (05:39):
That is a wonderful transition to that topic. Yes, and
we're going to talk later in the hour about you
both have a business venture that uses AI, so we'll
talk about that later. But AI, it does get dark,
doesn't it.

Speaker 7 (05:51):
It can?

Speaker 4 (05:52):
Well, this morning, as I'm getting ready for work,
I saw a story of a Google engineer that was
one of the first engineers working on AI for Google,
telling all lawyers and doctors maybe you shouldn't go into
that field because AI is going to take over. The
day before yesterday, I saw a long list of job

(06:12):
titles most likely to be eliminated by AI. There's dark
right there, just the idea of human beings not being
able to do those jobs.

Speaker 6 (06:21):
I mean, one of the things I think a lot
about is, like everything in life, if you want to
find something negative, you will go find it, right? And
if you want to find something positive, you will go
find it, right? This technology is advancing at a pace
that we've never seen before, and regardless of how I
feel about its progress, I cannot stop it.

Speaker 4 (06:42):
Okay.

Speaker 6 (06:42):
So if I can't stop it, I need to figure
out how to live with it and how to best
live with it. And so that's the way I try
and approach AI while also protecting myself from the downside
dystopian version. If I can, I don't know that you can.

Speaker 4 (06:57):
All right. Well, we've got a couple of minutes left
in this segment. Let me start on AI with you.
You just said that it's advancing like nothing we've ever seen before,
but as a consumer user, it still gives me the
wrong answers on things, it does.

Speaker 2 (07:11):
It does.

Speaker 4 (07:11):
So you're saying it's advancing and a lot of us
on the consumer end aren't seeing it. You're seeing it advance.
You're seeing it do things that it couldn't do a few
weeks ago.

Speaker 6 (07:20):
So I would say, if you were to ask a
human a question. Let's say you ask a human doctor
a question, right, what are the odds they're going to
know the exact right answer, right? So it's not a
magic mystery machine. So the way the AI works is
it's pattern recognition. It's just really advanced at pattern recognition,

(07:42):
and it can do so many permutations. It can go
through so many versions so fast that it spits out
information back to you. So it's going to get the
pattern right more often than not. And the more it
trains and develops itself, the more it will become more
accurate with the pattern. And so a lot of it
is about the data that the LLM is trained on,

(08:05):
and not all of them are equal. So when we
say AI, we're using this big kind of word, but
think about the same word as if we said technology. Okay, right,
we'd say, well Apple does this, and Google does that,
and Tesla does this. That's the way you need to
think about AI. Think of it as an industry, and
then there are companies inside of that that are the
big heavy hitters.

Speaker 5 (08:25):
So we can talk about, all right, the
idea of source material: where's the data
coming from that they're using for the patterns? Okay, it's
important too.

Speaker 4 (08:32):
All right, when we come back, more on this, more
on AI, and whatever else. Johnny Gwinn and Scott Tindall
are up to, as The Uncle Henry Show continues here
on News Radio seven ten WNTM.

Speaker 1 (08:46):
Back after the break. News check, traffic and weather check,
opinions check. This is the Uncle Henry Show, only on
News Radio seven ten WNTM.

Speaker 4 (09:10):
This is the Uncle Henry Show here on News Radio
seven ten WNTM. It is five twenty. I
have guests in studio, Johnny Gwinn and Scott Tindall, the
hosts of Lunch Breaks. You can watch them every weekday,
Monday through Friday, on Facebook, on X, on Twitch, YouTube,

(09:31):
all these different social media platforms. So when you talk
about AI to people, civilians that don't use it all
the time, do people even know what it is? Do
people really understand what it is when you're talking.

Speaker 6 (09:41):
about. That's a great question, and I always try and
start with, we need to define what AI is together,
because what I think it is and what you think
it is, if we're not talking about the same thing,
then we're not going to get any progress, okay? And
so then I normally try and meet somebody on their
level of whatever their knowledge of AI is, rather than
trying to get them to full speed all at once. So, like,

(10:04):
let's take my seventy four year old mother for example. Yeah,
she will ask me questions about AI that I can
tell it's her trying to figure out how to use
chat GPT like she uses Google. How do I find
the answer to this? How do I prompt it to
give me that? That's what most people are using it

(10:25):
for right now. Unless you're in the industry of building
a tool, or if you're under the age of twenty five,
you're definitely using the tool differently. And so it's just
like any other piece of machinery. Right, If you have
a hammer, everything looks like a nail, right, So what
you've got to do is use each of the tools

(10:45):
for different purposes or prompt it to act in a
certain way. So I think the best kind of advice
on getting people started is: if you don't know how
to prompt an AI, tell it, this is your task,
please write the best prompt for yourself, and then read
the prompt. And as Johnny was saying,

(11:07):
you can learn from this. It's basically telling
you how it wants to be asked for this information, and
then you just say, run this prompt, and then it does
the work for you to get you there.
The better you can prompt it, the better you can
request what you're looking for, the better the response will be.
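[As an aside for readers following along: the "write the best prompt for yourself" trick Scott describes can be sketched in a few lines of Python. The `ask_llm` callable here is a hypothetical stand-in for whatever chat-model API you use; the toy model exists only so the two-step flow can be run without an API key.]

```python
from typing import Callable

def meta_prompt(task: str, ask_llm: Callable[[str], str]) -> str:
    """Two-step pattern: ask the model to draft the prompt it wants
    to be asked, then run that drafted prompt."""
    drafted = ask_llm(
        f"This is your task: {task}\n"
        "Please write the best possible prompt for yourself to accomplish it. "
        "Return only the prompt."
    )
    # Second call: run the prompt the model wrote for itself.
    return ask_llm(drafted)

# Toy stand-in model so the flow can be exercised end to end.
def toy_model(message: str) -> str:
    if message.startswith("This is your task:"):
        return "Explain kidney stones to a patient in plain language."
    return f"[answer to: {message}]"

result = meta_prompt("explain kidney stones", toy_model)
```

[In practice you would pass a real API call as `ask_llm`; the structure, draft the prompt first, then run it, stays the same.]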

Speaker 4 (11:23):
All right, let's get back to your seventy four year old mother.
How will AI impact her life over the next two
or three years?

Speaker 6 (11:33):
Well, I think, number one, health care for
sure is going to see a lot of it. Yes,
anytime you expect to talk to someone on the phone:
in five years, you're not going to talk to anyone
on the phone.

Speaker 4 (11:42):
Ever. In five years, there will not be human operators.

Speaker 6 (11:46):
There will be the one that you can pay a
premium price for the ultra subscription and you can get
a human. Okay, there will always be a human somewhere
at the end of the road if you pay enough money. Yeah,
But for the average consumer, that's gonna go away. My
mother in the next eighteen months is going to go

(12:06):
through the McDonald's drive through and there will be one
human in the building, correct, running everything.

Speaker 8 (12:12):
Yeah.

Speaker 5 (12:13):
So hotels. When you go to a hotel, yeah, you're
gonna check in with a screen. That's gonna be a
screen of an avatar of a human you think is real.
That's not a real human being. That's something that's been
created by this tool. And you're gonna think you're talking
to somebody who's a real person, and they'll do all
your stuff for you. The concierge will be that way.

Speaker 6 (12:30):
There will be.

Speaker 5 (12:30):
Somebody on the property to handle the big problems. Everything
else will be done with the idea of
a screen and an avatar. If you've ever seen
the movie Minority Report, it is very much Minority Report.

Speaker 4 (12:42):
Well, okay, so you're telling me that for
the average person, like Scott Tindall's mom, AI is going
to take over customer service. Very much, they're

Speaker 6 (12:50):
Going to see that impacted. They're also going to see
their children and grandchildren find it harder and harder every
month to find a job. That's going to impact, like
when you're saying, how is it going to impact my mother? Yeah,
when things start happening to her children and grandchildren, then
she'll see the impacts. Because what we've seen already is

(13:13):
that within the next five years, the conservative estimates say
that seventy percent of entry level white collar jobs will
be gone. So that's entry level, right. And that would
be how many years? Within the next five. Yeah, we'll
lose seventy. Do

Speaker 4 (13:26):
You believe that?

Speaker 5 (13:27):
I mean, this is the case. This is a
very conservative estimate people are saying, yeah. And no one is saying
how to deal with it. And all I've ever
heard is UBI, but it's... it's that, too. Part of.

Speaker 4 (13:41):
The universal basic income. But, well, forget that.
What about governments wanting to tax it? They will.

Speaker 6 (13:48):
They're going to.

Speaker 4 (13:48):
They will, because they're going to lose the payroll taxes
if people are not employed. Yes, they're going to. Before
they figure out UBI, they're going to figure out some
way to tax it, right?

Speaker 6 (13:58):
That's right. I think what you'll see: most of these
AI tools are what we call token-based, meaning you
pay based on your usage, if you don't have a
flat fee. I think what you're going to see is,
like you're saying, a payroll tax on the AI as
though it's an employee. And so there's this thing in
AI called agents, where I can create a

(14:19):
workflow that essentially automates a task for you and then
I can attach all those workflows together to create basically
a humanless business. You have agents that are doing all
the tasks that other people would normally do. I think
what you're going to see is an agent tax. If
you deploy an agent into your business replacing a human,

(14:40):
then you're gonna have to pay a tax on that.
And it's still going to be a lot cheaper for
the business owner than employing a human.
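[Scott's agent idea, automated workflows chained together so each step's output feeds the next, can be sketched abstractly in Python. The agents below are toy placeholders invented for illustration, not any real product's API.]

```python
from typing import Callable

# An "agent" here is just a step that takes the workflow state and returns it updated.
Agent = Callable[[dict], dict]

def intake_agent(state: dict) -> dict:
    state["ticket"] = f"Ticket opened for {state['customer']}"
    return state

def scheduling_agent(state: dict) -> dict:
    state["appointment"] = "Tuesday 10:00"
    return state

def followup_agent(state: dict) -> dict:
    state["email"] = f"Reminder to {state['customer']}: {state['appointment']}"
    return state

def run_workflow(agents: list[Agent], state: dict) -> dict:
    # Chain the agents: each one's output becomes the next one's input.
    for agent in agents:
        state = agent(state)
    return state

result = run_workflow(
    [intake_agent, scheduling_agent, followup_agent],
    {"customer": "Jane"},
)
```

[Attaching workflows together like this is the "humanless business" picture: each agent handles one task that a person would otherwise do.]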

Speaker 5 (14:46):
It also shows up twenty-four seven and works twenty-four
seven.

Speaker 4 (14:50):
But, so, there will be a loss in jobs
and a loss in tax revenue. They'll figure out how
to make up the tax.

Speaker 6 (14:57):
Revenue, they always do.

Speaker 4 (14:58):
Are there, are there... is it ever going to create jobs?

Speaker 6 (15:01):
We don't know.

Speaker 4 (15:01):
We don't know yet.

Speaker 5 (15:02):
Because they also thought the world was going to end
when the Internet became more and more mainstream too, and
so it actually created a whole... Well, if

Speaker 4 (15:08):
I'm smart enough, can't I use it to create my
own company, Mark Cuban said.

Speaker 5 (15:14):
Mark Cuban said, with AI, within the next decade, there'll
be the first trillion dollar company that's
one person in a basement.

Speaker 4 (15:20):
Okay, one person.

Speaker 6 (15:21):
So here's the issue. Yes, it will create jobs for
individuals who create it, correct. But we're no longer going
to have someone that creates a business that employs a
thousand people. That's the problem, because not everybody
wants to create their own job. Not everybody wants to
create their own tool. Not everybody's built for that. Everybody's
got their own skills. And if we were all just alike,

(15:43):
you know, we wouldn't need each other. The problem's going
to come when... So Johnny and I have a new business
that we've started, and we've kind of talked about it. Traditionally,
before AI, it would have cost orders of magnitude more
to be able to do what we do. And then
second of all, we would have had to employ
one hundred people already just to get to this point.

Speaker 5 (16:03):
Just to stay at the point where we are, and it's
just he and I and a developer.

Speaker 4 (16:08):
Briefly, a free plug: what is your business? What is the
business that you've created?

Speaker 6 (16:12):
Now, we have a company called Meducate AI, and I
think a lot of your listeners will find value in it.
What we do is we create custom procedure-based videos
for doctors. And picture this: you go to a doctor's appointment,
they say, oh, Uncle Henry, you've got kidney stones, we're
gonna have to do surgery on you. As soon as
he says we're going to have to do surgery on you,
(16:33):
you just hear Charlie Brown's teacher in your head, right?
So you don't listen to anything else. But during that time,
he gives you the preoperative instructions. He tells you what
to do, how to get ready, make sure you do
this and this at check out. Then when you go
to schedule it, you probably get a stack of twenty
five pieces of.

Speaker 4 (16:49):
Paper. That's all happened, yeah.

Speaker 6 (16:51):
And then you go, okay, what do I do with this? Oh,
read all that, that's got everything you need to know,
all right. We replace that giant stack of paper with
a video. When you're scheduled, say you're gonna have
a kidney stone surgery, the scheduler types in the medical
procedure code that's used for billing, and that automatically
brings up an AI video about that procedure: what you

(17:11):
need to know, before, during, and after, when to call us,
when not to call us, And we do that in
the doctor's own voice. So with just thirty seconds of
a doctor's voice, we can clone their voice to say
anything we want it to say. So the reason this is
so valuable is we can onboard doctors in less than
three minutes, and most medical technology it takes hours of

(17:32):
doctor's time.
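[The scheduler workflow Scott describes, typing a billing code that pulls up the matching video, amounts to a lookup with a fallback. The codes and file paths below are made up for illustration, not Meducate's actual data.]

```python
# Hypothetical mapping from billing/procedure codes to generated videos.
PROCEDURE_VIDEOS = {
    "50080": "videos/kidney-stone-surgery.mp4",
    "43235": "videos/upper-endoscopy.mp4",
}

def video_for_procedure(code: str) -> str:
    # Fall back to generic instructions when a code has no custom video yet.
    return PROCEDURE_VIDEOS.get(code, "videos/generic-preop-instructions.mp4")
```

[Keying the lookup on the code the office already types for billing is what makes the integration cheap: no new data entry step is added to the scheduler's job.]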

Speaker 4 (17:33):
Okay, so this saves time for doctors and provides patient
education, for the patient to understand.

Speaker 6 (17:40):
Yeah, so we can talk about the details, but sometimes
we want to reduce cancellations and no-shows as well.

Speaker 4 (17:45):
All right, we're gonna be back with more from Johnny
Gwinn and Scott Tindall as The Uncle Henry Show continues
after the news break. This is the Uncle Henry Show
here on News Radio seven ten WNTM. Here with

(18:05):
Johnny Gwinn and Scott Tindall. They have a new show
that you can watch weekdays, Monday through Friday, starting at
eleven forty five each weekday on social media. It's called
Lunch Breaks, where they just have great conversations about whatever
they're interested in. I've watched it on Facebook and on
X. You can find it on YouTube and Twitch. Scott

(18:27):
Tindall, Johnny Gwinn... this reminds me
of about twenty fourteen, twenty thirteen. Johnny Gwinn was my
guest in here to explain Twitter to people and how
businesses could use it. And now here
we are all these years later talking about AI. That's right,

(18:49):
you know. And again, the business you've started is
Meducate AI. That's correct. And so far you're spreading the word,
letting people learn about it.

Speaker 5 (18:57):
Yes, and we're learning a lot about building a company. We
just got into an accelerator program called Alabama Launchpad today.
So Meducate AI got into that today. Congratulations. It's the second
one we've gotten into. So it's a pretty big deal.
We're very excited about that.

Speaker 4 (19:11):
But yeah, what.

Speaker 5 (19:14):
Thing I want to tell you, too, about how someone
can use AI.

Speaker 4 (19:17):
Yeah, tell me.

Speaker 5 (19:19):
My wife had some blood work done, and she could
not get her doctor. Even though she had the results,
she could not get... There was something that made
her nervous, right, that she saw on that thing, and
she took that information. Oh, she couldn't get

Speaker 4 (19:34):
She couldn't get.

Speaker 5 (19:35):
Anybody to call her back for days. Yeah. So she took
that information. She put it into this system called Claude
that she used, which is an AI system, ChatGPT is the
same kind of thing, and it read her blood work and gave
her a synopsis of what that possibly could be, if
it's a big deal or not a big deal, and
what to ask your doctor. It happened, by the way, in

(19:56):
five seconds. It took five seconds to do that.

Speaker 4 (19:58):
Okay.

Speaker 5 (19:59):
It also reads legal contracts, so you can understand
things now, terms and conditions. You can now put it
in there and ask it questions instead of reading all
that legalese. That's something you can do right now
with the free version of ChatGPT.

Speaker 6 (20:12):
Okay, you can take a picture and upload it.

Speaker 5 (20:13):
Take a picture and upload it, or a PDF, anything you get.
You can upload it and it will read it for you.
You can ask pros and cons: does this help me?
Does it help the other party? It's amazing what you
can do with all that stuff.

Speaker 4 (20:23):
Yeah. So the average listener who is not using AI
in their business, what do they need to be careful
of in terms of not being taken advantage of, either
by AI or others with nefarious ideas using it? Having a

Speaker 6 (20:39):
Really good one for this, and it is you need
to not assume that whoever you're talking to on the
phone is the person you think you are talking to
on the phone or video or video. But what I
have seen that I think is really helpful is to
have a password that you share with friends and family,
and if you're not sure if it's them or not,

(21:00):
if it's a robo voice, a clone, an AI, then
you can ask it for the family password. And what
I've seen be effective is you choose something like the day
of the week as the family password. So then if
you called me today and said, what's the password, I'd
say Thursday. But if you called
me tomorrow, I'd say the password is Friday. Now, don't

(21:21):
use that one, use one of your own. But I
think that is really helpful. Yes, because as Johnny and
I learned, we can clone somebody's voice with just thirty
seconds of their voice. We can make them say anything.
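[Scott's day-of-the-week family password is simple enough to sketch: the check is just comparing the spoken word to today's weekday name. As he says, pick your own scheme rather than this exact one.]

```python
import datetime

def expected_password(today: datetime.date) -> str:
    # The shared secret is simply the current day of the week, e.g. "Thursday".
    return today.strftime("%A")

def caller_is_verified(spoken: str, today: datetime.date) -> bool:
    # Case- and whitespace-insensitive comparison against today's weekday.
    return spoken.strip().casefold() == expected_password(today).casefold()

# This episode aired August 21, 2025, which was a Thursday.
```

[The point of the rotating password is that a scammer replaying a cloned voice from an earlier day would give yesterday's answer.]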

Speaker 5 (21:33):
I sent you something today, Uncle Henry. You should play that
for your listeners.

Speaker 4 (21:36):
Okay, what did you? You sent me an audio clip?
What is this?

Speaker 5 (21:40):
Well, let's just go ahead and play it. Well, you'd
tell me... so, I know, you can't. I would never
get you in trouble. It would never say anything bad.
There's no bad words. I kind of want you to
hear it without me setting it up.

Speaker 4 (21:51):
Oh, without setup? All right. Yeah, this is audio that
Johnny Gwinn... before I play it: you created this
using AI, created it using two different tools. Yes. Okay,
here we go, listeners. This is Uncle Henry. I've got
a confession that's been burning a hole in my pocket.

(22:12):
Ronald Reagan: wildly overrated in history. Folks, mark my words. Now,
President Obama had... Wait a second, what is this now?
That doesn't sound like me, not a little bit. Oh, you
think it does.

Speaker 6 (22:26):
But it gets a little better.

Speaker 4 (22:27):
Oh really? Oh, what else do I attack? ...a bigger,
brighter impact on America's rep and presidential legacy. That smooth
operator brought some real change in twenty sixteen. What a blunder:
we missed the Golden Age by not electing Hillary Clinton.
She would have led us fast, no doubt. Football time:

(22:49):
Bear Bryant: overhyped legend. Why are you doing this,
Johnny? Alabama, why would you do that? War Eagle, there,
I said it. Now I'm heading home to listen to
my new Taylor Swift album covering Black Sabbath tunes. Rock
on, and I'll see you when I see you. Bye, y'all. Hey,
I would never use that kind of language. All right,

(23:09):
that doesn't sound like me at all, that's not my speech. I
don't talk that smoothly. I'm not... that is nothing
like me.

Speaker 5 (23:18):
Right.

Speaker 6 (23:18):
The good thing is, Uncle Henry, people know that's
nothing like you. Yes. But imagine if we had it
say things that you might actually say, but just altered
a little bit.

Speaker 5 (23:29):
But hold on, So or took.

Speaker 6 (23:30):
Something that you say on the radio, and you say, it
is this way, and somebody adds the word not: you
say it is.

Speaker 4 (23:36):
Not that way.

Speaker 5 (23:37):
So here's the other thing. That is a tool that
we have in our system that we didn't make but we
use, called Eleven Labs, and that is the basic, most basic version.
I could only do about two minutes of your voice.
I didn't have time to do the professional voice, for which
you need to have about eight minutes. But
the idea was, I did that in about four minutes.

(23:58):
So I went and took some audio that we have
from a show that we did together. I took that audio,
I put it into Eleven Labs, did the thing, did my
next thing, did all this other stuff. And then I
wrote something, and it helped me write it better
for the script. I put the script into Eleven Labs,
used your voice clone, and that popped out in four minutes.
Speaker 4 (24:15):
Okay. So if you'd had a little bit more time,
oh yeah, it might have sounded just like me.

Speaker 5 (24:20):
Well, again, what you do is you keep at it. Okay, it's
like a baby.

Speaker 6 (24:24):
The more we generate audio of you, the more it
sounds like you. So it needs more iterations, right, it
needs more tries.

Speaker 5 (24:30):
So think about it. Think about all of these systems
we talk about, these LLMs, large language
models, right? So think of them
as babies. When you start with your LLM, you start
teaching it and sending it sources and sources and sources,
and it starts learning just like a child does, at
a very exponential speed. So right now we're still in the
infancy of a lot of these tools. But it's also

(24:52):
the infancy when you start using the tool, meaning it knows
that voice now from two minutes that I used for
just one or two times. If I do it twenty
or thirty or forty different times and add more of
your audio, it will get better and better and better
at reading the patterns and being able to duplicate those patterns.

Speaker 4 (25:08):
All right, look, so this has shown me
a way that AI could really cause a problem. Very much.
Where, let's say that somebody decides to clone the
voice of a president of the United States, maybe Donald
Trump or whoever the next one is, and they get
good at it where it sounds exactly like them, then

(25:28):
you could make it say something that would crash the
stock market.

Speaker 6 (25:31):
Video generation is getting so good so fast, you.

Speaker 4 (25:34):
Could crash the stock market. You could start a war.

Speaker 6 (25:38):
At this point, do not believe anything you see on
the internet until you get it verified.

Speaker 4 (25:43):
Well, how are we... is our government screening everything to
make sure that it's real? How do we know whether
we're seeing something real from
a foreign leader or not?

Speaker 6 (25:53):
I think that's why they've made AI such a national
security issue. The implications around the ways that can be
weaponized are a lot like the way that we thought
about nuclear weapons in the beginning. You're going to have
the same kind of uncertainty around this technology. And it's
also why there's nothing we can do to stop it,

(26:14):
because as long as the Chinese are building, we're going
to keep building. We're not gonna put guardrails up. Europe
has put guardrails on their AI development, and there's
no way under this administration at least, that that's going
to happen, which is why I predict that in the
midterms AI will be the most important topic in the
midterm election. Whether it's power usage, whether it's water access,

(26:37):
whether it's job protection, whether it's what do we do
for the displaced, how do we retrain or uptrain. I
think that will be... The technology is moving so fast,
and now people are starting to see it. We had ninety thousand layoffs
just in tech over the first six months of this year.
Coders, computer science grads, have the second highest rate of

(26:59):
unemployment, only behind art historians.

Speaker 4 (27:03):
Think of all the people in the last thirty years
that have been told they should get into the.

Speaker 5 (27:07):
Learn to code. Remember that? Yeah, they'll have a job forever.
Everybody learned to code, and now they're the
first ones out. And again, there's that ten percent that
have also jumped on this and are saying, I
don't need thirty people around me. I can build something
with just me, and it could be a lot bigger
than you could ever imagine. There is still that too.

Speaker 4 (27:24):
We've got to take a time out. When we come back,
we'll have a final segment with Johnny Gwinn and Scott Tindall,
if they're really here. You, the listener, keep in mind
this could be AI. Back after the break,

(27:51):
Uncle Henry Show on News Radio seven ten WNTM. Our
final segment now with Johnny Gwinn and Scott Tindall. Their
show is called Lunch Breaks. You can watch it
at eleven forty five in the morning, it lasts an hour, Monday
through Friday, and you can find it on X, on Facebook,
on YouTube, on Twitch. Should they look for Deep Fried

(28:13):
Studios or Johnny Gwinn? Where should they look for Deep Fried Studios?

Speaker 5 (28:16):
For YouTube, my personal page does that, and Positive Press
is the other very big page on Facebook.

Speaker 4 (28:24):
Okay, so tomorrow, do you know what you're going to
talk about tomorrow on your show?

Speaker 5 (28:28):
Well, we did it today. Today was Dystopian Thursday. So tomorrow
is Overly Optimistic Friday.

Speaker 4 (28:33):
Okay, we're good.

Speaker 6 (28:34):
We're trying to return balance to the force. Okay, good. We
got a little dark today. Tomorrow we're going to talk
about what if everything actually goes right?
Instead of worrying about what happens when it
goes wrong, what could the world look like if it
goes right?

Speaker 5 (28:48):
And also, you've done a news show for
however many decades now. You understand: you go in
and you prep about what's in the news and trending. Right? Yes,
so we'll do that. Like tonight, I'll look
through a lot of my sources and feeds that I have,
and then I'll put together an idea of what I'd
like to do, and then in the morning I kind
of craft some things, and then Scott brings me what
he wants. And so every show is very timely.

(29:11):
I don't know how long we can do a daily show.
I'd love to think we can do it for a while,
but we gave ourselves one month. It's a lot of work,
as you know, it is a lot of work to
come up with all that content.

Speaker 4 (29:22):
If you do it right and want to be good at it.

Speaker 6 (29:25):
We're still working on the be good at it part.

Speaker 4 (29:27):
Yes, it's it's tough.

Speaker 6 (29:28):
Yeah. If you watch one of the first episodes and
it feels like nothing's happening, right, just skip the first
five minutes. That's us working, too, because when we stream
it live, you know, we don't know what's going
on until somebody says, I can't hear you.

Speaker 5 (29:43):
"The mic is not on, man." Or we'll show you the same
screen for eight minutes.

Speaker 4 (29:47):
Now, yeah, well, I've enjoyed what I've seen this week.
I was delighted that you're doing it, because you both
have things to say that I'm interested in hearing. I
like to get your takes on different things, so I'm
enjoying it. I hope you keep up with it. It's
called Lunch Breaks, and tomorrow at eleven forty five,
Overly Optimistic Friday. So you're gonna

(30:11):
be optimistic about what AI could bring us, the ultimate
positive. So what if it all goes right? If it
all goes right, I'm gonna guess that we're all gonna
be healthier. We're all going to have our own personal
doctors looking after us twenty four seven, butlers, we're
gonna live longer. What else? Am I going to have
more leisure time? Or are we gonna invent new drugs?

Speaker 6 (30:30):
You're gonna have unlimited leisure time, unlimited, and things are
gonna cost nothing. Basically, just Star Trek. Star Trek, yes, yes,
that's if everything goes right.

Speaker 5 (30:41):
That's everything goes right.

Speaker 4 (30:42):
Could AI make new Star Trek with Captain Kirk
that isn't bad?

Speaker 6 (30:49):
Well, that's debatable. It can definitely make it.

Speaker 5 (30:51):
We're getting there. There's a thing called Veo 3.
Video game users already use it. Independent filmmakers are using
it, because they literally can make the same movie, especially
sci fi and action and all that stuff. It's like
one one hundredth to one one thousandth the cost of
making a film now. Okay, yeah, yeah.

Speaker 6 (31:10):
Netflix has already acknowledged that they used AI instead of
special effects for a whole series.

Speaker 4 (31:15):
Yeah, okay. A lot of people have speculated
that we won't have to wait for
a good movie to come out, that we can make
our own and entertain ourselves with our own movies. Yeah,
if you.

Speaker 6 (31:29):
Want to make a movie about a man riding a donkey,
carrying a frog on his shoulder, smoking a cigarette, you
can make that movie now, yep.

Speaker 4 (31:36):
Yeah, but it needs to be good. That doesn't
sound like it.

Speaker 6 (31:41):
Might be good to you, for you. And also, it's
going to know your personality eventually, and so it's gonna
know what you like and what you don't like. So
back to Johnny talking about when you go to the hotel:
there won't be anybody there to check you in. Yes,
that's gonna be okay with you, because they're going to
scan your iris and it's going to have all your preferences.
It's going to turn the room to the temperature you
want it to be. If you like your cookies hot,
it's gonna make your cookies hot instead of normal. And

(32:03):
it's gonna, uh, you know, draw your bath water, Uncle Henry,
if that's what you're into.

Speaker 4 (32:07):
What if I want privacy and I don't want people
to know my preferences? Again, is that gonna be possible?

Speaker 6 (32:12):
We've already given up our privacy in this case.

Speaker 5 (32:14):
It's gonna be like a social score. You're gonna
have to make a decision: am I gonna go to the, you know,
motel on the highway, or am I gonna do this thing?
That's what life has become. Okay, it's fine, you
have to make decisions all the time. But also, we
live in a part of the world where we take longer.

Speaker 7 (32:31):
To get to those places.

Speaker 5 (32:33):
So you will see it in other cities before it
starts becoming a lot more prevalent down here.

Speaker 4 (32:37):
Okay, we got a minute left. Again, it's Lunch Breaks,
and look for that on social media, weekdays starting at
eleven forty five in the morning, to watch Johnny Gwinn
and Scott Tindall. Got about a minute left in the show.
I won't see you again before the football season starts.
Are you paying attention to Alabama football? Do you have

(33:00):
any predictions?

Speaker 5 (33:01):
I have a new hobby, Uncle Henry. I'm trying to
make money, and I am not looking at anything, and
I've gotten to a point where sports is just not
a part of my life anymore.

Speaker 4 (33:11):
Could you make AI predict the scores of Alabama football?

Speaker 5 (33:16):
I think, if I get some time,
I was going to try to figure out a way
to sit down and do prognosticating, at least the over
under on wins this season. Okay, so that would be
kind of interesting. But no, I don't. I think I
just saw who the starting quarterback was, but I don't
have a clue what's going on up there in Tuscaloosa.

Speaker 4 (33:34):
Will you watch Alabama? Yes?

Speaker 5 (33:36):
Okay, because that's how I spent time with my father,
my family, and it's more important than a football game now,
and I'm glad I'm kind of getting there because.

Speaker 6 (33:43):
We would talk about it every year.

Speaker 5 (33:44):
I've been backing off of that, fretting over something
eighteen year old kids are doing that I've got no control over.

Speaker 4 (33:51):
Scott Tindall, do you follow American football?

Speaker 6 (33:54):
I do follow American football.

Speaker 4 (33:56):
Okay, any predictions? I predict pain for my Auburn Tigers.
All right, Johnny Gwinn, Scott Tindall, thank you for coming
in and talking on the Uncle Henry Show.

Speaker 5 (34:07):
Thank you, sir.

Speaker 6 (34:07):
Always fun.

Speaker 4 (34:31):
It says the Uncle Henry Show here on News Radio
seven ten WNTM. Thank you for listening to the Uncle
Henry Show. Now, in this half hour of show, I'm
gonna get to some news items that I missed. You
may have missed them as well. You can learn together
from the news items. But I want to start with

(34:52):
a voicemail. Yes, a listener has sent me a message
if you would like to leave a message for the
Uncle Henry Show. The voicemail number is two five one, two
one six, nineteen seventy six. That's two five one, two
one six, nineteen seventy six, to leave a message for
the Uncle Henry Show here on News Radio seven ten WNTM.

(35:16):
Now let me go to the voicemail. I got a
voicemail about a recent show. I'd posed the question more
than once this year. Are people getting dumber? Are they?
Or have they always been the way they are now?

(35:38):
And we just have more examples of it because now
we all have video cameras on our phones and we
can now chronicle our daily lives. So maybe people were
always dumb and we just didn't transmit it all the time.
Have people always been like this, or are they getting
dumber? And I've wondered if

(36:03):
the way technology is evolving and the way we use
it is evolving, is that making us less smart because
we don't have to remember the things we used to
have to remember. One example is I used to have
a catalog of phone numbers in my head. Now I
think I can remember maybe two. I don't know. I

(36:24):
at least know my own number. I don't know if
I know anybody else's number. So, is the
technology making us dumber? Or have we always
been the way we are now? I've got an answer
to this. Longtime caller Buford is filing this answer to
my question, are people getting dumber? Or have we always

(36:48):
been like this? And there's just more proof of it?

Speaker 3 (36:54):
Henry, in the podcast you posed a philosophical question:
have these people always been this dumb, or have they just
recently gotten this dumb due to the cellular devices and
the social medias and such, Henry? I have worked with
the public now going on twenty six years, Henry, yes, and

(37:18):
in the Mobile area, yes. And I mean going in and
out of people's houses. And let me tell you this, Henry,
I started working with the public before the social medias. Heck, Henry,
the Internet was just four years old. People were still

(37:38):
on dial up modems and complaining they weren't getting fifty
six K downloads on their fifty six K modems.

Speaker 4 (37:48):
And pausing here just to say that I now feel
blessed that I grew up and learned to exist without
all this technology. Now I love the technology. I do
love the technology with all the bad that comes with it.
It's just great, the things we're able to do. The amount
of knowledge at your fingertips. Think about the,

(38:11):
think about the amount of knowledge at your fingertips. If you've got
a smartphone with internet access, you have the equivalent of
I don't know how many libraries of Alexandria all there
on your phone. I mean, just about anything you would
want to know, you can find it. It's right there
at your fingertips. Now. Are people using it to get

(38:36):
all that knowledge? Some of us are, but a lot
of us are just playing Angry Birds or Solitaire or
something like that on our phone. All right, But back
to the message, Buford establishing the fact that he was
going into homes working with the public before the Internet

(38:57):
got into everybody's house, and way before the Internet got
onto people's phones and in their pockets all day.

Speaker 3 (39:02):
Fifty six K downloads on their fifty six K modems.
But Henry, I can tell you this. People have always
been this dumb, Henry, I mean people in general, A

(39:22):
lot of morons right now.

Speaker 4 (39:25):
And I guess I can't speak for Buford. I
just want you, the listener, to understand: I don't think
you're dumb. In fact, I don't think most people are.

Speaker 5 (39:38):
That.

Speaker 4 (39:38):
Well, okay, I'll say most aren't. I'll say, yes, I'm
feeling very charitable. See, I'm aware of my own limitations,
and that you, the listener, probably count
me among the dumb. But I don't believe that most
people are dumb, though I see strong veins of it.

(39:59):
That there's just strong veins of it running here,
there and everywhere. All right, back to Buford's dissertation on people.

Speaker 3 (40:08):
In general, a lot of morons. It's just that I
believe that you see it more. You know, you can
see on the social medias the morons and idiots
in different parts of the town and country, right? But Henry,
people have always been this stupid. I mean, they didn't just

(40:30):
wake up one day, smart yesterday and dumb today, right?
I mean, there's a lot of these dag on people
that have generational stupidity, Henry.

Speaker 4 (40:42):
All right, okay, I've got to pause there, just
to think back down my family tree. I hope my
family's not listening. Well, you know what, I know, maybe
some of my family is listening. So never mind, I
am not going to review. I'm not going to go
back through my mind and think about my family tree.

(41:03):
I'll say that, I don't know, Buford, a few generations ago, uh,
my family, they were all the employers, and now
my family's mostly employees. But again, I hope my
family is not listening. All right, back to Buford's dissertation;
it's almost over.

Speaker 3 (41:24):
Generational stupidity, yes, but uh no, Henry, people have always
been this dumb. This ain't nothing new.

Speaker 4 (41:36):
You have a good day, Buford. Thank you for clearing
that up for me. So you've gone into people's houses
and you see a continuity of stupidity among some
of us, and that would include me
on a regular or semi regular basis. I will admit
that I've got a vein of stupidity in my mind.

(41:59):
Maybe it's not dumbness that I'm noticing in the population, Buford. Instead,
maybe I'm noticing a societal shift in focus toward narcissism
that is not even a natural narcissism, but it is

(42:20):
almost a taught narcissism, where we are being taught from
a young age about how wonderful we are. And now
you get to a point where adults are throwing
parties to reveal the gender of the baby in the womb,
and just every little teeny tiny thing is celebrated.

(42:43):
I mean, like, you're getting out of kindergarten and we're
gonna have a graduation ceremony for it, that kind of thing.
All right, anyway, so maybe people are just more narcissistic.
All right, look, I've got to take a break
and then come back with more. I'll get to some
news junk next after the break. Let's take the dad

(43:04):
gum break. It says the Uncle Henry Show here on

(43:27):
News Radio seven ten WNTM. And yes, you're welcome. I know
you're enjoying the music. Now, in this segment of the show,
before we're out of time, see, coming up in ten minutes,
we've got news headlines. Before we get to the news headlines,

(43:48):
I got a few news items that I missed. You
might have missed them too. Got a brief story here
about the choo choo train, the Amtrak choo choo train that is
now federally subsidized in different stages, all kinds of taxpayer
dollars going in to provide people with the opportunity to
get drunk on a train. The choo choo train started

(44:11):
up this week, the Amtrak choo choo train between Mobile
and New Orleans, and Fox ten did a little story
on this about how there was a group of senior
citizens from Biloxi that took the train to Mobile. Now, see,

(44:32):
this is the kind of tourism I did not expect.
I did not expect groups of the elderly senior citizens
to take the train on distances that I consider teeny
weenie, which would be Biloxi. To me, that's a teeny weenie drive.
But maybe when I'm older, maybe when I'm in my
nineties or my hundreds, then I'll want to take a train.

(44:56):
So let's listen. Here is Fox ten reporting on the
group of senior citizens that rode the Amtrak choo choo
from Biloxi to Mobile.

Speaker 8 (45:10):
It's been a couple of days since the City of
Mobile welcomed Amtrak back to the Port City.

Speaker 7 (45:15):
And today seniors from Biloxi made their way here for
a special day trip.

Speaker 2 (45:18):
Our Stephen Moody has more.

Speaker 7 (45:22):
With the return of Amtrak to Mobile, the D'Iberville
Senior Center sent dozens of guests to the.

Speaker 4 (45:26):
Port City, the D'Iberville Senior Center. Now, already
I love these people. I haven't even met them, but
I love them.

Speaker 7 (45:33):
They left from Biloxi early in the morning to get
here for a day of fun, and it's a trip
they've been looking forward to for a long time.

Speaker 8 (45:41):
I was happy.

Speaker 4 (45:42):
Lady.

Speaker 8 (45:42):
I'm glad that it's back, and I'm glad we'll be
able to travel in between.

Speaker 4 (45:46):
By the way, I can't show you video, unfortunately. Radio.
We're working on it. We're working on video, but still
unable to provide video on the radio signal. But this
elderly man they're interviewing, he looks exactly like me. He
does, if I took off my wig and wore a
T shirt. This guy looks exactly like me, this Biloxi senior.

Speaker 8 (46:11):
And I'm glad we'll be able to travel in between
the cities and go and have a good time at
each town because I've been riding on a train since
I was a little.

Speaker 4 (46:18):
Kid. And enjoying Mobile.

Speaker 8 (46:20):
I lived over there, enjoyed the people, enjoy the city
and the food.

Speaker 7 (46:25):
They spent the day enjoying the sights and sounds of
Mobile. And there.

Speaker 4 (46:28):
By the way, now there's video of the elderly being
gingerly helped off the train. It looked like it might
have taken them about an hour to get everybody off.
That's the speed at which it needs to be done,
and you got to be careful. You don't want to
break nothing or pull nothing when you're getting off the train. Exactly.

Speaker 7 (46:44):
The director says Mobile was a no brainer.

Speaker 2 (46:47):
We love Mobile. You talk about any city that's
growing in the Southeastern United States, Mobile's right at the
top of that list. I mean, there's so many restaurants,
so many shopping experiences, museums. It's one of our favorite
places to

Speaker 7 (47:01):
Go, including that.

Speaker 4 (47:02):
I like that guy. Thank you, Biloxi, for providing that cheerleader.

Speaker 7 (47:07):
Dinner at Morrison's, the last location left in the country.

Speaker 2 (47:10):
Actually, they're going to go to Morrison's, which
I understand is the last Morrison's standing. So as excited
as they are about taking this train, they're also excited
about going to Morrison's in Mobile.

Speaker 4 (47:21):
You know, this is why we need
more senior citizens involved in media. I'm
tired of seeing young people on TV. I
want people like this talking about cafeteria style dining. Morrison's.
They're excited about Morrison's. I want more seniors, more senior

(47:43):
citizens in the media. Tired of all these young people
with their dad gum young ideas and things.

Speaker 7 (47:50):
And for the first timers, they said, the experience was
a pleasant one and everyone should give it a try.

Speaker 3 (47:55):
It's very nice.

Speaker 5 (47:56):
You don't have to fight traffic, you don't have to
do any of that.

Speaker 3 (47:59):
Just step on in Biloxi and get off in Mobile,
and you're there. I mean, I enjoyed it. I
just kicked back and enjoyed the ride.

Speaker 4 (48:07):
And I would say, go.

Speaker 5 (48:09):
And do it if you ever are thinking about it.

Speaker 4 (48:13):
Steven Moody, Fox ten News. That was outstanding. Thank you,
Fox ten. Thank you for bringing me seniors. In fact,
I wish they'd put together a bunch of these people
I wish they'd put together a bunch of these people
and just have them around and go to the seniors
and get their reaction to the day's news. Just whatever's
happening in the news. Just have about five or six

(48:36):
of these people there in the news place, Fox ten's news set.
Just set them up on some couches and stuff, and
every now and again check on over to those people
and say, hey, what do you think? I'm serious about this.
Hey, if you're somebody out there with a venture

(48:58):
capital group, private equity, looking to invest in
some type of something, put some money into senior citizens
in the media, please, because as an older American, I
want to hear and see elderly people like me talking
about things that they are concerned about. All right, just

(49:21):
about out of time here. You can listen to previous
episodes of the Uncle Henry Show as a podcast on
the iHeartRadio app. Just look for Uncle Henry Show on
the iHeartRadio app and set a preset in the app
for the show. That way, you won't miss any frustrating episode.
Or you can find the Uncle Henry Show podcast on iTunes,
or you can go right to our website, NewsRadio seven

(49:43):
ten dot com and find the podcast right there. All right,
thanks for listening. As they say in Saraland, have a
good one, and as they say in Theodore, take it
easy.

Speaker 3 (49:57):
All right. Later.