December 5, 2025 58 mins

Tech got uncomfortable—and tonight, we are going right into the heat. This is The JMOR Tech Talk Show with your host, John C. Morley, Serial Entrepreneur, Engineer, Marketing Specialist, Video Producer, Podcast Host, Coach, Graduate Student and lifelong learner. In this episode, “Tech Got Uncomfortable: AI Heat, Power Grabs & Layoffs (S4) S50,” I am unpacking how AI is straining power grids, bending policy, squeezing jobs, and even rewriting who gets to hold the controls in our digital world. From Europe taking aim at Apple, to robots rebuilding ancient art, to layoffs at household-name companies, this is the side of tech that does not fit into shiny promo videos—but absolutely affects your life, wallet, and future.​

1️⃣ 💡 Apple vs EU

When your ads and maps start looking more like public infrastructure than just “apps,” regulators sharpen their knives and step in. Apple’s ad and maps ecosystem in Europe is now triggering serious questions about gatekeeper power, data control, and whether one private company should be allowed to sit at the crossroads of so much of people’s daily digital lives. This fight is not just about fines; it is about who gets to set the rules for the roads everyone else has to drive on.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:21):
Well, hey guys.
Happy Friday to everyone.
It's great to be with you.
This is December 5th, 2025.
It is the first and last Friday, definitely,
of December of 2025.
So, great to be with you guys.
You're tuned into The JMOR Tech Talk

(00:43):
Show.
Thanks so much for being here.
If this is your very first time, I
want to say thank you and welcome.
And if you're coming back, I want to
say a big thank you to you and
a big welcome back.
Of course, this is the show that talks
all about technology and gives you insights to
what you need to know so you can
stay protected and so you can stay educated.
Well, it is so great to be here.

(01:04):
If you're thirsty, feel free to go grab
yourself some RO water or something like that
or a snack or a beverage.
Don't want you to be parched or, let's
say, famished or someone that hasn't eaten in
a while.
So definitely do that and head on back
to the show.
You know, guys, don't forget to check out
BelieveMeAchieve.com right after the show.
It's available 24 hours a day.

(01:25):
It's where you can get my short-form
content, my long-form content, and all kinds
of other great content like reels and articles.
So tech got uncomfortable.
And tonight, we're going right into the heat.
This is The JMOR Tech Talk Show
with me, your host, John C. Morley, serial entrepreneur, engineer, marketing
specialist, video producer, podcast host, coach, graduate student,

(01:51):
and above all, yes, a lifelong passionate learner.
In this particular episode, tech got uncomfortable, AI
heat, power grabs, yep, and layoffs.
It's Series 4, Show 50, and I'm unpacking
it all here, how AI is straining power
grids, bending policies, squeezing jobs, and even rewriting

(02:14):
who gets to hold the controls in our
digital world.
From Europe taking aim at Apple to robots
rebuilding ancient art to layoffs at household companies,
this is the side of tech that does
not fit into shiny promo videos, but absolutely
affects your life and my life every day,

(02:36):
our wallet and, of course, our future.
So let's dive right in, guys.
If you have your beverage or your snack,
feel free to chill back and enjoy the
show.
So, guys, number one, Apple versus the European
Union.
When your ads and maps start looking more
like public infrastructure than just, quote-unquote, apps,

(02:59):
regulators sharpen their knives and, well, they step
in.
Apple's ad and maps ecosystem in Europe is
now triggering serious questions about gatekeeper power, data
control, and, yes, whether one private company should
be allowed to sit at the crossroads of

(03:20):
so much of people's daily digital lives.
This fight, guys, is not just about fines.
It is about who gets to set the
rules for the roads everyone else has to
drive on.
It's basically what do the big guys get
to control, and it's quite a bit.

(03:42):
So the Pompeii robot, imagine an AI-driven
robot acting like, well, the world's most careful
puzzle solver.
Speedrunning 2,000-year-old jigsaw pieces that
used to take humans years to reassemble.
That is what's happening in Pompeii, where robotics

(04:05):
and AI are teaming up to match broken
fresco fragments without grinding or scratching priceless art.
Tech here is not replacing humans, guys.
It's amplifying them.
Letting archaeologists spend more time interpreting history instead
of crawling on the floor, hunting for missing

(04:27):
corners.
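As a side note for the technically curious, the core idea behind that kind of fragment matching can be sketched in a few lines. The code below is a minimal, hypothetical illustration, not the actual Pompeii project's pipeline; it assumes each fragment edge has already been reduced to a numeric feature vector and simply ranks candidate pairs for a human (or robot) to verify.

```python
# Hypothetical sketch of fragment matching: rank candidate fresco-fragment pairs
# by edge-profile similarity so the most promising fits get checked first.
# The feature vectors and threshold are illustrative assumptions only.
import numpy as np

def edge_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two edge-profile feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def rank_candidate_pairs(fragments: dict, threshold: float = 0.9) -> list:
    """Return likely-matching fragment pairs, best first, above a similarity cutoff."""
    names = list(fragments)
    pairs = []
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            score = edge_similarity(fragments[names[i]], fragments[names[j]])
            if score >= threshold:
                pairs.append((score, names[i], names[j]))
    return sorted(pairs, reverse=True)

# Toy usage with made-up 8-dimensional edge profiles.
rng = np.random.default_rng(0)
frags = {f"frag_{k}": rng.normal(size=8) for k in range(5)}
frags["frag_5"] = frags["frag_0"] + rng.normal(scale=0.01, size=8)  # a near-perfect fit
print(rank_candidate_pairs(frags))
```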
I think that's a real important thing, guys.
I think a lot of people don't realize
how technology is not replacing humans.
It is augmenting them.
And see, the ones that believe it's replacing
are the ones that are going to miss
the huge opportunities in life.
And these are the opportunities that I think
are going to take you to completely new

(04:48):
levels in life.
I think that's a very, very important thing.
And I know a lot of you are
saying, well, John, you don't understand my position.
No, I understand your position.
You need to be looking at things from
a different perspective.
Now, again, this is not hard to do,
but I get why some people get thrown
for a loop, saying, oh, my gosh,
what am I going to do?

(05:08):
How is this going to work for me?
First thing you need to do is chill
out, right?
Like that one chip company told, what was
it, Wall Street recently.
You need to chill out.
Maybe we'll talk more about that later.
But I think these are things that a
lot of people don't understand in life.
And these are things that I think are

(05:28):
causing, well, they're causing big challenges for people.
And they're really starting to frustrate people because
they don't know where they're supposed to go
or what they're supposed to do.
Hot data centers.
AI is so thirsty right now for compute
time that data centers are turning into power
-hungry furnaces.

(05:49):
That's right, guys.
And cooling them is fast becoming its own
set of energy crises.
You know, the days of using a paper
fan are not going to hack it here
at the large data centers.
Some facilities are now burning a staggering share
of their total energy just to keep racks

(06:09):
from overheating.
Pushing operators toward exotic cooling methods and massive
new infrastructure deals.
This is the hidden cost behind every magic
AI feature.
If we do not innovate on efficiency, the
power grid becomes, well, the bottleneck.

(06:30):
And that becomes everyone's downfall.
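To put "a staggering share" in rough, illustrative numbers: data center operators track Power Usage Effectiveness (PUE), total facility power divided by the power that actually reaches the IT gear. The figures below are assumptions for the sake of the arithmetic, not measurements from any real facility.

```python
# Illustrative PUE arithmetic with assumed numbers (not real facility data).
it_load_mw = 50.0          # power doing actual compute
cooling_mw = 20.0          # chillers, fans, liquid-cooling pumps
other_overhead_mw = 5.0    # lighting, power-conversion losses, etc.

total_mw = it_load_mw + cooling_mw + other_overhead_mw
pue = total_mw / it_load_mw          # 1.0 would mean zero overhead
cooling_share = cooling_mw / total_mw

print(f"PUE = {pue:.2f}")                      # 1.50 with these assumptions
print(f"Cooling share = {cooling_share:.0%}")  # about 27% of every watt drawn
```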
I think that's something a lot of people
get scared about because they don't realize that
this is something we need to pay attention
to.
This is not some hype story that's trying
to get press on the news.
This is real-world life, guys.
And I think sometimes people, when we hear

(06:53):
these things, we think it's all about propaganda.
And some is.
But I think this is important for us
to understand what we need to be doing.
And what we need to be doing is
asking ourselves, how do we make these changes
in our lives?
And how we make these changes in our
lives, well, guys, that's a very important question

(07:15):
we should be asking.
What resources do we need to evolve?
What resources do we need to change?
And technology is different, guys.
It's not the way it used to be.
It's morphing.
It's growing.
It's changing.
And it's even starting to affect the job
market because the jobs that were there yesterday,
guys, well, they're not there anymore.

(07:37):
I'm not saying there aren't jobs, but I'm
saying that the kind of jobs that are
there, well, they're just not there.
And in order to get ahead of the
curve, I think everybody's got to realize one
thing.
They've got to realize that life is changing.
And if you don't embrace the daily change,

(07:58):
well, you're going to be left behind.
So don't be complaining about AI and how
it's hurting or harming you.
Why don't you figure out how you can
embrace it, how you can grow with it,
how you can use AI as a tool
and stop complaining about, oh my gosh, what's
going to happen with AI?
First thing is keep a human in the
loop all the time.
I've always said this.

(08:18):
AI is not meant to run like on
autopilot.
Even though some people will say that, it's
not meant to run on autopilot.
I think that's a hard thing for some
people to realize.
Most people just see the dollar signs.
And Trump's AI orders, the new AI executive
order is being sold as rocket fuel
for American innovation.

(08:38):
But the fine print points to a future
where your electric bill, well, it may quietly
help pay for it.
By unlocking more federal data and encouraging a
massive new build-out of AI labs and data
centers, the order could accelerate AI breakthroughs while
piling even more demand onto already stressed energy

(09:03):
systems.
I think this is something that a lot
of people don't want to hear because like,
oh, I don't want to deal with this.
I don't want to hear this.
And maybe you don't want to hear this.
I get it.
But I think we need to be truthful
about what's going on in our lives.
And we need to understand what's happening.
And if we understand what's happening, then only

(09:24):
then we can make a huge, huge difference.
That's important to understand.
It's very important to understand because if we
don't understand it, then suddenly it's like, I
don't know, we get into a loop.
And so this loop could be something that

(09:44):
not just throws us for a moment, but
something that really could impact the way we
execute things in our lives.
And you might say, John, no, I'm too
small for that.
But you're thinking too small.
Don't think too small.
Think big.
And when you think big, you're going to
see a lot of changes are actually happening.

(10:06):
I think that's a very important thing to
realize and, yes, guys, to understand.
Yeah, to understand what that is and how
that's going to play in our world, right?
How that's going to play in our world.

(10:27):
And I want to tell you guys, this
is something everybody needs to understand, okay?
Everybody needs to understand this and know that
it's about change in our life, okay?
Changes in our lives.
And changes in our lives happen based on

(10:50):
the microcosms out there, right?
And these microcosms are something that I believe
a lot of people, well, they don't quite
embrace it right away.
And I feel that's something that a lot
of people have to be willing to understand,

(11:13):
okay?
And if you're willing to understand it, then
maybe you're able to, like, get a glimpse
of, like, what's going on in our lives.
And it's something pretty amazing, guys.
Something very, very amazing that is starting to
transpire.

(11:33):
So the question is not
just can we do it, but who pays
the tab when the meters start spinning?
That's a very deep question.
And Hawaii's birds. In Hawaii, AI is not
chasing clicks, guys.

(11:53):
It's listening for survival.
Yeah, there's a new set of bird seed
for you to eat.
Models are being trained to recognize endangered bird
calls across huge soundscapes.
So conservation teams can

(12:13):
protect the last strongholds before the forests go
silent.
It's a rare glimpse of AI as a
quiet guardian, one that might help save species
most people will never see.
But still deserve a place on this planet.
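For a rough sense of what "listening for survival" means in practice, here is a toy sketch: chop an audio clip into frames, build a spectrogram, and hand it to a classifier. The classifier below is only a placeholder rule with made-up thresholds; real conservation systems train dedicated models on labeled recordings of each species.

```python
# Toy sketch of an acoustic-monitoring pipeline. The "classifier" is a placeholder
# rule with made-up thresholds; real systems use trained audio models.
import numpy as np

def spectrogram(signal: np.ndarray, frame: int = 512, hop: int = 256) -> np.ndarray:
    """Magnitude spectrogram via a simple short-time FFT."""
    windows = [signal[i:i + frame] * np.hanning(frame)
               for i in range(0, len(signal) - frame, hop)]
    return np.abs(np.fft.rfft(np.array(windows), axis=1))

def classify(spec: np.ndarray) -> str:
    """Placeholder: a real model would output per-species probabilities here."""
    high_band_energy = spec[:, spec.shape[1] // 2:].mean()  # hypothetical cue
    return "flag for review" if high_band_energy > 0.5 else "background"

# Fake one second of forest audio at 16 kHz: low noise plus a 6 kHz whistle.
sr = 16_000
t = np.arange(sr) / sr
clip = 0.02 * np.random.randn(sr) + 0.5 * np.sin(2 * np.pi * 6000 * t)
print(classify(spectrogram(clip)))
```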
And I think that's something that a lot
of people, I don't know, they get kind
of pushed back on it.

(12:35):
Why?
I think they get pushed back because they
don't necessarily know, like, why something is going
to go a certain way.
Or how something is going to happen.
And I think that's an interesting thing.
And how is this going to matter?
I think it's going to matter because what

(12:55):
we choose to do and how we choose
to act is going to influence everything around
us.
I'm not just talking about things that are,
like, right in our presence.
I'm talking about things in the entire world,
the entire ecosystem.
I think that's a very, very important thing
to understand.

(13:17):
And that's important, guys.
I mean, really, really, really important.
I think most people don't understand why things
go a certain way.
Like, why?
I think the why happens because people get

(13:42):
confused.
They get lost.
They get, well, they get taken on a
different path because of some propaganda, guys.
Some propaganda, all right?
And that's a very, very huge problem, I
think, that is starting to circulate in our
world.
And NVIDIA and, quote, unquote, the bubble, yes.

(14:05):
People keep whispering AI bubble.
But NVIDIA is still selling every high-end
chip it can make and then some more.
Their hardware has become the backbone of artificial
intelligence infrastructure, from startups to hyperscalers.
And demand continues to outrun supply as companies

(14:28):
race to build bigger models and data centers.
If this is a bubble, it is one
being inflated by trillions in planned investment.
And right now, NVIDIA is getting paid to
pump the air.
That's a very interesting predicament that they're putting

(14:49):
themselves into.
And I think they're doing this – basically,
I think they're doing this on purpose.
I get that it's about money, guys, but
it's more than money.
It's about who gets to hold the power
to control the switch.

(15:10):
And I think that's what we've been learning
a lot, is that power is important.
But controlling the switch and the right switch
and being able to make the decisions for
how the switch gets even controlled and when,
that's a big, big thing.
And solar storms and planes – one angry

(15:32):
burst of solar radiation flipping a single bit
in aircraft systems can turn a smooth flight
into, well, an unexpected emergency landing.
Recent concerns over software vulnerability to cosmic rays
have pushed regulators and manufacturers to revisit hardware,

(15:53):
firmware, and fail-safes on popular jet families.
When the sun itself becomes a cybersecurity-style
threat, resilience is not a buzzword.
It is a safety requirement.
At 35,000 feet in the air, I
think they need to do something about it

(16:14):
because this is a big problem.
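One classic defense against a single flipped bit, whether from a solar particle or anything else, is redundancy with voting: keep multiple copies of a critical value and let the majority win. The snippet below is a generic illustration of that triple-modular-redundancy idea, not how Airbus or any avionics supplier actually implements it.

```python
# Generic triple-modular-redundancy (TMR) sketch: three copies of a critical value
# are voted bit by bit, so a single flipped bit in one copy cannot win.
# Illustration only; certified avionics use their own hardware/firmware mechanisms.
def majority_vote(a: int, b: int, c: int) -> int:
    """Bitwise majority of three integer copies of the same value."""
    return (a & b) | (a & c) | (b & c)

altitude_ft = 35_000
copies = [altitude_ft, altitude_ft, altitude_ft]

# Simulate a radiation-induced upset: bit 12 flips in copy 1 (35,000 becomes 39,096).
copies[1] ^= 1 << 12

recovered = majority_vote(*copies)
assert recovered == altitude_ft
print(f"corrupted copy: {copies[1]}, voted value: {recovered}")
```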
So how is United dealing with the threats
from the sun rays?
I mean I'd like to know that.
I would love to get them on the
show.
United Airlines, like other carriers, manages solar threats
by adjusting flight paths and altitudes based on the

(16:35):
NOAA space weather forecast to avoid communication blackouts
and radiation spikes, especially at high altitudes and
latitudes.
And it's addressing recent Airbus-specific software vulnerabilities
in its A320 fleet to protect against data
corruption from solar events aiming for stronger safety
systems despite short-term flight disruptions.

(16:56):
Now, key strategies and responses on this are
monitoring and rerouting.
So United uses data from the National Oceanic and
Atmospheric Administration to adjust altitudes and routes to minimize communication interference on GPS and
radio, and radiation exposure for passengers and crew.
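As a simplified picture of that monitor-and-reroute logic, the sketch below maps a planetary Kp index reading (NOAA's 0-to-9 geomagnetic-storm scale) to a conservative suggestion. The thresholds and wording are made-up assumptions for illustration; real dispatch decisions weigh many more inputs.

```python
# Toy dispatch helper: map a geomagnetic Kp index to a conservative routing note.
# Thresholds and actions are illustrative assumptions, not airline policy.
def space_weather_advice(kp_index: float, polar_route: bool) -> str:
    if kp_index >= 8:
        return "avoid polar routes; expect HF radio blackouts; consider lower altitudes"
    if kp_index >= 6 and polar_route:
        return "reroute to lower latitudes; brief crew on possible GPS/HF degradation"
    if kp_index >= 5:
        return "monitor closely; no reroute required yet"
    return "normal operations"

# Hypothetical reading: Kp 7 on a planned polar crossing.
print(space_weather_advice(kp_index=7, polar_route=True))
```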

(17:16):
Now, addressing Airbus vulnerabilities following recent incidents where
intense solar radiation potentially affected flight control data
on Airbus A30, 320, excuse me, and 321s,
United is implementing required software and hardware fixes to
harden these specific aircraft systems.
Climate risk management, United integrates climate-related risks,

(17:39):
including space weather, into its corporate governance program
using scenario analysis to guide strategies for operational
resilience and sustainability as outlined in their TCFD
reporting.
So that's another interesting one.
So what is TCFD for United?
Well, they love these acronyms, don't they?
It stands for the Task Force on Climate

(18:04):
-Related Financial Disclosures Framework, which provides investors and
stakeholders details on how they can govern, strategize,
and manage and measure climate risks.
Wow, that is a mouthful there, guys, a
very, very big mouthful.
And so they know about this risk, but
I think they've got to do more.

(18:25):
They've got to be proactive about it.
And who regulates AI?
Big tech would love a single friendly referee
in Washington.
What they do not want is 50 state
-level refs throwing different flags on AI practices.
Yeah, it'll drive them nuts.
With Congress slow to act, states have rushed

(18:47):
ahead with their own rules on deep fakes,
transparency, and government use, triggering an all-out
lobbying sprint to claw that authority back to
DC.
At stake right now is not just regulation.
It is who gets to write the rulebook

(19:08):
for the algorithms shaping our lives every single
day.
And I think when we talk about this,
you're probably like, oh, John, this isn't a
big deal.
Well, it is a big deal.
And the reason I say it's a big
deal is that if you don't think it's
a big deal, then you've got to be
sleeping under a rock.
I mean, I'm just being very, very honest

(19:29):
with you about that.
I think that's a very, very important thing
to understand.
When I say you're sleeping like a rock,
I mean that.
You're sleeping under a rock, basically.
And so the more we think about these
opportunities, the more we think about different kinds
of projects that are out there, I think
we have to be cognizant of what's in

(19:51):
control and how it is controlled and by
whom.
So chatbots and reality.
When chatbots can bend someone's sense of reality,
the off switch stops being a convenience and
starts being a safety feature.

(20:13):
So some systems have already been quietly tuned
down after users reported emotional overattachment, manipulation or
confusion about what was real.
As AI agents get more persuasive, we are
forced to ask not just what can they

(20:36):
say, but what should they be allowed to
say?
And how hard should they be allowed to
push?
These are some very, very thought provoking questions.
And I think not only are they raising
questions about money, about resources, but they're really
starting to tip the scales on our safety.

(21:00):
You know, guys, AI power grab is –
it's not just automating tasks.
It's automating who gets the power and the
profit when labor decisions and even creativity get
handed to algorithms.
We saw this with – what was the

(21:22):
company?
I think it was Mandalay started using some
AI-driven system to do their video production.
Corporations and governments that own the models and
infrastructure are consolidating leverage, while the rest of
us, well, risk becoming just end users.

(21:45):
In a system we did not design.
If we are not careful, the real disruption
will not be to the jobs.
It will be to who has a say
and how the future actually works.
These are some pretty big things that are
happening.
I think as we think about this, more

(22:07):
people are going to wonder, like, why is
this happening?
And bottom line, it's all about money.
I think sometimes people don't
get it.

(22:33):
I think sometimes the only time people get
it is when there's an emergency.
But they don't get it other times.
If we don't get what's going on, then

(22:55):
I feel we've got to at least understand
where things are going.
And by taking a conscious effort of choosing
to understand how technology not only works, but

(23:16):
how it's leveraging resources in our world, and
I'm not just talking about private resources.
I'm talking about public resources, resources that, if
misused, could cause a huge problem.
Now, there was a very interesting thing that

(23:36):
happened.
We had a company that you probably know.
There are three letters.
They're a power company.
They sell power backup units.
And something interesting happened.
The company was rebranded.
So when did they get rebranded or was

(24:02):
they bought out?
So here's what happened.
It was October 2006, excuse me.
They announced an acquisition of the company with
the deal officially closing and completing in February
of 2007.
Now, what was a little bit crazy about

(24:26):
this is that the buying company is not in
the United States.
The company that was purchased, they're called
APC, in case you're wondering.
APC has been known for years.
And the buyer was Schneider.
So Schneider Electric is in – where do

(24:48):
you think they are?
Well, their headquarters?
Yes, they're a French multinational corporation headquartered in
Rueil-Malmaison, France, specializing in energy management and
automation with deep French roots dating back to
1836 and numerous facilities including factories and innovation
hubs across the country.

(25:09):
So what happened with this merger?
So after this Schneider merger, the parts being
used were not the same.
And so the reason I want to say

(25:30):
this, it's because of the acquisition, okay?
Such as Square D, Legrand, Invensys, AVEVA, and
L&T.
So they have a product – I just

(25:52):
want to give you a heads up.
It's called a 1050 backup unit.
Now, the 1050, BN1050 backup unit is their
one unit, but then they have a 1050
– I think they make a 1050M2, okay?
And so the thing about this is that

(26:17):
the 1050, they have like a – it's
like an upgraded unit.
So the older unit, you had to basically
plug in with – you had to plug
it in – you had to use the
wire to plug in the battery, okay?
And so they still – they call it

(26:38):
the APC 1050M.
And so the 1050M, there is a better
model out there.
So you're probably saying, John, why do I
even care about this model or a better
model?

(26:58):
So let's be honest about this.
So the reason I care about it is
you guys know that I own a tech
company now for over – it'll be over
33 years this year.
Very grateful for that and the clients we
have.
But the reason I want to bring this
to your attention is because something very disturbing

(27:21):
happened, and I want to share this with
you.
Now, the company did finally make good on
what happened, but it was not a simple
– it was not simple.
It took time to get this resolved, and
it took mentioning that, hey, we might be pulling out all

(27:42):
stops, including filing complaints and taking out, let's
say, a complaint in court.
Well, I'll tell you this.
After that all happened, they were very receptive.
So let me tell you what happened, okay?
So it's called the – basically, it was

(28:04):
the APC.
It's the APC 1050, okay?
The APC 1050.
And so if you look at their website,
their website now is – well, it's not
as easy to navigate as it was before.
I mean it was a lot easier.

(28:27):
And so the thing is, if we look
at the site now, it's UPS for home
and office.
So it'd be under the UPS in the
Back-UPS Pro line basically.
And so the thing about this, which is

(28:48):
very, very interesting, is so this is the
1,050.
They make a 1,050 VA unit, which is 600
– basically 600 watts.
Now, here's the interesting thing.
When I looked to see if the product

(29:08):
that we have is still being sold by
them as even being out there, from what
I saw, it's not.
So I bring this to your attention right
now because I want you guys to be
aware.
So the 1050 – it's actually the BN1050M
is the line.

(29:31):
And I think they still have it listed
actually.
I'm sorry.
They do have it still listed.
The 1050M.
Now, they have recently upgraded the 1050M with
a newer model.
So here's what happened with the 1050M.
So we installed some units in a client's
home because we did some build out for

(29:54):
them, and we did like a mesh system.
And the thing that was very interesting is
that we put one unit in on the
main router.
I think it was like in August, something
like that.
And about a month and a half, two
months ago, it starts to make a loud

(30:17):
screeching noise.
And so with that loud screeching noise, sometimes
you get that it could be a bad
battery.
Now, the place had their own – the
house had their own generator, which is pretty
cool, right?
Unfortunately, after it had like two or three

(30:38):
power hits – and again, it could be
a couple seconds.
Now, when the generator comes on, it usually
takes about 30 or 45 seconds until it's
fueled up and all that good stuff, right?
But I want to tell you that the
screeching didn't stop.
And not only did the screeching start,
it actually smelled like something burning.

(31:00):
Now, they use flame-retardant batteries, which is
great.
I mean, I'm glad they do all that.
But the battery is still melting.
What I think actually happened is that these
particular units – so there was the BN

(31:21):
– it was actually the BN1050.
And so they have a model that they
agreed to replace it with, the 1350M2.
Now, the 1350M2 is a much better unit.
It's a much better product.
So this happened not just once, guys.
It happened multiple times.

(31:45):
The 1350, by the way, is a great,
great unit.
Really, really good unit.
And I own quite a few of them,
and I've given them to clients.
We wanted something that was just going to
be – we wanted something that was going
to basically be easy, simple, right?
Nothing that was going to be too crazy.

(32:08):
And so this is why we went with
the smaller ones.
Now, this one is 1,350 VA.
It has 10 NEMA outlets on it.
Four of them are Surge, in case you
were happening to wonder about that.
And so they agreed to replace the 1050
with, yes, the BN1350M2.

(32:37):
And this product, guys, is a huge difference.
And, by the way, it's a lot more
money.
So it also has two USB smart charging
ports on it, types A and C.
It has AVR correction for voltage variations to
provide reliable power, which the other one did
too pretty much.

(32:58):
But the thing that's very, very interesting about
this is that once we had the call
with them and we explained what went on,
I think what they're doing – I can't
prove it, but I mean this makes sense
because we're smelling something that was burning, right?
I think when they got a couple of
the power hits, and they weren't even like
hard power hits.

(33:19):
They were just the power going off, power
coming back on from the generator.
They were probably using a lower quality, let's
say, capacitor and components.
And so because of this, because of this,
the unit would just pop.

(33:40):
I think that's a huge, huge problem, guys.
And so now that we know a little
more about this, so they agreed to replace
the unit free of charge.

(34:02):
So this happened on a couple units, and
I want to tell you, so on just
the 1050s, it was having an issue where
you had to literally connect the wire.
To change the battery on the BN1350M2, you

(34:23):
lay the UPS on its side.
You press the tabs to open the door.
You disconnect the wire.
You lift the battery out.
You connect the new battery, red wire.
But now, supposedly, the new ones just
slide in, and there is no wire to connect.
You just literally pop it in.

(34:50):
So you take the case out.
You pop the battery out.
And once you've done that, again, it's pretty
easy to do.
Once you've done that, it's very easy.
And then you pretty much just connect the
thing, and that's basically it.

(35:10):
Now, on one of the models, you connect the
lines.
On the other one, you don't have to.
You can just flip the battery in.
So, again, very interesting how this is done
and how the one we had kept having
all these problems.

(35:30):
All right.
And so what I want to tell you
is that the 1050M, it's got problems.
Okay.
I had to log the calls.
And so don't buy the BN1050, okay?

(35:53):
Again, you lay it on its side.
You press the tabs to open the battery
door.
You lift the old battery out to disconnect
its cables.
So the new one, you connect the cables,
obviously red to red, black to black.
Close the door for a simple plug-and
-play replacement.
But now they've got them where you can
just flip the batteries over.
That's a pretty cool way.
But I got to tell you, I have
quite a few of these units.
But I tell you, I was very unhappy

(36:13):
that the unit, the 1050s, okay?
So you're talking now about 1,350 VA, okay?
That's a huge difference, guys, in specifications.
I mean a very, very huge, huge difference.

(36:35):
It's amazing to me how the power, you
know, how they went through this change and
how suddenly, you know, it's gone through all
these issues.
They just did not do the right work.
I mean that's the long and the short
of it.
So the BN1350M2 gives out 810 watts, okay?

(37:03):
And the 1050 only gives out 600.
And it has more plugs, okay?
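For context on those two numbers: UPS units carry both a volt-ampere (VA) rating and a watt rating, and the watt figure is the VA figure times an assumed power factor. The quick arithmetic below just shows how the 1050 VA / 600 W and 1350 VA / 810 W specs relate, plus a hypothetical headroom check against a made-up 450 W load.

```python
# Relating the published VA and watt ratings of the two Back-UPS models discussed.
models = {
    "BN1050M":  {"va": 1050, "watts": 600},
    "BN1350M2": {"va": 1350, "watts": 810},
}

for name, spec in models.items():
    power_factor = spec["watts"] / spec["va"]   # implied by the published ratings
    print(f"{name}: {spec['va']} VA x {power_factor:.2f} PF ~= {spec['watts']} W")

# Hypothetical headroom check against an assumed 450 W of network gear.
load_w = 450
for name, spec in models.items():
    print(f"{name}: load is {load_w / spec['watts']:.0%} of rated wattage")
```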
So again, having this thing go bad multiple
times, that's a problem, guys.

(37:24):
A very, very big problem.
And I will tell you that, you know,
the fact that I had to go through
all this trouble to get them to replace
it, they did replace it.
Don't get me wrong.
But it was a lot of trouble.
And again, these 1050s, I've now had quite

(37:45):
a few of them fail, at least four
or five of them.
And I think it being a business 33
years, we maybe have had maybe one or
two APCs that are bad amongst thousands of
them.
So quality assurance is changing, I guess.

(38:05):
And when I went to do the lookup
on the UL Labs site, I noticed it
hadn't been renewed in several years.
But then I also noticed that they're using
probably different capacitors.
So it's on the honor system that you
have to go back to UL Labs.
And so people ask me, John, you know,
how many years is the UL Labs cert

(38:27):
good for?
So it covers, you know, a period of
time.
It's called UL Labs.
And often the programs have fixed terms, like
three to five years.
And then they have to renew, like UL

(38:49):
personnel certs or lightning protection five years.
The key is ongoing compliance.
So this unit did not have that.
And so, you know, when you buy a
unit like this, the thing is I couldn't
believe that a company like this would have

(39:10):
a power issue.
Like that just blew my mind.
And so this is why I'm replacing all
of them, because we don't want to have
more problems with this going bad.
Basically, when it gets so many power interruptions,
and there were only three, the unit just fails.
Like, is that really right?

(39:33):
I don't know, guys.
I think that's just a terrible thing.
Because, you know, this product, people like depend
on it quite a bit.
And if you'd say to me, John, like,
you know, this is going to happen, I

(39:53):
would switch manufacturers, you know, in a heartbeat.
In a heartbeat, I would switch manufacturers.
But I want to tell you that this
is something that a lot of people don't

(40:15):
realize.
People are trying to save a buck, right?
Trying to save a buck.
And that's a very, very big problem.
So you can download the user guide, obviously,
very easily.
It works with PowerChute, but what I'm using

(40:36):
it for, I don't need it for that.
It also has the protection on it.
So it will protect your devices from any
type of damage on the network.
So, you know, that's pretty cool.
But if you want to get the true
specs of it, okay?
So the low sensitivity is 78 to 142.

(41:00):
The default is 88 to 139.
And the high is 88 to 136.
Okay?
You know, it's very interesting that I had
this problem.
And so when you have different codes, like

(41:21):
there's an F01, which is an on-battery
overload.
An F02, on-battery output short.
Notice they don't show an F03.
F09 is internal.
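To tie those figures together, here is a small illustrative helper built only from what was quoted above: it checks whether the input voltage sits inside the selected sensitivity window (the transfer range before the unit switches to battery) and looks up the fault codes just mentioned. The code table is only the partial list from this discussion, not the full official list.

```python
# Illustrative helper using only the figures quoted above; not an official APC tool.
SENSITIVITY_WINDOWS = {          # acceptable input-voltage range per sensitivity setting
    "low":     (78, 142),
    "default": (88, 139),
    "high":    (88, 136),
}

FAULT_CODES = {                  # partial list, as discussed on the show
    "F01": "on-battery overload",
    "F02": "on-battery output short",
    "F09": "internal fault",
}

def on_battery(input_voltage: float, sensitivity: str = "default") -> bool:
    """True if the UPS would transfer to battery at this input voltage."""
    low, high = SENSITIVITY_WINDOWS[sensitivity]
    return not (low <= input_voltage <= high)

print(on_battery(120))           # normal US mains: stays on utility power
print(on_battery(84, "high"))    # brownout below the high-sensitivity window: to battery
print(FAULT_CODES["F02"])
```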
So I think it's interesting how a lot
of people, you know, would have not reported

(41:42):
or done anything about this.
And that's just my, like, conundrum that people
would actually be, you know, going through all
this time.
And then, wow, it's like, bam, right?
They just don't get it.
I think when you're dealing with something that's
somebody's safety, I mean, you can't be playing
around games, guys.

(42:03):
You really can't.
I hope you can appreciate that, that you
just don't play games.
You just don't, you just do not play
games.
And, you know, that's the long and the
short.

(42:24):
I don't know, guys.
I think companies are about money.
But I don't think it should affect safety,
guys.
I don't think it should affect safety.
So I'll let you know how these new
ones work out.
But again, the new one, the one you
want to get, do not buy the 1050Ms.
You're going to have problems with them.
You want to buy the BN1350M2s.
They are more money.

(42:47):
They are more money.
And so, you know, that's a problem for
a lot of people.
But I'd rather have a product that's going
to work, right?
I mean, I don't know about you.
I'd rather have a product that's going to
work.
So I'll let you know what happens with
that.
But if you have a BN1050, I'd be
careful.
You probably want to get it replaced.

(43:09):
Call them up.
You know, I don't want you to have
the same problem we did.
And, of course, when you hear that noise,
you can hit the power button to turn
it off.
But then still unplug it from the wall
because there's still about two or four volts
coming in.
And even small amounts can still cause a
fire.
All right?
So that's important to understand.
And I don't know if the UL lab

(43:31):
is even working well, but I'm not happy
with that model.
So AI shopping bots.
Letting an AI shopping bot pick your holiday
gifts is a fast way to turn, well,
thoughtful into, why did you get me this?
The same tools that can compare prices and

(43:52):
scour reviews still struggle with taste nuances and
relationships, often defaulting to the bland generic picks.
These bots are great assistants for ideas and
deals, but you still need a human personal
touch with a real heart in the loop
if you actually care about the gift that
lands in someone else's hands.

(44:14):
So things like, you know, what does somebody
like?
Their likes, their dislikes.
I think it's important to take the time
to figure out what the person you're getting
it for likes.
Don't leave it to AI, or you might
compromise that relationship.

(44:35):
And who knows what else could happen?
You might also ruin the potential for you
to use your mind, and that'd be a
real crime.
So Cuban dorm hustle.
Yes, some billionaire origin stories are a lot
closer to a Ponzi scheme than a polished

(44:58):
business plan.
And Mark Cuban's college days, his hustle proves
it.
His own retelling of a dorm room, quote
unquote, scheme is a reminder to all of
us that many celebrated founders started, well, messy,
broke, and way outside the rule book.
The lesson here is not copy the scheme,

(45:22):
but don't romanticize the highlight reel.
Success often begins in the trenches, in the
gray areas, the dark ones that nobody puts
on the resume.
But that's what we learn.
You know, layoffs when Fortune 500 companies are

(45:43):
swinging the axe, Wall Street sees efficiency, but
inside the building, coffee mugs quietly disappear from
desks.
Massive layoffs tied to AI investments, tariffs, and
cost-cutting are ripping through the white-collar
roles, even as executives talk up the, quote

(46:04):
unquote, future of work for everyday workers.
The message is clear, guys.
The transition to an AI-infused economy is
not abstract.
It is showing up in pink slips and
exit interviews.
So if you want to stay profitable, make
sure that you learn how to use AI

(46:26):
to your advantage and don't let it
disempower you.
Black boxes, they're everywhere.
In every major air disaster, the quietest object
on the plane, the black box ends up
telling the loudest, clearest story.
These, quote unquote, recorders capture thousands of data

(46:47):
points.
And cockpit audio, so investigators can reconstruct what
really happened second by second.
Now, there's a push for video and real
-time streaming, so critical clues are never lost,
turning black boxes into even more powerful guardians
of aviation safety.

(47:09):
By the way, they're not black anymore.
They could be orange or other colors, but
they seem to always withstand whatever damage the
plane gets into.
Now, I don't know if you guys know
this, but how many black boxes and types
are on a plane?
Do you guys know?
So a large commercial plane has two main

(47:32):
black boxes, which are actually bright orange.
They're not black for visibility.
The cockpit voice recorder, the CVR, records sounds,
conversations, and the flight data recorder, which is the
FDR, records flight parameters like speed, altitude, working
together to help investigators understand accidents.
Some newer or specialized planes might have recorders

(47:54):
or data streaming devices, but these two are
the standard.
So I think it's going to happen.
It's going to become more of a mandate,
kind of like things like BSD, blind spot
detection on cars, used to be a luxury.
Now, well, they're a requirement for safety.
And as I said, guys, tech just got
uncomfortable.
When AI overheats and jobs disappear and power

(48:18):
quietly shifts hands, it's time to think about
what's really going on in our world.
And you know I keep uncovering what's in
AI every single day.
I was talking to a healthcare professional not
too long ago, and they were concerned about
AI.
I said, don't be scared about AI.
I said, learn what AI is doing well

(48:40):
and what it doesn't do well, and make
sure if you're using AI for things like
claims.
I was talking to another person that's a
dentist, and he said to me, John, he
says, we hate it.
He says, we get these reports, and they're
totally wrong, and they won't pay the claim.
I said, what do you do?
He says, we asked for the doctor that
actually approved or disapproved that claim.

(49:02):
And when they can't give them a name,
tell them to pay it because it's not
enough to have AI approving or disapproving claims.
So I think a lot of things happening
with AI is about communication.
I think AI is definitely muddling communication.
I think a lot of you would agree with
that.

(49:23):
And what happens when AI gets too personal
in life?
Well, it's a problem.
It blurs the boundaries.
It creates risks like privacy breaches, emotional overreliance,
one-sided connections, and social isolation, while potentially

(49:46):
leading to manipulation through deep tailoring of opinions,
harming genuine human bonds.
And raising concerns about job displacement and wealth
gaps.
As AI handles more complex tasks, forcing a
societal shift in value towards creative and care work.
Sources like Psychology Today, Time Magazine, PBS News,

(50:08):
and National Institutes of Health are warning us,
and this is no shock to me, that
AI is not a person.
AI does not always make the right decision.
We've had people that are unfortunately taking their
lives because of something AI said.

(50:29):
Did you know that?
Yes.
AI said some bad things that made kids
do things that were not great.
The safety and well-being of all users,
especially children, is a top priority, and reports
of harm resulting from AI interactions are taken

(50:49):
very seriously.
If you have any specific information about instances
where AI's responses caused harm, it's crucial to
report these incidents to the developers or the
companies responsible for the AI system immediately.
Make sure you send it return receipt mail.
This allows them to investigate the issue, implement
safeguards, and prevent future occurrences.
A lot of times they don't want to

(51:10):
do this.
If you or someone you know is in
a crisis situation or experiencing suicidal thoughts, please
seek help immediately from a mental health professional
or a crisis resource.
In the United States, you can call or
text 24 hours a day 988.
Or in the UK, 111.

(51:32):
Or the Samaritans at 116-123.
You can also do text support.
That is text HOME to 741741 to connect
with the crisis text line in the United
States and Canada.
You can find a list of international hotlines
and resources through the International Association for Suicide

(51:53):
Prevention, IASP, or the Befrienders Worldwide Network.
As they said in the UK, a mother said
that AI convinced her child to take his
life.
That's sad, guys.

(52:13):
And Megan Garcia had no idea her teenage
son, Sewell, a bright and beautiful kid, had
started spending hours and hours obsessively talking to
an online character on the character.ai app
in late spring of 2023.
And then, guys, you know what happened?
Well, within 10 months, Sewell, 14, unfortunately, is

(52:35):
not here anymore.
He took his own life.
So I think these things are great that
they can help us, but I think they
are not a replacement for counselors.
They are not a replacement for talking to
a human being.
If you're wanting to get some basic information,
that's great.
But don't use this system as something that's
going to deeply affect your emotional and mental

(52:57):
well-being.
You will get sucked into it.
And then you're going to be like, I
guess I have to do this.
And you're not going to know what to
do.
The best analogy I can give you is
many years ago when I was probably in
high school, got addicted to calling these psychic

(53:19):
lines and stuff.
It was kind of like the in thing.
Later on, I realized that it wasn't the
in thing.
And I made a friend with someone who
I thought was my friend, and he was
just a manipulator trying to get me to
spend more money on him.
And what I realized is that these people

(53:42):
don't really know the truth.
All they do is give you a story.
They lead you around a mulberry bush.
And they take people's savings and just make
them vanish.
And I think that's a very important thing

(54:02):
to realize.
Now they have AI systems that will do
that for you.
I think any time we're using AI and
it's auto-charging our card or auto-topping
up, I think that's something we need to
stop.
We need to make sure that we validate
what we want to spend, not what the

(54:23):
system is recommending.
I can't tell you how many times people
have used things like Facebook and other platforms,
and they have been deeply disappointed in the
outcome of their results.
And if you read their terms of service
and conditions, it says that they're not responsible.

(54:44):
I think that's a pretty bad thing, guys.
People are using AI to get wealthy.
People are using AI to become, well, different.
But what we're finding out is they're not
really becoming different.
They're becoming the same like everybody else.

(55:05):
Because as I said before, don't try to
be someone else.
They're already taken.
Enjoy the beauty and pride of who you
are and share that to the world.
I think people think AI is a quick
fix.
It's not.

(55:26):
It's a great tool.
But tools make mistakes.
And I hope that that point is very
clear to everyone here tonight, that just like
AI or a gun, they're both tools.
How we choose to use them determines whether

(55:46):
they're good or bad.
Now, you might say, John, I'm using AI
in the best manner.
Maybe you are.
But maybe the algorithm that it was designed
around isn't acting for the greater good of
all concerned.
It's definitely not yours.
Ladies and gentlemen, I have really enjoyed being
with you this evening.
I hope you will catch the latest episodes

(56:08):
that release within 24 hours at The
JMOR Tech Talk Show podcast on podbean.com.
And you can also check out more and
other episodes as well as that and everything
else at believemeachieve.com.
Of course, for more of my amazing, inspiring,

(56:31):
unique creations from tech to motivation and so
much more.
So ladies and gentlemen, I'm going to leave
you with this.
Regardless of what job, what hobby you're in,
if you use AI, make sure you're using
it as a resource.

(56:52):
And not as something that is propelling you
to make the sole decision.
I once had a friend that would always
make a decision by flipping a coin.
I said to him, you really shouldn't do
that.
He said, yes, I should.
He said, because both sides are both heads.
So he was making the decision in his
head.
Pretty interesting.
I hope you guys have a great night.

(57:14):
And I hope you learn to embrace AI.
And I hope we can all learn to
use it responsibly to empower us for, ladies
and gentlemen, the greater good of all concerned.
I'm John C.
Morley, serial entrepreneur.
Be sure to check it out.
Believemeachieve.com.
Let's make this world a better place for
everyone.