Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:07):
Hi everyone, I'm John C. Morley, the host of
The JMOR Tech Talk Show and Inspirations
for Your Life.
(00:50):
Well hey guys and welcome to The
JMOR Tech Talk Show, a very special
JMOR Tech Talk Show.
This is the finale of 2025, and I'm very
happy because after this show, we've actually been on
the air four years in a row.
Wow, a big round of applause for us
on that, I know.
(01:11):
And starting next year, January 1st, well, actually
the show kicks off on January 2nd,
we will be officially starting our fifth
year on the air.
I mean, that's just incredible.
So those of you that are new here,
I want to take the opportunity to welcome
you.
(01:32):
If you are coming back, I'd like to
say welcome back, thank you so much for
coming back.
There's so much great content here, if you
have not checked out BelieveMeAchieved.com, what are
you waiting for?
Go check that out after the show, I
know you're going to definitely be pretty elated
with what you find there and hopefully this
will improve you in the attributes you have
so you can have a better quality of
(01:53):
your life.
Thank you for watching us, whether you've been
watching us on cable TV, whether you're watching
us on the podcast, whether you're listening to
the audio only, or you're just streaming us from
BelieveMeAchieved.com, whatever it is or just YouTube,
really do appreciate everyone, all of our partners.
We have people in New York, we have
(02:14):
people around the world that are broadcasting our
show.
We have stations in Oakland, New Jersey,
and other stations in New Jersey that
are broadcasting our show, so we really want
to say thank you to all those for
making the choice to make this part of
their network, including those in New York.
Also at BronxNet, we want to thank them
for our partnership and having them share our
(02:35):
great content, really a privilege and a pleasure
to be on all these wonderful media stations.
All right guys, without any further ado, I
want to kick this off, but before I
do, I want to let you know if
you're parched, if you're thirsty, if you want
a snack or something, feel free to go
grab that.
Oh, and by the way, merry belated Christmas,
happy belated Hanukkah if you celebrated that, happy
(02:56):
Kwanzaa or whatever other holiday you celebrated.
I hope it was a happy and healthy
one, and I hope you created some very
amazing priceless memories that you can share with
others, friends and family for years to come.
I think you definitely know that holiday times
are a very special time when we come
together.
(03:16):
It's not just about the presents.
In fact, it's about us.
It's about the presence, and depending on what
religion you celebrate, it's about the meaning of
the season.
So not to get too much into that,
but I do want to wish everyone a
very happy, healthy, prosperous, and abundant 2026, because
this is the last time I'm going to
(03:38):
see you until next year.
All right, guys, so let's go ahead and
kick this show off.
I know I've been excited to kick this
show off, and I think a lot of
you out there are saying, hey, John, this
is kind of amazing.
I know a lot of you are saying
to yourselves, how is it that we can
(03:59):
become better by listening to technology?
Well, you see, technology is a structure in
our life, and when we choose to adapt
to technology, we can learn from it, okay?
That's very important, and I know some of
(04:24):
us say, well, John, you know, I'm not
ready for this, or I'm not a tech
person, or I am a tech person.
There's still a lot that you can gain
from technology, even if you're not technical, okay?
There's so much you can learn from it.
There's just a lot out there, and I
think if we can understand that, we can
(04:47):
learn a lot, okay?
I mean, that might be hard for some
of you to understand, but it's really the
truth.
Technology is a structure, all right?
It's a very big structure, or a microcosm,
if you will.
Everything kind of falls under that structure.
And so, in our world, when we have
(05:08):
structure, of course, we can have things that
are not structured, but we need some type
of structure so we can get structured.
Does that make sense to everybody?
I mean, I hope it does, all right?
When I started the show out, like I said,
oh, gosh.
Well, this year, it's going to be over
four years now that we are officially on
(05:30):
the air, so really grateful, everyone, for definitely
choosing to be here with me each and
every week.
Really do appreciate that.
So, this week, ladies and gentlemen, on The
JMOR Tech Talk show, first, who am
I?
I'm John C.
Morley, if you don't know who I am
by now.
I'm a serial entrepreneur.
I'm an engineer, a marketing specialist, video producer,
(05:50):
graduate student, and a passionate lifelong learner.
And what do I do uniquely?
Yes, I break down how governments, platforms, criminals,
and AI are all pulled into major power
moves at the same time, hitting your privacy,
which is a big concern.
If you don't think it is, you definitely
should know it is.
(06:10):
I was making a decision on trying to
get one of these AI pins, and I
was researching them, and there was one I
was looking at.
I'm not going to give the name, but
it started with the letter capital B, and
I was very close to getting it.
But then when I heard that Amazon was
buying it, I said, oh, heck no.
There's no way I'm going to buy something
where Amazon just inherited it, because we all
(06:32):
know that Amazon's privacy is just, well, I
won't go there.
But when I heard Amazon was buying it,
I said, there's no way you could give
it to me, and I don't want it.
So, I'm still searching for something, because I
want to explore and see if it's worth
it.
But a lot of them have been, well,
crappy.
They have had bad reviews.
They claim they have a 30 or 60
(06:52):
-day money-back guarantee.
I'm just not ready to take the plunge and
get one.
Eventually, I will, but I think the security
needs to be tightened up first before I'm
willing to do that.
And even I wouldn't have this thing on,
excuse me, 24 hours a day, because the
important thing is that you have to turn
it on when you want it on, not
(07:13):
all the time.
So, your wallet, your privacy, your devices.
From South Korea putting a cash price on
data breaches to Instacart getting nailed for dark
patterns to RAM shortages and jackpotting ATMs. What's
that?
We'll talk about it.
This is the final map of the tech
(07:33):
battles shaping the end of 2025, and actually
starting to build the real, true shape of 2026.
So, without any further ado, guys, this is
the place where you get your tech insights,
whether you're a tech person like me or
whether you're somebody that just wants to learn
enough tech so you can protect yourself, your
(07:53):
home, your privacy, your family, and those you
care about.
So, the first one I want to talk
about is SK Telecom hit with massive breach
payouts.
If you're thinking my voice sounds a little
bit off today, well, you're not wrong.
This is the second time in my entire
life that my ears played some games on
me.
(08:13):
And just getting over that and just being
able to hear just about three days ago
from my left ear, you know, pretty well,
but there's still a little lingering in there.
So, that's why I might not sound, let's
say, as on pitch as I normally do.
So, I do apologize for that, but I
do want to make sure this show gets
out.
(08:33):
So, ladies and gentlemen, yes, South Korea's consumer
agency wants about $67 per victim.
So, you know, those are important things to
think about, but we're going to talk a
lot more, like I said, about the privacy,
you know, what's going on, why people need
to be aware of these things.
This is not propaganda.
This is not something we're doing just because
(08:57):
we're doing it.
We're doing it because it's important, because it
has meaning.
In fact, a lot of the vendors don't
like me talking about this.
We talk about the South Korea, you know,
putting a cash price on data breaches to
Instacart getting nailed for the dark patterns, as
I said, to RAM shortages and the jackpotting.
This is a little alarming for some people.
(09:17):
And like I said, this is going to
shape 2026 forever.
But you know, I think it's important to
understand, like with South Korea's consumer agency wanting
the $67 per victim for 23 million users,
it's turning privacy failures into multi-billion-dollar
liabilities.
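The scale of that number is easy to sanity-check from the two figures just quoted on the show; a quick back-of-the-envelope multiplication:

```python
# Back-of-the-envelope check of the breach payout scale quoted above:
# ~$67 per victim across ~23 million affected users.
per_victim_usd = 67
victims = 23_000_000

total_usd = per_victim_usd * victims
print(f"Total liability: ${total_usd / 1e9:.2f} billion")  # ≈ $1.54 billion
```

So even a modest per-victim figure turns into roughly a billion and a half dollars once you multiply it across tens of millions of users.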
(09:38):
I mean, that's just like insane.
Everybody thinks AI is this great big bubble
they can make money in.
Okay, yes, you can make money in it,
but there's still this importance that we do
the right thing even when no one else
is watching.
Today, people are starting to use AI to
exploit others and to write viruses, so much
(09:59):
other stuff that should not be allowed.
And ladies and gentlemen, yes, I'm told it's
finally here.
TikTok US has been spun into an Oracle-led
joint venture. The Oracle-backed joint venture keeps
TikTok alive in America, but risks creating a
separate US TikTok.
And this is going to be under tighter
(10:20):
rules.
I have to tell you, I've worked a
little bit with Oracle.
I've had some contract projects where I've had
to work with the companies.
I'm not going to mention their names and
their platform.
I'm not going to lie here.
It was awful.
Did it gather the data?
Yes, but it was so un-user friendly.
I was using it as an expense report
when I was doing some training for a,
(10:42):
it was actually a company that produces software
for libraries and their system was so awful,
was so terrible.
Now, maybe it was because they just took
it out of the box and they really
didn't know what they were doing and they
just kind of implemented it, or maybe the
software really is that bad.
But I know one thing, I feel there's
much better software out there that can track
(11:03):
your data right out of the box.
Being a programmer and developer for many years,
I know that we write better software than
that.
So, you know, and we're a small company,
but I have to tell you, so many
companies out there are all about the money,
are all about, you know, how can we
gain an extra five cents?
How can we gain an extra 10 cents?
(11:24):
But they don't think about how it's going
to impact us.
I'm going to give you a perfect example.
So here in Florida, visiting my parents, we
always get sparklers for the new year.
Well, they usually come in this very big
box and, you know, pretty big sticks and
whatnot.
Well, now this year the box has reduced
in size by like 50% and the
(11:46):
sparklers are like about 25% smaller and
the price has jumped about 20%.
So something's wrong with that picture.
People are trying to figure out how to
make things cheaper and then they're still trying
to figure out how to, well, let's say,
charge us more money.
I think that's a problem.
I don't mind paying more money for something
(12:07):
when there is value, but if I'm just
paying more money because a company says, well,
you know, I need to get a Lamborghini,
I need to get a Ferrari, or I
need to finance a new wing, I think
there's something wrong with that picture.
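The sparkler story above is classic shrinkflation, and the two changes compound. Treating the rough percentages mentioned on the show as exact, the effective per-unit price increase works out like this:

```python
# Shrinkflation math for the sparkler story above. The percentages are
# the rough figures mentioned on the show, treated as exact for the math.
size_factor = 0.75   # sparklers about 25% smaller
price_factor = 1.20  # price up about 20%

# Price per unit of sparkler actually received:
effective_increase = price_factor / size_factor - 1
print(f"Effective price increase: {effective_increase:.0%}")  # → 60%
```

A 20% sticker increase on a 25% smaller product is really a 60% increase in what you pay per sparkler, which is why the picture feels wrong.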
I mean, I'm just being, I'm being honest
with everybody.
I'm not trying to lie to you.
I'm just trying to be honest.
And today's topic, I thought I would share
(12:28):
the topic with you.
I know I didn't share that before.
Today's topic is a really good one and
it is the finale of 2025, data fines,
dark patterns and drone bans.
This week's show is part of the Tech Power Plays
series four, and it's the year finale until, guys,
we kick into series five. I
(12:49):
still can't, I still am having a hard
time fathoming that we're going to be in
our fifth year, but
we'll be there.
So Shein escapes a shutdown and faces strict
fines.
A Paris court blocked a three month ban,
but imposed age checks and big fines over
illegal and harmful products, as France pushes the European
(13:11):
Union to get tougher.
Well, what does this mean in English?
I mean, I think that's probably an
important thing to ask: what
does all this mean?
I think what it means is, you know,
this whole thing happened over what
were, let's say, adult-type graphic childlike
(13:32):
dolls, and banned weapons.
The court is instead forcing age checks and
10,000-euro fines now per breach, while
France vows to appeal and lobby the European
Union to crack down harder on ultra-cheap marketplaces
like Shein.
Is this consumer protection or is it political
(13:53):
theater?
I don't know.
I think it's political theater.
I think people try to see what they
can get away with.
And if somebody else says, oh yeah, let
me see, they go out that door.
Let me go out that door before somebody
locks it. Or online, somebody posts
some content that's not appropriate.
So, oh gee, let's go post to that
group.
Everybody else is doing it.
Well, nobody else is going to catch it.
Eventually that group's going to get in trouble,
(14:13):
right?
Because somebody is going to come into that
group.
That's probably not appreciating that content and the
whole group's going to get shut down.
So that's how that works.
And, uh, AI written phishing targets Russian defense.
That's right.
A pro Ukraine group uses consumer AI tools
to craft convincing fake state letters that lure
(14:35):
Russian defense contractors into phishing campaigns.
I mean, this is like, this is like
insane that this is going on.
Okay.
The pro-Ukrainian group was dubbed Paper
Werewolf.
And as I said, it's luring Russian defense contractors
(14:55):
with AI-generated fake invites and government letters.
Off-the-shelf AI tools are quietly
supercharging high-stakes cyber espionage; the misuse, not the
tech itself, is becoming the real battlefield.
And I think that's something that a lot
of people get concerned about.
I know myself going back to school for
my master's in AI and PhD in AI.
(15:15):
I know it's really vital for me to
not only learn as much as I can
about this, but also to educate people so
that they can see how to use a
tool.
What do I mean by that?
So let's forget the AI world for a
minute.
Let's say you have a screw in the wall.
Let's say you have one that
(15:38):
might be a flathead screw, right?
And one that might be a Phillips head
screw.
If you use a Phillips driver on
a Phillips screw, it bites into
all four sides of the
cross, right?
You can use a flathead driver
in some Phillips screws, but it's going
to take a lot more torque. A Phillips
driver is going to be a lot easier, and you
(16:00):
can turn a Phillips screw quicker than you can
with the flat blade, because the blade's
not going to keep slipping
out. And you can't use a Phillips driver in
a flat-blade screw.
It's just not going to work, because
there's another
set of points there, another edge.
So I think a lot of the problem
(16:20):
that we face in our world is not
the technology is bad, is that people are
trying to find ways to use technology to
benefit them without being considerate of the greater
good of all concerned.
Now you might say, John, why do these
smart people do this?
I don't know.
(16:41):
I think because they can.
And I said this before: just
because you can do something
doesn't mean you should. Let me say it again.
Just because you can do something doesn't mean
you should.
But I know a lot of people out
there are going to keep doing this because
they think it's fun and also because they're
making money at it.
(17:02):
So there's something wrong with this picture.
And we as a society need to push
back on the people that are using artificial
intelligence to abuse the information.
Why did I not buy this product that
started with the letter B?
Well, because I heard Amazon was buying the
product.
Even though the AI product was only $50
(17:22):
and the service wasn't very much, the fact
that Amazon was buying it, I said, I'll
be darned if I want Amazon in my
back pocket.
It's bad enough I buy stuff from them
here and there.
And I give them very little information.
I don't even give them my cell phone,
OK?
Try calling them for support.
You're probably better off talking to a wall.
Well, seriously.
All right.
(17:43):
And Zara deploys AI clone models, yes,
this is something really cool, for shoots. One
photo shoot now feeds endless AI-generated outfit
shots, saving costs while threatening future work for,
well, photographers and junior creatives.
But let's dive a little deeper.
(18:04):
I mean, John, what is this all about?
I think you're asking good questions.
So Zara's using AI to remix photos of
real models into endless outfit shots.
As I said, paying them as if they'd
flown in again, while photographers warn this could quietly erase future
shoots and entry-level creative jobs.
(18:26):
Is this innovation, or the slow death of
fashion photography?
I think this is abuse.
And I think this is very similar to
what I'm going to call the big bubble,
right?
We've all seen what happens with the big
bubble in the e-commerce world, the
big dot-com era, where people
were putting huge values on things.
(18:48):
But they really didn't have that much value
at all.
People thought they were great,
but then the valuations got inflated,
and people paid it.
And then eventually, those domains, well, they went
bust.
And so I think when we overvalue something
too quickly, and people then learn about it,
well, the trust is broken immediately.
(19:09):
And once that happens, nobody is ever going
to want to go back, right?
I mean, I'm just being honest with you.
I'm being completely honest here.
UPS, yes, I know we always get confused
on these letters.
So UPS is doing something very interesting,
right?
But I think the important thing about this
(19:32):
is very important to understand.
It's UPS, and again, it's not
the USPS, it's UPS, OK?
So UPS, the United Parcel Service,
(19:54):
is the company we're talking about here.
They're using AI now to detect fake returns.
Well, Happy Returns scans box-free drop
-offs with AI to flag swapped or fake
items before refunds, going after a massive return
fraud problem.
OK, you want to return those brand new,
(20:16):
let's say, Hoka sneakers, because they don't fit
properly.
So you've got to return them, and the company is
expecting to get back that $150 or $200
or more pair of brand new sneakers, or
ones that are defective.
But instead, you don't return those.
You actually send back some old beat
-up sneakers from 5, 10 years ago.
(20:38):
And for a while, it had been working.
But now they're realizing that people were beating
the system.
So they had to do something.
Guess what?
They did do something.
I don't think this is going to affect
a lot of people.
People that were not trying to beat the
system are not going to care about this
process.
I frankly don't care about it.
But people that were, let's say, making it
(21:00):
their livelihood to beat the system, to take
products and sell them on eBay, et cetera,
those are the ones
they're trying to go after.
And let's just be honest, all right?
Starlink satellite fails and sheds some space debris.
A Starlink abnormality added more junk to crowded
(21:21):
low Earth orbit recently, highlighting how quickly space
is becoming an unmanaged trash field.
I don't think we realize that because none
of us really, well, I haven't been to
space lately, have you?
None of us really have gone to space.
But it doesn't mean that it's not a
problem, right?
It still affects our ecosystem and what's going
(21:44):
on.
So SpaceX says one Starlink satellite suffered an
onboard abnormality and shed dozens of debris pieces
and started tumbling toward reentry.
Another reminder that a crowded sky plus no
global traffic rules is turning low Earth orbit
into a slow-motion junkyard.
(22:05):
And reentry means it could be hitting the
Earth.
And that could be a very, very big
problem, not just from the waste perspective, but
even from a safety perspective.
Like, hey, I'm here.
Whoa, what's that?
A satellite part almost hit me in the
head, right?
And think about this, all right?
You might say, gee, what's the big damage?
Well, I don't know
if you guys know this.
(22:26):
See, how do I
explain this easily?
A lot of people think that if you got hit
with a quarter falling from, let's say, at or about the height
of the World Trade Center, it could be
(22:48):
fatal.
Now, that's an interesting statement, isn't it?
But that is a myth; it wouldn't
be fatal in normal conditions.
The myth about a coin dropped from a
skyscraper killing someone has been debunked many times.
So a quarter, like a penny, is
small and light and has a lot of air
(23:10):
resistance relative to its weight.
So it quickly reaches a relatively low terminal
velocity as it tumbles through the air.
What can be deadly from that height are
heavier, more aerodynamic objects like tools, bolts, ice
chunks, or falling tape measures that have much
more mass and less drag.
Those can reach higher energies and cause serious
(23:32):
or fatal injuries if they hit someone directly.
How fast?
Well, again, the quarter, because of the way
it's designed, this is why I brought it
up, a quarter reaches a terminal velocity of
20 to 30 meters per second, 45 to
65 miles per hour, when dropped from a great
distance.
How much would, let's say, ice fall at?
(23:58):
Well, let's think about that for a moment,
because ice, again, is not aerodynamic in the same way.
For a rough estimate, assume an ice chunk falls
at least at a quarter's velocity of 25 to 30 meters per second
from the World Trade Center's roughly 415-meter North
Tower.
So if we think about this, even if it didn't
go much faster than that,
about 56 miles per hour,
because of the extra weight it could become
(24:20):
fatal to people underneath it.
And that's a very, very serious problem.
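The falling-object numbers above can be sanity-checked with the standard terminal velocity formula, where drag balances weight. Note the drag coefficients below are illustrative assumptions (a tumbling coin doesn't have one fixed Cd; ~0.3 averaged over orientations roughly reproduces the 20-30 m/s range quoted on the show), and the "ice chunk" is a hypothetical example object, not a measured case:

```python
from math import pi, sqrt

RHO_AIR = 1.225  # air density at sea level, kg/m^3
G = 9.81         # gravitational acceleration, m/s^2

def terminal_velocity(mass_kg: float, area_m2: float, cd: float) -> float:
    """Speed where drag balances weight: v = sqrt(2*m*g / (rho*A*Cd))."""
    return sqrt(2 * mass_kg * G / (RHO_AIR * area_m2 * cd))

# US quarter: 5.67 g, 24.26 mm diameter (official Mint specs).
# Effective Cd ~0.3 is an assumed average for a tumbling coin.
quarter_area = pi * (0.02426 / 2) ** 2
v_quarter = terminal_velocity(0.00567, quarter_area, cd=0.3)

# Hypothetical dense ice chunk: 0.5 kg, ~10 cm across, Cd ~0.47 (sphere).
ice_area = pi * 0.05 ** 2
v_ice = terminal_velocity(0.5, ice_area, cd=0.47)

print(f"quarter:   {v_quarter:5.1f} m/s  ({v_quarter * 2.237:.0f} mph)")
print(f"ice chunk: {v_ice:5.1f} m/s  ({v_ice * 2.237:.0f} mph)")
```

The quarter lands in the 20-30 m/s band mentioned above, while the heavier, denser ice chunk comes out well over 40 m/s, which is why mass and drag, not just height, decide what's dangerous.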
All right, guys.
So Flock license plate AI fuels policing dragnet.
Yeah, yeah, yeah, yeah.
Flock's camera network helped track the Brown University
shooting suspect's car, while critics warn it can
(24:41):
log everyone's drives into a searchable map.
So the question you might be asking, and
it's a very, very good question.
So what is the Flock license plate system?
So to try to explain this in a
simple way, basically, it's a system
(25:09):
that allows you to track license plates.
OK.
And the thing is, if this system is
able to read every kind of plate everywhere,
right, this could be a serious problem.
So it helped the police track down the suspect.
(25:30):
But is it also being used by bad actors
to do things, right?
So they call it the automatic license plate
reader.
And now this is even stronger than Flock,
which has been getting so much press.
The new plate recognizer, they call it, is
(25:51):
better than Flock.
It analyzes images or live camera feed, saves
money.
They offer free trials and all this other
stuff.
But the thing about this, the Flock camera
license plate reader, you know, you can actually
get this on Amazon.
I don't know if you know this.
And this little device, OK, runs around 200
(26:15):
bucks.
That's not a lot, guys.
No Wi-Fi needed, motion activated, night vision,
SIM card included, on-demand mode, 15-second
video clips.
So it has the right thing to be
able to grab the license plate quickly, probably
(26:35):
better than what you see at the E
-ZPass lanes, which were designed, you know, decades
ago.
So is it creepy,
or is it about crime solving?
I think people need to know when their
license plates are going to be monitored.
Like if you're in a public place, I
still think it comes down to notifying people,
letting people know that they are being watched,
they are being monitored.
I don't know if you know this in
(26:56):
New Jersey, but it is still, unless they've
changed the law, although they might have, it
is illegal to record someone's audio in New
Jersey without telling them.
(27:20):
So that's a pretty interesting statement, right?
But if you're part of a conversation, you
can record it without telling the other
person or getting their consent, as long as at least one
party to the conversation knows about it, and that party can be you.
But it is illegal to record surveillance
audio without telling people or posting a sign, okay?
(27:47):
A lot of people think, oh, what's the
big deal?
And then you try to bring it to
court and guess it doesn't work.
So the New Jersey law is stricter about
surveillance style audio than about you recording your
own conversations, but it's still not as simple
as always illegal unless there's a sign.
So one-party consent still applies if the
audio being captured is of conversations where at
least one party has consented; for example, the
business owner or employee participating in the conversation
(28:09):
generally satisfies New Jersey's one-party consent
rule.
However, secret audio of others is the big
problem.
If you install mics to capture other people's
private conversations that you're not part of, and
no one in the conversation has consented,
that can violate New Jersey's wiretapping and eavesdropping law.
For example, you own a store and you're
the owner of the store, but you're not
(28:29):
always at the store and you haven't told
your employees that the store is being recorded.
You possibly could violate that law.
Signs help, but they aren't magic.
Posting audio and video recording notices, or telling
people in a policy, can support an argument
that people had notice and implied consent, but it
doesn't override all privacy expectations.
(28:50):
For example, it can never be used in
a restroom, changing area, or certain staff rooms.
So video versus audio.
Let's talk about that for a minute.
Pure video surveillance without sound is generally treated
differently and it's more often allowed in public
facing areas.
Adding audio brings it under wiretapping rules and
raises the legal stakes.
So a more accurate statement would be:
(29:10):
it can be illegal in New Jersey to
run audio surveillance that records other people's private
conversations without any party's consent, and merely adding
or omitting a sign doesn't by itself make
it legal or illegal.
For any real deployment, store, office, HOA, school,
you'd want a lawyer to review placement notice
and purpose against New Jersey's wiretapping and privacy
(29:33):
rules.
That's a very important thing to understand.
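As a way to organize the rules of thumb just described (one-party consent, surveillance of non-participants, signs as notice, off-limits private areas), here's a toy decision sketch. The categories and the ordering of the checks are my own illustrative simplification of what was said above, not legal advice and not the actual statute logic:

```python
def nj_audio_recording_risk(you_are_party: bool,
                            any_party_consented: bool,
                            notice_posted: bool,
                            private_area: bool) -> str:
    """Toy classifier encoding the rules of thumb described above.

    Illustrative simplification only:
    - Restrooms, changing areas, etc. are off-limits regardless.
    - One-party consent: a participant (including you) consenting
      generally satisfies New Jersey's rule.
    - Signs support implied consent but aren't magic.
    - Secret audio of conversations you're not part of is the danger zone.
    """
    if private_area:
        return "high risk"
    if you_are_party or any_party_consented:
        return "likely permitted"
    if notice_posted:
        return "gray area"
    return "high risk"

# You recording a call you're on: likely permitted.
print(nj_audio_recording_risk(True, False, False, False))
# Hidden mic capturing strangers' conversations: high risk.
print(nj_audio_recording_risk(False, False, False, False))
```

For any real deployment, as the show says, the placement, notice, and purpose should be reviewed by a lawyer against the actual wiretapping and privacy rules.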
So what you have to realize is, if
you ever go to court with this information,
in the, let's say, pre-trial
stage, before you go to court,
they actually try to do a settlement, right?
(29:54):
And they try to do mediation.
They always try to do that.
I don't know if you know this, but
if there are 50 cases, probably two or
three of them will go to court.
The other 47 or 48 of them actually never
make it to court.
They will all be settled.
So hearsay, it's an out of court statement,
offered in court to prove that what the
statement says is true.
(30:15):
Because the person who made the statement isn't
testifying live and can't be cross-examined, hearsay
is usually not allowed as evidence unless it
fits specific exceptions like certain business records, excited
utterances or statements against interest.
So can you use cell phones, cameras in
(30:38):
court as data?
Well, that brings a very, very good question
to the spectrum here.
So cell phone camera recordings can be used
as evidence in court, but they have to
meet the normal rules for admissibility.
The party offering the video has to show
it's what they say it is, or you
(30:59):
or a witness testifies to it:
I recorded this on my phone on X
date at Y place. It also can't be illegally obtained.
If the recording violates wiretapping or privacy laws,
well, you're automatically out.
Secret audio of a conversation you weren't part
of, or one recorded in a two-
party-consent state, a judge may exclude, and you could
face legal risk.
So you can use covert audio if you
(31:23):
are somebody in the conversation.
In New Jersey, one party consent applies for
audio, but secretly recording others' private conversations you're
not part of can be unlawful.
Let me say that again.
In New Jersey, one party consent applies for
audio, but secretly recording others' private conversations, you're
not part of can be unlawful.
So if I'm having a conversation with you,
all right, and I want to make sure
(31:46):
everything is good and smooth, great.
But now if I use that technology to
record other people's conversations, well, now I'm breaking
the rules.
So hearsay rules still apply.
If people are talking on the video, what
they say may be hearsay, unless it fits
the exception rules.
The video itself can still be used to
show what happened, like a theft, a break-in,
(32:07):
et cetera.
Metadata and chain of custody matter too: for more
serious cases, lawyers often establish when and how
the file was created, stored and transferred to
argue it wasn't tampered with.
So yes, photos and videos from a phone
are routinely used in civil and criminal cases,
but they have to be legally obtained, properly
authenticated and survive hearsay and other evidentiary challenges
(32:30):
which can happen.
So again, bottom line: in New Jersey, you
can record conversations you're part of.
Maybe you're trying to do this to protect yourself,
or negotiate something like that; because one party, you, is
still part of the conversation, it's allowed, right?
All right, so Europe rolls out river and
seawater mega heat pumps.
(32:50):
Cities like Mannheim and Aalborg are installing giant
heat pumps that pull warmth from rivers and
seas to heat tens of thousands of homes.
Now, as soon as I had put this
out earlier this week, because you know how
we work, we put the reels out and
then the show comes later in the week.
So somebody immediately rebutted it.
So I asked them, where do you live?
(33:13):
And they don't even live in the country.
So how could somebody that's not in the
country rebut that, right?
I thought that was kind of, well,
odd, but some
people, you know, they just do this because
they feel they can, right?
They do it because they can.
And so Europe is swapping coal plants for
(33:34):
mega heat pumps that literally siphon warmth
from the water of rivers and seas to heat tens of
thousands of homes, starting with Rhine-powered systems
in Mannheim, big enough for 40,000 households.
And even larger, like 177 megawatt seawater plant
in Aalborg will cover about a third of
the city's heating, turning infrastructure into a giant
(33:56):
clean energy radiator under the streets.
These are pretty interesting innovations.
And if they work overseas, guess what, ladies
and gentlemen, they might be adapted to work
right here in the United States.
So some very powerful and interesting things are
taking place.
And AI toys, ladies and gentlemen, well, they
definitely dominated the kids' holiday wishlist this season.
(34:18):
Smart pets and robots, just to name a
few of them, remember kids' names, their likes,
you know, addresses, phone numbers, and
all kinds of things.
Sometimes things you don't want them to remember:
the child's moods, their routines, how to push
their buttons, in other words.
Raising alarms right now over the data collection
(34:40):
and profiling in the playroom.
So something that a kid might be doing
and thinking is completely harmless and parents think
it's completely harmless, later on turns into, well,
an issue that could turn into either a
restraining order situation, could turn into somebody trying
to, unfortunately, break into your home.
You know, these are serious things.
(35:02):
And many people are not thinking about it
because like, oh, it's just a toy, right?
So again, the hottest toys are dolls for,
well, you know, girls or guys,
ones that are AI-trained.
But the chatty AI pets, like I said,
the robots, they learn your kid's voice.
They can even repeat your kid's voice.
(35:23):
Yeah, moods and routines in real time.
And the tech press is now showing it: Shiona McCallum
tests how these smart companions really work and
asks whether hidden mics, data collection, and glitchy
guardrails make them more creepy than cute under
the tree.
And I think right now, we don't think
(35:44):
this is a problem because there hasn't been
a big number of cases.
And hopefully there never will be a big
number of cases.
When there's a number of cases, that's when
I think it gets to be a problem
and we have to deal with it.
So there are a few companies out there
that are definitely doing stuff like this.
(36:07):
And a lot of the influencers out there
are saying, you know, it's not a big deal.
Some people are saying it's a big deal.
Even people in the BBC are concerned about
this.
And, you know, they get concerned about lots
of stuff too.
So Shiona McCallum meets an AI startup working
to fix erratic clothing sizes in fashion.
But now it's becoming a lot more than
(36:28):
that.
It's becoming, why are you grabbing this data,
right?
And should people be concerned about this?
I get concerned when a big company like
Amazon, you know, Amazon's just one of them.
There are other bad giants like Amazon that,
unfortunately, exploit data.
(36:49):
They claim that it's just a mistake.
And maybe it is, maybe it isn't, who
knows.
And then when they get a slap on
the wrist with a $50,000 fine, they
go away.
I think they should be hit a
lot harder than a $50,000 fine.
I mean, I'm talking like a few million
dollar fine.
So same thing happens with Google, right?
They make a mistake and oh, we'll slap
them on the wrist.
We'll slap Google for 50,000.
(37:10):
Oh sure, no problem.
We should be slapping them for millions and billions, right?
These mistakes that these companies are making
happen because they are not conscious of the
ramifications that can transpire when using technology
without proper safety rules and safeguards.
(37:32):
We all know what happened.
Don't you guys remember the story about,
I'm trying to remember the company now.
The company in the IV
industry that got exploited by bad actors through a PIN.
You remember that company?
I'm trying to remember the name of them.
(37:55):
Anyway, there was this company and they fixed
it now.
But all types of intravenous pumps and stuff,
they had a four digit pin, okay?
It was communicating basically back to another device
in some cases.
And that other device a lot of times
(38:15):
was your iPhone.
There was a four digit pin.
That if you knew the four digit pin,
you could turn on, turn off, lower the
doses, raise the dosage, all just like you
were right in front of the machine.
That four digit pin was a big problem.
Now pumps can no longer use just pins.
(38:36):
Computers can crack codes like that easily
because they're just numbers.
So that meant that company had to
make a big decision to fix that.
But it wasn't their decision to fix it.
It was the law coming down on it
saying, hey, look, this is what was caused.
Now, maybe it was an oversight because you
(38:57):
didn't understand how passwords work, which was kind
of like oxymoronic.
But maybe that was the case.
So we're going to give you 90 days
to fix the problem and we're going to
charge you with a $10,000 fine.
I'm just giving you that as an example.
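The math behind why a four-digit PIN is so weak can be sketched in a few lines of Python: the whole keyspace is only 10,000 codes, so anything that can submit codes freely just walks through every one. The check function and secret here are stand-ins for illustration, not any real pump protocol:

```python
# Why a 4-digit PIN is no defense: there are only 10,000 possible
# codes, so an attacker who can try codes freely enumerates them all.
import itertools

def brute_force_pin(check):
    """Try every 4-digit code until one is accepted."""
    for digits in itertools.product("0123456789", repeat=4):
        pin = "".join(digits)
        if check(pin):
            return pin
    return None

secret = "7341"  # a made-up secret for the demo
found = brute_force_pin(lambda p: p == secret)
print(found)  # the secret falls out after at most 10,000 tries
```

Even at one guess per second, that's under three hours; a machine talking to another device can try far faster, which is why a PIN alone was such a problem.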
But anywhere codes are used, anywhere information can
be exchanged from one piece to another, something
as simple as your washing machine, your dishwasher,
(39:19):
you might say, John, what's the big deal?
Anytime a device can learn information, that's the big deal.
That's why I was very interested about these
smart assistants because I thought they'd be kind
of cool, especially going back to being in
school and stuff.
But then I got to thinking about it.
I got to thinking about, well, who's going
to have that data?
So then I said to myself, you know
what I'm better off to do?
I'm better off to just get myself an
MP3 recorder, take the data, load it into a
(39:40):
cheat sheet, do whatever I want later on,
but I control what's in it.
So I just don't think there's a company
right now that cares enough about your privacy.
That's the bottom line.
And a lot of these companies, they want
to give you a minimal amount of recording
time.
And then some want to charge you a
(40:01):
fee on top of that to transcribe it.
I mean, come on, like really, seriously?
All right.
And ladies and gentlemen, Instacart, yes, they were
fined again for dark-pattern "free delivery" fees;
a $60 million Federal Trade Commission deal flags
hidden fees and tricky subscription prompts as
(40:21):
deception, not a clever UI, we would say.
But the thing is this: people weren't
able to cancel so easily,
and they were getting hit with other fees.
And you know, there's only so long that
this can go on.
Maybe this falls into the lap of a
relative of a judge or something.
(40:42):
And then guess what?
It becomes like their pet project to make
sure this is done properly, right?
On the delivery that they said
was free, they added mandatory fees and
sneaky auto-renewing Instacart trials, while a separate
probe zeroed in on an AI pricing tool that
can quietly charge different shoppers different prices for
(41:04):
the exact same groceries at the same store.
I think that is a serious crime, but
because technology was doing it behind the scenes,
nobody caught it.
I mean, that could happen to any store,
right?
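A rough idea of how that kind of hidden differential pricing could be caught from the outside can be sketched in Python; the record format here is hypothetical, invented purely for this illustration:

```python
# A minimal audit sketch: flag items where the same store shows
# different shoppers different prices for the same item.
from collections import defaultdict

def flag_differential_pricing(observations):
    """observations: (store, item, shopper, price) tuples.
    Returns {(store, item): set_of_prices} where prices diverge."""
    seen = defaultdict(set)
    for store, item, shopper, price in observations:
        seen[(store, item)].add(price)
    return {key: prices for key, prices in seen.items() if len(prices) > 1}

# Made-up observations from two hypothetical shoppers:
obs = [
    ("storeA", "milk", "alice", 3.49),
    ("storeA", "milk", "bob",   3.99),
    ("storeA", "eggs", "alice", 2.99),
    ("storeA", "eggs", "bob",   2.99),
]
print(flag_differential_pricing(obs))
```

Regulators or watchdogs would need shoppers to pool their receipts for a check like this, which is exactly why pricing done quietly behind the scenes went unnoticed.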
But I think the fact that we as
individuals need to be aware of it, I
think things like standards, like, you know, there's
(41:25):
standards for scales, right, that they have to
meet.
Maybe there needs to be standards for AI.
You know, maybe there needs to be a
standard that it gets reviewed every year, right?
If it's gonna be used, maybe the ruling
is that if you have more than a
thousand users on, if you have more than,
maybe it's 5,000 users on it, okay?
(41:46):
That it needs to be checked yearly and
it needs to be sealed with a sticker,
just like the, you know, the national, what
do they call that?
There's a seal, scales, they need to be,
I'm trying to think of the name of
the company now.
They need to be verified every year in
(42:07):
the food industry and other industries too, by
the way, not just that.
So, there's Mettler Toledo.
So, it's a federal regulation.
Basically, what it states, there are different ones depending
on which classification you fall into.
(42:28):
So, there's testing of scales; it's section 442.4,
testing of scales.
The operator of each official establishment that weighs, for
example, meat or poultry will cause such scales
to be tested for accuracy in accordance with
the technical requirements of NIST Handbook 44.
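In the spirit of that Handbook 44 testing, a calibration check is conceptually simple: weigh known test weights and compare the readings against a tolerance. Here is a minimal Python sketch, with an illustrative tolerance rather than the actual Handbook 44 figure for any scale class:

```python
# A minimal calibration-check sketch: compare scale readings against
# known test weights. The tolerance here is illustrative only, not
# the real Handbook 44 tolerance for any particular device class.

def scale_passes(readings, test_weights, tolerance):
    """True if every reading is within tolerance of its test weight."""
    return all(abs(r - w) <= tolerance for r, w in zip(readings, test_weights))

print(scale_passes([10.01, 25.02], [10.0, 25.0], tolerance=0.05))  # True
print(scale_passes([10.20, 25.02], [10.0, 25.0], tolerance=0.05))  # False
```

The yearly verification is basically this procedure done formally, with certified test weights and the seal applied only when every reading passes.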
So, and it's not just them, it's also
the medical industries and things like that because
(42:49):
there is the possibility, ladies and gentlemen, that
scales can, well, they can go awry, they
can get uncalibrated, right?
And we know over time that a penny
can make a big difference.
Pennies can mean millions or billions over time.
We learned from a movie that was based
on a true story how they waited just
(43:09):
before midnight to transfer pennies out of these
different accounts that hopefully nobody would notice and
nobody did notice.
But the thing is, a penny makes a
difference.
There was one kid that got his entire
college education funded by asking several thousands of
(43:30):
people for pennies, a penny each.
That was pretty cool.
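The penny arithmetic is easy to sketch: one cent skimmed per transaction, multiplied across a big institution's volume. The transaction counts below are made-up illustration numbers:

```python
# How pennies become millions: one cent skimmed per transaction
# across a large daily transaction volume. The volumes here are
# invented for illustration.

def skimmed(cents_per_txn: int, txns_per_day: int, days: int) -> float:
    """Total skimmed, in dollars."""
    return cents_per_txn * txns_per_day * days / 100

print(skimmed(1, 5_000_000, 365))  # one cent on 5M daily transactions
```

One assumed cent on five million daily transactions is over $18 million a year, which is why the movie's midnight penny transfers were worth the trouble.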
And ladies and gentlemen, Apple has an auto-clawback
under a new European Union tech commission.
New rules let Apple auto-deduct underpaid
commissions from future in-app revenue while it
reshapes fees under the European Union digital laws.
(43:55):
So this is like kind of crazy, but
so now Apple can basically be like a,
I guess Apple could be like a collector
kind of hidden in there.
Yeah, so Apple just turned the App Store
into their own private collections center.
Apple's new developer terms basically flip the
switch that lets it auto-debit underpaid commissions
(44:18):
from your future in-app purchases, even across
a sister app.
Sorry about that.
You know, when we're visiting and we had
people visiting and stuff like that, we never
know what we're going to have.
We can have live people show up.
We could have something fall out of a
ceiling.
So I do apologize for any inconvenience there
and the doorbell just rang.
But even the sister apps while it rolls
(44:41):
out a new Core Technology Commission in the
European Union and tightens rules on iPhone AI
systems secretly recording users.
I mean, these are some real problems.
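The clawback mechanics described above can be sketched as a simple ledger: an underpaid-commission balance gets deducted from future revenue events until it clears. This is purely an illustration of the idea; Apple's actual accounting is not public in this detail:

```python
# Sketch of an auto-clawback ledger: deduct an owed commission
# balance from each future revenue event until it is cleared.
# Names and numbers are invented for illustration.

def settle(revenues, owed):
    """Deduct `owed` from each revenue event until the balance clears.
    Returns (net_payouts, remaining_owed)."""
    payouts = []
    for r in revenues:
        deduction = min(r, owed)
        owed -= deduction
        payouts.append(r - deduction)
    return payouts, owed

print(settle([100.0, 50.0, 200.0], owed=120.0))
```

The point of the sketch is that a developer's first payouts can go entirely to the clawback, which is what makes the App Store feel like a private collections center.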
And another one I want to talk about,
guys, is the Tren de Aragua ATM jackpotting.
This one just really hit me.
A malware ring was busted.
US prosecutors charged 54 people in a
(45:02):
Ploutus based scheme that made ATMs spit out
cash on command costing banks tens of millions.
I didn't say thousands.
I said tens of millions.
Yeah, so a Venezuelan gang tied to Tren
de Aragua allegedly hit ATMs across the US with
(45:23):
the Ploutus malware, popping open machine panels, swapping
hard drives or USB plug-in payloads so that
cash dispensers spewed jackpots on command.
Then wiping the logs to hide millions in
losses until the feds finally indicted 54 people
(45:43):
in one of the biggest ATM hacking cases
ever.
So NCR, one of the big manufacturers of
the software behind ATM machines, had realized
an interesting thing was happening.
Well, they were getting hit by another method
and that was using endoscopy.
They were using an endoscopy-type method, very
(46:04):
similar to the human body.
They were using these endoscopy cameras to
basically go up the chutes,
and they were able to use
the cameras to confuse the sensors, getting
them to spew out large amounts of cash.
So NCR recently took this very seriously and
they quickly and promptly deployed an update which
(46:25):
I think was version 3.0. And if
your machine has the 3.0 update, there
has been no issues with these type of
cameras, sensors causing problems with the sensors to
fake them out to actually serve the money.
And again, they were going up the cash
shoot.
It was just very interesting how this happened.
It just baffled me that somebody would try
(46:46):
to do something like this.
The ATMs are often on camera and stuff
like that.
And I guess maybe they hid the camera.
Who knows what they did?
But all I know is that this was
pretty low and pretty pathetic that they would
try to do something like this.
And I'm glad that NCR responded.
We don't hear a lot about NCR, but
(47:07):
I wanted to let you know that NCR
makes a lot of the ATM software.
And so they're kind of like the name
you don't know.
And so NCR was used in a lot
of retail stores and stuff like that.
(47:27):
So I'm glad to see that they've resolved
that, but really terrible how that had played
out.
Samsung RAM shortage.
Yes, it worsens amid an alleged kickback probe.
DDR5 is scarce and pricey as Samsung probes alleged
bribes for priority supply while favoring high-margin
(47:49):
AI and server buyers.
But what does all this mean in English?
Because I mean, I know that sounds like
a lot of gobbledygook, but it does.
So in the middle of basically an AI
-fueled RAM shortage, Samsung now is being forced
to investigate whether some employees secretly took kickbacks
to steer the supply of scarce
(48:12):
memory orders while DDR5 pricing was exploding.
Micron bailed on consumer RAM to chase data
center AI money.
And PC makers quietly jacked up graphics
processing unit, laptop, and SSD prices to cope
with this.
So what do people do?
(48:34):
Microsoft does the same thing, right?
When they see that the hardware you've got
isn't up to snuff after a few
versions, well, they make their software have larger
requirements, requirements that your processor currently can't meet.
Even if it just changes it by a
small percentage, if your processor can't handle that
small percentage upgrade, well, then it's time for
(48:54):
a new computer.
And that's how we get the obsolescence of
computers every single year.
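That creeping-requirements effect is easy to sketch: bump one minimum spec slightly and an older machine falls off the support list. All the numbers here are invented for illustration:

```python
# A toy version of "your hardware no longer meets the minimum spec":
# a small bump in one requirement drops an older machine from the
# support list. Specs are made-up illustration numbers.

def supported(machine, minimum):
    """True if the machine meets or exceeds every minimum requirement."""
    return all(machine.get(k, 0) >= v for k, v in minimum.items())

old_pc = {"cpu_ghz": 2.4, "ram_gb": 8}
v1 = {"cpu_ghz": 2.4, "ram_gb": 8}   # original requirements
v2 = {"cpu_ghz": 2.5, "ram_gb": 8}   # a small bump in requirements

print(supported(old_pc, v1), supported(old_pc, v2))  # True False
```

Even a small percentage bump, as noted above, is enough to force the upgrade, and that's the obsolescence cycle in miniature.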
This is a lot, guys.
I mean, really a lot happening.
This entire year, when we think about everything
from the very beginning to the end, and
what's been happening with TikTok and the fact
that there's supposed to be an M2 or
M3 version, supposedly that's going to happen now
(49:15):
as the deal has been closed.
But they're also saying that TikTok is possibly
going to not be the same TikTok.
What does that mean?
It means that it's going to be a
watered down version.
Is it going to be gated and not
allowing us to have access to European versions?
Very possibly.
I think we've got to stop things like
China and a lot of
the foreign countries trying to, well, they're not successful,
(49:36):
but they're trying to waste our traffic on
websites because they're trying to look for something.
It's pretty pathetic that these foreign companies, foreign
countries, not companies, countries are doing this.
I think we as a country need to
be mandating the traffic that's coming over internationally.
(50:02):
I think that's the only valid thing because
if that's done, then people can be more
protected.
I don't think it's one single human's responsibility.
I don't think it's a company's responsibility here
in the United States.
I think it's the foreign countries, what they're
doing there.
I think the United States has to be
(50:23):
proactive in traffic that crosses the borders.
And I know that gets so hard with
IP, but yes, it can be done.
And at the bottom line, if there's a
complaint, it's going to hit through the Federal
Trade Commission.
We all know that.
So this year has brought us a lot
of things.
People ask me, John, is AI going to
(50:43):
get rid of jobs?
Can I tell you it's going to get
rid of jobs?
No.
Can I tell you it's going to change
the type of jobs out there?
Yes.
So the jobs that you currently are going
for, those jobs may not be there.
So a lot of these task-based jobs,
a lot of these jobs where there's a
lot of labor involved, those jobs are going
to disappear.
(51:04):
Jobs where there's issues where humans can make
mistakes, machines can do it over and over
again and never get tired.
I think that's where we're going to see
AI taking over.
As far as AI replacing everything, no.
AI is a companion.
AI is a supplement.
OK?
AI is not a replacement for your brain.
(51:27):
I know people are saying, gee, it's going
to be as smart as a human brain.
But really, our human brain is learning by
a lot of different things.
I don't feel that AI is going to
get to that level.
I think it's going to get very smart
and that it can learn from us.
But I think humans are going to keep
evolving.
As AI evolves, I think humans are going
to keep evolving.
Most people don't know that, well, most people
(51:49):
only use about 2% to 4%
of their brain.
Let's just think what could happen if we
started using 10% or 20% of
our brain.
I mean, that would just blow people out
of the water.
I think what's happening with the AI boom
is people are saying, gee, you know what?
I can make a lot of money in
my life because of AI.
Not caring about the fact that it might
(52:09):
harm somebody else in the world because they're
not caring about the greater good of all
concerned.
They're just being greedy and they're seeing that
green money or they're seeing that ACH transfer
into their account.
And so they're getting a little bit greedy
and grubby about that.
And so that's why I think it's important
that these data finds, that these dark patterns
(52:32):
are monitored.
We saw just not too long ago that
Google decided to drop, next year they're going
to drop their dark web monitoring.
Why would a company even do that?
They're not deleting the data from it, but
they're going to delete the monitoring.
Why would you do that, Google?
I don't know.
Google just seems to me like they just
do whatever they want.
(52:53):
They have a few dollars and they think
they can do what they want.
But I think eventually that's going to change.
And I know between Google and I know
between Mark Zuckerberg and Facebook and stuff like
that, they're riding AI like that's all they
have.
And I feel AI is a great part
of everything, but it shouldn't be the replacement
(53:15):
of everything.
And when you put all your eggs in
one basket, what actually happens?
Well, if that basket goes south, you lost
all your eggs.
So that's why it's important to put some
eggs in one basket, some eggs in another
basket and diversify.
You do the same thing with stocks, right?
Portfolios, you don't put all your money, all
your investments into one stock or one portfolio.
(53:35):
You distribute them evenly.
And I think that's an important thing.
This entire year, ladies and gentlemen, of 2025
has been a really interesting year.
We started touching AI.
We started learning that there were lawsuits with
New York Times and even the game Wordle
and stuff like that.
(53:56):
There were lawsuits with Amazon and AI and
the Wall Street Journal, right?
But guess what?
Now they're kind of sleeping together.
Now they're working together.
Why?
Because they found a way to make money
off of something.
Once money starts to enter the picture, it's
(54:18):
sad how quickly the ethics and the morality
of doing something seems to just not be
important anymore.
I've said this to you before, guys.
Just because you can do something doesn't mean
you should.
Let me say that again.
Just because you can do something doesn't mean
(54:39):
you should.
And there are lots of reasons for that,
right?
And like I said to you before, AI
is a tool, all right?
Just like any other tool in the world.
It could be a power tool, anything.
It could even be a weapon.
It's how we choose to use it that
(55:00):
determines whether it's good or bad.
The technology itself doesn't become good or bad
on its own.
It's based on the application, based on the
end user, based on the deployment and the
purpose of that deployment.
People are ultimately responsible.
We were talking about something the other day.
I was talking about something with an attorney.
(55:20):
And we were talking about self-driving cars.
I said if a self-driving car hypothetically
gets into a little fender bender or worse,
who gets sued?
It goes, John, it goes to the driver.
But the driver's not driving.
Well, he still owns the car.
So the owner of the vehicle, okay?
The manufacturer of the vehicle, okay?
And the people that develop the self-driving
(55:43):
technology.
So that's a lot more people.
And are we ready right now to put
all our faith into self-driving technology?
I know I'm not.
Am I okay with lane keeping assist and
stuff like that?
Yes.
But I'll be honest with you.
I still don't trust it 100% when
it's bad weather.
I still know that it could make a
(56:04):
miscalculation.
So when it's bad weather, I do not
trust it 100%.
I think in terms of stability, on a
scale of 1 to 10, lane
keeping assist, that's one technology,
is probably, on a regular clear day,
around a 7 or 8.
(56:26):
In cloudy weather, it's probably like a
3.
In fact, when the weather is so bad,
it won't even allow you to engage lane
keeping assist.
That tells you right there how unstable it
is.
It can't get enough data points to be able to calibrate.
I think these are important things.
(56:48):
And I think as we evolve into 2026,
there's gonna be a lot more happening with
artificial intelligence.
This year, you guys know I built my
first AI robot, which was kind of amazing.
And I know, ladies and gentlemen, as we
go into 2026, there are gonna be a
lot more questions.
And of course, a lot more answers.
And we may not have those answers right
(57:09):
away.
But what we may have is the ability
to learn from our past and be able
to work forward.
And I'm hoping that many vendors, many companies
will start to understand that AI really is
a tool.
And we as, whether we're technical or non
-technical, we are the deployers of that tool.
(57:30):
And we have to be responsible for what
it does at all costs.
Ladies and gentlemen, I am John C.
Morley, serial entrepreneur.
I thank you so much for following here
on The JMOR Tech Talk show.
You can catch more of the episodes, replay
this one and other stuff in long form
and short form content by visiting BelieveMeAchieved.com.
Again, I wish you happy holidays.
(57:51):
Merry Christmas.
A very happy, healthy, prosperous 2026.
May it bring all that you ever could
imagine.
And I'll catch you guys next year on
our show.
All right.
Be well, everyone.
And again, happy new year.