
January 3, 2025 58 mins

🌟 Tech Tidings Unveiled: A Deep Dive into This Week’s Innovations! 🌟 It’s time for another groundbreaking episode of The JMOR Tech Talk Show! This week, we’re covering the hottest developments in tech, cybersecurity, AI, and beyond. 🌐 Don’t miss the full episode, releasing within 24 hours on The JMOR Tech Talk Show, and check out exclusive content at Believe Me Achieve! 🎧

🎙️ Detailed Highlights of This Week (S4, E1)

1️⃣ US Adds 9th Telecom to Chinese-Linked Hack List 📡 The U.S. government takes another step in safeguarding national security, adding a 9th telecom company to its blacklist. This comes amidst growing concerns over data breaches linked to Chinese entities, emphasizing the urgency for stricter cybersecurity measures.

2️⃣ Russia Fines TikTok 3M Roubles for Legal Breach 🇷🇺 TikTok lands in hot water after Russian authorities imposed a 3-million-rouble fine for failing to remove content deemed unlawful. The case highlights tensions between global tech giants and local regulatory frameworks, raising questions about compliance and freedom of expression.

3️⃣ Pro-Russian Hackers Target Italy's Foreign Ministry, Airports ✈️ A coordinated cyberattack disrupts critical government operations and major airports in Italy. This breach, attributed to pro-Russian hackers, underscores the vulnerability of national infrastructures amidst global geopolitical instability.

4️⃣ Court Orders Recall of Signify Lighting Over Patent Dispute 💡 A significant patent dispute forces lighting giant Signify to recall products globally. The ruling impacts the availability of essential lighting solutions, sparking debates over intellectual property rights in the tech industry.

5️⃣ EU Airlines Approve Google's Proposed Search Changes ✈️ European airlines greenlight Google's revisions to flight search algorithms. The changes aim to provide travelers with clearer pricing and competitive options, promoting transparency and fair competition in the aviation sector.

6️⃣ Biden Administration Proposes Cybersecurity Rules for Healthcare 🏥 New cybersecurity standards target vulnerabilities in the healthcare industry, aiming to protect patient data and prevent disruptions to essential medical services amidst a surge in cyberattacks.

7️⃣ WuXi to Sell Advanced Therapies Unit Amid US Curbs 🧬 In a strategic move to navigate U.S. export restrictions, WuXi announces plans to sell its advanced therapies division. This decision reflects shifting dynamics in the biotech sector due to global trade policies.

8️⃣ The Paper Passport Era Is Ending 📄 As digital passports become the norm, traditional paper travel documents are being phased out. This transition promises faster processing, enhanced security, and a more seamless travel experience worldwide.

9️⃣ Cadbury Loses 170-Year Royal Warrant 🍫 After nearly two centuries, Cadbury’s royal warrant is withdrawn. This loss marks a shift in its iconic status and raises questions about the brand’s future alignment with tradition and quality.

🔟 Snapchat's AI Chatbot Raises Privacy Concerns 🤖 Snapchat’s My AI chatbot sparks debate after users raise concerns about its data collection practices and lack of transparency. This controversy highlights the growing need for ethical AI development and user protections.

1️⃣1️⃣ Instacart Joins Uber in Seattle Driver Deactivation Lawsuit 🚖 Instacart and Uber collaborate in a legal challenge to Seattle’s new law on driver deactivations. The lawsuit seeks to balance worker rights with app company policies, as debates intensify over the gig economy’s future.

1️⃣2️⃣ AI Revolution Impacts Benefits Appeals and Landlord Disputes 🏡 AI is transforming legal landscapes, expediting benefits appeals, and redefining landlord-tenant conflict resolution. These advancements promise efficiency but raise concerns about fairness and human oversight.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:06):
Hi everyone, I'm John C. Morley, the host of
The JMOR Tech Talk Show and Inspirations for
Your Life.

(00:59):
Well hey everyone, it's John C. Morley here, serial
entrepreneur.
You know, it is great to be with
you on Inspirations for Your Life.
First, I want to extend a happy, healthy,
prosperous, abundant 2025 to everyone out there listening
tonight, or that's deciding to listen a little

(01:23):
bit later.
Again, just want to take the opportunity if
you're just catching us.
My name is John C. Morley here, and I
do want to wish you a very happy,
healthy, and amazing new year, and that all
your dreams can become a reality.
All right, so ladies and gentlemen, do you
know what this is tonight?
Well, it's January 3rd, 2025, but beside being

(01:45):
in 2025, we've hit a milestone, and I'm
really happy to celebrate, not just have we
launched so many different shows and so many
different episodes of these shows, but tonight we
actually turn a pretty big corner.
And what that means is that we are
now in series four.

(02:07):
So show one, and we're in series four,
which means we're celebrating being past the third
year and now into the fourth year.
And in case you're wondering, ladies and gentlemen,
we have a lot of episodes on JMOR
Tech Talk.
JMOR Tech Talk, as of tonight, has
over, it's going to have over 69 episodes,

(02:29):
so tomorrow it'll basically be 70 episodes once
this publishes out.
So great to have you there.
If you are a station manager and looking
to get content for your station, do want
to let you know that both JMOR
Tech Talk and Inspirations for Your Life have
adopted a pretty strict standard in that we
now follow a real strict time crunch.

(02:52):
So if you're watching one of our half
hour shows, like most Inspirations for Your Life,
it ends at exactly 28 minutes.
JMOR Tech Talk ends at 58 minutes.
So that means that you can plug us
in, if it's a second or two over,
it might just be some credits or something,
but you can easily plug that into a

(03:12):
28 for those that are doing the 28
minutes.
And for JMOR Tech Talk, you know,
we're always going to be 58 minutes.
So great to have you here, ladies and
gentlemen, and I hope you can check out
all my great content, which you can do
that through many of the different syndication platforms
where you can choose to license it.
So welcome, everyone, to the JMOR Tech

(03:33):
Talk show.
Once again, tech tidings unveiled.
This is a deep dive into the week's
innovations.
It's time for another groundbreaking episode of what
I like to call the JMOR Tech
Talk show.
And this week, I'm actually going to be
covering some of the hottest developments in tech,

(03:53):
cybersecurity, artificial intelligence and beyond.
So don't miss the full episode.
This release is within 24 hours or less
of the live show airing.
And ladies and gentlemen, we are now on
TikTok.
I'm really happy to mention that.
We've been on TikTok for Inspirations for Your
Life, but on the JMOR Tech Talk

(04:16):
show, we only stream it once a week.
So this is probably only our second time
being on TikTok live.
So it's great to be on TikTok with
the JMOR Tech Talk show.
And you can always learn more at BelieveMeAchieve.com.
So detailed highlights of this week are pretty
amazing.

(04:37):
And I think a lot of people don't
realize what all this means.
But the truth of the matter, ladies and
gentlemen, is that it's pretty powerful what's going
on.
I mean, we're talking like major, major, major
power.
And so I want to just kick it
right off with, I think, something that you

(04:59):
guys are going to find, let's say, pretty
interesting.
And that's going to be our very first
point.
So our very first point is one that
I am happy to talk to you about.
And that is none other than the U

(05:21):
.S. adds the ninth telecom to the Chinese
linked hack list.
Ouch.
So, you know, a lot's been going on
with the Chinese.
And I'm sure you guys know this is
nothing new.
But in one breath, it's been a lot.

(05:41):
It's been a lot for most people to
handle.
And I want to let you know that
it's important that we understand, like I said,
what's going on, because if we don't, it
could be a really serious problem for the
world.
So the U.S. adds the ninth telecom
to the Chinese linked hack list.

(06:03):
The Chinese-linked cyber espionage group Salt Typhoon
accessed networks to geolocate individuals and intercept
calls, and the Cybersecurity and Infrastructure Security Agency (CISA)
advised government officials to use encrypted communication apps
to mitigate risks as metadata from numerous Americans

(06:24):
was compromised.
Federal officials, including Senator Ben Ray Lujan and
FCC Chairwoman Jessica Rosenworcel, have called for urgent
measures to secure communication networks, while Chinese officials
deny the accusations.
Of course, they do.
The U.S. plans updated rules to prevent

(06:44):
future intrusions of this scale, because this can
be a major, major problem.
And many of the homeland security and the
U.S. security administrations are advising people not
to use passwords like, you know, common ones
such as your pet's names.

(07:05):
And why is this so important now?
It's because the Chinese are trying to get
in and they are hacking into many people's
home routers.
So when you get one of these new
routers, even though it's already programmed with a
custom password, sometimes these bad actors in the
factory are scanning this information.

(07:27):
And guess what?
They could use that to hack your devices.
So let's not be part of that.
And let's stay educated and let's make sure
that what we do keeps us all safe.
All right, everyone, I think that's that's important.
I always say it's better to be proactive
than to be reactive.

(07:48):
It's also less money, too.
So Russia fines TikTok three
million rubles for a legal breach.
TikTok lands in hot water after Russian authorities
imposed a three-million-ruble fine for failing to
remove content deemed unlawful.

(08:10):
The case highlights tensions between global tech giants
and local regulatory frameworks raising questions about compliance
and freedom of expression.
But I think at the end of the
day, you know, we have to understand that,
you know, when something goes on social media
and, you know, we're putting it out there,

(08:32):
it's not 100 percent safe.
OK, and even though there should be rules
against things, if you post something on social
media, assume that it can never be removed.
OK, that's the best thing I can always
advise everyone.
And again, this ruling highlights ongoing regulatory scrutiny
faced by the tech platforms in Russia.
And I think it's going to get a

(08:52):
lot worse.
It's going to get a lot worse.
But hopefully this is going to wake the
American people up and say, hey, you know
what?
We got to do something about this.
We really have to do something.
And, you know, the time to act is
now and it's better to be proactive than
to be reactive, like I said.
And ladies and gentlemen, the pro-Russian hackers

(09:14):
hit Italy's foreign ministry airports.
A pro-Russian hacker group called NoName057(16) claimed
responsibility for a cyber attack that disrupted approximately
10 official websites in Italy, including
those of the foreign ministry and Milan's Linate

(09:36):
and Malpensa airports, using a distributed denial
of service (DDoS) attack.
The hackers briefly rendered the sites inaccessible, citing
retaliation against Italy's perceived Russophobia.
Italy's cybersecurity agency quickly mitigated the attack, restoring

(09:57):
operations within two hours.
Despite the disruptions, flights and mobile airport apps
remained unaffected.
This incident underscores ongoing cyber threats linked to
geopolitical tensions.
And as I said, ladies and gentlemen, this
is going to get a lot worse.
Unfortunately, before it gets better.
So let's be mindful.

(10:18):
You know, you've heard the saying, if you
see something, say something.
Be proactive.
Okay, the authorities don't mind getting a call
because you're trying to be protective.
And I think there's no harm in being
too cautious.
But right now, I think we're being a
little bit too, how can I say, trusting.

(10:42):
And this all stemmed from the 9/11
attacks.
And then people got very vigilant.
And then they started to get complacent and
comfortable.
Oh, everything's okay.
Nothing's gonna happen.
I'm not here to scare you.
But I am here to tell you that
you need to be proactive and you need
to be aware of your surroundings.

(11:03):
Number four for this week, ladies and gentlemen,
the court orders a recall of Signify.
Yeah, this is a real, a real interesting
one.
Court orders a recall of Signify lighting over
a patent dispute.
A German court has ordered the recall and
destruction of certain Signify lighting products for infringing

(11:23):
patents owned by Seoul (S-E-O-U
-L) Semiconductor. Distributor Conrad Electronic faces up to
250,000 euros per violation if the order is
not followed.
Signify spun off from Philips in 2016, which basically happened.
They were the largest lighting manufacturer and it

(11:46):
was spun off from Philips in 2016.
Signify has stated that the patents in question
have expired and is pursuing legal action to
invalidate them.
Seoul Semiconductor, a South Korean firm investing 10
% of its revenue in R&D, holds
over 18,000 LED, that's light-emitting diode,
related patents and has a history of enforcing

(12:10):
its intellectual property rights.
So, I mean, I think it sounds like
a case of a miscommunication and I think
this needs to get ironed out pretty quickly
because this miscommunication could cost lots of money.
So, Signify LEDs have been very well known

(12:34):
in the industry for quite a while and
I think this could cause a lot of
problems.
You're talking about Philips, you're talking about Cooper
Lighting Solutions, you're talking about Advance, Bodine and
Wiz, just to name a few.
And you're talking about things like Exit and

(12:57):
Entry, a wide variety of exit and emergency
products, including wet-location exit signs.
Philips Dynalite, offering powerful integration tools to bring
your next project to life.
So, I think the issue with this is
that there is a big miscommunication.

(13:19):
And so, when we think about Signify and
Signify lighting LED, they're used all over the
place.
All right.
And you could buy Signify just about on
almost any site, but Signify puts together commercial
lighting products.

(13:40):
And it just seems right now that somebody
has got their wires in a twist.
What I like to say is there is
a miscommunication here.
And I hope that they can resolve this
soon because it seems like it's just a
very big misunderstanding.
But if they don't resolve it, this could
result in thousands, if not millions of dollars

(14:02):
in fines that are going to need to
be paid.
So, hopefully they do get on their toes
soon about that.
And ladies and gentlemen, the European Union Airlines
approves Google's proposed search changes.
Google's proposed changes to its search results, aimed
at complying with the EU (European Union) Digital

(14:23):
Markets Act, or DMA,
have received support from Airlines for
Europe, which includes major carriers like Air France,
KLM, and Lufthansa.
The changes are designed to prevent Google from
prioritizing its own products and services in search
results as required by the DMA, the Digital
Markets Act.
The airline group expressed approval for a horizontal

(14:47):
layout that ensures equal visibility for airlines and
comparison sites, as well as the use of
blue to distinguish these results.
However, they raised concerns over Google's proposal to
use indicative dates for flight bookings, arguing that
specific dates are crucial for a positive user
experience.

(15:07):
Google has indicated it may revert to its
old search format if stakeholders cannot agree on
the new system.
You know, if I have to think about
this, this just sounds like Google being like
a five year old kid.
I mean, this is what it sounds like
to me.
And I've got to be honest with you,
I've never believed Google to be fair.

(15:28):
I've had challenges with them, with our businesses,
and we haven't done anything wrong.
They get a little bit nuts.
And the problem is, is that it's either
Google's way or the highway.
And so a lot of people think just
because they're Google, they can do whatever they

(15:48):
want.
And I'm here to tell you, ladies and
gentlemen, these big tech companies cannot do whatever
they want.
They may think they can do whatever they
want, but they cannot do whatever they want.
And so Google's kind of like this, you
know, two or three year old that you've
got to pacify.
Unfortunately, they have some wares that people are

(16:12):
interested in.
But unfortunately, they like to blackmail and hold
them over people's heads, which I think is
a big problem.
And I think something needs to be done
from the Biden administration to the new Trump
administration to make sure that these companies that are
acting like five year olds and younger actually

(16:33):
do the right thing all the time.
Google only seems to want to do the
right thing when they're going to get slapped.
And by slapped, I mean getting hit with
some pretty hefty fines.
So Google, you don't fool me.
I've stopped using you as a search engine
a long time ago because I don't like
the fact that you actually sell my data,

(16:55):
to third party companies, and you make money
off of it and you compromise my privacy.
So that's a real problem.
And the Biden administration proposes a set of
cybersecurity rules for health care.
The new cybersecurity standards target vulnerabilities in
the health industry, aiming to protect patient data

(17:17):
and prevent disruptions to essential medical services amidst
a surge in cyber attacks.
Now, although I do think this is a
great idea, my concern, ladies and gentlemen, I
have to tell you this right up front,
is going to be transparency.
Is this really going to do what they
want?
Are they going to be able to control
this at a granular level?

(17:38):
Or does it mean that this is something
to placate?
Meaning that, you know, we're going to do
this.
And I hope that's not the case, especially
with a president like Donald Trump to be
sworn in in just a few weeks.
Hopefully he understands, you know, what's going on.
And the fact that, unfortunately, gentlemen, you know,

(18:00):
these rules seem new right now.
But, you know, this is something that should
have happened years ago.
And it's only because people's data are starting
to get basically exploited.
And you might ask, well, you know what?
And this is a very good question.
What are these rules?
You know, what are the rules of, you

(18:22):
know, the Biden administration?
So there's lots of new rules that they're
proposing.
And I think the concern with a lot
of people here is that there's still
a transparency issue.
That's my biggest concern right now.
I mean, I think they are still trying
to figure things out, depending on who you

(18:45):
talk to, what sites you go to.
Healthcare organizations may be required to basically up
their cybersecurity to better prevent sensitive information from
being leaked by cyber attacks like the one
that Ascension and UnitedHealthcare got hit by recently.
So, you know, having several clients of ours

(19:08):
that are in the healthcare industry, I have
to tell you that, you know, we tell
them what they need.
And most of the small and medium sized
companies and even small hospitals, they pay attention.
But it's these larger companies that think they
own the world and they're really being foolish

(19:30):
because they don't want to spend a little
bit of money.
What's a little bit of money?
Well, it depends.
I mean, they could be talking anywhere from
a few thousand dollars a year to maybe
five or ten thousand dollars a year.
But if you take that amount of money,
let's just say hypothetically, let's say for one
hospital, let's just
take the numbers and make

(19:50):
it pretty easy.
Let's say for your hospital, let's just say
it was costing ten thousand dollars a year
for this extra protection.
OK, and let's say, I mean, if we
had to think about it, you know, how
many does an average hospital hold?
Well, they hold quite a bit.
OK, an average hospital can hold anywhere.

(20:13):
Let's say the middle, like 500 beds roughly
and up.
So but you got a lot of other
people that could be affected.
So let's say out of that, let's say
there might be 1500 people because there could
be some, you know, quick in and out
patients, right?
Now, let's say that's what comes through their
doors.
You know, in let's say if we said,

(20:35):
let's say in a year.
Maybe they flip their beds a couple of
times, right?
So let's just take a rough number.
And let's say that if we take, you
know, obviously 500, you know, assuming that they
flip.
Let's say they flip four times.

(20:57):
OK, and let's say on top of that,
there might be another 1,500 of walk-in traffic
that comes for like quick business.
So we're thirty five hundred.
OK, so if we take ten thousand dollars
and we divide that by thirty five hundred,
that comes down to two point eight five.

(21:18):
OK, so two point eight five.
OK, so when we think about those numbers,
we're not talking millions.
OK, we're talking dollars.
All right.
And if you divide that, OK, by, you
know, let's say you divided that

(21:38):
by, you divide it by the year, hypothetically,
roughly.
You're talking about less than a high priced
coffee or beverage at Starbucks.
So it's not that they don't have the
money, but they're trying to be a little

(21:59):
bit conniving in the fact that they want
to put more of it in their pocket.
So I think these new rules will hopefully
force hospitals to do what it is they
need to do and stop playing games with
people's protection and, you know, putting a few
extra thousand dollars a year isn't going to
make them go broke.

(22:20):
And it isn't going to really cut into
their profits.
But ladies and gentlemen, let me just tell
you this.
If one of those cases were to come
up and let's say a violation was to
be hit.
Let's just say one violation.
We could be talking twenty thousand dollars.
Let's say that there was one violation every
month.
OK, that is two hundred forty thousand dollars.
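For anyone following along with the math, here is a minimal sketch of the back-of-the-envelope arithmetic described above. Every figure in it (the $10,000 annual spend, 500 beds, four bed turns, 1,500 walk-ins, and the $20,000-per-violation fine) is a hypothetical assumption from the show, not real hospital data.

```python
# Back-of-the-envelope arithmetic, using the hypothetical figures from the show.
ANNUAL_SECURITY_COST = 10_000   # assumed extra cybersecurity spend per year
BEDS = 500                      # assumed mid-sized hospital
BED_TURNS_PER_YEAR = 4          # assumed number of times each bed "flips"
WALK_INS = 1_500                # assumed quick in-and-out visitors per year

people_per_year = BEDS * BED_TURNS_PER_YEAR + WALK_INS    # 3,500
cost_per_person = ANNUAL_SECURITY_COST / people_per_year  # roughly $2.86 (the "two point eight five" above)

# Compare that with the downside described: one $20,000 violation every month.
fine_exposure_per_year = 20_000 * 12                      # $240,000

print(f"People per year: {people_per_year}")
print(f"Protection cost per person: ${cost_per_person:.2f}")
print(f"Potential fine exposure per year: ${fine_exposure_per_year:,}")
```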

(22:44):
You're telling me you can't afford a cup
of coffee every day to protect your staff
and your patients.
Now, I only gave you the numbers based
on the patients.
It's actually a lot lower because if I
included all the staff that worked at the
hospitals and the patients, there's a lot
there.
But when we talk about HIPAA, we're mostly talking

(23:08):
about, you know, the patients.
Right.
And it is important to keep everyone's information
safe.
The Health Insurance Portability and Accountability Act.
It's a federal law that protects the privacy
and security of health information.
And it was signed into law by President
Clinton, Bill Clinton, on August 21st, 1996.

(23:28):
So I think they're going to have to
toe the line.
If they don't toe the line, well,
they're going to start getting hit with
some pretty hefty fines.
And then, you know what's going to happen?
They're going to wake up.
Oh, you know what?
We'll be happy to pay that.
Yeah.
Meanwhile, they're going to have a lawsuit on
their hands.
So, like I said, the Biden administration proposes
cybersecurity rules for health care.

(23:50):
And the rules will update the HIPAA standards,
including the measures like data encryption, compliance checks
to ensure the networks are really secure.
With health care data breaches rising, it's important.
Like I said, it's a big problem.
And with the breaches rising, you know, we're seeing

(24:13):
leaks of sensitive information, which affected over
167 million people in
2023.
And that number is growing.
So the proposed changes are expected to cost
$9 billion in the first year with ongoing
expenses for compliance and will undergo a 60
-day public comment period.
But the thing is, these numbers are assuming

(24:36):
you have no protections in-house.
If you've been doing the right thing and
you just need to upgrade, it's not going
to be that much.
And again, it can be factored down into
a per patient cost, which hopefully will make
this something more attractive for administrative people.

(24:58):
And, you know, it also might give patients
the choice of, you know, which hospital do
they want to work with?
Patients do have a choice.
And if your doctor or if your medical
facility is not following certain standards, well, you
might just not want to work with them,
right?
So that could be a serious, serious problem.

(25:19):
And WuXi, yeah, that's W-U-X-I,
is supposed to sell their advanced therapies unit
amid the U.S. curbs.
And this is an issue.
So China's WuXi AppTec has agreed to sell
its cell and gene therapy manufacturing unit, WuXi

(25:39):
Advanced Therapies, as well
as its UK-based Oxford Genetics entity, to the
U.S. private equity firm Altaris LLC. The
move comes amid heightened U.S. restrictions on
Chinese firms due to national security concerns, particularly
targeting companies involved in sensitive areas like health

(26:00):
care and genetics.
Now, a new U.S. bill passed in
September aims to limit federal contracts with firms
linked to Chinese entities, pushing U.S.
companies to reduce their reliance on China for
pharmaceutical and biotech operations.
The terms of the deal were not fully
disclosed, but a lot of people are getting

(26:22):
scared.
And I think it's a very good idea,
ladies and gentlemen, that we are actually making
the choice.
Well, the choice is being made for us
because what's going to happen if you don't
do this, you're going to get hit with
lots of fines.
And I just see this being a huge,
huge problem.
So we've definitely got to be concerned with

(26:42):
what's going on there and make sure that
we're doing more stuff in the U.S.
We've all known that our own data can't
always be trusted outside of our country.
So I think it's better that we do
keep things in the United States.
I don't know if you guys know this,
but there is a rumor, but it seems

(27:03):
like it's going to be coming true.
The ending, ladies and gentlemen, of the paper
passport.
So I know a lot of you probably
carry your passport pretty close to you when
you travel.
The paper passport is gradually becoming obsolete as
facial recognition technology and smartphones are replacing traditional
travel documents.

(27:23):
The shift aims to speed up airport procedures
and reduce friction during international travel.
Countries like Finland, Singapore, and the UAE are
trialing systems where travelers' facial data, once stored
on a physical passport, is now digitally linked
to their smartphones.
However, privacy concerns are rising, with experts warning
about the risks of surveillance, data breaches, and

(27:45):
system malfunctions.
Despite these concerns, the adoption of digital travel
credentials, DTCs, continues to grow with plans for
further integration into other areas like hotels and
historical monuments.
So although I think it's good we're getting
away from paper, I'm concerned about this whole

(28:07):
idea that the information is going to be
in a central repository as opposed to just something
on somebody's phone, you know what I'm saying?
I like the idea from when we talked about
COVID: many of us that got our
vaccinations received a digital certificate, and I still
have one on my phone.
The digital certificate proved that I got vaccinated.

(28:32):
So you might be saying, John, how does
that work?
Well, it's really simple.
But the thing is, I just go to
my phone, I pull up my payment app,
and it has them all stored in my
wallet.
And so I think that would be a
good idea.
I don't like the idea that it could
be linked to a national system.
I just see that being trouble.

(28:55):
I see that being a big problem
for a lot of people.
And what if it falls in the wrong
hands?
And how do we know that what we're
really doing is safe?
I mean, it's not like we're trialing it
very long.
I mean, you might be asking yourself, you
know, is there a privacy issue with digital

(29:15):
passports?
And many of us say, well, of course
not.
I'm here to tell you there is.
Um, because the information is being stored somewhere
else.
OK, it may sound like a phenomenal idea.
OK, but your privacy is so very, very
important.

(29:36):
And there's a lot of discussion about, you
know, whether to create the new system or
not.
I mean, there was even the question about
digital vaccine passports.
I think that was OK because, you know,
they got processed and then they got sent
off to us by email.
We can install them.
And that was it.
It wasn't like it was linked to some
big database that got monitored.

(29:59):
So I see that as a concern.
And, you know, facial surveillance, I see as
an issue.
And you might be asking me, so why
is facial recognition a security concern?
And you'd be right to ask that question.

(30:23):
So when I go to the airport, first
of all, I'll let you know that I
don't participate in that.
When it comes up to that time, I
just stand to the right of the camera
and I just tell them I'm opting out
of the facial scan.
And so they go through a few extra
checks, right?
So, you know, when the facial recognition is

(30:44):
local, it's one thing.
But when it's stored on a database that's
nationally accessible, you got issues like lack of
consent.
Faces are becoming easier to capture.
And if the face image is unencrypted, well,
somebody could just steal that and then use
that to verify.
Technical vulnerabilities with facial recognition technology, we call

(31:07):
it FRT.
It could be possible to spoof a system,
masquerade as a victim by using pictures or
three-dimensional masks created from imagery of a
victim.
FRT can be prone to presentation attacks,
that is, the use of physical or digital spoofs,
such as masks or deepfakes.
Inaccuracies can be a problem.
It's also an issue for, let's say, people

(31:30):
of different skin colors, of different nationalities.
It's profiling them when there's nothing wrong with
them because the system hasn't been able to
handle that kind of a situation.
So I see that being a very, very
big problem.

(31:51):
And here's one that's worth mentioning.
How many of you guys know Cadbury?
Do you guys know Cadbury chocolate?
So Cadbury started their first
chocolate place.

(32:13):
It was a while ago.
The original Cadbury business started in 1824 in
Birmingham, England.
And so they were going to provide an
alternative to alcohol, which they believed was a

(32:33):
major cause of poverty in the area.
But Cadbury started doing all kinds of things,
making chocolate.
And the thing about it is that most
people have had Cadbury.

(32:55):
But some have not.
So while it started on March 4, 1824,
Cadbury sold tea, coffee, drinking chocolate, cocoa, hops,
and mustard in their shops.
There is a pretty long life history of
Cadbury.

(33:15):
Cadbury in 1831 moved into a factory in
Bridge Street to produce a variety of cocoa
and drinking chocolates.
In 1847, John Cadbury partnered with his brother, Benjamin,
and the company became known as Cadbury Brothers.
In 1849, Cadbury introduced their brand of chocolate

(33:37):
bar.
In 1854, Cadbury received the Royal Warrant as
manufacturers of chocolate and cocoa to Queen Victoria.
In 1878, Cadbury bought a 14-acre meadow
just outside Birmingham and named it Bournville.
They moved their production to Bournville to create

(33:58):
a better quality of life for their employees.
In 1897, Cadbury created their first milk chocolate
bar to compete with the Swiss chocolate.
In 1988, the Hershey Company acquired the U
.S. Cadbury license.
So pretty interesting, right?
So you might be asking, is Cadbury owned

(34:24):
by Hershey in the U.K.? So Hershey
does not own Cadbury in the U.K.,
okay?
Cadbury is connected to Hershey basically through a
license.
So Hershey does not own the whole Cadbury brand,

(34:46):
but they have a license agreement.
Hershey has a license agreement with Cadbury Schweppes'
affiliate companies to manufacture, market, and distribute Cadbury
Caramello products in the United States.
Hershey's Pennsylvania facility produces Cadbury mini eggs, which
you probably see around Easter time, and various
sized bars like Cadbury Caramello.

(35:06):
Unfortunately, gentlemen, I've got some bad news for
you.
It has been many, many years, as I
said, that Queen Victoria had bestowed the royal
arms on Cadbury in the U.K., and
just recently, Cadbury loses their 170-year royal

(35:27):
warrant, their coat of arms.
After 170 years, Cadbury has lost its royal
warrant, marking a significant shift in the chocolate
maker's longstanding relationship with the British monarchy.
It was first granted this royal warrant by
Queen Victoria, as I said, in 1854.
Cadbury's royal endorsement has now been removed under

(35:50):
King Charles. The company, now owned by Mondelez
International, expressed disappointment over the decision, which affects
its ability to use the royal coat of
arms on packaging.
Royal warrants are granted to businesses that supply
goods or services to the royal family, and

(36:10):
losing this endorsement could impact Cadbury's brand image
and their costs.
The loss of the warrant follows a broader
trend of companies, including Unilever, being stripped of
their royal endorsements due to political pressures.
So the question I know a lot of
people are asking every day is, you know,
why did the crown drop Cadbury?

(36:37):
And, you know, we're not given a lot
of information about this.
King Charles III dropped Cadbury from the royal
warrant because of pressure from campaigners who urged
the monarch to distance himself from companies operating
in Russia, particularly due to Cadbury's parent company,

(36:57):
Mondelez International, continuing production there during the Ukraine
war.
So I understand.
It had nothing to do with the product.
It had nothing to do with its quality.
It was a political issue.
And the decision came after a campaign by
the B4 Ukraine group.

(37:18):
Although Buckingham Palace has not officially confirmed this
as the sole reason for removal, we can
bet that it probably is the reason.
I mean, if they went up a few
dollars, I don't think that would get the
British monarchy to drop Cadbury.

(37:38):
So it's a sad day for them.
And it's the first time in 170 years
that Cadbury has been removed from the royal
warrant list.
So you might be asking, so who will
replace Cadbury for the crown?
And everyone's been asking that question, you know,
who is going to replace it?
No company has replaced the supplier to the

(38:00):
royal household as of yet.
But again, they decided to not renew the
chocolate company's royal warrant in December of 2024.
Now, one thing I would have said, King
Charles, is that I wouldn't have dropped, if
it was me, I wouldn't have dropped the
chocolate company until I found another one, or

(38:22):
I would have said to them, look, you
know, we still want to do business with
you, but we can't if you're going to
keep manufacturing over there under this name.
So, excuse me, makes sense there.
Just had dinner a little while ago.
It was a great German dinner, but still
leaving me a little bit in my chest
there.
So anyway, I think, you know, this whole

(38:43):
thing about, you know, will, you know, the
warrant, what will happen to Cadbury now that
the warrant is gone?
This is a big concern.
So what will happen to Cadbury now that
they lost the warrant?
You know, it's hard to say.

(39:03):
They can't display the coat of arms.
They're disappointed.
And it might affect their sales.
It's a problem.
And the thing is, you know, when you

(39:24):
start playing games with other countries internationally, this
becomes a big problem.
But I'm really concerned to know who is
the Royal Crown going to pick as their
chocolate provider?
I mean, they could still buy chocolate once
in a while, but that doesn't mean that
they're endorsing the product.

(39:45):
You know what I'm saying?
Just trying it out.
Just like there are some companies that sell
cheese to the royal monarchy.
But I think it just comes down to
one thing.
And that is, you know, everything is political.
Even if we say it's not political, it
is.
So the best thing I always advise people
is to keep politics out of business.

(40:06):
Because if it inadvertently gets introduced and the
views are opposing, well, it could affect the
potential deals and opportunities that you have now
and the ones that are in the future.
So I think that's a big problem.
And number 10, ladies and gentlemen, Snapchat's AI

(40:26):
chatbot raises some very big privacy concerns.
Yes, Snapchat's new AI chatbot, My AI,
powered by ChatGPT, has sparked a significant concern
among parents and users, especially those with teenagers.
The feature, which provides personalized conversations, recommendations, and

(40:47):
even the ability to customize a bitmoji avatar,
has led to complaints about its potential risks.
Parents like Lindsay Lee worry about the emotional
impact on young users, as the chatbot can
sometimes blur the line between human and machine.
Some users report unsettling interactions, privacy concerns, and

(41:07):
confusion over the tool's behavior.
Despite these issues, some teens find value in
it, using it for homework help, or as
a source of advice.
However, concerns continue to grow about its influence
on mental health and privacy, prompting calls for
more regulation and better user safeguards.
So the question is, what is, and this

(41:29):
is a really, really good question, um, what
is, uh, Snapchat going to do about this,
uh, AI child-safety concern?
And right now, I think they're just trying
to mull it over.
Um, they're just saying that it's acting, or

(41:50):
the parents are saying that it's, uh, interacting
with them in inappropriate ways.
Uh, we all know about the, uh, it
was an AI system.
It was actually an AI system in, uh,
the UK, uh, that was, um, let's just
say using, uh, inappropriate language, uh, and was

(42:10):
shut down.
So, um, it was actually by, if I
remember right, a postal company.
And, uh, you know, this is pretty amazing,
right?
Uh, it was shut down due to using
inappropriate language.

(42:31):
Um, and the company was DPD, dynamic parcel
distribution, which began, um, with providing support.
And then the system was swearing at customers
and even writing negative poems, criticizing the company
itself after being prompted by a user, uh,
leading to the company disabling the AI feature
of their chatbot.

(42:51):
So I think we have to be concerned
about AI.
We have to be concerned about what information
we give it and what parameters we allow
it to, um, exchange, right?
Many people don't know this, but a lot
of the chat systems out there, chat GPT,
Claude and other ones out there, they, uh,
have, um, some very strict, uh, settings in

(43:14):
them, which will not allow you to get
it to say certain things or do certain
things for fear that it could cause, uh,
inappropriate responses.
And ladies and gentlemen, this is a really
interesting one, which I know you're going to
love.
Um, uh, our friends, um, basically Instacart joined
Uber in a Seattle driver deactivation lawsuit

(43:38):
over a new law regulating the deactivation of
gig workers, such as delivery drivers and grocery
shoppers.
The law passed in 2023 requires companies to
give workers a 14 day notice before deactivating
them, ensuring deactivations are based on reasonable
policies and providing human review and records of
the decision.

(43:59):
Instacart argues that the ordinance violates constitutional rights,
compromises customer safety, and imposes burdensome data disclosure.
The law aims to offer more job security
for gig workers, but the lawsuit highlights ongoing
debates about worker protections and corporate autonomy in
the gig economy.

(44:19):
So I think this is, uh, one of
those first issues where we're going to see
a lot more of this with apps.
When the apps don't do the things appropriately,
guess what happens?
They either get curtailed or they get removed.
And I see this becoming a big problem.
If the app is not fixed properly, uh,

(44:39):
and these policies are not implemented, I think,
uh, they're going to make some changes to
this.
So stay tuned because we'll be following that
case.
And ladies and gentlemen, this is a real
cool one.
Uh, an AI revolution impacting benefits
appeals and landlord disputes.
I think this one is one that a

(45:00):
lot of people didn't realize, but I told
you that AI was going to get into
the legal system and legal was going to
get into the AI, whether you like it
or not.
So AI is revolutionizing access now to justice
systems by assisting in legal cases, such as
benefits appeals and landlord disputes, especially for individuals
who cannot afford expensive legal services.

(45:23):
The Westway Trust in London, for example, uses
AI tools to analyze complex documents and identify
key facts, helping staff provide efficient legal advice.
These tools save hours of manual work, making
legal help more accessible to vulnerable populations.
AI is also being explored in courtrooms to
analyze witness testimony and evidence, emphasizing that human

(45:47):
oversight is crucial to ensure accuracy and prevent
bias in AI-generated results.
The ongoing evolution of AI law promises to
improve fairness and reduce the financial burden of
legal battles.
But I definitely think we can't make it
the be-all end-all, right?
And, uh, here's one I think you're going
to find a little bit interesting.

(46:09):
Mr. President-elect Donald Trump asked the Supreme
Court to halt TikTok shutdown.
Yes, President-elect Donald Trump has requested the
U.S. Supreme Court to intervene and halt
a federal law set to ban TikTok coming

(46:29):
up this month in a brief that was
filed on December 27, 2024.
Trump argued that the potential shutdown could infringe
upon the First Amendment rights of millions of
Americans.
He expressed a desire to resolve the issue,
through negotiations rather than the ban that would
take place.
So he'd rather have the negotiations, uh, try

(46:51):
to resolve this, highlighting the importance of keeping
the popular social media platform operational.
Now, what you don't know, ladies and gentlemen,
is he's not just doing this to be
a good guy.
He's doing this because he got a lot
of the younger votes, the 20s and 30s
people.
So he says, I think we should keep
them around for a little while.
And New York, ladies and gentlemen, yes, uh,

(47:12):
is now going to be monitoring AI usage
under a new law.
The New York state has enacted a law
requiring government agencies, this is pretty cool, to
monitor and disclose their use of artificial intelligence
software.
The law mandates that agencies conduct reviews of
AI tools, including algorithms and computational models, and

(47:35):
submit reports detailing those reviews to both the
governor and legislative leaders.
Additionally, it restricts AI from making critical decisions,
like approving unemployment benefits or child care assistance,
unless overseen by humans.
The law also ensures that AI will not
limit state workers' job duties.

(47:56):
This move is aimed at establishing guidelines for
AI use within state government.
So I said this to you guys before,
AI needs to be managed.
We can't just allow AI to run everything.
Because we all know that AI makes mistakes,
whether you've used ChatGPT or Claude or whatever

(48:16):
you use.
You have to understand that it has a
bunch of information.
And so I think that's a huge, huge
problem.
And I think for whatever reason, people don't

(48:39):
understand that AI is a tool.
And so we have to understand that humans
need to be in the loop all the
time.
So I hope that you've learned a lot
from this tonight.
And I hope that you understand that AI,
technology, tools, they're going to keep evolving.

(49:02):
And let's face it, some are going to
be good, some are going to be bad.
This is why, ladies and gentlemen, we have
to keep a human always in the loop,
or more than one.
And you might be saying, John, you know,
how do we do this?
Well, when we design a new system, a
new process, a new procedure, whatever it is,

(49:23):
we have to make sure that in that
tree, that no final decision gets made without
at least one or two humans signing off
in the workflow process.
So many of you remember how you sign
purchase orders, or let's take a typical association,

(49:45):
maybe it's an HOA, and they have to
pay bills for the association.
And so I know when I served on
them, typically what would happen is, you know,
there was an online system.
And if I was the treasurer, I would
approve the bill, and then the president would
have to approve the bill.
How it would work is, if the bill
was under $500, I could approve it, and

(50:06):
we were good.
If the bill was over $500, I had
to approve it, and the president had to
approve it.
Now, if the president approves something, I still
had to approve it.
So the president would never need to approve
anything smaller.
But the way it worked is, only the
treasurer would be able to approve just one

(50:26):
signature, okay?
If the president had to approve something, it
would only be because something was over $500.
Now, you might say, John, that's crazy, but
there's a reason for that.
So let's say there was a situation where
this guy came in, and he was trying
to fix, I think it was some heaters.
And so let me just let you know

(50:47):
that a new heater would cost about $1
,600, right?
And so he comes in, and he changes
some things around, and he gives us a
bill, right?
And the bill is like $1,950.
And we're like, well, why didn't you just
change the heater?
Oh, well, I figured you wanted to fix it.
Why would we pay several hundred dollars more

(51:08):
to fix it?
So this is why we have it.
It's a checks and balances system.
And this is like this in a lot
of companies.
So we could be talking about a purchase
order.
We could be talking about fulfilling an order
at the company or on the work floor.
And so what typically happens is when the
order comes in, sometimes before the order can
get released, it has to go to a

(51:29):
credit team.
The credit team has to approve that if
it's on, let's say it's on terms.
And then it goes to, and sometimes even
has to get approved by a sales team
first before it goes to the credit team
to make sure that everything is correct.
And then it goes to the floor.
And then once it goes to the floor,
the, let's say, the fulfillment manager has to
approve it so it can get out to

(51:50):
the floor.
So it checks that everything is valid.
Everything is good.
It's coded correctly.
They'll get out to the floor and then
they'll process it.
But what can happen with AI is it
can check and say, hey, you know what?
There are three things missing or there are
three invalid parts.
Are you sure you want me to still
release this to the floor?
So the whole idea is checks and balances.
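Here is a minimal sketch, in Python, of the checks-and-balances idea described above: automation can flag problems, but release still needs human sign-off, and larger amounts need a second approver. The $500 threshold and the treasurer/president roles come from the HOA example on the show; the field names, function names, and validation details are hypothetical illustrations, not any real system.

```python
from dataclasses import dataclass, field

SECOND_APPROVER_THRESHOLD = 500  # dollars; above this, two humans must sign off

@dataclass
class Order:
    amount: float
    missing_items: list = field(default_factory=list)  # filled in by the automated check
    approvals: list = field(default_factory=list)       # humans who have signed off

def automated_check(order: Order) -> list:
    """Stand-in for the AI step: it returns warnings, never a final decision."""
    warnings = []
    if order.missing_items:
        warnings.append(f"{len(order.missing_items)} item(s) missing or invalid -- please review")
    return warnings

def can_release(order: Order) -> bool:
    """Release only once the required number of humans have approved."""
    required = 2 if order.amount > SECOND_APPROVER_THRESHOLD else 1
    return len(order.approvals) >= required

# Example: a $1,950 bill with one flagged part needs two human approvals.
order = Order(amount=1950.00, missing_items=["part-A123"])
print(automated_check(order))                  # AI flags issues for a human to review
order.approvals += ["treasurer", "president"]  # humans stay in the loop
print(can_release(order))                      # True only after both sign-offs
```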

(52:11):
And so when we have checks and balances,
we can also see what's working and what's
not working.
So by using AI or using systems in
the field, if we see there's a lot
of returns on a certain SKU, then maybe
we have to implement another safeguard or
another process where something else has to be

(52:34):
verified.
I'll give an example.
We're selling automated faucets, let's just say, for
a kitchen sink.
And we notice there's a lot of returns.
We sold a million in a year.
Let's say we got a half a million
back.
Why?
They weren't defective.

(52:54):
They just, they said they didn't work.
Well, it wasn't because they didn't work.
It's because out of a lot of installations,
the right questions weren't being answered.
And it's because we didn't ask those questions.
So the part we were shipping was assuming
that it was a new install.
But in a lot of installations, it was
actually an old install.

(53:14):
So there's two ways to handle this.
One, do we add that extra part into
the box, which is an extra $30?
Or let's say in our cost, it might
be an extra $10.
Or do we ask that question, hey, is
this for a new install or an old
install?
And so that's one way of mitigating the
whole process.

(53:35):
And so we can learn from the data.
And I like to say that if we
can learn from our own data, and we
don't necessarily have to share that data.
I like to use AI to learn from
our own data.
And then from that data, we can figure
out what steps do we need to take
next?
How is our process?

(53:55):
How is our flow?
You might have heard of that before, right?
Our flow.
And so I think a lot of people
fear AI because they think they're going to
lose control.
AI is meant to be there to save
us time.
So that redundant tasks can get done quicker

(54:15):
and more efficiently, all right?
Than us trying to, you know, thread needles
when machines can do that in a fraction
of a second.
I'm just giving you an example, but you
can obviously understand.
But it's very important, ladies and gentlemen, to
keep a human or two in the loop.
It's also important to look at the data,

(54:36):
to look at, you know, what's happening, how
it's processing the data, how is it responding
to the data?
What are our returns like?
Is it affecting our sales and different things
like that?
So if we can understand how our data
goes in and how the data goes out,
we can be a lot more productive and

(54:56):
efficient and cost effective.
Because now we know how our data is
being used.
And once we know how our data is
being used, we can then tweak our operations
instead of losing money.
So in that case where we had, let's
say, half a million, let's say, machines, parts
coming back to us, which were these faucets.
And these faucets, let's say, were like $1

(55:18):
,200 a piece.
And we're getting half a million back.
That's a lot coming back to us, a
real lot, and they're saying they don't work.
And so if we think about that, let's
just say that each one of those, hypothetically,
let's say each one of those cost us
$800, OK?
So if we take 800, right, times basically

(55:39):
500,000, OK?
That comes to $400 million, and that's a
lot of money.
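For reference, here is the return-cost arithmetic from the faucet example, using the hypothetical numbers stated on the show (half a million returned units at an assumed $800 internal cost apiece):

```python
UNITS_RETURNED = 500_000   # half a million faucets coming back
COST_PER_UNIT = 800        # assumed internal cost, not the $1,200 sale price

total_loss = UNITS_RETURNED * COST_PER_UNIT
print(f"${total_loss:,}")  # $400,000,000 -- that is, $400 million
```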

(55:59):
So when we think about this, I think
a lot of people don't understand that data
is here to make our world better.
And so the AI world is a world
where things can move and operate.
But sometimes, oftentimes, it gets things wrong

(56:20):
because it doesn't understand the human factor.
It doesn't understand that things are not always
correct.
It doesn't understand that it's not just an
A or a B.
And then switching from an A to a
B could be the difference of something that
costs us $1,000 an hour.
And we, as humans, would know that.
But the AI system might not know that.

(56:41):
And in a matter of a day, we
could have just wasted $12 million because of
wrong decisions.
Does that make sense, everyone?
All right.
So this is our very first show for
January.
This is January 3, 2025.
You can catch previous shows.
You can catch the Inspirations for Your Life show
and many of my other pieces of content
by visiting BelieveMeAchieve.com.

(57:05):
Now, that's a pretty amazing place to go.
And you can check things out there pretty
much anytime you want.
So, ladies and gentlemen, I look forward to
seeing you soon, which will actually be next
week.
So have yourself a fantastic rest of your
weekend.
And I'll catch you guys on January 10th
on Series 4, show number 2.
Take care, everyone, and be well, everyone.