
May 3, 2025 58 mins

In this high-impact episode of The JMOR Tech Talk Show, we dive into the global tech and policy shakeups redefining our digital world. From a U.S. judge confronting Google’s alleged advertising monopoly to the EU dropping an $800 million fine on Apple and Meta, the pressure on Big Tech is rising. Meanwhile, teens are sounding alarms about social media’s impact on mental health, and Prince Harry and Meghan are pushing for stronger online protections for children. Trump challenges fairness in AI labeling it “woke,” as AI itself is rapidly evolving—robots now learn from how-to videos and shape-shifting metabots operate without motors. We explore legal disputes in real estate listings, groundbreaking privacy tools in smart homes, and green tech for ferries. Even gaming gets a spotlight with Dark Souls offering unexpected mental health support, while AlcoWatch smartwatches bring accountability to alcohol tracking. Finally, a cyberattack on Marks & Spencer reminds us of the ever-present need for digital defense. It’s a fast-paced roundup of tech, transformation, and the ethical questions shaping tomorrow.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:01):
Hi everyone, I'm John C. Morley, the host
of the JMOR Tech Talk Show and Inspirations
for Your Life.

(00:45):
Hey guys, how are you?
It is John C. Morley here.
You are tuned into the JMOR Tech
Talk Show and it is great to be
with you.
It is May 2nd, our first Friday of
May, and it's great to be with you.
And we're here for another fantastic, amazing JMOR

(01:06):
Tech Talk Show.
So the title for this week's episode is
Tech Giants Under Fire, Teens Speak Out, and
Robots Learn Fast.
Of course, we're on series four and this
is show number 18.
I want to welcome everyone for popping on
by and tuning into the show.
By the way, if you'd like to check

(01:29):
out some of my other content, you can
do that by going to believemeachieved.com.
Of course, not while you're watching this, but
24 hours a day, I serve up all
kinds of content, whether it's short form content,
long form content, there's so much stuff.
And JMOR.com even has JMOR bite-sized
blunders for technology that's just amazing.

(01:50):
And there's so much great stuff out there.
You're going to want to check out
believemeachieved.com for more of my amazing, inspiring creations.
All right, guys, maybe you want to head
over to your kitchen and get something yummy
and delicious.
Maybe it's hot, maybe it's cold, maybe it's
sweet, maybe it's sour, maybe it's healthy or

(02:10):
not.
That choice is up to you.
So let's go ahead and get this show
kicked off.
I am your host, ladies and gentlemen, a
privilege, pleasure and honor to be with you
once again, John C.
Morley, serial entrepreneur, not only a podcast host,
but a podcast coach.
And it is such a privilege to be
with you once again here on another show.

(02:31):
All right.
So the JMOR Tech Talk Show has
been sharing some amazing insights.
Tonight, we're going to be talking about everything
from the courts to the code, big tech
battles and bold innovations, so strap in for
another must-hear episode.
You're not going to want to leave, yes,

(02:52):
your computer or device.
So go get that drink, go get that
snack, fruit, whatever, and hurry on back.
All right.
Because this is the JMOR Tech Talk
Show, where I break down tech's biggest headlines
from courtroom crackdowns on monopolies and social media
impact and how it affects teens' mental health

(03:13):
to game-changing wearables and privacy-saving devices.
Whether you're here for policy drama, digital breakthroughs,
or the AI evolution, this episode is packed
with powerful information that you're not going to
want to miss.
So sit back, relax, enjoy your beverage or
snack, and away with our show.

(03:35):
So a US judge has scheduled a hearing,
which is actually happening today, on Google's alleged
ad monopoly and possible breakup remedies.
Everybody thinks that Google can do no wrong.
Just because Google has a lot of money
doesn't mean you can break the law, right?

(03:57):
And I think Google's starting to realize this
now.
A major legal moment is brewing as the
US judge has already set this hearing date,
which was actually earlier this morning, to address
Google's dominance in the online ad market.
Now, antitrust concerns are popping up all over

(04:18):
the place, and they've escalated even higher, with
regulators pushing for potential structural changes, possibly even,
that's right, breaking up Google's ad business.
Wow.
So this case could redefine the balance of
power in the digital advertising world.
Now, it might not just be limited to

(04:41):
Google.
We might find these repercussions percolating all the
way down to companies like Facebook, Meta, whatever
they want to call themselves these days, Yahoo,
and anybody else that actually is using any
kind of social media.
We might find this on TikTok.
We might find this on Instagram and any
other platform that we're talking about, from Snapchat

(05:04):
and so many others that are starting to
just emerge.
I think the reason why Google was able
to do this is, again, they have so
much money.
And they feel that if they have so
much money, they could just buy everybody, right?
But I think they're starting to realize now
that you cannot buy your way out of
this, because they've gotten hit with so many

(05:26):
fines in the last couple of years.
Now, a lot of these issues didn't happen
just like now or last year.
They happened several years ago.
And they think, oh, you know, we're clear
and no one's going to touch us.
Well, sorry to say that.
It is time, Google, for you possibly to
pay the piper.
Now, when we talk about other things, what
are some of the things that the judge

(05:46):
is looking to do?
Well, the judge wants Google to do many
things in this court case today, which is May 2nd.
And so, again, they're putting this case together,
which they discussed six days ago, to talk

(06:08):
about possible remedies.
And again, the US Justice Department and a
group of states might ask the court to
impose a lot of remedies on Google, the
unit of parent company Alphabet that they
claim illegally dominated two markets for online
advertising technology.

(06:29):
And I think a lot of people are
annoyed and very upset with what they did.
And so, you know, many of these companies
haven't really gotten burnt too much.
They've gotten, you know, they've gotten themselves a
little wet, and they've had to pay some
money.
But now we're talking about something serious, right?
So the question you might be asking is,

(06:51):
what does the judge and the state want
Google to do to remedy this issue?
You know, that's really a good question.
Well, one of the things they want them
to do is to sell its popular Chrome
web browser as one remedy to restore competition

(07:13):
in search.
Google has denied the government's claims in both
antitrust lawsuits, and Google has not offered Chrome
for sale.
But remember, ladies and gentlemen, it's not whether
Google wants to do this or not.
If it becomes a court order, Google's going
to have to do exactly what they are
told, or they're going to have to pay
a lot more money.

(07:34):
They're probably still going to have to pay
some money already.
It was nearly a year ago that US
District Judge Amit Mehta ruled that Google had
acted illegally to maintain a monopoly on the
search engine market.
And it was a decision that sent shockwaves
through the entire Silicon Valley kingdom and Washington.

(07:54):
And now lawyers for Google and the Justice
Department are facing off once again.
And so they're trying to figure out what
is the right way.
They have looked into lots of things.
Remember what happened when we talked about the
issue with them trying to prevent DuckDuckGo from

(08:15):
trying to get a little bit of an edge?
And I think what's really going to happen
here is the truth is going to come
out.
I mean, Google's been doing a really great
job of hiding things.
Their customer service literally is non-existent.
But we're going to hold their feet to
the fire, let you know exactly what's
happening, and report back as soon as

(08:37):
we know.
All right, guys?
Number two, Trump's voters are facing the tariff
challenges but remain hopeful for long-term economic
benefits.
Now, despite rising costs due to tariffs, many
Trump supporters remain optimistic, which is wonderful.
They see the short-term sacrifices as necessary

(09:00):
steps toward economic independence and a thing that's
necessary to revitalize domestic manufacturing.
It's a revealing look at how political loyalty
and economic hardship often walk hand in hand.
So the question is, where do we stand
with the tariffs in the US?

(09:25):
And let's even say China because everybody's concerned
about that, right?
So, you know, President Trump has basically put
a pause on the tariffs.
That's all the tariffs except the 10%
tariff, which they are not pausing that.

(09:47):
And so he had said back around April
9th to pause the reciprocal tariffs for 90
days, and carved out a range of consumer
and other China-made electronics on April 11th.
So the reciprocal tariff was the fact that,
you know, they're going to charge one thing,
charge the other.
So here's the thing with this.

(10:08):
If, let's say, we charge one amount of
money and now the other country doesn't charge,
Trump is happy to, like, release that, right?
But if we find out that they are
charging us and we haven't been charging them,
then we're going to start charging them again.
So it's US versus the world, right?

(10:30):
You got the blanket 10% tariffs on
all imports.
25% on steel, aluminum, autos, and
some auto parts that are imported to the
United States.
It was about 145%, but this number keeps
changing, on China imports to the United
States.
A 20% fentanyl penalty, plus he raised an
initial 104% tariff to 125% on

(10:54):
April 9th, as we remember, after China levied
the retaliatory tariffs.
Trump issued the ruling exempting China-made smartphones,
computers, and semiconductors from reciprocal US tariffs, and
those goods remain under the standing 20%
fentanyl-penalty tax on China-made

(11:15):
products.
And that was April 11th.
And so China versus the US, they raised
the retaliatory 84% tariffs to 125%.
Again, they are still in effect, and everyone
keeps asking, are the tariffs still in effect?

(11:37):
And they most certainly are.
They're going to stay intact for a while.
On many imports, though some have been paused,
as we said, or delayed, the 10%
baseline tariff does remain in place for all
countries, and some goods from Canada and Mexico
are still subject to tariffs, despite being excluded

(11:59):
from a 10% tariff.
Higher country-specific tariffs were scheduled to take
effect on April 9th, but again, were paused.
So we have to wait and see what's
going to happen.
Additionally, the 25% sector-specific tariffs remain
in effect, and the de minimis exemptions for certain
low-value imports from China and Hong Kong
will be revoked starting May 2nd, today.

(12:24):
But the question is, this is all part
of a bargaining thing, right?
This is all part of Trump's negotiations.
So we're going to have to just see
what it's going to do.
Will it be the best for our country?
I don't know, but I definitely think they're
working very hard at it.
Number three, guys, the European Union fines Apple
and Meta, once again, $800 million under the

(12:48):
Digital Markets Act.
So the European Union isn't playing around anymore.
Well, they never were in the beginning, but
now they're really getting serious.
I guess you'll say that they're sharpening their
pencils pretty well now.
Apple and Meta are hit with a massive
fine, totaling $800 million under the new Digital
Markets Act, and targeting anti-competitive practices.

(13:10):
And this enforcement sends a loud message, hopefully
to all big tech companies, that they must
play fair or they're going to pay the
piper big time.
See, that's not a maybe, that's a guarantee.
So I think the reason that this is
being done, and they're using Apple, and they're

(13:32):
using Meta, other companies have done things wrong
too.
They know that they hit these two companies
below the belt, right?
They know that these companies are going to
feel it, and it's going to affect the
whole entire production and economy in the United
States.
So they were very smart about this.
And so other people are going to get

(13:53):
scared, and they'll be like, you know what?
We better follow this, or we're going to
be in trouble.
Now, ladies and gentlemen, nearly half of the
United States teens say that social media harms
youth mental health, per a Pew report.
Now, according to Pew Research, nearly half of
the United States teens believe social media negatively
affected their mental well-being.

(14:15):
Constant comparison, online bullying, and addictive algorithms are
creating a mental health crisis.
One click at a time.
It's a wake-up call for parents, platforms,
and policymakers.
What happened with Roblox?
You remember what happened there when we talked
to the CEO, and he was saying first

(14:37):
that if anything is not to your liking,
in so many words, he said a little
more abruptly than I'm saying it, if you
have a problem with this, then take your
kids off the platform.
That's not a very wise thing for a
CEO to tell people to not do business
with them.
But he says, if you're concerned, take them
off.

(14:57):
Well, it must have really got under his
skin, because within a few weeks, what did
he do?
He came out with some new enhanced security
features.
And these security features are now going to
be able to change the way people are
actually doing things.
And I think that's a very interesting thing.

(15:22):
This interesting thing is going to allow people
to see that they have a lot of
potential, okay?
A lot, a lot of potential.
And you might be saying to me, hey,
John, this is kind of crazy.
Well, it might seem crazy to you, all
right?
It might seem crazy to you.

(15:42):
But here is the bottom line, folks.
It is all about what's going to go
on with our world and our country.
And the fact that the CEO of Roblox
decided to spend their money and resources to
put in these features, he's obviously concerned about
losing the kids from playing his platform.

(16:07):
But here's the thing that kicks me, all
right?
Roblox is not profitable.
Roblox remains unprofitable.
The company posted a consolidated net loss of
$221.1 million in Q4.
An improvement from the $325.3 million loss in

(16:27):
Q4 2023.
So I guess you'd say they're getting more
profitable because they're starting to have smaller losses.
I guess that's a good way to look
at it.
For the full year, the net loss stood
at $940.6 million compared to $1.16 billion
in the fiscal year 2023.
So if we look at this kind of
roadmap and how they're operating right now, I

(16:51):
mean, it's not hard to figure this out,
right?
But if we take the 325.3 and
we take away the 221.1, we're at
104, basically, difference, right?
So what does that mean?
Well, we went down 104, basically, to where

(17:17):
we are now.
That's a huge, huge drop.
So if we take 221, roughly, minus the
104, right?
We probably would need another two years, based
on some rough projections, but how they've been
working, probably another two to two and a

(17:38):
half years until they're going to break even.
That means if they keep doing what they're
doing by the third year, so that could
be 26, 27, by 2028, they may start
bringing in a little bit of a profit.
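That rough break-even math can be sketched in a few lines. The straight-line, loss-shrinks-at-the-same-pace assumption here is mine, a back-of-the-envelope illustration rather than any real financial forecast.

```python
# Q4 net losses mentioned above, in millions of USD.
q4_2023_loss = 325.3
q4_2024_loss = 221.1

# Year-over-year improvement: roughly 104 million less loss.
yearly_improvement = q4_2023_loss - q4_2024_loss

# If losses keep shrinking at the same pace (a big "if"),
# years until the quarterly loss reaches zero:
years_to_break_even = q4_2024_loss / yearly_improvement

print(f"Improvement: about {yearly_improvement:.1f}M per year")
print(f"Break-even in roughly {years_to_break_even:.1f} more years")
```

That lands at roughly 2.1 years, in line with the two to two and a half year guess above.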
But the real bottom line, ladies and gentlemen,
is social media is damaging a lot of

(18:01):
kids, not just because of social media, but
because what they're doing is they're looking at
social media and they're getting self-gratification.
And so if they go on social media
and they don't get the likes, it's like
they get this complex, that they're inferior, all
because somebody didn't give them a like or
message them.

(18:21):
And this is causing a negative effect on
dopamine, a reward chemical in the brain and
the body.
So I think it's okay to use social
media, but I don't think it's okay to
make it like your be-all end-all,
because if you do, you could get very,
very sick, like a lot of these teens
are getting.
And I don't think people realized what was
really happening here.

(18:43):
One click, one click is all it takes,
right?
Is one click causing addiction?
No.
So I mean, that's a very good question.
And how many clicks does it take to
get addicted to social media?

(19:05):
Well, studies have shown that social media has
a powerful effect, as we said, on the
brain, and it can create a stimulating effect
similar to an addiction, okay, like we learned
with the teens.
But when we think about how many hours
is considered addiction, well, there's no real set

(19:27):
time that indicates addiction.
Experts are starting to agree, though, that over
three hours a day is considered heavy use.
So if you're going on or off your
social media, like maybe you spend five minutes
here, five minutes there, five minutes there, that's
not using social media for three hours, all
right?
People that are glued to their phones, they
scroll, scroll, scroll, scroll, scroll, like they're the

(19:50):
kind of people that there could be a
fire happening.
And they're like, oh, what's going on?
There's a fire.
Oh, okay, let me just finish this post.
Well, the place is on fire.
What are you doing, right?
So it's like they're in their own world.
And that's a big problem, okay?

(20:12):
Approximately half of adult Generation Z users
exhibit signs of social media addiction.
47% report using social media two to four
hours daily, while a notable share spend at least
four hours on these platforms each day.
So the question you might be asking, and
it's a very good question, which generation is

(20:35):
most addicted to social media?
Well, it's gotta be Generation Z.
So those are the ones born from the
mid-1990s to around 2010.
They're most susceptible because the addiction

(20:56):
is due to their earlier and constant exposure
to digital technology.
They spend more time online than other generations
do.
And so with a significant percentage reporting daily
usage of two to four hours, which is an
approximate, but don't go by that exactly, studies
show that a high percentage of Gen Z
adults acknowledge their dependency on social media platforms.

(21:21):
But the other generations, right?
That's gonna be different because maybe they're using
it differently, right?
And I think this really comes down to
the amount of time that you were introduced.
I mean, I was always around computers and
technology, but do I feel that I've gotta
be on social media every second?

(21:43):
No.
Do I check it in the morning?
Do I check it in the afternoon, evening?
Yeah, I do.
It's for business posts, but I check it
different times during the day.
But I'm never on social media, like just
to give you today, for example, I was
on social media this morning, literally for about
maybe three minutes, five minutes to post some
things.

(22:03):
And then maybe I check social media later
in the day.
So I might check it five or six
times.
So if I add all the times I
check it, right?
So this is probably a very good thing
to understand.
If you check social media 10 times a
day quickly, how much time is that?

(22:24):
Well, if you're equating it to be under
a few minutes, right?
That means I'm probably spending anywhere from 30
minutes to maybe a little over an hour.
And I think that's an important thing to
understand.
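To put numbers on that quick-check habit, here's a tiny sketch; the three to seven minutes per check is my own assumption for what a "quick" check means.

```python
# Estimate total daily social media time from short, frequent checks.
checks_per_day = 10
minutes_low, minutes_high = 3, 7  # assumed length of one quick check

total_low = checks_per_day * minutes_low    # lower bound in minutes
total_high = checks_per_day * minutes_high  # upper bound in minutes

print(f"Roughly {total_low} to {total_high} minutes per day")

# Compare against the ~3 hours/day that experts call heavy use.
heavy_use_threshold = 3 * 60
print("Heavy use?", total_high >= heavy_use_threshold)
```

So ten quick checks land around 30 minutes to a little over an hour, well under the heavy-use line.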
It's not bad to use it.
It's bad to realize that social media is

(22:47):
something you have to have, right?
When it replaces like your need or want
to be able to go outside to take
a breath of fresh air, to go walk,
to maybe go do a puzzle with a
friend or a neighbor or family member.
When it replaces that, that's when social media

(23:08):
is a problem.
All right, I hope that makes some sense.
And I know a lot of you are
saying to me, John, well, I don't have
a problem, and neither does my son or
daughter.
Okay, so if they don't have a problem,
I want to ask you something.
If you were to take away their smartphone
right now, okay?
Just say, hey, I need your cell phone.

(23:30):
And they're going to give you a hard
time.
And now you keep that cell phone for,
let's say 10 minutes, not a big deal.
But let's say you kept that cell phone
for, I don't know, two hours.
They get a little more cranky, right?
What if you kept it for half a
day?
What if you kept that cell phone away
from them for an entire day?

(23:51):
They're going to be like barely able to
breathe.
And I don't mean breathe.
I mean, literally like what am I going
to do?
Like I can't connect to the world, right?
Go outside.
Go talk to people in person.
Number five, Compass sues Seattle's MLS over anti

(24:12):
-competitive real estate listing policies.
Compass is now suing the Seattle Multiple Listing
Service, claiming its practices are unfairly restricting competition
in the real estate space.
As digital transformation reshapes home buying, the legal

(24:32):
friction highlights growing pains between innovation and entrenched
systems.
Now, I thought I should mention something to
you because I always want to be fair,
and I try to get the news and
get everything very unbiased.
But what I wanted to let you know
is that a couple people had reached out
to me and said that they were doing
some things that were not 100% correct.

(24:54):
Now, he wasn't getting into what those things
were, but all he was trying to say is
that because of their political power and what
they're doing, he was trying to throw them
under the bus for taking a certain
political stance.
Is that true or not?
I don't know.
So sometimes with these companies, their political stance
becomes hidden because if people know it, they

(25:16):
don't want to do business with them anymore.
But I find it hard to believe that a major
company is going to sue another company.
I just can't fathom that.
It would be childish for them to do
that if it wasn't really truthful.

(25:36):
I think that's important to know.
Number six, Trump targets efforts to reduce bias
in the AI world, labeling them as, quote
unquote, the woke AI.
So Donald Trump is taking aim at efforts
to make AI more inclusive, branding them as,
quote unquote, woke AI.
And his criticism taps into larger political debates

(25:57):
over whether fairness and diversity in AI development
are progress or political correctness.
It's the culture war that we're in right
now with all the algorithms.
And do we blame it on the fact
that the computer can't just analyze it?
Or do we have to just take full
responsibility and accountability that we shouldn't discriminate based on

(26:19):
race, religion, sexual orientation, creed, color, or political party?
Should we just take that onus and say,
look, I know the AI system is doing
this.
But hey, I'm the programmer.
And I'm the one that created this.
And I'm responsible for ultimately what happens.
But I don't think a lot of people
are going to do that.
So we'll keep an eye on that.
But I definitely think that a lot of

(26:39):
AI systems out there are starting to profile
people wrongly because maybe their skin color
is darker or maybe there's a nationality difference
from a different country or something like that.
That's causing an issue.
So we'll keep an eye on it.
But I do think there's some parts of

(27:00):
AI that need some tweaking before it gets
used in what I call full, let's say,
a full production environment or a full world
where it's going to be able to make
the interactions with people.
And number seven, ladies and gentlemen, scientists recently

(27:21):
unveiled a compressed-air-powered propulsion system for ferry
boats.
This is pretty cool.
So a team of scientists has debuted an
innovative ferry boat propulsion system powered by compressed
air, offering a greener alternative to diesel engines.
This sustainable leap could chart a new course

(27:42):
for maritime transport and emissions reduction on waterways.
Now, I have to tell you, when I
had posted these, because we do the reels
with the shorts, I had a lot of
people attacking like, oh, it has to have
power and all that.
I'm glad that you have a stance.
And I'm glad that you're commenting back.
But if you're going to comment back and

(28:04):
you're going to challenge something, at least have
something to stand on.
Don't just give me something that's political.
Like, give me some facts.
Does that make sense?
All right.
Number eight, a new speech filtering tech protects
privacy in smart devices while preserving function.

(28:26):
New technology is bringing privacy back to smart
homes.
Now, a speech filtering system can now block
sensitive audio before it reaches cloud servers, meaning
your smart speakers don't need to eavesdrop to
be useful.
It's a step towards smarter, safer living rooms.
But my question is, with all this, is
the technology 100% trustworthy?

(28:48):
Or does it work sometimes and not others?
Or does it ingest certain voices and ignore
others?
Like, is there 100% truth to this?
I think we got to put a lot
of R&D into this before we can
say, hey, this works.
That's just being my truthfulness there.
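Just as a conceptual illustration of that speech-filtering idea (the keyword list and function below are my own hypothetical sketch, not the actual system from this story), the core concept is deciding on-device whether audio ever reaches the cloud:

```python
# Hypothetical on-device gate: sensitive phrases never leave the house.
SENSITIVE_TERMS = {"password", "social security", "credit card"}

def should_upload(local_transcript: str) -> bool:
    """Return True only if the locally transcribed audio looks safe to send."""
    text = local_transcript.lower()
    return not any(term in text for term in SENSITIVE_TERMS)

print(should_upload("turn on the kitchen lights"))    # ordinary command
print(should_upload("my credit card number is ..."))  # blocked locally
```

A real system would filter raw audio with on-device models rather than simple keyword matching, but the privacy principle is the same: screen first, upload second.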
Number nine, guys, a Dark Souls game is

(29:11):
shown to support players dealing with depression and
mental health issues.
It's a surprise.
Yeah.
The notoriously difficult game Dark Souls is helping
players cope with depression.
Its theme of perseverance, resilience, and community support
are resonating with a lot of users that
are facing emotional challenges in their lives.

(29:33):
Who knew that such a brutal, deadly
kind of game could offer mental strength?
I'm happy to see that there is a
positive or a silver lining to that because
I don't see that in a lot of games.
Now they rate games for a lot of
different things.
You might say, John, so how do they

(29:54):
rate, basically, video games?
Well, you have ratings like E, T, M, et cetera, right?
But they have a whole format.
They rate them based on their content.
They use judging similar to motion picture rating
systems used in many countries.
They use a combination of six age-based

(30:17):
levels intended to aid consumers in determining a
game's content and suitability along with a system
of content descriptions.
So I think that's an important thing to
understand.
So what are the levels that games are
rated?
So when we think about a game, like

(30:39):
I said, there is basically what they call
the PEGI 7 rating on video games.
So very mild forms of violence, and implied,
non-detailed, or non-realistic violence, are acceptable for
a game with a PEGI 7 rating.
Video games that show violence of a slightly
more graphic nature towards fantasy characters or non

(31:02):
-realistic violence toward human-like characters would fall
into this age category.
There is something called an M-rated video
category.
That's mature.
So titles rated M have content that might
be suitable for persons ages 17 and older.
This would include categories that contain highly sensitive,

(31:24):
mature topics relating to very personal things, such
as things that would only be discussed between
two adults, usually very intense violence or strong
language.
And titles rated things like AO for adults
only have content suitable only for adults.

(31:47):
So then when we think about classification for
games, it breaks down to different things.
So they have R games.
So R games are basically adults only, but
they're not the same as AO.
So they have different levels.
So what are all the game rating levels?

(32:11):
And that's hard to understand because the ESRB
rating system is not the easiest.
So first thing they have, and just to
kind of give you a quick rundown, I
think this is important to understand for everyone
here watching, and that is they have something
called E for Everyone.
Content is generally suitable for all ages, may

(32:32):
contain minimal cartoon, fantasy, or mild violence and
infrequent use of mild language.
Then we have Everyone 10+. Content is
generally suitable for ages 10 and up, may contain
more cartoon, fantasy, or mild violence, mild language,
and/or minimal suggestive themes.
Then we have Teen. Content is generally suitable
for ages 13 and up, may contain violence,

(32:54):
suggestive themes, crude humor, minimal blood, simulated gambling,
and/or infrequent use of strong language.
Then we have M, which is the mature
17 plus, that has content generally suitable for
ages 17 up, may contain intense violence, blood
and gore.
And of course, those very sensitive adult topics
between two people and very, very strong language.

(33:17):
Then we have something called AO, adults only.
So content suitable only for adults ages 18
up, may include prolonged scenes of intense violence,
very graphical of those intense personal type situations,
content and or gambling with real currency.
Then we have an RP, which is a
rating pending, not yet assigned a final ESRB

(33:41):
rating, appears only in advertising and marketing and
promotional materials related to a physical, such as
a box, the video game that is expected
to carry an ESRB rating and should be
replaced by a games rating once it has
been assigned.
So then we have something called a rating
pending, likely mature 17 plus, not yet assigned

(34:01):
a final ESRB rating, but anticipated to be
rated mature 17 plus, appears only in advertising,
marketing and promotion materials related to a physical
box video game that is expected to carry
an ESRB rating and should be replaced by
a games rating once it has been assigned.
Well, and that is called, just in case
you wanted to know, the Entertainment Software Rating

(34:25):
Board.
And if you're wondering a little bit more
about them, they've been around for a while,
guys.
They started their journey back in 1994, where
they only had the categories early childhood, kids
to adults, teen, mature, and adults only.
So they went from basically those

(34:47):
five categories, and now we're all
the way up to where we are today
in the 2025 world, with a much larger
system, which now has seven levels.
So important to understand these different types of
contents they rank on.

(35:07):
Things like we talked about, blood, comic mischief,
drug reference, gambling, animated blood, blood and gore,
crude humor, fantasy violence, intense violence.
And they have them categorized pretty, pretty well.
So if you are trying to buy something
or make sure that your son or daughter
doesn't get into watching that content, maybe too

(35:29):
early, important to know.
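Those age levels can be boiled down to a small lookup table. This is my own illustrative sketch of the ESRB categories just discussed, not an official tool (RP is left out since a pending rating has no age yet):

```python
# Minimum ages implied by the ESRB categories discussed above.
ESRB_MIN_AGE = {
    "E": 0,      # Everyone
    "E10+": 10,  # Everyone 10 and up
    "T": 13,     # Teen
    "M": 17,     # Mature 17+
    "AO": 18,    # Adults Only 18+
}

def suitable_for(age: int, rating: str) -> bool:
    """True if a player of this age meets the rating's minimum age."""
    return age >= ESRB_MIN_AGE[rating]

print(suitable_for(12, "T"))  # a 12-year-old and a Teen-rated game
print(suitable_for(17, "M"))
```

Handy if you're deciding whether a title is age-appropriate before you buy it.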
Number 10, imagine this, robots learn by watching
how-to videos.
Well, wonder no more because it's actually happening.
So a robot that now can actually learn
by watching a how-to video.

(35:49):
Researchers have developed an AI that watches online
tutorials and mimics actions, effectively learning tasks from
videos, from folding laundry to cooking, the possibilities
for autonomous learning just got a major upgrade.
That's pretty cool, right?
I think that's really, really cool.
And number 11, guys, Princeton engineers develop a

(36:11):
shape-shifting metabot with no motors or gears.
This has kind of blown my mind a
little bit.
So engineers at Princeton introduced a metabot that
morphs its shape without any kind of motors
or gears using soft robotic principles.
This innovation could redefine robotics, especially in environments
where traditional movement isn't possible, like inside the

(36:34):
human body or a disaster zone.
But when you think about something like a
metabot, that whole concept is pretty amazing.
A metabot concept had been thought about before,
but now they've really come pretty far with
what it is and the actual live prototype

(36:57):
of it.
So definitely pretty cool.
And I see a lot of potential applications
in how this could help our world and
how it could help us discover things that
we might have not been able to reach
before.
Number 12, Marks and Spencer recently halted online
orders after a cyber attack disrupted all their

(37:20):
services.
So the big retail giant, Marks and Spencer,
was forced to pause all online orders due
to a cyber attack that crippled their systems.
It's a stark reminder that even legacy brands
must constantly evolve their cybersecurity to protect customer
trust and operations.
Now, you might be saying to me, John,

(37:41):
this is kind of crazy, but many of
you might remember companies,
let's say, like Kmart, right?
If Kmart, with their blue light specials, was
around today and they did not keep updating

(38:02):
their SOP, the standard operating procedures for their
security, for all their different policies, they would
be out of business anyway because of that.
So whether you're in business one year, 32
years, I never take for granted our security.
I always keep checking it.
I'm always making sure it's on par, because

(38:24):
you could miss something.
So it's always good to have multiple levels
of security.
That's why we have them on the firewall.
That's why we have them on the endpoint
devices.
And it's important to check your logs and
have emails set and have triggers set up
if you want certain things to happen.
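Setting up those triggers can be as simple as scanning the log for patterns you care about. Here's a minimal sketch in Python; the log lines and the "failed login" pattern are made up for illustration, and a real setup would hand the matches to an email or alerting hook.

```python
import re

def scan_log(lines, patterns):
    """Return the log lines matching any alert pattern."""
    compiled = [re.compile(p) for p in patterns]
    return [line for line in lines if any(c.search(line) for c in compiled)]

# Hypothetical log excerpt -- in practice you'd read this from a file.
log = [
    "2025-05-02 10:01 INFO  user alice logged in",
    "2025-05-02 10:03 WARN  failed login for user admin",
    "2025-05-02 10:04 WARN  failed login for user admin",
]

alerts = scan_log(log, [r"failed login"])
print(len(alerts), "lines would trigger an alert")
```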
Number 13, guys, the AlcoWatch smartwatch app.

(38:46):
Catch this.
Accurately tracks alcohol consumption habits.
So the AlcoWatch is redefining personal health monitoring
by enabling users to track alcohol consumption right
from their smartwatch.
How cool is that?
Using biosensors, the app helps users make more
informed decisions, a tool that could eventually be

(39:10):
a game changer for social drinkers and those
in recovery alike.
So that's another cool thing.
And I think having these tools is definitely
a really, really cool thing.
But I think they have to make sure

(39:32):
that these devices are going to work properly
and they're not just going to work like
some time, right?
I think that's a really, really big issue.
So you have to realize that intermittent failure
is one of the hardest things to solve
because it might

(39:53):
be erring on different inputs.
So we have to test different cases, different
input values.
But you can't test every input value.
That would take forever.
So you have to try to test enough
of them.
And by using AI, we can definitely simulate
that in a much faster time.
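That testing idea can be made concrete. Since you can't try every possible input, you sample many random ones and check a property that must hold for all of them. The `estimate_bac` function below is a hypothetical stand-in, not the real AlcoWatch model; the point is the sampling loop.

```python
import random

def estimate_bac(sensor_reading):
    # Hypothetical biosensor model: scale the raw reading and
    # clamp the result into a plausible 0.0-0.5 BAC range.
    return max(0.0, min(0.5, sensor_reading * 0.01))

random.seed(7)  # reproducible sampling
# Exhaustively testing every input is impossible, so sample widely
# and assert a property that must hold for ANY input.
for _ in range(10_000):
    reading = random.uniform(-1_000.0, 1_000.0)
    bac = estimate_bac(reading)
    assert 0.0 <= bac <= 0.5, f"out-of-range output for {reading}"

print("all sampled inputs produced in-range outputs")
```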

(40:15):
Well, number 14, and our last one for
tonight, gets me with a really interesting one.
Prince Harry and Meghan advocate for stronger social
media protections for children.
So Prince Harry and Meghan are doubling down
on the fight for safer social media.

(40:36):
They've joined experts in demanding stricter rules to
protect children from online harm, emphasizing that tech
companies must be held responsible for youth safety
in digital spaces.
So I think it's good that a lot
of these companies now are starting to take
it seriously, that we need to have ways
to make sure that if kids or people

(40:59):
try to get past these things, that we
don't allow them through.
I mean, that doesn't mean it's right, but
I still think it is the software developers'
responsibility to make sure that these kids don't
get into the software, regardless of whether they
don't or do.
It's kind of like saying, you know, you're
not supposed to cross on, you know, when

(41:22):
the light is red, right?
When your light is red on the side,
you're not supposed to cross the street.
That's called jaywalking, right?
When the car's going one way, but people
do it.
And if you hit that person, guess what?
We can't say, well, you know, the person
was crossing the street and I'm not responsible
because they were crossing the street.
So, you know, my car was moving.

(41:43):
No, that's not going to hold up.
You need to be in control and responsible
for your vehicle at all times, regardless of
somebody, you know, jets into the street.
So that means that you need to know
that you have to have your car under
control at all times.
I don't care whether it's rain, whether it's

(42:03):
snow, you have to know a couple of
things.
You have to know how you can get
out of a lane, and you have to
keep yourself vigilant.
You have to make sure that you know
what your stopping distance is.
One of the things I like to use
is smart cruise control because when you're on
a major highway, the cruise control, basically the

(42:26):
smart cruise control automatically will adjust by braking
and accelerating to maintain the speed you've set.
And now a lot of them, like my
car has three levels.
They have what they call far, which is
the one I use.
They have one that's called mid, which is
basically mid distance.
And then they have one that's close, which

(42:48):
is right there.
Now, the thing is, even though they're all
kind of calibrated, if something went wrong with
close, there was an issue.
I mean, there could be a glitch.
I could hit the car.
It's not likely, but why risk that?
And I think understanding, knowing what your stopping
distance is, testing to make sure you can

(43:08):
stop is a very, very good thing.
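A stopping distance is roughly reaction distance plus braking distance, d = v·t + v²/(2a). The sketch below assumes a 1.5-second reaction time and 7 m/s² of braking deceleration; both are illustrative round numbers that vary with the driver, the car, and the road.

```python
def stopping_distance(speed_kmh, reaction_s=1.5, decel_ms2=7.0):
    """Rough total stopping distance in meters."""
    v = speed_kmh / 3.6                  # km/h -> m/s
    reaction = v * reaction_s            # ground covered before braking starts
    braking = v ** 2 / (2 * decel_ms2)   # kinematics: v^2 / (2a)
    return reaction + braking

for kmh in (50, 100):
    print(f"{kmh} km/h -> about {stopping_distance(kmh):.0f} m to stop")
```

Note that doubling your speed more than doubles the total distance, which is why the "far" following setting is the safer default.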
And, you know, just like these things, we
have to just come down to a lot
of things.
And that is accountability.
So when somebody signs in and a kid
is a minor, they're technically not responsible, are
they?

(43:29):
But then who is?
The parents are.
So then technically it's the parent's job to
make sure that their son or daughter or
relative is not getting into trouble.
We've had systems out for years now that
will lock down the firewall and do content
restrictions so they don't get into adult sites.
We've had that for a while.

(43:51):
We can now block chats and all kinds
of things.
This never used to be here for firewalls
at home.
We've had it in businesses for a while.
And then we got more granular.
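Under the hood, a lot of that granular blocking is just a lookup: is this hostname, or any parent domain of it, on a blocked list? A minimal sketch, with a made-up blocklist:

```python
# Hypothetical blocklist -- real filters ship categorized lists of domains.
BLOCKLIST = {"adult-site.example", "chat.example"}

def allowed(hostname):
    """Allow a host unless it, or any parent domain, is on the blocklist."""
    parts = hostname.lower().split(".")
    # Check "room7.chat.example", then "chat.example", then "example".
    return not any(".".join(parts[i:]) in BLOCKLIST for i in range(len(parts)))

print(allowed("news.example.org"))    # not listed
print(allowed("chat.example"))        # listed outright
print(allowed("room7.chat.example"))  # parent domain is listed
```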
But I think it's important to understand that
the adult needs to educate the son or
daughter about what they're doing.
And why is it potentially harmful?

(44:13):
Because they don't know.
They don't really know.
They think it's not a problem.
We've talked about the teddy bear example before,
right?
Not to bore you with it, but one
of the parents or relatives gets this teddy.
I'll call him Teddy Ruxpin.
I think that was one of the ones.
So Teddy Ruxpin gets sent and the grandfather

(44:37):
or parent gives it to the grandchild.
So Teddy Ruxpin is this really cool, let's
say, teddy bear that can be your friend.
It can teach you things, math.
It can help you with different questions, things
like that.
And up until a while, the system was

(44:59):
basically controlled by a limited set of voice,
let's say, commands that had a lot of
information, but not everything.
So there was a limitation.
Then what they did is they rolled this
out so that the engine, the brains of

(45:20):
the Teddy Ruxpin actually lived on the cloud,
the internet.
When they did this, they left the same
security system that was in there, which was
a very weak pin.
I think it was maybe a three or
four digit pin.
And maybe you guys don't know this, and
I'm gonna give you a secret here today,
which I think is gonna be pretty eye-opening.

(45:41):
How long would it take someone to hack
a four digit pin?
How long?
Well, a three digit pin can usually be
hacked in about 15 minutes.

(46:03):
Does that give you some kind of understanding
what I mean by that?
Okay, 15 minutes.
So how long does it take to hack
a four digit pin, okay?
So if you think about a pin, and
I'm thinking that the pin is just a

(46:24):
four digit pin.
Let's just say it could be numbers, it
could be letters.
You know, how much time would it take
to guess a four digit pin?
So you use something called brute force.

(46:45):
There's lots of things you can use, but
not to get into that whole analogy right
now.
If we're just doing numbers, wouldn't take very
long.
But if we did alphanumerics, right?
So the combination for that is kind of
interesting, right?

(47:07):
Assuming the alphabet was just 10 characters, the
digits, but if there were more possible characters,
then it'd take a lot longer, right?
We know most ATM machines, they don't do
alpha, right?
They only do just the numbers.
So a three digit combination can be cracked
in 15 minutes.

(47:28):
How long will it take to crack a
four digit pin?
Well, the catch is it's not just double,
okay?
Adding a digit multiplies the combinations by ten.
So in order to crack it, okay, to

(47:54):
do this by the easiest way, if we
look at that time, we go from 1,000
possible numeric pins to 10,000.
How much longer would it take?
Well, let's just do the math on this.
It would take ten times 15.

(48:17):
So if we think about that, it would
actually take ten times longer than the
three digit pin, which could be cracked
in only 15 minutes.
In that case, okay, you're basically saying it
will take ten times longer to crack a

(48:38):
four digit pin.
It will actually take about 150 minutes.
That's quite a bit of time, right?
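The arithmetic is worth pinning down. A back-of-the-envelope sketch: a numeric pin has 10^length combinations, so going from three digits to four multiplies the search space, and the worst-case crack time, by ten. Calibrating to the 15-minute figure for a three digit pin:

```python
def worst_case_minutes(alphabet_size, length, guesses_per_minute):
    """Brute force worst case: every combination tried once."""
    return alphabet_size ** length / guesses_per_minute

# Guessing rate implied by "a 3-digit numeric pin falls in about 15 minutes".
rate = 10 ** 3 / 15

print(worst_case_minutes(10, 4, rate))   # 4-digit numeric: ten times longer
print(worst_case_minutes(36, 4, rate))   # 4 chars from digits+letters: far worse
```

The second line shows why even a short alphanumeric pin is dramatically stronger than a numeric one: the alphabet size is raised to the length, so it compounds fast.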
So what happened was they had this very
weak pin, four digits, and it was hacked.
And that pin was the only thing protecting
that teddy bear, okay, from communicating to another

(49:01):
device.
So what would happen is you'd pair it
to your iPhone or your device using Bluetooth.
You connect it over Bluetooth or the internet,
you put the pin in, and that's how
you could get in.
You could manage, you could change it.
Well, they were able to hack it, and
now you know what happened?
The people, the bad actors were sending codes
to this, like they were sending files to

(49:23):
it.
And in some regards, what they were doing
is they were literally coming over it live
like a broadcast.
That's pretty scary, right?
So now you have this kid that basically
has been befriending this, let's say, Teddy Ruxpin
for a long time, for, I don't know,

(49:44):
it's been like a couple of weeks.
But to a child, that's going to be
like his whole lifetime.
And so now all of a sudden, he's
playing with the Teddy Ruxpin, and after the
bear gets, you know, very acquainted, suddenly the
bad actor shows up.
And it can access everything from the camera

(50:05):
to a microphone, everything.
And it's, oh, what is your name?
Oh, Michael.
Oh, hi, Michael.
Are your parents around?
No, they're at work.
Oh, what time do they come home?
They don't come home till after five, and
that's like three hours from now.

(50:26):
Oh, okay.
And then you're talking, so you know what
my name is?
You're Teddy.
Yeah, I'm Teddy.
But you know, I'm actually Joe.
Joe.
And so when it first comes off, it's
like, well, you know, I'm a friend of
Teddy's.
And Teddy wanted me to reach out to
you.
And I was hoping maybe we could meet

(50:46):
and have ice cream sometime.
Would that be nice?
So now what happens is the child believes
that this is a friend of Teddy.
And because he or she has put so
much trust into the doll, they actually go
meet them.
He says, don't tell anybody because, you know,
we got to keep this our secret.
We can't let anybody know that you're coming
to meet me because, you know, Teddy and

(51:08):
I like to keep this stuff a secret.
That's a problem.
And so now if the child doesn't say
anything and isn't schooled or, let's say, educated
by parents or by schools about this type
of thing, and now they go meet after
school, okay?
And now they go get ice cream.

(51:30):
Something very bad could happen.
They could be using the kid to exploit,
to get to their house, to possibly plan
a robbery so they could learn when they're
not there.
So they could break in and steal things
like when they're going away.
It could be something like that.
It could be something a lot more dangerous.
The bad actor could do something harmful, like

(51:51):
abuse the child.
The other thing that could happen is the
person could grab their information so later they
could steal their identity in life when they
do become somebody that has a value.
So these are important things.
And it's everybody's responsibility.

(52:12):
It's the parent's responsibility.
It's the software company's responsibility.
It's everybody's responsibility.
And Ruxpin, they did address it, okay?
But unfortunately, several kids didn't have nice stories
to share.
So if you're watching this tonight or listening

(52:34):
to this and you have a son or
daughter and maybe you've given them any kind
of technology, it could be a phone, it
could be a Teddy Ruxpin, it could be
anything that connects to the internet, okay?
And so you make sure you educate them
and let them know that this device is
fine.
But, you know, there's always the potential that,

(52:55):
and maybe you say something like, you know,
if a stranger ever comes to you and
says, you know, you know, would you get
in my car?
What do you always say?
No.
Well, you have to take that training a
little further and say, okay, not only we've
talked about the car, but now if your
teddy bear ever asks you to do something
you're not comfortable with, what do you do?
Well, of course they do.
Well, no, no, no, it might not be
the teddy.

(53:15):
It might be somebody coming to try to
impersonate through the teddy.
You would say no, right?
And they're like, I don't know.
So the reason I'm sharing this with you
is that these are real live things.
All right.
Very, very live things.
And they can be scary.
But if we educate people about them, then

(53:35):
we can protect the world at large.
So this week's Jay Moore Tech Talk show
was a real showstopper.
We bridged the world of innovation, advocacy, accountability,
security, minors, games, from the legal heat on

(53:56):
Google and Meta to shape-shifting bots and smarter
AI and how to protect yourself from technology
that could go awry from bad actors.
The future of tech is rapidly emerging and
taking shape, but not without raising some critical
questions about ethics, privacy, safety, and of course,

(54:17):
ladies and gentlemen, accountability.
So I know today was a very interesting
episode.
They're always interesting.
But today's episode, I think, was very important
because it addressed some issues that don't come
up every day.
You don't think that that toy that you

(54:39):
bought that has connection to the internet is
suddenly going to be a big problem.
The, let's say, piece that allows Pandora's box
to open and allows a bad actor to
get into your son or daughter's life or
your whole family's life and potentially cause some

(55:01):
fear, but maybe it'll cause some pain that
could be physical or mental.
And we just want to make sure that
we alleviate all that.
So we talked about a lot of stuff.
We're definitely going to keep you up to
date with what's going on with Google because
I honestly believe they're guilty.

(55:22):
I mean, that's just all the stuff I've
seen.
I'm not a judge, but just reading and
the issues that I went through myself with
Google, I mean, you just don't act like
that as a company, right?
I mean, you just don't act like that,
right?
So I hope that not only will you
take this stuff seriously that we talked about

(55:43):
tonight, we talked about something that was really
serious, the game rating system, and it's called
the ESRB.
You can go, by the way, to ESRB.org.
We didn't talk about this as one of
the reels this week, but I still think
it's very important that you understand what the
Entertainment Software Rating Board is and how it

(56:05):
works and how they're on many different games
that you'll be buying for your son or
daughter, et cetera.
It could be things like Grand Theft Auto.
It could be a Teenage Mutant Ninja Turtles.
It could be Assassin's Creed, Elder Scrolls, and
so many other things because when you're buying
one of these things, right, you don't really

(56:26):
know if they're a problem or not.
You don't want to impose a problem, but
how did you know that this game was
suddenly going to be something that brought in
discussions at an age that, let's say, that
grandchild or that son or daughter is not
prepared to handle yet, but it's never too

(56:47):
early to have a conversation about being safe.
Of course, there's the stranger danger conversation, right?
But how do you get them to understand
that this cute, lovable toy is now their
enemy?
And it's not the toy that's their enemy.

(57:08):
It's the fact that a bad actor got
behind it and started controlling it, right?
And I think that's the interesting thing.
I mean, you can think of it like
this.
You can think of it like a costume
that maybe one of their favorite characters wears,
and that costume is fine, but what happens

(57:29):
if that costume is worn by a bad
actor?
Even though everything is so great and lovable
about that thing, there's a bad actor in
that costume, let's say.
And that bad actor is now not portraying
what the character is supposed to be portraying.
And I think that's a very important thing
to think about.
So as we get ready to wrap up

(57:49):
tonight's show, I have really enjoyed being with
you as we're almost wrapping up this little
less-than-an-hour show.
We do our shows just under 59 minutes.
If there is a topic that you would
like me to cover or if you have
a question about this episode or any other
episode, please, ladies and gentlemen, reach out to
me at BelieveMeAchieved.com.

(58:10):
And I'm going to catch you guys real
soon, all right?
Remember, safety is not a joke, and let's
all be accountable.
Take care, everyone, and be well.