Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:08):
Hi everyone, I'm John C. Morley, the host of
The JMOR Tech Talk show and Inspirations
for Your Life.
Well, hey
(00:51):
guys, it is John C. Morley here, serial
entrepreneur, and of course, you know the host
of The JMOR Tech Talk show.
It is great to be with you today,
Friday, we're getting the show out a little
bit earlier today.
It's September 12th, I do want to take
this day to have us just think back
(01:14):
and remember all the people that lost their
lives in 2001.
That was yesterday's date, quite a few years ago, 2001, over 20-some years ago, 9/11, the towers, and also we want to
remember Charlie Kirk, you know, who recently got
(01:35):
shot and at a very young age, only
31 years old, thankfully, it looks like the
FBI does have the person who shot him
in custody, so hopefully that is correct.
And again, it's great to be with you
on The JMOR Tech Talk show.
(01:55):
If you notice, we had a little bit
of a kerfunkle, I should say, a little
interesting word there.
So I'm going to be doing a review
on the Lenovo P1 Gen.
Well, first of all, it is terrible.
I'm going to give you a full review
on that before I send it back.
It crashed streams, it had latency up the
ying yang over 1,901 minute, we went
back to Lenovo, they just gave us a
lot of runaround, but I'm going to do
(02:15):
a whole set of reels on that laptop
because nobody should be buying the Lenovo P1
Gen.
All right, I actually have the P16,
which is a great, powerful workstation laptop, and
I have the Gen 1, I'm waiting for
the Gen 3 to come out, because the
Gen 2 is not that much different, there's
a few enhancements, but the Gen 3 has
(02:36):
Wi-Fi 7 and a few other great
things.
Hey, guys, if you're thirsty, feel free to
go grab yourself some RO water like I
have here.
Maybe you want something else to drink, maybe
you want a sandwich, slice of pizza, a
piece of fruit, maybe it's healthy, maybe it's
not, sweet or tart, that's up to you.
And if you have not been here before,
well, welcome, and if you're coming back, of
(02:57):
course, I want to say welcome back to
The JMOR Tech Talk Show.
It is so great to be with you
here, and do check out BelieveMeAchieve.com for
more of my amazing, inspiring creations after the
show, that is, or 24 hours a day.
You can check out all my long-form,
short-form content, I know you definitely will
enjoy that.
So without any further ado, guys, let's go
(03:18):
ahead and kick this show off, all right,
everyone?
Okay.
So when we think about all the things
happening, it can be a little bit crazy,
and the title today for the show is
basically a global tech quake, power politics and
AI collide.
Yes, that's a very, very interesting one, guys.
(03:40):
And in case you're wondering, this is series
four, and it's show number 37.
All right, guys, and definitely the unfiltered bytes, right? The week tech rewrote the rules, I should say.
Hey, guys, I'm John C. Morley, a serial entrepreneur, if you don't know
(04:00):
who I am, hello there.
I'm an engineer, video producer, marketing specialist, keynote
speaker, podcast host, podcast coach, graduate student, and
a passionate lifelong learner.
On this episode of The JMOR Tech
Talk Show, we're unpacking a wild week where
tech power, politics, and AI, well, they definitely
(04:24):
collided.
So dive deep with me as we break
down the moves and the motives shaking up
our digital world and why you should care.
So if you don't have your drink or
your beverage, go run and get that, because
we're getting started.
So catch the latest episode within 24 hours
of airing on The JMOR Tech Talk
(04:45):
Show on the podcast format for those of
you that want it on the road.
For even more exclusive content and insights, be
sure to visit BelieveMeAchieve.com.
And I can't wait, ladies and gentlemen, to
get my new Lenovo P16 Gen 3,
but I don't think that's going to be
until October, because Lenovo is saying they're not
going to release it until then.
But we're going to do a lot more
(05:06):
with the P1 Gen, probably on next week's
show, we're probably going to reveal some things
about it.
I might even have the laptop right here.
All right, guys, it is so great to
be with you.
And our first topic is the U.S.
targets Chinese drones and trucks.
So the U.S. is turning up the
heat, or should I say mildly, on Chinese
(05:26):
tech giants, targeting drones and heavy-duty vehicles with new restrictions.
With DJI's huge market share threatened, this move
may reshape everything from delivery services to construction
and spark a new front in the ongoing
U.S.-China trade battle.
Yeah, that could be a very big problem
(05:49):
for a lot of manufacturers, importers, exporters, you
know, et cetera.
So I think that's definitely something.
And the U.S. plans these new rules
that could restrict or even ban Chinese-made
drones and heavy-duty vehicles.
With DJI dominating the drone market, this move is definitely reshaping industries and igniting a fresh set of trade tensions beyond what we've
(06:11):
had in the past.
Our second story comes to us from, well,
this whole thing about Russia, yes.
So Russia is blocking WhatsApp, YouTube, and it's
now pushing local apps.
Russia is rewriting its playbook for online control,
cutting off major foreign apps during planned network
(06:33):
blackouts.
WhatsApp and YouTube are now blocked with only
state-backed tools allowed.
And it's a chilling reminder, guys, of how
government power can redraw the boundaries of digital
freedom in seconds.
But sometimes I think that government people, you
know, they get too much power and they
(06:54):
don't always do what's right for the people.
I'm just being honest.
So Russia's internet blackouts, as I said, only
local apps are surviving.
And Russia just revealed which apps will keep
running during its planned mobile internet shutdowns, state
-backed services like MaxMessenger and local marketplaces make
the list.
But WhatsApp, YouTube, and other foreign apps, well,
(07:16):
unfortunately, they're going to be left at the
curb to chill.
This move is supposedly strengthening Moscow's grip on
its digital space.
While millions of users are left struggling with
outages, this is their solution.
And, you know, ladies and gentlemen, citizens are
forced onto the Kremlin-approved platforms.
The internet blackouts force Russian citizens onto a
(07:39):
shortlist of the approved apps.
Your payments, as I said, MaxMessenger and other
government-run marketplaces are in.
And while tens of millions lose access to the global platforms they rely on daily, it underscores how geopolitical tensions shape our digital lives. Now, you
(07:59):
might think, John, what does it matter what
a country does over there or what happens
over here?
Plenty, because the impact can have reverberations across
not only the same state, not only, let's
say, the same country, but the entire world.
And that's a big problem, guys.
And here's one.
(08:19):
Warner Brothers sues AI over Superman and Scooby
-Doo.
Yeah, where are you?
I remember Scooby-Doo and the Mystery Machine,
right?
It's funny how Scooby-Doo always seemed to
solve the crime, but didn't really know what
(08:40):
he was doing.
He was just a dog that was always
looking for his food.
And he always wanted his Scooby snacks, if
you guys remember Scooby-Doo.
So the copyright wars have gone to the next level. How do we say, haywire? Warner Brothers Discovery is taking on AI image generator Midjourney for allegedly, remember they said allegedly,
(09:04):
using Superman, Batman, Scooby-Doo, and more to
train its models.
This blockbuster lawsuit could decide once and for
all if AI can freely use pop culture
icons or if the law will finally catch up to AI's rapid rise.
And I think this is something we're going
(09:24):
to see a lot more of, not just
today, guys.
So with Warner Brothers suing AI over the Superman and Scooby-Doo caper, hitting Midjourney with a massive lawsuit, the studio calls it, quote unquote, breathtaking piracy. Midjourney says it's, quote unquote, fair use.
(09:46):
With Disney and Universal also suing, this could
be a make or break fight for the
future of AI and copyright usage.
We just learned not too long ago that,
what was it, the New York Times was
against all the AI, right?
But then they signed a deal, they kind
of got in bed with Amazon, and now
they're allowing them to push out and share
(10:09):
their content from their shows.
I guess it's all about money, right?
Everything seems to be about money with a
lot of these companies.
And Anthropic, ladies and gentlemen, pays $1.5
billion in book piracy settlement.
In a record-setting deal, AI giant Anthropic
(10:29):
agreed to pay $1.5 billion to authors whose books were used to train the Claude chatbot, with writers like Andrea Bartz and Charles Graeber now collecting. This isn't just about compensation anymore, guys. It sets a new standard for how AI must pay up for creative work.
And I think this is a big challenge.
(10:50):
And I think now that it's starting to
set a bar on this, it could be
a really, really, really big problem.
And Judge Alsup called Anthropic's use of pirated
books, quote-unquote, exceedingly transformative, close quote, under
copyright laws.
But the trial was looming.
(11:12):
Instead, Anthropic settled, marking the largest copyright recovery
in history.
And it's a game changer.
Or is it just the beginning?
I think it's a game changer.
And I think it's just the beginning.
And I think a lot of people are
going to have to be prepared if they're
going to use AI content.
Make sure they're using the right information.
Make sure they have licenses to use it.
(11:34):
Certain digital twins require that they can only
use content for so many days.
So make sure you understand those things before
you decide to post online.
And ladies and gentlemen, parents are suing OpenAI
after a teen tragedy.
A heartbreaking tragedy brings fresh scrutiny on artificial
intelligence safety as a California family sues OpenAI,
(11:59):
blaming ChatGPT for validating their son's suicidal thoughts.
This is just crazy.
Despite new parental controls from, well, OpenAI, the
case spotlights the urgent need for real accountability
and tougher protections for vulnerable young users.
(12:20):
And I think, guys, this is the beginning.
But I think we're going to see a
lot more of this.
And it's funny how when it becomes a
lawsuit, everybody seems to get more involved.
So OpenAI recently rolled out these new parental
controls, letting parents link accounts, disable features, and
get alerts for acute distress.
(12:43):
But the family's lawyer calls it, quote-unquote,
crisis management, and demands stronger action, even calling
to take the AI offline.
This case raises serious questions.
Can AI ever truly protect teens?
Or are these safeguards just too little too
late?
And I think anybody should realize
(13:03):
that if you're going to have a conversation
with AI, you have to realize there's limitations,
right?
I understand that lawyers and families, and I
understand the gravity of the situation with this
loss of their child.
And no one deserves that.
And I get why these parents are fighting
back, because they're upset.
But not only that, they don't want this
(13:24):
to happen to another teen or another child
or another individual.
So I get the reason why they're upset,
all right?
And ladies and gentlemen, Meta blocks its AI from suicide chats with teens. Meta is now changing course after fierce backlash, updating its AI
(13:45):
-powered chatbots now.
Instead of talking about suicide or eating disorders with teens, the bots automatically redirect at-risk users to a specialist now, which I think is a great
idea that we can actually redirect them to
a, well, a certified counselor, right?
(14:06):
I think that's an important thing.
While it's a step towards safety, many critics
question why it wasn't there from the start.
I have the same question.
And I think the reason it wasn't there
from the start, I'm not going to lie
to everyone, that green stuff, that money, everything
costs money, right?
I know you might say, gee, these counselors
are there to help, but you know, they
(14:28):
don't work for free.
So somebody has to pay them.
And I know you're probably saying, John, this
is a one-off case, but it's not
a one-off case.
With Meta tightening its rules on its AI
chatbots after the backlash over the unsafe teen
interactions, instead of chatting about suicide or eating
disorders, like I said, the bot's going to
redirect users to expert help.
(14:49):
And critics are saying the safety should have
come before the launch, not after.
And I tend to agree, but a lot
of these platforms, they tend to put things
out later on.
Their whole thing is getting things out and
making money.
And ladies and gentlemen, Khan Academy pushes AI
-powered classrooms.
So could artificial intelligence be the secret weapon
(15:12):
to customize learning and close school equity gaps?
Well, according to Sal Khan, he thinks so,
championing a future where AI tutors work hand
-in-hand with teachers.
But skeptics are warning, is this true innovation,
or is it hype from big tech seeking
(15:33):
new territory for, like I'd say, maybe more
money?
And that's a very, very big problem when
it becomes a money game.
What if AI could personalize every student's learning,
give teachers superpowers, and close education gaps?
Again, this is the mission of Sal Khan
in the best case scenario.
He said it could transform schools forever.
(15:54):
But is this really the truth, or is
it just hype to kind of spend more
money?
I don't know.
We'll have to see.
And the Red Sea cable cuts disrupt Microsoft
Azure.
Global businesses and millions of users felt the
squeeze as Microsoft Azure faced major outages after
(16:14):
multiple undersea cables in the Red Sea were
unfortunately severed.
The rerouted traffic sparked latency spikes,
and slower speeds, highlighting just how fragile and
vital our digital backbone is.
And you know, we don't think about these
(16:36):
things until they become a problem.
I was home before.
I have some family visiting.
And I got to tell you, all of
a sudden, the internet went out, right?
And you notice this almost instantly, don't you?
But the thing is, it was a short
outage.
And then we also streamed more TV.
So the TVs took a little longer to
come back up, right, and reconnect, and get
(16:56):
IP addresses, and get back to the router.
But the question is, these companies don't really
pay for this downtime.
I've said before, whether we're talking about Optimum, Altice, or Time Warner, and these companies, no provider, okay, offering services should
(17:17):
be able to provide services, or buy a company, I should say, that is outside of their country.
So France should not be able to buy
a US infrastructure.
And I'm not just talking about cable TV,
guys.
I'm talking about any infrastructure services.
I'm talking about water.
I'm talking about electric.
I'm talking about any type of infrastructure.
(17:37):
I'm even going as far as to say
bus services, train services, okay?
Any kind of, let's say, major infrastructure.
So we could be talking about water.
We could be talking about electric.
We could talk about, well, we could be
talking about anything.
(17:57):
And so when these companies get on board,
they get on board because they want to
make money, right?
And it's a problem.
And like this thing with Azure, a really
big issue.
And nobody really knows about it until it
becomes a serious problem, right?
A serious problem.
But what I mean by a serious problem
(18:19):
is a problem that affects quite a few
people.
Now today, it affected quite a few of
us that were here at home.
And there was a little disturbance, and I
was knocked off for probably by the time
I got back on about five minutes.
My mom couldn't watch TV.
My dad couldn't watch TV.
So it was a matter of rebooting the
(18:39):
Roku device and letting it reconnect.
Because it took a minute or two for the router to come back online. But first, the modem had to come back online and reconnect. And then after that was allocated, the router was able to get an IP again and get internet.
But I think a lot of people want
to push blame, and I get it.
But I think there needs to be accountability
(19:01):
for the services that are offered, especially when
you're offering things like internet, water, etc.
And I remember many years ago when cable
went out, cable was a lot more accountable.
I remember a little tiny company in South
Jersey, and we had a place in South
Jersey, our summer home.
(19:22):
I remember this company called Storer Cable.
You guys probably remember Storer Cable.
So Storer Cable was like a family-run
cable company, good and bad.
Well, when it was Sunday, and let's say
the cable went out, and there was no
ability to watch the football game that was
on or the baseball game that was on
in the summertime.
(19:42):
You call them, and they weren't gonna do
anything for you.
The only way they were gonna roll a
truck on a weekend, like a Saturday or
Sunday, is if four, five, or six people
called.
So what did I do? I got on the phone, I called my neighbor, had them call. Then I reached out to another four or five neighbors and had them call too.
And then after that happened, guess what?
(20:03):
We magically saw Storer Cable roll out.
But then what happened in a few years?
You guys know what happened.
Who bought Storer Cable?
Does anybody know who bought Storer Cable?
So Storer, it was actually Storer Communications.
And Storer Cable, and again, it was in
(20:24):
South Jersey.
But if we wanna go back and think
about it, it was acquired by Comcast in
1994, okay?
It was following a broader acquisition of Storer's
systems by a consortium that included Comcast, TCI, and ATC of Time Inc.
The Township of Monroe, for example, granted a
(20:47):
franchise renewal to Comcast Cablevision of Central New
Jersey, formerly Storer, in 2002, confirming the transfer
of ownership.
And so this is pretty interesting.
It was actually Colberg-Kravis-Roberts, KKR, that
(21:07):
took Storer Cable private in a leveraged buyout
from the SCI Holdings.
That was back in 1985.
But I gotta tell you, Storer Cable was
a great company.
The only problem is, on their weekend, because
of everything happening, they did not wanna, or
they would not, I should say, roll a
(21:29):
truck if it was a weekend.
So there was no way you were getting
support on the weekend.
If your cable went out, well, too bad.
And down there at the shore, things always
were happening.
You know, like they'd have to come out
and they'd have to take the wire apart.
They'd have to re-splice it, put a
new end on, and everything was fine.
And usually it didn't take them very long,
(21:51):
but the thing is they never did preventive
maintenance.
It's when you called, then they'd come out.
And if you called them during the week,
no problem.
But again, Storer Cable got bought out by
Comcast.
Then they had Xfinity, and all these cable
companies, you know, they became these conglomerates, and
they tried to become telecommunication companies.
(22:11):
I mean, we're seeing it today with even
Verizon, right?
Verizon Wireless is now trying to become your
LAN provider.
Well, not really your LAN provider.
They're trying to resell IP phone services.
What business does Verizon Wireless have of selling
your services like your phone service?
They could barely provide wireless service.
Why don't they stick in that market?
(22:31):
I don't know, guys.
I'm just really bent out of shape with
the way these cable providers are allowed to
just operate and not get penalized for all
the damage they're causing and all the, let's
say, interruptions.
Cyber attack shuts Jaguar Land Rover factories.
(22:51):
This was a massive cyber attack, and it
has thrown Jaguar Land Rover into crisis, forcing
factories to halt and leaving workers, unfortunately, staying
at home.
With operations across manufacturing and retail crippled, it's
a stark warning for every single business.
(23:13):
Even the biggest brands are still deeply vulnerable
to cyber threats.
I always said, guys, it's not a question
of if you're gonna get attacked, but when,
if you are not properly protected.
So the luxury automaker says it's working at
pace, quote-unquote, to restore systems.
(23:33):
But retail and manufacturing operations are still heavily
disrupted.
And this was earlier this week.
No customer data has currently been reported to be stolen, quote-unquote, yet.
But the attack highlights how even global giants
are vulnerable to cyber threats.
Factory restarts and normal operations can now take
(23:54):
days or weeks, showing the growing costs and
risk of ransomware attacks.
You know, guys, and I don't know if
I said this before, but when you have
a, let's say, an antivirus solution, the days
of having software that just sits on your
computer, well, that's over.
(24:14):
Things need to be cloud intelligent.
So you need to be protecting against things like malware.
You need to have web and video filters.
You need to have vulnerability scanning often, right?
You need to have the ability to do
sandbox detection.
What's sandbox detection?
Well, think of it like this.
You have this program that you're running, and
(24:36):
you don't really know if it's a good
program or a bad program.
So you let it go play in its
own area, like kind of quarantined off from
everything else, and you monitor it, but you
don't really interfere with it.
You just let it do what it's doing.
And if it does something out of the
ordinary that's gonna cause damage, well, then you
basically keep it there.
You do not let it back into the
(24:57):
network, and you report it to other places
around the world, and soon millions of networks
will all be aware of this so that
if this virus does happen to hit their
network, it immediately goes into quarantine.
So sandboxing is great for emails, for files,
but everything gets scanned.
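Just to make that concrete, here's a tiny conceptual sketch in C++ of that run-it-in-its-own-area, watch-it, decide flow I just described. The behavior names and the verdict logic are made up purely for illustration, a real endpoint product hooks the operating system at a much deeper level, but the idea is the same.

```cpp
// Conceptual sketch of sandbox detection: run the sample isolated, observe
// what it does, and decide whether to release it or keep it quarantined.
// Behavior names below are invented for illustration only.
#include <iostream>
#include <set>
#include <string>
#include <vector>

enum class Verdict { Clean, Quarantine };

Verdict analyzeSample(const std::vector<std::string>& observedBehaviors) {
    // Hypothetical "out of the ordinary" actions that keep the file locked in
    // the sandbox and trigger a report to the shared threat-intel cloud.
    static const std::set<std::string> suspicious = {
        "modify_boot_sector", "encrypt_user_files", "disable_security_service"};
    for (const auto& behavior : observedBehaviors) {
        if (suspicious.count(behavior)) return Verdict::Quarantine;
    }
    return Verdict::Clean;
}

int main() {
    std::vector<std::string> run = {"open_file", "encrypt_user_files"};
    if (analyzeSample(run) == Verdict::Quarantine) {
        std::cout << "Keep it in the sandbox and share the signature\n";
    } else {
        std::cout << "Release it back to the network\n";
    }
}
```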
Now, just to give you an example, on
(25:18):
my computer, I think I have like 295,404 files submitted, and 295,404 files are clean.
Zero have a zero-day vulnerability, and zero
are pending any evaluation.
I think a lot of people think that
you can just use free antivirus software.
(25:40):
You're just fooling yourself, guys, because threats are
growing so exponentially, and a free software program
isn't gonna do it.
And I'm not gonna mention certain software programs,
but I gotta tell you, the guys that
used to produce great software, they're not hacking it anymore.
You need something that is able to keep
(26:02):
up with the current trends, with the current,
let's say, digital signatures, et cetera.
And so it's not a case of just
scanning and saying, gee, this is what it
is.
It's now understanding patterns.
So it's not just enough to say, gee,
this is the pattern, but looking for a
certain type of pattern-like activity, because the
pattern could be different, guys.
(26:23):
It could be different.
And again, it's not a question of if
you're gonna get attacked but when, if you're
not properly protected, whether that's your home, whether
that's your business.
Even your smartphone can get attacked.
It's not as common.
And yes, guys, the Mac can get attacked.
You might say, gee, well, the Mac isn't prone to it.
No, the Mac can get a virus.
(26:47):
Yes.
And other types of malware such as adware,
spyware, ransomware, and even Trojans.
Although macOS has built-in security features like
Gatekeeper and XProtect, these protections are not foolproof
and can be bypassed by new emerging threats.
Users must remain vigilant by practicing safe browsing
habits, only downloading apps from trusted sources and
(27:09):
being wary of phishing attempts.
On a lot of the Windows computers, they
have something now called Secure Boot.
And Secure Boot is an attempt to basically
make sure that no program changes the boot
sector, which could cause problems on your computer,
like trying to infiltrate some type of, let's
say, malicious covert virus until you reboot and
(27:32):
then voila, it takes over.
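And just so that idea is clear, here's a little conceptual sketch, and I want to stress this is not the actual UEFI Secure Boot mechanism, which verifies cryptographic signatures in firmware, of the check-the-boot-code-against-a-known-good-value-before-you-run-it idea.

```cpp
// Conceptual sketch only: compare the boot code against a value recorded when
// it was trusted, and refuse to continue if anything changed. Real Secure Boot
// uses signed images verified by firmware, not a simple hash like this.
#include <cstdint>
#include <iostream>
#include <vector>

// Simple FNV-1a hash standing in for a real cryptographic digest.
uint64_t fnv1a(const std::vector<uint8_t>& data) {
    uint64_t h = 14695981039346656037ULL;
    for (uint8_t b : data) {
        h ^= b;
        h *= 1099511628211ULL;
    }
    return h;
}

int main() {
    std::vector<uint8_t> bootSector = {0xEB, 0x3C, 0x90, 0x4D};  // pretend boot image
    const uint64_t expected = fnv1a(bootSector);                 // recorded while trusted

    bootSector[1] = 0x00;  // simulate a covert modification by malware
    if (fnv1a(bootSector) != expected) {
        std::cout << "Boot halted: boot code was modified\n";
        return 1;
    }
    std::cout << "Boot code verified, continuing startup\n";
}
```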
And guys, Germany launches, yes, Europe's fastest supercomputer.
This is kind of blowing me away that
they're doing this.
I mean, I would have thought that we
would have done this year in the US,
but I guess not.
And enter Jupiter, Germany's new exascale supercomputer, which now
(27:56):
is said to be, that's today, September 12th,
2025, it could change tomorrow, the fastest in
Europe and powered by Nvidia chips with the
ability to process more calculations than, get this
guys, 10 million laptops combined.
Wow.
It's a game changer, right?
(28:17):
For AI innovation, medical research, and Europe's competition
with US and China's tech giants.
So I think these are important things to
understand.
And although I believe our world is definitely
growing and we're seeing a lot of things,
I believe we still need to be vigilant
and we still need to be ahead of the
(28:38):
curve, right?
Faster than 10 million laptops combined.
Chancellor Merz says, quote, Jupiter will help Europe compete with the US and China in AI innovation, from biotech to climate research, and aims to boost startups and established companies alike.
Europe is officially entering the high-performance computing
(29:00):
fast lane, close quote.
So there's a lot going on and everybody
is trying to jump on this AI ship
because one, they think it's gonna make them
a lot of money.
But what they don't realize is that the
cost of implementing AI is not cheap.
So what a lot of people are trying
to do is implement it in one area
(29:22):
and then have that area be able to
roll it out to other businesses in a
similar discipline or the same discipline.
This is what they did with POS software
many years ago, right?
If you rolled it out for one business,
for retail, well, you could make changes because
every business basically has either a product or
a service.
First, they came out with software that just
(29:43):
did services, right?
And some services didn't have taxes, some did.
Then you didn't really have backordering on that.
But when you got into products, well, then
you could have backordering, right?
So you had a backorder functionality.
You had quantities.
Then you had things like shipped, backordered.
Then you got into things like even in
(30:03):
the other system where you had just services,
you had things like deposit, right?
Retainers.
So when this happened, they rolled it out
to another market, right?
So they started with one market.
I remember working with a company, I think
they were in Florida, and they made a
DOS point of sale system that I used
many, many years ago, probably close to 28
(30:24):
years ago.
And the software was a pain because everything
was proprietary and it was all DOS.
And I still remember to this day how
everything, setting up everything in that software wasn't
hard, but it was all password protected.
And they never gave you access to any
configuration information.
So after a year or two of their
(30:44):
software being inferior, I just threw them out.
I mean, it looked nice and it looked
like it did everything great, but it was
not flexible when it came time to make
changes.
Like even making a change to our invoice
was a process and we couldn't do it
ourself.
We had to reach out to them.
So that was a big problem.
(31:04):
But so rolling it out to a company
that sells products or services, everyone pretty much
sells a product or a service, right, in
business.
So they're able to roll it out to
things like hairdressers because product or service, right?
They just customize some of the templates and
some of the categories, car washes, right?
(31:26):
A little bit different because a car wash
has to do more than just be a
point of sale system.
It has to actually interact with the controller
so the controller can tell it what type
of wash.
Otherwise you'd be doing things manually by hand.
What else did they incorporate?
Well, later on they got into things like
restaurant point of sale systems.
(31:47):
And so then you got into the whole
touch market.
And so that was pretty cool.
But then after the touch market happened, we
got into the issue of multi warehouses and
how do you keep track of inventory in
real time?
If I take something out and you're here,
we had something called offline inventory and online
inventory.
(32:07):
Offline inventory was the worst because you could
potentially oversell something.
If you were working with live inventory, it
would tell you exactly how many you have.
And if you went to sell something you
didn't have, it would tell you that you're
a negative inventory and that you could only
sell three.
And then the other three you had to
issue a purchase order for.
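Just to picture that, here's a tiny illustrative sketch of what a live-inventory check boils down to; the names and the quantities are made up to mirror that sell-three, purchase-order-the-other-three example.

```cpp
// Illustrative live-inventory check: a sale bigger than on-hand stock is split
// into a fulfilled quantity and a backordered quantity needing a purchase order.
#include <algorithm>
#include <iostream>

struct SaleResult {
    int fulfilled;    // units you can hand over right now
    int backordered;  // units that go on a purchase order
};

SaleResult sellItem(int& onHand, int requested) {
    int fulfilled = std::min(onHand, requested);
    onHand -= fulfilled;
    return {fulfilled, requested - fulfilled};
}

int main() {
    int stock = 3;
    SaleResult r = sellItem(stock, 6);
    std::cout << "Sell " << r.fulfilled << " now, issue a purchase order for "
              << r.backordered << "\n";  // Sell 3 now, purchase order for 3
}
```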
So there were ways that this was handled,
(32:29):
but then we got into more software that
was Windows based and the databases got more
robust.
But then what also happened on top of
this is we started getting into more about
security.
Who has access to what particular screens?
Who has access to what functions?
We could be talking about things like voiding,
(32:49):
et cetera.
We could be talking about things like, maybe
doing a discount.
And the reason I tell you about the
discount feature, true story, we had somebody working
for my parents' business and I wrote the
software for their dry cleaners.
And what happened there was very interesting.
I caught somebody that was stealing because I
(33:12):
noticed that the in and the out numbers
seemed like way off.
And I figured, well, in was what we
took in and out was what we cashed
out.
So I decided after this was happening a
few, for about a week or so, I
said, I'm gonna go through the tickets.
Something we never did.
And I went and added all the tickets
(33:32):
up.
And you know what I found?
I found that we were off by $500
on the out.
Anybody know what the ticket said?
So I said, huh, how is that possible?
So then I went into the system and
I realized that that $500 was given in
coupons or discounts.
I said, okay, fine.
(33:53):
So I said, maybe they just weren't marked
on the ticket or something, right?
Wrong.
We only allow coupons at the time the
order is brought in.
They were running discounts, which is when you
take a percentage off and coupons when you
take a dollar value off.
When?
At the pickup.
So they pocketed the $10 or the $20.
(34:14):
And nobody really noticed it until I said,
Richie, let's start counting the invoices and start
tabulating them.
So what did I do? I took away the feature to do discounts and voids for anyone except the manager.
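And if you want to see what that change really amounts to, here's a minimal conceptual sketch, not the actual software I wrote for the dry cleaners, of a role check that runs before any discount or void goes through.

```cpp
// Conceptual sketch: discounts and voids only go through for the manager role.
#include <iostream>

enum class Role { Cashier, Manager };

bool canApplyDiscountOrVoid(Role role) {
    // After the change, only the manager keeps this privilege.
    return role == Role::Manager;
}

int main() {
    std::cout << std::boolalpha
              << canApplyDiscountOrVoid(Role::Cashier) << "\n"   // false
              << canApplyDiscountOrVoid(Role::Manager) << "\n";  // true
}
```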
Now we had put them on there because
we didn't think it was a problem because
once in a while we would allow it,
(34:34):
but we never believed that anybody was robbing us $5, $10 a day, $20 a day,
because they were beating the system.
If the order was $100, well, they suddenly
put in that it was 80, took the
$100 bill.
Later on, the $20 bill went into their
pocket.
Never did it when the customer was there
though.
So these are some things that were happening.
So things like cash control and stuff like
(34:57):
that came out.
And ladies and gentlemen, the SEC plans a
big crypto rule overhaul.
Yeah, this is pretty big.
The US SEC is shaking up the digital
world with a new rulemaking agenda to modernize
how crypto is regulated.
(35:18):
They're looking for streamlined rules that could both
lower compliance costs for companies and finally bring
digital assets to national exchanges.
This is a pivotal shift for the future
of the finance world.
And I think it's something that's gonna definitely
turn the crypto market, I'm gonna say on
(35:40):
its ear or on its side, because this
is very big, guys.
Basically, they're shaking up things and they wanna
allow digital assets to trade on national exchanges.
The SEC chair, Paul Atkins, calls it, quote, a new day for innovation, transparency,
(36:01):
and market efficiency.
After years of crackdowns, this could be a
major win for the digital assets industry.
Is it the right thing?
We'll have to see because a lot still
could go wrong.
And Nepal moves to block Facebook access.
What's this about?
Nepal's government is taking a controversial stand blocking
(36:24):
unregistered social media platforms, including Facebook, to fight
misinformation and cyber crime.
While authorities claim it's for public good, critics
are warning it's a blow to free expression,
freedom of speech, for millions of citizens living
(36:44):
their lives online.
And so these are some challenges that are
definitely hitting.
And so with 90% of its 30
million citizens online, the government claims fake accounts
and misinformation are fueling cyber crime and social
unrest.
Critics warn this could stifle free expression, right?
(37:05):
By causing big problems.
But authorities are arguing stricter oversight is necessary
to protect users and preserve social harmony.
Only time is gonna tell guys.
And Mr. Trump, President Trump, hosts a tech dinner in the Rose Garden. In a rare show of unity, President Trump
(37:28):
broke bread with tech's biggest names.
We know Zuckerberg, Tim Cook, Satya Nadella, and
OpenAI's Sam Altman, of course you know him,
in the newly renovated Rose Garden to talk about AI policy, innovation, and the next tech boom we're coming in
(37:49):
front of, with this headline-grabbing gathering.
And you know, a big thing's happening right
now is you know, when will TikTok's app
come out?
It was supposed to come out September 12th,
okay?
(38:10):
It was supposed to come out.
The question is, it's not out yet.
So how can they allow TikTok to keep
operating?
Interesting, right?
Maybe you think TikTok's app isn't out yet
because its service was temporarily unavailable in the
(38:30):
United States.
There's some speculation.
The initial release, TikTok was originally released in
China as Douyin in September, 2016.
Its international version was launched in 2017.
The possible new US app, because of ongoing
uncertainty, there have been reports that TikTok may
be building a new app specifically for the
(38:51):
US market.
This version would be separate from one owned
by ByteDance while reports suggest a new US
-based app could launch by September 5th.
We're well past that.
ByteDance has not officially acknowledged these plans.
They're saying it may be coming.
So the question is, is a new app
(39:12):
coming for TikTok or is this nonsense?
You know, I don't think anybody really knows
whether it's coming or not.
I think people wanna believe it's coming.
And this might be just something to keep
us quiet so that they let TikTok stay
(39:32):
around.
But I don't know, I just feel like
TikTok is a security risk.
They run their company however they want.
I know we used to be able to
stream on TikTok.
We can't anymore because supposedly our content doesn't
align with their values.
So tech content doesn't align.
Motivational content doesn't align.
(39:53):
Like something doesn't like match up with that.
Yeah, something weird there.
So, you know, guys, don't miss the full
breakdown because you know, the latest episodes, they
drop within 24 hours on The JMOR
Tech Talk Show at BelieveMeAchieve.com.
And you probably wanna ask something, and this is a very good question: so what
(40:13):
tech was used to find, you know, Charlie
Kirk's assailant, is probably what you're wanting to
know.
So they've been working quite a while.
Multiple sources suggest that it was a combination
of conventional police work, physical evidence, and digital
trails that led to the arrest of Tyler
Robinson for the assassination of Charlie Kirk.
(40:36):
The technology used was part of a larger
investigation which involved tips from the public and
family members.
Authorities reviewed video from university cameras and were
able to trace the suspect's movements both before
and after the shooting.
This footage showed a person matching Robinson's description
fleeing the scene.
(40:57):
Discord messages, investigators uncovered messages from Robinson on
the messaging application Discord.
The messages reportedly contained details about the rifle
used and its location as well as engravings
on the bullets.
Phone tracking, following the digital trail, the US
Marshals were able to track down and detain
Robinson while awaiting the FBI.
(41:19):
Pretty cool.
And the FBI's public release of surveillance images
including clothing details was a key factor.
Robinson's father recognized his son from these photos
which ultimately prompted his family to contact authorities.
DNA and fingerprints were a help.
Investigators collected forensic evidence from the scene including
(41:40):
impressions of a forearm, palm, and shoe on
a building near the sniper's nest.
Weapons, law enforcement recovered a bolt action rifle
believed to be the murder weapon in a nearby wooded area.
This provided crucial physical evidence to support the
entire case.
And the thing that was very, very interesting
(42:01):
about this is that it's not that he
was as close as you thought.
He was a building or two away, but
he was able to get up on the
roof.
Now, my question is this.
This is a really big question.
Now, those of you may know I go
to Montgomery University as a grad student.
We can't go on the roofs of Montgomery
University.
Any of the campuses, we cannot go on
(42:23):
the roofs.
They're locked.
So how did he get on these roofs?
Did he break locks?
I mean, like, what did he do?
Or do they allow students to go on
the roofs?
I don't know.
I just feel this is a...
And the question I wanna know is, so
why did they do this to Charlie Kirk?
(42:47):
I mean, what was the motive?
I mean, do we know?
So Tyler Robinson was only 22 years old.
The motive, do we know?
I think the motive is being speculated right
now.
It's kind of crazy.
(43:11):
And it's crazy that this happened.
Again, there were engravings on the shooter's ammo
exposing Charlie Kirk's assassination motive is what they're
saying.
The rifle that federal investigators believe was used
in the shooting that had killed Charlie Kirk,
it was inscribed with anti-fascist messaging shedding
(43:32):
light on the suspect's motive.
Utah Governor Spencer Cox confirmed the messaging at
a news conference recently, saying investigators discovered inscriptions on casings found with a bolt-action rifle near
the Utah Valley University campus where Kirk was
killed during an event.
One used casing and three unused casings contained
(43:55):
the writings.
And so the thing is, what happened?
It's really crazy how this happened, but do
we know the real reason?
Do we actually know why did Robinson kill
(44:17):
Charlie Kirk?
Why?
Did he say something?
Did he do something wrong to him?
I mean, did he offend him in some
way?
I mean, something had to happen, right?
I don't know.
So Robinson apparently had become a lot more
(44:40):
political in the recent years.
And because of this, he was trying to
take a stand.
You know, authorities said that Tyler Robinson from Utah, taken into custody just this morning, September 12th, is accused of killing the conservative influencer Charlie Kirk, who died after he was shot in the neck during a public appearance at
(45:01):
Utah Valley University in Orem, Utah.
And Washington City Police were keeping media and
neighbors across the street from the alleged shooter's home.
So Tyler Robinson is a 22-year-old
man who attended school in the southern Utah city of St. George, and was named as the
suspect shooter.
(45:21):
Robinson has no criminal history according to the
state records.
Washington City, where Robinson's family lives, borders a
larger city.
And Robinson is a registered voter in Utah,
but does not have a party affiliation.
Social media photos from Robinson's mother, Amber, depicted
a tight-knit family where Tyler was the
oldest of three sons.
The family is shown on vacations to the
(45:42):
Grand Canyon and on outdoor excursions like fishing
trips.
In August, Amber Robinson posted what appears to
be an ACT college aptitude test score for
Tyler, 34 out of 36.
That would place him in the top 1% of test takers.
The question is why?
Tyler wore a Trump-related Halloween costume in
(46:03):
2017, according to Amber's social media.
And the eldest Robinson son actually, yes, he
graduated from Pine View High School in St.
George in 2021, according to Amber Robinson's social
media.
The high school did not immediately respond to
an inquiry when many people had reached out.
Tyler is photographed at Utah State University and
(46:25):
was offered an academic scholarship to attend the
Logan Utah-based university.
The question is like, why?
So authorities tied him to the crime through
a review of online messages, interviews with his
family members and friends, and surveillance video.
President Donald Trump, without naming Robinson, first announced
(46:46):
his capture during a Friday morning appearance on
Fox and Friends.
Trump said that a minister was involved in
identifying the suspect.
Neighbors of the family attended the church whose
members are colloquially known as the Mormons.
So, you know, Robinson became more political, as
we said, in recent years, and that obviously
(47:07):
had something to do with it.
At the dinner table before Kirk came to
Utah Valley University for the event at which
he was killed, suspect Robinson talked in disparaging
terms about the conservative activist, Cox said at
the news briefing.
Investigators interviewed a family member of Robinson who stated that Robinson had become more political in recent years.
(47:28):
A family member interviewed by authorities referenced a recent incident where Robinson came to dinner prior to
September 10th.
And in the conversation with another family member,
Robinson mentioned Charlie Kirk was coming to UVU.
They talked about why they didn't like him
and the viewpoints that he had.
And the family member also stated that Kirk
was full of hate and spreading hate.
(47:50):
So what kind of gun did the investigators
recover?
Well, they recovered a bolt-action rifle, as I mentioned.
But the thing is, they were able to
use technology to actually get in and figure
out what went wrong.
I think that was the most important thing
is that, you know, when you've got these
type of things happening, right?
And, you know, we've got technology, we've got
(48:11):
online stuff, we've got social media, as we
know, we've got all these different things that
we're doing, right?
And so as we're looking at the stuff,
you know, we've got phones, right?
We've got social media.
We've got people we talk to.
There's so many different things that I feel,
you know, so they knew there was some
(48:32):
type of a problem in there.
But my point is that a lot of
people were not suspecting what had happened.
And earlier today, I was reading just that
they gave a sign, basically, they took left
hand and right hand, and they kind of
put it together right behind his back, just
like this, just like this.
And that was a sign to tell the
(48:53):
shooter, you know, shoot.
And so there was obviously a few people
involved in this.
So the technology that was used, the forensics
that was used, and all of the different
technology that was involved, they didn't even go through artificial intelligence, and they were able to nail this person just on, you know, some things from, you know, Discord.
(49:16):
I've said to you before, whatever you put
online, I don't care what it is, that
information is not gonna be deleted.
So if you do put something online, remember
something, you cannot take it down.
You may try to reach people that, you
know, to get things taken down.
But the thing is, you can't take things
down.
Remember, social media does not belong to you,
(49:37):
it doesn't belong to me.
And you have the ability, okay, to make
sure that whatever you're saying is correct.
If you do not say the right thing,
well, you're not gonna be able to take
those things down.
I think that's a very important thing to
understand is that, you know, social media is
a public world for everyone out there.
(49:58):
And, you know, it doesn't belong to me,
it doesn't belong to you.
And so we have to be cognizant, and
we have to be respectful of the social
media, of the world.
And my big question about this whole university,
and I guess it goes back to a
lot of people, is why was he able
to get up on a roof, right?
I mentioned to you at my college, I
can't get up on a roof.
How did he get up and just go
(50:19):
up on stairways and walk to the roof?
Did he break the door to get to
the roof?
Or, you know, would the doors open at
the campus?
I mean, I can't think that any university
would leave doors open if somebody could get
to the roof.
But we know that he got to a
roof, and when he got to the roof,
he was in the appropriate, they called it,
nesting position is what the FBI called it.
(50:40):
And he was in the appropriate place, and
he just sat there.
But there were other people on the ground
that were making sure that Charlie Kirk was
in the appropriate spot so he could hit
him dead on.
So there was obviously things happening.
And I think if we trace this back
to what I was saying before, there was
some hatred from Robinson about the fact that,
(51:03):
you know, he had gotten more political into
the views.
And I always said, you know, you have
a right to agree, you have a right
to disagree, that's totally up to you.
But what you don't have a right to
do is to harm, to hurt, or disparage
another person, or to make them feel bad
in any other way.
So I think these are things that a
lot of people had issues with.
And so, you know, the fact that, like
I said, he was top of his class
(51:23):
and everything, and then to be going off.
So something obviously set him off.
So I think in these type of things,
I mean, not that they had to do
it, but I mean, I think you have
to be careful in who you let at
these things, you know, that you kind of
know who the people are.
But I know it's hard at a university,
right?
So when we think about the initial insights
of, you know, what went on, and we
(51:45):
think about this week, and we think about
the whole fact of, you know, what's going
on, we have to realize just one thing.
We have to ask ourself this question of
artificial intelligence.
Artificial intelligence is not good, it is not
bad.
It is a tool, and how we choose
to use it makes it so.
Remember that, guys, it is a tool, and
how we choose to use it makes it
(52:05):
so.
So if you go use AI, which you
could do right now, you could use AI
right now to harm somebody.
And I wouldn't tell you to do that,
but if you do that, then you're using
it for the wrong sense, right?
If you use AI for a good sense,
then you're using it to help other people.
But I think a lot of people out
there don't realize it.
They just say, oh, gee, there's a tool,
we can use it.
And I think we're getting a lot of
(52:26):
these messages from people that are actually overseas.
We're getting messages from people overseas.
I apologize for some of the interruption in
the back.
We have my parents visiting, so I do
apologize for some of the interactions of people
from the back end again.
But I think when we can look at
this from a perspective and we can think
about technology, we can talk about things like
(52:48):
robots, and let's just think about robots for
a quick second.
So robots, and I'm going to this because
I'm actually taking a robotics, computer science, mechanical
engineering, and electrical engineering class.
So robots in themselves are basically systems.
And so the way we as humans see
the world or get senses of the world
(53:09):
are from our eyes, from our nose, from
what we touch, what we smell, what we
taste, and also our proprioception.
If I close my eyes and I put
my hand up in the air, my body
knows that my arm is in the air,
even if I don't have my eyes open.
So a robot is very similar in the
sense that it has sensors.
So we have two things.
We have something that's called sensors.
(53:29):
And then in addition to the sensors, we
have actuators.
So in our body, our actuators are basically
our muscles.
You know, we can move things like our
arms and our legs and things like that.
So the thing about a robot though, is
that it has to be able to calculate
things.
Now we just take for granted that we
don't hit things.
So we use sensors like, we use things
(53:50):
like LIDAR, which is a light and beam
that basically reflects back and forth.
We use ultrasonic sensors, which can be used
to bounce a wave back.
And then we calculate the time back.
We have angulation techniques where we can tell
the position of a rotation.
We can tell on our phone, whether our
phone is 30 degrees, 90 degrees, or what
(54:12):
it is.
And you might say, John, why is all
this important?
Well, you see, what we do every day,
we take for granted.
But what a robot does every day, everything
has to be fed in.
Like, you know, if we get too close,
we know we're gonna hit the wall, right?
But if a robot gets too close, it
could injure itself.
It could cause damage to itself.
So the robot doesn't know that.
(54:32):
So we as programmers have to be able
to tell it, hey, you're getting too close.
How many of you ever used, you know,
they sell parking sensors, right?
Sometimes they can be more of a nuisance,
right?
Because you get close to something, but then
you're really not where you need to be.
And that can be more annoying.
Maybe you get dust on the sensor.
So there are different types of sensors and
(54:53):
the sensors give us numerical data back.
So what happens is, sensors can operate by
taking feedback from light, from sound, from chemicals.
And then those chemicals, or those different things from, let's say, the light or the sound, can be quantitatively analyzed
(55:16):
and we can give them a value.
So the other day I was using an
ultrasonic sensor.
So an ultrasonic sensor uses sound waves that
are so high that we as humans cannot
hear them.
When you send a sonic wave, ultrasonic wave
out, it bounces and hits something.
What happens is when it comes back, okay,
(55:37):
which by the way works out to distance equals velocity times time divided by two, d = (v x t) / 2, in case you guys want to know the formula. So when you take the speed of sound and you take the time, you're able to get the actual distance.
And so if you were to put your
hand or you're able to move the sensor,
you can get ranges that are different based
on what it's hitting, right?
(55:59):
And so this is what we use in
sometimes different vehicles.
More advanced robots use things like LIDAR.
LIDAR is a lot more expensive.
LIDAR is used in autonomous vehicles.
And so the whole idea of us bouncing
things, getting quantitative numbers, because let's say if
we know that hitting an object might be,
I'm just gonna say might be zero, okay?
(56:20):
But not hitting the object might be everything
to minus five of zero because we don't
want to get that close, right?
So if we understand that everything in robotics
or in computers is numbers, in our lives,
we're doing that, but our brain's able to do the thinking.
Hey, if we do something, we're gonna hit
something.
The robot's not that smart.
(56:42):
And so every little thing of a robot
has to be programmed.
I'm talking about when something gets a trigger,
just to tell an ultrasonic thing to fire
and not fire, you're talking about eight lines
of code, not a big deal, but eight
lines of code using an Arduino board, having
a trigger, having an echo back, having a
(57:02):
power, having a ground, and that's just one
sensor.
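To give you a feel for those eight-or-so lines, here's a minimal Arduino-style sketch, assuming an HC-SR04-type ultrasonic sensor with its trigger on pin 9 and its echo on pin 10; the pin choices are just for illustration.

```cpp
// Minimal ultrasonic read: fire the trigger, time the echo, convert to distance.
const int TRIG_PIN = 9;   // illustrative pin assignments
const int ECHO_PIN = 10;

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  // Fire a short ultrasonic pulse.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  // Time the round trip, then apply d = (v x t) / 2: sound goes out and back,
  // so the one-way distance is half the trip (~0.0343 cm per microsecond).
  long durationUs = pulseIn(ECHO_PIN, HIGH);
  float distanceCm = (durationUs * 0.0343) / 2.0;

  Serial.println(distanceCm);
  delay(100);
}
```

And that's the point: even that one reading, fire the pulse, time the echo, do the math, has to be spelled out for the robot.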
Now, most robots could have anywhere from 20
to 50 sensors.
That's a lot of sensors.
And so there's a robot out there right
now called Unitree. Unitree right now runs about $150,000. Unitree is a commercial dog robot.
You probably remember Spot from Boston Dynamics.
(57:25):
Now, their robot's only about 75,000.
The difference between that robot and the one
that's 120,000 is that the one that's
120,000 has actually got LiDAR in the
front and LiDAR in the back, while the
one from Boston Dynamics has some great features,
but it has minimal LiDAR.
So it has some kind of things.
And a lot of these companies are trying
to throw AI on, but AI is not
(57:46):
ready.
It's not ready to be intelligent.
It's not ready to make decisions.
I've always said we've got to keep humans
in the loop.
So whether we're talking about security, whether we're
talking about entertainment, I think we're gonna see
a lot of different evolutions that are coming
(58:07):
up in the world of technology, whether we're
talking about robotics.
And remember, when I think about robotics, we're
not just talking about one thing.
We're talking about a whole system of things.
Hey guys, I'm John C.
Morley, serial entrepreneur.
Be sure to check out BelieveMeAchieve.com.
I'll catch you guys next week.
Have yourself a wonderful weekend.