Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:02):
Hi everyone, I'm John C. Morley, the host of
The JMOR Tech Talk Show and Inspirations for
Your Life.
(00:45):
Well hey everyone, it is John C. Morley here,
welcome to, yes, The JMOR Tech Talk Show.
I'm so glad you decided to pop in
and join me on this wonderful Friday afternoon.
It is May 23rd, 2025, that we're doing
the show here.
If you're tuning in any other time, well
welcome.
If you're coming for the first time, welcome,
(01:06):
and if you're coming back, well welcome back.
It's always great to have you here.
Do check out BelieveMeAchieved.com for more of
my amazing, inspiring creations 24 hours a day.
Anytime we're not on here live, you can
go ahead and check that out.
If you're thirsty or hungry, well why not
head over to the kitchen and get yourself
something delicious, something hot, something cold, or perhaps
just something yummy, sweet, or maybe something fruity,
(01:30):
doesn't matter.
Feel free to do that and come on
back to the show, guys, because we have
a lot, and I mean a lot, to
share with you guys.
Again, welcome to The JMOR Tech Talk Show.
This is the place where you get all
kinds of very valuable technology nuggets that I
don't think you're going to get anywhere else.
(01:52):
I am John C.
Morley, serial entrepreneur.
I'm an engineer, marketing specialist, technology expert, podcast
host, and also a podcast coach, and I'm
your trusted guide through the fast-changing, evolving
world of innovation.
Now, on The JMOR Tech Talk Show, we
break down some of the biggest stories in
tech with insight, clarity, and a bit of
(02:14):
flair.
Every week, I cover topics that seem to
matter to everyone, from AI, artificial intelligence, breakthroughs,
and even cybersecurity threats to tech giants rewriting
the future.
Tune in every week on Friday, and if
you miss it, of course, you can go
to BelieveMeAchieved.com.
We're also on some different cable stations.
(02:34):
You can catch us there, and of course,
you can catch us on our podcast while
you're jogging, running, or maybe just in the
car.
You can tune in for Real Talk, Big
Tech, and Future Forward Thinking, because knowledge is
your power cord.
Everyone ready?
All right.
I know I am.
This week, we've got some amazing hot tech
(02:56):
highlights.
First one is Fortnite.
We had a lot of talk about Fortnite
in the past, in case you guys didn't
know.
There were a lot of things going on
with Fortnite, and well, Fortnite, unfortunately, is blocked
again.
Yes, Apple and Epic Games are reigniting their
(03:19):
battle, and Fortnite is caught in the crossfire
again.
The wildly popular game has been removed from
iPhones across the United States and European Union
after Epic allegedly violated Apple's platform policies.
This fight isn't just about games.
It's a deeper war over control, commissions, and
(03:39):
the future of the mobile app marketplace.
I think a lot of people are getting
a little bit tired with this, to be
honest with you.
It all is coming down to money.
We know that, right?
It's coming down to money, and basically, this
started around May 16th, 2025, guys, due to
(03:59):
the renewed conflict that surfaced.
While Apple claims it only requested Epic to
remove the US storefront from an app update
to avoid disrupting other regions, Epic accuses Apple
of retaliatory behavior and has asked a US
judge to hold the tech giant in contempt.
(04:19):
This ongoing feud dates back originally to 2020,
when we first covered the story and when
Apple first banned Fortnite for bypassing the App
Store payment rules.
Although the game briefly returned after European Union
regulatory pressures, recent developments have once again blocked
(04:39):
its access through Apple's iOS platforms globally.
Now, Apple is saying that this was just
about an app update, but I don't
really believe them, and I think it's important
that a judge definitely steps in to see
what the heck is going on.
Because it's all coming down to money, guys.
It's coming down to money, and Apple is
greedy.
I'm not going to lie about that.
(05:01):
All right.
So, number two is Microsoft's European Union play,
a strategic move to avoid billions
in fines.
So, Microsoft is decoupling teams from Office 365
in the European Union.
This preemptive strike aims to soothe regulators and
level the playing field for collaboration tools like
(05:23):
Slack and Zoom.
It could, well, it's possible that this could
signal a global shift in how big tech
bundles software.
And I think people say they're doing it
for convenience, but we all know it's not
convenience when you're in this industry.
They're doing it because they want to be
(05:44):
top of mind.
They want to be in your face, and
they want you to choose their software over
someone else.
I mean, that's just the name of the
game, guys.
I mean, it's not about convenience.
It's about having that real estate, having that
space in front of you all the time.
And number three is a really good one.
(06:04):
The AI regulation clash.
Yeah, see, the debate over AI regulation
is heating up.
Forty state attorneys general are challenging a proposed
federal ban that would prevent states from creating
their own AI laws.
Now, they're arguing that states must retain the
power to protect citizens from AI-driven threats
(06:25):
like bias and automation-induced job loss.
But is there more to it than this,
guys, or is this just solely about money?
Well, so as I say, this AI ban
faces pushback from the state attorneys general.
A Republican proposal to ban state-level regulation
of artificial intelligence for 10 years,
(06:48):
tied to President Donald Trump's tax
bill, has sparked bipartisan opposition from 40 state
attorneys general, including representatives from California, New
York, Ohio, and others.
Now, critics argue that such a, let's say,
move would strip states of their ability to
protect consumer rights from high-risk AI uses,
(07:09):
particularly as federal regulations remain stalled. California
Attorney General Rob Bonta, citing recent state laws
regulating AI-generated deepfakes and healthcare decisions, strongly
opposed the federal override. Supporters of the ban,
including Representative Jay Obernolte and Google, claim
(07:33):
it's necessary to avoid conflicting state laws
and to preserve national AI leadership.
Now, the measure must pass Senate budget rules
to proceed, so don't think that this is
going to happen anytime tomorrow, because it's not.
I mean, the thing about all these laws,
right, is that they all kind of want
(07:54):
what they want.
But at the end of the day, what
has to happen is something that's going to
be best for everyone, for all concerned.
I think that's a very, very important point.
But I get why sometimes this could be,
how can I say, a little bit challenging.
I mean, really, it could be a big
problem for some people.
And I think if you understand that, then
(08:15):
maybe you can understand where they're coming from.
I mean, hopefully you can.
And number four, guys.
So Alibaba's iPhone deal worries are surfacing.
Rumors of a possible AI partnership between Apple
and the Chinese tech giant Alibaba have raised
(08:35):
some red flags with U.S. national security
officials.
Concerns center around data sovereignty, surveillance potential, and
what this means for Apple's longstanding privacy-focused
brand image, folks.
And so there's a lot here at stake,
guys.
I mean, quite a bit is at stake.
(08:57):
And so, you know, the Trump administration and
U.S. lawmakers are closely scrutinizing a
reported deal between Apple and the Chinese
tech giant Alibaba to incorporate Alibaba's AI technology
into iPhones sold in China.
According to the New York Times, officials are
concerned the arrangement could strengthen China's AI capabilities,
(09:20):
expand the influence of censored Chinese chatbots, and
increase Apple's exposure to Beijing's data and censorship
laws.
So this is taking on a whole new
premise than just what's going on with TikTok.
And we'll cover that again sometime, but not
today.
All right, guys.
And number five on the plate of interesting
stories, M&S.
(09:40):
Yes, the M&S data breach. The retailer
Marks & Spencer has confirmed a data breach
stemming from vulnerabilities in a third-party vendor's
access.
So hackers exploited weak links to siphon customer
data and disrupt operations, costing the company millions.
(10:02):
Now, this event underscores the importance of rigorous
third-party cybersecurity protocols.
You can't just do this one time, guys.
You have to do this over and over
and over again.
And you have to realize that you're never,
let's say, exempt from this.
You have to keep checking all the time.
So the M&S data breach was linked to
(10:23):
third-party access, which is what they
proved.
The hackers, believed to be from the group
DragonForce, actually breached Marks & Spencer's systems
back in April by exploiting access through a
third-party provider, causing widespread disruption in the
company.
The cyber attack led to millions in lost
sales, reportedly over 40 million pounds per week,
(10:45):
and forced M&S to pause online orders
for more than three weeks and temporarily shut
down key IT systems.
Some stores experienced empty shelves, and customer data,
including personal contact and order history details, was
compromised.
While M&S has not commented on the
(11:06):
specifics, the company stated the store's availability has
improved and systems are gradually returning to normal.
I think when you have a company that's
been around for so long, like M&S,
you know, they want to do everything the
old fashioned way, like maybe your parents, right?
They don't want to do things the new
way.
They want to do everything with, what do
(11:27):
we say here, the traditional paper, pen and
pencil, right?
And they want to write everything down.
They don't trust technology.
I know that when I automated my mom's
business many years ago, back when I was
in eighth grade, my dad got interested in
computers because that's how he could see the
numbers.
He could see the profits.
(11:48):
He would always look for what we call
the in and the out.
So in a dry cleaning plant business, you
want to know what work is coming in
because eventually that work has to go out.
And so if you have a lot of
work, let's say on the rack, well, that's
inventory and that's money that eventually has to
come into the store.
So even if your day was low, but
(12:08):
you had a lot of money coming in,
then that was something he looked at.
He liked the in and the outs to
be, you know, pretty close.
But sometimes the ins were higher than the
outs.
He didn't like that 100 percent, but he
knew that money eventually had to culminate coming
back to the store.
And so here's one that I think is
(12:29):
kind of interesting.
Number six, this is one that just kind
of blows my mind.
Imagine this, guys.
There are red team ops, and they're
exposed.
Ever wondered who tests the
security of critical infrastructures like secured bases? Red
teams do: elite groups that go around and
(12:51):
they're paid to break into buildings, hack networks,
and simulate real-world cyber attacks.
The newly revealed documents show how these teams
operate in secrecy to find and fix vulnerabilities before
real threats exploit them.
And so I think that's important.
So the team is paid to break into
secret bases.
As I said, it's a specialized red team
(13:12):
and it's made of former military and intelligence
experts.
And they're hired by governments and companies to
covertly test the security of highly sensitive sites
like military bases or corporate headquarters.
Their work involves extensive passive reconnaissance, analyzing human
behavior and vulnerabilities and using stealth tactics to
(13:33):
bypass physical security, such as exploiting disgruntled employees
or copying security passes.
Once inside, they use lock picking and cyber
skills to access secure systems and data, all
to identify weaknesses before real attackers can actually
exploit them.
Now, this intense multi-step process blends psychology,
(13:57):
technology and stealth operations to help clients strengthen
their defenses against real threats.
I think it's a good idea that they're
doing this.
And I think, as I said before, I've
said that, you know, we have to be
vigilant of the security we have because security
evolves every so many years.
(14:18):
And if you're using yesterday's technology to protect
yourself, well, the bad actors are going to
be able to get through it pretty
quickly.
All right.
So I thought you guys would find that
kind of interesting.
And number seven: the tech
giants admit change.
Yes.
(14:39):
Tech leaders like Apple, Google, and Meta
are starting to admit that their flagship products
may not be future-proof anymore. From declining
engagement on Facebook to saturation in the smartphone
market, these companies are now exploring what's next.
The takeaway?
Reinvention isn't optional anymore, guys.
So I think this is a big thing
(15:00):
that a lot of these companies are working
on.
And Google, Facebook, and the iPhone, as I said,
may not last forever.
Um, Silicon Valley is admitting it.
The tech giants like Google, Meta (or Facebook, as
we know them by their other name), and
Apple are openly acknowledging that their core products,
Google Search, Facebook's social networking, and the iPhone,
(15:21):
may lose relevance in the near future amid
shifting consumer behaviors and emerging technologies like artificial
intelligence and smart glasses.
Google has seen a decline in search queries
on Apple devices, partly due to AI chatbots
reducing traditional search usage. Facebook's core activity
(15:41):
of adding friends and sharing content is declining,
especially among younger users, with more engagement shifting
toward direct messaging and video content.
Apple's iPhone, while still popular, faces challenges as
consumers upgrade less frequently, and new wearable tech
like Apple's Vision Pro and smart glasses from
(16:04):
various companies hints at the future of personal
computing. These tech giants are racing to innovate
and stay relevant as the digital landscape rapidly
evolves.
And I think I've said this before, guys,
technology is going to keep evolving and revolving
to be something new.
It's going to keep morphing and that's going
to happen because of AI that's going to
(16:25):
happen because of, uh, new innovations, um, where
people are going, people's beliefs are changing.
And this is all going to affect, uh,
what happens, you know, in our world.
But the fact that they're admitting that it
might not last forever, forever.
I think that is a really interesting, um,
(16:46):
point.
You know, um, we always have to understand
that technology today, whether it's a smartphone, whether
it's our desk IP phone, right?
A couple of years ago, uh, I should
say more than that.
Maybe 10 years ago, uh, before that we
had, uh, desk phones.
Then the last 10, 12 years, we've now
(17:06):
gone to IP desk phones and we have
apps that can let you make calls and
link into your business, uh, phone infrastructure.
You don't have a phone system anymore.
You have a phone cloud network and really
what do you have?
Well, you have basically, uh, not just software
as a service.
You have IaaS, infrastructure as a service, the
(17:32):
whole phone infrastructure, and your phone lines are not
phone lines.
They're ported into SIP, right?
Session Initiation Protocol.
And so now when you make calls, it,
it's different.
There's no POTS, uh, plain old telephone
service.
Everything comes in through IP.
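Just to make that SIP idea concrete, here's a minimal sketch in Python of what a bare-bones SIP INVITE request looks like on the wire. This is my own illustration, not the show's or any vendor's actual code; the user names, domain, and addresses are placeholders, and a real softphone stack would also handle SDP media negotiation, authentication, and retransmission.

```python
import uuid

def build_sip_invite(from_user: str, to_user: str, domain: str,
                     local_ip: str, local_port: int) -> bytes:
    """Assemble a bare-bones SIP INVITE message with RFC 3261 style headers."""
    call_id = uuid.uuid4().hex
    lines = [
        f"INVITE sip:{to_user}@{domain} SIP/2.0",
        f"Via: SIP/2.0/UDP {local_ip}:{local_port};branch=z9hG4bK{call_id[:8]}",
        f"From: <sip:{from_user}@{domain}>;tag={call_id[:6]}",
        f"To: <sip:{to_user}@{domain}>",
        f"Call-ID: {call_id}@{local_ip}",
        "CSeq: 1 INVITE",
        "Max-Forwards: 70",
        "Content-Length: 0",
        "",  # blank line ends the headers; no SDP body in this sketch
        "",
    ]
    return "\r\n".join(lines).encode()

# Hypothetical example values, just to show the shape of the request:
print(build_sip_invite("john", "studio", "sip.example.com", "192.0.2.10", 5060).decode())
```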
(17:53):
And although that is a good idea, there
are challenges with it, right?
Like security, um, like making sure the call
quality is good.
I remember using a company, I don't even
know if they're still around anymore.
This had to be, oh gosh, I gotta
say it's gotta be over 20 years ago
or more.
(18:14):
Um, and I remember the very first company
that I used, I don't even know if
they're still in business anymore.
Um, I'm just trying to think if they're
around.
So they were called, uh, they were called
Packet 8, I believe.
And if you're wondering, you know, what happened,
(18:35):
uh, to Packet 8, uh, voiceover IP, well,
um, they were bought out.
They were acquired by 8x8 in 2004.
So you're probably wondering, so who is 8x8
IP phone?
So 8x8, um, is basically a business
phone service.
(18:57):
They're a voice over IP phone alternative; you
know, there's Zoom and there's RingCentral and there's other
ones, right?
And you probably remember Vonage.
The problem with a lot of these companies
is people had to understand the concept of
not having a phone, but having a virtual
phone, right?
And then giving you things like, you know,
business voice over IP.
(19:17):
And I think the biggest problem that happened
with a lot of these companies is that
we were ready for the technology, but the
technology wasn't ready for us.
I mean, when you got on a phone
call, it was like, oh my gosh, it
was, it was terrible, it was like the
fact that, um, you know, you would hear,
you know, static.
It wouldn't be a clear call.
(19:38):
And I think that's a, that's a problem.
Um, we're warm today.
Our office got a
little bit warm.
I don't know why, uh, it got a little warm
today.
The temperature got warm outside, so if
you see me sweating a little bit,
that's why, plus being under
these lights.
that technology migrates from one thing to another,
(19:58):
um, you know, a lot of companies have
shown this to us like Cisco.
So Cisco is a marketing company.
They're not a tech company.
They acquire and buy other companies.
Uh, Packet 8 was a very interesting company.
I remember the very first time that I
learned about them and it was because I
(20:20):
was using a service and I needed to
be available, but I didn't want to give
out, let's say my grandmother's number or my
home number, but I wanted to be portable
so I could go wherever I need to.
And people could call me.
But the thing about Packet 8 was their customer
service wasn't great.
And, um, they had this box and you
(20:41):
plugged this box into your network and pretty
much their web interface did everything for you.
I mean, you picked your line, you picked
your plan.
I mean, now you can have voice over IP
services for, let's just say, a lot less
money.
I mean, I think Vonage is doing it for
like $13.99, and, uh, other companies like this
(21:04):
Packet 8 company, or 8x8, I should say.
Um, but when I look up 8x8,
uh, they don't seem to show up
as a top provider.
Uh, Zoom Phone is one, RingCentral is
another.
(21:24):
Uh, but I remember the costs of phones
back then.
I think for unlimited phone service, don't hold
me to this exactly, but I think it
was like $29.99 plus tax or maybe it
was $24.99 plus tax, and the quality of
the call was just not good.
I mean, they gave you a voicemail, they
(21:44):
gave you quality, they gave you all these
great features, but you know what?
They didn't give you a high-quality, uh,
phone call.
And so, you know, companies that you've seen
online, maybe, uh, you all remember
Skype, which, um, I believe was shut
down in May. Skype, uh, basically
did a merger, uh, basically going
(22:06):
into Teams.
So if you actually go to skype.com,
it doesn't even direct you over to teams.
Okay.
It's completely gone.
So what happened?
What happened to Skype?
Well, Microsoft just decided that it wasn't working
out.
Skype has been shut down and retired by
Microsoft as of May 5th. As they're pushing
people over to Microsoft Teams, users are being
(22:28):
directed to use Microsoft Teams for free, which
offers many of the same core features.
Um, and so the change impacts, uh, both
the free and the paid Skype users.
Um, and again, they said it's in order
to streamline, um, you know, their offerings, their
free consumer communication offerings, so that
(22:51):
they can more easily adapt to customer needs.
And so, um, they did retire it like
they said they did.
So if you had a number with Skype,
you had to port it out.
And, uh, just explain that concept for a
minute.
So whether you have a cell phone, a
landline at your home or something else, your,
um, phone number, okay.
(23:14):
Can be ported out.
And what we mean by that is that
the number can leave that carrier and then
be brought to another carrier.
Now you have to be careful, but when
you sign up for services, usually your phone
number is yours.
Okay.
And they can't take it from you unless
they said in some agreement that it's not
yours.
Like there are some services online that have
(23:35):
said it's not yours.
And I think that's a big problem for
a lot of people as they don't realize
that, you know, they're going to lose their
number.
Now, what is the process of porting?
Let's talk about that for a minute.
There's something called
LNP, uh, for porting.
So, um, basically, LNP stands for
(23:57):
local number portability.
Um, and so it's, uh, basically
an agreement that you have to put through.
Uh, they also have WLNP, wireless local number
portability.
Um, so wireless local number portability.
So WLNP and LNP, uh, allow
(24:20):
somebody to change their service provider within the
same local area and still keep the same
phone number.
Um, so you might be asking, you know,
how, and it's a great question.
How does LNP work?
Well, it's not hard.
Um, LNP for phone portability is just kind
(24:41):
of like a tracking system.
Okay.
Um, and so when you fill out the
form or do it online and you sign
it, it goes over to your current carrier.
So this is why, if you go to
move to a new carrier, you never tell
the old carrier, the carrier that you're leaving, I
should say, because what happens is, as soon
(25:01):
as you do the port out, um, from
that company, which is going to happen automatically
through your new company, they're going to automatically
close your account.
But if you go over there and try
to ask them to do it, you know
what's going to happen: they might, out of
spite, just turn off your line and, uh,
give you hassles.
So you just don't want to deal with
(25:22):
any of that.
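If it helps to picture that flow, here's a tiny Python sketch of the port-out process I just described, purely as a mental model under my own assumptions: the class name, fields, and statuses are made up for illustration, not any carrier's real system.

```python
from dataclasses import dataclass

@dataclass
class PortRequest:
    """Toy model of a local number portability (LNP) port-out; names are illustrative only."""
    number: str
    losing_carrier: str        # the carrier you are leaving
    gaining_carrier: str       # the carrier you are moving to
    form_signed: bool = False  # the signed port authorization form
    status: str = "draft"

    def submit(self) -> None:
        # The gaining carrier sends the request to the losing carrier on your behalf;
        # you never contact the losing carrier yourself.
        if not self.form_signed:
            raise ValueError("Sign the port form before submitting")
        self.status = "pending with losing carrier"

    def complete(self) -> None:
        # Once the number is released, the old account closes automatically.
        self.status = "ported; old account closed"

# Hypothetical example:
req = PortRequest("555-0100", losing_carrier="OldCo", gaining_carrier="NewCo", form_signed=True)
req.submit()
req.complete()
print(req.status)
```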
And we're talking a lot about the iPhone
guys.
I've been talking a lot about it, right?
So what if the iPhone prices go up?
So the iPhone price hike is believed to
be something that will be apparent soon.
So Apple may, um, start raising the iPhone
(25:44):
prices, not by increasing the base costs, but
by making a must have AI feature exclusive
to higher end models.
And this clever strategy encourages upselling while letting
Apple say they didn't technically raise prices, sneaky,
but very smart.
So do you really need this extra AI
service?
Probably not.
So does that mean they're going to charge
(26:06):
you for Siri?
That's kind of where I'm thinking they're going.
Yeah.
So, so Apple may charge you, uh, for
Siri.
And again, after that, you know, that lawsuit
we talked about, that class action, $95 million,
they got to do something to cover their
tracks because they're paying a lot of money
(26:27):
out now.
Of course they're at fault.
I mean, that's, that's definitely a, a given
there, but how is Apple going to quietly
raise iPhone prices this fall?
So they're going to raise the iPhone prices
this fall.
Um, partly by saying it has nothing to
do with tariffs.
They're going to use AI as the means.
And, uh, to avoid any direct backlash over
(26:49):
tariff-related price hikes, they're going to say
it has nothing to do with the tariffs.
Apple might instead tie increases to, like
I said, the new AI features, slimmer designs,
or upgraded hardware, and they're going to subtly adjust
prices, uh, through options like less free storage
or premium options like the iPhone 17 Air.
(27:09):
Although tariffs have temporarily eased due to a
recent U.S.-China trade deal, uncertainty remains.
And Apple CEO Tim Cook has been balancing
this with political diplomacy for a while now,
and a push to expand Apple's US manufacturing,
despite the challenges they're going through.
(27:29):
iPhone sales have grown slightly.
Um, though consumer upgrade frequency has slowed down,
making careful pricing strategies critical for maintaining sales
volume.
See, I think Apple basically figures out how
long is a person going to keep their
phone?
I mean, Apple tracks everything, right?
They, they, they track more things than you
(27:49):
probably remember, but Apple knows Apple knows, um,
if you ask them how long the average
person keeps their iPhone.
Okay.
And, um, they have insights into this.
(28:10):
They know how long the average person keeps
their iPhone.
They collect data on device lifespan and usage
patterns, allowing them to estimate how long users
typically keep their iPhones before upgrading.
So Apple tracks, a lot of things, uh,
various metrics related to the device lifespan, as
I said, uh, how you're using your device,
did you sign up for Apple care or
(28:31):
not?
Um, what is your device behavior?
Right.
How are you using your device right now?
And Apple studies all these things.
Why?
Because it's big data and it's big money
for them.
If they know that you keep your iPhone,
(28:52):
let's say an average of two to three
years, then they got to figure out how
they can get you roped back in to
spend more money with them, even if you
don't need to. You might say they're going
through a very unique type of marketing
that's going to say, okay, this is what
we need to do, but how do we
(29:12):
do it?
So I think that's a challenge for, you
know, a lot of people, how it's going
to work.
And so if we understand how it's going
to work, then maybe we can be a
little smarter and say, you know what?
I'm not going to get that new phone
yet.
(29:32):
Okay.
I'm not going to get it.
And the reason I'm not going to get
it is because my current phone works fine.
So then you could ask a question.
Uh, what is the lifespan, uh, of a
typical iPhone?
And so it's going to vary.
(29:54):
All right.
Um, but most people will say between three
to six years, iPhones and all mobile devices,
they're designed to wear out.
People ask, when do you upgrade?
Well, generally, if you're trying to do something
and it doesn't work, maybe you're out of
space.
I had a client of mine that I've
been telling him he needs to upgrade his
(30:15):
phone or he needs to purchase the, uh,
Apple, um, you know, storage plan.
And so most people say that iPhones last
or phones last two to three years.
Should I buy a new phone?
Well, that really depends on your current iPhone
performance, your battery, your age, and other factors.
The best time to buy a new iPhone
(30:36):
is in the fall, right after the newest
model has dropped.
And so I think a lot of people,
you know, they get comfortable with their iPhone.
I've got my upgrading process down to something
I could do with my eyes closed, but
a lot of people are not technically savvy.
They're not an engineer.
They don't know these steps.
(30:57):
And so they figure, Oh, you know what?
I don't want to go through it.
It's a hassle for me.
If I had to change my iPhone every
day, it wouldn't be a big deal because
I back everything up.
I moved to the new device and it's
really simple.
And I make sure that I have a
great backup before it starts.
If anything goes wrong, I can always revert
back.
All right.
So, uh, Apple's going to raise the prices.
(31:19):
Uh, I'm sorry.
You're not going to like it.
I know I'm not going to like it,
but the big question is, you know, when
is, is Apple raising prices?
Um, you know, they haven't said exactly when
they're raising prices, but, um, I would think
that it's going to happen sometime in the
next release, which could be around September, October.
(31:42):
So we'll have to see. Now, how many of
you guys out there,
I have a question for you,
uh, watch cable?
So I have cable at home, but I
don't use cable.
I stream all my channels.
I really don't watch TV a lot either.
I do a lot of reading.
I do shows, listen to music.
How about this?
A $34.5 billion cable merger, um, where
(32:06):
Charter Communications is attempting now to acquire Cox
Communications in a massive, as I said, $34.5
billion deal.
Now the goal is to strengthen their position
against, uh, streaming giants and improve their market
share.
The deal could redefine the whole cable TV
landscape, and the regulators,
uh, hopefully they're going to approve this,
(32:29):
but will they approve it?
I mean, I think that's the, that's really
the whole thing.
So Charter Communications has proposed the
$34.5 billion merger to acquire Cox, combining
two of the largest US cable companies to
better compete against streaming services and mobile internet
providers amid ongoing industry challenges from cord cutting.
I want to see a big company buy
(32:49):
Altice, or Optimum, because their service has been
horrible.
The deal involves merging Cox's residential cable, um,
business into Charter, with Cox Enterprises retaining about
23% ownership in the combined company, which
will be renamed Cox Communications after they close
(33:10):
it.
The merger aims to create cost efficiencies and
strengthen the market position, but still requires shareholder
and, of course, regulatory approval. The question is,
will the regulators approve this, or will this
get, you know, put down on page nine?
I don't know.
I really don't know.
(33:31):
And, uh, we're just going to have to
see what's going to happen there.
And, um, the Polish election hack.
Yes.
Ahead of the national elections, cyber attacks
linked to Russian operatives targeted the websites of
several political parties in Poland, how about that?
These hacks are part of a broader effort
to disrupt democratic processes worldwide and highlight the
(33:53):
urgent need for stronger electoral cybersecurity.
I think that's an important thing to understand.
And so, uh, Poland's prime minister, Donald Tusk,
uh, basically revealed that the Russian hackers attacked
the websites of his ruling coalition parties, including
his Civic Platform party, just a few days
(34:16):
before the presidential election.
Now, the cyber attacks targeted multiple party sites,
and ongoing efforts are underway to address the
breaches.
Additionally, uh, Polish authorities are, yes,
investigating paid political ads on Facebook flagged
as potential election interference, which were removed after
being reported by a Polish research institute.
(34:37):
And these incidents occur amid a broader context
of frequent cyber attacks linked to Russia, partly
due to Poland's support for Ukraine in
its conflict with Russia.
And I think, guys, the fact that these
bad actors are going after Poland, I mean,
they're not just targeting the U.S., they're,
they're going everywhere.
And so you might be asking a good
(34:59):
question.
It's a great question to ask, you know,
why are they hacking, uh, Poland?
Well, it's because of the party, right.
Um, it's to disrupt control and power.
And I think they just kind of want
to keep them in check.
(35:20):
Uh, distributed denial of service attacks: the
pro-Russian groups like NoName057(16) and Dark Storm are using
DDoS, distributed denial of service, attacks to
temporarily disrupt Polish websites
and systems. Uh, other methods are in use too, like
malware and fake updates, which are being
used to infect systems and steal information. Exploiting
(35:40):
vulnerabilities in IP cameras:
hackers are targeting internet-connected cameras, especially near
border crossings and railway stations, to monitor shipments.
Phishing and data theft:
these remain significant threats, as well as
data leakage from stolen or lost devices and
disinformation.
And of course, propaganda, which is always a
problem.
Poland reported that Russia-linked hackers have attempted
(36:03):
to spread fake news through news agencies.
Advanced malware: um, Poland is growing to be
a target for advanced malware.
And I think it's because Russia wants to
keep them in check.
Russia wants to know what this country is
doing, and they're not far from figuring out
where they can.
So they're going to. But I think Poland
(36:24):
needs to be more vigilant.
Uh, they need to revamp their security.
They need to have a plan.
I don't know if they have a plan,
but we'll definitely keep you in the loop
about that.
And my, uh, college, Montclair State University,
I'm happy to report on.
Yes.
Montclair State University's inclusion push ramps up.
(36:44):
Montclair State University's School of Computing, which I'm
happy, glad, and proud to be a part
of, is taking meaningful steps right now to
foster diversity.
That's new programs.
That's mentorships and an inclusive, uh, curriculum.
Um, and they're being introduced to support underrepresented
groups in tech, paving the way for
more equitable futures in computing.
(37:05):
But, um, Montclair State University's School of Computing,
uh, led by director Likwet Hossain, is undergoing
this major transformation to better serve its diverse
student body by enhancing retention, revising the curriculum,
and fostering inclusion, especially, as he said, for
women, minorities, and first-generation students. By unifying
computing programs under one school, Montclair aims to
(37:29):
offer innovative, community-focused education and research in
fields like cybersecurity, software engineering, and AI,
artificial intelligence. Efforts include extra,
uh, support for students struggling with foundational courses,
hiring and supporting more female faculty, and engaging
with local schools and the wider community through
projects and town halls addressing technology's social impact
(37:52):
on our world.
The school targets increasing female student enrollment to
40% and emphasizes a culture of inclusion
and community engagement.
So, uh, with diversity, Montclair State is always staying
ahead of everything, at the top
of the cusp.
And I think what I like about them,
not just being a student there is that
they really try to be innovative.
(38:12):
They try to bring things to a new
level.
And, um, there's nothing they don't try.
I mean, some of the things they're working
on now with their, uh, you know, their
new supercomputer.
I mean, there's so many things they're working
on and they're giving students like myself and
others, the opportunity to work with things.
I've been in the field for many years,
but some of the students that I go
(38:33):
to school with a lot younger than me,
they haven't been exposed to one 10th of
what I've been exposed to.
So I think this is a great thing
that Montclair state's doing.
So kudos to Montclair state university, really grateful
for all they do for me and for
everyone else that attends there.
And number 12, guys, uh, the New Jersey
(38:54):
deepfake law, uh, was passed. New
Jersey is leading the nation with a new
law that criminalizes the use of AI-generated
deepfakes meant to deceive or defame people.
Yeah.
The law aims to curb the misuse of
synthetic media and promote ethical AI usage, setting
a precedent that other states may soon have
to follow, not because they want to, but because
(39:17):
they're going to have to. And the legislation
against the dangerous deepfakes, advocating responsible
AI usage, is what they title it.
In April 2025, Jersey City and New Jersey
made major strides in responsible AI innovation and
regulation, marked by Governor Phil Murphy signing a
pioneering bipartisan law criminalizing malicious AI-generated
deepfakes, with penalties of up to five years in
(39:39):
prison and $30,000 fines, inspired by local
activist, uh, Francesca Mani, uh, and the fight
against non-consensual imagery. Alongside this, New
Jersey has invested $1.5 million in AI
education grants to modernize classrooms.
Rutgers, um, Newark launched a new data and
AI innovation hub focused on community impact, and
(40:01):
AI entered politics with, um, Representative, uh, Josh,
uh, Gottheimer's groundbreaking AI, uh, generated campaign ad.
And these developments, combined with new legal guidance
on AI workplace fairness and expanded AI transit
enforcement, underscore Jersey City's leadership role in fostering
(40:21):
both cutting-edge AI technology and robust digital
safety protections.
So we will definitely keep you in the
loop of what's happening there.
All right.
So bad actors don't try to use AI
to create things that are going to break
the law.
And Airbnb's AI, uh, pivot: Airbnb
is no longer just a place to book
(40:42):
a stay. With the launch of the new
AI-powered concierge service, the company is evolving
into a full-service travel and experience platform.
This reinvention could redefine how we plan trips
and connect with destinations.
I always said, you know, you can't be
everything to everyone.
And the Airbnb CEO, Brian Chesky, is spearheading
a, um, uh, let's say a bold transformation
(41:05):
of the company from a short-term home
rental platform into an all-encompassing app, offering
a wide range of services like fitness, food,
and personal care.
It's inspired by his recent innovation, uh, with
OpenAI and driven by a desire to expand
beyond its current niche.
But I think it's more than that guys.
I think it's the fact that, um, you
know, Airbnb is having some challenges, uh, possibly
(41:27):
financially, and I think they're just trying to
do everything they can to, like, grapple and
hold on to everything. I think that's
really the reason, um, because you don't just
suddenly go, you know, selling macaroni if, you
know, you're in the business of, let's say,
um, you know, building houses or renting houses.
Like, they're not related.
It's like we're an IT company.
(41:47):
Right.
And the IT company would not suddenly get
into the business of selling light bulbs.
I mean, if a light bulb needs to
be installed in something, yeah, we can get
it, but we're not in the business of
providing light bulbs.
That's not what we do or ceiling tiles.
Right.
That has nothing to do with anything we
do.
Right.
Of course we can do different things.
If the company that we're representing does that,
(42:09):
like our marketing advertising company does video.
JMOR wouldn't do video, but our marketing advertising
company, Orbital Media, a full print production graphic company,
does.
So you just have to stay within those
confines.
I think that's important.
And our last story that I'm happy to
share with you guys tonight, and I'm very
happy, knock wood, that Optimum stayed up, I
shouldn't talk because we've had some issues with
(42:31):
them.
Our 14th story, the Signal clone breach: TeleMessage,
a platform styled as a secure alternative to
Signal, was hacked in less than 20 minutes.
Security researchers uncovered massive flaws that exposed sensitive
law enforcement communications.
And this incident highlights the perils of over
-promising on privacy without rigorously testing it.
(42:52):
I think this is going to become more
of an issue for a lot of people.
And so the Signal knock-off TeleMessage was hacked, as
I said, in 20 minutes. Absolutely terrible. The
hacker exploited a publicly accessible Java heap dump
endpoint on TeleMessage's archive server, exposing sensitive data
like usernames, passwords, and unencrypted chat logs, shame
(43:15):
on them.
This flaw, combined with weak password hashing that used
MD5, an outdated algorithm, allowed the breach.
Now TeleMessage's claims of the end-to-end
encryption were contradicted by the fact that messages
were uploaded unencrypted to their servers.
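To give you a sense of why MD5 password hashing is such a problem, here's a minimal Python sketch, my own illustration and not TeleMessage's actual code, contrasting an unsalted MD5 hash with a salted, deliberately slow key derivation from the standard library.

```python
import hashlib
import hmac
import os

# Weak approach: unsalted MD5, fast to compute and therefore fast to brute-force.
def weak_hash(password: str) -> str:
    return hashlib.md5(password.encode()).hexdigest()

# Stronger approach: per-user salt plus a slow KDF (PBKDF2 from the standard library).
def strong_hash(password: str, iterations: int = 600_000) -> tuple[bytes, bytes]:
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify(password: str, salt: bytes, digest: bytes, iterations: int = 600_000) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

# Hypothetical usage:
salt, stored = strong_hash("correct horse battery staple")
print(verify("correct horse battery staple", salt, stored))  # True
```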
Despite these glaring vulnerabilities, the app was deployed
(43:36):
within U.S. government circles, including on national
security advisor Mike Waltz's phone.
So, okay.
If you're a national security dude, right?
I mean, wouldn't you be a little more
conscientious of like what apps you put on
your phone?
I know I am, even on what I
put on my computer.
(43:56):
And I think, you know, you've got to
set a standard there, buddy.
Whether you're a CTO, a chief technology officer,
chief information officer, I think you've got to
realize what your company's policies are and you've
got to stay on top of your game, right?
If you don't stay on top of your game, they're
(44:19):
going to exploit you.
We've all talked about this before.
The way hacks happen is,
some of the big ones now are coming through
social engineering, and not to give you the
whole formula, but they go round and round
and round and round, and they find enough
information about you that it gets them through the
doors.
It gets people to trust them because they
know things.
So one thing I'm going to tell you
to do is use passwords, like, that
(44:42):
you'll have to verify.
Like when somebody calls in, you can have
a password or a key phrase for something.
A lot of alarm companies do this.
They use a phrase like, do you have
a message for us?
Now, if you're the bad actor, I don't
have any message.
Oh, okay.
No problem.
Sorry.
Have a good day.
And so when that call comes in and
(45:04):
they say, do you have a message for
us?
And you say no, but then a lot
of people are getting smart because they're probably
saying, well, gee, that's probably a passphrase. Maybe
they should call in and say something like, hi,
this is Mary from such and such,
is Joan there, and that could be
(45:24):
the phrase, because I think it's too direct
to say, do you have a message for
us?
I mean, in the beginning, it sounded great,
but I feel that you can definitely exploit
that in many, many ways.
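If you do set up a key phrase like that, one thing you can do on the technical side is never store the phrase itself. Here's a tiny Python sketch of that idea, my own illustration with made-up function names, not anything from the show or a specific vendor: keep only a salted hash and compare it in constant time.

```python
import hashlib
import hmac
import os

# Enroll the passphrase once: keep only a random salt and a salted hash, never the phrase.
def enroll(passphrase: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)
    return salt, hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

# When a caller gives a phrase, hash their attempt the same way and compare safely.
def caller_is_verified(attempt: str, salt: bytes, stored: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", attempt.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, stored)  # constant-time comparison

# Hypothetical usage:
salt, stored = enroll("is Joan there")
print(caller_is_verified("is Joan there", salt, stored))          # True
print(caller_is_verified("do you have a message", salt, stored))  # False
```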
So if you're using something like Slack, which
we do, and there were a lot of
companies like, um, I think it was, um,
Disney and a few others, and so they
were complaining and crying back to Slack that
(45:47):
their information got compromised.
Well, hello, Disney.
I mean, you know, you didn't even use
two factor authentication.
I mean like, hello.
Right.
Of course, somebody could get in and try
to compromise.
I mean, you guys should know this.
You run billion dollar parks, right?
Do you just leave the doors open to
your park?
No.
Right.
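And just so the two-factor piece isn't abstract: most authenticator apps implement TOTP, the time-based one-time password scheme from RFC 6238. Here's a short Python sketch of how those six-digit codes are computed; the base32 secret shown is a made-up example, and a real deployment would enroll the secret through your identity provider.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Compute a time-based one-time password (RFC 6238) from a base32 shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval                 # 30-second time step
    msg = struct.pack(">Q", counter)                       # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()     # HMAC-SHA1, the RFC default
    offset = digest[-1] & 0x0F                             # dynamic truncation offset
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

# Hypothetical example secret, just to show the call:
print(totp("JBSWY3DPEHPK3PXP"))
```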
And now Disney is starting to cover some
more information, and we'll
(46:08):
talk about that probably on the next show, how
this new badge system works that validates you
and actually links to your email account.
We'll talk about that another time, but I
think technology is emerging.
The need for more data to be stored,
processed, queried, analyzed, and let's say spit back
(46:33):
out or, you know, regurgitated back out to
you, whether that be on screen, whether that
be into a file, whether that be in
some other form of output device.
This is where our world is going folks.
And so data itself, just like I said,
technology is not bad.
Okay.
It's how we choose to use it
that makes it so. So if you're fearful
(46:55):
of technology, if you're fearful about something, don't be.
It's the people that are fearful about something
that are going to get themselves burned because
they're afraid to open up the box.
They're afraid to put on what they need
to put on.
Right.
And I think those are the things that
I really want to capture with you guys.
(47:17):
Because if we just think about the fact
that our life evolves by what we see,
what we hear, what we touch, what we
taste, if we process that for just one
moment and we say, okay, now what do
(47:37):
I do with that?
Well, I'm fearful.
I don't want to do anything.
If you're fearful and don't want to do
anything, then you're basically stuck.
And we talk about technology.
If you're stuck, you know what happens?
You definitely do not go forward.
You stop and you might actually go backward
(47:58):
so you could regress, not always, but you
could.
So our mindset shapes what and where and
how we're going to evolve.
Right.
People every day tell me, you know, AI
is a problem.
AI is not a problem.
There are new caveats appearing with AI and
(48:20):
you've got to adapt to those.
If I say, John, well, it took away
my job.
Yeah, but the things that AI is doing
is taking away the things that are wasting
your time.
All right.
And now you can focus on doing things
much more effectively.
I know that sounds like something that's really
crazy, but it's ultimately the truth guys.
(48:44):
It's the truth.
I mean, like the fact that this guy,
this dude, you know, security advisor for the
US government, Mike Waltz, the fact that his
phone got exploited, that says to me, Hey,
government, where's your plan?
Like, you know, you gotta have some way
to check this.
I mean, shouldn't that app have been checked
on another phone before it even got on
(49:05):
his phone?
Where's your endpoint protection management system?
Like, hello.
And if it was a BYOD, bring your own,
you know, BYOD, bring your own device,
then where are their policies?
Where are their rules?
Or did he suddenly scapegoat them because he's
the head?
And I think that's a big problem.
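On the policy point, a lot of managed-device setups boil down to something as simple as an app allowlist check. Here's a hypothetical Python sketch of that idea; the class, the bundle identifiers, and the approved list are all made up for illustration and are not any real MDM product's API.

```python
from dataclasses import dataclass, field

# Made-up example bundle IDs; a real policy would come from your MDM or identity provider.
APPROVED_APPS = {"com.example.teams", "com.example.slack"}

@dataclass
class Device:
    owner: str
    byod: bool
    installed_apps: set[str] = field(default_factory=set)

def policy_violations(device: Device) -> set[str]:
    """Return any installed apps that are not on the approved list."""
    return device.installed_apps - APPROVED_APPS

# Hypothetical usage: flags an unapproved archiving app on a BYOD phone.
phone = Device(owner="staffer", byod=True,
               installed_apps={"com.example.slack", "com.example.message-archiver"})
print(policy_violations(phone))
```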
(49:26):
So we talked about a lot here today,
guys, we talked about Fortnite and the fact
that, you know, they're trying to exploit, you
know I would say a very good cause
because Apple is really trying to take advantage
of that.
And the fact that Microsoft decided to remove
(49:47):
or decouple teams from office, I get why
they did it.
They don't want to get hit with some
more fines from the European union.
And with the AI ban facing the pushback
from the state attorneys general, these are
all very, very important things, right?
And what about what's happening with the Trump admin
being wary about the Apple-Alibaba AI deal for
(50:10):
iPhones in China?
I'd be worried about that.
We're so worried about everything happening with TikTok.
Why don't we worry about what's going on
in our own backyard first, before we start
worrying about that?
Okay.
I think those are, we've got enough concerns
in our own backyard. And M&S,
I mean, I've been watching the feeds with
them and what they've been doing.
I mean, they suffered a lot and they're
(50:30):
slowly just trying to come back, right?
I mean, losing 40 million pounds in sales.
That's a lot guys.
And did you know that there was a
team that gets paid big money to break
into secret bases and that they're ex, uh,
military, uh, people?
(50:51):
That was pretty amazing to me.
They're there to exploit human error and physical
weakness and to give you a report of
how you can fix it or to fix
it for you.
I think so many times we're trusting in
our nature because we feel that there's no
need to worry.
(51:11):
I mean, after 9/11, I think your
level of trust has probably gone way down
with a lot of people.
I remember being down the shore and for
years we would always just leave our doors
open.
We would walk and leave.
I remember us taking the boat out many
times and we would always lock the front
door, but we would never lock the back
doors.
Like what is it?
(51:32):
The bad actors won't go to the back
door.
Like they just, they don't want to walk
around.
It's not a big walk.
And, uh, the thing that was interesting is
that we trusted that for a while.
Go for a walk around the back.
Oh, we don't need to lock the door.
We can just leave it open.
But then suddenly 9/11 happened.
(51:54):
Now we're like, you know what?
We can't really trust our community anymore.
So I think trust starts out high, right?
But trust gets degraded anywhere from 1%, 5%,
or even a hundred percent like that.
And you're like, well, gee, what do I
do?
(52:15):
Right.
What happens?
Let's just take this for example.
Let's say you have a good friend, a
really good friend.
You've known them for years.
Right.
And now let's say that friend suddenly betrays
you.
What do you do?
You say, Hey, gee, bud, like, what are
you doing?
Why are you throwing me under the bus?
Like, what happened?
What are you doing to me?
Now, of course there could be a mistake,
but generally not.
(52:35):
And now, you know what happens?
You suddenly don't want to trust that person
anymore.
They're not even your friend anymore.
They're an acquaintance.
They're like somebody so removed,
you don't even want to have a conversation
with them.
You don't even want to be in their
presence.
It's sad how I watch all different people
change and evolve because of one reason.
(52:55):
Because they want to become better.
Even if they have to knock other people
down, I'm not talking about competition.
Folks, competition is completely legal.
I'm talking about doing something unethical.
Taking something from somebody or, you know, maybe
lying to somebody or when somebody has trust
(53:19):
in you take that secret and you share
it, right?
What happens when you share something very personal
and then that story reaches other people in
your circle.
You've probably known this happening in grammar school
or high school.
And when that happened, you were devastated.
I remember the first time it happened to
me.
I'm like, dude, what are you doing?
(53:40):
It's like, well, you know, like, you know,
we're all kind of friends.
What are you doing?
Like, I mean, that was something I told
you in confidence.
And when I did that, he was like,
he kind of realized he messed up.
Now this person took about six months to
gain my trust back again, but it still
was never where it was because I still
(54:00):
had that inkling that he could slip up.
Why?
Because the other friends is all, you know,
we're all friends.
Of course, John wouldn't really mind if, you
know, if we keep it all between us,
but the whole point of sharing something and
saying it's confidential means you're not going to
share it with somebody else.
And now it's like, you don't want to
(54:23):
even tell that person anything.
You want to keep things to very generic
terms and conversations like the weather.
Right.
Um, you know, the food you're eating, you
don't want to involve anything that could be,
uh, taken directly, indirectly that could potentially destroy
(54:44):
your reputation.
Now, do people destroy your reputation on purpose?
I don't believe they do.
I think they just blab, blab, blab, blab,
blab, blab, blab, blab, blab, blab.
And they like to talk because they feel
that now they've got some information.
I mean, when in high school, I wasn't
a gossiper, but some of my friends were,
and they gossip.
And I never forget the one time I
was on a three-way call.
(55:05):
Didn't know I was on a three-way
call and my conversation that I was sharing
went to another person and no one ever
told me about that.
And from that moment on, I made sure
that every call I got on was not
busy before I started talking.
So it could have been something simple.
(55:26):
Like, uh, I would dial the
number, and then it would check
it out and just run a bit
of a beep and then just let it go.
I knew there was nobody on the call,
but if I got a fast busy, I
knew that there was another party on that
call.
And I know this sounds strange guys, but
people betray other people because they think they're
(55:51):
going to get ahead.
I talked about the story many times.
I always told you guys, you know, regardless
of what licenses we buy, we buy them
legally, right?
I've had so many vendors and I've even
had large corporations like, Hey, John, like, you
know, what if I slip you something?
Can you, can you make that Novell?
Can you make that Microsoft license price disappear?
(56:13):
Disappear.
It's a couple thousand dollars.
Yeah.
I'll give you a hundred dollars for yourself.
Like, what are you asking me to do?
Like the fact that you would ask me
to do that is something that is so
wrong.
And that alone has said to me, you
know what?
I don't trust you.
I don't want to trust you.
Anyone that's going to try to mislead me,
(56:34):
manipulate me,
is now somebody that I have to be
on my guard with to make sure they're
not going to play games with me.
I mean, come on guys.
Right.
And you might say, John, well, I'm going
to win.
Well, you're going to win at the cost
of damaging your friendship, your relationships that you've
had for years.
(56:54):
And I remember saying that to one of
my friends.
So I guess my friendship didn't mean that
much to you.
Oh no, of course it did, John.
Well, obviously it didn't.
If that's the way, if that's the way
you're acting, it obviously didn't.
Well, I wasn't thinking, well, maybe you should
think next time.
Maybe you should think about what you can
do and what you shouldn't be doing.
And I remember saying that the person felt
(57:19):
very bad, but I still had this distrust
with him.
For a while, I remember, um, him asking
me to do things and I would do
them, but I would be very leery about
asking him to do things because I couldn't
trust.
He could trust me, but I didn't have
(57:41):
full trust in him for almost a year.
And that destroyed our whole sophomore year of
friendship.
That was like a good friendship.
We got back to acquaintanceship within six months
or three months, but it was different.
And when we graduated, do we ever get
back to a hundred percent?
No.
Probably we got to 90% because I always
(58:02):
knew there could be a potential of something
to go wrong.
Ladies and gentlemen, I am John C.
Morley, serial entrepreneur.
It's always a privilege and pleasure to bring
you all these great insights about tech, about
improving your life.
Do check out, believemeachieved.com for more of
my amazing, inspiring creations.
If you're watching this on Memorial Day weekend,
I hope you have a great, happy, healthy,
and a safe one.
I'll catch you real soon, everyone.