Episode Transcript
(00:04):
Hi, everyone.
I'm John C. Morley, the host of the JMOR
Tech Talk Show and Inspirations for Your
Life.
(00:48):
Hey, guys.
Welcome, everyone.
It is John C. Morley here.
This is the JMOR Tech Talk Show.
Today is Friday, March 21st, 2025, which means
it's time for the JMOR Tech Talk
Show.
We air every single Friday night.
This episode is called, yes, Tech Tidbits, Apple
(01:09):
AI and the Future of Innovation.
We are on series number four, and this
is show number 12.
Welcome, everyone.
Feel free to get yourself something delicious and
yummy from the kitchen.
Maybe it's something hot.
Maybe it's something cold.
Maybe it's something sweet, something tart, healthy or
not.
Totally up to you.
I am your host, John C. Morley, serial
(01:30):
entrepreneur.
It is so great to be with you
guys.
I hope that you will enjoy this show
and much of my other great content at
BelieveMeAchieve.com.
I'm not only a podcast host, but I'm
also a podcast coach.
Welcome to the JMOR Tech Talk Show.
(01:50):
All right, everyone.
It's time for that weekly dose of all
things tech, the breaking news and innovation that
is currently shaping our world from Apple's big
moves to the latest in artificial intelligence and
cybersecurity.
I've got it all for you, and it's
all right here in this show.
So don't miss out on all the insights
every single week right here, including the latest
(02:13):
iOS update, Facebook Live video changes, OpenAI's new
direction, and so much more.
So you can listen now, get the scoop,
and of course, you can catch the replays
on BelieveMeAchieve.com.
So our first topic for today is Apple.
So Apple re-enables Apple Intelligence by default with the latest iOS update.
(02:34):
So Apple has made a major change, unfortunately, in its latest iOS update, re-enabling Apple Intelligence by default.
Now, this move aims to improve overall device
performance by utilizing machine learning to enhance user
experiences, making your iPhone smarter and more efficient
in predicting your needs.
Unfortunately, it does turn on Apple's iOS
(02:59):
AI.
And the question you might be asking is, how do I disable the Apple AI from auto-turning on?
That's a great question.
The thing is, how do you turn off Apple Intelligence?
Well, you can actually open Settings or System
(03:20):
Settings, choose Apple Intelligence & Siri, turn off the Apple Intelligence option, and confirm your choice in the dialog that appears when you tap to turn off the Apple Intelligence option.
Unfortunately, when the phone goes through an update,
well, you are going to get the Apple
intelligence turning back on.
I hope they'll eventually fix this, but unfortunately
(03:43):
that is a problem right now.
And ladies and gentlemen, if you are on
Facebook, yes, and you do lives on Facebook,
Facebook will start deleting those lives 30 days
after the broadcast.
I actually did a broadcast on my birthday
on February 21st, 2025.
(04:06):
And today is the first time that they
are going to be deleting one of my
broadcasts per this new rule.
In a move to maintain a cleaner, more streamlined platform, Facebook is beginning to delete live videos 30 days after their broadcast.
So, this shift is designed, as I said, to reduce clutter and improve content curation, ensuring only
(04:30):
the most relevant and, of course, important live
broadcasts remain accessible to users everywhere.
So, I know this might be a challenge
for some people, but hey, when you put
a live out there, remember that live is
there to captivate people at the moment.
And basically, after 30 days, the content is
(04:50):
pretty much old.
All right.
Our next story comes to us from Mr.
Elon Musk and OpenAI.
So, OpenAI and Musk have fast-tracked this trial over OpenAI's shift to a for-profit model.
The ongoing legal battle between OpenAI and Elon
Musk has intensified as the two parties fast-track
(05:11):
a trial concerning OpenAI's transition to a for
-profit model.
Now, this trial will likely shape the future of artificial intelligence, particularly in how AI companies balance profitability and the public good.
The question you might be asking yourself is,
why does Elon Musk want to block OpenAI
(05:37):
from going for-profit?
That's a great question.
Musk had filed a lawsuit against OpenAI and Mr. Altman, claiming they had breached the company's
founding contract by putting commercial interests ahead of
public good.
You know, unfortunately, when people create something, they
say they're doing it for one reason, either
(05:59):
to, let's say, avoid certain types of tax obligations or fees that they have to pay.
When you do something as a think tank
or a not-for-profit, it allows you
to exist and be under some more liberal
rules as opposed to a for-profit corporation.
(06:20):
Vice President J.D. Vance expects a framework
for resolving TikTok's ownership by April 5th, everyone.
This framework will be in place to resolve
the issue of TikTok's ownership, and this decision
comes amidst growing concerns over national security and, of course, data privacy that have been lurking regarding
(06:40):
the popular social media app's ties to China, with ByteDance, its corporate parent company, having to basically step up to the plate or step down.
Apple's secret appeal against the UK encrypted data
access order raises privacy concerns.
Apple's quiet legal battle against UK orders demanding
(07:03):
access to encrypted data has raised serious concerns
about privacy rights.
The appeal could set a dangerous precedent for
tech companies as it challenges the balance between
law enforcement needs and individual privacy protections.
Apple plans to add live translation for their
(07:25):
AirPods, right?
Apple is looking to take communication to the
next level by adding live translation capabilities for
the AirPods, and this new feature could revolutionize
how we interact with people across the globe,
making it easier to bridge, let's say, the
language gap and barriers and enhance real-time
(07:45):
conversations.
I think that's going to be something really,
really cool.
The question you might be asking is, you
know, when, and this is a great question,
when do Apple's new AirPods with the language
translation come out?
So, they haven't given a date yet as to when, but they're saying it will be in
(08:07):
iOS 19, which is supposed to come out
later this year.
Who knows whether that's Q3?
Who knows whether that's Q4?
I don't think we're going to see it,
you know, any time before June.
I could be wrong, but it's later.
When they say later, I think we're talking
like Q3, Q4.
So, we will definitely keep you abreast of
(08:28):
what's happening there and what the Apple Translate
feature will do on the Apple AirPods.
My concern, though, is this going to open
up Pandora's box for security?
I'm concerned about that.
So, we'll have to just keep you in
the loop about what's happening there.
And India, yes, India plans to limit satellite
(08:51):
broadband licenses to five years, challenging Starlink, which
is a big problem for Mr. Elon Musk.
India is set to impose a five-year limit on satellite broadband licenses.
And so, this could be a very, very big problem.
(09:12):
A big problem for a lot of people
because I think, you know, it's an issue.
And not only is it about money, it's
about security and, you know, and things like
that.
It's going to be a problem.
It's going to be a big problem.
And this new rule may require Starlink to adjust its business model in India, potentially impacting its
(09:33):
long-term plans.
Alfonso, a robot waiter, is delivering not only orders but serving up smiles and delight to customers at a cafe in the UK.
At this UK cafe, the robot named Alfonso is helping to serve food and drinks, assisting staff and providing an engaging experience for customers.
(09:57):
The robot waiter designed to deliver orders efficiently
is capable of interacting with patrons in a
friendly manner, creating a unique atmosphere as robotics
and AI continue to make their way into our everyday lives.
Alfonso is a glimpse into the future of
customer service, where automation and human touch work
(10:19):
together to improve efficiency and our satisfaction.
Next, Iran is using drones for surveillance.
And I think this is a big issue for a lot of people who, you know, don't quite understand where this is going, but it is definitely a
(10:40):
very, very big problem.
And so Iran uses drones, surveillance apps, and
facial recognition to enforce dress codes.
In Iran, authorities are using drones, surveillance apps,
and facial recognition technology to enforce the country's
strict dress codes for women.
These tools enable the government to monitor public
(11:01):
spaces and identify individuals who are not adhering
to the dress code, potentially leading to legal
consequences.
The use of such technology has sparked debates
over privacy, freedom of expression, and the role
of technology in enforcing societal norms.
I know what you're probably saying to me,
what are the dress codes for women in
(11:22):
Iran?
And that was the first thing that came
to my mind too.
In Iran, women are expected to keep their
legs covered down to their ankles, and they
wear loose tunics or coats with long sleeves
to cover themselves.
So people ask me, are jeans illegal in
(11:45):
Iran?
Wearing jeans, leggings, ripped jeans, and loose skirts
and dresses is allowed.
Women also have to cover their hair using
a headscarf or a shawl, which can be
purchased at any market or shopping center in
Iran.
It's acceptable to show the front part of
your head and wear the shawl halfway, as
many Iranian women do.
(12:07):
So can women in Iran wear makeup?
There's no punishment specifically for wearing makeup in
Iran, but women are expected to dress modestly
and may be punished if they are deemed
to be dressed immodestly.
What happens if a woman doesn't wear a
hijab, which is what they actually call this?
Well, while the hijab requirements were already mandatory
(12:28):
under Iran's Islamic Penal Code, this new law
introduces dramatically harsher consequences, and violations can now
result in extended prison sentences of up to
15 years and substantially increased fines.
Can girls wear shorts in Iran?
Well, traveling to Iran means respecting its unique
dress code, especially for women.
(12:48):
In Iran, women are expected to keep their
legs covered down to the ankles and should
wear, like I said, tunics that are loose,
coats with loose sleeves.
Do female tourists have to wear hijabs?
Whether female tourists need to wear a hijab
depends on the state of the country and
the region they are visiting, with some areas
requiring it while others do not.
(13:09):
I think the question about this, and my
big question is, why is Iran so strict
about women's dress code?
I think it's been something that's been around
for a long time, and they're cracking down
(13:33):
on it.
It's more of a control, I think, and
I think that it's putting a lot of
pressure on women and their families.
It's just something their parliament is choosing to
do, and again, they are using drones and
intrusive digital technology to spy and see whether
(13:55):
women are actually wearing the appropriate clothing, and
if not, well, they can be fined.
Witnesses have said that a 22-year-old
person was badly beaten by the morality police
during her arrest, but authorities denied she was
mistreated and blamed sudden heart failure for her
(14:16):
death.
Her killing sparked a massive wave of protests
that continues today, despite threats of violent arrest
and imprisonment.
Two and a half years after the protests
began in 2022, women and girls in Iran
continue to face systematic discrimination in law and
in practice that permeates all aspects of their
lives, particularly with respect to the enforcement of
(14:37):
mandatory hijabs.
They are, let's say, redoing things, and I
think it's all about a culture of respect.
I think that's why they're doing this.
I think it's a little harsh, but again, this is what they're deciding to do.
Next up, the Roblox CEO advises parents to keep kids
(15:02):
off the platform if they are truly concerned
about their safety.
The CEO of Roblox has issued a public
statement advising parents to keep their children off
the platform if they're concerned about safety.
Roblox, a popular online gaming and social platform,
(15:23):
has faced criticism over issues such as inappropriate
content, and that's a big, big problem for
a lot of people, and so this is
what he's saying to do.
Keep your children off the platform if you're
concerned about safety.
Again, it's a popular online gaming and social platform.
(15:46):
It's faced criticism over issues such as inappropriate content and online harassment.
With an ever-growing user base, the platform
is under increasing scrutiny, and the CEO's warning
reflects the company's recognition of the importance of
safeguarding young users in the online world.
You know, whenever you have some type of
(16:06):
a public thing, I think it becomes very
hard to enforce this.
I mean, I think that's what it comes
down to.
It becomes very hard because when you think
about what it is, I think it becomes
hard because a lot of people just don't
understand that this is the culture.
This is the way people are behaving, and
(16:27):
I think that can be a big problem
for a lot of people, and they don't
necessarily know why it is, but I know
that a lot of people out there don't
like sometimes what some governments do.
I mean, that's just kind of the be
-all, end-all, I guess I would say
to you guys, but the question is, how
(16:49):
do you handle stuff like this?
Because it's very misleading.
It can cause people to feel, I guess
you'd say, uncomfortable, but my big question comes back to the why, and the why in this case, ladies and gentlemen, comes down to control.
(17:11):
That's the main reason that this whole thing
comes to pass.
It's about control, and control is something that
I think a lot of people don't really
understand, but it's something that I think more
people need to understand because governments are doing
this all over the place, and is it
(17:32):
really for the protection of the people?
No, of course not.
It's not for the protection of people.
It's actually there because they want control, and
so I think when we use technology like
AI and drones and facial recognition, I think
there is a problem with that, and the
(17:52):
problem I have is that we are disrespecting
people's privacy.
That's just my personal thoughts, and I know
that that is a big problem for a
lot of people. But the fact is that the CEO of Roblox stepped up. And the thing you might be asking me, and it's a
(18:14):
very, very good question, is how does Roblox make its money if the platform doesn't charge to play games?
Well, that's a great question, so Roblox earns
money by selling Robux, R-O-B-U
-X, which users spend in experiences on items
(18:37):
and assets in their marketplace.
It's all created for you to play in,
and they're always searching for new ways to
increase creators' earnings.
In 2024, the creator community earned $923 million.
Let me say that again, $923 million, $923
million, and they have paid their developer community close to $3.3 billion since 2018, so they
(19:02):
have something called in-game purchases, such as
the virtual items, game passes, and developer products,
and they're always working on new monetization strategies.
Does Roblox take a cut?
Yes, you can also choose to sweeten the deal by including some additional Robux, but keep in mind that there will be a 30% transaction fee removed once the offer gets
(19:24):
accepted.
On the smaller deals, they tend to charge people a lot of money, so in-game purchases play a significant role in Roblox's revenue stream.
(19:46):
With millions of users on the platform, the opportunities for developers to monetize their games are immense.
Developers can create and sell various in-game
items, such as clothing, accessories, and even special
abilities using Robux as the currency.
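For anyone who wants to see the arithmetic behind that 30% transaction fee, here's a minimal Python sketch. The fee rate comes from the figure mentioned above; the item prices are made-up examples, not real Roblox data.

```python
# Illustrative sketch only: the 30% transaction fee figure comes from the
# show; the item prices below are made-up examples, not real Roblox data.

MARKETPLACE_FEE = 0.30  # 30% removed once the offer gets accepted

def creator_net(robux_price: int) -> float:
    """Return the Robux a creator keeps after the 30% transaction fee."""
    return robux_price * (1 - MARKETPLACE_FEE)

for price in (100, 1_000, 10_000):
    print(f"Sold for {price:,} Robux -> creator keeps about {creator_net(price):,.0f} Robux")
```

So on a 1,000-Robux sale, the creator keeps roughly 700 Robux under that fee.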
So Roblox remains, they claim, unprofitable, which I
don't understand how.
(20:06):
The company posted a consolidated net loss of
$221.1 million in Q4, an improvement
from a $325.3 million loss in Q4
of 2023 for a full year net loss
of $940.6 million compared to $1.16
billion in fiscal year 2023.
Something seems wrong, like it doesn't seem legitimate
(20:29):
there.
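Just to put those figures side by side, here's the simple arithmetic on the loss numbers quoted above (all values in millions of US dollars, taken from the show); it only computes the year-over-year improvement, nothing more.

```python
# Just the arithmetic on the loss figures quoted above (all values in
# millions of US dollars, taken from the show); nothing here is new data.

q4_2023_loss, q4_2024_loss = 325.3, 221.1
fy_2023_loss, fy_2024_loss = 1160.0, 940.6

def pct_improvement(old: float, new: float) -> float:
    """Percent by which the net loss narrowed from old to new."""
    return (old - new) / old * 100

print(f"Q4 net loss narrowed by about {pct_improvement(q4_2023_loss, q4_2024_loss):.1f}%")        # ~32.0%
print(f"Full-year net loss narrowed by about {pct_improvement(fy_2023_loss, fy_2024_loss):.1f}%")  # ~18.9%
```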
So is Roblox making money now?
Well, they're the biggest game in the world,
but it isn't profitable because it reinvests heavily
into its growth.
So my question is, how does a company
keep staying in business, like investing all
(20:50):
that, if it's not making money?
I find that to be a problem.
Some of you have asked me, does Roblox
have content that is not appropriate?
And so, yes, there've been some experiences on
Roblox that include some types of weapons and
violence.
They don't allow content that contains violence or
(21:11):
anything like that, or serious physical or psychological
abuse, including animal abuse, or anything that would be considered illegal.
And so they have a lot of rules
on Roblox, and they have to comply with
their restricted content policy and community standards.
They won't allow language that's used to harass,
discriminate, incite violence, or threaten others, or used
(21:35):
in any adult-context manner that would not
be appropriate for minors.
So restricted experiences are not accessible in certain
regions.
Then there's what they call Rule 11: inside Roblox games, you're required to follow the same Roblox rules.
(21:55):
Rule 11 makes general rudeness and cyberbullying punishable.
Do not trash talk or bully anyone in
the server.
If you have an issue, speak to a
staff member.
So let's just take a look.
What are Roblox rules?
Let's take a look at what they are.
So no harmful content, no illegal activities, no
(22:18):
personal information disclosures.
Users cannot reveal or threaten to reveal others'
personal information.
No account takeovers or false reports, basically threatening
to take over another account or filing a false abuse report is also against the rules.
No discrimination, no harassment, no impersonation.
Impersonating individuals, groups, or entities in ways that
(22:39):
could damage their reputation or cause others harm
is also prohibited.
No inappropriate content.
That is obviously important, especially content that would
endanger minors or children.
No Roblox-created assets.
Creators cannot use any Roblox-created assets or
official Roblox branding or iconography as part of
their items.
(22:59):
Not permitted.
No overly similar items.
So if somebody else creates something, well, sorry,
you can't create that same Roblox item because
it's too similar.
No clickbait or false claims.
Advertisements that falsely represent content or encourage clicks solely to inflate ad clicks are not permitted.
No promoted sales for charity.
(23:22):
Creators cannot claim that sales of an item
will go towards a cause, even if it's
true, as Roblox cannot verify the legitimacy of
such claims.
Thus, they do not allow you to do
a sale for charity.
So they just don't allow it.
Content maturity and age-appropriate ratings: they have ranges for all ages or nine plus.
They have a restricted access group for under
(23:43):
nine, and then they have unrated experiences, which prevent children under 13 from accessing areas that lack age-appropriate rating settings.
Enforcement and moderation.
Roblox moderates content and user behavior to ensure
a safe and positive environment.
Users can report violations of the community standards
or terms of use.
Violations are dealt with based on the severity
(24:04):
of the violation and the threat to the
Roblox community.
My question is, will Roblox ever make money?
And this is interesting.
They're the biggest game in the world, but
they are not profitable.
(24:25):
So the question I have is, who owns
Roblox?
The owner of Roblox is David Baszucki, who
is the co-founder and CEO of the
company.
He co-founded Roblox in 2004, along with
Erik Cassel.
While Baszucki is the primary figure behind the
company's ownership, Roblox is now a publicly traded
company, meaning it also is owned by shareholders.
(24:48):
The real owner, as I said, is David
Baszucki, and his vision is to build a platform that enables shared experiences among billions of users,
not thousands, billions.
The company has been widely recognized for its
innovation and vision under Baszucki's leadership.
The question people have asked, will David shut
(25:08):
down Roblox?
This has been a rumor for a while.
So no, Roblox is not going away.
The game will stay available and players will
be able to play at their leisure.
There are no plans to close the platform
or any of its services in the foreseeable
future.
So you may continue to play Roblox.
They're saying you can play it at school,
home or work.
(25:29):
I don't think I'd advise playing it at
school.
I wouldn't advise playing it at work as
well.
The question is, this is a big question
for me, will Roblox ever be popular?
And it's rising.
(25:49):
The community is growing so fast.
According to Statista, the platform has seen an almost 27% rise in daily users since Q4 2023.
People ask, is Roblox dying?
No, it's not dying.
It's continued to grow with significant increases in
revenue, bookings, and whatnot.
So you might be asking, what is Roblox
(26:10):
trading at?
So Roblox right now, it's on the stock
market, right, in case you're wondering.
And it's currently up.
It's $60.62, of course, at the time
of this stream, but it could change.
So it's up 4.6-something percent and keeps fluctuating now between $60.58 and $60.57. So
(26:33):
the question everyone asks is, why is Roblox
so popular?
It's because people can play a wide array
of games without any upfront cost.
They can make it so they can create
their own games, their own Roblox maps.
(26:56):
And this is what people want.
And eventually, I mean, I would say just
because a company is not profitable, doesn't mean
that the staff is not getting paid.
I mean, that's the thing you have to
understand.
The staff is getting paid at Roblox.
So they might not show a net profit
at the end of the year, but that
might be something that they have kind of
(27:17):
designed.
I don't know.
It just seems odd.
So a software bug was recently found at
Medefer and exposed NHS patient data to potential
hackers.
A software bug at Medefer, a UK-based
healthcare provider, has exposed sensitive NHS patient data
to potential hackers.
The flaw, which affected the security of patient
(27:38):
information, raised alarms about the vulnerability of healthcare
systems to cyber attacks. As more healthcare services transition to digital platforms, ensuring the security of patient data becomes a critical concern for everyone, with incidents like this highlighting the
need for stronger cybersecurity.
And I know what you're probably saying to
(27:58):
me, hey, John, you know, this is very,
very interesting.
And although it's interesting, it's a problem.
It is something that is growing and it's
something that is becoming a very, very big
problem.
And Apple, yes, Apple's voice-to-text system
misinterpreted a voicemail into a very inappropriate message.
(28:19):
Apple's voice-to-text system designed to convert
voicemails into written text has made a notable
error by misinterpreting a message into something very
inappropriate.
This mistake has sparked discussions about the accuracy
of voice recognition technology, especially in situations where
context is crucial.
As voice-to-text systems become increasingly
(28:41):
integrated into daily communication, these types of errors
raise questions about limitations of AI and machine
learning in understanding nuanced human speech.
And is this something that we can trust?
Those are questions that I think are going
to continue to compound.
And so we'll have to keep an eye
on that, but it's definitely something that I
(29:03):
think is going to be very, very interesting.
And ladies and gentlemen, nine United Kingdom banks face millions in compensation after IT outages disrupted services.
Nine UK banks are facing millions of pounds
(29:24):
in compensation claims after IT outages disrupted services
for thousands of customers.
These outages caused significant delays in transactions, access to accounts, and other banking services, leading to
frustration and of course, financial losses for customers.
The incidents highlight the growing dependence on digital
(29:46):
infrastructure in the banking sector and the need
for robust systems to prevent such disruptions.
And I think this is something that a
lot of people maybe don't realize is that
although this stuff has been happening in the
UK, it can just as well be happening
in the United States.
See, hackers don't just hit up the United
(30:07):
Kingdom or other countries.
So we'll definitely keep an eye on that
and let you know what's going on.
In New York City, the MTA is piloting a new Google AI project to detect subway track defects.
This is pretty interesting.
New York City's Metropolitan Transportation Authority, MTA, is
(30:28):
basically piloting a Google AI project aimed at
detecting defects in subway tracks.
The AI system uses advanced imaging and machine
learning algorithms to identify potential issues before they
become major problems, enhancing the safety and the
reliability of subway systems.
(30:49):
This innovative approach could revolutionize how public transit
agencies maintain infrastructure, preventing delays and ensuring a
smoother commute for passengers.
I'm curious about your thoughts about this and
what MTA is going to do with Google's
pilot track system.
(31:10):
And this is something that I feel is
interesting, but I think it can't be a
be-all end-all.
So the MTA, as I said, has partnered
with Google's public sector on a pilot program
designed to detect track defects before they cause
disruptions.
So Google Pixel smartphones were retrofitted onto subway cars.
(31:35):
Basically, the system is capturing millions of sensor
readings, GPS locations, and hours of audio to
identify potential problems.
The project aims to improve the efficiency of
MTA's response to track issues, potentially saving them
money, and also reducing delays for passengers.
So it's not just doing this with cameras.
It's also doing it with sound.
(31:56):
So this AI-powered program called Track Inspect
analyzes the sound and the vibrations from the
subway to pinpoint areas that could signal defects
such as loose rails or worn joints.
Data collected during the pilot, which ran from
September 2024 and basically just about reached
(32:17):
its end in January 2025, showed that the AI system successfully identified 92% of defect
locations found by human inspectors.
The system was trained using feedback from MTA
inspectors, helping refine its ability to predict track
issues.
While the pilot was considered a success, the
future of the program remains uncertain due to
(32:37):
financial concerns at the MTA.
Despite this, the success of the project has
sparked interest from other transit systems looking to
adopt similar AI-driven technologies to improve
infrastructure maintenance and reduce delays.
The MTA is now exploring other technological partnerships
to enhance its track monitoring and maintenance efforts.
So I think a program like this would
(32:59):
probably be interesting if they implemented this maybe
at an amusement park, like Great Adventure or Busch Gardens, or something like
that.
I think that could be really cool to
understand.
But I think AI, again, AI is a
tool and that's important.
(33:19):
It's a tool and we have to know
how to use that tool.
And the question you might be asking me,
it's a very good question, what did the
MTA Google Track pilot cost?
Well, it was actually offered at no cost
to the MTA.
And it was so successful that New York's
(33:41):
Transportation Agency has announced it's continuing the work
with Google in a new pilot program.
So again, the original one ended in January,
they're doing another pilot.
And it looks like they're not paying yet.
But I think the real reason that Google
is doing this for free is they want
the notoriety.
They want to get into other places.
So I think that's a big concern for
(34:03):
everyone.
And hopefully, this will spark other things like
maybe Google might approach amusement parks, because we're
talking about the same things.
Maybe they can identify certain parts of the
track that are a problem.
I think it's going to become harder on
(34:23):
a track because right now, the tracks are
just put in.
But I think eventually, each track is going
to have to be identified.
Maybe it's going to be a foot long
track.
And it's going to have to be identified
so we know what part of the track.
It might even be so picky as to
identify every rail of the track.
So if there's spokes, maybe it's on that.
(34:44):
Maybe it's on turn points.
It could be on a lot of different
things.
And I feel that although this technology is
good, I'm very concerned about where this data
is going, what they're going to do.
And I think that is a big concern
of mine.
I remember the first time I stumbled across
an emerging trend that completely changed the way
(35:05):
I thought about innovation.
It was probably back in the early 2000s
when smartphones were just starting to get traction.
I was amazed at how quickly mobile technology
transformed the way we live, communicate, and work.
I was actually one of the thought leaders
on the BlackBerry sidekick board when they made
(35:26):
sidekick before they actually had BlackBerry.
It was interesting.
But unfortunately, I had to move over to
the iPhone, which I do like, all because
I was starting to use apps like Lyft and stuff like that.
And they weren't available on BlackBerry.
I think BlackBerry really missed the boat, unfortunately.
(35:47):
BlackBerry was so secure, it was more secure
than anything.
I was amazed at how quickly mobile technology
had transformed the way we live, communicate, and
we work.
Every time a new episode, as you know,
of The JMOR Tech Talk Show drops,
I feel like I'm uncovering another game-changing
moment.
Much like those early days when I felt
like anything was possible in tech, I can't
(36:08):
wait to share more insights, as you know,
not just today, but every day.
And I think it's these personal stories that
I can relate to with my insights and
now going back to school and getting my
master's at Montgomery State University, and then eventually
getting my PhD.
I think it's these moments that make me
not just an interesting content creator, but one
that gives me a true flair of authenticity
(36:32):
about things that apply to real life.
And as I said, Apple re-enables the
Apple Intelligence by default with the latest iOS
update.
And I had my last iPhone for a couple of years.
I remember the first time I felt like
it was learning my habits; that had to be probably on my second iPhone.
I think I'm up to like my seventh
(36:53):
or eighth iPhone.
I noticed how Siri became increasingly efficient, predicting
the apps I might use, offering suggestions at
just the right moments.
But the iOS update that made Apple Intelligence the default setting made my phone feel almost human.
Like it knew me better than I knew
myself.
It's like having a personal assistant who anticipates
(37:14):
your every need.
And this technology blew my mind.
It was almost scary in a way, because
I think it was trying to think about
things.
And I think that's a problem.
I think we need to keep AI in
a certain realm.
And I feel that's a problem for a
lot of people.
And when I heard about Facebook deleting live
videos 30 days after the broadcast, I wasn't
(37:36):
crazy about it.
A few years ago, I went live.
Actually, it's been over four years that I
went live on Facebook the first time.
And the first thing I went live for, I think,
might have been The JMOR Tech Talk
Show.
At that time, guys, I wasn't so into it yet, though I was always in video production.
But I didn't really have my following.
(37:58):
I had a weekly show, but I didn't have Inspirations for Your Life yet.
And so I went live on Facebook.
It was a very exciting moment for me.
But I realized that it wasn't easy to
keep up with the engagement or even remember
everything from the broadcast.
When I heard about Facebook's decision to delete
live videos after 30 days, I thought about
how much pressure we often feel to create
(38:19):
evergreen content that everyone's just going to love.
It reminded me of how our digital footprint
can sometimes feel overwhelming.
And maybe it's okay to let go of
some of these things after they've had their
time in the spotlight.
But I think sometimes it should be more.
I think maybe it should have been 90
days.
That's my feeling.
Or maybe 180 days.
And I get that they need to clean
(38:40):
things up, but I just think 30 days
is too short.
People hustle and bustle in life, and
they just miss things.
OpenAI and Musk fast-tracking a trial over
OpenAI's shift to the for-profit model.
This is just interesting.
This story reminds me of a key debate
I had one time with a close friend
about the future of artificial intelligence.
(39:01):
I was advocating for innovation and development while
my friend was worried about the ethical implications
of AI-driven business.
When I heard about the legal battle between
OpenAI and Elon Musk, it was as if
our discussion had come full circle.
It's an important moment in tech history as
it pushes us to think about what drives
innovation, profit, or purpose.
(39:23):
And I remember talking to somebody the other
day, it's actually a doctor, a client of
mine that I've been working with for many
years, and just talking about the concept that,
do we need to fear AI?
I don't think we need to fear it,
but I think we need to be responsible
in how we create AI.
I think that is a very, very important
point.
(39:43):
And hearing about Vice President JD Vance expecting
a framework for resolving TikTok's ownership by April
5th, that was interesting.
A couple of years ago, a similar drama
happened when I was working with a tech
startup, and that was at the center of
a privacy debate.
I was working on a project and a
(40:05):
product with one of my companies that gathered
user data, and figuring out how to balance
privacy with business goals felt like walking a
very, let's say, fine line or a very
hard-to-walk tightrope.
When Vice President JD Vance announced that a
resolution on TikTok's ownership would be coming soon,
(40:27):
it struck a very emotional chord with me.
It's a challenging decision, as we see with
TikTok.
Balancing user privacy and business interests is an
ongoing complex issue.
So does TikTok have a new owner yet?
Well, right now, no.
(40:49):
And so they're working on this.
And do I think they're going to find
somebody?
I think they will.
I think Trump's probably at the bottom of
it or ahead of it.
And when I heard about Apple's secret appeal
against the UK's encrypted data access order, raising privacy
concerns, there was a time when I faced
a situation where I had to decide whether
(41:11):
to share personal data for a security reason.
It was a difficult choice, as I weighed
the potential benefits of helping the law enforcement
agencies against my concerns about privacy.
When I heard about Apple's legal battle over
encrypted data, I thought back to that moment. Privacy is such a fundamental right.
And this case highlights how easily tech companies
(41:32):
can be caught in the middle of protecting
users while also being pressed for transparency.
I was recently having a conversation the other day with the president of an organization, and
I was talking to him about, you know,
there's a lot of great things you guys
do here, but a lot of your technology
(41:54):
and your whole fight for being transparent, although
it's what you want and it's what you
preach, I don't feel like you are transparent.
And I wanted to bring this to your
attention.
He says, John, that's an excellent observation.
And the reason I share this with him
is because I think they push transparency so
much, but they fail every time.
(42:16):
He says, yeah, John, it's because we have
so many antiquated systems, and it's taking us
a while to review them, and then we're
going to replace them within a couple of
years.
But what I don't want to do is
replace those systems with a newer technology that's only as good as what we have, or worse, so we're just copying
(42:37):
what we have.
We want to make sure those systems align
with everything.
I think that's important.
And hearing about Apple's plans to add live
translations for AirPods was exciting.
When I traveled abroad a few years ago,
and many of you know that I have
traveled, my first trip to Europe was actually
when I was in eighth grade.
My grandfather on my mom's side of the
(42:59):
family, my grandmother, and my cousin took me; we went
to Italy because we had family in Italy.
And that was an interesting experience.
And I found myself fumbling through translation apps
and even a little book to communicate with
the locals.
There weren't any translation tools like we have now; we didn't even have an iPhone
back then or any phone that could do
(43:20):
translation.
The language barrier was frustrating for me and
challenging.
And I kept thinking about how much easier
it would be if there was a tool
that could just translate in real time without
the need for an app.
I remember being at a school doing a
play for a charity and the bathroom was
closed because they were doing maintenance on the
(43:40):
men's room.
So I wanted to ask the janitors where
the next bathroom was, because I had no clue, and the school was huge.
And I said, I can't just go up
to them and ask them where the bathroom
is.
I mean, I can try.
So I did, and they didn't understand me.
So I went to my phone and I
used, I think it was Google Translate.
(44:01):
And I said, you know, hey, such and
such translate this into Spanish and read it
back.
And it read it back, not just put
it on the screen, it read it back.
And then when they said it, they were
able to tell me exactly where it was
and they showed me.
So I think those are important things.
(44:21):
And so, you know, when I heard about
Apple's plan to add live translation to AirPods,
I couldn't help but imagine how much simpler
travel and communication would now be.
It felt like something out of a sci
-fi movie that was finally becoming reality, but
I have a concern.
And my concern is my privacy.
That's a big concern of mine.
(44:42):
India, as you know, plans to limit the
satellite broadband licenses to five years, challenging
Mr. Musk and his company Starlink.
A few years ago, I was working with
a rural community that had very limited internet
access.
And we explored options like satellite broadband, but
service was inconsistent and the costs were higher
than really expected.
I remember trying to bring cable into somebody's
(45:04):
home that lived literally about two miles from
the street and it would cost them a
thousand dollars just to bring in cable at
that time.
And they would have to pay for it
because they were the only property back there.
So that's why they would have to pay
for it.
Big problems, right?
And so we explored options, as I said.
(45:25):
When I heard about India's decision to limit
satellite broadband license, it reminded me of that
struggle.
It's clear that satellite internet needs to evolve
and be more sustainable.
And this decision reflects a larger issue about
how we can provide affordable, reliable internet to
underserved areas.
(45:46):
I remember just a few years ago, I
believe it was the Affordable Internet Act that was removed.
And you might say, gee, John, why?
So they had an affordable connectivity program and
it died and thousands of households have already
lost their internet.
The ACP provided affordable internet connectivity to low
-income Americans until it expired in May, not
(46:09):
too long ago, just last year.
Around 100,000 Charter subscribers have had to
pull the plug.
And it's sad that that had to happen.
It really is.
But nevertheless, maybe something like that will come
back.
Who knows?
But I got to tell you that it's
(46:30):
important that everybody has access to the internet.
You can't even get a job today very
easily without having access to the internet, right?
So it's clear that satellites need to evolve
and they need to become affordable.
And imagine this, as I said, Alfonso, a
robot waiter, assisting staff and delighting customers at a UK cafe.
This was really amazing.
(46:52):
Not too long ago, I visited a restaurant
that had these integrated robots in Naples, Florida
to deliver food to tables.
It was a novelty at first, and I
wanted to go there because of the robots.
But as the evening went on, I started
to appreciate how efficient and friendly the robot
was.
Now, all this robot did was simply show
up at my table and the server still
had to take the food and put it
on there.
(47:13):
So it wasn't a full-fledged robot, but
Alfonso actually is a full-fledged robot that
can actually do more than what some of
these table servers can do.
And it got me thinking about the future
of customer service.
Now robots could actually improve experiences rather than just replace human workers; Alfonso the robot waiter feels like the next step in that evolution.
It's exciting to see technology enhance the experience.
(47:33):
It brings smiles to people in many different
ways, and it's cool.
Hearing about Iran using drone surveillance and facial
recognition to enforce dress code for women, this
is just sad.
It takes me back to a trip I
once took to a country with strict public
behavior rules.
The idea of surveillance was all around, and
it made me feel constantly observed.
I wasn't doing anything wrong, but I always
(47:54):
felt like I was being watched.
I remember questioning the balance between personal freedom
and the desire for security.
Now with Iran using drones and surveillance technology
for dress code enforcement, it's a stark reminder
or a flashback of how powerful technology can
be used in ways that affect our personal
lives and freedoms.
It's unsettling to me, though it also
(48:14):
makes me reflect on how vital it is
to safeguard privacy in an increasingly monitored world.
I think that's important to understand.
And we know many kids play with Roblox,
and the CEO now advising parents to keep
their kids off of it if they feel
there's a safety issue.
I'm not a parent, but if I was,
I would be cautious about kids spending their
(48:36):
time online.
I remember the first time I saw children
playing Roblox.
It was fun, but I had concerns about
the online interactions they were having.
Hearing the Roblox CEO's advice to parents gave
me pause.
It made me think about the responsibility of
tech companies in creating safe environments for younger
audiences.
Who are they interacting with?
Who are these people?
And they're sharing things personally they're not supposed
(48:57):
to.
As much as we embrace technology, it's crucial
that platforms remain vigilant in protecting their users,
especially the vulnerable ones, because I think minors
and kids don't know about this.
They're just wanting to play a game.
And the software bug I heard about exposed
NHS patient data to potential hackers.
That's just wrong.
When I heard about that software bug at
Medefer, I immediately thought about the impact it
(49:20):
could have on trust within the healthcare system.
The idea that personal health data could be
exposed due to a bug is terrifying.
And it serves as a reminder of the
importance of cybersecurity in the healthcare industry.
And hearing about Apple's voice-to-text system
recently with the inappropriate message, I had a
(49:40):
funny but awkward situation where my voice-to
-text system completely misunderstood my voicemail and sent
a message that was way off from what
I intended.
I had a good laugh about it.
It was not inappropriate.
It was just funny.
When I heard about Apple's misinterpretation with this
elderly person, I thought back to that moment when it was funny, but also how something that could be funny could also be translated into something
(50:02):
that could be rude and inappropriate.
And it's amazing when they work perfectly.
But when moments like this happen, it reminds you that
AI is not perfect.
It's a tool, and it's constantly getting better.
Nine UK banks we heard recently faced millions
in compensation, as I said, over disrupted services.
(50:23):
I remember the time when I tried to
transfer funds for an urgent payment, but the
banking app crashed, or it said the mobile app was currently unavailable.
I'm like, well, now what do I do,
right?
And I had to wait hours for it
to be resolved.
Instead of that, I decided to show up
to the bank and just make my withdrawal
right there and then.
It was frustrating.
I couldn't help but feel powerless, and I knew I had to do something.
(50:44):
Hearing about the IT outages affecting UK banks
and their compensation plans took me right back
to that moment saying, I need to be
proactive.
As much as I like technology and it's
a convenience, I have to realize that if
it doesn't work, I can't be lazy and
say, all right, I can't do it, or
I can't make the deposit withdrawal because technology
doesn't work.
That's a big problem.
And the last story comes to us again
(51:05):
with my personal anecdotes, is NYC's MTA piloting the Google AI project to detect subway track defects, and here's my personal anecdote.
I've been on subways countless times in New
York and many other places in the world,
but there was one instance when I was
stuck in a delay because of a track
issue.
It made me realize how much technology could
improve the reliability of public transportation.
(51:27):
When I heard about the MTA's partnership with
Google AI to detect defects in their
tracks, it reminded me of the potential for
AI to make our daily commute safer and
more efficient.
I'm excited about how tech like this could
change the way we experience public transit.
I'm also excited because if this bridges to
things like amusement parks and other places where
(51:48):
there could be dangers, it could even be
used in public buildings to see if there's
potential dangers.
Could be a stair, could be an elevator,
could be a walkway, an escalator, could even
be a loose tile.
AI can be used in so many different
ways, but I think what we have to
make sure of is how we're going to
(52:09):
use this.
We have to ensure that the data it's
capturing is isolated, that it's not grabbing personal
data, and that's where we all have our
problems, right?
The personal data isn't meant to get picked
up, but surprise, it does.
Now, many of you out there have probably
(52:29):
experienced AI before, and you might be asking
a good question.
Do we need to fear AI?
It's often discussed, and the fear is based on lots of things.
People fear job displacement and lack of control.
(52:50):
Those are just a couple; there's also privacy and security, autonomous weapons, and superintelligent AI.
Of course, there are reasons to not fear
it.
Enhancements of human capabilities, personal enhancements, improved efficiency
and productivity, problem solving, creating new opportunities, new
(53:13):
types of jobs.
Instead of fearing AI, it's crucial to approach
it with a mindset of understanding and responsibility, just
like with any powerful technology.
The way we use it matters.
We need to prioritize ethics, regulation, and transparency
to ensure that AI benefits society as a
(53:34):
whole, while also addressing the concerns that come
with it.
The key is to strike a balance between
innovation and caution, and I think that's important.
As I've worked with AI and watched its
development, I've come to realize that we're not
in a situation where we need to fear
AI, but we need to be mindful of
how it's integrated into our lives.
Embracing its positive potential while ensuring we protect
(53:57):
our rights and freedoms is the path forward,
just like using a tool.
It could be a saw.
It could be a drill.
You might say, oh my gosh, this is
a really dangerous drill, or maybe even, let's
say, some type of a torch.
Of course, there is a danger in these
things, right?
But what we focus our mind on, we've
(54:18):
learned in the past, is what brings about
our reality.
That's the truth, guys.
That is what brings about our reality, and
if we bring about our reality in that
kind of way, I feel that it's going
to not only make us feel more comfortable,
but it's also going to give us the
(54:39):
transition that, hey, this is a bridge.
This is how I can make this work,
right?
I think that's something that is very important
for a lot of people to understand.
Going back to get my master's
in AI and then my PhD, I feel
(54:59):
that doing that, I'll be able to make
a big impact in the world.
I'll be able to share things, and also
being a speaker and taking that into my
keynote repertoire.
I think a lot of people speak because
they want to pat themselves on the back.
(55:21):
I'm not looking to do that.
I'm looking to give value to the world.
Of course, I charge for it, but all
these shows that I put out, all this
content at BelieveMeAchieve.com, all this is free,
and I do this because I think it's
important that people understand how technology works, what
technology is, what technology isn't.
(55:43):
There's a lot of misnomers in the tech
world, unfortunately.
These misnomers get us to fear things.
This fear comes in for one main reason.
It's because people don't know.
Now, about the things people don't know, you might say, gee,
(56:03):
John, that's no big deal.
Yes, I could say that to you, but
I won't because you have to understand what
you don't know.
When you don't know how something works, you
become very fearful of it.
You do.
Once you know how it works, it's like,
(56:24):
oh, I get that.
I understand that.
Does that make sense to everyone?
I hope it does.
I hope that you embrace this whole concept
of what this means.
Taking the time to understand what's in your
life, how technology is being used, I think
(56:44):
that is a major key player in our
lives, a major key player.
If somebody came up to me and said,
hey, John, what is artificial intelligence?
Artificial intelligence is the culmination of data that
is, let's say, searched and built upon
(57:11):
based on algorithms, which means it could be
added.
It means it could be changed.
It could be deleted.
The data comes from humans or the data
comes from the real world, like we learned
from the Google project, like the cameras on
the phone, but also the sound, any kind
(57:32):
of things that we could take.
We could even take smell to be part
of AI.
We could.
We could do that.
If we take smell to be part of
AI, I think that's something that maybe people don't realize: all these things have so much potential in them.
I mean, they really do.
If you think about AI right now, the
(57:54):
first thing that comes to your mind is,
oh my gosh, how do I get a
job?
Well, don't fear AI.
Use AI as a tool, right?
Don't fear it.
Ladies and gentlemen, I am John C.
Morley, serial entrepreneur.
It's always such a privilege, a pleasure and
honor to be with you here on The
JMOR Tech Talk Show.
(58:15):
Be sure to check out BelieveMeAchieve.com for
more of my amazing, inspiring creations.