Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:02):
Hi everyone, I'm John C. Morley, serial entrepreneur, the host of
The JMOR Tech Talk Show and Inspirations
for Your Life.
(00:45):
Well, hey guys, welcome to the JMOR
Tech Talk Show.
It is great to be with you.
We have a brand new look, as you
guys can see.
We've been working hard on that.
And so it is great to be with
you once again on The JMOR Tech
Talk Show.
Today is Friday, May 9th, and we're live
(01:05):
to you here.
And it is 2025, it is the second
Friday of May, and we've got a lot
to share with you.
I mean, a lot.
First of all, the title for the show
is AI on the Edge, Tech in Court,
and Gliders Take Flight, Series 4, Show 19.
So welcome everyone, it is my privilege and
(01:26):
of course my pleasure to be with you
here on The JMOR Tech Talk Show.
Excuse me, I do a lot of different
shows, so to be clear, it is The
JMOR Tech Talk Show.
Just did Inspirations for Your Life and doing
another one shortly, but great to be with
you here.
And so the show today is a really,
really good insight into what's going on in
the world.
(01:47):
But if you are hungry, if you're thirsty
or what have you, feel free to head
out to your kitchen so you're not parched.
Maybe it's water, maybe it's soda, maybe it's
a snack, maybe it's healthy like fruit, or
maybe it's a sweet, maybe it's tart or
not.
That's totally up to you.
And hurry on back.
All right.
Well, welcome to The JMOR Tech Talk
Show.
In today's episode, I'm diving deep into the
(02:08):
latest headlines that are shaking up the tech
world.
From legal battles involving Google and TikTok to
the latest developments in artificial intelligence and cyber
attacks.
This episode is packed with everything you need
to know to stay ahead of the curve.
I'll explore the new tech regulations, the challenges
big tech companies are facing, and how these
changes could impact our overall digital lives.
(02:30):
So stay right where you are, sit back,
relax, enjoy your refreshment or your snack.
And I'll break it all down for you
and give you the insights that you won't
find anywhere else.
First of all, Google faces an antitrust ad
tech trial in September.
Google is preparing for a major antitrust trial
this September, where the company will face accusations
(02:53):
of manipulating the digital advertising market.
The trial could have serious implications for Google's
business model, potentially reshaping the ad tech landscape
for years to come.
So definitely stay tuned for how this case
unfolds in September.
We'll be covering it for you.
And what it all means for advertisers and
consumers alike.
(03:14):
It's going to set a new precedent.
We all know that Google has not been,
let's say, the most honest kid on the
block.
But I got to be honest, I've had
challenges with them that go way beyond just
bad customer service.
I won't get into that.
I've shared that on other episodes before, but
they've just flat out lied to me.
(03:35):
So we're just going to have to see,
you know, what's going on with that and
just keep you on track.
All right.
And the European Union, well, they're getting tough.
They fine TikTok 530 million pounds for data
violations.
It's no wonder that the United States is
considering possibly banning them if they don't get
their act cleaned up.
(03:55):
TikTok is in hot water after being fined,
as I said, the 530 million euros by
the European Union for violating data privacy laws.
Now, the fine stems from the platform's handling
of children's data, highlighting the growing pressure
social media giants are facing to better protect
privacy.
Now, this move sets a precedent for stricter
(04:19):
regulations on digital platforms in the future.
And a judge warns that AI
could be a big problem here, too.
But we're going to have to see what
happens, because, you know, what's going on with
this is really a major problem.
And I just don't think enough people understand
the severity of it and what it could
(04:40):
do to people internationally, not just people in
the United States.
And a new thing: a judge warns that
AI could destroy the original content market.
So a judge has raised concerns about the
impact of artificial intelligence on the market for
original content, citing ongoing litigation
(05:00):
that's currently up at the courts.
As AI technology continues to evolve, the fear
is that it could flood the market with
automated content, diluting value and possibly
negating originality, and this is a
topic I'll continue to follow for you and
(05:20):
watch and report on so that you can
learn the long term effects on the creative
industry, and it's something that is really bad.
Ladies and gentlemen, I think AI is being exploited
in many ways.
I don't think this is what AI was
created for.
Again, it's a tool and how we choose
to use it that makes it good or
bad.
Uber now enables cash payments, catch this, across
(05:43):
most United Kingdom cities.
In a major move, Uber is now enabling
cash payments across most UK cities, a shift
that could attract a whole new set of
customers.
While Uber has previously been a cashless service,
this change opens up accessibility for users who
prefer using cash or lack access to digital
payment options.
(06:04):
Could this move signal a larger trend in
the ride-share industry?
Possibly.
And I think there are security implications, like,
you know, where are people getting the money
from?
Is this something good?
Is it something bad?
I mean, I think that's really what it
comes down to at the end of the
day.
And I think that could be a very,
very big problem.
(06:26):
I know that it's something that a lot
of people don't get.
But it's something that I feel is very,
very important to understand.
(06:48):
And, you know, I think a lot of
people are trying to do what they're doing
now, but they don't really get it, you
know what I'm saying?
We're just going to have to see
what happens.
And so I know that this is something
(07:12):
that people just don't seem to get.
And the reason is they get stuck
in a certain belief, right?
And that belief holds them back.
(07:42):
And I think this is a problem for
a lot of people, okay?
A very, very big problem.
It means you need to understand where
technology is going and where it's not
going.
(08:02):
And that problem could signal new challenges
in the marketplace.
So definitely we'll have to keep our eyes
peeled about what's going on there and
ask what it all means, right?
I think a lot of people are pushing
(08:24):
for cash.
But will this open them up to more
security concerns?
I don't know.
We're going to have to just wait and
see.
And ladies and gentlemen, Meta threatens a Facebook
shutdown in Nigeria over regulatory fines.
Well, Meta is threatening currently to shut down
Facebook in Nigeria over mounting regulatory fines.
(08:45):
The dispute centers on compliance issues with Nigerian
laws, which have put social media platforms under
increased scrutiny.
That's no surprise, I'm sure you know.
And this move raises questions about the balance
of power between global tech giants and national
governments.
And so I think, you know, if people
(09:05):
want to operate in another, let's say, country
or what have you, they need to follow
the rules of that country.
I think that's a very, very important thing.
And I think it's something that most people
today don't really understand.
(09:27):
And the reason they don't understand it
comes down to one thing.
You know what that one thing is?
That one thing is that people believe
things are one way, right?
(09:50):
And I think that can be a very,
very big problem for some people, because
we're thinking that something is one way,
but what's happening is it's moving another
way.
And all this is happening because of a
subtle shift, OK, a subtle shift that is
really changing people's lens or their perspective on
(10:14):
what's going on in our current world.
And a cyber attack on Marks and Spencer involves
a four million dollar ransom.
So Marks and Spencer, one of the UK's
largest retailers, is now recovering slowly from a
major cyber attack involving a four million dollar,
it's actually four million pound ransom.
(10:37):
The attack is a stark reminder of the
ever growing threat of ransomware and the need
for stronger cybersecurity measures in the retail sector.
And I think the problem that I see
with a lot of this is that many
retail businesses are not taking technology seriously, OK?
(11:01):
That's a problem.
I think that's something that a lot of
people don't understand, but it's important.
And that importance is something that I
believe most people don't want
(11:24):
to admit.
And the reason they don't want to admit
it is because it's going to change how
they have to do business.
You know, accepting that technology can be vulnerable
is a hard thing for some people.
But I'm going to tell you that when
you choose to embrace technology and understand that
(11:47):
you've got to be vigilant about it, I
mean 24/7, not just today, not just
because you heard something.
I think that's a real big thing
when it comes to tech.
People just don't get what's going on.
And the reason they don't get it is
(12:08):
because of, let's say, a false belief, OK?
And I think that is a very, very
big problem.
And I see the problem only getting worse
by the moment.
And so I know that sometimes this can
(12:34):
be really a big challenge for people.
And it's not because of the actual
tech, it's because of people's mindsets, right?
A mindset that can cause people to change
(12:59):
where they are.
OK, I think that's probably the biggest
thing I could say to you, a
change of where they are.
And that's something I feel could be
very hard for many people to do.
(13:22):
And so I know that you're probably saying,
you know, this is something that is nonsense,
but it's not, guys.
It's something that is really very apparent in
our world today.
And this comes from somebody's current
(13:43):
beliefs.
And that current belief says, I am
safe.
Now, I'm not saying you shouldn't believe you're
safe.
I'm not saying that.
I'm saying that it's not a question of
if you get attacked, but when, if you're
not properly protected.
So that means that you need to be
vigilant.
You need to keep scanning.
(14:03):
You need to make sure that what you're
doing is up to par.
Right.
That's a very important thing, guys.
I mean, I want to tell you that.
But some people just feel that, you know,
we've got the greatest, the best.
And you might.
But the thing is, these bad actors, they
prey on the fact that you think you've
(14:24):
got the best.
And we all know that security is a
constant morphing landscape of potential vulnerabilities and different
intrusions and different types of attacks, ranging from
distributed denial of service attacks to much more
(14:44):
complicated attacks, different types of social engineering that
is relentless until it gets through to its
prey.
And I think when a company like, you
know, the one that just got attacked here,
which we all know very well, I think
Marks and Spencer were completely blindsided by what
(15:06):
happened.
They didn't know this was going to happen.
And so when it did, they were,
you know, left feeling like they were on
their hands and knees, not knowing what
to do.
And so they are, you know, starting to
bounce back.
But it is a slow process.
I think if Marks and Spencer had
been more open to more security audits and
(15:29):
things like that, this would have been caught.
Everyone thinks they have the best.
All right.
But we're learning that what you have today
may not be the greatest tomorrow.
We need to keep updating that.
We need to make sure that our
internal security protections match the threat landscape.
(15:49):
That's important.
And Apple warns, guys, the Trump era tariffs
could cost $900 million.
Apple is warning that the tariffs imposed during
the Trump administration could cost the company nearly
$900 million.
And these tariffs, which impact imported goods from
China, have been a major point of contention
in the trade relations and could have ripple
(16:10):
effects on the tech industry.
So what does all this mean for Apple's
pricing strategies and the global supply chain?
It's a mess.
Right now, Apple, you know, could
raise its prices on the iPhone.
And so it's stated that they've got
(16:36):
to get this money back somewhere.
Right.
And so they're going to try to get
you on things like more accessories.
And so it's basically an inflated Apple tax
is the best way to think about it.
And so it's going to get more expensive
because of the potential tariffs.
(16:58):
And I think they need to move things
away from China.
That's number one.
Analysts estimate that the iPhone's price could
increase by as much as $800.
Some analysts predict that even a moderate 54
percent tariff could lead to a top end
iPhone costing over $2,300.
(17:19):
Now, the average iPhone today, like an
iPhone 16 Pro Max, is right around,
let's see, maybe $1,200, that's probably
what we're looking at.
So you're talking a difference there, guys, not
(17:40):
even including tax.
Right.
You're talking, on average, around $1,100 more.
That's a pretty big bump up, right, from
where the price was just recently.
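To put those tariff numbers in perspective, here's a minimal sketch of the math, assuming the full 54 percent tariff gets passed straight through to the sticker price. The base prices below are just the ballpark figures quoted above, not official Apple pricing.

```python
# Hypothetical sketch: full pass-through of a 54% tariff to retail price.
# Base prices are rough figures from the discussion, not official pricing.

TARIFF_RATE = 0.54  # the "moderate" tariff scenario analysts cited


def price_with_tariff(base_price: float, rate: float = TARIFF_RATE) -> float:
    """Retail price if the entire tariff cost is passed on to the buyer."""
    return round(base_price * (1 + rate), 2)


top_end = price_with_tariff(1599.00)  # a top-end iPhone, roughly
average = price_with_tariff(1199.00)  # ballpark iPhone 16 Pro Max price

print(top_end)   # clears the $2,300 mark analysts mentioned
print(average)   # several hundred dollars above today's price
```

Under full pass-through, the top-end model lands above the $2,300 analysts warned about. In practice Apple could absorb part of the cost or shift assembly elsewhere, so treat this as an upper bound rather than a prediction.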
So I don't know, guys, but I know
this is going to definitely be a problem
(18:00):
for a lot of people.
And I think this is something that we
need to understand because it's something that we've
got to be concerned about.
OK.
I know that a lot of people don't
quite understand what's going on.
(18:23):
But this is a negotiation deal and it's
not just something for our state or our
country.
This is something that's affecting people globally, internationally.
So we've obviously got to address it or
we're going to be paying some very hefty
prices.
Trump delays the TikTok ban once again, granting
(18:46):
ByteDance kind of a temporary immunity for more
time.
In this newfangled twist, President Trump has once
again delayed the TikTok ban, granting ByteDance more
time to resolve security concerns.
And this latest extension raises questions about the
future of the popular app in the United States
and its ongoing struggle with government regulations.
(19:08):
Will TikTok finally comply with the demands or
face an eventual shutdown?
I don't know.
So the big question is, will TikTok, this
is the question, will TikTok shut down on
June 9th?
It's June 19th, excuse me.
(19:29):
You know, we all know what happened.
We all know what happened, I should say,
a while ago.
Right.
But the fact that this could happen again,
the question is, will TikTok go black in
June?
(19:51):
And the thing is this, being protected and
doing the right thing are important, right?
The fate of TikTok continues to be up
in the air.
Months after federal legislation has effectively banned the
social media platform in the United States, President
(20:12):
Donald Trump said during an NBC News interview
recently that he would extend the TikTok ban
deadline if a deal isn't struck by June
19th.
He extended it by executive order in April, and
TikTok's Chinese parent company has until June 19th
to divest.
That means June 18th at 10 p.m.
(20:32):
is when the U.S. servers
usually start shutting down.
Trump spoke about the divestment, adding that he
has a little sweet spot in his heart
for TikTok, which he claims helped him win
votes during the 2024 presidential election.
It'll be protected, he said.
It'll be very strongly protected.
But if it needs an extension, I would
be willing to give an extension.
(20:54):
So if ByteDance does not divest TikTok by
June 19th, it could be banned.
But Trump says he'll extend it.
So what's the point?
What really makes TikTok want to change?
I mean, it's like tomorrow is never coming,
right?
They always have some reason as to
what is going on.
And I think that's a huge, huge problem
(21:16):
for a lot of people: they've got
to do what they say, not this
nonsense.
I think that's really a huge thing
for a lot of people.
You know, we've seen what's happened with the
European Union and they're giving them fines.
The U.S. isn't quite so, let's say,
vigilant on that.
(21:39):
And I think that's something that most people
don't realize, that it's about where we're coming
from, right?
And where we're coming from is something that
most people don't understand,
(22:02):
because they don't see that there is
a technical issue that is potentially going to
cause a problem for a lot of people.
And by a lot of people, I mean,
this could affect people's security.
It could affect things like, you know, their
(22:25):
own privacy.
So the question I have for everybody is,
when is enough going to be enough?
I mean, Trump could be extending TikTok for
eternity.
But is that the right thing to do?
I don't think so.
I don't think it is.
But I also don't think people realize the
importance of this.
(22:47):
I just don't think they get it.
And I think the reason people don't get
it is because they don't understand, let's
say, the grandeur of this whole moment,
the moment that could shape our future, our
security forever.
(23:08):
And so if I had to ask you,
friends, would you ban TikTok?
And I have to tell you, it's got
to come down to the fact that, you
know, something needs to be said.
They can't keep just doing these willy nilly
extensions.
They cannot do this over and over again
because the more they do this, it sets
(23:29):
a precedent, a precedent that what's going on
is a huge, huge problem.
And this huge problem, guys, is something I
feel that is going to be around for
a long time until we, OK, until we
as a nation, as a country decide that,
(23:50):
you know, enough is enough.
Like we cannot do this TikTok stuff anymore.
We cannot play these games anymore.
Right.
I mean, when is America going to get
fed up with this whole nonsense?
That's what I want to know.
When is America going to get fed up
with all this nonsense?
Is it going to be tomorrow?
Is it going to be, you know, is
(24:10):
it going to be next month, next year?
How many times?
I just want to know how many times
is Donald Trump going to extend TikTok?
That's really what I want to know:
does he ever plan on banning TikTok,
or does he feel he owes it
to them because they helped him win?
Which I think that is a very bad
move from a political standpoint.
(24:32):
I think the bad move comes because most
people don't realize that TikTok could be costing
people billions.
OK, and you might not think that your
security is very important, but I'm here to
tell you that getting back your identity could
cost as much as a couple million
(24:54):
dollars.
So we don't think about that very much.
But I'm hoping that more people are going
to be, let's say, at least on board
with what the heck is going on.
And if we're more on board, then maybe,
just maybe, guys, people will be able
to understand one
(25:14):
thing, and that one thing is that
we have to make TikTok accountable.
And I think we have to start with
our own government.
If we can't have our government get on
board with what TikTok should be and should
not be doing, then how can we expect
the American people to get on board?
I mean, the government is supposed to be
(25:35):
a set of leaders, right?
They're not supposed to be followers.
I think that's a huge, huge problem, right?
A very, very huge problem.
And I know, guys, that a lot of
people have had issues.
Like we said, I mean, they've already gotten
in trouble with so many different fines, right?
So many different fines.
(25:55):
And these fines are happening because of
breaches.
And breaches are where your identity could
be stolen.
I'm not saying it will be; it
could be compromised.
I mean, just think about that for a
moment.
I mean, think about the fact that here
you are in this one thing and now
(26:18):
it's like you don't know what to do.
You don't understand something.
But suddenly you think that if you let
the good guys go, or supposedly the good
guys, that it's going to be OK.
But I'm here to tell you that we
are getting misled.
And so I think they've really got to
give TikTok a final date and say, look,
(26:40):
enough is enough.
I think that's a very, very big
problem.
And I think that as you, let's say,
evolve, I think hopefully you'll be able to
understand that it's more than just one thing,
(27:01):
right?
It's more than the fact that we as
people want to do the right thing, not
just for us, but for other people.
And I think that's something that a lot
of people just don't quite understand.
They don't understand the fact that something somebody
is doing could cause a very, very big
problem.
And that big problem, guys, could make the
(27:24):
difference to whether, let's say, life for TikTok
goes on or doesn't.
I mean, everybody says to me, John, oh,
my gosh, like when TikTok went down,
it's like they couldn't breathe.
I mean, come on, right?
A day without TikTok was not the end
of the world, but people act like it
was.
(27:44):
It was like we were taking away
their food supply, right?
I mean, I think that's pretty bad when
people were relying on that so much that,
you know, it was taking away everything they
believed in.
I mean, why is it so hard for
TikTok to get on board?
(28:05):
I mean, this is what I want to
know: why?
And if we can't figure out what
the why is, then maybe we need
to do some more amazing stuff.
And by amazing stuff, I mean we need
to start making TikTok accountable.
(28:28):
Okay.
If we can make TikTok accountable, then maybe,
just maybe other people will start realizing that
life has got to change.
And so people are saying, oh, you
know, what harm is TikTok?
And it's not about anything political, guys.
It's about the fact that if you're doing
(28:50):
something wrong, you need to be doing what's
important.
Okay.
You need to be doing what's important.
And if you do what's important, then I
know, ladies and gentlemen, I know that you
can make a huge, huge difference.
But again, this is not something that's going
to happen overnight.
(29:11):
I want to be quite truthful with you
right there.
If you thought that giving them another extension
is going to help, then maybe, just maybe,
okay, it will change the way they do
things.
I mean, maybe.
Then again, it may not.
(29:32):
And I know, ladies and gentlemen, that people
from all around the world don't quite understand
why things like this are happening.
My biggest concern is why haven't we addressed
this?
But I think the reason we haven't addressed
this is because it's politics.
(29:57):
It's money.
All right.
TikTok is worth a lot of money.
We all know that.
I think that's one of the very, very
big reasons.
Speaking about other things, which are very important,
we have another interesting little twist.
DZYNE, a cutting edge tech company, has successfully
delivered its autonomous Grasshopper cargo glider to the
(30:20):
United States Air Force.
This revolutionary cargo drone could change the way
the military delivers supplies with its ability to
fly autonomously and land on a variety of
terrains.
Diving into how this innovation could reshape
logistics and defense operations, there are infinite
possibilities.
(30:41):
But the question, ladies and gentlemen, is will
this be used for the power of good
or will it be used for something evil?
And so I always said to you, technology
is not good, technology is not bad, but
how we choose to use it, that makes
it so.
I think that is probably the most important
thing.
(31:02):
And I think that it's something that most
people don't realize.
They don't realize that we have choices.
And these choices are the ones that are
going to make a difference in your life.
(31:22):
Okay.
They're going to make a very, very, very,
very big difference.
And that difference could be whether you
potentially get defrauded or not.
I mean, I don't mean to say that
in a bad way, but it's the truth.
I mean, the thing is this, right?
The data that they have, we all know,
(31:44):
as we said, has been exploited before in
the past.
Now, we don't believe it's because of
a direct violation, but we also don't
have proof that what was done in
the past was handled in a professional
manner, or that it met the, let's
say, standards of our own
(32:08):
safety and security.
So whether we're talking about TikTok, whether we're
talking about this new autonomous flyer, I mean,
what's really going on with this stuff and
what changes are going to be made to
our world?
Here's one I really love.
A professor harassed over Assassin's Creed responds with
(32:30):
kindness.
So a university professor recently made headlines after
being harassed over a connection to the popular
game franchise Assassin's Creed. Instead of reacting with
anger, the professor responded with kindness and an
invitation to discuss the matter in an open forum.
The story highlights the power of compassion in
(32:51):
the face of online hostility and the importance
of fostering positive digital interactions.
And I think a lot of times people
want to go and attack.
And I think the reason they want
to go and attack, I'll tell you why:
it's because they just don't understand why.
And the why is because of something that's
(33:15):
going on, because of where we are.
And I think we're going to keep asking
those questions.
And so when we think about things like
(33:36):
this, it makes us think, right?
It makes us think about where we are
today, where we are tomorrow.
Right.
And so these things that we're talking about
right here, right, I think they
(33:58):
are a very interesting sign that we
have power in our lives.
And I know that a lot of you
are probably saying that it's okay.
But then again, it might not be okay.
(34:21):
And the not-okay is happening.
I'll tell you why: because of something
that people are choosing to do.
Okay.
And this choosing is happening, not because of
their own choice, but because of something somebody
(34:42):
else did.
And I know, guys, that it's important
to understand what it is, why things are
a certain way and why people are, let's
(35:03):
say, doing things a certain way.
And I know that a lot of
you out there are probably saying to me,
John, this is not really the way
it should be.
And you're right, but sometimes we get
roped into things because of things that
(35:26):
are going on.
And that might be something that scares
you.
Okay.
It might be something that gets you
to look at life a completely different
way.
I mean, just maybe, but then again, it
also might change your
(35:48):
thought patterns.
And so this all happens because of
beliefs, because of what somebody
said or didn't say.
And I think that's a really big
thing right now, guys: we have to
(36:09):
be respectful, number one.
And number two, we have to know
what we're doing and why we're doing
something, right?
Like in the case of this professor, I
think it's utterly absurd that somebody would
go do this.
I mean, they just don't have any respect.
And I think in order to be in
the world today, you can get whatever you
want, as long as you are not rude
(36:31):
or disrespectful to another person.
You don't harm them, whether that's physically, mentally,
emotionally.
I think it's really important to live by
that particular credo.
And Visa is now developing AI agents to
use consumer credit cards.
I don't think I like this.
So Visa is developing AI agents that would
revolutionize, they claim, the way consumers use credit
cards.
(36:51):
These AI agents are designed to help users
make smarter spending decisions, manage their finances, and
even detect fraud more effectively.
This is a huge step forward, they claim,
in merging AI with everyday financial transactions.
And I don't know if I like the
idea of giving my credit card to an
AI agent and having that AI agent shop
for me.
I see that as a major, major, major,
(37:14):
major flag.
And maybe you're saying to me, hey, John,
you know, this is nice, but all I
know is that it's not personal
anymore, right?
And so now an AI bot gets to
know who you are, what you do.
I mean, I think that's a little impersonal.
I also think that's a breach of our
(37:34):
privacy.
So I'm not really on board with that.
And a French crypto entrepreneur's father was rescued
from kidnappers.
In a dramatic turn of events, the father
of a French crypto entrepreneur was recently rescued
from kidnappers after a ransom was paid.
Yeah, they should have never paid it.
This harrowing incident basically just kind
(37:59):
of helps highlight the growing risks faced by
those involved in the digital currency industry and
the need for increased security measures.
What does it mean for the future of
crypto related security?
Well, I think it means that we need
to develop more of a standard, right?
(38:19):
A standard where we can obviously do things
that we're going to feel safe about.
Okay.
And if we can feel safe about things,
then I know most people, okay, most people
will understand.
They'll understand that it's something
(38:44):
we need to have a respect for,
something that just doesn't happen once.
You know, having respect for people is not
overrated.
That's something I want to tell you.
It's not overrated, but many people don't have
(39:04):
respect because they don't they're fleeing in the
moment.
That's right.
I said they're fleeing in the moment.
And what that really means is that they
are acting impulsively, right?
You've all been to, let's say, a
store where, you
(39:25):
know, you check out and there are these
items to purchase at the register, these
impulse items.
I know that it's important to realize
(39:51):
these things, but I also know that if
we keep this alive, if we believe that
there is a good world out there, we
will see a good world out there.
But if we don't, then we will see
(40:13):
that world.
I think that's a very important thing that
I want to share with all of you
today.
And here's one that's interesting.
I think you'll find this intriguing.
A conservative activist sues Meta over AI-generated defamation.
We all know that AI has been
walking, you know, that fine line between
where it should be and where it shouldn't
(40:35):
be.
And that line is very gray right now.
It's not black or white.
A conservative activist has filed a lawsuit against Meta,
alleging defamation by artificial intelligence generated images.
This case raises important questions about the ethics
of AI generated content and the legal ramifications
(40:55):
of using AI to spread misinformation.
It's a pivotal moment for both AI and
online platforms in terms of accountability and content
regulation.
But the question is, how do we set
the standard?
How do we move forward?
How do we make sure there isn't bias
in the world?
I think that is a, a very, very
big thing for a lot of people.
(41:18):
And I think this scares some people because
they don't necessarily know what is
going on.
And maybe it's about where we're trying
to go, and why things are going
a certain way.
(41:39):
I think it comes down to the fact
that we need to understand what
people's mentalities are.
(41:59):
And so I know you're probably saying, I
know you're probably saying that things can go
a certain way.
And maybe you're confused because of a message
you're getting from society.
Maybe I'm just saying, but then again, maybe
(42:20):
you're not confused.
Um, right.
(42:46):
Maybe, and maybe you're wondering like
what's going on.
Right.
(43:10):
I know that you're probably saying to me,
John, this is like, so out there, but
it's not guys, the stuff that Meta is
doing.
I mean, they're just really pushing that edge.
Going back to school, getting my master's and
then my PhD, I feel that the world needs to
be educated about artificial intelligence and how to
(43:31):
use it for the greater good of all
concerned, because there's so many bad actors that
are taking the practice they have and accelerating
them with artificial intelligence.
And I think that is just so sad.
And ladies and gentlemen, Starbucks opens their first
3D-printed drive-thru store in Texas.
So Starbucks has opened its first 3D-printed
(43:52):
drive-thru store in Texas, showcasing how the
company is embracing new technology.
This innovative store allows for faster construction,
more sustainability, and a futuristic customer experience.
Um, could this be the future of retail?
I think it might.
I think it very well, uh, might be,
uh, that future.
So as I said, Starbucks has opened its
(44:13):
first 3D-printed store in the U.S. It's
located in Brownsville, Texas, along the U.S.
-Mexico border. The drive-thru-only
store was built using a robotic arm that
layered concrete in a 3D printing process. This
innovative move places Starbucks among the few major
(44:34):
retailers exploring 3D printing in commercial construction, which,
by the way, has primarily been used in
residential projects.
Now, while the company hasn't disclosed plans for
additional 3D-printed locations, the Brownsville store's unique
design, with its ridged walls, showcases the potential
of this technology despite the higher costs compared
to traditional construction methods. Experts believe 3D
(44:57):
printing could address labor shortages and improve construction
speed over time.
My question is how much does this machine
cost?
Um, and what do mistakes cost?
You know, maybe you're not familiar with this
and suddenly you have to learn how to
do something and now you don't know how
to do it.
I think that could be a huge, huge
problem.
(45:18):
But again, I, I feel more people, um,
will embrace this.
Um, I think if we can understand what's
going on, I think maybe just maybe, uh,
(45:39):
and maybe, um, I think sometimes
we don't get what we need because the
world is telling us that something can't be
done.
(46:00):
I think what technology is doing for us
in a very unique way is it's showing
us there are lots of possibilities, right?
I mean, how did Starbucks 3D print
that first store?
And I bet if we were to dive
(46:21):
in, okay.
First of all, the look of it is
kind of cool.
Has a very unique kind of look to
it.
Um, it probably went through a lot of
hours, right?
It was a COBOD BOD2 3D printer,
and it layered the concrete using a steel
nozzle to build the
(46:42):
structure that we have. The exterior walls were
printed in sections, as you probably would gather,
with spaces left for windows and for doors,
and that made this easier. Once the shell
was completed, human workers finished the store by
installing windows, adding a porch, and
(47:04):
completing other details.
Peri 3D Construction was responsible for the 3D
printing process, and this is according to Dezeen.
The resulting build is a drive-thru-only
location, 1,400 square feet, and has a unique
appearance with ridges on the walls, which are
the result of the printing process.
The question is, uh, and this is a
(47:26):
really good question.
How much money did it cost Starbucks to
print their first 3D store?
How much? Are you ready?
About $1.2 million.
How much will it cost Starbucks to print
(47:48):
their next 3D store?
Well, the first store, as I said, was
about $1.198 million.
Um, but as we move forward, the cost
(48:09):
should come down.
It basically costs around $535 per square
foot to build a quick-serve restaurant, which
comes to $749,000 for a 1,400
-square-foot building.
I see the, um, printing costs coming down.
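The comparison above is simple arithmetic, and a quick sketch makes the premium over a conventional build explicit. All figures are the ones quoted in the episode; nothing here comes from Starbucks directly:

```python
# Figures quoted in the episode.
sq_ft = 1400                 # drive-thru-only store size
conventional_rate = 535      # dollars per square foot, typical quick-serve build
printed_cost = 1_198_000     # reported cost of the 3D-printed store

# Conventional construction cost for the same footprint.
conventional_cost = sq_ft * conventional_rate

# How much extra the 3D-printed build cost.
premium = printed_cost - conventional_cost

print(f"Conventional build: ${conventional_cost:,}")  # $749,000
print(f"3D-printed build:   ${printed_cost:,}")       # $1,198,000
print(f"Premium:            ${premium:,}")            # $449,000
```

So the first printed store ran roughly $449,000 over a conventional quick-serve build, which is the gap John expects to shrink as the process matures.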
(48:33):
So you've got a 3D printer, right?
You've got printing supplies, you've got digital infrastructure
if that's there, and you have licensing and certification.
And I think more stores are going to
do 3d printing.
The question is like, where are we going
to see this most?
And I think we're going to see it
most in places like, uh, maybe you might
(48:57):
see it out West.
We might see it in some interesting places,
but I think it's going to be interesting
because where there's labor shortages, right?
So you let me know, like, you know,
where is, let's say, the worst labor
shortage for construction.
And if we were to look at that,
it happens to be in New Hampshire
(49:18):
and New Jersey, and those are in the
Northeast, along with some others like Maine,
Vermont, and Maryland. They have many
job openings, but people don't want to work;
either people have retired, they're still under
the pandemic mindset, or they want something bigger.
Uh, so there's lots of reasons and I
(49:41):
think there is work out there, but there
is also a mentality out there
that is basically, let's say, changing what
could be in our world.
And I think that's probably the biggest thing
I can say to you guys is that
(50:02):
what's in our world is because of who
we are, because of what we see because
of the lens that we see.
And I know that sounds like really, really
crazy, but I'm here to tell you guys
that this is something that is really, um,
really amazing and it's amazing because you just
(50:25):
don't realize, um, you know, what's happening.
You don't realize that these possibilities exist.
You don't realize that you could take technology
and build a building or technology and, um,
you know, do things we never did before.
Who knew that we could scan books and
(50:46):
get a summary of that book in just
a few minutes after it was read.
So I think AI is good for a
lot of reasons, but I do think a
lot of people are exploiting AI.
Uh, and another thing I want to talk
about today, it's kind of a last, uh,
kind of a last throw in for today.
And that is, uh, understanding the chimney
(51:08):
and the Pope and how it works.
I thought this would be a cool little
ending for today.
So the Sistine Chapel has a smokestack
that indicates whether a new Pope has been
chosen during a papal conclave.
Black smoke means no decision has been reached
after the vote.
While white smoke signals that a new Pope
(51:30):
has been elected.
The smoke comes from burning ballot papers in
a special stove. To create the different smoke
colors, chemical additives are burned in a second stove.
So, um, basically there's something pretty interesting in
(51:50):
the way the Vatican has had to resort
to chemistry to get its modes of
communication out about the election results.
During the week of the funeral, they
had a few people outside of Italy, but now all eyes
are on the copper chimney of the Sistine
Chapel, and the new Pope will be
(52:13):
elected once that chimney shows white smoke.
We do have our new Pope now.
The smoke comes partly from the burning
of the ballots, as I said; not a
hundred percent of it, but a good percentage
comes from the burning ballots in the special
stove in the chapel. But to color it white or
(52:34):
black, the smoke is mixed with a chemical
additive that's burnt in a second stove.
Traditionally, the Vatican produced the different colors by
burning wet straw for white and tarry pitch
for black.
Anyone who has ever made a bonfire
knows that damp grass will work for the
former, and the less responsible of you
(52:56):
will know that chucking old tires or
roofing felt into flames will turn the smoke
black, and, what's more, noxious, because it's then
full of sooty carbon particles that can clog
the lungs, which isn't really good and is very
carcinogenic.
So I think that's a very,
very big problem right there.
So the Vatican has now revealed, uh, what's
(53:17):
going on.
So for black, it uses a mixture of
potassium perchlorate, anthracene, and sulfur.
White comes from potassium chlorate, lactose,
and a conifer resin called rosin, which is
often rubbed on violin bows to increase
friction.
Um, it's interesting to imagine a team
(53:38):
of Vatican chemists laboring like alchemists to devise
these magic recipes, because what they really show
is that the Vatican is making plain smoke
bombs. A smoke bomb, like fireworks
designed to be particularly smoky,
combines easily burnt carbon-rich compounds,
such as sugar, with a so-called oxidizing
agent, which provides the oxygen for the combustion
(54:01):
reaction.
And so, um, people have asked me, you
know, John, what does that stove look
like that they burn the ballots in?
So, the stove is a very small stove.
It's not very big.
(54:23):
Um, and so they actually had to do
maintenance on the stove.
I don't know if you knew, but it didn't
work, and they had to get people
in to literally fix the stove.
And I thought it was interesting that
they had to do this because there was
a problem.
Um, basically, um, in Vatican City, Vatican workers
(54:45):
installed the simple stove in the Sistine Chapel
where the ballots, uh, would be burned.
Uh, apparently there was something wrong with the
old stove.
So, uh, basically they have two stoves:
one on the left and one on the right.
The one on the left, I think, is
where they put the ballots.
And the one on the right is actually
where they would have the chemical
(55:08):
that, you know, would make either
the white smoke or the
black smoke.
So, um, I think it's very interesting, you
know, how all this works and how they
used a very interesting, uh, method.
And I think it's very important to tell
you.
So who is our new Pope?
Well, our new Pope is Robert
(55:29):
Prevost, Pope Leo XIV, and he is
the first American Pope.
So, um, really interesting to share that with
you and how they've been using this method
for centuries to let people know that they've
elected a new Pope.
So the question is, you know, why did
they need to install a new stove, uh,
(55:53):
at the Sistine Chapel?
Well, um, basically, um, they had a problem
with the other one.
And, um, so they installed a simple stove
in the Sistine Chapel where the ballots are
burned during the conclave to elect a new
Pope, and they began taking measures to block
any electronic interference with their deliberations.
(56:15):
Um, and that was very important because
electronic devices were not allowed.
It was basically a sacred space where
masses happened.
And, um, yes.
So the Sistine Chapel blocked
electronic interference.
I thought that was a very interesting thing,
(56:37):
how they did that.
Uh, no cameras are allowed in the Sistine
Chapel.
Uh, photography is prohibited to protect
the delicate frescoes from
potential damage caused by camera flashes and to
maintain a sacred space of contemplation.
But, um, the thing about it is the
whole secrecy around how, you know,
(56:59):
they've done all these interesting things.
And I think that's very, very important
to understand.
The chapel is shielded from outside interference by
high-tech security measures, canonical restrictions, and
unprecedented logistical rigor.
And it has been cleared of all
electronic devices, and communication systems have all
been blocked, shrouding the election of the
(57:19):
new Pope in secrecy.
So I think that's really, really amazing, uh,
that they've gone through those levels and that
they had to install an entirely, uh, new
stove.
I never knew that they actually had to
have two stoves.
One stove is for the ballots, and the
other stove is for the chemicals
that determine what type of smoke
(57:44):
it will be, whether it be white or whether
it be black. And again, they vote usually
twice a day.
And if the vote comes back without,
I believe it's, the two-thirds majority,
then the smoke goes up black.
Once a candidate gets the two-thirds majority for
Pope, then it goes up white.
Uh, but it's very sacred and it's a
very unique process, and people are
(58:06):
almost, like, waiting with bated breath. Well,
ladies and gentlemen, I'm John C.
Morley, serial entrepreneur.
It's always a privilege and pleasure to be with
you guys here on The JMOR Tech
Talk Show. Please do check out believemeachieved.com
for my amazing, inspiring creations.
I'll catch you real soon.
Everyone be well.