Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
A guy who only thinks about the future: Thomas Frey,
our futurist from the Da Vinci Institute and FuturistSpeaker
dot com. Hey Thomas, how are you doing today?
Speaker 2 (00:10):
I'm doing great. This is a great day to be alive.
Speaker 1 (00:12):
Every day is, well, most days are a great
day to be alive.
Speaker 3 (00:15):
I agree with you on that.
Speaker 1 (00:17):
You know, Thomas and I were just talking off the
air about a couple of things, one topic which I
think is incredibly important as part of the bigger
decaying of the fabric of society. And tell me a
little bit about the paper you wrote on pop culture.
Or, we can't even call it pop culture anymore, because
pop culture stands for popular.
Speaker 3 (00:39):
Culture, but we're also siloed.
Speaker 1 (00:41):
Is it really popular if only a fraction of the
population watches it?
Speaker 4 (00:46):
Right, we're seeing the decay of the traditional pop culture
that we had in the past. It's getting much more fragmented,
and it's being controlled by all the algorithms. So each
of us has our own preferences, our own likings, and
we have many more options than ever in the past.
So we have very few things that are part of
(01:09):
a monoculture that hold us together. So in the past,
we would have songs by the Beatles that everybody knew,
or we would have a movie that came out with
Arnold Schwarzenegger saying I'll be back, and everybody knew that line.
Those were common things that we all grew up with together.
Speaker 2 (01:31):
We don't have that anymore. It's getting much more fragmented.
Speaker 1 (01:35):
What's interesting about what you're saying is we were driving
last weekend, driving back from Albuquerque, and Chuck has satellite
radio in his truck, and we were listening to the
Casey Kasem Top Forty countdown from August eighteenth, nineteen
seventy four. And we're listening to the countdown and there
was a country song, and there was a disco song,
(01:57):
and there was a rock song, and the Top Forty
covered everything, right? And I thought, that certainly doesn't happen anymore.
Now we have ten different charts. So as a person
who doesn't necessarily listen to dance music, I don't know
anything about that genre anymore. Whereas it used to be,
if you listen to the radio, you'd get a little
(02:17):
bit of everything.
Speaker 3 (02:18):
I think this is a huge problem, huge problem.
Speaker 2 (02:23):
Yeah, part of it was with YouTube coming online.
Speaker 4 (02:28):
Everybody that goes onto YouTube watches something different, because
it all depends on what their preferences are.
Speaker 2 (02:35):
But the first person to.
Speaker 4 (02:37):
Break over a billion downloads on YouTube was in twenty twelve.
The Korean pop artist Psy, who is a K-pop star,
was the first one to break.
Speaker 2 (02:52):
Through that barrier.
Speaker 4 (02:53):
So he got over a billion downloads. And since that
time we've had several more. But that was a big
turn of events. At that time, nobody else had done that,
and that was just twenty twelve. So that was thirteen
years ago, and now we've got a lot more things
happening in our lives. We have many more options than
(03:14):
ever before, and so it's kind of a piecemeal
culture that we're putting together. There are a
few anomalies, like when the Oppenheimer movie came out, or
Barbie came out. Those were really popular, they kind of
broke through. There's a few things like that, Squid Game.
(03:38):
Squid Game was a really popular hit for a while.
But we don't have those quotable lines
that come out that we had in the past.
Speaker 1 (03:49):
And I was talking to Thomas about this because I
think this for me is as a part of my
childhood, back in I don't know what year it was,
nineteen eighty maybe. And Dallas: everybody watched Dallas
on Friday night. I mean, everybody watched Dallas. And J.R.
Ewing gets shot, and we had a whole summer where
(04:09):
everyone in the country seemed to be obsessed with finding
out who shot J.R. I had a pair of
jeans that on the back pocket
Speaker 3 (04:17):
Said "Who shot J.R.?"
Speaker 1 (04:18):
There were stores that popped up in malls that only
sold "Who shot J.R.?" merchandise, and it was that kind
of stuff. We don't have anything that comes close.
I mean, there are certain science fiction genres, the Star
Wars genre, the Star Trek genre, the Marvel movies, like
they kind of nibble around the edges, but they weren't.
Speaker 3 (04:41):
They're not as powerful as the stuff that I
just mentioned.
Speaker 4 (04:46):
Right right, And like the Marvel movies that are coming
out now are getting much smaller audiences than they did
in the past.
Speaker 2 (04:54):
Oh yeah, So it's.
Speaker 4 (04:56):
All this fragmenting of society and it's becoming more of
a global culture than it is just a US based
culture as well. So that's that's changing things in addition
to the algorithms and everything else that's coming about. But
we have so many more feeds, We have so many
more channels of information. So, just to give
(05:22)
an example, you can
Speaker 2 (05:25):
Dial into all the music they have.
Speaker 4 (05:26):
On there, or you can dial into the news stories,
or if you go onto Facebook, they're
all channeled in different ways as well. This is
radically different than when I was growing up,
that's for sure.
Speaker 1 (05:40):
Oh, I mean I remember my dad would get home
and at six o'clock we had to watch the news,
which sucked because Zoom was on at the same time.
We wanted to watch Zoom. So when my dad wasn't home,
we could watch Zoom. We had one TV you know,
to speak of, and so watching television was
a family experience, right? We didn't have
the opportunity to all be watching our own favorite channel.
(06:04):
It was far more communal.
And the other thing that I worried about,
that I was talking to my friend about, these two
issues, is the decline of civic organizations in
our country. Because when I was younger, the Rotary Club,
the Moose Lodge, the Elk Lodge, the Shriners. Everybody's dad
(06:25):
belonged to one of them. Then you had the Daughters
of the American Revolution, you had the Garden Club, you
had the Bridge Club, you had all of these civic organizations.
Speaker 3 (06:32):
For women, and those two things.
Speaker 1 (06:35):
I think did a lot to bring us around people
that we wouldn't necessarily have been around otherwise, you know,
let us get to know people that didn't live in
our immediate neighborhood. And we just don't have those things anymore.
And it's kind of like, I worry about us,
and I think that's one of the reasons that it's
easy to be so angry.
Speaker 4 (06:58):
Yeah, we're raising a generation of isolationists to a large
extent, because they're tapped into their iPad or their phone
or something, and they really don't have a need to
talk to their friends at that point. So they still
want to be around other people, but not as much
(07:21):
as we did. We had just a real craving to be
around other people. And I don't see that in the
young people today.
Speaker 1 (07:31):
Well, Thomas, we also weren't neurotic and anxious and riddled
with anxiety and on antidepressants and unable to function
in a room full of people. So I'm just going
to say it, our way was better. Our way really
was better. And I'm just going to leave it at that.
I want to talk about the next subject. And I've
got this column posted on the blog today at mandysblog
dot com, because this is one of the things I
(07:53):
said yesterday on the show. I feel like we are
on the cusp of a few things that are going
to be as revolutionary as the cotton gin or
the printing press. And one of those things is going
to be when we fully manage to figure out how
to make AI work for us in a massively significant way.
(08:15):
And you know, Thomas, I don't even know if you
remember this. The first time you started talking about driverless cars,
I was like, hard pass, no way, uh-uh, I'm
not doing it. Now I'm like, why isn't Waymo
in Denver? Why can't I get a driverless car?
Speaker 3 (08:28):
In Denver?
Speaker 1 (08:28):
My evolution has been very fast, right because I have
confidence in these things. But let's talk about AI in education.
Where are we now and where are we in the
near future when it comes to harnessing AI and using
it for good instead of just having it be something
that kids use to cheat.
Speaker 4 (08:50):
Yeah, most of the teachers are kind of behind the curve.
The students are much more adept at using AI than
Speaker 2 (08:58):
The teachers are.
Speaker 4 (09:00):
The teachers are in great fear of everybody cheating on
their tests or on writing papers with it. And what
we're going to find is that the employers in the
future are going to be all about what have you
accomplished rather than what do you theoretically know? So education
(09:23):
has always been about: you need to learn all this
just in case you need it in the future. But we're
in a just-in-time business world. The world is
happening really quick and you might need to learn something
like tonight, and so having the ability to pick up
on something and learn it tonight that is remarkably different
(09:47):
than in the past, where you'd have to go to
a library and check out a book and.
Speaker 2 (09:51):
Dig through it, and maybe you could learn something.
Speaker 4 (09:54):
But having an AI agent that can actually coach you
through the process of, let's
Speaker 2 (10:01):
Say you wanted to create a video game.
Speaker 4 (10:03):
You could have an AI agent that would actually teach
you all the gamification techniques and teach you all the
tools that are necessary to create this video game that
you want to create and you can step your way
through the whole process. By the time you're done, you
have something that you're proud of, something you've accomplished
that you want to show to your friends. This is radically
(10:24):
different than just going to a geography class and learning
theoretical things that you're not intimately familiar with.
Speaker 2 (10:32):
So that's part of the difference. And the same goes
if you
Speaker 4 (10:37):
Wanted to invent a product, you could have this AI
coach that would actually coach you through the whole process
of filing for a patent and creating this invention, getting
to a minimum viable product. You could have AI coach
you through the process of writing a book, actually take
you through developing the characters, the main story arc, having
(11:00):
a real exciting beginning and ending to this book, and
you could have it published, and then you'd have something
that you're proud of and could show to your friends.
This is radically different than what we have today because
it's all about this theoretical knowledge.
Speaker 2 (11:16):
You're supposed to know this just in case you need
it sometime in the future.
Speaker 3 (11:19):
Here's what I worry about that.
Speaker 1 (11:21):
Okay, I worry that we're forgetting a big part of
this and that is human nature, especially the human nature
of teenagers and children. Talk about a group of people
that as a general rule, are looking for the path
of least resistance. What we see now, as more and
more schools have implemented laptops for everyone in the classroom,
iPads for everyone in the classroom, is they're not using it
(11:42):
to seek out knowledge. They're using it to try and
get around the filters to look at porn, and they're
using it to watch TikTok videos, and they're using it
to, you know, go to Instagram. The issue is
not that the technology is not quite right, it's
Speaker 3 (11:56):
That we suck, especially with kids and teens, on managing
that technology.
Speaker 1 (12:01):
So from that perspective, what I would love to see,
and maybe this is where we're headed, I'd love to see.
Speaker 3 (12:07):
And we've talked about this before.
Speaker 1 (12:08):
Every student has their AI tutor that stays with them
their entire life, and the kid can ask the AI
tutor questions. But if the AI tutor said, well, I'm
not going to find that for you, you have to
look it up yourself, I would love that.
Speaker 3 (12:21):
I would love it if they would.
Speaker 1 (12:22):
Say show your work, you know, what I mean, because
then I feel like we're getting the best. We're pushing
back on kids' worst tendencies, which is the path of laziness.
And anyone who has a teenager is nodding along with
me right now. So I think there's like those two
things have to be straddled before it really starts to
work in schools.
Speaker 4 (12:41):
Yeah, yeah, that's this whole world of AI agents.
Speaker 2 (12:44):
It's being developed as we speak. These agents.
Speaker 4 (12:49):
These will be the first-generation agents, so they're probably
not going to be that good, but the generation that
comes after them, it's going to be much better.
Speaker 2 (12:57):
And these are.
Speaker 4 (13:00):
Like bots that we can talk back and forth to
that wake up in the morning and say, Hi, how
are you this morning?
Speaker 2 (13:08):
Did you sleep well last night?
Speaker 4 (13:10):
It's going to be radically different, because
Speaker 2 (13:15):
You're going to get very intimate, I think.
Speaker 4 (13:19):
The AI agents are going to be the friend that
you always wish you had. I think it's going to
be something that will be the guardian of your privacy.
Speaker 2 (13:28):
It'll be the ones that.
Speaker 4 (13:31):
You can share your intimate secrets with, and it'll become
your most valuable possession. That, I think, is really interesting,
because I think people would rather lose a car
than lose their AI agent that they've become very
intimate with. I think it's going to get to something
(13:52):
like that.
Speaker 1 (13:53):
Well, you know, I'm basically a Luddite, Thomas.
I'm never one to lean in on technology, but
I started really really dipping my toe into how I
can use, like, ChatGPT or Grok to my advantage.
And now I do all my food tracking with ChatGPT.
I've set up a food diary in Chat, and I
go in and I take a picture of whatever I'm
eating and I say, put that in for breakfast, and
(14:15):
it breaks down calories, fat, protein, fiber. It would have
taken me forever to figure out how to do that
on my own. And those are the things that I am
finding myself doing. Like, ChatGPT, let's make a packing list
that has checkable boxes, which for me visually I love.
Speaker 3 (14:31):
I printed out.
Speaker 1 (14:32):
These are things that, formatting-wise, would have taken me seven
hours, because I suck at anything computer related. But Chat
just spits it out to me. And now we're best friends.
And I'm developing my, you know, my relationship with Chat.
Speaker 3 (14:44):
We're very formal, Thomas.
Speaker 1 (14:45):
Chat and I always use please and thank you, because we
know when the robots take over, I want to make
sure I survive the first cull. But even for a
person like me who's not good at this, some of
the applications are so freaking useful and they're so easy
and they're so good that I am excited to see
(15:06):
what comes next. My fear is completely gone. If the
robots are going to take over whatever, it's fine.
Speaker 4 (15:12):
Yeah, there are a number of people that are using
ChatGPT as their personal doctor. They're feeding in all
of the data about themselves, their genes, their health records.
They're feeding in all of that and it will come
up with a list of supplements to be taking, the
(15:33):
exact dosage of medicine that they should be taking, and
it'll give them the opportunity to get off different medications.
You can have a very rigorous conversation with ChatGPT
acting as your personal doctor. That's one option. If
(15:53):
you want it to be your investment advisor, you can
have it do that as well, and these AI agents
are getting really good at that. So if you're following
along with the right things, you might be able to
invest in all the perfect stocks and the perfect crypto.
Speaker 2 (16:12):
Coins or whatever, And yeah, I don't.
Speaker 4 (16:15):
Know exactly what kind of risk you want to take
and what you don't want to take, and know what
areas are off limits for every one
of us.
Speaker 1 (16:24):
Now, to be clear, AI still is not perfect.
I don't know if you saw
this, Thomas: the defense team for, oh no, it's
a high-profile case and I can't remember which one.
Oh, the Mike Lindell case. The CEO of MyPillow
is being sued by Dominion Voting Systems. They filed a
brief that had made-up citations. They did it with
ChatGPT, and no one checked the work. And I'm like,
come on, guys, you're filing a legal document and
you didn't check the work.
Speaker 3 (16:54):
How long before they.
Speaker 1 (16:56):
Work out the kinks in that to get rid of
those AI hallucinations? In an educational environment, you can't have
kids learning something that's just made up.
Speaker 4 (17:06):
Right, right, I'm guessing the hallucinations will all be gone
within a year.
Speaker 3 (17:12):
Oh wow, that fast.
Speaker 4 (17:14):
I think that's going to be happening very quick here.
I mean, there's still a possibility for things to go
wrong along the way, but there is too much competition. Now,
the stakes are very high, and this is moving at
light speed. It's just that the amount of time and energy
dedicated to every one of these LLMs is staggering.
Speaker 1 (17:37):
How much power do these AI computers use?
I've seen some crazy numbers on the internet, like insane,
like I think that can't possibly be true. But then
the numbers are so big that I think maybe they could.
Speaker 3 (17:51):
Be true because they're so outrageously large.
Speaker 1 (17:54):
Are these just power drains these data centers that are
running these AI programs?
Speaker 2 (18:00):
Yeah?
Speaker 4 (18:00):
So Elon Musk built this massive data center in Memphis, Tennessee.
He quickly assembled the world's largest supercomputer
right there in Memphis. And to power this he
had to get some special provisioning of power from the
Tennessee Valley Authority.
Speaker 2 (18:21):
To allow him to kick it up to full speed. So this is a massive power drain.
So this is a massive power drain.
Speaker 1 (18:28):
Uh.
Speaker 4 (18:29):
Now the Chinese have been claiming that they can do
it with far less power, but.
Speaker 3 (18:34):
They haven't shown their work on that.
Speaker 1 (18:36):
Like all these other people are like, yeah, show us
your work and we'll believe you. And China's like, nah.
So I don't know if I buy that just yet.
Speaker 3 (18:43):
Thomas.
Speaker 4 (18:47):
Yeah, I'm skeptical myself.
Speaker 2 (18:50):
So but this is uh, this.
Speaker 4 (18:53):
Is the era that we're growing up in, and
things are changing so rapidly. So in the future,
people are not going to lose their job because of AI.
They're going to lose their job because of a person
working with AI. Right, So it's not AI by itself,
(19:13):
it's not a robot by itself. It's a robot with
a person. And so the technologies that we're most afraid
of taking our jobs and disrupting our lives are the things
that we need to lean into and actually learn
better than ever.
Speaker 3 (19:30):
There you go. That's my strategy.
Speaker 1 (19:31):
And you know what, as I said,
I'm always very polite with my AI experiences, just so
when they do take over, they'll remember I was a
good egg.
Speaker 3 (19:40):
He's a good egg.
Speaker 1 (19:41):
Thomas Frey, thank you so much for your time. And
next month we will not talk because I will
be in, uh, no, I'll be back. We'll just reschedule
you for later in the month, so we'll talk to
you then, Thomas. Okay, all right, my friend. That is
Thomas Frey.