Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
My friend Thomas Fry is here, and together we are going to
talk about the future. Now, Thomas, you sent me a
column today, and immediately I was like, can I go
ahead and sign up for that digital copy of my life?
I mean, can we go ahead and jump on that?
Can I be a beta tester for the digital copy
of myself and my life? Let's start there. First of all,
(00:21):
good to see you again, my friend.
Speaker 2 (00:23):
Great to see you too. Yes, I think we're going
to have this very soon.
Speaker 3 (00:28):
You can be wearing smart glasses and record everything that
you see, and it can also record everything that you hear,
and then with a few sensors, you'll be able to
record everything that you taste, touch, and feel. You can
record your entire life experience, and then you have a
digital copy of you that you store in your personal cloud.
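
[Editor's note: to make "a digital copy of you in your personal cloud" concrete at the data level, here is a minimal Python sketch. Every name in it, from LifelogMoment to store_moment, is invented for illustration; no real product's API is being described.]

```python
# Hypothetical sketch: one "moment" of a recorded life, as it might be
# stored in a personal cloud. All names here are invented for illustration.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class LifelogMoment:
    timestamp: float   # seconds since epoch, from the glasses' clock
    video_ref: str     # pointer to the video clip in personal storage
    audio_ref: str     # pointer to the matching audio clip
    senses: dict       # readings from the extra sensors (taste/touch/feel)
    transcript: str    # speech-to-text of what was heard

def store_moment(moment: LifelogMoment, cloud_path: str) -> None:
    """Append one moment to a local file standing in for 'your personal cloud'."""
    with open(cloud_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(moment)) + "\n")

store_moment(
    LifelogMoment(time.time(), "vid/0001.mp4", "aud/0001.wav",
                  {"skin_temp_c": 33.1}, "good to see you again"),
    "lifelog.jsonl",
)
```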
Speaker 1 (00:50):
So here's the thing, because I've got
to tell you, this would be a game changer in
marital arguments, right? Because most marital arguments happen because someone
heard something differently than the other person thought they said it.
Speaker 3 (01:04):
Right.
Speaker 1 (01:04):
So can you imagine being in a firm
discussion with your loved one and saying things like, no,
that's not what I said, and you're like, digital Mandy,
what did I say? And then digital Mandy, like
in my world, Thomas, digital Mandy would shoot out a
hologram of me actually saying whatever it was that I said,
(01:25):
and therefore, boom, I win the argument. What I'm afraid of, actually,
is that I'd find out I'm wrong way
more often than I think I'm right. So, I mean,
how far away are we from some of this stuff?
We've already got the glasses, but being able to sort
of instantly data mine what's been recorded, that is the
(01:46):
challenging part, isn't it?
Speaker 2 (01:47):
Yeah, that'll be part of it too.
Speaker 3 (01:52):
I mean, once we record all of this and put
it into some personal cloud, then it's whatever interface we
come up with to interact
Speaker 2 (02:01):
with that cloud, right.
Speaker 3 (02:04):
I mean, the article talks about having a voice in
your ear. Well, what if it's just instantly recalling
Speaker 2 (02:13):
it, because you have that mind-to-mind interface? That
would be much more efficient.
Speaker 1 (02:18):
A mind-to-mind interface sounds like something that has to be implanted
in my brain, and I'm not sure I'm ready for
that yet. So, Thomas, AI at this point. I'm now
looking at AI programs that can look at the longer
version of the YouTube show that I do with my
friend Deb Flora and pull out snippets so I can
have them ready to go for social media. And one
of the most shocking things about one of these programs
(02:41):
that I'm looking at: it can go through, and it says,
let me pick the best parts of your show. And
you're thinking to yourself, how does the AI know if
it's the best? The reality is, it does
a pretty dang good job. So now we're talking
about AI being able to think about whether something is a
(03:02):
great point, or it's funny, or what makes a good clip. I mean,
the speed with which this stuff is happening is amazing, right?
I mean, it's crazy what's happening now.
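
[Editor's note: the clip-picking tools described here are proprietary, but the shape of the task can be sketched: score each transcript segment and keep the top few. The scoring heuristic below is invented purely for illustration; real tools use trained models.]

```python
# Toy sketch of AI clip selection: rank transcript segments by a crude
# "engagement" heuristic and keep the top k. The heuristic is made up
# purely to illustrate the shape of the task.
def engagement_score(segment: str) -> float:
    score = 0.0
    lowered = segment.lower()
    score += 2.0 * lowered.count("[laughter]")   # humor cue
    score += 1.0 * segment.count("?")            # questions spark curiosity
    score += 0.5 * sum(w in lowered for w in ("amazing", "shocking", "crazy"))
    return score

def best_clips(segments: list[str], k: int = 3) -> list[str]:
    return sorted(segments, key=engagement_score, reverse=True)[:k]

segments = [
    "So here's the shocking part... [laughter]",
    "Let's read the sponsor message.",
    "What happens when your clone dates for you?",
]
print(best_clips(segments, k=2))
```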
Speaker 3 (03:13):
This is ramping up at an exponential pace, and so this
is giving us all these extra capabilities that we never
knew existed in the past. So the things that we
can accomplish in a single day are just going to
ramp up dramatically. That's what I find really interesting, because
(03:36):
a lot of these things we never even knew we
could possibly consider in the past, and suddenly they're at
our doorstep, ready to be unfolded.
Speaker 1 (03:46):
You know, there's a lot of fear right now, and
we've talked about this, in terms of AI replacing jobs
and AI replacing people, and I think that for some industries...
I actually have a video from MrBeast, who is
by far the most popular YouTuber in the world, and
in it he says AI is going to be the
death of the YouTube creator, because people are no longer
(04:06):
going to have to fund the kind of videos that
he's been funding, and they can just use AI to
create them. But to your point, aren't we looking sort
of shortsightedly at what we don't know is going to
happen next? Does that make sense, the way I just asked that?
Speaker 3 (04:23):
Yeah. What you're really asking is, do we
still need the human in the loop?
Speaker 2 (04:32):
Yeah? And how can we do things impersonally?
Speaker 3 (04:40):
I think for a long time we'll still know the
difference between whether humans are involved or not. Yeah,
but I think we might cross that Turing test threshold
in the not-too-distant future. But things are changing
at an astronomical pace right now, and so it's
(05:02):
really tough keeping up with all this shifting and changing
going on.
Speaker 1 (05:06):
Well, let's go from this conversation about digital clones into
your three rules of exponential capabilities, because I feel like
that's kind of what we're talking about, right, Like this
is taking us in a direction that we can't wrap
our head around. Just like I'm guessing that people who
lived on a farm in the agricultural period of the
(05:27):
world before the Industrial Revolution probably had no way of
understanding what the Industrial Revolution was going to unleash on society.
Are we kind of at that same precipice now?
Speaker 3 (05:38):
Yeah, I'll just mention one other thing about digital clones
Speaker 2 (05:43):
first.
Speaker 3 (05:44):
The digital clones will enable you to, well, they will go
on a date for you, to test out the other person.
So two digital clones will be testing out dating.
Speaker 1 (05:57):
No, no, no, that is not a thing. We're not
going to do that. If I show up for
coffee and someone's digital clone shows up, I am leaving
right away. I'm like, nope, no.
Speaker 2 (06:12):
Not like that.
Speaker 3 (06:13):
But they can date each other online and you don't
have to be involved at all. So they date a
thousand different people and pick the very best two or
three that you should test on personally.
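
[Editor's note: the "clones date a thousand people, keep the best two or three" idea is, mechanically, a simulate-score-rank loop. A toy sketch under that assumption, with random numbers standing in for a real compatibility model:]

```python
# Toy simulation of clone-to-clone dating: run many simulated dates,
# score each, and surface only the best few matches for a real-world date.
# The scoring is random noise here; a real system would model compatibility.
import random

def simulated_date(candidate: str) -> float:
    return random.random()  # stand-in for a learned compatibility score

def shortlist(candidates: list[str], keep: int = 3) -> list[tuple[str, float]]:
    scored = [(c, simulated_date(c)) for c in candidates]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:keep]

candidates = [f"candidate_{i}" for i in range(1000)]
for name, score in shortlist(candidates):
    print(f"{name}: compatibility {score:.2f}")
```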
Speaker 1 (06:25):
Oh wait, so it's all in the digital world.
They're all dating, right? Can I get a report
from my digital clone? Like, oh, I see you had
a date with Bob, tell me about it? Is my
digital clone gonna be like, girl, he is not for you,
and here's why? I mean, what is that exchange
even gonna sound like?
Speaker 2 (06:42):
You know, it'll sound just like that. Yeah, there you go.
Speaker 1 (06:47):
Girl, Bob was a no. Bob is a no. He's a
hard pass on that one. Well, that actually sounds okay.
But then how does that get used against you? Like,
if you're going for a job and they're like, I'm sorry,
your digital clone is going to have to come in
and do the work, you know, for three days, and
we'll just test out your digital clone to see how
it works. What if my digital clone is an idiot?
(07:08):
What if they have no idea what they're doing and
they prevent me from getting that job?
Speaker 3 (07:13):
Yeah, there's a lot of unexplored territory, so we will
have to test these things out
Speaker 2 (07:19):
to make sure they all work right.
Speaker 3 (07:22):
And there's no end to all the things that
are going to go wrong between
Speaker 2 (07:26):
now and then.
Speaker 1 (07:28):
Well, I did get this text message: Mandy, digital Mandy
needs rewind sound effects. That would be pretty cool. I
feel like this person asks a better question, though: Mandy,
most of this new technology sounds fantastic, but how do
we power it, and where does all the data go?
How secure will it be? Aren't we putting the cart
before the horse? And that is a genuine concern. I
(07:49):
don't need everybody having access to my entire life. I mean,
that's none of anybody's business. I don't need anybody to have
access to me flossing my teeth or standing there in
my underwear or any of the other millions of things
that I do every single day. So what does that
look like? And in terms of data storage, isn't that
kind of a problem that we're already running into?
Speaker 3 (08:09):
Yeah, I think you have to have your own personal
cloud storage. So it's your own personal cloud that
you store all of your life's details in, and you guard
and protect that, and you grant minimal access to whomever
you want to. But it's very much discretionary on
(08:31):
your part. So it's not like the storage that we
have nowadays, where when you store something in an AI system,
everybody has access to it.
Speaker 2 (08:42):
That's that's not going to work.
Speaker 3 (08:45):
We need a lot of protections in place,
and you're hitting on all the right pieces.
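
[Editor's note: "grant minimal access, at your discretion" is essentially owner-controlled, deny-by-default access control. A minimal sketch, with invented names and no reference to any real cloud service:]

```python
# Minimal sketch of discretionary access control for a personal cloud:
# the owner grants narrow, revocable permissions; everything else is denied.
class PersonalCloud:
    def __init__(self, owner: str):
        self.owner = owner
        self.grants: dict[str, set[str]] = {}   # grantee -> allowed scopes

    def grant(self, grantee: str, scope: str) -> None:
        self.grants.setdefault(grantee, set()).add(scope)

    def revoke(self, grantee: str, scope: str) -> None:
        self.grants.get(grantee, set()).discard(scope)

    def can_read(self, who: str, scope: str) -> bool:
        # Deny by default; the owner always has access.
        return who == self.owner or scope in self.grants.get(who, set())

cloud = PersonalCloud("mandy")
cloud.grant("spouse", "calendar")
print(cloud.can_read("spouse", "calendar"))   # True
print(cloud.can_read("spouse", "flossing"))   # False: never granted
```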
Speaker 1 (08:51):
I mean, it sounds like it would be one of
those things that you would want your own little dedicated
server for, rather than outsourcing it to a server farm.
Speaker 2 (09:02):
Right right, Yeah, so you have to have that.
Speaker 3 (09:07):
It might be in a server farm, but it's only
accessible by you.
Speaker 1 (09:11):
Right, right. Okay, now we're talking. That sounds far
more reasonable to me. So, I mean, are we on
the horizon in the next five years with stuff like
this, or is it further out?
Speaker 2 (09:26):
Thirty, I think.
Speaker 1 (09:28):
Oh wow. Because I love the thing in your article where you use
the example of being in a business meeting with a
client, and they say, yeah, in our last meeting six
months ago, remember we talked about this, and you have no
idea what they're talking about. For me, that would
be a godsend, just to be able to stay
on top of those little things in my life. I'm
not kidding, Thomas. Before you started coming on the show,
I was a complete Luddite. I was like, I'll never
(09:49):
have a self-driving car. Now I'm like, can I
have one now?
Speaker 3 (09:52):
Right?
Speaker 1 (09:52):
So you're turning me, Thomas, you're turning me. Let's talk
about the laws of exponential capabilities before we run out
of time here, because this is very interesting to me.
Speaker 3 (10:02):
The first law is: with automation, every exponential decrease in
effort creates an equal and opposite exponential increase in capabilities.
Speaker 1 (10:10):
What does that mean?
Speaker 3 (10:13):
So as things get easier to do, you're just going
to do more of them, and you're going to be
much, much more capable, because you can get so many
more things done just because they become easier to do. Now,
I put these together ten years ago, and it
seems like they all still hold up today.
Speaker 2 (10:35):
So I thought that was kind of neat.
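
[Editor's note: read literally, the first law says capability scales inversely with effort. A toy calculation under that assumption, with made-up numbers: if effort per task halves each year, tasks per day double each year.]

```python
# Toy illustration of law one: capability ~ 1/effort, so an exponential
# drop in effort per task yields an exponential rise in tasks per day.
# All numbers here are invented for illustration.
hours_per_day = 8.0
effort_per_task = 4.0  # hours per task today

for year in range(5):
    tasks_per_day = hours_per_day / effort_per_task
    print(f"year {year}: {effort_per_task:.2f} h/task -> {tasks_per_day:.0f} tasks/day")
    effort_per_task /= 2  # assume effort halves each year
```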
Speaker 1 (10:36):
Well, I mean, using the example of now: you know,
five or six years ago, if you were a person like
me with limited tech knowledge, you had to hire someone to
build a decent website. Well, now there are AI platforms
that can take an idiot like me and walk me
through the entire process, and AI handles all of the
little tricky coding things that I don't know how to do,
(10:57):
and you can build your own website. So is that
kind of an example of law number one?
Speaker 2 (11:02):
That's a great, great example.
Speaker 1 (11:04):
Okay, now let's go to law number two. And this
is: as today's significant accomplishments become more common, mega accomplishments will
take their place. What are we talking about here?
Speaker 3 (11:19):
Well, let's think about the Crazy Horse monument
up in South Dakota. I've been joking around that
Speaker 2 (11:27):
we'll be able to
Speaker 3 (11:28):
3D print it before they ever finish the original one.
Speaker 1 (11:33):
And I love that monument. But that's funny. That is
really funny, Thomas. That's well done.
Speaker 2 (11:38):
So, but I
Speaker 3 (11:40):
see us taking on massive, huge projects, because as we become
much more capable, we just have to set our sights
higher and we can get so many more things done.
So rather than making a statue that's ten feet high,
we make it one hundred feet high, two hundred feet high,
even a thousand feet high.
Speaker 2 (12:00):
Yeah, the
Speaker 3 (12:02):
buildings that we're going to start creating are going to
be massively intricate and complicated, in ways that we can't
even imagine right now. They're not just going to
be square boxes set on the street corner.
Speaker 1 (12:15):
Well, I will tell you the best example of this
that I can think of is going back in time
to the first modem that Mandy Connell ever owned. And
I remember this clearly, Thomas. The first day
I got 56K speed on my modem, I
thought I had literally gone to heaven. I had died
(12:35):
and gone to heaven. And now, if an entire film
doesn't load on my phone in three minutes, I'm mad, right?
So those kinds of things, where our
expectations were so low, and now, on the
other side, it's like, if I can't get it instantly
on a mobile device that I hold in my hand,
what are you even doing? Our expectations have gone
(12:57):
from oh my gosh, I got 56K, to
give me my film in three seconds. I think that's
a good example of that.
Speaker 3 (13:04):
Yeah, when you talk to your computer or to your
phone and it doesn't recognize what you're saying.
Speaker 1 (13:11):
Exactly, or when the voice-to-text puts in a
bunch of extra words and you're like, I didn't say this.
Okay, law number three: as we raise the bar
for our achievements, we also reset the norm for our
expectations. That's kind of what I was just talking about.
I mean, now we expect that kind of speed, where it was something to
(13:32):
wonder at not that long ago.
Speaker 3 (13:34):
Yeah. So suddenly at work you're getting ten
times as much done as you used to get done,
and now suddenly the boss just normally expects you to get
ten times as much done.
Speaker 1 (13:47):
Correct. That's why I keep my output at work very,
very low, Thomas. It's very low, even-keeled, low kind of
thing. Don't want to get too far ahead of myself there.
So I've got a couple of questions going back to the
personal digital clone, and one of them is: where does
your personal cloud info go when you die? Who gets that?
Speaker 2 (14:12):
Yeah? I think that's part of your will.
Speaker 1 (14:15):
Could you imagine? Your family would be like, oh God,
who has to take care of digital
Speaker 2 (14:20):
Mom?
Speaker 1 (14:20):
Like this sucks? Like now somebody's got to be in
charge of the storage. We're never gonna watch it again.
I don't know I would actually put it into my
will that like, put it on a thumb drive and
smash it with a hammer, right Like. I don't want
all of that to go on beyond me. I think
that my time on earth will be well spent. But
after that, do I really matter in the grand scheme
(14:41):
of things? And oddly I'm okay with the answer being no.
Speaker 3 (14:44):
So what if somebody goes to your cemetery site, and it's
all built into your tombstone, and they can talk to
you right there?
Speaker 1 (14:52):
Chuck, my husband, he loves the idea of that, Thomas.
But I don't even want a headstone. I want to
be put in the ground, wrapped in a shroud, you know,
ashes to ashes, dust to dust. Keep it natural.
And if I do have a headstone, it will be
because my children demand it, not because I felt the
need to live on in some kind of digital fashion,
(15:13):
like Mandy Headroom or something. If
my family wanted it, then that's great. They could do it,
and more power to them. But I'm gonna be dead, right?
So I'm okay with that. How about this one? With
all of these advances in computing, will there be any
more room for Thomas Edisons? Meaning, what is left to discover?
Speaker 3 (15:34):
Right?
Speaker 1 (15:34):
What is left to invent? And I know that seems
like a silly question, because there's always something. But is
it harder to cross that barrier, or is it easier
now because of the technology?
Speaker 3 (15:46):
I actually think it'll be easier, and I think it'll
be up to people to actually
cross the chasm from what exists today to what
could exist tomorrow. We'll have to come
up with that through our imagination. That's one of
(16:08):
the barriers that AI has: it can only work with
what's already in existence. So coming up with something new
and original will be strictly a human thing.
Speaker 1 (16:18):
Well, yeah. Because, I mean, Thomas, I've now
officially hired ChatGPT as my assistant, and
I'm using it. What an incredible tool it is.
Now, it's not going to replace what I think about things,
but in terms of gathering up large sums of data
and creating a chart, oh my gosh. Something that would
have taken me a half hour of my own personal
(16:38):
time now takes me ninety seconds. That allows me
to do a whole bunch of other stuff that I now
have time for, because I'm not trying to dig up
the data that ChatGPT can find in ninety seconds.
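
[Editor's note: the "gather the data, make a chart" task bottoms out in a few lines of plotting code, which is part of why an assistant can do it in seconds. A sketch with invented figures; this is illustrative code, not actual ChatGPT output.]

```python
# Sketch of the "turn a table of numbers into a chart" task, using made-up
# data. An assistant writing this for you is what turns 30 minutes into 90 s.
import matplotlib.pyplot as plt

years = [2021, 2022, 2023, 2024, 2025]
values = [12, 18, 27, 41, 60]   # invented figures for illustration

plt.plot(years, values, marker="o")
plt.xlabel("Year")
plt.ylabel("Value (made-up units)")
plt.title("Example chart built from a small dataset")
plt.savefig("example_chart.png")
```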
Speaker 3 (16:51):
Yeah, I'm using it to cross-check everything
that I'm doing. Is this the best approach? What am
I doing wrong? What are the failures in this model?
And it instantly helps me find all the holes
that I have. So it's hugely valuable.
Speaker 1 (17:11):
Now, let me ask you this question from a texter,
and I'm going to glow up their question just a
little bit, even though it's a good question. And the
question is: Mandy, ask Thomas, if we have a digital clone,
could we not put in fake qualities and misrepresent ourselves
to get a job or get a date? Because let's
be real, there are times in every job interview where
(17:32):
they ask you a question that you don't know the
answer to, so you bluff your way through it. Is
our digital clone going to have those same think-fast-on-my-feet,
bluff-my-way-through-it BS qualities
that we have?
Speaker 3 (17:46):
Yeah, cosmetic surgery for your digital clone. Yes, I'm sure
somebody's going to come out with that.
Speaker 1 (17:56):
That.
Speaker 3 (17:56):
Yeah, the whole clone enhancement strategy.
Speaker 1 (18:03):
I can just see it now. There's going to be
a whole cottage industry on how to glow up your
digital clone in such a way that it's passable, not obviously
glowed up, but you could just throw it out
there to do all kinds of stuff. And then when
the real you shows up, they're like, wait a minute,
that's not at all the same person. This texter said, Mandy,
can you address water usage and AI? And I think
(18:23):
that's a bigger question about power usage and water usage. Why
does AI take so much of both?
Speaker 3 (18:33):
It takes so much compute power, just grinding away on
all of the things that it has access to, and
that brings up this whole topic of where our data
centers are going to be in the future. Jeff
Bezos says that we should be putting our data
centers in space, because they have access to solar
(18:55):
twenty-four seven, and so naturally we could do
things much cheaper, because we don't have the same
energy costs involved. That's one of the roadblocks right
now to moving at the speed we want to move at.
Speaker 2 (19:13):
So my thinking is that both he and
Speaker 3 (19:18):
Elon Musk are going to be shooting up rockets that
have data centers on them into space, with huge
solar panels on the sides collecting all that energy, and
we can shoot stuff back and forth to space
anytime we want to. It raises lots of
interesting questions, because who's in control of
(19:42):
things that are in space? I mean, which laws apply? So if,
as an example, you have a US patent on
something, and somebody has a business model running on a
satellite that is violating your patent, is that patent infringement, or
are satellites outside of the bounds of Earth's laws?
Speaker 1 (20:06):
That's a fine question, one that I don't have the
answer to. But that's why we talk to Thomas Fry:
because I need more things to keep you up at
night wondering about the answers, because these are all problems
that we're going to have to solve. This texter
asks an interesting question: will digital clones give off pheromones
for that dating scenario that you were talking about?
(20:27):
Will they have that intangible something that makes things work?
Speaker 2 (20:33):
Ah, that's a great question.
Speaker 3 (20:37):
I am certain somebody's going to be working on that,
but I don't have a good answer for that.
Speaker 1 (20:42):
Yeah. Well, we'll find out about that at some other time.
Thomas Fry is our futurist, and you can find him
at FuturistSpeaker.com. I've also put a link on
the blog as well to all of these articles, so
you can continue to read. Good to see you, my friend.
We'll see you next time, Thomas.
Speaker 2 (20:59):
All right, great to be on your show, Mandy.
Speaker 3 (21:01):
All right.
Speaker 1 (21:01):
That is Thomas Fry, our Futurist.