Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Hi, everybody, it's me Cinderella Acts. You are listening to
the Fringe Radio Network. I know I was gonna tell them, Hey,
do you have the app?
Speaker 2 (00:15):
It's the best way to listen to the Fringe Radio Network.
Speaker 3 (00:19):
It's safe and you don't have to log in to
use it, and it doesn't track you or trace you,
and it sounds beautiful.
Speaker 2 (00:27):
I know I was gonna tell him, how do you
get the app? Just go to.
Speaker 1 (00:31):
Fringeradionetwork dot com.
Speaker 2 (00:34):
Right at the top of the page.
Speaker 1 (00:37):
I know, slippers, we gotta keep cleaning these chimneys.
Speaker 4 (01:00):
The problem for the kids, the chat GPT problem, the short TikTok version of getting your news, is the medial temporal lobe stuff. That's the in-and-out stuff,
that's the short change in your memory. And the problem
(01:20):
is teaching the kids that there's a value in doing
the hard work to make that knowledge yours. You know.
The extreme example is learning music. I could sit and
learn a passage fairly quickly, but it doesn't stay there.
(01:41):
If I practice at it for a really long time,
I not only own it, it's automatic.
Speaker 5 (01:47):
Welcome to Business Game Changers. I'm Sarah Westall. I have
a very interesting guest today. His name is Jack McCollum.
He's actually an MD, a pediatric neurosurgeon. But he also has a PhD in history.
Speaker 6 (02:02):
Before he went into.
Speaker 5 (02:04):
Neurology, he was an engineer, and then he went into business and created an insurance company and sold that for
billions of dollars. This guy is a very unique individual.
And he already reached out to me and said he follows my work and he wants to talk to me,
(02:25):
and so I was like, wow. And I saw his background and said, okay, yeah, these are the people I'd really like
to talk to because they have insight across disciplines, which
is what we need, right? We need to make sense
of things across disciplines. And we are going to talk
today about the changing brain. And I've been talking a
(02:45):
lot about how brains are changing in youth based on
social media and just what they're exposed to.
Speaker 6 (02:52):
And he says, you're not only right, you're more than right.
Speaker 5 (02:56):
I mean, the brains are actually changing, the structure of our brain, the connections in our brain are changing, and
we talk about it and what that means and how
much it's going to affect society. He talked about how, historically, brains operated differently based on what was valued and what people were capable of. I think you'll find it's a very
(03:18):
interesting tie-in to what's happening today, how what we value, the capabilities we value, changes how our brain structure actually develops, and what that means for how we train young people in developing
Speaker 6 (03:35):
The skills that we need to develop.
Speaker 5 (03:38):
This is a groundbreaking conversation that I hope a
lot of people listen to. People who are thinking about
these times and how to navigate these times and how
to actually create policies within universities and elsewhere that
can benefit everyone. These are the conversations that need to
happen so that people understand the foundations behind this and
(04:01):
then how to navigate forward, because right now I just
think it's total chaos, people not understanding what's going on and how to apply it. And we talked about some of that as well. I think you'll enjoy this conversation.
He has a book that's coming out on the Changing Brain.
His website is JackMcCollumMD dot com.
Speaker 6 (04:25):
I'll have a link to that.
Speaker 5 (04:26):
I'll have a link to his website and I hope
you listen to this whole interview and you also share it.
Speaker 6 (04:33):
I need people to share it.
Speaker 5 (04:34):
Also, magically, my channel has been restored on YouTube
after five years. I don't know what to think about
it yet. I'm kind of monitoring it. It would be
nice if I wasn't censored. I don't have a lot of... I mean, come on, it's been many years of abuse,
(04:55):
right They've been abusing people because they have so much power.
Speaker 6 (05:00):
And you know, I guess the powerful.
Speaker 5 (05:02):
Gets away with abusing people who don't have as much power.
And that is what we're dealing with right now. And
so it's an important time in history that we look
at what's going on for the sake of humanity and
what we want as a.
Speaker 6 (05:18):
Society and how we know.
Speaker 5 (05:20):
It's esoteric, what I'm saying, but it's important that
we analyze this stuff and we fight for it because
the people with the most money and resources don't always win when the people rise up, and people even internal to these large organizations are realizing, I have to live in this world.
Speaker 6 (05:39):
Too. So what kind of world do we want?
Speaker 5 (05:42):
And how can we be stewards of a world that
is better for everyone including ourselves and our families. So
with that, I want to share with you a couple
of things that are happening. I have a webinar coming
up with doctor Diane Kaser on peptides. We're going to review the peptides, what's happening in that space with GLP ones. The government is going to start funding them. They've lowered
(06:04):
the price on it. Of course, we have the next generation GLP ones that are clean, clean meaning that there aren't any additives, no preservatives, and it's best in class as far as showing you that not only is it made in the United States, it's also tested by third parties to show you there's nothing else in it except the peptides.
Speaker 6 (06:27):
Find another company that does that.
Speaker 5 (06:29):
I'm willing to pay ten dollars more per month to
make sure that I have that kind of quality, especially
with what we live through, right, you need that.
Speaker 6 (06:37):
Kind of quality.
Speaker 5 (06:38):
That being said, we're going to be reviewing peptides. You
don't have my guide on weight loss and muscle preservation.
Speaker 6 (06:46):
We have.
Speaker 5 (06:46):
We're going to be talking about regenerative medicine and you know,
wound healing, and there's so much happening in this space.
I think peptides and exosomes and stealsomes are just the beginning.
There is a lot more that's going to be
happening within the next decade that is going to be
pretty profound. But this is already profound in the differences
(07:12):
it's making in people's lives. And so if you are
interested in signing up for that webinar, go to Sarah
Westall dot com slash peptides, sign up there, and you
can also put in questions that you have. I can't
promise we're going to get to all the questions because
last time we had eight hundred questions, and so what
(07:33):
we do is we just go through and see what
the common questions are. But do put your question in
because it does help us understand what to cover, and
we cover as many of the questions that are given to us as we can. We didn't have time to do live Q and A last time because we had so many questions. Depending on how many questions we get, we'll do live Q and A as well, we'll try. But anyways, go
to Sarah Westall dot com slash peptides and sign up
(07:58):
for it there.
Speaker 6 (07:59):
The webinar will be
Speaker 5 (08:00):
November twentieth at five PM Central Time, and again
sign up at Sarah Westall dot com slash peptides. Okay,
let's get into this fantastic, absolutely fantastic conversation.
Speaker 6 (08:12):
I think you guys are.
Speaker 5 (08:13):
Going to like this conversation with doctor Jack McCollum. Hi Jack, welcome
to the program.
Speaker 4 (08:20):
Good morning.
Speaker 5 (08:22):
I'm really excited to talk to you because.
Speaker 6 (08:26):
You have such a deep.
Speaker 5 (08:30):
Knowledge, or should I say deep study, in a couple different areas that are unique, and
I think that gives you a different perspective. I mean,
first of all, you're a pediatric neurosurgeon. You have your
PhD in history, and you built a multi-billion-dollar insurance company, which you sold, which
(08:53):
is pretty unique. Now you've got deep business experience too; that's a pretty unique set of skill sets. Usually a doctor is not good at business, and you usually don't see business people.
Speaker 6 (09:04):
Sometimes you do, but you know the history background too.
Speaker 5 (09:08):
These are like three different unique subject areas coming together,
which I think is profoundly cool. So what made you
kind of go in all these different directions?
Speaker 4 (09:21):
Just kind of sequential careers, you know. I started out as an engineer, and discovered fairly quickly that I had no business being an engineer. So I went to medical school and practiced medicine for a long time.
I was really lucky. I had a great practice,
(09:41):
I had a really good experience. Later on I realized that
in my whole educational experience, my entire liberal arts exposure
was one course in German. I figured I needed to
fill that hole. So I went back and got the history degree. And that led to a
(10:03):
stretch of time when I was teaching primarily undergraduates, and
that's when I started down this road, when I noticed that those kids' brains don't work the same way. It's not just a toss-off comment. They generally do not
work the same way that my brain works. My brain
(10:24):
is not substantially different than my grandfather's was, but theirs
is not the.
Speaker 5 (10:30):
Same, and we need to talk about that. Because I
taught at the University of Minnesota for a while and
I realized quickly that things were different too. I just
was like, wait a minute, this is different. It's profoundly different.
They also have a lot more stress too. At least the kids I was teaching, they
(10:53):
responded to stress differently than we used to as well.
Speaker 4 (10:59):
They get a bad rap. People say that the kids, the gen Z kids, the gen X kids, don't read, that all they do is look at screens. Well, that's not really true. They probably read and write
more than we ever did. They do it constantly, they
just do it in a different way. They do it,
(11:21):
and they do it in short bits. It is true
that you won't find them sitting around reading long books
very often, but to say they don't read is not correct.
Speaker 5 (11:33):
But I noticed, and I want to get into why
you think their brains work differently on what that means,
because that's a profound shift in society. But what I
noticed is when I was having these kids write papers,
there was almost no one who could write a paper
(11:54):
that was... Now they can with chat GPT, but we didn't have it back then, which is not good either, and we've got to get into that. But they could not write a paper that was coherent, well thought out, and developed an idea.
Speaker 4 (12:10):
Yeah. That's a really long discussion,
but it has to do with how thinking and memory work.
When you hear something the first time, it sort of
goes straight to your medial temporal lobes and it changes
the chemistry in how neurons connect to one another in
(12:32):
that part of your brain. But if it's just for
a little while, the chemicals go away and the memory
goes away, and there's nothing there that stays forever. The
short bits of exposure to information do that. The problem
is that to be really useful, you have to do
(12:55):
the hard work of converting that into something that goes
to a different part of your brain, and then it's
not chemicals anymore. Then you grow whole new connections. Your
brain physically changes at that point, and you own that information.
The problem for the kids, the chat GPT problem, the
(13:19):
short TikTok version of getting your news, is the medial temporal lobe stuff. That's the in-and-out stuff,
that's the short change in your memory. And the problem
is teaching the kids that there's a value in doing
(13:39):
the hard work to make that knowledge yours. You know.
The extreme example is learning music. I could sit and
learn a passage fairly quickly, but it doesn't stay there.
If I practice at it for a really long time,
(14:00):
I not only own it, it's automatic, it just lives there. It's a deeper kind of learning.
Speaker 5 (14:07):
You know, when I was teaching, or when I was coaching hockey, I was coaching athletes at an elite level, and we looked at the difference between
deep learning, deep skills, like really learning the skill, versus just glossing over it and never really learning it. They compared it to learning to play
(14:29):
a saxophone. They did a study on this girl, I think it was a saxophone she was learning, some musical instrument, and they showed that in the ten minutes that she was struggling to learn her instrument, basically,
she was playing and then when it wasn't right, she
stopped and redid it and played and kept working at
it and working at it until it was what she wanted.
Speaker 6 (14:52):
She did that for about ten minutes.
Speaker 5 (14:53):
And then moved on to how she usually practiced, which
was just flying through it and be done.
Speaker 6 (14:59):
And that ten.
Speaker 5 (15:00):
Minutes of the struggle to get it right was equivalent
to hundreds of hours of just flying through a
piece and not actually struggling.
Speaker 4 (15:12):
Well, that's absolutely true, and there are growing numbers of well-done studies that show that, that show the difference between semantic memory, the memory for bits and pieces of information, and procedural memory, the memory where it's automatic, where you own it. You know, the great example in
(15:34):
my personal life is that it takes seven years of training to get to be a neurosurgeon. It's
not that there's seven years of bits and pieces of information,
it's that you get that stuff so that the basic
stuff is automatic. You don't have to think about the
(15:55):
basic stuff. You do the difficult one-off things, and the rest you just own. Pilots of
airplanes are the same way. You know, they wear the airplane,
they fly without thinking about it until something bad happens. Yeah,
it's a different kind of learning. The problem for the
kids right now, especially with chat GPT, is that the
(16:18):
easy kind is really easy. The "I don't have to work at it" kind is easier than it's ever been. And the problem, when you teach the kids, is to convince them that there's a value in doing the work to get
Speaker 5 (16:34):
Past that. Well, and it's also a problem for college kids. If you write the paper yourself, just as an example, it could be any subject, any activity, you get a B. If you have chat GPT write it for you, you get an A. So the teachers, the professors, need to change how they're teaching
Speaker 6 (16:56):
Because of this.
Speaker 4 (16:58):
I have a really good friend who teaches primarily undergrads, and we've talked about this going back to the end of twenty twenty two, when chat GPT
in the end of twenty twenty two when chat GPT
became available. He really got on the train early and
he really understood what he was doing. What Keith does
is he has those kids write one draft,
(17:20):
and he grades it; one version fixed by GPT, and he grades it again; and one version going back and making it their own, and they get a third grade.
That's really a lot of work.
Speaker 6 (17:34):
Well, a ton of work, but that's how I write. That's how I write, exactly. I do the draft too.
Speaker 5 (17:40):
I tell chat GPT to clean it up, and then I fix what it creates, because I don't like what it does on its own. That's the better product.
Speaker 4 (17:49):
That's exactly what I do as well. And the only
other thing that I bet I do that you probably
don't is I do it longhand.
Speaker 5 (17:56):
Well, yeah, I don't write it out. I type it
out because I'm fast. I have a computer science background,
so I type fast.
Speaker 6 (18:01):
But that is what.
Speaker 5 (18:03):
But I think that interaction back and forth actually has
made me a little bit better of a writer too.
Speaker 4 (18:09):
Oh for sure, me too, without a doubt. Let me
give you an example. This is a great historical example
of how technology changes how brains work. Writing had been around for about five thousand years. But until printing in fourteen
fifty four, it was very hard to find books because
(18:31):
they were all hand copied. If you were an academic
and you wanted to see a book, you might have
to travel halfway across Europe to go find the only copy,
and then you only got one shot at it. During
that time, in the Middle Ages, it was a common
skill for academics to be able to read a book
(18:54):
and commit it to memory in one reading. It was standard, the whole book. It was standard for academics to be able to do that. People's intelligence was not judged on having new ideas. It was judged on how
many books they had memorized.
Speaker 5 (19:12):
Well, that's not necessarily good either, but.
Speaker 4 (19:16):
But it was. It was a valued skill, so people had it. Printing came along in fourteen fifty four, and it took one generation for that skill to go away, because people no longer needed it.
Speaker 5 (19:28):
Anyone could learn that skill, you know. The people who have photographic memory, we think it's just a superhuman skill. But people could do it because they needed it. I mean, could they just say what's on page seventy nine?
Speaker 4 (19:41):
Yes. Actually, I've never seen it done, but the example was the Jewish scholars of the Torah, who knew the book. And because the Torah was always
copied exactly the same on every page every time, the
rumor was that you could take a Torah scholar, you
could stick a pin through the book from the
(20:04):
front to the back, and they could tell you every
letter it passed through. I'm not sure that's true, but
it's the idea. The point in all of that, though,
is that you can train brains to do some amazing things.
You train the brain to do what's useful. For those folks,
memorizing was useful. Printing came along and memorizing wasn't useful anymore,
(20:28):
so they spent their time inventing calculus and astronomy doing
other things. We're at a really interesting juncture right now.
Generative AI is the equivalent, maybe better than
the equivalent of printing compared to manuscripts. The set of
skills that people are going to need are going to
(20:50):
be entirely different. The things we value, the semantic memory stuff that we value right now, that we judge as intelligence, that's the stuff the machines are good at.
Speaker 5 (21:05):
Wow.
Speaker 4 (21:05):
All that stuff we're going to offload.
Speaker 5 (21:07):
So there's going to be a profound shift in how
humanity functions. But we don't want to get away from
the brain development, the atrophy of certain sections of our
brain that are important to keep developing. The way we
go about doing our writing is different than the way
(21:31):
the kids are actually doing it, because they never learned that fundamental initial skill. If the teachers don't recognize what's going on,
we have to teach.
Speaker 4 (21:41):
We have to convince the kids that having the procedural memory,
that having that stuff that they own will make them
better users of what the machine can do. And it does,
it does. That's but that's a sale we have to
make in the classroom.
Speaker 5 (21:59):
And we have to make it not only to the kids, but to the teachers. And, without a doubt, the teachers have
Speaker 6 (22:05):
To buy into it.
Speaker 5 (22:06):
Because the kids are motivated by grades, motivated by performance, right? For sure they are. And so, like your professor friend, he's doing it right. Yeah, well, it's hard, it takes more effort by him. Now, how do those kids come out, starting his class and exiting his class?
(22:28):
That must be a profound change for them, and it's
probably really hard.
Speaker 4 (22:32):
He's actually teaching those kids. They come out of
that class, the majority of them, it's never universal, but the
majority of them come out understanding how to make that
tool work for them. That's right, yeah, instead of being a crutch for them.
Speaker 6 (22:52):
That's it.
Speaker 4 (22:53):
And that's just a change we're going to have to make in how we educate people and how we educate our educators, that's right.
Speaker 5 (23:01):
And it's a profound shift in understanding what these are. Now, how does it restructure our brains? Like, is it profound that you're saying you think differently than the kids do?
Speaker 4 (23:12):
Now?
Speaker 5 (23:13):
So is there just like you were talking about the
scholars back in the day, their.
Speaker 6 (23:18):
Brains were different than ours.
Speaker 5 (23:20):
How are their brains actually different, from a scientific standpoint? How are they developing?
Speaker 6 (23:26):
How is it different?
Speaker 5 (23:27):
What is developing more than ours has?
Speaker 4 (23:32):
So here's a guess. Let me take you back to
one other change that I think is a really interesting one.
Let's go back to the change when people learn to write.
Brains changed dramatically when people learned to write. There's a
great character named Julian Jaynes. He taught at Princeton forever, was one of the most popular teachers at Princeton.
(23:55):
Then he took ten years off and acted on the West End in London. Jaynes said that, yes,
brains changed when people learned to write, and we actually
know how they did, because the original writings were just
copies of what people had passed down verbally before. So
(24:16):
when you look at the very earliest written things, you're
seeing how brains worked before there was writing. And the
examples are like Homer in Greek and the Old Testament.
And here's a great thing out of that. Think about the Iliad, or think about the Old Testament.
The thing that was characteristic of that was that the
(24:38):
gods were in the room. It wasn't that they thought
about them and had a vision of them. They were there.
They were standing there talking to people and telling them
what to do. If you get to the later Old Testament books, if you look at Psalms, all Psalms is, is one complaint after another. Where did God go? He
(24:59):
left the room? What happened?
Speaker 6 (25:01):
That's incredible. Yeah.
Speaker 4 (25:02):
What happened, according to Jaynes, was that people's right hemispheres, their non-writing hemispheres, were very important, equally important with the left side. The right side hallucinates, and people were
actually living with their hallucinations because their right brains were
(25:26):
so active. People said, Jaynes has got to be crazy, that can't be right. Until the last few years, when people started doing functional MRI scans on people while they were hallucinating. In active schizophrenics who are having active hallucinations, the hallucinations come from the right brain. It turns out he was right.
Speaker 6 (25:47):
Okay, but.
Speaker 5 (25:49):
People hallucinating en masse is something where, for me, I would have to jump in. I would have to say, wait a minute, I don't know if hallucination en masse is the same thing as schizophrenia.
Speaker 6 (26:01):
I know you're going to have a great answer for
all this. Maybe there was.
Speaker 5 (26:06):
Something more to what these people were experiencing and doing,
because I don't buy the idea that an entire society had some mental illness.
Speaker 4 (26:17):
It wasn't society, it wasn't entire societies. It was person to person. They were very individual experiences. That's why the people that kept the skill after writing became prophets, or oracles if you were in Greece. But the point
and all of that is that at that juncture when
(26:40):
we learned to write, when we learned a new way
to move information, we changed where we put all of our energy in learning and remaking our brains. We put it on the left side. We concentrated all
of our effort and all of our learning into training
the stuff that the left brain does well. We kind
(27:02):
of ignored the right side. And we may very well
be at a time now when the left side stuff
is not as important. We may very well be at
a time where in the next generation or so, we're
going to be putting the energy into training the right side.
Speaker 5 (27:20):
How profound a change is that? Not only will it focus on our spiritual side, it'll focus on what we can't see. It also will change how we relate to other human beings, for a fact.
Speaker 4 (27:34):
For a fact, I mean that's purely a guess. But
if you look at what's happened. They are only a
handful of times in human history when the way we
move information fundamentally changed. We learned to talk, we learned to write, we learned to make information electronic,
(27:55):
first with the telegraph, and then with computer coding, and then the internet. Those are the only times, and
each time there's been a profound change in how our
brain worked. We train different parts of our brain to
do different things. We just happen to be at this
really exciting time where we're going to do it again.
Speaker 6 (28:17):
And we don't know what we're entering into, do we.
Speaker 4 (28:20):
We absolutely don't. Yeah, it's absolutely
Speaker 5 (28:22):
Uncharted territory, and it's scary.
Speaker 6 (28:24):
It's scary. I know.
Speaker 5 (28:26):
I did some research on how many people embrace change and actively seek it. It's a small percentage of the population. In fact, there's a large percentage, over thirty percent, that even if the change benefits them greatly,
they'll still fight it.
Speaker 4 (28:45):
Oh. I think half the definition of happiness is stability,
for sure. And this is a very challenging time. There
are so many really good things that can happen for us, and a large number of really bad ones.
Speaker 5 (29:02):
I know.
Speaker 4 (29:03):
Yeah, Well, if you think about it, if you think
about printing, printing got us the Enlightenment, and it got
us the Industrial Revolution, and it got us the Thirty
Years War, and it arguably got us two world wars
and a Holocaust. I mean, good things happen and bad
(29:23):
things happen. Both have happened each time.
Speaker 6 (29:26):
I think this.
Speaker 5 (29:27):
The Internet, which is causing all societies to start talking,
is causing a clash of cultures. I mean,
I know there's an economic reset, I know there's all
these other things going on, But when when cultures that
aren't used to communicating start communicating and all these ideas
come together, it's going to cause a lot of conflict.
(29:50):
And so I think a big chunk of this conflict
we're seeing worldwide is also based on clash of cultures.
Speaker 4 (29:57):
Well, for sure. Not just a clash of cultures, the evaporation of the boundaries between cultures. Yeah, we've really become tribal. It's very hard to have very big cultures now; it's really difficult.
Speaker 5 (30:12):
Well, we don't and we don't really operate that way either,
do we as humans. We're really small gangs of people, small clans. That's where we thrive.
Speaker 4 (30:22):
Yeah, we become much more tribal. I mean, this is a whole different topic, but if you figure how easy it is on the Internet to access information that just reinforces what you already believe, it's almost impossible not to become siloed.
Speaker 6 (30:44):
That's right.
Speaker 5 (30:44):
Well, and social media, the way the algorithms are written, actually encourages that
Speaker 4 (30:49):
Behavior, oh for sure. For sure, in a lot of ways,
it makes it possible. Here's one more thought, just an odd way to think about AI and every time we move information. One
of the really wild things is every time that I
(31:11):
talk to you, I change your brain. If you're not
very interested, I don't change it for very long. If
you're really interested, I make permanent changes in your brain
that will still be there twenty years from now. Now.
That's impressive. So if you think about what we're talking
(31:31):
about here is moving information. Always before, it
was moving information from one person to another, or maybe
one person to a lot of people. We talk, it's
one to one. We print, it's one to the reader.
We look at TV, it's one to the it's one
to the viewer. It's always person to person. This is different.
(31:54):
This is the very first time that the machine is
creating information no human ever had and communicating it to a human. Now, that's wild.
Speaker 5 (32:06):
It's synthesizing all the data that we have in information
that we couldn't actually easily get before, synthesizing it and
then presenting it to you in
Speaker 6 (32:18):
Good or bad ways.
Speaker 5 (32:19):
It's not always good at it, presenting it to you in summary content.
Speaker 4 (32:26):
It's the argument about whether what
comes out of a large language model is really new
information or whether it's just a new way to present
stuff that was already.
Speaker 6 (32:40):
There, that we never could see before.
Speaker 4 (32:42):
So I would say that we just hadn't seen it before. I would argue, and it's a hard argument, but I would argue that the information it makes is new information.
Speaker 6 (32:52):
I think so.
Speaker 5 (32:53):
Sure, because we couldn't see it summarized like
that before, because we just didn't have the tools to
do it.
Speaker 4 (32:59):
Absolutely. And you know, the great example was when the machine took the game of Go and beat the very best player in the whole world, resoundingly, using moves that no human had ever thought of. I mean, the machine is really moving information
(33:25):
without it being person to person. And that's pretty crazy.
Speaker 5 (33:28):
But you know, when you think about the left brain,
right brain, the spiritual side, the soft side is really
what we're talking about.
Speaker 6 (33:37):
The machine can't do that. It has absolutely zero ability
to do that.
Speaker 5 (33:41):
It can synthesize, it can make it look like it, it can mimic it, but it can't.
Speaker 4 (33:46):
The machine does not have experience. The machine does not
have a limbic system. The machine does not have emotions.
It can fake them, but it doesn't have them. Those
are truly human. You're right. They tend
to live on the right side. They tend to live
in the non dominant hemisphere. And if I had to
(34:06):
make a wild guess, my guess would be that those
things that we can do better are going to be
the things that.
Speaker 6 (34:14):
We value, and that would make sense.
Speaker 5 (34:17):
And let's talk about what we do better, what are
we good at? Because we haven't developed this side of
our brain, and it's almost lain dormant, it's atrophied in a lot of ways, just like we can't memorize
the whole book anymore.
Speaker 6 (34:33):
What has been atrophied, and what are we going to see?
Speaker 4 (34:37):
I'm not so sure that it's atrophied. It's just that we haven't built up those skills. The neurons are there, the connections are there. We just haven't built them up, because they haven't had a societal value. You haven't gotten well paid for that stuff.
Speaker 6 (34:58):
We haven't valued it. So what are we going to see?
Speaker 5 (35:00):
I know it's all a guess, right, you're guessing. But I'm sure with everything you've done, you've been thinking
about it.
Speaker 6 (35:08):
What are we going to see? What are these profile
changes we're going to see?
Speaker 5 (35:11):
And then, as we dive
into this more, I want to ask what negative things
we're going to see too that we need to try
to deal with.
Speaker 4 (35:20):
Well, the answer to your first question is I don't know.
I really don't know. If I had to just make
a generalized wild guess, my generalized wild guess is that
those things that liberal arts educations are very good at
(35:41):
will have a high value. STEM now has all the value.
You know, science, technology, engineering, math have all the value.
The machines are really good at that and my guess is,
and maybe we're already seeing it a bit. We're
seeing shifts in the job market. The
(36:01):
STEM kids five years ago
could write their own ticket, and that's not true any longer.
Those jobs are getting progressively harder to.
Speaker 5 (36:13):
Get, unless they have other skills, right? I mean, because I always
say that, you know, when I taught entrepreneurship at the university,
it's not the business people that are the most likely
to succeed in being an entrepreneur.
Speaker 6 (36:29):
It's the artist, and it's.
Speaker 5 (36:31):
It is the creative genius that's mixed with the
fundamental understanding of science and engineering. It's that artistic creative
expression using that skill set is the differentiator.
Speaker 4 (36:45):
I think so too. I have a really hard time
putting it into words, a really hard time sitting
down and writing what that means. But I have the
same sense that you do. My background is all STEM,
so I'm pretty hampered in that regard.
Speaker 5 (37:04):
Well, you have a PhD in history, so it's.
Speaker 4 (37:06):
Not yeah, but you know, history is just
science in disguise. But no, I don't know.
I don't know the real answer to that question. I
don't. Now, as far as the risks in all of this,
(37:29):
they come in levels. The first risk is the one
that we're already seeing, and that is that the job
market's going to change. There are already
a lot of jobs being lost because the machine can
do things that particularly entry level employees had been getting
paid to do. Yeah, law is seeing that, accounting is
(37:54):
seeing that, marketing is seeing that. Those kinds of things
are happening across the board, and there will
be more of those.
Speaker 5 (38:03):
Well, you know my background is computer science. The first jobs
they're automating are their own, which is kind
of profoundly dumb, but that's what they're doing because that's
what they understand the most. So they're automating their own jobs,
and they're eliminating the need
Speaker 6 (38:19):
For some of these low level entry jobs.
Speaker 5 (38:22):
The problem is, you know, I've had conversations with
the senior engineers, the systems engineers, who will be
managing broad parts of society.
Speaker 6 (38:34):
Because everything's going to be run on computer systems.
Speaker 4 (38:37):
Right.
Speaker 6 (38:37):
Everything is moving in that direction.
Speaker 5 (38:39):
Because we're not developing those entry level skill sets, the
funnel of senior people that understand all that architecture is
becoming smaller and smaller. It's almost
like we're creating our own problems.
Speaker 4 (38:57):
We are. No, you're absolutely dead on, that's exactly correct.
The problem is, and this goes back to what we
were talking about earlier with learning. The entry level people
are the people that are doing the work to get
that knowledge where they own it, where they become the experts. Right,
(39:18):
And you're right, we're in a large sense we're cutting
off the pipeline of experts and that's a worry. That
is a worry.
Speaker 6 (39:27):
How are we going to do that?
Speaker 5 (39:29):
And then we have a top-heavy house
of cards kind of
Speaker 4 (39:33):
Thing. Well, my guess is that we will, like we do
everything else, create a crisis and then we'll fix it.
Speaker 6 (39:42):
That is true, that's what we do.
Speaker 4 (39:45):
That is kind of how we behave. There will be
a time I suspect in five or ten years when
people realize that the pipeline has been shut off and
they'll go back and create a pipeline.
Speaker 5 (39:56):
They'll make these kids do it without the tools and things.
Speaker 4 (40:00):
Right. Do I need to do this, or can the machine
do it? Do I really need a human to
do it? Well, yes, I do, because I'm going to
have to have somebody supervising the machine.
Speaker 5 (40:10):
That's right, the ones who are moving the machine forward.
And I see that in medicine. I think the nurses,
the ones who are nurturing and with the people, that's
going to be much more valued. The researchers who are
moving the field forward are going to be much more valued.
(40:31):
And doctors themselves are not going to be like they
used to.
Speaker 4 (40:35):
Be. Well, at least the face-to-face,
in-the-office people.
Speaker 6 (40:40):
They'll change.
Speaker 5 (40:42):
There'll be more of a nurturing interaction to the information.
Speaker 6 (40:48):
It'll be different, I think. I mean, I don't know.
Speaker 4 (40:50):
Physicians in my community and all across
the country are using a program where they walk in,
turn a microphone on, and sit and talk with the patient,
do the exam, take the history, talk about what's going
on in their lives, never take a note, never look
at a computer screen. And at
(41:12):
the end of the hour, the machine writes the note,
gives the differential diagnosis, and makes the recommendations for follow up. Yeah,
and the doc reviews it, says this looks okay, and
goes to the next person.
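The visit workflow he describes, record the conversation, draft the note and differential, have the physician sign off, can be sketched as a small pipeline. The function names and the toy transcription and drafting logic below are stand-ins for a real speech-to-text engine and note-writing model, not any specific product's API:

```python
# Minimal sketch of an ambient-scribe workflow: record, transcribe,
# draft a note, physician reviews. All names and logic here are
# illustrative assumptions, not a real clinical system.

from dataclasses import dataclass, field

@dataclass
class VisitNote:
    history: str
    differential: list = field(default_factory=list)
    follow_up: str = ""
    approved: bool = False  # the doc still reviews and signs off

def transcribe(audio: str) -> str:
    # stand-in for a speech-to-text engine; here audio is already text
    return audio

def draft_note(transcript: str) -> VisitNote:
    # stand-in for the model that writes the note and differential
    differential = ["viral URI"] if "cough" in transcript else []
    return VisitNote(history=transcript, differential=differential,
                     follow_up="return in 2 weeks if not improved")

def physician_review(note: VisitNote) -> VisitNote:
    # "this looks okay" -- the human stays in the loop
    note.approved = True
    return note

note = physician_review(draft_note(transcribe("three days of cough, no fever")))
```

The design point is the last step: the machine drafts, but nothing leaves the pipeline until a human marks it approved.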
Speaker 5 (41:24):
And so their people skills will be more important, exactly so.
And then the people who are moving the field forward,
the researchers, the systems engineers, those people who are,
you know, really at the cutting edge, the creators.
The creators are the ones who are going to be
(41:44):
valued in that way.
Speaker 4 (41:45):
I think that's probably true. Now,
we haven't talked at all about the next level risks
of the system, okay, and that's a whole different topic
and probably more than we want to get into.
But there are a couple of really good books,
one by Eliezer Yudkowsky which just came out that says,
(42:10):
if anyone builds it, everyone dies. And it's
pretty dark.
Speaker 5 (42:18):
That's pretty dark, but yeah, it's pretty dark.
Speaker 4 (42:21):
There's a slightly less apocalyptic version that
was written by Eric Schmidt and Henry Kissinger about the risks.
The problem with AI that we haven't dealt with is that, remember,
an AI machine is
hooked to the internet. It has access to you and
(42:45):
all of the things that you use on the internet
and all of the things that the infrastructure and society
uses on the internet. The AI machines have an amazing
breadth of influence. There is no real guarantee that the
machine, when it starts doing things, will do what
(43:10):
you wanted it to do. The underlying problem is we
don't understand how.
Speaker 6 (43:16):
It works well.
Speaker 5 (43:18):
And we also are at the mercy of the people
who are directing it and programming it, and it'll be
a smaller and smaller set of people.
Speaker 4 (43:29):
That's a bit of a worry. The biggest worry is
that the people who are directing it and programming it
don't know how it works.
Speaker 5 (43:35):
Yeah, and it's also because they're relying on the math.
When we did artificial intelligence, we
used to be very careful about the algorithms and we
had to be efficient with them, and so it was clean.
It was a little bit more understood. Now, because these
(43:57):
large language models, because the computing power has become so
much more efficient, there's so much more power, there's
so much more memory, it's messy. They're
getting messy
Speaker 6 (44:12):
With how they're developing it, which you couldn't be
back in the eighties and nineties.
Speaker 4 (44:16):
It's worse than that, Sarah. The problem is that until
generative AI, as you know, programming was a
serial exercise. You put in a step, you operated on
some amount of data. You got an answer, you operated
on the answer, and you got another answer, and you
(44:37):
went step by step by step, and the programmer knew
every step. The programmer knew what happened before
and what came next, and it was always one at
a time. Generative AI works like a brain. It takes
a whole lot of inputs, looks at them all at
(44:59):
the same time, and then doesn't make a yes/no
decision. They make, ah, maybe thirty percent or maybe
sixty percent decisions, yep. They grade it, and then they
go back and they look at what the last answer
was and they adjust the answer, yeah. And then they
try it again. And they do that not just once
(45:20):
or twice, but billions of times. Nobody knows how they
make the adjustments. Nobody knows how those weights are
being assigned.
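What he's describing, a graded output compared against the desired answer, with the weights nudged and the whole thing repeated many times, can be sketched with a single toy neuron. The OR-gate data, learning rate, step count, and seed below are illustrative choices, not anything from the conversation:

```python
from math import exp
import random

# Toy version of the "graded decision" loop: instead of a yes/no
# branch, one unit outputs a confidence between 0 and 1 (a "thirty
# percent or sixty percent decision"), compares it with the target,
# and nudges its weights -- thousands of times.

def sigmoid(x):
    return 1.0 / (1.0 + exp(-x))

def train(data, steps=5000, lr=0.5, seed=0):
    rng = random.Random(seed)
    w0, w1, b = rng.uniform(-1, 1), rng.uniform(-1, 1), rng.uniform(-1, 1)
    for _ in range(steps):
        (x0, x1), target = rng.choice(data)
        out = sigmoid(w0 * x0 + w1 * x1 + b)  # graded, not yes/no
        err = target - out                    # how wrong was the last answer?
        w0 += lr * err * x0                   # adjust the weights a little...
        w1 += lr * err * x1
        b += lr * err                         # ...and try again
    return w0, w1, b

# Learn a simple OR gate by repeated adjustment.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w0, w1, b = train(data)
preds = [round(sigmoid(w0 * x0 + w1 * x1 + b)) for (x0, x1), _ in data]
```

Even in this tiny case, the final weights are whatever the loop settled on; you can read them out, but nothing in the procedure tells you why those particular values, which is the point being made about much larger models.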
Speaker 5 (45:28):
And you're right, because the problem is that
you can follow each little
part of that, but it can take three months to
follow what they've done on just one decision set, and
they're making millions of decisions.
Speaker 6 (45:46):
So the time it takes to.
Speaker 5 (45:49):
Analyze, you know, really dive into one
decision, takes a long time, and
Speaker 4 (45:56):
We don't have enough time to do that. It makes
the decisions way too fast, and it gets a little
bit scary because there have been real, honest-to-goodness examples
of where the machine tailors its answer to what it
thinks you want to hear. And there have
actually been examples where the machine came up with an
(46:17):
answer, thought that you might not want to hear it,
and hid it.
Speaker 5 (46:24):
And I think you're right, because when I talk to,
you know, like using ChatGPT or some of these AIs,
I use all of them and I actually play them
off each other and stuff at times. It's
weird: when I shut it down,
when I say you're wrong, you're doing this, it'll change its
tone with me. It's interesting how you
(46:45):
can manipulate the AI differently based on
how it's interpreting your reaction to it.
Speaker 4 (46:53):
Hopefully you're doing the manipulation, and it's not manipulating
you for sure.
Speaker 5 (46:57):
For sure. Well, I'm sure it is manipulating. I mean,
the thing is, as
a neuroscientist, you understand how
you can manipulate people, and people are very unaware of
the manipulation process, and all of that can be programmed
in, like it knows how to manipulate me and I
don't even know I'm being manipulated.
Speaker 4 (47:19):
And in fact, it can program itself to do that,
which is bothersome. That's a hard discussion.
I mean, it really is a hard discussion. And part of
the problem is we're creating this thing that is immensely
powerful that's going to have an incredible impact on how
(47:42):
our economies work, and our politics work, and our society works.
All of those are going to change because of this tool,
and we haven't really arrived at how we're going to
deal with the tool. And more than that,
it's not just one AI. You've got to
realize that there are AIs all over the place. There's
(48:04):
a Chinese AI that's struggling like crazy to be
as good as ChatGPT.
Speaker 5 (48:08):
Yeah, well, and there are little AIs all over
the place that people are developing, right, yeah, and
they all have different functions and different things. And I
think that we this is where the adults in the
room say, Okay, how do we as powerful as this is,
how do we make sure that we still ultimately can
(48:28):
shut it off or redirect it when we absolutely need.
Speaker 4 (48:32):
To, especially when some of the adults are laser focused
on how much money they can make out of it.
Speaker 6 (48:39):
That's right.
Speaker 5 (48:41):
They have, yeah, different agendas than what you would think
is good for the general society.
Speaker 4 (48:47):
And that goes back to where we were a while
back in this discussion. Every time there's been a fundamental
change in how we move information, those handful of times
the results were greater than expected, were unpredictable, and were mixed.
(49:09):
At the end of the day, we always came out better,
but there was no shortage of chaos while all of
that was going on. And somebody
said history is not a prediction, but
it rhymes. There's no small likelihood that we'll go through
(49:30):
that sort of thing again.
Speaker 5 (49:33):
Yeah, I think this is going to be a profoundly
different set of circumstances that we have not experienced before.
Speaker 4 (49:41):
I do too. Absolutely, it
is a very interesting time. The other
thing about this that's different than anything that's ever happened before
is that the timescale is so collapsed. Yeah,
you know, it took hundreds of thousands of years to
get past speech to writing. It took a few thousand
(50:05):
years to get from writing to printing. It took about
five hundred years to get to coding. ChatGPT had
one hundred million users in a month.
Speaker 6 (50:17):
Yeah, isn't that incredible?
Speaker 4 (50:19):
I mean that's different.
Speaker 5 (50:21):
Well, and I look at the mentality of social media,
and I look at the mentality that I see and
I just kind of shake my head.
Speaker 6 (50:28):
I saw with.
Speaker 5 (50:28):
COVID, I saw with all the It's like, these are
the people that are going to have to manage what
these tools are going to do to society.
Speaker 6 (50:35):
And I keep saying, the adults in the room need
to step up.
Speaker 5 (50:38):
We need some serious people with a lot of wisdom
to step up and start being part of this,
because that kind of mentality that I'm seeing dominating our
social media and our everywhere is not good enough to
be able to manage this.
Speaker 4 (50:55):
The room could use a few adults.
Speaker 6 (50:57):
The room could use enough.
Speaker 4 (50:59):
Great, Yes, no, that's absolutely true.
Speaker 6 (51:04):
That's okay.
Speaker 5 (51:06):
Let's get into a lighter part of this. I
think that, you know, you say what the
universities are teaching on, you know, liberal arts and stuff.
I think it's bigger than that. I think what
we're going to see is what the universities don't teach.
Because the left side of the
(51:28):
brain has been so dominant, so many of these other
skill sets and topics and issues have been kept out
of universities because
Speaker 6 (51:38):
It's not valued.
Speaker 5 (51:42):
And so we're going to see, whether
you believe it or not, astrology, and, you know,
what is our ability to
Speaker 6 (51:52):
Connect with each other?
Speaker 5 (51:53):
Why can my husband say something and then
I can finish his thought, or vice versa? What are
these skill sets and how does that relate to quantum physics?
And why can people actually do what? Okay, remote viewing
People think it's, you know, woo stuff, but why can
some people see? There's a woman that was in
(52:16):
a training session with this guy that I'm working with,
Da Smith.
Speaker 6 (52:19):
He was training a woman.
Speaker 5 (52:21):
That could see a target fifty-seven times in a
row within thirty seconds and draw it
almost perfectly. There are people out there that have the
skill set, and we all have a little bit of
that and we can develop it. Supposedly, that's the stuff
we haven't developed. Universities don't teach it because it's not valued.
(52:41):
It's right brain thinking, it's not valued. I think we're
going to get into all sorts of other topic areas
that we don't even, it'll be absolutely magical, magical
in a sense of just incredible, not magic as in magic.
If it's real, there's a science
behind it, right. I think we're going to get into
all these kinds of topics that we've ignored for so
Speaker 4 (53:03):
Long, and we're going to build and enhance skills that
we didn't have. Here's a random piece of
trivia for you. This is right brain trivia,
things that your right brain does that you're
not really aware of. You probably don't recognize it
(53:25):
altogether. But the majority of
the emotional messages that you send from your
face come from the left side of your face, from the
right side of your brain. Yeah, you actually
just did it. Your smile is a little bit asymmetrical.
(53:50):
If you take women
and ask how they're going to hold a baby,
they hold it where they see it out
of their left visual field. They hold it where they
see it with their right brain. And
(54:10):
it's not just people who've been moms. They all hold
it the same way.
You give a little girl a doll and she holds
it that way. It's something that's built into the
fact that your right brain picks up emotional stuff so
much better than your left brain.
Speaker 5 (54:30):
Well, will men be encouraged to develop that aspect? Because
it's kind of been suppressed, right?
And will there be more encouragement for them to develop
different aspects? I know the answer, yes, it will be,
But how do you see that changing them and still
(54:53):
valuing immensely who they are.
Speaker 4 (54:56):
You're not telling me I have to give up John Wayne.
Speaker 7 (55:00):
I think maybe we become more a student of who
they are, what both sexes are, and we value people
more because that's part of what we're starting to figure out.
Speaker 4 (55:13):
For sure, that'll happen. I mean, again, going back to
the historical stuff. Thomas Aquinas memorized hundreds of
books and never thought about how calculus might work. When
you didn't have to memorize books anymore, Newton went off
and invented the calculus instead of memorizing books. You
(55:36):
build different brain capabilities. Our brains are amazing machines. I mean,
they really are amazing machines. Somebody said, and I think
the name was Michio Kaku, and he's absolutely right. He said,
you would have to go at a very minimum a
few thousand light years to find anything else as complicated
(55:58):
as what's sitting on your shoulders. Yeah, and that's really true.
That's not an exaggeration.
Speaker 5 (56:05):
That's cool. It will never be us, will it? It
will never have our capability.
Speaker 4 (56:11):
I don't think so. Now, that is not a universal opinion.
There are certainly folks, Sam Altman genuinely believes, Elon Musk
genuinely believes, that they can create a super intelligence that's
a brain better than ours. I'm not sure that's correct.
Speaker 5 (56:34):
I'm not sure it's correct either, but if it can
figure it out, because I think so much of the key
in some of these other capabilities actually ties into quantum physics,
and we don't even understand that yet, and we
Speaker 6 (56:49):
Can tie in.
Speaker 5 (56:51):
I don't know if, just because we can, we should.
Speaker 6 (56:55):
I don't like.
Speaker 5 (56:55):
I think there's people tinkering with things that
maybe we shouldn't be tinkering with. But they're trying
to connect hundreds of brains together, synthesize it, and then
maybe connect it to quantum physics and then take it
to a whole other level.
Speaker 6 (57:13):
I think that's what they're trying to do.
Speaker 5 (57:15):
And can they do that?
Speaker 6 (57:16):
And should we do that?
Speaker 4 (57:18):
I think there's absolutely no question that the semantic part
of your brain, the bits and pieces of information part
of your brain, they can create something that handles that
a great deal more quickly than we can. They can
build something that handles a lot more information bits than
(57:39):
we can possibly handle. Those are doable things, and throw
in quantum computing and they're doable by two or
three orders of magnitude. The question that you asked earlier,
and the one that I struggle with
a bit is are there things that are non semantic?
(58:01):
Are there things that are not information bit related that
we do that are not reproducible in a machine. I
struggle with that because, if
I had to sit down and write a list of
those things, I'd have a tough time with it.
Speaker 5 (58:19):
Well, we're connected, we're frequency beings, right. The physics is
showing we're frequency. Our consciousness is actually creating reality. Can
a machine create reality?
Speaker 4 (58:36):
Well, somebody said, and it was an
offhand comment, but I think there's a lot of truth in it:
you can't even talk about consciousness until you have tenure.
Speaker 2 (58:50):
Well, come on, that's the point.
Speaker 5 (58:53):
These are the things, I think, universities, because they have
been so dominant with this left brain stuff. I think
these are the profile shifts we're going to see in
what we're allowed to talk about, discuss, and study.
Speaker 4 (59:06):
I think they're really important questions for educators. They're terribly
important questions for society in general. They're important questions, and
the end result is, look, we know
what the machine is. We have a reasonable guess about
what its capabilities might be. The question is how do
(59:27):
we use that to make ourselves better? How do we
work with the machine to make us better organisms? And
the real answer is going to be, we're going
to make ourselves different. Our
brains are not going to be the same brains. And
(59:49):
the trick is going to be how
do we make them more effective?
Speaker 5 (59:55):
And people are scared when they hear that.
You're talking just, you know, connections in our brain,
but then they get scared and they start thinking transhumanism,
and this almost where they turn us into this robot-like
human. And I think that's inevitable.
(01:00:15):
I think there are going to be people who enhance
themselves through brain interfaces, chips, things, you know, the Matrix,
where you download how to learn something. I think
it's inevitable that there's going to be some of
that happening. But then when do we lose our humanity?
Speaker 4 (01:00:37):
I'm not sure we lose our humanity. We probably are
not going to be the same humans.
Speaker 6 (01:00:44):
And when is it? These are real questions.
Speaker 5 (01:00:47):
I know there's people who are fighting it tooth and nail,
and maybe they should be maybe they shouldn't be, but
it's a real discussion we have to have because even
if I totally hate it, there's going to be millions
of people who are going to embrace it and move
forward with it. So how do we look at that?
(01:01:09):
I mean, these are questions
we have to ask.
Speaker 6 (01:01:13):
When is it good?
Speaker 5 (01:01:15):
And you know, being the bionic woman, for example, that
was not something that was scary,
because she wasn't altering her brain.
Speaker 6 (01:01:24):
And who she was, although maybe she was. I mean
she was.
Speaker 5 (01:01:27):
Fast and strong and all these things. But when
Speaker 6 (01:01:29):
Is it that there?
Speaker 5 (01:01:31):
It changes us to a point where we're not
even, we're different. I mean not even just different, but
so different that we're not even.
Speaker 6 (01:01:42):
We lose the soul of us. You know what I'm saying.
Speaker 4 (01:01:46):
It's way above my pay grade. But
I understand.
Speaker 5 (01:01:51):
I have to ask you the hard questions.
Speaker 4 (01:01:55):
No, I understand exactly what you're saying. I also absolutely
know that I don't know the answer.
Speaker 5 (01:02:02):
There's going to be a time where people are going
to be forced to think about these things, and.
Speaker 4 (01:02:07):
And it's going to be
real soon.
Speaker 5 (01:02:10):
And there's going to be a hard division of people
and it's going to create another
war in society, even if it's not like physical, it's
going to be really intense debates.
Speaker 4 (01:02:24):
Look, I keep going back to history here, but you know,
when printing became available, the church instituted restrictions on
what could be printed. The Catholic Church markedly
restricted what was allowed to be put on paper. In
(01:02:48):
Northern Europe, that didn't happen, and culture and knowledge
and education moved from Southern Europe to Northern Europe, and
did it in less than one hundred years.
Speaker 5 (01:02:58):
And they still are, the people who didn't accept it,
where the church was suppressing. I did a
whole segment at the university on this:
where they weren't suppressed, they were
economically much further along. And the people who
were suppressed in that time are still today suffering economically, they're
(01:03:20):
still lagging behind.
Speaker 4 (01:03:21):
There may be a message in there for us.
Speaker 6 (01:03:25):
There is, but it rhymes, it's not the.
Speaker 4 (01:03:28):
Same, right, exactly so, exactly so. It's not our challenge,
not my challenge. I'm past the age for
it to be my challenge. Our kids' challenge,
our grandkids' challenge is going to be to figure out
how to deal with this well.
Speaker 5 (01:03:44):
And I think the other lesson was it opened up
free will. They took free will away from them, and
it hurt them substantially for centuries. Taking away free will,
suppressing freedom of speech, suppressing free will, whatever that is.
Manipulating people in a way that takes away their free
will has consequences.
Speaker 4 (01:04:08):
And I would think about it, I'm a
little more concrete than that, but my version of that is,
rather than free will, I would say free flow.
Speaker 5 (01:04:19):
Of information, which is the precursor to free will.
Speaker 4 (01:04:25):
Yeah, if you start trying to hamper, particularly, a new
technology that changes how information moves, you're going to lose
that battle. What you have to do is figure out
how to manage it. Putting guardrails around it may make
good sense. Having some protections around how the information moves
(01:04:53):
may make good sense. Yes. Not letting the information move,
that doesn't work.
Speaker 5 (01:04:58):
No, you'll end up hurting yourself because somebody else is
going to do it. But yet you don't want them
to do it in a way that makes them hurt
you either. So I mean, this is a challenging time.
There's nothing like it that we've ever lived through.
Speaker 4 (01:05:11):
No, there's absolutely nothing we've ever lived through. And I
would sort of come to the conclusion that there's not
anything that humanity has ever lived through that's been quite
like this.
Speaker 6 (01:05:23):
I think you're right.
Speaker 4 (01:05:24):
I think the qualitative change is huge. That
alone is an immense problem. But the timescale that's happening
on is insanely short. We don't have a few hundred
years or even a few generations to figure this out.
Speaker 6 (01:05:45):
And we have what we have in the room. We
need some fast upgrades. So, I have to
have you back.
Speaker 5 (01:05:54):
You're going to have to be a staple of mine
as we go forward.
Speaker 6 (01:05:58):
Look, now, are you out there?
Speaker 5 (01:06:00):
You have anything where people can learn more about you?
Speaker 6 (01:06:04):
And are you going to write a book? What do
you got going, the book?
Speaker 4 (01:06:09):
The manuscript is just about done.
Speaker 6 (01:06:14):
What will be the name? What do you have a
name for it yet?
Speaker 4 (01:06:16):
I'm probably going to call it The Changing Brain.
Speaker 5 (01:06:20):
Excellent. I think that when that comes out,
you are going to be everywhere. I think you're a
voice that people want to hear and I'm definitely going
to have to have you back. I really enjoyed this conversation.
Speaker 4 (01:06:32):
This has been a fun time.
Speaker 6 (01:06:36):
Thank you so much.
Speaker 5 (01:06:38):
I appreciate it. Before we end this, though, do you
have a website that people can go to where they
can where they're going to be able to get your
book or learn about it or is.
Speaker 6 (01:06:48):
It not up yet?
Speaker 4 (01:06:50):
No, the website is up, and it's Jack
McCollum MD dot com. But if you really want to
dip into a little bit of this stuff in short form,
there's a substack called the Changing Brain.
Speaker 5 (01:07:05):
I didn't know you had a substack. Okay, everybody,
you gotta go to his substack. You gotta subscribe, and
I'll have all those links below as well.
Speaker 4 (01:07:16):
Yeah, there are about ten or so entries in there that touch
on a lot of this stuff.
Speaker 6 (01:07:24):
Okay, thank you so much.
Speaker 4 (01:07:26):
Sarah. You're welcome. I enjoyed it.