Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hey, everybody, it's me Josh, your old pal, and for
this week's SYSK Selects, I've chosen How Chaos Theory Changed the Universe. It first came out in July
of two thousand sixteen, and I have to say I
think it's, um, one of the better sciencey Stuff You Should Know episodes of all time, because there's just
something about this that grabbed me and Chuck by the
(00:22):
collars and said, I'm interesting, aren't I? And we said, yes,
you definitely are. And this one has everything. It has science,
it has philosophy, it has our understanding of the universe.
Is just all around good episode. So I hope you
enjoy it as much as I did listening to it. Again.
Welcome to Stuff You Should Know, a production of
(00:43):
iHeartRadio's HowStuffWorks. Hey, and welcome to the podcast.
I'm Josh Clark with Charles W Chuck Bryant and there's
Jerry over there. So this is stuff you should know,
the podcast about Chaos Theory. Like, uh, have you ever
(01:03):
seen Event Horizon? I did not bad? Great movie? Are
you crazy? I think it was great? Oh it was
so imagining it. I thought it was okay, it was
like a Lovecraftian thing in outer space. Yeah, I loved it. It was all right. I Lovecrafted it.
I liked it. Um, that's what I think of when
(01:24):
I think of chaos. You know, there's that one part
where they kind of give you like a glimpse behind,
like the dimension that this action is taking place in,
to see the chaos underneath. And you should check that
out again. Yeah, I think about Jurassic Park and Jeff
Goldblum as as the creep Dr Malcolm explaining chaos in
(01:49):
the little auto driving suv or whatever that was. Yeah,
that's what it was called in the script, the auto
driving suv scene. Yeah. And you know what, actually rewatched
the scene and it confirmed two things. One is that
he uh, he actually did a pretty decent job for
a Hollywood movie with a very rudimentary explanation of chaos. Um,
(02:10):
and you watched it for this yeah, yeah, just that scene.
And then it also confirmed of what a creep that
character was. Yeah. If you watch that scene, he's like,
you know, he was all gross and flirty with her
right in front of her ex but there's just you know,
he's talking to her. I didn't even notice this at first.
He like he just like touches her hair out of
(02:30):
nowhere for no reason. He's just talking to her and
he just like grabs her hair and touches it. And
I'm like, what a creep. I know, if you look closely,
you can see the hormones emerging through his chest hair. Yeah,
and I love Jeff Goldblum. It's not a reflection on him.
He was basically doing Jeff Goldblum. Well that's what. Yeah, sure,
he's Jeff Goldblum, but I don't think that's how in
(02:51):
the manner in which he speaks. But I don't think
he's a creep, do you. Wow, I've got nothing against
Jeff gold I think he's a I think he's doing
Jeff Goldblum. It was also a sign of the times,
Like if that movie were made today, doctor what was
her name in the movie? Think. Yeah, Dr. Sattler would
(03:12):
be like, it's very inappropriate to stroke my hair. Yeah, like,
don't touch me. But this was the nineties, the freewheeling nineties. It was the early-to-mid-nineties thing. The book came out in 1990, and in
the book, uh, Ian Malcolm, who's a chaotician? Yeah, creep chaotician? Right?
(03:35):
He um? He he goes into even more depth about chaos.
But that was I mean, that was the first time
I ever heard of chaos theory was from Jurassic Park,
and um it really it was really misleading. I think
the entire term chaos is very misleading as far as
the general public goes as from what I researched in
(03:58):
this this for this article, well, yeah, I mean you
hear the word chaos as an English speaker and you
think frenetic and crazy, out of control. Yeah, and that's
not what it means in terms of science like this, right.
What it means, I guess we can say up front
is is basically the idea that complex systems do not
(04:19):
behave in very neat ways that we can easily grasp, understand,
or measure right, and not even even simple systems don't.
Sometimes it doesn't always have to be complex. But um,
I want to give a shout out in addition to
our own article to uh when you know, when it
comes to stuff like this, the brain breaking stuff for me, man,
(04:41):
this is a brain breaker. You know how I always
go to like blank blank for kids because it always
helps if there's a dinosaur mascot on the page. It's
a sure thing, we can understand it. But the best
explanation for all this stuff that I found on the
internet was from a website called, um, Abarim Publications, A-B-A-R-I-M, which turns out to
(05:04):
be a website about biblical patterns and sandwiched in the middle,
there is a really great, easy to understand uh series
of pages on chaos there. So I was like, man,
I get it now, I mean in a rudimentary way, right, Well, yeah, um,
I think even a lot of people who deal with
(05:26):
systems that display chaotic behavior, which I guess is to
say basically all systems eventually under the right conditions, UM,
don't necessarily understand chaos. Yeah. And they define a complex
system is specifically. It doesn't mean just like, oh it's complex,
I mean it is, but specifically, Um, they define it
in a way that helped me understand. It's a system
(05:48):
that has so much motion, so many elements that are
in motion, moving parts. Yeah, that it takes like a
computer to calculate all the possibilities of like what that
could look like five minutes from now, ten years from now.
So before computers came around, before the quantum mechanical revolution,
(06:09):
it was a lot more basic. It was like, what goes up must come down, stuff like that. Let's
talk about that, Chuckers, because when you're talking about chaos theory,
it helps to understand how it revolutionized the universe by
getting a clear picture of how we understood the universe
leading up to the discovery of chaos. Right, So, prior
(06:32):
to the um the scientific Revolution, everybody was like, oh, well,
it's it's God. The Earth is at the center of
the universe, and God is spinning everything around like a top. Right.
It was all a theistic explanation. Then the scientific revolution
happens and people start applying things like math and making
like mathematical discoveries and and and figuring out that there
(06:56):
are there's order. They're finding order in patterns and predictability
to the universe if you can apply mathematics to it. Yes, specifically,
if you can apply mathematics to the starting point right, right, So,
if you can if you can um figure out how
a system works mathematically speaking, right, you can go in
(07:18):
and plug in whatever coordinates you want to and watch
it go. You can predict what what the outcome is
going to be, and what this is the it's based
on what at the time was a totally revolutionary idea
um By Initially, I think the cart was the first
one to kind of say cause and effect is a
(07:38):
pretty big part of our universe, right. Yeah. It was
sort of like where this is the the sixteen hundreds where
early science met philosophy. They kind of complemented one another
as far as something that's we're talking about determinism, right,
So that was the kind of the seeds of determinism
was the scientific revolution, and like you said, where philosophy
(07:59):
and science came together in the form of Descartes, right.
And then Newton came along and we did a whole
episode on him. Yeah, January of this year. That was
a good one. It was really good. Like I think
you said in that episode that there's possibly no scientists
that changed the world more than Newton has. He's he's
got legs. People shouted out others and email, but I'll
(08:20):
just say he's at the near the top for sure
with some other people. The cream. So Newton came along
and Newton said that was his name, Isaac the Cream Newton, right,
and anytime he don't to be like cream, Yeah, you
just got creamed. So I thought he was a boxer.
He's a basketball player. He was much more well known
as a boxer, but he definitely could dunk as a
(08:42):
as a B baller. So um Man, that threw me
off a little bit. Yeah, the cream comes along and uh,
he basically says, watch this, dude, this causing effect thing
you're talking about, I can express it in quantifiable terms.
And he comes up with all of these great laws
(09:02):
and and basically sets the stage the foundation for science
for the next three centuries or so. Yeah, these these
laws that were so rock solid and powerful that scientists
kind of got ahead of themselves a little and said
we're done. Like with Newton's laws, we can predict Uh,
we can predict everything if we have a good enough
(09:24):
beginning accurate value to plug into his equations, and they weren't.
I think there was a little hubris and a little
just excitement about like, well, we figured it all out right,
that that you could take Newton's laws and if you
had accurate enough measurements, you could predict what the outcome
(09:45):
would be of that system that you plug those measurements
into, using these equations. And at the time, a lot
of this was like planetary like, well, we know that
these planets are here and they're moving and their orbiting.
So if we know these things, we can plug it
into an equation and we can figure out what it's
going to be like in a hundred years exactly. And
they've figured out the basis of determinism is what we
(10:09):
just said, that if you have accurate measurements, you can
take those measurements and use them to predict, um how
a system is going to change over time using differential equations. Right,
so this is what this is what Newton comes along
and figures out that you can describe the universe and
these mathematical terms using differential equations and um. Like you
(10:30):
said there was a tremendous amount of hubris, and well,
I think you said there's some hubris. I think there's
a tremendous amount of hubris where science basically said, we've
mastered the universe, We've uncovered the blueprint of the universe,
and now we understand everything. It's just a matter now
of getting our scientific measurements more and more and more exact.
(10:51):
Because again, the hallmark of determinism is that if you
have exact measurements, you can predict an outcome accurately, like
the pool cue example. Well, there's the pool table example, right, right,
So if you've got a pool table, let's say you're
playing some nine ball. You have that beautiful little diamond
set up, you got your cue ball, you put that
(11:12):
cue ball, and you, you crack it with the cue.
And if you are super accurate with your initial measurements,
you should be able to mathematically plot out via angles
where the balls will end up right exactly, Like you
can say, this is what the table will look like
after the break, if you know the force, the angle,
all those little variable temperature, if there's wind in the room,
(11:34):
like the felt on the table, like everything. The more
specific you are, the more accurate your end result will be. Right.
And then one of the other hallmarks of determinism is
that if you take those exact same initial conditions and
do them again, the table, the pool table will look
exactly the same after the break. Yeah, which is pretty
much impossible for like a human to do with their hands. Sure,
(11:56):
but the idea at the time of science was that
if you could build a perfect machine, sure that could
recreate these conditions, it will happen the same way every time, right, Yeah,
And this, I mean, this led to they had hubris,
but you could understand it when like literally in eighteen
forty six, two people predicted Neptune would exist within months,
(12:20):
that would exist, but does exist. And this is not
by looking up in the sky like they did it
with math and they were right. Yea. So imagine in
eighteen forty when that happens, they're like, yeah, we kinda
we've got the math down, so we're pretty much all
knowing well. Plus also for the most part these not
just with Neptune, they were finding, um that this stuff
(12:43):
really panned out. It held true for everything from um,
you know, the investigation into electricity to new chemical reactions
and understanding those, and it laid the scientific revolution, laid
the basis for the industrial revolution, and just the change
inge that came out of the world like that. It
definitely there. It is understandable how science kind of was
(13:06):
like we got it all figured out well. And like
you said, they even Galileo was smart enough to know
there's uncertainty in these measurements, like the precision is key,
so they spent what does the article say, A lot
of the much of the nineteenth and twentieth century just
(13:26):
trying to build better instrumentation to get more and more
smaller and smaller and more precise measurements. Right, that was
like basically the goal of it, right, Yeah, which was
the right direction. That's like exactly what they should have
been doing. The problem is there, Like you said, Galileo
knew that there was some sort of there, There're gonna
(13:46):
be some flaws in measurement, that we just didn't have
those great scientific instruments yet. Yeah. It's called the uncertainty principle. Okay,
it's accuracy, right, But the idea is if you have
a good enough instrument, you can overcome that, and that
the the more you shrink the um error in measuring
(14:11):
the initial conditions, the more you're going to shrink the error in the outcome. It would be proportionate, right?
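As a side note, here's a minimal sketch of the determinist expectation being described: in a tame, linear system, shrinking the error in your starting measurement shrinks the error in the outcome right along with it. The little rule below is a made-up toy, not anything from the episode or from Newton.

```python
# Determinist expectation: in a tame (linear) system, a smaller error in the
# starting measurement means a proportionally smaller error in the outcome.
# Toy example only -- not a model from the episode.

def linear_step(x):
    # A simple contracting linear rule: each step halves distances.
    return 0.5 * x + 1.0

def run(x0, steps=20):
    x = x0
    for _ in range(steps):
        x = linear_step(x)
    return x

true_start = 1.234567
for measurement_error in (1e-1, 1e-3, 1e-6):
    outcome_error = abs(run(true_start + measurement_error) - run(true_start))
    print(f"start error {measurement_error:.0e} -> outcome error {outcome_error:.2e}")

# The outcome error shrinks right along with the measurement error (here it
# shrinks even faster, since the map is contracting). Chaos theory's surprise,
# coming up, is that many systems don't behave this way at all.
```

That is the bargain determinism was counting on: better instruments, better predictions, forever.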
They were correct. The thing is, they were also aware of, but ignoring in a lot of ways, some outstanding problems, specifically something called the n-body problem. You know what,
(14:33):
I'm so excited about this. I need to take a break.
I think that's a good idea. I need to go
check out my end body in the bathroom, Okay, and
we'll be back, all right. Took we're back. So there's
(15:03):
some there's some issues right with determinism. There's some some
weird problems out there that are saying like, hey, pay
attention to me because I'm not sure determinism works right. Uh.
And one, one is the n-body problem. Yeah. How
this came about was in the 1880s. That was King Oscar number
(15:24):
two of Sweden and Norway. Yeah, I don't want to
leave out Norway both. Uh. He said, you know what,
let's offer a prize to anyone who can prove the
stability of the Solar system, something that has been stable
for a long time before that. And a lot of
the most brilliant minds on planet Earth got together and
(15:45):
tried to do this, uh, with mathematical proofs, and no
one could do it. Uh. And then a dude named Henri. You gotta help me there with that. Oh, say the whole thing. Henri Poincaré. Very nice. He was French,
believe it or not, and he was a mathematician, and
(16:06):
he said, you know what, I'm not gonna look at
this big picture of all the planets in the Sun
and all their orbits. You'd have to be a fool
to try that. Sure, he said, I'm gonna shrink this down.
Like we talked about shrinking that initial value, you know,
and um, that initial condition, he shrunk it down. He said,
I'm gonna look at just a couple of bodies orbiting
(16:26):
one another, uh, with a common center of gravity, And
I'm gonna look at this. And this was called the
N body problem, Yeah, which was smart to do, because
the more variables you factor into um a nonlinear equation
like that, just the harder it's gonna be. So he
shrunk it down. So the N body problem has to
(16:48):
do with three or more celestial bodies orbiting one another.
So Poincaré said, oh, I'll just start with three. Yeah. Smart.
And what he found from doing his equations for this
this King Oscar the Sequel Prize um, was that shrinking
the initial conditions measurement or rate of error right yeah,
(17:12):
did not really shrink the the error in the outcome, right,
which flies in the face of determinism. What he found
was that just very very minute differences in the initial
conditions fed into a system produced wildly different outcomes after
(17:33):
a fairly short time. Yeah. Like, let me just round
off the mass of this planet at like the eighth
decimal point, and you know who cares, who cares? At
that point? Let me just round that one to a two,
and that would throw everything off at a at a
pretty high rate. And he said, wait a minute, I
think this contest is impossible, right? He said, there
(17:57):
is no way to prove, to prove the stability
of the Solar system, because he just uncovered the idea
that it's impossible for us to predict the um, the
the rate of change among celestial bodies. Yeah, it's such
a complex system. There are far too many variables that, uh,
(18:21):
it's impossible to start with something so minute to get
the equation whatever, the sum that you want at the end. Well,
not only that a sum I guess, but the result
not only that, And this is what really undermined determinism
was that he figured out that you would have to
have an infinitely precise measurement. Yeah, which even if you
(18:44):
build a perfect machine that could take the infinitely a
measurement of, like, the movement of a celestial body around another, it's literally impossible to get an infinitely precise measurement, which
means that we could never predict out to a certain
(19:05):
degree the movement of these celestial bodies. Like he was saying, like, no,
you can't get, you can't build a machine that gets measurements precise enough that we can overcome this. Like
determinism is wrong, Like you can't just say, uh, we
have the understanding to predict everything. There's a lot of
(19:27):
stuff out there that we're not able to predict. And
he uncovered it trying to figure out this n-body problem. Yeah.
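To make Poincaré's discovery concrete, here's a rough sketch, not his actual proof, just a toy planar three-body simulation under Newtonian gravity. The masses, starting positions, and step size are all made up for illustration; the point is that a nudge in the eighth decimal place of one coordinate grows into a macroscopic difference.

```python
import math

# Toy planar three-body integrator (G = 1, made-up masses and orbits).
# Not Poincare's construction -- just an illustration of his finding:
# a tiny change in the starting data grows into a big change later.

def accelerations(pos, masses, soft=1e-3):
    acc = [[0.0, 0.0] for _ in pos]
    for i in range(len(pos)):
        for j in range(len(pos)):
            if i == j:
                continue
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            r3 = (dx * dx + dy * dy + soft) ** 1.5   # softened to dodge close-pass blowups
            acc[i][0] += masses[j] * dx / r3
            acc[i][1] += masses[j] * dy / r3
    return acc

def simulate(pos, vel, masses, dt=0.001, steps=20000):
    pos = [p[:] for p in pos]
    vel = [v[:] for v in vel]
    for _ in range(steps):
        acc = accelerations(pos, masses)
        for i in range(len(pos)):
            vel[i][0] += acc[i][0] * dt
            vel[i][1] += acc[i][1] * dt
            pos[i][0] += vel[i][0] * dt
            pos[i][1] += vel[i][1] * dt
    return pos

masses = [1.0, 1.0, 1.0]
start_pos = [[-1.0, 0.0], [1.0, 0.0], [0.0, 0.7]]
start_vel = [[0.0, -0.4], [0.0, 0.4], [0.3, 0.0]]

# Same system, but one coordinate "rounded off" in the eighth decimal place.
nudged_pos = [p[:] for p in start_pos]
nudged_pos[2][1] += 1e-8

end_a = simulate(start_pos, start_vel, masses)
end_b = simulate(nudged_pos, start_vel, masses)
gap = math.dist(end_a[2], end_b[2])
print(f"Body 3 ends up {gap:.3f} units away from where the nudged run puts it.")
# The gap typically ends up many orders of magnitude larger than the 1e-8 nudge.
```

Two bodies alone behave themselves; add a third and this kind of runaway sensitivity is the norm, which is exactly why the prize problem couldn't be solved the way everyone expected.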
And King Oscar the Sequel said, you win. Yeah, bring
me another rack of lamb and uh, here's your prize.
And he won by proving that it was impossible, which
is pretty interesting, and that utterly and completely changed not
just math, but like our our our understanding of the
(19:49):
universe and our understanding of our understanding of the universe,
which is even more kind of earth shaking. Yeah, he
discovered dynamical instability or chaos, and um, they didn't have
supercomputers at the time, so it would be a little while,
about seventy years at m I T until uh, we
could actually kind of feed these things into machines capable
(20:13):
of plotting these things out in a way that we
could see, which was really incredible. So there was this
dude, um, seventy years later, uh, named, um, Edward Lorenz. Or Lorenz? Yeah. Well, first of all, we should set
the stage the reason this guy he was a meteorologist
and scientist, not that those are not the same thing, right,
(20:36):
He's a scientist who dabbled in meteorology. He was a mathematician, Yeah,
but he was really into meteorology because it was there
was a weird juxtaposition at the time where we were
sending people into outer space but we couldn't predict the weather. Yeah,
and it was it was definitely a blot on the
field of meteorology. People were like, do you guys know
(20:57):
what you're doing? And meteorologists are all just like, you
have no idea how hard this is? Yeah, Like, yeah,
we can predict it a couple of days out, but after that, it's just, it's totally unpredictable. It drives us mad and
it's not, it wasn't just, um, their reputations that were at stake, like people were losing their lives because
of it, right? Yeah. In 1962 there were two
(21:18):
notorious storms, one on the East coast and one on
the west. Uh, the ash Wednesday storm in the east
and the Big Blow in the west. That killed a
lot of people, cost hundreds of millions of dollars in damage,
and people were like, you know, we need to be
able to see these things coming a little more because
it's a problem. And meteorologists were like, why did you
do it then? So they thought the key was these
(21:41):
big supercomputers. Remember the supercomputers. When they came out the
big rooms full of hardware, it was amazing and they
were finally able to do like these incredible calculations that
we could never do before. I know, they were able
to like crunch sixty-four bytes a second. Yeah, we
had the abacus and then the supercomputer, nothing in between. Um.
(22:02):
I looked up the computer that Lorenz was working with,
the Whopper? A Royal McBee. What was the Whopper? WarGames? Was it called the WOPR? W-O-P-R? I can't
believe they called it that. So the guy just nicknamed
it Joshua. No, Joshua was the software. Falken was
(22:23):
the old man who designed all this stuff, and his
son was Joshua. And that was the password? Oh, that
was the password. Yeah, I guess I was too young
to understand what a password was. Yeah, okay, you didn't
even there weren't passwords at the time. Shouted it at
the computer and they're like, okay, access granted. Yeah, still
that movie holds up, does it really totally got to
(22:45):
check it out? Yeah, still, very very fun. Young Ally Sheedy. Boy, I had a crush on her from that movie.
She was great. Yeah, what else was she in recently?
Wasn't she in something? Well? I mean she kind of
went away for a while and then had her big
comeback with the indie movie High Art, But that was
a while ago. Has she been in anything else recently? Sure?
(23:07):
I think I saw something and something recently and I
didn't realize that was her. She looks familiar. I was like, oh,
that's Ally Sheedy. I don't know. All right. I could
look it up, but I won't as it doesn't matter anyway.
I still crushed on her. So the the Royal McBee
was not quite the whopper. You could actually sit down
(23:28):
at it. The Royal McBee, that's the name of that
sounds like a Hamburger too. It was by the Royal
Typewriter Company and they got into computers for a second.
And this is the kind of computer that Lorenz was
working with, and it was a huge deal, like you
were saying Abacus supercomputer. UM. But it was still pretty
(23:49):
dumb as far as what we have today is concerned.
But it was enough that Lorenz, like Lorenz and his ilk, were like, finally, we can start running models and actually predict the weather. Yeah, he started doing just that.
He did. So he started off with, UM, a computational
model of twelve meteorological meteorological calculations, which is very basic
(24:13):
because there are infinite meteorological calculations, probably. Depending... Did I say it
wrong again, Like it sounds like you're about to say
it wrong and then you pull it out at the
last second. Maybe it's really impressive. But h So that's
very basic. But he wanted to start out, you know,
with something attainable, so he narrowed it down to twelve conditions,
basically twelve calculations that had you know, temperature, wind, speed, pressure,
(24:38):
stuff like that started forecasting weather. Uh. And then he said,
you know, it'd be great if you could see this,
So I'm gonna spit it into my wonder machine, the
McWhopper Royal McBee, and I'm going to get a print
out so you can visualize what this looks like. So
things were going well, and he had this print out,
(24:58):
and everyone was amazed because these these calculations never seemed
to repeat themselves. He was making like um, like like
word art. You remember that. That was the first thing
anybody did on a computer. It was to make word
art like a butterfly or right, you would print out. Yeah,
I never could do that. I couldn't either, Like you
(25:19):
have to be able to visualize things spatially that you
have to have that right kind of brain for that, right,
or you have to be following a guide book. But
you have you ever seen me? You and everyone we know? Yeah,
I love that movie. That's a great movie. Those little
kids in there, they were doing that Oh yeah, yeah, forever,
back and forth poop. Well I haven't I haven't seen
(25:41):
that since it came out. It's been a while. Oh
you gotta see it again. Yeah, great movie. Ali's not
in it. It's a Miranda July right, and she wrote
and directed to right. She did a great job. It's
like it's one of those rare movies where like there's
just the right amount of whimsy. Is whimsy so easily
overpowers everything else and becomes like yeah, yeah, this is
(26:06):
like the most perfectly balanced amount of like whimsy you've
ever seen in a movie. Yeah, there's too much whimsy.
I just like terrible Garden State. I just want to
punch in the face Terrible. Although I like Garden State,
but I haven't seen it since it came out. It
hasn't aged. Well, it's just when you look at it now,
it's just so cutesye and whimsical. It's like, come on, yeah, boy,
(26:28):
we're getting to a lot of movies today. Oh yeah,
Well we're stalling and we haven't even talked about butterfly
effect yet, which is coming and I'm dreading it. That's
why I'm stalling, all right. So where were we? He
was running his calculations, printing out his values so people
could see it, and then he got a little lazy
one day. In this output he noticed was interesting, so
(26:54):
he said, you know, I'm gonna repeat this calculation see
it again, but I'm gonna to save time, just gonna
kind of pick up in the middle, and I'm not
gonna input as many numbers, but I'm still using the
same values, just I'm not going out to six decimal points.
So the print out he had went to three decimal points,
so he was working from the print out and didn't
(27:16):
take into account that the computer accepted six decimal points,
so he was just putting in three and expecting that
the outcome would be the same. Right, Yes, but the
outcome was way different, right, And he went whoa, whoa what? Yeah,
He's like, what's going on here? There was a big deal.
I mean someone would have come up with this eventually
probably yeah, but sort of accidentally came upon it. It's
(27:38):
neat that this guy did this because it changed his career.
I think he went from an emphasis on meteorology to an emphasis on chaos math, to stud scientist, basically. So, I mean,
the guy's got an attractor named after him, you know
what I mean. Yeah, well let's get to that. So
Lorenz starts looking at this and he's like, wait a minute,
(27:58):
this is this is weird. This is worth investigating, and
like, uh, like, uh, what was his name? Poincaré? He said,
I need fewer variables, so I'm not going to try
to predict weather with these twelve differential equations that you
have to take into account. I'm just gonna take one
aspect of weather called the rolling convection current, and I'm
(28:21):
going to see how I can write it down in
formula form. So rolling convection current, chuck is where you know,
how the wind is created where air at the surface
is heated and then starts to rise and suddenly cool
air from higher above comes in to fill that that
vacuum that's left, and that creates a rolling um or
(28:46):
vertically based convection current. Okay you could, I would describe
it as oven oven, boiling water, a cup of coffee.
Wherever there's a temperature differential based on a vertical alignment,
you're going to have a role in convection current. Okay, yeah,
it sounds complex, but he just picked out one thing,
(29:07):
basically one condition and this is the one he picked out.
But had you seen my hands moving listeners, you would
be like, oh yeah, I know, he made little rolling motions.
So um, he's like, okay, I can figure this out.
So he comes up with three, three formulas that kind
of describe a rolling convection current, and he starts trying
(29:28):
to figure out how to describe this rolling convection current
right correct, And so, like I said, he got this
these three formulas, which were basically three variables that he calculated over time, and he plugged them in and he
found three variables that changed over time, and he found
that after a certain point, when you graph these things out,
(29:49):
and since there are three, you graph them out on
a three dimensional graph, so x, y and z. Again,
he wanted to just be able to visualize this because
it's easier for people to understand. He was a very
visual guy. All of a sudden, it made this crazy
graph that where the line as it progressed forward through time,
went all over the place. It went from this axis
(30:10):
to another axis to the other axis, and it would
spend some time over here, and then it would suddenly
loop over to the other one, and it followed no
rhyme or reason. It never retraced its path. And it
was describing how a convection current changes over time. Right,
and Lorenz is looking at this. He was expecting these
(30:33):
three things to equalize and eventually form a line, because
that's what determinism says, things are going to fall into
a certain amount of equilibrium and just even out over time.
That is not what he found. And what he discovered
was what Poincaré discovered, which was that some systems,
even relatively simple systems, exhibit very complex, unpredictable behavior, which
(30:59):
you could call chaos. Yeah. And when you say things
were going all over like if you look at the graph,
it it's not just lines going in straight lines bouncing
all over the place randomly, like there was an order
to it, but the lines were not on top of
one another. Like let's say you draw a figure eight
with your pencil and then you continue drawing that figure eight,
(31:19):
It's gonna slip outside those curves every time unless you're
a robot um. And that's what it ended up looking like. Yeah. Yeah,
it never retraced the same path twice ever. Um. It
had a lot of really surprising properties and at the time,
it just fell completely outside the understanding of science, right? Yeah.
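For the curious, here's a sketch of the three-equation convection model being described, the Lorenz system, using the textbook parameter values (sigma = 10, rho = 28, beta = 8/3, which aren't stated in the episode). It also replays the rounding accident from a few minutes ago: a second run restarted from the same state rounded to three decimal places soon stops tracking the first.

```python
# The three coupled equations Lorenz distilled from rolling convection,
# with the standard textbook parameters. Plain Euler stepping is crude but
# good enough to see the behavior.

SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0
DT = 0.001

def step(x, y, z):
    dx = SIGMA * (y - x)
    dy = x * (RHO - z) - y
    dz = x * y - BETA * z
    return x + dx * DT, y + dy * DT, z + dz * DT

def run(state, steps):
    x, y, z = state
    path = []
    for _ in range(steps):
        x, y, z = step(x, y, z)
        path.append((x, y, z))
    return path

full = run((1.0, 1.0, 1.0), 50000)

# Lorenz's accident: restart from a printout rounded to three decimal places.
mid = full[20000]
rounded = tuple(round(v, 3) for v in mid)
resume_full = run(mid, 10000)
resume_rounded = run(rounded, 10000)

for n in (1000, 5000, 10000):
    a, b = resume_full[n - 1], resume_rounded[n - 1]
    gap = sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
    print(f"after {n} steps the two runs are {gap:.3f} apart")

# The trajectory stays inside a bounded, butterfly-shaped region (the Lorenz
# attractor) but never retraces itself, and the rounded restart soon wanders
# off onto its own loops of that same attractor.
```

Plot the (x, z) pairs from `full` and you get the famous two-lobed butterfly picture: always inside the same region, never the same path twice.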
(31:41):
Luckily this happened to Lorenz, who was curious enough to
be like, what is going on here? And again he
sat down and started to do the math and thinking
about this and especially how it applied to the weather,
right yeah, and he came up with something very famous. Yes,
the butterfly effect. Yes, uh a, this thing kind of
(32:02):
looked like butterfly wings a little bit. Uh and b
When he went to present his findings, he he basically
had the notion, he's like, I'm gonna, I'm gonna wow these people in the crowd. In 1972, at a conference that I'm going to, I'm gonna, I'm gonna say something like, you know, the seagull flaps his wings and it starts a small turbulence, one that
(32:24):
can affect weather on the other side of the world. This small little thing will just grow and grow and snowball and affect things. And he had a colleague goes, like,
seagull wings, that's nice, and he said, how about this?
And this is the title they ended up with: Predictability, colon, Does the Flap of a Butterfly's Wings in Brazil
(32:44):
Set Off a Tornado in Texas? And everyone was like,
whoa mind's blown? Should we take a break? All right,
We'll be right back. All right. So the Lorenz attractor,
(33:18):
uh, is that picture that he ended up with. The Lorenz attractor? And this biblical pattern website that I found
described attractors and strange attractors in a way that even
dumb old me could understand what you got. So if
I may, he says, all right, here's the cycle of chaos.
(33:41):
He said... actually, I don't know who wrote this. Could have been a woman, could have been a small child, could have been someone of undetermined gender. I have no idea, but the gender-neutral narrator, they said... sorry. Think about a town
that has like ten thousand people living in it. To
(34:02):
make that town work, you've got to have like a
gas station, a grocery store, a library, um, whatever you
need to sustain that town. So all these things are built,
everyone's happy. You have equilibrium. He said, So that's great.
Then let's say you build some Someone comes and builds
a factory on the outskirts of that town, and there's
(34:23):
gonna be ten thousand more people living there, and they
don't go to church. Maybe so, uh did I say church?
They needed a church? Okay? I was just assuming this
is what's caum no, no, no, but you just have
more people. So there's you need another gas station and
another grocery store. Let's say so they build all these things,
and then you reach equilibrium. Again, it's maintained because you
(34:47):
build all these other systems up. That equilibrium is called
an attractor. Okay. So then he said, it said, they said, he, capital-H He, the royal He, said, all right,
now let's say instead of that that factory being built,
(35:08):
and you have those original ten thousand, and let's say three thousand of those people just up and leave one day,
and the grocery store guy says, well, there's only seven
thousand people here. We need eight thousand people living here
to make a profit. So I'm shutting down this grocery store.
Then all of a sudden, you have demand for groceries.
So things go on for a little while, and someone
comes in and say, hey, this town needs a grocery store.
(35:30):
They build a grocery store, they can't sustain, they shut down.
Someone else comes along because of the demand, and it is this search for equilibrium, this dynamic. Well, you reach equilibrium here and there as the store opens, periods of stability,
periods of stability, and that dynamic equilibrium is called a
(35:51):
strange attractor. So an attractor is the state which a
system settles on. A strange attractor is the trajectory on which
it never settles down but tries to reach the equilibrium
with periods of stability. Man, does that make sense? That Bible-based explanation was dynamite. I understand it better than
(36:12):
I did before, and I understood it okay before. That's great,
surely can add yeah, yeah, now you're gonna add to it. No,
that's it. No, I mean like it. Yeah. And attractor
is where if you graph something and eventually it reaches equilibrium,
it's a regular attractor. If it never reaches equilibrium, is
(36:33):
constantly trying to and has periods of stability. Strange attractor.
I can't. I can't top that. All right, grocery store,
small town. That was great. So um Lorenz, a strange
attractor was named the Lorenz attractor, after him. Big deal.
They weren't using the word chaos yet. No, but he
published that paper about butterfly wings, right, the butterfly effect,
(36:55):
and it coupled with his pictures, the picture of a
strange attractor, which is almost the aside from fractals, almost
the the the um emblem or the logo for chaos theory,
the Lorenz attractor is, um. It got attention off the bat.
It wasn't like Poincaré's findings, where he got neglected
(37:17):
for seventy years. Almost immediately everybody was talking about this
because again, what Lorenz had uncovered, which is the same thing that Poincaré had uncovered, is that determinism is possibly
based on an illusion that the universe isn't stable, that
the universe isn't predictable, and that what we are seeing
as stable and predictable are these little periods windows of
(37:39):
stability that are found in strange attractor graphs. That that's
what we think the order of the universe is, but
that that is actually the abnormal aspect of the universe,
and that instability, unpredictability, as far as we're concerned, is
the actual state of affairs in nature. And I think 'as far as we're concerned' is a really important point
(38:02):
to, Chuck, because it doesn't mean that nature is unstable, chaotic.
It means that our picture of what we understand is
order doesn't jibe with how the universe actually functions. It's
just our understanding of it, and we're just so, um,
(38:22):
anthropocentric that, you know, we we see it as chaos
and disorder and something to be feared, when really it's
just complexity that we don't have the capability of predicting.
After a certain degree. Yeah, I think that makes me
feel a little better, because when you read stuff like this,
you start to feel like, well, the Earth could just
throw us all off of its face at any moment,
(38:46):
because it starts spinning so fast that gravity becomes undone,
and I know that's not right. By the way, I've
always loved that kind of science that shows we don't
know anything, like, uh, David Hume, who I know, I
understand was a philosopher, but he was a philosopher scientist. Um.
His whole jam was, like, cause and effect is an illusion, that, like, it's just an assumption,
(39:07):
like that if you drop a pencil, it will always
fall down. It's an illusion. And this is pre um
gravity, understanding gravity, but he makes a good point. Pre-gravity, when everyone's just floating around. Yeah, going, this pencil's got
me wacky. But but the point was that you know,
we we are. We base a lot of our assumptions
(39:27):
um or a lot of stuff that we take as
law are actually based on assumptions that are made from
observations over time, and that we're just making predictions that
cause and effect is an illusion. I love that guy, and
this this definitely supports that idea for sure. Yeah. Sorry,
I'm I'm excited about chaos theory. Can't believe it? Well,
(39:48):
I mean I like that I'm able to understand it
and enough of a rudimentary way that I can talk
about it at a dinner party. Well, thank your Bible website. Well,
once you take the formulas out, yeah, for people like us,
we're like, okay, we can understand chaos. Yeah. Then when
somebody says, good, do a differential equation, you're just like,
(40:10):
what a different equation? Alright? So earlier I said that
chaos had not been used the word chaos to describe
all this junk. Uh, And that didn't happen until later on,
and well actually about ten years, you know, but it
was kind of at the same time this other stuff
was going on with the Lorenz late sixties early seventies.
(40:30):
There was a guy named Stephen Smale, uh, Fields Medal recipient,
so you know, he's good at math and um he
describes something that we now know as the Smale horseshoe,
and it goes a little something like this. Uh so,
all right, take a piece of dough, like like bread dough,
(40:54):
and you smash it out into a big flat rectangle
can do. So you're looking at that thing and you're like, boy,
I hope this makes some good bread. This is gonna
be so good. So then you do a little rosemary on it. Yeah, maybe so. Uh, well, sea salt. Yeah,
and then um, lick it before you bake it, so
you know it's yours. No one else can have it. Uh.
So you you have that flat rectangle of dough, you
(41:16):
roll it up into a tube, and then you smash
that down kind of flat, and then you bend that
down to where it eventually looks like a horse shoe.
So now you take that horseshoe. You take another rectangle
of dough, and you throw that horseshoe onto that, and
then you do the same thing. The Smale horseshoe basically
says you cannot predict where the two points of that
(41:40):
horseshoe will end up. Yeah, you can roll it
a million times and they'll end up in a million
different places, totally random, different places to totally random. You
never know. It's like a box of chocolates. You never
know what you're gonna get. You have to say it,
and that became known. You have to say it. Oh
what, imitate Forrest Gump? And I can't do that.
(42:01):
That's fine, he's not one. He's not in my repertoire.
That's fine. Although I did see that again part of
it recently. Does it hold up well? I mean, take
out forty minutes of it and it would have been
a better movie, like all of that coincidence stuff that
Oh, I love that. And also, he did the smiley-face T-
shirt Like it was just too much, Like he really
(42:23):
hammered it too much. That was the basis of the movie.
I know. But see it again and I guarantee you,
like an hour and a half into it, you'll be like,
I get it. You know. It was a good Tom
Hanks movie that was overlooked. A Road to Perdition, Yeah,
that bad? That was a good one. Great. Sam Mendes? Oh man, that guy is awesome. Yeah. Oh, what
(42:47):
is he gonna do? He might do something he did
the James Bond, he did Skyfall. Yeah, yeah, I know. Also, that last one, that wasn't so great.
He's got a potential project coming up and he would
be amazing for it. And I don't remember what it was.
Did you see Revolutionary Road? Yes? God have it was
just like, yeah, you want to jump off a bridge?
That like every five minutes during that movie. That was hardcore.
(43:12):
Uh he did that one too, huh. Yeah, And don't
see that if you're like engaged to be married or
thinking about it, yeah, or if you're blue already. Yeah,
I'm yeah to take a really good good mood and
be like I'm sick of being in a good mood,
sit down and watch Revolutionary Road, watch Joe versus the
Volcano instead. Uh, where was I? Smale horseshoe? Is what
(43:33):
that's called? And um? That was he was the first
person to actually use the word chaos. Oh he was?
I think so? No? No, no, Yorke was. Thom Yorke's dad. Yeah, you're right, he wasn't the first person. Yorke, correct. But Smale's horseshoe illustrates a really good point, Chuck,
is it Thom Yorke's dad? Okay, no, but they're both British. Sure,
(43:54):
Yorke is. Actually, one's Australian? No, they're British. Um. So, uh
those two points which should which started out right by
each other, and then end up in two totally different places.
That applies not just a bread dough, but also too
things like water molecules that are right next to each
other at some point and then uh month later, they're
(44:17):
in two different oceans. Even though you would assume that
they would go through all the same motions and everything,
but they're not. There's so many different variables with things
like ocean currents, that two water molecules that were one
side by side end up in totally random different places.
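The stretch-and-fold being described can be sketched with the closely related baker's map: squash the unit square, stretch it, cut it, and stack the halves. It's a stand-in for Smale's horseshoe rather than his exact construction, but it shows the same thing, two crumbs that start a hair apart quickly end up in unrelated parts of the dough.

```python
# The baker's map: stretch the unit square, cut it, and stack the pieces.
# A stand-in for the Smale horseshoe's stretch-and-fold, not the exact map.

def bake(x, y):
    if x < 0.5:
        return 2 * x, y / 2          # left half: stretched and laid on the bottom
    return 2 * x - 1, (y + 1) / 2    # right half: stretched and stacked on top

a = (0.2000000, 0.7000000)
b = (0.2000001, 0.7000000)   # starts 0.0000001 to the right of a

for n in range(1, 26):
    a = bake(*a)
    b = bake(*b)
    if n % 5 == 0:
        gap = abs(a[0] - b[0])
        print(f"fold {n:2d}: horizontal gap {gap:.6f}")

# The horizontal gap roughly doubles with every fold, so after a couple of
# dozen folds the two crumbs sit in unrelated parts of the dough -- the same
# reason two neighboring water molecules can end up in different oceans.
```

Doubling a tiny gap over and over is all it takes: any uncertainty in where a point started gets amplified until the prediction is worthless.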
And that's part of chaos. It's basically chaos personified or
(44:38):
chaos molecule-fied. So we mentioned Yorke. Where I was
going with that was, Um, there was an Australian named
Robert May and he was a population biologist. So he
was using math to model how animal populations would change
over time, given certain starting conditions. Uh. So he started
(45:01):
using, uh, these equations, these differential equations, and he came
up with a formula known as the logistic difference equation
that basically enabled him to predict these animal populations pretty well. Yeah,
and it was working pretty well for a while, but
he noticed something really really weird, right. He had this formula, Um,
(45:22):
the logistic difference equation is the name of it. Sure, Okay,
So he had that formula, and he figured out that if you took r, which in this case was the reproductive rate of an animal population, and you pushed it
past three, the number three, So that meant that the
average animal in this population of animals had three offspring
(45:48):
in its lifetime or in a season whatever. If you
pushed it past three, all of a sudden, the number
of the population would diverge. If you pushed it equal
to three actually, or more right, it would diverge, which
is weird because a population of animals can't be two
different numbers, you know, like that herd of antelope is
(46:11):
not there's not thirty, but there's also forty five of
them at the same time. That's called a superposition,
and that has to do with quantum states, not herds
of antelopes. That was kind of weird. And then he
found if you pushed it a little further, if you
made the reproductive rate like three point oh five seven
or something like that. I think it was a different number,
(46:34):
but you just tweaked it a little bit, not even
to four. We're talking like millionths of a, of a, um, of a degree. Um. All of a sudden it
would turn into four, so there'll be four different numbers
for that was the animal population, and then would turn
into sixteen. And then all of a sudden, after a
certain point, it would turn into chaos. The number would
be everything at once, all over the place, just totally
(46:55):
random numbers that it oscillated between. But in all that chaos.
There would be periods of stability. Right, you push it
a little further and all of a sudden it would
just go to two again. But beyond that, it didn't
go back to the original two numbers and went to
another two. So if you looked at it on a graph,
it went: line, divided into two, divided into four, eight,
(47:16):
sixteen, chaos; two, four, eight, sixteen, chaos, all before you even got to the number four for the reproductive rate. Yeah.
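Here's a sketch of the behavior being described, using the standard logistic difference equation, x_next = r * x * (1 - x). The cutoff values in the comments are the textbook ones, not numbers from the episode.

```python
# Robert May's logistic difference equation: x_next = r * x * (1 - x),
# where x is the population (as a fraction of the maximum) and r is the
# reproductive rate. Watch what the long-run values do as r creeps up.

def long_run_values(r, x0=0.5, settle=2000, keep=16):
    x = x0
    for _ in range(settle):          # let transients die out
        x = r * x * (1 - x)
    seen = []
    for _ in range(keep):            # then record where it settles (or doesn't)
        x = r * x * (1 - x)
        seen.append(round(x, 4))
    return sorted(set(seen))

for r in (2.8, 3.2, 3.5, 3.56, 3.7):
    values = long_run_values(r)
    label = f"{len(values)} value(s)" if len(values) <= 8 else "no repeat -- chaos"
    print(f"r = {r:4}: {label}  {values[:8]}")

# Roughly: below r = 3 the population settles on a single number; just past 3
# it flips between two numbers; around 3.45 it flips among four, then eight,
# sixteen...; and by about r = 3.57 it never repeats at all -- chaos, with odd
# little windows of order hiding inside, all well before r ever reaches 4.
```

That doubling cascade, one value, two, four, eight, sixteen, then chaos, is exactly the graph they're describing.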
And he was working with Mr. Yorke, because he was
a little confounded. So he was a mathematician buddy of his,
James Yorke from the University of Maryland. So they worked
together on this. In nineteen seventy-five they co
(47:39):
authored a paper called Period Three Implies Chaos. And man,
finally somebody said the word. I kept thinking it was
all these other people. Yeah, and the this this paper
where they first debut the name chaos. Um. They they
based it um. Tom York's stay based it on Edward
(48:01):
Lawrence's paper. He was like, you know what, I have
a feeling this has something to do with the Lawrence attractor.
So that um, that that provided chaos to the world,
and it it was the basically the third the third
time a scientist had said, we don't understand the universe
like we think we do, and determinism is based on
(48:24):
an illusion of order in a really chaotic universe. And this, uh,
this established chaos. It took off like a rocket. And
the eighties and the nineties, you know, as you know
from Jurassic Park, chaos was everything. Everybody's like chaos, this
is totally awesome. It's the new frontier of science. And
then it just went It just went away, And a
(48:45):
lot of people said, well, it was a little overhyped,
but I think more than anything, and I think this
is kind of the current understanding of chaos because it
didn't actually go away. It became a deeper and deeper field.
As you'll see, um, people mistook what chaos meant.
It wasn't the a new the new type of science.
(49:06):
It was a new understanding of the universe. It was
saying like, yes, you can still use Newtonian physics, like
don't throw everything out the window. You can still try
and predict weather and still try and build more accurate
instruments and get you know, decent results, but you can't
with absolute perfection predict complex systems. Like, determinism, the ultimate
(49:28):
goal of determinism is false. It can never be it
can never be done because we can't have an infinitely
precise measurement for every variable or any variable. Therefore, we
can't predict these outcomes. Right, So you would expect science
to be like, what's the point, what's the point of anything? No,
not science. Well, some some chaos people have said, no,
(49:49):
this is this is great, this is good. We'll take this.
Will take the universe as it is, rather than trying
to force it into our pretty little equations and saying,
if the ocean temperature is this at this time of year, uh,
and the fish population is this at that time, then
this is how many offspring this fish style, this fish
(50:10):
population is going to have. Um, say, okay, here is
the fish population, Here is the ocean temperature, here all
these other variables. Let's feed it into a model and
see what happens. Not this is going to happen. What
happens instead, And this is kind of the understanding of
chaos theory. Now, it's taking raw data, as much data
(50:33):
as you can possibly get your hands on, as precise
data as you could possibly get your hands on, and
just feeding it into a model and seeing what patterns emerge.
Rather than making assumptions, it's saying, what's the outcome, what
comes out of this model? Yeah, and that's why like
when you see some things like you know, fifty years
ago they predicted this animal be its extinct and it's not. Well,
(50:55):
it's because the variations were too complex they tried to predict. Uh.
And that's why if you look at a ten day forecast, you, sir,
are a fool. All right, It's true, Well, ten days
from now says it's going to rain in the afternoon.
Come on. But if you take if you took enough
variables for weather for like a city, and fed it
(51:19):
into a model of the weather for that city, you
could find, uh, you could find a time when it
was similar to what it is now, and you could
conceivably make some assumptions based on that. You can say, well,
actually we can we can predict a little further out
than we think. But um, it's it's based on this
(51:41):
this theory, this understanding of chaos, of unpredictability, of not
just not forcing nature into our formulas, but putting data
into a model and seeing what comes out of it. Yeah.
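A rough sketch of that "pour the data into a model and see what comes out" mindset: to guess what happens next, look back through the record for moments that resemble right now and see what followed them. The fake "weather record" and the nearest-neighbor trick below are just an illustration of the idea, not any particular forecaster's actual method.

```python
# "Don't force the data into a formula -- pour the data into a model and see
# what comes out." A bare-bones analog forecast: to guess tomorrow, find the
# past moments that look most like today and see what followed them.
# The 'record' here is a made-up chaotic series, not real weather data.

def make_record(n=5000, x=0.37):
    record = []
    for _ in range(n):
        x = 3.9 * x * (1 - x)        # stand-in for some measured quantity
        record.append(x)
    return record

def analog_forecast(history, current, horizon, k=5):
    # Find the k past values closest to `current`, then average what the
    # record says happened `horizon` steps after each of them.
    candidates = sorted(
        range(len(history) - horizon),
        key=lambda i: abs(history[i] - current),
    )[:k]
    return sum(history[i + horizon] for i in candidates) / k

record = make_record()
history, future = record[:4500], record[4500:]

for horizon in (1, 2, 5, 10):
    guess = analog_forecast(history, future[0], horizon)
    actual = future[horizon]
    print(f"{horizon:2d} steps ahead: guessed {guess:.3f}, actual {actual:.3f}")

# Short-range guesses land close; push the horizon out and the guesses drift,
# which is the ten-day-forecast problem in miniature.
```

The point isn't the specific trick, it's the posture: feed the model as much precise data as you can, see what patterns emerge, and accept that the useful horizon is limited.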
And then at the end of that, you learn like
when that animal is not extinct like you thought it
would be, you go back and look at the general
thing and you have a more accurate picture of how
(52:03):
the you know, data could have been off slightly this
one value, and then you have more buffalo than you think. Yeah,
sure you got buffaloed by chaos. And we're not even
getting into fractals. It's a whole other thing. And we
did a whole other podcast in June about fractals and
(52:23):
Mandelbrot. Benoit Mandelbrot? Mandelbrot, Mandelbrot. And
go listen to that one and hear me clinging to
the edge of a cliff. Cliff, man. We, we should
end this, but first, um, I want to say, there
is a really interesting article it's pretty understandable on Quanta
magazine about a guy named George Sara and he is
(52:49):
a chaos theory dude who's got a whole lab and
is applying it to real life. So it's a really
good picture of chaos theory in action. Go check it out. Okay, uh,
if you want to know more about chaos theory, I
hope your brain is not broken. Yeah, go take some
(53:09):
LSD and look at don't do that. Um, you can
type those words into how stuff works in the search bar,
any of those: fractals, LSD, chaos. It'll bring up some
good stuff. And since I said good stuff, it's time
for listener mail. Now, I'm gonna call this a rare shout-out.
(53:30):
You get requests all the time. I bet I know
which one it is. Really? Yeah, dude and his girlfriend. Yeah. No,
so far, so good. Hey, guys, just want to say
I think you're doing a wonderful job with the show
to date. My first time listening was during my
first deployment. Uh yeah, when I listened to your list
on famous and influential films, and I was hooked after that.
(53:53):
Since I came back state side, I spent many hours
driving to and from, uh, seeing my girlfriend, to my barracks,
and I can happily say that they've been made all
the more enjoyable by listening to you guys. Even my
girlfriend Rachel has warmed up to you dudes, which was
not a pleasant I'm sorry, which was a pleasant shock
to me, as she has told me repeatedly that she
(54:14):
cannot listen to audiobooks because quote, hearing people talk on
the radio gives me a headache. End quote. Anyway, I
hope you guys continue to make awesome podcasts as I'm
headed out on my next deployment. And if you could
give a shout out to Rachel, I'm sure it would
make her feel a little better that I got the
pleasant people on the podcast to reaffirm how much I
love her. That is John. Rachel, hang in there. John, be
(54:39):
safe and uh, thanks for listening. Yeah, man, thank you.
That's a great email. I love that one. Glad we
don't give you a headache, Rachel. Yeah, if she listened to this one and she's like, okay, oh yeah, everybody's
gonna get a headache from this one. Like I I
came to hate the sound of my own voice from
this one. Aw, you'll be all right. If you want
to get in touch with us, you can hang out
(54:59):
with us on Twitter at SYSK Podcast. Same
goes for Instagram. You can hang out with us on Facebook,
dot com slash Stuff you Should Know. You can send
us an email to Stuff podcast at how Stuff Works
dot com, and as always, join us at home on
the web. Stuff you Should Know dot Com. Stuff you
Should Know is a production of iHeart Radio's How Stuff
(55:19):
Works. For more podcasts from iHeartRadio, check out
the iHeart Radio app, Apple Podcasts, or wherever you listen
to your favorite shows,