Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Ignition sequence starts. Three, two, one.
(00:22):
Welcome back to University, everybody, the podcast where we explore the hard-hitting
questions about Earth, existence, and the unknown. I'm your host, AJ Perrin. With me
as always is Judson Martin, and for our special guest today we are joined by Dr. Tim Cullinan
from Iowa State. Hello. Tim, tell us a little bit about your research, your work, what you
are teaching right now. Yeah, so I actually don't do any research. I just teach. I'm an
(00:46):
associate teaching professor in material science and engineering at Iowa State. I've been doing
that now for about eight years and enjoying that. My role does not involve research, so
I just have a teaching commitment, and so I teach a variety of classes. For those that
are at Iowa State University, you might recognize my name because I teach a lot of students.
(01:06):
I teach introductory material science for non-majors, so that would be pretty much all
engineering students. The big clients there are mechanical engineering, industrial,
aerospace, and then a handful of the other departments often take it as an elective.
But I've kind of run the numbers before, and in a steady state when we're at sort of typical
(01:28):
enrollments, I'll be teaching about one out of ten students on campus at any moment. So that's
a lot of students. Yeah. Well, we actually weren't planning to have Dr. Cullinan on this
episode, and then I met with him yesterday to talk about some of the stuff that we were
going to talk about today, just get an outside perspective on the topic, and it turned out
to be a pretty productive conversation. I said, well, why don't you just do my job for me
(01:49):
today, and you're going to come on and you're going to tell people about entropy, which
is arguably maybe the hardest subject that we have ever covered here on the show, I think.
Would you agree, Judd? Yeah, it's definitely one of the more confusing ones. Maybe the
warp drives was tough. I mean, that included a re-record, and so hopefully that doesn't
(02:10):
happen this time. And if it doesn't happen this time, then it's a success. So let's really
jump right into it. But first, we have to cover our starting segments. And for the first one,
let's talk about brain gains. So that's where we talk about stuff that we learned this week
or interesting things that we've heard. And I've got a good one actually, this week, and
I have to pull out my calculator for it. So the average shower produces 2.1 gallons of
(02:34):
water per minute. So let's imagine for a second that you decrease the time that you shower
by two minutes. Now you have saved 4.2 gallons of water from that shower. Okay, right? Sure.
So Judd, I'm going to go out on a limb and say that you shower every day. Yeah. Okay.
And if you shower every day, that means in a year you would save about 1,500 gallons of water if
(03:00):
you showered two minutes quicker. Every day? Every day. Okay. Now, Judd, do you plan to
live a long time? I don't know. I mean, we'll see. Adrenaline junkie. Yeah, okay. You're
more of an iPad kid in my brain. I don't see you getting out on too many adventures. So
I'm going to be bold. What are you going to put me at for my lifespan? I'm going to be
bold and say 80. Okay. And I'm going to say you didn't shower once a day up until you were
(03:24):
about 10, which is valid. Okay. Sure. Okay. So now you have saved 107,000 gallons of water.
And do you know how much is in an average swimming pool? I don't. That would be about 16,000 gallons.
So now if you tally that up across your lifetime, you would save enough water just showering
two minutes quicker to fill almost seven swimming pools. And that's like a large swimming pool.
(03:48):
That's not even like a teeny traditional backyard swimming pool. Not the kiddie pool? Yeah.
About 14,000 to 15,000 gallons would be a pretty average one for a household, and so 16,000
is pretty big. Okay. Now, the reason I bring this up is because that example was just you.
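(For anyone who wants to check the napkin math, here is a minimal sketch of the calculation using the numbers quoted above; the 2.1 gallons per minute, the age 10 to 80 range, and the 16,000-gallon pool are just the figures from the conversation.)

```python
# Back-of-the-envelope check of the shower savings discussed above.
GALLONS_PER_MINUTE = 2.1      # average shower flow rate quoted above
MINUTES_SAVED = 2             # showering two minutes quicker
DAYS_PER_YEAR = 365
SHOWERING_YEARS = 80 - 10     # daily showers from roughly age 10 to 80
POOL_GALLONS = 16_000         # the "large" pool size quoted above

per_year = GALLONS_PER_MINUTE * MINUTES_SAVED * DAYS_PER_YEAR   # ~1,533 gallons
lifetime = per_year * SHOWERING_YEARS                           # ~107,310 gallons
print(f"{per_year:,.0f} gallons per year")
print(f"{lifetime:,.0f} gallons over a lifetime, ~{lifetime / POOL_GALLONS:.1f} pools")
```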
What about Matt? Okay. Matt takes, Matt's our other roommate. He takes long showers,
(04:09):
but he also doesn't listen to this show, so we don't have to worry about calling him out
for that. And then Tim, did you hear or learn anything interesting this week? Maybe our
conversation yesterday sparked something. Well, certainly it did. But I have some things
outside of entropy that were interesting that came my way this week. So just yesterday,
actually, a coworker of mine, he had published on this a little while ago, a few weeks ago,
(04:33):
but it came to my attention yesterday that there's work being done here at Iowa State
to try and make what are otherwise brittle materials, strong but brittle, to get them
to offer some ductility. And for maybe the general audience, that may not make sense.
But for those of you that have a mechanical inclination or engineering background, you
(04:53):
might realize that that's kind of a difficult thing to do. And so he's specifically working
with ceramics, ultra hard ceramics, and using electric fields to actually move dislocations.
And if you've been through my class, you might kind of appreciate just how difficult that
might be to even cause in any way, like with force or whatever. And so they're able
to do that with fields, with electricity, electric fields, actually.
(05:18):
Tim, how would you describe a dislocation? What does that mean to a material? Are dislocations
a good thing or a bad thing or what? They're actually a good thing. So what that is, is
it's a defect in what would otherwise be an orderly arrangement of the atoms, the ions,
whatever the particles are. And for those that don't know, the quick rundown of this
(05:38):
is that in ceramic materials and strongly bonded materials, it's really difficult for
those types of defects to move. And by contrast in metals, they tend to move easily, which
is why metals show ductility. We can typically bend metals, we can deform them a lot before
they fracture and break. Whereas in a lot of other systems like ceramics, highly covalent,
highly ionic systems, those defects don't have that kind of mobility typically. What
(06:02):
we're finding is that if we put them under the right conditions here with electric field,
they start to become mobile. And evidently enough that they start to offer what we would
call ductility, which is kind of crazy. Like that's not a term that we normally associate
with ceramics. And so again, I haven't looked into that work in great detail. I just sort
of saw a headline come my way yesterday. And so I'm excited to dig a little deeper into
(06:24):
that. So are you saying we're taking something like, like you said, ceramics, could that
be something like what you find in your kitchen? Like something you could never imagine bending
can now be bent more because of these electric fields? Well, certainly you can find ceramics
like in your kitchen, you can find them in your bathroom, like a lot of people are familiar
with, you know, I always say toilets and bricks, but we're really talking about advanced ceramics.
(06:46):
So these would be compounds that you probably don't have access to in your everyday
life. They're used in engineering, they're used in all kinds of things, for other reasons,
but usually not in everyday life. They certainly are important, and they contribute to other
fields; other companies might buy and use these ceramics for different reasons. But personally, you probably don't
(07:09):
own any of these. But you are familiar with ceramics. Yeah, yeah. So now let's move into
news. Before we start here, just a couple things that have, well, the phrase that
comes to mind is crossed my desk, but I don't have a desk and I usually do my work
on the couch. But a couple things that have caught my attention recently, which is one,
so we know the sample from the asteroid Bennu that was collected by the OSIRIS-REx spacecraft
(07:35):
from NASA has touched down on planet Earth. And NASA is working through
that and they are scheduling a public release of like their initial findings on the 11th.
So we'll learn a little bit about what they initially found from that sample. And then
another big thing, which is, again, like I kind of breezed over that, but that's a huge accomplishment
(07:58):
to be able to take a spacecraft and go land it on a moving object in our solar system
and then have that return. This thing launched in 2016. So it's been seven years since its
original launch and to have success throughout that entire mission to have nothing go wrong
and deliver that back is truly an achievement, and we will learn a lot from just a silly
old rock actually. And then another thing was that by pointing the James Webb Space
(08:23):
Telescope at Europa, so I think how it works at NASA is they let like they allocate time
with the James Webb Space Telescope to different research groups. And one of the groups is
looking at Europa, which is one of the moons of Jupiter, and they found a pretty heavy
amount of carbon on the surface. And this is a big deal because if there wasn't something
(08:45):
constantly producing this carbon, it wouldn't have a reason to stay like on the surface,
it would have dissipated or gone elsewhere. But because it's present in an amount that's enough
to detect, scientists have reason to believe that there are actually possible signs of life
in the subsurface oceans of Europa that is then like bleeding this carbon out towards
(09:07):
the surface. And Europa is covered in ice with a giant ocean spanning the entire
moon underneath it. So that is a pretty interesting topic. Yes. I think the way that they choose
the missions that they allocate time for on something like the James Webb Space
Telescope is pretty interesting. So like they only have a given amount of fuel on board
(09:29):
and they can't really go refuel that very easily. So when they have to adjust the telescope,
they're either burning fuel or doing some other maneuver to reorient and they have to
choose these very wisely. Yeah, it makes me think of the satellite we talked about that
did the stuff with Saturn and Enceladus. What was that? Cassini. Yeah, okay. So that one
(09:51):
when we were looking at how it moves, I believe NASA comes up with some really interesting
ways for propulsion for these spacecraft that aren't like just going straight up and down
back and forth, right? Which is they sometimes have like, you know, it's slipping my mind.
You have to go listen to the episode on Enceladus. It's actually one of our more popular ones.
But I think it's like either some weird nuclear cells. Oh, yeah, they have these molecules,
(10:17):
or these systems, which are radiating off energy over huge periods of
time and they collect the energy from that radiation and then turn it into power for
the spacecraft. That's pretty neat. Yeah, I guess I don't remember the... Yeah, I'm
surprised I remember that actually. I don't remember that being like part of the discussion,
I guess. I think, I don't know. I bet you tune me out, so I'm not surprised. Yeah, whatever.
(10:41):
Let's jump right into the episode now though. Like I said, today we're talking about entropy.
And if you've never heard what that means, then I'm sorry that the first explanation
of entropy has to come from me, which is why I'm actually gonna let you avoid that. And
I'm gonna offload this onto Tim, because it seems silly of me to explain something about
(11:01):
thermodynamics when I have somebody who studies thermodynamics right next to me. Right. Well,
I'll give it a try. And a little maybe inside joke, if we create a little bit of chaos here
today in the podcast, that's, you know, well suited. It's about entropy after all. So perfect
time for that. So entropy. First of all, it's worth noting that entropy is as a concept,
(11:25):
very abstract. So if you're having a hard time following along, that's okay. We'll do
our best here to verbally try to communicate that to you. It has different meaning and
different interpretation, different like value that it brings in different contexts. So entropy
as a term, as a concept gets used in a lot of different spaces. It certainly has origins
in thermodynamics and in material science, but we hear that term sometimes show up in
(11:48):
other places too. And so we'll kind of explore a little bit that with some examples. The
origin of the term again comes from thermodynamics. It comes from Rudolf Clausius, who was studying
on the heels of Carnot, maybe for those that have taken thermodynamics, you may have heard
of things like the Carnot cycle and so forth. And he studied heat engines and things in
(12:10):
France a long time ago, but on the heels of that work, Clausius was interpreting some
of that work and he came up with and coined the term entropy. Entropy, if we break down
the word, has two parts. E-N-T-R-O-P-Y is the spelling. E-N comes from energy, and the -tropy part,
through translations, I'm looking it up, so from Greek it's actually turning or change, and
(12:34):
in German to English it ends up being transformation. So there's kind of two contributions to that
word that he created: E-N, energy, and -tropy, transformation. So already we see a connection
between this concept of entropy and energy. They're not the same. They are related in
certain contexts. Like in thermodynamics, they're certainly related, but there are contexts
(12:55):
where energy is maybe not relevant. So maybe like in pure mathematics or statistics or
these, some of these other examples, but from the origins of where the word came from and
the people that were studying these things, there is a connection to that energy. So Carnot
was originally studying the engines and thinking about engine efficiencies and trying to make
better engines essentially and better machines, more efficient machines. And so that work
(13:18):
was being done in like something before the mid 1800s. And then here comes Clausius and
kind of on the heels of that is looking at some of this work and is interested in really
this idea of what's called the dissipation of heat, or this idea that in any kind
of real process that occurs, there's going to be a degradation of energy. No 100% efficiency.
(13:40):
Right. The energy cannot be used 100% efficiently. There's always going to be creation of heat
that's not really usable in sort of a lay explanation. That sort of wasted heat. So
you could think about examples in everyday life where if you're going to traverse a mile,
there are different ways you could do that. You could walk and you could traverse that
mile at a low pace or you could sprint it or you could like scramble and tumble and
(14:03):
make your way. And you can imagine some of those journeys are going to create a lot more
exhaustion in you and create a lot of excess heat and sweat and so forth. And so that brings
up this idea that in any real process that has a finite rate where there's transformations
that ensue, there's going to be this dissipation of usable energy essentially.
(14:24):
Not all of it is completely usable. Now here's kind of a caveat to that.
If we start to slow down processes and or make the changes very, very small. So instead
of a mile that you have to traverse, maybe it's a millimeter. And if we could slow things
down, you start to approach higher and higher efficiencies. All right. So there's this general
(14:45):
concept of kind of rates and also the extent of change at play here. And one of the big
ideas to realize upfront is that if we have a process playing out where there's a lot
of change, and/or if we're trying to make those changes quickly, then by nature
we're going to have some inefficiencies. But if we slow things down and make incremental
change and only consider small changes, we have a chance to kind of get close to that
(15:10):
limit of 100%. But it's a limit. It's just like other limits in mathematics. We can't
quite get there in reality.
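(One concrete version of the limit being described here, for listeners who want the math: for an idealized heat engine run reversibly, that is, infinitely slowly, between a hot reservoir at temperature T_h and a cold one at T_c, the best possible efficiency is the Carnot bound, and any real, finite-rate engine falls short of it.)

\[
\eta_{\max} = 1 - \frac{T_c}{T_h}
\]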
Entropy, as we were saying starting off this conversation, is really a concept that traverses lots
and lots of different fields of study and even perspectives. And so later on, about
20 years later, Boltzmann comes along and he starts looking at things from a different
(15:33):
perspective. He starts to think about the actual like arrangement of matter that's inside
of maybe a system of interest, something we'd call a thermodynamic system or a material system.
He made efforts, among other people, to create a framework that didn't frame things
in terms of heat flow and work and efficiencies and so forth, which is natural
to do if you're thinking about engines, and that was the practical application at the
(15:55):
time. But then this was a pivoting point where now we're starting to think about what's
going on at the fundamental particle level. And so he comes up with an equation that actually
allows us to assign an absolute number to entropy. And it's very much related to the
details of arrangements at the particle level. And so it's kind of worthwhile to point out
(16:17):
that with Clausius' mathematical contribution, he was in a position to talk about changes
of entropy, but he wasn't really in a position to assign absolute entropies. But in the other
perspective, if you have all the details of the particles in your system and the configuration
options that they have, you could use Boltzmann's formula to actually assign an absolute entropy.
(16:41):
And so maybe it's worthwhile to point out that we have a symbol for entropy, it's S.
And so that's a standard symbol mathematically in thermodynamics for entropy is S. So we
might use that term here from time to time today.
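(A minimal sketch of that contrast, in symbols: Clausius' definition pins down only changes in entropy, in terms of heat exchanged reversibly at temperature T, which is why absolute values had to wait for the counting approach discussed next.)

\[
\Delta S = \int \frac{\delta Q_{\mathrm{rev}}}{T}
\]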
So again, for the people who are like completely lost right now, is this episode going
to be full of examples and metaphors and stuff like that?
(17:03):
It might be easier actually to just back up slightly. So if we were to think about just
a piece of material, this is kind of the space I occupy, materials. So if you think about
like a block of simple metal, right, and for those that don't know, I'll clue you in: something
like pure aluminum occupies space as atoms in an orderly fashion, right, in a fairly orderly
(17:25):
fashion as a solid. So nonetheless, if you're zooming in and see those atoms, there's a
pattern to them. There's an orderly arrangement, similar to like a marching band compared to
like, you know, just a crowd of people that's just doing something random. Okay. Whereas
if we compare that same block of aluminum to its melted state, if you heat it up and
make it a liquid, those atoms are now kind of random and disorderly and more chaotic.
(17:48):
And so very conceptually, we can already make a claim that the liquid state for that system
has a higher entropy, has more disorder, has more chaos compared to the say solid state.
We can add a different contribution to entropy, which is composition. Now we can say, well,
what if it's not just pure aluminum? What if you've got some copper in there once in
(18:08):
a while or something, you can kind of take this to a 2d example and think about, like
you were saying, like a grid of positions and you could fill those positions maybe with
like a marble, like they're all the same color, all right, all blue or something. And there's
only one unique way to do that. If all of those positions are the same color, there's
only one unique way to create that configuration. It'll always look that way. But if you bring
(18:30):
along even one red marble into that grid, there's now a lot of ways you can place that
red marble. And so you've introduced entropy, you've introduced disorder and chaos at the
kind of chemical level. You don't necessarily have to change the arrangement of the grid.
Like imagine you like maybe drilled some holes in a board so you could set these marbles.
You don't even have to change that necessarily. You can introduce entropy just through what
(18:53):
we call composition, just through changing the nature of what's sitting on a site, right?
Or both could be in play. And we could talk about some of those examples too.
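(To put numbers on that marble picture, here is a tiny sketch; the 4-by-4 board of drilled holes is just a made-up size for illustration.)

```python
from math import comb

SITES = 16  # a hypothetical 4x4 board of drilled holes

print(comb(SITES, 0))  # 1   -> all blue marbles: one unique arrangement
print(comb(SITES, 1))  # 16  -> a single red marble: 16 possible placements
print(comb(SITES, 2))  # 120 -> two red marbles: the count grows quickly
```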
So we know that entropy has to do with configuration. And there can be ordered configurations and
there can be disordered configurations. And we can look at this, like you said, from a
number of perspectives. This entropy can be used to describe stuff like encryption and
(19:16):
data, but most traditionally, in a thermodynamic
setting, it describes energy or, like you said, composition. So there are multiple different configurations
you can have, some of them ordered and some of them disordered. And yet, kind of the theme
of entropy is that it's constantly moving towards the disorder and away from the order.
(19:37):
And now I think the interesting thing is why.
Right. So that is actually connected to the second law of thermodynamics, which comes
from Clausius's work, which states that the entropy of the universe, so think about what
that really means. The universe is everything there is inside of that bound, right? That
(19:58):
entropy is either staying constant or increasing. Now we're at a state that's along the timeline.
We know we exist, we can see each other, there's things around us. So we are in an entropy
state along an increasingly higher entropic path. So we're continuing out towards higher
entropy states as time continues. And so that's the essence of the second law of thermodynamics.
(20:21):
The thing about Clausius's work is that it's still a little bit nebulous. Like it's hard
for humans to like witness and observe and carefully track like heat flows and work.
But this is where I actually kind of prefer the statistical view because in some ways,
at least for the beginning examples that we might discuss, I think it's a little bit easier
(20:41):
to kind of understand the inherent origins of entropy, even though entropy does play
a role and can help with those other perspectives like we were talking about earlier, like with
engines and so forth. And so this kind of connects back to what we're saying about Boltzmann.
So he creates this equation, he really chooses an equation. S equals a constant called Boltzmann's
(21:03):
constant, times the log of omega. So S equals k times log omega.
I was about to say that's conceited to call it after yourself. But if you do something
this spectacular, I guess you kind of get to do that, right?
Well, in full disclosure, I don't know if he named the constant. I suspect that it was
named after him later, but I'm not positive. Yeah, but actually, as far as
(21:23):
profound equations go, oftentimes they are, like, seemingly very simple,
but it's kind of, I think it's really interesting to dig a little deeper. You know, you could
think about like E equals MC squared, like one of the most famous equations that are
out there. And you could dig into that and you could realize like, oh my gosh, there's
so much like nuance here and like, just genius in that construct. It's a similar
(21:44):
thing here. And I think if we explore that a little bit that that will come out. So nonetheless,
he basically chooses this formula. Like remember entropy is a human invented concept to help
try to describe and understand systems in a predictable manner, right? That's the whole
point about, you know, science and engineering is to try to create frameworks that allow
(22:04):
us to control and predict systems, right? So he creates this and chooses this equation
and we should define that term. So entropy equals K, a proportionality constant, times
the log, the natural log, of omega, and omega represents the count of possible configurations
that a system might take. So already we see an interesting thing here. S is tied
to omega through the log function. And so nonetheless, S increases as the number of
to omega through the log function. And so nonetheless, S increases as the number of
configuration options increase. So that means that systems that are capable of showing more
possible configurations by definition have higher entropy according to this formula.
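(Written out, the formula being described is the one below, where k_B is Boltzmann's constant, about 1.38 x 10^-23 joules per kelvin, and Omega is the count of possible microscopic configurations.)

\[
S = k_B \ln \Omega
\]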
All right. And so we could dig a little bit deeper into that. And this, by the way, is
the beginning of what's called statistical mechanics. So statistical mechanics as a field
(22:52):
of study was coined by J.W. Gibbs, famous thermodynamics investigator at a similar time.
So he along with Boltzmann and along with James Clerk Maxwell, kind of work on this
field, this new perspective known as statistical mechanics. And they're thinking about systems
in different ways, thinking about the small particles that are at play, their
arrangements, their configurations, their little tiny energies that of course sum up
(23:16):
to the full energy of like a real system.
So let me get this straight. You're saying Boltzmann, what Boltzmann was saying is the
more possible configurations you have, aka like the greater the size of the system, the
greater the entropy. So bigger systems like by default have to basically be at higher
(23:36):
entropy or be moving towards higher entropy.
Yeah, that's a good point. There's kind of a couple of ways that you could imagine easy
ways that you could increase entropy. You could just make a bigger system, right? And
it would potentially have more entropy. So yes, it's an extensive property entropy, meaning
it scales with the system size. So if I have like a small bucket of liquid aluminum and
(23:57):
I have a large bucket of liquid aluminum, the larger one, even though they're both liquid
and it might have a very similar overall structure throughout, the larger bucket has more entropy
because it's a larger system. So there's an extensive nature to entropy. That's an important
point that's going to come back here in a second. What ends up happening here is Boltzmann
chooses this log functionality, and that's important because there's a couple of things
(24:17):
that are required. One of those criteria is that whatever function we choose
for determining absolute entropy, we need it to be able to account for the fact that it
scales with options. And we see that the log function is positively correlated with its
input here. The input is the count of choices. And just as an example, we talked earlier
(24:39):
about the simple case of like the grid, where if you have the grid and you're going to put
all one color, like all blue marbles, how many ways did we say, could you do that? I
have all kinds of marbles I could choose from, but I'm going to put only blue on that grid.
How many ways? I was hoping Judd would, like, jump in there. It's not zero and it's
not two. It's just one? Yes, exactly. It's just one. It's just one. Yeah. So not a trick
(25:03):
question. So it's just one. Now here's the real trick. What's the log of one? Zero. Zero.
So the entropy of a unique ordering is by definition zero. Yeah. Right. So a perfect
ordering in this context, we're thinking about kind of like chemical entropy, like red marbles,
blue marbles. Yeah. And if they're all the same, one unique arrangement,
no entropy. That's kind of right. No disorder. There's that sort of a fundamental outcome
(25:27):
of this log function that needs to be in place. We already talked about that. There needs
to be a lower bound of zero on the entropy. All right. Now most systems have some sort
of disorder. So most systems are above zero. But the point is there needs to be that lower
limit. That's good. And think about it. The input needs to be a count. All right. And
a count of zero means you have no particles, and with no particles there's no entropy to express. Right.
(25:49):
And indeed, the log of zero is undefined. And so that also kind of fits in the framework
here. So the log function is showing to be very good at satisfying some of these connections
back to the macroscopic. Right. But there's another one that's really important. And that
is that this extensive nature that we talked about, if you were to divide, let's say a
system in half, just to make it easy. All right. We have a system and we're just going
(26:11):
to put like a barrier in place, not going to like physically separate them. We're just
going to imagine a barrier and think about this idea that we could kind of consider the
system as the sum of the two parts or the sum of N parts. Well, if we knew the entropy
of each of those contributing parts, if we had that number for each of those,
those individual S values, we could sum them and get the total S, the total entropy of
(26:34):
the system. Well, think about log function. All right. If you have, let's say N or we'll
just keep it at two for now. If you have two subsystems, right? Two subsystems that have
their own contributing configuration options, omega, you could take the product of those
omegas and insert that as your input to the Boltzmann's function. And guess what happens
(26:58):
because of log? What's the log of, what's the log of a product
of two numbers?
So like log AB equals log A plus log B.
Yes. Yes. So it maintains the extensive nature, right? So what you find is that just to verbalize
that for folks, maybe we could throw an equation up, I don't know. But to summarize, the
total entropy S would be equal to KB log of the product of the two different configuration
(27:25):
counts. But because of log properties that Judd remembered, we find that that's just
going to be K times log of the first count plus K times log of the second count, which
lo and behold is just the individual entropies summed. So it maintains that connection
as well. And so it was a very robust equation. Yeah. Very, very clever choice and very useful
(27:45):
choice for then allowing that connection back to some of these macroscopic details.
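(Since they mention maybe throwing an equation up, here is that additivity argument written out for two subsystems with configuration counts Omega_1 and Omega_2.)

\[
S_{\mathrm{total}} = k_B \ln(\Omega_1 \Omega_2) = k_B \ln \Omega_1 + k_B \ln \Omega_2 = S_1 + S_2
\]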
I can't even believe that somebody could just like make an equation. Like I'm so glad I
live right now when I can just get taught these in school, or feed them through
Google when I have to do my homework. But it's like at some point there was a person who
was trying to describe a property of our universe and had no idea how to do so except
(28:10):
by just critical thinking alone. And lo and behold, these equations hold up after
centuries because it turns out our universe is like a logical system that has these
properties inherently. You know, this was one of, I'm going to tell you, I've
been, you know, basically in school a very long time. I'm in like 25th grade or something.
I kind of joke about that. But this, you know, I was teaching thermodynamics for the first
(28:35):
time and on the hook for teaching that, you know, a couple of years ago. And when I finally,
you know, found a reference that explained this, it was one of the most satisfying things
of my entire career because it connected dots. You know, you think about
in school, you learn about the log properties and you think I'm never going to need that.
And I'm telling you, that was one of the most satisfying things to read. And I'm thinking,
(28:56):
oh my gosh, this connects so much. Yes. It's such an elegant way. Like, holy cow, like
this is really incredible. And I just like, I get chills thinking about it right now.
It's just really, really satisfying and elegant. And again, glad that somebody else figured
that out. I wonder like, what are the origins of like the log function in mathematics? Like
how it plays so closely into real life examples? I think the bigger question
(29:18):
is where did the log function even come from? Like, yeah, even Boltzmann
was like, oh, thank God somebody came up with this. I could just throw it into my equation.
And now here we are, like getting what are we going to use his equation for to add on
to the next thing? It's like there was, I don't know, it goes back to probably like Greece
or something like that, but still. Yeah. That's nuts. So yeah, so we've talked about
(29:43):
up to this point, configurations, and we've talked about how low entropy
means order and high entropy means disorder. And that when we have these bigger
systems and they're scaling up, we increasingly see a movement towards disorder. And there's
actually a very elegant explanation for this within statistics. If we're looking at something
(30:05):
like rolling dice. While we're still on the topic of the equation, and so that kind of
leads into what you're talking about. OK, wait. So what you're saying is, never mind. You're
going to say something more about it? I want to say one more thing and then you go. Yeah.
So while we're on the topic of the equation, so we have the Boltzmann constant times the
log of some omega. I wanted to talk about some statistics that I saw, and I don't know
(30:28):
who made these estimates, but somebody estimated the amount of entropy that was in the universe.
Oh, that was me. Yeah. Oh, you did that. Yeah. During the Big Bang. Yes. Yes. You were back
there. Yeah. OK. So they've estimated that the entropy of the universe was the Boltzmann
constant times 10 to the 88, 10 to the power 88. And that was the entropy, total entropy
(30:51):
of the universe during, or right before, the Big Bang, which is crazy to think about,
because how do you count how many different positions all these particles, like every particle
in the universe, how the energy can be configured? Yeah. That many states. I don't know who decided
that that's how many it was. That's kind of crazy to think about. But now today they estimate
(31:11):
that the entropy is about 10 to the 103rd times the Boltzmann constant, which is about
a quadrillion times larger. Yeah. It's not like we just said one hundred and
three instead of eighty-eight. Yeah. So it's not like 15 times more than that. It's
10 to the 15th times more, which is nuts. And I think speaking of like that
(31:35):
started the universe thing. So I watched the same video that you watched, that
I think you're referring to this from. But the interesting thing is that when you think
of the start of the universe, which I think we may have talked about before, the
simple explanation or description of the start of the universe is that it was a hot and
dense state. It was a cluster of a lot of energy. And like to me, I look at
(31:59):
that and I think like that's so chaotic. That's so like mixed up. That's so complex. But this
is where the idea that like entropy is this misunderstood concept comes in, because that
start of the universe, like you said, was lower entropy. In fact, when we have this
configuration of energy from this initial compact state, there are
(32:21):
many fewer configurations than when the universe expanded and created complex systems that
will continue to drive this entropy forward. So when I said the start of the universe was
low entropy and very low disorder, a lot of high order, which is the opposite of disorder
if you didn't know, what I mean is that the temperature actually, from point to point, and we know
this by looking at like the cosmic background radiation now using very advanced telescopes
(32:46):
that are beyond the scope of my knowledge. We know that the temperature fluctuations
back then were like less than 0.001 of a degree between any point in the universe
when it was in that hot dense state, which goes to show how like uniformly this energy
was distributed, which in turn shows us why entropy was so low back then.
(33:09):
And remember, there's different kinds of contributions to the overall entropy. And honestly,
for real systems, it's hard to track all of it and even make sure you've accounted
for everything, because there might even be things we're just not even familiar with,
it's possible. Yeah. So let me give you an example to maybe help with with that. Remember,
we said it was an extensive property, it's dependent on the system size, not just the
count of things, but also in some contexts, like the space they even have to occupy, right?
(33:35):
And so in the context of the beginning, if it's all in this relatively small encapsulated
volume, as there are more and more space opportunities for particles to exist, then naturally the
number of locations over which even a single particle can reside increases. And so overall,
and then think about the vast number of particles there are that now have more and more and
(33:56):
more options, it just scales tremendously rapidly, and increases that entropy.
So think about what's happened since the Big Bang, we have the formation of planets, we
have Earth where we have, we know we have crystalline matter, we know there's matter
that occupies orderly structures, right, crystalline structures. So in some locations in the universe,
(34:16):
locally, the entropy is lower than it might have been otherwise. But the overall change
is trending towards increasing. And so that's another useful thing to point out.
Why is this trend pointing towards higher disorder? Because I think we haven't
quite touched on that yet. It's like, I know they call entropy time's arrow, because
(34:37):
it continually moves in one direction, the entropy of the universe is constantly increasing.
So why is that? And I think we have to dive into statistics to completely understand that,
would you agree?
That's a reasonable starting point for the discussion. So this gets into the weeds
a little bit. And it would require a bit more math than maybe we're capable of
(35:00):
communicating verbally here. But systems would like to find an equilibrium state. So maybe
it's worthwhile to even define equilibrium first, like, what does equilibrium even mean?
If you're in a truly equilibrium state, like, what would be a way to describe that? Do you
think? No change, no change, right? So like, the universe would be very boring, by the
way, if things weren't changing. So the fact that we're in transient states is actually
(35:21):
is useful, right? So systems can come close to equilibrium, like we observe that all the
time. So certain systems, like if we were to look around the room, even like some of
the metal things that are supporting the microphones, whatever, like, they're reasonably close to
equilibrium, but some things are not like, maybe we're digesting food and inside of our
bodies, which are complicated, there's all kinds of things constantly happening, right?
(35:41):
So even though, you know, sitting here, nothing seems to change on the outside, there's a
lot going on on the inside. And so there were like far from equilibrium in certain locations,
even within ourselves, right? One way you could kind of think about equilibrium is sort
of tracking properties of material, like maybe you have, like, maybe you can imagine being
somehow a third party existential viewer of the universe. All right, that doesn't really
(36:03):
make sense. But just for argument's sake, you're like sitting out there and there's
somebody listening to this right now that's like, oh, well, aren't we all really third
party observers? So it's okay. Yeah. So nonetheless, like, I'm not part of them. I'm on my own.
So if you could do that, and you could maybe like have the ability to track certain things,
like maybe whatever it is, temperature, any number of things. Some math, I'm skipping some
(36:24):
things here. But if you take it on faith that systems that are isolated, the universe, I
would argue, is isolated, can only maintain their current entropy, or increase it.
You can't, overall, you can't go backwards. Right? That's right. It's like time. That's
kind of the underlying tenet here. So with that in mind, if we observe no changes, we
(36:48):
would reach an equilibrium state. All right. So if you're this third party observer watching
the universe and there are no changes occurring, again, life as we know it, everything as we
know it would probably be quite a bit different than it is, or non-existent
even. Like, how would you even, would your heart beat? Well, right. It opens up a lot
of questions that I'm not capable of answering. It's sort of an interesting discussion. But
(37:09):
the point is, if you could watch and monitor these things, then if you're looking
for the equilibrium configuration of the universe, you'd be looking for where the collection
of all possible properties you could track, none of them would be changing anymore. Right?
If you see change, right, if you see them changing, like whatever that might be, temperature
is an easy one to think about, or the universe's density, if those are fluctuating, we're not
(37:31):
in an equilibrium state. All right. And the underlying principle is that systems tend
toward, even the universe tends towards, an equilibrium. Well, what the second
law says is there's one property that we could conveniently use to track whether or not we're
in the equilibrium state. And that's the entropy because if the entropy is changing, then that
means that we are continuing to traverse through states towards equilibrium.
(37:56):
And we said that entropy can only increase. Right. That was sort of an underlying theme.
And so if we're continuously watching this increase, increase, we're going through transient
states, if we somehow arrived at a condition where the entropy no longer changed and we
waited in this third party observer again, waited and observed and watched, that would
be the equilibrium state for the universe. All right. And then we would expect no more
(38:19):
change beyond that. And so, as noted earlier in the conversation, we have evidence
that there's continually evolving entropy right now. We're in, like, sort of
a climb up a mountain, we are still climbing the mountain. And then eventually there's
a plateau. And then when that plateau is reached, it never goes up or down again. Right. And
associated with that would be not only just the entropy constant, if we're in an equilibrium
(38:42):
state, but the temperature would no longer change, the density would no longer change.
Any property you might track. It kind of made me think about a conversation that we had
in a previous episode on the expansion of the universe, which would be like, for
entropy not to be changing, the expansion of the universe would have to stop. Yeah,
that would be, or, okay, well, here's an idea. If the universe
(39:04):
eventually, like when we talked about it with Neil, he was like, okay, it's gonna expand,
and then we don't know, we can't say quite for sure if it's gonna reach a stopping
point, or continue, or the expansion will accelerate and eventually rip all the molecules
apart at the atomic scale, or if it will eventually condense. But I'm thinking here, like, if
eventually it will start to reverse itself and go backwards and condense itself, like
(39:27):
wouldn't that be returning energy into the system and therefore lowering the entropy
returning to this denser form? And I feel like, based on the conjecture that entropy
can only increase, like how could that even be possible? Well, remember where that came from.
That came from observations. This whole framework came from
observations, like in engines and things, where to date we haven't observed a breaking
(39:52):
of this rule. I'm not going to claim personally that there's not some set of conditions that
are in our future, or in some other, you know, spatial arrangement, where maybe
there's a reversal. And there's a school of thought out there, I can't remember the
name of it, we talked about it the other day. No, it's the big contraction or the big,
(40:13):
well, yeah, basically where if you think about time as sort of like a VHS tape, you put it
in and we're just kind of able to humans are able to perceive sort of one frame at a time
and with a very rapid set of frames sequential, that's what we experience as sort of life
or as time. There's a school of thought that when we get to the end of the tape that it's
going to like when we get to the limits of the universe expansion that there will be
(40:35):
a contraction back. And what that is, or how that actually executes out, meaning is it going
to be literally in rewind and we proceed through time backwards? That, again, is unverifiable.
We have no way to know. But there is a school of thought. There are people that think that
it might be that way or that it might just be generally a contraction and it's just going
to be maybe not exactly events reversing in the reverse sequence exactly. But nonetheless
(41:00):
an overall contraction, I don't know. But it's an interesting thought.
There's actually some pop culture, like movies and things, that account for this and have
some of their storyline based on some of these theories. I think it's interesting,
but ultimately it's unverifiable. All we know is what we've observed previously and what
we tend to observe. And that's where a lot of these so-called laws come from is that
(41:23):
to date we haven't really found anything that violates them. And that's what
makes them a law. That was very elegant. And I think that's like perfect time for a break
here. We're going to return from the break and talk more about entropy and what it has to
do with you. So we're back from the break and we're here to talk to you more about entropy.
(41:48):
And now we're going to explain a little bit more about why entropy is only really moving
in one direction and whether or not that's a good thing. All right. Let's talk about
some statistics. Right. Yeah, absolutely. So one of the things again that is important
is that when Boltzmann came along and started to think about things at the particle level,
(42:08):
real systems that are of interest to people have lots and lots and lots of particles.
And so it becomes very difficult to even track any metric about those particles as soon as
you get to even a system size of, you know, 100 or 1000. We're going to kind of try to
convince you that here in a moment. Yeah, we talked about that yesterday. Actually,
you're saying like humans don't have like an entropy meter or anything. We can't go
(42:29):
around and freeze time and then look at the configuration, pause it and look at every
bond in between atoms. Statistics kind of helps us make inferences about the measurements
of entropy. Right. And a lot of the tools that we need in this framework just come from
statistics as we'll see. And it's a great point you just made. We have a hard time with
that. Can you imagine back then there weren't computers, there weren't ways to do that,
(42:50):
right? Simulate it. Yeah. Well, even now we can't simulate accurately like huge systems.
I know you said something about how like it could take even for a supercomputer, like
thousands of years to try to make measurements on entropy of a given system. Right. Well,
the argument that I was making was that if you had like a simulation of let's
(43:12):
say a realistically sized piece of material, like what would that even mean? Like take
a dice of metal, like a metal chunk about the size of a dice that fits in your hand.
There's gonna be about a mole of atoms there. And even just like tracking 10 to the 26 scalar
values in a computer and being able to do calculations and manipulations. Yeah, very
quickly the computing power is not capable of keeping up with, you know, what we call
(43:34):
realistic sized systems. If again, you're tracking with like that full accuracy that
you would want. So you might have to, this is where again, I'm a little bit out of my
element. I don't do this type of thing for living, but there are people that certainly
model materials and they have tricks up their sleeve to try to get around some of those
issues. But yeah, system size, particle count is definitely a limitation when it comes to
(43:55):
those methods for sure.
You know, I was just, so one of the things that I did, I did some simulations of flow
around a given geometry and this was entropy, or well, it was enthalpy, which is I think
closely related in some ways, or at least a little bit related. And it was just one
of these equations that I looked at and I was like, I don't know what this means, but
(44:16):
my professor was like, just do it. It was part of the simulation and they took forever.
But also one of the research projects here at Iowa State that I was reading about was modeling
materials on an atomic scale and how they could really only model like, they were using
the supercomputer that we have here on campus and they were really only able to model like
(44:37):
a thousand by a thousand molecules at the largest at one time for the specific simulation, just
because of how intense it was.
So, let's move into statistical mechanics and kind of this notion of what we would call
microstates and macrostates and our good friend Boltzmann.
Right. Well, Boltzmann alongside another two important figures, J.W. Gibbs and James Clerk
(45:01):
Maxwell, they together, the three of them sort of came up with, and J.W. Gibbs actually
coined the term statistical mechanics. So if you've been listening to the conversation
so far and you've heard us say things like counts of possible configurations and so forth
and in your head if you're thinking kind of sounds like probability and statistics, you're
right. And so nonetheless, this perspective lends itself nicely to leverage those tools
(45:24):
and more accurately, a lot of these tools that we need come from the, I guess, the subfield
of combinatorics. And so I hope I'm saying that right. But basically thinking about combinations,
permutations and things like that, which would generally fall into the realm of statistics
as well. And so there's some key definitions that we want to get through here and then
we'll be able to kind of connect the dots. So first is this idea of a microstate. And
(45:49):
so in statistical mechanics, we're interested in the connections between microstates and
macrostates. A microstate refers to a specific state of a system. And so we're going to kind
of think about easy examples, easy systems to begin with. And so maybe like a system
of two dice, two six-sided fair dice, meaning there are six options for each dice that you
(46:14):
could roll and they're equally likely. All right. So there's no bias. It's not a loaded
dice or anything. And so we have, if we track the roll and let's say that, for example,
if we roll two dice and we get a one and a two or conversely a two and a one, all right,
then we have defined the microstate of that system. We have information about each observable
(46:37):
outcome. Here the observable outcomes are the roll that we get. And there, since there
are two dice in the system, there are two things to track. One, we're only tracking
one property of the two objects in this example. So again, a simple system. And the microstate
is the listing of whatever we're tracking for all the particles. Here we're only tracking
the face of the die that's face up, right? And so two pieces of information is all we're
(47:00):
tracking for those, one piece from each die. So the microstate would be a one and a two
or, vice versa, a two and a one. All right. They're kind of considered equivalent here. Another
way that we could kind of think about microstates as well. And we'll kind of flesh out this
little example here. We had talked about that earlier with like the grid, right? Where we
put all blue or we introduce a little bit of color, a specific arrangement on that grid,
(47:24):
like where exactly that red dot, red marble sits, would be a way to define if we just
had the, let's just say a system of like a hundred marbles and only one of them is red.
Assuming the location of the red is enough information to kind of describe that
microstate. So it's a specific instance of the system state, if that makes sense. One
(47:45):
of the things that is interesting here as a theme is that as we kind of alluded to earlier,
if we increase the count of things that we're tracking and or the options of things that
we're tracking, that can lend itself to like increased entropy. And so let's try to think
about that for a minute. Okay. So if we think about an easy example would be the alphabet.
So if I had like, imagine like the Scrabble game and I took just the 26 letters of the
(48:08):
alphabet, threw them into a bag and we're going to think about the ways that we could
configure those letters, right? So there's only one unique way that we can sequence those
tiles that are featuring the letters. There's only one unique way to sequence them in the
correct order that we know as the alphabet, right? One unique way. But how many total
ways are there to sequence those 26 letters? In other words, how many scrambled ways are
(48:32):
there? There's actually 26 factorial. The first time we make the selection, we have
26 choices. The next time it's 25, and taking that product down the line winds up being
26 factorial total configurations. But we know there's only one that's the actual
alphabet. So already we see that there are way more configurations of the alphabet that
are scrambled, in other words wrong, than are correct. And so among other examples we've surveyed,
(48:58):
there's often way more sort of so-called scrambled states than the unique cases, right? Where
there's some specific case you're after, whether that's all blue marbles and there's just one
way to do that. And if you introduce some red into the mix, now you have different ways
to sequence them and get different arrangements that are unique. And so we're going to see
that that's the theme here. And so those configuration states that tend to have more options are
(49:24):
going to be higher entropy. So now we need to kind of introduce macrostate. And this
is going to be helpful by going back to our dice example. So in our situation where we
were rolling, let's say, two dice, right? We're going to roll two dice. And before we
were mentioning that if we were to list out the exact values that we see upon the roll,
(49:46):
like a one and a two, for example, summing to three. Now, if instead of tracking the individual
values, we just have data on the sum of the roll, so if we were, let's say, at
a casino and people are rolling dice or whatever, and all we see on the screen, like we can't
see the actual dice being rolled, but we see the sum being displayed. If we have that information,
(50:07):
what we really have is what's called a macrostate.
So the microstate tells us about the individual pieces of the system and the macrostate looks
at it as the whole picture.
As the whole picture. And so in the context of rolling dice, the macrostate is naturally
like the sum, depending on what you're doing and what field you're in. Like there are other
ways you could think about getting to the macrostate. But for the simple example of
(50:30):
like rolling dice, it's just you just take the sum, right? And so you're watching at
the casino and let's just say they're rolling two dice. What number do you see the most?
Seven.
Seven. There are more ways to roll.
I knew that already, Judd. Don't feel bad.
No, I get it. Four, three, five, two.
Yeah, he's not stupid. Yeah.
I would have, it would have taken me like 60 seconds to do that if we hadn't talked
(50:51):
about that earlier. So yeah.
Right. Now this does open a little can of worms depending on there's probably half the
people out there thinking, well, there's three ways, right? There's the, as we noted, there's
the three and the four, there's the two and the five and the one and the six. And then
there's another 50% of the audience out there that's thinking, well, aren't there six ways?
And that other half is thinking about distinguishing the dice, right?
(51:12):
That's fine. It turns out that it doesn't matter which perspective you fall into because
the numbers that we're going to use here, I'm going to stick to the case where we really
don't distinguish the dice for simplicity sake. If you're on the other boat, all the
values are just going to double and you're going to get the same conclusion, right? So
it doesn't matter. But anyway, the point is you're sitting there, you're watching and
(51:33):
most of the time it's a seven, right? And the reason is there are more ways to roll
a seven, more microstates that are associated with seven, than with any other of the macrostates.
So for the macrostate of rolling a six, for example, there are fewer ways to roll a six
than a seven, and so forth. And if you do this for a variety of
systems, you're going to find that basically you're defining a histogram where there's
(51:56):
a peak, all right? And the X axis, so to speak, of this histogram is just some kind of indicator
of your macro state. And then the Y axis, sort of the intensity of the histogram is
the indicator of how likely essentially, or how many ways that macro state can be achieved.
And in the case of rolling the dice, the peak is around seven and then it falls on either
(52:18):
side of that, down to the sum of two, which is the minimum, and up to the
sum of 12, which again has only one way to do it, two sixes. So there's already this
inherent, just in that simple example, this very intuitive realization that whatever
macrostate features the most microstates, the most options, lends itself to being more
(52:43):
probable. Well, what does it mean from an entropy standpoint? It means that the macrostate
that has more possibilities is associated with higher entropy, and that's literally
from the definition, from Boltzmann's equation, S equals k log omega. So we know that for
the case of rolling the seven, we could plug that in. And again, the constant k is
(53:05):
not terribly relevant in this example, but the point is, it's higher entropy. There are
more mixed cases of the roll possible than, say, the unmixed cases.
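To make that concrete, here's a rough sketch of my own, not from the episode itself, that plugs
those two-dice microstate counts into S = k ln(omega), using the natural log as in Boltzmann's
formula. The absolute values are tiny because of the constant; the ordering is the point.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, joules per kelvin

# Microstate counts (distinguishable dice) for a few two-dice macrostates.
omega = {"sum of 7": 6, "sum of 6": 5, "sum of 12": 1}

for macrostate, count in omega.items():
    entropy = k_B * math.log(count)  # S = k ln(omega)
    print(f"{macrostate}: omega = {count}, S = {entropy:.2e} J/K")
# More microstates means higher entropy; with omega = 1 the entropy is zero.
```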
And so as an example, we could extend this to a larger set. I actually brought some dice
today. So if you want to take a little gamble here, I have 36 dice, six-sided, they're
(53:28):
all fair to my knowledge. We're going to do a live roll here. Oh, I dropped one. You want
to collect that? We're going to do a live roll here. Okay, so we've got 36 dice here
in my hand, and we're going to turn them over here. And what do we think? Do we think we're
going to get a unique roll? All ones are all sixes? Or what are we more likely to get?
Probably all ones. Probably all ones. I think all sixes. All right, we're going to see.
(53:50):
What about you, Jen? All three, all threes. Okay, all right, we're going to see. So nonetheless.
Okay, already, we see that none of us are correct. We got some fours. We got some ones,
sixes, twos. Is there not a three? Is that? Oh, there's some threes. Yeah. So pretty much
looks like every number, every single option was expressed. So we have a mixed case as
expected, right? So this is again kind of the essence of entropy. And this
(54:14):
is a nice example, because there are 36 dice. What game have
you ever rolled 36 dice in, right? So it's a pretty large roll of dice. And the point being,
as your system size increases, like the number of dice increases, the likelihood that it's
going to be a mixed case just goes astronomically high in probability. A related example is
(54:35):
tossing coins, right? So if you have a fair coin that you toss, the two options are heads
and tails, right? And we expect that for any individual flip, it's 50-50. All right. So
then the question becomes, what if you flip n times, like if you flip 100 times, what
do you expect the outcome to be? Okay, so about 50-50. Oh, it should be close to 50-50.
Right? There's a chance it's probably going to go 51-49 or something like that. That's
(54:57):
what I'm saying. But more often than not, like if you were to repeat that over and over
again, or just increase the count: instead of 100, let's go to 1000. So
it turns out that we were talking about this distribution, right? This idea that as you
think about macro states as a histogram, what you're finding is that you're building a distribution
and that distribution, the intensity, the y axis value, the count of possibilities goes
(55:22):
astronomically high, even for a relatively like seemingly small count of observations.
So the example I like to give here is you have 10 coins, and you're going to flip those
10 coins, and you distinguish those coins. So each one has like a number that
is trackable, right? There are 252 ways that those coins can land half and half, five
(55:47):
tails, five heads: 252 ways for 10 coins. If you go to 100 coins, 10 times more coins,
that number jumps to about 10 to the 29th. That's over a hundred thousand moles of arrangements. And
then if you go to 1000 coins, that number is roughly 3 times 10 to the 299. So the point
is when you get up to 1000 coins that you're tossing that are all fair, the chances that
(56:11):
you toss anything but roughly half and half are astronomically low. It's insanely
low, like it's imperceptibly low. So the point is, when we say that systems and
observations are dominated by the states of high entropy, like that is a very real thing.
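As a quick sanity check on those numbers, here is a short Python sketch of my own; the 5%
window around an even split is an arbitrary choice just to show how sharply the distribution
concentrates.

```python
import math

# For n distinguishable fair coins: how many ways land exactly half heads,
# and what share of all 2**n outcomes falls within 5% of a 50-50 split?
for n in (10, 100, 1000):
    exact_half = math.comb(n, n // 2)
    total = 2 ** n
    spread = n // 20  # 5% of the coin count
    near_half = sum(math.comb(n, k)
                    for k in range(n // 2 - spread, n // 2 + spread + 1))
    print(f"n={n:5d}: C(n, n/2) = {exact_half:.3e}, "
          f"share near 50-50 = {near_half / total:.4f}")
# C(10, 5) = 252, C(100, 50) is about 1e29, C(1000, 500) is roughly 3e299,
# and the share of outcomes far from 50-50 collapses as n grows.
```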
That is a very mathematically tangible thing. If you look at those types of distributions,
(56:32):
right? And so now let's think about that scaling we were talking about, the scenario of
taking it from 10 coins to 1000 coins. So like you said, there are more ways to get 50-50
than there are other macrostates. Let's bring this into just life in general. I'm looking
at one of these dice on the
(56:53):
table. That dice is not made up of just 10 atoms. That dice is made up of about a mole,
right? Yeah, I was gonna say about a mole. Um, actually, I wasn't gonna say that. I was
just like, before I say some number that's wrong, I'll just let Tim say it. Um, yeah,
so it's about a mole of atoms, so around 6 times 10 to the 23, right? And that's a lot more than 10.
(57:14):
So we can see how, even for small things on the human scale, like a die, for example,
there is a certainty about entropy in the universe. That being
said, even though it's improbable, some of these ordered cases could arise, but
the odds are so ridiculously small that it just doesn't even matter, you know,
(57:37):
in the grand scheme of things in the universe, there is a probable outcome, which happens
to be high entropy. If we thought about it as a grid of positions, like I think we've
mentioned something like that before, and now to take it to more of an actual
physics example, if these positions can be occupied by energy, bits of energy, units
(58:00):
are not even necessary right now, just bits of energy among these positions, it is
more likely for that energy to be dispersed in some random way across this
array of positions than it is to be all lined up perfectly on the left
side, all the bits moved over to the left, right? Like, let's say there's only
(58:21):
10 bits of energy on a grid of 100 positions. It's very unlikely that they'll occupy just the first
10 positions over here on the left; they'll spread themselves out, right? The more likely scenario
is that they spread themselves out. And so we'll get into this a little bit later. But
what that means is that essentially, hot things cool down and cool things heat up, like energy
spreads itself out throughout the universe. And it turns out that us humans are actually
(58:45):
incredibly good at making that trend continue. Yeah, and I think it's worthwhile to point
out, and I was thinking this as you were commenting there, that
a lot of the examples we're talking about are seemingly very simple, the conditions,
the rules at play, like the rolling of the dice, very simple. But this is why entropy
is so fun and rewarding to talk about is because there's just so much to consider when we get
(59:07):
to real phenomena. And so, for one example, remember what we said about
the dice: we said that they were fair dice, that every face was equally likely.
Well, in matter, for example, you know, it's not just about geometry, it's not just
about like, if you imagine like a grid of atoms, just a simple 2d grid of atoms, it's
not just about the configuration geometrically; there are forces at play among those
(59:30):
positions, there's bonding, and there's preferential bonding. And so that's basically the analogy
of a loaded die, and that skews the histogram. For example, mix sodium
and chlorine together, right, a metal and a nonmetal, and you may know they form a compound, an
orderly compound, because it's favorable for them to do so. So there's other things at
play. So remember, entropy is not the only important consideration in thermodynamics.
(59:54):
We've been sticking to simple examples, just to kind of make some of these points
a little more intuitive. But real situations often require some accommodation and bending
of some of the rules that are at play in these simple examples, one of them being, you know,
favorability, the fact that there's bias, and there is bias.
Judd was saying, you were saying during the break that like, entropy, at least right now,
(01:00:15):
like, it's one of those things where once you start to understand it, so many more things
click, and it's easy to kind of like shy away from it because it is a hard topic, but it's
very rewarding once this starts to make sense.
Yeah, I just think it's interesting. I mean, I just think it's crazy how quickly entropy
(01:00:37):
can get out of hand when you increase even just one more like variable on a system. So
say we're looking at that grid of blue balls again, where we just introduced the one
red ball, but let's think now, we add a green ball, and instantly the entropy is
so much higher. Yeah, I think it's just, kind of, yeah, the disorder is.
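A minimal sketch of that jump, assuming, just for illustration, a grid of 100 positions and
counting the distinct color arrangements with a multinomial coefficient:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K
N = 100             # assumed number of marble positions on the grid

def arrangements(counts):
    """Distinct arrangements of colored marbles on sum(counts) sites:
    the multinomial coefficient N! / (n_blue! * n_red! * ...)."""
    w = math.factorial(sum(counts))
    for c in counts:
        w //= math.factorial(c)
    return w

cases = {
    "all blue":            arrangements([N]),           # 1 way
    "one red":             arrangements([N - 1, 1]),    # 100 ways
    "one red + one green": arrangements([N - 2, 1, 1]), # 9,900 ways
}
for label, w in cases.items():
    print(f"{label}: W = {w}, S = k ln W = {k_B * math.log(w):.2e} J/K")
```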
And now I'm going to blow your mind again. So not only can you introduce that kind of
(01:01:01):
chemical compositional entropy in materials, there's even other ways to show entropy. So
here's another one. We've been talking about thinking about marbles or atoms, right? And
what kind of shape would you associate with like a simple atom? A sphere. A sphere, right?
Symmetric, so you don't even think about how the marble is oriented, right, on the grid.
But what if you had an asymmetric molecule? Right? And let's say you have water, water
(01:01:24):
is asymmetric, right? To some degree. So what if the water molecules, and I'm just kind of
making this up on the fly, but there are other asymmetric molecules too, what if they have now
a distinguished orientation on each of those sites? That's rotational entropy. They may
not all be aligned, right? Imagine people standing, like, let's say you had a bunch of
clones. Can you imagine a bunch of clones of Judd running around? I had nightmares about
(01:01:47):
that. Yeah. So they're all standing in a line, but maybe they're rotated. They're like facing
different directions, right? So there's an arrangement of them where they're all Judd,
right? They're all compositionally low entropy, because they're all him, but they are rotated
in space, right? They're facing different ways along the compass. And so that entropy
presents itself in multiple features. Simultaneously sometimes. Yeah. And that's what I was saying.
(01:02:09):
It's sometimes difficult to even track all the ways that entropy can even manifest simultaneously.
And so it's a, like I said, it's a very rich topic and very rewarding when you like uncover
like, oh my gosh, there's a new way to think about this. But it can be frustrating at first
when you're scrambling just to get a basic understanding. I get it. It's
taken a long time; you know, I've been doing this for a while, and
(01:02:30):
I still uncover new concepts and new deeper understandings. And sometimes I realize, oh,
I really didn't understand that at all and have to really readjust and reincorporate
new knowledge. Sometimes that happens. So we've got dice on the desk still that clearly show
like none of them are the same value. So that's one instance of showing disorder.
(01:02:51):
And now, like Tim said, we have a bunch of different ways for entropy to present itself
positionally, rotationally, within systems, and we can't even track them all. So really,
it should be quite clear now that disorder is the preferable way for systems to arrange.
Yeah, to arrange themselves. Yeah, exactly. I'll leave it up to you two. Should we talk about
(01:03:27):
more examples in our daily lives, or? Yeah, I think it would maybe be fruitful to talk
about, yeah, the Gibbs free energy. You know, I love talking about that. So entropy
shows up in other equations that allow us to build useful understanding and prediction
of system behavior. So consider a simple system. We'll pick maybe something
like aluminum, just spheres of identical aluminum atoms sitting in a crystalline arrangement.
(01:03:52):
So they're in an orderly arrangement, more or less, even though there can be some, like,
as you know, Judd, some vacancies in there that cause some entropy, dislocations, whatever.
But overall, compared to say liquid or vapor, we would expect that to be kind of a low entropy
from a configuration standpoint, a low entropy configuration. So then we heat it up. So we
add energy right into the mix. Remember, we said entropy was not energy. What entropy
(01:04:15):
is, is that in a certain framework, where you have temperature and pressure as your
thermodynamic variables of interest, entropy represents kind of a
gradient of energy. And so we'll kind of uncover that here a little bit. So what
ends up happening is, let's say we add heat into this block of aluminum, solid block of
aluminum, we add some heat. So what does that even mean? Let's think about that. So at the
(01:04:39):
particle level, they were sitting there, and now we know there's some incoming heat. So
what does that mean, how does the system react to that? Oscillations, like between the atoms,
they start to move. So what we really need to do here is think about
heat as a verb, as an action. We provide a flow of heat into the material;
heat is kind of a process, it's really the transfer of energy. So you have your atoms
(01:05:02):
and they have a nominal position. A lot of times when you look at like if anybody's seen
chemistry books or whatever, and you see atom positions in something, they are frozen in
time, like you're looking at a page where it's a snapshot in time. In reality, they're
oscillating. And they oscillate with a frequency and an amplitude that's related to what we
know as temperature, right? But so you're dumping in this heat, we know we're raising
(01:05:23):
the temperature of this aluminum block. And we expect to see those oscillations,
those vibrations change in sort of what way? They would get bigger. Yeah, so they're moving
faster, right? Higher frequencies, and moving further, higher amplitudes, right? But there
becomes a problem: that can only go on for so long, because, guess what,
let's say you're in the middle of that block as an atom. What dictates how fast and
(01:05:45):
how far you can really go? The other atoms. Yeah, you're gonna eventually start colliding.
And so there's a limit to sort of how much velocity that those atoms can take on or you
could look at it as sort of frequencies. There's different ways you can track this. But the
point is, there's kind of a limit where those get saturated, all right, and the block of
aluminum, as a crystalline solid, says, we know there's incoming energy, like, but we're no
(01:06:06):
longer able just to accommodate that by moving any faster, right. And so the example that
I think we were talking about yesterday was that you can imagine like as an analogy, if
you're at a concert, and maybe at the concert that you show up to, you know, it's a rock
concert. And for whatever reason, they put out some chairs maybe in the front right in
front of this stage, right, they got these chairs there for some reason, which sort of
(01:06:29):
prompts people to adopt this, you know, whatever order that is, you know, maybe it's just sort
of like a square grid of chairs or whatever. So people arrive, they're kind of standing
or sitting in those chairs. But man, then the music cranks and you got this rock music
blasting and then all of a sudden, what are they going to do in the front? Start jumping,
it's gonna be a mosh pit, right. But because they started out, you know, they were coerced
(01:06:50):
into kind of a compressed environment, an orderly environment to begin with, there's
only so much like velocity and speed they can run up and gain before they bump into
somebody. So what could they do when they're sort of saturated in that, but they want to
be able to move further and faster, they could kind of expand the space they're occupying,
maybe push out against the crowd near them that's just kind of hanging out. In other
(01:07:12):
words, like do work on their environment and expand out themselves. And now they have more
space, they have more configurational options to therefore express these new, higher velocities
and the rambunctiousness, you know, of the mosh pit they want to form.
It's kind of what's going on here in our aluminum example, aluminum as the crystalline solid
when it gets to that melting point, as we know it, if you're just hovering there and
(01:07:35):
trying to dump in energy, the crystal has no real ability to accommodate any more of
these additional velocities, additional frequencies, unless it does something to accommodate that,
which is to change phase. And so often what we'll see, in the case of aluminum,
what happens is, it's a simple material, it happens to just form a liquid. And so the
liquid state is more fluid and just has a little bit more open space, typically, than
(01:07:59):
the crystal, comparatively. And so now some additional velocity vectors can be opened
up additional frequencies of vibration and so forth. And you can take that even further
and go all the way to the vapor, right, you heat the liquid up to the point where now
it's incapable of accommodating any more. And then it changes phase to the vapor
state. And now it unlocks a huge amount of opportunities for velocity variation and vectors
(01:08:21):
and so forth. And so what's interesting is that's our expectation. But then the question
becomes, well, wait a minute. Don't systems want low energy? Isn't that
something like universal that we just in general kind of understand about systems, we kind
of know that? And this is again an underlying theme in general in science: systems
try to seek out low energy states. And so this idea that entropy is increasing on this
(01:08:47):
journey, as we go from the crystalline aluminum to the liquid to the vapor, we know the entropy
is increasing, sort of the chaos and disorder is increasing. How does that inform this other
thing we know, which is that systems try to seek out low energy. And that's where the
Gibbs free energy comes in. And so there's this metric that we track in thermodynamics
under these conditions. And it's called the Gibbs free energy, G. It equals H minus TS, enthalpy
(01:09:11):
minus the product of temperature and entropy. And so very conceptually here, we're not doing
the full calculations because it would take a lot of calculus, we'd run out of time. But
in concept, what's happening is, as you raise the temperature of the aluminum, what does
that really mean? It means we're raising the enthalpy, right? The thermal energy is going
(01:09:31):
up; the total thermal energy is going up in that aluminum. And what ends up happening
is we know systems want to lower their overall energy, the overall energy that matters here
is the Gibbs free energy. So H is increasing, so there's an increase to G as we raise temperature.
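To put very rough numbers on that balance, here's a sketch of my own, using approximate
literature values for aluminum (enthalpy of fusion near 10.7 kJ/mol, melting point near 933 K)
and assuming, for simplicity, that the enthalpy and entropy of fusion don't change with
temperature, which a real calculation would not assume:

```python
# Rough sketch: molar Gibbs free energy of liquid aluminum relative to the
# solid, G_liq - G_sol = dH - T*dS, with approximate literature values and
# temperature-independent dH and dS.
dH_fus = 10_700.0        # J/mol, enthalpy of fusion of aluminum (approx.)
T_melt = 933.0           # K, melting point of aluminum (approx.)
dS_fus = dH_fus / T_melt # ~11.5 J/(mol*K); dG = 0 right at the melting point

for T in (300.0, 600.0, 900.0, 1000.0, 1200.0):
    dG = dH_fus - T * dS_fus
    stable = "solid is stable" if dG > 0 else "liquid is stable"
    print(f"T = {T:6.1f} K: G_liq - G_sol = {dG:8.1f} J/mol ({stable})")
# Below ~933 K the solid has the lower G; above it, the liquid's larger
# entropy (the -T*S term) more than pays for its extra enthalpy.
```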
And for a while, that's fine. For a while, the system just tolerates that. But then the
(01:09:52):
system has an ability to counteract this, right. And so we go through these phase changes.
One of the things that's happening is you're switching to higher entropy states; S is increasing,
in some cases very dramatically, to offset the increase in H. So remember, it's G equals
H minus TS, and S is a positive number, we know that. And so, as we go to higher
(01:10:16):
and higher entropy states, it balances out to give, and maintain, a relatively low Gibbs
free energy. Now it's still increasing, right, but the balance is at play. The system is
trying to keep that number
kind of low. So what you're saying is entropy can be used to explain phase changes. Yeah,
(01:10:37):
as we increase the heat inputted into the system, eventually
it reaches a capacity, and the system says, something's got to change. And that change happens in
entropy, in which energy finds new ways to disperse itself within the system. Right,
a change of phase opens the door for an increased opportunity to gain more enthalpy,
(01:11:01):
right. And so that's what's really happening, to allow the system to maintain
a relatively low overall energy. Interesting. I mean, that was definitely a concept that
was maybe more abstract in your course or not necessarily abstract, but we didn't have
the time to dig too deep into it. So yeah, right. Right. So, Jen, I know that we're
running out of time. But Judd, I know that you would want to talk about what entropy
(01:11:24):
has to do with life here on Earth. Is that correct? Yeah, if you could just
give a brief thing and then I'll do the quote. Okay. Yeah. So this is what
I'll say: the original kind of Darwinian theory about life starting
is, you need heat, and you need the elements that are essential for life. And
you combine those things. And like, boom, that's life. Right. Right. But we know from
(01:11:49):
our episode on Enceladus that at least on Enceladus, the reason scientists were very
excited about the possibility of life is because we know there's geysers that have a lot of
energy and a lot of heat at the bottoms of these oceans. Now, this Darwinian idea
isn't 100% correct. And that's because you need things like these hot underwater geysers
to disperse their energy, increase the entropy of the system, in order to kickstart life
(01:12:17):
because, as we know, entropy is kind of time's arrow, and as it's moving forward, that gives
rise to complex systems like life. And so I think the interesting thing about this is
that, just like how we consume food and metabolize, or even things as big as stars
exploding and dispersing their dust throughout the universe, those are all systems within
(01:12:41):
our universe that are constantly increasing the entropy. Now to go back to the geyser
thing, they actually predicted that these geysers should exist, that they need to exist
in order for life to happen before we ever even discovered them on the bottom of our
ocean. We were so certain that this function would be necessary in order to start life
(01:13:04):
in general before we had ever seen one. And then we went and discovered skyscraper sized
mineral deposits at the bottom of our ocean, which goes to show you how large of a role
entropy plays in the origins of life. Yeah, and I think that's my mic drop
for today. I think that the overall idea is that the universe
(01:13:27):
is tending towards higher entropy. So as humans, our creation was following that trend. So
we increase entropy, our existence increases entropy, and we're pretty good at it. And
so here's a quote from Jeremy England, an MIT physicist,
who says, "If you start with a random clump of atoms and shine light on it for long enough,
(01:13:50):
it should not be surprising that you get a plant." So we're such good creators of entropy
that, like, life itself shouldn't be super surprising. Yeah, exactly. And so maybe that
gives us hope to know that there are others. I mean, our sun, for example, is a very good
(01:14:10):
deliverer of energy in a low entropy state; it delivers this high energy in a low entropy
state, high density photons. And then when they collide with Earth, all the systems on
the earth starting at the plants, and then to the animals that eat the plants, and every
system, even the fossil fuels that we're burning, all then work to disperse this and create
a higher entropy system. And as we know pretty easily, just by looking up at the night
(01:14:34):
sky, there's a lot of other suns in the universe that can deliver this energy. And if there
is life on other planets, we can be sure that they're doing the exact same thing right now,
which is, I mean, maybe unfortunately for us, but probably burning a lot of fossil fuels
and doing a lot of work on the planet.
Well, yeah, I think we'll wrap it up. I think we want to say thank you to our professor Dr.
(01:14:57):
Cullinan for coming on. I think you did a great job of explaining all these topics. And I
think we have a much better understanding of entropy now.
Yes. And you have much to look forward to as well the next time that Dr. Cullinan is on
with us. So
All right. Well, yeah, my pleasure. Thank you.
All right. I think now's a pretty good time for our listener shout out. This episode's
listener shout out goes to Shawna Jacobs, who is a University of Wisconsin, Madison
(01:15:18):
graduate who studied botany, Spanish and Korean. Now she's in Texas. She says hello from
Texas. I'm not sure how you do it, but university somehow manages to be calming yet mentally
stimulating at the same time. It always feels like I'm sitting down to a conversation with
friends while also diving into topics that I might not have had the time to look into.
I love what you've put out so far. And I'm excited to see what topics you'll cover in
the future. Thank you so much for the message Shawna from the whole university team. We
(01:15:41):
love to hear back from awesome listeners like yourself, and we wish you the best of luck
with whatever you're up to now. And if you are listening to this and would like to be
featured as the episode's listener shout out, you can do so by following us on Instagram
and keeping up with whatever we're posting or whatever we're putting on our story. And
we will give you plenty of opportunities to do so. Keep an eye out for that.
(01:16:01):
All right. Unfortunately, that has to be the end of our episode. But that was a very fruitful
discussion on entropy, and hopefully you are leaving here maybe not understanding it better,
but having more questions about it and knowing the right questions to ask, which
is exactly our goal here: to make you a little bit more curious about what science
has to do with your life. So until next time, we will be back with you in two weeks to talk
(01:16:21):
about a topic that we haven't decided on yet. Peace.
Yep. That's it.