1
00:00:06,220 --> 00:00:11,389
Here you see a circuit powered by a battery
and connected to a light bulb. This section
2
00:00:11,389 --> 00:00:14,789
of the circuit is made from copper foil.
3
00:00:14,789 --> 00:00:20,029
We can make a cut in the foil, leaving this
circuit incomplete. Taking advantage of the
4
00:00:20,029 --> 00:00:24,939
thermal expansion properties of copper, we
can place some candles under the foil to provide
5
00:00:24,939 --> 00:00:31,939
enough heat to expand the foil and complete
the circuit again! Many materials, like metals,
6
00:00:32,200 --> 00:00:34,100
expand when you heat them.
7
00:00:34,100 --> 00:00:37,730
But other materials, like some polymers, shrink
when heated. The macroscopic properties of
8
00:00:37,730 --> 00:00:39,530
both of these materials are highly dependent
on temperature.
9
00:00:39,530 --> 00:00:44,730
The difference in the macroscopic behavior
of these two materials is determined by very
10
00:00:44,730 --> 00:00:51,730
different microscopic structures. Statistical
Mechanics is the method used to describe and
11
00:00:53,260 --> 00:01:00,260
predict behavior at the macro-scale based
on statistical models of microscopic behavior.
12
00:01:00,780 --> 00:01:06,060
The first step to understanding the power
of Statistical Mechanics is to use this method
13
00:01:06,060 --> 00:01:13,060
to define and understand temperature in terms
of macrostate parameters. The notion of equilibrium
14
00:01:13,890 --> 00:01:18,110
will also prove to be useful in this exercise.
15
00:01:18,110 --> 00:01:21,230
This video is part of the equilibrium video
series.
16
00:01:21,230 --> 00:01:27,580
It is often important to determine whether
or not a system is at equilibrium. To do this,
17
00:01:27,580 --> 00:01:34,030
we must understand how a system's equilibrium
state is constrained by its boundary and surroundings.
18
00:01:34,030 --> 00:01:40,090
Hi, my name is Jeff Grossman, and I'm a professor
in the MIT Department of Materials Science
19
00:01:40,090 --> 00:01:41,700
and Engineering.
20
00:01:41,700 --> 00:01:47,400
Before watching this video, you should be
familiar with the second law of thermodynamics,
21
00:01:47,400 --> 00:01:53,670
and the definitions of micro-state, macro-state,
and entropy.
22
00:01:53,670 --> 00:01:59,030
After watching this video, you will be able
to: Explain how the definition of temperature
23
00:01:59,030 --> 00:02:06,030
arises as a derivative of entropy with respect
to energy.
24
00:02:09,250 --> 00:02:16,250
Consider an isolated, insulated box of non-interacting
particles. We can characterize this box macroscopically
25
00:02:17,230 --> 00:02:23,099
by 3 parameters: the Volume of the box, the
Number of particles in the box (measured in
26
00:02:23,099 --> 00:02:28,810
moles), and the total Energy. The Volume of
the box, the Number of particles, and the
27
00:02:28,810 --> 00:02:35,810
total energy in this box are fixed. A
microstate that corresponds to such a macrostate
28
00:02:35,849 --> 00:02:41,670
is the velocity and position {vi, xi} of each
particle in the box. With this information,
29
00:02:41,670 --> 00:02:46,620
we can determine the movement and position
of every particle in the box, and we obtain
30
00:02:46,620 --> 00:02:53,620
the total energy as ½ mv^2 summed over all
particles. As you can imagine, measuring changes
31
00:02:55,349 --> 00:03:01,840
in the system by tracking each and every microstate
would be a computational nightmare.
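The microstate-to-macrostate map the video describes can be sketched in a few lines of Python. The particle count, mass, and random velocities below are illustrative choices, not values from the video; the point is that one macroscopic number, the total energy, summarizes an enormous list of microscopic coordinates.

```python
import random

# A microstate of N non-interacting particles: one (position, velocity)
# pair per particle. The macrostate energy is E = sum of (1/2) m v^2.
random.seed(0)
N, m = 1000, 1.0  # hypothetical particle count and mass
microstate = [(random.uniform(0, 1), random.gauss(0, 1)) for _ in range(N)]
E = sum(0.5 * m * v ** 2 for _, v in microstate)
print(E)  # one number summarizes 2N microscopic coordinates
```

Tracking every entry of `microstate` as the system evolves is exactly the "computational nightmare" the video mentions; the statistical approach works with counts of such states instead.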
32
00:03:01,840 --> 00:03:08,560
Instead, we consider Omega, the number of
microstates that correspond to a system with
33
00:03:08,560 --> 00:03:15,560
macro-parameters E, N, and V. Now, let's consider
an isolated system comprised of two boxes
34
00:03:17,040 --> 00:03:24,040
of non-interacting particles. The wall separating
the two boxes allows the transfer of energy,
35
00:03:24,150 --> 00:03:29,599
but does not allow particles to cross it.
What changes in macro state parameters do
36
00:03:29,599 --> 00:03:36,599
we observe in our system? Pause the video
here. The volume and number of particles in
37
00:03:41,879 --> 00:03:48,519
each box are fixed and unchanging. Because
energy, in the form of heat, can be transferred
38
00:03:48,519 --> 00:03:55,519
through the wall, E1 and E2 can change. But
of course, the total energy E = E1 + E2 stays
39
00:03:57,440 --> 00:04:03,709
the same, because the 2-box system is thermally
isolated. Let's count the total number of
40
00:04:03,709 --> 00:04:09,599
microstates that the composite system can
have, which correspond to a total system energy
41
00:04:09,599 --> 00:04:16,599
E. The easiest way to do this is to first
define a new function f(E1), which is the
42
00:04:16,889 --> 00:04:23,889
number of microstates of the composite system
when box 1 has energy E1. To do this, we count
43
00:04:24,290 --> 00:04:30,430
the number of microstates where box 1 has
some energy E1, and multiply this by the number
44
00:04:30,430 --> 00:04:37,430
of microstates where box 2 has energy E-E1.
Then, to find the total number of microstates
45
00:04:40,000 --> 00:04:46,120
for the composite system, we need to sum the
function f over all ways we could have selected
46
00:04:46,120 --> 00:04:53,120
E1. We want to understand more about this
function f because it is key to defining Omega.
47
00:04:54,320 --> 00:04:59,030
Let's look at a typical example to see what
f looks like.
48
00:04:59,030 --> 00:05:04,150
We can explore the function f by considering
a simple case, where each particle can only
49
00:05:04,150 --> 00:05:11,150
exist in one of 2 allowable energy states—one
with energy 0 and the other with energy 1. Suppose
50
00:05:12,910 --> 00:05:19,560
I have 50 particles in box 1 and 100 particles
in box 2, and the total energy of the system
51
00:05:19,560 --> 00:05:26,560
is fixed at, say, 50. In order to reach this
total energy, the energy E1 in box 1 can range
52
00:05:27,400 --> 00:05:34,400
from 0 to 50. This is the graph of the number
of microstates with total energy E for
53
00:05:34,460 --> 00:05:41,460
a given E1. We get this graph by looking at
all ways to assign every particle a 0 or 1
54
00:05:42,130 --> 00:05:49,130
state such that the energy in box 1 is E1
and the energy in box 2 is E-E1. We can imagine
55
00:05:50,150 --> 00:05:56,080
that the sum that defines Omega is the area
under this graph.
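The two-level example can be counted directly. The numbers here are the video's own (50 particles in box 1, 100 in box 2, total energy fixed at 50); placing E1 units of energy among N1 two-state particles is a binomial coefficient, so f and Omega follow immediately.

```python
from math import comb

N1, N2, E = 50, 100, 50  # the video's example: box sizes and fixed total energy

def f(E1):
    # Microstates of the composite system when box 1 holds energy E1:
    # (ways to excite E1 of the N1 particles) x (ways to excite E - E1 of the N2)
    return comb(N1, E1) * comb(N2, E - E1)

# Omega, the total number of microstates, is the "area under the graph" of f.
omega = sum(f(E1) for E1 in range(E + 1))
peak = max(range(E + 1), key=f)
print(peak, f(peak) / omega)  # peak energy of box 1, and the fraction of states there
```

As a sanity check, Vandermonde's identity says this sum must equal comb(150, 50), the count of ways to excite 50 of all 150 particles; the peak lands near E1 = 50 × (50/150), where both boxes share the same fraction of excited particles.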
56
00:05:56,080 --> 00:06:03,080
Look at how peaked this graph is! With increasing
numbers of particles, the graph becomes more
57
00:06:03,310 --> 00:06:10,310
and more strongly peaked. Remember, in a realistic
situation, the number of particles is going
58
00:06:10,509 --> 00:06:16,150
to be on the order of Avogadro's number, which
means the distribution of energies will be
59
00:06:16,150 --> 00:06:23,150
highly peaked with a very small standard deviation.
Statistically, this means as the Number of
60
00:06:23,650 --> 00:06:29,770
particles becomes larger, the energy in box
1 is almost always very close to this peak
61
00:06:29,770 --> 00:06:36,770
value. Let's define Em to be the box 1 energy
corresponding to the peak value of f. So how
62
00:06:40,270 --> 00:06:47,270
do we find Em? Pause the video and think about
it. To find Em, we want to maximize the function
63
00:06:53,340 --> 00:06:55,630
f with respect to E1.
64
00:06:55,630 --> 00:07:01,080
Since the volume, number, and total energy
of the system are fixed, differentiate f with
65
00:07:01,080 --> 00:07:06,770
respect to E1. The maximum occurs when this
derivative is zero.
66
00:07:06,770 --> 00:07:11,720
Take a moment to carry out this derivative,
and check your solution with ours. Pause the
67
00:07:11,720 --> 00:07:18,720
video.
68
00:07:19,440 --> 00:07:26,440
We obtain the following expression. Thus the
Energy Em that leads to the greatest number
69
00:07:26,960 --> 00:07:32,610
of composite system microstates is defined
by this elegant condition.
70
00:07:32,610 --> 00:07:39,610
So let's define a new macro-parameter, S,
called "entropy". Here kB is Boltzmann's constant.
71
00:07:40,669 --> 00:07:46,800
We'll explain why we include it later. By
defining this new term, entropy, our condition
72
00:07:46,800 --> 00:07:53,020
above states that the derivative of the entropy
of box 1 with respect to the energy of box
73
00:07:53,020 --> 00:07:59,039
1 is equal to the derivative of the entropy
of box 2 with respect to the energy of box
74
00:07:59,039 --> 00:08:05,960
2.
75
00:08:05,960 --> 00:08:12,960
So on average, we expect the energy of box
1 to be very close to Em. But what happens
76
00:08:13,520 --> 00:08:18,819
if box 1 starts with some different energy,
maybe an energy significantly different from
77
00:08:18,819 --> 00:08:25,819
Em? Statistically, because such a state is
so much less likely, when contact between
78
00:08:25,880 --> 00:08:32,209
the boxes is made, the energy will redistribute
over time towards the most likely state, where
79
00:08:32,209 --> 00:08:35,349
box 1 has energy Em.
80
00:08:35,349 --> 00:08:42,139
This process of energy redistribution is called
equilibration. And the state with the highest
81
00:08:42,139 --> 00:08:45,449
likelihood is called equilibrium!
82
00:08:45,449 --> 00:08:50,899
How do we know we've landed in an equilibrium
state? If the derivative condition we found
83
00:08:50,899 --> 00:08:57,329
earlier holds when evaluated at the average energy of
each box, then we know we are
84
00:08:57,329 --> 00:09:04,329
at equilibrium. Note that in the simple example
scenario we considered earlier, it was NOT
85
00:09:04,660 --> 00:09:09,999
the case that the energy of each box was the
same. It is the derivative of entropy with
86
00:09:09,999 --> 00:09:15,879
respect to energy for each box evaluated at
the energy, number, and volume of each box
87
00:09:15,879 --> 00:09:20,189
that must be equal at equilibrium.
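The equal-derivative condition can be checked numerically for the two-level model. The box sizes below are hypothetical (larger than the video's example, so the discreteness of the energy levels matters less), and the derivatives are approximated by finite differences; it is a sketch, not a general proof.

```python
from math import comb, log

kB = 1.0  # work in units where Boltzmann's constant is 1
N1, N2, E = 500, 1000, 500  # illustrative box sizes and fixed total energy

def S1(E1): return kB * log(comb(N1, E1))  # entropy of box 1, S = kB ln(Omega)
def S2(E2): return kB * log(comb(N2, E2))  # entropy of box 2

# Em: the box-1 energy that maximizes the number of composite microstates.
Em = max(range(E + 1), key=lambda E1: comb(N1, E1) * comb(N2, E - E1))

# Central finite differences approximate dS/dE for each box at the peak.
dS1 = (S1(Em + 1) - S1(Em - 1)) / 2
dS2 = (S2(E - Em + 1) - S2(E - Em - 1)) / 2
print(Em, dS1, dS2)  # the two slopes agree closely at equilibrium
```

The two slopes match to well under a percent at Em, even though the energies of the boxes (Em versus E - Em) are very different, which is exactly the point the video is making.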
88
00:09:20,189 --> 00:09:26,470
This suggests that we should define the derivative
of entropy with respect to energy to be a
89
00:09:26,470 --> 00:09:33,470
new system variable. But what would this variable
represent physically?
90
00:09:33,670 --> 00:09:40,300
To figure out what it should be, think about
our composite system. The volume of each box
91
00:09:40,300 --> 00:09:46,779
is constant as is the number of particles
in each box. The wall between the two boxes
92
00:09:46,779 --> 00:09:53,779
allows energy to transfer. Over time, what
parameter will eventually be the same for
93
00:09:53,970 --> 00:10:00,970
both boxes? Pause the video and discuss.
94
00:10:04,160 --> 00:10:09,410
Our experience tells us that once 2 subsystems
are brought into thermal contact, we expect
95
00:10:09,410 --> 00:10:15,959
the composite system will eventually evolve
so that each box has the same temperature!
96
00:10:15,959 --> 00:10:21,249
But at equilibrium the derivative of entropy
with respect to energy of each box is also
97
00:10:21,249 --> 00:10:28,249
equal. This tells us that this derivative
should be some function of temperature. In
98
00:10:28,389 --> 00:10:34,110
fact, the derivative of entropy with respect
to energy is exactly equal to the reciprocal
99
00:10:34,110 --> 00:10:40,519
of temperature! This tells us that when we
are measuring temperature of a system, we
100
00:10:40,519 --> 00:10:45,989
are NOT measuring the energy of a system,
we are measuring this derivative!
101
00:10:45,989 --> 00:10:51,309
Temperature has specific units determined by the
typical way we measure it. This is
102
00:10:51,309 --> 00:10:56,910
why Boltzmann's constant was introduced into
the definition of entropy! The constant is
103
00:10:56,910 --> 00:11:02,980
introduced precisely so that the derivative
of entropy with respect to energy has dimension
104
00:11:02,980 --> 00:11:09,980
of 1/temperature.
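The dimensional bookkeeping is quick to verify. With the entropy defined as S = kB ln(Omega) and kB carrying units of joules per kelvin:

```latex
S = k_B \ln \Omega, \qquad [k_B] = \mathrm{J/K}
\frac{1}{T} = \frac{\partial S}{\partial E}
\quad\Rightarrow\quad
\left[\frac{\partial S}{\partial E}\right] = \frac{\mathrm{J/K}}{\mathrm{J}} = \frac{1}{\mathrm{K}}
```

Since ln(Omega) is a pure number, kB is the only source of units in S, and the derivative with respect to energy lands in 1/kelvin, so its reciprocal is a temperature.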
105
00:11:12,279 --> 00:11:18,410
In this video we've seen that entropy is a
natural macrostate parameter, and statistically
106
00:11:18,410 --> 00:11:24,639
a system evolves toward the macrostate with
the maximum entropy. This is called
107
00:11:24,639 --> 00:11:31,519
equilibrium. Temperature is naturally defined
as the derivative of entropy with respect
108
00:11:31,519 --> 00:11:37,619
to energy for a system. And this definition
allows us to understand how temperature and
109
00:11:37,619 --> 00:11:44,619
entropy are related at equilibrium.