
## Summary

This video motivates the statistical mechanics approach through material structure and behavior. Entropy is introduced as a natural variable whose derivative with respect to energy is zero when the number of microstates is maximized. This derivative is identified as being the reciprocal of temperature, and Boltzmann’s constant is explained.

## Learning Objectives

After watching this video students will be able to explain how the definition of temperature arises as a derivative of entropy with respect to energy.

Funding provided by the Singapore University of Technology and Design (SUTD)

Developed by the Teaching and Learning Laboratory (TLL) at MIT for SUTD

MIT © 2012

Here you see a circuit powered by a battery and connected through a light bulb. This section of the circuit is made from copper foil. We can put a cut into the foil to make the circuit incomplete. Taking advantage of the thermal expansion properties of copper, we can place some candles under the foil to provide enough heat to expand the foil and complete the circuit again! Many materials, like metals, expand when you heat them. But other materials, like polymers, shrink when heated. The macroscopic properties of both of these materials are highly dependent on temperature. The difference in the macroscopic behavior of these two materials is determined by very different microscopic structures.

Statistical Mechanics is the method used to describe and predict behavior at the macro-scale based on statistical models of microscopic behavior. The first step to understanding the power of Statistical Mechanics is to use this method to define and understand temperature in terms of macrostate parameters. The notion of equilibrium will also prove to be useful in this exercise.

This video is part of the equilibrium video series. It is often important to determine whether or not a system is at equilibrium. To do this, we must understand how a system's equilibrium state is constrained by its boundary and surroundings.

Hi, my name is Jeff Grossman, and I'm a professor in the MIT Department of Materials Science and Engineering. Before watching this video, you should be familiar with the second law of thermodynamics and the definitions of microstate, macrostate, and entropy. After watching this video, you will be able to explain how the definition of temperature arises as a derivative of entropy with respect to energy.

Consider an isolated, insulated box of non-interacting particles. We can characterize this box macroscopically by 3 parameters: the Volume of the box, the Number of particles in the box (measured in moles), and the total Energy.
The Volume of the box, the Number of particles, and the total energy in this box are fixed. A microstate that corresponds to such a macrostate is the velocity and position {vi, xi} of each particle in the box. With this information, we can determine the movement and position of every particle in the box, and the total energy is ½mv² summed over all particles. As you can imagine, measuring changes in the system by tracking each and every microstate would be a computational nightmare. Instead, we consider Omega, the number of microstates that correspond to a system with macro-parameters E, N, and V.

Now, let's consider an isolated system comprised of two boxes of non-interacting particles. The wall separating the two boxes allows the transfer of energy, but does not allow particles to cross it. What changes in macrostate parameters do we observe in our system? Pause the video here.

The volume and number of particles in each box are fixed and unchanging. Because energy, in the form of heat, can be transferred through the wall, E1 and E2 can change. But of course, the total energy E = E1 + E2 stays the same, because the 2-box system is thermally isolated.

Let's count the total number of microstates that the composite system can have which correspond to a total system energy E. The easiest way to do this is to first define a new function f(E1), which is the number of microstates of the composite system when box 1 has energy E1. To compute f(E1), we count the number of microstates where box 1 has some energy E1, and multiply this by the number of microstates where box 2 has energy E − E1. Then, to find the total number of microstates for the composite system, we sum the function f over all ways we could have selected E1. We want to understand more about this function f because it is key to defining Omega. Let's look at a typical example to see what f looks like.
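The counting argument above can be written compactly. Writing Ω1 and Ω2 for the microstate counts of box 1 and box 2 at their fixed volumes and particle numbers, the transcript's definitions read:

```latex
f(E_1) = \Omega_1(E_1)\,\Omega_2(E - E_1),
\qquad
\Omega(E) = \sum_{E_1} f(E_1) = \sum_{E_1} \Omega_1(E_1)\,\Omega_2(E - E_1).
```

The product appears because any microstate of box 1 with energy E1 can be paired with any microstate of box 2 with energy E − E1.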
We can explore the function f by considering a simple case, where each particle can exist in only one of 2 allowable energy states: one with energy 0 and the other with energy 1. Suppose I have 50 particles in box 1 and 100 particles in box 2, and the total energy of the system is fixed at, say, 50. In order to reach this total energy, the energy E1 in box 1 can range from 0 to 50.

This is the graph of the number of microstates with total energy E having a given E1. We get this graph by looking at all ways to assign every particle a 0 or 1 state such that the energy in box 1 is E1 and the energy in box 2 is E − E1. We can imagine that the sum that defines Omega is the area under this graph. Look at how peaked this graph is! With increasing numbers of particles, the graph becomes more and more strongly peaked. Remember, in a realistic situation, the number of particles is going to be on the order of Avogadro's number, which means the distribution of energies will be highly peaked with a very small standard deviation. Statistically, this means that as the number of particles becomes larger, the energy in box 1 is almost always very close to this peak value.

Let's define Em to be the box 1 energy corresponding to the peak value of f. So how do we find Em? Pause the video and think about it.

To find Em, we want to maximize the function f with respect to E1. Since the volume, number, and total energy of the system are fixed, we differentiate f with respect to E1. The maximum occurs when this derivative is zero. Take a moment to carry out this derivative, and check your solution with ours. Pause the video. We obtain the following expression. Thus the energy Em that leads to the greatest number of composite system microstates is defined by this elegant condition.

So let's define a new macro-parameter, S = kB ln Omega, called "entropy". Here kB is Boltzmann's constant. We'll explain why we include it later.
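The two-state example can be reproduced numerically. Since each particle contributes energy 0 or 1, the number of microstates of a box of N particles with energy E1 is the binomial coefficient C(N, E1) (choosing which particles sit in the energy-1 state). A minimal sketch, using the transcript's numbers (50 and 100 particles, total energy 50):

```python
from math import comb

N1, N2, E = 50, 100, 50  # particles in each box, total system energy

# f(E1) = (microstates of box 1 at E1) * (microstates of box 2 at E - E1)
f = [comb(N1, E1) * comb(N2, E - E1) for E1 in range(E + 1)]

Em = max(range(E + 1), key=lambda E1: f[E1])  # box-1 energy at the peak of f
Omega = sum(f)                                # total microstate count

print("Em =", Em)
print("fraction of microstates within +/-5 of Em:",
      sum(f[Em - 5:Em + 6]) / Omega)
```

Even with only 150 particles, the overwhelming majority of microstates lie within a few units of Em; at Avogadro-scale particle numbers the peak becomes far sharper still.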
By defining this new term, entropy, our condition above occurs when the derivative of the entropy of box 1 with respect to the energy of box 1 is equal to the derivative of the entropy of box 2 with respect to the energy of box 2.

So on average, we expect the energy of box 1 to be very close to Em. But what happens if box 1 starts with some different energy, maybe an energy significantly different from Em? Statistically, because such a state is so much less likely, when contact between the boxes is made, the energy will redistribute over time toward the most likely state, where box 1 has energy Em. This process of energy redistribution is called equilibration, and the state with the highest likelihood is called equilibrium!

How do we know we've landed in an equilibrium state? If the derivative condition we found earlier holds when evaluated at the average energy of each box, the system is at equilibrium. Note that in the simple example scenario we considered earlier, it was NOT the case that the energy of each box was the same. It is the derivative of entropy with respect to energy for each box, evaluated at the energy, number, and volume of that box, that must be equal at equilibrium.

This suggests that we should define the derivative of entropy with respect to energy as a new system variable. But what would this variable represent physically? To figure out what it should be, think about our composite system. The volume of each box is constant, as is the number of particles in each box. The wall between the two boxes allows energy to transfer. Over time, what parameter will eventually be the same for both boxes? Pause the video and discuss.

Our experience tells us that once 2 subsystems are brought into thermal contact, we expect the composite system will eventually evolve so that each box has the same temperature! But at equilibrium, the derivative of entropy with respect to energy of each box is also equal.
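The derivative the video asks you to carry out can be reconstructed explicitly (a standard derivation; the on-screen expression may be arranged slightly differently). With f(E1) = Ω1(E1)Ω2(E − E1) and E2 = E − E1, so dE2/dE1 = −1, setting df/dE1 = 0 gives:

```latex
\frac{df}{dE_1}
= \frac{d\Omega_1}{dE_1}\,\Omega_2 - \Omega_1\,\frac{d\Omega_2}{dE_2} = 0
\;\Longrightarrow\;
\frac{1}{\Omega_1}\frac{d\Omega_1}{dE_1} = \frac{1}{\Omega_2}\frac{d\Omega_2}{dE_2}
\;\Longrightarrow\;
\frac{d\ln\Omega_1}{dE_1} = \frac{d\ln\Omega_2}{dE_2}.
```

Multiplying both sides by kB and using S = kB ln Ω turns the last equality into dS1/dE1 = dS2/dE2, which is precisely the equilibrium condition stated in the transcript.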
This tells us that this derivative should be some function of temperature. In fact, the derivative of entropy with respect to energy is exactly equal to the reciprocal of temperature! So when we measure the temperature of a system, we are NOT measuring the energy of the system; we are measuring this derivative!

There are specific units determined by the typical way we measure temperature. This is why Boltzmann's constant was introduced into the definition of entropy: the constant is chosen precisely so that the derivative of entropy with respect to energy has dimensions of 1/temperature.

In this video, we've seen that entropy is a natural macrostate parameter, and that statistically a system evolves toward the macrostate of maximum entropy. This is called equilibrium. Temperature is naturally defined through the derivative of entropy with respect to energy of a system, and this definition allows us to understand how temperature and entropy are related at equilibrium.
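This definition can be checked numerically in the same two-state toy model (a sketch under the example's assumptions, not part of the video): at the peak energy Em, the discrete slopes of ln Ω1 and ln Ω2, each proportional to dS/dE and hence to 1/T, should approximately agree, i.e., the two boxes sit at the same temperature.

```python
from math import comb, log

N1, N2, E = 50, 100, 50
Em = 17  # peak box-1 energy, found by maximizing f(E1) = comb(N1, E1) * comb(N2, E - E1)

def slope_ln_omega(N, E0):
    """Central-difference estimate of d(ln Omega)/dE for a two-state box."""
    return (log(comb(N, E0 + 1)) - log(comb(N, E0 - 1))) / 2

s1 = slope_ln_omega(N1, Em)      # box 1 at E1 = Em
s2 = slope_ln_omega(N2, E - Em)  # box 2 at E2 = E - Em

print(s1, s2)  # nearly equal: both boxes share the same value of dS/dE, i.e. 1/(kB T)
```

The agreement is only approximate at 150 particles; the residual mismatch shrinks as the particle numbers grow, which is the statistical-mechanics sense in which temperature is sharply defined for macroscopic systems.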

It is highly recommended that students pause the video when prompted so that they can attempt the activities on their own and then check their solutions against the video.

During the video, students will:

- Identify how to find the energy states of box 1 and box 2 that maximize the number of microstates with those energies.
- Differentiate the expression for the number of microstates of the composite system with respect to E1, where box 1 has energy E1 and box 2 has energy E − E1.
- Identify the physical quantity that best represents the derivative of entropy with respect to energy at fixed volume and particle number, by considering properties of thermal equilibrium.
