LECTURE 3

Statistical Thermodynamics
In this lecture we will introduce two important thermodynamic concepts: temperature and entropy. We already have intuitive ideas about temperature and entropy. We know what hot and cold mean. We think of entropy as disorder. But how can we quantify these concepts and relate them to the microscopics of systems of many particles? That is the subject of this lecture.

Thermal Interaction Between Macroscopic Systems

\epsfxsize=3.0 true in \epsfbox{boxes.eps}

Consider two macroscopic systems A and A$^{\prime}$ with energies $E$ and $E^{\prime}$ respectively. Divide the energy scale into equal small intervals of magnitude $\delta E$ and $\delta E^{\prime}$. We can then define $\Omega(E)$ as the number of microstates of system A with energy between $E$ and $E+\delta E$, and $\Omega^{\prime}(E^{\prime})$ as the number of microstates of system A$^{\prime}$ with energy between $E^{\prime}$ and $E^{\prime}+\delta E^{\prime}$. These two systems may exchange energy with each other, i.e., they are not thermally isolated from each other.

Let us define the total system A$^o$ as the two interacting systems A+A$^{\prime}$ and assume that A$^o$ has a total energy $E^{o}=E+E^{\prime}$ which is constant. We assume that energy is conserved and that A$^o$ is isolated from the remainder of the universe. Let us also assume that A$^o$ is in equilibrium (i.e., no macroscopic parameters are changing in time). Let us consider an ensemble of such systems.

The energies $E$ and $E^{\prime}$ may have any values consistent with the constraint $E^{\prime}=E^o-E$. Thus the distribution of energy between the two systems is a function of only one variable, say $E$. Then the probability of finding system A with energy $E$ is proportional to $\Omega^o(E)$, i.e., the probability $P(E)$ of system A having energy $E$ is

\begin{displaymath}
P(E)=c\Omega^{o}(E)
\end{displaymath} (1)

where $c$ is a constant and $\Omega^o(E)$ is the number of microstates of the total system where system A has energy $E$ (i.e., an energy between $E$ and $E+\delta E$).

But if A has energy $E$, then system A may be in any one of $\Omega(E)$ microstates, and system A$^{\prime}$ may simultaneously be in any of $\Omega^{\prime}(E^{\prime})=\Omega^{\prime}(E^o-E)$ microstates. Thus

\begin{displaymath}
\Omega^o(E)=\Omega(E)\Omega^{\prime}(E^o-E)
\end{displaymath} (2)

and
\begin{displaymath}
P(E)=c\Omega(E)\Omega^{\prime}(E^o-E)
\end{displaymath} (3)

As discussed in Reif, sections 2.5 and 3.3, the quantity $\Omega(E)$ is a rapidly increasing function of its argument. For example, as shown by Reif (section 2.5), for an ideal gas
\begin{displaymath}
\Omega(E)=BV^NE^{3N/2}
\end{displaymath} (4)

where $B$ is a constant, $V$ is the volume occupied by the ideal gas and $N$ is the number of gas molecules in the system. Since $N$ is of the order of Avogadro's number, $\Omega(E)$ is indeed a rapidly increasing function of $E$. Hence $P(E)$ is the product of a rapidly increasing function ($\Omega(E)$) and a rapidly decreasing function ( $\Omega^{\prime}(E^o-E)$) of $E$ and has the behavior shown in the figure. $P(E)$ is a highly peaked function if $\Omega(E)$ and $\Omega^{\prime}(E^o-E)$ are both rapidly varying functions.

\epsfxsize=3.0 true in \epsfbox{peak.eps}

Reif in section 3.7 argues that the fractional width of the peak in $P(E)$ is of order

\begin{displaymath}
\frac{\Delta E}{\overline{E}}\approx\frac{1}{\sqrt{f}}
\end{displaymath} (5)

where $f$ is the number of degrees of freedom in the system, a number of the order of Avogadro's number.
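The sharpness of this peak is easy to see numerically. The following sketch (my addition, not from the text) uses the ideal-gas form $\Omega(E)\propto E^{3N/2}$ of eq. (4) with toy particle numbers to locate the maximum of $\ln P(E)$ and estimate its fractional width:

```python
import numpy as np

# Toy model: P(E) ∝ Ω(E)Ω'(E°−E) with Ω(E) ∝ E^(3N/2) (eq. 4).
# Work with ln P to avoid floating-point overflow.
N = Nprime = 100            # tiny compared to Avogadro's number, for illustration
E_total = 1.0               # total energy E° (arbitrary units)

E = np.linspace(1e-6, E_total - 1e-6, 100001)
lnP = 1.5 * N * np.log(E) + 1.5 * Nprime * np.log(E_total - E)

E_peak = E[np.argmax(lnP)]  # most probable energy; analytically N/(N+N')·E° = 0.5

# A drop of 1/2 in ln P marks the one-sigma points of the Gaussian peak
inside = E[lnP >= lnP.max() - 0.5]
frac_width = (inside[-1] - inside[0]) / E_peak

print(E_peak, frac_width)   # frac_width is of order 1/sqrt(f) with f = 3(N+N')
```

Raising $N$ toward Avogadro's number narrows the peak as $1/\sqrt{f}$, which is why the energy of a macroscopic system appears sharply defined.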

So it is highly probable that the combined system will be in a state with the most probable value of $E$, which we denote $\tilde{E}$. To find $\tilde{E}$, we must find the maximum of the function $P(E)$. Rather than $P(E)$ itself, consider $\ln P(E)$ which varies more slowly. (This will make more sense after we define entropy.) The condition for a maximum is

\begin{displaymath}
\frac{\partial\ln P(E)}{\partial E}=\frac{1}{P}\frac{\partial P}{\partial E}=0
\end{displaymath} (6)

The value of $E$ satisfying this equation is by definition $\tilde{E}$, the most probable energy. Substituting
\begin{displaymath}
\ln P(E)=\ln c + \ln \Omega(E) + \ln \Omega^{\prime}(E^{\prime})
\end{displaymath} (7)

yields
\begin{displaymath}
\frac{\partial\ln P(E)}{\partial E}=\frac{\partial\ln\Omega(E)}{\partial E}
+\frac{\partial\ln\Omega^{\prime}(E^{\prime})}{\partial E^{\prime}}
\frac{\partial E^{\prime}}{\partial E}=0
\end{displaymath} (8)

when $E=\tilde{E}$. Or using
\begin{displaymath}
\frac{\partial E^{\prime}}{\partial E}=\frac{\partial(E^o-E)}{\partial E}=-1
\end{displaymath} (9)

we obtain
\begin{displaymath}
\frac{\partial\ln\Omega(E)}{\partial E}
-\frac{\partial \ln\Omega^{\prime}(E^{\prime})}{\partial E^{\prime}}=0
\end{displaymath} (10)

Now introduce the definition
\begin{displaymath}
\beta\equiv \frac{\partial \ln\Omega(E)}{\partial E}
\end{displaymath} (11)

Thus the most probable state of the system is characterized by
\begin{displaymath}
\beta(\tilde{E})=\beta^{\prime}(\tilde{E}^{\prime})
\end{displaymath} (12)

where $\tilde{E}^{\prime}=E^o-\tilde{E}$. Now let us introduce the definition
\begin{displaymath}
\beta\equiv\frac{1}{k_BT}
\end{displaymath} (13)

where $k_B$ is a constant called Boltzmann's constant and $T$ is a function we will call temperature. Thus
\begin{displaymath}
T\equiv{\rm temperature}\equiv\frac{1}{k_B\beta}
\end{displaymath} (14)

Let us consider for a moment the definition of $\beta$ in eq. (11). We expect a system to have more states available as its energy $E$ increases.

\epsfxsize=3.0 true in \epsfbox{lnOmega_vs_E.eps}

At low temperatures the energy of the system will be close to the ground state energy $E_g$ and the slope of $\ln\Omega(E)$ will be steep, corresponding to large $\beta$ and low $T$. At high $E$, the slope of $\ln\Omega(E)$ is small, corresponding to small $\beta$ and high temperature.
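A quick numerical sketch (my addition, in reduced units with illustrative numbers) of how the slope of $\ln\Omega$ sets $\beta$ and $T$ for the ideal-gas form of eq. (4):

```python
import numpy as np

# For ln Ω(E) = (3N/2) ln E + const (eq. 4), the slope β = 3N/(2E)
# is steep at low E (low T) and shallow at high E (high T).  k_B = 1.
kB = 1.0
N = 100

E = np.array([0.1, 1.0, 10.0])     # sample energies above the ground state
beta = 1.5 * N / E                 # β = ∂lnΩ/∂E, eq. (11)
T = 1.0 / (kB * beta)              # eq. (14)

print(beta)   # decreasing: 1500, 150, 15
print(T)      # increasing with E
```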

Let us introduce one more definition. Define the entropy of a system to be

\begin{displaymath}
S\equiv{\rm entropy}\equiv k_B\ln\Omega
\end{displaymath} (15)

Thus combining these definitions
\begin{displaymath}
\frac{\partial\ln\Omega(E)}{\partial E}=\frac{1}{k_BT}=\frac{1}{k_B}
\frac{\partial S}{\partial E}
\end{displaymath} (16)

or
\begin{displaymath}
\frac{1}{T}=\frac{\partial S}{\partial E}
\end{displaymath} (17)

According to eq. (7), the condition of maximum probability corresponds to maximum total entropy $S+S^{\prime}$. By eqs. (12) and (13), the condition for this maximum can be written as
\begin{displaymath}
T=T^{\prime}
\end{displaymath} (18)

In other words, if we put two systems together, they will exchange energy until they are at the same temperature and their total entropy is maximized.

Note: One reason for defining entropy as $S=k_B\ln\Omega$ is to obtain a definition that makes entropy an additive parameter of two interacting systems, i.e., if

\begin{displaymath}
S=k_B\ln\Omega
\end{displaymath} (19)

and
\begin{displaymath}
S^{\prime}=k_B\ln\Omega^{\prime}
\end{displaymath} (20)

then
\begin{displaymath}
S^o=k_B\ln\Omega\Omega^{\prime}=k_B\ln\Omega+k_B\ln\Omega^{\prime}=S+S^{\prime}
\end{displaymath} (21)

The definition of entropy $S=k_B\ln\Omega$ agrees with our intuitive notion of disorder. The greater the number of accessible states $\Omega$, the larger the number of ways to arrange things, and the greater the ``disorder.''
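Additivity is easy to verify numerically. A minimal sketch (my addition; the microstate counts are made up and $k_B$ is set to 1 for readability):

```python
import math

# Additivity of S = k_B ln Ω (eqs. 19-21): the combined system has
# Ω·Ω' accessible microstates, so its entropy is the sum S + S'.
kB = 1.0
Omega, Omega_prime = 1e20, 1e30   # illustrative microstate counts of A and A'

S = kB * math.log(Omega)
S_prime = kB * math.log(Omega_prime)
S_total = kB * math.log(Omega * Omega_prime)

print(S_total, S + S_prime)       # equal, up to floating-point rounding
```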

\epsfxsize=3.0 true in \epsfbox{boxesapart.eps}

Let us explore some of the properties of these two newly defined quantities, entropy and temperature. Consider the situation where A and A$^{\prime}$ are initially separated and in equilibrium. A has energy $E_i$ and A$^{\prime}$ has energy $E_i^{\prime}$. Now assume that A and A$^{\prime}$ are placed in thermal contact with each other so that energy may be exchanged. The resulting situation just after they are put together is highly unlikely unless just by chance $E_i=\tilde{E}$ and $E_i^{\prime}=\tilde{E}^{\prime}=E^o-\tilde{E}$. The system will evolve in time until

\begin{displaymath}
E_f=\tilde{E} \;\;\;\;{\rm and}\;\;\;\; E_f^{\prime}=\tilde{E}^{\prime}
\end{displaymath} (22)

and $P(E)$ becomes a maximum. At equilibrium the $\beta$ parameters of the systems are then equal (their temperatures are equal):
\begin{displaymath}
\beta_f=\beta_f^{\prime}
\end{displaymath} (23)

where $\beta_f\equiv\beta(E_f)$ and $\beta_f^{\prime}\equiv\beta^{\prime}(E_f^{\prime})$.

The final probability is a maximum and hence is never less than the probability of the initial situation. In other words

\begin{displaymath}
S(E_f)+S^{\prime}(E_f^{\prime})\geq S(E_i)+S^{\prime}(E_i^{\prime})
\end{displaymath} (24)

Let
\begin{displaymath}
\Delta S\equiv S(E_f)-S(E_i)
\end{displaymath} (25)

and
\begin{displaymath}
\Delta S^{\prime}\equiv S^{\prime}(E_f^{\prime})-S^{\prime}(E_i^{\prime})
\end{displaymath} (26)

Then
\begin{displaymath}
\Delta S +\Delta S^{\prime}\geq 0
\end{displaymath} (27)

This is one form of the second law of thermodynamics. It says that in any process the total entropy of an isolated system never decreases. Here we have derived it from basic postulates, primarily the postulate of equal a priori probability of accessible microstates.

Let us define the heat added to system A as

\begin{displaymath}
Q\equiv E_f-E_i
\end{displaymath} (28)

and the heat added to system A$^{\prime}$ as
\begin{displaymath}
Q^{\prime}\equiv E_f^{\prime}-E_i^{\prime}
\end{displaymath} (29)

The conservation of energy (also called the first law of thermodynamics) yields
\begin{displaymath}
E_f+E_f^{\prime}=E_i+E_i^{\prime}
\end{displaymath} (30)

or
\begin{displaymath}
Q+Q^{\prime}=0
\end{displaymath} (31)

or
\begin{displaymath}
Q=-Q^{\prime}
\end{displaymath} (32)

Note that the probability (and hence the entropy) increases in this process of heat flow. If we assume $\Delta E=E_f-E_i$ is infinitesimal, then, using (7), we can write
$\displaystyle \Delta \ln P$ $\textstyle =$ $\displaystyle \frac{\partial\ln\Omega(E_i)}{\partial E}\Delta E+
\frac{\partial\ln\Omega^{\prime}(E^{\prime}_i)}{\partial E^{\prime}}
\Delta E^{\prime}$  
  $\textstyle =$ $\displaystyle \frac{\partial\ln\Omega(E_i)}{\partial E}(E_f-E_i)+
\frac{\partial\ln\Omega^{\prime}(E^{\prime}_i)}{\partial E^{\prime}}
(E_f^{\prime}-E_i^{\prime})$  
  $\textstyle \geq$ $\displaystyle 0$ (33)

where
\begin{displaymath}
\frac{\partial\ln\Omega(E_i)}{\partial E}=
\left.\frac{\partial\ln\Omega(E)}{\partial E}\right\vert _{E=E_i}
\end{displaymath} (34)

But from (11) and (13)
\begin{displaymath}
\frac{1}{T_i}=\beta_i= \frac{\partial \ln\Omega(E_i)}{\partial E}
\end{displaymath} (35)

Thus
\begin{displaymath}
\beta_iQ+\beta_i^{\prime}Q^{\prime}=(\beta_i-\beta_i^{\prime})Q\geq 0
\end{displaymath} (36)

The minus sign arises because $Q^{\prime}=-Q$ (eq. 32): one system absorbs the heat that the other loses. If $Q>0$, then
\begin{displaymath}
\beta_i\geq\beta_i^{\prime}
\end{displaymath} (37)

or
\begin{displaymath}
T_i\leq T_i^{\prime}
\end{displaymath} (38)

In other words, if the system A$^{\prime}$ has a greater value of T than system A (i.e., system A$^{\prime}$ is hotter than A), then energy flows from A$^{\prime}$ to A. This agrees with our intuitive idea of temperature that heat flows from hot to cold.
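This relaxation can be mimicked with a toy simulation (my addition, not from the text): two ideal-gas subsystems with $\ln\Omega=(3N/2)\ln E$ pass small parcels of energy whenever doing so does not decrease the total entropy, and the flow stops precisely when $\beta=\beta^{\prime}$.

```python
import math

def entropy(E, N):
    """S/k_B up to an E-independent constant, ideal-gas form (eq. 4)."""
    return 1.5 * N * math.log(E)

N, Nprime = 100, 300
E, Eprime = 9.0, 3.0      # A starts hot: E/N = 0.09 vs E'/N' = 0.01
dE = 0.001                # energy parcel passed from A to A'

# Heat flows from hot A to cold A' as long as total entropy grows (eq. 27)
while (entropy(E - dE, N) + entropy(Eprime + dE, Nprime)
       >= entropy(E, N) + entropy(Eprime, Nprime)):
    E, Eprime = E - dE, Eprime + dE

# Equilibrium: β = β', i.e. 3N/(2E) = 3N'/(2E'), so equal energy per particle
print(E / N, Eprime / Nprime)         # both ≈ 0.03
```

The greedy rule "accept only entropy-increasing transfers" is a crude stand-in for the actual dynamics, but it halts at the same equal-temperature state eq. (18) predicts.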

Temperature
Let's compare this parameter $T$ to our intuitive concept of temperature:
  1. If two systems are separately in equilibrium and characterized by the same value of the temperature parameter, then the systems remain in equilibrium when they are brought into thermal contact (no heat flows between the two systems).
  2. Different values of $T$ imply the systems are not in equilibrium with each other.
  3. Heat flows from larger $T$ to smaller $T$.
  4. If two systems are in thermal equilibrium with a third system (i.e. the same $T$), then they must be in thermal equilibrium with each other. This statement is known as the ``zeroth law of thermodynamics.''

One commonly uses a thermometer to determine the temperature. A thermometer can be made in any one of a number of ways. For example:

  1. The height of mercury in a glass tube is taken as the thermometric parameter $\theta$.
  2. Gas in a bulb at constant volume. The mean pressure of the gas is taken as the thermometric parameter $\theta$.
  3. Gas in a bulb at constant pressure. The volume of the gas is taken as the thermometric parameter $\theta$.
  4. An electrical conductor at constant pressure that is carrying current. The electrical resistance is taken as the thermometric parameter $\theta$.
Hence we see that there are numerous ways to measure temperature. Temperature is one of the macroscopic parameters used to characterize a system, and it is useful to define an absolute temperature $T$ of a system. In physics, temperatures are quoted in kelvin, the unit of absolute temperature. The absolute temperature has two important properties:
  1. It is completely independent of the nature of the particular thermometer used to perform the measurement.
  2. The absolute temperature $T$ is a parameter of fundamental significance and enters theoretical statistical mechanical equations.
The absolute temperature is defined by
\begin{displaymath}
T\equiv\frac{1}{k_B\beta}
\end{displaymath} (39)

I would like to make one other point about absolute temperature, namely that $k_BT$ sets the scale of the mean energy per degree of freedom. To see this, start with

\begin{displaymath}
\Omega(E)\propto E^{f/2}
\end{displaymath} (40)

where $f$ is the number of degrees of freedom of the system and the energy $E$ is measured with respect to its ground state. This was shown in Reif section 2.5 where $\Omega(E)=BV^N E^{f/2}$ for an ideal gas with $f=3N$ degrees of freedom. Then
\begin{displaymath}
\ln\Omega(E)\approx \frac{f}{2}\ln E + {\rm constant}
\end{displaymath} (41)

Thus when $E=\tilde{E}\approx\overline{E}$, one gets
\begin{displaymath}
\beta=\frac {\partial\ln\Omega(E)}{\partial E}\approx\frac{f}{2\overline{E}}
\end{displaymath} (42)

and
\begin{displaymath}
k_BT\approx\frac{2\overline{E}}{f}
\end{displaymath} (43)

or
\begin{displaymath}
\frac{1}{2}k_BT\approx\frac{\overline{E}}{f}
\end{displaymath} (44)

Thus the quantity $k_BT/2$ is a measure of the mean energy, above the ground state, per degree of freedom of a system.
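A numerical check of eq. (44) in reduced units ($k_B=1$; my own toy numbers), using a finite-difference slope for $\beta$:

```python
import math

# With ln Ω(E) = (f/2) ln E (eqs. 40-41), a finite-difference slope
# gives β, and the mean energy per degree of freedom comes out as k_B T/2.
kB = 1.0
f = 300            # degrees of freedom (illustrative; real systems have ~10^24)
E = 1.0            # mean energy above the ground state, arbitrary units

lnOmega = lambda x: 0.5 * f * math.log(x)
h = 1e-6
beta = (lnOmega(E + h) - lnOmega(E - h)) / (2 * h)   # eq. (11)
T = 1.0 / (kB * beta)                                # eq. (14)

print(E / f, 0.5 * kB * T)   # both 1/300: eq. (44)
```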

Heat Reservoir
Another thermodynamic concept is that of a heat reservoir. A heat reservoir or a heat bath is much larger (has many more degrees of freedom) than the system in contact with it. The temperature of the heat bath remains essentially unchanged no matter how much heat the smaller system absorbs from it. Assume system A$^{\prime}$ of our two interacting systems is very large compared to A. If system A$^{\prime}$ has $\Omega^{\prime}(E^{\prime})$ microstates and absorbs heat $Q^{\prime}=\Delta E^{\prime}$ from system A, then the resulting change in $\ln\Omega^{\prime}(E^{\prime})$ is
$\displaystyle \ln\Omega^{\prime}(E^{\prime}+Q^{\prime})-\ln\Omega^{\prime}(E^{\prime})$ $\textstyle =$ $\displaystyle \frac{\partial \ln\Omega^{\prime}}{\partial E^{\prime}}Q^{\prime}
+\frac{1}{2}\frac{\partial^2 \ln\Omega^{\prime}}{\partial E^{\prime\;2}}
Q^{\prime\;2}+...$  
  $\textstyle =$ $\displaystyle \beta^{\prime}Q^{\prime}+\frac{1}{2}\frac{\partial \beta^{\prime}}
{\partial E^{\prime}}Q^{\prime\;2}+...$ (45)

If $\frac{\partial \beta^{\prime}}{\partial E^{\prime}}Q^{\prime}\ll
\beta^{\prime}$ then
$\displaystyle \ln\Omega^{\prime}(E^{\prime}+Q^{\prime})-\ln\Omega^{\prime}(E^{\prime})$ $\textstyle =$ $\displaystyle \beta^{\prime}Q^{\prime}$  
  $\textstyle =$ $\displaystyle \frac{Q^{\prime}}{k_BT^{\prime}}$  
  $\textstyle =$ $\displaystyle \frac{\Delta S^{\prime}}{k_B}$ (46)

or
\begin{displaymath}
\Delta S^{\prime}=\frac{Q^{\prime}}{T^{\prime}}
\end{displaymath} (47)

The assumption $\frac{\partial \beta^{\prime}}{\partial E^{\prime}}Q^{\prime}\ll
\beta^{\prime}$ is the criterion for a heat reservoir: the temperature of the heat reservoir remains essentially unchanged no matter how much heat it absorbs from the smaller system.

For any system that absorbs a differential (infinitesimal) amount of heat $dQ$,

\begin{displaymath}
\ln\Omega(E+dQ)-\ln\Omega(E)=\frac {\partial\ln\Omega(E)}{\partial E}dQ
=\beta dQ
\end{displaymath} (48)

or, since $S=k_B\ln\Omega$ implies
\begin{displaymath}
\frac{dS}{k_B}=\ln\Omega(E+dQ)-\ln\Omega(E)
\end{displaymath} (49)

we have
\begin{displaymath}
dS=\frac{dQ}{T}
\end{displaymath} (50)

where $dS$ is the increase in the entropy of the system.
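The reservoir approximation behind eqs. (46)-(47) can be checked numerically (my addition; ideal-gas form of $\ln\Omega^{\prime}$, illustrative numbers, $k_B=1$):

```python
import math

# Exact entropy change of a large reservoir vs. the first-order estimate Q'/T'.
N_res = 10**6          # reservoir much larger than the system
E_res = 1.0e6          # reservoir energy (arbitrary units)
Q = 1.0                # heat absorbed from the small system

lnOmega = lambda x: 1.5 * N_res * math.log(x)   # ideal-gas form (eq. 4)
dS_exact = lnOmega(E_res + Q) - lnOmega(E_res)  # k_B[lnΩ'(E'+Q') − lnΩ'(E')]

beta_res = 1.5 * N_res / E_res     # β' = ∂lnΩ'/∂E'
dS_approx = beta_res * Q           # = Q'/(k_B T'), eqs. (46)-(47)

print(dS_exact, dS_approx)         # nearly equal: ΔS' ≈ Q'/T'
```

Because $Q\ll E_{\rm res}$, the second-order term $\frac{1}{2}(\partial\beta^{\prime}/\partial E^{\prime})Q^{\prime\,2}$ is negligible, which is exactly the heat-reservoir criterion above.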




Clare Yu 2008-04-09