LECTURE 3
Statistical Thermodynamics
In this lecture we will introduce two important thermodynamic
concepts: temperature and entropy. We already have intuitive ideas
about temperature and entropy. We know what hot and cold mean. We
think of entropy as disorder. But how can we quantify these
concepts and relate them to the microscopic physics of systems of many
particles? That is the subject of this lecture.
Thermal Interaction Between Macroscopic Systems
[Figure: two macroscopic systems A and A$'$, with energies $E$ and $E'$, in thermal contact.]
Consider two macroscopic systems A and A$'$ with energies $E$ and $E'$
respectively. Divide the energy scale into equal small intervals of magnitude
$\delta E$ and $\delta E'$. We can then define $\Omega(E)$ as the number of
microstates of system A with energy between $E$ and $E + \delta E$, and
$\Omega'(E')$ as the number of microstates of system A$'$ with energy between
$E'$ and $E' + \delta E'$.
These two systems may exchange energy with each other, i.e., they
are not thermally isolated from each other.
Let us define the total system A$^{(0)}$ as the two interacting systems
A$\,+\,$A$'$ and assume that A$^{(0)}$ has a total energy
$E^{(0)} = E + E'$ which is constant. We assume that energy is conserved and that
A$^{(0)}$ is isolated from the remainder of the universe. Let us also
assume that A$^{(0)}$ is in equilibrium (i.e., no macroscopic parameters
are changing in time). Let us consider an ensemble of such systems.
The energies $E$ and $E'$ may have any values consistent with the
constraint $E + E' = E^{(0)}$. Thus the distribution of energy between the
two systems is a function of only one variable, say $E$. Then the probability
$P(E)$ of finding system A with energy $E$ is proportional to $\Omega^{(0)}(E)$, i.e.,
the probability $P(E)$ of system A having energy $E$ is

$$P(E) = C\,\Omega^{(0)}(E) \qquad\qquad (1)$$

where $C$ is a constant and $\Omega^{(0)}(E)$ is the number of microstates of
the total system in which system A has energy $E$
(i.e., an energy between $E$ and $E + \delta E$).
But if A has energy $E$, then system A may be in any one of $\Omega(E)$
microstates, and system A$'$ may simultaneously be in any one of
$\Omega'(E')$ microstates. Thus

$$\Omega^{(0)}(E) = \Omega(E)\,\Omega'(E') = \Omega(E)\,\Omega'(E^{(0)} - E) \qquad\qquad (2)$$

and

$$P(E) = C\,\Omega(E)\,\Omega'(E^{(0)} - E) \qquad\qquad (3)$$
As discussed in Reif, sections 2.5 and 3.3, the quantity $\Omega(E)$
is a rapidly increasing function of its argument. For example, as shown by
Reif (section 2.5), for an ideal gas

$$\Omega(E) = B\,V^N E^{3N/2} \qquad\qquad (4)$$

where $B$ is a constant, $V$ is the volume occupied by the ideal gas and
$N$ is the number of gas molecules in the system. Since $N$ is of the order
of Avogadro's number, $\Omega(E)$ is indeed a rapidly increasing function
of $E$. Hence $P(E)$ is the product of a rapidly increasing function
($\Omega(E)$) and a rapidly decreasing function ($\Omega'(E^{(0)} - E)$)
of $E$, and has the behavior shown in the figure. $P(E)$ is a highly peaked
function if $\Omega(E)$ and $\Omega'(E^{(0)} - E)$ are both rapidly varying
functions.
[Figure: $P(E)$ versus $E$: a sharply peaked curve centered at the most probable energy $\tilde{E}$.]
Reif in section 3.7 argues that the fractional width of the peak in
$P(E)$ is of order

$$\frac{\Delta^* E}{\tilde{E}} \sim \frac{1}{\sqrt{f}} \qquad\qquad (5)$$

where $f$ is the number of degrees of freedom in the system, a number of the
order of Avogadro's number. For $f \sim 10^{24}$ the fractional width is of
order $10^{-12}$. So it is highly probable that the combined system will be
in a state with the most probable value of $E$, which we denote $\tilde{E}$.
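To get a feel for how sharp this peak is, here is a minimal numerical sketch in Python. It assumes the ideal-gas form $\Omega \propto E^{3N/2}$ of eq. (4) for both systems; the particle numbers and the energy unit are arbitrary illustrative choices.

```python
import numpy as np

# Sketch: two ideal gases A and A' share a fixed total energy E0.
# From eq. (4), ln Omega(E) = (3N/2) ln E + const, so up to a constant
# ln P(E) = (3N/2) ln E + (3N'/2) ln(E0 - E).

N  = 1e22      # number of molecules in A  (illustrative choice)
Np = 1e22      # number of molecules in A' (illustrative choice)
E0 = 1.0       # total energy in arbitrary units

E = np.linspace(0.48, 0.52, 5)   # energies near the expected peak at E0/2
lnP = 1.5 * N * np.log(E) + 1.5 * Np * np.log(E0 - E)

# Even a 2% deviation of E from the peak suppresses ln P by roughly 10^19,
# i.e., P drops by a factor of about exp(10^19): the peak is extraordinarily sharp.
print(lnP - lnP.max())

# Fractional width of the peak, eq. (5), with f ~ 3(N + N'):
print("fractional width ~ 1/sqrt(f) =", 1.0 / np.sqrt(3 * (N + Np)))
```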
To find $\tilde{E}$, we must find the maximum of the function $P(E)$.
Rather than $P(E)$ itself, consider $\ln P(E)$, which varies more slowly.
(This will make more sense after we define entropy.) The condition for a
maximum is

$$\frac{\partial \ln P(E)}{\partial E} = 0 \qquad\qquad (6)$$

The value of $E$ satisfying this equation is by definition $\tilde{E}$,
the most probable energy. Substituting

$$\ln P(E) = \ln C + \ln \Omega(E) + \ln \Omega'(E') \qquad\qquad (7)$$

yields

$$\frac{\partial \ln \Omega(E)}{\partial E} + \frac{\partial \ln \Omega'(E')}{\partial E'}\,\frac{\partial E'}{\partial E} = 0 \qquad\qquad (8)$$

when $E = \tilde{E}$. Or using

$$E' = E^{(0)} - E \qquad\Longrightarrow\qquad \frac{\partial E'}{\partial E} = -1 \qquad\qquad (9)$$

we obtain

$$\frac{\partial \ln \Omega(E)}{\partial E} = \frac{\partial \ln \Omega'(E')}{\partial E'} \qquad\qquad (10)$$
Now introduce the definition

$$\beta(E) \equiv \frac{\partial \ln \Omega(E)}{\partial E} \qquad\qquad (11)$$

Thus the most probable state of the system is characterized by

$$\beta(\tilde{E}) = \beta'(\tilde{E}') \qquad\qquad (12)$$

where $\tilde{E}' = E^{(0)} - \tilde{E}$. Now let us introduce the definition

$$\beta \equiv \frac{1}{kT} \qquad\qquad (13)$$

where $k$ is a constant called Boltzmann's constant and $T$ is a function
we will call temperature. Thus

$$\frac{1}{kT} = \frac{\partial \ln \Omega(E)}{\partial E} \qquad\qquad (14)$$
Let us consider for a moment the definition of $\beta$ in eq. (11).
We expect a system to have more states available as its energy
increases.
[Figure: $\ln \Omega(E)$ versus $E$: an increasing, concave curve whose slope $\beta$ is steep near the ground-state energy and shallow at high $E$.]
At low temperatures the energy of the system will be close to the ground
state energy $E_0$ and the slope of $\ln \Omega(E)$ will be steep, corresponding
to large $\beta$ and low $T$. At high $E$, the slope of $\ln \Omega(E)$
is small, corresponding to small $\beta$ and high temperatures.
Let us introduce one more definition. Define the entropy of a system
to be

$$S \equiv k \ln \Omega \qquad\qquad (15)$$

Thus combining these definitions

$$\frac{1}{kT} = \frac{\partial \ln \Omega}{\partial E} = \frac{1}{k}\,\frac{\partial S}{\partial E} \qquad\qquad (16)$$

or

$$\frac{1}{T} = \frac{\partial S}{\partial E} \qquad\qquad (17)$$
The condition of maximum probability corresponds to maximum total entropy
according to eq. (7): multiplying (7) by $k$ gives
$k\ln P(E) = k\ln C + S + S'$, so maximizing $P(E)$ is the same as maximizing
$S + S'$. The condition that this occurs can, by eq. (12), be written
as

$$T = T' \qquad\qquad (18)$$

In other words, if we put two systems together, they will adjust until they
are at the same temperature and their total entropy is maximized.
Note: One reason for defining entropy as $S = k\ln\Omega$ is to obtain a
definition that makes entropy an additive parameter of two interacting
systems, i.e., if

$$S = k \ln \Omega \qquad\qquad (19)$$

and

$$S' = k \ln \Omega' \qquad\qquad (20)$$

then

$$S^{(0)} = k \ln \Omega^{(0)} = k \ln (\Omega\,\Omega') = k\ln\Omega + k\ln\Omega' = S + S' \qquad\qquad (21)$$

The definition of entropy $S = k\ln\Omega$ agrees with our intuitive notion
of disorder. The greater the number of accessible states $\Omega$, the
larger the number of ways to arrange things, and the greater the ``disorder.''
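As a toy illustration (the counts are made up for concreteness): if $\Omega = 4$ and $\Omega' = 3$, the combined system has $\Omega^{(0)} = \Omega\,\Omega' = 12$ accessible states, and

$$S^{(0)} = k\ln 12 = k\ln 4 + k\ln 3 = S + S'$$

so multiplying state counts corresponds to adding entropies.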
[Figure: systems A and A$'$, initially separate and in equilibrium, then brought into thermal contact so that energy can flow between them.]
Let us explore some of the properties of these two newly defined quantities,
entropy and temperature. Consider the situation where A and A$'$ are initially
separated and in equilibrium. A has energy $E_i$ and A$'$ has energy $E'_i$.
Now assume that A and A$'$ are placed in thermal contact with
each other so that energy may be exchanged. The resulting situation just
after they are put together is highly unlikely unless just by chance
$E_i \approx \tilde{E}$ and $E'_i \approx \tilde{E}'$. The
system will evolve in time until

$$E_f = \tilde{E} \qquad\text{and}\qquad E'_f = \tilde{E}' \qquad\qquad (22)$$

and $P(E)$ becomes a maximum. At equilibrium the $\beta$
parameters of the systems are then equal (their temperatures are equal):

$$\beta_f = \beta'_f \qquad\qquad (23)$$

where $\beta_f \equiv \beta(E_f)$ and $\beta'_f \equiv \beta'(E'_f)$.
The final probability is a maximum and hence is never less than the probability
of the initial situation. In other words

$$S(E_f) + S'(E'_f) \geq S(E_i) + S'(E'_i) \qquad\qquad (24)$$

Let

$$\Delta S = S(E_f) - S(E_i) \qquad\qquad (25)$$

and

$$\Delta S' = S'(E'_f) - S'(E'_i) \qquad\qquad (26)$$

Then

$$\Delta S + \Delta S' \geq 0 \qquad\qquad (27)$$
This is one form of the second law of thermodynamics. It says that in any
process the entropy of an isolated system never decreases. Here we have
derived it from basic postulates, primarily from the ``equal probability of
microstates'' postulate.
Let us define the heat added to system A as

$$Q \equiv \Delta E = E_f - E_i \qquad\qquad (28)$$

and the heat added to system A$'$ as

$$Q' \equiv \Delta E' = E'_f - E'_i \qquad\qquad (29)$$

The conservation of energy (also called the first law of thermodynamics)
yields

$$\Delta E + \Delta E' = 0 \qquad\qquad (30)$$

or

$$Q + Q' = 0 \qquad\qquad (31)$$

or

$$Q = -Q' \qquad\qquad (32)$$
Note that since the probability (or entropy) increases in this process
of heat flow, and if we assume $Q$ is an infinitesimal,
then, using (7), we can write

$$\beta Q + \beta' Q' \geq 0 \qquad\qquad (33)$$

where

$$\beta = \frac{\partial \ln \Omega}{\partial E}\,, \qquad \beta' = \frac{\partial \ln \Omega'}{\partial E'} \qquad\qquad (34)$$

But from (13)

$$\beta = \frac{1}{kT}\,, \qquad \beta' = \frac{1}{kT'} \qquad\qquad (35)$$

Thus

$$\left(\frac{1}{T} - \frac{1}{T'}\right) Q \geq 0 \qquad\qquad (36)$$

The minus sign comes from one system absorbing heat and the other one
losing it ($Q' = -Q$).
If $T' > T$, then $1/T - 1/T' > 0$ and

$$Q \geq 0 \qquad\qquad (37)$$

or

$$Q' \leq 0 \qquad\qquad (38)$$

In other words, if the system A$'$ has a greater value of $T$ than
system A (i.e., system A$'$ is hotter than A), then energy flows
from A$'$ to A. This agrees with our intuitive idea of temperature
that heat flows from hot to cold.
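As a quick numerical check (the temperatures are illustrative choices): take $T = 300\,\mathrm{K}$ for A and $T' = 400\,\mathrm{K}$ for A$'$. Then

$$\frac{1}{T} - \frac{1}{T'} = \frac{1}{300\,\mathrm{K}} - \frac{1}{400\,\mathrm{K}} = \frac{1}{1200\,\mathrm{K}} > 0$$

so eq. (36) requires $Q \geq 0$: the colder system A absorbs heat from the hotter system A$'$.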
Temperature
Let's compare this parameter $T$ to our intuitive concept of temperature:
- If two systems are separately in equilibrium and characterized by
the same value of the temperature parameter, then the systems remain in
equilibrium when they are brought into thermal contact (no heat flows
between the two systems).
- Different values of $T$ imply the systems are not in equilibrium
with each other.
- Heat flows from larger $T$ to smaller $T$.
- If two systems are in thermal equilibrium with a third system
(i.e., the same $T$), then they must be in thermal equilibrium with each
other. This statement is known as the ``zeroth law of thermodynamics.''
One commonly uses a thermometer to determine the temperature. One can
make a thermometer in any one of a number of ways. For example:
- The height of mercury in a glass tube is taken as the
thermometric parameter $\theta$.
- Gas in a bulb at constant volume. The mean pressure of the gas
is taken as the thermometric parameter $\theta$.
- Gas in a bulb at constant pressure. The volume of the gas is
taken as the thermometric parameter $\theta$.
- An electrical conductor at constant pressure carrying a current.
The electrical resistance is taken as the thermometric parameter $\theta$.
Hence we see that there are numerous ways to measure the temperature.
Temperature is one of the macroscopic parameters used to characterize
a system, and it is useful to define an absolute temperature $T$
of a system. In physics we speak in terms of kelvin, which is based
on the absolute temperature. The absolute temperature has 2 important
properties:
- It is completely independent of the nature of the particular
thermometer used to perform the measurement.
- The absolute temperature $T$ is a parameter of fundamental
significance and enters theoretical statistical mechanical equations.
The absolute temperature is defined by

$$kT \equiv \frac{1}{\beta} \qquad\qquad (39)$$
I would like to make one other point about absolute temperature, namely
that you can think of $kT$ as the mean energy per degree of freedom.
To see this, start with

$$\ln \Omega(E) \approx f \ln E + \text{constant} \qquad\qquad (40)$$

where $f$ is the number of degrees of freedom of the system and the
energy $E$ is measured with respect to its ground state. This was shown
in Reif section 2.5, where $\Omega(E) \propto V^N E^{3N/2}$ for an ideal gas
with $f = 3N$ degrees of freedom (the exponent agrees with (40) up to a
factor of order unity). Then

$$\beta = \frac{\partial \ln \Omega}{\partial E} \approx \frac{f}{E} \qquad\qquad (41)$$

Thus when $E = \bar{E}$, one gets

$$\beta \approx \frac{f}{\bar{E}} \qquad\qquad (42)$$

and

$$\frac{1}{kT} \approx \frac{f}{\bar{E}} \qquad\qquad (43)$$

or

$$kT \approx \frac{\bar{E}}{f} \qquad\qquad (44)$$

Thus the quantity $kT$ is a measure of the mean energy, above
the ground state, per degree of freedom of a system.
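As a quick numerical aside (room temperature $T = 300\,\mathrm{K}$ is an illustrative choice), here is a short Python sketch of the size of $kT$:

```python
# Sketch: the size of kT at room temperature (T = 300 K is an
# illustrative choice; k is Boltzmann's constant).
k  = 1.380649e-23       # J/K
eV = 1.602176634e-19    # J per electron-volt
T  = 300.0              # K

kT = k * T
print(f"kT = {kT:.3e} J = {kT / eV:.4f} eV")   # ~4.14e-21 J ~ 0.026 eV
```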
Heat Reservoir
Another thermodynamic concept is that of a heat reservoir.
A heat reservoir or a heat bath
is much larger (has many more degrees of freedom) than the system in
contact with it. The temperature of the heat bath remains essentially
unchanged no matter how much heat the smaller system absorbs from it.
Assume system A$'$ of our two interacting systems is very large compared to A.
If system A$'$ has $\Omega'(E')$ microstates and absorbs
heat $Q' = \Delta E'$ from system A, then the
resulting change in $\ln \Omega'$ is given by the Taylor expansion

$$\ln \Omega'(E' + Q') - \ln \Omega'(E') = \beta' Q' + \frac{1}{2}\,\frac{\partial \beta'}{\partial E'}\,Q'^2 + \cdots \qquad\qquad (45)$$

If

$$\left|\frac{\partial \beta'}{\partial E'}\right| Q' \ll \beta' \qquad\qquad (46)$$

then the higher-order terms are negligible and

$$\ln \Omega'(E' + Q') - \ln \Omega'(E') = \beta' Q' = \frac{Q'}{kT'} \qquad\qquad (47)$$

The assumption (46), that $\beta'$ changes negligibly when the heat $Q'$ is
absorbed, is the criterion for a heat reservoir.
The temperature of the heat reservoir remains essentially
unchanged no matter how much heat it absorbs from the smaller system.
For any system which absorbs an infinitesimal
amount of heat $\delta Q$, then

$$\ln \Omega(E + \delta Q) - \ln \Omega(E) = \beta\,\delta Q \qquad\qquad (48)$$

or, since $S = k\ln\Omega$ implies

$$dS = k\left[\ln \Omega(E + \delta Q) - \ln \Omega(E)\right] \qquad\qquad (49)$$

we have

$$dS = \frac{\delta Q}{T} \qquad\qquad (50)$$

where $dS$ is the increase in the entropy of the system.
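As a simple illustration (the numbers are arbitrary): if a reservoir at $T = 300\,\mathrm{K}$ quasi-statically absorbs $\delta Q = 1\,\mathrm{J}$ of heat, its entropy increases by

$$dS = \frac{\delta Q}{T} = \frac{1\,\mathrm{J}}{300\,\mathrm{K}} \approx 3.3 \times 10^{-3}\,\mathrm{J/K}$$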
Clare Yu
2008-04-09