
LECTURE 14
Quantized Energy Levels
We know from quantum mechanics that the solutions to Schroedinger's equation
\begin{displaymath}
H\psi=E\psi
\end{displaymath} (1)

have quantized energy levels. For example, a particle of mass $m$ in a one-dimensional box of width $a$ with infinitely high walls has energy levels given by
\begin{displaymath}
E_{n}=\frac{n^{2}\pi^{2}\hbar^{2}}{2ma^{2}}
\end{displaymath} (2)

where $n=1,2,3,...$. A harmonic oscillator is another example. The energy eigenvalues are
\begin{displaymath}
E_{n}=(n+\frac{1}{2})\hbar\omega
\end{displaymath} (3)

where $n=0,1,2,3,...$. Notice once again that the energy levels are quantized. In this case they are evenly spaced by an amount $\Delta E=\hbar\omega$.

Electromagnetic radiation is also quantized. Light can be described as waves or as particles called photons. A photon has energy $h\nu$ where $\nu$ is the frequency of the electromagnetic wave. Recall that $\omega=2\pi\nu$ and that $\nu=c/\lambda$ where $c$ is the speed of light. Often one speaks in terms of the wavenumber $k=2\pi/\lambda$. If we make it a vector quantity $\vec{k}$, then we call it a wavevector. This is related to the momentum by $\vec{p}=\hbar \vec{k}$ and to the frequency by $\omega=ck$. So if the electromagnetic wave has a short wavelength, it has a high frequency and the photon carries a lot of energy. Lots of wiggles means lots of energy. Photons are massless and they travel at the speed of light.
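As a quick numerical illustration of these relations, here is a short Python sketch that evaluates $E=h\nu=\hbar\omega$ and $p=\hbar k$ for a visible photon; the 500 nm wavelength is just an assumed example value.

\begin{verbatim}
# Sketch: photon energy and momentum from the wavelength.
# The 500 nm wavelength is an assumed illustrative value.
import math

h    = 6.626e-34          # Planck's constant (J s)
hbar = h / (2 * math.pi)
c    = 3.0e8              # speed of light (m/s)

lam   = 500e-9            # assumed wavelength (m)
nu    = c / lam           # frequency
omega = 2 * math.pi * nu  # angular frequency
k     = 2 * math.pi / lam # wavenumber

E = h * nu                # photon energy, same as hbar*omega
p = hbar * k              # photon momentum, p = hbar k

print(E, "J =", E / 1.602e-19, "eV")   # about 2.5 eV
print(p, "kg m/s")
\end{verbatim}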

Periodic Boundary Conditions: Counting States
So energy is quantized into discrete energy levels. Each energy level is associated with a mode or eigenfunction. We have seen that it is often useful to be able to count the number of modes in a box that have energies between $E$ and $E+dE$.

Suppose we have a 3 dimensional box whose walls are parallel to the $x$, $y$, and $z$ axes with lengths $L_{x}$, $L_{y}$, and $L_{z}$. Thus the volume is $V=L_{x}L_{y}L_{z}$. We can solve this as a particle in a box problem. Inside the box the potential is zero. The eigenmodes are waves. However, let's choose boundary conditions such that the solutions of Schroedinger's equation are wavefunctions that are plane waves:

\begin{displaymath}
\Psi=A\exp[i(\vec{k}\cdot\vec{r}-\omega t)]=\psi(\vec{r})\exp(-i\omega t)
\end{displaymath} (4)

This is a propagating wave that is never reflected. So our box can't have hard walls. Rather let's imagine that our box is embedded in an infinite set of similar boxes in each of which the physical situation is exactly the same. In other words, each of these boxes is a repeat of the original box.
[Figure: the box repeated periodically (pbc.eps)]
To describe this situation, we use periodic boundary conditions which we can write as

\begin{eqnarray*}
\psi(x+L_{x},y,z) & = & \psi(x,y,z) \\
\psi(x,y+L_{y},z) & = & \psi(x,y,z) \\
\psi(x,y,z+L_{z}) & = & \psi(x,y,z) \\
\end{eqnarray*}



If we require our traveling wave solution
\begin{displaymath}
\psi(\vec{r})=\exp(i\vec{k}\cdot \vec{r})=\exp[i(k_{x}x+k_{y}y+k_{z}z)]
\end{displaymath} (5)

to satisfy these boundary conditions, then we must require that
\begin{displaymath}
k_{x}(x+L_{x})=k_{x}x+2\pi n_{x}
\end{displaymath} (6)

where $n_{x}$ is an integer. We can rewrite this as
\begin{displaymath}
k_{x}=\frac{2\pi}{L_{x}}n_{x}
\end{displaymath} (7)

Similarly,

\begin{eqnarray*}
k_{y}&=&\frac{2\pi}{L_{y}}n_{y} \\
k_{z}&=&\frac{2\pi}{L_{z}}n_{z} \\
\end{eqnarray*}



Here the numbers $n_{x}$, $n_{y}$, and $n_{z}$ are any set of integers: positive, negative, or zero.

We can use $p=\hbar k$ and $E=p^{2}/2m$ to deduce that

\begin{displaymath}
E(n_{x},n_{y},n_{z})=\frac{\hbar^{2}}{2m}(k_{x}^{2}+k_{y}^{2}+k_{z}^{2})
=\frac{\hbar^{2}(2\pi)^{2}}{2m}\left(\frac{n_{x}^{2}}{L_{x}^{2}}+
\frac{n_{y}^{2}}{L_{y}^{2}}+\frac{n_{z}^{2}}{L_{z}^{2}}\right)
\end{displaymath} (8)

Once again we see that the energy levels are quantized. Notice that for any kind of macroscopic volume where $L_{x}$, $L_{y}$, and $L_{z}$ are large, the energy levels are very closely spaced.
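To get a feel for how closely spaced these levels are, here is a short Python sketch (a numerical aside, not part of the derivation) that evaluates eq. (8) for an assumed cubic box with $L=1$ cm and an assumed electron mass; both choices are purely illustrative.

\begin{verbatim}
# Sketch: energy levels from periodic boundary conditions, eq. (8),
# for an assumed cubic box (L = 1 cm) and an assumed electron mass.
import math

hbar = 1.055e-34   # J s
m    = 9.11e-31    # kg (electron mass, assumed for illustration)
L    = 1e-2        # m  (assumed box edge)

def E(nx, ny, nz):
    # E = (hbar^2/2m) * (2 pi / L)^2 * (nx^2 + ny^2 + nz^2)
    return (hbar**2 / (2 * m)) * (2 * math.pi / L)**2 * (nx**2 + ny**2 + nz**2)

# The spacing between the lowest levels is of order 1e-33 J (~1e-14 eV),
# so for a macroscopic box the levels form an essentially continuous spectrum.
print(E(1, 0, 0) - E(0, 0, 0), "J")
\end{verbatim}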

Now we want to count the number of modes or waves that have wavevectors between $\vec{k}=(k_{x},k_{y},k_{z})$ and $\vec{k}+d\vec{k}=(k_{x}+dk_{x},k_{y}+dk_{y},k_{z}+dk_{z})$. For given values of $k_{y}$ and $k_{z}$, it follows from (7) that the number $\Delta n_{x}$ of possible integers $n_{x}$ for which $k_{x}$ lies in the range between $k_{x}$ and $k_{x}+dk_{x}$ is equal to

\begin{displaymath}
\Delta n_{x}=\frac{L_{x}}{2\pi}dk_{x}
\end{displaymath} (9)

We see that if $L_{x}$ is very large, a lot of states can be in the small interval $dk_{x}$. The same holds true for $dk_{y}$ and $dk_{z}$. So the number of states that lie between $\vec{k}$ and $\vec{k}+d\vec{k}$ is
\begin{displaymath}
\rho\, d^{3}k=\Delta n_{x}\Delta n_{y}\Delta n_{z}=
\left(\frac{L_{x}}{2\pi}dk_{x}\right)
\left(\frac{L_{y}}{2\pi}dk_{y}\right)
\left(\frac{L_{z}}{2\pi}dk_{z}\right)=
\frac{L_{x}L_{y}L_{z}}{(2\pi)^{3}}dk_{x}dk_{y}dk_{z}
\end{displaymath} (10)

or
\begin{displaymath}
\rho d^{3}k=\frac{V}{(2\pi)^{3}}d^{3}k
\end{displaymath} (11)

where $d^{3}k\equiv dk_{x}dk_{y}dk_{z}$ is an element of volume in ``k space.'' Notice that the number of states $\rho$ is independent of $\vec{k}$ and proportional to the volume $V$ under consideration. So the ``density of states'', i.e., the number of states per unit volume, lying between $\vec{k}$ and $\vec{k}+d\vec{k}$ is $d^{3}k/(2\pi)^{3}$ which is a constant independent of the magnitude or shape of the volume $V$. Note that $\rho$ denotes the number of single particle states.
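We can check eq. (11) by brute force. The Python sketch below counts the integer triples $(n_{x},n_{y},n_{z})$ with $|\vec{k}|<K$ for a cubic box and compares the count to the volume integral of $V\,d^{3}k/(2\pi)^{3}$ over a sphere of radius $K$; the cutoff $n_{max}=KL/2\pi$ is an assumed, dimensionless value.

\begin{verbatim}
# Sketch: brute-force check of rho d^3k = V d^3k/(2 pi)^3 for a cubic box.
# In units where n = k L/(2 pi), the formula predicts (4/3) pi n_max^3 states
# inside a sphere of radius n_max.  n_max = 20 is an assumed cutoff.
import math

n_max = 20

count = sum(1
            for nx in range(-n_max, n_max + 1)
            for ny in range(-n_max, n_max + 1)
            for nz in range(-n_max, n_max + 1)
            if nx*nx + ny*ny + nz*nz < n_max*n_max)

predicted = (4.0 / 3.0) * math.pi * n_max**3

print(count, predicted)   # the two agree to within a few percent
\end{verbatim}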

Using the relation $\vec{p}=\hbar \vec{k}$, we can also deduce that the number of states $\rho_{p}d^{3}p$ in the momentum range between $\vec{p}$ and $\vec{p}+d\vec{p}$ is

\begin{displaymath}
\rho_{p}d^{3}p=\rho d^{3}k=\frac{V}{(2\pi)^{3}}\frac{d^{3}p}{\hbar^{3}}
=V\frac{d^{3}p}{h^{3}}
\end{displaymath} (12)

where $h=2\pi\hbar$ is the ordinary Planck's constant. Notice that $Vd^3p$ is the volume of the classical 6 dimensional phase space occupied by a particle in a box of volume $V$ and with momentum between $\vec{p}$ and $\vec{p}+d\vec{p}$. Thus (12) shows that subdivision of this phase space into cells of size $h^3$ yields the correct number of quantum states for the particle. If we compare this to our classical expression $Vd^{3}p/h^{3}_o$, we see that the arbitrary constant $h_o$ (which has units of action) is replaced by Planck's constant $h$.

If k-space is isotropic, i.e., the same in every direction, then the number of states in a spherical shell lying between radii $k$ and $k+dk$ is

\begin{displaymath}
\rho_{k}dk=\rho d^{3}k=\frac{V}{(2\pi)^{3}}(4\pi k^{2}dk)
=\frac{V}{2\pi^{2}}k^{2}dk
\end{displaymath} (13)

If we are considering photons for which $\omega=ck$, then we can plug $k=\omega/c$ into (13) to get the number of states lying between $\omega$ and $\omega+d\omega$.
\begin{displaymath}
\rho_{\omega}d\omega=2\frac{V}{2\pi^{2}c^{3}}\omega^{2}d\omega
\end{displaymath} (14)

The factor of 2 comes from the fact that there are 2 photon polarizations. The polarization refers to the direction of the electric field vector $\vec{E}$ in the electromagnetic radiation. Since $\vec{E}$ must be perpendicular to $\vec{k}$, there are 2 polarization directions. We will use (14) in deriving blackbody radiation. Sometimes the term ``density of states'' for photons is used to refer to the number of states per unit volume per unit angular frequency:
\begin{displaymath}
N(\omega)=\frac{2}{2\pi^{2}c^{3}}\omega^{2}=\frac{1}{\pi^{2}c^{3}}\omega^{2}
\end{displaymath} (15)

The density of states is very useful for converting sums into integrals as we shall see.
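As an example of using the photon density of states, here is a short Python sketch that evaluates the number of modes per unit volume in a narrow band around an assumed wavelength of 500 nm with an assumed 1 nm bandwidth, using eq. (15).

\begin{verbatim}
# Sketch: photon modes per unit volume in a narrow band, N(omega) d(omega),
# using N(omega) = omega^2/(pi^2 c^3) from eq. (15).  The 500 nm wavelength
# and 1 nm bandwidth are assumed illustrative values.
import math

c    = 3.0e8     # m/s
lam  = 500e-9    # assumed wavelength (m)
dlam = 1e-9      # assumed bandwidth (m)

omega  = 2 * math.pi * c / lam
domega = 2 * math.pi * c * dlam / lam**2   # |d(omega)| from omega = 2 pi c/lambda

N = omega**2 / (math.pi**2 * c**3)         # states per unit volume per unit omega
print(N * domega, "modes per cubic meter in this band")
\end{verbatim}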

Recap
So let's recap where we are and what we've found. If a system with lots of particles has many-particle states $R$ with energy $E_R$, then the average of some quantity $A$ is given by
\begin{displaymath}
\overline{A}=\frac{1}{Z}\sum_{R}A_R e^{-\beta E_R}
\end{displaymath} (16)

where the partition function $Z$ is
\begin{displaymath}
Z= \sum_{R} e^{-\beta E_R}
\end{displaymath} (17)

This is for the canonical ensemble with fixed temperature $T$ and fixed particle number $N$. We have seen that $\ln Z$ is very useful in finding other quantities. For example,
\begin{displaymath}
F=-k_BT\ln Z
\end{displaymath} (18)


\begin{displaymath}
S=k_B(\ln Z+\beta \overline{E})
\end{displaymath} (19)


\begin{displaymath}
\overline{E}=-\frac{\partial\ln Z}{\partial\beta}
\end{displaymath} (20)


\begin{displaymath}
C_V=\left. \frac{\partial \overline{E}}{\partial T}\right\vert _V
\end{displaymath} (21)


\begin{displaymath}
\overline{p}=\frac{1}{\beta}\frac{\partial\ln Z}{\partial V}
\end{displaymath} (22)

But the problem is that it is very difficult to solve Schroedinger's equation to get $E_R$:
\begin{displaymath}
H\psi_R=E_R\psi_R
\end{displaymath} (23)

It is much easier to solve Schroedinger's equation to get single particle energies. So we consider systems (gases) of noninteracting particles. If we know how many particles are in each single particle state, then we just sum over all the particles to get the appropriate average, e.g., the mean energy of the whole system. If we have to treat the particles quantum mechanically because their wavefunctions overlap or because the temperature is low, then we need to pay attention to whether the particles are fermions or bosons. Fermions can have at most one particle in a state while bosons can have umpteen particles in a state. So now the mean energy is given by
\begin{displaymath}
\overline{E}=\sum_{s}\varepsilon_s \overline{n}_s
\end{displaymath} (24)

where the mean number of particles in state $s$ is given by
\begin{displaymath}
\overline{n}_s=\frac{1}{e^{\beta(\varepsilon_s-\mu)}\pm 1}
\end{displaymath} (25)

The $+$ sign is for fermions and the $-$ sign is for bosons. It's usually not easy to do sums, so it would be nice if we could convert the sum into an integral. That's why we calculated the density of single particle states. Then we can do the conversion:
\begin{displaymath}
\sum_{s}\rightarrow \int \rho(\varepsilon)d\varepsilon
\end{displaymath} (26)

or
\begin{displaymath}
\sum_{s}\rightarrow \int \rho(\omega) d\omega
\end{displaymath} (27)

So the mean energy becomes
\begin{displaymath}
\overline{E}=\int d\varepsilon \rho(\varepsilon) \overline{n}(\varepsilon)
\varepsilon
\end{displaymath} (28)

where
\begin{displaymath}
\overline{n}(\varepsilon)=\frac{1}{e^{\beta(\varepsilon-\mu)}\pm 1}
\end{displaymath} (29)
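A quick numerical look at eq. (29): the Python sketch below tabulates the mean occupation number for fermions ($+$) and bosons ($-$) with the energy measured in units of $k_{B}T$; setting $\mu=0$ is an assumed choice appropriate to photons.

\begin{verbatim}
# Sketch: the mean occupation number of eq. (29), with energy in units of k_B T.
# sign = +1 gives Fermi-Dirac, sign = -1 gives Bose-Einstein.
# mu = 0 is an assumed choice (appropriate for photons).
import math

def n_bar(eps_over_kT, mu_over_kT=0.0, sign=+1):
    return 1.0 / (math.exp(eps_over_kT - mu_over_kT) + sign)

for x in [0.5, 1.0, 2.0, 5.0]:
    print(x, n_bar(x, sign=+1), n_bar(x, sign=-1))
# The Fermi occupation never exceeds 1; the Bose occupation can be much
# larger than 1 at low energies.
\end{verbatim}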

Applications
Now let's go do some examples of this strategy. We will cover the following examples:
  1. Monatomic Ideal Gas
  2. Black Body Radiation
  3. Electron Gas (electrons in a metal)
  4. Bose-Einstein Condensation (if time permits)

Monatomic Ideal Gas
Let's start with our tried and true example of a monatomic ideal gas in the classical limit of low density or high temperature. We want to calculate the partition function. We found earlier that
\begin{displaymath}
Z=\frac{\zeta^N}{N!}
\end{displaymath} (30)

where $\zeta$ is the partition function for one particle.
\begin{eqnarray*}
\zeta & = & \sum_r e^{-\beta\varepsilon_r} \\
& = & \sum_{k_x,k_y,k_z}\exp\left[-\frac{\beta\hbar^2}{2m}
\left(k_x^2+k_y^2+k_z^2\right)\right]
\end{eqnarray*} (31)

where
\begin{displaymath}
k_x=\frac{2\pi}{L_x}n_x\;\;\;\;\;\;\;k_y=\frac{2\pi}{L_y}n_y
\;\;\;\;\;\;\;k_z=\frac{2\pi}{L_z}n_z
\end{displaymath} (32)

Since the number of allowed $k$ states is very large and since these states are very close together with spacing going as $ 1/L$, we can approximate the sum by an integral. This is where the density of states comes in handy.
\begin{displaymath}
\sum_{\vec{k}}\rightarrow\int\rho\, d^3k=\frac{V}{(2\pi)^3}\int d^3k
=\frac{V}{(2\pi)^3}\int 4\pi k^2 dk
=\frac{V}{2\pi^2}\int k^2 dk
\end{displaymath} (33)

So
\begin{eqnarray*}
\zeta & = & \frac{V}{2\pi^2}\int_{0}^{\infty}\exp\left[-\frac
{\beta\hbar^2k^2}{2m}\right]k^2 dk \\
& = & \frac{V}{2\pi^2}\frac{\sqrt{\pi}}{4}\left(\frac{2m}{\beta\hbar^2}
\right)^{3/2} \\
& = & \frac{V}{h^3}\left(2\pi mk_BT\right)^{3/2}
\end{eqnarray*} (34)

and
\begin{eqnarray*}
\ln Z & = & \ln\left(\frac{\zeta^N}{N!}\right) \\
& = & N\ln\zeta-N\ln N+N \\
& = & N(\ln\zeta-\ln N+1) \\
& = & N\left(\ln\frac{V}{N}-\frac{3}{2}\ln \beta+\frac{3}{2}\ln\frac
{2\pi m}{h^2}+1\right)
\end{eqnarray*} (35)

Hence
\begin{displaymath}
\overline{E}=-\frac{\partial \ln Z}{\partial\beta}=\frac{3}{2}Nk_BT
\end{displaymath} (36)

and
\begin{displaymath}
S=k_B(\ln Z+\beta\overline{E})=Nk_B\left(\ln\frac{V}{N}+\frac{3}{2}\ln T+
\sigma_o\right)
\end{displaymath} (37)

where
\begin{displaymath}
\sigma_o\equiv\frac{3}{2}\ln\frac{2\pi mk_B}{h^2}+\frac{5}{2}
\end{displaymath} (38)

This is identical to the result obtained for the purely classical ideal gas except that $\sigma_o$ now has a well defined value with $h_o=h=$ Planck's constant and the Gibbs paradox has automatically been taken care of.
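To put some numbers in, here is a Python sketch evaluating $\zeta$, $\overline{E}$ and $S$ from eqs. (34), (36) and (37) for an assumed sample of $10^{22}$ helium atoms in one liter at 300 K; all of these parameter values are just illustrative.

\begin{verbatim}
# Sketch: numbers for the monatomic ideal gas, eqs. (34), (36), (37).
# The sample (helium, N = 1e22, V = 1 liter, T = 300 K) is assumed for
# illustration only.
import math

kB = 1.381e-23    # J/K
h  = 6.626e-34    # J s
m  = 6.65e-27     # kg (helium atom)
T  = 300.0        # K
V  = 1e-3         # m^3
N  = 1e22

zeta = (V / h**3) * (2 * math.pi * m * kB * T)**1.5   # eq. (34)
E    = 1.5 * N * kB * T                               # eq. (36)
S    = N * kB * (math.log(zeta / N) + 2.5)            # same as eqs. (37)-(38)

print("zeta/N =", zeta / N)    # >> 1, so the classical limit applies
print("E =", E, "J   S =", S, "J/K")
\end{verbatim}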

Blackbody Radiation
Blackbody radiation is a nice example of the statistical mechanics that we have been discussing. A black body is a perfect absorber and absorbs all the radiation incident on it. If its temperature is kept constant, then the amount of power it radiates must equal the amount of power it absorbs. Otherwise it would heat up or cool off. We can imagine the black body being kept inside some kind of closed container which is at the same temperature $T$. The radiation field inside this enclosure is in equilibrium. In other words there is a gas of photons in thermal equilibrium inside the enclosure. By thermal equilibrium, we mean that the average occupation number $\overline{n_{s}}$ of the single particle states is given by the Planck distribution that we talked about in lecture 13.
\begin{displaymath}
\overline{n_{s}}=\frac{1}{e^{\beta \varepsilon_{s}}-1}
\end{displaymath} (39)

One can imagine making a histogram by counting the photon energy density in each frequency range from $\omega$ to $\omega+d\omega$. It turns out that this distribution of the energy density of blackbody radiation is a universal curve that depends only on the temperature $T$. In other words if one plots the distribution of the photon energy density (counting both directions of polarization) as a function of photon (angular) frequency $\omega$, the shape of the curve is universal and the position of the peak is a function only of the temperature. When we say that the curve is universal, we mean that it doesn't depend on the size or shape of the box, or what the walls are made of. All that matters is the temperature.

Blackbody radiation is historically important in physics for two reasons. The first is that the measurement of the spectral distribution in the late 1800's led Planck to come up with the idea of energy quantization. He couldn't explain the distribution unless he postulated that $E=h\nu$. This marked the birth of quantum mechanics. The second reason that blackbody radiation is important is that 3 K black body radiation pervades the universe and is the remnant of the Big Bang. This radiation is in the microwave region.

Let's calculate the distribution of the mean energy density of blackbody radiation. Since the size and shape of the box don't matter, let's imagine a rectangular box of volume V filled with a gas of photons that are in thermal equilibrium. The box has edges with lengths $L_{x}$, $L_{y}$, and $L_{z}$ such that each of these lengths is much larger than the longest wavelength of significance. There are 2 factors that determine the energy density at a given frequency. The first is the average energy in each state $s$ which is given by

\begin{displaymath}
\overline{n_{s}}\varepsilon_{s}=\frac{\varepsilon_{s}}
{e^{\beta \varepsilon_{s}}-1}
\end{displaymath} (40)

If we set $\varepsilon_{s}=\hbar\omega$ and $\overline{n_{s}}
=\overline{n}(\hbar\omega)$, we can rewrite this to give:
\begin{displaymath}
\overline{n}(\hbar\omega)\hbar\omega=
\frac{\hbar\omega}{e^{\beta \hbar\omega}-1}
\end{displaymath} (41)

The second factor is the number of states per unit volume whose frequency lies in the range between $\omega$ and $\omega+d\omega$. This is given by (15)
\begin{displaymath}
N(\omega)d\omega=\frac{1}{\pi^{2}c^{3}}\omega^{2}d\omega
\end{displaymath} (42)

So at temperature $T$ the mean energy density $\overline{u}(\omega,T)d\omega$ contained in the photon gas by photons whose frequencies are between $\omega$ and $\omega+d\omega$ is given by the product of the average energy in each single photon state and the density of states which lie in this frequency range:
\begin{displaymath}
\overline{u}(\omega,T)d\omega=\overline{n}(\hbar\omega)\hbar\omega\,
N(\omega)d\omega=\frac{\omega^{2}}{\pi^{2}c^{3}}\,\frac{\hbar\omega}
{\left(e^{\beta \hbar\omega}-1\right)}\, d\omega
\end{displaymath} (43)

We can rewrite this to give:
\begin{displaymath}
\overline{u}(\omega,T)d\omega=\frac{\hbar\omega^{3}}{\pi^{2}c^{3}}
\frac{d\omega}{\left(e^{\beta \hbar\omega}-1\right)}
\end{displaymath} (44)

This is Planck's law for the blackbody spectrum.
[Figure: the Planck blackbody spectrum (blackbody.eps)]
We can take the high temperature limit to get the classical limit of this spectrum. In the high temperature limit, $\beta\hbar\omega\ll 1$, so we can expand the exponential in the denominator:
\begin{displaymath}
e^{\beta \hbar\omega}-1\approx (1+\beta \hbar\omega)-1=
\beta \hbar\omega
\end{displaymath} (45)

So the high temperature limit of (44) is
\begin{displaymath}
\overline{u}(\omega,T)d\omega\approx\frac{\hbar\omega^{3}}
{\pi^{2}c^{3}(\beta \hbar\omega)}d\omega
\end{displaymath} (46)

or, using $\beta=1/k_{B}T$, we can write
\begin{displaymath}
\overline{u}(\omega,T)d\omega\approx
\frac{\omega^{2}k_{B}T}{\pi^{2}c^{3}}d\omega
\end{displaymath} (47)

This is the Rayleigh-Jeans formula for blackbody radiation. Notice that eqn. (47) increases as $\omega^{2}$. Therefore the classical spectrum (47) predicts that the energy density goes to infinity as the frequency goes to infinity. By the end of the 1800's the black body spectrum had been measured and the classical formula had been calculated. There was a clear lack of agreement, so people knew they had a problem. Planck resolved the conflict by proposing that electromagnetic energy was not continuous, but rather was quantized. He proposed $E=\hbar\omega$ (or $E=h\nu$) and derived Planck's law (44). This fit the data very well, and quantum mechanics was born.
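The Python sketch below compares the Planck spectrum, eq. (44), with the Rayleigh-Jeans form, eq. (47), at an assumed temperature of 300 K. The two agree when $\hbar\omega\ll k_{B}T$, and the Rayleigh-Jeans form greatly overestimates the energy density at high frequency.

\begin{verbatim}
# Sketch: Planck spectrum, eq. (44), vs. the Rayleigh-Jeans form, eq. (47),
# at an assumed temperature T = 300 K.
import math

hbar = 1.055e-34   # J s
kB   = 1.381e-23   # J/K
c    = 3.0e8       # m/s
T    = 300.0       # K (assumed)

def planck(omega):
    return hbar * omega**3 / (math.pi**2 * c**3 * math.expm1(hbar * omega / (kB * T)))

def rayleigh_jeans(omega):
    return omega**2 * kB * T / (math.pi**2 * c**3)

for omega in [1e11, 1e13, 1e15]:   # rad/s; kB*T/hbar is about 4e13 rad/s at 300 K
    print(omega, planck(omega), rayleigh_jeans(omega))
\end{verbatim}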

We can rewrite (44) in terms of a dimensionless parameter $\eta$:

\begin{displaymath}
\eta\equiv \beta\hbar\omega = \frac{\hbar\omega}{k_{B}T}
\end{displaymath} (48)

Planck's law becomes:
\begin{displaymath}
\overline{u}(\eta,T)d\eta=\frac{\hbar}{\pi^{2}c^{3}}
\left(\frac{k_{B}T}{\hbar}\right)^{4}\frac{\eta^{3}d\eta}
{e^{\eta}-1}
\end{displaymath} (49)

If we plot $\overline{u}(\eta,T)$ versus $\eta$, the maximum occurs around $\eta_{max}\approx 3$.

[Figure: the scaled blackbody spectrum $\overline{u}(\eta,T)$ versus $\eta$ (scaledbb.eps)]
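The location of the peak can be found numerically. The Python sketch below solves the peak condition for $\eta^{3}/(e^{\eta}-1)$, namely $3(1-e^{-\eta})=\eta$, by bisection; the bracket $[2,3]$ is chosen by inspection and the result is $\eta_{max}\approx 2.82$, consistent with the statement above that the maximum occurs around 3.

\begin{verbatim}
# Sketch: locating eta_max, the peak of eta^3/(e^eta - 1).  Setting the
# derivative to zero gives 3(1 - exp(-eta)) = eta; we solve this by bisection.
import math

def g(eta):
    return 3.0 * (1.0 - math.exp(-eta)) - eta   # zero at the peak

lo, hi = 2.0, 3.0     # bracket chosen by inspection: g(2) > 0, g(3) < 0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if g(lo) * g(mid) <= 0:
        hi = mid
    else:
        lo = mid

print("eta_max =", 0.5 * (lo + hi))   # about 2.8214
\end{verbatim}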
So if the maximum occurs at frequency $\omega_{1,max}$ at temperature $T_{1}$ and at $\omega_{2,max}$ at some other temperature $T_{2}$, then the two frequencies are related because
\begin{displaymath}
\eta_{max}=\frac{\hbar\omega_{1,max}}{k_{B}T_{1}}=
\frac{\hbar\omega_{2,max}}{k_{B}T_{2}}
\end{displaymath} (50)

or
\begin{displaymath}
\frac{\omega_{1,max}}{T_{1}}=\frac{\omega_{2,max}}{T_{2}}
\end{displaymath} (51)

This is called the Wien displacement law. It says that
\begin{displaymath}
\omega_{max}\propto T
\end{displaymath} (52)

This was initially an empirical relation that was deduced from the experimental data. We see that it also follows from Planck's law. It is often useful in physics to express things in terms of dimensionless parameters. The Wien displacement law is an example of useful scaling relations that can result from this.

We can also calculate the total energy density $\overline{u}_o(T)$ contained in the photon gas at temperature $T$ by integrating (44) over frequency:

\begin{displaymath}
\overline{u}_o(T)=\int_{0}^{\infty}\overline{u}(\omega,T)d\omega
\end{displaymath} (53)

Using (49), we can rewrite this as
\begin{displaymath}
\overline{u}_o(T)=\frac{\hbar}{\pi^{2}c^{3}}
\left(\frac{k_{B}T}{\hbar}\right)^{4}\int_{0}^{\infty}\frac{\eta^{3}d\eta}
{e^{\eta}-1}
\end{displaymath} (54)

One can evaluate the integral exactly. The answer is
\begin{displaymath}
\int_{0}^{\infty}\frac{\eta^{3}d\eta}{e^{\eta}-1}=\frac{\pi^4}{15}
\end{displaymath} (55)

Using this, one finds
\begin{displaymath}
\overline{u}_o(T)=\frac{\pi^2}{15}\frac{(k_{B}T)^{4}}{(\hbar c)^{3}}
\end{displaymath} (56)

This is known as the Stefan-Boltzmann law. The important point is that the total energy density goes as the fourth power of the temperature:
\begin{displaymath}
\overline{u}_o(T)\propto T^{4}
\end{displaymath} (57)
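We can check both the integral (55) and the $T^{4}$ law numerically. The Python sketch below does the integral with a simple trapezoidal rule and then evaluates $\overline{u}_o(T)$ from eq. (56) at an assumed temperature of 5800 K (roughly the temperature of the solar surface).

\begin{verbatim}
# Sketch: numerical check of eq. (55) and of u_o(T), eq. (56).
# The temperature T = 5800 K is an assumed illustrative value.
import math

def integrand(eta):
    return eta**3 / math.expm1(eta)    # expm1 keeps precision near eta = 0

# crude trapezoidal rule on [1e-6, 50]; the integrand is negligible beyond ~50
n, a, b = 200000, 1e-6, 50.0
step = (b - a) / n
total = 0.5 * (integrand(a) + integrand(b))
total += sum(integrand(a + i * step) for i in range(1, n))
total *= step

print(total, math.pi**4 / 15)          # both are about 6.4939

hbar, kB, c, T = 1.055e-34, 1.381e-23, 3.0e8, 5800.0
u_o = (math.pi**2 / 15.0) * (kB * T)**4 / (hbar * c)**3
print("u_o =", u_o, "J/m^3")
\end{verbatim}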

Finally the mean pressure $\overline{p}$ exerted on the walls of the enclosure by the radiation is simply related to the total energy density:

\begin{displaymath}
\overline{p}=\sum_{s}\overline{n}_{s}\left(-\frac{\partial\varepsilon_s}
{\partial V}\right)_T=\frac{1}{3}\overline{u}_o(T)
\end{displaymath} (58)

To see where this comes from, recall that
\begin{displaymath}
\varepsilon_s=\hbar\omega_s\quad\quad\quad \omega=ck
\end{displaymath} (59)

Now use
\begin{displaymath}
k_{x}=\frac{2\pi}{L_{x}}n_{x}\quad\quad
k_{y}=\frac{2\pi}{L_{y}}n_{y} \quad\quad
k_{z}=\frac{2\pi}{L_{z}}n_{z}
\end{displaymath} (60)

to obtain
\begin{eqnarray*}
\varepsilon_s & = & \hbar ck=\hbar c\sqrt{k_x^2+k_y^2+k_z^2} \\
& = & \hbar c\left(\frac{2\pi}{L}\right)\sqrt{n_x^2+n_y^2+n_z^2} \\
& = & 2\pi\hbar c V^{-1/3}\sqrt{n_x^2+n_y^2+n_z^2}
\end{eqnarray*} (61)

where $V=L^{3}$ is the volume (we have taken a cubic box with $L_{x}=L_{y}=L_{z}=L$ for simplicity). So the pressure associated with state $s$ is
\begin{displaymath}
p_s=-\frac{\partial\varepsilon_s}{\partial V}
=\frac{1}{3}\, 2\pi\hbar cV^{-4/3}\sqrt{n_x^2+n_y^2+n_z^2}
=\frac{\varepsilon_s}{3V}
\end{displaymath} (62)

So the average pressure for the system is
\begin{displaymath}
\overline{p}=\sum_s p_s n_s=\frac{1}{3V}\sum_s n_s \varepsilon_s
=\frac{\overline{E}}{3V}
\end{displaymath} (63)

or
\begin{displaymath}
\overline{p}=\frac{1}{3}\overline{u}_o
\end{displaymath} (64)

(The pressure can also be written as $\frac{1}{\beta}\left(\frac{\partial\ln Z}{\partial V}\right)_T=
-\left.\frac{\partial F}{\partial V}\right\vert _T$.) The ``3'' in the denominator reflects the fact that the box is 3 dimensional. Radiation pressure is quite small, but it is what gives comets their tails. Solar radiation pushes the tiny bits of dust and ice that come off the comet's icy nucleus away from the sun, and this produces the tail. The comet tail always points away from the sun. The power emitted $\sim$ flux $\sim c\overline{u}_o(T)$.
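As a rough number, the Python sketch below evaluates the radiation pressure $\overline{p}=\overline{u}_o(T)/3$ at an assumed temperature of 300 K; it comes out many orders of magnitude below atmospheric pressure, which is why radiation pressure only matters for very light objects such as comet dust.

\begin{verbatim}
# Sketch: radiation pressure p = u_o(T)/3, eq. (64), at an assumed T = 300 K.
import math

hbar = 1.055e-34   # J s
kB   = 1.381e-23   # J/K
c    = 3.0e8       # m/s
T    = 300.0       # K (assumed)

u_o = (math.pi**2 / 15.0) * (kB * T)**4 / (hbar * c)**3   # eq. (56)
p   = u_o / 3.0
print("p =", p, "Pa")   # of order 1e-6 Pa, versus ~1e5 Pa for the atmosphere
\end{verbatim}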

Principle of Detailed Balance
If an object is sitting in a cavity filled with radiation (photons) and is in equilibrium at temperature $T$, then the power it radiates must equal the power it absorbs. If this were not true, the body would be losing or gaining energy and would get cooler or would heat up. As a result, its temperature would no longer be the same as that of the ambient photons at temperature $T$; it would no longer be in equilibrium. So it must absorb the same amount of power as it emits in order to stay in equilibrium.

We can make an even stronger statement. Namely, that in equilibrium the power radiated and absorbed by the body must be equal for any particular element of area of the body, for any particular direction of polarization, and for any frequency range. To show that this must be true, one could imagine putting a shield or filter around the object that absorbs all radiation except that, in one small element of area, it is completely transparent to radiation in one direction with one polarization and in one narrow frequency range between $\omega$ and $\omega+d\omega$. In the presence of the shield the body must absorb and emit the same power in order to avoid heating up or cooling off, i.e., in order to stay in equilibrium. So the power radiated and absorbed by the body must be equal for any particular element of area of the body, for any particular direction of polarization, and for any frequency range. This is called the principle of detailed balance.

The principle of detailed balance is a fundamental result that is based on very general arguments. Microscopically it is a result of time reversal invariance and the fundamental assumption that the accessible microstates of an isolated system are equally probable. Consider a single isolated system consisting of several weakly interacting parts, e.g., a body and photons. When these parts are not interacting, the system can be in any one of its quantum states labeled by indices $r$, $s$, etc. When interactions are present, the interactions induce transitions between the states. Let $w_{rs}$ be the transition rate (or transition probability per unit time) from state $r$ to state $s$. Under time reversal, $t\rightarrow -t$, $r\rightarrow r^*$ and $s\rightarrow s^*$ where $r^*$ and $s^*$ are the time reversed states of $r$ and $s$. For example, if a particle has momentum $\vec{p}$ in state $r$, then it has momentum $-\vec{p}$ in state $r^*$. If the system is invariant (the same) under time reversal, then

\begin{displaymath}
w_{s^*r^*}=w_{rs}
\end{displaymath} (65)

This expresses the principle of microscopic reversibility. For example, if the body in the cavity emits a photon with wavevector $\vec{k}$, then the time reversed process is the absorption of a photon with wavevector $-\vec{k}$. Microscopic reversibility asserts that these two processes occur with equal probability.

If we have some initial set $A$ of states labeled by $r$ and some final set $B$ of states labeled by $s$, the transition probability from $A\rightarrow B$ is given by

\begin{displaymath}
W_{AB}=\sum_{r}\sum_{s}P_r w_{rs}
\end{displaymath} (66)

where $P_r$ is the probability of being in state $r$. The probability of landing in state $s$ is the probability $P_r$ of being in state $r$ multiplied by the transition rate $w_{rs}$ to state $s$. Similarly the time reversed process has a transition rate given by
\begin{displaymath}
W_{B^*A^*}=\sum_{s^*}\sum_{r^*}P_{s^*}w_{s^*r^*}
\end{displaymath} (67)

But the fundamental postulate of statistical mechanics states that the accessible microstates of an isolated system are equally probable. So $P_{r}=P_{s^*}$ and
\begin{displaymath}
W_{B^*A^*}=W_{AB}
\end{displaymath} (68)

This is the principle of detailed balance.

Radiation Emitted by a Body
Let us now apply the principle of detailed balance to a body at temperature $T$ in equilibrium with radiation (photon gas) inside an enclosure at this temperature. Let ${\cal P}_i({\bf k},\alpha)$ be the incident radiation power on a unit area of this body per unit frequency and solid angle range about the vector ${\bf k}$ with polarization $\alpha$. Let $a({\bf k},\alpha)$ be the fraction of incident power absorbed, the rest being reflected. By the principle of detailed balance, the power absorbed must equal the power emitted ${\cal P}_e(-{\bf k},\alpha)$ in the opposite direction $-{\bf k}$:
\begin{displaymath}
{\cal P}_e(-{\bf k},\alpha)=a({\bf k},\alpha){\cal P}_i({\bf k},\alpha)
\end{displaymath} (69)

For a blackbody, $a({\bf k},\alpha)=1$; a good absorber is a good emitter and vice-versa.

Let us now calculate explicitly the power ${\cal P}_i({\bf k},\alpha)$ incident per unit area of a body in an enclosure at temperature $T$. This is the incident energy flux. Let $f({\bf k})d^3{\bf k}$ be the mean number of photons per unit volume with a given polarization whose wavevector lies between ${\bf k}$ and ${\bf k}+d{\bf k}$. We found earlier in studying blackbody radiation that

\begin{displaymath}
f({\bf k})d^3{\bf k}=n(\hbar\omega)\frac{d^3{\bf k}}{(2\pi)^3}=
\frac{1}{e^{\beta\hbar\omega}-1}\frac{d^3{\bf k}}{(2\pi)^3}
\end{displaymath} (70)

So $(c\;dt\;\cos\theta)f(k)d^3{\bf k}$ photons of a given frequency and polarization strike a unit area of the body in a time $dt$, where $\theta$ is the angle between ${\bf k}$ and the normal to the surface. Since each photon carries energy $\hbar\omega$, one obtains
\begin{displaymath}
{\cal P}_i({\bf k},\alpha)d\omega d\Omega=(\hbar\omega)(c\;\cos\theta
f(k)d^3{\bf k})
\end{displaymath} (71)

[Figure: geometry of the photon flux incident on a unit area (flux.eps)]

Converting $d^3{\bf k}$ to spherical coordinates and using $k=\omega/c$, we obtain

\begin{displaymath}
d^3{\bf k}=k^2dkd\Omega=\frac{\omega^2}{c^3}d\omega d\Omega
\end{displaymath} (72)

and
\begin{displaymath}
{\cal P}_i({\bf k},\alpha)d\omega d\Omega=\frac{\hbar\omega^3}{c^2}f(k)
\cos\theta\, d\omega d\Omega
\end{displaymath} (73)

If the body absorbs isotropically, then the fraction of incident radiation absorbed is $a({\bf k},\alpha)=a(\omega)$, i.e., $a$ is independent of the direction ${\bf k}$. We are also assuming $a$ is independent of the polarization direction. So the power emitted in the direction ${\bf k}^{\prime}=-{\bf k}$ is
\begin{displaymath}
{\cal P}_e({\bf k}^{\prime},\alpha)=a(\omega){\cal P}_i({\bf k},\alpha)
\end{displaymath} (74)

Now let us find the total power ${\cal P}_e(\omega)d\omega$ emitted per unit area into the frequency range between $\omega$ and $\omega+d\omega$ for both polarization directions by integrating over the solid angle. Using $d\Omega=\sin\theta d\theta d\phi$ and multiplying by 2 for both polarizations, we write

\begin{eqnarray*}
{\cal P}_e(\omega)d\omega & = & 2\int_{\Omega}{\cal P}_e({\bf k}^{\prime},\alpha)
d\omega d\Omega \\
& = & a(\omega)\frac{2\hbar\omega^3}{c^2}f(k)d\omega\left(2\pi
\int^{\pi/2}_{0}\cos\theta\sin\theta d\theta\right) \\
& = & a(\omega)\frac{2\pi\hbar\omega^3}{c^2}f(k)d\omega
\end{eqnarray*} (75)

The right hand side is proportional to $(\hbar\omega)f(k)d^3{\bf k}$, i.e., to the mean radiation density $\overline{u}(\omega)d\omega$ inside the enclosure. So we can write
\begin{displaymath}
{\cal P}_e(\omega)d\omega=a(\omega)
\left[\frac{1}{4}c\overline{u}(\omega)d\omega\right]
\end{displaymath} (76)

It makes sense that the flux goes as $c\overline{u}(\omega)$; the factor of $1/4$ comes from averaging $\cos\theta$ over the hemisphere of incident directions. Using Eq. (70), we can write Eq. (75) as
\begin{displaymath}
{\cal P}_e(\omega)d\omega=a(\omega)\frac{\hbar}{4\pi^2c^2}
\frac{\omega^3 d\omega}{e^{\beta\hbar\omega}-1}
\end{displaymath} (77)

The total power ${\cal P}_e^{tot}$ emitted per unit area of the body is obtained by integrating Eq. (77) over frequency as we did in Eqs. (54)-(56) to obtain

\begin{displaymath}
{\cal P}_e^{tot}=a\left(\frac{1}{4}c\overline{u}_{o}\right)
=a\left(\sigma T^4\right)
\end{displaymath} (78)

where $\overline{u}_{o}$ is given by Eq. (56):
\begin{displaymath}
\overline{u}_o(T)=\frac{\pi^2}{15}\frac{(k_{B}T)^{4}}{(\hbar c)^{3}}
\end{displaymath} (79)

Eq. (78) is another form of the Stefan-Boltzmann law. The Stefan-Boltzmann constant $\sigma$ is
\begin{displaymath}
\sigma\equiv \frac{\pi^2}{60}\;\frac{k_B^4}{c^2\hbar^3}
\approx 5.67\times10^{-5}\;{\rm erg}/({\rm sec}\;{\rm cm}^2\;{\rm deg}^4)
\end{displaymath} (80)

For a perfect blackbody, $a=1$. For something shiny like gold, $a\approx 0.01$.
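Finally, here is a short Python sketch that computes $\sigma$ from eq. (80) (in SI units here rather than the CGS units quoted above) and the emitted power per unit area $a\sigma T^{4}$ of eq. (78) for an assumed perfect blackbody ($a=1$) at an assumed temperature of 5800 K.

\begin{verbatim}
# Sketch: the Stefan-Boltzmann constant from eq. (80), in SI units, and the
# emitted power per unit area, eq. (78), for an assumed blackbody at 5800 K.
import math

hbar = 1.055e-34   # J s
kB   = 1.381e-23   # J/K
c    = 3.0e8       # m/s

sigma = (math.pi**2 / 60.0) * kB**4 / (c**2 * hbar**3)
print("sigma =", sigma, "W m^-2 K^-4")        # about 5.67e-8 in SI units

a, T = 1.0, 5800.0   # assumed absorptivity and temperature
print("power per unit area =", a * sigma * T**4, "W/m^2")
\end{verbatim}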




Clare Yu 2009-03-10