LECTURE 2
Quantum Numbers
When we solved the problem of a particle in a one-dimensional box, we labelled the eigenvalues and eigenvectors with an integer n. n is an example of a quantum number: a number which characterizes an eigenstate of the system. It changes from eigenstate to eigenstate, but if the system is in a given eigenstate, it remains in that eigenstate and the quantum numbers describing that state do not change in time. This is why an eigenstate is called a stationary state. Typically quantum numbers are associated with symmetries. Symmetry means that the system looks the same under certain operations. For example, if the system is invariant under time translation, then the Hamiltonian is time independent, and energy is conserved. This was true of the particle in a box, and we labelled the energy eigenstates by n.

Another possible symmetry is spatial translation. If the potential is invariant under spatial translation, then momentum is a good quantum number. (Momentum is the generator of translations.) This would be true of a constant potential or if there were no potential. Classically momentum is conserved as long as the system is not subjected to a force. The force is the negative gradient of the potential: $\vec{F}=-\nabla V(\vec{r})$. So momentum is a good quantum number as long as the potential does not vary spatially, i.e., as long as it's constant. (We don't usually refer to forces in quantum mechanics.) If the potential is constant or zero, then we have a free particle with momentum $\vec{p}$, energy $E=p^{2}/2m$, and $\psi\sim\exp(i\vec{p}\cdot \vec{r}/\hbar)$. If we have a periodic potential, then we have discrete translational symmetry, and momentum takes on discrete values. This is what we saw with our 3 dimensional box that had periodic boundary conditions:

\begin{eqnarray*}
p_{x,n}=\hbar k_{x,n}=\frac{2\pi\hbar}{L_{x}}n_{x} \\
p_{y,n}=\hbar k_{y,n}=\frac{2\pi\hbar}{L_{y}}n_{y} \\
p_{z,n}=\hbar k_{z,n}=\frac{2\pi\hbar}{L_{z}}n_{z} \\
\end{eqnarray*}


where the dimensions of the box are $L_{x}$, $L_{y}$, and $L_{z}$. Notice that the momentum eigenstate is also the energy eigenstate:

\begin{displaymath}\hat{H}\psi_{n}=E_{n}\psi_{n}
\end{displaymath} (1)

and

\begin{eqnarray*}\hat{p}_{x}\psi_{n}=p_{x,n}\psi_{n} \\
\hat{p}_{y}\psi_{n}=p_{y,n}\psi_{n} \\
\hat{p}_{z}\psi_{n}=p_{z,n}\psi_{n} \\
\end{eqnarray*}


where we are using the same $\psi_{n}$ in both equations. (One way to see that $\hat{p}_{x}\psi=p_{x}\psi$ is by plugging in $\psi\sim\exp(i\vec{p}\cdot \vec{r}/\hbar)$ and using $\hat{p}_{x}=(\hbar/i)\partial/\partial x$.) Another way to characterize a conserved quantity is to recall the equations of motion. For wavefunctions, the Schrödinger equation takes the form

\begin{displaymath}i\hbar\frac{\partial\psi}{\partial t}=H\psi
\end{displaymath} (2)

There is also an equation of motion for operators, the Heisenberg equation of motion. Let $\hat{A}$ be some operator. Its equation of motion is

\begin{displaymath}i\hbar\frac{\partial\hat{A}}{\partial t}=[\hat{A},\hat{H}]
\end{displaymath} (3)

So if $i\hbar\partial\hat{A}/\partial t=0$, then we must have $[\hat{A},\hat{H}]=0$. This is the condition that $\hat{A}$ is a constant of the motion, i.e., a conserved quantity. So if momentum is a conserved quantity, then it commutes with the Hamiltonian:

\begin{displaymath}[\hat{H},\hat{p}]= \hat{H}\hat{p}-\hat{p}\hat{H}=0
\end{displaymath} (4)

But this is getting too technical.
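Still, the statement that a plane wave is a momentum eigenstate can be checked concretely. Below is a small numerical sketch (not part of the original lecture; the choices $\hbar=1$ and p=2.5 are arbitrary) that applies $\hat{p}_{x}=(\hbar/i)\partial/\partial x$ to $\psi\sim e^{ipx/\hbar}$ by finite differences and recovers a constant eigenvalue p:

```python
import numpy as np

# Check numerically that psi(x) = exp(i p x / hbar) satisfies
# p_hat psi = p psi, where p_hat = (hbar/i) d/dx.
# Units with hbar = 1 and the value p = 2.5 are illustrative choices.
hbar = 1.0
p = 2.5
x = np.linspace(0.0, 10.0, 100001)
dx = x[1] - x[0]
psi = np.exp(1j * p * x / hbar)

# p_hat psi via a centered finite-difference derivative
p_psi = (hbar / 1j) * np.gradient(psi, dx)

# The ratio (p_hat psi)/psi should be the constant eigenvalue p
ratio = p_psi[1:-1] / psi[1:-1]   # drop the less accurate endpoints
print(np.allclose(ratio, p, atol=1e-4))   # True
```

Applied to a wavefunction that is not a plane wave (a Gaussian, say), the same ratio would depend on x, signalling that the state is not a momentum eigenstate.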

Orbital Angular Momentum
Let's name some other quantities that are often conserved. Angular momentum comes to mind. There are two basic types of angular momentum: orbital and spin. Orbital angular momentum is associated with quantum numbers that describe the orbital wavefunctions of the electrons in an atom. The operator is denoted by $\hat{\vec{L}}$:

\begin{displaymath}\hat{\vec{L}}=\hat{\vec{r}}\times\hat{\vec{p}}
\end{displaymath} (5)

Classically the orbital angular momentum is conserved as long as there is no torque on the system. In quantum mechanics we don't usually talk about torque. Rather we say that orbital angular momentum is a good quantum number if the system has continuous rotational symmetry. By rotational symmetry, I mean that if you rotate the system in some way, like by an angle $\theta$ about the z axis, it still looks the same. An atom is a spherical kind of object, and it has rotational symmetry. It turns out that there are 2 quantum numbers associated with orbital angular momentum: $\ell$ and $m=\ell_{z}$. This is because an eigenstate $\psi_{n}$ of the energy is also an eigenstate of $\hat{L}^{2}$ and of one component of $\hat{\vec{L}}$. Traditionally this is chosen to be the z-component, $\hat{L}_{z}$. Thus

\begin{displaymath}\hat{L}^{2}\psi_{n}=\hbar^{2}\ell(\ell+1)\psi_{n}
\end{displaymath} (6)

and

\begin{displaymath}\hat{L}_{z}\psi_{n}=\hbar m\psi_{n}
\end{displaymath} (7)

(In terms of commutators, $[\hat{H},\hat{L}^{2}]=0$, $[\hat{H},\hat{\vec{L}}]=0$, and $[\hat{L}^{2},\hat{\vec{L}}]=0$.) The azimuthal symmetry of the wavefunction is contained in the factor

\begin{displaymath}\psi(\theta)\sim e^{im\theta}
\end{displaymath} (8)

where $\theta$ is the angle of rotation about the z axis. In order for the wavefunction to be single valued under rotations by $2\pi$ about the z axis, $\ell$ must be a nonnegative integer and

\begin{displaymath}m=-\ell,-\ell+1, -\ell+2,..., \ell-2,\ell-1,\ell
\end{displaymath} (9)

(Single-valuedness means $\psi(\theta+2\pi)=\psi(\theta)$, i.e., $e^{im2\pi}=1$, which requires m to be an integer.) So if $\ell=2$, the possible values of m are -2, -1, 0, 1, and 2. In general there are $2\ell+1$ allowed values of m associated with each value of $\ell$.
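This counting rule is simple enough to tabulate. A small illustrative sketch (not part of the lecture) that lists the allowed m values for a given $\ell$:

```python
# Allowed values of m for a given orbital quantum number l (Eq. 9):
# m runs from -l to +l in integer steps, giving 2l+1 values.
def allowed_m(l):
    if l < 0 or l != int(l):
        raise ValueError("l must be a nonnegative integer")
    return list(range(-l, l + 1))

print(allowed_m(2))    # [-2, -1, 0, 1, 2]
# Check the 2l+1 count for several values of l:
assert all(len(allowed_m(l)) == 2 * l + 1 for l in range(6))
```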

Spin Angular Momentum
The other type of angular momentum is spin angular momentum. Spin angular momentum is an internal angular momentum associated with a particle. The spin operator is denoted by $\hat{\vec{S}}$. Just as for orbital angular momentum, there are 2 quantum numbers associated with spin angular momentum: s and $s_{z}$. Sometimes $s_{z}$ is denoted by $m_{s}$. If spin is a good quantum number, then the energy eigenstate $\psi_{n}$ is also an eigenstate of $\hat{S}^{2}$ and $\hat{S}_{z}$:

\begin{displaymath}\hat{S}^{2}\psi_{n}=\hbar^{2}s(s+1)\psi_{n}
\end{displaymath} (10)

and

\begin{displaymath}\hat{S}_{z}\psi_{n}=\hbar s_{z}\psi_{n}
\end{displaymath} (11)

s can be either an integer or a half-integer. Particles with integer spin are called bosons and particles with half-integer spin are called fermions. An example of a fermion is an electron. An electron is a spin-1/2 particle, i.e., s=1/2, and it has 2 spin states: spin up (which corresponds to $s_{z}=+1/2$) and spin down (which corresponds to $s_{z}=-1/2$). Protons and neutrons are also spin-1/2 particles and are therefore fermions.

An example of a boson is a photon. A photon is a spin-1 particle, i.e., s=1, and it has $s_{z}=-1$ or $s_{z}=+1$. It turns out not to have $s_{z}=0$. This is related to the fact that there are 2 possible polarizations of the electric field $\vec{E}$ and that $\vec{E}\bot\vec{k}$, where $\vec{k}$ points in the direction of propagation of the electromagnetic wave.

Adding Angular Momenta
One can add angular momenta. For example the total angular momentum of a state is the sum of its orbital and spin angular momenta:

\begin{displaymath}\hat{\vec{J}}=\hat{\vec{L}}+\hat{\vec{S}}
\end{displaymath} (12)

The total angular momentum also has 2 quantum numbers associated with it: J and $J_{z}$. If these are good quantum numbers, then the energy eigenstate $\psi_{n}$ satisfies:

\begin{displaymath}\hat{J}^{2}\psi_{n}=\hbar^{2}J(J+1)\psi_{n}
\end{displaymath} (13)

and

\begin{displaymath}\hat{J}_{z}\psi_{n}=\hbar J_{z}\psi_{n}
\end{displaymath} (14)

One can also add the spin angular momenta of two independent particles, say the spin of the proton and the neutron in a deuteron:

\begin{displaymath}\hat{\vec{S}}=\hat{\vec{S}}_{1}+\hat{\vec{S}}_{2}
\end{displaymath} (15)

One can also add orbital angular momenta. The rules for angular momentum addition are the same for all types of angular momentum. But the rules for adding angular momenta in quantum mechanics are a little tricky. You might think that if one particle has angular momentum $j_{1}$ (be it spin or orbital or total) and another independent particle has angular momentum $j_{2}$, the total is $j=j_{1}+j_{2}$. This is not necessarily so. The rule is that if you add $j_{1}$ and $j_{2}$, the total j obeys

\begin{displaymath}\vert j_{1}-j_{2}\vert\leq j \leq j_{1}+j_{2}
\end{displaymath} (16)

where j takes values between $\vert j_{1}-j_{2}\vert$ and $j_{1}+j_{2}$ in integer steps. The z component $j_{z}$ still runs from -j to +j in integer steps; there are 2j+1 values of $j_{z}$. So if we add the spin angular momenta of 2 spin-1/2 particles, the total spin S of the system is S=0 (we call this a singlet) or S=1 (we call this a triplet). The singlet state has $s_{z}=0$ and the triplet state has 3 possible values of $s_{z}$: -1, 0, +1.
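The addition rule is easy to check by counting states. Below is an illustrative sketch (not from the lecture; Python's fractions handle half-integer spins) that lists the allowed total j and verifies that the total number of states $(2j_{1}+1)(2j_{2}+1)$ is preserved:

```python
from fractions import Fraction

def allowed_j(j1, j2):
    """Allowed total j when adding j1 and j2 (Eq. 16):
    j = |j1-j2|, |j1-j2|+1, ..., j1+j2."""
    j1, j2 = Fraction(j1), Fraction(j2)
    values = []
    j = abs(j1 - j2)
    while j <= j1 + j2:
        values.append(j)
        j += 1
    return values

# Two spin-1/2 particles: total S = 0 (singlet) or S = 1 (triplet)
half = Fraction(1, 2)
assert allowed_j(half, half) == [0, 1]

# State counting: (2*j1+1)(2*j2+1) = sum over j of (2*j+1)
j1, j2 = Fraction(3, 2), Fraction(1)
assert (2*j1 + 1) * (2*j2 + 1) == sum(2*j + 1 for j in allowed_j(j1, j2))
```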

Magnetic Moments
Orbital and spin angular momenta manifest themselves in the form of magnetic moments, which can be measured. An electron with charge -e, mass m, and z component of orbital angular momentum $\hbar m_{\ell}$ has an associated magnetic moment:

\begin{displaymath}M_{z,orb}=-\frac{e\hbar}{2mc}m_{\ell}
\end{displaymath} (17)

The minus sign comes from the electron's negative charge. Because of this, the angular momentum is in the opposite direction from the magnetic moment. The magnetic moment associated with the spin of an electron is

\begin{displaymath}M_{z,spin}=-g\frac{e\hbar}{2mc}s_{z}
\end{displaymath} (18)

where the constant g is called the Landé g factor. For an electron g=2. (For a proton g=5.59.) The Bohr magneton is defined as

\begin{displaymath}1\; \mu_{B}=\frac{e\hbar}{2mc}=9.27410\times 10^{-21}\; {\rm erg/Gauss}
\end{displaymath} (19)

A good rule of thumb to remember is that

\begin{displaymath}\mu_{B} \cdot (H=1\; {\rm Tesla})\approx 1 K
\end{displaymath} (20)
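This rule of thumb can be checked with the SI values of the constants; the result is about 0.67 K, i.e., of order 1 K. An illustrative sketch (not part of the lecture):

```python
# Check the rule of thumb mu_B * (1 Tesla) ~ 1 K, using SI values.
mu_B = 9.274e-24    # Bohr magneton in J/T
k_B = 1.381e-23     # Boltzmann constant in J/K
B = 1.0             # field in Tesla

T_equiv = mu_B * B / k_B    # magnetic energy expressed in kelvin
print(round(T_equiv, 2))    # 0.67 -- of order 1 K
```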

The total magnetic moment of an electron is

\begin{displaymath}M_{z,tot}=-\frac{e\hbar}{2mc}(m_{\ell}+2s_{z})
\end{displaymath} (21)

The magnetic moments of electrons are what give rise to magnetism. We'll talk more about this later in the course.

Stern-Gerlach Experiment
The existence of spin was demonstrated by the famous Stern-Gerlach experiment of 1922. The magnetic moment of the electron couples to a magnetic field. The energy is given by

\begin{displaymath}U=-\vec{\mu}\cdot\vec{B}
\end{displaymath} (22)

Suppose $\vec{B}=B_{z}(z)\hat{z}$. Then if $B_{z}(z)$ is not uniform along the z axis, the electron feels a force:

\begin{displaymath}F_{z}=-\frac{\partial U}{\partial z}=\mu_{z}\frac{\partial B_z(z)}{\partial z}
\end{displaymath} (23)

Notice that spin up and spin down electrons feel forces in opposite directions. Stern and Gerlach passed a beam of silver atoms through a nonuniform B field and found that the beam was split in two. People didn't know about electron spin at the time, so they attributed the splitting to orbital angular momentum. But if that were right, they should have seen the beam split into $2\ell+1$ components; since $\ell$ is an integer, $2\ell+1$ is an odd number. It turns out that the beam split into two pieces due to the spin of the electron. Electron spin was first proposed by 2 graduate students (Goudsmit and Uhlenbeck) in 1925 to explain the fine structure splitting of some atomic spectral lines in the alkali metals (see Eisberg and Resnick, page 276). Phipps and Taylor showed in 1927 that the beam splitting was due to electron spin, using hydrogen atoms (which have $\ell=0$), thus confirming the existence of the spin of the electron.
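The counting argument above (an odd number of beam components for orbital angular momentum, versus the observed 2) can be spelled out in a small illustrative sketch:

```python
from fractions import Fraction

# A Stern-Gerlach beam with angular momentum j splits into 2j+1 components.
def n_components(j):
    return int(2 * Fraction(j) + 1)

# Orbital angular momentum (integer l) always gives an odd number:
print([n_components(l) for l in (0, 1, 2, 3)])   # [1, 3, 5, 7]

# Spin 1/2 gives the two components Stern and Gerlach observed:
print(n_components(Fraction(1, 2)))              # 2
```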

Pauli Exclusion Principle
As we mentioned earlier, fermions are particles which have half-integer spin. Electrons are an example of fermions. All electrons are the same. They are indistinguishable particles. If we have a system with many electrons, as in a multielectron atom, the Pauli exclusion principle states that there can never be more than one electron in the same quantum state. In other words, a given set of quantum numbers can only be assigned to at most one electron. This is true for each type of fermion. For example, there cannot be more than one proton in a given quantum state.

Let's go back to the box with infinitely high walls. We can label the states with quantum numbers n, spin s, and the z-component of spin $s_{z}$. Suppose we put 4 electrons in this box. Two electrons, one spin up and the other spin down, go into the n=1 state. The quantum numbers of these two electrons are (n=1, s=1/2, $s_{z}$=1/2) and (n=1, s=1/2, $s_{z}$=-1/2). The other two electrons go into the n=2 state. They have quantum numbers (n=2, s=1/2, $s_{z}$=1/2) and (n=2, s=1/2, $s_{z}$=-1/2).
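The filling procedure just described can be sketched as a short routine (an illustrative addition, using the same level and spin labels as above):

```python
from fractions import Fraction

# Fill N electrons into the levels of the box, at most one electron per
# set of quantum numbers (n, s_z), i.e., at most two electrons per level n.
def fill_box(n_electrons):
    states = []
    n = 1
    while len(states) < n_electrons:
        for s_z in (Fraction(1, 2), Fraction(-1, 2)):
            if len(states) < n_electrons:
                states.append((n, s_z))
        n += 1
    return states

for n, s_z in fill_box(4):
    print(f"(n={n}, s=1/2, s_z={s_z})")
# (n=1, s=1/2, s_z=1/2)
# (n=1, s=1/2, s_z=-1/2)
# (n=2, s=1/2, s_z=1/2)
# (n=2, s=1/2, s_z=-1/2)
```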

[Figure: Lec2boxqn.eps]

Exchange Symmetry of Bosons and Fermions
The wavefunction of a collection of fermions is antisymmetric under the exchange of any 2 fermions. For example, if we have 2 fermions with coordinates 1 and 2, and we put them into 2 states $\phi_a$ and $\phi_b$, an antisymmetric wavefunction for them is:

\begin{displaymath}\psi(1,2)=\phi_a(1)\phi_b(2)-\phi_a(2)\phi_b(1)
\end{displaymath} (24)

Notice that if we interchange 1 and 2, we get $\psi(2,1)=-\psi(1,2)$. This is what we mean by antisymmetry. If fermions 1 and 2 were both in the same state, say $\phi_a$, then we would get

\begin{displaymath}\phi_a(1)\phi_a(2)-\phi_a(2)\phi_a(1) =0
\end{displaymath} (25)

Thus antisymmetry enforces the Pauli exclusion principle. In general a wavefunction describing a collection of N fermions must be antisymmetric and satisfy

\begin{displaymath}\psi(1,2,...,r,...,s,...,N)=-\psi(1,2,...,s,...,r,...,N)
\end{displaymath} (26)
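The antisymmetry and the resulting exclusion can be verified numerically with any two sample single-particle states; the sine and cosine below are illustrative choices, not from the lecture:

```python
import math

def phi_a(x):
    return math.sin(x)

def phi_b(x):
    return math.cos(x)

def psi(x1, x2):
    # antisymmetric two-fermion wavefunction, Eq. (24)
    return phi_a(x1) * phi_b(x2) - phi_a(x2) * phi_b(x1)

x1, x2 = 0.3, 1.1
print(psi(x2, x1) == -psi(x1, x2))   # True: exchange gives a minus sign

# Both fermions in the same state phi_a: the wavefunction vanishes,
# which is the Pauli exclusion principle (Eq. 25).
def psi_same(x1, x2):
    return phi_a(x1) * phi_a(x2) - phi_a(x2) * phi_a(x1)

print(psi_same(x1, x2))   # 0.0
```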

Bosons are symmetric under exchange. For example, if we have 2 bosons with coordinates 1 and 2, and we put them into 2 states $\phi_a$ and $\phi_b$, a symmetric wavefunction for them is:

\begin{displaymath}\psi(1,2)=\phi_a(1)\phi_b(2)+\phi_a(2)\phi_b(1)
\end{displaymath} (27)

If we interchange 1 and 2, we get $\psi(2,1)=\psi(1,2)$. This is what we mean by a symmetric wavefunction. If bosons 1 and 2 were both in the same state, say $\phi_a$, then we would get

\begin{displaymath}\phi_a(1)\phi_a(2)+\phi_a(2)\phi_a(1)= 2\phi_a(1)\phi_a(2)\neq 0
\end{displaymath} (28)

Thus it's ok to put more than one boson in the same single particle state. In general a wavefunction describing a collection of N bosons must be symmetric and must satisfy

\begin{displaymath}\psi(1,2,...,r,...,s,...,N)=+\psi(1,2,...,s,...,r,...,N)
\end{displaymath} (29)

Boltzmann Statistics

In Lecture 1, we counted the number of possible states in a 3 dimensional box. Just because we know how many states a box has doesn't mean we know how many of those states are occupied. By ``occupied'', I mean that a state or wavefunction actually has a particle associated with it. For example, in our box with periodic boundary conditions, each state represents what an electromagnetic wave with a given wavevector $\vec{k}$ can do if it exists. If it exists, then there is a corresponding photon in the box and we say that the state is occupied. What determines whether or not a state is occupied? Usually it's the temperature T, which is measured in degrees Kelvin. Here we are assuming that the system is in contact with a heat bath held at temperature T. (Technically speaking, we are considering the canonical ensemble.) We will concentrate on the states in the system, not in the heat reservoir. If the temperature is low, only the low energy states are occupied. If the temperature is high, then both low and higher energy states are occupied. One can reasonably expect that states up to energy $E\sim k_{B}T$ will be occupied. Here $k_{B}$ is Boltzmann's constant. It converts degrees Kelvin into a unit of energy like ergs: $k_{B}= 1.38\times 10^{-16}$ ergs/K. I always remember the conversion 11,600 K $\sim$ 1 eV.

Physicists refer to the probability that a state is occupied. Probability distributions are familiar to any student who ever worried about the grade distribution after an exam. The peak in the distribution gives the most probable value. The probability that a state n with energy En is occupied at a temperature T is given by the Boltzmann distribution (also called the Maxwell-Boltzmann distribution):

\begin{displaymath}P_{n}=\frac{e^{-E_{n}/k_{B}T}}Z
\end{displaymath} (30)

where Z is called the partition function. It's needed in order to normalize the distribution. More on this in a moment. First look at the numerator. Notice that the larger En is compared to kBT, the smaller the chances that that state will be occupied. In fact the probability falls exponentially with increasing En. But if $E_{n}\ll k_{B}T$, then there's an excellent chance that the state will be occupied.

Now back to the partition function Z. Z is used to normalize Pn such that

\begin{displaymath}\sum_{n}P_{n}=1
\end{displaymath} (31)

Here the sum is over all possible states of the system irrespective of energy. This implies that

\begin{displaymath}Z=\sum_{n}e^{-E_{n}/k_{B}T}
\end{displaymath} (32)

Once a probability distribution is known, one can use it to calculate various mean values. For example, let A be some quantity which has the value An when the system is in state n. Then the thermal average of A is given by

\begin{displaymath}<A>=\sum_{n}P_{n}A_{n}=\frac{\sum_{n}e^{-\beta E_{n}}A_{n}}Z
\end{displaymath} (33)

where the sum is over all the states n of the system and $\beta=1/(k_{B}T)$. For example, the average energy is given by

\begin{displaymath}<E>=\frac{\sum_{n}e^{-\beta E_{n}}E_{n}}Z
=\frac{\sum_{n}e^{-\beta E_{n}}E_{n}}{\sum_{n}e^{-\beta E_{n}}}
\end{displaymath} (34)

Notice that we can also write this as

\begin{displaymath}<E>=-\frac{\partial\ln Z}{\partial\beta}
\end{displaymath} (35)
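These formulas are easy to verify for a toy system. The sketch below (an illustrative addition; three levels with arbitrarily chosen energies, units with $k_{B}=1$) computes the Boltzmann probabilities, checks the normalization, and confirms that Eqs. (34) and (35) give the same average energy:

```python
import numpy as np

# Toy three-level system; energies and temperature in units with k_B = 1
# (illustrative choices).
E = np.array([0.0, 1.0, 3.0])
T = 2.0
beta = 1.0 / T

Z = np.sum(np.exp(-beta * E))         # partition function, Eq. (32)
P = np.exp(-beta * E) / Z             # Boltzmann probabilities, Eq. (30)
print(np.isclose(P.sum(), 1.0))       # True: normalization, Eq. (31)

E_direct = np.sum(P * E)              # <E> from Eq. (34)

# <E> = -d ln Z / d beta, Eq. (35), via a centered numerical derivative
def lnZ(b):
    return np.log(np.sum(np.exp(-b * E)))

db = 1e-6
E_deriv = -(lnZ(beta + db) - lnZ(beta - db)) / (2 * db)
print(np.isclose(E_direct, E_deriv))  # True
```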

Once we have the average energy, we can find the heat capacity CV at constant volume:

\begin{displaymath}C_{V}=\left(\frac{d<E>}{dT}\right)_{V}
\end{displaymath} (36)

If all the external parameters are kept fixed so that the system does no macroscopic work, then dQ=d<E>, where dQ is a tiny amount of heat. The heat capacity is the amount of heat dQ that needs to be added to the system to raise the temperature by an amount dT. The heat capacity per unit volume is called the specific heat; in particular, if <E> is the internal energy in a unit volume, then $C_{V}$ is the specific heat. If the system has a large number of degrees of freedom, it will take more heat to raise the temperature by 1 degree than if the system has a small number of degrees of freedom. So the larger the heat capacity, the more degrees of freedom there are in the system.
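As an illustration (not from the lecture), the sketch below computes $C_{V}=d<E>/dT$ numerically for a two-level system with gap $\Delta$ (units with $k_{B}=1$, illustrative choices) and compares it with the derivative worked out by hand:

```python
import numpy as np

# Two-level system with energies {0, Delta}; units with k_B = 1.
Delta = 1.0

def E_avg(T):
    """Average energy <E> = Delta / (exp(Delta/T) + 1)."""
    return Delta / (np.exp(Delta / T) + 1.0)

def C_V(T, dT=1e-5):
    """Heat capacity C_V = d<E>/dT by centered numerical differentiation."""
    return (E_avg(T + dT) - E_avg(T - dT)) / (2 * dT)

# Differentiating <E> by hand gives
# C_V = (Delta/T)^2 exp(Delta/T) / (exp(Delta/T) + 1)^2
T = 0.5
x = Delta / T
C_exact = x**2 * np.exp(x) / (np.exp(x) + 1.0)**2
print(np.isclose(C_V(T), C_exact))   # True
```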

The second law of thermodynamics tells us that the system absorbs the heat dQ and increases its entropy by an amount dS:

\begin{displaymath}dS=\frac{dQ}{T}
\end{displaymath} (37)

The entropy S reflects the amount of disorder in the system. The more degrees of freedom that are excited, the greater the entropy is. We can define the entropy S as

\begin{displaymath}S=k_B\ln \Omega(<E>)
\end{displaymath} (38)

where $\Omega(E)$ is the number of accessible states of the system in the range between E and E+dE, and <E> is the mean energy of the system. We can also define the Helmholtz free energy F:

\begin{displaymath}F=<E>-TS=-k_BT\ln Z
\end{displaymath} (39)

The free energy is an important quantity. If the system is in contact with a heat bath, it is in equilibrium when it minimizes its free energy F. Minimizing F favors having a large entropy S since the coefficient of S is negative in F. This is especially true at high temperatures; at low temperatures having a large entropy doesn't help so much with lowering F. This is why water exists at high temperatures and ice, the more ordered state, exists at low temperatures. Also notice that

\begin{displaymath}S=-\frac{\partial F}{\partial T}
\end{displaymath} (40)
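These thermodynamic relations can also be checked numerically for the same kind of toy system (an illustrative addition; three arbitrary levels, units with $k_{B}=1$): compute F from Eq. (39), get S from Eq. (40) by numerical differentiation, and verify that F = <E> - TS:

```python
import numpy as np

# Toy three-level system; energies and temperature in units with k_B = 1
# (illustrative choices).
E = np.array([0.0, 1.0, 3.0])
T = 2.0

def F(T):
    """Helmholtz free energy F = -k_B T ln Z, Eq. (39), with k_B = 1."""
    return -T * np.log(np.sum(np.exp(-E / T)))

# Entropy from S = -dF/dT, Eq. (40), via a centered numerical derivative
dT = 1e-6
S = -(F(T + dT) - F(T - dT)) / (2 * dT)

# Consistency with F = <E> - T S:
P = np.exp(-E / T) / np.sum(np.exp(-E / T))
E_avg = np.sum(P * E)
print(np.isclose(F(T), E_avg - T * S))   # True
```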



 
Clare Yu
2000-10-02