Lecture 2 – The Boltzmann distribution
Ch. 22, pp. 552-568
If I speak of heat and ask you…
What is it?
If I say a substance is at a certain temperature…
What exactly am I measuring?
Do you remember what classical entropy is?
Summary of lecture 1
• Macroscopic (i.e. thermodynamic) properties can be related to microscopic (mechanical) properties using statistical mechanics (and vice versa)
• Equilibrium thermodynamic properties are averages over
microscopic states (e.g. energy)
$\langle E \rangle = \frac{\sum_i N_i E_i}{\sum_i N_i} = \sum_i \frac{N_i}{N}\,E_i = \sum_i P_i E_i$
• Thermodynamic energy, reversible heat, and work are related to the (microscopic) energy levels $E_i$ and the occupation numbers $N_i$ or distributions $P_i$
The Boltzmann distribution function
• In principle, we can calculate energy levels from
quantum mechanics, though in practice this is often very
difficult or impossible; in order to obtain thermodynamic
properties, we must also know the distribution
• There are in general many distributions that result in
the same average energy. However, there exists a
distribution that is much more probable than all the
others (most probable distribution). It is postulated that
this is the distribution that characterizes a system at
equilibrium. This most probable distribution is of central
importance.
Work and Heat
• We will show that the heat absorbed by a system during
reversible change is related to the changes in the number
of molecules in each energy level
• As heat is absorbed, the number of molecules in higher
energy levels increases, while that of those in lower
energy levels decreases. The reversible heat change is the
part of the total energy that is related to changes in the
distribution of the molecules among the energy levels.
• We will also show that work is instead related to
changes in the energy levels themselves
If I speak of heat and ask you…
What is it?
Work and Heat
Consider a gas in a container with a movable wall (piston). If we move the piston by a distance $dx$ in the direction of an external force $F_x$, the work done on the gas by the surroundings is:

$dw = F_x\,dx$
The total force against the piston exerted by all molecules is:

$\sum_i N_i F_{ix}$

where $F_{ix}$ is the force exerted by each molecule in the $E_i$ energy level
Work and Heat
If the work is carried out reversibly, then the forces exerted
by the molecules must balance the external force, so that the
work done by a reversible process is:
$dw_{rev} = \sum_i N_i F_{ix}\,dx$

The force can be equated to the negative derivative of an energy. For an ideal gas, the force $F_{ix}$ is related to the change in its energy $E_i$ due to the change in the dimension of the container, $da$:

$F_{ix} = -\frac{dE_i}{da}$
Work and Heat
By substituting into the previous equation ($dx$ is simply the opposite of $da$):

$dw_{rev} = -\sum_i N_i \left(\frac{dE_i}{da}\right)dx = \sum_i N_i\,dE_i$
The work done on the system for a reversible process
is related to the changes in the energy levels due to the
change in the dimension of the container
This is true for any system, not just ideal gases, though for real gases changing the size of the system affects the molecular interactions as well, and therefore both the energy levels and the distributions
Work and Heat
If we now write the differential expression for the change of
energy with changes in energy levels and occupation
numbers and substitute the expression for the reversible
work:
$dE = \sum_i N_i\,dE_i + \sum_i E_i\,dN_i = dw_{rev} + \sum_i E_i\,dN_i$

Since the first law of thermodynamics states that

$dE = dw_{rev} + dq_{rev}$

it follows that

$dq_{rev} = \sum_i E_i\,dN_i$
Work and Heat
$dq_{rev} = \sum_i E_i\,dN_i$
Thus, the heat absorbed by a system during reversible
change is related to the changes in the number of
molecules in each energy level
As heat is absorbed, the number of molecules in higher
energy levels increases, while that of those in lower
energy levels decreases. The reversible heat change is the
part of the total energy that is related to changes in the
distribution of the molecules among the energy levels.
Work and Heat
$dq_{rev} = \sum_i E_i\,dN_i$

The heat absorbed by a system during a reversible change is related to the changes in the number of molecules in each energy level

$dw_{rev} = -\sum_i N_i\left(\frac{dE_i}{da}\right)dx = \sum_i N_i\,dE_i$
The work done on a system during a reversible change is
related to the changes in the energy levels
Degeneracy of a distribution
A distribution reflects a microscopic arrangement of the
system. There can be, and in general there are, many
distributions that result in the same average energy. The
number of microscopic arrangements that result in the
same distribution is called the degeneracy of the
distribution
The number of ways $W$ of arranging $N$ distinguishable molecules among $n$ energy levels, such that $N_1$ have energy $E_1$, $N_2$ have energy $E_2$, $N_3$ have energy $E_3$, etc., is given by
$W = \frac{N!}{N_1!\,N_2!\,N_3!\cdots N_n!}$
Degeneracy of a distribution
Example – Poker - The degeneracy or weight of a
distribution W may be thought of as a probability, in the
same sense that the probability of drawing a five card
poker hand is related to the number of occurrences of
that hand in a deck of 52 cards. There are 3,744 possible combinations of five cards in a 52-card deck that yield “full house” hands, but only four ways to draw a royal flush. It is common to state that one
has a greater chance or a higher probability of being
dealt a full house than a royal flush.
Degeneracy of a distribution
Example - In a system containing 10 molecules, it is
possible that 9 have zero energy (the lowest energy level)
and one has 10 times the average energy, but this is not
likely. It is much more probable that a wide range of
energy values is represented.
Degeneracy of a distribution
Example - Consider three distributions of 10 molecules over seven equally spaced energy levels. The level spacing is defined as $E_0$; assume $E_1 = 0$, so that $E_i = (i-1)E_0$.
Degeneracy of a distribution
Distribution A:
$N_1 = N_2 = N_3 = N_4 = N_5 = 2;\quad N_6 = N_7 = 0$
Distribution B:
$N_1 = N_2 = N_4 = N_5 = N_6 = N_7 = 0;\quad N_3 = 10$
Distribution C:
$N_1 = 3,\ N_2 = 2,\ N_3 = 2,\ N_4 = 1,\ N_5 = 0,\ N_6 = 1,\ N_7 = 1$
For all 3 distributions:

$E = \sum_i N_i E_i = 20E_0 \qquad N = \sum_i N_i = 10$

$\langle E\rangle = \frac{\sum_i N_i E_i}{\sum_i N_i} = \frac{20E_0}{10} = 2E_0$
Degeneracy of a distribution
Distributions A, B, and C have equal average energies,
but they have very unequal degeneracy:
$W_A = \frac{10!}{2!\,2!\,2!\,2!\,2!\,0!\,0!} = 113{,}400$

$W_B = \frac{10!}{0!\,0!\,10!\,0!\,0!\,0!\,0!} = 1$

$W_C = \frac{10!}{3!\,2!\,2!\,1!\,0!\,1!\,1!} = 151{,}200$
There is only one way to put all 10 molecules in energy level 3, but 113,400 ways to generate distribution A and 151,200 ways to generate distribution C (there are two ways of assigning two molecules to two different levels, but only one way of assigning both to the same level).
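These counts are easy to verify directly. Below is a minimal Python sketch (an illustration, not part of the original slides) that evaluates $W = N!/(N_1!\,N_2!\cdots N_n!)$ for the three example distributions:

```python
from math import factorial

def degeneracy(occupations):
    """W = N! / (N1! N2! ... Nn!) for a list of occupation numbers."""
    W = factorial(sum(occupations))
    for n in occupations:
        W //= factorial(n)  # exact integer division, no rounding error
    return W

# The three distributions of 10 molecules over 7 levels from the example
print(degeneracy([2, 2, 2, 2, 2, 0, 0]))   # A: 113400
print(degeneracy([0, 0, 10, 0, 0, 0, 0]))  # B: 1
print(degeneracy([3, 2, 2, 1, 0, 1, 1]))   # C: 151200
```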
The Boltzmann distribution
If a very large number of distributions $\{P_i\}$ can indeed result in the same average property (e.g. $\langle E\rangle$), how can we proceed to relate mechanical properties to thermodynamic properties?
There is one distribution for which the degeneracy is
much, much larger than the rest. This degeneracy is
called Wmax ; the distribution corresponding to the
largest degeneracy is called the most probable
distribution. The identity of this distribution is of
fundamental importance.
The Boltzmann distribution
Let us consider the plot of W as a function of all
distributions of N ideal gas molecules that result in the
same average energy
The Boltzmann distribution
The most probable distribution $\{N_1, N_2, \ldots, N_n\}$ (or equivalently $\{P_1, P_2, \ldots, P_n\}$) corresponds to the maximum $W_{max}$ in the plot of the function

$W = \frac{N!}{N_1!\,N_2!\,N_3!\cdots N_n!}$

subject to two constraints:

$\sum_i N_i E_i = E \qquad \sum_i N_i = N$

Total energy and particle number are conserved
The Boltzmann distribution
The conclusion of this exercise in calculus is that the most probable distribution, the distribution that characterizes a system at equilibrium, is the so-called Boltzmann distribution, named after Boltzmann, who first derived it:

$P_i = \frac{N_i}{N} = \frac{e^{-E_i/k_BT}}{\sum_i e^{-E_i/k_BT}} = \frac{e^{-E_i/k_BT}}{q} \qquad q = \sum_i e^{-E_i/k_BT}$
q is the sum over molecular states and is called the
molecular partition function (note that the text uses Z for
the partition function).
The Boltzmann distribution
The Boltzmann distribution is one of the most important concepts in statistical thermodynamics, because it provides us with the probability of finding a system in a particular energy state $E_i$:

$P_i = \frac{N_i}{N} = \frac{e^{-E_i/k_BT}}{\sum_i e^{-E_i/k_BT}} = \frac{e^{-E_i/k_BT}}{q} \qquad q = \sum_i e^{-E_i/k_BT}$
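To make the formula concrete, here is a short Python sketch (an illustration, not from the text; the energy levels and temperature are made-up values chosen so the level spacing is about half of $k_BT$):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_probabilities(energies, T):
    """P_i = exp(-E_i / k_B T) / q, with q the molecular partition function."""
    weights = [math.exp(-E / (K_B * T)) for E in energies]
    q = sum(weights)  # q = sum_i exp(-E_i / k_B T)
    return [w / q for w in weights]

# Five hypothetical, evenly spaced levels (J); spacing ~ 0.5 k_B T at 300 K
levels = [i * 2.0e-21 for i in range(5)]
for i, p in enumerate(boltzmann_probabilities(levels, 300.0)):
    print(f"level {i}: P = {p:.3f}")
```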
The Boltzmann distribution - warnings
• The derivation given above is only valid for the case
where molecules in the system do not interact (ideal gas)
• If molecules interact (non-ideal gas), we can no longer
talk about the average energy being defined in terms of
averages over single molecule energies
• The microscopic energies are now functions of
interactions between all the molecules in the system.
$q = \sum_i e^{-E_i/k_BT} \qquad P_i = \frac{N_i}{N} = \frac{e^{-E_i/k_BT}}{q}$
The Boltzmann distribution - warnings
• In this case an analogous derivation is performed, which involves a large collection, or ensemble, of systems in thermal contact with one another (the ensemble as a whole is isolated)
• Thus the systems all have the same N, V, and T. The thermodynamic energy is now the average of the system energies over the ensemble.
$q = \sum_i e^{-E_i/k_BT} \qquad P_i = \frac{N_i}{N} = \frac{e^{-E_i/k_BT}}{q}$
Properties of the Boltzmann distribution
• Near 0 K, only the ground level is populated: all molecules (or all systems) are in the lowest-energy level
• As the temperature is raised, other levels become
progressively more populated
• As the temperature becomes very high, all energy levels
become equally populated because the exponential
factors all approach 1
$q = \sum_i e^{-E_i/k_BT} \qquad P_i = \frac{N_i}{N} = \frac{e^{-E_i/k_BT}}{q}$
Properties of the Boltzmann distribution
• The magnitude of $k_BT$ is a significant characteristic of a system.
• If the energy difference between levels is small compared to $k_BT$, many energy levels are populated
• If the spacing is large (compared to $k_BT$), only the lowest energy level is populated.
$q = \sum_i e^{-E_i/k_BT} \qquad P_i = \frac{N_i}{N} = \frac{e^{-E_i/k_BT}}{q}$
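This point is easy to see numerically. In the sketch below (made-up spacings, expressed in units of $k_BT$ so no physical constants are needed), a spacing of $0.1\,k_BT$ spreads the population over many levels, while a spacing of $10\,k_BT$ leaves essentially everything in the ground level:

```python
import math

def populations(spacing_over_kT, n_levels=6):
    """Fractional populations of evenly spaced levels E_i = i * spacing."""
    w = [math.exp(-i * spacing_over_kT) for i in range(n_levels)]
    q = sum(w)
    return [x / q for x in w]

print(populations(0.1))   # small spacing: many levels appreciably populated
print(populations(10.0))  # large spacing: only the ground level is populated
```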
Thermodynamic properties from the Boltzmann distribution
• Now that the distribution is available, we can calculate
any thermodynamic property, if we know the energy
levels (which, remember, in general is very difficult).
• For example, the thermodynamic, internal energy is the
average energy:
$E = \sum_i P_i E_i = \sum_i \frac{N_i}{N}\,E_i = \frac{\sum_i E_i\,e^{-E_i/k_BT}}{\sum_i e^{-E_i/k_BT}}$
Thermodynamic properties from the Boltzmann distribution
• Thermodynamic properties of a system can be obtained from the distribution function. For example

$\langle E \rangle = k_BT^2\left(\frac{\partial \ln q}{\partial T}\right)_V$
• For an isolated system composed of N non-interacting particles, it can be shown that

$E = Nk_BT^2\left(\frac{\partial \ln q}{\partial T}\right)_V \qquad P = Nk_BT\left(\frac{\partial \ln q}{\partial V}\right)_T \qquad S = Nk_B\ln\frac{q}{N} + \frac{E}{T} + Nk_B$
Thermodynamic properties from the Boltzmann distribution
• Starting from:

$\langle E \rangle = \frac{\sum_i E_i\,e^{-E_i/k_BT}}{q}$

• Take the derivative of q with respect to T at constant V:

$\left(\frac{\partial q}{\partial T}\right)_V = \frac{1}{k_BT^2}\sum_i E_i\,e^{-E_i/k_BT}$

so that

$\langle E \rangle = \frac{k_BT^2}{q}\left(\frac{\partial q}{\partial T}\right)_V$

• Use the fact that $dq/q = d\ln q$:

$\langle E \rangle = k_BT^2\left(\frac{\partial \ln q}{\partial T}\right)_V$
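The identity can also be checked numerically. The following Python sketch (illustrative; the levels are made up) compares the direct Boltzmann average with $k_BT^2(\partial\ln q/\partial T)_V$ evaluated by central finite differences:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def q_of_T(energies, T):
    """Molecular partition function q = sum_i exp(-E_i / k_B T)."""
    return sum(math.exp(-E / (K_B * T)) for E in energies)

def mean_energy_direct(energies, T):
    """<E> = sum_i E_i exp(-E_i / k_B T) / q (direct Boltzmann average)."""
    return (sum(E * math.exp(-E / (K_B * T)) for E in energies)
            / q_of_T(energies, T))

def mean_energy_from_lnq(energies, T, dT=1e-3):
    """<E> = k_B T^2 (d ln q / dT), derivative by central differences."""
    dlnq = (math.log(q_of_T(energies, T + dT)) -
            math.log(q_of_T(energies, T - dT))) / (2 * dT)
    return K_B * T**2 * dlnq

levels = [i * 3.0e-21 for i in range(8)]    # made-up evenly spaced levels, J
print(mean_energy_direct(levels, 300.0))    # the two results agree
print(mean_energy_from_lnq(levels, 300.0))  # to several significant figures
```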
Use of the ‘partition’ function
• Calculation of the molecular partition function and
average energy for the translational motion of an ideal
gas in a one-dimensional box of length a
• In Lecture 1 we provided the equation for the quantized energy levels of a particle in a box:

$E_n = \frac{n^2 h^2}{8ma^2}$
$q = \sum_i e^{-E_i/k_BT} \approx \int_0^\infty \exp\left(-\frac{n^2h^2}{8ma^2k_BT}\right)dn = \frac{(2\pi m k_BT)^{1/2}\,a}{h}$
Use of the ‘partition’ function
• In deriving this expression, we have assumed that the energy levels are spaced very close together (corresponding to a system of large mass, or a macroscopic, classical system), so the summation can be replaced by an integral, which in turn can be evaluated using the expression:

$\int_0^\infty e^{-\alpha x^2}\,dx = \frac{1}{2}\left(\frac{\pi}{\alpha}\right)^{1/2}$

Since $q \propto T^{1/2}$, we have $\ln q = \tfrac{1}{2}\ln T + \text{const}$, so

$\langle E \rangle = k_BT^2\left(\frac{\partial \ln q}{\partial T}\right)_V = k_BT^2\cdot\frac{1}{2T} = \frac{k_BT}{2}$

and, per mole,

$E = N_A\langle E \rangle = \frac{N_Ak_BT}{2} = \frac{RT}{2}$
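As a numerical sanity check, the closed-form result can be compared with a direct summation over the particle-in-a-box levels. This is a sketch with made-up parameters (roughly an argon atom, $m \approx 6.6\times10^{-26}$ kg, in a 10 nm box at 300 K):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J s

def q_translational_1d(m, a, T, n_max=100000):
    """Direct sum over particle-in-a-box levels E_n = n^2 h^2 / (8 m a^2)."""
    alpha = H**2 / (8 * m * a**2 * K_B * T)
    return sum(math.exp(-alpha * n**2) for n in range(1, n_max + 1))

m, a, T = 6.6e-26, 1.0e-8, 300.0  # kg, m, K (made-up but plausible values)

print(q_translational_1d(m, a, T))                   # direct sum, ~625
print(math.sqrt(2 * math.pi * m * K_B * T) * a / H)  # closed form, ~625.5
```

The two differ by about 1/2 (the usual correction when a sum is replaced by an integral), i.e. by less than 0.1% here, confirming that the integral approximation is excellent when the levels are closely spaced.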
Why the name ‘partition’ function?
• To a high degree of approximation, the energy of a molecule in a particular state is the simple sum of various types of energy (translational, rotational, vibrational, electronic, etc.):

$E = E_{tr} + E_{rot} + E_{vib} + \ldots$
$q = \sum e^{-(E_{tr}+E_{rot}+E_{vib}+\cdots)/k_BT} = \left(\sum e^{-E_{tr}/k_BT}\right)\left(\sum e^{-E_{rot}/k_BT}\right)\left(\sum e^{-E_{vib}/k_BT}\right)\cdots = q_{tr}\,q_{rot}\,q_{vib}\cdots$
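The factorization is just the algebra of exponents, and is easy to verify. In this hypothetical Python sketch, two independent "modes" with made-up reduced energies (in units of $k_BT$) give the same q whether we sum over all combined states or multiply the per-mode partition functions:

```python
import math

# Made-up reduced energies (units of k_B T) for two independent modes
e_tr = [0.0, 0.3, 0.6, 0.9]
e_vib = [0.0, 2.0, 4.0]

# q summed over all combined states, with E = E_tr + E_vib ...
q_combined = sum(math.exp(-(a + b)) for a in e_tr for b in e_vib)

# ... equals the product of the per-mode partition functions
q_tr = sum(math.exp(-a) for a in e_tr)
q_vib = sum(math.exp(-b) for b in e_vib)
print(q_combined, q_tr * q_vib)  # identical up to rounding
```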
Equipartition principle
• It can easily be shown that for an ideal gas in a three-dimensional box the partition function is just the cube of the one-dimensional partition function:

$q_{trans,3D} = q^3$
• From which it can be shown that:

$E_{transl,3D} = RT^2\left(\frac{\partial \ln q^3}{\partial T}\right)_V = 3RT^2\left(\frac{\partial \ln q}{\partial T}\right)_V = \frac{3}{2}RT$
• Equi-partition Principle: the energy is equally
partitioned among all translational degrees of freedom.
Do you remember what classical entropy is?
Statistical mechanical entropy
• In the context of classical thermodynamics, entropy has
been introduced as a measure of the disorder of a system.
It is therefore very reasonable to expect the entropy to be
related to W, the number of ways of distributing the
molecules in a system among their energy levels.
• Boltzmann showed that the entropy is in fact related to
the most-probable distribution Wmax (if a system contains
more than a few hundred particles, that is the only
distribution that need be considered):
$S = k_B\ln W_{max}$
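For a system as tiny as the ten-molecule example given earlier, $W$ can be counted exactly. The sketch below (illustrative only; the equilibrium entropy proper uses $W_{max}$, whereas here we simply evaluate $k_B\ln W$ for each example distribution) shows that the more probable distribution carries the larger entropy:

```python
from math import factorial, log

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_from_W(occupations):
    """S = k_B ln W, with W counted exactly (feasible only for tiny systems)."""
    ln_W = (log(factorial(sum(occupations)))
            - sum(log(factorial(n)) for n in occupations))
    return K_B * ln_W

# Distributions A, B, and C from the degeneracy example
print(entropy_from_W([2, 2, 2, 2, 2, 0, 0]))   # A
print(entropy_from_W([0, 0, 10, 0, 0, 0, 0]))  # B: W = 1, so S = 0
print(entropy_from_W([3, 2, 2, 1, 0, 1, 1]))   # C: the largest of the three
```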
Statistical mechanical entropy
This relationship provides a molecular interpretation of
entropy, in that it directly relates it to the way the
molecules in a system are distributed among its different
energy levels, which in turn depends on the energy levels
themselves as well as the temperature of a system.
$S = k_B\ln W_{max}$
Statistical mechanical entropy
• Does this definition make sense given what we know
from classical thermodynamics?
• As T approaches 0 K, all molecules will be in the lowest energy state, so $W_{max}$ approaches unity (hence $S = k_B\ln 1 = 0$)
• As T becomes progressively higher, the entropy
increases
$S = k_B\ln W_{max}$
Statistical mechanical entropy
• From the point of view of statistical mechanics, the third law of thermodynamics is a simple consequence of the occupation of the lowest available energy levels at temperatures approaching absolute zero.
• The second principle of thermodynamics states that the
equilibrium state is the state of maximum entropy for an
isolated system. From the statistical mechanical
perspective, the equilibrium state of a system represents
the most probable distribution and has maximum
randomness.
$S = k_B\ln W_{max}$
Statistical mechanical entropy
$S = k_B\ln W_{max}$
Looking at it from the opposite perspective, you can
imagine that the Boltzmann distribution and the
definition of entropy given above are statistical
mechanical formulations of the second and third
principles of thermodynamics (while the first principle,
conservation of energy, was explicitly incorporated in the
derivation of the Boltzmann distribution).