Overview

Statistical Mechanics and Thermodynamics
Statistical mechanics is the branch of physics where the gross properties of a
substance are determined from its molecular attributes. Usually the allowed
energies of a substance, E₁, E₂, …, are known from a theoretical model and a
clever averaging process produces expressions for thermodynamic variables like
average energy, pressure, and temperature.
Statistical mechanics, however, goes beyond classical thermodynamics in that it
gives the probability that the system is in any particular energy state. A variety
of averages can then be performed.
Our Approach
Statistical mechanics has the widest scientific applications of any physics
subject. Here we outline some basic theory and present various crude applications in
order to make a “first pass” over the subject using only elementary algebra and
calculus. Most of the material covered in this chapter is repeated later, but usually
with more sophistication and in more detail.
We first review fundamentals of classical thermodynamics and then introduce
some basic tools of statistical physics including the formulation of entropy and the
canonical probability distribution. As we proceed, each of these topics is illustrated with
applications.
Thermodynamic Fundamentals
Many applications of statistical physics use relations that are the province of
classical thermodynamics. We introduce the first and second laws of thermodynamics
with an interpretation of entropy based on a dynamic view of temperature.
The First Law
The first law of thermodynamics is the familiar law of conservation of energy.
The forms of energy include heat energy, mechanical kinetic and potential energy,
chemical energy, electrical energy, radiant energy, and all other forms of energy. The
internal energy U of a system is the total energy in the system in all its forms.
The first law is written as*
dU = dQ + dW    (1)
* The d's on the right-hand side are spurious because heat and work are defined as changes already. However, we find it convenient to use this notation.
where our convention is to take W as the work done on the system by its surroundings
and Q is the heat transferred into the system. Very often the work done is due to the
system exerting pressure P on the surroundings. In such cases, the work is given by –PdV
where V is the system volume. Equation (1) becomes
dU = dQ − P dV.
1. Find the change in internal energy due to the following: (a) A system absorbs
200 J of heat and does 300 J of work. (b) A system absorbs 200 J of heat and has
300 J of work done on it. (c) Find the work done when 500 J of heat are removed from a
gas at constant volume. [ans. −100 J, 500 J, 0]
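A quick numeric check of the sign convention in Eq. (1); a minimal Python sketch (the function name delta_U is ours):

    # First law with the convention dU = dQ + dW:
    # Q > 0 for heat absorbed by the system, W > 0 for work done on the system.
    def delta_U(Q, W):
        # change in internal energy from heat Q into the system and work W done on it
        return Q + W

    print(delta_U(200.0, -300.0))  # (a) absorbs 200 J, does 300 J of work: -100 J
    print(delta_U(200.0, +300.0))  # (b) absorbs 200 J, has 300 J done on it: 500 J
    # (c) at constant volume there is no P dV work, so W = 0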
Dynamic Interpretation of Temperature
An unfamiliar interpretation of temperature helps us to conceptualize entropy and
to formulate fundamental expressions in thermodynamics and statistical mechanics. In
the following we use the notation k for Boltzmann's constant (1.38×10⁻²³ J/K) and T for
absolute temperature.
We assert that the quantity kT is the average magnitude of transition-inducing
energy per particle. In particular, kT is the average energy expended to change a
molecule by an interaction or to transform the physical state of a molecule. This dynamic
interpretation of temperature is not widely known, but we present it as a shortcut to the
basic results of statistical physics and as a useful guide to intuition.†
Energy of Ideal Gas
Some crude examples can make the idea more concrete. Consider a one-dimensional ideal gas particle confined to a line segment by walls. The only interaction occurs at the walls, which absorb the kinetic energy to stop the particle (by the work-energy theorem) and then supply an equal amount of energy to reverse the particle's motion. The total interaction energy is twice the kinetic energy,
kT = 2(½mv_x²).
Introducing U to indicate the internal energy of N one-dimensional particles, this
becomes
U = ½NkT.
The same argument can be repeated for a three-dimensional gas with the result
that each component satisfies a similar relation. On average, ½mv_x² is one-third of the
total kinetic energy ½mv². Considering the interaction with one wall,
kT = 2(½mv_x²) = (1/3)·2(½mv²) = (2/3)U/N,
so that
U = (3/2)NkT = (3/2)nRT.
† R. W. Finkel and J. Joseph, Nuovo Cimento, 79 B (1984) 155.
The last equality is usual in many books, especially in chemistry, where quantities are given in moles; one mole is N_A = 6.02×10²³ particles. Rather than Boltzmann's constant k, these authors use the gas constant R, where N_A k = R. For n moles, Nk = nN_A k = nR.
2. Use the dynamic interpretation of temperature to derive the internal energy
of a two-dimensional monatomic gas. [ans. U = nRT]
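The result generalizes to (d/2)nRT for d translational dimensions. A minimal numeric sketch (Python; the parameter d is our label for the dimensionality):

    R = 8.314  # gas constant, J/(mol K)

    def internal_energy(n_moles, T, d):
        # U = (d/2) n R T for an ideal monatomic gas with d translational dimensions
        return 0.5 * d * n_moles * R * T

    print(internal_energy(1.0, 300.0, 2))  # 2D gas, U = nRT:       ~2494 J
    print(internal_energy(1.0, 300.0, 3))  # 3D gas, U = (3/2)nRT:  ~3741 J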
Energy in Radiation
Heated objects radiate energy in the form of electromagnetic waves so, for
example, warmth from an oven can be felt at a distance because the energy is
projected by infrared waves. Radiation may be viewed as a gas of photons with
energy hf and momentum hf/c. (h is Planck’s constant, c is the speed of light, and
f is the wave frequency.) When radiation is perfectly absorbed and emitted from
an object, it is called blackbody radiation. In this case the interaction of photons
is simply to absorb or emit energy hf with the walls of the object so we make the
approximate relation hf = kT.
Here we consider radiation contained in a volume V at temperature T. The average
number of photons can be estimated as follows: consider that a photon occupies a
cube of volume λ³ where the sides are one wavelength λ. Then the number of
photons in volume V is given by
average number of photons = ⟨N⟩ = V/λ³.
Now introduce the relation
λ = c/f
and note that the interaction energy of a photon is just hf when it is emitted or
absorbed. Thus, we replace each factor of hf with kT. The result shows that the
energy is proportional to T⁴,
U = ⟨N⟩kT = αVT⁴,    (2)
where α is a constant.
3. Repeat our crude development of Eq. (2) and show that our value for α is
α = k⁴/(h³c³).
The exact coefficient is found from statistical mechanics to be
α = 4π²k⁴/(60ℏ³c³).
Compare the ratio of these coefficients.
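The comparison in problem 3 reduces to a pure number. A quick check (Python sketch, using ℏ = h/2π so the constants cancel):

    import math

    # crude coefficient:  alpha_crude = k^4 / (h^3 c^3)
    # exact coefficient:  alpha_exact = 4 pi^2 k^4 / (60 hbar^3 c^3)
    # with hbar = h / (2 pi), the ratio is independent of k, h, and c:
    ratio = (4 * math.pi**2 / 60) * (2 * math.pi)**3   # = 8 pi^5 / 15
    print(ratio)  # ~163: the crude estimate is low by about two orders of magnitude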
Thermodynamic Form for Entropy
We regard entropy as a measure of the number of internal state changes accessible
to the system. Consider a system into which a reversible increment of heat, dQ_rev, is
introduced. According to the dynamic interpretation of temperature, the increased number
of possible states is the ratio dQ_rev/kT. Entropy S is defined by leaving out the k factor, so
dS = dQ_rev/T.    (3)
It is worth repeating that, in this view, entropy is a measure of the average number of
state changes accessible to the system, and the product TS is the average energy
committed to microstate transformations.
Note that Eq.(3) applies only to reversible increments of heat. Heat added irreversibly
cannot be fully utilized for state transitions. Reversibility is a convenient fiction because
entropy is a state variable—a real, irreversible process can be replaced by a reversible
process that brings the system to the same end point.
The Second Law
An irreversible increment of heat δQ must be less efficient in producing state
changes than dQ_rev, so that
dQ_rev/T > δQ/T,
or
dS ≥ δQ/T.    (4)
This leads us to the second law of thermodynamics: Entropy within a closed
system must increase or remain the same, but may never decrease. For any process,
ΔS_total ≥ 0.    (5)
When no macroscopic change occurs (that is, at equilibrium) the equal sign applies. The
subscript emphasizes that although some components in a closed system may have
lowered entropy, the total entropy cannot decrease.
A Fundamental Thermodynamic Relation
Apply the differential form of the first law to a reversible process, so that dQ can
be replaced by dQ_rev = T dS and dW can be replaced by the reversible work −P dV:
dU = T dS − P dV.    (6)
This expression consists entirely of state variables. Although Eq. (6) applies to ideal
reversible processes, it can be used to analyze irreversible processes. We can use any
artificial ideal process that carries the system through the same beginning and ending
state values as the real system.
Adiabatic Expansion and Cooling
As a simple illustration of the use of the fundamental equation (6), we
consider the rapid expansion of an ideal gas.
Consider n moles of helium originally under high pressure that is allowed
to expand rapidly into a region of lower pressure. The rapid expansion does not
allow time for heat to significantly enter the expanding gas. (A system that is so
insulated from heat transfers is called an adiabatic system.)
This process is highly irreversible, but we can “replace” it with a
reversible process with the same beginning and end. We imagine the gas in an
insulated container that is allowed to expand slowly as a reversible process.
Taking dQ_rev = 0 implies dS = 0, so the central equation (6) becomes
dU = −P dV.
Substitute P from the ideal gas equation,
PV = nRT,
and substitute for the internal energy of a monatomic gas, U = (3/2)nRT. Integrate to
find
T₁^(3/2) V₁ = T₂^(3/2) V₂,    (7)
where the subscripts pertain to the initial and final values.
4. Derive Eq. (7). If the initial temperature of compressed helium is 27°C and the
expanded volume is twice the compressed volume, what is the temperature of the
expanded gas? [ans. −84°C]
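A numeric check of problem 4 (Python sketch):

    T1 = 300.0            # initial temperature, K (27 C)
    volume_ratio = 2.0    # V2 / V1
    # From Eq. (7): T2 = T1 * (V1/V2)**(2/3)
    T2 = T1 * volume_ratio**(-2.0 / 3.0)
    print(T2, T2 - 273.15)  # ~189 K, about -84 C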
Statistical Mechanics Fundamentals
Most traditional presentations of statistical mechanics begin by expressing
entropy in terms of W, the number of equally probable microscopic states that constitute a
given macroscopic state. For example, a “macroscopic” state where 4 identical pupils (molecules)
are seated among 6 chairs can be achieved in W = 15 “microscopic” ways. Here we
develop the relevant form of entropy using the dynamic interpretation of temperature.
Boltzmann Entropy
Let W represent the total number of microstates. From the dynamic interpretation
of temperature, the ratio dQ/kT is the number of new states available to the system due
to the introduction of heat dQ. It follows that the change in total microstates dW is given
by¹
dW = W dQ/kT.
Introduce entropy using dS = dQ/T to obtain
k dW = W dS.
Solving, we find the Boltzmann form for entropy,
S = k ln W.    (8)
5. Show that the Boltzmann entropy S of a composite system is equal to the sum of
the entropies S1 and S2 of two individual parts.
¹ We know from elementary statistics that if part 1 of a system can be arranged in W₁ ways and part 2 of the system can be arranged in W₂ ways, then the state of the composite system can be arranged in W₁W₂ ways.
6. (a) Evaluate the microstates W₁ for one particle of a monatomic ideal gas in volume V
using the (crude) idea that each particle occupies a cube of volume λ³ where the
sides are one de Broglie wavelength λ. The number of ways one particle can fit in
V is V/λ³ and the average energy per particle is U/N = 3kT/2. (Recall the de Broglie
relation λ = h/mv.)
(b) Form the total number of microstates for N particles, W = W₁^N, and calculate
the entropy S; use it to find an expression for the adiabatic expansion of an
ideal monatomic gas, Eq. (7). (Use S = constant, or S₁ = S₂.) For now, we assume
the particles are distinguishable from each other.
7. (a) Repeat the development of problem 6 for radiation (a photon gas). Show that an
adiabatic condition for radiation is V₁T₁³ = V₂T₂³.
(b) The background radiation from the Big Bang is now 2.7 K. What was the
temperature when the universe was 1.0×10⁻¹⁰ of its current volume? [ans. 5800 K]
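A numeric check of problem 7(b) (Python sketch):

    T_now = 2.7                # present background temperature, K
    volume_fraction = 1.0e-10  # V_then / V_now
    # From V1*T1**3 = V2*T2**3:  T_then = T_now * (V_now/V_then)**(1/3)
    T_then = T_now * volume_fraction**(-1.0 / 3.0)
    print(T_then)  # ~5817 K, i.e. about 5800 K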
Entropy and Thermodynamics
Once a system's entropy is calculated from statistical mechanics, all of the related
thermodynamic properties can be generated. Here is the principal trick that lets us wring
thermodynamic state equations from entropy. Statistical mechanics is likely to provide us
with an entropy representation; Eq. (6) can then be rewritten to express dS,
dS = (1/T) dU + (P/T) dV.    (9) (physics)
This can be compared with the mathematical statement,
dS = (∂S/∂U) dU + (∂S/∂V) dV,    (10) (math)
to obtain relations between S and the more accessible variables T and P:
1/T = ∂S/∂U  and  P/T = ∂S/∂V.    (11)
8. In problem 6, you found that the entropy of a monatomic ideal gas is given by
S = Nk ln[(V/h³)(2mU/N)^(3/2)].
Use the relations (11) to derive the ideal gas law and the energy relation for a
monatomic gas. Show that the results give the ideal gas equation and the
relation U = (3/2)nRT.
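The derivatives in problem 8 can be checked symbolically. A minimal sketch (Python with the sympy library; the symbol names are ours):

    import sympy as sp

    N, k, V, h, m, U, T, P = sp.symbols("N k V h m U T P", positive=True)

    # Crude entropy of a monatomic ideal gas from problem 6
    S = N * k * sp.log((V / h**3) * (2 * m * U / N) ** sp.Rational(3, 2))

    # Relations (11): 1/T = dS/dU gives the energy, P/T = dS/dV gives the pressure
    energy = sp.solve(sp.Eq(sp.diff(S, U), 1 / T), U)[0]
    pressure = sp.solve(sp.Eq(sp.diff(S, V), P / T), P)[0]

    print(energy)    # 3*N*T*k/2  ->  U = (3/2) N k T
    print(pressure)  # N*T*k/V    ->  P V = N k T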
9. An entropy representation for radiation from heated objects is
S = (4/3) σ^(1/4) V^(1/4) U^(3/4),
where σ is a constant. Find the state equations for this “blackbody radiation.”
Express U entirely in terms of T and V, and eliminate T to express P entirely in
terms of U and V (these are the most familiar forms).
[ans. U = σVT⁴, P = (1/3)U/V]
Canonical Distribution
Here we derive the probability that a system with energy levels E₁, E₂, … at
temperature T is in any particular energy level E_j. The probabilities p_j are used to find
averages of physical quantities. For instance, the average energy ⟨E⟩ of a particle with
possible energies E_j available is given by
⟨E⟩ = Σ_j p_j E_j.
10. A two-state system has energy levels of 1.0 eV and 2.0 eV. The probability that
the system is in the lower energy state is 0.75. Find the average energy of the
system. [ans. 1.25 eV]
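The averaging behind problem 10 in two lines (Python sketch):

    energies = [1.0, 2.0]  # level energies, eV
    probs = [0.75, 0.25]   # level probabilities (they must sum to 1)
    print(sum(p * E for p, E in zip(probs, energies)))  # 1.25 eV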
Introduce the Helmholtz free energy F as the minimum work required to assemble
the system from its well-separated components at constant temperature. Denote the
average energy exchanged per interaction as kT. We want to evaluate the probability p_j
that the system is in level j. The energy available for changing the state from level j is
E_j − F. Consider an arbitrary time interval during which there are N interactions. The
energy available for transitions is then NkT, and the probability of a transition out of state
j is (E_j − F)/NkT. Each interaction then has a complementary probability 1 − (E_j − F)/NkT of
not causing the system to leave level j. The probability that none of the N interactions
causes a transition is p_j,
p_j = [1 − (E_j − F)/NkT]^N.
In the limit of large N this becomes² the sought-for level probability,
p_j = exp[(F − E_j)/kT].    (12)
Notice that a lower energy level E_j is favored over a higher level. Also note that the
higher the temperature, the closer all the exponents approach zero, so that the levels
become more equally populated.
11. The ground state and first excited state of hydrogen atoms are respectively
−13.6 eV and −3.4 eV. Find the ratio of ground state atoms to first excited state
atoms at (a) 3000 K and (b) 6000 K (the temperature at the surface of the sun).
[ans. exp(39.5) and exp(19.7)]
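A check of the exponents in problem 11 (Python sketch):

    k_eV = 8.617e-5        # Boltzmann constant, eV/K
    dE = 13.6 - 3.4        # excitation energy, 10.2 eV
    for T in (3000.0, 6000.0):
        # ratio of ground-state to excited-state populations is exp(dE/kT)
        print(T, dE / (k_eV * T))  # exponent ~39.5 at 3000 K, ~19.7 at 6000 K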
² Recall that lim_{N→∞} (1 + x/N)^N = exp(x).
Application: Paramagnetism
Paramagnetism is induced when a magnetic field B
interacts with unpaired electrons in a material causing a
preferential orientation of magnetic moments. A simplified
model is a uniform magnetic field applied to a collection of
free electrons. Quantum mechanics dictates that only two
states are allowed, parallel and antiparallel. When the B field -B
and the spin orientation are parallel, the energy per electron
is E   B where  is the magnetic moment of the
electron. Conversely, an antiparallel orientation has energy
E   B .
B
+B
12. (a) Use the canonical distribution to write the respective probabilities p↑ and p↓
that a single electron is oriented parallel or antiparallel to a uniform B field. (We
will soon evaluate the factor exp(F/kT), but it cancels out in this problem.)
(b) Given that there are N total electrons in the material, write expressions for the
numbers of electrons N↑ and N↓ oriented parallel and antiparallel to the field.
[ans. N↑ = Np↑, etc.]
(c) Show that the total energy E = N↑E↑ + N↓E↓ is given by
E = −NμB tanh(μB/kT).
(d) Show that for low temperatures E ≈ −NμB and for high temperatures
E ≈ −Nμ²B²/kT. (Use the approximation e^(−x) ≈ 1 − x for small x.)
(e) Magnetization M is the excess of parallel magnetic moments over antiparallel:
M = μ(N↑ − N↓). Show that in the high-temperature limit the magnetization is given
by the Curie relation,
M = Nμ²B/kT.
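A numeric illustration of parts (c) and (d) (Python sketch with representative values; the magnetic moment is taken as roughly one Bohr magneton):

    import math

    k = 1.38e-23   # Boltzmann constant, J/K
    mu = 9.27e-24  # electron magnetic moment, J/T (~ Bohr magneton)
    B = 1.0        # applied field, T
    N = 1.0e23     # number of electrons

    def total_energy(T):
        # E = -N mu B tanh(mu B / kT), from problem 12(c)
        return -N * mu * B * math.tanh(mu * B / (k * T))

    print(total_energy(0.01))               # low T: approaches -N mu B = -0.927 J
    print(total_energy(300.0))              # high T
    print(-N * mu**2 * B**2 / (k * 300.0))  # agrees with -N mu^2 B^2 / kT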
Partition Function
Write the normalization condition
Σ_i p_i = 1    (13)
and substitute from Eq. (12) to obtain a fundamental connection with thermodynamics,
F = −kT ln Z,    (14)
where the partition function Z is defined as
Z = Σ_i exp(−E_i/kT).    (15)
Moreover, Eq. (12) can be rewritten using Eq. (15) to put the canonical probabilities in
a more traditional form,
p_j = (1/Z) exp(−E_j/kT).    (16)
13. Use equations (12) and (13) to derive equations (14) and (16).
14. From memory, (a) write the canonical distribution probability and identify the
symbols in the expression, (b) write an expression for the partition function Z, and
(c) express the Helmholtz free energy F in terms of Z. (In later chapters, we will
wring thermodynamic information from both F and Z.)
15. A particular enzyme has two states: (i) inactive, with energy E, and (ii) activated,
with energy E + 0.003 aJ (a = atto = 10⁻¹⁸).
(a) Write the probability of each state at 310 K. (b) Find the sum of the
probabilities. (c) Which state occurs with greater frequency?
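A sketch of the arithmetic in problem 15 (Python; the shared energy E cancels out of the ratio):

    import math

    k = 1.38e-23   # Boltzmann constant, J/K
    T = 310.0      # temperature, K
    dE = 3.0e-21   # activation gap, J (0.003 aJ)

    # Relative Boltzmann weights, measured from the inactive level
    w_inactive, w_active = 1.0, math.exp(-dE / (k * T))
    Z = w_inactive + w_active
    print(w_inactive / Z, w_active / Z)   # ~0.67 and ~0.33; inactive is more frequent
    print(w_inactive / Z + w_active / Z)  # the probabilities sum to 1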
Looking Ahead
A few things should be mentioned as we look ahead to further elaborations and
consolidations. Entropy is a cornerstone for statistical theory and we focused on entropy
as the link between microphysics and macrophysics. However, we will find that other
thermodynamic quantities like U, F, and Z often serve as the link even more
simply or directly.
This overview presented two powerful concepts that connect the micro-world to
the macro-world, entropy and probability. The next few chapters develop a most useful
connection between these two. We will reconstruct entropy in terms of probabilities to
find the Gibbs or Shannon form, S = −kN Σ_i p_i ln p_i.
Our plan for this course is to first introduce information theory and derive the
canonical distribution and other probabilities based on the maxent principle. We will also
develop other distributions that are useful for isolated systems and for systems that
exchange particles with the environment (the microcanonical and grand canonical
distributions). Basics of classical thermodynamics will be developed as byproducts. We
will present shortcuts to the averaging processes based on the partition function. Most
importantly, we will use our results for applications to various branches of physics and
chemistry.
Summary
This overview introduced a dynamic interpretation of temperature as a conceptual
basis for central features of statistical physics. Core aspects of the subject were presented
in an initial treatment:
• Reviewed the laws of thermodynamics and summarized these for reversible systems using dU = T dS − P dV.
• Introduced the thermodynamic form of entropy, dS = dQ_rev/T, and related it to the statistical mechanical form, S = k ln W.
• Illustrated how thermodynamic state equations like PV = nRT can be derived from entropy expressions.
• Derived the canonical probability distribution, p_j = Z⁻¹ exp(−E_j/kT), and demonstrated its use in the averaging process.