Lecture 5: The Statistical Interpretation of Entropy
The aim of this lecture is to show that entropy can be interpreted in terms
of the degree of randomness as originally shown by Boltzmann.
Boltzmann’s definition of entropy is

$$S = k_B \ln \Omega$$

where $\Omega$ is the thermodynamic probability that a given state exists (the number of microstates corresponding to that state).
For example, we consider a system composed of 3 particles with energy
levels $\varepsilon_0, \varepsilon_1, \varepsilon_2, \varepsilon_3$, where the energy of level 0 is zero, level 1 is $u$, level 2
is $2u$ and level 3 is $3u$. Let the total energy of the system be $U = 3u$.
The total energy of 3u can be present with various configurations or
microstate complexions.
Distinguishable complexions for U = 3u (levels 0, 1, 2, 3 at energies 0, u, 2u, 3u):

a: all three particles in level 1; $\Omega_a = 1$; probability of occurrence 1/10
b: one particle in level 3, two particles in level 0; $\Omega_b = 3$; probability 3/10
c: one particle in level 2, one in level 1, one in level 0; $\Omega_c = 6$; probability 6/10
All 3 of these complexions or microstates correspond to a single “observable”
macrostate.
In general, the number of arrangements or complexions within a single
distribution is given by

$$\Omega = \frac{n!}{n_0!\, n_1!\, n_2! \cdots n_r!}$$

where the n particles are distributed among the energy levels such that $n_0$ are in level 0,
$n_1$ are in level 1, etc.
distribution a: $\Omega_a = \dfrac{3!}{3!\,0!\,0!} = 1$

distribution b: $\Omega_b = \dfrac{3!}{2!\,1!\,0!} = 3$

distribution c: $\Omega_c = \dfrac{3!}{1!\,1!\,1!} = 6$
The most probable distribution is determined by the set of numbers $n_i$
that maximizes $\Omega$.

Since for real systems the numbers can be large (consider the number of particles in
1 mole of gas), Stirling's approximation will be useful:

$$\ln x! \approx x \ln x - x$$
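A quick numerical check (not part of the lecture) shows how good this approximation becomes as x grows, using the gamma function for the exact value of ln x!:

```python
from math import lgamma, log

def ln_factorial(x):
    # exact ln(x!) computed as ln Gamma(x + 1), avoiding overflow
    return lgamma(x + 1)

def stirling(x):
    # Stirling's approximation: ln x! ~ x ln x - x
    return x * log(x) - x

for x in (10.0, 1e3, 1e6):
    exact, approx = ln_factorial(x), stirling(x)
    print(f"x = {x:g}: relative error = {(exact - approx) / exact:.2e}")
```

The relative error shrinks rapidly, so for a mole of particles (x ~ 10^23) the approximation is essentially exact.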
The observable macrostate is determined by constraints:

$$U = n_0\varepsilon_0 + n_1\varepsilon_1 + \dots + n_r\varepsilon_r = \sum_{i=0}^{r} n_i \varepsilon_i \qquad \text{(constant energy in the system)}$$

$$n = n_0 + n_1 + \dots + n_r = \sum_{i=0}^{r} n_i \qquad \text{(constant number of particles in the system)}$$
Any interchange of particles among the energy levels is constrained by
the conditions:

$$\delta U = \sum_i \varepsilon_i\, \delta n_i = 0 \qquad \text{(A)}$$

$$\delta n = \sum_i \delta n_i = 0 \qquad \text{(B)}$$
Also, using the definition of $\Omega$ and Stirling's approximation:

$$\Omega = \frac{n!}{n_0!\, n_1! \cdots n_r!}$$

$$\ln \Omega = \ln n! - \ln n_0! - \ln n_1! - \dots - \ln n_r!$$

$$\ln \Omega = (n \ln n - n) - \sum_{i=0}^{r} (n_i \ln n_i - n_i)$$
The constraints on the particle numbers impose a condition on $\Omega$:

$$\delta \ln \Omega = -\sum_i \ln n_i\, \delta n_i = 0 \qquad \text{(C)}$$
What we need is to find the most likely microstate or complexion, and that will be
given by the maximum value of $\Omega$. This occurs when equations A, B and C are
simultaneously satisfied.

We use the technique of Lagrange multipliers, a method for finding the extrema of a
function of several variables subject to one or more constraints.
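For a system as small as the 3-particle example, we can sidestep Lagrange multipliers entirely and simply enumerate every occupancy set satisfying both constraints (an illustrative sketch, not the general method):

```python
from math import factorial
from itertools import product

n, U = 3, 3            # 3 particles, total energy 3u (energies in units of u)
levels = [0, 1, 2, 3]  # level energies: 0, u, 2u, 3u

best = None
for occ in product(range(n + 1), repeat=len(levels)):
    if sum(occ) != n:                                   # constraint B: particle number
        continue
    if sum(ni * e for ni, e in zip(occ, levels)) != U:  # constraint A: energy
        continue
    omega = factorial(n)
    for ni in occ:
        omega //= factorial(ni)                         # Omega = n!/(n0! n1! ... nr!)
    print(occ, "Omega =", omega)
    if best is None or omega > best[1]:
        best = (occ, omega)
print("most probable:", best)
```

Only the three distributions a, b, c survive the constraints, and the enumeration confirms that c, with occupancies (1, 1, 1, 0) and Ω = 6, is the most probable.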
We multiply equation A by a quantity $-b$, where $b$ has the units of reciprocal
energy:

$$-b \sum_i \varepsilon_i\, \delta n_i = 0 \qquad \text{(D)}$$

Equation B is multiplied by $-a$, where $a$ is a dimensionless constant:

$$-a \sum_i \delta n_i = 0 \qquad \text{(E)}$$
Equations C, D and E are added to give

$$-\sum_{i=0}^{r} (\ln n_i + a + b\varepsilon_i)\, \delta n_i = 0,$$

i.e.,

$$(\ln n_0 + a + b\varepsilon_0)\,\delta n_0 + (\ln n_1 + a + b\varepsilon_1)\,\delta n_1 + \dots + (\ln n_r + a + b\varepsilon_r)\,\delta n_r = 0.$$
This can only occur if each of the bracketed quantities is identically zero:

$$\ln n_i + a + b\varepsilon_i = 0$$

Rearranging for the $n_i$:

$$n_i = e^{-a}\, e^{-b\varepsilon_i}$$
and summing over all the energy levels:

$$\sum_{i=0}^{r} n_i = n = e^{-a} \sum_{i=0}^{r} e^{-b\varepsilon_i}$$
ir
The quantity

e
 bi
e
 bo
e
 b 1
 ...  e
 b r
 P
io
is very important and occurs very often in the study of statistical mechanics. It
is called the partition function, P. Then,
ir

ni  n  e
io
e
a
a
ir

e
 bi
io
 n/P
This allows us to write the expression for ni in convenient form,
ni 
ne
 bi
P
So, the distribution of particles maximizing $\Omega$ is one in which the
occupancy or population of the energy levels decreases exponentially
with increasing energy.
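A short sketch evaluating this distribution for the 4-level example (energies and b in units of u and 1/u; the particular values of n and b below are arbitrary illustrations, not from the lecture):

```python
from math import exp

def occupations(n, energies, b):
    """n_i = n e^(-b*e_i) / P, where P = sum of e^(-b*e_i) is the partition function."""
    boltzmann_factors = [exp(-b * e) for e in energies]
    P = sum(boltzmann_factors)
    return [n * f / P for f in boltzmann_factors]

levels = [0.0, 1.0, 2.0, 3.0]   # level energies in units of u
for b in (0.5, 1.0, 2.0):       # larger b corresponds to lower temperature
    ni = occupations(1000, levels, b)
    print(f"b = {b}: n_i =", [round(x, 1) for x in ni])
```

The populations decrease exponentially with energy at every b, and the decay steepens as b grows, i.e., as the temperature falls.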
We can identify the undetermined multiplier $b$ using the following argument
connecting $\Omega$ with entropy, S.
Consider two similar systems a and b in thermal contact, with
entropies $S_a$ and $S_b$ and associated thermodynamic probabilities $\Omega_a$
and $\Omega_b$. Since entropy is an extensive variable, the
total entropy of the composite system is

$$S = S_a + S_b$$

The thermodynamic probability of the composite system involves a product
of the individual probabilities:

$$\Omega = \Omega_a \Omega_b$$
Since our aim is to connect $\Omega$ with entropy, S, we seek

$$S = f(\Omega)$$

Then we must have

$$f(\Omega) = f(\Omega_a \Omega_b) = f(\Omega_a) + f(\Omega_b)$$

The only function satisfying this is the logarithm, so that we must have

$$S = k \ln \Omega,$$

where k is a constant.
Now we can identify the quantity $b$. We start with the condition

$$\delta \ln \Omega = -\sum_i \ln n_i\, \delta n_i = 0 \qquad \text{(C)}$$

and make the substitution in C for $\ln n_i$ from $\ln n_i + a + b\varepsilon_i = 0$, i.e., $\ln n_i = -a - b\varepsilon_i$:

$$\delta \ln \Omega = -\sum_i \ln n_i\, \delta n_i = \sum_i (a + b\varepsilon_i)\, \delta n_i$$

Expanding and rearranging,

$$\delta \ln \Omega = b \sum_i \varepsilon_i\, \delta n_i + a \sum_i \delta n_i$$

The second sum vanishes by constraint B, and the first sum is the change in energy, so

$$\delta \ln \Omega = b\, \delta U$$

and solving for b,

$$b = \frac{\delta \ln \Omega}{\delta U}$$
But we can see that

$$b = \frac{\delta \ln \Omega}{\delta U} = \frac{d \ln \Omega}{dU} = \frac{1}{k}\,\frac{d(k \ln \Omega)}{dU} = \frac{1}{k} \left( \frac{\partial S}{\partial U} \right)_V$$

The constant-volume condition results from the fixed number of energy states.
Then, from the combined 1st and 2nd Law,

$$dU = T\,dS - p\,dV$$

$$\left( \frac{\partial U}{\partial S} \right)_V = T; \qquad \left( \frac{\partial S}{\partial U} \right)_V = \frac{1}{T}$$
and finally

$$b = \frac{1}{kT}; \qquad k = k_B$$
Configurational and Thermal Entropy
Mixing of red and blue spheres
for the unmixed state 1, $\Omega = 1$

for mixing of $n_b$ blue and $n_r$ red spheres,

$$\Omega_{conf} = \frac{(n_b + n_r)!}{n_b!\, n_r!}$$

Then

$$S_{conf} = k_B \ln \frac{(n_b + n_r)!}{n_b!\, n_r!}$$
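This is easy to evaluate numerically; working with logarithms of the factorials avoids overflow for large particle numbers (a sketch, with an illustrative 100-sphere example that is not from the lecture):

```python
from math import lgamma

K_B = 1.380649e-23  # Boltzmann constant, J/K

def ln_factorial(x):
    # ln(x!) via the gamma function: ln Gamma(x + 1)
    return lgamma(x + 1)

def s_conf(nb, nr):
    """Configurational entropy S_conf = k_B ln[(nb+nr)! / (nb! nr!)]."""
    ln_omega = ln_factorial(nb + nr) - ln_factorial(nb) - ln_factorial(nr)
    return K_B * ln_omega

print(s_conf(0, 100))   # unmixed: Omega = 1, so S_conf = 0
print(s_conf(50, 50))   # 50/50 mixture: S_conf > 0
```

As expected, the unmixed state contributes no configurational entropy, while any mixture gives a positive value.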
The total entropy will be given by

$$S_{total} = S_{thermal} + S_{conf}$$

$$S_{total} = k_B \ln \Omega_{th} + k_B \ln \Omega_{conf}$$

$$S_{total} = k_B \ln (\Omega_{th}\, \Omega_{conf})$$
The number of spatial configurations available to 2 closed systems placed
in thermal contact is unity, so for heat flow down a temperature gradient
only $\Omega_{th}$ changes.

Similarly, for mixing of particles A and B, the only contribution to the entropy
change will be $\Delta S_{conf}$ if the redistribution of particles does not cause any
shift in the energy levels, i.e., $\Omega_{th,1} = \Omega_{th,2}$. This would be the case of ideal mixing,
since the total energy of the mixed system would be identical to the sum of the
energies of the individual systems. This occurs in nature only rarely.