Heat and Entropy
Van der Waals-Zeeman Institute, University of Amsterdam
Jacques C. P. Klaasse

Lecture on Heat and Entropy, according to Clausius and Boltzmann
Presentation IMNU, Hohhot, China
June 2008
Contents of the presentation
• Internal energy U
• Heat, Work, and Entropy
• Specific heat
• Entropy and statistics
• Boltzmann entropy
• Entropy and ideal gas
• Boltzmann distribution
• Specific heat revisited
• Conclusions
Internal energy U

For an ideal gas, the "internal" energy U is nothing more than the total kinetic energy of the N particles:

E_kin = ½ N m v_av²

with m the mass and v_av² the average squared velocity. This kinetic energy turns out to depend only on the temperature of the gas.

A simple derivation, shown on the next sheets, gives for a mono-atomic ideal gas the simple relation

U = E_kin = (3/2) N k_B T

with k_B the so-called Boltzmann constant.
Internal energy U

Take a particle of the gas with x-component of the velocity v_x. On each collision with the yz-wall, an impulse of 2 m v_x is transferred. This happens v_x / (2 l_x) times per second.

=> Time-averaged force from this single particle: F = m v_x² / l_x.

For all particles together: F = m (Σ v_x²) / l_x. With Σ v_x² = N v_x,av², it follows for the pressure p:

p = F / (l_y l_z) = (N/V) m v_x,av²

with v_x,av² the mean-square value of the velocity in the x-direction.

All three directions are equivalent, thus

v_x,av² + v_y,av² + v_z,av² = v_m²

is the total mean-square velocity, with v_m² = 3 v_x,av².
Internal energy U

We have seen that the pressure satisfies p = (N/V) m v_x,av². Because of the equivalence of the three directions, v_m² = 3 v_x,av², so for p it follows that

p = (1/3) (N/V) m v_m².

We know E_kin = ½ N m v_m², and the ideal-gas law p V = N k_B T.

Eliminating m v_m² and p gives

E_kin = (3/2) N k_B T  =>  U = E_kin = U(T).
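As a quick numerical check (a minimal sketch; the temperature and the choice of helium are illustrative, not taken from the slides), the relation U = (3/2) N k_B T fixes the root-mean-square speed per particle via ½ m v_m² = (3/2) k_B T:

```python
import math

k_B = 1.380649e-23            # Boltzmann constant, J/K
m_He = 4.0026 * 1.66054e-27   # mass of a helium atom, kg

def v_rms(T, m):
    """Root-mean-square speed from (1/2) m v_m^2 = (3/2) k_B T."""
    return math.sqrt(3.0 * k_B * T / m)

print(v_rms(300.0, m_He))     # ~1.37e3 m/s for helium at 300 K
```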
Contents of the presentation
•
•
•
•
•
•
•
•
•
Internal energy U
Heat, Work, and Entropy
Specific heat
Entropy and statistics
Boltzmann entropy
Entropy and ideal gas
Boltzmann distribution
Specific heat revisited
Conclusions
Heat, Work, and Entropy

For an ideal gas, energy can be added or removed by
- mechanical work: ΔW = -p ΔV
- heat: ΔQ.

Consequently,

dU = dQ + dW = dQ - p dV.

The total internal energy U is, for a fixed amount of gas, a function of T only.

[Figure: two isotherms in a p-V diagram, with T_h > T_l. The internal energy U is constant along each isotherm.]
Heat, Work, and Entropy

Let us go, in the p-V diagram, from state 1 to state 2.

∫dU (1→2) is a function of T only => independent of the path.

However, ∫dW (1→2) is different for the two paths
1 → A → 2
1 → 1' → B → 2' → 2.
(The pressure in -∫p dV is much lower in the second case.)

Consequently, ∫dQ also depends on the path followed in the p-V diagram.

[Figure: two isotherms in a p-V diagram, with T_h > T_l. Going from 1 to 2, both Q and W depend on the path.]
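The path dependence is easy to see numerically. A minimal sketch (the states and paths below are illustrative choices, not the ones in the figure): two states on the same isotherm, reached once along the isotherm and once via an isobar followed by an isochore.

```python
import math

# Two states on the same isotherm of an ideal gas (illustrative reduced units,
# N k_B T = 1): state 1 at (V = 1, p = 1), state 2 at (V = 2, p = 0.5).
NkBT = 1.0
V1, V2 = 1.0, 2.0

# Path A: along the isotherm, W = -integral of p dV with p = NkBT / V.
W_isotherm = -NkBT * math.log(V2 / V1)     # = -ln 2 ~ -0.693

# Path B: isobaric expansion at p = 1, then an isochoric pressure drop (W = 0).
W_two_step = -1.0 * (V2 - V1)              # = -1.0

# Same endpoints, same Delta U (zero on an isotherm), yet different W,
# hence different Q = Delta U - W.
print(W_isotherm, W_two_step)
```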
Heat, Work, and Entropy

We have seen that ∫dW depends on the path that is followed.

However, ∫(1/p) dW = -∫dV is independent of the path: V(2) - V(1) is the same in both cases.

Question: how can we make ∫dQ independent of the path?

The related variable is T, so we look for dQ = T·d"?". Clausius called this "missing variable" S, and named it "entropy".

It can be shown that ∫dS = ∫(1/T) dQ is independent of the path that is followed,…

… provided that thermal equilibrium is never broken during the process.
Heat, Work, and Entropy

We have defined dS = dQ/T.

Q depends on the number of particles N, and so does S. That means that for the two boxes shown here (at T1 = 99.5 K and T2 = 100.5 K),

S = S1 + S2.

If we open the valve, heat will flow from 2 → 1. If that heat is dQ, then

dS1 = +dQ / 99.5
dS2 = -dQ / 100.5, with |dS2| < dS1 !!

Through this spontaneous temperature equalisation, the total entropy increases.

It can be shown that for any spontaneous process the entropy will increase. Considering the whole system, entropy always develops towards a maximum value.
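The bookkeeping can be checked in a few lines (the amount of heat dQ is an arbitrary small illustrative value):

```python
# Entropy bookkeeping for the two boxes (T1 = 99.5 K, T2 = 100.5 K, as on the
# slide), for a small amount of heat dQ flowing from warm box 2 to cold box 1.
dQ = 1.0e-3           # J, illustrative
T1, T2 = 99.5, 100.5  # K

dS1 = +dQ / T1        # entropy gained by the cold box
dS2 = -dQ / T2        # entropy lost by the warm box (smaller in magnitude)

print(dS1 + dS2 > 0)  # True: the total entropy increases
print(dS1 + dS2)      # ~1.0e-7 J/K
```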
Specific Heat

A method to determine the entropy content of a system (for instance a piece of material) is to measure the specific heat (the heat capacity per unit of mass) as a function of T, from T = 0 up to the wanted T.

The heat capacity C satisfies C = dQ/dT. Consequently, (C/T) dT = dQ/T = dS. For S it follows that

S = ∫ (C(T)/T) dT

from T = 0 up to the final temperature.

NB: for solids it is not very important whether we keep V or p constant.
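In practice this integral is evaluated numerically from measured C(T) data. A minimal sketch, using as a stand-in the low-T Debye form C = a·T³ (an assumed toy model, chosen because the exact answer S = C(T)/3 is then known):

```python
import numpy as np

def trapz(y, x):
    """Trapezoidal rule, written out to avoid NumPy version differences."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

# Toy model: low-T Debye form C = a*T^3, for which the exact entropy is
# S(T) = C(T)/3, so the numerical integral of C/T can be checked.
a = 1.0e-4                          # J/K^4, illustrative coefficient
T = np.linspace(1e-6, 20.0, 2001)   # start just above T = 0 (we cannot measure at 0)
C = a * T**3

S = trapz(C / T, T)                 # area under the C/T curve
print(S, a * 20.0**3 / 3)           # numerical vs exact: both ~0.2667 J/K
```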
Specific Heat

[Figure: typical heat-capacity curve for a solid, from low T up to about half the "Debye temperature", which is mostly of the order of about 100 K.]

If we make a plot of C/T versus T, the area under the curve is the entropy.

Problem: we cannot measure from T = 0…
Fortunately, the entropy contributions are small there.
Specific Heat

For mono-atomic noble gases we have seen that E_kin = (3/2) N k_B T. For the specific heat that means

C = (3/2) N k_B.

This is in good agreement with experimental values for noble gases.

This C is taken at constant volume: C_V. One can also measure at constant pressure: C_p = (5/2) N k_B.

For solids, constant volume is very difficult to realise, but at low T, C_V ≈ C_p. At 300 K the difference is only a few percent.
Specific Heat

For a two-level system the specific heat has a typical shape, called a "Schottky peak". The maximum for the simplest two-level system (without degeneracy) lies at T ≈ 0.42 Δε / k_B, with Δε the distance between the levels.

Later in this course we will learn how to calculate this Schottky peak.

[Figure: typical Schottky peak in the specific heat. The left part of the C/T curve increases roughly exponentially, the right part decreases as 1/T³.]
Entropy and statistics

Boltzmann: nature is ruled by statistics. Energy is distributed randomly over the given particles. Every possible configuration at a given energy has equal probability.

A possible configuration is often called a micro-state. A micro-state is also called a quantum state (the individual energy levels are given by quantum mechanics).

A particular solution for the distribution of the energy over the particles is called, in short, a distribution.
Entropy and statistics

The distribution of highest probability (containing the largest number of possible configurations) is the equilibrium distribution.

The equilibrium distribution is adopted most of the time, BUT NOT ALWAYS.

Distributions far away from equilibrium have, in the case of a large number of particles, a negligible but non-zero probability.
Entropy and statistics

• Look at this system of 6 places. Particles can be distributed randomly with sign "+" or "-".
• Here we present a "distribution" with two "+" and four "-" particles.
• In how many ways can this distribution be realized?

The first "+" can be put at 6 positions, the second at 5. But we can interchange the two "+" particles without seeing the difference: the total number of really different configurations is 6·5/2 = 15.
Entropy and statistics

The number of possible configurations, Ω, for an arbitrary distribution n(+), n(-) equals

(6·5·4·3·2·1) / { [n(+)·…·1] · [n(-)·…·1] }.

With the notation n·(n-1)·…·2·1 = n! follows:

Ω = 6! / [ n(+)! · n(-)! ].

Results for all 7 distributions (n(+), n(-)) in the case N = 6:

n(+), n(-):  0,6  1,5  2,4  3,3  4,2  5,1  6,0
Ω:             1    6   15   20   15    6    1

We see that 3,3 is most probable, but the others are quite possible as well. If we take N = 60, then 30,30 is most probable, but the curve is more sharply peaked.

NB: the sum over all configurations equals 64 = 2⁶.
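Since Ω = N! / [n(+)! n(-)!] is just a binomial coefficient, these numbers (and the claims on the next sheet) can be verified in a few lines of Python:

```python
from math import comb

N = 6
omegas = [comb(N, n_plus) for n_plus in range(N + 1)]
print(omegas)                          # [1, 6, 15, 20, 15, 6, 1]
print(sum(omegas))                     # 64 = 2**6

# Fraction of time spent in the three central distributions (2,4), (3,3), (4,2):
print((15 + 20 + 15) / 64)             # ~0.78

# For N = 60 the distribution is relatively much sharper:
print(comb(60, 20) / comb(60, 30))     # ~0.035: only ~3% of the 30,30 value
```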
Entropy and statistics

We repeat the results for all 7 distributions in the case N = 6:

n(+), n(-):  0,6  1,5  2,4  3,3  4,2  5,1  6,0
Ω:             1    6   15   20   15    6    1

We see that the Ω for 2,4 (or 4,2) is 75% of the 3,3 value. During 78% of the time the system is in one of the three central distributions.

If we take N = 60, the Ω for 20,40 is only 3% of the 30,30 value…
Entropy and statistics

[Figure: Ω as a function of the distribution, plotted for N = 6, N = 60, and N = 6·10²³. The larger N, the more sharply the curve is peaked around 50%.]

For N = 6·10²³ the peak is so sharp that the system is nearly all the time in the 50% state.

However, all other distributions are possible!!
Boltzmann entropy

Boltzmann realised that entropy has something to do with Ω.

For two systems we know that the number of configurations is the product of the two:

Ω = Ω1 · Ω2.

However, the entropy, being directly related to the heat content, has to be the SUM of both entropies. Consequently:

S = const · log(Ω).

The constant is called k_B, the "Boltzmann constant".
Boltzmann entropy

Some mathematics, in order to calculate S for large N. For large N holds:

log(N!) ≈ N log(N) - N   (Stirling's formula).

NB: by "log" we mean the natural logarithm.

For large N, direct counting is not possible. But if x = n(+)/N, and thus n(-)/N = 1 - x, we can write

log(Ω) = log{ N! / [n(+)! · n(-)!] }
       = N log(N) - N - Nx log(Nx) + Nx - N(1-x) log(N(1-x)) + N(1-x).

This can be simplified to:

log(Ω) = -N { x log(x) + (1-x) log(1-x) }.
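Both Stirling's formula and the resulting expression for log(Ω) are easy to verify numerically. A minimal sketch (lgamma(n+1) gives the exact log(n!); x = 0.25 is an arbitrary test value):

```python
from math import lgamma, log

def log_factorial(n):
    """Exact log(n!) via the log-gamma function."""
    return lgamma(n + 1)

def log_omega_exact(N, x):
    """log of N! / (n(+)! n(-)!), with n(+) = x*N rounded to an integer."""
    n_plus = round(x * N)
    return log_factorial(N) - log_factorial(n_plus) - log_factorial(N - n_plus)

def log_omega_stirling(N, x):
    """The large-N formula: -N [ x log(x) + (1-x) log(1-x) ]."""
    return -N * (x * log(x) + (1 - x) * log(1 - x))

# The relative agreement improves as N grows:
for N in (100, 10_000, 1_000_000):
    print(N, log_omega_exact(N, 0.25), log_omega_stirling(N, 0.25))
```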
Boltzmann entropy

We have found that for large N:

S/k_B = log(Ω) = -N { x log(x) + (1-x) log(1-x) },  with 0 < x < 1.

For equilibrium we look for the maximum of this function. That means differentiating and setting the result to zero. After some straightforward calculation one finds:

log( x / (1-x) ) = 0,  or x = 1-x,  or x = 0.5…

…just what we have seen for small numbers.

This is not a sharp maximum, but going back from S to Ω, you will find the sharpness presented before.
Boltzmann entropy

We have found:

log(Ω) = -N { x log(x) + (1-x) log(1-x) } = N g(x),

where g(x) = -{ x log(x) + (1-x) log(1-x) } has its maximum g_max = log 2 ≈ 0.69, of order one, at x = 0.5.

Let g differ by only 1/1000 from its equilibrium value g_max. Then

Ω / Ω_max = exp( N [g_max - 0.001] ) / exp( N g_max ) = 1 / exp(0.001 N).

For N = 10²³ this means Ω / Ω_max = 1 / exp(10²⁰)…

…and this is a tremendously small number.
Boltzmann entropy

We have seen that a collection of particles with two equally probable possibilities is found mostly in a 50/50 distribution. Think, for instance, of non-interacting particles with spin up or down in zero field.

But if we apply a field, and we let the spins interact freely with a large system at temperature T, the distribution will change, depending on the temperature and the magnetic field.

We will generalize this to a system with a finite number of energy levels ε_i, in thermal contact with a large system with energy E, and we will look for the probability of each level being occupied.
Entropy of ideal gas

But first something different. Let us calculate S for a vessel filled with an ideal mono-atomic gas. All energy is kinetic energy. Counting micro-states is not an option.

We know dS = dQ/T = dE/T, thus T = dE/dS, with E = (3/2) N k_B T. This gives

dS = (3/2) N k_B dE/E.

Integrating this gives: S = (3/2) N k_B log(E) + const.

Writing const = k_B log(a), and using S = k_B log(Ω), we have

log(Ω) = (3/2) N log(E) + log(a) = log( a·E^((3/2)N) ),

or: Ω = a·E^((3/2)N).
Entropy of ideal gas

We have seen: Ω = a·E^((3/2)N).

For small E, however, in particular E < 1, the probability seems to go to zero for large N. So we have to think about the constant a. We know that at T = 0, Ω has to equal 1: all particles are in the ground state (zero-point energy) E0, so a·E0^((3/2)N) = 1, or:

a = E0^(-(3/2)N),  or  Ω = (E/E0)^((3/2)N).

For simplicity we will continue writing "a" in our formulas. But we can ask for a good guess for E0. On the next sheet we calculate this energy for helium gas.
Entropy of ideal gas

Quantum physics teaches that for an ideal mono-atomic gas

S = N k_B [ log(n_Q/n) + 5/2 ],

with n the density and n_Q = (2π m k_B T / h²)^(3/2) the quantum concentration, m being the mass of the particles.

Calculating S for one mole of helium at 1 atm (10⁵ Pa) and 100 K gives n_Q/n ≈ 2.1·10⁴, and thus S = 8.314 [ log(2.1·10⁴) + 5/2 ] ≈ 104 J/K.

That means: S = k_B log[ (E/E0)^((3/2)N) ] ≈ 12.5 log(E/E0).
=> log(E/E0) ≈ 8.3, and E/E0 ≈ 4·10³.
=> E0 / (N k_B) ≈ 0.04 K.

We find for the ground state an energy equivalent to a few hundredths of a kelvin; in any case far below 1 K, where the gas enters the quantum regime.
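A sketch of this arithmetic with CODATA constants (per mole, so N k_B equals the gas constant R):

```python
import math

k_B = 1.380649e-23          # J/K
h   = 6.62607015e-34        # J s
N_A = 6.02214076e23         # 1/mol
m   = 4.0026 * 1.66054e-27  # kg, mass of a helium atom

T, p = 100.0, 1.0e5
n   = p / (k_B * T)                                # particle density, 1/m^3
n_Q = (2 * math.pi * m * k_B * T / h**2) ** 1.5    # quantum concentration

S = N_A * k_B * (math.log(n_Q / n) + 2.5)          # Sackur-Tetrode, J/(K mol)
print(n_Q / n, S)                                  # ~2.1e4 and ~104 J/K

# Back out E0 from S = (3/2) N k_B log(E/E0), with E/(N k_B) = (3/2)*100 K = 150 K:
ratio = math.exp(S / (1.5 * N_A * k_B))            # E/E0 ~ 4e3
print(150.0 / ratio)                               # E0/(N k_B) ~ 0.04 K
```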
Entropy of ideal gas

If we consider two vessels with ideal gas, and we keep the valve between them open, equilibrium will be attained. That means Ω = Ω1 · Ω2 has a maximum:

dΩ/dE1 = Ω1 (dΩ2/dE1) + Ω2 (dΩ1/dE1) = 0.

Now dE1 = -dE2 (!!), and thus

Ω1 (dΩ2/dE2) = Ω2 (dΩ1/dE1).

With Ω = a·E^((3/2)N), and thus dΩ/dE = a (3N/2) E^((3/2)N - 1), it follows (after some simplification) that

E1/N1 = E2/N2,  or  T1 = T2 !!
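The same conclusion can be checked numerically by scanning how the total energy E is split between the two vessels (the particle numbers and total energy below are illustrative):

```python
import numpy as np

# Maximize log(Omega1 * Omega2) over the energy split E1 + E2 = E,
# with Omega = a * E**(1.5 * N). Illustrative particle numbers and energy.
N1, N2, E = 3.0, 7.0, 10.0
E1 = np.linspace(0.01, E - 0.01, 100_000)
log_omega = 1.5 * N1 * np.log(E1) + 1.5 * N2 * np.log(E - E1)

E1_best = E1[np.argmax(log_omega)]
print(E1_best / N1, (E - E1_best) / N2)  # both ~1.0: E1/N1 = E2/N2, i.e. T1 = T2
```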
Boltzmann distribution

Without exact counting, we can say that for a large system holds Ω = Ω(E).

We now start with system 1 holding the total energy E, and system 2 empty. We allow one particle to go from 1 to level ε_i in 2, and we define the probability of attaining just level i as P_i.

This P_i is proportional to the total Ω, being Ω1·Ω2. Now Ω1 = Ω1(E - ε_i), but Ω2 = 1 !!

Thus: P_i = Ω1(E - ε_i) / Σ_k Ω1(E - ε_k).
Boltzmann distribution

Thus: P_i = Ω1(E - ε_i) / Σ_k Ω1(E - ε_k).

Now Ω is a steep function of E, but log(Ω) is smooth. That means we can approximate log(Ω1(E - ε_i)) by a first-order Taylor expansion:

log(Ω1(E - ε_i)) ≈ log(Ω1(E)) - ε_i · d(log Ω1(E))/dE.

⇒ Ω1(E - ε_i) ≈ exp{ log(Ω1(E)) - ε_i · d(log Ω1(E))/dE } = Ω1(E) · exp(-β ε_i),

with β = d(log Ω1(E))/dE. The formula above now reduces to:

P_i = Ω1(E) exp(-β ε_i) / Σ_k { Ω1(E) exp(-β ε_k) } = exp(-β ε_i) / Σ_k exp(-β ε_k).
Boltzmann distribution

We have:

P_i = exp(-β ε_i) / Σ_k exp(-β ε_k).

We can calculate β for a mono-atomic ideal gas, where Ω = a·E^((3/2)N) and E = (3/2) N k_B T:

β = d(log Ω1(E))/dE = d( log(a) + (3/2) N log(E) )/dE = (3/2) N (1/E) = (3/2) N / [ (3/2) N k_B T ].

Conclusion:

β = 1 / (k_B T).

This turns out to hold for all other systems as well.
Boltzmann distribution

We now have:

P_i = exp(-ε_i / k_B T) / Σ_k exp(-ε_k / k_B T).

If we have two levels, i and j, with energy difference Δε, the relative occupation follows from (NB: the denominator is the same for both states):

P_i / P_j = exp(-ε_i / k_B T) / exp(-ε_j / k_B T) = exp( -(ε_i - ε_j) / k_B T )

=> P_i / P_j = exp(-Δε / k_B T).

It is in this form that the "Boltzmann factor" occurs, in most cases, in physics and chemistry.
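A minimal sketch of the distribution in code (the two-level system with a splitting equivalent to 100 K is an illustrative choice):

```python
import numpy as np

k_B = 1.380649e-23  # J/K

def boltzmann_probs(energies, T):
    """P_i = exp(-eps_i / k_B T) / sum_k exp(-eps_k / k_B T)."""
    w = np.exp(-np.asarray(energies) / (k_B * T))
    return w / w.sum()

# Illustrative two-level system whose splitting corresponds to 100 K:
eps = np.array([0.0, 100.0 * k_B])      # level energies in joule
for T in (10.0, 100.0, 1000.0):
    P = boltzmann_probs(eps, T)
    print(T, P, P[1] / P[0])            # the ratio equals exp(-100/T)
```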
Specific heat revisited

We will now learn how to calculate the specific heat of a simple two-level system with energy difference Δε.

We know now that n(+) / n(-) = exp(-Δε / k_B T), and with n(-) = N - n(+) it follows, with δ = Δε / k_B and R = N k_B:

E = n(+) Δε = R δ e^(-δ/T) / [1 + e^(-δ/T)],  or  E = R δ / [1 + e^(+δ/T)].

Now the specific heat C:

C = dE/dT = R (δ/T)² e^(+δ/T) / [1 + e^(+δ/T)]².

We see that at low T the exponentials dominate, whereas at high T they are about one, leading to the previously mentioned behaviour.
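A short numerical check of this result (reduced units with R = 1 and δ = 1); it also locates the maximum of the Schottky peak, near T ≈ 0.42 δ as quoted earlier:

```python
import numpy as np

# Schottky heat capacity of a two-level system,
# C = R (d/T)^2 e^(d/T) / (1 + e^(d/T))^2, in reduced units R = 1, delta = 1.
T = np.linspace(0.05, 5.0, 100_000)
x = 1.0 / T                              # delta / T, at most 20 on this grid
C = x**2 * np.exp(x) / (1.0 + np.exp(x))**2

print(T[np.argmax(C)], C.max())          # peak near T ~ 0.42, height ~ 0.44 R
```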
Specific heat revisited

For low T we thus have

C/T ≈ R (δ²/T³) e^(-δ/T),

where the exponential dominates at low T. At high T we have

C/T ≈ (R/4) (δ²/T³).

Question: what is the entropy of the system (i.e., the shaded area under the C/T curve) for very high T??

Remember what the equilibrium is at high T, and how we calculate the entropy….
Specific heat revisited

At high T, sufficient energy is available for each particle to make a free choice between + and -, leading to Ω = 2^N possible configurations.

For S it follows:

S = k_B log(Ω) = k_B log(2^N) = N k_B log(2) = R log 2.

And 2 is exactly the number of levels…

For m levels with degeneracy one, we will find at sufficiently high T:

S = R log m.

If the degeneracy is larger than 1, the formulas become somewhat more intricate, but the principle doesn't change.
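This limit can be verified by integrating C/T for the Schottky system numerically (same reduced units, R = δ = 1):

```python
import numpy as np

def trapz(y, x):
    """Trapezoidal rule, written out to avoid NumPy version differences."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

# Integrate C/T up to a high temperature; the result should approach R log 2.
# The exponentials are written with e^(-1/T) so nothing overflows at low T.
T = np.linspace(1e-4, 200.0, 2_000_000)
C = (1.0 / T)**2 * np.exp(-1.0 / T) / (1.0 + np.exp(-1.0 / T))**2

print(trapz(C / T, T), np.log(2))   # both ~0.693
```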
Conclusions

• Entropy is controlled by statistics.
• Equilibrium is nothing more than the most frequently occurring distribution.
• Equilibrium can be calculated by maximizing the probability, and thus the entropy.
• The "Boltzmann factor" can in special cases be derived directly from the maximum in the probability.
• The factor exp(-Δε/k_B T), however, has universal validity.
Acknowledgements

• A considerable part of this course is based on a first-year tutorial given by professor dr. D. Frenkel (now at Cambridge University in the UK) during the last few years in Amsterdam.
• Many others have contributed through numerous discussions on entropy, thereby giving me a deeper understanding of the topic.

Thank you all.

End of the presentation.
Thank you for your attention.
Stirling approximation

N        log(N!)      N log(N) - N
100      363.74       360.52
500      2611.3       2607.3
1000     5912         5908
10⁴      82109        82103
10⁵      1051299      1051292
10⁶      12815518     12815510
10⁷      151180965    151180956