Entropy according to Boltzmann
Kim Sharp, Department of Biochemistry and Biophysics, University of Pennsylvania, 2016
"The initial state in most cases is bound to be highly improbable and from it the system will always rapidly
approach a more probable state until it finally reaches the most probable state, i.e. that of the heat equilibrium.
If we apply this to the second basic theorem we will be able to identify that quantity which is usually called
entropy with the probability of the particular state." (Boltzmann, 1877) (1)
Heat flows from a hot object to a cold one. Two gases placed into the same container will spontaneously mix.
Stirring mixes two liquids; it does not un-mix them. Mechanical motion is irreversibly dissipated as heat due to
friction. Scientists’ attempts to explain these commonplace observations spurred the development of
thermodynamics and statistical mechanics and led to profound insights about the properties of matter, heat and
time’s arrow. Central to this is the thermodynamic concept of entropy. 150 years ago (in 1865), Clausius
famously said (2)
"The energy of the universe remains constant. The entropy of the universe tends to a maximum."
This is one formulation of the 2nd Law of Thermodynamics. In 1877 Boltzmann for the first time explained what
entropy is and why, according to the 2nd Law of Thermodynamics, entropy increases (3). Boltzmann also
showed that there were three contributions to entropy: from the motion of atoms (heat), from the distribution of
atoms in space (position) (3), and from radiation (photon entropy) (4). These were the only known forms of
entropy until the discovery of black hole entropy by Hawking and Bekenstein in the 1970's. Boltzmann's own
words, quoted above, are as concise and as accurate a statement on the topic as any made in the subsequent century
and a half. Boltzmann's original papers, though almost unread today, are unsurpassed in clarity and depth of
physical insight (see for example references (1, 5–7)). It is likely that many inaccurate and misleading
statements about entropy and the 2nd law of thermodynamics could have been avoided by simply reading or
quoting what Boltzmann actually said, rather than what his commentators or scientific opponents said he said!
How do I love thee? Let me count the ways. - E. B. Browning
Boltzmann's formulation of the 2nd Law can be paraphrased as
"What we observe is that state or change of states that is the most probable, where most probable
means: can can be realized the most ways"
The 2nd Law, once expressed in this way, has the same retrospective obviousness and inevitability that Darwin's
theory of evolution through natural selection and survival of the fittest does. Incidentally, Boltzmann was a
strong admirer of Darwin's work and he correctly characterized the struggle of living things "not for elements
nor energy...rather, it is a struggle for entropy (more accurately: negative entropy)"(8). Boltzmann's statement of
the 2nd Law in terms of probability is not simply a tautology but a statement of great explanatory power. This is
apparent once the exact meaning of the terms observe, state, most probable, and ways is established.
Atoms, Motion and Probability
The fundamental definition of entropy and the explanation of the 2nd Law must both be grounded upon probability, Boltzmann realized, because of two simple considerations: i) Atoms are in constant motion, interacting with other atoms, exchanging energy, changing positions. ii) We can neither observe nor calculate the positions and velocities of every atom. The macroscopic phenomena we seek to explain using the 2nd Law reflect the aggregate behavior of large numbers of atoms that can potentially adopt a combinatorially large
number of configurations and distributions of energies. So we first consider the probability distribution for
atomic positions in the simplest case, in order to define our terms, and appreciate the consequences of averaging
over the stupendously large numbers involved. Then, with just a little more work, the 'rule' for the probability
distribution for the energies of atoms is established.
Distribution in space
To introduce the probability calculations, consider the distribution of identical molecules of an ideal gas between
two equal volumes V, labeled L for left, and R for right (Figure 1). From the postulate of equal a priori probabilities, each molecule has the same probability, ½, of being in either volume: the volumes are the same size, and each molecule is moving rapidly with a velocity determined by the temperature, undergoing elastic collisions with other molecules and the walls which change its direction, but otherwise making no interactions.[1]
So at any instant we might expect an equal number of molecules in each volume. Indeed, if there are two molecules, this is the most likely single 'state distribution', produced by 2 of the 4 arrangements or complexions: LR and RL both belong to the state distribution (NL=1, NR=1).[2] The more extreme state distributions (NL=2, NR=0) and (NL=0, NR=2) are less likely, there being 1 complexion belonging to each: LL and RR. Still, with only two molecules these more unequal state distributions are quite probable, with p = 1/4 each. Distributing 4 molecules, the possibilities are given in Table 1:
Table 1

State distribution   # of complexions   Probability
(NL=4, NR=0)         1                  1/16 = 0.0625
(NL=3, NR=1)         4                  4/16 = 0.25
(NL=2, NR=2)         6                  6/16 = 0.375
(NL=1, NR=3)         4                  4/16 = 0.25
(NL=0, NR=4)         1                  1/16 = 0.0625

Figure 1: Two equal volumes, L (left) and R (right), showing the state distribution (NL=2, NR=2).
where the probabilities are obtained as follows: each molecule independently has a probability of ½ of being in L or R, so the probability of any particular complexion of 4 molecules is ½ × ½ × ½ × ½ = (1/2)^4 = 1/16, and there is one such complexion (LLLL) that gives the distribution (NL=4, NR=0), four (RLLL, LRLL, etc.) that give the distribution (NL=3, NR=1), and so on. There are more ways (complexions) to obtain an equal distribution (6) than any other distribution, while the probability of the more extreme distributions (all molecules in the left or the right volume) diminishes to p = 1/16 each. More generally, for N molecules the total number of complexions is 2^N, and the number of these that have a given number of molecules NL in the first volume and NR = N − NL in the second is given by the binomial coefficient:
W(NL, N) = N! / (NL! NR!)    (1)
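As a quick check, Eq. 1 is easy to evaluate directly. A minimal Python sketch (standard library only; math.comb requires Python 3.8 or later) reproduces the complexion counts and probabilities of Table 1 for N = 4:

    from math import comb

    N = 4                      # total number of molecules
    total = 2**N               # total number of complexions
    for NL in range(N, -1, -1):
        W = comb(N, NL)        # complexions with NL molecules in L (Eq. 1)
        print(f"(NL={NL}, NR={N - NL}): {W} complexions, p = {W / total:.4f}")
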
The function W(NL,N) is peaked with a maximum at NL= NR given by
[1] Tolman (1936) has emphasized that the ultimate justification for the postulate of equal a priori probabilities is the empirical success of the resulting theory.
[2] Here LR means a 'complexion' where the first molecule is in the left hand volume and the second one is in the right hand volume, and so on. The state distribution is defined as the number of molecules in each volume. More generally, the state distribution describes the number of molecules/atoms lying within each small range of spatial positions and kinetic energies, exactly analogous to the partition function of post-quantum statistical mechanics.
Wmax = W(NL=NR, N) = N! / ((N/2)!)^2    (2)
As N increases the peak becomes ever sharper, in other words an increasing fraction of the state distributions
come to resemble ever more closely the most probable one NL=NR . This is illustrated in the 3 panels below for
10 thousand, 1 million and 100 million molecules:
Figure 2
With 10 thousand molecules, we find that the majority of possible state distributions lie within ±1% of the most probable one, NL=NR, although a small but significant fraction lie further away. For 1 million molecules, almost all the possible state distributions (99.92%) lie within ¼ of a percent of NL=NR. For 100 million molecules, most (99%) lie within two hundredths of a percent of NL=NR. In other words, for large numbers of molecules almost every state distribution is indistinguishable from the most probable one, NL=NR, the state distribution that has the most complexions, that can occur in the most ways.
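This sharpening is straightforward to check numerically. A minimal sketch using the normal approximation to the binomial distribution (mean N/2, standard deviation √N/2, which is excellent at these values of N); the precise percentages depend on exactly how the width is measured, but the trend, the fraction approaching 100% as N grows for any fixed fractional width, is robust:

    from math import erf, sqrt

    def fraction_within(N, f):
        """Fraction of all complexions whose state distribution satisfies
        |NL - N/2| <= f*(N/2), using the normal approximation."""
        sigma = sqrt(N) / 2            # standard deviation of NL for p = 1/2
        z = (f * N / 2) / sigma        # half-width in units of sigma
        return erf(z / sqrt(2))

    for N, f in [(10_000, 0.01), (1_000_000, 0.0025), (100_000_000, 0.0002)]:
        pct = 100 * fraction_within(N, f)
        print(f"N = {N:>11,}: {pct:.2f}% within ±{100 * f:g}% of NL = NR")
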
Figure 3
Figure 3 shows one computer-generated distribution for each of six values of N, starting with 100 and increasing by a factor of 10 each time up to 10 million molecules. Already for N=100,000 molecules, the total number of possible complexions is stupendous, greater than 1 followed by 30,000 zeroes (!), yet the single sample belongs to a state distribution that differs negligibly from NL=NR, exactly as one expects from the narrowing peak in Fig. 2. For the larger values of N, there are even more possible complexions, on the order of 10^(0.3N), yet the single sample belongs to a state distribution even more closely resembling the most probable one. The point of this example is twofold: 1) Even picking just one complexion, it is overwhelmingly likely that it corresponds to a state distribution with NL≈NR. In no way is it necessary to sample all the complexions (the so-called ergodic hypothesis), or even any significant fraction of them. 2) As the specific complexion changes due to thermal motion (as molecules cross between L and R), the gas is overwhelmingly likely to stay in state distributions where NL≈NR if it is already there; or, if it has a complexion where L and R are very different, it is overwhelmingly likely to move to complexions belonging to less unequal state distributions, because there are so many more of the latter than the former.
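Picking a single random complexion is a one-line experiment. The sketch below assumes numpy is available; drawing NL from a binomial distribution with p = ½ is equivalent to tossing a fair coin for each of the N molecules:

    import numpy as np

    rng = np.random.default_rng()
    for N in [100, 1_000, 10_000, 100_000, 1_000_000, 10_000_000]:
        NL = rng.binomial(N, 0.5)      # one randomly chosen complexion
        print(f"N = {N:>10,}: NL/N = {NL / N:.5f}")   # approaches 0.50000 as N grows
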
Figure 4
This relaxation of a system towards equilibrium can also be illustrated by a 'computer experiment' for our simple case of molecules in a volume divided into left and right halves. Initially, all N molecules are in the left half, confined by a partition; then the partition is removed. This is now a very unlikely state distribution since, at our coarse level of description, it contains only one complexion (all N molecules in L, none in R). A molecule is picked at random and moved to the other half; this represents a small amount of elapsed time during which one molecule by chance crosses the dividing line between left and right (in either direction). The process of picking a molecule at random and moving it to the other half is repeated. Figure 4 illustrates how the percentage of molecules in L changes over 'time'. Starting with 90 molecules (100%) in L, the percentage rapidly decreases at first, approaches 50%, and then fluctuates around that value for the rest of the experiment (black curve). The fluctuations are sizable since the number of molecules is rather small. Although the majority of complexions are fairly close to NL=NR, they belong to a rather broad set of state distributions. For N=5000 molecules (red curve), more crossovers were sampled (albeit representing smaller time increments between sampling), and the results were plotted after every 20th sample. Here the excess fraction in L decays in an almost deterministic fashion to an 'equilibrium' state indistinguishable from NL=NR. The fluctuations are very small, as almost all of the complexions are close to NL=NR.
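A minimal sketch of this computer experiment (standard library only; how often the trace is recorded is a free choice for plotting):

    import random

    def relax(N, n_crossovers):
        """Start with all N molecules in L. Each step, pick a molecule at
        random and move it to the other half. Return the % in L over 'time'."""
        in_L = [True] * N
        NL = N
        history = [100.0]
        for _ in range(n_crossovers):
            i = random.randrange(N)        # a random molecule crosses over
            NL += -1 if in_L[i] else 1
            in_L[i] = not in_L[i]
            history.append(100.0 * NL / N)
        return history

    trace = relax(90, 1000)                  # cf. the black curve (N = 90)
    print([round(x) for x in trace[::100]])  # rapid decay to ~50%, then fluctuations
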
The approach to equilibrium in this experiment is straightforward to understand. Early in the process, when the number of molecules in L greatly exceeds the number in R, the probability that the next molecule to cross over will come from L is higher than from R. Thus it is extremely likely that over the course of several steps the number in L will decrease, and we will move towards the most probable state distributions. Moreover, for large N, far from equilibrium the decay is almost deterministic because in a sequence of 1 << n << N crossovers, the fractions of the n crossovers coming from L or R will be close to the total fractions of the molecules in L or R, namely NL/N or NR/N, respectively. So in a sequence of n crossovers, the decrease in the excess number of molecules in L, Δ(NL−NR), is approximately proportional to −n(NL−NR)/N, which is the formula for exponential decay. In contrast, once having reached an equilibrium of NL≈NR, to move away significantly requires that a large excess of molecules cross over in one direction (say from L to R). This is unlikely in the same way that it is unlikely to get, say, 20 heads in 25 coin tosses.
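Keeping track of the constant of proportionality makes the exponential decay explicit. Each crossover changes NL − NR by ±2, with probabilities NL/N and NR/N respectively, so the expected change per crossover is −2(NL − NR)/N. Treating the number of crossovers as a continuous 'time' τ,

    d(NL − NR)/dτ = −2(NL − NR)/N,   giving   (NL − NR)(τ) ≈ N e^(−2τ/N)

when starting from all molecules in L: an exponential decay with a 'time constant' of N/2 crossovers.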
The key point is that macroscopic quantities like the density (NL/V and NR/V in our example), that are the subject
of thermodynamics, result from the aggregate behavior of large numbers of molecules. Even the figure of 10
million in Fig. 3 is not particularly large compared to the number of atoms and number of atomic configurations
encountered in most applications of thermodynamics to chemistry and biology: Just 1 nanomole contains
1×10^−9 NA = 6.02×10^14 atoms or molecules. Because of the combinatorially large (supra-astronomical) numbers involved, their aggregate behavior is highly predictable using Boltzmann's statistical mechanical principle of determining which state, or change of states, can be realized the most ways.
Relationship to entropy
The distribution of molecules into two equal sub-volumes, labeled here left (L) and right (R), and the determination of the state distribution NL=NR with the most complexions together form a very simple example of the general statistical mechanics approach first developed by Boltzmann (3). Boltzmann called the logarithm of the number
of complexions Pi belonging to a particular state distribution i its permutability measure[3], defined as Ωi = ln Pi,
and he postulated that thermodynamic equilibrium is achieved when the system has the state distribution with
the most possible complexions, Pmax, as this is the most probable macro-state. Conversely, if the system is not at
equilibrium, it has a state distribution with a much smaller number of complexions, and with high probability it
will evolve through successive state distributions with increasing numbers of complexions, until it reaches
equilibrium, as illustrated in our simple example. It was then a short step for Boltzmann to find the exact
relationship between the number of complexions Pi of a state distribution and the existing thermodynamic
definition of entropy due to Clausius:
Si = k ln Pi    (3)
where k is a constant, now named after Boltzmann.[4] Equilibrium is then achieved when the entropy is a maximum, S = k ln Pmax. Clearly the entropy defined by Eq. 3 is equally applicable to non-equilibrium states. At this point we can already apply Boltzmann's principle to a number of simple thermodynamic scenarios where the temperature is not changed.
Ideal gas expansion and mixing
N molecules of an ideal gas are contained in a volume V1 separated by a partition from an empty volume of equal
size. The partition is removed and the gas expands to volume V2. The number of complexions (positions in
space) available to each gas molecule is to a good approximation proportional to the volume available to it, all
parts of the container being indistinguishable. Since the molecules do not interact except through random elastic
collisions, their distributions are independent. Applying Eq. 3 the entropy change is then
ΔS = k ln(V2^N / V1^N) = Nk ln(V2/V1)    (4)
Equation 4 is also applicable to expansion of an ideal gas by withdrawing a piston, providing this is done slowly
enough that the temperature of the gas remains the same throughout.
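For a concrete number, consider doubling the volume (V2 = 2V1) available to one mole of gas: Eq. 4 gives ΔS = NAk ln 2 = R ln 2 ≈ 5.76 J/K, the familiar entropy of expansion.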
If we have two different gases, N1 molecules of one gas in volume V1 and N2 molecules of another gas in volume V2, and remove the partition between them, each gas expands into the new volume V = V1 + V2, with entropy change
ΔS = N1k ln(V/V1) + N2k ln(V/V2)    (5)
The entropy change comes not from the mixing per se (9), but from the expansion of each gas into a larger volume, permitting more complexions for each gas.[5] Lastly, we can see that the probability of a particular molecule being in either of two equal-volume partitions is 50:50 however we divide the volume up: into upper and lower halves, back and front halves, indeed into a checkerboard pattern of equal cubes, etc. From this we conclude that the maximum entropy, equilibrium state is one of uniform density. It is not physically impossible that there will be significant fluctuations in density, or even that the N gas molecules will spontaneously go back into their original volume V1 without the partition, but it is effectively 'statistically' impossible.
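As a worked case of Eq. 5: for equal initial amounts and volumes (N1 = N2 = N, V1 = V2, so V = 2V1), ΔS = Nk ln 2 + Nk ln 2 = 2Nk ln 2, exactly the sum of two independent doubling expansions of the kind described by Eq. 4.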
[3] Permutabilitätsmaß
[4] Boltzmann in his 1877 paper used the mean kinetic energy per atom (using the ambiguous symbol T, replaced here by Ê) rather than the temperature, so Ê ≡ (3/2)kT. He wrote Clausius' expression for entropy as δS = δQ/Ê, so entropy was dimensionless, and S = (2/3)Ω + constant = (2/3) ln Pmax + constant. With this convention Boltzmann's constant is dimensionless, with value 2/3 (!). There is also an additive constant which plays no role in changes in entropy, and which has been omitted from Eq. 3 onwards.
[5] Note that the concentration-dependent part of the entropy of an ideal solute in a solvent is calculated in exactly the same way as for a gas at the same concentration. See Ben-Naim, 'Solvation Thermodynamics'.
Distribution of kinetic energy
Consider now the distribution of kinetic energy (heat) among atoms. Because atoms are in constant motion,
interacting with each other through collisions and other inter-atomic forces, they exchange energy. While atoms
have a fixed identity, so that when specifying complexions it is meaningful to speak of a specific atom being first
in one position and then in another, energy has no fixed identity; what is distinct is only how much of it each
atom has. The procedure for enumerating the number of complexions for each state distribution is thus different,
but again it can be illustrated with a simple random sampling experiment. There are 10 cups labeled A through J.
In each cup there are exactly 3 coins. The cups represent atoms, and the number of coins in the cup represent the
amount of kinetic energy the atom has. So the initial state distribution is one where the energy is spread out
uniformly and each 'atom' has exactly the average amount of 'kinetic energy'. For each cup, we pick a random
integer M from 0 up to the number of coins in the cup. We then select one of the other 9 cups at random and
move M coins from the 1st cup to the 2nd cup. This is done for each cup in turn, constituting one round of
'energy' exchange between 'atoms' through 'random collisions and interactions'. Multiple rounds of coin (energy)
exchanges are performed. The exact procedure for exchanging energy is not important providing that multiple
applications generate sufficiently random samples of possible distributions. The table below shows the results of several such exchanges:
Table 2

Atom              A   B   C   D   E   F   G   H   I   J
Starting energy   3   3   3   3   3   3   3   3   3   3
Exchange i        0   3  12   1   1   0   5   3   4   1
Exchange ii       4   0   8   2   0   5   6   2   2   1
...
Exchange ix       1   0   2   4   0   2   1   5   9   6
After each exchange we can determine the distribution of energies by tabulating the number of atoms with 0 units of energy, 1 unit of energy, etc. Initially this state distribution is mono-disperse: all atoms have exactly 3 units (green bar in Figure 5). This state distribution soon changes, as the table illustrates. After 10 exchanges the average state distribution is given by the red bars: now the energy is not evenly distributed. Most atoms have a small amount of energy, fewer have a larger amount. In other words, on average, the probability of an atom having a certain amount of energy decreases with increasing energy. There are simply more ways (complexions) to distribute the energy in this way than either to spread it uniformly among the atoms (one extreme state distribution, green bar) or to allocate all 30 'units' to one atom (the other extreme state distribution, blue bars).
Figure 5
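The coin-and-cup experiment takes only a few lines to re-create. A minimal sketch (standard library only), implementing the exchange rule exactly as described above:

    import random
    from collections import Counter

    def exchange_round(cups):
        """One round: each cup in turn gives a random number of its coins
        (from 0 up to its current count) to another randomly chosen cup."""
        n = len(cups)
        for i in range(n):
            M = random.randint(0, cups[i])                  # coins to move
            j = random.choice([c for c in range(n) if c != i])
            cups[i] -= M
            cups[j] += M

    cups = [3] * 10                # 10 'atoms' with 3 'units of energy' each
    for _ in range(10):
        exchange_round(cups)
    print(cups)                            # one sampled complexion
    print(sorted(Counter(cups).items()))   # atoms with 0, 1, 2, ... units
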
If one does the counting of complexions more exactly, using a larger number of atoms, and allowing the energy
to be divided much more finely, as Boltzmann did (3), one finds that the state distribution for energies that has
the largest permutability measure is the now famous Boltzmann distribution:
p(Ek) ∝ exp(−3Ek/2Ê) ≡ exp(−Ek/kT)    (6)
where p(Ek) is the probability that the kth atom has kinetic energy Ek, and Ê is the average kinetic energy of the atoms. The first expression is written the way Boltzmann originally gave it. Recognizing that Ê = (3/2)kT, where T is the temperature and k is Boltzmann's constant, we have the modern form of the distribution. In some ways the older form is preferable, because it emphasizes that the energy is relative to the average energy of all the atoms which are exchanging energy. Temperature as a thermodynamic concept is much easier to understand if one remembers what it really is: a measure of the average kinetic energy of the atoms, where the 'constant' k is simply a conversion factor from whatever temperature units are used to whatever energy units are used for Ek.[6]
Note that Boltzmann-like state distributions already develop in the small-sample coin-cup model because they encompass so many more complexions; the Boltzmann distribution approaches law-like status when thermodynamically large numbers of complexions are involved. A simply overwhelming number of all possible energy complexions belong to state distributions that follow the Boltzmann distribution so closely as to be indistinguishable from it. Therefore the Boltzmann distribution is the maximum entropy distribution, and the one found at equilibrium.
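To get a feel for Eq. 6, the relative probabilities (Boltzmann factors) at a few energies can be tabulated directly; measuring energy in units of kT makes the normalization constant irrelevant to the ratios:

    from math import exp

    for E_over_kT in [0, 1, 2, 3, 5]:
        # relative probability of an atom having kinetic energy E (Eq. 6)
        print(f"E = {E_over_kT} kT: relative p = {exp(-E_over_kT):.4f}")
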
Equation 6 shows how the energy is distributed when N atoms are sharing a total amount NÊ of kinetic energy.
Boltzmann also determined how the number of complexions of energy P for the state distribution given by Eq. 6
changes when an additional small amount of kinetic energy (heat) is added (δq >0) or removed (δq <0). If the
amount of heat is small compared to NÊ, then
ln(P2/P1) = (3/2) δq/Ê ≡ δq/kT = δS/k    (7)
where the last equality uses Clausius' thermodynamic definition of an entropy change δS = δq/T. Eq. 7 relates
the change in number of complexions, or change in permutability measure to the change in entropy due to added
heat. To give some feeling for Eqs. 3 and 7, consider adding a small amount of heat equal to the average kinetic
energy of a single atom δq = Ê to a system at equilibrium. Then the number of complexions of the most probable
state distribution increases by a factor of
P2/P1 = e^(3/2) ≈ 4.5    (8)
Equations 7 and 8 apply when the amount of heat added is small enough that the change in average kinetic
energy of the atoms (and thus δT) is negligible. For larger amounts of heat one sums up the effects of many small
additions (i.e. integrates):
ΔS = ∫ δq/T = ∫_{T1}^{T2} Cv δT/T    (9)
[6] See also the note to Eq. 3, where k = 2/3.
where Cv = δq/δT is the heat capacity.
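For example, if Cv is constant over the temperature range, Eq. 9 integrates to ΔS = Cv ln(T2/T1); doubling the temperature of a monatomic ideal gas, for which Cv = (3/2)Nk, increases its entropy by (3/2)Nk ln 2 ≈ 1.04 Nk.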
In summary Eq. 7 is a prescription for relating the amount of heat (kinetic energy) added to or removed from a
system, the average kinetic energy per atom (i.e. the temperature) of the system, and the change in the number of
ways of distributing that energy.
Note that it is not the absolute amount of heat added, but the amount relative to the average kinetic energy that
matters for the entropy change. In other words, when there is only a small amount of kinetic energy to start with,
the number of complexions is low, and adding a certain amount of heat has a large effect on the relative number
of complexions. Conversely, if there is already a large amount of kinetic energy, the number of complexions is
already large, and adding the same amount of heat will produce a proportionally smaller increase in
complexions.
Why heat flows from hot to cold
Using Eq. 7 it is now possible to explain why heat flows from high to low temperature. Consider two separate
bodies A and B at equilibrium with temperatures TA and TB and with numbers of complexions PA1 and PB1
respectively. The total number of complexions is P1 = PA1 PB1. If the two bodies are brought into contact and an
amount of heat δq leaves the first body and enters the second, then the number of complexions for each are now
PA2 = PA1 exp(−δq/kTA)  and  PB2 = PB1 exp(+δq/kTB)    (10)
and the total number of complexions after the heat transfer is
PA2 PB2 = PA1 PB1 exp[(δq/k)(1/TB − 1/TA)]    (11)
We see that the number of complexions will increase only if the exponent is positive, that is, when TA>TB, i.e. when heat
flows from the body which is hotter to the one which is colder. When TA=TB the change in number of
complexions is zero, in other words the number of complexions will be a maximum when the two bodies are at
the same temperature. Heat is equally likely to flow from A to B as from B to A when they are at the same
temperature, which is the condition for equilibrium.
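To get a feel for the numbers in Eq. 11, the sketch below evaluates the exponent for an atomic-scale and a macroscopic amount of transferred heat (the temperatures and heat values are arbitrary illustrative choices):

    from math import exp, log

    k = 1.380649e-23          # Boltzmann's constant, J/K
    TA, TB = 400.0, 300.0     # hot body A, cold body B

    # heat comparable to one atom's kinetic energy
    dq = 1.5 * k * 350.0      # J
    expo = (dq / k) * (1 / TB - 1 / TA)
    print(f"dq = {dq:.2e} J: exponent = {expo:.4f}, P2/P1 = {exp(expo):.4f}")

    # a macroscopic joule of heat: P2/P1 = e^exponent is far too large to
    # evaluate directly, so report it as a power of 10 instead
    expo = (1.0 / k) * (1 / TB - 1 / TA)
    print(f"dq = 1 J: exponent = {expo:.3e}, P2/P1 = 10^{expo / log(10):.3e}")

Reversing the direction of flow flips the sign of the exponent: the same transfer from cold to hot would divide the number of complexions by the same stupendous factor.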
Including potential energy, or 'Free energy: It's all entropy, all the time'
In describing the spatial contribution to entropy, above, the probabilities of finding a given particle in two
regions of space of equal volume were taken to be equal. This is the postulate of equal a priori probabilities. The
equivalent postulate applies to each distinguishable distribution of kinetic energies consistent with a constant
total energy. When there are forces acting on the particles then there are two contributions to the total energy,
kinetic energy and potential energy, and by the first law of thermodynamics the sum is constant. The total
number of complexions is a product of the number of spatial and kinetic complexions, but these are not
independent since the requirement for conservation of energy means that kinetic energy is converted to potential
energy or vice versa whenever a particle moves between states of different potential energy.
Consider a particle which moves from one region of space, a, to another, b, with a potential energy that is higher
by δU. Since energy is conserved, the total kinetic energy decreases by the same amount. In other words an
amount δq = -δU of heat disappears. If δq is small compared to the total kinetic energy, the change in entropy is
δq/T = -δU/T. The change in total entropy is then the sum of this kinetic part and that arising from any change in
spatial complexions when the particle moves from a to b, δSspatial. So then
δStot = δSspatial − δU/T    (12)
Multiplying by -T, this can be written as
−TδStot = δU − TδSspatial = δA    (13)
where A is known as the Helmholtz free energy. If the system is at a constant pressure and there is a volume
change δV associated with the movement from a to b, then for energy to be conserved the amount of heat lost
must be greater by the amount of work done, δq = −δU − PδV, and the total entropy change is now
δStot = δSspatial − δU/T − PδV/T    (14)
Multiplying by -T, this can be written as
−TδStot = (δU + PδV) − TδSspatial = δH − TδSspatial = δG    (15)
where G is the Gibbs free energy. In other words the change in free energy is just another name for the change in
total entropy (times -T), arising from what we might describe as the intrinsic change in the entropy of the
particle or system of interest, δSspatial, plus the change in entropy of the system plus environment (universe) due to
the gain or loss of heat.
Entropy and time's arrow
In this way we can see that the tendency for a system to move downhill in potential energy is just another example of the second Law, because some of the potential energy is converted to heat (through friction, which is always present). Thus the system moves to a more probable state, in the sense that it can be realized by a greater combination of spatial and kinetic energy complexions.[7] This imparts a direction in time (the Arrow of Time) to mechanical motion and many other events, even though the underlying laws of physics are time reversible.
Directionality appears to come from moving 'downhill' in potential energy, but in fact it comes from moving
'uphill' in entropy.
The second Law as the increase in disorder or as the tendency of energy to spread?
An increase in entropy is often referred to as an increase in disorder. Another widely promoted formulation of the second law is "Energy of all types changes from being localized to becoming more spread out, dispersed in space" (Frank Lambert, http://entropysite.oxy.edu). Here I have avoided both descriptions as unnecessary and potentially misleading: the first because the term disorder is contentious, and in the end can only be defined rigorously in terms of the very entropy it is supposed to 'explain'; the second because it does not advance us any further than the statement that "heat flows from a hot body to a cold one." It does not explain why. That is best answered by Boltzmann's original explanation: the system will "...approach a more probable state."
[7] Boltzmann said of the mechanical motion of a body that "molar motion is known as heat of infinite temperature", meaning it can always lose, via friction, its kinetic energy to the surroundings as heat, whatever the temperature of the latter.
References
1. Sharp KA, Matschinsky F (2015) Translation of Ludwig Boltzmann's paper "On the Relationship between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations Regarding the Conditions for Thermal Equilibrium" (Wien. Ber. 1877, 76:373–435). Entropy 17:1971–2009.
2. Clausius R (1865) On several convenient forms of the fundamental equations of the mechanical theory of heat. Ann Phys 125:353.
3. Boltzmann L (1877) Über die Beziehung zwischen dem zweiten Hauptsatze der mechanischen Wärmetheorie und der Wahrscheinlichkeitsrechnung respektive den Sätzen über das Wärmegleichgewicht. Wiener Berichte 76:373–435.
4. Boltzmann L (1884) Ableitung des Stefan'schen Gesetzes, betreffend die Abhängigkeit der Wärmestrahlung von der Temperatur aus der elektromagnetischen Lichttheorie. Wiedemanns Annalen 22:291–294.
5. Gallavotti G (1999) Statistical Mechanics: A Short Treatise (Springer, Berlin).
6. Lebowitz JL (1993) Boltzmann's entropy and time's arrow. Phys Today (September):32–38.
7. Swendsen RH (2006) Statistical mechanics of colloids and Boltzmann's definition of the entropy. Am J Phys 74:187–190.
8. Boltzmann L, The second law of the mechanical theory of heat. In Populäre Schriften (J. Barth, Leipzig).
9. Ben-Naim A (1987) Is mixing a thermodynamic process? Am J Phys 55:725–733.