Chemistry 360
Spring 2015
Dr. Jean M. Standard
February 18, 2015
The Statistical Definition of Entropy
Probability Considerations
Suppose we have a container of volume V filled with ideal gas molecules, as illustrated in Figure 1. We know that
the molecules will be moving around in the container, and they tend to spread out and fill the entire container.
Figure 1. A container filled with ideal gas molecules.
Now, consider just a single ideal gas molecule in the same container, as shown in Figure 2. This molecule again
will be moving throughout the container.
Figure 2. A single ideal gas molecule in a container with volume V. The dashed line is not part of the
container but indicates the division between the left and right halves of the container.
Since there is nothing to bias the ideal gas molecule to one side of the container or the other, the probability P of
finding the ideal gas molecule in the left half (or right half) of the container is
P = 1/2 .   (1)
Next, consider two ideal gas molecules in the same container, as shown in Figure 3.
Figure 3. Two ideal gas molecules in a container with volume V. In this case, both are found in the left
half of the container. The dashed line is not part of the container but indicates the division between the
left and right halves of the container.
The probability P of finding both ideal gas molecules in the left half (or both in the right half) of the container is
P = (1/2)(1/2) = 1/4 .   (2)
Since the ideal gas molecules are non-interacting, the probabilities are independent of one another, and therefore the
probability of finding both molecules in one half of the container is just a product of the probabilities for the
individual molecules, as shown in Eq. (2).
Now consider the original container of N ideal gas molecules shown in Figure 1. Analogous to the previous cases,
we can determine the probability P that all N molecules will be found in the left half (or right half) of the container,
as illustrated in Figure 4.
Figure 4. N ideal gas molecules in a container with volume V. All N molecules are located in the left
half of the container in this case.
Once again, since the ideal gas molecules are non-interacting, the probability of finding all N molecules on the left
side of the container is just the product of the probabilities for each individual molecule,
P = (1/2)(1/2) … (1/2)   (N factors)

P = (1/2)^N .   (3)
Table 1 gives some examples of the calculated probability P of finding all the molecules in one half of the container
using Eq. (3) for different numbers of molecules, N.
Table 1. Probability P for finding N ideal gas molecules in one half of the container.
N           P
1           0.5
2           0.25
3           0.125
4           0.0625
5           0.03125
10          0.00098
100         7.9×10^–31
1000        9.3×10^–302
6×10^23     ~0
Notice that the probability of finding all the molecules in one half of the container decreases rapidly with N. With
10 molecules, the probability is about 10^–3, but with only 100 molecules the probability has dropped to about
10^–31. By 1000 molecules, the probability is ~10^–301, so for a macroscopically sized system with Avogadro's
number of particles, the probability of finding them all in one half of the container is effectively 0.
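Eq. (3) is simple to evaluate directly; the short Python sketch below (an addition for illustration, not part of the original handout) reproduces the entries of Table 1.

```python
# A minimal sketch: evaluate Eq. (3), P = (1/2)^N, for the values
# of N listed in Table 1.
for N in [1, 2, 3, 4, 5, 10, 100, 1000]:
    P = 0.5 ** N
    print(f"N = {N:>4d}   P = {P:.2g}")

# For N near Avogadro's number (~6e23), (1/2)^N underflows double-precision
# floating point entirely -- effectively 0, as the last row of Table 1 shows.
```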
The rapid decrease in probability as N increases helps explain statistically why a gas expands to fill the entire
available volume. Consider starting with a gas compressed by a piston into half the available volume, V/2, as shown
in Figure 5a.
Figure 5 (a). Ideal gas molecules compressed by a piston into half the available volume of a container,
V/2. (b) The piston is instantaneously withdrawn to allow the ideal gas molecules access to the entire
volume V of the container.
The piston is instantaneously withdrawn to allow the gas molecules access to the entire volume V of the container.
As would be expected, the gas molecules rapidly expand to fill the entire container. Why do they not remain in the
left half of the container when the piston is withdrawn? In part, it is because the situation is such a low probability
event, as demonstrated in Table 1. The gas expands to fill the container; in other words, the gas seeks a more
probable arrangement.
Spontaneous Processes
Drawing some conclusions based upon the findings for ideal gas molecules and probability, a spontaneous process
can be defined. A spontaneous process is one that moves from a state of lower probability to a state of higher
probability.
Some familiar examples of spontaneous processes include:
• a gas expands to fill the available volume.
• a sugar cube dissolves and is distributed throughout a cup of coffee.
• a dropped drinking glass hits the ground and shatters into a myriad of pieces.
• an ice cube melts when placed in a bowl of warm water.
• a bouncing rubber ball gradually bounces lower and lower until it finally stops.
In none of the examples of spontaneous processes does the reverse process also happen spontaneously. That is, gas
molecules do not spontaneously move into one small region of a container, sugar that is dispersed in a cup of coffee
does not spontaneously arrange back into a sugar cube, a drinking glass does not reassemble spontaneously from
broken shards, an ice cube does not reform in a bowl of water, and a rubber ball at rest does not spontaneously start
bouncing again. There is one direction of the process that occurs spontaneously, and one direction that never occurs
spontaneously.
Extensive Variables and Entropy
In order to develop a thermodynamic variable that can provide information about spontaneity (this will be known
as entropy), we should first consider the probability, since the ideal gas example demonstrated that spontaneous
processes always move from lower to higher probability states.
All of the energy functions in thermodynamics are extensive variables. These functions include the internal energy
U, the enthalpy H, the Helmholtz energy A, and the Gibbs free energy G. While the function that will be developed
to provide a measure of spontaneity will not have units of energy, the behavior of the function will be similar to the
energy functions; therefore, we seek an extensive variable as a measure of spontaneity.
It is easy to show that the probability P is not an extensive variable. For example, if we consider the ideal gas
molecule system studied earlier, increasing the size of the system from N=1 to N=3 molecules, the probability of
finding the molecules in one half of the container goes from 1/2 to (1/2)^3. For an extensive variable, increasing the
size of the system by a factor of 3 should lead to an increase in the variable by a factor of 3; thus, probability is not
extensive.
While the probability itself is not an extensive variable, a function of the probability may be extensive; therefore, let
us consider the natural log of the probability. For the previous example, when increasing the size of the system from
N=1 to N=3 molecules, the natural log of the probability of finding the molecules in one half of the container goes
from ln(1/2) to ln[(1/2)^3]. Using the properties of logarithms, the N=3 result becomes 3 ln(1/2), which is triple the
N=1 result. Therefore, the natural log of the probability is an extensive variable because it scales linearly with the
size of the system.
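This scaling is easy to verify numerically. The short check below (an illustration of the argument above, not part of the original handout) confirms that tripling the system size triples ln P but not P itself.

```python
from math import log, isclose

# ln P is extensive: tripling N triples ln P ...
ln_P1 = log(1/2)            # N = 1: ln(1/2)
ln_P3 = log((1/2) ** 3)     # N = 3: ln[(1/2)^3]
assert isclose(ln_P3, 3 * ln_P1)

# ... while P itself is not: (1/2)^3 is the cube, not the triple, of 1/2.
assert not isclose((1/2) ** 3, 3 * (1/2))
print("ln P scales linearly with N; P does not.")
```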
The Statistical Definition of Entropy
Ludwig Boltzmann (1844–1906) first developed the statistical definition of this new extensive thermodynamic
variable, called entropy and denoted S. The equation he developed for entropy is engraved on his grave stone,
Figure 6.
Figure 6. Bust of Ludwig Boltzmann with grave marker (from T. L. Brown, H. E. LeMay, Jr., and B. E.
Bursten, Chemistry The Central Science, 8th ed., Prentice Hall, Upper Saddle River, NJ, 2000).
In modern notation, the statistical (or Boltzmann) definition of the entropy S is
S = kB ln W .   (4)
In this equation, kB is the Boltzmann constant (sometimes denoted k, equal to 1.381×10^–23 J/K), ln corresponds to
the natural log, and W is the probability. Because kB carries units of J/K, so does the entropy S. (Note that molar
entropy, Sm, will therefore have units of J/(mol·K).)
More specifically, in the Boltzmann definition of the entropy, the probability W is related to the number of
accessible energy levels for the system. Thus, when we discuss a spontaneous process as one in which the system
moves from a state of lower probability to a state of higher probability, another way of stating this is that the system
moves from a state with fewer accessible energy levels to a state with a larger number of accessible energy levels.
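To put numbers on Eq. (4), here is a small illustrative calculation (the choice of W values is a hypothetical example, not from the handout): because S depends logarithmically on W, doubling the number of accessible states adds only kB ln 2 to the entropy.

```python
from math import log

kB = 1.381e-23  # Boltzmann constant, J/K

def entropy(W):
    """Statistical entropy from Eq. (4): S = kB ln W."""
    return kB * log(W)

# Doubling the number of accessible states (W2 = 2*W1) raises the entropy
# by kB*ln(2), regardless of the value of W1 itself (hypothetical values).
dS = entropy(2.0e6) - entropy(1.0e6)
print(f"dS = {dS:.3e} J/K")   # kB*ln(2) ~ 9.57e-24 J/K
```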
Example Statistical Entropy Calculation: Ideal Gas Expansion
As a first example application of the statistical definition of entropy, consider an isothermal expansion of an ideal
gas from volume V1 to V2 . We want to use the statistical definition of the entropy, Eq. (4), to determine the change
in entropy that occurs as a result of the isothermal ideal gas expansion.
Applying Eq. (4), the entropy S1 of the initial state is

S1 = kB ln W1 .   (5)

Similarly, the entropy S2 of the final state is

S2 = kB ln W2 .   (6)
Here, W1 represents the probability of finding N particles in a volume V1 out of a total volume VT , and W2
represents the probability of finding N particles in a volume V2 out of a total volume VT . Thinking back to the
probability examples discussed earlier, it is easy to see that these probabilities should be
W1 = (V1/VT)^N ,   (7)

and

W2 = (V2/VT)^N .   (8)
Substituting, the entropies of the initial and final states are
S1 = kB ln[(V1/VT)^N] = N kB ln(V1/VT) ,   (9)

and

S2 = kB ln[(V2/VT)^N] = N kB ln(V2/VT) .   (10)
Using these relations, the entropy change ΔS for the isothermal expansion of an ideal gas from volume V1 to V2 is
ΔS = S2 − S1
   = N kB ln(V2/VT) − N kB ln(V1/VT)
   = N kB ln[(V2/VT)/(V1/VT)]

ΔS = N kB ln(V2/V1) .   (11)
Finally, we can use the relation between the gas constant R, Avogadro's number NA, and the Boltzmann constant
kB,

R = NA kB .   (12)

Therefore, the factor N kB in Eq. (11) may be rewritten as

N kB = (N/NA) R = nR ,   (13)
where n is the number of moles (= N/NA). Substituting, the final expression for the change in entropy for an
isothermal expansion of an ideal gas from volume V1 to V2 is

ΔS = nR ln(V2/V1) .   (14)
Other than using the statistical definition of entropy, the only assumption we have made in obtaining this result is
that the entropy change depends only on the initial and final states of the system. That is, we have assumed that S is
a state function; we will prove this in class in a future lecture.
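As a quick numerical check of Eq. (14) (a worked example added here, with assumed values), one mole of ideal gas doubling its volume isothermally gains ΔS = R ln 2 ≈ +5.76 J/K:

```python
from math import log

R = 8.314  # gas constant, J/(mol*K)

def delta_S(n, V1, V2):
    """Eq. (14): entropy change for an isothermal ideal-gas expansion."""
    return n * R * log(V2 / V1)

# One mole of ideal gas doubling its volume: dS = R*ln(2) ~ +5.76 J/K.
print(f"dS = {delta_S(1.0, 1.0, 2.0):.2f} J/K")
```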