STATISTICAL MECHANICS OF COMPLEX SYSTEMS – SOLUTIONS 2010

1. (a) All entropy formulae below have an arbitrary multiplicative constant K setting the units, or alternatively an arbitrary base of the logarithm.

Joint information entropy:
H(X, Y) = −∑_{i,j} p_{ij} log p_{ij}   {2}

Conditional information entropies:
H(X|Y) = ∑_j p^{(Y)}_j H(X|Y = y_j) = −∑_{i,j} p_{ij} log( p_{ij} / p^{(Y)}_j )   {2}
H(Y|X) = ∑_i p^{(X)}_i H(Y|X = x_i) = −∑_{i,j} p_{ij} log( p_{ij} / p^{(X)}_i )   {2}

Mutual information:
I(X; Y) = ∑_{i,j} p_{ij} log( p_{ij} / (p^{(X)}_i p^{(Y)}_j) )   {2}

[Bookwork]

(b) Three independent relations:
H(X|Y) = H(X, Y) − H(Y)
H(Y|X) = H(X, Y) − H(X)
I(X; Y) = H(X) + H(Y) − H(X, Y)
Any set of independent relations is worth {2} per equation.
[Bookwork]

(c) Using the notation {e, o} for even, odd (for V) and {p, n} for prime, non-prime (for W), the joint probabilities are p_{ep} = 1/6, p_{en} = 1/3, p_{op} = 1/3, p_{on} = 1/6.

Using bits as units (i.e. using log₂), the joint entropy:
H(V, W) = −( (1/6) log₂(1/6) + (1/3) log₂(1/3) + (1/3) log₂(1/3) + (1/6) log₂(1/6) )   {1}
        = (1/3) log₂ 6 + (2/3) log₂ 3 = log₂ 3 + 1/3   {1}
        = ln 3 / ln 2 + 1/3 ≈ 1.92 bit

One way to calculate the mutual information is to realise that p_e = p_o = 1/2, so H(V) = −2 · (1/2) log₂(1/2) = 1 bit, and similarly H(W) = 1 bit.   {1}
Then
I(V; W) = H(V) + H(W) − H(V, W) = 1 + 1 − (log₂ 3 + 1/3) = 5/3 − log₂ 3 ≈ 0.08 bit
Alternatively, one can apply the formula in 1(a).
[Unseen]

(d) (i) False. They do not even have the same units (e.g. bit vs bit²)!   {2}
A numerical counterexample (using bits as units): if X and Y are independent coin tosses, H(X) = H(Y) = 1, but H(X, Y) = 2 ≠ 1 · 1.   {2}
(ii) False. It is easily shown that H(X, X) = H(X) (e.g. by applying the definition). Then H(X|X) = H(X, X) − H(X) = 0, so any nontrivial X is a counterexample.   {2}
(iii) True. I(X; X) = H(X) + H(X) − H(X, X) = H(X).   {2}
[Unseen]

2. (a) (i) Interface width:
w(L, t) = √⟨( h(x, t) − h̄(t) )²⟩_x,  where h̄(t) = ⟨h(x, t)⟩_x   {1}
Early times, t ≪ t_×: w(L, t) ∼ t^β, with β the growth exponent.   {2}
Late times, t ≫ t_×: w(L, t) ∼ w_sat(L) ∼ L^α, with α the roughness exponent.   {2}
Crossover time: t_× ∼ L^z, with z the dynamic exponent.   {2}
[Bookwork]

(ii) Family–Vicsek scaling relation:
w(L, t) ∼ L^α f(t/L^z),  with f(u) ∼ u^β for u ≪ 1 and f(u) ∼ const for u ≫ 1.   {2}
[Bookwork]

(b) (i) For t ≪ L^z: w(L, t) ∼ L^α (t/L^z)^β ∼ t^β (using z = α/β).
For t ≫ L^z: w(L, t) ∼ L^α · const ∼ L^α.   {1}
[Bookwork]

(iii) Writing the alternative scaling form as w(L, t) ∼ t^A g(L/t^B): to recover t^β for small t we need g(u) = const for u ≫ 1 (assuming B > 0), which makes w independent of L and sets A = β.   {1}
To obtain L^α for large t, we need g(u) = u^α for u ≪ 1.   {2}
Then for large t, L^α ∼ w ∼ t^β L^α t^{−Bα}, which gives B = β/α = 1/z.   {2}
In summary:
w(L, t) ∼ t^β g(L/t^{1/z}),  with g(u) ∼ u^α for u ≪ 1 and g(u) ∼ const for u ≫ 1.
[Unseen]

(c) (i) Random deposition model: the interface grows on a discretised substrate by accreting squares (hypercubes in general dimension). The blocks arrive above random substrate positions and simply increase the height of the column at that position.   {2}
[Bookwork]
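As an illustration only (not part of the original mark scheme), the deposition rule just described can be checked with a minimal simulation sketch; the substrate size, number of deposited blocks, and sampling interval below are arbitrary assumed values, and the fitted slope should reproduce the growth exponent derived in part (ii).

```python
import numpy as np

# Minimal random-deposition sketch (illustrative, assumed parameters).
rng = np.random.default_rng(0)
S = 200                 # substrate sites (S = L in one dimension)
n_blocks = 200_000      # total number of deposited blocks

h = np.zeros(S)         # column heights
times, widths = [], []

for n in range(1, n_blocks + 1):
    h[rng.integers(S)] += 1                   # one block lands on a random column
    if n % 1000 == 0:
        times.append(n / S)                   # measure time as t = N/S
        widths.append(np.sqrt(np.mean((h - h.mean()) ** 2)))   # interface width w

# Slope of log w versus log t estimates the growth exponent beta (expect ~0.5).
beta_est = np.polyfit(np.log(times), np.log(widths), 1)[0]
print(f"estimated beta = {beta_est:.3f}")
```

Because the columns grow independently in this model, the width does not saturate and the fitted slope should stay close to β = 1/2 throughout the run.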
(ii) Suppose N blocks are deposited on a substrate made of S units. (In one dimension S = L, in two dimensions S = L², etc.) The growth above a given substrate location can be considered as the sum of N i.i.d. Bernoulli processes, each of which increases the height by one unit (Δh = 1) with probability p = 1/S and leaves it unchanged (Δh = 0) with probability 1 − p.
The expectation is ⟨Δh⟩ = p = 1/S, and the variance is Var(Δh) = ⟨(Δh)²⟩ − ⟨Δh⟩² = p(1 − p) = (S − 1)/S², since ⟨(Δh)²⟩ = p.   {3}
The Central Limit Theorem says that the sum of N such processes has mean h̄ = N⟨Δh⟩ = N/S and variance
⟨(h − h̄)²⟩ = N Var(Δh) = N (S − 1)/S² = w²,   {2}
which equals the square of the interface width. Now fix the substrate size S. If time is measured as N (possibly with some proportionality constant, e.g. t = N/S; this does not change the scaling exponent), then
w ∼ √N ∼ √t.   {1}
Using w ∼ t^β, this gives β = 1/2.
[Unseen]

(iii)
∂h/∂t = F + η(x, t),
where η has zero mean and is delta-correlated:
⟨η(x, t)⟩ = 0,
⟨η(x, t) η(x′, t′)⟩ = 2D δ(x − x′) δ(t − t′).   {2}
[Bookwork]

3. (a) The laws of thermodynamics (credit {1} for each):
(0) There exists a relation between thermodynamic systems. This relation is called thermodynamic equilibrium, and it is transitive (an equivalence relation): if A ∼ B and B ∼ C, then A ∼ C. Here A, B and C label different systems. For example, in thermal equilibrium this means a transitive relation between the temperatures of the three systems.
(1) Energy conservation: the total energy of an isolated system is fixed. Thus if during some process a system absorbs heat ΔQ and work ΔW = −p ΔV + … is done on it, then its energy changes by ΔE = ΔQ + ΔW.
(2) In an isolated system the entropy does not decrease. Thus if during some process a system absorbs heat ΔQ, then its entropy changes by ΔS = ΔQ/T + ΔS_internal ≥ ΔQ/T.
(3) The entropy at absolute zero temperature is zero (or can be set to zero).
[Bookwork]

(b) (i) Free energies are the Legendre transforms of the energy.   {2}

(ii) The free energy of a system does not increase, so at stable equilibrium it is minimal.   {2}
An example (canonical ensemble): a system is kept at fixed temperature T while undergoing some change. Its change in energy is ΔE = ΔQ, and its change in entropy is ΔS = ΔQ/T + ΔS_internal ≥ ΔQ/T. The relevant free energy is the Helmholtz free energy, A(T) = E − TS.   {3}
The change in the Helmholtz free energy is ΔA = ΔE − T ΔS ≤ 0, since ΔE = ΔQ while T ΔS ≥ ΔQ.

(iii) The probability of a macroscopic state (a sum of Boltzmann factors) can be expressed by a single Boltzmann factor, in which the energy is replaced by the appropriate free energy.   {1}
Example: in a grand canonical ensemble, the probabilities multiplied by Ξ:
Ξ = ∑_i e^{−β(E_i − μN_i)} = ∑_{N=0}^{∞} e^{βμN} ∑_j e^{−βE_{j;N}} = ∑_{N=0}^{∞} e^{−β(A(T;N) − μN)},   {2}
where the inner sum runs over states with the same N.
[Bookwork]

(c) (i) The partition function:
Z = (1/h²) ∫_{−∞}^{∞} dx ∫_{−∞}^{∞} dy ∫_{−∞}^{∞} dp_x ∫_{−∞}^{∞} dp_y exp[ −β( p_x²/(2m) + p_y²/(2m) + (mg/2ℓ)(x² + y²) ) ]   {2}
  = (1/h²) ∫ dx e^{−β mg x²/(2ℓ)} ∫ dy e^{−β mg y²/(2ℓ)} ∫ dp_x e^{−β p_x²/(2m)} ∫ dp_y e^{−β p_y²/(2m)}
Each position integral gives √(2πℓ/(βmg)) and each momentum integral gives √(2πm/β), so
Z = (1/h²) · (2πℓ/(βmg)) · (2πm/β) = ℓ/(ħ² g β²).   {3}

(ii) Average energy:
⟨E⟩ = −∂(ln Z)/∂β = −∂/∂β ln(1/β²) = 2/β = 2 k_B T   {3}

(iii) Each quadratic half-degree of freedom (e.g. p_x²/(2m) or mg x²/(2ℓ)) contributes (1/2) k_B T to the average energy.   {2}
In this case there are 4 such quadratic contributions, leading to ⟨E⟩ = 2 k_B T.   {1}
[Unseen] in this form, though the harmonic oscillator was covered in lectures.
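As a numerical cross-check only (not part of the original solution), the canonical average energy can be estimated by sampling the Boltzmann distribution directly, since the weight factorises into four independent Gaussians; the values of m, g, ℓ and T below are arbitrary illustrative assumptions, with k_B = 1.

```python
import numpy as np

# Direct sampling of the canonical distribution for
# H = (px^2 + py^2)/(2m) + (m*g/(2*ell))*(x^2 + y^2).
# All parameter values are illustrative assumptions (units with kB = 1).
rng = np.random.default_rng(1)
m, g, ell, kB, T = 1.0, 9.8, 2.0, 1.0, 0.5
beta = 1.0 / (kB * T)
n = 1_000_000

# The Boltzmann weight factorises into four independent Gaussians:
# x, y have variance ell/(beta*m*g); px, py have variance m/beta.
x  = rng.normal(0.0, np.sqrt(ell / (beta * m * g)), n)
y  = rng.normal(0.0, np.sqrt(ell / (beta * m * g)), n)
px = rng.normal(0.0, np.sqrt(m / beta), n)
py = rng.normal(0.0, np.sqrt(m / beta), n)

E = (px**2 + py**2) / (2 * m) + m * g * (x**2 + y**2) / (2 * ell)
print(f"<E> estimate = {E.mean():.4f},  2*kB*T = {2 * kB * T:.4f}")
```

Direct sampling works here only because the Hamiltonian is a sum of independent quadratic terms; for a general potential one would fall back on, e.g., Metropolis Monte Carlo. With 10⁶ samples the printed estimate should agree with 2 k_B T to well within one per cent.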