Extended Entropies and Disorder
Matt Davison¹
Department of Applied Mathematics, The University of Western Ontario, London, Canada N6A 5B7
J.S. Shiner²
The Shiner Group, Bergacher 3, CH-3325 Hettiswil, Switzerland
P.T. Landsberg³
Faculty of Mathematical Studies, University of Southampton, Southampton, UK
The concept of disorder, originally defined for the Shannon entropy, is generalized to the extended entropies of Rényi and Tsallis, as well as to the recently introduced "U" entropies. An important result is an intimate relation between the disorders corresponding to the Rényi entropies and multifractals. Indeed it is found that the dimension of Rényi index q is just the product of the embedding dimension and the corresponding disorder. The results are illustrated for a simple pedagogical example and for power law distributions. In general all three entropies are required for a complete characterization.
Keywords: disorder, Rényi entropy, Tsallis entropy, "U" entropy, power laws, multifractals, dimensions, scale free networks
Introduction
Entropy, the Shannon entropy [], statistical or information theoretic entropy, to be more exact,
$$H = -\sum_{i=1}^{N} p_i \ln p_i \qquad (1)$$
where p_i is the probability of the ith of N states, is often taken as a measure of disorder, but as such may have disadvantages due to extensivity []. For example, two moles of a noble gas have twice the entropy of one mole (under identical conditions, of course); does one wish to say that the two moles have twice the disorder of one mole? To circumvent this problem
some time ago one of us [PTL] introduced the quantity
$$\Delta = \frac{H}{H_{\max}} \qquad (2)$$
as "disorder"⁴. H_max is the maximum possible entropy, which depends on the constraints imposed on the system and the question being addressed []. In the simplest case (no constraints) H_max = ln N, corresponding to the equiprobable distribution p_i = 1/N, 1 ≤ i ≤ N.
¹ Telephone: +1 519 661 2111 x 88784; fax: +1 519 661-3523; electronic address: mdavison@uwo.ca.
² To whom correspondence should be addressed. Telephone: +41 34 411 02 43; electronic address: shiner@alumni.duke.edu.
³ Telephone: +; fax: +; electronic address: .
⁴ We write "disorder" in quotation marks to indicate that this is the defined quantity Δ; written without quotation marks the term refers to the general concept of disorder.
The "disorder" Δ has found application to problems ranging from cosmology [] to biological evolution []. (See also [].)
In this contribution we generalize the "disorder" Δ to the extended entropies of Rényi [] and Tsallis [], as well as to the recent "U" entropies []. The higher order entropies are obtained in
general by relaxing one of the axioms leading to the Shannon entropy []. The Rényi
entropies follow upon relaxation of the condition of subadditivity, and lead to multifractals [].
The Tsallis entropies are in general nonextensive and were originally motivated, logically
enough, by the study of nonextensive systems such as black holes []. The "U” entropies
were introduced as another example of the family of higher order entropies. In this
contribution we will consider only maximum entropies corresponding to the equiprobable
distribution, the simplest case.
A first important result relates the Rényi "disorders" to multifractals: the dimension of Rényi order q is simply the product of the corresponding "disorder" and the embedding dimension. We then use the "disorders" corresponding to the extended entropies to analyze two examples: a simple pedagogical model and power law distributions. The simple model was originally introduced as an example where the Shannon entropy increases with the number of states of the system but the corresponding "disorder" decreases. It will be seen that the various extended "disorders" behave differently. Power law distributions are studied because of their ubiquity and importance. They arise in apparently very different systems, e.g. … . "Disorders" corresponding to all three sorts of extended entropies are found to be necessary for a complete characterization of these distributions.
Extended Entropies and "Disorders" – Definitions
All the higher order entropies can be written in terms of the quantity
$$Q = \sum_{i=1}^{N} p_i^q \qquad (3)$$
where q is independent of the p_i and N. The Rényi entropies are defined as
HR q  
ln Q
, (4a)
1 q
the Tsallis entropies as
HT q  
Q 1
, (4b)
1 q
and the "U" entropies as
HU q  
1 1 Q
1 q
(4c)
2
By L'Hôpital's rule, all of these reduce to the Shannon entropy when q → 1. The extremum of all the entropies occurs for the equiprobable distribution, as is easily shown. Furthermore, the extrema are all maxima if q > 0, which we assume. Thus, the maximum entropies are
$$H_{R,\max}(q) = \ln N, \qquad H_{T,\max}(q) = \frac{N^{1-q}-1}{1-q}, \qquad H_{U,\max}(q) = \frac{1 - 1/N^{1-q}}{1-q},$$
and the corresponding disorders
$$\Delta_R(q) = \frac{\ln Q}{(1-q)\ln N}, \qquad \Delta_T(q) = \frac{Q-1}{N^{1-q}-1}, \qquad \Delta_U(q) = \frac{1 - 1/Q}{1 - 1/N^{1-q}}.$$
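The three definitions can be checked numerically; a minimal sketch in Python (function and variable names are ours), using eq. (3) and the equiprobable maxima:

```python
import math

def disorders(p, q):
    """Return (Delta_R, Delta_T, Delta_U) for a probability vector p and index q != 1.

    Uses Q = sum_i p_i**q with N = len(p):
      Delta_R = ln Q / ((1-q) ln N)
      Delta_T = (Q - 1) / (N**(1-q) - 1)
      Delta_U = (1 - 1/Q) / (1 - N**(q-1))
    """
    N = len(p)
    Q = sum(pi**q for pi in p)
    d_R = math.log(Q) / ((1 - q) * math.log(N))
    d_T = (Q - 1) / (N ** (1 - q) - 1)
    d_U = (1 - 1 / Q) / (1 - N ** (q - 1))
    return d_R, d_T, d_U

# the equiprobable distribution is maximally disordered: all three disorders equal 1
uniform = [1 / 8] * 8
print(disorders(uniform, 0.5))   # (1.0, 1.0, 1.0) up to rounding
```

For any non-equiprobable distribution all three values fall strictly between 0 and 1.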
Given the different scalings of the maximum entropies with N (Fig. 1),
$$H_{R,\max}(q) = \ln N,$$
$$\lim_{N\to\infty} H_{T,\max}(q) = \begin{cases} \dfrac{N^{1-q}}{1-q}, & q < 1 \\[1ex] \dfrac{1}{q-1}, & q > 1 \end{cases}$$
$$\lim_{N\to\infty} H_{U,\max}(q) = \begin{cases} \dfrac{1}{1-q}, & q < 1 \\[1ex] \dfrac{N^{q-1}}{q-1}, & q > 1 \end{cases}$$
one might expect that the three sorts of "disorders" will behave differently, as indeed they do, as will be seen below for the examples analyzed. Note that only H_R,max(q) is independent of q.
Fig. 1. Scaling of the maximum entropies with system size (log(maximum entropy) vs. log N; N = number of states). Upper panel: q < 1; lower panel: q > 1. Curves shown for the Tsallis, Rényi (or Shannon) and "U" maximum entropies.
Fractal Dimensions
The spectrum of dimensions known as multifractals [] may be defined in terms of the Rényi entropies by the following procedure. Consider an object embedded in a d-dimensional space; d is known as the embedding dimension. Cover the object with d-dimensional hypercubes whose sides are of length ε. Let p_i be the probability that part of the object is in the ith of these cubes. The Rényi dimension D(q) is then just
$$D(q) = \lim_{\varepsilon\to 0} \frac{H_R(q)}{\ln(1/\varepsilon)}.$$
q = 0 yields the box counting or capacity dimension, q = 1 the information dimension, and q = 2 the correlation dimension, although the q's are not restricted to integer values in principle. Note that H_R(q) depends on ε, although we do not explicitly indicate this here.
Given the relation between the H_R(q) and the corresponding "disorders" Δ_R(q) we expect a relation between D(q) and Δ_R(q). We first need to establish the maximum possible entropy, ln N, where N is the number of states available to the system. This is just N = V ε^{-d}, where V is the volume of the object covered by the cubes, since the volume of one of these is just ε^d. The maximum possible entropy is then ln V − d ln ε, which in the limit ε → 0 becomes simply −d ln ε. The disorder corresponding to H_R(q) is thus
$$\lim_{\varepsilon\to 0} \Delta_R(q) = \lim_{\varepsilon\to 0} \frac{H_R(q)}{-d\ln\varepsilon} = \frac{D(q)}{d}.$$
In other words, the dimension D(q) is simply the product of the embedding dimension and the Landsberg-Rényi disorder Δ_R(q) in the limit of vanishing ε. This may also be called the fine grained limit.
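As a numerical illustration of the relation D(q) = d·Δ_R(q), consider the middle-thirds Cantor measure with embedding dimension d = 1 and boxes of side ε = 3⁻ᵏ (the example and all names are our own choices):

```python
import math, random

# Sample the uniform measure on the middle-thirds Cantor set (d = 1): a point's
# ternary digits are drawn from {0, 2}, so its box of side eps = 3**-k is
# determined exactly by its first k ternary digits.
random.seed(0)
k, samples = 5, 100_000
counts = {}
for _ in range(samples):
    box = 0
    for _ in range(k):
        box = 3 * box + random.choice((0, 2))   # next ternary digit
    counts[box] = counts.get(box, 0) + 1

q = 0.5
p = [c / samples for c in counts.values()]        # box probabilities
H_R = math.log(sum(x**q for x in p)) / (1 - q)    # Renyi entropy, eq. (4a)
D_q = H_R / (k * math.log(3))                     # H_R / ln(1/eps) = d * Delta_R
print(D_q, math.log(2) / math.log(3))             # both near 0.6309...
```

Since the occupied boxes carry (nearly) equal probability, any q > 0 recovers the same value ln 2/ln 3, as expected for this monofractal.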
This result has several implications of import. The first is that the concept of "disorder" as introduced by Landsberg is not just a convenient normalization. Not only has it proven useful in illuminating various phenomena ranging from cosmology to biology and beyond, but its relation to the Rényi dimensions also adds weight to the evidence that "disorder" is measuring fundamental properties.
A Simple Example
Here we consider a minor extension of the simple system devised by Landsberg [] to illustrate cases where as the system grows it becomes less and less disordered (i.e., random), although entropy increases with increasing system size. The probability distribution is
$$p_i = \begin{cases} p, & i = 1 \\[1ex] \dfrac{1-p}{N-1}, & 2 \le i \le N \end{cases}$$
Landsberg examined only the case p = 0.5 for the Shannon entropy, which in the more general case is
$$H_S = -\sum_{i=1}^{N} p_i \ln p_i = -p\ln p - (1-p)\ln(1-p) + (1-p)\ln(N-1)$$
It is obvious that H_S always increases with N. For the case p = 0.5, however, the system is maximally disordered for the minimal N (= 2). Landsberg showed that his proposed measure of disorder,
$$\Delta_S = H_S / H_{S,\max} = H_S / \ln N,$$
decreases with increasing N as the system deviates more and more from the equiprobable distribution:
$$\Delta_S = \frac{-p\ln p - (1-p)\ln(1-p) + (1-p)\ln(N-1)}{\ln N}$$
Although systems such as this are often referred to as growing systems, note that this is only one possible interpretation. A probability distribution such as that above could just as well apply to the situation where a more precise measuring instrument becomes available, or when one chooses to use a more detailed theoretical description []. In the limit N → ∞ we would then speak of the fine grained (or thermodynamic) limit:
$$\lim_{N\to\infty} \Delta_S = 1 - p$$
To arrive at the extended entropies and "disorders" we first calculate
$$Q = \sum_{i=1}^{N} p_i^q = p^q + (N-1)^{1-q}(1-p)^q$$
The Rényi entropies and "disorders" are then, for q ≠ 1,
$$H_R(q) = \frac{\ln\left[p^q + (N-1)^{1-q}(1-p)^q\right]}{1-q}, \qquad \Delta_R(q) = \frac{\ln\left[p^q + (N-1)^{1-q}(1-p)^q\right]}{(1-q)\ln N},$$
the Tsallis quantities are
$$H_T(q) = \frac{p^q + (N-1)^{1-q}(1-p)^q - 1}{1-q}, \qquad \Delta_T(q) = \frac{p^q + (N-1)^{1-q}(1-p)^q - 1}{N^{1-q}-1},$$
and the "U" entropies and "disorders" are
$$H_U(q) = \frac{1 - \left[p^q + (N-1)^{1-q}(1-p)^q\right]^{-1}}{1-q}, \qquad \Delta_U(q) = \frac{1 - \left[p^q + (N-1)^{1-q}(1-p)^q\right]^{-1}}{1 - N^{q-1}};$$
for q = 1 all of these reduce to the Shannon results given above.
All of these entropies increase monotonically with N, but the "disorders" may increase or
decrease monotonically, or pass through an extremum as N increases (Fig. 2).
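The fine grained behavior discussed below can be checked directly; a small sketch of the model (variable names ours):

```python
import math

def model_disorders(p, N, q):
    """Disorders of the generalized Landsberg distribution
    p_1 = p, p_i = (1-p)/(N-1) for 2 <= i <= N,
    using Q = p**q + (N-1)**(1-q) * (1-p)**q."""
    Q = p**q + (N - 1) ** (1 - q) * (1 - p) ** q
    d_R = math.log(Q) / ((1 - q) * math.log(N))
    d_T = (Q - 1) / (N ** (1 - q) - 1)
    d_U = (1 - 1 / Q) / (1 - N ** (q - 1))
    return d_R, d_T, d_U

# Large N with q = 2 > 1: Renyi and "U" disorders tend to 0, Tsallis to 1 - p**2
p, q = 0.5, 2.0
d_R, d_T, d_U = model_disorders(p, 10**7, q)
print(round(d_T, 4))   # close to 1 - 0.25 = 0.75
```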
Fig. 2. The dependence of the various entropies and "disorders" on system size (number of states N) for the generalized simple model of Landsberg [PTL], shown for q = 0.5 and q = 2.0 with p = 0.1, 0.5 and 0.9. Rényi: solid lines; Tsallis: dashed lines; "U": dotted lines.
In the fine grained limit N → ∞ we find either complete "disorder" or complete "order" (i.e., "disorder" = 0) for q ≠ 1 for the Rényi and "U" "disorders":
$$\lim_{N\to\infty} \Delta_R(q) = \lim_{N\to\infty} \Delta_U(q) = \begin{cases} 1, & q < 1 \\ 0, & q > 1 \end{cases}$$
Only for q = 1, corresponding to the Shannon entropy, do we find the mildly more interesting behavior
$$q = 1: \quad \lim_{N\to\infty} \Delta_R(q) = \lim_{N\to\infty} \Delta_S = 1 - p$$
The Tsallis quantities are more interesting:
$$\lim_{N\to\infty} \Delta_T(q) = \begin{cases} (1-p)^q, & q < 1 \\ 1 - p^q, & q > 1 \end{cases}$$
Other than for the case q = 0, the Tsallis "disorders" decrease with p from 1 to 0.
Recalling the connection between Rényi "disorders" and (multifractal) dimensions, we see
that only the information dimension ( q  1 ) may be fractal in the fine grained limit. The
Tsallis "disorders", however, vary for all q in a manner qualitatively similar to the
information dimension: a monotonic decrease with p from maximum to minimum "disorder".
Thus, in the fine grained limit, the Tsallis "disorders" provide more "information", in a
general sense, than either the Rényi or "U" "disorders".
Power Law Distributions
We now turn our attention to power law distributions
$$p_r = \frac{r^{-\lambda}}{\sum_{s=1}^{R} s^{-\lambda}}, \qquad 1 \le r \le R,$$
where λ is a positive semidefinite constant. Q is then
$$Q = \sum_{r=1}^{R} p_r^q = \frac{\sum_{r=1}^{R} r^{-\lambda q}}{\left(\sum_{s=1}^{R} s^{-\lambda}\right)^q}$$
In the thermodynamic limit (R → ∞) this becomes
$$Q = \frac{\zeta(\lambda q)}{\zeta(\lambda)^q}$$
where ζ is the Riemann zeta function.
In this limit the entropies are⁵
$$H_R(q,\lambda) = \frac{\ln\zeta(\lambda q) - q\ln\zeta(\lambda)}{1-q}, \qquad q \ne 1$$
$$H_T(q,\lambda) = \frac{\zeta(\lambda q)\,\zeta(\lambda)^{-q} - 1}{1-q}, \qquad q \ne 1$$
$$H_U(q,\lambda) = \frac{1 - \zeta(\lambda)^q/\zeta(\lambda q)}{1-q}, \qquad q \ne 1$$
$$\lim_{R\to\infty} H_R(q,\lambda) = \lim_{R\to\infty} H_T(q,\lambda) = \lim_{R\to\infty} H_U(q,\lambda) = \ln\zeta(\lambda) - \lambda\,\frac{\zeta'(\lambda)}{\zeta(\lambda)}, \qquad q = 1$$

⁵ The results for q = 1 are taken from OSID [].
and the "disorders"
$$\Delta_R(q,\lambda) = \frac{\ln\zeta(\lambda q) - q\ln\zeta(\lambda)}{(1-q)\ln R}, \qquad q \ne 1, \; R \to \infty$$
$$\Delta_T(q,\lambda) = \frac{\zeta(\lambda q)\,\zeta(\lambda)^{-q} - 1}{R^{1-q} - 1}, \qquad q \ne 1, \; R \to \infty$$
$$\Delta_U(q,\lambda) = \frac{1 - \zeta(\lambda)^q/\zeta(\lambda q)}{1 - R^{q-1}}, \qquad q \ne 1, \; R \to \infty$$
$$\lim_{R\to\infty} \Delta_R(q,\lambda) = \lim_{R\to\infty} \Delta_T(q,\lambda) = \lim_{R\to\infty} \Delta_U(q,\lambda) = \frac{\ln\zeta(\lambda) - \lambda\,\zeta'(\lambda)/\zeta(\lambda)}{\ln R}, \qquad q = 1$$
The above is obviously for discrete power law distributions, such as that for the number of links in a network. If the distribution is continuous, as would be the case for the grandmother of power law distributions, the Gutenberg-Richter law [], sums must be replaced with integrals:
$$\tilde p(r) = \frac{r^{-\lambda}}{\int_1^R r'^{-\lambda}\,dr'}, \qquad \int_1^R r^{-\lambda}\,dr = \begin{cases} \ln R, & \lambda = 1 \\[1ex] \dfrac{1 - 1/R^{\lambda-1}}{\lambda-1}, & \lambda \ne 1 \end{cases}$$
$$\tilde Q = \frac{\int_1^R r^{-\lambda q}\,dr}{\left[\int_1^R r^{-\lambda}\,dr\right]^q}, \qquad \int_1^R r^{-\lambda q}\,dr = \begin{cases} \ln R, & \lambda q = 1 \\[1ex] \dfrac{R^{1-\lambda q} - 1}{1-\lambda q}, & \lambda q \ne 1 \end{cases}$$
The various entropies are given in terms of Q̃ as usual (eqs. 4). The Rényi entropies for a continuous distribution are thus
$$\tilde H_R(q,\lambda) = \begin{cases}
\dfrac{\ln R}{2} + \ln\ln R, & \lambda = 1,\; q = 1 \\[1.5ex]
\dfrac{1}{1-q}\left[\ln\dfrac{R^{1-q}-1}{1-q} - q\ln\ln R\right], & \lambda = 1,\; q \ne 1 \\[1.5ex]
\ln\dfrac{1 - 1/R^{\lambda-1}}{\lambda-1} + \dfrac{\lambda}{\lambda-1} - \dfrac{\lambda\ln R}{R^{\lambda-1}-1}, & \lambda \ne 1,\; q = 1 \\[1.5ex]
\dfrac{1}{1-q}\left[\ln\ln R - q\ln\dfrac{1 - 1/R^{\lambda-1}}{\lambda-1}\right], & \lambda \ne 1,\; q \ne 1,\; \lambda q = 1 \\[1.5ex]
\dfrac{1}{1-q}\left[\ln\dfrac{R^{1-\lambda q}-1}{1-\lambda q} - q\ln\dfrac{1 - 1/R^{\lambda-1}}{\lambda-1}\right], & \lambda \ne 1,\; q \ne 1,\; \lambda q \ne 1
\end{cases}$$
The Tsallis entropies $\tilde H_T = (\tilde Q - 1)/(1-q)$ and the "U" entropies $\tilde H_U = (1 - 1/\tilde Q)/(1-q)$ follow in the same way from eqs. (4b) and (4c) with the same $\tilde Q$ and the same case distinctions; for q = 1 both reduce to the Shannon values given in the first and third cases above.

The maximum entropies for continuous distributions differ slightly from those for discrete distributions:
$$\tilde H_{R,\max}(q) = \ln(R-1), \qquad \tilde H_{T,\max}(q) = \frac{(R-1)^{1-q}-1}{1-q}, \qquad \tilde H_{U,\max}(q) = \frac{1 - 1/(R-1)^{1-q}}{1-q}$$
These are the maximum entropies which must be used in calculating the disorders in the case of a continuous distribution, which are:
$$\tilde\Delta_R(q,\lambda) = \frac{\tilde H_R(q,\lambda)}{\ln(R-1)}, \qquad \tilde\Delta_T(q,\lambda) = \frac{\tilde Q - 1}{(R-1)^{1-q} - 1}, \qquad \tilde\Delta_U(q,\lambda) = \frac{1 - 1/\tilde Q}{1 - 1/(R-1)^{1-q}},$$
with the same case distinctions (λ = 1 or λ ≠ 1; q = 1 or q ≠ 1; λq = 1 or λq ≠ 1) as for the entropies.
In the limit R → ∞ the entropies and disorders for continuous distributions remain finite or diverge according to the region of the (q, λ) parameter plane. The Rényi entropy remains finite only for λ > 1 with λq > 1:
$$\lim_{R\to\infty} \tilde H_R = \begin{cases} \dfrac{q\ln(\lambda-1) - \ln(\lambda q - 1)}{1-q}, & \lambda > 1,\; \lambda q > 1,\; q \ne 1 \\[1.5ex] \dfrac{\lambda}{\lambda-1} - \ln(\lambda-1), & \lambda > 1,\; q = 1 \end{cases}$$
and diverges otherwise. The Rényi "disorders" have the limits
$$\lim_{R\to\infty} \tilde\Delta_R = \begin{cases}
1, & \lambda < 1,\; q \le 1/\lambda \;\text{ or }\; \lambda = 1,\; q < 1 \\[1ex]
\dfrac{q(1-\lambda)}{q-1}, & \lambda < 1,\; q > 1/\lambda \\[1.5ex]
1/2, & \lambda = 1,\; q = 1 \\[1ex]
\dfrac{1-\lambda q}{1-q}, & \lambda > 1,\; q < 1/\lambda \\[1.5ex]
0, & \lambda = 1,\; q > 1 \;\text{ or }\; \lambda > 1,\; q \ge 1/\lambda
\end{cases}$$
Fig. 3. Rényi "disorders" lim_{R→∞} Δ̃_R for continuous power law distributions in the thermodynamic limit (R → ∞). Top panel: parameter space; middle panel: dependence on the exponent λ of the distribution for three values of q, the index of the extended entropy, one q < 1, q = 1, and one q > 1; bottom panel: dependence on the index q for three values of λ, one λ < 1, λ = 1, and one λ > 1.
The Tsallis "disorders" have the limits
$$\lim_{R\to\infty} \tilde\Delta_T = \begin{cases}
\dfrac{(1-\lambda)^q}{1-\lambda q}, & \lambda < 1,\; q < 1 \\[1.5ex]
0, & \lambda \ge 1,\; q < 1 \;\text{ or }\; \lambda > 1,\; q = 1 \\[1ex]
1/2, & \lambda = 1,\; q = 1 \\[1ex]
1, & \lambda < 1,\; q = 1 \;\text{ or }\; \lambda \le 1,\; q > 1 \\[1ex]
1 - \dfrac{(\lambda-1)^q}{\lambda q - 1}, & \lambda > 1,\; q > 1
\end{cases}$$
Fig. 4. Tsallis "disorders" lim_{R→∞} Δ̃_T for continuous power law distributions in the thermodynamic limit (R → ∞). Top panel: parameter space; middle panel: dependence on the exponent λ of the distribution for three values of q, the index of the extended entropy, one q < 1, q = 1, and one q > 1; bottom panel: dependence on the index q for three values of λ, one λ < 1, λ = 1, and one λ > 1.
The "U" "disorders" have the limits
$$\lim_{R\to\infty} \tilde\Delta_U = \begin{cases}
1, & \lambda q < 1,\; q \le 1 \\[1ex]
\dfrac{1-\lambda q}{(1-\lambda)^q}, & \lambda q < 1,\; q > 1 \\[1.5ex]
1/2, & \lambda = 1,\; q = 1 \\[1ex]
1 - \dfrac{\lambda q - 1}{(\lambda-1)^q}, & \lambda q > 1,\; q < 1 \\[1.5ex]
0, & \lambda q > 1,\; q \ge 1
\end{cases}$$
Fig. 5. "U" "disorders" lim_{R→∞} Δ̃_U for continuous power law distributions in the thermodynamic limit (R → ∞). Top panel: parameter space; middle panel: dependence on the exponent λ of the distribution for three values of q, the index of the extended entropy, one q < 1, q = 1, and one q > 1; bottom panel: dependence on the index q for three values of λ, one λ < 1, λ = 1, and one λ > 1.
The distinct behavior of each sort of "disorder" is apparent. Furthermore, the three sorts of "disorder" divide parameter space (the top panels of Figs. 3, 4 and 5) naturally into 6 regions. In each of the regions, one sort of "disorder" vanishes, another sort is maximal, and the third depends on q and λ. The parameter space characteristics of the three "disorders" are summarized in Fig. 6.
Fig. 6: Summary of the behavior of the "disorders" in parameter space (index q versus exponent λ). In each region only one of the three "disorders" displays a dependence on the parameters; the other two vanish or equal 1. The six regions are characterized by (Rényi, Tsallis, "U") values of (0, 0, parameter-dependent), (0, parameter-dependent, 0), (parameter-dependent, 0, 1), (1, parameter-dependent, 1), (1, 1, parameter-dependent) and (parameter-dependent, 1, 0).
Thus in any given region of parameter space, two of the "disorders" are constant and do not lend themselves as objects of study for understanding the system. For example, for scale-free networks, 2 ≤ λ ≤ 3, the Rényi "disorder" would be of help for q < 1/λ, the "U" "disorder" for 1/λ < q < 1, and the Tsallis "disorder" for q > 1. The same would be true for 1 ≤ λ ≤ 2, the range some think to be applicable to biological problems.
These last results are for continuous distributions in the thermodynamic limit R → ∞. At first glance one would think that it would be legitimate to also use these results for the discrete distribution in this limit. Indeed, when R → ∞ the results for continuous distributions are a good approximation to those for discrete distributions when 0 < λ < 1; the closer λ is to 1, the better the approximation. However, for λ > 1 the approximation is not good. On second thought, it is obvious that the continuous distribution is not a good approximation to the discrete distribution; otherwise, the Riemann zeta function would not be of such interest. Thus, one must distinguish between discrete and continuous power law distributions.
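The point can be illustrated numerically (a sketch with our own cutoffs): for λ > 1 the discrete normalizing sum converges to ζ(λ) while the integral converges to 1/(λ−1), whereas for λ < 1 both grow at the same leading rate, so their ratio tends to 1.

```python
from math import log

def power_sum(lam, R):
    """Discrete normalization: sum_{r=1}^{R} r**-lam."""
    return sum(r ** -lam for r in range(1, R + 1))

def power_integral(lam, R):
    """Continuous normalization: int_1^R r**-lam dr."""
    return log(R) if lam == 1 else (1 - R ** (1 - lam)) / (lam - 1)

R = 10**5
# lam = 2 > 1: the sum tends to zeta(2) = 1.6449..., the integral to 1 -- not close.
print(power_sum(2.0, R), power_integral(2.0, R))
# lam = 0.5 < 1: both grow like 2*sqrt(R); the ratio tends to 1.
print(power_sum(0.5, R) / power_integral(0.5, R))
```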
Acknowledgements
This work was supported by grants 31-42069.94 from the Swiss National Science Foundation and 93.0106 from the Swiss Federal Office for Education and Science within the scope of EU contract CHRX-CT92-0007.