Hindawi Publishing Corporation
Journal of Inequalities and Applications
Volume 2009, Article ID 323615, 15 pages
doi:10.1155/2009/323615
Research Article
On Some Improvements of the Jensen Inequality with Some Applications
M. Adil Khan,1 M. Anwar,1 J. Jakšetić,2 and J. Pečarić1,3

1 Abdus Salam School of Mathematical Sciences, GC University, 5400 Lahore, Pakistan
2 Faculty of Mechanical Engineering and Naval Architecture, University of Zagreb, 1000 Zagreb, Croatia
3 Faculty of Textile Technology, University of Zagreb, 1000 Zagreb, Croatia

Correspondence should be addressed to M. Adil Khan, adilbandai@yahoo.com
Received 23 April 2009; Accepted 10 August 2009
Recommended by Sever Silvestru Dragomir
An improvement of the Jensen inequality for convex and monotone functions is given, as well as various applications for means. Similar results for related inequalities of the Jensen type are also obtained. Some applications of the Cauchy means and the Jensen inequality are also discussed.
Copyright © 2009 M. Adil Khan et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
1. Introduction
The well-known Jensen inequality for convex functions is given as follows.
Theorem 1.1. If (Ω, A, μ) is a probability space and if f ∈ L¹(μ) is such that a ≤ f(t) ≤ b for all t ∈ Ω, −∞ ≤ a < b ≤ ∞, then

\varphi\Bigl(\int_\Omega f(t)\,d\mu(t)\Bigr) \le \int_\Omega \varphi\bigl(f(t)\bigr)\,d\mu(t)    (1.1)

is valid for any convex function φ : [a, b] → ℝ. In the case when φ is strictly convex on [a, b], one has equality in (1.1) if and only if f is constant almost everywhere on Ω.
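As a quick numerical illustration of (1.1) on a finite probability space, consider the following sketch (arbitrary sample weights and values; the choice φ(x) = e^x is only an example and does not come from the paper).

import numpy as np

# Discrete probability space: Omega = {0, ..., n-1} with weights mu summing to 1.
rng = np.random.default_rng(0)
n = 5
mu = rng.random(n); mu /= mu.sum()        # probability measure
f = rng.uniform(1.0, 3.0, n)              # a <= f(t) <= b with a = 1, b = 3

phi = np.exp                              # a convex function on [a, b]

lhs = phi(np.sum(mu * f))                 # phi of the mean
rhs = np.sum(mu * phi(f))                 # mean of phi
assert lhs <= rhs + 1e-12                 # Jensen's inequality (1.1)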
Here and in the whole paper we suppose that all integrals exist. By considering the difference of the two sides of (1.1) for functionals, Anwar and Pečarić [1] proved an interesting log-convexity result. We can state this result for integrals as follows.
Theorem 1.2. Let (Ω, A, μ) be a probability space and let f ∈ L¹(μ) be such that a ≤ f(t) ≤ b for all t ∈ Ω, −∞ ≤ a < b ≤ ∞. Define

\Lambda_s =
\begin{cases}
\dfrac{1}{s(s-1)}\left[\displaystyle\int_\Omega f^{s}(t)\,d\mu(t) - \Bigl(\int_\Omega f(t)\,d\mu(t)\Bigr)^{s}\right], & s \ne 0, 1,\\[2mm]
\log\Bigl(\displaystyle\int_\Omega f(t)\,d\mu(t)\Bigr) - \displaystyle\int_\Omega \log f(t)\,d\mu(t), & s = 0,\\[2mm]
\displaystyle\int_\Omega f(t)\log f(t)\,d\mu(t) - \Bigl(\displaystyle\int_\Omega f(t)\,d\mu(t)\Bigr)\log\Bigl(\displaystyle\int_\Omega f(t)\,d\mu(t)\Bigr), & s = 1,
\end{cases}    (1.2)

and let Λ_s be positive. Then Λ_s is log-convex, that is, for −∞ < r < s < u < ∞ the following is valid:

\Lambda_s^{\,u-r} \le \Lambda_r^{\,u-s}\,\Lambda_u^{\,s-r}.    (1.3)
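As an illustration of (1.2) and (1.3), the following sketch evaluates Λ_s for a discrete measure (sums in place of integrals, with arbitrary sample data) and checks the log-convexity inequality (1.3) for one choice of r < s < u.

import numpy as np

rng = np.random.default_rng(1)
mu = rng.random(6); mu /= mu.sum()        # discrete probability measure
f = rng.uniform(0.5, 4.0, 6)              # positive, nonconstant f
fbar = np.sum(mu * f)

def Lam(s):
    # Lambda_s of (1.2), with sums in place of integrals
    if s == 0:
        return np.log(fbar) - np.sum(mu * np.log(f))
    if s == 1:
        return np.sum(mu * f * np.log(f)) - fbar * np.log(fbar)
    return (np.sum(mu * f**s) - fbar**s) / (s * (s - 1))

r, s, u = -0.5, 0.7, 2.3                  # any r < s < u
assert Lam(s)**(u - r) <= Lam(r)**(u - s) * Lam(u)**(s - r) + 1e-12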
The following improvement of (1.1) was obtained in [2].

Theorem 1.3. Let the conditions of Theorem 1.1 be fulfilled. Then

\int_\Omega \varphi\bigl(f(t)\bigr)\,d\mu(t) - \varphi\Bigl(\int_\Omega f(t)\,d\mu(t)\Bigr) \ge \left|\int_\Omega \bigl|\varphi(f(t)) - \varphi(\bar f)\bigr|\,d\mu(t) - \bigl|\varphi'_{+}(\bar f)\bigr|\int_\Omega \bigl|f(t) - \bar f\bigr|\,d\mu(t)\right|,    (1.4)

where \varphi'_{+}(x) represents the right-hand derivative of φ and

\bar f = \int_\Omega f(t)\,d\mu(t).    (1.5)

If φ is concave, then the left-hand side of (1.4) should be \varphi\bigl(\int_\Omega f(t)\,d\mu(t)\bigr) - \int_\Omega \varphi(f(t))\,d\mu(t).
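Inequality (1.4) can be tested in the same discrete setting; the sketch below uses the sample choice φ(x) = x log x, convex on (0, ∞) with derivative log x + 1, together with arbitrary sample data.

import numpy as np

rng = np.random.default_rng(2)
mu = rng.random(8); mu /= mu.sum()
f = rng.uniform(0.2, 5.0, 8)
fbar = np.sum(mu * f)

phi  = lambda x: x * np.log(x)            # convex on (0, infinity)
dphi = lambda x: np.log(x) + 1.0          # its derivative

lhs = np.sum(mu * phi(f)) - phi(fbar)     # Jensen difference
rhs = abs(np.sum(mu * np.abs(phi(f) - phi(fbar)))
          - abs(dphi(fbar)) * np.sum(mu * np.abs(f - fbar)))
assert lhs >= rhs - 1e-12                 # the improvement (1.4)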
In this paper, we give another proof and an extension of Theorem 1.2, as well as improvements of Theorem 1.3 for monotone convex functions, with some applications. We also give applications of the Jensen inequality to divergence measures in information theory and to the related Cauchy means.
2. Another Proof and Extension of Theorem 1.2
In fact, Theorem 1.2 for Ω = [a, b] and 0 < r < s < u, r, s, u ≠ 1, was first initiated by Simić in [3].
Moreover, in his proof he used convex functions defined on I = (−∞, 0) ∪ (0, 1) ∪ (1, ∞) (see [3, Theorem 1]); namely, he used the following function:

\lambda(x) = v^{2}\frac{x^{s}}{s(s-1)} + 2vw\frac{x^{r}}{r(r-1)} + w^{2}\frac{x^{u}}{u(u-1)},    (2.1)

where r = (s + u)/2 and v, w, s, r, u are real numbers with s, r, u ∈ I.
In [1] we have given a correct proof by using an extension of (2.1), so that it is defined on ℝ.
Moreover, we can give another proof in which we use only (2.1), but without using convexity as in [3].
Proof of Theorem 1.2. Consider the function λ(x) defined, as in [3], by (2.1). Now

\lambda''(x) = \bigl(v x^{s/2-1} + w x^{u/2-1}\bigr)^{2} \ge 0, \quad \text{for } x > 0,    (2.2)

that is, λ(x) is convex. By using (1.1) we get

v^{2}\Lambda_s + 2vw\Lambda_r + w^{2}\Lambda_u \ge 0.    (2.3)

Therefore, (2.3) is valid for all s, r, u ∈ I. Now, since the left-hand side of (2.3) is a quadratic form in v and w, its nonnegativity gives

\Lambda_{(s+u)/2}^{2} = \Lambda_r^{2} \le \Lambda_s\Lambda_u.    (2.4)

Since we have lim_{s→0} Λ_s = Λ_0 and lim_{s→1} Λ_s = Λ_1, we also have that (2.4) is valid for r, s, u ∈ ℝ. So s ↦ Λ_s is a log-convex function in the Jensen sense on ℝ.
Moreover, continuity of Λ_s implies log-convexity, that is, the following is valid for −∞ < r < s < u < ∞:

\Lambda_s^{\,u-r} \le \Lambda_r^{\,u-s}\,\Lambda_u^{\,s-r}.    (2.5)
Let us note that this was used in [4] to get the corresponding Cauchy means. Moreover, we can extend the above result.

Theorem 2.1. Let the conditions of Theorem 1.2 be fulfilled and let p_i (i = 1, 2, ..., n) be real numbers. Then

\bigl|\Lambda_{p_{ij}}\bigr|_k \ge 0, \quad k = 1, 2, \ldots, n,    (2.6)

where |a_{ij}|_k denotes the determinant of order k with elements a_{ij} and p_{ij} = (p_i + p_j)/2.
Proof. Consider the function

f(x) = \sum_{i,j=1}^{n} u_i u_j \frac{x^{p_{ij}}}{p_{ij}(p_{ij}-1)}    (2.7)

for x > 0, u_i ∈ ℝ, and p_{ij} ∈ I. It holds that

f''(x) = \sum_{i,j=1}^{n} u_i u_j x^{p_{ij}-2} = \Bigl(\sum_{i=1}^{n} u_i x^{p_i/2-1}\Bigr)^{2} \ge 0.    (2.8)

So f(x) is a convex function and, as a consequence of (1.1), one has

\sum_{i,j=1}^{n} u_i u_j \Lambda_{p_{ij}} \ge 0.    (2.9)

Therefore the n × n matrix with elements Λ_{p_{ij}} is positive semidefinite, so (2.6) is valid for p_{ij} ∈ I. Moreover, since Λ_{p_{ij}} is continuous in p_{ij}, (2.6) is valid for all p_i ∈ ℝ (i = 1, 2, ..., n).
Remark 2.2. In Theorem 2.1, if we set n = 2, we get Theorem 1.2.
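The positive semidefiniteness behind (2.6) can also be checked numerically: the sketch below builds the matrix [Λ_{p_ij}] for a few arbitrary real exponents p_i over a discrete measure and verifies that all leading principal minors are nonnegative.

import numpy as np

rng = np.random.default_rng(3)
mu = rng.random(6); mu /= mu.sum()
f = rng.uniform(0.5, 4.0, 6)
fbar = np.sum(mu * f)

def Lam(s):
    # Lambda_s of (1.2) for a discrete measure
    if abs(s) < 1e-12:      return np.log(fbar) - np.sum(mu * np.log(f))
    if abs(s - 1) < 1e-12:  return np.sum(mu * f * np.log(f)) - fbar * np.log(fbar)
    return (np.sum(mu * f**s) - fbar**s) / (s * (s - 1))

p = [-1.0, 0.5, 2.0, 3.5]                               # arbitrary real numbers p_i
G = np.array([[Lam((pi + pj) / 2) for pj in p] for pi in p])

for k in range(1, len(p) + 1):                          # determinants of order k, as in (2.6)
    assert np.linalg.det(G[:k, :k]) >= -1e-10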
3. Improvements of the Jensen Inequality for Monotone Convex Functions
In this section and in the following section we denote \bar x = \sum_{i=1}^{n} p_i x_i and P_I = \sum_{i \in I} p_i.
Theorem 3.1. If (Ω, A, μ) is a probability space, if f ∈ L¹(μ) is such that a ≤ f(t) ≤ b for t ∈ Ω, and if f(t) ≥ f̄ for t ∈ Ω̄ ⊂ Ω (Ω̄ is measurable, i.e., Ω̄ ∈ A), −∞ < a < b ≤ ∞, then

\int_\Omega \varphi\bigl(f(t)\bigr)\,d\mu(t) - \varphi\Bigl(\int_\Omega f(t)\,d\mu(t)\Bigr) \ge \left|\int_\Omega \operatorname{sgn}\bigl(f(t)-\bar f\bigr)\bigl[\varphi(f(t)) - \varphi'(\bar f)\,f(t)\bigr]\,d\mu(t) + \bigl[\varphi(\bar f) - \bar f\,\varphi'(\bar f)\bigr]\bigl(1 - 2\mu(\bar\Omega)\bigr)\right|,    (3.1)

where

\bar f = \int_\Omega f(t)\,d\mu(t),    (3.2)

for a monotone convex function φ : [a, b] → ℝ. If φ is monotone concave, then the left-hand side of (3.1) should be \varphi\bigl(\int_\Omega f(t)\,d\mu(t)\bigr) - \int_\Omega \varphi(f(t))\,d\mu(t).
Proof. Consider the case when φ is nondecreasing on [a, b]. Then

\int_\Omega \bigl|\varphi(f(t)) - \varphi(\bar f)\bigr|\,d\mu(t) = \int_{\bar\Omega}\bigl(\varphi(f(t)) - \varphi(\bar f)\bigr)\,d\mu(t) + \int_{\Omega\setminus\bar\Omega}\bigl(\varphi(\bar f) - \varphi(f(t))\bigr)\,d\mu(t)
= \int_{\bar\Omega}\varphi(f(t))\,d\mu(t) - \int_{\Omega\setminus\bar\Omega}\varphi(f(t))\,d\mu(t) - \varphi(\bar f)\mu(\bar\Omega) + \varphi(\bar f)\mu(\Omega\setminus\bar\Omega)
= \int_\Omega \operatorname{sgn}\bigl(f(t)-\bar f\bigr)\varphi(f(t))\,d\mu(t) + \varphi(\bar f)\bigl[\mu(\Omega\setminus\bar\Omega) - \mu(\bar\Omega)\bigr].    (3.3)

Similarly,

\int_\Omega \bigl|f(t) - \bar f\bigr|\,d\mu(t) = \int_\Omega \operatorname{sgn}\bigl(f(t)-\bar f\bigr)f(t)\,d\mu(t) + \bar f\bigl[\mu(\Omega\setminus\bar\Omega) - \mu(\bar\Omega)\bigr].    (3.4)

Now from (1.4), (3.3), and (3.4) we get (3.1).
The case when φ is nonincreasing can be treated in a similar way.
Of course a discrete inequality is a simple consequence of Theorem 3.1.
Theorem 3.2. Let φ : [a, b] → ℝ be a monotone convex function, x_i ∈ [a, b], p_i > 0, with \sum_{i=1}^{n} p_i = 1. If x_i ≥ x̄ for i ∈ I ⊂ {1, 2, ..., n} = I_n, then

\sum_{i=1}^{n} p_i\varphi(x_i) - \varphi\Bigl(\sum_{i=1}^{n} p_i x_i\Bigr) \ge \left|\sum_{i=1}^{n} p_i\operatorname{sgn}(x_i - \bar x)\bigl[\varphi(x_i) - x_i\varphi'(\bar x)\bigr] + \bigl[\varphi(\bar x) - \bar x\,\varphi'(\bar x)\bigr](1 - 2P_I)\right|.    (3.5)

If φ is monotone concave, then the left-hand side of (3.5) should be

\varphi\Bigl(\sum_{i=1}^{n} p_i x_i\Bigr) - \sum_{i=1}^{n} p_i\varphi(x_i).    (3.6)
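The discrete inequality (3.5) can be checked directly; the sketch below takes the sample choice φ(x) = e^x (increasing and convex), arbitrary weights and points, and I = {i : x_i ≥ x̄}.

import numpy as np

rng = np.random.default_rng(4)
p = rng.random(7); p /= p.sum()
x = rng.uniform(0.0, 2.0, 7)
xbar = np.sum(p * x)

phi, dphi = np.exp, np.exp                 # a monotone (increasing) convex function

I = x >= xbar                              # the index set I of Theorem 3.2
PI = p[I].sum()

lhs = np.sum(p * phi(x)) - phi(xbar)
rhs = abs(np.sum(p * np.sign(x - xbar) * (phi(x) - x * dphi(xbar)))
          + (phi(xbar) - xbar * dphi(xbar)) * (1.0 - 2.0 * PI))
assert lhs >= rhs - 1e-12                  # inequality (3.5)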
The following improvement of the Hermite-Hadamard inequality is valid [5].
Corollary 3.3. Let φ : [a, b] → ℝ be differentiable and convex. Then
(i) the inequality

\frac{1}{b-a}\int_a^b \varphi(t)\,dt - \varphi\Bigl(\frac{a+b}{2}\Bigr) \ge \left|\frac{1}{b-a}\int_a^b \Bigl|\varphi(t) - \varphi\Bigl(\frac{a+b}{2}\Bigr)\Bigr|\,dt - \frac{b-a}{4}\Bigl|\varphi'\Bigl(\frac{a+b}{2}\Bigr)\Bigr|\right|    (3.7)

holds. If φ is differentiable and concave, then the left-hand side of (3.7) should be \varphi\bigl((a+b)/2\bigr) - \frac{1}{b-a}\int_a^b \varphi(t)\,dt;
(ii) if φ is monotone, then the inequality

\frac{1}{b-a}\int_a^b \varphi(t)\,dt - \varphi\Bigl(\frac{a+b}{2}\Bigr) \ge \left|\frac{1}{b-a}\int_a^b \operatorname{sgn}\Bigl(t - \frac{a+b}{2}\Bigr)\Bigl[\varphi(t) - t\,\varphi'\Bigl(\frac{a+b}{2}\Bigr)\Bigr]\,dt\right|    (3.8)

holds. If φ is differentiable and monotone concave, then the left-hand side of (3.8) should be \varphi\bigl((a+b)/2\bigr) - \frac{1}{b-a}\int_a^b \varphi(t)\,dt.

Proof. (i) Setting Ω = [a, b], f(t) = t, and dμ(t) = dt/(b − a) in (1.4), we get (3.7).
(ii) Setting Ω = [a, b], f(t) = t, dμ(t) = dt/(b − a), and Ω̄ = [(a + b)/2, b] in (3.1), we get (3.8); note that the term with 1 − 2μ(Ω̄) vanishes since μ(Ω̄) = 1/2.
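Both parts of Corollary 3.3 can be verified by approximating the integrals on a fine grid; the sketch below does this for the sample choice φ(t) = e^t on [1/2, 3].

import numpy as np

a, b = 0.5, 3.0
phi, dphi = np.exp, np.exp                 # a differentiable, monotone convex function
m = 0.5 * (a + b)

t = np.linspace(a, b, 200001)
avg = lambda g: g.mean()                   # ~ (1/(b-a)) * integral over [a, b] on a uniform grid

lhs = avg(phi(t)) - phi(m)
rhs_i  = abs(avg(np.abs(phi(t) - phi(m))) - (b - a) / 4.0 * abs(dphi(m)))      # RHS of (3.7)
rhs_ii = abs(avg(np.sign(t - m) * (phi(t) - t * dphi(m))))                     # RHS of (3.8)
assert lhs >= rhs_i - 1e-6 and lhs >= rhs_ii - 1e-6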
4. Improvements of the Levinson Inequality
Theorem 4.1. If the third derivative of f exists and is nonnegative, then for 0 < x_i < a, p_i > 0 (1 ≤ i ≤ n), \sum_{i=1}^{n} p_i = 1, and P_k = \sum_{i=1}^{k} p_i (2 ≤ k ≤ n − 1), one has
(i)

\sum_{i=1}^{n} p_i f(2a - x_i) - f(2a - \bar x) - \Bigl[\sum_{i=1}^{n} p_i f(x_i) - f(\bar x)\Bigr] \ge \left|\sum_{i=1}^{n} p_i\bigl|f(2a - x_i) - f(x_i) - f(2a - \bar x) + f(\bar x)\bigr| - \bigl|f'(2a - \bar x) + f'(\bar x)\bigr|\sum_{i=1}^{n} p_i|x_i - \bar x|\right|;    (4.1)

(ii) if φ(x) = f(2a − x) − f(x) is monotone and x_i ≥ x̄ for i ∈ I ⊂ {1, 2, ..., n} = I_n, then

\sum_{i=1}^{n} p_i f(2a - x_i) - f(2a - \bar x) - \Bigl[\sum_{i=1}^{n} p_i f(x_i) - f(\bar x)\Bigr] \ge \left|\sum_{i=1}^{n} p_i\operatorname{sgn}(x_i - \bar x)\bigl[f(2a - x_i) - f(x_i) + x_i\bigl(f'(2a - \bar x) + f'(\bar x)\bigr)\bigr] + \bigl[f(2a - \bar x) - f(\bar x) + \bar x\bigl(f'(2a - \bar x) + f'(\bar x)\bigr)\bigr](1 - 2P_I)\right|.    (4.2)

Proof. (i) Since for a 3-convex function f : (0, 2a) → ℝ the function φ(x) = f(2a − x) − f(x) is convex on (0, a), setting φ(x) = f(2a − x) − f(x) in the discrete case of [2, Theorem 2] gives (4.1).
(ii) As f(2a − x) − f(x) is monotone convex, setting φ(x) = f(2a − x) − f(x) in (3.5) gives (4.2).
Ky Fan Inequality
Let x_i ∈ (0, 1/2] be such that x_1 ≥ x_2 ≥ ⋯ ≥ x_k ≥ x̄ ≥ x_{k+1} ≥ ⋯ ≥ x_n. We denote by G_k and A_k the weighted geometric and arithmetic means, respectively, that is,

A_k = \frac{1}{P_k}\sum_{i=1}^{k} p_i x_i \quad (\text{so that } A_n = \bar x), \qquad G_k = \Bigl(\prod_{i=1}^{k} x_i^{p_i}\Bigr)^{1/P_k},    (4.3)

and also by A'_k and G'_k the arithmetic and geometric means of the numbers 1 − x_i, respectively, that is,

A'_k = \frac{1}{P_k}\sum_{i=1}^{k} p_i(1 - x_i) = 1 - A_k, \qquad G'_k = \Bigl(\prod_{i=1}^{k} (1 - x_i)^{p_i}\Bigr)^{1/P_k}.    (4.4)

The following remarkable inequality, due to Ky Fan, is valid [6, page 5]:

\frac{G_n}{G'_n} \le \frac{A_n}{A'_n},    (4.5)

with equality if and only if x_1 = x_2 = ⋯ = x_n.
Inequality (4.5) has evoked the interest of several mathematicians, and in numerous articles new proofs, extensions, refinements, and various related results have been published [7].
The following improvement of the Ky Fan inequality is valid [2].
Corollary 4.2. Let A_n, G_n and A'_n, G'_n be as defined earlier. Then the following inequalities are valid:
(i)

\frac{A_n/A'_n}{G_n/G'_n} \ge \exp\left|\sum_{i=1}^{n} p_i\Bigl|\ln\frac{(1 - x_i)A_n}{x_i A'_n}\Bigr| - \frac{1}{A_n A'_n}\sum_{i=1}^{n} p_i|x_i - A_n|\right|;    (4.6)

(ii)

\frac{A_n/A'_n}{G_n/G'_n} \ge \exp\left|2P_k\Bigl[\ln\frac{G'_k A_n}{G_k A'_n} + \frac{A_k - A_n}{A_n A'_n}\Bigr] + \ln\frac{G_n A'_n}{A_n G'_n}\right|.    (4.7)

Proof. (i) Setting a = 1/2 and f(x) = ln x in (4.1), we get (4.6).
(ii) Consider a = 1/2 and f(x) = ln x; then φ(x) = ln(1 − x) − ln x is strictly monotone convex on the interval (0, 1/2) and has derivative

\varphi'(x) = -\frac{1}{x(1 - x)}.    (4.8)

Then the application of inequality (4.2) to this function gives

\sum_{i=1}^{n} p_i\Bigl[\ln\frac{1 - x_i}{x_i} - \ln\frac{1 - \bar x}{\bar x}\Bigr] \ge \left|\sum_{i=1}^{n} p_i\operatorname{sgn}(x_i - \bar x)\Bigl[\ln\frac{1 - x_i}{x_i} + \frac{x_i}{\bar x(1 - \bar x)}\Bigr] + \Bigl[\ln\frac{1 - \bar x}{\bar x} + \frac{1}{1 - \bar x}\Bigr](1 - 2P_k)\right|.    (4.9)

From (4.9) we get (4.7).
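The refinement (4.6) of the Ky Fan inequality is equally easy to test; the sketch below uses arbitrary sample weights p_i and points x_i in (0, 1/2].

import numpy as np

rng = np.random.default_rng(5)
p = rng.random(6); p /= p.sum()
x = rng.uniform(0.05, 0.5, 6)                         # x_i in (0, 1/2]

An  = np.sum(p * x);     Anp = 1.0 - An               # A_n and A'_n
Gn  = np.prod(x ** p);   Gnp = np.prod((1 - x) ** p)  # G_n and G'_n

lhs = (An / Anp) / (Gn / Gnp)
rhs = np.exp(abs(np.sum(p * np.abs(np.log((1 - x) * An / (x * Anp))))
                 - np.sum(p * np.abs(x - An)) / (An * Anp)))
assert lhs >= rhs - 1e-12                             # inequality (4.6), hence also (4.5)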
5. On Some Inequalities for Csiszár Divergence Measures
Let (Ω, A, μ) be a measure space satisfying |A| > 2, where μ is a σ-finite measure on Ω with values in ℝ ∪ {∞}. Let P be the set of all probability measures on the measurable space (Ω, A) which are absolutely continuous with respect to μ. For P, Q ∈ P, let p = dP/dμ and q = dQ/dμ denote the Radon-Nikodym derivatives of P and Q with respect to μ, respectively.
Csiszár introduced the concept of f-divergence for a convex function f : [0, ∞) → (−∞, ∞) that is continuous at 0 as follows (cf. [8]; see also [9]).

Definition 5.1. Let P, Q ∈ P. Then

I_f(Q, P) = \int_\Omega p(s)\, f\Bigl(\frac{q(s)}{p(s)}\Bigr)\,d\mu(s)    (5.1)

is called the f-divergence of the probability distributions Q and P.
We give some important f-divergences, playing a significant role in information theory and statistics.
(i) The class of χ-divergences: the f-divergences in this class are generated by the family of functions

f_\alpha(u) = |u - 1|^{\alpha}, \quad u \ge 0,\ \alpha \ge 1,
I_{f_\alpha}(Q, P) = \int_\Omega p^{1-\alpha}(s)\,|q(s) - p(s)|^{\alpha}\,d\mu(s).    (5.2)

For α = 1, it gives the total variation distance

V(Q, P) = \int_\Omega |q(s) - p(s)|\,d\mu(s).    (5.3)

For α = 2, it gives the Karl Pearson χ²-divergence

I_{\chi^2}(Q, P) = \int_\Omega \frac{(q(s) - p(s))^{2}}{p(s)}\,d\mu(s).    (5.4)

(ii) The α-order Rényi entropy: for α ∈ ℝ \ {0, 1}, let

f(t) = t^{\alpha}, \quad t > 0.    (5.5)

Then I_f gives the α-order entropy

D_\alpha(Q, P) = \int_\Omega q^{\alpha}(s)\,p^{1-\alpha}(s)\,d\mu(s).    (5.6)

(iii) Harmonic distance: let

f(t) = \frac{2t}{1 + t}, \quad t > 0.    (5.7)

Then I_f gives the harmonic distance

D_H(Q, P) = \int_\Omega \frac{2p(s)q(s)}{p(s) + q(s)}\,d\mu(s).    (5.8)

(iv) Kullback-Leibler: let

f(t) = t\log t, \quad t > 0.    (5.9)

Then the f-divergence functional gives rise to the Kullback-Leibler distance [10]

D_{KL}(Q, P) = \int_\Omega q(s)\log\frac{q(s)}{p(s)}\,d\mu(s).    (5.10)

The one-parametric generalization of the Kullback-Leibler [10] relative information was studied in a different way by Cressie and Read [11].
(v) The dichotomy class: this class is generated by the family of functions g_α : (0, ∞) → ℝ,

g_\alpha(u) =
\begin{cases}
u - 1 - \log u, & \alpha = 0,\\[1mm]
\dfrac{1}{\alpha(1-\alpha)}\bigl[\alpha u + 1 - \alpha - u^{\alpha}\bigr], & \alpha \in \mathbb{R}\setminus\{0, 1\},\\[1mm]
1 - u + u\log u, & \alpha = 1.
\end{cases}    (5.11)

This class gives, for particular values of α, some important divergences. For instance, for α = 1/2 we have the Hellinger distance, and some other divergences of this class are given by

I_{g_\alpha}(Q, P) =
\begin{cases}
Q - P + D_{KL}(P, Q), & \alpha = 0,\\[1mm]
\dfrac{1}{\alpha(1-\alpha)}\bigl[\alpha Q + (1 - \alpha)P - D_\alpha(Q, P)\bigr], & \alpha \in \mathbb{R}\setminus\{0, 1\},\\[1mm]
D_{KL}(Q, P) + P - Q, & \alpha = 1,
\end{cases}    (5.12)

where p(s) and q(s) are positive integrable functions with \int_\Omega p(s)\,d\mu(s) = P and \int_\Omega q(s)\,d\mu(s) = Q.
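For discrete distributions (μ the counting measure) the divergences above reduce to finite sums; the following sketch computes several of them for two sample probability vectors.

import numpy as np

p = np.array([0.1, 0.4, 0.2, 0.3])        # two sample densities on a four-point space
q = np.array([0.3, 0.2, 0.3, 0.2])

V     = np.sum(np.abs(q - p))                      # total variation (5.3)
chi2  = np.sum((q - p) ** 2 / p)                   # Karl Pearson chi^2-divergence (5.4)
alpha = 0.5
D_a   = np.sum(q ** alpha * p ** (1 - alpha))      # alpha-order entropy (5.6)
D_H   = np.sum(2 * p * q / (p + q))                # harmonic distance (5.8)
D_KL  = np.sum(q * np.log(q / p))                  # Kullback-Leibler distance (5.10)

print(V, chi2, D_a, D_H, D_KL)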
There are various other divergences in information theory and statistics, such as Arimoto-type divergences, Matushita's divergence, and Puri-Vincze divergences (cf. [12–14]), used in various problems in information theory and statistics. An application of Theorem 1.1 is the following result given by Csiszár and Körner (cf. [15]).
Theorem 5.2. Let f : (0, ∞) → ℝ be convex, and let p and q be positive integrable functions with \int_\Omega p(s)\,d\mu(s) = P and \int_\Omega q(s)\,d\mu(s) = Q. Then the following inequality is valid:

I_f(P, Q) \ge Q\,f\Bigl(\frac{P}{Q}\Bigr),    (5.13)

where I_f(P, Q) = \int_\Omega q(s)\,f\bigl(p(s)/q(s)\bigr)\,d\mu(s).

Proof. Apply Theorem 1.1 with φ = f, with f(s) replaced by p(s)/q(s), and with the probability measure dν(s) = q(s)\,dμ(s)/Q; multiplying the resulting inequality by Q gives (5.13).
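Inequality (5.13) can be checked directly for positive integrable p and q (here, positive sample vectors under the counting measure); with the sample choice f(t) = t log t it reduces to the classical log-sum inequality.

import numpy as np

rng = np.random.default_rng(6)
p = rng.uniform(0.1, 1.0, 10)             # positive integrable functions (counting measure)
q = rng.uniform(0.1, 1.0, 10)
P, Q = p.sum(), q.sum()

f = lambda t: t * np.log(t)               # a convex function on (0, infinity)

If = np.sum(q * f(p / q))                 # I_f(P, Q) as in Theorem 5.2
assert If >= Q * f(P / Q) - 1e-12         # the lower bound (5.13)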
A similar consequence of Theorems 1.2 and 2.1 in information theory, for the divergence measures discussed above, is the following result.
Theorem 5.3. Let p and q be positive integrable functions with \int_\Omega p(s)\,d\mu(s) = P and \int_\Omega q(s)\,d\mu(s) = Q. Define the function

\Phi_t =
\begin{cases}
\dfrac{1}{t(1-t)}\bigl[P^{t}Q^{1-t} - D_t(P, Q)\bigr], & t \ne 0, 1,\\[1mm]
D_{KL}(Q, P) + Q\log\dfrac{P}{Q}, & t = 0,\\[1mm]
D_{KL}(P, Q) + P\log\dfrac{Q}{P}, & t = 1,
\end{cases}    (5.14)

and let \Phi_t be positive. Then
(i) it holds that

\bigl|\Phi_{p_{ij}}\bigr|_k \ge 0, \quad k = 1, 2, \ldots, n,    (5.15)

where |a_{ij}|_k denotes the determinant of order k with elements a_{ij} and p_{ij} = (p_i + p_j)/2;
(ii) \Phi_t is log-convex.
As we said, in [4] we defined new means of the Cauchy type; here we give an application of these means to divergence measures in the following definition.
Definition 5.4. Let p and q be positive integrable functions with \int_\Omega p(s)\,d\mu(s) = P and \int_\Omega q(s)\,d\mu(s) = Q. The mean M_{s,t} is defined as

M_{s,t} = \Bigl(\frac{\Phi_s}{\Phi_t}\Bigr)^{1/(s-t)}, \quad s \ne t,\ s, t \ne 0, 1,
M_{s,s} = \exp\left(\frac{P^{s}Q^{1-s}\log(P/Q) - \frac{d}{ds}D_s(P, Q)}{P^{s}Q^{1-s} - D_s(P, Q)} - \frac{1 - 2s}{s(1 - s)}\right), \quad s \ne 0, 1,    (5.16)

M_{0,0} = \exp\left(\frac{Q\bigl(\log(P/Q)\bigr)^{2} - \bar{\bar D}_0(P, Q)}{2\bigl(Q\log(P/Q) - \bar D_0(P, Q)\bigr)} + 1\right),
M_{1,1} = \exp\left(\frac{P\bigl(\log(P/Q)\bigr)^{2} - \bar{\bar D}_1(P, Q)}{2\bigl(P\log(P/Q) + \bar D_1(P, Q)\bigr)} - 1\right),    (5.17)

where \bar D_0(P, Q) = \int_\Omega q(s)\log\bigl(p(s)/q(s)\bigr)\,d\mu(s), \bar{\bar D}_0(P, Q) = \int_\Omega q(s)\bigl(\log(p(s)/q(s))\bigr)^{2}\,d\mu(s), \bar D_1(P, Q) = \int_\Omega p(s)\log\bigl(q(s)/p(s)\bigr)\,d\mu(s), and \bar{\bar D}_1(P, Q) = \int_\Omega p(s)\bigl(\log(q(s)/p(s))\bigr)^{2}\,d\mu(s).
Theorem 5.5. Let r, s, t, u be nonnegative real numbers such that r ≤ t and s ≤ u. Then

M_{r,s} \le M_{t,u}.    (5.18)

Proof. By using the log-convexity of \Phi_t, we get the following for r ≤ t, s ≤ u with r ≠ s and t ≠ u:

\Bigl(\frac{\Phi_s}{\Phi_r}\Bigr)^{1/(s-r)} \le \Bigl(\frac{\Phi_u}{\Phi_t}\Bigr)^{1/(u-t)}.    (5.19)

For r = s or t = u we consider the limiting case, and the result follows from the continuity of M_{s,u}.
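The monotonicity in Theorem 5.5 can be illustrated numerically; the sketch below evaluates Φ_t for two positive sample vectors (counting measure) and compares M_{r,s} and M_{t,u} for one choice of r ≤ t and s ≤ u (only the case s ≠ t of Definition 5.4 is implemented).

import numpy as np

p = np.array([0.2, 0.5, 0.3, 0.8])        # positive integrable functions (counting measure)
q = np.array([0.4, 0.3, 0.6, 0.2])
P, Q = p.sum(), q.sum()

def Phi(t):
    # Phi_t of (5.14)
    if t == 0: return np.sum(q * np.log(q / p)) + Q * np.log(P / Q)
    if t == 1: return np.sum(p * np.log(p / q)) + P * np.log(Q / P)
    return (P**t * Q**(1 - t) - np.sum(p**t * q**(1 - t))) / (t * (1 - t))

def M(s, t):
    # the Cauchy-type mean M_{s,t} of Definition 5.4, s != t
    return (Phi(s) / Phi(t)) ** (1.0 / (s - t))

r, s, t, u = 0.3, 0.7, 1.5, 2.5           # r <= t and s <= u
assert M(r, s) <= M(t, u) + 1e-12         # Theorem 5.5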
An application of Theorem 1.3 to divergence measures is the following result, given in [16].

Theorem 5.6. Let f : I ⊆ ℝ₊ → ℝ be a differentiable convex function on the interior I° of I. Then

I_f(P, Q) - Q\,f\Bigl(\frac{P}{Q}\Bigr) \ge \left|\int_\Omega q(s)\Bigl|f\Bigl(\frac{p(s)}{q(s)}\Bigr) - f\Bigl(\frac{P}{Q}\Bigr)\Bigr|\,d\mu(s) - \frac{|f'(P/Q)|}{Q}\,\widetilde Q\right|,    (5.20)

where

\widetilde Q = \int_\Omega \bigl|Q\,p(s) - P\,q(s)\bigr|\,d\mu(s).    (5.21)

Proof. Apply Theorem 1.3 with φ = f, with f(s) replaced by p(s)/q(s), and with the probability measure dν(s) = q(s)\,dμ(s)/Q; multiplying the resulting inequality by Q gives (5.20).
Theorem 5.7. Let f : I ⊆ ℝ₊ → ℝ be a differentiable monotone convex function on the interior I° of I, and let p(s)/q(s) > P/Q for s ∈ Ω̄ ⊂ Ω. Then

I_f(P, Q) - Q\,f\Bigl(\frac{P}{Q}\Bigr) \ge \left|\int_\Omega \operatorname{sgn}\Bigl(\frac{p(s)}{q(s)} - \frac{P}{Q}\Bigr) f\Bigl(\frac{p(s)}{q(s)}\Bigr) q(s)\,d\mu(s) - f'\Bigl(\frac{P}{Q}\Bigr)\int_\Omega \operatorname{sgn}\Bigl(\frac{p(s)}{q(s)} - \frac{P}{Q}\Bigr) p(s)\,d\mu(s) + Q\Bigl[f\Bigl(\frac{P}{Q}\Bigr) - \frac{P}{Q}\,f'\Bigl(\frac{P}{Q}\Bigr)\Bigr]\Bigl(1 - \frac{2\bar Q}{Q}\Bigr)\right|,    (5.22)

where

\bar Q = \int_{\bar\Omega} q(s)\,d\mu(s).    (5.23)

Proof. Apply Theorem 3.1 with φ = f, with f(s) replaced by p(s)/q(s), and with the probability measure dν(s) = q(s)\,dμ(s)/Q; multiplying the resulting inequality by Q gives (5.22).
Corollary 5.8. It holds that

D_H(P, Q) - \frac{2PQ}{P + Q} \ge \left|\int_\Omega \operatorname{sgn}\Bigl(\frac{p(s)}{q(s)} - \frac{P}{Q}\Bigr)\frac{2p(s)q(s)}{p(s) + q(s)}\,d\mu(s) - \frac{2Q^{2}}{(P + Q)^{2}}\int_\Omega \operatorname{sgn}\Bigl(\frac{p(s)}{q(s)} - \frac{P}{Q}\Bigr) p(s)\,d\mu(s) + \Bigl[\frac{2PQ}{P + Q} - \frac{2PQ^{2}}{(P + Q)^{2}}\Bigr]\Bigl(1 - \frac{2\bar Q}{Q}\Bigr)\right|,    (5.24)

where

\bar Q = \int_{\bar\Omega} q(s)\,d\mu(s),    (5.25)

and Ω̄ is as in Theorem 5.7.

Proof. The proof follows by setting f(t) = 2t/(1 + t), t > 0, in Theorem 5.7.
Corollary 5.9. Let g_α : ℝ₊ → ℝ be as given in (5.11). Then:
(i) for α = 0 one has

D_{KL}(Q, P) + Q\log\frac{P}{Q} \ge \left|\int_\Omega \operatorname{sgn}\Bigl(\frac{p(s)}{q(s)} - \frac{P}{Q}\Bigr)\Bigl[\frac{p(s)}{q(s)} - 1 - \log\frac{p(s)}{q(s)}\Bigr] q(s)\,d\mu(s) - \frac{P - Q}{P}\int_\Omega \operatorname{sgn}\Bigl(\frac{p(s)}{q(s)} - \frac{P}{Q}\Bigr) p(s)\,d\mu(s) - Q\log\frac{P}{Q}\Bigl(1 - \frac{2\bar Q}{Q}\Bigr)\right|;    (5.26)

(ii) for α ∈ ℝ \ {0, 1} one has

\frac{P^{\alpha}Q^{1-\alpha} - D_\alpha(P, Q)}{\alpha(1 - \alpha)} \ge \frac{1}{\alpha(1 - \alpha)}\left|\int_\Omega \operatorname{sgn}\Bigl(\frac{p(s)}{q(s)} - \frac{P}{Q}\Bigr)\Bigl[\alpha\frac{p(s)}{q(s)} + 1 - \alpha - p^{\alpha}(s)q^{-\alpha}(s)\Bigr] q(s)\,d\mu(s) - \alpha\bigl(1 - P^{\alpha-1}Q^{1-\alpha}\bigr)\int_\Omega \operatorname{sgn}\Bigl(\frac{p(s)}{q(s)} - \frac{P}{Q}\Bigr) p(s)\,d\mu(s) + \bigl[\alpha P + (1 - \alpha)Q - P^{\alpha}Q^{1-\alpha} - \alpha P\bigl(1 - P^{\alpha-1}Q^{1-\alpha}\bigr)\bigr]\Bigl(1 - \frac{2\bar Q}{Q}\Bigr)\right|;    (5.27)

(iii) for α = 1 one has

D_{KL}(P, Q) + P\log\frac{Q}{P} \ge \left|\int_\Omega \operatorname{sgn}\Bigl(\frac{p(s)}{q(s)} - \frac{P}{Q}\Bigr)\Bigl[1 - \frac{p(s)}{q(s)} + \frac{p(s)}{q(s)}\log\frac{p(s)}{q(s)}\Bigr] q(s)\,d\mu(s) - \log\frac{P}{Q}\int_\Omega \operatorname{sgn}\Bigl(\frac{p(s)}{q(s)} - \frac{P}{Q}\Bigr) p(s)\,d\mu(s) + (Q - P)\Bigl(1 - \frac{2\bar Q}{Q}\Bigr)\right|,    (5.28)

where

\bar Q = \int_{\bar\Omega} q(s)\,d\mu(s),    (5.29)

and Ω̄ is as in Theorem 5.7.

Proof. The proof follows by setting f = g_α, with g_α as given in (5.11), in Theorem 5.7.
Acknowledgments
This research work is funded by the Higher Education Commission Pakistan. The research
of the fourth author is supported by the Croatian Ministry of Science, Education and Sports
under the Research Grants 117-1170889-0888.
References
[1] M. Anwar and J. Pečarić, "On logarithmic convexity for differences of power means and related results," Mathematical Inequalities & Applications, vol. 12, no. 1, pp. 81–90, 2009.
[2] S. Hussain and J. Pečarić, "An improvement of Jensen's inequality with some applications," Asian-European Journal of Mathematics, vol. 2, no. 1, pp. 85–94, 2009.
[3] S. Simić, "On logarithmic convexity for differences of power means," Journal of Inequalities and Applications, vol. 2007, Article ID 37359, 8 pages, 2007.
[4] M. Anwar and J. Pečarić, "New means of Cauchy's type," Journal of Inequalities and Applications, vol. 2008, Article ID 163202, 10 pages, 2008.
[5] S. S. Dragomir and A. McAndrew, "Refinements of the Hermite-Hadamard inequality for convex functions," Journal of Inequalities in Pure and Applied Mathematics, vol. 6, no. 2, article 140, 6 pages, 2005.
[6] H. Alzer, "The inequality of Ky Fan and related results," Acta Applicandae Mathematicae, vol. 38, no. 3, pp. 305–354, 1995.
[7] E. F. Beckenbach and R. Bellman, Inequalities, vol. 30 of Ergebnisse der Mathematik und ihrer Grenzgebiete, N. F., Springer, Berlin, Germany, 1961.
[8] I. Csiszár, "Information measures: a critical survey," in Transactions of the 7th Prague Conference on Information Theory, Statistical Decision Functions and the 8th European Meeting of Statisticians, pp. 73–86, Academia, Prague, Czech Republic, 1978.
[9] M. C. Pardo and I. Vajda, "On asymptotic properties of information-theoretic divergences," IEEE Transactions on Information Theory, vol. 49, no. 7, pp. 1860–1868, 2003.
[10] S. Kullback and R. A. Leibler, "On information and sufficiency," Annals of Mathematical Statistics, vol. 22, pp. 79–86, 1951.
[11] N. Cressie and T. R. C. Read, "Multinomial goodness-of-fit tests," Journal of the Royal Statistical Society, Series B, vol. 46, no. 3, pp. 440–464, 1984.
[12] P. Kafka, F. Österreicher, and I. Vincze, "On powers of f-divergences defining a distance," Studia Scientiarum Mathematicarum Hungarica, vol. 26, no. 4, pp. 415–422, 1991.
[13] F. Liese and I. Vajda, Convex Statistical Distances, vol. 95 of Teubner Texts in Mathematics, BSB B. G. Teubner Verlagsgesellschaft, Leipzig, Germany, 1987.
[14] F. Österreicher and I. Vajda, "A new class of metric divergences on probability spaces and its applicability in statistics," Annals of the Institute of Statistical Mathematics, vol. 55, no. 3, pp. 639–653, 2003.
[15] I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems, Probability and Mathematical Statistics, Academic Press, New York, NY, USA, 1981.
[16] M. Anwar, S. Hussain, and J. Pečarić, "Some inequalities for Csiszár-divergence measures," International Journal of Mathematical Analysis, vol. 3, no. 26, pp. 1295–1304, 2009.