Physical Fluctuomatics / Applied Stochastic Process
Lecture 5: Probabilistic information processing by means of graphical model
Kazuyuki Tanaka
Department of Applied Information Sciences, Graduate School of Information Sciences, Tohoku University
kazu@smapip.is.tohoku.ac.jp
http://www.smapip.is.tohoku.ac.jp/~kazu/
Lecture note for this lecture:
K. Tanaka: Introduction of Image Processing by Probabilistic Models, Morikita Publishing Co., Ltd., 2006 (in Japanese).
Contents
1. Introduction
2. Probabilistic Image Processing
3. Gaussian Graphical Model
4. Statistical Performance Analysis
5. Concluding Remarks
Contents
1. Introduction
2. Probabilistic Image Processing
3. Gaussian Graphical Model
4. Statistical Performance Analysis
5. Concluding Remarks
Markov Random Fields for Image Processing
Markov Random Fields are one of the probabilistic methods for image processing.
S. Geman and D. Geman (1986), IEEE Transactions on PAMI: image processing by Markov Random Fields (MRF) (simulated annealing, line fields).
J. Zhang (1992), IEEE Transactions on Signal Processing: image processing by the EM algorithm for Markov Random Fields (MRF) (mean-field methods).
Markov Random Fields for Image Processing
In Markov Random Fields, we have to consider not only the states with high probabilities but also the ones with low probabilities.
In Markov Random Fields, we have to estimate not only the image but also the hyperparameters in the probabilistic model.
We have to perform the calculations of statistical quantities repeatedly.
[Figure: both the estimation of the image and the hyperparameter estimation require statistical quantities of the probabilistic model.]
We can calculate statistical quantities by adopting the Gaussian graphical model as a prior probabilistic model and by using Gaussian integral formulas.
Purpose of My Talk
Review of the formulation of probabilistic models for image processing by means of conventional statistical schemes.
Review of probabilistic image processing using the Gaussian graphical model (Gaussian Markov Random Fields) as the most basic example.
K. Tanaka: Statistical-Mechanical Approach to Image Processing (Topical Review), J. Phys. A: Math. Gen., vol. 35, pp. R81-R150, 2002.
Sections 2 and 4 of this paper are summarized in the present talk.
Contents
1. Introduction
2. Probabilistic Image Processing
3. Gaussian Graphical Model
4. Statistical Performance Analysis
5. Concluding Remarks
Image Representation in Computer Vision
A digital image is defined on a set of points arranged on a square lattice. The elements of such a digital array are called pixels. We have to treat more than 100,000 pixels even in digital cameras and mobile phones.
[Figure: pixels labelled by coordinates (x, y) on a square lattice; for example, a 640 × 480 image has 307,200 pixels.]
Image Representation in Computer Vision
[Figure: light intensity f(x, y) at pixel (x, y), with 0 ≤ f(x, y) ≤ 255; a 256 × 256 image has 65,536 pixels.]
At each point, the intensity of light is represented as an integer or a real number in the digital image data. A monochrome digital image is then expressed as a two-dimensional light intensity function, and its value is proportional to the brightness of the image at the pixel.
Noise Reduction by
Conventional Filters
192
202
190
202
219
120
100
218
110
The function of a linear
filter is to take the sum
of the product of the
mask coefficients and the
intensities of the pixels.
Smoothing Filters
192 202 190


Average202 219 120  173
100 218 110


192
202
190
202
173
120
100
218
110
It is expected that probabilistic algorithms for
image processing can be constructed from such
aspects in the conventional signal processing.
Markov Random Fields
Algorithm
Probabilistic Image Processing
物理フラクチュオマティクス論(東北大)
11
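As a concrete illustration of the smoothing filter above, the following sketch (an addition of this note, not part of the original slides) applies a 3 × 3 mean filter to a NumPy image array; the function name mean_filter3x3 is an illustrative choice.

```python
import numpy as np

def mean_filter3x3(image):
    # Replace each interior pixel by the average over its 3 x 3 window,
    # i.e. a linear filter whose mask coefficients are all 1/9.
    out = image.astype(float).copy()
    h, w = image.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y, x] = image[y - 1:y + 2, x - 1:x + 2].mean()
    return out

window = np.array([[192, 202, 190],
                   [202, 219, 120],
                   [100, 218, 110]], dtype=float)
print(round(mean_filter3x3(window)[1, 1]))   # prints 173, as on the slide
```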
Bayes Formula and Bayesian Network
Bayes rule:
  Pr{A | B} = Pr{B | A} Pr{A} / Pr{B}
(Pr{A}: prior probability, Pr{B | A}: data-generating process, Pr{A | B}: posterior probability)
Event A corresponds to the original information to estimate, and event B is given as the observed data. Thus the Bayes formula can be applied to the estimation of the original information from the given data.
[Figure: Bayesian network A → B.]
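As a worked illustration (with numbers chosen here for this note, not taken from the slides): if a pixel is black with prior probability Pr{A = black} = 0.5 and the degradation flips the observed colour with probability 0.2, then observing B = white gives Pr{A = black | B = white} = (0.2 × 0.5) / (0.2 × 0.5 + 0.8 × 0.5) = 0.2.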
Image Restoration by Probabilistic Model
Original Image --(noise in transmission)--> Degraded Image
Assumption 1: The degraded image is randomly generated from the original image according to the degradation process.
Assumption 2: The original image is randomly generated according to the prior probability.
Bayes formula:
  Pr{Original Image | Degraded Image}   (Posterior)
    = Pr{Degraded Image | Original Image} Pr{Original Image} / Pr{Degraded Image}
      (Likelihood × Prior / Marginal Likelihood)
Image Restoration by Probabilistic Model
The original image and the degraded image are represented by f = {f_i} and g = {g_i}, respectively.
  r_i = (x_i, y_i): position vector of pixel i
  f_i: light intensity of pixel i in the original image
  g_i: light intensity of pixel i in the degraded image
Probabilistic Modeling of Image Restoration
  Pr{Original Image | Degraded Image} ∝ Pr{Degraded Image | Original Image} Pr{Original Image}   (Posterior ∝ Likelihood × Prior)
Assumption 1: A given degraded image is obtained from the original image by changing the state of each pixel to another state with the same probability, independently of the other pixels:
  Pr{Degraded Image g | Original Image f} = Pr{G = g | F = f} = Π_{i∈V} Φ(g_i, f_i)   (Random Fields)
[Figure: for each pixel i, either g_i = f_i or g_i ≠ f_i.]
Probabilistic Modeling of Image Restoration
  Pr{Original Image | Degraded Image} ∝ Pr{Degraded Image | Original Image} Pr{Original Image}   (Posterior ∝ Likelihood × Prior)
Assumption 2: The original image is generated according to a prior probability. The prior probability consists of a product of functions defined on the neighbouring pixels:
  Pr{Original Image f} = Pr{F = f} ∝ Π_{{i,j}∈E} Ψ(f_i, f_j)   (Random Fields)
(product over all the nearest-neighbour pairs of pixels {i, j})
Prior Probability for Binary Image
It is important how we should choose the function Ψ(f_i, f_j) in the prior probability:
  Pr{F = f} ∝ Π_{{i,j}∈E} Ψ(f_i, f_j),   f_i ∈ {0, 1}
We assume that every nearest-neighbour pair of pixels tends to take the same state in the prior probability:
  Ψ(1,1) = Ψ(0,0) > Ψ(0,1) = Ψ(1,0)
[Figure: probabilities of the configurations of a neighbouring pixel pair; a pair in the same state has probability p, which is larger than the probability (1−p)/2 of a pair in different states.]
Prior Probability for Binary Image
Which state should the center pixel take when the states of the neighbouring pixels are all fixed to white?
The prior probability prefers the configuration with the least number of red lines (a red line marks a nearest-neighbour pair in different states).
[Figure: probabilities of the nearest-neighbour pairs of pixels; the all-white configuration has a higher prior probability than the configuration with a black center pixel.]
Prior Probability for Binary Image
Which state should the center pixel take when the states of the neighbouring pixels are fixed as in this figure (partly white and partly black)?
The prior probability prefers the configuration with the least number of red lines.
[Figure: candidate states of the center pixel ranked by the number of mismatched nearest-neighbour pairs.]
What happens in the case of a large number of pixels?
[Figure: covariance between the nearest-neighbour pairs of pixels plotted against ln p; the covariance rises steeply near the critical point.]
Sampling by Markov chain Monte Carlo: small p gives a disordered state, large p gives an ordered state, and large fluctuations appear near the critical point. Patterns containing both ordered and disordered regions are often generated near the critical point.
Pattern near Critical Point of Prior Probability
[Figure: covariance between the nearest-neighbour pairs of pixels against ln p, together with a sample pattern generated near the critical point.]
We regard the patterns generated near the critical point as similar to the local patterns in real-world images.
Contents
1. Introduction
2. Probabilistic Image Processing
3. Gaussian Graphical Model
4. Statistical Performance Analysis
5. Concluding Remarks
Bayesian Image Analysis by Gaussian Graphical Model
Prior probability (V: set of all the pixels, E: set of all the nearest-neighbour pairs of pixels, f_i ∈ (−∞, +∞)):
  Pr{F = f | α} = (1/Z_PR(α)) exp( −(α/2) Σ_{{i,j}∈E} (f_i − f_j)² )
  Z_PR(α) = ∫…∫ exp( −(α/2) Σ_{{i,j}∈E} (z_i − z_j)² ) dz_1 dz_2 … dz_|V|
Patterns are generated by the Markov chain Monte Carlo (MCMC) method.
[Figure: sample patterns for α = 0.0001, 0.0005, 0.0030.]
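The slides generate these patterns by MCMC. As a minimal sketch (an assumption of this note), one can instead draw exact samples f ~ N(0, (αC)⁻¹) with a Cholesky factorization, using the matrix C introduced later in the slides (4 on the diagonal, −1 for nearest-neighbour pairs); NumPy and the function names are illustrative choices.

```python
import numpy as np

def lattice_C(h, w):
    # Matrix C of the Gaussian graphical model: 4 on the diagonal,
    # -1 for nearest-neighbour pairs on the h x w lattice, 0 otherwise.
    n = h * w
    C = 4.0 * np.eye(n)
    idx = lambda x, y: y * w + x
    for y in range(h):
        for x in range(w):
            if x + 1 < w:
                C[idx(x, y), idx(x + 1, y)] = C[idx(x + 1, y), idx(x, y)] = -1.0
            if y + 1 < h:
                C[idx(x, y), idx(x, y + 1)] = C[idx(x, y + 1), idx(x, y)] = -1.0
    return C

def sample_prior(h, w, alpha, rng):
    # Draw f ~ N(0, (alpha C)^(-1)) by solving L^T f = xi, where alpha C = L L^T.
    L = np.linalg.cholesky(alpha * lattice_C(h, w))
    xi = rng.standard_normal(h * w)
    return np.linalg.solve(L.T, xi).reshape(h, w)

rng = np.random.default_rng(0)
for alpha in (0.0001, 0.0005, 0.0030):
    pattern = sample_prior(32, 32, alpha, rng)
    print(alpha, pattern.std())   # smaller alpha gives larger fluctuations
```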
Bayesian Image Analysis by Gaussian Graphical Model
The degradation process is assumed to be additive white Gaussian noise (V: set of all the pixels, f_i, g_i ∈ (−∞, +∞)):
  Pr{G = g | F = f, σ} = Π_{i∈V} (1/√(2πσ²)) exp( −(f_i − g_i)²/(2σ²) )
The degraded image g is obtained by adding white Gaussian noise n to the original image f:
  g_i − f_i ~ N(0, σ²)
[Figure: original image f + Gaussian noise n = degraded image g; histogram of the Gaussian random numbers.]
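A minimal sketch of this degradation process (and of the mean square error used later and asked for in Problem 5-7), assuming NumPy; the flat 64 × 64 test image is only a placeholder for a real standard image.

```python
import numpy as np

def degrade(f, sigma, rng):
    # Additive white Gaussian noise: g_i = f_i + n_i with n_i ~ N(0, sigma^2).
    return f + sigma * rng.standard_normal(f.shape)

def mse(a, b):
    # Mean square error (1/|V|) sum_i (a_i - b_i)^2 over all pixels.
    return float(np.mean((a - b) ** 2))

rng = np.random.default_rng(0)
f = np.full((64, 64), 128.0)          # placeholder "original image"
for sigma in (10, 20, 30, 40):
    g = degrade(f, sigma, rng)
    print(sigma, mse(f, g))           # MSE is close to sigma^2
```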
Bayesian Image Analysis
  Prior probability Pr{F = f}: original image f
  Degradation process Pr{G = g | F = f}: degraded image g
  Posterior probability:
    Pr{F = f | G = g} = Pr{G = g | F = f} Pr{F = f} / Pr{G = g}
      ∝ ( Π_{i∈V} Φ(f_i, g_i) ) ( Π_{{i,j}∈E} Ψ(f_i, f_j) )
(V: set of all the pixels, E: set of all the nearest-neighbour pairs of pixels)
Image processing is reduced to calculations of averages, variances and covariances in the posterior probability.
Estimation of Original Image
We have several choices for estimating the restored image from the posterior probability. In each choice, the computational time is in general of exponential order in the number of pixels.
(1) Maximum A Posteriori (MAP) estimation:
  f̂ = arg max_z Pr{F = z | G = g}
(2) Maximum Posterior Marginal (MPM) estimation:
  f̂_i = arg max_{z_i} Pr{F_i = z_i | G = g},   Pr{F_i = f_i | G = g} = Σ_{f \ f_i} Pr{F = f | G = g}
(3) Thresholded Posterior Mean (TPM) estimation:
  f̂_i = arg min_{z_i} ( z_i − m_i )²,  where m = ∫ z Pr{F = z | G = g} dz is the posterior mean
Statistical Estimation of Hyperparameters
The hyperparameters α, σ are determined so as to maximize the marginal likelihood Pr{G = g | α, σ} with respect to α, σ:
  (α̂, σ̂) = arg max_{α,σ} Pr{G = g | α, σ}
  Pr{G = g | α, σ} = ∫ Pr{G = g | F = z, σ} Pr{F = z | α} dz
[Figure: original image Pr{F = f | α} → degraded image Pr{G = g | F = f, σ}; marginalizing with respect to F gives the marginal likelihood Pr{G = g | α, σ}.]
Bayesian Image Analysis
A posteriori probability (f_i, g_i ∈ (−∞, +∞)):
  Pr{F = f | G = g, α, σ} = Pr{G = g | F = f, σ} Pr{F = f | α} / Pr{G = g | α, σ}
    = (1/Z_POS(g, α, σ)) exp( −(1/(2σ²)) Σ_{i∈V} (f_i − g_i)² − (α/2) Σ_{{i,j}∈E} (f_i − f_j)² )
    = (1/Z_POS(g, α, σ)) exp( −(1/(2σ²)) (f − g)(f − g)ᵀ − (α/2) f C fᵀ )
  Z_POS(g, α, σ) = ∫…∫ exp( −(1/(2σ²)) (z − g)(z − g)ᵀ − (α/2) z C zᵀ ) dz_1 dz_2 … dz_|V|
This posterior probability is a Gaussian graphical model.
Average of Posterior Probability
  ⟨F⟩_{α,σ} = ∫…∫ z Pr{F = z | G = g, α, σ} dz_1 dz_2 … dz_|V|
    = [ ∫…∫ z exp( −(1/(2σ²)) ( z − (I/(I + ασ²C)) g ) (I + ασ²C) ( z − (I/(I + ασ²C)) g )ᵀ ) dz_1 dz_2 … dz_|V| ]
      / [ ∫…∫ exp( −(1/(2σ²)) ( z − (I/(I + ασ²C)) g ) (I + ασ²C) ( z − (I/(I + ασ²C)) g )ᵀ ) dz_1 dz_2 … dz_|V| ]
    = (I/(I + ασ²C)) g
Gaussian integral formula:
  ∫…∫ (z − m) exp( −(1/2) (z − m) A (z − m)ᵀ ) dz_1 dz_2 … dz_|V| = (0, 0, …, 0)
Bayesian Image Analysis by Gaussian Graphical Model
Posterior probability (V: set of all the pixels, E: set of all the nearest-neighbour pairs of pixels, f_i, g_i ∈ (−∞, +∞)):
  Pr{F = f | G = g, α, σ} ∝ exp( −(1/(2σ²)) Σ_{i∈V} (f_i − g_i)² − (α/2) Σ_{{i,j}∈E} (f_i − f_j)² )
The average of the posterior probability can be calculated by using the multi-dimensional Gaussian integral formula:
  f̂ = ∫ z Pr{F = z | G = g, α, σ} dz = (I/(I + ασ²C)) g
where C is the |V| × |V| matrix
  (C)_ij = 4 (i = j),  −1 ({i, j} ∈ E),  0 (otherwise)
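A sketch of this posterior-mean computation, assuming SciPy's sparse solvers; rather than inverting I + ασ²C explicitly, it solves the equivalent linear system (I + ασ²C) f̂ = g. Function names are illustrative.

```python
import numpy as np
from scipy.sparse import identity, lil_matrix
from scipy.sparse.linalg import spsolve

def lattice_C(h, w):
    # (C)_ij = 4 on the diagonal, -1 for nearest-neighbour pairs, 0 otherwise.
    n = h * w
    C = lil_matrix((n, n))
    C.setdiag(4.0)
    idx = lambda x, y: y * w + x
    for y in range(h):
        for x in range(w):
            if x + 1 < w:
                C[idx(x, y), idx(x + 1, y)] = C[idx(x + 1, y), idx(x, y)] = -1.0
            if y + 1 < h:
                C[idx(x, y), idx(x, y + 1)] = C[idx(x, y + 1), idx(x, y)] = -1.0
    return C.tocsr()

def posterior_mean(g, alpha, sigma):
    # f_hat = (I + alpha sigma^2 C)^(-1) g, computed as a sparse linear solve.
    h, w = g.shape
    A = identity(h * w, format="csr") + alpha * sigma**2 * lattice_C(h, w)
    return spsolve(A, g.ravel()).reshape(h, w)
```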
Statistical Estimation of Hyperparameters
  Pr{G = g | α, σ} = ∫ Pr{G = g | F = z, σ} Pr{F = z | α} dz = Z_POS(g, α, σ) / ( (2πσ²)^(|V|/2) Z_PR(α) )
  Z_POS(g, α, σ) = ∫…∫ exp( −(1/(2σ²)) (z − g)(z − g)ᵀ − (α/2) z C zᵀ ) dz_1 dz_2 … dz_|V|
  Z_PR(α) = ∫…∫ exp( −(α/2) z C zᵀ ) dz_1 dz_2 … dz_|V|
[Figure: original image Pr{F = f | α} → degraded image Pr{G = g | F = f, σ}; marginalizing with respect to F gives the marginal likelihood Pr{G = g | α, σ}.]
Calculations of Partition Function
  Z_PR(α) = ∫…∫ exp( −(α/2) z C zᵀ ) dz_1 dz_2 … dz_|V| = √( (2π)^|V| / (α^|V| det C) )
  Z_POS(g, α, σ) = ∫…∫ exp( −(1/(2σ²)) ( z − (I/(I + ασ²C)) g ) (I + ασ²C) ( z − (I/(I + ασ²C)) g )ᵀ − (α/2) g (C/(I + ασ²C)) gᵀ ) dz_1 dz_2 … dz_|V|
    = exp( −(α/2) g (C/(I + ασ²C)) gᵀ ) ∫…∫ exp( −(1/(2σ²)) ( z − (I/(I + ασ²C)) g ) (I + ασ²C) ( z − (I/(I + ασ²C)) g )ᵀ ) dz_1 dz_2 … dz_|V|
    = √( (2πσ²)^|V| / det(I + ασ²C) ) exp( −(α/2) g (C/(I + ασ²C)) gᵀ )
Gaussian integral formula:
  ∫…∫ exp( −(1/2) (z − m) A (z − m)ᵀ ) dz_1 dz_2 … dz_|V| = √( (2π)^|V| / det A )
  (A is a real symmetric and positive definite matrix.)
Exact Expression of Marginal Likelihood in Gaussian Graphical Model
  Pr{G = g | α, σ} = ∫ Pr{G = g | F = z, σ} Pr{F = z | α} dz = Z_POS(g, α, σ) / ( (2πσ²)^(|V|/2) Z_PR(α) )
By the multi-dimensional Gaussian integral formula:
  Pr{G = g | α, σ} = √( det(αC) / ( (2π)^|V| det(I + ασ²C) ) ) exp( −(α/2) g (C/(I + ασ²C)) gᵀ )
We can construct an exact EM algorithm:
  (α̂, σ̂) = arg max_{α,σ} Pr{G = g | α, σ}
  f̂ = (I/(I + α̂σ̂²C)) g
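A sketch of evaluating the logarithm of this marginal likelihood, assuming NumPy and a dense eigendecomposition of C (practical only for small images); C can be built, for example, as in the earlier sketches. The function name is illustrative.

```python
import numpy as np

def log_marginal_likelihood(g, alpha, sigma, C):
    # log Pr{G = g | alpha, sigma} for the Gaussian graphical model.
    n = g.size
    lam, U = np.linalg.eigh(C)            # C = U diag(lam) U^T
    yt = U.T @ g.ravel()                  # g in the eigenbasis of C
    d = 1.0 + alpha * sigma**2 * lam      # eigenvalues of I + alpha sigma^2 C
    quad = np.sum(lam * yt**2 / d)        # g (C/(I + alpha sigma^2 C)) g^T
    return (0.5 * np.sum(np.log(alpha * lam))
            - 0.5 * n * np.log(2.0 * np.pi)
            - 0.5 * np.sum(np.log(d))
            - 0.5 * alpha * quad)
```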
Bayesian Image Analysis by Gaussian Graphical Model
Iteration procedure (EM algorithm) in the Gaussian graphical model:
  1/α(t) = (1/|V|) Tr[ σ(t−1)² C / (I + α(t−1)σ(t−1)² C) ] + (1/|V|) g ( C / (I + α(t−1)σ(t−1)² C)² ) gᵀ
  σ(t)² = (1/|V|) Tr[ σ(t−1)² I / (I + α(t−1)σ(t−1)² C) ] + (1/|V|) g ( α(t−1)²σ(t−1)⁴ C² / (I + α(t−1)σ(t−1)² C)² ) gᵀ
  m(t) = (I / (I + α(t)σ(t)² C)) g,   f̂_i(t) = arg min_{z_i} ( z_i − m_i(t) )²
[Figure: convergence of the hyperparameter estimate α(t) (vertical axis, 0 to 0.001) against the iteration number t (0 to 100), together with the degraded image g and the restored image f̂.]
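A sketch of this iteration procedure, assuming NumPy and a dense eigendecomposition of C (small images only); in the eigenbasis of C the traces and quadratic forms above reduce to simple sums over eigenvalues. The initial values and iteration count are illustrative choices.

```python
import numpy as np

def em_hyperparameters(g, C, alpha0=1e-4, sigma0=30.0, n_iter=100):
    # EM iteration for (alpha, sigma) of the Gaussian graphical model.
    n = g.size
    lam, U = np.linalg.eigh(C)            # C = U diag(lam) U^T
    yt = U.T @ g.ravel()                  # g in the eigenbasis of C
    alpha, sigma2 = alpha0, sigma0 ** 2
    for _ in range(n_iter):
        d = 1.0 + alpha * sigma2 * lam    # eigenvalues of I + alpha sigma^2 C
        # both updates use the values from the previous iteration
        inv_alpha = (sigma2 * np.sum(lam / d) + np.sum(lam * yt**2 / d**2)) / n
        sigma2 = (sigma2 * np.sum(1.0 / d) +
                  np.sum((alpha * sigma2 * lam)**2 * yt**2 / d**2)) / n
        alpha = 1.0 / inv_alpha
    m = U @ (yt / (1.0 + alpha * sigma2 * lam))   # posterior mean (I + alpha sigma^2 C)^(-1) g
    return alpha, np.sqrt(sigma2), m.reshape(g.shape)
```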
Image Restoration by Markov Random Field Model and Conventional Filters
Mean square error between the original image f and the restored image f̂ (V: set of all the pixels):
  MSE = (1/|V|) Σ_{i∈V} (f̂_i − f_i)²
  Statistical method (MRF):  MSE = 315
  Lowpass filter (3x3):      MSE = 388
  Lowpass filter (5x5):      MSE = 413
  Median filter (3x3):       MSE = 486
  Median filter (5x5):       MSE = 445
[Figure: original image, degraded image, and restored images obtained by the MRF model, the (3x3) lowpass filter and the (5x5) median filter.]
Contents
1. Introduction
2. Probabilistic Image Processing
3. Gaussian Graphical Model
4. Statistical Performance Analysis
5. Concluding Remarks
Performance Analysis
[Figure: a signal x is corrupted by additive white Gaussian noise to produce observed data y_1, …, y_5; each y_n is restored through the posterior probability to give the estimates x̂_1, …, x̂_5.]
Sample average of the mean square error:
  [MSE] = (1/5) Σ_{n=1}^{5} || x̂_n − x ||²
Statistical Performance Analysis
  MSE(α | σ, x) = (1/|V|) ∫ || ĥ(y, α, σ) − x ||² Pr{Y = y | X = x, σ} dy
[Figure: the original image x is degraded by additive white Gaussian noise Pr{Y = y | X = x, σ}; the restored image ĥ(y, α, σ) is obtained from the posterior probability Pr{X = x | Y = y, α, σ}.]
Statistical Performance Analysis
  MSE(α | σ, x) = (1/|V|) ∫ || ĥ(y, α, σ) − x ||² Pr{Y = y | X = x, σ} dy
where the restored image is the posterior mean
  ĥ(y, α, σ) = ∫ x Pr{X = x | Y = y, α, σ} dx = (I/(I + ασ²C)) y
and the degradation process is additive white Gaussian noise
  Pr{Y = y | X = x, σ} = Π_{i∈V} (1/√(2πσ²)) exp( −(x_i − y_i)²/(2σ²) ) ∝ exp( −(1/(2σ²)) || x − y ||² )
with the |V| × |V| matrix
  (C)_ij = 4 (i = j),  −1 ({i, j} ∈ E),  0 (otherwise)
Statistical Performance Estimation for Gaussian Markov Random Fields
  MSE(α | σ, x) = (1/|V|) ∫ || ĥ(y, α, σ) − x ||² Pr{Y = y | X = x, σ} dy
    = (1/|V|) ∫ || (I/(I + ασ²C)) y − x ||² ( 1/(2πσ²) )^(|V|/2) exp( −(1/(2σ²)) || y − x ||² ) dy
Writing
  (I/(I + ασ²C)) y − x = (I/(I + ασ²C)) (y − x) − (ασ²C/(I + ασ²C)) x
and noting that the cross term is linear in y − x and therefore averages to zero over the Gaussian noise, we obtain
  MSE(α | σ, x) = (1/|V|) Tr[ σ²I / (I + ασ²C)² ] + (1/|V|) x ( α²σ⁴C² / (I + ασ²C)² ) xᵀ
with
  (C)_ij = 4 (i = j),  −1 ({i, j} ∈ E),  0 (otherwise)
Statistical Performance Estimation for Gaussian Markov Random Fields
  MSE(α | σ, x) = (1/|V|) ∫ || ĥ(y, α, σ) − x ||² Pr{Y = y | X = x, σ} dy
    = (1/|V|) Tr[ σ²I / (I + ασ²C)² ] + (1/|V|) x ( α²σ⁴C² / (I + ασ²C)² ) xᵀ
  (C)_ij = 4 (i = j),  −1 ({i, j} ∈ E),  0 (otherwise)
[Figure: two plots of MSE(α | σ, x) against α (0 to 0.003) for σ = 40, with MSE values ranging up to about 600.]
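A sketch of evaluating this expression, assuming NumPy and a dense eigendecomposition of C; in the eigenbasis of C the two terms become sums over eigenvalues. The function name is illustrative.

```python
import numpy as np

def predicted_mse(x, alpha, sigma, C):
    # (1/|V|) Tr[sigma^2 I/(I + alpha sigma^2 C)^2]
    #   + (1/|V|) x (alpha^2 sigma^4 C^2/(I + alpha sigma^2 C)^2) x^T
    n = x.size
    lam, U = np.linalg.eigh(C)
    xt = U.T @ x.ravel()
    d = 1.0 + alpha * sigma**2 * lam
    return (sigma**2 * np.sum(1.0 / d**2) +
            np.sum((alpha * sigma**2 * lam * xt)**2 / d**2)) / n
```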
Contents
1. Introduction
2. Probabilistic Image Processing
3. Gaussian Graphical Model
4. Statistical Performance Analysis
5. Concluding Remarks
Summary
The formulation of probabilistic models for image processing by means of conventional statistical schemes has been summarized.
Probabilistic image processing using the Gaussian graphical model has been shown as the most basic example.
References
K. Tanaka: Introduction of Image Processing by Probabilistic Models, Morikita Publishing Co., Ltd., 2006 (in Japanese).
K. Tanaka: Statistical-Mechanical Approach to Image Processing (Topical Review), J. Phys. A: Math. Gen., vol. 35, pp. R81-R150, 2002.
A. S. Willsky: Multiresolution Markov Models for Signal and Image Processing, Proceedings of the IEEE, vol. 90, 2002.
Problem 5-1: Derive the expression of the posterior probability Pr{F = f | G = g, α, σ} by using the Bayes formula
  Pr{F = f | G = g, α, σ} = Pr{G = g | F = f, σ} Pr{F = f | α} / Pr{G = g | α, σ}.
Here Pr{G = g | F = f, σ} and Pr{F = f | α} are assumed to be as follows:
  Pr{G = g | F = f, σ} = Π_{i∈V} (1/√(2πσ²)) exp( −(f_i − g_i)²/(2σ²) )
  Pr{F = f | α} = (1/Z_PR(α)) exp( −(α/2) Σ_{{i,j}∈E} (f_i − f_j)² )
  Z_PR(α) = ∫…∫ exp( −(α/2) Σ_{{i,j}∈E} (z_i − z_j)² ) dz_1 dz_2 … dz_|V|
[Answer]
  Pr{F = f | G = g, α, σ} = (1/Z_POS(g, α, σ)) exp( −(1/(2σ²)) Σ_{i∈V} (f_i − g_i)² − (α/2) Σ_{{i,j}∈E} (f_i − f_j)² )
  Z_POS(g, α, σ) = ∫…∫ exp( −(1/(2σ²)) Σ_{i∈V} (z_i − g_i)² − (α/2) Σ_{{i,j}∈E} (z_i − z_j)² ) dz_1 dz_2 … dz_|V|
Problem 5-2: Show the following equality:
  (1/(2σ²)) Σ_{i∈V} (f_i − g_i)² + (α/2) Σ_{{i,j}∈E} (f_i − f_j)²
    = (1/(2σ²)) (f − g)(f − g)ᵀ + (α/2) f C fᵀ
where
  (C)_ij = 4 (i = j),  −1 ({i, j} ∈ E),  0 (otherwise)
Problem 5-3: Show the following equality:
  (1/(2σ²)) (z − g)(z − g)ᵀ + (α/2) z C zᵀ
    = (1/(2σ²)) ( z − (I/(I + ασ²C)) g ) (I + ασ²C) ( z − (I/(I + ασ²C)) g )ᵀ + (α/2) g (C/(I + ασ²C)) gᵀ
Problem 5-4: Show the following equalities by using the multi-dimensional Gaussian integral formulas:
  Z_POS(g, α, σ) = ∫…∫ exp( −(1/(2σ²)) Σ_{i∈V} (z_i − g_i)² − (α/2) Σ_{{i,j}∈E} (z_i − z_j)² ) dz_1 dz_2 … dz_|V|
    = √( (2πσ²)^|V| / det(I + ασ²C) ) exp( −(α/2) g (C/(I + ασ²C)) gᵀ )
  Z_PR(α) = ∫…∫ exp( −(α/2) Σ_{{i,j}∈E} (z_i − z_j)² ) dz_1 dz_2 … dz_|V| = √( (2π)^|V| / (α^|V| det C) )
Problem 5-5: Derive the extremum conditions for the following marginal likelihood Pr{G = g | α, σ} with respect to the hyperparameters α and σ:
  Pr{G = g | α, σ} = √( det(αC) / ( (2π)^|V| det(I + ασ²C) ) ) exp( −(α/2) g (C/(I + ασ²C)) gᵀ )
[Answer]
  1/α = (1/|V|) Tr[ σ²C / (I + ασ²C) ] + (1/|V|) g ( C / (I + ασ²C)² ) gᵀ
  σ² = (1/|V|) Tr[ σ²I / (I + ασ²C) ] + (1/|V|) g ( α²σ⁴C² / (I + ασ²C)² ) gᵀ
Problem 5-6: Derive the extremum conditions for the following marginal likelihood Pr{G = g | α, σ} with respect to the hyperparameters α and σ:
  Pr{G = g | α, σ} = √( det(αC) / ( (2π)^|V| det(I + ασ²C) ) ) exp( −(α/2) g (C/(I + ασ²C)) gᵀ )
[Answer]
  1/α = (1/|V|) Tr[ σ²C / (I + ασ²C) ] + (1/|V|) g ( C / (I + ασ²C)² ) gᵀ
  σ² = (1/|V|) Tr[ σ²I / (I + ασ²C) ] + (1/|V|) g ( α²σ⁴C² / (I + ασ²C)² ) gᵀ
Problem 5-7: Make a program that generates a degraded image by adding white Gaussian noise. Generate some degraded images from given standard images by setting σ = 10, 20, 30, 40 numerically. Calculate the mean square error (MSE) between the original image and the degraded image:
  MSE = (1/|V|) Σ_{i∈V} (f_i − g_i)²
[Figure: original image + Gaussian noise = degraded image; histogram of the Gaussian random numbers g_i − f_i ~ N(0, 40²).]
K. Tanaka: Introduction of Image Processing by Probabilistic Models, Morikita Publishing Co., Ltd., 2006 (in Japanese).
Sample Program: http://www.morikita.co.jp/soft/84661/
Problem 5-8: Make a program of the following procedure in probabilistic image processing by using the Gaussian graphical model and the additive white Gaussian noise.
Algorithm: repeat the following procedure until convergence:
  1/α(t) = (1/|V|) Tr[ σ(t−1)² C / (I + α(t−1)σ(t−1)² C) ] + (1/|V|) g ( C / (I + α(t−1)σ(t−1)² C)² ) gᵀ
  σ(t)² = (1/|V|) Tr[ σ(t−1)² I / (I + α(t−1)σ(t−1)² C) ] + (1/|V|) g ( α(t−1)²σ(t−1)⁴ C² / (I + α(t−1)σ(t−1)² C)² ) gᵀ
  m(t) = (I / (I + α(t)σ(t)² C)) g,   f̂_i(t) = arg min_{z_i} ( z_i − m_i(t) )²
K. Tanaka: Introduction of Image Processing by Probabilistic Models, Morikita Publishing Co., Ltd., 2006 (in Japanese).
Sample Program: http://www.morikita.co.jp/soft/84661/