Gaussian Mixture Model

Mixture Modeling
A formalism for modeling a probability density function as a sum of parameterized functions.
The mixture density is

$$p(x \mid \Theta) \;=\; \sum_{m=1}^{M} \pi_m \, p(x \mid \theta_m)$$

where $x$ are the observations, $M$ is the number of hidden components, the class weights $\pi_m$ are the class prior probabilities (a multinomial distribution), and $\theta_m$ are the Normal parameters of component $m$. When each component density $p(x \mid \theta_m)$ is multivariate Normal (Normal = Gaussian), the model is a Gaussian mixture.
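To make the formula concrete, here is a minimal sketch of evaluating such a mixture density with NumPy and SciPy; the function name mixture_density and the example parameters are illustrative, not from the slides.

import numpy as np
from scipy.stats import multivariate_normal

def mixture_density(x, weights, means, covs):
    # p(x | Theta) = sum_m pi_m * N(x; mu_m, Sigma_m)
    return sum(w * multivariate_normal.pdf(x, mean=mu, cov=S)
               for w, mu, S in zip(weights, means, covs))

# Example: a two-component mixture in two dimensions.
weights = [0.3, 0.7]                          # pi_m, summing to one
means = [np.zeros(2), np.array([3.0, 3.0])]   # mu_m
covs = [np.eye(2), 2.0 * np.eye(2)]           # Sigma_m
print(mixture_density(np.array([1.0, 1.0]), weights, means, covs))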
Data Likelihood
Let

$$Z_i = \begin{pmatrix} Z_{i1} \\ Z_{i2} \\ \vdots \\ Z_{iM} \end{pmatrix}$$

where $Z_{im} = 1$ if $X_i$ is from group $m$, and $0$ otherwise.
Under the assumption that the pairs $(Z_i, X_i)$ are mutually independent, their joint density may be written
$$p(Z, X \mid \Theta) \;=\; \prod_{i=1}^{N} \prod_{m=1}^{M} \left[\pi_m \, p(x_i \mid \theta_m)\right]^{Z_{im}}$$
Data Log Likelihood
The complete-data log likelihood is thus
$$\log p(Z, X \mid \Theta) \;=\; \sum_{i=1}^{N} \sum_{m=1}^{M} Z_{im} \log\left[\pi_m \, p(x_i \mid \theta_m)\right]$$
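A minimal sketch of computing this quantity, assuming Z is an N-by-M one-hot indicator matrix as defined above; the helper name complete_data_log_lik is hypothetical.

import numpy as np
from scipy.stats import multivariate_normal

def complete_data_log_lik(X, Z, weights, means, covs):
    # log p(Z, X | Theta) = sum_i sum_m Z_im * log(pi_m * p(x_i | theta_m))
    M = Z.shape[1]
    log_terms = np.column_stack([
        np.log(weights[m]) + multivariate_normal.logpdf(X, mean=means[m], cov=covs[m])
        for m in range(M)
    ])  # (N, M) matrix of log(pi_m * p(x_i | theta_m))
    return float(np.sum(Z * log_terms))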
EM Algorithm
E-Step: estimate the $Z_{im}$
M-Step: estimate $\pi_m, \mu_m, \Sigma_m$
EM Algorithm: E-Step
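For reference, the standard E-step for a Gaussian mixture replaces each indicator $Z_{im}$ by its conditional expectation given $x_i$ and the current parameter estimate $\Theta^{(t)}$, i.e. the posterior probability (responsibility) that $x_i$ came from component $m$:

$$\hat{Z}_{im} \;=\; E\!\left[Z_{im} \mid x_i, \Theta^{(t)}\right] \;=\; \frac{\pi_m^{(t)} \, p(x_i \mid \theta_m^{(t)})}{\sum_{k=1}^{M} \pi_k^{(t)} \, p(x_i \mid \theta_k^{(t)})}$$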
EM Algorithm: M-Step
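The standard M-step maximizes the expected complete-data log likelihood, with $Z_{im}$ replaced by the responsibilities $\hat{Z}_{im}$, giving closed-form updates:

$$\pi_m^{(t+1)} = \frac{1}{N}\sum_{i=1}^{N}\hat{Z}_{im}, \qquad \mu_m^{(t+1)} = \frac{\sum_{i=1}^{N}\hat{Z}_{im}\, x_i}{\sum_{i=1}^{N}\hat{Z}_{im}}, \qquad \Sigma_m^{(t+1)} = \frac{\sum_{i=1}^{N}\hat{Z}_{im}\,(x_i - \mu_m^{(t+1)})(x_i - \mu_m^{(t+1)})^{\top}}{\sum_{i=1}^{N}\hat{Z}_{im}}$$

Putting both steps together, a minimal NumPy/SciPy sketch of one EM iteration; em_step is a hypothetical name, and the weights/means/covs layout matches the earlier sketch.

import numpy as np
from scipy.stats import multivariate_normal

def em_step(X, weights, means, covs):
    N, M = X.shape[0], len(weights)
    # E-step: responsibilities resp[i, m] = E[Z_im | x_i, Theta].
    resp = np.column_stack([
        weights[m] * multivariate_normal.pdf(X, mean=means[m], cov=covs[m])
        for m in range(M)
    ])
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: re-estimate pi_m, mu_m, Sigma_m from the responsibilities.
    Nm = resp.sum(axis=0)                  # effective counts per component
    weights = Nm / N
    means = [resp[:, m] @ X / Nm[m] for m in range(M)]
    covs = [(resp[:, m, None] * (X - means[m])).T @ (X - means[m]) / Nm[m]
            for m in range(M)]
    return weights, means, covs

# One iteration on data X of shape (N, d), starting from an initial guess:
# weights, means, covs = em_step(X, weights, means, covs)

Iterating em_step until the data log likelihood stabilizes gives the usual EM fit; EM increases the likelihood monotonically but may converge to a local maximum.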
Bayesian Ying-Yang Learning
• Proposed by Prof. Lei XU
Reference: Jinwen Ma, Jianfeng Liu. The BYY annealing learning algorithm for Gaussian mixture with automated model selection. Pattern Recognition, 2007, 40:2029-2037.
Fig. 6. The segmentation result on the color image of house. (a) The original color image of house; (b) the segmented image via the BYY annealing learning algorithm (after 21 iterations).
Fig. 8. The segmentation result on the color image of jellies. (a) The original color image of jellies; (b) the segmented image via the BYY annealing learning algorithm (after 22 iterations).