Computational aspects of sequential Monte Carlo approach to image restoration

Ken Nittono
Department of Markets and Management
Hosei University
2-17-1 Fujimi, Chiyoda-ku, Tokyo 102-8160, Japan
Summary. Computational aspects of Monte Carlo approaches to image restoration are studied. A sequential Monte Carlo filter, smoother and L-lag smoother for an image are compared through simulations, and the results are discussed against the theoretical computational burden.
Key words: image restoration, particle filter, sequential Monte Carlo
method, L-lag smoother
1 Introduction
Bayesian approaches have attracted interest in a wide variety of fields. Bayesian image restoration, as a branch of image analysis, is one of the frameworks based on this statistical methodology [BGHM95, BH99, GG84]. On the other hand, sequential Monte Carlo methods for the dynamic estimation of states in state space models have been developed in time series analysis [DFG01, Kit96]. These methods have been applied in many fields [Liu01], and a particle filter for image restoration has been proposed as an extension of the state space formulation [NK02]. The particle filter for image restoration aims to give a tractable restoration algorithm, and the resulting estimation scheme is simple to compose. However, the method tends to demand massive memory and computational time as the size of the objective image and the number of particles increase [NK02]. In this paper, we study the computational aspects of these particle filter methods, with the aim of improving restoration methods based on the Bayesian framework.
2 Restoration model
Let $x_{1:M}$ and $y_{1:M}$ denote the original image and the observed noisy image, each composed of M pixels. The estimation of the original image given the observation $y_{1:M}$ is formulated in the Bayesian framework, in the context of MAP estimation, as follows [Bes86, GG84]:
\[
  p(x_{1:M} \mid y_{1:M}) = \frac{p(y_{1:M} \mid x_{1:M})\, p(x_{1:M})}{p(y_{1:M})} . \tag{1}
\]
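As a concrete illustration of (1) for the setting used later in Section 5, if the degradation is additive Gaussian noise and the prior is a generic Markov random field with energy U (the explicit prior is not specified in this paper, so the forms below are assumptions for illustration), the MAP estimate takes the familiar energy-minimization form:

```latex
% Hedged illustration: additive Gaussian noise (as in Section 5) and a generic
% MRF prior with energy U are assumed; neither is written in this form above.
\[
  p(y_{1:M} \mid x_{1:M}) \propto
    \exp\Bigl\{-\tfrac{1}{2\sigma^{2}} \sum_{i=1}^{M} (y_i - x_i)^{2}\Bigr\},
  \qquad
  p(x_{1:M}) \propto \exp\bigl\{-U(x_{1:M})\bigr\},
\]
\[
  \hat{x}^{\mathrm{MAP}}_{1:M}
  = \arg\max_{x_{1:M}} p(x_{1:M} \mid y_{1:M})
  = \arg\min_{x_{1:M}} \Bigl\{\tfrac{1}{2\sigma^{2}} \sum_{i=1}^{M} (y_i - x_i)^{2} + U(x_{1:M})\Bigr\}.
\]
```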
Geman and Geman [GG84] proposed a relaxation method, called the Gibbs sampler, which finds a mode of the posterior distribution in (1). The method makes it possible to find the global maximum of $p(x_{1:M} \mid y_{1:M})$; however, controlling the annealing schedule makes the procedure complicated. In view of this, Nittono and Kamakura [NK02] proposed an approach using a particle filter, based on the sequential Monte Carlo method, to estimate the posterior distribution in (1). The method is described in the next section.
3 Sequential Monte Carlo approach
In the sequential Monte Carlo approach, a recursive formula for the posterior distribution in (1) is written as follows [DFG01, Liu01]:
\[
  p(x_{0:i} \mid y_{1:i}) = p(x_{0:i-1} \mid y_{1:i-1})\,
  \frac{p(y_i \mid x_i)\, p(x_i \mid x_{R_i})}{p(y_i \mid y_{1:i-1})} , \tag{2}
\]
where $x_0$ is an initial state, $x_{k:l}$ is a partial colouring over the range of pixels k to l, and $R_i$ denotes a neighbourhood of pixel i. Beginning with sampling from the initial distribution $p(x_0)$, the sequential procedure at each pixel i essentially consists of i) sampling from the conditional distribution, ii) computing the likelihood of each sample, and iii) computing the importance weights of the N samples, called particles, and resampling according to the weights [DFG01, NK02]. A sketch of this per-pixel update is given below.
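The following is a minimal sketch, in Python rather than the Java used for the simulations of Section 5, of the per-pixel update i)-iii). The neighbourhood-frequency proposal, the Gaussian likelihood with the σ of Section 5, and multinomial resampling are assumptions made for illustration; they are not necessarily the exact choices of [NK02].

```python
import numpy as np

rng = np.random.default_rng(0)

def pixel_update(neighbour_states, y_i, grey_levels=4, sigma=1.0):
    """One sequential Monte Carlo step at pixel i (steps i-iii of Section 3).

    neighbour_states : (N, |R_i|) int array; each particle's current colouring
                       of the neighbourhood R_i (this layout is an assumption).
    y_i              : observed grey level at pixel i.
    Returns an (N,) array of resampled particle states for pixel i.
    """
    N = neighbour_states.shape[0]
    levels = np.arange(grey_levels)

    # i) sample x_i^(j) from a conditional proposal p(x_i | x_{R_i});
    #    a simple neighbourhood-frequency proposal is assumed here.
    proposals = np.empty(N, dtype=int)
    for j in range(N):
        counts = np.bincount(neighbour_states[j], minlength=grey_levels) + 0.5
        proposals[j] = rng.choice(levels, p=counts / counts.sum())

    # ii) likelihood of each sample under additive Gaussian noise (cf. Section 5)
    loglik = -0.5 * ((y_i - proposals) / sigma) ** 2

    # iii) importance weights of the N particles and multinomial resampling
    w = np.exp(loglik - loglik.max())
    w /= w.sum()
    return proposals[rng.choice(N, size=N, p=w)]
```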
Depending on which observations condition the estimate at pixel i, three types of estimation can be defined: prediction, filtering and smoothing [Kit96]. Here we write $f_{1:i}$ for the filter, which samples from the conditional distribution $p(x_i \mid y_{1:i})$, and $s_{1:T}$ for the smoother, which samples from $p(x_i \mid y_{1:T})$ given the observations up to $T\,(>i)$.
4 Computational aspect
The sequential Monte Carlo approach to image restoration inherits the tractability of the sequential Monte Carlo algorithm, and the estimation can be controlled simply, mainly by adjusting the number of particles. However, some computational problems remain: the method demands a large amount of calculation time and memory storage as the number of particles and the number of pixels in the image increase. In such cases, further improvements that keep the calculation feasible are still required. Note that this demand for computational time and space is not specific to our image restoration problem; it is an essential feature of sequential Monte Carlo approaches whenever the dimension is very high.
In view of this, we here introduce the L-lag smoother from state space modelling [Kit96] into image restoration. We write the L-lag smoother in image restoration as $s_{i-L:T}$, which samples from the conditional distribution $p(x_i \mid y_{i-L:T})$ at each pixel i. In other words, the state $x_i$ at pixel i is estimated by N particles $\{x^{(j)}_{i-L:i};\ j = 1, \ldots, N\}$ from $p(x_i \mid y_{i-L:T})$, using stored sample paths indexed from $i-L$ to $i-1$ rather than from 0 to $i-1$. Note that the smoother $s_{1:T}$ mentioned above needs sample paths of length up to M for each j-th particle, whereas $s_{i-L:T}$ needs a constant length L for each of them throughout the estimation procedure. Moreover, it is known that the older states in the sample paths rapidly collapse onto a single state [Kit96]; hence, as i increases, all samples $\{x^{(j)}_{i-L:i};\ j = 1, \ldots, N\}$ tend to share the same states in the early part of the path, that is, near pixel 0. Therefore, for a moderate size of L, the sample paths $(k^{(j)}_0, \ldots, k^{(j)}_{i-L-1}, x^{(j)}_{i-L}, \ldots, x^{(j)}_i)$ have identical values at $k^{(j)}_0, \ldots, k^{(j)}_{i-L-1}$ for all j after this convergence. This implies that $x^{(j)}_{i-L}, \ldots, x^{(j)}_i$ are sufficient for the estimation, which results in a reduction of the computational burden, as sketched below.
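A minimal sketch of the fixed-lag path storage follows, assuming the N paths of length L are held in a single array that is shifted as the procedure moves to the next pixel; the class and method names are illustrative only and are not part of the original implementation.

```python
import numpy as np

class LagSmootherPaths:
    """Fixed-lag storage of particle paths for the L-lag smoother s_{i-L:T}.

    Instead of keeping the full history x_0^(j), ..., x_i^(j) (length up to M),
    only the last L states of each of the N paths are stored, which is where
    the O(gNLM) time and O(NL) memory figures of Section 6 come from.
    """

    def __init__(self, n_particles, lag):
        self.L = lag
        self.paths = np.zeros((n_particles, lag), dtype=int)

    def append(self, new_states, resample_idx):
        """Advance one pixel: reorder the stored paths by the resampling
        indices (so surviving ancestry is carried along), drop the oldest
        column and append the resampled states at pixel i."""
        self.paths = self.paths[resample_idx]          # ancestry of survivors
        self.paths = np.roll(self.paths, -1, axis=1)   # shift window left by one
        self.paths[:, -1] = new_states                 # store states at pixel i

    def estimate_oldest(self):
        """Point estimate (mode over particles) for the pixel that is about
        to leave the lag window."""
        return int(np.bincount(self.paths[:, 0]).argmax())
```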
5 Simulation
We conduct simulations on an artificial image modelled on sensing data in which several regions of crops are captured. The original image is composed of $64^2\,(=M)$ pixels with 4 grey levels, and the degradation is assumed to be additive Gaussian noise with variance $\sigma^2 = 1$. We adopt a uniform distribution for the initial state and select the mode of the posterior distribution as the final restored image. The restorations are compared by the misclassification rate $d_1 = E[I(x_i, x_i^*)]$, where $I(u, v)$ is 1 if u differs from v and 0 otherwise for corresponding pixels of the original and restored images, and by the mean square error $d_2 = E[(x_i - x_i^*)^2]$.
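Both criteria can be computed directly from the original and restored images, as in the following sketch (an array-based pixel layout and the function name are assumptions for illustration):

```python
import numpy as np

def restoration_errors(original, restored):
    """Misclassification rate d1 and mean square error d2 between the
    original image x and the restored image x* (integer grey levels)."""
    original = np.asarray(original, dtype=float)
    restored = np.asarray(restored, dtype=float)
    d1 = np.mean(original != restored)          # fraction of wrongly restored pixels
    d2 = np.mean((original - restored) ** 2)    # mean squared grey-level error
    return d1, d2
```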
Table 1 shows the restoration results for the filter $f_{1:i}$, the smoother $s_{1:T}$ and the L-lag smoother $s_{i-1000:T}$ with $L = 1000$, and Figure 1 shows the images restored by each method with particle size $N = 10000$. The value of L was chosen from preliminary experiments. The results imply that the smoother has an advantage with respect to both $d_1$ and $d_2$, and that the L-lag smoother achieves almost the same effect as the ordinary smoother.
Table 1. Effect of restoration

                           d1                                        d2
  N        $f_{1:i}$    $s_{1:T}$    $s_{i-1000:T}$    $f_{1:i}$    $s_{1:T}$    $s_{i-1000:T}$
  10       0.527100     0.267578     0.267578          0.764404     0.367920     0.367920
  100      0.438965     0.235596     0.235596          0.604736     0.296631     0.296631
  1000     0.341797     0.210205     0.210205          0.478027     0.282471     0.282471
  5000     0.267578     0.184570     0.180420          0.367432     0.233398     0.230713
  10000    0.241455     0.167969     0.170410          0.353516     0.216309     0.216553

Fig. 1. Observed, restored and original images: (a) observed, (b) $f_{1:i}$, (c) $s_{1:T}$, (d) $s_{i-1000:T}$, (e) original.
Table 2 shows the calculation times for the simulations on a 3.0 GHz Pentium 4 processor, with the implementation written in Java. It clearly indicates the reduction of computational burden, in terms of time, for the L-lag smoother.
Table 2. Computational time (seconds)

  N        $f_{1:i}$    $s_{1:T}$    $s_{i-1000:T}$
  10           6.1          6.2           3.8
  100         59.6         58.3          35.8
  1000       674.8        700.0         360.6
  5000      3268.8       3269.4        1779.7
  10000     6754.8       6716.1        3809.3
6 Discussion
For the estimation of an image composed of M pixels with g grey levels, the usual methods $f_{1:i}$ and $s_{1:T}$ using N particles demand $O(gNM^2)$ computational time. The L-lag smoother $s_{i-L:T}$ introduced here needs $O(gNLM)$, so the advantage of the method grows as the image size M increases. In our results, this advantage is obtained by replacing M with the smaller L.
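As a rough illustration of this gain for the setting of Section 5 (a back-of-the-envelope figure, not one reported above), the ratio of the two bounds reduces to M/L:

```latex
% Illustrative only: ratio of the two complexity bounds for the Section 5
% setting, M = 64^2 = 4096 pixels and lag L = 1000.
\[
  \frac{O(gNM^{2})}{O(gNLM)} = \frac{M}{L} = \frac{4096}{1000} \approx 4.1 .
\]
% The observed wall-clock speed-up in Table 2 is roughly a factor of 1.8,
% so part of the per-pixel cost appears not to depend on the stored path length.
```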
As for memory storage, the usual methods basically need arrays of NM elements, whereas $s_{i-L:T}$ needs NL elements; it therefore also has the advantage in terms of storage.
In our simulations, we chose a sufficiently large lag, L = 1000, according to preliminary experiments; however, it appears that L need not exceed roughly twice the number of columns of the given image. Indeed, in another experiment on an image with $M = 256^2$, the beginning parts of the sample paths were identical across all particles with L = 512. This suggests that further improvement is possible, although we must also pay attention to the range of the neighbourhood $R_i$, which represents the Markov property of the image [Gem88, NK01].
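The collapse of the older parts of the sample paths also suggests a simple empirical check for choosing L. The following sketch, whose criterion and function name are illustrative rather than taken from the original implementation, measures the length of the initial segment on which all particle paths agree after a pilot run; if this segment covers most of the stored history, a smaller L would have sufficed.

```python
import numpy as np

def converged_prefix_length(paths):
    """Length of the initial segment on which all N stored particle paths agree.

    paths : (N, K) int array of stored sample paths from a pilot run.
    Columns before the first disagreement carry no extra information for the
    estimate, so their count indicates how much the lag L could be reduced.
    """
    paths = np.asarray(paths)
    agree = np.all(paths == paths[0], axis=0)   # columns where every path matches path 0
    disagreeing = np.flatnonzero(~agree)
    return paths.shape[1] if disagreeing.size == 0 else int(disagreeing[0])
```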
References
[Bes86] Besag, J. E.: On the statistical analysis of dirty pictures (with discussion). Journal of the Royal Statistical Society, Series B, 48, 259-302 (1986)
[BGHM95] Besag, J. E., Green, P., Higdon, D. and Mengersen, K.: Bayesian computation and stochastic systems. Statistical Science, 10, 3-66 (1995)
[BH99] Besag, J. E. and Higdon, D.: Bayesian analysis of agricultural field experiments. Journal of the Royal Statistical Society, Series B, 61, 691-746 (1999)
[DFG01] Doucet, A., de Freitas, N. and Gordon, N. (eds): Sequential Monte Carlo Methods in Practice. New York: Springer-Verlag (2001)
[GG84] Geman, S. and Geman, D.: Stochastic relaxation, Gibbs distributions and the Bayesian restoration of images. IEEE Transactions on Pattern Analysis and Machine Intelligence, 6, 721-741 (1984)
[Gem88] Geman, D.: Random fields and inverse problems in imaging. Lecture Notes in Mathematics, 1427, 117-193 (1988)
[Kit96] Kitagawa, G.: Monte Carlo filter and smoother for non-Gaussian nonlinear state space models. Journal of Computational and Graphical Statistics, 5, 1-25 (1996)
[Liu01] Liu, J. S.: Monte Carlo Strategies in Scientific Computing. New York: Springer-Verlag (2001)
[NK01] Nittono, K. and Kamakura, T.: Bayesian image restoration via varying neighborhood structure. Journal of the Japanese Society of Computational Statistics, 14, 31-47 (2001)
[NK02] Nittono, K. and Kamakura, T.: On the use of particle filters for Bayesian image restoration. COMPSTAT 2002 Proceedings in Computational Statistics, 473-478 (2002)