Image Deblurring with Optimizations
Qi Shan
Jiaya Jia
Aseem Agarwala
University of Washington
The Chinese University of Hong Kong
Adobe Systems, Inc.
The Problem
An Example
Previous Work (1)
Hardware solutions:
[Ben-Ezra and Nayar 2004]
[Levin et al. 2008]
[Raskar et al. 2006]
Previous Work (2)
Multi-frame solutions:
[Jia et al. 2004]
[Petschnigg et al. 2004]
[Rav-Acha and Peleg 2005]
[Yuan et al. 2007]
Previous Work (3)
Single image solutions:
[Fergus et al. 2006]
[Levin et al. 2007]
[Jia 2007]
Most recent work on Single Image Deblurring
Qi Shan, Jiaya Jia, and Aseem Agarwala. High-Quality Motion Deblurring from a Single Image. SIGGRAPH 2008.
Lu Yuan, Jian Sun, Long Quan, and Heung-Yeung Shum. Progressive Inter-scale and Intra-scale Non-blind Image Deconvolution. SIGGRAPH 2008.
Neel Joshi, Richard Szeliski, and David Kriegman. PSF Estimation Using Sharp Edge Prediction. CVPR 2008.
Anat Levin, Yair Weiss, Frédo Durand, and William T. Freeman. Understanding and Evaluating Blind Deconvolution Algorithms. CVPR 2009.
Sunghyun Cho and Seungyong Lee. Fast Motion Deblurring. SIGGRAPH Asia 2009.
And many more...
Some take-home ideas
1. Use hierarchical (coarse-to-fine) approaches to estimate the kernel at different scales
2. Strong edges are important for kernel estimation
3. Bilateral filtering can suppress ringing artifacts
4. RL deconvolution is good, but we now have better choices
5. A stronger prior does a better job
6. Deblurring with a spatially variant kernel is a promising direction
Today's topic
How to apply natural image statistics, image local smoothness constraints, and a kernel sparsity prior in a MAP process
A short discussion on
1. the stability of non-blind deconvolution
2. noise-resistant non-blind deconvolution and denoising
Image Global Statistics
(figures omitted)
Image Local Constraint
In regions Ω where the blurred image I is locally smooth, the gradients of the latent image L should stay close to the gradients of I:

p₂(L) = ∏_{i∈Ω} N(∂Lᵢ − ∂Iᵢ; 0, σ²)
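As one concrete way to pick the smooth region Ω, one can threshold the local standard deviation of the blurred image. This is an illustrative sketch of my own; the window size and threshold are made-up parameters, not values from the talk:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def smooth_region_mask(I, win=5, thresh=5.0 / 255):
    """Mark a pixel as 'smooth' (part of Omega) when the standard
    deviation of its win x win neighborhood is below thresh.
    Both win and thresh are illustrative choices."""
    pad = win // 2
    Ip = np.pad(I, pad, mode="reflect")          # reflect borders so output keeps I's shape
    wins = sliding_window_view(Ip, (win, win))   # all win x win neighborhoods
    return wins.std(axis=(-1, -2)) < thresh      # boolean mask, same shape as I
```

On a constant patch the mask is all True; on a high-contrast checkerboard it is all False, which is the intended behavior for gating the local gradient constraint.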
Kernel Statistics
Kernel elements are exponentially distributed:

p(fⱼ) ∝ e^(−τ fⱼ),  fⱼ ≥ 0
Combining All Constraints

min_{L,f} E(L, f) = min_{L,f} −log[ p(n) p₁(∂L) p₂(L) p(f) ]

Two-step iterative optimization
• Optimize L
• Optimize f
Optimization Process
Optimize L
Idea: separate the convolution term from the prior terms

E′(L) = ‖L ⊗ f − I‖²₂              ← −log p(n)
      + λ₁‖∂L‖₁                    ← −log p₁(∂L)
      + λ₂ Σ_{i∈Ω} ‖∂Lᵢ − ∂Iᵢ‖²₂   ← −log p₂(L)
Optimization Process
Optimize L
Idea: separate the convolution by replacing ∂Lᵢ with auxiliary variables Ψᵢ

E′(L, Ψ) = ‖L ⊗ f − I‖²₂ + λ₁‖Ψ‖₁ + λ₂ Σ_{i∈Ω} ‖Ψᵢ − ∂Iᵢ‖²₂
Updating L
Adding a new constraint γ‖Ψ − ∂L‖²₂ to keep Ψ ≈ ∂L
Removing the terms that are not relevant to L:

L_opt = argmin_L ‖L ⊗ f − I‖²₂ + γ‖Ψ − ∂L‖²₂

An easy quadratic optimization problem with a closed-form solution in the frequency domain
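The frequency-domain solve can be sketched in a few lines of numpy. This is my own illustration (not the authors' code): it assumes circular boundary conditions, simple difference filters for ∂, and a kernel with nonzero sum so the denominator never vanishes:

```python
import numpy as np

def update_L(I, f, psi_x, psi_y, gamma):
    """Closed-form minimizer of
        ||L (x) f - I||^2 + gamma * (||psi_x - dx L||^2 + ||psi_y - dy L||^2)
    under circular convolution, solved independently at each frequency."""
    F = np.fft.fft2(f, s=I.shape)                           # kernel spectrum
    Dx = np.fft.fft2(np.array([[1.0, -1.0]]), s=I.shape)    # horizontal difference filter
    Dy = np.fft.fft2(np.array([[1.0], [-1.0]]), s=I.shape)  # vertical difference filter
    num = (np.conj(F) * np.fft.fft2(I)
           + gamma * (np.conj(Dx) * np.fft.fft2(psi_x)
                      + np.conj(Dy) * np.fft.fft2(psi_y)))
    den = np.abs(F) ** 2 + gamma * (np.abs(Dx) ** 2 + np.abs(Dy) ** 2)
    return np.real(np.fft.ifft2(num / den))
```

Sanity check: with a delta kernel and Ψ set to the circular gradients of I, the minimizer is exactly I.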
Updating Ψ
Removing the terms that are not relevant to Ψ:

Ψ_opt = argmin_Ψ λ₁‖Ψ‖₁ + λ₂ Σ_{i∈Ω} ‖Ψᵢ − ∂Iᵢ‖²₂ + γ‖Ψ − ∂L‖²₂
      = argmin_Ψ Σᵢ E′ᵢ

Each E′ᵢ only contains a single variable Ψᵢ, so this becomes a set of easy single-variable optimization problems
Iteration 0 (initialization)
Time: about 30 seconds for an 800×600 image
Iteration 8 (converged)
A comparison
RL deconvolution
A comparison
Our deconvolution
Two-step iterative optimization
• Optimize L
• Optimize f

min_{L,f} E(L, f) = min_{L,f} −log[ p(n) p₁(∂L) p₂(L) p(f) ]

E(f) = ‖L ⊗ f − I‖²₂ + λ‖f‖₁

Optimization with a total variation regularization
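One simple way to attack this ℓ₁-regularized kernel fit is a proximal-gradient (ISTA-style) loop. The sketch below is my own illustration, not the solver used in the talk; it assumes circular convolution, a small kernel support at the top-left of the grid, and the usual nonnegativity and sum-to-one constraints on blur kernels:

```python
import numpy as np

def conv2_circ(a, b_hat):
    """Circular 2-D convolution of a with a filter given by its spectrum b_hat."""
    return np.real(np.fft.ifft2(np.fft.fft2(a) * b_hat))

def update_kernel(L, I, ksize, lam=1e-6, iters=500):
    """ISTA-style sketch for  min_f ||L (x) f - I||^2 + lam*||f||_1
    with f >= 0 and sum(f) = 1, f supported on a ksize x ksize patch."""
    h, w = I.shape
    Lh = np.fft.fft2(L)
    f = np.zeros((h, w))
    f[:ksize, :ksize] = 1.0 / ksize ** 2          # uniform initialization
    step = 0.5 / (np.abs(Lh) ** 2).max()          # conservative step size
    for _ in range(iters):
        r = conv2_circ(f, Lh) - I                 # residual L (x) f - I
        grad = conv2_circ(r, np.conj(Lh))         # gradient = correlation with L
        f = f - step * grad
        f = np.maximum(f - step * lam, 0.0)       # soft-threshold, keep f >= 0
        f[ksize:, :] = 0.0
        f[:, ksize:] = 0.0                        # restrict to the small support
        s = f.sum()
        if s > 0:
            f /= s                                # blur kernels sum to one
    return f[:ksize, :ksize]
```

The renormalization step is a pragmatic projection rather than a rigorous one, which is in keeping with the sketch's purpose: showing the shape of the f-subproblem, not a production solver.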
Results
More results
Today's topic
How to apply natural image statistics, image local smoothness constraints, and a kernel sparsity prior in a MAP process
A short discussion on
1. the stability of non-blind deconvolution
2. noise-resistant non-blind deconvolution and denoising
Stability
Considering the simplest case: Wiener filtering

X = (FᵀF)⁻¹ Fᵀ B

How about if the observation is noisy, B* = B + n?

X* = (FᵀF)⁻¹ Fᵀ B*,  and so  X* − X = (FᵀF)⁻¹ Fᵀ n
Stability
Thus

E‖X* − X‖²₂ = σ² Σ_ω 1/|P(ω)|²

where P is the frequency-domain representation of F, and σ² is the variance of the noise

Observation: the noise in the blurred image is magnified in the deconvolved image, and the Noise Magnification Factor (NMF) is determined solely by the filter F
Some examples
Dense kernels are less stable for deconvolution than sparse ones
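This claim is easy to probe numerically. The sketch below (illustrative kernels of my own choosing) evaluates the noise magnification factor of inverse filtering, Σ_ω 1/|P(ω)|², for a sparse two-impulse kernel and a dense box blur:

```python
import numpy as np

def nmf(kernel, shape=(64, 64)):
    """Noise magnification factor of inverse filtering on a grid:
    E||X* - X||^2 / sigma^2 = sum over frequencies of 1/|P(w)|^2,
    where P is the kernel's frequency response."""
    P = np.fft.fft2(kernel, s=shape)
    return float(np.sum(1.0 / np.abs(P) ** 2))

# sparse kernel: two unequal impulses -> |P| stays bounded away from zero
sparse = np.zeros((1, 3))
sparse[0, 0], sparse[0, 2] = 0.7, 0.3
# dense kernel: 5x5 uniform box blur -> |P| dips very close to zero
dense = np.full((5, 5), 1.0 / 25)
```

With these choices, nmf(dense) is orders of magnitude larger than nmf(sparse); a delta kernel gives the baseline value of one per frequency (64·64 here), matching the intuition that deconvolving a dense blur magnifies noise far more.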
Noise-resistant deconvolution and denoising
With Jiaya Jia, Sing Bing Kang, and Zenlu Qin
In CVPR 2010
See you in San Francisco!
Blind and non-blind image deconvolution software is available online and will be updated soon!