8/2/2012

Iterative Reconstruction
Methods in Computed
Tomography
J. Webster Stayman
Dept. of Biomedical Engineering, Johns Hopkins University
Power of Iterative Reconstruction
[Figure: FBP reconstruction at 0.8 mAs/frame vs. iterative reconstruction at 0.1 mAs/frame (80 kVp, 360 projections)]
Learning Objectives
o Fundamentals of iterative methods
• Approach to forming an iterative algorithm
• Identify particular classes of methods
o Advantages of iterative approaches
• Intuition behind what these algorithms do
• Flexibility of these techniques
o Images produced by iterative methods
• Differences from traditional reconstruction
• Image properties associated with iterative reconstruction
Iterative Reconstruction
o What is iterative reconstruction?
[Diagram: Projection Data ↔ Image Volume, with a loop that repeatedly tries to "Make Image Better"]
o How can we make the image better?
• Get a better match to the data – requires a data model
• Enforce desirable image properties – encourage smoothness, edges, etc.
• Need a measure of "better"
Building an Iterative Technique
o Define the objective
• Find the volume that best fits the data and desired image quality
volume = arg max { data fit & image properties }
μ̂ = arg max_μ { fit between y and ȳ(μ) & f(μ) }
o Devise an algorithm that solves the objective (an objective function plus an optimization algorithm)
• Iteratively solve for μ̂
• Decide when to stop iterating (and how to start)
Reconstruction Choices
[Chart: reconstruction approaches arranged between a "Forward Model Driven" axis and an "Image Properties / Regularized" axis, with statistical models indicated. Method classes shown: ART, Maximum Likelihood, Penalized Likelihood, MAP, PWLS, PICCS, Image Restoration, Image Denoising. Commercial methods shown: Veo, SAFIRE, iDose, IRIS, ASIR, AIDR]
*Disclaimer: The exact details of commercially available reconstruction methods are not known by the author.
Model-based Approaches
o Transmission tomography forward model
• Projection physics, Beer's Law
o Need a parameterization of μ
Parameterization of the Object
o Continuous-domain object
o Want a finite number of parameters
o Choices of basis functions:
• Point samples – bandlimited
• Contours – piecewise constant
• Blobs, wavelets, "natural pixels," …
o Voxel basis:  μ = (μ_1, μ_2, μ_3, …, μ_p)
Projection
o Linear operation
o Discrete-to-discrete for the parameterized problem
o System matrix A:  y_i = [Aμ]_i, with row i weighting the voxels (…, μ_{j−1}, μ_j, μ_{j+1}, μ_{j+2}, …) that ray i intersects
Forward Model
o Mean measurements as a function of the parameters:  ȳ(μ)

Aside: Backprojection
o Projection: Aμ.  Backprojection: Aᵀy (the adjoint of projection, not its inverse)
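As a concrete sketch of the projection/backprojection pair, the toy system below (a hypothetical 2x2 image probed by four rays with unit intersection lengths) applies A and Aᵀ as plain matrix products; note that backprojection smears data back along the rays rather than inverting the projection:

```python
import numpy as np

# Hypothetical tiny system: a 2x2 image (4 voxels, flattened row-major)
# probed by 4 rays. Each entry a_ij is the intersection length of ray i
# with voxel j (unit lengths for simplicity).
A = np.array([
    [1.0, 1.0, 0.0, 0.0],   # ray along the top row
    [0.0, 0.0, 1.0, 1.0],   # ray along the bottom row
    [1.0, 0.0, 1.0, 0.0],   # ray along the left column
    [0.0, 1.0, 0.0, 1.0],   # ray along the right column
])

mu = np.array([0.1, 0.2, 0.3, 0.4])  # attenuation values, arbitrary units

y = A @ mu          # projection: line integrals along each ray
b = A.T @ y         # backprojection: adjoint of projection, NOT its inverse

print(y)            # [0.3 0.7 0.4 0.6]
print(b)            # a smeared image, not mu: A^T is not A^{-1}
```

The adjoint relationship (Aμ)·z = μ·(Aᵀz) holds for any data vector z, which is what makes Aᵀ the natural "transpose" operation in the iterative updates that follow.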
A Simple Model-Driven Approach
o Forward model:  ȳ(μ) = I_0 exp(−Aμ)
o Objective and estimator
μ̂ = arg min_μ ‖y − ȳ(μ)‖²
μ̂ ≅ arg min_μ ‖log(y/I_0) + Aμ‖²
μ̂ = arg min_μ ‖Aμ − l‖² = [AᵀA]⁻¹ Aᵀ l,  where l = −log(y/I_0)
([AᵀA] is a projection-backprojection operator, applied after a backprojection Aᵀl)
o For a squared-distance metric
• Many possible algorithms
• Can be equivalent to ART, which seeks Aμ = l
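A minimal sketch of one such algorithm, assuming a small hypothetical system matrix: the Landweber (gradient-descent) iteration for the squared-distance objective, which converges to the least-squares solution [AᵀA]⁻¹Aᵀl when A has full column rank:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small system: 6 rays, 4 voxels, built to be well conditioned.
A = np.vstack([np.eye(4), rng.random((2, 4))])
mu_true = np.array([0.2, 0.5, 0.3, 0.4])
l = A @ mu_true                       # noiseless line integrals, l = -log(y/I0)

# Landweber / gradient-descent iteration for arg min ||A mu - l||^2:
#   mu <- mu + alpha * A^T (l - A mu),  stable for alpha < 2 / sigma_max(A)^2
alpha = 1.0 / np.linalg.norm(A, 2) ** 2
mu = np.zeros(4)                      # start from a constant (zero) image
for _ in range(500):
    mu += alpha * (A.T @ (l - A @ mu))
```

Each pass is one projection (Aμ), a comparison with the data, and one backprojection of the mismatch, which is the basic shape of most iterative updates in this talk.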
Flexibility of Iterative Methods
μ̂ = arg min_μ ‖Aμ − l‖² = [AᵀA]⁻¹ Aᵀ l
o For a well-sampled, parallel-beam case
AᵀA x ≈ (1/r) ** x   →   [AᵀA]⁻¹ x ≈ F⁻¹{ρ}_2D ** x
(** denotes 2-D convolution: backprojection-of-projection blurs with a 1/r kernel, so its inverse acts as the 2-D ramp filter)
o For other cases, the same estimator [AᵀA]⁻¹ Aᵀ l applies
o Iterative methods implicitly handle the geometry
• Find the correct inversion for the specific geometry
• Cannot overcome data nullspaces (needs complete data)
More Complete Forward Models
o More physics
• Polyenergetic beam, energy-dependent attenuation
• Detector effects: finite-size elements, blur
• Source effects: finite-size element
• Scattered radiation
o Noise
• Data statistics
• Quantum noise – x-ray photons
• Detector noise
Toy Problem
o Three random variables y_1, y_2, y_3: independent noisy measurements of the same μ with different standard deviations (σ_1, σ_2, σ_3)
o Best way to estimate μ?
[Figure: the three pdfs and sample realizations]
Maximum Likelihood Estimation
o Find the parameter values most likely to be responsible for the observed measurements
o Properties
• Asymptotically unbiased and efficient under general conditions
o Likelihood function
L(y; μ) = p(y_1, y_2, …, y_N | μ_1, μ_2, …, μ_M)
o Maximum likelihood objective function
μ̂ = arg max_μ L(y; μ)
ML Estimate for the Toy Problem
Likelihood function:
L(y; μ) = p(y_1, y_2, y_3 | μ) = ∏_{i=1}^{3} p(y_i | μ)
p(y_i | μ) = (1/√(2πσ_i²)) exp( −(y_i − μ)² / (2σ_i²) )
Log-likelihood function:
log L(y; μ) = −(1/2) ∑_{i=1}^{3} log(2πσ_i²) − ∑_{i=1}^{3} (y_i − μ)² / (2σ_i²)
Maximize over μ:  μ̂ = arg max_μ log L(y; μ)
∂/∂μ log L(y; μ) = ∑_{i=1}^{3} (y_i − μ) / σ_i² = 0
μ̂ = ( ∑_{i=1}^{3} y_i / σ_i² ) / ( ∑_{i=1}^{3} 1 / σ_i² )
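The closed-form estimate above is inverse-variance weighting: each measurement contributes in proportion to its precision. A short sketch (the function name `ml_estimate` is mine):

```python
import numpy as np

def ml_estimate(y, sigma):
    """ML estimate of a common mean from independent Gaussian
    measurements y_i with standard deviations sigma_i:
    mu_hat = sum(y_i / sigma_i^2) / sum(1 / sigma_i^2)."""
    y = np.asarray(y, dtype=float)
    w = 1.0 / np.asarray(sigma, dtype=float) ** 2   # inverse-variance weights
    return np.sum(w * y) / np.sum(w)

# Three measurements of the same mu; the noisiest one (sigma = 10)
# contributes almost nothing to the estimate.
print(ml_estimate([1.0, 1.2, 5.0], sigma=[0.1, 0.2, 10.0]))  # ~1.0403
```

With equal variances this reduces to the ordinary sample mean, which is the "data weighting by information content" idea that carries over to tomography.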
ML for Tomography
o Need a noise model
o Depends on the statistics of the measurements
o Depends on the detection process
o Common choices
• Poisson – x-ray photon statistics
• Poisson-Gaussian mixtures – photons + readout noise
• Gaussian (nonuniform variances) – approximates many effects
Poisson Likelihood Function
o Marginal likelihoods:  y_i ~ Poisson{ȳ_i(μ)},  p(y_i | μ) = ȳ_i(μ)^{y_i} e^{−ȳ_i(μ)} / y_i!
o Likelihood:  L(y | μ) = ∏_i p(y_i | μ)
o Log-likelihood:  log L(y | μ) = ∑_i [ y_i log ȳ_i(μ) − ȳ_i(μ) − log(y_i!) ]
o Objective function
μ̂ = arg max_μ log L(y | μ)
Gaussian Likelihood Function
o Marginal likelihoods
l_i = −log(y_i / I_0),   p(l_i | μ) = (1/√(2πσ_i²)) exp( −(l_i − [Aμ]_i)² / (2σ_i²) )
o Likelihood
L(l | μ) = p(l | μ) = ∏_{i=1}^{N} p(l_i | μ) = ∏_{i=1}^{N} (1/√(2πσ_i²)) exp( −(l_i − [Aμ]_i)² / (2σ_i²) )
o Log-likelihood
log L(l | μ) ≅ ∑_{i=1}^{N} −(l_i − [Aμ]_i)² / (2σ_i²)
o Objective function and estimator
μ̂ = arg max_μ log L(l | μ) = [Aᵀ D{1/σ_i²} A]⁻¹ Aᵀ D{1/σ_i²} l
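Under the Gaussian model the estimator is a weighted least-squares solve. A sketch with a small hypothetical system and noiseless data (the matrix and variances here are illustrative, not from the talk's test cases):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical small system: 6 rays, 4 voxels, and nonuniform per-ray variances.
A = np.vstack([np.eye(4), rng.random((2, 4))])
mu_true = np.array([0.2, 0.5, 0.3, 0.4])
sigma2 = rng.uniform(0.5, 2.0, size=6)        # variance of each log measurement

l = A @ mu_true                                # noiseless log data l = -log(y/I0)

# Weighted least squares: mu_hat = [A^T D A]^{-1} A^T D l, D = diag(1/sigma_i^2)
D = np.diag(1.0 / sigma2)
mu_hat = np.linalg.solve(A.T @ D @ A, A.T @ D @ l)

print(np.round(mu_hat, 6))   # recovers mu_true on noiseless, complete data
```

The weighting matrix D is exactly the tomographic analogue of the toy problem's inverse-variance weights: low-noise rays pull harder on the solution.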
Iterative Algorithms
o Plethora of iterative approaches
• Expectation-Maximization – general-purpose methodology
• Gradient-based methods – coordinate ascent/descent
• Optimization transfer – paraboloidal surrogates
• Ordered-subsets methods
o Properties of iterative algorithms
• Monotonicity
• True convergence
• Speed
• Complexity
Statistical Reconstruction Example
o Test case
• Single-slice x-ray transmission problem
• 512 x 512, 0.3 mm volume
• 400 detector bins over 180 angles (360 degrees)
• Poisson noise: 1e5 counts per 0.5 mm detector element
• SO: 380 mm, DO: 220 mm
o Reconstruction
• Voxel basis: 512 x 512, 0.3 mm voxels
• Maximum-likelihood objective
• EM-type algorithm
• Initial image – constant value
• Lots of iterations

[Figures: ML-EM iterations; FBP vs. ML-EM comparison]
Enforcing Desirable Properties
o FBP
• Filter designs – cutoff frequencies
o Iterative methods
• Modify the objective to penalize "bad" images
• Discourage noise
• Preserve desirable image features
• Other prior knowledge
o Image-domain denoising:  μ̂ = arg max_μ F(μ) − β R(μ)
o Model-based reconstruction:  μ̂ = arg max_μ F(y, μ) − β R(μ)
Local Control of Image Properties
o Pairwise penalty
• Penalize the difference between neighboring voxels
R(μ) = ∑_j ∑_{k∈N_j} w_jk ψ(μ_j − μ_k)
o Quadratic penalty
R(μ) = ∑_j ∑_{k∈N_j} w_jk (μ_j − μ_k)²
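A sketch of the quadratic penalty for a 2-D image, assuming unit weights w_jk and a first-order (horizontal/vertical) neighborhood, which is one common simple choice rather than the only one:

```python
import numpy as np

def quadratic_penalty(img):
    """First-order quadratic roughness penalty for a 2-D image:
    sum of (mu_j - mu_k)^2 over horizontal and vertical neighbor
    pairs, with unit weights w_jk (a common simple choice)."""
    dh = img[:, 1:] - img[:, :-1]     # horizontal neighbor differences
    dv = img[1:, :] - img[:-1, :]     # vertical neighbor differences
    return np.sum(dh ** 2) + np.sum(dv ** 2)

flat = np.full((4, 4), 0.3)           # uniform image
edge = np.zeros((4, 4))
edge[:, 2:] = 1.0                     # image with one sharp vertical edge

print(quadratic_penalty(flat))        # 0.0 -- uniform images are not penalized
print(quadratic_penalty(edge))        # 4.0 -- each of the 4 rows pays 1^2 at the edge
```

Note that the edge pays the same per-pair cost as noise of the same amplitude, which is exactly why the quadratic penalty smooths edges and motivates the non-quadratic choices below.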
Penalized-Likelihood Example
o Forward model
• Fan-beam transmission tomography
• 512 x 512, 0.3 mm volume
• 400 detector bins over 180 angles (360 degrees)
• Poisson: {1e4, 1e3} counts per 0.5 mm detector element
• SO: 380 mm, DO: 220 mm
o Objective function
• Poisson likelihood
• Quadratic, first-order penalty
o Algorithm
• Separable paraboloidal surrogates
• 400 iterations – well-converged
[Figures: FBP vs. PL at 1e4 counts; FBP vs. PL at 1e3 counts]
Other Penalties/Energy Functions
o Quadratic penalty
• Tends to enforce smoothness throughout the image
• Increasingly penalizes larger pixel differences
o Non-quadratic penalties
• Attempt to preserve edges in the image
• Once pixel differences become large, allow a decreased penalty (perhaps a relative decrease)
o Flexibility: even more penalties
• Wavelets and other bases, non-local means, etc.
Nonquadratic Penalties
o Choices
• Truncated quadratic
• Lange penalty
• P-norm
[Figures: penalty-function shapes (Lange, p-norm, truncated quadratic); reconstructions at 1e3 counts]
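These potential functions might be sketched as follows; the Lange form used here is one common parameterization among several in the literature, and the parameter values (`delta`, `p`) are illustrative assumptions:

```python
import numpy as np

# Sketches of potential functions psi(t), where t = mu_j - mu_k.

def quadratic(t):
    return t ** 2                             # penalizes large edges heavily

def truncated_quadratic(t, delta=1.0):
    return np.minimum(t ** 2, delta ** 2)     # penalty saturates at delta^2

def lange(t, delta=1.0):
    # One common "Lange" parameterization: quadratic-like near zero,
    # roughly linear for large |t| (assumed form, delta is a tuning knob).
    a = np.abs(t) / delta
    return delta ** 2 * (a - np.log1p(a))

def p_norm(t, p=1.2):
    return np.abs(t) ** p                     # p < 2 charges edges less than quadratic

# For a large difference (an edge), the edge-preserving choices
# charge far less than the quadratic:
t = 5.0
print(quadratic(t), truncated_quadratic(t), lange(t), p_norm(t))
```

All three alternatives grow more slowly than t² for large |t|, which is the "decreased (relative) penalty" on edges described above.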
Closer Look at Image Properties
o Test case
• Single-slice x-ray transmission problem
• 480 x 480, 0.8 mm volume
• 1000 detector bins over 360 angles (360 degrees)
• Poisson noise: 1e5 counts per 0.76 mm detector element
• SO: 600 mm, DO: 600 mm
o Reconstruction
• Penalized-likelihood objective
• Shift-invariant quadratic penalty
• Separable paraboloidal surrogates
• 200 iterations
• Voxel basis: 480 x 480, 0.8 mm voxels
FBP vs. PL, Noise Properties
[Figure: noise images μ̂(y) − mean{μ̂(y)} for FBP and penalized-likelihood reconstructions]
o Noise in FBP
• Shift-variant variance
• Shift-variant covariance
o Noise in quadratic PL
• Relatively shift-invariant variance (in object)
• Shift-variant covariance
FBP vs. PL, Resolution Properties
[Figure: local impulse responses μ̂(y[μ_true + δ_j]) − μ̂(y[μ_true]) at several locations, for FBP and penalized-likelihood reconstructions]
Image Properties
o Filtered backprojection
• Largely shift-invariant spatial resolution
• Shift-variant, object-dependent noise
o Uniform quadratic penalized likelihood
• Shift-variant, object-dependent spatial resolution
• Shift-variant, object-dependent noise
o Edge-preserving penalty methods
• Shift-variant, object-dependent spatial resolution
• Shift-variant, object-dependent noise
• Noise-resolution properties may not even be locally smooth
Learning Objectives I
o Fundamentals of iterative methods
• Approach to forming an iterative algorithm
  Forward model
  Objective function
  Optimization algorithm
• Identify particular classes of methods
  Model-based vs. image-denoising approaches
  Statistical vs. nonstatistical approaches
  Kinds of regularization

Learning Objectives II
o Advantages of iterative approaches
• Intuition behind what these algorithms do
  Fitting reconstruction to observations
  Data weighting by information content
  Importance of regularization
• Flexibility of these techniques
  Arbitrary geometries
  Sophisticated modeling of physics
  General incorporation of desired image properties through regularization

Learning Objectives III
o Images produced by iterative methods
• Differences from traditional reconstruction
  Regularization is key
  Image properties are tied to statistical weighting
  Can depend on the algorithm when the iterative solution has not yet converged
• Image properties associated with iterative reconstruction
  Highly dependent on regularization
  Can be more shift-variant (esp. edge preservation)
  Different noise, texture
Thank You