
CS: Compressed Sensing
Jialin Peng
Outline
• Introduction
• Exact/Stable Recovery Conditions
  – ℓp-norm based recovery
  – OMP based recovery
• Some related recovery algorithms
• Sparse Representation
• Applications
Introduction
Conventional acquisition (sample, then compress, then receive & store):
• High-density sensors, high-speed sampling, ……
• A certain minimum number of samples is required in order to perfectly capture an arbitrary bandlimited signal (the Nyquist rate).
• A large amount of the sampled data will be discarded before receiving & storage.
Sparse Property
• Important classes of signals have naturally sparse representations with respect to fixed bases (e.g., Fourier, wavelet), or concatenations of such bases.
• Audio, images, …
• Although images (or their features) are naturally very high dimensional, in many applications images belonging to the same class exhibit degenerate structure.
• Low-dimensional subspaces, submanifolds
• Representative samples: sparse representation
Transform coding: JPEG, JPEG2000, MPEG, and MP3
The Goal
Develop an end-to-end system:
• Sampling
• Processing
• Reconstruction
• All operations are performed at a low rate: below the Nyquist rate of the input (Nyquist-rate sampling can be too costly, or even physically impossible)
• Relying on structure in the input
Sparse: the simplest choice is the best one
• Signals can often be well approximated as a
linear combination of just a few elements from a
known basis or dictionary.
• When this representation is exact, we say that the signal is sparse.
Remark:
In many cases these high-dimensional
signals contain relatively little information
compared to their ambient dimension.
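To make the remark concrete, here is a minimal NumPy/SciPy sketch (the test signal, the DCT basis, and k are illustrative choices, not from the slides): a smooth signal is approximated by keeping only its k largest transform coefficients, with small error.

    import numpy as np
    from scipy.fft import dct, idct

    n = 256
    t = np.linspace(0.0, 1.0, n)
    x = np.sin(2 * np.pi * 3 * t) + 0.5 * np.cos(2 * np.pi * 7 * t)  # smooth signal

    c = dct(x, norm='ortho')            # coefficients in an orthonormal DCT basis
    k = 10
    small = np.argsort(np.abs(c))[:-k]  # indices of all but the k largest
    c[small] = 0.0                      # keep a k-sparse coefficient vector
    x_k = idct(c, norm='ortho')         # k-term approximation

    print(np.linalg.norm(x - x_k) / np.linalg.norm(x))  # small relative error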
Introduction
Compressed-sensing acquisition (sense, then receive & store):
• Sparse priors on the signal
• Modified sensor: nonuniform sampling
• Imaging algorithm: optimization
• Alleviated sensor requirements, reduced data, ……
Introduction
Measurement model: Φx = y, where Φ is the M × N sensing matrix, x is the unknown N × 1 signal (possibly expressed in an N × N basis), and y is the M × 1 measurement vector, with M ≪ N: the measurement step itself performs the compression.
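A minimal sketch of this model (the dimensions and the Gaussian Φ are illustrative): the sensing step is a single matrix-vector product.

    import numpy as np

    rng = np.random.default_rng(0)
    M, N, K = 64, 256, 8                             # M << N
    Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # M x N sensing matrix
    x = np.zeros(N)                                  # N x 1 signal, K-sparse
    x[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)
    y = Phi @ x                                      # M x 1 measurements: the compression
    print(Phi.shape, y.shape)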
Find the most concise representation:
Compressed sensing: sparse or compressible representation
• A finite-dimensional signal having a sparse or compressible
representation can be recovered from a small set of linear, nonadaptive
measurements
• How should we design the sensing matrix A to ensure that it preserves the information in the signal x?
• How can we recover the original signal x from the measurements y?
• Nonlinear:
  1. Unknown nonzero locations result in a nonlinear model: the choice of which dictionary elements are used can change from signal to signal.
  2. The recovery algorithms are nonlinear.
(A signal is compressible if it is well approximated by a signal with only k nonzero coefficients.)
Introduction
Let Φ be a matrix of size M × N with M ≪ N. For a K-sparse signal x, let y = Φx be the measurement vector. Our goal is to exactly/stably recover the unknown signal from the measurements; the problem is under-determined. Thanks to sparsity, we can reconstruct the signal via ℓ0 minimization:
  min ‖x‖₀  s.t.  y = Φx
How can we recover the unknown signal? Exact/stable recovery conditions.
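A brute-force sketch of the ℓ0 program on a toy instance (the sizes and the ground truth are illustrative) shows both why sparsity makes the under-determined system solvable and why ℓ0 is combinatorial: it enumerates supports of growing size until one fits the data exactly.

    import numpy as np
    from itertools import combinations

    rng = np.random.default_rng(1)
    M, N, K = 6, 10, 2
    Phi = rng.standard_normal((M, N))
    x_true = np.zeros(N)
    x_true[[2, 7]] = [1.0, -2.0]          # 2-sparse ground truth
    y = Phi @ x_true

    # Enumerate supports of size 0, 1, 2, ... and keep the first exact fit.
    for k in range(K + 1):
        hit = None
        for S in map(list, combinations(range(N), k)):
            coef, *_ = np.linalg.lstsq(Phi[:, S], y, rcond=None)
            if np.allclose(Phi[:, S] @ coef, y):
                hit = (S, coef)
                break
        if hit is not None:
            print("support:", hit[0], "values:", hit[1])
            break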
Exact/stable recovery conditions
• The spark of a given matrix A
• Null space property (NSP) of order k
• The restricted isometry property
Remark:
verifying that a general matrix A satisfies any of these properties has combinatorial computational complexity
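For instance, computing the spark (the smallest number of linearly dependent columns) already requires scanning column subsets exhaustively; a minimal sketch, with an illustrative example matrix:

    import numpy as np
    from itertools import combinations

    def spark(A, tol=1e-10):
        """Smallest number of linearly dependent columns of A (exhaustive search)."""
        N = A.shape[1]
        for k in range(1, N + 1):
            for S in map(list, combinations(range(N), k)):
                if np.linalg.matrix_rank(A[:, S], tol=tol) < k:
                    return k
        return N + 1  # all columns are in general position

    A = np.array([[1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0]])
    print(spark(A))  # 3: every pair of columns is independent, all three are not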
Exact/stable recovery conditions
Restricted Isometry Property
• The restricted isometry constant (RIC) δ_K is defined as the smallest constant satisfying
  (1 − δ_K)‖x‖₂² ≤ ‖Φx‖₂² ≤ (1 + δ_K)‖x‖₂²
for all K-sparse vectors x.
• The restricted orthogonality constant (ROC) θ_{K,K′} is the smallest number such that
  |⟨Φu, Φv⟩| ≤ θ_{K,K′} ‖u‖₂ ‖v‖₂
for all u and v with disjoint supports of sizes K and K′.
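Consistent with the remark above, δ_K can be computed exactly only by scanning all size-K supports; a minimal sketch (matrix sizes are illustrative): for each support S, the eigenvalues of Φ_Sᵀ Φ_S must lie in [1 − δ_K, 1 + δ_K].

    import numpy as np
    from itertools import combinations

    def ric(Phi, K):
        """Restricted isometry constant delta_K by exhaustive search over supports."""
        delta = 0.0
        for S in map(list, combinations(range(Phi.shape[1]), K)):
            eigs = np.linalg.eigvalsh(Phi[:, S].T @ Phi[:, S])  # ascending order
            delta = max(delta, 1.0 - eigs[0], eigs[-1] - 1.0)
        return delta

    rng = np.random.default_rng(0)
    Phi = rng.standard_normal((20, 30)) / np.sqrt(20)  # near-unit-norm columns
    print(ric(Phi, 2))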
Exact/stable recovery conditions
• Solving the ℓ0 minimization is NP-hard, so we usually relax it to the ℓ1 or ℓp (0 < p ≤ 1) minimization.
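The ℓ1 relaxation (basis pursuit) is a linear program; here is a minimal SciPy sketch using the standard splitting x = u − v with u, v ≥ 0 (the sizes and the test signal are illustrative):

    import numpy as np
    from scipy.optimize import linprog

    def basis_pursuit(Phi, y):
        """min ||x||_1  s.t.  Phi x = y, solved as a linear program."""
        M, N = Phi.shape
        c = np.ones(2 * N)                 # ||x||_1 = sum(u) + sum(v)
        A_eq = np.hstack([Phi, -Phi])      # Phi (u - v) = y
        res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
        return res.x[:N] - res.x[N:]

    rng = np.random.default_rng(0)
    M, N = 30, 60
    Phi = rng.standard_normal((M, N)) / np.sqrt(M)
    x = np.zeros(N)
    x[[5, 17, 40]] = [2.0, -1.0, 3.0]      # 3-sparse signal
    x_hat = basis_pursuit(Phi, Phi @ x)
    print(np.max(np.abs(x - x_hat)))       # typically tiny: exact recovery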
Exact/stable recovery conditions
• For the inaccurate measurement y = Φx + e, the stable reconstruction model is
  min ‖x‖₁  s.t.  ‖y − Φx‖₂ ≤ ε
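In practice this noisy model is often solved in its Lagrangian form min_x ½‖y − Φx‖₂² + λ‖x‖₁, for a suitable λ depending on ε; a minimal iterative-soft-thresholding sketch of that form (λ and the fixed iteration count are illustrative simplifications):

    import numpy as np

    def ista(Phi, y, lam, iters=500):
        """Iterative soft thresholding for min 0.5 ||y - Phi x||_2^2 + lam ||x||_1."""
        L = np.linalg.norm(Phi, 2) ** 2   # Lipschitz constant of the gradient
        x = np.zeros(Phi.shape[1])
        for _ in range(iters):
            g = x + Phi.T @ (y - Phi @ x) / L                       # gradient step
            x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)   # soft threshold
        return x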
Exact/stable recovery conditions
• Some other Exact/Stable Recovery Conditions:
Exact/stable recovery conditions
• Baraniuk et al. have proved that for some random matrices, such as
  – Gaussian,
  – Bernoulli,
  – ……
we can exactly/stably reconstruct the unknown signal with overwhelmingly high probability.
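A sketch of the two constructions (the normalization is chosen so that columns have unit norm in expectation; the sizes are illustrative). Such matrices satisfy the RIP with overwhelming probability once M is on the order of K log(N/K).

    import numpy as np

    rng = np.random.default_rng(0)
    M, N = 64, 256
    Phi_gaussian = rng.standard_normal((M, N)) / np.sqrt(M)           # N(0, 1/M) entries
    Phi_bernoulli = rng.choice([-1.0, 1.0], size=(M, N)) / np.sqrt(M) # +-1/sqrt(M) entries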
Exact/stable recovery conditions
cf. ℓ1 minimization
Exact/stable recovery conditions
• Some evidence indicates that
  min ‖x‖_p  s.t.  y = Φx,
with 0 < p < 1, can exactly/stably recover the signal with fewer measurements.
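A minimal iteratively-reweighted-least-squares sketch for this nonconvex ℓp program (the smoothing parameter eps and the fixed iteration count are simplifications; practical versions decrease eps gradually):

    import numpy as np

    def irls_lp(Phi, y, p=0.5, iters=50, eps=1e-8):
        """IRLS for min ||x||_p^p  s.t.  Phi x = y, with 0 < p < 1."""
        x = np.linalg.pinv(Phi) @ y                  # least-squares start
        for _ in range(iters):
            w = (x ** 2 + eps) ** (p / 2.0 - 1.0)    # weights ~ |x|^(p-2)
            Q = np.diag(1.0 / w)
            # Minimizer of the weighted quadratic sum(w_i x_i^2) s.t. Phi x = y:
            x = Q @ Phi.T @ np.linalg.solve(Phi @ Q @ Phi.T, y)
        return x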
Quicklook Interpretation
• Dimensionality-reducing projection.
• Approximately isometric embedding (RIP): pairwise Euclidean distances are nearly preserved in the reduced space.
Quicklook Interpretation
• The ℓ2 norm penalizes large coefficients heavily; solutions therefore tend to have many smaller coefficients.
• In the ℓ1 norm, many small coefficients carry a larger penalty than a few large coefficients, so sparse solutions are preferred.
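A one-line example: for the under-determined constraint x₁ + 2x₂ = 2, the minimum-ℓ2 solution is Φᵀ(ΦΦᵀ)⁻¹y = (2/5, 4/5), with both entries nonzero, while the minimum-ℓ1 solution is (0, 1): the ℓ1 ball first touches the constraint line at a vertex on a coordinate axis, which is exactly the sparsity-promoting behavior described above.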
Algorithms
• ℓ1 minimization algorithms
  – iterative soft thresholding
  – iteratively reweighted least squares
  – …
• Greedy algorithms (an OMP sketch follows below)
  – Orthogonal Matching Pursuit
  – iterative thresholding
• Combinatorial algorithms
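A minimal sketch of Orthogonal Matching Pursuit (assuming the sparsity level K is known; the least-squares re-fit on the current support is what makes it "orthogonal"):

    import numpy as np

    def omp(Phi, y, K):
        """Greedily select K columns of Phi, re-fitting the coefficients each round."""
        r, S = y.copy(), []
        for _ in range(K):
            S.append(int(np.argmax(np.abs(Phi.T @ r))))           # best-correlated column
            coef, *_ = np.linalg.lstsq(Phi[:, S], y, rcond=None)  # orthogonal re-fit
            r = y - Phi[:, S] @ coef                              # new residual
        x = np.zeros(Phi.shape[1])
        x[S] = coef
        return x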
CS builds upon the fundamental fact that
• we can represent many signals using only a few
non-zero coefficients in a suitable basis or
dictionary.
• Nonlinear optimization can then enable recovery
of such signals from very few measurements.
• Sparse property
• The basis for representing the data
• Incoherent → a task-specific (often overcomplete) dictionary, or a redundant one
MRI Reconstruction
MR images are usually sparse in certain transform domains, such as finite differences and wavelets.
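A tiny illustration of sparsity under finite differences (the signal is an illustrative stand-in for a piecewise-constant image profile):

    import numpy as np

    x = np.concatenate([np.full(40, 1.0), np.full(30, 3.0), np.full(30, -2.0)])
    d = np.diff(x)                # finite-difference (total-variation) domain
    print(np.count_nonzero(d))    # 2 nonzeros, although x itself is fully dense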
Sparse Representation
Consider a family of images representing natural and typical image content:
• Such images are very diverse vectors in the high-dimensional ambient space.
• Do they occupy the entire space?
• Spatially smooth images occur much more often than highly non-smooth and disorganized images.
• An ℓ1-norm measure leads to an enforcement of sparsity of the signal/image derivatives.
• Sparse representation
Matrix completion algorithms
• Recovering an unknown (approximately) low-rank matrix from a sample of its entries:
  min_X rank(X)  s.t.  X_ij = M_ij, (i, j) ∈ Ω        (NP-hard)
  min_X ‖X‖_*  s.t.  X_ij = M_ij, (i, j) ∈ Ω          (convex relaxation)
  min_X ‖X‖_* + (μ/2) Σ_{(i,j)∈Ω} (X_ij − M_ij)²      (unconstrained)
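A minimal proximal-gradient sketch of the unconstrained model (step size 1 is valid because the data-fit gradient is 1-Lipschitz; λ and the rank-3 test matrix are illustrative, and practical solvers shrink λ over the run):

    import numpy as np

    def complete(M_obs, mask, lam, iters=300):
        """Proximal gradient for min_X lam ||X||_* + 0.5 ||mask*(X - M)||_F^2."""
        X = np.zeros_like(M_obs)
        for _ in range(iters):
            G = X - mask * (X - M_obs)                       # gradient step
            U, s, Vt = np.linalg.svd(G, full_matrices=False)
            X = (U * np.maximum(s - lam, 0.0)) @ Vt          # singular value shrinkage
        return X

    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 3)) @ rng.standard_normal((3, 20))  # rank 3
    mask = (rng.random(A.shape) < 0.5).astype(float)   # observe ~half the entries
    X_hat = complete(A * mask, mask, lam=0.1)
    print(np.linalg.norm(X_hat - A) / np.linalg.norm(A))   # small relative error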