Sparse Signal Recovery

An Introduction to Compressed Sensing
Student: Shenghan TSAI
Advisor: Hsuan-Jung Su and Pin-Hsun Lin
Date: May 02, 2014
Outline
• Introduction
• Signal: Sparse and Compressible
- Sparse & Compressible
- Power law
- The p-norm in finite dimensions
• Sensing Matrices
- NSP (Null space property)
- RIP (Restricted isometry property)
• Sparse Signal Recovery
• Conclusion
Introduction
Compressed Sensing
• Compressed sensing is a signal processing technique for efficiently acquiring and reconstructing a signal by finding solutions to underdetermined linear systems. It takes advantage of the signal's sparsity or compressibility in some domain, allowing the entire signal to be determined from relatively few measurements.
History
• "If we sample a signal at twice its highest frequency, then we can recover it exactly." (Whittaker-Nyquist-Kotelnikov-Shannon sampling theorem)
• In 2004, Emmanuel Candès, Terence Tao, and David Donoho showed that, given knowledge of a signal's sparsity, the signal can be reconstructed from far fewer measurements than the sampling theorem requires.
Motivation
How does it work?
• $y = \Phi x$
• Sensing matrix: $\Phi \in \mathbb{R}^{M \times N}$, $M \ll N$
• If the support $T_0$ of the sparse signal is known, it can be recovered from the measurements by least squares on that support (see the sketch below):
$\hat{x} = (\Phi_{T_0}^T \Phi_{T_0})^{-1} \Phi_{T_0}^T (\Phi x)$
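To make the pipeline concrete, here is a minimal sketch in Python (the Gaussian Φ, the sizes M, N, K, and the variable names are assumptions for illustration, not taken from the slides): it measures a K-sparse x with M ≪ N and recovers it by least squares on the known support T0, as in the formula above.

```python
# Minimal sketch: y = Phi x with M << N, then least squares on the known support T0.
import numpy as np

rng = np.random.default_rng(0)
M, N, K = 20, 100, 5                               # M << N
Phi = rng.standard_normal((M, N)) / np.sqrt(M)     # assumed Gaussian sensing matrix

x = np.zeros(N)
T0 = rng.choice(N, size=K, replace=False)          # support of the K-sparse signal
x[T0] = rng.standard_normal(K)
y = Phi @ x                                        # M measurements of an N-dimensional signal

# x_hat_T0 = (Phi_T0^T Phi_T0)^{-1} Phi_T0^T y, zero elsewhere
x_hat = np.zeros(N)
x_hat[T0] = np.linalg.lstsq(Phi[:, T0], y, rcond=None)[0]

print("recovery error:", np.linalg.norm(x_hat - x))   # essentially zero when T0 is known
```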
Deterministic compressive sensing
• Signal: Sparse and Compressible
• Sensing Matrices: NSP and RIP
Sparse & Compressible
• Sparse model: signals of interest are often sparse or compressible, i.e., they have very few large coefficients and many coefficients close to zero.
• Sparse signals have few non-zero coefficients: a signal x is K-sparse if it has at most K nonzeros, i.e. $x \in \Sigma_K = \{x : \|x\|_0 \le K\}$.
• Compressible signals have few significant coefficients; the coefficients decay as a power law. (A small sketch of these definitions follows below.)
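A minimal numerical sketch of these definitions (the vector and K are made-up values, not from the slides): count the nonzeros to check K-sparsity, and form the best K-term approximation that compressibility is measured against.

```python
# Minimal sketch: l0 count and best K-term approximation of an assumed vector.
import numpy as np

x = np.array([0.0, 3.0, 0.0, -1.5, 0.0, 0.2, 0.0, 0.0])
K = 2

print("||x||_0 =", np.count_nonzero(x))              # 3, so x is not 2-sparse
print("x is K-sparse:", np.count_nonzero(x) <= K)    # False

# Best K-term approximation: keep the K largest-magnitude entries.
xK = np.zeros_like(x)
keep = np.argsort(np.abs(x))[-K:]
xK[keep] = x[keep]
print("best K-term l1 approximation error:", np.linalg.norm(x - xK, 1))   # 0.2
```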
Power law
• A signal is compressible if its sorted coefficient magnitudes decay rapidly.
• Let $x \in \mathbb{R}^N$ be a signal with sorted magnitudes $|x|_{(1)} \ge |x|_{(2)} \ge \dots$; it obeys a power law decay if
$|x|_{(s)} \le C_1 s^{-q}, \qquad s = 1, 2, \dots$
• The faster the decay (the larger q), the more compressible the signal. (A numerical illustration follows below.)
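A small numerical illustration (parameters are assumptions, not from the slides) of how the exponent q controls compressibility: the faster the sorted magnitudes decay, the smaller the tail left after keeping the K largest coefficients.

```python
# Minimal sketch: power-law decay |x_(s)| = C1 * s^(-q) and the l1 tail after K terms.
import numpy as np

N, K, C1 = 1000, 20, 1.0
s = np.arange(1, N + 1)
for q in (0.5, 1.0, 2.0):
    x = C1 * s ** (-q)                  # coefficients already sorted by magnitude
    tail = np.linalg.norm(x[K:], 1)     # l1 error of the best K-term approximation
    print(f"q = {q}: l1 tail after keeping {K} terms = {tail:.3f}")
```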
The p-norm in finite dimensions
• The $\ell_p$ norm of $x \in \mathbb{R}^n$ is $\|x\|_p = \left(|x_1|^p + |x_2|^p + \dots + |x_n|^p\right)^{1/p}$.
• Ex, p = 2: $\|x\|_2 = \sqrt{\langle x, x \rangle}$
• Ex, p = 0: $\|x\|_0 = |\mathrm{supp}(x)| = |\{i : x_i \ne 0\}|$, the number of nonzero entries (not a true norm)
• Ex, p = ∞: $\|x\|_\infty = \max\{|x_1|, |x_2|, \dots, |x_n|\}$
(These quantities are computed in the sketch below.)
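The same quantities computed with numpy (a minimal sketch on an assumed vector); note that the p = 0 case is just a count of nonzero entries.

```python
# Minimal sketch: l2, l0, l_infinity (and l1) of an assumed vector.
import numpy as np

x = np.array([3.0, 0.0, -4.0, 0.0, 1.0])

l2 = np.sqrt(np.dot(x, x))        # ||x||_2 = sqrt(<x, x>)  -> 5.099
l0 = np.count_nonzero(x)          # |{i : x_i != 0}|        -> 3
linf = np.max(np.abs(x))          # max_i |x_i|             -> 4.0
l1 = np.sum(np.abs(x))            # ||x||_1 (next slide)    -> 8.0

print(l2, l0, linf, l1)
```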
The p-norm in finite dimensions
• The grid distance between two points is never shorter than the length of the line segment between them. Formally, the Euclidean norm of any vector is bounded by its 1-norm:
$\|x\|_2 \le \|x\|_1$
• Conversely, using the Cauchy–Schwarz inequality, for a K-sparse x,
$\frac{\|x\|_1}{\sqrt{K}} \le \|x\|_2$
(The Cauchy–Schwarz step is written out below.)
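The Cauchy–Schwarz step, written out for a K-sparse x (a short derivation sketch following the standard argument, not copied from the slides):

```latex
\|x\|_1
  = \sum_{i \in \mathrm{supp}(x)} |x_i| \cdot 1
  \le \Big(\sum_{i \in \mathrm{supp}(x)} |x_i|^2\Big)^{1/2}
      \Big(\sum_{i \in \mathrm{supp}(x)} 1^2\Big)^{1/2}
  \le \sqrt{K}\,\|x\|_2,
\qquad\text{hence}\qquad
\frac{\|x\|_1}{\sqrt{K}} \le \|x\|_2 .
```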
Sensing Matrices
• NSP (Null space property)
• RIP (Restricted isometry property)
Sensing Matrices
• $y = \Phi x$
• Sensing matrix: $\Phi \in \mathbb{R}^{M \times N}$, $M \ll N$
How to design Sensing Matrices
• If we are sure our data are sparse or compressible, we want to design Φ with $M \ll N$ such that x can still be recovered from y.
• To ensure that Φ allows recovery, two properties, the NSP and the RIP, need to hold, and from them we obtain a bound on the number of measurements M.
The Null space property
• If we want to recover all K-sparse signals, we require $\Phi x_1 \ne \Phi x_2$ for all K-sparse $x_1 \ne x_2$.
• So Φ must have at least 2K rows; otherwise there exist distinct K-sparse $x_1, x_2$ such that $\Phi(x_1 - x_2) = 0$.
• The spark expresses almost the same requirement: uniqueness of K-sparse solutions is equivalent to $\mathrm{spark}(\Phi) > 2K$, which in turn forces $M \ge 2K$. (A brute-force spark check is sketched below.)
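A brute-force sketch of the spark condition (sizes and the random Φ are assumptions; this enumeration is only feasible for tiny matrices, since computing the spark is combinatorial):

```python
# Minimal sketch: spark(Phi) = size of the smallest linearly dependent set of columns.
import itertools
import numpy as np

def spark(Phi, tol=1e-10):
    M, N = Phi.shape
    for k in range(1, N + 1):
        for cols in itertools.combinations(range(N), k):
            if np.linalg.matrix_rank(Phi[:, list(cols)], tol=tol) < k:
                return k
    return N + 1                     # all columns in general position

rng = np.random.default_rng(0)
Phi = rng.standard_normal((4, 8))    # M = 4, N = 8
K = 2
print("spark(Phi) =", spark(Phi))                      # generically M + 1 = 5
print("unique K-sparse solutions:", spark(Phi) > 2 * K)
```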
The Null space property
• Φ satisfies the null space property (NSP) of order K if there exists a constant C > 0 such that
$\|h_\Lambda\|_2 \le C \, \frac{\|h_{\Lambda^c}\|_1}{\sqrt{K}}$
holds for all $h \in \mathcal{N}(\Phi)$ and for all index sets Λ with $|\Lambda| \le K$.
• For a recovery algorithm $\Delta : \mathbb{R}^M \to \mathbb{R}^N$, the NSP of order 2K is necessary for a guarantee of the form
$\|\Delta(\Phi x) - x\|_2 \le C \, \frac{\sigma_K(x)_1}{\sqrt{K}}$
to hold for all x, where $\sigma_K(x)_p = \min_{\hat{x} \in \Sigma_K} \|x - \hat{x}\|_p$ is the best K-term approximation error.
(A rough numerical check of the NSP is sketched below.)
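A rough numerical check of the NSP (sizes and the Gaussian Φ are assumptions, not from the slides): sample vectors from N(Φ) and brute-force all supports of size K to see how large the constant C would have to be; sampling only gives a lower bound on the required constant.

```python
# Minimal sketch: lower-bound the NSP constant of order K by sampling the null space.
import itertools
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(0)
M, N, K = 6, 10, 2
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

Z = null_space(Phi)                                   # columns span N(Phi)
needed_C = 0.0
for _ in range(500):
    h = Z @ rng.standard_normal(Z.shape[1])           # random vector in the null space
    for Lam in itertools.combinations(range(N), K):
        mask = np.zeros(N, dtype=bool)
        mask[list(Lam)] = True
        lhs = np.linalg.norm(h[mask])                          # ||h_Lambda||_2
        rhs = np.linalg.norm(h[~mask], 1) / np.sqrt(K)         # ||h_Lambda^c||_1 / sqrt(K)
        if rhs > 1e-12:
            needed_C = max(needed_C, lhs / rhs)

print(f"NSP of order {K} needs C >= {needed_C:.3f} (sampled lower bound)")
```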
The restricted isometry property
• Recall the target guarantee $\|\Delta(\Phi x) - x\|_2 \le C \, \sigma_K(x)_1 / \sqrt{K}$.
• A matrix Φ satisfies the restricted isometry property (RIP) of order K if there exists a $\delta_K \in (0, 1)$ such that
$(1 - \delta_K)\,\|x\|_2^2 \le \|\Phi x\|_2^2 \le (1 + \delta_K)\,\|x\|_2^2$
holds for all K-sparse x.
• This makes sure Φx is stable: Φ approximately preserves the length of every K-sparse vector. (An empirical estimate of $\delta_K$ is sketched below.)
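An empirical sketch of the RIP constant (sizes and the Gaussian Φ are assumptions): for every support of size K, the extreme singular values of the corresponding column submatrix bound how much Φ can stretch or shrink K-sparse vectors.

```python
# Minimal sketch: estimate delta_K by brute force over all size-K column submatrices.
import itertools
import numpy as np

rng = np.random.default_rng(0)
M, N, K = 20, 40, 3
Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # columns have unit norm in expectation

delta_K = 0.0
for T in itertools.combinations(range(N), K):
    s = np.linalg.svd(Phi[:, list(T)], compute_uv=False)
    # For x supported on T: s_min^2 ||x||_2^2 <= ||Phi x||_2^2 <= s_max^2 ||x||_2^2
    delta_K = max(delta_K, abs(s[0] ** 2 - 1), abs(s[-1] ** 2 - 1))

print(f"empirical RIP constant of order {K}: {delta_K:.3f}")
```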
The RIP and NSP (relationship between RIP and NSP)
• Suppose that Φ satisfies the RIP of order 2K with $\delta_{2K} < \sqrt{2} - 1$. Then Φ satisfies the NSP of order 2K with constant
$C = \frac{\sqrt{2}\,\delta_{2K}}{1 - (1 + \sqrt{2})\,\delta_{2K}}$
• Key lemma: suppose that Φ satisfies the RIP of order 2K and let $h \in \mathbb{R}^N$ be arbitrary. Pick $\Lambda_0 \subset \{1, 2, \dots, N\}$ with $|\Lambda_0| \le K$, let $\Lambda_1$ be the index set of the K largest-magnitude entries of $h_{\Lambda_0^c}$, and set $\Lambda = \Lambda_0 \cup \Lambda_1$. Then
$\|h_\Lambda\|_2 \le \alpha \, \frac{\|h_{\Lambda_0^c}\|_1}{\sqrt{K}} + \beta \, \frac{|\langle \Phi h_\Lambda, \Phi h \rangle|}{\|h_\Lambda\|_2}$
where
$\alpha = \frac{\sqrt{2}\,\delta_{2K}}{1 - \delta_{2K}}, \qquad \beta = \frac{1}{1 - \delta_{2K}}$
(A worked evaluation of the NSP constant follows below.)
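As a quick sanity check on the constant (a worked example with an assumed value, not from the slides), take $\delta_{2K} = 0.2$, which satisfies $\delta_{2K} < \sqrt{2} - 1 \approx 0.414$:

```latex
C = \frac{\sqrt{2}\,\delta_{2K}}{1-(1+\sqrt{2})\,\delta_{2K}}
  = \frac{1.4142 \times 0.2}{1 - 2.4142 \times 0.2}
  = \frac{0.2828}{0.5172}
  \approx 0.55 .
```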
Sparse Signal Recovery
• NSP (Null space property)
• RIP (Restricted isometry property)
Sparse Signal Recovery
• $\hat{x} = \arg\min_z \|z\|_1$ subject to $z \in \mathcal{B}(y)$
• Noiseless case: $\mathcal{B}(y) = \{z : \Phi z = y\}$
• In noise: $\mathcal{B}(y) = \{z : \|\Phi z - y\|_2 \le \epsilon\}$
(A basis pursuit sketch follows below.)
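A minimal basis pursuit sketch for the noiseless case, written as a linear program with scipy.optimize.linprog (sizes, the Gaussian Φ, and the splitting z = u - v are standard modelling choices assumed here, not taken from the slides):

```python
# Minimal sketch: x_hat = argmin ||z||_1 s.t. Phi z = y, solved as an LP.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
M, N, K = 25, 60, 4
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

x = np.zeros(N)
x[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)
y = Phi @ x                                      # noiseless measurements

# Split z = u - v with u, v >= 0, so ||z||_1 = sum(u) + sum(v).
c = np.ones(2 * N)
A_eq = np.hstack([Phi, -Phi])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * N), method="highs")
x_hat = res.x[:N] - res.x[N:]

print("l1 recovery error:", np.linalg.norm(x_hat - x))
```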
Cont.
• The choice of $\mathcal{B}(y)$ enters the analysis only through the term $|\langle \Phi h_\Lambda, \Phi h \rangle|$.
• In the noiseless case $\mathcal{B}(y) = \{z : \Phi z = y\}$ and $y = \Phi x$, so both x and $\hat{x}$ are feasible; with $h = \hat{x} - x$ we get $\Phi h = 0$ and the inner-product term vanishes.
• Therefore
$\|\Delta(\Phi x) - x\|_2 = \|\hat{x} - x\|_2 = \|h\|_2 \le C_0 \, \frac{\sigma_K(x)_1}{\sqrt{K}}$
with
$C_0 = 2\,\frac{1 - (1 - \sqrt{2})\,\delta_{2K}}{1 - (1 + \sqrt{2})\,\delta_{2K}}$
(a worked evaluation of $C_0$ follows below).
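With the same assumed $\delta_{2K} = 0.2$ as in the earlier worked example, the noiseless recovery constant evaluates to:

```latex
C_0 = 2\,\frac{1-(1-\sqrt{2})\,\delta_{2K}}{1-(1+\sqrt{2})\,\delta_{2K}}
    = 2 \times \frac{1 + 0.4142 \times 0.2}{1 - 2.4142 \times 0.2}
    = 2 \times \frac{1.0828}{0.5172}
    \approx 4.19 .
```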
Recovery in noise
• Now $\mathcal{B}(y) = \{z : \|\Phi z - y\|_2 \le \epsilon\}$ and $y = \Phi x + e$ with $\|e\|_2 \le \epsilon$.
• As before let $h = \hat{x} - x$, pick $\Lambda_0 \subset \{1, 2, \dots, N\}$ with $|\Lambda_0| \le K$, let $\Lambda_1$ be the index set of the K largest-magnitude entries of $h_{\Lambda_0^c}$, and set $\Lambda = \Lambda_0 \cup \Lambda_1$.
• The lemma gives
$\|\Delta(\Phi x) - x\|_2 = \|\hat{x} - x\|_2 = \|h\|_2 \le C_0 \, \frac{\sigma_K(x)_1}{\sqrt{K}} + C_1 \, \frac{|\langle \Phi h_\Lambda, \Phi h \rangle|}{\|h_\Lambda\|_2}$
with
$C_0 = 2\,\frac{1 - (1 - \sqrt{2})\,\delta_{2K}}{1 - (1 + \sqrt{2})\,\delta_{2K}}, \qquad C_1 = \frac{2}{1 - (1 + \sqrt{2})\,\delta_{2K}}$
• Since both x and $\hat{x}$ lie in $\mathcal{B}(y)$, $\|\Phi h\|_2 \le 2\epsilon$, and bounding the inner product via the RIP turns the second term into a constant times $\epsilon$.
• If an oracle told us the true support $T_0$ of x, we could instead recover x by least squares on that support:
$\hat{x}_{T_0} = (\Phi_{T_0}^T \Phi_{T_0})^{-1} \Phi_{T_0}^T \, y, \qquad \hat{x}_{T_0^c} = 0$
• Substituting $y = \Phi x + e$ (with x supported on $T_0$):
$\hat{x}_{T_0} = (\Phi_{T_0}^T \Phi_{T_0})^{-1} \Phi_{T_0}^T (\Phi x + e) = x_{T_0} + (\Phi_{T_0}^T \Phi_{T_0})^{-1} \Phi_{T_0}^T e$
so the error is
$\|\hat{x} - x\|_2 = \|(\Phi_{T_0}^T \Phi_{T_0})^{-1} \Phi_{T_0}^T e\|_2$
(A numerical sketch of this oracle estimator follows below.)
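A minimal numerical sketch of this oracle estimator (sizes, noise level, and the Gaussian Φ are assumptions): least squares on the true support T0, with the observed error matching $\|(\Phi_{T_0}^T \Phi_{T_0})^{-1} \Phi_{T_0}^T e\|_2$.

```python
# Minimal sketch: oracle least squares on the known support T0 under measurement noise.
import numpy as np

rng = np.random.default_rng(0)
M, N, K = 25, 60, 4
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

x = np.zeros(N)
T0 = rng.choice(N, size=K, replace=False)        # oracle support
x[T0] = rng.standard_normal(K)
e = 0.01 * rng.standard_normal(M)                # noise with ||e||_2 <= epsilon
y = Phi @ x + e

Phi_T0 = Phi[:, T0]
x_hat = np.zeros(N)
x_hat[T0] = np.linalg.lstsq(Phi_T0, y, rcond=None)[0]    # (Phi_T0^T Phi_T0)^{-1} Phi_T0^T y

err = np.linalg.norm(x_hat - x)
predicted = np.linalg.norm(np.linalg.pinv(Phi_T0) @ e)   # ||(Phi_T0^T Phi_T0)^{-1} Phi_T0^T e||_2
print(f"oracle error {err:.6f} vs predicted {predicted:.6f}")
```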
Conclusion
• If signals are sparse or compressible, compressed sensing can acquire them from far fewer measurements than the Nyquist rate requires.
• Signals can be recovered accurately (exactly, in the noiseless K-sparse case) if the sensing matrix satisfies the NSP and RIP.
References
1. E. Candès, "The restricted isometry property and its implications for compressed sensing," Comptes rendus de l'Académie des Sciences, Série I, 346(9-10):589–592, 2008.
2. G. Yu and G. Sapiro, "Statistical Compressed Sensing of Gaussian Mixture Models," IEEE Trans. Signal Processing, vol. 59, no. 12, pp. 5842–5857, Dec. 2011.
3. R. Baraniuk, M. A. Davenport, M. F. Duarte, and C. Hegde, An Introduction to Compressive Sensing, CONNEXIONS, Rice University, Houston, Texas, 2010.
4. R. G. Baraniuk, "Compressive sensing," IEEE Signal Processing Mag., vol. 24, no. 4, pp. 118–120, 124, 2007.
5. http://en.wikipedia.org/wiki/Lp_space
6. http://en.wikipedia.org/wiki/Compressed_sensing