1 Byrne
The objective of Compressed Sensing (CS) is to exploit sparsity in order to reconstruct a sparse coefficient vector x, and hence the original signal f, from relatively few linear functional measurements y. The signal is expanded in an orthonormal basis V = {v1, v2, ..., vJ} spanning R^J:

f = x1 v1 + x2 v2 + ... + xJ vJ = V^T x   (1)
Since V is orthonormal (V V^T = I), it follows from the above equation that:

x = V f   (2)
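A minimal NumPy sketch of this pair of equations; the random orthonormal basis here is an assumption, standing in for whatever sparsity basis V is in use:

```python
import numpy as np

# Build a random orthonormal basis as a stand-in for V
# (rows of V are the basis vectors v_j).
rng = np.random.default_rng(0)
J = 8
Q, _ = np.linalg.qr(rng.standard_normal((J, J)))
V = Q.T  # rows v_1, ..., v_J form an orthonormal basis of R^J

# A sparse coefficient vector x with only two nonzero entries.
x = np.zeros(J)
x[1], x[5] = 3.0, -2.0

f = V.T @ x    # equation (1): f = x_1 v_1 + ... + x_J v_J
x_rec = V @ f  # equation (2): exact recovery because V V^T = I
```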
A second orthonormal basis U = {u1, u2, ..., uJ}, also spanning R^J, is selected to maximise the incoherence between V and U.
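Incoherence can be quantified by the mutual coherence, the largest absolute inner product between any u_i and v_j. A small sketch; the spike/Hadamard pairing is an illustrative assumption, chosen because it is a maximally incoherent pair:

```python
import numpy as np
from scipy.linalg import hadamard

def mutual_coherence(U, V):
    # Largest |<u_i, v_j>| over all pairs; rows of U and V are basis vectors.
    return np.max(np.abs(U @ V.T))

J = 8
spike = np.eye(J)              # standard (spike) basis
H = hadamard(J) / np.sqrt(J)   # orthonormal Hadamard basis

mu = mutual_coherence(spike, H)  # 1/sqrt(J), the smallest value possible
```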
A^T = U K   (3)

The matrix A^T = {a1^T, a2^T, ..., aI^T}, where ai^T = uj if ki,j = 1; K is therefore a selection matrix that picks I of the J basis vectors. The inner-product functional measurement between the sparse signal and a member of A^T is defined to be:
yi = ⟨ai^T, x⟩   (4)
Following this the measured vector is:
y = Ax (5)
where y is an I×1 vector and A is an I×J matrix.
y = AVf = Φf (6)
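The measurement step can be sketched as follows; the choice of the spike basis for V and a partial Hadamard matrix for the selected rows a_i is an assumption for illustration:

```python
import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(1)
J, I = 8, 5

V = np.eye(J)                  # sparsity basis (spike), so f = x here
U = hadamard(J) / np.sqrt(J)   # measurement basis, incoherent with V

# K selects I of the J rows of U; those rows become the a_i of A.
rows = rng.choice(J, size=I, replace=False)
A = U[rows]                    # I x J measurement matrix

x = np.zeros(J)
x[2], x[6] = 1.0, -4.0         # sparse coefficient vector
f = V.T @ x                    # equation (1)
y = A @ x                      # equation (5): I measurements instead of J samples
```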
The signal is recovered by finding the sparsest estimate x̂ consistent with the measurements:

Minimise ||x̂||0 subject to   (7)

y = A x̂   (8)

This recovers x exactly provided x is sufficiently sparse and I is sufficiently large. Minimising the support of x̂, represented by the l0 norm ||x̂||0, requires computationally intensive combinatorial algorithms. It was proven that, for sufficiently sparse vectors, minimising the l1 norm solves the same problem. This allows less taxing Linear Programming (LP) algorithms to be applied to recover x.
Minimise ||x̂||1   (9)

subject to:

y = A x̂   (10)
where

||x̂||1 = Σ_{i=1}^{J} |x̂i|   (11)
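This LP can be sketched with SciPy's linprog by splitting x̂ = u − v with u, v ≥ 0, so that ||x̂||1 = Σ(ui + vi); the partial-Hadamard A and the toy sizes are assumptions for illustration:

```python
import numpy as np
from scipy.linalg import hadamard
from scipy.optimize import linprog

rng = np.random.default_rng(2)
J, I = 16, 8

# Measurements of a 2-sparse x through a partial Hadamard matrix.
A = (hadamard(J) / np.sqrt(J))[rng.choice(J, size=I, replace=False)]
x = np.zeros(J)
x[[3, 11]] = [2.0, -1.0]
y = A @ x

# Minimise sum(u + v) subject to [A, -A](u; v) = y and u, v >= 0,
# i.e. equations (9)-(11) written as a linear programme.
c = np.ones(2 * J)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y, bounds=(0, None))
x_hat = res.x[:J] - res.x[J:]  # with enough measurements this matches x
```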
Now suppose we have two different bases for RN : Φ = {φ1 , φ2 , ..., φN } and
Ψ = {ψ 1 , ψ 2 , ..., ψ N }. Then
S = [φ1 φ2 ... φN][α1, α2, ..., αN]^T = [ψ1 ψ2 ... ψN][β1, β2, ..., βN]^T   (13)
A dictionary is a joint, over-complete set of vectors created by concatenating the vectors from different bases. For this example the dictionary will be {Φ, Ψ} = {φ1, φ2, ..., φN, ψ1, ψ2, ..., ψN}, which gives an N×2N matrix. A problem arises when representing a signal in terms of the dictionary, since multiple representations are possible for the same signal. Of these multiple representations, finding the sparsest is a difficult optimisation problem. Suppose we have:
S = [φ1 ... φN ψ1 ... ψN][γ1^φ, ..., γN^φ, γ1^ψ, ..., γN^ψ]^T = [Φ Ψ] γ = Σ_{i=1}^{N} γi^φ φi + Σ_{i=1}^{N} γi^ψ ψi   (14)
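A small sketch of this non-uniqueness, assuming Φ is the spike basis and Ψ an orthonormal Hadamard basis (basis vectors stored as columns): a single spike has a 1-sparse representation in Φ but also an exact N-term representation in Ψ.

```python
import numpy as np
from scipy.linalg import hadamard

N = 4
Phi = np.eye(N)                   # spike basis, columns phi_i
Psi = hadamard(N) / np.sqrt(N)    # orthonormal Hadamard basis, columns psi_i
D = np.hstack([Phi, Psi])         # the N x 2N dictionary [Phi, Psi]

S = np.array([1.0, 0.0, 0.0, 0.0])  # the signal: one spike

# Two different gammas with D @ gamma = S:
gamma_spike = np.concatenate([S, np.zeros(N)])         # 1 nonzero, the sparsest
gamma_dense = np.concatenate([np.zeros(N), Psi.T @ S]) # N nonzeros
```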
Choosing γ involves solving an under-determined set of N equations with 2N unknowns. To obtain the sparsest representation, the additional requirement of minimising the support of γ must be met, where the support of γ is the number of nonzero entries in γ. Hence the following problem must be solved:
(P0) Minimise ||γ||0 subject to S = [Φ Ψ] γ
where ||γ||0 is the size of the support of γ. Problem (P0) can be addressed using two different approximation methods, named "matching pursuit" and "basis pursuit". Donoho and Huo formalised conditions under which the basis pursuit method exactly finds the desired sparse representation.
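A sketch of the matching-pursuit idea (a greedy loop, not the formal statement from any particular paper): repeatedly pick the dictionary column most correlated with the residual and peel off its contribution.

```python
import numpy as np
from scipy.linalg import hadamard

def matching_pursuit(D, S, n_iter=20, tol=1e-10):
    """Greedy sketch of matching pursuit; D must have unit-norm columns."""
    gamma = np.zeros(D.shape[1])
    r = np.array(S, dtype=float)      # residual
    for _ in range(n_iter):
        corr = D.T @ r                # correlate every atom with the residual
        k = np.argmax(np.abs(corr))
        if abs(corr[k]) < tol:
            break                     # residual (numerically) explained
        gamma[k] += corr[k]
        r = r - corr[k] * D[:, k]
    return gamma

N = 4
D = np.hstack([np.eye(N), hadamard(N) / np.sqrt(N)])
gamma = matching_pursuit(D, [1.0, 0.0, 0.0, 0.0])  # finds the 1-sparse answer
```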
(P1) Minimise ||γ||1 subject to S = [Φ Ψ] γ

When the representation is sufficiently sparse, it is the unique solution to both (P0) and (P1).
where ||γ||1 = Σ_{i=1}^{N} |γi|.