This module establishes a number of results concerning $\ell_1$ minimization algorithms designed for sparse signal recovery from noisy measurements. The results in this module apply both to bounded noise and to Gaussian (or, more generally, sub-Gaussian) noise.

The ability to perfectly reconstruct a sparse signal from noise-free measurements represents a promising result. However, in most real-world systems the measurements are likely to be contaminated by some form of noise. For instance, in order to process data in a computer we must be able to represent it using a finite number of bits, and hence the measurements will typically be subject to quantization error. Moreover, systems which are implemented in physical hardware will be subject to a variety of different types of noise depending on the setting.

Perhaps somewhat surprisingly, one can show that it is possible to modify

$$\hat{x} = \arg\min_z \|z\|_1 \quad \text{subject to} \quad z \in \mathcal{B}(y)$$

to stably recover sparse signals under a variety of common noise models  [link] , [link] , [link] . As might be expected, the restricted isometry property (RIP) is extremely useful in establishing performance guarantees in noise.
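As a concrete illustration, the program above can be handed to any convex optimization toolbox. The sketch below is one way to do this, assuming the CVXPY package (a choice made for this example, not prescribed by the module) and taking $\mathcal{B}(y)$ to be the $\epsilon$-ball $\{z : \|\Phi z - y\|_2 \le \epsilon\}$ used in the bounded-noise theorem below.

# A minimal sketch of the noise-aware l1 program, assuming the CVXPY package.
import cvxpy as cp

def l1_recover(Phi, y, eps):
    """Solve min ||z||_1 subject to ||Phi z - y||_2 <= eps."""
    z = cp.Variable(Phi.shape[1])
    problem = cp.Problem(cp.Minimize(cp.norm1(z)),
                         [cp.norm(Phi @ z - y, 2) <= eps])
    problem.solve()
    return z.value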

In our analysis we will make repeated use of Lemma 1 from "Noise-free signal recovery" , so we repeat it here for convenience.

Suppose that $\Phi$ satisfies the RIP of order $2K$ with $\delta_{2K} < \sqrt{2} - 1$. Let $x, \hat{x} \in \mathbb{R}^N$ be given, and define $h = \hat{x} - x$. Let $\Lambda_0$ denote the index set corresponding to the $K$ entries of $x$ with largest magnitude and $\Lambda_1$ the index set corresponding to the $K$ entries of $h_{\Lambda_0^c}$ with largest magnitude. Set $\Lambda = \Lambda_0 \cup \Lambda_1$. If $\|\hat{x}\|_1 \le \|x\|_1$, then

$$\|h\|_2 \le C_0 \frac{\sigma_K(x)_1}{\sqrt{K}} + C_1 \frac{\left|\langle \Phi h_{\Lambda}, \Phi h \rangle\right|}{\|h_{\Lambda}\|_2},$$

where

$$C_0 = 2\,\frac{1 - (1-\sqrt{2})\,\delta_{2K}}{1 - (1+\sqrt{2})\,\delta_{2K}}, \qquad C_1 = \frac{2}{1 - (1+\sqrt{2})\,\delta_{2K}}.$$

Bounded noise

We first provide a bound on the worst-case performance for uniformly bounded noise, as first investigated in  [link] .

(Theorem 1.2 of [link])

Suppose that $\Phi$ satisfies the RIP of order $2K$ with $\delta_{2K} < \sqrt{2} - 1$ and let $y = \Phi x + e$ where $\|e\|_2 \le \epsilon$. Then when $\mathcal{B}(y) = \{z : \|\Phi z - y\|_2 \le \epsilon\}$, the solution $\hat{x}$ to [link] obeys

$$\|\hat{x} - x\|_2 \le C_0 \frac{\sigma_K(x)_1}{\sqrt{K}} + C_2\,\epsilon,$$

where

$$C_0 = 2\,\frac{1 - (1-\sqrt{2})\,\delta_{2K}}{1 - (1+\sqrt{2})\,\delta_{2K}}, \qquad C_2 = \frac{4\sqrt{1+\delta_{2K}}}{1 - (1+\sqrt{2})\,\delta_{2K}}.$$
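Before turning to the proof, it may help to see how these constants behave numerically. The short script below (a numerical aside, not part of the original argument) evaluates $C_0$ and $C_2$ for a few values of $\delta_{2K}$: both remain modest for small $\delta_{2K}$ and blow up as $\delta_{2K}$ approaches $\sqrt{2} - 1 \approx 0.414$.

# Evaluate the constants C0 and C2 from the theorem for several values of delta_2K.
import numpy as np

def recovery_constants(delta):
    # The theorem requires delta_2K < sqrt(2) - 1 (approx. 0.4142).
    denom = 1 - (1 + np.sqrt(2)) * delta
    C0 = 2 * (1 - (1 - np.sqrt(2)) * delta) / denom
    C2 = 4 * np.sqrt(1 + delta) / denom
    return C0, C2

for delta in [0.0, 0.1, 0.2, 0.3, 0.4]:
    C0, C2 = recovery_constants(delta)
    print(f"delta_2K = {delta:.1f}: C0 = {C0:7.2f}, C2 = {C2:7.2f}")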

We are interested in bounding $\|h\|_2 = \|\hat{x} - x\|_2$. Since $\|e\|_2 \le \epsilon$, $x \in \mathcal{B}(y)$, and therefore we know that $\|\hat{x}\|_1 \le \|x\|_1$. Thus we may apply [link], and it remains to bound $\left|\langle \Phi h_{\Lambda}, \Phi h \rangle\right|$. To do this, we observe that

$$\|\Phi h\|_2 = \|\Phi(\hat{x} - x)\|_2 = \|\Phi \hat{x} - y + y - \Phi x\|_2 \le \|\Phi \hat{x} - y\|_2 + \|y - \Phi x\|_2 \le 2\epsilon,$$

where the last inequality follows since $x, \hat{x} \in \mathcal{B}(y)$. Combining this with the RIP and the Cauchy-Schwarz inequality we obtain

$$\left|\langle \Phi h_{\Lambda}, \Phi h \rangle\right| \le \|\Phi h_{\Lambda}\|_2 \|\Phi h\|_2 \le 2\epsilon\sqrt{1+\delta_{2K}}\,\|h_{\Lambda}\|_2.$$

Thus,

$$\|h\|_2 \le C_0 \frac{\sigma_K(x)_1}{\sqrt{K}} + C_1\, 2\epsilon\sqrt{1+\delta_{2K}} = C_0 \frac{\sigma_K(x)_1}{\sqrt{K}} + C_2\,\epsilon,$$

completing the proof.
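A small numerical experiment makes the theorem concrete. The sketch below uses illustrative dimensions of our own choosing and assumes NumPy and CVXPY: it draws a Gaussian $\Phi$ (which satisfies the RIP of the required order with high probability, though the script does not verify this), measures an exactly $K$-sparse $x$ through noise with $\|e\|_2 = \epsilon$, and solves the program above. Since $\sigma_K(x)_1 = 0$ here, the bound reduces to $\|\hat{x} - x\|_2 \le C_2\,\epsilon$, so the observed error should scale roughly linearly with $\epsilon$.

# Empirical check that the l1 recovery error scales with the noise level epsilon.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
N, M, K = 256, 80, 5                                 # illustrative dimensions

Phi = rng.standard_normal((M, N)) / np.sqrt(M)       # Gaussian matrix: RIP holds w.h.p.
x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)   # exactly K-sparse signal

for eps in [0.01, 0.05, 0.1]:
    e = rng.standard_normal(M)
    e *= eps / np.linalg.norm(e)                     # noise scaled so that ||e||_2 = eps
    y = Phi @ x + e

    z = cp.Variable(N)
    cp.Problem(cp.Minimize(cp.norm1(z)),
               [cp.norm(Phi @ z - y, 2) <= eps]).solve()
    print(f"eps = {eps:.2f}: ||x_hat - x||_2 = {np.linalg.norm(z.value - x):.4f}")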

In order to place this result in context, consider how we would recover a sparse vector $x$ if we happened to already know the $K$ locations of the nonzero coefficients, which we denote by $\Lambda_0$. This is referred to as the oracle estimator. In this case a natural approach is to reconstruct the signal using a simple pseudoinverse:

$$\hat{x}_{\Lambda_0} = \Phi_{\Lambda_0}^{\dagger} y = (\Phi_{\Lambda_0}^T \Phi_{\Lambda_0})^{-1} \Phi_{\Lambda_0}^T y, \qquad \hat{x}_{\Lambda_0^c} = 0.$$

The implicit assumption in [link] is that $\Phi_{\Lambda_0}$ has full column rank (and hence we are considering the case where $\Phi_{\Lambda_0}$ is the $M \times K$ matrix with the columns indexed by $\Lambda_0^c$ removed) so that there is a unique solution to the equation $y = \Phi_{\Lambda_0} x_{\Lambda_0}$. With this choice, the recovery error is given by

$$\|\hat{x} - x\|_2 = \left\|(\Phi_{\Lambda_0}^T \Phi_{\Lambda_0})^{-1} \Phi_{\Lambda_0}^T (\Phi x + e) - x\right\|_2 = \left\|(\Phi_{\Lambda_0}^T \Phi_{\Lambda_0})^{-1} \Phi_{\Lambda_0}^T e\right\|_2.$$
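In code, the oracle estimator is just a least-squares fit restricted to the known support. A minimal NumPy sketch, with oracle_estimate as a hypothetical helper name and the support $\Lambda_0$ assumed to be handed to us, might look as follows; it serves as a baseline against which the error of the $\ell_1$ program can be compared.

# Oracle estimator: least-squares reconstruction on the known support Lambda_0.
import numpy as np

def oracle_estimate(Phi, y, support):
    """Return x_hat with x_hat[support] = pinv(Phi[:, support]) @ y and zeros elsewhere."""
    x_hat = np.zeros(Phi.shape[1])
    Phi_sub = Phi[:, support]                        # the M x K matrix Phi_Lambda0
    coeffs, *_ = np.linalg.lstsq(Phi_sub, y, rcond=None)
    x_hat[support] = coeffs
    return x_hat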

Source:  OpenStax, An introduction to compressive sensing. OpenStax CNX. Apr 02, 2011 Download for free at http://legacy.cnx.org/content/col11133/1.5