
Additionally, one may view the $\ell_0$/$\ell_1$ equivalence problem geometrically. In particular, given the measurements $y = \Phi x$, we have an $(N-M)$-dimensional hyperplane $H_y = \{x' \in \mathbb{R}^N : y = \Phi x'\} = \mathcal{N}(\Phi) + x$ of feasible signals that could account for the measurements $y$. Supposing the original signal $x$ is $K$-sparse, the $\ell_1$ recovery program will recover the correct solution $x$ if and only if $\|x'\|_1 > \|x\|_1$ for every other signal $x' \in H_y$ on the hyperplane. This happens only if the hyperplane $H_y$ (which passes through $x$) does not “cut into” the $\ell_1$-ball of radius $\|x\|_1$. This $\ell_1$-ball is a polytope, on which $x$ belongs to a $(K-1)$-dimensional “face.” If $\Phi$ is a random matrix with i.i.d. Gaussian entries, then the hyperplane $H_y$ will have random orientation. To answer the question of how $M$ must relate to $K$ in order to ensure reliable recovery, it helps to observe that a randomly generated hyperplane $H$ will have greater chance to slice into the $\ell_1$-ball as $\dim(H) = N - M$ grows (or as $M$ shrinks) or as the dimension $K-1$ of the face on which $x$ lives grows. Such geometric arguments have been made precise by Donoho and Tanner [link], [link], [link] and used to establish a series of sharp bounds on CS recovery.
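
For concreteness, the $\ell_1$ recovery program can be posed as a linear program over auxiliary variables $t_i \ge |x_i|$. The following Python sketch, with purely illustrative problem sizes $N$, $M$, and $K$ (none of which come from the references above), recovers a $K$-sparse signal from random Gaussian measurements using scipy.optimize.linprog:

import numpy as np
from scipy.optimize import linprog

# Illustrative sizes: N-dimensional signal, M measurements, K nonzeros.
rng = np.random.default_rng(0)
N, M, K = 128, 40, 5

# A K-sparse signal and a random Gaussian measurement matrix Phi.
x_true = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
x_true[support] = rng.standard_normal(K)
Phi = rng.standard_normal((M, N))
y = Phi @ x_true

# Basis pursuit as a linear program over (x, t) with |x_i| <= t_i:
#   minimize sum(t)  subject to  Phi x = y,  -t <= x <= t.
c = np.concatenate([np.zeros(N), np.ones(N)])
A_ub = np.block([[np.eye(N), -np.eye(N)],     #  x - t <= 0
                 [-np.eye(N), -np.eye(N)]])   # -x - t <= 0
b_ub = np.zeros(2 * N)
A_eq = np.hstack([Phi, np.zeros((M, N))])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * N + [(0, None)] * N)
x_hat = res.x[:N]
print("recovery error:", np.linalg.norm(x_hat - x_true))

With these sizes, the minimizer of the $\ell_1$ norm over the feasible hyperplane typically coincides with the true sparse signal, as the geometric argument above predicts.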

Connections with dimensionality reduction

We have also identified [link] a fundamental connection between CS and the JL lemma. In order to make this connection, we considered the Restricted Isometry Property (RIP), which has been identified as a key property of the CS projection operator $\Phi$ to ensure stable signal recovery. We say $\Phi$ has RIP of order $K$ if for every $K$-sparse signal $x$,

$$(1-\epsilon)\sqrt{\frac{M}{N}} \;\le\; \frac{\|\Phi x\|_2}{\|x\|_2} \;\le\; (1+\epsilon)\sqrt{\frac{M}{N}}.$$
A random $M \times N$ matrix with i.i.d. Gaussian entries can be shown to have this property with high probability if $M = O(K \log(N/K))$.
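
The $\sqrt{M/N}$ scaling is easy to observe numerically. The following sketch draws a matrix $\Phi$ with i.i.d. $\mathcal{N}(0, 1/N)$ entries (chosen so the expected norm ratio matches the normalization above) and checks, by Monte Carlo over randomly drawn $K$-sparse signals, that $\|\Phi x\|_2 / \|x\|_2$ concentrates near $\sqrt{M/N}$. Note this samples random sparse signals rather than verifying the bound over all of them; the sizes are illustrative.

import numpy as np

rng = np.random.default_rng(1)
N, M, K, trials = 512, 128, 8, 2000

# Entries ~ N(0, 1/N) so that E||Phi x||^2 = (M/N) ||x||^2,
# matching the sqrt(M/N) normalization in the RIP statement above.
Phi = rng.normal(scale=1 / np.sqrt(N), size=(M, N))

ratios = []
for _ in range(trials):
    x = np.zeros(N)
    support = rng.choice(N, size=K, replace=False)
    x[support] = rng.standard_normal(K)
    ratios.append(np.linalg.norm(Phi @ x) / np.linalg.norm(x))

ratios = np.array(ratios)
print("target sqrt(M/N): %.3f" % np.sqrt(M / N))
print("observed ratios: min %.3f  mean %.3f  max %.3f"
      % (ratios.min(), ratios.mean(), ratios.max()))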

While the JL lemma concerns pairwise distances within a finite cloud of points, the RIP concerns isometric embedding of an infinite number of points (comprising a union of $K$-dimensional subspaces in $\mathbb{R}^N$). However, the RIP can in fact be derived by constructing an effective sampling of $K$-sparse signals in $\mathbb{R}^N$, using the JL lemma to ensure isometric embeddings for each of these points, and then arguing that the RIP must hold true for all $K$-sparse signals. (See [link] for the full details.)
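
The finite-point-cloud statement of the JL lemma itself is simple to check empirically. The sketch below, with illustrative sizes and a $1/\sqrt{M}$ scaling chosen so that projected distances concentrate around their original values, projects a random point cloud and reports the worst-case distortion of pairwise distances.

import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(2)
N, M, n_points = 1000, 100, 50   # illustrative sizes

# A finite cloud of points in R^N.
X = rng.standard_normal((n_points, N))

# Random projection scaled so E||Phi u||^2 = ||u||^2, giving a
# JL-style embedding of the point cloud into R^M.
Phi = rng.normal(scale=1 / np.sqrt(M), size=(M, N))
Y = X @ Phi.T

# Compare all pairwise distances before and after projection.
distortion = pdist(Y) / pdist(X)
print("distance ratios: min %.3f  max %.3f"
      % (distortion.min(), distortion.max()))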

Stable embeddings of manifolds

Finally, we have shown that the JL lemma can also lead to extensions of CS to other concise signal models. In particular, while conventional CS theory concerns sparse signal models, it is also possible to consider manifold-based signal models. Just as random projections can preserve the low-dimensional geometry (the union of subspaces) that corresponds to a sparse signal family, random projections can also guarantee a stable embedding of a low-dimensional signal manifold. We have the following result, which states that an RIP-like property holds for families of manifold-modeled signals.

Theorem

Let $\mathcal{M}$ be a compact $K$-dimensional Riemannian submanifold of $\mathbb{R}^N$ having condition number $1/\tau$, volume $V$, and geodesic covering regularity $R$. Fix $0 < \epsilon < 1$ and $0 < \rho < 1$. Let $\Phi$ be a random $M \times N$ orthoprojector with

$$M = O\!\left(\frac{K \log(N V R \tau^{-1} \epsilon^{-1}) \log(1/\rho)}{\epsilon^2}\right).$$
If $M \le N$, then with probability at least $1 - \rho$ the following statement holds: for every pair of points $x_1, x_2 \in \mathcal{M}$,
$$(1-\epsilon)\sqrt{\frac{M}{N}} \;\le\; \frac{\|\Phi x_1 - \Phi x_2\|_2}{\|x_1 - x_2\|_2} \;\le\; (1+\epsilon)\sqrt{\frac{M}{N}}.$$

The proof of this theorem appears in [link] and again involves the JL lemma. Due to the limited complexity of a manifold model, it is possible to adequately characterize the geometry using a sufficiently fine sampling of points drawn from the manifold and its tangent spaces. In essence, manifolds with higher volume or with greater curvature have more complexity and require a more dense covering for application of the JL lemma; this leads to an increased number of measurements. The theorem also indicates that the requisite number of measurements depends on the geodesic covering regularity of the manifold, a minor technical concept which is also discussed in [link].
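
As an illustrative numerical check of the theorem (a sketch under simplifying assumptions, not the covering construction used in the proof), the following Python code samples points from a simple compact manifold, a circle embedded in $\mathbb{R}^N$, applies a random orthoprojector, and verifies that all pairwise distance ratios fall near $\sqrt{M/N}$. The sizes are illustrative.

import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(3)
N, M, n_points = 400, 40, 200    # illustrative sizes

# Points on a simple compact 1-D manifold: a circle embedded in R^N.
t = rng.uniform(0, 2 * np.pi, n_points)
X = np.zeros((n_points, N))
X[:, 0], X[:, 1] = np.cos(t), np.sin(t)

# Random M x N orthoprojector: M orthonormal rows via QR factorization.
Q, _ = np.linalg.qr(rng.standard_normal((N, M)))
Phi = Q.T

# All pairwise distance ratios should fall near sqrt(M/N).
ratios = pdist(X @ Phi.T) / pdist(X)
print("target sqrt(M/N): %.3f" % np.sqrt(M / N))
print("pairwise ratios: min %.3f  max %.3f" % (ratios.min(), ratios.max()))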

This theorem establishes that, like the class of $K$-sparse signals, a collection of signals described by a $K$-dimensional manifold $\mathcal{M} \subset \mathbb{R}^N$ can have a stable embedding in an $M$-dimensional measurement space. Moreover, the requisite number of random measurements $M$ is once again linearly proportional to the information level (or number of degrees of freedom) $K$ in the concise model. This has a number of possible implications for manifold-based signal processing. Manifold-modeled signals can be recovered from compressive measurements (using a customized recovery algorithm adapted to the manifold model, in contrast with sparsity-based recovery algorithms) [link], [link]; unknown parameters in parametric models can be estimated from compressive measurements; multi-class estimation/classification problems can be addressed [link] by considering multiple manifold models; and manifold learning algorithms may be efficiently executed by applying them simply to the projection of a manifold-modeled data set to a low-dimensional measurement space [link]. (As an example, [link](d) shows the result of applying the ISOMAP algorithm on a random projection of a data set from $\mathbb{R}^{4096}$ down to $\mathbb{R}^{15}$; the underlying parameterization of the manifold is extracted with little sacrifice in accuracy.) In all of this it is not necessary to adapt the sensing protocol to the model; the only change from sparsity-based CS would be the methods for processing or decoding the measurements. In the future, more sophisticated concise models will likely lead to further improvements in signal understanding from compressive measurements.
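
To give a flavor of that last experiment, the following hypothetical sketch uses a synthetic swiss-roll manifold (a stand-in for the data set referenced above) lifted to high dimension by a random linear map, and runs scikit-learn's Isomap (in place of the original ISOMAP code) directly on random projections of the data:

import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

rng = np.random.default_rng(4)

# A 2-D manifold (swiss roll) in R^3, lifted to R^N by a random linear
# map; a hypothetical stand-in for high-dimensional manifold data.
X3, t = make_swiss_roll(n_samples=800, random_state=0)
N, M = 1024, 15
lift = rng.standard_normal((N, 3))
X = X3 @ lift.T                  # points on a 2-D manifold in R^N

# Random projection down to M dimensions, then manifold learning
# applied directly to the projected data.
Phi = rng.normal(scale=1 / np.sqrt(M), size=(M, N))
emb = Isomap(n_neighbors=10, n_components=2).fit_transform(X @ Phi.T)
print("recovered 2-D embedding shape:", emb.shape)

Because the random projection stably embeds the manifold, the learned two-dimensional parameterization of the projected data closely tracks the one that would be learned from the original high-dimensional points.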





Source:  OpenStax, Concise signal models. OpenStax CNX. Sep 14, 2009 Download for free at http://cnx.org/content/col10635/1.4
