This module introduces the count-min and count-median sketches as representative examples of combinatorial algorithms for sparse recovery.

In addition to convex optimization and greedy pursuit approaches, there is another important class of sparse recovery algorithms that we will refer to as combinatorial algorithms. These algorithms, mostly developed by the theoretical computer science community, in many cases pre-date the compressive sensing literature but are highly relevant to the sparse signal recovery problem.

Setup

The oldest combinatorial algorithms were developed in the context of group testing [link], [link], [link]. In the group testing problem, we suppose that there are N total items, of which an unknown subset of K elements are anomalous and need to be identified. For example, we might wish to identify defective products in an industrial setting, or identify a subset of diseased tissue samples in a medical context. In both of these cases the vector x indicates which elements are anomalous, i.e., x_i ≠ 0 for the K anomalous elements and x_i = 0 otherwise. Our goal is to design a collection of tests that allow us to identify the support (and possibly the values of the nonzeros) of x while also minimizing the number of tests performed. In the simplest practical setting these tests are represented by a binary matrix Φ whose entries φ_ij are equal to 1 if and only if the j-th item is used in the i-th test. If the output of the test is linear with respect to the inputs, then the problem of recovering the vector x is essentially the same as the standard sparse recovery problem.
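To make this measurement model concrete, the short Python example below simulates a group-testing experiment under the linear-output assumption described above. The sizes N, K, and M, the random binary test design, and the 10% inclusion probability are illustrative choices, not part of the original discussion.

```python
import numpy as np

N, K, M = 100, 3, 20                         # items, anomalous items, tests
rng = np.random.default_rng(1)

x = np.zeros(N)
x[rng.choice(N, size=K, replace=False)] = 1  # mark the K anomalous items

# phi_ij = 1 if and only if item j is included in test i (a random design here)
Phi = (rng.random((M, N)) < 0.1).astype(int)

# Linear test outputs: each y_i counts how many anomalous items test i contains.
y = Phi @ x

print(y)  # recovering the support of x from (Phi, y) is the sparse recovery problem
```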

Another application area in which combinatorial algorithms have proven useful is computation on data streams [link], [link]. Suppose that x_i represents the number of packets passing through a network router with destination i. Simply storing the vector x is typically infeasible, since the total number of possible destinations (represented by a 32-bit IP address) is N = 2^32. Thus, instead of attempting to store x directly, one can store y = Φx, where Φ is an M × N matrix with M ≪ N. In this context the vector y is often called a sketch. Note that in this problem y is computed in a different manner than in the compressive sensing context. Specifically, in the network traffic example we never observe x_i directly; rather, we observe increments to x_i (when a packet with destination i passes through the router). Thus we construct y iteratively by adding the i-th column of Φ to y each time we observe an increment to x_i, which we can do because y = Φx is linear. When the network traffic is dominated by traffic to a small number of destinations, the vector x is compressible, and the problem of recovering x from the sketch Φx is again essentially the same as the sparse recovery problem.
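The incremental construction of the sketch can be illustrated in a few lines of Python. The Gaussian choice of Φ, the problem sizes, and the toy traffic pattern below are assumptions made for illustration; the only property the example actually relies on is the linearity of y = Φx.

```python
import numpy as np

N = 10_000   # number of possible destinations (far smaller than 2^32, for illustration)
M = 200      # sketch length, M << N
rng = np.random.default_rng(0)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # assumed random Gaussian sketch matrix

y = np.zeros(M)   # the sketch, built incrementally as packets arrive
x = np.zeros(N)   # ground-truth counts, kept here only to verify the result

# Toy traffic stream dominated by a few destinations.
for i in rng.choice([3, 17, 4242], size=5000):
    y += Phi[:, i]   # streaming update: y <- y + (i-th column of Phi)
    x[i] += 1        # a real router never stores this vector

# Because y = Phi x is linear, the incrementally built sketch equals Phi @ x.
assert np.allclose(y, Phi @ x)
```

Because each update touches only one column of Φ, the router never needs to hold x itself, only the M-dimensional sketch y.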

Several combinatorial algorithms for sparse recovery have been developed in the literature. A non-exhaustive list includes Random Fourier Sampling [link], HHS Pursuit [link], and Sparse Sequential Matching Pursuit [link]. We do not provide a full discussion of each of these algorithms; instead, we describe two simple methods that highlight the flavor of combinatorial sparse recovery: count-min and count-median.
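As a preview, here is a minimal Python sketch of a count-min data structure, assuming the standard construction in which each of d rows of w counters is indexed by an independent hash function: updates add to one counter per row, and queries return the minimum over rows. The class name, default parameters, and tuple-based hashing are illustrative choices rather than the exact construction from the references. Roughly speaking, the count-median sketch is similar but reports the median of the per-row estimates instead of the minimum, which lets it handle signals with negative entries at the cost of a two-sided error guarantee.

```python
import random

class CountMinSketch:
    def __init__(self, width=1024, depth=4, seed=0):
        rng = random.Random(seed)
        self.w = width
        self.d = depth
        # One independent hash seed per row of counters.
        self.seeds = [rng.getrandbits(32) for _ in range(depth)]
        self.counters = [[0] * width for _ in range(depth)]

    def _index(self, row, key):
        # Hash the (seed, key) pair into one of the w buckets in this row.
        return hash((self.seeds[row], key)) % self.w

    def update(self, key, count=1):
        # Add `count` to one counter in every row (e.g., a packet to destination `key`).
        for row in range(self.d):
            self.counters[row][self._index(row, key)] += count

    def query(self, key):
        # For nonnegative updates each row can only overestimate the true count,
        # so the minimum over rows is the count-min estimate.
        return min(self.counters[row][self._index(row, key)] for row in range(self.d))

# Example: a stream dominated by a few heavy destinations.
cms = CountMinSketch()
stream = [7] * 1000 + [42] * 500 + list(range(100))
for destination in stream:
    cms.update(destination)
print(cms.query(7), cms.query(42))  # close to 1000 and 500, never underestimates
```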

Source:  OpenStax, An introduction to compressive sensing. OpenStax CNX. Apr 02, 2011 Download for free at http://legacy.cnx.org/content/col11133/1.5