This module introduces the spark and the null space property, two common conditions related to the null space of a measurement matrix that ensure the success of sparse recovery algorithms. Furthermore, the null space property is shown to be a necessary condition for instance-optimal or uniform recovery guarantees.

When establishing conditions on $\Phi$ in the context of designing a sensing matrix, a natural place to begin is the null space of $\Phi$, denoted

$$\mathcal{N}(\Phi) = \{ z : \Phi z = 0 \}.$$

If we wish to be able to recover all sparse signals $x$ from the measurements $\Phi x$, then it is immediately clear that for any pair of distinct vectors $x, x' \in \Sigma_K = \{x : \|x\|_0 \le K\}$, we must have $\Phi x \ne \Phi x'$, since otherwise it would be impossible to distinguish $x$ from $x'$ based solely on the measurements $y$. More formally, by observing that if $\Phi x = \Phi x'$ then $\Phi(x - x') = 0$ with $x - x' \in \Sigma_{2K}$, we see that $\Phi$ uniquely represents all $x \in \Sigma_K$ if and only if $\mathcal{N}(\Phi)$ contains no vectors in $\Sigma_{2K}$. There are many equivalent ways of characterizing this property; one of the most common is known as the spark [link].

The spark

The spark of a given matrix $\Phi$ is the smallest number of columns of $\Phi$ that are linearly dependent.

This definition allows us to pose the following straightforward guarantee.

(corollary 1 of [link] )

For any vector $y \in \mathbb{R}^M$, there exists at most one signal $x \in \Sigma_K$ such that $y = \Phi x$ if and only if $\mathrm{spark}(\Phi) > 2K$.

We first assume that, for any $y \in \mathbb{R}^M$, there exists at most one signal $x \in \Sigma_K$ such that $y = \Phi x$. Now suppose for the sake of contradiction that $\mathrm{spark}(\Phi) \le 2K$. This means that there exists some set of at most $2K$ columns that are linearly dependent, which in turn implies that there exists a nonzero $h \in \mathcal{N}(\Phi)$ such that $h \in \Sigma_{2K}$. In this case, since $h \in \Sigma_{2K}$ we can write $h = x - x'$, where $x, x' \in \Sigma_K$ and $x \ne x'$. Thus, since $h \in \mathcal{N}(\Phi)$ we have that $\Phi(x - x') = 0$ and hence $\Phi x = \Phi x'$. But this contradicts our assumption that there exists at most one signal $x \in \Sigma_K$ such that $y = \Phi x$. Therefore, we must have that $\mathrm{spark}(\Phi) > 2K$.

Now suppose that $\mathrm{spark}(\Phi) > 2K$. Assume that for some $y$ there exist $x, x' \in \Sigma_K$ such that $y = \Phi x = \Phi x'$. We therefore have that $\Phi(x - x') = 0$. Letting $h = x - x'$, we can write this as $\Phi h = 0$. Since $\mathrm{spark}(\Phi) > 2K$, all sets of up to $2K$ columns of $\Phi$ are linearly independent, and since $h \in \Sigma_{2K}$ this implies $h = 0$. This in turn implies $x = x'$, proving the theorem.

It is easy to see that $\mathrm{spark}(\Phi) \in [2, M+1]$. Therefore, [link] yields the requirement $M \ge 2K$.
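To make the spark concrete, the sketch below computes it by brute force for a tiny matrix. This is an illustrative example only (the function name spark and the use of NumPy are our own choices, not part of the referenced text), and the search is exponential in $N$, so it is practical only for very small matrices.

import itertools
import numpy as np

def spark(Phi, tol=1e-10):
    """Smallest number of linearly dependent columns of Phi (brute force)."""
    M, N = Phi.shape
    for k in range(1, N + 1):
        for cols in itertools.combinations(range(N), k):
            # A set of k columns is linearly dependent iff the corresponding
            # submatrix has rank strictly less than k.
            if np.linalg.matrix_rank(Phi[:, cols], tol=tol) < k:
                return k
    return N + 1  # all columns are linearly independent

# Example: M = 4, N = 6. A Gaussian matrix generically has spark M + 1 = 5,
# so by the theorem above it uniquely represents all K-sparse signals
# whenever 2K < 5, i.e., K <= 2.
rng = np.random.default_rng(0)
Phi = rng.standard_normal((4, 6))
print(spark(Phi))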

The null space property

When dealing with exactly sparse vectors, the spark provides a complete characterization of when sparse recovery is possible. However, when dealing with approximately sparse signals we must introduce somewhat more restrictive conditions on the null space of $\Phi$ [link]. Roughly speaking, in addition to sparse vectors, we must also ensure that $\mathcal{N}(\Phi)$ does not contain any vectors that are too compressible. In order to state the formal definition we introduce the following notation, which will prove useful throughout much of this course. Suppose that $\Lambda \subset \{1, 2, \dots, N\}$ is a subset of indices and let $\Lambda^c = \{1, 2, \dots, N\} \setminus \Lambda$. By $x_\Lambda$ we typically mean the length-$N$ vector obtained by setting the entries of $x$ indexed by $\Lambda^c$ to zero. Similarly, by $\Phi_\Lambda$ we typically mean the $M \times N$ matrix obtained by setting the columns of $\Phi$ indexed by $\Lambda^c$ to zero. We note that this notation will occasionally be abused to refer to the length-$|\Lambda|$ vector obtained by keeping only the entries corresponding to $\Lambda$, or the $M \times |\Lambda|$ matrix obtained by keeping only the columns corresponding to $\Lambda$. The usage should be clear from context, but typically there is no substantive difference between the two.

A matrix $\Phi$ satisfies the null space property (NSP) of order $K$ if there exists a constant $C > 0$ such that

$$\|h_\Lambda\|_2 \le C \frac{\|h_{\Lambda^c}\|_1}{\sqrt{K}}$$

holds for all $h \in \mathcal{N}(\Phi)$ and for all $\Lambda$ such that $|\Lambda| \le K$.

The NSP quantifies the notion that vectors in the null space of $\Phi$ should not be too concentrated on a small subset of indices. For example, if a vector $h$ is exactly $K$-sparse, then there exists a $\Lambda$ such that $\|h_{\Lambda^c}\|_1 = 0$ and hence [link] implies that $h_\Lambda = 0$ as well. Thus, if a matrix $\Phi$ satisfies the NSP then the only $K$-sparse vector in $\mathcal{N}(\Phi)$ is $h = 0$.
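As a quick numerical illustration of this concentration idea, the sketch below (NumPy assumed; the function name nsp_ratio is our own) computes the smallest constant $C$ for which the order-$K$ NSP inequality holds for a single vector $h$, using the fact that the worst-case $\Lambda$ of size $K$ is the set of the $K$ largest-magnitude entries of $h$. Note that this is only a spot check on one vector; certifying the NSP for $\Phi$ requires the bound to hold for every $h \in \mathcal{N}(\Phi)$.

import numpy as np

def nsp_ratio(h, K):
    # Smallest C such that ||h_Lambda||_2 <= C ||h_{Lambda^c}||_1 / sqrt(K)
    # for this particular h; the worst case over |Lambda| <= K is attained
    # by taking Lambda to index the K largest-magnitude entries of h.
    idx = np.argsort(np.abs(h))[::-1]
    Lam, Lam_c = idx[:K], idx[K:]
    num = np.linalg.norm(h[Lam], 2)
    den = np.linalg.norm(h[Lam_c], 1) / np.sqrt(K)
    if den == 0:
        return 0.0 if num == 0 else np.inf
    return num / den

# An exactly K-sparse h != 0 gives an infinite ratio, so such a vector cannot
# lie in the null space of any matrix satisfying the NSP of order K.
h = np.zeros(10)
h[[2, 7]] = [1.0, -3.0]
print(nsp_ratio(h, K=2))   # inf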

To fully illustrate the implications of the NSP in the context of sparse recovery, we now briefly discuss how we will measure the performance of sparse recovery algorithms when dealing with general non-sparse $x$. Towards this end, let $\Delta : \mathbb{R}^M \rightarrow \mathbb{R}^N$ represent our specific recovery method. We will focus primarily on guarantees of the form

$$\|\Delta(\Phi x) - x\|_2 \le C \frac{\sigma_K(x)_1}{\sqrt{K}}$$

for all $x$, where we recall that

$$\sigma_K(x)_p = \min_{\hat{x} \in \Sigma_K} \|x - \hat{x}\|_p.$$

This guarantees exact recovery of all possible $K$-sparse signals, but also ensures a degree of robustness to non-sparse signals that directly depends on how well the signals are approximated by $K$-sparse vectors. Such guarantees are called instance-optimal since they guarantee optimal performance for each instance of $x$ [link]. This distinguishes them from guarantees that only hold for some subset of possible signals, such as sparse or compressible signals; the quality of the guarantee adapts to the particular choice of $x$. These are also commonly referred to as uniform guarantees since they hold uniformly for all $x$.

Our choice of norms in [link] is somewhat arbitrary. We could easily measure the reconstruction error using other $\ell_p$ norms. The choice of $p$, however, will limit what kinds of guarantees are possible, and will also potentially lead to alternative formulations of the NSP. See, for instance, [link]. Moreover, the form of the right-hand side of [link] might seem somewhat unusual in that we measure the approximation error as $\sigma_K(x)_1/\sqrt{K}$ rather than simply something like $\sigma_K(x)_2$. However, we will see later in this course that such a guarantee is actually not possible without taking a prohibitively large number of measurements, and that [link] represents the best possible guarantee we can hope to obtain (see "Instance-optimal guarantees revisited").
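Since $\sigma_K(x)_p$ is simply the $\ell_p$ norm of what remains after keeping the $K$ largest-magnitude entries of $x$, it is straightforward to evaluate numerically. The sketch below (NumPy assumed; the function name sigma_K is our own) illustrates this.

import numpy as np

def sigma_K(x, K, p=1):
    # Best K-term approximation error: the minimizer keeps the K largest
    # magnitude entries of x, so the error is the l_p norm of the rest.
    idx = np.argsort(np.abs(x))[::-1]
    tail = x[idx[K:]]
    return np.linalg.norm(tail, ord=p)

x = np.array([5.0, -0.1, 0.05, 3.0, 0.02])
print(sigma_K(x, K=2, p=1))   # 0.17 = 0.1 + 0.05 + 0.02

A compressible signal, whose sorted entries decay rapidly, yields a small $\sigma_K(x)_1$, and the guarantee above then bounds the recovery error by $C\,\sigma_K(x)_1/\sqrt{K}$.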

Later in this course, we will show that the NSP of order $2K$ is sufficient to establish a guarantee of the form [link] for a practical recovery algorithm (see "Noise-free signal recovery"). Moreover, the following adaptation of a theorem in [link] demonstrates that if there exists any recovery algorithm satisfying [link], then $\Phi$ must necessarily satisfy the NSP of order $2K$.

(theorem 3.2 of [link] )

Let $\Phi : \mathbb{R}^N \rightarrow \mathbb{R}^M$ denote a sensing matrix and $\Delta : \mathbb{R}^M \rightarrow \mathbb{R}^N$ denote an arbitrary recovery algorithm. If the pair $(\Phi, \Delta)$ satisfies [link] then $\Phi$ satisfies the NSP of order $2K$.

Suppose $h \in \mathcal{N}(\Phi)$ and let $\Lambda$ be the indices corresponding to the $2K$ largest entries of $h$. We next split $\Lambda$ into $\Lambda_0$ and $\Lambda_1$, where $|\Lambda_0| = |\Lambda_1| = K$. Set $x = h_{\Lambda_1} + h_{\Lambda^c}$ and $x' = -h_{\Lambda_0}$, so that $h = x - x'$. Since by construction $x' \in \Sigma_K$, we can apply [link] to obtain $x' = \Delta(\Phi x')$. Moreover, since $h \in \mathcal{N}(\Phi)$, we have

$$\Phi h = \Phi(x - x') = 0,$$

so that $\Phi x' = \Phi x$. Thus, $x' = \Delta(\Phi x)$. Finally, we have that

$$\|h_\Lambda\|_2 \le \|h\|_2 = \|x - x'\|_2 = \|x - \Delta(\Phi x)\|_2 \le C \frac{\sigma_K(x)_1}{\sqrt{K}} = \sqrt{2}\, C \frac{\|h_{\Lambda^c}\|_1}{\sqrt{2K}},$$

where the last inequality follows from [link] and the final equality follows from the fact that $\sigma_K(x)_1 = \|h_{\Lambda^c}\|_1$, since the entries of $x$ indexed by $\Lambda_1$ are at least as large in magnitude as those indexed by $\Lambda^c$. Since $|\Lambda| \le 2K$, this is precisely the NSP of order $2K$ with constant $\sqrt{2}C$.
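The splitting used in this proof is easy to verify numerically. The sketch below (NumPy assumed; all variable names are our own) draws a small Gaussian $\Phi$, picks a vector $h$ in its null space, forms $x = h_{\Lambda_1} + h_{\Lambda^c}$ and $x' = -h_{\Lambda_0}$, and checks that $h = x - x'$, that $x'$ is $K$-sparse, and that $\Phi x = \Phi x'$.

import numpy as np

rng = np.random.default_rng(1)
M, N, K = 4, 8, 2
Phi = rng.standard_normal((M, N))

# Pick a vector in the null space of Phi: a right singular vector
# associated with a zero singular value.
_, _, Vt = np.linalg.svd(Phi)
h = Vt[-1]

# Lambda = indices of the 2K largest entries of h, split into Lambda_0, Lambda_1.
idx = np.argsort(np.abs(h))[::-1]
Lam0, Lam1, Lam_c = idx[:K], idx[K:2*K], idx[2*K:]

x = np.zeros(N)
x[Lam1] = h[Lam1]
x[Lam_c] = h[Lam_c]
x_prime = np.zeros(N)
x_prime[Lam0] = -h[Lam0]

print(np.allclose(h, x - x_prime))            # True: h = x - x'
print(np.count_nonzero(x_prime) <= K)         # True: x' is K-sparse
print(np.allclose(Phi @ x, Phi @ x_prime))    # True: identical measurements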

Source: OpenStax, An Introduction to Compressive Sensing. OpenStax CNX, Apr 02, 2011. http://legacy.cnx.org/content/col11133/1.5