
A typical problem arising in signal processing is to minimize $x^T A x$ subject to the linear constraint $c^T x = 1$. In most problems, $A$ is a positive definite, symmetric matrix (a correlation matrix). Clearly, the minimum of the objective function occurs at $x = 0$, but this solution cannot satisfy the constraint. The constraint $g(x) = c^T x - 1$ is a scalar-valued one; hence the theorem of Lagrange applies, as there are no multiple components in the constraint forcing a check of linear independence. The Lagrangian is
$$L(x, \lambda) = x^T A x + \lambda (c^T x - 1)$$
Its gradient is $2 A x + \lambda c$, with the solution $x = -\lambda A^{-1} c / 2$. To find the value of the Lagrange multiplier, this solution must satisfy the constraint. Imposing the constraint, $-\frac{\lambda}{2} c^T A^{-1} c = 1$; thus, $\lambda = -2 / (c^T A^{-1} c)$, and the total solution is
$$x = \frac{A^{-1} c}{c^T A^{-1} c}$$
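
This closed-form solution is easy to check numerically. The sketch below is a minimal verification in NumPy; the positive definite $A$ and constraint vector $c$ are invented here purely for illustration. It confirms that the candidate solution satisfies the constraint and the stationarity condition $2 A x + \lambda c = 0$.

```python
import numpy as np

# Check of x = A^{-1} c / (c^T A^{-1} c). The positive definite A and the
# constraint vector c are invented here purely for illustration.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B @ B.T + 4 * np.eye(4)           # symmetric, positive definite
c = rng.standard_normal(4)

Ainv_c = np.linalg.solve(A, c)        # A^{-1} c without forming the inverse
x = Ainv_c / (c @ Ainv_c)             # closed-form constrained minimizer

lam = -2.0 / (c @ Ainv_c)             # the Lagrange multiplier from the text
print(np.isclose(c @ x, 1.0))                 # True: constraint satisfied
print(np.allclose(2 * A @ x + lam * c, 0.0))  # True: gradient vanishes
```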

When the independent variable is complex-valued, the Lagrange multiplier technique can be used if care is taken to make the Lagrangian real. If it is not real, we cannot use the theorem that permits computation of stationary points by computing the gradient with respect to $\bar{z}$ alone. The Lagrangian may not be real-valued even when the constraint is real. Once the Lagrangian is ensured to be real, the gradient of the Lagrangian with respect to the conjugate of the independent vector can be evaluated, and the minimization procedure remains as before.

Consider a slight variation of the previous example: let the vector $z$ be complex, so that the objective function is $z^H A z$, where $A$ is a positive definite, Hermitian matrix, and let the constraint be linear but vector-valued ($C^H z = c$). The Lagrangian is formed from the objective function and the real part of the usual constraint term:
$$L(z, \lambda) = z^H A z + \lambda^H (C^H z - c) + (C^H z - c)^H \lambda$$
For the Lagrange multiplier theorem to hold, the gradients of each component of the constraint must be linearly independent. As these gradients are the columns of $C$, their mutual linear independence means that each constraint vector must not be expressible as a linear combination of the others. We shall assume this portion of the problem statement true. Evaluating the gradient with respect to $\bar{z}$, keeping $z$ a constant, and setting the result equal to zero yields
$$A z + C \lambda = 0$$
The solution is $z = -A^{-1} C \lambda$. Applying the constraint, we find that $-C^H A^{-1} C \lambda = c$. Solving for the Lagrange multiplier and substituting the result into the solution, we find that the solution to the constrained optimization problem is
$$z = A^{-1} C (C^H A^{-1} C)^{-1} c$$
The indicated matrix inverses always exist: $A$ is assumed invertible, and $C^H A^{-1} C$ is invertible because of the linear independence of the constraints.
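
A similar numerical sketch checks the complex, vector-constrained solution. The Hermitian $A$, constraint matrix $C$, and vector $c$ below are invented for illustration; the final lines verify feasibility and that any feasible perturbation (a vector $v$ with $C^H v = 0$) only increases the objective.

```python
import numpy as np

# Numerical check of z = A^{-1} C (C^H A^{-1} C)^{-1} c. The Hermitian A,
# constraint matrix C, and vector c are invented for illustration.
rng = np.random.default_rng(1)
n, k = 5, 2
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = B @ B.conj().T + n * np.eye(n)     # Hermitian, positive definite
C = rng.standard_normal((n, k)) + 1j * rng.standard_normal((n, k))
c = rng.standard_normal(k) + 1j * rng.standard_normal(k)

Ainv_C = np.linalg.solve(A, C)                         # A^{-1} C
z = Ainv_C @ np.linalg.solve(C.conj().T @ Ainv_C, c)   # closed-form solution

print(np.allclose(C.conj().T @ z, c))                  # True: constraint holds

# Any feasible point is z + v with C^H v = 0; the objective only grows.
v = rng.standard_normal(n) + 1j * rng.standard_normal(n)
v -= C @ np.linalg.solve(C.conj().T @ C, C.conj().T @ v)  # project out C
obj = lambda w: (w.conj() @ A @ w).real
print(obj(z) < obj(z + v))                             # True: z is optimal
```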

Inequality constraints

When some of the constraints are inequalities, the Lagrange multiplier technique can be used, but the solution must be checked carefully in its details. But first, the optimization problem with equality and inequality constraints is formulated as
$$\min_x f(x) \quad \text{subject to} \quad g(x) = 0 \ \text{ and } \ h(x) \leq 0$$
As before, $f$ is the scalar-valued objective function and $g$ is the equality constraint function; $h$ is the inequality constraint function.

The key result which can be used to find the analytic solution to this problem is to first form the Lagrangian in the usual way as $L(x, \lambda, \mu) = f(x) + \lambda^T g(x) + \mu^T h(x)$. The following theorem is the general statement of the Lagrange multiplier technique for constrained optimization problems.

Let $x^\circ$ be a local minimum for the constrained optimization problem. If the gradients of $g$'s components and the gradients of those components of $h$ for which $h_i(x^\circ) = 0$ are linearly independent, then
$$\nabla_x L(x^\circ, \lambda, \mu) = 0$$
where $\mu \geq 0$ and $\mu_i h_i(x^\circ) = 0$.

The portion of this result dealing with the inequality constraint differs substantially from that concerned with the equality constraint. Either a component of the constraint equals its maximum value (zero in this case) and the corresponding component of its Lagrange multiplier is non-negative (and is usually positive), or a component is strictly below its bound ($h_i(x^\circ) < 0$) and the corresponding component of the Lagrange multiplier is zero. This latter result means that some components of the inequality constraint are not as stringent as others, and these lax ones do not affect the solution.

The rationale behind this theorem is a technique for converting the inequality constraint into an equality constraint: $h_i(x) \leq 0$ is equivalent to $h_i(x) + s_i^2 = 0$. Since the new term $s_i^2$, built from the so-called slack variable $s_i$, is non-negative, the constraint must be non-positive. With the inclusion of slack variables, the equality constraint theorem can be used, and the above theorem results. To prove the theorem, not only does the gradient with respect to $x$ need to be considered, but also the gradient with respect to the vector $s$ of slack variables. The $i$th component of the gradient of the Lagrangian with respect to $s$ at the stationary point is $2 \mu_i s_i = 0$. If, in solving the optimization problem, $s_i = 0$, the inequality constraint was in reality an equality constraint, and that component of the constraint behaves accordingly. As $s_i^2 = -h_i(x)$, $s_i = 0$ implies that that component of the inequality constraint must equal zero. On the other hand, if $s_i \neq 0$, the corresponding Lagrange multiplier must be zero.
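
The condition $\mu_i h_i(x^\circ) = 0$ can be seen in action on a toy problem (of my own construction, not from the text). The sketch below uses scipy's SLSQP solver, whose 'ineq' convention is $\text{fun}(x) \geq 0$, so each constraint $h(x) \leq 0$ is passed as $-h$. One case has an active constraint ($\mu > 0$), the other a slack one ($\mu = 0$).

```python
import numpy as np
from scipy.optimize import minimize

f = lambda x: (x[0] - 2.0) ** 2   # unconstrained minimum at x = 2

# Case 1: h(x) = x - 1 <= 0 is active at the solution (x* = 1).
res1 = minimize(f, x0=[0.0], method="SLSQP",
                constraints=[{"type": "ineq", "fun": lambda x: -(x[0] - 1.0)}])

# Case 2: h(x) = x - 3 <= 0 is slack (x* = 2 already satisfies it).
res2 = minimize(f, x0=[0.0], method="SLSQP",
                constraints=[{"type": "ineq", "fun": lambda x: -(x[0] - 3.0)}])

print(res1.x)  # ~[1.0]: active; stationarity 2(x-2) + mu = 0 gives mu = 2 > 0
print(res2.x)  # ~[2.0]: slack; h(x*) = -1 < 0, hence mu = 0
```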

Consider the problem of minimizing a quadratic form subject to a linear equality constraint and an inequality constraint on the norm of the linear constraint vector's variation.
$$\min_x x^T A x \quad \text{subject to} \quad (c + \delta)^T x = 1 \ \text{ and } \ \|\delta\|^2 \leq \epsilon^2$$
This kind of problem arises in robust estimation. One seeks a solution where one of the "knowns" of the problem, $c$ in this case, is, in reality, only approximately specified. The independent variables are $x$ and $\delta$. The Lagrangian for this problem is
$$L(x, \delta) = x^T A x + \lambda \left( (c + \delta)^T x - 1 \right) + \mu \left( \|\delta\|^2 - \epsilon^2 \right)$$
Evaluating the gradients with respect to the independent variables yields
$$2 A x + \lambda (c + \delta) = 0$$
$$\lambda x + 2 \mu \delta = 0$$
The latter equation is key. Recall that either $\mu = 0$ or the inequality constraint is satisfied with equality. If $\mu$ is zero, that implies that $x$ must be zero, which will not allow the equality constraint to be satisfied. The inescapable conclusion is that $\|\delta\|^2 = \epsilon^2$ and that $\delta$ is parallel to $x$: $\delta = -\frac{\lambda}{2\mu} x$. Using the first equation, $x$ is found to be
$$x = -\frac{\lambda}{2} \left( A - \frac{\lambda^2}{4\mu} I \right)^{-1} c$$
Imposing the constraints on this solution results in a pair of equations for the Lagrange multipliers.
$$1 = -\frac{\lambda}{2} \left[ c^T \left( A - \frac{\lambda^2}{4\mu} I \right)^{-1} c + \frac{\lambda^2}{4\mu} \, c^T \left( A - \frac{\lambda^2}{4\mu} I \right)^{-2} c \right]$$
$$\frac{\lambda^4}{16 \mu^2} \, c^T \left( A - \frac{\lambda^2}{4\mu} I \right)^{-2} c = \epsilon^2$$
Multiple solutions are possible, and each must be checked. The rather complicated completion of this example is left to the (numerically oriented) reader.
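
One way the numerically oriented reader might proceed is to hand the stationarity and constraint equations to a root finder. The sketch below is one such attempt, assuming a small invented $A$, $c$, and $\epsilon$, and seeding the search from the $\epsilon = 0$ solution derived at the start of this section; any root found must still be checked ($\mu \geq 0$, and other roots may exist).

```python
import numpy as np
from scipy.optimize import fsolve

# Stationarity/constraint system for the robust-estimation example:
#   2 A x + lam (c + d) = 0     (gradient with respect to x)
#   lam x + 2 mu d      = 0     (gradient with respect to delta)
#   (c + d)^T x         = 1     (equality constraint)
#   d^T d               = eps^2 (inequality met with equality)
# A, c, and eps are invented for illustration.
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])
c = np.array([1.0, 1.0])
eps = 0.1

def kkt(u):
    x, d, lam, mu = u[0:2], u[2:4], u[4], u[5]
    return np.concatenate([
        2 * A @ x + lam * (c + d),
        lam * x + 2 * mu * d,
        [(c + d) @ x - 1.0, d @ d - eps ** 2],
    ])

# Seed the search from the eps = 0 solution x = A^{-1}c / (c^T A^{-1} c).
lam0 = -2.0 / (c @ np.linalg.solve(A, c))
x0 = -lam0 / 2.0 * np.linalg.solve(A, c)
d0 = eps * x0 / np.linalg.norm(x0)            # delta is parallel to x
mu0 = -lam0 * np.linalg.norm(x0) / (2 * eps)  # from lam x + 2 mu d = 0
u = fsolve(kkt, np.concatenate([x0, d0, [lam0, mu0]]))

x, lam, mu = u[0:2], u[4], u[5]
print(x, lam, mu)   # one candidate root; mu >= 0 must still be verified
```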


Source:  OpenStax, Statistical signal processing. OpenStax CNX. Dec 05, 2011 Download for free at http://cnx.org/content/col11382/1.1