Introduces constrained optimization with inequality constraints and the Karush-Kuhn-Tucker conditions.

A constrained optimization problem with inequality constraints can be written as

$$\min_x \; f(x) \quad \text{subject to} \quad g_i(x) = 0,\; i = 1, \ldots, n, \qquad h_j(x) \le 0,\; j = 1, \ldots, m.$$
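
As a concrete illustration (this instance is our own, not taken from the original text, and is reused in the sketches below), consider

$$\min_x \; (x_1 - 2)^2 + (x_2 - 2)^2 \quad \text{subject to} \quad h_1(x) = x_1 + x_2 - 2 \le 0,$$

a problem with no equality constraints ($n = 0$) and a single inequality constraint ($m = 1$).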

Definition 1 Let $x_0$ be a feasible point. If $h_j(x_0) = 0$, we say that the constraint $h_j$ is active at $x_0$; if $h_j(x_0) < 0$, then $h_j$ is inactive at $x_0$. The set of active constraints at $x$ is denoted by

$$J(x) = \{ j : h_j(x) = 0 \}.$$

Definition 2 We say $x \in \Omega$ is a regular point of $\Omega$ if $\{\nabla g_i(x),\; i = 1, \ldots, n\} \cup \{\nabla h_j(x),\; j \in J(x)\}$ is a linearly independent set.
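
For instance, in the illustrative problem above (our example), the point $x_0 = (1, 1)$ has active set $J(x_0) = \{1\}$, and the single gradient $\nabla h_1(x_0) = (1, 1)^T \neq 0$ forms a linearly independent set, so $x_0$ is a regular point.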

Theorem 1 (Karush-Kuhn-Tucker Conditions) If $x_0$ is a regular point of $\Omega$ and $x_0$ is a local minimizer of $f$ over $\Omega$, then there exist scalars $\lambda_1, \ldots, \lambda_n$ and $\mu_1, \ldots, \mu_m$ such that

  1. $\mu_j \ge 0$, $j = 1, \ldots, m$,
  2. $\sum_{j=1}^m \mu_j h_j(x_0) = 0$ (complementary slackness condition),
  3. $\nabla f(x_0) + \sum_{i=1}^n \lambda_i \nabla g_i(x_0) + \sum_{j=1}^m \mu_j \nabla h_j(x_0) = 0$.

Since $\mu_j \ge 0$ and $h_j(x_0) \le 0$, every term in the sum $\sum_{j=1}^m \mu_j h_j(x_0)$ is nonpositive, so the complementary slackness condition forces each term to be zero. In particular, if $h_j$ is inactive at $x_0$ (i.e., if $h_j(x_0) < 0$), then we must have $\mu_j = 0$. Therefore, some versions of the theorem feature only the active inequality constraints in the third condition.
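
As a rough sketch of how these conditions can be checked in practice (our own code, using the hypothetical instance introduced above and assuming its single constraint is active at the minimizer), the stationarity condition and the active constraint can be solved symbolically:

```python
# Minimal sketch: solve the KKT system for
#   min (x1 - 2)^2 + (x2 - 2)^2  s.t.  h1(x) = x1 + x2 - 2 <= 0,
# assuming h1 is active at the minimizer (an assumption of this sketch).
import sympy as sp

x1, x2, mu1 = sp.symbols('x1 x2 mu1', real=True)
f = (x1 - 2)**2 + (x2 - 2)**2      # objective of the illustrative instance
h1 = x1 + x2 - 2                   # inequality constraint h1(x) <= 0

grad_f = [sp.diff(f, v) for v in (x1, x2)]
grad_h1 = [sp.diff(h1, v) for v in (x1, x2)]

# Third KKT condition (no equality constraints here) plus h1(x0) = 0.
eqs = [gf + mu1 * gh for gf, gh in zip(grad_f, grad_h1)] + [h1]
sol = sp.solve(eqs, (x1, x2, mu1), dict=True)[0]
print(sol)  # {x1: 1, x2: 1, mu1: 2}
```

The resulting $x_0 = (1, 1)$ with $\mu_1 = 2 \ge 0$ satisfies all three conditions, and complementary slackness holds because the constraint is active.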

Example 1 We pictorially demonstrate some examples of active inequality constraints. Consider the case where the set $\Omega$ is a convex set bounded by three inequality constraints, $h_1(x)$, $h_2(x)$, and $h_3(x)$. Now, consider these three possibilities for the minimizer $x_0$ of $f(x)$ in $\Omega$, illustrated in the figure below:

Figure: Three examples of the application of the Karush-Kuhn-Tucker conditions.
  • In this case, the minimizer is in the interior of the set $\Omega$, and no constraints are active; therefore, the minimizer of the constrained problem matches the minimizer of the unconstrained problem, and the solution is found by solving $\nabla f(x_0) = 0$, which ignores all inequality constraints. This means that each $\mu_j = 0$.
  • In this case, the minimizer has one active constraint ($h_1(x)$). Consider the gradient of the constraint, $\nabla h_1(x_0)$, which is orthogonal to the tangent space of the constraint boundary at $x_0$. If $\nabla h_1(x_0)$ and $\nabla f(x_0)$ are not collinear (scalar multiples of one another), then there is a direction within the tangent space in which the value of $f(x)$ decreases, and $x_0$ would not be a minimizer. Alternatively, if $\nabla f(x_0) + \mu_1 \nabla h_1(x_0) = 0$ for some $\mu_1 < 0$, then both gradients point in the same direction, and the value of $f(x)$ would decrease by moving in the opposite direction, toward the interior of $\Omega$, so again $x_0$ would not be a minimizer. Therefore, we must have $\nabla f(x_0) + \mu_1 \nabla h_1(x_0) = 0$ with $\mu_1 > 0$, as specified by the KKT conditions. In this case the inactive constraints $h_2(x)$ and $h_3(x)$ are ignored.
  • In this case, the minimizer has two active constraints ($h_1(x)$ and $h_2(x)$). The directions $d$ in which we can move from $x_0$ while staying within $\Omega$ must obey $\langle d, \nabla h_1(x_0) \rangle \le 0$ and $\langle d, \nabla h_2(x_0) \rangle \le 0$. Similarly, moving in a direction $d$ decreases the value of $f(x_0 + d)$ below $f(x_0)$ if $\langle d, \nabla f(x_0) \rangle < 0$. Thus, for no such descent direction $d$ to exist, we must have $-\nabla f(x_0) = a \nabla h_1(x_0) + b \nabla h_2(x_0)$ with $a \ge 0$ and $b \ge 0$, which gives
    $$\nabla f(x_0) + a \nabla h_1(x_0) + b \nabla h_2(x_0) = 0.$$
    Thus, the minimizer $x_0$ of $f(x)$ must obey
    $$\nabla f(x_0) + \mu_1 \nabla h_1(x_0) + \mu_2 \nabla h_2(x_0) = 0,$$
    with $\mu_1 > 0$ and $\mu_2 > 0$; a numerical check of this cone condition is sketched after this list.
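
The cone condition in the last case can also be checked numerically. The following sketch (our own illustration; the objective, constraints, and candidate point are hypothetical, not from the text) uses nonnegative least squares to test whether $-\nabla f(x_0)$ lies in the cone spanned by the gradients of the two active constraints:

```python
# Minimal sketch: verify -grad f(x0) = mu1 * grad h1(x0) + mu2 * grad h2(x0)
# with mu1, mu2 >= 0 at a corner point with two active constraints.
# Hypothetical instance: f(x) = (x1 - 2)^2 + (x2 - 2)^2,
# h1(x) = x1 - 1 <= 0, h2(x) = x2 - 1 <= 0, candidate x0 = (1, 1).
import numpy as np
from scipy.optimize import nnls

x0 = np.array([1.0, 1.0])
grad_f = 2 * (x0 - 2)                    # = (-2, -2)
A = np.column_stack(([1.0, 0.0],         # grad h1(x0)
                     [0.0, 1.0]))        # grad h2(x0)

# Solve -grad_f = A @ mu in the least-squares sense with mu >= 0.
mu, residual = nnls(A, -grad_f)
print(mu, residual)  # [2. 2.] 0.0 -> the KKT condition holds at x0
```

A residual of (numerically) zero together with nonnegative multipliers means no feasible descent direction exists at $x_0$.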

The example illustrates a simple “recipe” for solving optimization problems with inequality constraints, which consists of the following steps:

  1. Pick a candidate active set $\hat{J}(x_0)$.
  2. Build the corresponding form of the third KKT condition,
    $$\nabla f(x_0) + \sum_{i=1}^n \lambda_i \nabla g_i(x_0) + \sum_{j \in \hat{J}(x_0)} \mu_j \nabla h_j(x_0) = 0,$$
    and solve it, together with $g_i(x_0) = 0$ for all $i$ and $h_j(x_0) = 0$ for $j \in \hat{J}(x_0)$, for $x_0$, $\lambda$, and $\mu$.
  3. If $\mu_j \ge 0$ for all $j \in \hat{J}(x_0)$ and $h_j(x_0) \le 0$ for all $j \notin \hat{J}(x_0)$, then $x_0$ is a solution. Otherwise, pick a new candidate active set $\hat{J}(x_0)$ and repeat Step 2. A sketch of this enumeration for a small example appears below.
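
The following sketch (our own, applied to a hypothetical instance with the three constraints shown below rather than to any problem from the text) enumerates all candidate active sets and applies this recipe with a symbolic solver:

```python
# Minimal sketch of the active-set recipe for
#   min (x1 - 2)^2 + (x2 - 2)^2
#   s.t. h1 = x1 + x2 - 2 <= 0,  h2 = -x1 <= 0,  h3 = -x2 <= 0.
# For each candidate active set J, solve stationarity plus h_j = 0 for j in J,
# then keep solutions with nonnegative multipliers and feasible inactive constraints.
import itertools
import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
f = (x1 - 2)**2 + (x2 - 2)**2
h = [x1 + x2 - 2, -x1, -x2]
mus = sp.symbols('mu1 mu2 mu3', real=True)

for J in itertools.chain.from_iterable(
        itertools.combinations(range(3), r) for r in range(4)):
    # Stationarity with mu_j = 0 for every inactive constraint.
    stationarity = [sp.diff(f, v) + sum(mus[j] * sp.diff(h[j], v) for j in J)
                    for v in (x1, x2)]
    eqs = stationarity + [h[j] for j in J]      # active constraints hold with equality
    unknowns = [x1, x2] + [mus[j] for j in J]
    for sol in sp.solve(eqs, unknowns, dict=True):
        mult_ok = all(sol[mus[j]] >= 0 for j in J)
        feas_ok = all(h[j].subs(sol) <= 0 for j in range(3) if j not in J)
        if mult_ok and feas_ok:
            print(f"active set {J}: x0 = ({sol[x1]}, {sol[x2]})")
```

For this instance, only the candidate set containing the first constraint survives the checks, returning $x_0 = (1, 1)$ with $\mu_1 = 2$; every other choice either yields a negative multiplier or violates an inactive constraint.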





Source: OpenStax, Signal theory. OpenStax CNX. Oct 18, 2013. Download for free at http://legacy.cnx.org/content/col11542/1.3
