
Equality Constraints (Lagrangians)

Suppose we have the problem:

Maximize $5 - (x-2)^2 - 2(y-1)^2$

subject to

$x + 4y = 3.$

If we ignore the constraint, we get the solution $(x,y) = (2,1)$, which is too large for the constraint ($x + 4y = 6 > 3$). Let us penalize ourselves $\lambda$ times the amount by which the constraint is exceeded. We end up with the function

$L(x,y,\lambda) = 5 - (x-2)^2 - 2(y-1)^2 + \lambda(3 - x - 4y).$

This function is called the Lagrangian of the problem. The main idea is to adjust $\lambda$ so that we use exactly the right amount of the resource.

$\lambda = 0$ leads to $(2,1)$, which uses too much of the resource.

$\lambda = 1$ leads to $(3/2, 0)$, which uses too little of the resource.

$\lambda = 2/3$ gives $(5/3, 1/3)$, and the constraint is satisfied exactly.
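The adjustment process can be checked with a short sketch in Python. Taking the problem as maximizing $5 - (x-2)^2 - 2(y-1)^2$ subject to $x + 4y = 3$ (the data consistent with the three points quoted above), stationarity of the Lagrangian gives $x = 2 - \lambda/2$ and $y = 1 - \lambda$:

```python
# Sketch: adjust the penalty lambda until the resource is used exactly.
# Stationarity of L = 5 - (x-2)**2 - 2*(y-1)**2 + lam*(3 - x - 4*y) gives:
#   dL/dx = -2*(x-2) - lam   = 0  ->  x = 2 - lam/2
#   dL/dy = -4*(y-1) - 4*lam = 0  ->  y = 1 - lam

def stationary_point(lam):
    return 2 - lam / 2, 1 - lam

def resource_used(lam):
    x, y = stationary_point(lam)
    return x + 4 * y          # the constraint requires this to equal 3

for lam in (0, 1, 2/3):
    x, y = stationary_point(lam)
    print(f"lambda={lam:.3f}: (x, y)=({x:.3f}, {y:.3f}), x+4y={resource_used(lam):.3f}")
```

Running this reproduces the three cases above: $\lambda = 0$ overuses the resource, $\lambda = 1$ underuses it, and $\lambda = 2/3$ hits it exactly.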

We now explore this idea more formally. Given a nonlinear program (P) with equality constraints:

Minimize (or maximize) $f(x)$

subject to

$g_1(x) = b_1$

$g_2(x) = b_2$

$\vdots$

$g_m(x) = b_m$

a solution can be found using the Lagrangian

$L(x,\lambda) = f(x) + \sum_{i=1}^{m} \lambda_i (b_i - g_i(x)).$

(Note: this can also be written $f(x) - \sum_{i=1}^{m} \lambda_i (g_i(x) - b_i)$.)

Each $\lambda_i$ gives the price associated with constraint $i$.

The reason $L$ is of interest is the following. Assume $x^*$ maximizes or minimizes $f(x)$ subject to the constraints $g_i(x) = b_i$, for $i = 1, \ldots, m$. Then either

(i) the vectors $\nabla g_1(x^*), \ldots, \nabla g_m(x^*)$ are linearly dependent, or

(ii) there exists a vector $\lambda^* = (\lambda_1^*, \ldots, \lambda_m^*)$ such that $\nabla L(x^*, \lambda^*) = 0$; that is,

$\frac{\partial L}{\partial x_j}(x^*, \lambda^*) = 0$ for all $j$, and $\frac{\partial L}{\partial \lambda_i}(x^*, \lambda^*) = 0$ for all $i$.
Of course, Case (i) above cannot occur when there is only one constraint. The following example shows how it might occur.

Minimize $x_1 + x_2 + x_3^2$

subject to

$x_1 = 1$

$x_1^2 + x_2^2 = 1.$

It is easy to check directly that the minimum is achieved at $(x_1, x_2, x_3) = (1, 0, 0)$. The associated Lagrangian is

$L(x,\lambda) = x_1 + x_2 + x_3^2 + \lambda_1(1 - x_1) + \lambda_2(1 - x_1^2 - x_2^2).$

Observe that

$\frac{\partial L}{\partial x_2} = 1 - 2\lambda_2 x_2,$

and consequently $\frac{\partial L}{\partial x_2}$ does not vanish at the optimal solution, no matter the choice of $\lambda_2$, since $x_2 = 0$ there. The reason for this is the following. Let $g_1(x) = x_1$ and $g_2(x) = x_1^2 + x_2^2$ denote the left hand sides of the constraints. Then $\nabla g_1(x^*) = (1, 0, 0)$ and $\nabla g_2(x^*) = (2, 0, 0)$ are linearly dependent vectors. So Case (i) occurs here!
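This degeneracy is easy to confirm numerically. The sketch below (plain Python, using the example's data: minimize $x_1 + x_2 + x_3^2$ subject to $x_1 = 1$ and $x_1^2 + x_2^2 = 1$, minimizer $(1,0,0)$) evaluates both constraint gradients at the minimizer and checks that no choice of $\lambda_2$ can make $\partial L / \partial x_2$ vanish there:

```python
# Optimal point of: minimize x1 + x2 + x3**2
#   s.t.  x1 = 1  and  x1**2 + x2**2 = 1.
x1, x2, x3 = 1.0, 0.0, 0.0

grad_g1 = (1.0, 0.0, 0.0)          # gradient of g1(x) = x1
grad_g2 = (2 * x1, 2 * x2, 0.0)    # gradient of g2(x) = x1**2 + x2**2

# grad_g2 = (2, 0, 0) = 2 * grad_g1, so the gradients are linearly
# dependent at x* -- this is exactly Case (i).
dependent = grad_g2 == tuple(2 * c for c in grad_g1)

# dL/dx2 = 1 - 2*lam2*x2 equals 1 for every lam2, because x2 = 0 at x*.
def dL_dx2(lam2):
    return 1 - 2 * lam2 * x2

print(dependent, dL_dx2(5.0))
```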

Nevertheless, Case (i) will not concern us in this course. When solving optimization problems with equality constraints, we will only look for solutions $x^*$ that satisfy Case (ii).

Note that the equation

$\frac{\partial L}{\partial \lambda_i}(x,\lambda) = 0$

is nothing more than

$g_i(x) = b_i.$

In other words, taking the partials with respect to $\lambda$ does nothing more than return the original constraints.

Once we have found candidate solutions $x^*$, it is not always easy to figure out whether they correspond to a minimum, a maximum, or neither. The following situation is one in which we can conclude. If $f(x)$ is concave and all of the $g_i(x)$ are linear, then any feasible $x^*$ with a corresponding $\lambda^*$ making $\nabla L(x^*,\lambda^*) = 0$ maximizes $f(x)$ subject to the constraints. Similarly, if $f(x)$ is convex and each $g_i(x)$ is linear, then any $x^*$ with a $\lambda^*$ making $\nabla L(x^*,\lambda^*) = 0$ minimizes $f(x)$ subject to the constraints.

Minimize $x_1^2 + 2x_2^2$

subject to

$x_1 + x_2 = 1.$

The Lagrangian is

$L(x,\lambda) = x_1^2 + 2x_2^2 + \lambda(1 - x_1 - x_2),$

and setting its partial derivatives equal to zero gives

$\frac{\partial L}{\partial x_1} = 2x_1 - \lambda = 0$

$\frac{\partial L}{\partial x_2} = 4x_2 - \lambda = 0$

$\frac{\partial L}{\partial \lambda} = 1 - x_1 - x_2 = 0.$

Now, the first two equations imply $x_1 = 2x_2$. Substituting into the final equation gives the solution $x_1 = 2/3$, $x_2 = 1/3$ and $\lambda = 4/3$, with function value $2/3$.

Since $x_1^2 + 2x_2^2$ is convex (its Hessian matrix $\begin{pmatrix} 2 & 0 \\ 0 & 4 \end{pmatrix}$ is positive definite) and $x_1 + x_2$ is a linear function, the above solution minimizes $x_1^2 + 2x_2^2$ subject to the constraint.
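As a quick check, the stationarity system for this example (taking it as $2x_1 = \lambda$, $4x_2 = \lambda$, $x_1 + x_2 = 1$, consistent with the solution quoted above) can be solved in a few lines:

```python
# Solve: 2*x1 = lam, 4*x2 = lam, x1 + x2 = 1.
# The first two equations give x1 = 2*x2; the constraint then gives 3*x2 = 1.
x2 = 1 / 3
x1 = 2 * x2
lam = 2 * x1

f_value = x1**2 + 2 * x2**2   # objective value at the candidate solution
print(x1, x2, lam, f_value)
```

This recovers $x_1 = 2/3$, $x_2 = 1/3$, $\lambda = 4/3$ and objective value $2/3$.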


Michael A. Trick
Mon Aug 24 16:30:59 EDT 1998