Math 235 Calculus III  Lab 2 Optimization and Lagrange Multipliers

There are two different ways to find local maxima and minima. The first method involves looking for critical points and applying the second derivative test, while the second method uses Lagrange multipliers.
Critical Points
When we look for local extrema we are looking for points that are either larger than all the surrounding points or smaller than all the surrounding points. The first method we have of doing this is to find the critical points of the function. Given f(x,y) we find the first partial derivatives, f_{x} and f_{y}. After finding the first derivatives we set them both equal to zero, then solve for x and y to find the critical points of the function. Critical points occur either at these solutions or at places where a derivative doesn't exist. The maximum and minimum values of the function will occur either at these critical points or on the boundary of the region. For example, consider f(x,y)=xy-x^{2}-y^{2}+2x+2y+4. First we note that the function is defined and differentiable for all x and y and no boundary has been defined. If our function were not defined at some point we would have to include that point as one of our test points. We then find f_{x}=y-2x+2 and f_{y}=x-2y+2. If f_{x} and f_{y} are both set equal to zero we find that x=2 and y=2. So the only critical point is (2,2). To determine if this point is a local minimum, local maximum or neither, we can use the second derivative test. We evaluate the discriminant D(x,y)=f_{xx}f_{yy}-f_{xy}^{2} at the critical point.
1) If D(x,y)>0 and f_{xx}>0 then f has a local minimum at the critical point.
2) If D(x,y)>0 and f_{xx}<0 then f has a local maximum at the critical point.
3) If D(x,y)<0 then f has a saddle point.
4) If D(x,y)=0 the test is inconclusive.
So, for our problem above, D(2,2)=(-2)(-2)-1=3>0 and f_{xx}(2,2)=-2<0, so the function has a local maximum at (2,2). We can see this graphically as well:
> with(plots):
> plot3d(x*y-x^2-y^2+2*x+2*y+4,x=-10..10,y=-10..10);
Knowing that the point (2,2) is a critical point from our algebraic work, we can safely deduce that it's a local maximum from the graph.
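As a cross-check on the algebra, here is a short Python sketch (our own, outside the lab's Maple session) that approximates the second partials of the example function by central differences and applies the second derivative test at (2,2); the helper name second_partials is an assumption of this sketch, not a Maple command.

```python
# A numeric cross-check of the second derivative test above, written in
# Python rather than Maple. The function and the critical point (2, 2)
# come from the example; second_partials is our own helper that uses
# central-difference approximations to the second partial derivatives.

def f(x, y):
    return x*y - x**2 - y**2 + 2*x + 2*y + 4

def second_partials(f, x, y, h=1e-4):
    """Approximate f_xx, f_yy, and f_xy at (x, y) by central differences."""
    fxx = (f(x + h, y) - 2*f(x, y) + f(x - h, y)) / h**2
    fyy = (f(x, y + h) - 2*f(x, y) + f(x, y - h)) / h**2
    fxy = (f(x + h, y + h) - f(x + h, y - h)
           - f(x - h, y + h) + f(x - h, y - h)) / (4*h**2)
    return fxx, fyy, fxy

fxx, fyy, fxy = second_partials(f, 2, 2)
D = fxx*fyy - fxy**2          # discriminant at the critical point (2, 2)
print(D > 0 and fxx < 0)      # True: (2, 2) is a local maximum
```

The approximations agree with the hand computation: f_xx ≈ -2, f_yy ≈ -2, f_xy ≈ 1, so D ≈ 3.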
The problem above was fairly easy to do by hand; finding the roots of the first partial derivatives was not complex. However, this process can quickly grow to be extremely complicated even when the initial function appears to be fairly simple. That's when we can start to use Maple to perform the algebra for us. For example, let's look at f(x,y)=y/(x^{2}+y^{2}+1).
First note that the domain of the function is all real numbers and no boundary has been specified. That tells us that we don't have to worry about looking for boundary values. We can look for the first partial derivatives of the function by using Maple. The first thing we do is define the function.
> f:=y/(x^2+y^2+1);
To get Maple to understand our function we start by typing the name we want to give the function. In this case, that name is f. We then type ":=", which tells Maple that whatever follows will be the function.
> dfdx:=diff(f,x);
> dfdy:=diff(f,y);
Once we've found the first partial derivatives, we can use Maple to find the points where they are simultaneously zero using the following command:
> solve({dfdx=0,dfdy=0},{x,y});
So, the points (0,1) and (0,-1) are our critical points. Next we want to find the second partial derivatives. Note that in the first command the function is named "dfdx2" and Maple will assign to it the function it finds when it takes the derivative of dfdx with respect to x.
> dfdx2:=diff(dfdx,x);
> dfdy2:=diff(dfdy,y);
> dfdxdy:=diff(dfdx,y);
Now that we've had Maple find all the individual parts it will need, we can have it calculate the discriminant, D(x,y)=f_{xx}f_{yy}-f_{xy}^{2}. Since the name "D" is assigned to something else in Maple, we name this function "S". After defining the function, to get Maple to evaluate the discriminant at each of the critical points, we simply assign whatever value we want for each of x and y, then ask Maple to show us the function S again and it will show us the value S takes on at the critical point.
> S:=(dfdx2)*(dfdy2)-(dfdxdy)^2;
> assign(x=0,y=1);
> S;
> dfdx2;
> unassign('x','y');assign(x=0,y=-1);
> S;
> dfdx2;
> unassign('x','y');
> plot3d(y/(x^2+y^2+1),x=-10..10,y=-10..10);
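As an independent check on the classification above, here is a Python sketch (again our own, not part of the Maple session) that approximates the discriminant by central differences at both critical points of f(x,y)=y/(x^{2}+y^{2}+1); the helper name discriminant_and_fxx is an assumption of this sketch.

```python
# A numeric check of the discriminant S = f_xx*f_yy - f_xy^2 at the two
# critical points (0, 1) and (0, -1), using central-difference
# approximations to the second partial derivatives.

def f(x, y):
    return y / (x**2 + y**2 + 1)

def discriminant_and_fxx(x, y, h=1e-4):
    """Approximate D = f_xx*f_yy - f_xy^2 and f_xx at (x, y)."""
    fxx = (f(x + h, y) - 2*f(x, y) + f(x - h, y)) / h**2
    fyy = (f(x, y + h) - 2*f(x, y) + f(x, y - h)) / h**2
    fxy = (f(x + h, y + h) - f(x + h, y - h)
           - f(x - h, y + h) + f(x - h, y - h)) / (4*h**2)
    return fxx*fyy - fxy**2, fxx

D1, fxx1 = discriminant_and_fxx(0, 1)     # critical point (0, 1)
D2, fxx2 = discriminant_and_fxx(0, -1)    # critical point (0, -1)
print(D1 > 0 and fxx1 < 0)   # True: local maximum at (0, 1)
print(D2 > 0 and fxx2 > 0)   # True: local minimum at (0, -1)
```

Both discriminants come out to approximately 1/4, matching what the Maple session finds.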
Note that you must be careful to unassign x and y when you want to treat them as variables rather than constants. The two problems already discussed in this lab have dealt with optimization without a boundary to worry about. Before moving on to the next method of optimization, let's look at an example of optimization on a bounded region using the critical point method. Let f(x,y)=x^{2}+3xy+y^{2}. We want to find the maximum and minimum of f(x,y) on the unit square with vertices at (0,0), (1,0), (0,1) and (1,1). The first necessity is to find the critical points of f(x,y). The derivative will always exist for this function, so we just need to look for places where the first partial derivative with respect to x and the first partial with respect to y are both 0. First, we define the function:
> f:=x^2+3*x*y+y^2;
> dfdx:=diff(f,x);
> dfdy:=diff(f,y);
> solve({dfdx=0,dfdy=0},{x,y});
> dfdx2:=diff(dfdx,x);
> dfdy2:=diff(dfdy,y);
> dfdxdy:=diff(dfdx,y);
> S:=(dfdx2)*(dfdy2)-(dfdxdy)^2;
So, (0,0) is the only critical point and f(0,0)=0. (In fact, S=(2)(2)-3^{2}=-5<0, so (0,0) is a saddle point; we still record f(0,0)=0 as a candidate value.) Now, we also check along each of the boundaries. We know the extreme values for the function will occur either at critical points or along one of the boundaries, so if we just find the extrema along each boundary segment then compare the values, that will give the extrema for the function on the region. Between the vertices (0,0) and (1,0) we are along the x-axis, so y=0 and f(x,y)=x^{2}. By taking the first derivative of x^{2} we know that this is an increasing function along the segment [0,1], so the minimum value along this line segment will be f(0,0)=0 and the maximum will be f(1,0)=1.
Along the segment between (1,0) and (1,1), x is always 1, so f(x,y)=1+3y+y^{2}. Again, maximizing along the line segment becomes a single variable optimization problem. The derivative of 1+3y+y^{2} is 3+2y, which is always positive when y is in the interval [0,1], so f(x,y) is increasing as y increases along this segment. Thus, the maximum value of f(x,y) along this line segment will be f(1,1)=5 and the minimum along this segment will be f(1,0)=1.
Along the segment from (1,1) to (0,1), y=1, so f(x,y)=x^{2}+3x+1, which has derivative 2x+3. When x is between 0 and 1, 2x+3 is always positive. So, as x increases from 0 to 1, f(x,y) increases. This tells us that along the segment, f(x,y) achieves its maximum at f(1,1)=5 and its minimum at f(0,1)=1.
Lastly, we look at the segment of the square running from (0,1) to (0,0) and note that here x=0, so f(x,y)=y^{2}, which increases as y increases in the interval [0,1]. So, along this segment, the minimum is f(0,0)=0 and the maximum is f(0,1)=1. Now that we've found the maximum and minimum along each of the boundaries, all that needs to be done is to compare them to find the overall minimum and maximum values for the function. (Note that in this case the maxima and minima on each segment of the boundary happened to occur at endpoints of the boundary. THIS WILL NOT ALWAYS HAPPEN! It is essential that you analyze each portion of the boundary to see where the function is increasing or decreasing to determine where the minima and maxima are occurring.) So, let's finish by finding the maxima and minima of f(x,y) on the region.
This tells us the minimum value of f(x,y) on the boundary is 0 and the maximum value of f(x,y) on the boundary is 5. We then compare these values to our critical values, and since (0,0) was our only critical point and f(0,0)=0, the maximum of f(x,y) on the unit square is 5, occurring at (1,1), and the minimum on the unit square is 0, occurring at (0,0). Let's look at this graphically as well.
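The boundary analysis can be sanity-checked by brute force. The Python sketch below (our own stand-in for Maple, with an arbitrary grid resolution) samples f on a fine grid over the unit square and reports the extreme sampled values.

```python
# A brute-force check of the extreme values of f(x, y) = x^2 + 3xy + y^2
# on the unit square: sample f on an (n+1) x (n+1) grid that includes
# all four corners. The resolution n is our own choice.

def f(x, y):
    return x**2 + 3*x*y + y**2

n = 200
values = [f(i / n, j / n) for i in range(n + 1) for j in range(n + 1)]
print(max(values))  # 5.0, attained at the corner (1, 1)
print(min(values))  # 0.0, attained at the corner (0, 0)
```

The grid maximum and minimum match the values found at the corners (1,1) and (0,0).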
> plot3d(x^2+3*x*y+y^2,x=0..1,y=0..1);
Lagrange Multipliers
The other method for multivariable optimization is the use of Lagrange multipliers. To find the maximum and minimum values of f(x,y,z) subject to the constraints g(x,y,z)=k and h(x,y,z)=m, we solve the following system of equations for x, y, z, λ_{1} and λ_{2}.
f_{x}=λ_{1}g_{x}+λ_{2}h_{x}
f_{y}=λ_{1}g_{y}+λ_{2}h_{y}
f_{z}=λ_{1}g_{z}+λ_{2}h_{z}
g(x,y,z)=k
h(x,y,z)=m
The next step is to evaluate f(x,y,z) at all solutions to the system and compare the outputs to see what the maximum and minimum values are.
We can see here that we have a system of five equations in five unknowns, which can quickly escalate into a problem you wouldn't want to solve by hand, but here again we can use Maple to do much of the detail work for us.
To see how Maple can do much of the algebra work let's look at an example: Find the maximum and minimum of f(x,y,z)=x^{2}+y^{2}+z^{2} subject to x^{2}+y^{2}=1 and x+y+z=1.
We start by defining the function f and the constraints g and h in Maple, then taking all the derivatives we'll need.
> f:=x^2+y^2+z^2;
> g:=x^2+y^2;
> h:=x+y+z;
> dfdx:=diff(f,x);
> dfdy:=diff(f,y);
> dfdz:=diff(f,z);
> dgdx:=diff(g,x);
> dgdy:=diff(g,y);
> dgdz:=diff(g,z);
> dhdx:=diff(h,x);
> dhdy:=diff(h,y);
> dhdz:=diff(h,z);
After we have all the derivatives in the computer, we ask Maple to solve the system of five equations for the five unknowns:
> solve({dfdx=lambda1*dgdx+lambda2*dhdx,dfdy=lambda1*dgdy+lambda2*dhdy,dfdz=lambda1*dgdz+lambda2*dhdz, x^2+y^2=1,x+y+z=1},{x,y,z,lambda1,lambda2});
The third and fourth solutions tell us that x is a root of the equation 2x^{2}-1=0, so x=±√2/2, and similarly y=±√2/2 (with y=x in each solution).
Since z=1-x-y=1-2x, the corresponding values of z are 1∓√2.
Thus, our four solutions to the system of equations are (1,0,0), (0,1,0), (√2/2, √2/2, 1-√2) and (-√2/2, -√2/2, 1+√2). So, we now just need to evaluate f(x,y,z) at each of these points:
f(1,0,0)=1
f(0,1,0)=1
We'll use Maple to evaluate the other 2 by assigning the appropriate values for x, y and z then asking Maple to compute the value for f.
> assign('x'=sqrt(2)/2,'y'=sqrt(2)/2,'z'=1-sqrt(2));
> f;
> evalf(%);
> assign('x'=-sqrt(2)/2,'y'=-sqrt(2)/2,'z'=1+sqrt(2));
> f;
> evalf(%);
> unassign('x','y','z');
So, the maximum value for f(x,y,z) is 4+2√2, approximately 6.8, occurring at (-√2/2, -√2/2, 1+√2), and the minimum value for f(x,y,z) is 1, occurring at (1,0,0) and (0,1,0). This was an example of a problem where the algebra involved wasn't too difficult; however, just as with the examples of optimization using critical points, it is easy to see how one can encounter problems that are far more difficult to do by hand and will require a computer to work out the details.
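To double-check these conclusions, here is a small Python sketch (our own, outside the lab's Maple session) that evaluates f at the four candidate points and verifies that each satisfies both constraints.

```python
import math

# Evaluating f(x, y, z) = x^2 + y^2 + z^2 at the four Lagrange candidates,
# and verifying that each satisfies both constraints:
# x^2 + y^2 = 1 and x + y + z = 1.

def f(x, y, z):
    return x**2 + y**2 + z**2

r = math.sqrt(2) / 2
candidates = [(1, 0, 0),
              (0, 1, 0),
              (r, r, 1 - math.sqrt(2)),
              (-r, -r, 1 + math.sqrt(2))]

vals = []
for x, y, z in candidates:
    assert abs(x**2 + y**2 - 1) < 1e-9   # constraint g(x, y, z) = 1
    assert abs(x + y + z - 1) < 1e-9     # constraint h(x, y, z) = 1
    vals.append(f(x, y, z))

print(min(vals))   # 1, at (1, 0, 0) and (0, 1, 0)
print(max(vals))   # 4 + 2*sqrt(2), approximately 6.828
```

The third candidate gives the intermediate value 4-2√2 ≈ 1.17, which is neither the maximum nor the minimum.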
*Problems adapted from sections 12.8–12.9 of the 8th edition of Calculus and Analytic Geometry by Thomas and Finney.
Exercises:
1. Using critical points, find all maxima, minima, and saddle points of f(x,y)=(3x+4)/(x^{2}+y^{2}+1).
2. Using critical points, find the absolute maxima and minima of (4x-x^{2})cos y on the rectangle 1≤x≤3, -π/4≤y≤π/4.
3. Use Lagrange Multipliers to find the extreme values of f(x,y,z)=x^{2}yz+1 on the intersection of the plane z=1 with the sphere x^{2}+y^{2}+z^{2}=10.