
0.1 Practical Guide - Local Extrema


0.1.1 Local Extrema for functions of several variables
Consider scalar-valued functions of several variables $f : D \subset \mathbb{R}^3 \to \mathbb{R}$, $f = f(x, y, z)$.
We try to find the local extrema, that is, maximum or minimum points.
Definition
A point $(a, b, c)$ is a local maximum if
$$f(x, y, z) \le f(a, b, c) \quad \text{for all } (x, y, z) \text{ in a ball centered at } (a, b, c)$$
Consequently, in particular we get
$$f(x, b, c) \le f(a, b, c) \quad \text{for all } x \text{ in an open interval centered at } a$$
This precisely means $x = a$ is a local maximum for the function $f = f(x, b, c)$, which is a function of one variable.
By the well-known Fermat theorem, it follows that the derivative at $x = a$ is zero, that is
$$\frac{\partial f}{\partial x}(a, b, c) = 0$$
The same holds for $y = b$ and $z = c$.
We conclude that all three partial derivatives are necessarily zero at every local extremum.
Definition. A point $(a, b, c)$ such that
$$\frac{\partial f}{\partial x}(a, b, c) = 0, \qquad \frac{\partial f}{\partial y}(a, b, c) = 0, \qquad \frac{\partial f}{\partial z}(a, b, c) = 0$$
is called a "critical" point (or "stationary" point).
Therefore, all local extrema are critical points.
We use Taylor's formula (for several variables) to decide which critical points are local extrema.
$$f(x,y,z) = f(a,b,c) + (x-a)\frac{\partial f}{\partial x} + (y-b)\frac{\partial f}{\partial y} + (z-c)\frac{\partial f}{\partial z} + {}$$
$$ {} + \frac{1}{2!}\underbrace{\left[(x-a)^2\frac{\partial^2 f}{\partial x^2} + (y-b)^2\frac{\partial^2 f}{\partial y^2} + (z-c)^2\frac{\partial^2 f}{\partial z^2} + 2(x-a)(y-b)\frac{\partial^2 f}{\partial x\partial y} + 2(x-a)(z-c)\frac{\partial^2 f}{\partial x\partial z} + 2(y-b)(z-c)\frac{\partial^2 f}{\partial y\partial z}\right]}_{\text{quadratic form}} + \dots$$
All partial derivatives are taken at $(a, b, c)$.
Assume $(a, b, c)$ is a critical point. Then
$$f(x,y,z) - f(a,b,c) = (x-a)\underbrace{\frac{\partial f}{\partial x}}_{0} + (y-b)\underbrace{\frac{\partial f}{\partial y}}_{0} + (z-c)\underbrace{\frac{\partial f}{\partial z}}_{0} + {}$$
$$ {} + \frac{1}{2!}\underbrace{\left[(x-a)^2\frac{\partial^2 f}{\partial x^2} + (y-b)^2\frac{\partial^2 f}{\partial y^2} + (z-c)^2\frac{\partial^2 f}{\partial z^2} + 2(x-a)(y-b)\frac{\partial^2 f}{\partial x\partial y} + 2(x-a)(z-c)\frac{\partial^2 f}{\partial x\partial z} + 2(y-b)(z-c)\frac{\partial^2 f}{\partial y\partial z}\right]}_{\text{quadratic form}} + \dots$$
Therefore the sign of $f(x,y,z) - f(a,b,c)$ depends on the sign of the quadratic form, which corresponds to the symmetric matrix called the Hessian matrix
$$H_f = \begin{pmatrix} \dfrac{\partial^2 f}{\partial x^2} & \dfrac{\partial^2 f}{\partial x\partial y} & \dfrac{\partial^2 f}{\partial x\partial z} \\[2mm] \dfrac{\partial^2 f}{\partial x\partial y} & \dfrac{\partial^2 f}{\partial y^2} & \dfrac{\partial^2 f}{\partial y\partial z} \\[2mm] \dfrac{\partial^2 f}{\partial x\partial z} & \dfrac{\partial^2 f}{\partial y\partial z} & \dfrac{\partial^2 f}{\partial z^2} \end{pmatrix}$$
whose entries are all the partial derivatives of second order.
A simple algorithm follows.
Step I Find the critical points.
Step II Compute the partial derivatives of second order.
Step III Classify the quadratic form associated to the Hessian matrix; then
- if the matrix is positive definite then $(a,b,c)$ is a minimum point
- if the matrix is negative definite then $(a,b,c)$ is a maximum point
- if the matrix is indefinite then $(a,b,c)$ is not an extremum; it is called a "saddle" point
To decide this, we may use the eigenvalues of the Hessian matrix.
Compute these eigenvalues as the solutions of the equation
$$\det(H_f - \lambda I) = 0$$
The quadratic form is
- positive definite $\iff$ all eigenvalues $\lambda > 0$
- negative definite $\iff$ all eigenvalues $\lambda < 0$
- indefinite $\iff$ one eigenvalue $< 0$ and another eigenvalue $> 0$
Or you may use another "test" which computes the "principal minors" $\Delta_1, \Delta_2, \Delta_3$ of the Hessian matrix (determinants along the first diagonal):
- if $\Delta_1 > 0$, $\Delta_2 > 0$, $\Delta_3 > 0$ then the matrix is positive definite
- if $\Delta_1 < 0$, $\Delta_2 > 0$, $\Delta_3 < 0$ then the matrix is negative definite
- otherwise no determination
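The three steps above can be sketched in a few lines of SymPy. This is a minimal illustration, not part of the original guide; the test function $f = x^2 + y^2 + z^2 - xy$ is an arbitrary choice made for the example:

```python
import sympy as sp

x, y, z = sp.symbols('x y z', real=True)
f = x**2 + y**2 + z**2 - x*y  # illustrative test function (an assumption)

# Step I: critical points, where all three partial derivatives vanish
grad = [sp.diff(f, v) for v in (x, y, z)]
critical_points = sp.solve(grad, (x, y, z), dict=True)

# Steps II-III: Hessian at each critical point, classified by eigenvalue signs
H = sp.hessian(f, (x, y, z))
for pt in critical_points:
    eigs = list(H.subs(pt).eigenvals())
    if all(e > 0 for e in eigs):
        kind = "minimum"
    elif all(e < 0 for e in eigs):
        kind = "maximum"
    elif any(e > 0 for e in eigs) and any(e < 0 for e in eigs):
        kind = "saddle"
    else:
        kind = "inconclusive"  # a zero eigenvalue: the test gives no answer
    print(pt, kind)
```

For this test function the only critical point is the origin, with eigenvalues $1, 2, 3$, so it is reported as a minimum.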
Example
1) Find the local extrema of the function
$$f(x,y,z) = \sin(x+y+z) - \sin x - \sin y - \sin z, \qquad x, y, z \in (0, \pi)$$
Solution
Step I Find the critical points.
$$\frac{\partial f}{\partial x} = 0, \quad \frac{\partial f}{\partial y} = 0, \quad \frac{\partial f}{\partial z} = 0$$
gives
$$\cos(x+y+z) - \cos x = 0, \quad \cos(x+y+z) - \cos y = 0, \quad \cos(x+y+z) - \cos z = 0$$
$$\Rightarrow \cos x = \cos y = \cos z$$
Remember now the cosine function is one to one on the interval $(0, \pi)$, so necessarily $x = y = z$.
This leads to
$$\cos 3x - \cos x = 0 \Rightarrow -2\sin\frac{3x-x}{2}\,\sin\frac{3x+x}{2} = 0 \Rightarrow \sin x = 0 \text{ or } \sin 2x = 0$$
We get the solutions $x = k\pi$, respectively $2x = k\pi$, $k \in \mathbb{Z}$.
Only $x = \pi/2 \in (0, \pi)$.
We get just one critical point $x = \frac{\pi}{2}$, $y = \frac{\pi}{2}$, $z = \frac{\pi}{2}$.
Step II Compute the partial derivatives of second order.
$$\frac{\partial^2 f}{\partial x^2} = \frac{\partial}{\partial x}(\cos(x+y+z) - \cos x) = -\sin(x+y+z) + \sin x$$
$$\frac{\partial^2 f}{\partial y^2} = -\sin(x+y+z) + \sin y, \qquad \frac{\partial^2 f}{\partial z^2} = -\sin(x+y+z) + \sin z$$
$$\frac{\partial^2 f}{\partial x\partial y} = \frac{\partial}{\partial y}(\cos(x+y+z) - \cos x) = -\sin(x+y+z)$$
$$\frac{\partial^2 f}{\partial x\partial z} = -\sin(x+y+z), \qquad \frac{\partial^2 f}{\partial y\partial z} = -\sin(x+y+z)$$
For $x = \frac{\pi}{2}$, $y = \frac{\pi}{2}$, $z = \frac{\pi}{2}$ these partial derivatives are either $-\sin\frac{3\pi}{2} + \sin\frac{\pi}{2} = 2$ or $-\sin\frac{3\pi}{2} = 1$.
Step III. The Hessian matrix at the critical point $\left(\frac{\pi}{2}, \frac{\pi}{2}, \frac{\pi}{2}\right)$ is
$$H_f\left(\tfrac{\pi}{2}, \tfrac{\pi}{2}, \tfrac{\pi}{2}\right) = \begin{pmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \end{pmatrix}$$
Compute the eigenvalues:
$$\det\left(H_f - \lambda I\right) = \begin{vmatrix} 2-\lambda & 1 & 1 \\ 1 & 2-\lambda & 1 \\ 1 & 1 & 2-\lambda \end{vmatrix} = \begin{vmatrix} 4-\lambda & 4-\lambda & 4-\lambda \\ 1 & 2-\lambda & 1 \\ 1 & 1 & 2-\lambda \end{vmatrix} = (4-\lambda)\begin{vmatrix} 1 & 0 & 0 \\ 1 & 1-\lambda & 0 \\ 1 & 0 & 1-\lambda \end{vmatrix} = (4-\lambda)(1-\lambda)^2 = 0$$
The eigenvalues are $\lambda_1 = 4$, $\lambda_{2,3} = 1$, all positive, thus the Hessian matrix is positive definite and the point $\left(\frac{\pi}{2}, \frac{\pi}{2}, \frac{\pi}{2}\right)$ is a minimum point.
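The eigenvalue computation of Step III can be cross-checked numerically. A small sketch with NumPy (an add-on for verification, not part of the original solution):

```python
import numpy as np

# Hessian of f at (pi/2, pi/2, pi/2) from Step III
H = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

# eigvalsh is meant for symmetric matrices; it returns real eigenvalues
# in ascending order
eigenvalues = np.linalg.eigvalsh(H)
print(eigenvalues)  # [1. 1. 4.]
```

All three eigenvalues are positive, confirming the minimum.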

2) Find the local extrema of the function $f : \mathbb{R}^2 \to \mathbb{R}$,
$$f = f(x,y) = (x^2 + y^2)\,e^{-(x^2+y^2)}$$
Solution
Step I Find the critical points.
$$\frac{\partial f}{\partial x} = 2x\,e^{-(x^2+y^2)} + (x^2+y^2)\,e^{-(x^2+y^2)}(-2x) = e^{-(x^2+y^2)}\cdot 2x(1 - x^2 - y^2) = 0$$
$$\frac{\partial f}{\partial y} = 2y\,e^{-(x^2+y^2)} + (x^2+y^2)\,e^{-(x^2+y^2)}(-2y) = e^{-(x^2+y^2)}\cdot 2y(1 - x^2 - y^2) = 0$$
$$\Rightarrow x = 0,\; y = 0 \quad \text{or} \quad 1 - x^2 - y^2 = 0$$
One critical point is $(0, 0)$, and also all points $(a, b)$ with $a^2 + b^2 = 1$; these are located on the circle centered at $(0, 0)$ with radius 1.
Step II Compute the Hessian matrix.
The partial derivatives of first order may be written
$$\frac{\partial f}{\partial x} = 2(x - x^3 - xy^2)\,e^{-(x^2+y^2)}, \qquad \frac{\partial f}{\partial y} = 2(y - x^2y - y^3)\,e^{-(x^2+y^2)}$$
Now take the derivatives, noting $x - x^3 - xy^2 = x(1-x^2-y^2)$ and $y - x^2y - y^3 = y(1-x^2-y^2)$:
$$\frac{\partial^2 f}{\partial x^2} = 2e^{-(x^2+y^2)}\Big[\underbrace{1 - 3x^2 - y^2}_{(1-x^2-y^2)-2x^2} - 2x\cdot x(1-x^2-y^2)\Big]$$
$$\frac{\partial^2 f}{\partial y^2} = 2e^{-(x^2+y^2)}\Big[\underbrace{1 - x^2 - 3y^2}_{(1-x^2-y^2)-2y^2} - 2y\cdot y(1-x^2-y^2)\Big]$$
$$\frac{\partial^2 f}{\partial x\partial y} = 2e^{-(x^2+y^2)}\Big[-2xy - 2y\cdot x(1-x^2-y^2)\Big]$$
Step III
Now for the critical point $(0,0)$ we have
$$e^{-(0^2+0^2)} = 1, \quad \frac{\partial^2 f}{\partial x^2}(0,0) = 2, \quad \frac{\partial^2 f}{\partial y^2}(0,0) = 2, \quad \frac{\partial^2 f}{\partial x\partial y}(0,0) = 0$$
$$H_f(0,0) = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix}$$
It is a diagonal matrix, therefore the eigenvalues are precisely the diagonal entries $\lambda_1 = \lambda_2 = 2$, both positive, so $(0,0)$ is a minimum point.
For any critical point $(a,b)$ with $a^2 + b^2 = 1$ we have
$$e^{-(a^2+b^2)} = e^{-1} = \frac{1}{e}, \qquad 1 - a^2 - b^2 = 0$$
so the partial derivatives are (use $1 - a^2 - b^2 = 0$ everywhere)
$$\frac{\partial^2 f}{\partial x^2}(a,b) = \frac{2}{e}(-2a^2), \quad \frac{\partial^2 f}{\partial y^2}(a,b) = \frac{2}{e}(-2b^2), \quad \frac{\partial^2 f}{\partial x\partial y}(a,b) = \frac{2}{e}(-2ab)$$
The Hessian matrix is
$$H_f(a,b) = \begin{pmatrix} -\dfrac{4a^2}{e} & -\dfrac{4ab}{e} \\[2mm] -\dfrac{4ab}{e} & -\dfrac{4b^2}{e} \end{pmatrix}$$
The eigenvalues are given by
$$\det(H_f(a,b) - \lambda I) = \lambda^2 + \left(\frac{4a^2}{e} + \frac{4b^2}{e}\right)\lambda + \frac{4a^2}{e}\cdot\frac{4b^2}{e} - \frac{4ab}{e}\cdot\frac{4ab}{e} = 0$$
$$\Rightarrow \lambda^2 + \left(\frac{4a^2}{e} + \frac{4b^2}{e}\right)\lambda = 0 \Rightarrow \lambda_1 = 0, \quad \lambda_2 = -\left(\frac{4a^2}{e} + \frac{4b^2}{e}\right) = -\frac{4}{e} < 0$$
using $a^2 + b^2 = 1$. One eigenvalue is zero (it corresponds to the direction along the circle of critical points, on which $f$ is constant), so the Hessian test alone is inconclusive. However, $f$ depends only on $t = x^2 + y^2$, and $g(t) = t e^{-t}$ increases on $[0, 1]$ and decreases on $[1, \infty)$; therefore every critical point $(a,b)$ with $a^2 + b^2 = 1$ is a maximum point.
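A quick numerical sanity check of both conclusions. This is a sketch for verification only; the sampled points are an arbitrary choice:

```python
import numpy as np

def f(x, y):
    r2 = x**2 + y**2
    return r2 * np.exp(-r2)

# the critical point (0, 0) gives the smallest possible value, 0
assert f(0.0, 0.0) == 0.0

# every point of the unit circle gives the same value, 1/e
theta = np.linspace(0.0, 2.0*np.pi, 360)
assert np.allclose(f(np.cos(theta), np.sin(theta)), 1.0/np.e)

# points slightly inside or outside the circle give smaller values
assert f(0.9, 0.0) < 1.0/np.e
assert f(1.1, 0.0) < 1.0/np.e
```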

3) Find the extrema of the function $f : \mathbb{R}^2 \to \mathbb{R}$ defined as
$$f(x,y) = \ln\left(1 + \sqrt{x^2 + y^2}\right)$$
Solution
Step I Find the critical points.
$$\frac{\partial f}{\partial x} = \frac{1}{1 + \sqrt{x^2+y^2}}\cdot\frac{x}{\sqrt{x^2+y^2}} = 0 \Rightarrow x = 0$$
$$\frac{\partial f}{\partial y} = \frac{1}{1 + \sqrt{x^2+y^2}}\cdot\frac{y}{\sqrt{x^2+y^2}} = 0 \Rightarrow y = 0$$
As you can see, the only possible critical point is $(0, 0)$, but both partial derivatives fail to exist at $x = 0$, $y = 0$ (the denominator $\sqrt{x^2+y^2}$ makes no sense at $(0,0)$).
Remember the standard algorithm finds the extrema only where the function has partial derivatives.
Consequently no point $(x, y) \neq (0, 0)$ is an extreme point for $f$.
There is one last "chance" to find an extreme point, namely at $(0, 0)$.
We check this directly:
$$f(0,0) = \ln\left(1 + \sqrt{0^2 + 0^2}\right) = 0 \le \ln\left(1 + \sqrt{x^2+y^2}\right) = f(x,y)$$
which means $(0,0)$ is a minimum point.
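The direct check can also be done numerically. A sketch; the random sample is an arbitrary choice made for illustration:

```python
import numpy as np

def f(x, y):
    return np.log(1.0 + np.sqrt(x**2 + y**2))

rng = np.random.default_rng(0)
pts = rng.uniform(-5.0, 5.0, size=(1000, 2))

# f(0,0) = 0, and every sampled value is at least 0
assert f(0.0, 0.0) == 0.0
assert np.all(f(pts[:, 0], pts[:, 1]) >= 0.0)
```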

0.1.2 Local Extrema for functions defined implicitly
Case I. Functions of "one variable" defined implicitly
To define a function $f : D \subset \mathbb{R} \to \mathbb{R}$ "explicitly" means to give a precise formula like
$$f(x) = \sin x - 2, \qquad f(x) = \frac{2x+3}{1+x^2}, \;\dots$$
A function is defined "implicitly" whenever the correspondence $x \mapsto f(x)$ is granted, but it is impossible (or very hard) to produce a precise formula like before.
Example.
The equation $x^2 + y^2 = 4$, $y \ge 0$ defines "implicitly" a function (a correspondence) $x \mapsto y = y(x)$, since for each $x \in [-2, 2]$ there is a unique $y \in \mathbb{R}$ satisfying the equation.
In this very simple case we may actually produce a precise formula for $y$:
$$y = y(x) = \sqrt{4 - x^2}$$
which defines "explicitly" the function $y = y(x)$.
One major goal in studying a function is to find out ("predict") its extrema (maximum, minimum points), if there are any.
For "nice" functions, which are differentiable, there is a standard procedure to find the local extrema:
- find the "critical" points, that is, solve the equation $f'(x) = 0$
- then decide which of these points are extrema using the sign of the first derivative
For a function defined implicitly this procedure does not work as stated, since we don't have a formula for $f(x)$, thus no formula for its derivative $f'(x)$ either.
However, a closer look at this algorithm shows we actually don't need the formula of the derivative.
All we need are the "critical" points (at which the derivative is zero).
This may seem confusing: it looks impossible to produce the critical points without a formula for the derivative.
Well, it is possible. This is how it works.
Write the equation that implicitly defines the function $y = y(x)$:
$$x^2 + y(x)^2 = 4, \quad y \ge 0$$
and take the derivative (with respect to $x$):
$$\left(x^2 + y(x)^2\right)' = (4)' \Rightarrow 2x + 2y(x)\,y'(x) = 0 \Rightarrow y'(x) = -\frac{x}{y(x)} \quad \text{for } y(x) \neq 0$$
Now finding the critical points means
$$y'(x) = 0 \Rightarrow -\frac{x}{y(x)} = 0 \Rightarrow x = 0 \Rightarrow 0^2 + y(0)^2 = 4 \Rightarrow y(0) = 2 \ge 0$$
So we did find a critical point $x = 0$.
Next we need to decide whether it is an extremum or not.
We clearly cannot use the sign of $y'(x)$. We use the second derivative.
Remember Taylor's formula:
$$f(x) = f(a) + \frac{f'(a)}{1!}(x-a) + \frac{f''(a)}{2!}(x-a)^2 + \frac{f'''(a)}{3!}(x-a)^3 + \dots$$
If $a$ is a critical point, that is $f'(a) = 0$, we have
$$f(x) - f(a) = \underbrace{\frac{f'(a)}{1!}}_{0}(x-a) + \frac{f''(a)}{2!}(x-a)^2 + \frac{f'''(a)}{3!}(x-a)^3 + \dots$$
Consequently the sign of $f(x) - f(a)$ depends on the sign of the second derivative $f''(a)$ at the point $a$:
- for $f''(a) > 0$ we have $f(x) - f(a) \ge 0 \Rightarrow a$ is a minimum point
- for $f''(a) < 0$ we have $f(x) - f(a) \le 0 \Rightarrow a$ is a maximum point
- if $f''(a) = 0$ it is an "unfortunate" case: we need the derivatives of order 3, 4, ...
Therefore we don't need a formula for the second derivative, but only the value $f''(a)$; not even the precise value, but only the sign of $f''(a)$.
To get it, just take the derivative of $y'(x)$:
$$y''(x) = (y'(x))' = \left(-\frac{x}{y(x)}\right)' = -\frac{y(x) - x\,y'(x)}{y^2(x)}$$
and we get
$$y''(0) = -\frac{y(0) - 0\cdot y'(0)}{y^2(0)} = -\frac{2 - 0}{2^2} = -\frac{1}{2} < 0 \Rightarrow x = 0 \text{ is a maximum point}$$
In this very simple case we may as well apply the "standard" procedure:
$$y = y(x) = \sqrt{4 - x^2} \Rightarrow y'(x) = \frac{-x}{\sqrt{4 - x^2}}, \qquad y'(x) = 0 \Rightarrow \frac{-x}{\sqrt{4 - x^2}} = 0 \Rightarrow x = 0$$
Clearly the first derivative is positive on $(-2, 0)$ and negative on $(0, 2)$, so $y(x)$ increases up to $x = 0$ and decreases afterwards, which "certifies" the result obtained using the implicit method.
Finally, this is the algorithm to find extrema for functions defined implicitly:
- Step I take the derivative of the equation (implicit relation); do not forget to write $y(x)$ instead of just $y$
- Step II find the critical points, by using both $y'(x) = 0$ and the "original" equation
- Step III take one more derivative to find the sign of the second derivative at the critical points
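SymPy automates exactly these three steps through `idiff`, which differentiates an implicit relation without solving it. A sketch on the circle example above:

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
E = x**2 + y**2 - 4          # the implicit relation E(x, y) = 0

# Step I: implicit first and second derivatives
yp = sp.idiff(E, y, x)       # y'(x), expressed in terms of x and y
ypp = sp.idiff(E, y, x, 2)   # y''(x)
print(yp)                    # -x/y

# Step II: y' = 0 together with the equation, on the branch y >= 0
crit_x = sp.solve(sp.Eq(yp, 0), x)                        # [0]
y0 = [s for s in sp.solve(E.subs(x, 0), y) if s >= 0][0]  # y(0) = 2

# Step III: sign of y'' at the critical point
print(ypp.subs({x: 0, y: y0}))   # -1/2, so x = 0 is a maximum point
```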
Another way of stating all this (encountered in many textbooks) is the following.
Any equation (defining implicitly) looks like
$$E(x, y) = 0 \quad \text{or "better"} \quad E(x, y(x)) = 0$$
We proceed exactly as before.
Step I Take the derivative (with respect to $x$):
$$[E(x, y(x))]' = 0 \Rightarrow \frac{\partial E}{\partial x} + \frac{\partial E}{\partial y}\,y'(x) = 0 \Rightarrow y'(x) = -\frac{\partial E/\partial x}{\partial E/\partial y}, \quad \text{for } \frac{\partial E}{\partial y} \neq 0$$
Step II Find the critical points by solving the system
$$\begin{cases} y'(x) = 0 \\ E(x,y) = 0 \\ \dfrac{\partial E}{\partial y} \neq 0 \end{cases} \iff \begin{cases} \dfrac{\partial E}{\partial x} = 0 \\ E(x,y) = 0 \\ \dfrac{\partial E}{\partial y} \neq 0 \end{cases}$$
Step III Find the sign of the second derivative at the critical points:
$$y''(x) = (y'(x))' = \left(-\frac{\partial E/\partial x}{\partial E/\partial y}\right)' = -\frac{\left[\dfrac{\partial^2 E}{\partial x^2} + \dfrac{\partial^2 E}{\partial x\partial y}\,y'(x)\right]\dfrac{\partial E}{\partial y} - \dfrac{\partial E}{\partial x}\left[\dfrac{\partial^2 E}{\partial x\partial y} + \dfrac{\partial^2 E}{\partial y^2}\,y'(x)\right]}{\left(\dfrac{\partial E}{\partial y}\right)^2}$$
Computations are precisely the same as before; they only look different.
We consider this second way less intuitive, but anyone is free to choose whichever seems more familiar.
Example. Find the extrema of the function $y = y(x)$ defined implicitly by the equation
$$x^3 + 8y^3 - 6xy = 0$$
Solution.
First we need to check whether the equation really defines a unique correspondence $x \mapsto y(x)$.
This is not hard, but requires some care about details.
We want to prove, check or find special conditions such that for each $x$ there is a unique $y = y(x)$ verifying the equation.
Therefore look at the equation with $y$ as the unknown and $x$ as a parameter, and check whether the equation (in $y$) has a unique solution for each value of the parameter $x$.
Let $f(y) = x^3 + 8y^3 - 6xy$ and find how many solutions we get for $f(y) = 0$.
The first derivative is
$$f'(y) = \frac{\partial}{\partial y}\left(x^3 + 8y^3 - 6xy\right) = 24y^2 - 6x$$
next $f'(y) = 0 \Rightarrow 24y^2 - 6x = 0 \Rightarrow 4y^2 - x = 0$. A little discussion is needed here.
i) for any $x \le 0$ we get $f'(y) = 24y^2 - 6x \ge 0$ for all $y \in \mathbb{R}$, and consequently $f$ is increasing, thus $f(y) = 0$ has a unique solution.
ii) for any $x > 0$ we get $4y^2 - x = 0 \Rightarrow y_{1,2} = \pm\dfrac{\sqrt{x}}{2}$, and
$$f\left(-\tfrac{\sqrt{x}}{2}\right) = x^3 + 8\left(-\tfrac{\sqrt{x}}{2}\right)^3 - 6x\left(-\tfrac{\sqrt{x}}{2}\right) = x^3 - x\sqrt{x} + 3x\sqrt{x} = x^3 + 2x\sqrt{x} > 0$$
$$f\left(\tfrac{\sqrt{x}}{2}\right) = x^3 + 8\left(\tfrac{\sqrt{x}}{2}\right)^3 - 6x\left(\tfrac{\sqrt{x}}{2}\right) = x^3 - 2x\sqrt{x} = x\sqrt{x}\left(x\sqrt{x} - 2\right)$$
The sign of this depends on $x$, namely
- $x\sqrt{x}(x\sqrt{x} - 2) > 0$ for $x > \sqrt[3]{4}$ and
- $x\sqrt{x}(x\sqrt{x} - 2) < 0$ for $0 < x < \sqrt[3]{4}$
(Sign table of $f(y)$: starting from $-\infty$, $f$ increases to a local maximum at $y = -\sqrt{x}/2$, decreases to a local minimum at $y = \sqrt{x}/2$, then increases to $+\infty$.)
Consequently the cubic $f(y)$ has a unique root for $x \le 0$ and for $x > \sqrt[3]{4}$; for $0 < x < \sqrt[3]{4}$ its local maximum value is positive and its local minimum value is negative, so there are three roots (with a double root at $x = \sqrt[3]{4}$), and $y = y(x)$ is well defined only after selecting one branch of the curve.
To conclude, we may apply the implicit extreme points algorithm, working on a fixed branch where $y = y(x)$ is well defined.
This implies a rather involved discussion; we will try to keep it as simple as possible.
Step I take the derivative of the implicit relation $x^3 + 8y(x)^3 - 6x\,y(x) = 0$:
$$\left(x^3 + 8y(x)^3 - 6x\,y(x)\right)' = 0 \Rightarrow 3x^2 + 24y^2(x)\,y'(x) - 6y(x) - 6x\,y'(x) = 0$$
$$\Rightarrow x^2 - 2y(x) + y'(x)\left(8y^2(x) - 2x\right) = 0 \Rightarrow y'(x) = -\frac{x^2 - 2y(x)}{2\left(4y^2(x) - x\right)} \quad \text{for } 4y^2(x) - x \neq 0$$
Step II To find the critical points we solve $y'(x) = 0$ together with the original equation $x^3 + 8y^3 - 6xy = 0$, under the condition $4y^2(x) - x \neq 0$.
Therefore the critical points are the solutions of
$$\begin{cases} y'(x) = 0 \\ x^3 + 8y(x)^3 - 6x\,y(x) = 0 \\ 4y^2(x) - x \neq 0 \end{cases} \iff (*)\ \begin{cases} x^2 - 2y(x) = 0 \\ x^3 + 8y(x)^3 - 6x\,y(x) = 0 \\ 4y^2(x) - x \neq 0 \end{cases}$$
$$\Rightarrow y(x) = \frac{x^2}{2} \Rightarrow x^3 + 8\left(\frac{x^2}{2}\right)^3 - 6x\cdot\frac{x^2}{2} = 0 \Rightarrow x^3 + x^6 - 3x^3 = 0 \Rightarrow x^3\left(x^3 - 2\right) = 0 \Rightarrow x_1 = 0,\; x_2 = \sqrt[3]{2}$$
Both $x_1 = 0$ and $x_2 = \sqrt[3]{2}$ must now be checked against the remaining condition $4y^2(x) - x \neq 0$ in $(*)$.
For $x_1 = 0$, from $y(x) = \frac{x^2}{2}$ we get $y(0) = 0 \Rightarrow 4y^2(0) - 0 = 0$, so $4y^2(x) - x \neq 0$ is not verified; thus we reject $x_1 = 0$ as a possible critical point.
For $x_2 = \sqrt[3]{2}$, from $y(x) = \frac{x^2}{2}$ we get
$$y(\sqrt[3]{2}) = \frac{(\sqrt[3]{2})^2}{2} = \frac{\sqrt[3]{4}}{2}$$
This also verifies $4y^2(x) - x \neq 0$, since
$$4y^2(\sqrt[3]{2}) - \sqrt[3]{2} = 4\cdot\frac{\sqrt[3]{16}}{4} - \sqrt[3]{2} = 2\sqrt[3]{2} - \sqrt[3]{2} = \sqrt[3]{2} \neq 0$$
Consequently $x = \sqrt[3]{2}$ is a critical point ($y'(\sqrt[3]{2}) = 0$) and $y(\sqrt[3]{2}) = \frac{\sqrt[3]{4}}{2}$.
Step III take the second derivative of
$$y'(x) = -\frac{x^2 - 2y(x)}{2\left(4y^2(x) - x\right)}$$
which gives
$$y''(x) = (y'(x))' = -\frac{\left[2x - 2y'(x)\right]\left[4y^2(x) - x\right] - \left[x^2 - 2y(x)\right]\left[8y(x)y'(x) - 1\right]}{2\left(4y^2(x) - x\right)^2}$$
Remember now we compute this at a critical point $x$, which verifies all conditions $(*)$:
$$y'(x) = 0, \qquad x^2 - 2y(x) = 0$$
Consequently for such an $x$ we have
$$y''(x) = -\frac{\left[2x - 2\underbrace{y'(x)}_{0}\right]\left[4y^2(x) - x\right] - \underbrace{\left[x^2 - 2y(x)\right]}_{0}\left[8y(x)y'(x) - 1\right]}{2\left(4y^2(x) - x\right)^2} = -\frac{x}{4y^2(x) - x}$$
Also remember we only need the sign of $y''(x)$ (not necessarily the actual value), so it is much easier than it looks.
For the critical point $x = \sqrt[3]{2}$ all the terms containing $y'(x)$ or $x^2 - 2y(x)$ vanish, and since $4y^2(\sqrt[3]{2}) - \sqrt[3]{2} = 2\sqrt[3]{2} - \sqrt[3]{2} = \sqrt[3]{2} > 0$ we get
$$y''(\sqrt[3]{2}) = -\frac{\sqrt[3]{2}}{\sqrt[3]{2}} = -1 < 0$$
So finally $y''(\sqrt[3]{2}) < 0 \Rightarrow x = \sqrt[3]{2}$ is a maximum point, and $y(\sqrt[3]{2}) = \frac{\sqrt[3]{4}}{2}$ is the maximum value of the function $y = y(x)$.
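The critical point just found can be verified symbolically. A sketch with SymPy, declaring `y` as an unknown function exactly as in Step I:

```python
import sympy as sp

x = sp.symbols('x', real=True)
y = sp.Function('y')

eq = x**3 + 8*y(x)**3 - 6*x*y(x)

# Step I: differentiate the implicit relation and solve for y'(x)
yp = sp.solve(sp.Eq(sp.diff(eq, x), 0), sp.Derivative(y(x), x))[0]

# the critical point from Step II
x0 = sp.cbrt(2)
y0 = sp.cbrt(4) / 2

assert eq.subs(y(x), y0).subs(x, x0).equals(0)   # the point lies on the curve
assert yp.subs(y(x), y0).subs(x, x0).equals(0)   # y'(cbrt(2)) = 0
print("critical point verified")
```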

Case II. Functions of "several variables" defined implicitly

We just describe the main idea and give a very simple example.
Any relation implicitly defining a function of two variables $z = z(x, y)$ may be written as
$$E(x, y, z) = 0 \quad \text{or "better"} \quad E(x, y, z(x, y)) = 0$$
Remember how we find local extrema for functions of several variables; we do exactly the same here.
Step I Find the critical points. Take the partial derivatives:
$$\frac{\partial}{\partial x}E(x,y,z(x,y)) = 0 \Rightarrow \frac{\partial E}{\partial x} + \frac{\partial E}{\partial z}\frac{\partial z}{\partial x} = 0 \Rightarrow \frac{\partial z}{\partial x} = -\frac{\partial E/\partial x}{\partial E/\partial z} \quad \text{for } \frac{\partial E}{\partial z} \neq 0$$
$$\frac{\partial}{\partial y}E(x,y,z(x,y)) = 0 \Rightarrow \frac{\partial E}{\partial y} + \frac{\partial E}{\partial z}\frac{\partial z}{\partial y} = 0 \Rightarrow \frac{\partial z}{\partial y} = -\frac{\partial E/\partial y}{\partial E/\partial z} \quad \text{for } \frac{\partial E}{\partial z} \neq 0$$
and the critical points are the solutions of the system
$$\frac{\partial z}{\partial x} = 0, \qquad \frac{\partial z}{\partial y} = 0$$
Step II Compute the partial derivatives of second order and the Hessian matrix, only at the critical points. Then find whether it is positive definite, negative definite or indefinite.
Example.
Find the extreme points of the function $z = z(x, y)$ defined implicitly by
$$x^2 + y^2 + z^2 = 9, \quad z \ge 0$$
Solution
This example is not especially relevant, since we can actually find explicitly $z(x,y) = \sqrt{9 - x^2 - y^2}$; however, it is simple enough to convey the idea.
Step I Find the critical points. Take the partial derivatives in $x^2 + y^2 + z^2(x,y) = 9$:
$$\frac{\partial}{\partial x}\left(x^2 + y^2 + z^2(x,y)\right) = \frac{\partial}{\partial x}(9) \Rightarrow 2x + 2z(x,y)\frac{\partial z}{\partial x} = 0 \Rightarrow \frac{\partial z}{\partial x} = -\frac{x}{z(x,y)}$$
$$\frac{\partial}{\partial y}\left(x^2 + y^2 + z^2(x,y)\right) = \frac{\partial}{\partial y}(9) \Rightarrow 2y + 2z(x,y)\frac{\partial z}{\partial y} = 0 \Rightarrow \frac{\partial z}{\partial y} = -\frac{y}{z(x,y)}$$
The critical points are
$$\frac{\partial z}{\partial x} = 0,\ \frac{\partial z}{\partial y} = 0 \Rightarrow -\frac{x}{z(x,y)} = 0,\ -\frac{y}{z(x,y)} = 0 \Rightarrow x = 0,\; y = 0$$
Step II Compute the partial derivatives of second order:
$$\frac{\partial^2 z}{\partial x^2} = \frac{\partial}{\partial x}\left(-\frac{x}{z(x,y)}\right) = -\frac{z(x,y) - x\frac{\partial z}{\partial x}}{z^2(x,y)} = -\frac{z(x,y) + \frac{x^2}{z(x,y)}}{z^2(x,y)} = -\frac{z^2(x,y) + x^2}{z^3(x,y)}$$
$$\frac{\partial^2 z}{\partial y^2} = \frac{\partial}{\partial y}\left(-\frac{y}{z(x,y)}\right) = -\frac{z^2(x,y) + y^2}{z^3(x,y)}$$
$$\frac{\partial^2 z}{\partial x\partial y} = \frac{\partial}{\partial y}\left(-\frac{x}{z(x,y)}\right) = \frac{x\frac{\partial z}{\partial y}}{z^2(x,y)} = -\frac{xy}{z^3(x,y)}$$
For the critical point $(0,0)$ we have $0^2 + 0^2 + z^2(0,0) = 9 \Rightarrow z(0,0) = 3$ (since $z \ge 0$) and
$$\frac{\partial^2 z}{\partial x^2}(0,0) = -\frac{9}{27} = -\frac{1}{3}, \quad \frac{\partial^2 z}{\partial y^2}(0,0) = -\frac{1}{3}, \quad \frac{\partial^2 z}{\partial x\partial y}(0,0) = 0$$
The Hessian matrix at $(0,0)$ is
$$H_z(0,0) = \begin{pmatrix} \dfrac{\partial^2 z}{\partial x^2} & \dfrac{\partial^2 z}{\partial x\partial y} \\[2mm] \dfrac{\partial^2 z}{\partial x\partial y} & \dfrac{\partial^2 z}{\partial y^2} \end{pmatrix} = \begin{pmatrix} -\frac{1}{3} & 0 \\ 0 & -\frac{1}{3} \end{pmatrix}$$
It is a diagonal matrix; the eigenvalues are the diagonal entries $\lambda_1 = \lambda_2 = -\frac{1}{3}$, both negative, so the matrix is negative definite and the point $(0,0)$ is a maximum point.
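Because this example also has the explicit form $z = \sqrt{9 - x^2 - y^2}$, the Hessian can be cross-checked directly. A sketch with SymPy:

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
z = sp.sqrt(9 - x**2 - y**2)   # the explicit branch with z >= 0

# Hessian of the explicit function, evaluated at the critical point (0, 0)
H = sp.hessian(z, (x, y)).subs({x: 0, y: 0})
assert H == sp.Matrix([[sp.Rational(-1, 3), 0], [0, sp.Rational(-1, 3)]])
assert all(e < 0 for e in H.eigenvals())
print(H)
```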

0.1.3 Local Extrema with Constraints
We consider the following problem.
Find the local extrema of the function $f = f(x, y, z)$ under (several) "constraints" $g_1(x,y,z) = 0$, $g_2(x,y,z) = 0$.
(There is no restriction on the number of constraints; there could be one or as many as needed.)
Definition.
A point $(a, b, c)$ is a local extremum for the function $f$ if
$$f(x,y,z) - f(a,b,c) \ge 0 \quad \text{(a minimum)}$$
$$f(x,y,z) - f(a,b,c) \le 0 \quad \text{(a maximum)}$$
for all $(x,y,z)$ that verify the constraints
$$g_1(x,y,z) = 0, \qquad g_2(x,y,z) = 0$$
We present the Lagrange multipliers method.
Consider another function
$$F(x,y,z) = f(x,y,z) + \lambda_1 g_1(x,y,z) + \lambda_2 g_2(x,y,z)$$
The real numbers $\lambda_1, \lambda_2$ are called Lagrange multipliers, one for each constraint.
Assume $(a,b,c)$ is a point that verifies the constraints, that is
$$g_1(a,b,c) = 0, \qquad g_2(a,b,c) = 0$$
In order to establish whether such a point is a local extremum for $f$ we need the sign of $f(x,y,z) - f(a,b,c)$.
We write
$$F(x,y,z) - F(a,b,c) = f(x,y,z) - f(a,b,c) + \lambda_1\left[g_1(x,y,z) - g_1(a,b,c)\right] + \lambda_2\left[g_2(x,y,z) - g_2(a,b,c)\right]$$
For any point $(x,y,z)$ that verifies the constraints we have
$$F(x,y,z) - F(a,b,c) = f(x,y,z) - f(a,b,c) + \lambda_1\Big[\underbrace{g_1(x,y,z)}_{0} - \underbrace{g_1(a,b,c)}_{0}\Big] + \lambda_2\Big[\underbrace{g_2(x,y,z)}_{0} - \underbrace{g_2(a,b,c)}_{0}\Big]$$
Therefore
$$F(x,y,z) - F(a,b,c) = f(x,y,z) - f(a,b,c)$$
Consequently we need to investigate the local extrema of the function $F$.
The partial derivatives are
$$\frac{\partial F}{\partial x} = \frac{\partial f}{\partial x} + \lambda_1\frac{\partial g_1}{\partial x} + \lambda_2\frac{\partial g_2}{\partial x}$$
$$\frac{\partial F}{\partial y} = \frac{\partial f}{\partial y} + \lambda_1\frac{\partial g_1}{\partial y} + \lambda_2\frac{\partial g_2}{\partial y}$$
$$\frac{\partial F}{\partial z} = \frac{\partial f}{\partial z} + \lambda_1\frac{\partial g_1}{\partial z} + \lambda_2\frac{\partial g_2}{\partial z}$$
Step I Find the critical points (with constraints) by solving the system (we get solutions for $x, y, z$ and $\lambda_1, \lambda_2$):
$$\begin{cases} \dfrac{\partial F}{\partial x} = 0 \\ \dfrac{\partial F}{\partial y} = 0 \\ \dfrac{\partial F}{\partial z} = 0 \\ g_1(x,y,z) = 0 \\ g_2(x,y,z) = 0 \end{cases} \iff \begin{cases} \dfrac{\partial f}{\partial x} + \lambda_1\dfrac{\partial g_1}{\partial x} + \lambda_2\dfrac{\partial g_2}{\partial x} = 0 \\ \dfrac{\partial f}{\partial y} + \lambda_1\dfrac{\partial g_1}{\partial y} + \lambda_2\dfrac{\partial g_2}{\partial y} = 0 \\ \dfrac{\partial f}{\partial z} + \lambda_1\dfrac{\partial g_1}{\partial z} + \lambda_2\dfrac{\partial g_2}{\partial z} = 0 \\ g_1(x,y,z) = 0 \\ g_2(x,y,z) = 0 \end{cases}$$
Step II Compute the Hessian matrix of the function $F$ at the critical points:
$$H_F = \begin{pmatrix} \dfrac{\partial^2 F}{\partial x^2} & \dfrac{\partial^2 F}{\partial x\partial y} & \dfrac{\partial^2 F}{\partial x\partial z} \\[2mm] \dfrac{\partial^2 F}{\partial x\partial y} & \dfrac{\partial^2 F}{\partial y^2} & \dfrac{\partial^2 F}{\partial y\partial z} \\[2mm] \dfrac{\partial^2 F}{\partial x\partial z} & \dfrac{\partial^2 F}{\partial y\partial z} & \dfrac{\partial^2 F}{\partial z^2} \end{pmatrix}$$
Step III
If for a critical point $(a,b,c)$ the Hessian matrix is
- positive definite, then $(a,b,c)$ is a minimum for $f$
- negative definite, then $(a,b,c)$ is a maximum for $f$
- indefinite, then we need to "differentiate the constraints" and only then decide whether $(a,b,c)$ is an extremum for $f$ or not
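Step I of the method reduces to solving a polynomial system, which SymPy can do directly. A sketch using the function of Example 1 below ($f = xy + yz + xz$ with the constraint $xyz = 8$); declaring the variables positive matches the domain $x, y, z > 0$ and prunes spurious solutions:

```python
import sympy as sp

x, y, z = sp.symbols('x y z', positive=True)   # the domain x, y, z > 0
lam = sp.symbols('lam', real=True)

f = x*y + y*z + x*z
g = x*y*z - 8
F = f + lam*g

# Step I: gradient of F together with the constraint
eqs = [sp.diff(F, v) for v in (x, y, z)] + [g]
sols = sp.solve(eqs, (x, y, z, lam), dict=True)
print(sols)   # one real solution: x = y = z = 2, lam = -1
```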
Examples.
1) Consider the function $f = f(x,y,z) = xy + yz + xz$ defined for $x, y, z > 0$.
Find the local extrema under the constraint $xyz = 8$.
Solution
Let $g(x,y,z) = xyz - 8$. Consider the function
$$F(x,y,z) = f(x,y,z) + \lambda g(x,y,z) = xy + yz + xz + \lambda(xyz - 8)$$
Step I Find the critical points with constraint:
$$\begin{cases} \dfrac{\partial F}{\partial x} = 0 \\ \dfrac{\partial F}{\partial y} = 0 \\ \dfrac{\partial F}{\partial z} = 0 \\ g(x,y,z) = 0 \end{cases} \iff \begin{cases} y + z + \lambda yz = 0 \\ x + z + \lambda xz = 0 \\ y + x + \lambda xy = 0 \\ xyz = 8 \end{cases}$$
Subtract the first two equations to get
$$y - x + \lambda z(y - x) = 0 \Rightarrow (y - x)(1 + \lambda z) = 0 \Rightarrow y - x = 0 \text{ or } 1 + \lambda z = 0$$
Subtract the last two equations to get
$$y - z + \lambda x(y - z) = 0 \Rightarrow (y - z)(1 + \lambda x) = 0 \Rightarrow y - z = 0 \text{ or } 1 + \lambda x = 0$$
It follows that
- either $x = y = z$,
- or $1 + \lambda z = 0$, $1 + \lambda x = 0$, $1 + \lambda y = 0$, which also implies $x = y = z$.
Therefore necessarily $x = y = z$;
replace in $xyz = 8 \Rightarrow x^3 = 8 \Rightarrow x = 2$, $y = 2$, $z = 2$,
then replace these in $y + z + \lambda yz = 0 \Rightarrow 2 + 2 + 4\lambda = 0 \Rightarrow \lambda = -1$.
Consequently we get a critical point $(2, 2, 2)$ with $\lambda = -1$.
Step II Compute the Hessian matrix of $F$ (with $\lambda = -1$):
$$\frac{\partial^2 F}{\partial x^2} = \frac{\partial}{\partial x}(y + z - yz) = 0, \quad \frac{\partial^2 F}{\partial y^2} = \frac{\partial}{\partial y}(x + z - xz) = 0, \quad \frac{\partial^2 F}{\partial z^2} = \frac{\partial}{\partial z}(y + x - xy) = 0$$
$$\frac{\partial^2 F}{\partial x\partial y} = \frac{\partial}{\partial y}(y + z - yz) = 1 - z, \quad \frac{\partial^2 F}{\partial x\partial z} = \frac{\partial}{\partial z}(y + z - yz) = 1 - y, \quad \frac{\partial^2 F}{\partial y\partial z} = \frac{\partial}{\partial z}(x + z - xz) = 1 - x$$
The Hessian matrix at $(2,2,2)$ is
$$H_F(2,2,2) = \begin{pmatrix} 0 & -1 & -1 \\ -1 & 0 & -1 \\ -1 & -1 & 0 \end{pmatrix}$$
Step III
Compute the eigenvalues (add all rows to the top row, then subtract the first column from the other two columns):
$$\det\begin{pmatrix} -\lambda & -1 & -1 \\ -1 & -\lambda & -1 \\ -1 & -1 & -\lambda \end{pmatrix} = \begin{vmatrix} -(2+\lambda) & -(2+\lambda) & -(2+\lambda) \\ -1 & -\lambda & -1 \\ -1 & -1 & -\lambda \end{vmatrix} = -(2+\lambda)\begin{vmatrix} 1 & 0 & 0 \\ -1 & 1-\lambda & 0 \\ -1 & 0 & 1-\lambda \end{vmatrix} = -(2+\lambda)(1-\lambda)^2 = 0$$
The eigenvalues are $\lambda_{1,2} = 1$, $\lambda_3 = -2$, which implies the matrix is indefinite.
We therefore "differentiate" the constraint:
$$g(x,y,z) = 0 \Rightarrow \frac{\partial g}{\partial x}dx + \frac{\partial g}{\partial y}dy + \frac{\partial g}{\partial z}dz = 0$$
where
$$\frac{\partial g}{\partial x} = \frac{\partial}{\partial x}(xyz - 8) = yz, \qquad \frac{\partial g}{\partial y} = xz, \qquad \frac{\partial g}{\partial z} = xy$$
Now for $x = 2$, $y = 2$, $z = 2$ we get
$$\frac{\partial g}{\partial x}(2,2,2) = \frac{\partial g}{\partial y}(2,2,2) = \frac{\partial g}{\partial z}(2,2,2) = 4$$
So the differential relation at $(2,2,2)$ is
$$4\,dx + 4\,dy + 4\,dz = 0 \Rightarrow dz = -dx - dy$$
Next compute these as if they were real numbers: $dx\,dx = (dx)^2$, $dx\,dz + dy\,dz = dz(dx + dy)$, ...
Remember the quadratic form corresponding to a symmetric matrix, here the Hessian matrix at $(2,2,2)$:
$$\begin{pmatrix} 0 & -1 & -1 \\ -1 & 0 & -1 \\ -1 & -1 & 0 \end{pmatrix}$$
The corresponding quadratic form is
$$P = 0(dx)^2 + 0(dy)^2 + 0(dz)^2 - 2\,dx\,dy - 2\,dx\,dz - 2\,dy\,dz = -2\left[dx\,dy + dz(dx + dy)\right] =$$
$$= -2\left[dx\,dy - (dx + dy)(dx + dy)\right] = -2\left[dx\,dy - (dx)^2 - 2\,dx\,dy - (dy)^2\right] = 2\left[(dx)^2 + dx\,dy + (dy)^2\right] > 0$$
since for any real numbers $(a, b) \neq (0, 0)$ we have $a^2 + ab + b^2 > 0$.
Therefore the quadratic form at $x = 2$, $y = 2$, $z = 2$ (and using the constraint) is positive definite, which implies $(2,2,2)$ is a minimum point for the function $f$.
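The conclusion can be stress-tested numerically by sampling the constraint surface near $(2,2,2)$. A sketch; the parametrization $z = 8/(xy)$ simply enforces $xyz = 8$:

```python
import numpy as np

def f(x, y, z):
    return x*y + y*z + x*z

base = f(2.0, 2.0, 2.0)   # value at the critical point: 12

rng = np.random.default_rng(1)
for _ in range(1000):
    x, y = 2.0 + rng.uniform(-0.1, 0.1, size=2)
    z = 8.0 / (x * y)     # stay on the constraint xyz = 8
    assert f(x, y, z) >= base
```

No sampled point on the constraint beats the value 12, consistent with $(2,2,2)$ being a constrained minimum.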

2) Consider the curve defined by $x^2 + xy + y^2 = 1$.
Find the points on the curve for which the distance to the origin $(0,0)$ is maximum.
Solution
The distance from a point $(x,y)$ to $(0,0)$ is $\sqrt{x^2 + y^2}$.
Since the square root is increasing, it is enough to find the extrema of the function
$$d = d(x,y) = x^2 + y^2$$
with the constraint
$$x^2 + xy + y^2 = 1 \iff x^2 + xy + y^2 - 1 = 0$$
Step I Consider
$$F(x,y) = x^2 + y^2 + \lambda(x^2 + xy + y^2 - 1)$$
The critical points are the solutions of the system
$$\begin{cases} \dfrac{\partial F}{\partial x} = 2x + \lambda(2x + y) = 0 \\ \dfrac{\partial F}{\partial y} = 2y + \lambda(x + 2y) = 0 \\ x^2 + xy + y^2 - 1 = 0 \end{cases}$$
Subtract the first two equations to get
$$2(x - y) + \lambda(x - y) = 0 \Rightarrow (x - y)(2 + \lambda) = 0$$
So either $x = y$, in which case $x^2 + x^2 + x^2 - 1 = 0 \Rightarrow 3x^2 = 1 \Rightarrow x_{1,2} = \pm\frac{1}{\sqrt{3}}$, and $2x + 3\lambda x = 0 \Rightarrow 2 + 3\lambda = 0 \Rightarrow \lambda = -\frac{2}{3}$;
or $2 + \lambda = 0$, thus $\lambda = -2$, which implies
$$\begin{cases} 2x - 4x - 2y = 0 \\ 2y - 2x - 4y = 0 \\ x^2 + xy + y^2 - 1 = 0 \end{cases} \Rightarrow \begin{cases} -x - y = 0 \\ -x - y = 0 \\ x^2 + xy + y^2 - 1 = 0 \end{cases} \Rightarrow y = -x \Rightarrow x^2 - x^2 + x^2 - 1 = 0 \Rightarrow x = \pm 1$$
To conclude, we get several critical points with constraint:
- $\left(\frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}\right)$, $\left(-\frac{1}{\sqrt{3}}, -\frac{1}{\sqrt{3}}\right)$ with $\lambda = -\frac{2}{3}$, respectively
- $(1, -1)$, $(-1, 1)$ with $\lambda = -2$
Step II Compute the partial derivatives of second order:
$$\frac{\partial^2 F}{\partial x^2} = \frac{\partial}{\partial x}(2x + \lambda(2x + y)) = 2 + 2\lambda, \quad \frac{\partial^2 F}{\partial y^2} = 2 + 2\lambda, \quad \frac{\partial^2 F}{\partial x\partial y} = \frac{\partial}{\partial y}(2x + \lambda(2x + y)) = \lambda$$
The Hessian matrices at the critical points are
$$H_F\left(\tfrac{1}{\sqrt{3}}, \tfrac{1}{\sqrt{3}}\right) = H_F\left(-\tfrac{1}{\sqrt{3}}, -\tfrac{1}{\sqrt{3}}\right) = \begin{pmatrix} 2 + 2\lambda & \lambda \\ \lambda & 2 + 2\lambda \end{pmatrix}\bigg|_{\lambda = -2/3} = \begin{pmatrix} \frac{2}{3} & -\frac{2}{3} \\ -\frac{2}{3} & \frac{2}{3} \end{pmatrix}$$
$$H_F(1,-1) = H_F(-1,1) = \begin{pmatrix} 2 + 2\lambda & \lambda \\ \lambda & 2 + 2\lambda \end{pmatrix}\bigg|_{\lambda = -2} = \begin{pmatrix} -2 & -2 \\ -2 & -2 \end{pmatrix}$$
Step III
For the matrix $H_F\left(\pm\frac{1}{\sqrt{3}}, \pm\frac{1}{\sqrt{3}}\right)$ the eigenvalues are given by
$$\det(H_F - \mu I) = \det\begin{pmatrix} \frac{2}{3} - \mu & -\frac{2}{3} \\ -\frac{2}{3} & \frac{2}{3} - \mu \end{pmatrix} = \mu^2 - \frac{4}{3}\mu + \frac{4}{9} - \frac{4}{9} = \mu^2 - \frac{4}{3}\mu = 0 \Rightarrow \mu_1 = 0,\; \mu_2 = \frac{4}{3}$$
Since one eigenvalue is zero, we differentiate the constraint as in Example 1: at $\left(\frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}\right)$ we get $\sqrt{3}\,dx + \sqrt{3}\,dy = 0$, i.e. $dy = -dx$, and the quadratic form becomes $\frac{2}{3}(dx - dy)^2 = \frac{8}{3}(dx)^2 > 0$, so both critical points $\left(\frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}\right)$, $\left(-\frac{1}{\sqrt{3}}, -\frac{1}{\sqrt{3}}\right)$ are minimum points.
The minimal distance is
$$\sqrt{\left(\tfrac{1}{\sqrt{3}}\right)^2 + \left(\tfrac{1}{\sqrt{3}}\right)^2} = \sqrt{\tfrac{2}{3}}$$
For the matrix $H_F(1,-1)$ the eigenvalues are given by
$$\det(H_F - \mu I) = \det\begin{pmatrix} -2 - \mu & -2 \\ -2 & -2 - \mu \end{pmatrix} = \mu^2 + 4\mu + 4 - 4 = \mu^2 + 4\mu = 0 \Rightarrow \mu_1 = 0,\; \mu_2 = -4$$
Again one eigenvalue is zero; differentiating the constraint at $(1,-1)$ gives $(2x+y)\,dx + (x+2y)\,dy = dx - dy = 0$, i.e. $dy = dx$, and the quadratic form becomes $-2(dx + dy)^2 = -8(dx)^2 < 0$, so both critical points $(1,-1)$, $(-1,1)$ are maximum points.
The maximal distance is
$$\sqrt{1^2 + (-1)^2} = \sqrt{2}$$

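Both distances can be confirmed by scanning a parametrization of the curve. A sketch; the rotated coordinates below are an auxiliary choice used only for sampling:

```python
import numpy as np

# in rotated coordinates u = (x+y)/sqrt(2), v = (x-y)/sqrt(2) the curve
# x^2 + x*y + y^2 = 1 becomes the ellipse (3/2)u^2 + (1/2)v^2 = 1
t = np.linspace(0.0, 2.0*np.pi, 200001)
u = np.sqrt(2.0/3.0) * np.cos(t)
v = np.sqrt(2.0) * np.sin(t)
x = (u + v) / np.sqrt(2.0)
y = (u - v) / np.sqrt(2.0)

assert np.allclose(x**2 + x*y + y**2, 1.0)   # the samples lie on the curve

d = np.sqrt(x**2 + y**2)
assert np.isclose(d.min(), np.sqrt(2.0/3.0))   # minimal distance sqrt(2/3)
assert np.isclose(d.max(), np.sqrt(2.0))       # maximal distance sqrt(2)
```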
