mathcentre community project
encouraging academics to share maths support resources
All mccp resources are released under a Creative Commons licence
mccp-richard-6
For the help you need to support your course
Differentiation for Economics and Business Studies
Multi-Variable Functions
Author: Morgiane Richard, University of Aberdeen
Reviewer: Anthony Cronin, University College Dublin
This leaflet is an overview of the rules of partial differentiation and methods of optimization of functions in Economics and Business Studies.
Partial Differentiation
First partial derivatives: given a function f(x, y):

• the first derivative of f with respect to x, ∂f/∂x or fx, is obtained by differentiating f treating y as a constant;

• the first derivative of f with respect to y, ∂f/∂y or fy, is obtained by differentiating f treating x as a constant (both are illustrated in the sketch below).
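The following short Python sketch (not part of the original leaflet) shows how these two first partial derivatives can be checked with the sympy library; the function f(x, y) = x²y + 3xy³ is an arbitrary example chosen here.

```python
# A minimal sketch, assuming sympy is installed; f is an arbitrary example.
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 * y + 3*x*y**3        # example function f(x, y)

fx = sp.diff(f, x)             # treat y as a constant: 2*x*y + 3*y**3
fy = sp.diff(f, y)             # treat x as a constant: x**2 + 9*x*y**2
print(fx, fy)
```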
There are four second partial derivatives:

• ∂²f/∂x² = fxx is obtained by differentiating ∂f/∂x with respect to x once more (treating y as a constant again);

• ∂²f/∂y² = fyy is obtained by differentiating ∂f/∂y with respect to y once more (treating x as a constant again);

• ∂²f/∂y∂x = fxy is obtained by differentiating ∂f/∂x with respect to y (now treating x as a constant);

• ∂²f/∂x∂y = fyx is obtained by differentiating ∂f/∂y with respect to x (now treating y as a constant).

In most cases, and possibly in all cases you will encounter in your studies, we have ∂²f/∂y∂x = ∂²f/∂x∂y, as the sketch below checks for the same example function.
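Continuing the sympy sketch above (again with the arbitrary example f(x, y) = x²y + 3xy³), the four second partial derivatives can be computed, and the equality of the mixed partials verified:

```python
# Sketch: the four second partial derivatives of the example function,
# plus a check that the two mixed partials agree.
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 * y + 3*x*y**3

fxx = sp.diff(f, x, x)           # d²f/dx²  = 2*y
fyy = sp.diff(f, y, y)           # d²f/dy²  = 18*x*y
fxy = sp.diff(sp.diff(f, x), y)  # d²f/dydx = 2*x + 9*y**2
fyx = sp.diff(sp.diff(f, y), x)  # d²f/dxdy = 2*x + 9*y**2
print(sp.simplify(fxy - fyx) == 0)   # True: the mixed partials are equal
```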
Unconstrained Optimization
First Order Conditions (FOC): if a point (x0, y0) is such that ∂f/∂x(x0, y0) = 0 AND ∂f/∂y(x0, y0) = 0, then it is a stationary point.

Second Order Conditions (SOC): the stationary point (x0, y0) is:

• a minimum if (∂²f/∂x²)(∂²f/∂y²) − (∂²f/∂x∂y)² > 0 and ∂²f/∂x² > 0, ∂²f/∂y² > 0;

• a maximum if (∂²f/∂x²)(∂²f/∂y²) − (∂²f/∂x∂y)² > 0 and ∂²f/∂x² < 0, ∂²f/∂y² < 0;

• a saddle point if (∂²f/∂x²)(∂²f/∂y²) − (∂²f/∂x∂y)² < 0.

A sketch applying these two conditions follows.
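The sketch below (my own illustration, with f(x, y) = x³ − 3x + y² as an arbitrary example) finds the stationary points from the FOC with sympy and classifies each one using the SOC test above:

```python
# Sketch: solve the FOC, then classify each stationary point with the SOC.
import sympy as sp

x, y = sp.symbols('x y')
f = x**3 - 3*x + y**2            # example function

fx, fy = sp.diff(f, x), sp.diff(f, y)
points = sp.solve([fx, fy], [x, y], dict=True)   # FOC: fx = 0 AND fy = 0

for p in points:
    fxx = sp.diff(f, x, x).subs(p)
    fyy = sp.diff(f, y, y).subs(p)
    fxy = sp.diff(f, x, y).subs(p)
    D = fxx*fyy - fxy**2         # the SOC discriminant
    if D > 0 and fxx > 0:
        kind = 'minimum'
    elif D > 0 and fxx < 0:
        kind = 'maximum'
    elif D < 0:
        kind = 'saddle point'
    else:
        kind = 'inconclusive'
    print(p, kind)               # (1, 0) minimum; (-1, 0) saddle point
```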
Hessian matrix: this method is especially well suited to the case of functions of more than two variables. The Hessian matrix of a function of n variables f(x_1, x_2, \cdots, x_n) is the matrix whose coefficients are the second partial derivatives of f:

H(f) = \begin{pmatrix} f_{x_1 x_1} & f_{x_1 x_2} & \cdots & f_{x_1 x_n} \\ f_{x_2 x_1} & f_{x_2 x_2} & \cdots & f_{x_2 x_n} \\ \vdots & \vdots & \ddots & \vdots \\ f_{x_n x_1} & f_{x_n x_2} & \cdots & f_{x_n x_n} \end{pmatrix}

The Hessian matrix is symmetric because we have f_{x_i x_j} = f_{x_j x_i}. We also define the principal minors of the Hessian matrix as the following determinants:

f_{x_1 x_1}, \quad \begin{vmatrix} f_{x_1 x_1} & f_{x_1 x_2} \\ f_{x_2 x_1} & f_{x_2 x_2} \end{vmatrix}, \quad \begin{vmatrix} f_{x_1 x_1} & f_{x_1 x_2} & f_{x_1 x_3} \\ f_{x_2 x_1} & f_{x_2 x_2} & f_{x_2 x_3} \\ f_{x_3 x_1} & f_{x_3 x_2} & f_{x_3 x_3} \end{vmatrix}, \quad \cdots, \quad \det H(f)

Then we have:

• A stationary point (x0, y0) is a minimum if all the principal minors are strictly positive;

• A stationary point (x0, y0) is a maximum if the principal minors alternate in sign, with f_{x_1 x_1} < 0.

A sketch computing these minors follows.
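A minimal sympy sketch (my own example: a quadratic function of three variables, not from the leaflet) builds the Hessian and evaluates the principal minors at the stationary point; here all three are strictly positive, so the point is a minimum:

```python
# Sketch: Hessian of a function of three variables and its principal minors.
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')
f = x1**2 + 2*x2**2 + 3*x3**2 + x1*x2          # example function

H = sp.hessian(f, (x1, x2, x3))                # matrix of second partials
point = {x1: 0, x2: 0, x3: 0}                  # stationary point of f

# k-th principal minor: determinant of the top-left k-by-k submatrix
minors = [H[:k, :k].det().subs(point) for k in range(1, 4)]
print(minors)          # [2, 7, 42]: all strictly positive => minimum
```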
Constrained Optimization: Lagrange Multipliers

This method is used to find the optimum of a function of two or more variables when the variables are under a specific constraint:

Find the maximum (or minimum) of the function f(x_1, x_2, \cdots, x_n) given that: g(x_1, x_2, \cdots, x_n) ≤ M.

The constraint g(x_1, x_2, \cdots, x_n) ≤ M will often have the form p_1 x_1 + p_2 x_2 + \cdots + p_n x_n ≤ M. The Lagrange multipliers method does not tell you whether the optimum is a maximum or a minimum; however, you will always be asked to locate either a minimum or a maximum, never to determine the nature of the optimum.

Given the Lagrangian function

L(x_1, x_2, \cdots, x_n, \lambda) = f(x_1, x_2, \cdots, x_n) + \lambda \big( g(x_1, x_2, \cdots, x_n) - M \big),

the optimum of the function f is such that:

\frac{\partial L}{\partial x_1}(x_1, \cdots, x_n, \lambda) = 0
\qquad \vdots
\frac{\partial L}{\partial x_n}(x_1, \cdots, x_n, \lambda) = 0
\frac{\partial L}{\partial \lambda}(x_1, \cdots, x_n, \lambda) = 0

This gives a system of n + 1 simultaneous equations in n + 1 unknowns; solving the system will give you the value (x_{10}, \cdots, x_{n0}) of the optimum. A sketch of this procedure for a two-variable budget constraint follows.
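The sketch below (my own worked example, not from the leaflet) maximizes f = x1·x2 subject to the budget constraint 2x1 + 4x2 = 40 by solving the n + 1 first-order conditions of the Lagrangian with sympy:

```python
# Sketch: Lagrange multipliers for an assumed two-variable budget problem.
import sympy as sp

x1, x2, lam = sp.symbols('x1 x2 lam')
f = x1 * x2                      # example objective
g = 2*x1 + 4*x2                  # p1*x1 + p2*x2, with M = 40
L = f + lam * (g - 40)           # the Lagrangian

# System: dL/dx1 = 0, dL/dx2 = 0, dL/dlam = 0  (3 equations, 3 unknowns)
eqs = [sp.diff(L, v) for v in (x1, x2, lam)]
sol = sp.solve(eqs, [x1, x2, lam], dict=True)
print(sol)                       # x1 = 10, x2 = 5 at the optimum
```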
www.mathcentre.ac.uk