
Refresher course: Maths in Economics
(Peerapat Jatukannyaprateep)
***For simplicity, any function in this note is assumed to be continuous and twice differentiable unless stated
otherwise.***
Let y be a function of x,
y = f(x)
We read this as "y is a function of x". Here y is called the dependent variable (output) and x is called the
independent/explanatory variable (input).
[Diagram: Input (x) -> Function f(x) -> Output (y)]
Example: y = f(x) = (2x + 3)^2

  x :  -3  -2  -1   0   1   2   3
  y :   9   1   1   9  25  49  81

Slope: the slope is given by the change in y over the change in x, Δy / Δx.
Differentiation: a method to compute the slope of a function (the rate at which the dependent variable y changes
with respect to a change in the independent variable x).
Derivative: a measure of how a function changes as its input changes. The derivative is the ratio of
the infinitesimal change in the dependent variable to the infinitesimal change in the independent variable.
"Infinitesimal": smaller in magnitude than any positive number but not equal to zero (approaching zero).
"The derivative of y with respect to x" can be found by any of the following three methods:
Forward Difference

  dy/dx = df(x)/dx = lim_{h->0} [f(x + h) - f(x)] / h

Backward Difference

  dy/dx = df(x)/dx = lim_{h->0} [f(x) - f(x - h)] / h

Central Difference

  dy/dx = df(x)/dx = lim_{h->0} [f(x + h/2) - f(x - h/2)] / h ≡ lim_{h->0} [f(x + h) - f(x - h)] / (2h)
All three methods give the same result as h approaches zero.
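To make the three formulas concrete, here is a minimal Python sketch (ours, not part of the original note) that approximates the derivative of f(x) = (2x + 3)^2 at x = 1; the helper names forward_diff, backward_diff and central_diff are just illustrative choices.

```python
# Numerical approximation of dy/dx for f(x) = (2x + 3)^2.
# The exact derivative is 4(2x + 3), so the answer at x = 1 should be 20.

def f(x):
    return (2 * x + 3) ** 2

def forward_diff(f, x, h=1e-5):
    return (f(x + h) - f(x)) / h

def backward_diff(f, x, h=1e-5):
    return (f(x) - f(x - h)) / h

def central_diff(f, x, h=1e-5):
    return (f(x + h) - f(x - h)) / (2 * h)

x = 1.0
print(forward_diff(f, x))    # ~20.00004 (overshoots by about 4h)
print(backward_diff(f, x))   # ~19.99996 (undershoots by about 4h)
print(central_diff(f, x))    # ~20.0     (the two errors cancel)
```

As h is made smaller, all three approximations converge to the same value, which is exactly the statement above.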
Linear function
y is a linear function of x if it can be written in the following form:
y = bx + c
where b and c are constants.
The characteristic of a linear function is that the slope (derivative) is constant. (The slope is the same no matter
what the value of x is.)
Example: y = 3x + 2

  x :  -3  -2  -1   0   1   2   3
  y :  -7  -4  -1   2   5   8  11
  dy/dx = lim_{h->0} [f(x + h) - f(x)] / h = lim_{h->0} [(3(x + h) + 2) - (3x + 2)] / h = 3

dy/dx does not depend on the independent variable x. (It is not a function of x.)
  x             :  -2  -1   0   1   2
  dy/dx (slope) :   3   3   3   3   3
Non-linear function
Any function that is not linear, including logarithmic and exponential functions, is called a non-linear function.
The slope of a non-linear function is not constant; it depends on the independent variable (input). (It is a function
of x.)
Example: y = x^2 + 3

  x :  -3  -2  -1   0   1   2   3
  y :  12   7   4   3   4   7  12
  dy/dx = lim_{h->0} [f(x + h) - f(x)] / h
        = lim_{h->0} [((x + h)^2 + 3) - (x^2 + 3)] / h
        = lim_{h->0} [(x^2 + 2xh + h^2 + 3) - (x^2 + 3)] / h
        = lim_{h->0} (2x + h) = 2x

The result above shows that the slope of the function is twice the value of the independent variable, i.e. it is not
constant.
  x     :  -2  -1   0   1   2
  slope :  -4  -2   0   2   4
Derivatives of elementary functions
Note that we also use the notation f'(x) for the derivative of f(x).
Let c be a constant.

  f(x)                          df(x)/dx
  Constant rule:   c            0
                   c x^r        c r x^(r-1)
                   e^(cx)       c e^(cx)
                   ln(x)        1/x
  Sum rule:        g(x) + h(x)  g'(x) + h'(x)
  Product rule:    g(x) h(x)    g(x) h'(x) + g'(x) h(x)
  Quotient rule:   g(x)/h(x)    [h(x) g'(x) - g(x) h'(x)] / h(x)^2
  Chain rule:      h(g(x))      dh(g(x))/dg(x) × dg(x)/dx
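If you want to double-check the table, a computer algebra system can reproduce every entry. The following sympy sketch is ours (not part of the note) and verifies the elementary rules symbolically:

```python
import sympy as sp

x, c, r = sp.symbols('x c r')
g, h = sp.Function('g'), sp.Function('h')

# Elementary derivatives from the table
assert sp.diff(c, x) == 0                                            # constant rule
assert sp.simplify(sp.diff(c * x**r, x) - c * r * x**(r - 1)) == 0   # power rule
assert sp.simplify(sp.diff(sp.exp(c * x), x) - c * sp.exp(c * x)) == 0
assert sp.simplify(sp.diff(sp.log(x), x) - 1 / x) == 0

# Sum, product and quotient rules with unspecified g(x) and h(x)
print(sp.diff(g(x) + h(x), x))   # derivative of the sum: g'(x) + h'(x)
print(sp.diff(g(x) * h(x), x))   # product rule: g(x)*h'(x) + h(x)*g'(x)
print(sp.diff(g(x) / h(x), x))   # quotient rule (sympy leaves it unsimplified)
```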
Examples:
1) Constant rule
  y = f(x) = 3
  dy/dx = 0   (by the constant rule)
Similar to a linear function with slope b = 0.
2) d(c x^r)/dx = c r x^(r-1)
  y = f(x) = 3x^2
Here c = 3 and r = 2, so
  dy/dx = 3 × 2 × x^(2-1) = 6x
3) Sum rule
  y = f(x) = 2x^2 + 4x - 7
  dy/dx = d(2x^2)/dx + d(4x)/dx + d(-7)/dx = 4x + 4 + 0 = 4x + 4
The value of the constant does not affect the slope of the function.
(brown line: c = 7, red line: c = 0, blue line: c = -7)
For the following three examples, let g(x) = x^2 and h(x) = e^(3x).
4) Product rule
  Let f(x) = g(x) h(x)
  df(x)/dx = g'(x) h(x) + g(x) h'(x) = 2x e^(3x) + x^2 (3 e^(3x)) = (2x + 3x^2) e^(3x)
5) Quotient rule
  Let f(x) = g(x)/h(x)
  df(x)/dx = [h(x) g'(x) - g(x) h'(x)] / h(x)^2 = [2x e^(3x) - 3x^2 e^(3x)] / e^(6x) = (2x - 3x^2) / e^(3x)
6) Chain rule
  Let f(x) = h(g(x)) = e^(3x^2)
  df(x)/dx = dh(g(x))/dg(x) × dg(x)/dx = (3 e^(3g(x)))(2x) = 6x e^(3x^2)
Concavity and Convexity of a function
It is a must-know mathematical concept. I recommend that you take a look at this great lecture note by Martin J.
Osborne (https://www.economics.utoronto.ca/osborne/MathTutorial/CCVF.HTM)
Optimisation problems with univariate functions
The maximum and the minimum (plural: maxima and minima) of a function, known collectively as extrema
(singular: extremum), are the largest and smallest values that the function takes at a point, either within a given
neighbourhood (local/relative extremum) or on the function's domain in its entirety (global/absolute extremum).
Examples:
1) The function x^2 has no maximum and one local minimum, which is also the unique global minimum, at x = 0. At
that point, the function takes the value f(0) = 0^2 = 0.
2) The function x^3 - 3x^2 has no global extremum (because f(x) -> -∞ as x -> -∞, and f(x) -> ∞ as x -> ∞),
but it has a local maximum at x = 0 (y = 0) and a local minimum at x = 2 (y = -4).
3) The function x^3 has no maximum or minimum, but has a saddle point at x = 0. (A saddle point is a stationary
point that is not a local extremum. At this point, the function switches from being concave to convex, or from
convex to concave.)
Local Extrema
A point is said to be a local maximum/minimum if it satisfies the following 2 conditions:
1. First Order Condition (F.O.C.)
It is a stationary (critical) point. A stationary point is a point where the derivative of the function is equal to
zero (the point where the function stops increasing or decreasing).

  F.O.C.:  f'(x*) = 0

The slope (derivative) at a maximum and at a minimum is always zero.
2. Second Order Condition (S.O.C.)
The First Order Condition is not sufficient to determine whether a point is a local minimum or a local maximum.
To fully classify the point, the second derivative (the derivative of the derivative of the function, df'(x)/dx,
denoted by f''(x)) needs to be checked.

  S.O.C.:  f''(x*) > 0   (for a minimum)
           f''(x*) < 0   (for a maximum)
           f''(x*) = 0   (further investigation is needed to determine whether it is a local maximum, a local minimum, or a saddle point)
If the second derivative is positive in both the left and the right neighbourhood of the stationary point, the
point is a local minimum:

  f''(x* - h) > 0  and  f''(x* + h) > 0   for all sufficiently small h > 0

On the other hand, if the second derivative is negative in both the left and the right neighbourhood of the
stationary point, the point is a local maximum:

  f''(x* - h) < 0  and  f''(x* + h) < 0   for all sufficiently small h > 0

For the stationary point to be a saddle point, the sign of the second derivative in its left and right
neighbourhoods must differ:

  f''(x* - h) > 0 and f''(x* + h) < 0,  or  f''(x* - h) < 0 and f''(x* + h) > 0,   for all sufficiently small h > 0
The idea behind this condition is that a positive second derivative means the slope (the first derivative) is
increasing through the stationary point: the slope goes from negative to positive, so the function stops decreasing
and starts increasing there, and the point is a minimum.
On the other hand, a negative second derivative means the slope is decreasing through the stationary point: the
function stops increasing and starts decreasing, so the point is a maximum.
Graph A: f(x) = x^2 + 5.  Graph B: f(x) = -x^2 + 5.
The red line in each graph is the first derivative: in Graph A, dy/dx = 2x; in Graph B, dy/dx = -2x.
As can be seen above, each function reaches its local extremum where its first derivative line crosses the x-axis
(the derivative is equal to zero) (F.O.C.). The extremum is a minimum if the slope of the first derivative
line is positive (Graph A), and a maximum if the slope of the first derivative line is negative (Graph B)
(S.O.C.).
Examples:
1) y = x^2 + 2x + 5

  F.O.C.:  dy/dx = 2x + 2   (equal to 0 at x* = -1)
  S.O.C.:  d^2y/dx^2 = 2 > 0   (it is a minimum)

Blue line: the function y = x^2 + 2x + 5; green line: its first derivative, dy/dx = 2x + 2.
From the graph, the function y reaches its minimum at the point where the first derivative is zero (at x* = -1),
and the value of the minimum is equal to:

  (x*)^2 + 2x* + 5 = (-1)^2 + 2(-1) + 5 = 4
2) y = x^3 - 3x^2 + 6

  F.O.C.:  dy/dx = 3x^2 - 6x = 3x(x - 2)   (equal to 0 at x* = 0 and x* = 2)
  S.O.C.:  d^2y/dx^2 = 6x - 6

At x* = 0:  d^2y/dx^2 = 6(0) - 6 = -6 < 0   (it is a maximum)
At x* = 2:  d^2y/dx^2 = 6(2) - 6 = 6 > 0    (it is a minimum)

Blue line: the function y = x^3 - 3x^2 + 6; green line: its first derivative, dy/dx = 3x^2 - 6x.
(From the graph, the function y reaches an extremum whenever the first derivative crosses the x-axis.)
The function has no global extremum, but has a local maximum at x* = 0 (y = 6) and a local minimum at x* = 2 (y = 2).
3) y = x^3

  F.O.C.:  dy/dx = 3x^2   (equal to zero at x* = 0)
  S.O.C.:  d^2y/dx^2 = 6x

At x* = 0, d^2y/dx^2 = 6x* = 0, so the test is inconclusive on its own; since the second derivative changes sign at
x* = 0 (negative for x < 0, positive for x > 0), the point is a saddle point, and the function y = x^3 has no maximum
and no minimum.
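The F.O.C./S.O.C. recipe is easy to automate. The sketch below (ours, not part of the note) classifies the stationary points of example 2, y = x^3 - 3x^2 + 6, with sympy:

```python
import sympy as sp

x = sp.symbols('x')
y = x**3 - 3 * x**2 + 6

dy = sp.diff(y, x)        # first derivative: 3x^2 - 6x
d2y = sp.diff(y, x, 2)    # second derivative: 6x - 6

for x_star in sp.solve(sp.Eq(dy, 0), x):   # F.O.C.: stationary points 0 and 2
    curvature = d2y.subs(x, x_star)        # S.O.C.: sign of the second derivative
    if curvature > 0:
        kind = 'local minimum'
    elif curvature < 0:
        kind = 'local maximum'
    else:
        kind = 'needs further investigation'
    print(x_star, y.subs(x, x_star), kind)
# Expected: x* = 0 gives y = 6, local maximum; x* = 2 gives y = 2, local minimum
```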
Global Extrema
A global maximum (minimum) is a point where the value of the function is the largest (smallest) over its entire
domain.
Consider the function

  y = f(x) = x^4/4 + x^3/3 - (9/2)x^2 - 9x + 5

  F.O.C.:  dy/dx = x^3 + x^2 - 9x - 9 = (x + 3)(x + 1)(x - 3) = 0

There are 3 stationary points, at x* = -3, -1, and 3.

  S.O.C.:  d^2y/dx^2 = 3x^2 + 2x - 9
At x* = -3:  d^2y/dx^2 = 3(-3)^2 + 2(-3) - 9 = 27 - 6 - 9 = 12 > 0   (this point is a local minimum);  f(-3) = 2.75
At x* = -1:  d^2y/dx^2 = 3(-1)^2 + 2(-1) - 9 = 3 - 2 - 9 = -8 < 0    (this point is a local maximum);  f(-1) ≈ 9.42
At x* = 3:   d^2y/dx^2 = 3(3)^2 + 2(3) - 9 = 27 + 6 - 9 = 24 > 0     (this point is a local minimum);  f(3) = -33.25
From the graph, the function has 2 local minima, at x = -3 and x = 3. The local minimum at x = 3 is also the global
minimum, as it gives the function its smallest value. There is 1 local maximum, at x = -1, but it is not a global
maximum. For this function there is no global maximum, since the value of the function goes to infinity and minus
infinity as x -> ∞ and x -> -∞, respectively.
Suppose the domain of the function were [-5, 5] instead of (-∞, ∞). Then the function would have 3 local
maxima, at -5, -1, and 5, giving function values of 52.08, 9.42, and 45.42, respectively. Thus the global
maximum would be at x = -5.
To find the global extrema of a function on an interval [a, b]:
1. Find the critical/stationary points of the function. (First Order Condition)
2. Classify each critical/stationary point as a local maximum, a local minimum, or a saddle point. (Second
Order Condition)
3. Compare the values of the function at these points together with the values at each end point. The
point with the largest value is the global maximum, and the point with the smallest value is the global minimum.
(A numerical sketch of this recipe is given below.)
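Here is a sympy sketch (ours, not from the note) of this three-step recipe applied to the function above, f(x) = x^4/4 + x^3/3 - (9/2)x^2 - 9x + 5, on the restricted domain [-5, 5]:

```python
import sympy as sp

x = sp.symbols('x')
f = x**4 / 4 + x**3 / 3 - sp.Rational(9, 2) * x**2 - 9 * x + 5
a, b = -5, 5                                        # end points of the domain

# Step 1: critical/stationary points (F.O.C.)
stationary = sp.solve(sp.Eq(sp.diff(f, x), 0), x)   # [-3, -1, 3]

# Step 3: compare f at the stationary points and at the end points.
# (Step 2, classifying each point, is not needed if we only compare values.)
values = {pt: f.subs(x, pt) for pt in stationary + [a, b]}
x_max = max(values, key=values.get)
x_min = min(values, key=values.get)
print(x_max, values[x_max])   # global maximum at x = -5, value 625/12 (about 52.08)
print(x_min, values[x_min])   # global minimum at x = 3, value -133/4 = -33.25
```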
Partial derivative
Suppose we have a function of two variables,

  z = f(x, y)

(z is a function of x and y). For example,

  z = x y^2

You can see that the value of z depends on the value of y, and that the marginal change in the value of z when x
changes also depends on y.

Partial derivative of z with respect to x

  z = x y^2

The partial derivative of z with respect to x is the change in z associated with a change in x, holding the other
variables constant. (We use the symbol ∂ to distinguish a partial derivative from an ordinary derivative.)

  ∂z/∂x = y^2

This means that when x changes by 1 unit, z changes by y^2 units, implying that the change in z with respect to
x also depends on the value of y.
  y            :  1   2   3   4   5
  ∂z/∂x = y^2  :  1   4   9  16  25
The rules of partial differentiation are the same as those of ordinary differentiation, except that the other
variables are treated as constants.
Example 1:
  z = x + y
Then
  ∂z/∂x = ∂x/∂x + ∂y/∂x = 1 + 0 = 1
  ∂z/∂y = ∂x/∂y + ∂y/∂y = 0 + 1 = 1
Example 2:
  z = (x - y)^2
By the chain rule,
  ∂z/∂x = 2(x - y)
  ∂z/∂y = -2(x - y)
Example in economics: Cournot competition equilibrium
Optimisation problems with multivariate functions
This is similar to the optimisation of a function of one variable.
1. First Order Condition (the maximum/minimum is a stationary point)
A point of a multivariate function is stationary if the partial derivative of the function with respect to each
variable is equal to zero at that point. For z = f(x, y):

  ∂z/∂x = 0  and  ∂z/∂y = 0

Hence, we often have to deal with a system of equations to find a pair of x and y that satisfies both
equations.
2. Second Order Condition (second derivative test)
Again we have to check the second order partial derivatives, which include the cross partial derivatives. To sum
up, we have to check the determinant (det) of the Hessian matrix of the function:

  H = [ ∂^2z/∂x^2    ∂^2z/∂x∂y ]
      [ ∂^2z/∂y∂x    ∂^2z/∂y^2 ]

Note that ∂^2z/∂x∂y = ∂^2z/∂y∂x.
1. If det(H) > 0, and ∂^2z/∂x^2 > 0 and ∂^2z/∂y^2 > 0, then the point is a local minimum. (The Hessian is positive
definite, i.e. all eigenvalues are positive.)
2. If det(H) > 0, and ∂^2z/∂x^2 < 0 and ∂^2z/∂y^2 < 0, then the point is a local maximum. (The Hessian is negative
definite, i.e. all eigenvalues are negative.)
3. If the determinant of the Hessian matrix is negative, then the point is a saddle point. (The Hessian is
indefinite: it has both negative and positive eigenvalues.)
*If the Hessian is positive/negative semi-definite, the result is inconclusive and requires further investigation.
Very small note: determinant of a 2 × 2 matrix
Let

  A = [ a  b ]
      [ c  d ]

Then det(A) = ad - bc.
You must also know how to compute the determinant of a matrix with higher dimension too!
Recommended detailed note on definiteness (by Eivind Eriksen):
http://home.bi.no/a0710194/Teaching/BI-Mathematics/GRA-6035/2010/lecture5.pdf
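As an illustration of the second derivative test described above, the following sketch (ours; the two example functions z = x^2 + y^2 and z = x^2 - y^2 are assumed purely for illustration) classifies a stationary point using sympy's hessian helper:

```python
import sympy as sp

x, y = sp.symbols('x y')

for z in (x**2 + y**2, x**2 - y**2):
    # F.O.C.: solve the system of first order partial derivatives
    stationary = sp.solve([sp.diff(z, x), sp.diff(z, y)], [x, y])   # {x: 0, y: 0}

    # S.O.C.: Hessian matrix, its determinant, and the diagonal entries
    H = sp.hessian(z, (x, y))
    det_H = H.det()
    zxx, zyy = H[0, 0], H[1, 1]

    if det_H > 0 and zxx > 0 and zyy > 0:
        kind = 'local minimum'        # positive definite Hessian
    elif det_H > 0 and zxx < 0 and zyy < 0:
        kind = 'local maximum'        # negative definite Hessian
    elif det_H < 0:
        kind = 'saddle point'         # indefinite Hessian
    else:
        kind = 'inconclusive'
    print(z, stationary, kind)
# Expected: x**2 + y**2 -> local minimum at (0, 0); x**2 - y**2 -> saddle point at (0, 0)
```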
Constrained Optimisation
Suppose we would like to optimise a function, but there is a constraint on the values of the independent
variables.
Examples in economics
1. Cost minimisation
Suppose a firm needs two inputs for its production: K (capital) and L (labour). The production function
of the firm is therefore a function of K and L,

  Q = f(K, L)

Now let w denote the wage (cost of labour) and r denote the cost of capital; then the cost function is

  C(K, L) = rK + wL

Suppose we would like to produce Q units of output at the minimum cost. Then our cost minimisation
problem becomes

  min_{K,L} C(K, L)
  subject to f(K, L) = Q
2. Utility maximisation
Suppose an individual's utility depends on two kinds of goods, A and B. Then his utility function can be written as

  U(A, B),  e.g. U(A, B) = √A + √(2B)

Let p_A and p_B be the prices of A and B, respectively.
The non-satiation property of the utility function (more is preferred to less) implies that the individual always
prefers more of each good, so without any constraint the solution to the utility maximisation problem would be
A = ∞ and B = ∞. However, in real life there is always a budget constraint (you have a limited amount of money).
Let the individual's budget equal m; then the utility maximisation problem becomes

  max_{A,B} U(A, B)
  subject to p_A A + p_B B = m
Lagrange multiplier method
To solve an optimisation problem under a constraint, the method of Lagrange multipliers is commonly used.
Let f(x, y) be the objective function (the function you would like to maximise or minimise, e.g. a cost function
or a utility function) and let g(x, y) = c be the constraint (e.g. the quantity of output that must be produced,
or the budget constraint).
Next, set up the Lagrangian:

  max_{x,y,λ} Λ(x, y, λ) = f(x, y) + λ(c - g(x, y))

where Λ(x, y, λ) is the Lagrange function (or Lagrangian) and λ is called the Lagrange multiplier.
Then find the stationary point(s) (First Order Condition):

  ∂Λ/∂x = 0,  ∂Λ/∂y = 0,  ∂Λ/∂λ = 0

or, equivalently,

  ∂f(x, y)/∂x - λ ∂g(x, y)/∂x = 0
  ∂f(x, y)/∂y - λ ∂g(x, y)/∂y = 0
  c - g(x, y) = 0
Second Order Condition
We have to check the bordered Hessian, which (in the two-variable case) is given by:

  H^B = [ 0    g_1   g_2  ]
        [ g_1  Λ_11  Λ_12 ]
        [ g_2  Λ_21  Λ_22 ]
The sufficient condition for a local maximum is that the bordered Hessian is negative definite:

  |H^B_1| < 0,  |H^B_2| > 0,  |H^B_3| < 0,  |H^B_4| > 0, ...

where

  H^B_1 = [ 0    g_1  ]        H^B_2 = [ 0    g_1   g_2  ]
          [ g_1  Λ_11 ]                 [ g_1  Λ_11  Λ_12 ]
                                        [ g_2  Λ_21  Λ_22 ]

and so on.
On the other hand, the sufficient condition for a local minimum is that the bordered Hessian is positive definite:

  |H^B_1| < 0,  |H^B_2| < 0,  |H^B_3| < 0,  |H^B_4| < 0, ...
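Finally, a minimal sympy sketch (ours, not part of the note) of the Lagrange multiplier method applied to the utility example above, U(A, B) = √A + √(2B). The prices p_A = 1, p_B = 2 and the budget m = 12 are made-up numbers, chosen only so that the first order conditions have a clean solution:

```python
import sympy as sp

A, B, lam = sp.symbols('A B lam', positive=True)
pA, pB, m = 1, 2, 12                      # assumed illustrative prices and budget

U = sp.sqrt(A) + sp.sqrt(2 * B)           # objective function f(x, y)
constraint = m - (pA * A + pB * B)        # c - g(x, y)
L = U + lam * constraint                  # the Lagrangian

# First Order Conditions: dL/dA = dL/dB = dL/dlam = 0
foc = [sp.diff(L, v) for v in (A, B, lam)]
print(sp.solve(foc, [A, B, lam], dict=True))
# Expected: A = 6, B = 3, lam = sqrt(6)/12, i.e. with these prices the consumer
# buys twice as much of good A as of good B.
```

The bordered Hessian condition above can then be used to confirm that this stationary point is indeed a maximum.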