Demand Estimation

MANAGERIAL ECONOMICS
12th Edition
by Mark Hirschey

Chapter 5: Demand Estimation
OVERVIEW
• Interview and Experimental Methods
• Simple Demand Curve Estimation
• Simple Market Demand Curve Estimation
• Identification Problem
• Regression Analysis
• Measuring Regression Model Significance
• Measures of Individual Variable Significance
Interview and Experimental Methods
• Consumer Interviews
  – Interviews can solicit useful information when market data is scarce.
  – Consumer opinions can differ from behavior.
• Market Experiments
  – Controlled experiments can generate useful insight.
  – Experiments can be expensive.
Simple Demand Curve Estimation
• Simple Linear Demand Curves
  – The best estimation method balances marginal costs and marginal benefits.
  – Simple linear relations are often useful for demand estimation.
• Using Simple Linear Demand Curves
  – Straight-line relations can give useful approximations.
Simple Demand Curve Estimation
Consider the linear demand curve P = a + bQ, with two observed points:
P = $12 at Q = 3,200 and P = $10 at Q = 4,000.

Subtracting the second equation from the first:
  12 = a + b(3,200)
  10 = a + b(4,000)
   2 = -800b
   b = -0.0025

Substituting b = -0.0025 back into P = a + bQ:
  12 = a - 0.0025(3,200)
  12 = a - 8
   a = 20

So the estimated demand curve is P = $20 - $0.0025Q.
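To make the arithmetic reusable, here is a minimal Python sketch (not from the slides) that recovers a and b from any two observed price-quantity points; the helper name fit_linear_demand is just illustrative.

```python
# Minimal sketch (not from the slides): recover P = a + bQ from two observed
# price/quantity points.

def fit_linear_demand(p1, q1, p2, q2):
    """Return (a, b) for P = a + b*Q through (q1, p1) and (q2, p2)."""
    b = (p1 - p2) / (q1 - q2)   # slope: change in price over change in quantity
    a = p1 - b * q1             # intercept from P = a + b*Q at the first point
    return a, b

a, b = fit_linear_demand(12, 3_200, 10, 4_000)
print(a, b)   # 20.0 -0.0025  ->  P = $20 - $0.0025Q
```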
Demand Curve and TR Maximization
TR = P · Q
   = ($20 - $0.0025Q)Q
   = $20Q - $0.0025Q²

MR = ∂TR/∂Q = $20 - $0.005Q

Setting MR = 0 to maximize total revenue:
  $20 - $0.005Q = 0
  $0.005Q = $20
  Q = 4,000 and P = $20 - $0.0025(4,000) = $10

max TR = P · Q = $10 · 4,000 = $40,000
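A short Python sketch of the same revenue-maximization step, assuming only the fitted curve P = 20 - 0.0025Q from the previous slide; the closed form Q = -a/(2b) is just MR = 0 rearranged.

```python
# Sketch, assuming the fitted demand curve P = 20 - 0.0025*Q from above.
# TR = P*Q is a downward-opening parabola, so it peaks where MR = a + 2bQ = 0.

a, b = 20.0, -0.0025

q_star = -a / (2 * b)           # MR = 0  ->  Q = -a / (2b)
p_star = a + b * q_star
print(round(q_star), round(p_star, 2), round(p_star * q_star))  # 4000 10.0 40000
```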
Market Demand Curve Estimation
Market Demand Curve
• Shows the total quantity customers are willing to buy at various prices under current market conditions.
Graphing the Market Demand Curve
• Market demand is the sum of individual demand quantities, Q1 + Q2 = Q1+2.
• Add quantities, not prices!
Market Demand Curve Estimation
Domestic demand:  P_D = $100 - $0.001Q_D
Foreign demand:   P_F = $80 - $0.004Q_F
Total cost:       TC = $1,200,000 + $24Q

Solving each demand curve for quantity:
  0.001Q_D = 100 - P_D  →  Q_D = 100,000 - 1,000P_D
  0.004Q_F = 80 - P_F   →  Q_F = 20,000 - 250P_F
Market Demand Curve Estimation
Summing the two segments at a common price P (valid where both groups buy, i.e., P ≤ $80):

  Q = Q_D + Q_F
    = (100,000 - 1,000P) + (20,000 - 250P)
    = 120,000 - 1,250P

  1,250P = 120,000 - Q
  P = $96 - $0.0008Q    (market demand)
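A small Python sketch of the horizontal summation, with quantities truncated at zero so each segment drops out above its choke price; the function name market_quantity is illustrative, not from the text.

```python
# Sketch: horizontal summation of the two demand segments at a common price P.

def market_quantity(p):
    qd = max(0, 100_000 - 1_000 * p)   # domestic: Q_D = 100,000 - 1,000*P
    qf = max(0, 20_000 - 250 * p)      # foreign:  Q_F = 20,000 - 250*P
    return qd + qf

# For P <= $80 both groups buy, so Q = 120,000 - 1,250*P.
q = market_quantity(60)
print(q)                       # 45000 units at a price of $60
print((120_000 - q) / 1_250)   # 60.0: recovering P from the market demand curve
```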
Market Demand Curve Estimation
TR = P · Q = ($96 - $0.0008Q)Q = $96Q - $0.0008Q²
MR = ∂TR/∂Q = $96 - $0.0016Q

TC = $1,200,000 + $24Q
MC = ∂TC/∂Q = $24
Market Demand Curve Estimation
MR = MC at profit maximization:
  $96 - $0.0016Q = $24
  $0.0016Q = $72
  Q = 45,000
  P = $96 - $0.0008(45,000) = $60

π = TR - TC
  = $96Q - $0.0008Q² - ($1,200,000 + $24Q)
  = -$0.0008Q² + $72Q - $1,200,000
  = -$0.0008(45,000)² + $72(45,000) - $1,200,000
  = $420,000
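The same MR = MC calculation as a Python sketch, using the market demand and cost figures above; the closed form Q = (a - MC)/(-2b) is the first-order condition rearranged.

```python
# Sketch: profit maximization with market demand P = 96 - 0.0008*Q and
# TC = 1,200,000 + 24*Q.

a, b = 96.0, -0.0008            # inverse market demand: P = a + b*Q
fixed_cost, mc = 1_200_000.0, 24.0

q_star = (a - mc) / (-2 * b)    # MR = a + 2bQ = MC  ->  Q = (a - MC)/(-2b)
p_star = a + b * q_star
profit = p_star * q_star - (fixed_cost + mc * q_star)
print(round(q_star), round(p_star, 2), round(profit))   # 45000 60.0 420000
```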
Market Demand Curve Estimation
[Figure: domestic, foreign, and market demand curves; price ($0 to $120) on the vertical axis, quantity (0 to 140,000) on the horizontal axis.]
Identification Problem
• Changing Nature of Demand Relations
• Interplay of Demand and Supply
  – Economic conditions affect demand and supply.
• Shifts in Demand and Supply
  – Demand relations are dynamic.
  – Curve shifts can be estimated.
• Simultaneous Relations
  – Quantity and price are jointly determined.
Identification Problem
The demand curve "D" does not exist; the data points come from three shifting demand curves (D1, D2, D3), each drawn holding all other demand determinants constant.

The problem: the fitted curve "D" has higher elasticity than the true curves D1, D2, and D3.

[Figure: shifting demand curves D1, D2, D3 and supply curves S1, S2, S3 trace out equilibrium points (Q1, P1), (Q2, P2), (Q3, P3); connecting those points yields the spurious, flatter curve "D".]
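A hedged numerical illustration of the figure's point: the demand and supply intercepts below are invented for the sketch, not taken from the text, but they show that when both curves shift between observations, a line fitted through the observed equilibria looks flatter (more elastic) than any true demand curve.

```python
# Sketch (illustrative numbers, not from the text): when demand and supply both
# shift, a line fitted through observed equilibria is not a true demand curve.
import numpy as np

true_slope = -2.0                       # each true demand curve: P = d + (-2)*Q
demand_intercepts = [20.0, 24.0, 28.0]  # D1, D2, D3 shift outward
supply_intercepts = [8.0, 0.0, -8.0]    # S1, S2, S3 (P = c + Q) shift outward too

q_obs, p_obs = [], []
for d, c in zip(demand_intercepts, supply_intercepts):
    q = (d - c) / (1.0 - true_slope)    # equilibrium where d + slope*q = c + q
    q_obs.append(q)
    p_obs.append(d + true_slope * q)

fitted_slope, _ = np.polyfit(q_obs, p_obs, 1)
print(fitted_slope)   # about -1: the fitted "D" looks far more elastic than -2
```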
Regression Analysis
• What Is a Statistical Relation?
  – A statistical relation exists when two measures are related on average, but not exactly.
  – A deterministic relation is true by definition.
• Specifying the Regression Model
  – The dependent variable Y is caused by X.
  – The X variables are determined independently of Y.
• Least Squares Method
  – Minimize the sum of squared residuals.
Regression Analysis
[Figure: three scatter plots. Direct relation: unit sales rise with advertising. Inverse relation: unit sales fall as price rises. No relation: unit sales show no pattern against the orbit of Jupiter.]
Regression Analysis
Regression analysis fits a least squares line to the data, minimizing the sum of squared errors:

  min Σ(Y - Ŷ)²

  b = (nΣXY - ΣXΣY) / (nΣX² - (ΣX)²)
  a = (ΣY - bΣX) / n
  Ŷ = a + bX

[Figure: scatter plot showing each observation Y, its estimate Ŷ on the fitted line, and the error Y - Ŷ; the line shown is Ŷ = 0.136047 + 0.005962X.]
Regression Analysis
   X     Y     XY     X²     Ŷ       U = Y - Ŷ
  24    78   1872    576    79.50    -1.50
  43   100   4300   1849    93.67     6.33
  24    86   2064    576    79.50     6.50
  34    82   2788   1156    86.96    -4.96
  36    86   3096   1296    88.45    -2.45
  38    84   3192   1444    89.94    -5.94
  22    75   1650    484    78.01    -3.01
  23    80   1840    529    78.76     1.24
  30    83   2490    900    83.98    -0.98
  33    91   3003   1089    86.21     4.79

Sums: ΣX = 307, ΣY = 845, ΣXY = 26,295, ΣX² = 9,899, n = 10.

  b = (nΣXY - ΣXΣY) / (nΣX² - (ΣX)²)
    = (10(26,295) - 307(845)) / (10(9,899) - 307²)
    = 0.745623

  a = (ΣY - bΣX) / n
    = (845 - 0.745623(307)) / 10
    = 61.60937

  Ŷ = 61.60937 + 0.745623X

[Figure: scatter of the ten observations with the fitted line; Y axis from 65 to 105, X axis from 10 to 50.]
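A Python sketch that reproduces the hand computation above from the ten (X, Y) pairs, using the same textbook formulas for b and a.

```python
# Sketch: the slide's least squares formulas applied to its ten observations.

X = [24, 43, 24, 34, 36, 38, 22, 23, 30, 33]
Y = [78, 100, 86, 82, 86, 84, 75, 80, 83, 91]

n = len(X)
sum_x, sum_y = sum(X), sum(Y)
sum_xy = sum(x * y for x, y in zip(X, Y))
sum_x2 = sum(x * x for x in X)

b = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
a = (sum_y - b * sum_x) / n
print(b, a)   # b is about 0.745623 and a about 61.6094, as on the slide

residuals = [y - (a + b * x) for x, y in zip(X, Y)]
print([round(u, 2) for u in residuals])   # close to the U column in the table
```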
Regression Analysis
Regression equations can take on any functional form:

  Q = a + bP
  Q = a + b_P·P + b_A·A + b_I·I
  Q = b_0 · P^(b_P) · A^(b_A) · I^(b_I)

The multiplicative form above is popular among economists because each exponent (for example, b_P) is the constant elasticity of demand with respect to that variable. For example,

  Q = 4P^(-0.4) A^(0.2) I^(0.003)

has a constant price elasticity of -0.4.
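A quick numerical check of the constant-elasticity claim, using the multiplicative demand function on this slide; the starting values for P, A, and I are arbitrary, and the elasticity is approximated with a small finite difference.

```python
# Sketch: numerical check that Q = 4 * P**-0.4 * A**0.2 * I**0.003 has a
# constant price elasticity of -0.4, regardless of the starting P, A, I.

def q(p, a, i):
    return 4 * p ** -0.4 * a ** 0.2 * i ** 0.003

p, a, i = 2.0, 50.0, 30_000.0        # arbitrary illustrative values
dp = 1e-6
elasticity = (q(p + dp, a, i) - q(p, a, i)) / dp * (p / q(p, a, i))
print(elasticity)                    # about -0.4, the exponent on P
```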
Measuring Regression Significance
The standard error of the estimate (SEE) reflects the degree of scatter of the data about the regression line Ŷ = a + bX (a is the intercept, b is the slope).

[Figure: fitted regression line with an upper 95% confidence bound at +1.96 standard errors of the estimate and a lower 95% confidence bound at -1.96 standard errors of the estimate.]
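A Python sketch of the SEE for the earlier ten-observation fit. The slide does not give a formula, so this assumes the usual convention of dividing the sum of squared errors by n - 2 for a simple one-regressor model.

```python
# Sketch: standard error of the estimate for the earlier 10-observation fit,
# assuming the conventional n - 2 degrees of freedom for a one-X regression.
import math

X = [24, 43, 24, 34, 36, 38, 22, 23, 30, 33]
Y = [78, 100, 86, 82, 86, 84, 75, 80, 83, 91]
a, b = 61.60937, 0.745623             # intercept and slope from the earlier slide

sse = sum((y - (a + b * x)) ** 2 for x, y in zip(X, Y))
see = math.sqrt(sse / (len(X) - 2))
print(see)                            # degree of scatter about the regression line
print(-1.96 * see, 1.96 * see)        # approximate 95% band around a prediction
```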
Goodness of Fit
• Correlation shows degree of concurrence.
  – r = 1 means perfect correlation.
  – r = 0 means no correlation.
• Coefficient of determination, R².
  – R² = 100% means perfect fit.
  – R² = 0% means no relation.
• Corrected coefficient of determination.
  – Adjusts R² downward for small samples.
Regression Analysis
The coefficient of determination measures how much of the variation in Y about its mean Ȳ the least squares line explains:

  Total variation:        Σ(Y - Ȳ)²
  Explained variation:    Σ(Ŷ - Ȳ)²
  Unexplained variation:  Σ(Y - Ŷ)²

  R² = explained variation / total variation = Σ(Ŷ - Ȳ)² / Σ(Y - Ȳ)²

  adj-R² = 1 - (1 - R²)(n - 1)/(n - p - 1)

[Figure: scatter plot decomposing each deviation Y - Ȳ into an explained part Ŷ - Ȳ and an unexplained part Y - Ŷ.]
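A Python sketch computing R² and the corrected (adjusted) R² for the same ten-observation example, with p = 1 regressor.

```python
# Sketch: R² and adjusted R² for the earlier 10-observation, one-regressor fit.

X = [24, 43, 24, 34, 36, 38, 22, 23, 30, 33]
Y = [78, 100, 86, 82, 86, 84, 75, 80, 83, 91]
a, b = 61.60937, 0.745623

n, p = len(Y), 1
y_bar = sum(Y) / n
total = sum((y - y_bar) ** 2 for y in Y)                      # total variation
unexplained = sum((y - (a + b * x)) ** 2 for x, y in zip(X, Y))

r2 = 1 - unexplained / total
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)
print(r2, adj_r2)   # roughly 0.59 and 0.54 for this small sample
```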
F statistic
• Tells whether R² is statistically significant.

Goodness of Fit Test
[Figure: F distribution with the rejection region beyond the critical value; do not reject H0 to the left of the critical value, reject H0 to the right. Critical values shown: F = 2.69 at α = 0.05 and F = 2.14 at α = 0.1.]
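A Python sketch relating R² to the F statistic, applied to the multiple-regression example later in the chapter (R² = 0.883244, 4 regressors, 15 observations); the critical value for F(4, 10) comes from scipy.stats.

```python
# Sketch: F statistic for the chapter's multiple-regression example.
from scipy.stats import f

r2, p, n = 0.883244, 4, 15
f_stat = (r2 / p) / ((1 - r2) / (n - p - 1))
print(f_stat)                      # about 18.91, matching the Excel output below
print(f.ppf(0.95, p, n - p - 1))   # 5% critical value for F(4, 10), about 3.48
# The F statistic far exceeds the critical value, so R² is statistically significant.
```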
Judging Variable Significance
• t statistics compare sample characteristics to the standard deviation of that characteristic.
  – t > 1.645 implies a strong effect of X on Y (90% confidence).
  – t > 1.96 implies an even stronger effect of X on Y (95% confidence).
• Two-tail t tests
  – Tests of effect.
• One-tail t tests
  – Tests of magnitude or direction.
Judging Variable Significance
t statistic for H0: b = 0 versus Ha: b ≠ 0.

[Figure: standard normal (z) distribution with two-tail rejection regions of α/2 = 0.025 beyond ±1.96; do not reject H0 between the critical values.]

  α/2 = 0.025, critical value = ±1.96: 95% confidence level
  α/2 = 0.05, critical value = ±1.645: 90% confidence level
Multiple Regression Example
Data (15 observations):

   Q    P    Ps   Pc   I
  20   2    1.5  3    38,000
  30   1.5  2.5  2.3  22,000
  10   2.3  1.7  3.2  40,000
  15   2.5  1.9  2.8  25,000
  12   2.8  1.4  3.5  28,000
  28   1.7  2.7  2.2  17,000
  17   2.4  1.8  3.1  35,000
  14   3    1.7  3.8  40,000
  20   2    2.1  2.9  20,000
  22   2    2.3  2.3  34,000
  32   1.4  2.8  2.4  20,000
  14   3    1.4  3.1  40,000
  25   1.8  2.4  2.4  36,000
  28   1.6  2.6  2.2  36,000
  12   3    1.7  3.1  30,000

SUMMARY OUTPUT

Regression Statistics
  Multiple R          0.939811
  R Square            0.883244
  Adjusted R Square   0.836542
  Standard Error      2.929296
  Observations        15

ANOVA
               df   SS         MS         F          Significance F
  Regression    4   649.1256   162.2814   18.91221   0.000118
  Residual     10   85.80775   8.580775
  Total        14   734.9333

             Coefficients   Standard Error   t Stat      P-value    Lower 95%   Upper 95%
  Intercept  23.30213       17.55563         1.327331    0.213904   -15.8142    62.41851
  P          -5.96115       2.928719         -2.03541    0.069176   -12.4867    0.564444
  Ps         6.50636        3.888925         1.673049    0.125263   -2.1587     15.17142
  Pc         -1.09766       3.416276         -0.3213     0.754595   -8.7096     6.514275
  I          -1.3E-05       0.000116         -0.10919    0.915211   -0.00027    0.000245
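A Python sketch that reproduces the Excel regression above with numpy's ordinary least squares; the coefficients and R² should match the SUMMARY OUTPUT to rounding.

```python
# Sketch: regress Q on P, Ps, Pc, I (plus an intercept) for the data above.
import numpy as np

Q  = [20, 30, 10, 15, 12, 28, 17, 14, 20, 22, 32, 14, 25, 28, 12]
P  = [2, 1.5, 2.3, 2.5, 2.8, 1.7, 2.4, 3, 2, 2, 1.4, 3, 1.8, 1.6, 3]
Ps = [1.5, 2.5, 1.7, 1.9, 1.4, 2.7, 1.8, 1.7, 2.1, 2.3, 2.8, 1.4, 2.4, 2.6, 1.7]
Pc = [3, 2.3, 3.2, 2.8, 3.5, 2.2, 3.1, 3.8, 2.9, 2.3, 2.4, 3.1, 2.4, 2.2, 3.1]
I  = [38000, 22000, 40000, 25000, 28000, 17000, 35000, 40000,
      20000, 34000, 20000, 40000, 36000, 36000, 30000]

X = np.column_stack([np.ones(len(Q)), P, Ps, Pc, I])
coef, *_ = np.linalg.lstsq(X, np.array(Q, dtype=float), rcond=None)
print(coef)    # intercept and slopes; close to the Excel coefficients above

resid = np.array(Q) - X @ coef
r2 = 1 - resid @ resid / np.sum((np.array(Q) - np.mean(Q)) ** 2)
print(r2)      # about 0.883, matching "R Square" above
```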