FN3142 QUANTITATIVE FINANCE
CHAP 3: BASIC TIME SERIES CONCEPTS
Essential Reading: Chap 7 & 8 of Elements of Forecasting (Diebold) (only the parts that overlap with these notes)
Further Reading: Chapter 2 of Analysis of Financial Time Series (Tsay)
FOR THIS CHAPTER…
AIMS
• Introduce standard measures for the predictability of time series: autocovariance & autocorrelation.
• Introduce 'white noise' processes, a class of time series that are not predictable.
• Present the 'law of iterated expectations' and illustrate its use in time series analysis.
LEARNING OUTCOMES
1. Able to describe the various forms of white noise processes used in the analysis of financial data
2. Able to use the 'law of iterated expectations' to derive unconditional means from conditional means
3. Able to apply these tools to a simple AR(1) process
REPRESENTATION OF TIME SERIES (TS)
Sample of observations on a TS is denoted y_1, y_2, y_3, ⋯, y_T
Observations before the start of the sample: ⋯, y_{−3}, y_{−2}, y_{−1}, y_0
Observations beyond the end of the sample: y_{T+1}, y_{T+2}, y_{T+3}, ⋯
Combining the sequence of observations on this TS, we obtain
{y_t}_{t=−∞}^{∞} = ⋯, y_{−3}, y_{−2}, y_{−1}, y_0, y_1, y_2, y_3, ⋯, y_T, y_{T+1}, y_{T+2}, y_{T+3}, ⋯
where y_1, ⋯, y_T is the observed data.
COVARIANCE STATIONARY TIME SERIES
A time series Y_t is said to be covariance stationary if it fulfils the following conditions:
1. μ_t = 𝔼[Y_t] = μ ∀t
• the unconditional mean is the same at any moment of time
2. σ_t² = Var(Y_t) = σ² ∀t
• the unconditional variance is the same at any moment of time
3. γ_{t,t−j} = Cov(Y_t, Y_{t−j}) = γ_j ∀t and j
• the covariance depends only on the time difference j, not the time origin
• e.g. γ_{6,0} = Cov(Y_6, Y_0) = γ_6 = γ_{27,21} = γ_{10,4}
[Figure: a stationary process vs a non-stationary process. Source: https://www.statisticshowto.com/stationarity/]
AUTOCOVARIANCE & AUTOCORRELATION
• The j-th order autocovariance of a stationary time series Y_t is defined as
γ_j = Cov(Y_t, Y_{t−j}) = 𝔼[(Y_t − μ)(Y_{t−j} − μ)] = 𝔼[Y_t Y_{t−j}] − μ²
Setting j = 0 produces γ_0 = Cov(Y_t, Y_t) = Var(Y_t) = σ²
Variance and autocovariance are scale-dependent: they have units equal to the square of the units of Y_t
• The j-th order autocorrelation of a stationary time series Y_t is defined as
ρ_j = Corr(Y_t, Y_{t−j}) = Cov(Y_t, Y_{t−j}) / Var(Y_t) = γ_j / γ_0 ∈ [−1, 1]
with ρ_0 = Corr(Y_t, Y_t) = Var(Y_t) / Var(Y_t) = 1.
If a TS Y_t has γ_j ≠ 0 for some j ≠ 0, then the TS is said to be serially correlated (or autocorrelated)
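As a quick illustration, a minimal Python sketch (numpy assumed; the function names are my own, not from the reading) of how the sample counterparts of γ_j and ρ_j could be computed from data:

```python
import numpy as np

def sample_autocovariance(y, j):
    """Sample j-th order autocovariance: average of (y_t - ybar)(y_{t-j} - ybar)."""
    y = np.asarray(y, dtype=float)
    dev = y - y.mean()
    # Pair each observation with its j-th lag; divide by T (the usual convention)
    return np.sum(dev[j:] * dev[: len(y) - j]) / len(y)

def sample_autocorrelation(y, j):
    """Sample j-th order autocorrelation: gamma_j / gamma_0, always in [-1, 1]."""
    return sample_autocovariance(y, j) / sample_autocovariance(y, 0)

# White noise should show autocorrelations near zero at every lag j != 0
rng = np.random.default_rng(42)
eps = rng.standard_normal(10_000)
print([round(sample_autocorrelation(eps, j), 3) for j in range(4)])  # [1.0, ~0, ~0, ~0]
```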
LAW OF ITERATED EXPECTATION (LIE)
Let I_1 and I_2 be two related information sets satisfying I_1 ⊆ I_2
• I_2 is bigger than I_1 (I_2 contains at least as much information as I_1)
• In general, I_0 ⊆ I_t ⊆ I_{t+1} ∀t (information grows with time)
By the law of iterated expectations (LIE), 𝔼[𝔼[Y | I_2] | I_1] = 𝔼[Y | I_1]
Note the final conditioning, i.e. the information set at the end: the ultimate result is the expectation conditional on I_1 (the smaller info set)
LIE is very helpful for calculating conditional expectations, e.g.
𝔼_t[𝔼_{t+1}[Y_{t+2}]] = 𝔼[𝔼[Y_{t+2} | I_{t+1}] | I_t] = 𝔼[Y_{t+2} | I_t] = 𝔼_t[Y_{t+2}]
An unconditional expectation is equivalent to a conditional expectation on an 'empty' or null information set Φ, which is smaller than any non-empty information set: 𝔼[Y_{t+2} | Φ] = 𝔼[Y_{t+2}]
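To make the LIE concrete, a small Monte Carlo sketch: the AR(1) law of motion used here is introduced later in this chapter, and the parameter values are illustrative assumptions. Averaging the time-(t+1) conditional mean over draws of Y_{t+1} recovers the time-t conditional mean.

```python
import numpy as np

rng = np.random.default_rng(0)
phi, sigma = 0.8, 1.0   # illustrative AR(1) parameters
y_t = 2.0               # today's observed value, part of I_t
n = 1_000_000

# Draw Y_{t+1} = phi*Y_t + eps_{t+1}, eps ~ N(0, sigma^2)
y_t1 = phi * y_t + sigma * rng.standard_normal(n)

# Inner (time t+1) conditional mean: E_{t+1}[Y_{t+2}] = phi * Y_{t+1}
inner = phi * y_t1

# LIE: E_t[ E_{t+1}[Y_{t+2}] ] should equal E_t[Y_{t+2}] = phi^2 * Y_t
print(inner.mean())   # Monte Carlo estimate, ~1.28
print(phi**2 * y_t)   # exact value: 1.28
```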
RULES FOR EXPECTATIONS, VARIANCES & COVARIANCES
Let X, Y and Z be three scalar random variables, and let a, b, c and d be constants. Then
𝔼[a + bX + cY] = a + b𝔼[X] + c𝔼[Y]
Var(a + bX) = Var(bX) = b²Var(X)
Var(a + bX + cY) = Var(bX + cY) = Var(bX) + Var(cY) + 2Cov(bX, cY) = b²Var(X) + c²Var(Y) + 2bc·Cov(X, Y)
Cov(a + bX, cY + dZ) = Cov(bX, cY) + Cov(bX, dZ) = bc·Cov(X, Y) + bd·Cov(X, Z)
Example: Consider two stocks X and Y, whose returns follow normal distributions: X ~ N(1, 2) and Y ~ N(2, 3). Consider 2 portfolios, A and B, with the following allocations:
A = (1/2)X + (1/2)Y
B = (3/4)X + (1/4)Y
Let U = (A, B)′. Find the mean vector, covariance matrix and correlation matrix of U.
WHITE NOISE & OTHER INNOVATION SERIES
In a linear regression problem/equation, y_t = β_0 + β_1 x_{1t} + β_2 x_{2t} + ⋯ + β_p x_{pt} + u_t, we assume
u_t ~ N(0, σ²) and Corr(u_t, u_s) = 0 ∀s ≠ t
In TS analysis, the error terms, denoted Ρ_t, are assumed to be uncorrelated with each other:
Corr(Ρ_t, Ρ_{t−j}) = 0 ∀j ≠ 0
3 types of noise: a TS is called a _______ white noise (WN), denoted Ρ_t
1. Simple WN: no serial correlation
2. i.i.d. WN: no serial correlation, no serial dependence, and identical distribution (no shape assumed)
3. Gaussian WN: no serial correlation, no serial dependence, and identical Gaussian distribution (bell-shaped errors)
Simple WN: Ρ_t ~ WN(0, σ²) — zero-mean WN with constant variance σ²
Conditions:
1. Corr(Ρ_t, Ρ_{t−j}) = 0 ∀j ≠ 0 (no serial correlation)
2. 𝔼[Ρ_t] = 0

i.i.d. WN: Ρ_t ~ i.i.d. WN(0, σ²) — zero-mean i.i.d. WN with constant variance σ²
Conditions:
1. Corr(Ρ_t, Ρ_{t−j}) = 0 ∀j ≠ 0
2. Ρ_t independent of Ρ_{t−j} ∀j ≠ 0
3. Ρ_t ~ F ∀t, where F is some distribution (no shape assumed)

Gaussian WN: Ρ_t ~ i.i.d. N(0, σ²) — error terms follow a normal distribution with mean 0 and variance σ²
Conditions:
1. Corr(Ρ_t, Ρ_{t−j}) = 0 ∀j ≠ 0
2. Ρ_t independent of Ρ_{t−j} ∀j ≠ 0
3. Ρ_t ~ N(0, σ²) ∀t (shape assumed)

Note: zero correlation ≠ independence, hence the need for the 2nd condition in i.i.d. WN and Gaussian WN. Correlation only captures linear dependence; two RVs with zero correlation may still depend on each other in a non-linear fashion (see the sketch below).
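A simulation sketch of this warning: the (hypothetical) process Ρ_t = z_t·z_{t−1} with z_t ~ i.i.d. N(0, 1) is a simple white noise (zero mean, no serial correlation), yet Ρ_t and Ρ_{t−1} are dependent through the shared factor z_{t−1}, which shows up in the correlation of the squares.

```python
import numpy as np

rng = np.random.default_rng(1)
z = rng.standard_normal(200_001)   # z_t ~ i.i.d. N(0, 1)

# eps_t = z_t * z_{t-1}: zero mean and serially UNcorrelated (a simple WN),
# yet eps_t and eps_{t-1} share the factor z_{t-1}, so they are NOT independent
eps = z[1:] * z[:-1]

def corr_lag1(x):
    return np.corrcoef(x[1:], x[:-1])[0, 1]

print(corr_lag1(eps))       # ~0.00: no (linear) serial correlation
print(corr_lag1(eps**2))    # ~0.25: the squares ARE correlated -> dependence
```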
TRY THIS
Any time series Y_{t+1} may be decomposed into its conditional mean, 𝔼_t[Y_{t+1}], and a remainder process, Ρ_{t+1}:
Y_{t+1} = 𝔼_t[Y_{t+1}] + Ρ_{t+1}
Show that
a) Ρ_{t+1} has zero conditional mean given the information set available at time t
b) Ρ_{t+1} is a zero-mean white noise process
c) Ρ_{t+1} is uncorrelated with the conditional mean term, 𝔼_t[Y_{t+1}]
These three results establish the white noise properties of the remainder, so that we can be sure it behaves as a genuine error term in the decomposition. A hint for part (a) follows.
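As a hint for part (a), a one-line LIE-style sketch (using only the fact that 𝔼_t[Y_{t+1}] is already known given I_t):

```latex
\mathbb{E}_t[\varepsilon_{t+1}]
  = \mathbb{E}_t\!\left[ Y_{t+1} - \mathbb{E}_t[Y_{t+1}] \right]
  = \mathbb{E}_t[Y_{t+1}] - \mathbb{E}_t[Y_{t+1}]
  = 0
```

since 𝔼_t[Y_{t+1}] is determined by I_t and so passes through the conditional expectation unchanged; parts (b) and (c) build on this step.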
APPLICATION TO AR(1) PROCESS
Consider a TS process Y_t defined as follows:
Y_t = φY_{t−1} + Ρ_t,
Ρ_t ~ WN(0, σ²) and |φ| < 1
The above equation is referred to as an autoregressive process of order 1, denoted AR(1)
• Autoregressive / AR(1): the variable Y_t is regressed onto itself (hence the 'auto'), lagged by 1 period (hence the 'first-order').
The TS has 2 parameters: φ (1st order AR coefficient) and σ² (variance of the white noise Ρ_t)
When asked for a particular property of Y_t, it should be given as a function of φ and σ²
Y_t ~ AR(1): Y_t = φY_{t−1} + Ρ_t with Ρ_t ~ WN(0, σ²)
1. What is the unconditional mean of Y_t?
2. What is the unconditional variance of Y_t?
3. What is the 1st order autocovariance & autocorrelation of Y_t?
4. What is the 2nd order autocovariance & autocorrelation of Y_t?
5. What is the j-th order autocovariance of Y_t?
i) Unconditional mean of Y_t, 𝔼[Y_t]
𝔼[Y_t] = 𝔼[φY_{t−1} + Ρ_t] = 𝔼[φY_{t−1}] + 𝔼[Ρ_t] = φ𝔼[Y_{t−1}] + 0
As Y_t is assumed to be stationary, 𝔼[Y_t] = 𝔼[Y_{t−1}] = μ. From the above equation,
𝔼[Y_t] = φ𝔼[Y_{t−1}]
μ = φμ
μ − φμ = 0
μ(1 − φ) = 0
Hence the unconditional mean 𝔼[Y_t] = μ = 0, because |φ| < 1 by definition, so 1 − φ ≠ 0.
ii) Unconditional variance of Y_t, Var(Y_t) (denoted γ_0)
γ_0 = Var(Y_t)
= Var(φY_{t−1} + Ρ_t)
= φ²Var(Y_{t−1}) + Var(Ρ_t) + 2Cov(φY_{t−1}, Ρ_t)
= φ²Var(Y_{t−1}) + σ² + 0
The stationarity assumption on Y_t implies that Var(Y_t) does not vary with time
⟹ γ_0 = φ²γ_0 + σ²
γ_0 − φ²γ_0 = σ²
γ_0(1 − φ²) = σ²
⟹ Var(Y_t) = γ_0 = σ² / (1 − φ²)
iii) The 1st order autocovariance & autocorrelation of Y_t
Cov(Y_t, Y_{t−1}) = Cov(φY_{t−1} + Ρ_t, Y_{t−1})
= Cov(φY_{t−1}, Y_{t−1}) + Cov(Ρ_t, Y_{t−1})
= φVar(Y_{t−1}) + 0
= φγ_0 = φσ² / (1 − φ²)
ρ_1 ≡ Corr(Y_t, Y_{t−1}) = Cov(Y_t, Y_{t−1}) / Var(Y_t) = φγ_0 / γ_0 = φ
Equipped with only the AR(1) equation, i.e. Y_t = φY_{t−1} + Ρ_t, confirm that the 2nd order autocovariance is γ_2 = φ²σ² / (1 − φ²) and the 2nd order autocorrelation of Y_t is ρ_2 = φ². A simulation check of these moments follows.
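A minimal simulation sketch (illustrative parameter values, numpy assumed) comparing the derived moments μ = 0, γ_0 = σ²/(1 − φ²) and ρ_j = φ^j with their sample counterparts on a long AR(1) path:

```python
import numpy as np

rng = np.random.default_rng(7)
phi, sigma, T = 0.7, 1.0, 200_000   # illustrative parameters, |phi| < 1

# Simulate Y_t = phi*Y_{t-1} + eps_t with Gaussian WN errors
eps = sigma * rng.standard_normal(T)
y = np.empty(T)
y[0] = eps[0]
for t in range(1, T):
    y[t] = phi * y[t - 1] + eps[t]

rho = [np.corrcoef(y[j:], y[: T - j])[0, 1] for j in (1, 2)]

print(y.mean())                          # ~0               (mu = 0)
print(y.var(), sigma**2 / (1 - phi**2))  # ~1.96 vs 1.9608  (gamma_0)
print(rho, [phi, phi**2])                # ~[0.70, 0.49]    (rho_j = phi^j)
```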
LET’S TRY THIS
Consider the AR(2) process: z_t = α_0 + α_1 z_{t−1} + α_2 z_{t−2} + Ρ_t, where Ρ_t is a zero-mean white noise
process with variance σ², and assume |α_1| < 1, |α_2| < 1 and α_1 + α_2 < 1, which together ensure z_t is
covariance stationary.
a) Calculate the conditional and unconditional means of z_t, i.e. 𝔼_{t−1}[z_t] and 𝔼[z_t].
b) Let us now set α_2 = 0. Calculate the conditional and unconditional variances of z_t.
c) Keeping α_2 = 0, derive the autocovariance and autocorrelation functions of this process for all lags
as functions of the parameters α_1 and σ.
Suppose now that α_2 ≠ 0, and let us denote the autocovariance at lag k by γ_k ≡ Cov(z_t, z_{t−k}).
d) Using the AR(2) equation, write down a recursive formula for γ_k, i.e. express γ_k as a function of
γ_{k−1}, γ_{k−2} and the model parameters.
e) Apply this recursive formula for k = 1 and k = 0, and explain how to solve for the whole
autocovariance function {γ_k}_{k≥0}. Note: no need to derive the exact values! Hint: think about what γ_{−1}
and γ_{−2} mean. (A simulation sketch of the recursion in (d) appears after part (f).)
f) Can a linear transformation of the z_t process be represented by an AR(1) process? That is, do
appropriate constants κ, δ_0 and δ_1 exist such that the process defined as y_t = z_t + κz_{t−1} satisfies
y_t = δ_0 + δ_1 y_{t−1} + Ο΅_t, where Ο΅_t is a zero-mean white noise process? Explain.
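As noted in (e), a minimal simulation sketch (illustrative parameters; not a derivation) to sanity-check the recursive formula for γ_k asked for in (d):

```python
import numpy as np

rng = np.random.default_rng(3)
a0, a1, a2, sigma, T = 0.5, 0.5, 0.3, 1.0, 200_000  # illustrative, stationary

# Simulate z_t = a0 + a1*z_{t-1} + a2*z_{t-2} + eps_t
eps = sigma * rng.standard_normal(T)
z = np.zeros(T)
for t in range(2, T):
    z[t] = a0 + a1 * z[t - 1] + a2 * z[t - 2] + eps[t]

def gamma(k):
    """Sample autocovariance at lag k."""
    dev = z - z.mean()
    return np.mean(dev[k:] * dev[: T - k])

# The recursion from (d) should hold (approximately, in sample) for k >= 2;
# the symmetry gamma_{-k} = gamma_k is what closes the system at k = 0 and 1.
for k in (2, 3, 4):
    print(gamma(k), a1 * gamma(k - 1) + a2 * gamma(k - 2))
```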