FN3142 QUANTITATIVE FINANCE
CHAP 3: BASIC TIME SERIES CONCEPTS

Essential Reading: Chapters 7 & 8 of Elements of Forecasting (Diebold) (only the parts that overlap with these notes)
Further Reading: Chapter 2 of Analysis of Financial Time Series (Tsay)

FOR THIS CHAPTER…

AIMS
• Introduce standard measures of the predictability of time series: autocovariance and autocorrelation.
• Introduce 'white noise' processes, a class of time series that are not predictable.
• Present the 'law of iterated expectations' and illustrate its use in time series analysis.

LEARNING OUTCOMES
1. Able to describe the various forms of white noise processes used in the analysis of financial data.
2. Able to use the 'law of iterated expectations' to derive unconditional means from conditional means.
3. Able to apply these tools to a simple AR(1) process.

REPRESENTATION OF TIME SERIES (TS)

A sample of observations on a TS is denoted $y_1, y_2, y_3, \dots, y_T$.
Observations before the start of the sample: $\dots, y_{-3}, y_{-2}, y_{-1}, y_0$
Observations beyond the end of the sample: $y_{T+1}, y_{T+2}, y_{T+3}, \dots$
Combining these, the complete sequence of observations on the TS is
$$\{y_t\}_{t=-\infty}^{\infty} = \dots, y_{-3}, y_{-2}, y_{-1}, y_0, \underbrace{y_1, y_2, y_3, \dots, y_T}_{\text{observed data}}, y_{T+1}, y_{T+2}, y_{T+3}, \dots$$

COVARIANCE STATIONARY TIME SERIES

A time series $Y_t$ is said to be covariance stationary if it satisfies the following conditions:
1. $\mu_t = E[Y_t] = \mu \ \forall t$
   • The unconditional mean is the same at every point in time.
2. $\sigma_t^2 = \mathrm{Var}(Y_t) = \sigma^2 \ \forall t$
   • The unconditional variance is the same at every point in time.
3. $\gamma_{t,t-j} = \mathrm{Cov}(Y_t, Y_{t-j}) = \gamma_j \ \forall t, j$
   • The covariance depends only on the time difference $j$, not on the time origin $t$.
   • E.g. $\gamma_{6,0} = \mathrm{Cov}(Y_6, Y_0) = \gamma_6 = \gamma_{27,21} = \gamma_{10,4}$.

[Figure: a stationary process vs a non-stationary process. Source: https://www.statisticshowto.com/stationarity/]

AUTOCOVARIANCE & AUTOCORRELATION

• The j-th order autocovariance of a stationary time series $Y_t$ is defined as
$$\gamma_j = \mathrm{Cov}(Y_t, Y_{t-j}) = E[(Y_t - \mu)(Y_{t-j} - \mu)] = E[Y_t Y_{t-j}] - \mu^2$$
Setting $j = 0$ produces $\gamma_0 = \mathrm{Cov}(Y_t, Y_t) = \mathrm{Var}(Y_t) = \sigma^2$.
Variance and autocovariance are scale-dependent: their units are the square of the units of $Y_t$.

• The j-th order autocorrelation of a stationary time series $Y_t$ is defined as
$$\rho_j = \mathrm{Corr}(Y_t, Y_{t-j}) = \frac{\mathrm{Cov}(Y_t, Y_{t-j})}{\mathrm{Var}(Y_t)} = \frac{\gamma_j}{\gamma_0} \in [-1, 1]$$
with $\rho_0 = \mathrm{Corr}(Y_t, Y_t) = 1$.
If a TS $Y_t$ has $\gamma_j \neq 0$ for some $j \neq 0$, the TS is said to be serially correlated (or autocorrelated).
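To make the definitions above concrete, here is a minimal Python sketch (my own illustration, not part of the original notes; function names are assumptions) that estimates the sample autocovariance $\hat\gamma_j$ and autocorrelation $\hat\rho_j$ of a series:

```python
import numpy as np

def sample_autocovariance(y, j):
    """Sample j-th order autocovariance: average of (y_t - ybar)(y_{t-j} - ybar)."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    ybar = y.mean()
    # The conventional estimator divides by T (not T - j) at every lag.
    return np.sum((y[j:] - ybar) * (y[:T - j] - ybar)) / T

def sample_autocorrelation(y, j):
    """Sample j-th order autocorrelation: gamma_hat_j / gamma_hat_0."""
    return sample_autocovariance(y, j) / sample_autocovariance(y, 0)

# Demo on a serially uncorrelated Gaussian series:
rng = np.random.default_rng(42)
y = rng.normal(size=10_000)
print([round(sample_autocorrelation(y, j), 3) for j in range(4)])
# rho_0 is 1 by construction; the estimates at j = 1, 2, 3 should be near 0.
```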
LAW OF ITERATED EXPECTATIONS (LIE)

Let $I_1$ and $I_2$ be two related information sets satisfying $I_1 \subseteq I_2$:
• $I_2$ is bigger than $I_1$ ($I_2$ contains more information than $I_1$).
• In general, $I_0 \subseteq I_t \subseteq I_{t+1} \ \forall t$ (information grows with time).

By the law of iterated expectations (LIE),
$$E\big[\,E[Y \mid I_2]\,\big|\,I_1\big] = E[Y \mid I_1]$$
Note the final conditioning information set: the ultimate result is the expectation conditional on $I_1$, the smaller information set.

The LIE is very helpful for calculating conditional expectations, e.g.
$$E_t\big[E_{t+1}[Y_{t+2}]\big] = E\big[\,E[Y_{t+2} \mid I_{t+1}]\,\big|\,I_t\big] = E[Y_{t+2} \mid I_t]$$

An unconditional expectation is equivalent to a conditional expectation on an 'empty' or null information set $\Phi$, which is smaller than any non-empty information set:
$$E[Y_{t+2} \mid \Phi] = E[Y_{t+2}]$$

RULES FOR EXPECTATIONS, VARIANCES & COVARIANCES

Let X, Y and Z be three scalar random variables, and let a, b, c and d be constants. Then
$$E[a + bX + cY] = a + bE[X] + cE[Y]$$
$$\mathrm{Var}(a + bX) = \mathrm{Var}(bX) = b^2\,\mathrm{Var}(X)$$
$$\mathrm{Var}(a + bX + cY) = \mathrm{Var}(bX + cY) = \mathrm{Var}(bX) + \mathrm{Var}(cY) + 2\,\mathrm{Cov}(bX, cY) = b^2\,\mathrm{Var}(X) + c^2\,\mathrm{Var}(Y) + 2bc\,\mathrm{Cov}(X, Y)$$
$$\mathrm{Cov}(a + bX, cY + dZ) = \mathrm{Cov}(bX, cY) + \mathrm{Cov}(bX, dZ) = bc\,\mathrm{Cov}(X, Y) + bd\,\mathrm{Cov}(X, Z)$$

Example: Consider two stocks X and Y whose returns are normally distributed: $X \sim N(1, 2)$ and $Y \sim N(2, 3)$. Consider two portfolios, A and B, with the following allocations:
$$A = \tfrac{1}{2}X + \tfrac{1}{2}Y, \qquad B = \tfrac{3}{4}X + \tfrac{1}{4}Y$$
Let $U = (A, B)'$. Find the mean vector, covariance matrix and correlation matrix of U.

WHITE NOISE & OTHER INNOVATION SERIES

In a linear regression equation
$$y_t = \beta_0 + \beta_1 x_{1t} + \beta_2 x_{2t} + \dots + \beta_k x_{kt} + u_t$$
we assume $u_t \sim N(0, \sigma^2)$ and $\mathrm{Corr}(u_t, u_s) = 0 \ \forall s \neq t$.

In TS analysis the error terms, denoted $\varepsilon_t$, are likewise assumed to be uncorrelated with each other:
$$\mathrm{Corr}(\varepsilon_t, \varepsilon_{t-j}) = 0 \ \forall j \neq 0$$

Depending on which conditions it satisfies, a TS $\varepsilon_t$ is called one of three types of white noise (WN):

1. Simple WN: $\varepsilon_t \sim WN(0, \sigma^2)$ — zero-mean WN with constant variance $\sigma^2$.
   Conditions:
   • $\mathrm{Corr}(\varepsilon_t, \varepsilon_{t-j}) = 0 \ \forall j \neq 0$ (no serial correlation)
   • $E[\varepsilon_t] = 0$

2. i.i.d. WN: $\varepsilon_t \sim$ i.i.d. $WN(0, \sigma^2)$ — zero-mean i.i.d. WN with constant variance $\sigma^2$. No serial correlation, no serial dependence, identical distribution (no shape assumed).
   Conditions:
   • $\mathrm{Corr}(\varepsilon_t, \varepsilon_{t-j}) = 0 \ \forall j \neq 0$
   • $\varepsilon_t$ independent of $\varepsilon_{t-j} \ \forall j \neq 0$
   • $\varepsilon_t \sim F \ \forall t$, where F is some distribution (no shape assumed)

3. Gaussian WN: $\varepsilon_t \sim$ i.i.d. $N(0, \sigma^2)$ — the error terms follow a normal distribution with mean 0 and variance $\sigma^2$ (bell-shaped errors).
   Conditions:
   • $\mathrm{Corr}(\varepsilon_t, \varepsilon_{t-j}) = 0 \ \forall j \neq 0$
   • $\varepsilon_t$ independent of $\varepsilon_{t-j} \ \forall j \neq 0$
   • $\varepsilon_t \sim N(0, \sigma^2)$ (shape assumed)

Zero correlation does not imply independence, which is why the independence condition must be stated separately for i.i.d. WN and Gaussian WN. Correlation captures only linear dependence: two random variables with zero correlation may still depend on each other in a non-linear fashion, as the sketch below illustrates.
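A minimal numerical sketch of this point (my own illustration, not from the notes): if $Z \sim N(0,1)$, then $X = Z$ and $Y = Z^2$ are uncorrelated, since $\mathrm{Cov}(Z, Z^2) = E[Z^3] = 0$, yet $Y$ is a deterministic function of $X$.

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.normal(size=100_000)
x, y = z, z**2  # y is completely determined by x, so they are NOT independent

# The sample correlation is near zero because Cov(Z, Z^2) = E[Z^3] = 0 for a
# symmetric distribution -- correlation detects no *linear* relationship.
print(f"corr(x, y)    = {np.corrcoef(x, y)[0, 1]:+.4f}")   # close to 0

# The dependence is easy to expose: x**2 and y are perfectly correlated.
print(f"corr(x^2, y)  = {np.corrcoef(x**2, y)[0, 1]:+.4f}")  # exactly 1
```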
TRY THIS

Any time series $Y_{t+1}$ may be decomposed into its conditional mean, $E_t[Y_{t+1}]$, and a remainder process, $e_{t+1}$:
$$Y_{t+1} = E_t[Y_{t+1}] + e_{t+1}$$
Show that:
a) $e_{t+1}$ has zero conditional mean given the information set available at time $t$.
b) $e_{t+1}$ is a zero-mean white noise process.
c) $e_{t+1}$ is uncorrelated with the conditional mean term, $E_t[Y_{t+1}]$.

These three questions establish the white noise properties of the remainder process, so that we can be confident in treating it as the error term of the process.

APPLICATION TO AR(1) PROCESS

Consider a TS process $Y_t$ defined as follows:
$$Y_t = \phi Y_{t-1} + \varepsilon_t, \qquad \varepsilon_t \sim WN(0, \sigma^2), \quad |\phi| < 1$$
The above equation is referred to as an autoregressive process of order 1, denoted AR(1):
• Autoregressive: the variable $Y_t$ is regressed onto itself (hence the 'auto'), lagged by one period (hence the 'first-order').
The TS has two parameters: $\phi$ (the 1st order AR coefficient) and $\sigma^2$ (the variance of the white noise $\varepsilon_t$). When asked for a particular property of $Y_t$, it should be given as a function of $\phi$ and $\sigma^2$.

For $Y_t \sim AR(1)$, i.e. $Y_t = \phi Y_{t-1} + \varepsilon_t$ with $\varepsilon_t \sim WN(0, \sigma^2)$:
1. What is the unconditional mean of $Y_t$?
2. What is the unconditional variance of $Y_t$?
3. What are the 1st order autocovariance and autocorrelation of $Y_t$?
4. What are the 2nd order autocovariance and autocorrelation of $Y_t$?
5. What is the j-th order autocovariance of $Y_t$?

i) Unconditional mean of $Y_t$, i.e. $E[Y_t]$:
$$E[Y_t] = E[\phi Y_{t-1} + \varepsilon_t] = \phi E[Y_{t-1}] + E[\varepsilon_t] = \phi E[Y_{t-1}] + 0$$
As $Y_t$ is assumed to be stationary, $E[Y_t] = E[Y_{t-1}] = \mu$. From the above equation,
$$\mu = \phi\mu \implies \mu(1 - \phi) = 0$$
Hence the unconditional mean is $E[Y_t] = \mu = 0$, because $|\phi| < 1$ by definition (so $1 - \phi \neq 0$).

ii) Unconditional variance of $Y_t$, $\mathrm{Var}(Y_t)$ (denoted $\gamma_0$):
$$\gamma_0 = \mathrm{Var}(Y_t) = \mathrm{Var}(\phi Y_{t-1} + \varepsilon_t) = \phi^2\,\mathrm{Var}(Y_{t-1}) + \mathrm{Var}(\varepsilon_t) + 2\,\mathrm{Cov}(\phi Y_{t-1}, \varepsilon_t) = \phi^2\,\mathrm{Var}(Y_{t-1}) + \sigma^2 + 0$$
The stationarity assumption on $Y_t$ implies that $\mathrm{Var}(Y_t)$ does not vary with time, so
$$\gamma_0 = \phi^2\gamma_0 + \sigma^2 \implies \gamma_0(1 - \phi^2) = \sigma^2 \implies \mathrm{Var}(Y_t) = \gamma_0 = \frac{\sigma^2}{1 - \phi^2}$$

iii) The 1st order autocovariance and autocorrelation of $Y_t$:
$$\gamma_1 = \mathrm{Cov}(Y_t, Y_{t-1}) = \mathrm{Cov}(\phi Y_{t-1} + \varepsilon_t, Y_{t-1}) = \phi\,\mathrm{Var}(Y_{t-1}) + \mathrm{Cov}(\varepsilon_t, Y_{t-1}) = \phi\gamma_0 + 0 = \phi\,\frac{\sigma^2}{1 - \phi^2}$$
$$\rho_1 \equiv \mathrm{Corr}(Y_t, Y_{t-1}) = \frac{\mathrm{Cov}(Y_t, Y_{t-1})}{\mathrm{Var}(Y_t)} = \frac{\phi\gamma_0}{\gamma_0} = \phi$$

Equipped with only the AR(1) equation, i.e. $Y_t = \phi Y_{t-1} + \varepsilon_t$, confirm that the 2nd order autocovariance of $Y_t$ is $\gamma_2 = \phi^2\,\frac{\sigma^2}{1 - \phi^2}$ and the 2nd order autocorrelation is $\rho_2 = \phi^2$.

LET'S TRY THIS

Consider the AR(2) process:
$$z_t = \alpha_0 + \alpha_1 z_{t-1} + \alpha_2 z_{t-2} + \varepsilon_t$$
where $\varepsilon_t$ is a zero-mean white noise process with variance $\sigma^2$, and assume $|\alpha_1|$, $|\alpha_2|$ and $|\alpha_1 + \alpha_2|$ are all less than 1, which together ensure $z_t$ is covariance stationary.
a) Calculate the conditional and unconditional means of $z_t$, i.e. $E_{t-1}[z_t]$ and $E[z_t]$.
b) Let us now set $\alpha_2 = 0$. Calculate the conditional and unconditional variances of $z_t$.
c) Keeping $\alpha_2 = 0$, derive the autocovariance and autocorrelation functions of this process for all lags, as functions of the parameters $\alpha_1$ and $\sigma$.
Suppose now that $\alpha_2 \neq 0$, and denote the autocovariance at lag k by $\gamma_k \equiv \mathrm{Cov}(z_t, z_{t-k})$.
d) Using the AR(2) equation, write down a recursive formula for $\gamma_k$, i.e. express $\gamma_k$ as a function of $\gamma_{k-1}$, $\gamma_{k-2}$ and the model parameters.
e) Apply this recursive formula for $k = 1$ and $k = 0$, and explain how to solve for the whole autocovariance function $\{\gamma_k\}_{k \geq 0}$. Note: no need to derive the exact values! Hint: think about what $\gamma_{-1}$ and $\gamma_{-2}$ mean.
f) Can a linear transformation of the $z_t$ process be represented by an AR(1) process? That is, do appropriate constants $\theta$, $\delta_0$ and $\delta_1$ exist such that the process defined as $y_t = z_t + \theta z_{t-1}$ satisfies $y_t = \delta_0 + \delta_1 y_{t-1} + u_t$, where $u_t$ is a zero-mean white noise process? Explain.
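As a numerical sanity check on the AR(1) results derived above (a sketch of my own, not part of the notes): simulate a long AR(1) path driven by Gaussian white noise and compare the sample mean, variance and autocorrelations against the theoretical values $\mu = 0$, $\gamma_0 = \sigma^2/(1 - \phi^2)$ and $\rho_j = \phi^j$ (the pattern that questions 4 and 5 lead to).

```python
import numpy as np

phi, sigma = 0.8, 1.0             # AR(1) coefficient (|phi| < 1) and WN std dev
T, burn = 200_000, 1_000          # long sample plus a burn-in to forget y_0

rng = np.random.default_rng(1)
eps = rng.normal(0.0, sigma, size=T + burn)   # Gaussian WN(0, sigma^2)
y = np.zeros(T + burn)
for t in range(1, T + burn):
    y[t] = phi * y[t - 1] + eps[t]            # y_t = phi * y_{t-1} + eps_t
y = y[burn:]                                  # drop the burn-in period

print(f"sample mean     {y.mean():+.4f}  (theory: 0)")
print(f"sample variance {y.var():.4f}  (theory: {sigma**2 / (1 - phi**2):.4f})")
for j in (1, 2, 3):
    rho_hat = np.corrcoef(y[j:], y[:-j])[0, 1]   # sample autocorrelation at lag j
    print(f"rho_{j} = {rho_hat:.4f}  (theory: phi^{j} = {phi**j:.4f})")
```

With a sample this long, the estimates should sit within a percent or so of the theoretical values, which makes this a handy way to check your own derivations for the exercises above.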