Time-Series Analysis

A time-series is a sequential collection of data observations indexed over
time. In most cases, the observed data is continuous and is recorded at a
discrete and finite set of equally-spaced points. An n-element time-series is
denoted as x = (x0, x1, x2, ... , xn-1), where the time-indexed distance between
any two successive observations is referred to as the sampling interval.
A widely held theory assumes that a time-series is composed of four
components:

- A trend, or long-term movement.
- A cyclical fluctuation about the trend.
- A pronounced seasonal effect.
- A residual, irregular, or random effect.
Collectively, these components make the analysis of a time-series a far more
challenging task than just fitting a linear or nonlinear regression model.
Adjacent observations are unlikely to be independent of one another.
Clusters of observations are frequently correlated with increasing strength as
the time intervals between them become shorter. Often the analysis is a
multi-step process involving graphical and numerical methods.
The first step in the analysis of a time-series is its transformation to a
stationary series. A stationary series exhibits statistical properties that are
unchanged as the period of observation is moved forward or backward in
time. Specifically, the mean and variance of a stationary time-series remain
fixed in time. The sample autocorrelation function is a commonly used tool in
determining the stationarity of a time-series. The autocorrelation of a
time-series measures the dependence between observations as a function of their
time differences or lag. A plot of the sample autocorrelation coefficients
against corresponding lags can be very helpful in determining the stationarity
of a time-series.
For example, suppose the IDL variable X contains time-series data:
X = [5.44, 5.21, 6.16, 5.36, 5.29, $
     6.38, 5.23, 6.07, 5.17, 5.58, $
     5.43, 4.33, 6.56, 5.35, 4.77, $
     5.22, 5.58, 5.93, 5.61, 5.17, $
     5.28, 6.18, 5.70, 5.83, 5.33]
The following IDL commands plot both the time-series data and the sample
autocorrelation versus the lags.
; Set the plotting window to hold two plots and plot the data:
IPLOT, X, VIEW_GRID=[1,2]
; Compute the sample autocorrelation function for time
; lagged values 0 - 20 and plot:
lag = INDGEN(21)
result = A_CORRELATE(X, lag)
IPLOT, lag, result, /VIEW_NEXT
; Add a reference line at zero:
IPLOT, [0,20], [0,0], /OVERPLOT
The following figure shows the resulting graphs.
Figure 11-3: Time-series data (Top) and Autocorrelation of that Data
Versus the Lag (Bottom)
The top graph plots time-series data. The bottom graph plots the
autocorrelation of that data versus the lag. Because the time-series has a
significant autocorrelation up to a lag of seven, it must be considered nonstationary.
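The sample autocorrelation at lag k is the covariance between observations k steps apart divided by the overall sample variance. The following Python/numpy sketch reproduces that computation on the same data; the function name `autocorrelate` is illustrative, not an IDL routine:

```python
import numpy as np

def autocorrelate(x, lags):
    """Sample autocorrelation of x at each lag in `lags`:
    the covariance at that lag divided by the sample variance."""
    x = np.asarray(x, dtype=float)
    n = x.size
    xm = x - x.mean()
    denom = np.sum(xm * xm)
    return np.array([np.sum(xm[:n - k] * xm[k:]) / denom for k in lags])

x = [5.44, 5.21, 6.16, 5.36, 5.29, 6.38, 5.23, 6.07, 5.17, 5.58,
     5.43, 4.33, 6.56, 5.35, 4.77, 5.22, 5.58, 5.93, 5.61, 5.17,
     5.28, 6.18, 5.70, 5.83, 5.33]
acf = autocorrelate(x, range(21))
print(acf[0])  # the lag-0 coefficient is always exactly 1.0
```

Plotting `acf` against `range(21)` gives the same picture as the IPLOT commands above.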
Nonstationary components of a time-series may be eliminated in a variety of
ways. Two frequently used methods are known as moving averages and
forward differencing. The method of moving averages dampens fluctuations
in a time-series by taking successive averages of groups of observations.
Each successive overlapping sequence of k observations in the series is
replaced by the mean of that sequence. The method of forward differencing
replaces each time-series observation with the difference of the current
observation and its adjacent observation one step forward in time.
Differencing may be computed recursively to eliminate more complex
nonstationary components.
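Both transformations are a few lines of code. The Python sketch below (numpy-based, with illustrative function names) implements a k-term moving average and recursive forward differencing, and checks that a single difference removes a linear trend:

```python
import numpy as np

def moving_average(x, k):
    """Replace each overlapping run of k observations with its mean;
    the smoothed output has n - k + 1 points."""
    x = np.asarray(x, dtype=float)
    return np.convolve(x, np.ones(k) / k, mode='valid')

def forward_difference(x, order=1):
    """Replace each observation with x[t+1] - x[t]; applying the
    difference recursively (`order` times) removes progressively
    higher-order trend components."""
    x = np.asarray(x, dtype=float)
    for _ in range(order):
        x = x[1:] - x[:-1]
    return x

t = np.arange(10, dtype=float)
trend = 0.5 * t + 3.0             # a pure linear trend
print(forward_difference(trend))  # constant 0.5 at every step
```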
Once a time-series has been transformed to stationarity, it may be modeled
using an autoregressive process. An autoregressive process expresses the
current observation, xt, as a combination of past time-series values and
residual white noise. The simplest case is known as a first order
autoregressive model and is expressed as
xt = xt-1 + t
The coefficient  is estimated using the time-series data. The general
autoregressive model of order p is expressed as
xt = 1xt-1 +2xt-2 + ... + pxt-p + t
Modeling a stationary time-series as a p-th order autoregressive process
allows the extrapolation of data for future values of time. This process is
known as forecasting.
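Estimating the coefficients φ1, ..., φp and extrapolating can be done by ordinary least squares on lagged copies of the series. A stand-alone Python sketch (the helper names are illustrative; it assumes the series is already stationary and centered):

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares estimates of the AR(p) coefficients phi_1..phi_p in
    x_t = phi_1*x_{t-1} + ... + phi_p*x_{t-p} + eps_t."""
    x = np.asarray(x, dtype=float)
    # Design matrix: row for time t holds [x_{t-1}, ..., x_{t-p}]
    A = np.column_stack([x[p - j:len(x) - j] for j in range(1, p + 1)])
    b = x[p:]
    phi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return phi

def forecast(x, phi, steps):
    """Extrapolate `steps` values ahead using the fitted AR coefficients."""
    hist = list(np.asarray(x, dtype=float))
    for _ in range(steps):
        hist.append(sum(c * hist[-1 - j] for j, c in enumerate(phi)))
    return hist[-steps:]

# A noise-free AR(1) series with phi = 0.8 is recovered by the fit:
x = [1.0]
for _ in range(30):
    x.append(0.8 * x[-1])
phi = fit_ar(x, 1)
print(phi)  # ~[0.8]
```

With noisy real data the fitted φ values are only estimates, and each forecast step compounds the model error.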