A Time-changed NGARCH model on the leverage and volatility clustering effects of extreme events: Evidence from the S&P 500 index over the 2008 financial crisis

Lie-Jane Kao, Department of Finance and Banking, KaiNan University, Taoyuan, Taiwan
Po-Cheng Wu, Department of Finance and Banking, KaiNan University, Taoyuan, Taiwan
Cheng-Few Lee, Department of Finance & Economics, Rutgers University, NJ, USA

Abstract

In the market, the volatility clustering phenomenon is often observed after the occurrence of extreme events. If the extreme events are accompanied by bad news, a leverage feedback that magnifies the fluctuations follows. An extension of the variance-gamma GARCH (VG GARCH) model of Kao and Lee (2009), which unifies two strands of option pricing models, namely the VG NGARCH model, is proposed and calibrated in this study. Three competing models, the Gaussian NGARCH model, the variance-gamma (VG) model, and the variance-gamma NGARCH (VG NGARCH) model, are compared on the S&P 500 weekly index series from January 3, 1980 to February 17, 2009. A Monte Carlo method, namely the Monte Carlo EM algorithm, is employed for model estimation and calibration. The goodness of fit of the VG NGARCH model to the S&P 500 data, with the volatility clustering and leverage feedback phenomena caused by the extreme event, is demonstrated.

Key words: Volatility clustering, Leverage feedback effect, Variance-gamma GARCH model, Gaussian NGARCH model, Variance-gamma model, Variance-gamma NGARCH model, Monte Carlo EM algorithm.

JEL Classifications: C13, C16, C52, C63

1. Introduction

Financial markets are often confronted by extreme events such as earnings surprises, crashes, or terrorist attacks, which cause abrupt and unanticipated large changes in market prices. Moreover, the volatility clustering phenomenon, due to the propagation of previous information arrivals, prolongs the fluctuations in the market caused by the extreme events.
At the same time, the leverage feedback effect is observed empirically (Nelson (1991), Campbell and Hentschel (1992), and Engle and Ng (1993)), under which volatility following bad news is higher than volatility following good news. This implies that bad news may drag the market down by more than comparable good news would push it up. The leverage feedback effect has magnified the fluctuations in the market caused by extreme events. For example, after Lehman Brothers declared bankruptcy on September 14, 2008, a series of bank and insurance company failures triggered the global financial crisis, during which the market fluctuated dramatically. It was this extreme event, the declaration of Lehman Brothers' bankruptcy, together with the volatility clustering and the leverage feedback effect caused by the bankruptcy news, that resulted in the catastrophic financial crisis of 2008. In order to describe a market with extreme events, a compound Poisson process has been added as an additional component to the standard Black-Scholes model (Black and Scholes (1973), Merton (1973)). Such mixture models are broadly referred to as jump-diffusion models (Merton (1976); Cox and Ross (1976); Andersen and Andreasen (2001); Duffie et al. (2000)); they capture a continuous sample path with stochastic jumps over time to account for local price movements as well as rare large price movements. However, as price movements are driven by information (Jones et al. (1994)) that arrives discretely over time, it would be unrealistic to describe the asset price dynamics with continuous components in the sample path. A more adequate model would adopt a pure jump process for the asset price dynamics. Madan et al. (1990, 1991, 1998), Carr et al.
(1998, 2003), and Geman, Madan and Yor (2001) considered the use of a pure jump process, obtained by evaluating a Brownian motion at a random business time governed by a gamma process, i.e., the variance-gamma process, to replace the role of Brownian motion in the Black-Scholes geometric Brownian motion model. Being a pure jump process, the variance-gamma process has an infinite arrival rate of jumps with stochastic jump sizes, and extreme events are implicitly accounted for by jumps of larger sizes. As a Brownian motion evaluated at a random business time with constant mean and variance rates, however, the aforementioned variance-gamma model fails to capture the volatility clustering phenomenon and the corresponding leverage effects. On the other hand, the phenomenon of volatility clustering has led to the use of ARCH/GARCH-type models in financial forecasting and derivatives pricing (Engle, 1982; Bollerslev, 1986; Forsberg and Bollerslev, 2002; Duan, 1995, 1999). Later, to account for the leverage effect, the nonlinear-in-mean asymmetric GARCH (NGARCH) model was proposed by Engle and Ng (1993). By assuming that the return innovations are conditionally Gaussian given all the previous information arrivals, however, Gaussian ARCH/GARCH-type models fail to capture the excess kurtosis and skewness caused by extreme events. To remedy the excess kurtosis and skewness problem, non-Gaussian ARCH/GARCH-type models have been proposed (Duan (1999); Stentoft (2008)). However, the locally risk-neutralized principle of Duan (1995) is violated if the return innovations are non-Gaussian (Duan (1999)). To overcome the aforementioned shortcomings of these various types of models, a variance-gamma GARCH (VG GARCH) model was developed in previous work (Kao and Lee, 2009).
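The variance-gamma construction described above, a Brownian motion with drift evaluated at a gamma-distributed random business time, can be sketched in a few lines. The code below is an illustrative simulation, not the authors' implementation, and the parameter values are arbitrary.

```python
import numpy as np

def vg_increments(T, theta, sigma, nu, seed=None):
    """Simulate T unit-time variance-gamma increments: a Brownian motion with
    drift theta and volatility sigma evaluated at a gamma random business time."""
    rng = np.random.default_rng(seed)
    # Gamma time-changes with mean 1 and variance nu per unit interval
    # (shape 1/nu, scale nu).
    g = rng.gamma(shape=1.0 / nu, scale=nu, size=T)
    # Conditional on g, the increment is N(theta*g, sigma^2*g).
    return theta * g + sigma * np.sqrt(g) * rng.standard_normal(T)

x = vg_increments(10_000, theta=-0.0085, sigma=0.0215, nu=0.3, seed=1)
```

Larger values of nu make the business clock more erratic and the increment distribution more heavy-tailed, which is how the process accommodates extreme events without a diffusion component.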
In contrast to traditional ARCH/GARCH-type models, in which previous return innovations determine the conditional variance of the return innovation ε_t directly, in the VG GARCH model previous return innovations affect the conditional variance of ε_t through the conditional means of the random business-time changes, i.e., the shape/scale parameters of the variance-gamma process, within the GARCH framework. In this study, the performance of the proposed VG NGARCH model is compared to two benchmark models, namely the Gaussian NGARCH model and the VG model. The S&P 500 weekly index series from January 3, 1980 to February 17, 2009, which includes index values before and after the 2008 financial crisis, is used to illustrate the performance of the model. By introducing a random business time into the GARCH framework, an extra complication is added to model calibration and parameter estimation. A Monte Carlo method, namely the Monte Carlo EM algorithm, in conjunction with the Metropolis algorithm, is proposed for parameter estimation in the VG and VG GARCH models. The method gives maximum likelihood estimates of the model parameters via iterative sampling of the unknown random business times. The outline of the paper is as follows. Section 2 gives the specification of the three models under consideration, namely the Gaussian NGARCH model, the VG model, and the VG NGARCH model. The empirical analysis and the Monte Carlo EM estimation procedure are presented in Section 3. Section 4 concludes.

2. Three competing models and their likelihoods

In this paper, the goodness of fit of the VG NGARCH model is compared to two benchmark models, namely the Gaussian NGARCH model and the VG model.

2.1 Gaussian NGARCH model and its likelihood

Consider a discrete-time economy where the trading period [0, T] is partitioned into T subintervals (0, 1], (1, 2], ..., (T-1, T], respectively.
In the Gaussian GARCH-type model, the dynamics of the log-return Y_t = ln[S(t)/S(t-1)] under the data-generating measure P is

Y_t = r_t + λ√h_t − 0.5 h_t + ε_t,  t = 1, ..., T,   (1)

where λ is the risk premium parameter; see Duan (1995, 1999). The innovation ε_t is conditionally normally distributed with mean zero and variance h_t, i.e.,

ε_t | F_{t-1} ~ N(0, h_t).   (2)

The conditional variance h_t is measurable with respect to the information set F_{t-1} containing all the information up to time t-1 and satisfies

h_t = a_0 + a (ε_{t-1} − c√h_{t-1})² + b h_{t-1},   (3)

which takes into consideration the leverage effect of Engle and Ng (1993). The logarithm of the likelihood p(Y; Θ) is

log p(Y; Θ) = −(T/2) log 2π − (1/2) Σ_{t=1}^{T} [ log h_t + (Y_t − r_t − λ√h_t + 0.5 h_t)² / h_t ],   (4)

where Θ = (λ, h_1, ω) denotes the vector of unknown parameters, with the risk premium parameter λ, the initial variance h_1, and the NGARCH parameters ω = (a_0, a, b, c) in (3).

2.2 Variance gamma model and its likelihood

In the variance-gamma stock price dynamics, the role of Brownian motion in the Black-Scholes model is replaced by the variance-gamma process X(t), where

X(t) = θg + σW(g).   (5)

Here g = g(t; μ, ν) − g(t_0; μ, ν) is the increment of a gamma process {g(t; μ, ν): t > 0} with shape and scale parameters μ and ν, respectively. The parameters θ and σ in (5) are the drift and volatility parameters, respectively, and W is a standard Brownian motion. According to Madan et al. (1998), given the amount of random time-change g_t during the interval (t-1, t], the specification for the dynamics of the log-return Y_t = ln[S(t)/S(t-1)] at time t under the data-generating measure P is

Y_t = m + ω + θ g_t + ε_t,  t = 1, ..., T,   (6)

where m is the mean rate of return, and the innovation ε_t is conditionally Gaussian distributed with mean zero and variance σ²g_t, i.e.,

ε_t | F_{t-1} ~ N(0, σ²g_t).   (7)

Note that in Madan et al. (1991, 1998), the random time-change g_t is gamma-distributed with shape and scale parameters 1/ν and ν, respectively, i.e.,

g_t | F_{t-1} ~ gamma(1/ν, ν),   (8)

where the parameter ω satisfies

ω = (1/ν) ln(1 − θν − σ²ν/2).   (9)

The logarithm of the joint likelihood p(Y, g; Θ) is

log p(Y, g; Θ) = C − T ln σ − Σ_{t=1}^{T} (Y_t − m − ω − θg_t)² / (2σ²g_t) + Σ_{t=1}^{T} [ (1/ν − 3/2) ln g_t − g_t/ν ] − T [ (1/ν) ln ν + ln Γ(1/ν) ],   (10)

where C = −(T/2) ln 2π and the vector of unknown parameters is Θ = (θ, σ, ν). Note that as the random time-changes g_1, ..., g_T are unobservable, the mean rate of return m is unidentifiable. To simplify the problem, m is set to the mean of the log-returns Y_1, ..., Y_T.

2.3 Variance-gamma NGARCH model and its likelihood

In the variance-gamma NGARCH model, the dynamics of the log-return Y_t = ln[S(t)/S(t-1)], t = 1, ..., T, under the data-generating measure P, given the information set F_{t-1} that contains all the information up to time t-1, including the amount of random time-change g_t during the interval (t-1, t], is specified by

Y_t = r_t + λ_t + θ g_t + ε_t,   (11)

where r_t is the risk-free interest rate during (t-1, t], λ_t is a risk-premium-dependent parameter, and θ is the drift parameter. As in the variance-gamma model (7), the innovation ε_t is conditionally Gaussian distributed with mean zero and variance σ²g_t, whereas the random time-change g_t is gamma-distributed with shape and scale parameters ν_t and one, respectively, i.e.,

g_t | F_{t-1} ~ gamma(ν_t, 1).   (12)

Here a nonlinear asymmetric NGARCH(p, q) process, with p = q = 1, is considered to capture the leverage effects, in that the shape parameter ν_t satisfies

ν_t = a_0 + a (ε_{t-1} − c√ν_{t-1})² + b ν_{t-1},  t ≥ 2.   (13)

In other words, ν_t depends on the previous innovation ε_{t-1} and shape parameter ν_{t-1}, respectively, and is therefore F_{t-1}-predictable. Eqs.
(11)-(13), together with (7), constitute the specification of the proposed VG NGARCH model, and the logarithm of the joint likelihood p(Y, g; Θ) for the observable log-returns Y = (Y_1, ..., Y_T) and the unobservable random time-changes g = (g_1, ..., g_T) follows straightforwardly as

log p(Y, g; Θ) = C − T ln σ − Σ_{t=1}^{T} (Y_t − r_t − λ_t − θg_t)² / (2σ²g_t) + Σ_{t=1}^{T} [ (ν_t − 1.5) ln g_t − g_t − ln Γ(ν_t) ],   (14)

where C = −(T/2) ln 2π, and Θ = (θ, σ, ν_1, ω) denotes the vector of unknown parameters, with the drift parameter θ, the initial shape parameter ν_1, and the NGARCH parameters ω = (a_0, a, b, c) in (13). Note that as the random time-changes g_1, ..., g_T are unobservable, the risk-premium-dependent parameters λ_1, ..., λ_T are unidentifiable. To simplify the problem, the parameters λ_1, ..., λ_T are set to zero.

3. Empirical analysis

The data on the 1519 weekly dividend-exclusive S&P 500 index series from January 3, 1980 to Feb. 17, 2009 are collected from the website http://hk.finance.yahoo.com. The weekly risk-free interest rates are calculated using the annualized discount rates for U.S. Treasury Bills with maturities of three months, which were obtained from the website of the U.S. Treasury Department. Table 1 reports summary statistics for the 1519 weekly S&P 500 index log-return series. Over this period, the average log-return is 0.0013 and the standard deviation is 0.0229. The distribution of the 1519 weekly S&P 500 log-returns exhibits negative skewness and a kurtosis (9.6155) larger than 3, which shows that the assumption of normally distributed log-returns is violated. In figure 1, the 1519 weekly S&P 500 index log-returns are plotted against time. As can be seen from figure 1, from January 3, 1980 to Feb. 17, 2009, there are periods of calm and periods of large fluctuations in the index, especially after September 2008.
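Summary statistics of the kind reported in Table 1 can be computed in a few lines. The code below is an illustrative sketch using a synthetic heavy-tailed series in place of the S&P 500 data.

```python
import numpy as np

def summary_stats(y):
    """Mean, standard deviation, skewness and kurtosis of a log-return series."""
    y = np.asarray(y, dtype=float)
    m, s = y.mean(), y.std(ddof=1)
    z = (y - m) / s
    return {"mean": m, "std": s,
            "skewness": float(np.mean(z**3)),
            "kurtosis": float(np.mean(z**4))}  # kurtosis of a normal is 3

# Synthetic heavy-tailed stand-in for the 1519 weekly log-returns.
rng = np.random.default_rng(0)
stats = summary_stats(0.02 * rng.standard_t(df=5, size=1519))
```

A kurtosis well above 3 and non-zero skewness, as in Table 1, are the usual signals that a Gaussian log-return model is misspecified.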
3.1 Parameter estimation: Monte Carlo EM and Metropolis algorithm

The parameters of the likelihood functions under the data-generating measure P of the three models, the Gaussian NGARCH model, the VG model, and the VG NGARCH model, are estimated by maximum likelihood. For the Gaussian NGARCH model, the MLEs of the risk premium parameter λ and the NGARCH parameters ω = (a_0, a, b, c) are obtained by maximizing the log-likelihood (4). For the VG and VG NGARCH models, as the random time-changes g_1, ..., g_T are unobservable, the Monte Carlo EM algorithm (Wei and Tanner (1990), McCulloch (1997)) is employed for the parameter estimation. The Monte Carlo EM algorithm is an extension of the EM algorithm in which two primary steps, the E-step and the M-step, take place iteratively for the estimation of the parameters Θ. Specifically, at the first iteration, an initial parameter value Θ_0 is supplied. The estimates of Θ are then updated iteratively according to the following scheme: at the i-th iteration, i > 1, with the estimate Θ_{i-1} from the previous iteration, a set of N random samples g^(1), ..., g^(N) of the unobservable time-change vector g = (g_1, ..., g_T) is drawn from the posterior distribution p(g|Y; Θ_{i-1}). The estimate of Θ is then updated as Θ_i = Θ̂, where Θ̂ maximizes the conditional expectation of the log-likelihood

E[log p(Y, g; Θ)] = ∫ log p(Y, g; Θ) p(g|Y; Θ_{i-1}) dg,

which is approximated using the random samples g^(1), ..., g^(N) as

l(Θ|Y) = (1/N) Σ_{s=1}^{N} log p(Y, g^(s); Θ).

It can be shown that Θ_i converges to the maximum likelihood estimate of Θ as i → ∞. At each iteration of the EM algorithm, the Metropolis algorithm introduced by Metropolis et al. (1953) is used to generate the N random samples from the posterior distribution p(g|Y; Θ_{i-1}).
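The iterative scheme above can be sketched generically. The toy model below is not the VG likelihood; it is a deliberately simple latent-variable model whose E- and M-steps have closed forms, so the Monte Carlo EM loop itself can be seen in isolation.

```python
import numpy as np

def monte_carlo_em(y, theta0, sample_posterior, maximize, n_iter=25, n_draws=200):
    """Generic Monte Carlo EM: at each iteration draw N samples of the latent
    variables from p(g | y; theta) (E-step by simulation), then update theta by
    maximizing the Monte Carlo average of the complete-data log-likelihood."""
    theta = theta0
    for _ in range(n_iter):
        draws = [sample_posterior(y, theta) for _ in range(n_draws)]  # E-step
        theta = maximize(y, draws)                                    # M-step
    return theta

# Toy latent-variable model (not the VG likelihood): y_t = theta + g_t + e_t with
# g_t, e_t iid N(0, 1), so g_t | y_t ~ N((y_t - theta)/2, 1/2) and the M-step has
# the closed form theta = average over draws of mean(y - g).
rng = np.random.default_rng(0)
y = 0.7 + rng.standard_normal(500) + rng.standard_normal(500)

def sample_posterior(y, theta):
    return (y - theta) / 2.0 + np.sqrt(0.5) * rng.standard_normal(y.size)

def maximize(y, draws):
    return float(np.mean([np.mean(y - g) for g in draws]))

theta_hat = monte_carlo_em(y, 0.0, sample_posterior, maximize)
```

In the paper's setting, sample_posterior would be the Metropolis sampler for the time-changes and maximize a numerical optimizer of the averaged log-likelihood (10) or (14).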
Here the independence Metropolis chain approach of Hastings (1970) is adopted: if the chain is currently at a point X_n = x, a candidate value y for the next location X_{n+1} is generated from a proposal transition density f(y) that is independent of x. The candidate X_{n+1} = y is accepted with probability

α(x, y) = min{ π(y)f(x) / (π(x)f(y)), 1 },

where π is the target distribution; otherwise the step is rejected and the chain remains at X_{n+1} = x. After L such steps, for L sufficiently large, a realization from the target distribution is obtained. From the likelihood functions (10) and (14) for the VG and VG NGARCH models, the target distribution π, i.e., the posterior distribution p(g|Y; Θ), satisfies

p(g|Y; Θ) ∝ exp{ Σ_{t=1}^{T} [ −α g_t − β_t/g_t + (ν_t − 1.5) ln g_t ] },

where for the VG model the coefficients α and β_t, 1 ≤ t ≤ T, are

α = 1/ν + θ²/(2σ²),  β_t = (Y_t − m − ω)²/(2σ²),

with ν_t = 1/ν for all t, while for the VG NGARCH model α and β_t are

α = 1 + θ²/(2σ²),  β_t = (Y_t − r_t − λ_t)²/(2σ²).

At the n-th step, a candidate sample of time-changes g = (g_1, ..., g_T) is drawn from a proposal transition density f of T independent gamma distributions with shape parameters 1/ν − 0.5 and scale parameter 1/α for the VG model; for the VG NGARCH model, the proposal transition density f is chosen to be T independent gamma distributions with shape parameters ν_1 − 0.5, ..., ν_T − 0.5 and scale parameter 1/α, where ν_t = a_0 + a(ε_{t-1} − c√ν_{t-1})² + bν_{t-1}. The candidate g is accepted, setting g_n = g, with probability

min{ exp( Σ_{t=1}^{T} β_t (1/g_{t,n-1} − 1/g_t) ), 1 },   (15)

otherwise g_n = g_{n-1}.

3.2 Model calibration

The goodness of fit of the Gaussian NGARCH model, the VG model, and the VG NGARCH model to the S&P 500 stock index from January 3, 1980 to Feb. 17, 2009 is evaluated in this section. In Table 2, the estimated parameter values for the three competing models are given, and Akaike's information criterion (AIC), proposed in Akaike (1974), is calculated. When several competing models are ranked according to their AIC, the one with the lowest AIC is the best.
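The independence Metropolis step above, with gamma proposals chosen so that everything but the 1/g_t terms cancels in the acceptance ratio, can be sketched as follows. The posterior form and all parameter values are illustrative, not the paper's fitted values.

```python
import numpy as np

def sample_time_changes(alpha, beta, nu, n_steps=500, seed=None):
    """Independence Metropolis sampler for a posterior of the form
    prod_t g_t^(nu_t - 1.5) * exp(-alpha*g_t - beta_t/g_t), g_t > 0,
    using Gamma(nu_t - 0.5, scale 1/alpha) proposals (requires nu_t > 0.5), so
    that the acceptance ratio reduces to exp(sum_t beta_t*(1/g_old - 1/g_new))."""
    rng = np.random.default_rng(seed)
    beta = np.asarray(beta, float)
    nu = np.asarray(nu, float)
    g = rng.gamma(nu - 0.5, 1.0 / alpha)         # initial state from the proposal
    for _ in range(n_steps):
        cand = rng.gamma(nu - 0.5, 1.0 / alpha)  # proposal independent of g
        log_ratio = float(np.sum(beta * (1.0 / g - 1.0 / cand)))
        if np.log(rng.uniform()) < min(log_ratio, 0.0):
            g = cand                             # accept the whole vector
    return g

g = sample_time_changes(alpha=2.0, beta=np.full(5, 0.1), nu=np.full(5, 2.0), seed=0)
```

Accepting or rejecting the whole vector jointly mirrors acceptance probability (15); for long series, per-coordinate updates would raise the acceptance rate.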
From Table 2, the AIC values show that the VG NGARCH model is the best among the three competing models. In figure 1, the observed log-returns for the S&P 500 stock index from January 3, 1980 to Feb. 17, 2009 are plotted. To assess the fit of the three competing models, residual analysis is performed in figures 2-4, respectively. From the standardized residual plots in figs. 2(a), 3(a), and 4(a), the numbers of outliers that fall outside the range [-3, 3] are 24, 25, and 4 for the Gaussian NGARCH, VG, and VG NGARCH models, respectively. In figs. 2(b), 3(b), and 4(b), the histograms of the standardized residuals for the three competing models are provided. As can be seen from figs. 2(b) and 3(b), the mean (-0.3907) of the residuals in the Gaussian NGARCH model deviates from zero significantly, while the VG model gives the largest kurtosis (8.8131) for its standardized residuals; this implies that there exists some systematic error in the Gaussian NGARCH model, while the VG model fails to account for the extreme events that cause the heavy tails of the distribution of log-returns. On the other hand, the kurtosis of the standardized residuals in the VG NGARCH model is closest to 3 (4.577), and their mean is close to zero (0.0066). This implies that the VG NGARCH model fits the observed log-return data better than the two benchmark models. In figs. 2(c), 3(c), and 4(c), the estimated conditional variances of the log-return innovations for the three competing models are plotted against time. In figs. 2(c) and 4(c), the estimated conditional variances for the Gaussian NGARCH and VG NGARCH models increase during the periods when the log-returns of the index fluctuate violently. In particular, a jump in the estimated conditional variances occurs after September 2008 in both figs. 2(c) and 4(c), while the estimated conditional variances for the VG model in fig. 3(c) fail to account for the fluctuations in the S&P 500 index.
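The outlier counts above come from standardizing each innovation by its fitted conditional standard deviation. A minimal, model-agnostic sketch with illustrative inputs:

```python
import numpy as np

def standardized_residuals(y, mean_t, var_t):
    """Standardize each innovation by its fitted conditional mean and variance."""
    y, mean_t, var_t = (np.asarray(a, float) for a in (y, mean_t, var_t))
    return (y - mean_t) / np.sqrt(var_t)

def count_outliers(z, bound=3.0):
    """Number of standardized residuals falling outside [-bound, bound]."""
    return int(np.sum(np.abs(z) > bound))

# Illustrative inputs: a flat conditional-variance path and Gaussian innovations.
rng = np.random.default_rng(0)
h = np.full(1519, 0.0005)
y = np.sqrt(h) * rng.standard_normal(1519)
z = standardized_residuals(y, np.zeros(1519), h)
```

For a correctly specified Gaussian model, only a handful of the 1519 standardized residuals should exceed 3 in absolute value, which is the benchmark against which the counts of 24, 25, and 4 are judged.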
These together imply that the two NGARCH-type models, i.e., the Gaussian NGARCH and VG NGARCH models, outperform the VG model. Compared to the VG NGARCH model, the mean of the residuals of the Gaussian NGARCH model deviates significantly from zero. Therefore, the VG NGARCH model is considered the best among the three competing models.

4. Conclusion

Asset prices are largely driven by the arrival of news. Both the variability in the impact of news and its arrival rate can affect the volatility and cause large and persistent fluctuations in the market. The news of the declaration of Lehman Brothers' bankruptcy in September 2008 is an example of an extreme event with a profound negative impact on the market. In order to incorporate extreme events and their impact into the model, various types of models have been proposed. This study shows that NGARCH-type models fit the observed data better. Of the two NGARCH-type models under consideration in this study, the VG NGARCH model, by incorporating a gamma-distributed random time-change with its mean determined by previous return innovations, outperforms the Gaussian NGARCH model. In the previous study, it was shown that under the same sufficient conditions, the VG GARCH-type model obeys the locally risk-neutralized principle proposed by Duan (1995), as does the Gaussian GARCH-type model. Therefore, the current study suggests that the VG NGARCH model provides a cornerstone for option pricing.

Table 1. S&P 500 weekly log-return summary statistics from January 3, 1980 to Feb. 17, 2009

Mean log-return   Standard deviation   Skewness   Kurtosis
0.0013            0.0229               -0.8161    9.6155

Table 2.
Estimates of the parameters for the Gaussian NGARCH, VG, and VG NGARCH models

Parameter   Gaussian NGARCH   VG        VG NGARCH
lambda      0.4088            -         -
theta       -                 -0.0085   0.0001
sigma       -                 0.0215    0.0173
nu          -                 0.0089    -
a0          0.0003            -         1.1830
a           0.2833            -         0.5641
b           0.0040            -         0.0500
c           0.4377            -         0.1218
h1          0.0121            -         -
nu1         -                 -         7.4775
AIC         -9932             -10661    -10882

Figure 1. Weekly log-returns from Jan. 3, 1980 to Feb. 17, 2009.

Fig. 2a. Standardized residuals of the Gaussian NGARCH model.
Fig. 2b. Histogram of standardized residuals of the Gaussian NGARCH model (kurtosis = 4.9088, skewness = -0.3609, mean = -0.3907).
Fig. 2c. Estimated conditional variances h(t) in the Gaussian NGARCH model.
Fig. 3a. Standardized residuals of the VG model.
Fig. 3b. Histogram of standardized residuals of the VG model (kurtosis = 8.8131, skewness = -0.7483, mean = 0.0214).
Fig. 3c. Estimated conditional variances in the VG model.
Fig. 4a. Standardized residuals of the VG NGARCH model.
Fig. 4b. Histogram of standardized residuals of the VG NGARCH model (kurtosis = 4.577, skewness = -0.2689, mean = 0.0066).
Fig. 4c. Estimated conditional variances in the VG NGARCH model.

References

Andersen, L., and Andreasen, J. (2001). Jump-Diffusion Processes: Volatility Smile Fitting and Numerical Methods for Option Pricing. Review of Derivatives Research, 4, 231-262.
Black, F. and Scholes, M.
(1973). The Pricing of Options and Corporate Liabilities. Journal of Political Economy, 81, 637-659.
Bollerslev, T. (1986). Generalized autoregressive conditional heteroskedasticity. Journal of Econometrics, 31, 307-327.
Bollerslev, T. (1987). A conditionally heteroskedastic time series model for speculative prices and rates of return. Review of Economics and Statistics, 69, 542-547.
Bollerslev, T., Engle, R.F., and Wooldridge, J.M. (1988). A Capital Asset Pricing Model with Time-Varying Covariances. Journal of Political Economy, 96(1), 116-131.
Campbell, J.Y. and Hentschel, L. (1992). No news is good news: an asymmetric model of changing volatility in stock returns. Journal of Financial Economics, 31, 281-318.
Carr, P. and Madan, D.B. (1998). Optimal Derivative Investment in Continuous Time. Working Paper, University of Maryland.
Carr, P., Geman, H., Madan, D.B., and Yor, M. (2002). The fine structure of asset returns: an empirical investigation. Journal of Business, 75, 305-332.
Carr, P., Geman, H., Madan, D.B., and Yor, M. (2003). Stochastic volatility for Lévy processes. Mathematical Finance, 13, 345-382.
Cox, J.C. and Ross, S.A. (1976). The valuation of options for alternative stochastic processes. Journal of Financial Economics, 3, 145-166.
Duan, J.C. (1995). The GARCH option pricing model. Mathematical Finance, 5(1), 13-32.
Duan, J.C. (1999). Conditionally Fat-Tailed Distributions and the Volatility Smile in Options. Working paper, Hong Kong University of Science and Technology.
Duffie, D., Pan, J., and Singleton, K. (2000). Transform analysis and option pricing for affine jump-diffusions. Econometrica, 68, 1343-1376.
Engle, R.F. (1982). Autoregressive conditional heteroscedasticity with estimates of the variance of United Kingdom inflation. Econometrica, 50, 987-1007.
Engle, R.F., and Ng, V.K. (1993). Measuring and testing the impact of news on volatility. Journal of Finance, 48(5), 1749-1778.
Forsberg, L., and Bollerslev, T. (2002).
Bridging the gap between the distribution of realized (ECU) volatility and ARCH modeling (of the EURO): The GARCH normal inverse Gaussian model. Journal of Applied Econometrics, 17, 535-548.
Geman, H., Madan, D.B., and Yor, M. (1998). Asset Prices are Brownian Motion: Only in Business Time. Working Paper, University of Maryland.
Geman, H., Madan, D.B., and Yor, M. (2001). Time Changes for Lévy Processes. Mathematical Finance, 11, 79-96.
Hastings, W.K. (1970). Monte Carlo sampling methods using Markov chains and their applications. Biometrika, 57, 97-109.
Jones, C.M., Kaul, G., and Lipson, M.L. (1994). Transactions, volume, and volatility. The Review of Financial Studies, 7(4), 631-651.
Kao, L.J., and Lee, C.F. (2009). News arrival driven option pricing: a GARCH model based on time-changed Brownian motions. Submitted to Journal of Futures Markets.
Madan, D.B., Carr, P., and Chang, E.C. (1998). The Variance Gamma Process and Option Pricing. European Finance Review, 2, 79-105.
Madan, D.B. and Milne, F. (1991). Option Pricing with VG Martingale Components. Mathematical Finance, 1(4), 39-55.
Madan, D.B. and Seneta, E. (1990). The variance gamma (V.G.) model for share market returns. Journal of Business, 63(4), 511-524.
McCulloch, C.E. (1997). Maximum Likelihood Algorithms for Generalized Linear Mixed Models. Journal of the American Statistical Association, 92, 162-170.
Merton, R.C. (1973). Theory of rational option pricing. Bell Journal of Economics and Management Science, 4, 141-183.
Merton, R.C. (1976). Option pricing when underlying stock returns are discontinuous. Journal of Financial Economics, 3, 125-144.
Metropolis, N., Rosenbluth, A.W., Rosenbluth, M.N., Teller, A.H., and Teller, E. (1953). Equations of state calculations by fast computing machines. Journal of Chemical Physics, 21, 1087-1091.
Nelson, D. (1991). Conditional heteroskedasticity in asset returns: a new approach. Econometrica, 59, 347-370.
Rubinstein, M. (1976).
The valuation of uncertain income streams and the pricing of options. Bell Journal of Economics and Management Science, 7, 407-425.
Stentoft, L. (2008). American Option Pricing Using GARCH Models and the Normal Inverse Gaussian Distribution. Journal of Financial Econometrics, 6(4), 540-582.
Wei, G.C.G., and Tanner, M.A. (1990). A Monte Carlo Implementation of the EM Algorithm and the Poor Man's Data Augmentation Algorithms. Journal of the American Statistical Association, 85, 699-704.