nep-ets New Economics Papers
on Econometric Time Series
Issue of 2010‒01‒10
ten papers chosen by
Yong Yin
SUNY at Buffalo

  1. Modelling the Volatility-Return Trade-off when Volatility may be Nonstationary By Christian M. Dahl; Emma M. Iglesias
  2. Forecasting time series with complex seasonal patterns using exponential smoothing By Alysha M De Livera; Rob J Hyndman
  3. Fully Modified Narrow-Band Least Squares Estimation of Weak Fractional Cointegration By Morten Ørregaard Nielsen; Per Frederiksen
  4. Indirect inference methods for stochastic volatility models based on non-Gaussian Ornstein-Uhlenbeck processes By Arvid Raknerud; Øivind Skare
  5. Bootstrap Confidence Bands for Forecast Paths By Anna Staszewska-Bystrova
  6. A Volatility Targeting GARCH model with Time-Varying Coefficients By Thorsten Lehnert; Bart Frijns; Remco Zwinkels
  7. The Fragility of the KPSS Stationarity Test By Nunzio Cappuccio; Diego Lubian
  8. Time series segmentation by Cusum, AutoSLEX and AutoPARM methods By Ana Badagian; Regina Kaiser; Daniel Pena
  9. "Block Structure Multivariate Stochastic Volatility Models" By Manabu Asai; Massimiliano Caporin; Michael McAleer
  10. Generalized Least Squares Estimation for Cointegration Parameters Under Conditional Heteroskedasticity By Helmut Herwartz; Helmut Luetkepohl

  1. By: Christian M. Dahl (University of Aarhus and CREATES); Emma M. Iglesias (Department of Economics, Michigan State University and University of Essex)
    Abstract: In this paper a new GARCH-M type model, denoted the GARCH-AR model, is proposed. In particular, it is shown that it is possible to generate a volatility-return trade-off in a regression model simply by introducing dynamics in the standardized disturbance process. Importantly, the volatility in the GARCH-AR model enters the return function in terms of relative volatility, implying that the risk term can be stationary even if the volatility process is nonstationary. We provide a complete characterization of the stationarity properties of the GARCH-AR process by generalizing the results of Bougerol and Picard (1992b). Furthermore, allowing for nonstationary volatility, the asymptotic properties of the quasi-maximum likelihood estimators of the GARCH-AR parameters are established. Finally, we stress the importance of being able to choose correctly between AR-GARCH and GARCH-AR processes: First, a small simulation study shows that the parameter estimators of an AR-GARCH model will be seriously inconsistent if the data generating process is actually a GARCH-AR process. Second, we provide an LM test for neglected GARCH-AR effects and discuss its finite sample size properties. Third, we provide an empirical illustration showing the empirical relevance of the GARCH-AR model by modelling a wide range of leading US stock return series.
    Keywords: Quasi-Maximum Likelihood, GARCH-M Model, Asymptotic Properties, Risk-return Relation.
    JEL: C12 C13 C22 G12
    Date: 2009–10–02
    URL: http://d.repec.org/n?u=RePEc:aah:create:2009-59&r=ets
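    A minimal Python sketch of one plausible reading of the GARCH-AR mechanism: AR(1) dynamics are placed on the standardized disturbance, so the lagged shock enters the return scaled by the relative volatility sigma_t/sigma_{t-1} rather than by the level of volatility. The recursion and all parameter values are illustrative assumptions, not taken from the paper.

      import numpy as np

      rng = np.random.default_rng(0)
      T = 1000
      mu, rho = 0.0, 0.3                      # mean and AR coefficient (illustrative)
      omega, alpha, beta = 0.05, 0.08, 0.90   # GARCH(1,1) parameters (illustrative)

      r = np.empty(T)
      sigma2 = np.empty(T)
      sigma2[0] = omega / (1.0 - alpha - beta)
      u = 0.0                                 # standardized disturbance u_t
      for t in range(T):
          if t > 0:
              # GARCH(1,1) recursion on the unstandardized shock
              sigma2[t] = omega + alpha * eps ** 2 + beta * sigma2[t - 1]
          # AR(1) on the standardized disturbance: u_t = rho*u_{t-1} + z_t,
          # so the risk term involves sigma_t/sigma_{t-1}, not sigma_t itself.
          u = rho * u + rng.standard_normal()
          eps = np.sqrt(sigma2[t]) * u
          r[t] = mu + eps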
  2. By: Alysha M De Livera; Rob J Hyndman
    Abstract: A new innovations state space modeling framework, incorporating Box-Cox transformations, Fourier series with time varying coefficients and ARMA error correction, is introduced for forecasting complex seasonal time series that cannot be handled using existing forecasting models. Such complex time series include time series with multiple seasonal periods, high frequency seasonality, non-integer seasonality and dual-calendar effects. Our new modelling framework provides an alternative to existing exponential smoothing models, and is shown to have many advantages. The methods for initialization and estimation, including likelihood evaluation, are presented, and analytical expressions for point forecasts and interval predictions under the assumption of Gaussian errors are derived, leading to a simple, comprehensible approach to forecasting complex seasonal time series. Our trigonometric formulation is also presented as a means of decomposing complex seasonal time series, which cannot be decomposed using any of the existing decomposition methods. The approach is useful in a broad range of applications, and we illustrate its versatility in three empirical studies where it demonstrates excellent forecasting performance over a range of prediction horizons. In addition, we show that our trigonometric decomposition leads to the identification and extraction of seasonal components, which are otherwise not apparent in the time series plot itself.
    Keywords: Exponential smoothing, Fourier series, prediction intervals, seasonality, state space models, time series decomposition
    JEL: C22 C53
    Date: 2009–12–12
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2009-15&r=ets
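    The Fourier-series treatment of seasonality is what allows non-integer and multiple seasonal periods; a minimal Python sketch of constructing the trigonometric regressors (the period 365.25 and the number of harmonics are illustrative choices):

      import numpy as np

      def fourier_terms(t, period, K):
          """2K seasonal regressors sin/cos(2*pi*k*t/period), k = 1..K.

          A non-integer period (e.g. 365.25 for daily data with a leap-year
          cycle) poses no problem, unlike seasonal dummies."""
          t = np.asarray(t, dtype=float)
          cols = []
          for k in range(1, K + 1):
              cols.append(np.sin(2 * np.pi * k * t / period))
              cols.append(np.cos(2 * np.pi * k * t / period))
          return np.column_stack(cols)

      X = fourier_terms(np.arange(3 * 365), period=365.25, K=3)  # (1095, 6)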
  3. By: Morten Ørregaard Nielsen (Queen's University and CREATES); Per Frederiksen (Nordea Markets)
    Abstract: We consider estimation of the cointegrating relation in the weak fractional cointegration model, where the strength of the cointegrating relation (difference in memory parameters) is less than one-half. A special case is the stationary fractional cointegration model, which has found important application recently, especially in financial economics. Previous research on this model has considered a semiparametric narrow-band least squares (NBLS) estimator in the frequency domain, but in the stationary case its asymptotic distribution has been derived only under a condition of non-coherence between regressors and errors at the zero frequency. We show that in the absence of this condition, the NBLS estimator is asymptotically biased, and also that the bias can be consistently estimated. Consequently, we introduce a fully modified NBLS estimator which eliminates the bias, and indeed enjoys a faster rate of convergence than NBLS in general. We also show that local Whittle estimation of the integration order of the errors can be conducted consistently based on NBLS residuals, but the estimator has the same asymptotic distribution as if the errors were observed only under the condition of non-coherence. Furthermore, compared to much previous research, the development of the asymptotic distribution theory is based on a different spectral density representation, which is relevant for multivariate fractionally integrated processes, and the use of this representation is shown to result in lower asymptotic bias and variance of the narrow-band estimators. We present simulation evidence and a series of empirical illustrations to demonstrate the feasibility and empirical relevance of our methodology.
    Keywords: Fractional cointegration, frequency domain, fully modified estimation, long memory, semiparametric
    JEL: C22
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:qed:wpaper:1226&r=ets
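    A minimal Python sketch of the plain (unmodified) NBLS estimator: a least squares regression of discrete Fourier transforms over the first m Fourier frequencies only. The bias correction that defines the fully modified version is not reproduced; the DGP and bandwidth below are illustrative.

      import numpy as np

      def nbls(y, x, m):
          """Narrow-band least squares slope over the first m frequencies."""
          wy = np.fft.fft(y)                 # discrete Fourier transforms
          wx = np.fft.fft(x)
          j = np.arange(1, m + 1)            # low frequencies only
          num = np.sum(np.real(np.conj(wx[j]) * wy[j]))
          den = np.sum(np.abs(wx[j]) ** 2)
          return num / den

      rng = np.random.default_rng(1)
      n = 1024
      x = np.cumsum(rng.standard_normal(n))  # I(1) regressor (illustrative)
      y = 0.5 * x + rng.standard_normal(n)   # cointegrated, beta = 0.5
      print(nbls(y, x, m=int(n ** 0.5)))     # close to 0.5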
  4. By: Arvid Raknerud and Øivind Skare (Statistics Norway)
    Abstract: This paper aims to develop new methods for statistical inference in a class of stochastic volatility models for financial data based on non-Gaussian Ornstein-Uhlenbeck (OU) processes. Our approach uses indirect inference methods: First, a quasi-likelihood for the actual data is estimated. This quasi-likelihood is based on an approximate Gaussian state space representation of the OU-based model. Next, simulations are made from the data generating OU-model for given parameter values. The indirect inference estimator is the parameter value in the OU-model which gives the best "match" between the quasi-likelihood estimator for the actual data and the quasi-likelihood estimator for the simulated data. Our method is applied to Euro/NOK and US Dollar/NOK daily exchange rates from 1 July 1989 to 15 December 2008. An accompanying R package, which interfaces C++ code, is documented and can be downloaded.
    Keywords: stochastic volatility; financial econometrics; Ornstein-Uhlenbeck processes; indirect inference; state space models; exchange rates
    JEL: C13 C22 C51 G10
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:ssb:dispap:601&r=ets
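    The indirect inference loop sketched in the abstract, in schematic Python; qml_fit and simulate_ou_sv are hypothetical placeholders for the Gaussian state space quasi-likelihood estimator and the OU-based simulator, and the matching criterion is an assumption.

      import numpy as np
      from scipy.optimize import minimize

      def indirect_inference(data, qml_fit, simulate_ou_sv, theta0, n_sim=10):
          """Match QML auxiliary estimates on the data and on simulations.

          qml_fit(series)                -> auxiliary parameter vector
          simulate_ou_sv(theta, T, seed) -> series from the OU-based model"""
          beta_hat = qml_fit(data)       # auxiliary fit on actual data

          def distance(theta):
              sims = [qml_fit(simulate_ou_sv(theta, len(data), seed=s))
                      for s in range(n_sim)]   # auxiliary fits on simulations
              return np.sum((np.mean(sims, axis=0) - beta_hat) ** 2)

          return minimize(distance, theta0, method="Nelder-Mead").x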
  5. By: Anna Staszewska-Bystrova
    Abstract: The problem of forecasting from vector autoregressive models has attracted considerable attention in the literature. The most popular non-Bayesian approaches use large sample normal theory or the bootstrap to evaluate the uncertainty associated with the forecast. The literature has concentrated on the problem of assessing the uncertainty of the prediction for a single period. This paper considers the problem of how to assess the uncertainty when forecasts are made for a succession of periods. It describes and evaluates a bootstrap method for constructing confidence bands for forecast paths. The bands are constructed from forecast paths obtained in bootstrap replications, with an optimisation procedure used to find the envelope of the most concentrated paths. The method is shown to have good coverage properties in a Monte Carlo study.
    Keywords: vector autoregression, forecast path, bootstrapping, simultaneous statistical inference
    JEL: C15 C32 C53
    Date: 2009–12–07
    URL: http://d.repec.org/n?u=RePEc:com:wpaper:024&r=ets
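    A simplified Python sketch of the band construction: generate bootstrap forecast paths, discard the most extreme ones, and take the pointwise envelope of the rest. The simple ranking rule below is a stand-in for the paper's optimisation procedure; the AR(1) setup is illustrative.

      import numpy as np

      rng = np.random.default_rng(2)
      phi, T, H, B, alpha = 0.8, 200, 12, 999, 0.05
      y = np.zeros(T)
      for t in range(1, T):                     # known AR(1), for simplicity
          y[t] = phi * y[t - 1] + rng.standard_normal()
      resid = y[1:] - phi * y[:-1]

      paths = np.empty((B, H))
      for b in range(B):                        # bootstrap forecast paths
          last = y[-1]
          for h in range(H):
              last = phi * last + rng.choice(resid)
              paths[b, h] = last

      # Keep the (1 - alpha) fraction of paths closest to the median path,
      # then take their pointwise envelope as the joint band.
      center = np.median(paths, axis=0)
      score = np.max(np.abs(paths - center) / paths.std(axis=0), axis=1)
      keep = paths[np.argsort(score)[: int((1 - alpha) * B)]]
      lower, upper = keep.min(axis=0), keep.max(axis=0)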
  6. By: Thorsten Lehnert (Luxembourg School of Finance, University of Luxembourg); Bart Frijns (Department of Finance, Auckland University of Technology, New Zealand); Remco Zwinkels (Erasmus School of Economics, Erasmus University Rotterdam.)
    Abstract: GARCH-type models have been very successful in describing the volatility dynamics of financial return series over short periods of time. However, macroeconomic events, for example, may cause the structure of volatility to change, so that the assumption of stationarity is no longer plausible. In order to deal with this issue, the current paper proposes a conditional volatility model with time varying coefficients based on a multinomial switching mechanism. By giving more weight to either the persistence or shock term in a GARCH model, conditional on their relative ability to forecast a benchmark volatility measure, the switching reinforces the persistent nature of the GARCH model. Estimation of this volatility targeting or VT-GARCH model for Dow 30 stocks indicates that the switching model is able to outperform a number of relevant GARCH setups, both in- and out-of-sample, even without any informational advantage.
    Keywords: GARCH, time varying coefficients, multinomial logit
    JEL: C22
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:crf:wpaper:09-08&r=ets
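    A stylized Python reading of the switching mechanism, with a two-regime logistic weight standing in for the multinomial switching: mass shifts between the shock and persistence terms according to which component tracked a benchmark volatility proxy better in the previous period. The recursion and all constants are illustrative assumptions, not the paper's specification.

      import numpy as np

      def vt_garch_variance(r, bench, omega=0.02, a=0.08, b=0.90, gamma=2.0):
          """Conditional variance with a time-varying split between the GARCH
          shock term and persistence term; bench is a benchmark volatility
          proxy (e.g. squared returns or realized variance)."""
          T = len(r)
          s2 = np.full(T, np.var(r))
          for t in range(1, T):
              # One-step errors of the two components against the benchmark.
              e_shock = (r[t - 1] ** 2 - bench[t - 1]) ** 2
              e_persist = (s2[t - 1] - bench[t - 1]) ** 2
              # Logistic weight: w -> 1 when the shock term forecasts better.
              w = 1.0 / (1.0 + np.exp(np.clip(gamma * (e_shock - e_persist),
                                              -30, 30)))
              # At w = 0.5 this collapses to a standard GARCH(1,1).
              s2[t] = omega + 2*w*a * r[t - 1]**2 + 2*(1 - w)*b * s2[t - 1]
          return s2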
  7. By: Nunzio Cappuccio (Department of Economics and Management, University of Padova); Diego Lubian (Department of Economics (University of Verona))
    Abstract: Stationarity tests exhibit extreme size distortions if the observable process is stationary yet highly persistent. In this paper we provide a theoretical explanation for the size distortion of the KPSS test for DGPs spanning a broad range of first-order autocorrelation coefficients. Considering a near-integrated, nearly stationary process, we show that the asymptotic distribution of the test contains an additional term, which can potentially explain the amount of size distortion documented in previous simulation studies.
    Keywords: KPSS stationarity test, size distortion, nearly white noise nearly integrated model
    JEL: C01 C22
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:ver:wpaper:67/2009&r=ets
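    The documented size distortion is easy to reproduce; a small Monte Carlo in Python (settings illustrative) rejects stationarity for a stationary but highly persistent AR(1) far more often than the nominal 5% level:

      import warnings
      import numpy as np
      from statsmodels.tsa.stattools import kpss

      warnings.simplefilter("ignore")      # kpss warns when p-values are clamped
      rng = np.random.default_rng(3)
      rho, T, reps = 0.95, 200, 500        # stationary yet highly persistent
      rejections = 0
      for _ in range(reps):
          y = np.zeros(T)
          for t in range(1, T):
              y[t] = rho * y[t - 1] + rng.standard_normal()
          stat, pval, _, _ = kpss(y, regression="c", nlags="auto")
          rejections += pval < 0.05
      print(rejections / reps)             # far above the nominal 0.05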
  8. By: Ana Badagian; Regina Kaiser; Daniel Pena
    Abstract: Time series segmentation has many applications in disciplines such as neurology, cardiology, speech processing and geology. Many time series in these fields are not stationary, and the usual transformations to linearity cannot be used. This paper describes and evaluates different methods for segmenting non-stationary time series. We propose a modification of the algorithm in Lee et al. (2003), which is designed to search for a single change in the parameters of a time series, so that it finds more than one change through an iterative procedure. We evaluate the performance of three approaches for segmenting time series: AutoSLEX (Ombao et al., 2002), AutoPARM (Davis et al., 2006) and the iterative cusum method mentioned above, referred to as ICM. The evaluation of each methodology consists of two steps. First, we compute how many times each procedure fails to segment stationary processes properly. Second, we analyze the effect of different change patterns by counting how many times the corresponding methodology correctly segments a piecewise stationary process. The ICM method performs better than AutoSLEX for piecewise stationary processes, and AutoPARM behaves very satisfactorily. The performance of the three methods is illustrated with neurological and speech time series datasets.
    Keywords: Time series segmentation, AutoSLEX, AutoPARM, Cusum Methods
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws098025&r=ets
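    A minimal Python sketch of one building block: a CUSUM-of-squares scan for a single variance change, applied recursively to each resulting segment (binary segmentation) to find multiple changes. The threshold, minimum segment length and DGP are illustrative.

      import numpy as np

      def cusum_sq(x):
          """Index maximizing the centered CUSUM of squares, and its value."""
          s = np.cumsum(x ** 2)
          k = np.arange(1, len(x) + 1)
          d = np.abs(s / s[-1] - k / len(x))   # deviation from no-change line
          return int(np.argmax(d)), float(d.max())

      def segment(x, offset=0, thresh=0.15, min_len=30, cuts=None):
          """Binary segmentation: split at a detected change, recurse."""
          if cuts is None:
              cuts = []
          if len(x) < 2 * min_len:
              return cuts
          k, dmax = cusum_sq(x)
          if dmax > thresh and min_len <= k <= len(x) - min_len:
              cuts.append(offset + k)
              segment(x[:k], offset, thresh, min_len, cuts)
              segment(x[k:], offset + k, thresh, min_len, cuts)
          return sorted(cuts)

      rng = np.random.default_rng(4)
      x = np.concatenate([rng.standard_normal(300),
                          3.0 * rng.standard_normal(300)])  # break at 300
      print(segment(x))                                     # roughly [300]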
  9. By: Manabu Asai (Faculty of Economics, Soka University); Massimiliano Caporin (Department of Economics and Management "Marco Fanno", University of Padova); Michael McAleer (Econometric Institute, Erasmus University Rotterdam, Erasmus School of Economics and Tinbergen Institute)
    Abstract: Most multivariate variance models suffer from a common problem, the "curse of dimensionality". For this reason, most are fitted under strong parametric restrictions that reduce the interpretation and flexibility of the models. Recently, the literature has focused on multivariate models with milder restrictions, whose purpose is to balance the interpretability and efficiency needed by model users against the computational problems that may emerge when the number of assets is quite large. We contribute to this strand of the literature by proposing a block-type parameterization for multivariate stochastic volatility models.
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2009cf699&r=ets
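    The block-type idea can be illustrated with a static covariance constructor in Python: assets in the same block share one volatility level and one within-block correlation, collapsing O(m^2) parameters to a handful per block. This illustrates the general idea only, not the paper's parameterization.

      import numpy as np

      def block_cov(block_sizes, block_vols, rho_within, rho_between=0.1):
          """Covariance with common volatility/correlation inside each block."""
          m = sum(block_sizes)
          labels = np.repeat(np.arange(len(block_sizes)), block_sizes)
          vol = np.repeat(block_vols, block_sizes)
          R = np.full((m, m), rho_between)
          for g, r in enumerate(rho_within):
              idx = labels == g
              R[np.ix_(idx, idx)] = r          # within-block correlation
          np.fill_diagonal(R, 1.0)
          return np.outer(vol, vol) * R

      S = block_cov([3, 2], block_vols=[0.2, 0.4], rho_within=[0.6, 0.8])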
  10. By: Helmut Herwartz; Helmut Luetkepohl
    Abstract: In the presence of generalized autoregressive conditional heteroskedasticity (GARCH) in the residuals of a vector error correction model (VECM), maximum likelihood (ML) estimation of the cointegration parameters has been shown to be efficient. On the other hand, full ML estimation of VECMs with GARCH residuals is computationally difficult and may not be feasible for larger models. Moreover, ML estimation of VECMs with independent, identically distributed residuals is known to have potentially poor small sample properties, and this problem also persists when there are GARCH residuals. A further disadvantage of the ML estimator is its sensitivity to misspecification of the GARCH process. We propose a feasible generalized least squares estimator which addresses all these problems. It is easy to compute and has superior small sample properties in the presence of GARCH residuals.
    Keywords: Vector autoregressive process, vector error correction model, cointegration, reduced rank estimation, maximum likelihood estimation, multivariate GARCH
    JEL: C32
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2009/42&r=ets
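    A minimal Python sketch of the feasible GLS logic in a univariate simplification (the paper treats the multivariate VECM case): estimate by OLS, fit a GARCH(1,1) to the residuals, and re-estimate with observations weighted by the inverse conditional standard deviation. The use of the arch package and the single-equation setup are assumptions.

      import numpy as np
      from arch import arch_model

      rng = np.random.default_rng(5)
      T = 1000
      x = np.cumsum(rng.standard_normal(T))      # I(1) regressor
      z = rng.standard_normal(T)
      h = np.empty(T); h[0] = 1.0
      for t in range(1, T):                      # GARCH(1,1) errors
          h[t] = 0.1 + 0.15 * h[t - 1] * z[t - 1] ** 2 + 0.80 * h[t - 1]
      y = 1.5 * x + np.sqrt(h) * z

      X = np.column_stack([np.ones(T), x])
      b_ols = np.linalg.lstsq(X, y, rcond=None)[0]     # step 1: OLS
      resid = y - X @ b_ols

      res = arch_model(resid, vol="GARCH", p=1, q=1).fit(disp="off")
      vol = np.asarray(res.conditional_volatility)     # step 2: GARCH fit

      Xw, yw = X / vol[:, None], y / vol               # step 3: FGLS reweighting
      b_fgls = np.linalg.lstsq(Xw, yw, rcond=None)[0]
      print(b_ols[1], b_fgls[1])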

This nep-ets issue is ©2010 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.