nep-ets New Economics Papers
on Econometric Time Series
Issue of 2011‒10‒15
eight papers chosen by
Yong Yin
SUNY at Buffalo

  1. Chaotic Time Series Analysis in Economics: Balance and Perspectives By Marisa Faggini
  2. Time-series Modelling, Stationarity and Bayesian Nonparametric Methods By Juan Carlos Martínez-Ovando; Stephen G. Walker
  3. Some Computational Aspects of Gaussian CARMA Modelling By Tómasson, Helgi
  4. Cointegrated VARMA models and forecasting US interest rates By Christian Kascha; Carsten Trenkler
  5. A new method for approximating vector autoregressive processes by finite-state Markov chains By Gospodinov, Nikolay; Lkhagvasuren, Damba
  6. Improved estimation in generalized threshold regression models By Friederike Greb; Tatyana Krivobokova; Axel Munk; Stephan von Cramon-Taubadel
  7. CoVaR By Tobias Adrian; Markus K. Brunnermeier
  8. Distribution Theory for the Studentized Mean for Long, Short, and Negative Memory Time Series By McElroy, Tucker S; Politis, D N

  1. By: Marisa Faggini (Department of Economics and Statistics, University of Salerno)
    Abstract: Showing that a mathematical model exhibits chaotic behaviour does not prove that chaos is also present in the corresponding data. To show convincingly that a system behaves chaotically, chaos has to be identified directly from the data. From an empirical point of view, it is difficult to distinguish between fluctuations provoked by random shocks and endogenous fluctuations determined by the nonlinear nature of the relation between economic aggregates. For this purpose, tests for chaos have been developed that investigate the basic features of chaotic phenomena: nonlinearity, a fractal attractor, and sensitivity to initial conditions. The aim of the paper is not to review the large body of work on nonlinear time-series analysis in economics, about which much has been written, but rather to focus on the newer techniques developed to detect chaotic behaviour in data. More specifically, our attention is devoted to reviewing the results obtained by applying these techniques to economic and financial time series and to understanding why chaos theory, after a period of growing interest, no longer appears to be such an interesting and promising research area.
    Keywords: Economic dynamics, nonlinearity, tests for chaos, chaos
    Date: 2011–10
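    The sensitivity to initial conditions that such tests target can be illustrated with a minimal sketch (not drawn from the paper): estimating the largest Lyapunov exponent of the logistic map, a standard chaotic benchmark, as the orbit average of the log absolute derivative; a positive estimate signals chaos.

```python
from math import log

def lyapunov_logistic(r, x0, n_iter, burn_in=100):
    """Estimate the largest Lyapunov exponent of the logistic map
    x' = r*x*(1-x) as the average of log|f'(x)| = log|r*(1-2x)|
    along an orbit. A positive estimate indicates sensitivity to
    initial conditions, one of the hallmarks of chaos."""
    x = x0
    for _ in range(burn_in):          # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        total += log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return total / n_iter
```

    For r = 4 the map is fully chaotic and the estimate is positive; for small r (a stable fixed point) it is negative.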
  2. By: Juan Carlos Martínez-Ovando; Stephen G. Walker
    Abstract: In this paper we introduce two general non-parametric first-order stationary time-series models for which the marginal (invariant) and transition distributions are expressed as infinite-dimensional mixtures. That feature makes them the first fully non-parametric Bayesian stationary models developed so far. We draw on the discussion of using stationary models in practice as motivation, and advocate the view that flexible (non-parametric) stationary models might be a source of reliable inferences and predictions. Our models fit naturally into the Bayesian inference framework thanks to a suitable representation theorem. A stationary scale-mixture model is developed as a particular case, along with a computational strategy for posterior inference and prediction. The usefulness of that model is illustrated with an analysis of Euro/USD exchange-rate log-returns.
    Keywords: Stationarity, Markov processes, Dynamic mixture models, Random probability measures, Conditional random probability measures, Latent processes.
    JEL: C11 C14 C15 C22 C51
    Date: 2011–09
  3. By: Tómasson, Helgi (Faculty of Economics, University of Iceland, Reykjavik, Iceland)
    Abstract: The representation of continuous-time ARMA (CARMA) models is reviewed. Computational aspects of simulating CARMA processes and calculating their likelihood function are summarized. Some numerical properties are illustrated by simulations, and some real-data applications are shown.
    Keywords: CARMA, maximum-likelihood, spectrum, Kalman filter, computation
    JEL: C01 C10 C22 C53 C63
    Date: 2011–09
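    In the simplest case, a CARMA(1,0) (Ornstein–Uhlenbeck) process observed at equally spaced times reduces exactly to a discrete-time AR(1), so its Gaussian likelihood is available in closed form. A minimal sketch of this (an illustration of the state-space/Kalman approach to CARMA likelihoods, not code from the paper):

```python
import numpy as np

def simulate_car1(a, sigma, dt, n, rng):
    """Exact discretization of the CAR(1) (Ornstein-Uhlenbeck) process
    dX = -a*X dt + sigma dW, sampled at spacing dt: a discrete AR(1)
    with coefficient exp(-a*dt)."""
    phi = np.exp(-a * dt)                     # discrete AR(1) coefficient
    q = sigma**2 * (1 - phi**2) / (2 * a)     # innovation variance
    x = np.empty(n)
    x[0] = rng.normal(0.0, sigma / np.sqrt(2 * a))  # stationary start
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal(0.0, np.sqrt(q))
    return x

def car1_loglik(a, sigma, dt, x):
    """Exact Gaussian log-likelihood of a CAR(1) sample via its AR(1)
    representation (equivalent to a one-state Kalman filter)."""
    phi = np.exp(-a * dt)
    q = sigma**2 * (1 - phi**2) / (2 * a)
    v0 = sigma**2 / (2 * a)                   # stationary variance
    ll = -0.5 * (np.log(2 * np.pi * v0) + x[0]**2 / v0)
    resid = x[1:] - phi * x[:-1]
    ll += -0.5 * np.sum(np.log(2 * np.pi * q) + resid**2 / q)
    return ll
```

    Higher-order CARMA(p,q) models require a genuine multi-state Kalman filter, but the structure is the same: exact discretization of the state transition, then prediction-error decomposition of the likelihood.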
  4. By: Christian Kascha; Carsten Trenkler
    Abstract: We bring together some recent advances in the literature on vector autoregressive moving-average models, creating a relatively simple specification and estimation strategy for the cointegrated case. We show that in the cointegrated case with fixed initial values there exists a so-called final moving-average representation, which is usually simpler but not as parsimonious as the usual Echelon form. Furthermore, we prove that our specification strategy is consistent also in the case of cointegrated series. To show the potential usefulness of the method, we apply it to US interest rates and find that it generates forecasts superior to methods that do not allow for moving-average terms.
    Keywords: Cointegration, VARMA models, forecasting
    JEL: C32 C53 E43 E47
    Date: 2011–10
  5. By: Gospodinov, Nikolay; Lkhagvasuren, Damba
    Abstract: This paper proposes a new method for approximating vector autoregressions by a finite-state Markov chain. The method is more robust to the number of discrete values and tends to outperform the existing methods over a wide range of the parameter space, especially for highly persistent vector autoregressions with roots near the unit circle.
    Keywords: Markov Chain; Vector Autoregressive Processes; Functional Equation; Numerical Methods; Moment Matching; Numerical Integration
    JEL: C10 C15 C60
    Date: 2011–06–08
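    As background, the standard baseline such discretization methods are compared against is Tauchen's (1986) method. A minimal sketch for the scalar AR(1) case (the paper's own method differs; this is only the conventional benchmark):

```python
import numpy as np
from math import erf, sqrt

def tauchen(rho, sigma, n_states, m=3.0):
    """Standard Tauchen (1986) discretization of the AR(1)
    y' = rho*y + eps, eps ~ N(0, sigma^2), into an n_states Markov chain.
    The grid spans m unconditional standard deviations; transition
    probabilities come from the conditional normal CDF over grid cells."""
    cdf = lambda z: 0.5 * (1 + erf(z / sqrt(2)))   # standard normal CDF
    std_y = sigma / np.sqrt(1 - rho**2)            # unconditional std dev
    grid = np.linspace(-m * std_y, m * std_y, n_states)
    step = grid[1] - grid[0]
    P = np.empty((n_states, n_states))
    for i in range(n_states):
        for j in range(n_states):
            lo = (grid[j] - rho * grid[i] - step / 2) / sigma
            hi = (grid[j] - rho * grid[i] + step / 2) / sigma
            if j == 0:
                P[i, j] = cdf(hi)                  # lump left tail
            elif j == n_states - 1:
                P[i, j] = 1 - cdf(lo)              # lump right tail
            else:
                P[i, j] = cdf(hi) - cdf(lo)
    return grid, P
```

    It is precisely in the highly persistent case (rho near 1) that this benchmark deteriorates, which is the regime the abstract highlights.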
  6. By: Friederike Greb (Georg-August-University Göttingen); Tatyana Krivobokova (Georg-August-University Göttingen); Axel Munk (Georg-August-University Göttingen); Stephan von Cramon-Taubadel (Georg-August-University Göttingen)
    Abstract: Estimation of threshold parameters in (generalized) threshold regression models is typically performed by maximizing the corresponding profile likelihood function. In addition, certain Bayesian techniques based on non-informative priors have been developed and are widely used. This article draws attention to settings (not rare in practice) in which these standard estimators either perform poorly or even fail. In particular, if estimation of the regression coefficients is associated with high uncertainty, the profile likelihood for the threshold parameters, and thus the corresponding estimator, can be strongly affected. We suggest an alternative estimation method employing the empirical Bayes paradigm, which allows us to circumvent the deficiencies of standard estimators. The new estimator is completely data-driven and requires little additional numerical effort compared with the standard approach. Simulation results show that our estimator outperforms commonly used estimators and produces excellent results even where the latter perform poorly. The practical relevance of our approach is illustrated with a real-data example; we follow up the analysis of cross-country growth behavior detailed in Hansen (2000).
    Keywords: threshold estimation; nuisance parameters; empirical Bayes
    Date: 2011–10–07
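    The standard profile estimator the abstract refers to can be sketched, for a simple two-regime least-squares model, as a grid search over candidate thresholds with the regression coefficients concentrated out (an illustration of the conventional baseline, not the authors' empirical-Bayes estimator):

```python
import numpy as np

def profile_threshold(y, x, q, trim=0.15):
    """Profile least-squares estimate of the threshold g in
    y = (b1 + b2*x)*1{q <= g} + (b3 + b4*x)*1{q > g} + e.
    For each candidate g the regime coefficients are concentrated out
    by OLS; g_hat minimizes the resulting sum of squared residuals.
    Candidates are trimmed so each regime keeps enough observations."""
    cands = np.sort(q)
    lo, hi = int(trim * len(q)), int((1 - trim) * len(q))
    best_g, best_ssr = None, np.inf
    for g in cands[lo:hi]:
        below = q <= g
        ssr = 0.0
        for mask in (below, ~below):
            X = np.column_stack([np.ones(mask.sum()), x[mask]])
            beta, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
            resid = y[mask] - X @ beta
            ssr += resid @ resid
        if ssr < best_ssr:
            best_ssr, best_g = ssr, g
    return best_g
```

    The article's point is that when the coefficients beta are estimated with high uncertainty, this profile objective becomes erratic and the grid minimizer unreliable.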
  7. By: Tobias Adrian; Markus K. Brunnermeier
    Abstract: We propose a measure for systemic risk: CoVaR, the value at risk (VaR) of the financial system conditional on institutions being under distress. We define an institution's contribution to systemic risk as the difference between CoVaR conditional on the institution being under distress and CoVaR in the median state of the institution. From our estimates of CoVaR for the universe of publicly traded financial institutions, we quantify the extent to which characteristics such as leverage, size, and maturity mismatch predict systemic risk contribution. We also provide out-of-sample forecasts of a countercyclical, forward-looking measure of systemic risk and show that the 2006Q4 value of this measure would have predicted more than half of realized covariances during the financial crisis.
    JEL: G21 G22
    Date: 2011–10
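    A simplified empirical version of the Delta-CoVaR contribution can be sketched as follows (the paper estimates the conditional quantiles by quantile regression; here, purely for illustration, we condition on a small band around the institution's own quantile):

```python
import numpy as np

def delta_covar(system, inst, alpha=0.05, band=0.01):
    """Simplified empirical Delta-CoVaR: the alpha-quantile (VaR) of
    system returns conditional on the institution being (a) at its own
    alpha-quantile (distress) versus (b) at its median state.
    Conditioning is approximated by a small quantile band around the
    institution's returns; the paper instead uses quantile regression."""
    def conditional_var(level):
        lo, hi = np.quantile(inst, [max(level - band, 0.0), level + band])
        sel = (inst >= lo) & (inst <= hi)       # obs near that quantile
        return np.quantile(system[sel], alpha)  # system VaR given state
    covar_distress = conditional_var(alpha)     # CoVaR given distress
    covar_median = conditional_var(0.5)         # CoVaR given median state
    return covar_distress - covar_median        # Delta-CoVaR
```

    For an institution whose returns co-move with the system, distress drags the system's conditional VaR down, so the contribution is negative in return space (i.e., a larger loss).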
  8. By: McElroy, Tucker S; Politis, D N
    Abstract: We consider the problem of estimating the variance of the partial sums of a stationary time series that has either long memory, short memory, negative/intermediate memory, or is the first-difference of such a process. The rate of growth of this variance depends crucially on the type of memory, and we present results on the behavior of tapered sums of sample autocovariances in this context when the bandwidth vanishes asymptotically. We also present asymptotic results for the case that the bandwidth is a fixed proportion of sample size, extending known results to the case of flat-top tapers. We adopt the fixed-proportion bandwidth perspective in our empirical section, presenting two methods for estimating the limiting critical values – the subsampling method and a plug-in approach. Extensive simulation studies compare the size and power of both approaches as applied to hypothesis testing for the mean. Both methods perform well – although the subsampling method appears to be better sized – and provide a viable framework for conducting inference for the mean. In summary, we supply a unified asymptotic theory that covers all different types of memory under a single umbrella.
    Keywords: kernel, lag-windows, overdifferencing, spectral estimation, subsampling, tapers, unit-root problem, Econometrics
    Date: 2011–09–01
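    A minimal sketch of a flat-top tapered estimator of the long-run variance of the sample mean in the short-memory case (an illustration of the tapered sums of sample autocovariances the abstract discusses, not the paper's code):

```python
import numpy as np

def flat_top_lrv(x, bandwidth):
    """Estimate the long-run variance of the sample mean, i.e.
    sum over h of gamma(h), by weighting sample autocovariances with a
    flat-top (trapezoidal) taper:
    lambda(t) = 1 for |t| <= 1/2, 2*(1-|t|) for 1/2 < |t| <= 1, else 0."""
    n = len(x)
    xc = x - x.mean()
    M = int(bandwidth)
    def taper(t):
        t = abs(t)
        return 1.0 if t <= 0.5 else max(2.0 * (1.0 - t), 0.0)
    lrv = xc @ xc / n                        # lag-0 sample autocovariance
    for h in range(1, min(M, n - 1) + 1):
        gamma = xc[h:] @ xc[:-h] / n         # lag-h sample autocovariance
        lrv += 2.0 * taper(h / M) * gamma    # symmetric lags +-h
    return lrv
```

    For i.i.d. data the estimate converges to the innovation variance; under long or negative memory the appropriate normalization changes, which is what the paper's unified theory addresses.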

This nep-ets issue is ©2011 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.