nep-ets New Economics Papers
on Econometric Time Series
Issue of 2006‒12‒01
nine papers chosen by
Yong Yin
SUNY at Buffalo

  1. The yen real exchange rate may be stationary after all: evidence from non-linear unit root tests By Georgios Chortareas; George Kapetanios
  2. Testing Covariance Stationarity By Zhijie Xiao; Luiz Renato Lima
  3. A New Algebraic Approach to Representation Theorems for (Co)integrated Processes up to the Second Order By Maria Grazia Zoia
  4. Improving Consistent Moment Selection Procedures for Generalized Method of Moments Estimation By Jean-Bernard Chatelain
  5. Modelling autoregressive processes with a shifting mean By González, Andrés; Teräsvirta, Timo
  6. Shift versus traditional contagion in Asian markets By Thomas Flavin; Ekaterini Panopoulou
  7. Modeling and forecasting the volatility of Brazilian asset returns By Marcelo Carvalho; Marco Aurelio Freire; Marcelo Cunha Medeiros; Leonardo Souza
  8. Realized volatility: a review By Michael McAleer; Marcelo Cunha Medeiros
  9. Bayesian inference in cointegrated VAR models - with applications to the demand for euro area M3 By Anders Warne

  1. By: Georgios Chortareas; George Kapetanios
    Abstract: The empirical literature that tests for purchasing power parity (PPP) by focusing on the stationarity of real exchange rates has so far provided, at best, mixed results. The behaviour of the yen real exchange rate has most stubbornly challenged the PPP hypothesis and deepened this puzzle. This paper contributes to this discussion by providing new evidence on the stationarity of bilateral yen real exchange rates. We employ a non-linear version of the Augmented Dickey-Fuller test, based on an exponential smooth transition autoregressive (ESTAR) model, which enhances the power of the tests against mean-reverting non-linear alternative hypotheses. Our results suggest that the bilateral yen real exchange rates against the other G7 and Asian currencies were mean reverting during the post-Bretton Woods era. Thus, the real yen behaviour may not be so different after all but simply perceived to be so due to the use of a restrictive alternative hypothesis in previous tests.
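    For readers who want to experiment, the auxiliary regression behind KSS-type ESTAR unit root tests can be sketched in a few lines. This is a minimal illustration, not the authors' code: the demeaning, lag length, and OLS details are assumptions, and the resulting t-statistic must be compared against the nonstandard critical values tabulated by Kapetanios, Shin and Snell rather than the usual Dickey-Fuller ones.

```python
import numpy as np

def kss_stat(y, lags=1):
    """t-statistic on delta in  dy_t = delta*y_{t-1}^3 + sum_i rho_i*dy_{t-i} + e_t.

    Under H0 (unit root) delta = 0; under the globally stationary ESTAR
    alternative delta < 0, so the test is one-sided to the left.
    """
    y = np.asarray(y, float)
    y = y - y.mean()                          # demeaned case (an assumption here)
    dy = np.diff(y)
    X = [y[lags:-1] ** 3]                     # cubic term replaces y_{t-1} of the ADF test
    for i in range(1, lags + 1):
        X.append(dy[lags - i:len(dy) - i])    # lagged differences to soak up serial correlation
    X = np.column_stack(X)
    z = dy[lags:]
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    resid = z - X @ beta
    s2 = resid @ resid / (len(z) - X.shape[1])
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[0, 0])
    return beta[0] / se
```

    A strongly mean-reverting series should produce a large negative statistic, while a random walk should not.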
  2. By: Zhijie Xiao; Luiz Renato Lima
    Date: 2006–11
  3. By: Maria Grazia Zoia
    Abstract: The paper establishes a unified representation theorem for (co)integrated processes up to the second order which provides a compact and informative insight into the solution of VAR models with unit roots, and sheds light on the cointegration features of the engendered processes. The theorem is primarily stated by taking a one-lag specification as a reference frame, and it is afterwards extended to cover the case of an arbitrary number of lags via a companion-form based approach. All proofs are obtained by resorting to an innovative and powerful algebraic apparatus tailored to the derivation of the intended results.
    Keywords: Unified representation theorem, Cointegration, Orthogonal-complement algebra, Laurent expansion in matrix form
    Date: 2006–10
  4. By: Jean-Bernard Chatelain (PSE - Paris-Jourdan Sciences Economiques - [CNRS : UMR8545] - [Ecole des Hautes Etudes en Sciences Sociales][Ecole Nationale des Ponts et Chaussées][Ecole Normale Supérieure de Paris], EconomiX - [CNRS : UMR7166] - [Université de Paris X - Nanterre])
    Abstract: This paper proposes consistent moment selection procedures for generalized method of moments estimation based on the J test of over-identifying restrictions (Hansen [1982]) and on the Eichenbaum, Hansen and Singleton [1988] test of the validity of a subset of moment conditions.
    Keywords: Generalized method of moments, test of over-identifying restrictions, test of subset of over-identifying restrictions, Consistent Moment Selection
    Date: 2006–11–08
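    As a rough illustration of the ingredients such procedures combine, the sketch below computes a two-step GMM J statistic for a linear moment condition and an Andrews-style BIC moment selection criterion that trades the J statistic off against the number of over-identifying restrictions. Function names and the penalty form are illustrative assumptions, not the paper's own procedure (which also builds on subset tests of moment validity).

```python
import numpy as np

def gmm_j(y, X, Z):
    """Two-step GMM J statistic for the moment condition E[z_t (y_t - x_t'beta)] = 0."""
    n = len(y)
    # first step: 2SLS weight matrix W = (Z'Z/n)^-1
    W = np.linalg.inv(Z.T @ Z / n)
    A = X.T @ Z @ W @ Z.T @ X
    b = np.linalg.solve(A, X.T @ Z @ W @ Z.T @ y)
    u = y - X @ b
    # second step: efficient weight from first-step residuals
    g = Z * u[:, None]
    S = g.T @ g / n
    W = np.linalg.inv(S)
    A = X.T @ Z @ W @ Z.T @ X
    b = np.linalg.solve(A, X.T @ Z @ W @ Z.T @ y)
    u = y - X @ b
    gbar = Z.T @ u / n
    return n * gbar @ W @ gbar, b

def msc_bic(J, q, p, n):
    """BIC-type moment selection criterion: J minus a bonus for extra valid moments.

    q = number of moments, p = number of parameters; smaller values are preferred,
    so a moment set whose J blows up (invalid moments) is rejected.
    """
    return J - (q - p) * np.log(n)
```

    With valid instruments the J statistic stays near its chi-squared reference; adding an instrument correlated with the error inflates it by an order of magnitude, and the criterion then prefers the smaller, valid set.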
  5. By: González, Andrés (Unidad de Investigaciones Económicas, Banco de la República); Teräsvirta, Timo (Department of Economics, University of Aarhus, and Department of Economic Statistics, Stockholm School of Economics)
    Abstract: In this paper we introduce an autoregressive model with a deterministically shifting intercept. This implies that the model has a shifting mean and is thus nonstationary but stationary around a nonlinear deterministic component. The shifting intercept is defined as a linear combination of logistic transition functions with time as the transition variables. The number of transition functions is determined by selecting the appropriate functions from a possibly large set of alternatives using a sequence of specification tests. This selection procedure is a modification of a similar technique developed for neural network modelling by White (2006). A Monte Carlo experiment is conducted to show how the proposed modelling procedure and some of its variants work in practice. The paper contains two applications in which the results are compared with what is obtained by assuming that the time series used as examples may contain structural breaks instead of smooth transitions and selecting the number of breaks following the technique of Bai and Perron (1998).
    Keywords: deterministic shift; nonlinear autoregression; nonstationarity; nonlinear trend; smooth transition; structural change
    JEL: C22 C52
    Date: 2006–09–27
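    A shifting-mean autoregression of the kind described can be simulated directly. The parameterisation below, with time as the transition variable, is an illustrative sketch (function names and parameter values are assumptions); it shows how the intercept, and hence the mean, moves smoothly between regimes instead of breaking abruptly.

```python
import numpy as np

def logistic_transition(t, T, gamma, c):
    """G(t; gamma, c) = 1 / (1 + exp(-gamma * (t/T - c))), time as transition variable."""
    return 1.0 / (1.0 + np.exp(-gamma * (t / T - c)))

def simulate_shifting_mean_ar1(T, alpha, delta0, shifts, sigma=1.0, seed=0):
    """Simulate y_t = delta(t) + alpha * y_{t-1} + eps_t.

    delta(t) is delta0 plus a linear combination of logistic transitions;
    `shifts` is a list of (lambda, gamma, c) triples, one per transition.
    """
    rng = np.random.default_rng(seed)
    t = np.arange(T)
    delta = delta0 + sum(lam * logistic_transition(t, T, g, c)
                         for lam, g, c in shifts)
    y = np.zeros(T)
    for i in range(1, T):
        y[i] = delta[i] + alpha * y[i - 1] + sigma * rng.standard_normal()
    return y, delta
```

    A single steep transition (large gamma) approximates a structural break; a gentle one gives a slowly drifting mean, which is exactly the distinction the paper's specification tests are meant to pick up.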
  6. By: Thomas Flavin; Ekaterini Panopoulou
    Abstract: We test for shift contagion between pairs of East Asian equity markets over a sample including the financial crisis of the 1990s. Employing the methodology of Gravelle et al. (2006), we find little evidence of change in the mechanism by which common shocks are transmitted between countries. Furthermore, we analyze the effects of idiosyncratic shocks and generate time-varying conditional correlations. While there clearly is significant time variation in the pairwise correlations, this is not more pronounced during the Asian crisis than it had been historically.
    Keywords: Shift contagion; Financial market crises; Regime switching; Structural transmission; Emerging markets
    Date: 2006–11–16
  7. By: Marcelo Carvalho; Marco Aurelio Freire; Marcelo Cunha Medeiros; Leonardo Souza
    Abstract: The goal of this paper is twofold. First, using five of the most actively traded stocks in the Brazilian financial market, this paper shows that the normality assumption commonly used in the risk management area to describe the distributions of returns standardized by volatilities is not compatible with volatilities estimated by EWMA or GARCH models. In sharp contrast, when the information contained in high frequency data is used to construct the realized volatility measures, we attain the normality of the standardized returns, promising improvements in Value-at-Risk statistics. We also describe the distributions of volatilities of the Brazilian stocks, showing that they are nearly lognormal. Second, we estimate a simple model for the log of realized volatilities that differs from the ones in other studies. The main difference is that we do not find evidence of long memory. The estimated model is compared with commonly used alternatives in an out-of-sample forecasting experiment.
    Date: 2006–11
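    The normality point can be illustrated with simulated data: returns standardized by their realized volatility lose the fat tails of the raw returns. The simulation below is a stylized sketch under the assumption of lognormal daily volatility that is constant within the day, not the paper's empirical design.

```python
import numpy as np

def simulate_day(sigma, M, rng):
    """M intraday returns for one day with within-day volatility sigma (illustrative)."""
    return (sigma / np.sqrt(M)) * rng.standard_normal(M)

def kurtosis(x):
    """Sample kurtosis; 3 for a normal distribution, larger for fat tails."""
    x = x - x.mean()
    return np.mean(x ** 4) / np.mean(x ** 2) ** 2

rng = np.random.default_rng(42)
T, M = 2000, 78                                        # 2000 days, 78 five-minute returns per day
sigma = np.exp(0.5 * rng.standard_normal(T) - 0.125)   # lognormal daily volatility (assumed)
intraday = np.array([simulate_day(s, M, rng) for s in sigma])
daily_ret = intraday.sum(axis=1)                       # daily returns: fat-tailed mixture
rv = (intraday ** 2).sum(axis=1)                       # realized variance from intraday data
std_ret = daily_ret / np.sqrt(rv)                      # returns standardized by realized vol
```

    The raw daily returns are a volatility mixture and show excess kurtosis, while the standardized series is close to Gaussian, mirroring the paper's finding for Brazilian stocks.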
  8. By: Michael McAleer (University of Western Australia); Marcelo Cunha Medeiros (Department of Economics PUC-Rio)
    Abstract: This paper reviews the exciting and rapidly expanding literature on realized volatility. After presenting a general univariate framework for estimating realized volatilities, a simple discrete time model is presented in order to motivate the main results. A continuous time specification provides the theoretical foundation for the main results in this literature. Cases with and without microstructure noise are considered, and it is shown how microstructure noise can cause severe problems in terms of consistent estimation of the daily realized volatility. Independent and dependent noise processes are examined. The most important methods for providing consistent estimators are presented, and a critical exposition of different techniques is given. The finite sample properties are discussed in comparison with their asymptotic properties. A multivariate model is presented to discuss estimation of the realized covariances. Various issues relating to modelling and forecasting realized volatilities are considered. The main empirical findings using univariate and multivariate methods are summarized.
    Date: 2006–11
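    The basic estimator and the microstructure-noise problem surveyed above can be sketched with simulated data: summing squared returns at the highest frequency inflates the estimate by roughly twice the number of increments times the noise variance, while sparse sampling trades a little efficiency for a much smaller bias. All parameter values below are illustrative assumptions.

```python
import numpy as np

def realized_variance(prices, step=1):
    """Sum of squared log-price increments, sampling every `step` ticks."""
    p = np.asarray(prices, float)[::step]
    r = np.diff(p)
    return float(r @ r)

rng = np.random.default_rng(7)
n = 23400                                # one tick per second over a 6.5-hour session
sigma2 = 1e-4                            # assumed daily integrated variance
# efficient log-price: Brownian motion with total daily variance sigma2
efficient = np.cumsum(rng.standard_normal(n) * np.sqrt(sigma2 / n))
# observed price contaminated with iid microstructure noise
noisy = efficient + 5e-4 * rng.standard_normal(n)

rv_tick = realized_variance(noisy)             # tick-by-tick: biased up by ~2*n*Var(noise)
rv_sparse = realized_variance(noisy, step=300)  # ~5-minute sampling: far less biased
```

    The tick-by-tick estimate is dominated by the noise term, which is why the consistent estimators reviewed in the paper (sparse sampling, subsampling, kernels) are needed.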
  9. By: Anders Warne (European Central Bank, Kaiserstrasse 29, 60311 Frankfurt am Main, Germany)
    Abstract: The paper considers a Bayesian approach to the cointegrated VAR model with a uniform prior on the cointegration space. Building on earlier work by Villani (2005b), where the posterior probability of the cointegration rank can be calculated conditional on the lag order, the current paper also makes it possible to compute the joint posterior probability of these two parameters as well as the marginal posterior probabilities under the assumption of a known upper bound for the lag order. When the marginal likelihood identity is used for calculating these probabilities, a point estimator of the cointegration space and the weights is required. Analytical expressions for the mode of the joint posterior of these parameter matrices are therefore derived. The procedure is applied to a money demand system for the euro area and the results are compared to those obtained from a maximum likelihood analysis. JEL Classification: C11, C15, C32, E41.
    Keywords: Bayesian inference, cointegration, lag order, money demand, vector autoregression.
    Date: 2006–11

This nep-ets issue is ©2006 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at <>. For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.