nep-ets New Economics Papers
on Econometric Time Series
Issue of 2011‒07‒27
five papers chosen by
Yong Yin
SUNY at Buffalo

  1. Closed-Form Likelihood Expansions for Multivariate Time-Inhomogeneous Diffusions By Seungmoon Choi
  2. Forecasting in the presence of recent structural change By Jana Eklund; George Kapetanios; Simon Price
  3. Doubly fractional models for dynamic heteroskedastic cycles. By Miguel Artiach; Josu Arteche
  4. Does the Box-Cox transformation help in forecasting macroeconomic time series? By Tommaso Proietti; Helmut Luetkepohl
  5. Wild bootstrap of the mean in the infinite variance case By Giuseppe Cavaliere; Iliyan Georgiev; A.M.Robert Taylor

  1. By: Seungmoon Choi (School of Economics, University of Adelaide)
    Abstract: The aim of this paper is to find closed-form approximate log-transition density functions for multivariate time-inhomogeneous diffusions. There is ample empirical evidence that the data generating process governing the dynamics of many economic variables may vary over time because of changes in the economic climate or time effects. One way to capture the time-dependent dynamics of state variables is to model the drift or volatility terms as functions of time t as well as of the state variables. A method for obtaining closed-form likelihood expansions for multivariate time-homogeneous diffusions was developed by Ait-Sahalia (2008). This paper builds on his work and extends his results to time-inhomogeneous cases. We conduct Monte Carlo simulation studies to examine the performance of the approximate transition density function when it is used to obtain ML estimates. The results reveal that our method yields a very accurate approximate likelihood function, which is a good candidate when the true likelihood function is unavailable, as is often the case.
    Keywords: likelihood function; multivariate time-inhomogeneous diffusion; reducible diffusions; irreducible diffusions
    Date: 2011–07
    URL: http://d.repec.org/n?u=RePEc:adl:wpaper:2011-26&r=ets
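    As a rough illustration of how an approximate transition density enters maximum likelihood estimation of a diffusion, the Python sketch below fits an Ornstein-Uhlenbeck model using the crude Euler (Gaussian) approximation to the transition density. This is only a stand-in for the paper's far more accurate closed-form expansions, and all parameter values and the model itself are illustrative assumptions.

```python
# A minimal sketch (not the paper's method): ML estimation of a diffusion using the
# Euler approximation to the transition density. The paper instead derives closed-form
# expansions; this only shows how an approximate transition density enters the likelihood.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulate an Ornstein-Uhlenbeck process dX = kappa*(theta - X) dt + sigma dW (assumed example).
kappa_true, theta_true, sigma_true, dt, n = 2.0, 1.0, 0.5, 1.0 / 12, 600
x = np.empty(n)
x[0] = theta_true
for t in range(1, n):
    x[t] = x[t - 1] + kappa_true * (theta_true - x[t - 1]) * dt \
           + sigma_true * np.sqrt(dt) * rng.standard_normal()

def neg_log_lik(params):
    """Negative log-likelihood built from the Euler (Gaussian) one-step transition density."""
    kappa, theta, sigma = params
    if sigma <= 0:
        return np.inf
    mean = x[:-1] + kappa * (theta - x[:-1]) * dt   # conditional mean of x_{t+1} given x_t
    var = sigma ** 2 * dt                           # conditional variance
    resid = x[1:] - mean
    return 0.5 * np.sum(np.log(2 * np.pi * var) + resid ** 2 / var)

fit = minimize(neg_log_lik, x0=[1.0, 0.5, 0.3], method="Nelder-Mead")
print("ML estimates (kappa, theta, sigma):", fit.x)
```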
  2. By: Jana Eklund; George Kapetanios; Simon Price
    Abstract: We examine how to forecast after a recent break. We consider two approaches: monitoring for change and then combining forecasts from models that do and do not use pre-break data; and robust methods, namely rolling regressions, forecast averaging over different windows, and exponentially weighted moving average (EWMA) forecasting. We derive analytical results for the performance of the robust methods relative to a full-sample recursive benchmark. For a location model subject to stochastic breaks, the relative MSFE ranking is EWMA < rolling regression < forecast averaging. No clear ranking emerges under deterministic breaks. In Monte Carlo experiments, forecast averaging improves performance in many cases, with little penalty where changes are small or infrequent. Similar results emerge when we examine a large number of UK and US macroeconomic series.
    JEL: C10 C59
    Date: 2011–07
    URL: http://d.repec.org/n?u=RePEc:acb:camaaa:2011-23&r=ets
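    The following Python sketch illustrates, under an assumed toy setup rather than the authors' design, the forecasting devices compared in the paper for a location model with a single break: the full-sample recursive mean benchmark, a rolling mean, forecast averaging over windows, and an EWMA forecast. The window length and smoothing weight are illustrative assumptions.

```python
# A minimal sketch (assumed setup, not the authors' code): one-step-ahead forecasts of a
# location model with a single break, comparing the full-sample recursive benchmark with
# the robust devices discussed above.
import numpy as np

rng = np.random.default_rng(1)
n, break_point = 200, 150
y = rng.standard_normal(n) + np.where(np.arange(n) < break_point, 0.0, 1.0)

window, lam = 30, 0.1   # rolling window length and EWMA smoothing weight (assumptions)
errors = {"recursive": [], "rolling": [], "averaging": [], "ewma": []}

for t in range(window, n - 1):
    history = y[: t + 1]
    recursive = history.mean()                        # full-sample recursive benchmark
    rolling = history[-window:].mean()                # rolling-window forecast
    avg = np.mean([history[-w:].mean()                # average of forecasts over many windows
                   for w in range(10, len(history) + 1, 10)])
    weights = (1 - lam) ** np.arange(len(history))[::-1]
    ewma = np.sum(weights * history) / np.sum(weights)    # exponentially weighted forecast
    for name, f in zip(errors, (recursive, rolling, avg, ewma)):
        errors[name].append((y[t + 1] - f) ** 2)

for name, e in errors.items():
    print(f"{name:10s} MSFE: {np.mean(e):.3f}")
```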
  3. By: Miguel Artiach (Universidad de Alicante); Josu Arteche (UPV/EHU)
    Abstract: Strong persistence is a common phenomenon that has been documented not only in the levels but also in the volatility of many time series. The class of doubly fractional models is extended to allow for long memory at cyclical (non-zero) frequencies in both the levels and the volatility, and a new model, the GARMA-GARMASV (Gegenbauer AutoRegressive Moving Average - Gegenbauer AutoRegressive Moving Average Stochastic Volatility), is introduced. A sequential estimation strategy, based on the Whittle approximation to maximum likelihood, is proposed, and its finite sample performance is evaluated in a Monte Carlo analysis. Finally, a version of the model with three factors in the mean and two in the volatility is shown to fit the well-known sunspot index successfully.
    Keywords: Stochastic volatility; cycles; long memory; QML estimation; sunspot index.
    JEL: C22 C13
    Date: 2011–07–14
    URL: http://d.repec.org/n?u=RePEc:ehu:biltok:201103&r=ets
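    As a rough sketch of the building block behind this model, the Python code below simulates a single-factor Gegenbauer process with long memory at a non-zero frequency and estimates its memory parameter and cycle frequency by a concentrated Whittle objective. This is only an illustration of the mechanics under assumed parameter values; the paper's GARMA-GARMASV adds the same structure to the volatility and uses a sequential strategy, neither of which is reproduced here.

```python
# A minimal sketch (assumptions, not the paper's full model): a Gegenbauer process and
# Whittle estimation of its (d, u) parameters from the periodogram.
import numpy as np
from scipy.optimize import minimize

def gegenbauer_coeffs(d, u, m):
    """First m coefficients of the MA(inf) expansion of (1 - 2*u*B + B^2)^(-d)."""
    c = np.zeros(m)
    c[0] = 1.0
    if m > 1:
        c[1] = 2.0 * d * u
    for j in range(2, m):
        c[j] = (2.0 * u * (j + d - 1) * c[j - 1] - (j + 2 * d - 2) * c[j - 2]) / j
    return c

def whittle_objective(params, periodogram, freqs):
    """Concentrated Whittle objective: sigma^2 is profiled out, so only (d, u) are fitted."""
    d, u = params
    if not (0.0 < d < 0.5 and -1.0 < u < 1.0):
        return np.inf
    g = np.maximum(4.0 * (np.cos(freqs) - u) ** 2, 1e-12) ** (-d)   # spectrum shape, sigma^2 = 1
    sigma2_hat = np.mean(periodogram / g)
    return np.log(sigma2_hat) + np.mean(np.log(g))

# Simulate the process by truncating its MA(inf) representation (truncation lag m is an assumption).
rng = np.random.default_rng(2)
n, m = 2000, 500
d_true, u_true = 0.3, np.cos(2 * np.pi / 11)            # long memory at an 11-period cycle
eps = rng.standard_normal(n + m)
c = gegenbauer_coeffs(d_true, u_true, m)
x = np.convolve(eps, c)[m - 1 : m - 1 + n]               # each x_t uses the full m-lag filter

# Periodogram at the Fourier frequencies and Whittle estimation of (d, u).
freqs = 2 * np.pi * np.arange(1, n // 2) / n
periodogram = np.abs(np.fft.fft(x)[1 : n // 2]) ** 2 / (2 * np.pi * n)
fit = minimize(whittle_objective, x0=[0.2, 0.5], args=(periodogram, freqs), method="Nelder-Mead")
d_hat, u_hat = fit.x
print(f"d = {d_hat:.2f} (true {d_true}), "
      f"cycle freq = {np.arccos(np.clip(u_hat, -1, 1)):.3f} (true {np.arccos(u_true):.3f})")
```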
  4. By: Tommaso, Proietti; Helmut, Luetkepohl
    Abstract: The paper investigates whether transforming a time series leads to an improvement in forecasting accuracy. The class of transformations considered is the Box-Cox power transformation, which applies to series measured on a ratio scale. We propose a nonparametric approach for estimating the optimal transformation parameter based on frequency domain estimation of the prediction error variance, and we also conduct an extensive recursive forecast experiment on a large set of seasonal monthly macroeconomic time series related to industrial production and retail turnover. In about one fifth of the series considered, the Box-Cox transformation produces forecasts that are significantly better than those from the untransformed data at the one-step-ahead horizon; in most cases the logarithmic transformation is the relevant one. As the forecast horizon increases, the evidence in favour of a transformation becomes less strong. Typically, the naïve predictor that simply reverses the transformation leads to a lower mean square error than the optimal predictor at short forecast leads. We also discuss whether the preliminary in-sample frequency domain assessment provides reliable guidance as to which series should be transformed in order to improve predictive performance significantly.
    Keywords: Forecast comparisons; Multi-step forecasting; Rolling forecasts; Nonparametric estimation of prediction error variance.
    JEL: C53 C14 C52 C22
    Date: 2011–07–18
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:32294&r=ets
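    The Python sketch below illustrates the back-transformation issue discussed above on an assumed toy series: a forecast is formed on the Box-Cox (here logarithmic) scale and then mapped back either naïvely, by simply inverting the transformation, or with the bias correction that gives the conditional mean under log-normality. The data-generating process and parameter values are illustrative assumptions, not the authors' experiment.

```python
# A minimal sketch (assumed setup): Box-Cox transform, forecast on the transformed scale,
# then compare the naive back-transform with the bias-corrected (log-normal mean) predictor.
import numpy as np

def box_cox(y, lam):
    """Box-Cox power transform; lam = 0 gives the log transform."""
    return np.log(y) if lam == 0 else (y ** lam - 1) / lam

def inv_box_cox(z, lam):
    return np.exp(z) if lam == 0 else (lam * z + 1) ** (1 / lam)

rng = np.random.default_rng(3)
n = 300
z = np.cumsum(0.01 + 0.05 * rng.standard_normal(n))    # random walk with drift in logs
y = np.exp(z)                                           # positive, ratio-scale series

lam = 0.0                                               # assume the log transform is selected
zt = box_cox(y, lam)
drift = np.mean(np.diff(zt[:-1]))
sigma2 = np.var(np.diff(zt[:-1]))
z_forecast = zt[-2] + drift                             # one-step forecast on the transformed scale

naive = inv_box_cox(z_forecast, lam)                    # just reverse the transformation
corrected = np.exp(z_forecast + 0.5 * sigma2)           # conditional mean under log-normality
print(f"actual {y[-1]:.3f}  naive {naive:.3f}  bias-corrected {corrected:.3f}")
```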
  5. By: Giuseppe Cavaliere (Università di Bologna); Iliyan Georgiev (Faculdade de Economia, Universidade Nova de Lisboa); A.M.Robert Taylor (School of Economics, University of Nottingham)
    Abstract: It is well known that the standard i.i.d. bootstrap of the mean is inconsistent in a location model with infinite variance (α-stable) innovations. This occurs because the bootstrap distribution of a normalised sum of infinite variance random variables tends to a random distribution. Consistent bootstrap algorithms based on subsampling methods have been proposed, but they have the drawback of delivering much wider confidence sets than the i.i.d. bootstrap, because they eliminate the dependence of the bootstrap distribution on the sample extremes. In this paper we propose sufficient conditions under which a simple modification of the bootstrap, the wild bootstrap of Wu (1986, Annals of Statistics), is consistent (in a conditional sense) while also reproducing the narrower confidence sets of the i.i.d. bootstrap. Numerical results demonstrate that the proposed bootstrap method works very well in practice, delivering coverage rates very close to the nominal level and significantly narrower confidence sets than other consistent methods.
    Keywords: Bootstrap; stable distributions; random probability measures; weak convergence
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:bot:quadip:108&r=ets
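    The following Python sketch shows one way a wild bootstrap of the sample mean can be implemented with Rademacher multipliers on heavy-tailed data; it is an assumed illustration of the general device, not the authors' code, and the Student-t(1.5) data merely stand in for α-stable innovations.

```python
# A minimal sketch (assumed implementation): wild bootstrap of the mean with Rademacher
# multipliers. Each bootstrap draw perturbs the centred observations by random signs,
# which preserves the influence of the sample extremes, unlike subsampling-based schemes.
import numpy as np

rng = np.random.default_rng(4)
n, B = 200, 2000
# Heavy-tailed data: Student-t with 1.5 degrees of freedom stands in for alpha-stable noise.
x = rng.standard_t(df=1.5, size=n) + 1.0
x_bar = x.mean()

boot_means = np.empty(B)
for b in range(B):
    signs = rng.choice([-1.0, 1.0], size=n)            # Rademacher multipliers
    boot_means[b] = x_bar + np.mean(signs * (x - x_bar))

lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"sample mean {x_bar:.3f}, 95% wild-bootstrap interval [{lo:.3f}, {hi:.3f}]")
```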

This nep-ets issue is ©2011 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.