nep-ets New Economics Papers
on Econometric Time Series
Issue of 2011‒06‒11
fourteen papers chosen by
Yong Yin
SUNY at Buffalo

  1. Ranking Multivariate GARCH Models by Problem Dimension: An Empirical Evaluation By Massimiliano Caporin; Michael McAleer
  2. Bayesian Inference in the Time Varying Cointegration Model By Gary Koop; Roberto Leon-Gonzalez; Rodney Strachan
  3. Regime-Switching Cointegration By Markus Jochmann; Gary Koop
  4. Bayesian Model Averaging in the Instrumental Variable Regression Model By Gary Koop; Roberto Leon-Gonzalez; Rodney Strachan
  5. Forecasting with Medium and Large Bayesian VARs By Gary Koop
  6. Modelling Breaks and Clusters in the Steady States of Macroeconomic Variables By Gary Koop; Joshua Chan
  7. Forecasting Inflation Using Dynamic Model Averaging By Gary Koop; Dimitris Korobilis
  8. A Comparison of Forecasting Procedures for Macroeconomic Series: The Contribution of Structural Break Models By Luc Bauwens; Gary Koop; Dimitris Korobilis; Jeroen Rombouts
  9. Asymptotic equivalence and sufficiency for volatility estimation under microstructure noise By Markus Reiß
  10. Estimation of the characteristics of a Lévy process observed at arbitrary frequency By Johanna Kappus; Markus Reiß
  11. Time and frequency domain in the business cycle structure By Jitka Poměnková; Roman Maršálek
  12. Evaluating density forecasts: a comment By Tsyplakov, Alexander
  13. Inference on Impulse Response Functions in Structural VAR Models By Inoue, Atsushi; Kilian, Lutz
  14. Uncovering Long Memory in High Frequency UK Futures By John Cotter

  1. By: Massimiliano Caporin; Michael McAleer (University of Canterbury)
    Abstract: In the last 15 years, several Multivariate GARCH (MGARCH) models have appeared in the literature. Recent research has begun to examine MGARCH specifications in terms of their out-of-sample forecasting performance. In this paper, we provide an empirical comparison of a set of models, namely BEKK, DCC, Corrected DCC (cDCC) of Aielli (2008), CCC, Exponentially Weighted Moving Average, and covariance shrinking, using historical data of 89 US equities. Our methods follow part of the approach described in Patton and Sheppard (2009), and the paper contributes to the literature in several directions. First, we consider a wide range of models, including the recent cDCC model and covariance shrinking. Second, we use a range of tests and approaches for direct and indirect model comparison, including the Weighted Likelihood Ratio test of Amisano and Giacomini (2007). Third, we examine how the model rankings are influenced by the cross-sectional dimension of the problem.
    Keywords: Covariance forecasting; model confidence set; model ranking; MGARCH; model comparison
    JEL: C32 C53 C52
    Date: 2011–05–01
    URL: http://d.repec.org/n?u=RePEc:cbt:econwp:11/23&r=ets
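    [Editor's note] Of the covariance forecasting schemes in this comparison, the Exponentially Weighted Moving Average is the simplest to state. A minimal sketch of a RiskMetrics-style EWMA covariance recursion follows; the decay lambda = 0.94 and the sample-covariance initialisation are conventional assumptions, not the paper's settings.

```python
import numpy as np

def ewma_covariance(returns, lam=0.94):
    """RiskMetrics-style EWMA covariance forecast.

    Recursion: S_t = lam * S_{t-1} + (1 - lam) * r_{t-1} r_{t-1}'.
    lam = 0.94 is the conventional daily decay (an assumption here,
    not a setting taken from the paper).
    """
    T, N = returns.shape
    S = np.cov(returns, rowvar=False)  # initialise with the sample covariance
    for t in range(T):
        r = returns[t][:, None]
        S = lam * S + (1.0 - lam) * (r @ r.T)
    return S  # one-step-ahead covariance forecast

# Illustrative data: 500 days of returns on 3 hypothetical assets.
rng = np.random.default_rng(0)
r = rng.normal(size=(500, 3)) * 0.01
S_hat = ewma_covariance(r)
```

    The recursion keeps the forecast symmetric and positive definite by construction, which is one reason EWMA is a common benchmark against the richer MGARCH specifications compared in the paper.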
  2. By: Gary Koop (Department of Economics, University of Strathclyde); Roberto Leon-Gonzalez (National Graduate Institute for Policy Studies); Rodney Strachan (The Australian National University)
    Abstract: There are both theoretical and empirical reasons for believing that the parameters of macroeconomic models may vary over time. However, work with time-varying parameter models has largely involved Vector autoregressions (VARs), ignoring cointegration. This is despite the fact that cointegration plays an important role in informing macroeconomists on a range of issues. In this paper we develop time varying parameter models which permit cointegration. Time-varying parameter VARs (TVP-VARs) typically use state space representations to model the evolution of parameters. In this paper, we show that it is not sensible to use straightforward extensions of TVP-VARs when allowing for cointegration. Instead we develop a specification which allows for the cointegrating space to evolve over time in a manner comparable to the random walk variation used with TVP-VARs. The properties of our approach are investigated before developing a method of posterior simulation. We use our methods in an empirical investigation involving a permanent/transitory variance decomposition for inflation.
    Keywords: Bayesian, time varying cointegration, error correction model, reduced rank regression, Markov Chain Monte Carlo.
    JEL: C11 C32 C33
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:str:wpaper:1121&r=ets
  3. By: Markus Jochmann (Department of Economics, Newcastle University); Gary Koop (Department of Economics, University of Strathclyde)
    Abstract: We develop methods for Bayesian inference in vector error correction models which are subject to a variety of switches in regime (e.g. Markov switches in regime or structural breaks). An important aspect of our approach is that we allow both the cointegrating vectors and the number of cointegrating relationships to change when the regime changes. We show how Bayesian model averaging or model selection methods can be used to deal with the high-dimensional model space that results. Our methods are used in an empirical study of the Fisher effect.
    Keywords: Bayesian, Markov switching, structural breaks, cointegration
    JEL: C11 C32 C52
    Date: 2011–05
    URL: http://d.repec.org/n?u=RePEc:str:wpaper:1125&r=ets
  4. By: Gary Koop (Department of Economics, University of Strathclyde); Roberto Leon-Gonzalez (National Graduate Institute for Policy Studies); Rodney Strachan (The Australian National University)
    Abstract: This paper considers the instrumental variable regression model when there is uncertainty about the set of instruments, exogeneity restrictions, the validity of identifying restrictions and the set of exogenous regressors. This uncertainty can result in a huge number of models. To avoid statistical problems associated with standard model selection procedures, we develop a reversible jump Markov chain Monte Carlo algorithm that allows us to do Bayesian model averaging. The algorithm is very flexible and can be easily adapted to analyze any of the different priors that have been proposed in the Bayesian instrumental variables literature. We show how to calculate the probability of any relevant restriction (e.g. the posterior probability that over-identifying restrictions hold) and discuss diagnostic checking using the posterior distribution of discrepancy vectors. We illustrate our methods in a returns-to-schooling application.
    Keywords: Bayesian, endogeneity, simultaneous equations, reversible jump Markov chain Monte Carlo.
    JEL: C30
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:str:wpaper:1112&r=ets
  5. By: Gary Koop (Department of Economics, University of Strathclyde)
    Abstract: This paper is motivated by the recent interest in the use of Bayesian VARs for forecasting, even in cases where the number of dependent variables is large. In such cases, factor methods have been traditionally used but recent work using a particular prior suggests that Bayesian VAR methods can forecast better. In this paper, we consider a range of alternative priors which have been used with small VARs, discuss the issues which arise when they are used with medium and large VARs and examine their forecast performance using a US macroeconomic data set containing 168 variables. We find that Bayesian VARs do tend to forecast better than factor methods and provide an extensive comparison of the strengths and weaknesses of various approaches. Our empirical results show the importance of using forecast metrics which use the entire predictive density, instead of using only point forecasts.
    Keywords: Bayesian, Minnesota prior, stochastic search variable selection, predictive likelihood
    JEL: C11 C32 C53
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:str:wpaper:1117&r=ets
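    [Editor's note] The Minnesota prior mentioned in the keywords is what makes VARs with many variables estimable: coefficients are shrunk toward a random walk, more tightly at longer lags. A textbook sketch of the resulting posterior mean, estimated equation by equation, is below; the shrinkage constant, scaling, and omission of an intercept are illustrative assumptions, not the paper's exact prior (the paper compares several alternatives).

```python
import numpy as np

def minnesota_posterior_mean(Y, p=2, lam=0.2):
    """Posterior mean of VAR(p) coefficients under a simple Minnesota prior.

    Prior mean: own first lag = 1 (random walk), all else 0.
    Prior variance on lag-l coefficients: (lam / l)^2, scaled by residual
    variances for cross-variable terms. A textbook sketch, not the exact
    prior used in the paper.
    """
    T, N = Y.shape
    # Stack lagged regressors [lag 1, ..., lag p]; intercept omitted for brevity.
    X = np.hstack([Y[p - l:T - l] for l in range(1, p + 1)])
    y = Y[p:]
    s2 = np.array([np.var(np.diff(Y[:, i])) for i in range(N)])  # scale proxies
    B_post = np.empty((N * p, N))
    for i in range(N):  # one equation per dependent variable
        prior_mean = np.zeros(N * p)
        prior_mean[i] = 1.0  # own first lag centred at a random walk
        prior_var = np.concatenate(
            [(lam / l) ** 2 * s2[i] / s2 for l in range(1, p + 1)]
        )
        V0_inv = np.diag(1.0 / prior_var)
        V_post = np.linalg.inv(V0_inv + X.T @ X / s2[i])
        B_post[:, i] = V_post @ (V0_inv @ prior_mean + X.T @ y[:, i] / s2[i])
    return B_post

# Illustrative random-walk data: the own-first-lag posterior means stay near 1.
rng = np.random.default_rng(0)
Y = np.cumsum(rng.normal(size=(300, 2)) * 0.1, axis=0)
B = minnesota_posterior_mean(Y)
```

    With 168 variables, as in the paper's data set, this kind of shrinkage is what keeps the number of effective parameters manageable.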
  6. By: Gary Koop (Department of Economics, University of Strathclyde); Joshua Chan (Australian National University)
    Abstract: Macroeconomists working with multivariate models typically face uncertainty over which (if any) of their variables have long run steady states which are subject to breaks. Furthermore, the nature of the break process is often unknown. In this paper, we draw on methods from the Bayesian clustering literature to develop an econometric methodology which: i) finds groups of variables which have the same number of breaks; and ii) determines the nature of the break process within each group. We present an application involving a five-variate steady-state VAR.
    Keywords: mixtures of normals, steady state VARs, Bayesian
    JEL: C11 C24 C32
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:str:wpaper:1111&r=ets
  7. By: Gary Koop (Department of Economics, University of Strathclyde); Dimitris Korobilis (Center for Operations Research & Econometrics (CORE), Universite Catholique de Louvain)
    Abstract: We forecast quarterly US inflation based on the generalized Phillips curve using econometric methods which incorporate dynamic model averaging. These methods not only allow for coefficients to change over time, but also allow for the entire forecasting model to change over time. We find that dynamic model averaging leads to substantial forecasting improvements over simple benchmark regressions and more sophisticated approaches such as those using time varying coefficient models. We also provide evidence on which sets of predictors are relevant for forecasting in each period.
    Keywords: Bayesian, State space model, Phillips curve
    JEL: E31 E37 C11 C53
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:str:wpaper:1119&r=ets
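    [Editor's note] The core of dynamic model averaging is a recursion on model probabilities: between observations the weights are flattened with a forgetting factor, and after each observation they are updated by the models' one-step predictive likelihoods. A minimal sketch of that Raftery-style recursion follows; the forgetting factor alpha = 0.99 and the stylised likelihoods are assumptions for illustration, not the paper's settings.

```python
import numpy as np

def dma_weights(pred_lik, alpha=0.99):
    """Dynamic model averaging weights via a forgetting factor.

    pred_lik: (T, K) array of one-step predictive likelihoods for K models.
    Prediction step: raise weights to the power alpha (forgetting).
    Update step: multiply by the predictive likelihoods and renormalise.
    """
    T, K = pred_lik.shape
    w = np.full(K, 1.0 / K)  # equal initial model probabilities
    path = np.empty((T, K))
    for t in range(T):
        w = w ** alpha
        w /= w.sum()          # prediction (forgetting) step
        w = w * pred_lik[t]
        w /= w.sum()          # update step
        path[t] = w
    return path

# Stylised example: model 0 consistently fits better than model 1.
pl = np.tile([0.5, 0.1], (100, 1))
path = dma_weights(pl)
```

    The forgetting step is what lets the weights, and hence the effective forecasting model, drift over time rather than converging permanently on one specification.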
  8. By: Luc Bauwens (Université catholique de Louvain); Gary Koop (Department of Economics, University of Strathclyde); Dimitris Korobilis (Universite Catholique de Louvain); Jeroen Rombouts (HEC Montréal (École des Hautes Études Commerciales) (Business School) and Center for Operations Research and Econometrics (CORE) ECORE)
    Abstract: This paper compares the forecasting performance of different models which have been proposed for forecasting in the presence of structural breaks. These models differ in their treatment of the break process, the parameters defining the model which applies in each regime and the out-of-sample probability of a break occurring. In an extensive empirical evaluation involving many important macroeconomic time series, we demonstrate the presence of structural breaks and their importance for forecasting in the vast majority of cases. However, we find no single forecasting model consistently works best in the presence of structural breaks. In many cases, the formal modeling of the break process is important in achieving good forecast performance. However, there are also many cases where simple, rolling OLS forecasts perform well.
    Keywords: Forecasting, change-points, Markov switching, Bayesian inference.
    JEL: C11 C22 C53
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:str:wpaper:1113&r=ets
  9. By: Markus Reiß
    Abstract: The basic model for high-frequency data in finance is considered, where an efficient price process is observed under microstructure noise. It is shown that this nonparametric model is in Le Cam's sense asymptotically equivalent to a Gaussian shift experiment in terms of the square root of the volatility function σ. As an application, simple rate-optimal estimators of the volatility and efficient estimators of the integrated volatility are constructed.
    Keywords: High-frequency data, integrated volatility, spot volatility estimation, Le Cam deficiency, equivalence of experiments, Gaussian shift
    JEL: C14
    Date: 2011–05
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2011-028&r=ets
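    [Editor's note] The paper's contribution is theoretical, but the noise problem it addresses is easy to see numerically: realized variance computed at the highest sampling frequency is dominated by microstructure noise, while sparser sampling largely removes the bias. A simulation sketch, with the tick count, integrated variance and noise scale chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 23400                    # one trading day of 1-second ticks (assumed)
iv = 1e-4                    # integrated variance of the efficient price (assumed)
efficient = np.cumsum(rng.normal(0, np.sqrt(iv / n), n))
noise = rng.normal(0, 5e-4, n)   # i.i.d. microstructure noise (assumed scale)
obs = efficient + noise          # observed log price

def realized_variance(logp, every=1):
    """Sum of squared log returns, sampling every `every` ticks."""
    r = np.diff(logp[::every])
    return np.sum(r ** 2)

rv_fine = realized_variance(obs)               # inflated by roughly 2 * n * noise_var
rv_sparse = realized_variance(obs, every=300)  # much closer to the true iv
```

    The estimators constructed in the paper attain the optimal rates for exactly this observed-price-equals-efficient-price-plus-noise model.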
  10. By: Johanna Kappus; Markus Reiß
    Abstract: A Lévy process is observed at time points of distance Δ until time T. We construct an estimator of the Lévy-Khinchine characteristics of the process and derive optimal rates of convergence simultaneously in T and Δ. Thereby, we encompass the usual low- and high-frequency assumptions and obtain also asymptotics in the mid-frequency regime.
    Keywords: Jump process, Lévy measure, deconvolution problem, statistical inverse problem
    JEL: C14 C22
    Date: 2011–05
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2011-027&r=ets
  11. By: Jitka Poměnková (Department of Finance, FBE MENDELU in Brno); Roman Maršálek (Department of Radio Electronics, Faculty of Electrical Engineering and Communication, Brno University of Technology)
    Abstract: The paper identifies the cyclical behaviour of the business cycle from both time and frequency domain perspectives. Commonly used methods for obtaining the growth business cycle are investigated: first-order differencing, unobserved component models, regression curves, and filtering with the Baxter-King and Christiano-Fitzgerald band-pass filters as well as the Hodrick-Prescott high-pass filter. In the time domain analysis, identification of cycle lengths is based on dating the growth business cycle; for this purpose the right and left variants of naive techniques as well as the Bry-Boschan algorithm are applied. In the frequency domain analysis, the cyclical structure is examined through spectrum estimates obtained via the periodogram and via an autoregressive process with optimal lag. Results from the two approaches are compared, and on this basis recommendations are formulated for identifying the cyclical structure of the growth business cycle of a transition economy (the Czech Republic). The time domain analysis also includes an evaluation of how closely the de-trending techniques agree in identifying turning points. All analyses are carried out on quarterly data for gross domestic product, total industry excluding construction and gross capital formation over 1996/Q1-2008/Q4, and for final consumption expenditure over 1995/Q1-2008/Q4.
    Keywords: spectrum, business cycle, transition economy, frequency domain, time domain
    JEL: E32 C16 C5 C6
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:men:wpaper:07_2011&r=ets
  12. By: Tsyplakov, Alexander
    Abstract: This is a comment on Mitchell and Wallis (2011) which in turn is a critical reaction to Gneiting et al. (2007). The comment discusses the notion of forecast calibration, the advantage of using scoring rules, the “sharpness” principle and a general approach to testing calibration. The aim is to show how a more general and explicitly stated framework for evaluation of probabilistic forecasts can provide further insights.
    Keywords: density forecasts
    JEL: C53 C52
    Date: 2011–05–30
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:31184&r=ets
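    [Editor's note] The scoring rules discussed in the comment reward density forecasts that are both calibrated and sharp. The log score makes the tension concrete: a forecaster cannot gain by claiming more sharpness than the data support. A small sketch under the assumption of Gaussian forecast densities (an illustrative choice, not the comment's setup):

```python
import numpy as np

def log_score(y, mu, sigma):
    """Average log score of Gaussian density forecasts N(mu, sigma^2).

    Higher is better. Sharpness (small sigma) is rewarded only when the
    forecast remains calibrated.
    """
    y, mu, sigma = map(np.asarray, (y, mu, sigma))
    ll = -0.5 * np.log(2 * np.pi * sigma ** 2) - (y - mu) ** 2 / (2 * sigma ** 2)
    return ll.mean()

# Outcomes truly drawn from N(0, 1): the calibrated density wins on average.
rng = np.random.default_rng(0)
y = rng.normal(size=2000)
sc_calibrated = log_score(y, 0.0, 1.0)  # correct density
sc_sharp = log_score(y, 0.0, 0.3)       # too sharp (overconfident)
sc_diffuse = log_score(y, 0.0, 3.0)     # too diffuse
```

    This is the sense in which proper scoring rules subsume the "sharpness subject to calibration" principle that the comment examines.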
  13. By: Inoue, Atsushi; Kilian, Lutz
    Abstract: Skepticism toward traditional identifying assumptions based on exclusion restrictions has led to a surge in the use of structural VAR models in which structural shocks are identified by restricting the sign of the responses of selected macroeconomic aggregates to these shocks. Researchers commonly report the vector of pointwise posterior medians of the impulse responses as a measure of central tendency of the estimated response functions, along with pointwise 68 percent posterior error bands. It can be shown that this approach cannot be used to characterize the central tendency of the structural impulse response functions. We propose an alternative method of summarizing the evidence from sign-identified VAR models designed to enhance their practical usefulness. Our objective is to characterize the most likely admissible model(s) within the set of structural VAR models that satisfy the sign restrictions. We show how the set of most likely structural response functions can be computed from the posterior mode of the joint distribution of admissible models both in the fully identified and in the partially identified case, and we propose a highest-posterior density credible set that characterizes the joint uncertainty about this set. Our approach can also be used to resolve the long-standing problem of how to conduct joint inference on sets of structural impulse response functions in exactly identified VAR models. We illustrate the differences between our approach and the traditional approach for the analysis of the effects of monetary policy shocks and of the effects of oil demand and oil supply shocks.
    Keywords: Credible Set; Impulse responses; Median; Mode; Sign restrictions; Simultaneous inference; Vector autoregression
    JEL: C32 C52 E37 Q43
    Date: 2011–06
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:8419&r=ets
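    [Editor's note] In the class of sign-identified VARs the paper studies, the set of admissible models is typically explored by drawing random orthogonal rotations of a Cholesky factor and keeping those whose implied responses satisfy the sign restrictions. A generic sketch of that rejection step follows; the covariance matrix and the restriction (shock 1 raises both variables on impact) are hypothetical, and this is the standard sampler for the model class, not the authors' posterior-mode procedure.

```python
import numpy as np

def sign_identified_draws(Sigma, check, n_draws=1000, rng=None):
    """Rejection sampling over orthogonal rotations for sign identification.

    Candidate impact matrices are A = chol(Sigma) @ Q with Q drawn from the
    Haar (uniform) distribution via the QR decomposition of a Gaussian
    matrix; a draw is kept when check(A) reports that the implied impact
    responses satisfy the sign restrictions.
    """
    rng = rng or np.random.default_rng()
    P = np.linalg.cholesky(Sigma)
    n = Sigma.shape[0]
    kept = []
    for _ in range(n_draws):
        Q, R = np.linalg.qr(rng.normal(size=(n, n)))
        Q = Q @ np.diag(np.sign(np.diag(R)))  # sign fix for a proper Haar draw
        A = P @ Q
        if check(A):
            kept.append(A)
    return kept

Sigma = np.array([[1.0, 0.5], [0.5, 1.0]])
# Hypothetical restriction: shock 1 moves both variables up on impact.
admissible = sign_identified_draws(Sigma, lambda A: np.all(A[:, 0] > 0),
                                   rng=np.random.default_rng(1))
```

    Every kept draw reproduces the reduced-form covariance exactly, which is why sign restrictions identify a set of models rather than a point, the situation the paper's summary method is designed for.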
  14. By: John Cotter (University College Dublin)
    Abstract: Accurate volatility modelling is paramount for optimal risk management practices. One stylized feature of financial volatility that affects the modelling process is long memory, which this paper explores for alternative risk measures: observed absolute and squared returns of high frequency intraday UK futures. Volatility series for three different asset types, stock index, interest rate and bond futures, are analysed. Long memory is strongest for the bond contract. Long memory is always strongest for the absolute returns series and at a power transformation of k < 1. The long memory findings generally incorporate intraday periodicity. The APARCH model, which nests seven related GARCH processes, generally models the futures series adequately, documenting ARCH, GARCH and leverage effects.
    Keywords: Long Memory, APARCH, High Frequency Futures
    Date: 2011–05–30
    URL: http://d.repec.org/n?u=RePEc:ucd:wpaper:200414&r=ets
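    [Editor's note] Long memory of the kind tested here is often summarised by a Hurst exponent: H near 0.5 indicates short memory, H above 0.5 indicates long memory. A classical rescaled-range (R/S) sketch is below; the block sizes and data are illustrative, and this simple estimator is not the one used in the paper.

```python
import numpy as np

def hurst_rs(x):
    """Hurst exponent via a simple rescaled-range (R/S) regression.

    For each block size n, compute the range of the cumulative demeaned
    series divided by its standard deviation, then regress log mean(R/S)
    on log n; the slope estimates H.
    """
    x = np.asarray(x, float)
    logs_n, logs_rs = [], []
    for n in (16, 32, 64, 128, 256):
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            seg = x[start:start + n]
            dev = np.cumsum(seg - seg.mean())
            R = dev.max() - dev.min()
            S = seg.std()
            if S > 0:
                rs_vals.append(R / S)
        logs_n.append(np.log(n))
        logs_rs.append(np.log(np.mean(rs_vals)))
    slope, _ = np.polyfit(logs_n, logs_rs, 1)
    return slope

# White noise (short memory) versus a highly persistent AR(1) series.
rng = np.random.default_rng(7)
wn = rng.normal(size=4096)
eps = rng.normal(size=4096)
ar = np.empty(4096)
ar[0] = 0.0
for t in range(1, 4096):
    ar[t] = 0.95 * ar[t - 1] + eps[t]
h_wn = hurst_rs(wn)
h_ar = hurst_rs(ar)
```

    In practice the simple R/S statistic is biased in small samples and confounds persistence with long memory, one reason studies like this one rely on more refined estimators.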

This nep-ets issue is ©2011 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.