nep-ets New Economics Papers
on Econometric Time Series
Issue of 2012‒03‒21
fifteen papers chosen by
Yong Yin
SUNY at Buffalo

  1. A New Structural Break Model with Application to Canadian Inflation Forecasting By John M Maheu; Yong Song
  2. Large time-varying parameter VARs By Gary Koop; Dimitris Korobilis
  3. Is real GDP stationary? Evidence from a panel unit root test with cross-sectional dependence and historical data By Aslanidis, Nektarios; Fountas, Stilianos
  4. Structural Change and Spurious Persistence in Stochastic Volatility By Walter Krämer; Philip Messow
  5. The Forecasting Performance of an Estimated Medium Run Model By Tobias Kitlinski; Torsten Schmidt
  6. Hierarchical Shrinkage in Time-Varying Parameter Models By Miguel Belmonte; Gary Koop; Dimitris Korobilis
  7. A New Model of Trend Inflation By Joshua Chan; Gary Koop; Simon Potter
  8. DSGE model-based forecasting By Marco Del Negro; Frank Schorfheide
  9. Bayesian estimation of DSGE models By Pablo A. Guerrón-Quintana; James M. Nason
  10. Missing in Asynchronicity: A Kalman-EM Approach for Multivariate Realized Covariance Estimation By Corsi, Fulvio; Peluso, Stefano; Audrino, Francesco
  11. Vector Autoregressive Models By Helmut Luetkepohl
  12. Bayesian Testing of Granger Causality in Markov-Switching VARs By Matthieu Droumaguet; Tomasz Wozniak
  13. Does the Box-Cox Transformation Help in Forecasting Macroeconomic Time Series? By Tommaso Proietti; Helmut Luetkepohl
  14. Low-Frequency Waves and the Medium to Long-Term US Stock Market Outlook By Valeriy Zakamulin
  15. Posterior Predictive Analysis for Evaluating DSGE Models By Jon Faust; Abhishek Gupta

  1. By: John M Maheu; Yong Song
    Abstract: This paper develops an efficient approach to model and forecast time-series data with an unknown number of change-points. Using a conjugate prior and conditional on time-invariant parameters, the predictive density and the posterior distribution of the change-points have closed forms. The conjugate prior is further modeled as hierarchical to exploit the information across regimes. This framework allows breaks in the variance, the regression coefficients or both. Regime duration can be modeled with a Poisson distribution. A new efficient Markov Chain Monte Carlo sampler draws the parameters as one block from the posterior distribution. An application to the Canadian inflation time series shows the gains in forecasting precision that our model provides.
    Keywords: multiple change-points, regime duration, inflation targeting, predictive density, MCMC
    JEL: C5 C22 C01 C11 E37
    Date: 2012–03–13
    URL: http://d.repec.org/n?u=RePEc:tor:tecipa:tecipa-448&r=ets
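    A minimal simulation of the regime structure this abstract describes: regime durations drawn from a Poisson distribution, with the mean and variance shifting at each change-point. All numerical values are illustrative, and the sketch is not the authors' sampler.

    import numpy as np

    rng = np.random.default_rng(0)

    T = 500                # total sample length (illustrative)
    mean_dur = 100         # expected regime duration (illustrative)
    y, t = [], 0
    while t < T:
        dur = max(1, rng.poisson(mean_dur))  # Poisson regime duration
        mu = rng.normal(2.0, 1.0)            # regime-specific mean
        sigma = rng.uniform(0.5, 2.0)        # regime-specific volatility
        n = min(dur, T - t)
        y.extend(rng.normal(mu, sigma, n))   # observations for this regime
        t += n
    y = np.asarray(y)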
  2. By: Gary Koop; Dimitris Korobilis
    Abstract: In this paper we develop methods for estimation and forecasting in large time-varying parameter vector autoregressive models (TVP-VARs). To overcome computational constraints with likelihood-based estimation of large systems, we rely on Kalman filter estimation with forgetting factors. We also draw on ideas from the dynamic model averaging literature and extend the TVP-VAR so that its dimension can change over time. A final extension lies in the development of a new method for estimating, in a time-varying manner, the parameter(s) of the shrinkage priors commonly-used with large VARs. These extensions are operationalized through the use of forgetting factor methods and are, thus, computationally simple. An empirical application involving forecasting inflation, real output, and interest rates demonstrates the feasibility and usefulness of our approach.
    Keywords: Bayesian VAR; forecasting; time-varying coefficients; state-space model
    JEL: C11 C52 E27 E37
    Date: 2012–01
    URL: http://d.repec.org/n?u=RePEc:gla:glaewp:2012_04&r=ets
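    The forgetting-factor shortcut the abstract relies on is easiest to see in a single time-varying-parameter regression: dividing the predicted state covariance by a factor lam just below one replaces an explicit state innovation variance. A minimal sketch; the fixed measurement variance h is a simplifying assumption, not the paper's treatment of volatility.

    import numpy as np

    def tvp_filter(y, X, lam=0.99, h=1.0):
        """Kalman filter for y_t = X_t' beta_t + e_t with forgetting."""
        T, k = X.shape
        beta = np.zeros(k)            # filtered coefficient estimate
        P = 10.0 * np.eye(k)          # loose initial state covariance
        betas = np.empty((T, k))
        for t in range(T):
            P = P / lam               # forgetting-factor prediction step
            x = X[t]
            f = x @ P @ x + h         # one-step forecast error variance
            K = P @ x / f             # Kalman gain
            beta = beta + K * (y[t] - x @ beta)
            P = P - np.outer(K, x @ P)
            betas[t] = beta
        return betas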
  3. By: Aslanidis, Nektarios; Fountas, Stilianos
    Abstract: We use historical data covering more than one century of real GDP for industrial countries and employ the Pesaran panel unit root test, which allows for cross-sectional dependence, to test for a unit root in real GDP. We find strong evidence against the unit root null. Our results are robust to the chosen group of countries and the sample period.
    Keywords: real GDP stationarity, cross-sectional dependence, CIPS test
    JEL: C23 E32
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:urv:wpaper:2072/181404&r=ets
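    A stripped-down sketch of the cross-sectionally augmented logic behind the CIPS test used in the paper: each country's Dickey-Fuller regression is augmented with the lagged level and first difference of the cross-sectional average (to absorb a common factor), and the country-specific t-statistics are averaged. Lag augmentation and the test's nonstandard critical values are omitted.

    import numpy as np
    import statsmodels.api as sm

    def cips(Y):
        """Average of CADF t-statistics for a T x N panel Y (a sketch)."""
        T, N = Y.shape
        ybar = Y.mean(axis=1)          # cross-sectional average
        tstats = []
        for i in range(N):
            dy = np.diff(Y[:, i])
            X = sm.add_constant(np.column_stack([
                Y[:-1, i],             # lagged level, unit i
                ybar[:-1],             # lagged cross-sectional average
                np.diff(ybar),         # differenced cross-sectional average
            ]))
            tstats.append(sm.OLS(dy, X).fit().tvalues[1])
        return float(np.mean(tstats))  # compare to tabulated critical values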
  4. By: Walter Krämer; Philip Messow
    Abstract: We extend the well-established link between structural change and estimated persistence from GARCH to stochastic volatility (SV) models. Whenever structural changes in some model parameters increase the empirical autocorrelations of the squares of the underlying time series, the persistence in volatility implied by the estimated model parameters follows suit. This explains why stochastic volatility often appears more persistent when estimated from a larger sample: the longer the sample, the more likely it is that some structural change has occurred within it.
    Keywords: Persistence; stochastic volatility; structural change
    JEL: C32 C58
    Date: 2012–01
    URL: http://d.repec.org/n?u=RePEc:rwi:repape:0310&r=ets
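    The mechanism can be demonstrated in a few lines: i.i.d. returns with a single break in the variance show positive autocorrelation in their squares even though there are no true volatility dynamics, which is what drags estimated persistence upward. Sample sizes and variances below are arbitrary.

    import numpy as np

    rng = np.random.default_rng(1)

    def acf1_sq(x):
        """Lag-1 autocorrelation of the squared series."""
        s = x**2 - np.mean(x**2)
        return (s[1:] @ s[:-1]) / (s @ s)

    T = 2000
    no_break = rng.normal(0.0, 1.0, T)
    with_break = np.concatenate([rng.normal(0.0, 1.0, T // 2),
                                 rng.normal(0.0, 2.0, T // 2)])

    print(acf1_sq(no_break))    # near zero
    print(acf1_sq(with_break))  # clearly positive: spurious persistence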
  5. By: Tobias Kitlinski; Torsten Schmidt
    Abstract: In recent years DSGE models have attracted increasing attention from forecasters and have shown promising forecast performance at short horizons. We contribute to the existing literature by analyzing the forecasting power of a DSGE model with endogenous growth over the medium run. Instead of only calibrating the model, we combine calibration with Bayesian estimation. As forecasting benchmarks we take the Smets and Wouters (2007) model and a VAR model. The evaluation of the forecast errors shows that the Medium-Term model outperforms the Smets-Wouters model with respect to some key macroeconomic variables in the medium run; compared to the VAR model, the Medium-Term model's forecast performance is competitive. These results show that the forecasting ability of DSGE models extends to the medium term.
    Keywords: Bayesian analysis; DSGE model; medium run; forecasting
    JEL: C32 C52 E32 E37
    Date: 2011–12
    URL: http://d.repec.org/n?u=RePEc:rwi:repape:0301&r=ets
  6. By: Miguel Belmonte (Department of Economics, University of Strathclyde); Gary Koop (Department of Economics, University of Strathclyde); Dimitris Korobilis (Department of Economics, University of Glasgow)
    Abstract: In this paper, we forecast EU-area inflation with many predictors using time-varying parameter models. Time-varying parameter models are parameter-rich, and the time span of our data is relatively short; both facts motivate a desire for shrinkage. In constant coefficient regression models, the Bayesian Lasso is gaining increasing popularity as an effective tool for achieving such shrinkage. In this paper, we develop econometric methods for using the Bayesian Lasso with time-varying parameter models. Our approach allows the coefficient on each predictor to be: i) time-varying, ii) constant over time, or iii) shrunk to zero. The econometric methodology decides automatically which category each coefficient belongs to. Our empirical results indicate the benefits of such an approach.
    Keywords: Forecasting; hierarchical prior; time-varying parameters; Bayesian Lasso
    JEL: C11 C52 E37 E47
    Date: 2011–06
    URL: http://d.repec.org/n?u=RePEc:str:wpaper:1137&r=ets
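    The Bayesian Lasso the paper extends rests on writing the Laplace prior as a scale mixture of normals (Park and Casella, 2008): beta_j | tau_j^2 ~ N(0, tau_j^2) with tau_j^2 ~ Exponential(rate = lam^2/2) gives beta_j a Laplace marginal. A quick numerical check of that representation, with an arbitrary lam:

    import numpy as np

    rng = np.random.default_rng(2)

    lam = 2.0
    tau2 = rng.exponential(scale=2.0 / lam**2, size=100_000)  # mixing scales
    beta = rng.normal(0.0, np.sqrt(tau2))                     # mixture draws

    # The Laplace(0, 1/lam) marginal has standard deviation sqrt(2)/lam.
    print(beta.std(), np.sqrt(2.0) / lam)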
  7. By: Joshua Chan (College of Business and Economics, Australian National University); Gary Koop (Department of Economics, University of Strathclyde); Simon Potter (Federal Reserve Bank of New York)
    Abstract: This paper introduces a new model of trend (or underlying) inflation. In contrast to many earlier approaches, which allow for trend inflation to evolve according to a random walk, ours is a bounded model which ensures that trend inflation is constrained to lie in an interval. The bounds of this interval can either be fixed or estimated from the data. Our model also allows for a time-varying degree of persistence in the transitory component of inflation. The bounds placed on trend inflation mean that standard econometric methods for estimating linear Gaussian state space models cannot be used and we develop a posterior simulation algorithm for estimating the bounded trend inflation model. In an empirical exercise with CPI inflation we find the model to work well, yielding more sensible measures of trend inflation and forecasting better than popular alternatives such as the unobserved components stochastic volatility model.
    Keywords: Constrained inflation, non-linear state space model, underlying inflation, inflation targeting, inflation forecasting, Bayesian
    JEL: E31 E37 C11 C53
    Date: 2012–02
    URL: http://d.repec.org/n?u=RePEc:str:wpaper:1202&r=ets
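    The key device, a random-walk trend whose innovations are truncated so that the trend stays inside an interval, can be simulated directly. Bounds, volatility, and the starting value below are illustrative, not estimates from the paper.

    import numpy as np
    from scipy.stats import truncnorm

    rng = np.random.default_rng(3)

    def bounded_trend(T, lo=0.0, hi=5.0, sigma=0.2, tau0=2.0):
        """Random-walk trend constrained to [lo, hi] (a sketch)."""
        tau = np.empty(T)
        tau[0] = tau0
        for t in range(1, T):
            a = (lo - tau[t - 1]) / sigma   # standardized lower bound
            b = (hi - tau[t - 1]) / sigma   # standardized upper bound
            tau[t] = tau[t - 1] + truncnorm.rvs(a, b, scale=sigma,
                                                random_state=rng)
        return tau

    print(bounded_trend(10))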
  8. By: Marco Del Negro; Frank Schorfheide
    Abstract: Dynamic stochastic general equilibrium (DSGE) models use modern macroeconomic theory to explain and predict comovements of aggregate time series over the business cycle and to perform policy analysis. We explain how to use DSGE models for all three purposes (forecasting, storytelling, and policy experiments) and review their forecasting record. We also provide our own real-time assessment of the forecasting performance of the Smets and Wouters (2007) model using data up to 2011, compare it with Blue Chip and Greenbook forecasts, and show how it changes as we augment the standard set of observables with external information from surveys (nowcasts, interest rate forecasts, and expectations for long-run inflation and output growth). We explore methods of generating forecasts in the presence of a zero-lower-bound constraint on nominal interest rates and conditional on counterfactual interest rate paths. Finally, we perform a postmortem of DSGE model forecasts of the Great Recession and show that forecasts from a version of the Smets-Wouters model augmented by financial frictions, and using spreads as an observable, compare well with Blue Chip forecasts.
    Keywords: Stochastic analysis ; Equilibrium (Economics) ; Time-series analysis ; Econometric models ; Monetary policy ; Economic forecasting ; Recessions
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:fip:fednsr:554&r=ets
  9. By: Pablo A. Guerrón-Quintana; James M. Nason
    Abstract: We survey Bayesian methods for estimating dynamic stochastic general equilibrium (DSGE) models in this article. We focus on New Keynesian (NK) DSGE models because of the interest shown in this class of models by economists in academic and policy-making institutions. This interest stems from the ability of this class of DSGE models to transmit real, nominal, and fiscal and monetary policy shocks into endogenous fluctuations at business cycle frequencies. Intuition about these propagation mechanisms is developed by reviewing the structure of a canonical NKDSGE model. Estimation and evaluation of the NKDSGE model rests on being able to detrend its optimality and equilibrium conditions, to construct a linear approximation of the model, to solve for its linear approximate decision rules, and to map from this solution into a state space model to generate Kalman filter projections. The likelihood of the linear approximate NKDSGE model is based on these projections. The projections and likelihood are useful inputs into the Metropolis-Hastings Markov chain Monte Carlo simulator that we employ to produce Bayesian estimates of the NKDSGE model. We discuss an algorithm that implements this simulator. This algorithm involves choosing priors of the NKDSGE model parameters and fixing initial conditions to start the simulator. The output of the simulator is posterior estimates of two NKDSGE models, which are summarized and compared to results in the existing literature. Given the posterior distributions, the NKDSGE models are evaluated with tools that determine which is most favored by the data. We also give a short history of DSGE model estimation and point to issues at the frontier of this research.
    Keywords: Bayesian statistical decision theory ; Markov processes ; Keynesian economics
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:fip:fedpwp:12-4&r=ets
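    At its core, the Metropolis-Hastings simulator the survey discusses is a random-walk accept/reject loop over the log posterior kernel. In the DSGE application, log_post would combine the Kalman-filter likelihood of the linearized model with the priors; the generic loop below is a sketch, not the authors' implementation.

    import numpy as np

    def rw_metropolis(log_post, theta0, steps=10_000, scale=0.1, seed=0):
        """Random-walk Metropolis-Hastings over a log posterior kernel."""
        rng = np.random.default_rng(seed)
        theta = np.asarray(theta0, dtype=float)
        lp = log_post(theta)
        draws = np.empty((steps, theta.size))
        accepted = 0
        for s in range(steps):
            prop = theta + rng.normal(0.0, scale, theta.size)
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:  # MH acceptance
                theta, lp = prop, lp_prop
                accepted += 1
            draws[s] = theta
        return draws, accepted / steps

    # Toy usage: sample a bivariate standard-normal "posterior".
    draws, acc = rw_metropolis(lambda th: -0.5 * th @ th, np.zeros(2))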
  10. By: Corsi, Fulvio; Peluso, Stefano; Audrino, Francesco
    Abstract: Motivated by the need for an unbiased and positive-semidefinite estimator of multivariate realized covariance matrices, we model noisy and asynchronous ultra-high-frequency asset prices in a state-space framework with missing data. We then estimate the covariance matrix of the latent states through a Kalman smoother and Expectation Maximization (KEM) algorithm. In the expectation step, by means of the Kalman filter with missing data, we reconstruct the smoothed and synchronized series of the latent price processes. In the maximization step, we search for covariance matrices that maximize the expected likelihood obtained with the reconstructed price series. Iterating between the two EM steps, we obtain a KEM-improved covariance matrix estimate which is robust to both asynchronicity and microstructure noise, and positive-semidefinite by construction. Extensive Monte Carlo simulations show the superior performance of the KEM estimator over several alternative covariance matrix estimates introduced in the literature. The application of the KEM estimator in practice is illustrated on a 10-dimensional US stock data set.
    Keywords: High frequency data; Realized covariance matrix; Market microstructure noise; Missing data; Kalman filter; EM algorithm; Maximum likelihood
    JEL: C13 C51 C52 C58
    Date: 2012–01
    URL: http://d.repec.org/n?u=RePEc:usg:econwp:2012:02&r=ets
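    The E-step's workhorse, a Kalman filter that tolerates missing observations, can be sketched by dropping the rows of the observation equation for assets that did not trade in a given interval. System matrices are user-supplied; this is a generic sketch, not the paper's full KEM algorithm.

    import numpy as np

    def kalman_filter_missing(y, F, H, Q, R, x0, P0):
        """Kalman filter; NaNs in each observation vector are skipped."""
        x, P = x0.copy(), P0.copy()
        filtered = []
        for yt in y:
            x = F @ x                          # state prediction
            P = F @ P @ F.T + Q
            obs = ~np.isnan(yt)                # which assets were observed
            if obs.any():
                Ho = H[obs]                    # keep observed rows only
                Ro = R[np.ix_(obs, obs)]
                S = Ho @ P @ Ho.T + Ro         # innovation covariance
                K = P @ Ho.T @ np.linalg.inv(S)
                x = x + K @ (yt[obs] - Ho @ x)
                P = P - K @ Ho @ P
            filtered.append(x.copy())
        return np.array(filtered)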
  11. By: Helmut Luetkepohl
    Abstract: Multivariate simultaneous equations models were used extensively for macroeconometric analysis when Sims (1980) advocated vector autoregressive (VAR) models as alternatives. At that time longer and more frequently observed macroeconomic time series called for models which described the dynamic structure of the variables. VAR models lend themselves to this purpose. They typically treat all variables as a priori endogenous. Thereby they account for Sims’ critique that the exogeneity assumptions for some of the variables in simultaneous equations models are ad hoc and often not backed by fully developed theories. Restrictions, including exogeneity of some of the variables, may be imposed on VAR models based on statistical procedures. VAR models are natural tools for forecasting. Their setup is such that current values of a set of variables are partly explained by past values of the variables involved. They can also be used for economic analysis, however, because they describe the joint generation mechanism of the variables involved. Structural VAR analysis attempts to investigate structural economic hypotheses with the help of VAR models. Impulse response analysis, forecast error variance decompositions, historical decompositions and the analysis of forecast scenarios are the tools which have been proposed for disentangling the relations between the variables in a VAR model. Traditionally VAR models are designed for stationary variables without time trends. Trending behavior can be captured by including deterministic polynomial terms. In the 1980s the discovery of the importance of stochastic trends in economic variables and the development of the concept of cointegration by Granger (1981), Engle and Granger (1987), Johansen (1995) and others have shown that stochastic trends can also be captured by VAR models. If there are trends in some of the variables it may be desirable to separate the long-run relations from the short-run dynamics of the generation process of a set of variables. Vector error correction models offer a convenient framework for separating long-run and short-run components of the data generation process (DGP). In the present chapter levels VAR models are considered in which cointegration relations are not modelled explicitly, although they may be present. Specific issues related to trending variables will be mentioned occasionally throughout the chapter. The advantage of levels VAR models over vector error correction models is that they can also be used when the cointegration structure is unknown. Cointegration analysis and error correction models are discussed specifically in the next chapter.
    JEL: C32
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2011/30&r=ets
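    The chapter's basic toolkit (estimation, forecasting, impulse responses, forecast error variance decompositions) is available in statsmodels. A sketch on simulated data; the lag order and horizons are illustrative choices, and real macroeconomic series would replace the random walks.

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.api import VAR

    rng = np.random.default_rng(4)
    data = pd.DataFrame(rng.normal(size=(200, 3)).cumsum(axis=0),
                        columns=["y1", "y2", "y3"])

    res = VAR(data).fit(maxlags=4, ic="aic")             # lag order by AIC
    fc = res.forecast(data.values[-res.k_ar:], steps=8)  # point forecasts
    irf = res.irf(10)                                    # impulse responses
    fevd = res.fevd(10)                                  # variance decompositions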
  12. By: Matthieu Droumaguet; Tomasz Wozniak
    Abstract: Recent economic developments have shown the importance of spillover and contagion effects in financial markets as well as in macroeconomic reality. Such effects are not limited to relations between the levels of variables but also affect their volatilities and distributions. We propose a method of testing restrictions for Granger noncausality on all these levels in the framework of Markov-switching Vector Autoregressive Models. The conditions for Granger noncausality for these models were derived by Warne (2000). Due to the nonlinearity of the restrictions, classical tests have limited use. We therefore choose a Bayesian approach to testing. The inference consists of a novel Gibbs sampling algorithm for estimating the restricted models and of standard methods for computing the Posterior Odds Ratio. The analysis may be applied to financial and macroeconomic time series with complicated properties, such as changes of parameter values over time and heteroskedasticity.
    Keywords: Granger Causality; Markov Switching Models; Hypothesis Testing; Posterior Odds Ratio; Gibbs Sampling
    JEL: C11 C12 C32 C53 E32
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2012/06&r=ets
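    The data-generating processes at issue, autoregressions whose parameters switch with a hidden Markov chain, are simple to simulate. A univariate two-regime sketch with illustrative parameter values (the paper's estimation and testing machinery is not shown):

    import numpy as np

    rng = np.random.default_rng(5)

    P = np.array([[0.95, 0.05],          # regime transition probabilities
                  [0.10, 0.90]])
    mu, phi, sig = [0.0, 1.0], [0.3, 0.7], [1.0, 2.0]

    T, s, y = 500, 0, [0.0]
    for t in range(1, T):
        s = rng.choice(2, p=P[s])        # draw the next hidden regime
        y.append(mu[s] + phi[s] * y[-1] + sig[s] * rng.normal())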
  13. By: Tommaso Proietti; Helmut Luetkepohl
    Abstract: The paper investigates whether transforming a time series leads to an improvement in forecasting accuracy. The class of transformations that is considered is the Box-Cox power transformation, which applies to series measured on a ratio scale. We propose a nonparametric approach for estimating the optimal transformation parameter based on the frequency domain estimation of the prediction error variance, and also conduct an extensive recursive forecast experiment on a large set of seasonal monthly macroeconomic time series related to industrial production and retail turnover. In about one fifth of the series considered the Box-Cox transformation produces forecasts significantly better than the untransformed data at the one-step-ahead horizon; in most of the cases the logarithmic transformation is the relevant one. As the forecast horizon increases, the evidence in favour of a transformation becomes less strong. Typically, the naïve predictor that just reverses the transformation leads to a lower mean square error than the optimal predictor at short forecast leads. We also discuss whether the preliminary in-sample frequency domain assessment provides reliable guidance as to which series should be transformed in order to significantly improve predictive performance.
    Keywords: Forecast comparisons; multi-step forecasting; rolling forecasts; nonparametric estimation of prediction error variance
    JEL: C22
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2011/29&r=ets
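    The naive-versus-optimal back-transformation the abstract compares is easiest to see in the logarithmic case (lambda = 0): the naive predictor simply inverts the transform, while the optimal conditional-mean predictor under normal forecast errors adds half the forecast error variance. Toy numbers below:

    import numpy as np

    def boxcox(y, lam):
        """Box-Cox power transform for a positive series."""
        return np.log(y) if lam == 0 else (y**lam - 1.0) / lam

    m, s2 = 2.0, 0.25              # forecast mean and variance in logs (toy)
    naive = np.exp(m)              # just reverses the transformation
    optimal = np.exp(m + s2 / 2)   # mean-unbiased lognormal predictor
    print(naive, optimal)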
  14. By: Valeriy Zakamulin
    Abstract: In this paper we provide compelling evidence of cyclical mean reversion and multi-period stock return predictability over horizons of about 30 years with a half-life of about 15 years. This implies that the US stock market follows a long-term rhythm in which a period of above-average returns tends to be followed by a period of below-average returns. We demonstrate that this long-term stock market rhythm moves in lockstep with corresponding long-term economic, social, and political rhythms in the US. Assuming that the past relationship between these rhythms will hold unaltered in the future, we provide a medium- to long-term stock market outlook.
    Date: 2012–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1203.2250&r=ets
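    The quoted half-life maps one-to-one into an AR(1) persistence parameter: a deviation has decayed by half at the horizon h for which rho**h = 0.5. A two-line check of the abstract's 15-year figure at annual frequency:

    import numpy as np

    half_life = lambda rho: np.log(0.5) / np.log(rho)
    rho = 0.5 ** (1.0 / 15.0)      # ~0.955 implies a 15-year half-life
    print(rho, half_life(rho))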
  15. By: Jon Faust; Abhishek Gupta
    Abstract: While dynamic stochastic general equilibrium (DSGE) models for monetary policy analysis have come a long way, there is considerable difference of opinion over the role these models should play in the policy process. The paper develops three main points about assessing the value of these models. First, we document that DSGE models continue to have aspects of crude approximation and omission. This motivates the need for tools to reveal the strengths and weaknesses of the models, both to direct development efforts and to inform how best to use the current flawed models. Second, posterior predictive analysis provides a useful and economical tool for finding and communicating strengths and weaknesses. In particular, we adapt a form of discrepancy analysis as proposed by Gelman et al. (1996). Third, we provide a nonstandard defense of posterior predictive analysis in the DSGE context against long-standing objections. We use the iconic Smets-Wouters model for illustrative purposes, showing a number of heretofore unrecognized properties that may be important from a policymaking perspective.
    JEL: C52 E1 E32 E37
    Date: 2012–03
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:17906&r=ets
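    Posterior predictive analysis in the Gelman et al. (1996) spirit reduces to: draw parameters from the posterior, simulate replicated data, and compare a discrepancy statistic on replicated versus observed data. A generic sketch; simulate and discrepancy are user-supplied stand-ins for the DSGE model's simulator and the chosen data feature.

    import numpy as np

    def ppp_value(posterior_draws, simulate, discrepancy, y_obs, seed=0):
        """Share of replicated data sets whose discrepancy exceeds the
        observed one -- a posterior predictive p-value."""
        rng = np.random.default_rng(seed)
        d_obs = discrepancy(y_obs)
        exceed = 0
        for theta in posterior_draws:
            if discrepancy(simulate(theta, rng)) >= d_obs:
                exceed += 1
        return exceed / len(posterior_draws)

    # Toy usage: does an N(theta, 1) model reproduce the sample variance?
    g = np.random.default_rng(1)
    y = g.normal(0.0, 1.5, 200)                  # "observed" data
    post = g.normal(y.mean(), 0.1, 500)          # stand-in posterior draws
    p = ppp_value(post, lambda th, r: r.normal(th, 1.0, 200), np.var, y)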

This nep-ets issue is ©2012 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.