nep-for New Economics Papers
on Forecasting
Issue of 2011‒04‒09
six papers chosen by
Rob J Hyndman
Monash University

  1. Forecasting Performance of Alternative Error Correction Models By Iqbal, Javed
  2. Forecasting Long-Term Interest Rates with a Dynamic General Equilibrium Model of the Euro Area: The Role of the Feedback By Paolo Zagaglia
  3. Mixed Frequency Forecasts for Chinese GDP By Philipp Maier
  4. Has the Basel II Accord Encouraged Risk Management During the 2008-09 Financial Crisis? By Michael McAleer; Juan-Ángel Jiménez-Martín; Teodosio Pérez-Amaral
  5. Varying the VaR for Unconditional and Conditional Environments By John Cotter
  6. Realised and Optimal Monetary Policy Rules in an Estimated Markov-Switching DSGE Model of the United Kingdom By Xiaoshan Chen; Ronald MacDonald

  1. By: Iqbal, Javed
    Abstract: It is well established that regression analysis on non-stationary time series data may yield spurious results. An earlier response to this problem was to run the regression with the first differences of the variables, but this transformation destroys any long-run information embodied in the levels of the variables. According to the ‘Granger Representation Theorem’ (Engle and Granger, 1987), if variables are co-integrated there exists an error correction mechanism which incorporates long-run information in modeling changes in the variables. This mechanism employs the lagged value of the disequilibrium error as an additional variable in modeling changes in the variables. It has been argued that the ECM performs better for long-run forecasts than a simple first-difference or level regression. This paper contributes to the literature in two important ways. Firstly, empirical evidence does not exist on the relative merits of ECMs arrived at using alternative co-integration techniques. The three popular co-integration procedures considered are the Engle-Granger (1987) two-step procedure, the Johansen (1988) multivariate system-based technique and the more recently developed Autoregressive Distributed Lag based technique of Pesaran et al. (1996, 2001). Secondly, earlier studies on the forecasting performance of the ECM employed macroeconomic data on developed economies, i.e. the US and the UK. By employing data from Asian countries and using the absolute version of purchasing power parity and a money demand function, this paper compares the forecast accuracy of the three alternative error correction models in forecasting the nominal exchange rate and a monetary aggregate (M2).
    Keywords: Co-integration; Error Correction Models; Forecasting
    JEL: C32 C53
    Date: 2011–03–19
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:29826&r=for
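The Engle-Granger two-step procedure compared in the paper can be sketched as follows. This is a minimal illustration on simulated cointegrated data, not the author's code or data; variable names are hypothetical.

```python
# Engle-Granger two-step ECM on simulated cointegrated series (illustrative).
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = np.cumsum(rng.normal(size=n))          # I(1) regressor (random walk)
y = 2.0 + 0.5 * x + rng.normal(size=n)     # cointegrated with x

# Step 1: level regression y_t = a + b*x_t + u_t; the residuals u_t are the
# disequilibrium error.
X1 = np.column_stack([np.ones(n), x])
a, b = np.linalg.lstsq(X1, y, rcond=None)[0]
u = y - a - b * x

# Step 2: ECM in first differences with the lagged error as a regressor:
# dy_t = c + g*dx_t + lam*u_{t-1} + e_t
dy, dx, u_lag = np.diff(y), np.diff(x), u[:-1]
X2 = np.column_stack([np.ones(n - 1), dx, u_lag])
c, g, lam = np.linalg.lstsq(X2, dy, rcond=None)[0]
# lam is the error-correction coefficient; it should be negative, pulling
# y back toward the long-run relation after a disequilibrium.
```

Forecasts from the ECM then combine the short-run difference dynamics with this correction toward the long-run equilibrium, which is the long-run information a pure first-difference regression discards.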
  2. By: Paolo Zagaglia (Department of Economics, University of Bologna)
    Abstract: This paper studies the forecasting performance of the general equilibrium model of bond yields of Marzo, Söderström and Zagaglia (2008), where long-term interest rates are an integral part of the monetary transmission mechanism. The model is estimated with Bayesian methods on Euro area data. I compare the out-of-sample predictive performance of the model against a variety of competing specifications, including that of De Graeve, Emiris and Wouters (2009). Forecast accuracy is evaluated through both univariate and multivariate measures. I also assess the statistical significance of the forecast differences using the tests of Diebold and Mariano (1995), Hansen (2005) and White (1980). I show that taking into account the impact of the term structure of interest rates on the macroeconomy generates superior out-of-sample forecasts for real variables such as output, for inflation, and for bond yields.
    Keywords: Yield curve, general equilibrium models, Bayesian estimation, forecasting
    JEL: E43 E44 E52
    Date: 2011–03
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:19_11&r=for
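The Diebold-Mariano (1995) test cited here works on the loss differential between two competing forecasts. A minimal sketch on simulated forecast errors (squared-error loss, and without the HAC variance correction the full test uses at longer horizons; not the paper's data):

```python
# Simple Diebold-Mariano statistic for equal forecast accuracy (illustrative).
import numpy as np

rng = np.random.default_rng(1)
T = 200
e1 = rng.normal(scale=1.0, size=T)   # forecast errors of model 1
e2 = rng.normal(scale=1.5, size=T)   # forecast errors of model 2 (worse)

d = e1**2 - e2**2                    # loss differential under squared loss
dm = d.mean() / np.sqrt(d.var(ddof=1) / T)
# |dm| > 1.96 rejects equal accuracy at the 5% level; a negative dm here
# indicates model 1 has the smaller average loss.
```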
  3. By: Philipp Maier
    Abstract: We evaluate different approaches for using monthly indicators to predict Chinese GDP for the current and the next quarter (‘nowcasts’ and ‘forecasts’, respectively). We use three types of mixed-frequency models: one based on an economic activity indicator (Liu et al., 2007), one based on averaging over indicator models (Stock and Watson, 2004), and a static factor model (Stock and Watson, 2002). Evaluating all models’ out-of-sample projections, we find that all the approaches can yield considerable improvements over naïve AR benchmarks. We also analyze pooling across forecasting methodologies. We find that the most accurate nowcast is given by a combination of a factor model and an indicator model. The most accurate forecast is given by a factor model. Overall, we conclude that these models, or combinations of these models, can yield improvements in terms of RMSEs of up to 60 per cent over simple AR benchmarks.
    Keywords: Econometric and statistical methods; International topics
    JEL: C50 C53 E37 E47
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:bca:bocawp:11-11&r=for
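The pooling idea evaluated above can be sketched with an equal-weight combination of two imperfect forecasts, benchmarked against an AR(1). All series below are simulated and the weights are the simplest possible choice; this is not the paper's methodology in detail.

```python
# Equal-weight forecast pooling versus an AR(1) benchmark (illustrative).
import numpy as np

rng = np.random.default_rng(2)
T = 300
y = np.zeros(T)
for t in range(1, T):                # simulate an AR(1) target series
    y[t] = 0.5 * y[t - 1] + rng.normal()

truth = y[1:]
f_ar = 0.5 * y[:-1]                               # AR(1) benchmark forecast
f_a = truth + rng.normal(scale=0.6, size=T - 1)   # model A forecast
f_b = truth + rng.normal(scale=0.8, size=T - 1)   # model B forecast
f_pool = 0.5 * (f_a + f_b)                        # equal-weight combination

rmse = lambda f: np.sqrt(np.mean((truth - f) ** 2))
# With independent errors, the pooled RMSE falls below both individual
# models and well below the AR benchmark.
```

The variance-reduction logic is the same one that makes the combined factor/indicator nowcast the most accurate in the paper: averaging forecasts with imperfectly correlated errors shrinks the error variance.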
  4. By: Michael McAleer (Erasmus University Rotterdam, Tinbergen Institute, The Netherlands, and Institute of Economic Research, Kyoto University); Juan-Ángel Jiménez-Martín (Department of Quantitative Economics, Complutense University of Madrid); Teodosio Pérez-Amaral (Department of Quantitative Economics, Complutense University of Madrid)
    Abstract: The Basel II Accord requires that banks and other Authorized Deposit-taking Institutions (ADIs) communicate their daily risk forecasts to the appropriate monetary authorities at the beginning of each trading day, using one or more risk models to measure Value-at-Risk (VaR). The risk estimates of these models are used to determine capital requirements and associated capital costs of ADIs, depending in part on the number of previous violations, whereby realised losses exceed the estimated VaR. In this paper we define risk management in terms of choosing sensibly from a variety of risk models, and discuss the selection of optimal risk models. A new approach to model selection for predicting VaR is proposed, consisting of combining alternative risk models, and comparing conservative and aggressive strategies for choosing between VaR models. We then examine how different risk management strategies performed during the 2008-09 financial crisis. These issues are illustrated using the Standard and Poor's 500 Index, with an emphasis on how market risk management practices were encouraged by the Basel II Accord regulations during the financial crisis.
    Keywords: Value-at-Risk (VaR), daily capital charges, exogenous and endogenous violations, violation penalties, optimizing strategy, risk forecasts, aggressive or conservative risk management strategies, Basel II Accord, global financial crisis.
    JEL: G32 G11 C53 C22
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:kyo:wpaper:767&r=for
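The violation-penalty mechanism at the heart of the paper can be sketched with a simple backtest: count the days on which the realised loss exceeds the reported VaR over a 250-day window, then map the count to the penalty multiplier of the Basel "traffic light" framework. The returns below are simulated and the exercise is purely illustrative.

```python
# Counting VaR violations and applying the Basel penalty multiplier
# (green/yellow/red zones; illustrative, simulated returns).
import numpy as np

rng = np.random.default_rng(3)
returns = rng.normal(scale=0.01, size=250)   # daily returns, 1% volatility
var_99 = -2.326 * 0.01                       # 99% VaR under normality

violations = int(np.sum(returns < var_99))   # days the loss exceeded VaR

# Basel backtesting add-ons: green zone (0-4) adds nothing, yellow zone
# (5-9) adds an increasing amount, red zone (10+) adds the maximum 1.0.
plus = {0: 0, 1: 0, 2: 0, 3: 0, 4: 0,
        5: 0.40, 6: 0.50, 7: 0.65, 8: 0.75, 9: 0.85}
k = 3.0 + plus.get(violations, 1.0)          # capital multiplier in [3, 4]
```

A conservative model forecasts a larger VaR and so carries higher daily capital charges but fewer violations; an aggressive model does the opposite. That trade-off is exactly what the paper's model-selection strategies navigate.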
  5. By: John Cotter
    Abstract: Accurate forecasting of risk is the key to successful risk management. Using the largest stock index futures from twelve European bourses, this paper presents VaR measures based on their unconditional and conditional distributions for single- and multi-period settings. These measures, underpinned by extreme value theory, are statistically robust, explicitly allowing for fat-tailed densities. Conditional tail estimates are obtained by adjusting the unconditional extreme value procedure with GARCH-filtered returns. The conditional modelling results in iid returns, allowing the use of a simple and efficient multi-period extreme value scaling law. The paper examines the properties of these distinct conditional and unconditional trading models. It finds that the biases inherent in unconditional single- and multi-period estimates assuming normality extend to the conditional setting.
    Date: 2011–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1103.5649&r=for
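The extreme value scaling law mentioned above scales a one-period VaR to an h-period VaR by h^(1/alpha), where alpha is the tail index, rather than the h^(1/2) square-root-of-time rule implied by normality. A minimal sketch using a Hill estimate of alpha on simulated heavy-tailed returns (the GARCH pre-filtering step is omitted, and the data are not the paper's):

```python
# Hill tail-index estimate and the alpha-root VaR scaling law (illustrative).
import numpy as np

rng = np.random.default_rng(4)
losses = -rng.standard_t(df=4, size=5000)    # heavy-tailed simulated losses

# Hill estimator on the k largest losses
k = 250
order = np.sort(losses)[::-1]                # losses in decreasing order
alpha = 1.0 / np.mean(np.log(order[:k] / order[k]))

var_1 = np.quantile(losses, 0.99)            # one-day 99% VaR
h = 10
var_h_evt = h ** (1.0 / alpha) * var_1       # fat-tailed scaling
var_h_normal = h ** 0.5 * var_1              # square-root-of-time scaling
# For alpha > 2 the EVT-scaled multi-period VaR grows more slowly with the
# horizon than the normal square-root rule.
```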
  6. By: Xiaoshan Chen; Ronald MacDonald
    Abstract: This paper conducts a systematic investigation of parameter instability in a small open economy DSGE model of the UK economy over the past thirty-five years. Using Bayesian analysis, we find that a number of Markov-switching versions of the model provide a better fit for the UK data than a model with time-invariant parameters. The Markov-switching DSGE model that has two independent Markov chains - one governing the shifts in UK monetary policy and nominal price rigidity and one governing the standard deviations of shocks - is selected as the best-fitting model. The preferred model is then used to evaluate and design monetary policy. For the latter, we use the Markov-Jump-Linear-Quadratic (MJLQ) model, as it incorporates abrupt changes in structural parameters into the derivation of optimal and arbitrary policy rules. It also reveals the entire forecasting distribution of the targeted variables. To our knowledge, this is the first paper that attempts to evaluate and design UK monetary policy based on an estimated open economy Markov-switching DSGE model.
    Keywords: DSGE models; Markov-switching; Bayesian analysis
    JEL: C11 C32 C51 C52
    Date: 2011–01
    URL: http://d.repec.org/n?u=RePEc:gla:glaewp:2011_04&r=for
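The Markov-switching structure in the model above can be sketched as a parameter that jumps between regimes according to a hidden two-state Markov chain. The transition probabilities and regime-specific coefficient values below are hypothetical, chosen only to illustrate the mechanism, not the paper's estimates.

```python
# Simulating a two-state Markov-switching parameter, e.g. a policy response
# coefficient that shifts between regimes (illustrative values).
import numpy as np

rng = np.random.default_rng(5)
P = np.array([[0.95, 0.05],    # P[i, j] = Pr(next state = j | state = i)
              [0.10, 0.90]])
phi = np.array([1.5, 0.8])     # regime-specific response coefficients

T, s = 1000, 0
states = np.empty(T, dtype=int)
for t in range(T):
    states[t] = s
    s = rng.choice(2, p=P[s])  # draw next regime from the transition row

coeff_path = phi[states]       # time-varying coefficient along the sample
```

Estimation then treats the regime sequence as latent and infers it jointly with the structural parameters, which is what makes the Markov-switching DSGE fit persistent shifts in policy and volatility better than a time-invariant model.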

This nep-for issue is ©2011 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.