nep-for New Economics Papers
on Forecasting
Issue of 2013‒10‒25
ten papers chosen by
Rob J Hyndman
Monash University

  1. Forecast combination for U.S. recessions with real-time data By Pauwels, Laurent; Vasnev, Andrey
  2. Practical considerations for optimal weights in density forecast combination By Pauwels, Laurent; Vasnev, Andrey
  3. Anchoring the yield curve using survey expectations By Carlo Altavilla; Raffaella Giacomini; Giuseppe Ragusa
  4. How to Identify and Forecast Bull and Bear Markets? By Kole, H.J.W.G.; Dijk, D.J.C. van
  5. Forecasting growth during the Great Recession: is financial volatility the missing ingredient? By Ferrara, L.; Marsilli, C.; Ortega, J-P.
  6. Can social microblogging be used to forecast intraday exchange rates? By Panagiotis Papaioannou; Lucia Russo; George Papaioannou; Constantinos Siettos
  7. Equity Premia Predictability in the EuroZone By Nuno Silva
  8. Practical use of sensitivity in econometrics with an illustration to forecast combinations By Magnus, Jan R; Vasnev, Andrey
  9. Estimating Taylor Rules for Switzerland: Evidence from 2000 to 2012 By Nikolay Markov; Thomas Nitschka
  10. Trendspotting in Asset Markets By Committee, Nobel Prize

  1. By: Pauwels, Laurent; Vasnev, Andrey
    Abstract: This paper proposes the use of forecast combination to improve predictive accuracy in forecasting the U.S. business cycle index as published by the Business Cycle Dating Committee of the NBER. It focuses on one-step-ahead, out-of-sample monthly forecasts, utilising the well-established coincident indicators and yield curve models and allowing for dynamics and real-time data revisions. Forecast combinations use log-score and quadratic-score based weights, which change over time. This paper finds that forecast accuracy improves when combining the probability forecasts of both the coincident indicators model and the yield curve model, compared with each model's own forecasting performance.
    Date: 2013–01
    URL: http://d.repec.org/n?u=RePEc:syb:wpbsba:2123/8933&r=for
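    The log-score weighting idea in this abstract can be sketched in a few lines. The following Python snippet (toy data and a simplified recursive weighting scheme; not the authors' implementation) combines two recession-probability forecasts using weights built from their past log scores:

    # Minimal sketch: combine two recession-probability forecasts with
    # log-score based weights updated each period (illustrative only).
    import numpy as np

    def log_score(p, y, eps=1e-12):
        """Log predictive score of probability forecasts p for binary outcomes y."""
        p = np.clip(p, eps, 1 - eps)
        return y * np.log(p) + (1 - y) * np.log(1 - p)

    def combine_logscore(p1, p2, y, window=60):
        """Combine two probability forecasts; weights at t use scores up to t-1."""
        n = len(y)
        combined = np.full(n, np.nan)
        for t in range(1, n):
            lo = max(0, t - window)
            s1 = log_score(p1[lo:t], y[lo:t]).sum()
            s2 = log_score(p2[lo:t], y[lo:t]).sum()
            w1 = np.exp(s1 - max(s1, s2))          # subtract max to avoid overflow
            w2 = np.exp(s2 - max(s1, s2))
            combined[t] = (w1 * p1[t] + w2 * p2[t]) / (w1 + w2)
        return combined

    # toy example: a "coincident indicators" style and a "yield curve" style forecast
    rng = np.random.default_rng(0)
    y = (rng.random(120) < 0.15).astype(float)      # NBER-style recession dummy
    p1 = np.clip(0.15 + 0.50 * y + rng.normal(0, 0.10, 120), 0, 1)
    p2 = np.clip(0.15 + 0.30 * y + rng.normal(0, 0.20, 120), 0, 1)
    pc = combine_logscore(p1, p2, y)
    print(np.nanmean(log_score(pc, y)))             # average log score of the combination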
  2. By: Pauwels, Laurent; Vasnev, Andrey
    Abstract: The problem of finding appropriate weights to combine several density forecasts is an important issue currently debated in the forecast combination literature. Recently, a paper by Hall and Mitchell (IJF, 2007) proposed combining density forecasts with optimal weights obtained from solving an optimization problem. This paper studies the properties of this optimization problem when the number of forecasting periods is relatively small and finds that it often produces corner solutions by allocating all the weight to one density forecast only. This paper's practical recommendation is to have an additional training sample period for the optimal weights. While reserving a portion of the data for parameter estimation and making pseudo-out-of-sample forecasts are common practices in the empirical literature, employing a separate training sample for the optimal weights is novel, and it is suggested because it decreases the chances of corner solutions. Alternative log-score or quadratic-score weighting schemes do not have this training sample requirement.
    Date: 2013–01
    URL: http://d.repec.org/n?u=RePEc:syb:wpbsba:2123/8932&r=for
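    As a quick illustration of why a small number of forecast periods can push the optimal weights to a corner, the Python sketch below (toy normal densities; a simplified reading of the Hall and Mitchell setup) maximises the average log score of the combined density over the weight simplex:

    # Minimal sketch of Hall-Mitchell style "optimal" density combination weights
    # (illustrative; the paper's exact setup may differ).
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    rng = np.random.default_rng(1)
    T = 20                                   # few forecast periods: corner solutions likely
    y = rng.normal(0, 1, T)                  # realized values
    # two competing density forecasts, each Gaussian with its own mean/variance
    dens = np.column_stack([norm.pdf(y, loc=0.0, scale=1.0),
                            norm.pdf(y, loc=0.2, scale=1.2)])

    def neg_avg_logscore(w):
        return -np.mean(np.log(dens @ w + 1e-300))

    res = minimize(neg_avg_logscore, x0=[0.5, 0.5], method="SLSQP",
                   bounds=[(0, 1), (0, 1)],
                   constraints={"type": "eq", "fun": lambda w: w.sum() - 1})
    print(res.x)   # with small T the optimum often sits at a corner, e.g. close to [1, 0]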
  3. By: Carlo Altavilla; Raffaella Giacomini (Institute for Fiscal Studies and UCL); Giuseppe Ragusa
    Abstract: The dynamic behavior of the term structure of interest rates is difficult to replicate with models, and even models with a proven track record of empirical performance have underperformed since the early 2000s. On the other hand, survey expectations are accurate predictors of yields, but only for very short maturities. We argue that this is partly due to the ability of survey participants to incorporate information about the current state of the economy as well as forward-looking information such as that contained in monetary policy announcements. We show how the informational advantage of survey expectations about short yields can be exploited to improve the accuracy of yield curve forecasts given by a base model. We do so by employing a flexible projection method that anchors the model forecasts to the survey expectations in segments of the yield curve where the informational advantage exists and transmits the superior forecasting ability to all remaining yields. The method implicitly incorporates into yield curve forecasts any information that survey participants have access to, without the need to explicitly model it. We document that anchoring delivers large and significant gains in forecast accuracy for the whole yield curve, with improvements of up to 52% over the years 2000-2012 relative to the class of models that are widely adopted by financial and policy institutions for forecasting the term structure of interest rates.
    Date: 2013–10
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:52/13&r=for
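    The anchoring idea can be conveyed with a simple Gaussian conditional projection: fix the surveyed short-maturity yield at its survey value and let the forecast-error covariance transmit the adjustment to the remaining maturities. The paper's flexible projection method is more general, so the Python sketch below (made-up numbers) is only meant to show the mechanics:

    # Illustrative sketch: anchor a base model's yield-curve forecast to a survey
    # expectation for the short maturity via a Gaussian conditional projection.
    import numpy as np

    mu = np.array([1.0, 1.5, 2.0, 2.5])          # model forecast: 3m, 1y, 5y, 10y yields
    Sigma = np.array([[0.10, 0.08, 0.05, 0.03],
                      [0.08, 0.12, 0.07, 0.05],
                      [0.05, 0.07, 0.15, 0.10],
                      [0.03, 0.05, 0.10, 0.20]])  # model forecast-error covariance
    survey = np.array([0.8])                      # survey expectation for the 3m yield
    idx_s, idx_r = [0], [1, 2, 3]                 # surveyed vs remaining maturities

    # conditional mean of the remaining yields given the short yield equals the survey value
    S_rs = Sigma[np.ix_(idx_r, idx_s)]
    S_ss = Sigma[np.ix_(idx_s, idx_s)]
    anchored = mu[idx_r] + S_rs @ np.linalg.solve(S_ss, survey - mu[idx_s])

    print("anchored forecast for 1y/5y/10y:", anchored)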
  4. By: Kole, H.J.W.G.; Dijk, D.J.C. van
    Abstract: The state of the equity market, often referred to as a bull or a bear market, is of key importance for financial decisions and economic analyses. Its latent nature has led to several methods to identify past and current states of the market and to forecast future states. These methods encompass semi-parametric rule-based methods and parametric regime-switching models. We compare these methods using new statistical and economic measures that take into account the latent nature of the market state. The statistical measure is based directly on the predictions, while the economic measure is based on the utility that results when a risk-averse agent uses the predictions in an investment decision. Our application of this framework to the S&P500 shows that rule-based methods are preferable for (in-sample) identification of the market state, but regime-switching models for (out-of-sample) forecasting. In-sample, only the direction of the market matters, but for forecasting, both the means and volatilities of returns are important. Both the statistical and the economic measures indicate that these differences are significant.
    Keywords: forecast evaluation; regime switching; stock market; economic comparison
    Date: 2013–10–14
    URL: http://d.repec.org/n?u=RePEc:dgr:eureri:1765041558&r=for
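    For readers unfamiliar with rule-based identification, a common scheme dates bull and bear markets from 20% swings around running peaks and troughs. The Python sketch below illustrates that idea on simulated prices (a simple filter rule in the spirit of Lunde and Timmermann; the paper compares several rules and Markov-switching models, so this is not its exact algorithm):

    # Minimal sketch of a rule-based bull/bear identification: switch to a bear
    # state after a 20% fall from the last peak and back to a bull state after
    # a 20% rise from the last trough (illustrative only).
    import numpy as np

    def identify_states(prices, threshold=0.20):
        states = np.empty(len(prices), dtype=int)   # 1 = bull, 0 = bear
        state, extreme = 1, prices[0]               # start in a bull state at the first price
        for t, p in enumerate(prices):
            if state == 1:
                extreme = max(extreme, p)           # running peak
                if p <= extreme * (1 - threshold):
                    state, extreme = 0, p
            else:
                extreme = min(extreme, p)           # running trough
                if p >= extreme * (1 + threshold):
                    state, extreme = 1, p
            states[t] = state
        return states

    rng = np.random.default_rng(2)
    prices = 100 * np.exp(np.cumsum(rng.normal(0.0003, 0.01, 2500)))  # toy index path
    print(identify_states(prices).mean())   # share of days labelled as bull market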
  5. By: Ferrara, L.; Marsilli, C.; Ortega, J-P.
    Abstract: The Great Recession endured by the main industrialized countries during the period 2008–2009, in the wake of the financial and banking crisis, has highlighted the major role of the financial sector in macroeconomic fluctuations. In this respect, many researchers have started to reconsider the linkages between the financial sector and the macroeconomy. In this paper, we evaluate the leading role of the daily volatility of two major financial variables, namely commodity and stock prices, and their ability to anticipate output growth. For this purpose, we propose an extended MIDAS (Mixed Data Sampling) model that allows forecasting the quarterly output growth rate using exogenous variables sampled at various higher frequencies. Empirical results for three industrialized countries (US, France, and UK) show that mixing daily financial volatilities and monthly industrial production is useful when predicting gross domestic product growth over the Great Recession period.
    Keywords: Great Recession, GDP Forecasting, Financial variables, MIDAS approach, Volatility.
    JEL: C53 E17
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:bfr:banfra:454&r=for
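    The core of a MIDAS regression is a parsimonious weighting of the high-frequency regressor. The Python sketch below (toy data, exponential Almon weights, grid search over the lag parameter) shows the basic mechanics; the paper's extended MIDAS model with financial volatility is richer than this:

    # Minimal sketch of a MIDAS regression of quarterly growth on daily volatility,
    # using an exponential Almon lag polynomial (illustrative only).
    import numpy as np

    def almon_weights(theta, n_lags):
        k = np.arange(n_lags)
        w = np.exp(theta * k)
        return w / w.sum()

    def fit_midas(y, x_daily, n_lags=60, thetas=np.linspace(-0.3, 0.0, 31)):
        """y: quarterly growth (T,); x_daily: daily volatility per quarter (T, n_lags)."""
        best = None
        for theta in thetas:
            z = x_daily @ almon_weights(theta, n_lags)       # weighted daily regressor
            X = np.column_stack([np.ones_like(y), z])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            ssr = np.sum((y - X @ beta) ** 2)
            if best is None or ssr < best[0]:
                best = (ssr, theta, beta)
        return best  # (SSR, theta, [intercept, slope])

    # toy data: 80 quarters, 60 daily volatility observations per quarter
    rng = np.random.default_rng(3)
    vol = np.abs(rng.normal(1.0, 0.3, (80, 60)))
    y = 0.5 - 0.4 * vol @ almon_weights(-0.1, 60) + rng.normal(0, 0.1, 80)
    print(fit_midas(y, vol)[1:])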
  6. By: Panagiotis Papaioannou; Lucia Russo; George Papaioannou; Constantinos Siettos
    Abstract: The Efficient Market Hypothesis (EMH) is widely accepted to hold true under certain assumptions. One of its implications is that the prediction of stock prices, at least in the short run, cannot outperform the random walk model. Yet, recently, many studies stressing the psychological and social dimensions of financial behavior have challenged the validity of the EMH. In this direction, over the last few years, internet-based communication platforms and search engines have been used to extract early indicators of social and economic trends. Here, we use Twitter's social networking platform to model and forecast the EUR/USD exchange rate at a high-frequency, intraday trading scale. Using time series analysis and trading simulations, we provide some evidence that the information contained in social microblogging platforms such as Twitter can, in certain cases, enhance forecasting efficiency for very short-term (intraday) forex rates.
    Date: 2013–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1310.5306&r=for
  7. By: Nuno Silva (GEMF/ Faculty of Economics University of Coimbra, Portugal)
    Abstract: In this paper, we studied the equity premium predictability in eleven EuroZone countries. Besides some traditional predictive variables, we have also chosen two others that, to our knowledge, have never been used previously in this literature: the change in the OECD normalized composite leading indicator and the change in the OECD business confidence indicator. The OECD indicators have shown a good performance, in particular during the early stages of the recent financial crisis. We also computed the utility gains that a mean-variance investor would have obtained had he used these forecasting variables, and concluded that, for most countries, the utility gains would have been considerable.
    Keywords: International stock markets, Equity premia predictability, Asset allocation
    JEL: C22 C53 G11 G17
    Date: 2013–09
    URL: http://d.repec.org/n?u=RePEc:gmf:wpaper:2013-22.&r=for
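    The utility-gain metric mentioned in the abstract is typically computed by comparing the realized mean-variance utility of a portfolio sized with the predictive-regression forecast against one sized with the historical-mean forecast. A minimal Python sketch on simulated data follows (variable names and numbers are illustrative, not from the paper):

    # Minimal sketch of mean-variance utility gains from a predictive signal.
    import numpy as np

    def realized_utility(weights, excess_ret, gamma=3.0):
        port = weights * excess_ret
        return port.mean() - 0.5 * gamma * port.var()

    def weights_from_forecast(forecast, var_forecast, gamma=3.0, lo=0.0, hi=1.5):
        return np.clip(forecast / (gamma * var_forecast), lo, hi)

    rng = np.random.default_rng(4)
    T = 240
    signal = rng.normal(0, 1, T)                       # e.g. change in a leading indicator
    excess = 0.004 + 0.002 * signal + rng.normal(0, 0.05, T)
    var_hat = np.full(T, excess.var())
    w_pred = weights_from_forecast(0.004 + 0.002 * signal, var_hat)
    w_mean = weights_from_forecast(np.full(T, excess.mean()), var_hat)

    gain = realized_utility(w_pred, excess) - realized_utility(w_mean, excess)
    print("annualised utility gain (%):", 1200 * gain)  # monthly data -> x12, in percent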
  8. By: Magnus, Jan R; Vasnev, Andrey
    Abstract: Sensitivity analysis is important for its own sake and also in combination with diagnostic testing. We consider the question of how to use sensitivity statistics in practice, in particular how to judge whether sensitivity is large or small. For this purpose we distinguish between absolute and relative sensitivity and highlight the context-dependent nature of any sensitivity analysis. Relative sensitivity is then applied in the context of forecast combination, and sensitivity-based weights are introduced. All concepts are illustrated with the European yield curve. In this context it is natural to look at sensitivity to autocorrelation and normality assumptions. Different forecasting models are combined with equal, fit-based and sensitivity-based weights, and compared with the multivariate and random walk benchmarks. We show that the fit-based weights and the sensitivity-based weights are complementary. For long-term maturities the sensitivity-based weights perform better than the other weights.
    Date: 2013–03
    URL: http://d.repec.org/n?u=RePEc:syb:wpbsba:2123/8964&r=for
  9. By: Nikolay Markov; Thomas Nitschka
    Abstract: This paper estimates Taylor rules using real-time inflation forecasts of the Swiss National Bank's (SNB) ARIMA model and real-time, model-based internal estimates of the output gap since the adoption of the monetary policy concept in 2000. To study how market participants understand the SNB's behavior, we compare these Taylor rules to market-expected rules using Consensus Economics survey-based measures of expectations. In light of the recent financial crisis, the zero-lower-bound period and the subsequent massive Swiss franc appreciation, we analyze potential nonlinearity of the rules using a novel semi-parametric approach. First, the results show that the SNB reacts more strongly to its ARIMA inflation forecasts three and four quarters ahead than to forecasts at shorter horizons. Second, market participants have expected a higher inflation responsiveness of the SNB than is found with the central bank's data. Third, the best-fitting specification includes a reaction to the nominal effective Swiss franc appreciation. Finally, the semi-parametric regressions suggest that the central bank reacts to movements in the output gap and the exchange rate to the extent that they become a concern for price stability and economic activity.
    Keywords: Taylor rules, real-time data, nonlinearity, semi-parametric modeling
    JEL: E52 E58 C14
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:snb:snbwpa:2013-08&r=for
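    A baseline linear Taylor rule of the kind discussed above can be estimated by least squares, regressing the policy rate on an inflation forecast, the output gap and the exchange-rate change. The Python sketch below uses synthetic data; the paper's real-time SNB forecasts and its semi-parametric estimator are not reproduced here:

    # Minimal sketch: OLS estimation of i_t = c + a*E_t[pi_{t+h}] + b*gap_t + d*dfx_t + e_t
    # on synthetic data (illustrative only).
    import numpy as np

    rng = np.random.default_rng(5)
    T = 52                                             # quarterly sample, roughly 2000-2012
    infl_fcst = rng.normal(1.0, 0.7, T)                # ARIMA-style inflation forecast (4q ahead)
    output_gap = rng.normal(0.0, 1.0, T)
    delta_fx = rng.normal(0.0, 2.0, T)                 # nominal effective CHF appreciation
    rate = 0.5 + 1.5 * infl_fcst + 0.5 * output_gap - 0.1 * delta_fx + rng.normal(0, 0.2, T)

    X = np.column_stack([np.ones(T), infl_fcst, output_gap, delta_fx])
    coef, *_ = np.linalg.lstsq(X, rate, rcond=None)
    print(dict(zip(["const", "inflation", "gap", "fx"], coef.round(2))))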
  10. By: Committee, Nobel Prize (Nobel Prize Committee)
    Abstract: There is no way to predict whether the price of stocks and bonds will go up or down over the next few days or weeks. But it is quite possible to foresee the broad course of the prices of these assets over longer time periods, such as the next three to five years. These findings, which may seem both surprising and contradictory, were made and analyzed by this year's Laureates, Eugene Fama, Lars Peter Hansen and Robert Shiller.
    Keywords: Asset prices;
    JEL: G12
    Date: 2013–10–14
    URL: http://d.repec.org/n?u=RePEc:ris:nobelp:2013_002&r=for

This nep-for issue is ©2013 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.