nep-for New Economics Papers
on Forecasting
Issue of 2015‒12‒20
six papers chosen by
Rob J Hyndman
Monash University

  1. Real-Time Data should be used in Forecasting Output Growth and Recessionary Events in the US By Chrystalleni Aristidou; Kevin Lee; Kalvinder Shields
  2. Employing Bayesian Forecasting of Value-at-Risk to Determine an Appropriate Model for Risk Management By CHEN, Cathy W.S.; WENG, Monica M.C.; WATANABE, Toshiaki
  3. Labour Market Modelling within a DSGE Approach By Jaromir Tonner; Stanislav Tvrz; Osvald Vasicek
  4. A fully non-parametric heteroskedastic model By Matthieu Garcin; Clément Goulet
  5. Benchmarking judgmentally adjusted forecasts By Franses, Ph.H.B.F.; de Bruijn, B.
  6. By Alessia Naccarato; Andrea Pierini; Stefano Falorsi

  1. By: Chrystalleni Aristidou; Kevin Lee; Kalvinder Shields
    Abstract: The paper investigates whether forecast performance is enhanced by real-time datasets incorporating past data vintages and survey expectations. It proposes a modelling framework and evaluation procedure that allow the use of these data in forecasting to be assessed, both in real time and ex post, against various statistical and economic criteria. Analysing US output data over 1968q4–2015q1, we find that both elements of the real-time data are useful, with their contributions varying over time. Revisions data are particularly valuable for point and density forecasts of growth, while survey expectations are important in forecasting rare recessionary events.
    Keywords: Real-Time Data, Nowcasting, Revision, Survey, Growth, Recession
    Date: 2015
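The abstract's distinction between point and density forecast accuracy can be illustrated with a small sketch. This is not the authors' code: the data are simulated, the predictive densities are assumed Gaussian, and the two forecasters A and B are hypothetical; it only shows the standard metrics (RMSE for point forecasts, average log score for density forecasts):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Illustrative quarterly output-growth "realisations" (per cent)
actuals = rng.normal(0.6, 0.5, size=40)

# Two hypothetical density forecasters: predictive means and standard deviations
mean_a, sd_a = actuals + rng.normal(0, 0.3, 40), 0.5   # wider, better-calibrated bands
mean_b, sd_b = actuals + rng.normal(0, 0.6, 40), 0.2   # overconfident bands

def rmse(mean, actual):
    """Point-forecast accuracy: root mean squared error of the predictive mean."""
    return np.sqrt(np.mean((mean - actual) ** 2))

def avg_log_score(mean, sd, actual):
    """Density-forecast accuracy: average log predictive likelihood (higher is better)."""
    return np.mean(norm.logpdf(actual, loc=mean, scale=sd))

print("RMSE      A:", rmse(mean_a, actuals), " B:", rmse(mean_b, actuals))
print("Log score A:", avg_log_score(mean_a, sd_a, actuals),
      " B:", avg_log_score(mean_b, sd_b, actuals))
```

A forecaster can look fine on RMSE yet score badly on the log score if the stated uncertainty bands are too narrow, which is why the paper evaluates both.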
  2. By: CHEN, Cathy W.S.; WENG, Monica M.C.; WATANABE, Toshiaki
    Abstract: To allow for a higher degree of flexibility in model parameters, we propose a general and time-varying nonlinear smooth transition (ST) heteroskedastic model with a second-order logistic function of varying speed in the mean and variance. This paper evaluates the performance of Value-at-Risk (VaR) measures in a class of risk models, especially focusing on three distinct ST functions with GARCH structures: first- and second-order logistic functions, and the exponential function. The likelihood function is non-differentiable with respect to the threshold values and delay parameter. We employ Bayesian Markov chain Monte Carlo sampling methods to update the estimates and quantile forecasts. The proposed methods are illustrated using simulated data and an empirical study. We estimate VaR forecasts for the proposed models alongside some competing asymmetric models with skew and fat-tailed error probability distributions, including realized volatility models. To evaluate the accuracy of VaR estimates, we implement two loss functions and three backtests. The results show that the ST model with a second-order logistic function and skew Student’s t error is a worthy choice at the 1% level, when compared to a range of existing alternatives.
    Keywords: Second-order logistic transition function, Backtesting, Markov chain Monte Carlo methods, Value-at-Risk, Volatility forecasting, Realized volatility models
    Date: 2015–12–08
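The three smooth transition functions the abstract compares have standard textbook forms, sketched below. These are common parameterisations (gamma is the transition speed, c / c1 / c2 the thresholds), not necessarily the paper's exact specification:

```python
import numpy as np

def logistic1(s, gamma, c):
    """First-order logistic transition: monotone from 0 to 1 as s crosses c."""
    return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

def logistic2(s, gamma, c1, c2):
    """Second-order logistic transition (a common parameterisation):
    close to 1 for extreme s, dipping between the two thresholds c1 < c2."""
    return 1.0 / (1.0 + np.exp(-gamma * (s - c1) * (s - c2)))

def exponential(s, gamma, c):
    """Exponential transition: U-shaped around c, 0 at s = c."""
    return 1.0 - np.exp(-gamma * (s - c) ** 2)

s = np.linspace(-3, 3, 7)
print("second-order logistic:", logistic2(s, gamma=2.0, c1=-1.0, c2=1.0))
```

The second-order logistic lets the same regime apply to both large negative and large positive values of the transition variable, with a middle regime in between, which is what gives the model in the paper its extra flexibility over the monotone first-order case.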
  3. By: Jaromir Tonner; Stanislav Tvrz; Osvald Vasicek
    Abstract: The goal of this paper is to find a suitable way of modelling the main labour market variables in the framework of the CNB's core DSGE model. The model selection criteria are: the predictive ability for unemployment, the change in the overall predictive ability in comparison to the baseline model, and the extent of the required model change. We find that incorporating a modified Gali, Smets and Wouters (2011) labour market specification allows us to predict unemployment with an acceptable forecast error. At the same time, it leads to a moderate improvement in the overall predictive ability of the model and requires only minor adjustments to the model structure. Thus, it should be preferred to more complicated concepts that yield a similar improvement in predictive ability. We also conclude that the concept linking unemployment and the GDP gap is promising; however, its practical application would require further improvement in the accuracy of the consumption prediction. As a practical experiment, we compare the inflation pressures arising from nominal wages and the exchange rate in the baseline model and in alternative specifications. The experiment is motivated by the CNB's use of the exchange rate as an additional monetary policy instrument since November 2013, in an environment of near-zero interest rates and growing disinflationary pressures. We find that the baseline model tends to forecast higher nominal wage growth and lower exchange rate depreciation than the models with more elaborate labour markets. Therefore, the alternative models would probably have identified an even higher need for exchange rate depreciation than the baseline model did.
    Keywords: DSGE, labour market, Nash bargaining, right to manage
    JEL: C53 E32 E37
    Date: 2015–08
  4. By: Matthieu Garcin (Centre d'Economie de la Sorbonne & Natixis Asset Management); Clément Goulet (Centre d'Economie de la Sorbonne)
    Abstract: In this paper we propose a new model for estimating returns and volatility. Our approach is based on both the wavelet denoising technique and variational theory. We show that volatility can be expressed as a non-parametric functional form of past returns. We are therefore able to forecast both returns and volatility and to build confidence intervals for predicted returns. Our technique outperforms classical time-series models. Our model does not require stationarity of the observed log-returns, it preserves the stylised facts of volatility, and it is based on a fully non-parametric form, obtained thanks to multiplicative noise theory. To our knowledge, this is the first time such a method has been used for financial modelling. We propose an application to intraday and daily financial data.
    Keywords: Volatility modeling; non variational calculus; wavelet theory; trading strategy
    JEL: C14 C51 C53 C58
    Date: 2015–09
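The wavelet denoising step the abstract relies on can be sketched in a toy form. This is a one-level Haar transform with soft universal thresholding, using only numpy; the paper's actual wavelet basis, depth, and thresholding rule are not specified in the abstract, so everything below is illustrative:

```python
import numpy as np

def haar_denoise(x):
    """One-level Haar wavelet soft-threshold denoising (toy illustration).
    x must have even length."""
    s2 = np.sqrt(2.0)
    approx = (x[0::2] + x[1::2]) / s2          # low-frequency coefficients
    detail = (x[0::2] - x[1::2]) / s2          # high-frequency coefficients
    # Universal threshold with a robust noise estimate (MAD of the details)
    sigma = np.median(np.abs(detail)) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(x)))
    detail = np.sign(detail) * np.maximum(np.abs(detail) - thr, 0.0)  # soft threshold
    # Inverse Haar transform
    y = np.empty_like(x)
    y[0::2] = (approx + detail) / s2
    y[1::2] = (approx - detail) / s2
    return y

rng = np.random.default_rng(1)
clean = np.sin(np.linspace(0, 4 * np.pi, 256))     # smooth "signal"
noisy = clean + rng.normal(0, 0.3, 256)            # observed series
den = haar_denoise(noisy)
print("noisy MSE:   ", np.mean((noisy - clean) ** 2))
print("denoised MSE:", np.mean((den - clean) ** 2))
```

In practice one would use a deeper decomposition and a smoother wavelet (e.g. Daubechies), but the mechanism — shrink the noisy detail coefficients, keep the smooth approximation — is the same.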
  5. By: Franses, Ph.H.B.F.; de Bruijn, B.
    Abstract: Many publicly available macroeconomic forecasts are judgmentally adjusted model-based forecasts. In practice, usually only a single final forecast is available; neither the underlying econometric model nor the size of and reason for the adjustment are known. Hence the relative weights given to the model forecast and to the judgment are usually unknown to the analyst. This paper proposes a methodology to evaluate the quality of such final forecasts and to allow learning from past errors. To do so, the analyst needs benchmark forecasts, and we propose two. The first is the simple no-change forecast, the bottom-line forecast that an expert should be able to improve on. The second is an estimated model-based forecast, found as the best forecast given the realizations and the final forecasts. We illustrate this methodology for two sets of GDP growth forecasts, one for the US and one for the Netherlands. These applications show that adjustment appears most effective in periods of first recovery from a recession.
    Keywords: forecast decomposition, expert adjustment, total least squares
    JEL: C20 C51
    Date: 2015–11–01
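The first benchmark in the abstract, the no-change forecast, is simple enough to sketch directly. The GDP growth numbers below are invented for illustration, not taken from the paper:

```python
import numpy as np

# Hypothetical GDP growth realisations and an expert's final (adjusted) forecasts
actual = np.array([2.1, 1.8, -0.5, -1.2, 0.4, 1.5, 2.0, 2.2])
expert = np.array([1.9, 1.7, 0.3, -0.8, 0.9, 1.6, 1.9, 2.1])

# Benchmark: no-change forecast = previous period's realisation
no_change = np.concatenate(([actual[0]], actual[:-1]))

def rmse(forecast, actual):
    """Root mean squared forecast error."""
    return np.sqrt(np.mean((forecast - actual) ** 2))

print("expert RMSE:   ", rmse(expert, actual))
print("no-change RMSE:", rmse(no_change, actual))
```

An expert whose final forecasts cannot beat this naive benchmark adds no value; the paper's second, model-based benchmark is estimated from the realizations and final forecasts and sets a correspondingly higher bar.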
  6. By: Alessia Naccarato; Andrea Pierini; Stefano Falorsi
    Abstract: The increased availability of online information in recent years has aroused interest in the possibility of deriving indications on the phenomena under study. In the economic and statistical context more specifically, numerous studies suggest using online search data to improve the nowcasting and forecasting of official economic indicators, with a view to increasing the promptness of their circulation. Along these lines, this paper puts forward a model for multiple time series that harnesses cointegration of the official time series of the Italian unemployment rate and the series of the Google Trends job-offers query share to nowcast the monthly unemployment rate. Nowcasting is to be understood here as estimating the monthly unemployment rate for the month in which the official survey is actually under way. The aim is thus to assess whether the use of Internet search data can improve the nowcasting of the economic indicator considered.
    Keywords: multivariate time series analysis, preliminary estimates, online search data
    JEL: C13 C32 C53
    Date: 2015–12
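The nowcasting idea — exploit a promptly available search series to estimate the official indicator before it is published — can be sketched with a single-equation stand-in. The paper uses a multivariate cointegration model; the simple OLS regression and simulated series below are only an illustration of the mechanism:

```python
import numpy as np

rng = np.random.default_rng(2)
T = 120
# Simulated monthly unemployment rate (slow-moving) and a co-moving,
# Google-Trends-like query share that is observed before the official figure
u = 10 + np.cumsum(rng.normal(0, 0.1, T))
g = 0.5 * u + rng.normal(0, 0.2, T)

# Fit: nowcast u[t] from last month's official figure u[t-1]
# and the current month's search share g[t], via OLS
X = np.column_stack([np.ones(T - 2), u[:T - 2], g[1:T - 1]])
y = u[1:T - 1]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Out-of-sample nowcast for the latest month, before u[-1] is "published"
nowcast = np.array([1.0, u[-2], g[-1]]) @ beta
print("nowcast:", nowcast, "later official figure:", u[-1])
```

The gain over a pure autoregression comes from the contemporaneous search term: it carries information about the current month that the lagged official series cannot.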

This nep-for issue is ©2015 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at <>. For comments, please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.