nep-for New Economics Papers
on Forecasting
Issue of 2011‒11‒21
eleven papers chosen by
Rob J Hyndman
Monash University

  1. Evaluating the Rationality of Managers' Sales Forecasts By Bruijn, B. de; Franses, Ph.H.B.F.
  2. Stock return predictability and variance risk premia: statistical inference and international evidence By Tim Bollerslev; James Marrone; Lai Xu; Hao Zhou
  3. Bayesian semiparametric GARCH models By Xibin Zhang; Maxwell L. King
  4. On the predictive content of nonlinear transformations of lagged autoregression residuals and time series observations By Rossen, Anja
  5. The potential of a small model By Adam Elbourne; Coen Teulings
  6. Privileged information exacerbates market volatility By Gabriel Desgranges; Stéphane Gauthier
  7. The total survey error paradigm and pre-election polls: the case of the 2006 Italian general elections By Fumagalli, Laura; Sala, Emanuela
  8. Measuring and Predicting Heterogeneous Recessions By Cem Cakmakli; Richard Paap; Dick van Dijk
  9. From Expert Judgment to Model based Monetary Analysis: The Case of the Dutch Central Bank in the Postwar Period By Frank A.G. den Butter; Harro B.J.B. Maas
  10. When are adaptive expectations rational? A generalization By Shepherd, Ben
  11. On the Predictability of Stock Prices: a Case for High and Low Prices By Massimiliano Caporin; Angelo Ranaldo

  1. By: Bruijn, B. de; Franses, Ph.H.B.F.
    Abstract: This paper deals with the analysis and evaluation of sales forecasts of managers, given that it is unknown how they constructed their forecasts. Our goal is to find out whether these forecasts are rational. To examine deviations from rationality, we argue that one has to approximate how the managers could have generated the forecasts. We describe several ways to construct these approximate expressions. The analysis of a large set of a single manager's forecasts for sales of pharmaceutical products illustrates the practical usefulness of our methodology.
    Keywords: evaluating forecasts; intuition; rationality; fixed-event forecasts; forecast updates; sales forecasts
    Date: 2011–11–14
  2. By: Tim Bollerslev; James Marrone; Lai Xu; Hao Zhou
    Abstract: Recent empirical evidence suggests that the variance risk premium, or the difference between risk-neutral and statistical expectations of the future return variation, predicts aggregate stock market returns, with the predictability especially strong at the 2-4 month horizons. We provide extensive Monte Carlo simulation evidence that statistical finite-sample biases in the overlapping return regressions underlying these findings cannot "explain" this apparent predictability. Further corroborating the existing empirical evidence, we show that the patterns in the predictability across different return horizons estimated from country-specific regressions for France, Germany, Japan, Switzerland and the U.K. are remarkably similar to the pattern previously documented for the U.S. Defining a "global" variance risk premium, we uncover even stronger predictability and almost identical cross-country patterns through the use of panel regressions that effectively restrict the compensation for worldwide variance risk to be the same across countries. Our findings are broadly consistent with the implications of a stylized two-country general equilibrium model that explicitly incorporates the effects of worldwide time-varying economic uncertainty.
    Date: 2011
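The variance risk premium the abstract describes is the gap between risk-neutral and statistical expectations of future return variation. A minimal sketch of one common empirical proxy (an assumption here, not necessarily the authors' exact construction): squared option-implied volatility, converted to a monthly basis, minus realized monthly variance. All numbers below are illustrative.

```python
def variance_risk_premium(implied_vol_annual_pct, realized_var_monthly):
    """VRP proxy in monthly percentage-squared units.

    implied_vol_annual_pct: option-implied volatility, annualized, in percent
                            (e.g. a VIX-style index level).
    realized_var_monthly:   realized return variance over the past month, in
                            percentage-squared units, serving as a simple
                            statistical forecast of future return variation.
    """
    # Squared implied volatility approximates the risk-neutral expected
    # annual variance; divide by 12 to put it on a monthly basis.
    risk_neutral_var_monthly = implied_vol_annual_pct ** 2 / 12.0
    return risk_neutral_var_monthly - realized_var_monthly

# Example: implied volatility at 20 (risk-neutral monthly variance 400/12)
# against a realized monthly variance of 25 gives a positive premium.
vrp = variance_risk_premium(20.0, 25.0)
```

In the return-predictability regressions the abstract refers to, a series of such premia would be used as the predictor of subsequent aggregate returns.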
  3. By: Xibin Zhang; Maxwell L. King
    Abstract: This paper investigates a Bayesian sampling approach to parameter estimation in the semiparametric GARCH model with an unknown conditional error density, which we approximate by a mixture of Gaussian densities centered at the individual errors and scaled by a common standard deviation. This mixture density has the form of a kernel density estimator of the errors, with its bandwidth being the standard deviation. The investigation is motivated by the lack of robustness of GARCH models under any parametric assumption on the error density when used for error-density-based inference such as value-at-risk (VaR) estimation. The contribution of the paper is to construct the likelihood and posterior of the model and bandwidth parameters under the proposed mixture error density, and to forecast the one-step out-of-sample density of asset returns. The resulting VaR measure would therefore be distribution-free. Applying the semiparametric GARCH(1,1) model to daily stock-index returns in eight stock markets, we find that this semiparametric GARCH model is favoured over the GARCH(1,1) model with Student-t errors for five indices, and that the GARCH model underestimates VaR compared to its semiparametric counterpart. We also investigate the use and benefit of localized bandwidths in the proposed mixture density of the errors.
    Keywords: Bayes factors, kernel-form error density, localized bandwidths, Markov chain Monte Carlo, value-at-risk
    JEL: C11 C14 C15 G15
    Date: 2011–11–03
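The kernel-form error density described in the abstract is a mixture of Gaussians centred at the individual standardized errors, all sharing a common bandwidth, which coincides with a kernel density estimator of the errors. A minimal sketch of that density, with toy residuals standing in for actual GARCH errors:

```python
import math

def mixture_error_density(x, errors, h):
    """Evaluate the mixture density at x: the average of n Gaussian
    densities N(e_i, h^2), one centred at each error e_i, with common
    bandwidth (standard deviation) h."""
    n = len(errors)
    return sum(
        math.exp(-0.5 * ((x - e) / h) ** 2) / (h * math.sqrt(2.0 * math.pi))
        for e in errors
    ) / n

# Toy standardized residuals; a real application would use the fitted
# GARCH errors and treat h as a parameter with its own posterior.
errors = [-1.2, -0.3, 0.1, 0.4, 1.0]
density_at_zero = mixture_error_density(0.0, errors, h=0.5)
```

Because each component integrates to one, the mixture is itself a proper density, which is what makes quantiles of it (and hence distribution-free VaR) well defined.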
  4. By: Rossen, Anja
    Abstract: This study investigates whether nonlinear transformations of lagged time series values and residuals can systematically improve the average forecasting performance of simple Autoregressive models. It also examines whether a nonlinear Threshold model yields superior forecasts. To this end, a large-scale comparison is conducted over almost 400 time series spanning 1996:3 to 2008:12 (production indices, price indices, unemployment rates, exchange rates, money supply) from 10 European countries. The average forecasting performance is appraised by means of Mean Group statistics and simple t-tests. Autoregressive models are extended by transformed first lags of residuals and time series values. Whereas additional transformations of lagged time series values reduce the ex-ante forecast uncertainty and provide better directional accuracy, transformations of lagged residuals also lead to smaller forecast errors. Furthermore, the nonlinear Threshold model is able to capture certain types of economic behavior in the data and provides superior forecasting results to those of a simple Autoregressive model. These findings are largely independent of the economic variables considered.
    Keywords: Time series modeling, forecasting comparison, nonlinear transformations, Threshold Autoregressive modeling, average forecasting performance
    JEL: C22 C53 C51
    Date: 2011
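A Threshold Autoregressive model of the kind compared in this study lets the autoregressive coefficient switch depending on whether a lagged observation is above or below a threshold. A minimal one-step-forecast sketch (the threshold and coefficients below are made up for illustration, not estimated from the paper's data):

```python
def tar_forecast(y_last, threshold=0.0, phi_low=0.8, phi_high=0.3):
    """One-step forecast from a two-regime TAR(1) model:
    y_{t+1} = phi_low * y_t  if y_t <= threshold,
              phi_high * y_t otherwise.
    All parameter values here are illustrative."""
    phi = phi_low if y_last <= threshold else phi_high
    return phi * y_last

# Persistence is high in the low regime and weak in the high regime:
f_low = tar_forecast(-1.0)   # uses phi_low  -> -0.8
f_high = tar_forecast(2.0)   # uses phi_high ->  0.6
```

Estimating such a model amounts to choosing the threshold and fitting a separate AR coefficient within each regime, which is how it can capture the regime-dependent behavior the abstract mentions.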
  5. By: Adam Elbourne; Coen Teulings
    Abstract: This CPB Discussion Paper highlights potential uses of simple, small models where large traditional models are less flexible. We run a number of experiments with a small two-variable VAR model of GDP growth and unemployment, using both quarterly and yearly data. We compare the forecasts of these simple models with the published forecasts of the CPB and conclude that there is not much difference. We then show how easy it is to evaluate the usefulness of a given variable for forecasting by extending the model to include world trade. Perfect knowledge of future world trade growth would help considerably, but is obviously not available at the time the forecasts are made. The available world trade data does not improve the forecasts. Finally, we also show how quick and flexible measures of the output gap can be constructed.
    JEL: C0 E0
    Date: 2011–11
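The two-variable VAR the abstract describes can be sketched in a few lines. Below is a one-step-ahead forecast from a VAR(1) in GDP growth and unemployment with illustrative, made-up coefficients; a real exercise would estimate the coefficient matrix and intercepts by least squares on the published data.

```python
def var1_forecast(y, A, c):
    """One-step-ahead forecast of a bivariate VAR(1):
    y_{t+1} = c + A y_t, written out element by element."""
    return [
        c[0] + A[0][0] * y[0] + A[0][1] * y[1],
        c[1] + A[1][0] * y[0] + A[1][1] * y[1],
    ]

# Illustrative coefficients (not estimates from the paper):
A = [[0.5, -0.2],    # growth is persistent, dampened by unemployment
     [-0.3, 0.8]]    # unemployment is persistent, falls with growth
c = [1.0, 1.5]

# Current state: GDP growth 2%, unemployment 5%.
forecast = var1_forecast([2.0, 5.0], A, c)   # [1.0, 4.9]
```

Extending the system with a third variable such as world trade, as the paper does, just adds a row and column to A, which is why checking a candidate predictor's usefulness is so cheap in this setup.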
  6. By: Gabriel Desgranges (THEMA - Théorie économique, modélisation et applications - CNRS : UMR8184 - Université de Cergy Pontoise); Stéphane Gauthier (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris)
    Abstract: We study how asymmetric information affects market volatility in a linear setup where the outcome is determined by forecasts about this same outcome. The unique rational expectations equilibrium will be stable when it is the only rationalizable solution. It has been established in the literature that stability is obtained when the sensitivity of the outcome to agents' forecasts is less than 1, provided that this sensitivity is common knowledge. Relaxing this common knowledge assumption, instability is obtained when the proportion of agents who a priori know the sensitivity is large, and the uninformed agents believe it is possible that the sensitivity is greater than 1.
    Keywords: Asymmetric information, common knowledge, eductive learning, rational expectations, rationalizability, volatility.
    Date: 2011–10
  7. By: Fumagalli, Laura; Sala, Emanuela
    Abstract: Pre-election polls sometimes fail to achieve the purpose for which they are carried out: to provide accurate predictions of electoral outcomes. Looking at the 2006 Italian general elections, this paper assesses the role that different factors play in determining the accuracy of pre-election polls. We find strong evidence that the quality of the sampling frame and non-respondents may contribute to biasing the poll results. The paper also shows how to overcome some of the limitations of the survey data by using statistical matching techniques and weighting procedures.
    Date: 2011–11–09
  8. By: Cem Cakmakli (University of Amsterdam); Richard Paap (Erasmus University Rotterdam); Dick van Dijk (Erasmus University Rotterdam)
    Abstract: This paper conducts an empirical analysis of the heterogeneity of recessions in monthly U.S. coincident and leading indicator variables. Univariate Markov-switching models indicate that it is appropriate to allow for two distinct recession regimes, corresponding to ‘mild’ and ‘severe’ recessions. All downturns start with a mild decline in the level of economic activity. Contractions that develop into severe recessions mostly correspond with periods of substantial credit squeezes, as suggested by the ‘financial accelerator’ theory. Multivariate Markov-switching models that allow for phase shifts between the cyclical regimes of industrial production and the Conference Board Leading Economic Index confirm these findings.
    Keywords: Business cycle; phase shifts; regime-switching models; Bayesian analysis
    JEL: C11 C32 C51 C52 E32
    Date: 2011–11–03
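The three-regime structure the abstract describes (expansion, mild recession, severe recession, with every downturn starting mild) can be sketched as a Markov chain whose transition matrix rules out a direct jump from expansion to severe recession. The probabilities below are illustrative, not the paper's estimates.

```python
STATES = ("expansion", "mild", "severe")

# Transition probabilities P[i][j] = Pr(next regime = j | current = i).
P = [
    [0.95, 0.05, 0.00],   # expansion cannot jump straight to severe
    [0.30, 0.50, 0.20],   # a mild recession may deepen into a severe one
    [0.10, 0.20, 0.70],   # severe recessions are persistent
]

def next_regime_probs(current_probs):
    """Propagate the vector of regime probabilities one period ahead:
    p_{t+1}[j] = sum_i p_t[i] * P[i][j]."""
    return [
        sum(current_probs[i] * P[i][j] for i in range(3))
        for j in range(3)
    ]

# Starting for sure in expansion, severe recession has zero probability
# next period; it can only be reached via the mild regime.
probs = next_regime_probs([1.0, 0.0, 0.0])
```

In a full Markov-switching model, each regime would additionally carry its own mean and variance for the indicator variables, and the regime probabilities would be filtered from the data rather than propagated mechanically.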
  9. By: Frank A.G. den Butter (VU University Amsterdam); Harro B.J.B. Maas (Utrecht University)
    Abstract: This paper investigates the history of the shift from expert-based to model-based monetary policy analysis at the Dutch Central Bank (DNB) in the postwar period, up to the mid-1980s. For reasons that will become clear, expert-based reasoning at DNB was referred to as normative impulse analysis. Our focus is on two aspects of this shift: (i) from an expert-based monetary analysis to a model-based analysis of the channels of monetary transmission, and (ii) from a top-down mode of monetary analysis, in which the president of DNB acted as the monetary expert, in line with DNB's hierarchical organisation, to a bottom-up modelling approach set up by a group of newly hired young academic outsiders, which destabilized DNB's organisation. The resulting econometric model enabled DNB to regain some of its argumentative strength in the Dutch policy arena, which had become dominated by the econometric model of the Dutch Planning Bureau (of which Tinbergen was the first director), but it also led to tensions within DNB's organisation. In spite of efforts to incorporate the main aspects of Holtrop's monetary analysis within the model, the new research group that came with it proved difficult to integrate into DNB's hierarchical organisation. The model analysis resulted in the MORKMON model, which replaced Holtrop's analysis in the mid-1980s and was regularly used in DNB's policy analysis and forecasting until 2011, when it was replaced by the DELFI model.
    Keywords: Dutch monetarism; history of economic modelling; monetary policy
    JEL: B23 C52 E58
    Date: 2011–11–15
  10. By: Shepherd, Ben
    Abstract: This note presents a simple generalization of the adaptive expectations mechanism in which the learning parameter is time variant. It is shown that expectations generated in this way are rational in the sense of producing minimum mean squared forecast errors for a broad class of time series models, namely any process that can be written in linear state space form.
    Keywords: Adaptive Expectations; Rational Expectations; Kalman Filter
    JEL: C53 C22
    Date: 2011–10–25
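The generalization in this note replaces the fixed learning parameter of adaptive expectations, E_t = E_{t-1} + g (y_t - E_{t-1}), with a time-varying gain g_t. For a local-level state space model, the minimum-MSE gain sequence is exactly the Kalman gain. A minimal sketch of that gain recursion, with illustrative noise variances:

```python
def kalman_gains(q, r, p0, n):
    """Gain sequence g_t of the Kalman filter for the local level model
        y_t  = mu_t + e_t,        e_t ~ noise with variance r,
        mu_t = mu_{t-1} + w_t,    w_t ~ noise with variance q,
    starting from prior state variance p0. Each g_t is the weight the
    filter puts on the latest forecast error -- i.e. the time-varying
    learning parameter of the adaptive expectations rule."""
    gains, p = [], p0
    for _ in range(n):
        p_pred = p + q                  # predicted state variance
        g = p_pred / (p_pred + r)       # Kalman gain for this period
        p = (1.0 - g) * p_pred          # updated state variance
        gains.append(g)
    return gains

# Illustrative values: diffuse prior, modest state noise relative to
# observation noise. The gains start high and settle at a steady state.
gains = kalman_gains(q=0.1, r=1.0, p0=10.0, n=50)
```

The convergence of g_t to a constant is what reconciles the two views in the note's title: in the limit, the optimal filter looks like classic adaptive expectations with a fixed coefficient.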
  11. By: Massimiliano Caporin; Angelo Ranaldo
    Abstract: Contrary to the common wisdom that asset prices are all but impossible to forecast, we show that high and low prices of equity shares are largely predictable. We propose to model them using a simple implementation of a fractional vector autoregressive model with error correction (FVECM). This model captures two fundamental patterns of high and low prices: their cointegrating relationship and the long memory of their difference (i.e. the range), which is a measure of realized volatility. Investment strategies based on FVECM predictions of high/low US equity prices as exit/entry signals deliver superior performance, even on a risk-adjusted basis.
    Keywords: high and low prices, predictability of asset prices, range, fractional cointegration, exit/entry trading signals, chart/technical analysis
    JEL: G11 C53
    Date: 2011
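The range variable at the heart of this paper is simply the difference between the log high and log low price: the two series share a common trend, while their difference is a stationary, long-memory proxy for realized volatility. A minimal sketch of computing the range on a fabricated price path (the numbers are for illustration only):

```python
import math

def log_range(high, low):
    """Daily (log) range: log high minus log low. Under a diffusion
    model this is the basis of Parkinson-style volatility proxies."""
    return math.log(high) - math.log(low)

# Fabricated daily highs and lows; highs and lows drift together
# (cointegration) while their gap stays in a narrow band.
highs = [101.0, 103.0, 102.5, 105.0]
lows  = [ 99.0, 100.5, 101.0, 102.0]
ranges = [log_range(h, l) for h, l in zip(highs, lows)]
```

The FVECM then models the high/low pair jointly, with an error-correction term pulling the two series back toward each other and fractional differencing capturing the slow decay in the range's autocorrelations.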

This nep-for issue is ©2011 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.