nep-for New Economics Papers
on Forecasting
Issue of 2016‒05‒28
eight papers chosen by
Rob J Hyndman
Monash University

  1. Models of Mortality rates - analysing the residuals By O'Hare, Colin; Li, Youwei
  2. Learning Time-Varying Forecast Combinations By Antoine Mandel; Amir Sani
  3. Comparing Predictive Accuracy under Long Memory - With an Application to Volatility Forecasting By Robinson Kruse; Christian Leschinski; Michael Will
  4. Forecasting with Neural Networks Models. By Francis Bismans; Igor N. Litvine
  5. A Nonparametric Approach to Identifying a Subset of Forecasters that Outperforms the Simple Average By Constantin Bürgi; Tara M. Sinclair
  6. A Nowcasting Model for Canada: Do U.S. Variables Matter? By Bragoli, Daniela; Modugno, Michele
  7. Economic dynamics and technology diffusion in Indian power sector By B. Sudhakara Reddy
  8. Financial Conditions Indicators for Brazil By Wagner Piazza Gaglianone; Waldyr Dutra Areosa

  1. By: O'Hare, Colin; Li, Youwei
    Abstract: The area of mortality modelling has received significant attention over the last 25 years owing to the need to quantify and forecast improving mortality rates. This need is driven primarily by the concern of governments, insurance and actuarial professionals, and individuals to be able to fund their old age. In particular, to quantify the costs of increasing longevity we need suitable models of mortality rates that capture the dynamics of the data and forecast them with sufficient accuracy to be useful. In this paper we test several of the leading time series models by considering their fitting quality and, in particular, testing the residuals of those models for normality. In a wide-ranging study covering 30 countries we find that almost without exception the residuals are not normally distributed. Further, Hurst tests of the residuals provide evidence that structure remains that is not captured by the models.
    Keywords: Mortality; stochastic models; residuals; Hurst exponents
    JEL: C51 C52 C53 G22 G23 J11
    Date: 2016–05–17
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:71394&r=for
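The Hurst diagnostics mentioned in the abstract can be illustrated with a small sketch. Below is a crude single-window rescaled-range (R/S) estimate of the Hurst exponent in plain Python; it illustrates the idea, not the authors' procedure. An i.i.d. residual series should score near 0.5, while a persistent series scores higher.

```python
import math
import random

def rescaled_range(x):
    """R/S statistic: range of cumulative mean-deviations over the std. dev."""
    n = len(x)
    mean = sum(x) / n
    devs = [xi - mean for xi in x]
    cum, s = [], 0.0
    for d in devs:
        s += d
        cum.append(s)
    r = max(cum) - min(cum)
    sd = math.sqrt(sum(d * d for d in devs) / n)
    return r / sd

def hurst_exponent(x):
    """Crude single-window estimate: H = log(R/S) / log(n)."""
    return math.log(rescaled_range(x)) / math.log(len(x))

random.seed(0)
white = [random.gauss(0, 1) for _ in range(2000)]   # i.i.d. "residuals"
walk, s = [], 0.0
for w in white:                                     # integrated, persistent series
    s += w
    walk.append(s)
```

For well-specified models the residuals should behave like `white` (estimate near 0.5); remaining structure of the kind the paper finds pushes the estimate away from that benchmark, as with `walk`.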
  2. By: Antoine Mandel (CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics); Amir Sani (CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics)
    Abstract: Non-parametric forecast combination methods choose a set of static weights to combine over candidate forecasts, as opposed to traditional forecasting approaches, such as ordinary least squares, that combine over information (e.g. exogenous variables). While they are robust to noise, structural breaks, inconsistent predictors and changing dynamics in the target variable, sophisticated combination methods fail to outperform the simple mean. Time-varying weights have been suggested as a way forward. Here we address the challenge to “develop methods better geared to the intermittent and evolving nature of predictive relations” in Stock and Watson (2001) and propose a data-driven machine learning approach to learn time-varying forecast combinations for output, inflation or any macroeconomic time series of interest. Further, the proposed procedure “hedges” combination weights against performing poorly relative to the mean, while optimizing weights to minimize the performance gap to the best candidate forecast in hindsight. Theoretical results are reported along with empirical performance on a standard macroeconomic dataset for predicting output and inflation.
    Keywords: Forecast combinations,Machine Learning,Econometrics,Forecasting,Forecast Combination Puzzle,Apprentissage statistique,Combinaison de prédicteurs,Econométrie
    Date: 2016–04
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-01317974&r=for
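The "hedging" idea above can be illustrated with a minimal multiplicative-weights (Hedge-style) combination. The data, learning rate `eta`, and update rule below are illustrative assumptions, not the authors' procedure: each candidate's weight decays exponentially with its cumulative squared error, so the combination drifts toward the better forecaster over time.

```python
import math

def hedge_weights(cum_losses, eta=0.5):
    """Multiplicative-weights update: weight decays exponentially in cumulative loss."""
    w = [math.exp(-eta * L) for L in cum_losses]
    z = sum(w)
    return [wi / z for wi in w]

def combine_online(forecast_paths, actuals, eta=0.5):
    """Combine K candidate forecast paths over T periods, re-weighting each
    period from past squared errors (an illustrative Hedge-style scheme)."""
    k = len(forecast_paths)
    cum = [0.0] * k
    out = []
    for t, y in enumerate(actuals):
        w = hedge_weights(cum, eta)
        f = [p[t] for p in forecast_paths]
        out.append(sum(wi * fi for wi, fi in zip(w, f)))
        for i in range(k):
            cum[i] += (f[i] - y) ** 2   # update losses after forecasting
    return out

# Two hypothetical candidates: one nearly unbiased, one badly biased.
actual = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.0]
good = [a + 0.01 for a in actual]
bad = [a + 0.8 for a in actual]
combo = combine_online([good, bad], actual)
```

The combined forecast starts at the simple average and ends close to the better candidate, which is the time-varying behaviour the static mean cannot deliver.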
  3. By: Robinson Kruse (Rijksuniversiteit Groningen and CREATES); Christian Leschinski (Leibniz University Hannover); Michael Will (Leibniz University Hannover)
    Abstract: This paper extends the popular Diebold-Mariano test to situations when the forecast error loss differential exhibits long memory. It is shown that this situation can arise frequently, since long memory can be transmitted from forecasts and the forecast objective to forecast error loss differentials. The nature of this transmission mainly depends on the (un)biasedness of the forecasts and whether the involved series share common long memory. Further results show that the conventional Diebold-Mariano test is invalidated under these circumstances. Robust statistics based on a memory and autocorrelation consistent estimator and an extended fixed-bandwidth approach are considered. The subsequent Monte Carlo study provides a novel comparison of these robust statistics. As empirical applications, we conduct forecast comparison tests for the realized volatility of the Standard and Poor's 500 index among recent extensions of the heterogeneous autoregressive model. While we find that forecasts improve significantly if jumps in the log-price process are considered separately from continuous components, improvements achieved by the inclusion of implied volatility turn out to be insignificant in most situations.
    Keywords: Equal Predictive Ability, Long Memory, Diebold-Mariano Test, Long-run Variance Estimation, Realized Volatility
    JEL: C22 C52 C53
    Date: 2016–05–19
    URL: http://d.repec.org/n?u=RePEc:aah:create:2016-17&r=for
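For reference, the conventional Diebold-Mariano statistic that the paper extends can be sketched as follows. This is the textbook short-memory version with an (h-1)-lag variance estimate, which is exactly the estimator the authors show breaks down under long memory; the simulated errors are illustrative.

```python
import math
import random

def diebold_mariano(e1, e2, h=1):
    """DM statistic for equal predictive ability under squared-error loss.
    The long-run variance uses the usual h-1 autocovariance lags; this
    short-memory estimator is what fails when the loss differential has
    long memory."""
    d = [a * a - b * b for a, b in zip(e1, e2)]   # loss differential
    n = len(d)
    dbar = sum(d) / n
    def autocov(k):
        return sum((d[t] - dbar) * (d[t - k] - dbar) for t in range(k, n)) / n
    lrv = autocov(0) + 2.0 * sum(autocov(k) for k in range(1, h))
    return dbar / math.sqrt(lrv / n)

random.seed(1)
e_small = [random.gauss(0, 1) for _ in range(500)]   # accurate forecaster
e_large = [random.gauss(0, 2) for _ in range(500)]   # clearly worse forecaster
stat = diebold_mariano(e_small, e_large)
```

A strongly negative statistic rejects equal predictive ability in favour of the first forecaster.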
  4. By: Francis Bismans; Igor N. Litvine
    Abstract: This paper deals with the so-called feedforward neural network model, which we consider from a statistical and econometric viewpoint. We show how this model can be estimated by maximum likelihood. Finally, we apply the ANN methodology to model demand for electricity in South Africa. A comparison of forecasts based on a linear model and an ANN model, respectively, shows the usefulness of the latter.
    Keywords: Artificial neural networks (ANN), electricity consumption, forecasting, linear and non-linear models, recessions.
    JEL: C45 C53 E17 E27 Q43 Q47
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:ulp:sbbeta:2016-28&r=for
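A toy version of the feedforward network can be sketched with a one-hidden-layer net trained by stochastic gradient descent on squared error. This is not the paper's maximum-likelihood estimation or its South African electricity data; the target function, seed, and hyperparameters are illustrative.

```python
import math
import random

def train_toy_ann():
    """Fit y = tanh(2x) with a one-hidden-layer feedforward net by SGD on
    squared error; returns (initial SSE, final SSE)."""
    random.seed(2)
    xs = [i / 20.0 for i in range(-20, 21)]
    ys = [math.tanh(2.0 * x) for x in xs]            # nonlinear toy target
    H, lr = 5, 0.05
    w1 = [random.uniform(-1, 1) for _ in range(H)]   # input -> hidden weights
    b1 = [0.0] * H
    w2 = [random.uniform(-1, 1) for _ in range(H)]   # hidden -> output weights
    b2 = 0.0

    def predict(x):
        hidden = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
        return sum(w2[j] * hidden[j] for j in range(H)) + b2, hidden

    def sse():
        return sum((predict(x)[0] - y) ** 2 for x, y in zip(xs, ys))

    start = sse()
    for _ in range(500):
        for x, y in zip(xs, ys):
            out, hidden = predict(x)
            err = out - y
            for j in range(H):
                grad_h = err * w2[j] * (1.0 - hidden[j] ** 2)  # backprop through tanh
                w2[j] -= lr * err * hidden[j]
                w1[j] -= lr * grad_h * x
                b1[j] -= lr * grad_h
            b2 -= lr * err
    return start, sse()

start_sse, final_sse = train_toy_ann()
```

A linear model cannot drive the error on this target to zero, whereas the network's fit improves steadily, which is the flavour of the linear-vs-ANN comparison in the paper.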
  5. By: Constantin Bürgi (The George Washington University); Tara M. Sinclair (The George Washington University)
    Abstract: Empirical studies in the forecast combination literature have shown that it is notoriously difficult to improve upon the simple average despite the availability of optimal combination weights. In particular, historical performance-based combination approaches do not select forecasters that improve upon the simple average going forward. This paper shows that this is due to the high correlation among forecasters, which only by chance causes some individuals to have lower root mean squared errors (RMSE) than the simple average. We introduce a new nonparametric approach to eliminate forecasters who perform well based purely on chance as well as poor performers. This leaves a subset of forecasters with better performance in subsequent periods. It improves upon the simple average in the Survey of Professional Forecasters (SPF) for bond yields, where some forecasters may be more likely to have specialized knowledge.
    Keywords: Forecast combination; Forecast evaluation; Multiple model comparisons; Real-time data; Survey of Professional Forecasters
    JEL: C22 C52 C53
    Date: 2015–12
    URL: http://d.repec.org/n?u=RePEc:gwc:wpaper:2015-006&r=for
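The correlation mechanism described above is easy to simulate: when forecasters share a common error component, averaging removes little error, leaving individual rankings largely to chance. The simulation below is an illustrative sketch, not the paper's nonparametric procedure.

```python
import math
import random

def rmse(f, y):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(f, y)) / len(y))

def simulate(rho, k=10, n=400, seed=3):
    """k forecasters whose errors share a common component with weight rho.
    Returns (mean individual RMSE, RMSE of the simple average)."""
    rng = random.Random(seed)
    y = [rng.gauss(0, 1) for _ in range(n)]
    common = [rng.gauss(0, 1) for _ in range(n)]
    forecasts = [
        [y[t] + math.sqrt(rho) * common[t] + math.sqrt(1.0 - rho) * rng.gauss(0, 1)
         for t in range(n)]
        for _ in range(k)
    ]
    avg = [sum(f[t] for f in forecasts) / k for t in range(n)]
    mean_ind = sum(rmse(f, y) for f in forecasts) / k
    return mean_ind, rmse(avg, y)

ind_lo, avg_lo = simulate(rho=0.0)   # independent errors: averaging helps a lot
ind_hi, avg_hi = simulate(rho=0.9)   # highly correlated errors: it barely helps
```

With independent errors the average's RMSE falls roughly by a factor of sqrt(k); with correlated errors the common component survives averaging, so individuals sit close to the average and small RMSE differences are mostly luck.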
  6. By: Bragoli, Daniela; Modugno, Michele
    Abstract: We propose a dynamic factor model for nowcasting the growth rate of quarterly real Canadian gross domestic product. We show that the proposed model produces more accurate nowcasts than those produced by institutional forecasters, like the Bank of Canada, the Organisation for Economic Co-operation and Development (OECD), and the survey collected by Bloomberg, which reflects the median forecast of market participants. We show that including U.S. data in a nowcasting model for Canada dramatically improves its predictive accuracy, mainly because of the absence of timely production data for Canada. Moreover, Statistics Canada produces a monthly real GDP measure along with the quarterly one, and we show how to modify the state space representation of our model to properly link the monthly GDP with its quarterly counterpart.
    Keywords: Nowcasting ; Updating ; Dynamic Factor Model
    JEL: C33 C53 E37
    Date: 2016–04
    URL: http://d.repec.org/n?u=RePEc:fip:fedgfe:2016-36&r=for
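The monthly-quarterly link mentioned at the end can be illustrated with the standard approximation used in the nowcasting literature, which maps monthly log-growth rates into the growth rate of the quarterly average via 1/3, 2/3, 1, 2/3, 1/3 weights. This is a common device assumed here for illustration, not the paper's exact state space representation.

```python
import math
import random

# Assumed standard approximation: the log-growth of a quarterly average of
# monthly levels is a weighted sum of the last five monthly log-growth
# rates m_t, ..., m_{t-4}.
MM_WEIGHTS = [1/3, 2/3, 1.0, 2/3, 1/3]

def quarterly_growth_exact(levels):
    """Exact log growth of the quarterly average of six monthly levels."""
    q1 = sum(levels[-3:]) / 3.0
    q0 = sum(levels[-6:-3]) / 3.0
    return math.log(q1 / q0)

def quarterly_growth_approx(levels):
    """Weighted sum of the last five monthly log-growth rates."""
    m = [math.log(levels[i] / levels[i - 1]) for i in range(1, len(levels))]
    recent = list(reversed(m[-5:]))  # m_t, m_{t-1}, ..., m_{t-4}
    return sum(w * g for w, g in zip(MM_WEIGHTS, recent))

# Seven monthly levels (two full quarters of data), hypothetical growth rates.
random.seed(4)
levels = [100.0]
for _ in range(6):
    levels.append(levels[-1] * math.exp(random.gauss(0.003, 0.002)))
```

For realistic monthly growth rates the weighted sum tracks the exact quarterly growth to high precision, which is what lets a monthly state space model carry a quarterly GDP observation.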
  7. By: B. Sudhakara Reddy (Indira Gandhi Institute of Development Research)
    Abstract: There is a growing concern among policy makers about how electricity is generated and consumed in the context of energy security and global climate change. In such a scenario, renewable energy sources, especially solar and wind energy, are likely to play a significant role in providing reliable and sustainable electricity to consumers, as they are locally available and their carbon footprint is small. The future share of power from renewables will greatly depend on the expected generation cost and the government's support for investments in the sector. Using the levelised cost approach, the capital, operating, and fuel costs of the major electricity generation technologies are compared. A forecast is then made for electricity generation in India, using a non-linear Bass diffusion model over a 15-year horizon, until 2030, for all major energy technologies, viz. coal, natural gas, hydro, solar, wind, and biomass. The results show how present trends and future forecasts of electricity-generating technologies change the electricity generation mix, and how solar and wind power may increase their share in total generation. However, fossil fuels will continue to remain competitive relative to renewables due to their cost advantage. The main issue considered here is whether each energy technology has reached its maximum penetration level. This helps set out a path for renewable energy technology diffusion in the Indian power sector.
    Keywords: Cost, Diffusion, Power, Renewables, Technology
    JEL: P28 Q41 Q42 Q48
    Date: 2016–05
    URL: http://d.repec.org/n?u=RePEc:ind:igiwpp:2016-014&r=for
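A minimal discrete-time Bass diffusion recursion, of the kind used for the technology-penetration forecasts above, can be sketched as follows. The parameters p, q, and m are hypothetical, not estimates from the paper.

```python
def bass_adoption(p, q, m, periods):
    """Discrete-time Bass diffusion: new adoption each period combines an
    innovation effect (p) and an imitation effect (q * cumulative / m),
    applied to the remaining market potential (m - cumulative)."""
    cumulative, path = 0.0, []
    for _ in range(periods):
        new = (p + q * cumulative / m) * (m - cumulative)
        cumulative += new
        path.append(cumulative)
    return path

# Hypothetical parameters for one technology's diffusion (not from the paper).
path = bass_adoption(p=0.01, q=0.35, m=100.0, periods=15)
adoptions = [path[0]] + [path[i] - path[i - 1] for i in range(1, len(path))]
```

The recursion produces the familiar S-curve: adoption accelerates while imitation dominates, peaks, then slows as the market potential m is approached, which is how the model answers whether a technology has reached its maximum penetration level.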
  8. By: Wagner Piazza Gaglianone; Waldyr Dutra Areosa
    Abstract: In this paper, we propose a methodology to construct a Financial Conditions Indicator (FCI) based on Brave and Butters (2011) and Aramonte et al. (2013). The main idea is to use a selected set of economic and financial time series and aggregate their information content into a single index that summarizes the overall financial conditions of the economy. This approach can be further employed to forecast economic activity. An empirical exercise for Brazil is provided to illustrate the methodology, in which a modified IS-type equation (substituting the FCI for the interest rate) is employed to produce point forecasts of the output gap. In addition, a standard quantile regression technique (e.g. Koenker, 2005) is used to construct density forecasts and generate fan charts of future economic activity. A risk analysis is conducted within this setup to compute the conditional probability of output growth being above or below a given scenario.
    Date: 2016–05
    URL: http://d.repec.org/n?u=RePEc:bcb:wpaper:435&r=for
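The quantile-regression fan chart idea can be sketched with a linear quantile regression fitted by subgradient descent on the pinball loss. The data, seed, and learning rate are illustrative assumptions; a production version would use a proper quantile regression solver (e.g. linear programming).

```python
import random

def fit_quantile_line(xs, ys, tau, lr=0.01, epochs=500):
    """Linear quantile regression via subgradient descent on the pinball
    loss rho_tau(u) = u * (tau - 1[u < 0]), with u = y - (a + b*x)."""
    a, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            u = y - (a + b * x)
            g = -tau if u > 0 else (1.0 - tau)  # subgradient w.r.t. the fitted value
            a -= lr * g
            b -= lr * g * x
    return a, b

# Hypothetical "output gap" data: linear trend plus noise.
random.seed(5)
xs = [i / 200.0 for i in range(200)]
ys = [0.5 * x + random.gauss(0.0, 0.3) for x in xs]
fan = {tau: fit_quantile_line(xs, ys, tau) for tau in (0.1, 0.5, 0.9)}
fan_at_1 = {tau: a + b for tau, (a, b) in fan.items()}  # fan-chart bands at x = 1
```

The three fitted lines give the lower band, central path, and upper band of a fan chart at each horizon, and the gap between the tau = 0.1 and tau = 0.9 lines quantifies the scenario risk around the point forecast.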

This nep-for issue is ©2016 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.