nep-for New Economics Papers
on Forecasting
Issue of 2017‒11‒26
nine papers chosen by
Rob J Hyndman
Monash University

  1. Forecasting Mortality: Some Recent Developments By Taku Yamamoto; Hiroaki Chigira
  2. Forecasting Stock Returns: A Predictor-Constrained Approach By Davide Pettenuzzo; Zhiyuan Pan; Yudong Wang
  3. Forecasting the real price of oil under alternative specifications of constant and time-varying volatility By Beili Zhu
  4. Asset volatility By Correia, Maria; Kang, Johnny; Richardson, Scott
  5. The accountability imperative for quantifying the uncertainty of emission forecasts: evidence from Mexico By Daniel PUIG; Oswaldo Morales-Napoles; Fatemeh Bakhtiari; Gissela Landa Rivera
  6. An Interdisciplinary View on Tax Revenue Estimates and Forecasts and its Impacts on a Multilevel Public Budget System By André W. Heinemann; Hanna Kotina; Maryna Stepura
  7. Residential investment and recession predictability By Knut Are Aastveit; André K. Anundsen; Eyo I. Herstad
  8. How Well Do Structural Demand Models Work? Counterfactual Predictions in School Choice By Parag A. Pathak; Peng Shi
  9. Adaptive Hierarchical Priors for High-Dimensional Vector Autoregressions By Davide Pettenuzzo; Dimitris Korobilis

  1. By: Taku Yamamoto (Hitotsubashi University); Hiroaki Chigira (Tohoku University)
    Abstract: Forecasting mortality has been a vital issue in demography and actuarial science. It also has profound implications for pension plans and for long-term economic forecasts of the nation. In the present paper we examine various forecasting methods for mortality in the framework of cointegrated time series analysis. The Lee-Carter (LC) method has been regarded as the benchmark for forecasting mortality. However, its forecasting accuracy is known to be particularly poor for short-term forecasts, even though it performs well for long-term forecasts. Recently, a new method called the multivariate time series variance component (MTV) method has been proposed which explicitly satisfies the cointegration restrictions of the series and thereby overcomes these weak points of the LC method. In the present paper we propose two new methods. The first is the modified MTV (mMTV) method, which modifies the MTV method in order to obtain a more accurate forecast of its trend component. The second is the all-component Lee-Carter (LCA) method, which generalizes the Lee-Carter method by using all principal components in order to improve the short-term forecasts of the LC method; it may be noted, however, that the LCA method does not satisfy the cointegration restrictions. We analytically compare the forecasting accuracy of the proposed methods with that of the Lee-Carter method and the MTV method in the framework of cointegrated time series. We further compare them in a Monte Carlo experiment and in an empirical application to forecasting mortality for Japanese males. The mMTV method is generally the most accurate both in the Monte Carlo experiment and in the Japanese data. The MTV method works almost as well, although, since its drift estimator is inefficient, it is slightly less accurate than the mMTV method on some occasions. The forecast accuracy of the LCA method is reasonably high and can match the mMTV method on occasion, but it is generally inferior to the MTV and mMTV methods. As expected, the LC method is the worst among the methods examined in the present study. The mMTV method is recommended for practical use.
    Keywords: Time series models, Forecasting methods, Cointegrated process, Mortality
    JEL: C01 C32 C53
    Date: 2017–10
    URL: http://d.repec.org/n?u=RePEc:sek:iacpro:5808110&r=for
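    To make the benchmark concrete, here is a minimal sketch of the Lee-Carter setup that the methods above extend, written in standard textbook notation (the symbols are the usual ones, not taken from the paper):

        \log m_{x,t} = a_x + b_x k_t + \varepsilon_{x,t},    k_t = k_{t-1} + d + u_t                    (LC)
        \log m_{x,t} = a_x + \sum_{j=1}^{p} b_x^{(j)} k_t^{(j)} + \varepsilon_{x,t}                     (LCA: all principal components)

    Here m_{x,t} is the death rate at age x in year t, k_t is the mortality index forecast as a random walk with drift d, and the mMTV modification is aimed at a more efficient estimate of this trend (drift) component.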
  2. By: Davide Pettenuzzo (Brandeis University); Zhiyuan Pan (Southwestern University of Finance and Economics, Institute of Chinese Financial Studies); Yudong Wang (School of Economics and Management, Nanjing University of Science and Technology)
    Abstract: We develop a novel method to impose constraints on univariate predictive regressions of stock returns. Unlike the previous approaches in the literature, we implement our constraints directly on the predictor, setting it to zero whenever its value falls below the variable's past 12-month high. Empirically, we find that relative to standard unconstrained predictive regressions, our approach leads to significantly larger forecasting gains, both in statistical and economic terms. We also show how a simple equal-weighted combination of the constrained forecasts leads to further improvements in forecast accuracy, with predictions that are more precise than those obtained either using the Campbell and Thompson (2008) or Pettenuzzo, Timmermann, and Valkanov (2014) methods. Subsample analysis and a large battery of robustness checks confirm that these findings are robust to the presence of model instabilities and structural breaks.
    Keywords: Equity premium, Predictive regressions, Predictor constraints, 12-month high
    JEL: C11 C22 G11 G12
    Date: 2017–10
    URL: http://d.repec.org/n?u=RePEc:brd:wpaper:115&r=for
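    A minimal sketch of the constraint described above, under the reading that the predictor is zeroed out whenever it sits below its own high over the preceding 12 months. Names and timing conventions are assumptions for illustration; the authors' exact implementation may differ:

        import pandas as pd

        def constrain_predictor(x: pd.Series, window: int = 12) -> pd.Series:
            """Zero out the predictor whenever it falls below its past `window`-month high."""
            past_high = x.rolling(window).max().shift(1)    # high over the preceding 12 months
            keep = (x >= past_high) | past_high.isna()      # leave the initial window unconstrained
            return x.where(keep, 0.0)

        # Hypothetical usage: `predictor` is a monthly series (e.g. the dividend yield);
        # the constrained series then enters a standard univariate predictive regression
        # of next month's excess return on the predictor.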
  3. By: Beili Zhu
    Abstract: This paper constructs a monthly real-time oil price dataset using backcasting and compares the forecast performance of alternative models of constant and time-varying volatility, based on the accuracy of point and density forecasts of real oil prices for both real-time and ex-post revised data. The paper considers Bayesian autoregressive and autoregressive moving average models with, respectively, constant volatility and two forms of time-varying volatility: GARCH and stochastic volatility. In addition to the standard time-varying models, more flexible models with volatility in mean and moving average innovations are used to forecast the real price of oil. The results show that time-varying volatility models dominate their counterparts with constant volatility in terms of point forecasting at longer horizons and density forecasting at all horizons. The inclusion of a moving average component provides a substantial improvement in point and density forecasting performance for both types of time-varying models, while stochastic volatility in mean is superfluous for forecasting oil prices.
    Keywords: Forecasting, oil price, real-time data, time-varying volatility, moving average, stochastic volatility in mean.
    JEL: C11 C53 C82 Q43
    Date: 2017–11
    URL: http://d.repec.org/n?u=RePEc:een:camaaa:2017-71&r=for
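    As a point of reference for the model classes compared above, here is a generic autoregressive specification with a moving-average innovation and stochastic volatility, in standard notation rather than the paper's own:

        y_t = c + \sum_{i=1}^{p} \phi_i y_{t-i} + \theta \varepsilon_{t-1} + \varepsilon_t,
        \varepsilon_t = \exp(h_t / 2) u_t,    u_t \sim N(0, 1),
        h_t = \mu_h + \rho_h (h_{t-1} - \mu_h) + \eta_t,    \eta_t \sim N(0, \sigma_\eta^2).

    The constant-volatility benchmarks hold h_t fixed, GARCH makes the conditional variance a deterministic function of past squared innovations, and the "in mean" extension adds a volatility term to the conditional mean equation.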
  4. By: Correia, Maria; Kang, Johnny; Richardson, Scott
    Abstract: We examine whether fundamental measures of volatility are incremental to market-based measures of volatility in (i) predicting bankruptcies (out of sample), (ii) explaining cross-sectional variation in credit spreads, and (iii) explaining future credit excess returns. Our fundamental measures of volatility include (i) historical volatility in profitability, margins, turnover, operating income growth, and sales growth, (ii) dispersion in analyst forecasts of future earnings, and (iii) quantile regression forecasts of the interquartile range of the distribution of profitability. We find robust evidence that these fundamental measures of volatility improve out-of-sample forecasts of bankruptcy and are useful in explaining cross-sectional variation in credit spreads. This suggests that an analysis of credit risk can be enhanced with a detailed analysis of fundamental information. As a test case of the benefit of volatility forecasting, we document an improved ability to forecast future credit excess returns, particularly when using fundamental measures of volatility.
    JEL: M40 F3 G3
    Date: 2017–07–21
    URL: http://d.repec.org/n?u=RePEc:ehl:lserod:84405&r=for
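    A small, self-contained sketch of the kind of quantile-regression volatility forecast mentioned above (forecasting the interquartile range of future profitability). The variable names and simulated data are hypothetical stand-ins, not the authors' specification:

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 500
        df = pd.DataFrame({"roa": rng.normal(0.05, 0.08, n),
                           "leverage": rng.uniform(0.0, 0.6, n)})
        # Next-period profitability: persistent, with dispersion rising in leverage (illustrative).
        df["roa_next"] = 0.7 * df["roa"] + rng.normal(0.0, 0.02 + 0.1 * df["leverage"], n)

        # Fit the 25th and 75th percentile regressions; their spread is the forecast IQR.
        q25 = smf.quantreg("roa_next ~ roa + leverage", df).fit(q=0.25)
        q75 = smf.quantreg("roa_next ~ roa + leverage", df).fit(q=0.75)
        df["iqr_forecast"] = q75.predict(df) - q25.predict(df)   # fundamental volatility proxy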
  5. By: Daniel PUIG (UNEP DTU Partnership, Copenhagen, Denmark); Oswaldo Morales-Napoles (Delft University of Technology, Netherlands); Fatemeh Bakhtiari (UNEP DTU Partnership, Copenhagen, Denmark); Gissela Landa Rivera (OFCE, Sciences Po Paris, France)
    Abstract: Governmental climate change mitigation targets are typically developed with the aid of forecasts of greenhouse-gas emissions. The robustness and credibility of such forecasts depend, among other issues, on the extent to which forecasting approaches can reflect prevailing uncertainties. We apply a transparent and replicable method to quantify the uncertainty associated with projections of gross domestic product growth rates for Mexico, a key driver of greenhouse-gas emissions in the country. We use those projections to produce probabilistic forecasts of greenhouse-gas emissions for Mexico. We contrast our probabilistic forecasts with Mexico’s governmental deterministic forecasts. We show that, because they fail to reflect such key uncertainties, deterministic forecasts are ill-suited for use in target-setting processes. We argue that (i) guidelines should be agreed upon to ensure that governmental forecasts meet certain minimum transparency and quality standards, and (ii) governments should be held accountable for the appropriateness of the forecasting approach applied to prepare governmental forecasts, especially when those forecasts are used to derive climate change mitigation targets.
    Keywords: Uncertainty, projections, structured expert judgment, accountability, emission-reduction targets, gross domestic product growth rates
    JEL: Q25 Q38 Q48
    Date: 2017–09
    URL: http://d.repec.org/n?u=RePEc:fce:doctra:1717&r=for
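    A deliberately simplified sketch of how probabilistic growth projections can be propagated into an emissions fan chart, in the spirit of the exercise described above. Every number below, including the growth distribution and the emissions-intensity path, is an illustrative placeholder rather than a value from the paper (whose growth uncertainty is elicited via structured expert judgment):

        import numpy as np

        rng = np.random.default_rng(0)
        horizon, n_draws = 13, 10_000                            # illustrative forecast horizon
        growth = rng.normal(0.025, 0.010, (n_draws, horizon))    # placeholder GDP growth draws
        gdp = np.cumprod(1.0 + growth, axis=1)                   # GDP index paths (base year = 1)
        intensity = np.linspace(1.0, 0.85, horizon)              # assumed emissions per unit of GDP
        emissions = gdp * intensity                              # emissions index paths
        p10, p50, p90 = np.percentile(emissions[:, -1], [10, 50, 90])
        print(f"end-of-horizon emissions index: p10={p10:.2f}, median={p50:.2f}, p90={p90:.2f}")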
  6. By: André W. Heinemann (University of Bremen); Hanna Kotina (Kyiv National Economic University); Maryna Stepura (Kyiv National Economic University)
    Abstract: While tax revenue forecasts are required for the public budget planning and execution process, the frameworks and accuracy of tax revenue forecasts are crucial for the economic analysis of public budgets in multilevel systems. Insufficient or defective tax revenue forecasts can lead to budget problems as well as budget interdependencies in multilevel systems. Determining the amount of budget revenue that can actually be realized requires that tax revenues be forecast reasonably and accurately. The adequacy and feasibility of the relevant indicators depend on the assessment of the state, trends and forecasts of economic and social development, the stability and progressiveness of the current legislation, the forms and methods of tax mobilization, the level of fiscal culture and other factors. The role of tax revenue forecasting is thus significantly enhanced. Fiscal equalization schemes, grant systems and bailout rules have to be taken into account when the accuracy of tax revenue forecasts becomes a problem. If medium-term tax revenue forecasts are upward biased, the institutional setting can be an explanation for the forecast errors (Breuer 2014). However, over-optimistic as well as under-optimistic forecasts influence budgeting and budget targets. The present paper deals with the conditions and institutional frameworks for accurate tax revenue forecasts, especially in a medium-income and a high-income country. First, we present a literature review on tax revenue forecasting and the importance of institutional performance for accurate tax revenue forecasts; empirical studies explaining forecast errors are analyzed. In a comparison of two countries, the second section describes the institutional settings and procedures for tax revenue forecasting in Ukraine and Germany and shows the methods, actors and institutional mechanisms in these different multilevel systems. The analysis focuses on the degree of decentralization in both countries and on legal equalization schemes. We show some determinants of tax revenue forecast errors and discuss the impacts and consequences for budgetary planning and budget management in multilevel systems. Our findings point to the importance of fiscal governance in multilevel systems when tax revenue forecasts are influenced by many determinants in ways that make forecasting difficult. Multilevel fiscal governance is required to solve problems of tax revenue forecasting and budgeting in decentralized systems. In addition, knowledge of taxpayers' behavior (households, employees, consumers, firms) under conditions of globalized taxation is currently underestimated, but is needed to improve public budget management processes.
    Keywords: Tax estimates, Tax forecasts, Budget planning, Multilevel governance
    JEL: H11 H70 H77
    Date: 2017–10
    URL: http://d.repec.org/n?u=RePEc:sek:iacpro:5807719&r=for
  7. By: Knut Are Aastveit (Norges Bank (Central Bank of Norway) and BI Norwegian Business School); André K. Anundsen (Norges Bank (Central Bank of Norway)); Eyo I. Herstad (University of Chicago)
    Abstract: We assess the importance of residential investment in predicting economic recessions for an unbalanced panel of 12 OECD countries over the period 1960Q1-2014Q4. Our approach is to estimate various probit models with different leading indicators and evaluate their relative prediction accuracy using the receiver operating characteristic curve. We document that residential investment contains information useful in predicting recessions both in-sample and out-of-sample. This result is robust to adding typical leading indicators, such as the term spread, stock prices, consumer confidence surveys and oil prices. It is shown that residential investment is particularly useful in predicting recessions for countries with high home-ownership rates. Finally, in a separate exercise for the US economy, we show that the predictive ability of residential investment is robust to employing real-time data.
    Keywords: Recession predictability, Housing, Leading indicators, Real-time data
    JEL: C33 C53 E32 E37
    Date: 2017–11–16
    URL: http://d.repec.org/n?u=RePEc:bno:worpap:2017_24&r=for
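    A compact sketch of the evaluation approach described above: a probit recession model scored by the area under the receiver operating characteristic (ROC) curve. The data below are simulated placeholders; the paper's specifications, horizons and out-of-sample design are richer:

        import numpy as np
        import statsmodels.api as sm
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(1)
        n = 220                                              # roughly 55 years of quarterly data
        res_inv_growth = rng.normal(0.0, 1.0, n)             # lagged residential investment growth
        term_spread = rng.normal(1.0, 1.0, n)                # a typical additional leading indicator
        p_rec = 1.0 / (1.0 + np.exp(2.0 + 1.5 * res_inv_growth))   # illustrative data-generating process
        recession = (rng.uniform(size=n) < p_rec).astype(int)

        X = sm.add_constant(np.column_stack([res_inv_growth, term_spread]))
        fit = sm.Probit(recession, X).fit(disp=0)
        auc = roc_auc_score(recession, fit.predict(X))       # in-sample area under the ROC curve
        print(f"AUROC: {auc:.2f}")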
  8. By: Parag A. Pathak; Peng Shi
    Abstract: Discrete choice demand models are widely used for counterfactual policy simulations, yet their out-of-sample performance is rarely assessed. This paper uses a large-scale policy change in Boston to investigate the performance of discrete choice models of school demand. In 2013, Boston Public Schools considered several new choice plans that differ in where applicants can apply. At the request of the mayor and district, we forecast the alternatives' effects by estimating discrete choice models. This work led to the adoption of a plan which significantly altered choice sets for thousands of applicants. Pathak and Shi (2014) update forecasts prior to the policy change and describe prediction targets involving access, travel, and unassigned students. Here, we assess how well these ex ante counterfactual predictions compare with actual outcomes under the new choice sets. We find that a simple ad hoc model performs as well as the more complicated structural choice models for one of the two grades we examine. However, the structural models' inconsistent performance is largely due to prediction errors in applicant characteristics, which are auxiliary inputs. Once we condition on the actual applicant characteristics, the structural choice models outperform the ad hoc alternative in predicting both choice patterns and policy-relevant outcomes. Moreover, refitting the models using the new choice data does not significantly improve their prediction accuracy, suggesting that the choice models are indeed “structural.” Our findings show that structural demand models can effectively predict counterfactual outcomes, as long as there are accurate forecasts of the auxiliary input variables.
    JEL: C10 C78 D12 I20
    Date: 2017–11
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:24017&r=for
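    The discrete choice machinery being stress-tested above is, at its core, a random-utility model. A generic conditional logit version in standard notation (not the authors' exact specification) makes the counterfactual step explicit:

        u_{ij} = x_{ij}'\beta + \varepsilon_{ij},
        P(i \text{ chooses } j) = \frac{\exp(x_{ij}'\beta)}{\sum_{k \in C_i} \exp(x_{ik}'\beta)},

    where x_{ij} collects school and match characteristics (e.g. distance) and C_i is applicant i's menu of schools. A counterfactual plan changes C_i to a new menu C_i'; predictions come from re-evaluating the fitted probabilities over the new menus, which is why accurate forecasts of the applicant characteristics entering x_{ij} matter so much for out-of-sample performance.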
  9. By: Davide Pettenuzzo (Brandeis University); Dimitris Korobilis (University of Essex)
    Abstract: This paper proposes a scalable and simulation-free estimation algorithm for vector autoregressions (VARs) that allows fast approximate calculation of marginal posterior distributions. We apply the algorithm to derive analytical expressions for popular Bayesian shrinkage priors that admit a hierarchical representation and that would typically require computationally intensive posterior simulation methods. The proposed algorithm is modular, parallelizable, and scales linearly with the number of predictors, allowing fast and efficient estimation of large Bayesian VARs. The benefits of our approach are explored in a series of exercises. First, we document the computational gains of the proposed estimation algorithm and priors. Second, a forecasting exercise involving VARs estimated on macroeconomic data demonstrates the ability of hierarchical shrinkage priors to find useful parsimonious representations. Finally, we show that our approach can be used successfully for structural analysis and can replicate important features of structural shocks predicted by economic theory.
    Keywords: Bayesian VARs, Mixture prior, Large datasets, Macroeconomic forecasting
    JEL: C11 C13 G32
    URL: http://d.repec.org/n?u=RePEc:brd:wpaper:116&r=for
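    A stylized statement of the object being estimated, in generic notation (not the authors' exact prior), to make the abstract concrete. Each VAR coefficient receives its own shrinkage scale drawn from a common hierarchical prior, and the algorithm described above approximates the marginal posteriors analytically rather than by simulation:

        y_t = c + B_1 y_{t-1} + \dots + B_p y_{t-p} + \varepsilon_t,    \varepsilon_t \sim N(0, \Sigma),
        \beta_j \mid \psi_j \sim N(0, \psi_j),    \psi_j \sim \pi(\psi_j),

    where \beta_j is a typical element of (B_1, ..., B_p), \psi_j is its local shrinkage scale, and \pi(\cdot) is the hierarchical (mixture) hyperprior; heavy-tailed choices of \pi shrink irrelevant coefficients aggressively while leaving important ones largely untouched.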

This nep-for issue is ©2017 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.