nep-for New Economics Papers
on Forecasting
Issue of 2017‒01‒29
thirteen papers chosen by
Rob J Hyndman
Monash University

  1. The contribution of jumps to forecasting the density of returns By Christophe Chorro; Florian Ielpo; Benoît Sévi
  2. How Biased Are U.S. Government Forecasts of the Federal Debt? By Neil R. Ericsson
  3. Forecasting the volatility of Nikkei 225 futures By Manabu Asai; Michael McAleer
  4. Statistical and Economic Evaluation of Time Series Models for Forecasting Arrivals at Call Centers By Bastianin, Andrea; Galeotti, Marzio; Manera, Matteo
  5. A Unified Framework for Dimension Reduction in Forecasting By Alessandro Barbarino; Efstathia Bura
  6. Forecasting economic activity in data-rich environment By Maxime Leroux; Rachidi Kotchoni; Dalibor Stevanovic
  7. Electricity prices forecasting by averaging dynamic factor models By García-Martos, Carolina; Bastos, Guadalupe; Alonso Fernández, Andrés Modesto
  8. “Regional tourism demand forecasting with machine learning models: Gaussian process regression vs. neural network models in a multiple-input multiple-output setting” By Oscar Claveria; Enric Monte; Salvador Torra
  9. Intuitive and reliable estimates of the output gap from a Beveridge-Nelson Filter By Gunes Kamber; James Morley; Benjamin Wong
  10. An Improved Equation for Predicting Canadian Non-Commodity Exports By Patrick Alexander; Jean-Philippe Cayen; Alex Proulx
  11. Forecasting UK Income Tax By Zara Ghodsi; Allan Webster
  12. Prediction of Extreme Price Occurrences in the German Day-ahead Electricity Market By Hagfors, Lars Ivar; Kamperud , Hilde Horthe; Paraschiv, Florentina; Prokopczuk, Marcel; Sator, Alma; Westgaard, Sjur
  13. A new approach to volatility modeling: the High-Dimensional Markov model By Arnaud Dufays; Maciej Augustyniak; Luc Bauwens

  1. By: Christophe Chorro (Centre d'Economie de la Sorbonne); Florian Ielpo (Unigestion SA, Centre d'Economie de la Sorbonne et IPAG Business School); Benoît Sévi (LEMNA)
    Abstract: The extraction of the jump component in the dynamics of asset prices has attracted a considerable and growing body of literature. Of particular interest is the decomposition of returns' quadratic variation into its continuous and jump components. Recent contributions highlight the importance of the jump component in forecasting volatility at different horizons. In this article, we extend a methodology developed in Maheu and McCurdy (2011) to exploit the information content of intraday data in forecasting the density of returns at horizons up to sixty days. We follow Boudt et al. (2011) to detect intraday returns that should be considered as jumps. The methodology is robust to intra-week periodicity and further delivers estimates of signed jumps, in contrast to the rest of the literature, where only the squared jump component can be estimated. We then estimate a bivariate model of returns and volatilities in which the jump component is modeled independently, using a jump distribution that fits the stylized facts of the estimated jumps. Our empirical results for S&P 500 futures, U.S. 10-year Treasury futures, the USD/CAD exchange rate and WTI crude oil futures highlight the importance of the continuous/jump decomposition for density forecasting, although this is not the case for volatility point forecasts. In particular, we show that the model treating jumps separately from the continuous component consistently delivers better density forecasts for forecasting horizons ranging from 1 to 30 days.
    Keywords: density forecasting; jumps; realized volatility; bipower variation; median realized volatility; leverage effect
    JEL: C15 C32 C53 G1
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:mse:cesdoc:17006&r=for
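    The paper above relies on splitting realized quadratic variation into continuous and jump parts. The abstract does not give the exact estimator (it uses signed jumps via Boudt et al.), but the standard squared-jump split via bipower variation can be sketched as follows; the example returns are made up for illustration:

    ```python
    import numpy as np

    def jump_component(intraday_returns):
        """Split realized variance into continuous and jump parts.

        RV (realized variance) is the sum of squared intraday returns.
        BV (bipower variation) = (pi/2) * sum |r_i||r_{i-1}| is robust to
        jumps and estimates the continuous component; the truncated
        difference RV - BV estimates the squared jump contribution.
        """
        r = np.asarray(intraday_returns, dtype=float)
        rv = np.sum(r ** 2)
        bv = (np.pi / 2.0) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))
        jump = max(rv - bv, 0.0)  # truncate at zero: squared jumps are non-negative
        return rv, bv, jump

    # A hypothetical day with one large (jump-like) return among small moves:
    rv, bv, jump = jump_component([0.001] * 50 + [0.02] + [0.001] * 49)
    ```

    On a day with no large moves, RV and BV roughly coincide and the estimated jump component is truncated to zero.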
  2. By: Neil R. Ericsson (Board of Governors of the Federal Reserve System)
    Abstract: Government debt and forecasts thereof attracted considerable attention during the recent financial crisis. The current paper analyzes potential biases in different U.S. government agencies’ one-year-ahead forecasts of U.S. gross federal debt over 1984–2012. Standard tests typically fail to detect biases in these forecasts. However, impulse indicator saturation (IIS) detects economically large and highly significant time-varying biases, particularly at turning points in the business cycle. These biases do not appear to be politically related. IIS defines a generic procedure for examining forecast properties; it explains why standard tests fail to detect bias; and it provides a mechanism for potentially improving forecasts.
    Keywords: Autometrics, bias, debt, federal government, forecasts, impulse indicator saturation, heteroscedasticity, projections, United States.
    JEL: H68 C53
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:gwc:wpaper:2017-001&r=for
  3. By: Manabu Asai (Faculty of Economics Soka University, Japan.); Michael McAleer
    Abstract: For forecasting the volatility of futures returns, the paper proposes an indirect method based on the relationship between the futures contract and the underlying asset, for both returns and time-varying volatility. For volatility forecasting, the paper considers a stochastic volatility model with asymmetry and long memory, using high-frequency data for the underlying asset. Empirical results for Nikkei 225 futures indicate that the adjusted R2 supports the appropriateness of the indirect method, and that the new method based on stochastic volatility models with asymmetry and long memory outperforms the forecasting model based on the direct method using the pseudo long time series.
    Keywords: Forecasting, Volatility, Futures, Realized volatility, Realized kernel, Leverage effects, Long memory.
    JEL: C22 C53 C58 G17
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:ucm:doicae:1707&r=for
  4. By: Bastianin, Andrea; Galeotti, Marzio; Manera, Matteo
    Abstract: Call center managers are interested in obtaining accurate forecasts of call arrivals because these are a key input in staffing and scheduling decisions. Their ability to achieve an optimal balance between service quality and operating costs therefore ultimately hinges on forecast accuracy. We present a strategy for model selection in call centers based on three pillars: (i) a flexible loss function; (ii) statistical evaluation of forecast accuracy; (iii) economic evaluation of forecast performance using money metrics. We implement fourteen time series models and seven forecast combination schemes on three series of call arrivals. We show that second moment modeling is important when forecasting call arrivals. From the point of view of a call center manager, our results indicate that outsourcing the development of a forecasting model is worth its cost, since the simple Seasonal Random Walk model is always outperformed by other, relatively more sophisticated, specifications.
    Keywords: ARIMA; Call center arrivals; Loss function; Seasonality; Telecommunications forecasting.
    JEL: C22 C25 C53 D81 M15
    Date: 2016–12
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:76308&r=for
  5. By: Alessandro Barbarino; Efstathia Bura
    Abstract: Factor models are widely used in summarizing large datasets with few underlying latent factors and in building time series forecasting models for economic variables. In these models, the reduction of the predictors and the modeling and forecasting of the response y are carried out in two separate and independent phases. We introduce a potentially more attractive alternative, Sufficient Dimension Reduction (SDR), that summarizes x as it relates to y, so that all the information in the conditional distribution of y|x is preserved. We study the relationship between SDR and popular estimation methods, such as ordinary least squares (OLS), dynamic factor models (DFM), partial least squares (PLS) and RIDGE regression, and establish the connection and fundamental differences between the DFM and SDR frameworks. We show that SDR significantly reduces the dimension of widely used macroeconomic series data with one or two sufficient reductions delivering similar forecasting performance to that of competing methods in macro-forecasting.
    Keywords: Diffusion Index ; Dimension Reduction ; Factor Models ; Forecasting ; Partial Least Squares ; Principal Components
    JEL: C32 C53 C55 E17
    Date: 2017–01–12
    URL: http://d.repec.org/n?u=RePEc:fip:fedgfe:2017-04&r=for
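    The abstract above describes sufficient dimension reduction (SDR), which summarizes x as it relates to y. The paper's exact estimator is not given in the abstract; a classical SDR method, sliced inverse regression (SIR), can be sketched as a rough illustration of the idea (all names and the simulated data are this sketch's own):

    ```python
    import numpy as np

    def sliced_inverse_regression(X, y, n_slices=10, n_directions=2):
        """Estimate sufficient-dimension-reduction directions via SIR.

        Whitens the predictors, slices the sample by quantiles of y,
        forms the between-slice covariance of the slice means, and maps
        its top eigenvectors back to the original predictor scale.
        """
        X = np.asarray(X, float)
        y = np.asarray(y, float)
        n, p = X.shape
        mu = X.mean(axis=0)
        cov = np.cov(X, rowvar=False)
        vals, vecs = np.linalg.eigh(cov)
        inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T  # Sigma^{-1/2}
        Z = (X - mu) @ inv_sqrt
        order = np.argsort(y)
        slices = np.array_split(order, n_slices)
        M = np.zeros((p, p))
        for idx in slices:
            m = Z[idx].mean(axis=0)
            M += (len(idx) / n) * np.outer(m, m)
        w, v = np.linalg.eigh(M)  # eigenvalues ascending
        return inv_sqrt @ v[:, ::-1][:, :n_directions]

    # Simulated example: y depends on X only through one linear combination.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 5))
    beta = np.array([1.0, 2.0, 0.0, 0.0, 0.0])
    y = X @ beta + 0.1 * rng.normal(size=2000)
    direction = sliced_inverse_regression(X, y, n_directions=1)[:, 0]
    ```

    In this simulation the leading SIR direction recovers (up to sign and scale) the single linear combination of predictors that drives y.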
  6. By: Maxime Leroux; Rachidi Kotchoni; Dalibor Stevanovic
    Abstract: This paper compares the performance of five classes of forecasting models in an extensive out-of-sample exercise. The types of models considered are standard univariate models, factor-augmented regressions, dynamic factor models, other data-rich models and forecast combinations. These models are compared using four types of data: real series, nominal series, the stock market index and exchange rates. Our findings can be summarized in a few points: (i) data-rich models and forecast combination approaches are the best for predicting real series; (ii) the ARMA(1,1) model predicts the change in inflation remarkably well and outperforms data-rich models; (iii) the simple average of forecasts is the best approach to predicting future S&P 500 returns; (iv) exchange rates can be predicted at short horizons mainly by univariate models, but the random walk dominates at medium and long horizons; (v) the optimal structure of forecasting equations changes substantially over time; and (vi) the dispersion of out-of-sample point forecasts is a good predictor of some macroeconomic and financial uncertainty measures, as well as of business cycle movements among real activity series.
    Keywords: Forecasting, Factor Models, Data-rich environment, Model averaging
    JEL: C55 C32 E17
    Date: 2017–01–25
    URL: http://d.repec.org/n?u=RePEc:cir:cirwor:2017s-05&r=for
  7. By: García-Martos, Carolina; Bastos, Guadalupe; Alonso Fernández, Andrés Modesto
    Abstract: In the context of the liberalization of electricity markets, forecasting prices is essential. With this aim, research has evolved to model the particularities of electricity prices. In particular, Dynamic Factor Models have been quite successful in the task, both in the short and long run. However, specifying a single model for the unobserved factors is difficult, and it cannot be guaranteed that such a model exists. In this paper, Model Averaging is employed to overcome this difficulty, with the expectation that electricity prices would be better forecast by a combination of models for the factors than by a single model. Although our procedure is applicable to other markets, it is illustrated with applications to forecasting spot prices of the Iberian Electricity Market (MIBEL) and the Italian market. Three combinations of forecasts are successful in providing improved results for alternative forecasting horizons.
    Keywords: Forecast combination; Bayesian model averaging; Electricity prices; Dimensionality reduction
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:24028&r=for
  8. By: Oscar Claveria (AQR-IREA, University of Barcelona); Enric Monte (Polytechnic University of Catalunya (UPC)); Salvador Torra (Riskcenter-IREA, University of Barcelona)
    Abstract: This study presents a multiple-input multiple-output (MIMO) approach for multi-step-ahead time series prediction with a Gaussian process regression (GPR) model. We assess the forecasting performance of the GPR model with respect to several neural network architectures. The MIMO setting allows modelling the cross-correlations between all regions simultaneously. We find that the radial basis function (RBF) network outperforms the GPR model, especially for long-term forecast horizons. As the memory of the models increases, the forecasting performance of the GPR improves, suggesting the value of designing a model selection criterion to estimate the optimal number of lags used for concatenation.
    Keywords: Regional forecasting, tourism demand, multiple-input multiple-output (MIMO), Gaussian process regression, neural networks, machine learning
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:aqr:wpaper:201701&r=for
  9. By: Gunes Kamber; James Morley; Benjamin Wong
    Abstract: The Beveridge-Nelson (BN) trend-cycle decomposition based on autoregressive forecasting models of U.S. quarterly real GDP growth produces estimates of the output gap that are strongly at odds with widely-held beliefs about the amplitude, persistence, and even sign of transitory movements in economic activity. These antithetical attributes are related to the autoregressive coefficient estimates implying a very high signal-to-noise ratio in terms of the variance of trend shocks as a fraction of the overall quarterly forecast error variance. When we impose a lower signal-to-noise ratio, the resulting BN decomposition, which we label the “BN filter”, produces a more intuitive estimate of the output gap that is large in amplitude, highly persistent, and typically increases in expansions and decreases in recessions. Real-time estimates from the BN filter are also reliable in the sense that they are subject to smaller revisions and predict future output growth and inflation better than estimates from other methods of trend-cycle decomposition that also impose a low signal-to-noise ratio, including deterministic detrending, the Hodrick-Prescott filter, and the bandpass filter.
    Keywords: Beveridge-Nelson decomposition, output gap, signal-to-noise ratio
    JEL: C18 E17 E32
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:een:camaaa:2017-03&r=for
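    The BN decomposition underlying the paper above has a simple closed form when growth follows an AR(1); the paper's "BN filter" corresponds to restricting the implied signal-to-noise ratio rather than estimating it freely. A minimal sketch of the AR(1) case (the function name and simulated data are this sketch's own; the paper works with richer autoregressions):

    ```python
    import numpy as np

    def bn_decomposition_ar1(y, phi=None):
        """Beveridge-Nelson trend/cycle from an AR(1) model of growth.

        With growth g_t = c + phi*(g_{t-1} - c) + e_t, the BN trend is
        tau_t = y_t + phi/(1-phi) * (g_t - c) and the cycle is y_t - tau_t.
        Passing a fixed phi (instead of estimating it) is one way to
        impose a chosen signal-to-noise ratio.
        """
        y = np.asarray(y, float)
        g = np.diff(y)
        c = g.mean()
        if phi is None:
            # OLS estimate of the AR(1) coefficient on demeaned growth
            x = g[:-1] - c
            phi = np.dot(x, g[1:] - c) / np.dot(x, x)
        cycle = -(phi / (1.0 - phi)) * (g - c)
        trend = y[1:] - cycle
        return trend, cycle, phi

    # Simulated log-level series whose growth is AR(1) with phi = 0.5:
    rng = np.random.default_rng(1)
    g = np.zeros(500)
    for t in range(1, 500):
        g[t] = 0.5 * g[t - 1] + rng.normal()
    y = np.cumsum(g)
    trend, cycle, phi_hat = bn_decomposition_ar1(y)
    ```

    By construction, trend and cycle sum back to the observed series, and a larger phi implies a larger, more persistent cycle.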
  10. By: Patrick Alexander; Jean-Philippe Cayen; Alex Proulx
    Abstract: We estimate two new equations for Canadian non-commodity exports (NCX) that incorporate three important changes relative to the current equation used at the Bank of Canada. First, we develop two new foreign activity measures (FAMs), which add new components to the FAM currently used at the Bank of Canada. The first measure adds US exports and US government expenditures, and the second adds US industrial production. These new FAMs calibrate the weights on the various components based on the 2014 World Input-Output Database to avoid the instability problem that arises when the equations are estimated. Second, we add a new variable to the equations, the trend of Canada’s manufacturing share of output, to control for structural or competitiveness factors that affect Canada’s global import market share. Third, the relative price of exports is determined by a new measure of the Canadian real effective exchange rate developed by Barnett, Charbonneau and Poulin-Bellisle (2016). We find that the new equations improve the in-sample fit and the out-of-sample forecast accuracy relative to the current equation specified in “LENS,” a forecasting model used at the Bank of Canada.
    Keywords: Balance of payments and components, Exchange rates, International topics
    JEL: F10 F14 F17
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:bca:bocadp:17-1&r=for
  11. By: Zara Ghodsi (Statistical Research Centre, Bournemouth University); Allan Webster (Statistical Research Centre, Bournemouth University)
    Abstract: The literature on forecasting tax revenues focuses on the need for a body of competing forecasts independent of government, to limit potential political bias. The Office for Budget Responsibility does provide detailed independent forecasts for the UK, but there are limited alternatives. The literature on appropriate techniques for forecasting detailed tax revenues is under-developed. In many countries tax revenue forecasts are embedded in a more extensive macro-economic forecasting model. These models lack sufficient precision for forecasting revenues from several specific taxes, and are too involved to support a body of competing independent forecasts. In consequence, there is an established need for single-equation revenue forecasts for specific taxes to complement the macro-economic approach. This study considers the use of a number of (mainly) time series forecasting techniques. We find Recurrent Singular Spectrum Analysis (RSSA) to perform the best of the techniques considered.
    Keywords: United Kingdom; Income Tax; Forecasting; Singular Spectrum Analysis; ARIMA; Exponential Smoothing; Neural Networks
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:bam:wpaper:bafes07&r=for
  12. By: Hagfors, Lars Ivar; Kamperud , Hilde Horthe; Paraschiv, Florentina; Prokopczuk, Marcel; Sator, Alma; Westgaard, Sjur
    Abstract: Understanding the mechanisms that drive extreme negative and positive prices in day-ahead electricity prices is crucial for managing risk and market design. In this paper, we consider the problem of understanding how fundamental drivers impact the probability of extreme price occurrences in the German day-ahead electricity market. We develop models using fundamental variables to predict the probability of extreme prices. The dynamics of negative prices and positive price spikes differ greatly. Positive spikes are related to high demand, low supply, and high prices in the previous days, and mainly occur during the morning and afternoon peak hours. Negative prices occur mainly during the night, and are closely related to low demand combined with high wind production levels. Furthermore, we analyse more closely how renewable energy sources, namely photovoltaic and wind power, impact the probability of negative prices and positive spikes. The models confirm that extremely high and negative prices have different drivers, and that wind power is particularly important in relation to negative price occurrences. The models capture the main drivers of both positive and negative extreme price occurrences, and forecast these probabilities accurately with high levels of confidence. Our results suggest that probability models are well suited to aid in risk management for market participants in day-ahead electricity markets.
    Keywords: Energy Markets, Fundamental Analysis, Spikes, EPEX
    Date: 2016–07
    URL: http://d.repec.org/n?u=RePEc:usg:sfwpfi:2016:22&r=for
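    The abstract above describes probability models of extreme prices driven by fundamentals, without giving the exact specification. As a rough illustration of the idea, a simple logit linking the probability of a negative-price hour to two hypothetical, standardized fundamentals (demand and wind generation) can be sketched; the data-generating process and coefficients below are entirely made up:

    ```python
    import numpy as np

    def fit_logit(X, y, lr=0.1, n_iter=2000):
        """Fit P(extreme price) = sigmoid(X @ w + b) by gradient descent."""
        X = np.asarray(X, float)
        y = np.asarray(y, float)
        n, p = X.shape
        w = np.zeros(p)
        b = 0.0
        for _ in range(n_iter):
            prob = 1.0 / (1.0 + np.exp(-(X @ w + b)))
            w -= lr * (X.T @ (prob - y)) / n   # gradient of the mean log-loss
            b -= lr * (prob - y).mean()
        return w, b

    # Hypothetical standardized fundamentals per hour:
    rng = np.random.default_rng(42)
    n = 5000
    demand = rng.normal(size=n)
    wind = rng.normal(size=n)
    # Assumed process: negative prices more likely with low demand, high wind
    z_true = -1.5 - 1.0 * demand + 1.2 * wind
    y = (rng.random(n) < 1.0 / (1.0 + np.exp(-z_true))).astype(float)
    w, b = fit_logit(np.column_stack([demand, wind]), y)
    ```

    The fitted coefficients recover the assumed signs: negative on demand, positive on wind, matching the abstract's finding that negative prices coincide with low demand and high wind production.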
  13. By: Arnaud Dufays; Maciej Augustyniak; Luc Bauwens
    Abstract: A new model, the high-dimensional Markov (HDM) model, is proposed for financial returns and their latent variances. It can also be used to model realized variances directly. Volatility is modeled as a product of three components: a Markov chain driving volatility persistence, an independent discrete process capable of generating jumps in the volatility, and a predictable (data-driven) process capturing the leverage effect. The Markov chain and jump components allow volatility to switch abruptly between thousands of states. The transition probability matrix of the Markov chain is structured in such a way that the multiplicity of the second largest eigenvalue can be greater than one. This distinctive feature generates a high degree of volatility persistence. The statistical properties of the HDM model are derived and an economic interpretation is attached to each component. In-sample results on six financial time series highlight that the HDM model compares favorably to the main existing volatility processes. A forecasting experiment shows that the HDM model significantly outperforms its competitors when predicting volatility over time horizons longer than five days.
    Keywords: Volatility, Markov-switching, Persistence, Leverage effect.
    JEL: C22 C51 C58
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:lvl:crrecr:1609&r=for

This nep-for issue is ©2017 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.