nep-for New Economics Papers
on Forecasting
Issue of 2013‒04‒27
seventeen papers chosen by
Rob J Hyndman
Monash University

  1. Was the Recent Downturn in US GDP Predictable? By Mehmet Balcilar; Rangan Gupta; Anandamayee Majumdar; Stephen M. Miller
  2. Forecasting with Mixed Frequency Samples: The Case of Common Trends By Peter Fuleky; Carl S. Bonham
  3. The Out-of-Sample Forecasting Performance of Non-Linear Models of Regional Housing Prices in the US By Mehmet Balcilar; Rangan Gupta; Stephen M. Miller
  4. Are Forecast Updates Progressive? By Chang, Chia-Lin; Franses, Philip Hans; McAleer, Michael
  5. The empirical (ir)relevance of the interest rate assumption for central bank forecasts By Knüppel, Malte; Schultefrankenfeld, Guido
  6. Bottom-up or Direct? Forecasting German GDP in a Data-rich Environment By Drechsel, Katja; R. Scheufele
  7. Analyzing Fixed-Event Forecast Revisions By Chia-Lin Chang; Bert de Bruijn; Philip Hans Franses; Michael McAleer
  8. Comparative study of static and dynamic neural network models for nonlinear time series forecasting By Abounoori, Abbas Ali; Mohammadali, Hanieh; Gandali Alikhani, Nadiya; Naderi, Esmaeil
  9. Does long memory matter in forecasting oil price volatility? By Delavari, Majid; Gandali Alikhani, Nadiya; Naderi, Esmaeil
  10. Censored Posterior and Predictive Likelihood in Bayesian Left-Tail Prediction for Accurate Value at Risk Estimation By Lukasz Gatarek; Lennart Hoogerheide; Koen Hooning; Herman K. van Dijk
  11. Estimating investors' behavior and errors in probabilistic forecasts by the Kolmogorov entropy and noise colors of non-hyperbolic attractors By Dominique, C-Rene
  12. Parallel Sequential Monte Carlo for Efficient Density Combination: The DeCo Matlab Toolbox By Roberto Casarin; Stefano Grassi; Francesco Ravazzolo; Herman K. van Dijk
  13. Modeling dynamic diurnal patterns in high frequency financial data By Ito, Ryoko
  14. Robust Predictions in Games with Incomplete Information By Dirk Bergemann; Stephen Morris
  15. Quantifying Heterogeneous Survey Expectations: The Carlson-Parkin Method Revisited By Kajal Lahiri; Yongchen Zhao
  16. Planning of sales on the example of companies in the paper industry and wholesale of chemical products. By Monika Brzezińska; Katarzyna Guhn
  17. Adding Ideology to the Equation: New Predictions for Election Results under Compulsory Voting By Fernanda L L de Leon

  1. By: Mehmet Balcilar (Department of Economics, Eastern Mediterranean University); Rangan Gupta (Department of Economics, University of Pretoria); Anandamayee Majumdar (Department of Biostatistics, University of North Texas Health Science Center); Stephen M. Miller (Department of Economics, University of Nevada, Las Vegas)
    Abstract: This paper uses a small set of variables -- real GDP, the inflation rate, and the short-term interest rate -- and a rich set of models -- atheoretical and theoretical, linear and nonlinear, as well as classical and Bayesian -- to consider whether we could have predicted the recent downturn of US real GDP. Comparing the models' root mean squared errors to those of the benchmark random-walk model, the two theoretical models, especially the nonlinear one, perform well on average across all forecast horizons in out-of-sample forecasts, although at specific forecast horizons certain nonlinear atheoretical models perform best. The nonlinear theoretical model also dominates in our ex ante forecast of the Great Recession, suggesting that developing forward-looking, microfounded, nonlinear, dynamic-stochastic-general-equilibrium models of the economy may prove crucial in forecasting turning points.
    Keywords: Forecasting, Linear and non-linear models, Great Recession
    JEL: C32 E37
    Date: 2012–12
    URL: http://d.repec.org/n?u=RePEc:nlv:wpaper:1210&r=for
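    A minimal Python sketch of the out-of-sample RMSE comparison against a random-walk benchmark described above (the simulated series, the AR(1)-in-growth stand-in model, and the evaluation window are illustrative assumptions, not the paper's specifications):

      import numpy as np

      rng = np.random.default_rng(0)
      T, h = 120, 1                               # sample length and forecast horizon
      y = np.cumsum(rng.normal(0.5, 1.0, T))      # simulated (log) real GDP level

      rw_fc, ar_fc, actual = [], [], []
      for t in range(80, T - h):                  # expanding-window out-of-sample loop
          history = y[: t + 1]
          rw_fc.append(history[-1])               # random walk: no-change forecast
          dy = np.diff(history)                   # AR(1) on growth as a stand-in model
          b, a = np.polyfit(dy[:-1], dy[1:], 1)
          ar_fc.append(history[-1] + a + b * dy[-1])
          actual.append(y[t + h])

      rmse = lambda f: np.sqrt(np.mean((np.array(actual) - np.array(f)) ** 2))
      print("RMSE ratio (model / random walk):", rmse(ar_fc) / rmse(rw_fc))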
  2. By: Peter Fuleky (UHERO and Department of Economics, University of Hawaii at Manoa); Carl S. Bonham (Department of Economics, University of Hawaii at Manoa)
    Abstract: We analyze the forecasting performance of small mixed frequency factor models when the observed variables share stochastic trends. The indicators are observed at various frequencies and are tied together by cointegration so that valuable high frequency information is passed to low frequency series through the common factors. Differencing the data breaks the cointegrating link among the series and some of the signal leaks out to the idiosyncratic components, which do not contribute to the transfer of information among indicators. We find that allowing for common trends improves forecasting performance over a stationary factor model based on differenced data. The "common-trends factor model" outperforms the stationary factor model at all analyzed forecast horizons. Our results demonstrate that when mixed frequency variables are cointegrated, modeling common stochastic trends improves forecasts.
    Keywords: Dynamic Factor Model, Mixed Frequency Samples, Common Trends, Forecasting
    JEL: E37 C32 C53 L83
    Date: 2013–04
    URL: http://d.repec.org/n?u=RePEc:hai:wpaper:201305&r=for
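    A small Python simulation illustrating the point about differencing (the two-indicator setup, noise levels, and sampling scheme are illustrative assumptions, not the paper's factor model):

      import numpy as np

      rng = np.random.default_rng(1)
      T = 600                                       # "monthly" periods
      trend = np.cumsum(rng.normal(0, 1, T))        # common stochastic trend
      x = trend + rng.normal(0, 1.5, T)             # high-frequency indicator
      y = trend + rng.normal(0, 1.5, T)             # target series

      q = np.arange(2, T, 3)                        # dates at which the target is observed
      corr_levels = np.corrcoef(x[q], y[q])[0, 1]
      corr_diffs = np.corrcoef(np.diff(x[q]), np.diff(y[q]))[0, 1]
      print(f"correlation in levels:      {corr_levels:.2f}")   # dominated by the common trend
      print(f"correlation in differences: {corr_diffs:.2f}")    # much of the link is lost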
  3. By: Mehmet Balcilar (Department of Economics, Eastern Mediterranean University); Rangan Gupta (Department of Economics, University of Pretoria); Stephen M. Miller (Department of Economics, University of Nevada, Las Vegas)
    Abstract: This paper provides out-of-sample forecasts of linear and non-linear models of US and Census regions housing prices. The forecasts include the traditional point forecasts, but also include interval and density forecasts of the housing price distributions. The non-linear smooth-transition autoregressive model outperforms the linear autoregressive model in point forecasts at longer horizons, but the linear autoregressive model dominates the non-linear smooth-transition autoregressive model at short horizons. In addition, we generally do not find major differences in performance for the interval and density forecasts between the linear and non-linear models. Finally, in a dynamic 25-step ex-ante and interval forecasting design, we, once again, do not find major differences between the linear and nonlinear models.
    Keywords: Forecasting, Linear and non-linear models, US and Census housing price indexes
    JEL: C32 R31
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:nlv:wpaper:1209&r=for
  4. By: Chang, Chia-Lin; Franses, Philip Hans; McAleer, Michael
    Abstract: Many macroeconomic forecasts and forecast updates, such as those from the IMF and OECD, typically involve both a model component, which is replicable, and intuition, which is non-replicable. Intuition is expert knowledge possessed by a forecaster. If forecast updates are progressive, they should become more accurate, on average, as the actual value is approached; otherwise, forecast updates are neutral. The paper proposes a methodology to test whether macroeconomic forecast updates are progressive, where the interaction between model and intuition is explicitly taken into account. The data set for the empirical analysis covers Taiwan, where three decades of quarterly forecasts and their updates of the inflation rate and the real GDP growth rate are available. Our empirical results suggest that the forecast updates for Taiwan are progressive, and that the progress can be explained predominantly by improved intuition.
    Keywords: Macroeconomic forecasts, econometric models, intuition, progressive forecast updates, forecast errors
    JEL: C22 C53 E27 E37
    Date: 2013–03–01
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:46387&r=for
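    A small Python sketch of the progressivity notion (fixed-event updates becoming more accurate as the target approaches); the simulated updates and noise scale are illustrative assumptions, not the paper's test of model versus intuition:

      import numpy as np

      rng = np.random.default_rng(2)
      n_events, H = 30, 8                           # target observations and number of updates
      actual = rng.normal(2.0, 1.0, n_events)       # e.g. annual inflation outcomes
      # simulated updates whose noise shrinks as the horizon h falls from H to 1
      forecasts = {h: actual + rng.normal(0, 0.2 * h, n_events) for h in range(H, 0, -1)}

      for h in range(H, 0, -1):
          mse = np.mean((forecasts[h] - actual) ** 2)
          print(f"update {H - h + 1} (horizon {h}): MSE = {mse:.3f}")
      # progressive updates show the MSE declining, on average, as the horizon shortens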
  5. By: Knüppel, Malte; Schultefrankenfeld, Guido
    Abstract: The interest rate assumptions underlying macroeconomic forecasts differ considerably across central banks. Common approaches are the assumption of constant interest rates, the interest rates expected by market participants, or the central bank's own interest rate expectations. From a theoretical point of view, the latter should yield the highest forecast accuracy, and forecasts conditioned on constant interest rates should yield the lowest. However, when investigating the predictive accuracy of the forecasts for interest rates, inflation and output growth made by the Bank of England and the Banco do Brasil, we hardly find any significant differences between the forecasts based on different interest rate assumptions. We conclude that the choice of the interest rate assumption, while a major concern from a theoretical point of view, appears to be at best of minor relevance empirically.
    Keywords: Forecast Accuracy, Density Forecasts, Projections
    JEL: C12 C53
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:zbw:bubdps:112013&r=for
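    A Python sketch of a Diebold-Mariano-type comparison of forecast accuracy under two interest rate assumptions (the error series are simulated stand-ins, and the simple variance estimate ignores the HAC correction needed for multi-step forecasts):

      import numpy as np
      from scipy import stats

      def dm_test(e1, e2):
          """DM statistic for equal mean squared error of two forecast error series."""
          d = e1 ** 2 - e2 ** 2                     # loss differential
          dm = d.mean() / np.sqrt(d.var(ddof=1) / len(d))
          return dm, 2 * (1 - stats.norm.cdf(abs(dm)))

      rng = np.random.default_rng(3)
      err_constant = rng.normal(0, 1.00, 80)        # errors under the constant-rate assumption
      err_own_path = rng.normal(0, 0.95, 80)        # errors under the bank's own rate path
      stat, pval = dm_test(err_constant, err_own_path)
      print(f"DM statistic = {stat:.2f}, p-value = {pval:.2f}")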
  6. By: Drechsel, Katja; R. Scheufele
    Abstract: This paper presents a method for producing early estimates of GDP growth in Germany. We employ MIDAS regressions to circumvent the mixed-frequency problem and use pooling techniques to efficiently summarize the information content of the various indicators. More specifically, we investigate whether it is better to disaggregate GDP (either via total value added of each sector or by the expenditure side) or whether a direct approach is more appropriate when it comes to forecasting GDP growth. Our approach combines a large set of monthly and quarterly coincident and leading indicators and takes into account the respective publication delays. In a simulated out-of-sample experiment we evaluate the different modelling strategies conditional on the given state of information and depending on the model averaging technique. The proposed approach is computationally simple and can easily be implemented as a nowcasting tool. Finally, the method also allows the driving forces of the forecast to be traced, which makes the forecast outcome interpretable.
    Keywords: contemporaneous aggregation, nowcasting, leading indicators, MIDAS, forecast combination, forecast evaluation
    JEL: E32 E37 C52 C53
    Date: 2013–04
    URL: http://d.repec.org/n?u=RePEc:iwh:dispap:7-13&r=for
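    A rough Python sketch of a single MIDAS regression with exponential Almon weights, the mixed-frequency device mentioned above (the simulated monthly indicator, the twelve-lag window, and the weight parameterization are illustrative assumptions, not the paper's specification or its pooling step):

      import numpy as np
      from scipy.optimize import least_squares

      rng = np.random.default_rng(4)
      n_q, n_lags = 100, 12                         # quarters; monthly lags per quarter
      x_monthly = rng.normal(size=3 * n_q + n_lags)
      # row t holds the n_lags monthly observations available for quarter t
      X = np.array([x_monthly[3 * t : 3 * t + n_lags][::-1] for t in range(n_q)])

      def almon(theta1, theta2, k=n_lags):          # normalized exponential Almon weights
          j = np.arange(1, k + 1)
          e = theta1 * j + theta2 * j ** 2
          w = np.exp(e - e.max())
          return w / w.sum()

      y = 0.5 + 2.0 * X @ almon(0.1, -0.05) + rng.normal(0, 0.3, n_q)   # simulated GDP growth

      def residuals(p):                             # p = [const, slope, theta1, theta2]
          return y - (p[0] + p[1] * X @ almon(p[2], p[3]))

      fit = least_squares(residuals, x0=[0.0, 1.0, 0.0, 0.0])
      print("estimated [const, slope, theta1, theta2]:", np.round(fit.x, 2))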
  7. By: Chia-Lin Chang (National Chung Hsing University, Taichung, Taiwan); Bert de Bruijn (Erasmus University Rotterdam); Philip Hans Franses (Erasmus University Rotterdam); Michael McAleer (Erasmus University Rotterdam)
    Abstract: It is common practice to evaluate fixed-event forecast revisions in macroeconomics by regressing current forecast revisions on one-period lagged forecast revisions. Under weak-form (forecast) efficiency, the correlation between the current and one-period lagged revisions should be zero. The empirical findings in the literature suggest that this null hypothesis of zero correlation is rejected frequently, where the correlation can be either positive (widely interpreted in the literature as "smoothing") or negative (widely interpreted as "over-reacting"). We propose a methodology to interpret such non-zero correlations in a straightforward and clear manner. Our approach is based on the assumption that numerical forecasts can be decomposed into both an econometric model and random expert intuition. We show that the interpretation of the sign of the correlation between the current and one-period lagged revisions depends on the process governing intuition, and on the current and lagged correlations between intuition and news (or shocks to the numerical forecasts). It follows that the estimated non-zero correlation cannot be given a direct interpretation in terms of smoothing or over-reaction.
    Keywords: Evaluating forecasts; Macroeconomic forecasting; Rationality; Intuition; Weak-form efficiency; Fixed-event forecasts
    JEL: C22 C53 E27 E37
    Date: 2013–04–11
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20130057&r=for
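    A minimal Python sketch of the weak-form efficiency regression described above (current revisions on one-period lagged revisions); the revision series is a simulated stand-in:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      rev = rng.normal(0, 0.3, 60)                  # sequence of fixed-event forecast revisions
      slope, intercept, r, pval, se = stats.linregress(rev[:-1], rev[1:])
      print(f"lagged-revision coefficient = {slope:.2f} (p = {pval:.2f})")
      # a zero coefficient is consistent with weak-form efficiency; the paper argues a
      # non-zero estimate cannot be read directly as "smoothing" or "over-reaction"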
  8. By: Abounoori, Abbas Ali; Mohammadali, Hanieh; Gandali Alikhani, Nadiya; Naderi, Esmaeil
    Abstract: During recent decades, neural network models have attracted researchers' attention because of their more realistic performance, and different types of these models have been used for forecasting. The question is which of these models has more explanatory power in forecasting the future path of a stock index. To address this, the present paper compares static and dynamic neural network models for forecasting the return of the Tehran Stock Exchange (TSE) index in order to find the best model for forecasting this series (as a nonlinear financial time series). The data were collected daily from 25/3/2009 to 22/10/2011. The models examined include two static models (Adaptive Neuro-Fuzzy Inference Systems, or ANFIS, and the Multi-layer Feed-forward Neural Network, or MFNN) and a dynamic model (the nonlinear neural network autoregressive model, or NNAR). The findings show that, based on the Mean Square Error and Root Mean Square Error criteria, the ANFIS model has a much higher forecasting ability than the other models.
    Keywords: Forecasting, Stock Market, Dynamic Neural Network, Static Neural Network.
    JEL: C22 C45 C60 G14 G17
    Date: 2012–10–12
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:46466&r=for
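    A Python sketch of an NNAR-style forecast (a small multilayer perceptron on lagged returns); the simulated returns, lag order, and network size are illustrative assumptions, and the paper's ANFIS and MFNN models are not reproduced:

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.metrics import mean_squared_error

      rng = np.random.default_rng(6)
      r = rng.normal(0, 1, 600)                     # stand-in for daily index returns
      p = 5                                         # lags fed to the network
      X = np.column_stack([r[i : len(r) - p + i] for i in range(p)])
      y = r[p:]

      split = 500
      model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
      model.fit(X[:split], y[:split])
      rmse = mean_squared_error(y[split:], model.predict(X[split:])) ** 0.5
      print(f"out-of-sample RMSE: {rmse:.3f}")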
  9. By: Delavari, Majid; Gandali Alikhani, Nadiya; Naderi, Esmaeil
    Abstract: This study attempts to introduce an appropriate model for modeling and forecasting the volatility of Iran's crude oil price. It therefore tests the hypothesis of whether the long memory feature matters in forecasting the price of this commodity. For this purpose, using weekly data on Iran's crude oil price, the long memory feature is considered in the return and volatility series, and the fractal markets hypothesis is also examined for Iran's oil market. In addition, from among the different conditional heteroscedasticity models, the best model for forecasting oil price volatility is selected based on a forecasting error criterion. The main hypothesis of the study is tested using the Clark-West (2006) test. The results confirm the existence of the long memory feature in both the mean and variance equations of these series. Among the conditional heteroscedasticity models, the ARFIMA-FIGARCH model is selected as the best model based on the Akaike and Schwarz information criteria (for modeling) and the MSE criterion (for forecasting). Finally, the Clark-West test shows that the long memory feature is important in forecasting oil price volatility.
    Keywords: Oil Price Volatility, Long Memory, FIGARCH, Clark-West.
    JEL: C12 C58 E37 Q47
    Date: 2013–04–19
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:46356&r=for
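    A Python sketch of the Clark-West (2006) adjusted-MSPE test for nested models used to answer the long-memory question (the forecasts and the volatility proxy are simulated stand-ins, not ARFIMA-FIGARCH output):

      import numpy as np
      from scipy import stats

      def clark_west(y, f_small, f_large):
          """One-sided test that the larger (nesting) model forecasts better."""
          f = (y - f_small) ** 2 - ((y - f_large) ** 2 - (f_small - f_large) ** 2)
          t_stat = f.mean() / np.sqrt(f.var(ddof=1) / len(f))
          return t_stat, 1 - stats.norm.cdf(t_stat)

      rng = np.random.default_rng(7)
      y = rng.normal(0, 1, 150)                     # volatility proxy (stand-in)
      f_short = np.zeros(150)                       # short-memory benchmark forecast
      f_long = 0.3 * y + rng.normal(0, 0.9, 150)    # "long-memory" forecast (stand-in)
      print("CW statistic and p-value:", clark_west(y, f_short, f_long))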
  10. By: Lukasz Gatarek (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam); Lennart Hoogerheide (VU University Amsterdam); Koen Hooning (Delft University of Technology); Herman K. van Dijk (Econometric Institute, Erasmus University Rotterdam, and VU University Amsterdam)
    Abstract: Accurate prediction of risk measures such as Value at Risk (VaR) and Expected Shortfall (ES) requires precise estimation of the tail of the predictive distribution. Two novel concepts are introduced that offer a specific focus on this part of the predictive density: the censored posterior, a posterior in which the likelihood is replaced by the censored likelihood; and the censored predictive likelihood, which is used for Bayesian Model Averaging. We perform extensive experiments involving simulated and empirical data. Our results show the ability of these new approaches to outperform the standard posterior and traditional Bayesian Model Averaging techniques in applications of Value-at-Risk prediction in GARCH models.
    Keywords: censored likelihood; censored posterior; censored predictive likelihood; Bayesian Model Averaging; Value at Risk; Metropolis-Hastings algorithm.
    JEL: C11 C15 C22 C51 C53 C58 G17
    Date: 2013–04–15
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20130060&r=for
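    A simplified Python sketch of the censored-likelihood idea (an i.i.d. normal model instead of the paper's GARCH setting; the threshold and data are illustrative): observations above the censoring threshold contribute only the probability mass beyond it, so estimation concentrates on the left tail:

      import numpy as np
      from scipy import stats, optimize

      returns = stats.t.rvs(df=4, size=1000, random_state=8) * 0.01   # fat-tailed returns
      c = np.quantile(returns, 0.15)                                  # censoring threshold

      def neg_censored_loglik(params):
          mu, log_sigma = params
          sigma = np.exp(log_sigma)
          ll = stats.norm.logpdf(returns[returns <= c], mu, sigma).sum()
          ll += (returns > c).sum() * stats.norm.logsf(c, mu, sigma)  # mass above the threshold
          return -ll

      fit = optimize.minimize(neg_censored_loglik, x0=[0.0, np.log(0.01)])
      mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
      print(f"99% VaR from the censored fit: {stats.norm.ppf(0.01, mu_hat, sigma_hat):.4f}")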
  11. By: Dominique, C-Rene
    Abstract: This paper investigates the impact of the Kolmogorov-Sinai entropy on both the accuracy of probabilistic forecasts and the sluggishness of economic growth. It first posits the Gaussian process Zt (indexed by the Hurst exponent H) as the output of a reflexive dynamic input/output system governed by a non-hyperbolic attractor. It next indexes families of attractors by the Hausdorff measure (D0) and assesses the level of uncertainty plaguing probabilistic forecasts in each family. The D0 signature of attractors is then applied to the S&P 500 Index. The result allows the construction of the dynamic history of the index and establishes robust links between the Hausdorff dimension, investors' behavior, and economic growth.
    Keywords: Stochastic processes, Hausdorff dimension, forecasts, entropy, attractors (strange, complex, low dimensional, chaotic), investors' behavior, economic growth
    JEL: C8 G1 G11 G3
    Date: 2013–04–19
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:46451&r=for
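    A rough Python sketch of estimating the Hurst exponent H via rescaled-range (R/S) analysis, the quantity indexing the Gaussian process above (the series is simulated white noise; the paper's entropy and Hausdorff-dimension machinery is not reproduced):

      import numpy as np

      rng = np.random.default_rng(9)
      x = rng.normal(size=4096)                     # i.i.d. noise, so H should be near 0.5

      sizes, rs = [16, 32, 64, 128, 256, 512], []
      for n in sizes:
          chunks = x[: len(x) // n * n].reshape(-1, n)
          dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
          R = dev.max(axis=1) - dev.min(axis=1)     # range of cumulative deviations
          S = chunks.std(axis=1, ddof=1)            # standard deviation per block
          rs.append(np.mean(R / S))

      H = np.polyfit(np.log(sizes), np.log(rs), 1)[0]   # slope of log(R/S) against log(n)
      print(f"estimated Hurst exponent: {H:.2f}")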
  12. By: Roberto Casarin (Department of Economics, University Of Venice Cà Foscari); Stefano Grassi (CREATES, Department of Economics and Business, Aarhus University); Francesco Ravazzolo (Norges Bank and BI Norwegian Business School); Herman K. van Dijk (Erasmus University Rotterdam, VU University Amsterdam and Tinbergen Institute)
    Abstract: This paper presents the Matlab package DeCo (Density Combination), which is based on the paper by Billio et al. (2013), where a constructive Bayesian approach is presented for combining predictive densities originating from different models or other sources of information. The combination weights are time-varying and may depend on past predictive forecasting performance and other learning mechanisms. The core algorithm is the function DeCo, which applies banks of parallel Sequential Monte Carlo algorithms to filter the time-varying combination weights. The DeCo procedure has been implemented both for standard CPU computing and for Graphics Processing Unit (GPU) parallel computing. For the GPU implementation we use the Matlab Parallel Computing Toolbox and show how to use general-purpose GPU computing almost effortlessly. The GPU implementation delivers a speed-up in execution time of up to seventy times compared to a standard CPU Matlab implementation on a multicore CPU. We demonstrate the use of the package and the computational gain of the GPU version through simulation experiments and empirical applications.
    Keywords: Density Forecast Combination, Sequential Monte Carlo, Parallel Computing, GPU, Matlab.
    JEL: C11 C15 C53 G17
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:ven:wpaper:2013:08&r=for
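    A heavily simplified Python stand-in for time-varying density combination, weighting two predictive densities by their cumulated log scores (DeCo itself filters the weights with parallel Sequential Monte Carlo, which is not reproduced here; the densities and data are illustrative):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(10)
      y = rng.normal(0, 1, 200)                     # realized observations
      dens = [stats.norm(0, 1), stats.norm(0, 2)]   # two competing predictive densities

      logscore, weight_path = np.zeros(2), []
      for obs in y:
          w = np.exp(logscore - logscore.max())
          weight_path.append(w / w.sum())           # weights available before seeing obs
          logscore += np.array([d.logpdf(obs) for d in dens])

      print("weights early in the sample:", np.round(weight_path[10], 3))
      print("weights late in the sample: ", np.round(weight_path[-1], 3))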
  13. By: Ito, Ryoko
    Abstract: A spline-DCS model is developed to forecast the conditional distribution of high-frequency financial data with periodic behavior. The dynamic cubic spline of Harvey and Koopman (1993) is applied to allow diurnal patterns to evolve stochastically over time. An empirical application illustrates the practicality and impressive predictive performance of the model.
    Keywords: outlier, robustness, score, calendar effect, spline, trade volume.
    JEL: C22
    Date: 2013–04–19
    URL: http://d.repec.org/n?u=RePEc:cam:camdae:1315&r=for
  14. By: Dirk Bergemann; Stephen Morris
    Date: 2013–04–11
    URL: http://d.repec.org/n?u=RePEc:cla:levarc:786969000000000666&r=for
  15. By: Kajal Lahiri; Yongchen Zhao
    Abstract: We propose a generalized ordered response model that nests the popular Carlson-Parkin (CP) method for quantifying household inflation expectations while explicitly controlling for cross-sectional heterogeneity in the threshold parameters and the variance. By matching qualitative and quantitative data from 1979 to 2012 from the University of Michigan's Survey of Consumers, we find evidence against the threshold constancy, symmetry, and homogeneity assumptions of the CP method. We show that the quantified expectations produced by the generalized model outperform those produced by the CP method, most notably during the 2008 recession period. We also show that when a rolling-window identification scheme is employed instead of the unbiasedness assumption over the entire sample, the quantified expectations are significantly better in terms of predictive accuracy than the quantitative expectations reported in the survey.
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:nya:albaec:13-08&r=for
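    A Python sketch of the baseline Carlson-Parkin quantification that the paper generalizes: survey shares of "up" and "down" answers are mapped into mean expectations under normality, a symmetric response threshold, and unbiasedness over the sample (the shares and the actual mean below are made-up illustrations):

      import numpy as np
      from scipy.stats import norm

      up = np.array([0.55, 0.60, 0.48, 0.52])       # share expecting prices to rise
      down = np.array([0.10, 0.08, 0.15, 0.12])     # share expecting prices to fall
      actual_mean = 3.0                             # average realized inflation over the sample

      a = norm.ppf(1 - up)                          # (delta - mu_t) / sigma_t
      b = norm.ppf(down)                            # (-delta - mu_t) / sigma_t
      ratio = -(a + b) / (a - b)                    # mu_t / delta
      delta = actual_mean / ratio.mean()            # threshold pinned down by unbiasedness
      print("quantified expectations:", np.round(delta * ratio, 2))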
  16. By: Monika Brzezińska (UE Wrocław - Uniwersytet Ekonomiczny we Wrocławiu - U); Katarzyna Guhn (UE Wrocław - Uniwersytet Ekonomiczny we Wrocławiu - U)
    Abstract: The report presents a forecast of next year's sales for companies operating in the production of paper and paper products and in the wholesale of chemical products, using the percentage-of-sales method. It also covers the preparation of projected balance sheets and cash flow planning.
    Keywords: next year's sales, percentage-of-sales method, method of small indicators
    Date: 2013–04–11
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-00812840&r=for
  17. By: Fernanda L L de Leon (University of East Anglia)
    Abstract: This paper provides new predictions for compulsory elections, taking into account the differences in ideological views between compulsory and voluntary voters. Drawing on Brazil's dual voting system, I predict changes in Americans' preferences and estimate a voting model applied to US senatorial elections. I find that, if the current voting population had the ideological preferences of a compulsory electorate, Democrats would gain 8.7 percentage points in their vote shares and win 68% of the elections. Moreover, candidates who currently receive fewer votes would gain the most under compulsory elections, while the system would be most detrimental to candidates who currently receive many votes. Another consequence is candidates' reaction of converging in the ideological spectrum.
    Date: 2013–04
    URL: http://d.repec.org/n?u=RePEc:uea:aepppr:2012_44&r=for

This nep-for issue is ©2013 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.