nep-for New Economics Papers
on Forecasting
Issue of 2012‒11‒17
eleven papers chosen by
Rob J Hyndman
Monash University

  1. Nonlinear Forecasting Using Large Datasets: Evidences on US and Euro Area Economies By Alessandro Giovannelli
  2. Bayesian forecasting with highly correlated predictors By Dimitris Korobilis
  3. Estimates of Uncertainty around the RBA's Forecasts By Peter Tulip; Stephanie Wallace
  4. Fitting and Forecasting Sovereign Defaults Using Multiple Risk Signals By Roberto Savona; Marika Vezzoli
  5. Imputing Individual Effects in Dynamic Microsimulation Models. An application of the Rank Method. By Ambra Poggi; Matteo G. Richiardi
  6. Realized volatility: evidence from Brazil By Wink Junior, Marcos Vinício; Valls Pereira, Pedro L.
  7. On the plausibility of adaptive learning in macroeconomics: A puzzling conflict in the choice of the representative algorithm By Michele Berardi; Jaqueson K. Galimberti
  8. Robust estimation and forecasting of the long-term seasonal component of electricity spot prices By Jakub Nowotarski; Jakub Tomczyk; Rafal Weron
  9. Long memory and Periodicity in Intraday Volatility By Eduardo Rossi; Dean Fantazzini
  10. Basis risk modelling: a co-integration based approach By Yahia Salhi; Stéphane Loisel
  11. Too close to call: Growth and the cost of ruling in US presidential elections, with an application to the 2012 election By Kurrild-Klitgaard, Peter

  1. By: Alessandro Giovannelli (Department of Economics and Finance, University of Rome "Tor Vergata")
    Abstract: The primary objective of this paper is to propose two nonlinear extensions for macroeconomic forecasting using large datasets. First, we propose an alternative technique for factor estimation, i.e., kernel principal component analysis, which allows the factors to have a nonlinear relationship to the input variables. Second, we propose artificial neural networks as an alternative to the factor-augmented linear forecasting equation. These two extensions allow us to determine whether, in general, there is empirical evidence in favor of nonlinear methods and, in particular, to verify whether the nonlinearity occurs in the estimation of the factors or in the functional form that links the target variable to the factors. To verify the empirical performance of the proposed methods, we conducted several pseudo forecasting exercises on the industrial production index and consumer price index for the Euro area and US economies. These methods were employed to construct the forecasts at 1-, 3-, 6- and 12-month horizons using a large dataset containing 259 predictors for the Euro area and 131 predictors for the US economy. The results suggest that the estimation of nonlinear factors, using kernel principal components, significantly improves the quality of forecasts over the linear method, while artificial neural networks show the same forecasting ability as the factor-augmented linear forecasting equation.
    Keywords: Kernel Principal Component Analysis; Large Dataset; Artificial Neural Networks; QuickNet; Forecasting
    JEL: C45 C53 C13 C33
    Date: 2012–11–07
    URL: http://d.repec.org/n?u=RePEc:rtv:ceisrp:255&r=for
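    A minimal Python sketch of the paper's two-step design, assuming placeholder data X (a T x N predictor panel) and y (the target series); the kernel, the number of factors and the horizon are illustrative choices, not the authors' settings:

      import numpy as np
      from sklearn.decomposition import KernelPCA
      from sklearn.linear_model import LinearRegression

      def kpca_factor_forecast(X, y, h=12, n_factors=8):
          """Nonlinear factors via kernel PCA, then a factor-augmented
          linear forecasting equation y_{t+h} = a + b'F_t + e_{t+h}."""
          factors = KernelPCA(n_components=n_factors, kernel="rbf").fit_transform(X)
          model = LinearRegression().fit(factors[:-h], y[h:])
          return model.predict(factors[-1:])   # h-step-ahead forecast

    The paper's second extension would instead replace the linear equation with a neural network (e.g. scikit-learn's MLPRegressor) while keeping the factors fixed.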
  2. By: Dimitris Korobilis
    Abstract: This paper considers Bayesian variable selection in regressions with a large number of possibly highly correlated macroeconomic predictors. I show that acknowledging the correlation structure in the predictors can improve forecasts over existing popular Bayesian variable selection algorithms.
    Keywords: Bayesian semiparametric selection; Dirichlet process prior; correlated predictors; clustered coefficients
    JEL: C11 C14 C32 C52 C53
    Date: 2012–07
    URL: http://d.repec.org/n?u=RePEc:gla:glaewp:2012_12&r=for
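    For context, a textbook spike-and-slab (SSVS) Gibbs sampler of the kind the paper improves on; the Dirichlet-process clustering of coefficients that handles correlated predictors is more involved and not shown here. X, y and the prior variances are placeholders:

      import numpy as np

      def ssvs(X, y, n_iter=2000, tau0=0.01, tau1=10.0, sigma2=1.0):
          """Stochastic-search variable selection: beta_j ~ N(0, tau1) if
          included (gamma_j = 1), N(0, tau0) if excluded, with prior
          inclusion probability 1/2; returns posterior inclusion rates."""
          n, p = X.shape
          gamma, incl = np.ones(p, dtype=int), np.zeros(p)
          for _ in range(n_iter):
              # draw beta | gamma from its conditional normal
              D = np.where(gamma == 1, tau1, tau0)
              V = np.linalg.inv(X.T @ X / sigma2 + np.diag(1.0 / D))
              beta = np.random.multivariate_normal(V @ X.T @ y / sigma2, V)
              # draw each inclusion indicator from its Bernoulli conditional
              l1 = np.exp(-0.5 * beta**2 / tau1) / np.sqrt(tau1)   # slab
              l0 = np.exp(-0.5 * beta**2 / tau0) / np.sqrt(tau0)   # spike
              gamma = (np.random.rand(p) < l1 / (l1 + l0)).astype(int)
              incl += gamma
          return incl / n_iter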
  3. By: Peter Tulip (Reserve Bank of Australia); Stephanie Wallace (Reserve Bank of Australia)
    Abstract: We use past forecast errors to construct confidence intervals and other estimates of uncertainty around the Reserve Bank of Australia's forecasts of key macroeconomic variables. Our estimates suggest that uncertainty about forecasts is high. We find that the RBA's forecasts have substantial explanatory power for the inflation rate but not for GDP growth.
    Keywords: forecast errors; confidence intervals
    JEL: E17 E27 E37
    Date: 2012–11
    URL: http://d.repec.org/n?u=RePEc:rba:rbardp:rdp2012-07&r=for
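    The core construction is simple enough to sketch in Python: collect past forecast errors at a given horizon and read confidence intervals off their empirical quantiles. The 70 per cent coverage below is an illustrative choice:

      import numpy as np

      def error_based_interval(point_forecast, past_errors, coverage=0.70):
          """Interval around a point forecast from the empirical quantiles
          of past forecast errors (actual minus forecast, same horizon)."""
          lo, hi = np.quantile(past_errors, [(1 - coverage) / 2,
                                             (1 + coverage) / 2])
          return point_forecast + lo, point_forecast + hi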
  4. By: Roberto Savona (Department of Business Studies, University Of Brescia); Marika Vezzoli (Department of Quantitative Methods, University Of Brescia)
    Abstract: In this paper we face the fitting versus forecasting paradox with the objective of constructing an optimal Early Warning System to better describe and predict past and future sovereign defaults. We do this by proposing a new Regression Tree-based model that signals a potential crisis whenever preselected indicators exceed specific thresholds. Using data on 66 emerging markets over the period 1975-2002, our model provides an accurate description of past data, although not the best description relative to existing competing models (Logit, Stepwise logit, Noise-to-Signal Ratio and Regression Trees), and produces the best forecasts accommodating different risk aversion targets. By modulating in- and out-of-sample model accuracy, our methodology leads to unambiguous empirical results, since we find that illiquidity (short-term debt to reserves ratio), insolvency (reserve growth) and contagion risks act as the main determinants/predictors of past/future debt crises.
    Keywords: Data mining; Evaluating forecasts; Model selection; Panel data; Probability forecasting.
    JEL: C14 C23 G01 H63
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:ven:wpaper:2012_26&r=for
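    A stripped-down illustration of the threshold logic, using an off-the-shelf classification tree in place of the authors' own Regression Tree-based model; X, y and indicator_names are placeholders:

      from sklearn.tree import DecisionTreeClassifier, export_text

      # X: panel of risk indicators (e.g. short-term debt to reserves,
      # reserve growth); y: 1 if a debt crisis follows, 0 otherwise.
      tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
      print(export_text(tree, feature_names=list(indicator_names)))
      # Each printed split is an "indicator exceeds threshold" rule of
      # the kind an Early Warning System turns into a crisis signal.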
  5. By: Ambra Poggi; Matteo G. Richiardi
    Abstract: Dynamic microsimulation modeling involves two stages: estimation and forecasting. Unobserved heterogeneity is often considered in estimation, but not in forecasting, beyond trivial cases. Non-trivial cases involve individuals that enter the simulation with a history of previous outcomes. We show that the simple solutions of attributing to these individuals a null effect or a random draw from the estimated unconditional distributions lead to biased forecasts, which are often worse than those obtained by neglecting unobserved heterogeneity altogether. We then present a first implementation of the Rank method, a new algorithm for attributing individual effects to the simulation sample that is much simpler than those already known in the literature. Out-of-sample validation of our model shows that correctly imputing unobserved heterogeneity significantly improves the quality of the forecasts.
    Keywords: Dynamic microsimulation, Unobserved heterogeneity, Validation, Rank method, Assignment algorithms, Female labor force participation, Italy
    JEL: C53 C18 C23 C25 J11 J12 J21
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:cca:wplabo:124&r=for
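    A sketch of how such a rank-based assignment might look in Python, under the assumption (mine, not necessarily the authors' exact algorithm) that larger simulated effects go to individuals with higher observed outcome histories:

      import numpy as np

      def rank_assign(history_scores, effect_draws):
          """Assign the k-th smallest drawn individual effect to the
          individual with the k-th smallest outcome history."""
          ranks = np.argsort(np.argsort(history_scores))   # 0 = lowest
          return np.sort(effect_draws)[ranks]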
  6. By: Wink Junior, Marcos Vinício; Valls Pereira, Pedro L.
    Abstract: Using intraday data for the most actively traded stocks on the São Paulo Stock Market (BOVESPA) index, this study considers two recently developed models from the literature on the estimation and prediction of realized volatility: the Heterogeneous Autoregressive Model of Realized Volatility (HAR-RV), developed by Corsi (2009), and the Mixed Data Sampling model (MIDAS-RV), developed by Ghysels et al. (2004). Using measures to compare in-sample and out-of-sample forecasts, better results were obtained with the MIDAS-RV model for in-sample forecasts. For out-of-sample forecasts, however, there was no statistically significant difference between the models. We also found evidence that the use of realized volatility induces distributions of standardized returns that are closer to normal.
    Date: 2012–11–09
    URL: http://d.repec.org/n?u=RePEc:fgv:eesptd:320&r=for
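    The HAR-RV benchmark is compact enough to state in full; a minimal sketch with statsmodels, where rv is a placeholder daily realized volatility array:

      import numpy as np
      import statsmodels.api as sm

      def har_rv(rv):
          """Corsi (2009): regress next-day RV on the daily RV and on its
          weekly (5-day) and monthly (22-day) averages."""
          t = np.arange(21, len(rv) - 1)
          d = rv[t]
          w = np.array([rv[i - 4:i + 1].mean() for i in t])
          m = np.array([rv[i - 21:i + 1].mean() for i in t])
          X = sm.add_constant(np.column_stack([d, w, m]))
          return sm.OLS(rv[t + 1], X).fit()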
  7. By: Michele Berardi; Jaqueson K. Galimberti
    Abstract: The literature on bounded rationality and learning in macroeconomics has often used recursive algorithms such as least squares and stochastic gradient to depict the evolution of agents' beliefs over time. In this work, we try to assess the plausibility of such practice from an empirical perspective, by comparing forecasts obtained from these algorithms with survey data. In particular, we show that the relative performance of the two algorithms in terms of forecast errors depends on the variable being forecasted, and we argue that rational agents would therefore use different algorithms when forecasting different variables. By using survey data, then, we show that agents instead always behave as least squares learners, irrespective of the variable being forecasted. We thus conclude that such findings point to a puzzling conflict between rational and actual behaviour when it comes to expectations formation.
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:man:cgbcrp:177&r=for
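    The two representative algorithms in question can be written as constant-gain updating rules for a forecasting model y_t = phi'x_t; a minimal sketch, with the gain value purely illustrative:

      import numpy as np

      def learn(X, y, gain=0.02, algorithm="rls"):
          """Recursive least squares vs. stochastic gradient learning."""
          p = X.shape[1]
          phi, R = np.zeros(p), np.eye(p)
          for x, yt in zip(X, y):
              err = yt - phi @ x
              if algorithm == "rls":
                  # update the second-moment matrix, then the beliefs
                  R = R + gain * (np.outer(x, x) - R)
                  phi = phi + gain * np.linalg.solve(R, x) * err
              else:   # stochastic gradient skips the second-moment matrix
                  phi = phi + gain * x * err
          return phi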
  8. By: Jakub Nowotarski; Jakub Tomczyk; Rafal Weron
    Abstract: When building stochastic models for electricity spot prices, a problem of utmost importance is the estimation and consequent forecasting of a component that deals with trends and seasonality in the data. While the short-term seasonal components (daily, weekly) are more regular and less important for the valuation of typical power derivatives, the long-term seasonal components (LTSC; seasonal, annual) are much more difficult to tackle. Surprisingly, many academic papers dealing with electricity spot price modeling neglect the importance of the seasonal decomposition and do not consider the problem of forecasting it. With this paper we want to fill this gap and present a thorough study on the estimation and forecasting of the LTSC of electricity spot prices. We consider a battery of models based on Fourier or wavelet decomposition combined with linear or exponential decay. We find that all considered wavelet-based models are significantly better at forecasting spot prices up to a year ahead than all considered sine-based models. This result questions the validity and usefulness of stochastic models of spot electricity prices built on sinusoidal long-term seasonal components.
    Keywords: Electricity spot price; Long-term seasonal component; Robust modeling; Forecasting; Wavelets;
    JEL: C45 C53 C80 Q47
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:wuu:wpaper:hsc1206&r=for
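    A wavelet smoother of the general kind considered can be sketched directly: keep only the lowest-frequency approximation of the price series and discard the detail coefficients. The wavelet family and decomposition level below are illustrative, not the authors' choices:

      import numpy as np
      import pywt

      def wavelet_ltsc(prices, wavelet="db4", level=6):
          """LTSC estimate: the coarsest wavelet approximation,
          reconstructed with all detail coefficients set to zero."""
          coeffs = pywt.wavedec(prices, wavelet, level=level)
          coeffs[1:] = [np.zeros_like(d) for d in coeffs[1:]]
          return pywt.waverec(coeffs, wavelet)[:len(prices)]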
  9. By: Eduardo Rossi (Department of Economics and Management, University of Pavia); Dean Fantazzini (Moscow School of Economics, M.V. Lomonosov Moscow State University)
    Abstract: Intraday return volatilities are characterized by the contemporaneous presence of periodicity and long memory. This paper proposes two new parameterizations of intraday volatility: the Fractionally Integrated Periodic EGARCH and the Seasonal Fractional Integrated Periodic EGARCH, which provide the required flexibility to account for both features. The periodic kurtosis and periodic autocorrelations of power transformations of the absolute returns are computed for both models. The empirical application shows that the volatility of hourly E-mini S&P 500 futures returns is characterized by a periodic leverage effect coupled with statistically significant long-range dependence. An out-of-sample forecasting comparison with alternative models shows that a constrained version of the FI-PEGARCH provides superior forecasts. A simulation experiment is carried out to investigate the effects of the sampling frequency on the fractional differencing parameter estimate.
    Keywords: Intraday volatility, Long memory, FI-PEGARCH, SFI-PEGARCH, Periodic models.
    JEL: C22 C58 G13
    Date: 2012–11
    URL: http://d.repec.org/n?u=RePEc:pav:demwpp:015&r=for
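    The long-memory ingredient common to both parameterizations is the fractional filter (1 - L)^d; its coefficients follow a one-line recursion, sketched here:

      import numpy as np

      def fracdiff_weights(d, n):
          """Coefficients of (1 - L)^d: w_0 = 1 and
          w_k = w_{k-1} * (k - 1 - d) / k, for k = 1, ..., n-1."""
          w = np.empty(n)
          w[0] = 1.0
          for k in range(1, n):
              w[k] = w[k - 1] * (k - 1 - d) / k
          return w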
  10. By: Yahia Salhi (SAF - Laboratoire de Sciences Actuarielle et Financière - Université Claude Bernard - Lyon I : EA2429); Stéphane Loisel (SAF - Laboratoire de Sciences Actuarielle et Financière - Université Claude Bernard - Lyon I : EA2429)
    Abstract: Mortality models are generally calibrated on national populations. However, pension funds and annuity providers are mainly interested in the mortality rates of their own portfolios. In this paper we put forward a multivariate approach for forecasting pairwise mortality rates of related populations. The investigated approach links national population mortality to a sub-population using an econometric model that captures a long-run relationship between the two mortality dynamics. The model lays the emphasis not on the correlation that the two given mortality dynamics would present but rather on their long-term behaviour, which suggests that the two time series cannot wander off in opposite directions for very long without a mean-reverting force, on grounds of biological reasonableness. The model additionally captures the short-run adjustment between the considered mortality dynamics. Our aim is to propose a consistent approach to forecasting pairwise mortality and, to some extent, to better control and assess the basis risk underlying index-based longevity securitization. An empirical comparison of forecasts of one-year death probabilities of portfolio-experienced mortality is performed using both a factor-based model and the proposed approach. The robustness of the model is tested on mortality rate data for England & Wales and on Continuous Mortality Investigation assured lives representing a sub-population.
    Date: 2012–02
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-00746859&r=for
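    A minimal sketch of the econometric core in the Engle-Granger two-step spirit, with m_sub and m_nat as placeholder aligned series of sub-population and national log mortality rates; the authors' specification is richer:

      import numpy as np
      import statsmodels.api as sm
      from statsmodels.tsa.stattools import adfuller

      # Step 1: long-run relation; stationary residuals = co-integration.
      longrun = sm.OLS(m_sub, sm.add_constant(m_nat)).fit()
      print("ADF p-value on residuals:", adfuller(longrun.resid)[1])

      # Step 2: error-correction model -- short-run changes pulled back
      # towards the long-run relation by the lagged residual.
      ecm = sm.OLS(np.diff(m_sub), sm.add_constant(np.column_stack(
          [np.diff(m_nat), longrun.resid[:-1]]))).fit()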
  11. By: Kurrild-Klitgaard, Peter
    Abstract: The note briefly outlines a new model for the explanation of US presidential elections, founded on (a) recent economic growth and (b) a measure of what may be called “the cost of ruling”. The former is based on changes in real disposable income for the period following a mid-term election, while the latter combines factors of incumbency and terms-in-office. The model is applied to data from the US presidential elections 1932-2008 and has considerable explanatory power for the variation in the incumbent party’s candidate’s share of the two-party vote (R2=0.74). The model is controlled against a number of other frequent explanations and is found to be quite robust. When augmented with approval ratings for incumbent presidents, the explanatory power increases to 83 pct., and the model incorrectly calls only one of the last 15 US presidential elections. Applied to the 2012 election as a forecasting model, the prediction is that President Obama will win 49.6 pct. of the two-party vote.
    Keywords: Economic voting; US presidential elections
    JEL: D72
    Date: 2012–11–01
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:42464&r=for
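    The model reduces to a two-regressor OLS fit, sketched here with placeholder arrays (growth, cost, vote) for the 1932-2008 elections; g_2012 and c_2012 stand in for the 2012 values:

      import numpy as np
      import statsmodels.api as sm

      # vote: incumbent-party candidate's share of the two-party vote;
      # growth: post-mid-term real disposable income growth;
      # cost: the incumbency/terms-in-office "cost of ruling" measure.
      X = sm.add_constant(np.column_stack([growth, cost]))
      fit = sm.OLS(vote, X).fit()
      print(fit.rsquared)                  # the paper reports R^2 = 0.74
      forecast = fit.predict([[1.0, g_2012, c_2012]])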

This nep-for issue is ©2012 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.