nep-for New Economics Papers
on Forecasting
Issue of 2012‒10‒20
thirteen papers chosen by
Rob J Hyndman
Monash University

  1. Let's Do It Again: Bagging Equity Premium Predictors By Eric Hillebrand; Tae-Hwy Lee; Marcelo C. Medeiros
  2. Predictive Ability of Competing Models for South Africa’s Fixed Business Non-Residential Investment Spending By Renee van Eyden; Goodness C. Aye; Rangan Gupta
  3. The macroeconomic forecasting performance of autoregressive models with alternative specifications of time-varying volatility By Todd E. Clark; Francesco Ravazzolo
  4. Forecast robustness in macroeconometric models By Gunnar Bårdsen, Dag Kolsrud, and Ragnar Nymoen
  5. Can we use seasonally adjusted indicators in dynamic factor models? By Maximo Camacho; Yuliya Lovcha; Gabriel Perez-Quiros
  6. Lasso-type and Heuristic Strategies in Model Selection and Forecasting By Ivan Savin; Peter Winker
  7. Monthly GDP estimates based on the IGAE By Rocío Elizondo
  8. Modelling general dependence between commodity forward curves By Mikhail Zolotko; Ostap Okhrin
  9. Modeling Partial Customer Churn: On the Value of First Product-Category Purchase Sequences By V. L. MIGUÉIS; D. VAN DEN POEL; A.S. CAMANHO; J. FALCAO E CUNHA
  11. Monetary policy implications of the dependence of long term interest rates on disagreement about macroeconomic forecasts By Eric Dor
  12. Economic scenario of United States of America before and after 2012 U.S. Presidential Election By Sinha, Pankaj; Singhal, Anushree; Sondhi, Kriti
  13. Distributed Learning in Hierarchical Networks By Hélène Le Cadre; Bedo Jean-Sébastien

  1. By: Eric Hillebrand (Aarhus University and CREATES); Tae-Hwy Lee (University of California, Riverside); Marcelo C. Medeiros (Pontifical Catholic University of Rio de Janeiro)
    Abstract: The literature on excess return prediction has considered a wide array of estimation schemes, among them unrestricted and restricted regression coefficients. We consider bootstrap aggregation (bagging) to smooth parameter restrictions. Two types of restrictions are considered: positivity of the regression coefficient and positivity of the forecast. Bagging constrained estimators can have smaller asymptotic mean-squared prediction errors than forecasts from a restricted model without bagging. Monte Carlo simulations show that forecast gains can be achieved in realistic sample sizes for the stock return problem. In an empirical application using the data set of Campbell, J., and S. Thompson (2008): “Predicting the Equity Premium Out of Sample: Can Anything Beat the Historical Average?”, Review of Financial Studies 21, 1511-1531, we show that we can improve the forecast performance further by smoothing the restriction through bagging.
    Keywords: Constraints on predictive regression function, Bagging, Asymptotic MSE, Equity premium, Out-of-sample forecasting, Economic value functions.
    JEL: C5 E4 G1
    Date: 2012–09–30
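The bagging-a-restriction idea in this abstract can be sketched in a few lines — averaging the hard positivity restriction max(b, 0) over bootstrap resamples smooths it. This is a toy illustration on simulated data, not the authors' estimator:

```python
import random

random.seed(0)

def ols_slope(xs, ys):
    """Slope of a no-intercept OLS regression of ys on xs."""
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / sxx

def bagged_restricted_slope(xs, ys, n_boot=200):
    """Average the positivity-restricted slope max(b, 0) over bootstrap resamples."""
    n, total = len(xs), 0.0
    for _ in range(n_boot):
        idx = [random.randrange(n) for _ in range(n)]
        b = ols_slope([xs[i] for i in idx], [ys[i] for i in idx])
        total += max(b, 0.0)
    return total / n_boot

# Simulated predictive regression with a small positive slope
xs = [random.gauss(0, 1) for _ in range(100)]
ys = [0.2 * x + random.gauss(0, 1) for x in xs]

b_plain = ols_slope(xs, ys)
b_restricted = max(b_plain, 0.0)            # hard positivity restriction
b_bagged = bagged_restricted_slope(xs, ys)  # smoothed restriction
```

Because the bootstrap average of a truncated estimator is a smooth function of the data, the bagged version avoids the kink that inflates the restricted estimator's asymptotic MSE near the boundary.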
  2. By: Renee van Eyden (Department of Economics, University of Pretoria); Goodness C. Aye (Department of Economics, University of Pretoria); Rangan Gupta (Department of Economics, University of Pretoria)
    Abstract: The study evaluates the forecasting ability of models of South Africa’s real fixed business non-residential investment spending growth over the recent 2003:1–2011:4 out-of-sample period. The forecasting models are based on the Accelerator, Neoclassical, Cash-Flow, Average Q, Stock Price and Excess Stock Return Predictors models of investment spending. The Average Q, Stock Price and Return Predictors models appear more important in forecasting the behaviour of South Africa’s business investment spending growth over this period. The results from this study point to the important role of the stock market in promoting investment growth in South Africa, underscoring the need for stock market development. Also, stock market variables seem to play an increasingly important role in predicting investment spending behaviour in recent times.
    Keywords: business fixed investment spending, out-of-sample forecasts, mean squared forecast error, forecast encompassing
    JEL: C22 C53 E22 E27
    Date: 2012–10
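The horse race described here rests on comparing mean squared forecast errors across models; the bookkeeping is simple (hypothetical forecast numbers, purely illustrative):

```python
def msfe(forecasts, actuals):
    """Mean squared forecast error over an out-of-sample window."""
    errors = [f - a for f, a in zip(forecasts, actuals)]
    return sum(e * e for e in errors) / len(errors)

actual = [1.0, 2.0, 1.5, 2.5]
model_a = [1.1, 1.9, 1.6, 2.4]   # hypothetical competing forecasts
model_b = [0.5, 2.5, 1.0, 3.0]

ranked = sorted([("A", msfe(model_a, actual)),
                 ("B", msfe(model_b, actual))], key=lambda p: p[1])
```

Forecast-encompassing tests go a step further, asking whether a convex combination of two models' forecasts beats either alone, but the MSFE ranking above is the starting point.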
  3. By: Todd E. Clark (Federal Reserve Bank of Cleveland); Francesco Ravazzolo (Norges Bank (Central Bank of Norway) and BI Norwegian Business School)
    Abstract: This paper compares alternative models of time-varying macroeconomic volatility on the basis of the accuracy of point and density forecasts of macroeconomic variables. In this analysis, we consider both Bayesian autoregressive and Bayesian vector autoregressive models that incorporate some form of time-varying volatility, namely: stochastic volatility (with both constant and time-varying autoregressive coefficients), stochastic volatility following a stationary AR process, stochastic volatility coupled with fat tails, GARCH, and mixture-of-innovation models. The comparison is based on the accuracy of forecasts of key macroeconomic time series computed from real-time post-World War II data for both the United States and the United Kingdom. The results show that the AR and VAR specifications with widely-used stochastic volatility dominate models with alternative volatility specifications, in terms of point forecasting to some degree and density forecasting to a greater degree.
    Keywords: Stochastic volatility, GARCH, forecasting
    JEL: E17 C11 C53
    Date: 2012–10–09
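Of the volatility specifications compared, GARCH(1,1) is the easiest to write down; a minimal sketch of its conditional-variance recursion (illustrative parameter values, not the paper's estimates):

```python
def garch11_variance(returns, omega=0.1, alpha=0.1, beta=0.8):
    """Conditional-variance path of a GARCH(1,1): h_t = omega + alpha*r_{t-1}^2 + beta*h_{t-1}."""
    h = omega / (1.0 - alpha - beta)   # start at the unconditional variance
    path = [h]
    for r in returns:
        h = omega + alpha * r * r + beta * h
        path.append(h)
    return path

returns = [1.0, -2.0, 0.5]             # toy return series
path = garch11_variance(returns)
```

Unlike stochastic volatility, where the log-variance has its own innovation, the GARCH variance is a deterministic function of past data, which is partly why the density-forecast comparison in the paper can come out differently across the two families.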
  4. By: Gunnar Bårdsen, Dag Kolsrud, and Ragnar Nymoen (Department of Economics, Norwegian University of Science and Technology, Statistics Norway, and University of Oslo)
    Abstract: The paper investigates explanations for forecasting invariance to structural breaks. After highlighting the role of policy, we isolate possible structural invariance in a simplified dynamic macro model that nevertheless has features in common with the standard model of aggregate demand and aggregate supply. We find, as expected, that structural breaks in growth rates and in the means of cointegrating relationships will always damage forecasts of some of the variables. But we also find examples of "insulation" from shocks. This partial robustness is a property of the economy itself (here represented by the DGP), not of the forecasts.
    Date: 2012–10–06
  5. By: Maximo Camacho (Universidad de Murcia); Yuliya Lovcha (Universidad de Navarra); Gabriel Perez-Quiros (Banco de España)
    Abstract: We examine the short-term performance of two alternative approaches to forecasting using dynamic factor models. The first approach extracts the seasonal component of the individual indicators before estimating the dynamic factor model, while the alternative uses the non-seasonally adjusted data in a model that endogenously accounts for seasonal adjustment. Our Monte Carlo analysis reveals that the performance of the former is always comparable to or even better than that of the latter in all the simulated scenarios. Our results have important implications for the factor models literature because they show that the common practice of using seasonally adjusted data in this type of model is very accurate in terms of forecasting ability. Drawing on five coincident indicators, we illustrate this result for US data.
    Keywords: Dynamic factor models, seasonal adjustment, short-term forecasting
    JEL: E32 C22 E27
    Date: 2012–10
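The "adjust first" approach amounts to removing a seasonal component from each indicator before the factor model sees it. A toy seasonal-means filter makes the idea concrete (assuming monthly data; production adjustment procedures such as X-12-ARIMA or TRAMO-SEATS are far more involved):

```python
def deseasonalize(series, period=12):
    """Remove seasonal means: subtract each month's average across years."""
    seasonal = []
    for s in range(period):
        vals = series[s::period]
        seasonal.append(sum(vals) / len(vals))
    return [x - seasonal[i % period] for i, x in enumerate(series)]

# Two years of a monthly indicator with a strong, stable seasonal pattern
raw = [month + (10.0 if month in (11, 0) else 0.0)
       for year in range(2) for month in range(12)]
adjusted = deseasonalize(raw, period=12)
```

With a perfectly stable seasonal pattern, the filter removes it entirely, leaving only the non-seasonal variation for the factor extraction step.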
  6. By: Ivan Savin (DFG Research Training Program "The Economics of Innovative Change", Friedrich Schiller University Jena and Max Planck Institute of Economics); Peter Winker (Justus Liebig University Giessen, and Centre for European Economic Research, Mannheim)
    Abstract: Several approaches for subset recovery and improved forecasting accuracy have been proposed and studied. One way is to apply a regularization strategy and solve the model selection task as a continuous optimization problem. One of the most popular approaches in this research field is given by Lasso-type methods. An alternative approach is based on information criteria. In contrast to the Lasso, these methods also work well in the case of highly correlated predictors. However, their performance can be impaired by the fact that information criteria are only asymptotically consistent. Moreover, the resulting discrete optimization problems exhibit a high computational complexity. Therefore, a heuristic optimization approach (a genetic algorithm) is applied. The two strategies are compared by means of a Monte Carlo simulation study together with an empirical application to leading business cycle indicators in Russia and Germany.
    Keywords: Adaptive Lasso, Elastic net, Forecasting, Genetic algorithms, Heuristic methods, Lasso, Model selection
    JEL: C51 C52 C53 C61 C63
    Date: 2012–10–11
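The Lasso side of the comparison reduces to a soft-thresholding operator applied inside coordinate descent; a compact sketch (toy orthogonal design and a hypothetical penalty value, not the paper's setup):

```python
def soft_threshold(z, lam):
    """Lasso soft-thresholding operator: shrink z toward zero by lam."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

def lasso_cd(X, y, lam, n_iter=100):
    """Coordinate-descent Lasso; assumes mean-zero columns."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # partial residual excluding predictor j
            r = [y[i] - sum(beta[k] * X[i][k] for k in range(p) if k != j)
                 for i in range(n)]
            z = sum(X[i][j] * r[i] for i in range(n)) / n
            norm = sum(X[i][j] ** 2 for i in range(n)) / n
            beta[j] = soft_threshold(z, lam) / norm
    return beta

# Toy design: first predictor relevant, second pure noise
X = [[1.0, 1.0], [-1.0, 1.0], [1.0, -1.0], [-1.0, -1.0]]
y = [2.0, -2.0, 2.0, -2.0]
beta = lasso_cd(X, y, lam=0.5)
```

The penalty zeroes out the irrelevant coefficient while shrinking the relevant one — exactly the subset-recovery behaviour that the genetic-algorithm search over information criteria achieves by discrete means instead.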
  7. By: Rocío Elizondo
    Abstract: This article presents three methods to estimate the logarithm of monthly real GDP in Mexico from the Global Indicator of Economic Activity (IGAE): (1) a deterministic approach using the IGAE growth rate; (2) an extension of the Denton method; and (3) the Kalman filter. In these methods the monthly GDP is regarded as an unobservable variable that is approximated using only the IGAE. Results suggest that the method based on the Kalman filter fits the observed quarterly GDP data better under several error measures. By analyzing different estimation periods it was found that the parameters of the filter remained relatively stable over the period of study. Therefore, this method was used to perform out-of-sample forecasts.
    Keywords: Gross Domestic Product, Global Indicator of Economic Activity, Kalman Filter, Denton Method, Forecasts.
    JEL: C22 D24 E23 E27
    Date: 2012–10
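The Kalman-filter approach treats monthly GDP as a latent state observed through the IGAE. A scalar local-level filter conveys the mechanics (toy numbers and a deliberately simple random-walk state; the paper's state-space model is richer):

```python
def kalman_filter(observations, q=0.1, r=0.5, m0=0.0, p0=1.0):
    """Scalar local-level Kalman filter: a latent level observed through a noisy indicator."""
    m, p = m0, p0
    filtered = []
    for z in observations:
        # predict: random-walk state, variance grows by q
        p = p + q
        # update with the indicator observation
        k = p / (p + r)            # Kalman gain
        m = m + k * (z - m)
        p = (1 - k) * p
        filtered.append(m)
    return filtered

igae = [1.0, 1.2, 0.9, 1.1, 1.3, 1.0]   # toy indicator series
monthly_level = kalman_filter(igae)
```

Each filtered value is a precision-weighted compromise between the model's prediction and the new indicator reading, which is what lets the method interpolate a monthly path consistent with quarterly GDP.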
  8. By: Mikhail Zolotko; Ostap Okhrin
    Abstract: This study proposes a novel framework for the joint modelling of commodity forward curves. Its key contribution is twofold. First, dynamic correlation models are applied in this context as part of the modelling scheme. Second, we introduce a family of dynamic conditional correlation models based on hierarchical Archimedean copulae (HAC DCC), which are flexible but parsimonious instruments that capture a wide range of dynamic dependencies. The conducted analysis allows us to obtain precise out-of-sample forecasts of the distribution of the returns of various commodity futures portfolios. The Value-at-Risk analysis shows that HAC DCC models outperform other introduced benchmark models on a consistent basis.
    Keywords: commodity forward curves, multivariate GARCH, hierarchical Archimedean copula, Value-at-Risk
    JEL: C13 C53 Q40
    Date: 2012–10
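The Value-at-Risk comparison used to rank the models ultimately rests on counting exceedances of the forecast VaR and checking the hit rate against the nominal coverage. A minimal backtest helper (hypothetical returns and a flat VaR forecast, purely illustrative):

```python
def var_exceedances(returns, var_forecasts):
    """Count periods in which the realized return falls below the VaR forecast."""
    return sum(1 for r, v in zip(returns, var_forecasts) if r < v)

returns = [-0.03, 0.01, -0.01, -0.05, 0.02]
var_95 = [-0.02] * len(returns)      # flat 5%-quantile forecast, illustrative only
n_exceed = var_exceedances(returns, var_95)
hit_rate = n_exceed / len(returns)   # compare against the nominal 5%
```

A model whose hit rate sits close to the nominal level across portfolios is the one the VaR analysis favours; formal backtests (e.g. Kupiec's) test exactly this proportion.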
  9. By: V. L. MIGUÉIS; D. VAN DEN POEL; A.S. CAMANHO; J. FALCAO E CUNHA
    Abstract: Retaining customers has been considered one of the most critical challenges among those included in Customer Relationship Management (CRM), particularly in the grocery retail sector. In this context, an accurate prediction of whether or not a customer will leave the company, i.e. churn prediction, is crucial for companies to conduct effective retention campaigns. This paper proposes to include in partial churn detection models the succession of first product-category purchases as a proxy of the state of trust and demand maturity of a customer towards a company in grocery retailing. Motivated by the importance of first impressions and of recently experienced risks for the current state of the relationship, we model the first-purchase succession in chronological order as well as in reverse order. Because the relevance of the first customer-company interactions and of the most recent interactions varies, these two variables are modeled with a variable sequence length. In this study we use logistic regression as the classification technique. A real sample of approximately 75,000 new customers taken from the data warehouse of a European retail company is used to test the proposed models. The area under the receiver operating characteristic curve and the 1%, 5% and 10% percentile lifts are used to assess the performance of the partial-churn prediction models. The empirical results reveal that both proposed models outperform the standard RFM model.
    Keywords: Marketing, Customer relationship management, Churn analysis, Predictive analytics, Sequence analysis, Retailing, Classification, Logistic regression
    Date: 2012–05
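Logistic regression, the classifier used in the paper above, is easy to sketch with plain gradient descent. Here a single toy feature stands in for the sequence-derived predictors; this is an illustration of the technique, not the authors' specification:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.5, n_iter=2000):
    """Plain gradient-descent logistic regression (intercept plus one weight per feature)."""
    p = len(X[0])
    w = [0.0] * (p + 1)          # w[0] is the intercept
    for _ in range(n_iter):
        grad = [0.0] * (p + 1)
        for xi, yi in zip(X, y):
            pred = sigmoid(w[0] + sum(w[j + 1] * xi[j] for j in range(p)))
            err = pred - yi
            grad[0] += err
            for j in range(p):
                grad[j + 1] += err * xi[j]
        w = [wj - lr * gj / len(X) for wj, gj in zip(w, grad)]
    return w

def predict_churn(w, xi):
    return sigmoid(w[0] + sum(w[j + 1] * xi[j] for j in range(len(xi))))

# Toy data: one feature (e.g. a score built from the first-purchase sequence)
X = [[0.0], [0.2], [0.4], [0.6], [0.8], [1.0]]
y = [0, 0, 0, 1, 1, 1]
w = fit_logistic(X, y)
```

The fitted probabilities feed directly into AUC and percentile-lift calculations: rank customers by predicted churn probability and measure how concentrated actual churners are in the top 1%, 5% or 10%.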
    Abstract: This study investigates the advantage of social network mining in a customer retention context. A company that is able to identify likely churners at an early stage can take appropriate steps to prevent these potential churners from actually churning and subsequently increase profit. Academics and practitioners are constantly trying to optimize their predictive-analytics models by searching for better predictors. The aim of this study is to investigate whether, in addition to the conventional sets of variables (socio-demographics, purchase history, etc.), kinship-network-based variables improve the predictive power of customer retention models. Results show that the predictive power of the churn model can indeed be improved by adding the social network analysis (SNA-) based variables. Including network structure measures (i.e. degree, betweenness centrality and density) increases predictive accuracy, but contextual network-based variables turn out to have the highest impact on discriminating churners from non-churners. For the majority of the latter type of network variables, their importance in the model is even higher than that of the individual-level counterpart variable.
    Keywords: network based marketing, CRM, predictive analytics, social network analysis (SNA), kinship network, financial services, random forests
    Date: 2012–05
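Two of the network-structure measures mentioned above — degree and local density — are straightforward to compute from an adjacency structure. A toy sketch with hypothetical customer ids (betweenness centrality needs shortest paths and is left out for brevity):

```python
# Hypothetical kinship network as an adjacency dict of customer ids
edges = {
    "a": {"b", "c"},
    "b": {"a", "c"},
    "c": {"a", "b", "d"},
    "d": {"c"},
}

def degree(adj, node):
    """Number of direct kinship ties of a customer."""
    return len(adj[node])

def ego_density(adj, node):
    """Share of realized ties among the node's neighbours (local density)."""
    nbrs = adj[node]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for u in nbrs for v in adj[u] if v in nbrs) / 2
    return links / (k * (k - 1) / 2)
```

Each customer's degree and ego density become extra columns in the churn model's feature matrix, alongside the conventional socio-demographic and purchase-history variables.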
  11. By: Eric Dor (IESEG School of Management (LEM-CNRS))
    Abstract: Recent studies show that disagreement regarding the future evolution of activity, inflation, or long and short interest rates significantly forecasts excess holding returns. These studies include the papers of Buraschi and Whelan (2012), Barillas and Nimark (2012), Xiong and Yan (2010), and Wu (2009). Such results challenge the common view that, under the expectations hypothesis of the term structure, the excess holding return should be unpredictable. The new evidence thus means that the risk premium is time-varying, moving as a function of disagreement. It is useful to discuss the potential implications of such theoretical results and empirical evidence for related monetary policy issues.
    Date: 2012–12
  12. By: Sinha, Pankaj; Singhal, Anushree; Sondhi, Kriti
    Abstract: This paper examines the economic scenario of the United States before and after the 2012 US Presidential election by analyzing various macroeconomic variables such as GDP, public debt, exchange rate, social benefit spending, trade, budget deficit/surplus, unemployment rate, inflation and others. We forecast the macroeconomic variables post 2012 using ARIMA modeling and present a picture of the U.S. economy after the 2012 US Presidential election. With GDP growth being the major focus, both parties are formulating policies to promote faster economic recovery by making reforms to reduce the $1 trillion deficit and maintain a balanced budget. Democratic candidate Barack Obama has policies in place to increase investment in healthcare and education, open up opportunities, favour middle-class families, build a better-trained workforce, double exports and cut military expenses, whereas Republican candidate Mitt Romney’s focus is to achieve energy independence, open trade, champion small businesses and lower tax rates along with lowering expenses. In this paper, we analyze the impact of the expected outcome of the 2012 U.S. presidential election on various macroeconomic variables of the U.S. economy. The findings indicate that GDP is expected to grow at an average of about 2 percent and that a recession is not impending in 2013. Also, going by current policies, it is forecast that U.S. exports and imports will increase as the U.S. economy recovers. Barack Obama’s policies will inflate the budget deficit while Mitt Romney’s strategy will lower the US public debt and budget deficit. ARIMA models indicate that, with the continuance of the present government’s policies, the budgetary deficit is estimated to decrease to 4.55 percent of GDP in 2014 from a maximum of 10.1 percent of GDP in 2010.
    Keywords: ARIMA; Box-Jenkins; U.S. economy; forecast; US 2012 presidential election; macroeconomic variables; presidential debate
    JEL: O1 F4 N1 O11 C53 C5 O24 E17 E6 F31
    Date: 2012–10–04
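Box-Jenkins forecasting of the kind used here reduces, in the simplest case, to fitting an AR(1) by least squares and iterating the recursion forward. A minimal sketch (toy series, not the paper's data; real ARIMA modeling adds differencing and MA terms):

```python
def fit_ar1(series):
    """Least-squares fit of y_t = c + phi * y_{t-1} + e_t."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    phi = (sum((a - mx) * (b - my) for a, b in zip(x, y))
           / sum((a - mx) ** 2 for a in x))
    c = my - phi * mx
    return c, phi

def forecast_ar1(c, phi, last, h):
    """Iterate the fitted recursion h steps ahead."""
    out = []
    for _ in range(h):
        last = c + phi * last
        out.append(last)
    return out

series = [1.0, 1.5, 1.2, 1.6, 1.3, 1.7, 1.4]   # toy stationary series
c, phi = fit_ar1(series)
fcast = forecast_ar1(c, phi, series[-1], 3)
```

With |phi| < 1 the forecast path mean-reverts, which is why multi-year ARIMA projections of ratios like deficit/GDP settle toward a long-run level rather than exploding.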
  13. By: Hélène Le Cadre (LIMA - CEA, LIST, Laboratory of Information, Models and Learning - CEA : SACLAY); Bedo Jean-Sébastien (Orange/France-Télécom - Telecom Orange)
    Abstract: In this article, we propose distributed learning based approaches to study the evolution of a decentralized hierarchical system, an illustration of which is the smart grid. Smart grid management requires the control of non-renewable energy production and the integration of renewable energies, which might be highly unpredictable. Indeed, their production levels rely on uncontrollable factors such as sunshine, wind strength, etc. First, we derive optimal control strategies for non-renewable energy production and compare competitive learning algorithms to forecast the energy needs of the end users. Second, we introduce an online learning algorithm based on regret minimization enabling the agents to forecast the production of renewable energies. Additionally, we define organizations of the market promoting collaborative learning which generate higher performance for the whole smart grid than full competition.
    Keywords: Algorithmic Game Theory; Coalition; Distributed Learning; Regret
    Date: 2012–10–09
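Regret-minimizing forecasting of the kind described can be illustrated with the classic exponential-weights (Hedge) update: each "expert" forecaster is reweighted by its loss, and the aggregate forecast concentrates on whoever has been right. A toy sketch (hypothetical experts and outcomes; the paper's hierarchical setting is richer):

```python
import math

def exponential_weights(expert_preds, outcomes, eta=0.5):
    """Hedge/exponential-weights forecaster: reweight experts by squared-error loss."""
    n_experts = len(expert_preds[0])
    w = [1.0] * n_experts
    forecasts = []
    for preds, y in zip(expert_preds, outcomes):
        total = sum(w)
        # aggregate forecast before seeing the outcome
        forecasts.append(sum(wi * p for wi, p in zip(w, preds)) / total)
        # multiplicative update after the outcome is revealed
        w = [wi * math.exp(-eta * (p - y) ** 2) for wi, p in zip(w, preds)]
    return forecasts, w

# Two 'experts' forecasting renewable production; the first is consistently right
expert_preds = [(1.0, 0.0), (1.0, 0.2), (1.0, 0.1), (1.0, 0.3)]
outcomes = [1.0, 1.0, 1.0, 1.0]
forecasts, weights = exponential_weights(expert_preds, outcomes)
```

The guarantee behind such schemes is that cumulative loss approaches that of the best expert in hindsight — i.e. regret grows sublinearly — which is what makes them attractive for highly unpredictable renewable production.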

This nep-for issue is ©2012 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.