nep-for New Economics Papers
on Forecasting
Issue of 2015‒12‒08
nine papers chosen by
Rob J Hyndman
Monash University

  1. The role of component-wise boosting for regional economic forecasting By Lehmann, Robert; Wohlrabe, Klaus
  2. Macro-Driven VaR Forecasts: From Very High to Very-Low Frequency Data By Yves Dominicy; Harry-Paul Vander Elst
  3. GDP Nowcasting: Assessing business cycle conditions in Argentina By Laura D'Amato; Lorena Garegnani; Emilio Blanco
  4. Using Land-Use Modelling to Statistically Downscale Population Projections to Small Areas By Michael P. Cameron; William Cochrane
  5. Testing the Predictability of Consumption Growth: Evidence from China By Liping Gao; Hyeongwoo Kim
  6. Nowcasting Mexican GDP By Alberto Caruso
  7. Local Unit Root and Inflationary Inertia in Brazil By Wagner Piazza Gaglianone; Osmani Teixeira de Carvalho Guillén; Francisco Marcos Rodrigues Figueiredo
  8. Exploring the use of anonymized consumer credit information to estimate economic conditions: an application of big data By Wilshusen, Stephanie M.
  9. A Credibility Approach of the Makeham Mortality Law By Yahia Salhi; Pierre-Emmanuel Thérond; Julien Tomas

  1. By: Lehmann, Robert; Wohlrabe, Klaus
    Abstract: This paper applies component-wise boosting to regional economic forecasting. Using unique quarterly gross domestic product data for one German state over the period 1996 to 2013, combined with a large data set of 253 monthly indicators, we show how accurate forecasts obtained from component-wise boosting are relative to a simple benchmark model. We additionally take a closer look at the algorithm and evaluate whether a stable pattern of selected indicators exists over time and across four different forecasting horizons. All in all, boosting is a viable method for forecasting regional GDP, especially one and two quarters ahead. We also find that regional survey results, indicators that mirror the Saxon economy, and the OECD Composite Leading Indicators are frequently selected by the algorithm.
    Keywords: boosting; regional economic forecasting; gross domestic product
    JEL: C53 E17 E37 R11
    Date: 2015–12–03
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:68186&r=for
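The component-wise boosting algorithm the abstract refers to can be sketched briefly: at every step, each candidate indicator is fitted to the current residuals by univariate least squares, and only the best-fitting one receives a shrunken update. The data, dimensions and tuning values below are invented for illustration and are not the paper's.

```python
import numpy as np

def componentwise_boost(X, y, n_steps=100, nu=0.1):
    """L2-boosting with componentwise linear base learners: at each step,
    fit each column to the current residuals by univariate least squares
    and update only the single best-fitting component, shrunk by nu."""
    n, p = X.shape
    intercept = y.mean()
    resid = y - intercept
    coef = np.zeros(p)
    for _ in range(n_steps):
        best_j, best_b, best_sse = 0, 0.0, np.inf
        for j in range(p):
            xj = X[:, j]
            b = (xj @ resid) / (xj @ xj)        # univariate OLS slope
            sse = np.sum((resid - b * xj) ** 2)
            if sse < best_sse:
                best_j, best_b, best_sse = j, b, sse
        coef[best_j] += nu * best_b              # shrunken update
        resid = resid - nu * best_b * X[:, best_j]
    return intercept, coef

# Simulated "monthly indicators": only columns 2 and 7 actually matter.
rng = np.random.default_rng(0)
X = rng.standard_normal((80, 10))
y = 1.5 * X[:, 2] - 0.8 * X[:, 7] + 0.3 * rng.standard_normal(80)
b0, coef = componentwise_boost(X, y)
selected = np.flatnonzero(np.abs(coef) > 1e-8)   # indicators ever picked
```

Because only one component moves per step, the set of indicators ever selected doubles as a variable-selection device, which is what lets the authors track selection patterns over time and across horizons.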
  2. By: Yves Dominicy; Harry-Paul Vander Elst
    Abstract: This paper studies in some detail the joint use of high-frequency data and economic variables to model financial returns and volatility. We extend the Realized LGARCH model by allowing for a time-varying intercept, which responds to changes in macroeconomic variables in a MIDAS framework and allows macroeconomic information to be included directly in the estimation and forecast procedure. Using more than 10 years of high-frequency transactions for 55 U.S. stocks, we argue that the combination of low-frequency exogenous economic indicators with high-frequency financial data improves our ability to forecast the volatility of returns, their full multi-step-ahead conditional distribution and the multi-period Value-at-Risk. We document that nominal corporate profits and term spreads generate accurate risk measure forecasts at horizons beyond two business weeks.
    Keywords: realized LGARCH; value-at-risk; density forecasts; realized measures of volatility
    JEL: C22 C53
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2013/220550&r=for
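To fix ideas, a deliberately simplified sketch of the mechanism the abstract describes: a volatility recursion whose intercept shifts with a low-frequency macro variable, feeding a one-step Gaussian Value-at-Risk. This is a toy with invented parameters, not the Realized LGARCH/MIDAS specification estimated in the paper.

```python
import math
import statistics

def simulate_log_vol(macro, beta=0.7, gamma=0.5, base=-0.5):
    """log sigma^2_t = (base + gamma * macro_t) + beta * log sigma^2_{t-1}."""
    log_s2 = base / (1.0 - beta)           # start at the unconditional level
    sigmas = []
    for m in macro:
        log_s2 = (base + gamma * m) + beta * log_s2
        sigmas.append(math.exp(0.5 * log_s2))
    return sigmas

def gaussian_var(sigma, alpha=0.01):
    """One-step VaR_alpha = -sigma * z_alpha for zero-mean Gaussian returns."""
    z = statistics.NormalDist().inv_cdf(alpha)   # about -2.33 at the 1% level
    return -sigma * z

macro = [0.0, 0.0, 1.0, 1.0, 0.0]           # a macro indicator stepping up
sigmas = simulate_log_vol(macro)
var99 = [gaussian_var(s) for s in sigmas]   # VaR rises with the macro shift
```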
  3. By: Laura D'Amato (Central Bank of Argentina); Lorena Garegnani (Central Bank of Argentina); Emilio Blanco (Central Bank of Argentina)
    Abstract: Having a correct assessment of current business cycle conditions is one of the major challenges for monetary policy conduct. Given that GDP figures become available with a significant delay, central banks are increasingly using nowcasting as a tool for obtaining an immediate picture of economic conditions. We therefore develop a GDP growth nowcasting exercise using two approaches: bridge equations and a dynamic factor model. Both outperform a typical AR(1) benchmark in terms of forecasting accuracy. Moreover, the factor model outperforms the nowcast based on bridge equations. Following Giacomini and White (2004) we confirm that these differences are statistically significant.
    Keywords: bridge equations, dynamic factor models, nowcasting
    JEL: C22 C53 E37
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:bcr:wpaper:201569&r=for
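A bridge equation in its simplest form regresses quarterly GDP growth on monthly indicators aggregated to quarterly frequency; the sketch below sets one up next to an AR(1) benchmark on simulated data. All series and coefficients are invented, not Argentine data.

```python
import numpy as np

def ols_fit(x, y):
    """OLS with an intercept; returns (intercept, slope)."""
    X = np.column_stack([np.ones(len(y)), x])
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Simulated data: GDP growth driven by the quarterly average of a monthly
# indicator, so the bridge equation carries real information while the
# AR(1) benchmark has almost none.
rng = np.random.default_rng(1)
T = 60                                        # quarters
monthly = rng.standard_normal((T, 3))         # 3 monthly readings per quarter
x_q = monthly.mean(axis=1)                    # bridge step: aggregate to quarterly
gdp = 0.5 + 0.9 * x_q + 0.2 * rng.standard_normal(T)

# Bridge equation: gdp_t = a + b * x_q,t (estimated excluding the last quarter)
a, b = ols_fit(x_q[:-1], gdp[:-1])
bridge_nowcast = a + b * x_q[-1]

# AR(1) benchmark: gdp_t = c + rho * gdp_{t-1}
c, rho = ols_fit(gdp[:-2], gdp[1:-1])
ar_nowcast = c + rho * gdp[-2]
```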
  4. By: Michael P. Cameron (University of Waikato); William Cochrane (University of Waikato)
    Abstract: Local government planners, property developers, large businesses and other stakeholders typically require good quality projections of the spatial distribution of the future population at the small-area level. Many approaches are available to project future populations, but all suffer from limitations due to their strict underlying assumptions or limited availability of data. In this paper we apply a novel approach to small-area population projection that combines cohort-component projections at the district level with grid-based land use projections at a fine (four-hectare) geographical scale. In our approach, residential population is directly estimated in the land use model, while a separate statistical model is used to link non-residential population to non-residential land use (by type). The model can then be used to project future small-area populations using projections of future land use from the land use model. We compare four data and model specifications for the statistical modelling, using either absolute land use area or principal components as explanatory variables, and using either OLS or Spatial Durbin model specifications. All four model combinations perform reasonably well for the Waikato Region of New Zealand, with good in-sample (2006) and out-of-sample (2013) properties. However, a naïve model based on constant shares of growth outperforms all four of our models in terms of forecast accuracy and bias. Notwithstanding the underperformance relative to a naïve model, our results suggest that land use modelling may still be useful, because the model is understandable by local authority planners and elected officials, and generates greater stakeholder ‘buy-in’ than black-box or naïve approaches.
    Keywords: population projections; small-area projections; forecasting; land use
    JEL: C53 J11 Q56 R23 R52
    Date: 2015–11–30
    URL: http://d.repec.org/n?u=RePEc:wai:econwp:15/12&r=for
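The naïve constant-shares-of-growth benchmark mentioned in the abstract is easy to state: each small area is assigned its historical share of district-level growth. A minimal sketch with invented numbers:

```python
# Small-area populations in two observed years (invented numbers).
pop_2006 = {"area_a": 3800, "area_b": 2400, "area_c": 3300}
pop_2013 = {"area_a": 4000, "area_b": 2500, "area_c": 3500}
district_growth_future = 1000    # projected district-level growth (exogenous)

# Each area keeps its observed share of past district growth.
past_growth = {k: pop_2013[k] - pop_2006[k] for k in pop_2013}
total_growth = sum(past_growth.values())
growth_shares = {k: g / total_growth for k, g in past_growth.items()}

projection = {k: pop_2013[k] + growth_shares[k] * district_growth_future
              for k in pop_2013}
```

The appeal of the land-use alternative, as the authors note, is not raw accuracy but that a model with an interpretable mechanism earns more stakeholder trust than an extrapolation like this one.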
  5. By: Liping Gao; Hyeongwoo Kim
    Abstract: Chow (1985, 2010, 2011) reports indirect evidence in favor of the permanent income hypothesis (PIH) using time series observations from China. We revisit this issue by evaluating direct measures of the predictability of consumption growth in China during the post-economic-reform regime (1978-2009), as well as in the postwar US data for comparison. Our in-sample analysis provides strong evidence against the PIH for both countries. Out-of-sample forecast exercises show that consumption changes are highly predictable, which sharply contrasts with the implications of Chow (1985, 2010, 2011).
    Keywords: Permanent Income Hypothesis; Consumption; Diebold-Mariano-West Statistic
    JEL: E21 E27
    Date: 2015–12
    URL: http://d.repec.org/n?u=RePEc:abn:wpaper:auwp2015-19&r=for
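The Diebold-Mariano-type comparison named in the keywords tests whether two forecast error series differ in expected loss. A bare-bones version under squared-error loss is shown below; the error series are invented so that the second model is clearly better, and a serious application would use a HAC variance estimator, which this sketch omits.

```python
import math
import statistics

def dm_statistic(e1, e2):
    """Diebold-Mariano statistic under squared-error loss: the mean loss
    differential divided by its standard error (no HAC correction here)."""
    d = [a * a - b * b for a, b in zip(e1, e2)]
    n = len(d)
    d_bar = sum(d) / n
    var_d = statistics.variance(d)
    return d_bar / math.sqrt(var_d / n)

# Invented forecast errors: the second model's errors are uniformly smaller,
# so the statistic should be large and positive.
e1 = [0.9, -1.1, 1.0, -0.8, 1.2, -1.0, 0.9, -1.1]
e2 = [0.3, -0.2, 0.25, -0.3, 0.2, -0.25, 0.3, -0.2]
stat = dm_statistic(e1, e2)
```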
  6. By: Alberto Caruso
    Abstract: In this paper I study the flow of conjunctural data relevant to assess the state of the Mexican economy. I reconstruct the flow of releases that are most frequently monitored by market participants, economic commentators and policy makers. Given the close linkages with the US economy, I take into account both US and Mexican data. Following the literature on nowcasting, I jointly analyse these data in a model that is continuously updated as new data get released. The model can be used to assess the current macroeconomic conditions (predicting the present) of the Mexican economy and to evaluate the importance of each macroeconomic data release. I find that the model produces forecasts whose accuracy is similar to that of institutional and judgemental forecasts, and I document the importance of considering US data.
    Date: 2015–10
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2013/219871&r=for
  7. By: Wagner Piazza Gaglianone; Osmani Teixeira de Carvalho Guillén; Francisco Marcos Rodrigues Figueiredo
    Abstract: In this paper, we study the persistence of Brazilian inflation using quantile regression techniques. To characterize the inflation dynamics we employ the Quantile Autoregression (QAR) model of Koenker and Xiao (2004, 2006), in which the autoregressive coefficient may take different values in distinct quantiles, allowing one to test the asymmetry hypothesis for the inflation dynamics. Furthermore, the model allows investigating the existence of local unit root behavior, with episodes of mean reversion sufficient to ensure stationarity. In other words, the model enables one to identify locally unsustainable dynamics that are nonetheless compatible with global stationarity, and it can be reformulated in a more conventional random coefficient notation to reveal the periods of local non-stationarity. Another advantage of this technique is the estimation method, which does not require knowledge of the innovation process distribution, making the approach robust against poorly specified models. An empirical exercise with Brazilian inflation data and its components illustrates the methodology. As expected, the behavior of inflation dynamics is not uniform across different conditional quantiles. In particular, the results can be summarized as follows: (i) the dynamics is stationary for most quantiles; (ii) the process is non-stationary in the upper tail of the conditional distribution; (iii) the periods associated with locally unsustainable dynamics can be related to those of increased risk aversion and higher inflation expectations; and (iv) out-of-sample forecasting exercises show that the QAR model at the median quantile level can exhibit, in some cases, lower mean squared error (MSE) than random walk and AR forecasts.
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:bcb:wpaper:406&r=for
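The QAR idea can be illustrated with a minimal estimator: the q-th conditional quantile of an AR(1) is fitted by minimizing the check loss, here via a crude grid search rather than the linear-programming methods used in practice. The simulated series and the grid are invented for illustration.

```python
import random

def check_loss(u, q):
    """Koenker's check (pinball) loss rho_q(u) = u * (q - 1{u < 0})."""
    return u * (q - (1.0 if u < 0 else 0.0))

def fit_qar1(y, q, grid):
    """Fit the q-th conditional quantile of y_t = a + b * y_{t-1} by a
    crude grid search over (a, b) minimizing the summed check loss."""
    best_a, best_b, best_loss = 0.0, 0.0, float("inf")
    for a in grid:
        for b in grid:
            loss = sum(check_loss(y[t] - a - b * y[t - 1], q)
                       for t in range(1, len(y)))
            if loss < best_loss:
                best_a, best_b, best_loss = a, b, loss
    return best_a, best_b

# Simulated AR(1) series with slope 0.5; at the median the QAR slope should
# recover roughly that value, while other quantiles may differ.
random.seed(0)
y = [0.0]
for _ in range(400):
    y.append(0.5 * y[-1] + random.gauss(0.0, 1.0))

grid = [i / 20 for i in range(-20, 21)]      # -1.00, -0.95, ..., 1.00
a_med, b_med = fit_qar1(y, 0.5, grid)
```

Refitting across a range of q values and checking whether the slope approaches one in the upper quantiles is the grid-search analogue of the paper's local-unit-root diagnosis.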
  8. By: Wilshusen, Stephanie M. (Federal Reserve Bank of Philadelphia)
    Abstract: The emergence of high-frequency administrative data and other big data offers an opportunity to improve economic forecasting models. This paper considers the potential advantages and limitations of using information contained in anonymized consumer credit reports to improve estimates of current and future economic conditions for various geographic areas and demographic markets. Aggregate consumer credit information is found to be correlated with macroeconomic variables such as gross domestic product, retail sales, and employment, and can serve as a leading indicator: lagged values of consumer credit variables improve the accuracy of forecasts of these macro variables.
    Keywords: Consumer credit information; Administrative data; Big data; Real-time data; Nowcasting; Forecasting
    JEL: C53 D12 D14
    Date: 2015–11–06
    URL: http://d.repec.org/n?u=RePEc:fip:fedpdp:15-05&r=for
  9. By: Yahia Salhi (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1); Pierre-Emmanuel Thérond (Galea & Associés, SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1); Julien Tomas (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1)
    Abstract: The present article illustrates a credibility approach to mortality. Interest among life insurers in assessing the mortality risk of their portfolios has increased considerably. The new regulation and norms, Solvency II, highlight the need for life tables that best reflect the experience of insured portfolios in order to quantify the underlying mortality risk reliably. In this context, and following the work of Bühlmann and Gisler (2005) and Hardy and Panjer (1998), we propose a credibility approach which consists in revising, as new observations arrive, the assumption on the mortality curve. Unlike the methodology considered in Hardy and Panjer (1998), which consists in updating the aggregate deaths, we choose to add an age structure on these deaths. Formally, we use a Makeham graduation model. Such an adjustment adds structure to the mortality pattern, which is useful when portfolios are of limited size, so as to ensure a good representation over the entire age band considered. We investigate the divergences in the mortality forecasts generated by classical credibility approaches to mortality, including Hardy and Panjer (1998) and the Poisson-Gamma model, on portfolios originating from various French insurance companies.
    Keywords: Credibility; Makeham law; Mortality; Life insurance; Graduation; Extrapolation
    Date: 2015–11–24
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-01232683&r=for
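The Makeham law underlying the graduation model is a concrete formula, mu(x) = A + B * c^x, and integrating the hazard gives the survival function in closed form. The parameter values below are illustrative, not calibrated to any insured portfolio.

```python
import math

def makeham_mu(x, A=0.0005, B=0.00003, c=1.1):
    """Makeham force of mortality mu(x) = A + B * c**x: an age-independent
    accident term plus an exponentially increasing senescent term."""
    return A + B * c ** x

def makeham_survival(x, A=0.0005, B=0.00003, c=1.1):
    """S(x) = exp(-A*x - B*(c**x - 1)/ln(c)), from integrating the hazard."""
    return math.exp(-A * x - B * (c ** x - 1.0) / math.log(c))

mu_40 = makeham_mu(40)      # modest hazard in middle age
mu_80 = makeham_mu(80)      # dominated by the exponential term
s_80 = makeham_survival(80)
```

Graduating crude portfolio death rates then amounts to choosing (A, B, c) so the fitted curve tracks the observed rates, which is the age structure the authors impose before applying credibility updates.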

This nep-for issue is ©2015 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.