nep-for New Economics Papers
on Forecasting
Issue of 2006‒11‒04
five papers chosen by
Rob J Hyndman
Monash University

  1. Forecasting measures of inflation for the Estonian economy By Agostino Consolo
  2. Forecasting Food Price Inflation in Developing Countries with Inflation Targeting Regimes: the Colombian Case By Eliana González; Miguel I. Gómez; Luis F. Melo; José Luis Torres
  3. Comparing alternative predictors based on large-panel factor models By Antonello D'Agostino; Domenico Giannone
  4. Overconfidence in Forecasts of Own Performance: An Experimental Study By Jeremy Clark; Lana Friesen
  5. Non mean reverting affine processes for stochastic mortality By Elisa Luciano; Elena Vigna

  1. By: Agostino Consolo
    Abstract: The aim of this paper is to forecast some of the most important measures of inflation for the Estonian economy using linear and non-linear models. Results from comparing classes of optimal models are similar to those in the forecasting literature. In particular, there are gains from using more sophisticated methods such as factor analysis and time-varying-parameter methods. Model discrimination is based on evaluation criteria computed by a real-time dynamic estimation procedure. Moreover, forecast uncertainty is explicitly taken into account: fan charts summarise the full distribution of the out-of-sample forecasts.
    Keywords: Estonian Economy, forecasting, inflation modelling
    JEL: C22 C32 C53 E31
    Date: 2006–10–10
    URL: http://d.repec.org/n?u=RePEc:eea:boewps:wp2006-03&r=for
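The fan charts mentioned in the abstract above are, in essence, quantile bands of a simulated forecast distribution. A minimal sketch, assuming a toy AR(1) inflation model with made-up parameters (not the paper's model or data):

```python
# Illustrative sketch: fan-chart bands from simulated forecast paths of an
# assumed AR(1) inflation model (parameters and starting value are invented).
import numpy as np

rng = np.random.default_rng(0)

phi, const, sigma = 0.8, 0.5, 0.3   # hypothetical AR(1) coefficients
last_obs = 4.0                      # last observed inflation rate (made up)

horizons, n_paths = 12, 5000
paths = np.empty((n_paths, horizons))
prev = np.full(n_paths, last_obs)
for h in range(horizons):
    prev = const + phi * prev + rng.normal(0.0, sigma, size=n_paths)
    paths[:, h] = prev

# Fan-chart bands: quantiles of the simulated distribution at each horizon.
bands = np.percentile(paths, [5, 25, 50, 75, 95], axis=0)
print(bands.round(2))
```

Plotting the rows of `bands` against the forecast horizon gives the familiar widening fan.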
  2. By: Eliana González; Miguel I. Gómez; Luis F. Melo; José Luis Torres
    Abstract: Many developing countries are adopting inflation targeting regimes to guide monetary policy decisions. In such countries the share of food in the consumption basket is high and policy makers often employ total inflation (as opposed to core inflation) to set inflationary targets. Therefore, central banks need to develop reliable models to forecast food inflation. Our literature review suggests that little has been done in the construction of models to forecast short-run food inflation in developing countries. We develop a model to improve short-run food inflation forecasts in Colombia. The model disaggregates food items according to economic theory and employs Flexible Least Squares given the presence of structural changes in the inflation series. We compare the performance of this new model to current models employed by the central bank. Next, we apply econometric methods to combine forecasts from alternative models and test whether such combination outperforms individual models. Our results indicate that forecasts can be improved by classifying food basket items according to unprocessed, processed and food away from home and by employing forecast combination techniques.
    Date: 2006–10–20
    URL: http://d.repec.org/n?u=RePEc:col:001043:002681&r=for
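Forecast combination of the kind tested in the paper above can be illustrated with a simple inverse-MSE weighting scheme; the weighting rule and all numbers below are assumptions for illustration, not the authors' specification:

```python
# Illustrative sketch: combine competing food-inflation forecasts with
# inverse-MSE weights estimated on a hold-out window (all data invented).
import numpy as np

def combine_forecasts(holdout_actual, holdout_fcasts, new_fcasts):
    """holdout_fcasts, new_fcasts: arrays with one column per model."""
    errors = holdout_fcasts - holdout_actual[:, None]
    mse = (errors ** 2).mean(axis=0)
    weights = (1.0 / mse) / (1.0 / mse).sum()   # inverse-MSE weights
    return new_fcasts @ weights, weights

actual = np.array([5.1, 4.8, 5.3, 5.0])          # hold-out food inflation
hold = np.array([[5.0, 5.4, 4.7],                # three models' hold-out forecasts
                 [4.9, 5.1, 4.6],
                 [5.2, 5.6, 5.0],
                 [5.1, 5.3, 4.8]])
new = np.array([5.0, 5.5, 4.9])                  # the models' next-period forecasts
combined, w = combine_forecasts(actual, hold, new)
print(w.round(2), round(float(combined), 2))
```

Equal weights or regression-based weights are common alternatives; the paper tests whether such econometric combinations outperform the individual models.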
  3. By: Antonello D'Agostino (Central Bank and Financial Services Authority of Ireland - Economic Analysis and Research Department, PO Box 559 - Dame Street, Dublin 2, Ireland.); Domenico Giannone (ECARES, Université Libre de Bruxelles - CP 114 - av. Jeanne, 44, B-1050, Brussels, Belgium.)
    Abstract: This paper compares the predictive ability of the factor models of Stock and Watson (2002) and Forni, Hallin, Lippi, and Reichlin (2005) using a "large" panel of US macroeconomic variables. We propose a nesting procedure of comparison that clarifies and partially overturns the results of similar exercises in the literature. As in Stock and Watson (2002), we find that efficiency improvements due to the weighting of the idiosyncratic components do not lead to significantly more accurate forecasts. In contrast to Boivin and Ng (2005), we show that the dynamic restrictions imposed by the procedure of Forni, Hallin, Lippi, and Reichlin (2005) are not harmful for predictability. Our main conclusion is that for the dataset at hand the two methods have a similar performance and produce highly collinear forecasts. JEL Classification: C31, C52, C53.
    Keywords: Factor Models, Forecasting, Large Cross-Section.
    Date: 2006–10
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20060680&r=for
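The Stock and Watson (2002) approach compared in this paper is a diffusion-index forecast: estimate common factors from the large panel by principal components, then regress the target on the estimated factors. A minimal sketch on simulated data (the panel, the target, and the number of factors are all assumptions):

```python
# Illustrative sketch of a diffusion-index (principal-components factor) forecast
# on a simulated panel; not the paper's data or exact procedure.
import numpy as np

rng = np.random.default_rng(1)
T, N, r = 200, 100, 3                      # sample size, panel width, factors

# Simulate a factor panel: X = F * Lambda' + noise, and a target driven by F.
F = rng.normal(size=(T, r))
Lam = rng.normal(size=(N, r))
X = F @ Lam.T + rng.normal(scale=0.5, size=(T, N))
y = F @ np.array([1.0, -0.5, 0.3]) + rng.normal(scale=0.2, size=T)

# Estimate factors as the first r principal components of the standardized panel.
Z = (X - X.mean(0)) / X.std(0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
F_hat = Z @ Vt[:r].T

# One-step forecast: OLS of y(t+1) on factors at t, evaluated at the last observation.
beta, *_ = np.linalg.lstsq(F_hat[:-1], y[1:], rcond=None)
forecast = F_hat[-1] @ beta
print(round(float(forecast), 3))
```

The Forni, Hallin, Lippi, and Reichlin (2005) alternative estimates the factor space with dynamic (frequency-domain) principal components; the paper's finding is that, on its dataset, the two approaches perform similarly and produce highly collinear forecasts.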
  4. By: Jeremy Clark (University of Canterbury); Lana Friesen
    Abstract: Overconfidence can have important economic consequences, but has received little direct testing within the discipline. We test for overconfidence in forecasts of own absolute or relative performance in two unfamiliar experimental tasks. Given their choice of effort at the tasks, participants have incentives to forecast accurately, and have opportunities for feedback, learning and revision. Forecast accuracy is evaluated at both the aggregate level, and at the individual level using realized outcomes. We find very limited evidence of overconfidence, with zero mean error or under-confidence more prevalent. Under-confidence is greatest in tasks with absolute rather than relative win criteria, often among subjects using greater or "smarter" effort.
    Keywords: Overconfidence; forecast errors; self-assessment
    JEL: C91 D83 D84 J24
    Date: 2006–03–01
    URL: http://d.repec.org/n?u=RePEc:cbt:econwp:06/09&r=for
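Over- or under-confidence in such experiments can be summarised as the mean gap between a participant's forecast of her own performance and the realized outcome. A toy sketch with invented numbers (the paper's elicitation and incentive design is richer than this):

```python
# Illustrative sketch: mean forecast error of self-assessed versus realized scores.
# Positive mean error indicates overconfidence, negative indicates under-confidence.
import numpy as np

forecast_score = np.array([7, 5, 9, 6, 8], dtype=float)   # hypothetical self-forecasts
realized_score = np.array([8, 6, 9, 7, 7], dtype=float)   # hypothetical realized scores

errors = forecast_score - realized_score
mean_error = errors.mean()
label = ("overconfidence" if mean_error > 0
         else "under-confidence" if mean_error < 0
         else "well calibrated")
print(round(float(mean_error), 2), label)
```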
  5. By: Elisa Luciano; Elena Vigna
    Abstract: In this paper we use doubly stochastic processes (or Cox processes) in order to model the random evolution of mortality of an individual. These processes have been widely used in the credit risk literature in modelling default arrival, and in this context have proved to be quite flexible, especially when the intensity process is of the affine class. We investigate the applicability of time-homogeneous affine processes in describing the individual's intensity of mortality and the mortality trend, as well as in forecasting it. We calibrate them to the UK population. Calibrations suggest that, in spite of their popularity in the financial context, mean reverting time-homogeneous processes are less suitable for describing the death intensity of individuals than non mean reverting processes. Among the latter, affine processes whose deterministic part increases exponentially seem to be appropriate. They are natural generalizations of the Gompertz law. Stress analysis and analytical results indicate that increasing the randomness of the intensity process for a given cohort results in improvements in survivorship. Mortality forecasts and their comparison with experienced mortality rates provide further encouraging evidence in favour of non mean reverting processes. The mortality trend is evidenced through the evolution over time of the parameters and through the intensity simulation for different generations.
    Keywords: doubly stochastic processes (Cox processes), affine processes, stochastic mortality, mortality forecasting.
    JEL: G22 J11
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:cca:wpaper:30&r=for
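In the doubly stochastic (Cox) framework used in this paper, the survival probability is the expectation of the exponentiated negative integrated intensity, S(t) = E[exp(-∫_0^t λ_s ds)]. A Monte Carlo sketch with an assumed non-mean-reverting affine specification dλ_t = a λ_t dt + σ dW_t, whose deterministic part λ_0 e^{at} is the Gompertz law; the parameters below are invented, not the paper's UK calibration:

```python
# Illustrative sketch: Monte Carlo survival probability from a non-mean-reverting
# affine intensity d(lambda) = a*lambda*dt + sigma*dW (assumed specification,
# invented parameters; not the authors' calibrated model).
import numpy as np

rng = np.random.default_rng(2)

a, sigma, lam0 = 0.09, 0.0005, 0.01      # hypothetical drift, volatility, initial intensity
horizon, dt, n_paths = 40, 1 / 12, 20000 # 40 years, monthly steps
n_steps = int(horizon / dt)

lam = np.full(n_paths, lam0)
integral = np.zeros(n_paths)             # cumulative hazard along each path
for _ in range(n_steps):
    integral += lam * dt
    lam = lam + a * lam * dt + sigma * np.sqrt(dt) * rng.normal(size=n_paths)
    lam = np.maximum(lam, 0.0)           # keep the intensity non-negative

# Doubly stochastic framework: S(t) = E[exp(-integrated intensity)].
survival_40y = np.exp(-integral).mean()
print(round(float(survival_40y), 4))
```

With sigma set to zero the simulation collapses to the deterministic Gompertz survival curve, which is the sense in which these processes generalize the Gompertz law.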

This nep-for issue is ©2006 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.