nep-for New Economics Papers
on Forecasting
Issue of 2010‒01‒16
eighteen papers chosen by
Rob J Hyndman
Monash University

  1. The Multivariate k-Nearest Neighbor Model for Dependent Variables : One-Sided Estimation and Forecasting By Dominique Guegan; Patrick Rakotomarolahy
  2. Forecasting the US Real House Price Index: Structural and Non-Structural Models with and without Fundamentals By Rangan Gupta; Alain Kabundi; Stephen M. Miller
  3. Forecasting with nonlinear time series models By Anders Bredahl Kock; Timo Teräsvirta
  4. Modelling and forecasting volatility of East Asian Newly Industrialized Countries and Japan stock markets with non-linear models By Guidi, Francesco
  5. Forecast disagreement among FOMC members By Chanont Banternghansa; Michael W. McCracken
  6. Have Structural Changes Eliminated the Out-of-Sample Ability of Financial Variables To Forecast Real Activity After the Mid-1980s? Evidence From the Canadian Economy. By Akhter Faroque; William Veloce; Jean-Francois Lamarche
  7. On the Realized Volatility of the ECX CO2 Emissions 2008 Futures Contract: Distribution, Dynamics and Forecasting By Julien Chevallier; Benoît Sévi
  8. Forecasting New Zealand's economic growth using yield curve information By Leo Krippner; Leif Anders Thorsrud
  9. Measuring consumer uncertainty about future inflation By Wandi Bruine de Bruin; Charles F. Manski; Giorgio Topa; Wilbert van der Klaauw
  10. Too Many Cooks? The German Joint Diagnosis and Its Production By Ulrich Fritsche; Ullrich Heilemann
  11. Consumption and real exchange rates in professional forecasts By Michael B Devereux; Gregor W Smith; James Yetman
  12. The Camp View of Inflation Forecasts By Felix Geiger; Oliver Sauter; Kai D. Schmid
  13. The Role of Monetary Aggregates in the Policy Analysis of the Swiss National Bank By Gebhard Kirchgässner; Jürgen Wolters
  14. Estimating DSGE-Model-Consistent Trends for Use in Forecasting By Jean-Philippe Cayen; Marc-André Gosselin; Sharon Kozicki
  15. Bankruptcy Prediction: A Comparison of Some Statistical and Machine Learning Techniques By Tonatiuh Peña; Serafín Martínez; Bolanle Abudu
  16. The Predictive Power of Conditional Models: What Lessons to Draw with Financial Crisis in the Case of Pre-Emerging Capital Markets? By El Bouhadi, Abdelhamid; Achibane, Khalid
  17. Tolls, Exchange Rates, and Borderplex International Bridge Traffic By De Leon, Marycruz; Fullerton, Thomas M., Jr.; Kelley, Brian W.
  18. Real time underlying inflation gauges for monetary policymakers By Marlene Amstad; Simon Potter

  1. By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris); Patrick Rakotomarolahy (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I)
    Abstract: This article gives the asymptotic properties of multivariate k-nearest neighbor regression estimators for dependent variables belonging to Rd, d > 1. The results derived here permit consistent forecasts and confidence intervals for time series. An illustration of the method is given through the estimation of economic indicators used to compute the GDP with the bridge equations. An empirical forecast accuracy comparison is provided by comparing this non-parametric method with a parametric one based on ARIMA modelling, which we consider as a benchmark because it is still often used in central banks to nowcast and forecast the GDP.
    Keywords: Multivariate k-nearest neighbor, asymptotic normality of the regression, mixing time series, confidence intervals, forecasts, economic indicators, euro area.
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-00423871_v2&r=for
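The k-nearest-neighbour forecasting idea underlying this paper can be sketched in a few lines: embed the series into lag vectors, find the k historical vectors closest to the most recent one, and average their successors. A minimal univariate sketch (the paper's setting is multivariate and comes with asymptotic theory; the function and parameter names here are illustrative):

```python
import math

def knn_forecast(series, k=3, lag=2):
    """One-step-ahead forecast via k-nearest-neighbour regression.

    Embeds the series into lag-vectors, finds the k historical vectors
    closest to the most recent one, and averages their successors.
    A minimal sketch of the general idea, not the paper's estimator.
    """
    # Build (lag-vector, next value) pairs from the history
    pairs = [(series[i:i + lag], series[i + lag])
             for i in range(len(series) - lag)]
    query = series[-lag:]  # most recent lag-vector
    # Rank historical vectors by Euclidean distance to the query
    nearest = sorted(pairs, key=lambda p: math.dist(p[0], query))[:k]
    # Forecast = mean of the successors of the k nearest neighbours
    return sum(succ for _, succ in nearest) / k
```

On a periodic series the neighbours of the current pattern are its earlier repetitions, so the forecast reproduces the cycle.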
  2. By: Rangan Gupta (University of Pretoria); Alain Kabundi (University of Johannesburg); Stephen M. Miller (University of Connecticut and University of Nevada, Las Vegas)
    Abstract: We employ a 10-variable dynamic structural general equilibrium model to forecast the US real house price index as well as its turning point in 2006:Q2. We also examine various Bayesian and classical time-series models in our forecasting exercise to compare to the dynamic stochastic general equilibrium model, estimated using Bayesian methods. In addition to standard vector-autoregressive and Bayesian vector autoregressive models, we also include the information content of either 10 or 120 quarterly series in some models to capture the influence of fundamentals. We consider two approaches for including information from large data sets -- extracting common factors (principal components) in Factor-Augmented Vector Autoregressive or Factor-Augmented Bayesian Vector Autoregressive models, or Bayesian shrinkage in a large-scale Bayesian Vector Autoregressive model. We compare the out-of-sample forecast performance of the alternative models, using the average root mean squared error for the forecasts. We find that the small-scale Bayesian-shrinkage model (10 variables) outperforms the other models, including the large-scale Bayesian-shrinkage model (120 variables). Finally, we use each model to forecast the turning point in 2006:Q2, using the estimated model through 2005:Q2. Only the dynamic stochastic general equilibrium model actually forecasts a turning point with any accuracy, suggesting that attention to developing forward-looking microfounded dynamic stochastic general equilibrium models of the housing market, over and above fundamentals, proves crucial in forecasting turning points.
    Keywords: US House prices, Forecasting, DSGE models, Factor Augmented Models, Large-Scale BVAR models
    JEL: C32 R31
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:uct:uconnp:2009-42&r=for
  3. By: Anders Bredahl Kock (CREATES, Aarhus University); Timo Teräsvirta (CREATES, Aarhus University)
    Abstract: In this paper, nonlinear models are restricted to mean nonlinear parametric models. Several such models popular in time series econometrics are presented and some of their properties discussed. This includes two models based on universal approximators: the Kolmogorov-Gabor polynomial model and two versions of a simple artificial neural network model. Techniques for generating multi-period forecasts from nonlinear models recursively are considered, and the direct (non-recursive) method for this purpose is mentioned as well. Forecasting with complex dynamic systems, albeit less frequently applied to economic forecasting problems, is briefly highlighted. A number of large published studies comparing macroeconomic forecasts obtained using different time series models are discussed, and the paper also contains a small simulation study comparing recursive and direct forecasts in a particular case where the data-generating process is a simple artificial neural network model. Suggestions for further reading conclude the paper.
    Keywords: forecast accuracy, Kolmogorov-Gabor, nearest neighbour, neural network, nonlinear regression
    JEL: C22 C45 C52 C53
    Date: 2010–01–01
    URL: http://d.repec.org/n?u=RePEc:aah:create:2010-01&r=for
  4. By: Guidi, Francesco
    Abstract: This paper explores the forecasting performance of several non-linear models, namely GARCH, EGARCH and APARCH, used with three distributions: the Gaussian normal, the Student-t and the Generalized Error Distribution (GED). In order to evaluate the performance of the competing models we use the standard loss functions, that is, the Root Mean Squared Error, Mean Absolute Error, Mean Absolute Percentage Error and the Theil Inequality Coefficient. Our results show that the asymmetric GARCH family models are generally the best for forecasting NICs indices. We also find that both the Root Mean Squared Error and Mean Absolute Error forecast statistics tend to choose models that were estimated assuming the normal distribution, while the two remaining forecast measures privilege models with Student-t and GED distributions.
    Keywords: GARCH; Volatility forecasting; forecast evaluation.
    JEL: G15 C22
    Date: 2010–01
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:19851&r=for
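The four loss functions named in the abstract are standard and easy to state. A generic sketch, not the paper's exact computation (the Theil coefficient here is the common U1 form):

```python
import math

def forecast_losses(actual, forecast):
    """Standard loss functions used to compare forecasting models.

    Returns RMSE, MAE, MAPE and the Theil inequality coefficient (U1).
    MAPE assumes all actual values are non-zero.
    """
    n = len(actual)
    errors = [a - f for a, f in zip(actual, forecast)]
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    mae = sum(abs(e) for e in errors) / n
    mape = sum(abs(e / a) for e, a in zip(errors, actual)) / n
    # Theil U1: RMSE scaled by the root mean squares of both series
    theil = rmse / (math.sqrt(sum(a * a for a in actual) / n)
                    + math.sqrt(sum(f * f for f in forecast) / n))
    return {"RMSE": rmse, "MAE": mae, "MAPE": mape, "Theil": theil}
```

A perfect forecast scores zero on all four measures; Theil U1 is bounded between 0 and 1, which makes it comparable across series of different scales.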
  5. By: Chanont Banternghansa; Michael W. McCracken
    Abstract: This paper presents empirical evidence on the disagreement among Federal Open Market Committee (FOMC) forecasts. In contrast to earlier studies that analyze the range of FOMC forecasts available in the Monetary Policy Report to the Congress, we analyze the forecasts made by each individual member of the FOMC from 1992 to 1998. This newly available dataset, while rich in detail, is short in duration. Even so, we are able to identify a handful of patterns in the forecasts related to i) forecast horizon; ii) whether the individual is a Federal Reserve Bank president, governor, and/or Vice Chairman; and iii) whether the individual is a voting member of the FOMC. Additional comparisons are made between forecasts made by the FOMC and the Survey of Professional Forecasters.
    Keywords: Federal Open Market Committee ; Monetary policy ; Economic forecasting
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:fip:fedlwp:2009-059&r=for
  6. By: Akhter Faroque (Department of Economics, Laurentian University); William Veloce (Department of Economics, Brock University); Jean-Francois Lamarche (Department of Economics, Brock University)
    Abstract: The paper evaluates the reliability of the information content of individual financial variables for Canada’s future output growth. We estimate the timing of structural changes in linear growth models and check robustness to specification changes, multiple breaks, and business cycle asymmetry. Our out-of-sample forecast evaluation using the MSE-F and the ENC-NEW tests shows that the leading information content of most financial variables has deteriorated after 1984:4, but the 1-3 year term spread exhibits a consistently reliable predictive ability at the 1 and 2 quarter horizons and has significant forecasting ability at the 8 quarter horizon. Also, real M1 money growth has regained its ability to forecast output growth since 1991:1.
    Keywords: Financial indicators, structural change, nested model forecasts, minimum MSFE, encompassing tests
    JEL: C52 C53 E44
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:brk:wpaper:0910&r=for
  7. By: Julien Chevallier (Imperial College London); Benoît Sévi (University of Angers (GRANEM) and LEMNA)
    Abstract: The recent implementation of the EU Emissions Trading Scheme (EU ETS) in January 2005 created new financial risks for emitting firms. To deal with these risks, options have been traded since October 2006. Because the EU ETS is a new market, the relevant underlying model for option pricing is still a controversial issue. This article improves our understanding of this issue by characterizing the conditional and unconditional distributions of the realized volatility for the 2008 futures contract on the European Climate Exchange (ECX), which is valid during Phase II (2008-2012) of the EU ETS. The realized volatility measures from naive, kernel-based and subsampling estimators are used to obtain inferences about the distributional and dynamic properties of the ECX emissions futures volatility. The distribution of the daily realized volatility in logarithmic form is shown to be close to normal. The mixture-of-distributions hypothesis is strongly rejected, as the returns standardized using daily measures of volatility clearly depart from normality. A simplified HAR-RV model (Corsi, 2009) with only a weekly component, which reproduces the long memory properties of the series, is then used to model the volatility dynamics. Finally, the predictive accuracy of the HAR-RV model is tested against GARCH specifications using one-step-ahead forecasts, which confirms the superior ability of the HAR-RV model. Our conclusions indicate that (i) the standard Brownian motion is not an adequate tool for option pricing in the EU ETS, and (ii) a jump component should be included in the stochastic process to price options, thus providing more efficient tools for risk-management activities.
    Keywords: CO2 Price, Realized Volatility, HAR-RV, GARCH, Futures Trading, Emissions Markets, EU ETS, Intraday data, Forecasting
    JEL: C5 G1 Q4
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:fem:femwpa:2009.113&r=for
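A simplified HAR-RV specification with only a weekly component, as described in the abstract, reduces to a two-parameter OLS regression of next-day realized volatility on its trailing five-day average. A sketch under that reading (the authors' exact estimator and data handling may differ):

```python
def har_rv_weekly(rv):
    """Fit a simplified HAR-RV model with only a weekly component:

        RV[t+1] = b0 + bw * mean(RV[t-4..t]) + e[t]

    by ordinary least squares on the weekly (5-day) average regressor.
    Returns the intercept, slope and a one-step-ahead forecast.
    """
    # Regressor: trailing 5-day average; target: next day's RV
    x = [sum(rv[t - 4:t + 1]) / 5 for t in range(4, len(rv) - 1)]
    y = [rv[t + 1] for t in range(4, len(rv) - 1)]
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    # Closed-form simple OLS estimates
    bw = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
          / sum((xi - xbar) ** 2 for xi in x))
    b0 = ybar - bw * xbar
    # One-step-ahead forecast from the most recent weekly average
    forecast = b0 + bw * (sum(rv[-5:]) / 5)
    return b0, bw, forecast
```

The full Corsi (2009) HAR-RV adds daily and monthly averages as extra regressors; the cascade of averages is what mimics long memory.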
  8. By: Leo Krippner; Leif Anders Thorsrud (Reserve Bank of New Zealand)
    Abstract: We forecast economic growth in New Zealand using yield curve data within simple statistical models; i.e. typical OLS relationships that have been well-established for other countries, and related VAR specifications. We find that the yield curve data has significant forecasting power in absolute terms and performs well relative to various benchmarks. Specifications including measures of the yield curve slope produce the best forecasts overall. Our results also highlight the benefits of fully exploiting the timeliness of yield curve information (i.e. it is always available and up to date).
    JEL: E43 E44 E47
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:nzb:nzbdps:2009/18&r=for
  9. By: Wandi Bruine de Bruin; Charles F. Manski; Giorgio Topa; Wilbert van der Klaauw
    Abstract: Survey measures of consumer inflation expectations have an important shortcoming in that, while providing useful summary measures of the distribution of point forecasts across individuals, they contain no direct information about an individual's uncertainty about future inflation. The latter is important not only for forecasting inflation and other macroeconomic outcomes, but also for assessing a central bank's credibility and effectiveness of communication. This paper explores the feasibility of eliciting individual consumers' subjective probability distributions of future inflation outcomes. In November 2007, we began administering web-based surveys to participants in RAND's American Life Panel. In addition to their point predictions, respondents were asked for their subjective assessments of the percentage chance that inflation will fall in each of several predetermined intervals. We find that our measures of individual forecast densities and uncertainty are internally consistent and reliable. Those who are more uncertain about year-ahead price inflation are also generally more uncertain about longer-term price inflation and future wage changes. We also find that participants expressing higher uncertainty in their density forecasts make larger revisions to their point forecasts over time. Measures of central tendency derived from individual density forecasts are highly correlated with point forecasts, but they usually differ, often substantially, at the individual level. Finally, we relate our direct measure of aggregate consumer uncertainty to a more conventional approach that uses disagreement among individual forecasters, as seen in the dispersion of their point forecasts, as a proxy for forecast uncertainty. Although the two measures are positively correlated, our results suggest that disagreement and uncertainty are distinct concepts, both relevant to the analysis of inflation expectations.
    Keywords: Consumer surveys ; Inflation (Finance) ; Economic forecasting
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:fip:fednsr:415&r=for
  10. By: Ulrich Fritsche (Department for Socioeconomics, University of Hamburg); Ullrich Heilemann (Faculty of Economics, University of Leipzig)
    Abstract: The “Gemeinschaftsdiagnose” [Joint Diagnosis (JD)] is the most influential semi-annual macroeconomic forecast in Germany. Jointly produced by up to six institutes, its accuracy as well as the large number of participants involved is often criticised. This study examines the JD’s growth and inflation forecasts from 1970 to 2007, including most of the forecasts submitted by the five institutes at the start of the JD. Four questions are addressed: (i) Are these forecasts unbiased and efficient? (ii) How do results change if we presume an asymmetric loss function? (iii) Are any of the institutes more accurate than the JD? Are five/six institutes necessary, and at what cost? (iv) Do the institutes make strategic forecasts to influence the JD forecast? Results show that there is no strong evidence of bias or inefficiency in the institutes’ forecasts and no evidence of asymmetric loss functions. Five institutes are not necessary, but it is very hard to predict which institutes are redundant; however, the loss of accuracy from employing only two is small.
    Keywords: Forecast accuracy, joint forecasts, strategic forecast behaviour
    JEL: E37 C53 D72
    Date: 2010–01
    URL: http://d.repec.org/n?u=RePEc:hep:macppr:201001&r=for
  11. By: Michael B Devereux; Gregor W Smith; James Yetman
    Abstract: Standard models of international risk sharing with complete asset markets predict a positive association between relative consumption growth and real exchange-rate depreciations across countries. The striking lack of evidence for this link - the consumption/real-exchange-rate anomaly or Backus-Smith puzzle - has prompted research on risk-sharing indicators with incomplete asset markets. That research generally implies that the association holds in forecasts, rather than realizations. Using professional forecasts for 28 countries for 1990-2008 we find no such association, thus deepening the puzzle. Independent evidence on the weak link between forecasts for consumption and real interest rates suggests that the presence of 'hand-to-mouth' consumers may help to explain the evidence.
    Keywords: international risk sharing, Backus-Smith puzzle
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:bis:biswps:295&r=for
  12. By: Felix Geiger; Oliver Sauter; Kai D. Schmid
    Abstract: Analyzing sample moments of survey forecasts, we derive disagreement and uncertainty measures for the short- and medium-term inflation outlook. The latter provide insights into the development of inflation forecast uncertainty in the context of a changing macroeconomic environment since the beginning of 2008. Motivated by the debate on the role of monetary aggregates and cyclical variables describing a Phillips-curve logic, we develop a macroeconomic indicator spread which is assumed to drive forecasters’ judgments. Empirical evidence suggests procyclical dynamics between disagreement among forecasters, individual forecast uncertainty and the macro-spread. We call this approach the camp view of inflation forecasts and show that camps form up whenever the spread widens.
    Keywords: monetary policy, survey forecasts, inflation uncertainty, heterogeneous beliefs and expectations, monetary aggregates
    JEL: E47 E51 E52 E58 E66
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:hoh:hohdip:320&r=for
  13. By: Gebhard Kirchgässner; Jürgen Wolters
    Abstract: Using Swiss data from 1983 to 2008, this paper investigates whether growth rates of the different measures of the quantity of money and/or excess money can be used to forecast inflation. After a preliminary data analysis, money demand relations are specified, estimated and tested. Then, employing error correction models, measures of excess money are derived. Using recursive estimates, indicator properties of monetary aggregates for inflation are assessed for the period from 2000 onwards, with time horizons of one, two, and three years. In these calculations, M2 and M3 clearly outperform M1, and excess money is generally a better predictor than the quantity of money. Taking into account also the most recent (available) observations, which represent the first three quarters of the economic crisis, the money demand function for M3 remains stable, while the one for M2 is strongly influenced by these three observations. While in both cases forecasts for 2010 show inflation rates inside the target zone between zero and two percent, and the same holds for forecasts based on M3 for 2011, forecasts based on M2 provide evidence that the upper limit of this zone might be violated in 2011.
    Keywords: Stability of Money Demand; Monetary Aggregates and Inflation
    JEL: E41 E52
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:cra:wpaper:2009-30&r=for
  14. By: Jean-Philippe Cayen; Marc-André Gosselin; Sharon Kozicki
    Abstract: The workhorse DSGE model used for monetary policy evaluation is designed to capture business cycle fluctuations in an optimization-based format. It is commonplace to loglinearize models and express them with variables in deviation-from-steady-state format. Structural parameters are either calibrated, or estimated using data pre-filtered to extract trends. Such procedures treat past and future trends as fully known by all economic agents or, at least, as independent of cyclical behaviour. With such a setup, in a forecasting environment it seems natural to add forecasts from DSGE models to trend forecasts. While this may be an intuitive starting point, efficiency can be improved in multiple dimensions. Ideally, behaviour of trends and cycles should be jointly modeled. However, for computational reasons it may not be feasible to do so, particularly with medium- or large-scale models. Nevertheless, marginal improvements on the standard framework can still be made. First, pre-filtering of data can be amended to incorporate structural links between the various trends that are implied by the economic theory on which the model is based, improving the efficiency of trend estimates. Second, forecast efficiency can be improved by building a forecast model for model-consistent trends. Third, decomposition of shocks into permanent and transitory components can be endogenized to also be model-consistent. This paper proposes a unified framework for introducing these improvements. Application of the methodology validates the existence of considerable deviations between trends used for detrending data prior to structural parameter estimation and model-consistent estimates of trends, implying the potential for efficiency gains in forecasting. Such deviations also provide information on aspects of the model that are least coherent with the data, possibly indicating model misspecification. Additionally, the framework provides a structure for examining cyclical responses to trend shocks, among other extensions.
    Keywords: Business fluctuations and cycles; Econometric and statistical methods
    JEL: E3 D52 C32
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:bca:bocawp:09-35&r=for
  15. By: Tonatiuh Peña; Serafín Martínez; Bolanle Abudu
    Abstract: We are interested in forecasting bankruptcies in a probabilistic way. Specifically, we compare the classification performance of several statistical and machine-learning techniques, namely discriminant analysis (Altman's Z-score), logistic regression, least-squares support vector machines and different instances of Gaussian processes (GPs), that is, GP classifiers, the Bayesian Fisher discriminant and warped GPs. Our contribution to the field of computational finance is to introduce GPs as a potentially competitive probabilistic framework for bankruptcy prediction. Data from the information repository of the US Federal Deposit Insurance Corporation is used to test the predictions.
    Keywords: Bankruptcy prediction, Artificial intelligence, Supervised learning, Gaussian processes, Z-score.
    JEL: C11 C14 C45
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:bdm:wpaper:2009-18&r=for
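Among the benchmarks listed, Altman's Z-score is a fixed linear discriminant with published coefficients. A sketch of the classic 1968 formulation and its conventional interpretation bands (the paper's exact variant may differ):

```python
def altman_z(wc_ta, re_ta, ebit_ta, mve_tl, sales_ta):
    """Altman's (1968) Z-score, the discriminant-analysis baseline.

    Inputs are the five classic ratios: working capital / total assets,
    retained earnings / total assets, EBIT / total assets, market value
    of equity / total liabilities, and sales / total assets.
    """
    return (1.2 * wc_ta + 1.4 * re_ta + 3.3 * ebit_ta
            + 0.6 * mve_tl + 1.0 * sales_ta)

def z_zone(z):
    """Conventional interpretation bands for the original Z-score."""
    if z > 2.99:
        return "safe"
    if z < 1.81:
        return "distress"
    return "grey"
```

Unlike the Gaussian-process methods the paper advocates, the Z-score yields a point discriminant rather than a predictive probability, which is precisely the gap the probabilistic classifiers are meant to fill.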
  16. By: El Bouhadi, Abdelhamid; Achibane, Khalid
    Abstract: Uncertainty plays a central role in most of the problems addressed by modern financial theory. We have known for some time that uncertainty in speculative prices varies over time. However, it is only recently that many studies in applied finance and monetary economics have explicitly modelled time series involving the second and higher moments of variables. Indeed, the first tool for modelling such variables was introduced by Engle (1982): autoregressive conditional heteroskedasticity and its many extensions. With the emergence and development of these models, Value-at-Risk, which plays a major role in the assessment and risk management of financial institutions, has become a more effective tool to measure the risk of asset holdings. Following the current financial debacle, we ask a simple question about the progress and achievements made in the microstructure of emerging and pre-emerging financial markets that could sustain them and limit future fluctuations. Today, we know that the crisis has spared no financial market in the world; the magnitude and damage of its effects vary across space and time. In the Moroccan stock market context, it was found that the effects were not so harmful and that the future of these markets faces a compromise or at least a long lethargy. Inspired by these events, our study undertakes two exercises. First, we test the ability of nonlinear ARCH and GARCH models (EGARCH, TGARCH, GJR-GARCH, QGARCH) to match the number of expected exceedances (shortfalls) of the VaR measure. Second, we provide volatility forecasts under a time-varying VaR.
    Keywords: Market Microstructure; ARCH Models; VaR; Time-Varying Volatility; Forecasting Volatility; Casablanca Stock Exchange.
    JEL: G14 C53 C52 G18 C22
    Date: 2009–12–20
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:19482&r=for
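Counting VaR exceedances, as in the first exercise above, is straightforward: a hit occurs whenever the realized return falls below the negative of the forecast VaR, and the empirical hit rate is compared with the nominal coverage level. A minimal sketch with VaR taken as a positive loss threshold (not the authors' exact backtest):

```python
def var_backtest(returns, var_forecasts):
    """Count VaR exceedances (shortfalls) over a backtest sample.

    A hit is a day when the realized return falls below minus the
    forecast VaR. Returns the hit count and the empirical hit rate,
    which the caller compares with the nominal level (e.g. 5%).
    """
    hits = sum(1 for r, v in zip(returns, var_forecasts) if r < -v)
    return hits, hits / len(returns)
```

A well-calibrated conditional model produces a hit rate close to the nominal level; formal tests such as Kupiec's unconditional coverage test build on exactly this count.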
  17. By: De Leon, Marycruz; Fullerton, Thomas M., Jr.; Kelley, Brian W.
    Abstract: Budget constraints are forcing many governments to consider implementing tolls as a means for financing bridge and road expenditures. Newly available time series data make it possible to analyze the impacts of toll variations and international business cycle fluctuations on cross-border bridge traffic between El Paso and Ciudad Juarez. Parameter estimation is carried out using a linear transfer function ARIMA methodology. Price elasticities of demand are similar to those reported for other regional economies, but out-of-sample forecasting results are mixed.
    Keywords: Bridge Traffic; Tolls; Applied Econometrics; Mexico Border
    JEL: R41
    Date: 2009–06
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:19861&r=for
  18. By: Marlene Amstad; Simon Potter
    Abstract: Central banks analyze a wide range of data to obtain better measures of underlying inflationary pressures. Factor models have widely been used to formalize this procedure. Using a dynamic factor model, this paper develops a measure of underlying inflation (UIG) at time horizons of relevance for monetary policymakers for both CPI and PCE. The UIG uses a broad data set, allowing for high-frequency updates on underlying inflation. The paper complements the existing literature on U.S. "core" measures by illustrating how the UIG has been used and interpreted in real time since late 2005.
    Keywords: Inflation (Finance) ; Economic indicators ; Economic forecasting ; Monetary policy ; Banks and banking, Central
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:fip:fednsr:420&r=for

This nep-for issue is ©2010 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.