nep-for New Economics Papers
on Forecasting
Issue of 2012‒08‒23
thirty-one papers chosen by
Rob J Hyndman
Monash University

  1. Uncertainty and Heterogeneity in factor models forecasting By Matteo Luciani; Libero Monteforte
  2. A tutorial note on the properties of ARIMA optimal forecasts By Galimberti, Jaqueson K.
  3. Selecting forecasting models for portfolio allocation By Adam E Clements; Mark Doolan; Stan Hurn; Ralf Becker
  4. Forecasting with Bayesian Vector Autoregressions By Karlsson, Sune
  5. Forecasting Interest Rates with Shifting Endpoints By Dick van Dijk; Siem Jan Koopman; Michel van der Wel; Jonathan H. Wright
  6. Learning, Forecasting and Optimizing: An Experimental Study By Te Bao; John Duffy; Cars Hommes
  7. Optimal Combination of Survey Forecasts By Cristina Conflitti; Christine De Mol; Domenico Giannone
  8. Revisions in official data and forecasting By Cecilia Frale; Valentina Raponi
  9. Forecasting Euro-Area Macroeconomic Variables Using a Factor Model Approach for Backdating By Ralf Brüggemann; Jing Zeng
  10. Evaluation of Short Run Inflation Forecasts in Chile By Pablo Pincheira; Roberto Álvarez
  11. Sparse partial least squares in time series for macroeconomic forecasting By Julieta Fuentes; Pilar Poncela; Julio Rodríguez
  12. Over-optimistic Official Forecasts in the Eurozone and Fiscal Rules By Jeffrey A. Frankel; Jesse Schreger
  13. Forecasting Macroeconomic Variables using Collapsed Dynamic Factor Analysis By Falk Brauning; Siem Jan Koopman
  14. What drives the Quotes of Earnings Forecasters? By Bert de Bruijn; Philip Hans Franses
  15. Bayesian semiparametric multivariate GARCH modeling By Mark J. Jensen; John M. Maheu
  16. Estimating Loss Functions of Experts By Philip Hans Franses; Rianne Legerstee; Richard Paap
  17. Measuring GDP Forecast Uncertainty Using Quantile Regressions By Thomas Laurent; Tomasz Kozluk
  18. Intermittent demand forecasting for inventory control: A multi-series approach By Ralph Snyder; Adrian Beaumont; J. Keith Ord
  19. Assessing the Economic Value of El Niño-based seasonal climate forecasts for smallholder farmers in Zimbabwe By Ephias M. Makaudze
  20. Forecasting Bond Yields with Segmented Term Structure Models By Caio Almeida; Axel Simonsen; José Vicente
  21. Changing central bank transparency in Central and Eastern Europe during the financial crisis By Csávás, Csaba; Erhart, Szilárd; Naszódi, Anna; Pintér, Klára
  22. Policy Analysis and Forecasting in the World Economy: A Panel Unobserved Components Approach By Francis Vitek
  23. Yield Curve Modeling and Forecasting using Semiparametric Factor Dynamics By Wolfgang Karl Härdle; Piotr Majer; Melanie Schienle
  24. Participation and Performance at the London 2012 Olympics By Kuper, Gerard H.; Sterken, Elmer
  25. What Drives Commodity Prices? By Chen, Shu-Ling; Jackson, John D.; Kim, Hyeongwoo; Resiandini, Pramesti
  26. Markov Chain Model of Land Use Change in the Twin Cities By Michael Iacono; David Levinson; Ahmed El-Geneidy; Rania Wasfi
  27. TMPM.ado: The Trauma Mortality Prediction Model is Robust to ICD-9 and AIS Coding Lexicons By Alan Cook; Turner M. Osler
  28. On the role of the estimation error in prediction of expected shortfall By Lönnbark, Carl
  29. Assessing the usefulness of accounting information as an instrument to predict business failure in Spanish cooperatives By Mari-Vidal, Sergio; Segui-Mas, Elies; Marin-Sanchez, Maria del Mar; Mateos-Ronco, Alicia
  30. An Early Warning Model for Predicting Credit Booms using Macroeconomic Aggregates By Alexander Guarín; Andrés González; Daphné Skandalis; Daniela Sánchez
  31. Unbiased estimation of maximum expected profits in the Newsvendor Model: a case study analysis By Halkos, George; Kevork, Ilias

  1. By: Matteo Luciani; Libero Monteforte
    Abstract: In this paper we propose to exploit the heterogeneity of forecasts produced by different model specifications to measure forecast uncertainty. Our approach is simple and intuitive. It consists of selecting all the models that outperform some benchmark model and then constructing an empirical distribution of the forecasts produced by these models. We interpret this distribution as a measure of uncertainty. We perform a pseudo real-time forecasting exercise on a large database of Italian data from 1982 to 2009, showing case studies of our measure of uncertainty.
    Keywords: Factor models, Model uncertainty, Forecast combination, Density forecast.
    JEL: C13 C32 C33 C52 C53
    Date: 2012–05
    URL: http://d.repec.org/n?u=RePEc:itt:wpaper:wp2012-5&r=for
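The selection-then-distribution idea described in the abstract above is simple to operationalize. The sketch below is a minimal illustration of that general idea, not the authors' code: given out-of-sample forecasts from many candidate models, it keeps the models whose RMSE beats a benchmark and summarizes the spread of their latest forecasts as an uncertainty band. All names and the toy data are hypothetical.

```python
import numpy as np

def uncertainty_band(forecasts, bench_forecast, actuals, quantiles=(0.1, 0.9)):
    """forecasts: (n_models, n_periods) competing model forecasts.
    bench_forecast: (n_periods,) benchmark forecasts.
    actuals: (n_periods,) realized values over the evaluation window.
    Returns the selected models' latest forecasts and an empirical
    inter-quantile band interpreted as a measure of uncertainty."""
    rmse = np.sqrt(((forecasts - actuals) ** 2).mean(axis=1))
    bench_rmse = np.sqrt(((bench_forecast - actuals) ** 2).mean())
    keep = rmse < bench_rmse                   # models that beat the benchmark
    selected = forecasts[keep, -1]             # their latest forecasts
    lo, hi = np.quantile(selected, quantiles)  # empirical uncertainty band
    return selected, (lo, hi)

# toy usage with simulated forecasts
rng = np.random.default_rng(0)
actuals = rng.normal(size=20)
models = actuals + rng.normal(scale=0.5, size=(50, 20))
bench = actuals + rng.normal(scale=0.7, size=20)
_, band = uncertainty_band(models, bench, actuals)
print("10-90% uncertainty band:", band)
```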
  2. By: Galimberti, Jaqueson K.
    Abstract: Assuming an ARIMA(p,d,q) model represents the data, I show how optimal forecasts can be computed and derive general expressions for their main properties of interest. Namely, I present stepwise derivations of expressions for the variances of forecast errors and the covariances between them at arbitrary forecasting horizons. Matrix forms of these expressions are also presented to facilitate computational implementation.
    Keywords: optimal forecasts; forecasts properties; ARIMA
    JEL: C53
    Date: 2012–01–10
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:40303&r=for
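For readers who want the standard result behind the abstract above: writing the suitably differenced process in its MA(∞) form, the h-step optimal forecast error and its second moments have a familiar closed form. This is the textbook expression, not a reproduction of the paper's stepwise derivations.

```latex
% MA(infinity) representation: y_t = \sum_{j\ge 0} \psi_j \varepsilon_{t-j},
% with \psi_0 = 1 and \varepsilon_t \sim \mathrm{WN}(0,\sigma^2).
\begin{align}
e_{t+h\mid t} &= y_{t+h} - \hat y_{t+h\mid t}
              = \sum_{j=0}^{h-1} \psi_j\, \varepsilon_{t+h-j}, \\
\operatorname{Var}\!\left(e_{t+h\mid t}\right)
              &= \sigma^2 \sum_{j=0}^{h-1} \psi_j^2, \\
\operatorname{Cov}\!\left(e_{t+h\mid t},\, e_{t+h+k\mid t}\right)
              &= \sigma^2 \sum_{j=0}^{h-1} \psi_j\, \psi_{j+k}, \qquad k \ge 0.
\end{align}
```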
  3. By: Adam E Clements (QUT); Mark Doolan (QUT); Stan Hurn (QUT); Ralf Becker (University of Manchester)
    Abstract: Techniques for evaluating and selecting multivariate volatility forecasts are not yet as well understood as their univariate counterparts. This paper considers the ability of different loss functions to discriminate between a competing set of forecasting models which are subsequently applied in a portfolio allocation context. It is found that a likelihood-based loss function outperforms its competitors, including those based on the given portfolio application. This result indicates that the particular application of the forecasts is not necessarily the most effective criterion by which to select models.
    Keywords: Multivariate volatility, portfolio allocation, forecast evaluation, model selection, model confidence set
    JEL: C22 G00
    Date: 2012–08–09
    URL: http://d.repec.org/n?u=RePEc:qut:auncer:2012_8&r=for
  4. By: Karlsson, Sune (Department of Business, Economics, Statistics and Informatics)
    Abstract: Prepared for the Handbook of Economic Forecasting, vol. 2. This chapter reviews Bayesian methods for inference and forecasting with VAR models. Bayesian inference and, by extension, forecasting depend on numerical methods for simulating from the posterior distribution of the parameters, and special attention is given to the implementation of the simulation algorithm.
    Keywords: Markov chain Monte Carlo; Structural VAR; Cointegration; Conditional forecasts; Time-varying parameters; Stochastic volatility; Model selection; Large VAR
    JEL: C11 C32 C53
    Date: 2012–08–04
    URL: http://d.repec.org/n?u=RePEc:hhs:oruesi:2012_012&r=for
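As a concrete illustration of simulation-based forecasting with a VAR, the sketch below draws from the posterior of a bivariate VAR(1) under a diffuse (Jeffreys) prior and propagates each draw forward to build a predictive density. It is a stylized sketch of the general approach the chapter surveys, not the chapter's code, and the chapter itself covers much richer priors and samplers; all names and the toy data are hypothetical.

```python
import numpy as np
from scipy.stats import invwishart

def bvar_predictive_draws(Y, h=4, n_draws=500, seed=0):
    """Posterior-predictive simulation for a VAR(1) with intercept under a
    diffuse prior. Y is a (T, n) array; returns (n_draws, h, n) forecast paths."""
    rng = np.random.default_rng(seed)
    T, n = Y.shape
    X = np.hstack([np.ones((T - 1, 1)), Y[:-1]])   # regressors [1, y_{t-1}]
    Yt = Y[1:]
    k = X.shape[1]
    XtX_inv = np.linalg.inv(X.T @ X)
    B_hat = XtX_inv @ X.T @ Yt                      # OLS coefficients (k, n)
    S = (Yt - X @ B_hat).T @ (Yt - X @ B_hat)       # residual sum-of-squares matrix
    paths = np.empty((n_draws, h, n))
    for d in range(n_draws):
        Sigma = invwishart.rvs(df=T - 1 - k, scale=S)
        # vec(B) | Sigma ~ N(vec(B_hat), Sigma kron (X'X)^{-1}), column stacking
        vecB = rng.multivariate_normal(B_hat.flatten(order="F"),
                                       np.kron(Sigma, XtX_inv))
        B = vecB.reshape(k, n, order="F")
        y = Y[-1]
        for t in range(h):
            y = np.hstack([1.0, y]) @ B + rng.multivariate_normal(np.zeros(n), Sigma)
            paths[d, t] = y                         # one draw from the predictive density
    return paths

# usage: posterior-mean forecasts and 90% predictive intervals
Y = np.cumsum(np.random.default_rng(1).normal(size=(120, 2)), axis=0) * 0.1
paths = bvar_predictive_draws(Y)
print(paths.mean(axis=0))
print(np.quantile(paths, [0.05, 0.95], axis=0))
```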
  5. By: Dick van Dijk (Erasmus University Rotterdam); Siem Jan Koopman (VU University Amsterdam); Michel van der Wel (Erasmus University Rotterdam, CREATES, Aarhus); Jonathan H. Wright (Johns Hopkins University)
    Abstract: Many economic studies on inflation forecasting have found favorable results when inflation is modeled as a stationary process around a slowly time-varying trend. In contrast, the existing studies on interest rate forecasting either treat yields as being stationary, without any shifting endpoints, or treat yields as a random walk process. In this study we consider the problem of forecasting the term structure of interest rates with the assumption that the yield curve is driven by factors that are stationary around a time-varying trend. We compare alternative ways of modeling the time-varying trend. We find that allowing for shifting endpoints in yield curve factors can provide gains in the out-of-sample predictive accuracy, relative to stationary and random walk benchmarks. The results are both economically and statistically significant.
    Keywords: term structure of interest rates; forecasting; non-stationarity; survey forecasts; yield curve
    JEL: C32 E43 G17
    Date: 2012–07–19
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20120076&r=for
  6. By: Te Bao (University of Amsterdam); John Duffy (University of Pittsburgh); Cars Hommes (University of Amsterdam)
    Abstract: Rational Expectations (RE) models have two crucial dimensions: 1) agents correctly forecast future prices given all available information, and 2) given expectations, agents solve optimization problems and these solutions in turn determine actual price realizations. Experimental testing of such models typically focuses on only one of these two dimensions. In this paper we consider both forecasting and optimization decisions in an experimental cobweb economy. We report results from four experimental treatments: 1) subjects form forecasts only, 2) subjects determine quantity only (solve an optimization problem), 3) they do both, and 4) they are paired in teams and one member is assigned the forecasting role while the other is assigned the optimization task. All treatments converge to the Rational Expectations Equilibrium (REE), but at very different speeds. We observe that performance is best in treatment 1) and worst in treatment 3). Most forecasters use an adaptive expectations rule. Subjects are less likely to make conditionally optimal production decisions for given forecasts in treatment 3), where the forecast is made by themselves, than in treatment 4), where the forecast is made by the other member of their team, which suggests that "two heads are better than one" in finding the REE.
    Keywords: Learning; Rational Expectations; Optimization; Experimental Economics; Bounded Rationality
    JEL: C91 C92 D83 D84
    Date: 2012–02–17
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20120015&r=for
  7. By: Cristina Conflitti; Christine De Mol; Domenico Giannone
    Abstract: We consider the problem of optimally combining individual forecasts of gross domestic product (GDP) and inflation from the Survey of Professional Forecasters (SPF) dataset for the Euro Area. Contrary to the common practice of using equal combination weights, we compute optimal weights which minimize the mean square forecast error (MSFE) in the case of point forecasts and maximize a logarithmic score in the case of density forecasts. We show that this is a viable strategy even when the number of forecasts to combine gets large, provided we constrain these weights to be positive and to sum to one. Indeed, this enforces a form of shrinkage on the weights which ensures good out-of-sample performance of the combined forecasts.
    Keywords: forecast combination; forecast evaluation; survey of professional forecasters; real-time data; shrinkage; high-dimensional data
    JEL: C53 C22
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2013/124527&r=for
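The key computational step described in the abstract above, minimizing the MSFE over weights that are non-negative and sum to one, is a small quadratic program. The snippet below is my own minimal illustration of that step, not the authors' code; the data are simulated.

```python
import numpy as np
from scipy.optimize import minimize

def combination_weights(forecasts, actuals):
    """forecasts: (T, N) matrix of individual forecasts; actuals: (T,) outcomes.
    Returns weights w >= 0 with sum(w) = 1 minimizing the in-sample MSFE."""
    T, N = forecasts.shape
    def msfe(w):
        return np.mean((actuals - forecasts @ w) ** 2)
    res = minimize(msfe, x0=np.full(N, 1.0 / N), method="SLSQP",
                   bounds=[(0.0, 1.0)] * N,
                   constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])
    return res.x

# toy usage: three forecasters, one much noisier than the others
rng = np.random.default_rng(2)
y = rng.normal(size=200)
F = np.column_stack([y + rng.normal(scale=s, size=200) for s in (0.3, 0.5, 1.5)])
print(combination_weights(F, y))   # weight mass shifts away from the noisy forecaster
```

The simplex constraint is exactly what induces the shrinkage the abstract refers to: weights cannot explode even when the number of forecasters is large relative to the sample.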
  8. By: Cecilia Frale; Valentina Raponi
    Abstract: This paper deals with the topic of data revisions, with the aim of investigating whether consecutive releases of macroeconomic series published by statistical agencies contain useful information for economic analysis and forecasting. The rationality of the revisions process is tested considering the complete history of the data, and an empirical application is proposed to show the usefulness of revisions for improving the precision of the forecasting model. The results for Italian GDP growth show that embedding the revision process in a dynamic factor model helps to reduce the forecast error.
    Keywords: Data revisions, real-time dataset, mixed frequency, Dynamic factor Model.
    JEL: E32 E37 C53
    Date: 2012–03
    URL: http://d.repec.org/n?u=RePEc:itt:wpaper:wp2012-3&r=for
  9. By: Ralf Brüggemann (University of Konstanz, Department Economics, Germany); Jing Zeng (University of Konstanz, Department Economics, Germany)
    Abstract: We suggest using a factor-model-based backdating procedure to construct historical Euro-area macroeconomic time series data for the pre-Euro period. We argue that this is a useful alternative to standard contemporaneous aggregation methods. The paper investigates, for a number of Euro-area variables, whether forecasts based on the factor-backdated data are more precise than those obtained with standard area-wide data. A recursive pseudo-out-of-sample forecasting experiment using quarterly data is conducted. Our results suggest that some key variables (e.g. real GDP, inflation and the long-term interest rate) can indeed be forecasted more precisely with the factor-backdated data.
    Keywords: forecasting, factor model, backdating, European monetary union, constructing EMU data
    JEL: C22 C53 C43 C82
    Date: 2012–08–02
    URL: http://d.repec.org/n?u=RePEc:knz:dpteco:1215&r=for
  10. By: Pablo Pincheira; Roberto Álvarez
    Abstract: The Central Bank of Chile builds inflation forecasts for several time horizons and using various methodologies. In this paper, we analyze one of these series of short-term inflation forecasts, which we call Auxiliary Inflation Forecasts (AIF), comparing them to forecasts made by private analysts and to forecasts built from simple time-series models. We also evaluate the AIF using encompassing tests as well as bias and weak efficiency tests. Our aim is to answer two linked questions: first, which is the best forecast series under a specific loss function; and second, are the differences in accuracy between any two series of forecasts fully explained by the differences in the information sets from which they have been built? Our results indicate that the AIF behave extremely well at one- and two-month horizons, but they are less adequate at longer horizons.
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:chb:bcchwp:674&r=for
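A standard way to operationalize the bias and weak-efficiency tests mentioned above is a Mincer-Zarnowitz regression of outcomes on forecasts, jointly testing a zero intercept and unit slope. The snippet below is a generic illustration of that test, not necessarily the exact procedure used in the paper; variable names and data are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

def mincer_zarnowitz(actual, forecast):
    """Regress actual on a constant and the forecast; jointly test
    intercept = 0 and slope = 1 (unbiasedness / weak efficiency)."""
    X = sm.add_constant(forecast)
    res = sm.OLS(actual, X).fit()
    test = res.f_test("const = 0, x1 = 1")
    return res.params, float(test.pvalue)

# toy usage with an artificially biased, inefficient forecast
rng = np.random.default_rng(3)
y = rng.normal(size=150)
f = 0.3 + 0.8 * y + rng.normal(scale=0.2, size=150)
params, pval = mincer_zarnowitz(y, f)
print(params, pval)   # a small p-value rejects unbiasedness/efficiency
```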
  11. By: Julieta Fuentes; Pilar Poncela; Julio Rodríguez
    Abstract: Factor models have been applied extensively for forecasting when high-dimensional datasets are available. In this case, the number of variables can be very large; for instance, typical dynamic factor models in central banks handle over 100 variables. However, a growing body of the literature indicates that more variables do not necessarily lead to estimated factors with lower uncertainty or better forecasting results. This paper investigates the usefulness of partial least squares techniques, which take into account the variable to be forecasted when reducing the dimension of the problem from a large number of variables to a smaller number of factors. We propose different approaches to dynamic sparse partial least squares as a means of improving forecast efficiency by simultaneously taking into account the variable to be forecasted while forming an informative subset of predictors, instead of using all the available ones to extract the factors. We use the well-known Stock and Watson database to check the forecasting performance of our approach. The proposed dynamic sparse models perform well, improving efficiency compared with widely used factor methods in macroeconomic forecasting.
    Keywords: Factor Models, Forecasting, Large Datasets, Partial Least Squares, Sparsity, Variable Selection
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws122216&r=for
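For readers unfamiliar with the partial least squares step, the sketch below shows plain (non-sparse) PLS used to compress a large predictor panel into a few target-relevant components before forecasting. It only illustrates the baseline idea: the paper's contribution is a sparse, dynamic refinement, which scikit-learn does not implement out of the box. The data are simulated.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)
T, N = 200, 120                     # many predictors, as in large macro panels
X = rng.normal(size=(T, N))
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=T)   # only 5 predictors matter

# fit PLS components on a training window, then forecast the last observation
pls = PLSRegression(n_components=3)
pls.fit(X[:-1], y[:-1])
print("one-step forecast:", pls.predict(X[-1:]).ravel()[0], "actual:", y[-1])
```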
  12. By: Jeffrey A. Frankel; Jesse Schreger
    Abstract: Why do countries find it so hard to get their budget deficits under control? Systematic patterns in the errors that official budget agencies make in their forecasts may play an important role. Although many observers have suggested that fiscal discipline can be restored via fiscal rules such as a legal cap on the budget deficit, forecasting bias can defeat such rules. The members of the eurozone are supposedly constrained by the fiscal caps of the Stability and Growth Pact. Yet ever since the birth of the euro in 1999, members have postponed painful adjustment by making overly optimistic forecasts of future growth and budget positions and arguing that the deficits will fall below the cap within a year or two. The new fiscal compact among the euro countries is supposed to make budget rules more binding by putting them into laws and constitutions at the national level. But what is the record with such national rules? Our econometric findings are summarized as follows:
      • Governments’ budget forecasts are biased in the optimistic direction, especially among the eurozone countries, especially when they have large contemporaneous budget deficits, and especially during booms.
      • Governments’ real GDP forecasts are similarly over-optimistic during booms.
      • Despite the well-known tendency of eurozone members to exceed the 3% cap on budget deficits, often in consecutive years, they almost never forecast that they will violate the cap in the coming years. This is the source of the extra bias among eurozone forecasts. If euro area governments are not in violation of the 3% cap at the time forecasts are made, forecasts are no more biased than other countries’.
      • Although euro members without national budget balance rules have a larger over-optimism bias than non-member countries, national fiscal rules help counteract the wishful thinking that seems to come with euro membership. The reason is that when governments are in violation of the 3% cap, the national rules apparently constrain them from making such unrealistic forecasts.
      • Similarly, the existence of an independent fiscal institution producing budget forecasts at the national level reduces the over-optimism bias of forecasts made when the countries are in violation of the 3% cap.
    JEL: F3
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:18283&r=for
  13. By: Falk Brauning (VU University Amsterdam); Siem Jan Koopman (VU University Amsterdam)
    Abstract: We explore a new approach to the forecasting of macroeconomic variables based on a dynamic factor state space analysis. Key economic variables are modeled jointly with principal components from a large time series panel of macroeconomic indicators using a multivariate unobserved components time series model. When the key economic variables are observed at a low frequency and the panel of macroeconomic variables is at a high frequency, we can use our approach for both nowcasting and forecasting purposes. Given a dynamic factor model as the data generation process, we provide Monte Carlo evidence for the finite-sample justification of our parsimonious and feasible approach. We also provide empirical evidence for a U.S. macroeconomic dataset. The unbalanced panel contains quarterly and monthly variables. The forecasting accuracy is measured against a set of benchmark models. We conclude that our dynamic factor state space analysis can lead to higher forecasting precision when panel size and time series dimensions are moderate.
    Keywords: Kalman filter; Mixed frequency; Nowcasting; Principal components; State space model; Unobserved Components Time Series Model
    JEL: C33 C53 E17
    Date: 2012–04–20
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20120042&r=for
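The "collapsing" step, replacing a large high-frequency panel by a handful of principal components that are then modeled jointly with the key low-frequency series, can be illustrated in a few lines. This is only a schematic of the first stage; the paper embeds the components in a full unobserved components state space model estimated with the Kalman filter. Names and data below are hypothetical.

```python
import numpy as np

def principal_components(panel, n_factors=3):
    """Extract the first principal components of a standardized (T, N) panel."""
    Z = (panel - panel.mean(axis=0)) / panel.std(axis=0)
    # SVD of the standardized panel: columns of U*s are the component series
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U[:, :n_factors] * s[:n_factors]

# toy monthly panel; the extracted factors would enter the state space model
rng = np.random.default_rng(5)
common = rng.normal(size=(240, 2))
panel = common @ rng.normal(size=(2, 80)) + rng.normal(scale=0.5, size=(240, 80))
factors = principal_components(panel)
print(factors.shape)   # (240, 3)
```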
  14. By: Bert de Bruijn (Erasmus University Rotterdam); Philip Hans Franses (Erasmus University Rotterdam)
    Abstract: Earnings forecasts can be useful for investment decisions. Research on earnings forecasts has focused on forecast performance in relation to firm characteristics, on categorizing analysts into groups with similar behaviour, and on the effect of an earnings announcement by the firm on future earnings forecasts. In this paper we investigate the factors that determine the value of the forecast and also investigate to what extent the timing of the forecast can be modeled. We propose a novel methodology that allows for such an investigation. As an illustration we analyze within-year earnings forecasts for AMD in the period 1997 to 2011, where the data are obtained from the I/B/E/S database. Our empirical findings suggest clear drivers of the value and the timing of the earnings forecast. We thus show that not only the forecasts themselves are predictable, but also that the timing of the quotes is predictable to some extent.
    Keywords: Earnings Forecasts; Earnings Announcements; Financial Markets; Financial Analysts
    JEL: G17 G24 M41
    Date: 2012–07–12
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20120067&r=for
  15. By: Mark J. Jensen; John M. Maheu
    Abstract: This paper proposes a Bayesian nonparametric modeling approach for the return distribution in multivariate GARCH models. In contrast to the parametric literature, the return distribution can display general forms of asymmetry and thick tails. An infinite mixture of multivariate normals is given a flexible Dirichlet process prior. The GARCH functional form enters into each of the components of this mixture. We discuss conjugate methods that allow for scale mixtures and nonconjugate methods, which provide mixing over both the location and scale of the normal components. MCMC methods are introduced for posterior simulation and computation of the predictive density. Bayes factors and density forecasts with comparisons to GARCH models with Student-t innovations demonstrate the gains from our flexible modeling approach.
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:fip:fedawp:2012-09&r=for
  16. By: Philip Hans Franses (Erasmus University Rotterdam); Rianne Legerstee (Erasmus University Rotterdam); Richard Paap (Erasmus University Rotterdam)
    Abstract: We propose a new and simple methodology to estimate the loss function associated with experts' forecasts. Under the assumption of conditional normality of the data and the forecast distribution, the asymmetry parameter of the lin-lin and linex loss function can easily be estimated using a linear regression. This regression also provides an estimate for potential systematic bias in the forecasts of the expert. The residuals of the regression are the input for a test for the validity of the normality assumption. We apply our approach to a large data set of SKU-level sales forecasts made by experts and we compare the outcomes with those for statistical model-based forecasts of the same sales data. We find substantial evidence for asymmetry in the loss functions of the experts, with underprediction penalized more than overprediction.
    Keywords: model forecasts; expert forecasts; loss functions; asymmetry; econometric models
    JEL: C50 C53
    Date: 2011–12–16
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20110177&r=for
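The logic linking asymmetric loss to a simple regression can be stated compactly. Under lin-lin (check) loss the optimal forecast is a quantile of the predictive distribution, so under conditional normality the systematic part of the forecast error identifies the asymmetry parameter. The display below is the textbook version of this argument, not necessarily the paper's exact estimating equation.

```latex
% Lin-lin loss with asymmetry parameter \alpha \in (0,1), e_t = y_t - f_t:
\[
L(e_t) \;=\; \bigl(\alpha - \mathbf{1}\{e_t < 0\}\bigr)\, e_t .
\]
% The optimal forecast is the \alpha-quantile of the predictive distribution;
% under conditional normality y_t \mid \mathcal{F}_{t-1} \sim N(\mu_t, \sigma_t^2):
\[
f_t^{*} \;=\; \mu_t + \sigma_t\,\Phi^{-1}(\alpha)
\quad\Longrightarrow\quad
\mathbb{E}\!\left[\frac{y_t - f_t^{*}}{\sigma_t}\right] \;=\; -\,\Phi^{-1}(\alpha),
\]
% so a regression of the standardized forecast error on a constant recovers
% \Phi^{-1}(\alpha), and hence the asymmetry parameter \alpha.
```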
  17. By: Thomas Laurent; Tomasz Kozluk
    Abstract: Uncertainty is inherent to forecasting, and assessing the uncertainty surrounding a point forecast is as important as the forecast itself. Following Cornec (2010), a method is developed to assess the uncertainty around the indicator models used at the OECD to forecast GDP growth of the six largest member countries, using quantile regressions to construct a probability distribution of future GDP rather than mean point forecasts. This approach allows uncertainty to be assessed conditional on the current state of the economy and is entirely model-based and judgement-free. The quality of the computed distributions is tested against other approaches to measuring forecast uncertainty, and a set of uncertainty indicators is constructed to help exploit the most relevant information.
    Keywords: forecasting, uncertainty, GDP, quantile regression
    JEL: C31 C53
    Date: 2012–07–06
    URL: http://d.repec.org/n?u=RePEc:oec:ecoaaa:978-en&r=for
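The mechanics of turning an indicator model into a full predictive distribution via quantile regression are straightforward to sketch. The example below fits a grid of quantiles of GDP growth on a single hypothetical indicator with statsmodels; it is my own illustration of the general technique, and the OECD models naturally use richer indicator sets.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
df = pd.DataFrame({"indicator": rng.normal(size=200)})
# heteroskedastic toy data: uncertainty widens when the indicator is low
df["gdp"] = (0.5 + 0.8 * df["indicator"]
             + rng.normal(size=200) * (1.2 - 0.5 * df["indicator"]).clip(0.2))

quantiles = [0.05, 0.25, 0.5, 0.75, 0.95]
fits = {q: smf.quantreg("gdp ~ indicator", df).fit(q=q) for q in quantiles}

# conditional predictive distribution for the latest indicator reading
x_new = pd.DataFrame({"indicator": [df["indicator"].iloc[-1]]})
print({q: float(m.predict(x_new)[0]) for q, m in fits.items()})
```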
  18. By: Ralph Snyder; Adrian Beaumont; J. Keith Ord
    Abstract: This paper is concerned with identifying an effective method for forecasting the lead time demand of slow-moving inventories. Particular emphasis is placed on prediction distributions rather than point predictions alone, and on methods that work with small samples as well as large ones, in recognition of the fact that the typical range of items has a mix of vintages due to different commissioning and decommissioning dates over time. Various forecasting methods are compared using monthly demand data for more than one thousand car parts. It is found that a multi-series version of exponential smoothing coupled with a Pólya (negative binomial) distribution works better than the other twenty-four methods considered, including the Croston method.
    Keywords: Demand forecasting; inventory control; shifted Poisson distribution
    JEL: C22
    Date: 2012–07
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2012-15&r=for
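A stripped-down version of the winning idea, exponentially smoothing the demand level and wrapping a negative binomial predictive distribution around it, can be written in a few lines. This sketch ignores the multi-series pooling of parameters the paper emphasizes, and the moment-matching of the negative binomial is my own simplification; the demand series is a toy example.

```python
import numpy as np
from scipy.stats import nbinom

def nbinom_forecast(demand, alpha=0.1, quantile=0.95):
    """Simple exponential smoothing of the mean of an intermittent demand series,
    with a negative binomial predictive distribution matched to the sample
    mean/variance. Returns the smoothed mean and an order-up-to quantile."""
    level = demand[0]
    for d in demand[1:]:
        level = alpha * d + (1 - alpha) * level           # smoothed demand level
    mean = level
    var = max(np.var(demand), level + 1e-9)               # nbinom needs var > mean
    p = mean / var                                        # nbinom: mean = r(1-p)/p
    r = mean ** 2 / (var - mean)                          #         var  = r(1-p)/p^2
    return level, int(nbinom.ppf(quantile, r, p))

# toy intermittent series: mostly zeros with occasional demand
demand = np.array([0, 0, 2, 0, 0, 0, 1, 0, 3, 0, 0, 0, 0, 2, 0, 0, 1, 0, 0, 0])
print(nbinom_forecast(demand))
```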
  19. By: Ephias M. Makaudze
    Abstract: Drought constitutes the most dominant source of food insecurity in Zimbabwe and many other countries in Africa. With a majority of smallholder farmers practicing dry-land agriculture, seasonal forecasts hold promise as an effective risk management tool, giving farmers the ability to anticipate rainfall variability early enough to adjust crucial farm decisions and be better prepared to handle climatic anomalies in ways that can reduce costly losses (crop, animal and even human). This study demonstrates the potential value of forecasts to smallholder farmers in Zimbabwe, a majority of whom often suffer severely from the impact of drought. Using crop simulation models to compare yield performances of farmers with and without forecasts, results indicate that for a drought year, farmers with forecasts (WF) record higher yield gains (28%) compared to those without forecasts (WOF); in particular, farmers located in the driest regions (NR V) record the highest yield gains (42%). Similar results are observed for a neutral/average year, as farmers WF obtain predominantly higher yield gains (20%) than those WOF. However, for a good year, results show a different pattern, as no yield gains are observed; in fact, farmers WOF perform better, suggesting that forecasts in this case may not make much difference. Using gross margin analysis, results show farmers WF obtaining higher returns during a drought year (US$0.14 ha−1) and a neutral year (US$0.43 ha−1), but again not for a good year, when farmers WOF outperform those WF. In sum, forecasts can play an important role as loss-minimization instruments, especially if the underlying year is an El Niño (drought) year. In conclusion, to attain the full economic value of forecasts, complementary policies (currently missing) such as effective communication, improvement in forecast extension skills and promotion of farmer participatory and outreach activities could all prove vital in enhancing the value of forecasts to smallholder farmers.
    Keywords: Seasonal forecasts, smallholder farmers, El Niño, economic value, drought risk, with and without forecasts
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:rza:wpaper:303&r=for
  20. By: Caio Almeida; Axel Simonsen; José Vicente
    Abstract: Recent empirical analysis of interest rate markets documents that bond demand and supply directly affect yield curve movements and bond risk premia. Motivated by those findings, we propose a parametric interest rate model that allows for segmentation and local shocks in the term structure. We split the yield curve into segments, each with its own local movements, which are globally interconnected by smoothing conditions. Two classes of segmented exponential models are derived and compared to successful term structure models based on a sequence of out-of-sample forecasting exercises. Using U.S. interest rate data from 1985 to 2008, the segmented models present better overall forecasting performance, suggesting that local shocks might indeed be important determinants of yield curve dynamics.
    Date: 2012–07
    URL: http://d.repec.org/n?u=RePEc:bcb:wpaper:288&r=for
  21. By: Csávás, Csaba; Erhart, Szilárd; Naszódi, Anna; Pintér, Klára
    Abstract: There is ample empirical evidence in the literature for the positive effect of central bank transparency on the economy. The main channel is that transparency reduces uncertainty regarding future monetary policy and thereby helps agents make better investment and saving decisions. In this paper, we document how the degree of transparency of central banks in Central and Eastern Europe has changed during periods of financial stress, and we argue that during the recent financial crisis central banks became less transparent. We also investigate how these changes affected uncertainty in these economies, measured by the degree of disagreement across professional forecasters over future short-term and long-term interest rates and by their forecast accuracy.
    Keywords: central banking; transparency; financial crises; survey expectations; forecasting
    JEL: E58 E44 E47
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:40335&r=for
  22. By: Francis Vitek
    Abstract: This paper develops a structural macroeconometric model of the world economy, disaggregated into thirty-five national economies. This panel unobserved components model features a monetary transmission mechanism, a fiscal transmission mechanism, and extensive macrofinancial linkages, both within and across economies. A variety of monetary policy analysis, fiscal policy analysis, spillover analysis, and forecasting applications of the estimated model are demonstrated, based on a Bayesian framework for conditioning on judgment.
    Keywords: Cross country analysis, Fiscal policy, Monetary policy, Monetary transmission mechanism, Spillovers, Forecasting models
    Date: 2012–06–07
    URL: http://d.repec.org/n?u=RePEc:imf:imfwpa:12/149&r=for
  23. By: Wolfgang Karl Härdle; Piotr Majer; Melanie Schienle
    Abstract: Using a Dynamic Semiparametric Factor Model (DSFM) we investigate the term structure of interest rates. The proposed methodology is applied to monthly interest rates for four southern European countries, Greece, Italy, Portugal and Spain, from the introduction of the Euro to the recent European sovereign-debt crisis. Analyzing this extraordinary period, we compare our approach with the standard market method, the dynamic Nelson-Siegel model. Our findings show that two nonparametric factors capture the spatial structure of the yield curve for each of the bond markets separately. We attribute both factors to the slope of the yield curve. For panel term structure data, three nonparametric factors are necessary to explain 95% of the variation. The estimated factor loadings are unit root processes and reveal high persistence. In comparison with the benchmark model, the DSFM technique shows superior short-term forecasting.
    Keywords: yield curve, term structure of interest rates, semiparametric model, factor structure, prediction
    JEL: C14 C51
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2012-048&r=for
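For reference, the benchmark the authors compare against, the dynamic Nelson-Siegel model, represents the yield of maturity τ at time t with three time-varying factors and fixed exponential loadings:

```latex
\[
y_t(\tau) \;=\; \beta_{1t}
\;+\; \beta_{2t}\,\frac{1 - e^{-\lambda \tau}}{\lambda \tau}
\;+\; \beta_{3t}\left(\frac{1 - e^{-\lambda \tau}}{\lambda \tau} - e^{-\lambda \tau}\right).
\]
% \beta_{1t}, \beta_{2t}, \beta_{3t}: level, slope and curvature factors;
% \lambda governs where the curvature loading peaks. Forecasts are obtained by
% modeling the factor dynamics (typically as autoregressions) and plugging the
% factor forecasts back into the loadings.
```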
  24. By: Kuper, Gerard H.; Sterken, Elmer (Groningen University)
    Abstract: The current paper predicts the medal tally for the London 2012 Olympic Games. The forecast procedure consists of analyzing participation and success at the country level of the three most recent editions of the Olympic Summer Games. Potential explanatory variables for medal winnings are income per capita, population, geographical distance to the Games, success in terms of medals won at World Championships, and the home advantage. Our forecasts show that China takes first place in the medal tally with 44 gold medals, followed by the United States of America winning 33 gold medals. We expect Great Britain to take fourth place winning 23 gold medals.
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:dgr:rugsom:12006-eef&r=for
  25. By: Chen, Shu-Ling; Jackson, John D.; Kim, Hyeongwoo; Resiandini, Pramesti
    Abstract: This paper examines common forces driving the prices of 51 highly tradable commodities. We demonstrate that highly persistent movements of these prices are mostly due to the first common component, which is closely related to the US nominal exchange rate. In particular, our simple factor-based model outperforms the random walk model in out-of-sample forecast for the US exchange rate. The second common factor and de-factored idiosyncratic components are consistent with stationarity, implying short-lived deviations from the equilibrium price dynamics. In concert, these results provide an intriguing resolution to the apparent inconsistency arising from stable markets with nonstationary prices.
    Keywords: Commodity Prices; US Nominal Exchange Rate; Panel Analysis of Nonstationarity in Idiosyncratic and Common Components; Cross-Section Dependence; Out-of-Sample Forecast
    JEL: C53 F31
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:40711&r=for
  26. By: Michael Iacono; David Levinson; Ahmed El-Geneidy; Rania Wasfi (Nexus (Networks, Economics, and Urban Systems) Research Group, Department of Civil Engineering, University of Minnesota)
    Abstract: The set of models available to predict land use change in urban regions has become increasingly complex in recent years. Despite their complexity, the predictive power of these models remains relatively weak. This paper presents an example of an alternative modeling framework based on the concept of a Markov chain. The model assumes that land use at any given time, which is viewed as a discrete state, can be considered a function of only its previous state. The probability of transition between each pair of states is recorded as an element of a transition probability matrix. Assuming that this matrix is stationary over time, it can be used to predict future land use distributions from current data. To illustrate this process, a Markov chain model is estimated for the Minneapolis-St. Paul, MN, USA (Twin Cities) metropolitan region. Using a unique set of historical land use data covering several years between 1958 and 2005, the model is tested on historical data to predict recent conditions, and is then used to forecast the distribution of land use decades into the future. We also use the cell-level data set to estimate the fraction of regional land use devoted to transportation facilities, including major highways, airports, and railways. The paper concludes with some comments on the strengths and weaknesses of Markov chains as a land use modeling framework, and suggests some possible extensions of the model.
    Keywords: mode choice, mode shares, mixed logit, stated preference.
    JEL: R11 R12 R14 R41 R52
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:nex:wpaper:markovlu&r=for
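The core computation, estimating a transition probability matrix from successive land use maps and iterating it forward, fits in a few lines. The sketch below is a generic illustration with hypothetical category labels and simulated cell-level data; the paper estimates the matrix from observed maps over several historical years.

```python
import numpy as np

def transition_matrix(state_t0, state_t1, n_states):
    """Estimate a row-stochastic transition matrix from two aligned vectors of
    cell-level land use states observed at consecutive dates."""
    counts = np.zeros((n_states, n_states))
    for i, j in zip(state_t0, state_t1):
        counts[i, j] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def forecast_distribution(current_shares, P, steps):
    """Project the land use distribution forward assuming a stationary chain."""
    return current_shares @ np.linalg.matrix_power(P, steps)

# toy example with 3 categories: 0=undeveloped, 1=residential, 2=commercial
rng = np.random.default_rng(7)
t0 = rng.integers(0, 3, size=10_000)
true_P = np.array([[0.90, 0.08, 0.02],
                   [0.02, 0.95, 0.03],
                   [0.01, 0.04, 0.95]])
t1 = np.array([rng.choice(3, p=true_P[s]) for s in t0])

P_hat = transition_matrix(t0, t1, 3)
shares = np.bincount(t1, minlength=3) / len(t1)
print(forecast_distribution(shares, P_hat, steps=10))   # distribution 10 periods ahead
```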
  27. By: Alan Cook (Baylor University Medical Center - Dallas, Texas); Turner M. Osler (University of Vermont College of Medicine, Department of Surgery)
    Abstract: Many methods have been developed to predict mortality following trauma. Two classification systems are used to provide a taxonomy for diseases, including injuries: ICD-9, the classification system for administrative data in the U.S.A., and AIS, which was developed for the characterization of injuries alone. The Trauma Mortality Prediction Model (TMPM) is based on empirical estimates of severity for each injury in the ICD-9 and AIS lexicons. The probability of death (POD) is estimated from the five worst injuries per patient. TMPM has been rigorously tested against other mortality prediction models using ICD-9 and AIS data and found superior. The TMPM.ado command allows Stata users to efficiently apply TMPM to data sets using ICD-9 or AIS. The command makes use of model-averaged regression coefficients (MARC) that assign empirically derived severity measures to each of the 1,322 AIS codes and 1,579 ICD-9 injury codes. The injury codes are sorted into body regions and then merged with the MARC table to assemble a set of regression coefficients. A logit model is generated to calculate the probability of death. TMPM.ado accommodates either the AIS or the ICD-9 lexicon from a single command and adds the POD for each patient to the original dataset as a new variable.
    Date: 2012–08–01
    URL: http://d.repec.org/n?u=RePEc:boc:scon12:20&r=for
  28. By: Lönnbark, Carl (Department of Economics, Umeå University)
    Abstract: In the estimation of risk measures such as Value at Risk and Expected Shortfall, relatively short estimation windows are typically used, rendering the estimation error a possibly non-negligible component. In this paper we build upon previous results for the Value at Risk and discuss how the estimation error comes into play for the Expected Shortfall. We identify two important aspects where it may matter: on the one hand, in the evaluation of predictors of the measure; on the other, in the interpretation and communication of it. We illustrate magnitudes numerically and emphasize the practical importance of the latter aspect in an empirical application with stock market index data.
    Keywords: Backtesting; Delta method; Finance; GARCH; Risk Management
    JEL: C52 C53 C58 G10 G19
    Date: 2012–08–16
    URL: http://d.repec.org/n?u=RePEc:hhs:umnees:0844&r=for
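To fix ideas about the object whose estimation error is being studied: for a conditionally normal loss, the Expected Shortfall at level α has a closed form, and the sampling variability of its plug-in estimator can be approximated with the delta method. The display below is a generic textbook illustration, not the paper's specific derivation.

```latex
% Expected Shortfall of a normal loss L ~ N(\mu, \sigma^2) at confidence level \alpha:
\[
\mathrm{ES}_{\alpha} \;=\; \mu \;+\; \sigma\,\frac{\phi\!\left(\Phi^{-1}(\alpha)\right)}{1-\alpha}.
\]
% For the plug-in estimator \widehat{\mathrm{ES}}_{\alpha} = \hat\mu + \hat\sigma\,\phi(\Phi^{-1}(\alpha))/(1-\alpha),
% the (delta-method) variance is
\[
\operatorname{Var}\!\left(\widehat{\mathrm{ES}}_{\alpha}\right)
\;\approx\;
\operatorname{Var}(\hat\mu)
\;+\;
\left(\frac{\phi\!\left(\Phi^{-1}(\alpha)\right)}{1-\alpha}\right)^{\!2}
\operatorname{Var}(\hat\sigma)
\;+\;
2\,\frac{\phi\!\left(\Phi^{-1}(\alpha)\right)}{1-\alpha}\,
\operatorname{Cov}(\hat\mu,\hat\sigma),
\]
% which can be non-negligible for the short estimation windows typical in risk management.
```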
  29. By: Mari-Vidal, Sergio; Segui-Mas, Elies; Marin-Sanchez, Maria del Mar; Mateos-Ronco, Alicia
    Abstract: Accounting information has been employed in many economic-financial models applied to registered corporations to predict business failure. Nonetheless, there are practically no research works that predict failure in agricultural cooperatives. The fundamental elements of this legal form justify the development of specific prediction models. The Delphi methodology has been used to define agricultural cooperative failure and to assess the usefulness of accounting variables. The conclusions suggest considering agricultural cooperatives with negative equity or cash-flow problems to be failed, or close to failure. Similarly, indebtedness volume, cash flow and solvency are the most relevant variables that can act as instruments for predicting business failure.
    Keywords: Agricultural cooperatives, business failure, Delphi method, accounting variables, Agribusiness, Farm Management, Risk and Uncertainty
    JEL: Q13 M41 M15 G33
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:ags:iaae12:128561&r=for
  30. By: Alexander Guarín; Andrés González; Daphné Skandalis; Daniela Sánchez
    Abstract: In this paper, we propose an alternative methodology to determine the existence of credit booms, which is a complex and crucial issue for policymakers. In particular, we exploit the idea of Mendoza and Terrones (2008) that macroeconomic aggregates other than the credit growth rate contain valuable information to predict credit boom episodes. Our econometric method is used to estimate and predict the probability of being in a credit boom. We run empirical exercises on quarterly data for six Latin American countries between 1996 and 2011. In order to capture model and parameter uncertainty simultaneously, we implement the Bayesian model averaging method. As we employ panel data, the estimates may be used to predict booms in countries which are not considered in the estimation. Overall, our findings show that macroeconomic variables contain valuable information to predict credit booms. In fact, with our method the probability of detecting a credit boom is 80%, while the probability of not having false alarms is greater than 92%.
    Keywords: Early Warning Indicator, Credit Booms, Business Cycles, Emerging Markets
    JEL: E32 E37 E44 E51 C53
    Date: 2012–07
    URL: http://d.repec.org/n?u=RePEc:bdr:borrec:723&r=for
  31. By: Halkos, George; Kevork, Ilias
    Abstract: In this paper we study a real-life inventory problem whose operating conditions match the principles of the classical newsvendor model. Applying appropriate tests to the available sample of historical demand data, we obtain sufficient statistical evidence that daily demand is stationary, uncorrelated, and normally distributed. Given that at the start of each day the selling price, the purchasing cost per unit, and the salvage value are known, and do not change through the whole period under investigation, we derive exact and asymptotic prediction intervals for the daily maximum expected profit. To evaluate their performance, we derive the analytic form of three accuracy information metrics. The first metric measures the deviation of the estimated probability of no stock-outs during the day from the critical fractile. The other two metrics relate the validity and precision of the two types of prediction interval to the variability of estimates for the ordered quantity. Both the theoretical and the empirical analysis demonstrate the importance of the loss of goodwill for the adopted inventory policy. When the system operates at the optimum, this intangible cost element determines the probability of no stock-outs during the day and governs the precision of the prediction intervals. A higher loss of goodwill leads to smaller estimates of the daily maximum expected profit and to wider prediction intervals. Finally, in the setting of the real-life newsvendor problem, we recommend the asymptotic prediction interval since, with samples of over 25 observations, this type of interval has higher precision and a probability of including the daily maximum expected profit that is almost equal to the nominal confidence level.
    Keywords: Newsvendor model; Loss of goodwill; Target inventory measures; Prediction interval; Accuracy information metric
    JEL: C13 M11 C44 D24
    Date: 2012–08–17
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:40724&r=for
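For context, the "critical fractile" referred to above is the optimal probability of no stock-out in the classical newsvendor model. With the underage cost raised by a per-unit loss of goodwill g (one standard way to include that cost; the notation here is mine, not the paper's) and normally distributed demand, the fractile and the optimal order quantity are:

```latex
% p: selling price, c: purchase cost per unit, s: salvage value,
% g: loss of goodwill per unit of unmet demand
\[
R \;=\; \frac{C_u}{C_u + C_o}
  \;=\; \frac{(p - c) + g}{(p - c) + g + (c - s)},
\qquad
F(Q^{*}) \;=\; R,
\]
\[
Q^{*} \;=\; \mu + \sigma\,\Phi^{-1}(R)
\quad\text{for demand } D \sim N(\mu,\sigma^{2}),
\]
% so a larger g raises the critical fractile R and the target inventory Q*,
% consistent with the abstract's point that the loss of goodwill shapes the
% probability of no stock-outs and the attainable maximum expected profit.
```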

This nep-for issue is ©2012 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.