
on Forecasting 
By:  Matteo Luciani; Libero Monteforte 
Abstract:  In this paper we propose to exploit the heterogeneity of forecasts produced by different model specifications to measure forecast uncertainty. Our approach is simple and intuitive: it consists of selecting all the models that outperform some benchmark model and then constructing an empirical distribution of the forecasts produced by these models. We interpret this distribution as a measure of uncertainty. We perform a pseudo real-time forecasting exercise on a large database of Italian data from 1982 to 2009, showing case studies of our measure of uncertainty. 
Keywords:  Factor models, Model uncertainty, Forecast combination, Density forecast. 
JEL:  C13 C32 C33 C52 C53 
Date:  2012–05 
URL:  http://d.repec.org/n?u=RePEc:itt:wpaper:wp20125&r=for 
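The select-then-pool procedure in the abstract can be sketched in a few lines (an illustrative reconstruction, not the authors' code; the forecasts, RMSEs, and benchmark value below are hypothetical):

```python
import numpy as np

def uncertainty_band(forecasts, rmse, benchmark_rmse, quantiles=(0.1, 0.5, 0.9)):
    """Keep forecasts from models that beat the benchmark (lower RMSE),
    then summarize their empirical distribution by quantiles."""
    forecasts = np.asarray(forecasts, dtype=float)
    rmse = np.asarray(rmse, dtype=float)
    keep = rmse < benchmark_rmse          # models that outperform the benchmark
    selected = forecasts[keep]
    if selected.size == 0:                # fall back to all models if none beat it
        selected = forecasts
    return np.quantile(selected, quantiles)

# five candidate model forecasts of next-quarter GDP growth (illustrative numbers)
f = [0.4, 0.6, 0.5, 1.2, 0.55]
r = [0.8, 0.7, 0.75, 1.5, 0.72]           # historical RMSE of each model
lo, med, hi = uncertainty_band(f, r, benchmark_rmse=1.0)
```

The model with RMSE 1.5 is dropped, and the spread of the surviving forecasts serves as the uncertainty measure.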
By:  Galimberti, Jaqueson K. 
Abstract:  Assuming an ARIMA(p,d,q) model represents the data, I show how optimal forecasts can be computed and derive general expressions for their main properties of interest. Namely, I present stepwise derivations of expressions for the variances of forecast errors, and for the covariances between them at arbitrary forecasting horizons. Matrix forms of these expressions are also presented to facilitate computational implementation. 
Keywords:  optimal forecasts; forecasts properties; ARIMA 
JEL:  C53 
Date:  2012–01–10 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:40303&r=for 
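For an ARMA(p,q) backbone, the h-step forecast-error variance discussed in the abstract reduces to the standard psi-weight formula Var(e_h) = sigma^2 * sum_{i=0}^{h-1} psi_i^2. A minimal sketch of the textbook recursion (not the paper's matrix forms):

```python
import numpy as np

def psi_weights(phi, theta, h):
    """MA-infinity (psi) weights of an ARMA(p,q): psi_0 = 1 and
    psi_j = theta_j + sum_i phi_i * psi_{j-i}."""
    psi = np.zeros(h)
    psi[0] = 1.0
    for j in range(1, h):
        psi[j] = theta[j - 1] if j - 1 < len(theta) else 0.0
        for i, ph in enumerate(phi, start=1):
            if j - i >= 0:
                psi[j] += ph * psi[j - i]
    return psi

def forecast_error_var(phi, theta, sigma2, h):
    """Variance of the h-step-ahead optimal forecast error:
    sigma2 * sum_{i=0}^{h-1} psi_i^2."""
    psi = psi_weights(phi, theta, h)
    return sigma2 * float(np.sum(psi ** 2))

# AR(1) with phi = 0.5: psi_i = 0.5**i, so the 2-step error
# variance is sigma2 * (1 + 0.25)
v = forecast_error_var([0.5], [], sigma2=1.0, h=2)
```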
By:  Adam E Clements (QUT); Mark Doolan (QUT); Stan Hurn (QUT); Ralf Becker (University of Manchester) 
Abstract:  Techniques for evaluating and selecting multivariate volatility forecasts are not yet as well understood as their univariate counterparts. This paper considers the ability of different loss functions to discriminate between a competing set of forecasting models which are subsequently applied in a portfolio allocation context. It is found that a likelihood-based loss function outperforms its competitors, including those based on the given portfolio application. This result indicates that the particular application of forecasts is not necessarily the most effective approach under which to select models. 
Keywords:  Multivariate volatility, portfolio allocation, forecast evaluation, model selection, model confidence set 
JEL:  C22 G00 
Date:  2012–08–09 
URL:  http://d.repec.org/n?u=RePEc:qut:auncer:2012_8&r=for 
By:  Karlsson, Sune (Department of Business, Economics, Statistics and Informatics) 
Abstract:  Prepared for the Handbook of Economic Forecasting, vol 2. This chapter reviews Bayesian methods for inference and forecasting with VAR models. Bayesian inference and, by extension, forecasting depend on numerical methods for simulating from the posterior distribution of the parameters, and special attention is given to the implementation of the simulation algorithm. 
Keywords:  Markov chain Monte Carlo; Structural VAR; Cointegration; Conditional forecasts; Time-varying parameters; Stochastic volatility; Model selection; Large VAR 
JEL:  C11 C32 C53 
Date:  2012–08–04 
URL:  http://d.repec.org/n?u=RePEc:hhs:oruesi:2012_012&r=for 
By:  Dick van Dijk (Erasmus University Rotterdam); Siem Jan Koopman (VU University Amsterdam); Michel van der Wel (Erasmus University Rotterdam, CREATES, Aarhus); Jonathan H. Wright (Johns Hopkins University) 
Abstract:  Many economic studies on inflation forecasting have found favorable results when inflation is modeled as a stationary process around a slowly time-varying trend. In contrast, the existing studies on interest rate forecasting either treat yields as being stationary, without any shifting endpoints, or treat yields as a random walk process. In this study we consider the problem of forecasting the term structure of interest rates under the assumption that the yield curve is driven by factors that are stationary around a time-varying trend. We compare alternative ways of modeling the time-varying trend. We find that allowing for shifting endpoints in yield curve factors can provide gains in out-of-sample predictive accuracy relative to stationary and random walk benchmarks. The results are both economically and statistically significant. 
Keywords:  term structure of interest rates; forecasting; nonstationarity; survey forecasts; yield curve 
JEL:  C32 E43 G17 
Date:  2012–07–19 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:20120076&r=for 
By:  Te Bao (University of Amsterdam); John Duffy (University of Pittsburgh); Cars Hommes (University of Amsterdam) 
Abstract:  Rational Expectations (RE) models have two crucial dimensions: 1) agents correctly forecast future prices given all available information, and 2) given expectations, agents solve optimization problems and these solutions in turn determine actual price realizations. Experimental testing of such models typically focuses on only one of these two dimensions. In this paper we consider both forecasting and optimization decisions in an experimental cobweb economy. We report results from four experimental treatments: 1) subjects form forecasts only, 2) subjects determine quantity only (solve an optimization problem), 3) they do both, and 4) they are paired in teams, with one member assigned the forecasting role and the other the optimization task. All treatments converge to the Rational Expectations Equilibrium (REE), but at very different speeds. We observe that performance is best in treatment 1) and worst in treatment 3). Most forecasters use an adaptive expectations rule. Subjects are less likely to make conditionally optimal production decisions for given forecasts in treatment 3), where the forecast is made by themselves, than in treatment 4), where the forecast is made by the other member of their team, which suggests that "two heads are better than one" in finding the REE. 
Keywords:  Learning; Rational Expectations; Optimization; Experimental Economics; Bounded Rationality 
JEL:  C91 C92 D83 D84 
Date:  2012–02–17 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:20120015&r=for 
By:  Cristina Conflitti; Christine De Mol; Domenico Giannone 
Abstract:  We consider the problem of optimally combining individual forecasts of gross domestic product (GDP) and inflation from the Survey of Professional Forecasters (SPF) dataset for the Euro Area. Contrary to the common practice of using equal combination weights, we compute optimal weights which minimize the mean square forecast error (MSFE) in the case of point forecasts and maximize a logarithmic score in the case of density forecasts. We show that this is a viable strategy even when the number of forecasts to combine gets large, provided we constrain these weights to be positive and to sum to one. Indeed, this enforces a form of shrinkage on the weights which ensures good out-of-sample performance of the combined forecasts. 
Keywords:  forecast combination; forecast evaluation; survey of professional forecasters; real-time data; shrinkage; high-dimensional data 
JEL:  C53 C22 
Date:  2012–08 
URL:  http://d.repec.org/n?u=RePEc:eca:wpaper:2013/124527&r=for 
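The constrained weighting scheme described here can be sketched as a small optimization problem (a hedged illustration, not the authors' implementation; the forecaster panel below is simulated):

```python
import numpy as np
from scipy.optimize import minimize

def combination_weights(F, y):
    """Weights minimizing the in-sample MSFE of the combined forecast F @ w,
    constrained to be nonnegative and to sum to one (the shrinkage device
    discussed in the abstract)."""
    n, k = F.shape
    w0 = np.full(k, 1.0 / k)                      # start from equal weights
    res = minimize(
        lambda w: np.mean((y - F @ w) ** 2),      # mean square forecast error
        w0,
        bounds=[(0.0, 1.0)] * k,
        constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
    )
    return res.x

rng = np.random.default_rng(0)
y = rng.normal(size=200)                          # target series
# three forecasters with increasing noise levels
F = np.column_stack([y + rng.normal(scale=s, size=200) for s in (0.2, 1.0, 5.0)])
w = combination_weights(F, y)                     # most weight on the most accurate
```

The simplex constraint keeps the weights from exploding even when many forecasters are nearly collinear, which is the source of the good out-of-sample behavior the authors report.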
By:  Cecilia Frale; Valentina Raponi 
Abstract:  This paper deals with the topic of data revisions, with the aim of investigating whether consecutive releases of macroeconomic series published by statistical agencies contain useful information for economic analysis and forecasting. The rationality of the revision process is tested using the complete history of the data, and an empirical application is presented to show the usefulness of revisions for improving the precision of a forecasting model. The results for Italian GDP growth show that embedding the revision process in a dynamic factor model helps to reduce the forecast error. 
Keywords:  Data revisions, real-time dataset, mixed frequency, Dynamic Factor Model. 
JEL:  E32 E37 C53 
Date:  2012–03 
URL:  http://d.repec.org/n?u=RePEc:itt:wpaper:wp20123&r=for 
By:  Ralf Brüggemann (University of Konstanz, Department Economics, Germany); Jing Zeng (University of Konstanz, Department Economics, Germany) 
Abstract:  We suggest using a factor-model-based backdating procedure to construct historical Euro-area macroeconomic time series for the pre-Euro period. We argue that this is a useful alternative to standard contemporaneous aggregation methods. The paper investigates, for a number of Euro-area variables, whether forecasts based on the factor-backdated data are more precise than those obtained with standard area-wide data. A recursive pseudo-out-of-sample forecasting experiment using quarterly data is conducted. Our results suggest that some key variables (e.g. real GDP, inflation and the long-term interest rate) can indeed be forecasted more precisely with the factor-backdated data. 
Keywords:  forecasting, factor model, backdating, European monetary union, constructing EMU data 
JEL:  C22 C53 C43 C82 
Date:  2012–08–02 
URL:  http://d.repec.org/n?u=RePEc:knz:dpteco:1215&r=for 
By:  Pablo Pincheira; Roberto Álvarez 
Abstract:  The Central Bank of Chile builds inflation forecasts for several time horizons and using various methodologies. In this paper, we analyze one of these series of short-term inflation forecasts, which we call Auxiliary Inflation Forecasts (AIF), comparing them to forecasts made by private analysts and to forecasts built from simple time-series models. We also evaluate the AIF using encompassing tests as well as bias and weak-efficiency tests. Our aim is to answer two linked questions: first, which is the best forecast series under a specific loss function? And second, are the differences in accuracy between any two series of forecasts fully explained by differences in the information sets from which they were built? Our results indicate that the AIF behave extremely well at one- and two-month horizons, but they are less adequate at longer horizons. 
Date:  2012–08 
URL:  http://d.repec.org/n?u=RePEc:chb:bcchwp:674&r=for 
By:  Julieta Fuentes; Pilar Poncela; Julio Rodríguez 
Abstract:  Factor models have been applied extensively for forecasting when high-dimensional datasets are available. In this case, the number of variables can be very large: usual dynamic factor models in central banks handle over 100 variables. However, a growing body of literature indicates that more variables do not necessarily lead to estimated factors with lower uncertainty or better forecasting results. This paper investigates the usefulness of partial least squares techniques, which take into account the variable to be forecasted when reducing the dimension of the problem from a large number of variables to a smaller number of factors. We propose different approaches to dynamic sparse partial least squares as a means of improving forecast efficiency by simultaneously taking the forecasted variable into account while forming an informative subset of predictors, instead of using all the available ones to extract the factors. We use the well-known Stock and Watson database to check the forecasting performance of our approach. The proposed dynamic sparse models show good performance in improving efficiency compared to widely used factor methods in macroeconomic forecasting. 
Keywords:  Factor Models, Forecasting, Large Datasets, Partial Least Squares, Sparsity, Variable Selection 
Date:  2012–08 
URL:  http://d.repec.org/n?u=RePEc:cte:wsrepe:ws122216&r=for 
By:  Jeffrey A. Frankel; Jesse Schreger 
Abstract:  Why do countries find it so hard to get their budget deficits under control? Systematic patterns in the errors that official budget agencies make in their forecasts may play an important role. Although many observers have suggested that fiscal discipline can be restored via fiscal rules such as a legal cap on the budget deficit, forecasting bias can defeat such rules. The members of the eurozone are supposedly constrained by the fiscal caps of the Stability and Growth Pact. Yet ever since the birth of the euro in 1999, members have postponed painful adjustment by making overly optimistic forecasts of future growth and budget positions and arguing that the deficits will fall below the cap within a year or two. The new fiscal compact among the euro countries is supposed to make budget rules more binding by putting them into laws and constitutions at the national level. But what is the record with such national rules? Our econometric findings are summarized as follows: 
• Governments' budget forecasts are biased in the optimistic direction, especially among the eurozone countries, especially when they have large contemporaneous budget deficits, and especially during booms. 
• Governments' real GDP forecasts are similarly over-optimistic during booms. 
• Despite the well-known tendency of eurozone members to exceed the 3% cap on budget deficits, often in consecutive years, they almost never forecast that they will violate the cap in the coming years. This is the source of the extra bias among eurozone forecasts. If euro-area governments are not in violation of the 3% cap at the time forecasts are made, their forecasts are no more biased than those of other countries. 
• Although euro members without national budget balance rules have a larger over-optimism bias than non-member countries, national fiscal rules help counteract the wishful thinking that seems to come with euro membership. The reason is that when governments are in violation of the 3% cap, the national rules apparently constrain them from making such unrealistic forecasts. 
• Similarly, the existence of an independent fiscal institution producing budget forecasts at the national level reduces the over-optimism bias of forecasts made when countries are in violation of the 3% cap. 
JEL:  F3 
Date:  2012–08 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:18283&r=for 
By:  Falk Brauning (VU University Amsterdam); Siem Jan Koopman (VU University Amsterdam) 
Abstract:  We explore a new approach to the forecasting of macroeconomic variables based on a dynamic factor state space analysis. Key economic variables are modeled jointly with principal components from a large time series panel of macroeconomic indicators using a multivariate unobserved components time series model. When the key economic variables are observed at a low frequency and the panel of macroeconomic variables is at a high frequency, we can use our approach for both nowcasting and forecasting purposes. Given a dynamic factor model as the data generation process, we provide Monte Carlo evidence for the finite-sample justification of our parsimonious and feasible approach. We also provide empirical evidence for a U.S. macroeconomic dataset. The unbalanced panel contains quarterly and monthly variables. The forecasting accuracy is measured against a set of benchmark models. We conclude that our dynamic factor state space analysis can lead to higher forecasting precision when panel size and time series dimensions are moderate. 
Keywords:  Kalman filter; Mixed frequency; Nowcasting; Principal components; State space model; Unobserved Components Time Series Model 
JEL:  C33 C53 E17 
Date:  2012–04–20 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:20120042&r=for 
By:  Bert de Bruijn (Erasmus University Rotterdam); Philip Hans Franses (Erasmus University Rotterdam) 
Abstract:  Earnings forecasts can be useful for investment decisions. Research on earnings forecasts has focused on forecast performance in relation to firm characteristics, on categorizing the analysts into groups with similar behaviour, and on the effect of an earnings announcement by the firm on future earnings forecasts. In this paper we investigate the factors that determine the value of the forecast and also investigate to what extent the timing of the forecast can be modeled. We propose a novel methodology that allows for such an investigation. As an illustration we analyze within-year earnings forecasts for AMD in the period 1997 to 2011, where the data are obtained from the I/B/E/S database. Our empirical findings suggest clear drivers of the value and the timing of the earnings forecast. We thus show that not only the forecasts themselves but also the timing of the quotes is predictable to some extent. 
Keywords:  Earnings Forecasts; Earnings Announcements; Financial Markets; Financial Analysts 
JEL:  G17 G24 M41 
Date:  2012–07–12 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:20120067&r=for 
By:  Mark J. Jensen; John M. Maheu 
Abstract:  This paper proposes a Bayesian nonparametric modeling approach for the return distribution in multivariate GARCH models. In contrast to the parametric literature, the return distribution can display general forms of asymmetry and thick tails. An infinite mixture of multivariate normals is given a flexible Dirichlet process prior. The GARCH functional form enters into each of the components of this mixture. We discuss conjugate methods that allow for scale mixtures and nonconjugate methods, which provide mixing over both the location and scale of the normal components. MCMC methods are introduced for posterior simulation and computation of the predictive density. Bayes factors and density forecasts with comparisons to GARCH models with Student-t innovations demonstrate the gains from our flexible modeling approach. 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:fip:fedawp:201209&r=for 
By:  Philip Hans Franses (Erasmus University Rotterdam); Rianne Legerstee (Erasmus University Rotterdam); Richard Paap (Erasmus University Rotterdam) 
Abstract:  We propose a new and simple methodology to estimate the loss function associated with experts' forecasts. Under the assumption of conditional normality of the data and the forecast distribution, the asymmetry parameter of the lin-lin and linex loss functions can easily be estimated using a linear regression. This regression also provides an estimate for potential systematic bias in the forecasts of the expert. The residuals of the regression are the input for a test for the validity of the normality assumption. We apply our approach to a large data set of SKU-level sales forecasts made by experts and we compare the outcomes with those for statistical model-based forecasts of the same sales data. We find substantial evidence for asymmetry in the loss functions of the experts, with underprediction penalized more than overprediction. 
Keywords:  model forecasts; expert forecasts; loss functions; asymmetry; econometric models 
JEL:  C50 C53 
Date:  2011–12–16 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:20110177&r=for 
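A simpler, moment-based illustration of the lin-lin idea (not the authors' regression approach): under lin-lin loss the optimal forecast is the alpha-quantile of the predictive distribution, so the share of outcomes at or below the forecast estimates the asymmetry parameter. All data below are simulated:

```python
import numpy as np

def linlin_asymmetry(actual, forecast):
    """Moment-based estimate of the lin-lin asymmetry parameter alpha:
    the optimal forecast under lin-lin loss is the alpha-quantile of the
    predictive distribution, so alpha is estimated by the fraction of
    outcomes falling at or below the forecast."""
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    return float(np.mean(actual <= forecast))

rng = np.random.default_rng(1)
demand = rng.normal(100.0, 10.0, size=5000)
# a hypothetical expert who pads forecasts upward because
# under-prediction is penalized more heavily than over-prediction
expert = demand + rng.normal(0.0, 5.0, size=5000) + 6.0
alpha_hat = linlin_asymmetry(demand, expert)     # well above the symmetric 0.5
```

An estimate far from 0.5 signals an asymmetric loss, in the same spirit as the regression-based test in the paper.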
By:  Thomas Laurent; Tomasz Kozluk 
Abstract:  Uncertainty is inherent to forecasting, and assessing the uncertainty surrounding a point forecast is as important as the forecast itself. Following Cornec (2010), a method is developed to assess the uncertainty around the indicator models used at the OECD to forecast GDP growth of the six largest member countries, using quantile regressions to construct a probability distribution of future GDP rather than mean point forecasts. This approach allows uncertainty to be assessed conditionally on the current state of the economy and is entirely model-based and judgement-free. The quality of the computed distributions is tested against other approaches to measuring forecast uncertainty, and a set of uncertainty indicators is constructed to help exploit the most useful information. 
Keywords:  forecasting, uncertainty, GDP, quantile regression 
JEL:  C31 C53 
Date:  2012–07–06 
URL:  http://d.repec.org/n?u=RePEc:oec:ecoaaa:978en&r=for 
By:  Ralph Snyder; Adrian Beaumont; J. Keith Ord 
Abstract:  This paper is concerned with identifying an effective method for forecasting the lead-time demand of slow-moving inventories. Particular emphasis is placed on prediction distributions instead of point predictions alone, and on methods that work with small samples as well as large ones, in recognition of the fact that the typical range of items has a mix of vintages due to different commissioning and decommissioning dates over time. Various forecasting methods are compared using monthly demand data for more than one thousand car parts. It is found that a multi-series version of exponential smoothing coupled with a Pólya (negative binomial) distribution works better than the other twenty-four methods considered, including the Croston method. 
Keywords:  Demand forecasting; inventory control; shifted Poisson distribution 
JEL:  C22 
Date:  2012–07 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:201215&r=for 
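The exponential-smoothing-plus-negative-binomial idea can be sketched as follows (a single-series simplification of the paper's multi-series method; the smoothing constant, dispersion factor, and demand history are hypothetical):

```python
import numpy as np
from scipy.stats import nbinom

def reorder_point(demand, lead_time=3, alpha=0.3, dispersion=2.0, service=0.95):
    """Simple exponential smoothing of the demand level, then a negative
    binomial (Polya) lead-time demand distribution with variance =
    dispersion * mean, from which a service-level quantile is read off."""
    level = demand[0]
    for d in demand[1:]:
        level = alpha * d + (1 - alpha) * level   # smoothed one-month mean
    mean = lead_time * level                      # lead-time demand mean
    var = dispersion * mean                       # over-dispersed vs. Poisson
    p = mean / var                                # nbinom parameterization:
    n = mean ** 2 / (var - mean)                  # mean = n*(1-p)/p
    return int(nbinom.ppf(service, n, p))         # stock covering 95% of demand

monthly = [0, 2, 0, 1, 0, 0, 3, 0, 1, 0]          # a slow-moving car part
rp = reorder_point(np.array(monthly, dtype=float))
```

The prediction distribution, not the point forecast, is what sets the reorder point; with intermittent demand the two can differ substantially.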
By:  Ephias M. Makaudze 
Abstract:  Drought constitutes the most dominant source of food insecurity in Zimbabwe and many other countries in Africa. With a majority of smallholder farmers practicing dryland agriculture, seasonal forecasts hold promise as an effective risk management tool, giving farmers the ability to anticipate rainfall variability early enough to adjust crucial farm decisions and to be better prepared to handle climatic anomalies in ways that can reduce costly losses (crop, animal and even human). This study demonstrates the potential value of forecasts to smallholder farmers in Zimbabwe, the majority of whom often suffer severely from the impact of drought. Using crop simulation models to compare the yield performance of farmers with and without forecasts, results indicate that in a drought year, farmers with forecasts (WF) record higher yield gains (28%) than those without forecasts (WOF); in particular, farmers located in the driest regions (NR V) record the highest yield gains (42%). Similar results are observed for a neutral/average year, as farmers WF obtain predominantly higher yield gains (20%) than those WOF. For a good year, however, results show a different pattern, as no yield gains are observed; in fact, farmers WOF perform better, suggesting that forecasts in this case may not make much difference. Using gross margin analysis, results show farmers WF obtaining higher returns during a drought year (US$0.14 ha−1) and a neutral year (US$0.43 ha−1), but again not in a good year, when farmers WOF outperform those WF. In sum, forecasts can play an important role as loss-minimization instruments, especially if the underlying year is an El Niño (drought) year. To attain the full economic value of forecasts, complementary policies (currently missing) such as effective communication, improvement in forecast extension skills, and promotion of farmer participatory and outreach activities could all prove vital in enhancing the value of forecasts to smallholder farmers. 
Keywords:  Seasonal forecasts, smallholder farmers, El Niño, economic value, drought risk, with and without forecasts 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:rza:wpaper:303&r=for 
By:  Caio Almeida; Axel Simonsen; José Vicente 
Abstract:  Recent empirical analysis of interest rate markets documents that bond demand and supply directly affect yield curve movements and bond risk premia. Motivated by those findings, we propose a parametric interest rate model that allows for segmentation and local shocks in the term structure. We split the yield curve into segments with their own local movements that are globally interconnected by smoothing conditions. Two classes of segmented exponential models are derived and compared to successful term structure models in a sequence of out-of-sample forecasting exercises. Using U.S. interest rate data available from 1985 to 2008, the segmented models present overall better forecasting performance, suggesting that local shocks might indeed be important determinants of yield curve dynamics. 
Date:  2012–07 
URL:  http://d.repec.org/n?u=RePEc:bcb:wpaper:288&r=for 
By:  Csávás, Csaba; Erhart, Szilárd; Naszódi, Anna; Pintér, Klára 
Abstract:  There is ample empirical evidence in the literature for the positive effect of central bank transparency on the economy. The main channel is that transparency reduces uncertainty regarding future monetary policy and thereby helps agents make better investment and saving decisions. In this paper, we document how the degree of transparency of central banks in Central and Eastern Europe changed during periods of financial stress, and we argue that during the recent financial crisis central banks became less transparent. We also investigate how these changes affected uncertainty in these economies, measured by the degree of disagreement across professional forecasters over future short-term and long-term interest rates and by their forecast accuracy. 
Keywords:  central banking; transparency; financial crises; survey expectations; forecasting 
JEL:  E58 E44 E47 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:40335&r=for 
By:  Francis Vitek 
Abstract:  This paper develops a structural macroeconometric model of the world economy, disaggregated into thirty-five national economies. This panel unobserved components model features a monetary transmission mechanism, a fiscal transmission mechanism, and extensive macro-financial linkages, both within and across economies. A variety of monetary policy analysis, fiscal policy analysis, spillover analysis, and forecasting applications of the estimated model are demonstrated, based on a Bayesian framework for conditioning on judgment. 
Keywords:  Cross-country analysis, Fiscal policy, Monetary policy, Monetary transmission mechanism, Spillovers, Forecasting models 
Date:  2012–06–07 
URL:  http://d.repec.org/n?u=RePEc:imf:imfwpa:12/149&r=for 
By:  Wolfgang Karl Härdle; Piotr Majer; Melanie Schienle 
Abstract:  Using a Dynamic Semiparametric Factor Model (DSFM) we investigate the term structure of interest rates. The proposed methodology is applied to monthly interest rates for four southern European countries: Greece, Italy, Portugal and Spain, from the introduction of the Euro to the recent European sovereign-debt crisis. Analyzing this extraordinary period, we compare our approach with the standard market method, the dynamic Nelson-Siegel model. Our findings show that two nonparametric factors capture the spatial structure of the yield curve for each of the bond markets separately. We attribute both factors to the slope of the yield curve. For panel term structure data, three nonparametric factors are necessary to explain 95% of the variation. The estimated factor loadings are unit root processes and reveal high persistence. In comparison with the benchmark model, the DSFM technique shows superior short-term forecasting. 
Keywords:  yield curve, term structure of interest rates, semiparametric model, factor structure, prediction 
JEL:  C14 C51 
Date:  2012–08 
URL:  http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2012048&r=for 
By:  Kuper, Gerard H.; Sterken, Elmer (Groningen University) 
Abstract:  The current paper predicts the medal tally for the London 2012 Olympic Games. The forecast procedure consists of analyzing participation and success at the country level for the three most recent editions of the Olympic Summer Games. Potential explanatory variables for medal winnings are income per capita, population, geographical distance to the Games, success in terms of medals won at World Championships, and the home advantage. Our forecasts show that China takes first place in the medal tally with 44 gold medals, followed by the United States of America winning 33 gold medals. We expect Great Britain to take fourth place, winning 23 gold medals. 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:dgr:rugsom:12006eef&r=for 
By:  Chen, ShuLing; Jackson, John D.; Kim, Hyeongwoo; Resiandini, Pramesti 
Abstract:  This paper examines common forces driving the prices of 51 highly tradable commodities. We demonstrate that highly persistent movements of these prices are mostly due to the first common component, which is closely related to the US nominal exchange rate. In particular, our simple factor-based model outperforms the random walk model in out-of-sample forecasts for the US exchange rate. The second common factor and defactored idiosyncratic components are consistent with stationarity, implying short-lived deviations from the equilibrium price dynamics. In concert, these results provide an intriguing resolution to the apparent inconsistency arising from stable markets with nonstationary prices. 
Keywords:  Commodity Prices; US Nominal Exchange Rate; Panel Analysis of Nonstationarity in Idiosyncratic and Common Components; Cross-Section Dependence; Out-of-Sample Forecast 
JEL:  C53 F31 
Date:  2012–08 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:40711&r=for 
By:  Michael Iacono; David Levinson; Ahmed ElGeneidy; Rania Wasfi (Nexus (Networks, Economics, and Urban Systems) Research Group, Department of Civil Engineering, University of Minnesota) 
Abstract:  The set of models available to predict land use change in urban regions has become increasingly complex in recent years. Despite their complexity, the predictive power of these models remains relatively weak. This paper presents an example of an alternative modeling framework based on the concept of a Markov chain. The model assumes that land use at any given time, which is viewed as a discrete state, can be considered a function of only its previous state. The probability of transition between each pair of states is recorded as an element of a transition probability matrix. Assuming that this matrix is stationary over time, it can be used to predict future land use distributions from current data. To illustrate this process, a Markov chain model is estimated for the Minneapolis-St. Paul, MN, USA (Twin Cities) metropolitan region. Using a unique set of historical land use data covering several years between 1958 and 2005, the model is tested using historical data to predict recent conditions, and is then used to forecast the future distribution of land use decades into the future. We also use the cell-level data set to estimate the fraction of regional land use devoted to transportation facilities, including major highways, airports, and railways. The paper concludes with some comments on the strengths and weaknesses of Markov chains as a land use modeling framework, and suggests some possible extensions of the model. 
Keywords:  mode choice, mode shares, mixed logit, stated preference. 
JEL:  R11 R12 R14 R41 R52 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:nex:wpaper:markovlu&r=for 
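The transition-matrix forecasting step described above can be sketched directly (the state labels and transition counts below are hypothetical, not the Twin Cities data):

```python
import numpy as np

# States: 0 = undeveloped, 1 = residential, 2 = commercial (illustrative labels)
# Transition counts observed between two survey years (hypothetical numbers)
counts = np.array([[80, 15, 5],
                   [2, 90, 8],
                   [1, 4, 95]], dtype=float)

# Row-normalize the counts into a row-stochastic transition probability matrix
P = counts / counts.sum(axis=1, keepdims=True)

x0 = np.array([0.5, 0.3, 0.2])                   # current land use shares
# Stationarity of P lets us forecast several periods ahead with matrix powers
x3 = x0 @ np.linalg.matrix_power(P, 3)
```

The stationarity assumption is exactly what the paper tests by fitting on early surveys and predicting later ones.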
By:  Alan Cook (Baylor University Medical Center  Dallas, Texas); Turner M. Osler (University of Vermont College of Medicine, Department of Surgery) 
Abstract:  Many methods have been developed to predict mortality following trauma. Two classification systems are used to provide a taxonomy for diseases, including injuries. The ICD-9 is the classification system for administrative data in the U.S.A.; the AIS was developed for the characterization of injuries alone. The Trauma Mortality Prediction Model (TMPM) is based on empiric estimates of severity for each injury in the ICD-9 and AIS lexicons. Each probability of death (POD) is estimated from the five worst injuries per patient. The TMPM has been rigorously tested against other mortality prediction models using ICD-9 and AIS data and found superior. The TMPM.ado command allows Stata users to efficiently apply the TMPM to datasets using ICD-9 or AIS codes. The command makes use of model-averaged regression coefficients (MARC) that assign empirically derived severity measures to each of the 1,322 AIS codes and 1,579 ICD-9 injury codes. The injury codes are sorted into body regions and then merged with the MARC table to assemble a set of regression coefficients. A logit model is generated to calculate the probability of death. TMPM.ado accommodates either the AIS or ICD-9 lexicon from a single command and adds the POD for each patient to the original dataset as a new variable. 
Date:  2012–08–01 
URL:  http://d.repec.org/n?u=RePEc:boc:scon12:20&r=for 
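The core computation the abstract describes — summing severity coefficients for a patient's worst injuries and mapping through a logit — can be sketched as below. The codes, coefficient values, and intercept are invented for illustration; they are not the published MARC table, and the actual TMPM also sorts codes into body regions and includes additional model terms:

```python
import math

# Hypothetical MARC-style severity coefficients keyed by injury code
# (values are illustrative, not the published MARC estimates).
marc = {"807.03": 0.9, "861.21": 1.4, "864.04": 1.1}

def probability_of_death(codes, intercept=-2.5):
    """TMPM-style sketch: sum the severities of the (up to five)
    worst injuries, then map through a logistic function."""
    worst = sorted((marc[c] for c in codes), reverse=True)[:5]
    x = intercept + sum(worst)
    return 1.0 / (1.0 + math.exp(-x))

pod = probability_of_death(list(marc))  # POD for all three injuries
```

Under this sketch, adding an injury code can only raise the linear index, so the predicted probability of death is monotone in the injury set.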
By:  Lönnbark, Carl (Department of Economics, Umeå University) 
Abstract:  In the estimation of risk measures such as Value at Risk and Expected Shortfall, relatively short estimation windows are typically used, rendering the estimation error a possibly non-negligible component. In this paper we build upon previous results for the Value at Risk and discuss how the estimation error comes into play for the Expected Shortfall. We identify two aspects where it may be of importance: on the one hand, in the evaluation of predictors of the measure; on the other, in its interpretation and communication. We illustrate magnitudes numerically and emphasize the practical importance of the latter aspect in an empirical application with stock market index data. 
Keywords:  Backtesting; Delta method; Finance; GARCH; Risk Management 
JEL:  C52 C53 C58 G10 G19 
Date:  2012–08–16 
URL:  http://d.repec.org/n?u=RePEc:hhs:umnees:0844&r=for 
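The point about estimation error from short windows can be illustrated with a plug-in Expected Shortfall under normality. This is a generic Monte Carlo sketch, not the paper's delta-method calculations; the window length and distributional assumptions are invented for illustration:

```python
import math
import random

random.seed(1)

def normal_es(mu, sigma, alpha=0.05):
    """Expected Shortfall (lower tail) of a normal P&L distribution;
    the quantile z is hard-coded for alpha = 0.05."""
    z = -1.6449  # 5% standard normal quantile
    pdf = math.exp(-z * z / 2) / math.sqrt(2 * math.pi)
    return mu - sigma * pdf / alpha

true_es = normal_es(0.0, 1.0)  # about -2.06 for a standard normal

# Plug-in ES from short (250-day) estimation windows: the spread
# across replications visualizes the estimation error.
estimates = []
for _ in range(2000):
    window = [random.gauss(0.0, 1.0) for _ in range(250)]
    m = sum(window) / len(window)
    s = math.sqrt(sum((x - m) ** 2 for x in window) / (len(window) - 1))
    estimates.append(normal_es(m, s))

spread = max(estimates) - min(estimates)
```

Even with the model correctly specified, the plug-in estimates scatter noticeably around the true value, which is the component the abstract argues should not be ignored.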
By:  Mari-Vidal, Sergio; Segui-Mas, Elies; Marin-Sanchez, Maria del Mar; Mateos-Ronco, Alicia 
Abstract:  Accounting information has been employed in many economic-financial models applied to registered corporations to predict business failure. Nonetheless, there are practically no research works that predict failure in agricultural cooperatives. The fundamental elements of this legal form justify the development of specific prediction models. The Delphi methodology has been used to define agricultural cooperative failure and to assess the usefulness of accounting variables. The conclusions suggest considering agricultural cooperatives with negative equity or cash flow problems to be failed, or close to failure. Similarly, indebtedness volume, cash flow and solvency are the most relevant variables that can act as business prediction instruments. 
Keywords:  Agricultural cooperatives, business failure, Delphi method, accounting variables, Agribusiness, Farm Management, Risk and Uncertainty 
JEL:  Q13 M41 M15 G33 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:ags:iaae12:128561&r=for 
By:  Alexander Guarín; Andrés González; Daphné Skandalis; Daniela Sánchez 
Abstract:  In this paper, we propose an alternative methodology to determine the existence of credit booms, which is a complex and crucial issue for policymakers. In particular, we exploit the idea of Mendoza and Terrones (2008) that macroeconomic aggregates other than the credit growth rate contain valuable information to predict credit boom episodes. Our econometric method is used to estimate and predict the probability of being in a credit boom. We run empirical exercises on quarterly data for six Latin American countries between 1996 and 2011. In order to capture model and parameter uncertainty simultaneously, we implement the Bayesian model averaging method. As we employ panel data, the estimates may be used to predict booms in countries not considered in the estimation. Overall, our findings show that macroeconomic variables contain valuable information to predict credit booms. In fact, with our method the probability of detecting a credit boom is 80%, while the probability of not having false alarms is greater than 92%. 
Keywords:  Early Warning Indicator, Credit Booms, Business Cycles, Emerging Markets 
JEL:  E32 E37 E44 E51 C53 
Date:  2012–07 
URL:  http://d.repec.org/n?u=RePEc:bdr:borrec:723&r=for 
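The Bayesian model averaging step the abstract mentions amounts to weighting each candidate model's predicted boom probability by its posterior model probability. The sketch below uses invented macro values, coefficients, and weights, not the paper's panel estimates:

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical macro aggregates for one country-quarter.
macro = {"credit_gap": 1.8, "current_account": -0.7, "reer": 0.9}

# Candidate models: (posterior weight, intercept, {variable: coef}).
# Each model conditions on a different subset of the aggregates.
models = [
    (0.5, -1.0, {"credit_gap": 1.2}),
    (0.3, -0.8, {"credit_gap": 0.9, "reer": 0.4}),
    (0.2, -1.2, {"credit_gap": 1.0, "current_account": -0.5}),
]

# BMA prediction: posterior-weighted average of each logit model's
# estimated probability of being in a credit boom.
p_boom = sum(
    w * logistic(b0 + sum(c * macro[v] for v, c in coefs.items()))
    for w, b0, coefs in models
)
```

Averaging over models in this way is what lets the approach carry both parameter uncertainty (within each logit) and model uncertainty (across specifications) into a single predicted probability.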
By:  Halkos, George; Kevork, Ilias 
Abstract:  In the current paper we study a real-life inventory problem whose operating conditions match the principles of the classical newsvendor model. Applying appropriate tests to the available sample of historical demand data, we obtain sufficient statistical evidence to support that daily demand is stationary, uncorrelated, and normally distributed. Given that at the start of each day the selling price, the purchasing cost per unit, and the salvage value are known, and do not change through the whole period under investigation, we derive exact and asymptotic prediction intervals for the daily maximum expected profit. To evaluate their performance, we derive the analytic form of three accuracy information metrics. The first metric measures the deviation of the estimated probability of no stockouts during the day from the critical fractile. The other two metrics relate the validity and precision of the two types of prediction interval to the variability of estimates for the ordered quantity. Both theoretical and empirical analyses demonstrate the importance of the implications of the loss of goodwill for the adopted inventory policy. When the system operates at the optimum, this intangible cost element determines the probability of no stockouts during the day and the precision of the prediction intervals. An increase in the loss of goodwill leads to smaller estimates of the daily maximum expected profit and to wider prediction intervals. Finally, in the setting of the real-life newsvendor problem, we recommend the asymptotic prediction interval, since with samples of over 25 observations this type of interval has higher precision and a probability of including the daily maximum expected profit almost equal to the nominal confidence level. 
Keywords:  Newsvendor model; Loss of goodwill; Target inventory measures; Prediction interval; Accuracy information metric 
JEL:  C13 M11 C44 D24 
Date:  2012–08–17 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:40724&r=for 
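The role of the loss of goodwill in setting the critical fractile — the optimal probability of no stockouts the abstract refers to — can be sketched with the standard newsvendor formulas under normal demand. All parameter values are illustrative, not the paper's data:

```python
from statistics import NormalDist

# Illustrative parameters: selling price p, unit cost c, salvage
# value v, and loss of goodwill g per unit of unmet demand.
p, c, v, g = 10.0, 6.0, 2.0, 3.0

cu = p - c + g        # underage cost: lost margin plus goodwill
co = c - v            # overage cost: leftovers sold at salvage
cf = cu / (cu + co)   # critical fractile = optimal Pr(no stockout)

# Same fractile when the loss of goodwill is ignored (g = 0).
cf_no_goodwill = (p - c) / ((p - c) + (c - v))

# Optimal order quantity under normally distributed daily demand.
mu, sigma = 100.0, 20.0
q_star = mu + sigma * NormalDist().inv_cdf(cf)
```

Because goodwill losses enter only the underage cost, a larger g pushes the critical fractile (and hence the order-up-to level) upward, consistent with the abstract's point that this intangible cost determines the probability of no stockouts.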