nep-for New Economics Papers
on Forecasting
Issue of 2011‒07‒27
nine papers chosen by
Rob J Hyndman
Monash University

  1. Does the Box-Cox transformation help in forecasting macroeconomic time series? By Tommaso Proietti; Helmut Luetkepohl
  2. The Euro-Sting revisited: PMI versus ESI to obtain euro area GDP forecasts By Maximo Camacho; Agustin Garcia-Serrador
  3. Forecasting in the presence of recent structural change By Jana Eklund; George Kapetanios; Simon Price
  4. Risk Management of Risk Under the Basel Accord: A Bayesian Approach to Forecasting Value-at-Risk of VIX Futures By Michael McAleer; Roberto Casarin; Chia-Lin Chang; Juan-Ángel Jiménez-Martín; Teodosio Pérez-Amaral
  5. Varying the VaR for Unconditional and Conditional Environments By John Cotter
  6. FOMC Communication Policy and the Accuracy of Fed Funds Futures By Menno Middeldorp
  7. Central Bank Transparency, the Accuracy of Professional Forecasts, and Interest Rate Volatility By Menno Middeldorp
  8. Furniture distribution in Russia By Alessandra Tracogna; Stefania Pelizzari
  9. GFC-Robust Risk Management Under the Basel Accord Using Extreme Value Methodologies By Paulo Araújo Santos; Juan-Ángel Jiménez-Martín; Michael McAleer; Teodosio Pérez Amaral

  1. By: Tommaso Proietti; Helmut Luetkepohl
    Abstract: The paper investigates whether transforming a time series leads to an improvement in forecasting accuracy. The class of transformations considered is the Box-Cox power transformation, which applies to series measured on a ratio scale. We propose a nonparametric approach for estimating the optimal transformation parameter based on the frequency domain estimation of the prediction error variance, and also conduct an extensive recursive forecast experiment on a large set of seasonal monthly macroeconomic time series related to industrial production and retail turnover. In about one fifth of the series considered, the Box-Cox transformation produces forecasts significantly better than those from the untransformed data at the one-step-ahead horizon; in most cases the logarithmic transformation is the relevant one. As the forecast horizon increases, the evidence in favour of a transformation becomes less strong. Typically, the naïve predictor that just reverses the transformation leads to a lower mean square error than the optimal predictor at short forecast leads. We also discuss whether the preliminary in-sample frequency domain assessment provides reliable guidance as to which series should be transformed in order to improve predictive performance significantly.
    Keywords: Forecasts comparisons; Multi-step forecasting; Rolling forecasts; Nonparametric estimation of prediction error variance.
    JEL: C53 C14 C52 C22
    Date: 2011–07–18
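    The Box-Cox transform and the naïve predictor that "just reverses the transformation" can be sketched as follows (a minimal Python illustration; the toy series, the λ value, and the crude one-step forecast rule are invented for demonstration, not taken from the paper):

```python
import numpy as np

def boxcox(y, lam):
    """Box-Cox power transform for a series measured on a ratio scale (y > 0)."""
    return np.log(y) if lam == 0 else (y**lam - 1.0) / lam

def inv_boxcox(z, lam):
    """Naive predictor: simply reverses the transform, with no bias correction."""
    return np.exp(z) if lam == 0 else (lam * z + 1.0) ** (1.0 / lam)

# Toy monthly series; lam = 0 gives the logarithmic transformation,
# which the paper finds is the relevant one in most cases.
y = np.array([110.0, 118.0, 131.0, 142.0])
z = boxcox(y, 0.0)
z_hat = z[-1] + np.diff(z).mean()   # crude one-step forecast on the transformed scale
y_hat = inv_boxcox(z_hat, 0.0)      # naive forecast back on the original scale
```

The transform and its inverse are exact round-trips; the bias the paper discusses arises only when a forecast *distribution* on the transformed scale is mapped back through the nonlinear inverse.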
  2. By: Maximo Camacho; Agustin Garcia-Serrador
    Abstract: This paper uses an extension of the Euro-Sting single-index dynamic factor model to construct short-term forecasts of quarterly GDP growth for the euro area, which also includes financial variables as leading indicators. In a simulated real-time exercise, the model is used to investigate forecasting accuracy across the different phases of the business cycle. In addition, the model is used to evaluate the relative forecasting ability of the two most watched business cycle surveys for the eurozone, the PMI and the ESI. We show that the latter produces more accurate GDP forecasts than the former. Finally, the proposed model also captures the European business cycle well, along with the probabilities of expansion and contraction periods.
    Keywords: Real-time forecasting, dynamic factor model, eurozone GDP, business cycle
    JEL: E32 C22 E27
    Date: 2011–06
  3. By: Jana Eklund; George Kapetanios; Simon Price
    Abstract: We examine how to forecast after a recent break. We consider monitoring for change and then combining forecasts from models that do and do not use data before the change; and robust methods, namely rolling regressions, forecast averaging over different windows and exponentially weighted moving average (EWMA) forecasting. We derive analytical results for the performance of the robust methods relative to a full-sample recursive benchmark. For a location model subject to stochastic breaks the relative MSFE ranking is EWMA < rolling regression < forecast averaging. No clear ranking emerges under deterministic breaks. In Monte Carlo experiments forecast averaging improves performance in many cases with little penalty where there are small or infrequent changes. Similar results emerge when we examine a large number of UK and US macroeconomic series.
    JEL: C10 C59
    Date: 2011–07
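    The three robust methods the abstract compares can be sketched for the location model as follows (a toy Python sketch; the window lengths, smoothing parameter, and break pattern are illustrative assumptions, not the paper's calibration):

```python
def ewma_forecast(y, lam=0.95):
    """Exponentially weighted moving average: recent observations receive
    geometrically more weight, so the forecast adapts after a recent break."""
    f = y[0]
    for obs in y[1:]:
        f = lam * f + (1.0 - lam) * obs
    return f

def rolling_forecast(y, window):
    """Rolling-window mean: discards all data before the window."""
    return sum(y[-window:]) / window

def averaged_forecast(y, windows=(4, 8, 16)):
    """Forecast averaging over several window lengths; the specific
    windows here are made up for illustration."""
    return sum(rolling_forecast(y, w) for w in windows) / len(windows)

# Location model with a recent break: the mean jumps from 0 to 1
y = [0.0] * 20 + [1.0] * 5
```

On this series the short rolling window tracks the new level immediately, while averaging over windows trades some speed of adjustment for insurance against small or infrequent changes, in line with the Monte Carlo findings described above.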
  4. By: Michael McAleer (Erasmus University Rotterdam, Tinbergen Institute, The Netherlands, Complutense University of Madrid, and Institute of Economic Research, Kyoto University); Roberto Casarin (Department of Economics Ca’Foscari University of Venice); Chia-Lin Chang (Department of Applied Economics Department of Finance National Chung Hsing University Taichung, Taiwan); Juan-Ángel Jiménez-Martín (Department of Quantitative Economics Complutense University of Madrid); Teodosio Pérez-Amaral (Department of Quantitative Economics Complutense University of Madrid)
    Abstract: It is well known that the Basel II Accord requires banks and other Authorized Deposit-taking Institutions (ADIs) to communicate their daily risk forecasts to the appropriate monetary authorities at the beginning of each trading day, using one or more risk models, whether individually or as combinations, to measure Value-at-Risk (VaR). The risk estimates of these models are used to determine capital requirements and associated capital costs of ADIs, depending in part on the number of previous violations, whereby realised losses exceed the estimated VaR. McAleer et al. (2009) proposed a new approach to model selection for predicting VaR, consisting of combining alternative risk models, and comparing conservative and aggressive strategies for choosing between VaR models. This paper addresses the question of risk management of risk, namely VaR of VIX futures prices, and extends the approaches given in McAleer et al. (2009) and Chang et al. (2011) to examine how different risk management strategies performed during the 2008-09 global financial crisis (GFC). The empirical results suggest that an aggressive strategy of choosing the Supremum of single model forecasts, as compared with Bayesian and non-Bayesian combinations of models, is preferred to other alternatives, and is robust during the GFC. However, this strategy implies relatively high numbers of violations and accumulated losses, which are admissible under the Basel II Accord.
    Keywords: Median strategy, Value-at-Risk, daily capital charges, violation penalties, aggressive risk management, conservative risk management, Basel Accord, VIX futures, Bayesian strategy, quantiles, forecast densities.
    JEL: G32 C53 C22 C11
    Date: 2011–07
  5. By: John Cotter (University College Dublin)
    Abstract: Accurate forecasting of risk is the key to successful risk management techniques. Using the largest stock index futures from twelve European bourses, this paper presents VaR measures based on their unconditional and conditional distributions for single and multi-period settings. These measures, underpinned by extreme value theory, are statistically robust, explicitly allowing for fat-tailed densities. Conditional tail estimates are obtained by adjusting the unconditional extreme value procedure with GARCH-filtered returns. The conditional modelling results in iid returns, allowing the use of a simple and efficient multi-period extreme value scaling law. The paper examines the properties of these distinct conditional and unconditional trading models, and finds that the biases inherent in unconditional single and multi-period estimates assuming normality extend to the conditional setting.
    Keywords: extreme value theory; GARCH filter; conditional risk
    JEL: G1 G10
    Date: 2011–07–21
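    The multi-period extreme value scaling law mentioned in the abstract can be sketched as an α-root-of-time rule (a hedged Python sketch; the tail index and VaR values are illustrative numbers, and the paper's exact estimator may differ):

```python
def multi_period_var(var_one_day, horizon, tail_index):
    """Alpha-root scaling law: for iid fat-tailed returns with tail index
    alpha, the k-period VaR scales as k**(1/alpha) times the 1-period VaR.
    (For alpha = 2 this reduces to the familiar square-root-of-time rule.)"""
    return horizon ** (1.0 / tail_index) * var_one_day

# Illustrative numbers only: a 1-day VaR of 2.5% scaled to a 10-day horizon,
# assuming a tail index of 3 (typical of equity index returns)
var_10d = multi_period_var(0.025, 10, tail_index=3.0)
```

Because the GARCH filter delivers (approximately) iid standardized returns, this simple scaling can be applied to the conditional tail estimate rather than re-estimating the tail at every horizon.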
  6. By: Menno Middeldorp
    Abstract: Over the last two decades, the Federal Open Market Committee (FOMC), the rate-setting body of the United States Federal Reserve System, has become increasingly communicative and transparent. According to policymakers, one of the goals of this shift has been to improve monetary policy predictability. Previous academic research has found that the FOMC has indeed become more predictable. Here, I contribute to the literature in two ways. First, instead of simply looking at predictability before and after the Fed's communication reforms in the 1990s, I identify three distinct periods of reform and measure their separate contributions. Second, I correct the interest rate forecasts embedded in fed funds futures contracts for risk premiums, in order to obtain a less biased measure of predictability. My results suggest that the communication reforms of the early 1990s and the 'guidance' provided from 2003 significantly improved predictability, while the release of the FOMC's policy bias in 1999 had no measurable impact. Finally, I find that FOMC speeches and testimonies significantly lower short-term forecasting errors.
    Keywords: central bank communication, central bank transparency, futures pricing, financial market efficiency
    JEL: D83 E58 G13 G14
    Date: 2011–07
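    The risk-premium correction described in the abstract amounts to subtracting an estimated premium from the futures-implied rate (a schematic Python sketch; the prices and the 10-basis-point premium are placeholders, not the paper's estimates):

```python
def implied_rate_forecast(futures_price):
    """Fed funds futures are quoted as 100 minus the expected average rate
    (in percent) over the contract month."""
    return 100.0 - futures_price

def risk_adjusted_forecast(futures_price, risk_premium):
    """Subtract an estimated risk premium from the futures-implied rate
    to obtain a less biased forecast of the future fed funds rate."""
    return implied_rate_forecast(futures_price) - risk_premium

# Hypothetical numbers: a futures price of 97.80 and a 0.10pp premium estimate
forecast = risk_adjusted_forecast(97.80, 0.10)
```

Without this correction, the raw futures-implied rate would overstate the expected policy rate by the premium, biasing any measure of forecast accuracy built on it.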
  7. By: Menno Middeldorp
    Abstract: Central banks worldwide have become more transparent. An important reason is that democratic societies expect more openness from public institutions. Policymakers also see transparency as a way to improve the predictability of monetary policy, thereby lowering interest rate volatility and contributing to economic stability. Most empirical studies support this view. However, there are three reasons why more research is needed. First, some (mostly theoretical) work suggests that transparency has an adverse effect on predictability. Second, empirical studies have mostly focused on average predictability before and after specific reforms in a small set of advanced economies. Third, less is known about the effect on interest rate volatility. To extend the literature, I use the Dincer and Eichengreen (2007) transparency index for twenty-four economies of varying income and examine the impact of transparency on both predictability and market volatility. I find that higher transparency improves the accuracy of interest rate forecasts for three months ahead and reduces rate volatility.
    Keywords: Central bank communication, interest rate forecasts, central bank transparency, financial market efficiency
    JEL: D83 E47 E58 G14
    Date: 2011–07
  8. By: Alessandra Tracogna (CSIL Centre for Industrial Studies); Stefania Pelizzari (CSIL Centre for Industrial Studies)
    Abstract: The first edition of the report Furniture Distribution in Russia offers an accurate, comprehensive picture of the Russian furniture market, providing trends for 2000-2010 and forecasts for 2011 and 2012 for furniture consumption. Statistics on furniture production, imports and exports are also provided. The furniture market is broken down by product segment and price range for the year 2010. Distribution channels, mark-ups and reference prices of the Russian furniture market are further considered for both domestically produced furniture and imported items. The research also includes a sector analysis providing basic data on production, consumption, exports, imports and top companies for the following segments: Kitchen furniture, Bedroom furniture, Upholstered furniture, Office furniture and Hospitality furniture. The analysis of furniture distribution channels includes furniture specialist distributors, non-specialist distributors and contract projects, and covers: Owned stores, Franchising, Monobrand, Furniture chains, Furniture supermarkets and DIY. Around 60 short profiles of the main distributors, for both domestic and imported furniture operating on the Russian furniture market, are also available with contact details, product portfolio and brands. The analysis of the Russian furniture market includes demand drivers (macroeconomic indicators, population distribution and the construction market) and the latest product trends among Russian consumers. Among the products considered: Living and dining furniture, Upholstery, Bedrooms, Home tables and chairs, Kitchen furniture, Other occasional furniture, Bathroom furniture, Office furniture, Other non-residential furniture and Hotel.
    JEL: L22 L81
    Date: 2011–06
  9. By: Paulo Araújo Santos (Escola Superior de Gestão e Tecnologia de Santarém and Center of Statistics and Applications, University of Lisbon); Juan-Ángel Jiménez-Martín (Departamento de Economía Cuantitativa (Department of Quantitative Economics), Facultad de Ciencias Económicas y Empresariales (Faculty of Economics and Business), Universidad Complutense de Madrid); Michael McAleer (Econometrisch Instituut (Econometric Institute), Faculteit der Economische Wetenschappen (Erasmus School of Economics), Erasmus Universiteit, Tinbergen Instituut (Tinbergen Institute).); Teodosio Pérez Amaral (Departamento de Economía Cuantitativa (Department of Quantitative Economics), Facultad de Ciencias Económicas y Empresariales (Faculty of Economics and Business), Universidad Complutense de Madrid)
    Abstract: In McAleer et al. (2010b), a robust risk management strategy to the Global Financial Crisis (GFC) was proposed under the Basel II Accord by selecting a Value-at-Risk (VaR) forecast that combines the forecasts of different VaR models. The robust forecast was based on the median of the point VaR forecasts of a set of conditional volatility models. In this paper we provide further evidence on the suitability of the median as a GFC-robust strategy by using an additional set of new extreme value forecasting models and by extending the sample period for comparison. These extreme value models include DPOT and Conditional EVT. Such models might be expected to be useful in explaining financial data, especially in the presence of extreme shocks that arise during a GFC. Our empirical results confirm that the median remains GFC-robust even in the presence of these new extreme value models. This is illustrated by using the S&P500 index before, during and after the 2008-09 GFC. We investigate the performance of a variety of single and combined VaR forecasts in terms of daily capital requirements and violation penalties under the Basel II Accord, as well as other criteria, including several tests for independence of the violations. The strategy based on the median, or more generally, on combined forecasts of single models, is straightforward to incorporate into existing computer software packages that are used by banks and other financial institutions.
    Keywords: Value-at-Risk (VaR), DPOT, daily capital charges, robust forecasts, violation penalties, optimizing strategy, aggressive risk management, conservative risk management, Basel, global financial crisis.
    JEL: G32 G11 C53 C22
    Date: 2011
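    The median strategy for combining single-model VaR forecasts, which the abstract notes is straightforward to incorporate into existing software, can be sketched in a few lines (Python; the forecast values are invented for illustration, with VaR expressed as a negative daily return):

```python
import statistics

# Hypothetical daily VaR forecasts (negative returns, in percent) from five
# single conditional-volatility models -- made-up numbers, not from the paper
var_forecasts = [-2.1, -1.8, -2.6, -3.0, -1.9]

median_var = statistics.median(var_forecasts)   # GFC-robust median strategy
supremum_var = max(var_forecasts)               # aggressive: least conservative
infimum_var = min(var_forecasts)                # most conservative single model
```

The median is robust in the sense that adding one extreme-value model with an outlying forecast moves the supremum or infimum but leaves the median largely unchanged, which is why it remains GFC-robust even when the new extreme value models are included.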

This nep-for issue is ©2011 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject line, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.