nep-for New Economics Papers
on Forecasting
Issue of 2013‒08‒10
six papers chosen by
Rob J Hyndman
Monash University

  1. Proximity to Hubs of Expertise in Financial Analyst Forecast Accuracy By Elisa Cavezzali; Jacopo Crepaldi; Ugo Rigoni
  2. Risk Modelling and Management: An Overview By Chang, C.L.; Allen, D.E.; McAleer, M.J.; Pérez-Amaral, T.
  3. Financial Analysts' Forecast Accuracy: Do valuation methods matter? By Elisa Cavezzali; Ugo Rigoni
  4. Currency Crises During the Great Recession: Is This Time Different? By Tiziano Arduini; Giuseppe De Arcangelis; Carlo L. Del Bello
  5. The HERMES-13 macroeconomic model of the Irish economy By Bergin, Adele; Conefrey, Thomas; FitzGerald, John; Kearney, Ide; Znuderl, Nusa
  6. Modeling the Commute Mode Share of Transit Using Continuous Accessibility to Jobs By Andrew Owen; David Levinson

  1. By: Elisa Cavezzali (Università Ca' Foscari Venice); Jacopo Crepaldi; Ugo Rigoni (Università Ca' Foscari Venice)
    Abstract: This paper investigates whether the geographical proximity of financial analysts to hubs of information and expertise can influence their forecasting accuracy. Recent studies show that the financial analyst forecasting process exhibits a systematic difference in earnings forecast accuracy dependent on the geographical distance of analysts from the companies they follow. The literature argues that local analysts issue more accurate forecasts because they have an informational advantage over analysts who are further away. Industrial centres can generate important knowledge spillovers by creating formal and informal networks amongst firms, higher education and research institutions. In such a hub, information can easily flow and propagate. Our hypothesis is that physical proximity to these hubs, and not to the companies they follow, is an advantage for financial analysts, leading them to issue more accurate forecasts. Using a sample of 205 observations related to 33 firms, across seven countries and ten sectors, our results are consistent with the hypothesis. Although preliminary, and possibly affected by sample selection issues, the empirical evidence overall confirms the benefit of being part of a network, formal or informal, in which information, knowledge and expertise can flow easily. By exploring these concepts, well known and analysed in other fields but new in the context of financial analysts, we offer new evidence on what can cause variations in financial analyst accuracy.
    Keywords: forecast accuracy, sell-side analysts, geography, hubs of knowledge
    JEL: M40 M41
    Date: 2013–08
  2. By: Chang, C.L.; Allen, D.E.; McAleer, M.J.; Pérez-Amaral, T.
    Abstract: The papers in this special issue of Mathematics and Computers in Simulation are substantially revised versions of the papers that were presented at the 2011 Madrid International Conference on “Risk Modelling and Management” (RMM2011). The papers cover the following topics: currency hedging strategies using dynamic multivariate GARCH, risk management of risk under the Basel Accord: A Bayesian approach to forecasting value-at-risk of VIX futures, fast clustering of GARCH processes via Gaussian mixture models, GFC-robust risk management under the Basel Accord using extreme value methodologies, volatility spillovers from the Chinese stock market to economic neighbours, a detailed comparison of Value-at-Risk estimates, the dynamics of BRICS's country risk ratings and domestic stock markets, U.S. stock market and oil price, forecasting value-at-risk with a duration-based POT method, and extreme market risk and extreme value theory.
    Keywords: risk management;forecasting;value-at-risk;volatility spillovers;mixture models;VIX futures;Basel Accord;BRICS;currency hedging strategies;country risk ratings;extreme value methodologies;fast clustering;extreme market risks
    Date: 2013–06–01
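    The value-at-risk methods surveyed above range from Bayesian GARCH to extreme value theory; as a point of reference, the quantity they all forecast can be illustrated with a minimal historical-simulation sketch (illustrative only; the data are simulated and this is not any of the surveyed papers' methods):

    ```python
    import numpy as np

    def historical_var(returns, alpha=0.99):
        """1-day historical-simulation VaR at confidence alpha:
        the loss threshold exceeded with probability 1 - alpha."""
        return -np.quantile(returns, 1.0 - alpha)

    # Hypothetical daily returns (mean 0, 1% daily volatility)
    rng = np.random.default_rng(0)
    returns = rng.normal(0.0, 0.01, size=1000)

    var_99 = historical_var(returns)  # e.g. roughly 2.3% for these returns
    ```

    The surveyed papers replace the empirical quantile above with model-based forecasts (GARCH volatilities, Bayesian posteriors, or extreme-value tail estimates), which is where their differences in accuracy and Basel-capital implications arise.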
  3. By: Elisa Cavezzali (Università Ca' Foscari Venice); Ugo Rigoni (Università Ca' Foscari Venice)
    Abstract: This study investigates how different ways to evaluate a company influence the accuracy of the target price. Finance theory and professional practice propose alternative approaches to the evaluation of a company. The literature on the relationship between the valuation methods used and target price accuracy is still scant, and the results are inconclusive and contradictory. Coding the valuation methods of 1,650 reports, we find that the accuracy of target prices decreases when the target price is based on just one main method. Furthermore, we show that methods based on company fundamentals and those based on market multiples lead to similar levels of accuracy: among different classes of methods, there are no superior methods. Therefore, we argue that in order to improve forecast accuracy, analysts need to assess company value by choosing and applying a set of different methods, combining them and averaging the resulting values, regardless of the specific techniques chosen.
    Keywords: forecast accuracy, sell-side analysts, equity valuation, valuation methods
    JEL: M40 M41
    Date: 2013–08
  4. By: Tiziano Arduini (Sapienza, University of Rome); Giuseppe De Arcangelis (Sapienza, University of Rome); Carlo L. Del Bello (Sapienza, University of Rome)
    Abstract: During the 2007-2009 financial crisis the foreign exchange market was characterized by large volatility and wide currency swings. In this paper we evaluate whether during the period of the Great Recession there has been a structural break in the relationship between fundamentals and exchange rates within an early-warning framework. This is done by extending the original data set by Kaminsky and Reinhart (1999) and including not only the most recent period, but also 17 new countries. Our analysis considers two variations of the original early-warning system. First, we propose two new methods to obtain the probability distribution of the early-warning indicator (conditional on the occurrence of a crisis) - one fully parametric and one based on a novel distribution-free semi-parametric approach. Second, we compare the original early-warning indicator with a core indicator that includes only "pseudo-financial variables" (domestic credit/GDP, the real exchange rate, international reserves and the real interest-rate differential) and we evaluate their performance not only for currency crises during the Great Recession, but also for the Asian Crisis. All tests make us conclude that "this time is different", i.e. early-warning systems based on traditional macroeconomic variables have not only failed to forecast currency crises during the Great Recession, but have also significantly worsened with respect to the period of the Asian crisis.
    Keywords: Early Warning Systems, Exchange Rates, Semi-parametric Methods
    JEL: F31 F47 F30
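    The Kaminsky-Reinhart framework this paper extends is a "signals" approach: an indicator issues a warning when it crosses a percentile threshold. A minimal sketch of that mechanism (purely illustrative; the series, threshold and variable names are made up, not taken from the paper):

    ```python
    import numpy as np

    def signal_warnings(indicator, pct=90):
        """Flag periods where an indicator exceeds its pct-th percentile,
        in the spirit of the Kaminsky-Reinhart signals approach."""
        threshold = np.percentile(indicator, pct)
        return indicator > threshold  # boolean warning series

    # Hypothetical monthly series, e.g. growth in domestic credit/GDP
    rng = np.random.default_rng(1)
    credit_to_gdp_growth = rng.normal(0.0, 1.0, size=200)

    warnings = signal_warnings(credit_to_gdp_growth)
    ```

    In a full early-warning system each indicator's threshold is tuned to trade off missed crises against false alarms, and the paper's contribution is in how the distribution of the composite indicator, conditional on a crisis, is estimated.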
  5. By: Bergin, Adele; Conefrey, Thomas; FitzGerald, John; Kearney, Ide; Znuderl, Nusa
    Abstract: The HERMES macroeconomic model has been used extensively for over 25 years to carry out medium-term forecasting and scenario analysis of the Irish economy. Most recently the model has been used to generate the scenarios underpinning the 2013 edition of the ESRI's Medium-Term Review. In the long period over which the model has been used for policy analysis, the Irish economy has undergone substantial change and new approaches to modelling important economic relationships have been developed. This paper outlines the structure and behaviour of the most recent version of the HERMES model (HERMES-13). We describe the key mechanisms and the modelling innovations which have been introduced to deal with major changes in the economy. As the model draws on a range of research on the Irish economy, we describe how this work has been incorporated into the model to better capture key economic relationships. Finally, we examine the results of a series of shocks to key variables carried out using the model. This provides a benchmark against which to evaluate the long-run properties of the model as well as illustrating how the model can shed light on the key transmission channels in the economy. This paper, and the accompanying detailed model listing and estimation output, provides a basic reference manual that practitioners and interested parties can use to interpret model output and, it is hoped, make suggestions for further model development and improvement.
    Keywords: modelling, policy, scenarios
    Date: 2013–07
  6. By: Andrew Owen; David Levinson (Nexus (Networks, Economics, and Urban Systems) Research Group, Department of Civil Engineering, University of Minnesota)
    Abstract: This paper presents the results of an accessibility-based model of aggregate commute mode share, focusing on the share of transit relative to auto. It demonstrates the use of continuous accessibility — calculated continuously in time, rather than at a single or a few departure times — for the evaluation of transit systems. These accessibility calculations are accomplished using only publicly available data sources. A binomial logit model is estimated which predicts the likelihood that a commuter will choose transit rather than auto for a commute trip based on aggregate characteristics of the surrounding area. Variables in this model include demographic factors as well as detailed accessibility calculations for both transit and auto. The model achieves a ρ² value of 0.597, and analysis of the results suggests that continuous accessibility of transit systems may be a valuable tool for use in modeling and forecasting.
    Keywords: travel behavior, accessibility, mode choice
    JEL: R14 R41 R42
    Date: 2013
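    The binomial logit and the ρ² (McFadden pseudo R-squared) goodness-of-fit measure used above can be sketched in a few lines. This is an illustrative toy, not the paper's model: the data are synthetic and the single "accessibility ratio" predictor is a placeholder for the paper's demographic and accessibility variables.

    ```python
    import numpy as np

    def fit_logit(X, y, steps=500, lr=0.1):
        """Fit P(transit) = 1 / (1 + exp(-X @ beta)) by gradient ascent
        on the binomial log-likelihood."""
        beta = np.zeros(X.shape[1])
        for _ in range(steps):
            p = 1.0 / (1.0 + np.exp(-X @ beta))
            beta += lr * X.T @ (y - p) / len(y)
        return beta

    # Synthetic data: choice probability rises with transit/auto accessibility
    rng = np.random.default_rng(2)
    access_ratio = rng.uniform(0.0, 2.0, size=500)
    X = np.column_stack([np.ones(500), access_ratio])  # intercept + predictor
    true_beta = np.array([-2.0, 2.0])
    y = (rng.uniform(size=500) < 1.0 / (1.0 + np.exp(-X @ true_beta))).astype(float)

    beta_hat = fit_logit(X, y)

    # McFadden's rho-squared: 1 - LL(model) / LL(intercept-only null)
    p_hat = 1.0 / (1.0 + np.exp(-X @ beta_hat))
    ll = np.sum(y * np.log(p_hat) + (1.0 - y) * np.log(1.0 - p_hat))
    p0 = y.mean()
    ll0 = len(y) * (p0 * np.log(p0) + (1.0 - p0) * np.log(1.0 - p0))
    rho2 = 1.0 - ll / ll0
    ```

    A positive estimated coefficient on the accessibility ratio corresponds to the paper's finding that better transit accessibility raises the predicted transit share; the paper's reported ρ² of 0.597 is computed the same way, against an intercept-only null model.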

This nep-for issue is ©2013 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.