New Economics Papers
on Financial Markets
Issue of 2009‒06‒03
nine papers chosen by



  1. Financial Bubbles, Real Estate Bubbles, Derivative Bubbles, and the Financial and Economic Crisis By Didier SORNETTE; Ryan WOODARD
  2. Nonlinear Time Series in Financial Forecasting By Gloria González-Rivera; Tae-Hwy Lee
  3. Volatility Models: from GARCH to Multi-Horizon Cascades By Alexander Subbotin; Thierry Chauveau; Kateryna Shapovalova
  4. Option Pricing Using Realized Volatility and ARCH Type Models By Toshiaki Watanabe; Masato Ubukata
  5. Predicting Stock Returns in a Cross-Section: Do Individual Firm Characteristics Matter? By Kateryna Shapovalova; Alexander Subbotin
  6. Predicting Securitized Real Estate Returns: Financial and Real Estate Factors vs. Economic Variables By Camilo SERRANO; Martin HOESLI
  7. Risk Transfer with CDOs By Jan Pieter Krahnen; Christian Wilde
  8. A Risk Management Approach for Portfolio Insurance Strategies By Benjamin Hamidi; Bertrand Maillet; Jean-Luc Prigent
  9. The topology of the interbank market: developments in Italy since 1990 By Michele Manna; Carmela Iazzetta

  1. By: Didier SORNETTE (ETH Zurich and Swiss Finance Institute); Ryan WOODARD (ETH Zurich)
    Abstract: The financial crisis of 2008, which started with an initially well-defined epicenter focused on mortgage-backed securities (MBS), has been cascading into a global economic recession, whose increasing severity and uncertain duration have led and are continuing to lead to massive losses and damage for billions of people. Heavy central bank interventions and government spending programs have been launched worldwide, especially in the USA and Europe, in the hope of unfreezing credit and bolstering consumption. Here, we present evidence and articulate a general framework that allows one to diagnose the fundamental cause of the unfolding financial and economic crisis: the accumulation of several bubbles and their interplay and mutual reinforcement have led to an illusion of a “perpetual money machine” allowing financial institutions to extract wealth from an unsustainable artificial process. Taking stock of this diagnosis, we conclude that many of the interventions to address the so-called liquidity crisis and to encourage more consumption are ill-advised and even dangerous, given that precautionary reserves were not accumulated in the “good times” but huge liabilities were. The most “interesting” present times constitute unique opportunities but also great challenges, for which we offer a few recommendations.
    Keywords: Financial crisis, bubbles, real estate, derivatives, out-of-equilibrium, super-exponential growth, crashes, complex systems
    JEL: O16
    URL: http://d.repec.org/n?u=RePEc:chf:rpseri:rp0915&r=fmk
  2. By: Gloria González-Rivera (Department of Economics, University of California Riverside); Tae-Hwy Lee (Department of Economics, University of California Riverside)
    Date: 2007–09
    URL: http://d.repec.org/n?u=RePEc:ucr:wpaper:200803&r=fmk
  3. By: Alexander Subbotin (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I); Thierry Chauveau (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I); Kateryna Shapovalova (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I)
    Abstract: We review different methods of modeling the volatility of stock prices and exchange rates, focusing on their ability to reproduce the empirical properties of the corresponding time series. The properties of price fluctuations vary across the time scales of observation. The adequacy of different models for describing price dynamics at several time horizons simultaneously is the central topic of this study. We propose a detailed survey of recent volatility models that account for multiple horizons. These models are based on different and sometimes competing theoretical concepts. They belong either to the GARCH or the stochastic volatility model families and often borrow methodological tools from statistical physics. We compare their properties and comment on their practical usefulness and prospects.
    Keywords: Volatility modeling, GARCH, stochastic volatility, volatility cascade, multiple horizons in volatility.
    Date: 2009–05
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-00390636_v1&r=fmk
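
A minimal Python sketch (illustrative only, not the authors' code) of the GARCH(1,1) recursion that anchors the volatility-model families surveyed in item 3 above; the parameter values omega, alpha, and beta are placeholders, not estimates:

import numpy as np

def garch11_variance(returns, omega=1e-6, alpha=0.08, beta=0.90):
    """Conditional variance recursion: sigma2[t] = omega + alpha*r[t-1]**2 + beta*sigma2[t-1]."""
    sigma2 = np.empty(len(returns))
    sigma2[0] = np.var(returns)  # initialize at the sample (unconditional) variance
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

# Illustrative use on simulated daily returns.
r = np.random.default_rng(0).normal(0.0, 0.01, size=1000)
sigma2 = garch11_variance(r)
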
  4. By: Toshiaki Watanabe; Masato Ubukata
    Abstract: This article analyzes whether daily realized volatility, which is the sum of squared intraday returns over a day, is useful for option pricing. Different realized volatilities are calculated with or without taking account of microstructure noise and with or without using overnight and lunch-time returns. Both ARFIMA and ARFIMAX models are employed to specify the dynamics of realized volatility: the former can capture the long-memory property, and the latter can also capture the asymmetry in volatility depending on the sign of the previous day's return. Option prices are derived under the assumption of risk-neutrality. For comparison, GARCH, EGARCH, and FIEGARCH models are estimated using daily returns, where option prices are derived both by assuming risk-neutrality and by using the Duan (1995) method, in which the assumption of risk-neutrality is relaxed. The main results using the Nikkei 225 stock index and its put option prices are: (1) the ARFIMAX model with daily realized volatility performs best; (2) applying the Bartlett adjustment to the calculation of realized volatility to take account of microstructure noise does not improve the performance, while the Hansen and Lunde (2005a) adjustment without using overnight and lunch-time returns improves it; and (3) the Duan (1995) method does not improve the performance compared with assuming risk-neutrality.
    Keywords: ARFIMA, GARCH, Microstructure Noise, Option, Realized Volatility
    JEL: C22 C52 G13
    Date: 2009–04
    URL: http://d.repec.org/n?u=RePEc:hst:ghsdps:gd09-066&r=fmk
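
The realized volatility measure defined in item 4 above, the sum of squared intraday returns over a day, can be sketched as follows. This is a minimal illustration assuming a plain array of intraday prices; it ignores the microstructure-noise corrections and the overnight and lunch-time adjustments discussed in the paper:

import numpy as np

def realized_variance(intraday_prices):
    """Daily realized variance: sum of squared intraday log returns."""
    log_returns = np.diff(np.log(intraday_prices))
    return np.sum(log_returns ** 2)

def realized_volatility(intraday_prices):
    """Realized volatility is the square root of realized variance."""
    return np.sqrt(realized_variance(intraday_prices))

# Example with simulated 5-minute prices for one trading day.
prices = 100.0 * np.exp(np.cumsum(np.random.default_rng(1).normal(0, 0.0005, size=78)))
rv = realized_volatility(prices)
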
  5. By: Kateryna Shapovalova (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I); Alexander Subbotin (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I)
    Abstract: It is common wisdom that individual stocks' returns are difficult to predict, though in many situations it is important to have such estimates at our disposal. In particular, they are needed to determine the cost of capital. Market equilibrium models posit that expected returns are proportional to the sensitivities to systematic risk factors. The Fama and French (1993) three-factor model explains the stock return premium as the sum of three components due to different risk factors: the traditional CAPM market beta, and the betas to the returns on two portfolios, "Small Minus Big" (the differential in the stock returns of small and big companies) and "High Minus Low" (the differential in the stock returns of companies with high and low book-to-price ratios). The authors argue that this model is sufficient to capture the impact on returns of companies' accounting fundamentals, such as earnings-to-price, cash flow-to-price, past sales growth, and long-term and short-term past earnings. Using a panel of stock returns and accounting data from 1979 to 2008 for companies listed on the NYSE, we show that this is not the case, at least at the level of individual stocks. According to our findings, fundamental characteristics of companies' performance are more important for predicting future expected returns than sensitivities to the Fama and French risk factors. We explain this finding within the rational pricing paradigm: contemporaneous accounting fundamentals may be better proxies for future sensitivity to risk factors than historical covariance estimates.
    Keywords: Accounting fundamentals, equity performance, style analysis, value and growth, cost of capital.
    Date: 2009–05
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-00390647_v1&r=fmk
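
The Fama and French (1993) three-factor decomposition referred to in item 5 above can be written as a time-series regression of a stock's excess return on the market excess return, SMB, and HML. The sketch below estimates the three loadings by ordinary least squares on simulated data; it only illustrates the form of the model, not the authors' panel methodology, and all numbers are placeholders:

import numpy as np

rng = np.random.default_rng(2)
T = 250  # number of observations (illustrative)

# Simulated factor returns: market excess return, SMB, HML.
factors = rng.normal(0.0, 0.01, size=(T, 3))
true_betas = np.array([1.1, 0.4, -0.2])
excess_return = factors @ true_betas + rng.normal(0.0, 0.01, size=T)

# OLS estimate of the intercept (alpha) and the three factor loadings.
X = np.column_stack([np.ones(T), factors])
coef, *_ = np.linalg.lstsq(X, excess_return, rcond=None)
alpha, beta_mkt, beta_smb, beta_hml = coef
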
  6. By: Camilo SERRANO (University of Geneva); Martin HOESLI (University of Geneva (HEC and SFI), University of Aberdeen, Bordeaux Ecole de Management)
    Abstract: Securitized real estate returns have traditionally been forecasted using economic variables. However, no consensus exists regarding the variables to use. Financial and real estate factors have recently emerged as an alternative set of variables useful in forecasting securitized real estate returns. This paper examines whether the predictive ability of the two sets of variables differs. We use fractional cointegration analysis to identify whether long-run nonlinear relations exist between securitized real estate and each of the two sets of forecasting variables. That is, we examine whether such relationships are characterized by long memory, short memory, mean reversion (no long-run effects) or no mean reversion (no long-run equilibrium). Empirical analyses are conducted using data for the U.S., the U.K., and Australia. The results show that financial and real estate factors generally outperform economic variables in forecasting securitized real estate returns. Long memory (long-range dependence) is generally found between securitized real estate returns and stocks, bonds, and direct real estate returns, while only short memory is found between securitized real estate returns and the economic variables. Such results imply that to forecast securitized real estate returns, it may not be necessary to identify the economic variables that are related to changing economic trends and business conditions.
    Keywords: Fractional Cointegration, Fractionally Integrated Error Correction Model (FIECM), Forecasting, Multifactor Models, Securitized Real Estate, REITs
    JEL: G11 C53
    Date: 2009–03
    URL: http://d.repec.org/n?u=RePEc:chf:rpseri:rp0908&r=fmk
  7. By: Jan Pieter Krahnen (Goethe University Frankfurt); Christian Wilde (Goethe University Frankfurt)
    Abstract: Modern bank management comprises both the classical lending business and the transfer of asset risk to capital markets through securitization. Sound knowledge of the risks involved in securitization transactions is a prerequisite for solid risk management. This paper aims to resolve part of the opaqueness surrounding credit-risk allocation to tranches that represent claims of different seniority on a reference portfolio. In particular, it analyzes the allocation of credit risk to different tranches of a CDO transaction when the underlying asset returns are driven by a common macro factor and an idiosyncratic component. Junior and senior tranches are found to be nearly orthogonal, motivating a search for the whereabouts of systematic risk in CDO transactions. We propose a metric for capturing the allocation of systematic risk to tranches. First, in contrast to a widely held claim, we show that (extreme) tail risk in standard CDO transactions is held by all tranches. While junior tranches take on all types of systematic risk, senior tranches take on almost no non-tail risk. This is in stark contrast to an untranched bond portfolio of the same rating quality, which on average suffers substantial losses for all realizations of the macro factor. Second, given tranching, a shock to the risk of the underlying asset portfolio (e.g. a rise in asset correlation or in mean portfolio loss) has the strongest impact, in relative terms, on the exposure of senior-tranche CDO investors. Our findings can be used to explain major stylized facts observed in credit markets.
    Keywords: Credit Risk, Risk Transfer, Systematic Risk
    JEL: G21 G28
    Date: 2008–04–28
    URL: http://d.repec.org/n?u=RePEc:cfs:cfswop:wp200815&r=fmk
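
The driving assumption in item 7 above, asset returns driven by a common macro factor plus an idiosyncratic component, corresponds to a standard one-factor setup. The Python sketch below simulates portfolio losses and the loss on a single tranche under that assumption; the correlation, default threshold, and attachment points are placeholders, not the paper's calibration:

import numpy as np

rng = np.random.default_rng(3)
n_assets, n_sims, rho = 100, 10_000, 0.3
default_threshold = -2.0               # placeholder asset-value barrier
attachment, detachment = 0.03, 0.10    # placeholder tranche boundaries (mezzanine)

# One-factor model: asset value = sqrt(rho)*macro factor + sqrt(1-rho)*idiosyncratic shock.
macro = rng.standard_normal((n_sims, 1))
idio = rng.standard_normal((n_sims, n_assets))
asset_values = np.sqrt(rho) * macro + np.sqrt(1 - rho) * idio

# Portfolio loss fraction per scenario (zero recovery, equal weights, for simplicity).
portfolio_loss = (asset_values < default_threshold).mean(axis=1)

# Tranche loss: the part of the portfolio loss between attachment and detachment, rescaled to [0, 1].
tranche_loss = np.clip(portfolio_loss - attachment, 0.0, detachment - attachment) / (detachment - attachment)
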
  8. By: Benjamin Hamidi (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, A.A.Advisors-QCG - ABN AMRO); Bertrand Maillet (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, A.A.Advisors-QCG - ABN AMRO, EIF - EIF); Jean-Luc Prigent (THEMA - Théorie économique, modélisation et applications - CNRS : UMR8184 - Université de Cergy Pontoise)
    Abstract: Controlling and managing potential losses is one of the main objectives of risk management. Following Ben Ameur and Prigent (2007) and Chen et al. (2008), and extending the first results of Hamidi et al. (2009), who adopt a risk management approach to defining portfolio insurance strategies, we analyze and illustrate a specific dynamic portfolio insurance strategy that depends on the Value-at-Risk level of the covered portfolio on the French stock market. This dynamic approach is derived from the traditional and popular portfolio insurance strategy (cf. Black and Jones, 1987; Black and Perold, 1992): the so-called "Constant Proportion Portfolio Insurance" (CPPI). However, the financial results produced by this strategy depend crucially on the leverage, called the multiple, that is likely to guarantee a predetermined floor value whatever the plausible market evolutions. In other words, in the traditional setting the unconditional multiple is defined once and for all. The aim of this article is to further examine an alternative to the standard CPPI method, based on the determination of a conditional multiple. In this time-varying framework, the multiple is conditionally determined so as to keep the risk exposure constant, even though it also depends on market conditions. Furthermore, we propose to define the multiple as a function of an extended Dynamic AutoRegressive Quantile model of the Value-at-Risk (DARQ-VaR). Using a French daily stock database (the CAC 40 index and individual stocks over the period 1998-2008), we present the main performance and risk results of the proposed Dynamic Proportion Portfolio Insurance strategy, first on real market data and then on artificial bootstrapped and surrogate data. Our main conclusion strengthens the previous ones: the conditional dynamic strategy with constant risk exposure dominates, most of the time, the traditional unconditional constant-asset-exposure strategies.
    Keywords: CPPI, Portfolio insurance, VaR, CAViaR, quantile regression, dynamic quantile model.
    Date: 2009–05
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-00389789_v1&r=fmk
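
The CPPI mechanics underlying item 8 above follow the standard rule that risky exposure equals the multiple times the cushion (portfolio value minus floor). The snippet below is a minimal constant-multiple illustration on simulated returns; the paper's contribution, a multiple driven by a dynamic quantile (DARQ-VaR) model, is not reproduced here, and all parameter values are placeholders:

import numpy as np

def cppi_path(risky_returns, floor_fraction=0.9, multiple=4.0, start_value=100.0, rf=0.0):
    """Constant-multiple CPPI: exposure_t = multiple * max(value_t - floor, 0)."""
    value = start_value
    floor = floor_fraction * start_value
    path = [value]
    for r in risky_returns:
        cushion = max(value - floor, 0.0)
        exposure = min(multiple * cushion, value)  # cap at the portfolio value (no borrowing)
        value = exposure * (1.0 + r) + (value - exposure) * (1.0 + rf)
        path.append(value)
    return np.array(path)

# Example on simulated daily returns over one year.
returns = np.random.default_rng(4).normal(0.0003, 0.012, size=252)
values = cppi_path(returns)
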
  9. By: Michele Manna (Bank of Italy); Carmela Iazzetta (Bank of Italy)
    Abstract: When a bank defaults or stops trading in the interbank market, both a liquidity shortage in the market itself and mounting trading losses should be anticipated. To gain more insight into the way a liquidity crisis spreads, we apply network topology techniques to monthly data on deposits exchanged by Italian banks, from 1990 to 2008. Our research yields three main results: first, only a few banks are today pivotal in the redistribution of liquidity across the system, while banks close to, but outside, this core circle weigh less than they used to; second, a halt in the operations of a second set of banks may cut off some of their counterparties from the rest of the network, with effects that are increasingly less negligible; finally, only 2-3 of the 10 banks we identify as most interconnected within the network are currently also among the top 10 banks by volume of traded deposits.
    Keywords: interbank market, topology, liquidity crisis
    JEL: D4 E5 G2
    Date: 2009–05
    URL: http://d.repec.org/n?u=RePEc:bdi:wptemi:td_711_09&r=fmk
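
The network-topology measures used in item 9 above, which identify the few banks pivotal for liquidity redistribution, can be sketched with a standard graph library. The toy edge list below is purely illustrative and has no relation to the Bank of Italy data; degree centrality stands in here for the richer topology measures discussed in the paper:

import networkx as nx

# Toy directed interbank network: an edge (a, b, w) means bank a places a deposit of size w with bank b.
edges = [("A", "B", 50), ("A", "C", 30), ("B", "C", 20),
         ("C", "D", 10), ("D", "A", 5), ("E", "A", 15)]

g = nx.DiGraph()
g.add_weighted_edges_from(edges)

# Rank banks by degree centrality as a simple proxy for how pivotal they are to the network.
centrality = nx.degree_centrality(g)
core_banks = sorted(centrality, key=centrality.get, reverse=True)[:2]
print("most interconnected banks:", core_banks)
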

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.