
on Risk Management 
By:  Gilbert Colletaz (LEO - Laboratoire d'Économie d'Orléans - CNRS : UMR6221 - Université d'Orléans); Christophe Hurlin (LEO - Laboratoire d'Économie d'Orléans - CNRS : UMR6221 - Université d'Orléans); Christophe Pérignon (GREGH - Groupement de Recherche et d'Etudes en Gestion à HEC - GROUPE HEC - CNRS : UMR2959) 
Abstract:  This paper presents a new method to validate risk models: the Risk Map. This method jointly accounts for the number and the magnitude of extreme losses and graphically summarizes all information about the performance of a risk model. It relies on the concept of a super exception, which is defined as a situation in which the loss exceeds both the standard Value-at-Risk (VaR) and a VaR defined at an extremely low coverage probability. We then formally test whether the sequences of exceptions and super exceptions are rejected by standard model validation tests. We show that the Risk Map can be used to validate market, credit, operational, or systemic risk estimates (VaR, stressed VaR, expected shortfall, and CoVaR) or to assess the performance of the margin system of a clearing house. 
Keywords:  Financial Risk Management; Tail Risk; Basel III 
Date:  2012–10–28 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:halshs00746273&r=rmg 
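The exception/super-exception counting at the heart of the Risk Map can be sketched in a few lines. The snippet below is only an illustration under assumed data and thresholds (the function name, coverage levels, and historical-simulation VaR are not from the paper):

```python
import numpy as np

def count_exceptions(losses, var_standard, var_extreme):
    """Count VaR exceptions and 'super exceptions' in the Risk Map sense.

    losses: realised daily losses (positive = loss)
    var_standard: VaR at the usual coverage level (e.g. 99%)
    var_extreme: VaR at a much lower coverage probability (e.g. 99.8%)
    """
    exceptions = int(np.sum(losses > var_standard))        # loss exceeds standard VaR
    super_exceptions = int(np.sum(losses > var_extreme))   # loss also exceeds extreme VaR
    return exceptions, super_exceptions

# Illustrative data: one year (250 days) of simulated losses,
# with in-sample historical-simulation VaR thresholds.
rng = np.random.default_rng(0)
losses = rng.normal(0.0, 1.0, size=250)
var_99 = np.quantile(losses, 0.99)
var_998 = np.quantile(losses, 0.998)
n, n_super = count_exceptions(losses, var_99, var_998)
```

Backtests of the kind discussed in the paper then compare `n` and `n_super` with the counts implied by the stated coverage probabilities.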
By:  Michael McAleer (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam and Tinbergen Institute, The Netherlands, and Institute of Economic Research, Kyoto University, and Department of Quantitative Economics, Complutense University of Madrid); Juan-Angel Jimenez-Martin (Department of Quantitative Economics, Complutense University of Madrid); Teodosio Perez-Amaral (Department of Quantitative Economics, Complutense University of Madrid) 
Abstract:  The Basel II Accord requires that banks and other Authorized Deposit-taking Institutions (ADIs) communicate their daily risk forecasts to the appropriate monetary authorities at the beginning of each trading day, using one or more risk models to measure Value-at-Risk (VaR). The risk estimates of these models are used to determine capital requirements and associated capital costs of ADIs, depending in part on the number of previous violations, whereby realised losses exceed the estimated VaR. In this paper we define risk management in terms of choosing from a variety of risk models, and discuss the selection of optimal risk models. A new approach to model selection for predicting VaR is proposed, consisting of combining alternative risk models, and we compare conservative and aggressive strategies for choosing between VaR models. We then examine how different risk management strategies performed during the 2008-09 global financial crisis. These issues are illustrated using the Standard and Poor's 500 Composite Index. 
Keywords:  Value-at-Risk (VaR), daily capital charges, violation penalties, optimizing strategy, risk forecasts, aggressive or conservative risk management strategies, Basel Accord, global financial crisis. 
JEL:  G32 G11 G17 C53 C22 
Date:  2012–11 
URL:  http://d.repec.org/n?u=RePEc:kyo:wpaper:832&r=rmg 
By:  Sylvain Benoit (LEO - Laboratoire d'Économie d'Orléans - CNRS : UMR6221 - Université d'Orléans); Gilbert Colletaz (LEO - Laboratoire d'Économie d'Orléans - CNRS : UMR6221 - Université d'Orléans); Christophe Hurlin (LEO - Laboratoire d'Économie d'Orléans - CNRS : UMR6221 - Université d'Orléans); Christophe Pérignon (GREGH - Groupement de Recherche et d'Etudes en Gestion à HEC - GROUPE HEC - CNRS : UMR2959) 
Abstract:  We propose a theoretical and empirical comparison of the most popular systemic risk measures. To do so, we derive the systemic risk measures in a common framework and show that they can be expressed as linear transformations of firms' market risk (e.g., beta). We also derive conditions under which the different measures lead to similar rankings of systemically important financial institutions (SIFIs). In an empirical analysis of US financial institutions, we show that (1) different systemic risk measures identify different SIFIs and that (2) firm rankings based on systemic risk estimates mirror rankings obtained by sorting firms on market risk or liabilities. One-factor linear models explain between 83% and 100% of the variability of the systemic risk estimates, which indicates that standard systemic risk measures fall short in capturing the multiple facets of systemic risk. 
Keywords:  Banking Regulation; Systemically Important Financial Firms; Marginal Expected Shortfall; SRISK; CoVaR; Systemic vs. Systematic Risk. 
Date:  2012–10–28 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:halshs00746272&r=rmg 
By:  Jürgen Eichberger (University of Heidelberg, Alfred-Weber-Institut für Wirtschaftswissenschaften); Klaus Rheinberger (University of Applied Sciences Vorarlberg, Research Center Process and Product Engineering); Martin Summer (Oesterreichische Nationalbank) 
Abstract:  Credit risk models used in quantitative risk management treat credit risk analysis conceptually like a single-person decision problem. From this perspective, an exogenous source of risk drives the fundamental parameters of credit risk: probability of default, exposure at default and the recovery rate. In reality these parameters are the result of the interaction of many market participants: They are endogenous. We develop a general equilibrium model with endogenous credit risk that can be viewed as an extension of the capital asset pricing model. We analyze equilibrium prices of securities as well as equilibrium allocations in the presence of credit risk. We use the model to discuss the conceptual underpinnings of the approach to risk weight calibration for credit risk taken by the Basel Committee. JEL Classification: G32, G33, G01, D52. 
Keywords:  Credit Risk, Endogenous Risk, Systemic Risk, Banking Regulation. 
Date:  2012–06 
URL:  http://d.repec.org/n?u=RePEc:ecb:ecbwps:20121445&r=rmg 
By:  Vladislav Damjanovic (Department of Economics, University of Exeter) 
Abstract:  We consider a model of financial intermediation with a monopolistic competition market structure. A non-monotonic relationship between the risk measured as a probability of default and the degree of competition is established. 
Keywords:  Competition and Risk, Risk in DSGE models, Bank competition; Bank failure, Default correlation, Risk-shifting effect, Margin effect. 
JEL:  G21 G24 D43 E13 E43 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:exe:wpaper:1208&r=rmg 
By:  Harald Hau (University of Geneva); Sam Langfield (European Systemic Risk Board Secretariat; UK Financial Services Authority); David Marqués-Ibáñez (European Central Bank) 
Abstract:  This paper examines the quality of credit ratings assigned to banks in Europe and the United States by the three largest rating agencies over the past two decades. We interpret credit ratings as relative assessments of creditworthiness, and define a new ordinal metric of rating error based on banks' expected default frequencies. Our results suggest that rating agencies assign more positive ratings to large banks and to those institutions more likely to provide the rating agency with additional securities rating business (as indicated by private structured credit origination activity). These competitive distortions are economically significant and help perpetuate the existence of 'too-big-to-fail' banks. We also show that, overall, differential risk weights recommended by the Basel accords for investment grade banks bear no significant relationship to empirical default probabilities. JEL Classification: G21, G23, G28 
Keywords:  Rating agencies, credit ratings, conflicts of interest, prudential regulation 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:ecb:ecbwps:20121484&r=rmg 
By:  Christophe Hurlin (LEO - Laboratoire d'Économie d'Orléans - CNRS : UMR6221 - Université d'Orléans); Christophe Pérignon (GREGH - Groupement de Recherche et d'Etudes en Gestion à HEC - GROUPE HEC - CNRS : UMR2959) 
Abstract:  This paper presents a validation framework for collateral requirements or margins on a derivatives exchange. It can be used by investors, risk managers, and regulators to check the accuracy of a margining system. The statistical tests presented in this study are based either on the number, frequency, magnitude, or timing of margin exceedances, which are defined as situations in which the trading loss of a market participant exceeds his or her margin. We also propose an original way to validate globally the margining system by aggregating individual backtesting statistics obtained for each market participant. 
Keywords:  Collateral Requirements; Futures Markets; Tail Risk; Derivatives Clearing 
Date:  2012–10–28 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:halshs00746274&r=rmg 
By:  Yi-Hsuan Chen; Wolfgang Karl Härdle 
Abstract:  We examine the common factors that determine systematic credit risk, and estimate and interpret them. We also compare the contributions of common factors in explaining the changes of credit default swap (CDS) spreads during the pre-crisis, crisis and post-crisis periods. Based on the testing results from the common principal components model, this study finds that the eigenstructures across the three subperiods are distinct and that the determinants of the risk factors differ across the three subperiods. Furthermore, we analyze the predictive ability of dynamics in CDS index changes using dynamic factor models. 
Keywords:  credit default swaps; common factors; credit risk 
JEL:  C38 G32 E43 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2012063&r=rmg 
By:  Carlos León 
Abstract:  Informational constraints may turn the Merton Model for corporate credit risk impractical. Applying this framework to the Colombian financial sector is limited to four stock-market-listed firms; more than a hundred banking and non-banking firms are not listed. Within the same framework, firms' debt spread over the risk-free rate may be considered as the market value of the sold put option that makes risky debt trade below default-risk-free debt. In this sense, under some supplementary but reasonable assumptions, this paper uses money market spreads implicit in sell/buy backs to infer default probabilities for local financial firms. Results comprise a richer set of (38) banking and non-banking firms. As expected, default probabilities are non-negligible, where the ratio of default-probability-to-leverage is lower for firms with access to lender-of-last-resort facilities. The approach is valuable since it allows for inferring forward-looking default probabilities in the absence of stock prices. Yet, two issues may limit the validity of results to serial and cross-section analysis: overvaluation of default probabilities due to (i) spreads containing non-credit risk factors, and (ii) systematic undervaluation of the firm's value. However, cross-section assessments of default probabilities within a wider range of firms are vital for financial authorities' decision making, and represent a major improvement in the implementation of the Merton Model in the absence of equity market data. 
Keywords:  Merton model, structural model, credit risk, probability of default, distance to default. 
JEL:  G2 G13 G33 G32 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:bdr:borrec:743&r=rmg 
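The textbook Merton machinery that the paper adapts can be sketched with the standard distance-to-default formula. This is the plain equity-market version, not the paper's money-market-spread variant, and all parameter values below are illustrative assumptions:

```python
import math
from statistics import NormalDist

def merton_pd(V, F, mu, sigma, T=1.0):
    """Physical default probability in the textbook Merton model.

    V: market value of firm assets, F: face value of debt due at T,
    mu: asset drift, sigma: asset volatility, T: horizon in years.
    Default occurs when assets fall below the debt face value at T.
    """
    d2 = (math.log(V / F) + (mu - 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    return NormalDist().cdf(-d2)  # P(V_T < F) under lognormal asset dynamics

# Illustrative firm: assets 20% above the debt face value.
pd_firm = merton_pd(V=120.0, F=100.0, mu=0.05, sigma=0.25)
```

Higher leverage (lower V relative to F) raises the implied default probability, which is the monotonicity the paper's default-probability-to-leverage ratios rely on.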
By:  Alberto Elices 
Abstract:  This paper describes the current taxonomy of model risk, approaches to its mitigation and management, and the importance of the model validation function, in collaboration with other departments, in designing and implementing these approaches. 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1211.0225&r=rmg 
By:  Christophe Dutang (SAF - Laboratoire de Sciences Actuarielle et Financière - Université Claude Bernard - Lyon I : EA2429, IRMA - Institut de Recherche Mathématique Avancée - CNRS : UMR7501 - Université de Strasbourg); Claude Lefèvre (Département de Mathématique - Université Libre de Bruxelles); Stéphane Loisel (SAF - Laboratoire de Sciences Actuarielle et Financière - Université Claude Bernard - Lyon I : EA2429) 
Abstract:  The purpose of this paper is to point out that an asymptotic rule "A+B/u" for the ultimate ruin probability applies to a wide class of dependent risk models, in discrete and continuous time. Dependence is incorporated through a mixing approach among claim amounts or claim interarrival times, leading to a systemic risk behavior. Ruin corresponds here either to classical ruin, or to stopping the activity after realizing that it is not profitable at all, when one has little possibility to increase the premium income rate. Several special cases, for which closed formulas are derived, are also investigated in some detail. 
Date:  2012–10–01 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal00746251&r=rmg 
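The role of the initial surplus u in an asymptotic rule like "A+B/u" can be illustrated with a Monte Carlo estimate of the ruin probability in a simple independent-claims, discrete-time risk process. This is a baseline sketch, not one of the dependent models of the paper, and every parameter is an assumption:

```python
import numpy as np

def ruin_probability(u, premium=1.2, n_periods=200, n_paths=10_000, seed=1):
    """Monte Carlo estimate of the finite-horizon ruin probability for the
    discrete-time surplus process R_t = u + premium*t - (sum of claims up to t)."""
    rng = np.random.default_rng(seed)
    claims = rng.exponential(1.0, size=(n_paths, n_periods))  # mean-1 claims
    surplus = u + premium * np.arange(1, n_periods + 1) - np.cumsum(claims, axis=1)
    # Ruin = surplus dips below zero at some point over the horizon.
    return float(np.mean(surplus.min(axis=1) < 0))

psi_low = ruin_probability(u=2.0)    # small initial surplus
psi_high = ruin_probability(u=10.0)  # large initial surplus
```

With a 20% safety loading (premium 1.2 against mean-1 claims), the estimated ruin probability falls as u grows, consistent with an A+B/u-type decay toward the constant A.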
By:  Alois Pichler; Alexander Shapiro 
Abstract:  This paper addresses law invariant coherent risk measures and their Kusuoka representations. By elaborating the existence of a minimal representation we show that every Kusuoka representation can be reduced to its minimal representation. Uniqueness (in a sense specified in the paper) of the risk measure's Kusuoka representation is derived from this initial result. Further, stochastic order relations are employed to identify the minimal Kusuoka representation. It is shown that measures in the minimal representation are extremal with respect to the order relations. The tools are finally employed to provide the minimal representation for important practical examples. Although the Kusuoka representation is usually given only for nonatomic probability spaces, this presentation closes the gap to spaces with atoms. 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1210.7257&r=rmg 
By:  Jennie Bai; Pierre Collin-Dufresne; Robert S. Goldstein; Jean Helwege 
Abstract:  Reduced-form models of default that attribute a large fraction of credit spreads to compensation for credit event risk typically preclude the most plausible economic justification for such risk to be priced, namely a "contagious" response of the market portfolio during the credit event. When this channel is introduced within a general equilibrium framework for an economy comprised of a large number of firms, credit event risk premia have an upper bound of just a few basis points and are dwarfed by the contagion premium. We provide empirical evidence supporting the view that credit event risk premia are minuscule. 
Keywords:  Default (Finance) ; Credit ; Risk ; Financial crises 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:fip:fednsr:577&r=rmg 
By:  Kartik Anand; James Chapman; Prasanna Gai 
Abstract:  We examine the financial stability implications of covered bonds. Banks issue covered bonds by encumbering assets on their balance sheet and placing them within a dynamic ring fence. As more assets are encumbered, jittery unsecured creditors may run, leading to a banking crisis. We provide conditions for such a crisis to occur. We examine how different over-the-counter market network structures influence the liquidity of secured funding markets and crisis dynamics. We draw on the framework to consider several policy measures aimed at mitigating systemic risk, including caps on asset encumbrance, global legal entity identifiers, and swaps of good for bad collateral by central banks. 
Keywords:  covered bonds, over-the-counter markets, systemic risk, asset encumbrance, legal entity identifiers, velocity of collateral 
JEL:  G01 G18 G21 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2012064&r=rmg 
By:  David E Allen (School of Accounting, Finance & Economics, Edith Cowan University, Australia); Abhay K Singh (School of Accounting, Finance & Economics, Edith Cowan University, Australia); Robert J Powell (School of Accounting, Finance & Economics, Edith Cowan University, Australia); Michael McAleer (Erasmus School of Economics, Erasmus University Rotterdam, Institute for Economic Research, Kyoto University, and Department of Quantitative Economics, Complutense University of Madrid); James Taylor (Said Business School, University of Oxford, Oxford); Lyn Thomas (Southampton Management School, University of Southampton, Southampton) 
Abstract:  This paper examines the asymmetric relationship between price and implied volatility and the associated extreme quantile dependence using a linear and nonlinear quantile regression approach. Our goal is to demonstrate that the relationship between the volatility and market return, as quantified by Ordinary Least Square (OLS) regression, is not uniform across the distribution of the volatility-price return pairs using quantile regressions. We examine the bivariate relationships of six volatility-return pairs, namely: CBOE VIX and S&P 500, FTSE 100 Volatility and FTSE 100, NASDAQ 100 Volatility (VXN) and NASDAQ, DAX Volatility (VDAX) and DAX 30, CAC Volatility (VCAC) and CAC 40, and STOXX Volatility (VSTOXX) and STOXX. The assumption of a normal distribution in the return series is not appropriate when the distribution is skewed, and hence OLS may not capture a complete picture of the relationship. Quantile regression, on the other hand, can be set up with various loss functions, both parametric and nonparametric (linear case) and can be evaluated with skewed marginal-based copulas (for the nonlinear case), which is helpful in evaluating the non-normal and nonlinear nature of the relationship between price and volatility. In the empirical analysis we compare the results from linear quantile regression (LQR) and copula-based nonlinear quantile regression known as copula quantile regression (CQR). The discussion of the properties of the volatility series and empirical findings in this paper have significance for portfolio optimization, hedging strategies, trading strategies and risk management, in general. 
Keywords:  Return-volatility relationship, quantile regression, copula, copula quantile regression, volatility index, tail dependence 
JEL:  C14 C58 G11 
Date:  2012–11 
URL:  http://d.repec.org/n?u=RePEc:kyo:wpaper:831&r=rmg 
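The building block of both LQR and CQR is the check (pinball) loss. A minimal numpy sketch with illustrative heavy-tailed data, showing that minimizing this loss over a constant recovers the empirical quantile — the intercept-only special case of quantile regression:

```python
import numpy as np

def pinball_loss(y, q_hat, tau):
    """Check (pinball) loss used in quantile regression at level tau."""
    e = y - q_hat
    return float(np.mean(np.where(e >= 0, tau * e, (tau - 1) * e)))

rng = np.random.default_rng(0)
y = rng.standard_t(df=4, size=5000)   # heavy-tailed 'returns', illustrative

# Brute-force minimization over a grid of candidate constants.
tau = 0.05
grid = np.linspace(y.min(), y.max(), 2001)
best = grid[np.argmin([pinball_loss(y, g, tau) for g in grid])]
# 'best' sits close to the empirical 5% quantile of y
```

Replacing the constant `q_hat` with a linear function of regressors gives LQR; CQR replaces it with a copula-implied conditional quantile curve.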
By:  Stéphane Loisel (SAF - Laboratoire de Sciences Actuarielle et Financière - Université Claude Bernard - Lyon I : EA2429); Hans-U. Gerber (UNIL - Université de Lausanne) 
Abstract:  We present applications of risk theory to contemporary problems related to the implementation of Solvency II-related concepts, such as the Own Risk and Solvency Assessment. 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:hal:journl:hal00746231&r=rmg 
By:  Lev Ratnovski (International Monetary Fund); Giovanni Dell'Ariccia (IMF) 
Abstract:  We revisit the link between bailouts and bank risk taking. The expectation of government support to failing banks (bailout) creates moral hazard and encourages risk-taking. However, when a bank's success depends on both its idiosyncratic risk and the overall stability of the banking system, a government's commitment to shield banks from contagion may increase their incentives to invest prudently. We explore these issues in a simple model of financial intermediation where a bank's survival depends on another bank's success. We show that the positive effect from systemic insurance dominates the classical moral hazard effect when the risk of contagion is high. 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:red:sed012:133&r=rmg 
By:  Lars Peter Hansen 
Abstract:  Sparked by the recent “great recession” and the role of financial markets, considerable interest exists among researchers within both the academic community and the public sector in modeling and measuring systemic risk. In this essay I draw on experiences with other measurement agendas to place in perspective the challenge of quantifying systemic risk, or more generally, of providing empirical constructs that can enhance our understanding of linkages between financial markets and the macroeconomy. 
JEL:  E44 
Date:  2012–11 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:18505&r=rmg 
By:  Bruno Biais (Toulouse School of Economics (TSE)); Florian Heider (European Central Bank); Marie Hoerova (European Central Bank) 
Abstract:  We study the optimal design of clearing systems. We analyze how counterparty risk should be allocated, whether traders should be fully insured against that risk, and how moral hazard affects the optimal allocation of risk. The main advantage of centralized clearing, as opposed to no or decentralized clearing, is the mutualization of risk. While mutualization fully insures idiosyncratic risk, it cannot provide insurance against aggregate risk. When the latter is significant, it is efficient that protection buyers exert effort to find robust counterparties, whose low default risk makes it possible for the clearing system to withstand aggregate shocks. When this effort is unobservable, incentive compatibility requires that protection buyers retain some exposure to counterparty risk even with centralized clearing. JEL Classification: G22, G28, D82 
Keywords:  Risksharing, moral hazard, optimal contracting, counterparty risk, central clearing counterparty, mutualization, aggregate and idiosyncratic risk 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:ecb:ecbwps:20121481&r=rmg 
By:  Andrew Clare, James Seaton, Peter N. Smith and Stephen Thomas 
Abstract:  We show that combining momentum and trend following strategies for individual commodity futures can lead to portfolios that offer attractive risk-adjusted returns superior to those of simple momentum strategies; when we expose these returns to a wide array of sources of systematic risk, we find that robust alpha survives. Experimenting with risk parity portfolio weightings has limited impact on our results, though it is beneficial to long-short strategies in particular; the marginal impact of applying trend following methods far outweighs momentum and risk parity adjustments in terms of risk-adjusted returns and limiting downside risk. Overall this leads to an attractive strategy for investing in commodity futures and emphasises the importance of trend following as an investment strategy in the commodity futures context. 
Keywords:  trend following, momentum, risk parity, equally-weighted portfolios, commodity futures. 
JEL:  G10 G11 G12 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:yor:yorken:12/28&r=rmg 
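A bare-bones version of the trend following ingredient (long when price is above its trailing moving average, flat otherwise) can be sketched as follows; the lookback length and simulated prices are assumptions, not the paper's specification:

```python
import numpy as np

def trend_following_returns(prices, lookback=10):
    """Strategy returns for a simple moving-average trend filter:
    hold the asset over period t->t+1 iff price at t exceeds the
    trailing lookback-period moving average ending at t."""
    prices = np.asarray(prices, dtype=float)
    ma = np.convolve(prices, np.ones(lookback) / lookback, mode="valid")
    # Signal observed at t, applied to the return from t to t+1 (no look-ahead).
    signal = (prices[lookback - 1:-1] > ma[:-1]).astype(float)
    asset_ret = np.diff(prices)[lookback - 1:] / prices[lookback - 1:-1]
    return signal * asset_ret

# Illustrative price path: 500 days of a random walk with drift.
rng = np.random.default_rng(2)
prices = 100 * np.cumprod(1 + rng.normal(0.0005, 0.01, 500))
strat = trend_following_returns(prices)
```

Cross-sectional momentum ranking across futures contracts, and risk-parity weighting of the resulting positions, are then layered on top of signals of this kind.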
By:  Ronkainen, Vesa (Financial Supervisory Authority) 
Abstract:  This work studies and develops tools to quantify and manage the risks and uncertainty relating to the pricing of annuities in the long run. To this end, an idealized Monte Carlo simulation model is formulated, estimated and implemented, which enables one to investigate some typical pension and life insurance products. The main risks in pension insurance relate to investment performance and mortality/longevity development. We first develop stochastic models for equity and bond returns. The S&P 500 yearly total return is modeled by an uncorrelated and Normally distributed process to which exogenous Gamma-distributed negative shocks arrive with Geometrically distributed interarrival times. This regime-switching jump model takes into account the empirical observations of infrequent exceptionally large losses. The 5-year US government bond yearly total return is modeled as an ARMA(1,1) process after suitably log-transforming the returns. This model is able to generate long-term interest rate cycles and allows rapid year-to-year corrections in the returns. We also address the parameter uncertainty in these models. 
We then develop a stochastic model for mortality. The chosen mortality forecasting model is the well-known model of Lee and Carter (1992), in which we use Bayesian MCMC methods in the inference concerning the time index. Our analysis with a local version of the model showed that the assumptions of the Lee-Carter model are not fully compatible with Finnish mortality data. In particular we found that mortality has been lower than average for the cohort born in wartime. However, because the forecasts of these two models were not significantly different, we chose the more parsimonious Lee-Carter model. Although our main focus is on the total population data, we also analysed the data for males and females separately. 
Finally, we build a flexible model for the dependence structure that allows us to generate stochastic scenarios in which mortality and economic processes are either uncorrelated, correlated or shock-correlated. 
By using the simulation model to generate stochastic pension cash flows, we are then able to analyse the financing of longevity risk in pension insurance and the resulting risk management issues. This is accomplished via three case studies. Two of these concentrate on the pricing and solvency questions of a pension portfolio. The first study covers a single cohort of different sizes, and the second allows for multiple cohorts of annuitants. The final case study discusses individual pension insurance from the customer and long-term points of view. 
Realistic statistical long-term risk measurement is the key theme of this work, and so we compare our simulation results with the Value-at-Risk or VaR approach. The results show that the limitations of the basic VaR approach must be carefully accounted for in applications. The VaR approach is the most commonly used risk measurement methodology in insurance and finance applications. For instance, it underlies the solvency capital requirement in Solvency II, which we also discuss in this work. 
Keywords:  equities; stocks; jump model; bond; longevity; Lee-Carter model; stochastic mortality; cohort mortality; dependence model; asymmetric dependence; parameter uncertainty; stochastic annuity; pension; cohort size; solvency; internal model 
JEL:  G12 J11 
Date:  2012–05–25 
URL:  http://d.repec.org/n?u=RePEc:hhs:bofism:2012_044&r=rmg 
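The equity model described in the abstract (Normal base returns hit by Gamma-distributed negative shocks arriving at Geometrically distributed interarrival times) can be sketched as below; all parameter values are illustrative assumptions, not the thesis's estimates:

```python
import numpy as np

def simulate_equity_returns(n_years, mu=0.08, sigma=0.16, p_shock=0.1,
                            shock_shape=2.0, shock_scale=0.12, seed=0):
    """Yearly equity total returns: an i.i.d. Normal base process minus
    Gamma-distributed negative shocks. A shock arrives each year with
    probability p_shock, so interarrival times are Geometric."""
    rng = np.random.default_rng(seed)
    base = rng.normal(mu, sigma, n_years)
    shock_years = rng.random(n_years) < p_shock
    shocks = rng.gamma(shock_shape, shock_scale, n_years)
    return base - shock_years * shocks

rets = simulate_equity_returns(10_000)
```

The subtracted jump component lowers the mean and fattens the left tail relative to the plain Normal model, reproducing infrequent, exceptionally large losses.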
By:  Marco Bianchetti 
Abstract:  Once upon a time there was a classical financial world in which all the Libors were equal. Standard textbooks taught that simple relations held, such that, for example, a 6-month Libor Deposit was replicable with a 3-month Libor Deposit plus a 3x6-month Forward Rate Agreement (FRA), and that Libor was a good proxy of the risk-free rate required as the basic building block of no-arbitrage pricing theory. Nowadays, in the modern financial world after the credit crunch, some Libors are more equal than others, depending on their rate tenor, and classical formulas are history. Banks are no longer too "big to fail", Libors are fixed by panels of risky banks, and they are risky rates themselves. These simple empirical facts carry very important consequences for derivatives trading and risk management, such as, for example, basis risk, collateralization and regulatory pressure in favour of Central Counterparties. Something that should be carefully considered by anyone managing even a single plain vanilla Swap. In this qualitative note we review the problem, trying to shed some light on this modern animal farm by drawing an analogy with quantum physics, the Zeeman effect. 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1210.7329&r=rmg 
By:  Joan del Castillo; Jalila Daoudi; Isabel Serra 
Abstract:  In this article we show the relationship between the Pareto distribution and the gamma distribution, and show that the latter, appropriately extended, explains some anomalies that arise in the practical use of extreme value theory. The results are useful for phenomena that are fitted by the Pareto distribution but, at the same time, present a deviation from this law for very large values. Two examples of data analysis with the new model are provided. The first is on the influence of climate variability on the occurrence of tropical cyclones; the second on the analysis of aggregate loss distributions associated with operational risk management. 
Date:  2012–11 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1211.0130&r=rmg 
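One classical Pareto-gamma link of the kind the paper builds on: an exponential variable whose rate is gamma-distributed is Pareto II (Lomax) distributed. A simulation check of that identity, with illustrative parameters:

```python
import numpy as np

def lomax_via_gamma_mixture(alpha, beta, size, seed=0):
    """Sample a Lomax (Pareto II) variable as an exponential whose rate is
    Gamma(alpha, rate=beta) distributed: the classical gamma-Pareto mixture."""
    rng = np.random.default_rng(seed)
    rates = rng.gamma(alpha, 1.0 / beta, size)   # numpy parametrizes by scale = 1/rate
    return rng.exponential(1.0 / rates)          # per-sample exponential scale

x = lomax_via_gamma_mixture(alpha=3.0, beta=2.0, size=200_000)

# Lomax(alpha, beta) survival function: P(X > t) = (1 + t/beta) ** -alpha
emp = float(np.mean(x > 1.0))
theo = (1 + 1.0 / 2.0) ** -3.0
```

The empirical tail frequency matches the analytic Lomax survival function, confirming that gamma-mixing the exponential rate produces power-law (Pareto-type) tails.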