
on Risk Management 
Issue of 2012‒10‒27
seventeen papers chosen by 
By:  Brice Hakwa; Manfred Jäger-Ambrożewicz; Barbara Rüdiger 
Abstract:  This paper is devoted to the quantification and analysis of the marginal risk contribution of a given single financial institution i to the risk of a financial system s. Our work expands on the CoVaR concept proposed by Adrian and Brunnermeier as a tool for measuring marginal systemic risk contributions. We first give a mathematical definition of CoVaR_{\alpha}^{s|L^i=l}. Our definition improves the CoVaR concept by expressing CoVaR_{\alpha}^{s|L^i=l} as a function of a state l and of a given probability level \alpha, relative to i and s respectively. Based on copula theory, we connect CoVaR_{\alpha}^{s|L^i=l} to the partial derivatives of the copula through their probabilistic interpretation as conditional probabilities. Using this, we provide a closed formula for the calculation of CoVaR_{\alpha}^{s|L^i=l} for a large class of (marginal) distributions and dependence structures (linear and nonlinear). Our formula allows a finer analysis of systemic risk using CoVaR, in the sense that CoVaR_{\alpha}^{s|L^i=l} can be expressed in terms of the marginal distributions of the losses of i and s respectively and the copula between L^i and L^s. We discuss the implications of this for the quantification and analysis of systemic risk contributions; for example, we analyse the marginal effects of L^i, L^s and the copula C on the risk contribution of i. 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1210.4713&r=rmg 
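The copula route to such a closed formula can be illustrated with a small sketch. This is not the paper's own formula: it assumes a Gaussian copula with correlation rho between L^i and L^s and the convention P(L^s <= CoVaR | L^i = l) = alpha, and it returns the copula-scale level v, which would still be mapped through the system-loss quantile function F_s^{-1}(v) to obtain CoVaR itself.

```python
from statistics import NormalDist

def covar_level_gaussian(alpha, u, rho):
    """Solve dC(u, v)/du = alpha for v under a Gaussian copula with
    correlation rho, where u = F_i(l) is the conditioning state on
    copula scale. The partial derivative of the Gaussian copula is a
    conditional-normal cdf, so v has a closed form."""
    nd = NormalDist()
    z = rho * nd.inv_cdf(u) + (1.0 - rho * rho) ** 0.5 * nd.inv_cdf(alpha)
    return nd.cdf(z)
```

With rho = 0 the conditioning is irrelevant and v = alpha; a positive rho pushes the conditional quantile level upward, which is the kind of marginal effect of the dependence structure the abstract refers to.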
By:  Ramon Alemany (Department of Econometrics, Riskcenter-IREA, University of Barcelona, Av. Diagonal, 690, 08034 Barcelona, Spain); Catalina Bolancé (Department of Econometrics, Riskcenter-IREA, University of Barcelona, Av. Diagonal, 690, 08034 Barcelona, Spain); Montserrat Guillén (Department of Econometrics, Riskcenter-IREA, University of Barcelona, Av. Diagonal, 690, 08034 Barcelona, Spain) 
Abstract:  A method to estimate an extreme quantile that requires no distributional assumptions is presented. The approach is based on transformed kernel estimation of the cumulative distribution function (cdf). The proposed method consists of a double transformation kernel estimation. We derive optimal bandwidth selection methods that have a direct expression for the smoothing parameter. The bandwidth can adapt to the quantile level of interest. The procedure is useful for large data sets and improves quantile estimation compared to other methods for heavy-tailed distributions. Implementation is straightforward and R programs are available. 
Keywords:  kernel estimation, bandwidth selection, quantile, risk measures. 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:xrp:wpaper:xreap201219&r=rmg 
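The paper's double transformation and optimal bandwidth expression are not reproduced here, but the basic idea of quantile estimation by inverting a kernel-smoothed cdf can be sketched as follows (untransformed Gaussian kernel, fixed bandwidth, all assumptions of this sketch):

```python
import math

def kernel_cdf(data, x, h):
    """Kernel estimator of the cdf at x: each observation contributes
    an integrated Gaussian kernel Phi((x - X_i) / h)."""
    phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return sum(phi((x - xi) / h) for xi in data) / len(data)

def kernel_quantile(data, p, h, tol=1e-9):
    """Estimate the p-quantile by inverting the smoothed cdf with
    bisection; the smoothed cdf is monotone in x."""
    lo, hi = min(data) - 5.0 * h, max(data) + 5.0 * h
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if kernel_cdf(data, mid, h) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For extreme quantiles the transformation step matters precisely because a plain kernel estimator like this one has little data in the tail to smooth over.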
By:  Radim Gottwald (Department of Finance, Faculty of Business and Economics, Mendel university in Brno) 
Abstract:  The author focuses on the Value at Risk model, which is currently often adopted for risk analysis, particularly in banking and insurance. After characterizing the principles of the model, the Value at Risk is interpreted economically. Attention is paid to the distinct features of three submethods: historical simulation, the Monte Carlo method and the variance-covariance method. A number of empirical studies on the practical application of these methods are surveyed. The objective of the paper is the application of the Value at Risk model to shares from the SPAD segment of the Prague Stock Exchange between 2009 and 2011. The confidence level, holding period, historical period and other essential parameters of the submethods are defined and chosen in turn. Using historical stock values, various statistical indicators are calculated. The diversified Values at Risk of the submethods are benchmarked against the non-diversified ones. The results show that, for each of the three methods, the loss associated with the non-diversified Value at Risk is always higher than the loss associated with the diversified one. With a selected probability, we can expect a drop in the value of the portfolio which differs depending on the method adopted, based on recent share developments. The methodology is further benchmarked against the methodologies of other papers applying the Value at Risk model. The contribution of this paper lies in its selection of applied methods, risk factors and the stock market studied. The methodology allows the risk level of investments in shares to be evaluated in a specific way, which will be appreciated by numerous financial entities when making investment decisions. 
Keywords:  risk measurement, historical simulation method, Monte Carlo method, variance-covariance method 
JEL:  C15 E37 G32 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:men:wpaper:30_2012&r=rmg 
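Two of the three submethods can be sketched in a few lines (illustrative only; the paper's holding periods, confidence levels and SPAD data are not reproduced, and the Monte Carlo variant is omitted). Losses are taken as the negatives of returns, and VaR is the alpha-quantile of the loss distribution.

```python
import math
from statistics import NormalDist, mean, stdev

def var_historical(returns, alpha=0.95):
    """Historical-simulation VaR: the empirical alpha-quantile of the
    loss distribution, read off the sorted historical losses."""
    losses = sorted(-r for r in returns)
    k = math.ceil(alpha * len(losses)) - 1   # 0-based order statistic
    return losses[k]

def var_parametric(returns, alpha=0.95):
    """Variance-covariance VaR: fit a normal distribution to the
    returns and take the alpha-quantile of the implied losses."""
    mu, sigma = mean(returns), stdev(returns)
    return -(mu + NormalDist().inv_cdf(1.0 - alpha) * sigma)
```

The two estimates diverge exactly when the empirical return distribution departs from normality, which is why benchmarking the submethods against each other, as the paper does, is informative.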
By:  Azusa Takeyama (Deputy Director and Economist, Institute for Monetary and Economic Studies, Bank of Japan (Email: azusa.takeyama@boj.or.jp)); Nick Constantinou (Lecturer, Essex Business School, University of Essex (Email: nconst@essex.ac.uk)); Dmitri Vinogradov (Lecturer, Essex Business School, University of Essex (Email: dvinog@essex.ac.uk)) 
Abstract:  This paper investigates how the market valuation of credit risk changed during 2008–2009 via a separation of the probability of default (PD) and the loss given default (LGD) of credit default swaps (CDSs), using the information implied by equity options. While the Lehman Brothers collapse in September 2008 harmed the stability of the financial systems in major industrialized countries, the CDS spreads of some major UK banks did not increase in response to this turmoil in financial markets, including the decline in their own stock prices. This implies that the CDS spreads of financial institutions may not reflect all of their credit risk, owing to government interventions. Since CDS spreads are therefore not appropriate for analyzing the impact of government interventions on credit risk or the cross-sectional movement of credit risk, we investigate how government interventions affected the PD and LGD of financial institutions and how the PD and LGD of financial institutions were related to those of non-financial firms. Using principal component analysis, we demonstrate that the rise in the credit risk of financial institutions did not bring about a corresponding rise for non-financial firms (credit risk contagion) in either the US or the UK. 
Keywords:  Credit Default Swap (CDS), Probability of Default (PD), Loss Given Default (LGD), Credit Risk Contagion 
JEL:  C12 C53 G13 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:ime:imedps:12e15&r=rmg 
By:  Cai, J.; Einmahl, J.H.J.; Haan, L.F.M. de; Zhou, C. (Tilburg University, Center for Economic Research) 
Abstract:  Denote the loss return on the equity of a financial institution as X and that of the entire market as Y. For a given very small value of p > 0, the marginal expected shortfall (MES) is defined as E(X | Y > Q_Y(1−p)), where Q_Y(1−p) is the (1−p)-th quantile of the distribution of Y. The MES is an important factor when measuring the systemic risk of financial institutions. For a wide nonparametric class of bivariate distributions, we construct an estimator of the MES and establish the asymptotic normality of the estimator when p ↓ 0, as the sample size n → ∞. Since we are in particular interested in the case p = O(1/n), we use extreme value techniques for deriving the estimator and its asymptotic behavior. The finite sample performance of the estimator and the adequacy of the limit theorem are shown in a detailed simulation study. We also apply our method to estimate the MES of three large U.S. investment banks. 
Keywords:  Asymptotic normality; extreme values; tail dependence 
JEL:  C13 C14 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:dgr:kubcen:2012080&r=rmg 
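For p comfortably above 1/n, the MES defined above admits a naive empirical counterpart; the extreme-value extrapolation the paper develops for p = O(1/n) is the hard part and is not attempted in this sketch.

```python
def mes_empirical(x, y, p):
    """Naive MES estimate of E[X | Y > Q_Y(1 - p)]: average the X
    values over the round(n*p) observations with the largest Y.
    Only sensible when n*p is much larger than 1."""
    n = len(y)
    k = max(1, round(n * p))                 # tail sample size
    top = sorted(zip(y, x), reverse=True)[:k]
    return sum(xi for _, xi in top) / k
```

When p is of order 1/n, the conditioning event contains essentially no sample points, which is exactly why the paper resorts to extreme value techniques instead of this empirical average.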
By:  Marek Petrik; Dharmashankar Subramanian 
Abstract:  Stochastic domains often involve risk-averse decision makers. While recent work has focused on how to model risk in Markov decision processes using risk measures, it has not addressed the problem of solving large risk-averse formulations. In this paper, we propose and analyze a new method for solving large risk-averse MDPs with hybrid continuous-discrete state spaces and continuous action spaces. The proposed method iteratively improves a bound on the value function using a linearity structure of the MDP. We demonstrate the utility and properties of the method on a portfolio optimization problem. 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1210.4901&r=rmg 
By:  Bielecki, T.R. (Illinois Institute of Technology); Cousin, A. (Université de Lyon); Crépey, S. (Université d’Évry Val d’Essonne); Herbertsson, Alexander (Department of Economics, School of Business, Economics and Law, Göteborg University) 
Abstract:  In [4], the authors introduced a Markov copula model of portfolio credit risk. This model solves the top-down versus bottom-up puzzle by achieving efficient joint calibration to single-name CDS and multi-name CDO tranche data. In [4], we studied a general model that allows for stochastic default intensities and for random recoveries, and we conducted an empirical study of our model using both deterministic and stochastic default intensities, as well as deterministic and random recoveries. Since, for some “badly behaved” data sets, a satisfactory calibration accuracy can only be achieved through the use of random recoveries, and since for important applications, such as CVA computations for credit derivatives, the use of stochastic intensities is advocated by practitioners, an efficient implementation of our model that accounts for these two features is very important. However, the details behind the implementation of the loss distribution in the case with random recoveries were not provided in [4], nor were the details on the stochastic default intensities given there. This paper is thus a complement to [4], with a focus on a detailed description of the methodology that we used to implement these two model features: random recoveries and stochastic intensities. 
Keywords:  Portfolio Credit Risk; Markov Copula Model; Common Shocks; Stochastic Spreads; Random Recoveries 
JEL:  C02 C63 G13 G32 G33 
Date:  2012–10–16 
URL:  http://d.repec.org/n?u=RePEc:hhs:gunwpe:0545&r=rmg 
By:  Namho Kang; Peter Kondor; Ronnie Sadka 
Abstract:  This paper studies the effect of hedge-fund trading on idiosyncratic risk. We hypothesize that while hedge-fund activity would often reduce idiosyncratic risk, high initial levels of idiosyncratic risk might be further amplified due to fund loss limits. Panel-regression analyses provide supporting evidence for this hypothesis. The results are robust to sample selection and are further corroborated by a natural experiment using the Lehman bankruptcy as an exogenous adverse shock to hedge-fund trading. Hedge-fund capital also explains the increased idiosyncratic volatility of high-idiosyncratic-volatility stocks as well as the decreased idiosyncratic volatility of low-idiosyncratic-volatility stocks over the past few decades. 
Date:  2012–10–04 
URL:  http://d.repec.org/n?u=RePEc:ceu:econwp:2012_15&r=rmg 
By:  Azusa Takeyama (Deputy Director and Economist, Institute for Monetary and Economic Studies, Bank of Japan (Email: azusa.takeyama@boj.or.jp)); Nick Constantinou (Lecturer, Essex Business School, University of Essex (Email: nconst@essex.ac.uk)); Dmitri Vinogradov (Lecturer, Essex Business School, University of Essex (Email: dvinog@essex.ac.uk)) 
Abstract:  This paper develops a framework to estimate the probability of default (PD) implied in listed stock options. The underlying option pricing model measures PD as the intensity of a jump diffusion process, in which the underlying stock price jumps to zero at default. We adopt a two-stage calibration algorithm to obtain a precise estimator of PD. In the calibration procedure, we improve the fit of the option pricing model by implementing a time-inhomogeneous term structure model within it. Since the term structure model perfectly fits the actual term structure, we resolve the estimation bias caused by the poor fit of a time-homogeneous term structure model. It is demonstrated that the PD estimator from listed stock options can provide meaningful insights into the pricing of credit derivatives such as credit default swaps. 
Keywords:  probability of default (PD), option pricing under credit risk, perturbation method 
JEL:  C12 C53 G13 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:ime:imedps:12e14&r=rmg 
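A common textbook version of such a jump-to-default model (an assumption of this sketch, not necessarily the paper's specification) prices a European call on a stock that jumps to zero at a default time with constant intensity lam: pre-default, the risk-neutral drift rises to r + lam, and the call price equals the Black-Scholes price with the rate shifted to r + lam.

```python
import math
from statistics import NormalDist

def jtd_call(S, K, T, r, sigma, lam):
    """European call under a jump-to-default diffusion with constant
    default intensity lam: the stock drifts at r + lam before default
    and is worthless after, so the call equals Black-Scholes with the
    rate r replaced by r + lam."""
    nd = NormalDist()
    rh = r + lam
    d1 = (math.log(S / K) + (rh + 0.5 * sigma * sigma) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * nd.cdf(d1) - K * math.exp(-rh * T) * nd.cdf(d2)
```

Matching such model prices to listed option quotes and then backing lam out is the spirit of the PD extraction described above; with lam = 0 the formula collapses to plain Black-Scholes.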
By:  Xuqing Huang; Irena Vodenska; Shlomo Havlin; H. Eugene Stanley 
Abstract:  In order to design complex networks that are robust and sustainable, we must understand systemic risk. As economic systems become increasingly interconnected, for example, a shock in a single financial network can provoke cascading failures throughout the system. The widespread effects of the current EU debt crisis and the 2008 world financial crisis occur because financial systems are characterized by complex relations that allow a local crisis to spread dramatically. We study US commercial bank balance sheet data and design a bipartite banking network composed of (i) banks and (ii) bank assets. We propose a cascading failure model to simulate the crisis spreading process in a bipartite banking network. We test our model using 2007 data to analyze failed banks. We find that, within realistic parameters, our model identifies a significant portion of the actual failed banks from the FDIC failed bank database from 2008 to 2011. 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1210.4973&r=rmg 
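The cascade mechanism can be caricatured in a toy simulation; the paper's calibrated balance-sheet model is considerably richer, and the contagion rule and numbers below are invented purely for illustration.

```python
def cascade(holdings, equity, shocked_asset, loss_frac, contagion_frac=0.5):
    """Toy bipartite bank-asset cascade. An initial shock devalues one
    asset class; any bank whose cumulative losses exceed its equity
    fails, and each failure devalues the failed bank's assets by
    contagion_frac (a crude fire-sale effect), propagating losses to
    banks holding the same assets.
    holdings: dict bank -> dict asset -> exposure."""
    asset_loss = {shocked_asset: loss_frac}
    failed = set()
    changed = True
    while changed:
        changed = False
        for bank, positions in holdings.items():
            if bank in failed:
                continue
            loss = sum(w * asset_loss.get(a, 0.0) for a, w in positions.items())
            if loss > equity[bank]:
                failed.add(bank)
                changed = True
                for a in positions:  # fire sale devalues this bank's assets
                    asset_loss[a] = min(1.0, asset_loss.get(a, 0.0) + contagion_frac)
    return failed
```

Even in this caricature, a bank that survives the initial shock can fail in a later round because another bank's failure devalues a shared asset, which is the contagion channel the paper studies with FDIC data.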
By:  Victor Aguirregabiria; Robert Clark; Hui Wang 
Abstract:  The 1994 Riegle-Neal (RN) Act removed interstate banking restrictions in the US. The primary motivation was to permit geographic risk diversification (GRD). Using a factor model to measure banks' geographic risk, we show that RN expanded GRD possibilities in small states, but that few banks took advantage. Using our measure of geographic risk and a revealed preference approach, we identify preferences towards GRD separately from the contribution of other factors to branch network configuration. Risk has a negative effect on bank value, but this has been counterbalanced by economies of density/scale, reallocation/merging costs, and concerns for local market power. 
Keywords:  Geographic risk diversification; Retail banking; Oligopoly competition; Branch networks; Riegle-Neal Act 
JEL:  L13 L51 G21 
Date:  2012–10–15 
URL:  http://d.repec.org/n?u=RePEc:tor:tecipa:tecipa465&r=rmg 
By:  André van Stel; Andrew Burke; José Maria Millan; Concepcion Roman 
Abstract:  Startup size is a key strategic decision for entrepreneurs. Should entrepreneurs start up close to minimum efficient scale, or should they take fewer risks and start up on a smaller scale? Previously, this strategic decision appeared to be a simple choice between a higher-risk/reward large startup and a lower-risk/reward smaller-scale startup. However, recent research on the relationship between risk management and performance (Burke, 2009) indicates that in situations of greater uncertainty, and where innovation is incremental, a lower-risk small startup size can enable greater reward through enhanced post-startup flexibility and agility. In this paper we provide the first statistical test of the efficacy of startup size strategies. We focus on employer businesses, which provide jobs. We find that employer businesses that originally adopted a small-scale (own-account) startup strategy have higher survival chances and entrepreneurial incomes than employer businesses that employed personnel immediately from startup. We also find that prior entrepreneurial experience positively affects firm survival and entrepreneurial incomes. Given the high failure rates among startups and the associated difficulty for new enterprises to create sustainable jobs, the results highlight how strategic choices regarding firm startup size and risk management can have an important bearing on new venture performance. 
Date:  2012–08–29 
URL:  http://d.repec.org/n?u=RePEc:eim:papers:h201207&r=rmg 
By:  Gareth W. Peters; Alice X. D. Dong; Robert Kohn 
Abstract:  Our article considers the class of recently developed stochastic models that combine claims payments and incurred losses information into a coherent reserving methodology. In particular, we develop a family of hierarchical Bayesian Paid-Incurred-Claims models, combining the claims reserving models of Hertig et al. (1985) and Gogol et al. (1993). In the process we extend the independent log-normal model of Merz et al. (2010) by incorporating different dependence structures using a data-augmented mixture copula Paid-Incurred-Claims model. The utility and influence of incorporating both payment and incurred losses into the estimation of the full predictive distribution of the outstanding loss liabilities, and the resulting reserves, is demonstrated in the following cases: (i) an independent payment (P) data model; (ii) the independent Payment-Incurred Claims (PIC) data model of Merz et al. (2010); (iii) a novel dependent lag-year telescoping block-diagonal Gaussian copula PIC data model incorporating conjugacy via transformation; (iv) a novel data-augmented mixture Archimedean copula dependent PIC data model. Inference in such models is developed via a class of adaptive Markov chain Monte Carlo sampling algorithms. These incorporate a data-augmentation framework utilized to efficiently evaluate the likelihood for the copula-based PIC model in the loss reserving triangles. The adaptation strategy is based on representing a positive definite covariance matrix by the exponential of a symmetric matrix, as proposed by Leonard et al. (1992). 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1210.3849&r=rmg 
By:  P. Del Moral; G. W. Peters; Ch. Vergé 
Abstract:  Interacting particle methods are increasingly used to sample from complex and high-dimensional distributions. These stochastic particle integration techniques can be interpreted as a universal acceptance-rejection sequential particle sampler equipped with adaptive and interacting recycling mechanisms. In practice, the particles evolve randomly around the space independently, and to each particle is associated a positive potential function. Periodically, particles with high potentials duplicate at the expense of low-potential particles, which die. This natural genetic-type selection scheme appears in numerous applications in applied probability, physics, Bayesian statistics, signal processing, biology, and information engineering. It is the intention of this paper to introduce these methods to risk modeling. From a purely mathematical point of view, these stochastic samplers can be interpreted as Feynman-Kac particle integration methods. These functional models are natural mathematical extensions of the traditional change of probability measures commonly used to design importance sampling strategies. In this article, we provide a brief introduction to the stochastic modeling and the theoretical analysis of these particle algorithms. We conclude with an illustration of a subset of such methods applied to important risk measure and capital estimation problems in risk and insurance modelling. 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1210.3851&r=rmg 
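The "change of probability measures" that these Feynman-Kac models generalize is ordinary importance sampling. A minimal rare-event example (not from the paper): estimating a Gaussian tail probability by sampling under a mean-shifted measure and reweighting each draw by the likelihood ratio.

```python
import math
import random

def tail_prob_is(c, n=200_000, seed=1):
    """Estimate P(X > c) for X ~ N(0, 1) by importance sampling:
    draw under the tilted measure N(c, 1) and reweight by the
    likelihood ratio dN(0,1)/dN(c,1)(x) = exp(-c*x + c*c/2)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(c, 1.0)                  # tilted proposal
        if x > c:
            total += math.exp(-c * x + 0.5 * c * c)
    return total / n
```

For c = 3 the plain Monte Carlo hit rate is only about 0.13%, while under the tilted measure half the draws land in the tail, so far fewer samples achieve the same accuracy; the interacting particle schemes in the paper push this idea to sequences of distributions.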
By:  Mohamed Belhaj (Centrale Marseille (Aix-Marseille School of Economics), CNRS & EHESS); Nataliya Klimenko (Aix-Marseille Université, Greqam) 
Abstract:  Early regulatory intervention in problem banks is one of the key suggestions of Basel II. However, no guidance is given on its design. To fill this gap, we outline an incentive-based preventive supervision strategy that eliminates bad asset management in banks. Two supervision techniques are combined: continuous regulator intervention and random audits. Random audit technologies differ in quality and cost. Our design ensures good management without excessive supervision costs, through a gradual adjustment of supervision effort to the bank's financial health. We also consider preventive supervision in a setting where audits can be delegated to an independent audit agency, showing how to induce agency compliance with regulatory instructions in the least costly way. 
Keywords:  banking supervision, random audit, incentives, moral hazard, delegation. 
JEL:  G21 G28 
Date:  2012–01 
URL:  http://d.repec.org/n?u=RePEc:aim:wpaimx:1201&r=rmg 
By:  Joseph Y. Halpern; Samantha Leung 
Abstract:  We consider a setting where an agent's uncertainty is represented by a set of probability measures, rather than a single measure. Measure-by-measure updating of such a set of measures upon acquiring new information is well known to suffer from problems; agents are not always able to learn appropriately. To deal with these problems, we propose using weighted sets of probabilities: a representation where each measure is associated with a weight, which denotes its significance. We describe a natural approach to updating in such a situation and a natural approach to determining the weights. We then show how this representation can be used in decision making, by modifying a standard approach to decision making (minimizing expected regret) to obtain minimax weighted expected regret (MWER). We provide an axiomatization that characterizes preferences induced by MWER in both the static and dynamic cases. 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1210.4853&r=rmg 
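A literal reading of the MWER rule is easy to state in code. This is a sketch under invented toy acts and measures, not the authors' axiomatized framework: each act is scored by the largest weighted expected regret across the weighted measures, and the act with the smallest score is chosen.

```python
def mwer(acts, states, utility, measures):
    """Minimax weighted expected regret: pick the act minimizing the
    maximum, over (weight, probability) pairs, of
    weight * expected regret under that probability measure.
    measures: list of (weight, {state: prob}) pairs."""
    def regret(a, s):
        # regret of act a in state s vs. the best act for that state
        return max(utility(b, s) for b in acts) - utility(a, s)
    def score(a):
        return max(w * sum(p[s] * regret(a, s) for s in states)
                   for w, p in measures)
    return min(acts, key=score)
```

Down-weighting a measure shrinks its contribution to the worst case, so the recommended act can flip as weights are learned, which is the behaviour that distinguishes MWER from plain minimax expected regret over an unweighted set.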
By:  Jan Willem van den End; Marco Hoeberichts 
Abstract:  We analyse the relationship between tail risk and crisis measures taken by governments and the central bank. Using an adjusted Merton model in a game-theoretical setup, the analysis shows that the participation constraint for interventions by the central bank and the governments is less binding if the risk of contagion is high. The strategic interaction between governments and the central bank also influences the effectiveness of the interventions. A joint effort by both the governments and the central bank leads to a better outcome. To prevent a bad equilibrium, a sizable commitment by both players is required. Our stylized model sheds light on the strategic interaction between EMU governments and the Eurosystem in the context of the Outright Monetary Transactions (OMT) program. 
Keywords:  Financial crisis; Monetary policy; Central banks; Policy coordination 
JEL:  E42 E52 E61 G01 G18 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:dnb:dnbwpp:352&r=rmg 