nep-rmg New Economics Papers
on Risk Management
Issue of 2009‒09‒26
twenty-two papers chosen by
Stan Miles
Thompson Rivers University

  1. Risk Measures in Quantitative Finance By Sovan Mitra
  2. International Diversification: An Extreme Value Approach By Chollete, Loran; de la Pena, Victor; Lu, Ching-Chih
  3. "Optimal Risk Management Before, During and After the 2008-09 Financial Crisis" By Michael McAleer; Juan-Angel Jimenez-Martin; Teodosio Perez-Amaral
  4. What Happened to Risk Management During the 2008-09 Financial Crisis? By Juan-Angel Jimenez-Martin; Michael McAleer; Teodosio Pérez-Amaral
  5. Basel II and the Capital Requirements Directive: Responding to the 2008/09 Financial Crisis By Ojo, Marianne
  6. Credit Default Swaps and the Credit Crisis By René M. Stulz
  7. Haar Wavelets-Based Approach for Quantifying Credit Portfolio Losses By Josep J. Masdemont; Luis Ortiz-Gracia
  8. International Diversification: A Copula Approach By Chollete, Loran; Pena, Victor de la; Lu, Ching-Chih
  9. A Simplified Approach to modeling the credit-risk of CMO By K. Rajaratnam
  10. Climbing Down from the Top: Single Name Dynamics in Credit Top Down Models By Igor Halperin; Pascal Tomecek
  11. A new approach for scenario generation in Risk management By Juan-Pablo Ortega; Rainer Pullirsch; Josef Teichmann; Julian Wergieluk
  12. Estimating LGD Correlation By Jiří Witzany
  13. Implementing Loss Distribution Approach for Operational Risk By Pavel V. Shevchenko
  14. Macrostate Parameter, an Econophysics Approach for the Risk Analysis of the Stock Exchange Market Transactions By Anca Gheorghiu; Ion Spanulescu
  15. Liquidity risk, credit risk, and the Federal Reserve's responses to the crisis By Asani Sarkar
  16. An extension of Davis and Lo's contagion model By Didier Rullière; Diana Dorobantu
  17. Global risk minimization in financial markets By Andreas Martin Lisewski
  18. Second Order Risk By Peter G. Shepard
  19. A Bayesian Networks Approach to Operational Risk By V. Aquaro; M. Bardoscia; R. Bellotti; A. Consiglio; F. De Carlo; G. Ferri
  20. Operational Risk Management using a Fuzzy Logic Inference System By Alejandro Reveiz; Carlos Léon
  21. First-passage and risk evaluation under stochastic volatility By Jaume Masoliver; Josep Perello
  22. Computing Tails of Compound Distributions Using Direct Numerical Integration By Xiaolin Luo; Pavel V. Shevchenko

  1. By: Sovan Mitra
    Abstract: This paper was written for, and presented at, two seminars: a national UK university risk conference and a risk management industry workshop. The target audience is therefore a cross section of academics and industry professionals. The ongoing global credit crunch has highlighted the importance of risk measurement in finance to companies and regulators alike. Despite its central importance to risk management, few papers review risk measures or trace their evolution from the earliest beginnings to the present day. This paper reviews the most important portfolio risk measures in financial mathematics, from Bernoulli (1738) through Markowitz's portfolio theory to presently preferred measures such as CVaR (conditional Value at Risk). We provide a chronological review of the risk measures and survey less commonly known measures, e.g. the Treynor ratio.
    Date: 2009–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0904.0870&r=rmg
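    To make the end point of that survey concrete, the sketch below estimates one-day VaR and CVaR (expected shortfall) from a sample of returns by the plain historical method. The synthetic Student-t returns and the 99% confidence level are illustrative assumptions of this digest, not taken from the paper.

        import numpy as np

        def historical_var_cvar(returns, alpha=0.99):
            """Historical VaR and CVaR (expected shortfall) of a return sample.

            VaR is reported as a positive loss at confidence level alpha;
            CVaR is the average loss beyond VaR.
            """
            losses = -np.asarray(returns)            # losses are negated returns
            var = np.quantile(losses, alpha)         # alpha-quantile of the loss distribution
            cvar = losses[losses >= var].mean()      # mean loss in the tail beyond VaR
            return var, cvar

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            sample = rng.standard_t(df=4, size=10_000) * 0.01   # toy heavy-tailed daily returns
            var99, cvar99 = historical_var_cvar(sample, alpha=0.99)
            print(f"99% VaR:  {var99:.4f}")
            print(f"99% CVaR: {cvar99:.4f}")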
  2. By: Chollete, Loran (University of Stavanger); de la Pena, Victor (Columbia University); Lu, Ching-Chih (National Chengchi University)
    Abstract: .
    Keywords: Diversification; Downside Risk; Correlation Complexity; Extreme Value; Systemic Risk
    JEL: C14 F30 G15
    Date: 2009–06–30
    URL: http://d.repec.org/n?u=RePEc:hhs:stavef:2009_026&r=rmg
  3. By: Michael McAleer (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam and Tinbergen Institute and Center for International Research on the Japanese Economy (CIRJE), Faculty of Economics, University of Tokyo); Juan-Angel Jimenez-Martin (Department of Quantitative Economics, Complutense University of Madrid); Teodosio Perez-Amaral (Department of Quantitative Economics, Complutense University of Madrid)
    Abstract: In this paper we advance the idea that optimal risk management under the Basel II Accord will typically require the use of a combination of different models of risk. This idea is illustrated by analyzing the best empirical models of risk for five stock indexes before, during, and after the 2008-09 financial crisis. The data used are the Dow Jones Industrial Average, Financial Times Stock Exchange 100, Nikkei, Hang Seng and Standard and Poor's 500 Composite Index. The primary goal of the exercise is to identify the best models for risk management in each period according to the minimization of average daily capital requirements under the Basel II Accord. It is found that the best risk models can and do vary before, during and after the 2008-09 financial crisis. Moreover, it is found that an aggressive risk management strategy, namely the supremum strategy that combines different models of risk, can result in significant gains in average daily capital requirements, relative to the strategy of using single models, while staying within the limits of the Basel II Accord.
    Date: 2009–09
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2009cf667&r=rmg
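    A minimal sketch of the mechanics the abstract refers to, under assumptions of this digest rather than the paper's exact specification: daily VaR forecasts from several candidate models are combined into a single series (here by taking the least conservative, "aggressive" envelope), and a Basel II style daily capital requirement is taken as the larger of today's VaR and a multiplier times the average VaR over the previous 60 trading days. The fixed multiplier of 3, the envelope choice and the toy forecasts are all illustrative.

        import numpy as np

        def combined_var(forecasts, how="aggressive"):
            """Combine daily VaR forecasts (positive losses) from several models.

            'aggressive' takes the smallest forecast each day, 'conservative' the largest.
            forecasts: array of shape (n_models, n_days).
            """
            f = np.asarray(forecasts)
            return f.min(axis=0) if how == "aggressive" else f.max(axis=0)

        def basel_capital_charge(var_series, multiplier=3.0, window=60):
            """Daily charge: max(today's VaR, multiplier * mean VaR of the last `window` days)."""
            var_series = np.asarray(var_series, dtype=float)
            charge = np.empty_like(var_series)
            for t in range(len(var_series)):
                lo = max(0, t - window + 1)
                charge[t] = max(var_series[t], multiplier * var_series[lo:t + 1].mean())
            return charge

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            models = 0.02 + 0.005 * rng.random((3, 250))   # toy VaR forecasts: three models, 250 days
            var_combined = combined_var(models, how="aggressive")
            print(f"average daily capital requirement: {basel_capital_charge(var_combined).mean():.4f}")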
  4. By: Juan-Angel Jimenez-Martin (Dpto. de Fundamentos de Análisis Económico II, Universidad Complutense); Michael McAleer; Teodosio Pérez-Amaral (Dpto. de Fundamentos de Análisis Económico II, Universidad Complutense)
    Abstract: When dealing with market risk under the Basel II Accord, variation pays in the form of lower capital requirements and higher profits. Typically, GARCH-type models are chosen to forecast Value-at-Risk (VaR) using a single risk model. In this paper we illustrate two useful variations on the standard mechanism for choosing forecasts, namely: (i) combining different forecast models for each period, such as a daily model that forecasts the supremum or infimum value of the VaR; (ii) selecting a single model to forecast VaR and then modifying the daily forecast, depending on the recent history of violations under the Basel II Accord. We illustrate these points using the Standard and Poor’s 500 Composite Index. In many cases we find significant decreases in capital requirements, while incurring a number of violations that stays within the Basel II Accord limits.
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:ucm:doicae:0919&r=rmg
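    The second variation above, modifying a single model's forecast in response to its recent violation history, can be sketched as follows. The penalty table is the commonly cited Basel II traffic-light schedule for the multiplier as a function of violations over the previous 250 trading days; the 10% scale-up rule and all numbers in the example are illustrative assumptions of this digest, not the paper's procedure.

        import numpy as np

        # Commonly cited Basel II "traffic light" multiplier as a function of the number of
        # VaR violations observed over the previous 250 trading days.
        PENALTY = {0: 3.0, 1: 3.0, 2: 3.0, 3: 3.0, 4: 3.0,
                   5: 3.4, 6: 3.5, 7: 3.65, 8: 3.75, 9: 3.85}

        def multiplier(n_violations):
            return PENALTY.get(n_violations, 4.0)    # ten or more violations: red zone

        def violation_aware_forecast(var_forecasts, returns, window=250, scale_up=1.1):
            """Scale up the VaR forecast once recent violations push the multiplier above 3.

            A violation occurs when the realized loss exceeds the forecast VaR.
            """
            var_forecasts = np.asarray(var_forecasts, dtype=float)
            losses = -np.asarray(returns, dtype=float)
            adjusted = var_forecasts.copy()
            for t in range(1, len(var_forecasts)):
                lo = max(0, t - window)
                n_viol = int(np.sum(losses[lo:t] > adjusted[lo:t]))
                if multiplier(n_viol) > 3.0:         # in the yellow zone: be more conservative
                    adjusted[t] = scale_up * var_forecasts[t]
            return adjusted

        if __name__ == "__main__":
            rng = np.random.default_rng(2)
            rets = 0.01 * rng.standard_normal(500)   # toy daily returns
            raw = np.full(500, 0.02)                 # toy constant 2% VaR forecast
            adj = violation_aware_forecast(raw, rets)
            print("days on which the forecast was scaled up:", int((adj > raw).sum()))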
  5. By: Ojo, Marianne
    Abstract: This paper addresses factors which have prompted the need for further revision of banking regulation, with particular reference to the Capital Requirements Directive. The Capital Requirements Directive (CRD), which comprises the 2006/48/EC Directive on the taking up and pursuit of the business of credit institutions and the 2006/49/EC Directive on the capital adequacy of investment firms and credit institutions, implemented the revised framework for the International Convergence of Capital Measurement and Capital Standards (Basel II) within EU member states. Procyclicality has attracted a great deal of attention, particularly with regard to the recent financial crisis, owing to concerns arising from increased sensitivity to credit risk under Basel II. This paper considers not only whether such concerns are well founded, but also the beneficial and less beneficial consequences emanating from Basel II’s increased sensitivity to credit risk (as illustrated by the Internal Ratings Based approaches). In so doing, it considers the effects of Pillar 2 of Basel II, namely supervisory review, with particular reference to buffer levels, and whether banks’ actual capital ratios can be expected to correspond with Basel capital requirements given that they are expected to hold certain capital buffers under Pillar 2. Furthermore, it considers how regulators can respond to prevent systemic risks to the financial system during periods when highly leveraged firms become reluctant to lend. In deciding to cut back on lending activities, are the decisions of such firms justified in situations where their credit risk models are unduly sensitive, so that the level of capital actually retained is much higher than the minimum regulatory Basel capital requirements?
    Keywords: Basel II; Capital Requirements Directive; procyclicality; risk; regulation; banks
    JEL: E0 D0 K2 E5 E3
    Date: 2009–09–18
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:17379&r=rmg
  6. By: René M. Stulz
    Abstract: Many observers have argued that credit default swaps contributed significantly to the credit crisis. Of particular concern to these observers is that credit default swaps trade in the largely unregulated over-the-counter market as bilateral contracts involving counterparty risk, and that they facilitate speculation involving negative views of a firm’s financial strength. Some observers have suggested that credit default swaps would not have made the crisis worse had they been traded on exchanges. I conclude that credit default swaps did not cause the dramatic events of the credit crisis, that the over-the-counter credit default swaps market worked well during much of the first year of the credit crisis, and that exchange trading has both advantages and costs compared to over-the-counter trading. Though I argue that eliminating over-the-counter trading of credit default swaps could reduce social welfare, I also recognize that much research is needed to better understand and quantify the social gains and costs of derivatives in general and credit default swaps in particular.
    JEL: G13 G14 G18 G21 G24 G28
    Date: 2009–09
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:15384&r=rmg
  7. By: Josep J. Masdemont; Luis Ortiz-Gracia
    Abstract: This paper proposes a new methodology to compute Value at Risk (VaR) for quantifying losses in credit portfolios. We approximate the cumulative distribution of the loss function by a finite combination of Haar wavelet basis functions and calculate the coefficients of the approximation by inverting its Laplace transform. In fact, we demonstrate that only a few coefficients of the approximation are needed, so VaR can be computed quickly. To test the methodology we consider the Vasicek one-factor portfolio credit loss model as our model framework. The Haar wavelet method is fast, accurate and robust for small or concentrated portfolios, where the hypotheses behind the Basel II formulas are violated.
    Date: 2009–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0904.4620&r=rmg
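    For orientation, the sketch below sets up the Vasicek one-factor framework that the paper uses as its test model: the asymptotic (infinitely granular) loss quantile behind the Basel II formulas, and a brute-force Monte Carlo VaR for a small homogeneous portfolio where that asymptotic hypothesis breaks down. This is only a baseline for comparison, not the paper's wavelet method, and the parameter values are illustrative.

        import numpy as np
        from scipy.stats import norm

        def vasicek_asymptotic_var(pd, rho, alpha=0.999):
            """Loss-rate quantile of the asymptotic (infinitely granular) Vasicek model."""
            return norm.cdf((norm.ppf(pd) + np.sqrt(rho) * norm.ppf(alpha)) / np.sqrt(1.0 - rho))

        def vasicek_mc_var(pd, rho, n_obligors, alpha=0.999, n_sims=100_000, seed=0):
            """Monte Carlo VaR of the loss fraction for a finite homogeneous portfolio."""
            rng = np.random.default_rng(seed)
            y = rng.standard_normal(n_sims)                       # systematic factor
            eps = rng.standard_normal((n_sims, n_obligors))       # idiosyncratic factors
            assets = np.sqrt(rho) * y[:, None] + np.sqrt(1.0 - rho) * eps
            loss_fraction = (assets < norm.ppf(pd)).mean(axis=1)  # share of obligors in default
            return np.quantile(loss_fraction, alpha)

        if __name__ == "__main__":
            pd_, rho = 0.01, 0.15
            print("asymptotic 99.9% VaR:", vasicek_asymptotic_var(pd_, rho))
            print("50-name MC 99.9% VaR:", vasicek_mc_var(pd_, rho, n_obligors=50))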
  8. By: Chollete, Loran (University of Stavanger); Pena, Victor de la (Columbia University); Lu, Ching-Chih (National Chengchi University)
    Abstract: .
    Keywords: Diversification; Copula; Correlation Complexity; Downside Risk; Systemic Risk
    JEL: C14 F30 G15
    Date: 2009–06–29
    URL: http://d.repec.org/n?u=RePEc:hhs:stavef:2009_027&r=rmg
  9. By: K. Rajaratnam
    Abstract: The credit crisis of 2007 and 2008 has thrown much focus on the models used to price mortgage-backed securities. Many institutions have relied heavily on the credit ratings provided by the rating agencies. The relationships between the management of credit agencies and debt issuers may have resulted in conflicts of interest when pricing these securities, which has led to incorrect risk assumptions and value expectations among institutional buyers. Despite the existence of sophisticated models, institutional buyers have relied on these ratings when considering the risks involved with these products. Institutional investors interested in non-agency MBS are particularly vulnerable, owing to both credit risk and prepayment risk. This paper describes a simple simulation model for non-agency MBS and CMOs. The simulation model builds on existing models for agency MBS and incorporates the credit risk of mortgage buyers using existing models from the capital requirement framework specified by the Basel II Accord.
    Date: 2009–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0903.1643&r=rmg
  10. By: Igor Halperin; Pascal Tomecek
    Abstract: In the top-down approach to multi-name credit modeling, calculation of single name sensitivities appears possible, at least in principle, within the so-called random thinning (RT) procedure which dissects the portfolio risk into individual contributions. We make an attempt to construct a practical RT framework that enables efficient calculation of single name sensitivities in a top-down framework, and can be extended to valuation and risk management of bespoke tranches. Furthermore, we propose a dynamic extension of the RT method that enables modeling of both idiosyncratic and default-contingent individual spread dynamics within a Monte Carlo setting in a way that preserves the portfolio "top"-level dynamics. This results in a model that is not only calibrated to tranche and single name spreads, but can also be tuned to approximately match given levels of spread volatilities and correlations of names in the portfolio.
    Date: 2009–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0901.3404&r=rmg
  11. By: Juan-Pablo Ortega; Rainer Pullirsch; Josef Teichmann; Julian Wergieluk
    Abstract: We provide a new dynamic approach to scenario generation for the purposes of risk management in the banking industry. We connect ideas from conventional techniques -- like historical and Monte Carlo simulation -- and come up with a hybrid method that shares the advantages of standard procedures but eliminates several of their drawbacks. Instead of considering the static problem of constructing one- or ten-day-ahead distributions for vectors of risk factors, we embed the problem in a dynamic framework, where any time horizon can be consistently simulated. Additionally, we use standard models from mathematical finance for each risk factor, thus bridging the worlds of trading and risk management. Our approach is based on stochastic differential equations (SDEs), like the HJM equation or the Black-Scholes equation, governing the time evolution of risk factors, on an empirical method for calibrating the chosen SDEs to the market, and on an Euler scheme (or higher-order schemes) for the numerical evaluation of the respective SDEs. The empirical calibration procedure presented in this paper can be seen as the SDE counterpart of the so-called Filtered Historical Simulation method; the behavior of volatility stems in our case from the assumptions on the underlying SDEs. Furthermore, we are able to easily incorporate "middle-size" and "large-size" events within our framework, always making a precise distinction between information obtained from the market and information coming from the necessary a priori intuition of the risk manager. Results of one concrete implementation are provided.
    Date: 2009–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0904.0624&r=rmg
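    A minimal sketch of the SDE-plus-Euler-scheme idea described above, with geometric Brownian motion standing in for the risk-factor SDE so that scenarios can be generated consistently at any horizon. The drift, volatility and horizon are placeholders; the paper's calibration and filtered-historical-simulation machinery is not reproduced here.

        import numpy as np

        def euler_scenarios(s0, mu, sigma, horizon_days, n_scenarios, steps_per_day=1, seed=0):
            """Simulate risk-factor scenarios with an Euler scheme for dS = mu*S dt + sigma*S dW."""
            rng = np.random.default_rng(seed)
            dt = 1.0 / (252 * steps_per_day)              # year fraction per time step
            s = np.full(n_scenarios, float(s0))
            for _ in range(horizon_days * steps_per_day):
                dw = np.sqrt(dt) * rng.standard_normal(n_scenarios)
                s += mu * s * dt + sigma * s * dw         # Euler-Maruyama update
            return s

        if __name__ == "__main__":
            # Ten-day-ahead scenarios for one risk factor; any horizon can be simulated the same way.
            scen = euler_scenarios(s0=100.0, mu=0.05, sigma=0.25, horizon_days=10, n_scenarios=100_000)
            print("10-day 99% VaR (loss):", -np.quantile(scen - 100.0, 0.01))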
  12. By: Jiří Witzany (University of Economics, Prague, Czech Republic)
    Abstract: The paper proposes a new method to estimate the correlation of account-level Basel II Loss Given Default (LGD). The correlation determines the probability distribution of portfolio-level LGD in the context of a copula model which is used to stress the LGD parameter as well as to estimate the LGD discount rate and other parameters. Given historical LGD observations, we apply the maximum likelihood method to estimate the best correlation parameter. The method is applied and analyzed on a real, large data set of unsecured retail account-level LGDs and the corresponding monthly series of the average LGDs. The correlation estimate comes relatively close to the PD regulatory correlation. It is also tested for stability using the bootstrapping method and used in an efficient formula to estimate ex ante one-year stressed LGD, i.e. one-year LGD quantiles at any reasonable probability level.
    Keywords: credit risk, recovery rate, loss given default, correlation, regulatory capital
    JEL: G21 G28 C14
    Date: 2009–09
    URL: http://d.repec.org/n?u=RePEc:fau:wpaper:wp2009_21&r=rmg
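    Purely as an illustration of the kind of stressed-LGD quantile discussed above, the sketch below evaluates a standard Vasicek-type single-factor closed form; it is a textbook variant, not necessarily the paper's estimator, and the parameter values are invented.

        import numpy as np
        from scipy.stats import norm

        def stressed_lgd(expected_lgd, rho, alpha=0.999):
            """Single-factor Vasicek-type LGD quantile at confidence level alpha.

            expected_lgd: long-run mean LGD; rho: LGD correlation; both in (0, 1).
            """
            return norm.cdf((norm.ppf(expected_lgd) + np.sqrt(rho) * norm.ppf(alpha))
                            / np.sqrt(1.0 - rho))

        if __name__ == "__main__":
            # Illustrative numbers: 45% mean LGD, correlation 0.10, 99.9% one-year quantile.
            print(f"stressed LGD: {stressed_lgd(0.45, 0.10):.3f}")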
  13. By: Pavel V. Shevchenko
    Abstract: To quantify the operational risk capital charge under the current regulatory framework for banking supervision, referred to as Basel II, many banks adopt the Loss Distribution Approach. Many modeling issues must be resolved before the approach can be used in practice. In this paper we review the quantitative methods suggested in the literature for implementing the approach. In particular, we discuss the use of Bayesian inference, which allows expert judgement and parameter uncertainty to be taken into account, the modeling of dependence, and the inclusion of insurance.
    Date: 2009–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0904.1805&r=rmg
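    As a back-of-the-envelope companion to the review above, the sketch below runs the simplest Monte Carlo version of the Loss Distribution Approach: Poisson annual frequency, lognormal severities, and the 0.999 quantile of the aggregate annual loss as the capital figure. All parameter values are illustrative, and the Bayesian, dependence and insurance extensions discussed in the paper are not included.

        import numpy as np

        def lda_capital(lam, mu, sigma, alpha=0.999, n_years=100_000, seed=0):
            """Quantile of the annual aggregate loss under a Poisson-lognormal LDA model."""
            rng = np.random.default_rng(seed)
            counts = rng.poisson(lam, size=n_years)                 # loss counts per simulated year
            severities = rng.lognormal(mu, sigma, size=counts.sum())
            year_id = np.repeat(np.arange(n_years), counts)         # year to which each loss belongs
            annual = np.bincount(year_id, weights=severities, minlength=n_years)
            return np.quantile(annual, alpha)

        if __name__ == "__main__":
            # Illustrative parameters: 20 losses per year on average, lognormal(10, 2) severities.
            print(f"0.999 annual-loss quantile: {lda_capital(20, 10.0, 2.0):,.0f}")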
  14. By: Anca Gheorghiu; Ion Spanulescu
    Abstract: In this paper we attempt to introduce an econophysics approach to evaluating some aspects of the risks in financial markets. For this purpose, thermodynamic methods and results from statistical physics concerning entropy and equilibrium states in physical systems are used. Some considerations on economic value and financial information are made. Finally, on this basis, a new index for estimating the financial risk of stock-exchange market transactions, named the macrostate parameter, is introduced and discussed.
    Keywords: econophysics, stock-exchange markets, financial risk, informational fascicle, entropy, macrostate parameter
    Date: 2009–07
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0907.5600&r=rmg
  15. By: Asani Sarkar
    Abstract: In responding to the severity and broad scope of the financial crisis that began in 2007, the Federal Reserve has made aggressive use of both traditional monetary policy instruments and innovative tools in an effort to provide liquidity. In this paper, I examine the Fed's actions in light of the underlying financial amplification mechanisms propagating the crisis--in particular, balance sheet constraints and counterparty credit risk. The empirical evidence supports the Fed's views on the primacy of balance sheet constraints in the earlier stages of the crisis and the increased prominence of counterparty credit risk as the crisis evolved in 2008. I conclude that an understanding of the prevailing risk environment is necessary in order to evaluate when central bank programs are likely to be effective and under what conditions the programs might cease to be necessary.
    Keywords: Credit ; Liquidity (Economics) ; Risk ; Federal Reserve Bank of New York ; Bank supervision
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:fip:fednsr:389&r=rmg
  16. By: Didier Rullière; Diana Dorobantu
    Abstract: The present paper provides a multi-period contagion model in the credit risk field. Our model is an extension of Davis and Lo's infectious default model. We consider an economy of $n$ firms which may default directly or may be infected by another defaulting firm (a domino effect also being possible). Spontaneous defaults without external influence and the infections are described by not necessarily independent Bernoulli-type random variables. Moreover, several contaminations may be necessary to infect another firm. In this paper we compute the probability distribution function of the total number of defaults in a dependency context. We also give a simple recursive algorithm to compute this distribution in an exchangeability context. Numerical applications illustrate the impact of exchangeability among direct defaults and among contaminations on different indicators calculated from the law of the total number of defaults.
    Date: 2009–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0904.1653&r=rmg
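    A minimal Monte Carlo sketch of the original Davis and Lo mechanism that the paper extends: each firm defaults directly with probability p, and every directly defaulting firm infects each other firm independently with probability q. This is the one-period, independent-Bernoulli baseline, not the multi-period dependent extension of the paper; n, p and q are illustrative.

        import numpy as np

        def davis_lo_default_distribution(n_firms, p, q, n_sims=50_000, seed=0):
            """Distribution of the total number of defaults in the one-period Davis-Lo model."""
            rng = np.random.default_rng(seed)
            totals = np.empty(n_sims, dtype=int)
            for k in range(n_sims):
                direct = rng.random(n_firms) < p                    # spontaneous (direct) defaults
                infect = rng.random((n_firms, n_firms)) < q         # infect[j, i]: j would infect i
                infected = (infect & direct[:, None]).any(axis=0)   # hit by at least one defaulter
                totals[k] = int(np.count_nonzero(direct | infected))
            return np.bincount(totals, minlength=n_firms + 1) / n_sims

        if __name__ == "__main__":
            dist = davis_lo_default_distribution(n_firms=10, p=0.05, q=0.10)
            print("P(total defaults = k):", np.round(dist, 4))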
  17. By: Andreas Martin Lisewski
    Abstract: Recurring international financial crises have adverse socioeconomic effects and demand novel regulatory instruments or strategies for risk management and market stabilization. However, the complex web of market interactions often impedes rational decisions that would absolutely minimize the risk. Here we show that, for any given expected return, investors can overcome this complexity and globally minimize their financial risk in portfolio selection models, which is mathematically equivalent to computing the ground state of spin glass models in physics, provided the margin requirement remains below a critical, empirically measurable value. For markets with centrally regulated margin requirements, this result suggests a potentially stabilizing intervention strategy.
    Date: 2009–08
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0908.0682&r=rmg
  18. By: Peter G. Shepard
    Abstract: Managing a portfolio to a risk model can tilt the portfolio toward weaknesses of the model. As a result, the optimized portfolio acquires downside exposure to uncertainty in the model itself, what we call "second order risk." We propose a risk measure that accounts for this bias. Studies of real portfolios, in asset-by-asset and factor model contexts, demonstrate that second order risk contributes significantly to realized volatility, and that the proposed measure accurately forecasts the out-of-sample behavior of optimized portfolios.
    Date: 2009–08
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0908.2455&r=rmg
  19. By: V. Aquaro; M. Bardoscia; R. Bellotti; A. Consiglio; F. De Carlo; G. Ferri
    Abstract: A system for Operational Risk management based on the computational paradigm of Bayesian Networks is presented. The algorithm allows the construction of a Bayesian Network targeted for each bank using only internal loss data, and takes into account in a simple and realistic way the correlations among different processes of the bank. The internal losses are averaged over a variable time horizon, so that the correlations at different times are removed, while the correlations at the same time are kept: the averaged losses are thus suitable to perform the learning of the network topology and parameters. The algorithm has been validated on synthetic time series. It should be stressed that the practical implementation of the proposed algorithm has a small impact on the organizational structure of a bank and requires an investment in human resources limited to the computational area.
    Date: 2009–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0906.3968&r=rmg
  20. By: Alejandro Reveiz; Carlos Léon
    Abstract: Operational Risk (OR) results from endogenous and exogenous risk factors, as diverse and complex to assess as human resources and technology, which may not be properly measured using traditional quantitative approaches. Engineering has faced the same challenges when designing practical solutions to complex multifactor and non-linear systems where human reasoning, expert knowledge or imprecise information are valuable inputs. One of the solutions provided by engineering is a Fuzzy Logic Inference System (FLIS). Although the goal of the FLIS model is OR assessment, assessment is not an end in itself. The choice of a FLIS results in a convenient and sound use of qualitative and quantitative inputs, capable of effectively articulating the identification, assessment, monitoring and mitigation stages of risk management. Unlike traditional approaches, the proposed model allows mitigation efforts to be evaluated ex ante, thus preventing OR sources from being concealed by the build-up of system complexity and optimizing risk management resources. Furthermore, because the model contrasts effective with expected OR data, it is able to constantly validate its outcome, recognize environment shifts and issue warning signals.
    Date: 2009–09–13
    URL: http://d.repec.org/n?u=RePEc:col:000094:005841&r=rmg
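    Purely to make the inference idea above concrete, here is a tiny Mamdani-style fuzzy sketch with two expert-rated inputs (loss frequency and impact) and one risk-score output. The membership functions, rules and scales are invented for illustration and bear no relation to the calibrated system in the paper.

        import numpy as np

        def tri(x, a, b, c):
            """Triangular membership function with support [a, c] and peak at b."""
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        def fuzzy_or_score(frequency, impact):
            """Two-input fuzzy inference for an operational-risk score in [0, 100].

            Inputs are expert ratings in [0, 10]; min inference, max aggregation,
            centroid defuzzification over three output sets (low / medium / high risk).
            """
            f_low, f_high = tri(frequency, -10, 0, 10), tri(frequency, 0, 10, 20)
            i_low, i_high = tri(impact, -10, 0, 10), tri(impact, 0, 10, 20)
            # Rule strengths, e.g. "if frequency is high AND impact is high then risk is high".
            w_low = min(f_low, i_low)
            w_med = max(min(f_low, i_high), min(f_high, i_low))
            w_high = min(f_high, i_high)
            # Output sets on a 0-100 risk scale, clipped by rule strengths, then aggregated.
            z = np.linspace(0, 100, 501)
            agg = np.maximum.reduce([np.minimum(w_low, tri(z, -50, 0, 50)),
                                     np.minimum(w_med, tri(z, 0, 50, 100)),
                                     np.minimum(w_high, tri(z, 50, 100, 150))])
            return float((z * agg).sum() / agg.sum())   # centroid defuzzification

        if __name__ == "__main__":
            print(f"risk score for frequency=7, impact=8: {fuzzy_or_score(7, 8):.1f}")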
  21. By: Jaume Masoliver; Josep Perello
    Abstract: We solve the first-passage problem for the Heston random diffusion model. We obtain exact analytical expressions for the survival and hitting probabilities to a given level of return. We study several asymptotic behaviors and obtain approximate forms of these probabilities which prove, among other interesting properties, the non-existence of a mean first-passage time. One significant result is the evidence of extreme deviations --which implies a high risk of default-- when a certain dimensionless parameter, related to the strength of the volatility fluctuations, increases. We believe that this may provide an effective tool for risk control which can be readily applied to real markets.
    Date: 2009–02
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0902.2735&r=rmg
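    A rough Monte Carlo companion to the analytical results above: simulate the zero-mean return under Heston stochastic volatility with an Euler scheme (full truncation for the variance) and estimate the probability of hitting a given loss level within a horizon. This brute-force estimate only illustrates the kind of hitting probability the paper derives in closed form; all parameter values are illustrative.

        import numpy as np

        def hitting_probability(level, horizon, kappa, theta, xi, rho, v0,
                                n_paths=50_000, n_steps=500, seed=0):
            """P(the return path falls to `level` (a negative threshold) before `horizon` years)."""
            rng = np.random.default_rng(seed)
            dt = horizon / n_steps
            x = np.zeros(n_paths)                    # cumulative zero-mean return
            v = np.full(n_paths, float(v0))          # instantaneous variance
            hit = np.zeros(n_paths, dtype=bool)
            for _ in range(n_steps):
                z1 = rng.standard_normal(n_paths)
                z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n_paths)
                vp = np.maximum(v, 0.0)              # full truncation keeps the variance nonnegative
                x += np.sqrt(vp * dt) * z1
                v += kappa * (theta - vp) * dt + xi * np.sqrt(vp * dt) * z2
                hit |= x <= level
            return hit.mean()

        if __name__ == "__main__":
            p = hitting_probability(level=-0.20, horizon=1.0,
                                    kappa=5.0, theta=0.04, xi=0.6, rho=-0.5, v0=0.04)
            print(f"P(20% drawdown within one year) = {p:.3f}")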
  22. By: Xiaolin Luo; Pavel V. Shevchenko
    Abstract: An efficient adaptive direct numerical integration (DNI) algorithm is developed for computing high quantiles and conditional Value at Risk (CVaR) of compound distributions using characteristic functions. A key innovation of the numerical scheme is an effective tail integration approximation that reduces the truncation errors significantly with little extra effort. High precision results of the 0.999 quantile and CVaR were obtained for compound losses with heavy tails and a very wide range of loss frequencies using the DNI, Fast Fourier Transform (FFT) and Monte Carlo (MC) methods. These results, particularly relevant to operational risk modelling, can serve as benchmarks for comparing different numerical methods. We found that the adaptive DNI can achieve high accuracy with relatively coarse grids. It is much faster than MC and competitive with FFT in computing high quantiles and CVaR of compound distributions in the case of moderate to high frequencies and heavy tails.
    Date: 2009–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0904.0830&r=rmg
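    For comparison with the direct numerical integration scheme above, here is a minimal sketch of the standard FFT alternative mentioned in the abstract: discretize the severity on a lattice, push its discrete Fourier transform through the Poisson probability generating function, and invert to read off the 0.999 quantile of the compound loss. The lognormal severity, grid size and parameters are illustrative, and no tail-integration refinement of the kind proposed in the paper is attempted.

        import numpy as np
        from scipy.stats import lognorm

        def compound_quantile_fft(lam, sigma, scale, alpha=0.999, h=0.01, m=2**16):
            """Quantile of a compound Poisson-lognormal loss via FFT of the characteristic function.

            The severity is discretized on a lattice of spacing h with m points, and the
            compound probabilities follow from the Poisson pgf applied to the severity DFT.
            """
            grid = (np.arange(m) + 0.5) * h
            cell_mass = np.diff(np.concatenate(([0.0], lognorm.cdf(grid, s=sigma, scale=scale))))
            severity_dft = np.fft.fft(cell_mass)
            compound = np.fft.ifft(np.exp(lam * (severity_dft - 1.0))).real
            k = np.searchsorted(np.cumsum(compound), alpha)   # first lattice point beyond alpha
            return k * h

        if __name__ == "__main__":
            # Illustrative parameters: Poisson(10) frequency, lognormal(mu=0, sigma=1) severities.
            print(f"0.999 quantile of the compound loss: {compound_quantile_fft(10, 1.0, 1.0):.2f}")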

This nep-rmg issue is ©2009 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.