
on Risk Management 
By:  Sovan Mitra 
Abstract:  This paper was presented and written for two seminars: a national UK university risk conference and a risk management industry workshop. The target audience is therefore a cross-section of academics and industry professionals. The current ongoing global credit crunch has highlighted the importance of risk measurement in finance to companies and regulators alike. Despite risk measurement's central importance to risk management, few papers exist that review risk measures or follow their evolution from their earliest beginnings up to the present day. This paper reviews the most important portfolio risk measures in financial mathematics, from Bernoulli (1738) to Markowitz's Portfolio Theory, to the presently preferred risk measures such as CVaR (conditional Value at Risk). We provide a chronological review of the risk measures and survey less commonly known risk measures, e.g. the Treynor ratio. 
Date:  2009–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0904.0870&r=rmg 
By:  Chollete, Loran (University of Stavanger); de la Pena, Victor (Columbia University); Lu, Ching-Chih (National Chengchi University) 
Abstract:  . 
Keywords:  Diversification; Downside Risk; Correlation Complexity; Extreme Value; Systemic Risk 
JEL:  C14 F30 G15 
Date:  2009–06–30 
URL:  http://d.repec.org/n?u=RePEc:hhs:stavef:2009_026&r=rmg 
By:  Michael McAleer (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam and Tinbergen Institute and Center for International Research on the Japanese Economy (CIRJE), Faculty of Economics, University of Tokyo); Juan-Angel Jimenez-Martin (Department of Quantitative Economics, Complutense University of Madrid); Teodosio Perez-Amaral (Department of Quantitative Economics, Complutense University of Madrid) 
Abstract:  In this paper we advance the idea that optimal risk management under the Basel II Accord will typically require the use of a combination of different models of risk. This idea is illustrated by analyzing the best empirical models of risk for five stock indexes before, during, and after the 2008-09 financial crisis. The data used are the Dow Jones Industrial Average, Financial Times Stock Exchange 100, Nikkei, Hang Seng and Standard and Poor's 500 Composite Index. The primary goal of the exercise is to identify the best models for risk management in each period according to the minimization of average daily capital requirements under the Basel II Accord. It is found that the best risk models can and do vary before, during and after the 2008-09 financial crisis. Moreover, it is found that an aggressive risk management strategy, namely the supremum strategy that combines different models of risk, can result in significant gains in average daily capital requirements, relative to the strategy of using single models, while staying within the limits of the Basel II Accord. 
Date:  2009–09 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2009cf667&r=rmg 
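A minimal Python sketch (not the authors' code) of the two ingredients discussed in the abstract above: the Basel II daily market-risk capital requirement, which is the larger of today's VaR and (3 + k) times the 60-day average VaR, where the penalty k depends on the violation count, and a per-day combination of several models' VaR forecasts. Function and parameter names are illustrative assumptions:

```python
import numpy as np

def basel_capital_charge(var_forecasts, penalty_k=0.0):
    """Daily market-risk capital charge under Basel II: the larger of
    today's VaR and (3 + k) times the mean VaR over the previous 60
    days, where k in [0, 1] is the backtesting violation penalty."""
    v = np.asarray(var_forecasts, dtype=float)
    charges = []
    for t in range(60, len(v)):
        charges.append(max(v[t], (3.0 + penalty_k) * v[t - 60:t].mean()))
    return np.array(charges)

def combine_var(model_forecasts, how="supremum"):
    """Combine daily VaR forecasts from several candidate risk models
    by taking the per-day supremum or infimum across models."""
    m = np.asarray(model_forecasts, dtype=float)
    return m.max(axis=0) if how == "supremum" else m.min(axis=0)
```

For a constant VaR series the 60-day average term dominates, so the charge is three times the daily VaR when no penalty applies; combining models simply reduces to an elementwise max or min over the forecast matrix.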
By:  Juan-Angel Jimenez-Martin (Dpto. de Fundamentos de Análisis Económico II, Universidad Complutense); Michael McAleer; Teodosio Pérez-Amaral (Dpto. de Fundamentos de Análisis Económico II, Universidad Complutense) 
Abstract:  When dealing with market risk under the Basel II Accord, variation pays in the form of lower capital requirements and higher profits. Typically, GARCH-type models are chosen to forecast Value-at-Risk (VaR) using a single risk model. In this paper we illustrate two useful variations to the standard mechanism for choosing forecasts, namely: (i) combining different forecast models for each period, such as a daily model that forecasts the supremum or infimum value of the VaR; (ii) alternatively, selecting a single model to forecast VaR, and then modifying the daily forecast depending on the recent history of violations under the Basel II Accord. We illustrate these points using the Standard and Poor’s 500 Composite Index. In many cases we find significant decreases in the capital requirements, while incurring a number of violations that stays within the Basel II Accord limits. 
Date:  2009 
URL:  http://d.repec.org/n?u=RePEc:ucm:doicae:0919&r=rmg 
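The two variations described above can be sketched in Python under simplifying assumptions: a GARCH(1,1) filter with fixed, illustrative parameters (in practice they would be estimated) producing a one-step-ahead VaR, and a violation-dependent adjustment whose threshold and scaling factor are our assumptions, not the authors':

```python
import numpy as np

def garch11_var_forecast(returns, omega=1e-6, alpha=0.05, beta=0.90,
                         z=1.645):
    """One-step-ahead 95% VaR from a GARCH(1,1) variance filter with
    fixed illustrative parameters; VaR is returned as a positive loss
    figure under a Gaussian innovation assumption."""
    r = np.asarray(returns, dtype=float)
    sigma2 = r.var()  # initialize at the sample variance
    for x in r:
        sigma2 = omega + alpha * x ** 2 + beta * sigma2
    return z * np.sqrt(sigma2)

def adjust_for_violations(var_forecast, violations_250d,
                          green_limit=4, scale=1.25):
    """Inflate the daily VaR forecast once the rolling 250-day
    violation count approaches the Basel II green-zone boundary."""
    return var_forecast * scale if violations_250d >= green_limit else var_forecast
```

The adjustment function captures the mechanism only in spirit: a model that has recently violated often is made more conservative, trading higher capital for staying within the Accord's violation limits.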
By:  Ojo, Marianne 
Abstract:  This paper addresses factors which have prompted the need for further revision of banking regulation, with particular reference to the Capital Requirements Directive. The Capital Requirements Directive (CRD), which comprises the 2006/48/EC Directive on the taking up and pursuit of the business of credit institutions and the 2006/49/EC Directive on the capital adequacy of investment firms and credit institutions, implemented the revised framework for the International Convergence of Capital Measurement and Capital Standards (Basel II) within EU member states. Pro-cyclicality has attracted considerable attention, particularly with regard to the recent financial crisis, owing to concerns arising from increased sensitivity to credit risk under Basel II. This paper considers not only whether such concerns are well-founded, but also the beneficial and not-so-beneficial consequences emanating from Basel II’s increased sensitivity to credit risk (as illustrated by the Internal Ratings Based approaches). In so doing, it considers the effects of Pillar 2 of Basel II, namely supervisory review, with particular reference to buffer levels, and whether banks’ actual capital ratios can be expected to correspond with Basel capital requirements, given that they are expected to hold certain capital buffers under Pillar 2. Furthermore, it considers how regulators can respond to prevent systemic risks to the financial system during periods when highly leveraged firms become reluctant to lend. In deciding to cut back on lending activities, are the decisions of such firms justified in situations where their credit risk models are extremely and unduly sensitive, so that the level of capital being retained is actually much higher than the minimum regulatory Basel capital requirements? 
Keywords:  Basel II; Capital Requirements Directive; pro-cyclicality; risk; regulation; banks 
JEL:  E0 D0 K2 E5 E3 
Date:  2009–09–18 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:17379&r=rmg 
By:  René M. Stulz 
Abstract:  Many observers have argued that credit default swaps contributed significantly to the credit crisis. Of particular concern to these observers are that credit default swaps trade in the largely unregulated over-the-counter market as bilateral contracts involving counterparty risk and that they facilitate speculation involving negative views of a firm's financial strength. Some observers have suggested that credit default swaps would not have made the crisis worse had they been traded on exchanges. I conclude that credit default swaps did not cause the dramatic events of the credit crisis, that the over-the-counter credit default swaps market worked well during much of the first year of the credit crisis, and that exchange trading has both advantages and costs compared to over-the-counter trading. Though I argue that eliminating over-the-counter trading of credit default swaps could reduce social welfare, I also recognize that much research is needed to understand better and quantify the social gains and costs of derivatives in general and credit default swaps in particular. 
JEL:  G13 G14 G18 G21 G24 G28 
Date:  2009–09 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:15384&r=rmg 
By:  Josep J. Masdemont; Luis OrtizGracia 
Abstract:  This paper proposes a new methodology to compute Value at Risk (VaR) for quantifying losses in credit portfolios. We approximate the cumulative distribution of the loss function by a finite combination of Haar wavelet basis functions and calculate the coefficients of the approximation by inverting its Laplace transform. In fact, we demonstrate that only a few coefficients of the approximation are needed, so VaR can be computed quickly. To test the methodology we consider the Vasicek one-factor portfolio credit loss model as our model framework. The Haar wavelet method is fast, accurate and robust in dealing with small or concentrated portfolios, where the hypotheses behind the Basel II formulas are violated. 
Date:  2009–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0904.4620&r=rmg 
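For reference, the Vasicek one-factor model used as the test framework above can be simulated directly; the Monte Carlo sketch below (our illustration, not the paper's wavelet method) computes the VaR of the fractional loss of a homogeneous portfolio with unit exposures and 100% LGD:

```python
import random
from statistics import NormalDist

def vasicek_loss_var(pd, rho, n_obligors, alpha=0.999, n_sims=5000, seed=1):
    """Monte Carlo VaR of the fractional portfolio loss in the Vasicek
    one-factor model: conditional on the systematic factor Y, obligors
    default independently with probability
        p(Y) = Phi((Phi^{-1}(pd) - sqrt(rho) * Y) / sqrt(1 - rho))."""
    nd = NormalDist()
    k = nd.inv_cdf(pd)                 # default threshold
    rng = random.Random(seed)
    losses = []
    for _ in range(n_sims):
        y = rng.gauss(0.0, 1.0)        # systematic factor draw
        p_y = nd.cdf((k - rho ** 0.5 * y) / (1.0 - rho) ** 0.5)
        # finite portfolio: binomial number of defaults given the factor
        defaults = sum(rng.random() < p_y for _ in range(n_obligors))
        losses.append(defaults / n_obligors)
    losses.sort()
    return losses[min(int(alpha * n_sims), n_sims - 1)]
```

A small or concentrated portfolio (low `n_obligors`) makes the loss distribution granular, which is exactly the regime where the asymptotic Basel II formulas break down and methods like the paper's become relevant.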
By:  Chollete, Loran (University of Stavanger); Pena, Victor de la (Columbia University); Lu, Ching-Chih (National Chengchi University) 
Abstract:  . 
Keywords:  Diversification; Copula; Correlation Complexity; Downside Risk; Systemic Risk 
JEL:  C14 F30 G15 
Date:  2009–06–29 
URL:  http://d.repec.org/n?u=RePEc:hhs:stavef:2009_027&r=rmg 
By:  K. Rajaratnam 
Abstract:  The credit crisis of 2007 and 2008 has thrown much focus on the models used to price mortgage-backed securities. Many institutions have relied heavily on the credit ratings provided by credit agencies. The relationships between the management of credit agencies and debt issuers may have resulted in conflicts of interest when pricing these securities, which has led to incorrect risk assumptions and value expectations on the part of institutional buyers. Despite the existence of sophisticated models, institutional buyers have relied on these ratings when considering the risks involved with these products. Institutional investors interested in non-agency MBS are particularly vulnerable due to both credit risks and prepayment risks. This paper describes a simple simulation model for non-agency MBS and CMOs. The simulation model builds on existing models for agency MBS. It incorporates the credit risks of mortgage borrowers using existing models used for capital requirements as specified by the Basel II Accord. 
Date:  2009–03 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0903.1643&r=rmg 
By:  Igor Halperin; Pascal Tomecek 
Abstract:  In the top-down approach to multi-name credit modeling, calculation of single-name sensitivities appears possible, at least in principle, within the so-called random thinning (RT) procedure which dissects the portfolio risk into individual contributions. We make an attempt to construct a practical RT framework that enables efficient calculation of single-name sensitivities in a top-down framework, and can be extended to valuation and risk management of bespoke tranches. Furthermore, we propose a dynamic extension of the RT method that enables modeling of both idiosyncratic and default-contingent individual spread dynamics within a Monte Carlo setting in a way that preserves the portfolio "top"-level dynamics. This results in a model that is not only calibrated to tranche and single-name spreads, but can also be tuned to approximately match given levels of spread volatilities and correlations of names in the portfolio. 
Date:  2009–01 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0901.3404&r=rmg 
By:  Juan-Pablo Ortega; Rainer Pullirsch; Josef Teichmann; Julian Wergieluk 
Abstract:  We provide a new dynamic approach to scenario generation for the purposes of risk management in the banking industry. We connect ideas from conventional techniques, such as historical and Monte Carlo simulation, and come up with a hybrid method that shares the advantages of standard procedures but eliminates several of their drawbacks. Instead of considering the static problem of constructing one- or ten-day-ahead distributions for vectors of risk factors, we embed the problem in a dynamic framework, where any time horizon can be consistently simulated. Additionally, we use standard models from mathematical finance for each risk factor, thereby bridging the worlds of trading and risk management. Our approach is based on stochastic differential equations (SDEs), like the HJM equation or the Black-Scholes equation, governing the time evolution of risk factors, on an empirical calibration method to the market for the chosen SDEs, and on an Euler scheme (or high-order schemes) for the numerical evaluation of the respective SDEs. The empirical calibration procedure presented in this paper can be seen as the SDE counterpart of the so-called Filtered Historical Simulation method; the behavior of volatility stems, in our case, from the assumptions on the underlying SDEs. Furthermore, we are able to easily incorporate "middle-size" and "large-size" events within our framework, always making a precise distinction between the information obtained from the market and that coming from the necessary a priori intuition of the risk manager. Results of one concrete implementation are provided. 
Date:  2009–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0904.0624&r=rmg 
By:  Jiří Witzany (University of Economics, Prague, Czech Republic) 
Abstract:  The paper proposes a new method to estimate the correlation of account-level Basel II Loss Given Default (LGD). The correlation determines the probability distribution of portfolio-level LGD in the context of a copula model which is used to stress the LGD parameter as well as to estimate the LGD discount rate and other parameters. Given historical LGD observations, we apply the maximum likelihood method to estimate the best correlation parameter. The method is applied and analyzed on a real, large data set of unsecured retail account-level LGDs and the corresponding monthly series of the average LGDs. The correlation estimate comes relatively close to the PD regulatory correlation. It is also tested for stability using the bootstrapping method and used in an efficient formula to estimate ex ante one-year stressed LGD, i.e. one-year LGD quantiles at any reasonable probability level. 
Keywords:  credit risk, recovery rate, loss given default, correlation, regulatory capital 
JEL:  G21 G28 C14 
Date:  2009–09 
URL:  http://d.repec.org/n?u=RePEc:fau:wpaper:wp2009_21&r=rmg 
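The abstract does not reproduce the stressed-LGD formula itself; under the one-factor Gaussian copula setting it describes, a Vasicek-style quantile of portfolio-level LGD would take the following form. Both the formula and the parameter names are our assumption for illustration, not the author's exact expression:

```python
from statistics import NormalDist

def stressed_lgd(mean_lgd, rho, alpha=0.999):
    """Alpha-quantile of portfolio-level LGD under a one-factor
    Gaussian copula with correlation rho (Vasicek-style stress
    formula, analogous to the Basel II PD stressing)."""
    nd = NormalDist()
    return nd.cdf((nd.inv_cdf(mean_lgd) + rho ** 0.5 * nd.inv_cdf(alpha))
                  / (1.0 - rho) ** 0.5)
```

As expected, the stressed figure exceeds the mean LGD and grows with the correlation parameter, which is why estimating that correlation well matters for the capital calculation.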
By:  Pavel V. Shevchenko 
Abstract:  To quantify the operational risk capital charge under the current regulatory framework for banking supervision, referred to as Basel II, many banks adopt the Loss Distribution Approach. There are many modeling issues that should be resolved in order to use the approach in practice. In this paper we review the quantitative methods suggested in the literature for implementation of the approach. In particular, we discuss the use of the Bayesian inference method, which allows expert judgement and parameter uncertainty to be taken into account, the modeling of dependence, and the inclusion of insurance. 
Date:  2009–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0904.1805&r=rmg 
By:  Anca Gheorghiu; Ion Spanulescu 
Abstract:  In this paper we attempt to introduce an econophysics approach to evaluate some aspects of the risks in financial markets. For this purpose, thermodynamic methods and statistical physics results about entropy and equilibrium states in physical systems are used. Some considerations on economic value and financial information are made. Finally, on this basis, a new index for the financial risk estimation of stock-exchange market transactions, named the macrostate parameter, is introduced and discussed. 
Keywords:  econophysics, stock-exchange markets, financial risk, informational fascicle, entropy, macrostate parameter 
Date:  2009–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0907.5600&r=rmg 
By:  Asani Sarkar 
Abstract:  In responding to the severity and broad scope of the financial crisis that began in 2007, the Federal Reserve has made aggressive use of both traditional monetary policy instruments and innovative tools in an effort to provide liquidity. In this paper, I examine the Fed's actions in light of the underlying financial amplification mechanisms propagating the crisis, in particular balance sheet constraints and counterparty credit risk. The empirical evidence supports the Fed's views on the primacy of balance sheet constraints in the earlier stages of the crisis and the increased prominence of counterparty credit risk as the crisis evolved in 2008. I conclude that an understanding of the prevailing risk environment is necessary in order to evaluate when central bank programs are likely to be effective and under what conditions the programs might cease to be necessary. 
Keywords:  Credit ; Liquidity (Economics) ; Risk ; Federal Reserve Bank of New York ; Bank supervision 
Date:  2009 
URL:  http://d.repec.org/n?u=RePEc:fip:fednsr:389&r=rmg 
By:  Didier Rullière; Diana Dorobantu 
Abstract:  The present paper provides a multi-period contagion model in the credit risk field. Our model is an extension of Davis and Lo's infectious default model. We consider an economy of $n$ firms which may default directly or may be infected by another defaulting firm (a domino effect also being possible). The spontaneous defaults without external influence and the infections are described by not necessarily independent Bernoulli-type random variables. Moreover, several contaminations could be necessary to infect another firm. In this paper we compute the probability distribution function of the total number of defaults in a dependency context. We also give a simple recursive algorithm to compute this distribution in an exchangeability context. Numerical applications illustrate the impact of exchangeability among direct defaults and among contaminations on different indicators calculated from the law of the total number of defaults. 
Date:  2009–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0904.1653&r=rmg 
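The original Davis-Lo mechanism that this paper extends is easy to simulate. The sketch below uses independent Bernoulli variables and a single round of contagion (the paper generalizes to dependent variables, multiple required contaminations and multiple periods); all parameter names are illustrative:

```python
import random

def davis_lo_defaults(n, p, q, n_sims=20000, seed=0):
    """Monte Carlo distribution of the total number of defaults in the
    Davis-Lo infectious default model: each of n firms defaults
    directly with probability p; each direct defaulter independently
    infects each surviving firm with probability q (one contagion
    round). Returns P(total = k) for k = 0..n."""
    rng = random.Random(seed)
    counts = [0] * (n + 1)
    for _ in range(n_sims):
        direct = [rng.random() < p for _ in range(n)]
        total = 0
        for i in range(n):
            if direct[i]:
                total += 1
            else:
                # firm i is infected if any direct defaulter contaminates it
                infected = any(direct[j] and rng.random() < q
                               for j in range(n) if j != i)
                total += infected
        counts[total] += 1
    return [c / n_sims for c in counts]
```

Even a small infection probability fattens the right tail relative to independent defaults, which is the qualitative effect the paper's exchangeability analysis quantifies.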
By:  Andreas Martin Lisewski 
Abstract:  Recurring international financial crises have adverse socioeconomic effects and demand novel regulatory instruments or strategies for risk management and market stabilization. However, the complex web of market interactions often impedes rational decisions that would absolutely minimize the risk. Here we show that, for any given expected return, investors can overcome this complexity and globally minimize their financial risk in portfolio selection models, which is mathematically equivalent to computing the ground state of spin glass models in physics, provided the margin requirement remains below a critical, empirically measurable value. For markets with centrally regulated margin requirements, this result suggests a potentially stabilizing intervention strategy. 
Date:  2009–08 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0908.0682&r=rmg 
By:  Peter G. Shepard 
Abstract:  Managing a portfolio to a risk model can tilt the portfolio toward weaknesses of the model. As a result, the optimized portfolio acquires downside exposure to uncertainty in the model itself, what we call "second order risk." We propose a risk measure that accounts for this bias. Studies of real portfolios, in asset-by-asset and factor model contexts, demonstrate that second order risk contributes significantly to realized volatility, and that the proposed measure accurately forecasts the out-of-sample behavior of optimized portfolios. 
Date:  2009–08 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0908.2455&r=rmg 
By:  V. Aquaro; M. Bardoscia; R. Bellotti; A. Consiglio; F. De Carlo; G. Ferri 
Abstract:  A system for Operational Risk management based on the computational paradigm of Bayesian Networks is presented. The algorithm allows the construction of a Bayesian Network targeted for each bank using only internal loss data, and takes into account in a simple and realistic way the correlations among different processes of the bank. The internal losses are averaged over a variable time horizon, so that the correlations at different times are removed, while the correlations at the same time are kept: the averaged losses are thus suitable to perform the learning of the network topology and parameters. The algorithm has been validated on synthetic time series. It should be stressed that the practical implementation of the proposed algorithm has a small impact on the organizational structure of a bank and requires an investment in human resources limited to the computational area. 
Date:  2009–06 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0906.3968&r=rmg 
By:  Alejandro Reveiz; Carlos Léon 
Abstract:  Operational Risk (OR) results from endogenous and exogenous risk factors, as diverse and complex to assess as human resources and technology, which may not be properly measured using traditional quantitative approaches. Engineering has faced the same challenges when designing practical solutions to complex multifactor and nonlinear systems where human reasoning, expert knowledge or imprecise information are valuable inputs. One of the solutions provided by engineering is a Fuzzy Logic Inference System (FLIS). Although the goal of the FLIS model for OR is its assessment, assessment is not an end in itself. The choice of a FLIS results in a convenient and sound use of qualitative and quantitative inputs, capable of effectively articulating the identification, assessment, monitoring and mitigation stages of risk management. Unlike traditional approaches, the proposed model allows mitigation efforts to be evaluated ex ante, thus avoiding concealed OR sources arising from system complexity build-up and optimizing risk management resources. Furthermore, because the model contrasts actual with expected OR data, it is able to constantly validate its outcome, recognize environment shifts and issue warning signals. 
Date:  2009–09–13 
URL:  http://d.repec.org/n?u=RePEc:col:000094:005841&r=rmg 
By:  Jaume Masoliver; Josep Perello 
Abstract:  We solve the first-passage problem for the Heston random diffusion model. We obtain exact analytical expressions for the survival and hitting probabilities to a given level of return. We study several asymptotic behaviors and obtain approximate forms of these probabilities which prove, among other interesting properties, the nonexistence of a mean first-passage time. One significant result is the evidence of extreme deviations, which implies a high risk of default when a certain dimensionless parameter, related to the strength of the volatility fluctuations, increases. We believe that this may provide an effective tool for risk control, readily applicable to real markets. 
Date:  2009–02 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0902.2735&r=rmg 
By:  Xiaolin Luo; Pavel V. Shevchenko 
Abstract:  An efficient adaptive direct numerical integration (DNI) algorithm is developed for computing high quantiles and conditional Value at Risk (CVaR) of compound distributions using characteristic functions. A key innovation of the numerical scheme is an effective tail integration approximation that reduces the truncation errors significantly with little extra effort. High precision results of the 0.999 quantile and CVaR were obtained for compound losses with heavy tails and a very wide range of loss frequencies using the DNI, Fast Fourier Transform (FFT) and Monte Carlo (MC) methods. These results, particularly relevant to operational risk modelling, can serve as benchmarks for comparing different numerical methods. We found that the adaptive DNI can achieve high accuracy with relatively coarse grids. It is much faster than MC and competitive with FFT in computing high quantiles and CVaR of compound distributions in the case of moderate to high frequencies and heavy tails. 
Date:  2009–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0904.0830&r=rmg 
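Of the three methods benchmarked above, plain Monte Carlo is the simplest to state. The sketch below estimates the 0.999 quantile (VaR) and CVaR of a compound Poisson loss S = X_1 + ... + X_N with N ~ Poisson(lam); the lognormal severity and all parameter values are illustrative, and no attempt is made at the paper's tail-integration refinements:

```python
import math
import random

def compound_quantile_cvar(lam, sev_sampler, alpha=0.999,
                           n_sims=20000, seed=7):
    """MC estimate of the alpha-quantile (VaR) and CVaR of a compound
    Poisson loss. sev_sampler(rng) draws one severity."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_sims):
        # sample N ~ Poisson(lam) by CDF inversion (fine for moderate lam)
        n, u = 0, rng.random()
        p = math.exp(-lam)
        c = p
        while u > c:
            n += 1
            p *= lam / n
            c += p
        totals.append(sum(sev_sampler(rng) for _ in range(n)))
    totals.sort()
    idx = min(int(alpha * n_sims), n_sims - 1)
    var = totals[idx]
    tail = totals[idx:]
    return var, sum(tail) / len(tail)   # CVaR = mean loss beyond VaR
```

With a heavy-tailed severity the tail beyond the 0.999 quantile contains very few simulated points, which is precisely why MC is slow to converge there and why the paper's direct numerical integration is attractive as a benchmark-quality alternative.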