nep-rmg New Economics Papers
on Risk Management
Issue of 2011‒02‒26
fifteen papers chosen by
Stan Miles
Thompson Rivers University

  1. Macro, Industry and Frailty Effects in Defaults: The 2008 Credit Crisis in Perspective By Siem Jan Koopman; Andre Lucas; Bernd Schwaab
  2. Market Timing, Investment, and Risk Management By Patrick Bolton; Hui Chen; Neng Wang
  3. Applying hedging strategies to estimate model risk and provision calculation By Alberto Elices; Eduard Giménez
  4. Transition Probability Matrix Methodology for Incremental Risk Charge By Tzahi Yavin; Hu Zhang; Eugene Wang; Michael A. Clayton
  5. A Random Matrix Approach on Credit Risk By Michael C. Münnix; Rudi Schäfer; Thomas Guhr
  6. Dependence of defaults and recoveries in structural credit risk models By Rudi Schäfer; Alexander F. R. Koivusalo
  7. Longevity risk and capital markets: The 2009-2010 update By Blake, David; Brockett, Patrick; Cox, Samuel; MacMinn, Richard
  8. Cash Flow and Discount Rate Risk in Up and Down Markets: What is actually priced? By Mahmoud Botshekan; Roman Kraeussl; Andre Lucas
  9. Banking risk and regulation: Does one size fit all? By Jeroen Klomp
  10. Analytic Loss Distributional Approach Model for Operational Risk from the alpha-Stable Doubly Stochastic Compound Processes and Implications for Capital Allocation By Gareth W. Peters; Pavel Shevchenko; Mark Young; Wendy Yip
  11. The Downside Risk of Heavy Tails induces Low Diversification By Namwon Hyung; Casper G. de Vries
  12. Two-way interplays between capital buffers, credit and output: evidence from French banks By Coffinet, J.; Coudert, V.; Pop, A.; Pouvelle, C.
  13. A Class of Adaptive EM-based Importance Sampling Algorithms for Efficient and Robust Posterior and Predictive Simulation By Lennart Hoogerheide; Anne Opschoor; Herman K. van Dijk
  14. Searching For Lost Decades By Blake LeBaron
  15. Black swans or dragon kings? A simple test for deviations from the power law By Joanna Janczura; Rafal Weron

  1. By: Siem Jan Koopman (VU University Amsterdam); Andre Lucas (VU University Amsterdam); Bernd Schwaab (VU University Amsterdam)
    Abstract: We determine the magnitude and nature of systematic default risk using 1971–2009 default data from Moody's. We disentangle systematic risk factors due to business cycle effects, common default dynamics (frailty), and industry-specific dynamics (including contagion). To quantify the contribution of each of these factors to default rate volatility we introduce a new and flexible model class for factor structures on non-Gaussian (defaults) and Gaussian (macro factors) data simultaneously. We find that all three types of risk factors (macro, frailty, industry/contagion) are important for default risk. The systematic risk factors account for roughly one third of observed default risk variation. Half of this is captured by macro and financial market factors. The remainder is captured by frailty and industry effects (in roughly equal proportions). The frailty components are particularly relevant in times of stress. Models based only on macro variables may both under-estimate and over-estimate default activity during such times. This indicates that frailty factors do not simply capture missed non-linear responses of defaults to business cycle dynamics. We also find significant differences in the impact of crises on defaults at the sectoral level, implying frailty as well as contagion may play a role in systematic default clustering. Finally, we show that the contribution of frailty and industry factors on top of macro factors is economically significant for assessing portfolio risk.
    Keywords: systematic default risk; credit portfolio models; mixed-measurement dynamic factor model; frailty-correlated defaults; state space methods; dynamic credit risk management
    JEL: G21 C33
    Date: 2010–01–28
  2. By: Patrick Bolton; Hui Chen; Neng Wang
    Abstract: Firms face uncertain financing conditions and are exposed to the risk of a sudden rise in financing costs during financial crises. We develop a tractable model of dynamic corporate financial management (cash accumulation, investment, equity issuance, risk management, and payout policies) for a financially constrained firm facing time-varying external financing costs. Firms value financial slack and build cash reserves to mitigate financial constraints. However, uncertainty about future financing opportunities also induces firms to rationally time the equity market, even if they have no immediate needs for cash. The stochastic financing conditions have rich implications for investment and risk management: (1) investment can be decreasing in financial slack; (2) firms may invest less as expected future financing costs fall; (3) investment-cash sensitivity, marginal value of cash, and firm's risk premium can all be non-monotonic in cash holdings; (4) speculation (as opposed to hedging) can be value-maximizing for financially constrained firms.
    JEL: E22 G12 G3
    Date: 2011–02
  3. By: Alberto Elices; Eduard Giménez
    Abstract: This paper introduces a new measure based on hedging strategies to estimate model risk and calculate provisions under uncertainty of volatility. This measure allows comparing different products and models (pricing hypotheses) under a homogeneous framework and concluding which one is best. The model risk measure is defined in terms of the expected value and standard deviation of the loss given by the hedging strategy at a given time horizon. It has been assumed that the market volatility surface is driven by Heston's dynamics calibrated to market for a given time horizon. The method is applied to estimate and compare model risk under vanna-volga and Black-Scholes models for double-no-touch options and a portfolio of forward fader options.
    Date: 2011–02
  4. By: Tzahi Yavin; Hu Zhang; Eugene Wang; Michael A. Clayton
    Abstract: As part of Basel II's incremental risk charge (IRC) methodology, this paper summarizes our extensive investigations of constructing transition probability matrices (TPMs) for unsecuritized credit products in the trading book. The objective is to create monthly or quarterly TPMs with predefined sectors and ratings that are consistent with the bank's Basel PDs. Constructing a TPM is not a unique process. We highlight various aspects of three types of uncertainties embedded in different construction methods: 1) the available historical data and the bank's rating philosophy; 2) the merger of one-year Basel PD and the chosen Moody's TPMs; and 3) deriving a monthly or quarterly TPM when the generator matrix does not exist. Given the fact that TPMs and specifically their PDs are the most important parameters in IRC, it is our view that banks may need to make discretionary choices regarding their methodology, with uncertainties well understood and managed.
    Date: 2011–02
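    The third uncertainty above, deriving a monthly or quarterly TPM when a valid generator matrix need not exist, can be illustrated with a standard regularization trick. The sketch below uses a hypothetical three-state rating system and SciPy's matrix logarithm; it is not the authors' construction, only a minimal version of the idea: take the matrix log of the annual TPM, repair it into a valid generator, and exponentiate a fraction of it.

    ```python
    import numpy as np
    from scipy.linalg import expm, logm

    def monthly_tpm(annual_tpm):
        """Derive a monthly TPM as expm(G/12), where G = logm(P_annual).

        logm(P) need not be a valid generator (nonnegative off-diagonals,
        zero row sums); a common repair is to clip negative off-diagonal
        entries and rebalance the diagonal.
        """
        G = np.real(logm(annual_tpm))
        off = G - np.diag(np.diag(G))
        off[off < 0] = 0.0                      # clip invalid transition rates
        G = off - np.diag(off.sum(axis=1))      # restore zero row sums
        return expm(G / 12.0)

    # Hypothetical 3-state annual TPM: investment grade, speculative, default.
    P = np.array([[0.95, 0.04, 0.01],
                  [0.10, 0.85, 0.05],
                  [0.00, 0.00, 1.00]])
    P_month = monthly_tpm(P)
    print(P_month.round(4))
    ```

    Compounding the repaired monthly matrix twelve times recovers the annual TPM only approximately; the size of that gap is one way to quantify the third source of uncertainty the authors highlight.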
  5. By: Michael C. Münnix; Rudi Schäfer; Thomas Guhr
    Abstract: We consider a structural model for the estimation of credit risk based on Merton's original model. By using random matrix theory we demonstrate analytically that the presence of correlations severely limits the effect of diversification in a credit portfolio unless the correlations are identically zero. The existence of correlations alters the tails of the loss distribution tremendously, even if their average is zero. Under the assumption of randomly fluctuating correlations, a lower bound for the estimation of the loss distribution is provided.
    Date: 2011–02
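    The qualitative effect the authors derive analytically can be illustrated by simulation with a standard one-factor Gaussian model (not their random matrix approach): moderate correlation leaves the mean default rate unchanged but fattens the tail of the portfolio loss distribution dramatically. All parameters below are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    K, n_sim = 100, 50_000                 # obligors, simulated portfolios
    threshold = -2.0                       # default if asset return < threshold

    def portfolio_losses(rho):
        # One-factor model: X_k = sqrt(rho)*Z + sqrt(1-rho)*eps_k.
        Z = rng.standard_normal(n_sim)[:, None]
        eps = rng.standard_normal((n_sim, K))
        X = np.sqrt(rho) * Z + np.sqrt(1.0 - rho) * eps
        return (X < threshold).mean(axis=1)   # fraction of obligors defaulting

    L_indep = portfolio_losses(0.0)
    L_corr = portfolio_losses(0.3)
    for name, L in (("rho=0.0", L_indep), ("rho=0.3", L_corr)):
        print(name, round(L.mean(), 4), round(np.quantile(L, 0.999), 3))
    ```

    Both portfolios have the same expected loss, but the 99.9% loss quantile of the correlated portfolio is several times larger: diversification across the 100 obligors fails exactly as the abstract describes.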
  6. By: Rudi Schäfer; Alexander F. R. Koivusalo
    Abstract: The current research on credit risk is primarily focused on modeling default probabilities. Recovery rates are often treated as an afterthought; they are modeled independently, and in many cases they are even assumed constant. This is despite their pronounced effect on the tail of the loss distribution. Here, we take a step back, historically, and start again from the Merton model, where defaults and recoveries are both determined by an underlying process. Hence, they are intrinsically connected. For the diffusion process, we can derive the functional relation between expected recovery rate and default probability. This relation depends on a single parameter only. In Monte Carlo simulations we find that the same functional dependence also holds for jump-diffusion and GARCH processes. We discuss how to incorporate this structural recovery rate into reduced-form models, in order to restore essential structural information which is usually neglected in the reduced-form approach.
    Date: 2011–02
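    The diffusion case mentioned in the abstract is easy to reproduce by Monte Carlo. The sketch below (illustrative parameters, not the authors' calibration) draws terminal firm values under geometric Brownian motion; a default and its recovery rate then come from the same draw, which is the intrinsic connection the paper exploits.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    mu, sigma, T, n = 0.05, 0.30, 1.0, 200_000   # illustrative asset dynamics
    V0 = 1.0

    # Terminal firm value under geometric Brownian motion (diffusion case).
    VT = V0 * np.exp((mu - 0.5 * sigma**2) * T
                     + sigma * np.sqrt(T) * rng.standard_normal(n))

    pds, recoveries = [], []
    for F in (0.5, 0.7, 0.9):                    # face value of debt (leverage)
        default = VT < F
        pds.append(default.mean())
        recoveries.append((VT[default] / F).mean())   # recovery given default
        print(f"F={F}: PD={pds[-1]:.3f}, E[recovery | default]={recoveries[-1]:.3f}")
    ```

    Sweeping the leverage F traces out a joint movement of default probability and expected recovery from a single underlying process, rather than treating the two as independent inputs.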
  7. By: Blake, David; Brockett, Patrick; Cox, Samuel; MacMinn, Richard
    Abstract: This Special Issue of the North American Actuarial Journal contains ten contributions to the academic literature all dealing with longevity risk and capital markets. Draft versions of the papers were presented at Longevity Five: the Fifth International Longevity Risk and Capital Markets Solutions Conference that was held in New York on 25-26 September 2009. It was hosted by J. P. Morgan and St John’s University and organized by the Pensions Institute at Cass Business School, London, and the Edmondson-Miller Chair at Illinois State University.
    Keywords: Longevity Risk; Capital Market
    JEL: G23
    Date: 2011–02
  8. By: Mahmoud Botshekan (VU University Amsterdam); Roman Kraeussl (VU University Amsterdam); Andre Lucas (VU University Amsterdam)
    Abstract: We test whether asymmetric preferences for losses versus gains as in Ang, Chen, and Xing (2006) also affect the pricing of cash flow versus discount rate news as in Campbell and Vuolteenaho (2004). We construct a new four-fold beta decomposition, distinguishing cash flow and discount rate betas in up and down markets. Using CRSP data over 1963–2008, we find that the downside cash flow beta and downside discount rate beta carry the largest premia. We subject our result to an extensive number of robustness checks. Overall, downside cash flow risk is priced most consistently across different samples, periods, and return decomposition methods, and is the only component of beta that has significant out-of-sample predictive ability. The downside cash flow risk premium is mainly attributable to small stocks. The risk premium for large stocks appears much more driven by a compensation for symmetric, cash flow related risk. Finally, we multiply our premia estimates by average betas to compute the contribution of the different risk components to realized average returns. We find that up and down discount rate components dominate the contribution to average returns of downside cash flow risk.
    Keywords: asset pricing; beta; downside risk; upside risk; cash flow risk; discount rate risk
    JEL: G11 G12 G14
    Date: 2010–11–25
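    The full four-fold decomposition requires extracting cash flow and discount rate news (e.g. from a return VAR), but the up/down conditioning at its core is simple. A sketch on synthetic data, where the stock is given extra downside exposure by construction so the two conditional betas separate:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 2_000
    mkt = 0.005 + 0.04 * rng.standard_normal(n)      # market excess returns
    # Synthetic stock with extra exposure on down-market days.
    extra = np.where(mkt < mkt.mean(), 0.6, 0.0)
    stock = (0.8 + extra) * mkt + 0.02 * rng.standard_normal(n)

    def cond_beta(r, m, mask):
        # Beta estimated only on the observations selected by mask.
        return np.cov(r[mask], m[mask], ddof=1)[0, 1] / np.var(m[mask], ddof=1)

    down = mkt < mkt.mean()
    beta_down = cond_beta(stock, mkt, down)
    beta_up = cond_beta(stock, mkt, ~down)
    print(f"downside beta={beta_down:.2f}, upside beta={beta_up:.2f}")
    ```

    Replacing the raw market return with cash flow news and discount rate news series yields the paper's four components; the conditioning logic stays the same.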
  9. By: Jeroen Klomp
    Abstract: Using data for more than 200 banks from 21 OECD countries for the period 2002 to 2008, we examine the impact of bank regulation and supervision on banking risk.
    JEL: E44 G2
    Date: 2010–12
  10. By: Gareth W. Peters; Pavel Shevchenko; Mark Young; Wendy Yip
    Abstract: Under the Basel II standards, the Operational Risk (OpRisk) advanced measurement approach is not prescriptive regarding the class of statistical model utilised to undertake capital estimation. It has however become well accepted to utilise a Loss Distributional Approach (LDA) paradigm to model the individual OpRisk loss process corresponding to the Basel II Business line/event type. In this paper we derive a novel class of doubly stochastic alpha-stable family LDA models. These models provide the ability to capture the heavy-tailed loss process typical of OpRisk whilst also providing analytic expressions for the compound process annual loss density and distributions, as well as the aggregated compound process annual loss models. In particular we develop models of the annual loss process in two scenarios. The first scenario considers the loss process with a stochastic intensity parameter, resulting in an inhomogeneous compound Poisson process annually. The resulting arrival process of losses under such a model will have independent counts over increments within the year. The second scenario considers discretization of the annual loss process into monthly increments with dependent time increments as captured by a Binomial process with a stochastic probability of success changing annually. Each of these models is coupled under an LDA framework with heavy-tailed severity models comprising alpha-stable severities for the loss amounts per loss event. In this paper we derive analytic results for the annual loss density and distribution under each of these models and study their properties.
    Date: 2011–02
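    The first scenario (stochastic intensity plus heavy-tailed severities) can be simulated directly with SciPy. The sketch below is an illustrative parameterization, not the paper's: the annual intensity is drawn from a Gamma distribution, counts are Poisson given that intensity, and severities are maximally skewed alpha-stable draws clipped at zero to keep losses nonnegative.

    ```python
    import numpy as np
    from scipy.stats import levy_stable, gamma, poisson

    rng = np.random.default_rng(3)
    n_years = 5_000

    # Doubly stochastic frequency: intensity Lambda ~ Gamma, counts ~ Poisson(Lambda).
    lam = gamma.rvs(a=4.0, scale=5.0, size=n_years, random_state=rng)
    counts = poisson.rvs(lam, random_state=rng)

    # Heavy-tailed severities: maximally skewed alpha-stable (alpha=1.7, beta=1),
    # clipped at zero -- an illustrative choice, not the paper's exact setup.
    sev = levy_stable.rvs(1.7, 1.0, loc=10.0, scale=2.0,
                          size=counts.sum(), random_state=rng)
    sev = np.clip(sev, 0.0, None)

    # Aggregate severities year by year into annual losses.
    starts = np.cumsum(counts) - counts
    annual_loss = np.array([sev[s:s + c].sum() for s, c in zip(starts, counts)])
    print(np.quantile(annual_loss, [0.5, 0.999]).round(1))
    ```

    A capital figure in this framework is essentially a high quantile (e.g. 99.9%) of the simulated annual loss distribution; the paper's contribution is to obtain such quantities analytically rather than by simulation.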
  11. By: Namwon Hyung (University of Seoul); Casper G. de Vries (Erasmus University Rotterdam)
    Abstract: Actual portfolios contain fewer stocks than are implied by standard financial analysis that balances the costs of diversification against the benefits in terms of the standard deviation of the returns. Suppose a safety first investor cares about downside risk and recognizes the heavy tails of the return distribution.
    Keywords: Portfolio diversification; downside risk; heavy tails
    JEL: G0 G1 C2
    Date: 2010–08–26
  12. By: Coffinet, J.; Coudert, V.; Pop, A.; Pouvelle, C.
    Abstract: We assess the extent to which capital buffers (the capital banks hold in excess of the regulatory minimum) exacerbate rather than reduce the cyclical behavior of credit. We empirically study the relationships between output gap, capital buffers and loan growth with firm-level data for French banks over the period 1993–2009. Our findings reveal that bank capital buffers intensify the cyclical credit fluctuations arising from output gap developments, all the more so when higher-quality capital is considered. Moreover, by performing Granger causality tests at the bank level, we find evidence of a two-way causality between capital buffers and loan growth, pointing to mutually reinforcing mechanisms. Overall, these empirical results lend support to a countercyclical financial regulation that focuses on highest-quality capital and aims at smoothing loan growth.
    Keywords: Bank Capital Regulation, Procyclicality, Capital Buffers, Business Cycle Fluctuations, Basel III.
    JEL: G28 G21
    Date: 2011
  13. By: Lennart Hoogerheide (Erasmus University Rotterdam); Anne Opschoor (Erasmus University Rotterdam); Herman K. van Dijk (Erasmus University Rotterdam)
    Abstract: A class of adaptive sampling methods is introduced for efficient posterior and predictive simulation. The proposed methods are robust in the sense that they can handle target distributions that exhibit non-elliptical shapes such as multimodality and skewness. The basic method makes use of sequences of importance weighted Expectation Maximization steps in order to efficiently construct a mixture of Student-t densities that approximates the target distribution (typically a posterior distribution, of which we only require a kernel) accurately, in the sense that the Kullback-Leibler divergence between target and mixture is minimized. We label this approach Mixture of t by Importance Sampling and Expectation Maximization (MitISEM). We also introduce three extensions of the basic MitISEM approach. First, we propose a method for applying MitISEM in a sequential manner, so that the candidate distribution for posterior simulation is cleverly updated when new data become available. Our results show that the computational effort is reduced enormously. This sequential approach can be combined with a tempering approach, which facilitates the simulation from densities with multiple modes that are far apart. Second, we introduce a permutation-augmented MitISEM approach, for importance sampling from posterior distributions in mixture models without the requirement of imposing identification restrictions on the model's mixture regimes' parameters. Third, we propose a partial MitISEM approach, which aims at approximating the marginal and conditional posterior distributions of subsets of model parameters, rather than the joint. This division can substantially reduce the dimension of the approximation problem.
    Keywords: mixture of Student-t distributions; importance sampling; Kullback-Leibler divergence; Expectation Maximization; Metropolis-Hastings algorithm; predictive likelihoods; mixture GARCH models; Value at Risk
    JEL: C11 C15 C22
    Date: 2011–01–06
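    The importance-weighting step at the core of the approach is easy to illustrate with a single Student-t candidate and a made-up skewed target kernel (the actual method fits a whole mixture of Student-t densities by EM). Everything below, including the target and the candidate's parameters, is illustrative.

    ```python
    import numpy as np
    from scipy.stats import t as student_t

    rng = np.random.default_rng(4)

    def log_kernel(x):
        # Unnormalized, skewed "posterior" kernel: Gaussian times a logistic factor.
        return -0.5 * (x - 0.5) ** 2 - np.logaddexp(0.0, -3.0 * x)

    # Fat-tailed Student-t candidate: the building block of a MitISEM mixture.
    draws = student_t.rvs(5, loc=0.5, scale=1.5, size=100_000, random_state=rng)

    # Self-normalized importance weights: only a kernel of the target is needed.
    log_w = log_kernel(draws) - student_t.logpdf(draws, 5, loc=0.5, scale=1.5)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()

    post_mean = np.dot(w, draws)
    ess = 1.0 / np.sum(w ** 2)          # effective sample size diagnostic
    print(f"posterior mean ~ {post_mean:.3f}, ESS = {ess:.0f}")
    ```

    MitISEM's EM steps adapt the candidate (here fixed by hand) to the target, which drives the effective sample size up; a low ESS relative to the number of draws signals a poor candidate.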
  14. By: Blake LeBaron (International Business School, Brandeis University)
    Abstract: This paper estimates the probability of a "lost decade" where equity investments lose value over a ten year period. The findings are a reminder that equity investments are risky even over longer time periods, and investors should take this into consideration when making portfolio choices. It also introduces a simple method to allow the reader to combine beliefs about long run stock returns along with computer simulated return distributions. Finally, it is shown that mistaken use of arithmetic means could account for some common misconceptions about the chance of losses over a decade.
    Date: 2010–06
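    A back-of-the-envelope version of the calculation is straightforward. The sketch below uses illustrative long-run equity numbers (7% arithmetic mean, 16% volatility, lognormal monthly returns), not the paper's estimates or its simulation method:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_sim, months = 100_000, 120

    # Illustrative long-run equity numbers: 7% arithmetic mean, 16% volatility.
    mu_a, sigma = 0.07, 0.16
    drift = (mu_a - 0.5 * sigma**2) / 12.0        # monthly log-return drift
    log_r = rng.normal(drift, sigma / np.sqrt(12.0), size=(n_sim, months))
    wealth = np.exp(log_r.sum(axis=1))            # terminal wealth per dollar

    p_lost = (wealth < 1.0).mean()
    print(f"P(equity loses value over 10 years) ~ {p_lost:.3f}")

    # The median outcome grows at the geometric rate mu - sigma^2/2, not the
    # arithmetic rate mu: compounding the arithmetic mean is over-optimistic.
    print(round(float(np.median(wealth)), 2), round(float(np.exp(mu_a * 10)), 2))
    ```

    Even with these benign inputs a lost decade is far from a tail event, and the gap between the median outcome and naive arithmetic compounding illustrates the misconception the paper discusses.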
  15. By: Joanna Janczura; Rafal Weron
    Abstract: We develop a simple test for deviations from power law tails, which is based on the asymptotic properties of the empirical distribution function. We use this test to answer the question whether great natural disasters, financial crashes or electricity price spikes should be classified as dragon kings or 'only' as black swans.
    Date: 2011–02
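    The flavor of such a test can be sketched with a Hill fit and a Kolmogorov-Smirnov-type distance between the empirical tail and the fitted power law (the paper's actual statistic, based on asymptotic properties of the empirical distribution function, is different). Data and tail fraction below are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    def pareto_tail_ks(x, k):
        """Hill-fit a Pareto to the k largest observations and return the
        tail exponent and the KS distance of empirical vs fitted tail CDFs."""
        x = np.sort(x)
        tail, x_min = x[-k:], x[-k - 1]
        alpha = k / np.log(tail / x_min).sum()        # Hill estimator
        fitted = 1.0 - (tail / x_min) ** (-alpha)     # Pareto tail CDF
        emp = (np.arange(1, k + 1) - 0.5) / k         # empirical tail CDF
        return alpha, np.abs(emp - fitted).max()

    # Data whose tail truly follows a power law (Pareto, alpha = 2) ...
    x_pl = rng.pareto(2.0, 5000) + 1.0
    # ... versus data with an exponentially decaying tail.
    x_exp = rng.exponential(1.0, 5000)

    for name, x in (("power law", x_pl), ("exponential", x_exp)):
        alpha, ks = pareto_tail_ks(x, k=500)
        print(f"{name}: alpha_hat={alpha:.2f}, KS={ks:.3f}")
    ```

    A small distance is consistent with "black swans" drawn from the power law itself, while systematic deviations in the extreme tail, the candidate "dragon kings", inflate the statistic.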

This nep-rmg issue is ©2011 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.