nep-rmg New Economics Papers
on Risk Management
Issue of 2013‒01‒19
thirteen papers chosen by
Stan Miles
Thompson Rivers University

  1. Has the Basel Accord Improved Risk Management During the Global Financial Crisis? By Michael McAleer; Juan-Ángel Jiménez-Martín; Teodosio Pérez-Amaral
  2. Multivariate risk measures: a constructive approach based on selections By Ignacio Cascos; Ilya Molchanov
  3. An Autocorrelated Loss Distribution Approach: back to the time series By Dominique Guegan; Bertrand Hassani
  4. The Foster-Hart Measure of Riskiness for General Gambles By Frank Riedel; Tobias Hellmann
  5. Incentive audits : a new approach to financial regulation By Cihak, Martin; Demirguc-Kunt, Asli; Johnston, R. Barry
  6. Comparing quadratic and non-quadratic local risk minimization for the hedging of contingent claims By Frédéric Abergel
  7. On a dynamic adaptation of the Distribution Builder approach to investment decisions By Phillip Monin
  8. Natural delta gamma hedging of longevity and interest rate risk By Elisa Luciano; Luca Regis; Elena Vigna
  9. Forecasting extreme electricity spot prices By Volodymyr Korniichuk
  10. Partial Splitting of Longevity and Financial Risks: The Longevity Nominal Choosing Swaptions By Harry Bensusan; Nicole El Karoui; Stéphane Loisel; Yahia Salhi
  11. Extreme Financial Cycles By Bertrand Candelon; Guillaume Gaulier; Christophe Hurlin
  12. Are Risk Attitudes Fixed Factors or Fleeting Feelings? By Cho, In Soo
  13. Empirical Research on Corporate Credit-Ratings: A Literature Review By Alexander B. Matthies

  1. By: Michael McAleer (Erasmus University Rotterdam); Juan-Ángel Jiménez-Martín (Complutense University of Madrid); Teodosio Pérez-Amaral (Complutense University of Madrid)
    Abstract: The Basel II Accord requires that banks and other Authorized Deposit-taking Institutions (ADIs) communicate their daily risk forecasts to the appropriate monetary authorities at the beginning of each trading day, using one or more risk models to measure Value-at-Risk (VaR). The risk estimates of these models are used to determine capital requirements and associated capital costs of ADIs, depending in part on the number of previous violations, whereby realised losses exceed the estimated VaR. In this paper we define risk management in terms of choosing from a variety of risk models, and discuss the selection of optimal risk models. A new approach to model selection for predicting VaR is proposed, consisting of combining alternative risk models, and we compare conservative and aggressive strategies for choosing between VaR models. We then examine how different risk management strategies performed during the 2008-09 global financial crisis. These issues are illustrated using the Standard and Poor's 500 Composite Index.
    Keywords: Value-at-Risk (VaR); daily capital charges; violation penalties; optimizing strategy; risk forecasts; aggressive or conservative risk management strategies; Basel Accord; global financial crisis
    JEL: G32 G11 G17 C53 C22
    Date: 2013–01–08
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20130010&r=rmg
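    The capital-charge mechanics referred to in this abstract hinge on counting the days on which realised losses exceed the reported VaR. Below is a minimal sketch of that violation-counting step, assuming the usual traffic-light zones; the plus-factors and simulated data are illustrative assumptions, not numbers from the paper.
```python
import numpy as np

def basel_backtest(returns, var_forecasts):
    """Count VaR violations over the last 250 trading days and map the
    count to a capital-charge multiplier (traffic-light style scheme)."""
    losses = -np.asarray(returns)[-250:]          # positive numbers = losses
    var = np.asarray(var_forecasts)[-250:]        # VaR reported as positive
    violations = int(np.sum(losses > var))

    # Plus-factors for the yellow zone are an illustrative assumption.
    plus = {5: 0.40, 6: 0.50, 7: 0.65, 8: 0.75, 9: 0.85}
    if violations <= 4:            # green zone
        k = 0.0
    elif violations <= 9:          # yellow zone
        k = plus[violations]
    else:                          # red zone
        k = 1.0
    return violations, 3.0 + k

# Example with simulated returns and a static 99% normal VaR of 2.33%.
rng = np.random.default_rng(0)
print(basel_backtest(rng.normal(0, 0.01, 250), np.full(250, 0.0233)))
```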
  2. By: Ignacio Cascos; Ilya Molchanov
    Abstract: Since risky positions in multivariate portfolios can be offset by various choices of capital requirements that depend on the exchange rules and related transaction costs, it is natural to assume that the risk measures of random vectors are set-valued. Furthermore, it is reasonable to include the exchange rules in the argument of the risk measure and so consider risk measures of set-valued portfolios. This situation includes the classical Kabanov's transaction costs model, where the set-valued portfolio is given by the sum of a random vector and an exchange cone. The definition of the selection risk measure is based on calling a set-valued portfolio acceptable if it possesses a selection with all individually acceptable marginals. The obtained risk measure is coherent (or convex), law invariant, and takes upper convex closed sets as values. We describe the dual representation of the selection risk measure and suggest efficient ways of approximating it from below and from above. In the case of Kabanov's exchange cone model, it is shown how the selection risk measure relates to the set-valued risk measures considered by Kulikov (2008) and Hamel and Heyde (2010).
    Date: 2013–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1301.1496&r=rmg
  3. By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Paris I - Panthéon Sorbonne); Bertrand Hassani (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Paris I - Panthéon Sorbonne)
    Abstract: The Advanced Measurement Approach requires financial institutions to develop internal models to evaluate their capital charges. Traditionally, the Loss Distribution Approach (LDA) is used, mixing frequencies and severities to build a Loss Distribution Function (LDF). This distribution represents annual losses, so the 99.9th percentile of the distribution, which provides the capital charge, denotes the worst year in a thousand. The current approach suggested by the regulator and implemented in financial institutions assumes independence of the losses. In this paper, we propose a solution to address the issues arising when autocorrelations are detected between the losses. Our approach treats the losses as time series: the losses are aggregated periodically, time series processes (AR, ARFI, and Gegenbauer processes) are fitted to the resulting series, and a distribution is fitted to the residuals. Finally, a Monte Carlo simulation is used to construct the LDF, and the pertaining risk measures are evaluated. In order to show the impact of the choice of internal models retained by the companies on the capital charges, the paper draws a parallel between the static traditional approach and an appropriate dynamic modelling. If, under the traditional LDA, no particular distribution proves adequate for the data (as soon as the goodness-of-fit tests reject them), keeping the LDA modelling amounts to an arbitrary choice. We suggest an alternative and robust approach: for the two data sets explored in this paper, the independence assumption is relaxed and the autocorrelations within the losses are captured through the time series modelling. The construction of the related LDF enables the computation of the capital charge and therefore permits complying with the regulation while taking into account, at the same time, the large losses (with adequate distributions on the residuals) and the correlations between losses (with the time series modelling).
    Keywords: Operational risk, time series, Gegenbauer processes, Monte Carlo, risk measures.
    Date: 2012–12
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-00771387&r=rmg
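    As a rough illustration of the procedure sketched in the abstract (periodic aggregation of losses, a time-series fit, and Monte Carlo construction of the annual loss distribution), here is a minimal sketch using a simple AR(1) in place of the authors' AR/ARFI/Gegenbauer specifications; the data and every parameter are hypothetical.
```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical operational losses aggregated by month (10 years of data).
monthly = rng.lognormal(mean=10, sigma=1, size=120)

# Fit a simple AR(1) to the aggregated series by least squares.
y, x = monthly[1:], monthly[:-1]
phi = np.cov(x, y, ddof=0)[0, 1] / np.var(x)
c = y.mean() - phi * x.mean()
resid = y - (c + phi * x)

# Monte Carlo: simulate one year ahead many times, bootstrapping residuals,
# then read the capital charge off the 99.9th percentile of annual losses.
n_sims, horizon = 20_000, 12
annual = np.empty(n_sims)
for i in range(n_sims):
    level, total = monthly[-1], 0.0
    for _ in range(horizon):
        level = max(c + phi * level + rng.choice(resid), 0.0)
        total += level
    annual[i] = total

print(f"99.9% annual loss quantile: {np.quantile(annual, 0.999):,.0f}")
```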
  4. By: Frank Riedel; Tobias Hellmann
    Abstract: Foster and Hart proposed an operational measure of riskiness for discrete random variables. We show that their defining equation has no solution for many common continuous distributions, including, for example, many uniform distributions. We show how to consistently extend the definition of riskiness to continuous random variables. For many continuous random variables, the risk measure is equal to the worst-case risk measure, i.e. the maximal possible loss incurred by that gamble. We also extend the Foster-Hart risk measure to dynamic environments for general distributions and probability spaces, and we show that the extended measure avoids bankruptcy in infinitely repeated gambles.
    Date: 2013–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1301.1471&r=rmg
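    For context, the Foster-Hart riskiness R(g) of a gamble g with positive expectation and possible losses is the unique solution of E[log(1 + g/R)] = 0 above the maximal possible loss. A minimal numerical sketch for a discrete gamble follows; the bisection scheme and the example gamble are illustrative, not taken from the paper.
```python
import numpy as np

def foster_hart_riskiness(outcomes, probs, tol=1e-10):
    """Solve E[log(1 + g/R)] = 0 for R by bisection.

    Assumes E[g] > 0 and that losses are possible; the solution lies
    strictly above the maximal possible loss L = -min(outcomes)."""
    g = np.asarray(outcomes, dtype=float)
    p = np.asarray(probs, dtype=float)
    assert np.dot(g, p) > 0 and g.min() < 0

    L = -g.min()
    expect_log = lambda R: np.dot(p, np.log1p(g / R))

    lo, hi = L * (1 + 1e-9), 2 * L        # expect_log(lo) < 0 by construction
    while expect_log(hi) <= 0:            # grow hi until the sign flips
        hi *= 2
    while hi - lo > tol * hi:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if expect_log(mid) < 0 else (lo, mid)
    return 0.5 * (lo + hi)

# Illustrative gamble: win 120 or lose 100 with equal probability.
print(foster_hart_riskiness([120, -100], [0.5, 0.5]))   # approximately 600
```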
  5. By: Cihak, Martin; Demirguc-Kunt, Asli; Johnston, R. Barry
    Abstract: A large body of evidence points to misaligned incentives as having a key role in the run-up to the global financial crisis. These include bank managers' incentives to boost short-term profits and create banks that are "too big to fail," regulators' incentives to forbear and withhold information from other regulators in stressful times, and credit rating agencies' incentives to keep issuing high ratings for subprime assets. As part of the response to the crisis, policymakers and regulators also attempted to address some incentive issues, but various outside observers have criticized the response for being insufficient. This paper proposes a pragmatic approach to re-orienting financial regulation to have at its core the objective of addressing incentives on an ongoing basis. Specifically, the paper proposes "incentive audits" as a tool that could help in identifying incentive misalignments in the financial sector. The paper illustrates how such audits could be implemented in practice, and what the implications would be for the design of policies and frameworks to mitigate systemic risks.
    Keywords: Banks & Banking Reform, Debt Markets, Emerging Markets, Labor Policies, Insurance & Risk Mitigation
    Date: 2013–01–01
    URL: http://d.repec.org/n?u=RePEc:wbk:wbrwps:6308&r=rmg
  6. By: Frédéric Abergel (FiQuant - Chaire de finance quantitative - Ecole Centrale Paris, MAS - Mathématiques Appliquées aux Systèmes - EA 4037 - Ecole Centrale Paris)
    Abstract: In this note, I further study a new approach recently introduced for the hedging of derivatives in incomplete markets via non-quadratic local risk minimization. A structural result is provided, which essentially shows the equivalence between non-quadratic risk minimization under the historical probability and quadratic local risk minimization under an equivalent, implicitly defined probability.
    Date: 2013–01–08
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-00771528&r=rmg
  7. By: Phillip Monin
    Abstract: Sharpe et al. proposed the idea of having an expected utility maximizer choose a probability distribution for future wealth as an input to her investment problem instead of a utility function. They developed a computer program, called The Distribution Builder, as one way to elicit such a distribution. In a single-period model, they then showed how this desired distribution for terminal wealth can be used to infer the investor's risk preferences. We adapt their idea, namely that a risk-averse investor can choose a desired distribution for future wealth as an alternative input attribute for investment decisions, to continuous time. In a variety of scenarios, we show how the investor's desired distribution combines with her initial wealth and market-related input to determine the feasibility of her distribution, her implied risk preferences, and her optimal policies throughout her investment horizon. We then provide several examples.
    Date: 2013–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1301.0907&r=rmg
  8. By: Elisa Luciano; Luca Regis; Elena Vigna
    Abstract: The paper presents closed-form Delta and Gamma hedges for annuities and death assurances, in the presence of both longevity and interest-rate risk. Longevity risk is modelled through an extension of the classical Gompertz law, while interest rate risk is modelled via a Hull-and-White process. We theoretically provide natural hedging strategies, considering also contracts written on different generations. We provide a UK-population and bond-market calibrated example. We compute longevity exposures and explicitly calculate Delta-Gamma hedges. Re-insurance is needed in order to set up portfolios which are Delta-Gamma neutral to both longevity and interest-rate risk.
    Date: 2012–01
    URL: http://d.repec.org/n?u=RePEc:icr:wpmath:21-2011&r=rmg
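    Once the Delta and Gamma exposures mentioned in the abstract have been computed, neutralizing them amounts to solving a small linear system in the hedge notionals. A generic sketch with hypothetical sensitivities to a single longevity factor (not the paper's closed-form expressions or UK calibration):
```python
import numpy as np

# Hypothetical first- and second-order sensitivities (Delta, Gamma) of an
# annuity liability portfolio to a longevity risk factor.
delta_liab, gamma_liab = -1.8e5, 4.2e3

# Hypothetical sensitivities of two hedging instruments, e.g. contracts
# written on different generations.
H = np.array([[3.5e4, -1.2e4],     # Deltas of instruments 1 and 2
              [6.0e2,  9.0e2]])    # Gammas of instruments 1 and 2

# Choose notionals n so that liability + hedge is Delta- and Gamma-neutral:
#   H @ n + (delta_liab, gamma_liab) = 0
n = np.linalg.solve(H, -np.array([delta_liab, gamma_liab]))
print("hedge notionals:", n)
print("residual exposures:", H @ n + np.array([delta_liab, gamma_liab]))
```
    Hedging the interest-rate factor at the same time works the same way, with additional rows (and instruments) in the system.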
  9. By: Volodymyr Korniichuk
    Abstract: We propose a model for forecasting extreme electricity prices in real time (high frequency) settings. The unique feature of our model is its ability to forecast electricity price exceedances over very high thresholds, where only a few (if any) observations are available. The model can also be applied for simulating times of occurrence and magnitudes of the extreme prices. We employ a copula with a changing dependence parameter for capturing serial dependence in the extreme prices and the censored GPD for modelling their marginal distributions. For modelling times of the extreme price occurrences we propose an approach based on a negative binomial distribution. The model is applied to electricity spot prices from Australia's national electricity market.
    Keywords: electricity spot prices, copula, GPD, negative binomial distribution
    JEL: C53 C51 C32
    Date: 2012–12–27
    URL: http://d.repec.org/n?u=RePEc:cgr:cgsser:03-14&r=rmg
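    A minimal sketch of the peaks-over-threshold step (fitting a generalized Pareto distribution to price exceedances over a high threshold); the threshold, the simulated data, and the plain (uncensored) GPD fit are assumptions, not the authors' censored specification or copula dynamics.
```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(2)

# Hypothetical half-hourly spot prices with a heavy right tail.
prices = rng.lognormal(mean=3.5, sigma=0.6, size=50_000)

threshold = np.quantile(prices, 0.995)           # a high in-sample threshold
exceedances = prices[prices > threshold] - threshold

# Fit a plain GPD to the exceedances (location pinned at zero).
shape, _, scale = genpareto.fit(exceedances, floc=0)

# Extrapolate an extreme quantile, e.g. the 99.99% price level:
# P(X > x) = p_u * (1 - F_GPD(x - u)), with p_u the exceedance rate.
p_u = exceedances.size / prices.size
q = 0.9999
level = threshold + genpareto.ppf(1 - (1 - q) / p_u, shape, loc=0, scale=scale)
print(f"shape={shape:.3f}, scale={scale:.2f}, 99.99% level={level:.1f}")
```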
  10. By: Harry Bensusan (CMAP - Centre de Mathématiques Appliquées - Ecole Polytechnique - Polytechnique - X - CNRS : UMR7641); Nicole El Karoui (CMAP - Centre de Mathématiques Appliquées - Ecole Polytechnique - Polytechnique - X - CNRS : UMR7641, LPMA - Laboratoire de Probabilités et Modèles Aléatoires - CNRS : UMR7599 - Université Paris VI - Pierre et Marie Curie - Université Paris VII - Paris Diderot); Stéphane Loisel (SAF - Laboratoire de Sciences Actuarielle et Financière - Université Claude Bernard - Lyon I : EA2429); Yahia Salhi (SAF - Laboratoire de Sciences Actuarielle et Financière - Université Claude Bernard - Lyon I : EA2429)
    Abstract: In this paper, we introduce a new structured financial product: the so-called Life Nominal Chooser Swaption (LNCS). Thanks to such a contract, insurers could keep pure longevity risk and transfer a great part of the interest rate risk underlying annuity portfolios to financial markets. Before the issuance of the contract, the insurer determines a confidence band of survival curves for her portfolio. An interest rate hedge is set up, based on swaption mechanisms. The bank uses this band as well as an interest rate model to price the product. At the end of the first period (e.g. 8 to 10 years), the insurer has the right to enter into an interest rate swap with the bank, where the nominal is adjusted to her (re-forecasted) needs. She chooses (inside the band) the survival curve that best fits her anticipation of the future mortality of her portfolio (during 15 to 20 more years, say) given the information available at that time. We use a population dynamics longevity model and a classical two-factor interest rate model to price this product. Numerical results show that the option offered to the insurer (in terms of choice of nominal) is not too expensive in many real-world cases. We also discuss the pros and cons of the product and of our methodology. This structure enables insurers and financial institutions to remain in their initial field of expertise.
    Date: 2012–12–21
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-00768526&r=rmg
  11. By: Bertrand Candelon (Economics - Maastricht University); Guillaume Gaulier (Centre de recherche de la Banque de France - Banque de France); Christophe Hurlin (LEO - Laboratoire d'économie d'Orleans - CNRS : UMR6221 - Université d'Orléans)
    Abstract: This paper proposes a new approach to dating extreme financial cycles. Building on recent methods in extreme value theory, it develops an extension of the famous calculus rule to detect extreme peaks and troughs. Applied to the United States stock market since 1871, it leads to a dating of these exceptional events and calls for adequate economic policies to tackle them.
    Keywords: Financial extreme cycles; Extreme value theory;
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:halshs-00769817&r=rmg
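    A minimal sketch of a calculus-rule-style dating step (local maxima and minima over a fixed window, with the most severe episodes flagged as extreme); the window, the drawdown criterion, and the tail cutoff are illustrative assumptions, not the authors' extreme-value extension.
```python
import numpy as np

def date_extreme_cycles(prices, window=12, tail_q=0.95):
    """Locate peaks/troughs as local extrema over a +/- `window` span and
    flag as 'extreme' the peaks whose subsequent drawdown lies in the upper
    tail of the empirical drawdown distribution. Illustrative rule only."""
    y = np.asarray(prices, dtype=float)
    peaks, troughs = [], []
    for t in range(window, len(y) - window):
        seg = y[t - window: t + window + 1]
        if y[t] == seg.max():
            peaks.append(t)
        elif y[t] == seg.min():
            troughs.append(t)

    # Drawdown from each peak to the next trough; keep tail events only.
    drawdowns = [(p, (y[p] - y[min(t for t in troughs if t > p)]) / y[p])
                 for p in peaks if any(t > p for t in troughs)]
    if not drawdowns:
        return peaks, troughs, []
    cutoff = np.quantile([d for _, d in drawdowns], tail_q)
    return peaks, troughs, [p for p, d in drawdowns if d >= cutoff]

# Example on a simulated monthly index of roughly 140 years.
rng = np.random.default_rng(4)
index = np.exp(np.cumsum(rng.normal(0.002, 0.05, 1_700)))
print(date_extreme_cycles(index))
```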
  12. By: Cho, In Soo
    Abstract: We investigate the stability of measured risk attitudes over time, using a 13-year longitudinal sample of individuals in the NLSY79. We find that an individual's risk aversion changes systematically in response to personal economic circumstances. Risk aversion increases with lengthening spells of employment and time out of the labor force, and decreases with lengthening unemployment spells. However, the most important result is that the majority of the variation in risk aversion is due to changes in measured individual tastes over time and not to variation across individuals. These findings, that measured risk preferences are endogenous and subject to substantial measurement error, suggest caution in interpreting coefficients in models relying on contemporaneous, one-time measures of risk preferences.
    Keywords: risk aversion; stability; variance decomposition; within; measurement error; between; fixed effects
    JEL: C23 D81
    Date: 2013–01–10
    URL: http://d.repec.org/n?u=RePEc:isu:genres:35751&r=rmg
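    The within/between decomposition behind the "majority of the variation" finding can be illustrated on a hypothetical balanced panel; the data-generating numbers below are assumptions chosen only to mimic a panel with both stable tastes and transitory variation.
```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)

# Hypothetical balanced panel: 1,000 individuals observed in 13 waves.
n, waves = 1_000, 13
stable_taste = rng.normal(0, 0.6, n)
panel = pd.DataFrame({
    "id": np.repeat(np.arange(n), waves),
    "risk_aversion": np.repeat(stable_taste, waves)
                     + rng.normal(0, 1.0, n * waves),   # transitory part
})

grand_mean = panel["risk_aversion"].mean()
person_mean = panel.groupby("id")["risk_aversion"].transform("mean")

between = ((person_mean - grand_mean) ** 2).mean()             # across individuals
within = ((panel["risk_aversion"] - person_mean) ** 2).mean()  # over time
print(f"between share: {between / (between + within):.2f}, "
      f"within share: {within / (between + within):.2f}")
```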
  13. By: Alexander B. Matthies
    Abstract: We report on the current state and on important older findings of empirical studies on corporate credit ratings and their relationship to ratings of other entities. Specifically, we consider the results of three lines of research: the correlation of credit ratings and corporate default, the influence of ratings on capital markets, and the determinants of credit ratings and rating changes. Results from each individual line are important and relevant for the construction and interpretation of studies in the other two fields, for example in the choice of statistical methods. Moreover, the design and construction of credit ratings and of the credit rating scale are essential for understanding the empirical findings.
    Keywords: Rating agency; Credit Ratings; Through-the-cycle rating methodology; Corporate Governance
    JEL: G20 G24 G30 G32 G34
    Date: 2013–01
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2013-003&r=rmg

This nep-rmg issue is ©2013 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.