nep-rmg New Economics Papers
on Risk Management
Issue of 2017‒04‒23
fifteen papers chosen by
Stan Miles
Thompson Rivers University

  1. Risk Measure Estimates in Quiet and Turbulent Times: An Empirical Study By Rosnan Chotard; Michel Dacorogna; Marie Kratz
  2. Multinomial VaR Backtests: A simple implicit approach to backtesting expected shortfall By Marie Kratz; Yen Lok; Alexander McNeil
  3. SenSR: A sentiment-based systemic risk indicator By Svetlana Borovkova; Evgeny Garmaev; Philip Lammers; Jordi Rustige
  4. Relationship Between Level of Firm Performances and Risk in Food and Beverages Industry: Empirical Analysis on Khee San Berhad By Erizal, Nurulhidayu
  5. Stress Testing in Wartime and in Peacetime By Schuermann, Til
  6. How the corporate governance mechanisms affect bank risk taking By Mamatzakis, Emmanuel; Zhang, Xiaoxiang; Wang, Chaoke
  7. How does risk flow in the credit default swap market? By D'Errico, Marco; Battiston, Stefano; Peltonen, Tuomas; Scheicher, Martin
  8. A self-calibrating method for heavy tailed data modeling : Application in neuroscience and finance By Nehla Debbabi; Marie Kratz; Mamadou Mboup
  9. Estimating the Counterparty Risk Exposure by using the Brownian Motion Local Time By Michele Bonollo; Luca Di Persio; Luca Mammi; Immacolata Oliva
  10. Simplifying credit scoring rules using LVQ+PSO By Laura Cristina Lanzarini; Augusto Villa Monte; Aurelio F. Bariviera; Patricia Jimbo Santana
  11. A Joint Quantile and Expected Shortfall Regression Framework By Timo Dimitriadis; Sebastian Bayer
  12. Good Deal Hedging and Valuation under Combined Uncertainty about Drift and Volatility By Dirk Becherer; Klebert Kentia
  13. On the properties of non-monetary measures for risks By Christophe Courbage; Henri Loubergé; Béatrice Rey
  14. Less Really Can Be More: Why Simplicity and Comparability Should be Regulatory Objectives By Herring, Richard J.
  15. A systemic shock model for too big to fail financial institutions By Sabrina Mulinacci

  1. By: Rosnan Chotard (CREAR - Center of Research in Econo-finance and Actuarial sciences on Risk / Centre de Recherche Econo-financière et Actuarielle sur le Risque - ESSEC Business School); Michel Dacorogna (SCOR SE, DEAR Consulting); Marie Kratz (ESSEC Business School; MAP5 - Mathématiques Appliquées à Paris 5 - CNRS - Centre National de la Recherche Scientifique - Institut National des Sciences Mathématiques et de leurs Interactions - UPD5 - Université Paris Descartes - Paris 5)
    Abstract: In this study we empirically explore the capacity of historical VaR to correctly predict the future risk of a financial institution. We observe that rolling samples are better able to capture the dynamics of future risks. We thus introduce another risk measure, the Sample Quantile Process, which is a generalization of the VaR calculated on a rolling sample, and study its behavior as a predictor by varying its parameters. Moreover, we study the behavior of the future risk as a function of past volatility. We show that if the past volatility is low, the historical computation of the risk measure underestimates the future risk, while in periods of high volatility the risk measure overestimates the risk, confirming that the current way financial institutions measure their risk is highly procyclical. (A toy rolling-window computation follows this entry.)
    Keywords: backtest,risk measure,sample quantile process,stochastic model,VaR,volatility
    Date: 2016–11–24
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-01424285&r=rmg
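    A toy rolling-window computation, for orientation. The Sample Quantile Process studied in the paper generalizes the historical VaR computed on a rolling sample; the window length, confidence level and two-regime data below are illustrative assumptions, not the authors' calibration:

        import numpy as np

        def rolling_historical_var(returns, window=250, alpha=0.99):
            # Historical VaR on a rolling sample: the alpha-quantile of losses
            # over the previous `window` observations, re-estimated each step.
            losses = -np.asarray(returns, dtype=float)
            var = np.full(losses.shape, np.nan)
            for t in range(window, len(losses)):
                var[t] = np.quantile(losses[t - window:t], alpha)
            return var

        # Quiet regime followed by a turbulent one: the rolling estimate lags
        # the regime switch, illustrating the procyclicality discussed above.
        rng = np.random.default_rng(0)
        returns = np.concatenate([rng.normal(0, 0.005, 500),   # low volatility
                                  rng.normal(0, 0.03, 500)])   # high volatility
        var_series = rolling_historical_var(returns)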
  2. By: Marie Kratz (MAP5 - Mathématiques Appliquées à Paris 5 - CNRS - Centre National de la Recherche Scientifique - Institut National des Sciences Mathématiques et de leurs Interactions - UPD5 - Université Paris Descartes - Paris 5); Yen Lok (Heriot Watt University); Alexander McNeil (University of York)
    Abstract: Under the Fundamental Review of the Trading Book (FRTB), capital charges for the trading book are based on the coherent expected shortfall (ES) risk measure, which shows greater sensitivity to tail risk. In this paper it is argued that backtesting of expected shortfall (or of the trading book model from which it is calculated) can be based on a simultaneous multinomial test of value-at-risk (VaR) exceptions at different levels, an idea supported by an approximation of ES in terms of multiple quantiles of a distribution proposed in Emmer et al. (2015). By comparing Pearson, Nass and likelihood-ratio tests (LRTs) for different numbers of VaR levels N, it is shown in a series of simulation experiments that multinomial tests with N ≥ 4 are much more powerful at detecting misspecifications of trading book loss models than standard binomial exception tests corresponding to the case N = 1. Each test has its merits: Pearson offers simplicity; Nass is robust in its size properties to the choice of N; the LRT is very powerful, though slightly over-sized in small samples and more computationally burdensome. A traffic-light system for trading book models based on the multinomial test is proposed and the recommended procedure is applied to a real-data example spanning the 2008 financial crisis. (A sketch of the Pearson variant follows this entry.)
    Keywords: multinomial distribution,Nass test,Pearson test,risk management,risk measure,statistical test,tail of distribution,backtesting,banking regulation,coherence,elicitability,expected shortfall,heavy tail,likelihood ratio test,value-at-risk
    Date: 2016–11
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-01424279&r=rmg
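    A sketch of the Pearson variant of the multinomial backtest. The inputs are assumptions for illustration: a T x N matrix of model VaRs, one column per level, and an ascending grid of levels such as N = 4 levels between 0.975 and 1:

        import numpy as np
        from scipy import stats

        def multinomial_var_backtest(losses, var_matrix, alphas):
            # Each day, count how many of the N VaR levels the realized loss
            # exceeds; under a correct model these counts are multinomial with
            # cell probabilities given by consecutive differences of the alphas.
            losses, alphas = np.asarray(losses), np.asarray(alphas)
            counts = (losses[:, None] > var_matrix).sum(axis=1)       # in {0,...,N}
            observed = np.bincount(counts, minlength=len(alphas) + 1)
            probs = np.diff(np.concatenate([[0.0], alphas, [1.0]]))   # N + 1 cells
            expected = len(losses) * probs
            stat = ((observed - expected) ** 2 / expected).sum()      # Pearson
            return stat, stats.chi2.sf(stat, df=len(alphas))          # N d.o.f.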
  3. By: Svetlana Borovkova; Evgeny Garmaev; Philip Lammers; Jordi Rustige
    Abstract: The media influence our perception of reality and, since we act on those perceptions, reality is in turn affected by the media. News is a rich source of information, but, in addition, the sentiment (i.e., the tone of financial news) tells us how others perceive the financial system and how that perception changes. In this paper we propose a new indicator of systemic risk in the global financial system. We call it SenSR: the Sentiment-based Systemic Risk indicator. This measure is constructed by dynamically aggregating the sentiment in news about systemically important financial institutions (SIFIs). We test SenSR for its ability to indicate or even forecast systemic stress in the financial system. We compare its performance to other well-known systemic risk indicators, as well as to macroeconomic fundamentals. We find that SenSR anticipates other systemic risk measures such as SRISK or VIX in signaling stressed times. In particular, it leads other systemic risk measures and macroeconomic indicators by as much as 12 weeks. (A schematic aggregation follows this entry.)
    Keywords: systemic risk; sentiment analysis; Granger causality
    JEL: G01 G18 C58 G17
    Date: 2017–04
    URL: http://d.repec.org/n?u=RePEc:dnb:dnbwpp:553&r=rmg
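    The paper's construction is richer, but the following schematic shows one way to aggregate news sentiment dynamically across institutions. The cross-sectional average and the exponential smoothing with a 20-day half-life are assumptions for illustration, not the authors' exact recipe:

        import numpy as np

        def aggregate_sentiment(daily_scores, half_life=20):
            # daily_scores: array of shape (T, n_sifis) with one news-sentiment
            # score per day per institution (NaN when there is no coverage).
            cross_section = np.nanmean(daily_scores, axis=1)  # average over SIFIs
            lam = 0.5 ** (1.0 / half_life)                    # daily decay factor
            out = np.empty_like(cross_section)
            out[0] = cross_section[0]
            for t in range(1, len(cross_section)):
                out[t] = lam * out[t - 1] + (1 - lam) * cross_section[t]
            return out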
  4. By: Erizal, Nurulhidayu
    Abstract: The main objective of this study is to identify the relationship between risk management and its impact on profitability in the food and beverage industry. Specifically, the study examines liquidity risk, credit/counterparty risk, operating risk and leverage, and how each affects profitability. Profitability is measured using Return on Assets (ROA). The study finds that a strong relationship exists between the risk management practices under study and the firm's profitability. The results indicate that firms' attention to risk management has a positive impact on profitability. (An illustration of the ratios involved follows this entry.)
    Keywords: credit risk, liquidity risk, market risk, leverage and profitability risk.
    JEL: G0 G2
    Date: 2017–04–17
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:78521&r=rmg
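    The ratios involved are elementary; a minimal illustration with hypothetical figures (not Khee San Berhad's actual accounts):

        # Profitability and simple risk proxies from balance-sheet items.
        net_income, total_assets = 2.1e6, 55.0e6
        current_assets, current_liabilities = 18.0e6, 12.0e6
        total_debt, total_equity = 20.0e6, 25.0e6

        roa = net_income / total_assets                       # Return on Assets
        current_ratio = current_assets / current_liabilities  # liquidity risk proxy
        leverage = total_debt / total_equity                  # debt-to-equity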
  5. By: Schuermann, Til (Oliver Wyman)
    Abstract: Stress testing served us well as a crisis management tool, and we see it applied increasingly to peacetime oversight of banks and banking systems. It is rapidly becoming the dominant supervisory tool on both sides of the Atlantic. Yet the objectives and certainly the conditions are quite different, and to date we see a range of practices across jurisdictions. Stress testing has proved to be enormously useful, not just for the supervisors but also for the banks. Using a simple taxonomy of stress testing (scenario design, models and projections, and disclosure), I analyze some of those different approaches with a view to examining how wartime stress testing can be adapted to peacetime concerns.
    JEL: G21 G28 G32
    Date: 2016–03
    URL: http://d.repec.org/n?u=RePEc:ecl:upafin:17-01&r=rmg
  6. By: Mamatzakis, Emmanuel; Zhang, Xiaoxiang; Wang, Chaoke
    Abstract: The effectiveness of the management team, ownership structure and other corporate governance mechanisms in determining appropriate risk taking is a critical issue for a modern commercial bank. Appropriate risk management techniques and structures within financial institutions play an important role in ensuring the stability of the economy. Analyzing 43 Asian banks over the period 2006 to 2014, I find that banks with strong corporate governance are associated with higher risk taking. More specifically, banks with an intermediate-sized board, separation of the CEO and chairman roles, and audits by a Big Four firm are likely to take more risk. Overall, my findings provide new perspectives on the governance mechanisms that affect risk taking in commercial banks.
    Keywords: Banks, Risk taking, Corporate governance
    JEL: G21 G32 G39
    Date: 2017–04–04
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:78137&r=rmg
  7. By: D'Errico, Marco; Battiston, Stefano; Peltonen, Tuomas; Scheicher, Martin
    Abstract: We develop a framework to analyse the Credit Default Swaps (CDS) market as a network of risk transfers among counterparties. From a theoretical perspective, we introduce the notion of flow-of-risk and provide sufficient conditions for a bow-tie network architecture to emerge endogenously as a result of intermediation. This architecture comprises three distinct sets of counterparties: i) Ultimate Risk Sellers (URS), ii) Dealers (indirectly connected to each other), iii) Ultimate Risk Buyers (URB). We show that the probability of widespread distress due to counterparty risk is higher in a bow-tie architecture than in more fragmented network structures. Empirically, we analyse a unique global dataset of bilateral CDS exposures on major sovereign and financial reference entities in 2011-2014. We find the presence of a bow-tie network architecture consistently across both reference entities and time, and that the flow-of-risk originates from a large number of URSs (e.g. hedge funds) and ends up in a few leading URBs, most of which are non-banks (in particular asset managers). Finally, the analysis of the CDS portfolio composition of the URBs shows a high level of concentration: in particular, the top URBs often show large exposures to potentially correlated reference entities. (A toy bow-tie decomposition follows this entry.)
    Keywords: credit default swap, financial networks, flow-of-risk, network architecture, systemic risk
    JEL: G10 G15
    Date: 2017–03
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20172041&r=rmg
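    A toy bow-tie decomposition with networkx. The component definitions follow the abstract (dealers as the largest strongly connected component, URSs upstream of it, URBs downstream); the edge-direction convention (u -> v read as risk flowing from u to v) and the neglect of nodes not connected to the dealer core are simplifying assumptions:

        import networkx as nx

        def bow_tie_decomposition(g):
            # Dealers: largest strongly connected component of the network.
            dealers = max(nx.strongly_connected_components(g), key=len)
            rep = next(iter(dealers))  # any dealer; the SCC is mutually reachable
            urs = nx.ancestors(g, rep) - dealers    # can reach the dealer core
            urb = nx.descendants(g, rep) - dealers  # reachable from the core
            return urs, dealers, urb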
  8. By: Nehla Debbabi (SUP'COM - Ecole Supérieure des Communications de Tunis; ESPRIT - École Supérieure Privée d'Ingénierie et de Technologie); Marie Kratz (ESSEC Business School); Mamadou Mboup (CRESTIC - Centre de Recherche en Sciences et Technologies de l'Information et de la Communication - URCA - Université de Reims Champagne-Ardenne)
    Abstract: One of the main issues in the statistical literature on extremes concerns tail index estimation, closely linked to the determination of a threshold above which a Generalized Pareto Distribution (GPD) can be fitted. Approaches to this estimation may be classified into two classes, one using standard Peak Over Threshold (POT) methods, in which the threshold used to estimate the tail is chosen graphically according to the problem, the other suggesting self-calibrating methods, where the threshold is algorithmically determined. Our approach belongs to this second class, proposing a hybrid distribution for heavy tailed data modeling, which links a normal (or lognormal) distribution to a GPD via an exponential distribution that bridges the gap between mean and asymptotic behaviors. A new unsupervised algorithm is then developed for estimating the parameters of this model. The effectiveness of our self-calibrating method is studied in terms of goodness-of-fit on simulated data. Then, it is applied to real data from neuroscience and finance, respectively. A comparison with more standard extreme-value approaches follows. (A schematic of the hybrid density follows this entry.)
    Keywords: Least squares optimization,Hybrid model,S&P 500 index,Levenberg Marquardt algorithm,Neural data,Algorithm,Extreme Value Theory,Gaussian distribution,Generalized Pareto Distribution,Heavy tailed data
    Date: 2016–12–12
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-01424298&r=rmg
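    A schematic of the hybrid density. In the paper the junction points, weights and rates are tied together by continuity and smoothness conditions and estimated by the unsupervised algorithm; here every piece is a free input and no normalization is enforced:

        import numpy as np
        from scipy import stats

        def hybrid_pdf(x, mu, sigma, u1, u2, lam, xi, beta, w1, w2):
            # Gaussian body below u1, exponential bridge on [u1, u2),
            # Generalized Pareto tail above u2 (unnormalized sketch).
            x = np.asarray(x, dtype=float)
            body = stats.norm.pdf(x, mu, sigma)
            bridge = w1 * np.exp(-lam * (x - u1))
            tail = w2 * stats.genpareto.pdf(x - u2, xi, scale=beta)
            return np.where(x < u1, body, np.where(x < u2, bridge, tail))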
  9. By: Michele Bonollo; Luca Di Persio; Luca Mammi; Immacolata Oliva
    Abstract: In recent years, the counterparty credit risk measure, namely the default risk in Over The Counter (OTC) derivatives contracts, has received great attention from banking regulators, specifically within the frameworks of Basel II and Basel III. To obtain the related risk figures, one is first obliged to compute intermediate output functionals related to the Mark-to-Market (MtM) position at a given time t in [0, T], with T a positive, finite time horizon. This implies an enormous computational effort, with highly time-consuming procedures that translate into significant costs. To overcome this issue, we propose a smart exploitation of the properties of the (local) time spent by Brownian motion close to a given value. (An occupation-time approximation of the local time follows this entry.)
    Date: 2017–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1704.03244&r=rmg
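    A minimal occupation-time approximation of the quantity the authors exploit, the Brownian local time at a level a: it is approximated by the time the path spends in a band of half-width eps around a, divided by 2*eps. Band width and discretization are illustrative:

        import numpy as np

        def local_time_estimate(path, dt, level, eps=0.05):
            # (1 / 2 eps) * occupation time of [level - eps, level + eps].
            in_band = np.abs(path - level) <= eps
            return in_band.sum() * dt / (2 * eps)

        # Toy usage on a simulated Brownian path over [0, 1].
        rng = np.random.default_rng(1)
        n = 100_000
        dt = 1.0 / n
        path = np.concatenate([[0.0], np.cumsum(rng.normal(0, np.sqrt(dt), n))])
        print(local_time_estimate(path, dt, level=0.0))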
  10. By: Laura Cristina Lanzarini; Augusto Villa Monte; Aurelio F. Bariviera; Patricia Jimbo Santana
    Abstract: One of the key elements in the banking industry is the appropriate selection of customers. In order to manage credit risk, banks dedicate special efforts to classifying customers according to their risk. The usual decision-making process consists in gathering personal and financial information about the borrower. Processing this information can be time consuming and presents some difficulties due to the heterogeneous structure of the data. We offer in this paper an alternative method that is able to classify customers' profiles from numerical and nominal attributes. The key feature of our method, called LVQ+PSO, is that it finds a reduced set of classifying rules. This is possible due to the combination of a competitive neural network with an optimization technique. These rules constitute a predictive model for credit approval. The reduced number of rules makes this method useful not only for credit officers aiming to make quick decisions about granting a credit, but also as a borrower self-selection tool. We applied our method to an actual database of a consumer credit institution in Ecuador and obtained very satisfactory results. Future research lines are outlined. (A sketch of the LVQ component follows this entry.)
    Date: 2017–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1704.04450&r=rmg
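    A sketch of the LVQ half of the method, the basic LVQ1 update rule. In the paper the prototype positions are optimized together with PSO and then read off as compact credit-scoring rules, neither of which is reproduced here:

        import numpy as np

        def lvq1_train(x, y, prototypes, proto_labels, lr=0.05, epochs=20):
            # LVQ1: pull the nearest prototype toward a sample of the same
            # class, push it away when the classes differ.
            protos = np.array(prototypes, dtype=float)
            for _ in range(epochs):
                for xi, yi in zip(x, y):
                    j = np.argmin(np.linalg.norm(protos - xi, axis=1))  # winner
                    sign = 1.0 if proto_labels[j] == yi else -1.0
                    protos[j] += sign * lr * (xi - protos[j])
            return protos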
  11. By: Timo Dimitriadis; Sebastian Bayer
    Abstract: We introduce a novel regression framework which simultaneously models the quantile and the Expected Shortfall (ES) of a response variable given a set of covariates. The foundation for this joint regression is a recent result by Fissler and Ziegel (2016), who show that the quantile and the ES are jointly elicitable. This joint elicitability allows for M- and Z-estimation of the joint regression parameters. Such parameter estimation is not possible for an Expected Shortfall regression alone, as Expected Shortfall by itself is not elicitable. We show consistency and asymptotic normality for the M- and Z-estimators under standard regularity conditions. The loss function used for the M-estimation depends on two specification functions, whose choices affect the properties of the resulting estimators. In an extensive simulation study, we verify the asymptotic properties and analyze the small-sample behavior of the M-estimator under different choices of the specification functions. This joint regression framework allows for various applications, including estimating, forecasting and backtesting Expected Shortfall, which is particularly relevant in light of the upcoming introduction of Expected Shortfall in the Basel Accords. (A sketch of the joint M-estimation follows this entry.)
    Date: 2017–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1704.02213&r=rmg
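    A sketch of the joint M-estimation using one zero-degree-homogeneous member of the Fissler-Ziegel loss class; the linear models for both quantities and this particular loss member are illustrative assumptions, not necessarily the authors' choice of specification functions:

        import numpy as np
        from scipy.optimize import minimize

        def fz_loss(params, y, x, alpha):
            # Joint loss for the pair (alpha-quantile, ES); this member is
            # valid when the ES model stays negative, as it does for
            # lower-tail return quantiles. x must include an intercept column.
            bq, be = np.split(params, 2)
            q, e = x @ bq, x @ be
            if np.any(e >= 0):
                return np.inf
            hit = (y <= q).astype(float)
            return np.mean(-hit * (q - y) / (alpha * e) + q / e + np.log(-e))

        # M-estimation: minimize the sample loss jointly over both coefficient
        # vectors, e.g.:
        # res = minimize(fz_loss, x0, args=(y, X, 0.025), method="Nelder-Mead")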
  12. By: Dirk Becherer; Klebert Kentia
    Abstract: We study robust notions of good-deal hedging and valuation under combined uncertainty about the drifts and volatilities of asset prices. Good-deal bounds are determined by a subset of risk-neutral pricing measures such that not only opportunities for arbitrage are excluded but also deals that are too good, by restricting instantaneous Sharpe ratios. A non-dominated multiple-priors approach to model uncertainty (ambiguity) leads to worst-case good-deal bounds. Corresponding hedging strategies arise as minimizers of a suitable coherent risk measure. Good-deal bounds and hedges for measurable claims are characterized by solutions to second-order backward stochastic differential equations whose generators are non-convex in the volatility. These hedging strategies are robust with respect to uncertainty in the sense that their tracking errors satisfy a supermartingale property under all a priori valuation measures, uniformly over all priors. (A schematic of the good-deal constraint follows this entry.)
    Date: 2017–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1704.02505&r=rmg
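    For orientation, a schematic statement of good-deal bounds in the classical case without model uncertainty; under combined drift and volatility uncertainty the paper takes a worst case of such bounds over all priors. Here h is the bound on the instantaneous Sharpe ratio and lambda^Q the Girsanov kernel of the pricing measure Q:

        \pi^u_t(X) = \operatorname*{ess\,sup}_{Q \in \mathcal{Q}^{\mathrm{ngd}}}
                     \mathbb{E}^Q\left[ X \mid \mathcal{F}_t \right],
        \qquad
        \mathcal{Q}^{\mathrm{ngd}} = \left\{ Q \approx P \ \text{martingale measure}
                                     : \; |\lambda^Q| \le h \right\}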
  13. By: Christophe Courbage (Geneva School of Business Administration - University of Applied Sciences Western Switzerland); Henri Loubergé (University of Geneva [Switzerland]); Béatrice Rey (GATE Lyon Saint-Étienne - Groupe d'analyse et de théorie économique - ENS Lyon - École normale supérieure - Lyon - UL2 - Université Lumière - Lyon 2 - UCBL - Université Claude Bernard Lyon 1 - UJM - Université Jean Monnet [Saint-Etienne] - Université de Lyon - CNRS - Centre National de la Recherche Scientifique)
    Abstract: This paper investigates how welfare losses from facing risks change as the risk environment of the decision-maker is altered. To that aim, we define the risk apportionment of order n (RAn) utility premium as a measure of the pain associated with the passage from one risk to a riskier one. Changes in risks are expressed through the concept of stochastic dominance of order n. Three configurations of risk exposure are considered. The paper first shows how the RAn utility premium is modified when initial wealth becomes riskier. Second, the paper provides conditions on individual preferences for superadditivity of the RAn utility premium. Third, the paper investigates the welfare changes from merging increases in risks. These results offer new interpretations of the signs of higher derivatives of the utility function. (A schematic definition follows this entry.)
    Keywords: risk apportionment, superadditivity, RA-n utility premium
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:halshs-01471888&r=rmg
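    Schematically, in the simpler case where the riskier exposure is an nth-degree risk increase in the sense of Ekern (the paper treats the more general passage between risks ranked by nth-order stochastic dominance), the order-n utility premium at wealth w reads

        \mathrm{RA}_n(w) = \mathbb{E}\left[ u(w + \tilde{X}) \right]
                         - \mathbb{E}\left[ u(w + \tilde{Y}) \right],

    and it is nonnegative for every such pair exactly when (-1)^{n+1} u^{(n)} \ge 0, which is how the signs of higher derivatives of u acquire their interpretation.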
  14. By: Herring, Richard J. (University of PA)
    Abstract: Regulatory complexity undermined efforts to strengthen financial stability before the crisis. Nonetheless, post-crisis reforms have greatly exacerbated regulatory complexity. Using the example of capital regulation, this paper shows how complexity has grown geometrically from the introduction of the Basel Accord on Capital Adequacy in 1988 to the introduction of Basel III and the total loss-absorbing capacity (TLAC) proposal in 2015. Analysis of the current welter of required capital ratios leads to a proposal to eliminate 75% of them without jeopardizing the safety and soundness of the system. Quite possibly, regulators might argue that one or more of these deleted ratios does make an important incremental contribution to the safety and soundness of the system. But these important debates are not taking place in public, in part because we lack systematic measures of the costs of regulatory compliance and effective sunset laws that would require that regulations meet a rigorous cost-benefit test periodically. The concluding section poses the more speculative question of why, despite the evident advantages of a simpler, more transparent regulatory system, the authorities layer on ever more complexity.
    JEL: G28
    Date: 2016–04
    URL: http://d.repec.org/n?u=RePEc:ecl:upafin:16-08&r=rmg
  15. By: Sabrina Mulinacci
    Abstract: In this paper we study the distributional properties of a vector of lifetimes in which each lifetime is modeled as the first arrival time between an idiosyncratic shock and a common systemic shock. Unlike the classical multidimensional Marshall-Olkin model, only a single common shock affecting all the lifetimes is assumed; however, some dependence is allowed between each idiosyncratic shock arrival time and the systemic shock arrival time. The dependence structure of the resulting distribution is studied through the analysis of its singularity and its associated copula function. Finally, the model is applied to the analysis of the systemic riskiness of those European banks classified as systemically important financial institutions (SIFIs). (A simulation sketch of the shock structure follows this entry.)
    Date: 2017–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1704.02160&r=rmg
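    A simulation sketch of the shock structure. For simplicity the shocks below are independent exponentials, whereas the paper's model additionally allows dependence between each idiosyncratic arrival time and the systemic one; all rates and sizes are illustrative:

        import numpy as np

        def simulate_lifetimes(n_firms, n_sims, lam_idio, lam_sys, seed=None):
            # tau_i = min(X_i, Z): each institution defaults at the earlier of
            # its idiosyncratic shock X_i and the common systemic shock Z.
            rng = np.random.default_rng(seed)
            x = rng.exponential(1.0 / np.asarray(lam_idio), size=(n_sims, n_firms))
            z = rng.exponential(1.0 / lam_sys, size=(n_sims, 1))
            return np.minimum(x, z)  # rows sharing an early Z default jointly

        # Five banks, systemic shock half as frequent as idiosyncratic ones.
        taus = simulate_lifetimes(5, 10_000, lam_idio=[0.2] * 5, lam_sys=0.1, seed=2)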

This nep-rmg issue is ©2017 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.