nep-rmg New Economics Papers
on Risk Management
Issue of 2015‒12‒08
eleven papers chosen by
Stan Miles
Thompson Rivers University

  1. The end of the waterfall: default resources of central counterparties By Rama Cont
  2. Simulation of the term structure. An application for measuring the interest rate risk. By Mirta González; María Cecilia Pérez
  3. lCARE – localizing Conditional AutoRegressive Expectiles By Xiu Xu; Andrija Mihoci; Wolfgang Karl Härdle
  4. On Game-Theoretic Risk Management (Part Two) - Algorithms to Compute Nash-Equilibria in Games with Distributions as Payoffs By Stefan Rass
  5. A Credibility Approach of the Makeham Mortality Law By Yahia Salhi; Pierre-Emmanuel Thérond; Julien Tomas
  6. Hedging with Derivatives and Firm Value By Mariana Vila Nova; António Melo Cerqueira; Elísio Brandão
  7. Dynamic Correlations and Portfolio Diversification between Islamic and Conventional Sector Equity Indexes By Walid Mensi; Shawkat Hammoudeh; Ahmet Sensoy; Seong-Min Yoon
  8. Macro-Driven VaR Forecasts: From Very High to Very-Low Frequency Data By Yves Dominicy; Harry-Paul Vander Elst
  9. Price Impacts of Imperfect Collateralization (Revised version of CARF-F-355; Forthcoming in "International Journal of Financial Engineering") By Kenichiro Shiraya; Akihiko Takahashi
  10. Option-Based Estimation of the Price of Co-Skewness and Co-Kurtosis Risk By Peter Christoffersen; Mathieu Fournier; Kris Jacobs; Mehdi Karoui
  11. Coping with Nasty Surprises: Improving Risk Management in the Public Sector Using Simplified Bayesian Methods By Mark Matthews; Tom Kompas

  1. By: Rama Cont (Imperial College London)
    Abstract: Central counterparties (CCPs) have become pillars of the new global financial architecture following the financial crisis of 2008. The key role of CCPs in mitigating counterparty risk and contagion has in turn cast them as systemically important financial institutions whose eventual failure may lead to potentially serious consequences for financial stability, and prompted discussions on CCP risk management standards and safeguards for recovery and resolution of CCPs in case of failure. We contribute to the debate on CCP default resources by focusing on the incentives generated by the CCP loss allocation rules for the CCP and its members and discussing how the design of loss allocation rules may be used to align these incentives in favor of outcomes which benefit financial stability. After reviewing the ingredients of the CCP loss waterfall and various proposals for loss recovery provisions for CCPs, we examine the risk management incentives created by different ingredients in the loss waterfall and discuss possible approaches for validating the design of the waterfall. We emphasize the importance of CCP stress tests and argue that such stress tests need to account for the interconnectedness of CCPs through common members and cross-margin agreements. A key proposal is that capital charges on assets held against CCP Default Funds should depend on the quality of the risk management of the CCP, as assessed through independent stress tests.
    Keywords: CCP, central clearing, central counterparty, systemic risk, default risk, counterparty risk, default fund, OTC derivatives, mechanism design, regulation, EMIR.
    Date: 2015–11–27
  2. By: Mirta González (Central Bank of Argentina); María Cecilia Pérez (Central Bank of Argentina)
    Abstract: In order to provide a tool for risk management improvement and appropriate regulation, a methodology for measuring interest rate risk is applied in this paper. After estimating and simulating the interest rate term structure, the value at risk and expected shortfall are calculated on a portfolio. An application of alpha-stable distributions allows us to represent the asymmetric, leptokurtic and heavy-tailed shape of financial returns and the occurrence of extreme scenarios.
    Keywords: interest rate risk, regulation, risk management, term structure
    JEL: C15 C16 E43 E59 G11 G12
    Date: 2015–11
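    The workflow the abstract describes (simulate heavy-tailed returns, then read off VaR and expected shortfall empirically) can be sketched as follows. This is a hypothetical illustration, not the authors' code; the alpha-stable parameters below are made up, and SciPy's `levy_stable` stands in for whatever simulation engine the paper uses.

```python
import numpy as np
from scipy.stats import levy_stable

# Illustrative (made-up) alpha-stable parameters: stability, skew, location, scale.
# alpha < 2 gives the heavy tails the abstract refers to.
alpha, beta, loc, scale = 1.7, -0.1, 0.0, 0.01
returns = levy_stable.rvs(alpha, beta, loc=loc, scale=scale,
                          size=10_000, random_state=42)

level = 0.95
# VaR: loss threshold exceeded with probability 1 - level
var = -np.quantile(returns, 1 - level)
# Expected shortfall: average loss conditional on exceeding the VaR threshold
es = -returns[returns <= -var].mean()
print(f"95% VaR: {var:.4f}, 95% ES: {es:.4f}")
```

    By construction the expected shortfall is at least as large as the VaR, which is one reason regulators favor it as a tail-risk measure.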
  3. By: Xiu Xu; Andrija Mihoci; Wolfgang Karl Härdle;
    Abstract: We account for time-varying parameters in the conditional expectile-based value at risk (EVaR) model. EVaR appears more sensitive to the magnitude of portfolio losses than the quantile-based Value at Risk (QVaR); nevertheless, by fitting models over relatively long, ad hoc fixed time intervals, existing research ignores potential time variation in the parameters. Our work focuses on this issue by exploiting the local parametric approach in quantifying tail risk dynamics. By achieving a balance between parameter variability and modelling bias, one can safely fit a parametric expectile model over a stable interval of homogeneity. Empirical evidence from three stock markets over 2005–2014 shows that the parameter homogeneity interval lengths account for approximately 1–6 months of daily observations. Our method outperforms models with one-year fixed intervals, as well as quantile-based candidates, when employing a time invariant portfolio protection (TIPP) strategy for the DAX portfolio. The tail risk measure implied by our model thus provides valuable insights for asset allocation and portfolio insurance.
    Keywords: expectiles, tail risk, local parametric approach, risk management
    JEL: C32 C51 G17
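    The expectile that underlies EVaR can be computed as the fixed point of an asymmetrically weighted mean. The sketch below is a minimal illustration of that building block, not the authors' lCARE procedure; the sample and the iteration scheme are my own assumptions.

```python
import numpy as np

def expectile(y, tau, tol=1e-10, max_iter=1000):
    """tau-expectile of a sample via asymmetric least-squares fixed-point iteration."""
    e = y.mean()  # the 0.5-expectile is the mean, a natural starting point
    for _ in range(max_iter):
        w = np.where(y > e, tau, 1.0 - tau)   # asymmetric least-squares weights
        e_new = np.sum(w * y) / np.sum(w)
        if abs(e_new - e) < tol:
            break
        e = e_new
    return e

rng = np.random.default_rng(0)
y = rng.standard_t(df=4, size=5_000) * 0.01   # heavy-tailed stand-in for daily returns
print(expectile(y, 0.05))   # lower-tail expectile, the ingredient of EVaR
```

    Unlike a quantile, the expectile weights observations by the magnitude of their deviation, which is why the abstract notes EVaR's greater sensitivity to the size of losses.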
  4. By: Stefan Rass
    Abstract: The game-theoretic risk management framework put forth in the precursor work "Towards a Theory of Games with Payoffs that are Probability-Distributions" (arXiv:1506.07368 [q-fin.EC]) is herein extended by algorithmic details on how to compute equilibria in games where the payoffs are probability distributions. Our approach is "data driven" in the sense that we assume empirical data (measurements, simulations, etc.) to be available that can be compiled into distribution models, which are suitable for efficient decisions about preferences, and for setting up and solving games using these as payoffs. While preferences among distributions turn out to be quite simple if nonparametric methods (kernel density estimates) are used, computing Nash equilibria in games using such models turns out to be inefficient (if not impossible). In fact, we give a counterexample in which fictitious play fails to converge for the (specifically unfortunate) choice of payoff distributions in the game, and introduce a suitable tail approximation of the payoff densities to tackle the issue. The overall procedure is essentially a modified version of fictitious play, and is herein described for standard and multicriteria games, to iteratively deliver an (approximate) Nash equilibrium.
    Date: 2015–11
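    For readers unfamiliar with the baseline the paper modifies, here is classical fictitious play for a two-player zero-sum game with ordinary scalar payoffs (the paper treats the harder case where payoffs are whole probability distributions). Each player best-responds to the opponent's empirical mixed strategy; in zero-sum games the empirical frequencies converge to a Nash equilibrium (Robinson, 1951).

```python
import numpy as np

# Rock-paper-scissors payoff matrix for the row player (zero-sum game).
A = np.array([[0, -1, 1],
              [1, 0, -1],
              [-1, 1, 0]], dtype=float)

row_counts = np.ones(3)   # empirical action counts, seeded with a uniform "prior"
col_counts = np.ones(3)
for _ in range(20_000):
    i = np.argmax(A @ col_counts)   # row best-responds to column's history
    j = np.argmin(row_counts @ A)   # column minimizes row's expected payoff
    row_counts[i] += 1
    col_counts[j] += 1

print(row_counts / row_counts.sum())   # approaches the equilibrium (1/3, 1/3, 1/3)
```

    The paper's contribution starts where this sketch ends: when each matrix entry is a distribution rather than a number, comparing "payoffs" requires a preference order over distributions, and the iteration above must be modified accordingly.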
  5. By: Yahia Salhi (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1); Pierre-Emmanuel Thérond (Galea & Associés, SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1); Julien Tomas (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1)
    Abstract: The present article illustrates a credibility approach to mortality. Interest from life insurers in assessing their portfolios' mortality risk has considerably increased. The new regulation and norms, Solvency II, shed light on the need for life tables that best reflect the experience of insured portfolios in order to quantify reliably the underlying mortality risk. In this context and following the work of Bühlmann and Gisler (2005) and Hardy and Panjer (1998), we propose a credibility approach which consists of revising, as new observations arrive, the assumption on the mortality curve. Unlike the methodology considered in Hardy and Panjer (1998), which consists of updating the aggregate deaths, we have chosen to add an age structure on these deaths. Formally, we use a Makeham graduation model. Such an adjustment adds structure to the mortality pattern, which is useful when portfolios are of limited size, so as to ensure a good representation over the entire age bands considered. We investigate the divergences in the mortality forecasts generated by the classical credibility approaches to mortality, including Hardy and Panjer (1998) and the Poisson-Gamma model, on portfolios originating from various French insurance companies.
    Keywords: Credibility, Makeham law, Mortality, Life insurance, Graduation, Extrapolation
    Date: 2015–11–24
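    The Makeham graduation the abstract refers to models the force of mortality as mu(x) = A + B·c^x; integrating the hazard over one year gives the one-year death probability. The sketch below uses made-up parameters purely for illustration, not values fitted to the French portfolios in the paper.

```python
import math

# Illustrative (made-up) Makeham parameters: baseline hazard A, and the
# Gompertz component B * c**x that grows exponentially with age.
A, B, c = 0.0003, 0.00004, 1.10

def q_x(x):
    """One-year death probability at age x under the Makeham law."""
    # Closed form of the hazard integral over [x, x + 1]:
    # A + B * c**x * (c - 1) / ln(c)
    integral = A + B * c**x * (c - 1.0) / math.log(c)
    return 1.0 - math.exp(-integral)

for age in (40, 60, 80):
    print(age, round(q_x(age), 5))
```

    Because the law imposes this smooth age structure, a small portfolio's crude death counts can be graduated into a full life table, which is the feature the credibility updating exploits.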
  6. By: Mariana Vila Nova (FEP-UP - School of Economics and Management); António Melo Cerqueira (FEP-UP - School of Economics and Management); Elísio Brandão (FEP-UP - School of Economics and Management)
    Abstract: This study examines the impacts of risk management strategies with derivatives on firms' market value using a sample of non-financial firms listed in the FTSE-350 share index at the London Stock Exchange between 2005 and 2013. We focus on the use of derivatives to hedge both foreign exchange risk and interest rate risk. To avoid, as far as possible, the endogeneity among variables and consequently strengthen the tests, an instrumental variables approach is employed in addition to OLS with time and industry fixed effects. The results reveal a positive effect of foreign currency derivatives and interest rate derivatives on firms' market value, which indicates that investors, at least under the conditions described in the study, appreciate these risk management practices and reward them with higher market values. However, when we distinguish among the derivative contracts employed – forward, option, swap – their impacts on firm value differ. For instance, while swaps used to hedge interest rate risk or forward contracts used to hedge foreign exchange rate risk have positive and significant effects on value, this effect is not clear for option contracts.
    Keywords: Hedging, Derivatives, Firm Value, Risk Management
    JEL: G32 F31
    Date: 2015–12
  7. By: Walid Mensi; Shawkat Hammoudeh; Ahmet Sensoy; Seong-Min Yoon
    Abstract: This study analyses the dynamic spillovers across ten Dow Jones Islamic-conventional sector index pairs. Using various multivariate GARCH models, the results show significant time-varying conditional correlations among all the pairs. Moreover, there is evidence that the conditional correlations among all the sector pairs, except those of the Telecommunication and Utilities sectors, increase after the onset of the global financial crisis, suggesting non-subsiding risks, contagion effects and gradual financial linkages. The Islamic sectors' risk exposure can be effectively hedged over time in portfolios containing conventional sector stocks. These results provide several practical implications for portfolio managers and policymakers in regard to optimal asset allocations, portfolio risk management and the diversification benefits among these markets.
    Keywords: Sectoral Islamic index, Conventional index, Cross-correlation analysis, Diversification, GARCH-cDCC model.
    JEL: G14 G15
    Date: 2015–11
  8. By: Yves Dominicy; Harry-Paul Vander Elst
    Abstract: This paper studies in some detail the joint use of high-frequency data and economic variables to model financial returns and volatility. We extend the Realized LGARCH model by allowing for a time-varying intercept, which responds to changes in macroeconomic variables in a MIDAS framework and allows macroeconomic information to be included directly into the estimation and forecast procedure. Using more than 10 years of high-frequency transactions for 55 U.S. stocks, we argue that the combination of low-frequency exogenous economic indicators with high-frequency financial data improves our ability to forecast the volatility of returns, their full multi-step-ahead conditional distribution and the multi-period Value-at-Risk. We document that nominal corporate profits and term spreads generate accurate risk measure forecasts at horizons beyond two business weeks.
    Keywords: realized LGARCH; value-at-risk; density forecasts; realized measures of volatility
    JEL: C22 C53
    Date: 2015–11
  9. By: Kenichiro Shiraya (The University of Tokyo); Akihiko Takahashi (The University of Tokyo)
    Abstract: This paper studies the impacts of imperfect collateralization on derivatives values. In particular, we investigate option prices in no-collateral-posting and time-lagged collateral posting cases with stochastic volatility, interest rate and default intensity models, where a stochastic collateral asset value may depend on the values of assets different from the underlying contract. We also derive an approximation of the credit value adjustment (CVA)'s density function in pricing a forward contract with bilateral counterparty risk, which seems useful in evaluating the CVA's Value-at-Risk (VaR).
    Date: 2015–11
  10. By: Peter Christoffersen (University of Toronto - Rotman School of Management and CREATES); Mathieu Fournier (HEC Montreal); Kris Jacobs (University of Houston - C.T. Bauer College of Business); Mehdi Karoui (McGill University)
    Abstract: We show that the prices of risk for factors that are nonlinear in the market return are readily obtained using index option prices. We apply this insight to the price of co-skewness and co-kurtosis risk. The price of co-skewness risk corresponds to the spread between the physical and the risk-neutral second moments, and the price of co-kurtosis risk corresponds to the spread between the physical and the risk-neutral third moments. The option-based estimates of the prices of risk lead to reasonable values of the associated risk premia. An out-of-sample analysis of factor models with co-skewness and co-kurtosis risk indicates that the new estimates of the price of risk improve the models' performance. Models with higher-order market moments also robustly outperform standard competitors such as the CAPM and the Fama-French model.
    Keywords: Co-skewness, co-kurtosis, risk premia, options, cross-section, out-of-sample
    JEL: G12 G13 G17
    Date: 2015–01–09
  11. By: Mark Matthews; Tom Kompas
    Abstract: Bayesian methods are particularly useful for informing decisions when information is sparse and ambiguous, but decisions involving risks must still be made in a timely manner. Given the utility of these approaches to public policy, this article considers the case for refreshing the general practice of risk management in governance by using a simplified Bayesian approach based on raw data expressed as ‘natural frequencies’. This simplified Bayesian approach, which benefits from the technical advances made in signal processing and machine learning, is suitable for use by non-specialists, and focuses attention on the incidence and potential implications of false positives and false negatives in the diagnostic tests used to manage risk. The article concludes by showing how graphical plots of the incidence of true positives relative to false positives in test results can be used to assess diagnostic capabilities in an organisation, and also inform strategies for capability improvement.
    Keywords: risk management; Bayesian inference; public sector
    Date: 2015–08–17
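    The "natural frequencies" presentation the article advocates replaces chained conditional probabilities with raw counts, which makes false positives and false negatives directly visible. The numbers below are illustrative assumptions, not figures from the article.

```python
# Express a risk diagnostic as counts per 10,000 cases (natural frequencies).
population = 10_000
base_rate = 0.01       # assume 1% of cases are genuine threats
sensitivity = 0.90     # P(test positive | threat)
specificity = 0.95     # P(test negative | no threat)

threats = base_rate * population
true_positives = sensitivity * threats
false_negatives = threats - true_positives
false_positives = (1 - specificity) * (population - threats)

# Positive predictive value: how often a flagged case is a real threat.
ppv = true_positives / (true_positives + false_positives)
print(f"Per {population:,} cases: {true_positives:.0f} true positives, "
      f"{false_positives:.0f} false positives (PPV = {ppv:.1%})")
```

    Even with a seemingly accurate test, the low base rate means most alerts are false alarms (here roughly 85% of them), which is exactly the intuition the natural-frequency format is meant to convey to non-specialists.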

This nep-rmg issue is ©2015 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.