nep-rmg New Economics Papers
on Risk Management
Issue of 2015‒12‒28
eleven papers chosen by
Stan Miles
Thompson Rivers University

  1. Which measure for PFE? The Risk Appetite Measure, A By Chris Kenyon; Andrew Green; Mourad Berrahoui
  2. Measuring heterogeneity in bank liquidity risk: who are the winners and the losers? By Jean-Loup SOULA
  3. Hedging Efficiency of Atlantic Salmon Futures By Asche, Frank; Misund, Bard
  4. Financial Ratios and Prediction on Corporate Bankruptcy in the Atlantic Salmon Industry By Misund, Bård
  5. Risk Assessment under Ambiguity: Precautionary Learning vs. Research Pessimism By Heyen, Daniel; Goeschl, Timo; Wiesenfarth, Boris
  6. Market inconsistencies of the market-consistent European life insurance economic valuations: pitfalls and practical solutions By Nicole El Karoui; Stéphane Loisel; Jean-Luc Prigent; Julien Vedani
  7. Predictive Models for Disaggregate Stock Market Volatility By Chong, Terence Tai Leung; Lin, Shiyu
  8. Systemic Risk and the Optimal Seniority Structure of Banking Liabilities By Spiros Bougheas; Alan Kirman
  9. The allocation of financial risks during the life cycle in individual and collective DC pension contracts By Marcel Lever; Ilja Boelaars (University of Chicago); Ryanne Cox (DNB); Roel Mehlkopf (DNB; Netspar)
  10. Uniform bounds for Black-Scholes implied volatility By Michael R. Tehranchi
  11. A fully non-parametric heteroskedastic model By Matthieu Garcin; Clément Goulet

  1. By: Chris Kenyon; Andrew Green; Mourad Berrahoui
    Abstract: Potential Future Exposure (PFE) is a standard risk metric for managing business unit counterparty credit risk, but there is debate on how it should be calculated: whether to use one of many historical ("physical") measures (one per calibration setup) or one of many risk-neutral measures (one per numeraire). We argue that limits should be based on the bank's own risk appetite, provided that this is consistent with regulatory backtesting, and that whichever measure is used should behave (in a sense made precise) like a historical measure. Backtesting is required by regulators only for banks with IMM approval, but we expect that similar methods are part of limit maintenance generally. We provide three methods for computing the bank price of risk from readily available business unit data, i.e. business unit budgets (rates of return) and limits (e.g. exposure percentiles). Hence we define and propose a Risk Appetite Measure, A, for PFE and suggest that it is uniquely consistent with the bank's Risk Appetite Framework as required by sound governance.
    Date: 2015–12
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1512.06247&r=rmg
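    The following minimal Python sketch illustrates only the generic definition of PFE as a high percentile of simulated positive exposure, as context for item 1; the exposure model, the 95% level and all variable names are illustrative assumptions, not the paper's Risk Appetite Measure A.

      import numpy as np

      def pfe_profile(exposure_paths, quantile=0.95):
          # PFE per future date: the `quantile`-level percentile of the positive
          # part of simulated exposure; exposure_paths has shape (n_paths, n_steps).
          positive = np.maximum(exposure_paths, 0.0)
          return np.quantile(positive, quantile, axis=0)

      # Toy exposure paths from a driftless random walk (an assumption for the example).
      rng = np.random.default_rng(0)
      n_paths, n_steps, dt, vol = 10000, 40, 0.25, 0.2
      paths = np.cumsum(vol * np.sqrt(dt) * rng.standard_normal((n_paths, n_steps)), axis=1)
      print(pfe_profile(paths)[:5])   # PFE at the first five future dates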
  2. By: Jean-Loup SOULA (LaRGE Research Center, Université de Strasbourg)
    Abstract: The 2007-2009 crisis stressed the importance of liquidity for banks. Using a risk factor model, we propose a measure of bank exposure to liquidity risk based on banks' sensitivity to aggregate liquidity conditions. Results indicate that liquidity risk is a specific risk. Moreover, this measure sheds light on the heterogeneity among banks in terms of exposure to liquidity risk: banks may benefit from, lose from, or be insensitive to aggregate liquidity conditions, and we document large variation in exposure across the 2008 and 2011 crises. Larger size and higher capital levels tend to insulate banks from aggregate liquidity risk. However, the deposit share, reliance on wholesale funding and the funding gap matter only for banks whose risk decreases when aggregate liquidity risk increases. These ratios indicate the level of liquidity production by banks. This suggests that market discipline applies to liquidity production, but only to the less risky banks in case of a liquidity crisis; market discipline thus appears to be one-sided. To that extent, it reinforces the necessity of imposing liquidity requirements on all banks, as through the Basel III liquidity ratios.
    JEL: E51 G21 G28 G32
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:lar:wpaper:2015-09&r=rmg
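    As context for item 2, a bank's sensitivity to aggregate liquidity conditions within a risk factor model can be sketched as a simple two-factor regression. The sketch below uses simulated data and an OLS setup as illustrative assumptions; it is not the authors' specification.

      import numpy as np

      # Simulated daily series: bank stock returns driven by a market factor and an
      # aggregate liquidity factor (both made up for this example).
      rng = np.random.default_rng(1)
      T = 250
      market = rng.normal(0.0, 0.01, T)
      liquidity = rng.normal(0.0, 0.005, T)
      bank_returns = 0.8 * market + 0.3 * liquidity + rng.normal(0.0, 0.004, T)

      # OLS: the coefficient on the liquidity factor measures the bank's exposure
      # to aggregate liquidity risk.
      X = np.column_stack([np.ones(T), market, liquidity])
      beta, *_ = np.linalg.lstsq(X, bank_returns, rcond=None)
      print("liquidity beta:", beta[2])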
  3. By: Asche, Frank (UiS); Misund, Bard (University of Stavanger)
    Abstract: This paper examines the hedging properties of Atlantic salmon futures. Hedging is important since it allows producers to mitigate the risk of adverse price changes in the spot market. We examine the hedging efficiency of three strategies: no hedge, a full (one-to-one) hedge, and hedging with optimal hedge ratios. The optimal hedge ratios are obtained from a constant estimated hedge ratio, from ratios estimated with rolling 20-week and 52-week windows, and from bivariate GARCH models. The results provide evidence that hedging with futures contracts listed on Fish Pool reduces risk for producers of farmed Atlantic salmon. The best hedging efficiency is achieved with a simple one-to-one hedge, closely followed by the bivariate GARCH approach.
    Keywords: Atlantic salmon markets; Forward prices; Risk premium
    JEL: G13 G14 Q22
    Date: 2015–12–18
    URL: http://d.repec.org/n?u=RePEc:hhs:stavef:2015_012&r=rmg
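    For item 3, the textbook minimum-variance hedge ratio and the associated variance-reduction measure of hedging efficiency can be sketched as follows; the simulated price changes are assumptions for illustration, and the paper's exact estimators (rolling windows, bivariate GARCH) are not reproduced.

      import numpy as np

      def min_variance_hedge_ratio(spot_changes, futures_changes):
          # Textbook minimum-variance hedge ratio: Cov(dS, dF) / Var(dF).
          cov = np.cov(spot_changes, futures_changes)
          return cov[0, 1] / cov[1, 1]

      def hedging_efficiency(spot_changes, futures_changes, h):
          # Proportional variance reduction of the hedged position versus no hedge.
          hedged = spot_changes - h * futures_changes
          return 1.0 - np.var(hedged) / np.var(spot_changes)

      # Simulated weekly price changes stand in for salmon spot and futures data.
      rng = np.random.default_rng(2)
      f = rng.normal(0.0, 1.0, 260)
      s = 0.9 * f + rng.normal(0.0, 0.4, 260)
      h = min_variance_hedge_ratio(s, f)
      print(h, hedging_efficiency(s, f, h), hedging_efficiency(s, f, 1.0))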
  4. By: Misund, Bård (UiS)
    Abstract: This paper addresses the issue of credit risk in the salmon industry. During the period 2000-2002 the Norwegian salmon industry witnessed a period of low prices, leading to a wave of defaults and bankruptcies. The consequences were large monetary losses for both investors and banks, highlighting the importance of early detection of failing firms. Using financial ratios measuring the firms' financial status prior to this event, two credit risk models are developed: one using logit regression and the other Classification and Regression Trees. The performance of the two models is compared to a cross-industry benchmark model developed by the Norwegian Central Bank. The models estimated on industry data are better at separating companies with high and low credit risk in the salmon industry than the benchmark model.
    Keywords: Atlantic salmon production; credit risk; default probability models; logit; Classification and regression trees.
    JEL: G17 M49 Q22
    Date: 2015–12–18
    URL: http://d.repec.org/n?u=RePEc:hhs:stavef:2015_009&r=rmg
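    As a generic illustration of the logit approach in item 4, the sketch below fits a logistic regression of a default indicator on financial ratios; the ratios, data and the use of scikit-learn are assumptions for illustration, not the paper's estimated model or the Norwegian Central Bank benchmark.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Hypothetical financial ratios (e.g. equity ratio, liquidity ratio, margin)
      # and default indicators; real salmon-industry accounts would replace these.
      rng = np.random.default_rng(3)
      X = rng.normal(size=(200, 3))
      y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(size=200) < -0.5).astype(int)

      # Logit model of default on the ratios; the predicted probabilities are the
      # estimated default probabilities.
      model = LogisticRegression().fit(X, y)
      print(model.predict_proba(X)[:5, 1])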
  5. By: Heyen, Daniel; Goeschl, Timo; Wiesenfarth, Boris
    Abstract: Agencies charged with regulating complex risks such as food safety or novel substances frequently need to take decisions on risk assessment and risk management under conditions of ambiguity, i.e. where probabilities cannot be assigned to possible outcomes of regulatory actions. What mandates should society write for such agencies? Two approaches stand out in the current discussion. One charges the agency to apply welfare economics based on expected utility theory; this approach underpins conventional cost-benefit analysis (CBA). The other requires that an ambiguity-averse decision rule, of which maxmin expected utility (MEU) is the best known, be applied in order to build in a margin of safety in accordance with the Precautionary Principle (PP). The contribution of the present paper is a relative assessment of how a CBA and a PP mandate impact the regulatory task of risk assessment. In our parsimonious model, a decision maker can decide on the precision of a signal that provides noisy information on a payoff-relevant parameter. We find that the effect of MEU on information acquisition is shaped by two countervailing forces that we dub 'Precautionary Learning' and 'Research Pessimism'. We find that, contrary to intuition, a mandate of PP rather than CBA will often give rise to a less informed regulator. PP can therefore lead to a higher likelihood of regulatory mistakes, such as the approval of harmful new substances.
    Keywords: scientific uncertainty; ambiguity; learning; risk assessment; precautionary principle; active information acquisition; regulatory mandates.
    Date: 2015–12–18
    URL: http://d.repec.org/n?u=RePEc:awi:wpaper:0605&r=rmg
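    The maxmin expected utility (MEU) rule referenced in item 5 chooses the action with the best worst-case expected utility over a set of priors. The toy payoffs and priors below are purely illustrative and show how MEU can diverge from a single-prior expected-utility choice.

      import numpy as np

      # Rows are actions (approve or ban a new substance); columns are states.
      payoffs = np.array([[5.0, -10.0],   # approve: gain if safe, loss if harmful
                          [0.0,   0.0]])  # ban: zero payoff in either state
      # A set of candidate priors over the states: the source of ambiguity.
      priors = np.array([[0.9, 0.1],
                         [0.6, 0.4]])

      expected = payoffs @ priors.T                  # expected utility per (action, prior)
      meu_choice = np.argmax(expected.min(axis=1))   # maxmin expected utility rule
      eu_choice = np.argmax(expected[:, 0])          # expected utility under a single prior
      print("MEU picks action", meu_choice, "; single-prior EU picks action", eu_choice)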
  6. By: Nicole El Karoui (LPMA - Laboratoire de Probabilités et Modèles Aléatoires - UPMC - Université Pierre et Marie Curie - Paris 6 - UP7 - Université Paris Diderot - Paris 7 - CNRS - Centre National de la Recherche Scientifique); Stéphane Loisel (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1); Jean-Luc Prigent (THEMA - Théorie économique, modélisation et applications - Université de Cergy Pontoise - CNRS - Centre National de la Recherche Scientifique); Julien Vedani (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1)
    Abstract: The Solvency II directive has introduced a specific, so-called risk-neutral framework to value economic accounting quantities throughout European life insurance companies. The adaptation of this theoretical notion for regulatory purposes requires the addition of a specific criterion, namely market-consistency, in order to objectify the choice of the valuation probability measure. This paper aims at pointing out and fixing some of the major risk sources embedded in the current regulatory life insurance valuation scheme. We compare actuarial and financial valuation schemes. We then address operational issues and potential sources of market manipulation in life insurance, induced by both theoretical and regulatory pitfalls. For example, we show that calibrating the interest rate model in October 2014 instead of on December 31st, 2014 generates a 140% increase in the economic own funds of a representative French life insurance company. We propose various adaptations of the current implementations, including a product-specific valuation scheme, to limit the impact of these market inconsistencies.
    Keywords: risk-neutral valuation, economic valuation, market-consistency, European regulation, life insurance
    Date: 2015–12–11
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-01242023&r=rmg
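    The valuations discussed in item 6 rest on risk-neutral (economic) valuation, i.e. averaging discounted cash flows over scenarios generated by a model calibrated to market data at a given date. The bare-bones Monte Carlo sketch below uses made-up scenarios and is not the Solvency II balance-sheet computation; it only indicates where the calibration-date sensitivity mentioned in the abstract enters.

      import numpy as np

      def economic_value(cash_flows, discount_factors):
          # Risk-neutral value: average of discounted cash flows across scenarios.
          # Both inputs have shape (n_scenarios, n_dates).
          return np.mean(np.sum(cash_flows * discount_factors, axis=1))

      # Toy interest-rate scenarios; a real exercise would calibrate the rate model
      # to market data at a chosen date, which is where the sensitivity arises.
      rng = np.random.default_rng(4)
      rates = 0.01 + 0.002 * rng.standard_normal((1000, 30)).cumsum(axis=1)
      discount_factors = np.exp(-np.cumsum(rates, axis=1))
      cash_flows = np.full((1000, 30), 100.0)
      print(economic_value(cash_flows, discount_factors))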
  7. By: Chong, Terence Tai Leung; Lin, Shiyu
    Abstract: This paper incorporates macroeconomic determinants into forecasting models of industry-level stock return volatility in order to detect whether different macroeconomic factors can forecast the volatility of various industries. To explain the different fluctuation characteristics among industries, we identify a set of macroeconomic determinants and examine their effects. The Clark and West (2007) test is employed to verify whether the new forecasting models, which vary across industries based on the in-sample results, yield better predictions than the two benchmark models. Our results show that the default return and the default yield have significant impacts on stock return volatility.
    Keywords: Industry level stock return volatility; Out-of-sample forecast; Granger Causality.
    JEL: C12 G12
    Date: 2015–11–08
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:68460&r=rmg
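    The Clark and West (2007) test used in item 7 compares nested forecasting models via an MSPE-adjusted loss differential. The sketch below follows the standard formulation with simulated forecasts; the data and the benchmark are illustrative assumptions.

      import numpy as np

      def clark_west_stat(y, forecast_small, forecast_large):
          # MSPE-adjusted statistic of Clark and West (2007) for nested models;
          # positive and significant values favour the larger model.
          adj = (y - forecast_small) ** 2 - (
              (y - forecast_large) ** 2 - (forecast_small - forecast_large) ** 2)
          return np.sqrt(len(adj)) * adj.mean() / adj.std(ddof=1)

      # Simulated out-of-sample target and forecasts, for illustration only.
      rng = np.random.default_rng(5)
      y = rng.normal(size=200)
      forecast_small = np.zeros(200)                        # restricted benchmark
      forecast_large = 0.3 * y + rng.normal(0.0, 1.0, 200)  # model with noisy extra information
      print(clark_west_stat(y, forecast_small, forecast_large))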
  8. By: Spiros Bougheas; Alan Kirman
    Abstract: The paper argues that systemic risk must be taken into account when designing optimal bankruptcy procedures in general, and priority rules in particular. Allowing for endogenous formation of links in the interbank market, we show that the optimal policy depends on the distribution of shocks and the severity of fire sales.
    Keywords: Banks; Priority rules; Systemic Risk
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:not:notcfc:15/15&r=rmg
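    As background for the priority rules discussed in item 8, a generic seniority waterfall distributes a failed bank's asset value across claim classes in order of seniority, pro rata within each class. The sketch and its numbers are illustrative only and do not reproduce the paper's optimal seniority structure.

      def liquidation_payouts(asset_value, claims_by_seniority):
          # Pay claim classes in order of seniority, pro rata within each class.
          payouts, remaining = [], asset_value
          for claims in claims_by_seniority:        # most senior class first
              total = sum(claims)
              paid = min(remaining, total)
              share = paid / total if total > 0 else 0.0
              payouts.append([c * share for c in claims])
              remaining -= paid
          return payouts

      # Two senior interbank claims and two junior claims, with assets worth 120.
      print(liquidation_payouts(120.0, [[80.0, 30.0], [40.0, 20.0]]))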
  9. By: Marcel Lever; Ilja Boelaars (University of Chicago); Ryanne Cox (DNB); Roel Mehlkopf (DNB; Netspar)
    Abstract: This paper measures how financial shocks - equity market, interest rate or inflation shocks - affect different generations of participants in pension schemes. We show that an individual scheme, by using a life cycle investment strategy, can largely replicate the allocation of traded risks across generations of a collective pension scheme that gradually adjusts pensions after financial shocks. Collective schemes can shift some financial risk to generations that will participate in the future, whereas individual accounts cannot. In the current institutional setting this shift of traded risk in collective contracts to future generations is limited. Collective pension schemes are able to reallocate non-traded risks among the participants to obtain a more efficient distribution of risk across generations. In schemes with individual accounts, risk sharing is limited to risks traded on financial markets.
    JEL: D91 G11 G23
    Date: 2015–12
    URL: http://d.repec.org/n?u=RePEc:cpb:discus:317&r=rmg
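    The life-cycle investment strategy referenced in item 9 is commonly implemented as an equity share that declines with age. The linear glide path below is a generic illustration; the ages and bounds are assumptions, not the paper's calibration.

      def lifecycle_equity_share(age, start_age=25, retirement_age=67,
                                 max_share=0.9, min_share=0.2):
          # Linear glide path: equity exposure falls from max_share at start_age
          # to min_share at retirement_age; all parameters are illustrative.
          frac = (retirement_age - age) / (retirement_age - start_age)
          frac = min(max(frac, 0.0), 1.0)
          return min_share + frac * (max_share - min_share)

      for age in (30, 45, 60, 67):
          print(age, round(lifecycle_equity_share(age), 2))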
  10. By: Michael R. Tehranchi
    Abstract: In this note, Black-Scholes implied volatility is expressed in terms of various optimisation problems. From these representations, upper and lower bounds are derived which hold uniformly across moneyness and call price. Various symmetries of the Black-Scholes formula are exploited to derive new bounds from old. These bounds are used to reprove asymptotic formulae for implied volatility at extreme strikes and/or maturities. Finally, a curious characterisation of log-concave distributions on the real line is derived, generalising the main optimisation-based representation.
    Date: 2015–12
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1512.06812&r=rmg
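    Implied volatility, the object bounded in item 10, inverts the Black-Scholes formula for a given call price. The routine root-finding sketch below uses the standard Black-Scholes call formula and an assumed search bracket; it does not implement the paper's uniform bounds.

      import numpy as np
      from scipy.optimize import brentq
      from scipy.stats import norm

      def bs_call(S, K, T, r, sigma):
          # Standard Black-Scholes price of a European call option.
          d1 = (np.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
          d2 = d1 - sigma * np.sqrt(T)
          return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

      def implied_vol(price, S, K, T, r, lo=1e-6, hi=5.0):
          # Invert the call price by root-finding; lo and hi are an assumed bracket.
          return brentq(lambda sigma: bs_call(S, K, T, r, sigma) - price, lo, hi)

      print(implied_vol(price=10.45, S=100.0, K=100.0, T=1.0, r=0.01))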
  11. By: Matthieu Garcin (CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique, Natixis Asset Management - SAMS, LABEX Refi - ESCP Europe); Clément Goulet (CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique, LABEX Refi - ESCP Europe)
    Abstract: In this paper we propose a new model for estimating returns and volatility. Our approach is based both on the wavelet denoising technique and on variational theory. We posit that volatility can be expressed as a non-parametric functional form of past returns. Therefore, we are able to forecast both returns and volatility and to build confidence intervals for the predicted returns. Our technique outperforms classical time-series models. Our model does not require stationarity of the observed log-returns, it preserves the stylised facts of volatility, and it is based on a fully non-parametric form. This non-parametric form is obtained by means of multiplicative noise theory. To our knowledge, this is the first time such a method has been used for financial modeling. We propose an application to intraday and daily financial data.
    Keywords: Volatility modeling, non variational calculus, wavelet theory, trading strategy
    Date: 2015–09
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-01244292&r=rmg
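    Wavelet denoising, one ingredient of the approach in item 11, decomposes a return series, shrinks the detail coefficients and reconstructs. The sketch below uses the PyWavelets package with an arbitrary wavelet and the common universal-threshold rule as assumptions; it is not the authors' estimator.

      import numpy as np
      import pywt

      def wavelet_denoise(returns, wavelet="db4", level=4):
          # Soft-threshold the detail coefficients of a discrete wavelet transform
          # and reconstruct; the threshold follows the common "universal" rule.
          coeffs = pywt.wavedec(returns, wavelet, level=level)
          sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise scale from finest details
          thresh = sigma * np.sqrt(2.0 * np.log(len(returns)))
          coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
          return pywt.waverec(coeffs, wavelet)[:len(returns)]

      # Noisy toy return series for illustration only.
      rng = np.random.default_rng(6)
      r = 0.001 * np.sin(np.linspace(0.0, 20.0, 1024)) + 0.01 * rng.standard_normal(1024)
      print(wavelet_denoise(r)[:5])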

This nep-rmg issue is ©2015 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.