nep-rmg New Economics Papers
on Risk Management
Issue of 2018‒07‒16
fourteen papers chosen by
Stan Miles
Thompson Rivers University

  1. Proposal on ELBE and LGD in-default: tackling capital requirements after the financial crisis By González, Marta Ramos; Ureña, Antonio Partal; Fernández-Aguado, Pilar Gómez
  2. A Multi-Criteria Financial and Energy Portfolio Analysis of Hedge Fund Strategies By Allen, D.E.; McAleer, M.J.; Singh, A.K.
  3. Tail Risks, Asset Prices, and Investment Horizons By Jozef Baruník; Matěj Nevrla
  4. Estimation of Covariance Matrices for Portfolio Optimization using Gaussian Processes By Rajbir-Singh Nirwan; Nils Bertschinger
  5. Impact of multimodality of distributions on VaR and ES calculations By Dominique Guegan; Bertrand Hassani; Kehan Li
  6. Should the advanced measurement approach be replaced with the standardized measurement approach for operational risk? By Gareth Peters; Pavel Shevchenko; Bertrand Hassani; Ariane Chapelle
  7. Taxonomy of Chilean Financial Fragility Periods from 1975 By Juan Francisco Martínez; José Miguel Matus; Daniel Oda
  8. A New Model for Pricing Collateralized Financial Derivatives By Tim Xiao
  9. Simple Market Timing with Moving Averages By Ilomäki, J.; Laurila, H.; McAleer, M.J.
  10. Financial Credit Risk and Core Enterprise Supply Chains By Mou, W.M.; Wong, W.-K.; McAleer, M.J.
  11. Quantitative approach to multifractality induced by correlations and broad distribution of data By Rafal Rak; Dariusz Grech
  12. Portfolio Choice with Market-Credit Risk Dependencies By Lijun Bo; Agostino Capponi
  13. The Impact of the Identification of GSIBs on their Business Model By Aurélien Violon; Dominique Durant; Oana Toader
  14. Mortality/longevity Risk-Minimization with or without securitization By Tahir Choulli; Catherine Daveloose; Michèle Vanmaele

  1. By: González, Marta Ramos; Ureña, Antonio Partal; Fernández-Aguado, Pilar Gómez
    Abstract: Following the financial crisis, the share of non-performing loans has increased significantly, while the regulatory guidelines on the Internal Ratings-Based (IRB) approach to capital adequacy calculation for defaulted exposures remain too general. As a result, these high-risk portfolios are clearly in danger of being managed in a heterogeneous and inappropriate manner by the financial institutions permitted to use the IRB system, with consequent undue variability of Risk-Weighted Assets (RWA). This paper presents a proposal for constructing Advanced IRB models for defaulted exposures, in line with current regulations, that preserve the risk sensitivity of capital requirements. To do so, both the Expected Loss Best Estimate (ELBE) and the Loss Given Default (LGD) in-default are obtained, backed by an innovative indicator (the Mixed Adjustment Indicator) introduced to ensure an appropriate estimation of expected and unexpected losses. The methodology has low complexity and is easily applied to the databases commonly used at these institutions, as illustrated by two examples. JEL Classification: C51, G21, G28, G32
    Keywords: banking regulation, credit risk, defaulted exposures
    Date: 2018–06
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20182165&r=rmg
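    Illustration: The paper's Mixed Adjustment Indicator is specific to the proposal and is not reproduced here. As a minimal, hedged sketch of the regulatory quantities the abstract refers to, under the IRB rules the unexpected-loss capital for a defaulted exposure is driven by the excess of the LGD in-default over the ELBE; the Python fragment below uses purely hypothetical figures.
      # Minimal sketch: IRB capital for a defaulted exposure is driven by the
      # excess of LGD in-default over ELBE. All figures are hypothetical; the
      # paper's Mixed Adjustment Indicator is not reproduced.
      def defaulted_exposure_capital(ead, lgd_in_default, elbe):
          """Unexpected-loss capital for a defaulted exposure (IRB logic)."""
          k = max(0.0, lgd_in_default - elbe)   # capital requirement as a rate
          return k * ead                        # capital amount

      # Hypothetical example: EAD of 1m, LGD in-default 55%, ELBE 40%.
      print(defaulted_exposure_capital(1_000_000, 0.55, 0.40))  # 150000.0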
  2. By: Allen, D.E.; McAleer, M.J.; Singh, A.K.
    Abstract: The paper presents a multi-criteria portfolio analysis of hedge fund strategies concerned with financial commodities, including the possibility of energy spot, futures and exchange traded funds (ETF). It features a tri-criteria analysis of the Eurekahedge fund strategy index data, using nine Eurekahedge equally weighted main strategy indices for the portfolio analysis. The tri-criteria analysis features three objectives: return, risk and dispersion of risk, in a Multi-Criteria Optimisation (MCO) portfolio analysis. We vary the MCO return and risk targets and contrast the results with four more standard portfolio optimisation criteria, namely the tangency portfolio (MSR), the most diversified portfolio (MDP), the global minimum variance portfolio (GMW), and portfolios based on minimising expected shortfall (ERC). Backtests of the chosen portfolios for this hedge fund data set indicate that the use of MCO is accompanied by uncertainty about the a priori choice of optimal parameter settings for the decision criteria. The empirical results do not appear to outperform more standard bi-criteria portfolio analyses in the backtests undertaken on the hedge fund index data.
    Keywords: MCO, Portfolio Analysis, Hedge Fund Strategies, Multi-Criteria Optimisation, Genetic Algorithms, Spot prices, Futures prices, Exchange Traded Funds (ETF)
    JEL: G15 G17 G32 C58 D53
    Date: 2018–06–11
    URL: http://d.repec.org/n?u=RePEc:ems:eureir:109055&r=rmg
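    Illustration: The paper's genetic-algorithm MCO implementation is not reproduced here; the minimal Python sketch below only illustrates the tri-criteria idea (maximise return, minimise volatility, minimise the dispersion of risk contributions) with a random search over synthetic returns and a naive non-dominated filter.
      # Tri-criteria portfolio search on synthetic data: keep the portfolios
      # that are not dominated on (return, volatility, risk dispersion).
      import numpy as np

      rng = np.random.default_rng(0)
      n_assets, n_obs = 9, 500                                  # e.g. nine strategy indices
      returns = rng.normal(0.0004, 0.01, (n_obs, n_assets))     # synthetic daily returns
      mu, cov = returns.mean(axis=0), np.cov(returns, rowvar=False)

      def criteria(w):
          port_var = w @ cov @ w
          risk_contrib = w * (cov @ w) / port_var               # fractional risk contributions
          return w @ mu, np.sqrt(port_var), risk_contrib.std()

      candidates = rng.dirichlet(np.ones(n_assets), size=5000)  # long-only weights
      scores = np.array([criteria(w) for w in candidates])

      def dominated(i):
          at_least_as_good = ((scores[:, 0] >= scores[i, 0]) &
                              (scores[:, 1] <= scores[i, 1]) &
                              (scores[:, 2] <= scores[i, 2]))
          strictly_better = ((scores[:, 0] > scores[i, 0]) |
                             (scores[:, 1] < scores[i, 1]) |
                             (scores[:, 2] < scores[i, 2]))
          return np.any(at_least_as_good & strictly_better)

      pareto = [w for i, w in enumerate(candidates) if not dominated(i)]
      print(f"{len(pareto)} non-dominated portfolios out of {len(candidates)}")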
  3. By: Jozef Baruník; Matěj Nevrla
    Abstract: We examine how extreme market risks are priced in the cross-section of asset returns at various horizons. Based on the frequency decomposition of the covariance between indicator functions, we define the quantile cross-spectral beta of an asset, capturing tail-specific as well as horizon- or frequency-specific risks. Further, we work with two notions of frequency-specific extreme market risk. First, we define tail market risk, which captures dependence between extremely low market and asset returns. Second, extreme market volatility risk is characterized by dependence between extremely high increments of market volatility and extremely low asset returns. Empirical findings based on datasets with a long enough history (30 Fama-French Industry portfolios and 25 Fama-French portfolios sorted on size and book-to-market) support our intuition. The results suggest that both frequency-specific tail market risk and extreme volatility risk are significantly priced, and that our five-factor model provides an improvement over the specifications considered in the previous literature.
    Date: 2018–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1806.06148&r=rmg
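    Illustration: The paper's quantile cross-spectral (frequency-domain) beta is not reproduced here; the minimal Python sketch below only illustrates, in the time domain and on synthetic data, the building block mentioned in the abstract: dependence between indicator functions of extremely low market and asset returns.
      # A naive time-domain "tail beta" from quantile indicator series.
      import numpy as np

      rng = np.random.default_rng(1)
      T = 2500
      market = 0.01 * rng.standard_t(df=4, size=T)                  # synthetic market returns
      asset = 0.8 * market + 0.008 * rng.standard_t(df=4, size=T)   # synthetic asset returns

      alpha = 0.05                                                  # tail probability
      i_mkt = (market <= np.quantile(market, alpha)).astype(float)  # market-tail indicator
      i_ast = (asset <= np.quantile(asset, alpha)).astype(float)    # asset-tail indicator

      cov_mat = np.cov(i_ast, i_mkt)
      tail_beta = cov_mat[0, 1] / cov_mat[1, 1]                     # regression-style scaling
      print(f"tail beta at the {alpha:.0%} quantile: {tail_beta:.3f}")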
  4. By: Rajbir-Singh Nirwan; Nils Bertschinger
    Abstract: Estimating covariances between financial assets plays an important role in risk management and optimal portfolio allocation. In practice, when the sample size is small compared to the number of variables, i.e. when considering a wide universe of assets over just a few years, this poses considerable challenges and the empirical estimate is known to be very unstable. Here, we propose a novel covariance estimator based on the Gaussian Process Latent Variable Model (GP-LVM). Our estimator can be considered as a non-linear extension of standard factor models with readily interpretable parameters reminiscent of market betas. Furthermore, our Bayesian treatment naturally shrinks the sample covariance matrix towards a more structured matrix given by the prior and thereby systematically reduces estimation errors.
    Date: 2018–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1806.03294&r=rmg
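    Illustration: The paper's GP-LVM estimator is not reproduced here. As a hedged illustration of the same problem, the Python sketch below contrasts the raw sample covariance with Ledoit-Wolf shrinkage toward a structured target, a simpler, standard estimator that likewise stabilises the estimate when there are fewer observations than assets; the synthetic covariance used is hypothetical.
      # Sample covariance vs. Ledoit-Wolf shrinkage when n_obs < n_assets.
      import numpy as np
      from sklearn.covariance import LedoitWolf

      rng = np.random.default_rng(2)
      n_obs, n_assets = 60, 100
      true_cov = 0.0004 * (0.2 * np.ones((n_assets, n_assets)) + 0.8 * np.eye(n_assets))
      returns = rng.multivariate_normal(np.zeros(n_assets), true_cov, size=n_obs)

      sample_cov = np.cov(returns, rowvar=False)   # rank-deficient, very noisy
      lw = LedoitWolf().fit(returns)               # shrunk, well-conditioned
      print("sample cov error :", np.linalg.norm(sample_cov - true_cov))
      print("shrunk cov error :", np.linalg.norm(lw.covariance_ - true_cov))
      print("shrinkage weight :", round(lw.shrinkage_, 3))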
  5. By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique, Labex ReFi - UP1 - Université Panthéon-Sorbonne); Bertrand Hassani (Grupo Santander, CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique, Labex ReFi - UP1 - Université Panthéon-Sorbonne); Kehan Li (CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique, Labex ReFi - UP1 - Université Panthéon-Sorbonne)
    Abstract: Unimodal probability distributions have been widely used for Value-at-Risk (VaR) computation by investors, risk managers and regulators. However, financial data may be characterized by distributions with more than one mode, and using a unimodal distribution may then bias the risk measure computation. In this paper, we discuss the influence of using multimodal distributions on VaR and Expected Shortfall (ES) calculations. Two multimodal distribution families are considered: Cobb's family and the distortion family. We provide two ways to compute the VaR and the ES: an adapted rejection sampling technique for Cobb's family and an inversion approach for the distortion family. For the empirical study, two data sets are considered: a daily data set concerning operational risk and a three-month scenario of market portfolio returns built from five-minute intraday data. Across a complete spectrum of confidence levels from 0.001 to 0.999, we analyze the VaR and the ES to assess the benefit of using multimodal rather than unimodal distributions.
    Keywords: Risks, Multimodal distributions, Value-at-Risk, Expected Shortfall, Moments method, Adapted rejection sampling, Regulation
    Date: 2017–03
    URL: http://d.repec.org/n?u=RePEc:hal:journl:halshs-01491990&r=rmg
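    Illustration: Cobb's family and the distortion family used in the paper are not reproduced here; the minimal Python sketch below only illustrates the abstract's point on synthetic data, comparing Monte Carlo VaR and ES for a bimodal Gaussian mixture of losses with those of a unimodal Gaussian matched to the same mean and standard deviation.
      # VaR and ES for a bimodal loss distribution vs. a unimodal benchmark.
      import numpy as np

      rng = np.random.default_rng(3)
      n = 100_000
      in_main_mode = rng.random(n) < 0.8
      losses = np.where(in_main_mode,
                        rng.normal(1.0, 0.5, n),    # frequent moderate losses
                        rng.normal(5.0, 1.0, n))    # rarer large-loss mode

      def var_es(x, level):
          var = np.quantile(x, level)
          return var, x[x >= var].mean()            # ES = mean loss beyond VaR

      gauss = rng.normal(losses.mean(), losses.std(), n)   # unimodal benchmark
      for level in (0.95, 0.99, 0.999):
          var_m, es_m = var_es(losses, level)
          var_g, es_g = var_es(gauss, level)
          print(f"{level:.3f}  mixture VaR={var_m:5.2f} ES={es_m:5.2f} | "
                f"gaussian VaR={var_g:5.2f} ES={es_g:5.2f}")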
  6. By: Gareth Peters (Department of Statistical Sciences - UCL - University College of London [London]); Pavel Shevchenko (CSIRO - Commonwealth Scientific and Industrial Research Organisation [Canberra]); Bertrand Hassani (CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique); Ariane Chapelle (Department of Computer Science - UCL - University College of London [London])
    Abstract: Recently, the Basel Committee on Banking Supervision proposed to replace all approaches to operational risk capital, including the Advanced Measurement Approach (AMA), with a simple formula referred to as the Standardised Measurement Approach (SMA). This paper discusses and studies the weaknesses and pitfalls of the SMA, such as instability, risk insensitivity, super-additivity and the implicit relationship between the SMA capital model and systemic risk in the banking sector. We also discuss issues with the closely related operational risk Capital-at-Risk (OpCar) model proposed by the Basel Committee, which is the precursor to the SMA. In conclusion, we advocate maintaining the AMA internal model framework and suggest, as an alternative, a number of standardisation recommendations that could be considered to unify internal modelling of operational risk. The findings and views presented in this paper have been discussed with and supported by many OpRisk practitioners and academics in Australia, Europe, the UK and the USA, and recently at the OpRisk Europe 2016 conference in London.
    Keywords: Basel Committee for Banking Supervision regulations, loss distribution approach, advanced measurement approach, operational risk, standardised measurement approach
    Date: 2016–07
    URL: http://d.repec.org/n?u=RePEc:hal:journl:halshs-01391091&r=rmg
  7. By: Juan Francisco Martínez; José Miguel Matus; Daniel Oda
    Abstract: The measurement of financial fragility is a key element, but still an ongoing task, for monetary and financial authorities and for international financial institutions. This is especially relevant when applying financial policies that are contingent on the behavior of a particular economy or that try to anticipate disruptive events. However, several dimensions complicate the precise definition of financial fragility and the identification of these periods; examples include the distinction between causes, symptoms, effects and policy management measures. The current literature points to a few key elements that have a broad impact on the financial system. In particular, it highlights the role of materialized credit risk, profits and credit activity of banks as signs of instability. In this paper, we combine these elements to identify and delimit historical financial fragility periods for the Chilean economy. In doing so, we build a novel monthly database that includes the 1980s local banking crisis period.
    Date: 2018–06
    URL: http://d.repec.org/n?u=RePEc:chb:bcchwp:822&r=rmg
  8. By: Tim Xiao (University of Toronto)
    Abstract: This paper presents a new model for pricing financial derivatives subject to collateralization. It allows for collateral arrangements adhering to bankruptcy laws. As such, the model can back out the market price of a collateralized contract. This framework is very useful for valuing outstanding derivatives. Using a unique dataset, we find empirical evidence that credit risk alone is not overly important in determining credit-related spreads. Only accounting for both collateral posting and credit risk can sufficiently explain unsecured credit costs. This finding suggests that failure to properly account for collateralization may result in significant mispricing of derivatives. We also empirically gauge the impact of collateral agreements on risk measurements. Our findings indicate that there are important interactions between market and credit risk.
    Keywords: collateralization, asset pricing, plumbing of financial system, swap premium spread, CVA
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:hal:journl:hal-01800559&r=rmg
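    Illustration: The paper's pricing model is not reproduced here; the minimal Python sketch below only illustrates, with flat hypothetical rates and a single cashflow, the wedge the abstract discusses between a fully collateralised claim (discounted at the collateral rate) and an uncollateralised claim bearing a credit-risk adjustment.
      # Flat-rate, single-cashflow illustration of the collateralisation wedge.
      import math

      payoff, t = 1_000_000, 5.0           # hypothetical cashflow in five years
      collateral_rate = 0.02               # e.g. an OIS-like collateral rate
      credit_spread = 0.012                # hypothetical unsecured credit spread

      collateralised = payoff * math.exp(-collateral_rate * t)
      uncollateralised = payoff * math.exp(-(collateral_rate + credit_spread) * t)
      print("collateralised value  :", round(collateralised, 2))
      print("uncollateralised value:", round(uncollateralised, 2))
      print("credit-spread discount:", round(collateralised - uncollateralised, 2))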
  9. By: Ilomäki, J.; Laurila, H.; McAleer, M.J.
    Abstract: Consider using the simple moving average (MA) rule of Gartley (1935) to determine when to buy stocks, and when to sell them and switch to the risk-free rate. In comparison, how might performance be affected if the frequency of the data used in the MA calculations is changed? The empirical results show that, on average, the lower the frequency, the higher the average daily returns, even though volatility is virtually unchanged at lower frequencies. Volatility, from the highest to the lowest frequency, is about 30% lower than the buy-and-hold volatility, while average returns approach the buy-and-hold returns as the frequency is lowered. The 30% reduction in volatility appears if we invest randomly half the time in the stock market and half in the risk-free rate.
    Keywords: Market timing, Moving averages, Risk-free rate, Returns and volatility
    JEL: G32 C58 C22 C41 D23
    Date: 2018–05–01
    URL: http://d.repec.org/n?u=RePEc:ems:eureir:107290&r=rmg
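    Illustration: The paper's data and frequency comparisons are not reproduced here; the minimal Python sketch below only implements the timing rule described in the abstract on synthetic daily prices: hold the risky asset when its price is above a moving average of its own history, otherwise earn the risk-free rate.
      # Moving-average timing rule vs. buy-and-hold on synthetic prices.
      import numpy as np

      rng = np.random.default_rng(4)
      n_days, rf_daily = 2500, 0.02 / 252
      rets = rng.normal(0.0003, 0.01, n_days)           # synthetic daily stock returns
      prices = 100 * np.cumprod(1 + rets)

      window = 200                                      # MA length in days
      ma = np.convolve(prices, np.ones(window) / window, mode="valid")
      signal = prices[window - 1:-1] > ma[:-1]          # yesterday's price vs yesterday's MA

      strategy = np.where(signal, rets[window:], rf_daily)
      buy_hold = rets[window:]
      print("MA rule:    mean", strategy.mean(), "vol", strategy.std())
      print("buy & hold: mean", buy_hold.mean(), "vol", buy_hold.std())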
  10. By: Mou, W.M.; Wong, W.-K.; McAleer, M.J.
    Abstract: Supply chain finance has broken through traditional credit modes and advanced rapidly as a creative financial business. Core enterprises have played a crucial role in the credit enhancement of supply chain finance. Through an analysis of core enterprise credit risks in supply chain finance, by means of the Fuzzy Analytic Hierarchy Process (FAHP), the paper constructs a supply chain financial credit risk evaluation system, leading to a quantitative measurement and evaluation of core enterprise credit risk. This novel approach should assist enterprises in taking appropriate measures to control credit risk, thereby promoting the healthy development of supply chain finance.
    Keywords: Supply chain finance, Core enterprises, Credit risk, Fuzzy analytic hierarchy process (FAHP)
    JEL: P42 H81 D81 G32
    Date: 2018–06–01
    URL: http://d.repec.org/n?u=RePEc:ems:eureir:109056&r=rmg
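    Illustration: The paper's fuzzy extension and its actual credit-risk criteria are not reproduced here; the minimal Python sketch below only shows the weighting step that underlies (F)AHP, deriving criterion weights from a hypothetical pairwise comparison matrix via its principal eigenvector, together with the usual consistency check.
      # Classical (crisp) AHP weights from a pairwise comparison matrix.
      import numpy as np

      # Hypothetical comparisons of three credit-risk criteria on the Saaty 1-9 scale.
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)                    # principal eigenvalue
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()                       # normalised criterion weights

      ci = (eigvals.real[k] - len(A)) / (len(A) - 1) # consistency index
      print("weights:", np.round(weights, 3), " CR:", round(ci / 0.58, 3))  # RI(3) = 0.58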
  11. By: Rafal Rak; Dariusz Grech
    Abstract: We analyze quantitatively the effect of spurious multifractality induced by the presence of fat-tailed symmetric and asymmetric probability distributions of fluctuations in time series. In the presented approach, different kinds of symmetric and asymmetric broad probability distributions of synthetic data are examined, starting from the Lévy regime up to distributions with finite variance. We use nonextensive Tsallis statistics to construct all of the considered data, in order to have a good analytical description of the frequencies of fluctuations over the whole range of their magnitude and, simultaneously, full control over the exponent of the power-law decay of the tails of the probability distribution. Compact semi-analytical formulas are then provided expressing the level of spurious multifractality generated by the presence of fat tails in terms of the Tsallis parameter $\tilde{q}$ and the scaling exponent $\beta$ of the asymptotic decay of the cumulative distribution function (CDF). The results are presented in the Hurst and Hölder languages more often used in the study of multifractal phenomena. Based on the provided semi-analytical relations, we argue how one can make, for any real data, a clear quantitative distinction between true multifractality caused by the presence of nonlinear correlations, spurious multifractality generated by the fat-tailed shape of the distributions (and possibly their asymmetry), and the correction due to linear autocorrelations in analyzed time series of finite length. In particular, the spurious multifractal effect of fat tails is found to be fundamental for the proper quantitative estimation of all spurious multifractal effects. Examples from stock market data are presented to support these findings.
    Date: 2018–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1805.11909&r=rmg
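    Illustration: The paper's Tsallis q-Gaussian construction and full multifractal analysis are not reproduced here; the minimal Python sketch below only illustrates the underlying effect with a Student-t stand-in and a simple structure-function estimator: for i.i.d. fat-tailed increments with no nonlinear correlations, the estimated generalised Hurst exponent H(q) still varies with q.
      # Apparent multifractality of i.i.d. fat-tailed increments, via the
      # structure functions S_q(s) = E|x(t+s) - x(t)|^q ~ s^(q*H(q)).
      import numpy as np

      rng = np.random.default_rng(5)
      x = np.cumsum(rng.standard_t(df=3, size=200_000))   # fat-tailed i.i.d. increments

      scales = np.unique(np.logspace(0.5, 3, 15).astype(int))
      for q in (1, 2, 4, 6):
          sq = [np.mean(np.abs(x[s:] - x[:-s]) ** q) for s in scales]
          slope = np.polyfit(np.log(scales), np.log(sq), 1)[0]
          print(f"q={q}:  H(q) ~ {slope / q:.3f}")        # varies with q despite i.i.d. increments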
  12. By: Lijun Bo; Agostino Capponi
    Abstract: We study an optimal investment/consumption problem in a model capturing market and credit risk dependencies. Stochastic factors drive both the default intensity and the volatility of the stocks in the portfolio. We use the martingale approach and analyze the recursive system of nonlinear Hamilton-Jacobi-Bellman equations associated with the dual problem. We transform such a system into an equivalent system of semi-linear PDEs, for which we establish existence and uniqueness of a bounded global classical solution. We obtain explicit representations for the optimal strategy, consumption path and wealth process, in terms of the solution to the recursive system of semi-linear PDEs. We numerically analyze the sensitivity of the optimal investment strategies to risk aversion, default risk and volatility.
    Date: 2018–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1806.07175&r=rmg
  13. By: Aurélien Violon; Dominique Durant; Oana Toader
    Abstract: Most research papers dealing with the systemic footprint of the banking system either investigate the definition and measurement of systemic risk, or try to identify systemic banks and calibrate systemic risk buffers. To the best of our knowledge, this paper is among the first to provide empirical evidence on how the recent international regulation designed for global systemically important banks (GSIBs) drove changes in these institutions’ activity. Our data consist of cross-section observations for 97 large international banks from 22 countries from 2005 to 2016 (12 years). Our econometric approach quantifies the impact of the FSB designation on GSIBs’ activity, controlling both for structural differences between GSIBs and non-GSIBs and for structural evolutions of the banking system over time (industry trends). We find that GSIBs curbed the expansion of their total balance sheets after the FSB designation, which resulted in an additional improvement in their leverage ratios. In turn, sizeable downward pressure is observed on their return on equity (ROE). However, no adverse consequences are observed for risk-taking or the issuance of loans to the economy. Finally, while the relative deleveraging experienced by GSIBs illustrates a mean-reverting process, tending to close the structural gap between GSIBs and non-GSIBs, this is not the case for the cost of their deposits, which remains lower than that of other banks, suggesting that the GSIB framework has not so far put an end to the “too-big-to-fail” distortions.
    Keywords: GSIBs, business model, profitability, leverage, RWA.
    JEL: G01 G21 G28 G32
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:bfr:decfin:33&r=rmg
  14. By: Tahir Choulli; Catherine Daveloose; Michèle Vanmaele
    Abstract: This paper addresses the risk-minimization problem, with and without mortality securitization, à la Föllmer-Sondermann, for a large class of equity-linked mortality contracts when no model for the death time is specified. This framework includes the situation where the correlation between the market model and the time of death is arbitrarily general, and hence leads to a market model with two levels of information: the public information generated by the financial assets, and a larger flow of information that contains additional knowledge about the death time of an insured. By enlarging the filtration, the death uncertainty and its entailed risk are fully considered without any mathematical restriction. Our key tool is an optional martingale representation which states that any martingale in the large filtration, stopped at the death time, can be decomposed into precise orthogonal local martingales. This allows us to derive the dynamics of the value processes of the mortality/longevity securities used for the securitization, and to decompose any mortality/longevity liability into a sum of orthogonal risks by means of a risk basis. The first main contribution of this paper resides in quantifying, as explicitly as possible, the effect of mortality uncertainty on the risk-minimizing strategy, by determining the optimal strategy in the enlarged filtration in terms of strategies in the smaller filtration. Our second main contribution consists of finding risk-minimizing strategies with insurance securitization, by investing in stocks and one (or more) mortality/longevity derivatives such as longevity bonds. This generalizes the existing literature on risk-minimization using mortality securitization in many directions.
    Date: 2018–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1805.11844&r=rmg

This nep-rmg issue is ©2018 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.