nep-rmg New Economics Papers
on Risk Management
Issue of 2018‒03‒19
fifteen papers chosen by
Stan Miles
Thompson Rivers University

  1. Continuous partition-of-unity copulas and their application to risk management By Dietmar Pfeifer; Andreas Mändle; Olena Ragulina; Côme Girschig
  2. Credit Risk Meets Random Matrices: Coping with Non-Stationary Asset Correlations By Andreas Mühlbacher; Thomas Guhr
  3. Multilevel nested simulation for efficient risk estimation By Michael B. Giles; Abdul-Lateef Haji-Ali
  4. Multivariate dependence and portfolio optimization algorithms under illiquid market scenarios By Al Janabi, Mazin A.M.; Arreola Hernandez, Jose; Berger, Theo; Nguyen, Duc Khuong
  5. The MacroFinancial Risk Assessment Framework (MFRAF), Version 2.0 By Jose Fique
  6. Credit Risk Analysis using Machine and Deep Learning models By Peter Addo; Dominique Guegan; Bertrand Hassani
  7. Strong Boards and Risk-taking in Islamic Banks By Sabur Mollah; Michael Skully; Eva Liljeblom
  8. Leverage Ratio as a Macroprudential Policy Instrument By Nijolė Valinskytė; Erika Ivanauskaitė; Darius Kulikauskas; Simonas Krėpšta
  9. Insuring entrepreneurial downside risk By Sumudu Kankanamge; Alexandre Gaillard
  10. A Unified Modeling Framework for Life and Non-Life Insurance By Francesca Biagini; Yinglin Zhang
  11. How do manager incentives influence corporate hedging? By Bihary, Zsolt; Dömötör, Barbara
  12. On the Relationship Between Cognitive Ability and Risk Preference By Dohmen, Thomas; Falk, Armin; Huffman, David; Sunde, Uwe
  13. Coordination of circuit breakers? Volume migration and volatility spillover in fragmented markets By Clapham, Benjamin; Gomber, Peter; Panz, Sven
  14. An Operational (Preasymptotic) Measure of Fat-tailedness By Nassim Nicholas Taleb
  15. Financial Frictions, Volatility, and Skewness By David Zeke

  1. By: Dietmar Pfeifer; Andreas Mändle; Olena Ragulina; Côme Girschig
    Abstract: In this paper we discuss a natural extension of infinite discrete partition-of-unity copulas, which were recently introduced in the literature, to continuous partition-of-unity copulas, with possible applications in risk management and other fields. We present a simple general algorithm to generate such copulas on the basis of the empirical copula from high-dimensional data sets. In particular, our constructions also allow for positive tail dependence, which is sometimes a desirable property of data-driven copula modelling, in particular for internal models under Solvency II. (See the illustrative sketch after this entry.)
    Date: 2018–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1803.00957&r=rmg
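    A minimal sketch of the empirical copula that serves as the data-driven starting point mentioned in the abstract above; the partition-of-unity construction itself is not reproduced, and all names and parameters below are illustrative.
      # Empirical copula via normalized component-wise ranks (pseudo-observations).
      import numpy as np

      def empirical_copula_sample(data):
          """Map an (n, d) data matrix to pseudo-observations in (0, 1)^d."""
          n, d = data.shape
          ranks = np.empty_like(data, dtype=float)
          for j in range(d):
              ranks[:, j] = np.argsort(np.argsort(data[:, j])) + 1
          return ranks / (n + 1.0)

      rng = np.random.default_rng(0)
      x = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], size=500)
      u = empirical_copula_sample(x)   # points on the unit square with uniform margins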
  2. By: Andreas Mühlbacher; Thomas Guhr
    Abstract: We review recent progress in modeling credit risk for correlated assets. We start from the Merton model, in which default events and losses are derived from the asset values at maturity. To estimate the time development of the asset values, stock prices are used; their correlations have a strong impact on the loss distribution, particularly on its tails. These correlations are non-stationary, which also influences the tails. We account for the asset fluctuations by averaging over an ensemble of random matrices that models the truly existing set of measured correlation matrices. As a most welcome side effect, this approach drastically reduces the parameter dependence of the loss distribution, allowing us to obtain very explicit results which show quantitatively that the heavy tails prevail over diversification benefits even for small correlations. We calibrate our random matrix model with market data and show how it is capable of grasping different market situations. Furthermore, we present numerical simulations for concurrent portfolio risks, i.e., for the joint probability densities of losses for two portfolios. For the convenience of the reader, we give an introduction to the Wishart random matrix model. (See the illustrative sketch after this entry.)
    Date: 2018–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1803.00261&r=rmg
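    A minimal sketch of the ensemble-averaging idea in the abstract above: a Merton-type portfolio loss is averaged over Wishart-type random correlation matrices that fluctuate around an average correlation level. The setup, default threshold and parameters are illustrative, not the authors' calibration.
      # Average a simple structural-default loss over random correlation matrices.
      import numpy as np

      rng = np.random.default_rng(1)
      K, N, c = 50, 200, 0.3              # obligors, Wishart parameter, mean correlation
      C_avg = (1 - c) * np.eye(K) + c     # equicorrelation matrix as the ensemble average

      def random_correlation(C, N):
          """Wishart-type random correlation matrix fluctuating around C."""
          A = np.linalg.cholesky(C) @ rng.standard_normal((K, N)) / np.sqrt(N)
          W = A @ A.T
          d = np.sqrt(np.diag(W))
          return W / np.outer(d, d)

      losses = []
      for _ in range(2000):
          Cr = random_correlation(C_avg, N)
          V = np.linalg.cholesky(Cr) @ rng.standard_normal(K)   # asset values at maturity
          losses.append(np.mean(V < -2.0))                      # default if value below threshold
      print("P(portfolio loss fraction > 10%):", np.mean(np.array(losses) > 0.10))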
  3. By: Michael B. Giles; Abdul-Lateef Haji-Ali
    Abstract: We investigate the problem of computing a nested expectation of the form $\mathbb{P}[\mathbb{E}[X|Y] \geq 0] = \mathbb{E}[\textrm{H}(\mathbb{E}[X|Y])]$, where $\textrm{H}$ is the Heaviside function. This nested expectation appears, for example, when estimating the probability of a large loss from a financial portfolio. We present a method that combines the idea of using Multilevel Monte Carlo (MLMC) for nested expectations with the idea of adaptively selecting the number of samples in the approximation of the inner expectation, as proposed by Broadie et al. (2011). We propose and analyse an algorithm that adaptively selects the number of inner samples on each MLMC level and prove that the resulting MLMC method with adaptive sampling has an $\mathcal{O}\left( \varepsilon^{-2}|\log\varepsilon|^2 \right)$ complexity to achieve a root mean-squared error $\varepsilon$. The theoretical analysis is verified by numerical experiments on a simple model problem. We also present a stochastic root-finding algorithm that, combined with our adaptive methods, can be used to compute other risk measures such as Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR), with the latter being achieved with $\mathcal{O}\left(\varepsilon^{-2}\right)$ complexity. (See the illustrative sketch after this entry.)
    Date: 2018–02
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1802.05016&r=rmg
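    The abstract above concerns the nested expectation $\mathbb{P}[\mathbb{E}[X|Y] \geq 0]$; a single-level nested Monte Carlo sketch on a toy model is given below. The MLMC construction with adaptive inner sampling, which is the paper's contribution, is not reproduced.
      # Single-level nested Monte Carlo for P[E[X|Y] >= 0] on a toy model.
      import numpy as np

      rng = np.random.default_rng(2)
      M_outer, N_inner = 10_000, 100

      y = rng.standard_normal(M_outer)                   # outer scenarios Y
      # Toy model: X | Y ~ Normal(Y - 1, 2^2), so E[X | Y] = Y - 1
      x = (y[:, None] - 1.0) + 2.0 * rng.standard_normal((M_outer, N_inner))
      inner_mean = x.mean(axis=1)                        # inner estimate of E[X | Y]
      estimate = np.mean(inner_mean >= 0.0)              # Heaviside applied to the inner estimate
      print("estimate:", estimate, "(exact value for this toy model: P[Y >= 1] ~ 0.1587)")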
  4. By: Al Janabi, Mazin A.M.; Arreola Hernandez, Jose; Berger, Theo; Nguyen, Duc Khuong
    Abstract: We propose a model for optimizing structured portfolios with liquidity-adjusted Value-at-Risk (LVaR) constraints, whereby linear correlations between assets are replaced by a multivariate nonlinear dependence structure based on Dynamic Conditional Correlation (DCC) t-copula modeling. Our portfolio optimization algorithm minimizes the LVaR function under adverse market circumstances and multiple operational and financial constraints. When we consider a diversified portfolio of international stock and commodity market indices under multiple realistic portfolio optimization scenarios, the obtained results consistently show the superiority of our approach relative to competing portfolio strategies, including the minimum-variance, risk-parity and equally weighted allocations. (See the illustrative sketch after this entry.)
    Keywords: Dynamic copulas, LVaR, dependence structure, portfolio optimization algorithm
    JEL: C5 G11 G17
    Date: 2016–06
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:84626&r=rmg
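    A minimal sketch of the dependence-modeling step described above: simulate portfolio returns under a (static) Student-t copula and compute a liquidity-adjusted VaR. The paper uses a dynamic (DCC) t-copula and the authors' LVaR adjustment; the static copula, the marginals and the square-root-of-horizon scaling below are illustrative stand-ins.
      # Static Student-t copula simulation and a crude liquidity-adjusted VaR.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      corr = np.array([[1.0, 0.5], [0.5, 1.0]])
      nu, n_sims, weights, horizon = 5, 100_000, np.array([0.6, 0.4]), 5

      # t-copula sampling: correlated normals divided by a chi-square mixing variable
      z = rng.multivariate_normal(np.zeros(2), corr, size=n_sims)
      s = rng.chisquare(nu, size=n_sims) / nu
      u = stats.t.cdf(z / np.sqrt(s)[:, None], df=nu)     # copula observations in (0, 1)^2

      returns = stats.t.ppf(u, df=nu) * 0.01              # illustrative 1%-scale t marginals
      port = returns @ weights
      var_99 = -np.quantile(port, 0.01)
      lvar_99 = var_99 * np.sqrt(horizon)                 # stand-in liquidity adjustment
      print(f"1-day VaR(99%) = {var_99:.4f}, liquidity-adjusted = {lvar_99:.4f}")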
  5. By: Jose Fique
    Abstract: This report provides a detailed technical description of the updated MacroFinancial Risk Assessment Framework (MFRAF), which replaces the version described in Gauthier, Souissi and Liu (2014) as the Bank of Canada’s stress-testing model for banks with a focus on domestic systemically important banks (D-SIBs). This new version incorporates the characteristics of the previous model and also includes fire-sale effects resulting from the regulatory leverage constraints faced by banks, as well as an enhanced treatment of feedback-loop effects between solvency and liquidity risks through both the pricing and costly asset-liquidation channels. These new features improve the model’s ability to capture the non-linear effects of risk scenarios on D-SIBs’ capital positions and shed light on the importance of additional channels of stress propagation. The model is also subject to a comprehensive sensitivity analysis.
    Keywords: Financial stability, Financial system regulation and policies
    JEL: G01 G21 G28 C72 E58
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:bca:bocatr:111&r=rmg
  6. By: Peter Addo (Lead Data Scientist - SNCF Mobilité); Dominique Guegan (UP1 - Université Panthéon-Sorbonne, Labex ReFi - UP1 - Université Panthéon-Sorbonne, University of Ca’ Foscari [Venice, Italy], CES - Centre d'économie de la Sorbonne - CNRS - Centre National de la Recherche Scientifique - UP1 - Université Panthéon-Sorbonne, IPAG - IPAG Business School - Ipag); Bertrand Hassani (Labex ReFi - UP1 - Université Panthéon-Sorbonne, Capgemini Consulting [Paris])
    Abstract: Owing to the advances in technology associated with Big Data, data availability and computing power, most banks and lending financial institutions are renewing their business models. Credit risk prediction, monitoring, model reliability and effective loan processing are key to decision making and transparency. In this work, we build binary classifiers based on machine and deep learning models on real data to predict loan default probability. The top 10 important features from these models are selected and then used in the modelling process to test the stability of the binary classifiers by comparing their performance on separate data. We observe that tree-based models are more stable than models based on multilayer artificial neural networks. This raises several questions about the intensive use of deep learning systems in enterprises. (See the illustrative sketch after this entry.)
    Keywords: Credit risk,Financial regulation,Data Science,Bigdata,Deep learning
    Date: 2018–02
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-01719983&r=rmg
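    A minimal sketch of the comparison described above, on synthetic data: fit a tree-based model, select its top-10 features, and compare a tree-based classifier with a multilayer neural network on those features using held-out data. The dataset, features and hyperparameters are placeholders, not the authors'.
      # Tree-based vs. neural-network default classifiers on top-10 features (synthetic data).
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.ensemble import GradientBoostingClassifier
      from sklearn.neural_network import MLPClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import roc_auc_score

      X, y = make_classification(n_samples=5000, n_features=30, n_informative=10,
                                 weights=[0.9, 0.1], random_state=0)     # imbalanced "defaults"
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      gbt = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
      top10 = np.argsort(gbt.feature_importances_)[-10:]                  # top-10 features

      models = [("tree-based", GradientBoostingClassifier(random_state=0)),
                ("neural net", MLPClassifier(hidden_layer_sizes=(64, 32),
                                             max_iter=500, random_state=0))]
      for name, model in models:
          model.fit(X_tr[:, top10], y_tr)
          auc = roc_auc_score(y_te, model.predict_proba(X_te[:, top10])[:, 1])
          print(f"{name}: held-out AUC = {auc:.3f}")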
  7. By: Sabur Mollah (School of Management, Swansea University); Michael Skully (Monash University); Eva Liljeblom (Hanken School of Economics)
    Abstract: This paper examines whether variations in strong boards explain the differences between risk-taking in Islamic and conventional banks. From an analysis of a pooled sample of Islamic and conventional banks, we find that strong boards in general serve their shareholders through engaging in higher risk-taking activities across both types of banks. In Islamic banks, however, the Shari'ah Supervisory Board (SSB) is found to mitigate risk-taking when integrated with a strong board, as religiosity restrains risk-taking.
    Keywords: Strong Board, SSB, Religiosity, Risk-Taking, Islamic Banks, and Conventional Banks.
    JEL: G01 G21 G34
    Date: 2018–02–24
    URL: http://d.repec.org/n?u=RePEc:swn:wpaper:2018-08&r=rmg
  8. By: Nijolė Valinskytė (Bank of Lithuania); Erika Ivanauskaitė (Bank of Lithuania); Darius Kulikauskas (Bank of Lithuania); Simonas Krėpšta (Bank of Lithuania)
    Abstract: This paper aims to explain the relationship between risk-based capital requirements and leverage ratio (LR) requirements, as well as the motivation for the macroprudential use of LR requirements. The rest of the paper is structured as follows. First, we define the LR and the microprudential requirement that is based on it (Chapter 1) and discuss the merits and drawbacks of risk-weighted and non-risk-weighted capital requirements, assessing how LR requirements can improve the current capital regulation framework (Chapter 2). Then, we turn to the stylized quantitative relationship between the two kinds of requirements and illustrate the rationale for macroprudential LR add-ons (Chapter 3). Further on, we consider legal issues, with a focus on the EU (Chapter 4), and review country experiences with LR requirements (Chapter 5). Finally, we take a look at the LR situation in the Lithuanian banking sector (Chapter 6) and conclude. (A small numerical illustration of the stylized relationship follows this entry.)
    Date: 2018–03–07
    URL: http://d.repec.org/n?u=RePEc:lie:opaper:18&r=rmg
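    A small numerical illustration of the stylized relationship discussed above (all numbers hypothetical): the leverage ratio requirement binds before the risk-weighted requirement exactly when the bank's average risk weight falls below the ratio of the two requirements.
      # When does the leverage ratio (LR) bind before the risk-weighted requirement?
      tier1 = 5.0                    # Tier 1 capital
      exposure = 120.0               # total (non-risk-weighted) exposure measure
      rwa = 55.0                     # risk-weighted assets

      lr = tier1 / exposure          # leverage ratio
      rw_ratio = tier1 / rwa         # risk-weighted capital ratio
      lr_req, rw_req = 0.03, 0.085   # hypothetical 3% LR vs. 8.5% risk-weighted requirement

      avg_risk_weight = rwa / exposure
      critical_rw = lr_req / rw_req  # below ~35%, the LR demands more capital than the RW ratio
      print(f"LR = {lr:.2%}, risk-weighted ratio = {rw_ratio:.2%}")
      print(f"average risk weight = {avg_risk_weight:.2%}; LR binds first: {avg_risk_weight < critical_rw}")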
  9. By: Sumudu Kankanamge (Toulouse School of Economics); Alexandre Gaillard (Toulouse School of Economics)
    Abstract: This paper examines the effects of entrepreneurial downside risk insurance on the level and composition of the entrepreneurial pool and, more broadly, on unemployment, production and welfare. We build a rich theoretical framework combining occupational choice, heterogeneous agents and incomplete markets to address our main policy concerns. Using CPS, SCF and SBO data, we match our economy to fundamental empirical facts on unemployment, entrepreneurship and mobility, and contribute to the analysis of transitions between occupations with respect to individual ability, for instance by matching the U-shaped curve of the transition from worker to self-employed and the hump-shaped curve of the reverse transition. Depending on the downside risk insurance policy considered, we find that this insurance can have a significant impact not only on the level of entrepreneurship but also on firm size in the entrepreneurial pool and on production, although the impact on unemployment is modest.
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:red:sed017:1406&r=rmg
  10. By: Francesca Biagini; Yinglin Zhang
    Abstract: In this paper we propose for the first time a unified framework suitable for modeling both the life and non-life insurance markets, with nontrivial dependence on the financial market. We introduce a direct modeling approach, which generalizes the reduced-form framework for credit risk and life insurance. We apply these results to pricing insurance products in hybrid markets, taking into account the role of inflation under the benchmark approach. This framework offers at the same time a general and flexible structure, as well as explicit and tractable pricing formulas.
    Date: 2018–02
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1802.07741&r=rmg
  11. By: Bihary, Zsolt; Dömötör, Barbara
    Abstract: We explain the diversity of corporate hedging behavior in a single model. The hedging ratio is obtained by maximizing an expected utility that combines the corporate-level utility with a component that models the incentives of the financial manager. We derive a theoretical model that recovers the classic result of the literature if the financial manager has no incentive other than to maximize corporate utility. If, instead, the financial manager expects to be evaluated exclusively on the financial profit (the profit of the hedging transactions), then, being risk averse, he decides not to hedge at all. The hedging ratio depends on the weight of these contradictory effects. We test our theoretical results on Hungarian corporate survey data. (See the illustrative sketch after this entry.)
    Keywords: corporate hedging, corporate utility, manager incentives
    JEL: F13 G32 G34
    Date: 2018–02–26
    URL: http://d.repec.org/n?u=RePEc:cvh:coecwp:2018/01&r=rmg
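    A stylized numerical sketch of the trade-off described above: the hedge ratio maximizes a weighted combination of a mean-variance corporate utility over total profit and a mean-variance manager component evaluated only on the hedging transactions' profit. The functional forms, weights and parameters are illustrative, not the paper's model.
      # Optimal hedge ratio under a weighted corporate/manager utility (illustrative).
      import numpy as np

      rng = np.random.default_rng(4)
      price = rng.normal(100.0, 15.0, size=100_000)     # risky exposure at maturity
      forward = 100.0                                   # forward price (zero expected hedge profit)
      risk_aversion = 0.01

      def mean_var_utility(profit):
          return profit.mean() - 0.5 * risk_aversion * profit.var()

      def combined_utility(h, w_manager):
          corporate = price + h * (forward - price)     # hedged corporate profit
          hedge_only = h * (forward - price)            # profit of the hedging transactions alone
          return (1 - w_manager) * mean_var_utility(corporate) + w_manager * mean_var_utility(hedge_only)

      grid = np.linspace(0.0, 1.0, 101)
      for w in (0.0, 0.5, 0.9):                         # weight on the manager component
          best = grid[np.argmax([combined_utility(h, w) for h in grid])]
          print(f"manager weight {w:.1f}: optimal hedge ratio ~ {best:.2f}")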
  12. By: Dohmen, Thomas (University of Bonn and IZA); Falk, Armin (briq and University of Bonn); Huffman, David (University of Pittsburgh); Sunde, Uwe (LMU)
    Abstract: This paper focuses on the relationship between cognitive ability and decision making under risk and uncertainty. We begin by clarifying some important distinctions between concepts and measurement of risk preference and cognitive ability and then take stock of what is known empirically on the connections between cognitive ability and measured risk preferences.
    Date: 2018–03–05
    URL: http://d.repec.org/n?u=RePEc:rco:dpaper:76&r=rmg
  13. By: Clapham, Benjamin; Gomber, Peter; Panz, Sven
    Abstract: We study circuit breakers in a fragmented, multi-market environment and investigate whether a coordination of circuit breakers is necessary to ensure their effectiveness. In doing so, we analyze 2,337 volatility interruptions on Deutsche Boerse and examine whether volume migration and an accompanying volatility spillover to alternative venues that continue trading can be observed. Contrary to the prevailing theoretical rationale, trading volume on alternative venues significantly decreases during circuit breakers on the main market, and we do not find any evidence of volatility spillover. Moreover, we show that the market share of the main market increases sharply during a circuit breaker. Surprisingly, this effect is amplified with increasing levels of fragmentation. We identify high-frequency trading as a major reason for the vanishing trading activity on the alternative venues and provide empirical evidence that a coordination of circuit breakers is not essential for their effectiveness as long as market participants shift to the dominant venue during market stress.
    Keywords: Circuit Breaker,Volatility Interruption,Market Fragmentation,High-Frequency Trading,Stock Market,Regulation,Liquidity
    JEL: G14 G15 G18 G28
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:zbw:safewp:196&r=rmg
  14. By: Nassim Nicholas Taleb
    Abstract: This note presents an operational measure of fat-tailedness for univariate probability distributions, in $[0,1]$, where 0 is maximally thin-tailed (Gaussian) and 1 is maximally fat-tailed. Among other things, 1) it helps assess the sample size $n$ needed to establish statistical significance, 2) it allows practical comparisons across classes of fat-tailed distributions, and 3) it helps understand some inconsistent attributes of the lognormal, depending on the parametrization of its scale parameter. The literature is rich for what concerns asymptotic behavior, but there is a large void for finite values of $n$, those needed for operational purposes. Conventional measures of fat-tailedness, namely 1) the tail index for the power law class and 2) kurtosis for finite-moment distributions, fail to apply to some distributions and do not allow comparisons across classes and parametrizations, that is, between power laws outside the Levy-stable basin, between power laws and distributions in other classes, or between power laws with different numbers of summands. How can one compare a sum of 100 Student t distributed random variables with 3 degrees of freedom to one in a Levy-stable or a lognormal class? How can one compare a sum of 100 Student t with 3 degrees of freedom to a single Student t with 2 degrees of freedom? We propose an operational and heuristic measure that allows us to compare $n$-summed independent variables under all distributions with finite first moment. The method is based on the rate of convergence of the Law of Large Numbers for finite sums, $n$-summands specifically. We obtain either explicit expressions or simulation results and bounds for the lognormal, exponential, Pareto, and Student t distributions in their various calibrations, in addition to the general Pearson classes. (See the simulation sketch after this entry.)
    Date: 2018–02
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1802.05495&r=rmg
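    A simulation sketch of the convergence-rate idea in the abstract above: measure how the mean absolute deviation of the n-summed variable grows relative to the Gaussian benchmark rate of sqrt(n). The heuristic kappa below illustrates that idea; the paper's exact definition and calibrations may differ.
      # Heuristic kappa in [0, 1]: ~0 for Gaussian-like behavior (sqrt(n) growth of the
      # MAD of the n-sum), approaching 1 as the Law of Large Numbers converges more slowly.
      import numpy as np

      rng = np.random.default_rng(5)

      def mad_of_sum(sampler, n, reps=100_000):
          s = sampler((reps, n)).sum(axis=1)
          return np.abs(s - s.mean()).mean()

      def kappa(sampler, n0=1, n=100):
          return 2.0 - np.log(n / n0) / np.log(mad_of_sum(sampler, n) / mad_of_sum(sampler, n0))

      print("Gaussian       :", round(kappa(rng.standard_normal), 3))                      # ~0
      print("Student t(3)   :", round(kappa(lambda size: rng.standard_t(3, size)), 3))
      print("Lognormal(0, 2):", round(kappa(lambda size: rng.lognormal(0.0, 2.0, size)), 3))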
  15. By: David Zeke (University of Southern California)
    Abstract: A number of recent papers use the interaction of firm idiosyncratic volatility shocks with firm financial frictions to explain business cycle fluctuations. I argue that a key parameter for these models is the cost of default, as it has a quantitatively first-order effect on the magnitude of the decline in employment and other aggregates in response to idiosyncratic volatility shocks. I use firm-level panel data and a structural model of financial frictions and volatility shocks to assess the role of volatility shocks and the cost of default on firm and aggregate employment over the business cycle. I find that when the cost of default is calibrated to the range of estimates from the corporate finance literature, the model reproduces key cross-sectional moments of equity volatility, bond spreads, and employment growth. However, this calibration implies that the aggregate employment losses driven by shocks to firm idiosyncratic volatility are modest. I propose two additional shocks, calibrated using firm-level panel data, that could amplify the decline in employment in the context of such a model. First, the decline in employment is amplified when the increase in firm idiosyncratic risk is modeled not only as a positive second-moment shock but also as a negative third-moment shock. Second, a plausible increase in the cost of default over the business cycle can interact with volatility shocks to dramatically reduce aggregate employment.
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:red:sed017:1421&r=rmg

This nep-rmg issue is ©2018 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.