
NEP: New Economics Papers on Risk Management
Issue of 2019–03–04
nineteen papers chosen by
By:  Sebastian M. Krause; Hrvoje Štefančić; Vinko Zlatić; Guido Caldarelli 
Abstract:  Evaluation of systemic risk in networks of financial institutions generally requires information on inter-institution financial exposures. In the framework of the DebtRank algorithm, we introduce an approximate method of systemic risk evaluation which requires only node properties, such as total assets and liabilities, as inputs. We demonstrate that this approximation captures a large portion of systemic risk measured by DebtRank. Furthermore, using Monte Carlo simulations, we investigate network structures that can amplify systemic risk. Indeed, while no topology is in general a priori more stable if the market is liquid [1], a larger complexity is detrimental for the overall stability [2]. Here we find that the measure of scalar assortativity correlates well with the level of systemic risk. In particular, network structures with high systemic risk are scalar assortative, meaning that risky banks are mostly exposed to other risky banks. Network structures with low systemic risk are scalar disassortative, with interactions of risky banks with stable banks. 
Date:  2019–02 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1902.08483&r=all 
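As background for readers new to DebtRank, a minimal sketch of a DebtRank-style contagion iteration might look as follows. This is a common simplified variant, not the authors' node-level approximation; the exposure matrix, equity vector, and the cap at full distress are illustrative assumptions:

```python
import numpy as np

def debtrank_style_stress(exposure, equity, shocked, rounds=100):
    """Propagate distress through an interbank network, DebtRank-style.

    exposure[i, j]: amount bank i has lent to bank j (at risk if j distresses).
    equity[i]:      capital buffer of bank i.
    shocked:        initial relative equity loss per bank, each in [0, 1].
    Only newly added distress propagates, and distress is capped at 1.
    """
    # Relative equity loss of i per unit distress of counterparty j.
    W = np.minimum(np.asarray(exposure, float)
                   / np.asarray(equity, float)[:, None], 1.0)
    h = np.array(shocked, float)          # cumulative distress levels
    delta = h.copy()                      # distress added in the last round
    for _ in range(rounds):
        h_new = np.minimum(h + W @ delta, 1.0)
        delta, h = h_new - h, h_new
        if delta.max() < 1e-12:           # no new distress: contagion has stopped
            break
    return h
```

Summing the equity-weighted final distress over banks, net of the initial shock, gives a systemic-impact number of the kind the paper's approximate node-level method aims to reproduce without knowing the full exposure matrix.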
By:  John Armstrong; Damiano Brigo 
Abstract:  We show that coherent risk measures are ineffective in curbing the behaviour of investors with limited liability if the market admits statistical arbitrage opportunities, which we term $\rho$-arbitrage for a risk measure $\rho$. We show how to determine analytically whether such portfolios exist in complete markets and in the Markowitz model. We also consider realistic numerical examples of incomplete markets and determine whether expected shortfall constraints are ineffective in these markets. We find that the answer depends heavily upon the probability model selected by the risk manager but that it is certainly possible for expected shortfall constraints to be ineffective in realistic markets. Since value at risk constraints are weaker than expected shortfall constraints, our results can be applied to value at risk. 
Date:  2019–02 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1902.10015&r=all 
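For reference, expected shortfall — the constraint whose effectiveness the paper tests — has a one-line empirical estimator. The sketch below uses a simple sample-quantile convention; treating losses as positive numbers is our assumption:

```python
import numpy as np

def expected_shortfall(pnl, alpha=0.975):
    """Empirical expected shortfall of a P&L sample: the average loss in the
    worst (1 - alpha) fraction of outcomes, at or beyond the value at risk."""
    losses = np.sort(-np.asarray(pnl, float))   # losses as positive numbers
    var = np.quantile(losses, alpha)            # value at risk at level alpha
    return losses[losses >= var].mean()         # mean loss in the tail
```

Because expected shortfall at a given level dominates value at risk at the same level, a portfolio that evades an expected shortfall constraint also evades the corresponding value at risk constraint, which is why the authors' conclusions carry over to value at risk.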
By:  Das, Sanjiv; Mitchener, Kris James; Vossmeyer, Angela 
Abstract:  We employ a unique hand-collected dataset and a novel methodology to examine systemic risk before and after the largest U.S. banking crisis of the 20th century. Our systemic risk measure captures both the credit risk of an individual bank as well as a bank's position in the network. We construct linkages between all U.S. commercial banks in 1929 and 1934 so that we can measure how predisposed the entire network was to risk, where risk was concentrated, and how the failure of more than 9,000 banks during the Great Depression altered risk in the network. We find that the pyramid structure of the commercial banking system (i.e., the network's topology) created more inherent fragility, but systemic risk was nevertheless fairly dispersed throughout banks in 1929, with the top 20 banks contributing roughly 18% of total systemic risk. The massive banking crisis that occurred between 1930 and 1933 raised systemic risk per bank by 33% and increased the riskiness of the very largest banks in the system. We use Bayesian methods to demonstrate that when network measures, such as eigenvector centrality and a bank's systemic risk contribution, are combined with balance sheet data capturing ex ante bank default risk, they strongly predict bank survivorship in 1934. 
Keywords:  banking networks; financial crises; Global Financial Crisis; Great Depression; marginal likelihood; systemic risk 
JEL:  E42 E44 G01 G18 G21 L1 N12 N22 
Date:  2018–12 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:13416&r=all 
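One of the network measures the authors combine with balance sheet data, eigenvector centrality, can be computed by power iteration. A minimal sketch on a toy adjacency matrix (the graph here is illustrative, not the 1929 network):

```python
import numpy as np

def eigenvector_centrality(adj, iters=500, tol=1e-12):
    """Leading eigenvector of a nonnegative adjacency matrix via power
    iteration; entry i scores how connected node i is to other
    well-connected nodes."""
    A = np.asarray(adj, float)
    x = np.ones(A.shape[0])
    x /= np.linalg.norm(x)
    for _ in range(iters):
        x_new = A @ x
        x_new /= np.linalg.norm(x_new)
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x
```

One caveat: power iteration can oscillate on bipartite graphs, where the largest and smallest eigenvalues have equal magnitude; adding a small multiple of the identity to the adjacency matrix fixes that without changing the ranking.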
By:  E Philip Davis; Dilruba Karim; Dennison Noel 
Abstract:  Following experience in the global financial crisis (GFC), when banks with low leverage ratios were often in severe difficulty despite high risk-adjusted capital measures, a leverage ratio was introduced in Basel III to complement the risk-adjusted capital ratio (RAR). Empirical testing of the leverage ratio, individually and relative to regulatory capital, is, however, sparse. More generally, the capital/risk/competition nexus has been neglected by regulators and researchers. In this paper, we undertake empirical research that sheds light on leverage as a regulatory tool, controlling for competition. We assess the effectiveness of a leverage ratio relative to the risk-adjusted capital ratio (RAR) in predicting bank risk given competition for up to 8,216 banks in the EU and 1,270 in the US, using the Fitch Connect database of banks’ financial statements. On balance, US banks tend to behave in a manner consistent with “skin in the game” (a negative relation of competition to risk) while European banks tend to follow the “regulatory hypothesis” (positive relation), although there are exceptions to these generalisations. Accordingly, the expected effect of changes in capital on risk needs careful attention by regulators. There is a tendency for the leverage ratio to be more often significant than the risk-adjusted measure in a number of the regressions. This observation favours its use in macroprudential policy. The effect of capital on risk varies considerably over time and cross-sectionally for Europe vis-à-vis the US; effects often differ between low-leverage and high-leverage ratio banks as well as pre- and post-crisis and for individual EU countries. The overall results are robust to a number of variations in sample and specification. We consider the inclusion of competition as a control variable to be a major contribution that adds to the relevance of our study. 
The results show that bank competition, allowing for capital, is a significant macroprudential indicator in virtually all regressions, and hence more note should be taken of this by regulators, notably in the US, where there is mainly evidence of competition-fragility (a positive link of competition to risk). On the other hand, we note that exclusion of competition does not markedly change the effect of capital. Finally, there are differences in the relation of risk both to competition and capital adequacy for banks at different levels of risk that need to be taken into account by regulators both in Europe and the US. There is some evidence of greater vulnerability of weaker banks to low capital and high competition than would be shown by the sample average or median. 
Keywords:  Macroprudential policy, capital adequacy, leverage ratio, bank competition, bank risks, panel estimation, quantile regressions 
JEL:  E58 G28 
Date:  2019–02 
URL:  http://d.repec.org/n?u=RePEc:nsr:niesrd:499&r=all 
By:  Hui Chen; Scott Joslin; Sophie X. Ni 
Abstract:  We propose a new measure of financial intermediary constraints based on how the intermediaries manage their tail risk exposures. Using data for the trading activities in the market of deep out-of-the-money S&P 500 put options, we identify periods when the variations in the net amount of trading between financial intermediaries and public investors are likely to be mainly driven by shocks to intermediary constraints. We then infer the tightness of intermediary constraints from the quantities of option trading during such periods. A tightening of intermediary constraints according to our measure is associated with increasing option expensiveness, higher risk premia for a wide range of financial assets, deterioration in funding liquidity, and broker-dealer deleveraging. 
JEL:  G01 G12 G17 G2 
Date:  2019–02 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:25573&r=all 
By:  Tomasz R. Bielecki; Igor Cialenco; Marcin Pitera; Thorsten Schmidt 
Abstract:  In this paper we develop a novel methodology for the estimation of risk capital allocation. The methodology is rooted in the theory of risk measures. We work within a general but tractable class of law-invariant coherent risk measures, with a particular focus on expected shortfall. We introduce the concept of fair capital allocations and provide explicit formulae for fair capital allocations in the case when the constituents of the risky portfolio are jointly normally distributed. The main focus of the paper is on the problem of approximating fair portfolio allocations in the case where the law of the portfolio constituents is not fully known. We define and study the concepts of fair allocation estimators and asymptotically fair allocation estimators. A substantial part of our study is devoted to the problem of estimating fair risk allocations for expected shortfall. We study this problem under normality as well as in a nonparametric setup. We derive several estimators and prove their fairness and/or asymptotic fairness. Last but not least, we propose two backtesting methodologies that are oriented at assessing the performance of the allocation estimation procedure. The paper closes with a substantial numerical study of the subject. 
Date:  2019–02 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1902.10044&r=all 
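In the jointly normal case the paper treats explicitly, the standard Euler-type allocation of expected shortfall has a closed form. The sketch below is that textbook normal-case formula, not the authors' fair allocation estimators; the bisection inverse CDF is just a stdlib-only convenience:

```python
import numpy as np
from math import erf, exp, pi, sqrt

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def norm_ppf(p):
    lo, hi = -10.0, 10.0                      # bisection on the standard normal CDF
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if norm_cdf(mid) < p else (lo, mid)
    return 0.5 * (lo + hi)

def es_allocation_normal(mu, cov, alpha=0.975):
    """Euler expected-shortfall allocation for jointly normal losses.

    Component i receives mu_i plus its covariance with the total loss, scaled
    so that the allocations sum exactly to the portfolio's expected shortfall
    (the full-allocation property behind "fair" allocations).
    """
    mu, cov = np.asarray(mu, float), np.asarray(cov, float)
    sigma_p = sqrt(cov.sum())                 # std dev of the total loss
    z = norm_ppf(alpha)
    scale = exp(-0.5 * z * z) / sqrt(2.0 * pi) / (1.0 - alpha)  # phi(z)/(1-alpha)
    return mu + cov.sum(axis=1) / sigma_p * scale
```

The paper's contribution starts where this formula stops: estimating such allocations fairly when the joint law must itself be estimated from data.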
By:  Tim Xiao (University of Toronto) 
Abstract:  The incremental risk charge (IRC) is a new regulatory requirement from the Basel Committee in response to the recent financial crisis. Notably, few models for IRC have been developed in the literature. This paper proposes a methodology consisting of two Monte Carlo simulations. The first Monte Carlo simulation simulates default, migration, and concentration in an integrated way. Combined with full revaluation, the loss distribution at the first liquidity horizon for a subportfolio can be generated. The second Monte Carlo simulation performs random draws based on the constant level of risk assumption. It convolves copies of the single-horizon loss distribution to produce the one-year loss distribution. The aggregation of different subportfolios with different liquidity horizons is addressed. Moreover, the methodology for equity is also included, even though it is optional in IRC. Acknowledgement: The work was sponsored by FinPricing at www.finpricing.com 
Keywords:  Incremental risk charge (IRC), constant level of risk, liquidity horizon, constant loss distribution, Merton-type model, concentration 
Date:  2019–02–18 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal-02024148&r=all 
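The convolution step in the second simulation has a direct discrete analogue: under the constant level of risk assumption, the one-year loss is a sum of i.i.d. copies of the single-horizon loss. A minimal sketch on a discretized loss grid (the grid and horizon count are illustrative assumptions):

```python
import numpy as np

def one_year_loss_pmf(pmf, horizons_per_year=4):
    """Convolve i.i.d. copies of a single-horizon loss pmf (losses on an
    equally spaced grid starting at 0) into the one-year loss pmf."""
    out = np.array([1.0])                    # point mass at zero loss
    for _ in range(horizons_per_year):
        out = np.convolve(out, np.asarray(pmf, float))
    return out

def quantile_from_pmf(pmf, q):
    """Smallest grid index whose cumulative probability reaches q
    (e.g. q = 0.999 for the IRC capital quantile)."""
    return int(np.searchsorted(np.cumsum(pmf), q))
```

With a quarterly liquidity horizon, four-fold convolution of the simulated quarterly loss distribution yields the annual distribution from which the 99.9% charge is read off.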
By:  Nizam, Ahmed Mehedi 
Abstract:  The Basel framework for banks' capital adequacy has been criticized for its over-reliance on external credit rating agencies. Moreover, implementation of the Minimum Capital Requirement (MCR) under Basel III is often linked to a decrease in economic growth, as it requires banks to maintain a higher capital base, which raises their cost of funds. In addition to these, here we criticize the Basel accord because the capital requirement under this framework is not inspired by the essence of the basic accounting equation. Moreover, under the Basel framework, capital requirement and liquidity parameters are discussed separately. Here, we argue that the capital requirement should arise as a by-product of day-to-day liquidity management, and hence both requirements can be brought together under one umbrella, which enables us to view the overall position of a bank from a more holistic point of view. Here, we address all the above issues and provide a comprehensive framework for banks' capital adequacy and liquidity requirements which is claimed to settle all the aforementioned issues and to reduce the extensive paperwork needed for the implementation of the Basel accord. 
Keywords:  Basel; Capital Adequacy; Minimum Capital Requirement; MCR; Liquidity Ratio; LCR; NSFR; Liquidity Coverage Ratio; Net Stable Funding Ratio; Banking; Basic Accounting Equation 
JEL:  E58 G0 G01 G20 G21 G28 
Date:  2019–02–22 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:92330&r=all 
By:  Tim Xiao (University of Toronto) 
Abstract:  This article presents a comprehensive framework for valuing financial instruments subject to credit risk. In particular, we focus on the impact of default dependence on asset pricing, as correlated default risk is one of the most pervasive threats in financial markets. We analyze how swap rates are affected by bilateral counterparty credit risk, and how CDS spreads depend on the trilateral credit risk of the buyer, seller, and reference entity in a contract. Moreover, we study the effect of collateralization on valuation, since the majority of OTC derivatives are collateralized. The model shows that a fully collateralized swap is risk-free, whereas a fully collateralized CDS is not equivalent to a risk-free one. Acknowledgement: The data were provided by FinPricing at www.finpricing.com 
Keywords:  asset pricing, credit risk modeling, unilateral, bilateral, multilateral credit risk, collateralization, comvariance, comrelation, correlation 
Date:  2019–02–18 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal-02024145&r=all 
By:  Tim Xiao (University of Toronto) 
Abstract:  This article presents a new model for valuing financial contracts subject to credit risk and collateralization. Examples include the valuation of a credit default swap (CDS) contract that is affected by the trilateral credit risk of the buyer, seller and reference entity. We show that default dependency has a significant impact on asset pricing. In fact, correlated default risk is one of the most pervasive threats in financial markets. We also show that a fully collateralized CDS is not equivalent to a risk-free one. In other words, full collateralization cannot eliminate counterparty risk completely in the CDS market. Acknowledgement: The work was sponsored by FinPricing at www.finpricing.com 
Keywords:  asset pricing, credit risk modeling, collateralization, comvariance, comrelation, correlation, CDS 
Date:  2019–02–18 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal-02024147&r=all 
By:  Miryana Grigorova (School of Mathematics, University of Leeds); Marie-Claire Quenez (LPSM UMR 8001, Laboratoire de Probabilités, Statistique et Modélisation, Université Paris Diderot, Sorbonne Université, CNRS); Agnès Sulem (MATHRISK, Mathematical Risk Handling, Inria de Paris, École des Ponts ParisTech, Université Paris-Est Marne-la-Vallée) 
Abstract:  This paper studies the superhedging prices and the associated superhedging strategies for European options in a nonlinear incomplete market model with default. We present the seller's and the buyer's point of view. The underlying market model consists of a risk-free asset and a risky asset driven by a Brownian motion and a compensated default martingale. The portfolio processes follow nonlinear dynamics with a nonlinear driver f. By using a dynamic programming approach, we first provide a dual formulation of the seller's (superhedging) price for the European option as the supremum, over a suitable set of equivalent probability measures Q ∈ Q, of the f-evaluation/expectation under Q of the payoff. We also provide a characterization of the seller's (superhedging) price process as the minimal supersolution of a constrained BSDE with default and a characterization in terms of the minimal weak supersolution of a BSDE with default. By a form of symmetry, we derive corresponding results for the buyer. Our results rely on first establishing a nonlinear optional and a nonlinear predictable decomposition for processes which are $\mathcal{E}^f$-strong supermartingales under Q, for all Q ∈ Q. 
Keywords:  f-expectation, BSDEs with constraints, Nonlinear pricing, Superhedging, Incomplete market, European options, Control problems with nonlinear expectation, Nonlinear optional decomposition, Pricing-hedging duality 
Date:  2019–02–19 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal-02025833&r=all 
By:  E Philip Davis; Dilruba Karim; Dennison Noel 
Abstract:  The Global Financial Crisis (GFC) highlighted the importance of a number of unresolved empirical issues in the field of financial stability. First, there is the sign of the relationship between bank competition and financial stability. Second, there is the relation of capital adequacy of banks to risk. Third, the introduction of a leverage ratio in Basel III following the crisis leaves open the question of its effectiveness relative to the risk-adjusted capital ratio (RAR). Fourth, there is the issue of the relative stability of advanced versus emerging market financial systems, and whether similar factors lead to risk, which may have implications for appropriate regulation. Finally, there is the nature of the relation between bank competition and bank capital. In this context, we address these five issues via estimates for the relation between capital adequacy, bank competition and other control variables and aggregate bank risk. We undertake this for different country groups and time periods, using macro data from the World Bank’s Global Financial Development Database over 1999–2015 for up to 120 countries globally, using single equation logit and GMM estimation techniques and panel VAR. This is an overall approach that to our knowledge is new to the literature. The results cast light on each of the issues outlined above, with important implications for regulation: (1) The results for the Lerner Index largely underpin the “competition-fragility” hypothesis of a positive relation of competition to risk rather than “competition-stability” (a negative relation) and show a widespread impact of competition on risk generally. 
(2) There is a tendency for both the leverage ratio and the RAR to be significant predictors of risk; for crises and the Z-score they are supportive of the “skin in the game” hypothesis of a negative relation between capital ratios and risk, whereas for provisions and NPLs they are consistent with the “regulatory hypothesis” of a positive relation of capital adequacy to risk. (3) The leverage ratio is much more widely relevant than the RAR, underlining its importance as a regulatory tool. The relative ineffectiveness of risk-adjusted measures may relate to untruthful or inaccurate assessments of banks' real risk exposure. (4) There are marked differences between advanced countries and EMEs in the capital-risk-competition nexus, with, for example, a wider impact of competition in EMEs (although both types of country need to pay careful attention to the evolution of competition in macroprudential surveillance). Similar patterns to those in EMEs are apparent in many cases for the global sample pre-crisis, which arguably are more consistent with normal market functioning than post-crisis. (5) Competition reduces leverage ratios significantly in a Panel VAR, with impulse responses showing that more competition leads to lower leverage ratios and vice versa. This result is consistent over a range of subsamples and risk variables. In the variance decomposition, we find that competition is autonomous, while the variance of both risk and capital ratios is strongly affected by competition. The Panel VAR results give some indication of the transmission mechanism from competition to risk and financial instability. 
Keywords:  Macroprudential policy, capital adequacy, leverage ratio, bank competition, bank risks, emerging market economies, logit, GMM, Panel VAR 
JEL:  E58 G28 
Date:  2019–02 
URL:  http://d.repec.org/n?u=RePEc:nsr:niesrd:500&r=all 
By:  Luca Di Persio; Luca Prezioso; Kai Wallbaum 
Abstract:  Recent years have seen an emerging class of structured financial products based on options linked to dynamic asset allocation strategies. One of the most frequently chosen approaches is the so-called target volatility mechanism. It shifts between risky and riskless assets to control the volatility of the overall portfolio. Even though a series of articles has already been devoted to the analysis of options linked to the target volatility mechanism, this paper is the first, to the best of our knowledge, that tries to develop closed-end formulas for VolTarget options. In particular, we develop closed-end formulas for option prices and some key hedging parameters within a Black and Scholes setting, assuming the underlying follows a target volatility mechanism. 
Date:  2019–02 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1902.08821&r=all 
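A useful sanity check for such formulas: under Black-Scholes with constant volatility, the target volatility strategy itself follows a geometric Brownian motion with volatility equal to the target, so an option on it reduces to a plain Black-Scholes price. The sketch below checks this degenerate case by Monte Carlo; the parameters are illustrative and the frictionless, constant-volatility setting is an assumption:

```python
import numpy as np
from math import erf, exp, log, sqrt

def bs_call(s0, k, vol, r, T):
    """Standard Black-Scholes call price (used as the reference value)."""
    d1 = (log(s0 / k) + (r + 0.5 * vol * vol) * T) / (vol * sqrt(T))
    d2 = d1 - vol * sqrt(T)
    N = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))
    return s0 * N(d1) - k * exp(-r * T) * N(d2)

def target_vol_call_mc(s0, k, sigma, target_vol, r, T, n_paths=200000, seed=0):
    """Monte Carlo call price on a target-volatility strategy under
    Black-Scholes.

    With constant asset volatility sigma, holding weight w = target_vol /
    sigma in the risky asset (rest at the risk-free rate) makes the strategy
    a GBM with volatility exactly target_vol and risk-neutral drift r.
    """
    rng = np.random.default_rng(seed)
    eff_vol = (target_vol / sigma) * sigma      # = target_vol by construction
    z = rng.standard_normal(n_paths)
    vT = s0 * np.exp((r - 0.5 * eff_vol**2) * T + eff_vol * sqrt(T) * z)
    return exp(-r * T) * float(np.maximum(vT - k, 0.0).mean())
```

The interesting cases in the paper are of course those where realized volatility varies and the leverage weight moves with it; this constant-volatility limit is only the boundary case the closed-end formulas must recover.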
By:  Miryana Grigorova (School of Mathematics, University of Leeds); Marie-Claire Quenez (LPSM UMR 8001, Laboratoire de Probabilités, Statistique et Modélisation, Université Paris Diderot, Sorbonne Université, CNRS); Agnès Sulem (MATHRISK, Mathematical Risk Handling, Inria de Paris, École des Ponts ParisTech, Université Paris-Est Marne-la-Vallée) 
Abstract:  We study the superhedging prices and the associated superhedging strategies for American options in a nonlinear incomplete market model with default. The points of view of the seller and of the buyer are presented. The underlying market model consists of a risk-free asset and a risky asset driven by a Brownian motion and a compensated default martingale. The portfolio processes follow nonlinear dynamics with a nonlinear driver f. We give a dual representation of the seller's (superhedging) price for the American option associated with a completely irregular payoff $(\xi_t)$ (not necessarily càdlàg) in terms of the value of a nonlinear mixed control/stopping problem. The dual representation involves a suitable set of equivalent probability measures, which we call f-martingale probability measures. We also provide two infinitesimal characterizations of the seller's price process: in terms of the minimal supersolution of a constrained reflected BSDE and in terms of the minimal supersolution of an optional reflected BSDE. Under some regularity assumptions on $\xi$, we also show a duality result for the buyer's price in terms of the value of a nonlinear control/stopping game problem. 
Keywords:  Control problems with nonlinear expectation, Constrained reflected BSDE, f-expectation, Nonlinear pricing, Incomplete markets, American options, Optimal stopping with nonlinear expectation, Nonlinear optional decomposition, Pricing-hedging duality 
Date:  2019–02–19 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal-02025835&r=all 
By:  Zviadadze, Irina 
Abstract:  This article develops an empirical methodology to determine which economic shocks span risk in asset returns and fluctuations in discount rate and cash flow news. A theoretically motivated shock identification scheme in a presentvalue model identifies economic shocks. The choice of identifying restrictions is based on the properties of the term structure of risk in expected returns in the data and in equilibrium models. Empirically, I relate equity discount rate news and cash flow news to multiple sources of risk in the variance of consumption growth. Both types of news are almost equally important for the aggregate market risk. 
Keywords:  incremental expected dividend; incremental expected return; permanent and transient shocks 
JEL:  C32 C52 G12 
Date:  2018–12 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:13414&r=all 
By:  Chiara Lattanzi; Manuele Leonelli 
Abstract:  Inference over tails is usually performed by fitting an appropriate limiting distribution over observations that exceed a fixed threshold. However, the choice of such a threshold is critical and can affect the inferential results. Extreme value mixture models have been defined to estimate the threshold using the full dataset and to give accurate tail estimates. Such models assume that the tail behavior is constant for all observations. However, the extreme behavior of financial returns often changes considerably in time, and such changes are driven by sudden market shocks. Here we extend the extreme value mixture model class to formally take into account distributional extreme changepoints, by allowing for the presence of regime-dependent parameters modelling the tail of the distribution. This extension formally uses the full dataset to estimate both the thresholds and the extreme changepoint locations, giving uncertainty measures for both quantities. Estimation of functions of interest in extreme value analyses is performed via MCMC algorithms. Our approach is evaluated through a series of simulations, applied to real data sets and assessed against competing approaches. The evidence demonstrates that the inclusion of different extreme regimes outperforms both static and dynamic competing approaches in financial applications. 
Date:  2019–02 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1902.09205&r=all 
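The limiting distribution fitted above a threshold here is the generalized Pareto. As a point of contrast with the paper's Bayesian mixture approach, a bare-bones method-of-moments fit at a fixed threshold (exactly the choice the mixture models avoid having to make by hand) looks like this; the estimator is only valid for shape parameters below 1/2:

```python
import numpy as np

def gpd_fit_moments(data, threshold):
    """Method-of-moments fit of a generalized Pareto distribution (GPD) to
    exceedances over a fixed threshold. Returns (shape xi, scale sigma);
    valid only when the true shape is below 1/2 (finite variance)."""
    exc = np.asarray(data, float)
    exc = exc[exc > threshold] - threshold     # peaks over threshold
    m, v = exc.mean(), exc.var()
    xi = 0.5 * (1.0 - m * m / v)               # from mean = sigma/(1 - xi),
    sigma = m * (1.0 - xi)                     # var = mean^2/(1 - 2*xi)
    return xi, sigma
```

Everything downstream of this fit (tail quantiles, return levels) inherits the threshold choice, which is the sensitivity the extreme value mixture models, and their changepoint extension here, are designed to remove.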
By:  Sangyeon Kim; Myungjoo Kang 
Abstract:  Financial time series prediction, especially with machine learning techniques, is an extensive field of study. In recent times, deep learning methods (especially for time series analysis) have performed outstandingly for various industrial problems, with better prediction than machine learning methods. Moreover, many researchers have used deep learning methods to predict financial time series with various models in recent years. In this paper, we will compare various deep learning models, such as multilayer perceptron (MLP), one-dimensional convolutional neural networks (1D CNN), stacked long short-term memory (stacked LSTM), attention networks, and weighted attention networks for financial time series prediction. In particular, attention LSTM is not only used for prediction, but also for visualizing intermediate outputs to analyze the reason for a prediction; therefore, we will show an example of understanding the model prediction intuitively with attention vectors. In addition, we focus on time and factors, which leads to an easy understanding of why certain trends are predicted when accessing a given time series table. We also modify the loss functions of the attention models with weighted categorical cross entropy; our proposed model produces a 0.76 hit ratio, which is superior to those of other methods for predicting the trends of the KOSPI 200. 
Date:  2019–02 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1902.10877&r=all 
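Two quantities specific to this experimental setup — the hit ratio used as the headline metric and a weighted categorical cross entropy loss — are each a few lines. The class-weight vector and sign-matching convention below are illustrative assumptions, not the paper's exact definitions:

```python
import numpy as np

def hit_ratio(predicted, actual):
    """Fraction of periods in which the predicted trend direction (sign)
    matches the realized one."""
    return float(np.mean(np.sign(predicted) == np.sign(actual)))

def weighted_categorical_crossentropy(probs, labels, class_weights):
    """Mean cross entropy with a per-class weight, so that rarer or more
    important classes (e.g. large moves) contribute more to the loss."""
    p = np.clip(np.asarray(probs, float), 1e-12, 1.0)
    onehot = np.eye(p.shape[1])[np.asarray(labels)]
    w = np.asarray(class_weights, float)
    return float(-(w * onehot * np.log(p)).sum(axis=1).mean())
```

With equal class weights the second function reduces to ordinary categorical cross entropy; the weighting is what lets the attention models trade some accuracy on flat periods for better detection of directional moves.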
By:  Cyril Bénézet; Jean-François Chassagneux; Christoph Reisinger 
Abstract:  We consider the numerical approximation of the quantile hedging price in a nonlinear market. In a Markovian framework, we propose a numerical method based on a Piecewise Constant Policy Timestepping (PCPT) scheme coupled with a monotone finite difference approximation. We prove the convergence of our algorithm combining BSDE arguments with the Barles & Jakobsen and Barles & Souganidis approaches for nonlinear equations. In a numerical section, we illustrate the efficiency of our scheme by considering a financial example in a market with imperfections. 
Date:  2019–02 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1902.11228&r=all 
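The monotone finite difference ingredient of such schemes can be illustrated on the linear Black-Scholes PDE (the nonlinear drivers and the piecewise constant policy layer of the paper are beyond a sketch). Grid sizes below are chosen to satisfy the explicit scheme's stability condition and are otherwise illustrative:

```python
import numpy as np

def explicit_fd_call(s_max, strike, sigma, r, T, n_s=200, n_t=2000):
    """Monotone explicit finite difference solve of the linear Black-Scholes
    PDE for a European call, stepping backwards from the terminal payoff.
    Requires dt <= ds^2 / (sigma^2 * s_max^2) for stability (satisfied here
    for the parameters used in the example)."""
    ds, dt = s_max / n_s, T / n_t
    s = np.arange(n_s + 1) * ds
    v = np.maximum(s - strike, 0.0)                  # terminal payoff
    for _ in range(n_t):
        delta = (v[2:] - v[:-2]) / (2.0 * ds)        # central first derivative
        gamma = (v[2:] - 2.0 * v[1:-1] + v[:-2]) / ds**2
        v[1:-1] += dt * (0.5 * sigma**2 * s[1:-1]**2 * gamma
                         + r * s[1:-1] * delta - r * v[1:-1])
        v[0] = 0.0                                   # call is worthless at S = 0
        v[-1] = s_max - strike * np.exp(-r * T)      # approximate far boundary
    return s, v
```

In the PCPT construction, a solve of roughly this kind is carried out for each frozen policy over each time interval, and the value is taken as the maximum over policies; monotonicity of the spatial stencil is what makes the Barles-Souganidis convergence machinery applicable.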
By:  J. L. Subias 
Abstract:  The present paper describes a practical example in which the probability distribution of the prices of a stock market blue chip is calculated as the wave function of a quantum particle confined in a potential well. This model may naturally explain the operation of several empirical rules used by technical analysts. Models based on the movement of a Brownian particle do not account for fundamental aspects of financial markets. This is due to the fact that the Brownian particle is a classical particle, while stock market prices behave more like quantum particles. When a classical particle meets an obstacle or a potential barrier, it may either bounce or overcome the obstacle, but not both at once. Only a quantum particle can simultaneously reflect and transmit itself at a potential barrier. This is precisely what prices in a stock market imitate when they find a resistance level: they partially bounce against it and partially overcome it. This can only be explained by admitting that prices behave as quantum rather than as classical particles. The proposed quantum model finds natural justification not only for the aforementioned facts but also for other empirically well-known facts, such as sudden changes in volatility and non-Gaussian distributions in prices, among others. 
Date:  2019–01 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1902.10502&r=all 
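The textbook object behind this model — the probability density of a particle in an infinite potential well — is a one-liner. Reading the well walls as the price band the paper associates with technical levels is our gloss on the abstract; the well width and mode number below are illustrative:

```python
import numpy as np

def box_probability_density(x, n, width):
    """|psi_n(x)|^2 for a particle in an infinite potential well on
    [0, width], where psi_n(x) = sqrt(2/width) * sin(n * pi * x / width)."""
    x = np.asarray(x, float)
    psi = np.sqrt(2.0 / width) * np.sin(n * np.pi * x / width)
    return psi ** 2
```

Unlike the flat local picture of a Brownian model, this density vanishes at the walls and peaks in the interior, which is the qualitative feature the paper connects to prices oscillating inside a trading range rather than diffusing freely.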