
on Risk Management 
By:  Houllier, Melanie (The London Institute for Banking and Finance); Murphy, David (Bank of England) 
Abstract:  The advent of mandatory central clearing for certain types of over-the-counter derivatives and margin requirements for others means that margin is the most important mitigation mechanism for many counterparty credit risks. Initial margin requirements are typically calculated using risk-based margin models, and these models must be tested to ensure that they are prudent. However, two different margin models can calculate substantially different levels of margin yet both pass the usual tests. This paper presents a new approach to parameter selection based on the statistical properties of the worst loss over a margin period of risk estimated by the margin model under test. This measure is related to risk estimated at a fixed confidence interval yet leads to a more powerful test which is better able to justify the choice of parameters used in margin models. The test proposed is used on a variety of volatility estimation techniques applied to a long history of returns of the S&P 500 index. Well-known techniques, including exponentially weighted moving average volatility estimation and generalised autoregressive conditional heteroskedasticity approaches, are considered, and novel approaches derived from signal processing are also analysed. In each case a range of model parameters which give rise to acceptable risk estimates is identified. 
Keywords:  Conditional volatility; filtered volatility; GARCH(1,1); initial margin model; model backtesting; volatility estimation 
JEL:  C12 C52 G13 
Date:  2017–09–01 
URL:  http://d.repec.org/n?u=RePEc:boe:boeewp:0673&r=rmg 
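
The EWMA volatility estimator mentioned in the abstract can be sketched in a few lines. The decay parameter lambda below is the common RiskMetrics value, shown only as an illustration; the paper's point is precisely that such parameter choices need a powerful statistical test.

```python
import numpy as np

def ewma_volatility(returns, lam=0.94):
    """EWMA conditional variance: sigma2_t = lam*sigma2_{t-1} + (1-lam)*r_{t-1}^2."""
    sigma2 = np.empty(len(returns))
    sigma2[0] = returns[0] ** 2              # seed with the first squared return
    for t in range(1, len(returns)):
        sigma2[t] = lam * sigma2[t - 1] + (1 - lam) * returns[t - 1] ** 2
    return np.sqrt(sigma2)
```

A margin model would then scale this volatility to the chosen horizon and confidence level, for example multiplying by a normal quantile such as 2.33 for 99% (again purely illustrative).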
By:  Mabelle Sayah (Université Claude Bernard Lyon 1, UCBL, Faculté des Sciences; Université Saint-Joseph de Beyrouth, USJ) 
Abstract:  Recent financial crises were the root of many changes in regulatory implementations in the banking sector. Basel previously covered the default capital charge for counterparty exposures; however, the crisis showed that more than two-thirds of the losses related to this risk emerged from exposure to movements in the counterparty's credit quality rather than its actual default. Basel III therefore divided the required counterparty risk capital into two categories: the traditional default capital charge and an additional counterparty credit valuation adjustment (CVA) capital charge. In this article, we explain the new methodologies to compute these capital charges on the OTC market: the standardized approach for the default capital charge (SA-CCR) and the basic approach for CVA (BA-CVA). Based on historical calibration and future estimations, we built internal models in order to compare them with the amended standardized approach. Up until June 2015, interest rate and FX derivatives constituted more than 90% of the total traded OTC notional amount; we therefore constructed our application on such portfolios and computed their total counterparty capital charge. The analysis reflected different impacts of the netting and collateral agreements on the regulatory capital depending on the instruments' typologies. Moreover, results showed an important increase in the capital charge due to the CVA addition, doubling it in some cases. 
Keywords:  Basel III, Counterparty Credit Risk, SA-CCR, CVA, OTC Derivatives 
Date:  2016–12–30 
URL:  http://d.repec.org/n?u=RePEc:hal:journl:hal01550312&r=rmg 
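
The SA-CCR exposure-at-default discussed in the abstract follows the Basel formula EAD = 1.4 × (RC + PFE). Below is a minimal sketch for a single unmargined interest-rate trade, assuming a maturity factor of 1 (remaining maturity of at least one year) and using the 0.5% supervisory factor Basel assigns to interest-rate derivatives; a real netting set aggregates add-ons across hedging sets and asset classes, which this sketch omits.

```python
import math

def sa_ccr_ead_ir(notional, start, end, mtm, collateral=0.0, sf=0.005):
    """EAD = 1.4 * (RC + multiplier * AddOn) for one unmargined IR trade."""
    rc = max(mtm - collateral, 0.0)                                      # replacement cost
    duration = (math.exp(-0.05 * start) - math.exp(-0.05 * end)) / 0.05  # supervisory duration
    addon = sf * notional * duration                                     # single hedging-set add-on
    floor = 0.05                                                         # multiplier floor
    multiplier = min(1.0, floor + (1 - floor) *
                     math.exp((mtm - collateral) / (2 * (1 - floor) * addon)))
    return 1.4 * (rc + multiplier * addon)
```

The multiplier dampens the add-on when the netting set is out of the money; at zero mark-to-market it equals 1.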
By:  Lijun Bo; Agostino Capponi; Claudia Ceci 
Abstract:  We study dynamic hedging of counterparty risk for a portfolio of credit derivatives. Our empirically driven credit model consists of interacting default intensities which ramp up and then decay after the occurrence of credit events. Using the Galtchouk-Kunita-Watanabe decomposition of the counterparty risk price payment stream, we recover a closed-form representation for the risk-minimizing strategy in terms of classical solutions to nonlinear recursive systems of Cauchy problems. We discuss applications of our framework to the most prominent classes of credit derivatives, including credit swap and risky bond portfolios, as well as first-to-default claims. 
Date:  2017–09 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1709.01115&r=rmg 
By:  Felix Moldenhauer; Marcin Pitera 
Abstract:  In this short note we propose a new backtesting framework for Expected Shortfall that could be used by the regulator. Instead of looking at the estimated capital reserve and the realized cash flow separately, one could bind them into the secured position, for which the risk measurement process is much easier. Using this simple concept combined with the monotonicity of Expected Shortfall with respect to its target confidence level, one can provide an unconditional coverage backtesting framework for Expected Shortfall that is a natural extension of the current Value-at-Risk regulatory traffic-light approach. 
Date:  2017–09 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1709.01337&r=rmg 
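
The secured-position idea can be illustrated as follows: bind the reserve and the realized loss into one quantity and count the days on which it turns negative. The traffic-light thresholds below are placeholders in the spirit of the Basel VaR zones, not the paper's calibration.

```python
import numpy as np

def secured_breaches(es_estimates, losses):
    """Days on which the realized loss exhausts the ES-based reserve."""
    secured = np.asarray(es_estimates) - np.asarray(losses)  # the 'secured position'
    return int(np.sum(secured < 0))

def zone(breaches, green_max=4, yellow_max=9):
    """Traffic light on the breach count over, say, a 250-day window."""
    if breaches <= green_max:
        return "green"
    return "yellow" if breaches <= yellow_max else "red"
```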
By:  Bikramjit Das (Singapore University of Technology and Design); Marie Kratz (ESSEC Business School) 
Abstract:  We analyze risk diversification in a portfolio of heavy-tailed risk factors under the assumption of second order multivariate regular variation. Asymptotic limits for a measure of diversification benefit are obtained when considering, for instance, the value-at-risk. The asymptotic limits are computed in a few examples exhibiting a variety of different assumptions made on marginal or joint distributions. This study ties up existing related results available in the literature under a broader umbrella. 
Keywords:  asymptotic theory, diversification benefit, heavy tail, risk concentration, second order regular variation, value-at-risk 
Date:  2017–04 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal01520655&r=rmg 
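
A finite-sample version of the diversification-benefit measure can be estimated by Monte Carlo, comparing the VaR of the aggregated loss with the sum of marginal VaRs. The Pareto tail index and portfolio size below are arbitrary illustrations, not the paper's examples.

```python
import numpy as np

def diversification_benefit(samples, q=0.99):
    """D_q = 1 - VaR_q(aggregated loss) / sum of marginal VaR_q (empirical quantiles)."""
    var_sum = np.quantile(samples.sum(axis=1), q)
    var_marg = np.quantile(samples, q, axis=0).sum()
    return 1.0 - var_sum / var_marg

rng = np.random.default_rng(0)
alpha, n, d = 3.0, 200_000, 5                      # tail index 3, five iid heavy-tailed factors
losses = rng.pareto(alpha, size=(n, d)) + 1.0      # shift Lomax draws to classical Pareto
D = diversification_benefit(losses)                # positive: pooling these risks helps
```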
By:  Luca Barbaglia; Christophe Croux; Ines Wilms 
Abstract:  Volatility is a key measure of risk in financial analysis. The high volatility of one financial asset today could affect the volatility of another asset tomorrow. These lagged effects among volatilities, which we call volatility spillovers, are studied using the Vector AutoRegressive (VAR) model. We account for the possibly fat-tailed distribution of the VAR model errors using a VAR model with errors following a multivariate Student t-distribution with unknown degrees of freedom. Moreover, we study volatility spillovers among a large number of assets. To this end, we use penalized estimation of the VAR model with t-distributed errors. We study volatility spillovers among energy, biofuel and agricultural commodities and reveal bidirectional volatility spillovers between energy and biofuel, and between energy and agricultural commodities. 
Keywords:  Commodities, Forecasting, Multivariate t-distribution, Vector AutoRegressive model, Volatility spillover 
Date:  2017–08 
URL:  http://d.repec.org/n?u=RePEc:ete:kbiper:590528&r=rmg 
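
A plain least-squares VAR(1) fit shows how spillovers appear as off-diagonal coefficients; note the paper itself uses penalized estimation with Student-t errors, which this sketch does not implement.

```python
import numpy as np

def fit_var1(X):
    """OLS fit of x_t = c + A x_{t-1} + e_t on a (T, d) matrix X; returns (c, A)."""
    Y = X[1:]
    Z = np.hstack([np.ones((len(X) - 1, 1)), X[:-1]])
    B, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    return B[0], B[1:].T          # intercept (d,), coefficient matrix (d, d)
```

A_hat[i, j] measures the spillover from yesterday's volatility of asset j onto today's volatility of asset i.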
By:  Tröger, Tobias H. 
Abstract:  This paper analyses the bail-in tool under the BRRD and predicts that it will not reach its policy objective. To make this argument, the paper first describes the policy rationale that calls for mandatory private sector involvement (PSI). From this analysis the key features of an effective bail-in tool can be derived. These insights serve as the background for the case that the European resolution framework is likely ineffective in establishing adequate market discipline through risk-reflecting prices for bank capital. The main reason for this lies in the avoidable embeddedness of the BRRD's bail-in tool in the much broader resolution process, which grants authorities ample discretion, including over whether to force private sector involvement. Finally, the paper synthesizes the prior analysis by putting forward an alternative regulatory approach that seeks to disentangle private sector involvement, as a precondition for effective bank resolution, as much as possible from the resolution process as such. 
Keywords:  bail-in, private sector involvement, precautionary recapitalization, cross-border insolvency, market discipline 
JEL:  G01 G18 G21 G28 K22 K23 
Date:  2017 
URL:  http://d.repec.org/n?u=RePEc:zbw:safewp:179&r=rmg 
By:  Perri, Fabrizio (Federal Reserve Bank of Minneapolis); Stefanidis, Georgios (Federal Reserve Bank of Minneapolis) 
Abstract:  We use balance sheet data and stock market data for the major U.S. banking institutions during and after the 2007–08 financial crisis to estimate the magnitude of the losses experienced by these institutions because of the crisis. We then use these estimates to assess the impact of the crisis under alternative, and higher, capital requirements. We find that substantially higher capital requirements (in the 20% to 30% range) would have greatly reduced the vulnerability of these financial institutions, and consequently would have significantly reduced the need for a public bailout. 
Keywords:  Financial crises; Too big to fail 
JEL:  G01 G21 
Date:  2017–08–31 
URL:  http://d.repec.org/n?u=RePEc:fip:fedmsr:554&r=rmg 
By:  Blaurock, Ivonne; Schmitt, Noemi; Westerhoff, Frank 
Abstract:  We develop a simple agent-based financial market model in which speculators' market entry decisions are subject to herding behavior and market risk. Moreover, speculators' orders depend on price trends, market misalignments and fundamental news. Using a mix of analytical and numerical tools, we show that a herding-induced market entry wave may amplify excess demand, triggering lasting volatility outbursts. Eventually, however, higher stock market risk reduces stock market participation and volatility decreases again. Simulations furthermore reveal that our approach is also able to produce bubbles and crashes, excess volatility, fat-tailed return distributions and serially uncorrelated price changes. 
Keywords:  stock markets, heterogeneous speculators, exponential replicator dynamics, herding behavior, stylized facts 
JEL:  C63 D84 G15 
Date:  2017 
URL:  http://d.repec.org/n?u=RePEc:zbw:bamber:128&r=rmg 
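
The exponential replicator dynamics named in the keywords can be written as a single update rule for the market-entry share. The fitness arguments here are generic placeholders, whereas the paper's fitnesses embed herding and market-risk terms.

```python
import math

def exponential_replicator(n, fitness_in, fitness_out, beta=1.0):
    """One step of exponential replicator dynamics for the market-entry share n.

    n' = n / (n + (1 - n) * exp(-beta * (fitness_in - fitness_out)));
    beta is the intensity of choice, and a higher entry fitness raises n.
    """
    return n / (n + (1.0 - n) * math.exp(-beta * (fitness_in - fitness_out)))
```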
By:  Clark, Todd E. (Federal Reserve Bank of Cleveland); McCracken, Michael W. (Federal Reserve Bank of St. Louis); Mertens, Elmar (Bank for International Settlements) 
Abstract:  We develop uncertainty measures for point forecasts from surveys such as the Survey of Professional Forecasters, Blue Chip, or the Federal Open Market Committee's Summary of Economic Projections. At a given point in time, these surveys provide forecasts for macroeconomic variables at multiple horizons. To track time-varying uncertainty in the associated forecast errors, we derive a multiple-horizon specification of stochastic volatility. Compared to constant-variance approaches, our stochastic-volatility model improves the accuracy of uncertainty measures for survey forecasts. 
Keywords:  Stochastic volatility; survey forecasts; prediction 
JEL:  C32 C53 E47 
Date:  2017–08–28 
URL:  http://d.repec.org/n?u=RePEc:fip:fedlwp:2017026&r=rmg 
By:  Jane E. Ihrig; Edward Kim; Ashish Kumbhat; Cindy M. Vojtech; Gretchen C. Weinbach 
Abstract:  Leading up to 2014, banks generally increased their holdings of excess reserves as they moved to become compliant with the liquidity coverage ratio (LCR) requirement. However, once the LCR requirement was met, some banks shifted the compositions of their high-quality liquid assets (HQLA), reducing shares of reserves and increasing shares of Treasury securities and certain mortgage-backed securities (MBS). This raises the question: for a given stock of HQLA, what is its optimal composition? We use standard optimal portfolio theory to benchmark the ideal composition of a given stock of HQLA and find that a range of "optimal" HQLA portfolios is plausible depending on banks' tolerance for risk. A bank that is highly risk-averse (risk-inclined) prefers a relatively large share of reserves (MBS). Of course, the LCR is not the only constraint on banks' operations. We discuss how other factors interact with the LCR, and then examine the data for individual BHCs to show that they are currently employing a range of approaches to managing the compositions of their HQLA. In addition, we find that the pattern of dispersion in the daily variance of banks' HQLA shares supports the view that such factors are important drivers of banks' management of HQLA. Finally, we discuss possible policy implications of our results regarding the Federal Reserve's longer-run implementation of monetary policy. 
Keywords:  CAPM ; HQLA ; LCR ; Bank balance sheets ; Liquid assets ; Liquidity management ; Reserve balances 
JEL:  E51 E58 G21 G28 
Date:  2017–08–30 
URL:  http://d.repec.org/n?u=RePEc:fip:fedgfe:201792&r=rmg 
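
The "standard optimal portfolio theory" benchmark can be sketched as a Markowitz minimum-variance problem over a three-asset HQLA menu; the expected returns and covariances below are made-up placeholders, not calibrated values.

```python
import numpy as np

def min_variance_weights(mu, cov, target):
    """Weights minimizing w' cov w subject to sum(w) = 1 and mu @ w = target.

    Standard Lagrangian solution; short sales are not ruled out.
    """
    ones = np.ones(len(mu))
    inv = np.linalg.inv(cov)
    A, B, C = ones @ inv @ ones, ones @ inv @ mu, mu @ inv @ mu
    denom = A * C - B * B
    lam = (C - B * target) / denom
    gam = (A * target - B) / denom
    return inv @ (lam * ones + gam * mu)

# Hypothetical HQLA menu: reserves, Treasuries, agency MBS (placeholder numbers).
mu = np.array([0.010, 0.020, 0.030])
cov = np.diag([0.0001, 0.0025, 0.0064])
w = min_variance_weights(mu, cov, target=0.018)
```

Raising the target return tilts the solution toward the riskier, higher-yielding asset, which mirrors the risk-averse-versus-risk-inclined comparison in the abstract.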
By:  Victor Olkhov 
Abstract:  This paper presents a hydrodynamic-like model of business cycles: aggregate fluctuations of economic and financial variables. We model the macroeconomy as an ensemble of economic agents on economic space, where agents' risk ratings play the role of their coordinates. The sum of the economic variables of agents with coordinate x defines macroeconomic variables as functions of time and of the coordinates x. We describe the evolution of, and interactions between, macro variables on economic space by hydrodynamic-like equations. The integral of macro variables over economic space defines aggregate economic or financial variables as functions of time t only. The hydrodynamic-like equations define the fluctuations of these aggregate variables, and the motion of agents from the low-risk to the high-risk area and back is the origin of their repeated fluctuations. Economic or financial variables on economic space also define statistical moments such as mean risk, mean square risk and higher moments, whose fluctuations describe the phases of financial and economic cycles. As an example, we present simple model relations between Assets and Revenue-on-Assets and derive hydrodynamic-like equations that describe the evolution of and interaction between these variables. The hydrodynamic-like equations permit the derivation of systems of ordinary differential equations that describe fluctuations of aggregate Assets, Assets mean risk and Assets mean square risk. Our approach can describe business cycle aggregate fluctuations induced by interactions between any number of economic or financial variables. 
Date:  2017–08 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1709.00282&r=rmg 
By:  Frédéric Sart (Risk Management  Delta Lloyd Life) 
Abstract:  The fair replication method is designed to value liabilities with an endogenous profit-sharing mechanism, i.e. one based on the book yield of the backing assets. The basic idea is to construct a hypothetical portfolio, the fair replicating portfolio (FRP), whose cash flows are scenario-invariant. The method is a computationally efficient alternative to traditional stochastic modeling. It may be particularly useful in applications where extensive calculations of the best estimate of liabilities are required. 
Keywords:  Solvency II, Best estimate of liabilities, Life insurance, Profit sharing mechanism, Replicating portfolio 
Date:  2016 
URL:  http://d.repec.org/n?u=RePEc:hal:journl:hal01574949&r=rmg 