nep-rmg New Economics Papers
on Risk Management
Issue of 2016‒11‒20
fourteen papers chosen by

  1. Banks' internal rating models - time for a change? The system of floors as proposed by the Basel committee By Haselmann, Rainer; Wahrenburg, Mark
  2. High Frequency vs. Daily Resolution: the Economic Value of Forecasting Volatility Models By F. Lilla
  3. Forecasting Financial Vulnerability in the US: A Factor Model Approach By Hyeongwoo Kim; Wen Shi
  4. Multinomial VaR Backtests: A simple implicit approach to backtesting expected shortfall By Marie Kratz; Yen H. Lok; Alexander J McNeil
  5. Regulation and Rational Banking Bubbles in Infinite Horizon By Claire Océane Chevallier; Sarah El Joueidi
  6. Risk Aversion: Differential Conditions for the Concavity in Transformed Two-Parameter Distributions By Fausto Corradin; Domenico Sartore
  7. What do central counterparties default funds really cover? A network-based stress test answer By Giulia Poce; Giulio Cimini; Andrea Gabrielli; Andrea Zaccaria; Giuditta Baldacci; Marco Polito; Mariangela Rizzo; Silvia Sabatini
  8. The Asset Liability Management problem of a nuclear operator: a numerical stochastic optimization approach By Xavier Warin
  9. First Stochastic Dominance and Risk Measurement By Niu, Cuizhen; Wong, Wing-Keung; Zhu, Lixing
  10. A positive analysis of bank behaviour under capital requirements By Bahaj, Saleem; Malherbe, Frédéric
  11. On the Third Order Stochastic Dominance for Risk-Averse and Risk-Seeking Investors with Analysis of their Traditional and Internet Stocks By Chan, Raymond H.; Clark, Ephraim; Wong, Wing-Keung
  12. Almost Unbiased Variance Estimation in Simultaneous Equation Models By Phillip, Garry; Xu, Yongdeng
  13. Computation of first-order Greeks for barrier options using chain rules for Wiener path integrals By Kensuke Ishitani
  14. How the interbank market becomes systemically dangerous: an agent-based network model of financial distress propagation By Matteo Serri; Guido Caldarelli; Giulio Cimini

  1. By: Haselmann, Rainer; Wahrenburg, Mark
    Abstract: We assess the Basel Committee on Banking Supervision (BCBS) proposal to restrict the internal ratings-based approach to bank risk and to introduce risk-weighted asset floors. If well enforced, risk-sensitive capital regulation results in a more efficient credit allocation than the standardised approach, so the internal ratings-based approach should be maintained. Output floors on internal ratings-based results, however, potentially have unintended negative side effects, whereas input floors are likely a valuable tool for achieving comparability of risk-weighted assets. Finally, the proposed measures have a potentially detrimental impact on European banks compared to others.
    Keywords: internal rating models, floors, banking regulation, BCBS
    Date: 2016
  2. By: F. Lilla
    Abstract: Forecasting-volatility models typically rely on either daily or high-frequency (HF) data, and the choice between the two is not obvious. The latter allow volatility to be treated as observable, but they suffer from many limitations: HF data exhibit microstructure problems, such as the discreteness of prices, the properties of the trading mechanism and the existence of the bid-ask spread. Moreover, such data are not always available and, even when they are, the asset's liquidity may not be sufficient to allow for frequent transactions. This paper considers different variants of these two families of forecasting-volatility models, comparing their performance (in terms of Value at Risk, VaR) under the assumptions of jumps in prices and leverage effects for volatility. Findings suggest that the GARJI model provides more accurate VaR measures for the S&P 500 index than RV models. Furthermore, the assumption of conditional normality is shown to be insufficient for accurate risk measures even when a jump contribution is included. More sophisticated models might address this issue and improve the VaR results.
    JEL: C58 C53 C22 C01 C13
    Date: 2016–11
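The contrast between the two data resolutions can be sketched in a few lines. Everything below is an illustrative assumption, not an estimate from the paper: a realized-variance estimate built from hypothetical 5-minute returns, versus a GARCH(1,1)-style conditional variance with made-up parameters, each mapped to a one-day 99% VaR under conditional normality.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical intraday (5-minute) log returns for one trading day:
# 78 five-minute intervals in a 6.5-hour session.
intraday = rng.normal(0.0, 0.001, size=78)

# Realized variance treats daily variance as (approximately) observable:
# the sum of squared intraday returns.
rv = np.sum(intraday**2)

# A GARCH(1,1)-style recursion with illustrative (not estimated) parameters,
# updated with yesterday's daily return and conditional variance.
omega, alpha, beta = 1e-6, 0.08, 0.90
prev_ret, prev_var = -0.012, 1.5e-4
garch_var = omega + alpha * prev_ret**2 + beta * prev_var

# One-day 99% VaR under conditional normality: z_{0.99} * sigma.
z99 = 2.326  # 99th percentile of the standard normal
var_rv = z99 * np.sqrt(rv)
var_garch = z99 * np.sqrt(garch_var)
print(var_rv, var_garch)
```

The HF-based number uses the intraday sample directly, which is exactly where the microstructure caveats in the abstract bite; the daily recursion needs only closing prices.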
  3. By: Hyeongwoo Kim; Wen Shi
    Abstract: This paper presents a factor-based forecasting model for financial market vulnerability, measured by changes in the Cleveland Financial Stress Index (CFSI). We estimate latent common factors via the method of principal components from 170 monthly macroeconomic series in order to forecast the CFSI out of sample. Our factor models outperform both the random walk and the autoregressive benchmark models in out-of-sample predictability, at least at short forecast horizons, a desirable feature since financial crises often come as a surprise. Interestingly, the first common factor, which plays a key role in predicting the financial vulnerability index, seems to be more closely related to real activity variables than to nominal variables. We also present a binary-choice version of the factor model that successfully estimates the probability of the high-stress regime.
    Keywords: Financial Stress Index; Method of the Principal Component; Out-of-Sample Forecast; Ratio of Root Mean Square Prediction Error; Diebold-Mariano-West Statistic; Ordered Probit Model
    JEL: E44 E47 G01 G17
    Date: 2016–11
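The two-step procedure in the abstract (factor extraction by principal components, then an out-of-sample forecasting regression) can be sketched as follows. The panel, the target series and the dimensions are hypothetical stand-ins, not the paper's 170-series dataset or the CFSI.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for the macro panel: T months x N indicators,
# driven by one latent factor plus noise (the paper uses 170 series).
T, N = 120, 30
factor = rng.normal(size=T)
X = np.outer(factor, rng.normal(size=N)) + 0.5 * rng.normal(size=(T, N))

# Principal-component estimate of the latent factor:
# standardize each series, then take the first left singular vector.
Z = (X - X.mean(0)) / X.std(0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
f_hat = U[:, 0] * S[0]          # first principal component

# Hypothetical stress-index changes to forecast (stand-in for the CFSI).
y = 0.8 * factor + 0.3 * rng.normal(size=T)

# One-step-ahead forecast: regress y_{t+1} on (1, f_hat_t) by OLS.
A = np.column_stack([np.ones(T - 1), f_hat[:-1]])
coef, *_ = np.linalg.lstsq(A, y[1:], rcond=None)
forecast = coef[0] + coef[1] * f_hat[-1]
print(forecast)
```

In a genuine out-of-sample exercise the factor and regression would be re-estimated recursively on expanding windows; the single regression here only illustrates the mechanics.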
  4. By: Marie Kratz; Yen H. Lok; Alexander J McNeil
    Abstract: Under the Fundamental Review of the Trading Book (FRTB), capital charges for the trading book are based on the coherent expected shortfall (ES) risk measure, which shows greater sensitivity to tail risk. In this paper it is argued that backtesting of expected shortfall - or the trading book model from which it is calculated - can be based on a simultaneous multinomial test of value-at-risk (VaR) exceptions at different levels, an idea supported by an approximation of ES in terms of multiple quantiles of a distribution proposed in Emmer et al. (2015). By comparing Pearson, Nass and likelihood-ratio tests (LRTs) for different numbers of VaR levels $N$, it is shown in a series of simulation experiments that multinomial tests with $N\geq 4$ are much more powerful at detecting misspecifications of trading book loss models than standard binomial exception tests corresponding to the case $N=1$. Each test has its merits: Pearson offers simplicity; Nass is robust in its size properties to the choice of $N$; the LRT is very powerful, though slightly oversized in small samples and more computationally burdensome. A traffic-light system for trading book models based on the multinomial test is proposed, and the recommended procedure is applied to a real-data example spanning the 2008 financial crisis.
    Date: 2016–11
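The core of the proposed backtest is a chi-square test on the cell counts implied by VaR exceptions at several levels. A minimal sketch of the Pearson variant, with illustrative VaR levels and losses simulated from the model itself (the specific levels and sample size are assumptions, not the paper's experiments):

```python
import numpy as np

rng = np.random.default_rng(7)

# VaR confidence levels and the model's VaR estimates (standard-normal
# quantiles, i.e. the backtested model assumes N(0,1) losses).
levels = [0.95, 0.975, 0.99, 0.995]
var_est = [1.6449, 1.9600, 2.3263, 2.5758]

# Realized losses; here drawn from the model itself, so the test
# should typically not reject.
n = 2000
losses = rng.normal(size=n)

# Observed cell counts: number of losses in (VaR_j, VaR_{j+1}],
# with cell 0 = "no exception" and the top cell open-ended.
edges = [-np.inf] + var_est + [np.inf]
observed = np.histogram(losses, bins=edges)[0]

# Expected counts under a correctly specified model.
probs = np.diff([0.0] + levels + [1.0])
expected = n * probs

# Pearson chi-square statistic; under H0 it is approximately
# chi-square distributed with len(levels) degrees of freedom.
stat = np.sum((observed - expected) ** 2 / expected)
reject = stat > 9.488  # 5% critical value of chi2 with 4 df
print(stat, reject)
```

The standard binomial backtest is the special case with a single level; using several levels is what lets the test see the shape of the tail, which is why it gains power against ES misspecification.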
  5. By: Claire Océane Chevallier (CREA, Université du Luxembourg); Sarah El Joueidi (CREA, Université du Luxembourg)
    Abstract: This chapter develops a dynamic stochastic general equilibrium model in infinite horizon with a regulated banking sector in which stochastic banking bubbles may arise endogenously. We analyze the conditions under which stochastic bubbles exist and their impact on key macroeconomic variables. We show that when banks face capital requirements based on Value-at-Risk, two different equilibria emerge and can coexist: the bubbleless and the bubbly equilibria. Alternatively, under a regulatory framework where capital requirements are based on credit risk only, as in Basel I, bubbles are explosive and, as a consequence, cannot exist. The stochastic bubbly equilibrium is characterized by positive or negative bubbles depending on the tightness of capital requirements based on Value-at-Risk. We find a maximum value of capital requirements under which bubbles are positive. Below this threshold, the stochastic bubbly equilibrium provides larger welfare than the bubbleless equilibrium. In particular, our results suggest that a change in banking policies might lead to a crisis without external shocks.
    Keywords: Banking bubbles; banking regulation; DSGE; infinitely lived agents; multiple equilibria; Value-at-Risk
    JEL: E2 E44 G01 G20
    Date: 2016
  6. By: Fausto Corradin; Domenico Sartore
    Abstract: The condition of Risk Aversion implies that the Utility Function must be concave. Taking into account the dependence of the Utility Function on wealth, which in turn depends on the return, we consider a return with any type of two-parameter distribution, so that Risk and Return can be defined as generic functions of these two parameters. This paper determines the Differential Conditions on the definitions of Risk and Return that maintain the Risk Aversion property in the 3D space of Risk, Return and the Expected Utility Function. As a particular case, the Standard Deviation, Value at Risk and Expected Shortfall of a Truncated Normal variable with a CRRA Utility Function are analyzed. Only the Standard Deviation respects the Differential Conditions and keeps the Expected Utility Function concave.
    Keywords: Concavity, CRRA Utility Function, Expected Utility Function, Expected Shortfall, Differential Conditions, Quadratic Utility Function, Standard Deviation, Transformation Parametric Functions, Truncated Normal Distribution
    JEL: G11 G14 G23 G24
    Date: 2016
  7. By: Giulia Poce; Giulio Cimini; Andrea Gabrielli; Andrea Zaccaria; Giuditta Baldacci; Marco Polito; Mariangela Rizzo; Silvia Sabatini
    Abstract: In recent years, increasing effort has been put into the development of effective stress tests to quantify the resilience of financial institutions. Here we propose a stress-test methodology for central counterparties based on a network characterization of clearing members, whose links correspond to direct credits and debits. This network constitutes the ground for the propagation of financial distress: equity losses caused by an initial shock with both exogenous and endogenous components reverberate within the network and are amplified through credit and liquidity contagion channels. At the end of the dynamics, we determine the vulnerability of each clearing member, which represents its potential equity loss. We apply the proposed framework to the Fixed Income asset class of CC&G, the central counterparty operating in Italy, whose main cleared securities are Italian Government Bonds. We consider two scenarios: a distributed, plausible initial shock, and a shock corresponding to the cover-2 regulatory requirement (the simultaneous default of the two most exposed clearing members). Although the two situations lead to similar results after an unlimited reverberation of shocks on the network, distress propagates much faster in the latter case, with a large number of additional defaults triggered at early stages of the dynamics. Our results thus show that setting a default fund to cover insolvencies only on a cover-2 basis may not be adequate for taming systemic events, and that only very conservative default funds, such as CC&G's, can absorb the total losses due to shock propagation. Overall, our network-based stress test represents a refined tool for calibrating default fund amounts.
    Date: 2016–11
  8. By: Xavier Warin
    Abstract: We numerically study an Asset Liability Management problem linked to the decommissioning of French nuclear power plants. We link practitioners' risk aversion to an optimization problem. Using different price models, we show that the optimal solution corresponds to a de-risking management strategy similar to a concave strategy, and we propose an effective heuristic to simulate the underlying optimal strategy. Moreover, we show that the strategy is stable with respect to the main parameters involved in the liability problem.
    Date: 2016–11
  9. By: Niu, Cuizhen; Wong, Wing-Keung; Zhu, Lixing
    Abstract: Farinelli and Tibiletti (2008) propose a general risk-reward performance measurement ratio. Due to their simplicity and generality, the F-T ratios have gained much attention. F-T ratios are ratios of average gains to average losses with respect to a target, each raised to some power index. The Omega ratio and the Upside Potential ratio are both special cases of F-T ratios. In this paper, we establish the consistency of F-T ratios with respect to first-order stochastic dominance. Second-order stochastic dominance, in contrast, is shown not to be consistent with the F-T ratios, a point we illustrate with a simple example.
    Keywords: Stochastic Dominance, Upside Potential Ratio, Farinelli and Tibiletti ratio.
    JEL: C00 D81 G10
    Date: 2016–11–11
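The F-T ratio has a simple form: the p-th root of the mean p-th power of gains over the target, divided by the q-th root of the mean q-th power of losses below it. A minimal sketch on hypothetical returns (the sample and target are assumptions, not the paper's example):

```python
import numpy as np

def ft_ratio(x, target, p, q):
    """Farinelli-Tibiletti ratio: upper partial moment of order p over the
    target (raised to 1/p) divided by the lower partial moment of order q
    (raised to 1/q)."""
    gains = np.maximum(x - target, 0.0)
    losses = np.maximum(target - x, 0.0)
    return np.mean(gains ** p) ** (1 / p) / np.mean(losses ** q) ** (1 / q)

# Hypothetical monthly returns and a zero target.
rng = np.random.default_rng(3)
x = rng.normal(0.005, 0.04, size=240)

omega = ft_ratio(x, 0.0, 1, 1)   # p = q = 1: the Omega ratio
upr = ft_ratio(x, 0.0, 1, 2)     # p = 1, q = 2: the Upside Potential ratio
print(omega, upr)
```

The first-order-dominance consistency shown in the paper is easy to see here: shifting the whole sample upward (a first-order improvement) increases every gain and decreases every loss, so the ratio can only rise.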
  10. By: Bahaj, Saleem; Malherbe, Frédéric
    Abstract: We propose a theory of bank behaviour under capital requirements. The sign of the lending response to a change in capital requirement is ambiguous due to the interplay between risk-taking incentives and debt-overhang considerations. Optimal lending is typically U-shaped in the capital requirement. Changes in expected returns on loans shift this relationship: the lower the expected returns, the lower its slope. Using UK regulatory data (1989-2007), we find support for this prediction. It follows that a bank mainly adjusts to a higher capital requirement by cutting lending when expected returns are low, and by raising capital when they are high.
    JEL: G21 G28
    Date: 2016–11
  11. By: Chan, Raymond H.; Clark, Ephraim; Wong, Wing-Keung
    Abstract: This paper presents some interesting new properties of third-order stochastic dominance (TSD) for risk-averse and risk-seeking investors. We show that the means of the assets being compared should be included in the definition of TSD for both investor types. We also derive the conditions on the variance order of two assets with equal means for both investor types, and extend the second-order SD (SSD) reversal result of Levy and Levy (2002) to TSD. We apply our results to analyze the investment behavior of both risk averters and risk seekers in traditional and internet stocks.
    Keywords: Third order stochastic dominance, expected-utility maximization, risk aversion, risk seeking, investment behaviors.
    JEL: C00 G11
    Date: 2016–11–10
  12. By: Phillip, Garry (Cardiff Business School); Xu, Yongdeng (Cardiff Business School)
    Abstract: While a good deal of research on simultaneous equation models has examined the small-sample properties of coefficient estimators, there has been no corresponding interest in the properties of estimators for the associated variances. In this paper we build on Kiviet and Phillips (2000) and explore the biases in variance estimators for the 2SLS and MLIML estimators. The approximations to the bias are then used to develop less biased estimators, whose properties are examined and compared in a number of simulation experiments; a bootstrap estimator is also included and is found to perform especially well. The experiments further consider coverage probabilities/test sizes and test powers of the t-tests, where it is shown that tests based on 2SLS are generally oversized while test sizes based on MLIML are closer to nominal levels. In both cases, test statistics based on the corrected variance estimates generally have higher power than standard procedures.
    Keywords: Simultaneous equation models, 2SLS and Fuller's estimators, Bias corrected variance estimation, Inference and bias corrected variance
    JEL: C12 C13 C26 C30
    Date: 2016–10
  13. By: Kensuke Ishitani
    Abstract: This paper presents a new methodology to compute first-order Greeks for barrier options under the framework of path-dependent payoff functions with European, Lookback, or Asian type and with time-dependent trigger levels. In particular, we develop chain rules for Wiener path integrals between two curves that arise in the computation of first-order Greeks for barrier options. We also illustrate the effectiveness of our method through numerical examples.
    Date: 2016–11
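The paper's chain rules for Wiener path integrals are a specific technique; the standard baseline they compete with is bump-and-revalue Monte Carlo. The sketch below (model and parameters are illustrative assumptions, not the paper's setup) computes a finite-difference Delta for a down-and-out call under geometric Brownian motion, using common random numbers across the two bumped valuations to reduce variance:

```python
import numpy as np

def down_and_out_call_mc(s0, k, barrier, r, sigma, t, n_steps, n_paths, rng):
    """Monte Carlo price of a down-and-out call under geometric Brownian
    motion, monitoring the barrier at each time step."""
    dt = t / n_steps
    z = rng.standard_normal((n_paths, n_steps))
    log_s = np.log(s0) + np.cumsum((r - 0.5 * sigma**2) * dt
                                   + sigma * np.sqrt(dt) * z, axis=1)
    s = np.exp(log_s)
    alive = np.all(s > barrier, axis=1)      # path never breached the barrier
    payoff = alive * np.maximum(s[:, -1] - k, 0.0)
    return np.exp(-r * t) * payoff.mean()

def delta_fd(s0, h=0.5, **kw):
    """Central finite-difference Delta; the same seed is reused for the
    up- and down-bumped prices (common random numbers)."""
    up = down_and_out_call_mc(s0 + h, rng=np.random.default_rng(11), **kw)
    dn = down_and_out_call_mc(s0 - h, rng=np.random.default_rng(11), **kw)
    return (up - dn) / (2 * h)

params = dict(k=100.0, barrier=90.0, r=0.02, sigma=0.2, t=1.0,
              n_steps=50, n_paths=20000)
print(delta_fd(100.0, **params))
```

Finite differences are biased and noisy precisely because the barrier indicator is discontinuous in the initial price, which is the kind of problem the chain-rule approach of the paper is designed to avoid.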
  14. By: Matteo Serri; Guido Caldarelli; Giulio Cimini
    Abstract: Assessing the stability of economic systems is a fundamental research focus in economics that has become increasingly interdisciplinary in the currently troubled economic situation. In particular, much attention has been devoted to the interbank lending market as an important diffusion channel for financial distress during the recent crisis. In this work we study the stability of the interbank market to exogenous shocks using an agent-based network framework. Our model encompasses several ingredients that have been recognized in the literature as pro-cyclical triggers of financial distress in the banking system: credit and liquidity shocks through bilateral exposures, liquidity hoarding due to counterparty creditworthiness deterioration, target leveraging policies and fire-sales spillovers. We exclude, however, the possibility of intervention by central authorities. We implement this framework on a dataset of 183 European banks that were publicly traded between 2004 and 2013. We document the extreme fragility of the interbank lending market up to 2008, when a systemic crisis leads to total depletion of market equity with an increasing speed of market collapse. After the crisis, instead, the system is more resilient to systemic events in terms of residual market equity. However, the speed at which a crisis breaks out reaches a new maximum in 2011 and never returns to the values observed before 2007. Our analysis points to the key role of the speed of the crisis outbreak, which sets the maximum delay for central authorities' intervention to be effective.
    Date: 2016–11
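The credit-contagion channel described in the abstract can be illustrated with a DebtRank-style linear propagation on a bilateral-exposure network. This is a heavily simplified toy with assumed data (random exposures, one shocked bank) and deliberately omits the liquidity hoarding, leverage targeting and fire sales of the paper's full model:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical interbank system: n banks, exposure matrix A where A[i, j]
# is bank i's credit exposure to bank j, and equity vector eq.
n = 6
A = rng.uniform(0, 1, size=(n, n)) * (1 - np.eye(n))
eq = A.sum(axis=1) * 0.25        # assume equity = 25% of interbank assets

# Relative equity loss h in [0, 1]; exogenous shock wipes out bank 0.
h = np.zeros(n)
h[0] = 1.0
dh = h.copy()                    # incremental loss propagated each round

# Linear credit-contagion rounds: each bank writes down its exposures in
# proportion to its counterparties' NEW equity losses, capped at h = 1.
for _ in range(10):
    dh = np.minimum(1.0 - h, (A @ dh) / eq)
    h += dh

# Fraction of total system equity destroyed by the cascade.
systemic_loss = np.sum(h * eq) / np.sum(eq)
print(systemic_loss)
```

Propagating only the incremental loss `dh` (rather than the cumulative `h`) is what keeps the dynamics from double-counting write-downs; with equity this thin relative to exposures, the cascade typically wipes out most of the system, echoing the fragility the paper documents.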

General information on the NEP project can be found at <>. For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject line; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.