NEP: New Economics Papers on Risk Management
Issue of 2016‒02‒29
sixteen papers chosen by
By: Chang, C-L.; Jiménez-Martín, J.A.; McAleer, M.J.; Pérez-Amaral, T.
Abstract: The Basel Committee on Banking Supervision (BCBS) (2013) recently proposed shifting the quantitative risk metrics system from Value-at-Risk (VaR) to Expected Shortfall (ES). The BCBS (2013) noted that “a number of weaknesses have been identified with using VaR for determining regulatory capital requirements, including its inability to capture tail risk” (p. 3). For this reason, the Basel Committee is considering the use of ES, which is a coherent risk measure and has already become common in the insurance industry, though not yet in the banking industry. While ES is mathematically superior to VaR in that it does not suffer from “tail risk” and is a coherent risk measure in being subadditive, its practical implementation and large calculation requirements may pose operational challenges to financial firms. Moreover, previous empirical findings based only on means and standard deviations suggested that VaR and ES were very similar in most practical cases, while ES could be less precise because of its larger variance. In this paper we find that ES is computationally feasible using personal computers and, contrary to previous research, we show that there is a stochastic difference between the 97.5% ES and the 99% VaR. In the Gaussian case they are similar but not equal, while in other cases they can differ substantially: with fat-tailed conditional distributions, the 97.5% ES would, on the one hand, imply higher risk forecasts, while on the other it provides a smaller downside risk than using the 99% VaR. The empirical results in the paper generally support the proposals of the Basel Committee.
Keywords: Stochastic dominance, Value-at-Risk, Expected Shortfall, Optimizing strategy, Basel III Accord
JEL: G32 G11 G17 C53 C22
Date: 2015–05–01
URL: http://d.repec.org/n?u=RePEc:ems:eureir:78155&r=rmg
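As a quick numerical illustration of the Gaussian case discussed in this abstract (a minimal sketch, not code from the paper): for a standard normal loss, the 97.5% ES has the closed form φ(z₀.₉₇₅)/0.025 and sits just above the 99% VaR.

```python
# Closed-form check that 97.5% ES and 99% VaR are close but unequal for a
# standard normal loss, as the abstract notes for the Gaussian case.
from scipy.stats import norm

var_99 = norm.ppf(0.99)             # 99% VaR: the 99th percentile
z = norm.ppf(0.975)
es_975 = norm.pdf(z) / (1 - 0.975)  # normal ES: E[Z | Z > z] = phi(z)/(1 - alpha)

print(f"99% VaR  = {var_99:.4f}")   # ~2.3263
print(f"97.5% ES = {es_975:.4f}")   # ~2.3378
```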
By: Rjiba, Meriem; Tsagris, Michail; Mhalla, Hedi
Abstract: We evaluate the predictive performance of a variety of value-at-risk (VaR) models for a portfolio consisting of five assets. Traditional VaR models such as historical simulation with bootstrap and filtered historical simulation methods are considered. We suggest a new method for estimating Value at Risk: the filtered historical simulation GJR-GARCH method, based on bootstrapping the standardized GJR-GARCH residuals. The predictive performance is evaluated in terms of three criteria (the tests of unconditional coverage, independence, and conditional coverage) and a quadratic loss function. The results show that classical methods are inefficient under moderate departures from normality and that the new method produces the most accurate forecasts of extreme losses.
Keywords: Value at Risk, bootstrap, GARCH
JEL: C15 G17
Date: 2015
URL: http://d.repec.org/n?u=RePEc:pra:mprapa:68842&r=rmg
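The core of the proposed method can be sketched in a few lines (an illustrative reconstruction; the GJR-GARCH parameters below are assumed rather than estimated, and the returns are a stand-in for the paper's five-asset portfolio):

```python
# Filtered historical simulation with a GJR-GARCH(1,1) filter: standardize
# returns by the filtered volatility, bootstrap the standardized residuals,
# and rescale by the one-step-ahead volatility forecast.
import numpy as np

rng = np.random.default_rng(0)
r = rng.standard_t(df=6, size=1000) * 0.01          # stand-in for portfolio returns

omega, alpha, gamma, beta = 1e-6, 0.05, 0.10, 0.88  # assumed GJR-GARCH parameters
sigma2 = np.empty(len(r) + 1)
sigma2[0] = r.var()
for t in range(len(r)):                             # leverage term only for r < 0
    sigma2[t + 1] = omega + (alpha + gamma * (r[t] < 0)) * r[t] ** 2 + beta * sigma2[t]

z = r / np.sqrt(sigma2[:-1])                        # standardized residuals
boot = rng.choice(z, size=100_000) * np.sqrt(sigma2[-1])
print(f"one-day 99% FHS VaR: {-np.quantile(boot, 0.01):.4%}")
```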
By: Murphy, David (Bank of England); Nahai-Williamson, Paul (Bank of England)
Abstract: Central counterparties (CCPs) are a key feature of the post-crisis financial system, and it is vital that they are robust. Indeed, as Paul Tucker said, ‘it is an understatement that it would be a disaster if a clearing house failed’ (Tucker (2011)). The question of how safe CCPs are is therefore an important one. A key regulatory standard for CCPs is ‘cover 2’: this states that systemically important clearing houses must have sufficient financial resources to ‘cover’, or remain robust to, the failure of their two largest members in extreme but plausible circumstances. This is an unusual standard in that it is independent of the number of members a CCP has. It is therefore natural to ask how prudent the cover 2 standard is for different sizes of CCP. This is the question investigated in this paper. We first use a simple model to quantify the likelihood of CCP failure; the model produces stylised results showing how the probability of failure of a CCP that meets the cover 2 standard can be estimated. Second, we present a simple approach to exploring how the distribution of risk among clearing members affects the prudence of the cover 2 standard. Our results give some reassurance: we find that CCPs meeting the cover 2 standard are not highly risky, provided that tail risks are not distributed too uniformly amongst CCP members. They do, however, suggest that CCPs and their supervisors should monitor this distribution as central clearing evolves.
Keywords: financial regulation; central counterparties
JEL: G28
Date: 2014–10–24
URL: http://d.repec.org/n?u=RePEc:boe:finsta:0030&r=rmg
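The flavour of the exercise can be conveyed with a toy Monte Carlo (not the authors' model; the member count, default probability, and loss distribution below are invented for illustration):

```python
# Toy estimate of the failure probability of a CCP holding exactly 'cover 2'
# resources: the prefunded resources equal the two largest member stress
# losses, and failure occurs when realized default losses exceed them.
import numpy as np

rng = np.random.default_rng(1)
n_members, p_default, n_sims = 50, 0.01, 100_000
stress_loss = rng.pareto(3.0, n_members) + 1.0   # heavy-tailed per-member loss

resources = np.sort(stress_loss)[-2:].sum()      # the cover 2 standard
defaults = rng.random((n_sims, n_members)) < p_default
losses = (defaults * stress_loss).sum(axis=1)
print(f"estimated failure probability: {(losses > resources).mean():.5f}")
```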
By: Curti, Filippo (Federal Reserve Bank of Richmond); Migueis, Marco (Board of Governors of the Federal Reserve System (U.S.))
Abstract: Operational risk models, such as the loss distribution approach, frequently use past internal losses to forecast operational loss exposure. However, the ability of past losses to predict exposure, particularly tail exposure, has not been thoroughly examined in the literature. In this paper, we test whether simple metrics derived from past loss experience are predictive of future tail operational loss exposure using quantile regression. We find evidence that past losses are predictive of future exposure, particularly metrics related to loss frequency.
Keywords: Operational risk; quantile regression; tail risk
JEL: G21 G28 G32
Date: 2016–02–03
URL: http://d.repec.org/n?u=RePEc:fip:fedgfe:2016-02&r=rmg
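The paper's basic tool is quantile regression of future losses on past-loss metrics; a minimal sketch on synthetic data (variable names and the data-generating process are illustrative, not the authors') might look like:

```python
# Regress a high quantile (here the 95th percentile) of future operational
# losses on simple metrics of past loss experience, using synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 500
past_freq = rng.poisson(5, n)                  # past loss counts
past_sev = rng.lognormal(0.0, 1.0, n)          # past average severity
future_loss = past_freq * rng.lognormal(0.1 * past_freq, 1.0, n)

df = pd.DataFrame({"future_loss": future_loss,
                   "past_freq": past_freq,
                   "past_sev": past_sev})
fit = smf.quantreg("future_loss ~ past_freq + past_sev", df).fit(q=0.95)
print(fit.params)
```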
By: Aikman, David (Bank of England); Galesic, Mirta (Bank of England); Gigerenzer, Gerd (Bank of England); Kapadia, Sujit (Bank of England); Katsikopoulos, Konstantinos (Bank of England); Kothiyal, Amit (Bank of England); Murphy, Emma (Bank of England); Neumann, Tobias (Bank of England)
Abstract: Distinguishing between risk and uncertainty, this paper draws on the psychological literature on heuristics to consider whether and when simpler approaches may outperform more complex methods for modelling and regulating the financial system. We find that: (i) simple methods can sometimes dominate more complex modelling approaches for calculating banks’ capital requirements, especially if limited data are available for estimating models or the underlying risks are characterised by fat-tailed distributions; (ii) simple indicators often outperformed more complex metrics in predicting individual bank failure during the global financial crisis; and (iii) when combining information from different indicators to predict bank failure, ‘fast-and-frugal’ decision trees can perform comparably to standard, but more information-intensive, regression techniques, while being simpler and easier to communicate.
Keywords: financial regulation; uncertainty
JEL: G28
Date: 2014–05–02
URL: http://d.repec.org/n?u=RePEc:boe:finsta:0028&r=rmg
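For readers unfamiliar with the term, a ‘fast-and-frugal’ decision tree asks one question per level and can classify immediately; a hypothetical bank-failure triage (indicators and thresholds invented for illustration) looks like:

```python
# A fast-and-frugal tree: sequential single-indicator checks with immediate
# exits, as opposed to weighing all indicators at once in a regression.
def bank_triage(leverage_ratio: float, wholesale_funding_share: float,
                core_funding_ratio: float) -> str:
    if leverage_ratio < 0.04:            # thin capital: flag immediately
        return "red flag"
    if wholesale_funding_share > 0.50:   # fragile funding: flag
        return "red flag"
    if core_funding_ratio < 0.30:        # weak core funding: flag
        return "red flag"
    return "green flag"

print(bank_triage(0.05, 0.30, 0.45))     # -> green flag
```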
By: Pablo Koch-Medina; Cosimo Munari; Gregor Svindland
Abstract: We study comonotonicity of regulatory risk measures in terms of the primitives of the theory of risk measures: acceptance sets and eligible assets. We show that comonotonicity cannot be characterized by the properties of the acceptance set alone and heavily depends on the choice of the eligible asset. In fact, in many important cases, comonotonicity is only compatible with risk-free eligible assets. These findings seem to call for a renewed discussion about the meaning and the role of comonotonicity within the theory of regulatory risk measures.
Date: 2016–02
URL: http://d.repec.org/n?u=RePEc:arx:papers:1602.05477&r=rmg
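For reference, the textbook definitions at stake (the paper itself works via acceptance sets and eligible assets rather than these direct formulations):

```latex
% Comonotonicity of two positions and comonotonic additivity of a risk measure.
\[
  X, Y \ \text{comonotone} \iff
  \bigl(X(\omega)-X(\omega')\bigr)\bigl(Y(\omega)-Y(\omega')\bigr) \ge 0
  \quad \text{for all } \omega, \omega' \in \Omega,
\]
\[
  \rho \ \text{comonotonic additive} \iff
  \rho(X+Y) = \rho(X) + \rho(Y) \quad \text{for all comonotone } X, Y.
\]
```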
By: Oliver Bettis; Simon Dietz; Nick Silver
Abstract: How large a risk is society prepared to run with the climate system? One perspective is to compare the risk the world is running with the climate system, defined in terms of the risk of ‘climate ruin’, with the comparable risk that financial institutions, in particular insurance companies, are prepared or allowed to run with their own financial ruin. We conclude that, given greenhouse gas emissions today and in the future, the world is running a higher risk with the climate system than such institutions would usually run with their own solvency.
Date: 2015–11
URL: http://d.repec.org/n?u=RePEc:lsg:lsgwps:wp217&r=rmg
By: Bormann, Carsten; Schaumburg, Julia; Schienle, Melanie
Abstract: In practice, multivariate dependencies between extreme risks are often only assessed in a pairwise way. We propose a test for detecting situations when such pairwise measures are inadequate and give incomplete results. This occurs when a significant portion of the multivariate dependence structure in the tails is of higher dimension than two. Our test statistic is based on a decomposition of the stable tail dependence function describing multivariate tail dependence. The asymptotic properties of the test are provided, and a bootstrap-based finite-sample version of the test is proposed. A simulation study documents good size and power properties of the test, including settings with time-series components and factor models. In an application to stock indices, pairwise tail models seem appropriate for global markets in non-crisis times, while the test finds them inadmissible for the tightly interconnected European market. From 2007/08 on, however, higher-order dependencies generally increase and require a multivariate tail model in all cases.
Keywords: decomposition of multivariate tail dependence, multivariate extreme values, stable tail dependence function, extreme dependence modeling
JEL: C01 C46 C58
Date: 2016
URL: http://d.repec.org/n?u=RePEc:zbw:kitwps:80&r=rmg
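For context, the stable tail dependence function underlying the test statistic is the standard object from multivariate extreme value theory (the notation below is the usual one, not necessarily the paper's):

```latex
% The stable tail dependence function (STDF) for X = (X_1, ..., X_d) with
% margins F_j; the test asks whether the d-variate STDF carries dependence
% beyond what its bivariate margins capture.
\[
  \ell(x_1,\dots,x_d)
  = \lim_{t \downarrow 0} \frac{1}{t}\,
    \mathbb{P}\!\left( \bigcup_{j=1}^{d} \bigl\{\, 1 - F_j(X_j) \le t\,x_j \,\bigr\} \right),
  \qquad x_1,\dots,x_d \ge 0.
\]
```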
By: Christoph Aymanns; Fabio Caccioli; J. Doyne Farmer; Vincent W.C. Tan
Abstract: Effective risk control must make a tradeoff between the microprudential risk of exogenous shocks to individual institutions and the macroprudential risks caused by their systemic interactions. We investigate a simple dynamical model for understanding this tradeoff, consisting of a bank with a leverage target and an unleveraged fundamental investor subject to exogenous noise with clustered volatility. The parameter space has three regions: (i) a stable region, where the system always reaches a fixed-point equilibrium; (ii) a locally unstable region, characterized by cycles and chaotic behavior; and (iii) a globally unstable region. A crude calibration of parameters to data puts the model in region (ii). In this region there is a slowly building price bubble, resembling a “Great Moderation”, followed by a crash, with a period of approximately 10-15 years, which we dub the Basel leverage cycle. We propose a criterion for rating macroprudential policies based on their ability to minimize risk for a given average leverage. We construct a one-parameter family of leverage policies that allows us to vary from the procyclical policies of Basel II or III, in which leverage decreases when volatility increases, to countercyclical policies in which leverage increases when volatility increases. We find that the best policy depends critically on three parameters: the average leverage used by the bank, the relative size of the bank and the fundamentalist, and the amplitude of the exogenous noise. Basel II is optimal when the exogenous noise is high, the bank is small and leverage is low; in the opposite limit, where the bank is large or leverage is high, the optimal policy is closer to constant leverage. We also find that systemic risk can be dramatically decreased by lowering the leverage target adjustment speed of the banks.
Keywords: Financial stability; capital regulation; systemic risk
JEL: G11 G20
Date: 2015–07
URL: http://d.repec.org/n?u=RePEc:ehl:lserod:65089&r=rmg
By: Zura Kakushadze; Willie Yu
Abstract: We give a complete algorithm and source code for constructing general multifactor risk models (for equities) via any combination of style factors, principal components (betas) and/or industry factors. For short horizons we employ the Russian-doll risk model construction to obtain a nonsingular factor covariance matrix. This generalizes the heterotic risk model construction to include arbitrary non-industry risk factors as well as industry risk factors with generic "weights". The aim of sharing our proprietary know-how with the investment community is to encourage organic risk model building. The presentation is intended to be essentially self-contained and pedagogical. So, stop wasting money and complaining, start building risk models and enjoy!
Date: 2016–02
URL: http://d.repec.org/n?u=RePEc:arx:papers:1602.04902&r=rmg
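The generic covariance assembly behind any such multifactor model is compact (a sketch of the algebra only; the paper's Russian-doll construction is a specific nested recipe for keeping the factor covariance matrix nonsingular at short horizons):

```python
# Multifactor equity risk model: asset covariance = loadings @ factor
# covariance @ loadings' + diagonal specific risk. All inputs are random
# placeholders for estimated style/industry loadings and factor returns.
import numpy as np

rng = np.random.default_rng(3)
n_stocks, n_factors = 100, 12

X = rng.normal(size=(n_stocks, n_factors))        # factor loadings
F = np.cov(rng.normal(size=(n_factors, 250)))     # factor covariance
D = np.diag(rng.uniform(0.01, 0.04, n_stocks))    # specific variances

Sigma = X @ F @ X.T + D
w = np.full(n_stocks, 1.0 / n_stocks)             # equal-weight portfolio
print(f"portfolio volatility: {np.sqrt(w @ Sigma @ w):.4f}")
```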
By: Anatolii A. Puhalskii
Abstract: We obtain a lower asymptotic bound on the decay rate of the probability of a portfolio's underperformance against a benchmark over a large time horizon. It is assumed that the prices of the securities are governed by geometric Brownian motions with the coefficients depending on an economic factor, possibly nonlinearly. The bound is tight so that there exists a portfolio that optimises the decay rate. That portfolio is also risk-sensitive optimal.
Date: 2016–02
URL: http://d.repec.org/n?u=RePEc:arx:papers:1602.02192&r=rmg
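In generic large-deviations notation, the object being bounded is of the form below (an illustrative rendering only; in the paper's setup the drift and volatility coefficients are additionally driven by an economic factor):

```latex
% Decay rate of the probability of underperforming a benchmark over horizon T;
% the paper's lower bound on this rate is tight for an optimizing portfolio.
\[
  \limsup_{T \to \infty} \frac{1}{T}\,
  \log \mathbb{P}\!\left( \log \frac{V_T^{\pi}}{V_T^{\mathrm{bench}}} \le 0 \right)
  \;\le\; -I(\pi).
\]
```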
By: Spiros Bougheas (School of Economics, University of Nottingham); Alan Kirman (Faculté de Droit et de Sciences Politiques, Aix-Marseille Université)
Abstract: The paper argues that systemic risk must be taken into account when designing optimal bankruptcy procedures in general, and priority rules in particular. Allowing for endogenous formation of links in the interbank market, we show that the optimal policy depends on the distribution of shocks and the severity of fire sales.
Keywords: Banks; Priority rules; Systemic Risk
JEL: G21 G28
Date: 2016–01
URL: http://d.repec.org/n?u=RePEc:cst:wpaper:201602&r=rmg
By: Sucarrat, Genaro; Grønneberg, Steffen
Abstract: The probability of an observed financial return being equal to zero is not necessarily zero. This can be due to price discreteness or rounding error, liquidity issues (e.g. low trading volume), market closures, data issues (e.g. data imputation due to missing values), characteristics specific to the market, and so on. Moreover, the zero probability may change and depend on market conditions. In standard models of return volatility, however, e.g. ARCH, SV and continuous-time models, the zero probability is zero, constant or both. We propose a new class of models that allows for a time-varying zero probability, and which can be combined with standard models of return volatility: the standard models are nested and obtained as special cases when the zero probability is constant and equal to zero. Another attraction is that the return properties of the new class (e.g. volatility, skewness, kurtosis, Value-at-Risk, Expected Shortfall) are obtained as functions of the underlying volatility model. The new class allows for autoregressive conditional dynamics in both the zero probability and volatility specifications, and for additional covariates. Simulations show that parameter and risk estimates are biased if zeros are not appropriately handled, and an application illustrates that risk estimates can be substantially biased in practice if the time-varying zero probability is not accommodated.
Keywords: Financial return, volatility, zero-inflated return, GARCH, log-GARCH, ACL
JEL: C01 C22 C32 C51 C52 C58
Date: 2016–01–17
URL: http://d.repec.org/n?u=RePEc:pra:mprapa:68931&r=rmg
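A simulation sketch of the idea (the GARCH(1,1) volatility and the logistic zero-probability dynamics below are illustrative choices, not the paper's exact specification):

```python
# Zero-inflated returns: a Bernoulli 'nonzero' indicator with time-varying
# zero probability multiplies an otherwise standard GARCH(1,1) return.
import numpy as np

rng = np.random.default_rng(4)
T = 2000
omega, alpha, beta = 1e-6, 0.08, 0.90  # assumed GARCH parameters
a0, a1, a2 = -3.0, 0.9, 1.0            # assumed zero-probability dynamics

r = np.zeros(T)
sigma2 = omega / (1 - alpha - beta)    # start at the unconditional variance
eta = a0                               # log-odds of a zero return
for t in range(1, T):
    sigma2 = omega + alpha * r[t - 1] ** 2 + beta * sigma2
    pi_zero = 1.0 / (1.0 + np.exp(-eta))
    if rng.random() > pi_zero:         # a nonzero return is observed
        r[t] = np.sqrt(sigma2) * rng.standard_normal()
    eta = a0 * (1 - a1) + a1 * eta + a2 * (r[t] == 0.0)  # zeros beget zeros

print(f"share of exact zeros: {(r == 0.0).mean():.2%}")
```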
By: Giovanni Ferri (LUMSA University); Doris Neuberger (University of Rostock)
Abstract: We claim that we currently live in a banking regulatory bubble. We review how (i) banking intermediation theory hinges on dealing with borrower-lender information asymmetry, while (ii) complete information is, by contrast, the keystone of finance theory. Next, we document how finance theory prevailed over banking intermediation theory in shaping banking regulation: this appalling contradiction is the true culprit behind lower credit standards, mounting systemic risk in banking, and macroeconomic debt overhang. Consequently, we discuss actions that, by restoring the consistency of banking regulation with the theory of banking intermediation, would make banking sounder.
Keywords: Asymmetric Information; Relationship Lending vs. Transactional Lending; Efficient Markets Hypothesis; Banking Regulation Inconsistencies; Basel II.
JEL: G01 G14 G21 G28
Date: 2014–05
URL: http://d.repec.org/n?u=RePEc:lsa:wpaper:wpc01&r=rmg
By: Uddin, Md Akther
Abstract: The Islamic banking industry has been growing rapidly for the last three decades. As risk is inherent in the banking business, it is necessary to develop a comprehensive risk management framework and process. In this paper, a humble attempt is made to study and analyze the risk management practices of Islami Bank Bangladesh Limited (IBBL), one of the leading Islamic banks in Bangladesh. Annual reports of IBBL and 7 other full-fledged Islamic banks, publications and guidelines on risk management from Bangladesh Bank (the central bank of Bangladesh), and secondary data collected from various published papers and Datastream are used to analyze and support the findings of the study. It is found that the bank has developed an extensive risk management framework and process. The bank is found to be a moderate-to-low risk taker in terms of investment exposures. The bank has been generating sustainable earnings from its depositors' funds but a lower return on shareholders' funds, due to the lack of shari'ah-compliant financial instruments in Bangladesh. Even though the bank mobilizes funds on a profit-and-loss-sharing principle, the study indicates that all risks are actually borne by the bank; e.g., financing impairment is charged to shareholders only, as in conventional banks. Physical assets constitute a significant portion of the bank's balance sheet, and evidently these assets are funded by shareholders' funds. Currently, the bank does not use any derivative instruments, as they are not available in the financial market of Bangladesh. Income gap analysis shows that the bank's rate-sensitive assets exceed its rate-sensitive liabilities, which is unfavorable in a decreasing interest-rate environment; consequently, the bank's profitability declined over the year. In spite of that, the spread between funding cost and financing income is above 5%. The inclination towards murabaha financing is evident from the analysis, and 72% of the bank's exposure lies in risk-weight categories of 50% or below, which is considered one of IBBL's significant strengths. Moreover, displaced commercial risk and excess liquidity risk tend to affect the bank's efficiency and profitability significantly.
Keywords: risk management, investment risk, liquidity risk, Islamic bank, Islami Bank Bangladesh Ltd
JEL: A1 A10 G21 P51
Date: 2015–12–18
URL: http://d.repec.org/n?u=RePEc:pra:mprapa:68781&r=rmg
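The income gap logic referred to in the abstract is simple arithmetic (the figures below are invented, not IBBL's):

```python
# Income gap analysis: with rate-sensitive assets (RSA) above rate-sensitive
# liabilities (RSL), net income falls when rates fall, as described above.
rsa, rsl = 800.0, 600.0   # hypothetical balances
gap = rsa - rsl           # positive gap: asset-sensitive
for d_rate in (0.01, -0.01):
    print(f"rate move {d_rate:+.0%}: change in net income = {gap * d_rate:+.2f}")
```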
By: Francisco Ibáñez
Abstract: The dynamic version of the Nelson-Siegel model has shown useful applications in the investment management industry, ranging from forecasting the yield curve to portfolio risk management. Because of the complexity of estimating its parameters, some practitioners are unable to benefit from the model. This note presents two approximations for estimating the time series of the model's factors. The first has a more technical aim, focusing on the construction of a representative base to work with, and uses a genetic algorithm to tackle the optimization problem. The second has a practitioner spirit, focusing on ease of implementation. The results show that both methodologies fit the U.S. Treasury bond market well.
Date: 2016–01
URL: http://d.repec.org/n?u=RePEc:chb:bcchwp:774&r=rmg
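The practitioner-style approximation is in the spirit of the well-known two-step shortcut for the dynamic Nelson-Siegel model: fix the decay parameter and recover the three factors by cross-sectional least squares at each date (a sketch under that assumption, using the Diebold-Li monthly decay value; the yields are made up, and this is not necessarily the author's exact procedure):

```python
# Dynamic Nelson-Siegel factors by OLS with a fixed decay parameter lambda:
# level, slope and curvature solve a linear fit at each observation date.
import numpy as np

lam = 0.0609                                    # Diebold-Li decay (monthly)
tau = np.array([3.0, 12.0, 24.0, 60.0, 120.0])  # maturities in months
x = lam * tau
L = np.column_stack([np.ones_like(tau),         # level loading
                     (1 - np.exp(-x)) / x,      # slope loading
                     (1 - np.exp(-x)) / x - np.exp(-x)])  # curvature loading

yields = np.array([0.5, 0.8, 1.2, 1.9, 2.6])    # one illustrative cross-section (%)
level, slope, curv = np.linalg.lstsq(L, yields, rcond=None)[0]
print(f"level={level:.3f}, slope={slope:.3f}, curvature={curv:.3f}")
```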