nep-rmg New Economics Papers
on Risk Management
Issue of 2011‒11‒21
eleven papers chosen by
Stan Miles
Thompson Rivers University

  1. Financial Risk Measurement for Financial Risk Management By Torben G. Andersen; Tim Bollerslev; Peter F. Christoffersen; Francis X. Diebold
  2. Viewing Risk Measures as Information By Dominique Guegan; Wayne Tarrant
  3. A mathematical resurgence of risk management : an extreme modeling of expert opinions By Dominique Guegan; Bertrand Hassani
  4. Operational risk : A Basel II++ step before Basel III By Dominique Guegan; Bertrand Hassani
  5. Longevity hedge effectiveness: a decomposition By Cairns, Andrew; Dowd, Kevin; Blake, David; Coughlan, Guy
  6. The financial stress index: identification of systemic risk conditions By Mikhail V. Oet; Ryan Eiben; Timothy Bianco; Dieter Gramlich; Stephen J. Ong
  7. Bayesian semiparametric GARCH models By Xibin Zhang; Maxwell L. King
  8. Historical financial analogies of the current crisis By Julián Andrada-Félix; Fernando Fernández-Rodríguez; Simón Sosvilla-Rivero
  9. A new method to estimate the risk of financial intermediaries By Delis, Manthos D; Tsionas, Efthymios
  10. Lessons from International Central Counterparties: Benchmarking and Analysis By Alexandre Lazarow
  11. Intended and Unintended Results of the Proposed Volcker Rule By Skold, Alida S.

  1. By: Torben G. Andersen (Northwestern University and CREATES); Tim Bollerslev (Duke University and CREATES); Peter F. Christoffersen (University of Toronto and CREATES); Francis X. Diebold (University of Pennsylvania)
    Abstract: Current practice largely follows restrictive approaches to market risk measurement, such as historical simulation or RiskMetrics. In contrast, we propose flexible methods that exploit recent developments in financial econometrics and are likely to produce more accurate risk assessments, treating both portfolio-level and asset-level analysis. Asset-level analysis is particularly challenging because the demands of real-world risk management in financial institutions - in particular, real-time risk tracking in very high-dimensional situations - impose strict limits on model complexity. Hence we stress powerful yet parsimonious models that are easily estimated. In addition, we emphasize the need for deeper understanding of the links between market risk and macroeconomic fundamentals, focusing primarily on links among equity return volatilities, real growth, and real growth volatilities. Throughout, we strive not only to deepen our scientific understanding of market risk, but also to cross-fertilize the academic and practitioner communities, promoting improved market risk measurement technologies that draw on the best of both.
    Keywords: Risk measurement, risk management, volatility, conditionality, dimensionality reduction, high-frequency data, macro fundamentals
    JEL: C22 C32 G32
    Date: 2011–11–02
    URL: http://d.repec.org/n?u=RePEc:aah:create:2011-37&r=rmg
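    [Editor's illustration] The portfolio-level methods surveyed in item 1 build conditional risk measures from high-frequency volatility estimates. A minimal Python sketch of the idea, using made-up intraday returns and a plain Gaussian quantile for the VaR; the paper itself considers much richer models, so treat the data and the distributional assumption here as placeholders.

      import numpy as np
      from scipy.stats import norm

      def realized_variance(intraday_returns):
          """Daily realized variance: sum of squared intraday returns."""
          return np.sum(np.asarray(intraday_returns) ** 2)

      # Hypothetical example: 78 five-minute returns for one trading day.
      rng = np.random.default_rng(0)
      r_intraday = rng.normal(0.0, 0.001, size=78)

      rv = realized_variance(r_intraday)   # realized variance
      sigma = np.sqrt(rv)                  # realized volatility (daily)

      # One-day 99% Value-at-Risk under a conditional Gaussian assumption.
      var_99 = -norm.ppf(0.01) * sigma
      print(f"realized vol = {sigma:.4%}, 1-day 99% VaR = {var_99:.4%}")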
  2. By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I); Wayne Tarrant (Wingate University - Department of Mathematics)
    Abstract: Regulation and risk management in banks depend on underlying risk measures. In general this is the only purpose that is seen for risk measures. In this paper, we suggest that the reporting of risk measures can be used to determine the loss distribution function for a financial entity. We demonstrate that a lack of sufficient information can lead to ambiguous risk situations. We give examples, showing the need for the reporting of multiple risk measures in order to determine a bank's loss distribution. We conclude by suggesting a regulatory requirement of multiple risk measures being reported by banks, giving specific recommendations.
    Keywords: Risk measure, Value at Risk, bank capital, Basel II accord.
    Date: 2011–08
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-00639489&r=rmg
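    [Editor's illustration] Item 2 argues that reporting several risk measures (e.g. VaR at multiple levels plus expected shortfall) reveals far more of a bank's loss distribution than a single number. A minimal Python sketch on simulated losses; the heavy-tailed sample and the reporting levels are illustrative assumptions, not the paper's data.

      import numpy as np

      rng = np.random.default_rng(1)
      losses = rng.standard_t(df=4, size=100_000)  # hypothetical heavy-tailed losses

      def var(losses, level):
          """Empirical Value-at-Risk: the `level` quantile of the loss distribution."""
          return np.quantile(losses, level)

      def expected_shortfall(losses, level):
          """Average loss beyond the VaR at `level`."""
          v = var(losses, level)
          return losses[losses >= v].mean()

      # Several measures together constrain the tail far more than one VaR alone.
      for level in (0.95, 0.99, 0.995):
          print(f"VaR {level:.1%}: {var(losses, level):6.3f}   "
                f"ES {level:.1%}: {expected_shortfall(losses, level):6.3f}")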
  3. By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I); Bertrand Hassani (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, BPCE - BPCE)
    Abstract: The Operational Risk Advanced Measurement Approach requires financial institutions to use scenarios to model these risks and to evaluate the pertaining capital charges. Considering that a banking group is composed of numerous entities (branches and subsidiaries), and that each one of them is represented by an Operational Risk Manager (ORM), we propose a novel scenario approach based on ORM expertise to collect information and create new data sets focusing on large losses, and the use of Extreme Value Theory (EVT) to evaluate the corresponding capital allocation. In this paper, we highlight the importance of combining the a priori knowledge of the experts with a posteriori backtesting based on collected incidents.
    Keywords: Basel II, operational risks, EVT, AMA, expert, Value-at-Risk, expected shortfall.
    Date: 2011–09
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-00639666&r=rmg
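    [Editor's illustration] Item 3 combines expert-generated large-loss scenarios with Extreme Value Theory. A minimal peaks-over-threshold sketch in Python, with simulated losses standing in for the expert data; the threshold choice and quantile level are illustrative only, and the formula assumes a non-zero shape parameter.

      import numpy as np
      from scipy.stats import genpareto

      rng = np.random.default_rng(2)
      losses = rng.lognormal(mean=10, sigma=2, size=5_000)  # hypothetical severity data

      u = np.quantile(losses, 0.90)          # threshold: 90th percentile (illustrative)
      excesses = losses[losses > u] - u

      # Fit a generalized Pareto distribution to the excesses (location fixed at 0).
      xi, _, beta = genpareto.fit(excesses, floc=0)

      # Peaks-over-threshold quantile at level q (e.g. 99.9% for operational risk capital).
      q = 0.999
      p_u = (losses > u).mean()              # probability of exceeding the threshold
      var_q = u + (beta / xi) * (((1 - q) / p_u) ** (-xi) - 1)
      print(f"xi = {xi:.3f}, beta = {beta:,.0f}, {q:.1%} quantile = {var_q:,.0f}")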
  4. By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I); Bertrand Hassani (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, BPCE - BPCE)
    Abstract: Following the Basel Committee on Banking Supervision, operational risk quantification is based on the Basel matrix, which enables the sorting of incidents. In this paper, we analyze these incidents in depth and propose strategies for carrying out the supervisory guidelines proposed by the regulators. The objectives are numerous. On the one hand, banks need to provide a univariate capital charge for each cell of the Basel matrix. On the other hand, banks also need to provide a global capital charge corresponding to the whole matrix, taking dependences into account. We provide a solution to do so. Finally, we draw regulators' and managers' attention to two crucial points: the granularity and the risk measure.
    Keywords: Basel II, operational risks, EVT, copula.
    Date: 2011–09
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-00639484&r=rmg
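    [Editor's illustration] Item 4 requires a univariate capital charge per Basel-matrix cell plus a global charge that respects dependence between cells. A minimal Python sketch of copula-based aggregation across three hypothetical cells, using a Gaussian copula and lognormal cell losses purely for illustration; the paper itself fits copulas and severity distributions to operational-loss data.

      import numpy as np
      from scipy.stats import norm, lognorm

      rng = np.random.default_rng(3)

      # Hypothetical annual-loss distributions for three Basel-matrix cells.
      cells = [lognorm(s=1.0, scale=2e6), lognorm(s=1.5, scale=5e5), lognorm(s=0.8, scale=1e6)]

      # Dependence between cells via a Gaussian copula with a common correlation.
      rho = 0.4
      corr = np.full((3, 3), rho) + (1 - rho) * np.eye(3)
      z = rng.multivariate_normal(np.zeros(3), corr, size=200_000)
      u = norm.cdf(z)                                   # copula sample in [0,1]^3
      losses = np.column_stack([c.ppf(u[:, i]) for i, c in enumerate(cells)])

      cell_charges = [c.ppf(0.999) for c in cells]      # univariate 99.9% charges
      global_charge = np.quantile(losses.sum(axis=1), 0.999)
      print(f"sum of cell charges = {sum(cell_charges):,.0f}")
      print(f"global charge with dependence = {global_charge:,.0f}")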
  5. By: Cairns, Andrew; Dowd, Kevin; Blake, David; Coughlan, Guy
    Abstract: We use a case study of a pension plan wishing to hedge the longevity risk in its pension liabilities at a future date. The plan has the choice of using either a customised hedge or an index hedge, with the degree of hedge effectiveness being closely related to the correlation between the value of the hedge and the value of the pension liability. The key contribution of this paper is to show how correlation and, therefore, hedge effectiveness can be broken down into contributions from a number of distinct types of risk factor. Our decomposition of the correlation indicates that population basis risk has a significant influence on the correlation. But recalibration risk as well as the length of the recalibration window are also important, as is cohort effect uncertainty. Having accounted for recalibration risk, parameter uncertainty and Poisson risk have only a marginal impact on hedge effectiveness. Our case study shows that longevity risk can be substantially hedged using index hedges as an alternative to customised longevity hedges and that, as a consequence, index longevity hedges - in conjunction with the other components of an ALM strategy - can provide an effective and lower-cost alternative both to a full buy-out of pension liabilities and to a strategy using customised longevity hedges.
    Keywords: Hedge Effectiveness; Correlation; Mark-to-Model; Valuation Model; Simulation; Value Hedging; Longevity Risk; Stochastic Mortality; Population Basis Risk; Recalibration Risk
    JEL: G23 J11
    Date: 2011–05
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:34236&r=rmg
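    [Editor's illustration] Item 5 ties hedge effectiveness directly to the correlation between the liability value and the hedge value. A minimal Python sketch of that relationship on simulated values: with the variance-minimizing hedge ratio, effectiveness equals the squared correlation. The simulated liability and hedge values are purely illustrative.

      import numpy as np

      rng = np.random.default_rng(4)

      # Hypothetical simulated values of the pension liability L and the index hedge H.
      common = rng.normal(size=10_000)
      L = 100 + 10 * common + 4 * rng.normal(size=10_000)   # liability value
      H = 50 + 5 * common + 3 * rng.normal(size=10_000)     # hedge instrument value

      rho = np.corrcoef(L, H)[0, 1]
      h_star = rho * L.std() / H.std()                      # variance-minimizing hedge ratio

      effectiveness = 1 - np.var(L - h_star * H) / np.var(L)
      print(f"correlation = {rho:.3f}, hedge effectiveness = {effectiveness:.3f} (= rho^2)")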
  6. By: Mikhail V. Oet; Ryan Eiben; Timothy Bianco; Dieter Gramlich; Stephen J. Ong
    Abstract: This paper develops a financial stress index for the United States, the Cleveland Financial Stress Index (CFSI), which provides a continuous signal of financial stress and broad coverage of the areas that could indicate it. The index is based on daily public-market data collected from four sectors of the financial markets—the credit, foreign exchange, equity, and interbank markets. A dynamic weighting method is employed to capture changes in the relative importance of these four sectors as they occur. In addition, the design of the index allows the origin of the stress to be identified. We compare the CFSI to alternative indexes, using a detailed benchmarking methodology, and show how the CFSI can be applied to systemic stress monitoring and early warning system design. To that end, we investigate alternative stress-signaling thresholds and frequency regimes and then establish optimal frequencies for filtering out market noise and idiosyncratic episodes. Finally, we quantify a powerful CFSI-based rating system that assigns a probability of systemic stress to ranges of CFSI outcomes.
    Keywords: Systemic risk ; Risk assessment
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:fip:fedcwp:1130&r=rmg
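    [Editor's illustration] Item 6 builds the CFSI as a dynamically weighted combination of stress readings from four markets. A stripped-down Python sketch of that construction with simulated sub-indices; the actual CFSI components and weighting data are described in the paper, so both the series and the rolling-variability weights below are stand-in assumptions.

      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(5)
      dates = pd.date_range("2007-01-01", periods=500, freq="B")
      sectors = ["credit", "fx", "equity", "interbank"]

      # Hypothetical daily stress readings for the four market sectors.
      raw = pd.DataFrame(rng.normal(size=(500, 4)).cumsum(axis=0), index=dates, columns=sectors)

      # Standardize each sector series against its own history (z-scores).
      z = (raw - raw.mean()) / raw.std()

      # Hypothetical time-varying weights: a rolling measure of relative variability
      # stands in for the real (e.g. market-size-based) weighting data.
      w = raw.rolling(60, min_periods=20).std()
      w = w.div(w.sum(axis=1), axis=0).fillna(0.25)

      cfsi_like = (z * w).sum(axis=1)        # dynamically weighted stress index
      print(cfsi_like.tail())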
  7. By: Xibin Zhang; Maxwell L. King
    Abstract: This paper aims to investigate a Bayesian sampling approach to parameter estimation in the semiparametric GARCH model with an unknown conditional error density, which we approximate by a mixture of Gaussian densities centered at individual errors and scaled by a common standard deviation. This mixture density has the form of a kernel density estimator of the errors with its bandwidth being the standard deviation. The proposed investigation is motivated by the lack of robustness in GARCH models with any parametric assumption of the error density for the purpose of error-density based inference such as value-at-risk (VaR) estimation. The contribution of the paper is to construct the likelihood and posterior of model and bandwidth parameters under the proposed mixture error density, and to forecast the one-step out-of-sample density of asset returns. The resulting VaR measure therefore would be distribution-free. Applying the semiparametric GARCH(1,1) model to daily stock-index returns in eight stock markets, we find that this semiparametric GARCH model is favoured over the GARCH(1,1) model with Student t errors for five indices, and that the GARCH model underestimates VaR compared to its semiparametric counterpart. We also investigate the use and benefit of localized bandwidths in the proposed mixture density of the errors.
    Keywords: Bayes factors, kernel-form error density, localized bandwidths, Markov chain Monte Carlo, value-at-risk
    JEL: C11 C14 C15 G15
    Date: 2011–11–03
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2011-24&r=rmg
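    [Editor's illustration] Item 7's kernel-form error density is a mixture of Gaussians centered at the individual standardized errors with a common bandwidth. A minimal Python sketch of evaluating that density and the implied GARCH(1,1) pseudo-likelihood for fixed parameters; the returns, parameter values and bandwidth are placeholders, and the Bayesian sampling of the posterior is omitted.

      import numpy as np
      from scipy.stats import norm

      def kernel_error_density(e, errors, h):
          """Mixture of Gaussians centered at each error, common bandwidth h."""
          return norm.pdf((e - errors[:, None]) / h).mean(axis=0) / h

      def garch11_loglik(returns, omega, alpha, beta, h):
          """Pseudo log-likelihood of a GARCH(1,1) with the kernel-form error density."""
          n = len(returns)
          sigma2 = np.empty(n)
          sigma2[0] = returns.var()
          for t in range(1, n):
              sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
          eps = returns / np.sqrt(sigma2)              # standardized errors
          dens = kernel_error_density(eps, eps, h)     # kernel-form density at each error
          return np.sum(np.log(dens) - 0.5 * np.log(sigma2))

      rng = np.random.default_rng(6)
      r = rng.standard_t(df=5, size=1_000) * 0.01      # hypothetical daily returns
      print(garch11_loglik(r, omega=1e-6, alpha=0.05, beta=0.9, h=0.5))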
  8. By: Julián Andrada-Félix (Universidad de Las Palmas de Gran Canaria, Spain); Fernando Fernández-Rodríguez (Universidad de Las Palmas de Gran Canaria, Spain); Simón Sosvilla-Rivero (Universidad Complutense de Madrid, Spain)
    Abstract: This paper tries to shed light on the historical analogies of the current crisis. To that end we compare the current sample distribution of Dow Jones Industrial Average Index returns for a 769-day period (from 15 September 2008, the Lehman Brothers bankruptcy, to September 2011) with all historical sample distributions of returns computed with a moving window of 769 days over the 2 January 1900 to 12 September 2008 period. Using Kolmogorov-Smirnov and χ² homogeneity tests, which have the null hypothesis of equal distributions, we find that the stock market returns distribution during the current crisis is similar to those of several past periods of severe financial crises that evolved into intense recessions, with the sub-sample from 28 May 1935 to 17 June 1938 being the most analogous episode to the current situation. Furthermore, when applying the procedure proposed by Diebold, Gunther and Tay (1998) for comparing densities of sub-samples, we obtain additional support for our findings and discover a period from 10 September 1930 to 13 October 1933 in which the severity of the crisis exceeds that of the current situation, with sharper tail events. Finally, when comparing historical market risk with the current risk, we observe that the current market risk has only been exceeded at the beginning of the Great Depression.
    Keywords: Financial crisis, Great Recession, Great Depression
    JEL: E32 G15
    Date: 2011–11
    URL: http://d.repec.org/n?u=RePEc:aee:wpaper:1108&r=rmg
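    [Editor's illustration] Item 8 compares the crisis-window return distribution against every historical 769-day window using two-sample homogeneity tests. A minimal Python sketch of the rolling Kolmogorov-Smirnov comparison on simulated returns; with the paper's data, the two series would be segments of the Dow Jones index rather than the synthetic samples below.

      import numpy as np
      from scipy.stats import ks_2samp

      rng = np.random.default_rng(7)
      history = rng.standard_t(df=4, size=30_000) * 0.01   # hypothetical 1900-2008 returns
      crisis = rng.standard_t(df=3, size=769) * 0.02       # hypothetical crisis-window returns

      window = 769
      results = []
      for start in range(0, len(history) - window + 1, window // 4):
          segment = history[start:start + window]
          res = ks_2samp(crisis, segment)
          results.append((start, res.statistic, res.pvalue))

      # The most "analogous" historical window is the one whose distribution is hardest
      # to distinguish from the crisis window (largest p-value / smallest statistic).
      best = max(results, key=lambda r: r[2])
      print(f"closest window starts at observation {best[0]}, KS stat {best[1]:.3f}, p = {best[2]:.3f}")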
  9. By: Delis, Manthos D; Tsionas, Efthymios
    Abstract: In this paper we reconsider the formal estimation of the risk of financial intermediaries. Risk is modeled as the variability of the profit function of a representative intermediary, here a bank, as formally considered in finance theory. In turn, banking theory suggests that risk is determined simultaneously with profits and other bank- and industry-level characteristics that cannot be considered predetermined when the profit-maximizing decisions of financial institutions are made. Thus, risk is endogenous. We estimate the model on a panel of US banks spanning the period 1985q1-2010q2. The findings suggest that risk was fairly stable up to 2001 and then increased quickly up to 2007. Indices of bank risk commonly used in the literature do not capture this trend and/or the scale of the increase.
    Keywords: Risk of financial intermediaries; Endogenous risk; Full information maximum likelihood; Profit function; Duality
    JEL: C51 C33 G21
    Date: 2011–11–15
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:34735&r=rmg
  10. By: Alexandre Lazarow
    Abstract: Since the financial crisis, attention has focused on central counterparties (CCPs) as a solution to systemic risk for a variety of financial markets, ranging from repurchase agreements and options to swaps. However, internationally accepted standards and the academic literature have left unanswered many practical questions related to the design of CCPs. The author analyzes the inherent trade-offs and resulting international benchmarks for a certain set of issues. Four CCPs - FINet, CME Clearing, Eurex Clearing and LCH.Clearnet - are considered in terms of risk management, CCP links, governance and operational risk.
    Keywords: Financial system regulation and policies; Financial stability; Payment, clearing, and settlement systems; Financial markets
    JEL: G14 G18 G28 G38
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:bca:bocadp:11-4&r=rmg
  11. By: Skold, Alida S.
    Abstract: Regulation is written with the intent of protecting the vulnerable. However, it can cause an undesirable result if written without understanding how the positive intent can have a negative impact. In its present form, the proposed Volcker Rule has the potential of expanding the liquidity crisis that devastated the housing market into the capital markets. Risk will be transferred to less regulated entities. Banks conducting business in the U.S. or with U.S. “residents” will be at a competitive disadvantage.
    Keywords: Volcker Rule; Regulation; Prop Trading; Market Making; Hedge Fund; Risk
    JEL: G38 D02 D78 L50
    Date: 2011–11–12
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:34672&r=rmg

This nep-rmg issue is ©2011 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.