nep-rmg New Economics Papers
on Risk Management
Issue of 2011‒08‒02
nine papers chosen by
Stan Miles
Thompson Rivers University

  1. The Extreme Value Theory as a Tool to Measure Market Risk By Krenar Avdulaj
  2. Risk Management of Risk Under the Basel Accord: A Bayesian Approach to Forecasting Value-at-Risk of VIX Futures By Roberto Casarin; Chia-Lin Chang; Juan-Ángel Jiménez-Martín; Michael McAleer; Teodosio Pérez Amaral
  3. GFC-Robust Risk Management Under the Basel Accord Using Extreme Value Methodologies By Paulo Araújo Santos; Juan-Ángel Jiménez-Martín; Michael McAleer; Teodosio Pérez Amaral
  4. Addressing risk challenges in a changing financial environment: the need for greater accountability in financial regulation and risk management By Ojo, Marianne
  5. Value at Risk forecasting with the ARMA-GARCH family of models in times of increased volatility By Milan Rippel; Ivo Jánský
  6. The Importance of Estimation Uncertainty in a Multi-Rating Class Loan Portfolio By Henry Dannenberg
  7. From Smile Asymptotics to Market Risk Measures By Ronnie Sircar; Stephan Sturm
  8. Is there Any Dependence Between Consumer Credit Line Utilization and Default Probability on a Term Loan? Evidence from Bank-Level Data By Anne-Sophie Bergerès; Philippe d'Astous; Georges Dionne
  9. Estimation of extreme risk regions under multivariate regular variation. By Cai, J.; Einmahl, J.H.J.; Haan, L.F.M. de

  1. By: Krenar Avdulaj (Institute of Economic Studies, Faculty of Social Sciences, Charles University, Prague, Czech Republic)
    Abstract: Assessing extreme events is crucial in financial risk management. All risk managers and financial institutions want to know the risk of their portfolio under rare-event scenarios. We illustrate a multivariate market risk estimation method which employs Monte Carlo simulations to estimate Value-at-Risk (VaR) for a portfolio of four stock exchange indices from Central Europe. The method uses the non-parametric empirical distribution to capture small risks and parametric Extreme Value Theory to capture large and rare risks. We compare the estimates of this method with those of the historical simulation and variance-covariance methods on low- and high-volatility samples of data. In general, the historical simulation method overestimates VaR for extreme events, while the variance-covariance method underestimates it. The method we illustrate gives a result in between, because it considers the historical performance of the stocks and also corrects for the heavy tails of the distribution. We conclude that the method illustrated here is useful for estimating VaR for extreme events, especially in times of high volatility.
    Keywords: Value-at-Risk, Extreme Value Theory, copula.
    JEL: C22
    Date: 2011–07
    URL: http://d.repec.org/n?u=RePEc:fau:wpaper:wp2011_26&r=rmg
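    A minimal Python sketch of the semi-parametric scheme described above: the body of the loss distribution is kept empirical, while excesses over a high threshold are fitted with a generalized Pareto tail and the VaR is read off the standard peaks-over-threshold quantile formula. The threshold choice (90th percentile) and the use of scipy are illustrative assumptions, not the author's code.

      # Semi-parametric VaR: empirical body, GPD tail (illustrative sketch).
      import numpy as np
      from scipy.stats import genpareto

      def var_evt(losses, alpha=0.99, u_quantile=0.90):
          """One-sided VaR at level alpha from a sample of portfolio losses."""
          losses = np.sort(np.asarray(losses))
          u = np.quantile(losses, u_quantile)          # tail threshold (assumed)
          excess = losses[losses > u] - u
          xi, _, beta = genpareto.fit(excess, floc=0)  # ML fit of the GPD tail
          n, n_u = len(losses), len(excess)
          # Peaks-over-threshold quantile formula for levels alpha beyond u
          # (assumes a non-zero fitted shape parameter xi).
          return u + (beta / xi) * ((n / n_u * (1 - alpha)) ** (-xi) - 1.0)

      rng = np.random.default_rng(0)
      toy_losses = rng.standard_t(df=4, size=5000)     # heavy-tailed toy sample
      print(round(var_evt(toy_losses), 3))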
  2. By: Roberto Casarin; Chia-Lin Chang; Juan-Ángel Jiménez-Martín; Michael McAleer (University of Canterbury); Teodosio Pérez Amaral
    Abstract: It is well known that the Basel II Accord requires banks and other Authorized Deposit-taking Institutions (ADIs) to communicate their daily risk forecasts to the appropriate monetary authorities at the beginning of each trading day, using one or more risk models, whether individually or as combinations, to measure Value-at-Risk (VaR). The risk estimates of these models are used to determine capital requirements and associated capital costs of ADIs, depending in part on the number of previous violations, whereby realised losses exceed the estimated VaR. McAleer et al. (2009) proposed a new approach to model selection for predicting VaR, consisting of combining alternative risk models, and comparing conservative and aggressive strategies for choosing between VaR models. This paper addresses the question of risk management of risk, namely VaR of VIX futures prices, and extends the approaches given in McAleer et al. (2009) and Chang et al. (2011) to examine how different risk management strategies performed during the 2008-09 global financial crisis (GFC). The empirical results suggest that an aggressive strategy of choosing the Supremum of single model forecasts, as compared with Bayesian and non-Bayesian combinations of models, is preferred to other alternatives, and is robust during the GFC. However, this strategy implies relatively high numbers of violations and accumulated losses, which are admissible under the Basel II Accord.
    Keywords: Median strategy; Value-at-Risk; daily capital charges; violation penalties; aggressive risk management; conservative risk management; Basel Accord; VIX futures; Bayesian strategy; quantiles; forecast densities
    JEL: G32 C53 C22 C11
    Date: 2011–07–01
    URL: http://d.repec.org/n?u=RePEc:cbt:econwp:11/26&r=rmg
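    The aggressive "Supremum" strategy evaluated above picks, each day, the least extreme of the competing models' VaR forecasts; its violation count (days on which the realised return falls below the reported VaR) is what determines the Basel II penalty. A minimal sketch of the combination strategies, under the convention, common in this literature, that VaR forecasts are expressed as negative returns (an editor's illustration, not the authors' code):

      # Combining daily VaR forecasts across models (illustrative sketch).
      # Convention assumed: VaR forecasts are negative daily returns, so the
      # supremum (max) is the least extreme, most aggressive forecast and a
      # violation occurs when the realised return falls below the forecast.
      import numpy as np

      def combine(forecasts, strategy="supremum"):
          """forecasts: (days, models) array of daily VaR forecasts."""
          if strategy == "supremum":               # aggressive
              return forecasts.max(axis=1)
          if strategy == "infimum":                # conservative
              return forecasts.min(axis=1)
          return np.median(forecasts, axis=1)      # median strategy

      def violations(returns, var_forecast):
          return int(np.sum(returns < var_forecast))

      rng = np.random.default_rng(1)
      returns = rng.normal(0.0, 0.02, size=250)          # toy trading year
      models = rng.normal(-0.045, 0.01, size=(250, 4))   # four toy models
      print(violations(returns, combine(models, "supremum")))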
  3. By: Paulo Araújo Santos; Juan-Ángel Jiménez-Martín; Michael McAleer (University of Canterbury); Teodosio Pérez Amaral
    Abstract: In McAleer et al. (2010b), a risk management strategy robust to the Global Financial Crisis (GFC) was proposed under the Basel II Accord: select a Value-at-Risk (VaR) forecast that combines the forecasts of different VaR models. The robust forecast was based on the median of the point VaR forecasts of a set of conditional volatility models. In this paper, we provide further evidence on the suitability of the median as a GFC-robust strategy by using an additional set of new extreme value forecasting models and by extending the sample period for comparison. These extreme value models include DPOT and Conditional EVT. Such models might be expected to be useful in explaining financial data, especially in the presence of extreme shocks that arise during a GFC. Our empirical results confirm that the median remains GFC-robust even in the presence of these new extreme value models. This is illustrated using the S&P500 index before, during and after the 2008-09 GFC. We investigate the performance of a variety of single and combined VaR forecasts in terms of daily capital requirements and violation penalties under the Basel II Accord, as well as other criteria, including several tests for independence of the violations. The strategy based on the median, or more generally on combined forecasts of single models, is straightforward to incorporate into the existing computer software packages used by banks and other financial institutions.
    Keywords: Value-at-Risk (VaR); DPOT; daily capital charges; robust forecasts; violation penalties; optimizing strategy; aggressive risk management; conservative risk management; Basel; global financial crisis
    JEL: G32 G11 C53 C22
    Date: 2011–07–01
    URL: http://d.repec.org/n?u=RePEc:cbt:econwp:11/28&r=rmg
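    Both entries above score strategies by the Basel II daily capital charge: the larger of yesterday's VaR and the 60-day average VaR scaled by a multiplier of (3 + k), where the penalty k increases with the number of violations over the preceding 250 trading days. A hedged sketch of that formula, with VaR expressed as a positive amount and the standard Basel "traffic light" penalty table; the specific-risk and other add-ons are omitted:

      # Basel II daily capital charge from a VaR history (illustrative sketch).
      import numpy as np

      PENALTY = {0: 0.0, 1: 0.0, 2: 0.0, 3: 0.0, 4: 0.0,     # green zone
                 5: 0.40, 6: 0.50, 7: 0.65, 8: 0.75, 9: 0.85}  # yellow zone

      def capital_charge(var_history, n_violations):
          """var_history: positive VaR amounts, most recent last."""
          k = PENALTY.get(n_violations, 1.0)   # 10+ violations: red zone, k = 1
          avg60 = np.mean(var_history[-60:])
          return max(var_history[-1], (3.0 + k) * avg60)

      var_hist = np.full(60, 1.5e6)            # toy: flat $1.5m daily 1%-VaR
      print(capital_charge(var_hist, n_violations=6))   # (3.5 x 1.5m) = 5.25e6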
  4. By: Ojo, Marianne
    Abstract: The need for continuous monitoring and regulation is particularly attributed to, and justified by, the inevitable presence of risks and uncertainty – both those externalities and indeterminacies which are capable of being reasonably quantified and those which are not. Amongst other goals, this paper aims to address the complexities and challenges faced by regulators in identifying and assessing risk, the problems arising from different perceptions of risk, and solutions aimed at countering the problems of risk regulation. It approaches these issues through an assessment of the explanations put forward to justify the growing importance of risk, and of well-known risk theories such as cultural theory, risk society theory and governmentality theory. “Socio-cultural” explanations, which relate to how risk is increasingly becoming embedded in organisations and institutions, will also be considered as part of the factors explaining why the financial environment has been transformed into its current state. A consideration of regulatory developments which have contributed to a change in the way financial regulation is carried out, as well as developments which have contributed to the de-formalisation of rules and a corresponding “loss of certainty”, will also constitute focal points of the paper. To what extent are risks capable of being quantified? Who is able to assist with such quantification, and why has it become necessary to introduce other regulatory actors and greater measures aimed at fostering corporate governance and accountability into the regulatory process? These questions constitute some of the issues which this paper aims to address.
    Keywords: risk; financial; regulation; audit; governmentality theory; risk society; cultural theory; hedge funds; uncertainty; legal theory; accountability
    JEL: K2 D8 G3
    Date: 2011–07–24
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:32396&r=rmg
  5. By: Milan Rippel (Institute of Economic Studies, Faculty of Social Sciences, Charles University, Prague, Czech Republic); Ivo Jánský (Institute of Economic Studies, Faculty of Social Sciences, Charles University, Prague, Czech Republic)
    Abstract: The paper evaluates several hundred one-day-ahead VaR forecasting models over the period 2004 to 2009 on data from six world stock indices: DJI, GSPC, IXIC, FTSE, GDAXI and N225. The models capture the mean with ARMA processes with up to two lags and the variance with GARCH, EGARCH or TARCH processes with up to two lags. The models are estimated on data from the in-sample period, and their forecasting accuracy is evaluated on the out-of-sample data, which are more volatile. The main aim of the paper is to test whether a model estimated on data with lower volatility can be used in periods with higher volatility. The evaluation is based on the conditional coverage test and is performed on each stock index separately. The primary result of the paper is that volatility is best modelled by a GARCH process and that no ARMA pattern can be found in the analyzed time series.
    Keywords: VaR, risk analysis, conditional volatility, conditional coverage, GARCH, EGARCH, TARCH, moving average process, autoregressive process
    JEL: C51 C52 C53 G24
    Date: 2011–07
    URL: http://d.repec.org/n?u=RePEc:fau:wpaper:wp2011_27&r=rmg
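    The evaluation criterion named above is the conditional coverage test of Christoffersen (1998): a likelihood-ratio test of the unconditional violation rate combined with a first-order Markov test of independence of consecutive violations, the sum being asymptotically chi-squared with two degrees of freedom. A sketch computing the statistic from the 0/1 hit sequence alone (an editor's illustration, not the authors' code):

      # Christoffersen conditional coverage test (illustrative sketch).
      import numpy as np
      from scipy.stats import chi2

      def _ll(n, p):                      # n * log(p), with 0 * log(0) := 0
          return n * np.log(p) if n > 0 else 0.0

      def conditional_coverage(hits, p=0.01):
          """hits: 0/1 violation indicators; p: nominal exceedance rate."""
          hits = np.asarray(hits, dtype=int)
          n1 = int(hits.sum()); n0 = len(hits) - n1
          pi = n1 / len(hits)
          lr_uc = -2 * (_ll(n0, 1 - p) + _ll(n1, p)
                        - _ll(n0, 1 - pi) - _ll(n1, pi))
          # Transition counts for the first-order Markov independence test.
          prev, curr = hits[:-1], hits[1:]
          n00 = int(((prev == 0) & (curr == 0)).sum())
          n01 = int(((prev == 0) & (curr == 1)).sum())
          n10 = int(((prev == 1) & (curr == 0)).sum())
          n11 = int(((prev == 1) & (curr == 1)).sum())
          pi01 = n01 / max(n00 + n01, 1)
          pi11 = n11 / max(n10 + n11, 1)
          pi2 = (n01 + n11) / max(n00 + n01 + n10 + n11, 1)
          lr_ind = -2 * (_ll(n00 + n10, 1 - pi2) + _ll(n01 + n11, pi2)
                         - _ll(n00, 1 - pi01) - _ll(n01, pi01)
                         - _ll(n10, 1 - pi11) - _ll(n11, pi11))
          lr_cc = lr_uc + lr_ind
          return lr_cc, chi2.sf(lr_cc, df=2)   # statistic and p-value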
  6. By: Henry Dannenberg
    Abstract: This article assesses estimation uncertainty in a multi-rating-class loan portfolio. Relationships are established between estimation uncertainty and parameters such as the probability of default, intra- and inter-rating-class correlation, degree of inhomogeneity, number of rating classes used, number of debtors and number of historical periods used for parameter estimation. In addition, an exemplary portfolio based on Moody’s ratings makes clear that estimation uncertainty does indeed have an effect on interest rates.
    Keywords: credit portfolio risk, estimation uncertainty, bootstrapping, economic equity
    JEL: C15 D81 G11
    Date: 2011–07
    URL: http://d.repec.org/n?u=RePEc:iwh:dispap:11-11&r=rmg
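    The bootstrapping named in the keywords can be illustrated on the simplest ingredient, the default probability of a single rating class: resampling the historical default counts yields a distribution of the estimated PD whose spread is the estimation uncertainty that feeds through to the loan rate. A toy sketch; the portfolio size, horizon and true PD are invented:

      # Bootstrap estimation uncertainty of a rating-class PD (toy sketch).
      import numpy as np

      rng = np.random.default_rng(42)
      true_pd, n_debtors, n_years = 0.02, 500, 10          # invented values
      defaults = rng.binomial(n_debtors, true_pd, size=n_years)  # history

      pd_hat = defaults.sum() / (n_debtors * n_years)      # point estimate
      boot = [rng.choice(defaults, size=n_years, replace=True).sum()
              / (n_debtors * n_years) for _ in range(10_000)]
      lo, hi = np.percentile(boot, [2.5, 97.5])
      print(f"PD = {pd_hat:.4f}, 95% bootstrap interval [{lo:.4f}, {hi:.4f}]")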
  7. By: Ronnie Sircar; Stephan Sturm
    Abstract: The left tail of the implied volatility skew, coming from quotes on out-of-the-money put options, can be thought to reflect the market's assessment of the risk of a huge drop in stock prices. We analyze how this market information can be integrated into the theoretical framework of convex monetary measures of risk. In particular, we make use of indifference pricing by dynamic convex risk measures, which are given as solutions of backward stochastic differential equations (BSDEs), to establish a link between these two approaches to risk measurement. We derive a characterization of the implied volatility in terms of the solution of a nonlinear PDE and provide a small time-to-maturity expansion and numerical solutions. This procedure makes it possible to choose convex risk measures in a conveniently parametrized class, distorted entropic dynamic risk measures, which we introduce here, such that the asymptotic volatility skew under indifference pricing can be matched with the market skew.
    Date: 2011–07
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1107.4632&r=rmg
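    The market input to this analysis is the implied volatility extracted from out-of-the-money put quotes. As a reminder of that first step only (the paper's BSDE/PDE machinery is not reproduced here), implied volatility is the root in sigma of the Black-Scholes put pricing formula:

      # Implied volatility of a European put via Black-Scholes inversion (sketch).
      import numpy as np
      from scipy.optimize import brentq
      from scipy.stats import norm

      def bs_put(S, K, T, r, sigma):
          """Black-Scholes price of a European put."""
          d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
          d2 = d1 - sigma * np.sqrt(T)
          return K * np.exp(-r * T) * norm.cdf(-d2) - S * norm.cdf(-d1)

      def implied_vol(price, S, K, T, r):
          # Bracket [1e-6, 5] assumed wide enough for any sane quote.
          return brentq(lambda s: bs_put(S, K, T, r, s) - price, 1e-6, 5.0)

      # OTM put: spot 100, strike 80, three months, quoted at 0.35 (toy numbers)
      print(round(implied_vol(0.35, S=100, K=80, T=0.25, r=0.01), 4))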
  8. By: Anne-Sophie Bergerès; Philippe d'Astous; Georges Dionne
    Abstract: Recent studies of credit lines suggest a positive relationship between exposure at default and the default probability on the line. In this paper, we consider another important dependence between two consumer financial instruments by investigating the relationship between credit line utilization and the default probability on a term loan. We model the variation of both financial instruments endogenously in a simultaneous equations system. We find strong evidence of a positive relationship between the two variables: individuals in the default state use their credit line about 59% more, while credit line utilization has a positive marginal effect of 46% on the loan default probability. Our results suggest that banks should monitor both financial instruments simultaneously.
    Keywords: Consumer finance, consumer risk management, credit line, term loan, default probability, ability to pay, endogeneity, simultaneous equations
    JEL: D12 D14 G21 G33
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:lvl:lacicr:1119&r=rmg
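    Because utilization and default each enter the other's equation, equation-by-equation OLS is inconsistent and a simultaneous equations estimator is needed. As a generic illustration of that logic only, the sketch below runs two-stage least squares on a linearized toy system; the paper's actual specification, instruments and coefficients are not reproduced (0.46 merely echoes the marginal effect quoted in the abstract):

      # Two-stage least squares for one equation of a simultaneous system (sketch).
      import numpy as np

      def ols(X, y):
          return np.linalg.lstsq(X, y, rcond=None)[0]

      rng = np.random.default_rng(7)
      n = 2000
      z1, z2 = rng.normal(size=n), rng.normal(size=n)     # toy instruments
      u, v = rng.multivariate_normal([0, 0], [[1, .6], [.6, 1]], n).T
      util = 1.0 + 0.8 * z1 + v                           # endogenous regressor
      default = 0.5 + 0.46 * util + 0.3 * z2 + u          # structural equation

      Z = np.column_stack([np.ones(n), z1, z2])           # first stage
      util_hat = Z @ ols(Z, util)
      X2 = np.column_stack([np.ones(n), util_hat, z2])    # second stage
      print(ols(X2, default))                             # ~ [0.5, 0.46, 0.3]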
  9. By: Cai, J. (Universiteit van Tilburg); Einmahl, J.H.J. (Universiteit van Tilburg); Haan, L.F.M. de (Universiteit van Tilburg)
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:ner:tilbur:urn:nbn:nl:ui:12-4643295&r=rmg

This nep-rmg issue is ©2011 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.