
on Risk Management 
Issue of 2013‒05‒19
ten papers chosen by 
By:  Michael McAleer (Erasmus University Rotterdam); Juan-Ángel Jiménez-Martín (Complutense University of Madrid); Teodosio Pérez-Amaral (Complutense University of Madrid) 
Abstract:  The Basel II Accord requires that banks and other Authorized Deposit-taking Institutions (ADIs) communicate their daily risk forecasts to the appropriate monetary authorities at the beginning of each trading day, using one or more risk models to measure Value-at-Risk (VaR). The risk estimates of these models are used to determine capital requirements and associated capital costs of ADIs, depending in part on the number of previous violations, whereby realised losses exceed the estimated VaR. In this paper we define risk management in terms of choosing from a variety of risk models, and discuss the selection of optimal risk models. A new approach to model selection for predicting VaR is proposed, consisting of combining alternative risk models, and we compare conservative and aggressive strategies for choosing between VaR models. We then examine how different risk management strategies performed during the 2008-09 global financial crisis. These issues are illustrated using Standard and Poor's 500 Composite Index. 
Keywords:  Value-at-Risk (VaR), daily capital charges, violation penalties, optimizing strategy, risk forecasts, aggressive or conservative risk management strategies, Basel Accord, global financial crisis 
JEL:  G32 G11 G17 C53 C22 
Date:  2013–01–08 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:2013010&r=rmg 
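The violation counting and penalty structure described in the abstract can be sketched as follows; the traffic-light add-ons are the standard Basel II backtesting values, while the function names and the sign convention (VaR reported as a positive loss) are illustrative:

```python
import numpy as np

def count_violations(returns, var_forecasts):
    """A violation occurs on days when the realised loss exceeds the forecast VaR
    (VaR expressed as a positive number)."""
    return int(np.sum(-np.asarray(returns) > np.asarray(var_forecasts)))

def basel_multiplier(violations):
    """Basel II traffic-light multiplier applied to average VaR in the capital
    charge, based on the number of violations in a 250-day backtest window."""
    add_ons = {5: 0.40, 6: 0.50, 7: 0.65, 8: 0.75, 9: 0.85}  # yellow zone
    if violations <= 4:
        return 3.0            # green zone: base multiplier
    if violations >= 10:
        return 4.0            # red zone
    return 3.0 + add_ons[violations]
```

The trade-off the paper studies follows directly: a conservative model forecasts higher VaR, raising daily capital charges but lowering the chance of violations and penalty add-ons, while an aggressive model does the reverse.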
By:  Andre Lucas (VU University Amsterdam); Bernd Schwaab (European Central Bank, Financial Markets Research); Xin Zhang (VU University Amsterdam, and Sveriges Riksbank, Research Division) 
Abstract:  Two new measures for financial systemic risk are computed based on the time-varying conditional and unconditional probability of simultaneous failures of several financial institutions. These risk measures are derived from a multivariate model that allows for skewed and heavy-tailed changes in the market value of financial firms' equity. Our model can be interpreted as a Merton model with correlated Lévy drivers. This model incorporates dynamic volatilities and dependence measures and uses the overall information on the shape of the multivariate distribution. Our correlation estimates are robust against possible outliers and influential observations. For very large cross-sectional dimensions, we propose an approximation based on a conditional Law of Large Numbers to compute extreme joint default probabilities. We apply the model to assess the risk of joint financial firm failure in the European Union during the financial crisis. By augmenting the dynamic parameter model with the Euribor-EONIA rate and other variables that capture situations of systemic stress, we find that including extra economic variables helps to explain systemic correlation dynamics. 
Keywords:  systemic risk; dynamic equicorrelation model; generalized hyperbolic distribution; Law of Large Numbers 
JEL:  G21 C32 
Date:  2013–05–13 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:20130063&r=rmg 
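The object at the heart of the abstract, a joint failure probability for several institutions, can be sketched in a Merton-style setting. Joint normality here is a deliberate simplification (the paper's point is precisely to replace it with skewed, heavy-tailed Lévy drivers), and the barrier and correlation values are illustrative:

```python
import numpy as np

def joint_failure_prob(corr, barrier, n_sims=200_000, seed=0):
    """Monte Carlo probability that all firms' standardised equity-value shocks
    fall below a common default barrier, under joint normality."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(corr)          # correlate the shocks
    z = rng.standard_normal((n_sims, corr.shape[0])) @ L.T
    return np.mean(np.all(z < barrier, axis=1))
```

With independent firms the joint probability is just the product of the marginals; positive equity correlation raises it, which is why dependence dynamics matter for systemic risk.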
By:  Andre Lucas (VU University Amsterdam, and Duisenberg school of finance); Bastiaan Verhoef (Royal Bank of Scotland) 
Abstract:  We investigate the effect of model specification on the aggregation of (correlated) market and credit risk. We focus on the functional form linking systematic credit risk drivers to default probabilities. Examples include the normal-based probit link function for typical structural models, or the exponential (Poisson) link function for typical reduced-form models. We first show analytically how model specification impacts 'diversification benefits' for aggregated market and credit risk. The specification effect can lead to Value-at-Risk (VaR) reductions in the range of 3 percent to 47 percent, particularly at high-confidence-level VaRs. We also illustrate the effects using a fully calibrated empirical model for US data. The empirical effects corroborate our analytic results. 
Keywords:  risk aggregation, credit risk, market risk, link function, diversification, reduced form models, structural models 
JEL:  G32 G21 C58 
Date:  2012–05–31 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:2012057&r=rmg 
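The contrast between the two link functions can be made concrete with a minimal sketch; the coefficients are arbitrary illustrative values, with the intercepts chosen so that both links imply roughly the same default probability at z = 0:

```python
from statistics import NormalDist
import math

def pd_probit(z, a=-2.0, b=0.5):
    """Structural-model style: probit link, PD = Phi(a + b*z)."""
    return NormalDist().cdf(a + b * z)

def pd_poisson(z, a=-3.76, b=0.5):
    """Reduced-form style: exponential (Poisson) link, PD = 1 - exp(-exp(a + b*z)).
    Intercept chosen so the baseline PD at z = 0 is about 2.3%, matching the probit."""
    return 1.0 - math.exp(-math.exp(a + b * z))
```

Even when calibrated to the same baseline, the two links assign different default probabilities to adverse factor realisations, which is the source of the specification effect on aggregated VaR that the paper quantifies.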
By:  Mardi Dungey (School of Economics and Finance, University of Tasmania; CFAP, University of Cambridge; CAMA, ANU); Matteo Luciani (ECARES, Solvay Brussels School of Economics and Management, Université Libre de Bruxelles; F.R.S.-FNRS); David Veredas (ECARES, Solvay Brussels School of Economics and Management, and Duisenberg school of finance) 
Abstract:  We propose a simple network-based methodology for ranking systemically important financial institutions. We view the risks of firms, including both the financial sector and the real economy, as a network whose nodes represent the volatility shocks; the metric for the connections between nodes is the correlation between these shocks. Daily dynamic centrality measures allow us to rank firms in terms of risk connectedness and firm characteristics. We present a general systemic risk index for the financial sector. Results from applying this approach to all firms in the S&P500 for 2003-2011 are twofold. First, Bank of America, JP Morgan and Wells Fargo are consistently in the top 10 throughout the sample; Citigroup and Lehman Brothers were also consistently in the top 10 up to late 2008. At the end of the sample, insurance firms emerge as systemic. Second, systemic risk in the financial sector built up from early 2005, peaked in September 2008, and fell sharply after the introduction of TARP and the rescue of AIG. Anxiety about European debt markets saw systemic risk begin to rise again from April 2010. We further decompose these results to find that the systemic risk of insurance and deposit-taking institutions differs importantly: the latter experienced a decline from late 2007, in line with the bursting of the housing price bubble, while the former continued to climb until the rescue of AIG. 
Keywords:  Systemic risk, ranking, financial institutions, Lehman 
JEL:  G01 G10 G18 G20 G28 G32 G38 
Date:  2012–10–26 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:2012115&r=rmg 
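The centrality ranking the abstract describes can be illustrated with a minimal sketch using eigenvector centrality on a synthetic correlation matrix of volatility shocks; the paper's actual dynamic centrality measures are more elaborate, and the data here are made up:

```python
import numpy as np

def eigenvector_centrality(corr):
    """Centrality scores from the leading eigenvector of the (absolute)
    correlation matrix: strongly connected nodes load heavily on it."""
    w, v = np.linalg.eigh(np.abs(corr))
    lead = np.abs(v[:, np.argmax(w)])   # leading eigenvector, sign-normalised
    return lead / lead.sum()

# synthetic example: firms A and B strongly connected, C only weakly
corr = np.array([[1.0, 0.8, 0.1],
                 [0.8, 1.0, 0.1],
                 [0.1, 0.1, 1.0]])
scores = eigenvector_centrality(corr)
```

Recomputing such scores each day over rolling correlation estimates yields exactly the kind of time-varying systemic ranking reported in the abstract.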
By:  David E. Allen (Edith Cowan University, Australia); Abhay K. Singh (Edith Cowan University, Australia); Robert J. Powell (Edith Cowan University, Australia); Michael McAleer (Erasmus University Rotterdam, Complutense University of Madrid, Spain, and Kyoto University, Japan); James Taylor (University of Oxford, Oxford); Lyn Thomas (University of Southampton, Southampton) 
Abstract:  The purpose of this paper is to examine the asymmetric relationship between price and implied volatility, and the associated extreme quantile dependence, using linear and non-linear quantile regression approaches. Our goal is to demonstrate, using quantile regressions, that the relationship between volatility and market return as quantified by Ordinary Least Squares (OLS) regression is not uniform across the distribution of the volatility-price return pairs. We examine the bivariate relationship of six volatility-return pairs: the CBOE VIX and S&P500, FTSE100 Volatility and FTSE100, NASDAQ100 Volatility (VXN) and NASDAQ, DAX Volatility (VDAX) and DAX30, CAC Volatility (VCAC) and CAC40, and STOXX Volatility (VSTOXX) and STOXX. The assumption of a normal distribution in the return series is not appropriate when the distribution is skewed, and hence OLS does not capture the complete picture of the relationship. Quantile regression, on the other hand, can be set up with various loss functions, both parametric and non-parametric (linear case), and can be evaluated with skewed marginal-based copulas (for the non-linear case), which is helpful in evaluating the non-normal and non-linear nature of the relationship between price and volatility. In the empirical analysis we compare the results from linear quantile regression (LQR) and copula-based non-linear quantile regression, known as copula quantile regression (CQR). The discussion of the properties of the volatility series and the empirical findings in this paper have significance for portfolio optimization, hedging strategies, trading strategies and risk management in general. 
Keywords:  Return-Volatility relationship, quantile regression, copula, copula quantile regression, volatility index, tail dependence 
JEL:  C14 C58 G11 
Date:  2013–01–18 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:2013020&r=rmg 
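Both the linear and copula-based variants rest on minimising the check (pinball) loss rather than squared error; a minimal sketch with synthetic data shows that minimising it over a constant recovers the corresponding sample quantile:

```python
import numpy as np

def pinball_loss(y, q, tau):
    """Check (pinball) loss underlying quantile regression at quantile tau:
    under-predictions are weighted tau, over-predictions (1 - tau)."""
    e = y - q
    return np.mean(np.maximum(tau * e, (tau - 1) * e))

y = np.arange(1, 101, dtype=float)      # synthetic sample: 1, 2, ..., 100
grid = np.linspace(0.0, 101.0, 2000)
tau = 0.95
q_hat = grid[np.argmin([pinball_loss(y, q, tau) for q in grid])]
```

Replacing the constant q with a linear function of a regressor gives linear quantile regression; the copula variant replaces that linear form with a non-linear conditional quantile curve implied by the fitted copula.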
By:  Lukasz Gatarek (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam); Lennart Hoogerheide (VU University Amsterdam); Koen Hooning (Delft University of Technology); Herman K. van Dijk (Econometric Institute, Erasmus University Rotterdam, and VU University Amsterdam) 
Abstract:  Accurate prediction of risk measures such as Value at Risk (VaR) and Expected Shortfall (ES) requires precise estimation of the tail of the predictive distribution. Two novel concepts are introduced that offer a specific focus on this part of the predictive density: the censored posterior, a posterior in which the likelihood is replaced by the censored likelihood; and the censored predictive likelihood, which is used for Bayesian Model Averaging. We perform extensive experiments involving simulated and empirical data. Our results show the ability of these new approaches to outperform the standard posterior and traditional Bayesian Model Averaging techniques in applications of Value-at-Risk prediction in GARCH models. 
Keywords:  censored likelihood, censored posterior, censored predictive likelihood, Bayesian Model Averaging, Value at Risk, Metropolis-Hastings algorithm. 
JEL:  C11 C15 C22 C51 C53 C58 G17 
Date:  2013–04–15 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:2013060&r=rmg 
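The censoring idea, replacing the density contribution of observations outside the region of interest by a probability mass, can be sketched for a plain Gaussian likelihood with a left-tail focus; this is a simplified illustration of the concept, not the paper's GARCH implementation:

```python
from statistics import NormalDist
import math

def censored_loglik(returns, mu, sigma, threshold):
    """Censored Gaussian log-likelihood focused on the left tail:
    observations below the threshold contribute their full log-density,
    those above contribute only log P(r > threshold)."""
    nd = NormalDist(mu, sigma)
    ll = 0.0
    for r in returns:
        if r < threshold:
            ll += math.log(nd.pdf(r))
        else:
            ll += math.log(1.0 - nd.cdf(threshold))
    return ll
```

Because observations outside the tail enter only through a single probability mass, parameter estimates are driven by how well the model fits the tail, which is exactly where VaR and ES live.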
By:  David Ardia (Université Laval, Quebec, Canada); Lennart Hoogerheide (VU University Amsterdam) 
Abstract:  We analyze the impact of the estimation frequency (updating parameter estimates on a daily, weekly, monthly or quarterly basis) for commonly used GARCH models in a large-scale study, using more than twelve years (2000-2012) of daily returns for constituents of the S&P 500 index. We assess the implications for one-day-ahead 95% and 99% Value-at-Risk (VaR) forecasts with the test for correct conditional coverage of Christoffersen (1998), and for Expected Shortfall (ES) forecasts with the block-bootstrap test of ES violations of Jalal and Rockinger (2008). Using the false discovery rate methodology of Storey (2002) to estimate the percentage of stocks for which the model yields correct VaR and ES forecasts, we reach the following conclusions. First, updating the parameter estimates of the GARCH equation at a daily frequency only marginally improves the performance of the model, compared with weekly, monthly or even quarterly updates. The 90% confidence bands overlap, reflecting that the performance is not significantly different. Second, the asymmetric GARCH model with a nonparametric kernel density estimate performs well; it yields correct VaR and ES forecasts for an estimated 90% to 95% of the S&P 500 constituents. Third, specifying a Student-t (or Gaussian) innovations' density yields substantially and significantly worse forecasts, especially for ES. In sum, the somewhat more advanced model with infrequently updated parameter estimates yields much better VaR and ES forecasts than simpler models with daily updated parameter estimates. 
Keywords:  GARCH, ValueatRisk, Expected Shortfall, equity, frequency, false discovery rate 
JEL:  C12 C22 C58 G17 G32 
Date:  2013–03–21 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:2013047&r=rmg 
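A one-day-ahead VaR forecast from a GARCH(1,1) filter of the kind whose update frequency the study varies can be sketched as follows; Gaussian innovations and initialisation at the sample variance are simplifying assumptions (the paper's best-performing specification is an asymmetric GARCH with a nonparametric innovation density):

```python
from statistics import NormalDist
import numpy as np

def garch11_var(returns, omega, alpha, beta, level=0.99):
    """One-day-ahead VaR (as a positive loss) from a GARCH(1,1) variance filter
    sigma2_{t+1} = omega + alpha * r_t^2 + beta * sigma2_t, Gaussian innovations."""
    sigma2 = np.var(returns)               # initialise at the sample variance
    for r in returns:
        sigma2 = omega + alpha * r**2 + beta * sigma2
    z = NormalDist().inv_cdf(1.0 - level)  # e.g. about -2.326 at the 99% level
    return -z * np.sqrt(sigma2)
```

The estimation-frequency question is then simply how often (omega, alpha, beta) are re-estimated, while the variance recursion itself is always updated daily with the latest return.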
By:  Simon A. Broda (University of Amsterdam) 
Abstract:  Countless test statistics can be written as quadratic forms in certain random vectors, or ratios thereof. Consequently, their distribution has received considerable attention in the literature. Except for a few special cases, no closed-form expression for the cdf exists, and one resorts to numerical methods. Traditionally the problem is analyzed under the assumption of joint Gaussianity; the algorithm that is usually employed is that of Imhof (1961). The present manuscript generalizes this result to the case of multivariate generalized hyperbolic (MGHyp) random vectors. The MGHyp is a very flexible distribution which nests, among others, the multivariate t, Laplace, and variance gamma distributions. An expression for the first partial moment is also obtained, which plays a vital role in financial risk management. The proof involves a generalization of the classic inversion formula due to Gil-Pelaez (1951). Two applications are considered: first, the finite-sample distribution of the 2SLS estimator of a structural parameter; second, the Value at Risk and Expected Shortfall of a quadratic portfolio with heavy-tailed risk factors. 
Keywords:  Finite Samples; Characteristic Function; Transform Inversion; 2SLS; CVaR; Expected Shortfall 
JEL:  C16 C36 C63 G11 G32 
Date:  2013–01–08 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:2013001&r=rmg 
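The Gil-Pelaez (1951) inversion formula that the paper generalises recovers a cdf from a characteristic function by numerical integration; a minimal sketch for the simplest quadratic form, z² with z ~ N(0,1) (a chi-squared(1) variable), with truncation and grid values chosen ad hoc:

```python
import numpy as np

def gil_pelaez_cdf(x, cf, t_max=200.0, n=200_000):
    """Gil-Pelaez inversion: F(x) = 1/2 - (1/pi) * int_0^inf Im(e^{-itx} cf(t)) / t dt,
    truncated at t_max and evaluated with the trapezoidal rule."""
    t = np.linspace(1e-6, t_max, n)
    g = np.imag(np.exp(-1j * t * x) * cf(t)) / t
    dt = t[1] - t[0]
    integral = np.sum((g[1:] + g[:-1]) * dt / 2.0)
    return 0.5 - integral / np.pi

# characteristic function of chi-squared(1), the law of z^2 for z ~ N(0,1)
cf_chi2_1 = lambda t: (1 - 2j * t) ** (-0.5)

p = gil_pelaez_cdf(1.0, cf_chi2_1)   # P(z^2 <= 1) = P(|z| <= 1)
```

The same recipe applies whenever the characteristic function of the quadratic form is available in closed form, which is what makes the MGHyp generalization useful for VaR of quadratic portfolios.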
By:  Martin Koudstaal (Double Effect); Sweder van Wijnbergen (University of Amsterdam) 
Abstract:  This paper deals with the relation between excessive risk taking and capital structure in banks. Examining a quarterly dataset of U.S. banks between 1993 and 2010, we find that equity is valued higher when riskier portfolios are chosen under high leverage, and that more risk taking has a negative impact on the valuation of the debt of highly leveraged banks. We find no evidence that deposit insurance encourages risk-taking behaviour. We do find that banks with a more troubled loan portfolio take on more risk. Banks whose share price has slumped tend to gamble for resurrection by increasing the riskiness of their asset portfolios. The results suggest that incentives embedded in the capital structure of banks contribute to systemic fragility, and so support the Basel III proposals for less leverage and higher loss-absorption capacity of capital. 
Keywords:  bank fragility, risk shifting, deposit insurance, gambles for resurrection 
JEL:  G21 G28 G32 
Date:  2012–03–12 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:2012022&r=rmg 
By:  Gabriela Flores (Institute of Health Economics and Management, University of Lausanne, and Institute of Health Policy and Management, Erasmus University Rotterdam); Owen O'Donnell (Erasmus School of Economics, Erasmus University Rotterdam, and University of Macedonia, Greece) 
Abstract:  Medical expenditure risk can pose a major threat to living standards. We derive decomposable measures of catastrophic medical expenditure risk from reference-dependent utility with loss aversion. We propose a quantile-regression-based method of estimating risk exposure from cross-section data containing information on the means of financing health payments. We estimate medical expenditure risk in seven Asian countries and find it is highest in Laos and China, and lowest in Malaysia. Exposure to risk is generally higher for households that have less recourse to self-insurance, lower income, wealth and education, and that suffer from chronic illness. 
Keywords:  medical expenditures, catastrophic payments, downside risk, reference-dependent utility, Asia 
JEL:  D12 D31 D80 I15 
Date:  2012–07–24 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:2012078&r=rmg 