
on Risk Management 
Issue of 2014–11–07
twelve papers chosen by 
By:  Contino, Christian; Gerlach, Richard H. 
Abstract:  A Realised Volatility GARCH model is developed within a Bayesian framework for the purpose of forecasting Value at Risk and Conditional Value at Risk. Student-t and Skewed Student-t return distributions are combined with Gaussian and Student-t distributions in the measurement equation in a GARCH framework to forecast tail risk in eight international equity index markets over a four-year period. Three Realised Volatility proxies are considered within this framework. Realised Volatility GARCH models show a marked improvement compared to ordinary GARCH for both Value at Risk and Conditional Value at Risk forecasting. This improvement is consistent across a variety of data, volatility model specifications and distributions, and demonstrates that Realised Volatility is superior for producing volatility forecasts. Realised Volatility models implementing a Skewed Student-t distribution for returns in the GARCH equation are favoured. 
Keywords:  Risk Management; Expected Shortfall; High-Frequency Data; CVaR; Value-at-Risk; GARCH; Realised Volatility 
Date:  2014–10–10 
URL:  http://d.repec.org/n?u=RePEc:syb:wpbsba:2123/12060&r=rmg 
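The core idea — a variance recursion driven by a realised-volatility proxy rather than the squared return, then a tail quantile scaled by the forecast volatility — can be sketched as follows. This is a minimal illustration, not the paper's Bayesian model: parameter values are made up, and a Gaussian quantile stands in for the Skewed Student-t distribution the paper favours.

```python
from statistics import NormalDist

def rv_garch_var(rv, omega=2e-6, beta=0.55, gamma=0.40, alpha=0.01):
    """One-step-ahead Value at Risk from a simplified Realised Volatility
    GARCH recursion, in which the realised-volatility proxy replaces the
    squared return used by ordinary GARCH:

        h_t = omega + beta * h_{t-1} + gamma * RV_{t-1}

    Parameters here are purely illustrative.
    """
    h = rv[0]                        # initialise variance at the first proxy
    for x in rv:
        h = omega + beta * h + gamma * x
    z = NormalDist().inv_cdf(alpha)  # lower-tail quantile, about -2.33 at 1%
    return z * h ** 0.5              # negative: the forecast loss bound

# Three days of daily realised-variance proxies (e.g. sums of squared
# intraday returns), then a 1% one-day VaR forecast:
var_1pct = rv_garch_var([8e-5, 2.5e-4, 1.2e-4])
```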
By:  Mark H. A. Davis 
Abstract:  This paper concerns the computation of risk measures for financial data and asks how, given a risk measurement procedure, we can tell whether the answers it produces are correct. We draw the distinction between `external' and `internal' risk measures and concentrate on the latter, where we observe data in real time, make predictions and observe outcomes. It is argued that evaluation of such procedures is best addressed from the point of view of probability forecasting or Dawid's theory of `prequential statistics' [Dawid, JRSS(A) 1984]. We introduce a concept of `consistency' of a risk measure, which is close to Dawid's `strong prequential principle', and examine its application to quantile forecasting (VaR – value at risk) and to mean estimation (applicable to CVaR – expected shortfall). We show in particular that VaR has special properties not shared by any other risk measure. In a final section we show that a simple data-driven feedback algorithm can produce VaR estimates on financial data that easily pass both the consistency test and a further newly introduced statistical test for independence of a binary sequence. 
Date:  2014–10 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1410.4382&r=rmg 
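The paper's consistency and independence tests are its own; as a generic illustration of testing a 0/1 VaR exception sequence, the standard Christoffersen-style likelihood-ratio check of first-order independence looks like this (a textbook check, not the paper's newly introduced test):

```python
from math import log

def independence_lr(hits):
    """Likelihood-ratio statistic for first-order independence of a 0/1
    VaR exception sequence.  Asymptotically chi-squared(1) under the null
    of independence; clustered exceptions inflate it."""
    n = [[0, 0], [0, 0]]
    for prev, cur in zip(hits, hits[1:]):
        n[prev][cur] += 1
    n00, n01, n10, n11 = n[0][0], n[0][1], n[1][0], n[1][1]

    def term(count, p):              # count * log(p), with 0 * log(0) = 0
        return count * log(p) if count else 0.0

    pi01 = n01 / (n00 + n01) if n00 + n01 else 0.0
    pi11 = n11 / (n10 + n11) if n10 + n11 else 0.0
    pi = (n01 + n11) / (n00 + n01 + n10 + n11)
    l1 = (term(n00, 1 - pi01) + term(n01, pi01)
          + term(n10, 1 - pi11) + term(n11, pi11))
    l0 = term(n00 + n10, 1 - pi) + term(n01 + n11, pi)
    return 2.0 * (l1 - l0)

# A run of consecutive exceptions suggests dependence:
lr = independence_lr([0, 0, 0, 1, 1, 1, 0, 0, 0, 0])
```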
By:  Carlos Abad; Garud Iyengar 
Abstract:  We propose an iterative gradient-based algorithm to efficiently solve the portfolio selection problem with multiple spectral risk constraints. Since the conditional value at risk (CVaR) is a special case of the spectral risk measure, our algorithm solves portfolio selection problems with multiple CVaR constraints. In each step, the algorithm solves very simple separable convex quadratic programs; hence, we show that the spectral risk constrained portfolio selection problem can be solved using the technology developed for solving mean-variance problems. The algorithm extends to the case where the objective is a weighted sum of the mean return and either a weighted combination or the maximum of a set of spectral risk measures. We report numerical results that show that our proposed algorithm is very efficient; it is at least two orders of magnitude faster than the state-of-the-art general purpose solver for all practical instances. One can leverage this efficiency to be robust against model risk by including constraints with respect to several different risk models. 
Date:  2014–10 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1410.5328&r=rmg 
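CVaR itself, the special case the algorithm covers, has a simple empirical estimator; a minimal sketch (not the authors' gradient method):

```python
def cvar(losses, alpha=0.95):
    """Empirical CVaR / expected shortfall at level alpha: the average of
    the worst (1 - alpha) fraction of losses.  The spectral risk measures
    handled by the paper generalise this by weighting all quantiles."""
    s = sorted(losses)
    k = max(1, round((1 - alpha) * len(losses)))  # size of the tail
    return sum(s[-k:]) / k

# With ten scenario losses, CVaR at the 80% level averages the worst two:
es = cvar([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], alpha=0.80)  # (9 + 10) / 2
```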
By:  Bec, Frédérique; Gollier, Christian 
Abstract:  This paper explores empirically the link between stock returns' Value-at-Risk (VaR) and the state of financial markets across various holding horizons. The econometric analysis is based on a self-exciting threshold autoregression setup. Using quarterly French and US data from 1970Q4 to 2012Q4, it turns out that the k-year VaR of equities is actually dependent on the state of the market: the expected losses as measured by the VaR are smaller in a bear market than in a normal or bull market, whatever the horizon. These results suggest that the rules regarding solvency capital requirements should adapt to the state of the financial market. 
Keywords:  Expected equities returns, Value at Risk, Financial cycle, Investment horizon, Threshold Autoregression. 
JEL:  G11 
Date:  2014–09 
URL:  http://d.repec.org/n?u=RePEc:tse:wpaper:28462&r=rmg 
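In a self-exciting threshold autoregression, the coefficients switch according to the lagged value of the series itself. A minimal two-regime AR(1) sketch (illustrative only, not the authors' specification):

```python
def setar(y0, phi_low, phi_high, threshold, shocks):
    """Two-regime self-exciting threshold AR(1):
        y_t = phi_high * y_{t-1} + e_t   if y_{t-1} > threshold,
              phi_low  * y_{t-1} + e_t   otherwise.
    The regime is chosen by the series' own lagged value, which is what
    makes the threshold autoregression 'self-exciting'."""
    y = [y0]
    for e in shocks:
        phi = phi_high if y[-1] > threshold else phi_low
        y.append(phi * y[-1] + e)
    return y

# Starting at the threshold, the low regime applies; once the series
# crosses it, persistence switches to phi_high:
path = setar(0.0, phi_low=0.5, phi_high=0.9, threshold=0.0,
             shocks=[1.0, 0.0, 0.0])
```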
By:  Juliusz Jabłecki (Faculty of Economic Sciences, University of Warsaw; National Bank of Poland); Ryszard Kokoszczyński (Faculty of Economic Sciences, University of Warsaw; National Bank of Poland); Paweł Sakowski (Faculty of Economic Sciences, University of Warsaw); Robert Ślepaczuk (Faculty of Economic Sciences, University of Warsaw; Union Investment TFI S.A.); Piotr Wójcik (Faculty of Economic Sciences, University of Warsaw) 
Abstract:  The adjustment speed of a delta-hedged option exposure depends on realized and implied market volatility. We observe that by consistently hedging long and short positions in options we can eventually end up with pure exposure to volatility, without any options in the portfolio at all. The results of such an arbitrage strategy are based only on the speed of adjustment of delta-hedged option positions. More specifically, they rely on the interrelation between realized volatility levels calculated for various time intervals (from daily to intraday frequency). Theoretical intuition enables us to solve the puzzle of the optimal frequency of hedge adjustment and its influence on hedging efficiency. We present results of a simple hedge strategy based on the consistent hedging of a portfolio of options for various worldwide equity indices. 
Keywords:  options hedging efficiency, optimal hedging frequency, realized and implied volatility, index futures, investment strategies 
JEL:  G11 G14 G15 G23 C61 C22 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:war:wpaper:201427&r=rmg 
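The realized-versus-implied intuition behind such strategies can be sketched with the classic approximation that each rebalancing step of a delta-hedged short option earns roughly 0.5 * Gamma * S^2 * (sigma_impl^2 * dt - realised squared return). This is a textbook approximation, not the authors' strategy; all inputs below are illustrative.

```python
from math import log, sqrt
from statistics import NormalDist

_N = NormalDist()

def bs_gamma(S, K, tau, r, sigma):
    """Black-Scholes gamma of a European option."""
    d1 = (log(S / K) + (r + 0.5 * sigma * sigma) * tau) / (sigma * sqrt(tau))
    return _N.pdf(d1) / (S * sigma * sqrt(tau))

def hedged_pnl(path, K, T, r, sigma_impl, dt):
    """Approximate P&L of a short option delta-hedged at intervals dt:
    each step contributes 0.5 * Gamma * S^2 * (sigma_impl^2 * dt minus the
    realised squared return), so the sign tracks implied-minus-realised
    variance over the hedging horizon."""
    pnl, tau = 0.0, T
    for s0, s1 in zip(path, path[1:]):
        g = bs_gamma(s0, K, tau, r, sigma_impl)
        pnl += 0.5 * g * s0 * s0 * (sigma_impl ** 2 * dt
                                    - ((s1 - s0) / s0) ** 2)
        tau -= dt
    return pnl

# A flat path realises zero volatility, so the short hedged position gains:
gain = hedged_pnl([100.0] * 6, K=100.0, T=0.5, r=0.0,
                  sigma_impl=0.2, dt=0.1)
```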
By:  Balasubramanyan, Lakshmi (Federal Reserve Bank of Cleveland) 
Abstract:  In this paper, I attempt to amalgamate the study of leverage-ratio performance with the monitoring decisions of a profit-maximizing bank. Applying tools used in studying the industrial organization of banking, my paper serves as a first step to tying the performance differences between the leverage and risk-based constraints to the more fundamental issue of monitoring. Does a bank faced with a leverage-based capital constraint monitor its loans better than a bank under a risk-based capital constraint? In a market that is characterized by a dominant bank and fringe banks, I seek to understand whether the dominant bank monitors its loans when faced with a Basel III–style leverage ratio. The results show that under certain parameter ranges, the dominant bank will monitor its portfolio when faced with a leverage-based capital constraint. The results also show that the dominant bank will not monitor its portfolio when faced with a risk-based capital constraint. 
Keywords:  Differential capital requirements; dominant-bank model; bank loan monitoring 
JEL:  G2 
Date:  2014–10–02 
URL:  http://d.repec.org/n?u=RePEc:fip:fedcwp:1415&r=rmg 
By:  John Cotter; Enrique Salvador 
Abstract:  This study develops a multifactor framework in which not only market risk is considered but also potential changes in the investment opportunity set. Although previous studies find no clear evidence of a positive and significant relation between return and risk, favourable evidence can be obtained if a nonlinear relation is pursued. The positive and significant risk-return trade-off is essentially observed during low-volatility periods; this relationship is not obtained during periods of high volatility. Different patterns for the risk-premium dynamics in low- and high-volatility periods are also obtained, both in the prices of risk and in the market risk dynamics. 
Date:  2014–10 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1410.6005&r=rmg 
By:  Chris Brooks (ICMA Centre, Henley Business School, University of Reading); Adrian FernandezPerez (Auckland University of Technology); Joëlle Miffre (EDHEC Business School, France); Ogonna Nneji (ICMA Centre, Henley Business School, University of Reading) 
Abstract:  The article examines whether commodity risk is priced in the cross-section of equity returns. Alongside a long-only, equally-weighted portfolio of commodity futures, we employ as an alternative commodity risk factor a term structure portfolio that captures the propensity of commodity futures markets to be backwardated or contangoed. Equity-sorted portfolios with greater sensitivities to the two commodity risk factors command higher average returns. The two commodity portfolios are also found to explain part of the size, value and momentum anomalies. Conclusions regarding the pricing of the commodity risk factors are not an artifact driven by crude oil and are robust to the inclusion of financial and macroeconomic variables and to the addition of a composite leading indicator in the pricing model. 
Keywords:  Long-only commodity portfolio, term structure portfolio, commodity risk, cross-section of equity returns 
JEL:  G11 G13 
Date:  2014–09 
URL:  http://d.repec.org/n?u=RePEc:rdg:icmadp:icmadp201409&r=rmg 
By:  Benjamin HAMIDI; Bertrand MAILLET; Jean-Luc PRIGENT 
Keywords:  CPPI, VaR, Expected Shortfall, Expectile, Quantile Regression, Dynamic Quantile Model 
Date:  2013 
URL:  http://d.repec.org/n?u=RePEc:leo:wpaper:164&r=rmg 
By:  Silvia Muzzioli 
Abstract:  Corridor implied volatility is obtained from model-free implied volatility by truncating the integration domain between two barriers. Empirical evidence on volatility forecasting, in various markets, points to the utility of trimming the risk-neutral distribution of the underlying stock price in order to obtain unbiased measures of future realised volatility (see e.g. [9], [3]). The aim of the paper is to investigate, both in a statistical and in an economic setting, the optimal corridor of strike prices to use for volatility forecasting in the Italian market, by analysing a data set which covers the years 2005–2010 and spans both a relatively tranquil and a turmoil period. 
Keywords:  corridor implied volatility, model-free implied volatility, volatility forecasting, financial turmoil 
JEL:  G13 G14 
Date:  2013–12 
URL:  http://d.repec.org/n?u=RePEc:mod:dembwp:0029&r=rmg 
By:  Michael R. CARTER (University of Wisconsin); Alain de JANVRY (University of California, Berkeley); Elisabeth SADOULET (University of California, Berkeley); Alexandros SARRIS (University of Athens) 
Abstract:  Index-based weather insurance is a major institutional innovation that could revolutionize access to formal insurance for millions of smallholder farmers and related individuals. It has been introduced in pilot or experimental form in many countries at the individual or institutional level. Significant research efforts have been made to assess its impacts on shock coping and risk management, and to contribute to improvements in design and implementation. While impacts have typically been positive where uptake has occurred, uptake has generally been low and in most cases under conditions that were not sustainable. This paper addresses the reasons for this current discrepancy between promise and reality. We conclude with perspectives for improvements in product design, complementary interventions to boost uptake, and strategies for sustainable scaling-up of uptake. Specific recommendations include: (1) the first-order importance of reducing basis risk, pursuing to this end multiple technological, contractual, and institutional innovations; (2) the need for risk layering, combining the use of insurance, credit, savings, and risk-reducing investments to optimally address different categories of risk, with these various financial products offered in a coordinated fashion; (3) a role for state intervention on two fronts: the implementation of public certification standards for the maximum basis risk of insurance contracts, and "smart" subsidies for learning, data accumulation, initial reinsurance, and catastrophic risks; (4) the use of twin-track, institutional-level index insurance contracts combined with intra-institution distribution of payouts to reduce basis risk and improve the quality of insurance, for which credible intra-institutional rules for idiosyncratic transfers must be carefully designed; and (5) the need for further research on the determinants of behavior toward risk and insurance, on the design of index-based insurance products combined with other risk-handling financial instruments, and on rigorous impact analyses of ongoing programs and experiments. 
JEL:  O16 Q12 Q14 
Date:  2014–09 
URL:  http://d.repec.org/n?u=RePEc:fdi:wpaper:1799&r=rmg 
By:  Bruno Bouchard (CEREMADE – Centre de Recherches en Mathématiques de la Décision – CNRS: UMR 7534 – Université Paris IX Paris Dauphine; CREST – Centre de Recherche en Économie et Statistique – INSEE – École Nationale de la Statistique et de l'Administration Économique); Ludovic Moreau (Department of Mathematics, ETH Zürich – Swiss Federal Institute of Technology in Zurich); Mete Soner (Department of Mathematics, ETH Zürich – Swiss Federal Institute of Technology in Zurich) 
Abstract:  We consider the problem of option hedging in a market with proportional transaction costs. Since super-replication is very costly in such markets, we replace perfect hedging with an expected loss constraint. Asymptotic analysis for small transaction costs is used to obtain a tractable model. A general expansion theory is developed using the dynamic programming approach. Explicit formulae are also obtained in the special cases of an exponential or power loss function. As a corollary, we retrieve the asymptotics for the exponential utility indifference price. 
Keywords:  Expected loss constraint, hedging, transaction cost, asymptotic expansion. 
Date:  2013–09–19 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal00863562&r=rmg 