nep-rmg New Economics Papers
on Risk Management
Issue of 2015‒03‒13
twelve papers chosen by
Stan Miles
Thompson Rivers University

  1. Filtered historical simulation Value-at-Risk models and their competitors By Gurrola-Perez, Pedro; Murphy, David
  2. A Quantization Approach to the Counterparty Credit Exposure Estimation By M. Bonollo; L. Di Persio; I. Oliva; A. Semmoloni
  3. A Directional Multivariate Value at Risk By Raúl Torres; Rosa E. Lillo; Henry Laniado
  4. Volatility of aggregate volatility and hedge funds returns By Agarwal, Vikas; Arisoy, Y. Eser; Naik, Narayan Y.
  5. Implementing Basel III through the Capital Requirements Directive (CRD) IV: leverage ratios and capital adequacy requirements By Ojo, Marianne
  6. Measuring Tail-Risk Cross-Country Exposures in the Banking Industry By Antonio Rubia Serrano; Lidia Sanchis-Marco
  7. Valuation of credit default swaps via Bessel bridges By Hernández del Valle Gerardo; Pacheco-González Carlos
  8. Game-theoretic approach to risk-sensitive benchmarked asset management By Amogh Deshpande; Saul D. Jacka
  9. The Dark Side of the Four-Eyes Principle – Evidence from Bank Lending By Brown, Martin; Schaller, Matthias; Westerfeld, Simone; Heusler, Markus
  10. Predictability of the daily high and low of the S&P 500 index By Jones, Clive
  11. Volatility-related exchange traded assets: An econometric investigation By Javier Mencía; Enrique Sentana
  12. Purchasing Term Life Insurance to Reach a Bequest Goal: Time-Dependent Case By Erhan Bayraktar; Virginia R. Young; David Promislow

  1. By: Gurrola-Perez, Pedro (Bank of England); Murphy, David (Bank of England)
    Abstract: Financial institutions have for many years sought measures which cogently summarise the diverse market risks in portfolios of financial instruments. This quest led institutions to develop Value-at-Risk (VaR) models for their trading portfolios in the 1990s. Subsequently, so-called filtered historical simulation VaR models have become popular tools due to their ability to incorporate information on recent market returns and thus produce risk estimates conditional on them. These estimates are often superior to the unconditional ones produced by the first generation of VaR models. This paper explores the properties of various filtered historical simulation models. We explain how these models are constructed and illustrate their performance, examining in particular how filtering transforms various properties of the return distribution. The procyclicality of filtered historical simulation models is also discussed and compared to that of unfiltered VaR. A key consideration in the design of risk management models is whether the model’s purpose is simply to estimate some percentile of the return distribution, or whether its aims are broader. We discuss this question and relate it to the design of the model testing framework. Finally, we discuss some recent developments in the filtered historical simulation paradigm and draw some conclusions about the use of models in this tradition for the estimation of initial margin requirements.
    Keywords: Value-at-Risk; filtered historical simulation; conditional volatility; volatility scaling; risk model backtesting
    JEL: C58 G18 G32
    Date: 2015–03–06
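The filtering step the abstract describes can be sketched in a few lines: devolatilise each historical return by a conditional-volatility estimate (an EWMA here), rescale by today's volatility, and read the VaR off the empirical quantile of the scaled losses. This is a minimal illustration, not the authors' implementation; the lambda = 0.94 decay and the toy two-regime return series are assumptions.

```python
import math
import random

def ewma_vol(returns, lam=0.94):
    """RiskMetrics-style EWMA volatility: one estimate per observation,
    plus the current (end-of-sample) conditional volatility."""
    var = sum(r * r for r in returns) / len(returns)  # seed with sample variance
    vols = []
    for r in returns:
        vols.append(math.sqrt(var))
        var = lam * var + (1 - lam) * r * r
    return vols, math.sqrt(var)

def fhs_var(returns, alpha=0.99, lam=0.94):
    """Filtered historical simulation VaR: scale each historical return by
    (current vol / vol on its date), then take the alpha loss quantile."""
    vols, current_vol = ewma_vol(returns, lam)
    scaled_losses = sorted(-r * current_vol / v for r, v in zip(returns, vols))
    idx = min(len(scaled_losses) - 1, int(alpha * len(scaled_losses)))
    return scaled_losses[idx]

random.seed(0)
# toy series: a quiet regime followed by a volatile one (assumed, for illustration)
rets = [random.gauss(0, 0.01) for _ in range(250)] + \
       [random.gauss(0, 0.03) for _ in range(250)]
var_99 = fhs_var(rets, alpha=0.99)
```

Because the scaling conditions on the recent volatility regime, the estimate tracks the turbulent second half of the sample rather than averaging over the whole history, which is the conditional-versus-unconditional distinction the abstract draws.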
  2. By: M. Bonollo; L. Di Persio; I. Oliva; A. Semmoloni
    Abstract: During recent years the counterparty risk subject has received growing attention because of the so-called Basel Accords. In particular, the Basel III Accord requires banks to fulfill finer conditions concerning counterparty credit exposures arising from banks' derivatives, securities financing transactions, and the default and downgrade risks characterizing the Over The Counter (OTC) derivatives market. Consequently, the development of more effective and accurate risk measures has been pushed forward, focusing particularly on estimating the future fair value of derivatives over a prescribed time horizon and a fixed grid of time buckets. Standard methods for the latter scenario are mainly based on ad hoc implementations of the classic Monte Carlo (MC) approach, which is characterized by a high computational time that depends strongly on the number of assets considered. This is why many financial players have moved to more advanced technologies, e.g., grid computing and Graphics Processing Units (GPUs). In this paper we show how to implement the quantization technique in order to accurately estimate both pricing and volatility values. Our approach is tested to produce effective results for counterparty risk evaluation, with a substantial improvement in running time compared to the MC approach.
    Date: 2015–03
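As a rough illustration of the quantization idea (not the authors' algorithm), the sketch below runs a one-dimensional Lloyd's algorithm to compress a simulated terminal-price distribution into a small grid of points with weights, then estimates an expected positive exposure as a weighted sum over the grid instead of an average over all Monte Carlo draws. The lognormal dynamics and every parameter are assumptions.

```python
import math
import random

def lloyd_quantize(samples, n_points=20, iters=20):
    """1-D Lloyd's algorithm: return an n_points grid and weights approximating
    the empirical distribution of `samples` with low quadratic distortion."""
    samples = sorted(samples)
    n = len(samples)
    grid = [samples[int((i + 0.5) * n / n_points)] for i in range(n_points)]
    cells = [[] for _ in grid]
    for _ in range(iters):
        cells = [[] for _ in grid]
        for x in samples:                      # assign to nearest grid point
            j = min(range(n_points), key=lambda k: abs(x - grid[k]))
            cells[j].append(x)
        # move each grid point to the centroid of its cell
        grid = [sum(c) / len(c) if c else g for c, g in zip(cells, grid)]
    weights = [len(c) / n for c in cells]
    return grid, weights

random.seed(1)
s0, vol, t, strike = 100.0, 0.2, 1.0, 100.0    # assumed lognormal dynamics
prices = [s0 * math.exp(-0.5 * vol * vol * t + vol * math.sqrt(t) * random.gauss(0, 1))
          for _ in range(5000)]
grid, w = lloyd_quantize(prices)
# expected positive exposure of a forward: weighted sum over 20 points vs full MC average
epe_quant = sum(wi * max(g - strike, 0.0) for g, wi in zip(grid, w))
epe_mc = sum(max(p - strike, 0.0) for p in prices) / len(prices)
```

Once the grid is built it can be reused to revalue many payoffs as 20-term sums, which is the source of the running-time gain over repeated full Monte Carlo averaging.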
  3. By: Raúl Torres; Rosa E. Lillo; Henry Laniado
    Abstract: In economics, insurance and finance, value at risk (VaR) is a widely used measure of the risk of loss on a specific portfolio of financial assets. For a given portfolio, time horizon, and probability α, the 100α% VaR is defined as a threshold loss value, such that the probability that the loss on the portfolio over the given time horizon exceeds this value is α. That is to say, it is a quantile of the distribution of the losses, which has both good analytic properties and easy interpretation as a risk measure. However, its extension to the multivariate framework is not unique because a unique definition of multivariate quantile does not exist. In the current literature, the multivariate quantiles are related to a specific partial order considered in R^n, or to a property of the univariate quantile that is desirable to be extended to R^n. In this work, we introduce a multivariate value at risk as a vector-valued directional risk measure, based on a directional multivariate quantile, which has recently been introduced in the literature. The directional approach allows the manager to consider external information or risk preferences in her/his analysis. We have derived some properties of the risk measure and we have compared the univariate VaR over the marginals with the components of the directional multivariate VaR. We have also analyzed the relationship between some families of copulas, for which it is possible to obtain closed forms of the multivariate VaR that we propose. Finally, comparisons with other alternative multivariate VaR given in the literature are provided in terms of robustness.
    Date: 2015–01
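The univariate definition the abstract starts from is just an empirical loss quantile, and the direction-dependence can be illustrated by aggregating losses along a chosen direction before taking the quantile. This is a deliberately simplified sketch, not the directional multivariate quantile the authors develop; the two-factor loss model and the direction u are assumptions.

```python
import random

def var_alpha(losses, alpha=0.95):
    """Empirical VaR: the alpha-quantile of the loss distribution."""
    s = sorted(losses)
    return s[min(len(s) - 1, int(alpha * len(s)))]

random.seed(2)
# two loss series driven by a common factor (an assumed toy model)
common = [random.gauss(0, 1) for _ in range(10000)]
l1 = [c + random.gauss(0, 0.5) for c in common]
l2 = [0.5 * c + random.gauss(0, 1.0) for c in common]

v1, v2 = var_alpha(l1), var_alpha(l2)       # univariate VaR of each marginal
u = (0.5, 0.5)                              # a chosen direction / weighting
v_dir = var_alpha([u[0] * a + u[1] * b for a, b in zip(l1, l2)])
# v_dir depends on u: rotating the direction changes the reported risk, which is
# the intuition behind a direction-dependent multivariate VaR
```

With equal weights the directional figure comes out below both marginal VaRs because the idiosyncratic noise partly diversifies away; a direction loading entirely on one marginal would recover that marginal's univariate VaR.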
  4. By: Agarwal, Vikas; Arisoy, Y. Eser; Naik, Narayan Y.
    Abstract: This paper investigates empirically whether uncertainty about the expected returns on the market portfolio can explain the performance of hedge funds both in the cross-section and over time. We measure uncertainty via volatility of aggregate volatility (VOV) and construct an investable version of this measure by computing monthly returns on lookback straddles written on the VIX index. We find that VOV exposure is a significant determinant of hedge fund returns at the overall index level, at different strategy levels, and at an individual fund level. We find that funds with low (more negative) VOV betas outperform funds with high VOV betas by 1.62% per month. After controlling for a large set of fund characteristics, we document a robust and significant negative risk premium for VOV exposure in the cross-section of hedge fund returns. We further show that strategies with less negative VOV betas outperform their counterparts during the financial crisis period when uncertainty about expected returns was at its highest. On the contrary, strategies with more negative VOV betas generate superior returns when uncertainty in the market is lower. Furthermore, the variation in the VOV betas is consistent with the risk-taking incentives of hedge funds arising from their different fund characteristics, including their contractual features.
    Keywords: uncertainty,volatility of volatility,hedge funds,performance
    JEL: G10 G11 C13
    Date: 2015
  5. By: Ojo, Marianne
    Abstract: The Capital Requirements Directive (CRD) IV, which comprises the Capital Requirements Regulation (CRR) and the Capital Requirements Directive (CRD), is aimed at implementing Basel III in the European Union. Consequently, this CRD package replaces Directives 2006/48 and 2006/49 with a Regulation and a Directive. The significance of such a move not only highlights the awareness of the importance of ensuring that Basel rules and regulations become more binding and enforceable, but also signals an era whereby enforcement and supervisory tools such as Binding Technical Standards (BTS) are being introduced and generated by the European Banking Authority, as it plays a crucial role in the implementation of Basel III in the EU. Another significance of this move towards Basel rules becoming more enforceable and binding lies in the greater consistency, convergence and compliance which the introduction of a Regulation, Binding Technical Standards, and other reporting requirements and provisions would generate in the implementation process. The increased relevance of Basel rules, and particularly Basel III rules, as well as their significance for the Eurozone, European Union institutions and European banks, is hereby emphasised. This paper also provides an analysis of the recent updates to the Basel III Leverage Ratio and the Basel III Supplementary Leverage Ratio, covering both recent amendments introduced by the Basel Committee and proposals introduced in the United States.
As well as highlighting and addressing gaps in the literature relating to liquidity risks, corporate governance and information asymmetries, by way of reference to predominantly dispersed ownership systems and structures as well as concentrated ownership systems and structures, this paper also considers the consequences and impact which Basel III, and in particular the recent Basel leverage ratios, could have on the Eurozone and European financial institutions. From this perspective, the rise of macroeconomics and microeconomic inefficiency debates, as well as the validity of such debates, will be considered.
    Keywords: Basel III; Capital Requirements Directive IV; European Banking Authority; enforcement; supervision; Binding Technical Standards; Keynesian revolution; macroeconomics; microeconomic inefficiency
    JEL: D8 E3 E6 G2 G3 K2 M4
    Date: 2015–03–06
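For orientation, the leverage ratio discussed here is a simple non-risk-weighted quotient: Tier 1 capital over a total exposure measure, with a 3% Basel III minimum and higher thresholds proposed in the United States. The figures below are hypothetical.

```python
def leverage_ratio(tier1_capital, total_exposure):
    """Basel III leverage ratio: Tier 1 capital over the total exposure measure
    (on- plus off-balance-sheet), expressed as a percentage."""
    return 100.0 * tier1_capital / total_exposure

# hypothetical bank: 50bn of Tier 1 capital against 1.2tn of exposures
lr = leverage_ratio(50e9, 1.2e12)
meets_basel_minimum = lr >= 3.0   # Basel III minimum leverage ratio
meets_us_eslr = lr >= 5.0         # proposed US enhanced supplementary leverage
                                  # ratio for the largest holding companies
```

Because the denominator is unweighted total exposure rather than risk-weighted assets, the ratio acts as a backstop to the risk-based capital adequacy requirements the paper discusses.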
  6. By: Antonio Rubia Serrano (Universidad de Alicante); Lidia Sanchis-Marco (Dpto. Análisis Económico y Finanzas)
    Abstract: In this paper we analyze state-dependent risk spillovers across different economic areas. To this end, we apply the quantile regression-based methodology developed in Adams, Füss and Gropp (2014) to examine spillovers in the conditional tails of daily returns on banking-industry indices in the US, the BRICs, the peripheral EMU, the core EMU, Scandinavia, the UK and Emerging Markets. This methodology allows us to characterize the size, direction and strength of financial contagion in a network of bilateral exposures, in order to address cross-border vulnerabilities under different states of the economy. The general evidence shows that spillover effects are larger and more significant in volatile periods than in tranquil ones. There is evidence of tail spillovers, much of which is attributable to spillover from the US to the rest of the analyzed regions, especially the European countries. In sharp contrast, the US banking system shows more financial resilience against foreign shocks.
    Keywords: Spillover effects, Bank contagion, SDSVaR, Expected Shortfall, VaR, Expectiles.
    JEL: C23 G15 Q43
    Date: 2015–02
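A crude stand-in for the state-dependent tail-spillover idea (much simpler than the SDSVaR quantile regressions the paper applies) is to compare a region's empirical tail quantile unconditionally with the same quantile computed only on days when another region is in its own lower tail. The synthetic return model below is an assumption.

```python
import random

def quantile(xs, q):
    """Empirical q-quantile of a sample."""
    s = sorted(xs)
    return s[min(len(s) - 1, int(q * len(s)))]

random.seed(3)
# synthetic daily returns whose tails co-move through a common shock (assumed)
shock = [random.gauss(0, 1) for _ in range(20000)]
us = [0.8 * s + random.gauss(0, 0.6) for s in shock]
eu = [0.6 * s + random.gauss(0, 0.8) for s in shock]

var_uncond = -quantile(eu, 0.05)             # unconditional 5% VaR of the EU index
us_tail = quantile(us, 0.05)                 # the US index's own 5% tail threshold
eu_in_us_stress = [e for u, e in zip(us, eu) if u <= us_tail]
var_cond = -quantile(eu_in_us_stress, 0.05)  # EU VaR conditional on US stress
# var_cond exceeding var_uncond signals a tail spillover from the US to the EU index
```

The gap between the conditional and unconditional tail quantiles is a rough measure of spillover strength; the quantile-regression machinery in the paper estimates this dependence continuously across states rather than by crude tail conditioning.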
  7. By: Hernández del Valle Gerardo; Pacheco-González Carlos
    Abstract: A credit default swap (CDS) is a financial contract in which the holder of the instrument will be compensated in the event of a loan default. When available, CDSs are used to monitor the credit risk of countries and companies. In this work we develop a closed-form procedure to value a CDS in the case in which the so-called "credit rate index" is modelled as a Bessel bridge of arbitrary order. In particular, these processes seem to capture the nature of a defaultable asset in the sense that they remain strictly positive before default, and thus enrich the existing literature in this field.
    Keywords: Credit default swap, Bessel bridge, hitting time, defaultable bond.
    JEL: G0 G1
    Date: 2014–12
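The Bessel-bridge valuation itself does not reduce to a few lines, but the basic CDS pricing identity it feeds into does: the fair spread equates the discounted protection leg to the premium leg. Under a flat hazard rate (a standard reduced-form illustration, not the authors' model, with all parameters assumed) this collapses to the familiar "credit triangle" spread of (1 - recovery) times the hazard rate, as the sketch confirms numerically.

```python
import math

def cds_fair_spread(hazard, recovery, maturity, rate, steps=1000):
    """Fair CDS spread with a flat hazard rate: the spread that equates the
    discounted protection leg to the premium leg (midpoint rule)."""
    dt = maturity / steps
    protection, annuity = 0.0, 0.0
    for i in range(steps):
        t = (i + 0.5) * dt
        df_surv = math.exp(-rate * t) * math.exp(-hazard * t)  # discount x survival
        protection += df_surv * (1.0 - recovery) * hazard * dt  # expected default payout
        annuity += df_surv * dt                                 # risky premium annuity
    return protection / annuity

s = cds_fair_spread(hazard=0.02, recovery=0.4, maturity=5.0, rate=0.01)
triangle = (1.0 - 0.4) * 0.02   # the credit-triangle shortcut: (1 - R) * hazard
```

Replacing the flat hazard with default times driven by a strictly positive process such as a Bessel bridge changes the survival probabilities inside the two legs, which is where the paper's closed forms enter.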
  8. By: Amogh Deshpande; Saul D. Jacka
    Abstract: In this article we consider a game-theoretic approach to the Risk-Sensitive Benchmarked Asset Management problem (RSBAM) of Davis and Lleo. In particular, we consider a stochastic differential game between two players: the investor, who has a power utility, and the market, which tries to minimize the expected payoff of the investor. The market does this by modulating a stochastic benchmark that the investor needs to outperform. We obtain an explicit expression for the optimal pair of strategies for both players.
    Date: 2015–03
  9. By: Brown, Martin; Schaller, Matthias; Westerfeld, Simone; Heusler, Markus
    Abstract: In small business lending the four-eyes principle leads loan officers to propose inflated credit ratings for their clients. Inflated ratings are, however, anticipated and corrected by the credit officers responsible for approving credit assessments. More experienced loan officers inflate those parameters of a credit rating which are least likely to be corrected by credit officers. Our analysis is based on administrative data covering 10,568 internal ratings for 3,661 small business clients at 6 retail banks. Our results provide empirical support to theories suggesting that internal control can induce strategic communication of information in organizations when decision proposers and decision makers have diverging interests. Our findings also point to the limits of the four-eyes principle as a risk-management tool in financial institutions.
    Keywords: Four-Eyes Principle, Authority, Information, Small Business Lending
    JEL: D23 G21 G34 L20 M2
    Date: 2015–02
  10. By: Jones, Clive
    Abstract: Ratios involving the current period opening price and the high or low price of the previous period are significant predictors of the current period high or low price for many stocks and stock indexes. This is illustrated with daily trading data from the S&P 500 index. Regressions specifying these “proximity variables” have higher explanatory and predictive power than benchmark autoregressive and “no change” models. This is shown with out-of-sample comparisons of MAPE, MSE, and the proportion of time models predict the correct direction or sign of change of daily high and low stock prices. In addition, predictive models incorporating these proximity variables show time varying effects over the study period, 2000 to February 2015. This time variation looks to be more than random and probably relates to investor risk preferences and changes in the general climate of investment risk.
    Keywords: predictability of stock prices, time varying parameters, proximity variable method for predicting stock prices, accuracy of proximity variable method compared with autoregressive and benchmark forecasts
    JEL: C32 C58 G11 G17
    Date: 2015–03–01
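The "proximity variable" idea can be illustrated with a toy regression: take the ratio of today's open to yesterday's high as the predictor and the ratio of today's high to today's open as the target. The synthetic bars below are an assumed data-generating process, not the S&P 500 data used in the paper; they are built so that the high co-moves with yesterday's high, and the fitted slope comes out negative: opens already sitting close to the previous high leave less room to run.

```python
import random

def ols(xs, ys):
    """One-variable OLS; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return my - slope * mx, slope

random.seed(4)
prev_highs, opens, highs = [], [], []
h = 100.0
for _ in range(1000):
    o = h * (1 + random.gauss(-0.002, 0.005))        # open gaps relative to prior high
    new_h = max(o, h) * (1 + abs(random.gauss(0, 0.004)))
    prev_highs.append(h); opens.append(o); highs.append(new_h)
    h = new_h

x = [o / ph for o, ph in zip(opens, prev_highs)]     # proximity variable
y = [hi / o for hi, o in zip(highs, opens)]          # today's high relative to open
intercept, slope = ols(x, y)
# slope < 0: an open already near yesterday's high predicts a smaller further
# gain up to today's high, on average
```

The paper's out-of-sample comparisons against autoregressive and "no change" benchmarks amount to asking whether regressions like this one, refit over time, predict the high and low more accurately than those baselines.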
  11. By: Javier Mencía (Bank of Spain); Enrique Sentana (CEMFI, Centro de Estudios Monetarios y Financieros)
    Abstract: We compare Semi-Nonparametric expansions of the Gamma distribution with alternative Laguerre expansions, showing that they substantially widen the range of feasible moments of positive random variables. Then, we combine those expansions with a component version of the Multiplicative Error Model to capture the mean reversion typical in positive but stationary financial time series. Finally, we carry out an empirical application in which we compare various asset allocation strategies for Exchange Traded Notes tracking VIX futures indices, which are increasingly popular but risky financial instruments. We show the superior performance of the strategies based on our econometric model.
    Keywords: Density expansions, exchange traded notes, multiplicative error model, volatility index futures.
    JEL: G13 C16
    Date: 2015–02
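The Multiplicative Error Model the authors build on has a simple simulation form: a positive series x_t = mu_t * eps_t with a GARCH-like recursion for mu_t and unit-mean positive innovations. The sketch below simulates a plain one-component MEM with Gamma innovations; the parameters are assumptions, and the Laguerre/semi-nonparametric density expansions of the paper are not attempted here.

```python
import random

def simulate_mem(n, omega=0.05, alpha=0.1, beta=0.85, shape=4.0, seed=5):
    """One-component Multiplicative Error Model: x_t = mu_t * eps_t with
    mu_t = omega + alpha * x_{t-1} + beta * mu_{t-1} and
    eps_t ~ Gamma(shape, scale=1/shape), so that E[eps_t] = 1."""
    rng = random.Random(seed)
    mu = omega / (1.0 - alpha - beta)   # start at the unconditional mean
    x = mu
    path = []
    for _ in range(n):
        mu = omega + alpha * x + beta * mu   # conditional mean recursion
        x = mu * rng.gammavariate(shape, 1.0 / shape)
        path.append(x)
    return path

xs = simulate_mem(20000)
mean_x = sum(xs) / len(xs)   # mean-reverts to omega / (1 - alpha - beta) = 1.0
```

Because alpha + beta < 1, the series is positive but stationary and mean-reverting, which is exactly the behaviour of the VIX-futures-linked series the paper targets; the expansions then flexibly reshape the innovation density beyond the plain Gamma used here.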
  12. By: Erhan Bayraktar; Virginia R. Young; David Promislow
    Abstract: We consider the problem of how an individual can use term life insurance to maximize the probability of reaching a given bequest goal, an important problem in financial planning. We assume that the individual buys instantaneous term life insurance with a premium payable continuously. By contrast with Bayraktar et al. (2014), we allow the force of mortality to vary with time, which, as we show, greatly complicates the problem.
    Date: 2015–03

This nep-rmg issue is ©2015 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.