nep-rmg New Economics Papers
on Risk Management
Issue of 2013‒06‒04
thirteen papers chosen by
Stan Miles
Thompson Rivers University

  1. Hedging without sweat: a genetic programming approach By Terje Lensberg; Klaus Reiner Schenk-Hoppé
  2. Forecasting Value-at-Risk using Block Structure Multivariate Stochastic Volatility Models By Manabu Asai; Massimiliano Caporin; Michael McAleer
  3. Robust Portfolio Allocation with Systematic Risk Contribution Restrictions By Serge Darolles; Christian Gouriéroux; Emmanuelle Jay
  4. Systemic Risk Allocation for Systems with A Small Number of Banks By Xiao Qin; Chen Zhou
  5. GFC-Robust Risk Management under the Basel Accord using Extreme Value Methodologies By Juan-Angel Jimenez-Martin; Michael McAleer; Teodosio Perez Amaral; Paulo Araujo Santos
  6. An Accurate Solution for Credit Value Adjustment (CVA) and Wrong Way Risk By Xiao, Tim
  7. Risk Measure Estimation on FIEGARCH Processes By Taiane S. Prass; Sílvia R. C. Lopes
  8. The Impact of Default Dependency and Collateralization on Asset Pricing and Credit Risk Modeling By Xiao, Tim
  9. The systemic risk of energy markets By PIERRET, Diane
  10. Worldwide Equity Risk Prediction By David Ardia; Lennart F. Hoogerheide
  11. Survival of Hedge Funds: Frailty vs Contagion By Serge Darolles; Patrick Gagliardini; Christian Gouriéroux
  12. Fully Flexible Views in Multivariate Normal Markets By Attilio Meucci; David Ardia; Simon Keel
  13. Pricing Default Events: Surprise, Exogeneity and Contagion By Christian Gouriéroux; Alain Monfort; Jean-Paul Renne

  1. By: Terje Lensberg; Klaus Reiner Schenk-Hoppé
    Abstract: Hedging in the presence of transaction costs leads to complex optimization problems. These problems typically lack closed-form solutions, and their implementation relies on numerical methods that provide hedging strategies for specific parameter values. In this paper we use a genetic programming algorithm to derive explicit formulas for near-optimal hedging strategies under nonlinear transaction costs. The strategies are valid over a large range of parameter values and require no information about the structure of the optimal hedging strategy.
    Date: 2013–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1305.6762&r=rmg
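    Sketch: As a point of reference only, the following Python fragment simulates the much simpler textbook approach the paper improves on: Black-Scholes delta hedging with a fixed no-trade band under proportional transaction costs. It is not the paper's genetic-programming method, and every parameter value (option terms, band width, cost rate) is an illustrative assumption.
      import numpy as np
      from scipy.stats import norm

      # Illustrative parameters (assumptions, not taken from the paper)
      S0, K, r, sigma, T = 100.0, 100.0, 0.01, 0.2, 0.25
      n_steps, band, prop_cost = 63, 0.05, 0.002   # daily steps, no-trade band, proportional cost

      def bs_delta(S, K, r, sigma, tau):
          """Black-Scholes delta of a European call."""
          d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * tau) / (sigma * np.sqrt(tau))
          return norm.cdf(d1)

      rng = np.random.default_rng(0)
      dt = T / n_steps
      S = S0
      hedge = bs_delta(S0, K, r, sigma, T)        # initial hedge position
      costs = prop_cost * abs(hedge) * S0         # cost of setting up the hedge

      for i in range(1, n_steps):
          S *= np.exp((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * rng.standard_normal())
          target = bs_delta(S, K, r, sigma, T - i * dt)
          if abs(target - hedge) > band:          # rebalance only outside the no-trade band
              costs += prop_cost * abs(target - hedge) * S
              hedge = target

      print(f"final hedge ratio: {hedge:.3f}, cumulative transaction costs: {costs:.3f}")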
  2. By: Manabu Asai (Soka University, Japan); Massimiliano Caporin (University of Padova, Italy); Michael McAleer (Erasmus University Rotterdam, The Netherlands, Complutense University of Madrid, Spain, and Kyoto University, Japan)
    Abstract: Most multivariate variance or volatility models suffer from a common problem, the “curse of dimensionality”. For this reason, most are fitted under strong parametric restrictions that reduce the interpretation and flexibility of the models. Recently, the literature has focused on multivariate models with milder restrictions, whose purpose is to combine the need for interpretability and efficiency faced by model users with the computational problems that may emerge when the number of assets can be very large. We contribute to this strand of the literature by proposing a block-type parameterization for multivariate stochastic volatility models. The empirical analysis on stock returns on the US market shows that 1% and 5% Value-at-Risk thresholds based on one-step-ahead forecasts of covariances by the new specification are satisfactory for the period including the Global Financial Crisis.
    Keywords: block structures; multivariate stochastic volatility; curse of dimensionality; leverage effects; multi-factors; heavy-tailed distribution
    JEL: C32 C51 C10
    Date: 2013–05–27
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:2013073&r=rmg
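    Sketch: The final step of the exercise, turning a one-step-ahead covariance forecast into 1% and 5% VaR thresholds, is shown below under a Gaussian, zero-mean assumption. The covariance matrix and portfolio weights are made-up placeholders; the paper's block-structure multivariate stochastic volatility forecasts would take their place.
      import numpy as np
      from scipy.stats import norm

      # Hypothetical one-step-ahead covariance forecast and portfolio weights
      Sigma = np.array([[0.00040, 0.00012, 0.00008],
                        [0.00012, 0.00030, 0.00010],
                        [0.00008, 0.00010, 0.00025]])
      w = np.array([0.4, 0.4, 0.2])

      port_var = w @ Sigma @ w                    # forecast portfolio return variance
      port_vol = np.sqrt(port_var)

      for alpha in (0.01, 0.05):
          var_alpha = -norm.ppf(alpha) * port_vol  # Gaussian VaR, zero-mean returns assumed
          print(f"{int(alpha*100)}% one-day VaR: {var_alpha:.4%} of portfolio value")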
  3. By: Serge Darolles (Paris-Dauphine University and CREST); Christian Gouriéroux (CREST and University of Toronto); Emmanuelle Jay (Quantitative Asset Management Laboratory)
    Abstract: The standard mean-variance approach can imply extreme weights in some assets in the optimal allocation and a lack of stability of this allocation over time. To improve the robustness of the portfolio allocation, but also to better control the portfolio turnover and the sensitivity of the portfolio to systematic risk, we propose in this paper to introduce additional constraints on both the total systematic risk contribution of the portfolio and its turnover. Our paper extends the existing literature on risk parity in three directions: i) we consider risk criteria other than the variance, such as the Value-at-Risk (VaR) or the Expected Shortfall; ii) we manage separately the systematic and idiosyncratic components of the portfolio risk; iii) we introduce a set of portfolio management approaches which control for the degree of market neutrality of the portfolio, for the strength of the constraint on systematic risk contribution and for the turnover.
    Keywords: Asset Allocation, Portfolio Turnover, Risk Diversification, Minimum Variance Portfolio, Risk Parity Portfolio, Systematic Risk, Euler Allocation, Hedge Fund
    JEL: G12 C23
    Date: 2012–12
    URL: http://d.repec.org/n?u=RePEc:crs:wpaper:2012-35&r=rmg
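    Sketch: A minimal illustration of the volatility-based Euler decomposition behind the systematic risk contribution constraint, assuming a one-factor covariance Sigma = beta beta' sigma_F^2 + D with invented numbers. The paper also works with VaR and Expected Shortfall contributions, which this sketch does not cover.
      import numpy as np

      # Illustrative one-factor covariance: Sigma = beta beta' * sigma_F^2 + D
      beta    = np.array([1.2, 0.8, 0.5])          # factor loadings (assumptions)
      sigma_F = 0.04                               # factor volatility
      D       = np.diag([0.02, 0.03, 0.025])**2    # idiosyncratic variances
      Sigma_sys = np.outer(beta, beta) * sigma_F**2
      Sigma     = Sigma_sys + D

      w = np.array([0.5, 0.3, 0.2])
      port_vol = np.sqrt(w @ Sigma @ w)

      # Euler contributions: RC_i = w_i * (Sigma w)_i / port_vol, summing to port_vol
      rc_total = w * (Sigma @ w) / port_vol
      rc_sys   = w * (Sigma_sys @ w) / port_vol    # systematic part of each contribution

      print("total risk contributions :", np.round(rc_total, 5), "sum =", round(rc_total.sum(), 5))
      print("systematic contributions :", np.round(rc_sys, 5))
      print("systematic share of risk :", round(rc_sys.sum() / port_vol, 3))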
  4. By: Xiao Qin; Chen Zhou
    Abstract: This paper provides a new estimation method for the marginal expected shortfall (MES) based on multivariate extreme value theory. In contrast to previous studies, the method does not assume a specific dependence structure among bank equity returns and is applicable to both large and small systems. Furthermore, our MES estimator inherits the theoretical additive property. Thus, it serves as a tool to allocate systemic risk. We apply the proposed method to 29 global systemically important financial institutions (G-SIFIs) to evaluate the cross section and dynamics of the systemic risk allocation. We show that allocating systemic risk according to either size or individual risk is imperfect and can be unfair. Between the allocation with respect to individual risk and that with respect to size, the former is less unfair. On the time dimension, the fairness of both allocations across all the G-SIFIs has decreased since 2008.
    Keywords: Systemic risk allocation; marginal expected shortfall; systemically important financial institutions; extreme value theory
    JEL: G21 C14 G32
    Date: 2013–05
    URL: http://d.repec.org/n?u=RePEc:dnb:dnbwpp:378&r=rmg
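    Sketch: The fragment below computes the plain nonparametric marginal expected shortfall on simulated data and checks the additive property the authors exploit for allocation. Their estimator is based on extreme value theory and targets probabilities further in the tail; this is only the baseline version.
      import numpy as np

      rng = np.random.default_rng(1)
      n_banks, n_days = 5, 2000
      returns = rng.standard_t(df=4, size=(n_days, n_banks)) * 0.01   # fake bank returns
      weights = np.full(n_banks, 1.0 / n_banks)
      system  = returns @ weights                                     # system-wide return

      q = np.quantile(system, 0.05)            # 5% tail threshold of the system return
      tail = system <= q

      # MES_i: expected loss of bank i conditional on a system-wide tail event
      mes = -returns[tail].mean(axis=0)

      # Additive property: the weighted MES sums to the system's expected shortfall
      es_system = -system[tail].mean()
      print("MES per bank:", np.round(mes, 4))
      print("weighted sum of MES:", round(weights @ mes, 4), "  system ES:", round(es_system, 4))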
  5. By: Juan-Angel Jimenez-Martin (Complutense University of Madrid, Spain); Michael McAleer (Complutense University of Madrid, Spain, Erasmus School of Economics, Erasmus University Rotterdam, The Netherlands, and Kyoto University, Japan); Teodosio Perez Amaral (Complutense University of Madrid, Spain); Paulo Araujo Santos (University of Lisbon, Portugal)
    Abstract: In this paper we provide further evidence on the suitability of the median of the point VaR forecasts of a set of models as a GFC-robust strategy by using an additional set of new extreme value forecasting models and by extending the sample period for comparison. These extreme value models include DPOT and Conditional EVT. Such models might be expected to be useful in explaining financial data, especially in the presence of extreme shocks that arise during a GFC. Our empirical results confirm that the median remains GFC-robust even in the presence of these new extreme value models. This is illustrated by using the S&P500 index before, during and after the 2008-09 GFC. We investigate the performance of a variety of single and combined VaR forecasts in terms of daily capital requirements and violation penalties under the Basel II Accord, as well as other criteria, including several tests for independence of the violations. The strategy based on the median, or more generally, on combined forecasts of single models, is straightforward to incorporate into existing computer software packages that are used by banks and other financial institutions.
    Keywords: Value-at-Risk (VaR), DPOT, daily capital charges, robust forecasts, violation penalties, optimizing strategy, aggressive risk management, conservative risk management, Basel, global financial crisis
    JEL: G32 G11 G17 C53 C22
    Date: 2013–05–21
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:2013070&r=rmg
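    Sketch: The median-of-forecasts strategy itself is a one-line computation. The fragment below applies it to hypothetical VaR forecasts from several single models and counts violations against simulated returns; the Basel capital-charge and penalty calculations examined in the paper are omitted.
      import numpy as np

      rng = np.random.default_rng(2)
      n_days, n_models = 250, 6
      # Hypothetical 1% VaR forecasts from six single models (reported as positive losses)
      var_forecasts = 0.02 + 0.01 * rng.random((n_days, n_models))
      realized = rng.standard_t(df=5, size=n_days) * 0.012     # fake daily returns

      median_var = np.median(var_forecasts, axis=1)   # combined, GFC-robust forecast
      violations = realized < -median_var             # days when the loss exceeds the VaR

      print("violations of the median VaR:", int(violations.sum()), "out of", n_days, "days")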
  6. By: Xiao, Tim
    Abstract: This paper presents a new framework for credit value adjustment (CVA), which is a relatively new area of financial derivative modeling and trading. In contrast to previous studies, the model relies on the probability distribution of a default time/jump rather than the default time itself, as the default time is usually inaccessible. As such, the model can achieve a high order of accuracy with a relatively easy implementation. We find that the prices of risky contracts are normally determined via backward induction when their payoffs can be positive or negative. Moreover, the model can naturally capture wrong or right way risk.
    Keywords: credit value adjustment (CVA), wrong way risk, right way risk, credit risk modeling, risky valuation, default time approach (DTA), default probability approach (DPA), collateralization, margin and netting.
    JEL: E44 G12 G32 G33
    Date: 2013–05–01
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:47104&r=rmg
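    Sketch: For orientation, the fragment below evaluates the textbook unilateral CVA approximation, CVA ≈ (1-R) Σ_t D(t) EE(t) ΔPD(t), on an invented exposure profile. The paper's framework prices the wrong-way-risk case by backward induction and does not reduce to this formula; all inputs here are illustrative assumptions.
      import numpy as np

      # Illustrative inputs (assumptions): quarterly grid over 5 years
      t  = np.linspace(0.25, 5.0, 20)
      ee = 1_000_000 * np.sqrt(t) * np.exp(-0.3 * t)    # hypothetical expected exposure profile
      r, hazard, recovery = 0.02, 0.015, 0.4

      discount  = np.exp(-r * t)
      surv      = np.exp(-hazard * t)                   # survival probabilities at grid dates
      surv_prev = np.exp(-hazard * np.concatenate(([0.0], t[:-1])))
      marginal_pd = surv_prev - surv                    # default probability per bucket

      cva = (1 - recovery) * np.sum(discount * ee * marginal_pd)
      print(f"unilateral CVA: {cva:,.0f}")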
  7. By: Taiane S. Prass; Sílvia R. C. Lopes
    Abstract: We consider the Fractionally Integrated Exponential Generalized Autoregressive Conditional Heteroskedasticity process, denoted by FIEGARCH(p,d,q), introduced by Bollerslev and Mikkelsen (1996). We present a simulation study of the estimation of the risk measure $VaR_p$ on FIEGARCH processes. We consider the distribution function of the portfolio log-returns (univariate case) and the multivariate distribution function of the risk-factor changes (multivariate case). We also compare the performance of the risk measures $VaR_p$, $ES_p$ and MaxLoss for a portfolio composed of stocks of four Brazilian companies.
    Date: 2013–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1305.5238&r=rmg
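    Sketch: The three risk measures compared in the paper are straightforward to compute empirically. The fragment below evaluates VaR_p, ES_p and MaxLoss on a simulated heavy-tailed return series standing in for FIEGARCH output, with p taken as the tail probability (a convention choice).
      import numpy as np

      rng = np.random.default_rng(3)
      log_returns = rng.standard_t(df=4, size=5000) * 0.01   # stand-in for FIEGARCH simulations
      p = 0.05
      losses = -log_returns

      var_p   = np.quantile(losses, 1 - p)               # loss exceeded with probability p
      es_p    = losses[losses >= var_p].mean()            # average loss beyond VaR_p
      maxloss = losses.max()                              # worst simulated loss

      print(f"VaR_{p}: {var_p:.4f}   ES_{p}: {es_p:.4f}   MaxLoss: {maxloss:.4f}")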
  8. By: Xiao, Tim
    Abstract: This article presents a comprehensive framework for valuing financial instruments subject to credit risk and collateralization. In particular, we focus on the impact of default dependence on asset pricing, as correlated default risk is one of the most pervasive threats to financial markets. Some well-known risky valuation models in the markets can be viewed as special cases of this framework. We introduce the concept of comvariance (or comrelation) into the area of credit risk modeling to capture the default relationship among three or more parties. Accounting for default correlations and comrelations becomes important, especially during the credit crisis. Moreover, we find that collateralization works well for financial instruments subject to bilateral credit risk, but fails for ones subject to multilateral credit risk.
    Keywords: asset pricing; credit risk modeling; unilateral, bilateral, multilateral credit risk; collateralization; comvariance; comrelation; correlation.
    JEL: E44 G12 G21 G33
    Date: 2013–05–01
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:47136&r=rmg
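    Sketch: A minimal illustration of the third-order moment the paper builds on: the sample comvariance of three series driven by a skewed common shock, together with one possible standardization as a “comrelation”. The normalization used here is an assumption and may differ from the paper's definition.
      import numpy as np

      rng = np.random.default_rng(4)
      n = 50_000
      common = rng.exponential(1.0, n) - 1.0            # skewed common shock, mean zero
      x = 0.6 * common + rng.standard_normal(n)
      y = 0.5 * common + rng.standard_normal(n)
      z = 0.7 * common + rng.standard_normal(n)

      def comvariance(a, b, c):
          """Sample comvariance: mean of the product of the three centered series."""
          return np.mean((a - a.mean()) * (b - b.mean()) * (c - c.mean()))

      com = comvariance(x, y, z)
      # One possible standardization (an assumption, not necessarily the paper's):
      comrel = com / (np.std(x) * np.std(y) * np.std(z))
      print(f"comvariance: {com:.4f}   standardized comrelation: {comrel:.4f}")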
  9. By: PIERRET, Diane (Université catholique de Louvain, ISBA, Belgium)
    Abstract: This paper investigates the meaning of systemic risk in energy markets and proposes a methodology to measure it. Energy Systemic Risk is defined by the risk of an energy crisis raising the prices of all energy commodities with negative consequences for the real economy. Measures of the total cost (EnSysRISK) and the net impact (ΔMES) of an energy crisis on the rest of the economy are proposed. The measures are derived from the Marginal Expected Shortfall (MES) capturing the tail dependence between the asset and the energy market factor. The adapted MES accounts for causality and dynamic exposure to common latent factors. The methodology is applied to the European Energy Exchange and the DAX industrial index, where a minor decline in industrial productivity is observed from recent energy shocks.
    Keywords: energy crisis, factor models, marginal expected shortfall, market integration
    JEL: C32 C58 Q43
    Date: 2013–05–17
    URL: http://d.repec.org/n?u=RePEc:cor:louvco:2013018&r=rmg
  10. By: David Ardia; Lennart F. Hoogerheide
    Abstract: Various GARCH models are applied to daily returns of more than 1200 constituents of major stock indices worldwide. The value-at-risk forecast performance is investigated for different markets and industries, considering the test for correct conditional coverage using the false discovery rate (FDR) methodology. For most of the markets and industries we find the same two conclusions. First, an asymmetric GARCH specification is essential when forecasting the 95% value-at-risk. Second, for both the 95% and 99% value-at-risk it is crucial that the innovations’ distribution is fat-tailed (e.g., Student-t or – even better – a non-parametric kernel density estimate).
    Keywords: GARCH, value-at-risk, equity, worldwide, false discovery rate
    JEL: C11 C22 C52
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:lvl:lacicr:1312&r=rmg
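    Sketch: A simplified version of the backtesting step is shown below: a Kupiec test of correct unconditional coverage per series (the paper uses the conditional-coverage test) combined with the Benjamini-Hochberg false discovery rate correction across series. All violation data are simulated placeholders.
      import numpy as np
      from scipy.stats import chi2

      rng = np.random.default_rng(5)
      alpha_var, n_days, n_series = 0.05, 500, 200

      # Hypothetical VaR-violation indicators for many return series (placeholders)
      hits = rng.random((n_series, n_days)) < rng.uniform(0.04, 0.08, size=(n_series, 1))

      def kupiec_pvalue(hit_seq, alpha):
          """Kupiec likelihood-ratio test of correct unconditional coverage."""
          n, x = len(hit_seq), hit_seq.sum()
          pi_hat = max(x / n, 1e-10)
          lr = -2 * (x * np.log(alpha) + (n - x) * np.log(1 - alpha)
                     - x * np.log(pi_hat) - (n - x) * np.log(1 - pi_hat))
          return chi2.sf(lr, df=1)

      pvals = np.array([kupiec_pvalue(h, alpha_var) for h in hits])

      # Benjamini-Hochberg step-up procedure at FDR level 10%
      q = 0.10
      order = np.argsort(pvals)
      thresh = q * np.arange(1, n_series + 1) / n_series
      passed = pvals[order] <= thresh
      k = passed.nonzero()[0].max() + 1 if passed.any() else 0
      print(f"series rejected (incorrect coverage) at FDR {q:.0%}: {k} of {n_series}")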
  11. By: Serge Darolles (Paris-Dauphine University and CREST); Patrick Gagliardini (University of Lugano); Christian Gouriéroux (CREST and University of Toronto)
    Abstract: In this paper we examine the dependence between the liquidation risks of individual hedge funds. This dependence can result either from common exogenous shocks (shared frailty), or from contagion phenomena, which occur when an endogenous behaviour of a fund manager impacts the Net Asset Values of other funds. We introduce dynamic models able to distinguish between frailty and contagion phenomena, and test for the presence of such dependence effects, according to the age and management style of the fund. We demonstrate the empirical relevance of our approach by measuring the magnitudes of contagion and exogenous frailty in liquidation risk dependence in the TASS database. The empirical analysis is completed by stress-tests on portfolios of hedge funds.
    Keywords: Hedge Fund, Liquidation Correlation, Frailty, Contagion, Dynamic Count Model, Autoregressive Gamma Process, Systemic Risk, Stress-tests, Liquidation Swap, Funding Liquidity, Market Liquidity.
    JEL: G12 C23
    Date: 2012–11
    URL: http://d.repec.org/n?u=RePEc:crs:wpaper:2012-36&r=rmg
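    Sketch: To fix ideas on the two channels being separated, the fragment below simulates liquidation counts from a Poisson model whose intensity contains either a gamma frailty drawn independently each period or a contagion term in the lagged count. The paper's specification is richer (a dynamic autoregressive gamma frailty); in this toy version only contagion induces autocorrelation, while the i.i.d. frailty shows up as overdispersion.
      import numpy as np

      rng = np.random.default_rng(6)
      n_months, base = 240, 2.0

      def simulate_counts(frailty_scale=0.0, contagion=0.0):
          """Poisson counts with an optional i.i.d. gamma frailty and a lagged-count contagion term."""
          counts = np.zeros(n_months, dtype=int)
          for t in range(n_months):
              frailty = rng.gamma(shape=1.0, scale=frailty_scale) if frailty_scale > 0 else 0.0
              lam = base + frailty + contagion * (counts[t - 1] if t > 0 else 0)
              counts[t] = rng.poisson(lam)
          return counts

      for label, kw in [("frailty only", dict(frailty_scale=2.0)),
                        ("contagion only", dict(contagion=0.4))]:
          c = simulate_counts(**kw)
          # Contagion shows up as autocorrelation in counts; the i.i.d. frailty here does not
          acf1 = np.corrcoef(c[:-1], c[1:])[0, 1]
          print(f"{label}: mean {c.mean():.2f}, variance {c.var():.2f}, lag-1 autocorr {acf1:.2f}")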
  12. By: Attilio Meucci; David Ardia; Simon Keel
    Abstract: The Entropy Pooling approach in Meucci (2008) is a versatile, general framework to process market views in portfolio construction and generalized stress-tests in risk management. Here we present an efficient algorithm to implement Entropy Pooling with fully general views in multivariate normal markets. Then we discuss two applications. First, we use normal Entropy Pooling to estimate a market distribution consistent with the CAPM equilibrium, which improves on the “implied returns” a-la-Black and Litterman (1990) and can be used as the starting point for portfolio construction. Second, we use normal Entropy Pooling to process ranking signals for alpha-generation.
    Keywords: Portfolio construction, tactical allocation, Entropy Pooling, Kullback-Leibler, Black-Litterman, equilibrium prior, portfolios from sorts, ranking, alpha, signals, factor models, risk management
    JEL: C1 G11
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:lvl:lacicr:1311&r=rmg
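    Sketch: For the special case of views on expectations only, the entropy-minimizing posterior in a normal market has a closed form: the covariance is unchanged and the mean is projected onto the view constraint. The fragment below implements that update with an invented prior and pick matrix; the paper's algorithm handles fully general views on means and covariances.
      import numpy as np

      # Prior normal market (illustrative numbers)
      mu    = np.array([0.05, 0.03, 0.07])
      Sigma = np.array([[0.040, 0.010, 0.012],
                        [0.010, 0.030, 0.008],
                        [0.012, 0.008, 0.050]])

      # View: the first asset outperforms the third by 1% (pick matrix A, view value b)
      A = np.array([[1.0, 0.0, -1.0]])
      b = np.array([0.01])

      # Entropy-minimizing posterior under mean-only views: covariance is unchanged,
      # the mean is projected onto the view constraint in the Sigma^(-1) metric
      adj = Sigma @ A.T @ np.linalg.solve(A @ Sigma @ A.T, b - A @ mu)
      mu_post = mu + adj

      print("posterior means:", np.round(mu_post, 4))
      print("view satisfied :", np.isclose(A @ mu_post, b).all())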
  13. By: Christian Gouriéroux (CREST and University of Toronto); Alain Monfort (CREST, Banque de France and University of Maastricht); Jean-Paul Renne (Banque de France)
    Abstract: In order to derive closed-form expressions of the prices of credit derivatives, the standard models for credit risk usually price the default intensities but not the default events themselves. The default indicator is replaced by an appropriate prediction and the prediction error, that is the default-event surprise, is neglected. Our paper develops an approach to get closed-form expressions for the prices of credit derivatives written on multiple names without neglecting default-event surprises. The approach differs from the standard one, since the default counts cause the factor process under the risk-neutral probability Q, even if this is not the case under the historical probability. This implies that the default intensities under Q do not exist. A numerical illustration shows the potential magnitude of the mispricing when the surprise on credit events is neglected. We also illustrate the effect of the propagation of defaults on the prices of credit derivatives.
    Keywords: Credit Derivative, Default Event, Default Intensity, Frailty, Contagion, Mispricing
    Date: 2013–01
    URL: http://d.repec.org/n?u=RePEc:crs:wpaper:2013-03&r=rmg

This nep-rmg issue is ©2013 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.