nep-rmg New Economics Papers
on Risk Management
Issue of 2015‒12‒01
thirteen papers chosen by
Stan Miles
Thompson Rivers University

  1. A Stochastic Dominance Approach to the Basel III Dilemma: Expected Shortfall or VaR? By Chia-Lin Chang; Juan-Ángel Jiménez-Martín; Esfandiar Maasoumi; Michael McAleer; Teodosio Pérez-Amaral
  2. Down-side Risk Metrics as Portfolio Diversification Strategies across the GFC By David E. Allen; Michael McAleer; Robert J. Powell; Abhay K. Singh
  3. Joint inference on market and estimation risks in dynamic portfolios By Francq, Christian; Zakoian, Jean-Michel
  4. Gold–oil prices co-movements and portfolio diversification implications By Chkili, Walid
  5. Loss-Deviation risk measures By Marcelo Brutti Righi
  6. The coming breakthrough in risk research By Jaeger, Carlo
  7. Correlated Defaults of UK Banks: Dynamics and Asymmetries By Mario Cerrato; John Crosby; Minjoo Kim; Yang Zhao
  8. An Application of Correlation Clustering to Portfolio Diversification By Hannah Cheng Juan Zhan; William Rea; Alethea Rea
  9. Expected returns and idiosyncratic risk: Industry-level evidence from Russia By Kinnunen, Jyri; Martikainen, Minna
  10. Upside and Downside Risks in Momentum Returns By Victoria Dobrynskaya
  11. Quantile-based inference and estimation of heavy-tailed distributions By Yves Dominicy
  12. Double Bank Runs and Liquidity Risk Management By Ippolito, Filippo; Peydró, José Luis; Polo, Andrea; Sette, Enrico
  13. Tree-based censored regression with applications to insurance By Olivier Lopez; Xavier Milhaud; Pierre-Emmanuel Thérond

  1. By: Chia-Lin Chang (Department of Applied Economics and Department of Finance, National Chung Hsing University, Taichung, Taiwan); Juan-Ángel Jiménez-Martín (Departamento de Fundamentos del Análisis Económico II (Economía Cuantitativa), Universidad Complutense de Madrid); Esfandiar Maasoumi (Department of Economics, Emory University, USA); Michael McAleer (Department of Quantitative Finance, National Tsing Hua University, Taiwan); Teodosio Pérez-Amaral (Departamento de Fundamentos del Análisis Económico II (Economía Cuantitativa), Universidad Complutense de Madrid)
    Abstract: The Basel Committee on Banking Supervision (BCBS) (2013) recently proposed shifting the quantitative risk metrics system from Value-at-Risk (VaR) to Expected Shortfall (ES). The BCBS (2013) noted that “a number of weaknesses have been identified with using VaR for determining regulatory capital requirements, including its inability to capture tail risk” (p. 3). For this reason, the Basel Committee is considering the use of ES, which is a coherent risk measure and has already become common in the insurance industry, though not yet in the banking industry. While ES is mathematically superior to VaR in that it does not suffer from “tail risk” and is a coherent risk measure in being subadditive, its practical implementation and heavier computational requirements may pose operational challenges to financial firms. Moreover, previous empirical findings based only on means and standard deviations suggested that VaR and ES were very similar in most practical cases, while ES could be less precise because of its larger variance. In this paper we find that ES is computationally feasible using personal computers and, contrary to previous research, we show that there is a stochastic difference between the 97.5% ES and the 99% VaR. In the Gaussian case they are similar but not equal, while in other cases they can differ substantially: with fat-tailed conditional distributions, the 97.5% ES can on the one hand imply higher risk forecasts and, on the other, provide smaller downside risk forecasts than the 99% VaR. The empirical results generally support the proposals of the Basel Committee.
    Keywords: Stochastic dominance, Value-at-Risk, Expected Shortfall, Optimizing strategy, Basel III Accord.
    JEL: G32 G11 G17 C53 C22
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:ucm:doicae:1516&r=rmg
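    Code sketch: a minimal Python illustration (not the authors' procedure) of the quantities being compared: the historical 99% VaR and 97.5% ES of a simulated distribution, Gaussian versus fat-tailed Student-t; the sample size and distributions are illustrative assumptions.
      # Illustrative only: compare 99% VaR with 97.5% ES on simulated losses.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 1_000_000
      samples = {
          "Gaussian":     rng.standard_normal(n),
          "Student-t(3)": rng.standard_t(3, n) / np.sqrt(3.0),  # unit variance
      }
      for name, losses in samples.items():
          var99 = np.quantile(losses, 0.99)                            # 99% VaR
          es975 = losses[losses >= np.quantile(losses, 0.975)].mean()  # 97.5% ES
          print(f"{name:13s}  99% VaR = {var99:.3f}   97.5% ES = {es975:.3f}")
    Consistent with the abstract, the Gaussian pair comes out nearly equal (about 2.33 versus 2.34), while the fat-tailed pair differs noticeably.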
  2. By: David E. Allen (School of Mathematics and Statistics, the University of Sydney, and Center for Applied Financial Studies, University of South Australia); Michael McAleer (Department of Quantitative Finance, National Tsing Hua University, Taiwan); Robert J. Powell (School of Accounting, Finance and Economics, Edith Cowan University, Australia); Abhay K. Singh (School of Accounting, Finance and Economics, Edith Cowan University, Australia)
    Abstract: This paper analyses the effectiveness of a range of portfolio diversification strategies, with a focus on downside risk metrics, in a European market context. We apply these measures to a set of daily arithmetically compounded returns on ten market indices representing the major European markets over a nine-year period from the beginning of 2005 to the end of 2013. The sample period, which incorporates both the Global Financial Crisis (GFC) and the subsequent European Debt Crisis (EDC), is a challenging one for the application of portfolio investment strategies. The analysis is undertaken by examining multiple investment strategies across a variety of hold-out periods and back-tests. We commence by using four two-year estimation periods, each followed by a one-year investment hold-out period, to analyse a naive 1/N diversification strategy and to contrast its effectiveness with Markowitz mean-variance analysis with positive weights. Markowitz optimisation is then compared with various downside investment optimisation strategies: we first compare Markowitz with CVaR, and then evaluate the relative effectiveness of Markowitz against various draw-down strategies, utilising a series of back-tests. Our results suggest that none of the more sophisticated optimisation strategies dominates naive diversification.
    Keywords: Portfolio Diversification, Markowitz Analysis, Downside Risk, CVaR, Draw-down.
    JEL: G11 C61
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:ucm:doicae:1519&r=rmg
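    Code sketch: an illustrative back-test in the same spirit (not the paper's code): naive 1/N versus long-only minimum-variance weights, estimated on a two-year window and evaluated on a one-year hold-out, using simulated index returns; all parameter values below are assumptions for the example.
      # Illustrative only: 1/N vs long-only minimum-variance on fake data.
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(1)
      n_assets, est_days, hold_days = 10, 504, 252          # 2y fit, 1y hold
      cov_true = 1e-4 * (0.5 * np.eye(n_assets) + 0.5)      # one-factor-like
      rets = rng.multivariate_normal(np.zeros(n_assets), cov_true,
                                     est_days + hold_days)
      est, hold = rets[:est_days], rets[est_days:]

      sigma = np.cov(est, rowvar=False)                     # estimated covariance
      w0 = np.full(n_assets, 1.0 / n_assets)                # naive 1/N weights
      res = minimize(lambda w: w @ sigma @ w, w0, method="SLSQP",
                     bounds=[(0.0, 1.0)] * n_assets,
                     constraints={"type": "eq", "fun": lambda w: w.sum() - 1})

      for name, w in [("1/N", w0), ("min-variance", res.x)]:
          vol = (hold @ w).std() * np.sqrt(252)             # annualized vol
          print(f"{name:12s} hold-out volatility = {vol:.4f}")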
  3. By: Francq, Christian; Zakoian, Jean-Michel
    Abstract: We study the estimation risk induced by univariate and multivariate methods for evaluating the conditional Value-at-Risk (VaR) of a portfolio of assets. The composition of the portfolio can be time-varying and the individual returns are assumed to follow a general multivariate dynamic model. Under sphericity of the innovations distribution, we introduce in the multivariate framework a concept of VaR parameter, and we establish the asymptotic distribution of its estimator. A multivariate Filtered Historical Simulation method, which does not rely on sphericity, is also studied. We derive asymptotic confidence intervals for the conditional VaR, which make it possible to quantify simultaneously the market and estimation risks. The particular case of minimal variance and minimal VaR portfolios is considered. Potential usefulness, feasibility and drawbacks of the different approaches are illustrated via Monte Carlo experiments and an empirical study based on stock returns.
    Keywords: Confidence Intervals for VaR; DCC GARCH model; Estimation risk; Filtered Historical Simulation; Optimal Dynamic Portfolio
    JEL: C13 C22 C58
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:68100&r=rmg
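    Code sketch: a simplified Filtered Historical Simulation in Python, with an EWMA volatility filter standing in for the GARCH-type dynamics the paper works with; the return series and decay parameter are placeholders.
      # Illustrative only: FHS with an EWMA filter on a placeholder series.
      import numpy as np

      rng = np.random.default_rng(2)
      returns = 0.01 * rng.standard_t(5, 2000)      # placeholder return series
      lam = 0.94                                    # RiskMetrics-style decay

      var = np.empty_like(returns)                  # EWMA conditional variance
      var[0] = returns.var()
      for t in range(1, len(returns)):
          var[t] = lam * var[t - 1] + (1 - lam) * returns[t - 1] ** 2
      sigma = np.sqrt(var)

      z = returns / sigma                           # devolatilized innovations
      sigma_next = np.sqrt(lam * var[-1] + (1 - lam) * returns[-1] ** 2)
      simulated = sigma_next * rng.choice(z, size=100_000)   # resample, rescale

      var99 = -np.quantile(simulated, 0.01)         # one-day 99% VaR
      print(f"one-day 99% FHS VaR: {var99:.4%}")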
  4. By: Chkili, Walid
    Abstract: In this paper we use the bivariate fractionally integrated GARCH (FIGARCH) model to analyze the dynamic relationship between gold and crude oil markets. We also test the role of gold as a hedge or safe haven against crude oil risk. Empirical results show that the dynamic links between the two markets vary over time and decline significantly during major economic and political crisis episodes. This suggests that gold can act as a safe haven during extreme oil market conditions. Finally, findings indicate that adding gold to a crude oil portfolio helps to hedge against oil risk.
    Keywords: Gold, oil, hedge, safe haven, DCC-FIGARCH
    JEL: C58 Q4
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:68110&r=rmg
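    Code sketch: the portfolio implication boils down to a hedge ratio h_t = cov(oil, gold)_t / var(gold)_t; below is a rolling-window stand-in for the conditional moments a DCC-FIGARCH model would deliver, on toy series with assumed parameters.
      # Illustrative only: rolling hedge ratio of oil with gold on toy data.
      import numpy as np

      rng = np.random.default_rng(3)
      T, window = 1000, 250
      gold = 0.009 * rng.standard_normal(T)
      oil = 0.4 * gold + 0.02 * rng.standard_normal(T)   # correlated toy series

      hedge = np.array([np.cov(oil[t - window:t], gold[t - window:t])[0, 1]
                        / gold[t - window:t].var()
                        for t in range(window, T)])
      hedged = oil[window:] - hedge * gold[window:]      # gold-hedged oil position
      print(f"oil vol {oil[window:].std():.4f} -> hedged {hedged.std():.4f}")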
  5. By: Marcelo Brutti Righi
    Abstract: In this paper we present a class of risk measures that combine coherent risk measures with generalized deviation measures. Based on the Limitedness axiom, we prove that this set is a sub-class of coherent risk measures. We present extensions of this result for the case of convex or co-monotone coherent risk measures. Under this perspective, we propose a specific formulation that generates, from any coherent measure, a generalized deviation based on the dispersion of results worse than the measure itself, which leads to a very interesting risk measure. Moreover, we present some examples of risk measures that lie in our proposed class.
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1511.06943&r=rmg
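    Code sketch: a numerical reading of the construction (my interpretation of the abstract, not the paper's definitions): start from a coherent measure such as Expected Shortfall and add a deviation term for the dispersion of outcomes worse than it; the sample and confidence level are assumptions.
      # Illustrative only: ES plus the dispersion of outcomes beyond ES.
      import numpy as np

      rng = np.random.default_rng(4)
      x = rng.standard_t(4, 500_000)        # P&L sample; negative = loss

      alpha = 0.05
      var_a = -np.quantile(x, alpha)        # 95% VaR (as a positive number)
      es = -x[x <= -var_a].mean()           # 95% ES, a coherent measure

      worse = x[x <= -es]                   # results worse than the ES level
      deviation = np.sqrt(np.mean((worse + es) ** 2))   # dispersion beyond ES
      print(f"ES {es:.3f} + deviation {deviation:.3f} "
            f"= loss-deviation measure {es + deviation:.3f}")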
  6. By: Jaeger, Carlo
    Abstract: Rich countries have developed a historically unprecedented capability to manage conventional risks - fire, floods, earthquakes, etc., but also car accidents, many workplace risks, and more. It is based on two institutions - insurance markets and public risk governance - supported by a powerful theory: the expected utility approach to risk. Expected utility refines the utilitarian paradigm of rational action by combining the concept of utility functions with the concept of probability distributions, using subjective probabilities where required. One might think that future progress in risk research will consist mainly in refining this approach and spreading it to emerging and less developed countries. However, greater progress is necessary and possible. It is necessary because the global economy and technostructure we live in have generated new systemic risks - including financial crises, pandemics, climate change, and nuclear war. These risks exceed the coping capacity of conventional risk management and call for new forms of integrated risk governance. Greater progress is possible because recent research has developed ways to address the basic difficulties of expected utility without losing its valuable insights. This involves three major advances: first, introducing a risk function that generalizes expected utility so as to overcome well-known difficulties like the Allais paradox; second, embedding expected utility in a framework of iterated network games so as to take into account the social learning processes that are essential for real-world risk governance; and third, accommodating the logic of complementary descriptions called for by the new systemic risks of the 21st century. The coming breakthrough in risk research may best be achieved by bringing these advances to bear on practical efforts aiming at integrated risk governance.
    JEL: B41 C73 D80
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:zbw:ifwedp:201565&r=rmg
  7. By: Mario Cerrato; John Crosby; Minjoo Kim; Yang Zhao
    Abstract: We document asymmetric and time-varying features of dependence between the credit risks of global systemically important banks (G-SIBs) in the UK banking industry using a CDS dataset. We model the dependence of CDS spreads using a dynamic asymmetric copula. Comparing our model with traditional copula models, we find that the latter usually underestimate the probability of joint (or conditional) default among the UK G-SIBs. Furthermore, we show through extensive regression analysis that the dynamics and asymmetries between CDS spreads are closely associated with the probabilities of joint (or conditional) default. In particular, our regression analysis carries a policy implication: copula correlations or tail dependence coefficients can serve as leading indicators of systemic credit events.
    Keywords: Calibrated marginal default probability, probability of joint default, probability of conditional default, GAS-based GHST copula.
    JEL: C32 G32
    Date: 2015–10
    URL: http://d.repec.org/n?u=RePEc:gla:glaewp:2015_24&r=rmg
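    Code sketch: a static Gaussian copula stand-in (the paper's copula is a dynamic asymmetric GHST) for turning marginal default probabilities and a copula correlation into joint and conditional default probabilities; the PDs and correlations are assumed values.
      # Illustrative only: joint default under a plain Gaussian copula.
      import numpy as np
      from scipy.stats import norm, multivariate_normal

      def joint_default_prob(p1, p2, rho):
          """P(both default) given marginal PDs and a copula correlation."""
          z1, z2 = norm.ppf(p1), norm.ppf(p2)
          cov = np.array([[1.0, rho], [rho, 1.0]])
          return multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf([z1, z2])

      p1 = p2 = 0.02                        # e.g. CDS-implied marginal PDs
      for rho in (0.0, 0.3, 0.6, 0.9):
          jp = joint_default_prob(p1, p2, rho)
          print(f"rho {rho:.1f}: joint PD {jp:.4%}, conditional PD {jp / p2:.2%}")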
  8. By: Hannah Cheng Juan Zhan; William Rea; Alethea Rea
    Abstract: This paper presents a novel application of a clustering algorithm, originally developed for constructing phylogenetic networks, to the correlation matrix of 126 stocks listed on the Shanghai A Stock Market. We show that by visualizing the correlation matrix as a Neighbor-Net network, and using the circular ordering produced during the construction of the network, we can reduce the risk of a diversified portfolio in times of market increase, compared with random or industry-group-based selection methods.
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1511.07945&r=rmg
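    Code sketch: the general recipe, with ordinary hierarchical clustering standing in for the paper's Neighbor-Net network: map correlations to the distance d = sqrt(2(1 - rho)) and pick stocks from different clusters; the simulated data and cluster count are assumptions.
      # Illustrative only: cluster a correlation matrix and pick one
      # stock per cluster as a crude diversification rule.
      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import squareform

      rng = np.random.default_rng(5)
      n_stocks = 20
      factor = rng.standard_normal(500)                 # common market factor
      loads = rng.uniform(0.2, 0.9, n_stocks)
      rets = np.outer(factor, loads) + rng.standard_normal((500, n_stocks))

      rho = np.corrcoef(rets, rowvar=False)
      dist = np.sqrt(2.0 * (1.0 - rho))                 # correlation distance
      np.fill_diagonal(dist, 0.0)

      z = linkage(squareform(dist, checks=False), method="average")
      clusters = fcluster(z, t=4, criterion="maxclust")
      picks = [int(np.where(clusters == c)[0][0]) for c in np.unique(clusters)]
      print("one pick per cluster:", picks)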
  9. By: Kinnunen, Jyri (BOFIT); Martikainen, Minna (BOFIT)
    Abstract: In this paper, we explore the relation between expected returns and idiosyncratic risk. As in many emerging markets, investors in the Russian stock market cannot fully diversify their portfolios due to transaction costs, information gathering and processing costs, and shortcomings in investor protection. This implies that investors demand a premium for idiosyncratic risk: unique asset-specific risk plays a role in investment decisions. We estimate the price of idiosyncratic risk using MIDAS regressions and a cross-section of Russian industry portfolios. We find that idiosyncratic risk commands an economically and statistically significant risk premium. The results remain unaffected after controlling for global pricing factors and short-term return reversal.
    Keywords: idiosyncratic risk; industry risk; cross-sectional returns; MIDAS; Russia
    JEL: G12
    Date: 2015–10–30
    URL: http://d.repec.org/n?u=RePEc:hhs:bofitp:2015_030&r=rmg
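    Code sketch: the ingredient being priced, computed in the simplest possible way: idiosyncratic volatility as the residual volatility of a market-model regression (the paper aggregates such variances with MIDAS weights, which this toy example omits); all series are simulated.
      # Illustrative only: idiosyncratic volatility from a market model.
      import numpy as np

      rng = np.random.default_rng(6)
      T = 500
      market = 0.01 * rng.standard_normal(T)
      industry = 0.005 + 1.2 * market + 0.008 * rng.standard_normal(T)

      X = np.column_stack([np.ones(T), market])         # intercept + market
      beta, *_ = np.linalg.lstsq(X, industry, rcond=None)
      resid = industry - X @ beta                       # idiosyncratic component
      idio_vol = resid.std(ddof=2) * np.sqrt(252)       # annualized
      print(f"beta {beta[1]:.2f}, annualized idiosyncratic vol {idio_vol:.2%}")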
  10. By: Victoria Dobrynskaya (National Research University Higher School of Economics)
    Abstract: I provide a novel risk-based explanation for the profitability of momentum strategies. I show that past winners and past losers are differently exposed to upside and downside market risks. Winners systematically have higher relative downside market betas and lower relative upside market betas than losers. As a result, winner-minus-loser momentum portfolios are exposed to extra downside market risk but hedge against upside market risk. Such asymmetry in upside and downside risks is a mechanical consequence of rebalancing momentum portfolios, but it is unattractive for an investor because both positive relative downside betas and negative relative upside betas carry positive risk premiums according to the Downside-Risk CAPM. Hence, the high returns to momentum strategies are a mere compensation for their upside and downside risks. The Downside-Risk CAPM is a robust unifying explanation of returns to momentum portfolios constructed for different geographical and asset markets, and it outperforms alternative multi-factor models.
    Keywords: momentum, downside risk, downside beta, upside risk, upside beta, Downside-Risk CAPM
    JEL: G12 G14 G15
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:hig:wpaper:50/fe/2015&r=rmg
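    Code sketch: relative downside and upside betas in the spirit of the abstract, conditioning on the market being below or above its mean (a common convention; the paper's exact construction may differ). The toy asset is built to load more on down-market moves.
      # Illustrative only: relative downside/upside betas of a toy asset.
      import numpy as np

      rng = np.random.default_rng(7)
      T = 2000
      mkt = 0.01 * rng.standard_normal(T)
      asset = (np.where(mkt < 0, 1.5, 0.8) * mkt
               + 0.005 * rng.standard_normal(T))        # loads more when down

      def beta(a, m):
          return np.cov(a, m)[0, 1] / m.var()

      b = beta(asset, mkt)
      down = mkt < mkt.mean()
      rel_down = beta(asset[down], mkt[down]) - b       # relative downside beta
      rel_up = beta(asset[~down], mkt[~down]) - b       # relative upside beta
      print(f"beta {b:.2f}, relative downside {rel_down:+.2f}, "
            f"relative upside {rel_up:+.2f}")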
  11. By: Yves Dominicy
    Abstract: This thesis is divided into four chapters. The first two chapters introduce a parametric quantile-based estimation method for univariate heavy-tailed distributions and for elliptical distributions, respectively. For those interested in estimating the tail index without imposing a parametric form on the entire distribution function, but only on the tail behaviour, chapter three proposes a multivariate Hill estimator for elliptical distributions. The first three chapters assume an independent and identically distributed setting; as a first step towards a dependent setting, the last chapter proves the asymptotic normality of marginal sample quantiles for stationary processes under the S-mixing condition.
    The first chapter introduces a quantile- and simulation-based estimation method, which we call the Method of Simulated Quantiles, or simply MSQ. Since it is based on quantiles, it is a moment-free approach, and since it is based on simulations, we do not need closed-form expressions of any function that represents the probability law of the process. It is therefore useful when the probability density function has no closed form and/or moments do not exist. The method is based on a vector of functions of quantiles: the principle consists in matching functions of theoretical quantiles, which depend on the parameters of the assumed probability law, with functions of empirical quantiles, which depend on the data. Since the theoretical functions of quantiles may not have a closed-form expression, we rely on simulations.
    The second chapter deals with the estimation of the parameters of elliptical distributions by means of a multivariate extension of MSQ, proposing inference for vast-dimensional elliptical distributions. Estimation is based on quantiles, which always exist regardless of the thickness of the tails, and testing is based on the geometry of the elliptical family. The multivariate extension of MSQ faces the difficulty of constructing a function of quantiles that is informative about the covariation parameters. We show that the interquartile range of a projection of pairwise random variables onto the 45-degree line is very informative about the covariation.
    The third chapter constructs a multivariate tail index estimator. In the univariate case, the most popular estimator of the tail exponent is the Hill estimator introduced by Bruce Hill in 1975. The aim of this chapter is to propose an estimator of the tail index in a multivariate context, more precisely for regularly varying elliptical distributions. Since, for univariate random variables, our estimator boils down to the Hill estimator, we name it after Bruce Hill. Our estimator is based on the distance between an elliptical probability contour and the exceedance observations.
    Finally, the fourth chapter investigates the asymptotic behaviour of the marginal sample quantiles of p-dimensional stationary processes and obtains the asymptotic normality of the empirical quantile vector. We assume that the processes are S-mixing, a recently introduced and widely applicable notion of dependence. A remarkable property of S-mixing is that it does not require any higher-order moment assumptions to be verified, which, since we are interested in quantiles and in processes that may be heavy-tailed, is of particular interest.
    Keywords: Finance -- Econometric models; Distribution (Probability theory); Estimation theory; Tail index; Quantiles; Simulation; Elliptical distributions; Heavy-tailed distributions
    Date: 2014–04–18
    URL: http://d.repec.org/n?u=RePEc:ulb:ulbeco:2013/209311&r=rmg
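    Code sketch: the classical univariate Hill (1975) tail-index estimator that chapter three generalizes to elliptical distributions; the Pareto sample and the choices of k are illustrative.
      # Illustrative only: Hill estimator on a Pareto(3) sample.
      import numpy as np

      rng = np.random.default_rng(8)
      alpha_true = 3.0
      x = rng.pareto(alpha_true, 100_000) + 1.0         # Pareto, tail index 3

      def hill(x, k):
          """Tail index from the k largest order statistics."""
          xs = np.sort(x)[::-1]                         # descending order
          return 1.0 / (np.mean(np.log(xs[:k])) - np.log(xs[k]))

      for k in (100, 500, 2000):
          print(f"k = {k:4d}: Hill tail index = {hill(x, k):.2f} "
                f"(true {alpha_true})")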
  12. By: Ippolito, Filippo; Peydró, José Luis; Polo, Andrea; Sette, Enrico
    Abstract: By providing liquidity to depositors and credit line borrowers, banks are exposed to double-runs on assets and liabilities. For identification, we exploit the 2007 freeze of the European interbank market and the Italian Credit Register. After the shock, there are sizeable, aggregate double-runs. In the cross-section, pre-shock interbank exposure is (unconditionally) unrelated to post-shock credit line drawdowns. However, conditioning on firm observable and unobservable characteristics, higher pre-shock interbank exposure implies more post-shock drawdowns. We show that this is the result of active pre-shock liquidity risk management by more exposed banks, which granted credit lines to firms that run less in a crisis.
    Keywords: credit lines; financial crisis; liquidity risk; risk management; runs
    JEL: G01 G21 G28
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:10948&r=rmg
  13. By: Olivier Lopez (UPMC - Université Pierre et Marie Curie - Paris 6, Laboratoire de Finance et d'Assurance - CREST-INSEE - Centre de Recherche en Economie et en Statistique - Institut national de la statistique et des études économiques (INSEE)); Xavier Milhaud (Laboratoire de Finance et d'Assurance - CREST-INSEE - Centre de Recherche en Economie et en Statistique - Institut national de la statistique et des études économiques (INSEE)); Pierre-Emmanuel Thérond (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1)
    Abstract: In this paper, we propose a regression tree procedure to estimate the conditional distribution of a variable which is not directly observed due to censoring. The model that we consider is motivated by applications in insurance, including the analysis of guarantees that involve durations, and claim reserving. We derive consistency results for our procedure, and for the selection of an optimal subtree using a pruning strategy. These theoretical results are supported by a simulation study, and two applications to insurance datasets. The first one concerns income protection insurance, while the second deals with reserving in third-party liability insurance.
    Keywords: model selection, regression tree, insurance, survival analysis, censoring
    Date: 2015–04–10
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-01141228&r=rmg
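    Code sketch: one way to get the flavor of censored regression trees (a rough stand-in, not the authors' procedure or its consistency guarantees): weight uncensored observations by inverse Kaplan-Meier censoring probabilities and feed them to an off-the-shelf regression tree; the data-generating process and tuning values are assumptions.
      # Illustrative only: IPCW-weighted regression tree on censored toy data.
      import numpy as np
      from sklearn.tree import DecisionTreeRegressor

      rng = np.random.default_rng(9)
      n = 5000
      x = rng.uniform(0.0, 1.0, (n, 1))
      t = rng.exponential(1.0 + 2.0 * x[:, 0])    # true duration
      c = rng.exponential(3.0, n)                 # independent censoring time
      y = np.minimum(t, c)                        # observed duration
      delta = (t <= c).astype(float)              # 1 if uncensored

      def censoring_survival(y, delta):
          """Kaplan-Meier survival of the censoring time, evaluated at each y."""
          order = np.argsort(y)
          ds = delta[order]
          at_risk = np.arange(len(y), 0, -1)      # risk set sizes, descending
          surv = np.cumprod(1.0 - (1.0 - ds) / at_risk)
          out = np.empty_like(surv)
          out[order] = surv
          return out

      g = censoring_survival(y, delta)
      w = delta / np.clip(g, 1e-3, None)          # IPCW weights, 0 if censored

      tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=200)
      tree.fit(x, y, sample_weight=w)
      print("predicted mean durations:", tree.predict([[0.1], [0.9]]).round(2))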

This nep-rmg issue is ©2015 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.