New Economics Papers
on Risk Management
Issue of 2014‒06‒07
ten papers chosen by

  1. Estimating Operational Risk Capital with Greater Accuracy, Precision, and Robustness By J. D. Opdyke
  2. Factor High-Frequency Based Volatility (HEAVY) Models By Kevin Sheppard
  3. Sensitivity of Value at Risk estimation to Non-normality of returns and Market capitalization By Sinha, Pankaj; Agnihotri, Shalini
  4. Bregman superquantiles. Estimation methods and applications By Fabrice Gamboa; Aurélien Garivier; Bertrand Iooss; Tatiana Labopin-Richard
  5. Clustering and hierarchy of financial markets data: advantages of the DBHT By Nicolo Musmeci; Tomaso Aste; Tiziana Di Matteo
  6. Assessing systematic risk in the S&P500 index between 2000 and 2011: A Bayesian nonparametric approach By Rodriguez, Abel; Wang, Ziwei; Kottas, Athanasios
  7. How Strong are the Linkages between Real Estate and Other Sectors in China? By Wenlang Zhang; Gaofeng Han; Steven Chan
  8. Implied volatility of basket options at extreme strikes By Archil Gulisashvili; Peter Tankov
  9. Default Prediction for Small-Medium Enterprises in France: A comparative approach By Sami BEN JABEUR; Youssef FAHMI
  10. Estimation of the Global Minimum Variance Portfolio in High Dimensions By Taras Bodnar; Nestor Parolya; Wolfgang Schmid

  1. By: J. D. Opdyke
    Abstract: The largest US banks are required by regulatory mandate to estimate the operational risk capital they must hold using an Advanced Measurement Approach (AMA) as defined by the Basel II/III Accords. Most use the Loss Distribution Approach (LDA) which defines the aggregate loss distribution as the convolution of a frequency and a severity distribution representing the number and magnitude of losses, respectively. Estimated capital is a Value-at-Risk (99.9th percentile) estimate of this annual loss distribution. In practice, the severity distribution drives the capital estimate, which is essentially a very high quantile of the estimated severity distribution. Unfortunately, because the relevant severities are heavy-tailed AND the quantiles being estimated are so high, VaR is a convex function of the severity parameters, so all widely-used estimators will generate biased capital estimates due to Jensen's Inequality. This capital inflation is sometimes enormous, even hundreds of millions of dollars at the unit-of-measure (UoM) level. Herein I present an estimator of capital that essentially eliminates this upward bias. The Reduced-bias Capital Estimator (RCE) is more consistent with the regulatory intent of the LDA framework than implementations that fail to mitigate, if not eliminate, this bias. RCE also notably increases the precision of the capital estimate and consistently increases its robustness to violations of the i.i.d. data presumption (which are endemic to operational risk loss event data). So with greater capital accuracy, precision, and robustness, RCE lowers capital requirements at both the UoM and enterprise levels, increases capital stability from quarter to quarter, ceteris paribus, and does both while more accurately and precisely reflecting regulatory intent. RCE is straightforward to explain, understand, and implement using any major statistical software package.
    Date: 2014–06
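The Jensen's-inequality bias described above is easy to reproduce. The sketch below (all parameters illustrative, not taken from the paper) simulates an LDA aggregate annual loss as a Poisson frequency convolved with a lognormal severity, then compares capital at the true severity parameters with the average of plug-in capital estimates computed from re-estimated parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def lda_capital(mu, sigma, lam=25, n_years=5000, q=0.999):
    """99.9% VaR of the aggregate annual loss: Poisson(lam) frequency
    convolved with a LogNormal(mu, sigma) severity (illustrative values)."""
    counts = rng.poisson(lam, n_years)
    annual = np.array([rng.lognormal(mu, sigma, c).sum() for c in counts])
    return np.quantile(annual, q)

mu_true, sigma_true = 10.0, 2.0
true_capital = lda_capital(mu_true, sigma_true)

# Re-estimate the severity parameters from simulated loss histories and
# plug them back into the capital calculation; Jensen's inequality makes
# the average of these plug-in estimates exceed capital at the true
# parameters, because VaR is convex in (mu, sigma).
plug_in = []
for _ in range(30):
    losses = rng.lognormal(mu_true, sigma_true, 250)
    logs = np.log(losses)
    plug_in.append(lda_capital(logs.mean(), logs.std(ddof=1)))

print(true_capital, np.mean(plug_in))
```

The plug-in average systematically exceeds the true-parameter capital; RCE's correction targets exactly this gap, and the paper should be consulted for the actual estimator.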
  2. By: Kevin Sheppard
    Abstract: We propose a new class of multivariate volatility models utilizing realized measures of asset volatility and covolatility extracted from high-frequency data. Dimension reduction for estimation of large covariance matrices is achieved by imposing a factor structure with time-varying conditional factor loadings. Statistical properties of the model, including conditions that ensure covariance stationarity of returns, are established. The model is applied to modeling the conditional covariance of large U.S. financial institutions during the financial crisis, where empirical results show that the new model has both superior in- and out-of-sample properties. We show that the superior performance applies to a wide range of quantities of interest, including volatilities, covolatilities, betas and scenario-based risk measures, where the model's performance is particularly strong at short forecast horizons.
    Keywords: Conditional Beta, Conditional Covariance, Forecasting, HEAVY, Marginal Expected Shortfall, Realized Covariance, Realized Kernel, Systematic Risk
    JEL: C32 C53 C58 G17 G21
    Date: 2014–05–30
  3. By: Sinha, Pankaj; Agnihotri, Shalini
    Abstract: This paper investigates the sensitivity of VaR models when the return series of stocks and stock indices are not normally distributed. It also studies the effect of the market capitalization of stocks and stock indices on their Value at Risk and Conditional VaR estimation. Three indices of different market capitalization, the S&P BSE Sensex, BSE Mid cap and BSE Small cap indices, have been considered for the recession and post-recession periods. It is observed that VaR violations increase with decreasing market capitalization in both periods considered. The same effect is also observed on stock portfolios of different market capitalization. Further, we study the relationship between liquidity, proxied by the traded volume of stocks, and the market risk of firms as measured by VaR. The results confirm that a decrease in liquidity increases the Value at Risk of the firms.
    Keywords: Non-normality, market capitalization, Value at risk (VaR), CVaR, GARCH
    JEL: C51 C52 C58 G01 G20 G22 G24 G28
    Date: 2014–03–10
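The violation counts underlying these results come from a standard VaR backtest: estimate a one-day VaR from a rolling window and count breaches. A minimal sketch with synthetic series standing in for the index data (the small-cap series is mimicked, as an assumption, by heavy-tailed Student-t returns):

```python
import numpy as np

rng = np.random.default_rng(1)
Z99 = 2.326  # 99% standard-normal quantile

def violation_rate(returns, window=250, z=Z99):
    """Share of days on which the return breaches a rolling one-day
    normal VaR estimated from the previous `window` observations."""
    hits = 0
    for t in range(window, len(returns)):
        past = returns[t - window:t]
        var_t = past.mean() - z * past.std(ddof=1)  # VaR as a return threshold
        hits += returns[t] < var_t
    return hits / (len(returns) - window)

normal_r = 0.01 * rng.standard_normal(2000)
fat_r = 0.01 * rng.standard_t(3, 2000)  # heavy tails, a stand-in for small caps

# A correctly specified normal VaR should be breached on about 1% of
# days; heavy-tailed returns tend to be breached more often.
print(violation_rate(normal_r), violation_rate(fat_r))
```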
  4. By: Fabrice Gamboa (IMT); Aurélien Garivier (IMT); Bertrand Iooss (Méthodes d'Analyse Stochastique des Codes et Traitements Numériques); Tatiana Labopin-Richard (IMT)
    Abstract: In this work, we extend some quantities introduced in "Optimization of conditional value-at-risk" of R.T. Rockafellar and S. Uryasev to the case where the proximity between real numbers is measured by using a Bregman divergence. This leads to the definition of the Bregman superquantile. Axioms of a coherent measure of risk discussed in "Coherent approaches to risk in optimization under uncertainty" of R.T. Rockafellar are studied in the case of the Bregman superquantile. Furthermore, we deal with asymptotic properties of a Monte Carlo estimator of the Bregman superquantile.
    Date: 2014–05
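Assuming the Bregman superquantile takes the usual form, the inverse gradient of the convex function γ applied to the conditional tail mean of the gradient of γ (an assumption about the paper's exact definition), a Monte Carlo estimator from order statistics is a few lines; γ(x) = x²/2 makes the gradient the identity and recovers the classical superquantile (CVaR):

```python
import numpy as np

def bregman_superquantile(x, alpha, grad, grad_inv):
    """Monte Carlo estimate: apply grad γ to the tail beyond the
    alpha-quantile, average, and map back through its inverse."""
    x = np.sort(np.asarray(x, dtype=float))
    tail = x[int(np.ceil(alpha * len(x))):]
    return grad_inv(np.mean(grad(tail)))

rng = np.random.default_rng(2)
sample = rng.lognormal(0.0, 0.5, 100000)

# Classical superquantile: γ(x) = x²/2, so grad is the identity.
cvar = bregman_superquantile(sample, 0.95, lambda t: t, lambda t: t)
# "Geometric" variant: γ(x) = x log x - x, so grad = log, inverse = exp.
geo = bregman_superquantile(sample, 0.95, np.log, np.exp)

# By the AM-GM inequality the geometric tail mean cannot exceed the
# arithmetic one, so geo <= cvar on any positive sample.
print(cvar, geo)
```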
  5. By: Nicolo Musmeci; Tomaso Aste; Tiziana Di Matteo
    Abstract: We present a set of analyses aiming at quantifying the amount of information filtered by different hierarchical clustering methods on correlations between stock returns. In particular, we apply, for the first time to financial data, a novel hierarchical clustering approach, the Directed Bubble Hierarchical Tree (DBHT), and we compare it with other methods, including Linkage and k-medoids. Taking the industrial sector classification of stocks as a benchmark partition, we evaluate how well the different methods retrieve this classification. The results show that the Directed Bubble Hierarchical Tree outperforms the other methods, being able to retrieve more information with fewer clusters. Moreover, we show that the economic information is hidden at different levels of the hierarchical structures depending on the clustering method. The dynamical analysis also reveals that the different methods show different degrees of sensitivity to financial events such as crises. These results can be of interest for all applications of clustering methods to portfolio optimization and risk hedging.
    Date: 2014–06
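The comparison pipeline, correlations to a distance matrix to a hierarchy to a flat partition scored against sectors, can be sketched with standard tools; DBHT itself has no SciPy implementation, so average linkage stands in here, and the sector-factor return model is made up for illustration:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(5)
sectors = np.repeat(np.arange(4), 10)        # 4 "sectors", 10 stocks each
factors = rng.standard_normal((4, 1000))     # one common factor per sector
R = factors[sectors] + 0.8 * rng.standard_normal((40, 1000))

corr = np.corrcoef(R)
# Standard correlation distance d_ij = sqrt(2 (1 - rho_ij)).
dist = np.sqrt(np.clip(2.0 * (1.0 - corr), 0.0, None))
np.fill_diagonal(dist, 0.0)

Z = linkage(squareform(dist, checks=False), method="average")
labels = fcluster(Z, t=4, criterion="maxclust")  # flat 4-cluster partition
print(labels)  # compare against `sectors` to score retrieval
```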
  6. By: Rodriguez, Abel; Wang, Ziwei; Kottas, Athanasios
    Keywords: Social and Behavioral Sciences
    Date: 2014–06–05
  7. By: Wenlang Zhang (Hong Kong Monetary Authority); Gaofeng Han (Hong Kong Monetary Authority); Steven Chan (Hong Kong Monetary Authority)
    Abstract: International experience points to the critical role of stable property markets in maintaining financial stability. In China, the real estate sector has become increasingly important for the economy, but existing evidence has likely understated its importance as its linkages with other sectors have not been taken into account. This paper attempts to shed some light on these linkages, which occur through both real and financial channels. Our analysis based on input-output tables shows that the linkages between the real estate and other sectors have strengthened through real channels, and that the real estate sector has been much more important to the economy's output than suggested by the share of its value added in total value added. The real estate industry is also closely linked to other sectors through various financial channels, including serving as collateral in credit expansion. We quantify these financial linkages by studying the spill-overs of credit risk across sectors using data on listed firms. In general, we find that corporate credit risk has risen in recent years, and that credit risk in the real estate sector can potentially have large-scale spill-over effects onto other sectors. Consequently, shocks to the property market could have a much larger impact on the Chinese economy than suggested by headline figures.
    Date: 2014–05
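The real-channel linkages estimated from input-output tables are conventionally summarized by Leontief output multipliers. A toy example with made-up coefficients (the paper uses China's official IO tables):

```python
import numpy as np

# Technical-coefficient matrix for a hypothetical 3-sector economy;
# A[i, j] is the input from sector i needed per unit of output of
# sector j (all numbers illustrative, not from the paper).
A = np.array([[0.10, 0.05, 0.20],
              [0.30, 0.15, 0.10],
              [0.15, 0.25, 0.05]])

L = np.linalg.inv(np.eye(3) - A)  # Leontief inverse (I - A)^{-1}

# Column sums: total economy-wide output generated per unit of final
# demand in each sector -- the "output multiplier" of that sector.
multipliers = L.sum(axis=0)
print(multipliers)
```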
  8. By: Archil Gulisashvili; Peter Tankov
    Abstract: In the paper, we characterize the asymptotic behavior of the implied volatility of a basket call option at large and small strikes in a variety of settings with increasing generality. First, we obtain an asymptotic formula with an error bound for the left wing of the implied volatility, under the assumption that the dynamics of asset prices are described by the multidimensional Black-Scholes model. Next, we find the leading term of asymptotics of the implied volatility in the case where the asset prices follow the multidimensional Black-Scholes model with time change by an independent increasing stochastic process. Finally, we deal with a general situation in which the dependence between the assets is described by a given copula function. In this setting, we obtain a model-free tail-wing formula that links the implied volatility to a special characteristic of the copula called the weak lower tail dependence function.
    Date: 2014–06
  9. By: Sami BEN JABEUR; Youssef FAHMI
    Abstract: The aim of this paper is to compare three statistical methods for predicting corporate financial distress: Discriminant Analysis, the Logit model and Random Forest. These approaches are applied to a sample of 800 companies over the period from 2006 to 2008, using 33 financial ratios. The results show the superiority of the random forest approach.
    Date: 2014–06–02
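The model comparison can be replicated in outline with scikit-learn; the firm-level data are not public, so the sketch below uses a synthetic sample of the same size with five made-up "ratios" and a nonlinear default rule that gives the forest something to exploit:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.standard_normal((800, 5))            # 5 synthetic "financial ratios"
# Default probability with an interaction term the logit cannot capture.
p = 1.0 / (1.0 + np.exp(-(X[:, 0] + X[:, 1] * X[:, 2])))
y = rng.random(800) < p

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for model in (LogisticRegression(max_iter=1000),
              RandomForestClassifier(n_estimators=200, random_state=0)):
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(type(model).__name__, round(auc, 3))
```

On data with genuine interactions among ratios, the forest's AUC will typically exceed the logit's, which is the pattern the paper reports on real ratios.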
  10. By: Taras Bodnar; Nestor Parolya; Wolfgang Schmid
    Abstract: We estimate the global minimum variance (GMV) portfolio in the high-dimensional case using results from random matrix theory. This approach leads to a shrinkage-type estimator which is distribution-free and optimal in the sense of minimizing the out-of-sample variance. Its asymptotic properties are investigated assuming that the number of assets $p$ depends on the sample size $n$ such that $\frac{p}{n}\rightarrow c\in (0,+\infty)$ as $n$ tends to infinity. The results are obtained under weak assumptions on the distribution of the asset returns; only the existence of fourth moments is required. Furthermore, we make no assumption on the upper bound of the spectrum of the covariance matrix. As a result, the theoretical findings remain valid if the dependencies between the asset returns are described by a factor model, which is very popular in the financial literature nowadays. This is documented in a numerical study where the small- and large-sample behavior of the derived estimator is compared with existing estimators of the GMV portfolio. The resulting estimator shows significant improvements and turns out to be robust to deviations from normality.
    Date: 2014–06
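For context, the classical plug-in estimator that shrinkage approaches improve upon inverts the sample covariance matrix, w = S⁻¹1 / (1ᵀS⁻¹1); its out-of-sample variance deteriorates as p/n grows, which is exactly the regime the paper studies. A sketch on synthetic returns (all parameters illustrative):

```python
import numpy as np

def gmv_weights(returns):
    """Plug-in GMV weights w = S^{-1} 1 / (1' S^{-1} 1) from the
    sample covariance matrix S."""
    S = np.cov(returns, rowvar=False)
    ones = np.ones(S.shape[0])
    w = np.linalg.solve(S, ones)
    return w / w.sum()

rng = np.random.default_rng(4)
n, p = 500, 50                          # p/n = 0.1; the paper lets p/n -> c
# Equicorrelated true covariance (illustrative, positive definite).
true_cov = 0.0004 * (0.3 * np.ones((p, p)) + 0.7 * np.eye(p))
R = rng.multivariate_normal(np.zeros(p), true_cov, n)

w = gmv_weights(R)                      # weights sum to one by construction
oos = rng.multivariate_normal(np.zeros(p), true_cov, n)
print(w.sum(), (oos @ w).var())         # out-of-sample portfolio variance
```

As p/n approaches one the sample covariance becomes ill-conditioned and these plug-in weights blow up, which motivates the paper's distribution-free shrinkage estimator.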

General information on the NEP project can be found at <>. For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.