
on Risk Management 
Issue of 2019–02–18
twenty papers chosen by 
By:  Tobias Fissler; Johanna F. Ziegel 
Abstract:  The predictive performance of point forecasts for a statistical functional, such as the mean, a quantile, or a certain risk measure, is commonly assessed in terms of scoring (or loss) functions. A scoring function should be (strictly) consistent for the functional of interest; that is, the expected score should be minimised by the correctly specified functional value. A functional is elicitable if it possesses a strictly consistent scoring function. In quantitative risk management, the elicitability of a risk measure is closely related to comparative backtesting procedures. As such, it has gained considerable interest in the debate about which risk measure to choose in practice. While this discussion has mainly focused on the dichotomy between Value at Risk (VaR), a quantile, and Expected Shortfall (ES), a tail expectation, this paper is concerned with Range Value at Risk (RVaR). RVaR can be regarded as an interpolation of VaR and ES, which constitutes a trade-off between the sensitivity of the latter and the robustness of the former. Recalling that RVaR is not elicitable, we show that a triplet of RVaR with two VaR components at different levels is elicitable. We characterise the class of strictly consistent scoring functions. Moreover, additional properties of these scoring functions are examined, including the diagnostic tool of Murphy diagrams. The results are illustrated with a simulation study, and we put our approach in perspective with respect to the classical approach of trimmed least squares in robust regression. 
Date:  2019–02 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1902.04489&r=all 
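The interpolation property described in the abstract above can be seen in a small empirical sketch (not the authors' code; sample data and levels are illustrative): RVaR averages the quantiles (VaRs) between two levels, recovering VaR as the levels coincide and ES as the upper level tends to one.

```python
# Hedged sketch: empirical Range Value at Risk (RVaR) as an average of
# quantiles between two levels. Loss convention: larger values mean
# larger losses; VaR_u is the empirical u-quantile of the loss sample.

def empirical_var(losses, u):
    """Empirical u-quantile (VaR at level u) of a sample of losses."""
    xs = sorted(losses)
    idx = min(len(xs) - 1, int(u * len(xs)))
    return xs[idx]

def empirical_rvar(losses, alpha, beta, grid=1000):
    """RVaR_{alpha,beta}: average of VaR_u over u in (alpha, beta)."""
    us = [alpha + (beta - alpha) * (k + 0.5) / grid for k in range(grid)]
    return sum(empirical_var(losses, u) for u in us) / grid

losses = [i / 100 for i in range(1, 101)]   # illustrative loss sample
var_95 = empirical_var(losses, 0.95)
es_95 = empirical_rvar(losses, 0.95, 1.0)   # ES as the beta -> 1 case
rvar = empirical_rvar(losses, 0.95, 0.99)   # trimmed tail mean
assert var_95 <= rvar <= es_95              # RVaR interpolates VaR and ES
```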
By:  Simon Fecamp; Joseph Mikael; Xavier Warin 
Abstract:  We propose some machine-learning-based algorithms to solve hedging problems in incomplete markets. Sources of incompleteness cover illiquidity, untradable risk factors, discrete hedging dates and transaction costs. The strategies resulting from the proposed algorithms are compared to classical stochastic control techniques on several payoffs using a variance criterion. One of the proposed algorithms is flexible enough to be used with several existing risk criteria. We furthermore propose a new moment-based risk criterion. 
Date:  2019–02 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1902.05287&r=all 
By:  Antonio Falato; Diana A. Iercosan; Filip Zikes 
Abstract:  This paper uses detailed high-frequency regulatory data to evaluate whether trading increases or decreases systemic risk in the U.S. banking sector. We estimate the sensitivity of weekly bank trading net profits to a variety of aggregate risk factors, which include equities, fixed-income, derivatives, foreign exchange, and commodities. We find that U.S. banks had large trading exposures to equity market risk before the introduction of the Volcker Rule in 2014 and that they curtailed these exposures afterwards. Pre-rule equity risk exposures were large across the board of the main asset classes, including fixed-income. There is also evidence of smaller exposures to credit and currency risk. We corroborate the main finding on equity risk with a quasi-natural experiment that exploits the phased-in introduction of reporting requirements to refine identification, and an optimal change-point regression that estimates time-varying exposures to address rebalancing. A stress-test calibration indicates that the Volcker Rule was an effective financial-stability regulation, as even a 5% drop in stock market returns would have led to material aggregate trading losses for banks in the pre-Volcker period, as large as about 3% (1.5%) of sector-wide market-risk-weighted assets (tier 1 capital). 
Keywords:  Bank trading ; Regulation ; Risk exposures ; Systemic risk 
JEL:  G38 G32 G21 
Date:  2019–02–07 
URL:  http://d.repec.org/n?u=RePEc:fip:fedgfe:201905&r=all 
By:  Ronald Heijmans; Chen Zhou 
Abstract:  This paper studies the detection of outliers in risk indicators based on large value payment system transaction data. The ten risk indicators are daily time series measuring various risks in the large value payment system, such as operational risk, concentration risk and liquidity flows related to other financial market infrastructures. We use extreme value theory and local outlier factor methods to identify anomalous data points (outliers). In a univariate setup, the extreme value analysis quantifies the unusualness of each outlier. In a multivariate setup, the local outlier factor method identifies outliers by measuring the local deviation of a given data point with respect to its neighbours. We find that most detected outliers occur at the beginning and near the end of the calendar month, when turnover is significantly larger than on other days. Our method can be used, e.g., by overseers and financial stability experts who wish to look at many (risk) indicators in relation to each other. 
Keywords:  risk indicator; TARGET2; financial market infrastructure; extreme value theory (EVT); local outlier factor (LOF); anomaly 
JEL:  E42 E50 E58 E59 
Date:  2019–02 
URL:  http://d.repec.org/n?u=RePEc:dnb:dnbwpp:624&r=all 
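As a minimal, hedged sketch of the local outlier factor method mentioned above (not the authors' code; the paper applies it to TARGET2 risk indicators, while the series below is illustrative), here is a pure-Python LOF for a one-dimensional indicator series. LOF near 1 means a point is about as dense as its neighbours; LOF well above 1 flags a local outlier.

```python
# Hedged sketch: Local Outlier Factor (LOF) for a 1-D series.

def knn(data, i, k):
    """Indices of the k nearest neighbours of data[i] (excluding i)."""
    return sorted((j for j in range(len(data)) if j != i),
                  key=lambda j: abs(data[i] - data[j]))[:k]

def k_distance(data, i, k):
    """Distance from data[i] to its k-th nearest neighbour."""
    return abs(data[i] - data[knn(data, i, k)[-1]])

def lrd(data, i, k):
    """Local reachability density of data[i]."""
    reach = [max(k_distance(data, j, k), abs(data[i] - data[j]))
             for j in knn(data, i, k)]
    return len(reach) / sum(reach)

def lof(data, i, k=3):
    """Average ratio of neighbours' density to the point's own density."""
    neighbours = knn(data, i, k)
    return sum(lrd(data, j, k) for j in neighbours) / (k * lrd(data, i, k))

series = [1.0, 1.1, 0.9, 1.05, 0.95, 1.2, 5.0]   # last value is anomalous
scores = [lof(series, i) for i in range(len(series))]
assert scores[-1] == max(scores)                 # the jump is flagged
```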
By:  Marco Bottone; Mauro Bernardi; Lea Petrella 
Abstract:  Conditional Autoregressive Value-at-Risk and Conditional Autoregressive Expectile have become two popular approaches for the direct measurement of market risk. Since their introduction, several improvements, both in the Bayesian and in the classical framework, have been proposed to better account for asymmetry and local nonlinearity. Here we propose a unified Bayesian Conditional Autoregressive Risk Measures approach using the Skew Exponential Power distribution. Further, we extend the proposed models using a semiparametric P-spline approximation, providing a flexible way to account for nonlinearity. For statistical inference, we adapt the MCMC algorithm proposed in Bernardi et al. (2018) to our case. The effectiveness of the whole approach is demonstrated using real data on the daily returns of five stock market indices. 
Date:  2019–02 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1902.03982&r=all 
By:  Oliver Kley; Claudia Klüppelberg; Sandra Paterlini 
Abstract:  We introduce a statistical model for operational losses based on heavy-tailed distributions and bipartite graphs, which captures the event type and business line structure of operational risk data. The model explicitly takes into account the Pareto tails of losses and the heterogeneous dependence structures between them. We then derive estimators for individual as well as aggregated tail risk, measured in terms of Value-at-Risk and Conditional-Tail-Expectation at very high confidence levels, and also provide an asymptotically full capital allocation method. Estimation methods for such tail risk measures and capital allocations are also proposed and tested on simulated data. Finally, having access to real-world operational risk losses from the Italian banking system, we show that even with a small number of observations, the proposed estimation methods produce reliable estimates, and that quantifying dependence by means of the empirical network has a large impact on estimates at both the individual and the aggregate level, as well as on capital allocations. 
Date:  2019–02 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1902.03041&r=all 
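The Pareto-tail building block of such heavy-tailed models admits simple closed forms, sketched below with illustrative parameters (this is textbook material, not the paper's estimators): for a Pareto loss with tail index a > 1, both VaR and the conditional tail expectation are available analytically.

```python
# Hedged sketch: closed-form tail risk for a Pareto(x_m, a) loss.

def pareto_var(x_m, a, p):
    """p-quantile of a Pareto(x_m, a) loss: x_m * (1 - p)^(-1/a)."""
    return x_m * (1.0 - p) ** (-1.0 / a)

def pareto_cte(x_m, a, p):
    """Conditional tail expectation E[X | X > VaR_p] = VaR_p * a / (a - 1)."""
    return pareto_var(x_m, a, p) * a / (a - 1.0)

var99 = pareto_var(x_m=1.0, a=2.0, p=0.99)
cte99 = pareto_cte(x_m=1.0, a=2.0, p=0.99)
assert cte99 > var99                 # CTE always exceeds VaR for a > 1
assert abs(var99 - 10.0) < 1e-9      # (1 - 0.99)^(-1/2) = 10
```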
By:  David Barrera (CMAP - Centre de Mathématiques Appliquées, École polytechnique - CNRS - Centre National de la Recherche Scientifique); Stéphane Crépey (LaMME - Laboratoire de Mathématiques et Modélisation d'Évry, INRA - Institut National de la Recherche Agronomique - UEVE Université d'Évry-Val-d'Essonne - ENSIIE - CNRS); Babacar Diallo (LaMME - Laboratoire de Mathématiques et Modélisation d'Évry, INRA - Institut National de la Recherche Agronomique - UEVE Université d'Évry-Val-d'Essonne - ENSIIE - CNRS); Gersende Fort (IMT - Institut de Mathématiques de Toulouse UMR 5219, UT1 Université Toulouse 1 Capitole - UT2J Université Toulouse Jean Jaurès - UPS Université Toulouse III Paul Sabatier - Université Fédérale Toulouse Midi-Pyrénées - INSA Toulouse - Institut National des Sciences Appliquées - CNRS); Emmanuel Gobet (CMAP - Centre de Mathématiques Appliquées, École polytechnique - CNRS); Uladzislau Stazhynski (CMAP - Centre de Mathématiques Appliquées, École polytechnique - CNRS) 
Abstract:  We consider the problem of the numerical computation of the economic capital of an insurance company or a bank, in the form of a value-at-risk or expected shortfall of its loss over a given time horizon. This loss includes the appreciation of the mark-to-model of the liabilities of the firm, which we account for by nested Monte Carlo à la Gordy and Juneja (2010) or by regression à la Broadie, Du, and Moallemi (2015). Using a stochastic approximation point of view on value-at-risk and expected shortfall, we establish the convergence of the resulting economic capital simulation schemes, under mild assumptions that only bear on the theoretical limiting problem at hand, as opposed to assumptions on the approximating problems in Gordy-Juneja (2010) and Broadie-Du-Moallemi (2015). Our economic capital estimates can then be made conditional in a Markov framework and integrated in an outer Monte Carlo simulation to yield the risk margin of the firm, corresponding to a market value margin (MVM) in insurance or to a capital valuation adjustment (KVA) in banking parlance. This is illustrated numerically by a KVA case study implemented on GPUs. 
Date:  2019 
URL:  http://d.repec.org/n?u=RePEc:hal:journl:hal01710394&r=all 
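The stochastic approximation viewpoint mentioned above can be illustrated with a plain Robbins-Monro recursion for value-at-risk (a generic textbook scheme under standard normal losses, not the authors' algorithm; gain constants are illustrative): the running estimate is nudged up when a sampled loss exceeds it and down otherwise.

```python
# Hedged sketch: Robbins-Monro stochastic approximation of VaR.
import random

def var_stochastic_approx(sample_loss, alpha, n_iter=200_000, seed=0):
    """Recursion targeting the root of P(loss > x) = 1 - alpha."""
    random.seed(seed)
    x = 0.0
    for n in range(1, n_iter + 1):
        step = min(1.0, 10.0 / n)                 # capped decreasing gain
        indicator = 1.0 if sample_loss() > x else 0.0
        x += step * (indicator - (1.0 - alpha))   # up if exceeded, else down
    return x

# standard normal losses: the true 95% VaR is about 1.645
est = var_stochastic_approx(lambda: random.gauss(0.0, 1.0), alpha=0.95)
assert 1.3 < est < 2.0
```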
By:  Masato Hisakado; Shintaro Mori 
Abstract:  The estimation of the probability of default (PD) is an important process for financial institutions. The difficulty of the estimation depends on the correlations between borrowers. In this paper, we introduce a hierarchical Bayesian estimation method using the beta-binomial distribution, and consider a multi-year case with temporal correlation. A phase transition occurs when the temporal correlation decays according to a power law. When the power index is less than one, the PD estimator does not converge, and it is difficult to estimate the PD from limited historical data. Conversely, when the power index is greater than one, the convergence is the same as that of the binomial distribution. We provide a condition for the estimation of the PD and discuss the universality class of the phase transition. We investigate the empirical default data history of rating agencies, and their Fourier transformations, to confirm the correlation decay equation. The power spectrum of the default history appears to exhibit 1/f fluctuations, which correspond to long memory, but the estimated power index is much greater than one. If we collect adequate historical data, the parameters can be estimated correctly. 
Date:  2019–02 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1902.03797&r=all 
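The baseline beta-binomial pooling that the paper builds on can be sketched as follows (a standard conjugate-update sketch with illustrative data, not the authors' hierarchical model; in particular it ignores the temporal correlation whose power-law decay drives the phase transition above).

```python
# Hedged sketch: pooled PD estimate from a Beta-Binomial model.
# Beta(a, b) prior on the PD, conditionally binomial defaults per year;
# the posterior is Beta(a + total defaults, b + total survivors).

def pd_posterior_mean(history, a=1.0, b=1.0):
    """history: list of (defaults, obligors) pairs; returns posterior mean PD."""
    defaults = sum(d for d, _ in history)
    survivors = sum(n - d for d, n in history)
    return (a + defaults) / (a + b + defaults + survivors)

history = [(2, 100), (1, 100), (3, 100), (0, 100)]   # 4 years, illustrative
pd_hat = pd_posterior_mean(history)
assert 0.0 < pd_hat < 0.05     # 7/402, pulled toward the prior mean
```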
By:  Aknouche, Abdelhakim; Dimitrakopoulos, Stefanos; Touche, Nassim 
Abstract:  We propose a novel class of count time series models, the mixed Poisson integer-valued stochastic volatility models. The proposed specification, which can be considered an integer-valued analogue of the discrete-time stochastic volatility model, encompasses a wide range of conditional distributions of counts. We study its probabilistic structure and develop an easily adaptable Markov chain Monte Carlo algorithm, based on the Griddy-Gibbs approach, that can accommodate any conditional distribution belonging to that class. We demonstrate this by considering the cases of the Poisson and negative binomial distributions. The methodology is applied to simulated and real data. 
Keywords:  Griddy-Gibbs, Markov chain Monte Carlo, mixed Poisson parameter-driven models, stochastic volatility, integer-valued GARCH. 
JEL:  C13 C15 C32 C35 C58 
Date:  2019–02–04 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:91962&r=all 
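One member of the class described above, a conditionally Poisson count process driven by a latent log-intensity AR(1), can be simulated in a few lines (a hedged sketch with illustrative parameter values, not the authors' specification or sampler).

```python
# Hedged sketch: simulate a Poisson integer-valued stochastic volatility
# model: h_t = omega + phi * h_{t-1} + sigma * eps_t (latent log-intensity),
# y_t | h_t ~ Poisson(exp(h_t)).
import math
import random

def simulate_poisson_sv(n, omega=0.1, phi=0.9, sigma=0.3, seed=1):
    random.seed(seed)
    h, counts = 0.0, []
    for _ in range(n):
        h = omega + phi * h + sigma * random.gauss(0.0, 1.0)
        lam = math.exp(h)
        # inverse-transform Poisson sampling (fine for moderate lam)
        u, k, p = random.random(), 0, math.exp(-lam)
        cum = p
        while u > cum and p > 0.0:
            k += 1
            p *= lam / k
            cum += p
        counts.append(k)
    return counts

counts = simulate_poisson_sv(500)
assert all(c >= 0 for c in counts) and any(c > 0 for c in counts)
```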
By:  Svetlana Boyarchenko; Sergei Levendorskii 
Abstract:  In this paper, we argue that, once the costs of maintaining the hedging portfolio are properly taken into account, semi-static portfolios should more properly be thought of as separate classes of derivatives, with non-trivial, model-dependent payoff structures. We derive new integral representations for payoffs of exotic European options in terms of payoffs of vanillas, different from the Carr-Madan representation, and suggest approximations of the idealized static hedging/replicating portfolio using vanillas available in the market. We study the dependence of the hedging error on the model used for pricing and show that the variance of the hedging errors of static hedging portfolios can be sizably larger than the errors of variance-minimizing portfolios. We explain why exact semi-static hedging of barrier options is impossible for processes with jumps, and derive general formulas for variance-minimizing semi-static portfolios. We show that hedging using vanillas only leads to larger errors than hedging using vanillas and first-touch digitals. In all cases, the weights of the hedging portfolios are computed efficiently in the dual space, using new numerical methods for the calculation of the Wiener-Hopf factors and Laplace-Fourier inversion. 
Date:  2019–02 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1902.02854&r=all 
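The variance-minimizing weights mentioned above amount to a least-squares regression of the target payoff on the hedging instruments' payoffs. A toy two-instrument sketch (illustrative scenarios and payoffs; the paper computes such weights in the dual, Fourier, space rather than by scenario regression):

```python
# Hedged sketch: variance-minimizing static hedge weights by least squares
# across terminal-price scenarios.

def lstsq_2(x1, x2, y):
    """Solve min_w sum (y - w1*x1 - w2*x2)^2 via 2x2 normal equations."""
    a11 = sum(a * a for a in x1)
    a12 = sum(a * b for a, b in zip(x1, x2))
    a22 = sum(b * b for b in x2)
    b1 = sum(a * c for a, c in zip(x1, y))
    b2 = sum(b * c for b, c in zip(x2, y))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# hedge a 90/110 call spread with two vanilla calls (exact replication)
spots = [80, 90, 100, 110, 120]
call_90 = [max(s - 90, 0) for s in spots]
call_110 = [max(s - 110, 0) for s in spots]
target = [max(s - 90, 0) - max(s - 110, 0) for s in spots]
w1, w2 = lstsq_2(call_90, call_110, target)
assert abs(w1 - 1.0) < 1e-9 and abs(w2 + 1.0) < 1e-9   # long 90C, short 110C
```

When exact replication is impossible (e.g. barrier payoffs under jumps, as the paper explains), the same regression returns the weights with the smallest hedging-error variance over the scenarios.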
By:  Fabrizio Ferriani (Bank of Italy); Wanda Cornacchia (Bank of Italy); Paolo Farroni (Bank of Italy); Eliana Ferrara (Bank of Italy); Francesco Guarino (Bank of Italy); Francesco Pisanti (Bank of Italy) 
Abstract:  This paper presents a statistical early warning system for less significant institutions (LSIs) under the direct supervision of the Bank of Italy. The model is calibrated on the basis of a wider definition of possible distress events, using the universe of Italian LSIs active in the period 2008-2016 as a reference. We selected an extensive list of variables that might give early warnings of a crisis, in relation both to the overall banking system and Italy’s macro-financial situation, and to individual banks’ performances. A logit model is used to calculate the probability of default (PD) for each bank, with time horizon estimates set at four and six quarters. The empirical specifications proposed are tested using several statistical indices; the ex-post analysis of the estimated PDs shows a percentage of correct bank classification in the range of 80-90 per cent, with better results the nearer the moment of prediction is to when a crisis situation actually occurs. 
Keywords:  early warning system, default, less significant institutions, CAMELS, banking supervision 
JEL:  G21 G28 
Date:  2019–01 
URL:  http://d.repec.org/n?u=RePEc:bdi:opques:qef_480_19&r=all 
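The scoring step of such a logit early warning system is straightforward to sketch (coefficients, intercept, and indicator names below are illustrative placeholders, not the paper's estimates): each bank's indicators are mapped through the logistic function to a distress probability over the chosen horizon.

```python
# Hedged sketch: logit PD scoring for an early warning system.
import math

def logit_pd(features, coefs, intercept):
    """P(distress within horizon) = logistic(intercept + x . beta)."""
    z = intercept + sum(b * x for b, x in zip(coefs, features))
    return 1.0 / (1.0 + math.exp(-z))

# hypothetical indicators: NPL ratio, capital ratio, return on assets
coefs = [8.0, -20.0, -15.0]   # illustrative signs: risk up, buffers down
weak = logit_pd([0.15, 0.06, -0.01], coefs, intercept=-2.0)
sound = logit_pd([0.03, 0.14, 0.01], coefs, intercept=-2.0)
assert weak > sound           # weaker bank gets the higher warning score
```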
By:  Jamal Bouoiyour (CATT - Centre d'Analyse Théorique et de Traitement des données économiques, UPPA - Université de Pau et des Pays de l'Adour; IRMAPE - Institut de Recherche en Management et Pays Emergents, ESC Pau); Refk Selmi (CATT - Centre d'Analyse Théorique et de Traitement des données économiques, UPPA - Université de Pau et des Pays de l'Adour; IRMAPE - Institut de Recherche en Management et Pays Emergents, ESC Pau); Mark Wohar (University of Nebraska Omaha, Loughborough University) 
Abstract:  This study seeks to address whether Bitcoin can ever match, or even replace, gold as a safe haven. To this end, we use a dynamic Markov-switching copula model to test the complementarity and substitutability between Bitcoin and gold within two risk scenarios (i.e., low- and high-risk regimes). Our results reveal a positive and strong correlation between gold and Bitcoin returns coinciding with specific economic and political events. Gold and Bitcoin benefit from the same economic conditions. This suggests that gold and Bitcoin are likely to be complementary, rather than in competition with each other. Gold could act as a diversifier for investors in digital assets, but Bitcoin has much to teach gold in terms of the efficient transfer of value. 
Keywords:  Bitcoin, Gold, Testing for complementarity and substitution, Markov-switching copula model 
Date:  2019 
URL:  http://d.repec.org/n?u=RePEc:hal:journl:hal01994187&r=all 
By:  Mohammad Ali Elminejad (Institute of Economic Studies, Faculty of Social Sciences, Charles University in Prague, Smetanovo nabrezi 6, 111 01 Prague 1, Czech Republic) 
Abstract:  This paper investigates systemic risk and contagion processes in the interbank network using network science methods. An interbank network consisting of 10 banks, similar to real-world interbank networks, is studied to understand the contagion process in the network with respect to changes in the network structure, as well as changes in the characteristics of its components. Simulations support the claim that heterogeneous networks are more resilient to contagious shocks, while systemic shocks are more problematic in homogeneous networks. The study also shows that more interconnections among banks can accelerate or block the contagion process, depending on the structure of the network and the seniority of debts in the interbank network. 
Keywords:  Complex Networks, Systemic Risk, Contagion, Default Risk, Epidemic Modeling 
JEL:  G01 G21 
Date:  2018–12 
URL:  http://d.repec.org/n?u=RePEc:fau:wpaper:wp2018_39&r=all 
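The core contagion mechanism studied in such simulations can be sketched as a default cascade on an exposure matrix (network, capital levels, and exposures below are illustrative, not the paper's calibration): when a bank fails, its creditors write down their exposures, and any bank whose capital is exhausted fails in turn.

```python
# Hedged sketch: a minimal default cascade on an interbank network.

def cascade(exposures, capital, initial_failure):
    """exposures[i][j]: amount bank i lent to bank j. Returns the failed set."""
    failed = {initial_failure}
    changed = True
    while changed:
        changed = False
        for i in range(len(capital)):
            if i in failed:
                continue
            loss = sum(exposures[i][j] for j in failed)   # write-downs
            if loss >= capital[i]:                        # capital exhausted
                failed.add(i)
                changed = True
    return failed

# 3 banks in a credit chain: 0 lent to 1, 1 lent to 2
exposures = [[0, 5, 0], [0, 0, 5], [0, 0, 0]]
capital = [4, 4, 4]
assert cascade(exposures, capital, initial_failure=2) == {0, 1, 2}
```

In this chain the failure of bank 2 wipes out bank 1's capital, which in turn wipes out bank 0's, the kind of propagation whose speed the network structure can amplify or block.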
By:  Cormac O'Dea (Institute for Fiscal Studies); David Sturrock (Institute for Fiscal Studies) 
Abstract:  The "annuity puzzle" refers to the fact that annuities are rarely purchased despite the longevity insurance they provide. Most explanations for this puzzle assume that individuals have accurate expectations about their future survival. We provide evidence that individuals misperceive their mortality risk, and study the demand for annuities in a setting where annuities are priced by insurers on the basis of objectivelymeasured survival probabilities but in which individuals make purchasing decisions based on their own subjective survival probabilities. Subjective expectations have the capacity to explain signi cant rates of nonannuitization, yielding a quantitatively important explanation for the annuities puzzle. 
Keywords:  Annuity Puzzle, Subjective Expectations, Survival Probabilities 
Date:  2019–01–17 
URL:  http://d.repec.org/n?u=RePEc:ifs:ifsewp:19/02&r=all 
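The mechanism above has a simple actuarial illustration (a hedged sketch with illustrative flat survival curves and discount rate, not the paper's calibration): when an individual's subjective survival curve sits below the insurer's objective one, the annuity looks worse than actuarially fair to the buyer, discouraging purchase.

```python
# Hedged sketch: value of a unit annuity under two survival curves.

def annuity_value(survival_probs, rate=0.03):
    """Expected discounted value of 1 per year while alive."""
    value, alive = 0.0, 1.0
    for t, p in enumerate(survival_probs, start=1):
        alive *= p                         # probability of surviving to year t
        value += alive / (1 + rate) ** t   # discounted expected payment
    return value

objective = [0.95] * 20    # insurer's (pricing) survival curve
subjective = [0.90] * 20   # pessimistic self-assessed survival
assert annuity_value(subjective) < annuity_value(objective)
```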
By:  Kanamura, Takashi 
Abstract:  This paper studies volumetric risk hedging strategies for solar power under incomplete market settings, with a twofold proposal of temperature-based and solar-power-generation-based models for solar power derivatives, and discusses the basis risk arising from hedging solar power volumetric risk with temperature. Based on an indirect modeling of solar power generation using temperature and a direct modeling of solar power generation, we design two types of call options written on the accumulated non-cooling degree days (ANCDDs) and the accumulated low solar power generation days (ALSPGDs), respectively, which can hedge cool-summer volumetric risk more appropriately than options on the well-known accumulated cooling degree days. We offer pricing formulas for the two options under the good-deal bounds (GDBs) framework, which can account for the incompleteness of solar power derivative markets. To calculate the option prices numerically, we derive the partial differential equations for the two options using the GDBs. Empirical studies using Czech solar power generation and Prague temperature data estimate the parameters of the temperature-based and solar-power-generation-based models, respectively. We numerically calculate the call option prices on ANCDDs and ALSPGDs as the upper and lower price boundaries using the finite difference method. Results show that the call option prices based on a solar power generation process are larger than those based on a temperature process. This is consistent with the fact that the solar power generation approach takes into account more comprehensive risk than the temperature approach, resulting in larger prices. We finally show that the basis risk premiums, i.e., solar-power-generation-based call option prices minus temperature-based call option prices, decrease as the initial temperature rises above around 25°C. This may be because the uncertainty in solar power generation given temperature decreases, owing to the cancellation between the increase in solar power generation due to increased solar radiation and the decrease in solar power generation due to reduced solar panel efficiency. 
Keywords:  Solar power, weather risk, temperature model, basis risk, gooddeal bounds 
JEL:  G13 L94 Q42 
Date:  2019–01–30 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:92009&r=all 
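The payoff of the degree-day instrument described above is easy to sketch (reference temperature, tick size, and sample temperatures are illustrative, not the paper's contract terms): days on which the temperature stays below a reference level accumulate an index, and the call pays when a cool summer pushes the index above the strike.

```python
# Hedged sketch: payoff of a call on accumulated non-cooling degree days.

def ancdd_index(daily_temps, reference=24.0):
    """Sum of max(reference - T, 0) over the period."""
    return sum(max(reference - t, 0.0) for t in daily_temps)

def ancdd_call_payoff(daily_temps, strike, tick=100.0, reference=24.0):
    """Pays tick per index point above the strike."""
    return tick * max(ancdd_index(daily_temps, reference) - strike, 0.0)

cool_summer = [20.0, 21.0, 19.0, 22.0]   # index = 4 + 3 + 5 + 2 = 14
hot_summer = [26.0, 27.0, 25.0, 28.0]    # index = 0
assert ancdd_call_payoff(cool_summer, strike=10.0) == 400.0
assert ancdd_call_payoff(hot_summer, strike=10.0) == 0.0
```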
By:  Edouard Debonneuil (SAF - Laboratoire de Sciences Actuarielle et Financière, UCBL - Université Claude Bernard Lyon 1, Université de Lyon); Anne Eyraud-Loisel (SAF - Laboratoire de Sciences Actuarielle et Financière, UCBL - Université Claude Bernard Lyon 1, Université de Lyon); Frédéric Planchet (SAF - Laboratoire de Sciences Actuarielle et Financière, UCBL - Université Claude Bernard Lyon 1, Université de Lyon) 
Abstract:  Pension funds that handle retirement risk need to invest assets in a diversified manner, over long durations, and if possible while managing interest-rate and longevity risk. In recent years, a new class of investment called a longevity megafund was described, which invests in clinical trials for solutions against age-related diseases. Using simple models, we here study the financial interest for pension funds of investing in a longevity megafund. 
Date:  2018 
URL:  http://d.repec.org/n?u=RePEc:hal:journl:hal01571937&r=all 
By:  Wataru Hirata (Bank of Japan); Mayumi Ojima (Bank of Japan) 
Abstract:  Bank competition and financial stability is a recurrent research issue, and researchers have begun to shed light on the effect of competition on systemic risk. Japan is an interesting case in this line of research, since its regional banking system has confronted intensified competition, and there is growing evidence that this competition has led the portfolios of Japan's regional banks to overlap more, an indication of increased systemic risk. In this paper, we first examine the empirical relationship between competition and systemic risk for Japan's regional banks. We find that the bank markup is negatively associated with the level of systemic risk, indicating that competition undermines system-wide financial stability in Japan. However, this result is at odds with existing studies. To this end, we perform a theoretical analysis focusing on banks' portfolio diversification. We demonstrate that Japan's regional banks tend to diversify toward alternative lending when the profitability of their core business declines. This diversification results in the build-up of systemic risk through higher common exposure, a form of indirect interconnectedness. 
Keywords:  Competition; Markup; Systemic risk; Indirect interconnectedness 
JEL:  G11 G21 L51 
Date:  2019–01–31 
URL:  http://d.repec.org/n?u=RePEc:boj:bojwps:wp19e01&r=all 
By:  Matteo Barigozzi; Marc Hallin; Stefano Soccorsi 
Abstract:  Ripple effects in financial markets associated with crashes, systemic risk and contagion are characterized by non-trivial lead-lag dynamics, which are crucial for understanding how crises spread and, therefore, central to risk management. In the spirit of Diebold and Yilmaz (2014), we investigate connectedness among financial firms via an analysis of impulse response functions of adjusted intraday log-ranges to market shocks, using network theory methods. Motivated by overwhelming evidence that the interdependence structure of financial markets varies over time, we base that analysis on the so-called time-varying General Dynamic Factor Model proposed by Eichler et al. (2011), which extends to the locally stationary context the framework developed by Forni et al. (2000) under stationarity assumptions. The estimation methods in Eichler et al. (2011), however, present the major drawback of involving two-sided filters, which make it impossible to recover impulse response functions. We therefore introduce a novel approach extending to the time-varying context the one-sided method of Forni et al. (2017). The resulting estimators of time-varying impulse response functions are shown to be consistent, and hence can be used in the analysis of (time-varying) connectedness. Our empirical analysis of a large and strongly comoving panel of intraday price ranges of US stocks indicates that large increases in mid- to long-run connectedness are associated with the major episodes of financial turmoil. 
Keywords:  Dynamic factor models, volatility, financial crises, contagion, financial connectedness, high-dimensional time series, panel data, time-varying models, local stationarity. 
Date:  2019–02 
URL:  http://d.repec.org/n?u=RePEc:eca:wpaper:2013/283963&r=all 
By:  John Leventides; Kalliopi Loukaki; Vassilios G. Papavassiliou 
Abstract:  The purpose of this study is to assess the resilience of financial systems to exogenous shocks using techniques drawn from the theory of complex networks. We investigate by means of Monte Carlo simulations the fragility of several network topologies using a simple default model of contagion applied on interbank networks of varying sizes. We trigger a series of banking crises by exogenously failing each bank in the system and observe the propagation mechanisms that take effect within the system under different scenarios. Finally, we add to the existing literature by analyzing the interplay of several crucial drivers of interbank contagion, such as network topology, leverage, interconnectedness, heterogeneity and homogeneity across bank sizes and interbank exposures. 
Keywords:  Interbank contagion; Random networks; Financial stability; Interconnectedness; Systemic risk 
Date:  2018–12 
URL:  http://d.repec.org/n?u=RePEc:rru:oapubs:10197/9601&r=all 
By:  Federico Bassetti (Politecnico of Milan, Italy); Roberto Casarin (University Ca' Foscari of Venice, Italy); Francesco Ravazzolo (Free University of Bolzano‐Bozen Faculty of Economics, Italy and BI Norwegian Business School) 
Abstract:  This paper reviews different methods to construct density forecasts and to aggregate forecasts from many sources. Density evaluation tools to measure the accuracy of density forecasts are reviewed, and calibration methods for improving the accuracy of forecasts are presented. The manuscript provides some numerical simulation tools to approximate predictive densities, with a focus on parallel computing on graphics processing units. Some simple examples are proposed to illustrate the methods. 
Keywords:  Density forecasting, density combinations, density evaluation, bootstrapping, Bayesian inference, Monte Carlo simulations, GPU computing 
JEL:  C10 C11 C15 C53 C63 
Date:  2019–02 
URL:  http://d.repec.org/n?u=RePEc:bzn:wpaper:bemps59&r=all 
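The simplest aggregation scheme covered by such surveys, the linear opinion pool, can be sketched in a few lines together with log-score evaluation (densities, weights, and the realization are illustrative; the paper reviews far richer combination and calibration schemes).

```python
# Hedged sketch: linear opinion pool of two Gaussian predictive densities,
# evaluated at the realization with the log score (higher is better).
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))

def pool_pdf(x, components, weights):
    """Linear opinion pool: weighted mixture of predictive densities."""
    return sum(w * normal_pdf(x, mu, s) for w, (mu, s) in zip(weights, components))

components = [(0.0, 1.0), (0.5, 2.0)]   # two forecasters' densities
weights = [0.6, 0.4]                    # combination weights, sum to 1
realization = 0.3
pooled = pool_pdf(realization, components, weights)
# here the pool outscores the second forecaster at this realization
assert math.log(pooled) > math.log(normal_pdf(realization, 0.5, 2.0))
```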