on Risk Management
 Issue of 2018‒07‒30 eighteen papers chosen by

1.  By: Matthieu Garcin (Natixis Asset Management and LabEx ReFi); Dominique Guegan (Centre d'Economie de la Sorbonne and LabEx ReFi); Bertrand Hassani (Grupo Santander and Centre d'Economie de la Sorbonne and LabEx ReFi) Abstract: The definition of multivariate Value at Risk is a challenging problem, whose most common solutions are given by the lower- and upper-orthant VaRs, which are based on copulas: the lower-orthant VaR is the quantile of the multivariate distribution function, whereas the upper-orthant VaR is the quantile of the multivariate survival function. In this paper we introduce a new multivariate Value at Risk, referred to as the Kendall Value at Risk, which links the copula approach to an alternative definition of multivariate quantiles, known as the quantile surface, which to our knowledge is not used in finance. More precisely, we transform the notion of orthant VaR, thanks to the Kendall function, into a multivariate VaR, that is to say a set of loss vectors, with several advantageous properties compared to the standard orthant VaR: i/ it is based on a total order; ii/ the probability level of the VaR is consistent with the probability measure of the set of the more severe loss vectors; iii/ the d-dimensional VaRs based on the distribution function and on the survival function have vectors in common, which reconciles the upper- and lower-orthant approaches. We quantify the differences between this new Kendall VaR and the orthant VaRs. In particular, we show that Kendall VaRs are less (respectively more) conservative than lower-orthant (resp. upper-orthant) VaRs. The definition and properties of the Kendall VaR are illustrated using Gumbel and Clayton copulas with lognormal marginal distributions and several levels of risk. Keywords: Value at Risk; multivariate quantile; risk measure; Kendall function; copula; total order JEL: C1 C6 Date: 2017–01 URL: http://d.repec.org/n?u=RePEc:mse:cesdoc:17008r&r=rmg
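The Kendall function central to this construction has a closed form for Archimedean copulas such as Clayton: K(t) = t - φ(t)/φ'(t), where φ is the copula generator. The following Python sketch (our own illustration, not the authors' code; the function names are hypothetical) computes K for a Clayton copula and inverts it by bisection to find the copula level whose Kendall measure equals a given probability:

```python
def kendall_function_clayton(t, theta):
    """Kendall function K(t) = t - phi(t)/phi'(t) for the Clayton copula,
    whose Archimedean generator is phi(t) = (t**-theta - 1)/theta."""
    return t + t * (1.0 - t**theta) / theta

def kendall_level(alpha, theta, tol=1e-10):
    """Invert K numerically by bisection: find t with K(t) = alpha.
    The copula level set {u : C(u) = t} then bounds the Kendall VaR region."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if kendall_function_clayton(mid, theta) < alpha:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Because K(t) >= t for Archimedean copulas, the copula level t solving K(t) = α lies below α, which is one way to see why the Kendall VaR sits between the lower- and upper-orthant VaRs.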
2.  By: J-C Gerlach (ETH Zurich); Guilherme Demos (ETH Zurich); Didier Sornette (ETH Zurich and Swiss Finance Institute) Abstract: We present a detailed bubble analysis of the Bitcoin to US Dollar price dynamics from January 2012 to February 2018. We introduce a robust automatic peak detection method that classifies price time series into periods of uninterrupted market growth (drawups) and regimes of uninterrupted market decrease (drawdowns). In combination with the Lagrange Regularisation Method for detecting the beginning of a new market regime, we identify 3 major peaks and 10 additional smaller peaks that have punctuated the dynamics of the Bitcoin price during the analyzed time period. We support this classification into long and short bubbles with a number of quantitative metrics and graphs in order to understand the main socio-economic drivers behind the ascent of Bitcoin over this period. We then analyze in detail the growing risks associated with the three long bubbles using the Log-Periodic Power Law Singularity (LPPLS) model, based on the LPPLS Confidence Indicators, defined as the fraction of qualified fits of the LPPLS model over multiple time windows. Furthermore, for various fictitious present analysis times t2, positioned in advance of bubble crashes, we employ a clustering method to group LPPLS fits over different time scales and the predicted critical times tc (the most probable time for the start of the crash ending the bubble). Each cluster is argued to provide a plausible scenario for the subsequent Bitcoin price evolution. We present these predictions for the three long bubbles and the four short bubbles that our time scale of analysis was able to resolve. Overall, our predictive scheme provides useful information to warn of an imminent crash risk.
Keywords: Cryptocurrency, Bitcoin, k-Means Clustering, Multiscale Bubble Indicator, Log-Periodic Power Law Singularity Analysis, Forecasting, Time Series Analysis, Market Crashes JEL: C2 C13 C32 C53 C61 G1 G10 Date: 2018–04 URL: http://d.repec.org/n?u=RePEc:chf:rpseri:rp1830&r=rmg
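The drawup/drawdown segmentation underlying the peak detection can be illustrated with a much simpler epsilon-based rule: a drawup ends (a peak is flagged) once the price falls a fixed fraction below its running maximum, and symmetrically for drawdowns. This sketch is only a simplified stand-in for the paper's robust, parameter-calibrated method, and its names are ours:

```python
def segment_drawups_drawdowns(prices, eps=0.05):
    """Naive epsilon segmentation of a price series into drawups/drawdowns.
    A peak index is recorded when price drops more than eps (fractional)
    below the running maximum; a trough when it rises more than eps above
    the running minimum. Returns (peak_indices, trough_indices)."""
    peaks, troughs = [], []
    mode = "up"   # current regime: "up" = drawup, "down" = drawdown
    ext_idx = 0   # index of the current extreme (running max or min)
    for i, p in enumerate(prices):
        if mode == "up":
            if p > prices[ext_idx]:
                ext_idx = i
            elif p < prices[ext_idx] * (1 - eps):
                peaks.append(ext_idx)
                mode, ext_idx = "down", i
        else:
            if p < prices[ext_idx]:
                ext_idx = i
            elif p > prices[ext_idx] * (1 + eps):
                troughs.append(ext_idx)
                mode, ext_idx = "up", i
    return peaks, troughs
```

A fixed eps is the main weakness this rule shares with older drawdown analyses; the paper's contribution is precisely to make the regime detection automatic and robust.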
3.  By: Christensen, Jens H. E. (Federal Reserve Bank of San Francisco); Lopez, Jose A. (Federal Reserve Bank of San Francisco); Mussche, Paul (Federal Reserve Bank of San Francisco) Abstract: Insurance companies and pension funds have liabilities far into the future and typically well beyond the longest maturity bonds trading in fixed-income markets. Such long-lived liabilities still need to be discounted, and yield curve extrapolations based on the information in observed yields can be used. We use dynamic Nelson-Siegel (DNS) yield curve models for extrapolating risk-free yield curves for Switzerland, Canada, France, and the U.S. We find slight biases in extrapolated long bond yields of a few basis points. In addition, the DNS model allows the generation of useful financial risk metrics, such as ranges of possible yield outcomes over projection horizons commonly used for stress-testing purposes. Therefore, we recommend using DNS models as a simple tool for generating extrapolated yields for long-term interest rate risk management. JEL: E43 E47 G12 G22 G28 Date: 2018–07–06 URL: http://d.repec.org/n?u=RePEc:fip:fedfwp:2018-09&r=rmg
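The cross-sectional building block of the DNS model is the Nelson-Siegel curve y(τ) = β0 + β1·(1−e^{−λτ})/(λτ) + β2·[(1−e^{−λτ})/(λτ) − e^{−λτ}]. A minimal sketch of fitting it and extrapolating beyond the longest observed maturity follows; it assumes a fixed decay parameter λ (so the betas enter linearly and OLS suffices), whereas the paper's DNS models additionally let the factors evolve dynamically:

```python
import numpy as np

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Nelson-Siegel zero-coupon yield at maturity tau (years).
    beta0: level, beta1: slope, beta2: curvature, lam: decay rate."""
    x = lam * tau
    slope = (1 - np.exp(-x)) / x
    return beta0 + beta1 * slope + beta2 * (slope - np.exp(-x))

def fit_and_extrapolate(maturities, yields, lam, extrap_maturities):
    """With lam fixed, the betas are linear coefficients: fit them by OLS,
    then evaluate the fitted curve beyond the observed maturities."""
    x = lam * np.asarray(maturities, float)
    slope = (1 - np.exp(-x)) / x
    A = np.column_stack([np.ones_like(x), slope, slope - np.exp(-x)])
    betas, *_ = np.linalg.lstsq(A, np.asarray(yields, float), rcond=None)
    return nelson_siegel(np.asarray(extrap_maturities, float), *betas, lam)
```

As τ grows, the slope and curvature loadings decay to zero, so extrapolated long yields converge to the level factor β0; this built-in limiting behavior is what makes the model attractive for discounting very long-dated liabilities.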
4.  By: Friederike Niepmann; Viktors Stebunovs Abstract: We investigate systematic changes in banks' projected credit losses between the 2014 and 2016 EBA stress tests, employing methodology from Philippon et al. (2017). We find that projected credit losses were smoothed across the tests through systematic model adjustments. Those banks whose losses would have increased the most from 2014 to 2016 due to changes in the supervisory scenarios (keeping the models constant and controlling for changes in the riskiness of underlying portfolios) saw the largest decrease in losses due to model changes. Model changes were more pronounced for banks that rely more on the Internal Ratings-Based approach, and they explain the cross-section of market responses to the release of the 2016 results. Stock prices and CDS spreads increased more for banks with larger reductions in projected credit losses due to model changes, as investors apparently did not interpret lower loan losses as reflecting mainly a decrease in credit risk but, instead, as a sign of lower capital requirements going forward. Keywords: Stress tests ; Financial institutions ; Regulation ; Credit risk models JEL: G21 G28 Date: 2018–07–19 URL: http://d.repec.org/n?u=RePEc:fip:fedgif:1232&r=rmg
5.  By: Morema, Kgotso; Bonga-Bonga, Lumengo Abstract: This paper studies the impact of gold and oil price fluctuations on the volatility of the South African stock market and its component indices or sectors – namely, the financial, industrial and resource sectors – making use of the asymmetric dynamic conditional correlation (ADCC) generalised autoregressive conditional heteroskedasticity (GARCH) model. Moreover, the study assesses the magnitude of the optimal portfolio weight, hedge ratio and hedge effectiveness for portfolios that are constituted of a pair of assets, namely oil-stock and gold-stock pairs. The findings of the study show that there is significant volatility spillover between the gold and stock markets, and between the oil and stock markets. This finding suggests the importance of the link between futures commodity markets and stock markets, which is essential for portfolio management. With reference to portfolio optimisation and the possibility of hedging with the pairs of assets under study, the findings suggest the importance of combining oil and stocks as well as gold and stocks for effective hedging against risk. Keywords: Hedge ratio, optimal portfolio weight, ADCC model, crises, hedge effectiveness, Asymmetric, risk, safe haven. JEL: C5 C58 G11 G15 Date: 2018–04–10 URL: http://d.repec.org/n?u=RePEc:pra:mprapa:87637&r=rmg
6.  By: Chandrashekar Kuyyamudi; Anindya S. Chakrabarti; Sitabhra Sinha Abstract: We show that the emergence of systemic risk in complex systems can be understood from the evolution of functional networks representing interactions inferred from fluctuation correlations between macroscopic observables. Specifically, we analyze the long-term collective dynamics of the New York Stock Exchange between 1926 and 2016, showing that periods marked by systemic crisis, viz., around the Great Depression of 1929-33 and the Great Recession of 2007-09, are associated with the emergence of frustration, indicated by the loss of structural balance in the interaction networks. During these periods the dominant eigenmodes characterizing the collective behavior exhibit delocalization, leading to increased coherence in the dynamics. The topological structure of the networks exhibits a slowly evolving trend marked by the emergence of a prominent core-periphery organization around both crisis periods. Date: 2018–07 URL: http://d.repec.org/n?u=RePEc:arx:papers:1807.02923&r=rmg
7.  By: Giovanni Barone-Adesi (Swiss Finance Institute); Chiara Legnazzi (Swiss Finance Institute); Carlo Sala (ESADE Business School and University of Lugano) Abstract: The forward-looking nature of option market data allows one to derive economically-based and model-free conditional risk measures. The option-implied methodology is a tool for regulators and companies to perform external or internal risk analysis without imposing assumptions on the distribution of returns. The article proposes the first comprehensive and extensive analysis of the performance of these measures compared with classical risk measures for the S&P500. Delivering good results at both short and long time horizons, the option-implied estimates emerge as a convenient alternative to the existing risk measures. Keywords: Option Prices, VaR and CVaR, Long and Short-term Risk Measures, S&P 500 Index JEL: G13 G32 D81 Date: 2018–04 URL: http://d.repec.org/n?u=RePEc:chf:rpseri:rp1829&r=rmg
8.  By: Ahmed, Hanan (Tilburg University, Center For Economic Research); Einmahl, John (Tilburg University, Center For Economic Research) Abstract: Heavy tailed phenomena are naturally analyzed by extreme value statistics. A crucial step in such an analysis is the estimation of the extreme value index, which describes the tail heaviness of the underlying probability distribution. We consider the situation where, next to the n observations of interest, we have another n+m observations of one or more related variables, such as financial losses due to earthquakes and the related amounts of energy released, observed for a longer period than that of the losses. Based on such a data set, we present an adapted version of the Hill estimator that shows greatly improved behavior, and we establish the asymptotic normality of this estimator. For this adaptation the tail dependence between the variable of interest and the related variable(s) plays an important role. A simulation study confirms the substantially improved performance of our adapted estimator relative to the Hill estimator. We also present an application to the aforementioned earthquake losses. Keywords: asymptotic normality; heavy tail; Hill estimator; tail dependence; variance reduction JEL: C13 C14 Date: 2018 URL: http://d.repec.org/n?u=RePEc:tiu:tiucen:78738894-06ad-409e-ba03-531c3308e118&r=rmg
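The baseline the authors improve upon is the classical Hill estimator, γ̂ = (1/k) Σᵢ log(X₍ₙ₋ᵢ₊₁₎ / X₍ₙ₋ₖ₎), the average log-excess of the k largest observations over the (k+1)-th largest. A minimal sketch of this baseline only (the paper's adaptation, which exploits the related variables, is not reproduced here):

```python
from math import log

def hill_estimator(sample, k):
    """Classical Hill estimator of the extreme value index (tail heaviness)
    of a heavy-tailed sample, using its k largest order statistics."""
    xs = sorted(sample, reverse=True)   # descending order statistics
    threshold = xs[k]                   # the (k+1)-th largest observation
    return sum(log(xs[i] / threshold) for i in range(k)) / k
```

For an exact Pareto tail with P(X > x) = x^(−α), the estimator targets γ = 1/α; the variance-reduction result in the paper shrinks its asymptotic variance by borrowing strength from tail-dependent covariates.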
9.  By: Eugenia Andreasen; Patricio Valenzuela Abstract: Using a panel dataset for international corporate bonds, this paper explores the relationship between investment opportunities and corporate credit risk. Consistent with theoretical arguments that investment opportunities reduce a firm’s likelihood of bankruptcy, this study shows that corporate credit spreads are negatively related to proxies for investment opportunities, even after controlling for the standard determinants of credit risk. This result is stronger for bonds maturing in the short and medium term. This paper also presents evidence that credit spreads and investment opportunities are linked through a credit-rating channel. Keywords: Bankruptcy; Credit ratings; Credit spreads; Default risk; Investment opportunities JEL: F3 F4 G1 G2 G3 Date: 2018 URL: http://d.repec.org/n?u=RePEc:edj:ceauch:335&r=rmg
10.  By: Olkhov, Victor Abstract: This paper presents a quantitative model of financial transactions between economic agents on economic space. Risk ratings of economic agents play the role of their coordinates. The aggregate amounts of agents’ financial variables at point x define macro financial variables as functions of time and coordinates. Financial transactions between agents define the evolution of agents’ financial variables. The aggregate amounts of financial transactions between agents at points x and y define macro financial transactions as functions of x and y. Macro transactions determine the evolution of macro financial variables. To describe the dynamics and interactions of macro transactions we derive hydrodynamic-like equations. The description of macro transactions permits modeling the evolution of macro financial variables and hence developing dynamics and forecasts of macro finance. As an example, for simple model interactions between macro transactions we derive hydrodynamic-like equations and obtain wave equations for their perturbations. Waves of macro transactions induce waves of macro financial variables on economic space. The diversity of financial waves of macro transactions and macro financial variables on economic space in simple models uncovers the internal complexity of macro financial processes. Any development of financial models and forecasts should take these financial wave processes and their influence into account. Keywords: Macro Finance; Risk Ratings; Economic Space; Wave Equations JEL: C02 E32 G00 G17 Date: 2017 URL: http://d.repec.org/n?u=RePEc:pra:mprapa:87316&r=rmg
11.  By: Brkic, Sabina; Hodzic, Migdat; Dzanic, Enis Abstract: The work reported in this paper aims to present a possibility distribution model of soft data used for corporate client credit risk assessment in commercial banking by applying Type 2 fuzzy membership functions (distributions), for the purpose of developing a new expert decision-making fuzzy model for evaluating credit risk of corporate clients in a bank. The paper is an extension of previous research conducted on the same subject which was based on Type 1 fuzzy distributions. Our aim in this paper is to address inherent limitations of Type 1 fuzzy distributions so that a broader range of banking data uncertainties can be handled and combined with the corresponding hard data, which all affect the banking credit decision-making process. Banking experts were interviewed about the types of soft variables used for credit risk assessment of corporate clients, as well as for providing the inputs for generating Type 2 fuzzy logic membership functions of these soft variables. Similar to our analysis with Type 1 fuzzy distributions, all identified soft variables can be grouped into a number of segments, which may depend on the specific bank case. In this paper we looked into the following segments: (i) stability, (ii) capability and (iii) readiness/willingness of the bank client to repay a loan. The results of this work represent a new approach for soft data modeling and usage, with the aim of being incorporated into a new and superior soft-hard data fusion model for client credit risk assessment. Keywords: Soft data, Type 2 fuzzy distributions, credit risk, default risk, commercial banking JEL: C53 G21 G32 Date: 2018–06–15 URL: http://d.repec.org/n?u=RePEc:pra:mprapa:87652&r=rmg
12.  By: Gross, Christian; Siklos, Pierre Abstract: Using variance decompositions in vector autoregressions (VARs) we model a high-dimensional network of European CDS spreads to assess the transmission of credit risk to the non-financial corporate sector. Our findings suggest a sectoral clustering in the CDS network, where financial institutions are located in the center and non-financial as well as sovereign CDS are grouped around the financial center. The network has a geographical component reflected in differences in the magnitude and direction of real-sector risk transmission across European countries. While risk transmission to the non-financial sector increases during crisis events, risk transmission within the non-financial sector remains largely unchanged. JEL Classification: C01, C32, G01, G15 Keywords: connectedness, contagion, credit risk, financial-real linkages, networks, systemic risk Date: 2018–07 URL: http://d.repec.org/n?u=RePEc:srk:srkwps:201878&r=rmg
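Variance-decomposition networks of this kind are typically built from a generalized forecast-error variance decomposition (Pesaran-Shin) of the estimated VAR, row-normalized as in Diebold-Yilmaz connectedness tables. A sketch for a VAR(1) follows; the names and the VAR(1) restriction are ours, and the paper's network is far higher-dimensional and estimated from CDS data:

```python
import numpy as np

def generalized_fevd(A, Sigma, H=10):
    """Generalized forecast-error variance decomposition over horizon H
    for a VAR(1) with coefficient matrix A and error covariance Sigma.
    Rows are normalized to sum to one, as in connectedness tables."""
    n = A.shape[0]
    num = np.zeros((n, n))
    den = np.zeros(n)
    for h in range(H):
        Th = np.linalg.matrix_power(A, h)   # MA(h) coefficient of the VAR(1)
        TS = Th @ Sigma
        num += TS**2 / np.diag(Sigma)       # (e_i' Th Sigma e_j)^2 / sigma_jj
        den += np.diag(TS @ Th.T)           # e_i' Th Sigma Th' e_i
    fevd = num / den[:, None]
    return fevd / fevd.sum(axis=1, keepdims=True)

def total_spillover(fevd):
    """Average share of forecast-error variance due to other variables."""
    n = fevd.shape[0]
    return (fevd.sum() - np.trace(fevd)) / n
```

Entry (i, j) of the table measures how much of variable i's forecast-error variance comes from shocks to j; summing the off-diagonal entries gives the total spillover index, and row/column sums give the directional "from" and "to" measures used to place financial institutions at the center of the network.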
13.  By: Jérémy Eydieux (CREM - Centre de recherche en économie et management - UNICAEN - Université de Caen Normandie - NU - Normandie Université - UR1 - Université de Rennes 1 - UNIV-RENNES - Université de Rennes - CNRS - Centre National de la Recherche Scientifique) Abstract: How can anticipation (analyzing risk, establishing rules and norms) and resilience (the capacity to maintain safety when facing unpredictable situations) be articulated? Classical theories of risk management tend to view them as potentially contradictory. The theory of valuation helps to think through their articulation by going beyond such a dichotomy. Adapting operational beliefs through ongoing valuation can never provide certain anticipations, but this social process should make anticipations intelligible, communicable and debatable. As fixing beliefs in advance is always incomplete for situations yet to come, its results should have flexible meanings in order to support, and not interfere with, situated management. In this way, anticipation becomes an instrument for resilience, and resilience an instrument for anticipation. Keywords: Theory of valuation, Risk management, Pragmatist philosophy, Risk governance Date: 2018–06–13 URL: http://d.repec.org/n?u=RePEc:hal:journl:hal-01817415&r=rmg
14.  By: Hansjoerg Albrecher (University of Lausanne and Swiss Finance Institute); Arian Cani (University of Lausanne) Abstract: In this paper we discuss the potential of randomizing reinsurance treaties for efficient risk management. While it may be considered counter-intuitive to introduce additional external randomness in the determination of the retention function for a given occurred loss, we indicate why and to what extent randomizing a treaty can be interesting for the insurer. We illustrate the approach with a detailed analysis of the effects of randomizing a stop-loss treaty on the expected profit after reinsurance in the framework of a one-year reinsurance model under regulatory solvency constraints and cost of capital considerations. Keywords: optimal reinsurance, randomization, stop-loss treaties, cost of capital, mean-excess function JEL: G22 C61 Date: 2018–05 URL: http://d.repec.org/n?u=RePEc:chf:rpseri:rp1833&r=rmg
15.  By: Arthur Charpentier (Department of Economics, Ecole Polytechnique - X - École polytechnique - CNRS - Centre National de la Recherche Scientifique) Date: 2018–07–05 URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-01831481&r=rmg
16.  By: Cremades, R.; Surminski, Swenja; Máñez Costa, M.; Hudson, P.; Shrivastava, P.; Gascoigne, J. Abstract: Assessing the dynamics of resilience could help insurers and governments reduce the costs of climate-risk insurance schemes and secure future insurability in the face of an increase in extreme hydro-meteorological events related to climate change. JEL: G32 Date: 2018–01–02 URL: http://d.repec.org/n?u=RePEc:ehl:lserod:86505&r=rmg
17.  By: Giovanni Caggiano; Efrem Castelnuovo; Gabriela Nodari Abstract: We employ real-time data available to US monetary policy makers to estimate a Taylor rule augmented with a measure of financial uncertainty over the period 1969-2008. We find evidence in favor of a systematic response to financial uncertainty over and above that to expected inflation, the output gap, and output growth. However, this evidence pertains to the Greenspan-Bernanke period only. Focusing on this period, the “risk-management” approach is found to be responsible for monetary policy easings of up to 75 basis points of the federal funds rate. Keywords: Risk management-driven policy rate gap, uncertainty, monetary policy, Taylor rules, real-time data. JEL: C2 E4 E5 Date: 2018–07 URL: http://d.repec.org/n?u=RePEc:een:camaaa:2018-34&r=rmg
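An uncertainty-augmented Taylor rule of the kind estimated here takes the generic form i_t = r* + π_t + a(π_t − π*) + b·gap_t + c·growth_t + d·unc_t, where a negative d captures risk-management-driven easing. The sketch below uses illustrative placeholder coefficients, not the paper's real-time estimates:

```python
def augmented_taylor_rate(r_star, pi, pi_star, gap, growth, unc,
                          a=0.5, b=0.5, c=0.25, d=-0.1):
    """Stylized Taylor rule with a financial-uncertainty term.
    r_star: equilibrium real rate; pi: inflation; pi_star: inflation target;
    gap: output gap; growth: output growth; unc: uncertainty measure.
    Coefficients a..d are illustrative placeholders, not estimates."""
    return r_star + pi + a * (pi - pi_star) + b * gap + c * growth + d * unc
```

With d < 0, a rise in the uncertainty measure lowers the prescribed funds rate relative to the standard rule, which is how the paper quantifies the "risk-management" policy rate gap.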
18.  By: Julien Hugonnier (Ecole Polytechnique Fédérale de Lausanne and Swiss Finance Institute); Florian Pelgrin (EDHEC Business School); Pascal St-Amour (University of Lausanne and Swiss Finance Institute) Abstract: The Human Capital (HK) and Value of Statistical Life (VSL) approaches differ sharply in their empirical pricing of a human life and lack a common theoretical background to justify these differences. We first contribute to the theory and measurement of life value by providing a unified framework to formally define and relate the Hicksian willingness to pay (WTP) to avoid changes in death risks, the HK and the VSL. Second, we use this setting to introduce an alternative life value calculated at Gunpoint (GPV), i.e. the WTP to avoid certain, instantaneous death. Third, we associate a flexible human capital model to the common framework to characterize the WTP and the three life valuations in closed form. Fourth, our structural estimates of these solutions yield mean life values of 8.35 M$ (VSL), 421 K$ (HK) and 447 K$ (GPV). We confirm that the strong curvature of the WTP and the linear projection hypothesis of the VSL explain why the latter is much larger than the other values. Keywords: Value of Human Life, Human Capital, Value of Statistical Life, Hicksian Willingness to Pay, Equivalent Variation, Mortality, Structural Estimation JEL: J17 G11 Date: 2018–04 URL: http://d.repec.org/n?u=RePEc:chf:rpseri:rp1827&r=rmg

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.