
on Risk Management 
By:  Chiara Pederzoli; Costanza Torricelli 
Abstract:  The European Banking Authority (EBA) stress tests, which aim to quantify banks’ capital shortfall in a potential future crisis (adverse economic scenario), have further stimulated the academic debate over systemic risk measures and their predictive/informative content. Focusing on market-based measures, Acharya et al. (2010) provide a theoretical background to justify the use of the Marginal Expected Shortfall (MES) for predicting stress test results, and verify it on the first stress test conducted after the 2007-2008 crisis on the US banking system (SCAP, Supervisory Capital Assessment Program). The aim of this paper is to further test the goodness of MES as a predictive measure by analysing it in relation to the results of the 2014 European stress test exercise conducted by the EBA. Our results depend strongly on the index used to capture the systemic distress event: MES, based on a global market index, shows no association with the EBA stress test results, whereas FMES, which is based on a financial market index, has significant informative and predictive power. Our results may carry useful regulatory implications for stress test exercises. 
Keywords:  systemic risk, stress test, macroprudential regulation 
JEL:  G01 G10 G28 
Date:  2015–07 
URL:  http://d.repec.org/n?u=RePEc:mod:wcefin:15207&r=rmg 
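As an illustrative aside, the MES discussed in the abstract above has a simple empirical form: the average return of an institution on the days when the chosen index (global or financial, which is exactly the distinction the paper exploits) is in its lower tail. The sketch below is a minimal stdlib-only illustration on synthetic data; the function name, the 5% tail level, and the simulated returns are assumptions for demonstration, not the paper's actual estimation.

```python
import random
import statistics

def marginal_expected_shortfall(firm_returns, market_returns, alpha=0.05):
    """MES: average firm return on the worst alpha-fraction of index days.

    A more negative MES indicates a larger expected loss for the firm
    when the reference index is in its lower tail.
    """
    paired = sorted(zip(market_returns, firm_returns))  # sort by index return
    k = max(1, int(len(paired) * alpha))                # number of tail days
    tail_firm = [f for _, f in paired[:k]]              # firm returns on those days
    return statistics.mean(tail_firm)

# Hypothetical synthetic data, for demonstration only.
random.seed(0)
market = [random.gauss(0.0, 0.01) for _ in range(1000)]
# A bank whose returns co-move with the index plus idiosyncratic noise.
bank = [2.0 * m + random.gauss(0.0, 0.005) for m in market]

mes = marginal_expected_shortfall(bank, market)
print(round(mes, 4))
```

Swapping `market` for a financial-sector index series is all that separates the paper's MES from its FMES variant computationally.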
By:  Véronique Maume-Deschamps (ICJ - Institut Camille Jordan [Villeurbanne] - ECL - École Centrale de Lyon - Université Jean Monnet - Saint-Etienne - UCBL - Université Claude Bernard Lyon 1 - INSA - Institut National des Sciences Appliquées - CNRS); Didier Rullière (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1); Khalil Said (COACTIS - UL2 - Université Lumière - Lyon 2 - Université Jean Monnet - Saint-Etienne, SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1) 
Abstract:  The European insurance sector will soon face the application of the Solvency 2 regulatory norms, which will bring a real change in risk management practices. The ORSA approach of the second pillar makes capital allocation an important exercise for all insurers, and especially for groups. For multi-branch firms, capital allocation has to be based on multivariate risk modeling. Several allocation methods are present in the literature and in insurers' practices. In this paper, we present a new risk allocation method, study its coherence using an axiomatic approach, and try to define what the best allocation choice for an insurance group is. 
Date:  2015–06–12 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal01163180&r=rmg 
By:  Susanne Emmer (CREAR - Center of Research in Econo-finance and Actuarial sciences on Risk / Centre de Recherche Econo-financière et Actuarielle sur le Risque - Essec Business School); Marie Kratz (SID - Information Systems, Decision Sciences and Statistics Department - Essec Business School, MAP5 - Mathématiques Appliquées à Paris 5 - CNRS - UPD5 - Université Paris Descartes - Paris 5 - Institut National des Sciences Mathématiques et de leurs Interactions); Dirk Tasche (Prudential Regulation Authority - Bank of England) 
Abstract:  Expected Shortfall (ES) has been widely accepted as a risk measure that is conceptually superior to Value-at-Risk (VaR). At the same time, however, it has been criticized for issues relating to backtesting. In particular, ES has been found not to be elicitable, which means that backtesting ES is less straightforward than, e.g., backtesting VaR. Expectiles have been suggested as potentially better alternatives to both ES and VaR. In this paper, we revisit commonly accepted desirable properties of risk measures such as coherence, comonotonic additivity, robustness and elicitability. We check whether or not VaR, ES and expectiles enjoy these properties, with particular emphasis on expectiles. We also consider their impact on capital allocation, an important issue in risk management. We find that, despite the caveats that apply to the estimation and backtesting of ES, it can be considered a good risk measure. In particular, there is insufficient evidence to justify an all-inclusive replacement of ES by expectiles in applications, especially as we provide an alternative way to backtest ES. 
Date:  2013–12–20 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal00921283&r=rmg 
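To make the elicitability contrast above concrete: the tau-expectile is characterized by an asymmetric first-order condition (it minimizes an asymmetric squared loss, hence is elicitable), while ES is a tail average. The stdlib-only sketch below gives empirical estimators of all three on a simulated loss sample; the simple order-statistic quantile and the bisection solver are illustrative choices, not the paper's methodology.

```python
import random
import statistics

def var(sample, level):
    """Value-at-Risk: the level-quantile of the loss sample (losses positive)."""
    s = sorted(sample)
    return s[min(len(s) - 1, int(level * len(s)))]

def expected_shortfall(sample, level):
    """ES: the average of losses at or beyond VaR, i.e. the tail mean."""
    v = var(sample, level)
    return statistics.mean(x for x in sample if x >= v)

def expectile(sample, tau, tol=1e-9):
    """tau-expectile: the unique e solving tau*E[(X-e)+] = (1-tau)*E[(e-X)+],
    found here by bisection. Unlike ES, it is elicitable."""
    lo, hi = min(sample), max(sample)
    while hi - lo > tol:
        e = (lo + hi) / 2
        above = sum(x - e for x in sample if x > e)
        below = sum(e - x for x in sample if x < e)
        if tau * above > (1 - tau) * below:
            lo = e  # e too small: the upper deviations still dominate
        else:
            hi = e
    return (lo + hi) / 2

random.seed(1)
losses = [random.gauss(0, 1) for _ in range(20000)]
print(var(losses, 0.975), expected_shortfall(losses, 0.975), expectile(losses, 0.975))
```

For a Gaussian sample the three estimates illustrate the usual ordering: the 0.975-expectile sits below the 97.5% VaR, which sits below the corresponding ES.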
By:  Stéphane Crépey (LaMME - Laboratoire de Mathématiques et Modélisation d'Evry - Institut national de la recherche agronomique (INRA) - Université d'Evry-Val d'Essonne - UPD5 - Université Paris Descartes - Paris 5 - ENSIIE - CNRS - UPEC UP12 - Université Paris-Est Créteil Val-de-Marne - Paris 12); S. Song (LaMME - Laboratoire de Mathématiques et Modélisation d'Evry - Institut national de la recherche agronomique (INRA) - Université d'Evry-Val d'Essonne - UPD5 - Université Paris Descartes - Paris 5 - ENSIIE - CNRS - UPEC UP12 - Université Paris-Est Créteil Val-de-Marne - Paris 12) 
Abstract:  A basic reduced-form counterparty risk modeling approach hinges on a standard immersion hypothesis between a reference filtration and the filtration progressively enlarged by the default times of the two parties, also involving the continuity of some of the data at default time. This basic approach is too restrictive for application to credit derivatives, which are characterized by strong wrong-way risk, i.e. adverse dependence between the exposure and the credit riskiness of the counterparties, and gap risk, i.e. slippage between the portfolio and its collateral during the so-called cure period that separates default from liquidation. This paper shows how a suitable extension of the basic approach can be devised so that it can be applied in dynamic copula models of counterparty risk on credit derivatives. More generally, this method is applicable in any intensity setup of marked default times satisfying a suitable integrability condition, which expresses that no mass is lost in a related measure change. The changed probability measure is not needed algorithmically: all one needs in practice is an explicit expression for the intensities of the marked default times. 
Date:  2014–03–04 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal00989062&r=rmg 
By:  Marc Busse (SCOR SE); Michel Dacorogna (SCOR SE); Marie Kratz (SID - Information Systems, Decision Sciences and Statistics Department - Essec Business School, MAP5 - Mathématiques Appliquées à Paris 5 - CNRS - UPD5 - Université Paris Descartes - Paris 5 - Institut National des Sciences Mathématiques et de leurs Interactions) 
Abstract:  Risk diversification is the basis of insurance and investment, so it is crucial to study the effects that could limit it. One of them is the existence of systemic risk that affects all the policies at the same time. We introduce here a probabilistic approach to examine the consequences of its presence on the risk loading of the premium of a portfolio of insurance policies. This approach could easily be generalized to investment risk. We see that, even with a small probability of occurrence, systemic risk can dramatically reduce the diversification benefits. It is clearly revealed via a non-diversifiable term that appears in the analytical expression of the variance of our models. We propose two ways of introducing it and discuss their advantages and limitations. By using both VaR and TVaR to compute the loading, we see that only the latter captures the full effect of systemic risk when its probability of occurrence is low. 
Date:  2013–12–06 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal00914844&r=rmg 
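The non-diversifiable effect described above can be seen in a toy Monte Carlo: a low-probability shock common to all policies keeps the TVaR-based loading per policy from shrinking as the portfolio grows, exactly because the shock cannot be diversified away. All numbers below (exponential claims, shock size 5, probability 1%) are invented for illustration and are not the models of the paper.

```python
import random
import statistics

def simulate_portfolio(n_policies, p_systemic, n_sims, seed=42):
    """Simulate the average loss per policy; with probability p_systemic a
    common shock hits every policy at once (the non-diversifiable part)."""
    rng = random.Random(seed)
    per_policy = []
    for _ in range(n_sims):
        systemic = rng.random() < p_systemic
        total = 0.0
        for _ in range(n_policies):
            loss = rng.expovariate(1.0)   # idiosyncratic claim, mean 1
            if systemic:
                loss += 5.0               # common shock (assumed size)
            total += loss
        per_policy.append(total / n_policies)
    return per_policy

def tvar(sample, level=0.99):
    """Tail Value-at-Risk: average of the worst (1 - level) outcomes."""
    s = sorted(sample)
    return statistics.mean(s[int(level * len(s)):])

per_policy = simulate_portfolio(200, 0.01, 2000)
loading = tvar(per_policy) - statistics.mean(per_policy)
print(round(loading, 3))
```

With 200 policies the idiosyncratic part averages out, yet the loading stays large because the 99% tail is dominated by the rare systemic scenarios, which is the paper's point about TVaR capturing low-probability systemic risk.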
By:  Aikman, David (Bank of England); Kiley, Michael T. (Board of Governors of the Federal Reserve System (U.S.)); Lee, Seung Jung (Board of Governors of the Federal Reserve System (U.S.)); Palumbo, Michael G. (Board of Governors of the Federal Reserve System (U.S.)); Warusawitharana, Missaka (Board of Governors of the Federal Reserve System (U.S.)) 
Abstract:  We provide a framework for assessing the build-up of vulnerabilities in the U.S. financial system. We collect forty-four indicators of financial and balance-sheet conditions, cutting across measures of valuation pressures, nonfinancial borrowing, and financial-sector health. We place the data in economic categories, track their evolution, and develop an algorithmic approach to monitoring vulnerabilities that can complement the more judgmental approach of most official-sector organizations. Our approach picks up rising imbalances in the U.S. financial system through the mid-2000s, presaging the financial crisis. We also highlight several statistical properties of our approach: most importantly, our summary measures of system-wide vulnerabilities lead the credit-to-GDP gap (a key gauge in Basel III and related research) by a year or more. Thus, our framework may provide useful information for setting macroprudential policy tools such as the countercyclical capital buffer. 
Keywords:  Early warning system; financial crisis; financial stability; financial vulnerabilities; heat maps; macroprudential policy; systemic risk; data visualization; countercyclical capital buffers 
JEL:  G01 G12 G21 G23 G28 
Date:  2015–06–24 
URL:  http://d.repec.org/n?u=RePEc:fip:fedgfe:201559&r=rmg 
By:  Christian Thimann (PSE - Paris-Jourdan Sciences Economiques - CNRS - Institut national de la recherche agronomique (INRA) - EHESS - École des hautes études en sciences sociales - ENS Paris - École normale supérieure - Paris - École des Ponts ParisTech (ENPC), AXA, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics) 
Abstract:  This paper aims at providing a conceptual distinction between banking and insurance with regard to systemic regulation. It discusses key differences and similarities as to how both sectors interact with the financial system. Insurers interact as financial intermediaries and through financial market investments, but do not share the features of banking that give rise to particular systemic risk in that sector, such as institutional interconnectedness through the interbank market, maturity transformation combined with leverage, the prevalence of liquidity risk and the operation of the payment system. The paper also draws attention to three salient features of insurance that need to be taken into account in systemic regulation: the quasi-absence of leverage, the fundamentally different role of capital and the ‘built-in bail-in’ of a significant part of insurance liabilities through policyholder participation. Based on these considerations, the paper argues that if certain activities were to give rise to concerns about systemic risk in the case of insurers, regulatory responses other than capital surcharges may be more appropriate. 
Date:  2014–10–16 
URL:  http://d.repec.org/n?u=RePEc:hal:psewpa:halshs01074933&r=rmg 
By:  Zachary Feinstein; Birgit Rudloff 
Abstract:  A method for calculating multi-portfolio time consistent multivariate risk measures in discrete time is presented. Market models for $d$ assets with transaction costs or illiquidity and possible trading constraints are considered on a finite probability space. The set of capital requirements at each time and state is calculated recursively backwards in time along the event tree. We motivate why the proposed procedure can be seen as a set-valued Bellman's principle, which might be of independent interest within the growing field of set optimization. We give conditions under which the backwards calculation of the sets reduces to solving a sequence of linear, respectively convex, vector optimization problems. Numerical examples are given and include superhedging under illiquidity, the set-valued entropic risk measure, and the multi-portfolio time consistent version of the relaxed worst case risk measure and of the set-valued average value at risk. 
Date:  2015–08 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1508.02367&r=rmg 
By:  Altunbas, Yener (Bangor Business School); Manganelli, Simone (European Central Bank); Marques-Ibanez, David (Board of Governors of the Federal Reserve System (U.S.)) 
Abstract:  In the years preceding the 2007-2009 financial crisis, forward-looking indicators of bank risk concentrated and suggested unusually low expectations of bank default. We assess whether the ex-ante (i.e. prior to the crisis) cross-sectional variability in bank characteristics is related to the ex-post (i.e. during the crisis) materialization of bank risk. Our tailor-made dataset crucially accounts for the different dimensions of realized bank risk, including access to central bank liquidity during the crisis. We consistently find that less reliance on deposit funding, more aggressive credit growth, larger size and higher leverage were associated with higher levels of realized risk. The impact of these characteristics is particularly relevant for capturing the systemic dimensions of bank risk and tends to become stronger in the tail of the riskier banks. The majority of these characteristics also predicted bank risk as it materialized before the financial crisis. 
Keywords:  Bank risk; business models; Great Recession 
JEL:  E58 G15 G21 G32 
Date:  2015–08–04 
URL:  http://d.repec.org/n?u=RePEc:fip:fedgif:1140&r=rmg 
By:  Véronique Maume-Deschamps (ICJ - Institut Camille Jordan [Villeurbanne] - ECL - École Centrale de Lyon - UCBL - Université Claude Bernard Lyon 1 - Université Jean Monnet - Saint-Etienne - INSA - Institut National des Sciences Appliquées - CNRS); Didier Rullière (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1); Khalil Said (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1, COACTIS - UL2 - Université Lumière - Lyon 2 - Université Jean Monnet - Saint-Etienne) 
Abstract:  The minimization of some multivariate risk indicators may be used as an allocation method, as proposed in Cénac et al. [6]. The aim of capital allocation is to choose a point in a simplex according to a given criterion. In a previous paper [17] we proved that the proposed allocation technique satisfies a set of coherence axioms. In the present one, we study the properties and asymptotic behavior of the allocation for some distribution models. We also analyze the impact of the dependence structure on the allocation using some copulas. 
Date:  2015–07–04 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal01171395&r=rmg 
By:  Véronique Maume-Deschamps (ICJ - Institut Camille Jordan [Villeurbanne] - ECL - École Centrale de Lyon - UCBL - Université Claude Bernard Lyon 1 - Université Jean Monnet - Saint-Etienne - INSA - Institut National des Sciences Appliquées - CNRS); Didier Rullière (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1); Khalil Said (COACTIS - UL2 - Université Lumière - Lyon 2 - Université Jean Monnet - Saint-Etienne, SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1) 
Abstract:  The issue of capital allocation in a multivariate context arises from the presence of dependence between the various risky activities, which may generate a diversification effect. Several allocation methods in the literature are based on the choice of a univariate risk measure and an allocation principle; others are based on optimizing a multivariate ruin probability or some multivariate risk indicators. In this paper, we focus on the latter technique. Using an axiomatic approach, we study its coherence properties. We give some explicit results in mono-periodic cases. Finally, we analyze the impact of the dependence structure on the optimal allocation. 
Date:  2014–11–13 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal01082559&r=rmg 
By:  Georg Mainik 
Abstract:  This paper studies convergence properties of multivariate distributions constructed by endowing empirical margins with a copula. This setting includes Latin Hypercube Sampling with dependence, also known as the Iman-Conover method. The primary question addressed here is the convergence of the component sum, which is relevant to risk aggregation in insurance and finance. This paper shows that a CLT for the aggregated risk distribution is not available, so that the underlying mathematical problem goes beyond classic functional CLTs for empirical copulas. This issue is relevant to Monte Carlo based risk aggregation in all multivariate models generated by plugging empirical margins into a copula. Instead of a functional CLT, this paper establishes strong uniform consistency of the estimated sum distribution function and provides a sufficient criterion for the convergence rate $O(n^{-1/2})$ in probability. These convergence results hold for all copulas with bounded densities. Examples with unbounded densities include bivariate Clayton and Gauss copulas. The convergence results are not specific to the component sum and also hold for any other componentwise non-decreasing aggregation function. On the other hand, convergence of estimates for the joint distribution is much easier to prove, including CLTs. Beyond Iman-Conover estimates, the results of this paper apply to multivariate distributions obtained by plugging empirical margins into an exact copula or by plugging exact margins into an empirical copula. 
Date:  2015–08 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1508.02749&r=rmg 
By:  Pascal Le Masson (CGS - Centre de Gestion Scientifique - MINES ParisTech - École nationale supérieure des mines de Paris); Benoit Weil (CGS - Centre de Gestion Scientifique - MINES ParisTech - École nationale supérieure des mines de Paris); Olga Kokshagina (CGS - Centre de Gestion Scientifique - MINES ParisTech - École nationale supérieure des mines de Paris) 
Abstract:  Risk management today has its main roots in the decision theory paradigm (Friedman and Savage 1948): it consists in making the optimal choice between given possible decisions and probable states of nature. In this paper we extend this model to include a design capacity for dealing with risk situations. A design perspective adds a new action possibility to the model: designing a new alternative to deal with the probable states of nature. The newly designed alternative might also "create" new risks, so that a design perspective also leads to modeling the emergence of new risks as an exogenous "design process". Hence a design perspective raises two issues: can we design an alternative that would lower the risk? Does this new alternative create new risks? We show (1) that minimizing known risks consists in designing an alternative whose success is independent of all the known risks; this alternative can be considered a generic technology. We show (2) that the design of this generic technology depends on the structure of the unknown, i.e. the structure of the space generated by the concept of a risk-free alternative. (3) We identify new strategies that deal with risks by dealing with the unknown. 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:hal:journl:hal01083249&r=rmg 
By:  Thomas Humblot (Larefi - Laboratoire d'analyse et de recherche en économie et finance internationales - Université Montesquieu - Bordeaux 4) 
Abstract:  This paper investigates the potential effects of Basel III on SME access to bank credit. In an innovative empirical framework, French small firms are studied using microdata over the 2008-2013 period. We conclude that the new regulation will have an M-shaped impact. In the end, Basel III eliminates low-profitability exposures regardless of their regulatory charge alleviations, restricts risky positions despite their profitability, and deepens the SME funding gap. Only risk/return profiles that remain dominant after regulatory adjustment are funded. On average, no reduction in either credit maturity or volume is observable. The overall effect ultimately depends on banks' initial position. 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal01096527&r=rmg 
By:  Andries, Marianne (Toulouse School of Economics); Eisenbach, Thomas M. (Federal Reserve Bank of New York); Schmalz, Martin C. (University of Michigan); Wang, Yichuan (University of Michigan) 
Abstract:  We estimate the term structure of the price of variance risk (PVR), which helps distinguish between competing asset-pricing theories. First, we measure the PVR as proportional to the Sharpe ratio of short-term holding returns of delta-neutral index straddles; second, we estimate the PVR in a Heston (1993) stochastic-volatility model. In both cases, the estimation is performed separately for different maturities. We find that the PVR is negative and decreases in absolute value with maturity; it is more negative, and its term structure steeper, when volatility is high. These findings are inconsistent with calibrations of established asset-pricing models that assume constant risk aversion across maturities. 
Keywords:  volatility risk; option returns; straddle; term structure 
JEL:  G12 G13 
Date:  2015–08–01 
URL:  http://d.repec.org/n?u=RePEc:fip:fednsr:736&r=rmg 
By:  Mengying Cui; David Levinson (Nexus (Networks, Economics, and Urban Systems) Research Group, Department of Civil Engineering, University of Minnesota) 
Abstract:  Risk severity in transportation network analysis is defined as the effect of a link or network failure on the whole system. The change in accessibility (the reduction in the number of jobs that can be reached) is used as an integrated indicator to reflect the severity of a link outage: the change in accessibility before and after the removal of a freeway segment from the network represents its risk severity. The analysis for the Minneapolis - St. Paul (Twin Cities) region shows that links near downtown Minneapolis have relatively higher risk severity than those in rural areas. The geographical distribution of the links with the highest risk severity shows that these links tend to be near or at intersections of freeways. The risk severity of these links based on accessibility to jobs and to workers at different time thresholds and during different day-parts is also analyzed in the paper. The research finds that network structure measures (betweenness, straightness and closeness) help explain the severity of loss due to a network outage. 
Keywords:  GPS data, congestion, network structure, accessibility, vulnerability 
JEL:  R14 R41 R42 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:nex:wpaper:vulnerability&r=rmg 
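The accessibility-based severity measure described above reduces, on a toy network, to three steps: count the jobs reachable within a travel-time threshold, delete the link, and recount. The sketch below illustrates the computation with shortest-path travel times; the network, link times and job counts are hypothetical, not the authors' Twin Cities data.

```python
import heapq

def jobs_accessible(graph, jobs, origin, threshold):
    """Jobs reachable from origin within a travel-time threshold (Dijkstra)."""
    dist = {origin: 0.0}
    heap = [(0.0, origin)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue
        for nbr, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return sum(jobs[n] for n, d in dist.items() if d <= threshold)

def link_risk_severity(graph, jobs, origin, threshold, link):
    """Drop in accessible jobs when one (directed) link fails."""
    before = jobs_accessible(graph, jobs, origin, threshold)
    a, b = link
    reduced = {n: [(m, w) for m, w in es if not (n == a and m == b)]
               for n, es in graph.items()}
    after = jobs_accessible(reduced, jobs, origin, threshold)
    return before - after

# Hypothetical network: removing A->B also pushes D past the 30-minute threshold.
graph = {"A": [("B", 10), ("C", 25)], "B": [("D", 10)], "C": [("D", 10)]}
jobs = {"A": 0, "B": 500, "C": 100, "D": 300}
print(link_risk_severity(graph, jobs, "A", 30, ("A", "B")))  # → 800
```

In the paper this difference is aggregated over origins and day-parts; the toy version already shows why central links score high: their removal reroutes many trips past the time threshold.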
By:  Guillaume Carlier (CEREMADE - CEntre de REcherches en MAthématiques de la DEcision - CNRS - Université Paris IX - Paris Dauphine); Victor Chernozhukov (CEREMADE - CEntre de REcherches en MAthématiques de la DEcision - CNRS - Université Paris IX - Paris Dauphine); Alfred Galichon (CEREMADE - CEntre de REcherches en MAthématiques de la DEcision - CNRS - Université Paris IX - Paris Dauphine) 
Abstract:  We propose a notion of conditional vector quantile function and a vector quantile regression. A conditional vector quantile function (CVQF) of a random vector Y taking values in ℝ^d, given covariates Z = z taking values in ℝ^k, is a map u ↦ Q_{Y∣Z}(u, z) which is monotone, in the sense of being the gradient of a convex function, and such that, given that the vector U follows a reference non-atomic distribution F_U (for instance the uniform distribution on the unit cube in ℝ^d), the random vector Q_{Y∣Z}(U, z) has the distribution of Y conditional on Z = z. Moreover, we have a strong representation, Y = Q_{Y∣Z}(U, Z) almost surely, for some version of U. The vector quantile regression (VQR) is a linear model for the CVQF of Y given Z. Under correct specification, the notion produces the strong representation Y = β(U)⊤f(Z), for f(Z) denoting a known set of transformations of Z, where u ↦ β(u)⊤f(Z) is a monotone map, the gradient of a convex function, and the quantile regression coefficients u ↦ β(u) have interpretations analogous to those of standard scalar quantile regression. As f(Z) becomes a richer class of transformations of Z, the model becomes nonparametric, as in series modelling. A key property of VQR is the embedding of the classical Monge-Kantorovich optimal transportation problem at its core as a special case. In the classical case, where Y is scalar, VQR reduces to a version of classical QR, and the CVQF reduces to the scalar conditional quantile function. Several applications to diverse problems, such as multiple Engel curve estimation and the measurement of financial risk, are considered. 
Date:  2015–06 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal01169653&r=rmg 
By:  Fernando Jaramillo (Universidad del Rosario - Facultad de Economia); Hubert Kempf (EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics); Fabien Moizeau (CREM - Centre de Recherche en Economie et Management - CNRS - Université de Caen Basse-Normandie - UR1 - Université de Rennes 1) 
Abstract:  We study the relationship between the distribution of individuals' attributes over the population and the extent of risk sharing in a risky environment. We consider a society where individuals voluntarily form risk-sharing groups in the absence of financial markets. We obtain a partition of society into distinct coalitions leading to partial risk sharing. When individuals differ only with respect to risk, the partition is homophily-based: the less risky agents congregate together and reject riskier ones into other coalitions. The distribution of risk affects the number and size of these coalitions. It turns out that individuals may pay a lower risk premium in riskier societies. We show that higher heterogeneity in risk leads to a lower degree of partial risk sharing. The case of heterogeneous risk aversion generates similar results. The empirical evidence on partial risk sharing can be understood when the endogenous partition of society into risk-sharing coalitions is taken into account. 
Date:  2015–05 
URL:  http://d.repec.org/n?u=RePEc:hal:journl:halshs01075648&r=rmg 
By:  Nicolas Gravel (AMSE - Aix-Marseille School of Economics - EHESS - École des hautes études en sciences sociales - Centre national de la recherche scientifique (CNRS) - Ecole Centrale Marseille (ECM) - AMU - Aix-Marseille Université); Benoît Tarroux (CREM - Centre de Recherche en Economie et Management - CNRS - Université de Caen Basse-Normandie - UR1 - Université de Rennes 1) 
Abstract:  In this paper, we theoretically characterize robust empirically implementable normative criteria for evaluating socially risky situations. Socially risky situations are modeled as distributions, among individuals, of lotteries on a finite set of state-contingent pecuniary consequences. Individuals are assumed to have selfish Von Neumann-Morgenstern preferences over these socially risky situations. We provide empirically implementable criteria that coincide with the unanimity, over a reasonably large class of such individual preferences, of anonymous and Pareto-inclusive Von Neumann-Morgenstern social rankings of risks. The implementable criteria can be interpreted as sequential expected poverty dominance. An illustration of the usefulness of the criteria for comparing the exposure to unemployment risk of different segments of the French and US workforce is also provided. 
Date:  2015–02 
URL:  http://d.repec.org/n?u=RePEc:hal:journl:halshs01057024&r=rmg 
By:  Alexandre Mornet (Allianz, SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1); Thomas Opitz (I3M - Institut de Mathématiques et de Modélisation de Montpellier - CNRS - UM2 - Université Montpellier 2 - Sciences et Techniques); Michel Luzi (Allianz); Stéphane Loisel (SAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1) 
Abstract:  For insurance companies, wind storms represent a main source of volatility, leading to potentially huge aggregated claim amounts. In this article, we compare different constructions of a storm index that allow us to assess the economic impact of storms on an insurance portfolio by exploiting information from historical wind speed data. Contrary to historical insurance portfolio data, meteorological variables can be considered stationary across years and are easily available with long observation records; hence, they represent a valuable source of additional information for insurers if the relation between claim observations and wind speeds can be revealed. Since standard correlation measures between raw wind speeds and insurance claims are weak, a storm index focusing on high wind speeds can provide better information. This method has been used on German territory by Klawa and Ulbrich and gave good results for yearly aggregated claims. Using historical meteorological and insurance data, we assess the consistency of the proposed index constructions and test their sensitivity to their various parameters and weights. Moreover, we are able to place the major insurance events since 1998 on a broader horizon of 40+ years. Our approach provides a meteorological justification for calculating the return periods of extreme storm-related insurance events whose magnitude has rarely been reached. 
Date:  2014–07 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal01081758&r=rmg 
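A storm index focusing on high wind speeds, of the cubic-exceedance kind associated with Klawa and Ulbrich in the abstract above, can be sketched in a few lines. The empirical-quantile convention, the 98% threshold and the toy wind series below are illustrative assumptions, not the article's calibrated construction.

```python
def storm_index(daily_max_speeds, q=0.98):
    """Cubic exceedance index in the spirit of Klawa and Ulbrich: only winds
    above the local q-quantile v_q contribute, entering as ((v / v_q) - 1)^3,
    so rare, very strong gusts dominate the index."""
    s = sorted(daily_max_speeds)
    v_q = s[min(len(s) - 1, int(q * len(s)))]  # simple empirical q-quantile
    return sum((v / v_q - 1.0) ** 3 for v in daily_max_speeds if v > v_q)

# Toy series: 98 calm days at 10 m/s, two stormy days at 20 and 30 m/s.
speeds = [10.0] * 98 + [20.0, 30.0]
print(storm_index(speeds))  # only the 30 m/s day exceeds v_q = 20
```

The cubic power encodes the empirical observation that storm losses grow much faster than linearly in wind speed, which is why such an index correlates with aggregated claims better than raw speeds do.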