
on Risk Management 
By:  Michael McAleer (Erasmus University Rotterdam); Juan-Ángel Jiménez-Martín (Complutense University of Madrid); Teodosio Pérez-Amaral (Complutense University of Madrid) 
Abstract:  The Basel II Accord requires that banks and other Authorized Deposit-taking Institutions (ADIs) communicate their daily risk forecasts to the appropriate monetary authorities at the beginning of each trading day, using one or more risk models to measure Value-at-Risk (VaR). The risk estimates of these models are used to determine capital requirements and associated capital costs of ADIs, depending in part on the number of previous violations, whereby realised losses exceed the estimated VaR. In this paper we define risk management in terms of choosing from a variety of risk models, and discuss the selection of optimal risk models. A new approach to model selection for predicting VaR is proposed, consisting of combining alternative risk models, and we compare conservative and aggressive strategies for choosing between VaR models. We then examine how different risk management strategies performed during the 2008-09 global financial crisis. These issues are illustrated using Standard and Poor's 500 Composite Index. 
Keywords:  Value-at-Risk (VaR); daily capital charges; violation penalties; optimizing strategy; risk forecasts; aggressive or conservative risk management strategies; Basel Accord; global financial crisis 
JEL:  G32 G11 G17 C53 C22 
Date:  2013–01–08 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:20130010&r=rmg 
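The penalty mechanism this abstract refers to can be sketched in a few lines. The sketch below is illustrative, not the authors' model: the traffic-light penalty table follows the Basel backtesting framework, and all function names are assumptions.

```python
import numpy as np

def historical_var(returns, alpha=0.01):
    """One-day historical-simulation VaR at level alpha, reported as a positive number."""
    return -np.quantile(returns, alpha)

def basel_multiplier(violations):
    """Basel traffic-light multiplier: green zone (<= 4 violations in 250 days)
    keeps k = 3, yellow zone (5-9) adds a rising penalty, red zone (>= 10) sets k = 4."""
    penalties = {5: 0.40, 6: 0.50, 7: 0.65, 8: 0.75, 9: 0.85}
    if violations <= 4:
        return 3.0
    return 3.0 + penalties.get(violations, 1.0)

def daily_capital_charge(var_today, var_last_60, violations):
    """Market-risk capital: the larger of today's VaR and the penalty-scaled
    average VaR over the previous 60 trading days."""
    return max(var_today, basel_multiplier(violations) * np.mean(var_last_60))
```

A model that lowers average VaR (an aggressive strategy) shrinks the second term but risks more violations and hence a higher multiplier, which is the trade-off between conservative and aggressive strategies the paper explores.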
By:  Ignacio Cascos; Ilya Molchanov 
Abstract:  Since risky positions in multivariate portfolios can be offset by various choices of capital requirements that depend on the exchange rules and related transaction costs, it is natural to assume that the risk measures of random vectors are set-valued. Furthermore, it is reasonable to include the exchange rules in the argument of the risk measure and so consider risk measures of set-valued portfolios. This situation includes the classical Kabanov's transaction costs model, where the set-valued portfolio is given by the sum of a random vector and an exchange cone. The definition of the selection risk measure is based on calling a set-valued portfolio acceptable if it possesses a selection with all individually acceptable marginals. The obtained risk measure is coherent (or convex), law invariant and has values being upper convex closed sets. We describe the dual representation of the selection risk measure and suggest efficient ways of approximating it from below and from above. In the case of Kabanov's exchange cone model, it is shown how the selection risk measure relates to the set-valued risk measures considered by Kulikov (2008) and Hamel and Heyde (2010). 
Date:  2013–01 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1301.1496&r=rmg 
By:  Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Paris I - Panthéon-Sorbonne); Bertrand Hassani (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Paris I - Panthéon-Sorbonne) 
Abstract:  The Advanced Measurement Approach requires financial institutions to develop internal models to evaluate their capital charges. Traditionally, the Loss Distribution Approach (LDA) is used, mixing frequencies and severities to build a Loss Distribution Function (LDF). This distribution represents annual losses; consequently, the 99.9th percentile of the distribution, which provides the capital charge, denotes the worst year in a thousand. The current approach suggested by the regulator and implemented in financial institutions assumes the independence of the losses. In this paper, we propose a solution to address the issues arising when autocorrelations are detected between the losses. Our approach suggests working with the losses considered as time series. Thus, the losses are aggregated periodically, time series processes (among AR, ARFI, and Gegenbauer processes) are fitted to the resulting series, and a distribution is fitted to the residuals. Finally, a Monte Carlo simulation enables constructing the LDF, and the pertaining risk measures are evaluated. In order to show the impact of the choice of internal models on the capital charges, the paper draws a parallel between the static traditional approach and an appropriate dynamical modelling. If, when implementing the traditional LDA, no particular distribution proves adequate for the data (that is, the goodness-of-fit tests reject them all), then retaining the LDA modelling corresponds to an arbitrary choice. We suggest in this paper an alternative and robust approach. For instance, for the two data sets we explore in this paper, with the strategies presented here, the independence assumption is released and we are able to capture the autocorrelations inside the losses through the time series modelling. 
The construction of the related LDF enables the computation of the capital charge and therefore permits complying with the regulation while taking into account at the same time both the large losses (with adequate distributions on the residuals) and the correlations between losses (with the time series modelling). 
Keywords:  Operational risk, time series, Gegenbauer processes, Monte Carlo, risk measures. 
Date:  2012–12 
URL:  http://d.repec.org/n?u=RePEc:hal:cesptp:halshs00771387&r=rmg 
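The dynamic LDA pipeline the abstract describes (aggregate losses periodically, fit a time series process, fit a distribution to the residuals, and Monte Carlo the annual LDF) can be sketched as follows. This is a deliberately simplified stand-in: an AR(1) instead of the paper's AR/ARFI/Gegenbauer candidates, and bootstrapped residuals instead of a fitted residual distribution.

```python
import numpy as np

def fit_ar1(x):
    """Fit x_t = c + phi * x_{t-1} + eps_t by least squares; return (c, phi, residuals)."""
    X = np.column_stack([np.ones(len(x) - 1), x[:-1]])
    (c, phi), *_ = np.linalg.lstsq(X, x[1:], rcond=None)
    resid = x[1:] - c - phi * x[:-1]
    return c, phi, resid

def simulate_annual_losses(c, phi, resid, n_years=100_000, periods=12, seed=0):
    """Monte Carlo: simulate `periods` dependent periodic losses per year by
    running bootstrapped residuals through the AR(1) recursion, then sum
    within years to obtain draws from the annual loss distribution."""
    rng = np.random.default_rng(seed)
    eps = rng.choice(resid, size=(n_years, periods))
    x = np.empty((n_years, periods))
    x[:, 0] = c / (1 - phi) + eps[:, 0]      # start near the stationary mean
    for t in range(1, periods):
        x[:, t] = c + phi * x[:, t - 1] + eps[:, t]
    return np.clip(x, 0, None).sum(axis=1)   # losses are non-negative

def capital_charge(annual_losses, q=0.999):
    """The 99.9th percentile of the simulated annual loss distribution."""
    return np.quantile(annual_losses, q)
```

The point of the dynamic approach survives even in this toy form: once phi is non-zero, simulated periodic losses are serially dependent, and the 99.9th percentile differs from what an independence-based LDA would produce.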
By:  Frank Riedel; Tobias Hellmann 
Abstract:  Foster and Hart proposed an operational measure of riskiness for discrete random variables. We show that their defining equation has no solution for many common continuous distributions, including, for example, many uniform distributions. We show how to consistently extend the definition of riskiness to continuous random variables. For many continuous random variables, the risk measure is equal to the worst-case risk measure, i.e. the maximal possible loss incurred by that gamble. We also extend the Foster-Hart risk measure to dynamic environments for general distributions and probability spaces, and we show that the extended measure avoids bankruptcy in infinitely repeated gambles. 
Date:  2013–01 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1301.1471&r=rmg 
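The discrete defining equation the paper starts from admits a short numerical illustration: the Foster-Hart riskiness R of a gamble g is the unique R above the maximal loss solving E[log(1 + g/R)] = 0. The sketch below covers only that discrete case; the paper's point is precisely that this equation can fail to have a solution for continuous distributions.

```python
import math
from scipy.optimize import brentq

def foster_hart_riskiness(outcomes, probs):
    """Riskiness R of a discrete gamble: the unique R greater than the maximal
    loss with E[log(1 + g/R)] = 0. Requires E[g] > 0 and P(g < 0) > 0."""
    max_loss = -min(outcomes)
    f = lambda r: sum(p * math.log(1 + x / r) for x, p in zip(outcomes, probs))
    # f -> -inf as R approaches the maximal loss from above, and f -> 0+ as
    # R -> infinity (since E[g] > 0), so a sign change brackets the root.
    return brentq(f, max_loss * (1 + 1e-9), max_loss * 1e6)
```

For the classic example gamble (+120 or -100 with equal probability), the riskiness is exactly 600, since 1.2 × (5/6) = 1 makes the expected log return vanish.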
By:  Cihak, Martin; Demirguc-Kunt, Asli; Johnston, R. Barry 
Abstract:  A large body of evidence points to misaligned incentives as having a key role in the run-up to the global financial crisis. These include bank managers' incentives to boost short-term profits and create banks that are "too big to fail," regulators' incentives to forbear and withhold information from other regulators in stressful times, and credit rating agencies' incentives to keep issuing high ratings for subprime assets. As part of the response to the crisis, policymakers and regulators also attempted to address some incentive issues, but various outside observers have criticized the response for being insufficient. This paper proposes a pragmatic approach to reorienting financial regulation to have at its core the objective of addressing incentives on an ongoing basis. Specifically, the paper proposes "incentive audits" as a tool that could help in identifying incentive misalignments in the financial sector. The paper illustrates how such audits could be implemented in practice, and what the implications would be for the design of policies and frameworks to mitigate systemic risks. 
Keywords:  Banks & Banking Reform, Debt Markets, Emerging Markets, Labor Policies, Insurance & Risk Mitigation 
Date:  2013–01–01 
URL:  http://d.repec.org/n?u=RePEc:wbk:wbrwps:6308&r=rmg 
By:  Frédéric Abergel (FiQuant - Chaire de finance quantitative - Ecole Centrale Paris, MAS - Mathématiques Appliquées aux Systèmes - EA 4037 - Ecole Centrale Paris) 
Abstract:  In this note, I study further a new approach recently introduced for the hedging of derivatives in incomplete markets via non-quadratic local risk minimization. A structure result is provided, which essentially shows the equivalence between non-quadratic risk minimization under the historical probability and quadratic local risk minimization under an equivalent, implicitly defined probability. 
Date:  2013–01–08 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal00771528&r=rmg 
By:  Phillip Monin 
Abstract:  Sharpe et al. proposed the idea of having an expected utility maximizer choose a probability distribution for future wealth as an input to her investment problem instead of a utility function. They developed a computer program, called The Distribution Builder, as one way to elicit such a distribution. In a single-period model, they then showed how this desired distribution for terminal wealth can be used to infer the investor's risk preferences. We adapt their idea, namely that a risk-averse investor can choose a desired distribution for future wealth as an alternative input attribute for investment decisions, to continuous time. In a variety of scenarios, we show how the investor's desired distribution combines with her initial wealth and market-related input to determine the feasibility of her distribution, her implied risk preferences, and her optimal policies throughout her investment horizon. We then provide several examples. 
Date:  2013–01 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1301.0907&r=rmg 
By:  Elisa Luciano; Luca Regis; Elena Vigna 
Abstract:  The paper presents closed-form Delta and Gamma hedges for annuities and death assurances, in the presence of both longevity and interest-rate risk. Longevity risk is modelled through an extension of the classical Gompertz law, while interest-rate risk is modelled via a Hull-and-White process. We theoretically provide natural hedging strategies, considering also contracts written on different generations. We provide a UK-population and bond-market calibrated example. We compute longevity exposures and explicitly calculate Delta-Gamma hedges. Reinsurance is needed in order to set up portfolios which are Delta-Gamma neutral to both longevity and interest-rate risk. 
Date:  2012–01 
URL:  http://d.repec.org/n?u=RePEc:icr:wpmath:212011&r=rmg 
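As a rough illustration of the objects involved (not the authors' closed-form Greeks), one can price a unit annuity under a classical Gompertz survival law and approximate a longevity "Delta" by finite differences. All parameter values and function names below are illustrative assumptions.

```python
import numpy as np

def gompertz_survival(t, a=0.0002, b=0.09):
    """Survival function S(t) = exp(-(a/b)(e^{bt} - 1)) for the classical
    Gompertz force of mortality mu(t) = a * e^{bt} (parameters illustrative)."""
    return np.exp(-(a / b) * (np.exp(b * np.asarray(t, float)) - 1.0))

def annuity_price(survival, rate=0.03, horizon=40):
    """Unit annuity value: sum of discounted survival probabilities."""
    t = np.arange(1, horizon + 1)
    return float(np.sum(np.exp(-rate * t) * survival(t)))

def longevity_delta(bump=1e-4):
    """Finite-difference sensitivity of the annuity to a bump in the Gompertz
    level a: a crude numerical stand-in for a closed-form longevity Delta."""
    base = annuity_price(gompertz_survival)
    up = annuity_price(lambda t: gompertz_survival(t, a=0.0002 + bump))
    return (up - base) / bump
```

Higher mortality lowers survival probabilities and hence the annuity value, so this Delta is negative; a death assurance has the opposite sign, which is what makes the natural hedging of the paper possible.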
By:  Volodymyr Korniichuk 
Abstract:  We propose a model for forecasting extreme electricity prices in real-time (high-frequency) settings. The unique feature of our model is its ability to forecast electricity price exceedances over very high thresholds, where only a few (if any) observations are available. The model can also be applied for simulating times of occurrence and magnitudes of the extreme prices. We employ a copula with a changing dependence parameter for capturing serial dependence in the extreme prices, and the censored GPD for modelling their marginal distributions. For modelling the times of the extreme price occurrences we propose an approach based on a negative binomial distribution. The model is applied to electricity spot prices from Australia's national electricity market. 
Keywords:  electricity spot prices, copula, GPD, negative binomial distribution 
JEL:  C53 C51 C32 
Date:  2012–12–27 
URL:  http://d.repec.org/n?u=RePEc:cgr:cgsser:0314&r=rmg 
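The two building blocks named in the abstract, a GPD for exceedance magnitudes and a negative binomial for occurrence counts, can be sketched with standard scipy distributions. The data, threshold choice, and negative binomial parameters below are illustrative assumptions, and the paper's copula and censoring layers are omitted.

```python
import numpy as np
from scipy.stats import genpareto, nbinom

rng = np.random.default_rng(42)
# Illustrative stand-in data for electricity spot prices (the paper uses
# Australia's national electricity market).
prices = rng.lognormal(mean=4.0, sigma=1.0, size=10_000)

# 1. Peaks over threshold: fit a GPD to exceedances over a high quantile.
threshold = np.quantile(prices, 0.99)
exceedances = prices[prices > threshold] - threshold
shape, _, scale = genpareto.fit(exceedances, floc=0)

# 2. Simulate magnitudes of future extreme prices from the fitted GPD.
sim_prices = threshold + genpareto.rvs(shape, loc=0, scale=scale,
                                       size=1000, random_state=rng)

# 3. Simulate daily counts of extreme-price occurrences with a negative
#    binomial distribution (parameters r=5, p=0.5 are illustrative, not fitted).
daily_counts = nbinom.rvs(5, 0.5, size=365, random_state=rng)
```

The negative binomial allows the count variance to exceed its mean, which is why it is preferred over a Poisson when extreme prices cluster in time.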
By:  Harry Bensusan (CMAP - Centre de Mathématiques Appliquées - Ecole Polytechnique - X - CNRS : UMR7641); Nicole El Karoui (CMAP - Centre de Mathématiques Appliquées - Ecole Polytechnique - X - CNRS : UMR7641, LPMA - Laboratoire de Probabilités et Modèles Aléatoires - CNRS : UMR7599 - Université Paris VI - Pierre et Marie Curie - Université Paris VII - Paris Diderot); Stéphane Loisel (SAF - Laboratoire de Sciences Actuarielle et Financière - Université Claude Bernard - Lyon I : EA2429); Yahia Salhi (SAF - Laboratoire de Sciences Actuarielle et Financière - Université Claude Bernard - Lyon I : EA2429) 
Abstract:  In this paper, we introduce a new structured financial product: the so-called Life Nominal Chooser Swaption (LNCS). Thanks to such a contract, insurers could keep pure longevity risk and transfer a great part of the interest rate risk underlying annuity portfolios to financial markets. Before the issuance of the contract, the insurer determines a confidence band of survival curves for her portfolio. An interest rate hedge is set up, based on swaption mechanisms. The bank uses this band as well as an interest rate model to price the product. At the end of the first period (e.g. 8 to 10 years), the insurer has the right to enter into an interest rate swap with the bank, where the nominal is adjusted to her (re-forecasted) needs. She chooses (inside the band) the survival curve that best fits her anticipation of the future mortality of her portfolio (during 15 to 20 more years, say) given the information available at that time. We use a population dynamics longevity model and a classical two-factor interest rate model to price this product. Numerical results show that the option offered to the insurer (in terms of choice of nominal) is not too expensive in many real-world cases. We also discuss the pros and cons of the product and of our methodology. This structure enables insurers and financial institutions to remain in their initial fields of expertise. 
Date:  2012–12–21 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal00768526&r=rmg 
By:  Bertrand Candelon (Economics - Maastricht University); Guillaume Gaulier (Centre de recherche de la Banque de France - Banque de France); Christophe Hurlin (LEO - Laboratoire d'économie d'Orléans - CNRS : UMR6221 - Université d'Orléans) 
Abstract:  This paper proposes a new approach to dating extreme financial cycles. Building on recent methods in extreme value theory, it develops an extension of the famous calculus rule to detect extreme peaks and troughs. Applied to the United States stock market since 1871, it leads to a dating of these exceptional events and calls for adequate economic policies to tackle them. 
Keywords:  Financial extreme cycles; Extreme value theory 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:halshs00769817&r=rmg 
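The "calculus rule" on a discrete series marks a peak where the first difference changes sign from positive to negative, and a trough where it changes from negative to positive. The sketch below applies that rule and then keeps only large-amplitude turning points; the amplitude filter is an illustrative stand-in for the paper's extreme-value-theory-based dating, not its actual rule.

```python
import numpy as np

def peaks_and_troughs(x):
    """Calculus rule on a discrete series: a peak at t where the first
    difference changes sign from positive to negative, a trough where it
    changes from negative to positive."""
    d = np.sign(np.diff(x))
    peaks = [t for t in range(1, len(x) - 1) if d[t - 1] > 0 and d[t] < 0]
    troughs = [t for t in range(1, len(x) - 1) if d[t - 1] < 0 and d[t] > 0]
    return peaks, troughs

def extreme_turning_points(x, peaks, troughs, q=0.95):
    """Retain only turning points whose phase amplitude (the move to the next
    turning point) exceeds a high quantile of all phase amplitudes."""
    tps = sorted(peaks + troughs)
    amp = np.abs(np.diff(np.asarray(x, float)[tps]))
    cut = np.quantile(amp, q)
    keep = set()
    for i, a in enumerate(amp):
        if a >= cut:
            keep.update((tps[i], tps[i + 1]))
    return sorted(keep)
```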
By:  Cho, In Soo 
Abstract:  We investigate the stability of measured risk attitudes over time, using a 13-year longitudinal sample of individuals in the NLSY79. We find that an individual's risk aversion changes systematically in response to personal economic circumstances. Risk aversion increases with lengthening spells of employment and time out of the labor force, and decreases with lengthening unemployment spells. However, the most important result is that the majority of the variation in risk aversion is due to changes in measured individual tastes over time and not to variation across individuals. These findings that measured risk preferences are endogenous and subject to substantial measurement error suggest caution in interpreting coefficients in models relying on contemporaneous, one-time measures of risk preferences. 
Keywords:  risk aversion; stability; variance decomposition; within; measurement error; between; fixed effects 
JEL:  C23 D81 
Date:  2013–01–10 
URL:  http://d.repec.org/n?u=RePEc:isu:genres:35751&r=rmg 
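The within/between decomposition behind the paper's headline result can be illustrated directly: total panel variance splits into the variance of individual means around the grand mean (between) and the variance around each individual's own mean (within). The function below is a generic sketch, not the authors' estimator.

```python
import numpy as np

def within_between_decomposition(values, ids):
    """Split the total (population) variance of a panel variable into a
    between component (variance of individual means around the grand mean)
    and a within component (variance around each individual's own mean)."""
    values = np.asarray(values, float)
    ids = np.asarray(ids)
    grand_mean = values.mean()
    between = within = 0.0
    for i in np.unique(ids):
        v = values[ids == i]
        between += len(v) * (v.mean() - grand_mean) ** 2
        within += ((v - v.mean()) ** 2).sum()
    n = len(values)
    return between / n, within / n  # between + within equals np.var(values)
```

A large within share, as the paper reports for measured risk aversion, means most variation happens inside individuals over time rather than across individuals.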
By:  Alexander B. Matthies 
Abstract:  We report on the current state and important older findings of empirical studies on corporate credit ratings and their relationship to ratings of other entities. Specifically, we consider the results of three lines of research: the correlation of credit ratings and corporate default, the influence of ratings on capital markets, and the determinants of credit ratings and rating changes. Results from each individual line are important and relevant for the construction and interpretation of studies in the other two fields, e.g. for the choice of statistical methods. Moreover, the design and construction of credit ratings and of the credit rating scale are essential to understanding empirical findings. 
Keywords:  Rating agency; Credit Ratings; Through-the-cycle rating methodology; Corporate Governance 
JEL:  G20 G24 G30 G32 G34 
Date:  2013–01 
URL:  http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2013003&r=rmg 