nep-rmg New Economics Papers
on Risk Management
Issue of 2006‒01‒24
fifteen papers chosen by
Stan Miles
York University

  1. The modelling of operational risk: experience with the analysis of the data collected by the Basel Committee By Marco Moscadelli
  2. The Day of the Week Effect Patterns on Stock Market Return and Volatility: Evidence for the Athens Stock Exchange By Dimitris Kenourgios; Aristeidis Samitas; Spyros Papathanasiou
  3. On the Choice-Based Sample Bias in Probabilistic Business Failure Prediction By Skogsvik, Kenth
  4. Monetary policy and stock prices: theory and evidence By Stefano Neri
  5. The International CAPM and a wavelet-based decomposition of Value at Risk By Viviana Fernandez
  6. Portfolio Value at Risk Based on Independent Components Analysis By Ying Chen; Wolfgang Härdle; Vladimir Spokoiny
  7. Methodology and Implementation of Value-at-Risk Measures in Emerging Fixed-Income Markets with Infrequent Trading. By Gonzalo Cortazar; Alejandro Bernales; Diether Beuermann
  8. La Value-at-Risk: Modèles de la VaR, simulations en Visual Basic (Excel) et autres mesures récentes du risque de marché By Francois-Éric Racicot; Raymond Théoret
  9. Risk Diversification by European Financial Conglomerates By Jan Frederik Slijkerman; Dirk Schoenmaker; Casper de Vries
  10. On the Appropriateness of Inappropriate VaR Models By Wolfgang Härdle; Zdenek Hlavka; Gerhard Stahl
  11. Data Scaling for Operational Risk Modelling By Na, H.S.; Couto Miranda, L.; Berg, J. van den; Leipoldt, M.
  12. Calibration Risk for Exotic Options By Kai Detlefsen; Wolfgang Härdle
  13. Common Failings: How Corporate Defaults are Correlated By Sanjiv Das; Darrell Duffie; Nikunj Kapadia; Leandro Saita
  14. Stock Markets Turmoil: Worldwide Effects of Middle East Conflicts By Viviana Fernandez
  15. Stock market returns and economic activity: evidence from wavelet analysis By Marco Gallegati

  1. By: Marco Moscadelli (Banca d'Italia)
    Abstract: The revised Basel Capital Accord requires banks to meet a capital requirement for operational risk as part of an overall risk-based capital framework. Three distinct options for calculating operational risk charges are proposed (Basic Approach, Standardised Approach, Advanced Measurement Approaches), reflecting increasing levels of risk sensitivity. Since 2001, the Risk Management Group of the Basel Committee has been performing specific surveys of banks’ operational loss data, with the main purpose of obtaining information on the industry’s operational risk experience, to be used for the refinement of the capital framework and for the calibration of the regulatory coefficients. The second loss data collection was launched in the summer of 2002: the 89 banks participating in the exercise provided the Group with more than 47,000 observations, grouped by eight standardised Business Lines and seven Event Types. A summary of the data collected, which focuses on the description of the range of individual gross loss amounts and of the distribution of the banks’ losses across the business lines/event types, was returned to the industry in March 2003. The objective of this paper is to move forward with respect to that document, by illustrating the methodologies and the outcomes of the inferential analysis carried out on the data collected through 2002. To this end, after pooling the individual banks’ losses according to a Business Line criterion, the operational riskiness of each Business Line data set is explored using empirical and statistical tools. The work aims, first of all, to compare the sensitivity of conventional actuarial distributions and models stemming from Extreme Value Theory in representing the highest percentiles of the data sets: the exercise shows that the extreme value model, in its Peaks Over Threshold representation, explains the behaviour of the operational risk data in the tail area well. Then, measures of the severity and frequency of the large losses are obtained and, by a proper combination of these estimates, a bottom-up operational risk capital figure is computed for each Business Line. Finally, for each Business Line and for the eight Business Lines as a whole, the contributions of the expected losses to the capital figures are evaluated, and the relationships between the capital charges and the corresponding average levels of Gross Income are determined and compared with the current coefficients envisaged in the simplified approaches of the regulatory framework.
    Keywords: operational risk, heavy tails, conventional inference, Extreme Value Theory, Peaks Over Threshold, median shortfall, Point Process of exceedances, capital charge, Business Line, Gross Income, regulatory coefficients
    JEL: C11 C13 C14 C19 C29 C81 G21 G28
    Date: 2004–07
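The Peaks Over Threshold approach highlighted in this abstract can be illustrated with a short sketch. This is not Moscadelli's implementation: the loss data below are synthetic (a lognormal stand-in for heavy-tailed operational losses), the threshold choice is arbitrary, and the quantile formula is the standard GPD-based POT estimator.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
# Hypothetical operational losses with a heavy right tail.
losses = rng.lognormal(mean=10.0, sigma=2.0, size=5000)

u = np.quantile(losses, 0.90)           # threshold: empirical 90th percentile
exceedances = losses[losses > u] - u    # peaks over the threshold

# Fit a Generalized Pareto Distribution to the exceedances (location fixed at 0).
xi, loc, beta = genpareto.fit(exceedances, floc=0.0)

# POT estimator of a high quantile, e.g. the 99.9th percentile used for capital:
p = 0.999
n, n_u = len(losses), len(exceedances)
var_p = u + (beta / xi) * ((n / n_u * (1.0 - p)) ** (-xi) - 1.0)
print(f"tail index xi={xi:.3f}, VaR_{p}={var_p:,.0f}")
```

The tail index `xi` measures tail heaviness; a clearly positive estimate signals the kind of heavy-tailed behaviour the paper documents in Business Line loss data.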
  2. By: Dimitris Kenourgios (University of Athens); Aristeidis Samitas (University of Aegean); Spyros Papathanasiou (Hellenic Open University)
    Abstract: This paper investigates the day of the week effect in the Athens Stock Exchange (ASE) General Index over a ten-year period divided into two subperiods: 1995-2000 and 2001-2004. Five major indices are also considered: Banking, Insurance, and Miscellaneous for the first subperiod, and FTSE-20 and FTSE-40 for the second subperiod. Using a conditional variance framework, which extends previous work on the Greek stock market, we test for the possible existence of day of the week variation in both the return and volatility equations. When using the GARCH(1,1) specification only for the return equation and the Modified-GARCH(1,1) specification for both the return and volatility equations, the findings indicate that the day of the week effect is present in the examined indices of the emerging ASE over the period 1995-2000. However, this stock market anomaly seems to lose its strength and significance in the ASE over the period 2001-2004, which might be due to Greece's entry into the Euro-zone and the market's upgrade to developed status.
    Keywords: Day of the week effect; mean stock returns; volatility; GARCH
    JEL: G10 G12
    Date: 2005–12–28
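The return-equation part of such a test can be sketched with weekday dummies. The paper's actual specification embeds these dummies in (Modified-)GARCH(1,1) return and volatility equations; the data and the planted Monday effect below are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical daily returns tagged by weekday (0=Mon .. 4=Fri).
n = 1250
weekday = np.arange(n) % 5
returns = rng.normal(0.0005, 0.01, n) - 0.001 * (weekday == 0)  # synthetic Monday effect

# Return equation with weekday dummies: r_t = sum_d gamma_d * D_{d,t} + e_t
D = np.column_stack([(weekday == d).astype(float) for d in range(5)])
gamma, *_ = np.linalg.lstsq(D, returns, rcond=None)
for d, name in enumerate(["Mon", "Tue", "Wed", "Thu", "Fri"]):
    print(f"{name}: mean return {gamma[d]:+.5f}")
```

A day-of-the-week effect shows up as statistically distinguishable `gamma_d` coefficients; the GARCH layer then asks the same question of the conditional variance.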
  3. By: Skogsvik, Kenth (Dept. of Business Administration, Stockholm School of Economics)
    Abstract: Probabilistic business failure prediction models are commonly estimated from non-random samples of companies. The proportion of failure companies in such samples is often much larger than the proportion of failure companies in most real-world decision contexts. This so-called “choice-based sample bias” implies that calculated failure probabilities will be (more or less) biased. The purpose of the paper is to analyse this bias and its consequences for standard applications of probabilistic failure prediction models (for example probit/logit analysis) and in particular to investigate whether the bias can be eliminated without having to re-estimate the underlying statistical model. It is shown that there is a straightforward linkage between sample-based probabilities of failure and the corresponding population-based probabilities. Knowing this linkage, sample-based probabilities can be adjusted for the “choice-based sample bias”, provided that sufficiently large samples of randomly selected failure companies and randomly selected survival companies have been used in the estimation of the underlying statistical model. Empirical observations in previous research are in line with the theoretical results of the paper.
    Keywords: Business Failure Prediction; Choice-Based Sample Bias; Financial Analysis; Probabilistic Prediction Model; Probit/Logit Analysis
    Date: 2005–12–01
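The linkage between sample-based and population-based probabilities that this abstract refers to is, in the logit case, a reweighting of the odds by the two failure rates (the standard prior-correction argument). A sketch under that assumption; the function name and numbers are hypothetical:

```python
def population_failure_prob(p_sample, sample_rate, population_rate):
    """Adjust a sample-based failure probability for choice-based sampling.

    p_sample: probability from a probit/logit model estimated on the
              choice-based sample; sample_rate: fraction of failure firms in
              that sample; population_rate: true failure rate in the population.
    """
    odds = p_sample / (1.0 - p_sample)
    # Reweight the odds by how much failures were oversampled.
    odds *= (population_rate * (1.0 - sample_rate)) / (sample_rate * (1.0 - population_rate))
    return odds / (1.0 + odds)

# A model estimated on a 50/50 failure sample outputs 0.5; if the true
# population failure rate is 2%, the adjusted probability is 0.02.
print(population_failure_prob(0.5, 0.5, 0.02))
```

Note the correction is the identity when the sample and population rates coincide, which matches the intuition that a random sample needs no adjustment.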
  4. By: Stefano Neri (Banca d'Italia)
    Abstract: The objective of this paper is to evaluate the effects of monetary policy shocks on stock market indices in the G-7 countries and Spain using the methodology of structural VARs. A model is estimated for each country and the effects of monetary policy shocks are evaluated by means of impulse responses. A contractionary shock has a negative and temporary effect on stock market indices. There is evidence of a significant cross-country heterogeneity in the persistence, magnitude and timing of the responses. A limited participation model with households trading in stocks is set up and the responses of stock prices to a monetary policy shock under different rules are evaluated. The model is able to account for the empirical response of stock prices to monetary policy shocks under different policy rules.
    Keywords: monetary policy; stock prices; structural VAR; limited participation model
    JEL: C32 E52 G12
    Date: 2004–07
  5. By: Viviana Fernandez
    Abstract: In this article, we formulate a time-scale decomposition of an international version of the CAPM that accounts for both market and exchange-rate risk. In addition, we derive an analytical formula for time-scale value at risk and marginal value at risk (VaR) of a portfolio. We apply our methodology to stock indices of seven emerging economies in Latin America and Asia for the sample period 1990-2004. Our main conclusions are the following. First, the estimation results hinge upon the choice of the world market portfolio. In particular, the stock markets of the sampled countries appear to be more integrated with other emerging countries than with developed ones. Second, value at risk depends on the investor's time horizon: in the short run, potential losses are greater than in the long run. Third, additional exposure to some specific stock indices will increase value at risk to a greater extent, depending on the investment horizon. Our results are in line with recent research in asset pricing that stresses the importance of heterogeneous investors.
    Keywords: wavelets, ICAPM, value at risk.
    Date: 2005–12–15
  6. By: Ying Chen; Wolfgang Härdle; Vladimir Spokoiny
    Abstract: Risk management technology applied to high dimensional portfolios needs simple and fast methods for the calculation of Value-at-Risk (VaR). The multivariate normal framework provides a simple off-the-shelf methodology but lacks the heavy-tailed distributional properties that are observed in the data. A principal component based method (tied closely to the elliptical structure of the distribution) is therefore expected to be unsatisfactory. Here we propose and analyze a technology that is based on Independent Component Analysis (ICA). We study the proposed ICVaR methodology in an extensive simulation study and apply it to a high dimensional portfolio situation. Our analysis yields very accurate VaRs.
    Keywords: independent component analysis, Value-at-Risk
    JEL: C14 C15 C32 C53 G20
    Date: 2005–09
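A rough sketch of the idea behind an ICA-based VaR: estimate independent components, resample each marginal on its own, and recombine through the estimated mixing. This is not the authors' ICVaR estimator, just an illustration on synthetic heavy-tailed data using scikit-learn's FastICA.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
# Hypothetical 5-asset return panel driven by heavy-tailed independent sources.
S = rng.standard_t(df=4, size=(2000, 5))
A = rng.normal(size=(5, 5))
returns = S @ A.T

ica = FastICA(n_components=5, whiten="unit-variance", random_state=0)
ic = ica.fit_transform(returns)        # approximately independent components

# Equal-weight portfolio, mapped into component space:
# returns ~ ic @ mixing_.T + mean_, so the portfolio return is linear in ic.
w = np.full(5, 0.2)
b = ica.mixing_.T @ w
mu = ica.mean_ @ w

# Because the components are (approximately) independent, each marginal can be
# resampled separately; dependence is carried entirely by the linear map.
sims = np.column_stack([rng.choice(ic[:, j], size=100_000) for j in range(5)])
pnl = sims @ b + mu
var_99 = -np.quantile(pnl, 0.01)       # 99% one-day VaR
print(f"99% VaR: {var_99:.4f}")
```

The independent bootstrap of each component is where the dimensionality reduction pays off: each heavy-tailed marginal can be modelled univariately instead of fitting a full multivariate distribution.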
  7. By: Gonzalo Cortazar (Pontificia Universidad Catolica de Chile); Alejandro Bernales (Inter-American Development Bank); Diether Beuermann (Inter-American Development Bank)
    Abstract: This paper deals with the issue of calculating daily Value-at-Risk (VaR) measures within an environment of thin trading. Our approach focuses on fixed income portfolios with a low frequency of transactions, in which the missing-data problem makes VaR measures difficult to calculate. We propose and implement a methodology to calculate VaR measures with an incomplete panel of prices. The methodology is composed of three phases. Phase I generates a complete panel of prices using a term-structure dynamic model of interest rates. Phase II calculates portfolio VaR measures with several alternative methods using the complete panel data generated in Phase I. Phase III shows how to back-test the VaR measures obtained in Phase II using the original incomplete panel of prices. We provide an empirical implementation of the methodology for the Chilean fixed income market. The proposed methodology appears to provide reliable VaR measures for thinly traded markets, addressing an important issue for financial risk management in emerging markets.
    Keywords: Risk, Value-at-Risk, Fixed Income, Incomplete Panels, Term-Structure Dynamic Models, Extreme Value, GARCH, Kalman Filter.
    JEL: C51 C52 G11 G15
    Date: 2005–12–28
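The back-testing step in Phase III can be illustrated by the simplest unconditional-coverage check: count VaR exceptions and compare the count with its binomial distribution. This exact-binomial variant is a stand-in for the likelihood-ratio tests usually used; the data and function name are hypothetical.

```python
import numpy as np
from scipy.stats import binom

def backtest_var(returns, var_forecasts, coverage=0.99):
    """Count VaR exceptions and return a two-sided binomial p-value for the
    hypothesis that the VaR has correct unconditional coverage."""
    exceptions = int(np.sum(returns < -np.asarray(var_forecasts)))
    n = len(returns)
    p = 1.0 - coverage
    # Two-sided p-value from the binomial distribution of exception counts.
    pval = 2.0 * min(binom.cdf(exceptions, n, p), binom.sf(exceptions - 1, n, p))
    return exceptions, min(pval, 1.0)

rng = np.random.default_rng(3)
r = rng.normal(0, 0.01, 500)
var99 = np.full(500, 2.326 * 0.01)     # a correct 99% VaR under normality
ex, pv = backtest_var(r, var99)
print(ex, pv)
```

With 500 observations and a correct 99% VaR, about five exceptions are expected; a p-value near zero would signal that the VaR model under- or over-states risk.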
  8. By: Francois-Éric Racicot (Département des sciences administratives, Université du Québec (Outaouais) et LRSP); Raymond Théoret (Département de stratégie des affaires, Université du Québec (Montréal))
    Abstract: Since the end of the nineties, the Basle Committee has required banks to compute their VaR periodically and to maintain sufficient capital to cover the potential losses projected by VaR. Unfortunately, there is no single measure of VaR, because volatility, a fundamental component of VaR, is latent. Banks must therefore use several VaR models to compute the range of their prospective losses. These computations can be complex because the distribution of high-frequency returns is not normal. This article analyses several VaR models and provides their implementations in Visual Basic. It also considers other recent measures of market risk and the use of copulas and the Fourier transform in the computation of VaR.
    Keywords: Financial engineering, Monte Carlo simulation, banks, copulas, Fourier transform.
    JEL: G12 G13 G33
    Date: 2006–01–12
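The article's Visual Basic programs are not reproduced here, but the basic Monte Carlo VaR computation such programs implement can be sketched in a few lines of Python. Geometric Brownian motion is assumed and the parameters are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

def monte_carlo_var(s0, mu, sigma, horizon_days, n_sims=100_000, coverage=0.99):
    """Monte Carlo VaR for a single position under geometric Brownian motion."""
    dt = horizon_days / 252.0
    z = rng.standard_normal(n_sims)
    # Simulate terminal prices and the profit-and-loss distribution.
    st = s0 * np.exp((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z)
    pnl = st - s0
    # VaR is (minus) the lower quantile of simulated P&L.
    return -np.quantile(pnl, 1.0 - coverage)

# 99% 10-day VaR of a 100-unit position with 20% annual volatility:
v = monte_carlo_var(100.0, 0.05, 0.20, 10)
print(f"{v:.2f}")
```

Replacing the normal shocks with draws from a heavier-tailed or copula-based model is exactly the kind of extension the article discusses for non-normal high-frequency returns.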
  9. By: Jan Frederik Slijkerman (Faculty of Economics, Erasmus Universiteit Rotterdam); Dirk Schoenmaker (Vrije Universiteit Amsterdam, and Ministry of Finance, The Hague); Casper de Vries (Faculty of Economics, Erasmus Universiteit Rotterdam)
    Abstract: We study the dependence between the downside risk of European banks and insurers. Since the downside risk of banks and insurers differs, an interesting question from a supervisory point of view is the risk reduction that derives from diversification within large banks and financial conglomerates. We discuss the limited value of the correlation concept based on the normal distribution, and propose an alternative measure which better captures the downside dependence given the fat-tail property of the risk distribution. This measure is estimated and indicates greater diversification benefits for conglomerates than for large banks.
    Keywords: Financial conglomerates; Banking; Insurance; Diversification; Extreme Value Theory
    JEL: G21 G22 G28 C49
  10. By: Wolfgang Härdle; Zdenek Hlavka; Gerhard Stahl
    Abstract: The Value-at-Risk calculation reduces the dimensionality of the risk factor space. The main reasons for such simplification are technical efficiency and the logical and statistical appropriateness of the model. In Chapter 2 we present three simple mappings: the mapping on the market index, the principal components model and the model with equally correlated risk factors. The comparison of these models in Chapter 3 is based on the literature on the verification of weather forecasts (Murphy and Winkler 1992, Murphy 1997). Some considerations on the quantitative analysis are presented in the fourth chapter. In the last chapter, we present an empirical analysis of the DAX data using XploRe.
    Keywords: Value-at-Risk, market index model, principal components, random effects model, probability forecast
    JEL: C51 C52 G20
    Date: 2006–01
  11. By: Na, H.S.; Couto Miranda, L.; Berg, J. van den; Leipoldt, M. (Erasmus Research Institute of Management (ERIM), RSM Erasmus University)
    Abstract: In 2004, the Basel Committee on Banking Supervision defined Operational Risk (OR) as the risk of loss resulting from inadequate or failed internal processes, people and systems or from external events. After publication of the new capital accord containing this definition, the statistical properties of OR losses have attracted considerable attention in the financial industry, since financial institutions have to quantify their exposure to OR events. One of the major topics related to loss data is the non-availability of a sufficient amount of data within financial institutions. This paper describes a way to circumvent the problem of data availability by proposing a scaling mechanism that enables an organization to pool data originating from several business units, each one having its own characteristics, such as size and exposure to operational risk. The same scaling mechanism can also be used to enable an institution to include external data originating from other institutions in its own exposure calculations. Using both internal data from different business units and publicly available data from other (anonymous) institutions, we show that there is a strong relationship between the losses incurred in a given business unit or institution and a specific size driver, in this case gross revenue. We study an appropriate power-law scaling as the mechanism that explains this relationship. Having properly scaled the data from different business units, we also show how the resulting aggregated data set can be used to calculate the Value-at-OR for each business unit, and present the principles of calculating the OR capital charge according to the minimum capital requirements of the Basel Committee.
    Keywords: Operational Risk; Power Law Scaling; Loss Distribution; Value at Operational Risk; Minimal Capital Requirements
    Date: 2006–01–12
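In its simplest form, the scaling mechanism described in the abstract amounts to a power law in the size driver. A sketch, where the exponent is an arbitrary placeholder standing in for one estimated from loss data:

```python
def scale_loss(loss, size_from, size_to, lam=0.23):
    """Scale an operational loss between units of different size using a
    power law: loss_to = loss_from * (size_to / size_from) ** lam.
    The exponent lam must be estimated from data; 0.23 here is arbitrary."""
    return loss * (size_to / size_from) ** lam

# A loss of 1.0 (in millions) observed at a unit with gross revenue 5 (in
# billions), scaled to a unit with gross revenue 20:
print(scale_loss(1.0, 5.0, 20.0))
```

An exponent between 0 and 1 encodes the empirical finding that losses grow with the size driver but less than proportionally; once all units' losses are rescaled to a common size, they can be pooled into one loss distribution.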
  12. By: Kai Detlefsen; Wolfgang Härdle
    Abstract: Option pricing models are calibrated to market data of plain vanillas by minimization of an error functional. From the economic viewpoint, there are several possibilities to measure the error between the market and the model. These different specifications of the error give rise to different sets of calibrated model parameters and the resulting prices of exotic options vary significantly. These price differences often exceed the usual profit margin of exotic options. We provide evidence for this calibration risk in a time series of DAX implied volatility surfaces from April 2003 to March 2004. We analyze in the Heston and in the Bates model factors influencing these price differences of exotic options and finally recommend an error functional. Moreover, we determine the model risk of these two stochastic volatility models for the time series and consider its relation to calibration risk.
    Keywords: calibration risk, calibration, model risk, Heston model, Bates model, barrier option, cliquet option
    JEL: C13 G12
    Date: 2006–01
  13. By: Sanjiv Das; Darrell Duffie; Nikunj Kapadia; Leandro Saita
    Abstract: We develop, and apply to data on U.S. corporations from 1979 to 2004, tests of the standard doubly-stochastic assumption under which firms' default times are correlated only as implied by the correlation of factors determining their default intensities. This assumption is violated in the presence of contagion or "frailty" (unobservable explanatory variables that are correlated across firms). Our tests do not depend on the time-series properties of default intensities. The data do not support the joint hypothesis of well-specified default intensities and the doubly-stochastic assumption. There is also some evidence of default clustering in excess of that implied by the doubly-stochastic model with the given intensities.
    JEL: G3
    Date: 2006–01
  14. By: Viviana Fernandez
    Abstract: In this article, we analyze the impact of recent political conflicts in the Middle East on stock markets worldwide. In particular, we study how political instability––mainly due to the war in Iraq––has affected the long-term volatility of stock markets. In doing so, we utilize two approaches to detecting structural breakpoints in volatility: Inclan and Tiao’s Iterative Cumulative Sum of Squares (ICSS) algorithm and wavelet-based variance analysis. After controlling for conditional heteroskedasticity and serial correlation in returns, we conclude that Middle East conflicts have had an impact primarily on the stock markets of countries in that region and emerging Asian countries (e.g., Turkey, Morocco, Egypt, Pakistan, and Indonesia). Further evidence, from an international version of the CAPM, shows that political instability in the Middle East has increased the sensitivity of stock markets to exchange rate risk and, to a lesser extent, to market risk (e.g., Pakistan and Spain).
    Date: 2005
  15. By: Marco Gallegati (Department of Economics, Università Politecnica delle Marche)
    Abstract: In this paper we investigate the relationship between stock market returns and economic activity by using signal decomposition techniques based on wavelet analysis. In particular, we apply the maximum overlap discrete wavelet transform (MODWT) to the DJIA stock price index and the industrial production index for the US over the period 1961:1-2005:3 and, using wavelet variance, wavelet correlation and cross-correlations, analyze the association as well as the lead/lag relationship between stock prices and industrial production at different time scales. Our results show that stock market returns tend to lead the level of economic activity, but only at the highest scales (lowest frequencies), corresponding to periods of 16 months and longer, and that the periods by which stock returns lead output increase as the wavelet time scale increases.
    Keywords: stock market, industrial production, wavelet analysis
    JEL: C32 E44
    Date: 2005–12–27
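The wavelet-variance and wavelet-correlation computations can be sketched with PyWavelets' stationary wavelet transform, an undecimated transform closely related to the MODWT used in the paper. The two series below are synthetic; the moving-average step merely plants a low-frequency association for the correlations to pick up.

```python
import numpy as np
import pywt

rng = np.random.default_rng(5)
# Hypothetical monthly return and output-growth series (length 512 = 2^9,
# a multiple of 2**level as the stationary transform requires).
n = 512
returns = rng.normal(0, 1, n)
output = np.convolve(returns, np.ones(8) / 8, mode="same") + rng.normal(0, 1, n)

level = 4
# Undecimated (stationary) wavelet transform as a stand-in for the MODWT.
coeffs_r = pywt.swt(returns, "db4", level=level)
coeffs_o = pywt.swt(output, "db4", level=level)

for j in range(level):
    d_r = coeffs_r[j][1]   # detail coefficients at one time scale
    d_o = coeffs_o[j][1]
    corr = np.corrcoef(d_r, d_o)[0, 1]
    print(f"scale {j}: wavelet variance {d_r.var():.3f}, correlation {corr:.3f}")
```

Lagging one set of detail coefficients before computing the correlation gives the scale-by-scale cross-correlations from which the paper reads off the lead/lag relationship.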

This nep-rmg issue is ©2006 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at <>. For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject line, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.