New Economics Papers
on Risk Management
Issue of 2010‒12‒18
twelve papers chosen by

  1. The Extreme-Value Dependence Between the Chinese and Other International Stock Markets By David E. Giles
  2. Backtesting Value-at-Risk Models: A Multivariate Approach By Cristina Danciulescu
  3. Backtesting Portfolio Value-at-Risk with Estimated Portfolio Weights By Pei Pei
  4. An efficient Peak-over-Threshold implementation for operational risk capital computation By Dominique Guegan; Bertrand Hassani; Cédric Naud
  5. Post-crisis bank liquidity risk management disclosure By Simplice A., Asongu
  7. Evaluation of static hedging strategies for hydropower producers in the Nordic market By Fleten, Stein-Erik; Bråthen, Espen; Nissen-Meyer, Sigurd-Erik
  8. Choice of Collateral Currency By Masaaki Fujii; Akihiko Takahashi
  9. Leverage and risk in US commercial banking in the light of the current financial crisis By Nikolaos Papanikolaou; Christian Wolff
  10. Liquidity Stress-Tester: Do Basel III and Unconventional Monetary Policy Work? By Jan Willem van den End
  11. Which Option Pricing Model is the Best? High Frequency Data for Nikkei225 Index Options By Ryszard Kokoszczyński; Paweł Sakowski; Robert Ślepaczuk
  12. Gaussian and non-Gaussian models for financial bubbles via econophysics By Fry, J. M.

  1. By: David E. Giles (Department of Economics, University of Victoria)
    Abstract: Extreme value theory (EVT) describes the behavior of extreme observations on a random variable. As an approach to modeling and measuring risks from rare events, EVT has taken on a prominent role in risk management in recent years. This paper contributes to the literature in two respects by analyzing an interesting international financial data set. First, we apply conditional EVT to examine the Value at Risk (VaR) and the Expected Shortfall (ES) for the Chinese and several representative international stock market indices: Hang Seng (Hong Kong), TSEC (Taiwan), Nikkei 225 (Japan), Kospi (Korea), BSE (India), STI (Singapore), S&P 500 (US), SPTSE (Canada), IPC (Mexico), CAC 40 (France), DAX 30 (Germany) and FTSE 100 (UK). We find that China has the highest VaR and ES for negative daily stock returns. Second, we examine the extreme dependence between these stock markets and find that the Chinese market is asymptotically independent of the other stock markets considered.
    Keywords: Extreme value analysis, peaks-over-threshold, value at risk, expected shortfall, asymptotic dependence, Chinese equity market
    JEL: C13 C16 G15
    Date: 2010–12–09
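As a rough illustration of the conditional-EVT quantities the abstract refers to, the sketch below fits a generalized Pareto distribution to the tail of simulated losses and derives VaR and ES via the standard peaks-over-threshold formulas. All data, the threshold choice and the confidence level are hypothetical; this is not the paper's code.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
# Hypothetical heavy-tailed daily losses (negated returns), for illustration only
losses = rng.standard_t(df=4, size=5000) * 0.01

u = np.quantile(losses, 0.95)                # threshold: 95th percentile of losses
exceed = losses[losses > u] - u              # peaks over the threshold
xi, _, beta = genpareto.fit(exceed, floc=0)  # GPD shape (xi) and scale (beta)

n, n_u = len(losses), len(exceed)
p = 0.99                                     # VaR confidence level
# Standard POT formulas:
#   VaR_p = u + (beta/xi) * (((n/n_u)*(1-p))**(-xi) - 1)
#   ES_p  = (VaR_p + beta - xi*u) / (1 - xi), valid for xi < 1
var_p = u + (beta / xi) * (((n / n_u) * (1 - p)) ** (-xi) - 1)
es_p = (var_p + beta - xi * u) / (1 - xi)
```

ES always exceeds VaR here because the mean excess over the VaR level is positive whenever the fitted shape parameter is below one.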
  2. By: Cristina Danciulescu (Indiana University - Bloomington)
    Abstract: The purpose of this paper is to develop a new and simple backtesting procedure that extends previous work into the multivariate framework. We propose to use the multivariate Portmanteau statistic of Ljung-Box type to jointly test for the absence of autocorrelations and cross-correlations in the vector of hit sequences for different positions, business lines or financial institutions. Simulation exercises illustrate that this shift to a multivariate hits dimension delivers a test that significantly increases the power of traditional backtesting methods in capturing systemic risk: the build-up of positive and significant cross-correlations in hits, which translates into the simultaneous realization of large losses at several business lines or banks. Our multivariate procedure also addresses an operational risk issue. The proposed technique provides a simple solution to the Value-at-Risk (VaR) aggregation problem: the institution's global VaR measure being either smaller or larger than the sum of the individual trading lines' VaRs, leaving the institution either under- or over-exposed to risk by maintaining excessively low or high capital levels. An application using Profit and Loss and VaR data collected from two major international banks illustrates how our proposed testing approach performs in a realistic environment. Results from experiments conducted on the banks' data suggest that the proposed multivariate testing procedure is a more powerful tool for detecting systemic risk if it is combined with multivariate risk modeling, i.e. if covariances are modeled in the VaR forecasts.
    Date: 2010–04
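A multivariate Portmanteau statistic of the Ljung-Box type (here in Hosking's form, which may differ in detail from the paper's exact variant) can be computed on a matrix of VaR hit indicators as follows; the hit construction and lag choice are illustrative assumptions.

```python
import numpy as np
from scipy.stats import chi2

def hit_sequence(pnl, var_forecast):
    """1 when the loss exceeds the VaR forecast (a 'hit'), else 0."""
    return (pnl < -var_forecast).astype(float)

def multivariate_portmanteau(hits, m=5):
    """Hosking-type multivariate Ljung-Box statistic on demeaned hit vectors.

    hits: (T, K) array of hit indicators for K positions/desks.
    Returns the statistic and its chi-square p-value (K^2 * m degrees of freedom).
    """
    T, K = hits.shape
    x = hits - hits.mean(axis=0)
    gamma0 = x.T @ x / T                    # lag-0 covariance of hit vectors
    g0inv = np.linalg.inv(gamma0)
    q = 0.0
    for j in range(1, m + 1):
        gj = x[j:].T @ x[:-j] / T           # lag-j auto/cross-covariance matrix
        q += np.trace(gj.T @ g0inv @ gj @ g0inv) / (T - j)
    q *= T * T
    return q, chi2.sf(q, df=K * K * m)
```

A small p-value signals auto- or cross-correlated hits, i.e. losses at several desks clustering in time, which is the systemic-risk pattern the paper targets.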
  3. By: Pei Pei (Indiana University Bloomington)
    Abstract: This paper theoretically and empirically analyzes the backtesting of portfolio VaR with estimation risk in an intrinsically multivariate framework. For the first time in the literature, it takes into account the estimation of portfolio weights in forecasting portfolio VaR and its impact on backtesting. It shows that the estimation risk from estimating the portfolio weights, as well as that from estimating the multivariate dynamic model of asset returns, makes the existing methods in a univariate framework inapplicable. It then proposes a general theory to quantify estimation risk applicable to the present problem and offers practitioners a simple but effective way to carry out valid inference that overcomes the effect of estimation risk in backtesting portfolio VaR. A simulation exercise illustrates our theoretical findings. As an application, a portfolio of three stocks is considered.
    Date: 2010–11
  4. By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris); Bertrand Hassani (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, BPCE - BPCE); Cédric Naud (BPCE - BPCE)
    Abstract: Operational risk quantification requires dealing with data sets which often present extreme values that have a tremendous impact on capital computations (VaR). In order to take these effects into account, we use extreme value distributions to model the tail of the loss distribution function. We focus on the Generalized Pareto Distribution (GPD) and use an extension of the Peak-over-Threshold method to estimate the threshold above which the GPD is fitted. This threshold is approximated using a bootstrap method, and the EM algorithm is used to estimate the parameters of the distribution fitted below the threshold. We show the impact of the estimation procedure on the computation of the capital requirement - through the VaR - considering other estimation methods used in extreme value theory. Our work also points to the importance of how regulators build the information set used to compute the capital requirement, and we exhibit some inconsistencies with the current rules.
    Keywords: Operational risk, generalized Pareto distribution, Pickands estimate, Hill estimate, Expectation Maximization algorithm, Monte Carlo simulations, VaR.
    Date: 2010–11
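The flavor of a bootstrap-based threshold choice for the GPD tail can be sketched as follows. This is only a stand-in for the paper's procedure: here the candidate quantile whose bootstrapped GPD shape estimates are most stable is selected, which illustrates the bias-variance tradeoff in threshold choice but is not the authors' exact algorithm.

```python
import numpy as np
from scipy.stats import genpareto

def bootstrap_threshold(losses, candidates, n_boot=100, seed=0):
    """Illustrative threshold selection for a GPD tail fit.

    candidates: quantile levels to try as thresholds.
    Picks the candidate whose bootstrapped shape estimates vary least.
    """
    rng = np.random.default_rng(seed)
    best_u, best_spread = None, np.inf
    for q in candidates:
        u = np.quantile(losses, q)
        exc = losses[losses > u] - u
        if len(exc) < 50:                 # too few exceedances to fit reliably
            continue
        shapes = []
        for _ in range(n_boot):
            sample = rng.choice(exc, size=len(exc), replace=True)
            xi, _, _ = genpareto.fit(sample, floc=0)
            shapes.append(xi)
        spread = np.var(shapes)           # instability of the shape estimate
        if spread < best_spread:
            best_u, best_spread = u, spread
    return best_u
```

In the paper's setup the body of the loss distribution below the chosen threshold would then be fitted separately (via the EM algorithm), with the GPD covering the tail.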
  5. By: Simplice A., Asongu
    Abstract: Purpose – This work investigates what post-crisis principles banks have adopted to manage liquidity risk. It is motivated by the fact that the financial liquidity market was greatly affected during the recent economic turmoil and financial meltdown, when disclosure of liquidity management was imperative for building the confidence of depositors and shareholders. Design/methodology/approach – The study investigates Basel II Pillar 3 disclosures on liquidity risk management in 20 of the top 33 world banks. Bank selection is based on available information, geographical balance and language permissibility. Information is searched from the World Wide Web, with a minimum of one hour allocated to 'content search', notwithstanding the time spent on 'content analysis'. When information on liquidity risk management is found, content scrutiny is guided by 16 disclosure principles grouped into four categories. Findings – Just 25% of the sampled banks explicitly provide publicly accessible liquidity risk management information. This is a stark indication that, even in the post-crisis era, many top-ranking banks still do not take the Basel disclosure norms seriously, especially the February 2008 pre-crisis warning of the Basel Committee on Banking Supervision. Implications/limitations – Stakeholders of banks should have easy access to information on liquidity risk management. Banks falling short of this might not inspire confidence in customers and shareholders in the event of financial panic and turmoil. As in the run-up to the previous financial crisis, if banks are not compelled to disclose explicitly and expressly what measures they adopt to guarantee stakeholder liquidity, the onset of any financial shake-up would only precipitate a meltdown. The main limitation of this study is that the World Wide Web is used as the only source of information available to bank stakeholders. Originality/value – The contribution of this paper to the literature can be viewed from the role it plays in investigating what post-crisis measures banks have taken to inform stakeholders on how they manage liquidity risk. Paper type: Qualitative finance research paper.
    Keywords: Post crisis; Liquidity risk management; Bank
    JEL: G18 E50 G00 D80
    Date: 2010–12–07
  6. By: Haim Shalit (Department of Economics, Ben-Gurion University of the Negev, Israel)
    Abstract: This paper compiles the risk measures associated with the Lorenz curve. The Lorenz curve is the main tool in economics for measuring income distribution and inequality. For the past decades some of the Lorenz curve spin-offs have been used in risk analysis and finance. In particular, the Lorenz curve addresses the concepts of second degree stochastic dominance, Gini’s mean difference, Conditional Value-at-Risk, and the extended Gini in portfolio theory and in investment practice. Because the Lorenz curve can be estimated from asset returns, the risk measures are easy to implement and use.
    JEL: G32 D81 G11
    Date: 2010
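The point that these risk measures "can be estimated from asset returns" is concrete: both the extended Gini and Conditional Value-at-Risk reduce to simple sample statistics. The sketch below is an illustrative implementation using standard empirical formulas, not code from the paper.

```python
import numpy as np

def extended_gini(returns, v=2):
    """Extended Gini of order v: -v * cov(r, (1 - F(r))**(v - 1)),
    where F is the empirical CDF. v = 2 gives half the Gini mean
    difference; larger v weights the worst outcomes more heavily."""
    r = np.asarray(returns, dtype=float)
    ranks = r.argsort().argsort() + 1        # ranks 1..n
    F = ranks / (len(r) + 1)                 # empirical CDF (plotting positions)
    w = (1 - F) ** (v - 1)                   # decreasing in r, so cov is negative
    return -v * np.cov(r, w, bias=True)[0, 1]

def cvar(returns, alpha=0.05):
    """Conditional Value-at-Risk: average loss in the worst alpha tail."""
    r = np.sort(np.asarray(returns, dtype=float))
    k = max(1, int(np.ceil(alpha * len(r))))
    return -r[:k].mean()
```

Because the weighting function is decreasing in the return, the covariance is negative and the extended Gini comes out positive, as a dispersion/risk measure should.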
  7. By: Fleten, Stein-Erik; Bråthen, Espen; Nissen-Meyer, Sigurd-Erik
    Abstract: In this paper we develop an optimization model to derive static hedge positions for hydropower producers with different risk characteristics. Previous research has primarily considered dynamic hedging; however, static hedging is the common choice among hydropower producers because of its simplicity. Our contribution is to evaluate such hedging out of sample. The hedging strategies we analyze include a natural hedge, which means no hedging, and output from an optimization model that we develop ourselves. The results show that, although optimized positions vary over time, hedging with use of forward contracts significantly reduces the risk in terms of value-at-risk, conditional value-at-risk and standard deviation of the revenue. Furthermore, this improvement results in only a minor reduction in mean revenue.
    Keywords: Risk management; Static hedging; Hydropower producers; Nordic electricity market; Risk premium
    JEL: D81 G32 Q4
    Date: 2010–12
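A static hedge of the kind evaluated above can be illustrated with a toy scenario model: a producer sells its volume at the spot price and locks in part of it with forward contracts, choosing the single (static) hedge volume that minimizes revenue variance. All numbers and the price-volume link below are hypothetical; this is not the paper's optimization model, which targets VaR/CVaR as well.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical scenarios: correlated spot price (EUR/MWh) and production (GWh)
n = 10_000
spot = 40 + 10 * rng.standard_normal(n)
volume = 5 + 0.05 * (spot - 40) + rng.standard_normal(n)  # mild price-volume link
forward_price = 40.0                                      # price locked in today

def revenue(h):
    """Spot sales plus the payoff of selling h GWh forward."""
    return volume * spot + h * (forward_price - spot)

# Grid search for the static hedge volume minimizing revenue variance
grid = np.linspace(0, 10, 201)
variances = [revenue(h).var() for h in grid]
h_star = grid[int(np.argmin(variances))]
```

Comparing `revenue(h_star)` with the unhedged `revenue(0)` across scenarios reproduces the paper's qualitative finding in miniature: dispersion drops sharply while the mean changes little (here not at all, since the forward is priced at the scenario mean).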
  8. By: Masaaki Fujii (Graduate School of Economics, University of Tokyo); Akihiko Takahashi (Faculty of Economics, University of Tokyo)
    Abstract: Collateral has long been used in the cash market, and its use as an important credit risk mitigation tool in the derivatives market has also increased significantly over the past decade. Despite its long history in the financial market, its importance for funding has been recognized relatively recently, following the explosion of basis spreads in the crisis. This paper demonstrates the impact of collateralization on derivatives pricing through its funding effects, based on actual data from the swap markets. It also shows the importance of the "choice" of collateral currency. In particular, when a contract allows multiple currencies as eligible collateral as well as their free replacement, the paper finds that the embedded "cheapest-to-deliver" option can be quite valuable and significantly change the fair value of a trade. The implications of these findings for risk management are also discussed.
    Date: 2010–12
  9. By: Nikolaos Papanikolaou (Luxembourg School of Finance, University of Luxembourg); Christian Wolff (Luxembourg School of Finance, University of Luxembourg)
    Abstract: In this paper we study the relationship between leverage and risk in the US commercial banking market. We employ a panel data set that consists of the biggest US commercial banks and extends from 2002 to 2010, thus covering both the years before the outbreak of the current financial crisis and those that followed. We make clear distinctions among different leverage types, such as on- and off-balance-sheet leverage as well as short- and long-term leverage, distinctions which have not previously been made in the relevant literature. Our findings provide evidence that excessive leverage, both explicit and hidden off the balance sheet, rendered large banks vulnerable to financial shocks, thus contributing to the fragility of the whole banking industry. In a similar vein, a direct link between short- and long-term leverage and risk is reported before the crisis, showing that leverage has been one of the key factors responsible for the serious liquidity shortages that were revealed after 2007 when the crisis erupted. We also demonstrate that banks which concentrate on traditional banking activities typically carry less risk exposure than those involved with modern financial instruments. Overall, our results provide a better understanding of the role of leverage in destabilizing the whole system and at the same time contribute to the current discussion on the resilience of the banking sector through the strengthening of the existing regulatory framework.
    Keywords: financial crisis; risk; leverage; commercial banking
    JEL: C23 D02 G21 G28
    Date: 2010
  10. By: Jan Willem van den End
    Abstract: This paper presents a macro stress-testing model for liquidity risks of banks, incorporating the proposed Basel III liquidity regulation, unconventional monetary policy and credit supply effects. First and second round (feedback) effects of shocks are simulated by a Monte Carlo approach. Banks react according to the Basel III standards, endogenising liquidity risk. The model shows how banks’ reactions interact with extended refinancing operations and asset purchases by the central bank. The results indicate that Basel III limits liquidity tail risk, in particular if it leads to a higher quality of liquid asset holdings. The flip side of increased bond holdings is that monetary policy conducted through asset purchases gets more influence on banks relative to refinancing operations.
    Keywords: banking; financial stability; stress-tests; liquidity risk
    JEL: C15 E44 G21 G32
    Date: 2010–12
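The first- and second-round mechanism described above can be caricatured in a few lines of Monte Carlo: a funding shock hits each bank's liquid buffer, banks below a regulatory floor fire-sell assets, and aggregate sales feed back into asset values via a haircut. Every number and functional form here is a stylized assumption for illustration, far simpler than the paper's stress-testing model.

```python
import numpy as np

rng = np.random.default_rng(4)
n_sims, n_banks = 10_000, 5

# Stylized balance sheets (illustrative numbers, not the paper's calibration)
liquid = np.full(n_banks, 100.0)      # liquid asset buffers
lcr_floor = 80.0                      # Basel III-style minimum buffer

shortfalls = np.empty(n_sims)
for s in range(n_sims):
    buf = liquid + 15 * rng.standard_normal(n_banks)   # first-round funding shock
    # Banks below the floor sell assets to rebuild their buffer
    sales = np.clip(lcr_floor - buf, 0, None)
    # Second-round (feedback) effect: the haircut worsens with aggregate sales
    haircut = 0.05 + 0.001 * sales.sum()
    buf = buf + sales * (1 - haircut)
    shortfalls[s] = np.clip(lcr_floor - buf, 0, None).sum()

tail_risk = np.quantile(shortfalls, 0.99)   # 99th-percentile system shortfall
```

In this caricature, raising the quality of the buffer (a smaller haircut) shrinks the tail of `shortfalls`, which is the direction of the paper's Basel III result.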
  11. By: Ryszard Kokoszczyński (Faculty of Economic Sciences, University of Warsaw, Economic Institute, National Bank of Poland); Paweł Sakowski (Faculty of Economic Sciences, University of Warsaw); Robert Ślepaczuk (Faculty of Economic Sciences, University of Warsaw)
    Abstract: Option pricing models are the main subject of many research papers prepared both in academia and in the financial industry. Using high-frequency data for Nikkei 225 index options, we check the properties of option pricing models with different assumptions concerning the volatility process (historical, realized, implied, stochastic or based on a GARCH model). In order to relax the continuous dividend payout assumption, we use the Black model for pricing options on futures instead of the Black-Scholes-Merton model. The results are presented separately for 5 classes of moneyness ratio and 5 classes of time to maturity in order to show some patterns in option pricing and to check the robustness of our results. The Black model with implied volatility (BIV) comes out as the best one. We obtain the highest average pricing errors for the Black model with realized volatility (BRV). As a result, we do not see any additional gain from using more complex and time-consuming models (SV and GARCH models). Additionally, we describe the liquidity of the Nikkei 225 options market and compare our results with a detailed study of the emerging market of WIG20 index options (Kokoszczyński et al. 2010b).
    Keywords: option pricing models, financial market volatility, high-frequency financial data, midquotes data, transactional data, realized volatility, implied volatility, stochastic volatility, microstructure bias, emerging markets
    JEL: G14 G15 C61 C22
    Date: 2010
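The Black (1976) formula for options on futures, which underlies both the BIV and BRV variants above (they differ only in which volatility estimate is plugged in), is compact enough to state directly. Input values in the usage note are illustrative.

```python
import math
from scipy.stats import norm

def black76_call(F, K, T, r, sigma):
    """Black (1976) price of a European call on a futures contract.

    F: futures price, K: strike, T: time to expiry in years,
    r: continuously compounded risk-free rate, sigma: volatility.
    """
    d1 = (math.log(F / K) + 0.5 * sigma**2 * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return math.exp(-r * T) * (F * norm.cdf(d1) - K * norm.cdf(d2))
```

For example, `black76_call(100, 100, 0.25, 0.01, 0.2)` prices an at-the-money three-month call; feeding the same function an implied versus a realized volatility estimate mimics the BIV/BRV comparison.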
  12. By: Fry, J. M.
    Abstract: We develop a rational expectations model of financial bubbles and study how the risk-return interplay is incorporated into prices. We retain the interpretation of the leading Johansen-Ledoit-Sornette model: namely, that the price must rise prior to a crash in order to compensate a representative investor for the level of risk. This is accompanied, in our stochastic model, by an illusion of certainty as described by a decreasing volatility function. As the volatility function decreases, crashes can be seen to represent a phase transition from stochastic to deterministic behaviour in prices. Our approach is first illustrated by a benchmark Gaussian model, subsequently extended to a heavy-tailed model based on the Normal Inverse Gaussian distribution. Our model is illustrated by an empirical application to the London Stock Exchange. Results suggest that the aftermath of the Bank of England's process of quantitative easing has coincided with a bubble in the FTSE 100.
    Keywords: financial crashes; super-exponential growth; illusion of certainty; bubbles; heavy tails
    JEL: C10 C53 C02
    Date: 2010–12–08

General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.