nep-rmg New Economics Papers
on Risk Management
Issue of 2012‒06‒25
twenty papers chosen by
Stan Miles
Thompson Rivers University

  1. Risk management – a new priority system customs and its consequences By Iacob, Constanta; Zaharia, Stefan
  2. Forecasting Value-at-Risk with Time-Varying Variance, Skewness and Kurtosis in an Exponential Weighted Moving Average Framework By Alexandros Gabrielsen; Paolo Zagaglia; Axel Kirchner; Zhuoshi Liu
  3. A Comparison of Traditional and Copula based VaR with Agricultural portfolio By Mandal, Maitreyi; Lagerkvist, Carl Johan
  4. Optimal starting times, stopping times and risk measures for algorithmic trading By Mauricio Labadie; Charles-Albert Lehalle
  5. A wavelet-based assessment of market risk: The emerging markets case By António Rua; Luís Catela Nunes
  6. Lifetime Dependence Modelling using the Truncated Multivariate Gamma Distribution By Daniel Alai; Zinoviy Landsman; Michael Sherris
  7. Measuring systemic funding liquidity risk in the Russian banking system By Andrievskaya, Irina
  8. An algorithm for the orthogonal decomposition of financial return data By Vic Norton
  9. Payment changes and default risk: the impact of refinancing on expected credit losses By Joseph Tracy; Joshua Wright
  10. Binomial Tree Model for Convertible Bond Pricing within Equity to Credit Risk Framework By K. Milanov; O. Kounchev
  11. An Analysis of Reinsurance Optimisation in Life By Elena Veprauskaite; Michael Sherris
  12. Collateral requirements for mandatory central clearing of over-the-counter derivatives By Daniel Heller; Nicholas Vause
  13. Determinants of corporate default: a BMA approach By Carlos González-Aguado; Enrique Moral-Benito
  14. The procyclicality of Basel III leverage: Elasticity-based indicators and the Kalman filter By Christian Calmès; Raymond Théoret
  15. Bank systemic risk and the business cycle: Canadian and U.S. evidence By Christian Calmès; Raymond Théoret
  16. Do RIN Mandates and Blender's Tax Credit Affect Blenders' Hedging Strategies? By Ahmedov, Zafarbek; Woodard, Joshua
  17. Robust volatility forecasts in the presence of structural breaks By Elena Andreou; Eric Ghysels; Constantinos Kourouyiannis
  18. Aggregation of Market Risks using Pair-Copulas By Dominique Guegan; Fatima Jouad
  19. Managing Catastrophic Risk By Howard Kunreuther; Geoffrey Heal
  20. Price Insurance, Moral Hazard and Agri-environmental Policy By Fraser, Rob W.

  1. By: Iacob, Constanta; Zaharia, Stefan
    Abstract: Systematic work on the analysis and management of risk began around 1970 and developed over the following decade across several of the human sciences: administration, sociology, economics and political science. The subsequent neglect of risk analysis for more than twenty years, however, marked the competitiveness of enterprises. The risks are largely still the same, some amplified, while new ones have appeared, sometimes creating an "avalanche effect" whose consequences are difficult to estimate and to stop. Constantly facing new challenges, customs administrations must remain responsive when managing emerging risks. Risk management, in combination with the other essential constituents of customs work, indicates the direction this field will take in the twenty-first century. This paper analyzes the risks faced by customs, the possibilities for analyzing and measuring those risks, and the directions their management may take.
    Keywords: risks; evaluation; indicators; analysis; monitoring; review
    JEL: G32 G28 M42
    Date: 2012–06–09
  2. By: Alexandros Gabrielsen (Sumitomo Mitsui Banking Corporation, UK); Paolo Zagaglia (Department of Economics, University of Bologna, Italy); Axel Kirchner (Deutsche Bank, UK); Zhuoshi Liu (Bank of England, UK)
    Abstract: This paper provides insight into the time-varying dynamics of the shape of the distribution of financial return series. It proposes an exponentially weighted moving average model that jointly estimates volatility, skewness and kurtosis over time, using a modified form of the Gram-Charlier density in which skewness and kurtosis appear directly in the functional form. In this setting, VaR can be described as a function of the time-varying higher moments by applying the Cornish-Fisher expansion of the first four moments. The predictive performance of the proposed model for 1-day and 10-day VaR forecasts is evaluated against historical simulation, filtered historical simulation and a GARCH model. The adequacy of the VaR forecasts is assessed with the unconditional, independence and conditional likelihood ratio tests, as well as the Basel II regulatory tests. The results have significant implications for risk management, trading and hedging activities, as well as for the pricing of equity derivatives.
    Keywords: exponential weighted moving average, time-varying higher moments, Cornish-Fisher expansion, Gram-Charlier density, risk management, Value-at-Risk
    JEL: C51 C52 C53 G15
    Date: 2012–06
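    The Cornish-Fisher step described above can be sketched in a few lines. The snippet below is a minimal, self-contained illustration of mapping the first four moments to a VaR quantile — not the authors' full EWMA estimation, and the moment values used are hypothetical.

```python
from statistics import NormalDist

def cornish_fisher_var(mu, sigma, skew, ex_kurt, alpha=0.01):
    """One-period VaR from the first four moments via the Cornish-Fisher
    expansion: the Gaussian alpha-quantile z is corrected for skewness
    and excess kurtosis before being scaled by the volatility."""
    z = NormalDist().inv_cdf(alpha)          # e.g. -2.326 at the 1% level
    z_cf = (z
            + (z ** 2 - 1) * skew / 6
            + (z ** 3 - 3 * z) * ex_kurt / 24
            - (2 * z ** 3 - 5 * z) * skew ** 2 / 36)
    return -(mu + sigma * z_cf)              # loss quantile, as a positive number

# With zero skewness and excess kurtosis this reduces to Gaussian VaR;
# negative skewness and fat tails raise it, as expected:
base = cornish_fisher_var(0.0, 0.02, 0.0, 0.0)
skewed = cornish_fisher_var(0.0, 0.02, -0.5, 1.0)
```

    Plugging in the EWMA-filtered conditional moments at each date would give the time-varying VaR path the paper studies.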
  3. By: Mandal, Maitreyi; Lagerkvist, Carl Johan
    Abstract: Mean-variance portfolio construction is still regarded as the main building block of modern portfolio theory. However, many authors have argued that the mean-variance criterion conceived by Markowitz (1952) is not optimal for asset allocation: the investor's expected utility is better proxied by a function that uses higher moments, and returns are distributed in a non-Normal way, being asymmetric and/or leptokurtic, so the mean-variance criterion cannot correctly proxy expected utility. Copulas are a very useful tool for dealing with non-standard multivariate distributions, and Value at Risk (VaR) and Conditional Value at Risk (CVaR) have emerged as leading measures of risk. Though almost unutilized in agriculture so far, interest in these risk measures will grow as the sector becomes more industrialized. In this paper, we apply Gaussian and Student's t copula models to construct the joint return distribution of two (Farm Return and S&P 500 Index Return) and three (Farm Return, S&P 500 Index Return and US Treasury Bond Index) asset classes, and finally use VaR measures to create the optimal portfolio. The resulting portfolio offers better hedges against losses.
    Keywords: Portfolio Choice, Downside Risk Protection, Value at risk, Copula, Agricultural Finance, Risk and Uncertainty, C52, G11, Q14,
    Date: 2012
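    The mechanics of copula-based VaR can be sketched as a Gaussian-copula Monte Carlo: draw correlated normals, map them to uniforms, then push each uniform through its marginal inverse CDF. This is an illustration of the technique with made-up parameters, not the authors' estimated model; a Student's t copula would replace the dependence draw.

```python
import random
from statistics import NormalDist

def gaussian_copula_var(margins, rho, weights, alpha=0.05, n=50_000, seed=7):
    """Monte Carlo portfolio VaR under a bivariate Gaussian copula.
    `margins` is a pair of inverse-CDF callables for the asset returns."""
    rng = random.Random(seed)
    nd = NormalDist()
    losses = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + (1.0 - rho ** 2) ** 0.5 * rng.gauss(0.0, 1.0)
        u1, u2 = nd.cdf(z1), nd.cdf(z2)            # copula: uniforms with dependence
        r = weights[0] * margins[0](u1) + weights[1] * margins[1](u2)
        losses.append(-r)
    losses.sort()
    return losses[int((1.0 - alpha) * n)]          # empirical loss quantile

# Illustration with Gaussian marginals; any fitted marginal (e.g. an
# empirical farm-return distribution) can be plugged in instead:
m1 = NormalDist(0.0005, 0.010).inv_cdf
m2 = NormalDist(0.0003, 0.015).inv_cdf
var95 = gaussian_copula_var((m1, m2), rho=0.4, weights=(0.6, 0.4))
```

    Separating the dependence model (the copula) from the marginals is exactly what lets the approach handle asymmetric or fat-tailed return distributions.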
  4. By: Mauricio Labadie (EXQIM - EXclusive Quantitative Investment Management - EXQIM); Charles-Albert Lehalle (Head of Quantitative Research - CALYON group)
    Abstract: We derive explicit recursive formulas for Target Close (TC) and Implementation Shortfall (IS) in the Almgren-Chriss framework. We explain how to compute the optimal starting and stopping times for IS and TC, respectively, given a minimum trading size. We also show how to add a minimum participation rate constraint (Percentage of Volume, PVol) for both TC and IS. We also study an alternative set of risk measures for the optimisation of algorithmic trading curves. We assume a self-similar process (e.g. Levy process, fractional Brownian motion or fractal process) and define a new risk measure, the p-variation, which reduces to the variance if the process is a Brownian motion. We deduce the explicit formula for the TC and IS algorithms under a self-similar process. We show that there is an equivalence between self-similar models and a family of risk measures called p-variations: assuming a self-similar process and calibrating empirically the parameter p for the p-variation yields the same result as assuming a Brownian motion and using the p-variation as risk measure instead of the variance. We also show that p can be seen as a measure of the aggressiveness: p increases if and only if the TC algorithm starts later and executes faster. From the explicit expression of the TC algorithm one can compute the sensitivities of the curve with respect to the parameters up to any order. As an example, we compute the first order sensitivity with respect to both a local and a global surge of volatility. Finally, we show how the parameter p of the p-variation can be implied from the optimal starting time of TC, and that under this framework p can be viewed as a measure of the joint impact of market impact (i.e. liquidity) and volatility.
    Keywords: Quantitative Finance; High-Frequency Trading; Algorithmic Trading; Optimal Execution; Market Impact; Risk Measures; Self-similar Processes; Fractal Processes
    Date: 2012–05–18
  5. By: António Rua; Luís Catela Nunes
    Abstract: The measurement of market risk poses major challenges to researchers and different economic agents. On one hand, it is by now widely recognized that risk varies over time. On the other hand, the risk profile of an investor, in terms of investment horizon, makes it crucial to also assess risk at the frequency level. We propose a novel approach to measuring market risk based on the continuous wavelet transform. Risk is allowed to vary both through time and at the frequency level within a unified framework. In particular, we derive the wavelet counterparts of well-known measures of risk. One is thereby able to assess total risk, systematic risk and the importance of systematic risk to total risk in the time-frequency space. To illustrate the method we consider the emerging markets case over the last twenty years, finding noteworthy heterogeneity across frequencies and over time, which highlights the usefulness of the wavelet approach.
    JEL: C40 F30 G15
    Date: 2012
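    A discrete cousin of the idea of assessing risk by horizon can be sketched with a Haar wavelet cascade. This is a stand-in for the continuous wavelet transform used in the paper, showing how a return series' variance can be split across dyadic scales; the input series here is synthetic.

```python
import random

def haar_scale_variances(returns, levels=4):
    """Variance of Haar detail coefficients at each dyadic scale: at
    every level, details capture fluctuations at that horizon and the
    smoothed series is passed on to the next (coarser) level."""
    out = []
    a = list(returns)
    for _ in range(levels):
        n = (len(a) // 2) * 2
        detail = [(a[i] - a[i + 1]) / 2 ** 0.5 for i in range(0, n, 2)]
        a = [(a[i] + a[i + 1]) / 2 ** 0.5 for i in range(0, n, 2)]
        out.append(sum(d * d for d in detail) / len(detail))
    return out  # one variance per scale: 2 periods, 4 periods, 8, ...

# For white noise the wavelet variance is flat across scales; persistent
# volatility or cycles would concentrate variance at particular horizons:
rng = random.Random(3)
flat = haar_scale_variances([rng.gauss(0.0, 1.0) for _ in range(4096)])
```

    Repeating the decomposition on rolling windows gives a crude time-frequency picture of risk, the object the paper builds rigorously with continuous wavelets.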
  6. By: Daniel Alai (ARC Centre of Excellence in Population Ageing Research, Australian School of Business, University of New South Wales); Zinoviy Landsman (Departmant of Statistics, University of Haifa); Michael Sherris (School of Risk and Actuarial Studies and ARC Centre of Excellence in Population Ageing Research, Australian School of Business, University of New South Wales)
    Abstract: Systematic improvements in mortality result in dependence between the survival distributions of insured lives. This is not allowed for in the standard life tables and actuarial models used for annuity pricing and reserving. Systematic longevity risk also undermines the law of large numbers, a law relied on in the risk management of life insurance and annuity portfolios. This paper applies a multivariate gamma distribution to incorporate dependence. Lifetimes are modelled using a truncated multivariate gamma distribution that induces dependence through a shared gamma-distributed component. Model parameter estimation is developed based on the method of moments and generalized to allow for truncated observations. The impact of dependence on the valuation of a portfolio, or cohort, of annuitants with similar risk characteristics is demonstrated by applying the model to annuity valuation. The dependence is shown to have a significant impact on the risk of the annuity portfolio compared with traditional actuarial methods that implicitly assume independent lifetimes.
    Keywords: Systematic longevity risk, dependence, multivariate gamma, lifetime distribution, annuity valuation
    JEL: G22 G32 C13 C02
    Date: 2012–05
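    The dependence mechanism can be illustrated with the additive shared-component construction of a multivariate gamma; the truncation the paper adds is omitted here, and all parameter values are hypothetical.

```python
import random
from statistics import mean

def dependent_lifetimes(n, shape_common, shape_idio, scale, rng):
    """Cohort lifetimes T_i = Y0 + Y_i, where Y0 ~ Gamma(shape_common,
    scale) is shared by the whole cohort (systematic longevity) and
    Y_i ~ Gamma(shape_idio, scale) is idiosyncratic. All dependence
    between lifetimes comes from the common component Y0."""
    y0 = rng.gammavariate(shape_common, scale)
    return [y0 + rng.gammavariate(shape_idio, scale) for _ in range(n)]

# Pairwise correlation should approach shape_common / (shape_common +
# shape_idio) -- here 2 / (2 + 6) = 0.25:
rng = random.Random(11)
pairs = [dependent_lifetimes(2, 2.0, 6.0, 10.0, rng) for _ in range(20000)]
xs, ys = [p[0] for p in pairs], [p[1] for p in pairs]
mx, my = mean(xs), mean(ys)
cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
corr = cov / (mean((x - mx) ** 2 for x in xs)
              * mean((y - my) ** 2 for y in ys)) ** 0.5
```

    Because Y0 never diversifies away, the variance of an annuity portfolio's total payout stays material even as the cohort grows — the failure of the law of large numbers the abstract refers to.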
  7. By: Andrievskaya, Irina (BOFIT)
    Abstract: The 2007-2009 global financial crisis demonstrated the need for effective systemic risk measurement and regulation. This paper proposes a straightforward approach for estimating the systemic funding liquidity risk in a banking system and identifying systemically critical banks. Focusing on the surplus of highly liquid assets above due payments, we express systemic funding liquidity risk as the distance of the aggregate liquidity surplus from its current level to its critical value. Calculations are performed using a simulated distribution of the aggregate liquidity surplus obtained with Independent Component Analysis. The systemic importance of banks is then assessed based on their contribution to the variation of the liquidity surplus in the system. We apply this methodology to the case of Russia, an emerging economy, to identify the current level of systemic funding liquidity risk and to rank banks by their systemic relevance.
    Keywords: systemic risk; liquidity surplus; banking; Russia
    JEL: G21 G28 P29
    Date: 2012–06–18
  8. By: Vic Norton
    Abstract: We present an algorithm for the decomposition of periodic financial return data into orthogonal factors of expected return and "systemic", "productive", and "nonproductive" risk. Generally, when the number of funds does not exceed the number of periods, the expected return of a portfolio is an affine function of its productive risk.
    Date: 2012–06
  9. By: Joseph Tracy; Joshua Wright
    Abstract: This paper analyzes the relationship between changes in borrowers' monthly mortgage payments and future credit performance. This relationship is important for the design of an internal refinance program such as the Home Affordable Refinance Program (HARP). We use a competing risk model to estimate the sensitivity of default risk to downward adjustments of borrowers' monthly mortgage payments for a large sample of prime adjustable-rate mortgages. Applying a 26 percent average monthly payment reduction that we estimate would result from refinancing under HARP, we find that the cumulative five-year default rate on prime conforming adjustable-rate mortgages with loan-to-value ratios above 80 percent declines by 3.8 percentage points. If we assume an average loss given default of 35.2 percent, this lower default risk implies reduced credit losses of 134 basis points per dollar of balance for mortgages that refinance under HARP.
    Keywords: Adjustable rate mortgages ; Mortgages ; Default (Finance) ; Risk ; Credit
    Date: 2012
  10. By: K. Milanov; O. Kounchev
    Abstract: In the present paper we fill an essential gap in convertible bond pricing by deriving a binomial-tree model for valuation subject to credit risk. This model belongs to the framework known as Equity to Credit Risk. We show that in continuous time it converges to the model developed by Ayache, Forsyth and Vetzal [2003]. To this end, both forms of credit risk modeling are considered: the so-called reduced form (a constant intensity of default for the underlying) and the so-called synthesis form (a variable intensity of default for the underlying). We highlight and quantify certain issues that arise, such as transition-probability analysis and threshold values of model inputs (tree step, underlying stock price, etc.). This study may be considered an alternative route to the price-dynamics model of Ayache et al. [2003] for convertible bonds in a credit risk environment.
    Date: 2012–06
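    A stripped-down version of such a tree, covering only the constant-intensity (reduced) case and omitting call/put features, coupons and variable intensity, might look as follows; all inputs are illustrative.

```python
import math

def convertible_tree(s0, face, conv_ratio, r, sigma, lam, recovery, t, steps):
    """Binomial tree for a convertible bond with constant default
    intensity lam: conditional on survival the stock grows at r + lam
    (so its total drift is r), and on default the stock drops to zero
    while the holder recovers `recovery * face`."""
    dt = t / steps
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (math.exp((r + lam) * dt) - d) / (u - d)   # survival-adjusted up-prob.
    q = lam * dt                                   # per-step default prob.
    disc = math.exp(-r * dt)
    # terminal value: redeem at face or convert, whichever is greater
    v = [max(face, conv_ratio * s0 * u ** j * d ** (steps - j))
         for j in range(steps + 1)]
    for i in range(steps - 1, -1, -1):
        v = [max(conv_ratio * s0 * u ** j * d ** (i - j),          # convert now
                 disc * ((1 - q) * (p * v[j + 1] + (1 - p) * v[j])
                         + q * recovery * face))                   # hold
             for j in range(i + 1)]
    return v[0]

price = convertible_tree(s0=100.0, face=100.0, conv_ratio=1.0, r=0.05,
                         sigma=0.25, lam=0.02, recovery=0.4, t=1.0, steps=200)
```

    The price sits above the conversion value (the holder can always convert) and below the sum of a default-free bond and an unrestricted equity option, consistent with the usual convertible-bond bounds.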
  11. By: Elena Veprauskaite (School of Management, University of Bath); Michael Sherris (School of Risk and Actuarial Studies and ARC Centre of Excellence in Population Ageing Research, Australian School of Business, University of New South Wales)
    Abstract: This paper considers optimal reinsurance based on an assessment of the reinsurance arrangements for a large life insurer. The objective is to determine the reinsurance structure, based on actual insurer data, using a modified mean-variance criterion that maximises retained premiums and minimises the variance of retained claims while keeping the retained risk exposure constant, assuming a given level of risk appetite. The portfolio of life and disability policies uses quota-share reinsurance, surplus reinsurance, and a combination of the two. Alternative reinsurance arrangements are compared under the modified mean-variance criterion to assess the optimal reinsurance strategy. The analysis takes into account recent claims experience as well as the actual premiums paid by insured lives and to the reinsurers. Optimal reinsurance cover depends on many factors, including retention levels, premiums and the variance of sum-insured values (and therefore of claims). As a result, an insurer should assess the trade-off between retained premiums and the variance of retained claims based on its own experience and risk appetite.
    Keywords: Life insurance, optimal reinsurance, proportional reinsurance, mean-variance criteria
    JEL: G22 G32 L21
    Date: 2012–03
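    For pure quota-share cover the mean-variance trade-off admits a closed form, which can be sketched as follows; the objective and parameter values are illustrative, not the paper's calibration.

```python
def optimal_quota_share(premium, claims_var, risk_aversion):
    """Retention level alpha for quota-share reinsurance under a
    mean-variance criterion: maximize
        alpha * premium - risk_aversion * alpha**2 * claims_var.
    Retained premium scales with alpha while retained-claims variance
    scales with alpha**2, so the unconstrained optimum is
    premium / (2 * risk_aversion * claims_var), capped to [0, 1]."""
    alpha = premium / (2.0 * risk_aversion * claims_var)
    return max(0.0, min(1.0, alpha))

# Higher risk aversion (or higher claims variance) pushes more business
# to the reinsurer, i.e. lowers the optimal retention:
low_aversion = optimal_quota_share(premium=10.0, claims_var=25.0, risk_aversion=0.5)
high_aversion = optimal_quota_share(premium=10.0, claims_var=25.0, risk_aversion=2.0)
```

    Surplus reinsurance changes the variance term policy by policy, which is why the paper's comparison requires actual sum-insured data rather than a closed form.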
  12. By: Daniel Heller; Nicholas Vause
    Abstract: By the end of 2012, all standardised over-the-counter (OTC) derivatives must be cleared with central counterparties (CCPs). In this paper, we estimate the amount of collateral that CCPs should demand to clear safely all interest rate swap and credit default swap positions of the major derivatives dealers. Our estimates are based on potential losses on a set of hypothetical dealer portfolios that replicate several aspects of the way that derivatives positions are distributed within and across dealer portfolios in practice. Our results suggest that major dealers already have sufficient unencumbered assets to meet initial margin requirements, but that some of them may need to increase their cash holdings to meet variation margin calls. We also find that default funds worth only a small fraction of dealers' equity appear sufficient to protect CCPs against almost all possible losses that could arise from the default of one or more dealers, especially if initial margin requirements take into account the tail risks and time variation in risk of cleared portfolios. Finally, we find that concentrating clearing of OTC derivatives in a single CCP could economise on collateral requirements without undermining the robustness of central clearing.
    Keywords: central counterparties, clearing, collateral, derivatives, default funds, initial margins, variation margins
    Date: 2012–03
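    The generic quantile rule behind such collateral estimates can be sketched as follows; the P&L distribution below is synthetic, chosen only to show that fat tails raise the required margin relative to a Gaussian of the same overall volatility.

```python
import random

def initial_margin(pnl_draws, confidence=0.995):
    """Initial margin as a high quantile of simulated portfolio losses
    over the close-out horizon -- the basic quantile rule for CCP
    collateral (figures here are purely illustrative)."""
    losses = sorted(-p for p in pnl_draws)
    return max(0.0, losses[int(confidence * len(losses)) - 1])

# Fat-tailed P&L (a mixture of normals, overall std ~1.48) demands far
# more margin than the Gaussian quantile 2.576 * 1.48 ~ 3.8 suggests:
rng = random.Random(5)
pnl = [rng.gauss(0.0, 1.0) if rng.random() < 0.95 else rng.gauss(0.0, 5.0)
       for _ in range(100_000)]
im = initial_margin(pnl)
```

    This gap between the Gaussian-implied and actual tail quantile is the "tail risk" adjustment the paper argues initial margin requirements should incorporate.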
  13. By: Carlos González-Aguado (BLUECAP); Enrique Moral-Benito (Banco de España)
    Abstract: Model uncertainty hampers consensus on the main determinants of corporate default. We employ Bayesian model averaging (BMA) techniques in order to shed light on this issue. Empirical findings suggest that the most robust determinants of corporate default are firm-specific variables such as the ratio of working capital to total assets, the ratio of retained earnings to total assets, the ratio of total liabilities to total assets and the standard deviation of the firm’s stock return. In contrast, aggregate variables do not seem to play a relevant role once firm-specific characteristics (observable and unobservable) are taken into consideration.
    Keywords: Default probabilities, Bayesian model averaging, Credit Risk
    JEL: G33 C1
    Date: 2012–06
  14. By: Christian Calmès (Chaire d'information financière et organisationnelle ESG-UQAM, Laboratory for Research in Statistics and Probability, Université du Québec (Outaouais)); Raymond Théoret (Chaire d'information financière et organisationnelle ESG-UQAM, Université du Québec (Montréal), Université du Québec (Outaouais))
    Abstract: Traditional leverage ratios assume that bank equity captures all changes in asset values. However, in the context of market-oriented banking, funding can be raised through additional debt or asset sales without directly influencing equity. Given the new sources of liquidity generated by off-balance-sheet (OBS) activities, time-varying indicators of leverage are better suited to capturing the dynamics of aggregate leverage. In this paper, we introduce a Kalman filter procedure to study such elasticity-based measures of broad leverage. This approach detects the build-up in bank risk years earlier than the traditional assets-to-equity ratio does. Most elasticity measures appear in line with the historical episodes, tracking the cyclical pattern of leverage well. Importantly, the degree of total leverage suggests that OBS banking exerts a stronger influence on leverage during expansion periods.
    Keywords: Basel III; Banking stability; Macroprudential policy; Herding; Macroeconomic uncertainty.
    JEL: C32 G20 G21
    Date: 2012–01–27
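    A minimal version of such a filter, for a single time-varying coefficient, can be sketched as follows; the series and the state/noise variances are synthetic and purely illustrative of the mechanics, not the paper's specification.

```python
import random

def kalman_elasticity(y, x, q=1e-3, r=1e-2, beta0=0.0, p0=1.0):
    """Scalar Kalman filter for a time-varying elasticity beta_t in
    y_t = beta_t * x_t + noise, with beta_t following a random walk.
    q is the state-innovation variance, r the observation variance."""
    beta, p, path = beta0, p0, []
    for yt, xt in zip(y, x):
        p += q                                  # predict state variance
        k = p * xt / (xt * xt * p + r)          # Kalman gain
        beta += k * (yt - xt * beta)            # update with the new point
        p *= (1.0 - k * xt)
        path.append(beta)
    return path

# The filtered path tracks a coefficient drifting from 1.0 towards 2.0,
# the kind of gradual leverage build-up a static ratio would miss:
rng = random.Random(2)
true_beta = [1.0 + t / 300.0 for t in range(300)]
x = [rng.gauss(0.0, 1.0) for _ in range(300)]
y = [b * xi + rng.gauss(0.0, 0.1) for b, xi in zip(true_beta, x)]
est = kalman_elasticity(y, x)
```

    Because the state is re-estimated every period, a drift in the elasticity shows up in the filtered path almost immediately rather than being averaged away over the sample.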
  15. By: Christian Calmès (Chaire d'information financière et organisationnelle ESG-UQAM, Laboratory for Research in Statistics and Probability, Université du Québec (Outaouais)); Raymond Théoret (Chaire d'information financière et organisationnelle ESG-UQAM, Université du Québec (Montréal), Université du Québec (Outaouais))
    Abstract: This paper investigates how banks, as a group, react to macroeconomic risk and uncertainty, and more specifically how banks' systemic risk evolves over the business cycle. Adopting the methodology of Beaudry et al. (2001), our results clearly suggest that the dispersion across banks' traditional portfolios has increased through time. We introduce an estimation procedure based on EGARCH and refine the framework of Baum et al. (2002, 2004, 2009) and Quagliariello (2007, 2009) to analyze the question in the new industry context, i.e. shadow banking. Consistent with finance theory, we first confirm that banks tend to behave homogeneously vis-à-vis macroeconomic uncertainty. In particular, we find that the cross-sectional dispersions of loans to assets and of non-traditional activities shrink mainly during downturns, when the resilience of the banking system is at its lowest. More importantly, our results also suggest that the cross-sectional dispersion of market-oriented activities is both more volatile and more sensitive to the business cycle than the dispersion of traditional activities.
    Keywords: Banking stability; Macroprudential policy; Herding; Macroeconomic uncertainty; Markov switching regime; EGARCH.
    JEL: C32 G20 G21
    Date: 2012–04–27
  16. By: Ahmedov, Zafarbek; Woodard, Joshua
    Abstract: This study analyzes a stylized gasoline blender's optimal hedging strategy in the presence of ethanol mandates. In particular, its main objective is to investigate whether the ability to purchase RINs and the presence of tax incentives affect blenders' optimal hedging strategies. A multicommodity hedging method with a Lower Partial Moments criterion as the measure of downside risk is used to obtain the optimal hedge ratios. The results indicate that Renewable Identification Number purchases do not reduce risk and are therefore not a good risk management tool in the presence of blenders' tax credits. In the absence of the tax credit, however, RINs can be used as a risk management tool.
    Keywords: Ethanol, RINs, hedging, LPM, Agribusiness, Agricultural Finance, Resource /Energy Economics and Policy, Risk and Uncertainty,
    Date: 2012
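    The Lower Partial Moment hedging criterion can be sketched with a grid search over hedge ratios; the spot and futures series below are synthetic, and this single-commodity sketch stands in for the paper's multicommodity setting.

```python
import random

def lpm2(returns, target=0.0):
    """Second-order lower partial moment: mean squared shortfall below target."""
    return sum(max(target - r, 0.0) ** 2 for r in returns) / len(returns)

def lpm_hedge_ratio(spot, futures, target=0.0):
    """Hedge ratio h minimizing the LPM2 of the hedged position
    spot - h * futures, found by a simple grid search over [0, 2]."""
    grid = [i / 100.0 for i in range(0, 201)]
    return min(grid, key=lambda h: lpm2(
        [s - h * f for s, f in zip(spot, futures)], target))

# A spot series that closely follows the futures leg yields a ratio near 1:
rng = random.Random(9)
fut = [rng.gauss(0.0, 0.02) for _ in range(2000)]
spt = [f + rng.gauss(0.0, 0.005) for f in fut]
h = lpm_hedge_ratio(spt, fut)
```

    Unlike a variance-minimizing ratio, the LPM criterion penalizes only returns below the target, so for skewed positions the two ratios can differ noticeably.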
  17. By: Elena Andreou; Eric Ghysels; Constantinos Kourouyiannis
    Abstract: Financial time series often undergo periods of structural change that yield biased estimates or forecasts of volatility, and thereby of risk management measures. We show that in the context of GARCH diffusion models, ignoring structural breaks in the leverage coefficient and the constant can lead to biased and inefficient AR-RV and GARCH-type volatility estimates. Similarly, we find that volatility forecasts based on AR-RV and GARCH-type models that take structural breaks into account, by estimating the parameters only in the post-break period, significantly outperform those that ignore them. Hence, we propose a Flexible Forecast Combination method that draws not only on information from different volatility models, but from different subsamples as well. The method consists of two main steps. First, it splits the estimation period into subsamples based on structural breaks detected by a change-point test. Second, it forecasts volatility by weighting information from all subsamples so as to minimize a particular loss function, such as the Squared Error or QLIKE. An empirical application using the S&P 500 Index shows that our approach performs better, especially in periods of high volatility, than a large set of individual volatility models, simple averaging methods, and Forecast Combinations under Regime Switching.
    Keywords: forecast, combinations, volatility, structural breaks
    Date: 2012–05
  18. By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Paris I - Panthéon Sorbonne, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris); Fatima Jouad (EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris, AXA GRM - AXA Group Risk Management)
    Abstract: The advent of the Internal Model Approval Process within Solvency II, and the desire of many insurance companies to gain approval, has increased the importance of topics such as risk aggregation in determining the overall level of economic capital. The most widely used approach for aggregating risks is the variance-covariance matrix. Although a relatively well-known and computationally convenient concept, linear correlation fails to model every particularity of the dependence pattern between risks. In this paper we apply different pair-copula models to aggregate market risks, which usually represent an important part of an insurer's risk profile. We then calculate the economic capital needed to withstand unexpected future losses and the associated diversification benefits. The economic capital is determined by computing both the 99.5% VaR and the 99.5% ES, following the requirements of Solvency II and the SST.
    Keywords: Solvency II, risk aggregation, market risks, pair-copulas, economic capital, diversification gains.
    Date: 2012–05
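    The two regulatory tail measures and the diversification benefit can be computed empirically from simulated aggregate losses. The sketch below uses plain Gaussian risks purely to illustrate the accounting; the paper replaces this dependence model with pair-copulas.

```python
import random

def var_es(losses, level=0.995):
    """Empirical VaR and expected shortfall at the given confidence level."""
    xs = sorted(losses)
    k = int(level * len(xs))
    tail = xs[k:]
    return xs[k], sum(tail) / len(tail)

# Diversification benefit: standalone 99.5% VaRs of two correlated risks
# versus the VaR of their sum (correlation 0.3, unit-variance risks):
rng = random.Random(4)
rho = 0.3
a = [rng.gauss(0.0, 1.0) for _ in range(200_000)]
b = [rho * x + (1 - rho ** 2) ** 0.5 * rng.gauss(0.0, 1.0) for x in a]
var_a, _ = var_es(a)
var_b, _ = var_es(b)
var_sum, es_sum = var_es([x + y for x, y in zip(a, b)])
benefit = var_a + var_b - var_sum      # capital saved by aggregating the risks
```

    Swapping the bivariate normal draw for a pair-copula simulation changes only the sampling step; the VaR/ES accounting and the diversification-benefit calculation stay identical.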
  19. By: Howard Kunreuther; Geoffrey Heal
    Abstract: A principal reason that losses from catastrophic risks have been increasing over time is that more individuals and firms are locating in harm’s way while not taking appropriate protective measures. Several behavioural biases lead decision-makers not to invest in adaptation measures until after it is too late. In an interdependent world with no intervention by the public sector, it may be economically rational for those at risk not to invest in protective measures. Risk management strategies that involve private-public partnerships that address these issues may help in reducing future catastrophic losses. These may include multi-year insurance contracts, well-enforced regulations, third-party inspections, and alternative risk transfer instruments such as catastrophe bonds.
    JEL: D62 D80 D85 H20
    Date: 2012–06
  20. By: Fraser, Rob W.
    Abstract: Motivated by recent EC proposals to “strengthen risk management tools” in the CAP in relation to farmers’ increased exposure to market price risk, this paper draws attention to a potential negative consequence of such a change in the CAP – an associated increase in cheating behaviour by farmers in the context of environmental stewardship. A theoretical framework for this policy problem is developed and used not just to illustrate the problem, but also to propose a solution – specifically to combine the introduction of CAP-supported policy changes which reduce farmers’ exposure to market-based risk with changes in environmental stewardship policies which increase the riskiness of cheating and thereby discourage such behaviour.
    Keywords: Demand and Price Analysis, Environmental Economics and Policy, Risk and Uncertainty,
    Date: 2012

This nep-rmg issue is ©2012 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.