nep-rmg New Economics Papers
on Risk Management
Issue of 2018‒10‒01
25 papers chosen by

  1. On a quest for robustness: About model risk, randomness and discretion in credit risk stress tests By Siemsen, Thomas; Vilsmeier, Johannes
  2. Modelling the spreading process of extreme risks via a simple agent-based model: Evidence from the China stock market By Dinghai Xu; Jingru Ji; Donghua Wang
  3. Reinsurance Pricing of Large Motor Insurance Claims in Nigeria: An Extreme Value Analysis By Queensley Chukwudum
  4. Challenges in Implementing Worst-Case Analysis By Jon Danielsson; Lerby Ergun; Casper G. de Vries
  5. Extreme Value Theory and Copulas: Reinsurance in the Presence of Dependent Risks By Queensley Chukwudum
  6. Reputational risks, value of losses and financial sustainability of commercial banks By Natalia Kunitsyna; Igor Britchenko; Igor Kunitsyn
  7. Effectiveness of United States Corn Futures Contracts as Hedging Instruments for Mexican Corn Producers By Cuautle-Parra, David; Riley, John M.
  8. Dealing with Downside Risk in Energy Markets: Futures versus Exchange-Traded Funds By Arunanondchai, Panit; Sukcharoen, Kunlapath; Leatham, David J.
  9. Change-Point Testing and Estimation for Risk Measures in Time Series By Lin Fan; Peter W. Glynn; Markus Pelger
  10. Enhancing adaptive capacity through climate-smart insurance: Theory and evidence from India By Kramer, B.; Ceballos, F.
  11. Model Risk Measurement under Wasserstein Distance By Yu Feng; Erik Schlögl
  12. Did the Basel process of capital regulation enhance the resiliency of European Banks? By Gehrig, Thomas; Iannino, Maria Chiara
  13. Generalizing Geometric Brownian Motion By Peter Carr; Zhibai Zhang
  14. Prudential Liquidity Regulation in Banking—A Literature Review By Adi Mordel
  15. Should Bank Capital Regulation Be Risk Sensitive? By Toni Ahnert; James Chapman; Carolyn Wilkins
  16. The distortion principle for insurance pricing: properties, identification and robustness By Daniela Escobar; Georg Pflug
  17. Off-farm Income: A Risk Management Strategy for Farm Households? By Huettel, Silke; Murtazashvili, Irina; Weltin, Meike; Erickson, Kenneth W.
  18. Designing Stress Scenarios By Cecilia Parlatore
  19. Evaluation of investment projects under uncertainty: multi-criteria approach using interval data By Olga A. Shvetsova; Elena A. Rodionova; Michael Z. Epstein
  20. Corporate Debt Choice and Bank Capital Regulation By Haotian Xiang
  21. Environmental Hazards and Mortgage Credit Risk: Evidence from Texas Pipeline Incidents By Xu, Minhong; Xu, Yilan
  22. Measuring Systematic Risk with Neural Network Factor Model By Jeonggyu Huh
  23. Flexible Modeling of Multivariate Risks in Pricing Margin Protection Insurance: Modeling Portfolio Risks with Mixtures of Mixtures By Zeytoon Nejad Moosavian, Seyyed Ali
  24. Non-Gaussian Stochastic Volatility Model with Jumps via Gibbs Sampler By Arthur T. Rego; Thiago R. dos Santos
  25. Risk Attitudes of US Agricultural Producers By Rosch, Stephanie D.

  1. By: Siemsen, Thomas; Vilsmeier, Johannes
    Abstract: In this paper we study the impact of model uncertainty, which occurs when linking a stress scenario to default probabilities, on reduced-form credit risk stress testing. This type of uncertainty is omnipresent in most macroeconomic stress testing applications due to short time series for banks' portfolio risk parameters and highly collinear macroeconomic covariates. We quantify the effect of model uncertainty on supervisory and bank stress tests in terms of predicted portfolio loss distributions and implied capital shortfalls by conducting a full-fledged top-down credit risk stress test for over 1,500 German banks. Our results suggest that the impact of model uncertainty on predicted capital shortfalls can be huge, even among models with similar predictive power. This leaves both banks and supervisors with uncertainty when calculating stress impacts and implied capital requirements. To mitigate the impact of uncertainty, we suggest a modeling approach which filters the model space by combining the standard Bayesian model averaging (BMA) paradigm with a structural filter derived from the Merton/Vasicek credit risk model. Applying our stress testing framework, the dispersion decreases and the median stress effect is reduced from -5.0pp of CET1 ratio under the BMA model to -2.5pp under the structurally augmented BMA model, while the predicted capital shortfall is reduced by 70%. The structural filter eliminates extreme outcomes on both sides of the stress forecast distribution, leading in our application to the German banking sector to a reduction in impact compared to the model without the "stress testing plausibility" filter.
    Keywords: model uncertainty,stress test,Bayesian model averaging,quantile mapping,credit risk
    JEL: C11 C52 G21
    Date: 2018
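The Bayesian model averaging step the abstract describes can be sketched in miniature. The sketch below is illustrative only, not the authors' implementation: it assumes BIC-based approximate posterior model weights (a common BMA shortcut), and the function names `bic_weights` and `bma_forecast` and the example numbers are invented for the example.

```python
import math

def bic_weights(log_liks, n_params, n_obs):
    # BIC = -2*logL + k*ln(n); the approximate posterior weight of
    # model i is proportional to exp(-BIC_i / 2).
    bics = [-2.0 * ll + k * math.log(n_obs)
            for ll, k in zip(log_liks, n_params)]
    best = min(bics)  # shift by the best score for numerical stability
    raw = [math.exp(-(b - best) / 2.0) for b in bics]
    total = sum(raw)
    return [r / total for r in raw]

def bma_forecast(forecasts, weights):
    # Model-averaged point forecast: weighted sum over candidate models.
    return sum(w * f for w, f in zip(weights, forecasts))

# Two hypothetical stress-loss models with equal complexity: the one
# with the higher log-likelihood dominates the average.
w = bic_weights(log_liks=[-100.0, -104.0], n_params=[3, 3], n_obs=200)
stress_effect = bma_forecast([-5.0, -2.5], w)
```

A structural filter of the kind the paper proposes would simply zero out the weights of models whose stress forecasts violate the structural restriction before renormalizing.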
  2. By: Dinghai Xu (Department of Economics, University of Waterloo); Jingru Ji (Department of Finance, East China University of Science and Technology); Donghua Wang (Department of Finance, East China University of Science and Technology)
    Abstract: This paper focuses on investigating financial returns' extreme risks, which are defined as the negative log-returns over a certain threshold. A simple agent-based model is constructed to explain the behavior of market traders when extreme risks occur. We consider both the volatility clustering and the heavy tail characteristics when constructing the model. The empirical study uses the China Securities Index 300 daily level data and applies the method of simulated moments to estimate the model parameters. The stationarity and ergodicity tests provide evidence that the proposed model is suitable for estimation and prediction. The goodness-of-fit measures show that our proposed model fits the empirical data well. Our estimated model performs well in out-of-sample Value-at-Risk prediction, which contributes to risk management.
    JEL: C15 C52 G15
    Date: 2018–01–09
  3. By: Queensley Chukwudum (PAUSTI - Pan African University Institute of Basic Sciences, Technology and Innovation)
    Abstract: Insurers that take on high-risk profiles often exceed their own financial capacity, hence the importance of reinsurance, which allows an insurance company to cover risks that it could not, under normal circumstances, carry on its own. An insurer needs to be able to evaluate its solvency probability and, consequently, adjust its retention levels appropriately, because the retention level plays a vital role in determining the premiums paid to the reinsurer. To illustrate how extreme value theory can be applied, this study models the probabilistic behavior of the frequency and severity of large motor claims from the Nigerian insurance sector (2013-2016) using the Negative Binomial-Generalized Pareto distribution (NB-GPD). The annual loss distribution is simulated using the Monte Carlo method. Pricing of excess-of-loss (XL) reinsurance is also examined to help insurers optimize their risk management decisions regarding the choice of risk transfer position.
    Keywords: Extreme value theory,Generalized Pareto distribution,Risk Management,XL Reinsurance,Negative Binomial,Monte Carlo simulation
    Date: 2018–08–09
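The NB-GPD Monte Carlo step the abstract outlines — a Negative Binomial claim count compounded with Generalized Pareto claim sizes — can be sketched as follows. This is a minimal illustration under assumed parameter values, not the paper's calibration; the function names are invented for the example, and `negbin_draw` assumes an integer shape parameter so the count can be built from geometric variables.

```python
import math
import random

def gpd_draw(xi, sigma, rng):
    # Inverse-CDF sample from a Generalized Pareto distribution, xi > 0:
    # F^{-1}(u) = (sigma / xi) * ((1 - u)^(-xi) - 1).
    u = rng.random()
    return (sigma / xi) * ((1.0 - u) ** (-xi) - 1.0)

def negbin_draw(r, p, rng):
    # NegBin(r, p) claim count as a sum of r geometric variables
    # (valid for integer r only).
    return sum(int(math.log(1.0 - rng.random()) / math.log(1.0 - p))
               for _ in range(r))

def annual_loss_quantile(r, p, xi, sigma, q, n_sims=10_000, seed=7):
    # Monte Carlo quantile of the aggregate annual loss
    # S = X_1 + ... + X_N, with N ~ NegBin(r, p), X_i ~ GPD(xi, sigma).
    rng = random.Random(seed)
    totals = sorted(
        sum(gpd_draw(xi, sigma, rng) for _ in range(negbin_draw(r, p, rng)))
        for _ in range(n_sims))
    return totals[int(q * (n_sims - 1))]
```

A pure-premium quote for an XL layer would then average, over the same simulated losses, the part of each claim falling between the retention and the layer limit.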
  4. By: Jon Danielsson; Lerby Ergun; Casper G. de Vries
    Abstract: Worst-case analysis has been used by financial regulators in the wake of the recent financial crisis to gauge tail risk. We provide insight into worst-case analysis and provide guidance on how to estimate it. We derive the bias for the non-parametric heavy-tailed order statistics and contrast it with the semi-parametric extreme value theory (EVT) approach. We find that if the return distribution has a heavy tail, the non-parametric worst-case analysis, i.e. the minimum of the sample, is always downward biased and hence is overly conservative. Relying on semi-parametric EVT reduces the bias considerably in the case of relatively heavy tails, but for the less-heavy tails this relationship is reversed. Estimates for a large sample of US stock returns indicate that this pattern in the bias is indeed present in financial data. With respect to risk management, this induces an overly conservative capital allocation if the worst case is estimated incorrectly.
    Keywords: Financial stability
    JEL: C01 C14 C58
    Date: 2018
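The semi-parametric EVT route the abstract contrasts with the sample minimum typically starts from a tail-index estimate; the Hill estimator is the standard choice. The sketch below is an illustration of that general technique, not the authors' procedure; the Pareto test data and all names are assumptions for the example.

```python
import math
import random

def hill_estimator(losses, k):
    # Hill estimator of the tail index gamma = 1/alpha from the k
    # largest order statistics: the mean log-spacing of the top k
    # observations above the (k+1)-th largest.
    xs = sorted(losses, reverse=True)
    return sum(math.log(xs[i] / xs[k]) for i in range(k)) / k

# Sanity check on a Pareto(alpha = 3) sample, whose true tail index
# is 1/3: draw via inverse transform U^(-1/alpha).
rng = random.Random(42)
sample = [rng.random() ** (-1.0 / 3.0) for _ in range(5000)]
gamma_hat = hill_estimator(sample, k=200)
```

With the tail index in hand, an EVT-based worst-case quantile extrapolates beyond the sample minimum/maximum instead of stopping at it, which is where the bias reduction the paper studies comes from.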
  5. By: Queensley Chukwudum (PAUSTI - Pan African University Institute of Basic Sciences, Technology and Innovation)
    Abstract: An insurer's ability to accurately estimate the accumulation of risk, particularly in the right-hand tail, is vital in ensuring that its risk appetite matches its risk exposures. This paper, therefore, focuses on modeling the extremal dependence structure between insurance risks using the Generalized Pareto distribution and the copula technique. The results obtained after comparing the dependence between large losses from two lines of business (motor and fire) of the Nigerian insurance industry and two specific non-life insurance companies indicate that the correlation coefficients vary and are generally weak. With the aid of an Archimedean copula, the analysis uses the data pair exhibiting the highest correlation to draw particular attention to the importance of taking the extremal dependence structure into account when quantifying risk capital, allocating risk, and estimating the net reinsurance premium under different reinsurance strategies.
    Keywords: Tail dependent risks,Reinsurance treaties,Copulas,Economic capital,Stochastic simulation,Extreme value
    Date: 2018–08–09
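To make the idea of Archimedean-copula tail dependence concrete, here is a minimal sketch using the Clayton copula, a standard Archimedean family with lower-tail dependence. The abstract does not say which Archimedean copula the paper uses, so this family, the parameter value, and the function names are assumptions for the example.

```python
import random

def clayton_sample(theta, n, seed=0):
    # Conditional-inversion sampler for the Clayton copula (theta > 0):
    # given U = u and an independent uniform w, invert the conditional
    # CDF to get V. Clayton exhibits lower-tail dependence.
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        u = rng.random()
        w = rng.random()
        v = (u ** (-theta) * (w ** (-theta / (1.0 + theta)) - 1.0)
             + 1.0) ** (-1.0 / theta)
        pairs.append((u, v))
    return pairs

def lower_tail_dependence(pairs, q=0.05):
    # Empirical lower-tail dependence: P(V <= q | U <= q); roughly q
    # under independence, much larger under joint extremes.
    hits = sum(1 for u, v in pairs if u <= q and v <= q)
    base = sum(1 for u, _ in pairs if u <= q)
    return hits / base if base else 0.0
```

In an application like the paper's, the uniforms would be the GPD-transformed large losses of the two business lines, and the tail-dependence measure feeds directly into joint risk capital.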
  6. By: Natalia Kunitsyna (North-Caucasus Federal University); Igor Britchenko (State Higher Vocational School Memorial of Prof. Stanislaw Tarnowski in Tarnobrzeg); Igor Kunitsyn (North-Caucasus Federal University)
    Abstract: Currently, under conditions of permanent financial risks that hamper sustainable economic growth in the financial sector, the development of evaluation and risk management methods, both those regulated by Basel II and III and others, is of special importance. Reputational risk is one of the significant risks affecting the reliability and credibility of commercial banks. The importance of reputational risk management and the quality of its assessment remain relevant, as the probability of a decrease in or loss of business reputation influences financial results and the degree of customers', partners' and stakeholders' confidence. By means of simulation modeling based on Bayesian networks and fuzzy data analysis, the article characterizes a mechanism for reputational risk assessment and the evaluation of possible losses in banks by plotting normal and lognormal distribution functions. Monte Carlo simulation is used to calculate the probability of losses caused by reputational risks. The degree of standardized histogram similarity is determined on the basis of fuzzy data analysis applying the Hamming distance method. A tree-like hierarchy based on the OWA operator is used to aggregate the data, with Fishburne's coefficients as the convolution scales. The mechanism takes into account the impact of criteria such as return on equity, goodwill value, the risk assets ratio, the share of productive assets in net assets, the efficiency ratio of interest-bearing liabilities, the risk ratio of credit operations, the funding ratio and the reliability index on the business reputation of the bank. The suggested methods and recommendations might be applied to develop a decision-making mechanism targeted at the implementation of a reputational risk management system in commercial banks, as well as to optimize risk management technologies.
    Keywords: risk level,economic modeling,reputation risks,commercial banks,business reputation,sustainable development,value of losses
    Date: 2018–06–29
  7. By: Cuautle-Parra, David; Riley, John M.
    Abstract: Mexico pledged 550 million pesos ($41 million USD) in 2012 to fund a price risk management program for agricultural producers (Stargardter, 2012). The program utilizes risk management tools based in the United States, primarily options on futures contracts. In some cases, the subsidy levels for option premiums were as high as 100%, but the program has scaled these back to an 85% subsidy or less. The purpose of this project is to determine the effectiveness of United States corn futures contracts as hedging instruments for Mexican corn producers. Local cash prices for multiple locations across Mexico are reported by La Secretaría de Agricultura, Ganadería, Desarrollo Rural, Pesca y Alimentación. These prices are available weekly from January 1998 to the present. Futures prices for corn are from the CME Group. To determine the effectiveness of the CME Group corn futures contract as a price risk tool for Mexican corn producers, linear regression models are estimated where the local cash price is the dependent variable and the futures price is the independent variable. The results of the model offer insight into the basis and the optimal hedge ratio for Mexican producers. They indicate that yellow corn futures traded at the CME Group are an effective price risk management tool when national white corn data are considered: at the 95% confidence level, the regression shows that 83.5% of the variation in Mexican prices is explained by CME Group prices. However, the basis support provided by the government appears to be insufficient according to our analysis.
    Keywords: Risk and Uncertainty
    Date: 2018–01–17
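The regression at the heart of this study is the classic minimum-variance hedge: regress cash-price changes on futures-price changes, read the slope as the hedge ratio and the R² as hedging effectiveness. A minimal sketch of that textbook calculation (not the paper's estimates; all names and data are illustrative):

```python
def ols_hedge_ratio(cash, futures):
    # Minimum-variance hedge ratio: the OLS slope of cash-price
    # changes on futures-price changes, b = Cov(c, f) / Var(f).
    n = len(cash)
    mc = sum(cash) / n
    mf = sum(futures) / n
    cov = sum((c - mc) * (f - mf) for c, f in zip(cash, futures))
    var = sum((f - mf) ** 2 for f in futures)
    return cov / var

def hedging_effectiveness(cash, futures):
    # R-squared of the hedge regression: the share of cash-price
    # variance removed by holding b futures per unit of cash exposure.
    b = ols_hedge_ratio(cash, futures)
    mc = sum(cash) / len(cash)
    mf = sum(futures) / len(futures)
    resid = sum((c - mc - b * (f - mf)) ** 2
                for c, f in zip(cash, futures))
    total = sum((c - mc) ** 2 for c in cash)
    return 1.0 - resid / total
```

An effectiveness near 0.835, as the abstract reports for national white corn prices, would mean roughly 83.5% of local price variance is explained by the CME futures price.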
  8. By: Arunanondchai, Panit; Sukcharoen, Kunlapath; Leatham, David J.
    Abstract: The emergence of energy exchange-traded funds (ETFs) has provided an alternative vehicle for both energy producers and users to hedge their respective exposures to unfavorable energy price movements without opening a relatively expensive futures account. While hedging with energy ETFs has been touted as a promising alternative to hedging with traditional energy futures, the question concerning the hedging effectiveness of energy ETFs versus energy futures, especially in terms of their ability to manage downside risk, remains largely unexplored. Accordingly, this study formally compares the hedging effectiveness of the two instruments in a downside risk framework from the perspective of both short and long hedgers. Two estimation methods are applied to estimate the minimum-Value at Risk (VaR) and minimum-Expected Shortfall (ES) hedge ratios: the empirical distribution function method and the kernel copula method. The empirical application focuses on four different energy commodities: crude oil, gasoline, heating oil, and natural gas.
    Keywords: Research Methods/ Statistical Methods, Risk and Uncertainty
    Date: 2018–02–05
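The empirical distribution function method mentioned in the abstract amounts to picking the hedge ratio that minimizes the historical VaR of the hedged position. Here is a minimal grid-search sketch of that idea under assumed simulated data; it is an illustration of the general technique, not the authors' estimator, and all names are invented for the example.

```python
import random

def empirical_var(returns, alpha=0.95):
    # Historical Value-at-Risk: the alpha-quantile of losses (-returns).
    losses = sorted(-r for r in returns)
    return losses[int(alpha * (len(losses) - 1))]

def min_var_hedge_ratio(spot, futures, alpha=0.95):
    # Grid search for the hedge ratio h minimizing the empirical VaR
    # of the hedged return r_s - h * r_f (a short hedger's position).
    grid = [i / 100.0 for i in range(0, 201)]
    return min(grid, key=lambda h: empirical_var(
        [s - h * f for s, f in zip(spot, futures)], alpha))

# Illustrative correlated spot/hedge-instrument returns.
rng = random.Random(3)
fut = [rng.gauss(0.0, 0.02) for _ in range(500)]
spt = [0.9 * f + rng.gauss(0.0, 0.005) for f in fut]
h_star = min_var_hedge_ratio(spt, fut)
```

Swapping `empirical_var` for an expected-shortfall objective gives the minimum-ES hedge ratio in the same way; comparing `h_star` across futures and ETF return series is the comparison the paper performs with more refined (kernel copula) estimators.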
  9. By: Lin Fan; Peter W. Glynn; Markus Pelger
    Abstract: We investigate methods of change-point testing and confidence interval construction for nonparametric estimators of expected shortfall and related risk measures in weakly dependent time series. A key aspect of our work is the ability to detect general multiple structural changes in the tails of time series marginal distributions. Unlike extant approaches for detecting tail structural changes using quantities such as the tail index, our approach does not require parametric modeling of the tail and detects more general changes in the tail. Additionally, our methods are based on the recently introduced self-normalization technique for time series, allowing for statistical analysis without the issues of consistent standard error estimation. The theoretical foundation for our methods is a set of functional central limit theorems, which we develop under weak assumptions. An empirical study of S&P 500 returns and US 30-Year Treasury bonds illustrates the practical use of our methods in detecting and quantifying market instability via the tails of financial time series during times of financial crisis.
    Date: 2018–09
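For reference, the nonparametric expected-shortfall estimator that methods like these build on is just the average loss beyond the empirical VaR. A minimal sketch (the function name and data are illustrative, not the paper's notation):

```python
def expected_shortfall(returns, alpha=0.95):
    # Nonparametric expected shortfall: the average loss beyond the
    # empirical VaR at level alpha (losses are -returns).
    losses = sorted(-r for r in returns)
    k = int(alpha * len(losses))
    tail = losses[k:]
    return sum(tail) / len(tail)
```

A change-point test then asks whether this quantity, computed on sub-samples before and after a candidate break date, differs by more than sampling noise; the paper's self-normalization avoids having to estimate that noise level consistently.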
  10. By: Kramer, B.; Ceballos, F.
    Abstract: Bundling agricultural insurance with climate-smart technologies and practices (CSA) can help improve risk management for smallholder farmers. This paper analyzes how bundling affects demand for insurance and CSA. Calibrating index insurance parameters to CSA payoff profiles increases the demand for insurance, but only when basis risk is low, and these effects are smaller than those of reducing basis risk itself. This raises the question of how to bundle insurance products that leverage new technologies to provide indemnity insurance coverage with minimal basis risk. We therefore study the effect of bundling indemnity insurance with CSA technologies. Specifically, in a field experiment in India, we test whether conditioning insurance payouts on not burning residues improves residue management as a CSA technology. We find that this is the case, suggesting that indemnity insurance can help promote CSA technology adoption, but we also discuss shortcomings of this bundling approach, and identify potential alternatives to combine indemnity insurance and CSA technologies into a complementary risk management bundle.
    Keywords: Agricultural and Food Policy, Environmental Economics and Policy, International Development
    Date: 2018–07
  11. By: Yu Feng; Erik Schlögl
    Abstract: The paper proposes a new approach to model risk measurement based on the Wasserstein distance between two probability measures. It formulates the theoretical motivation arising from the interpretation of a fictitious adversary in robust risk management. The proposed approach accounts for all alternative models and incorporates the economic reality of the fictitious adversary. It provides practically feasible results that overcome the restriction and the integrability issue imposed by the nominal model. The Wasserstein approach is suitable for all types of model risk problems, ranging from the single-asset hedging risk problem to the multi-asset allocation problem. The robust capital allocation line, accounting for correlation risk, is not achievable with other non-parametric approaches.
    Date: 2018–09
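For intuition about the distance underlying the paper's approach: in one dimension, the Wasserstein-1 distance between two equal-size empirical distributions reduces to the mean absolute gap between sorted order statistics. A minimal sketch of that special case (the general multi-dimensional transport problem the paper works with is harder; the function name is illustrative):

```python
def wasserstein_1d(xs, ys):
    # 1-D Wasserstein-1 distance between two equal-size empirical
    # samples: the optimal transport plan pairs sorted order
    # statistics, so W1 is their mean absolute difference.
    assert len(xs) == len(ys)
    return sum(abs(a - b)
               for a, b in zip(sorted(xs), sorted(ys))) / len(xs)
```

A model-risk bound in this spirit asks: over all alternative measures within Wasserstein distance delta of the nominal model, how bad can the risk measure get?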
  12. By: Gehrig, Thomas; Iannino, Maria Chiara
    Abstract: This paper analyses the evolution of the safety and soundness of the European banking sector during the various stages of the Basel process of capital regulation. In the first part we document the evolution of various measures of systemic risk as the Basel process unfolds. Most strikingly, we find that the exposure to systemic risk as measured by SRISK has been steeply rising for the highest quintile, moderately rising for the second quintile, and remaining roughly stationary for the remaining three quintiles of listed European banks. This observation suggests that the Basel process has succeeded in containing systemic risk for the majority of European banks but not for the largest and most risky institutions. In the second part we analyze the drivers of systemic risk. We find compelling evidence that the increase in exposure to systemic risk (SRISK) is intimately tied to the implementation of internal models for determining credit risk as well as market risk. Based on this evidence, the sub-prime crisis found especially the largest and most systemic banks ill-prepared and lacking resiliency. This condition even worsened during the European sovereign crisis. Banking Union has not restored aggregate resiliency to pre-crisis levels. Finally, low interest rates considerably affect the contribution to systemic risk for the safer banks.
    JEL: B26 E58 G21 G28 H12 N24
    Date: 2018–09–27
  13. By: Peter Carr; Zhibai Zhang
    Abstract: To convert standard Brownian motion $Z$ into a positive process, Geometric Brownian motion (GBM) $e^{\beta Z_t}, \beta >0$ is widely used. We generalize this positive process by introducing an asymmetry parameter $ \alpha \geq 0$ which describes the instantaneous volatility whenever the process reaches a new low. For our new process, $\beta$ is the instantaneous volatility as prices become arbitrarily high. Our generalization preserves the positivity, constant proportional drift, and tractability of GBM, while expressing the instantaneous volatility as a randomly weighted $L^2$ mean of $\alpha$ and $\beta$. The running minimum and relative drawup of this process are also analytically tractable. Letting $\alpha = \beta$, our positive process reduces to Geometric Brownian motion. By adding a jump to default to the new process, we introduce a non-negative martingale with the same tractabilities. Assuming a security's dynamics are driven by these processes in risk neutral measure, we price several derivatives including vanilla, barrier and lookback options.
    Date: 2018–09
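The baseline process the paper generalizes, $e^{\beta Z_t}$, is easy to simulate on a discrete grid; the sketch below shows that $\alpha = \beta$ special case only. The generalized process with a separate new-low volatility $\alpha$ requires tracking the running minimum and is not attempted here; the function name and parameter values are illustrative.

```python
import math
import random

def gbm_path(beta, n_steps, dt=1.0 / 252.0, seed=1):
    # Simulate S_t = exp(beta * Z_t) on a discrete time grid, where Z
    # is a standard Brownian motion started at 0; this is the paper's
    # process in the alpha = beta case, and stays strictly positive.
    rng = random.Random(seed)
    z = 0.0
    path = [1.0]
    for _ in range(n_steps):
        z += rng.gauss(0.0, math.sqrt(dt))
        path.append(math.exp(beta * z))
    return path

path = gbm_path(beta=0.2, n_steps=252)
running_min = min(path)  # the statistic the paper prices drawups against
```

In the generalized process, the instantaneous volatility would switch toward $\alpha$ whenever `path` touches a new `running_min`, which is what makes the running minimum and relative drawup the natural tractable functionals.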
  14. By: Adi Mordel
    Abstract: Prudential liquidity requirements are a relatively recent regulatory tool on the international front, introduced as part of the Basel III accord in the form of a liquidity coverage ratio (LCR) and a net stable funding ratio (NSFR). I first discuss the rationale for regulating bank liquidity by highlighting the market failures that it addresses while reviewing key theoretical contributions to the literature on the motivation for prudential liquidity regulation. I then introduce some of the empirical literature on the firm-specific and systemwide effects of that regulation. These findings suggest that while banks respond to binding requirements by increasing long-term funding and reducing maturity mismatch, there is also evidence that risk in the financial system has gone up. In an environment where both bank liquidity and capital are regulated, it is natural to consider the interactions between them. The main conclusions from this growing literature indicate that while liquidity requirements tend to make capital constraints less binding, capital requirements appear to be more costly to comply with, and that both regulations have a non-trivial effect on financial stability. I conclude with a discussion of potential avenues to explore as the Basel III liquidity standards are being implemented in Canada.
    Keywords: Financial Institutions, Financial system regulation and policies
    JEL: G G2 G21 G28
    Date: 2018
  15. By: Toni Ahnert; James Chapman; Carolyn Wilkins
    Abstract: We present a simple model to study the risk sensitivity of capital regulation. A banker funds investment with uninsured deposits and costly capital, where capital resolves a moral hazard problem in the banker’s choice of risk. Investors are uninformed about investment quality, but a regulator receives a signal about it and imposes minimum capital requirements. With a perfect signal, capital requirements are risk sensitive and achieve the first-best levels of risk and intermediation: safer banks attract cheaper deposit funding and require less capital. With a noisy signal, risk-sensitive capital regulation can implement a separating equilibrium in which low-quality banks do not participate. We show that the degree of risk sensitivity is non-monotone in the precision of the signal and in investment characteristics. Without a signal, a leverage ratio still induces the efficient risk choice but leads to excessive or insufficient intermediation.
    Keywords: Financial institutions; Financial system regulation and policies
    JEL: G21 G28
    Date: 2018
  16. By: Daniela Escobar; Georg Pflug
    Abstract: Distortion (Denneberg 1990) is a well-known premium calculation principle for insurance contracts. In this paper, we study sensitivity properties of distortion functionals w.r.t. the assumptions for risk aversion, as well as robustness w.r.t. ambiguity of the loss distribution. Ambiguity is measured by the Wasserstein distance. We study variances of distances for probability models and identify some worst-case distributions. In addition to the direct problem we also investigate the inverse problem, that is, how to identify the distortion density on the basis of observations of insurance premia.
    Date: 2018–09
  17. By: Huettel, Silke; Murtazashvili, Irina; Weltin, Meike; Erickson, Kenneth W.
    Keywords: Risk and Uncertainty, Agricultural Finance, Agribusiness
    Date: 2017–06–15
  18. By: Cecilia Parlatore (New York University Stern)
    Abstract: We study the optimal design of scenarios by a risk-averse principal (e.g., a risk officer, a regulator) who seeks to learn about the exposures of agents (e.g., traders, banks) to a set of risk factors. We decompose the problem into a learning part and a design part. Conditional on the stress scenarios, we show how to apply a Kalman filter to solve the learning problem. The design of optimal scenarios is then a function of what the regulator wants to learn and of how she intends to intervene if she uncovers excessive exposures. We show how the optimal design depends on ex-ante leverage, the correlation of exposures within and across agents, and the non-linearities in potential losses.
    Date: 2018
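The learning step the abstract describes can be illustrated with the simplest possible Kalman filter: a scalar Gaussian update of a belief about an exposure after one noisy scenario observation. This is a textbook sketch, not the paper's model; the function name and numbers are illustrative.

```python
def kalman_update(mu, var, obs, obs_var):
    # One scalar Kalman (Bayesian-Gaussian) update: combine a prior
    # belief N(mu, var) about an agent's exposure with a noisy stress
    # observation obs whose noise variance is obs_var. The Kalman gain
    # weights the observation by its relative precision.
    gain = var / (var + obs_var)
    return mu + gain * (obs - mu), (1.0 - gain) * var

# Prior N(0, 1) about an exposure; scenario reveals obs = 2.0 with
# unit noise, so the posterior splits the difference.
mu, var = kalman_update(0.0, 1.0, 2.0, 1.0)  # -> (1.0, 0.5)
```

In the paper's design problem, the choice of scenario effectively controls `obs_var` for each risk factor, so the regulator picks scenarios that shrink posterior variance most where intervention matters most.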
  19. By: Olga A. Shvetsova (Korea University of Technology and Education); Elena A. Rodionova (SPbPU - Peter the Great St. Petersburg Polytechnic University); Michael Z. Epstein (SPbPU - Peter the Great St. Petersburg Polytechnic University)
    Abstract: Multi-criteria decision making (MCDM) methods have evolved for various types of applications. In the past, even small variations to existing methods have led to the creation of new avenues for research. Thus, in this study, we review the MCDM methods in investment management and examine the advantages and disadvantages of these methods in a risk environment. In addition, we study the effectiveness of investment projects using these methods. The analysis of MCDM methods performed in this study provides a guide for the use of these methods, especially the ones based on interval data, in investment project analysis. Furthermore, we propose a combination of multi-criterial selection and interval preferences to evaluate investment projects. Our method improves on the method of calculating economic efficiency based on a one-dimensional criterion and sensitivity analysis, though our proposal involves complicated calculations.
    Keywords: multicriterial approach,Pareto set,risk management,investment project evaluation,investment project,interval data
    Date: 2018–06–29
  20. By: Haotian Xiang (Wharton School of the University of Pennsylvania)
    Abstract: I investigate the impact of bank capital requirements in a business cycle model with corporate debt choice. Compared to non-bank investors, banks provide restructurable loans that reduce firm bankruptcy losses and enhance production efficiency. Raising capital requirements eliminates deposit insurance distortions but also deposit tax shields. As a result, firms cut back on both bank and non-bank borrowing while going bankrupt more frequently. Implementing an optimal capital ratio of 11 percent in the US produces limited marginal impacts on aggregate quantities and welfare.
    Date: 2018
  21. By: Xu, Minhong; Xu, Yilan
    Keywords: Environmental Economics and Policy, Resource/Energy Economics and Policy, Risk and Uncertainty
    Date: 2017–06–26
  22. By: Jeonggyu Huh
    Abstract: In this paper, we measure systematic risk with a new nonparametric factor model, the neural network factor model. Suitable factors for systematic risk can be found naturally by inserting daily returns on a wide range of assets into the bottleneck network. Unlike parametric factor models, the network-based model is not tied to a probabilistic structure, and it does not need feature engineering because it selects notable features by itself. In addition, we compare performance between our model and the existing models using 20 years of data on S&P 100 components. Although the new model cannot outperform the best of the parametric factor models, due to limitations of variational inference (the estimation method used for this study), it is still noteworthy in that it achieves performance on par with the best comparable models without any prior knowledge.
    Date: 2018–09
  23. By: Zeytoon Nejad Moosavian, Seyyed Ali
    Keywords: Risk and Uncertainty, Demand and Price Analysis, Agricultural and Food Policy
    Date: 2017–06–15
  24. By: Arthur T. Rego; Thiago R. dos Santos
    Abstract: In this work, we propose a model for estimating volatility from financial time series, extending the non-Gaussian family of state-space models with exact marginal likelihood proposed by Gamerman, Santos and Franco (2013). In the literature there are models focused on estimating financial asset risk; however, most of them rely on MCMC methods based on Metropolis algorithms, since the full conditional posterior distributions are not known. We present an alternative model capable of estimating volatility automatically, since all full conditional posterior distributions are known and it is possible to obtain an exact sample of the parameters via the Gibbs sampler. The incorporation of jumps in returns allows the model to capture speculative movements in the data, so that their influence does not propagate to volatility. We evaluate the performance of the algorithm using synthetic and real data time series.
    Keywords: Financial time series, Stochastic volatility, Gibbs Sampler, Dynamic linear models
    Date: 2018–08
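The key point of the abstract, that known full conditionals allow exact Gibbs draws with no Metropolis rejection step, can be illustrated on the smallest possible example: a bivariate normal, whose conditionals are themselves normal. This toy sketch is not the paper's volatility model; all names and parameters are illustrative.

```python
import random

def gibbs_bivariate_normal(rho, n_draws, seed=5):
    # Gibbs sampler for a standard bivariate normal with correlation
    # rho: each full conditional is N(rho * other, 1 - rho^2), so every
    # update is an exact draw, just as in the paper's conditionally
    # conjugate volatility model.
    rng = random.Random(seed)
    sd = (1.0 - rho * rho) ** 0.5
    x, y = 0.0, 0.0
    draws = []
    for _ in range(n_draws):
        x = rng.gauss(rho * y, sd)
        y = rng.gauss(rho * x, sd)
        draws.append((x, y))
    return draws
```

In the stochastic volatility setting, the same alternation runs over volatility states, jump indicators, and static parameters, each drawn exactly from its known full conditional.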
  25. By: Rosch, Stephanie D.
    Keywords: Risk and Uncertainty, Institutional and Behavioral Economics, Agribusiness
    Date: 2017–06–26

General information on the NEP project can be found at <>. For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.