
on Risk Management 
Issue of 2013‒07‒15
eight papers chosen by 
By:  Chia-Lin Chang (Department of Applied Economics and Department of Finance, National Chung Hsing University, Taiwan); David E. Allen (School of Accounting, Finance and Economics, Edith Cowan University, Australia); Michael McAleer (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam and Tinbergen Institute, The Netherlands; Department of Quantitative Economics, Complutense University of Madrid, Spain; and Institute of Economic Research, Kyoto University, Japan); Teodosio Perez Amaral (Department of Quantitative Economics, Complutense University of Madrid, Spain) 
Abstract:  The papers in this special issue of Mathematics and Computers in Simulation are substantially revised versions of the papers that were presented at the 2011 Madrid International Conference on “Risk Modelling and Management” (RMM2011). The papers cover the following topics: currency hedging strategies using dynamic multivariate GARCH, risk management of risk under the Basel Accord: a Bayesian approach to forecasting Value-at-Risk of VIX futures, fast clustering of GARCH processes via Gaussian mixture models, GFC-robust risk management under the Basel Accord using extreme value methodologies, volatility spillovers from the Chinese stock market to economic neighbours, a detailed comparison of Value-at-Risk estimates, the dynamics of BRICS's country risk ratings and domestic stock markets, U.S. stock market and oil price, forecasting Value-at-Risk with a duration-based POT method, and extreme market risk and extreme value theory. 
Keywords:  Currency hedging strategies, Basel Accord, risk management, forecasting, VIX futures, fast clustering, mixture models, extreme value methodologies, volatility spillovers, Value-at-Risk, country risk ratings, BRICS, extreme market risk. 
JEL:  C14 C32 C53 C58 G11 G32 
Date:  2013–07 
URL:  http://d.repec.org/n?u=RePEc:kyo:wpaper:872&r=rmg 
By:  Pauline Barrieu; Giacomo Scandolo 
Abstract:  Model risk has a major impact on any risk measurement procedure, so its quantification is a crucial step. In this paper, we introduce three quantitative measures of model risk when choosing a particular reference model within a given class: the absolute measure of model risk, the relative measure of model risk and the local measure of model risk. Each measure has a specific purpose and so allows for flexibility. We illustrate the various notions by studying relevant examples, so as to emphasize the practicability and tractability of our approach. 
Date:  2013–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1307.0684&r=rmg 
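The absolute/relative distinction above can be made concrete with a toy sketch. The code below is illustrative only and does not reproduce the paper's formal definitions: it compares a Gaussian 99% VaR across a hypothetical class of volatility models, then forms an absolute-style gauge (worst case minus reference) and a relative-style gauge (normalised by the spread of the class).

```python
# Illustrative sketch (not the paper's exact definitions): gauge model risk
# by comparing a risk measure (here 99% Gaussian VaR) across a class of
# candidate volatility estimates for the same position.
def gaussian_var(sigma, z=2.3263478740408408):
    """99% value-at-risk of a centred normal P&L with volatility sigma."""
    return z * sigma

candidate_sigmas = [0.9, 1.0, 1.1, 1.3]   # hypothetical model class
reference_sigma = 1.0                      # chosen reference model

vars_ = [gaussian_var(s) for s in candidate_sigmas]
ref = gaussian_var(reference_sigma)

# Absolute-style measure: worst-case VaR minus the reference VaR.
absolute = max(vars_) - ref
# Relative-style measure: normalise by the spread of the model class,
# so the result lies in [0, 1] and is unit-free.
relative = absolute / (max(vars_) - min(vars_))
print(round(absolute, 4), relative)  # → 0.6979 0.75
```

The relative gauge is invariant to rescaling the risk measure, which is the kind of flexibility the abstract alludes to.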
By:  Svend Rasmussen (Department of Food and Resource Economics, University of Copenhagen); Anders L. Madsen (HUGIN EXPERT A/S; Aalborg University); Mogens Lund (Department of Food and Resource Economics, University of Copenhagen) 
Abstract:  The importance of risk management increases as farmers become more exposed to risk. But risk management is a difficult topic because income risk is the result of the complex interaction of multiple risk factors combined with the effect of an increasing array of possible risk management tools. In this paper we use Bayesian networks as an integrated modelling approach for representing uncertainty and analysing risk management in agriculture. It is shown how historical farm account data may be efficiently used to estimate conditional probabilities, which are the core elements in Bayesian network models. We further show how the Bayesian network model RiBay is used for stochastic simulation of farm income, and we demonstrate how RiBay can be used to simulate risk management at the farm level. It is concluded that the key strength of a Bayesian network is the transparency of assumptions, and that it has the ability to link uncertainty from different external sources to budget figures and to quantify risk at the farm level. 
Keywords:  Bayesian network, Risk, Conditional probabilities, Stochastic simulation, Database, Farm account 
JEL:  C11 C63 D81 Q12 
Date:  2013–05 
URL:  http://d.repec.org/n?u=RePEc:foi:wpaper:2013_12&r=rmg 
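As a sketch of the estimation step the abstract describes (conditional probabilities as the core elements of a Bayesian network), the snippet below estimates a conditional probability table by counting relative frequencies in a toy set of farm records. The variable names and data are hypothetical, not taken from RiBay.

```python
from collections import Counter

# Hypothetical historical farm records (discretised levels).
records = [
    {"yield": "high", "price": "high", "income": "high"},
    {"yield": "high", "price": "low",  "income": "mid"},
    {"yield": "low",  "price": "high", "income": "mid"},
    {"yield": "low",  "price": "low",  "income": "low"},
    {"yield": "high", "price": "high", "income": "high"},
    {"yield": "low",  "price": "low",  "income": "low"},
]

def cpt(records, child, parents):
    """P(child | parents) estimated by relative frequencies."""
    joint = Counter()   # counts of (parent values, child value)
    marg = Counter()    # counts of parent values alone
    for r in records:
        key = tuple(r[p] for p in parents)
        joint[(key, r[child])] += 1
        marg[key] += 1
    return {k: n / marg[k[0]] for k, n in joint.items()}

table = cpt(records, "income", ["yield", "price"])
# e.g. P(income = high | yield = high, price = high)
print(table[(("high", "high"), "high")])  # → 1.0
```

In practice these counts would come from the farm account database, with smoothing for sparsely observed parent configurations.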
By:  Carlo Domenico Mottura; Luca Passalacqua 
Abstract:  The paper analyses the problem of evaluating a guarantee contract against default risk in which the guarantor party is itself defaultable and the default risks of the guarantor and of the borrower are correlated. This problem has several relevant applications within the present sovereign risk crisis. We investigate the effects of the dependence structure between default events within a framework defined by the classical no-arbitrage market approach, considering intensity models driven by Cox processes for the term structure of survival probabilities and copula models to derive the joint distribution of default times. We compare numerical results on the probability of the guarantee being paid, for different values of the default intensities, using the Gaussian and the Marshall-Olkin copulas, finding relevant differences and a counterintuitive dependence on the correlation parameter. 
Keywords:  government guarantees, default risk, correlation, Marshall-Olkin 
JEL:  C16 G13 G28 
Date:  2013–07 
URL:  http://d.repec.org/n?u=RePEc:rtr:wpaper:0177&r=rmg 
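One of the two dependence structures mentioned above, the Marshall-Olkin copula, can be simulated directly through its common-shock construction. The Monte Carlo sketch below is stylised: the intensities, horizon and payout trigger are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

# Marshall-Olkin common-shock model for joint default of borrower (b) and
# guarantor (g): T_b = min(Z_b, Z_c), T_g = min(Z_g, Z_c), where Z_b, Z_g,
# Z_c are independent exponentials and Z_c is a shock killing both at once.
rng = np.random.default_rng(0)
lam_b, lam_g, lam_c = 0.05, 0.02, 0.01   # hypothetical default intensities
horizon, n = 5.0, 200_000

z_b = rng.exponential(1 / lam_b, n)
z_g = rng.exponential(1 / lam_g, n)
z_c = rng.exponential(1 / lam_c, n)
t_b = np.minimum(z_b, z_c)
t_g = np.minimum(z_g, z_c)

# The guarantee pays if the borrower defaults before the horizon while the
# guarantor is still alive at that moment; the strict inequality excludes
# simultaneous deaths caused by the common shock.
paid = float(np.mean((t_b <= horizon) & (t_g > t_b)))
print(round(paid, 3))
```

Raising the common-shock intensity `lam_c` increases default correlation but lowers the probability the guarantee is actually honoured, which illustrates the kind of non-obvious dependence on the correlation parameter the abstract reports.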
By:  Erhan Bayraktar; Zhou Zhou 
Abstract:  We consider the pricing and hedging of exotic options in a model-independent setup using \emph{shortfall risk and quantiles}. We assume that the marginal distributions at certain times are given. This is tantamount to calibrating the model to call options with a discrete set of maturities but a continuum of strikes. In the case of pricing with shortfall risk, we prove that the minimum initial amount is equal to the superhedging price plus the inverse of the utility at the given shortfall level. In the second result, we show that the quantile hedging problem is equivalent to superhedging problems for knock-out options. These results generalize the duality results of [5,6] to the model-independent setting of [1]. 
Date:  2013–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1307.2493&r=rmg 
By:  Stefano Olgiati; Alessandro Danovi 
Abstract:  Credit risk management in Italy is characterized, in the period June 2008 to June 2012, by frequent (frequency = 0.5 cycles per year) and intense (peak amplitude: mean = 39.2 billion Euros, s.e. = 2.83 billion Euros) quarterly contractions and expansions around the mean (915.4 billion Euros, s.e. = 3.59 billion Euros) of the nominal total credit used by non-financial corporations. Such fluctuations are often ascribed to exogenous Basel II procyclical effects on credit flow into the economy, and, consequently, Basel III output-based point-in-time Credit-to-GDP countercyclical buffering is advocated. We have tested the opposite null hypotheses that such variation is significantly correlated with actual default rates, and that this correlation is explained by fluctuations of credit supply around a steady state. We find that, in the period June 2008 to June 2012 (n = 17), linear regression of credit growth rates on default rates reveals a negative correlation of r = −0.6903 with R² = 0.4765, and that credit supply fluctuates steadily around the default rate with an internal steady state parameter SSP = 0.00245 with χ² = 37.47 (ν = 16, P < .005). We conclude that fluctuations of the total credit used by non-financial corporations are exhaustively explained by variation of the independent variable default rate, and that credit variation fluctuates around a steady state. Credit risk management in Italy has thus been effective in parameterizing credit supply variation to default rates within the Basel II operating framework. Basel III prospective countercyclical point-in-time output buffers based on filtered Credit-to-GDP ratios, and dynamic provisioning proposals, should take this underlying steady-state statistical pattern into account. 
Date:  2013–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1307.2465&r=rmg 
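The core regression in the abstract, credit growth rates on default rates, fits in a few lines. The series below are synthetic stand-ins for the Bank of Italy data, so only the sign of the slope and the strength of the correlation, not the paper's reported values, carry over.

```python
# Synthetic quarterly series (illustrative only): default rates rise while
# credit growth falls, mimicking the negative relationship in the paper.
default_rate  = [0.010, 0.012, 0.015, 0.018, 0.020, 0.022]
credit_growth = [0.020, 0.015, 0.008, 0.002, -0.003, -0.006]

# Ordinary least squares by hand: slope = Sxy / Sxx, r = Sxy / sqrt(Sxx*Syy).
n = len(default_rate)
mx = sum(default_rate) / n
my = sum(credit_growth) / n
sxx = sum((x - mx) ** 2 for x in default_rate)
syy = sum((y - my) ** 2 for y in credit_growth)
sxy = sum((x - mx) * (y - my) for x, y in zip(default_rate, credit_growth))

slope = sxy / sxx
r = sxy / (sxx * syy) ** 0.5
print(slope < 0, round(r, 3))
```

A significantly negative slope is what supports the paper's reading that credit supply responds to default rates rather than moving procyclically on its own.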
By:  Eva Schliephake (Faculty of Economics and Management, Otto-von-Guericke University Magdeburg) 
Abstract:  Microprudential capital requirements are designed to reduce the excessive risk taking of banks. If banks are required to use more equity funding for risky assets, they invest more funds in safe assets. This paper analyzes a government that simultaneously regulates the banking sector and borrows from it. I argue that a government may have an incentive to use capital requirements to alleviate its budget burden. The risk weights for risky assets may be set too high relative to the risk weight on government bonds. This could have a negative impact on welfare. The supply of loans to the risky sector shrinks, which may harm long-term growth. Moreover, the government may be tempted to increase its debt level owing to better funding conditions, which raises the risk of a future sovereign debt crisis. A government with a short-term focus may be tempted to neglect this risk and thereby introduce systemic risk into the banking sector. 
Keywords:  Capital Requirement Regulation, Government Debt 
JEL:  G21 G28 G32 
Date:  2013–06 
URL:  http://d.repec.org/n?u=RePEc:mag:wpaper:130011&r=rmg 
By:  Moshe A. Milevsky; Thomas S. Salisbury 
Abstract:  Historical tontines promised enormous rewards to the last survivors at the expense of those who died early. While this design appealed to the gambling instinct, it is a suboptimal way to manage longevity risk during retirement. This is why fair life annuities making constant payments, where the insurance company is exposed to the longevity risk, induce greater lifetime utility. However, tontines do not have to be designed using a winner-take-all approach, and insurance companies do not actually sell fair life annuities, partially due to aggregate longevity risk. In this paper we derive the tontine structure that maximizes lifetime utility but does not expose the sponsor to any longevity risk. We examine its sensitivity to the size of the tontine pool, individual longevity risk aversion, and subjective health status. The optimal tontine varies with the individual's longevity risk aversion $\gamma$ and the number of participants $n$, which is problematic for product design. That said, we introduce a structure called a natural tontine whose payout declines in exact proportion to the (expected) survival probabilities, which is near-optimal for all $\gamma$ and $n$. We compare the utility of optimal tontines to the utility of loaded life annuities under reasonable demographic and economic conditions and find that the life annuity's advantage over tontines is minimal. We also review and analyze the first-ever mortality derivative issued by the British government, known as King William's tontine of 1693. We shed light on the preferences and beliefs of those who invested in the tontines vs. the annuities and argue that tontines should be reintroduced and allowed to coexist with life annuities. Individuals would likely select a portfolio of tontines and annuities that suits their personal preferences for consumption and longevity risk, as they did over 320 years ago. 
Date:  2013–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1307.2824&r=rmg 
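The defining property of the natural tontine, total payout declining in proportion to survival probabilities so that the expected payout per survivor stays flat, can be checked with a small sketch. The Gompertz parameters and pool size below are illustrative assumptions, not the paper's calibration.

```python
import math

def survival(t, m=88.0, b=10.0, x=65.0):
    """Gompertz probability of surviving t more years from age x
    (modal age m, dispersion b); parameters are illustrative."""
    return math.exp(math.exp((x - m) / b) * (1.0 - math.exp(t / b)))

n, d0 = 400, 1.0     # pool size and per-member payout rate at t = 0
per_survivor = []
for t in (0, 10, 20, 30):
    p = survival(t)
    total = n * d0 * p                    # pool payout scaled down by p(t)
    per_survivor.append(total / (n * p))  # expected payout per survivor
print(per_survivor)  # → [1.0, 1.0, 1.0, 1.0]
```

Because the pool payout and the expected number of survivors both shrink by the same factor p(t), the ratio is flat by construction, regardless of the pool size or the mortality model, which is why this design is insensitive to $\gamma$ and $n$.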