
on Risk Management 
By:  Iacob, Constanta; Zaharia, Stefan 
Abstract:  Concern with the analysis and management of risk emerged around 1970 and developed over the following decade, spanning several fields of the human sciences: administration, sociology, economics and political science. The subsequent neglect of risk analysis for more than 20 years affected the competitiveness of enterprises. The risks remain largely the same; some have been amplified, and new ones have appeared, at times creating an "avalanche effect" whose consequences are difficult to estimate and to contain. Constantly facing new challenges, customs administrations must remain responsive when managing emerging risks. Risk management, combined with the other essential components of customs work, indicates the direction this field will take in the twenty-first century. This paper analyzes the risks faced by customs authorities, the possibilities for analyzing and measuring those risks, and the directions for their management. 
Keywords:  risks; evaluation; indicators; analysis; monitoring; review 
JEL:  G32 G28 M42 
Date:  2012–06–09 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:39352&r=rmg 
By:  Alexandros Gabrielsen (Sumitomo Mitsui Banking Corporation, UK); Paolo Zagaglia (Department of Economics, University of Bologna, Italy); Axel Kirchner (Deutsche Bank, UK); Zhuoshi Liu (Bank of England, UK) 
Abstract:  This paper provides an insight into the time-varying dynamics of the shape of the distribution of financial return series by proposing an exponentially weighted moving average model that jointly estimates volatility, skewness and kurtosis over time, using a modified form of the Gram-Charlier density in which skewness and kurtosis appear directly in the functional form of the density. In this setting, VaR can be described as a function of the time-varying higher moments by applying the Cornish-Fisher expansion of the first four moments. The predictive performance of the proposed model for 1-day and 10-day VaR forecasts is evaluated in comparison with historical simulation, filtered historical simulation and GARCH models. The adequacy of the VaR forecasts is assessed under the unconditional, independence and conditional likelihood ratio tests as well as Basel II regulatory tests. The results have significant implications for risk management, trading and hedging activities, as well as for the pricing of equity derivatives. 
Keywords:  exponentially weighted moving average, time-varying higher moments, Cornish-Fisher expansion, Gram-Charlier density, risk management, Value-at-Risk 
JEL:  C51 C52 C53 G15 
Date:  2012–06 
URL:  http://d.repec.org/n?u=RePEc:rim:rimwps:34_12&r=rmg 
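The Cornish-Fisher step described above is mechanical once the four moments are in hand: it perturbs the Gaussian quantile with skewness and excess-kurtosis terms. A minimal sketch, with illustrative moment values that are not taken from the paper:

```python
from statistics import NormalDist

def cornish_fisher_var(mu, sigma, skew, ex_kurt, alpha=0.01):
    """One-period Value-at-Risk from the first four moments via the
    third-order Cornish-Fisher expansion of the alpha-quantile."""
    z = NormalDist().inv_cdf(alpha)  # Gaussian quantile, e.g. -2.326 at 1%
    # Cornish-Fisher adjusted quantile
    z_cf = (z
            + (z**2 - 1) * skew / 6
            + (z**3 - 3 * z) * ex_kurt / 24
            - (2 * z**3 - 5 * z) * skew**2 / 36)
    return -(mu + sigma * z_cf)  # VaR reported as a positive loss

# Illustrative daily moments (assumed, not estimated from the paper's data):
var_normal = cornish_fisher_var(0.0, 0.01, 0.0, 0.0)   # reduces to Gaussian VaR
var_fat    = cornish_fisher_var(0.0, 0.01, -0.5, 3.0)  # skewed, fat-tailed
```

With zero skewness and excess kurtosis the adjustment vanishes and the Gaussian VaR is recovered; negative skew and fat tails push the quantile further out, raising VaR.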
By:  Mandal, Maitreyi; Lagerkvist, Carl Johan 
Abstract:  Mean-variance theory of portfolio construction is still regarded as the main building block of modern portfolio theory. However, many authors have suggested that the mean-variance criterion, conceived by Markowitz (1952), is not optimal for asset allocation, because the investor's expected utility function is better proxied by a function that uses higher moments, and because returns are distributed in a non-Normal way, being asymmetric and/or leptokurtic; the mean-variance criterion therefore cannot correctly proxy expected utility with non-Normal returns. Copulas are a very useful tool for dealing with non-standard multivariate distributions. Value at Risk (VaR) and Conditional Value at Risk (CVaR) have emerged as benchmark measures of risk in recent times. Though almost unutilized so far, these measures will attract growing interest as agriculture becomes more industrialized. In this paper, we apply Gaussian copula and Student's t copula models to create a joint distribution of returns of two (Farm Return and S&P 500 Index Return) and three (Farm Return, S&P 500 Index Return and US Treasury Bond Index) asset classes, and finally use VaR measures to create the optimal portfolio. The resulting portfolio offers better hedges against losses. 
Keywords:  Portfolio Choice, Downside Risk Protection, Value at Risk, Copula, Agricultural Finance, Risk and Uncertainty 
JEL:  C52 G11 Q14 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:ags:aaea12:124387&r=rmg 
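The copula construction the abstract describes separates dependence from the marginals: draw correlated Gaussians, map them to uniforms, then push the uniforms through each asset's marginal quantile function. A two-asset sketch; the correlation and marginal parameters below are illustrative, not the paper's estimates, and the marginals are kept Normal only for brevity (any distribution with an inverse CDF works the same way):

```python
import math
import random
from statistics import NormalDist

def simulate_gaussian_copula(rho, n, marg1, marg2, seed=0):
    """Draw joint returns whose dependence is a Gaussian copula with
    correlation rho and whose marginals are the given distributions."""
    rng = random.Random(seed)
    std = NormalDist()
    pairs = []
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        x1 = z1
        x2 = rho * z1 + math.sqrt(1 - rho**2) * z2  # correlated Gaussians
        u1, u2 = std.cdf(x1), std.cdf(x2)           # copula scale (uniforms)
        pairs.append((marg1.inv_cdf(u1), marg2.inv_cdf(u2)))
    return pairs

def empirical_var(losses, alpha=0.05):
    """alpha-level VaR as the empirical (1 - alpha)-quantile of losses."""
    s = sorted(losses)
    return s[int((1 - alpha) * len(s))]

# Illustrative marginals (assumed, not the paper's farm-return estimates):
farm, sp500 = NormalDist(0.0005, 0.02), NormalDist(0.0003, 0.012)
draws = simulate_gaussian_copula(0.3, 20_000, farm, sp500)
port_losses = [-(0.5 * r1 + 0.5 * r2) for r1, r2 in draws]
var95 = empirical_var(port_losses, 0.05)
```

A Student's t copula follows the same recipe, with t-distributed factors in place of the Gaussians, which adds tail dependence between the assets.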
By:  Mauricio Labadie (EXQIM - EXclusive Quantitative Investment Management); Charles-Albert Lehalle (Head of Quantitative Research, CALYON group) 
Abstract:  We derive explicit recursive formulas for Target Close (TC) and Implementation Shortfall (IS) in the Almgren-Chriss framework. We explain how to compute the optimal starting and stopping times for IS and TC, respectively, given a minimum trading size. We also show how to add a minimum participation rate constraint (Percentage of Volume, PVol) for both TC and IS. We also study an alternative set of risk measures for the optimisation of algorithmic trading curves. We assume a self-similar process (e.g. Lévy process, fractional Brownian motion or fractal process) and define a new risk measure, the p-variation, which reduces to the variance if the process is a Brownian motion. We deduce the explicit formula for the TC and IS algorithms under a self-similar process. We show that there is an equivalence between self-similar models and a family of risk measures called p-variations: assuming a self-similar process and calibrating empirically the parameter p for the p-variation yields the same result as assuming a Brownian motion and using the p-variation as risk measure instead of the variance. We also show that p can be seen as a measure of aggressiveness: p increases if and only if the TC algorithm starts later and executes faster. From the explicit expression of the TC algorithm one can compute the sensitivities of the curve with respect to the parameters up to any order. As an example, we compute the first-order sensitivity with respect to both a local and a global surge of volatility. Finally, we show how the parameter p of the p-variation can be implied from the optimal starting time of TC, and that under this framework p can be viewed as a measure of the joint impact of market impact (i.e. liquidity) and volatility. 
Keywords:  Quantitative Finance; High-Frequency Trading; Algorithmic Trading; Optimal Execution; Market Impact; Risk Measures; Self-similar Processes; Fractal Processes 
Date:  2012–05–18 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal00705056&r=rmg 
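In the Almgren-Chriss framework referenced above, the optimal Implementation Shortfall schedule under a variance risk measure has a well-known closed form in hyperbolic sines. A sketch under assumed impact and risk-aversion parameters (illustrative values, not the paper's):

```python
import math

def almgren_chriss_holdings(X, T, n, sigma, eta, lam):
    """Remaining holdings along the optimal Implementation Shortfall
    schedule in the Almgren-Chriss model: x(t) = X sinh(k(T-t)) / sinh(kT),
    with urgency k = sqrt(lam * sigma**2 / eta)."""
    k = math.sqrt(lam * sigma**2 / eta)
    dt = T / n
    return [X * math.sinh(k * (T - j * dt)) / math.sinh(k * T)
            for j in range(n + 1)]

# Illustrative parameters (assumed): liquidate 1e6 shares over one day,
# in 20 slices, with risk aversion lam and temporary impact eta.
path = almgren_chriss_holdings(X=1_000_000, T=1.0, n=20,
                               sigma=0.3, eta=1e-6, lam=2e-6)
```

Higher risk aversion `lam` raises the urgency `k` and front-loads the schedule, the same aggressiveness role the paper assigns to the exponent p when the variance is replaced by the p-variation.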
By:  António Rua; Luís Catela Nunes 
Abstract:  The measurement of market risk poses major challenges to researchers and different economic agents. On one hand, it is by now widely recognized that risk varies over time. On the other hand, the risk profile of an investor, in terms of investment horizon, makes it crucial to also assess risk at the frequency level. We propose a novel approach to measuring market risk based on the continuous wavelet transform. Risk is allowed to vary both through time and at the frequency level within a unified framework. In particular, we derive the wavelet counterparts of wellknown measures of risk. One is thereby able to assess total risk, systematic risk and the importance of systematic risk to total risk in the timefrequency space. To illustrate the method we consider the emerging markets case over the last twenty years, finding noteworthy heterogeneity across frequencies and over time, which highlights the usefulness of the wavelet approach. 
JEL:  C40 F30 G15 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:ptu:wpaper:w201203&r=rmg 
By:  Daniel Alai (ARC Centre of Excellence in Population Ageing Research, Australian School of Business, University of New South Wales); Zinoviy Landsman (Department of Statistics, University of Haifa); Michael Sherris (School of Risk and Actuarial Studies and ARC Centre of Excellence in Population Ageing Research, Australian School of Business, University of New South Wales) 
Abstract:  Systematic improvements in mortality result in dependence in the survival distributions of insured lives. This is not allowed for in the standard life tables and actuarial models used for annuity pricing and reserving. Systematic longevity risk also undermines the law of large numbers, a law relied on in the risk management of life insurance and annuity portfolios. This paper applies a multivariate gamma distribution to incorporate dependence. Lifetimes are modelled using a truncated multivariate gamma distribution that induces dependence through a shared gamma-distributed component. Model parameter estimation is developed based on the method of moments and generalized to allow for truncated observations. The impact of dependence on the valuation of a portfolio, or cohort, of annuitants with similar risk characteristics is demonstrated by applying the model to annuity valuation. The dependence is shown to have a significant impact on the risk of the annuity portfolio compared with traditional actuarial methods that implicitly assume independent lifetimes. 
Keywords:  Systematic longevity risk, dependence, multivariate gamma, lifetime distribution, annuity valuation 
JEL:  G22 G32 C13 C02 
Date:  2012–05 
URL:  http://d.repec.org/n?u=RePEc:asb:wpaper:201211&r=rmg 
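The shared-component construction behind the multivariate gamma model is easy to simulate: each lifetime is the sum of a gamma variable common to the whole cohort and an idiosyncratic gamma variable. A sketch with illustrative parameters (not the paper's calibration, and without the truncation the paper works with) showing why the common shock defeats diversification:

```python
import random
from statistics import pstdev

def dependent_lifetimes(n, a0, a_i, rate, seed=1):
    """Simulate n lifetimes T_k = Y0 + Yk, where Y0 ~ Gamma(a0, 1/rate) is
    shared by every life and Yk ~ Gamma(a_i, 1/rate) is idiosyncratic; the
    shared Y0 makes lifetimes positively dependent across the cohort."""
    rng = random.Random(seed)
    scale = 1.0 / rate
    y0 = rng.gammavariate(a0, scale) if a0 > 0 else 0.0
    return [y0 + rng.gammavariate(a_i, scale) for _ in range(n)]

def annuity_payout(lifetimes, pay=1.0):
    """Undiscounted total payout of annuities paying `pay` per year lived."""
    return pay * sum(lifetimes)

# Same mean lifetime (80) in both cases; only the shared component differs.
dep = [annuity_payout(dependent_lifetimes(1000, 20, 60, 1.0, seed=s))
       for s in range(200)]
ind = [annuity_payout(dependent_lifetimes(1000, 0, 80, 1.0, seed=s))
       for s in range(200)]
```

Across the 200 simulated cohorts, payout dispersion is far larger with the shared component: the idiosyncratic part averages out over 1,000 lives, but the common shock does not, which is the law-of-large-numbers failure the abstract refers to.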
By:  Andrievskaya, Irina (BOFIT) 
Abstract:  The 2007-2009 global financial crisis demonstrated the need for effective systemic risk measurement and regulation. This paper proposes a straightforward approach for estimating the systemic funding liquidity risk in a banking system and identifying systemically critical banks. Focusing on the surplus of highly liquid assets above due payments, we find that systemic funding liquidity risk can be expressed as the distance of the aggregate liquidity surplus from its current level to its critical value. Calculations are performed using a simulated distribution of the aggregate liquidity surplus obtained via Independent Component Analysis. The systemic importance of banks is then assessed based on their contribution to the variation of the liquidity surplus in the system. We apply this methodology to the case of Russia, an emerging economy, to identify the current level of systemic funding liquidity risk and rank banks based on their systemic relevance. 
Keywords:  systemic risk; liquidity surplus; banking; Russia 
JEL:  G21 G28 P29 
Date:  2012–06–18 
URL:  http://d.repec.org/n?u=RePEc:hhs:bofitp:2012_012&r=rmg 
By:  Vic Norton 
Abstract:  We present an algorithm for the decomposition of periodic financial return data into orthogonal factors of expected return and "systemic", "productive", and "non-productive" risk. Generally, when the number of funds does not exceed the number of periods, the expected return of a portfolio is an affine function of its productive risk. 
Date:  2012–06 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1206.2333&r=rmg 
By:  Joseph Tracy; Joshua Wright 
Abstract:  This paper analyzes the relationship between changes in borrowers' monthly mortgage payments and future credit performance. This relationship is important for the design of an internal refinance program such as the Home Affordable Refinance Program (HARP). We use a competing risk model to estimate the sensitivity of default risk to downward adjustments of borrowers' monthly mortgage payments for a large sample of prime adjustablerate mortgages. Applying a 26 percent average monthly payment reduction that we estimate would result from refinancing under HARP, we find that the cumulative fiveyear default rate on prime conforming adjustablerate mortgages with loantovalue ratios above 80 percent declines by 3.8 percentage points. If we assume an average loss given default of 35.2 percent, this lower default risk implies reduced credit losses of 134 basis points per dollar of balance for mortgages that refinance under HARP. 
Keywords:  Adjustable rate mortgages ; Mortgages ; Default (Finance) ; Risk ; Credit 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:fip:fednsr:562&r=rmg 
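The paper's headline figure of 134 basis points follows directly from multiplying the estimated default-rate reduction by the assumed loss given default; a quick check of that arithmetic:

```python
# Figures quoted in the abstract above
default_rate_drop = 0.038    # 3.8 percentage point fall in the 5-year default rate
loss_given_default = 0.352   # 35.2 percent average loss given default

# Expected credit-loss reduction per dollar of balance, in basis points
saved_bp = default_rate_drop * loss_given_default * 10_000  # ~134 bp
```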
By:  K. Milanov; O. Kounchev 
Abstract:  In the present paper we fill an essential gap in the convertible-bond pricing world by deriving a Binary Tree-based model for valuation subject to credit risk. This model belongs to the framework known as Equity to Credit Risk. We show that this model converges in continuous time to the model developed by Ayache, Forsyth and Vetzal [2003]. To this end, both forms of credit risk modeling are considered: the so-called reduced form (constant intensity of default for the underlying) and the so-called synthesis form (variable intensity of default for the underlying). We highlight and quantify certain issues that arise, such as transition probability analysis and threshold values of model inputs (tree step, underlying stock price, etc.). This study may be considered an alternative way to develop the price dynamics model of Ayache et al. [2003] for convertible bonds in a credit risk environment. 
Date:  2012–06 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1206.1400&r=rmg 
By:  Elena Veprauskaite (School of Management, University of Bath); Michael Sherris (School of Risk and Actuarial Studies and ARC Centre of Excellence in Population Ageing Research, Australian School of Business, University of New South Wales) 
Abstract:  This paper considers optimal reinsurance based on an assessment of the reinsurance arrangements of a large life insurer. The objective is to determine the reinsurance structure, based on actual insurer data, using a modified mean-variance criterion that maximises retained premiums and minimises the variance of retained claims while keeping the retained risk exposure constant, assuming a given level of risk appetite. The portfolio of life and disability policies uses quota-share reinsurance, surplus reinsurance and a combination of the two. Alternative reinsurance arrangements are compared using the modified mean-variance criterion to assess the optimal reinsurance strategy. The analysis takes into account recent claims experience as well as the actual premiums paid by insured lives and to the reinsurers. Optimal reinsurance cover depends on many factors, including retention levels, premiums and the variance of sum-insured values (and therefore claims); as a result, an insurer should assess the trade-off between retained premiums and the variance of retained claims based on its own experience and risk appetite. 
Keywords:  Life insurance, optimal reinsurance, proportional reinsurance, mean-variance criterion 
JEL:  G22 G32 L21 
Date:  2012–03 
URL:  http://d.repec.org/n?u=RePEc:asb:wpaper:201204&r=rmg 
By:  Daniel Heller; Nicholas Vause 
Abstract:  By the end of 2012, all standardised over-the-counter (OTC) derivatives must be cleared with central counterparties (CCPs). In this paper, we estimate the amount of collateral that CCPs should demand to clear safely all interest rate swap and credit default swap positions of the major derivatives dealers. Our estimates are based on potential losses on a set of hypothetical dealer portfolios that replicate several aspects of the way that derivatives positions are distributed within and across dealer portfolios in practice. Our results suggest that major dealers already have sufficient unencumbered assets to meet initial margin requirements, but that some of them may need to increase their cash holdings to meet variation margin calls. We also find that default funds worth only a small fraction of dealers' equity appear sufficient to protect CCPs against almost all possible losses that could arise from the default of one or more dealers, especially if initial margin requirements take into account the tail risks and time variation in risk of cleared portfolios. Finally, we find that concentrating clearing of OTC derivatives in a single CCP could economise on collateral requirements without undermining the robustness of central clearing. 
Keywords:  central counterparties, clearing, collateral, derivatives, default funds, initial margins, variation margins 
Date:  2012–03 
URL:  http://d.repec.org/n?u=RePEc:bis:biswps:373&r=rmg 
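The initial margin sizing the paper studies is, at its core, a tail quantile of simulated close-out losses on a cleared portfolio. A minimal sketch with an assumed Gaussian P&L distribution (the paper's point is precisely that margins should also reflect tail risk beyond such a simple model):

```python
import random

def initial_margin(pnl_scenarios, coverage=0.995):
    """Initial margin as the loss quantile the CCP wants covered: the
    `coverage`-quantile of simulated close-out losses (loss = -P&L)."""
    losses = sorted(-p for p in pnl_scenarios)
    return losses[int(coverage * len(losses)) - 1]

# Illustrative close-out P&L scenarios for one cleared portfolio
# (assumed N(0, 1e6) per scenario; not calibrated to any dealer data)
rng = random.Random(7)
pnl = [rng.gauss(0, 1_000_000) for _ in range(10_000)]
im = initial_margin(pnl)
```

Variation margin, by contrast, is the day-to-day settlement of realized P&L, which is why it stresses cash holdings rather than unencumbered assets.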
By:  Carlos GonzálezAguado (BLUECAP); Enrique MoralBenito (Banco de España) 
Abstract:  Model uncertainty hampers consensus on the main determinants of corporate default. We employ Bayesian model averaging (BMA) techniques in order to shed light on this issue. Empirical findings suggest that the most robust determinants of corporate default are firm-specific variables such as the ratio of working capital to total assets, the ratio of retained earnings to total assets, the ratio of total liabilities to total assets and the standard deviation of the firm's stock return. In contrast, aggregate variables do not seem to play a relevant role once firm-specific characteristics (observable and unobservable) are taken into consideration. 
Keywords:  Default probabilities, Bayesian model averaging, Credit Risk 
JEL:  G33 C1 
Date:  2012–06 
URL:  http://d.repec.org/n?u=RePEc:bde:wpaper:1221&r=rmg 
By:  Christian Calmès (Chaire d'information financière et organisationnelle ESGUQAM, Laboratory for Research in Statistics and Probability, Université du Québec (Outaouais)); Raymond Théoret (Chaire d'information financière et organisationnelle ESGUQAM, Université du Québec (Montréal), Université du Québec (Outaouais)) 
Abstract:  Traditional leverage ratios assume that bank equity captures all changes in asset values. However, in the context of market-oriented banking, capital can be funded by additional debt or asset sales without directly influencing equity. Given the new sources of liquidity generated by off-balance-sheet (OBS) activities, time-varying indicators of leverage are better suited to capture the dynamics of aggregate leverage. In this paper, we introduce a Kalman filter procedure to study such elasticity-based measures of broad leverage. This approach enables the detection of the build-up in bank risk years before the traditional assets-to-equity ratio signals it. Most elasticity measures appear in line with the historical episodes, tracking well the cyclical pattern of leverage. Importantly, the degree of total leverage suggests that OBS banking exerts a stronger influence on leverage during expansion periods. 
Keywords:  Basel III; Banking stability; Macroprudential policy; Herding; Macroeconomic uncertainty. 
JEL:  C32 G20 G21 
Date:  2012–01–27 
URL:  http://d.repec.org/n?u=RePEc:pqs:wpaper:012012&r=rmg 
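The Kalman filter procedure the abstract refers to can be illustrated with the simplest time-varying-coefficient model: an elasticity that follows a random walk, filtered from noisy observations. A sketch on synthetic data (the state and noise variances below are assumed for illustration, not estimated from bank data):

```python
import random

def kalman_tv_elasticity(y, x, q, r, beta0=0.0, p0=1.0):
    """Filtered estimates of a time-varying elasticity beta_t in
    y_t = beta_t * x_t + eps_t, with beta_t a random walk
    (state noise variance q, observation noise variance r)."""
    beta, p = beta0, p0
    path = []
    for yt, xt in zip(y, x):
        p = p + q                            # predict: random-walk state
        k = p * xt / (xt * xt * p + r)       # Kalman gain
        beta = beta + k * (yt - xt * beta)   # update with the forecast error
        p = (1 - k * xt) * p
        path.append(beta)
    return path

# Synthetic check: the true elasticity drifts from 1.0 to 2.0,
# and the filter should track the drift through the noise.
rng = random.Random(0)
n = 400
true_beta = [1.0 + i / n for i in range(n)]
x = [rng.gauss(0, 1) for _ in range(n)]
y = [b * xi + rng.gauss(0, 0.1) for b, xi in zip(true_beta, x)]
est = kalman_tv_elasticity(y, x, q=1e-4, r=0.01)
```

In the paper's setting, y and x would be growth rates of balance-sheet aggregates, so the filtered coefficient reads as a time-varying elasticity rather than a fixed ratio.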
By:  Christian Calmès (Chaire d'information financière et organisationnelle ESGUQAM, Laboratory for Research in Statistics and Probability, Université du Québec (Outaouais)); Raymond Théoret (Chaire d'information financière et organisationnelle ESGUQAM, Université du Québec (Montréal), Université du Québec (Outaouais)) 
Abstract:  This paper investigates how banks, as a group, react to macroeconomic risk and uncertainty, and more specifically how banks' systemic risk evolves over the business cycle. Adopting the methodology of Beaudry et al. (2001), our results clearly suggest that the dispersion across banks' traditional portfolios has increased over time. We introduce an estimation procedure based on EGARCH and refine the framework of Baum et al. (2002, 2004, 2009) and Quagliariello (2007, 2009) to analyze the question in the new industry context, i.e. shadow banking. Consistent with finance theory, we first confirm that banks tend to behave homogeneously vis-à-vis macroeconomic uncertainty. In particular, we find that the cross-sectional dispersions of loans to assets and of non-traditional activities shrink essentially during downturns, when the resilience of the banking system is at its lowest. More importantly, our results also suggest that the cross-sectional dispersion of market-oriented activities is both more volatile and more sensitive to the business cycle than the dispersion of traditional activities. 
Keywords:  Banking stability; Macroprudential policy; Herding; Macroeconomic uncertainty; Markov switching regime; EGARCH. 
JEL:  C32 G20 G21 
Date:  2012–04–27 
URL:  http://d.repec.org/n?u=RePEc:pqs:wpaper:022012&r=rmg 
By:  Ahmedov, Zafarbek; Woodard, Joshua 
Abstract:  This study analyzes a stylized gasoline blender's optimal hedging strategy in the presence of ethanol mandates. In particular, the main objective is to investigate whether the ability to purchase RINs and the presence of tax incentives affect blenders' optimal hedging strategies. A multi-commodity hedging method with a Lower Partial Moments (LPM) hedging criterion as a measure of downside risk is used to obtain the optimal hedge ratios. The results indicate that Renewable Identification Number (RIN) purchases do not reduce risk and are therefore not a good risk management tool in the presence of blenders' tax credits. In the absence of the tax credit, however, RINs can be used as a risk management tool. 
Keywords:  Ethanol, RINs, hedging, LPM, Agribusiness, Agricultural Finance, Resource/Energy Economics and Policy, Risk and Uncertainty 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:ags:aaea12:124980&r=rmg 
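Unlike minimum-variance hedging, an LPM criterion penalizes only outcomes below a target, so the optimal ratio is typically found numerically. A hedged sketch on synthetic two-commodity returns (the return-generating parameters are illustrative, not the study's data):

```python
import random
from statistics import mean

def lpm(returns, target=0.0, order=2):
    """Lower partial moment: average shortfall below `target`, to a power."""
    return mean(max(target - r, 0.0) ** order for r in returns)

def lpm_optimal_hedge(spot, futures, ratios):
    """Hedge ratio minimizing the LPM of the hedged position
    spot - h * futures, by grid search over candidate ratios."""
    return min(ratios,
               key=lambda h: lpm([s - h * f for s, f in zip(spot, futures)]))

# Synthetic spot/futures returns sharing a common factor (illustrative)
rng = random.Random(42)
common = [rng.gauss(0, 0.02) for _ in range(5000)]
spot    = [c + rng.gauss(0, 0.005) for c in common]
futures = [0.8 * c + rng.gauss(0, 0.005) for c in common]
grid = [i / 100 for i in range(0, 201)]
h_star = lpm_optimal_hedge(spot, futures, grid)
```

With symmetric returns and a zero target, the LPM-2 minimizer is close to the minimum-variance ratio; the two diverge when the joint distribution is skewed, which is the point of using a downside-risk criterion.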
By:  Elena Andreou; Eric Ghysels; Constantinos Kourouyiannis 
Abstract:  Financial time series often undergo periods of structural change that yield biased estimates or forecasts of volatility, and thereby of risk management measures. We show that, in the context of GARCH diffusion models, ignoring structural breaks in the leverage coefficient and the constant can lead to biased and inefficient AR-RV and GARCH-type volatility estimates. Similarly, we find that volatility forecasts based on AR-RV and GARCH-type models that take structural breaks into account, by estimating the parameters only in the post-break period, significantly outperform those that ignore them. Hence, we propose a Flexible Forecast Combination method that takes into account not only information from different volatility models, but from different subsamples as well. The method consists of two main steps: first, it splits the estimation period into subsamples based on structural breaks detected by a change-point test; second, it forecasts volatility by weighting information from all subsamples so as to minimize a particular loss function, such as the Squared Error or QLIKE. An empirical application using the S&P 500 Index shows that our approach performs better, especially in periods of high volatility, than a large set of individual volatility models, simple averaging methods, and Forecast Combinations under Regime Switching. 
Keywords:  forecast, combinations, volatility, structural breaks 
Date:  2012–05 
URL:  http://d.repec.org/n?u=RePEc:ucy:cypeua:082012&r=rmg 
By:  Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Paris I Panthéon-Sorbonne, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics); Fatima Jouad (EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics, AXA GRM - AXA Group Risk Management) 
Abstract:  The advent of the Internal Model Approval Process within Solvency II, and the desire of many insurance companies to gain approval, has increased the importance of topics such as risk aggregation in determining the overall economic capital level. The most widely used approach for aggregating risks is the variance-covariance matrix approach. Although a relatively well-known concept that is computationally convenient, linear correlation fails to model every particularity of the dependence pattern between risks. In this paper we apply different pair-copula models to aggregate market risks, which usually represent an important part of an insurer's risk profile. We then calculate the economic capital needed to withstand unexpected future losses and the associated diversification benefits. The economic capital is determined by computing both the 99.5% VaR and the 99.5% ES, following the requirements of Solvency II and the SST. 
Keywords:  Solvency II, risk aggregation, market risks, pair-copulas, economic capital, diversification gains 
Date:  2012–05 
URL:  http://d.repec.org/n?u=RePEc:hal:cesptp:halshs00706689&r=rmg 
By:  Howard Kunreuther; Geoffrey Heal 
Abstract:  A principal reason that losses from catastrophic risks have been increasing over time is that more individuals and firms are locating in harm's way while not taking appropriate protective measures. Several behavioural biases lead decision-makers not to invest in adaptation measures until after it is too late. In an interdependent world with no intervention by the public sector, it may be economically rational for those at risk not to invest in protective measures. Risk management strategies that involve private-public partnerships addressing these issues may help in reducing future catastrophic losses. These may include multi-year insurance contracts, well-enforced regulations, third-party inspections, and alternative risk transfer instruments such as catastrophe bonds. 
JEL:  D62 D80 D85 H20 
Date:  2012–06 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:18136&r=rmg 
By:  Fraser, Rob W. 
Abstract:  Motivated by recent EC proposals to "strengthen risk management tools" in the CAP in relation to farmers' increased exposure to market price risk, this paper draws attention to a potential negative consequence of such a change in the CAP: an associated increase in cheating behaviour by farmers in the context of environmental stewardship. A theoretical framework for this policy problem is developed and used not just to illustrate the problem, but also to propose a solution: specifically, to combine the introduction of CAP-supported policy changes which reduce farmers' exposure to market-based risk with changes in environmental stewardship policies which increase the riskiness of cheating and thereby discourage such behaviour. 
Keywords:  Demand and Price Analysis, Environmental Economics and Policy, Risk and Uncertainty, 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:ags:aare12:124305&r=rmg 