New Economics Papers on Risk Management
Issue of 2014‒09‒29
twelve papers chosen by
By: Toshinao Yoshiba (Bank of Japan)
Abstract: This paper examines the marginal distributions of stocks and bonds, and the copula between movements in stock prices and interest rates. Because some widely used aggregation methods, such as variance-covariance, tend to underestimate the risk of an aggregated portfolio, a copula is utilized for risk aggregation; unlike a linear correlation, it captures dependencies in both the median and the tails of the marginal distributions. In this study, various types of copula, including one that simultaneously captures both positive and negative linear correlations, are analyzed over several time periods. We examine data related to the Euro crisis and the post-bubble period in Japan. Our analyses show that widely used risk aggregation methods may overestimate the diversification effect.
Keywords: copula; multivariate distribution; tail dependency; risk aggregation; economic capital
Date: 2013–09–24
URL: http://d.repec.org/n?u=RePEc:boj:bojwps:13-e-12&r=rmg
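As a concrete illustration of why the dependence structure matters, the following minimal sketch (not the paper's code; the marginals, correlation, and confidence level are illustrative assumptions) compares portfolio tail risk under a Gaussian copula and under a Student-t copula with the same linear correlation; the t copula's tail dependence yields a larger 99% VaR.

```python
# Sketch: Gaussian vs. Student-t copula with identical linear correlation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, rho, nu = 500_000, 0.5, 4             # sample size, correlation, t d.o.f.
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))

def copula_uniforms(kind):
    """Draw correlated uniforms from a Gaussian or Student-t copula."""
    z = L @ rng.standard_normal((2, n))
    if kind == "t":                       # scale mixture of normals -> t copula
        w = rng.chisquare(nu, n) / nu
        return stats.t.cdf(z / np.sqrt(w), df=nu)
    return stats.norm.cdf(z)

for kind in ("gaussian", "t"):
    u = copula_uniforms(kind)
    # identical fat-tailed marginals (t with 3 d.o.f.) for the two asset losses
    losses = stats.t.ppf(u, df=3).sum(axis=0)
    print(f"{kind:8s} copula: portfolio 99% VaR = {np.quantile(losses, 0.99):.2f}")
```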
By: van der Hoorn, S.; Knapp, S.
Abstract: Shipping activity has increased worldwide, and maritime administrations are trying to enhance risk mitigation strategies by using proactive approaches. We present and discuss a conceptual framework to minimize potential harm based on a multi-layered approach which can be implemented either in real time for operational purposes or in prediction mode for medium- or longer-term strategic planning. We introduce the concept of total risk exposure, which integrates risk at the individual ship level with vessel traffic densities and location-specific parameters such as weather and oceanographic conditions, geographical features, or environmental sensitivities. A comprehensive and robust method to estimate and predict risk exposure can help maritime administrations enhance mitigation strategies and understand uncertainties. We further provide a proof of concept based on 53 million observations of vessel positions and individual risk profiles of 8,900 individual ships. We present examples of how endpoints can be visualized for two integrated risk layers – ship-specific risk and vessel traffic densities. We further identify and discuss uncertainties and present our ideas on how other risk layers could be integrated in the future.
Keywords: econometrics, shipping industry
JEL: C10
Date: 2014–08–01
URL: http://d.repec.org/n?u=RePEc:ems:eureir:51748&r=rmg
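The layered construction can be pictured with a toy example; the sketch below (assumed layer names, grid, and multipliers, not the authors' implementation) combines a ship-level risk profile with traffic density and a weather factor on a spatial grid to flag exposure hotspots.

```python
# Toy "total risk exposure" grid: ship risk x traffic density x conditions.
import numpy as np

rng = np.random.default_rng(1)
grid = (50, 50)                              # toy spatial grid over a sea area
ship_risk = rng.uniform(0.0, 1.0, grid)     # mean ship-level risk per cell
traffic = rng.poisson(5.0, grid)            # vessel count per cell
weather = rng.uniform(1.0, 2.0, grid)       # exposure multiplier (e.g., sea state)

exposure = ship_risk * traffic * weather    # combined risk layers per cell
flat = np.argsort(exposure, axis=None)[-5:] # five highest-exposure cells
rows, cols = np.unravel_index(flat, grid)
print("top-5 exposure cells:", list(zip(rows.tolist(), cols.tolist())))
```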
By: Bertrand Hassani (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Paris I - Panthéon-Sorbonne; Santander UK); Alexis Renaudin (Aon GRC - Aon Global Risk Consulting)
Abstract: According to the latest proposals of the Basel Committee on Banking Supervision, banks under the Advanced Measurement Approach (AMA) must use four different sources of information to assess their Operational Risk capital requirement. With the fourth covering "business environment and internal control factors", i.e. qualitative criteria, the three main quantitative sources available to banks to build the Loss Distribution are Internal Loss Data, External Loss Data, and Scenario Analysis. This paper proposes an innovative methodology to bring these three sources together in the Loss Distribution Approach (LDA) framework through a Bayesian strategy. The integration of the different elements is performed in two steps to ensure that an internal-data-driven model is obtained. In the first step, scenarios inform the prior distributions and external data inform the likelihood component of the posterior function. In the second step, the initial posterior function is used as the prior distribution and the internal loss data inform the likelihood component of the second posterior. This latter posterior function enables the estimation of the parameters of the severity distribution selected to represent the Operational Risk event types.
Keywords: Operational Risk; Loss Distribution Approach; Bayesian inference; Markov Chain Monte Carlo; Extreme Value Theory; non-parametric statistics; risk measures
Date: 2013–02
URL: http://d.repec.org/n?u=RePEc:hal:journl:halshs-00795046&r=rmg
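The two-step updating scheme can be illustrated with a simple conjugate stand-in for the paper's MCMC machinery. In the sketch below (fabricated scenario, external, and internal data; a lognormal severity with known log-scale is assumed for tractability), the step-one posterior becomes the step-two prior, so the final estimate is driven by internal data.

```python
# Conjugate normal-normal sketch of two-step Bayesian severity estimation.
import numpy as np

sigma = 1.2                                   # assumed known severity log-sd

def normal_update(mu0, tau0, data, sigma):
    """Posterior of mu under a N(mu0, tau0^2) prior and N(mu, sigma^2) data."""
    prec = 1 / tau0**2 + len(data) / sigma**2
    mu_post = (mu0 / tau0**2 + data.sum() / sigma**2) / prec
    return mu_post, np.sqrt(1 / prec)

rng = np.random.default_rng(2)
mu0, tau0 = 10.0, 2.0                         # prior elicited from scenarios
external = rng.normal(10.5, sigma, 200)       # log external losses (fake)
internal = rng.normal(9.8, sigma, 50)         # log internal losses (fake)

mu1, tau1 = normal_update(mu0, tau0, external, sigma)  # step 1: scenarios + external
mu2, tau2 = normal_update(mu1, tau1, internal, sigma)  # step 2: result as prior + internal
print(f"step 1 posterior: mu = {mu1:.3f} (sd {tau1:.3f})")
print(f"step 2 posterior: mu = {mu2:.3f} (sd {tau2:.3f})")
```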
By: Massimiliano Caporin (University of Padova); Eduardo Rossi (University of Pavia); Paolo Santucci de Magistris (Aarhus University and CREATES)
Abstract: The realized volatility of financial returns is characterized by persistence and by the occurrence of unpredictable large increments. To capture these features, we introduce the Multiplicative Error Model with jumps (MEM-J). When a jump component is included in the multiplicative specification, the conditional density of the realized measure is shown to be a countably infinite mixture of Gamma and K distributions. Strict stationarity conditions are derived. A Monte Carlo simulation experiment shows that maximum likelihood estimates of the model parameters are reliable even when jumps are rare events. We estimate alternative specifications of the model using a set of daily bipower measures for 7 stock indexes and 16 individual NYSE stocks. The estimates of the jump component confirm that the probability of jumps increases dramatically during financial crises. Compared to other realized volatility models, the introduction of the jump component provides a sensible improvement in the fit, as well as in in-sample and out-of-sample volatility tail forecasts.
Keywords: Multiplicative Error Model with Jumps, Jumps in volatility, Realized measures, Volatility-at-Risk
JEL: C22 C58 G01
Date: 2014–08–29
URL: http://d.repec.org/n?u=RePEc:aah:create:2014-29&r=rmg
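A stylized simulation conveys the mechanics of a multiplicative error model with rare jumps; the sketch below is an assumed simplification for intuition, not the paper's exact MEM-J specification or its mixture density.

```python
# Stylized MEM with jumps: x_t = mu_t * eps_t * J_t, Gamma errors with unit mean.
import numpy as np

rng = np.random.default_rng(3)
T, omega, alpha, beta = 2000, 0.05, 0.25, 0.70
a, p_jump, jump_scale = 4.0, 0.02, 5.0        # error shape, jump prob., jump size

x = np.empty(T)
mu = omega / (1 - alpha - beta)               # start at the unconditional mean
for t in range(T):
    eps = rng.gamma(a, 1 / a)                 # unit-mean multiplicative error
    jump = 1 + jump_scale * rng.exponential() if rng.random() < p_jump else 1.0
    x[t] = mu * eps * jump
    mu = omega + alpha * x[t] + beta * mu     # MEM recursion for the next period

print("mean:", x.mean().round(3), " max/mean:", (x.max() / x.mean()).round(1))
```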
By: Minamihashi, Naoaki; Wakamori, Naoki
Abstract: We estimate an investor demand model for hedge funds to analyze the potential impact of leverage limits in the industry. Our estimation results highlight the importance of heterogeneous investor preferences for the use of leverage: 20% of investors prefer leveraged funds while the others do not. We then conduct a policy simulation in which regulators put a cap on allowable leverage, as proposed by the Financial Stability Board in 2012. Simulation results suggest that a 200% leverage limit would lower total demand (assets under management) for hedge funds by 10%. In particular, the regulation would lead to lower investments in highly leveraged funds and in risky strategies, which, in turn, would reduce systemic risk.
Keywords: hedge funds; demand estimation; leverage; regulation; systemic risk
JEL: G38 G23 L52
Date: 2014–09–01
URL: http://d.repec.org/n?u=RePEc:trf:wpaper:473&r=rmg
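The flavor of the counterfactual can be reproduced with a toy logit demand system; the sketch below (fake fund data and assumed taste parameters, not the authors' structural model) caps fund leverage at 200% and recomputes aggregate demand when 20% of investors value leverage.

```python
# Toy policy simulation: logit demand with heterogeneous leverage preference.
import numpy as np

rng = np.random.default_rng(4)
J = 100
leverage = rng.uniform(0.0, 6.0, J)           # fund leverage (x100 = percent)
quality = rng.normal(0.0, 1.0, J)             # other fund characteristics

def total_demand(lev):
    demand = 0.0
    for taste, weight in ((0.5, 0.2), (0.0, 0.8)):   # leverage lovers vs. the rest
        v = quality + taste * lev             # indirect utility per fund
        shares = np.exp(v) / (1 + np.exp(v).sum())   # outside option utility = 0
        demand += weight * shares.sum()
    return demand

capped = np.minimum(leverage, 2.0)            # the proposed 200% leverage limit
base, policy = total_demand(leverage), total_demand(capped)
print(f"demand falls by {(1 - policy / base) * 100:.1f}% under the cap")
```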
By: Zura Kakushadze; Jim Kyung-Soo Liew
Abstract: We discuss when and why custom multi-factor risk models are warranted and give source code for computing some risk factors. Pension and mutual funds require standardization, not customization. However, using standardized risk models in quant trading with much shorter holding horizons is suboptimal: 1) longer horizon risk factors (value, growth, etc.) increase noise trades and trading costs; 2) arbitrary risk factors can neutralize alpha; 3) "standardized" industries are artificial and insufficiently granular; 4) normalization of style risk factors is lost for the trading universe; 5) diversifying risk models lowers P&L correlations, reduces turnover and market impact, and increases capacity. We discuss various aspects of custom risk model building.
Date: 2014–09
URL: http://d.repec.org/n?u=RePEc:arx:papers:1409.2575&r=rmg
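In the spirit of point 4) about universe-specific normalization, a style exposure such as momentum can be computed and re-standardized within the trading universe itself; the sketch below is a generic example, not the source code the authors provide.

```python
# Generic style factor: 12-month momentum, z-scored within the trading universe.
import numpy as np

def momentum_factor(prices: np.ndarray) -> np.ndarray:
    """prices: (T, N) daily closes; returns one normalized exposure per stock."""
    lookback = min(252, prices.shape[0] - 1)      # ~12 months of trading days
    raw = prices[-1] / prices[-1 - lookback] - 1  # trailing total return
    z = (raw - raw.mean()) / raw.std()            # cross-sectional z-score
    return np.clip(z, -3, 3)                      # winsorize outliers

rng = np.random.default_rng(5)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, (300, 50)), axis=0))
print("momentum exposures:", momentum_factor(prices)[:5].round(2))
```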
By: Takashi Isogai (Bank of Japan)
Abstract: This paper analyzes Value at Risk (VaR) and Expected Shortfall (ES) calculation methods in terms of bias and dispersion against benchmarks computed from a fat-tailed parametric distribution. The daily log returns of the Nikkei-225 stock index are modeled by a truncated stable distribution. The VaR and ES values of the fitted distribution are regarded as benchmarks. The fitted distribution is also used as a sampling distribution; sample returns with different sizes are generated for the simulations of the VaR and ES calculations. Two parametric methods (normal distribution and generalized Pareto distribution) and two non-parametric methods (historical simulation and kernel smoothing) are selected as the targets of this analysis. A comparison of the simulated VaR, ES, and the ES/VaR ratio with the benchmarks at multiple confidence levels reveals that the normal distribution approximation has a significant downward bias, especially in the ES calculation. The estimates by the other three methods are much closer to the benchmarks on average, although some of them become unstable with smaller sample sizes and/or at higher confidence levels. Specifically, ES tends to be more biased and unstable than VaR at higher confidence levels.
Keywords: Value at Risk; Expected Shortfall; Fat-Tailed Distribution; Truncated Stable Distribution; Numerical Simulation
Date: 2014–01–17
URL: http://d.repec.org/n?u=RePEc:boj:bojwps:wp14e01&r=rmg
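The downward bias being measured is easy to reproduce in miniature; in the sketch below, a Student-t sample stands in for the paper's truncated stable benchmark (levels and sizes are illustrative), and the normal approximation's ES falls visibly short of the historical-simulation estimate.

```python
# Normal approximation vs. historical simulation for VaR and ES on fat tails.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
alpha, nu = 0.99, 3
returns = rng.standard_t(nu, 10_000)          # fat-tailed daily losses (toy data)

# normal approximation fitted by moments
mu, sd = returns.mean(), returns.std()
z = stats.norm.ppf(alpha)
var_n = mu + sd * z
es_n = mu + sd * stats.norm.pdf(z) / (1 - alpha)   # closed-form normal ES

# historical simulation
var_h = np.quantile(returns, alpha)
es_h = returns[returns >= var_h].mean()            # mean loss beyond the VaR

print(f"VaR  normal {var_n:6.2f} | historical {var_h:6.2f}")
print(f"ES   normal {es_n:6.2f} | historical {es_h:6.2f}")
```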
By: Kazemi, Maziar (Board of Governors of the Federal Reserve System (U.S.)); Islamaj, Ergys (Vassar College)
Abstract: Do more active hedge fund managers generate higher returns than their less active peers? Using Kalman filter techniques, we estimate the risk exposure dynamics of a large sample of live and dead equity long-short hedge funds. These estimates are then used to develop a measure of activeness for each hedge fund. Our results show that there exists a nonlinear relationship between activeness and performance. Using raw returns as a measure of performance, we find that more active funds outperform the less active ones. However, when risk-adjusted returns are used to measure performance, we find the opposite result: activeness is inversely related to returns. Still, a few very active managers outperform the moderately active funds and generate higher returns. We conclude that the most active managers use their skills to manage the riskiness of their portfolios and are, therefore, able to provide higher risk-adjusted returns. Finally, we find that, compared to the least active managers, the most active managers are less homogeneous, so due diligence is far more important when selecting an active manager.
Keywords: Hedge funds; Fama-French; active management; dynamic trading
JEL: G11 G12 G14 G23
Date: 2014–08–08
URL: http://d.repec.org/n?u=RePEc:fip:fedgif:1112&r=rmg
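The filtering step can be sketched in a few lines; the example below assumes a single market factor and a random-walk beta (the paper's multi-factor setup is richer) and scores "activeness" as the variability of the filtered exposure.

```python
# Kalman filter for a time-varying beta: r_fund[t] = beta[t]*r_mkt[t] + noise.
import numpy as np

def filter_beta(r_fund, r_mkt, q=1e-4, r=1e-3):
    """Scalar Kalman filter; beta follows a random walk with variance q."""
    beta, p = 1.0, 1.0                        # initial state mean and variance
    betas = []
    for y, x in zip(r_fund, r_mkt):
        p += q                                # predict: beta_t = beta_{t-1} + w_t
        k = p * x / (x * p * x + r)           # Kalman gain for observation y
        beta += k * (y - x * beta)            # update with the forecast error
        p *= (1 - k * x)
        betas.append(beta)
    return np.array(betas)

rng = np.random.default_rng(7)
r_mkt = rng.normal(0, 0.01, 1000)
true_beta = 1 + 0.5 * np.sin(np.linspace(0, 8, 1000))   # an "active" exposure path
r_fund = true_beta * r_mkt + rng.normal(0, 0.002, 1000)
betas = filter_beta(r_fund, r_mkt)
print("activeness (sd of filtered beta):", betas.std().round(3))
```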
By: Maiya Anokhina (National Research University Higher School of Economics, Moscow); Henry Penikas (National Research University Higher School of Economics, Moscow); Victor Petrov (National Research University Higher School of Economics, Moscow)
Abstract: The increased role of financial institutions in the economy leads to a need to determine those that are systemically important, since the bankruptcy of such institutions creates negative effects for the economy on a global scale. The aim of this article is to identify financial coefficients that can be used in the methodology for identifying global systemically important banks (G-SIBs) and insurers (G-SIIs). Binary choice and ordered choice models are used in this article; several of them are highly predictive. The paper also identifies several financial coefficients that help estimate the probability of systemic importance for Russian banks and insurance companies.
Keywords: Systemic importance; Basel Committee; probability of default; financial coefficients; ordered choice models; binary choice models; global systemically important banks (G-SIB); insurance companies
JEL: C70 E58 G21
Date: 2014–09
URL: http://d.repec.org/n?u=RePEc:pav:demwpp:085&r=rmg
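The binary choice step has a standard form; the sketch below (fabricated ratios and labels, purely to show the model form, not the paper's data or variable list) fits a logit of systemic-importance status on bank-level financial coefficients.

```python
# Schematic binary choice model: logit of G-SIB status on financial ratios.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 300
X = np.column_stack([
    rng.normal(0, 1, n),                      # size (standardized log assets)
    rng.normal(0, 1, n),                      # interconnectedness proxy
    rng.normal(0, 1, n),                      # cross-border activity proxy
])
logit_p = 1 / (1 + np.exp(-(-2 + 1.5 * X[:, 0] + 0.8 * X[:, 1])))
y = (rng.random(n) < logit_p).astype(float)   # simulated G-SIB designations

model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
print(model.params.round(2))                  # estimated coefficient vector
```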
By: Chen, Cathy W.S.; Gerlach, Richard; Lin, Edward M.H.
Abstract: Methods for Bayesian testing and assessment of dynamic quantile forecasts are proposed. Specifically, Bayes factor analogues of popular frequentist tests for independence of violations and for correct coverage of a time series of quantile forecasts are developed. To evaluate the relevant marginal likelihoods, analytic integration methods are utilised when possible; otherwise, multivariate adaptive quadrature methods are employed to estimate the required quantities. The usual Bayesian interval estimate for a proportion is also examined in this context. The size and power properties of the proposed methods are examined via a simulation study, illustrating favourable comparisons both overall and with their frequentist counterparts. An empirical study employs the proposed methods, in comparison with standard tests, to assess the adequacy of a range of forecasting models for Value at Risk (VaR) in several financial market data series.
Keywords: quantile regression; Value-at-Risk; asymmetric-Laplace distribution; Bayes factor; Bayesian hypothesis testing
Date: 2014–09–10
URL: http://d.repec.org/n?u=RePEc:syb:wpbsba:2123/11816&r=rmg
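For the coverage test, a Bayes factor has a closed form under a Binomial likelihood with a Beta prior, consistent with the paper's use of analytic integration where possible; the uniform prior under the alternative in the sketch below is an assumption, and the violation counts are illustrative.

```python
# Bayes factor for correct unconditional coverage of VaR violations.
import numpy as np
from scipy.special import betaln

def coverage_bayes_factor(violations, n, p0, a=1.0, b=1.0):
    """BF01: Binomial(n, p0) vs. p integrated against a Beta(a, b) prior.
    The binomial coefficient cancels in the ratio, so it is omitted."""
    x = violations
    log_m0 = x * np.log(p0) + (n - x) * np.log1p(-p0)   # H0 marginal likelihood
    log_m1 = betaln(a + x, b + n - x) - betaln(a, b)    # H1, integrated analytically
    return np.exp(log_m0 - log_m1)

# 12 violations over 1000 days for a 99% VaR model (illustrative numbers)
print("BF01 =", round(coverage_bayes_factor(12, 1000, 0.01), 2))
```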
By: Thomas Brand; Fabien Tripier
Abstract: Why have the Euro area and the US diverged since 2011, when they were highly synchronized during the recession of 2008-2009? To explain this divergence, we provide a structural interpretation of these episodes through the estimation of a business cycle model with financial frictions for both economies. Our results show that risk shocks, measured as the volatility of idiosyncratic uncertainty in the financial sector, have played a crucial role in the divergence, given the absence of a risk reversal in the Euro area. Risk shocks have stimulated US credit and investment growth since the trough of 2009, whereas they have been at the origin of the double-dip recession in the Euro area. A companion website is available at http://shiny.cepii.fr/risk-shocks-and-divergence.
Keywords: Great recession; Business cycles; Uncertainty; Divergence; Risk shocks
JEL: E3 E4 G3
Date: 2014–07
URL: http://d.repec.org/n?u=RePEc:cii:cepidt:2014-11&r=rmg
By: Rohini Grover (Indira Gandhi Institute of Development Research); Ajay Shah (National Institute of Public Finance and Policy)
Abstract: Concerns about sampling noise arise when a VIX estimator is computed by aggregating several imprecise implied volatility estimates. We propose a bootstrap strategy to measure the imprecision of a model-based VIX estimator. We find that the imprecision of VIX is economically significant. We propose a model selection strategy, where alternative statistical estimators of VIX are evaluated on the basis of this imprecision.
Keywords: Implied volatility, volatility index, imprecision
JEL: G12 G13 G17
Date: 2014–08
URL: http://d.repec.org/n?u=RePEc:ind:igiwpp:2014-031&r=rmg
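The bootstrap itself is straightforward; in the sketch below, a simple weighted average of implied volatilities stands in for an actual VIX formula, and the option quotes are fabricated: resampling the option set with replacement gives a distribution of index estimates whose spread measures imprecision.

```python
# Bootstrap band for a VIX-style estimate built from noisy implied vols.
import numpy as np

rng = np.random.default_rng(9)
iv = rng.normal(0.20, 0.03, 40)               # implied vols of 40 quoted options
vega = rng.uniform(0.5, 2.0, 40)              # weights (e.g., vegas)

def vix_like(iv, w):
    return 100 * np.average(iv, weights=w)    # toy stand-in for a VIX formula

boot = np.empty(5000)
for b in range(5000):
    idx = rng.integers(0, len(iv), len(iv))   # resample options with replacement
    boot[b] = vix_like(iv[idx], vega[idx])

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"index = {vix_like(iv, vega):.2f}, 95% bootstrap band = [{lo:.2f}, {hi:.2f}]")
```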