on Risk Management
Issue of 2011–03–12
thirteen papers chosen by
By: | Michael McAleer (Econometrisch Instituut (Econometric Institute), Faculteit der Economische Wetenschappen (Erasmus School of Economics), Erasmus Universiteit, Tinbergen Instituut (Tinbergen Institute)); Juan-Ángel Jiménez-Martín (Departamento de Economía Cuantitativa (Department of Quantitative Economics), Facultad de Ciencias Económicas y Empresariales (Faculty of Economics and Business), Universidad Complutense de Madrid); Chia-Lin Chang (NCHU Department of Applied Economics (Taiwan)); Teodosio Pérez-Amaral (Departamento de Economía Cuantitativa (Department of Quantitative Economics), Facultad de Ciencias Económicas y Empresariales (Faculty of Economics and Business), Universidad Complutense de Madrid)
Abstract: | The Basel II Accord requires that banks and other Authorized Deposit-taking Institutions (ADIs) communicate their daily risk forecasts to the appropriate monetary authorities at the beginning of each trading day, using one or more risk models to measure Value-at-Risk (VaR). The risk estimates of these models are used to determine capital requirements and associated capital costs of ADIs, depending in part on the number of previous violations, whereby realised losses exceed the estimated VaR. McAleer, Jimenez-Martin and Perez-Amaral (2009) proposed a new approach to model selection for predicting VaR, consisting of combining alternative risk models, and comparing conservative and aggressive strategies for choosing between VaR models. This paper addresses the question of risk management of risk, namely VaR of VIX futures prices. We examine how different risk management strategies performed during the 2008-09 global financial crisis (GFC). We find that an aggressive strategy of choosing the Supremum of the single model forecasts is preferred to the other alternatives, and is robust during the GFC. However, this strategy implies relatively high numbers of violations and accumulated losses, though these are admissible under the Basel II Accord.
Keywords: | Median strategy, Value-at-Risk (VaR), daily capital charges, violation penalties, optimizing strategy, aggressive risk management, conservative risk management, Basel II Accord, VIX futures, global financial crisis (GFC).
JEL: | G32 G11 C53 C22 |
Date: | 2011 |
URL: | http://d.repec.org/n?u=RePEc:ucm:doicae:1102&r=rmg |
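A minimal illustrative sketch (not the authors' code) of the strategy comparison described in the entry above: given several single-model VaR forecasts for one day, an aggressive strategy takes their supremum (the least negative forecast, hence the lowest capital charge), a conservative strategy their infimum, and the median strategy the middle value. The model names, the numbers and the sign convention (VaR reported as a negative return) are assumptions for illustration only.

    import numpy as np

    # Hypothetical single-model VaR forecasts for one trading day, expressed as
    # negative returns; model names and values are illustrative only.
    var_forecasts = {"GARCH": -2.4, "EGARCH": -2.9, "GJR": -3.3, "Riskmetrics": -2.1}
    vals = np.array(list(var_forecasts.values()))

    strategies = {
        "supremum (aggressive)": vals.max(),     # least negative: lower capital charge, more violations
        "median": np.median(vals),               # middle-of-the-road strategy
        "infimum (conservative)": vals.min(),    # most negative: higher capital charge, fewer violations
    }
    for name, var in strategies.items():
        print(f"{name:24s} VaR forecast: {var:.2f}%")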
By: | Thomas Conlon (University College Dublin); John Cotter (University College Dublin) |
Abstract: | This paper investigates the hedging effectiveness of a dynamic moving window OLS hedging model, formed using wavelet decomposed time-series. The wavelet transform is applied to calculate the appropriate dynamic minimum-variance hedge ratio for various hedging horizons for a number of assets. The effectiveness of the dynamic multiscale hedging strategy is then tested, both in- and out-of-sample, using standard variance reduction, expanded to include a downside risk metric, the time-horizon-dependent Value-at-Risk. Measured using variance reduction, the effectiveness converges to one at longer scales, while a measure of VaR reduction indicates that a portion of residual risk remains at all scales. Analysis of the hedge portfolio distributions indicates that this unhedged tail risk is related to excess portfolio kurtosis found at all scales.
Date: | 2011–03–02 |
URL: | http://d.repec.org/n?u=RePEc:ucd:wpaper:201104&r=rmg |
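A minimal sketch of the multiscale minimum-variance hedge ratio idea from the entry above, assuming synthetic spot and futures series and PyWavelets' standard discrete wavelet transform; the paper's particular decomposition, moving-window estimation and data are not reproduced. At each detail level the hedge ratio is the covariance of the decomposed spot and futures returns divided by the variance of the decomposed futures returns.

    import numpy as np
    import pywt  # PyWavelets, assumed available

    rng = np.random.default_rng(0)
    n = 512
    futures = np.cumsum(rng.normal(0, 1.0, n))         # synthetic futures price level
    spot = futures + np.cumsum(rng.normal(0, 0.3, n))  # correlated synthetic spot price
    ds, df = np.diff(spot), np.diff(futures)           # returns as first differences

    # Discrete wavelet decomposition of both return series (Haar, 4 levels).
    coeffs_s = pywt.wavedec(ds, "haar", level=4)
    coeffs_f = pywt.wavedec(df, "haar", level=4)

    # Skip the approximation coefficients (index 0); detail levels run coarse -> fine.
    for j, (d_s, d_f) in enumerate(zip(coeffs_s[1:], coeffs_f[1:]), start=1):
        c = np.cov(d_s, d_f)
        print(f"detail level {j}: minimum-variance hedge ratio = {c[0, 1] / c[1, 1]:.3f}")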
By: | William T. Shaw |
Abstract: | We show how to reduce the problem of computing VaR and CVaR with Student T return distributions to the evaluation of analytical functions of the moments. This allows the risk properties of a system to be carefully attributed among the choice of risk function (e.g. VaR vs CVaR), the choice of return distribution (power-law tail vs Gaussian), and the choice of event frequency, for risk assessment. We exploit this to provide a simple method for portfolio optimization when the asset returns follow a standard multivariate T distribution. This may be used as a semi-analytical verification tool for more general optimizers, and for practical assessment of the impact of fat tails on asset allocation for shorter time horizons.
Date: | 2011–02 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1102.5665&r=rmg |
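A minimal sketch of the analytical reduction for a single asset, assuming a location-scale Student-t return distribution and the standard closed-form expected-shortfall expression; the portfolio-optimization part of the paper is not reproduced, and alpha denotes the lower-tail probability.

    import numpy as np
    from scipy import stats

    def student_t_var_cvar(alpha, nu, mu=0.0, sigma=1.0):
        """Analytical VaR and CVaR (expected shortfall) of a location-scale
        Student-t return distribution, reported as positive losses."""
        q = stats.t.ppf(alpha, df=nu)                      # lower-tail quantile of the standard t
        var = -(mu + sigma * q)
        es_std = stats.t.pdf(q, df=nu) / alpha * (nu + q**2) / (nu - 1)
        return var, -mu + sigma * es_std

    for nu in (3, 5, 30):
        v, c = student_t_var_cvar(0.01, nu)
        print(f"nu={nu:2d}: VaR(1%)={v:.3f}  CVaR(1%)={c:.3f}")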
By: | Éric Tymoigne |
Abstract: | With the Great Recession and the regulatory reform that followed, the search for reliable means to capture systemic risk and to detect macrofinancial problems has become a central concern. In the United States, this concern has been institutionalized through the Financial Stability Oversight Council, which has been put in charge of detecting threats to the financial stability of the nation. Based on Hyman Minsky's financial instability hypothesis, the paper develops macroeconomic indexes for three major economic sectors. These indexes provide a means to detect the speed with which financial fragility accrues, and its duration, and serve as a complement to the microprudential policies of regulators and supervisors. The paper shows, notably, that periods of economic stability during which default rates are low, profitability is high, and net worth is accumulating are fertile grounds for the growth of financial fragility.
Keywords: | Financial Fragility; Financial Regulation; Financial Crises; Macroprudential Risk; Debt-Deflation Process; Ponzi Finance |
JEL: | E32 G18 G28 G38 |
Date: | 2011–03 |
URL: | http://d.repec.org/n?u=RePEc:lev:wrkpap:wp_654&r=rmg |
By: | João P. da Cruz; Pedro G. Lind
Abstract: | The social role of any company is to achieve the maximum profitability with the least risk. Under Basel III, banks should now raise their minimum capital levels on an individual basis, with the aim of lowering the probability that a large crash occurs. Such implementation assumes that, with higher minimum capital levels, it becomes less probable that the value of the assets drops below the minimum level, and consequently expects the number of bank defaults to drop as well. We present evidence that in this new financial reality large crashes are avoided only if one assumes that banks will quietly accept the drop in business levels, which runs against their nature. Our perspective stems from statistical physics and gives hints for improving banking-system resilience. Stock markets exhibit critical behavior and scaling features, showing a power law for the amplitude of financial crises. By modeling a financial network where critical behavior naturally emerges, it is possible to show that banking-system resilience is not favored by raising the levels of capital. Due to the complex nature of the financial network, only the probability of bank default is affected, and not the magnitude of a money market crisis. Further, assuming that banks will try to restore business levels, raising diversification and lowering their individual risk, the dimension of the entire financial network will increase, which has the natural consequence of raising the probability of large crises.
Date: | 2011–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1103.0717&r=rmg |
By: | Pflüger, Michael P. (University of Passau); Russek, Stephan (University of Passau) |
Abstract: | The risk of default that business firms face is very significant and differs widely across countries. This paper explores, from a theoretical point of view, the links between countries' business conditions, their embeddedness in international trade, and default risk at the country level. Our main contribution is to set up a general equilibrium model which allows us to derive sharp predictions concerning how key factors that shape a country's business and trade environment affect the default risk of the firms operating in it. The predictions are in accord with readily available data.
Keywords: | firm death, firm heterogeneity, business conditions and firm productivity, trade integration |
JEL: | F12 F13 F15 L25 |
Date: | 2011–02 |
URL: | http://d.repec.org/n?u=RePEc:iza:izadps:dp5541&r=rmg |
By: | John Cotter (University College Dublin); Jim Hanly (Dublin Institute of Technology) |
Abstract: | A key issue in the estimation of energy hedges is the hedgers’ attitude towards risk, which is encapsulated in the form of the hedgers’ utility function. However, the literature typically uses only one form of utility function, such as the quadratic, when estimating hedges. This paper addresses this issue by estimating energy-market-based risk aversion and applying it to commonly used utility functions, including the log, exponential and quadratic, which we incorporate in our hedging frameworks. We find significant differences in the optimal hedge strategies based on the utility function chosen.
Keywords: | Energy; Hedging; Risk Management; Risk Aversion; Forecasting |
JEL: | G10 G12 G15 |
Date: | 2011–03–02 |
URL: | http://d.repec.org/n?u=RePEc:ucd:wpaper:201106&r=rmg |
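A minimal sketch of how the choice of utility function changes the optimal hedge, in the spirit of the entry above; the synthetic returns, the risk-aversion coefficient and the grid search are illustrative assumptions, not the paper's estimation framework.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 2000
    r_f = rng.normal(0, 0.02, n)                  # synthetic futures returns
    r_s = 0.9 * r_f + rng.normal(0, 0.01, n)      # correlated synthetic spot (energy) returns
    lam = 4.0                                     # illustrative risk-aversion coefficient

    def expected_utility(h, kind):
        r_p = r_s - h * r_f                       # hedged portfolio return
        if kind == "quadratic":                   # mean-variance objective
            return r_p.mean() - 0.5 * lam * r_p.var()
        if kind == "exponential":                 # CARA utility
            return -np.mean(np.exp(-lam * r_p))
        if kind == "log":                         # log utility of terminal wealth 1 + r_p
            return np.mean(np.log1p(r_p))
        raise ValueError(kind)

    grid = np.linspace(0.0, 1.5, 151)
    for kind in ("quadratic", "exponential", "log"):
        best = max(grid, key=lambda h: expected_utility(h, kind))
        print(f"{kind:12s} optimal hedge ratio ~ {best:.2f}")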
By: | Tim Christensen (Yale University); Stan Hurn (QUT); Ken Lindsay (Glasgow) |
Abstract: | In many electricity markets, retailers purchase electricity at an unregulated spot price and sell to consumers at a heavily regulated price. Consequently the occurrence of extreme movements in the spot price represents a major source of risk to retailers and the accurate forecasting of these extreme events or price spikes is an important aspect of effective risk management. Traditional approaches to modeling electricity prices are aimed primarily at predicting the trajectory of spot prices. By contrast, this paper focuses exclusively on the prediction of spikes in electricity prices. The time series of price spikes is treated as a realization of a discrete-time point process and a nonlinear variant of the autoregressive conditional hazard (ACH) model is used to model this process. The model is estimated using half-hourly data from the Australian electricity market for the sample period 1 March 2001 to 30 June 2007. The estimated model is then used to provide one-step-ahead forecasts of the probability of an extreme event for every half hour for the forecast period, 1 July 2007 to 30 September 2007, chosen to correspond to the duration of a typical forward contract. The forecasting performance of the model is then evaluated against a benchmark that is consistent with the assumptions of commonly-used electricity pricing models. |
Keywords: | Electricity Prices, Price Spikes, Autoregressive Conditional Duration, Autoregressive Conditional Hazard
JEL: | C14 C52 |
Date: | 2011–01–25 |
URL: | http://d.repec.org/n?u=RePEc:qut:auncer:2011_1&r=rmg |
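A heavily simplified sketch of the hazard idea behind the ACH approach in the entry above, assuming a synthetic spike-indicator series and fixed, illustrative ACD(1,1) parameters rather than the paper's nonlinear specification and maximum-likelihood estimates: the one-step-ahead spike probability is taken as the inverse of a conditionally updated expected duration between spikes.

    import numpy as np

    rng = np.random.default_rng(11)
    spikes = (rng.random(1000) < 0.03).astype(int)   # synthetic half-hourly spike indicators

    durations = np.diff(np.flatnonzero(spikes))      # gaps (in periods) between successive spikes
    omega, alpha, beta = 5.0, 0.2, 0.7               # illustrative ACD(1,1) parameters
    psi = durations.mean()                           # initialise the expected duration
    for u in durations:                              # ACD-style recursion over past durations
        psi = omega + alpha * u + beta * psi

    prob_spike_next = 1.0 / psi                      # hazard = inverse of the expected duration
    print(f"one-step-ahead spike probability ~ {prob_spike_next:.3f}")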
By: | Xiaolin Luo; Pavel V. Shevchenko |
Abstract: | One of the most popular copulas for modeling dependence structures is the t-copula. Recently the grouped t-copula was generalized to allow each group to have one member only, so that a priori grouping is not required and the dependence modeling is more flexible. This paper describes a Markov chain Monte Carlo (MCMC) method under the Bayesian inference framework for estimating and choosing t-copula models. Using historical data of foreign exchange (FX) rates as a case study, we find that Bayesian model choice criteria overwhelmingly favor the generalized t-copula. In addition, all the criteria also agree on the second most likely model, and these inferences are all consistent with classical likelihood ratio tests. Finally, we demonstrate the impact of model choice on the conditional Value-at-Risk for portfolios of six major FX rates.
Date: | 2011–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1103.0606&r=rmg |
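A minimal sketch of sampling from the classical (single degrees-of-freedom) t-copula, to illustrate the joint tail behaviour that drives the conditional Value-at-Risk comparisons in the entry above; the correlation matrix and nu are illustrative, and neither the grouped/generalized copula nor the MCMC estimation is reproduced.

    import numpy as np
    from scipy import stats

    def t_copula_sample(n, corr, nu, rng):
        """Draw n samples from a t-copula with correlation matrix corr and nu degrees of freedom."""
        d = corr.shape[0]
        z = rng.multivariate_normal(np.zeros(d), corr, size=n)
        w = rng.chisquare(nu, size=(n, 1)) / nu
        x = z / np.sqrt(w)                   # multivariate t draws
        return stats.t.cdf(x, df=nu)         # uniform margins via the probability transform

    rng = np.random.default_rng(42)
    corr = np.array([[1.0, 0.6], [0.6, 1.0]])  # illustrative dependence between two FX rates
    u = t_copula_sample(100_000, corr, nu=4, rng=rng)
    # lower-tail dependence: how often both uniforms are jointly extreme
    print("P(both U < 0.01):", np.mean((u[:, 0] < 0.01) & (u[:, 1] < 0.01)))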
By: | Gregor Wergen; Miro Bogner; Joachim Krug |
Abstract: | We consider the occurrence of record-breaking events in random walks with asymmetric jump distributions. The statistics of records in symmetric random walks was previously analyzed by Majumdar and Ziff and is well understood. Unlike the case of symmetric jump distributions, in the asymmetric case the statistics of records depends on the choice of the jump distribution. We compute the record rate $P_n(c)$, defined as the probability for the $n$th value to be larger than all previous values, for a Gaussian jump distribution with standard deviation $\sigma$ that is shifted by a constant drift $c$. For small drift, in the sense of $c/\sigma \ll n^{-1/2}$, the correction to $P_n(c)$ grows proportional to $\arctan(\sqrt{n})$ and saturates at the value $\frac{c}{\sqrt{2}\sigma}$. For large $n$ the record rate approaches a constant, which is approximately given by $1-(\sigma/(\sqrt{2\pi}c))\exp(-c^2/2\sigma^2)$ for $c/\sigma \gg 1$. These asymptotic results carry over to other continuous jump distributions with finite variance. As an application, we compare our analytical results to the record statistics of 366 daily stock prices from the Standard & Poor's 500 index. The biased random walk accounts quantitatively for the increase in the number of upper records due to the overall trend in the stock prices, and after detrending the number of upper records is in good agreement with the symmetric random walk. However, the number of lower records in the detrended data is significantly reduced by a mechanism that remains to be identified.
Date: | 2011–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1103.0893&r=rmg |
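A minimal Monte-Carlo sketch of the record rate $P_n(c)$ for a Gaussian random walk with drift, checked against the Majumdar-Ziff value $1/\sqrt{\pi n}$ in the driftless case; the walk length, drifts and sample sizes are illustrative.

    import numpy as np

    rng = np.random.default_rng(7)

    def record_rate(n_steps, drift, sigma=1.0, n_walks=20_000):
        """Estimate P_n(c): the probability that step n of a Gaussian random walk
        with drift c exceeds all previous values (an upper record)."""
        steps = rng.normal(drift, sigma, size=(n_walks, n_steps))
        walks = np.cumsum(steps, axis=1)
        running_max = np.maximum.accumulate(walks, axis=1)
        return np.mean(walks[:, -1] >= running_max[:, -1])  # last value is the running maximum

    n = 100
    for c in (0.0, 0.05, 0.5):
        print(f"c={c:4.2f}: simulated P_n(c) ~ {record_rate(n, c):.3f}")
    print("symmetric benchmark 1/sqrt(pi*n) =", 1 / np.sqrt(np.pi * n))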
By: | Paolo Angelini; Laurent Clerc; Vasco Cúrdia; Leonardo Gambacorta; Andrea Gerali; Alberto Locarno; Roberto Motto; Werner Roeger; Skander Van den Heuvel; Jan Vlcek |
Abstract: | We assess the long-term economic impact of the new regulatory standards (the Basel III reform), answering the following questions. (1) What is the impact of the reform on long-term economic performance? (2) What is the impact of the reform on economic fluctuations? (3) What is the impact of the adoption of countercyclical capital buffers on economic fluctuations? The main results are the following. (1) Each percentage point increase in the capital ratio causes a median 0.09 percent decline in the level of steady-state output, relative to the baseline. The impact of the new liquidity regulation is of a similar order of magnitude, at 0.08 percent. This paper does not estimate the benefits of the new regulation in terms of reduced frequency and severity of financial crises, analysed in Basel Committee on Banking Supervision (BCBS, 2010b). (2) The reform should dampen output volatility; the magnitude of the effect is heterogeneous across models, and the median effect is modest. (3) The adoption of countercyclical capital buffers could have a more sizeable dampening effect on output volatility. These conclusions are fully consistent with those of the reports by the Long-term Economic Impact group (BCBS, 2010b) and the Macroeconomic Assessment Group (MAG, 2010b).
Keywords: | Basel III, countercyclical capital buffers, financial (in)stability, procyclicality, macroprudential |
Date: | 2011–02 |
URL: | http://d.repec.org/n?u=RePEc:bis:biswps:338&r=rmg |
By: | Alejandro Balbás; Beatriz Balbás; Raquel Balbás |
Abstract: | This paper studies a portfolio choice problem in which the pricing rule may incorporate transaction costs and the risk measure is coherent and expectation bounded. We prove the necessity of dealing with pricing rules such that there are essentially bounded stochastic discount factors, which must also be bounded from below by a strictly positive value. Otherwise good deals will be available to traders, i.e., depending on the selected risk measure, investors can build portfolios whose (risk, return) will be as close as desired to (-infinity, +infinity) or (0, infinity). This pathological property still holds for vector risk measures (i.e., if we minimize a vector-valued function whose components are risk measures). It is worthwhile to point out that essentially bounded stochastic discount factors are not usual in the financial literature. In particular, the most famous frictionless, complete and arbitrage-free pricing models imply the existence of good deals for every coherent and expectation bounded measure of risk, and the incorporation of transaction costs will not guarantee a solution to this caveat.
Keywords: | Risk measure, Perfect and imperfect markets, Stochastic discount factor, Portfolio choice model, Good deal |
JEL: | G12 G13 G11 |
Date: | 2011–02 |
URL: | http://d.repec.org/n?u=RePEc:cte:wbrepe:wb110302&r=rmg |
By: | Katja Drechsel; Rolf Scheufele |
Abstract: | This paper analyses the 2008/2009 recession in Germany, which differs markedly from previous recessions, in particular regarding its cause and magnitude. We show to what extent forecasters and forecasts based on leading indicators failed to detect the timing and the magnitude of the recession. Large forecast errors resulted during this recession, both for expert forecasts and for forecasts based on leading indicators, which implies that the recession was very difficult to forecast. However, some leading indicators (survey data, risk spreads, stock prices) did indicate an economic downturn and hence beat univariate time series models. Although the combination of individual forecasts provides an improvement over the benchmark model, the combined forecasts are worse than several individual models. A comparison of expert forecasts with the best forecasts based on leading indicators shows only minor deviations. Overall, the scope for improving expert forecasts during the crisis relative to indicator forecasts is relatively small.
Keywords: | leading indicators, recession, consensus forecast, non-linearities |
JEL: | E37 C53 |
Date: | 2011–03 |
URL: | http://d.repec.org/n?u=RePEc:iwh:dispap:5-11&r=rmg |
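A minimal sketch of the kind of forecast-combination comparison discussed in the entry above, assuming synthetic outcomes, three hypothetical indicator-based forecasts and a naive benchmark; it only illustrates how an equal-weight combination is evaluated against individual forecasts by RMSE, not the paper's data, models or results.

    import numpy as np

    rng = np.random.default_rng(3)
    T = 40
    actual = rng.normal(1.5, 1.0, T)                  # synthetic outcomes (e.g. GDP growth)
    forecasts = {                                     # hypothetical indicator-based forecasts
        "survey":       actual + rng.normal(0, 0.6, T),
        "risk spread":  actual + rng.normal(0, 0.9, T),
        "stock prices": actual + rng.normal(0, 1.2, T),
    }
    benchmark = np.full(T, actual.mean())             # naive constant benchmark

    def rmse(f):
        return np.sqrt(np.mean((f - actual) ** 2))

    combined = np.mean(list(forecasts.values()), axis=0)  # equal-weight combination
    for name, f in forecasts.items():
        print(f"{name:12s} RMSE = {rmse(f):.2f}")
    print(f"{'combined':12s} RMSE = {rmse(combined):.2f}")
    print(f"{'benchmark':12s} RMSE = {rmse(benchmark):.2f}")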