on Risk Management
Issue of 2009‒08‒30
eleven papers chosen by
By: | McAleer, M.; Jimenez-Marin, J.-A.; Perez-Amaral, T. (Erasmus Econometric Institute) |
Abstract: | When dealing with market risk under the Basel II Accord, variation pays in the form of lower capital requirements and higher profits. Typically, GARCH-type models are chosen to forecast Value-at-Risk (VaR) using a single risk model. In this paper we illustrate two useful variations to the standard mechanism for choosing forecasts, namely: (i) combining different forecast models for each period, such as a daily model that forecasts the supremum or infimum value of the VaR; (ii) selecting a single model to forecast VaR, and then modifying the daily forecast depending on the recent history of violations under the Basel II Accord. We illustrate these points using the Standard and Poor's 500 Composite Index. In many cases we find significant decreases in capital requirements while keeping the number of violations within the Basel II Accord limits. |
Keywords: | risk management;violations;aggressive risk strategy;conservative risk strategy;value-at-risk forecast |
Date: | 2009–08–18 |
URL: | http://d.repec.org/n?u=RePEc:dgr:eureir:1765016512&r=rmg |
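The two strategies described in the abstract above lend themselves to a compact illustration. The sketch below assumes a panel of candidate daily VaR forecasts (e.g., from several GARCH-type models) is already available; the function names, the 250-day window, the violation threshold and the scaling factor are illustrative placeholders, not parameters taken from the paper.

```python
import numpy as np

def combine_var_forecasts(var_forecasts, strategy="conservative"):
    """Combine candidate daily VaR forecasts (negative numbers, e.g. 1% tail).

    var_forecasts : array of shape (n_days, n_models)
    strategy      : "conservative" takes the most extreme (lowest) forecast,
                    "aggressive" the least extreme (highest) forecast.
    """
    var_forecasts = np.asarray(var_forecasts)
    if strategy == "conservative":
        return var_forecasts.min(axis=1)   # largest predicted loss each day
    return var_forecasts.max(axis=1)       # smallest predicted loss each day

def adjust_after_violations(var_path, returns, window=250, max_violations=4,
                            scale=1.25):
    """Illustrative version of strategy (ii): make tomorrow's VaR more
    conservative when the trailing violation count approaches a Basel-style limit."""
    var_path = np.array(var_path, dtype=float)
    returns = np.asarray(returns, dtype=float)
    for t in range(1, len(var_path)):
        start = max(0, t - window)
        violations = np.sum(returns[start:t] < var_path[start:t])
        if violations >= max_violations:
            var_path[t] *= scale           # scale the (negative) VaR downwards
    return var_path
```

With each column of var_forecasts holding one model's forecasts, the conservative combination takes the most extreme daily VaR and the aggressive one the least extreme; strategy (ii) simply scales the single-model forecast when recent violations accumulate.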
By: | Chen Zou |
Abstract: | In this paper, we study the aggregated risk from dependent risk factors under the multivariate Extreme Value Theory (EVT) framework. We consider the heavy-tailedness of the risk factors as well as a non-parametric tail dependence structure. This allows for a wide range of dependence models. We assess the Value-at-Risk of a diversified portfolio constructed from dependent risk factors. Moreover, we examine the diversification effects under this setup. |
Keywords: | Aggregated risk; diversification effect; multivariate Extreme Value Theory |
Date: | 2009–07 |
URL: | http://d.repec.org/n?u=RePEc:dnb:dnbwpp:219&r=rmg |
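The diversification effect mentioned above can be illustrated numerically. The sketch below is not the paper's non-parametric EVT estimator: it simulates two heavy-tailed (Pareto) risk factors coupled by a Gaussian copula and compares the VaR of the aggregate with the sum of stand-alone VaRs; the tail index, correlation, sample size and quantile level are arbitrary assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def pareto_quantile(u, alpha=2.5):
    """Inverse CDF of a unit-scale Pareto loss with tail index alpha."""
    return (1.0 - u) ** (-1.0 / alpha)

def diversification_effect(n=200_000, rho=0.5, alpha=2.5, q=0.99):
    """Compare portfolio VaR with the sum of stand-alone VaRs for two
    dependent heavy-tailed losses (Gaussian copula, Pareto margins)."""
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    losses = pareto_quantile(norm.cdf(z), alpha)      # dependent Pareto losses
    var_aggregate = np.quantile(losses.sum(axis=1), q)
    sum_of_vars = np.quantile(losses, q, axis=0).sum()
    return 1.0 - var_aggregate / sum_of_vars          # positive = diversification gain

print(diversification_effect())
```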
By: | René Aïd (EDF R&D - EDF, FiME Lab - Laboratoire de Finance des Marchés d'Energie - Université Paris Dauphine - Paris IX - CREST - EDF R&D) |
Abstract: | Since the liberalisation of European energy markets at the beginning of the 1990s, electricity monopolies have gone through a profound evolution process. From an industrial organisation point of view, they lost the monopoly on their historical business but gained the capacity to develop in any sector. Companies went public and had to upgrade their financial risk management processes to international standards and implement modern risk management concepts and reporting processes (VaR, EaR...). Even though important progress has been made, we argue here that the long-term risk management process of utility companies has not yet reached full maturity and still faces two main challenges. The first concerns the time consistency of long-term and mid-term risk management processes. We show that consistency issues arise from the classical financial parameters carrying information on a firm's risk aversion (cost of capital and short-term risk limits) and from concepts inherited from the monopoly period, such as the loss-of-load value, that are still involved in utility companies' decision-making. The second challenge concerns the need for quantitative models to assess their business model. With deregulation, utilities have to address the question of their boundaries. Although intuition can provide insights on the benefits of some firm structures, such as vertical integration, only sound and tractable quantitative models can answer the question of which firm structures are optimal. |
Keywords: | electricity markets; risk management; investment decision; long-term risk |
Date: | 2008–12–30 |
URL: | http://d.repec.org/n?u=RePEc:hal:wpaper:hal-00409030_v4&r=rmg |
By: | Tobias Adrian; Hyun Song Shin |
Abstract: | The current financial crisis has highlighted the growing importance of the “shadow banking system,” which grew out of the securitization of assets and the integration of banking with capital market developments. This trend has been most pronounced in the United States, but it has had a profound influence on the global financial system. In a market-based financial system, banking and capital market developments are inseparable: Funding conditions are closely tied to fluctuations in the leverage of market-based financial intermediaries. Growth in the balance sheets of these intermediaries provides a sense of the availability of credit, while contractions of their balance sheets have tended to precede the onset of financial crises. Securitization was intended as a way to transfer credit risk to those better able to absorb losses, but instead it increased the fragility of the entire financial system by allowing banks and other intermediaries to “leverage up” by buying one another’s securities. In the new, post-crisis financial system, the role of securitization will likely be held in check by more stringent financial regulation and by the recognition that it is important to prevent excessive leverage and maturity mismatch, both of which can undermine financial stability. |
Date: | 2009 |
URL: | http://d.repec.org/n?u=RePEc:fip:fednsr:382&r=rmg |
By: | Bernardo Maggi; Marco Guida (Dipartimento di Economia, Sapienza University of Rome Italy) |
Abstract: | In this paper we model the effect of non-performing loans on the cost structure of the commercial banking system. To this end, we study the consequences of an increase in non-performing loans on the cost function and compute the probability that a performing loan fails to remain performing. In doing so we argue that geography does matter, and we evaluate the bank's risk propensity towards non-performing loans accordingly. We finally stress that traditional efficiency indicators based on cost elasticity are not well suited to this problem, and we propose a measure based on the costs of managing and monitoring the loans which, according to the related density function, will effectively turn out to be non-performing. |
Keywords: | Non-performing loan probability, Bank management, Cost function, Efficiency and effectiveness indicators, Flexible forms |
JEL: | G21 D24 C33 C51 L23 |
Date: | 2009–04 |
URL: | http://d.repec.org/n?u=RePEc:des:wpaper:1&r=rmg |
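The measure proposed above weights the cost of managing and monitoring loans by the probability that they effectively turn out to be non-performing. A minimal sketch of that idea follows, with an entirely illustrative logistic probability model and cost schedule; the paper's cost function, covariates and density are not reproduced here.

```python
import numpy as np

def npl_probability(x, coef, intercept=-2.0):
    """Illustrative logistic model for the probability that a performing
    loan becomes non-performing, given borrower/regional covariates x."""
    return 1.0 / (1.0 + np.exp(-(intercept + x @ coef)))

def expected_monitoring_cost(exposures, monitoring_cost_rate, x, coef):
    """Cost of managing and monitoring loans, weighted by the probability
    that each loan will effectively turn out to be non-performing."""
    p = npl_probability(x, coef)
    return float(np.sum(p * monitoring_cost_rate * exposures))

# Hypothetical portfolio of three loans with two covariates each:
x = np.array([[0.2, 1.0], [1.5, 0.3], [0.9, 0.8]])
coef = np.array([0.8, 0.5])
print(expected_monitoring_cost(np.array([100.0, 250.0, 80.0]), 0.02, x, coef))
```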
By: | Julien Chevallier (EconomiX - CNRS : UMR7166 - Université de Paris X - Nanterre) |
Abstract: | This article proposes a mean-variance optimization and portfolio frontier analysis of energy risk management with carbon assets, introduced in January 2005 as part of the EU Emissions Trading Scheme. In a stylized exercise, we compute returns, standard deviations and correlations for various asset classes from April 2005 to January 2009. Our central result features an expected return of 3% with a standard deviation below 0.06, obtained by introducing carbon assets – carbon futures and CERs – into a diversified portfolio composed of energy (oil, gas, coal), weather, bond, and equity risky assets, and of a riskless asset (U.S. T-bills). In addition, we investigate the characteristics of each asset class with respect to alpha, beta, and sigma in the spirit of the CAPM. These results reveal that carbon, gas, coal and bond assets share the best properties for composing an optimal portfolio. Collectively, these results illustrate the benefits of carbon assets for diversification purposes in portfolio management, as the carbon market constitutes a segmented commodity market with specific risk factors linked to the EU Commission's decisions and the power producers' fuel-switching behavior. |
Keywords: | Mean-variance optimization; Portfolio frontier analysis; CAPM; CO2; Carbon; Energy; Bonds; Equity; Asset Management; EU ETS; CERs |
Date: | 2009–08–16 |
URL: | http://d.repec.org/n?u=RePEc:hal:wpaper:halshs-00410059_v1&r=rmg |
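The mean-variance exercise described above has a standard closed form once expected returns and the covariance matrix of the asset classes are estimated. The sketch below shows the textbook minimum-variance solution for a target return; the asset labels and numbers are placeholders, not the paper's April 2005 to January 2009 estimates.

```python
import numpy as np

def min_variance_weights(mu, cov, target_return):
    """Closed-form minimum-variance weights for a target expected return
    (fully invested, short sales allowed)."""
    ones = np.ones(len(mu))
    inv = np.linalg.inv(cov)
    a = ones @ inv @ ones
    b = ones @ inv @ mu
    c = mu @ inv @ mu
    d = a * c - b ** 2
    lam = (c - b * target_return) / d
    gam = (a * target_return - b) / d
    return inv @ (lam * ones + gam * mu)

# Placeholder inputs for, e.g., [energy, carbon, bond, equity]:
mu = np.array([0.05, 0.04, 0.03, 0.06])
cov = np.diag([0.04, 0.02, 0.01, 0.05])
w = min_variance_weights(mu, cov, target_return=0.03)
print(w, w @ mu, np.sqrt(w @ cov @ w))   # weights, expected return, volatility
```

Tracing the frontier amounts to repeating the call over a grid of target returns and plotting volatility against expected return.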
By: | Peter Christoffersen; Kris Jacobs; Chayawat Ornthanalai |
Abstract: | Standard empirical investigations of jump dynamics in returns and volatility are fairly complicated due to the presence of latent continuous-time factors. We present a new discrete-time framework that combines heteroskedastic processes with rich specifications of jumps in returns and volatility. Our models can be estimated with ease using standard maximum likelihood techniques. We provide a tractable risk neutralization framework for this class of models, which allows for separate modeling of risk premia for the jump and normal innovations. We anchor our models in the literature by providing their continuous-time limits. The models are evaluated by fitting a long sample of S&P500 index returns and by valuing a large sample of options. We find strong empirical support for time-varying jump intensities. A model with a jump intensity that is affine in the conditional variance performs particularly well both in return fitting and in option valuation. Our implementation allows for multiple jumps per day, and the data indicate support for this model feature, most notably on Black Monday in October 1987. Our results also confirm the importance of jump risk premia for option valuation: jumps cannot significantly improve the performance of option pricing models unless sizeable jump risk premia are present. |
Keywords: | compound Poisson process, option valuation, filtering, volatility jumps, jump risk premia, time-varying jump intensity, heteroskedasticity |
JEL: | G12 |
Date: | 2009–08–01 |
URL: | http://d.repec.org/n?u=RePEc:cir:cirwor:2009s-34&r=rmg |
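The model feature emphasised above, multiple jumps per day with an intensity that is affine in the conditional variance, can be mimicked in a toy simulation. The sketch below layers compound Poisson jumps on a plain GARCH(1,1); the parameter values and the exact variance recursion are illustrative assumptions, not the authors' specification or their maximum likelihood estimator.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_garch_jump(n=2_000, omega=1e-6, alpha=0.08, beta=0.90,
                        lam0=0.02, lam1=5_000.0, mu_j=-0.01, sig_j=0.02):
    """Simulate returns from a GARCH(1,1) with compound Poisson jumps whose
    daily intensity is affine in the conditional variance h_t:
        lambda_t = lam0 + lam1 * h_t
    The Poisson draw allows multiple jumps on a single day."""
    h = omega / (1.0 - alpha - beta)          # start at the unconditional variance
    returns = np.empty(n)
    for t in range(n):
        lam_t = lam0 + lam1 * h
        n_jumps = rng.poisson(lam_t)           # possibly several jumps today
        jump = rng.normal(mu_j * n_jumps, sig_j * np.sqrt(n_jumps))
        z = rng.normal(0.0, np.sqrt(h))        # normal innovation
        returns[t] = z + jump
        h = omega + alpha * returns[t] ** 2 + beta * h
    return returns

r = simulate_garch_jump()
print(r.std(), (np.abs(r) > 5 * r.std()).sum())   # crude check for fat tails
```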
By: | Bo-Young Chang; Peter Christoffersen; Kris Jacobs; Gregory Vainberg |
Abstract: | Equity risk measured by beta is of great interest to both academics and practitioners. Existing estimates of beta use historical returns. Many studies have found option-implied volatility to be a strong predictor of future realized volatility. We find that option-implied volatility and skewness are also good predictors of future realized beta. Motivated by this finding, we establish a set of assumptions needed to construct a beta estimate from option-implied return moments using equity and index options. This beta can be computed using only option data on a single day. It is therefore potentially able to reflect sudden changes in the structure of the underlying company. |
Keywords: | market beta; CAPM; historical; capital budgeting; model-free moments |
JEL: | G12 |
Date: | 2009–08–01 |
URL: | http://d.repec.org/n?u=RePEc:cir:cirwor:2009s-33&r=rmg |
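One way to turn the abstract's idea into a formula is to assume a single-factor return structure in which idiosyncratic shocks are symmetric, so the stock's option-implied skewness is inherited entirely from the index. Under that assumption the implied variance and skewness of the stock and of the index pin beta down, as sketched below; this is a stylised reconstruction, not necessarily the authors' exact estimator, and the input moments are hypothetical.

```python
import numpy as np

def option_implied_beta(var_stock, skew_stock, var_index, skew_index):
    """Back out a forward-looking beta from option-implied (risk-neutral)
    variance and skewness of a stock and of the market index.

    If r_i = beta * r_m + eps with eps symmetric and independent, the stock's
    third moment comes entirely from the market factor, so
        beta = (skew_stock / skew_index)**(1/3) * sqrt(var_stock / var_index).
    Skewness inputs are assumed to share the (typically negative) index sign.
    """
    return np.cbrt(skew_stock / skew_index) * np.sqrt(var_stock / var_index)

# Hypothetical one-day implied moments (not from the paper):
print(option_implied_beta(var_stock=0.09, skew_stock=-0.8,
                          var_index=0.04, skew_index=-1.2))
```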
By: | Yannick Malevergne; Pedro Santa-Clara; Didier Sornette |
Abstract: | The heavy-tailed distribution of firm sizes first discovered by Zipf (1949) is one of the best established empirical facts in economics. We show that it has strong implications for asset pricing. Due to the concentration of the market portfolio when the distribution of the capitalization of firms is sufficiently heavy-tailed, an additional risk factor generically appears even for very large economies. Our two-factor model is as successful empirically as the three-factor Fama-French model. |
JEL: | G12 |
Date: | 2009–08 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:15295&r=rmg |
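The mechanism is that a value-weighted market portfolio fails to diversify away idiosyncratic risk when capitalizations are heavy-tailed enough. A quick numerical illustration, which is not the paper's asset-pricing model: with a Zipf-like tail index near one, the largest-firm weight and the Herfindahl concentration of the market portfolio remain non-negligible even as the number of firms grows, whereas they vanish for thinner tails.

```python
import numpy as np

rng = np.random.default_rng(2)

def market_concentration(n_firms, tail_index):
    """Largest-firm weight and Herfindahl index of a value-weighted market
    portfolio when firm sizes are Pareto with the given tail index."""
    sizes = (1.0 - rng.uniform(size=n_firms)) ** (-1.0 / tail_index)
    weights = sizes / sizes.sum()
    return weights.max(), np.sum(weights ** 2)

for n in (1_000, 100_000):
    print(n,
          market_concentration(n, tail_index=1.1),   # Zipf-like: stays concentrated
          market_concentration(n, tail_index=3.0))   # thinner tail: diversifies away
```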
By: | Salman, A. Khalik (CAFO, Växjö University); von Friedrichs, Yvonne (CAFO, Växjö University); Shukur, Ghazi (CESIS - Centre of Excellence for Science and Innovation Studies, Royal Institute of Technology) |
Abstract: | This paper employs a time series cointegration approach to evaluate the relationship between manufacturing firm failure and macroeconomic factors in the Swedish manufacturing sector over the period 1986–2006, using quarterly data. We find that in the long run firm failure is negatively related to the level of industrial activity, the money supply, GNP and the rate of economic openness, and positively related to the real wage. Time series Error Correction Model (ECM) estimates suggest that macroeconomic risk factors affect firm failures in the same direction in both the short run and the long run, and that adjustment to stabilise the relationship is quite slow. |
Keywords: | firm failure; macroeconomic factors; cointegration analysis; diagnostic tests |
JEL: | D01 D02 |
Date: | 2009–08–26 |
URL: | http://d.repec.org/n?u=RePEc:hhs:cesisp:0185&r=rmg |
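The two-step error-correction setup described above can be sketched with statsmodels: a long-run (cointegrating) regression in levels, followed by a regression of the differenced failure series on differenced macro factors and the lagged long-run residual, whose coefficient measures the speed of adjustment. Column names, the synthetic data and the lag structure below are placeholders, not the paper's specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def estimate_ecm(df, y_col, x_cols):
    """Two-step Engle-Granger style error correction model.

    Step 1: long-run regression of y on x in levels; keep the residual.
    Step 2: regress dy on dx and the lagged residual; its coefficient is the
    (expected negative) speed of adjustment to the long-run relation.
    """
    long_run = sm.OLS(df[y_col], sm.add_constant(df[x_cols])).fit()
    ect = long_run.resid.shift(1).rename("ect_lag1")      # error-correction term

    diffs = df[[y_col] + x_cols].diff()
    data = pd.concat([diffs, ect], axis=1).dropna()
    X = sm.add_constant(data[x_cols + ["ect_lag1"]])
    return sm.OLS(data[y_col], X).fit()

# Synthetic quarterly example with a built-in long-run relation (illustrative only):
rng = np.random.default_rng(3)
idx = pd.period_range("1986Q1", periods=84, freq="Q")
activity = np.cumsum(rng.normal(size=84))
failures = -0.5 * activity + rng.normal(scale=0.3, size=84)
df = pd.DataFrame({"failures": failures, "industrial_activity": activity}, index=idx)
print(estimate_ecm(df, "failures", ["industrial_activity"]).params)
```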
By: | Lasse Heje Pedersen |
Abstract: | The dangers of shouting "fire" in a crowded theater are well understood, but the dangers of rushing to the exit in the financial markets are more complex. Yet, the two events share several features, and I analyze why people crowd into theaters and trades, why they run, what determines the risk, whether to return to the theater or trade when the dust settles, and how much to pay for assets (or tickets) in light of this risk. These theoretical considerations shed light on the recent global liquidity crisis and, in particular, the quant event of 2007. |
JEL: | E44 E52 G1 G12 G18 G2 |
Date: | 2009–08 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:15297&r=rmg |