
on Risk Management 
Issue of 2010–05–08
nine papers chosen by 
By:  Durant, D.; Frey, L. 
Abstract:  The aim of this paper is to build and estimate a macroeconomic model of credit risk for the French manufacturing sector. This model is based on Wilson's CreditPortfolioView model (1997a, 1997b); it enables us to simulate loss distributions for a credit portfolio for several macroeconomic scenarios. We implement two simulation procedures based on two assumptions relative to probabilities of default (PDs): in the first procedure, firms are assumed to have identical default probabilities; in the second, individual risk is taken into account. The empirical results indicate that these simulation procedures lead to quite different loss distributions. For instance, a negative one standard deviation shock on output leads to a maximum loss of 3.07% of the financial debt of the French manufacturing sector, with a probability of 99%, under the identical default probability hypothesis versus 2.61% with individual default probabilities. 
Keywords:  Consumption and savings, pension funds, social security and public pensions, portfolio choices and investment decisions. 
JEL:  E21 G11 G23 H55 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:bfr:banfra:280&r=rmg 
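The first simulation procedure the abstract describes (identical default probabilities driven by a macroeconomic factor) can be sketched as follows. This is a minimal illustration of the Wilson-style setup, assuming a logit link and an AR(1) macro index; all coefficients and portfolio sizes are illustrative, not the paper's estimates.

```python
import math
import random

def simulate_loss_distribution(n_sims=5000, n_firms=200, exposure=1.0,
                               beta0=-3.0, beta1=-2.0, rho=0.5, sigma=1.0,
                               seed=0):
    """Monte Carlo loss distribution under the identical-PD hypothesis.

    Each simulation draws a macro shock, updates an AR(1) macro index,
    maps it to a sector-wide default probability via a logit link, and
    draws independent defaults at that common PD.
    """
    rng = random.Random(seed)
    losses = []
    for _ in range(n_sims):
        y = rng.gauss(0.0, sigma)             # initial macro index draw
        y = rho * y + rng.gauss(0.0, sigma)   # one-step AR(1) update
        pd_t = 1.0 / (1.0 + math.exp(-(beta0 + beta1 * y)))  # logit link
        defaults = sum(rng.random() < pd_t for _ in range(n_firms))
        losses.append(defaults * exposure)
    return losses

losses = simulate_loss_distribution()
losses_sorted = sorted(losses)
# loss level not exceeded with 99% probability (the quantile quoted in
# the abstract's 3.07% / 2.61% comparison)
var_99 = losses_sorted[int(0.99 * len(losses_sorted))]
```

The second procedure in the paper replaces the common `pd_t` with firm-specific probabilities; the loop structure is otherwise the same.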
By:  Luca RICCETTI (Universita' Politecnica delle Marche, Dipartimento di Economia) 
Abstract:  Investors assign part of their funds to asset managers who are given the task of beating a benchmark. The risk management department usually imposes a maximum value of the tracking error volatility (TEV) in order to keep the risk of the portfolio close to that of the selected benchmark. However, risk management does not establish a rule on TEV that reveals whether the asset manager is really active; in practice, asset managers sometimes passively follow the corresponding index. Moreover, the benchmark is sometimes difficult to beat when risk managers only check that portfolio managers do not exceed a fixed level of relative risk. I derive analytical conditions for identifying whether a portfolio manager's strategy is genuinely active: it must deliver an excess return above the benchmark large enough to cover the commission paid by investors and, at the same time, keep the portfolio's variance no greater than the benchmark's variance, so that the excess return is not merely due to a higher risk level (using variance as the risk indicator). These equations are a necessary (but not sufficient) condition for beating the benchmark's return without increasing the overall variance of the portfolio. They also generalize the model of Jorion (2003) by incorporating commissions. I apply these equations to an Italian liquidity fund and find that the fees are too high and the TEV is low. In fact, all the funds in the liquidity category show similar problems, which often make the portfolio unable to cover the fees without increasing the variance. 
Keywords:  Active Management, Benchmarking, Commissions, Portfolio Choice, Risk Management, Tracking Error 
JEL:  C61 G10 G11 G23 
Date:  2010–04 
URL:  http://d.repec.org/n?u=RePEc:anc:wpaper:340&r=rmg 
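The two conditions the abstract combines (excess return large enough to cover commissions, and portfolio variance no larger than the benchmark's) can be sketched as a simple check on return series. The functions below are a hedged illustration, not the paper's equations; the monthly frequency and fee handling are assumptions.

```python
import statistics

def tev(portfolio, benchmark):
    """Tracking error volatility: sample st. dev. of active returns."""
    active = [p - b for p, b in zip(portfolio, benchmark)]
    return statistics.stdev(active)

def beats_after_fees(portfolio, benchmark, annual_fee, periods_per_year=12):
    """Necessary condition sketch: mean excess return covers the fee,
    and the portfolio's variance does not exceed the benchmark's."""
    fee_per_period = annual_fee / periods_per_year
    excess = statistics.mean(p - b for p, b in zip(portfolio, benchmark))
    no_extra_risk = (statistics.pvariance(portfolio)
                     <= statistics.pvariance(benchmark))
    return excess > fee_per_period and no_extra_risk
```

A fund with a near-zero TEV and a non-trivial fee will typically fail the first test, which is the pattern the paper reports for Italian liquidity funds.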
By:  Claudia Miani (Bank of Italy); Stefano Siviero (Bank of Italy) 
Abstract:  It has increasingly become standard practice to supplement point macroeconomic forecasts with an appraisal of the degree of uncertainty and the prevailing direction of risks. Several alternative approaches have been proposed in the literature to compute the probability distribution of macroeconomic forecasts; all of them rely on combining the predictive density of model-based forecasts with subjective judgment about the direction and intensity of prevailing risks. We propose a nonparametric, model-based simulation approach, which does not require specific assumptions to be made regarding the probability distribution of the sources of risk. The probability distribution of macroeconomic forecasts is computed as the result of model-based stochastic simulations which rely on resampling from the historical distribution of risk factors and are designed to deliver the desired degree of skewness. By contrast, other approaches typically make a specific, parametric assumption about the distribution of risk factors. The approach is illustrated using the Bank of Italy’s Quarterly Macroeconometric Model. The results suggest that the distribution of macroeconomic forecasts quickly tends to become symmetric, even if all risk factors are assumed to be asymmetrically distributed. 
Keywords:  macroeconomic forecasts, stochastic simulations, balance of risks, uncertainty, fan charts 
JEL:  C14 C53 E37 
Date:  2010–04 
URL:  http://d.repec.org/n?u=RePEc:bdi:wptemi:td_758_10&r=rmg 
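The resampling idea in the abstract can be sketched in a few lines: draw shocks from historical residuals rather than from a parametric distribution, tilting the resampling weights to impose the desired skew. The AR(1) "model" and the residual series below are placeholders standing in for the Bank of Italy quarterly model, not its actual equations.

```python
import random
import statistics

def fan_chart(y0, phi, residuals, horizons=8, n_paths=2000,
              downside_weight=1.0, seed=0):
    """Return one list of simulated forecast values per horizon.

    Shocks are resampled (with replacement) from historical residuals;
    downside_weight > 1 over-weights negative residuals, tilting the
    balance of risks downward without assuming any parametric form.
    """
    rng = random.Random(seed)
    weights = [downside_weight if e < 0 else 1.0 for e in residuals]
    paths = [[] for _ in range(horizons)]
    for _ in range(n_paths):
        y = y0
        for h in range(horizons):
            y = phi * y + rng.choices(residuals, weights)[0]
            paths[h].append(y)
    return paths

# illustrative "historical" residuals
rng_hist = random.Random(1)
resid = [rng_hist.gauss(0.0, 1.0) for _ in range(100)]
paths = fan_chart(0.0, 0.8, resid, downside_weight=2.0)
```

Quantiles of `paths[h]` across simulations give the fan-chart bands at horizon `h`; the paper's finding is that such bands tend toward symmetry quickly even when each period's shock distribution is skewed.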
By:  Robin Greenwood; Samuel Hanson 
Abstract:  We use differences between the attributes of stock issuers and repurchasers to forecast characteristic-related stock returns. For example, we show that large firms underperform following years when issuing firms are large relative to repurchasing firms. Our approach is useful for forecasting returns to portfolios based on book-to-market (HML), size (SMB), price, distress, payout policy, profitability, and industry. We consider interpretations of these results based on both time-varying risk premia and mispricing. Our results are primarily consistent with the view that firms issue and repurchase shares to exploit time-varying characteristic mispricing. 
JEL:  G14 G3 G32 
Date:  2010–04 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:15948&r=rmg 
By:  Jocelyne BionNadal; Magali Kervarec 
Abstract:  The framework of this paper is that of uncertainty, that is, when no reference probability measure is given. To every convex regular risk measure $\rho$ on ${\cal C}_b(\Omega)$, we associate a canonical $c_{\rho}$-class of probability measures. Furthermore, the convex risk measure admits a dual representation in terms of a weakly relatively compact set of probability measures absolutely continuous with respect to some probability measure belonging to the canonical $c_{\rho}$-class. To get these results we study the topological properties of the dual of the Banach space $L^1(c)$ associated to some capacity $c$, and we prove a representation theorem for convex risk measures on $L^1(c)$. As applications, we obtain that every $G$-expectation $\mathbb{E}$ (resp., in the case of uncertain volatility, every sublinear risk measure $\rho$) admits a representation with a countable family of probability measures absolutely continuous with respect to some $P$ belonging to the canonical $c$-class, with $c(f)=\mathbb{E}(f)$ (resp. $\rho(f)$). 
Date:  2010–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1004.5524&r=rmg 
By:  Tian Qiu; Guang Chen; LiXin Zhong; XiaoWei Lei 
Abstract:  An average instantaneous cross-correlation function is introduced to quantify the interaction of the financial market at a specific time. Based on daily data from the American and Chinese stock markets, the memory effect of the average instantaneous cross-correlations is investigated over different price return time intervals. Long-range time correlations are revealed, and are found to persist up to a month-order magnitude of the price return time interval. The multifractal nature of the series is investigated by a multifractal detrended fluctuation analysis. 
Date:  2010–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1004.5547&r=rmg 
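One natural reading of the "average instantaneous cross-correlation" is the cross-sectional average, at each day t, of pairwise products of normalized returns. The sketch below follows that reading as an assumption (the paper's exact normalization may differ), with a toy return matrix in place of the American and Chinese market data.

```python
import math

def normalize(series):
    """Standardize a return series to zero mean and unit st. dev."""
    n = len(series)
    mean = sum(series) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in series) / n)
    return [(x - mean) / sd for x in series]

def avg_instant_cross_correlation(returns):
    """returns: list of per-stock return series (equal length).

    Output: one value per day, the average over stock pairs (i, j)
    of the product of their normalized returns on that day.
    """
    norm = [normalize(r) for r in returns]
    n_stocks, n_days = len(norm), len(norm[0])
    out = []
    for t in range(n_days):
        s, pairs = 0.0, 0
        for i in range(n_stocks):
            for j in range(i + 1, n_stocks):
                s += norm[i][t] * norm[j][t]
                pairs += 1
        out.append(s / pairs)
    return out
```

The memory effect in the paper is then studied on this c(t) series (e.g. via its autocorrelation and detrended fluctuation analysis), not on the raw returns.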
By:  Giulio Bottazzi; Federico Tamagni 
Abstract:  Exploiting a large database of Italian manufacturing firms, we investigate the relationship between default rate and firm size. Default events, defined as conditions of actual or likely insolvency, are a signal of deep business troubles. They are unanticipated, costly and dangerous for the firm as well as for the economy, and should in principle be avoided. Our evidence, based on data provided by a large Italian banking group, reveals that the default probability of firms increases with their size. This finding contrasts with typical results on exit events based on business registry data, and suggests revising the common wisdom that sees the core of the industry as a safe place and its members as its most valuable economic assets. 
Keywords:  firm default and exit, firm size, bootstrap probit regressions 
JEL:  C14 C25 G30 L11 
Date:  2010–05–01 
URL:  http://d.repec.org/n?u=RePEc:ssa:lemwps:2010/07&r=rmg 
By:  Sharma Megha; Ghosh Diptesh 
Abstract:  Reliable networks are networks in which elements have a positive probability of failing. Conventional performance measures for such networks concern themselves either with expected network performance or with the performance of the network when it is performing well. In reliable networks modeling critical functions, decision makers are often more concerned with network performance when the network is not performing well. In this paper, we study the single-source, single-destination maximum flow problem through reliable networks and propose two risk measures to evaluate such downside performance. We propose an algorithm called COMPUTERISK to compute these downside risk measures, and report our computational experience with the proposed algorithm. 
Date:  2009–09–29 
URL:  http://d.repec.org/n?u=RePEc:iim:iimawp:wp20090902&r=rmg 
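The notion of downside performance here can be illustrated with plain Monte Carlo (this is not the COMPUTERISK algorithm itself): sample which arcs survive, compute the s-t max flow of the surviving network, and look at a low quantile of the resulting flow distribution. The independent-failure model and the 5% quantile are illustrative assumptions.

```python
import random
from collections import deque, defaultdict

def max_flow(arcs, s, t):
    """Edmonds-Karp max flow; arcs is a list of (u, v, capacity)."""
    cap = defaultdict(int)
    adj = defaultdict(set)
    for u, v, c in arcs:
        cap[(u, v)] += c
        adj[u].add(v)
        adj[v].add(u)          # residual arcs need both directions
    flow = 0
    while True:
        parent = {s: None}     # BFS for a shortest augmenting path
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v in adj[u]:
                if v not in parent and cap[(u, v)] > 0:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow
        path, v = [], t        # walk back to find the bottleneck
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        aug = min(cap[e] for e in path)
        for u, v in path:
            cap[(u, v)] -= aug
            cap[(v, u)] += aug
        flow += aug

def downside_flow(arcs, s, t, survive_p, n_sims=2000, quantile=0.05, seed=0):
    """Low quantile of max flow when each arc survives independently."""
    rng = random.Random(seed)
    flows = sorted(
        max_flow([a for a in arcs if rng.random() < survive_p], s, t)
        for _ in range(n_sims))
    return flows[int(quantile * n_sims)]
```

A risk measure of this kind focuses on the flow achieved in the worst scenarios, rather than on the expected flow.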
By:  Borgy, V.; Idier, I.; Le Fol, G. 
Abstract:  Even though the FX market is one of the most liquid financial markets, it would be an error to consider it immune to liquidity problems. This paper analyzes, over a long sample (2000–2009), the full set of quotes and transactions in three main currency pairs (EUR/JPY, EUR/USD, USD/JPY) on the EBS platform. To characterize FX market liquidity, we consider the spread, the traded volume, the number of transactions and the Amihud (2002) illiquidity statistic. We also propose the computation of a new liquidity indicator, BIL, that relies solely on the availability of price series. The main benefit of such a measure is that it is easily calculated on almost any financial market and has a clear interpretation in terms of liquidity costs. Using all these liquidity measures, we then test their accuracy in detecting liquidity problems in the FX market. Our analysis, based on a signaling approach, shows that liquidity problems arose during specific episodes in the early 2000s and, more generally, during the recent financial turmoil. 
Keywords:  FX market, Liquidity, financial crisis. 
JEL:  G15 F31 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:bfr:banfra:279&r=rmg 
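Of the liquidity measures listed in the abstract, the Amihud (2002) statistic has a simple closed form: the average of absolute return per unit of traded volume, with higher values indicating lower liquidity (a given volume moves the price more). The sketch below follows that standard definition; the price and volume series are placeholders, not EBS data, and the BIL indicator is not reproduced since its construction is specific to the paper.

```python
def amihud_illiquidity(prices, volumes):
    """Amihud (2002) illiquidity: mean of |return_t| / volume_t.

    prices[t] and volumes[t] are aligned per period; periods with zero
    volume are skipped since the ratio is undefined there.
    """
    ratios = []
    for t in range(1, len(prices)):
        ret = abs(prices[t] / prices[t - 1] - 1.0)
        if volumes[t] > 0:
            ratios.append(ret / volumes[t])
    return sum(ratios) / len(ratios)
```

Holding the price path fixed, doubling traded volume halves the statistic, which is exactly the "price impact per unit of volume" reading of illiquidity.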