nep-rmg New Economics Papers
on Risk Management
Issue of 2010‒05‒08
nine papers chosen by
Stan Miles
Thompson Rivers University

  1. Pension entitlements of French and American households: a first comparison. By Durant, D.; Frey, L.
  2. Minimum Tracking Error Volatility By Luca RICCETTI
  3. A non-parametric model-based approach to uncertainty and risk analysis of macroeconomic forecasts By Claudia Miani; Stefano Siviero
  4. Characteristic Timing By Robin Greenwood; Samuel Hanson
  5. Risk measuring under model uncertainty By Jocelyne Bion-Nadal; Magali Kervarec
  6. Memory effect and multifractality of cross-correlations in financial markets By Tian Qiu; Guang Chen; Li-Xin Zhong; Xiao-Wei Lei
  7. Is Bigger Always Better? The Effect of Size on Defaults By Giulio Bottazzi; Federico Tamagni
  8. Evaluating Downside Risks in Reliable Networks By Sharma Megha; Ghosh Diptesh
  9. Liquidity problems in the FX liquid market: Ask for the "BIL". By Borgy, V.; Idier, I.; Le Fol, G.

  1. By: Durant, D.; Frey, L.
    Abstract: The aim of this paper is to build and estimate a macroeconomic model of credit risk for the French manufacturing sector. This model is based on Wilson's CreditPortfolioView model (1997a, 1997b); it enables us to simulate loss distributions for a credit portfolio for several macroeconomic scenarios. We implement two simulation procedures based on two assumptions relative to probabilities of default (PDs): in the first procedure, firms are assumed to have identical default probabilities; in the second, individual risk is taken into account. The empirical results indicate that these simulation procedures lead to quite different loss distributions. For instance, a negative one standard deviation shock on output leads to a maximum loss of 3.07% of the financial debt of the French manufacturing sector, with a probability of 99%, under the identical default probability hypothesis versus 2.61% with individual default probabilities.
    Keywords: Consumption and savings, pension funds, social security and public pensions, portfolio choices and investment decisions.
    JEL: E21 G11 G23 H55
    Date: 2010
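    The two simulation procedures described in the abstract can be illustrated with a minimal Monte Carlo sketch (the portfolio below is made-up toy data, not the paper's French manufacturing sample): firms default independently, once with a common average PD and once with firm-specific PDs, and the 99% loss quantiles are compared.

    ```python
    import random

    def simulate_losses(exposures, pds, n_sims=10_000, seed=0):
        """Monte Carlo loss distribution: each firm defaults independently
        with its own probability of default (PD); the scenario loss is the
        sum of defaulted exposures, expressed as a share of total debt."""
        rng = random.Random(seed)
        total = sum(exposures)
        losses = []
        for _ in range(n_sims):
            loss = sum(e for e, p in zip(exposures, pds) if rng.random() < p)
            losses.append(loss / total)
        return losses

    def quantile(xs, q):
        xs = sorted(xs)
        return xs[min(int(q * len(xs)), len(xs) - 1)]

    # Toy portfolio: 100 firms with random exposures and PDs
    rng = random.Random(1)
    exposures = [rng.uniform(1, 10) for _ in range(100)]
    individual_pds = [rng.uniform(0.005, 0.05) for _ in range(100)]

    # Procedure 1: identical PDs (the portfolio average)
    avg_pd = sum(individual_pds) / len(individual_pds)
    loss_identical = simulate_losses(exposures, [avg_pd] * 100)

    # Procedure 2: firm-specific PDs
    loss_individual = simulate_losses(exposures, individual_pds)

    print("99% loss, identical PDs:  %.3f" % quantile(loss_identical, 0.99))
    print("99% loss, individual PDs: %.3f" % quantile(loss_individual, 0.99))
    ```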
  2. By: Luca RICCETTI (Universita' Politecnica delle Marche, Dipartimento di Economia)
    Abstract: Investors assign part of their funds to asset managers who are given the task of beating a benchmark. The risk management department usually imposes a maximum value of the tracking error volatility (TEV) in order to keep the risk of the portfolio close to that of the selected benchmark. However, risk management does not establish a rule on TEV that reveals whether the asset manager is really active, and in practice asset managers sometimes passively follow the corresponding index. Moreover, the benchmark can be difficult to beat when risk managers only check that portfolio managers do not exceed a fixed level of relative risk. I derive analytical methods for determining whether the portfolio manager's strategy is genuinely active: active enough to generate an excess return over the benchmark large enough to cover the commissions paid by investors, while keeping the portfolio's variance no higher than the benchmark's, so that the excess return is not merely the reward for a higher risk level (using variance as the risk indicator). These equations give a necessary (but not sufficient) condition for beating the benchmark's return without increasing the overall variance of the portfolio. This also generalizes the model of Jorion (2003) by incorporating commissions. Applying these equations to an Italian liquidity fund, I find that the fees are too high and the TEV is low. Indeed, all the funds in the liquidity category show similar problems, which often leave the portfolio unable to cover its fees without increasing the variance.
    Keywords: Active Management, Benchmarking, Commissions, Portfolio Choice, Risk Management, Tracking Error
    JEL: C61 G10 G11 G23
    Date: 2010–04
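    As a point of reference, tracking error volatility is simply the (annualized) standard deviation of active returns. A minimal sketch, with made-up monthly return series for a fund that hugs its benchmark:

    ```python
    import math

    def tracking_error_volatility(port_returns, bench_returns, periods_per_year=12):
        """Annualized tracking error volatility: the standard deviation of
        active returns (portfolio minus benchmark), scaled to a yearly rate."""
        active = [p - b for p, b in zip(port_returns, bench_returns)]
        mean = sum(active) / len(active)
        var = sum((a - mean) ** 2 for a in active) / (len(active) - 1)
        return math.sqrt(var) * math.sqrt(periods_per_year)

    # Monthly returns for a near-passive ("closet indexing") fund
    bench = [0.010, -0.004, 0.007, 0.012, -0.002, 0.005]
    fund  = [0.011, -0.004, 0.006, 0.012, -0.001, 0.005]
    tev = tracking_error_volatility(fund, bench)
    print("Annualized TEV: %.4f" % tev)  # a very low TEV signals passive management
    ```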
  3. By: Claudia Miani (Bank of Italy); Stefano Siviero (Bank of Italy)
    Abstract: It has increasingly become standard practice to supplement point macroeconomic forecasts with an appraisal of the degree of uncertainty and the prevailing direction of risks. Several alternative approaches have been proposed in the literature to compute the probability distribution of macroeconomic forecasts; all of them rely on combining the predictive density of model-based forecasts with subjective judgment about the direction and intensity of prevailing risks. We propose a non-parametric, model-based simulation approach, which does not require specific assumptions to be made regarding the probability distribution of the sources of risk. The probability distribution of macroeconomic forecasts is computed as the result of model-based stochastic simulations which rely on re-sampling from the historical distribution of risk factors and are designed to deliver the desired degree of skewness. By contrast, other approaches typically make a specific, parametric assumption about the distribution of risk factors. The approach is illustrated using the Bank of Italy’s Quarterly Macroeconometric Model. The results suggest that the distribution of macroeconomic forecasts quickly tends to become symmetric, even if all risk factors are assumed to be asymmetrically distributed.
    Keywords: macroeconomic forecasts, stochastic simulations, balance of risks, uncertainty, fan-charts
    JEL: C14 C53 E37
    Date: 2010–04
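    A minimal sketch of the re-sampling idea, assuming a simple tilted bootstrap as the skewing device (the Bank of Italy model itself is not reproduced here; the shocks and forecast below are illustrative):

    ```python
    import random

    def fan_chart_density(point_forecast, historical_shocks, skew_weight=0.0,
                          n_draws=20_000, seed=0):
        """Non-parametric forecast distribution: re-sample historical shocks
        (bootstrap) and tilt the sampling toward one tail to impose the desired
        balance of risks. skew_weight > 0 over-samples positive shocks,
        skew_weight < 0 over-samples negative ones."""
        rng = random.Random(seed)
        weights = [1.0 + skew_weight * (1 if s > 0 else -1)
                   for s in historical_shocks]
        draws = rng.choices(historical_shocks, weights=weights, k=n_draws)
        return [point_forecast + s for s in draws]

    # Symmetric historical shocks, downside balance of risks
    shocks = [0.3, -0.3, 0.5, -0.5, 0.1, -0.1, 0.8, -0.8]
    dist = fan_chart_density(point_forecast=1.5, historical_shocks=shocks,
                             skew_weight=-0.4)
    dist.sort()
    print("10th/50th/90th percentile:",
          dist[len(dist) // 10], dist[len(dist) // 2], dist[9 * len(dist) // 10])
    ```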
  4. By: Robin Greenwood; Samuel Hanson
    Abstract: We use differences between the attributes of stock issuers and repurchasers to forecast characteristic-related stock returns. For example, we show that large firms underperform following years when issuing firms are large relative to repurchasing firms. Our approach is useful for forecasting returns to portfolios based on book-to-market (HML), size (SMB), price, distress, payout policy, profitability, and industry. We consider interpretations of these results based on both time-varying risk premia and mispricing. Our results are primarily consistent with the view that firms issue and repurchase shares to exploit time-varying characteristic mispricing.
    JEL: G14 G3 G32
    Date: 2010–04
  5. By: Jocelyne Bion-Nadal; Magali Kervarec
    Abstract: The framework of this paper is that of uncertainty, that is, when no reference probability measure is given. To every convex regular risk measure $\rho$ on ${\cal C}_b(\Omega)$, we associate a canonical $c_{\rho}$-class of probability measures. Furthermore, the convex risk measure admits a dual representation in terms of a weakly relatively compact set of probability measures absolutely continuous with respect to some probability measure belonging to the canonical $c_{\rho}$-class. To obtain these results we study the topological properties of the dual of the Banach space $L^1(c)$ associated to some capacity $c$, and we prove a representation theorem for convex risk measures on $L^1(c)$. As applications, we obtain that every $G$-expectation $\E$ (resp., in the case of uncertain volatility, every sublinear risk measure $\rho$) admits a representation with a countable family of probability measures absolutely continuous with respect to some $P$ belonging to the canonical $c$-class, with $c(f)=\E(|f|)$ (resp. $\rho(-|f|)$).
    Date: 2010–04
  6. By: Tian Qiu; Guang Chen; Li-Xin Zhong; Xiao-Wei Lei
    Abstract: An average instantaneous cross-correlation function is introduced to quantify the interaction of the financial market at a specific time. Based on daily data for the American and Chinese stock markets, the memory effect of the average instantaneous cross-correlations is investigated over different price-return time intervals. Long-range time correlations are revealed and are found to persist up to price-return time intervals on the order of a month. The multifractal nature of the cross-correlations is investigated using multifractal detrended fluctuation analysis.
    Date: 2010–04
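    For reference, the ordinary (q = 2) detrended fluctuation analysis that MF-DFA generalizes can be sketched as follows; the series here is synthetic white noise, not the stock-market data used in the paper:

    ```python
    import math
    import random

    def dfa_fluctuation(series, window):
        """Detrended fluctuation analysis at one scale: integrate the series,
        split the profile into windows, remove a linear trend in each window,
        and return the root-mean-square residual F(window). MF-DFA generalizes
        this by raising the residuals to a power q before averaging."""
        # profile: cumulative sum of deviations from the mean
        mean = sum(series) / len(series)
        profile, c = [], 0.0
        for x in series:
            c += x - mean
            profile.append(c)
        n_windows = len(profile) // window
        sq_sum, count = 0.0, 0
        for w in range(n_windows):
            seg = profile[w * window:(w + 1) * window]
            # least-squares linear fit within the window
            xs = list(range(window))
            xbar, ybar = (window - 1) / 2, sum(seg) / window
            slope = (sum(x * y for x, y in zip(xs, seg)) - window * xbar * ybar) \
                    / (sum(x * x for x in xs) - window * xbar ** 2)
            intercept = ybar - slope * xbar
            for x, y in zip(xs, seg):
                sq_sum += (y - (slope * x + intercept)) ** 2
                count += 1
        return math.sqrt(sq_sum / count)

    # F(s) vs window size s: the slope of log F against log s (the Hurst
    # exponent) above 0.5 indicates long-range correlations of the kind
    # the paper reports; for white noise it is close to 0.5
    random.seed(0)
    noise = [random.gauss(0, 1) for _ in range(4000)]
    for s in (10, 20, 40, 80):
        print("s=%3d  F=%.3f" % (s, dfa_fluctuation(noise, s)))
    ```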
  7. By: Giulio Bottazzi; Federico Tamagni
    Abstract: Exploiting a large database of Italian manufacturing firms, we investigate the relationship between default rates and firm size. Default events, defined as conditions of actual or likely insolvency, signal deep business trouble. They are unanticipated, costly and dangerous for the firm as well as for the economy, and should in principle be avoided. Our evidence, based on data provided by a large Italian banking group, reveals that firms' default probability increases with their size. This finding contrasts with typical results on exit events based on business-registry data, and suggests revising the common wisdom that sees the core of the industry as a safe place and its members as the most valuable economic assets.
    Keywords: firm default and exit, firm size, bootstrap probit regressions.
    JEL: C14 C25 G30 L11
    Date: 2010–05–01
  8. By: Sharma Megha; Ghosh Diptesh
    Abstract: Reliable networks are those in which network elements have a positive probability of failing. Conventional performance measures for such networks concern themselves either with expected network performance or with the performance of the network when it is performing well. In reliable networks modeling critical functions, decision makers are often more concerned with network performance when the network is not performing well. In this paper, we study the single-source single-destination maximum flow problem through reliable networks and propose two risk measures to evaluate such downside performance. We propose an algorithm called COMPUTE-RISK to compute downside risk measures, and report our computational experience with the proposed algorithm.
    Date: 2009–09–29
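    The paper's COMPUTE-RISK algorithm is not reproduced here, but the basic idea of evaluating downside flow performance under random element failures can be sketched with a standard Edmonds-Karp max-flow routine and a toy network (all capacities, failure probabilities and the design-flow threshold below are illustrative):

    ```python
    import random
    from collections import deque

    def max_flow(n, capacity, s, t):
        """Edmonds-Karp max flow on an n-node graph given a capacity matrix."""
        cap = [row[:] for row in capacity]
        flow = 0
        while True:
            # BFS for a shortest augmenting path in the residual graph
            parent = [-1] * n
            parent[s] = s
            q = deque([s])
            while q and parent[t] == -1:
                u = q.popleft()
                for v in range(n):
                    if cap[u][v] > 0 and parent[v] == -1:
                        parent[v] = u
                        q.append(v)
            if parent[t] == -1:
                return flow
            # find the bottleneck along the path, then push flow
            v, bottleneck = t, float("inf")
            while v != s:
                u = parent[v]
                bottleneck = min(bottleneck, cap[u][v])
                v = u
            v = t
            while v != s:
                u = parent[v]
                cap[u][v] -= bottleneck
                cap[v][u] += bottleneck
                v = u
            flow += bottleneck

    # 4-node network: 0 = source, 3 = sink; each edge survives with prob. 0.9
    edges = {(0, 1): 5, (0, 2): 5, (1, 3): 5, (2, 3): 5, (1, 2): 3}
    rng = random.Random(0)
    flows = []
    for _ in range(5000):
        cap = [[0] * 4 for _ in range(4)]
        for (u, v), c in edges.items():
            if rng.random() < 0.9:          # edge survives this scenario
                cap[u][v] = c
        flows.append(max_flow(4, cap, 0, 3))

    # one possible downside measure: expected shortfall below a design flow of 10
    shortfalls = [10 - f for f in flows if f < 10]
    print("mean flow: %.2f" % (sum(flows) / len(flows)))
    print("P(flow < 10) = %.3f, expected shortfall = %.2f"
          % (len(shortfalls) / len(flows),
             sum(shortfalls) / max(len(shortfalls), 1)))
    ```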
  9. By: Borgy, V.; Idier, I.; Le Fol, G.
    Abstract: Even though the FX market is one of the most liquid financial markets, it would be a mistake to consider it immune to liquidity problems. This paper analyzes, over a long sample (2000-2009), the full set of quotes and transactions in three main currency pairs (EURJPY, EURUSD, USDJPY) on the EBS platform. To characterize FX market liquidity, we consider the spread, the traded volume, the number of transactions and the Amihud (2002) illiquidity statistic. We also propose a new liquidity indicator, BIL, that relies solely on the availability of price series. The main benefit of this measure is that it can be easily calculated on almost any financial market and has a clear interpretation in terms of liquidity costs. Using all these liquidity measures, we then test their accuracy in detecting liquidity problems in the FX market. Our analysis, based on a signaling approach, shows that liquidity problems arose during specific episodes in the early 2000s and, more generally, during the recent financial turmoil.
    Keywords: FX market, Liquidity, financial crisis.
    JEL: G15 F31
    Date: 2010
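    Among the measures listed, the Amihud (2002) statistic is straightforward to sketch (the price and volume series below are illustrative, not EBS data); the BIL indicator is the paper's own construction and is not reproduced here:

    ```python
    def amihud_illiquidity(prices, volumes):
        """Amihud (2002) illiquidity ratio: the average of |daily return| per
        unit of traded volume. Higher values mean prices move more for a given
        amount of trading, i.e. lower liquidity."""
        ratios = []
        for i in range(1, len(prices)):
            ret = abs(prices[i] / prices[i - 1] - 1)
            if volumes[i] > 0:
                ratios.append(ret / volumes[i])
        return sum(ratios) / len(ratios)

    # Toy daily price/volume series in the style of a currency pair
    prices  = [1.300, 1.302, 1.298, 1.305, 1.301]
    volumes = [0, 1_000, 800, 1_200, 900]
    print("Amihud illiquidity: %.3e" % amihud_illiquidity(prices, volumes))
    ```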

This nep-rmg issue is ©2010 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at <>. For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject line, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.