
on Risk Management 
Issue of 2017‒01‒08
twenty papers chosen by 
By:  Dacorogna, Michel M; Busse, Marc 
Abstract:  After reviewing the notion of a Systemically Important Financial Institution (SIFI), we propose a first-principles way to compute the price of the implicit put option that the State grants to such an institution. Our method is based on two important results from Extreme Value Theory (EVT), one for the aggregation of heavy-tailed distributions and the other for the tail behavior of the Value-at-Risk (VaR) versus the Tail-Value-at-Risk (TVaR). We show that, in practice, the value of this option is proportional to the VaR of the institution and would thus provide the wrong incentive to banks, even if the guarantee is not explicitly granted. We conclude with a proposal to make the institution pay the price of this option to a fund, whose task would be to guarantee the orderly bankruptcy of such an institution. This fund would function like an insurer selling a cover to its clients. 
Keywords:  Systemic Risk; "Too Big to Fail"; Risk Measure; Value-at-Risk and Tail-Value-at-Risk; Option Price; Risk Neutral Distribution; Heavy Tail; Pareto; Insurance 
JEL:  C10 E58 E61 
Date:  2016–06–27 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:75787&r=rmg 
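The EVT fact this abstract leans on, that for a Pareto tail with index alpha > 1 the ratio TVaR/VaR approaches alpha/(alpha - 1), is easy to verify numerically. The sketch below is our own construction with invented parameters (alpha = 2.5, level p = 0.99), not the authors' code:

```python
import numpy as np

# For a standard Pareto(alpha) on [1, inf): S(x) = x**(-alpha).
# VaR_p = (1-p)**(-1/alpha); the tail above any threshold is again Pareto,
# so TVaR_p = alpha/(alpha-1) * VaR_p.
rng = np.random.default_rng(0)
alpha, p = 2.5, 0.99

var_closed = (1 - p) ** (-1 / alpha)             # closed-form VaR at level p
tvar_closed = alpha / (alpha - 1) * var_closed   # closed-form TVaR at level p

# Monte Carlo check of both quantities and of the ratio
x = rng.pareto(alpha, size=2_000_000) + 1.0      # standard Pareto samples
var_mc = np.quantile(x, p)
tvar_mc = x[x >= var_mc].mean()
ratio = tvar_mc / var_mc                          # should approach 5/3 here
```

With a finite tail index the ratio is a constant above 1, which is the lever the abstract uses to argue that a VaR-proportional valuation misprices the implicit guarantee.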
By:  Brunella Bruno; Giacomo Nocera; Andrea Resti 
Abstract:  Supranational institutions, academics and market analysts have increasingly questioned the reliability of bank risk-weighted assets (RWAs), a cornerstone of the system of minimum capital ratios designed by the Basel Committee on Banking Supervision. In fact, significant differences can be found in the banks’ average risk weights, both over time and across countries. Such differences can be explained by several factors, some of which may reflect the actual risk content of banks’ assets, while others may conceal distortions due to “RWA tweaking” and supervisory segmentations. We analyze a sample of 50 large European banks between 2008 and 2012 and document several meaningful findings. First, risk weights are affected by the banks’ size, business model and asset mix. Second, the adoption of internal ratings-based (IRB) approaches is (as expected) a powerful driver of bank risk-weighted assets. Third, lower risk weights are positively linked to the banks’ capital cushion. Fourth, IRB adoption is more widespread in countries where supervisory capture is potentially stronger, due to a banking industry that is both larger (compared to GDP) and concentrated. Fifth, regulatory risk weights are not disconnected from market-based measures of bank risk. 
Keywords:  Banks, capital, risk-weighted assets, regulation, Basel accords 
JEL:  G21 G28 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:baf:cbafwp:cbafwp1509&r=rmg 
By:  De Koning, Kees 
Abstract:  In finance, systemic risk is the risk of collapse of an entire financial system or entire market, as opposed to the risk associated with any individual entity, group or component of a system, which can be contained without harming the whole system. This definition clearly applies to the financial crisis of 2007‒2008 and to all its constituent parties, be they the banking sector, the mortgage bondholders or the collective of individual mortgagors in the U.S. The systemic risks to the lenders have been well documented, but the same does not apply to the borrowers. For the latter, the fact that, between 2005 and 2014, more than 45% of homeowner-occupiers with a mortgage were confronted with foreclosure proceedings implies that the funding structure of the total U.S. mortgage portfolio made mortgagors vulnerable to loan default pressures: a serious systemic risk for mortgagors. Such pressures do not arise overnight, but rather build over a number of years. How this pressure grew is shown with the help of two indices: one linking mortgage debt to income, by comparing total U.S. mortgage debt with nominal GDP levels, and the other linking annual mortgage lending volumes to average new house prices over the same years. The second index is split into one based on actual average house prices and another on house prices adjusted for CPI inflation. The paper covers the period 1997‒2015. One cannot address households’ systemic risk factors as if they were an individual household’s own problem. It was a collective problem caused by systemic factors, and managing such events should have been organized on a collective basis. The sooner it is recognized that the collective of mortgagors can experience systemic risk pressures – just like banks and mortgage bondholders – the quicker solutions can be found to overcome such pressures. 
If, by 2003, priority had been given to solving systemic risks to U.S. mortgagors, the U.S. and quite a few other countries would be in a much better place at present. 
Keywords:  systemic risk, U.S. mortgagors, foreclosures, mortgage lending ceiling, system deficiencies, interest instrument, quantitative easing, traffic light system, National Mortgage Bank 
JEL:  E32 E4 E44 E6 E61 G2 
Date:  2016–12–19 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:75688&r=rmg 
By:  Claudia Klüppelberg; Miriam Isabel Seifert 
Abstract:  We analyze systems of agents sharing light-tailed risky claims issued by different financial objects. Assuming exponentially distributed claims, we obtain that both agents' and system's losses follow generalized exponential mixture distributions. We show that this leads to qualitatively different results on individual and system risks compared to heavy-tailed claims previously studied in the literature. By deducing conditional loss distributions we investigate the impact of stress situations on agents' and system's losses. Moreover, we present a criterion for agents to decide whether holding a few objects or portfolio diversification minimizes their risks in system crisis situations. 
Date:  2016–12 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1612.07132&r=rmg 
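The building block behind the abstract's "generalized exponential mixture" claim is that a sum of independent exponential claims has a mixture-of-exponentials law. A minimal check, with rates of our own choosing: for S = X + Y, X ~ Exp(l1), Y ~ Exp(l2), l1 != l2, the survival function is P(S > t) = (l2·e^(−l1·t) − l1·e^(−l2·t)) / (l2 − l1).

```python
import numpy as np

# Monte Carlo check of the hypoexponential survival function for the sum
# of two independent exponential claims with distinct rates l1, l2.
rng = np.random.default_rng(1)
l1, l2, t = 1.0, 2.0, 1.5

# numpy's exponential() is parameterized by the scale = 1/rate
s = rng.exponential(1 / l1, 1_000_000) + rng.exponential(1 / l2, 1_000_000)
p_mc = (s > t).mean()
p_closed = (l2 * np.exp(-l1 * t) - l1 * np.exp(-l2 * t)) / (l2 - l1)
```

The closed form follows from a partial-fraction decomposition of the product of the two Laplace transforms; the paper's mixture representations generalize this to many agents and objects.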
By:  João Barata Ribeiro Blanco Barroso; Thiago Christiano Silva; Sergio Rubens Stancato de Souza 
Abstract:  In this paper, we propose a methodology to decompose drivers of systemic risk that arise due to insolvency contagion in evolving financial networks. There is an ongoing discussion on how network topology and capital buffers influence systemic risk. On the one hand, the network contagion literature tends to emphasize the influence of the network topology. On the other hand, policy works tend to discuss restrictions on the capital buffers of financial institutions. Systemic risk is usually a complex function of both risk drivers, and isolating the contribution of each risk driver to systemic risk is therefore not a trivial task. Our decomposition methodology identifies and isolates these effects. We apply our methodology to the global banking network and find that the network topology effect explains most of the systemic risk measure's volatility. Additionally, we show that the capital buffer effect explains the persistent reduction in systemic risk buildup, with effects concentrated around the global financial crisis. Our results confirm the importance of both risk drivers to measuring systemic risk. 
Date:  2016–12 
URL:  http://d.repec.org/n?u=RePEc:bcb:wpaper:448&r=rmg 
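The insolvency-contagion mechanism the abstract decomposes can be illustrated with a textbook default cascade. This is a generic sketch (our toy balance sheets, a worst-case loss-given-default of 100%), not the authors' decomposition:

```python
import numpy as np

# exposures[i, j] is bank i's claim on bank j; a failing bank is assumed
# to wipe out its creditors' claims in full (loss given default = 1).
def cascade(equity, exposures, shocked):
    """Return the boolean array of failed banks once contagion settles."""
    n = len(equity)
    failed = np.zeros(n, dtype=bool)
    failed[shocked] = True
    while True:
        # each bank loses its claims on every currently failed bank
        loss = exposures[:, failed].sum(axis=1)
        new_failed = (equity - loss <= 0) | failed
        if (new_failed == failed).all():
            return failed
        failed = new_failed

equity = np.array([2.0, 1.0, 1.5])
exposures = np.array([[0.0, 1.5, 0.0],
                      [0.0, 0.0, 2.0],
                      [0.0, 0.0, 0.0]])
# Shocking bank 2 fails bank 1 (its 2.0 claim exceeds its 1.0 equity);
# bank 0 absorbs the loss of its 1.5 claim on bank 1 and survives.
failed = cascade(equity, exposures, shocked=[2])
```

Rerunning such a cascade while varying either the exposure matrix (topology) or the equity vector (capital buffers), one at a time, is the spirit of the decomposition the paper formalizes.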
By:  Bekiros, Stelios; Boubaker, Sabri; Nguyen, Duc Khuong; Uddin, Gazi Salah 
Abstract:  There is evidence to suggest that gold acts as both a hedge and a safe haven for equity markets over recent years, and particularly during crisis periods. Our work extends the recent literature on the hedging and diversification roles of gold by analyzing its interaction with the stock markets of the leading emerging economies, the BRICS. Whilst they generally exhibit a high growth rate, these economies still experience a pronounced vulnerability to external shocks, particularly to commodity price fluctuations. Using a multiscale wavelet approach and a GARCH-based copula methodology, we mainly show evidence of: i) time-scale co-evolvement patterns between BRICS stock markets and the gold market, with some profound regions of concentrated extreme variations; and ii) a strong time-varying asymmetric dependence structure between those markets. These findings are essential for risk diversification and portfolio hedging strategies amongst the investigated markets. 
Keywords:  Equity markets; Copulas; Gold; Time-scale analysis 
JEL:  C14 C32 C51 G1 
Date:  2015–10 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:75740&r=rmg 
By:  Michele Fabrizi; Elisabetta Ipino; Michel Magnan; Antonio Parbonetti 
Abstract:  The 2007–2009 financial crisis has reignited a long-running debate about the relative merits of historical cost accounting (HCA) and fair value accounting as foundations for prudential oversight, including the calculation of regulatory capital. Available-for-sale securities provide a good setting in which to explore this issue further. Using a sample of 5,333 firm-year observations representing 721 unique U.S. banks and bank holding companies between 1998 and 2013, we present evidence that regulatory capital based on HCA induces banks to engage in gains trading activities to improve their capital position and pay dividends. We also document that banks experiencing a decrease in regulatory capital and banks with a higher percentage of institutional investors are more prone to engage in gains trading to pay dividends. Finally, our findings reveal that, to counterbalance the increased risk, banks change their lending behavior and decrease the riskiness of their trading portfolios. Overall, our results reveal the potential side effects linked to the use of HCA as a foundation for computing regulatory capital and suggest that HCA is not a panacea. 
Keywords:  Banks, Regulatory capital, Available-for-sale securities, Realized gains, Realized losses, Dividend payout 
Date:  2016–12–19 
URL:  http://d.repec.org/n?u=RePEc:cir:cirwor:2016s57&r=rmg 
By:  Leonella Gori; Barbara Chizzolini; Stefano Gatti 
Abstract:  In this paper we propose a measure of the riskiness of PE assets as an alternative to the CAPM-derived beta coefficient usually suggested in the literature on the performance of PE Funds. We assume that at any given point in time there exist alternative investment opportunities that can be classified into a limited number of types, and that Funds manage their Portfolios “optimally”, within the range of investments allowed by their Placement Memorandum. We first estimate a discrete choice model of the Fund Managers’ investment decisions by type of PE investment, as a function of the observed characteristics of the Fund, of the Deal and of the Portfolio Company, as well as of the year when the deal is closed. Given the chosen type of investment, we then estimate the probability of negative returns for each deal in each investment class. These predicted probabilities, together with the historical expected shortfalls by investment type, yield the Expected Loss by deal, the measure of pure risk we propose in this paper. We find that it is possible to identify the idiosyncratic features of each investment type and that the patterns and degrees of riskiness differ quite significantly among them. 
Date:  2016 
URL:  http://d.repec.org/n?u=RePEc:baf:cbafwp:cbafwp1625&r=rmg 
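The risk measure the abstract describes factorizes as Expected Loss = P(negative return) × E[loss | loss occurs]. A minimal numerical sketch, using a hypothetical normal distribution of deal returns of our own invention (the paper estimates these pieces from data, not from a parametric toy):

```python
import numpy as np

# Hypothetical deal returns: mean 8%, volatility 25% (invented parameters).
rng = np.random.default_rng(2)
returns = rng.normal(0.08, 0.25, 1_000_000)

p_neg = (returns < 0).mean()              # estimated P(return < 0)
shortfall = -returns[returns < 0].mean()  # expected shortfall given a loss
expected_loss = p_neg * shortfall         # the paper's "Expected Loss by deal"
```

In the paper, p_neg comes from a model conditional on the chosen investment type, and the shortfall from historical data by type; the toy only shows how the two pieces combine.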
By:  Bruno Bouchard (CEREMADE – CEntre de REcherches en MAthématiques de la DEcision – CNRS – Centre National de la Recherche Scientifique – Université Paris-Dauphine; CREST – Centre de Recherche en Économie et Statistique – INSEE – École Nationale de la Statistique et de l'Administration Économique); Ludovic Moreau (Department of Mathematics, ETH Zürich – Swiss Federal Institute of Technology in Zurich); Mete Soner (Department of Mathematics, ETH Zürich – Swiss Federal Institute of Technology in Zurich) 
Abstract:  We consider the problem of option hedging in a market with proportional transaction costs. Since super-replication is very costly in such markets, we replace perfect hedging with an expected loss constraint. Asymptotic analysis for small transaction costs is used to obtain a tractable model. A general expansion theory is developed using the dynamic programming approach. Explicit formulae are also obtained in the special cases of an exponential or power loss function. As a corollary, we retrieve the asymptotics for the exponential utility indifference price. 
Keywords:  asymptotic expansion, expected loss constraint, hedging, transaction costs 
Date:  2016–06–01 
URL:  http://d.repec.org/n?u=RePEc:hal:journl:hal00863562&r=rmg 
By:  Apicella, Giovanna; Dacorogna, Michel M 
Abstract:  The need for a good knowledge of the degree of dependence between various risks is fundamental for understanding their real impacts and consequences, since dependence reduces the possibility to diversify the risks. This paper expands, in a more theoretical direction, the methodology developed in earlier work for exploring the dependence between mortality and market risks in case of stress. In particular, we investigate, using the Feller process, the relationship between mortality and interest rate risks. These are the primary sources of risk for life (re)insurance companies. We apply the Feller process to both mortality and interest rate intensities. Our study covers both short- and long-term interest rates (3m and 10y) as well as the mortality indices of ten developed countries over the same time horizon. Specifically, this paper deals with the stochastic modelling of mortality. We calibrate two different specifications of the Feller process (a two-parameter Feller process and a three-parameter one) to the survival probabilities of the generation of males born in 1940 in ten developed countries. Looking simultaneously at different countries gives us the possibility to find regularities that go beyond one particular case and are general enough to gain more confidence in the results. The calibration provides in most cases a very good fit to the data extrapolated from the mortality tables. On the basis of the principle of parsimony, we choose the two-parameter Feller process, namely the specification with the fewest assumptions. These results provide the basis to study the dynamics of both risks and their dependence. 
Keywords:  Mortality Model, Interest Rate Model, Dependence 
JEL:  C01 C15 C32 C40 
Date:  2016–07 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:75788&r=rmg 
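A Feller (CIR-type) process of the kind the abstract calibrates can be simulated with a simple Euler scheme. The sketch below uses a generic square-root diffusion dx = k(theta − x)dt + sigma·sqrt(x)dW with parameter values we invented for illustration; it is not the paper's calibrated mortality or interest-rate model:

```python
import numpy as np

# Generic two-parameter-family square-root (Feller/CIR) diffusion,
# simulated by full-truncation-style Euler (abs() keeps x nonnegative).
rng = np.random.default_rng(3)
k, theta, sigma = 0.5, 0.03, 0.05        # invented: mean reversion, level, vol
dt, n_steps, n_paths = 1 / 252, 2520, 10_000

x = np.full(n_paths, theta)              # start all paths at the long-run level
for _ in range(n_steps):
    dw = rng.normal(0.0, np.sqrt(dt), n_paths)
    x = np.abs(x + k * (theta - x) * dt + sigma * np.sqrt(np.abs(x)) * dw)

long_run_mean = x.mean()                 # should stay near theta
```

Note that 2·k·theta = 0.03 > sigma² = 0.0025 here, so the Feller condition holds and the true process stays strictly positive; the abs() only guards against Euler discretization overshoot.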
By:  Elena Di Bernardino (CEDRIC – Centre d'Etude et De Recherche en Informatique du Cnam – Conservatoire National des Arts et Métiers [CNAM]); Didier Rullière (SAF – Laboratoire de Sciences Actuarielle et Financière – UCBL – Université Claude Bernard Lyon 1) 
Abstract:  An important topic in Quantitative Risk Management concerns the modeling of dependence among risk sources, and in this regard Archimedean copulas appear to be very useful. However, they exhibit symmetry, which is not always consistent with patterns observed in real-world data. We investigate extensions of the Archimedean copula family that make it possible to deal with asymmetry. Our extension is based on the observation that the inverse of the generator of an Archimedean copula, applied to the copula itself, can be expressed as a linear form of generator inverses. We propose to add a distortion term to this linear part, which leads to asymmetric copulas. Parameters of this new class of copulas are grouped within a matrix, thus facilitating usual applications such as level-curve determination or estimation. Some choices, such as sub-model stability, help associate each parameter with one bivariate projection of the copula. We also give some admissibility conditions for the considered copulas. We propose different examples, such as natural multivariate extensions of the Farlie–Gumbel–Morgenstern or Gumbel–Barnett copulas. 
Keywords:  transformations of Archimedean copulas, Archimedean copulas 
Date:  2016–12–14 
URL:  http://d.repec.org/n?u=RePEc:hal:journl:hal01147778&r=rmg 
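For orientation on the symmetric family the paper asymmetrizes: an Archimedean copula such as Clayton can be sampled via the Marshall–Olkin frailty construction, and its exchangeability checked through Kendall's tau = theta/(theta + 2). This is a background sketch with our own parameter choices, not the authors' new copula class:

```python
import numpy as np

# Clayton copula via the frailty construction: with V ~ Gamma(1/theta)
# and independent standard exponentials E_i, U_i = (1 + E_i/V)**(-1/theta)
# is a Clayton-copula pair (psi(t) = (1+t)**(-1/theta) is the Laplace
# transform of V).
rng = np.random.default_rng(4)
theta, n = 2.0, 200_000

v = rng.gamma(1 / theta, 1.0, n)            # frailty variable
e = rng.exponential(1.0, (2, n))
u = (1.0 + e / v) ** (-1.0 / theta)         # rows are U1, U2 on [0, 1]

# Empirical Kendall's tau on a subsample, by counting concordant pairs
m = 2_000
a, b = u[0][:m], u[1][:m]
conc = np.sign(a[:, None] - a[None, :]) * np.sign(b[:, None] - b[None, :])
tau_hat = conc[np.triu_indices(m, 1)].mean()   # true value: theta/(theta+2)
```

By construction (U1, U2) and (U2, U1) have the same law, which is exactly the symmetry the paper's distortion term is designed to break.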
By:  Brown, Sarah (University of Sheffield); Gray, Daniel (University of Sheffield); Harris, Mark N. (Curtin University); Spencer, Christopher (Loughborough University) 
Abstract:  Analysing the US Panel Study of Income Dynamics, we present a new empirical method to investigate the extent to which households reduce their financial risk exposure when confronted with background risk. Our novel modelling approach – termed a deflated fractional ordered probit model – quantifies how the overall asset composition in a portfolio adjusts with background risk, and is unique in recovering, for any given risky asset class, the shares that are reallocated to a safer asset category. Background risk exerts a significant impact on household portfolios, resulting in a 'flight from risk', away from riskier to safe assets. 
Keywords:  asset allocation, background risk, flight from risk, fractional models 
JEL:  C33 C35 D14 G11 
Date:  2016–12 
URL:  http://d.repec.org/n?u=RePEc:iza:izadps:dp10408&r=rmg 
By:  Tirupam Goel 
Abstract:  This paper presents a general equilibrium model with a dynamic banking sector to characterize optimal size-dependent bank capital regulation (CR). Bank leverage choices are subject to the risk-return trade-off: high leverage increases expected return on capital, but also increases return variance and bank failure risk. Financial frictions imply that bank leverage choices are socially inefficient, providing scope for a welfare-enhancing CR that imposes a cap on bank leverage. The optimal CR is tighter relative to the pre-crisis benchmark. Optimal CR is also bank specific, and tighter for large banks than for small banks. This is for three reasons. First, allowing small banks to take more leverage enables them to potentially grow faster, leading to a growth effect. Second, although more leverage by small banks results in a higher exit rate, these exits are by the less efficient banks, leading to a cleansing effect. Third, failures are more costly among large banks, because these are more efficient in equilibrium and intermediate more capital. Therefore, tighter regulation for large banks renders them less prone to failure, leading to a stabilization effect. In terms of industry dynamics, tighter CR results in a smaller bank exit rate and a larger equilibrium mass of better capitalized banks, even though physical capital stock and wages are lower. The calibrated model rationalizes various steady state moments of the US banking industry, and provides general support for the Basel III G-SIB framework. 
Keywords:  Size distribution, entry & exit, heterogeneous agent models, size-dependent policy 
Date:  2016–12 
URL:  http://d.repec.org/n?u=RePEc:bis:biswps:599&r=rmg 
By:  Yehning Chen; Iftekhar Hasan 
Abstract:  This paper proposes that whether interconnectedness among banks leads to financial instability depends on banks’ leverage decisions. It extends the network model in Allen et al. (2012) to study the relationship between interconnectedness and the banks’ failure probability. In the model, banks adopt the Value-at-Risk rule to make their capital structure decisions and the risk of contagion is neglected. The paper finds that interconnectedness may either increase or decrease the banks’ failure probability. It also shows that interconnection is more harmful when banks are more overoptimistic about their prospects, and that financial integration may hurt financial stability. In addition, the adverse impact of interconnectedness on the banks’ failure probability can be alleviated if bank capital regulation is properly designed. This paper supports the conclusion in Allen and Gale (2000) that a complete financial system, in which each bank is connected to all the other banks, is superior to incomplete ones in which banks are connected to only a subset of the other banks. 
Keywords:  financial network, contagion, interconnectedness, diversification, bank capital regulation 
JEL:  G01 G21 
Date:  2016 
URL:  http://d.repec.org/n?u=RePEc:baf:cbafwp:cbafwp1643&r=rmg 
By:  Chunfa Wang 
Abstract:  The COS method proposed in Fang and Oosterlee (2008), although highly efficient, may lack robustness in a number of cases. In this paper, we present a stable method for pricing call options based on a Fourier cosine series expansion. The stability of the pricing method is demonstrated by error analysis, as well as by a series of numerical examples, including the Heston stochastic volatility model, the Kou jump-diffusion model, and the CGMY model. 
Date:  2017–01 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1701.00886&r=rmg 
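The core of the COS machinery is recovering a density from its characteristic function via a Fourier cosine expansion on a truncated interval [a, b]. The sketch below applies that step to a standard normal (whose characteristic function is exp(−u²/2)); it is a textbook Fang–Oosterlee-style recovery of our own choosing, not this paper's stabilized pricer:

```python
import numpy as np

# Cosine-expansion density recovery: f(x) ~ sum' F_k cos(k*pi*(x-a)/(b-a)),
# with F_k = (2/(b-a)) * Re[phi(k*pi/(b-a)) * exp(-1j*k*pi*a/(b-a))]
# and the prime meaning the k = 0 term is halved.
a, b, n_terms = -10.0, 10.0, 64
k = np.arange(n_terms)
u = k * np.pi / (b - a)

phi = np.exp(-u**2 / 2)                              # normal char. function
coef = (2.0 / (b - a)) * (phi * np.exp(-1j * u * a)).real
coef[0] *= 0.5                                       # halve the first term

x = 0.0
f_cos = np.sum(coef * np.cos(u * (x - a)))           # recovered density at 0
f_true = 1.0 / np.sqrt(2 * np.pi)                    # exact N(0,1) density at 0
```

Because the Gaussian characteristic function decays super-exponentially, 64 terms on [−10, 10] already recover the density to near machine precision; option prices in the COS method are then inner products of such coefficients with analytic payoff coefficients.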
By:  Olofsson, Sara (The Swedish Institute for Health Economics (IHE), Lund, Sweden); Gerdtham, Ulf-G (Department of Economics, Lund University); Hultkrantz, Lars (Örebro University, School of Business, Sweden); Persson, Ulf (The Swedish Institute for Health Economics (IHE), Lund, Sweden) 
Abstract:  To decide how much to spend on reducing mortality risk, governmental agencies in several countries turn to the value of a statistical life (VSL). VSL has been shown to vary with the size of the risk reduction, which indicates that WTP does not increase near-proportionally with the risk reduction, as standard economic theory suggests it should. The chained approach (CA) is a stated preference method that was designed to deal with this problem. The objective of this study was to compare CA with the more traditional contingent valuation (CV) approach. Data were collected from 500 individuals in the Swedish adult general population using two web-based questionnaires, one based on CA and the other on the CV method. Despite the two different ways of deriving the estimates, the methods showed similar results. The CV results showed scale insensitivity with respect to the size of the risk reduction and disease duration, and produced more zero and protest responses. The CA results also varied depending on the procedure used, but not when chaining on individual estimates. The CA results were also found to be more sensitive to disease duration and severity. This study provides support for the validity of studies of the WTP for a risk reduction. It also shows that CA has encouraging features for the valuation of non-fatal road traffic accidents, but the results do not support the use of one method over the other. 
Keywords:  contingent valuation; chained approach; scale sensitivity; risk reduction; willingness-to-pay 
JEL:  D61 D80 I18 J17 
Date:  2016–12–13 
URL:  http://d.repec.org/n?u=RePEc:hhs:lunewp:2016_034&r=rmg 
By:  Imre Kondor; G\'abor Papp; Fabio Caccioli 
Abstract:  A large portfolio of independent returns is optimized under the variance risk measure with a ban on short positions. The no-short-selling constraint acts as an asymmetric $\ell_1$ regularizer, setting some portfolio weights to zero and keeping the estimation error bounded, avoiding the divergence present in the non-regularized case. However, the susceptibility, i.e. the sensitivity of the optimal portfolio weights to changes in the returns, diverges at a critical value $2$ of the ratio $N/T$, where $N$ is the number of different assets in the portfolio and $T$ the length of the available time series. This means that a ban on short positions does not prevent the phase transition in the optimization problem; it merely shifts the critical point from its non-regularized value of $N/T=1$ to $2$. We show that this critical value is universal, independent of the distribution of the returns. Beyond this critical value, the variance of the portfolio vanishes for any portfolio weight vector constructed as a linear combination of the eigenvectors from the null space of the covariance matrix, but these linear combinations are not legitimate solutions of the optimization problem, as they are infinitely sensitive to any change in the input parameters; in particular, they will wildly fluctuate from sample to sample. We also calculate the distribution of the optimal weights over the random samples and show that the regularizer preferentially removes the assets with large variances, in accord with one's natural expectation. The analytic calculations are supported by numerical simulations, and the analytic and numerical results are in perfect agreement for $N/T < 2$. Standard numerical solvers may nevertheless appear to return meaningful solutions beyond this point; this is because there are regularizers built into these solvers that stabilize the otherwise freely fluctuating, meaningless solutions. 
Date:  2016–12 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1612.07067&r=rmg 
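The degeneracy the abstract warns about is easy to exhibit numerically: with N assets and only T < N observations, the sample covariance matrix is rank-deficient, so weight vectors with zero in-sample variance exist in its null space. A small sketch of our own (this only illustrates the rank mechanics, not the paper's replica-method calculation or the exact N/T = 2 transition):

```python
import numpy as np

# N = 50 assets observed over T = 30 periods: the sample covariance
# (which subtracts the sample mean) has rank at most T - 1 < N.
rng = np.random.default_rng(5)
n_assets, t_obs = 50, 30
returns = rng.normal(size=(t_obs, n_assets))

cov = np.cov(returns, rowvar=False)       # N x N sample covariance matrix
rank = np.linalg.matrix_rank(cov)         # generically equals t_obs - 1

# Any null-space direction has zero in-sample "variance" -- a spurious,
# sample-dependent solution, not a genuine minimum-variance portfolio.
eigvals, eigvecs = np.linalg.eigh(cov)
w = eigvecs[:, 0]                          # eigenvector of a ~zero eigenvalue
in_sample_var = w @ cov @ w
```

Such directions fluctuate wildly from sample to sample, which is the sense in which the "solutions" beyond the critical ratio are meaningless.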
By:  Katrien Antonio; Anastasios Bardoutsos; Wilbert Ouburg 
Abstract:  Life insurers, pension funds, health care providers and social security institutions face increasing expenses due to continuing improvements of mortality rates. The actuarial and demographic literature has introduced a myriad of (deterministic and stochastic) models to forecast mortality rates of single populations. This paper presents a Bayesian analysis of two related multi-population mortality models of log-bilinear type, designed for two or more populations. Using a larger set of data, multi-population mortality models allow joint modelling and projection of mortality rates by identifying characteristics shared by all subpopulations as well as subpopulation-specific effects on mortality. This is important when modeling and forecasting mortality of males and females or of regions within a country, and when dealing with index-based longevity hedges. Our first model is inspired by the two-factor Lee & Carter model of Renshaw and Haberman (2003) and the common factor model of Carter and Lee (1992). The second model is the augmented common factor model of Li and Lee (2005). This paper approaches both models in a statistical way, using a Poisson distribution for the number of deaths at a certain age and in a certain time period. Moreover, we use Bayesian statistics to calibrate the models and to produce mortality forecasts. We develop the technicalities necessary for Markov Chain Monte Carlo (MCMC) simulations and provide a software implementation (in R) for the models discussed in the paper. This approach has multiple key benefits: we jointly calibrate the Poisson likelihood for the number of deaths and the time series models imposed on the time-dependent parameters, we make full allowance for parameter uncertainty, and we are able to handle missing data as well as small sample populations. 
We compare and contrast results from both models with the results obtained from a frequentist single-population approach and from a least squares estimation of the augmented common factor model. 
Keywords:  projected life tables, multi-population stochastic mortality models, Bayesian statistics, Poisson regression, one-factor Lee & Carter model, two-factor Lee & Carter model, Li & Lee model, augmented common factor model 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:baf:cbafwp:cbafwp1505&r=rmg 
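For orientation on the log-bilinear structure underlying both models: the classic single-population Lee–Carter fit, log m(x,t) = a_x + b_x·k_t, can be obtained by centering and taking a rank-1 SVD. The sketch below runs that frequentist baseline on synthetic rates we generate ourselves; the paper's contribution is the Bayesian, Poisson-likelihood, multi-population extension, which this does not attempt:

```python
import numpy as np

# Synthetic log mortality surface with a known Lee-Carter structure.
rng = np.random.default_rng(6)
ages, years = 40, 30
a_true = np.linspace(-6.0, -2.0, ages)            # age profile alpha_x
b_true = np.full(ages, 1.0 / ages)                # age sensitivities beta_x
k_true = np.linspace(5.0, -5.0, years)            # declining period index kappa_t
log_m = (a_true[:, None] + b_true[:, None] * k_true[None, :]
         + rng.normal(0, 0.01, (ages, years)))    # small observation noise

# Classic identification: alpha_x = row means, then rank-1 SVD of the
# centered surface, normalized so that sum(beta_x) = 1.
a_hat = log_m.mean(axis=1)
u_, s_, vt_ = np.linalg.svd(log_m - a_hat[:, None])
b_hat = u_[:, 0] / u_[:, 0].sum()
k_hat = s_[0] * vt_[0] * u_[:, 0].sum()           # keeps b_hat*k_hat invariant

recon = a_hat[:, None] + b_hat[:, None] * k_hat[None, :]
max_err = np.abs(recon - log_m).max()
```

The Bayesian treatment in the paper replaces this least-squares-style fit with a joint Poisson likelihood and MCMC, which is what delivers parameter uncertainty and handles small populations.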
By:  Teresa Ghilarducci (Schwartz Center for Economic Policy Analysis (SCEPA)) 
Abstract:  For the first time in two generations, there is a growing risk of being poor or near poor in old age, because the U.S. pension system has failed. The U.S. pension system is based on a three-layered pyramid, with Social Security at the bottom, employment-based retirement plans in the middle, and personal assets at the top. The second layer has collapsed over the last 35 years as employer-based pensions have shifted to “do-it-yourself” financial accounts anchored in individual asset-building. 
Keywords:  Social Security, Inequality, Policy 
JEL:  H5 I14 I2 
Date:  2016–06 
URL:  http://d.repec.org/n?u=RePEc:epa:cepawp:201605&r=rmg 
By:  Kiran Sharma; Balagopal Gopalakrishnan; Anindya S. Chakrabarti; Anirban Chakraborti 
Abstract:  We show that there exists an empirical linkage between nominal financial networks and the underlying economic fundamentals across countries. We construct the nominal return correlation networks from daily data to encapsulate sector-level dynamics and calculate the relative importance of the sectors in the nominal network through centrality measures and clustering algorithms. The centrality measure robustly identifies the backbone of the minimum spanning trees defined on the return networks. We show that the sectors that are relatively large constitute the core of the return networks, whereas the periphery is mostly populated by relatively smaller sectors. Therefore, sector-level nominal return dynamics is anchored to the real size effect, which ultimately shapes the optimal portfolios for risk management. The results are reasonably robust across 27 countries of varying degrees of prosperity and across periods of market turbulence (2008‒09), as well as relative calmness (2015‒16). 
Date:  2016–12 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1612.05952&r=rmg 
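The correlation-network pipeline the abstract describes can be sketched end to end: map correlations to the standard distance d = sqrt(2(1 − rho)), build a minimum spanning tree, and read off a crude centrality from tree degrees. This is a generic construction on synthetic one-factor returns of our own making (6 "sectors", 500 days), not the authors' data or their specific centrality measure:

```python
import numpy as np

# One-factor synthetic returns; sector 0 loads most heavily on the factor,
# so it should tend to sit at the core of the tree.
rng = np.random.default_rng(7)
n_sectors, t_obs = 6, 500
common = rng.normal(size=t_obs)
returns = 0.5 * common[:, None] + rng.normal(size=(t_obs, n_sectors))
returns[:, 0] += 0.8 * common

rho = np.corrcoef(returns, rowvar=False)
dist = np.sqrt(2.0 * (1.0 - rho))        # correlation distance metric

# Prim's algorithm for the minimum spanning tree
in_tree, edges = [0], []
while len(in_tree) < n_sectors:
    best = None
    for i in in_tree:
        for j in range(n_sectors):
            if j in in_tree:
                continue
            if best is None or dist[i, j] < dist[best]:
                best = (i, j)
    edges.append(best)
    in_tree.append(best[1])

# Degree in the tree as a simple backbone/centrality proxy
degree = np.zeros(n_sectors, dtype=int)
for i, j in edges:
    degree[i] += 1
    degree[j] += 1
```

On real sector-level data the same steps yield the MST backbone whose high-degree core the paper links to sector size.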