
on Risk Management 
Issue of 2014‒05‒09
thirty-two papers chosen by 
By:  Lisa R. Goldberg; Ola Mahmoud 
Abstract:  Maximum drawdown, the largest cumulative loss from peak to trough, is one of the most widely used indicators of risk in the fund management industry, but one of the least developed in the context of probabilistic risk metrics. We formalize drawdown risk as Conditional Expected Drawdown (CED), the tail mean of the maximum drawdown distribution. We show that CED is a degree-one positively homogeneous risk measure, so that it can be attributed to factors, and convex, so that it can be used in quantitative optimization. We develop an efficient linear program for minimum-CED optimization and empirically explore the differences in risk attributions based on CED, Expected Shortfall (ES) and volatility. An important feature of CED is its sensitivity to serial correlation. In an empirical study that fits AR(1) models to US Equity and US Bonds, we find substantially higher correlation between the autoregressive parameter and CED than with ES or with volatility. 
Date:  2014–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1404.7493&r=rmg 
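Goldberg and Mahmoud's CED is the tail mean of the maximum-drawdown distribution. The following sketch illustrates the definition on simulated paths; the Gaussian return model, the drift and volatility numbers, and the 10% tail level are illustrative assumptions, not taken from the paper.

```python
import random

def max_drawdown(path):
    """Largest cumulative peak-to-trough loss, as a fraction of the peak."""
    peak, mdd = path[0], 0.0
    for price in path:
        peak = max(peak, price)
        mdd = max(mdd, (peak - price) / peak)
    return mdd

def conditional_expected_drawdown(paths, alpha=0.1):
    """CED: the mean of the worst alpha-fraction of maximum drawdowns."""
    dds = sorted(max_drawdown(p) for p in paths)
    tail = dds[int((1 - alpha) * len(dds)):]   # worst alpha tail
    return sum(tail) / len(tail)

random.seed(0)
paths = []
for _ in range(1000):
    price, path = 100.0, [100.0]
    for _ in range(250):                       # roughly one year of daily returns
        price *= 1 + random.gauss(0.0003, 0.01)
        path.append(price)
    paths.append(path)
ced = conditional_expected_drawdown(paths, alpha=0.1)
```

By construction the tail mean is at least the overall mean drawdown, which is the sense in which CED is a more conservative statistic than expected maximum drawdown.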
By:  Eleonora Iachini (Banca d'Italia); Stefano Nobili (Banca d'Italia) 
Abstract:  This paper introduces a coincident indicator of systemic liquidity risk in the Italian financial markets. In order to take account of the systemic dimension of liquidity stress, standard portfolio theory is used. Three sub-indices, which reflect liquidity stress in specific market segments, are aggregated in the systemic liquidity risk indicator in the same way as individual risks are aggregated in order to quantify overall portfolio risk. The aggregation takes account of the time-varying cross-correlations between the sub-indices, using a multivariate GARCH approach. This makes it possible to capture abrupt changes in the correlations and allows the indicator to identify systemic liquidity events precisely. We evaluate the indicator on its ability to match the results of a survey conducted among financial market experts to determine the most liquidity-stressful events for the Italian financial markets. The results show that the systemic liquidity risk indicator accurately identifies events characterized by high systemic risk, while not exaggerating the level of stress during calm periods. 
Keywords:  financial crisis, liquidity risk, systemic risk, stress index, multivariate GARCH 
JEL:  G01 G10 G20 
Date:  2014–04 
URL:  http://d.repec.org/n?u=RePEc:bdi:opques:qef_217_14&r=rmg 
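The portfolio-style aggregation the abstract describes can be sketched as follows. The three sub-index values, the equal weights and the two fixed correlation matrices are hypothetical stand-ins for the paper's GARCH-estimated, time-varying correlations.

```python
import math

def liquidity_stress_index(subindices, weights, corr):
    """Aggregate stress sub-indices the way portfolio risk is aggregated:
    I = sqrt(sum_ij  w_i s_i * rho_ij * w_j s_j)."""
    x = [w * s for w, s in zip(weights, subindices)]
    n = len(x)
    total = sum(x[i] * corr[i][j] * x[j] for i in range(n) for j in range(n))
    return math.sqrt(total)

s = [0.8, 0.6, 0.9]                           # stress in three market segments
w = [1 / 3, 1 / 3, 1 / 3]
calm   = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]    # uncorrelated stress
crisis = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]    # perfectly correlated stress
i_calm = liquidity_stress_index(s, w, calm)
i_crisis = liquidity_stress_index(s, w, crisis)
```

The same sub-index readings produce a much higher systemic indicator when cross-correlations spike, which is exactly the mechanism that lets the indicator flag systemic (rather than idiosyncratic) liquidity events.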
By:  Chia-Lin Chang (Department of Applied Economics, Department of Finance, National Chung Hsing University, Taiwan); Juan-Ángel Jiménez-Martín (Departamento de Economía Cuantitativa (Department of Quantitative Economics), Facultad de Ciencias Económicas y Empresariales (Faculty of Economics and Business), Universidad Complutense de Madrid); Esfandiar Maasoumi (Department of Economics, Emory University); Teodosio Pérez Amaral (Departamento de Economía Cuantitativa (Department of Quantitative Economics), Facultad de Ciencias Económicas y Empresariales (Faculty of Economics and Business), Universidad Complutense de Madrid) 
Abstract:  The Basel III Accord requires that banks and other Authorized Deposit-taking Institutions (ADIs) communicate their daily risk forecasts to the appropriate monetary authorities at the beginning of each trading day, using one of a range of alternative risk models to forecast Value-at-Risk (VaR). The risk estimates from these models are used to determine the daily capital charges (DCC) and associated capital costs of ADIs, depending in part on the number of previous violations, whereby realized losses exceed the estimated VaR. In this paper we define risk management in terms of choosing sensibly from a variety of risk models and discuss the optimal selection of financial risk models. A previous approach to model selection for predicting VaR proposed combining alternative risk models and ranking such models on the basis of average DCC. This method is based only on the first moment of the DCC distribution, supported by a restrictive evaluation function. In this paper, we consider uniform rankings of models over large classes of evaluation functions that may reflect different weights and concerns over different intervals of the distribution of losses and DCC. The uniform rankings are based on recently developed statistical tests of stochastic dominance (SD). The SD tests are illustrated using the prices and returns of VIX futures. The empirical findings show that the tests of SD can rank different pairs of models to a statistical degree of confidence, and that the alternative (recentered) SD tests are in general agreement. 
Keywords:  Stochastic dominance; Value-at-Risk; daily capital charges; violation penalties; optimizing strategy; Basel III Accord; VIX futures; global financial crisis. 
JEL:  G32 G11 G17 C53 C22 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:ucm:doicae:1408&r=rmg 
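The SD tests used to rank VaR models operate on distributions of losses and daily capital charges. A minimal sketch of the underlying first-order dominance criterion on two samples; the full paper applies formal SD tests with recentering and statistical inference, which this sample check omits.

```python
def first_order_sd(x, y):
    """True if sample x first-order stochastically dominates sample y:
    the empirical CDF of x lies at or below the empirical CDF of y at
    every pooled sample point."""
    def ecdf(sample, t):
        return sum(1 for v in sample if v <= t) / len(sample)
    grid = sorted(set(x) | set(y))
    return all(ecdf(x, t) <= ecdf(y, t) for t in grid)

# hypothetical outcomes: model b's are uniformly shifted above model a's
a = [1.0, 2.0, 3.0, 4.0]
b = [2.0, 3.0, 4.0, 5.0]
```

Uniform rankings over classes of evaluation functions correspond to this kind of pointwise CDF ordering; when neither CDF lies entirely below the other, the models cannot be ranked uniformly.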
By:  Yongmiao Hong; Yanhui Liu; Shouyang Wang 
Abstract:  Controlling and monitoring extreme downside market risk is important for financial risk management and portfolio/investment diversification. In this paper, we introduce a new concept of Granger causality in risk and propose a class of kernel-based tests to detect extreme downside risk spillover between financial markets, where risk is measured by the left tail of the distribution or equivalently by the Value at Risk (VaR). The proposed tests have a convenient asymptotic standard normal distribution under the null hypothesis of no Granger causality in risk. They check a large number of lags and thus can detect risk spillover that occurs with a time lag or that has weak spillover at each lag but carries over a very long distributional lag. Usually, tests using a large number of lags may have low power against alternatives of practical importance, due to the loss of a large number of degrees of freedom. Such power loss is fortunately alleviated for our tests because our kernel approach naturally discounts higher order lags, which is consistent with the stylized fact that today’s financial markets are often more influenced by recent events than by remote past events. A simulation study shows that the proposed tests have reasonable size and power against a variety of empirically plausible alternatives in finite samples, including spillover from the dynamics in mean, variance, skewness and kurtosis respectively. In particular, nonuniform weighting delivers better power than uniform weighting and a Granger-type regression procedure. The proposed tests are useful in investigating large comovements between financial markets such as financial contagions. An application to the Eurodollar and Japanese Yen highlights the merits of our approach. 
Keywords:  Cross-spectrum, Extreme downside risk, Financial contagion, Granger causality in risk, Nonlinear time series, Risk management, Value at Risk 
Date:  2013–10–14 
URL:  http://d.repec.org/n?u=RePEc:wyi:wpaper:001986&r=rmg 
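The test idea — kernel-discounted cross-correlations of VaR-violation indicators — can be sketched as below. The Bartlett kernel, the bandwidth, and the omission of the centering and scaling that produce the asymptotic standard normal null distribution are simplifying assumptions for illustration only.

```python
import math

def var_violations(returns, var_level):
    """Indicator series: 1 when the realized loss exceeds VaR."""
    return [1 if r < -var_level else 0 for r in returns]

def kernel_spillover_stat(z1, z2, bandwidth=5):
    """Kernel-weighted sum of squared cross-correlations between two
    violation series; lag j asks whether past violations in market 2
    predict violations in market 1.  The Bartlett kernel discounts
    distant lags, matching the stylized fact cited in the abstract."""
    n = len(z1)
    m1, m2 = sum(z1) / n, sum(z2) / n
    s1 = math.sqrt(sum((a - m1) ** 2 for a in z1) / n)
    s2 = math.sqrt(sum((b - m2) ** 2 for b in z2) / n)
    stat = 0.0
    for j in range(1, bandwidth + 1):
        cov = sum((z1[t] - m1) * (z2[t - j] - m2) for t in range(j, n)) / n
        rho = cov / (s1 * s2)
        weight = 1 - j / (bandwidth + 1)          # Bartlett kernel
        stat += n * weight ** 2 * rho ** 2
    return stat

z2 = [1, 0, 0] * 10              # violations in market 2
z1 = [0] + z2[:-1]               # market 1 violates one day later
spill = kernel_spillover_stat(z1, z2, bandwidth=3)
```

With the constructed one-day lead, the lag-1 cross-correlation is close to one and the statistic is large; for independent violation series it would stay near the number of lags checked.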
By:  Andrew Green; Chris Kenyon 
Abstract:  Central counterparties (CCPs) require initial margin (IM) to be posted for derivative portfolios cleared through them. Additionally, the Basel Committee on Banking Supervision has proposed in BCBS 261 that all significant OTC derivatives trading must also post IM by 2019. IM is typically calculated using Value-at-Risk (VAR) or Conditional Value-at-Risk (CVAR, aka Expected Shortfall), based on historical simulation. As previously noted (Green 2013a; Green 2013b), IM requirements give rise to a need for unsecured funding similar to FVA on unsecured derivatives. The IM cost to the derivatives originator requires an integral of the funding cost over the funding profile, which depends on a VAR- or CVAR-based calculation. VAR, here, involves running a historical-simulation Monte Carlo inside a risk-neutral Monte Carlo simulation. Brute-force calculation is computationally infeasible. This paper presents a computationally efficient method of calculating IM costs for any derivative portfolio: Longstaff-Schwartz Augmented Compression (LSAC). Essentially, Longstaff-Schwartz is used with an augmented state space to retain accuracy for VAR-relevant changes to the state variables. This method allows rapid calculation of IM costs both for portfolios and on an incremental basis. LSAC can be applied wherever historical-simulation VAR is required, such as the lifetime cost of market risk regulatory capital using internal models. We present example costs for IM under BCBS 261 for interest rate swap portfolios of up to 10,000 swaps and 30-year maturity, showing significant IM FVA costs and a two-orders-of-magnitude speedup compared to direct calculation. 
Date:  2014–05 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1405.0508&r=rmg 
By:  Hajo Holzmann; Matthias Eulert 
Abstract:  Predictions are issued on the basis of certain information. If the forecasting mechanisms are correctly specified, a larger amount of available information should lead to better forecasts. For point forecasts, we show how the effect of increasing the information set can be quantified by using strictly consistent scoring functions, where it results in smaller average scores. Further, we show that the classical Diebold-Mariano test, based on strictly consistent scoring functions and asymptotically ideal forecasts, is a consistent test for the effect of an increase in a sequence of information sets on $h$-step point forecasts. For the value at risk (VaR), we show that the average score, which corresponds to the average quantile risk, directly relates to the expected shortfall. Thus, increasing the information set will result in VaR forecasts which lead on average to smaller expected shortfalls. We illustrate our results in simulations and applications to stock returns for unconditional versus conditional risk management as well as univariate modeling of portfolio returns versus multivariate modeling of individual risk factors. The role of the information set for evaluating probabilistic forecasts by using strictly proper scoring rules is also discussed. 
Date:  2014–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1404.7653&r=rmg 
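The strictly consistent scoring function for a VaR (quantile) forecast is the pinball loss; as the abstract argues, correctly specified forecasts built on richer information sets attain smaller average scores. A minimal illustration with uniform outcomes, where the true 5% quantile scores better than a misspecified constant forecast:

```python
import random

def quantile_score(forecast, outcome, alpha):
    """Pinball loss: a strictly consistent scoring function for the
    alpha-quantile, so lower average scores identify better VaR forecasts."""
    if outcome <= forecast:
        return (1 - alpha) * (forecast - outcome)
    return alpha * (outcome - forecast)

random.seed(42)
outcomes = [random.random() for _ in range(10000)]   # true 5% quantile is 0.05
avg_true = sum(quantile_score(0.05, x, 0.05) for x in outcomes) / len(outcomes)
avg_bad  = sum(quantile_score(0.50, x, 0.05) for x in outcomes) / len(outcomes)
```

Strict consistency means the expected score is uniquely minimized at the true conditional quantile, which is what makes average scores a valid yardstick for comparing information sets.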
By:  Andrew Green; Chris Kenyon 
Abstract:  Credit (CVA), Debit (DVA) and Funding Valuation Adjustments (FVA) are now familiar valuation adjustments made to the value of a portfolio of derivatives to account for credit risks and funding costs. However, recent changes in the regulatory regime and increases in regulatory capital requirements have led many banks to include the cost of capital in derivative pricing. This paper formalises the addition of the cost of capital by extending the Burgard-Kjaer semi-replication approach to CVA and FVA to include an additional capital term, Capital Valuation Adjustment (KVA, i.e. Kapital Valuation Adjustment, to distinguish it from CVA). Two approaches are considered: one where the (regulatory) capital is released back to shareholders upon counterparty default, and one where the capital can be used to offset losses in the event of counterparty default. The use of the semi-replication approach means that the flexibility around the treatment of self-default is carried over into this analysis. The paper further considers the practical calculation of KVA with reference to the Basel II (BCBS 128) and Basel III (BCBS 189) capital regimes and their implementation via CRD IV (CRD IV Regulation, CRD IV Directive). The paper assesses how KVA may be hedged, given that any hedging transactions themselves would lead to regulatory capital requirements and hence KVA. To conclude, a number of numerical examples are presented to gauge the cost impact of KVA on vanilla derivative products. 
Date:  2014–05 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1405.0515&r=rmg 
By:  Michele Leonardo Bianchi (Bank of Italy) 
Abstract:  In this paper we conduct an empirical analysis of daily log-returns of Italian open-end mutual funds and their respective benchmarks in the period from February 2007 to June 2013. First, we estimate the classical normal-based model on the log-returns of a large set of funds. Then we compare it with three models allowing for asymmetry and heavy tails. We empirically assess that both the value at risk and the average value at risk are model-dependent and we show that the difference between models should be taken into consideration in the evaluation of risk measures. 
Keywords:  open-end mutual funds, normal distribution, tempered stable distributions, value at risk, average value at risk 
JEL:  C02 C46 G23 
Date:  2014–04 
URL:  http://d.repec.org/n?u=RePEc:bdi:wptemi:td_957_14&r=rmg 
By:  Safarian, Mher 
Abstract:  In this paper we study a hedging problem for European options, taking into account the presence of transaction costs. In incomplete markets, i.e. markets without the classical restrictions, there is no unique martingale measure. Our approach is based on the Föllmer-Schweizer-Sondermann concept of risk minimization. In a discrete-time Markov market model we construct a risk-minimizing strategy by backwards iteration. The strategy is given by a closed-form formula. In a continuous-time market model with a martingale price process, we show the existence of a risk-minimizing hedging strategy. 
Keywords:  hedging of options, incomplete markets, transaction costs, risk minimization, mean-self strategies 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:zbw:kitwps:56&r=rmg 
By:  Przemysław Klusik 
Abstract:  In the paper we develop mathematical tools of quantile hedging in incomplete markets. These can be used for two significant applications: (1) calculating the optimal capital requirement imposed by Solvency II (Directive 2009/138/EC of the European Parliament and of the Council) when both market and non-market risk are present in an insurance company. We show how to find the minimal capital $V_0$ that provides a one-year hedging strategy for the insurance company satisfying $E[\mathbf{1}_{\{V_1 \geq D\}}] = 0.995$, where $V_1$ denotes the value of the insurance company in one year's time and $D$ is the payoff of the contract; and (2) finding a hedging strategy for a derivative using not the underlying itself but an asset whose dynamics are correlated with, or in some other (non-deterministic) way dependent on, the underlying. The work is a generalisation of the work of Klusik and Palmowski. 
Keywords:  quantile hedging, Solvency II, capital modelling, hedging options on non-tradable assets 
Date:  2014–05 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1405.1212&r=rmg 
By:  Chenghu Ma; WingKeung Wong 
Abstract:  Is it possible to obtain an objective and quantifiable measure of risk backed up by choices made by some specific groups of rational investors? To answer this question, in this paper we establish some behavioral foundations for various types of VaR models, including VaR and conditional VaR, as measures of downside risk. Though supported to some extent by unanimous choices by some specific groups of expected or non-expected utility investors, VaRs as profiles of risk measures at various levels of risk tolerance are not quantifiable – they can only provide partial and incomplete risk assessments for risky prospects. Also included in our discussion are the relevant VaRs and several alternative risk measures for investors; these alternatives use somewhat weaker assumptions about risk-averse behavior by incorporating a mean-preserving spread. For this latter group of investors, we provide arguments for and against the standard deviation vs. VaR and conditional VaRs as objective and quantifiable measures of risk in the context of portfolio choice. 
Keywords:  downside risk, value-at-risk, conditional VaR, stochastic dominance, utility 
JEL:  C0 D81 G10 
Date:  2013–10–14 
URL:  http://d.repec.org/n?u=RePEc:wyi:wpaper:001971&r=rmg 
By:  Emmanuel Haven; Xiaoquan Liu; Chenghu Ma; Liya Shen 
Abstract:  Options are believed to contain unique information about the risk-neutral moment generating function (MGF hereafter) or the risk-neutral probability density function (PDF hereafter). This paper applies the wavelet method to approximate the risk-neutral MGF of the underlying asset from option prices. Monte Carlo simulation experiments are performed to elaborate how the risk-neutral MGF can be obtained using the wavelet method. The Black-Scholes model is chosen as the benchmark model. We offer a novel method for obtaining the implied risk-neutral MGF for pricing out-of-sample options and other complex or illiquid derivative claims on the underlying asset using information obtained from simulated data. 
Keywords:  Implied risk-neutral MGF; wavelets; options; Black-Scholes model. 
Date:  2013–10–14 
URL:  http://d.repec.org/n?u=RePEc:wyi:wpaper:001991&r=rmg 
By:  Qian Han 
Abstract:  Considering a production economy with an arbitrary von Neumann-Morgenstern utility, this paper derives a general equilibrium relationship between the market prices of risks and market risk aversion under a continuous-time stochastic volatility model completed by liquidly traded options. The derived relation shows that in equilibrium the risk aversion should be a linear combination of the market price of asset risk and the market price of orthogonal risk. Construction of a daily market risk aversion index is proposed to help practitioners with better risk management. 
Date:  2013–10–14 
URL:  http://d.repec.org/n?u=RePEc:wyi:wpaper:002033&r=rmg 
By:  Luciana Dalla Valle; Maria Elena De Giuli; Claudia Tarantola; Claudio Manelli 
Abstract:  In this paper we present a novel semi-Bayesian model for firm default probability estimation. The methodology is based on multivariate contingent claim analysis and pair copula constructions. For each considered firm, balance sheet data are used to assess the asset value, and to compute its default probability. The asset pricing function is expressed via a pair copula construction, and it is approximated via Monte Carlo simulations. The methodology is illustrated through an application to the analysis of both operative and defaulted firms. 
Date:  2014–05 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1405.1309&r=rmg 
By:  Takashi Shinzato 
Abstract:  In portfolio optimization problems, the minimum expected investment risk is not always smaller than the expected minimal investment risk. That is, using a well-known approach from operations research, it is possible to derive a strategy that minimizes the expected investment risk, but this strategy does not always result in the best rate of return on assets. Prior to making investment decisions, it is important for an investor to know the potential minimal investment risk (or the expected minimal investment risk) and to determine the strategy that will maximize the return on assets. We use the self-averaging property to analyze the potential minimal investment risk and the concentrated investment level for the strategy that gives the best rate of return. We compare the results from our method with the results obtained by the operations research approach and with those obtained by a numerical simulation using the optimal portfolio. The results of our method and the numerical simulation are in agreement, but they differ from those of the operations research approach. 
Date:  2014–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1404.5222&r=rmg 
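The gap between the two quantities in the abstract — the minimum of the expected risk is at least the expectation of the scenario-wise minimal risk — can be seen in a toy two-scenario example. The scenarios and the quadratic risk function are hypothetical; the paper itself works with a self-averaging (statistical-mechanics) analysis, not this enumeration.

```python
# risk of weight w on asset 1 in scenario s: R(w, s) = w^2*s1 + (1-w)^2*s2
scenarios = [(1.0, 4.0), (4.0, 1.0)]          # two equally likely states
grid = [i / 100 for i in range(101)]          # candidate weights on asset 1

def risk(w, s):
    return w * w * s[0] + (1 - w) * (1 - w) * s[1]

# operations-research route: one fixed strategy minimizing the *expected* risk
min_expected = min(sum(risk(w, s) for s in scenarios) / len(scenarios)
                   for w in grid)
# expectation of the scenario-wise *minimal* risk
expected_min = sum(min(risk(w, s) for w in grid)
                   for s in scenarios) / len(scenarios)
```

Here the fixed expected-risk minimizer (w = 0.5) yields risk 1.25 in every state, while the per-scenario optimum achieves 0.8 on average, illustrating why the operations-research strategy need not deliver the best attainable outcome.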
By:  Sung Yong Park; Sang Young Jei 
Abstract:  Bollerslev’s (1990) constant conditional correlation (CCC) and Engle’s (2002) dynamic conditional correlation (DCC) bivariate generalized autoregressive conditional heteroskedasticity (BGARCH) models are usually used to estimate time-varying hedge ratios. In this paper, we extend the above models to more flexible ones in order to analyze the behavior of the optimal conditional hedge ratio based on two BGARCH models: (i) adopting more flexible bivariate density functions such as a bivariate skewed-t density function; (ii) considering asymmetric individual conditional variance equations; and (iii) incorporating asymmetry in the conditional correlation equation for the DCC-based model. We also evaluate hedging performance in terms of variance reduction, as well as the value at risk and expected shortfall of the hedged portfolio. Using daily data on the spot and futures returns of corn and soybeans, we find that asymmetric and flexible density specifications help increase the goodness-of-fit of the estimated models but do not guarantee higher hedging performance. We also find that there is an inverse relationship between the variance of hedge ratios and hedging effectiveness. 
Date:  2013–10–14 
URL:  http://d.repec.org/n?u=RePEc:wyi:journl:002103&r=rmg 
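The object being modelled is the minimum-variance hedge ratio h* = Cov(spot, futures) / Var(futures); the BGARCH models make it time-varying via conditional moments. A static sample version, with made-up return data, looks like:

```python
def optimal_hedge_ratio(spot, fut):
    """Minimum-variance hedge ratio h* = Cov(spot, fut) / Var(fut)."""
    n = len(spot)
    ms, mf = sum(spot) / n, sum(fut) / n
    cov = sum((s - ms) * (f - mf) for s, f in zip(spot, fut)) / n
    var = sum((f - mf) ** 2 for f in fut) / n
    return cov / var

def hedged_variance(spot, fut, h):
    """Variance of the hedged position spot - h * fut."""
    port = [s - h * f for s, f in zip(spot, fut)]
    m = sum(port) / len(port)
    return sum((p - m) ** 2 for p in port) / len(port)

# hypothetical daily spot and futures returns
spot = [0.010, -0.020, 0.015, -0.005, 0.020, -0.010]
fut  = [0.012, -0.018, 0.014, -0.004, 0.019, -0.011]
h_star = optimal_hedge_ratio(spot, fut)
```

Because the hedged variance is quadratic in h, h* minimizes it by construction; the paper's variance-reduction metric compares this minimized variance against the unhedged position, period by period.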
By:  Alexei V. Egorov; Yongmiao Hong; Haitao Li 
Abstract:  Most existing empirical studies on affine term structure models (ATSMs) have mainly focused on in-sample goodness-of-fit of historical bond yields and ignored out-of-sample forecasts of future bond yields. Using an omnibus nonparametric procedure for density forecast evaluation in a continuous-time framework, we provide probably the first comprehensive empirical analysis of the out-of-sample performance of ATSMs in forecasting the joint conditional probability density of bond yields. We find that although the random walk models tend to have better forecasts for the conditional mean dynamics of bond yields, some ATSMs provide better forecasts for the joint probability density of bond yields. However, all ATSMs considered are still overwhelmingly rejected by our tests and fail to provide satisfactory density forecasts. There exists room for further improving density forecasts for bond yields by extending ATSMs. 
Keywords:  Density forecast; Affine term structure models; Probability integral transform; Financial risk management; Value at risk; Fixed-income portfolio management 
JEL:  C4 C5 G1 
Date:  2013–10–14 
URL:  http://d.repec.org/n?u=RePEc:wyi:journl:002064&r=rmg 
By:  Zongwu Cai; Xian Wang 
Abstract:  This paper considers a new nonparametric estimation of conditional value-at-risk and expected shortfall functions. Conditional value-at-risk is estimated by inverting the weighted double kernel local linear estimate of the conditional distribution function. The nonparametric estimator of conditional expected shortfall is constructed by a plug-in method. Both the asymptotic normality and consistency of the proposed nonparametric estimators are established at both boundary and interior points for time series data. We show that the weighted double kernel local linear conditional distribution estimator has the advantages of always being a distribution, continuous, and differentiable, besides the good properties of both the double kernel local linear and weighted Nadaraya-Watson estimators. Moreover, an ad hoc data-driven bandwidth selection method is proposed, based on the nonparametric version of the Akaike information criterion. Finally, an empirical study is carried out to illustrate the finite sample performance of the proposed estimators. 
Keywords:  Boundary effects, Empirical likelihood, Expected shortfall, Local linear estimation, Nonparametric smoothing, Value-at-risk, Weighted double kernel 
JEL:  C14 D81 G10 G22 G31 
Date:  2013–10–14 
URL:  http://d.repec.org/n?u=RePEc:wyi:journl:002095&r=rmg 
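The core construction — inverting an estimated conditional distribution function at level alpha — can be sketched with plain Nadaraya-Watson weights. The paper's weighted double kernel local linear estimator additionally smooths in the y-direction and uses local-linear weights, which this simplified version omits.

```python
import math

def cond_var(xs, ys, x0, alpha, hx=0.5):
    """Conditional VaR at level alpha: invert a kernel-weighted estimate
    of F(y | x = x0).  Plain Nadaraya-Watson weights in x only."""
    def kern(u):
        return math.exp(-0.5 * u * u)          # Gaussian kernel (unnormalized)
    w = [kern((x - x0) / hx) for x in xs]
    tot = sum(w)
    cum = 0.0
    for y, wt in sorted(zip(ys, w)):           # walk up the weighted ECDF
        cum += wt / tot
        if cum >= alpha:
            return y
    return max(ys)
```

When all conditioning values coincide with x0 the weights are equal and the estimator reduces to the empirical alpha-quantile, which gives a simple sanity check.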
By:  Kurmas Akdogan; Burcu Deniz Yildirim 
Abstract:  We provide a detailed classification of core and non-core liabilities for the Turkish banking system à la Shin and Shin (2010). We further carry out a two-stage liquidity stress test similar to Van den End (2010), in which we simulate inflow and outflow factors as well as the network topology of mutual liabilities between financial institutions. Our results indicate that the Turkish banking system, with its relatively low level of non-core liabilities, is to a great extent robust to liquidity shocks. Nevertheless, the level of non-core liabilities should be monitored closely, considering its procyclical behaviour over the business cycle and its strong correlation with credit growth. 
Keywords:  Financial stability, non-core liabilities, liquidity stress test, network topology 
JEL:  C15 E44 G21 G28 G32 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:tcb:wpaper:1412&r=rmg 
By:  Aptus, Elias; Britz, Volker; Gersbach, Hans 
Abstract:  We examine the impact of so-called Crisis Contracts on bank managers' risk-taking incentives and on the probability of banking crises. Under a Crisis Contract, managers are required to contribute a pre-specified share of their past earnings to finance public rescue funds when a crisis occurs. This can be viewed as a retroactive tax that is levied only when a crisis occurs and that leads to a form of collective liability for bank managers. We develop a game-theoretic model of a banking sector whose shareholders have limited liability, so that society at large will suffer losses if a crisis occurs. Without Crisis Contracts, the managers' and shareholders' interests are aligned, and managers take more than the socially optimal level of risk. We investigate how the introduction of Crisis Contracts changes the equilibrium level of risk-taking and the remuneration of bank managers. We establish conditions under which the introduction of Crisis Contracts will reduce the probability of a banking crisis and improve social welfare. We explore how Crisis Contracts and capital requirements can supplement each other and we show that the efficacy of Crisis Contracts is not undermined by attempts to hedge. 
Keywords:  banking crises, Crisis Contracts, excessive risk-taking, banker's pay, hedging, capital requirements 
JEL:  C79 G21 G28 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:zbw:cfswop:453&r=rmg 
By:  Xiaobing Feng; Haibo Hu 
Abstract:  The negative externalities from an individual bank failure to the whole system can be huge. One of the key purposes of bank regulation is to internalize the social costs of potential bank failures via capital charges. This study proposes a method to evaluate and allocate the systemic risk to different countries/regions using an SIR-type epidemic spreading model and the Shapley value from game theory. The paper also explores features of a constructed bank network using real global banking data. 
Date:  2014–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1404.5689&r=rmg 
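The Shapley-value allocation step can be sketched in isolation. The cost function below is a hypothetical superadditive risk function, standing in for the systemic risk that the paper's SIR contagion model would produce for each coalition of countries/regions.

```python
from itertools import permutations

def shapley_values(players, cost):
    """Exact Shapley value: each player's marginal contribution to the
    coalition cost, averaged over all orderings of arrival."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            phi[p] += cost(coalition | {p}) - cost(coalition)
            coalition = coalition | {p}
    return {p: v / len(orders) for p, v in phi.items()}

standalone = {'A': 1.0, 'B': 2.0, 'C': 3.0}    # hypothetical standalone risks

def systemic_cost(coalition):
    """Standalone risks plus a symmetric contagion surcharge per pair."""
    k = len(coalition)
    return sum(standalone[p] for p in coalition) + 0.5 * k * (k - 1) / 2

phi = shapley_values(['A', 'B', 'C'], systemic_cost)
```

The allocation is efficient (the shares sum to the grand-coalition risk), and the symmetric contagion surcharge is split equally, so each player's share is its standalone risk plus 0.5.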
By:  Li, Minqiang 
Abstract:  We study Aumann and Serrano's (2008) risk index for sums of gambles that are not necessarily independent. We show that if the dependent parts of two gambles are similarly ordered, or more generally positively quadrant dependent, then the risk index of the sum of two gambles is always larger than the minimum of the risk indices of the two gambles. For negative dependence, the risk index of the sum is always smaller than the maximum of the two risk indices. The above results agree well with our intuition. For example, the result for negative dependence agrees with our intuition about risk diversification. Thus this result can be considered another attractive property of Aumann and Serrano's risk index. 
Keywords:  Risk index; Additive gambles; Subadditivity; Positive quadrant dependence 
JEL:  A10 C00 D81 
Date:  2014–05 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:55697&r=rmg 
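Aumann and Serrano's index R(g) is defined implicitly by E[exp(-g/R)] = 1. A bisection sketch for a discrete gamble; the win-105/lose-100 even-odds example values are illustrative, not taken from the paper.

```python
import math

def as_riskiness(outcomes, probs, lo=1.0, hi=1e7, iters=200):
    """Aumann-Serrano riskiness R(g): the unique positive root of
    E[exp(-g / R)] = 1 for a gamble with positive mean and some losses.
    f is positive below the root and negative above it, so bisect."""
    def f(r):
        return sum(p * math.exp(-x / r) for x, p in zip(outcomes, probs)) - 1.0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# even-odds gamble: win 105 or lose 100; riskiness is about 2100
r = as_riskiness([105.0, -100.0], [0.5, 0.5])
```

The index is positively homogeneous of degree one — doubling every payoff doubles R — which is the scaling property underlying the comparisons of sums of gambles in the abstract.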
By:  Müller, Kirsten; Musshoff, Oliver; Weber, Ron 
Abstract:  Financial institutions still neglect to address agricultural clients. The main reasons are their perception that farmers bear higher risks than non-farmers and that their loan products are inadequate to accommodate the needs of agricultural entrepreneurs. As a result, many farmers still lack access to external finance. The aim of this paper is to investigate determinants of credit risk for agricultural loans disbursed by a Microfinance Institution (MFI) in Azerbaijan. In this context special attention is paid to repayment flexibility and the role of collateral. MFIs are among the first financial institutions recently focusing on farmers with new loan products. We find that farmers are less risky than non-farmers, which is surprising because the opposite is widely believed. We furthermore find that the level of collateral has a negative influence on credit risk. Beyond that, we find that repayment flexibility increases credit risk. 
Keywords:  microfinance, collateral, Tobit model, credit risk, small-scale farmers 
JEL:  Q12 Q14 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:zbw:daredp:1402&r=rmg 
By:  Mohanty, Roshni; P, Srinivasan 
Abstract:  This paper investigates the relationship between stock market returns and volatility in the Indian stock markets using an AR(1)-EGARCH(p,q)-in-Mean model. The study considers daily closing prices of two major indexes of Indian stock exchanges, viz., the S&P CNX NIFTY and the BSE SENSEX of the National Stock Exchange (NSE) and Bombay Stock Exchange (BSE), respectively, for the period from July 1, 1997 to December 31, 2013. The empirical results show a positive but insignificant relationship between stock returns and conditional variance in the case of the NSE Nifty and BSE SENSEX stock markets. Besides, the analysis reveals that volatility is persistent and that there exists a leverage effect, supporting the work of Nelson (1991), in the Indian stock markets. The present study suggests that capital market regulators, investors and market participants should employ an asymmetric GARCH-type model that sufficiently captures the stylized characteristics of returns, such as time-varying volatility, high persistence and asymmetric volatility responses, in determining hedging strategies and portfolio management, and in estimating and forecasting volatility for risk management decision-making at the Indian stock exchanges. 
Keywords:  Stock Market Returns, Weak-Form Efficiency, India, AR-EGARCH-M model 
JEL:  C58 G1 G12 
Date:  2014–05–01 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:55660&r=rmg 
By:  Anil K. Bera; Sung Y. Park 
Abstract:  Markowitz’s mean-variance (MV) efficient portfolio selection is one of the most widely used approaches in solving the portfolio diversification problem. However, contrary to the notion of diversification, the MV approach often leads to portfolios highly concentrated on a few assets. Also, this method leads to poor out-of-sample performances. Entropy is a well-known measure of diversity and also has a shrinkage interpretation. In this article, we propose to use the cross-entropy measure as the objective function with side conditions coming from the mean and variance–covariance matrix of the resampled asset returns. This automatically captures the degree of imprecision of input estimates. Our approach can be viewed as a shrinkage estimation of portfolio weights (probabilities) which are shrunk towards the predetermined portfolio, for example, the equally weighted portfolio or the minimum variance portfolio. Our procedure is illustrated with an application to international equity indexes. 
Keywords:  Diversification; Entropy measure; Portfolio selection; Shrinkage rule; Simulation methods. 
JEL:  C15 C44 G11 
Date:  2013–10–14 
URL:  http://d.repec.org/n?u=RePEc:wyi:journl:002090&r=rmg 
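With only a mean constraint (the paper also imposes variance–covariance side conditions on resampled returns), the cross-entropy problem has an exponential-tilting solution that can be sketched as follows; the three asset returns and equal-weight prior are hypothetical.

```python
import math

def cross_entropy_weights(mu, prior, target, lo=-50.0, hi=50.0, iters=200):
    """Minimize KL(w || prior) subject to sum_i w_i = 1 and
    sum_i w_i mu_i = target.  The minimizer is an exponential tilt
    w_i ∝ prior_i * exp(lam * mu_i); bisect on lam to hit the target."""
    def weights(lam):
        z = [p * math.exp(lam * m) for p, m in zip(prior, mu)]
        s = sum(z)
        return [v / s for v in z]
    def expected_return(lam):
        return sum(wi * mi for wi, mi in zip(weights(lam), mu))
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if expected_return(mid) < target:   # expected return increases in lam
            lo = mid
        else:
            hi = mid
    return weights(0.5 * (lo + hi))

mu = [0.02, 0.05, 0.08]            # resampled mean returns (hypothetical)
prior = [1 / 3, 1 / 3, 1 / 3]      # shrinkage target: equally weighted portfolio
w = cross_entropy_weights(mu, prior, target=0.06)
```

The shrinkage interpretation is visible in the solution: when the target equals the prior portfolio's return the tilt vanishes and w equals the prior, and the weights move away from it only as far as the constraints require.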
By:  Kent Wang; Junwei Liu; Zhi Liu 
Abstract:  We propose a new thresholdæ™reaveraging realized estimator for the integrated covolatility of two assets using nonsynchronous observations with the simultaneous presence of microstructure noise and jumps. We derive a noiserobust Hayashiæœ°oshida estimator that allows for very general structure of jumps in the underlying process. Based on the new estimator, different aspects and components of covolatility are compared to examine the effect of jumps on systematic risk using tickbytick data from the Chinese stock market during 2009?011. We find controlling for jumps contributes significantly to the beta estimation and common jumps mostly dominate the jumpæŠ¯ effect, but there is also evidence that idiosyncratic jumps may lead to significant deviation. We also find that not controlling for noise and jumps in previous realized beta estimations tend to considerably underestimate the systematic risk. 
Keywords:  Ito semimartingale, High-frequency finance, Covolatility, Nonsynchronous trading, Idiosyncratic jumps, Cojump, Microstructure noise 
JEL:  C13 C14 G10 G12 
Date:  2013–10–14 
URL:  http://d.repec.org/n?u=RePEc:wyi:journl:002184&r=rmg 
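The noise-robust estimator builds on the classic Hayashi–Yoshida idea of summing products of returns whose observation intervals overlap, which handles nonsynchronous data without prior alignment. A minimal sketch of the plain (non-noise-robust, non-thresholded) Hayashi–Yoshida covariance, with hypothetical variable names:

```python
def hayashi_yoshida(t1, p1, t2, p2):
    """Plain Hayashi-Yoshida covolatility estimator for two assets observed
    at (possibly different) times t1 and t2 with log prices p1 and p2:
    sum the products of returns whose observation intervals overlap."""
    cov = 0.0
    for i in range(1, len(t1)):
        r1 = p1[i] - p1[i - 1]
        for j in range(1, len(t2)):
            # do intervals (t1[i-1], t1[i]] and (t2[j-1], t2[j]] overlap?
            if t1[i - 1] < t2[j] and t2[j - 1] < t1[i]:
                cov += r1 * (p2[j] - p2[j - 1])
    return cov
```

On synchronous grids the double sum collapses to the ordinary realized covariance; the paper's contribution is to make this construction robust to noise and jumps.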
By:  Vuillemey, G.; Breton, R. 
Abstract:  This paper proposes a network formation model of an OTC derivatives market where both prices and quantities are bilaterally negotiated. The key feature of the framework is to endogenize the network of exposures, the gross and net notional amounts traded and the collateral delivered through initial and variation margins, as a function of idiosyncratic counterparty risk and regulatory collateral and clearing requirements. Using the framework, we investigate numerically the size of the derivatives network, the aggregate collateral demand and the pricing of the contracts under the following schemes: (i) various levels of collateralization for uncleared transactions, (ii) rehypothecation of received collateral and (iii) clearing through a central clearing party (CCP). Overall results suggest that dynamic effects due to the endogeneity of the derivative network to the collateralization and clearing requirements have sizeable consequences for both contract volumes and prices. Intermediary trading and market liquidity are reduced by higher collateralization requirements and enhanced by rehypothecation, while the potential for contagion is reduced. Not accounting for dynamic effects in current market conditions may lead to overestimating the collateral demand induced by mandatory central clearing by up to 22%. 
Keywords:  Collateral, Credit derivatives, Central Clearing Party (CCP), Rehypothecation, Network formation. 
JEL:  G11 G17 G28 
Date:  2014 
URL:  http://d.repec.org/n?u=RePEc:bfr:banfra:483&r=rmg 
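The collateral saving from central clearing comes from multilateral netting: a CCP nets each participant's positions against all counterparties at once, while bilateral margining nets only pair by pair. A toy illustration of that arithmetic (not the paper's network-formation model; margin is assumed proportional to net exposure):

```python
def bilateral_margin(expo):
    """Total margin under bilateral netting: sum of absolute net
    exposures per counterparty pair.  expo[(i, j)] = notional i owes j."""
    pairs = {}
    for (i, j), x in expo.items():
        key = (min(i, j), max(i, j))
        sign = 1 if i < j else -1  # orient each pair consistently
        pairs[key] = pairs.get(key, 0.0) + sign * x
    return sum(abs(v) for v in pairs.values())

def ccp_margin(expo):
    """Total margin with a CCP: every trade is novated to the CCP, so each
    participant posts against its single multilaterally netted position."""
    net = {}
    for (i, j), x in expo.items():
        net[i] = net.get(i, 0.0) - x
        net[j] = net.get(j, 0.0) + x
    return sum(abs(v) for v in net.values())

# A ring of exposures: A owes B, B owes C, C owes A, 10 each
ring = {("A", "B"): 10.0, ("B", "C"): 10.0, ("C", "A"): 10.0}
```

In the ring example bilateral netting leaves 30 of gross exposure to margin, while multilateral netting through the CCP reduces every participant's net position to zero — the static intuition behind the collateral-demand comparisons in the paper.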
By:  Bell, Peter Newton 
Abstract:  This paper investigates a simple risk management problem where an investor is forced to hold a risky asset and then allowed to trade put options on the asset. I simulate the distribution of returns for different quantities of options and investigate statistics from the distribution. In the first section of the paper, I compare two types of averages: the ensemble and the time average. These two statistics are motivated by research that uses ideas from ergodic theory and tools from statistical mechanics to provide new insight into decision making under uncertainty. In a large sample setting, I find that the ensemble average leads an investor to buy zero put options and the time average leads them to buy a positive quantity of options; these results are in agreement with stylized facts from the literature. In the second section, I investigate the stability of the optimal quantity under small sample sizes. This is a standard resampling exercise that shows large variability in the optimal quantity associated with the time average of returns. In the third section, I conclude with a brief discussion of higher moments from the distribution of returns. I show that higher moments change substantially with different quantities of options and suggest that these higher moments deserve further attention in relation to the time average. 
Keywords:  Time average; risk management; portfolio optimization 
JEL:  C4 C44 D8 D81 G11 
Date:  2014–05–07 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:55803&r=rmg 
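The gap between the two averages is easiest to see for multiplicative returns: the ensemble average is an arithmetic mean of gross returns, while the time average is a geometric mean, and the latter is dragged down by large losses. A minimal illustration (the coin-flip numbers are hypothetical, not from the paper):

```python
import math

def ensemble_average(gross_returns):
    """Arithmetic mean of gross returns (the ensemble perspective)."""
    return sum(gross_returns) / len(gross_returns)

def time_average(gross_returns):
    """Geometric mean of gross returns (the per-period growth rate
    an individual investor actually compounds at over time)."""
    log_sum = sum(math.log(g) for g in gross_returns)
    return math.exp(log_sum / len(gross_returns))

# Coin-flip asset: +50% or -40% with equal probability
flips = [1.5, 0.6]
```

Here the ensemble average is 1.05 (attractive in expectation), yet the time average is sqrt(0.9) < 1: wealth compounded through both outcomes shrinks. This asymmetry is what makes a time-average decision maker willing to pay for downside protection such as put options.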
By:  Yanan He; Jing Zhao 
Abstract:  This paper uses a copula approach to investigate the extreme dependence between crude oil and stock sectors in China. Empirical results show that oil–stock linkages were changed by the recent financial crisis. Before the crisis, only two stock sectors had weakly negative tail dependence, while in the post-crisis period many more sectors became positively dependent on oil at extreme levels. Meanwhile, heterogeneity of sector dependence on crude oil is identified. The Basic materials sector has the largest tail dependence (94.5%) with crude oil prices, followed by Financials and Construction & Materials. Asymmetric tail dependence is found in the Basic materials–crude oil pair, indicating that the two returns exhibit greater correlation when markets crash than when they boom. The empirical findings have potentially important implications for financial market participants and policy makers. 
Keywords:  Stock sector; Crude oil; Tail dependence; Copula 
URL:  http://d.repec.org/n?u=RePEc:wyi:wpaper:002210&r=rmg 
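Tail dependence of the kind estimated here can be previewed, before any copula is fitted, with a simple empirical exceedance ratio on rank-transformed (pseudo-) observations. A minimal sketch — function names are illustrative and the paper's parametric copula estimation is not reproduced:

```python
def to_pseudo_obs(x):
    """Rank-transform raw returns to pseudo-observations in (0, 1)."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    ranks = [0.0] * len(x)
    for r, i in enumerate(order):
        ranks[i] = (r + 1) / (len(x) + 1)
    return ranks

def empirical_upper_tail_dep(u, v, q=0.95):
    """Estimate P(V > q | U > q) from pseudo-observations u, v:
    the share of joint exceedances among exceedances of u."""
    hits_u = [i for i in range(len(u)) if u[i] > q]
    if not hits_u:
        return 0.0
    joint = sum(1 for i in hits_u if v[i] > q)
    return joint / len(hits_u)
```

Lower-tail dependence (co-crashes, the asymmetric side emphasized in the abstract) follows the same logic with exceedances below a small quantile.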
By:  Bruno Giovannetti; Mauro Rodrigues; Eduardo Ros 
Abstract:  Global institutional investors face constraints, in the form of either external regulations or internal firm policies, with regard to investing in countries rated speculative grade. As a consequence, when a country receives (loses) its investment-grade status, a significant inflow (outflow) of foreign investment is likely to occur and, thus, a global portfolio should increase (diminish) in importance as a source of systematic risk for stocks traded in that country. We study how stock prices behave around such events. Our results are consistent with theory. 
Keywords:  Investment grade; Systematic risk; Asset pricing 
JEL:  G15 G14 G12 
Date:  2014–04–23 
URL:  http://d.repec.org/n?u=RePEc:spa:wpaper:2014wpecon5&r=rmg 
By:  Yongmiao Hong; Hai Lin; Shouyang Wang 
Abstract:  Understanding the dynamics of spot interest rates is important for derivatives pricing, risk management, interest rate liberalization, and macroeconomic control. Based on daily data on Chinese 7-day repo rates from July 22, 1996 to August 26, 2004, we estimate and test a variety of popular spot rate models, including single-factor diffusion, GARCH, Markov regime-switching and jump-diffusion models, to examine how well they can capture the dynamics of the Chinese spot rates and whether those dynamics have features similar to those of the U.S. spot rates. A robust M-estimation method and a robust Hellinger metric-based specification test are used to alleviate the impact of frequent extreme observations in the Chinese interest rate data, which are mainly due to IPOs. We document that GARCH, regime-switching and jump-diffusion models can capture some important features of the dynamics of the Chinese spot rates, but all models under study are overwhelmingly rejected. We further explore possible sources of model misspecification using some diagnostic tests. This provides useful information for future improvements in modeling the dynamics of the Chinese spot rates. 
Keywords:  Generalized residuals, Robust specification tests, Robust M-estimation, Spot rate 
JEL:  E4 C5 G1 
Date:  2013–10–14 
URL:  http://d.repec.org/n?u=RePEc:wyi:wpaper:001998&r=rmg 
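For context, the simplest member of the single-factor diffusion family tested in such studies is a mean-reverting model of the Vasicek type, dr = kappa(theta - r)dt + sigma dW. A minimal Euler-scheme simulation — parameter values are hypothetical and the paper's robust estimation machinery is not reproduced:

```python
import math
import random

def simulate_vasicek(r0, kappa, theta, sigma, dt, n, seed=0):
    """Euler discretisation of dr = kappa*(theta - r)*dt + sigma*dW:
    returns the simulated path [r_0, r_1, ..., r_n]."""
    rng = random.Random(seed)
    path = [r0]
    for _ in range(n):
        dw = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment
        path.append(path[-1] + kappa * (theta - path[-1]) * dt + sigma * dw)
    return path

# One year of daily steps, starting below the long-run level theta
rates = simulate_vasicek(r0=0.02, kappa=0.5, theta=0.05,
                         sigma=0.01, dt=1 / 252, n=252)
```

Fitting this and richer specifications (GARCH volatility, regime switching, jumps) to observed rates, then testing the generalized residuals, is the kind of exercise the paper carries out on the repo data.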
By:  Angelos Dassios; Hongbiao Zhao 
Abstract:  We introduce a numerically efficient simulation algorithm for the Hawkes process with exponentially decaying intensity, the special case of the general Hawkes process that is most widely implemented in practice. This computational method exactly generates the point process and intensity process by sampling inter-arrival times directly via the underlying analytic distribution functions, without numerical inversion, and hence avoids simulating intensity paths and introducing discretisation bias. Moreover, it can flexibly generate points with either stationary or non-stationary intensity, starting from any arbitrary time with any arbitrary initial intensity. It is also straightforward to implement and can easily be extended to multidimensional versions, for further applications in modelling contagion risk or the clustered arrival of events in finance, insurance, economics and many other fields. Simulation algorithms for one dimension and multiple dimensions are presented, with numerical examples of univariate and bivariate processes provided as illustrations. 
Keywords:  Contagion risk; Stochastic intensity model; Self-exciting point process; Hawkes process; Hawkes process with exponentially decaying intensity; Exact simulation; Monte Carlo simulation. 
URL:  http://d.repec.org/n?u=RePEc:wyi:journl:002211&r=rmg 
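For the exponential case, exact samplers of this family decompose the next inter-arrival time into two analytic candidates — one from the decaying self-excited part of the intensity, one from the constant background rate — and take the minimum. A sketch in that spirit (a simplified reading of the exponential-decay construction, with a constant jump size `gamma`; the paper should be consulted for the full algorithm):

```python
import math
import random

def simulate_hawkes_exp(a, delta, lam0, gamma, T, seed=0):
    """Sample event times on (0, T] for a Hawkes process whose intensity
    between events decays as lambda(t) = a + (lam - a) * exp(-delta * s),
    with each event adding gamma to the intensity.  Requires lam0 > a > 0."""
    rng = random.Random(seed)
    t, lam = 0.0, lam0
    events = []
    while True:
        # candidate 1: waiting time from the decaying self-excited part,
        # sampled by inverting its (possibly defective) survival function
        d = 1.0 + delta * math.log(rng.random()) / (lam - a)
        s1 = -math.log(d) / delta if d > 0 else float("inf")
        # candidate 2: waiting time from the constant background rate a
        s2 = -math.log(rng.random()) / a
        w = min(s1, s2)
        t += w
        if t > T:
            return events
        # intensity decays over the waiting time, then jumps by gamma
        lam = (lam - a) * math.exp(-delta * w) + a + gamma
        events.append(t)
```

Because both candidates come from closed-form distributions, no intensity path is discretised and no numerical root-finding is needed — the property the abstract highlights.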