nep-rmg New Economics Papers
on Risk Management
Issue of 2010‒08‒21
eight papers chosen by
Stan Miles
Thompson Rivers University

  1. Risk-return Efficiency, Financial Distress Risk, and Bank Financial Strength Ratings By Hua, Changchun; Liu, Li-Gang
  2. Two Risk-aware Resource Brokering Strategies in Grid Computing: Broker-driven vs. User-driven Methods By Junseok Hwang; Jihyoun Park; Jorn Altmann
  3. "Bayesian Estimation and Particle Filter for Max-Stable Processes" By Tsuyoshi Kunihama; Yasuhiro Omori; Zhengjun Zhang
  4. Asymptotically efficient estimation of the conditional expected shortfall By Samantha Leorato; Franco Peracchi; Andrei V. Tanase
  5. Sibuya copulas By Marius Hofert; Frederic Vrins
  6. Non-existence of Markovian time dynamics for graphical models of correlated default By Steven N. Evans; Alexandru Hening
  7. Stock volatility in the periods of booms and stagnations By Taisei Kaizoji
  8. Financial Deregulation and Profit Efficiency: A Non-parametric Analysis of Indian Banks By Ghosh, Saibal

  1. By: Hua, Changchun (Asian Development Bank Institute); Liu, Li-Gang (Asian Development Bank Institute)
    Abstract: This paper investigates whether there is any consistency between banks' financial strength ratings (bank rating) and their risk-return profiles. It is expected that banks with high ratings tend to earn high expected returns for the risks they assume and thereby have a low probability of experiencing financial distress. Bank ratings, a measure of a bank's intrinsic safety and soundness, should therefore be able to capture the bank's ability to manage financial distress while achieving risk-return efficiency. We first estimate the expected returns, risks, and financial distress risk proxy (the inverse z-score), then apply the stochastic frontier analysis (SFA) to obtain the risk-return efficiency score for each bank, and finally conduct ordered logit regressions of bank ratings on estimated risks, risk-return efficiency, and the inverse z-score, controlling for other variables related to each bank's operating environment. We find that banks with a higher efficiency score on average tend to obtain favorable ratings. It appears that rating agencies generally encourage banks to trade expected returns for reduced risks, suggesting that these ratings are generally consistent with banks' risk-return profiles.
    Keywords: bank ratings; risk-return efficiency; stochastic frontier analysis
    JEL: D21 D24 G21 G24 G28 G32
    Date: 2010–08–12
    URL: http://d.repec.org/n?u=RePEc:ris:adbiwp:0240&r=rmg
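    A minimal sketch of the inverse z-score distress proxy the abstract mentions, under the usual definition z = (mean ROA + equity/assets) / std(ROA); the variable names and sample figures below are illustrative, not taken from the paper:

```python
import numpy as np

def inverse_z_score(roa, equity_to_assets):
    """Inverse z-score as a financial-distress proxy.

    z = (mean ROA + equity/assets) / std(ROA); a higher inverse
    z-score indicates a higher probability of distress.
    """
    roa = np.asarray(roa, dtype=float)
    z = (roa.mean() + equity_to_assets) / roa.std(ddof=1)
    return 1.0 / z

# Illustrative figures (not from the paper): quarterly ROA of
# 1.2%, 0.8%, 1.5%, 0.5% and an equity/assets ratio of 8%.
print(inverse_z_score([0.012, 0.008, 0.015, 0.005], 0.08))
```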
  2. By: Junseok Hwang; Jihyoun Park; Jorn Altmann (Technology Management, Economics, and Policy Program (TEMEP), Seoul National University)
    Abstract: Grid computing evolves toward an open computing environment, which is characterized by highly diversified resource providers and systems. As the control of each computing resource becomes difficult, the security of users' jobs is often threatened by various risks occurring at individual resources in the network. This paper proposes two risk-aware resource brokering strategies: self-insurance and risk-performance preference specification. The former is a broker-driven method and the latter a user-driven method. The two mechanisms are analyzed through simulations. The simulation results show that both methods are effective for increasing the market size and reducing risks, but that the user-driven technique is more cost-efficient.
    Keywords: Grid Computing, Risk Management, Self-Insurance, Risk-Performance Preference Specification
    JEL: C02 C15 C61 C63 D83 L11 L15 L86 L99 M21
    Date: 2010–03
    URL: http://d.repec.org/n?u=RePEc:snv:dp2009:201063&r=rmg
  3. By: Tsuyoshi Kunihama (Graduate School of Economics, University of Tokyo); Yasuhiro Omori (Faculty of Economics, University of Tokyo); Zhengjun Zhang (Department of Statistics, University of Wisconsin Madison)
    Abstract: Extreme values are often correlated over time, for example, in a financial time series, and these values carry various risks. Max-stable processes such as maxima of moving maxima (M3) processes have been recently considered in the literature to describe time-dependent dynamics, which have been difficult to estimate. This paper first proposes a feasible and efficient Bayesian estimation method for nonlinear and non-Gaussian state space models based on these processes and describes a Markov chain Monte Carlo algorithm where the sampling efficiency is improved by the normal mixture sampler. Furthermore, a unique particle filter that adapts to extreme observations is proposed and shown to be highly accurate in comparison with other well-known filters. Our proposed algorithms were applied to daily minima of high-frequency stock return data, and a model comparison was conducted using marginal likelihoods to investigate the time-dependent dynamics in extreme stock returns for financial risk management.
    Date: 2010–08
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2010cf757&r=rmg
  4. By: Samantha Leorato (Tor Vergata University); Franco Peracchi (Tor Vergata University, EIEF); Andrei V. Tanase (Tor Vergata University)
    Abstract: We propose a procedure for efficient estimation of the trimmed mean of a random variable Y conditional on a set of covariates X. For concreteness, we focus on a financial application where the trimmed mean of interest corresponds to a coherent measure of risk, namely the conditional expected shortfall. Our estimator is based on the representation of the estimand as an integral of the conditional quantile function. We extend the class of estimators originally proposed by Peracchi and Tanase (2008) by introducing a weighting function that gives different weights to different conditional quantiles. Our approach allows for either parametric or nonparametric modeling of the conditional quantiles and the weights, but is essentially nonparametric in spirit. We prove consistency and asymptotic normality of the resulting estimator. Optimizing over the weighting function, we obtain asymptotic efficiency gains with respect to the unweighted estimators. The gains are especially noticeable in the case of fat-tailed distributions.
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:eie:wpaper:1014&r=rmg
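    The representation the abstract relies on, ES(alpha) = (1/alpha) * integral of the quantile function Q(u) over (0, alpha), can be sketched with a plain unweighted grid approximation (the paper's contribution is precisely to weight the quantiles differently; equal weights here are a simplification):

```python
import numpy as np

def expected_shortfall(sample, alpha=0.05, n_grid=200):
    """Expected shortfall at level alpha, approximated as the average
    of quantiles on a midpoint grid over (0, alpha):
    ES = (1/alpha) * integral_0^alpha Q(u) du.
    Equal weights are used; the paper's estimator weights each u."""
    u = (np.arange(n_grid) + 0.5) * alpha / n_grid
    return np.quantile(sample, u).mean()

# Sanity check on standard normal draws: the true 5% expected
# shortfall of N(0,1) is about -2.063.
rng = np.random.default_rng(0)
print(expected_shortfall(rng.standard_normal(100_000), alpha=0.05))
```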
  5. By: Marius Hofert; Frederic Vrins
    Abstract: The standard intensity-based approach for modeling defaults is generalized by making the deterministic term structure of the survival probability stochastic via a common jump process. The survival copula of the vector of default times is derived and is shown to be explicit and of the functional form studied by Sibuya. Besides the parameters of the jump process, the marginal survival functions of the default times appear in the copula. Sibuya copulas therefore allow for functional parameters and asymmetries. Due to the jump process in the construction, they allow for a singular component. Depending on the parameters, they may also be extreme-value copulas or Lévy-frailty copulas. Further, Sibuya copulas are easy to sample in any dimension. Properties of Sibuya copulas including positive lower orthant dependence, tail dependence, and extremal dependence are investigated. An application to pricing first-to-default contracts is outlined and further generalizations of this copula class are addressed.
    Date: 2010–08
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1008.2292&r=rmg
  6. By: Steven N. Evans; Alexandru Hening
    Abstract: Filiz et al. (2008) proposed a model for the pattern of defaults seen among a group of firms at the end of a given time period. The ingredients in the model are a graph, where the vertices correspond to the firms and the edges describe the network of interdependencies between the firms, a parameter for each vertex that captures the individual propensity of that firm to default, and a parameter for each edge that captures the joint propensity of the two connected firms to default. The correlated default model can be rewritten as a standard Ising model on the graph by identifying the set of defaulting firms in the default model with the set of sites in the Ising model for which the spin is +1. We ask whether there is a suitable continuous time Markov chain taking values in the subsets of the vertex set such that the initial state of the chain is the empty set, each jump of the chain involves the inclusion of a single extra vertex, the distribution of the chain at some fixed time horizon is the one given by the default model, and the distribution of the chain for other times is described by a probability distribution in the same family as the default model. We show for three simple but financially natural special cases that this is not possible outside of the trivial case where there is complete independence between the firms.
    Date: 2010–08
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1008.2226&r=rmg
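    The Ising-type default model the abstract describes can be sketched by brute-force enumeration of default patterns; the parameter names and normalization below are illustrative, not the paper's exact notation:

```python
import itertools
import math

def default_pattern_probs(n, a, b_edges):
    """Probability of each default pattern s in {0,1}^n:
    P(s) proportional to exp(sum of vertex parameters a[i] over
    defaulting firms + sum of edge parameters b over edges whose
    two endpoints both default)."""
    patterns = list(itertools.product([0, 1], repeat=n))
    weights = []
    for s in patterns:
        energy = sum(a[i] * s[i] for i in range(n))
        energy += sum(b * s[i] * s[j] for (i, j), b in b_edges.items())
        weights.append(math.exp(energy))
    z = sum(weights)  # normalizing constant (partition function)
    return {s: w / z for s, w in zip(patterns, weights)}

# Three firms on a line graph: individual propensities a, joint
# propensities b on the two edges (illustrative values).
probs = default_pattern_probs(3, a=[-1.0, -0.5, -1.0],
                              b_edges={(0, 1): 0.8, (1, 2): 0.8})
print(probs[(1, 1, 1)])  # probability that all three firms default
```

With positive edge parameters, joint defaults of connected firms are more likely than under independence, which is the correlation structure the non-existence result concerns.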
  7. By: Taisei Kaizoji
    Abstract: The aim of this paper is to compare statistical properties of stock price indices in periods of booms with those in periods of stagnations. We use daily data on four stock price indices from the major stock markets in the world: (i) the Nikkei 225 index (Nikkei 225) from January 4, 1975 to August 18, 2004; (ii) the Dow Jones Industrial Average (DJIA) from January 2, 1946 to August 18, 2004; (iii) the Standard and Poor's 500 index (SP500) from November 22, 1982 to August 18, 2004; and (iv) the Financial Times Stock Exchange 100 index (FT 100) from April 2, 1984 to August 18, 2004. We divide the time series of each of these indices into two periods, booms and stagnations, and investigate the statistical properties of absolute log returns, a typical measure of volatility, for each period. We find that the tail of the distribution of the absolute log returns is approximated by a power-law function with an exponent close to 3 in the periods of booms, while the distribution is described by an exponential function with a scale parameter close to unity in the periods of stagnations.
    Keywords: Stock volatility, booms, stagnations, power-law distributions, and exponential distributions.
    JEL: G00 C16
    Date: 2010–07–07
    URL: http://d.repec.org/n?u=RePEc:eei:rpaper:eeri_rp_2010_07&r=rmg
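    One standard way to estimate a power-law tail exponent of the kind the abstract reports is the Hill estimator; a minimal sketch on synthetic Pareto data (the paper's actual estimation method is not specified in the abstract, and the data below are not stock-index data):

```python
import numpy as np

def hill_estimator(abs_returns, k=250):
    """Hill estimator of the tail exponent of |log returns|,
    based on the k largest observations. An estimate near 3 would
    match the power-law tail reported for boom periods."""
    x = np.sort(np.asarray(abs_returns, dtype=float))[-k:]
    return 1.0 / np.mean(np.log(x / x[0]))

# Sanity check on synthetic Pareto data with true tail index 3.
rng = np.random.default_rng(1)
pareto = rng.pareto(3.0, size=50_000) + 1.0
print(hill_estimator(pareto, k=1000))
```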
  8. By: Ghosh, Saibal
    Abstract: The paper investigates the performance of the Indian commercial banking sector during the post-reform period 1992-2004. The results indicate high levels of efficiency in costs and lower levels in profits, reflecting the importance of inefficiencies on the revenue side of banking activity. The decomposition of profit efficiency shows that a large portion of outlay lost is due to allocative inefficiency. The proximate determinants of profit efficiency appear to suggest that big state-owned banks performed reasonably well and are more likely to operate at higher levels of profit efficiency. A close relationship is observed between efficiency and soundness as measured by a bank's capital adequacy ratio. The empirical results also show that the profit-efficient banks are those that have, on average, fewer non-performing loans.
    Keywords: Indian Banks; Deregulation; Profit efficiency; DEA model
    JEL: G21
    Date: 2009–11
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:24292&r=rmg

This nep-rmg issue is ©2010 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.