nep-rmg New Economics Papers
on Risk Management
Issue of 2007‒01‒14
nine papers chosen by
Stan Miles
Thompson Rivers University

  1. Discriminant Analysis of Default Risk By Aragon, Aker
  2. An Alternative Definition of Market Efficiency and some Comments on its Empirical Testing By Alexandros E. Milionis
  3. Liquidity risk management in banks: international best practices and cases (In Spanish) By Delfiner, Miguel; Lippi, Claudia; Pailhé, Cristina
  4. The Basel II IRB approach revisited: do we use the correct model? By Varsanyi, Zoltan
  5. An Econophysics Framework in Basel II (in Indonesian) By Situngkir, Hokky; Surya, Yohanes
  6. Practical Calculation of Expected and Unexpected Losses in Operational Risk by Simulation Methods By Enrique, Navarrete
  7. Quantile Forecasts of Daily Exchange Rate Returns from Forecasts of Realized Volatility By Clements, Michael P.; Galvão, Ana Beatriz; Kim, Jae H.
  8. The Performance of Default Risk Structural Models on Commercial Mortgages: An Empirical Investigation By Richard K. Green; George M. Jabbour; Yi-Kang Liu
  9. Forecasting volatility and volume in the Tokyo stock market : long memory, fractality and regime switching By Lux, Thomas; Kaizoji, Taisei

  1. By: Aragon, Aker
    Abstract: In this work, discriminant analysis is applied to default risk. The variables are first transformed (via the Box-Cox method) to approximate a normal distribution, and component analysis is then applied to obtain uncorrelated factors.
    Keywords: discriminant; default risk; box cox
    JEL: G21 C13
    Date: 2004–10–21
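The two-step preprocessing this abstract describes (Box-Cox transformation toward normality, then component analysis to obtain uncorrelated factors) can be sketched as follows; the data and all parameter values are illustrative assumptions, not the paper's:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Illustrative, skewed, positive-valued financial ratios (not the paper's data)
x = rng.lognormal(mean=0.0, sigma=1.0, size=(500, 3))

# Step 1: Box-Cox transform each variable toward a normal distribution
transformed = np.column_stack([stats.boxcox(x[:, j])[0] for j in range(x.shape[1])])

# Step 2: principal components of the standardized data give uncorrelated factors
z = (transformed - transformed.mean(axis=0)) / transformed.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(z, rowvar=False))
factors = z @ eigvecs

# The factor covariance matrix is (numerically) diagonal, i.e. the factors
# are mutually uncorrelated and ready to feed a discriminant analysis
factor_cov = np.cov(factors, rowvar=False)
off_diag = factor_cov - np.diag(np.diag(factor_cov))
```

The uncorrelated factors would then serve as inputs to the discriminant analysis itself.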
  2. By: Alexandros E. Milionis (Bank of Greece)
    Abstract: An alternative definition of market efficiency, based on econometric rather than financial arguments, is suggested. It is argued that this new definition, though equivalent to the existing one, has some comparative advantages. Moreover, the conditions under which the results from the application of some commonly used methods for the empirical testing of market efficiency are meaningful are examined, and guidelines for practitioners are suggested. Further, market efficiency is examined in a time-varying risk framework.
    Keywords: Market efficiency; Return predictability; Serial correlation in stock returns; Market efficiency in the presence of conditional heteroscedasticity
    JEL: G14 C10 C22
    Date: 2006–11
  3. By: Delfiner, Miguel; Lippi, Claudia; Pailhé, Cristina
    Abstract: This paper studies best practices for liquidity risk management in financial institutions from the standpoint of international standards, as well as its treatment in a number of countries. First, it reviews the best practices suggested by the Basel Committee on Banking Supervision, the developments in European countries observed by the European Central Bank, and the sound practices for liquidity risk management proposed in the supervision manuals of the US regulatory agencies. Second, it examines the particular experiences of countries that apply policies for liquidity risk management through their supervision manuals or their regulation. The paper also includes the experiences of some Latin American countries that have specific liquidity regulations, together with the Argentine case. Although the importance of liquidity risk is well known, given the idiosyncratic characteristics of different banks, the bodies in charge of establishing best practices in this area prefer to give general principles that can guide the management of this risk rather than to specify quantitative regulation. Most of the analysed countries have adopted these recommendations, in some cases giving banks some freedom to apply internal methods, in others providing guidance for banks that do not yet have advanced developments in the subject. Other countries, instead, have implemented quantitative regulations.
    Keywords: Liquidity Risk; Liquidity Mismatches; Regulation
    JEL: G21
    Date: 2006–10
  4. By: Varsanyi, Zoltan
    Abstract: In this paper I question, on strictly theoretical grounds, whether the risk weights in the advanced (IRB) approach of the Basel II regulation are appropriate. The major concern is that the model behind the regulation considers defaults only at the end of the time horizon for which capital is to be held, whereas defaults over the whole time interval should be taken into consideration. This latter approach is represented by a model that differs from the Basel model. It follows, as I show, that the Basel model should be viewed simply as a technical tool that turns the expected value of the unconditional loss distribution into a given percentile of the same distribution, making use of default probabilities conditional on the systemic factor, and should not be interpreted as describing even 'virtual' firms and asset values. More importantly, I also show that a logical step in the theoretical foundation of the model is missing, which raises the question of whether the risk weights calculated with the model are indeed appropriate. Due to difficulties in calculating the percentiles of the loss distribution under the alternative approach, no clear-cut answer is given in this paper.
    Keywords: Basel II; credit risk
    JEL: G28
    Date: 2006–08
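For context, a minimal sketch of the conditional default probability at the heart of the Basel II IRB risk-weight formula (the Vasicek one-factor model the paper scrutinizes); the PD and asset-correlation values are illustrative, not from the paper:

```python
import math
from scipy.stats import norm

def basel_conditional_pd(pd, rho, q=0.999):
    """Vasicek one-factor conditional PD underlying the Basel II IRB risk weights:
    the default probability given the q-th percentile of the systematic factor."""
    return norm.cdf((norm.ppf(pd) + math.sqrt(rho) * norm.ppf(q)) / math.sqrt(1.0 - rho))

# Illustrative inputs: 1% unconditional PD, 12% asset correlation
stressed_pd = basel_conditional_pd(pd=0.01, rho=0.12)
```

Note that this formula evaluates default only at the capital horizon, which is exactly the end-of-horizon feature the paper questions.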
  5. By: Situngkir, Hokky; Surya, Yohanes
    Abstract: The paper elaborates on some analytical opportunities for econophysics in the implementation of the Basel II documents for banking. We see these opportunities by reviewing methodologies proposed by econophysicists for the three important aspects of risk management: market risk, credit risk, and operational risk.
    Keywords: risk management; econophysics; Basel II.
    JEL: G23 E51 H55 D81 C52 G32
    Date: 2006–06–07
  6. By: Enrique, Navarrete
    Abstract: This paper explores the difficulties involved in quantitative measurement of operational risk and proposes simulation methods as a practical solution to obtain the distribution of total losses. It also introduces an example of the estimation of expected and unexpected losses, as well as Value-at-Risk (VaR), arising from operational risk.
    Keywords: Operational risk; loss distribution; Value-at-Risk (VaR); simulation methods; Basel II
    JEL: G00 C15
    Date: 2006–10
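A minimal sketch of the simulation approach the abstract describes: compounding a loss-frequency distribution with a loss-severity distribution to obtain the distribution of total losses, then reading off expected loss, VaR, and unexpected loss. The distributional choices (Poisson frequency, lognormal severity) and all parameters are illustrative assumptions, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative assumptions: Poisson annual loss frequency, lognormal severity
lam, mu, sigma = 25.0, 10.0, 1.2
n_sims = 50_000

# Simulate the total annual operational loss in each scenario
counts = rng.poisson(lam, size=n_sims)
totals = np.array([rng.lognormal(mu, sigma, size=k).sum() for k in counts])

expected_loss = totals.mean()                # EL: mean of the loss distribution
var_999 = np.quantile(totals, 0.999)         # 99.9% operational VaR
unexpected_loss = var_999 - expected_loss    # UL: capital beyond the expected loss
```

The same recipe works with any frequency and severity distributions fitted to internal loss data.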
  7. By: Clements, Michael P. (University of Warwick); Galvão, Ana Beatriz (Queen Mary, University of London); Kim, Jae H. (Monash University)
    Abstract: Quantile forecasts are central to risk management decisions because of the widespread use of Value-at-Risk. A quantile forecast is the product of two factors: the model used to forecast volatility, and the method of computing quantiles from the volatility forecasts. In this paper we calculate and evaluate quantile forecasts of the daily exchange rate returns of five currencies. The forecasting models that have been used in recent analyses of the predictability of daily realized volatility permit a comparison of the predictive power of different measures of intraday variation and intraday returns in forecasting exchange rate variability. The methods of computing quantile forecasts include making distributional assumptions for future daily returns as well as using the empirical distribution of predicted standardized returns with both rolling and recursive samples. Our main findings are that the HAR model provides more accurate volatility and quantile forecasts for currencies which experience shifts in volatility, such as the Canadian dollar, and that the use of the empirical distribution to calculate quantiles can improve forecasts when there are shifts.
    Keywords: realized volatility; quantile forecasting; MIDAS; HAR; exchange rates
    JEL: C32 C53 F37
    Date: 2006
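The two-factor structure described above (a volatility forecast, plus a method of turning it into a quantile) can be sketched as follows. The returns, the volatility forecast, and the fat-tailed noise are all illustrative; in the paper the volatility forecast would come from models such as HAR or MIDAS:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Illustrative fat-tailed daily returns, and an assumed one-day-ahead
# volatility forecast (in practice produced by e.g. a HAR-type model)
returns = rng.standard_t(df=5, size=1000) * 0.006
sigma_hat = 0.007
alpha = 0.01  # 1% quantile, as in Value-at-Risk

# Method 1: distributional assumption (Gaussian) for future returns
q_normal = sigma_hat * stats.norm.ppf(alpha)

# Method 2: empirical quantile of standardized in-sample returns,
# scaled up by the volatility forecast
standardized = returns / returns.std()
q_empirical = sigma_hat * np.quantile(standardized, alpha)
```

With fat-tailed returns the empirical method typically produces a more extreme (more conservative) quantile than the Gaussian assumption.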
  8. By: Richard K. Green (The George Washington University School of Business); George M. Jabbour (The George Washington University School of Business); Yi-Kang Liu (Pentagon Federal Credit Union)
    Abstract: This paper uses the first-passage-time approach to estimate default probabilities of commercial mortgages and the Receiver Operating Characteristic (ROC) approach to empirically test the cash flow proposition of Vandell (1995). The focus is on comparing the performance of a single-trigger model against a double-trigger model. Using 17,616 lockout commercial loans issued between 1995 and 2001, we find that the property value model performs best. In addition, the results provide partial support for the cash flow proposition.
    Date: 2006–12
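To illustrate the first-passage-time idea: under the textbook assumption that collateral value follows a geometric Brownian motion, the probability of hitting a default barrier before the horizon has a closed form. This is the generic approach, not the paper's calibrated model, and all numbers below are illustrative:

```python
import math

def first_passage_default_prob(v0, barrier, mu, sigma, T):
    """P(collateral value hits `barrier` before time T) when the value follows
    a geometric Brownian motion with drift mu and volatility sigma.
    Textbook first-passage-time result; illustrative only."""
    nu = mu - 0.5 * sigma ** 2
    b = math.log(barrier / v0)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
    st = sigma * math.sqrt(T)
    return N((b - nu * T) / st) + (barrier / v0) ** (2.0 * nu / sigma ** 2) * N((b + nu * T) / st)

# Illustrative numbers: property value 100, default barrier 70, 5-year horizon
p_default = first_passage_default_prob(v0=100.0, barrier=70.0, mu=0.03, sigma=0.25, T=5.0)
```

Because the barrier can be breached at any time before T, this probability always exceeds the end-of-horizon (Merton-style) default probability for the same parameters.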
  9. By: Lux, Thomas; Kaizoji, Taisei
    Abstract: We investigate the predictability of both volatility and volume for a large sample of Japanese stocks. The particular emphasis of this paper is on assessing the performance of long memory time series models in comparison to their short-memory counterparts. Since long memory models should have a particular advantage over long forecasting horizons, we consider predictions of up to 100 days ahead. In most respects, the long memory models (ARFIMA, FIGARCH and the recently introduced multifractal model) dominate GARCH and ARMA models. However, while the forecasts of FIGARCH and ARFIMA also fail dramatically in quite a number of cases, the multifractal model does not suffer from this shortcoming, and its performance practically always improves upon the naïve forecast provided by historical volatility. As a somewhat surprising result, we also find that, for the FIGARCH and ARFIMA models, pooled estimates (i.e. averages of parameter estimates from a sample of time series) give much better results than individually estimated models.
    Keywords: forecasting, long memory models, volume, volatility
    JEL: C22 C53 G12
    Date: 2006
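To make the long-memory idea concrete: the ARFIMA models compared here apply the fractional-differencing filter (1 - L)^d, whose lag weights decay hyperbolically rather than geometrically, which is why distant past observations still matter at 100-day horizons. A small sketch (the value d = 0.4 is illustrative):

```python
import numpy as np

def frac_diff_weights(d, n):
    """First n coefficients of the fractional-differencing filter (1 - L)**d,
    via the standard recursion w_k = w_{k-1} * (k - 1 - d) / k, w_0 = 1."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

w = frac_diff_weights(d=0.4, n=200)

# Hyperbolic decay: even at lag 199 the weight is far from negligible,
# unlike a geometric (short-memory) filter such as 0.9**199 (roughly 8e-10)
tail_weight = abs(w[-1])
```

A short-memory ARMA or GARCH filter's influence dies out geometrically, so the comparison above shows in miniature why the model classes can diverge at long horizons.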

This nep-rmg issue is ©2007 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at <>. For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.