New Economics Papers
on Risk Management
Issue of 2011‒10‒01
seven papers chosen by

  1. Analysing Risk Management in Banks: Evidence of Bank Efficiency and Macroeconomic Impact By Awojobi, Omotola; Amel, Roya; Norouzi, Safoura
  2. Backtesting Value-at-Risk using Forecasts for Multiple Horizons, a Comment on the Forecast Rationality Tests of A.J. Patton and A. Timmermann By Lennart F. Hoogerheide; Francesco Ravazzolo; Herman K. van Dijk
  4. Analytical Solution for the Loss Distribution of a Collateralized Loan under a Quadratic Gaussian Default Intensity Process By Satoshi Yamashita; Toshinao Yoshiba
  5. Mapping the state of financial stability By Peter Sarlin; Tuomas A. Peltonen
  6. The Role of Default in Macroeconomics By Charles A. E. Goodhart; Dimitrios P. Tsomocos
  7. Estimating High Dimensional Covariance Matrices and its Applications By Jushan Bai; Shuzhong Shi

  1. By: Awojobi, Omotola; Amel, Roya; Norouzi, Safoura
    Abstract: The recent global economic meltdown triggered by the 2007 subprime mortgage crisis in the United States, and its adverse effects on financial markets and participants worldwide, resulted in a capital management crisis in most financial institutions, especially banks. This study examines the Nigerian banking industry, focusing on the factors affecting risk management efficiency in banks. For the empirical investigation, we employ panel regression analysis on a stratum of time-series and cross-sectional data covering macro and bank-specific factors for the period 2003 to 2009. The panel regression results indicate that risk management efficiency in Nigerian banks is affected not only by bank-specific factors but also by macroeconomic variables, which reflects the pro-cyclicality of bank performance in the Nigerian banking sector. As it stands, the sufficiency of the Basel principles for risk management is doubtful because asset quality varies with business cycles.
    Keywords: Risk management; Nigerian banks; capital adequacy; Basel; cyclicality
    JEL: E31 G31 G21
    Date: 2011–04–06
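    The within (fixed-effects) panel regression that the study relies on can be sketched on synthetic data; all variable names, dimensions, and coefficient values below are illustrative, not drawn from the paper:

```python
import numpy as np

# Hypothetical sketch of a fixed-effects (within) panel regression of a bank
# outcome on one bank-specific and one macro regressor, mimicking a bank/year
# panel like the study's 2003-2009 sample. Values are simulated.
rng = np.random.default_rng(0)
n_banks, n_years = 10, 7                       # e.g. 2003-2009
alpha = rng.normal(size=n_banks)               # unobserved bank fixed effects
capital = rng.normal(size=(n_banks, n_years))  # bank-specific factor
gdp = rng.normal(size=n_years)                 # macro factor, common to all banks
y = 0.5 * capital + 0.3 * gdp + alpha[:, None] + 0.1 * rng.normal(size=(n_banks, n_years))

def within(x):
    """Demean within each bank over time to sweep out the fixed effects."""
    return x - x.mean(axis=1, keepdims=True)

X = np.column_stack([within(capital).ravel(),
                     within(np.broadcast_to(gdp, (n_banks, n_years))).ravel()])
beta, *_ = np.linalg.lstsq(X, within(y).ravel(), rcond=None)
print(beta)  # recovers coefficients close to the true (0.5, 0.3)
```

    The within transformation removes each bank's unobserved effect, so both bank-specific and macro coefficients can be estimated by ordinary least squares on the demeaned data.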
  2. By: Lennart F. Hoogerheide (VU University Amsterdam); Francesco Ravazzolo (Norges Bank); Herman K. van Dijk (Erasmus University Rotterdam, VU University Amsterdam.)
    Abstract: Patton and Timmermann (2011, 'Forecast Rationality Tests Based on Multi-Horizon Bounds', <I>Journal of Business & Economic Statistics</I>, forthcoming) propose a set of useful tests for forecast rationality or optimality under squared error loss, including an easily implemented test based on a regression that only involves (long-horizon and short-horizon) forecasts and no observations on the target variable. We propose an extension, a simulation-based procedure that takes into account the presence of errors in parameter estimates. This procedure can also be applied in the field of 'backtesting' models for Value-at-Risk. Applications to simple AR and ARCH time series models show that its power in detecting certain misspecifications is larger than the power of well-known tests for correct Unconditional Coverage and Conditional Coverage.
    Keywords: Value-at-Risk; backtest; optimal revision; forecast rationality
    JEL: C12 C52 C53 G32
    Date: 2011–09–20
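    As a concrete point of comparison, the unconditional coverage test mentioned in the abstract can be sketched as the classic Kupiec likelihood-ratio backtest (a generic illustration on simulated data, not the authors' simulation-based procedure):

```python
import numpy as np

# Kupiec unconditional-coverage backtest: does the empirical VaR violation
# rate match the nominal level p? The violation series here is simulated.
def kupiec_uc(violations, p=0.01):
    """Likelihood-ratio statistic testing that the violation rate equals p."""
    n = len(violations)
    x = int(np.sum(violations))
    phat = x / n
    def loglik(q):
        return x * np.log(q) + (n - x) * np.log(1 - q)
    return -2.0 * (loglik(p) - loglik(phat))   # ~ chi2(1) under H0

rng = np.random.default_rng(1)
hits = rng.random(1000) < 0.05     # a misspecified 1% VaR: far too many violations
lr = kupiec_uc(hits, p=0.01)
# reject correct unconditional coverage at the 5% level if lr > 3.841
```

    Tests of this kind look only at the frequency of violations; the paper's point is that multi-horizon forecast-rationality restrictions can detect misspecifications such tests miss.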
  3. By: Frédéric Planchet (SAF - Laboratoire de Sciences Actuarielle et Financière - Université Claude Bernard - Lyon I : EA2429); Pierre-Emmanuel Thérond (SAF - Laboratoire de Sciences Actuarielle et Financière - Université Claude Bernard - Lyon I : EA2429)
    Abstract: This paper investigates the robustness of the Solvency Capital Requirement (SCR) when a log-normal reference model is slightly disturbed in the heaviness of its tail distribution. It is shown that one can construct situations with "almost" log-normal data in which the "disturbed" SCR deviates substantially from the reference SCR. The consequences of estimation errors for the level of the SCR are also studied.
    Keywords: Solvency; extreme values
    Date: 2011–07–04
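    The tail sensitivity of a quantile-based SCR can be illustrated with a toy calculation (assumed parameters and a Student-t disturbance chosen for illustration, not the authors' construction):

```python
from math import exp, sqrt
from scipy.stats import norm, t

# Illustrative sketch: the 99.5% quantile driving the SCR under a log-normal
# model versus an "almost log-normal" alternative whose log is Student-t.
# mu, sigma and the degrees of freedom are hypothetical values.
mu, sigma, level, df = 0.0, 0.2, 0.995, 10
q_lognormal = exp(mu + sigma * norm.ppf(level))
# rescale the t so its variance matches the normal's: Var(t_df) = df/(df-2)
q_disturbed = exp(mu + sigma * sqrt((df - 2) / df) * t.ppf(level, df))
relative_gap = q_disturbed / q_lognormal - 1   # the tail disturbance moves the SCR
```

    Even with matched variances, the heavier tail shifts the 99.5% quantile by several percent, which is the kind of sensitivity the paper studies.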
  4. By: Satoshi Yamashita (Professor, The Institute of Statistical Mathematics); Toshinao Yoshiba (Director and Senior Economist, Institute for Monetary and Economic Studies (currently Financial System and Bank Examination Department), Bank of Japan)
    Abstract: In this study, we derive an analytical solution for the expected loss and the higher moments of the discounted loss distribution for a collateralized loan. To ensure nonnegative values for the intensity and the interest rate, we assume a quadratic Gaussian process for the default intensity and the discount interest rate. Correlations among default intensity, discount interest rate, and collateral value are represented by correlations among the Brownian motions driving the movement of the Gaussian state variables. Given these assumptions, the expected loss or the m-th moment of the loss distribution is obtained as a time integral of an exponential quadratic form of the state variables. The coefficients of the form are derived by solving ordinary differential equations; in particular, with no correlation between default intensity and discount interest rate, the coefficients have explicit closed-form solutions. We present numerical examples analyzing the effects of the correlation between default intensity and collateral value on the expected loss and the standard deviation of the loss distribution.
    Keywords: default intensity, stochastic recovery, quadratic Gaussian, expected loss, measure change
    JEL: G21 G32 G33
    Date: 2011–09
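    The nonnegativity device at the heart of the model, taking the default intensity to be the square of a Gaussian state variable, can be illustrated with a small Monte Carlo (the paper itself works analytically via ODEs; all parameters below are assumed):

```python
import numpy as np

# Quadratic Gaussian intensity sketch: lambda_t = x_t^2 >= 0, where x_t
# follows an Ornstein-Uhlenbeck process. Default probability over [0, T]
# is 1 - E[exp(-integral of lambda_t dt)]. Parameters are illustrative.
rng = np.random.default_rng(2)
n_paths, n_steps, T = 20000, 100, 1.0
dt = T / n_steps
kappa, theta, sigma, x0 = 1.0, 0.3, 0.2, 0.3

x = np.full(n_paths, x0)
cum_intensity = np.zeros(n_paths)
for _ in range(n_steps):
    cum_intensity += x**2 * dt            # Euler step for the integral of lambda_t
    x += kappa * (theta - x) * dt + sigma * np.sqrt(dt) * rng.normal(size=n_paths)

default_prob = 1.0 - np.exp(-cum_intensity).mean()   # P(default by T)
loss_given_default = 0.4                             # deterministic LGD in this toy
expected_loss = loss_given_default * default_prob
```

    Squaring the Gaussian state keeps the intensity nonnegative by construction, which is the property that makes the analytical moment calculations in the paper tractable.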
  5. By: Peter Sarlin (Åbo Akademi University, Turku Centre for Computer Science, Joukahaisenkatu 3–5, 20520 Turku, Finland.); Tuomas A. Peltonen (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt, Germany.)
    Abstract: The paper uses the Self-Organizing Map (SOM) for mapping the state of financial stability, visualizing the sources of systemic risk, and predicting systemic financial crises. The Self-Organizing Financial Stability Map (SOFSM) provides a two-dimensional representation of a multidimensional financial stability space and allows disentangling the individual sources contributing to systemic risk. The SOFSM can be used to monitor macro-financial vulnerabilities by locating a country within the financial stability cycle: the pre-crisis, crisis, post-crisis or tranquil state. In addition, the SOFSM performs better than or as well as a logit model in classifying in-sample data and in predicting out-of-sample the global financial crisis that started in 2007. Model robustness is tested by varying the thresholds of the models, the policymaker's preferences, and the forecasting horizons.
    Keywords: Systemic financial crisis, systemic risk, Self-Organizing Map (SOM), visualization, prediction, macroprudential supervision
    JEL: E44 E58 F01 F37 G01
    Date: 2011–09
  6. By: Charles A. E. Goodhart (Norman Sosnow Professor of Banking and Finance, London School of Economics); Dimitrios P. Tsomocos (Saïd Business School, University of Oxford)
    Abstract: Among the failings of much modern macroeconomic theory pointed out by William R. White at the 2010 Mayekawa Lecture, we argue that the main deficiency is the failure to incorporate the possibility of default, including that of banks, into the core of the analysis. With default assumed away, there can be no role for financial intermediaries, for financial disturbances, or even for money. Models incorporating default are, however, harder to construct, in part because the representative agent fiction must be abandoned. Moreover, financial crises are hard to predict and to resolve. All of the previously available alternatives for handling failing systemically important financial institutions (SIFIs) are problematic. We end by discussing a variety of current proposals for improving the resolution of failed SIFIs.
    Keywords: Default, Transversality, Money, Bankruptcy cost, Asset bubbles, Resolution mechanisms
    JEL: B40 E12 E30 E40 E44 G18 G20 G28 P10
    Date: 2011–09
  7. By: Jushan Bai (Department of Economics, Columbia University; CEMA, Central University of Finance and Economics); Shuzhong Shi (Department of Finance, Guanghua School of Management)
    Abstract: Estimating covariance matrices is an important part of portfolio selection, risk management, and asset pricing. This paper reviews recent developments in estimating high-dimensional covariance matrices, where the number of variables can be greater than the number of observations. The limitations of the sample covariance matrix are discussed, and several newer approaches are presented, including the shrinkage method, the observable and latent factor method, the Bayesian approach, and the random matrix theory approach. For each method, the construction of the covariance matrix estimator is given, and the relationships among the methods are discussed.
    Keywords: Factor analysis, Principal components, Singular value decomposition, Random matrix theory, Empirical Bayes, Shrinkage method, Optimal portfolios, CAPM, APT, GMM
    JEL: C33
    Date: 2011–11
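    The shrinkage approach reviewed in the paper can be illustrated with a minimal sketch (a fixed shrinkage weight toward a scaled identity is assumed here; Ledoit and Wolf derive the optimal weight):

```python
import numpy as np

# When n_obs < n_assets, the sample covariance matrix is singular and
# unusable for portfolio optimization. Shrinking it toward a structured
# target restores invertibility. Dimensions and weight are illustrative.
rng = np.random.default_rng(4)
n_obs, n_assets = 50, 100          # fewer observations than variables
returns = rng.normal(size=(n_obs, n_assets))

S = np.cov(returns, rowvar=False)              # rank-deficient sample covariance
target = np.trace(S) / n_assets * np.eye(n_assets)   # scaled identity target
delta = 0.5                                    # shrinkage intensity (assumed)
sigma_hat = delta * target + (1 - delta) * S   # convex combination, full rank
```

    The convex combination bounds every eigenvalue away from zero, so `sigma_hat` can be inverted for mean-variance weights even though `S` cannot.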

General information on the NEP project can be found at <>. For comments please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject line; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.