nep-rmg New Economics Papers
on Risk Management
Issue of 2019‒04‒29
twelve papers chosen by
Stan Miles
Thompson Rivers University

  1. Optimal investment strategy for DC pension plans with stochastic force of mortality By Yongjie Wang
  2. Monotonic Estimation for the Survival Probability over a Risk-Rated Portfolio by Discrete-Time Hazard Rate Models By Yang, Bill Huajian
  3. Resolutions to flip-over credit risk and beyond By Yang, Bill Huajian
  4. Loss-based risk statistics with scenario analysis By Fei Sun
  5. Risk Management-Driven Policy Rate Gap By Giovanni Caggiano; Efrem Castelnuovo; Gabriela Nodari
  6. Shape Factor Asymptotic Analysis I By Wang, Frank Xuyan
  7. CHANGING THE GAME: NEW FRAMEWORK OF CAPITAL ADEQUACY RATIO By A.bary, amr
  8. Simulation-based Value-at-Risk for Nonlinear Portfolios By Junyao Chen; Tony Sit; Hoi Ying Wong
  9. Deep Generative Models for Reject Inference in Credit Scoring By Rogelio A. Mancisidor; Michael Kampffmeyer; Kjersti Aas; Robert Jenssen
  10. Monotonic Estimation for Probability Distribution and Multivariate Risk Scales by Constrained Minimum Generalized Cross-Entropy By Yang, Bill Huajian
  11. Liquidity Risk After 20 Years By Pástor, Luboš; Stambaugh, Robert F.
  12. Can regulation on loan-loss-provisions for credit risk affect the mortgage market? Evidence from administrative data in Chile By Mauricio Calani

  1. By: Yongjie Wang
    Abstract: This paper studies an optimal portfolio problem for a DC pension plan considering both interest rate risk and longevity risk. In the accumulation phase, plan members pay constant contributions continuously into the pension fund. We assume that the evolution of the mortality rates of all plan members can be described by the same stochastic process, and a representative member is chosen to study the problem. At retirement, the pension fund is used to purchase a lifetime annuity, and a minimum guarantee is required by the representative member. To hedge the longevity risk, we introduce a mortality-linked security, i.e. a longevity bond, into the financial market. The pension manager makes investment decisions for the benefit and on behalf of the representative pension member, whose objective is to maximize the expected utility of the terminal surplus between the final fund level and the minimum guarantee. To solve the original constrained non-self-financing optimization problem, we transform it into an unconstrained self-financing problem by replicating the future contributions and the minimum guarantee. By applying the dynamic programming method, analytical solutions to the equivalent optimization problem are derived, and optimal investment strategies for the original problem are obtained by simple calculations. The numerical applications reveal that longevity risk has an important impact on the investment strategies and show evidence that mortality-linked securities could provide an efficient way to hedge longevity risk.
    Date: 2019–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1904.10229&r=all
  2. By: Yang, Bill Huajian
    Abstract: Monotonic estimation for the survival probability of a loan in a risk-rated portfolio is based on the observation, arising for example from loan pricing, that a loan with a lower credit risk rating is more likely to survive than a loan with a higher credit risk rating, given the same additional risk covariates. Two probit-type discrete-time hazard rate models that generate monotonic survival probabilities are proposed in this paper. The first model calculates the discrete-time hazard rate conditional on systematic risk factors. As in the Cox proportional hazards model, the model formulates the discrete-time hazard rate by including a baseline component. This baseline component can be estimated outside the model, in the absence of model covariates, using the long-run average discrete-time hazard rate, which results in a significant reduction in the number of parameters to be estimated inside the model. The second model is a general-form model in which loan-level factors can be included. Parameter estimation algorithms are also proposed. The models and algorithms proposed in this paper can be used for loan pricing, stress testing, expected credit loss estimation, and modeling of the probability-of-default term structure.
    Keywords: loan pricing, survival probability, Cox proportional hazards model, baseline hazard rate, forward probability of default, probability of default term structure
    JEL: C02 C13 C18 C40 C44 C51 C52 C53 C58 C61 C63
    Date: 2019–03–18
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:93398&r=all
  3. By: Yang, Bill Huajian
    Abstract: Given a risk outcome y over a rating system {R_i }_(i=1)^k for a portfolio, we show in this paper that the maximum likelihood estimates with monotonic constraints, when y is binary (the Bernoulli likelihood) or takes values in the interval 0≤y≤1 (the quasi-Bernoulli likelihood), are each given by the average of the observed outcomes for some consecutive rating indexes. These estimates are on average equal to the sample average risk over the portfolio and coincide with the estimates by least squares under the same monotonic constraints. These results are the exact solution of the corresponding constrained optimization. A non-parametric algorithm for the exact solution is proposed. For the least squares estimates, this algorithm is compared with the “pool adjacent violators” algorithm for isotonic regression. The proposed approaches provide a resolution to flip-over credit risk and a tool for determining fair risk scales over a rating system.
    Keywords: risk scale, maximum likelihood, least squares, isotonic regression, flip-over credit risk
    JEL: C10 C13 C14 C18 C6 C61 C63 C65 C67 C8 C80 G12 G17 G18 G3 G32 G35
    Date: 2019–03–18
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:93389&r=all
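The abstracts above (entries 2 and 3) benchmark their exact solutions against the classical “pool adjacent violators” (PAV) algorithm for isotonic regression. As a point of reference only — this is a textbook PAV sketch, not the author's proposed non-parametric algorithm — PAV produces exactly the kind of estimate the abstract describes: each fitted value is the average of the observed outcomes over a block of consecutive rating indexes.

```python
def pav_isotonic(y, w=None):
    """Pool Adjacent Violators: (weighted) least-squares isotonic,
    i.e. non-decreasing, fit. Each fitted value is the weighted average
    of the observed outcomes over a block of consecutive indexes."""
    if w is None:
        w = [1.0] * len(y)
    blocks = []  # each block: [weighted sum, total weight, index count]
    for yi, wi in zip(y, w):
        blocks.append([wi * yi, wi, 1])
        # Merge backwards while adjacent block means violate monotonicity.
        while len(blocks) > 1 and (blocks[-2][0] / blocks[-2][1]
                                   > blocks[-1][0] / blocks[-1][1]):
            last = blocks.pop()
            prev = blocks.pop()
            blocks.append([prev[0] + last[0], prev[1] + last[1],
                           prev[2] + last[2]])
    fit = []
    for total, weight, count in blocks:
        fit.extend([total / weight] * count)
    return fit

# A "flip-over" pattern across four ratings: the raw averages 0.30 > 0.20
# violate monotonicity, so PAV pools them into one block average of 0.25.
fit = pav_isotonic([0.10, 0.30, 0.20, 0.50])
```

Note that the pooled fit preserves the overall sample average, consistent with the abstract's statement that the monotonic estimates are on average equal to the sample average risk over the portfolio.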
  4. By: Fei Sun
    Abstract: Since investors and regulators pay more attention to losses than to gains, we study a new class of risk statistics, named loss-based risk statistics, in this paper. This new class can be considered both an extension of the risk statistics introduced by Kou, Peng and Heyde (2013) and a data-based version of the loss-based risk measures introduced by Cont et al. (2013) and Sun et al. (2018).
    Date: 2019–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1904.11032&r=all
  5. By: Giovanni Caggiano (Monash University and University of Padova); Efrem Castelnuovo (Melbourne Institute: Applied Economic & Social Research, The University of Melbourne); Gabriela Nodari (Reserve Bank of Australia)
    Abstract: We employ real-time data available to US monetary policy makers to estimate a Taylor rule augmented with a measure of financial uncertainty over the period 1969-2008. We find evidence in favor of a systematic response to financial uncertainty over and above that to expected inflation, the output gap, and output growth. However, this evidence pertains to the Greenspan-Bernanke period only. Focusing on this period, the "risk-management" approach is found to be responsible for monetary policy easings of up to 75 basis points of the federal funds rate.
    Keywords: Risk management-driven policy rate gap, uncertainty, monetary policy, Taylor rules, real-time data
    JEL: C2 E4 E5
    Date: 2018–08
    URL: http://d.repec.org/n?u=RePEc:iae:iaewps:wp2018n10&r=all
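An uncertainty-augmented Taylor rule of the kind estimated in entry 5 can be sketched as a simple linear reaction function. The coefficients and neutral-rate values below are illustrative placeholders, not the paper's estimates; the negative loading on uncertainty stands in for the "risk-management" easing the abstract documents.

```python
def policy_rate(inflation, output_gap, output_growth, uncertainty,
                r_star=2.0, pi_star=2.0,
                a=0.5, b=0.5, c=0.5, d=-0.25):
    """Stylized uncertainty-augmented Taylor rule (hypothetical
    coefficients). A negative d eases the policy rate when financial
    uncertainty rises, over and above the usual inflation/output terms."""
    return (r_star + inflation
            + a * (inflation - pi_star)
            + b * output_gap
            + c * output_growth
            + d * uncertainty)

# With inflation at target and closed gaps, a two-unit uncertainty shock
# lowers the implied rate from 4.0% to 3.5%, i.e. a 50-basis-point easing.
baseline = policy_rate(2.0, 0.0, 0.0, 0.0)
eased = policy_rate(2.0, 0.0, 0.0, 2.0)
```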
  6. By: Wang, Frank Xuyan
    Abstract: The shape factor, defined as kurtosis divided by skewness squared, K/S^2, is characterized as the only choice among all factors K/|S|^α, α>0, that is greater than or equal to 1 for all probability distributions. For a specific distribution family, there may exist α>2 such that min K/|S|^α ≥ 1. The least upper bound of all such α is defined as the distribution’s characteristic number. The extreme values of the shape factor for various distributions that were previously found numerically, for the Beta, Kumaraswamy, Weibull, and GB2 distributions, are derived using asymptotic analysis. The match between the numerical and the analytical results can be considered proof of each other. The characteristic numbers of these distributions are also calculated. The study of the boundary value of the shape factor, or shape factor asymptotic analysis, helps reveal properties of the original shape factor and relationships between distributions, such as between the Kumaraswamy distribution and the Weibull distribution.
    Keywords: Shape Factor, Skewness, Kurtosis, Asymptotic Expansion, Beta Distribution, Kumaraswamy Distribution, Weibull Distribution, GB2 Distribution, Computer Algebra System, Numerical Optimization, Characteristic Number.
    JEL: C02 C46 C88 G22
    Date: 2019
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:93357&r=all
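The K/S^2 ≥ 1 bound in entry 6 is easy to check numerically for a given family. A reader's sketch for an asymmetric Beta distribution, using the closed-form Beta skewness and full (non-excess) kurtosis — the convention under which the bound holds, since Pearson's inequality gives kurtosis ≥ skewness² + 1; this is an illustration, not the paper's asymptotic derivation:

```python
from math import sqrt

def beta_shape_factor(a, b):
    """Shape factor K/S^2 for the Beta(a, b) distribution (a != b, so
    that skewness is nonzero), from the closed-form moments."""
    # Skewness of Beta(a, b).
    s = 2.0 * (b - a) * sqrt(a + b + 1.0) / ((a + b + 2.0) * sqrt(a * b))
    # Excess kurtosis of Beta(a, b); add 3 for full (Pearson) kurtosis.
    excess = (6.0 * ((a - b) ** 2 * (a + b + 1.0) - a * b * (a + b + 2.0))
              / (a * b * (a + b + 2.0) * (a + b + 3.0)))
    k = excess + 3.0
    return k / s ** 2

# For Beta(2, 5): S^2 = 32/90, K = 2.88, so K/S^2 = 8.1, well above 1.
sf = beta_shape_factor(2.0, 5.0)
```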
  7. By: A.bary, amr
    Abstract: The objective of this paper is to develop a framework for measuring capital adequacy by assessing a bank’s risks according to the Basel norms, with respect to the Tier 1 and Tier 2 components of capital adequacy.
    Keywords: Capital adequacy ratio (CAR), Liquidity, Credit Risk, Loan to Deposits (LTD), Equity to Assets (ETA), Retained Earnings (RE).
    JEL: G2 G21
    Date: 2019–02–08
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:93388&r=all
  8. By: Junyao Chen; Tony Sit; Hoi Ying Wong
    Abstract: Value-at-risk (VaR) has served as a standard risk measure since its introduction. In practice, the delta-normal approach is usually adopted to approximate the VaR of portfolios with option positions. Its effectiveness, however, diminishes substantially when the portfolios concerned involve a high dimension of derivative positions with nonlinear payoffs; the lack of closed-form pricing solutions for these potentially highly correlated, American-style derivatives further complicates the problem. This paper proposes a generic simulation-based algorithm for VaR estimation that can be easily applied to any existing procedure. Our proposal leverages cross-sectional information and applies variable selection techniques to simplify the existing simulation framework. Asymptotic properties of the new approach demonstrate faster convergence due to the additional model selection component introduced. We also present sets of numerical results that verify the effectiveness of our approach in comparison with some existing strategies.
    Date: 2019–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1904.09088&r=all
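The contrast drawn in entry 8 — delta-normal approximation versus full revaluation by simulation — can be illustrated with a toy one-factor position. This is a minimal plain Monte Carlo sketch, not the paper's algorithm (which adds cross-sectional information and variable selection on top of such a simulation framework); the delta and gamma values are hypothetical.

```python
import random

def mc_var(pnl_fn, n=100_000, alpha=0.99, seed=0):
    """Plain Monte Carlo VaR: simulate standard-normal risk-factor
    shocks, revalue the position via pnl_fn, and return the empirical
    loss quantile at level alpha."""
    rng = random.Random(seed)
    losses = sorted(-pnl_fn(rng.gauss(0.0, 1.0)) for _ in range(n))
    return losses[int(alpha * n)]

# Delta-gamma P&L of a hypothetical short-option position. The
# delta-normal VaR here would be |delta| * 2.33 ≈ 2.33 (unit volatility);
# full revaluation picks up the negative gamma and reports a larger loss.
delta, gamma = -1.0, -0.5
var = mc_var(lambda z: delta * z + 0.5 * gamma * z * z)
```

For this position the loss is z + 0.25 z², so the 99% loss quantile sits near 3.7 — well above the delta-normal figure, which is exactly the kind of understatement the abstract warns about for nonlinear payoffs.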
  9. By: Rogelio A. Mancisidor; Michael Kampffmeyer; Kjersti Aas; Robert Jenssen
    Abstract: Credit scoring models based only on accepted applications may be biased, and the consequences can have a statistical and economic impact. Reject inference is the process of attempting to infer the creditworthiness status of the rejected applications. In this research, we use deep generative models to develop two new semi-supervised Bayesian models for reject inference in credit scoring, in which we model the data-generating process to be dependent on a Gaussian mixture. The goal is to improve the classification accuracy of credit scoring models by adding rejected applications. Our proposed models infer the unknown creditworthiness of the rejected applications by exact enumeration of the two possible outcomes of the loan (default or non-default). The efficient stochastic gradient optimization technique used in deep generative models makes our models suitable for large data sets. Finally, the experiments in this research show that our proposed models perform better than classical and alternative machine learning models for reject inference in credit scoring.
    Date: 2019–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1904.11376&r=all
  10. By: Yang, Bill Huajian
    Abstract: Minimum cross-entropy estimation is an extension of maximum likelihood estimation for multinomial probabilities. Given a probability distribution {r_i }_(i=1)^k, we show in this paper that the monotonic estimates {p_i }_(i=1)^k for the probability distribution by minimum cross-entropy are each given by the simple average of the given distribution values over some consecutive indexes. Results extend to monotonic estimation for multivariate outcomes by generalized cross-entropy. These estimates are the exact solution of the corresponding constrained optimization and coincide with the monotonic estimates by least squares. A non-parametric algorithm for the exact solution is proposed and is compared to the “pool adjacent violators” algorithm in the least squares case of the isotonic regression problem. Applications to the monotonic estimation of migration matrices and risk scales for multivariate outcomes are discussed.
    Keywords: maximum likelihood, cross-entropy, least squares, isotonic regression, constrained optimization, multivariate risk scales
    JEL: C13 C18 C4 C44 C5 C51 C52 C53 C54 C58 C61 C63
    Date: 2019–03
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:93400&r=all
  11. By: Pástor, Luboš; Stambaugh, Robert F.
    Abstract: The Critical Finance Review commissioned Li, Novy-Marx, and Velikov (2017) and Pontiff and Singla (2019) to replicate the results in Pastor and Stambaugh (2003). Both studies successfully replicate our market-wide liquidity measure and find similar estimates of the liquidity risk premium. In the sample period after our study, the liquidity risk premium estimates are even larger, and the liquidity measure displays sharp drops during the 2008 financial crisis. We respond to both replication studies and offer some related thoughts, such as when to use our traded versus non-traded liquidity factors and how to improve the precision of liquidity beta estimates.
    Keywords: liquidity; liquidity beta; liquidity factor; liquidity risk
    JEL: G12
    Date: 2019–04
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:13680&r=all
  12. By: Mauricio Calani
    Abstract: We argue that financial institutions responded to the regulation by raising their acceptance standards for borrowers, enhancing the quality of their portfolios, but also contracting their supply of mortgage credit. We reach this conclusion by developing a stylized imperfect-information model, which we use to guide our empirical analysis. We conclude that the loan-to-value (LTV) ratio was 2.8% lower for the mean borrower, and 9.8% lower for the median borrower, because of the regulation. Our paper contributes to the literature on the evaluation of macro-prudential policies, which has mainly exploited cross-country evidence. Our analysis, in turn, narrows down to one particular policy in the mortgage market and dissects its effects by exploiting unique administrative tax data on the census of all real estate transactions in Chilean territory over the period 2012-2016.
    Keywords: loan loss provisions, LTV, screening, coarsened exact matching, macroprudential policy
    JEL: G21 R31
    Date: 2019–04
    URL: http://d.repec.org/n?u=RePEc:bis:biswps:780&r=all

This nep-rmg issue is ©2019 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.