
on Risk Management 
Issue of 2019‒04‒29
twelve papers chosen by 
By:  Yongjie Wang 
Abstract:  This paper studies an optimal portfolio problem for a DC pension plan considering both interest rate risk and longevity risk. In the accumulation phase, plan members pay constant contributions continuously into the pension fund. We assume that the evolution of the mortality rate of all plan members can be described by the same stochastic process, and a representative member is chosen to study the problem. At retirement, the pension fund is used to purchase a lifetime annuity, and a minimum guarantee is required by the representative member. To hedge the longevity risk, we introduce a mortality-linked security, i.e. a longevity bond, into the financial market. The pension manager makes investment decisions for the benefit and on behalf of the representative pension member, whose objective is to maximize the expected utility of the terminal surplus between the final fund level and the minimum guarantee. To solve the initial constrained, non-self-financing optimization problem, we transform it into an unconstrained, self-financing problem by replicating the future contributions and the minimum guarantee. By applying the dynamic programming method, analytical solutions to the equivalent optimization problem are derived, and optimal investment strategies for the original problem are obtained by simple calculations. The numerical applications reveal that longevity risk has an important impact on the investment strategies and show evidence that mortality-linked securities could provide an efficient way to hedge longevity risk. 
Date:  2019–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1904.10229&r=all 
By:  Yang, Bill Huajian 
Abstract:  Monotonic estimation for the survival probability of a loan in a risk-rated portfolio is based on the observation, arising for example from loan pricing, that a loan with a lower credit risk rating is more likely to survive than a loan with a higher credit risk rating, given the same additional risk covariates. Two probit-type discrete-time hazard rate models that generate monotonic survival probabilities are proposed in this paper. The first model calculates the discrete-time hazard rate conditional on systematic risk factors. As in the Cox proportional hazards model, the model formulates the discrete-time hazard rate by including a baseline component. This baseline component can be estimated outside the model, in the absence of model covariates, using the long-run average discrete-time hazard rate, resulting in a significant reduction in the number of parameters that would otherwise be estimated inside the model. The second model is a general-form model in which loan-level factors can be included. Parameter estimation algorithms are also proposed. The models and algorithms proposed in this paper can be used for loan pricing, stress testing, expected credit loss estimation, and modeling of the probability of default term structure. 
Keywords:  loan pricing, survival probability, Cox proportional hazards model, baseline hazard rate, forward probability of default, probability of default term structure 
JEL:  C02 C13 C18 C40 C44 C51 C52 C53 C58 C61 C63 
Date:  2019–03–18 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:93398&r=all 
By:  Yang, Bill Huajian 
Abstract:  Given a risk outcome y over a rating system {R_i}_{i=1}^k for a portfolio, we show in this paper that the maximum likelihood estimates with monotonic constraints, when y is binary (the Bernoulli likelihood) or takes values in the interval 0 ≤ y ≤ 1 (the quasi-Bernoulli likelihood), are each given by the average of the observed outcomes over some consecutive rating indexes. These estimates are on average equal to the sample average risk over the portfolio and coincide with the estimates by least squares under the same monotonic constraints. These results are the exact solution of the corresponding constrained optimization. A nonparametric algorithm for the exact solution is proposed. For the least squares estimates, this algorithm is compared with the “pool adjacent violators” algorithm for isotonic regression. The proposed approaches provide a resolution to flip-over credit risk and a tool to determine fair risk scales over a rating system. 
Keywords:  risk scale, maximum likelihood, least squares, isotonic regression, flip-over credit risk 
JEL:  C10 C13 C14 C18 C6 C61 C63 C65 C67 C8 C80 G12 G17 G18 G3 G32 G35 
Date:  2019–03–18 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:93389&r=all 
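The monotonic estimates described above (averages of observed outcomes over blocks of consecutive rating indexes) are exactly what the classical “pool adjacent violators” algorithm produces in the least-squares case. A minimal sketch of that algorithm, with illustrative data not taken from the paper:

```python
def pava(y, w=None):
    """Pool Adjacent Violators: least-squares isotonic (non-decreasing) fit.

    Each fitted value is the weighted average of the observed outcomes over
    a block of consecutive indexes, matching the "average over consecutive
    rating indexes" form of the monotonic estimates in the abstract above.
    """
    n = len(y)
    w = [1.0] * n if w is None else list(w)
    blocks = []  # each block: [mean, total weight, count of pooled indexes]
    for yi, wi in zip(y, w):
        blocks.append([yi, wi, 1])
        # Merge backwards while the monotonic constraint is violated.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, c2 = blocks.pop()
            m1, w1, c1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(m1 * w1 + m2 * w2) / wt, wt, c1 + c2])
    fit = []
    for m, _, c in blocks:
        fit.extend([m] * c)
    return fit

# Illustrative: observed average risk per rating grade, not monotonic.
print(pava([1, 3, 2, 4]))  # the 3 and 2 are pooled to their average 2.5
```

With uniform weights the fitted values preserve the overall sample average, consistent with the abstract's remark that the estimates are on average equal to the sample average risk over the portfolio.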
By:  Fei Sun 
Abstract:  Since investors and regulators pay more attention to losses than to gains, we study a new class of risk statistics, named loss-based risk statistics, in this paper. This new class can be considered an extension of the risk statistics introduced by Kou, Peng and Heyde (2013), and also a data-based version of the loss-based risk measures introduced by Cont et al. (2013) and Sun et al. (2018). 
Date:  2019–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1904.11032&r=all 
By:  Giovanni Caggiano (Monash University and University of Padova); Efrem Castelnuovo (Melbourne Institute: Applied Economic & Social Research, The University of Melbourne); Gabriela Nodari (Reserve Bank of Australia) 
Abstract:  We employ real-time data available to US monetary policy makers to estimate a Taylor rule augmented with a measure of financial uncertainty over the period 1969–2008. We find evidence in favor of a systematic response to financial uncertainty over and above that to expected inflation, the output gap, and output growth. However, this evidence applies to the Greenspan–Bernanke period only. Focusing on this period, the "risk-management" approach is found to be responsible for monetary policy easings of up to 75 basis points of the federal funds rate. 
Keywords:  Risk-management-driven policy rate gap, uncertainty, monetary policy, Taylor rules, real-time data 
JEL:  C2 E4 E5 
Date:  2018–08 
URL:  http://d.repec.org/n?u=RePEc:iae:iaewps:wp2018n10&r=all 
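An uncertainty-augmented Taylor rule of the kind estimated above can be sketched as follows. The coefficients here are illustrative placeholders (the classic 0.5 response weights plus a hypothetical negative response to uncertainty capturing a "risk-management" easing), not the paper's estimates:

```python
def taylor_rule_rate(r_star, pi, pi_target, output_gap, uncertainty,
                     a_pi=0.5, a_y=0.5, a_u=-0.75):
    """Illustrative Taylor rule augmented with a financial-uncertainty term.

    r_star: equilibrium real rate; pi: expected inflation; output_gap in %.
    a_pi, a_y are the classic Taylor (1993) weights; a_u < 0 is a
    hypothetical easing response to uncertainty, not an estimated value.
    """
    return (r_star + pi
            + a_pi * (pi - pi_target)
            + a_y * output_gap
            + a_u * uncertainty)

# With inflation on target and a closed output gap, one unit of uncertainty
# lowers the implied policy rate by 75 basis points under these placeholders.
print(taylor_rule_rate(2.0, 2.0, 2.0, 0.0, 1.0))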
By:  Wang, Frank Xuyan 
Abstract:  The shape factor, defined as kurtosis divided by skewness squared, K/S^2, is characterized as the only choice among all factors K/S^α, α > 0, that is greater than or equal to 1 for all probability distributions. For a specific distribution family, there may exist α > 2 such that min K/S^α ≥ 1. The least upper bound of all such α is defined as the distribution's characteristic number. The extreme values of the shape factor for various distributions previously found numerically (the Beta, Kumaraswamy, Weibull, and GB2 distributions) are derived using asymptotic analysis. The match between the numerical and the analytical results can be considered proof of each other. The characteristic numbers of these distributions are also calculated. The study of the boundary values of the shape factor, i.e. shape factor asymptotic analysis, helps reveal properties of the original shape factor and relationships between distributions, such as between the Kumaraswamy distribution and the Weibull distribution. 
Keywords:  Shape Factor, Skewness, Kurtosis, Asymptotic Expansion, Beta Distribution, Kumaraswamy Distribution, Weibull Distribution, GB2 Distribution, Computer Algebra System, Numerical Optimization, Characteristic Number. 
JEL:  C02 C46 C88 G22 
Date:  2019 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:93357&r=all 
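The claim that K/S^2 ≥ 1 for every distribution can be checked numerically on sample moments, assuming K denotes the full fourth standardized moment (not excess kurtosis), for which the classical bound K ≥ S^2 + 1 also holds for any empirical distribution. A minimal sketch with an arbitrary skewed sample, not data from the paper:

```python
def standardized_moments(xs):
    """Population-style skewness S and kurtosis K (full fourth standardized
    moment, not excess) of a sample."""
    n = len(xs)
    mu = sum(xs) / n
    m2 = sum((x - mu) ** 2 for x in xs) / n
    m3 = sum((x - mu) ** 3 for x in xs) / n
    m4 = sum((x - mu) ** 4 for x in xs) / n
    s = m2 ** 0.5
    return m3 / s ** 3, m4 / s ** 4

# Any skewed sample works; these values are purely illustrative.
S, K = standardized_moments([0.1, 0.3, 0.5, 1.2, 2.9])
shape_factor = K / S ** 2
print(S, K, shape_factor)  # shape factor is >= 1, as the paper characterizes
```

The inequality K ≥ S^2 + 1 (hence K/S^2 ≥ 1) is a distributional identity, so it holds for the empirical distribution of any finite sample as well.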
By:  A.bary, amr 
Abstract:  The objective of this paper is to develop a framework for measuring capital adequacy by assessing a bank's risks according to the Basel norms, with respect to the Tier 1 and Tier 2 components of capital adequacy. 
Keywords:  Capital adequacy ratio (CAR), Liquidity, Credit Risk, Loan to Deposits (LTD), Equity to Assets (ETA), Retained Earnings (RE). 
JEL:  G2 G21 
Date:  2019–02–08 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:93388&r=all 
By:  Junyao Chen; Tony Sit; Hoi Ying Wong 
Abstract:  Value-at-risk (VaR) has played the role of a standard risk measure since its introduction. In practice, the delta-normal approach is usually adopted to approximate the VaR of portfolios with option positions. Its effectiveness, however, diminishes substantially when the portfolios concerned involve a high-dimensional set of derivative positions with nonlinear payoffs; the lack of closed-form pricing solutions for these potentially highly correlated, American-style derivatives further complicates the problem. This paper proposes a generic simulation-based algorithm for VaR estimation that can be easily applied to any existing procedure. Our proposal leverages cross-sectional information and applies variable selection techniques to simplify the existing simulation framework. Asymptotic properties of the new approach demonstrate faster convergence due to the additional model selection component introduced. We also present numerical results that verify the effectiveness of our approach in comparison with some existing strategies. 
Date:  2019–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1904.09088&r=all 
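For reference, the delta-normal approximation that the abstract contrasts with its simulation-based method can be sketched as follows; the positions and covariance values are purely illustrative:

```python
import math

def delta_normal_var(deltas, cov, alpha_z=1.645, horizon_days=1):
    """Delta-normal VaR: portfolio P&L approximated as deltas . dS, Gaussian.

    deltas: position deltas (exposure per unit move in each risk factor)
    cov:    daily covariance matrix of risk-factor changes
    alpha_z: standard-normal quantile (1.645 is roughly the 95% level)
    Returns the (positive) VaR over the given horizon.
    """
    n = len(deltas)
    # Portfolio P&L variance: delta' * Cov * delta
    var_pnl = sum(deltas[i] * cov[i][j] * deltas[j]
                  for i in range(n) for j in range(n))
    return alpha_z * math.sqrt(var_pnl * horizon_days)

# Illustrative two-factor book: long 100 deltas of one factor, short 50 of
# another, with made-up daily covariances.
cov = [[0.04, 0.01],
       [0.01, 0.09]]
print(delta_normal_var([100.0, -50.0], cov))
```

The linear (delta-only) P&L and Gaussian assumption are exactly what breaks down for large books of nonlinear, American-style derivatives, which is the gap the paper's simulation-based algorithm targets.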
By:  Rogelio A. Mancisidor; Michael Kampffmeyer; Kjersti Aas; Robert Jenssen 
Abstract:  Credit scoring models based on accepted applications may be biased, and their consequences can have a statistical and economic impact. Reject inference is the process of attempting to infer the creditworthiness of rejected applications. In this research, we use deep generative models to develop two new semi-supervised Bayesian models for reject inference in credit scoring, in which we model the data-generating process as dependent on a Gaussian mixture. The goal is to improve classification accuracy in credit scoring models by incorporating rejected applications. Our proposed models infer the unknown creditworthiness of the rejected applications by exact enumeration of the two possible outcomes of the loan (default or non-default). The efficient stochastic gradient optimization technique used in deep generative models makes our models suitable for large data sets. Finally, the experiments in this research show that our proposed models perform better than classical and alternative machine learning models for reject inference in credit scoring. 
Date:  2019–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1904.11376&r=all 
By:  Yang, Bill Huajian 
Abstract:  Minimum cross-entropy estimation is an extension of maximum likelihood estimation for multinomial probabilities. Given a probability distribution {r_i}_{i=1}^k, we show in this paper that the monotonic estimates {p_i}_{i=1}^k for the probability distribution by minimum cross-entropy are each given by the simple average of the given distribution values over some consecutive indexes. The results extend to monotonic estimation for multivariate outcomes by generalized cross-entropy. These estimates are the exact solution of the corresponding constrained optimization and coincide with the monotonic estimates by least squares. A nonparametric algorithm for the exact solution is proposed. The algorithm is compared to the “pool adjacent violators” algorithm in the least-squares case of the isotonic regression problem. Applications to monotonic estimation of migration matrices and risk scales for multivariate outcomes are discussed. 
Keywords:  maximum likelihood, cross-entropy, least squares, isotonic regression, constrained optimization, multivariate risk scales 
JEL:  C13 C18 C4 C44 C5 C51 C52 C53 C54 C58 C61 C63 
Date:  2019–03 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:93400&r=all 
By:  Pástor, Luboš; Stambaugh, Robert F. 
Abstract:  The Critical Finance Review commissioned Li, Novy-Marx, and Velikov (2017) and Pontiff and Singla (2019) to replicate the results in Pastor and Stambaugh (2003). Both studies successfully replicate our market-wide liquidity measure and find similar estimates of the liquidity risk premium. In the sample period after our study, the liquidity risk premium estimates are even larger, and the liquidity measure displays sharp drops during the 2008 financial crisis. We respond to both replication studies and offer some related thoughts, such as when to use our traded versus non-traded liquidity factors and how to improve the precision of liquidity beta estimates. 
Keywords:  liquidity; liquidity beta; liquidity factor; liquidity risk 
JEL:  G12 
Date:  2019–04 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:13680&r=all 
By:  Mauricio Calani 
Abstract:  We argue that financial institutions responded to the regulation by raising the lending standards required of borrowers, enhancing the quality of their portfolios but also contracting their supply of mortgage credit. We reach this conclusion by developing a stylized imperfect-information model that we use to guide our empirical analysis. We conclude that the loan-to-value (LTV) ratio was 2.8% lower for the mean borrower, and 9.8% lower for the median borrower, because of the regulation. Our paper contributes to the literature on the evaluation of macroprudential policies, which has mainly exploited cross-country evidence. In turn, our analysis narrows down to one particular policy in the mortgage market and dissects its effects by exploiting unique administrative tax data on the census of all real estate transactions in Chilean territory over the period 2012–2016. 
Keywords:  loan loss provisions, LTV, screening, coarsened exact matching, macroprudential policy 
JEL:  G21 R31 
Date:  2019–04 
URL:  http://d.repec.org/n?u=RePEc:bis:biswps:780&r=all 