nep-rmg New Economics Papers
on Risk Management
Issue of 2020‒11‒16
23 papers chosen by
Stan Miles
Thompson Rivers University

  1. Excursion Risk By Anna Ananova; Rama Cont; Renyuan Xu
  2. Transition Risks and Opportunities in Residential Mortgages By Franziska Schütze
  3. Rule-based Strategies for Dynamic Life Cycle Investment By T. R. B. den Haan; K. W. Chau; M. van der Schans; C. W. Oosterlee
  4. Deep learning for CVA computations of large portfolios of financial derivatives By Kristoffer Andersson; Cornelis W. Oosterlee
  5. Financial Data Analysis Using Expert Bayesian Framework For Bankruptcy Prediction By Amir Mukeri; Habibullah Shaikh; Dr. D. P. Gaikwad
  6. Liquidity, Interbank Network Topology and Bank Capital By Aref Ardekani
  7. Maximum Spectral Measures of Risk with given Risk Factor Marginal Distributions By Mario Ghossoub; Jesse Hall; David Saunders
  8. Robust Optimization Approaches for Portfolio Selection: A Computational and Comparative Analysis By A. Georgantas
  9. A deep neural network algorithm for semilinear elliptic PDEs with applications in insurance mathematics By Stefan Kremsner; Alexander Steinicke; Michaela Szölgyenyi
  10. Prediction accuracy of bivariate score-driven risk premium and volatility filters: an illustration for the Dow Jones By Licht, Adrian; Escribano Saez, Alvaro; Blazsek, Szabolcs Istvan
  11. Granular Credit Risk By Sigurd Galaasen; Rustam Jamilov; Ragnar Juelsrud; Hélène Rey
  12. Uncertainty due to Infectious Diseases and Forecastability of the Realized Variance of US REITs: A Note By Matteo Bonato; Oguzhan Cepni; Rangan Gupta; Christian Pierdzioch
  13. Fear and Volatility in Digital Assets By Faizaan Pervaiz; Christopher Goh; Ashley Pennington; Samuel Holt; James West; Shaun Ng
  14. Hedging with commodity futures and the end of normal backwardation By Jochen Güntner; Benjamin Karner
  15. Machine learning in credit risk: measuring the dilemma between prediction and supervisory cost By Andrés Alonso; José Manuel Carbó
  16. Mortgage Loss Severities: What Keeps Them So High? By Xudong An; Lawrence R. Cordell
  17. Do Oil-Price Shocks Predict the Realized Variance of U.S. REITs? By Matteo Bonato; Rangan Gupta; Christian Pierdzioch
  18. Realized volatility, jump and beta: evidence from Canadian stock market By Gajurel, Dinesh; Chowdhury, Biplob
  19. Optimal Portfolio Using Factor Graphical Lasso By Tae-Hwy Lee; Ekaterina Seregina
  20. Risk Preferences and Efficiency of Household Portfolios By Agostino Capponi; Zhaoyu Zhang
  21. Recurrent Conditional Heteroskedasticity By T. -N. Nguyen; M. -N. Tran; R. Kohn
  22. Screening and Loan Origination Time: Lending Standards, Loan Defaults and Bank Failures By Bedayo, Mikel; Jiménez, Gabriel; Peydró, José-Luis; Vegas, Raquel
  23. Analysis of the Impact of High-Frequency Trading on Artificial Market Liquidity By Isao Yagi; Yuji Masuda; Takanobu Mizuta

  1. By: Anna Ananova; Rama Cont; Renyuan Xu
    Abstract: The risk and return profiles of a broad class of dynamic trading strategies, including pairs trading and other statistical arbitrage strategies, may be characterized in terms of excursions of the market price of a portfolio away from a reference level. We propose a mathematical framework for the risk analysis of such strategies, based on a description in terms of price excursions, first in a pathwise setting, without probabilistic assumptions, then in a Markovian setting. We introduce the notion of delta-excursion, defined as a path which deviates by delta from a reference level before returning to this level. We show that every continuous path has a unique decomposition into delta-excursions, which is useful for the scenario analysis of dynamic trading strategies, leading to simple expressions for the number of trades, realized profit, maximum loss and drawdown. As delta is decreased to zero, properties of this decomposition relate to the local time of the path. When the underlying asset follows a Markov process, we combine these results with Ito's excursion theory to obtain a tractable decomposition of the process as a concatenation of independent delta-excursions, whose distribution is described in terms of Ito's excursion measure. We provide analytical results for linear diffusions and give new examples of stochastic processes for flexible and tractable modeling of excursions. Finally, we describe a non-parametric scenario simulation method for generating paths whose excursion properties match those observed in empirical data.
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2011.02870&r=all
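    The abstract's claim that the delta-excursion decomposition yields simple expressions for the number of trades and the realized profit can be illustrated with a short scan of a price path. The sketch below is one reading of the definition (an excursion is recorded once the path deviates by delta from the reference level and then returns to it); it is illustrative only and not the authors' construction.

      import numpy as np

      def delta_excursions(path, reference, delta):
          # Record segments that move at least `delta` away from `reference` and
          # then come back to it.  A contrarian strategy that enters at the delta
          # threshold and exits at the reference earns roughly delta per completed
          # excursion, so len(result) proxies the number of round-trip trades.
          out, state, start, sign = [], "flat", None, 0.0
          for i, p in enumerate(path):
              dev = p - reference
              if state == "flat" and abs(dev) >= delta:
                  state, start, sign = "open", i, np.sign(dev)
              elif state == "open" and sign * dev <= 0.0:
                  out.append((start, i, int(sign)))
                  state = "flat"
          return out

      rng = np.random.default_rng(0)                        # toy random-walk path near 100
      path = 100.0 + 0.1 * np.cumsum(rng.normal(size=10_000))
      print(len(delta_excursions(path, reference=100.0, delta=1.0)), "excursions")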
  2. By: Franziska Schütze
    Abstract: A range of studies has analysed how climate-related risks can impact financial markets, focusing on equity and corporate bond holdings. This article takes a closer look at transition risks and opportunities in residential mortgages. Mortgage loans are important from a financial perspective due to their large share in banks’ assets and their long credit lifetime, and from a climate perspective due to their large share in fossil fuel consumption. The analysis combines data on the energy performance of buildings with financial data on mortgages for Germany and identifies two risk drivers: a carbon price and a performance standard. The scenario analysis shows that expected credit loss can be substantially higher for a “brown” portfolio than for a “green” portfolio. Taking climate policy into account in risk management and strategy can reduce the transition risk and open up new lending opportunities. Financial regulation can promote such behaviour.
    Keywords: Mortgages, residential buildings, carbon risks, transition risks, valuation, climate policy scenarios, policy and regulation
    JEL: G21 Q48 Q56 Q58 R38
    Date: 2020
    URL: http://d.repec.org/n?u=RePEc:diw:diwwpp:dp1910&r=all
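    For readers less familiar with the credit-loss terminology in the entry above: the expected credit loss that the scenario analysis compares across "green" and "brown" portfolios is conventionally decomposed as

      ECL = PD × LGD × EAD,

    i.e. probability of default times loss given default times exposure at default, with the transition-risk drivers (carbon price, performance standard) entering through their effect on borrowers' PD and collateral values. Whether the paper uses exactly this decomposition is not stated in the abstract.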
  3. By: T. R. B. den Haan; K. W. Chau; M. van der Schans; C. W. Oosterlee
    Abstract: In this work, we consider rule-based investment strategies for managing a defined contribution saving scheme under the Dutch pension fund testing model. We find that dynamic rule-based investment can outperform traditional static strategies, in the sense that the pensioner achieves the target retirement income with higher probability and limits the shortfall when the target is not met. In comparison with the popular dynamic programming technique, the rule-based strategy has a more stable asset allocation over time and avoids excessive transactions, which may be hard to explain to the investor. We also study a strategy that combines rule-based targets with dynamic programming. Another key feature of this work is that there is no risk-free asset in our setting; instead, a matching portfolio is introduced so that the investor can avoid unnecessary risk.
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2011.02596&r=all
  4. By: Kristoffer Andersson; Cornelis W. Oosterlee
    Abstract: In this paper, we propose a neural network-based method for CVA computations of a portfolio of derivatives. In particular, we focus on portfolios consisting of a combination of derivatives with and without true optionality, e.g., a mix of European- and Bermudan-type derivatives. CVA is computed, with and without netting, for different levels of WWR and for different levels of credit quality of the counterparty. We show that the CVA is overestimated by up to 25% under the standard procedure of not adjusting the exercise strategy for the default risk of the counterparty. For the Expected Shortfall of the CVA dynamics, the overestimation exceeds 100% in some non-extreme cases.
    Date: 2020–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2010.13843&r=all
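    For reference, the textbook unilateral CVA approximation that such computations build on is

      CVA ≈ (1 - R) \sum_{i=1}^{n} EE*(t_i) Q(t_{i-1} < τ ≤ t_i),

    where R is the recovery rate, EE*(t_i) the discounted expected positive exposure at t_i, and Q(t_{i-1} < τ ≤ t_i) the risk-neutral probability that the counterparty defaults in (t_{i-1}, t_i]. The paper's contribution concerns how the exposure profile is computed for portfolios with early-exercise features, netting and wrong-way risk (WWR); the display above is the common starting point, not the authors' exact estimator.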
  5. By: Amir Mukeri; Habibullah Shaikh; Dr. D. P. Gaikwad
    Abstract: In recent years, bankruptcy forecasting has gained a lot of attention from researchers and practitioners in the field of financial risk management. The various approaches proposed in the past and currently in practice rely on accounting ratios combined with statistical modeling or machine learning methods, and have had varying degrees of success. Models such as Linear Discriminant Analysis or Artificial Neural Networks employ discriminative classification techniques and lack an explicit provision for including prior expert knowledge. In this paper, we propose an alternative route of generative modeling using an Expert Bayesian framework. The biggest advantage of the proposed framework is the explicit inclusion of expert judgment in the modeling process; the methodology also provides a way to quantify uncertainty in predictions. As a result, the model built using the Bayesian framework is highly flexible, interpretable and intuitive. The proposed approach is well suited to highly regulated or safety-critical applications such as finance or medical diagnosis, where accuracy is not the only concern: decision makers and other stakeholders are also interested in the uncertainty of the predictions and the interpretability of the model. We empirically demonstrate these benefits using Stan, a probabilistic programming language. We find that the proposed model is comparable or superior to other existing methods, and that it has a much lower false positive rate than many state-of-the-art methods. The corresponding R code for the experiments is available in a GitHub repository.
    Date: 2020–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2010.13892&r=all
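    The paper implements its model in Stan with R. As a rough illustration, the generative route described above amounts to placing informative priors, encoding expert judgment, on the coefficients of a default model and working with the full posterior. A minimal sketch in Python using PyMC (not the authors' code; data, variable names and priors are hypothetical placeholders):

      import numpy as np
      import pymc as pm

      rng = np.random.default_rng(1)
      X = rng.normal(size=(200, 3))            # placeholder accounting ratios (n x k)
      y = rng.integers(0, 2, size=200)         # placeholder bankruptcy labels

      prior_mean = np.array([0.0, 1.0, -0.5])  # expert judgment, e.g. "leverage raises risk"
      prior_sd = np.array([1.0, 0.5, 0.5])     # tighter scale = stronger expert conviction

      with pm.Model():
          alpha = pm.Normal("alpha", 0.0, 2.0)
          beta = pm.Normal("beta", mu=prior_mean, sigma=prior_sd, shape=3)
          p = pm.math.sigmoid(alpha + pm.math.dot(X, beta))
          pm.Bernoulli("bankrupt", p=p, observed=y)
          # posterior draws quantify the uncertainty in each prediction
          trace = pm.sample(1000, tune=1000, chains=2)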
  6. By: Aref Ardekani (UP1 - Université Panthéon-Sorbonne, CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique, UNILIM - Université de Limoges)
    Abstract: Using an interbank network simulation, this paper examines whether the causal relationship between capital and liquidity is influenced by bank positions in the interbank network. While the existing literature highlights a causal relationship running from liquidity to capital, the question of how interbank network characteristics affect this relationship remains open. Using a sample of commercial banks from 28 European countries, this paper suggests that banks' interconnectedness within interbank loan and deposit networks affects their decisions to set higher or lower regulatory capital ratios when facing higher illiquidity. This study provides support for the need to implement minimum liquidity ratios to complement capital ratios, as stressed by the Basel Committee on Banking Supervision. This paper also highlights the need for regulatory authorities to consider the network characteristics of banks.
    Keywords: Interbank network topology, Bank regulatory capital, Liquidity risk, Basel III
    Date: 2020–10
    URL: http://d.repec.org/n?u=RePEc:hal:journl:halshs-02967226&r=all
  7. By: Mario Ghossoub; Jesse Hall; David Saunders
    Abstract: We consider the problem of determining an upper bound for the value of a spectral risk measure of a loss that is a general nonlinear function of two factors whose marginal distributions are known, but whose joint distribution is unknown. The factors may take values in complete separable metric spaces. We introduce the notion of Maximum Spectral Measure (MSP), as a worst-case spectral risk measure of the loss with respect to the dependence between the factors. The MSP admits a formulation as a solution to an optimization problem that has the same constraint set as the optimal transport problem, but with a more general objective function. We present results analogous to the Kantorovich duality, and we investigate the continuity properties of the optimal value function and optimal solution set with respect to perturbation of the marginal distributions. Additionally, we provide an asymptotic result characterizing the limiting distribution of the optimal value function when the factor distributions are simulated from finite sample spaces. The special case of Expected Shortfall and the resulting Maximum Expected Shortfall is also examined.
    Date: 2020–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2010.14673&r=all
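    In the notation of the abstract, a spectral risk measure and the associated worst-case value over joint laws with fixed marginals can be written (a sketch using the standard definitions, with generic notation) as

      \rho_\phi(L) = \int_0^1 \phi(p) F_L^{-1}(p) dp,
      MSP(g; \mu_1, \mu_2) = \sup_{\pi \in \Pi(\mu_1, \mu_2)} \rho_\phi( g(X_1, X_2) ),   (X_1, X_2) ~ \pi,

    where \phi is a non-negative, non-decreasing weight function integrating to one and \Pi(\mu_1, \mu_2) is the set of couplings of the two marginals, the same constraint set as in optimal transport. Expected Shortfall at level \alpha corresponds to \phi(p) = (1 - \alpha)^{-1} 1{p ≥ \alpha}, which gives the Maximum Expected Shortfall case mentioned above.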
  8. By: A. Georgantas
    Abstract: Portfolio selection is an active research topic that combines elements and methodologies from various fields, such as optimization, decision analysis, risk management, data science, and forecasting. The modeling and treatment of deep uncertainty about future asset returns is a major issue for the success of analytical portfolio selection models. Recently, robust optimization (RO) models have attracted considerable interest in this area. RO provides a computationally tractable framework for portfolio optimization based on relatively general assumptions on the probability distributions of the uncertain risk parameters. Thus, RO extends traditional linear and non-linear models (e.g., the well-known mean-variance model) by incorporating uncertainty into the modeling process through a formal and analytical approach. Robust counterparts of existing models can be considered worst-case re-formulations with respect to deviations of the uncertain parameters from their nominal values. Although several RO models have been proposed in the literature, focusing on various risk measures and different types of uncertainty sets for asset returns, comprehensive empirical assessments of their performance are lacking. The objective of this study is to fill this gap. More specifically, we consider different types of RO models based on popular risk measures and conduct an extensive comparative analysis of their performance using data from the US market during the period 2005-2016.
    Date: 2020–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2010.13397&r=all
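    A representative robust counterpart of the mean-variance model, of the type compared in the study (a generic sketch; the paper's exact formulations and risk measures vary), is

      max_{w \in W}  min_{\mu \in U_\mu, \Sigma \in U_\Sigma}  w'\mu - \lambda w'\Sigma w,

    where the inner minimization takes the worst case over uncertainty sets U_\mu and U_\Sigma built around the nominal estimates, and the geometry of those sets (box, ellipsoidal, etc.) determines the tractability and conservatism of the resulting model.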
  9. By: Stefan Kremsner; Alexander Steinicke; Michaela Szölgyenyi
    Abstract: In insurance mathematics optimal control problems over an infinite time horizon arise when computing risk measures. Their solutions correspond to solutions of deterministic semilinear (degenerate) elliptic partial differential equations. In this paper we propose a deep neural network algorithm for solving such partial differential equations in high dimensions. The algorithm is based on the correspondence of elliptic partial differential equations to backward stochastic differential equations with random terminal time.
    Date: 2020–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2010.15757&r=all
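    The correspondence the abstract refers to is the nonlinear Feynman-Kac representation with a random terminal time: for a semilinear elliptic equation L u + f(x, u, \sigma^T \nabla u) = 0 on a domain D with u = g on the boundary, and \tau the first exit time of the diffusion X from D,

      Y_t = g(X_\tau) + \int_t^\tau f(X_s, Y_s, Z_s) ds - \int_t^\tau Z_s dW_s,   u(x) = Y_0 for X_0 = x,

    where L is the generator of X. Deep BSDE solvers of this kind typically parametrize the solution pair (Y, Z) with neural networks and minimize a simulation-based loss; this schematic omits the degenerate and infinite-horizon cases treated in the paper.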
  10. By: Licht, Adrian; Escribano Saez, Alvaro; Blazsek, Szabolcs Istvan
    Abstract: In this paper, we introduce Beta-t-QVAR (quasi-vector autoregression) for the joint modelling of score-driven location and scale. Asymptotic theory of the maximum likelihood (ML) estimator is presented, and sufficient conditions for consistency and asymptotic normality of ML are proven. For the joint score-driven modelling of risk premium and volatility, Dow Jones Industrial Average (DJIA) data are used in an empirical illustration. The prediction accuracy of Beta-t-QVAR is superior to the prediction accuracies of Beta-t-EGARCH (exponential generalized AR conditional heteroscedasticity), A-PARCH (asymmetric power ARCH), and GARCH (generalized ARCH). The empirical results motivate the use of Beta-t-QVAR for the valuation of DJIA options.
    Keywords: Generalized Autoregressive Score; Dynamic Conditional Score; Risk Premium; Volatility
    JEL: C58 C22
    Date: 2020–11–05
    URL: http://d.repec.org/n?u=RePEc:cte:werepe:31339&r=all
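    For orientation, the generic score-driven (GAS/DCS) recursion behind models of this type is

      f_{t+1} = \omega + A s_t + B f_t,   s_t = S_t \, \partial \ln p(y_t | f_t) / \partial f_t,

    where f_t stacks the time-varying parameters (here location, i.e. the risk premium, and scale, i.e. volatility) and s_t is the scaled score of the conditional density; a Student-t density yields the "Beta-t" updating. The Beta-t-QVAR of the paper adds a quasi-VAR structure to this recursion; the display above is the standard framework, not the paper's exact specification.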
  11. By: Sigurd Galaasen; Rustam Jamilov; Ragnar Juelsrud; Hélène Rey
    Abstract: What is the impact of granular credit risk on banks and on the economy? We provide the first causal identification of single-name counterparty exposure risk in bank portfolios by applying a new empirical approach to an administrative matched bank-firm dataset from Norway. Exploiting the fat-tail properties of the loan share distribution, we use a Gabaix and Koijen (2020a,b) granular instrumental variable strategy to show that idiosyncratic borrower risk survives aggregation in banks' portfolios. We also find that this granular credit risk spills over from affected banks to firms, decreases investment, and increases the probability of default of non-granular borrowers, thereby sizably affecting the macroeconomy.
    JEL: E3 G2
    Date: 2020–10
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:27994&r=all
  12. By: Matteo Bonato (Department of Economics and Econometrics, University of Johannesburg, Auckland Park, South Africa; IPAG Business School, 184 Boulevard Saint-Germain, 75006 Paris, France); Oguzhan Cepni (Copenhagen Business School, Department of Economics, Porcelænshaven 16A, Frederiksberg DK-2000, Denmark; Central Bank of the Republic of Turkey, Haci Bayram Mah. Istiklal Cad. No:10 06050, Ankara, Turkey); Rangan Gupta (Department of Economics, University of Pretoria, Pretoria, 0002, South Africa); Christian Pierdzioch (Department of Economics, Helmut Schmidt University, Holstenhofweg 85, P.O.B. 700822, 22008 Hamburg, Germany)
    Abstract: We examine the forecasting power of a daily newspaper-based index of uncertainty associated with infectious diseases (EMVID) for the realized market variance of United States (US) Real Estate Investment Trusts (REITs) via the heterogeneous autoregressive realized volatility (HAR-RV) model. Our results show that the EMVID index improves the forecast accuracy of the realized variance of REITs at short-, medium-, and long-run horizons in a statistically significant manner. The result is robust to the inclusion of additional controls (leverage, realized jumps, skewness, and kurtosis) capturing extreme market movements, and it carries over to ten sub-sectors of the US REITs market. Our results have important portfolio implications for investors during the current period of unprecedented uncertainty resulting from the outbreak of COVID-19.
    Keywords: Uncertainty, Infectious diseases, REITs, Realized variance, Forecasting
    JEL: C22 C53 G10
    Date: 2020–10
    URL: http://d.repec.org/n?u=RePEc:pre:wpaper:202099&r=all
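    The baseline HAR-RV regression, here augmented with the EMVID index (a sketch of the standard specification with generic notation; the paper's full model adds the controls listed above), is

      RV_{t+h} = \beta_0 + \beta_d RV_t + \beta_w RV_{t-5:t} + \beta_m RV_{t-22:t} + \gamma EMVID_t + \epsilon_{t+h},

    where RV_{t-5:t} and RV_{t-22:t} are averages of daily realized variance over the past week and month (5 and 22 trading days), so that the forecast mixes information from heterogeneous horizons.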
  13. By: Faizaan Pervaiz; Christopher Goh; Ashley Pennington; Samuel Holt; James West; Shaun Ng
    Abstract: We show that Bitcoin implied volatility on a 5-minute time horizon is modestly predictable from price, volatility momentum, and alternative data including sentiment and engagement. Lagged Bitcoin index price and volatility movements contribute to the model alongside Google Trends, with markets often responding several hours later. The code and datasets used in this paper can be found at https://github.com/Globe-Research/bitfear.
    Date: 2020–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2010.15611&r=all
  14. By: Jochen Güntner; Benjamin Karner
    Abstract: Using the S&P GSCI and its five component sub-indices, we show that considering each commodity separately yields nontrivial hedging gains in and out of sample. During 1999–2019, the maximum Sharpe ratio portfolio assigns positive weights to the GSCI Energy, Industrial Metals, and Precious Metals sub-indices, whereas only precious metals enter the optimal portfolio after the financial crisis. In out-of-sample optimizations based on dynamic conditional correlations, a subset of commodity futures excluding the GSCI Agriculture and Livestock sub-indices outperforms conventional stock-bond portfolios with and without the overall GSCI. We argue that the “normal backwardation” in commodity markets has broken down during our sample period.
    Keywords: Commodity futures, Diversification, Hedging, Financial crisis, Normal backwardation
    JEL: C58 G11 G17 Q02
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:jku:econwp:2020-21&r=all
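    The maximum Sharpe ratio (tangency) portfolio referred to above is the textbook solution

      w* \propto \Sigma^{-1} (\mu - r_f 1),

    with the weights rescaled to sum to one, where \mu and \Sigma are the mean vector and covariance matrix of the candidate assets (here the GSCI sub-indices, stocks and bonds) and r_f is the risk-free rate; in the out-of-sample exercise the covariance input comes from the dynamic conditional correlation estimates. This is the standard construction, with details such as portfolio constraints left to the paper.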
  15. By: Andrés Alonso (Banco de España); José Manuel Carbó (Banco de España)
    Abstract: New reports show that the financial sector is increasingly adopting machine learning (ML) tools to manage credit risk. In this environment, supervisors face the challenge of allowing credit institutions to benefit from technological progress and financial innovation, while at the same time ensuring compatibility with regulatory requirements and technological neutrality. We propose a new framework for supervisors to measure the costs and benefits of evaluating ML models, aiming to shed more light on this technology’s alignment with the regulation. We follow three steps. First, we identify the benefits by reviewing the literature. We observe that ML delivers predictive gains of up to 20% in default classification compared with traditional statistical models. Second, we use the process for validating internal ratings-based (IRB) systems for regulatory capital to detect ML’s limitations in credit risk management. We identify up to 13 factors that might constitute a supervisory cost. Finally, we propose a methodology for evaluating these costs. For illustrative purposes, we compute the benefits by estimating the predictive gains of six ML models using a public database on credit default. We then calculate a supervisory cost function through a scorecard in which we assign weights to each factor for each ML model, based on how the model is used by the financial institution and the supervisor’s risk tolerance. From a supervisory standpoint, having a structured methodology for assessing ML models could increase transparency and remove an obstacle to innovation in the financial industry.
    Keywords: artificial intelligence, machine learning, credit risk, interpretability, bias, IRB models
    JEL: C53 D81 G17
    Date: 2020–10
    URL: http://d.repec.org/n?u=RePEc:bde:wpaper:2032&r=all
  16. By: Xudong An; Lawrence R. Cordell
    Abstract: Mortgage loss-given-default (LGD) increased significantly when house prices plummeted during the financial crisis, but it has remained over 40 percent in recent years, despite a strong housing recovery. Our results indicate that the sustained high LGDs post-crisis are due to a combination of an overhang of crisis-era foreclosures and prolonged liquidation timelines, which have offset higher sales recoveries. Simulations show that cutting foreclosure timelines by one year would decrease LGD by 5 to 8 percentage points, depending on the tradeoff between lower liquidation expenses and lower sales recoveries. Using difference-in-differences tests, we also find that recent consumer protection programs have extended foreclosure timelines and increased loss severities, despite their potential benefits of increasing loan modifications and enhancing consumer protections.
    Keywords: loss-given-default (LGD); foreclosure timelines; regulatory changes; Heckman two-stage model; accelerated failure time model
    JEL: G21 G18 C41 C24 G01
    Date: 2020–09–25
    URL: http://d.repec.org/n?u=RePEc:fip:fedpwp:88789&r=all
  17. By: Matteo Bonato (Department of Economics and Econometrics, University of Johannesburg, Auckland Park, South Africa; IPAG Business School, 184 Boulevard Saint-Germain, 75006 Paris, France); Rangan Gupta (Department of Economics, University of Pretoria, Pretoria, 0002, South Africa); Christian Pierdzioch (Department of Economics, Helmut Schmidt University, Holstenhofweg 85, P.O.B. 700822, 22008 Hamburg, Germany)
    Abstract: We examine, using aggregate and sectoral U.S. data for the period 2008-2020, the predictive power of disentangled oil-price shocks for Real Estate Investment Trusts (REITs) realized market variance via the heterogeneous auto-regressive realized variance (HAR-RV) model. In-sample tests show that demand and financial-market risk shocks contribute to a larger extent to the overall fit of the model than supply shocks, where the in-sample transmission of the impact of the shocks mainly operates through their significant effects on realized upward (“good”) variance. Out-of-sample tests corroborate the significant predictive value of demand and risk shocks for realized variance and its upward counterpart at a short, medium, and long forecast horizon, for various recursive-estimation windows, for realized volatility (that is, the square root of realized variance), for a shorter sub-sample period that excludes the recent phase of exceptionally intense oil-market turbulence, and for an extended benchmark model that features realized higher-order moments, realized jumps, and a leverage effect as control variables.
    Keywords: Oil price shocks; REITs; Realized variance; Forecasting
    JEL: C22 C53 G10 Q02
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:pre:wpaper:2020100&r=all
  18. By: Gajurel, Dinesh (University of New Brunswick); Chowdhury, Biplob (Tasmanian School of Business & Economics, University of Tasmania)
    Abstract: The inclusion of a jump component in the price process has been a long debate in the finance literature. In this paper, we identify and characterize jump risks in the Canadian stock market using high-frequency data from the Toronto Stock Exchange. Our results provide strong evidence of jump clustering: about 90% of jumps occur within the first 30 minutes of market opening, and about 55% of jumps are due to overnight returns. While the average intraday jump is negative, jumps induced by overnight returns bring a cancellation effect that drives the average jump size to zero. We show that the economic significance of the jump component in volatility forecasting is minimal. Our results further demonstrate that market jumps and overnight returns bring significant changes in the systematic risk (beta) of stocks. While the average effect of market jumps on beta is not significantly different from zero, the effect of overnight returns on beta is significant. Overall, our results suggest that jump risk is non-systematic in nature.
    Keywords: financial markets, stock price process, jumps, volatility, systematic risk
    JEL: C58 G12
    Date: 2020
    URL: http://d.repec.org/n?u=RePEc:tas:wpaper:35107&r=all
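    The realized measures underlying such an analysis are typically constructed from intraday returns r_{t,i} as

      RV_t = \sum_i r_{t,i}^2,   BV_t = (\pi/2) \sum_{i \ge 2} |r_{t,i}| |r_{t,i-1}|,   J_t = \max(RV_t - BV_t, 0),
      \beta_t = \sum_i r_{t,i} r_{m,t,i} / \sum_i r_{m,t,i}^2,

    where bipower variation BV_t is robust to jumps, J_t isolates the jump component, and \beta_t is the realized beta against intraday market returns r_{m,t,i}. These are the standard definitions; the paper's exact jump test and its treatment of overnight returns may differ.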
  19. By: Tae-Hwy Lee (Department of Economics, University of California Riverside); Ekaterina Seregina (University of California Riverside)
    Abstract: Graphical models are a powerful tool for estimating a high-dimensional inverse covariance (precision) matrix and have been applied to the portfolio allocation problem. These models assume sparsity of the precision matrix. However, when stock returns are driven by common factors, this assumption does not hold. Our paper develops a framework for estimating a high-dimensional precision matrix which combines the benefits of exploiting the factor structure of stock returns and the sparsity of the precision matrix of the factor-adjusted returns. The proposed algorithm is called Factor Graphical Lasso (FGL). We study a high-dimensional portfolio allocation problem when the asset returns admit an approximate factor model. In high dimensions, when the number of assets is large relative to the sample size, the sample covariance matrix of the excess returns is subject to large estimation uncertainty, which leads to unstable solutions for the portfolio weights. To resolve this issue, we consider a decomposition into low-rank and sparse components. This strategy allows us to consistently estimate the optimal portfolio in high dimensions, even when the covariance matrix is ill-behaved. We establish consistency of the portfolio weights in a high-dimensional setting without assuming sparsity of the covariance or precision matrix of stock returns. Our theoretical results and simulations demonstrate that FGL is robust to heavy-tailed distributions, which makes our method suitable for financial applications. The empirical application uses daily and monthly data for the constituents of the S&P 500 to demonstrate the superior performance of FGL compared to the equal-weighted portfolio, the index, and several prominent precision- and covariance-based estimators.
    Keywords: High-dimensionality, Portfolio optimization, Graphical Lasso, Approximate Factor Model, Sharpe Ratio, Elliptical Distributions
    JEL: C13 C55 C58 G11 G17
    Date: 2020–09
    URL: http://d.repec.org/n?u=RePEc:ucr:wpaper:202025&r=all
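    The low-rank-plus-sparse logic described in the abstract can be sketched in a few lines: regress returns on factors, apply the graphical lasso to the factor-adjusted residuals, and recover the full precision matrix with the Woodbury identity. The code below is an illustration under these assumptions (observed factors, OLS loadings), not the authors' FGL implementation.

      import numpy as np
      from sklearn.covariance import GraphicalLasso

      def factor_graphical_lasso(R, F, alpha=0.01):
          # R: (T, N) excess returns, F: (T, K) observed factors, K >= 2.
          T = R.shape[0]
          F1 = np.column_stack([np.ones(T), F])
          coef = np.linalg.lstsq(F1, R, rcond=None)[0]        # OLS factor regressions
          B, U = coef[1:], R - F1 @ coef                      # loadings (K, N), residuals (T, N)
          Theta_u = GraphicalLasso(alpha=alpha).fit(U).precision_   # sparse residual precision
          Sigma_f = np.cov(F, rowvar=False)
          # Woodbury identity for the precision of B' Sigma_f B + Sigma_u
          M = np.linalg.inv(np.linalg.inv(Sigma_f) + B @ Theta_u @ B.T)
          return Theta_u - Theta_u @ B.T @ M @ B @ Theta_u

      def gmv_weights(Theta):
          # global minimum-variance weights implied by a precision matrix
          w = Theta @ np.ones(Theta.shape[0])
          return w / w.sum()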
  20. By: Agostino Capponi; Zhaoyu Zhang
    Abstract: We propose a novel approach to infer investors' risk preferences from their portfolio choices, and then use the implied risk preferences to measure the efficiency of investment portfolios. We analyze a dataset spanning a period of six years, consisting of end-of-month stock trading records along with investors' demographic information and self-assessed financial knowledge. Unlike estimates of risk aversion based on the share of risky assets, our statistical analysis suggests that the implied risk aversion coefficient of an investor increases with her wealth and financial literacy. Portfolio diversification, the Sharpe ratio, and expected portfolio returns correlate positively with the efficiency of the portfolio, whereas a higher standard deviation reduces it. We find that affluent and financially educated investors, as well as those holding retirement-related accounts, hold more efficient portfolios.
    Date: 2020–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2010.13928&r=all
  21. By: T. -N. Nguyen; M. -N. Tran; R. Kohn
    Abstract: We propose a new class of financial volatility models, which we call REcurrent Conditional Heteroskedastic (RECH) models, to improve both the in-sample analysis and the out-of-sample forecast performance of traditional conditional heteroskedastic models. In particular, we incorporate auxiliary deterministic processes, governed by recurrent neural networks, into the conditional variance of traditional conditional heteroskedastic models, e.g. GARCH-type models, to flexibly capture the dynamics of the underlying volatility. The RECH models can detect interesting effects in financial volatility overlooked by existing conditional heteroskedastic models such as GARCH (Bollerslev, 1986), GJR (Glosten et al., 1993) and EGARCH (Nelson, 1991). The new models often deliver good out-of-sample forecasts while still explaining the stylized facts of financial volatility well, by retaining the well-established structures of the econometric GARCH-type models. These properties are illustrated through simulation studies and applications to four real stock index datasets. A user-friendly software package together with the examples reported in the paper is available at https://github.com/vbayeslab.
    Date: 2020–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2010.13061&r=all
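    A schematic of the idea, with a GARCH(1,1) recursion whose intercept is driven by a recurrent state (the paper's exact RECH specifications may differ in the recurrent inputs and the link function):

      \sigma_t^2 = \omega_t + \alpha \epsilon_{t-1}^2 + \beta \sigma_{t-1}^2,   \omega_t = \gamma_0 + \gamma_1 h_t,   h_t = RNN(h_{t-1}, \epsilon_{t-1}, \sigma_{t-1}^2),

    so that when the recurrent component is switched off (\gamma_1 = 0) the model collapses back to standard GARCH, which is how the "well-established structures" mentioned above are retained.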
  22. By: Bedayo, Mikel; Jiménez, Gabriel; Peydró, José-Luis; Vegas, Raquel
    Abstract: We show that loan origination time is key for bank lending standards, cycles, defaults and failures. We exploit the credit register from Spain, which records the time of a loan application and of its granting. When the VIX is lower (in booms), banks shorten loan origination time, especially for riskier firms. Bank incentives (capital and competition), capacity constraints, and borrower-lender information asymmetries are key mechanisms driving the results. Moreover, shorter (loan-level) origination time is associated with higher ex-post defaults, also using variation from holidays. Finally, shorter pre-crisis origination time, more than other lending conditions, is associated with more bank-level failures in crises, consistent with weaker screening.
    Keywords: loan origination time, lending standards, credit cycles, defaults, bank failures, screening
    JEL: G01 G21 G28 E44 E51
    Date: 2020
    URL: http://d.repec.org/n?u=RePEc:zbw:esprep:225986&r=all
  23. By: Isao Yagi; Yuji Masuda; Takanobu Mizuta
    Abstract: Many empirical studies have discussed market liquidity, which is regarded as a measure of a booming financial market. Various indicators for objectively evaluating market liquidity have also been proposed and their merits discussed. In recent years, the impact of high-frequency traders (HFTs) on financial markets has been a focal concern, but no studies have systematically discussed their relationship with major market liquidity indicators, including volume, tightness, resiliency, and depth. In this study, we used agent-based simulations to compare the major liquidity indicators in an artificial market in which an HFT participated with those in a market without an HFT. The results showed that all liquidity indicators improved in the market with an HFT relative to the market without one. Furthermore, by investigating the correlations between the major liquidity indicators in our simulations and in the extant empirical literature, we found that market liquidity can be measured not only by the major liquidity indicators but also by the execution rate. It is therefore suggested that the execution rate could serve as a novel liquidity indicator in future studies.
    Date: 2020–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2010.13038&r=all

This nep-rmg issue is ©2020 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.