nep-rmg New Economics Papers
on Risk Management
Issue of 2024‒04‒01
twenty papers chosen by



  1. Worst-Case Higher Moment Risk Measure: Addressing Distributional Shifts and Procyclicality By Castro-Iragorri, Carlos; Gómez, Fabio; Quiceno, Nancy
  2. Neural Networks for Portfolio-Level Risk Management: Portfolio Compression, Static Hedging, Counterparty Credit Risk Exposures and Impact on Capital Requirement By Vikranth Lokeshwar Dhandapani; Shashi Jain
  3. Value-at-Risk under Measurement Error By Mohamed Doukali; Xiaojun Song; Abderrahim Taamouti
  4. Properties of the entropic risk measure EVaR in relation to selected distributions By Yuliya Mishura; Kostiantyn Ralchenko; Petro Zelenko; Volodymyr Zubchenko
  5. Modelling Risk-Weighted Assets: Looking Beyond Stress Tests By Josef Sveda; Jiri Panos; Vojtech Siuda
  6. Matrix-based Prediction Approach for Intraday Instantaneous Volatility Vector By Sung Hoon Choi; Donggyu Kim
  7. From GARCH to Neural Network for Volatility Forecast By Pengfei Zhao; Haoren Zhu; Wilfred Siu Hung NG; Dik Lun Lee
  8. Optimizing Neural Networks for Bermudan Option Pricing: Convergence Acceleration, Future Exposure Evaluation and Interpolation in Counterparty Credit Risk By Vikranth Lokeshwar Dhandapani; Shashi Jain
  9. Optimal positioning in derivative securities in incomplete markets By Tim Leung; Matthew Lorig; Yoshihiro Shirai
  10. Uncovering the Sino-US dynamic risk spillovers effects: Evidence from agricultural futures markets By Han-Yu Zhu; Peng-Fei Dai; Wei-Xing Zhou
  11. The Importance of Counterparty Credit Risk Management: A speech at By Michael S. Barr
  12. A house prices at risk approach for the German residential real estate market By Hafemann, Lucas
  13. Digitization of Non-Recovery Risk Management in Non-Financial Companies: A Qualitative Study of Risk Managers' Perceptions in Morocco By Chaimaa Achir; Aziz Douari
  14. Nonparametric Estimation of Large Spot Volatility Matrices for High-Frequency Financial Data By Ruijun Bu; Degui Li; Oliver Linton; Hanchao Wang
  15. Higher order measures of risk and stochastic dominance By Alois Pichler
  16. Justifying the Volatility of S&P 500 Daily Returns By Hayden Brown
  17. Optimizing Portfolio Management and Risk Assessment in Digital Assets Using Deep Learning for Predictive Analysis By Qishuo Cheng; Le Yang; Jiajian Zheng; Miao Tian; Duan Xin
  18. An averaging framework for minimum-variance portfolios: Optimal rules for combining portfolio weights By Roland Füss; Thorsten Glück; Christian Koeppel; Felix Miebs
  19. Do Positive Externalities Affect Risk Taking? Experimental Evidence on Gender and Group Membership By Carina Cavalcanti; Andreas Leibbrandt
  20. Decomposing liquidity risk in banking models By Dr. Lukas Voellmy

  1. By: Castro-Iragorri, Carlos (Facultad de Economía Universidad del Rosario); Gómez, Fabio (Facultad de Economía Universidad del Rosario); Quiceno, Nancy (Camara de Riesgo Central de Contraparte, CRCC)
    Abstract: This paper addresses the inherent procyclicality in widely adopted financial risk measures, such as Expected Shortfall (ES). We propose an innovative approach utilizing the Higher Moment (HM) risk measure, which offers a robust solution to distributional shifts by incorporating adaptive features. Empirical results using historical S&P500 returns indicate that worst-case HM risk measures significantly reduce the underestimation of risk and provide more stable risk assessments throughout the financial cycle compared to traditional ES predictions. These results suggest that HM risk measures represent a viable alternative to regulatory add-ons for stress testing and procyclicality mitigation in financial risk management.
    Keywords: procyclicality; higher moment risk; stress testing; expected shortfall
    JEL: C58 G17 G32
    Date: 2024–02–29
    URL: http://d.repec.org/n?u=RePEc:col:000092:021048&r=rmg
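The paper benchmarks its worst-case HM measures against conventional Expected Shortfall forecasts. For readers who want a concrete baseline, a minimal historical ES estimator looks like the following (this is the generic estimator, not the paper's HM construction; the data and confidence level are illustrative):

```python
import numpy as np

def historical_es(returns, alpha=0.975):
    """Historical Expected Shortfall at confidence level alpha:
    the average loss at or beyond the alpha-quantile of losses."""
    losses = -np.asarray(returns)          # convert returns to losses
    var = np.quantile(losses, alpha)       # Value-at-Risk threshold
    tail = losses[losses >= var]           # losses in the tail
    return tail.mean()

rng = np.random.default_rng(0)
r = rng.normal(0.0005, 0.01, size=10_000)  # synthetic daily returns
es = historical_es(r, alpha=0.975)
```

Because the tail average sits beyond the quantile, ES always exceeds the corresponding VaR, which is why it is the regulatory measure the paper takes as its point of comparison.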
  2. By: Vikranth Lokeshwar Dhandapani; Shashi Jain
    Abstract: In this paper, we present an artificial neural network framework for compressing a large portfolio of European options with varying maturities (the target portfolio) into a significantly smaller portfolio of European options with shorter or equal maturity (the compressed portfolio), which also represents a self-replicating static hedge of the target portfolio. The proposed machine learning architecture is fully interpretable by design. We define the algorithm for learning its parameters, providing a parameter initialisation technique and leveraging the optimisation methodology proposed in Lokeshwar and Jain (2024), originally introduced to price Bermudan options. We demonstrate the convergence of errors and the iterative evolution of the neural network parameters over the course of the optimisation, using selected target portfolio samples for illustration. Numerical examples show that the exposure distributions and exposure profiles (Expected Exposure and Potential Future Exposure) of the target and compressed portfolios align closely across future risk horizons under both risk-neutral and real-world scenarios. Additionally, we benchmark the target portfolio's Greeks (Delta, Gamma, and Vega) against those of the compressed portfolio at future time horizons across different market scenarios generated by Monte Carlo simulation. Finally, we compare the regulatory capital requirement under the standardised approach for counterparty credit risk for the target portfolio against the compressed portfolio and show that the capital requirement for the compact portfolio is substantially reduced.
    Date: 2024–02
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2402.17941&r=rmg
  3. By: Mohamed Doukali; Xiaojun Song; Abderrahim Taamouti
    Abstract: We propose an optimization-based estimation of Value-at-Risk that corrects for the effect of measurement errors in prices. We show that measurement errors might pose serious problems for estimating risk measures like Value-at-Risk. In particular, when the stock prices are contaminated, the existing estimators of Value-at-Risk are inconsistent and might lead to an underestimation of risk, which might result in extreme leverage ratios within the held portfolios. Using Fourier transform and a deconvolution kernel estimator of the probability distribution function of true latent prices, we derive a robust estimator of Value-at-Risk in the presence of measurement errors. Monte Carlo simulations and a real data analysis illustrate satisfactory performance of the proposed method.
    Keywords: Deconvolution kernel, Fourier transform, measurement errors, market microstructure noise, optimization, Value-at-Risk
    JEL: G11 G19 C14 C61 C63
    Date: 2022–03
    URL: http://d.repec.org/n?u=RePEc:liv:livedp:202209&r=rmg
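The inconsistency the paper addresses is easy to reproduce: a naive historical VaR computed from noise-contaminated prices targets a different quantity than the VaR of the latent returns. A minimal sketch on synthetic data (the Gaussian noise model and all parameters are illustrative; the size and direction of the bias depend on the noise structure, and the paper's deconvolution kernel correction is not shown here):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000
latent = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, n)))  # true prices
observed = latent * np.exp(rng.normal(0, 0.005, n))       # microstructure noise

def hist_var(prices, alpha=0.99):
    """Naive historical VaR: the alpha-quantile of log-return losses."""
    r = np.diff(np.log(prices))
    return np.quantile(-r, alpha)

var_true = hist_var(latent)
var_noisy = hist_var(observed)
# The two estimates differ because noise in prices contaminates returns --
# the inconsistency that the paper's deconvolution estimator corrects.
```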
  4. By: Yuliya Mishura; Kostiantyn Ralchenko; Petro Zelenko; Volodymyr Zubchenko
    Abstract: The Entropic Value-at-Risk (EVaR) measure is a convenient coherent risk measure. Owing to certain difficulties in finding its analytical representation, it had previously been calculated explicitly only for the normal distribution. We overcome these difficulties and calculate the EVaR measure for the Poisson, compound Poisson, Gamma, Laplace, exponential, chi-squared, inverse Gaussian, and normal inverse Gaussian distributions with the help of the Lambert W function, a special function with, generally speaking, two branches.
    Date: 2024–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2403.01468&r=rmg
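The optimization underlying EVaR can be checked numerically. Under the loss convention, EVaR_alpha(X) = inf over z > 0 of (1/z) * ln(M_X(z)/alpha), where M_X is the moment generating function. A sketch for the normal case, where the well-known closed form mu + sigma * sqrt(2 ln(1/alpha)) is available for comparison (the paper's Lambert-W representations for the other distributions are not reproduced here):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def evar_normal_numeric(mu, sigma, alpha):
    """EVaR(X) = inf_{z>0} (1/z) * log(M_X(z)/alpha), with
    M_X(z) = exp(mu*z + sigma^2 * z^2 / 2) for X ~ N(mu, sigma^2)."""
    obj = lambda z: mu + 0.5 * sigma**2 * z + np.log(1.0 / alpha) / z
    res = minimize_scalar(obj, bounds=(1e-8, 1e4), method="bounded")
    return res.fun

def evar_normal_closed(mu, sigma, alpha):
    """Known closed form in the normal case."""
    return mu + sigma * np.sqrt(2.0 * np.log(1.0 / alpha))

mu, sigma, alpha = 0.0, 1.0, 0.05
num = evar_normal_numeric(mu, sigma, alpha)
closed = evar_normal_closed(mu, sigma, alpha)
```

The one-dimensional minimization is smooth and convex in z, which is what makes explicit solutions via the Lambert W function tractable for the distributions listed in the abstract.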
  5. By: Josef Sveda (Czech National Bank and Charles University, Prague); Jiri Panos (European Central Bank and University of Economics, Prague); Vojtech Siuda (Czech National Bank and University of Economics, Prague)
    Abstract: We propose an improved methodology for modelling potential scenario paths of banks' risk-weighted assets, which drive the denominator of capital adequacy ratios. Our approach centres on modelling the internal risk structure of bank portfolios and thus aims to provide more accurate estimations than the common portfolio level approaches used in top-down stress testing frameworks. This should reduce the likelihood of significant misestimation of risk-weighted assets, which can lead to unjustifiably high or low solvency measures and induce false perceptions about banks' financial health. The proposed methodology is easy to replicate and suitable for various applications, including stress testing and calibration of macroprudential tools. After the methodology is introduced, we show how our proposed approach compares favourably to the methods typically used. Subsequently, we use our approach to estimate the potential increase in risk weights due to a cyclical deterioration in credit parameters and the corresponding setup of the countercyclical capital buffer for the Czech banking sector. Finally, an illustrative, hands-on example is provided in the Appendix.
    Keywords: Risk weighted exposure; stress-testing; credit portfolio structure; countercyclical capital buffer
    JEL: E58 G21 G28 G29
    Date: 2024–03–15
    URL: http://d.repec.org/n?u=RePEc:gii:giihei:heidwp04-2024&r=rmg
  6. By: Sung Hoon Choi; Donggyu Kim
    Abstract: In this paper, we introduce a novel method for predicting intraday instantaneous volatility based on Ito semimartingale models using high-frequency financial data. Several studies have highlighted stylized volatility time series features, such as interday auto-regressive dynamics and the intraday U-shaped pattern. To accommodate these volatility features, we propose an interday-by-intraday instantaneous volatility matrix process that can be decomposed into low-rank conditional expected instantaneous volatility and noise matrices. To predict the low-rank conditional expected instantaneous volatility matrix, we propose the Two-sIde Projected-PCA (TIP-PCA) procedure. We establish asymptotic properties of the proposed estimators and conduct a simulation study to assess the finite sample performance of the proposed prediction method. Finally, we apply the TIP-PCA method to an out-of-sample instantaneous volatility vector prediction study using high-frequency data from the S&P 500 index and 11 sector index funds.
    Date: 2024–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2403.02591&r=rmg
  7. By: Pengfei Zhao; Haoren Zhu; Wilfred Siu Hung NG; Dik Lun Lee
    Abstract: Volatility, as a measure of uncertainty, plays a crucial role in numerous financial activities such as risk management. The Econometrics and Machine Learning communities have developed two distinct approaches for financial volatility forecasting: the stochastic approach and the neural network (NN) approach. Despite their individual strengths, these methodologies have conventionally evolved in separate research trajectories with little interaction between them. This study endeavors to bridge this gap by establishing an equivalence relationship between models of the GARCH family and their corresponding NN counterparts. With the equivalence relationship established, we introduce an innovative approach, named GARCH-NN, for constructing NN-based volatility models. It obtains the NN counterparts of GARCH models and integrates them as components into an established NN architecture, thereby seamlessly infusing volatility stylized facts (SFs) inherent in the GARCH models into the neural network. We develop the GARCH-LSTM model to showcase the power of the GARCH-NN approach. Experimental results validate that amalgamating the NN counterparts of the GARCH family models into established NN models leads to enhanced outcomes compared to employing the stochastic and NN models in isolation.
    Date: 2024–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2402.06642&r=rmg
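The stylized facts that the GARCH-NN approach injects into the network come from recursions such as GARCH(1, 1). A minimal sketch of that conditional-variance recursion (parameters are illustrative, not fitted; the paper's network embedding is not shown):

```python
import numpy as np

def garch11_variance(eps, omega=1e-6, alpha=0.08, beta=0.90):
    """Conditional variance recursion
    sigma2[t] = omega + alpha * eps[t-1]**2 + beta * sigma2[t-1],
    the recurrence the GARCH-NN approach embeds as a network component."""
    sigma2 = np.empty(len(eps))
    sigma2[0] = omega / (1.0 - alpha - beta)  # start at unconditional variance
    for t in range(1, len(eps)):
        sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

rng = np.random.default_rng(2)
eps = rng.normal(0, 0.01, 1000)   # synthetic return shocks
s2 = garch11_variance(eps)
```

Initializing at the unconditional variance omega / (1 - alpha - beta) is a common convention; alpha + beta < 1 ensures the recursion is stationary.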
  8. By: Vikranth Lokeshwar Dhandapani; Shashi Jain
    Abstract: This paper presents a Monte-Carlo-based artificial neural network framework for pricing Bermudan options, offering several notable advantages. These advantages encompass the efficient static hedging of the target Bermudan option and the effective generation of exposure profiles for risk management. We also introduce a novel optimisation algorithm designed to expedite the convergence of the neural network framework proposed by Lokeshwar et al. (2022), supported by a comprehensive error convergence analysis. We conduct an extensive comparative analysis of the Present Value (PV) distribution under Markovian and no-arbitrage assumptions. We compare the proposed neural network model in conjunction with the approach initially introduced by Longstaff and Schwartz (2001) and benchmark it against the COS model, the pricing model pioneered by Fang and Oosterlee (2009), across all Bermudan exercise time points. Additionally, we evaluate exposure profiles, including Expected Exposure and Potential Future Exposure, generated by our proposed model and the Longstaff-Schwartz model, comparing them against the COS model. We also derive exposure profiles at finer non-standard grid points or risk horizons using the proposed approach, juxtaposed with the Longstaff-Schwartz method with linear interpolation and benchmarked against the COS method. In addition, we explore the effectiveness of various interpolation schemes within the context of the Longstaff-Schwartz method for generating exposures at finer grid horizons.
    Date: 2024–02
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2402.15936&r=rmg
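The Longstaff-Schwartz baseline used in the comparison can be sketched compactly. The following is a generic least-squares Monte Carlo pricer for a Bermudan put with a quadratic regression basis and illustrative parameters (not the paper's neural architecture, and with none of its exposure machinery):

```python
import numpy as np

def lsm_bermudan_put(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0,
                     n_steps=50, n_paths=20_000, seed=3):
    """Least-squares Monte Carlo (Longstaff-Schwartz): at each exercise
    date, regress discounted continuation values on a polynomial in the
    stock price (in-the-money paths only), exercising when intrinsic
    value exceeds the fitted continuation value."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    z = rng.standard_normal((n_paths, n_steps))
    S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt
                              + sigma * np.sqrt(dt) * z, axis=1))
    payoff = np.maximum(K - S[:, -1], 0.0)        # exercise value at maturity
    for t in range(n_steps - 2, -1, -1):
        payoff *= np.exp(-r * dt)                 # discount one step back
        itm = K - S[:, t] > 0.0                   # regress on ITM paths only
        if itm.sum() > 10:
            coef = np.polyfit(S[itm, t], payoff[itm], 2)   # quadratic basis
            continuation = np.polyval(coef, S[itm, t])
            exercise = K - S[itm, t]
            ex_now = exercise > continuation
            payoff[np.where(itm)[0][ex_now]] = exercise[ex_now]
    return np.exp(-r * dt) * payoff.mean()

price = lsm_bermudan_put()
```

With these parameters the Bermudan put price should land a little above the Black-Scholes European value (about 5.57), reflecting the early-exercise premium.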
  9. By: Tim Leung; Matthew Lorig; Yoshihiro Shirai
    Abstract: This paper analyzes a problem of optimal static hedging using derivatives in incomplete markets. The investor is assumed to have a risk exposure to two underlying assets. The hedging instruments are vanilla options written on a single underlying asset. The hedging problem is formulated as a utility maximization problem whereby the form of the optimal static hedge is determined. Among our results, a semi-analytical solution for the optimizer is found through variational methods for exponential, power/logarithmic, and quadratic utility. When vanilla options are available for each underlying asset, the optimal solution is related to the fixed points of a Lipschitz map. In the case of exponential utility, there is only one such fixed point, and subsequent iterations of the map converge to it.
    Date: 2024–02
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2403.00139&r=rmg
  10. By: Han-Yu Zhu; Peng-Fei Dai; Wei-Xing Zhou
    Abstract: Agricultural products play a critical role in human development. As economic globalization and the financialization of agricultural products continue to advance, the interconnections between different agricultural futures have become closer. We utilize a TVP-VAR-DY model combined with the quantile method to measure the risk spillover between 11 agricultural futures on the futures exchanges of the US and China from July 9, 2014, to December 31, 2022. This study yielded several significant findings. Firstly, CBOT corn, soybean, and wheat were identified as the primary risk transmitters, with DCE corn and soybean as the main risk receivers. Secondly, sudden events or increased economic uncertainty can increase the overall risk spillovers. Thirdly, there is an aggregation of risk spillovers amongst agricultural futures based on the dynamic directional spillover results. Lastly, the central agricultural futures under the conditional mean are CBOT corn and soybean, while CZCE hard wheat and long-grained rice are the two risk spillover centers in extreme cases, as per the results of the spillover network and minimum spanning tree. Based on these results, decision-makers are advised to safeguard against the price risk of agricultural futures under sudden economic events, and investors can utilize the results to construct a superior investment portfolio by taking different agricultural product futures as risk-leading indicators according to various situations.
    Date: 2024–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2403.01745&r=rmg
  11. By: Michael S. Barr
    Date: 2024–02–27
    URL: http://d.repec.org/n?u=RePEc:fip:fedgsq:97858&r=rmg
  12. By: Hafemann, Lucas
    Abstract: This paper focuses on the downside risks to the German residential real estate market. It applies the "at-risk" methodology to the German housing market. Quantile regressions reveal that different quantiles of the house price forecast distribution are heterogeneously affected by the same exogenous variables. While past house prices have a very pronounced impact on the median, variations in interest rates predominantly affect the lower quantiles of the distribution. Other factors, such as employment, affect different quantiles more equally. The at-risk model shows that, in the recent era of high inflation and rising interest rates, the forecast distribution of house prices has shifted to the left, resulting in lower expected growth rates of real house prices. Additionally, we find that sparsely populated districts have more pronounced downside risks than densely populated ones.
    Keywords: residential real estate, housing, growth-at-risk, quantile regression, Germany
    JEL: C32 E37 G01 R31
    Date: 2023
    URL: http://d.repec.org/n?u=RePEc:zbw:bubtps:283351&r=rmg
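The "at-risk" machinery rests on quantile regression, which replaces squared error with the pinball (check) loss so that different quantiles of the forecast distribution can respond differently to the same regressors. A minimal sketch on synthetic data (one regressor, Gaussian errors, a generic optimizer; the paper's German housing covariates are not used here):

```python
import numpy as np
from scipy.optimize import minimize

def pinball(u, tau):
    """Check (pinball) loss underlying quantile regression."""
    return np.mean(np.maximum(tau * u, (tau - 1.0) * u))

def fit_quantile(X, y, tau):
    """Linear quantile regression: minimize the pinball loss over
    (intercept, slope) with a derivative-free optimizer."""
    Xc = np.column_stack([np.ones(len(y)), X])
    res = minimize(lambda b: pinball(y - Xc @ b, tau),
                   np.zeros(Xc.shape[1]), method="Nelder-Mead",
                   options={"xatol": 1e-6, "fatol": 1e-8, "maxiter": 20_000})
    return res.x

rng = np.random.default_rng(4)
x = rng.normal(size=2_000)
y = 1.0 + 0.5 * x + rng.normal(scale=1.0, size=2_000)
b10 = fit_quantile(x, y, 0.10)   # lower-tail ("at-risk") quantile
b50 = fit_quantile(x, y, 0.50)   # median
```

The fitted 10% line sits below the median line, which is exactly the left-tail shift the house-prices-at-risk approach tracks over time.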
  13. By: Chaimaa Achir (Université Hassan 1er [Settat]); Aziz Douari (Université Hassan 1er [Settat])
    Abstract: Within an ever-changing economic landscape, organizations are exposed to a myriad of risks, compelling them to increasingly implement rigorous and sector-appropriate risk management. In this context, the risk of non-recovery of receivables within non-financial companies deserves particular attention due to its significant impact. This article, among the few studies addressing this specific topic, aims to shed light on the perceptions and attitudes of risk managers regarding the digitization of non-recovery risk management in non-financial enterprises. It is commonly observed that the risk of non-recovery is more frequently discussed in the context of financial companies, while non-financial enterprises also experience its challenges. The fundamental idea behind this study is to understand how risk managers perceive this risk and whether they believe that digitization impacts its management. The results reflect trends in the adoption of digitization, the perspectives of risk managers, and the observed impacts on non-recovery risk management. The discussion delves deep into these findings, highlighting challenges, successes, and opportunities. By exploring trends, perspectives, and impacts, this research contributes to a broader understanding of the role of digital tools in anticipating, assessing, and mitigating non-recovery risks. The challenges highlighted in the discussion emphasize the need for tailored strategies and further exploration of opportunities for improvement.
    Keywords: Risk management, Non-recovery risk, Digital tools, Perceptions, Risk managers
    Date: 2024–02–12
    URL: http://d.repec.org/n?u=RePEc:hal:journl:hal-04463403&r=rmg
  14. By: Ruijun Bu; Degui Li; Oliver Linton; Hanchao Wang
    Abstract: In this paper, we consider estimating spot/instantaneous volatility matrices of high-frequency data collected for a large number of assets. We first combine classic nonparametric kernel-based smoothing with a generalised shrinkage technique in the matrix estimation for noise-free data under a uniform sparsity assumption, a natural extension of the approximate sparsity commonly used in the literature. The uniform consistency property is derived for the proposed spot volatility matrix estimator with convergence rates comparable to the optimal minimax one. For high-frequency data contaminated by microstructure noise, we introduce a localised pre-averaging estimation method in the high-dimensional setting which first pre-whitens data via a kernel filter and then uses the estimation tool developed in the noise-free scenario, and further derive the uniform convergence rates for the developed spot volatility matrix estimator. In addition, we also combine the kernel smoothing with the shrinkage technique to estimate the time-varying volatility matrix of the high-dimensional noise vector, and establish the relevant uniform consistency result. Numerical studies are provided to examine the performance of the proposed estimation methods in finite samples.
    Keywords: Brownian semi-martingale, Kernel smoothing, Microstructure noise, Sparsity, Spot volatility matrix, Uniform consistency.
    Date: 2022–03
    URL: http://d.repec.org/n?u=RePEc:liv:livedp:202212&r=rmg
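A stripped-down version of the kernel-plus-shrinkage idea: weight the returns with a kernel centred at the target time, form the weighted covariance, then soft-threshold the off-diagonal entries. This is a generic stand-in for the paper's generalised shrinkage, with synthetic data and illustrative tuning constants:

```python
import numpy as np

def spot_cov_shrunk(returns, times, t0, bandwidth, thresh):
    """Kernel-weighted spot covariance at time t0 with entrywise
    soft-threshold shrinkage of the off-diagonal elements (a generic
    stand-in for the paper's generalised shrinkage)."""
    w = np.exp(-0.5 * ((times - t0) / bandwidth) ** 2)  # Gaussian kernel
    w /= w.sum()
    cov = (returns * w[:, None]).T @ returns            # weighted outer products
    off = cov - np.diag(np.diag(cov))
    off = np.sign(off) * np.maximum(np.abs(off) - thresh, 0.0)  # soft threshold
    return np.diag(np.diag(cov)) + off

rng = np.random.default_rng(5)
n, p = 500, 5
r = rng.normal(0, 0.01, (n, p))          # synthetic uncorrelated returns
times = np.linspace(0.0, 1.0, n)
sigma = spot_cov_shrunk(r, times, t0=0.5, bandwidth=0.1, thresh=2e-5)
```

With truly uncorrelated assets, thresholding zeroes out most of the small spurious off-diagonal estimates while leaving the variances intact, which is the intuition behind the uniform sparsity assumption.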
  15. By: Alois Pichler
    Abstract: Higher order risk measures are stochastic optimization problems by design, and for this reason they enjoy valuable properties in optimization under uncertainty. They integrate well with stochastic optimization problems, as observed, for example, in the intriguing concept of the risk quadrangle. Stochastic dominance is a binary relation on random variables used to compare random outcomes. It is demonstrated that the concepts of higher order risk measures and stochastic dominance are equivalent: each can be employed to characterize the other. The paper explores these relations and connects stochastic orders, higher order risk measures, and the risk quadrangle. Expectiles are employed to exemplify the relations obtained.
    Date: 2024–02
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2402.15387&r=rmg
  16. By: Hayden Brown
    Abstract: Over the past 60 years, there has been a gradual increase in the volatility of daily returns for the S&P 500 Index. Hypothetically, suppose that market forces determine daily volatility such that a daily leveraged S&P 500 fund cannot outperform a standard S&P 500 fund in the long run. Then this hypothetical volatility happens to support the increase in volatility seen in the S&P 500 index. On this basis, it appears that the classic argument of the market portfolio being unbeatable in the long run is determining the volatility of S&P 500 daily returns. Moreover, it follows that the long-term volatility of the daily returns for the S&P 500 Index should continue to increase until passing a particular threshold. If, on the other hand, this hypothesis about market forces increasing volatility is invalid, then there is room for daily leveraged S&P 500 funds to outperform their unleveraged counterparts in the long run.
    Date: 2024–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2403.01088&r=rmg
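The hypothesis can be checked by simulation. Under a lognormal approximation, a daily L-times leveraged fund grows at roughly L*mu - L^2*sigma^2/2 per day versus mu - sigma^2/2 unleveraged, so a 2x fund ties the 1x fund near sigma^2 = 2*mu/3. A sketch with an illustrative drift, not calibrated to the S&P 500:

```python
import numpy as np

def longrun_growth(mu, sigma, L, n=250_000, seed=6):
    """Average log growth per day of a daily L-times leveraged fund
    (same seed across calls, so comparisons use common random numbers)."""
    rng = np.random.default_rng(seed)
    r = rng.normal(mu, sigma, n)
    return np.mean(np.log1p(L * r))

mu = 0.0003                           # hypothetical daily drift
sigma_star = np.sqrt(2 * mu / 3)      # approximate 2x-vs-1x tie point
low, high = 0.5 * sigma_star, 2.0 * sigma_star
g1_low, g2_low = longrun_growth(mu, low, 1), longrun_growth(mu, low, 2)
g1_high, g2_high = longrun_growth(mu, high, 1), longrun_growth(mu, high, 2)
```

Below the threshold the 2x fund outgrows the 1x fund; above it, volatility drag reverses the ranking, which is the mechanism the paper's hypothetical market force exploits.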
  17. By: Qishuo Cheng; Le Yang; Jiajian Zheng; Miao Tian; Duan Xin
    Abstract: Portfolio management has been extensively studied in the field of artificial intelligence in recent years, but existing deep learning-based quantitative trading methods leave room for improvement. First, the prediction mode is singular: often a model trains only one trading expert, and the trading decision is based solely on that model's predictions. Second, the data source is relatively simple, considering only the stock's own data and ignoring the impact of overall market risk on the stock. In this paper, the DQN algorithm is introduced into asset management portfolios in a novel and straightforward way, and its performance greatly exceeds the benchmark, demonstrating the effectiveness of the DRL algorithm in portfolio management. This also suggests that, given the complexity of financial problems, the choice of algorithm should be carefully matched to the problem. Finally, the strategy is implemented by selecting the assets and actions with the largest Q value. Since different assets are trained separately as environments, Q values may drift among assets (different assets have different Q value distribution areas), which can easily lead to incorrect asset selection. Adding constraints so that the Q values of different assets share a common distribution could improve results.
    Date: 2024–02
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2402.15994&r=rmg
  18. By: Roland Füss (Swiss Finance Institute; University of St. Gallen - School of Finance); Thorsten Glück (Wiesbaden Business School); Christian Koeppel (University of St. Gallen); Felix Miebs (University of Applied Sciences Cologne)
    Abstract: We propose an averaging framework for combining minimum-variance strategies to either minimize the expected out-of-sample variance or maximize the expected out-of-sample Sharpe ratio. Our framework overcomes the problem of selecting the “best” strategy ex-ante by optimally averaging over portfolio weights. This averaging procedure has an intuitive economic interpretation because it resembles a fund-of-funds approach, where each minimum-variance strategy represents a single fund. In a range of simulations, for a set of well-established strategies, we show that optimally averaging over portfolio weights improves the out-of-sample variance and Sharpe ratio. We confirm the findings of our simulation study on empirical data.
    Keywords: Averaging; diversification; estimation error; portfolio optimization; shrinkage.
    JEL: G11
    Date: 2024–01
    URL: http://d.repec.org/n?u=RePEc:chf:rpseri:rp2410&r=rmg
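The simplest instance of averaging over portfolio weights combines the global minimum-variance rule with naive 1/N. A sketch with a fixed mixing weight kappa (the paper instead chooses the combination optimally to minimize expected out-of-sample variance or maximize the Sharpe ratio):

```python
import numpy as np

def gmv_weights(cov):
    """Global minimum-variance weights: w = inv(Sigma) 1 / (1' inv(Sigma) 1)."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

def averaged_weights(cov, kappa):
    """Convex combination of the GMV rule and naive 1/N -- the simplest
    instance of averaging over portfolio weights."""
    n = cov.shape[0]
    return kappa * gmv_weights(cov) + (1.0 - kappa) * np.full(n, 1.0 / n)

rng = np.random.default_rng(7)
sample = rng.normal(0, 0.01, (120, 10))   # 120 periods, 10 synthetic assets
cov = np.cov(sample, rowvar=False)
w = averaged_weights(cov, kappa=0.5)
```

Each component rule plays the role of a single fund in the fund-of-funds interpretation; the combined weights remain fully invested because each component's weights sum to one.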
  19. By: Carina Cavalcanti (Department of Accounting, Finance and Economics, Griffith University, Southport and Nathan, QLD, Australia.); Andreas Leibbrandt (Department of Economics, Monash University, Clayton, VIC 3800, Australia.)
    Abstract: Many positive externalities are created by risk-taking. We investigate whether risk-taking is affected by the presence of positive externalities. In our experiments, we study choices between investments in technologies that differ according to their level of risk and the extent to which they generate positive externalities for others. We find that even large positive externalities have little to no impact on individual risk-taking. We also find that women are generally less willing to take risks than men in the absence and presence of positive externalities and that they generate fewer positive externalities if they increase with risk but more positive externalities if they decrease with risk. Finally, we observe that groups invest more in technologies with larger positive externalities and that this is mainly driven by male group members. These findings provide a comprehensive view on the malleability of risk-taking in the presence of positive externalities.
    Keywords: Risk aversion, positive externality, gender
    JEL: C91 C92 D81 H23 J17
    Date: 2024–03
    URL: http://d.repec.org/n?u=RePEc:mos:moswps:2024-05&r=rmg
  20. By: Dr. Lukas Voellmy
    Abstract: In various banking models, banks are viewed as arrangements that insure households against uncertain liquidity needs. However, the exact nature of the liquidity risk faced by households – and hence the insurance function of banks – differs across models. This paper attempts to disentangle the different meanings of the term ‘liquidity insurance’ in the literature and to clarify what kind of insurance banks provide in which models. The paper also shows under which conditions banking is equivalent to eliminating uncertainty about liquidity needs or letting households trade with each other in an asset market. Special attention is given to the comparison of banking models in the tradition of Diamond and Dybvig (1983) with those based on monetary (notably New Monetarist) frameworks.
    Keywords: Liquidity insurance, Banking theory
    JEL: G21 G52
    Date: 2024
    URL: http://d.repec.org/n?u=RePEc:snb:snbwpa:2024-03&r=rmg

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.