on Risk Management
Issue of 2022‒05‒09
twenty-six papers chosen by
By: | Benjamin Avanzi; Hayden Lau; Mogens Steffensen |
Abstract: | Optimal reinsurance is a perennial problem in insurance. The problem formulation considered in this paper is closely connected to the optimal portfolio problem in finance, with some important distinctions. In particular, the surplus of an insurance company is routinely approximated by a Brownian motion, as opposed to the geometric Brownian motion used to model assets in finance. Furthermore, exposure to risk is controlled "downwards" via reinsurance, rather than "upwards" via risky investments. This leads to interesting qualitative differences in the optimal solutions. In this paper, using the martingale method, we derive the optimal proportional, non-cheap reinsurance control that maximises the quadratic utility of the terminal value of the insurance surplus. We also consider a number of realistic constraints on the terminal value: a strict lower boundary, the probability (Value at Risk) constraint, and the expected shortfall (conditional Value at Risk) constraints under the $\mathbb{P}$ and $\mathbb{Q}$ measures, respectively. Comparison of the optimal strategies with the optimal solutions in finance is of particular interest. Results are illustrated. |
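[Editorial note: the unconstrained martingale-method step that such papers build on can be sketched in a few lines for quadratic utility; the notation below ($F$ for the bliss level, $\xi_T$ for the state-price density, $x_0$ for the initial surplus) is ours, not the paper's.]

```latex
\max_{X_T}\; \mathbb{E}\!\left[-\tfrac12\,(F - X_T)^2\right]
\quad\text{s.t.}\quad \mathbb{E}[\xi_T X_T] = x_0 .
```

Pointwise maximisation of the Lagrangian gives $F - X_T^* = \lambda\,\xi_T$, i.e. $X_T^* = F - \lambda\,\xi_T$, with $\lambda = \bigl(F\,\mathbb{E}[\xi_T] - x_0\bigr)/\mathbb{E}[\xi_T^2]$ pinned down by the budget constraint; the constrained variants studied in the paper modify this terminal profile.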
Date: | 2022–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2203.16108&r= |
By: | Konstantin Görgen; Jonas Meirer; Melanie Schienle |
Abstract: | We study the estimation and prediction of the risk measure Value at Risk for cryptocurrencies. Using Generalized Random Forests (GRF) (Athey et al., 2019) that can be adapted to specifically fit the framework of quantile prediction, we show their superior performance over other established methods such as quantile regression and CAViaR, particularly in unstable times. We investigate the small-sample prediction properties in comparison to standard techniques in a Monte Carlo simulation study. In a comprehensive empirical assessment, we study the performance not only for the major cryptocurrencies but also in the stock market. Generally, we find that GRF outperforms established methods especially in crisis situations. We further identify important predictors during such times and show their influence on forecasting over time. |
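[Editorial note: quantile forecasts such as the GRF-based VaR above are typically scored with the pinball (quantile) loss; the toy returns and the rolling empirical-quantile "forecaster" below are illustrative stand-ins, not the paper's method.]

```python
# Scoring one-day Value-at-Risk forecasts with the pinball loss, the
# standard criterion for comparing quantile predictors (GRF, quantile
# regression, CAViaR, ...). All numbers are synthetic.

def pinball_loss(y, q_hat, alpha):
    """Pinball loss of an alpha-quantile forecast q_hat for outcome y (always >= 0)."""
    return (alpha - (y < q_hat)) * (y - q_hat)

def empirical_quantile(xs, alpha):
    """Lower empirical alpha-quantile of a sample: a naive baseline VaR."""
    s = sorted(xs)
    k = max(0, min(len(s) - 1, int(alpha * len(s))))
    return s[k]

# Toy return series and a rolling 5%-VaR backtest.
returns = [0.01, -0.02, 0.005, -0.03, 0.015, -0.01, 0.02, -0.025, 0.0, 0.01]
alpha, window = 0.05, 5
losses = []
for t in range(window, len(returns)):
    var_t = empirical_quantile(returns[t - window:t], alpha)
    losses.append(pinball_loss(returns[t], var_t, alpha))
avg_loss = sum(losses) / len(losses)
```

A lower average pinball loss over the backtest window indicates the better quantile forecaster; this is the sense in which the abstract's GRF "outperforms" the alternatives.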
Date: | 2022–02 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2203.08224&r= |
By: | T. van der Zwaard; L. A. Grzelak; C. W. Oosterlee |
Abstract: | In March 2020, the world was thrown into a period of financial distress, which manifested through increased uncertainty in the financial markets. Many interest rates collapsed, and funding spreads surged amid the market turmoil. In light of these events, it is key to understand and model Wrong-Way Risk (WWR) in a Funding Valuation Adjustment (FVA) context. WWR might currently not be incorporated in FVA calculations in banks' Valuation Adjustment (xVA) engines. However, we demonstrate that WWR effects are non-negligible in FVA modeling from a risk-management perspective. We look at the impact of various modeling choices, such as including the default times of the relevant parties, and we consider different choices of funding spread. A case study is presented for interest rate derivatives. |
Date: | 2022–04 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2204.02680&r= |
By: | Christa Cuchiero; Guido Gazzani; Irene Klein |
Abstract: | We introduce two kinds of risk measures with respect to some reference probability measure, which both allow for a certain order structure and domination property. Analyzing their relation to each other leads to the question when a certain minimax inequality is actually an equality. We then provide conditions under which the corresponding robust risk measures, being defined as the supremum over all risk measures induced by a set of probability measures, can be represented classically in terms of one single probability measure. We focus in particular on the mixture probability measure obtained via mixing over a set of probability measures using some prior, which represents for instance the regulator's beliefs. The classical representation in terms of the mixture probability measure can then be interpreted as a Bayesian approach to robust risk measures. |
Date: | 2022–04 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2204.07115&r= |
By: | JunTao Duan; Ionel Popescu |
Abstract: | Minimum-variance portfolio optimization relies on an accurate covariance estimator to obtain optimal portfolios. However, it usually suffers from large errors in the sample covariance matrix when the sample size $n$ is not significantly larger than the number of assets $p$. We analyze the random matrix aspects of portfolio optimization, identify the order of errors in the sample optimal portfolio weights, and show that portfolio risk is underestimated when using samples. We also provide the LoCoV (low dimension covariance voting) algorithm to reduce the error inherited from random samples. In various experiments, LoCoV is shown to outperform the classical method by a large margin. |
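[Editorial note: the underestimation effect the abstract describes is easy to reproduce; the numpy sketch below (not the paper's LoCoV algorithm) compares the plug-in risk of the sample minimum-variance portfolio with its true risk when the true covariance is the identity.]

```python
# When n is not much larger than p, the plug-in risk w'Sw of the sample
# minimum-variance portfolio systematically understates its true risk w'w
# (true covariance = I). Dimensions and trial counts are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
p, n, trials = 20, 40, 200

def minvar_weights(cov):
    """Unconstrained minimum-variance weights w = cov^{-1} 1 / (1' cov^{-1} 1)."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

plugin, true = [], []
for _ in range(trials):
    X = rng.standard_normal((n, p))      # i.i.d. data, true covariance = I
    S = np.cov(X, rowvar=False)
    w = minvar_weights(S)
    plugin.append(w @ S @ w)             # estimated (plug-in) portfolio risk
    true.append(w @ w)                   # true risk under Sigma = I
plugin_mean = float(np.mean(plugin))
true_mean = float(np.mean(true))
```

With p/n = 0.5 the gap is large: the plug-in risk falls well below 1/p, while the true risk of the estimated weights necessarily exceeds the risk 1/p of the oracle equal-weight solution.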
Date: | 2022–04 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2204.00204&r= |
By: | Veraart, Luitgard A. M. |
Abstract: | We develop a new model for solvency contagion that can be used to quantify systemic risk in stress tests of financial networks. In contrast to many existing models it allows for the spread of contagion already before the point of default and hence can account for contagion due to distress and mark-to-market losses. We derive general ordering results for outcome measures of stress tests that enable us to compare different contagion mechanisms. We use these results to study the sensitivity of the new contagion mechanism with respect to its model parameters and to compare it to existing models in the literature. When applying the new model to data from the European Banking Authority we find that the risk from distress contagion is strongly dependent on the anticipated recovery rate. For low recovery rates the high additional losses caused by bankruptcy dominate the overall stress test results. For high recovery rates, however, we observe a strong sensitivity of the stress test outcomes with respect to the model parameters determining the magnitude of distress contagion. |
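[Editorial note: the classical benchmark that default-contagion models of this kind extend is the Eisenberg-Noe clearing vector; the toy fixed-point sketch below shows only that standard starting point, not the paper's pre-default distress mechanism.]

```python
# Eisenberg-Noe clearing payments by fixed-point iteration on a toy network.
# Each bank pays min(total nominal obligations, external assets + payments
# received), pro rata across creditors.

def clearing_vector(liabilities, external, iters=100):
    """liabilities[i][j]: nominal owed by bank i to bank j; external[i]: outside assets."""
    n = len(external)
    bar_p = [sum(row) for row in liabilities]            # total nominal obligations
    pi = [[(liabilities[i][j] / bar_p[i]) if bar_p[i] > 0 else 0.0
           for j in range(n)] for i in range(n)]         # relative liabilities
    p = bar_p[:]                                         # start from full payment
    for _ in range(iters):
        incoming = [sum(pi[j][i] * p[j] for j in range(n)) for i in range(n)]
        p = [min(bar_p[i], external[i] + incoming[i]) for i in range(n)]
    return p

# Two banks: bank 0 owes 1.0 to bank 1 but holds only 0.6 in outside assets,
# so it defaults and can pay at most 0.6.
p = clearing_vector([[0.0, 1.0], [0.0, 0.0]], [0.6, 0.2])
```

In this benchmark, losses propagate only at default; the paper's contribution is precisely to let mark-to-market distress transmit losses before that point.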
Keywords: | systemic risk; contagion; financial networks; stress testing; mark-to-market losses; George Fellowship |
JEL: | C62 D85 G21 G28 G33 |
Date: | 2020–07–01 |
URL: | http://d.repec.org/n?u=RePEc:ehl:lserod:101905&r= |
By: | Ferdoos Alharbi; Tahir Choulli |
Abstract: | In this paper, we consider an informational market model with two flows of information. The smaller flow F, which is available to all agents, is the filtration of the initial market model (S,F,P), where S is the assets' price process and P is a probability measure. The larger flow G contains additional information about the occurrence of a random time T. This setting covers credit risk theory, where T models the default time of a firm, and life insurance, where T represents the death time of an insured. For the model (S-S^T,G,P), we address the log-optimal portfolio problem in many aspects. In particular, we answer the following questions and beyond: 1) What are the necessary and sufficient conditions for the existence of the log-optimal portfolio of the model under consideration? 2) What are the various types of risk induced by T that affect this portfolio, and how? 3) What are the factors that completely describe the sensitivity of the log-optimal portfolio to the parameters of T? The answers to these questions and other related discussions complement the work of Choulli and Yansori [12], which deals with the stopped model (S^T,G). |
Date: | 2022–04 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2204.03798&r= |
By: | Peter Reinhard Hansen; Chen Tong |
Abstract: | We introduce a novel pricing kernel with time-varying variance risk aversion that can explain key pricing kernel puzzles. When combined with the Heston-Nandi GARCH model, the framework yields closed-form expressions for the VIX. We also obtain closed-form expressions for option prices by proposing a novel method that extrapolates from the closed-form VIX. We estimate the model with S&P 500 returns and option prices and find a substantial reduction in pricing errors by permitting time-variation in volatility risk aversion. This reduction is seen for both option pricing and VIX pricing, both in-sample and out-of-sample. The variance risk ratio emerges as a fundamental variable in our framework, and we show that it is closely related to economic fundamentals and leading measures of sentiment and uncertainty. |
Date: | 2022–04 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2204.06943&r= |
By: | Bruno Spilak; Wolfgang Karl Härdle |
Abstract: | A portfolio allocation method based on linear and non-linear latent constrained conditional factors is presented. The factor loadings are constrained to always be positive in order to obtain long-only portfolios, which is not guaranteed by classical factor analysis or PCA. In addition, the factors are required to be uncorrelated across clusters in order to build long-only portfolios. Our approach is based on modern machine learning tools: convex Non-negative Matrix Factorization (NMF) and autoencoder neural networks, designed in a specific manner to enforce the learning of useful hidden data structure such as correlations between the assets' returns. Our technique finds weakly correlated linear and non-linear conditional latent factors, which are used to build outperforming global portfolios consisting of cryptocurrencies and traditional assets, similar to hierarchical clustering methods. We study the dynamics of the derived non-linear factors in order to forecast tail losses of the portfolios and thus build more stable ones. |
Date: | 2022–04 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2204.02757&r= |
By: | Florian Bourgey; Stefano De Marco; Peter K. Friz; Paolo Pigato |
Abstract: | Several asymptotic results for the implied volatility generated by a rough volatility model have been obtained in recent years (notably in the small-maturity regime), providing a better understanding of the shapes of the volatility surface induced by rough volatility models, and supporting their calibration power to S&P500 option data. Rough volatility models also generate a local volatility surface, via the so-called Markovian projection of the stochastic volatility. We complement the existing results on the implied volatility by studying the asymptotic behavior of the local volatility surface generated by a class of rough stochastic volatility models, encompassing the rough Bergomi model. Notably, we observe that the celebrated "1/2 skew rule" linking the short-term at-the-money skew of the implied volatility to the short-term at-the-money skew of the local volatility, a consequence of the celebrated "harmonic mean formula" of [Berestycki, Busca, and Florent, QF 2002], is replaced by a new rule: the ratio of the at-the-money implied and local volatility skews tends to the constant 1/(H + 3/2) (as opposed to the constant 1/2), where H is the regularity index of the underlying instantaneous volatility process. |
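[Editorial note: the constants quoted in the abstract can be checked directly; writing $\mathcal{S}$ for at-the-money skews (our notation), the new rule reads:]

```latex
\lim_{T \to 0}\; \frac{\mathcal{S}_{\mathrm{impl}}(T)}{\mathcal{S}_{\mathrm{loc}}(T)}
\;=\; \frac{1}{H + 3/2}, \qquad H \in (0, 1/2].
```

For a Markovian (Brownian) volatility, $H = 1/2$ and the ratio is $1/(1/2 + 3/2) = 1/2$, recovering the classical "1/2 skew rule"; as $H \to 0$ (very rough volatility) the ratio increases toward $2/3$.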
Date: | 2022–04 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2204.02376&r= |
By: | Kun Zhang; Ben Mingbin Feng; Guangwu Liu; Shiyu Wang |
Abstract: | Nested simulation is a natural approach to tackle nested estimation problems in operations research and financial engineering. The outer-level simulation generates outer scenarios and the inner-level simulations are run in each outer scenario to estimate the corresponding conditional expectation. The resulting sample of conditional expectations is then used to estimate different risk measures of interest. Despite its flexibility, nested simulation is notorious for its heavy computational burden. We introduce a novel simulation procedure that reuses inner simulation outputs to improve efficiency and accuracy in solving nested estimation problems. We analyze the convergence rates of the bias, variance, and MSE of the resulting estimator. In addition, central limit theorems and variance estimators are presented, which lead to asymptotically valid confidence intervals for the nested risk measure of interest. We conduct numerical studies on two financial risk measurement problems. Our numerical studies show consistent results with the asymptotic analysis and show that the proposed approach outperforms the standard nested simulation and a state-of-the-art regression approach for nested estimation problems. |
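[Editorial note: for readers unfamiliar with the setup, the sketch below shows the standard (non-reusing) nested simulation baseline the abstract's procedure improves on; the toy problem and its parameters are our invention.]

```python
# Standard two-level nested simulation: outer scenarios Z ~ N(0,1), inner
# samples Y | Z ~ N(Z, 1); estimate the tail risk measure P(E[Y|Z] > c).
# Inner-sample noise biases the plain estimator, which is why convergence
# rates in the inner budget matter.
import random

random.seed(0)

def nested_estimate(n_outer, n_inner, c=1.0):
    hits = 0
    for _ in range(n_outer):
        z = random.gauss(0.0, 1.0)                    # outer scenario
        inner = [random.gauss(z, 1.0) for _ in range(n_inner)]
        cond_mean = sum(inner) / n_inner              # inner estimate of E[Y|Z]
        hits += cond_mean > c
    return hits / n_outer

est = nested_estimate(2000, 50)
# True value is P(Z > 1), roughly 0.159; the estimate carries both outer
# sampling error and inner-noise bias.
```

The paper's contribution is to reuse the inner outputs across outer scenarios, cutting the total simulation budget needed for a given accuracy.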
Date: | 2022–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2203.15929&r= |
By: | Xu, Jack |
Abstract: | Fundamental credit analysis is widely performed by fixed income analysts and financial institutions to assess the credit risk of individual companies based on their financial data, notably the financial statements reported by the companies. Yet, conventional analysis has not developed a computational method to forecast, directly from a company's financial statements, the default probability, the recovery rate, and ultimately the fundamental valuation of a company's credit risk in terms of credit spreads over the risk-free rate. This paper introduces a generalizable approach to achieve these goals by implementing fundamental credit analysis in dynamical models. When combined with Monte-Carlo simulation, the current methodology naturally combines several novel features in the same forecast algorithm: 1. integrating default (defined as the state of negative cash) and the recovery rate (under a liquidation scenario) through the same defaulted balance sheet, 2. valuing the corporate real options manifested as planning in the amount of borrowing and expenditure, 3. embedding macro-economic and macro-financing conditions, and 4. forecasting the joint default risk of multiple companies. The method is applied to the Chinese real estate industry to forecast, for several listed developers, their forward default probabilities and associated recovery rates, and the fair-value par coupon curves of senior unsecured debt, using as inputs 6-8 years of their annual financial statements with 2020 as the latest. The results show both agreements and disagreements with the market-traded credit spreads in early April 2021, the time of these forecasts. The models forecast spreads much wider than the market's on the big three developers, in particular pricing Evergrande at distressed levels. After setting up additional generic industry models, the current methodology is capable of computing default risk and debt valuation on a large scale of companies based on their historical financial statements. |
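[Editorial note: the paper's core definition of default as the state of negative cash is easy to illustrate with a toy Monte-Carlo cash-flow simulation; the dynamics and numbers below are made up and far simpler than the paper's calibrated balance-sheet models.]

```python
# Default as the first time simulated cash turns negative: estimate a
# cumulative default probability over a horizon by Monte-Carlo. Cash-flow
# parameters (mu, sigma) are hypothetical, not calibrated to any company.
import random

random.seed(1)

def simulate_default_prob(cash0, mu, sigma, horizon, paths=5000):
    defaults = 0
    for _ in range(paths):
        cash = cash0
        for _ in range(horizon):
            cash += random.gauss(mu, sigma)   # net annual cash flow
            if cash < 0:                      # the paper's default state
                defaults += 1
                break
    return defaults / paths

pd_5y = simulate_default_prob(cash0=10.0, mu=1.0, sigma=6.0, horizon=5)
```

In the paper, the same simulated defaulted balance sheet also yields the recovery rate, so that default probability and loss-given-default come out of one consistent model.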
Keywords: | fundamental credit analysis; financial statement analysis; default forecasting; bond valuation; debt valuation; dynamical models; joint default; corporate real options |
JEL: | C6 G17 |
Date: | 2022–04–10 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:112699&r= |
By: | Daniel Dimitrov (University of Amsterdam) |
Abstract: | This paper examines the optimal allocation of risk across generations whose savings mix is subject to illiquidity in the form of uncertain trading costs. We use a stylised two-period OLG framework, where each generation makes a portfolio allocation decision for retirement, and show that illiquidity reduces the range of transferable shocks between generations and thus lowers the benefits of risk-sharing. Higher illiquidity may then justify higher levels of risk sharing to compensate for the trading friction. We nevertheless find that a contingent transfers policy based on a reasonably parametrised savings portfolio with liquid and illiquid assets increases aggregate welfare. |
Keywords: | intergenerational risk sharing, (il)liquidity, stochastic overlapping generations, funded pension plan |
JEL: | G11 G23 E21 H55 |
Date: | 2022–03–30 |
URL: | http://d.repec.org/n?u=RePEc:tin:wpaper:20220028&r= |
By: | Jaehyuk Choi; Rong Chen |
Abstract: | Risk parity, also known as equal risk contribution, has recently gained increasing attention as a portfolio allocation method. However, solving portfolio weights must resort to numerical methods as the analytic solution is not available. This study improves two existing iterative methods: the cyclical coordinate descent (CCD) and Newton methods. We enhance the CCD method by simplifying the formulation using a correlation matrix and imposing an additional rescaling step. We also suggest an improved initial guess inspired by the CCD method for the Newton method. Numerical experiments show that the improved CCD method performs the best and is approximately three times faster than the original CCD method, saving more than 40% of the iterations. |
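[Editorial note: a compact sketch of the cyclical coordinate descent (CCD) iteration for equal risk contribution, working on the correlation matrix with a rescaling step as the abstract describes; the paper's exact enhancements and initial-guess construction are not reproduced here, and the covariance matrix is a made-up example.]

```python
# CCD for equal risk contribution: each coordinate update solves the scalar
# first-order condition x_i * (C x)_i = b * sigma(x) as a quadratic, with C
# the correlation matrix; weights are then rescaled back by volatilities.
import numpy as np

def risk_parity_ccd(cov, n_iter=200):
    n = cov.shape[0]
    vol = np.sqrt(np.diag(cov))
    corr = cov / np.outer(vol, vol)          # work in correlation space
    x = np.ones(n)
    b = 1.0 / n                              # equal risk budgets
    for _ in range(n_iter):
        sigma_x = np.sqrt(x @ corr @ x)
        for i in range(n):
            alpha = corr[i] @ x - x[i]       # off-diagonal part; corr[i,i] == 1
            x[i] = (-alpha + np.sqrt(alpha**2 + 4.0 * b * sigma_x)) / 2.0
        x /= np.sqrt(x @ corr @ x)           # rescaling step: normalize to sigma = 1
    w = x / vol                              # undo the correlation scaling
    return w / w.sum()

cov = np.array([[0.04, 0.006, 0.0],
                [0.006, 0.09, 0.012],
                [0.0, 0.012, 0.16]])
w = risk_parity_ccd(cov)
rc = w * (cov @ w)                           # risk contributions, should be equal
```

At the fixed point, $x_i (Cx)_i = b\,\sigma(x)$ for every asset, i.e. all risk contributions coincide; equal contributions in correlation space carry over to covariance space after the volatility rescaling.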
Date: | 2022–02 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2203.00148&r= |
By: | Giacomo Giorgio; Barbara Pacchiarotti; Paolo Pigato |
Abstract: | We provide a short-time large deviation principle (LDP) for stochastic volatility models, where the volatility is expressed as a function of a Volterra process. This LDP holds under suitable conditions, but does not require any self-similarity assumption on the Volterra process. For this reason, we are able to apply such LDP to two notable examples of non self-similar rough volatility models: models where the volatility is given as a function of a log-modulated fractional Brownian motion [Bayer et al., Log-modulated rough stochastic volatility models. SIAM J. Financ. Math, 2021, 12(3), 1257-1284], and models where it is given as a function of a fractional Ornstein-Uhlenbeck (fOU) process [Gatheral et al., Volatility is rough. Quant. Finance, 2018, 18(6), 933-949]. In both cases we derive consequences for short-maturity European option prices and implied volatility surfaces. In the fOU case we also discuss moderate deviations pricing and simulation results. |
Date: | 2022–04 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2204.10103&r= |
By: | Xinyu Wang; Liang Zhao; Ning Zhang; Liu Feng; Haibo Lin |
Abstract: | The systemic stability of a stock market is one of the core issues in the financial field. The market can be regarded as a complex network whose nodes are stocks connected by edges that signify their correlation strength. Since the market is a strongly nonlinear system, it is difficult to measure the macroscopic stability and depict market fluctuations in time. In this paper, we use a geometric measure derived from discrete Ricci curvature to capture the higher-order nonlinear architecture of financial networks. In order to confirm the effectiveness of our method, we use it to analyze the CSI 300 constituents of China's stock market from 2005 to 2020, and the systemic stability of the market is quantified through the network's Ricci-type curvatures. Furthermore, we use a hybrid model to analyze the curvature time series and predict the future trends of the market accurately. As far as we know, this is the first paper to apply Ricci curvature to forecast the systemic stability of the domestic stock market, and our results show that Ricci curvature has good explanatory power for market stability and can be a good indicator to judge the future risk and volatility of the domestic market. |
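[Editorial note: to make "Ricci-type curvature of a network" concrete, here is the simplest combinatorial member of that family, Forman's edge curvature for unweighted graphs (ignoring triangle terms); the paper works with curvatures of this family on weighted correlation networks, which are richer than this sketch.]

```python
# Forman-Ricci edge curvature on an unweighted graph, in its simplest form:
# F(u, v) = 4 - deg(u) - deg(v). Strongly negative curvature concentrates on
# hub edges, a signature of fragile, hub-dominated network structure.

def forman_curvature(edges):
    """Return {edge: 4 - deg(u) - deg(v)} for an unweighted edge list."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    return {(u, v): 4 - deg[u] - deg[v] for u, v in edges}

# Star graph: hub 0 joined to nodes 1..4; every hub edge has curvature -1.
star = [(0, i) for i in range(1, 5)]
curv = forman_curvature(star)
```

Aggregating such edge curvatures over a rolling correlation network gives a scalar time series, which is what the paper's hybrid model then forecasts.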
Date: | 2022–04 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2204.06692&r= |
By: | Galvani, Valentina (University of Alberta, Department of Economics); Faychuk, Vita (Gustavus Adolphus College) |
Abstract: | We explore the existence of a mean-variance core subset of cryptocurrencies that subsumes the risk-reward of the broader market. The analysis considers both the perspective of long-short and long-only investors. The results indicate that most cryptocurrencies are redundant from the standpoint of both types of investors, with the exception of Bitcoin, which consistently improves the Sharpe ratio of even broad cryptocurrency portfolios. We show that the core can be often identified ex-ante as the cryptocurrencies attracting the highest levels of investors’ attention. |
Keywords: | Sharpe Ratio; Cryptocurrencies; Bitcoin; Short-Selling; Spanning |
JEL: | G11 G12 G14 G40 |
Date: | 2022–03–24 |
URL: | http://d.repec.org/n?u=RePEc:ris:albaec:2022_004&r= |
By: | Franco D. Albareti; Thomas Ankenbrand; Denis Bieri; Esther Hänggi; Damian Lötscher; Stefan Stettler; Marcel Schöngens |
Abstract: | Quantum computers can solve specific problems that are not feasible on "classical" hardware. Harvesting the speed-up provided by quantum computers therefore has the potential to change any industry which uses computation, including finance. First quantum applications for the financial industry involving optimization, simulation, and machine learning problems have already been proposed and applied to use cases such as portfolio management, risk management, and pricing derivatives. This survey reviews platforms, algorithms, methodologies, and use cases of quantum computing for various applications in finance in a structured way. It is aimed at people working in the financial industry, offering an overview of current developments and capabilities and an understanding of the potential of quantum computing in the financial industry. |
Date: | 2022–04 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2204.10026&r= |
By: | Felix-Benedikt Liebrich; Marco Maggis; Gregor Svindland |
Abstract: | Robust models in mathematical finance replace the classical single probability measure by a sufficiently rich set of probability measures on the future states of the world to capture (Knightian) uncertainty about the "right" probabilities of future events. If this set of measures is nondominated, many results known from classical dominated frameworks cease to hold, as probabilistic and analytic tools crucial for the handling of dominated models fail. We investigate the consequences for the robust model when prominent results from the mathematical finance literature are postulated. In this vein, we categorise the Kreps-Yan property, robust variants of the Brannath-Schachermayer Bipolar Theorem, Fatou representations of risk measures, and aggregation in robust models. |
Date: | 2020–04 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2004.06636&r= |
By: | Nick Netzer; Arthur Robson; Jakub Steiner; Pavel Kocourek |
Abstract: | In a model inspired by neuroscience, we show that constrained optimal perception encodes lottery rewards using an S-shaped encoding function and over-samples low-probability events. The implications of this perception strategy for behavior depend on the decision-maker’s understanding of the risk. The strategy does not distort choice in the limit as perception frictions vanish when the decision-maker fully understands the decision problem. If, however, the decision-maker underrates the complexity of the decision problem, then risk attitudes reflect properties of the perception strategy even for vanishing perception frictions. The model explains adaptive risk attitudes and probability weighting as in prospect theory and, additionally, predicts that risk attitudes are strengthened by time pressure and attenuated by anticipation of large risks. |
Keywords: | endogenous preferences, probability distortions, misspecified learning |
JEL: | D81 D87 D91 |
Date: | 2022 |
URL: | http://d.repec.org/n?u=RePEc:ces:ceswps:_9547&r= |
By: | Dave, Chetan (University of Alberta, Department of Economics); Dressler, Scott (Villanova University); Malik, Samreen (New York University Abu Dhabi) |
Abstract: | Several macroeconomic time series exhibit excess kurtosis or “Fat Tails” possibly due to rare but large shocks (i.e., tail events). We document the extent to which tail events are attributable to long-run growth shocks. We show that excess kurtosis is not a uniform characteristic of postwar US data, but attributable to episodes containing well-documented growth shocks. A general equilibrium model captures these observations assuming Gaussian business-cycle shocks and a single growth shock from various sources. The model matches the data best with a growth shock to labor productivity while investment-specific technology shocks drive cycles. |
Keywords: | fat tails; growth shocks; real business cycles |
JEL: | E00 E30 |
Date: | 2022–03–24 |
URL: | http://d.repec.org/n?u=RePEc:ris:albaec:2022_001&r= |
By: | Nicholas Fritsch; Jan-Peter Siedlarek |
Abstract: | Understanding banks’ responses to capital regulation is essential for regulators to use this key tool of modern banking regulation effectively. We study how and when US banks responded to changes to the way capital ratios are measured, changes that were introduced as part of the adoption of Basel III. We find that small banks — those below USD 10bn — responded neither before nor after the release of the new rules to the change in measured capital they experienced under the new rules. In contrast, we show that regional banks — those with total assets between USD 10bn and USD 50bn — adjusted their capital ratios to partially compensate for the changes resulting from the new rules: On average, if a bank’s capital ratio when measured under the new rules was lower than under the old rules, then the bank took steps to increase its capital ratio, compared to a bank whose capital ratio did not change with the new rules. This adjustment took place prior to the publication of the specific language applicable to US banks, suggesting that the changes were largely expected by that time. Both groups of banks responded in the periods following the release of the new US rules in relation to their exposure to mortgage servicing rights, suggesting that the severe treatment of this asset class was not expected. The bank responses we estimate take place well before the Basel III rules started to come into force after 2014, emphasizing the importance of policy announcements in shaping bank behavior. |
Keywords: | bank regulation; bank capital; capital requirements |
JEL: | G21 G28 |
Date: | 2022–04–20 |
URL: | http://d.repec.org/n?u=RePEc:fip:fedcwq:94079&r= |
By: | Mauro Bernardi; Daniele Bianchi; Nicolas Bianco |
Abstract: | We develop a new variational Bayes estimation method for large-dimensional sparse multivariate predictive regression models. Our approach allows us to elicit ordering-invariant shrinkage priors directly on the regression coefficient matrix rather than on a Cholesky-based linear transformation, as typically implemented in existing MCMC and variational Bayes approaches. Both a simulation and an empirical study on the cross-industry predictability of equity risk premiums in the US show that by directly shrinking weak industry inter-dependencies one can substantially improve both the statistical and economic out-of-sample performance of multivariate regression models for return predictability. This holds across alternative continuous shrinkage priors, such as the adaptive Bayesian lasso, the adaptive normal-gamma, and the horseshoe. |
Date: | 2022–02 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2202.12644&r= |
By: | Sergio A. Correia; Matthew P. Seay; Cindy M. Vojtech |
Abstract: | A resilient banking system meets the demands of households and businesses for financial services during both benign and severe macroeconomic and financial conditions. Banks' ability to weather severe macroeconomic shocks, and their willingness to continue providing financial services, depends on their levels of capital, balance sheet exposures, and ability to generate earnings. This note uses the Forward-Looking Analysis of Risk Events (FLARE) stress testing model to evaluate the resiliency of the banking system by consistently applying severe macroeconomic and financial shocks each quarter between 2014:Q1 and 2021:Q3. |
Date: | 2022–03–18 |
URL: | http://d.repec.org/n?u=RePEc:fip:fedgfn:2022-03-18&r= |
By: | Sebastian Baran; Przemysław Rola |
Abstract: | The insurance industry, with its large datasets, is a natural place to use big data solutions. However, it must be stressed that a significant number of applications of machine learning in the insurance industry, such as fraud detection or claim prediction, deal with the problem of machine learning on an imbalanced data set. This is due to the fact that frauds or claims are rare events when compared with the entire population of drivers. The problem of imbalanced learning is often hard to overcome. Therefore, the main goal of this work is to present and apply various methods of dealing with an imbalanced dataset in the context of claim occurrence prediction in car insurance. These techniques are then used to compare the results of machine learning algorithms on that task. Our study covers the following techniques: logistic regression, decision trees, random forests, xgBoost, and feed-forward networks. The problem is a classification one. |
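[Editorial note: the simplest member of the family of rebalancing techniques the abstract surveys is random oversampling of the minority (claim) class; the sketch below uses synthetic data, and more refined methods such as SMOTE synthesize new minority points rather than duplicating existing ones.]

```python
# Random oversampling: duplicate minority-class rows until both classes have
# equal counts, then fit any classifier on the balanced set. Data is synthetic.
import random

random.seed(42)

def random_oversample(X, y):
    """Return (X, y) with the minority class duplicated up to the majority count."""
    pos = [i for i, label in enumerate(y) if label == 1]
    neg = [i for i, label in enumerate(y) if label == 0]
    minority, majority = (pos, neg) if len(pos) < len(neg) else (neg, pos)
    extra = [random.choice(minority) for _ in range(len(majority) - len(minority))]
    idx = list(range(len(y))) + extra
    return [X[i] for i in idx], [y[i] for i in idx]

# 1 claim among 9 non-claims, mimicking the rarity of claims in car insurance.
X = [[i] for i in range(10)]
y = [1] + [0] * 9
Xb, yb = random_oversample(X, y)
```

Oversampling changes the class prior seen by the learner; evaluation should therefore use metrics robust to imbalance (e.g. precision/recall) on the original, untouched test distribution.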
Date: | 2022–04 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2204.06109&r= |
By: | Jozef Barunik; Lubos Hanus |
Abstract: | We propose a deep learning approach to probabilistic forecasting of macroeconomic and financial time series. Being able to learn complex patterns from a data-rich environment, our approach is useful for decision making that depends on the uncertainty of a large number of economic outcomes. Specifically, it is informative to agents facing asymmetric dependence of their loss on outcomes from possibly non-Gaussian and non-linear variables. We show the usefulness of the proposed approach on two distinct datasets where a machine learns the pattern from data. First, we construct macroeconomic fan charts that reflect information from a high-dimensional data set. Second, we illustrate gains in the prediction of stock return distributions, which are heavy tailed, asymmetric, and suffer from a low signal-to-noise ratio. |
Date: | 2022–04 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2204.06848&r= |