| on Risk Management |
| By: | Charlie Che; Hanxuan Lin; Yudong Yang; Guofan Hu; Lei Fang |
| Abstract: | We propose a model-independent framework for generating SPX and VIX risk scenarios based on a joint optimal transport calibration of their market smiles. Starting from the entropic martingale optimal transport formulation of Guyon, we introduce a perturbation methodology that computes sensitivities of the calibrated coupling using a Fisher information linearization. This allows risk scenarios to be generated without a full recalibration after market shocks. We further introduce a dimension reduction method based on perturbed optimal transport that produces fast and stable risk estimates while preserving the structural properties of the calibrated model. The approach is combined with Skew Stickiness Ratio (SSR) dynamics to translate SPX shocks into perturbations of forward variance and VIX distributions. Numerical experiments show that the proposed method produces accurate risk estimates relative to full recalibration while being computationally much faster. A backtesting study also demonstrates improved hedging performance compared with stochastic local volatility models. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.10857 |
| By: | Dhruv Bansal; Mayank Goud; Sourav Majumdar |
| Abstract: | In the Vasicek credit portfolio model, tail risk is driven primarily by the asset-correlation parameter, yet this parameter is itself empirically uncertain, exposing the model to correlation risk. We propose a stochastic correlation extension of the Vasicek framework in which the correlation state evolves as a diffusion on the circle. This representation accommodates both non-mean-reverting and mean-reverting dependence regimes via circular Brownian motion and the von Mises process, while retaining tractable transition densities. Conditionally on a fixed correlation state, we derive closed- or semi-closed-form expressions for the joint distribution of two assets, the joint first-passage (default) time distribution, and the joint survival probability. A simulation study quantifies how correlation volatility and persistence reshape joint default-at-horizon, survival, and joint barrier-crossing probabilities beyond marginal volatility effects. An empirical illustration using U.S. bank charge-off rates demonstrates economically interpretable time-variation in a dependence index and shows how inferred stochastic dependence translates into materially different joint tail-event probabilities. Overall, circular diffusion models provide a parsimonious and operationally tractable route to incorporating correlation risk into Vasicek structural credit calculations. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.01109 |
| By: | Ruodu Wang; Jingcheng Yu |
| Abstract: | We study submodularity for law-invariant functionals, with special attention to convex risk measures. Expected losses are modular, and certainty equivalents are submodular if and only if the underlying loss function is convex. Law-invariant coherent risk measures are submodular if and only if they are coherent distortion risk measures, which include the class of Expected Shortfall (ES). We proceed to consider four classes of convex risk measures with explicit formulas. For shortfall risk measures, we give a complete characterization through an inequality on the Arrow--Pratt measure of risk aversion. The optimized certainty equivalents are always submodular, whereas for the adjusted Expected Shortfall (AES) with a nonconvex penalty function, submodularity forces reduction to a standard ES. Within a subclass of monotone mean-deviation risk measures, submodularity can hold only in coherent distortion cases. In an empirical study of daily US equity returns using rolling historical estimation, no ES submodularity violations are observed, as expected from the exact ES structure of the estimator; VaR shows persistent violations linked to market stress, and AES shows a small percentage of violations. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.01232 |
| By: | Manan Poddar (Department of Mathematics, London School of Economics) |
| Abstract: | Deep hedging trains neural networks to manage derivative risk under market frictions, but produces hedge ratios with no measure of model confidence -- a significant barrier to deployment. We introduce uncertainty quantification to the deep hedging framework by training a deep ensemble of five independent LSTM networks under Heston stochastic volatility with proportional transaction costs. The ensemble's disagreement at each time step provides a per-time-step confidence measure that is strongly predictive of hedging performance: the learned strategy outperforms the Black-Scholes delta on approximately 80% of paths when model agreement is high, but on fewer than 20% when disagreement is elevated. We propose a CVaR-optimised blending strategy that combines the ensemble's hedge with the classical Black-Scholes delta, weighted by the level of model uncertainty. The blend improves on the Black-Scholes delta by 35-80 basis points in CVaR across several Heston calibrations, and on the theoretically optimal Whalley-Wilmott strategy by 100-250 basis points, with all improvements statistically significant under paired bootstrap tests. The analysis reveals that ensemble uncertainty is driven primarily by option moneyness rather than volatility, and that the uncertainty-performance relationship inverts under weak leverage -- findings with practical implications for the deployment of machine learning in hedging systems. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.10137 |
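The uncertainty-weighted blending idea in the abstract above can be sketched in a few lines. The logistic weighting rule, the function name `blend_hedge`, and the `mid`/`scale` parameters are illustrative assumptions, not the paper's CVaR-optimised scheme; the sketch only shows how ensemble disagreement can shift weight from a learned hedge toward the Black-Scholes delta.

```python
import numpy as np

def blend_hedge(ensemble_deltas, bs_delta, mid=0.05, scale=0.02):
    """Combine the ensemble-mean hedge ratio with the Black-Scholes delta,
    shifting weight toward Black-Scholes as ensemble disagreement grows.
    `mid` and `scale` set where and how sharply the switch occurs
    (assumed values, for illustration only).

    ensemble_deltas : (n_models, n_steps) hedge ratios from the ensemble
    bs_delta        : (n_steps,) Black-Scholes deltas
    """
    ens_mean = ensemble_deltas.mean(axis=0)
    disagreement = ensemble_deltas.std(axis=0)   # per-time-step uncertainty
    # logistic weight on Black-Scholes, increasing in disagreement
    w_bs = 1.0 / (1.0 + np.exp(-(disagreement - mid) / scale))
    return (1.0 - w_bs) * ens_mean + w_bs * bs_delta, w_bs

# toy example: five "models", twenty hedging steps
rng = np.random.default_rng(0)
ens = 0.5 + 0.1 * rng.standard_normal((5, 20))
bs = np.full(20, 0.55)
blended, w_bs = blend_hedge(ens, bs)
```

Because the blend is a pointwise convex combination, each blended ratio lies between the ensemble mean and the Black-Scholes delta, so the fallback can never push the hedge outside the range spanned by the two strategies.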
| By: | Jimmy Risk; Shen-Ning Tung; Tai-Ho Wang |
| Abstract: | This paper develops a robust mathematical framework for Constant Function Market Makers (CFMMs) by transitioning from traditional token reserve analyses to a coordinate system defined by price and intrinsic liquidity. We establish a canonical parametrization of the bonding curve that ensures dimensional consistency across diverse trading functions, such as those employed by Uniswap and Balancer, and demonstrate that asset reserves and value functions exhibit a linear dependence on this intrinsic liquidity. This linear structure facilitates a streamlined approach to arbitrage-free pricing, delta hedging, and systematic risk management. By leveraging the Carr-Madan spanning formula, we characterize Impermanent Loss (IL) as a weighted strip of vanilla options, thereby defining a fine-grained implied volatility structure for liquidity profiles. Furthermore, we provide a path-dependent analysis of IL using the last-passage time. Empirical results from Uniswap v3 ETH/USDC pools and Deribit option markets confirm a volatility smile consistent with crypto-asset dynamics, validating the framework's utility in characterizing the risk-neutral fair value of liquidity provision. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.01344 |
| By: | Yinhuan Li; Chenxin Lyu; Ruodu Wang |
| Abstract: | Risk forecasts in financial regulation and internal management are calculated from historical data. Unknown structural changes in financial data pose a substantial challenge in selecting an appropriate look-back window for risk modeling and forecasting. We develop a data-driven online learning method, called bootstrap-based adaptive window selection (BAWS), that adaptively determines the window size in a sequential manner. A central component of BAWS is the comparison of realized scores against a data-dependent threshold, which is evaluated via a bootstrap procedure. The proposed method is applicable to the forecast of risk measures that are elicitable individually or jointly, such as the Value-at-Risk (VaR) and the pair of the VaR and the corresponding Expected Shortfall. Through simulation studies and empirical analyses, we demonstrate that BAWS generally outperforms the standard rolling window approach and the recently developed method of stability-based adaptive window selection, especially when there are structural changes in the data-generating process. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.01157 |
| By: | Arno Botha; Mohammed Gabru; Marcel Muller; Janette Larney |
| Abstract: | The estimation of marginal loan write-off probabilities is a non-trivial task when modelling the loss given default (LGD) risk parameter in credit risk. We explore two types of survival models in estimating the overall write-off probability over default spell time, where these probabilities form the term-structure of write-off risk in aggregate. These survival models include a discrete-time hazard (DtH) model and a conditional inference survival tree. Both models are compared to a cross-sectional logistic regression model for write-off risk. All of these (first-stage) models are then ensconced in a broader two-stage LGD-modelling approach, wherein a loss severity model is estimated in the second stage. In expanding the model suite, a novel dichotomisation step is introduced for collapsing the write-off probability into a 0/1-value, prior to LGD-calculation. A benchmark study is subsequently conducted amongst the resulting LGD-models. We find that the DtH-model outperforms the other two-stage LGD-models across most diagnostics. However, a single-stage LGD-model still achieved the best results, likely due to the peculiar 'L-shaped' LGD-distribution in our data. Ultimately, we believe that our tutorial-style work can enhance LGD-modelling practices when estimating the expected credit loss under IFRS 9. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.11897 |
| By: | Daisuke Suzuki (Showa Women's University, Japan); Shoji Kamimura (Kanagawa University, Japan); Jumpei Yamada (Meiji Gakuin University, Japan) |
| Abstract: | This paper develops a unified framework for earnings measurement by introducing two types of risk: inflow-related risk and outflow-related risk. Using the concept of risk tolerance—the acceptable level of stock-related risk—the framework spans the full range of earnings measures, from conservative cash-based accounting to forward-looking economic income. The analysis shows that (i) the asymmetry between inflows and outflows is central to earnings recognition, (ii) varying risk tolerance explains practices such as historical cost and depreciation, and (iii) realization, matching, and conservatism can be reconciled within a risk-based model. The contribution lies in formally linking risk tolerance to the stock–flow structure of accounting, providing a clearer representation of how uncertainty shapes earnings measurement. The framework offers implications for both theory and practice while also facing limitations, including its stylized two-period setting and simplified risk measure. These point to directions for future extensions and empirical validation. |
| Keywords: | Earnings, Stock-flow congruence, Matching/realization, Risk tolerance, Uncertainty |
| JEL: | M41 |
| Date: | 2025–12–15 |
| URL: | https://d.repec.org/n?u=RePEc:aoh:conpro:2025:i:6:p:10-21 |
| By: | Yang Liu; Yunran Wei; Xintao Ye |
| Abstract: | Various financial market scenarios may cause heterogeneous risk assessments among analysts, which motivates the use of the Generalized Risk Measure in Fadina et al. (2024, Finance and Stochastics). Effectively synthesizing these diverse assessments avoids over-relying on a single, potentially flawed or conservative forecast and promotes more robust decision-making. Motivated by this, we establish analytical characterizations of the Weighted Generalized Risk Measure (WGRM) under both discrete and continuous settings. Building upon the WGRM, we incorporate the Fundamental Risk Quadrangle (FRQ) in Rockafellar and Uryasev (2013, Surveys in Operations Research and Management Science) into the Weighted Risk Quadrangle (WRQ) and show that the intrinsic relationships among risk, deviation, regret, error, and statistics in FRQ are preserved under weighted aggregation across scenarios. Moreover, we demonstrate that certain complex risk optimization problems under the WGRM can be reformulated as tractable linear programs through the WRQ structure, thus ensuring computational feasibility. Finally, the WGRM and WRQ framework is applied to empirical analyses using constituents of the NASDAQ 100 and S&P 500 indices across recession and expansion regimes, which validates that WGRM-based portfolios exhibit superior risk-adjusted performance and enhanced downside resilience and effectively mitigate losses arising from erroneous single-scenario judgments. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.10327 |
| By: | Lea Gogová (Národná banka Slovenska); Juraj Hledik (Joint Research Centre); Ján Klacso (Národná banka Slovenska) |
| Abstract: | Climate change is expected to lead to more frequent and intense extreme weather events, such as floods and droughts, which in turn increase physical risks. In this paper, we assess the direct exposure of Slovak banks’ corporate loan portfolios to riverine flood risk. We propose several monitoring metrics and estimate exposures at risk due to riverine flooding. Our analysis leverages a comprehensive dataset that integrates flood risk maps from the European Commission’s Joint Research Centre, cadastral data on firm properties, credit register data, and firms’ financial statements. While a significant share of firms are located in flood-prone areas, only a subset are likely to face flood levels that exceed critical thresholds. Consequently, the direct impact of riverine flooding on corporate credit risk appears to be relatively moderate — with the estimated increase of exposure at default ranging from 2 to 10 basis points of the corporate loan portfolio under standard scenarios, and up to 50–60 basis points in conservative stress cases accounting for asset value declines. Under counterfactual scenarios assuming a fivefold increase in the frequency of floods, the estimated increase exceeds 1 percentage point of the loan portfolio. |
| JEL: | G21 Q54 R30 |
| Date: | 2025–10 |
| URL: | https://d.repec.org/n?u=RePEc:svk:wpaper:1130 |
| By: | Hiroshi Oishi (Bank of Japan); Yoshibumi Makabe (Bank of Japan); Mitsuhiro Osada (Bank of Japan) |
| Abstract: | This article provides an overview of how supervisory granular data, particularly loan-by-loan data from banks, is used in recent analyses of the financial system at the Bank of Japan. The transaction-level granular data uncovers new facts that are difficult to identify from conventional aggregated data alone. As such, it serves as a powerful tool for detecting vulnerabilities within the financial system and evaluating potential risks. It is important to keep advancing analytical methods for utilizing granular data, thereby contributing to the enhancement of financial stability assessment and the improvement of banks' risk management. |
| Keywords: | granular data; loan-by-loan data; supervisory data; financial stability |
| JEL: | C80 G21 G32 |
| Date: | 2026–03–06 |
| URL: | https://d.repec.org/n?u=RePEc:boj:bojrev:rev26e02 |
| By: | Feinstein, Zachary; Sojmark, Andreas |
| Abstract: | We introduce a dynamic and stochastic interbank model with an endogenous notion of distress contagion, arising from rational worries about future defaults and ensuing losses. This entails a mark-to-market valuation adjustment for interbank claims, leading to a forward-backward approach to the equilibrium dynamics whereby future default probabilities are needed to determine today's balance sheets. Distinct from earlier models, the resulting distress contagion acts, endogenously, as a stochastic volatility term that exhibits clustering and down-market spikes. Furthermore, by incorporating multiple maturities, we provide a novel framework for constructing systemic interbank term structures, reflecting the intertemporal risk of contagion. We present the analysis in two parts: first, the simpler single maturity setting that extends the classical interbank network literature and, then, the multiple maturity setting for which we can examine how systemic risk materializes in the shape of the resulting term structures. |
| Keywords: | systemic risk; distress contagion; dynamic network model; multiple maturities; valuation adjustment; volatility effects; term structure; yield curves |
| JEL: | C1 |
| Date: | 2026–02–23 |
| URL: | https://d.repec.org/n?u=RePEc:ehl:lserod:137337 |
| By: | Nikhil Devanathan; Dylan Rueter; Stephen Boyd; Emmanuel Candès; Trevor Hastie; Mykel J. Kochenderfer; Arpit Apoorv; David Soronow; Igor Zamkovsky |
| Abstract: | This paper introduces methodologies for constructing an index composed of a risky asset and a risk-free asset that achieves a fixed target volatility. We propose a simple proportional-control-based approach for setting the index weights, and we demonstrate in simulation that this method is more effective at consistently achieving the target volatility than an open-loop approach. We additionally present a modification to our proportional control approach that reduces index drawdowns in simulation. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.01298 |
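The proportional-control idea described above can be illustrated with a minimal sketch: estimate realized portfolio volatility with an EWMA, then nudge the risky-asset weight in proportion to the relative volatility error. The gain `k`, the EWMA half-life, and the clipping bounds are assumed values for illustration, not the paper's calibrated choices.

```python
import numpy as np

def target_vol_weights(returns, target_vol, k=0.02, halflife=20):
    """Proportional control of the risky-asset weight toward a target
    annualized volatility (the risk-free leg is assumed to contribute
    no volatility). Returns the weight held at each step."""
    lam = 0.5 ** (1.0 / halflife)        # EWMA decay from half-life
    var_est = target_vol ** 2 / 252.0    # initialize at target daily variance
    w = 1.0                              # start fully invested in the risky asset
    weights = []
    for r in returns:
        weights.append(w)
        port_r = w * r                   # portfolio return this step
        var_est = lam * var_est + (1.0 - lam) * port_r ** 2
        realized = np.sqrt(252.0 * var_est)
        # proportional feedback on the relative volatility error
        w = np.clip(w + k * (target_vol - realized) / target_vol, 0.0, 2.0)
    return np.array(weights)

rng = np.random.default_rng(0)
rets = rng.normal(0.0, 0.02, 2000)       # roughly 32% annualized volatility
w = target_vol_weights(rets, target_vol=0.10)
ann_vol = np.sqrt(252.0) * (w * rets).std()
```

Unlike the open-loop rule `w = target_vol / vol_forecast`, the feedback form reacts to the volatility the index actually realizes, which is what lets it track the target when the forecast is biased.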
| By: | Juliane Begenau; Vadim Elenev; Tim Landvoigt |
| Abstract: | This paper investigates financial stability risks arising from banks' interest rate exposure and uninsured deposit funding. We develop a model of heterogeneous banks featuring endogenous run risk to jointly analyze portfolio and funding choices. The model replicates key empirical patterns, including the concentration of uninsured deposits in larger banks. We analyze the impact of monetary policy rate hikes and evaluate the capacity of microprudential tools to mitigate bank fragility. Results demonstrate that tightening capital requirements significantly lowers run risk. Higher liquidity requirements targeting uninsured deposits efficiently reduce run risk, provided they are met exclusively with reserves. |
| JEL: | E41 E43 E44 E58 G11 G12 G21 G28 |
| Date: | 2026–02 |
| URL: | https://d.repec.org/n?u=RePEc:nbr:nberwo:34892 |
| By: | Han Chen (College of Finance and Statistics, Hunan University); Yijie Fei (College of Finance and Statistics, Hunan University); Jun Yu (Faculty of Business Administration, University of Macau) |
| Abstract: | Modeling the dynamics of correlations of multiple time series is an important yet difficult task, especially when the dimension is not confined to be low. In this paper, we propose a new multivariate stochastic volatility model featuring a block correlation structure. Our specification is built upon the new parametrization of the correlation matrix of Archakov & Hansen (2021) and extends the MSV-GFT model introduced in Chen et al. (2025). A Particle Gibbs Ancestor Sampling (PGAS) method is proposed to conduct the Bayesian analysis. It is shown to perform well for our model in finite samples. An empirical application based on a dozen U.S. stocks shows that our new model outperforms alternative specifications in terms of both the in-sample performance and the out-of-sample performance. |
| Keywords: | Block correlation matrix; Generalized Fisher transformation; Markov chain Monte Carlo; Multivariate stochastic volatility; Particle filter |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:boa:wpaper:202638 |
| By: | Christopher Blier-Wong |
| Abstract: | The conditional mean risk-sharing (CMRS) rule is an important tool for distributing aggregate losses across individual risks, but its implementation in continuous multivariate models typically requires complicated multidimensional integrals. We develop a framework to compute CMRS allocations from the joint Laplace--Stieltjes transform of the risk vector. The LSTs of the allocation measures $\nu_i(B)=\mathbb{E}[X_i\boldsymbol{1}_{\{S\in B\}}]$ are expressed as partial derivatives of the joint LST evaluated on the diagonal $t_1=\cdots=t_n$. When densities exist, this yields one-dimensional Laplace inversions for $f_S$ and $\xi_i$, and hence $h_i(s)=\xi_i(s)/f_S(s)$ on the absolutely continuous part, providing closed-form or semi-analytic solutions for a broad class of distributions. We also develop numerical inversion methods for cases where analytic inversion is unavailable. We introduce an exponential tilting procedure to stabilize numerical inversion in low-probability aggregate events. We provide several examples to illustrate the approach, including in some high-dimensional settings where existing approaches are infeasible. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.01434 |
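The diagonal-derivative identity underlying the abstract above can be written out explicitly. The following reconstruction uses only the abstract's notation and standard Laplace-transform calculus; it is a sketch of the stated relationship, not the paper's full development.

```latex
% Joint LST of the risk vector (X_1,\dots,X_n):
%   \hat F(t_1,\dots,t_n) = \mathbb{E}\bigl[e^{-\sum_j t_j X_j}\bigr].
% Differentiating in t_i and setting t_1=\cdots=t_n=t, so that
% \sum_j t_j X_j = tS with S = X_1+\cdots+X_n, gives the LST of the
% allocation measure \nu_i(B) = \mathbb{E}[X_i\boldsymbol{1}_{\{S\in B\}}]:
\hat\nu_i(t) := \int_0^\infty e^{-ts}\,\nu_i(\mathrm{d}s)
             = \mathbb{E}\bigl[X_i e^{-tS}\bigr]
             = -\,\partial_{t_i}\hat F(t_1,\dots,t_n)\Big|_{t_1=\cdots=t_n=t}.
% One-dimensional inversion of \hat\nu_i and of
% \hat F(t,\dots,t) = \mathbb{E}[e^{-tS}] then yields \xi_i and f_S,
% and hence the allocation h_i(s) = \xi_i(s)/f_S(s).
```

This is why only one-dimensional inversions are needed: the multivariate structure enters solely through partial derivatives of the joint LST evaluated on the diagonal.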
| By: | Harashima, Taiji |
| Abstract: | Personal bankruptcy is usually justified in theory on the grounds that households always face substantial risks, which conversely implies that if there is no risk, personal bankruptcy is only socially harmful. However, even in a non-stochastic environment, households can persistently owe large debts because of heterogeneities in their preferences and abilities. Moreover, the usually implemented means of public assistance cannot necessarily prevent households from persistently owing large debts because of these heterogeneities. In this paper, I show that personal bankruptcy can prevent households from persistently owing large debts resulting from their heterogeneities, and that personal bankruptcy can therefore be justified even in a non-stochastic environment. |
| Keywords: | Personal bankruptcy; Bankruptcy discharge; Sustainable heterogeneity |
| JEL: | D63 E21 |
| Date: | 2026–03–03 |
| URL: | https://d.repec.org/n?u=RePEc:pra:mprapa:125812 |
| By: | Goumet, Laudine; Menegat, Martina; Almeida, Elena; Waaifoort, Maria; Smolenska, Agnieszka |
| Abstract: | Nature degradation is increasingly recognised as a source of prudential and macrofinancial risk. This has prompted regulators and banks in the European Union to move beyond climate-only approaches to assess broader environmental exposures. An analysis of 15 EU banks’ public disclosure documents reveals a developing approach to nature-risk mitigation but a gap between ambition and implementation. Banks, financial supervisors and regulators should build on emerging good practices even in the face of regulatory rollbacks. They should treat nature-related risks as material prudential concerns, strengthen monitoring and assessment frameworks, and address environmental risks in their entirety rather than through a climate lens alone. |
| JEL: | F3 G3 N0 |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:ehl:lserod:137487 |
| By: | Fotso, Chris Toumping; Özer, Yeliz; Palumbo, Dario; Sibbertsen, Philipp |
| Abstract: | A dynamic model for heavy-tailed cylindrical time series is developed by combining score-driven models with a generalised Pareto-type cylindrical distribution. The proposed specification extends existing cylindrical models by allowing the location, scale, concentration, and, crucially, the tail index of the linear component, through the conditional distribution of speed, to vary according to its score. Whereas the linear component of the Weibull-von Mises model exhibits exponentially decaying tails, the GPar specification admits polynomial tail decay. An explicit expression for the time-varying circular-linear dependence measure is also derived. The methodology is applied to high-frequency data from two onshore wind turbines in Germany. The empirical results indicate that allowing time-varying tail thickness leads to overall improvements compared to the Weibull-von Mises model. The proposed model provides a flexible and computationally tractable framework for analysing heavy-tailed cylindrical time series in environmental and energy applications. |
| Keywords: | cylindrical distributions, dynamic correlation, generalised Pareto, score-driven models, Weibull-von Mises, wind energy. |
| JEL: | C13 C18 C22 C46 Q42 |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:han:dpaper:dp-745 |
| By: | Adir Saly-Kaufmann; Kieran Wood; Jan Peter-Calliess; Stefan Zohren |
| Abstract: | We present a large-scale benchmark of modern deep learning architectures for a financial time series prediction and position sizing task, with a primary focus on Sharpe ratio optimization. Evaluating linear models, recurrent networks, transformer-based architectures, state space models, and recent sequence representation approaches, we assess out-of-sample performance on a daily futures dataset covering commodities, equity indices, bonds, and FX from 2010 to 2025. Our evaluation goes beyond average returns and includes statistical significance, downside and tail risk measures, breakeven transaction cost analysis, robustness to random seed selection, and computational efficiency. We find that models explicitly designed to learn rich temporal representations consistently outperform linear benchmarks and generic deep learning models, which often lead the ranking in standard time series benchmarks. Hybrid models such as VSN with LSTM, a combination of Variable Selection Networks (VSN) and LSTMs, achieve the highest overall Sharpe ratio, while VSN with xLSTM and LSTM with PatchTST exhibit superior downside-adjusted characteristics. xLSTM demonstrates the largest breakeven transaction cost buffer, indicating improved robustness to trading frictions. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.01820 |
| By: | Aman, Muhammad; Ali, Amjad; Audi, Marc |
| Abstract: | This research investigates the potential role of Bitcoin as a hedge against inflation across various countries, utilizing data spanning from 2015 to 2024. As central banks confront the inflationary pressures intensified by the global pandemic and fluctuations in international money supply, Bitcoin has gained increased attention. Proponents of Bitcoin contend that, similar to gold and in contrast to government-issued currencies, it is decentralized and has a limited supply, which theoretically protects it from inflationary erosion. However, due to the high volatility and speculative nature of cryptocurrencies, their practicality for facilitating monetary transactions remains contentious. Grounded in the positivist paradigm, this study employs ordinary least squares regression, dynamic conditional correlation-generalized autoregressive conditional heteroskedasticity, panel fixed effects, and quantile regression methods, using monthly data on Bitcoin returns, inflation levels, and financial benchmarks across both developed and emerging economies. Empirical findings reveal that Bitcoin returns exhibit no significant correlation with inflation, either across the full sample or within advanced economies. The evidence suggests that Bitcoin's valuation responds more to variables like exchange rates, interest rates, and speculative investor behavior than to inflation itself. Comparative performance analysis indicates that Bitcoin underperforms traditional inflation hedging instruments. During inflationary episodes, assets such as gold and Treasury Inflation-Protected Securities offer more reliable financial protection than Bitcoin. The study concludes that while Bitcoin does not effectively hedge against inflation, it may serve as a risk-diversification tool within portfolios under specific conditions. Due to its volatility, regulatory limitations, and weak inflation linkage, Bitcoin remains unsuitable for integration into conventional central banking frameworks. These insights offer practical implications for investors, portfolio managers, and policymakers navigating inflationary periods. Although Bitcoin may serve niche purposes, it should not be equated with traditional risk-hedging financial assets. |
| Keywords: | Bitcoin, Inflation Hedge, Cryptocurrency, Emerging Markets |
| JEL: | E4 |
| Date: | 2025 |
| URL: | https://d.repec.org/n?u=RePEc:pra:mprapa:127489 |
| By: | Cameron Tucker |
| Abstract: | For Financial Literacy Month, get a refresher on insurance and learn about resources to help you teach your kids or students about it. |
| Date: | 2025–04–02 |
| URL: | https://d.repec.org/n?u=RePEc:fip:l00100:102759 |
| By: | William N. Goetzmann; Otto Manninen; James Tyler |
| Abstract: | We examine the historical frequency of stock market booms, crashes, and bubbles in the United States from 1792 to 2024 using aggregate market data and industry-level portfolios. We define a bubble as a large boom followed by a crash that reverses the market’s prior gains. Bubbles are extremely rare. We extend the industry-level analysis of Greenwood, Shleifer, and You (2019) through 2024 and replicate their findings out of sample using Cowles Commission industry data from 1871 to 1938. Booms do not reliably predict crashes, but they do predict higher subsequent volatility, increasing the likelihood of both large gains and large losses. |
| JEL: | G1 G10 G12 G4 |
| Date: | 2026–02 |
| URL: | https://d.repec.org/n?u=RePEc:nbr:nberwo:34903 |
| By: | Jonathan Klinge; Maren Diane Schmeck |
| Abstract: | We study a dynamic model of a non-life insurance portfolio. The foundation of the model is a compound Poisson process that represents the claims side of the insurer. To introduce clusters of claims, as arise e.g. with catastrophic events, this process is time-changed by a Lévy subordinator. The subordinator is chosen so that it evolves, on average, at the same speed as calendar time, creating a trade-off between intensity and severity. We show that such a transformation always has a negative impact on the probability of ruin. Although the expected total claim amount remains invariant, the probability of ruin as a function of the initial capital can fall arbitrarily slowly, depending on the choice of the subordinator. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.01821 |
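The effect described in the abstract above can be probed with a discrete-time Monte Carlo sketch: the same compound Poisson surplus process is simulated with and without a unit-mean-rate time change. The gamma-increment subordinator, the parameter values, and the discretization step are illustrative assumptions, not the paper's setup or proof technique.

```python
import numpy as np

def ruin_prob(u, premium, lam, mean_claim, T=50.0, dt=0.1,
              n_paths=20000, sub_inc=None, seed=1):
    """Monte Carlo ruin probability (checked at step ends, a discrete
    approximation) for a compound Poisson surplus process, optionally
    time-changed by a subordinator that introduces claim clustering.
    `sub_inc(rng, n, dt)` draws operational-time increments with mean dt;
    None means ordinary calendar time."""
    rng = np.random.default_rng(seed)
    n_steps = int(T / dt)
    surplus = np.full(n_paths, u, dtype=float)
    ruined = np.zeros(n_paths, dtype=bool)
    for _ in range(n_steps):
        ds = np.full(n_paths, dt) if sub_inc is None else sub_inc(rng, n_paths, dt)
        k = rng.poisson(lam * ds)            # number of claims this step
        tot = np.zeros(n_paths)
        m = k > 0
        tot[m] = rng.gamma(k[m], mean_claim) # sum of k iid Exp(mean_claim) claims
        surplus += premium * dt - tot
        ruined |= surplus < 0.0
    return ruined.mean()

# Gamma-subordinator increments: mean dt (calendar speed on average),
# variance v * dt, so larger v means stronger claim clustering.
gamma_inc = lambda rng, n, dt, v=2.0: rng.gamma(dt / v, v, size=n)

p_plain = ruin_prob(u=10.0, premium=1.2, lam=1.0, mean_claim=1.0)
p_clust = ruin_prob(u=10.0, premium=1.2, lam=1.0, mean_claim=1.0, sub_inc=gamma_inc)
```

Both runs have the same expected total claim amount per unit of calendar time; only the clustering differs, which is the trade-off between intensity and severity that the abstract highlights.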