on Risk Management |
| By: | Minxuan Hu; Ziheng Chen; Jiayu Yi; Wenxi Sun |
| Abstract: | The deployment of autonomous AI agents in derivatives markets has widened a practical gap between static model calibration and realized hedging outcomes. We introduce two reinforcement learning frameworks, a novel Replication Learning of Option Pricing (RLOP) approach and an adaptive extension of the Q-Learner in Black-Scholes (QLBS), that prioritize shortfall probability and align learning objectives with downside-sensitive hedging. Using listed SPY and XOP options, we evaluate models on realized-path delta-hedging outcome distributions, shortfall probability, and tail-risk measures such as Expected Shortfall. Empirically, RLOP reduces shortfall frequency in most slices and shows the clearest tail-risk improvements under stress, while implied-volatility fit often favors parametric models yet poorly predicts after-cost hedging performance. This friction-aware RL framework supports a practical approach to autonomous derivatives risk management as AI-augmented trading systems scale. |
| Date: | 2026–02 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.06587 |
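The evaluation metrics this abstract names (realized delta-hedging P&L distributions, shortfall probability, Expected Shortfall) can be illustrated with a minimal Monte Carlo sketch. The RL policies themselves are not reproduced here: the sketch assumes GBM price dynamics, a short call hedged with the Black-Scholes delta, zero rates, no transaction costs, and illustrative parameter values (`S0`, `K`, `sigma`, step counts) that do not come from the paper.

```python
import math, random

def bs_delta(S, K, T, r, sigma):
    """Black-Scholes call delta N(d1)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    return 0.5 * (1.0 + math.erf(d1 / math.sqrt(2.0)))

def hedge_pnl_paths(S0=100.0, K=100.0, T=0.25, r=0.0, sigma=0.2,
                    steps=50, n_paths=2000, seed=7):
    """Simulate GBM paths, delta-hedge a short call, return terminal P&L per path.
    The cash account earns no interest, so this is meant for the r=0 default."""
    rng = random.Random(seed)
    dt = T / steps
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    premium = S0 * N(d1) - K * math.exp(-r * T) * N(d2)  # option sold at BS value
    pnls = []
    for _ in range(n_paths):
        S, cash, delta = S0, premium, 0.0
        for i in range(steps):
            tau = T - i * dt
            new_delta = bs_delta(S, K, tau, r, sigma)
            cash -= (new_delta - delta) * S      # rebalance the stock hedge
            delta = new_delta
            S *= math.exp((r - 0.5 * sigma**2) * dt
                          + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0))
        payoff = max(S - K, 0.0)
        pnls.append(cash + delta * S - payoff)   # liquidate hedge, pay the claim
    return pnls

def shortfall_prob(pnls, level=0.0):
    """Fraction of paths whose hedging P&L falls below `level`."""
    return sum(p < level for p in pnls) / len(pnls)

def expected_shortfall(pnls, alpha=0.05):
    """Average loss over the worst alpha fraction of paths (reported as positive)."""
    tail = sorted(pnls)[: max(1, int(alpha * len(pnls)))]
    return -sum(tail) / len(tail)
```

With frequent rebalancing the P&L distribution concentrates near zero, and the shortfall/ES statistics summarize its left tail, which is the kind of downside-sensitive objective the abstract describes.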
| By: | Matteo Bonato (Department of Economics and Econometrics, University of Johannesburg, Auckland Park, South Africa; IPAG Business School, 184 Boulevard Saint-Germain, 75006 Paris, France); Oguzhan Cepni (Ostim Technical University, Ankara, Turkiye; University of Edinburgh Business School, Centre for Business, Climate Change, and Sustainability; Department of Economics, Copenhagen Business School, Denmark); Rangan Gupta (Department of Economics, University of Pretoria, Private Bag X20, Hatfield 0028, South Africa); Christian Pierdzioch (Department of Economics, Helmut Schmidt University, Holstenhofweg 85, P.O.B. 700822, 22008 Hamburg, Germany) |
| Abstract: | We introduce credit standards from the Federal Reserve's Senior Loan Officer Opinion Survey (SLOOS) as a novel predictor of U.S. stock market realized volatility over 1990:04-2024:12. We show that tighter credit standards significantly predict higher realized volatility both in- and out-of-sample at one-, three-, and six-month-ahead horizons. A parsimonious model with only the credit standards factor outperforms more complex specifications incorporating macroeconomic factors, uncertainty indexes, and realized moments, estimated via elastic-net and random forest methods, with forecasting gains increasing at longer horizons. These findings establish credit standards as a powerful and distinct predictor of stock market volatility with practical implications for portfolio allocation and risk management. |
| Keywords: | Credit conditions, Realized stock market volatility, Forecasting |
| JEL: | C22 C53 E23 G10 G17 |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:pre:wpaper:202607 |
| By: | Claudia Fassino; Pierpaolo Uberti |
| Abstract: | Given a reference risk measure, the risk budgeting portfolio is the one in which each asset contributes a predetermined amount to the total risk. We propose a novel approach, alternative to those in the literature, for computing the risk budgeting portfolio. This different perspective on the problem has several interesting consequences. For the computation, we define a Cauchy sequence within the simplex of R^n whose limit is the risk budgeting portfolio. This construction allows for the straightforward implementation of an efficient algorithm, avoiding the need to solve auxiliary, equivalent optimization problems, which may be computationally challenging and hard to interpret in a decision-theoretic context. We compare our algorithm with the standard optimization-based methods proposed in the literature. From a theoretical point of view, starting from the Cauchy sequence, we define a function for which the risk budgeting portfolio is a fixed point, so that sufficient conditions for the existence and uniqueness of the fixed point can be applied. The methodology is developed for general risk measures and implemented in detail for the standard deviation.
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.15511 |
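The abstract does not spell out the paper's Cauchy-sequence construction, but the fixed-point characterization it mentions can be sketched for the standard-deviation case: at the risk budgeting portfolio, x_i (Σx)_i = b_i x'Σx for budgets b. The damped Picard iteration below is a generic scheme with that solution as its fixed point; the function names and the geometric damping are assumptions, not the paper's algorithm.

```python
import numpy as np

def risk_budgeting_weights(cov, budgets, n_iter=2000, tol=1e-14):
    """Damped fixed-point iteration for the risk budgeting portfolio under the
    standard-deviation risk measure: at the solution, x_i * (cov @ x)_i is
    proportional to the budget b_i."""
    cov = np.asarray(cov, dtype=float)
    b = np.asarray(budgets, dtype=float)
    b = b / b.sum()
    x = np.full(len(b), 1.0 / len(b))      # start at the center of the simplex
    for _ in range(n_iter):
        marginal = cov @ x                  # (cov x)_i, marginal risk direction
        y = b * (x @ marginal) / marginal   # raw fixed-point update
        x_new = np.sqrt(x * y)              # geometric damping to avoid cycling
        x_new /= x_new.sum()                # project back onto the simplex
        if np.max(np.abs(x_new - x)) < tol:
            return x_new
        x = x_new
    return x

def risk_contributions(cov, x):
    """Per-asset contributions to portfolio standard deviation (they sum to it)."""
    cov = np.asarray(cov, float); x = np.asarray(x, float)
    sigma = np.sqrt(x @ cov @ x)
    return x * (cov @ x) / sigma
```

For a diagonal covariance the solution is available in closed form, x_i proportional to sqrt(b_i)/sigma_i, which gives a quick sanity check on the iteration.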
| By: | Daniel Bloch |
| Abstract: | This paper introduces a transformative framework for managing path-dependent financial risk by shifting from traditional distribution-centric models to a geometry-based approach. We propose the SigSwap as a new regulatory instrument that allows market participants to decompose complex risk into terminal price law and the underlying texture of the price path. By utilising the mathematical properties of the path-signature, we demonstrate how previously unmodellable risks, such as lead-lag dynamics and flash-crash spiralling, can be converted into transparent and linear risk factors. Central to this framework is the introduction of Signature Expected Shortfall, a risk metric designed to capture toxic path geometries that traditional methods often overlook. We also present a proactive monitoring system based on the Temporal Exposure Profile, which utilises anticipatory learning to detect potential liquidity traps and geometric decoupling before they manifest as realised volatility. The proposed methodology offers a rigorous alignment with global regulatory mandates, specifically the Fundamental Review of the Trading Book (FRTB), by providing a consistent bridge between physical stress-testing and risk-neutral hedging. Finally, we show that this algebraic approach significantly reduces computational complexity, enabling real-time, high-frequency risk reporting and capital optimisation for the modern financial ecosystem. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.24154 |
| By: | Otar Sepper |
| Abstract: | We introduce Slippage-at-Risk (SaR), a quantitative framework for measuring liquidity risk in perpetual futures exchanges. Unlike backward-looking metrics such as Value-at-Risk computed on historical returns or realized deficit distributions, SaR provides a forward-looking assessment of liquidation execution risk derived from current order book microstructure. The framework comprises three complementary metrics: SaR(α), the cross-sectional slippage quantile; ESaR(α), the expected slippage in the distributional tail; and TSaR(α), the aggregate dollar-denominated tail slippage. We extend the base framework with a concentration adjustment that penalizes fragile liquidity structures where a small number of market makers dominate quote provision. Drawing on recent work by Chitra et al. (2025) on autodeleveraging mechanisms and insurance fund optimization, we establish a direct mapping from SaR metrics to optimal capital requirements. Empirical analysis using Hyperliquid order book data, including the October 10, 2025 liquidation cascade, demonstrates SaR's predictive validity as a leading indicator of systemic stress. We conclude with practical implementation guidance and discuss philosophical implications for risk management in decentralized financial systems.
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.09164 |
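The exact definitions behind SaR(α) and ESaR(α) are not given in the abstract; one plausible reading is sketched below: slippage is the relative shortfall of the volume-weighted execution price of a liquidation that walks the order book versus the mid price, SaR(α) is a cross-sectional quantile of such slippages, and ESaR(α) averages the slippages at or beyond that quantile. All function names and the book representation are illustrative assumptions.

```python
import numpy as np

def walk_the_book(levels, qty):
    """Average execution price for a market sell of `qty` walking bid levels.
    `levels` is a list of (price, size) pairs, best price first."""
    remaining, cost = qty, 0.0
    for price, size in levels:
        take = min(remaining, size)
        cost += take * price
        remaining -= take
        if remaining <= 0:
            break
    if remaining > 0:
        raise ValueError("order book too thin for requested quantity")
    return cost / qty

def slippage(levels, mid, qty):
    """Relative shortfall of the execution price versus mid for a sell order."""
    return (mid - walk_the_book(levels, qty)) / mid

def sar(slippages, alpha=0.95):
    """SaR(alpha): cross-sectional slippage quantile."""
    return float(np.quantile(slippages, alpha))

def esar(slippages, alpha=0.95):
    """ESaR(alpha): mean slippage at or beyond the SaR(alpha) threshold."""
    s = np.asarray(slippages, float)
    tail = s[s >= np.quantile(s, alpha)]
    return float(tail.mean())
```

By construction ESaR(α) ≥ SaR(α), mirroring the usual VaR/Expected Shortfall ordering.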
| By: | Giulia Di Nunno; Emanuela Rosazza Gianin |
| Abstract: | When dealing with horizons at different time scales, risk evaluation of losses may involve both interest rate uncertainty and horizon risk, as introduced in [11]. To capture both effects, we work with cash-subadditive fully-dynamic risk measures. In this work we consider such measures obtained via the BSDE and the shortfall approaches, treating BSDEs with both Lipschitz and quadratic drivers. We then introduce the hq-entropic risk measure on losses as an effective example of a fully-dynamic risk measure serving this purpose. Shortfall risk measures are extended to capture cash non-additivity. For our newly introduced h-generalized shortfall risk measures we provide a dual representation and connect them to fully-dynamic certainty equivalents. We conclude that the hq-entropic risk measures on losses belong to the family of h-generalized shortfall risk measures, but are not of certainty-equivalent type. We note that the classical entropic risk measure, besides being generated by a BSDE, is both a shortfall risk measure and a certainty equivalent.
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.14024 |
| By: | Konietschke, Paul; Metzler, Julian; Ponte Marques, Aurea |
| Abstract: | Conventional credit risk models understate tail risk by centering on mean default probabilities and neglecting distributional and sectoral heterogeneity. We propose a Quantile Probability of Default (QPD) framework based on unconditional quantile regressions estimated on flow default rates from five million non-financial firms across nine countries, conditioned on macro- and sectoral scenario covariates standard in stress testing. The tail exhibits three- to five-fold stronger sensitivity than the median, revealing non-linearities and asymmetric sectoral propagation of credit risk. We validate the performance of our model across crisis periods and benchmark models to confirm the framework's robustness and prudential efficiency. Under the European Central Bank's 2025 increasing geopolitical and trade tensions scenario, the QPD identifies higher tail vulnerabilities in construction, trade, hospitality, and real estate. The framework embeds distributional estimation into stress testing, advancing scenario-based assessment of sectoral credit risk for policy and prudential applications.
| JEL: | C21 C54 D22 G21 G32
| Keywords: | firm dynamics, non-linearity, probability of default, stress testing, trade tension |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:ecb:ecbwps:20263207 |
| By: | Xiaochun Liu; Richard Luger |
| Abstract: | We introduce a semiparametric approach for forecasting Value-at-Risk (VaR) and Expected Shortfall (ES) by modeling the conditional scale of financial returns, defined as the difference between two specified quantiles, via restricted quantile regression. Focusing on downside risk, VaR is derived from the left-tail quantile of rescaled returns, and ES is approximated by averaging quantiles below the VaR level. The method delivers robust, distribution-free estimates of extreme losses and captures skewness, heavy tails, and leverage effects. Simulation experiments and empirical analysis show that it often outperforms established models, including GARCH and joint VaR-ES conditional-quantile approaches. An application to daily returns on major international stock indices, spanning the COVID-19 period, highlights its effectiveness in capturing risk dynamics. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.02357 |
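The core constructions in this abstract, a scale defined as the difference between two quantiles, VaR as a left-tail quantile, and ES approximated by averaging quantiles below the VaR level, have simple empirical counterparts. The sketch below uses plain sample quantiles rather than the paper's restricted quantile regression, so it illustrates only the distribution-free VaR/ES step; the quantile grid for the ES average is an assumption.

```python
import numpy as np

def conditional_scale(returns, upper=0.75, lower=0.25):
    """Scale defined as the difference between two specified quantiles
    (here the interquartile range by default)."""
    r = np.asarray(returns, float)
    return float(np.quantile(r, upper) - np.quantile(r, lower))

def var_es(returns, alpha=0.05):
    """Distribution-free downside risk: VaR is the alpha-level left-tail
    quantile of returns (reported as a positive loss), and ES is approximated
    by averaging quantiles below the VaR level on a grid."""
    r = np.asarray(returns, float)
    var = -np.quantile(r, alpha)
    grid = np.linspace(alpha / 50, alpha, 50)   # quantile levels below alpha
    es = -float(np.quantile(r, grid).mean())    # average of sub-VaR quantiles
    return float(var), es
```

For standard normal data the 5% VaR is about 1.645 and the ES about 2.06, which gives a quick sanity check on the averaging approximation.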
| By: | Brunella Bruno; Immacolata Marino
| Abstract: | We examine how banks adjust credit allocation when hidden credit risk is revealed. Using supervisory risk disclosure data from the European Central Bank’s 2014 Asset Quality Review, we find that banks experiencing larger increases in non-performing loans and provisions significantly reduce risk-weighted exposures while keeping total credit volumes largely unchanged. This suggests that de-risking primarily occurs through portfolio reallocation - particularly within portfolios - rather than through credit contraction. We document heterogeneous responses depending on the rating approach used to measure credit risk, and we show that capital constraints amplify, but are not the sole drivers of, de-risking. Finally, we provide evidence that supervisory risk disclosure plays a key role in shaping banks’ risk-taking behavior, even in the absence of observable adjustments in their financial statements.
| Keywords: | Transparency, Bank Supervision, Credit risk, Non-performing loans (NPLs) |
| JEL: | G21 G28 M48 |
| Date: | 2026 |
| URL: | https://d.repec.org/n?u=RePEc:baf:cbafwp:cbafwp26268 |
| By: | Luna Rigby; Rüdiger Frey; Erik Schlögl
| Abstract: | Model risk arises from the misspecification of probabilistic models used for pricing and hedging derivatives. While model risk for European-style claims has been widely studied, much less attention has been given to American-style derivatives and the associated optimal stopping problems. This paper analyzes model risk in the optimal exercise of an American put option using the benchmark methodology of Hull and Suo [2002]. The true data-generating process is assumed to follow a Heston stochastic volatility model. We compare the optimal exercise strategy of an investor who correctly uses the Heston model with those of investors who instead use misspecified Black--Scholes or Dupire local volatility models. Optimal exercise boundaries are computed numerically via finite difference methods. Stochastic volatility dynamics and return--volatility correlation are found to have a substantial impact on optimal exercise behavior across models, creating a source of model risk. As this behavior is not transmitted to exercise strategies determined by misspecified models, even if such models are fully calibrated to European option prices, calibration fails to mitigate model risk in this context. This issue persists under frequent recalibration of a misspecified model. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.19984 |
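The paper computes optimal exercise boundaries by finite differences under Heston and misspecified models; that machinery is not reproduced here. As a much simpler illustration of the optimal stopping step itself, the sketch below prices an American put on a CRR binomial tree under Black-Scholes dynamics, applying the early-exercise comparison at every node. Parameters are illustrative.

```python
import math

def american_put_crr(S0, K, r, sigma, T, steps=400):
    """CRR binomial tree value of an American put: backward induction with
    an early-exercise comparison max(continuation, intrinsic) at each node."""
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    disc = math.exp(-r * dt)
    p = (math.exp(r * dt) - d) / (u - d)        # risk-neutral up probability
    # terminal payoffs at nodes j = 0..steps (j up-moves)
    values = [max(K - S0 * u**j * d**(steps - j), 0.0) for j in range(steps + 1)]
    for i in range(steps - 1, -1, -1):          # roll back through the tree
        for j in range(i + 1):
            cont = disc * (p * values[j + 1] + (1 - p) * values[j])
            exercise = K - S0 * u**j * d**(i - j)
            values[j] = max(cont, exercise)     # optimal stopping decision
    return values[0]
```

The American value always dominates the European Black-Scholes put, with the difference being the early-exercise premium; under stochastic volatility the exercise boundary shifts, which is the source of model risk the paper studies.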
| By: | Daouia, Abdelaati; Hachem, Joseph; Stupfler, Gilles |
| Abstract: | A major mathematical difficulty in studying extreme value parameter estimators defined as empirical mean excesses is their reliance on high order statistics above a random threshold. Based on simple yet novel derandomization arguments, we provide sufficient conditions for deriving the joint asymptotic distribution of so-called tail empirical excesses and Expected Shortfall with the underlying threshold level. This high-level result allows for a strong degree of heterogeneity in the data-generating process as well as serial dependence. When the observations are independent and their average distribution is heavy-tailed, we obtain asymptotic normality results for the Hill estimator of the extreme value index, the Weissman estimator of extreme quantiles, and two estimators of Expected Shortfall above an extreme level, under substantially weaker, yet easily verifiable and interpretable conditions than those prevailing in the recent literature. In particular, we establish precise closed-form expressions for the asymptotic bias and variance of each estimator. Our assumptions hold in a wide range of models where existing results may not apply, including scenarios of contaminated samples, pooled samples from several populations, heterogeneous location-scale models and the situation where observed covariate information is ignored. We discuss practical consequences of our results on simulated data and two real data applications to cyber risk and financial risk management. |
| Keywords: | Derandomization; Expected Shortfall; Extreme quantile; Heavy tails; Heterogeneity; Hill estimator |
| Date: | 2026–03–17 |
| URL: | https://d.repec.org/n?u=RePEc:tse:wpaper:131598 |
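The estimators this abstract studies are classical and can be sketched directly: the Hill estimator averages log-excesses of the top k order statistics over the threshold X_(n-k), and the Weissman estimator extrapolates quantiles beyond the sample range. The first-order ES approximation q/(1-γ) for γ<1 is a standard heavy-tail identity used here for illustration, not the paper's specific estimators.

```python
import numpy as np

def hill_estimator(sample, k):
    """Hill estimator of the extreme value index gamma from the top k
    order statistics above the random threshold X_{(n-k)}."""
    x = np.sort(np.asarray(sample, float))
    top = x[-k:]                    # X_{(n-k+1)}, ..., X_{(n)}
    threshold = x[-k - 1]           # X_{(n-k)}
    return float(np.mean(np.log(top) - np.log(threshold)))

def weissman_quantile(sample, k, p):
    """Weissman extrapolation of the (1-p)-quantile beyond the sample range:
    q(p) ~ X_{(n-k)} * (k / (n p))^gamma."""
    x = np.sort(np.asarray(sample, float))
    n = len(x)
    gamma = hill_estimator(sample, k)
    return float(x[-k - 1] * (k / (n * p)) ** gamma)

def tail_expected_shortfall(sample, k, p):
    """First-order ES above the extrapolated quantile for a heavy tail with
    gamma < 1: ES ~ q / (1 - gamma)."""
    gamma = hill_estimator(sample, k)
    return weissman_quantile(sample, k, p) / (1.0 - gamma)
```

On an exact Pareto sample with P(X > x) = x^(-2) (so γ = 0.5), the Hill estimate concentrates around 0.5 and the extrapolated 10^-4 quantile around 100.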
| By: | Hanyong Cho; Geumil Bae; Jang Ho Kim |
| Abstract: | This paper investigates how large language models (LLMs) form and express investor risk profiles, a critical component of retail investment advising. We examine three LLMs (GPT, Gemini, and Llama) and assess their responses to a standardized risk questionnaire under varying prompts. In particular, we establish each model's default investment profile by analyzing repeated responses per model. We observe that LLMs are generally long-term investors but exhibit different tendencies in risk tolerance: Gemini has a moderate risk level with highly consistent responses, Llama skews more conservative, and GPT appears moderately aggressive with the greatest variation in answers. Moreover, we find that assigning specific personas such as age, wealth, and investment experience leads each LLM to adjust its risk profile, although the extent of these adjustments differs across the models.
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.09303 |
| By: | Keonvin Park |
| Abstract: | Portfolio construction traditionally relies on separately estimating expected returns and covariance matrices using historical statistics, often leading to suboptimal allocation under time-varying market conditions. This paper proposes a joint return and risk modeling framework based on deep neural networks that enables end-to-end learning of dynamic expected returns and risk structures from sequential financial data. Using daily data from ten large-cap US equities spanning 2010 to 2024, the proposed model is evaluated across return prediction, risk estimation, and portfolio-level performance. Out-of-sample results during 2020 to 2024 show that the deep forecasting model achieves competitive predictive accuracy (RMSE = 0.0264) with economically meaningful directional accuracy (51.9%). More importantly, the learned representation effectively captures volatility clustering and regime shifts. When integrated into portfolio optimization, the proposed Neural Portfolio strategy achieves an annual return of 36.4% and a Sharpe ratio of 0.91, outperforming equal weight and historical mean-variance benchmarks in terms of risk-adjusted performance. These findings demonstrate that jointly modeling return and covariance dynamics can provide consistent improvements over traditional allocation approaches. The framework offers a scalable and practical alternative for data-driven portfolio construction under nonstationary market conditions. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.19288 |
| By: | Paul P. Hager; Ulrich Horst; Thomas Wagenhofer; Wei Xu |
| Abstract: | We establish a microstructural foundation of the rough Bergomi model. Specifically, we consider a sequence of order-driven financial market models where orders to buy or sell an asset arrive according to a Poisson process and have a long-lasting impact on volatility. Using a recently established C-tightness result for càdlàg processes, we establish the weak convergence of the price-volatility process to a log-normal rough volatility model. Our weak convergence result is accompanied by weak error rates that employ a recently established Clark-Ocone formula for Poisson processes and turn our microstructure model into a viable alternative to classical simulation schemes. The weak error rates strongly hinge on Poisson arrival dynamics and are novel to the rough microstructure literature.
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.13170 |
| By: | Pierre-Emmanuel Thérond (LSAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1 - Université de Lyon, Institut de Science Financières et d'Assurance, UCBL - Université Claude Bernard Lyon 1 - Université de Lyon, Thesseract); Pierre Boutonnet (Thesseract) |
| Abstract: | In late 2025, the IASB published an Exposure Draft proposing a new Risk Mitigation Accounting (RMA) model aimed at better reflecting how financial institutions manage interest rate repricing risk on a portfolio basis. The model seeks to reduce accounting volatility arising from measurement asymmetries between derivatives measured at fair value and instruments measured at amortised cost. It introduces a portfolio-based framework built around net repricing risk exposure and the recognition of a risk mitigation adjustment in the balance sheet. The article explains the architecture of the model and discusses its potential implications for insurers, particularly in the context of IFRS 17. |
| Keywords: | IFRS, Risk management, Interest Rate, Hedging, Accounting |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:hal:journl:hal-05546812 |
| By: | Emile A. Marin; JiYong Jung (Department of Economics, University of California Davis) |
| Abstract: | We characterize the gap between the equity risk premium (ERP) and its SVIX-implied lower bound as an equilibrium object, increasing in the correlation of valuations and returns, their relative volatility, and risk aversion. Higher risk premia need not be reflected in options-implied volatility. Yet models resolving the equity premium puzzle through high risk aversion will tend to understate the lower bound and risk-neutral variance of returns. Applying our findings to an RBC economy with disasters, we consider an increase in their probability leading to a 1 p.p. rise in the ERP, but a negligible rise in the SVIX-implied bound (10 b.p.).
| Date: | 2026–03–19 |
| URL: | https://d.repec.org/n?u=RePEc:cda:wpaper:378 |
| By: | Barigou, Karim (Université catholique de Louvain, LIDAM/ISBA, Belgium); Loisel, Stéphane; Salhi, Yahia; Vigneron, Rayane |
| Abstract: | This paper proposes an online multivariate cumulative sum (MCUSUM) monitoring procedure for detecting changes in mortality dynamics, with direct applications to mortality and longevity risk management for insurers and pension funds. The method is built on Gaussian process (GP) non-parametric mortality forecasts, and performs surveillance in real time by tracking multivariate forecast errors across ages. We develop MCUSUM schemes targeting two practically relevant forms of change: (i) a change in level, corresponding to an abrupt proportional shift in mortality rates, and (ii) a change in trend, corresponding to a shift in the rate of mortality improvement. In both cases, one-sided monitoring rules allow the practitioner to focus on either adverse mortality shocks or adverse longevity developments. By explicitly exploiting dependence between age groups, the proposed multivariate approach improves detection performance relative to collections of univariate control charts. We evaluate the procedure through simulation experiments and empirical applications to recent mortality data from France, Japan, Canada, and the USA, and we further illustrate its use on a real-world life insurance portfolio. Finally, we document the impact of age-pattern changes consistent with rectangularization of mortality curves and discuss how such dynamics can affect prospective monitoring and the interpretation of detection signals. |
| Keywords: | Mortality modeling; Change-point detection; Gaussian processes; Longevity risk management; Rectangularization
| Date: | 2026–02–26 |
| URL: | https://d.repec.org/n?u=RePEc:aiz:louvad:2026004 |
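The multivariate, GP-based scheme of the paper is not reproduced here, but its building block, a one-sided CUSUM on forecast errors, is standard and easy to sketch. The recursion S_t = max(0, S_{t-1} + e_t - k) accumulates positive drift in the errors and signals when it exceeds a threshold h; the reference value k and threshold h below are illustrative.

```python
import numpy as np

def one_sided_cusum(errors, k=0.5, h=5.0):
    """One-sided (upper) CUSUM on standardized forecast errors:
    S_t = max(0, S_{t-1} + e_t - k), alarm at the first t with S_t > h.
    Returns the statistic path and the detection index (None if no alarm)."""
    s, path, alarm = 0.0, [], None
    for t, e in enumerate(errors):
        s = max(0.0, s + e - k)     # drift below k is absorbed at zero
        path.append(s)
        if alarm is None and s > h:
            alarm = t               # first threshold crossing
    return np.array(path), alarm
```

A mirrored recursion on negated errors monitors the opposite direction, matching the paper's choice of focusing on either adverse mortality shocks or adverse longevity developments.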
| By: | Jimin Lin |
| Abstract: | Option prices encode the market's collective outlook through implied density and implied volatility. An explicit link between implied density and implied volatility translates the risk-neutrality of the former into conditions on the latter that rule out static arbitrage. Despite early recognition of their parity, the two were studied in isolation for decades until recent demand in implied volatility modeling rejuvenated the connection. This paper provides a systematic approach to building neural representations of option-implied information. As a preliminary, we first revisit the explicit link between implied density and implied volatility through an alternative, minimalist lens, in which implied volatility is viewed not as a volatility but as a pointwise corrector mapping the Black-Scholes quasi-density into the implied risk-neutral density. Building on this perspective, we propose a neural representation that incorporates arbitrage constraints through the differentiable corrector. With an additive logistic model as the synthetic benchmark, extensive experiments reveal that deeper or wider network structures do not necessarily improve model performance, owing to the nonlinearity of both the arbitrage constraints and neural derivatives. By contrast, a shallow feedforward network with a single hidden layer and a specific activation effectively approximates implied density and implied volatility.
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.17151 |
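The classical link between option prices and the implied risk-neutral density that this abstract builds on is the Breeden-Litzenberger relation f(K) = e^{rT} ∂²C/∂K², recoverable by finite differences on a strike grid. The sketch below is that textbook step only, not the paper's neural corrector; the uniform strike grid and parameter values are assumptions.

```python
import math
import numpy as np

def bs_call(S, K, T, r, sigma):
    """Black-Scholes call price (used here to generate a clean price curve)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def implied_density(strikes, calls, r, T):
    """Breeden-Litzenberger: f(K) = e^{rT} * d2C/dK2 via central differences.
    Assumes a uniform strike grid; returns interior strikes and the density."""
    strikes = np.asarray(strikes, float)
    calls = np.asarray(calls, float)
    dK = strikes[1] - strikes[0]
    curv = (calls[2:] - 2.0 * calls[1:-1] + calls[:-2]) / dK**2
    return strikes[1:-1], math.exp(r * T) * curv
```

Applied to Black-Scholes prices, the recovered density matches the lognormal law of the terminal price and integrates to one, which is the no-arbitrage condition on the "quasi-density" side of the parity.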
| By: | Jean-Michel Do Carmo Silva (EESC-GEM - Grenoble Ecole de Management) |
| Abstract: | Major contemporary risks – notably climate change, cybercrime, pandemics and wars – are disrupting the traditional insurance model, which is seen as a mechanism for financing residual risks. They raise questions about their insurability and the ability of companies to maintain their economic and societal role. How do insurance law and practices address, or should they address, the role of insurance in a risk management system designed by companies to respond to the societal transitions of our century? Firstly, the restructuring of insurance law and practices concerns the balance of interests at stake. While the logic of protecting the collective interests of policyholders remains, the evolution of extreme risks reveals its limitations. A societal approach is emerging, based on risk sharing between insurers, policyholders and public authorities. Secondly, the restructuring is also technical. Traditional insurance, based on mutualisation and compensation after assessment, is proving insufficient for correlated or poorly understood risks. The rise of parametric insurance, which provides compensation based on predefined indices, is examined, including the legal issues it raises. Some players offer integrated prevention and protection services (cybersecurity, climate diagnostics), transforming insurance into a lever for resilience rather than simply outsourcing risk. |
| Keywords: | Public-private partnerships, Organizational approach to law, Parametric insurance, Transformation of the insurer's role, Major contemporary risks
| Date: | 2026–01–25 |
| URL: | https://d.repec.org/n?u=RePEc:hal:journl:hal-05543694 |
| By: | Steven Campbell; Natascha Hey; Ciamac C. Moallemi; Marcel Nutz |
| Abstract: | Auto-deleveraging (ADL) mechanisms are a critical yet understudied component of risk management on cryptocurrency futures exchanges. When available margin and other loss-absorbing resources are insufficient to cover losses following large price moves, exchanges reduce positions and socialize losses among solvent participants via rule-based ADL protocols. We formulate ADL as an optimization problem that minimizes the exchange's risk of loss arising from future equity shortfalls. In a single-asset, isolated-margin setting, we show that under a risk-neutral expected loss objective the unique optimal policy minimizes the maximum leverage among participants. The resulting design has a transparent structure: positions are reduced first for the most highly levered accounts, and leverage is progressively equalized via a water-filling (or "leverage-draining") rule. This policy is distribution-free, wash-trade resistant, Sybil resistant, and path-independent. It provides a canonical and implementable benchmark for ADL design and clarifies the economic logic underlying queue-based mechanisms used in practice. We further study the multi-asset, cross-margin setting, where the ADL problem becomes genuinely multi-dimensional: the exchange must allocate a vector of required reductions across accounts with portfolios exposed to correlated price moves. We show that under an expected-loss objective the problem remains separable across accounts after introducing asset-level shadow prices, yielding a scalable numerical method. We observe that naive gross leverage can be misleading in this context as it ignores hedging within portfolios. When asset prices are driven by a single dominant risk factor, the optimal policy again takes a water-filling form, but now in a factor-adjusted notion of leverage, so that more effectively hedged portfolios are deleveraged less aggressively.
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.15963 |
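The single-asset water-filling rule the abstract describes has a compact numerical form: find the leverage "water level" L such that trimming every account with leverage above L down to L removes exactly the required notional, i.e. sum_i max(0, n_i - L e_i) = R. The bisection sketch below is an illustration of that rule, not the paper's implementation; variable names are assumptions.

```python
import numpy as np

def adl_water_fill(notional, equity, reduction):
    """Water-filling ADL for the single-asset, isolated-margin case:
    solve sum_i max(0, n_i - L * e_i) = reduction for the leverage level L
    by bisection (the left-hand side is continuous and decreasing in L),
    then cap each account's notional at L * e_i."""
    n = np.asarray(notional, float)
    e = np.asarray(equity, float)
    assert 0.0 <= reduction <= n.sum()

    def removed(L):
        return np.maximum(0.0, n - L * e).sum()

    lo, hi = 0.0, float((n / e).max())      # bracket: all removed vs none removed
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if removed(mid) > reduction:
            lo = mid                         # level too low: trims too much
        else:
            hi = mid
    L = 0.5 * (lo + hi)
    return np.minimum(n, L * e)              # post-ADL notionals
```

Only accounts above the water level are touched, the most levered first, and surviving leverages are equalized at L, matching the "leverage-draining" description.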
| By: | Bo Pieter Johannes Andrée
| Abstract: | Range-based volatility estimators are widely used in financial econometrics to quantify risk and market stress, yet their application to local commodity markets remains limited. This paper shows how open-high-low-close (OHLC) volatility estimators can be adapted to monitor localized market distress across diverse development contexts, including conflict-affected settings, climate-exposed regions, remote and thinly traded markets, and import- and logistics-constrained urban hubs. Using monthly food price data from the World Bank's Real-Time Prices dataset, several volatility measures, including the Parkinson, Garman-Klass, Rogers-Satchell, and Yang-Zhang estimators, are constructed and evaluated against independently documented disruption timelines. Across settings, elevated volatility aligns with episodes linked to insecurity and market fragmentation, extreme weather and disaster shocks, policy and fuel-cost adjustments, and global supply-chain and trade disruptions. Volatility also detects stress that standard momentum indicators such as the relative strength index (RSI) can miss, including symmetric or rapidly reversing shocks in which offsetting supply and demand disturbances dampen net directional price movements while amplifying intra-period dispersion. Overall, OHLC-based volatility indicators provide a robust and interpretable signal of market disruptions and complement price-level monitoring for applications spanning financial risk, humanitarian early warning, and trade.
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.02898 |
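Three of the range-based estimators named in the abstract above have standard closed forms, sketched here as per-bar variances of log prices averaged over the sample (the Yang-Zhang estimator, which additionally combines an overnight term with the Rogers-Satchell term, is omitted for brevity):

```python
import numpy as np

def parkinson(high, low):
    """Parkinson estimator: uses only the high-low range; assumes zero drift."""
    hl = np.log(np.asarray(high, float) / np.asarray(low, float))
    return float(np.mean(hl**2) / (4.0 * np.log(2.0)))

def garman_klass(open_, high, low, close):
    """Garman-Klass estimator: adds open/close information to the range,
    reducing estimator variance relative to Parkinson."""
    hl = np.log(np.asarray(high, float) / np.asarray(low, float))
    co = np.log(np.asarray(close, float) / np.asarray(open_, float))
    return float(np.mean(0.5 * hl**2 - (2.0 * np.log(2.0) - 1.0) * co**2))

def rogers_satchell(open_, high, low, close):
    """Rogers-Satchell estimator: drift-independent, so it stays unbiased
    for trending series where Parkinson and Garman-Klass overstate variance."""
    o, h, l, c = (np.asarray(x, float) for x in (open_, high, low, close))
    return float(np.mean(np.log(h / c) * np.log(h / o)
                         + np.log(l / c) * np.log(l / o)))
```

Applied to the paper's setting, each bar would be one month of food prices (monthly open, high, low, close), and the volatility signal is the square root of the estimated variance.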
| By: | Zhang, Yuliang |
| Abstract: | I introduce an index that measures the vulnerability of the banking sector from a systemic risk perspective. It is expressed in terms of the size-weighted leverage and the illiquidity-weighted Herfindahl–Hirschman Index. The empirical implementation is demonstrated using balance sheet data from U.S. bank holding companies during 2001–2024 and national banks during the Great Depression. The index can be used to monitor financial instability, activate macroprudential capital buffers, and analyse historical banking crises. |
| Keywords: | measurement; financial vulnerability; macroprudential policy; banking crises |
| JEL: | F3 G3 |
| Date: | 2026–03–10 |
| URL: | https://d.repec.org/n?u=RePEc:ehl:lserod:137224 |
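The two components named in the abstract above can be sketched directly from balance sheet data. The exact way the paper combines them into a single index is not given in the abstract, so only the components are illustrated; the illiquidity weighting shown is an assumed form and the function names are hypothetical.

```python
import numpy as np

def size_weighted_leverage(assets, equity):
    """Average leverage (assets / equity) across banks, with each bank
    weighted by its share of total system assets."""
    a = np.asarray(assets, float)
    e = np.asarray(equity, float)
    w = a / a.sum()                      # asset-share weights
    return float((w * (a / e)).sum())

def illiquidity_weighted_hhi(assets, illiquid_share):
    """Herfindahl-Hirschman concentration of assets, with each bank's
    squared asset share scaled by the illiquid fraction of its balance
    sheet (an assumed weighting, for illustration only)."""
    a = np.asarray(assets, float)
    s = a / a.sum()                      # asset shares
    q = np.asarray(illiquid_share, float)
    return float((q * s**2).sum())
```

For four equally sized banks with 10x leverage and fully illiquid balance sheets, the size-weighted leverage is 10 and the weighted HHI equals the unweighted HHI of 0.25.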
| By: | Osberghaus, Alex; Schepens, Glenn |
| Abstract: | Banks use synthetic risk transfers (SRTs) to offload potential losses in their loan portfolios to non-bank investors while retaining the loans on their balance sheets. We investigate this trillion-euro market using transaction-level data from the euro area, the largest SRT market, and highlight three channels of potential risks to financial stability. First, we show that banks synthetically transfer loans that are capital-expensive relative to their riskiness. To establish causality, we exploit a regulation that causes a jump in the risk weights of loans without affecting their riskiness. As banks redeploy the freed capital, their loan portfolios become riskier relative to their capitalization. Second, after entering an SRT, banks reduce their monitoring efforts compared to other banks lending to the same firm. The reduction in monitoring is greater the larger the share of their firm exposure that banks synthetically transfer. Third, banks and non-bank investors are interconnected. Banks are more likely to sell SRTs to investors with whom they already have a credit relationship. |
| JEL: | G20 G21 G28 |
| Keywords: | bank monitoring, capital regulation, financial stability, securitisation |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:ecb:ecbwps:20263210 |
| By: | Benjamin Köhler; Anton J. Heckens; Thomas Guhr |
| Abstract: | Extreme values and the tail behavior of probability distributions are essential for quantifying and mitigating risk in complex systems of all kinds. In multivariate settings, accounting for correlations is crucial. Although extreme value analysis for infinitely large correlated systems remains an open challenge, we propose a practical framework for handling a large but finite number of correlated time series. We develop our approach for finance as a concrete example but emphasize its generality. We study the extremal behavior of high-frequency stock returns after rotating them into the eigenbasis of the correlation matrix. This separates and extracts various collective effects, including information on the correlated market as a whole and on correlated sectoral behavior from idiosyncratic features, while allowing us to use univariate tools of extreme value analysis. This holds even for high-frequency data where discretization effects normally complicate analysis. We employ a peaks-over-threshold approach and thereby fully avoid the analysis of block maxima. We estimate the tail shape of the rotated returns while explicitly accounting for nonstationarity, a key feature in finance and many other complex systems. Our framework facilitates tail risk estimation relative to larger trends and intraday seasonalities at both market and sectoral levels. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.05260 |
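The pipeline described in the abstract above (rotate the return panel into the eigenbasis of its correlation matrix, then run a univariate peaks-over-threshold analysis on each rotated series) can be illustrated in a minimal static sketch; the paper's explicit treatment of nonstationarity and intraday seasonality is omitted here, and the function name and threshold choice are illustrative assumptions.

```python
import numpy as np
from scipy.stats import genpareto

def rotate_and_pot(returns, quantile=0.95):
    """Rotate a (T x N) return panel into the eigenbasis of its
    correlation matrix, then fit a generalized Pareto distribution to
    peaks-over-threshold exceedances of each rotated series. Returns
    the estimated tail-shape parameter per eigen-direction."""
    X = np.asarray(returns, float)
    X = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize columns
    corr = np.corrcoef(X, rowvar=False)
    _, eigvecs = np.linalg.eigh(corr)
    rotated = X @ eigvecs                        # decorrelated directions
    shapes = []
    for series in rotated.T:
        u = np.quantile(series, quantile)        # POT threshold
        exceed = series[series > u] - u          # exceedances over u
        xi, _, _ = genpareto.fit(exceed, floc=0.0)  # fix location at 0
        shapes.append(xi)
    return np.array(shapes)
```

A positive fitted shape parameter in a given eigen-direction signals a heavy tail for that collective mode (market-wide or sectoral), while values near zero indicate light, exponential-type tails.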
| By: | Saeed Asadi; Jonathan Yu-Meng Li |
| Abstract: | We propose Generative Adversarial Regression (GAR), a framework for learning conditional risk scenarios through generators aligned with downstream risk objectives. GAR builds on a regression characterization of conditional risk for elicitable functionals, including quantiles, expectiles, and jointly elicitable pairs. We extend this principle from point prediction to generative modeling by training generators whose policy-induced risk matches that of real data under the same context. To ensure robustness across all policies, GAR adopts a minimax formulation in which an adversarial policy identifies worst-case discrepancies in risk evaluation while the generator adapts to eliminate them. This structure preserves alignment with the risk functional across a broad class of policies rather than a fixed, pre-specified set. We illustrate GAR through a tail-risk instantiation based on jointly elicitable $(\mathrm{VaR}, \mathrm{ES})$ objectives. Experiments on S&P 500 data show that GAR produces scenarios that better preserve downstream risk than unconditional, econometric, and direct predictive baselines while remaining stable under adversarially selected policies. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.08553 |
| By: | Yining Ding; Ruyi Liu; Marek Rutkowski |
| Abstract: | The role of collateral in derivative pricing has evolved beyond credit risk mitigation, particularly following the global financial crisis, when funding costs and basis spreads became central to valuation practices. This development coincided with the transition from the London Interbank Offered Rate (LIBOR) to risk-free rates (RFRs) and the increasing standardization of collateralised trading. We study the valuation and hedging of a class of differential swaps referencing backward-looking averages of overnight rates, with SOFR swaps appearing as a particular instance. The focus is on the impact of the collateral currency. Extending earlier results of Ding et al. [Math. Finance 36 (2026), pp. 180–202], we allow the collateral account to be denominated in a currency different from that of the contractual cash flows and derive explicit pricing and hedging strategies using a futures-based replication approach. We show that the choice of collateral currency can have a non-trivial effect on both valuation and risk management. In particular, foreign-currency collateral can introduce additional risk exposures even when contractual cash flows are entirely denominated in the domestic currency. A numerical study demonstrates that collateral effects can lead to significant valuation adjustments and therefore need to be properly incorporated in modern multi-currency modelling frameworks. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.07863 |
| By: | Jean-Loup Dupret; Edouard Motte |
| Abstract: | In life insurance, life tables are used to estimate the survival distribution of individuals from a given population. However, these tables only provide survival probabilities at integer ages but no information about the distribution of deaths between two consecutive integer values. Many actuarial quantities, such as variable annuities, are functionals of the lifetime and computing them requires full information about mortality rates. One frequent solution is to postulate fractional age assumptions or mortality rate models, but it turns out that the results of the computations strongly depend on these assumptions, which makes it difficult to generalize them. We hence derive upper and lower bounds of functionals of the lifetime with respect to mortality rates, which are compatible with the observed life table at integer ages. We derive two sets of results under distinct assumptions. In the first, we assume that each mortality trajectory is almost surely consistent with all the given one-year survival probabilities from the table. In the second, we consider a relaxed formulation that allows for deviations of the mortality rates while still being consistent in expectation with the given one-year reference survival probabilities. These distinct yet complementary approaches provide a new robust framework for managing mortality risk in life insurance. They characterize the worst- and best-case contract values over all mortality processes that remain compatible with the observed life-table information, thereby enabling insurers to quantify the impact on prices of deviations of the observed mortality rates from their mortality assumptions/models. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.06238 |
| By: | Jakub Ryłow (Faculty of Economic Sciences, University of Warsaw) |
| Abstract: | This article examines the intellectual connections between the works of Kenneth Arrow, Alan Garber, and Peter Zweifel in the context of the economics of medicine, with particular emphasis on uncertainty, decision-making, and risk. It argues that health economics should be understood not merely as an applied field, but as a domain in which economic theory, mathematical economics, and actuarial science intersect. Arrow's analysis establishes uncertainty and information asymmetry as structural features of medical care markets, challenging standard welfare-theoretic assumptions. Garber extends this insight by formalizing medical decision-making through cost-effectiveness analysis and decision theory, translating clinical uncertainty into economically tractable choice problems. Zweifel's contributions to actuarial risk theory provide the mathematical foundations for health insurance systems, enabling the aggregation, pricing, and long-term management of medical risk. Taken together, these approaches demonstrate that modern health economics relies on mathematical formalization not only to model behavior, but to sustain the institutional viability of health care systems through insurance, risk pooling, and intertemporal solvency. The economics of medicine thus emerges as a field in which mathematical economics and actuarial methods are indispensable for understanding how societies manage health-related uncertainty. |
| Keywords: | health economics, cost-effectiveness analysis, QALY, willingness to pay, health insurance, decision theory, actuarial science, medical uncertainty, risk pooling |
| JEL: | D81 I11 I13 C61 |
| Date: | 2026 |
| URL: | https://d.repec.org/n?u=RePEc:war:wpaper:2026-6 |
| By: | Antoine Mandel; Vipin P. Veetil |
| Abstract: | We study how idiosyncratic firm-level shocks generate aggregate volatility and tail risk when they propagate through a production network under overlapping adjustment: new productivity draws arrive before the economy reaches the static equilibrium associated with earlier draws. Each innovation generates a 'productivity wave' that mixes and dissipates over time as it travels through the production network. Macroeconomic fluctuations emerge from the interference between these waves of different vintages. This interference is governed by the dominant transient eigenvalue of the production network, and therefore so are the macroeconomic fluctuations it generates. In such a dynamic regime, the tail of the degree distribution is a markedly weaker determinant of macro fluctuations than in the fully adjusted static benchmark. And the macroeconomic significance of the degree-heterogeneity of production networks cannot be known without knowing the rate at which the economy converges to equilibrium or, equivalently, the spectral properties of the production network. More concretely, once we permit the time-averaging of shocks, granular shocks may account for only a small fraction of the empirically observed aggregate volatility. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.05367 |
| By: | Vespignani, Joaquin (Tasmanian School of Business & Economics, University of Tasmania); Smyth, Russell (Department of Economics, Monash University, Clayton, Australia); Saadaoui, Jamel (University Paris 8, IEE, LED, Saint-Denis, France); Wang, Yitian (Department of Economics, Monash University, Clayton, Australia) |
| Abstract: | We develop novel, stage-specific, geopolitical risk indicators to examine how geopolitical risk is distributed across the supply chain for lithium and copper, two minerals which are vital for low-carbon technologies. We find that refining is the geopolitical bottleneck for both minerals, reflecting that refining capacity is highly concentrated in China. We examine refining diversification, strategic stockpiling, and AI-driven productivity gains as complementary policy instruments for mitigating exposure to geopolitical risk at the refining stage. We show that reducing China’s refining share substantially lowers refining-stage geopolitical risk, with larger gains for lithium than for copper. We find that stockpiling plays a critical role in buffering near-term geopolitical shocks, but significantly increases the projected shortfall in copper and lithium which is needed to realize the clean energy transition under alternative Net Zero pathways. We demonstrate that AI-driven productivity gains will be needed to narrow the projected supply gaps for both minerals. Our results suggest that ensuring effective security of critical minerals requires a coordinated policy mix, combining refining diversification, strategic stockpiling, and productivity-enhancing technological change. |
| Keywords: | Critical Minerals; Copper; Lithium; Geopolitical Risk; Refining Bottlenecks |
| JEL: | C14 Q20 Q41 Q43 |
| Date: | 2026 |
| URL: | https://d.repec.org/n?u=RePEc:tas:wpaper:62814634 |
| By: | Marilyn Pease; Mark Whitmeyer |
| Abstract: | We provide a unifying way to analyze how risk aversion changes bidding in auctions by asking which bids become more attractive as bidders become more risk averse. In first-price auctions, under two payoff conditions (winning is never worse than the outside option, and winning with a low bid is preferable to winning only with a high bid), greater risk aversion makes high bids more appealing. In second-price auctions with a known outside option, bidding more increases risk exposure conditional on winning, so greater risk aversion favors lower bids. We show these bid-level forces translate into corresponding equilibrium comparative statics. |
| Date: | 2026–03 |
| URL: | https://d.repec.org/n?u=RePEc:arx:papers:2603.09683 |