nep-rmg New Economics Papers
on Risk Management
Issue of 2021‒04‒19
28 papers chosen by
Stan Miles
Thompson Rivers University

  1. An empirical foundation for calibrating the G-SIB surcharge By Alexander Jiron; Wayne Passmore; Aurite Werman
  2. Polynomial series expansions and moment approximations for conditional mean risk sharing of insurance losses By Denuit, Michel; Robert, Christian Y.
  3. Modelling uncertainty in financial tail risk: a forecasting combination and weighted quantile approach By Giuseppe Storti; Chao Wang
  4. Quantifying firm-level economic systemic risk from nation-wide supply networks By Christian Diem; András Borsos; Tobias Reisch; János Kertész; Stefan Thurner
  5. Macroeconomic Effects of Global Policy and Financial Risks By OGAWA Eiji; Pengfei LUO
  6. Impact of rough stochastic volatility models on long-term life insurance pricing By Dupret, Jean-Loup; Barbarin, Jérôme; Hainaut, Donatien
  7. Deep Hedging under Rough Volatility By Blanka Horvath; Josef Teichmann; Zan Zuric
  8. Optimal Surplus-dependent Reinsurance under Regime-Switching in a Brownian Risk Model By Eisenberg, Julia; Fabrykowski, Lukas; Schmeck, Maren Diane
  9. Risk sharing under the dominant peer-to-peer property and casualty insurance business models By Denuit, Michel; Robert, Christian Y.
  10. Nonstationary Portfolios: Diversification in the Spectral Domain By Bruno Scalzo; Alvaro Arroyo; Ljubisa Stankovic; Danilo P. Mandic
  11. Black-box model risk in finance By Samuel N. Cohen; Derek Snow; Lukasz Szpruch
  12. Analysis of bank leverage via dynamical systems and deep neural networks By Fabrizio Lillo; Giulia Livieri; Stefano Marmi; Anton Solomko; Sandro Vaienti
  13. Time-Consistent Evaluation of Credit Risk with Contagion By Ketelbuters, John John; Hainaut, Donatien
  14. Computation of the marginal contribution of Sharpe ratio and other performance ratios By Eric Benhamou; Beatrice Guez
  15. How risky is Monetary Policy? The Effect of Monetary Policy on Systemic Risk in the Euro Area By Leitner, Georg; Hübel, Teresa; Wolfmayr, Anna; Zerobin, Manuel
  16. Frequency-Dependent Higher Moment Risks By Jozef Barunik; Josef Kurka
  17. Analysis of optimal portfolio on finite and small time horizons for a stochastic volatility market model By Minglian Lin; Indranil SenGupta
  18. Frequency-Dependent Higher Moment Risks By Jozef Barunik; Josef Kurka
  19. Boosting cost-complexity pruned trees On Tweedie responses: the ABT machine By Trufin, Julien; Denuit, Michel
  20. The Efficient Hedging Frontier with Deep Neural Networks By Zheng Gong; Carmine Ventre; John O'Hara
  21. Risk Sharing and the Demand for Insurance: Theory and Experimental Evidence from Ethiopia By Erlend Berg; Michael Blake; Karlijn Morsink
  22. How risky is Monetary Policy? The Effect of Monetary Policy on Systemic Risk in the Euro Area By Georg Leitner; Teresa Hübel; Anna Wolfmayr; Manuel Zerobin
  23. COVID-19 as a Stress Test: Assessing the Bank Regulatory Framework By Alice Abboud; Elizabeth Duncan; Akos Horvath; Diana A. Iercosan; Bert Loudis; Francis Martinez; Timothy Mooney; Benjamin Ranish; Ke Wang; Missaka Warusawitharana; Carlo Wix
  24. The VAR at Risk By Alfred Galichon
  25. "We're rolling". Our Uncertainty Perception Indicator (UPI) in Q4 2020: introducing RollingLDA, a new method for the measurement of evolving economic narratives By Müller, Henrik; Rieger, Jonas; Hornig, Nico
  26. Loss of structural balance in stock markets By E. Ferreira; S. Orbe; J. Ascorbebeitia; B. Álvarez Pereira; E. Estrada
  27. CDS Pricing with Fractional Hawkes Processes By Ketelbuters, John John; Hainaut, Donatien
  28. Enabling Machine Learning Algorithms for Credit Scoring -- Explainable Artificial Intelligence (XAI) methods for clear understanding complex predictive models By Przemysław Biecek; Marcin Chlebus; Janusz Gajda; Alicja Gosiewska; Anna Kozak; Dominik Ogonowski; Jakub Sztachelski; Piotr Wojewnik

  1. By: Alexander Jiron; Wayne Passmore; Aurite Werman
    Abstract: As developed by the BCBS, the expected impact framework is the theoretical foundation for calibrating the capital surcharge applied to global systemically important banks (G-SIB surcharge). This paper describes four improvements to the current implementation of the BCBS expected impact framework. We (i) introduce a theoretically sound and empirically grounded approach to estimating a probability of default (PD) function; (ii) apply density-based cluster analysis to identify the reference bank for each G-SIB indicator; (iii) recalibrate the systemic loss-given-default (LGD) function that determines G-SIB scores, using both the current system based on supervisory judgment and an alternative system based on CoVaR; and (iv) derive a continuous capital surcharge function to determine G-SIB capital surcharges. Our approach would strengthen the empirical and theoretical foundation of the G-SIB surcharge framework. Moreover, the continuous surcharge function would reduce banks' incentive to manage their balance sheets to reduce systemic capital surcharges, mitigate cliff effects, allow for the lifting of the cap on the substitutability score and penalise growth in the category for all G-SIBs. In addition, our two capital surcharge functions might be used to monitor G-SIBs' capital adequacy and distortions induced by G-SIB surcharges.
    Keywords: investment funds, herding, bank regulation, leverage ratio, social welfare
    JEL: G21 G23 G28 D62
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:bis:biswps:935&r=all
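    The expected impact logic behind the surcharge calibration described above can be summarised by an equal-expected-impact condition; the notation below is a schematic sketch under the usual reading of the BCBS framework, not the authors' exact specification:
    \[
    \mathrm{PD}(k_r)\,\mathrm{LGD}_r \;=\; \mathrm{PD}(k_r + s_i)\,\mathrm{LGD}_i ,
    \]
    where k_r is the reference bank's capital ratio, LGD_r and LGD_i are the systemic loss-given-default of the reference bank and of G-SIB i (increasing in its score), and the surcharge s_i is the additional capital that equalises the two expected impacts. Making PD and LGD continuous functions of capital and score is what yields the continuous surcharge function discussed in the abstract.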
  2. By: Denuit, Michel (Université catholique de Louvain, LIDAM/ISBA, Belgium); Robert, Christian Y. (ENSAE)
    Abstract: This paper exploits the representation of the conditional mean risk sharing allocations in terms of size-biased transforms to derive effective approximations within insurance pools of limited size. Precisely, the probability density functions involved in this representation are expanded with respect to the Gamma density and its associated Laguerre orthonormal polynomials, or with respect to the Normal density and its associated Hermite polynomials when the size of the pool gets larger. Depending on the thickness of the tails of the loss distributions, the latter may be replaced with their Esscher transform (or exponential tilting) of negative order. The numerical method then consists in truncating the series expansions to a limited number of terms. This results in an approximation in terms of the first moments of the individual loss distributions. Compound Panjer-Katz sums are considered as an application. The proposed method is compared with the well-established Panjer recursive algorithm. It appears to provide the analyst with reliable approximations that can be used to tune system parameters, before performing exact calculations.
    Keywords: conditional expectation ; size-biased transform ; Esscher transform ; exponential tilting ; Laguerre polynomials ; Hermite polynomials
    Date: 2021–03–16
    URL: http://d.repec.org/n?u=RePEc:aiz:louvad:2021016&r=all
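    As background for the representation exploited above, the conditional mean risk sharing rule and its size-biased form can be written as follows (a sketch of the standard definitions, not of the paper's expansions themselves):
    \[
    h_i(s) \;=\; \mathbb{E}\left[X_i \mid S = s\right], \qquad S = \sum_{j=1}^n X_j ,
    \]
    and, in terms of the size-biased transform \tilde{X}_i of X_i (with density proportional to x f_{X_i}(x)),
    \[
    h_i(s) \;=\; \mathbb{E}[X_i]\,\frac{f_{\tilde{S}_i}(s)}{f_S(s)}, \qquad \tilde{S}_i = \tilde{X}_i + \sum_{j \neq i} X_j .
    \]
    The series expansions in the paper approximate the densities f_{\tilde{S}_i} and f_S through Gamma-Laguerre or Normal-Hermite expansions, so that h_i(s) is expressed in terms of the first moments of the individual losses.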
  3. By: Giuseppe Storti; Chao Wang
    Abstract: A novel forecasting combination and weighted quantile based tail risk forecasting framework is proposed, aiming to reduce the impact of modelling uncertainty in financial tail risk forecasting. The proposed approach is based on a two-step estimation procedure. The first step involves the combination of Value-at-Risk (VaR) forecasts at a grid of different quantile levels. A range of parametric and semi-parametric models is selected as the model universe, which is incorporated in the forecasting combination procedure. The quantile forecasting combination weights are estimated by optimizing the quantile loss. In the second step, the Expected Shortfall (ES) is computed as a weighted average of combined quantiles. The quantile weighting structure used to generate the ES forecast is determined by minimizing a strictly consistent joint VaR and ES loss function of the Fissler-Ziegel class. The proposed framework is applied to six stock market indices and its forecasting performance is compared to that of each individual model in the model universe and to a simple average approach. The forecasting results based on a number of evaluations support the proposed framework.
    Date: 2021–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2104.04918&r=all
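    A minimal sketch of the first, quantile-combination step described above, in which weights over candidate VaR forecasts are chosen by minimising the pinball (quantile) loss; the toy data, the two candidate models, the quantile grid and the crude ES step are illustrative assumptions, not the paper's estimation procedure.

      import numpy as np
      from scipy.optimize import minimize

      def pinball(y, q, alpha):
          """Quantile (pinball) loss of quantile forecasts q for returns y at level alpha."""
          u = y - q
          return np.mean(u * (alpha - (u < 0)))

      # toy returns and VaR forecasts from two candidate models at several quantile levels
      rng = np.random.default_rng(0)
      y = rng.standard_t(df=5, size=1000) * 0.01
      alphas = np.array([0.01, 0.025, 0.05])
      var_m1 = np.quantile(y, alphas)[None, :] * np.ones((1000, 1))   # candidate model 1
      var_m2 = 1.2 * var_m1                                           # candidate model 2 (biased)

      def combined_loss(w):
          w = np.abs(w) / np.abs(w).sum()            # weights restricted to the simplex
          loss = 0.0
          for j, a in enumerate(alphas):
              q = w[0] * var_m1[:, j] + w[1] * var_m2[:, j]
              loss += pinball(y, q, a)
          return loss

      w_hat = minimize(combined_loss, x0=np.array([0.5, 0.5]), method="Nelder-Mead").x
      w_hat = np.abs(w_hat) / np.abs(w_hat).sum()

      # second step, heavily simplified: ES approximated by averaging the combined quantiles
      q_comb = w_hat[0] * var_m1 + w_hat[1] * var_m2
      es_proxy = q_comb.mean(axis=1)
      print(w_hat, es_proxy[:3])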
  4. By: Christian Diem; András Borsos; Tobias Reisch; János Kertész; Stefan Thurner
    Abstract: Crises like COVID-19 or the Japanese earthquake in 2011 exposed the fragility of corporate supply networks. The production of goods and services is a highly interdependent process and can be severely impacted by the default of critical suppliers or customers. While knowing the impact of individual companies on national economies is a prerequisite for efficient risk management, the quantitative assessment of the economic systemic risks (ESR) involved has hitherto been practically non-existent, mainly because of a lack of fine-grained data combined with coherent methods. Based on a unique value-added tax dataset, we derive the detailed production network of an entire country and present a novel approach for computing the ESR of all individual firms. We demonstrate that a tiny fraction (0.035%) of companies has extraordinarily high systemic risk, impacting about 23% of national economic production should any of them default. Firm size alone cannot explain the ESR of individual companies; their position in the production network matters substantially. If companies are ranked according to their economic systemic risk index (ESRI), firms with a rank above a characteristic value have very similar ESRI values, while for the rest the rank distribution of ESRI decays slowly as a power law; 99.8% of all companies have an impact on less than 1% of the economy. We show that the assessment of ESR is impossible with aggregate data as used in traditional Input-Output Economics. We discuss how simple policies of introducing supply chain redundancies can reduce the ESR of some extremely risky companies.
    Date: 2021–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2104.07260&r=all
  5. By: OGAWA Eiji; Pengfei LUO
    Abstract: Globalization has brought larger spillovers of global risks across borders since the 2000s. Specifically, global policy risk has increased sharply in the past decade owing to policy uncertainty in major countries, as seen in Brexit, US-China trade friction, and the COVID-19 pandemic. This paper empirically investigates the effects of both global policy risk and global financial risk on the macroeconomy and financial markets in eight major countries from January 1997 to June 2020, using a Vector Autoregressive (VAR) framework. First, global risks have recessionary effects on the macroeconomy, reducing production, worsening employment, lowering long-term interest rates, depressing prices, and reducing global trade. Second, global risks also have recessionary effects on financial markets, reducing stock prices, appreciating safe-haven currencies, and depreciating the other currencies. Third, macroeconomies and financial markets respond more significantly to global financial risk than to global policy risk. Fourth, the recessionary effects of global risks vary across countries.
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:eti:dpaper:21020&r=all
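    The VAR exercise described above can be set up in a few lines; the sketch below uses statsmodels with placeholder data and an arbitrary variable ordering, so it illustrates the mechanics rather than the paper's specification.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.api import VAR

      # placeholder monthly series: a global risk index plus domestic macro/financial variables
      rng = np.random.default_rng(1)
      data = pd.DataFrame(
          rng.standard_normal((282, 4)),
          columns=["global_risk", "industrial_production", "long_rate", "stock_returns"],
      )

      model = VAR(data)
      results = model.fit(maxlags=6, ic="aic")   # lag length chosen by information criterion
      irf = results.irf(24)                      # impulse responses over a 24-month horizon
      # orthogonalised responses of every variable to a one-s.d. shock to the risk index
      print(irf.orth_irfs[:, :, 0].shape)        # (25, 4): horizon x responding variable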
  6. By: Dupret, Jean-Loup (Université catholique de Louvain, LIDAM/ISBA, Belgium); Barbarin, Jérôme (Université catholique de Louvain, LIDAM/ISBA, Belgium); Hainaut, Donatien (Université catholique de Louvain, LIDAM/ISBA, Belgium)
    Abstract: The Rough Fractional Stochastic Volatility (RFSV) model of Gatheral et al. [18] is remarkably consistent with financial time series data as well as with the observed implied volatility surface. Two tractable implementations are derived from the RFSV: the rBergomi model of Bayer et al. [3] and the rough Heston model of El Euch et al. [13]. We show practically how to calibrate these two rough-type models and how they can price long-term equity-linked life insurance claims. In this way, we analyze their long-term properties more closely and compare them with standard stochastic volatility models such as the Heston and Bates models. For the rough Heston model, we build a highly consistent calibration and pricing methodology based on a long-term stationary regime for the volatility, which ensures a reasonable behavior of the model in the long run. The rBergomi model, by contrast, does not admit a stationary volatility process and hence does not exhibit realistic volatility paths for large maturities. We also show that the rBergomi model is not fast enough for calibration purposes, unlike the rough Heston model, which is highly tractable. Compared to standard stochastic volatility models, the rough Heston model thus efficiently provides a more accurate fair value of long-term life insurance contracts embedding path-dependent options, while being highly consistent with historical and risk-neutral data.
    Keywords: Rough Volatility ; Volatility modeling ; Equity-linked endowment valuation ; Stationary regime ; Long-term option pricing ; Fractional Brownian motion
    Date: 2021–01–01
    URL: http://d.repec.org/n?u=RePEc:aiz:louvad:2021017&r=all
  7. By: Blanka Horvath; Josef Teichmann; Zan Zuric
    Abstract: We investigate the performance of the Deep Hedging framework under training paths beyond the (finite-dimensional) Markovian setup. In particular, we analyse the hedging performance of the original architecture under rough volatility models, with a view to existing theoretical results for those models. Furthermore, we suggest parsimonious but suitable network architectures capable of capturing the non-Markovianity of time series. Finally, we analyse the hedging behaviour in these models in terms of P&L distributions and draw comparisons to jump diffusion models when the rebalancing frequency is realistically small.
    Date: 2021–02
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2102.01962&r=all
  8. By: Eisenberg, Julia (Center for Mathematical Economics, Bielefeld University); Fabrykowski, Lukas (Center for Mathematical Economics, Bielefeld University); Schmeck, Maren Diane (Center for Mathematical Economics, Bielefeld University)
    Abstract: In this paper, we consider a company that wishes to determine the optimal reinsurance strategy minimising the total expected discounted amount of capital injections needed to prevent ruin. The company's surplus process is assumed to follow a Brownian motion with drift, and the reinsurance price is modelled by a continuous-time Markov chain with two states. The presence of regime-switching substantially complicates the optimal reinsurance problem, as surplus-independent strategies turn out to be suboptimal. We develop a recursive approach that allows us to represent a solution to the corresponding Hamilton-Jacobi-Bellman equation, and the corresponding reinsurance strategy, as the unique limits of a sequence of solutions to ordinary differential equations and their first- and second-order derivatives. Via Itô's formula, we prove that the constructed function is the value function. Two examples illustrate the recursive procedure along with a numerical approach yielding the direct solution to the HJB equation.
    Keywords: Reinsurance, Regime-switching, Brownian motion, Markov chain, Optimal control, HJB equation, Ordinary differential equations, Boundary value problem
    Date: 2021–04–06
    URL: http://d.repec.org/n?u=RePEc:bie:wpaper:648&r=all
  9. By: Denuit, Michel (Université catholique de Louvain, LIDAM/ISBA, Belgium); Robert, Christian Y. (ENSAE)
    Abstract: This paper aims to formalize the three business models dominating peer-to-peer (P2P) property and casualty insurance: the self-governing model, the broker model and the carrier model. The first develops outside the insurance market, whereas the latter two may originate from the insurance industry, by partnering with an existing company or by issuing a new generation of participating insurance policies in which part of the risk is shared within a community and higher losses, exceeding the community's risk-bearing capacity, are covered by an insurance or reinsurance company. The present paper proposes an actuarial model based on conditional mean risk sharing to support the development of this new P2P insurance offer under each of the three business models. In addition, several specific questions are addressed in the self-governing model. Considering an economic agent who has to select the optimal pool for a risk to be shared with other participants, it is shown that uniform comparison of the Lorenz or concentration curves associated with the respective total losses of the pools under consideration allows the agent to decide which pool is preferable. The monotonicity of the respective contributions of the participants is established with respect to the convex order, showing that increasing the number of participants is always beneficial under conditional mean risk sharing.
    Keywords: Risk pooling ; conditional mean risk sharing ; Lorenz curve ; concentration curve ; convex order
    Date: 2021–01–08
    URL: http://d.repec.org/n?u=RePEc:aiz:louvad:2021001&r=all
  10. By: Bruno Scalzo; Alvaro Arroyo; Ljubisa Stankovic; Danilo P. Mandic
    Abstract: Classical portfolio optimization methods typically determine an optimal capital allocation through the implicit, yet critical, assumption of statistical time-invariance. Such models are inadequate for real-world markets as they employ standard time-averaging based estimators which suffer significant information loss if the market observables are non-stationary. To this end, we reformulate the portfolio optimization problem in the spectral domain to cater for the nonstationarity inherent to asset price movements and, in this way, allow for optimal capital allocations to be time-varying. Unlike existing spectral portfolio techniques, the proposed framework employs augmented complex statistics in order to exploit the interactions between the real and imaginary parts of the complex spectral variables, which in turn allows for the modelling of both harmonics and cyclostationarity in the time domain. The advantages of the proposed framework over traditional methods are demonstrated through numerical simulations using real-world price data.
    Date: 2021–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2102.00477&r=all
  11. By: Samuel N. Cohen; Derek Snow; Lukasz Szpruch
    Abstract: Machine learning models are increasingly used in a wide variety of financial settings. The difficulty of understanding the inner workings of these systems, combined with their wide applicability, has the potential to lead to significant new risks for users; these risks need to be understood and quantified. In this sub-chapter, we focus on a well-studied application of machine learning techniques, the pricing and hedging of financial options. Our aim is to highlight the various sources of risk that the introduction of machine learning emphasises or de-emphasises, and the possible risk mitigation and management strategies that are available.
    Date: 2021–02
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2102.04757&r=all
  12. By: Fabrizio Lillo; Giulia Livieri; Stefano Marmi; Anton Solomko; Sandro Vaienti
    Abstract: We consider a model of a simple financial system consisting of a leveraged investor that invests in a risky asset and manages risk by using Value-at-Risk (VaR). The VaR is estimated from past data via an adaptive expectation scheme. We show that the leverage dynamics can be described by a dynamical system of slow-fast type associated with a unimodal map on [0,1] with an additive heteroscedastic noise whose variance is related to the portfolio rebalancing frequency to target leverage. In the absence of noise the model is purely deterministic and the parameter space splits into two regions: (i) a region with a globally attracting fixed point or a 2-cycle; (ii) a dynamical core region, where the map can exhibit chaotic behavior. Whenever the model is randomly perturbed, we prove the existence of a unique stationary density with bounded variation, the stochastic stability of the process and the almost certain existence and continuity of the Lyapunov exponent for the stationary measure. We then use deep neural networks to estimate the map parameters from a short time series. Using this method, we estimate the model on a large dataset of US commercial banks over the period 2001-2014. We find that the parameters of a substantial fraction of banks lie in the dynamical core, and their leverage time series are consistent with chaotic behavior. We also present evidence that the leverage time series of large banks tend to exhibit chaoticity more frequently than those of small banks.
    Date: 2021–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2104.04960&r=all
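    For concreteness, the "adaptive expectation scheme" for VaR mentioned above can be illustrated with a standard exponentially weighted variance recursion; the decay parameter, the Gaussian quantile and the leverage rule below are generic stand-ins, not the model estimated in the paper.

      import numpy as np

      def ewma_var_and_leverage(returns, lam=0.94, z=2.326, capital=1.0):
          """Adaptive (EWMA) variance estimate, relative VaR and the implied leverage target."""
          sigma2 = np.var(returns[:20])            # initialise with a short presample
          var_path, lev_path = [], []
          for r in returns[20:]:
              sigma2 = lam * sigma2 + (1.0 - lam) * r ** 2   # adaptive expectation update
              var = z * np.sqrt(sigma2)                      # approx. 99% Gaussian VaR
              var_path.append(var)
              lev_path.append(capital / var)                 # leverage target proportional to 1/VaR
          return np.array(var_path), np.array(lev_path)

      rng = np.random.default_rng(2)
      var_series, leverage_series = ewma_var_and_leverage(rng.standard_normal(500) * 0.01)
      print(var_series[-1], leverage_series[-1])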
  13. By: Ketelbuters, John John (Université catholique de Louvain, LIDAM/ISBA, Belgium); Hainaut, Donatien (Université catholique de Louvain, LIDAM/ISBA, Belgium)
    Abstract: A time-consistent evaluation is a dynamic pricing method according to which a risk that will almost surely be cheaper than another one at a future date should already be cheaper today. Common actuarial pricing approaches are usually not time-consistent. Pelsser and Ghalehjooghi (2016) derived time-consistent valuation principles from time-inconsistent ones. The aim of this paper is twofold. Firstly, we propose a model for credit insurance portfolios that takes into account contagion risk via self-exciting jump processes. Secondly, we extend the approach of Pelsser and Ghalehjooghi to credit insurance in this framework. Starting from classical time-inconsistent actuarial pricing methods, we derive partial integro-differential equations (PIDEs) for their time-consistent counterparts. We discuss numerical methods for solving these PIDEs and their results. We draw two conclusions from these results. On the one hand, we show that working with time-consistent evaluations in the absence of contagion risk does not make a significant difference compared to time-inconsistent evaluations. On the other hand, our results show that the time-consistency of evaluations allows the risk of contagion in credit insurance, if such a risk exists, to be better taken into account.
    Keywords: Credit risk ; Self-exciting processes ; Time-consistency
    Date: 2021–01–01
    URL: http://d.repec.org/n?u=RePEc:aiz:louvad:2021004&r=all
  14. By: Eric Benhamou (MILES - Machine Intelligence and Learning Systems - LAMSADE - Laboratoire d'analyse et modélisation de systèmes pour l'aide à la décision - Université Paris Dauphine-PSL - PSL - Université Paris sciences et lettres - CNRS - Centre National de la Recherche Scientifique, LAMSADE - Laboratoire d'analyse et modélisation de systèmes pour l'aide à la décision - Université Paris Dauphine-PSL - PSL - Université Paris sciences et lettres - CNRS - Centre National de la Recherche Scientifique); Beatrice Guez
    Abstract: Computing the incremental contribution of performance ratios such as the Sharpe, Treynor, Calmar or Sterling ratios is of paramount importance for asset managers. Leveraging Euler's homogeneous function theorem, we prove that these performance ratios are indeed linear combinations of individual modified performance ratios. This allows us not only to derive a condition for a new asset to provide incremental performance for the portfolio, but also to identify the key drivers of these performance ratios. We provide various numerical examples of this performance ratio decomposition.
    Keywords: portfolio analysis, recovery and incremental Sharpe ratio, Treynor, Sharpe, marginal contribution
    Date: 2021–04–03
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-03189299&r=all
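    The role of Euler's homogeneous function theorem in the decomposition above can be recalled explicitly for the Sharpe ratio (generic notation, not the paper's):
    \[
    S(w) \;=\; \frac{w^{\top}\mu}{\sqrt{w^{\top}\Sigma w}}
    \]
    is positively homogeneous of degree 0 in the portfolio weights, so Euler's theorem gives
    \[
    \sum_{i} w_i\,\frac{\partial S}{\partial w_i}(w) \;=\; 0 .
    \]
    Ratios built as a homogeneous numerator over a homogeneous denominator of the same degree share this property, which is what lets the marginal contribution of each asset be rewritten through modified, per-asset ratios as the abstract describes.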
  15. By: Leitner, Georg; Hübel, Teresa; Wolfmayr, Anna; Zerobin, Manuel
    Abstract: This paper empirically investigates the effect of monetary policy on systemic risk within the Euro area. We estimate a Bayesian proxy-VAR in which we exploit high-frequency identified monetary policy surprises for identification. Employing aggregate as well as market-specific systemic risk measures, we provide novel evidence on the heterogeneous risk transmission of conventional and unconventional monetary policy across different financial markets. We find that expansionary conventional monetary policy, near-term guidance and forward guidance decrease systemic risk, whereas quantitative easing (QE) increases systemic risk. While the effects are qualitatively homogeneous for near-term guidance and forward guidance, there is heterogeneity in the risk transmission of conventional monetary policy and QE across different financial markets. QE increases systemic risk significantly within bond markets, foreign exchange markets and among financial intermediaries. This might be caused by increased search-for-yield behaviour, as QE distinctly reduces longer-term interest rates. Our analysis shows that QE poses a potential threat to financial stability, which should concern monetary and macroprudential policymakers.
    Keywords: Monetary Policy, CISS, Systemic Risk, Bayesian-Proxy-VAR, High-Frequency Identification
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:wiw:wus005:8062&r=all
  16. By: Jozef Barunik (Institute of Economic Studies, Faculty of Social Sciences, Charles University, Prague, Czech Republic & Institute of Information Theory and Automation, Academy of Sciences of the Czech Republic); Josef Kurka (Institute of Economic Studies, Faculty of Social Sciences, Charles University, Prague, Czech Republic & Institute of Information Theory and Automation, Academy of Sciences of the Czech Republic)
    Abstract: Based on intraday data for a large cross-section of individual stocks and exchange traded funds, we show that short-term as well as long-term fluctuations of realized market and average idiosyncratic higher moment risks are priced in the cross-section of asset returns. Specifically, we find that market and average idiosyncratic volatility and kurtosis are significantly priced by investors mainly in the long run, even when controlling for market moments and other factors, while skewness is mostly a short-run phenomenon. A conditional pricing model capturing the time variation of moments confirms a downward-sloping term structure of skewness risk and an upward-sloping term structure of kurtosis risk; moreover, the term structures connected to market skewness risk and average idiosyncratic skewness risk exhibit different dynamics.
    Keywords: Higher Moments, frequency, Spectral Analysis, Cross-sectional
    JEL: C14 C22 G11 G12
    Date: 2021–04
    URL: http://d.repec.org/n?u=RePEc:fau:wpaper:wp2021_11&r=all
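    The realized higher moments referred to above are typically built from intraday returns with estimators of the following kind (in the spirit of Amaya et al., 2015); the exact estimators and sampling frequency used in the paper may differ, so treat this as a generic sketch.

      import numpy as np

      def realized_moments(intraday_returns):
          """Daily realized variance, skewness and kurtosis from intraday returns."""
          r = np.asarray(intraday_returns)
          n = r.size
          rv = np.sum(r ** 2)                              # realized variance
          rskew = np.sqrt(n) * np.sum(r ** 3) / rv ** 1.5  # realized skewness
          rkurt = n * np.sum(r ** 4) / rv ** 2             # realized kurtosis
          return rv, rskew, rkurt

      rng = np.random.default_rng(3)
      five_min_returns = rng.standard_t(df=4, size=78) * 1e-3   # one trading day on a 5-minute grid
      print(realized_moments(five_min_returns))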
  17. By: Minglian Lin; Indranil SenGupta
    Abstract: In this paper, we consider the portfolio optimization problem in a financial market under a general utility function. Empirical results suggest that if a significant market fluctuation occurs, invested wealth tends to change notably from its current value. We consider an incomplete stochastic volatility market model that is driven by both a Brownian motion and a jump process. First, we obtain a closed-form formula for an approximation to the optimal portfolio over a small time horizon. This is obtained by finding the associated Hamilton-Jacobi-Bellman integro-differential equation and then approximating the value function by constructing appropriate super- and sub-solutions. It is shown that the true value function is sandwiched between the constructed super-solution and sub-solution. We also prove the accuracy of the approximation formulas. Finally, we provide a procedure for generating a close-to-optimal portfolio over a finite time horizon.
    Date: 2021–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2104.06293&r=all
  18. By: Jozef Barunik; Josef Kurka
    Abstract: Based on intraday data for a large cross-section of individual stocks and exchange traded funds, we show that short-term as well as long-term fluctuations of realized market and average idiosyncratic higher moment risks are priced in the cross-section of asset returns. Specifically, we find that market and average idiosyncratic volatility and kurtosis are significantly priced by investors mainly in the long run, even when controlling for market moments and other factors, while skewness is mostly a short-run phenomenon. A conditional pricing model capturing the time variation of moments confirms a downward-sloping term structure of skewness risk and an upward-sloping term structure of kurtosis risk; moreover, the term structures connected to market skewness risk and average idiosyncratic skewness risk exhibit different dynamics.
    Date: 2021–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2104.04264&r=all
  19. By: Trufin, Julien (Université Libre de Bruxelles); Denuit, Michel (Université catholique de Louvain, LIDAM/ISBA, Belgium)
    Abstract: This paper proposes a new boosting machine based on forward stagewise additive modeling with cost-complexity pruned trees. In the Tweedie case, it deals directly with observed responses, not gradients of the loss function. Trees included in the score progressively reduce to the root-node tree, in an adaptive way. The proposed Adaptive Boosting Tree (ABT) machine thus stops automatically at that point, avoiding the need to resort to the time-consuming cross-validation approach. A case study performed on motor third-party liability insurance claim data demonstrates the performance of the proposed ABT machine for ratemaking, in comparison with regular gradient boosting trees.
    Keywords: Risk classification ; Boosting ; Gradient Boosting ; Regression Trees ; Cost-complexity pruning
    Date: 2021–03–09
    URL: http://d.repec.org/n?u=RePEc:aiz:louvad:2021015&r=all
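    The building blocks of the ABT machine, forward stagewise additive fitting and cost-complexity pruned trees, can be sketched with scikit-learn as below; for simplicity the toy loop fits pruned trees to squared-error residuals, whereas the ABT of the paper works directly on observed Tweedie responses with a deviance loss, so this is a structural illustration only.

      import numpy as np
      from sklearn.tree import DecisionTreeRegressor

      rng = np.random.default_rng(4)
      X = rng.standard_normal((2000, 5))
      y = np.exp(0.5 * X[:, 0] - 0.3 * X[:, 1]) * rng.gamma(shape=2.0, scale=0.5, size=2000)

      prediction = np.full_like(y, y.mean())
      trees = []
      for m in range(50):
          residual = y - prediction                                  # working response (squared-error stand-in)
          tree = DecisionTreeRegressor(max_depth=4, ccp_alpha=1e-3)  # cost-complexity pruning
          tree.fit(X, residual)
          trees.append(tree)
          prediction += 0.1 * tree.predict(X)                        # shrunken stagewise update
          if tree.get_n_leaves() == 1:                               # tree pruned back to the root node:
              break                                                  # adaptive stopping, as in the ABT idea

      print(len(trees), np.mean((y - prediction) ** 2))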
  20. By: Zheng Gong; Carmine Ventre; John O'Hara
    Abstract: The trade-off between risks and returns gives rise to multi-criteria optimisation problems that are well understood in finance, efficient frontiers being the tool to navigate their set of optimal solutions. Motivated by recent advances in the use of deep neural networks for hedging vanilla options when markets have frictions, we introduce the Efficient Hedging Frontier (EHF) by enriching the pipeline with a filtering step that allows costs and risks to be traded off. In this way, a trader's risk preference is matched with an expected hedging cost on the frontier, and the corresponding hedging strategy can be computed with a deep neural network. We further develop our framework to improve the EHF and find better hedging strategies. By adding a random forest classifier to the pipeline to forecast market movements, we show how the frontier shifts towards lower costs and reduced risks, which indicates that the overall hedging performance has improved. In addition, by designing a new recurrent neural network, we also find strategies on the frontier where hedging costs are even lower.
    Date: 2021–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2104.05280&r=all
  21. By: Erlend Berg; Michael Blake; Karlijn Morsink
    Abstract: Households in developing countries commonly engage in risk sharing to cope with shocks. Despite this, the residual risk they remain exposed to - often due to aggregate events such as droughts and floods - is considerable. To mitigate these risks, governments, NGOs and multilateral organizations have introduced index insurance. To appreciate its welfare implications, however, we need to assess how insurance interacts with pre-existing risk sharing. We ask to what extent the demand for index insurance - as compared to standard indemnity insurance - depends on the level of pre-existing risk sharing. We contribute by developing a simple theoretical framework which shows that, relative to a state of autarky, risk sharing between agents increases demand for index insurance and decreases demand for indemnity insurance. In an artefactual field experiment with Ethiopian farmers who share risk in real life, we test and confirm these predictions.
    Date: 2021–04–09
    URL: http://d.repec.org/n?u=RePEc:bri:uobdis:21/742&r=all
  22. By: Georg Leitner (Department of Economics, Vienna University of Economics and Business); Teresa Hübel (Department of Economics, Vienna University of Economics and Business); Anna Wolfmayr (Department of Economics, Vienna University of Economics and Business); Manuel Zerobin (Department of Economics, Vienna University of Economics and Business)
    Abstract: This paper empirically investigates the effect of monetary policy on systemic risk within the Euro area. We estimate a Bayesian proxy-VAR in which we exploit high-frequency identified monetary policy surprises for identification. Employing aggregate as well as market-specific systemic risk measures, we provide novel evidence on the heterogeneous risk transmission of conventional and unconventional monetary policy across different financial markets. We find that expansionary conventional monetary policy, near-term guidance and forward guidance decrease systemic risk, whereas quantitative easing (QE) increases systemic risk. While the effects are qualitatively homogeneous for near-term guidance and forward guidance, there is heterogeneity in the risk transmission of conventional monetary policy and QE across different financial markets. QE increases systemic risk significantly within bond markets, foreign exchange markets and among financial intermediaries. This might be caused by increased search-for-yield behaviour, as QE distinctly reduces longer-term interest rates. Our analysis shows that QE poses a potential threat to financial stability, which should concern monetary and macroprudential policymakers.
    Keywords: Monetary Policy, CISS, Systemic Risk, Bayesian-Proxy-VAR, High-Frequency Identification
    JEL: C32 E44 E52 G10
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:wiw:wiwwuw:wuwp312&r=all
  23. By: Alice Abboud; Elizabeth Duncan; Akos Horvath; Diana A. Iercosan; Bert Loudis; Francis Martinez; Timothy Mooney; Benjamin Ranish; Ke Wang; Missaka Warusawitharana; Carlo Wix
    Abstract: The widespread economic damage caused by the ongoing COVID-19 pandemic poses the first major test of the bank regulatory reforms put in place following the global financial crisis. This study assesses this framework, with an emphasis on capital and liquidity requirements. Leading up to the COVID-19 crisis, banks were well-capitalized and held ample liquid assets, reflecting in part heightened requirements. Capital requirements were comparable across major jurisdictions, despite differences in the implementation of the international Basel standards. The overall robust capital and liquidity levels resulted in a resilient banking system, which maintained lending through the early stages of the pandemic. Furthermore, trading activity was a source of strength for banks, reflecting in part a prudent regulatory approach. Areas for potential improvement include addressing the cyclicality of requirements.
    Keywords: Bank capital; Banking regulation; Liquidity
    Date: 2021–04–06
    URL: http://d.repec.org/n?u=RePEc:fip:fedgfe:2021-24&r=all
  24. By: Alfred Galichon
    Abstract: I show that the structure of the firm is not neutral with respect to regulatory capital budgeted under rules based on Value-at-Risk.
    Date: 2021–02
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2102.02577&r=all
  25. By: Müller, Henrik; Rieger, Jonas; Hornig, Nico
    Abstract: In this paper, we present a new dynamic topic modeling method for building stable models and consistent time series. We call this new method RollingLDA. It has the potential to overcome several difficulties that researchers who use unsupervised probabilistic topic models have grappled with: namely, the problem of arbitrary selection, which is aggravated when models are to be updated with new sequences of data. RollingLDA is derived by combining the LDAPrototype approach (Rieger, Jentsch and Rahnenführer, 2020) with an implementation that uses preceding LDA results as an initialization for subsequent quarters, while allowing topics to change over time. Squaring dual-process theory, employed in Behavioral Economics (Kahneman, 2011), with the evolving theory of Economic Narratives (Shiller, 2017), we apply RollingLDA to the measurement of economic uncertainty. The new version of our Uncertainty Perception Indicator (UPI), based on a corpus of 2.8 million German newspaper articles published between 1 January 2001 and 31 December 2020, indeed proves capable of detecting an uncertainty narrative. The narrative, derived from a thorough quantitative-qualitative analysis of a key topic of our model, can be interpreted as collective memory of past uncertainty shocks, their causes and the societal reactions to them. The uncertainty narrative can be seen as a collective intangible cultural asset (Haskel and Westlake, 2017), accumulated in the past, informing the present and potentially the future, as the story is being updated and partly overwritten by new experiences. This concept opens up a fascinating new field for future research. We encourage researchers to use our data and are happy to share it on request.
    Keywords: Uncertainty, Narratives, Latent Dirichlet Allocation, Business Cycles, Covid-19, Text Mining, Computational Methods, Behavioral Economics
    Date: 2021
    URL: http://d.repec.org/n?u=RePEc:zbw:docmaw:6&r=all
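    The rolling-update idea can be imitated with a generic LDA implementation, as in the gensim sketch below, which fits a model on one quarter of toy documents and then updates it with the next quarter; this is only an analogy and does not reproduce the LDAPrototype selection step or the memory mechanism of the authors' RollingLDA.

      from gensim import corpora
      from gensim.models import LdaModel

      # placeholder corpora: tokenised articles grouped by quarter
      quarters = [
          [["uncertainty", "pandemic", "lockdown"], ["export", "supply", "chain"]],
          [["vaccine", "recovery", "inflation"], ["uncertainty", "debt", "fiscal"]],
      ]

      # fixed vocabulary for simplicity; RollingLDA instead lets the vocabulary evolve over time
      dictionary = corpora.Dictionary([doc for q in quarters for doc in q])
      bow_q1 = [dictionary.doc2bow(doc) for doc in quarters[0]]
      lda = LdaModel(bow_q1, id2word=dictionary, num_topics=2, passes=10, random_state=0)

      # roll forward: update the fitted model with the next quarter's documents
      bow_q2 = [dictionary.doc2bow(doc) for doc in quarters[1]]
      lda.update(bow_q2)

      for topic_id, words in lda.show_topics(num_topics=2, num_words=3, formatted=False):
          print(topic_id, [w for w, _ in words])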
  26. By: E. Ferreira (Department of Quantitative Methods, University of the Basque Country UPV/EHU); S. Orbe (Department of Quantitative Methods, University of the Basque Country UPV/EHU); J. Ascorbebeitia (Department of Economic Analysis, University of the Basque Country UPV/EHU); B. Álvarez Pereira (Nova School of Business and Economics); E. Estrada (Institute of Mathematics and Applications, University of Zaragoza, ARAID Foundation. Institute for Cross-Disciplinary Physics and Complex Systems)
    Abstract: We use rank correlations as distance functions to establish the interconnectivity between stock returns, building weighted signed networks for the stocks of seven European countries, the US and Japan. We establish the theoretical relationship between the level of balance in a network and stock predictability, studying its evolution from 2005 to the third quarter of 2020. We find a clear balance-unbalance transition for six of the nine countries, following the August 2011 Black Monday in the US, when the Economic Policy Uncertainty index for that country reached its highest monthly level before the COVID-19 crisis. This sudden loss of balance is mainly caused by a reorganization of the market networks triggered by a group of low-capitalization stocks belonging to the non-financial sector. After the transition, the stocks of companies in these groups all become negatively correlated with one another and with most of the rest of the stocks in the market. The implied change in network topology is directly related to a decrease in stock predictability, a finding with novel and important implications for asset allocation and portfolio hedging strategies.
    Date: 2021–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2104.06254&r=all
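    One common way to quantify the level of balance of a signed correlation network, in the spirit of the paper above, is a walk-based balance index; the sketch below builds a signed network from Spearman rank correlations of placeholder returns and computes that index. The rank correlation, the absence of thresholding and the specific index are assumptions for illustration, not necessarily the authors' exact construction.

      import numpy as np
      from scipy.stats import spearmanr
      from scipy.linalg import expm

      rng = np.random.default_rng(5)
      returns = rng.standard_normal((250, 10))      # placeholder daily returns for 10 stocks

      rho, _ = spearmanr(returns)                   # Spearman rank correlation matrix
      A = np.array(rho, dtype=float)
      np.fill_diagonal(A, 0.0)                      # signed, weighted adjacency matrix

      # walk-based balance index: Tr(exp(A)) / Tr(exp(|A|)); equals 1 for a fully balanced network
      K = np.trace(expm(A)) / np.trace(expm(np.abs(A)))
      print(round(float(K), 4))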
  27. By: Ketelbuters, John John (Université catholique de Louvain, LIDAM/ISBA, Belgium); Hainaut, Donatien (Université catholique de Louvain, LIDAM/ISBA, Belgium)
    Abstract: We propose a fractional self-exciting model for the risk of corporate default. We study the properties of a time-changed version of an intensity-based model. As a time change, we use the inverse of an α-stable subordinator. Performing such a time change makes it possible to incorporate two particular features into the survival probability curves implied by the model. Firstly, it introduces random periods of time during which the survival probability is frozen, thereby modeling periods in which the viability of the company is not threatened. Secondly, the time change implies possible sharp drops in the survival probability. This feature corresponds to the occurrence of one-time events that threaten the creditworthiness of the company. We show that the joint probability density function and Laplace transform of the time-changed intensity and associated compensator are solutions of fractional Fokker-Planck equations. After a discussion of the approximation of Caputo fractional derivatives, we describe a simple and fast numerical method for solving the Fokker-Planck equation of the Laplace transform. This Laplace transform is used to obtain the survival probabilities implied by our model. Finally, we use our results to calibrate the model to real market data and show that it leads to an improved fit.
    Keywords: Finance ; Credit risk ; Caputo derivatives ; Self-exciting processes
    Date: 2021–01–01
    URL: http://d.repec.org/n?u=RePEc:aiz:louvad:2021018&r=all
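    Since the abstract above turns on the approximation of Caputo fractional derivatives, it may help to recall the definition and one standard discretisation (the paper may use a different scheme). For 0 < \alpha < 1,
    \[
    {}^{C}D^{\alpha}_{t} f(t) \;=\; \frac{1}{\Gamma(1-\alpha)} \int_0^t \frac{f'(s)}{(t-s)^{\alpha}}\, ds ,
    \]
    and on a uniform grid t_n = n\Delta t the classical L1 approximation reads
    \[
    {}^{C}D^{\alpha}_{t} f(t_n) \;\approx\; \frac{\Delta t^{-\alpha}}{\Gamma(2-\alpha)} \sum_{k=0}^{n-1} \left[(k+1)^{1-\alpha} - k^{1-\alpha}\right] \left(f(t_{n-k}) - f(t_{n-k-1})\right).
    \]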
  28. By: Przemysław Biecek; Marcin Chlebus; Janusz Gajda; Alicja Gosiewska; Anna Kozak; Dominik Ogonowski; Jakub Sztachelski; Piotr Wojewnik
    Abstract: The rapid development of advanced modelling techniques makes it possible to build ever more accurate tools. As usual, however, this comes at a price: a model loses interpretability as it gains accuracy and precision. For managers who must control and effectively manage credit risk, and for regulators who must be convinced of model quality, that price is too high. In this paper, we show how to take credit scoring analytics to the next level. We compare various predictive models (logistic regression, logistic regression with weight-of-evidence transformations, and modern artificial intelligence algorithms) and show that advanced tree-based models give the best results in predicting client default. More importantly, we also show how to augment advanced models with techniques that make them interpretable and more accessible to credit risk practitioners, resolving the crucial obstacle to the widespread deployment of more complex 'black box' models such as random forests, gradient boosted trees and extreme gradient boosted trees. All of this is demonstrated on a large dataset obtained from the Polish Credit Bureau, to which all banks and most lending companies in the country report credit files; in this paper, the data from lending companies are used. The paper then compares state-of-the-art best practices in credit risk modelling with modern statistical tools supported by the latest developments in the interpretability and explainability of artificial intelligence algorithms. We believe this is a valuable contribution both in presenting different modelling tools and, more importantly, in showing which methods can be used to gain insight into and understanding of AI methods in a credit risk context.
    Date: 2021–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2104.06735&r=all
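    As a flavour of the post-hoc explanation techniques discussed above, the sketch below fits a gradient boosted classifier to toy default data and extracts SHAP attributions; the data, the model and the choice of SHAP are illustrative assumptions, not the specific pipeline or the full set of XAI methods evaluated in the paper.

      import numpy as np
      from sklearn.ensemble import GradientBoostingClassifier
      from sklearn.model_selection import train_test_split
      import shap

      # toy credit data: four anonymous applicant features and a simulated default indicator
      rng = np.random.default_rng(6)
      X = rng.standard_normal((5000, 4))
      logit = -1.0 - 0.5 * X[:, 0] + 0.8 * X[:, 1] + 1.2 * X[:, 2]
      y = (rng.random(5000) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      model = GradientBoostingClassifier(n_estimators=200, max_depth=3).fit(X_tr, y_tr)

      # per-applicant additive attributions: which feature pushed the default score up or down
      explainer = shap.TreeExplainer(model)
      shap_values = explainer.shap_values(X_te[:5])
      print(np.round(shap_values, 3))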

This nep-rmg issue is ©2021 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.