nep-rmg New Economics Papers
on Risk Management
Issue of 2022‒07‒18
sixteen papers chosen by
Stan Miles
Thompson Rivers University

  1. Risk Management and Internal Control System of Deposit Insurers By International Association of Deposit Insurers
  2. Exploring Corporate Crisis Communication after COVID-19: The Role of Enterprise Risk Management in (Re)Building Trust By Chiara Mio; Marco Fasan; Carlo Marcon; Silvia Panfilo
  3. Value-at-Risk (VAR) Estimation Methods: Empirical Analysis based on BRICS Markets By Ben Salem, Ameni; Safer, Imene; Khefacha, Islem
  4. Some Optimisation Problems in Insurance with a Terminal Distribution Constraint By Katia Colaneri; Julia Eisenberg; Benedetta Salterini
  5. $\Delta$-CoES By Aleksy Leeuwenkamp
  6. Risk-sharing Rules and their properties with applications to peer-to-peer insurance By Michel Denuit; Jan Dhaene; Christian Y Robert
  7. Policy Uncertainty and Stock Market Volatility Revisited: The Predictive Role of Signal Quality By Afees A. Salisu; Riza Demirer; Rangan Gupta
  8. Stock Market Bubbles and the Forecastability of Gold Returns (and Volatility) By David Gabauer; Rangan Gupta; Sayar Karmakar; Joshua Nielsen
  9. A Network Perspective in Supply Chain Risk Management By Bier, Tobias
  10. AdaVol: An Adaptive Recursive Volatility Prediction Method By Nicklas Werge; Olivier Wintenberger
  11. Risk Pooling and Precautionary Saving in Village Economies By Marcel Fafchamps; Aditya Shrinivas
  12. RMT-Net: Reject-aware Multi-Task Network for Modeling Missing-not-at-random Data in Financial Credit Scoring By Qiang Liu; Yingtao Luo; Shu Wu; Zhen Zhang; Xiangnan Yue; Hong Jin; Liang Wang
  13. The End of the Crypto-Diversification Myth By Luciano Somoza; Antoine Didisheim
  14. Ensemble distributional forecasting for insurance loss reserving By Benjamin Avanzi; Yanfeng Li; Bernard Wong; Alan Xian
  15. Latin American Falls, Rebounds and Tail Risks By Luciano Campos; Danilo Leiva-León; Steven Zapata-Álvarez
  16. The Fairness of Machine Learning in Insurance: New Rags for an Old Man? By Laurence Barry; Arthur Charpentier

  1. By: International Association of Deposit Insurers
    Abstract: The paper provides a starting point for further research involving an in-depth analysis of specific technical aspects of the risk management framework and internal control system. Handbooks or 'How to Apply' tools may also be useful to IADI Members. Finally, the Risk Management and Internal Controls Technical Committee deems the Guidance Points suitable for potential integration into the IADI Core Principles.
    Keywords: deposit insurance, bank resolution
    JEL: G21 G33
    Date: 2020–12
    URL: http://d.repec.org/n?u=RePEc:awl:guipap:2012&r=
  2. By: Chiara Mio (Dept. of Management, Università Ca' Foscari Venice); Marco Fasan (Dept. of Management, Università Ca' Foscari Venice); Carlo Marcon (Dept. of Management, Università Ca' Foscari Venice); Silvia Panfilo (Dept. of Management, Università Ca' Foscari Venice)
    Abstract: This study investigates whether Enterprise Risk Management (ERM) sophistication shaped different COVID-19 crisis communication strategies. We assess the level of ERM sophistication of the FTSE-MIB Italian listed companies, and we study the pattern of risk communication strategies building on Situational Crisis Communication Theory (SCCT). We find that companies with a low level of ERM sophistication generally adopt a crisis communication strategy based on a “denying/diminish” approach. In contrast, companies with higher ERM sophistication adopt a “diminish/rebuild” strategy. Our results extend previous literature on crisis communication by looking at the unique case of COVID-19, a non-company-specific crisis that hit all firms. The results show that companies' crisis communication strategies depend on prior risk management characteristics. Thus, companies seeking to protect their reputation and (re)build public trust in the wake of a crisis should invest not only in risk communication but also in their risk management process.
    Keywords: COVID-19, Enterprise Risk Management, Situational Crisis Communication Theory, Risk communication, Crisis communication strategies, Corporate reputation.
    JEL: M14 G3
    Date: 2022–05
    URL: http://d.repec.org/n?u=RePEc:vnm:wpdman:191&r=
  3. By: Ben Salem, Ameni; Safer, Imene; Khefacha, Islem
    Abstract: The purpose of this paper is to investigate several statistical methods to estimate the Value-at-Risk (VaR) of stock returns in the BRICS countries for the period from 2011 to 2018. Four different methods are used to estimate VaR: Historical Simulation (HS), RiskMetrics, the Historical Method and the Generalized Autoregressive Conditional Heteroscedasticity (GARCH) process. By applying backtesting, we assess the effectiveness of these different methods by comparing the calculated VaR with the realized losses (or gains) of the portfolio or index. The results show that, for all the BRICS countries and at different confidence levels, the Historical Method and Historical Simulation are the appropriate methods, while the GARCH model fails to predict the VaR precisely for all BRICS countries.
    Keywords: Value-at-Risk, BRICS, Riskmetrics, Historical Simulation, GARCH, Historical Method, Backtesting, Confidence level.
    JEL: C15 C52 C58
    Date: 2022–02
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:113350&r=
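    A minimal sketch of the historical-simulation VaR and backtesting steps described above, in Python; the 250-day window, the 1% level and the simulated returns (used in place of the BRICS index data) are illustrative assumptions:
      import numpy as np

      # Hypothetical sketch: one-day 99% VaR by historical simulation, backtested
      # by counting violations (days on which the realized loss exceeds the VaR).
      rng = np.random.default_rng(0)
      returns = rng.standard_t(df=5, size=2000) * 0.01   # stand-in for daily index returns

      window, alpha = 250, 0.01
      violations = 0
      for t in range(window, len(returns)):
          hist = returns[t - window:t]
          var_t = -np.quantile(hist, alpha)              # historical-simulation VaR (a loss level)
          if -returns[t] > var_t:                        # realized loss exceeds the VaR -> violation
              violations += 1

      n_tests = len(returns) - window
      print(f"violation rate: {violations / n_tests:.3%} (expected close to {alpha:.0%})")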
  4. By: Katia Colaneri; Julia Eisenberg; Benedetta Salterini
    Abstract: In this paper, we study two optimisation settings for an insurance company, under the constraint that the terminal surplus at a deterministic and finite time $T$ follows a normal distribution with a given mean and a given variance. In both cases, the surplus of the insurance company is assumed to follow a Brownian motion with drift. First, we allow the insurance company to pay dividends and seek to maximise the expected discounted dividend payments or to minimise the ruin probability under the terminal distribution constraint. Here, we find explicit expressions for the optimal strategies in both cases, in discrete and continuous time settings. Second, we let the insurance company buy a reinsurance contract for a pool of insured or a branch of business. To achieve a certain level of sustainability (i.e. the collected premia should be sufficient to buy reinsurance and to pay the occurring claims), the initial capital is set to zero. We only allow for piecewise constant reinsurance strategies producing a normally distributed terminal surplus, whose mean and variance lead to a given Value at Risk or Expected Shortfall at some confidence level $\alpha$. We investigate which admissible reinsurance strategy produces a smaller ruin probability when the ruin checks occur at discrete deterministic points in time.
    Date: 2022–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2206.04680&r=
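    The discrete-time ruin checks in the second setting can be illustrated with a short Monte Carlo sketch; the drift, volatility, initial surplus and check dates below are illustrative assumptions, not values from the paper:
      import numpy as np

      # Hypothetical sketch: estimate the ruin probability of a Brownian surplus with
      # drift when ruin is only checked at discrete, deterministic dates.
      rng = np.random.default_rng(1)
      x0, mu, sigma, T = 2.0, 0.5, 1.0, 5.0        # initial surplus, drift, volatility, horizon
      check_dates = np.linspace(0.5, T, 10)        # dates at which ruin is checked
      n_paths = 100_000

      dt = np.diff(np.concatenate(([0.0], check_dates)))
      increments = rng.normal(mu * dt, sigma * np.sqrt(dt), size=(n_paths, len(dt)))
      surplus = x0 + np.cumsum(increments, axis=1)          # surplus at each check date
      ruin_prob = np.mean((surplus < 0).any(axis=1))
      print(f"estimated discrete-time ruin probability: {ruin_prob:.4f}")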
  5. By: Aleksy Leeuwenkamp
    Abstract: In this paper, the $\Delta$-CoVaR method is extended, in both the conditional and unconditional cases, to be based on the Expected Shortfall (ES), using quantile regression with a more expansive distress definition. We find the resulting $\Delta$-CoES measure to be complementary to the $\Delta$-CoVaR and, owing to its lower robustness, more effective than the $\Delta$-CoVaR in measuring short-term changes in systemic risk and in identifying heterogeneity in the systemic risk contributions of financial institutions and in the linkages between institutions. For regulators, risk managers and market participants, these properties are economically valuable whenever the increased sensitivity and heterogeneity of the $\Delta$-CoES is needed to set short-term capital requirements or risk limits, to identify problematic financial linkages and institutions, or to build an early-warning system for emerging systemic risk. Lastly, the $\Delta$-CoES is straightforward to estimate and would fit within recent regulatory frameworks such as the FRTB. To show the statistical advantages and properties empirically, the $\Delta$-CoVaR and $\Delta$-CoES methods are applied, in both a system-wide and a network fashion, to a large sample of daily equity data from US financial institutions (1,564 firms, from 31-12-1970 to 31-12-2020). On a sample of 9 US-based GSIBs we also show the properties and the utility of the $\Delta$-CoES for identifying problematic financial links.
    Date: 2022–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2206.02582&r=
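    A minimal sketch of the $\Delta$-CoVaR and $\Delta$-CoES definitions on simulated data; the paper estimates them with quantile regression, whereas this sketch conditions directly on a distress versus median state, and the 5% level and the two-asset setup are illustrative assumptions:
      import numpy as np

      # Hypothetical sketch: compare the system's tail quantile (CoVaR) and tail mean
      # (CoES) conditional on an institution being in distress versus near its median.
      rng = np.random.default_rng(2)
      n, q = 100_000, 0.05
      inst = rng.normal(size=n)                              # institution return
      system = 0.6 * inst + rng.normal(scale=0.8, size=n)    # correlated "system" return

      lo, med_lo, med_hi = np.quantile(inst, [q, 0.45, 0.55])
      distress = system[inst <= lo]                          # system returns in the distress state
      median_state = system[(inst >= med_lo) & (inst <= med_hi)]

      covar_d, covar_m = np.quantile(distress, q), np.quantile(median_state, q)
      coes_d = distress[distress <= covar_d].mean()          # ES = mean beyond the quantile
      coes_m = median_state[median_state <= covar_m].mean()

      print(f"Delta-CoVaR: {covar_d - covar_m:.3f}")
      print(f"Delta-CoES : {coes_d - coes_m:.3f}")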
  6. By: Michel Denuit; Jan Dhaene; Christian Y Robert
    Date: 2021–11–23
    URL: http://d.repec.org/n?u=RePEc:ete:afiper:689055&r=
  7. By: Afees A. Salisu (Centre for Econometrics & Applied Research, Ibadan, Nigeria; Department of Economics, University of Pretoria, Private Bag X20, Hatfield 0028, South Africa); Riza Demirer (Department of Economics and Finance, Southern Illinois University Edwardsville, Edwardsville, IL 62026-1102, USA); Rangan Gupta (Department of Economics, University of Pretoria, Private Bag X20, Hatfield 0028, South Africa)
    Abstract: This paper provides novel mixed-frequency insight into the growing literature on the (monthly) economic policy uncertainty-(daily) stock market volatility nexus by examining the out-of-sample predictive ability of the quality of policy signals for stock market volatility at various forecast horizons, and whether accounting for signal quality in forecasting models can help investors achieve economic gains. Both in- and out-of-sample tests, based on a GARCH-MIDAS framework, show that the quality of the policy signal indeed matters for the predictive role played by policy uncertainty over subsequent stock market volatility. While high EPU is found to predict high volatility, particularly when the signal quality is high, the positive relationship between EPU and volatility breaks down when the signal quality is low. The improved out-of-sample volatility forecasts obtained from the models that account for the quality of policy signals also help typical mean-variance investors achieve improved economic outcomes, captured by higher certainty equivalent returns and Sharpe ratios. Although our results indicate clear distinctions between the U.S. and U.K. stock markets in terms of how policy signals are processed by market participants, they highlight the role of the quality of policy signals as a driver of volatility forecasts with significant economic implications.
    Keywords: Economic policy uncertainty, Signal quality, Market Volatility, Forecasting
    JEL: C32 C53 D8 E32 G15
    Date: 2022–06
    URL: http://d.repec.org/n?u=RePEc:pre:wpaper:202232&r=
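    A simplified stand-in for the paper's GARCH-MIDAS exercise: a predictive regression of next-month realized variance on lagged EPU, a signal-quality dummy and their interaction, on simulated data; the variable construction and coefficients are illustrative assumptions, not the paper's specification:
      import numpy as np

      # Hypothetical sketch: the interaction term captures the idea that EPU predicts
      # volatility mainly when the policy signal is of high quality.
      rng = np.random.default_rng(3)
      T = 240
      epu = rng.gamma(shape=2.0, scale=50.0, size=T)              # stand-in for the EPU index
      quality = rng.integers(0, 2, size=T)                        # 1 = high signal quality
      rv = 0.02 + 0.0002 * epu * quality + rng.normal(scale=0.01, size=T)  # realized variance

      X = np.column_stack([np.ones(T - 1), epu[:-1], quality[:-1], (epu * quality)[:-1]])
      y = rv[1:]                                                  # next month's realized variance
      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      print(dict(zip(["const", "epu", "quality", "epu_x_quality"], beta.round(5))))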
  8. By: David Gabauer (Data Analysis Systems, Software Competence Center Hagenberg, Austria); Rangan Gupta (Department of Economics, University of Pretoria, Private Bag X20, Hatfield 0028, South Africa); Sayar Karmakar (Department of Statistics, University of Florida, 230 Newell Drive, Gainesville, FL, 32601, USA); Joshua Nielsen (Boulder Investment Technologies, LLC, 1942 Broadway Suite 314C, Boulder, CO, 80302, USA)
    Abstract: First, we use the Multi-Scale LPPLS Confidence Indicator approach to detect both positive and negative bubbles at short-, medium- and long-term horizons in the stock markets of the G7 and the BRICS countries. We are able to detect major crashes and rallies in the 12 stock markets over the period from the first week of January 1973 to the second week of September 2020. We also observe similar timing of strong (positive and negative) LPPLS indicator values across both G7 and BRICS countries, suggesting interconnectedness of the extreme movements in these stock markets. Second, we utilize these indicators to forecast gold returns and their volatility, using a method involving block means of residuals obtained from the popular LASSO routine, given that the number of covariates ranges between 42 and 72 and that gold returns demonstrate a heavy upper tail. We find that our bubble indicators, particularly when both positive and negative bubbles are considered simultaneously, can accurately forecast gold returns at short- to medium-term horizons, and, to a lesser extent, time-varying estimates of gold return volatility. Our results have important implications for the portfolio decisions of investors who seek a safe haven during boom-bust cycles of major global stock markets.
    Keywords: Gold, Stock Markets, Bubbles, Forecasting, Returns, Volatility
    JEL: C22 C53 G15 Q02
    Date: 2022–06
    URL: http://d.repec.org/n?u=RePEc:pre:wpaper:202228&r=
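    A minimal sketch of the high-dimensional forecasting step: a LASSO regression of returns on many bubble-indicator covariates; the simulated indicators and the use of scikit-learn's cross-validated LASSO (rather than the paper's block-residual method) are illustrative assumptions:
      import numpy as np
      from sklearn.linear_model import LassoCV

      # Hypothetical sketch: forecast a heavy-tailed return series from 60 simulated
      # covariates standing in for the multi-scale LPPLS bubble indicators.
      rng = np.random.default_rng(4)
      T, k = 500, 60
      indicators = rng.normal(size=(T, k))
      true_beta = np.zeros(k)
      true_beta[:3] = [0.4, -0.3, 0.2]                        # only a few indicators matter
      gold_ret = indicators @ true_beta + rng.standard_t(df=4, size=T)

      train, test = slice(0, 400), slice(400, T)
      model = LassoCV(cv=5).fit(indicators[train], gold_ret[train])
      pred = model.predict(indicators[test])
      print("selected covariates :", int(np.sum(model.coef_ != 0)))
      print("out-of-sample RMSE  :", float(np.sqrt(np.mean((pred - gold_ret[test]) ** 2))))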
  9. By: Bier, Tobias
    Abstract: Classical approaches in the field of supply chain risk management (SCRM) consider supply chains as linear, as the term itself indicates (Hearnshaw and Wilson, 2013). However, modern supply chains are by no means linear: they form complex interconnected networks (e.g., Hearnshaw and Wilson (2013)). This increased complexity is induced by trends such as globalization, increasing product complexity and shorter lead times (Ghadge et al., 2013, Harland et al., 2003). Clearly, new methods for supply chain management are needed, especially ones that account for the complexity of today’s supply chains. In this respect, the network structure of supply chains also needs to be considered. For example, studies find that the supply network structure is directly related to resilience, which is key to effective SCRM (Kim et al., 2015). Research has introduced network theoretical approaches to supply chain management (e.g., Galaskiewicz (2011), Borgatti and Li (2009)). This cumulative dissertation joins that effort by addressing the research field of network theory in the SCRM context. The dissertation contributes to the domain by first providing a systematic literature review that structures methods for mitigating disruptions in complex supply chains (more precisely, supply networks) and outlines an agenda for further research in the field. Next, the second paper contributes a qualitative model that helps to understand the mechanisms of risk in complex supply chain networks. The same model is the basis for two quantitative studies, presented in the third and fourth papers, that investigate how centrality measures can be used to identify critical suppliers. Finally, the fifth paper presents a study that directly contributes to practice by developing a supply chain mapping framework as a basis for systematic, effective and efficient SCRM in complex supply chain networks.
    Date: 2022
    URL: http://d.repec.org/n?u=RePEc:dar:wpaper:132717&r=
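    One of the ideas studied in the third and fourth papers, identifying critical suppliers with centrality measures, can be sketched on a toy supply network; the network below and the choice of betweenness centrality are illustrative assumptions, not the dissertation's data or measure:
      import networkx as nx

      # Hypothetical sketch: suppliers that lie on many supply paths receive high
      # betweenness centrality and are flagged as potentially critical.
      edges = [                                    # directed supplier -> customer links
          ("raw_A", "tier2_1"), ("raw_B", "tier2_1"), ("raw_B", "tier2_2"),
          ("tier2_1", "tier1_1"), ("tier2_2", "tier1_1"), ("tier2_2", "tier1_2"),
          ("tier1_1", "OEM"), ("tier1_2", "OEM"),
      ]
      G = nx.DiGraph(edges)

      centrality = nx.betweenness_centrality(G)
      for node, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
          print(f"{node:8s} {score:.3f}")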
  10. By: Nicklas Werge (LPSM (UMR_8001) - Laboratoire de Probabilités, Statistiques et Modélisations - SU - Sorbonne Université - CNRS - Centre National de la Recherche Scientifique - UPC - Université Paris Cité); Olivier Wintenberger (LPSM (UMR_8001) - Laboratoire de Probabilités, Statistiques et Modélisations - SU - Sorbonne Université - CNRS - Centre National de la Recherche Scientifique - UPC - Université Paris Cité)
    Abstract: Quasi-Maximum Likelihood (QML) procedures are theoretically appealing and widely used for statistical inference. While there are extensive references on QML estimation in batch settings, QML estimation in streaming settings has attracted little attention until recently. An investigation of the convergence properties of the QML procedure in a general conditionally heteroscedastic time series model is conducted, and the classical batch optimization routines are extended to the framework of streaming and large-scale problems. An adaptive recursive estimation routine for GARCH models, named AdaVol, is presented. The AdaVol procedure relies on stochastic approximations combined with the technique of Variance Targeting Estimation (VTE). This recursive method is computationally efficient, while VTE alleviates some convergence difficulties encountered by the usual QML estimation due to a lack of convexity. Empirical results demonstrate a favorable trade-off between AdaVol's stability and its ability to adapt to time-varying estimates on real-life data.
    Keywords: recursive algorithm, quasi-likelihood, volatility models, GARCH, prediction method, stock index
    Date: 2022
    URL: http://d.repec.org/n?u=RePEc:hal:journl:hal-02733439&r=
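    A heavily simplified sketch of a recursive (streaming) QML update for a GARCH(1,1) with variance targeting, in the spirit of AdaVol; the gradient recursion, the constant step size and the projection rule are illustrative simplifications, not the paper's algorithm:
      import numpy as np

      rng = np.random.default_rng(5)

      # Simulate a GARCH(1,1) series as stand-in data.
      T, omega0, alpha0, beta0 = 5000, 0.05, 0.08, 0.90
      y, sig2 = np.empty(T), omega0 / (1 - alpha0 - beta0)
      for t in range(T):
          y[t] = np.sqrt(sig2) * rng.normal()
          sig2 = omega0 + alpha0 * y[t] ** 2 + beta0 * sig2

      # Recursive QML with variance targeting: omega_t = vbar_t * (1 - alpha - beta).
      alpha, beta = 0.05, 0.80                 # starting values
      vbar = y[0] ** 2 + 1e-6                  # running estimate of the unconditional variance
      s2, ds2 = vbar, np.zeros(2)              # conditional variance and its gradient wrt (alpha, beta)
      step = 1e-3

      for t in range(1, T):
          vbar += (y[t - 1] ** 2 - vbar) / t                     # variance-targeting update
          s2_prev = s2
          s2 = vbar * (1 - alpha - beta) + alpha * y[t - 1] ** 2 + beta * s2_prev
          ds2 = np.array([-vbar + y[t - 1] ** 2, -vbar + s2_prev]) + beta * ds2
          grad = 0.5 * (1.0 / s2 - y[t] ** 2 / s2 ** 2) * ds2    # gradient of the quasi log-loss
          alpha, beta = np.clip(np.array([alpha, beta]) - step * grad, 1e-6, 0.999)
          if alpha + beta > 0.999:                               # keep the recursion stationary
              shrink = 0.999 / (alpha + beta)
              alpha, beta = alpha * shrink, beta * shrink

      print(f"recursive estimates: alpha={alpha:.3f}, beta={beta:.3f} (simulated truth: 0.08, 0.90)")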
  11. By: Marcel Fafchamps; Aditya Shrinivas
    Abstract: We propose a new method to test for efficient risk pooling that allows for intertemporal smoothing, non-homothetic consumption, and heterogeneous risk and time preferences. The method is composed of three steps. The first one allows for precautionary savings by the aggregate risk pooling group. The second utilizes the inverse Engel curve to estimate good-specific tests for efficient risk pooling. In the third step, we obtain consistent estimates of households' risk and time preferences using a full risk sharing model, and incorporate heterogeneous preferences in testing for risk pooling. We apply this method to panel data from Indian villages to generate a number of new insights. We find that food expenditures are better protected from aggregate shocks than non-food consumption, after accounting for non-homotheticity. Village-level consumption tracks aggregate village cash-in-hand, suggesting some form of coordinated precautionary savings. But there is considerable excess sensitivity to aggregate income, indicating a lack of full asset integration. We also find a large unexplained gap between the variation in measured consumption expenditures and cash-in-hand at the aggregate village level. Contrary to earlier findings, risk pooling in Indian villages no longer appears to take place more at the sub-caste level than at the village level.
    JEL: D14 D31 D64 O12
    Date: 2022–06
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:30128&r=
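    A minimal sketch of the classic full risk-sharing regression that this literature builds on: household consumption growth regressed on household income growth after removing village-by-period means; the simulated panel and the simple within-estimator are illustrative assumptions, and the paper's actual procedure (inverse Engel curves, precautionary saving, heterogeneous preferences) is considerably richer:
      import numpy as np

      # Hypothetical sketch: under full risk pooling, idiosyncratic income growth
      # should not predict consumption growth once village-time effects are removed.
      rng = np.random.default_rng(6)
      n_villages, n_hh, n_periods = 20, 30, 10
      village = np.repeat(np.arange(n_villages), n_hh)

      dy_parts, dc_parts, cell_parts = [], [], []
      for t in range(n_periods):
          agg = rng.normal(size=n_villages)                         # village aggregate shock
          dy = agg[village] + rng.normal(size=village.size)         # household income growth
          dc = agg[village] + 0.2 * rng.normal(size=village.size)   # consumption growth (idiosyncratic risk pooled)
          dy_parts.append(dy)
          dc_parts.append(dc)
          cell_parts.append(village + t * n_villages)               # village-by-period cell id
      dy, dc, cell = (np.concatenate(p) for p in (dy_parts, dc_parts, cell_parts))

      def demean(x, groups):
          """Remove village-by-period means (equivalent to village-time fixed effects)."""
          means = np.bincount(groups, weights=x) / np.bincount(groups)
          return x - means[groups]

      dy_w, dc_w = demean(dy, cell), demean(dc, cell)
      beta = (dy_w @ dc_w) / (dy_w @ dy_w)
      print(f"coefficient on idiosyncratic income growth: {beta:.3f} (0 under full risk pooling)")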
  12. By: Qiang Liu; Yingtao Luo; Shu Wu; Zhen Zhang; Xiangnan Yue; Hong Jin; Liang Wang
    Abstract: In financial credit scoring, loan applications may be approved or rejected. We can only observe default/non-default labels for approved samples but have no observations for rejected samples, which leads to missing-not-at-random selection bias. Machine learning models trained on such biased data are inevitably unreliable. In this work, we find that the default/non-default classification task and the rejection/approval classification task are highly correlated, according to both a real-world data study and theoretical analysis. Consequently, the learning of default/non-default can benefit from rejection/approval. Accordingly, we propose, for the first time, to model the biased credit scoring data with Multi-Task Learning (MTL). Specifically, we propose a novel Reject-aware Multi-Task Network (RMT-Net), which learns the task weights that control the information sharing from the rejection/approval task to the default/non-default task via a gating network based on rejection probabilities. RMT-Net leverages the relation between the two tasks: the larger the rejection probability, the more the default/non-default task needs to learn from the rejection/approval task. Furthermore, we extend RMT-Net to RMT-Net++ for modeling scenarios with multiple rejection/approval strategies. Extensive experiments on several datasets strongly verify the effectiveness of RMT-Net on both approved and rejected samples. In addition, RMT-Net++ further improves RMT-Net's performance.
    Date: 2022–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2206.00568&r=
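    A minimal PyTorch sketch of the reject-aware multi-task idea: a shared encoder, a rejection head and a default head, with a gate driven by the predicted rejection probability; the layer sizes, the gating form and the toy data are illustrative assumptions, not the published RMT-Net architecture:
      import torch
      import torch.nn as nn

      class RejectAwareNet(nn.Module):
          def __init__(self, n_features: int, hidden: int = 32):
              super().__init__()
              self.encoder = nn.Sequential(nn.Linear(n_features, hidden), nn.ReLU())
              self.reject_head = nn.Linear(hidden, 1)    # rejection/approval task
              self.default_head = nn.Linear(hidden, 1)   # default/non-default task
              self.gate = nn.Linear(1, hidden)           # gate driven by the rejection probability

          def forward(self, x):
              h = self.encoder(x)
              p_reject = torch.sigmoid(self.reject_head(h))
              # The larger the rejection probability, the more shared signal flows
              # to the default head (the relation highlighted in the abstract).
              gated = h * torch.sigmoid(self.gate(p_reject))
              p_default = torch.sigmoid(self.default_head(gated))
              return p_reject, p_default

      # Toy usage: the default loss is computed on approved samples only, since
      # default labels are missing for rejected applications.
      torch.manual_seed(0)
      x = torch.randn(64, 10)
      approved = torch.rand(64) > 0.3
      y_reject = (~approved).float().unsqueeze(1)
      y_default = (torch.rand(64) > 0.8).float().unsqueeze(1)

      model = RejectAwareNet(n_features=10)
      p_reject, p_default = model(x)
      bce = nn.BCELoss()
      loss = bce(p_reject, y_reject) + bce(p_default[approved], y_default[approved])
      loss.backward()
      print(f"combined loss: {loss.item():.3f}")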
  13. By: Luciano Somoza (University of Lausanne, HEC; Swiss Finance Institute); Antoine Didisheim (Swiss Finance Institute, UNIL)
    Abstract: We propose a mechanism explaining the recent high positive correlation between cryptocurrencies and the stock market. With a unique dataset of investor-level holdings from a bank offering trading accounts and cryptocurrency wallets, we show that retail investors’ net trading volumes of stocks and cryptocurrencies are positively correlated. Theoretically, this micro-level pattern translates into a cross-asset class correlation as long as the two markets are not fully integrated. We provide suggestive evidence showing that this micro-level pattern emerged in March 2020 and that stocks preferred by crypto-traders exhibit a stronger correlation with Bitcoin, especially when the cross-asset retail volume is high.
    Keywords: cryptocurrencies, Bitcoin, retail investors, correlation
    JEL: G11 G12 G29
    Date: 2022–06
    URL: http://d.repec.org/n?u=RePEc:chf:rpseri:rp2253&r=
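    The micro-level pattern described above can be illustrated by correlating net trading volumes in the two asset classes before and after a break date; the data layout, the simulated series and the March 2020 break below are illustrative assumptions (the paper uses proprietary investor-level bank data):
      import numpy as np
      import pandas as pd

      # Hypothetical sketch: a common retail-demand component switched on in March
      # 2020 generates a positive correlation between net stock and crypto volumes.
      rng = np.random.default_rng(7)
      dates = pd.date_range("2019-01-01", "2021-12-31", freq="D")
      common = np.where(dates >= "2020-03-01", rng.normal(size=len(dates)), 0.0)
      net_stock = common + rng.normal(size=len(dates))
      net_crypto = common + rng.normal(size=len(dates))
      df = pd.DataFrame({"net_stock": net_stock, "net_crypto": net_crypto}, index=dates)

      for label, sub in (("pre 2020-03", df[:"2020-02-29"]), ("post 2020-03", df["2020-03-01":])):
          corr = sub["net_stock"].corr(sub["net_crypto"])
          print(f"{label}: correlation of net volumes = {corr:.2f}")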
  14. By: Benjamin Avanzi; Yanfeng Li; Bernard Wong; Alan Xian
    Abstract: Loss reserving generally focuses on identifying a single model that can generate superior predictive performance. However, different loss reserving models specialise in capturing different aspects of loss data. This is recognised in practice in the sense that results from different models are often considered, and sometimes combined. For instance, actuaries may take a weighted average of the prediction outcomes from various loss reserving models, often based on subjective assessments. In this paper, we propose a systematic framework to objectively combine (i.e. ensemble) multiple stochastic loss reserving models such that the strengths offered by different models can be utilised effectively. Criteria of choice consider the full distributional properties of the ensemble. A notable innovation of our framework is that it is tailored for the features inherent to reserving data. These include, for instance, accident, development, calendar, and claim maturity effects. Crucially, the relative importance and scarcity of data across accident periods renders the problem distinct from the traditional ensembling techniques in statistical learning. Our ensemble reserving framework is illustrated with a complex synthetic dataset. In the results, the optimised ensemble outperforms both (i) traditional model selection strategies, and (ii) an equally weighted ensemble. In particular, the improvement occurs not only with central estimates but also relevant quantiles, such as the 75th percentile of reserves (typically of interest to both insurers and regulators).
    Date: 2022–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2206.08541&r=
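    A minimal sketch of distributional ensembling: two simple fitted distributions are mixed with a weight chosen to maximise the log score on a validation set; the normal/Student-t component models and the synthetic loss data are illustrative assumptions, and the paper's reserving-specific ensemble is far richer:
      import numpy as np
      from scipy import stats

      # Hypothetical sketch: choose the mixture weight that maximises the mean log
      # predictive density (log score) on held-out observations.
      rng = np.random.default_rng(8)
      data = rng.standard_t(df=4, size=3000) * 2.0 + 10.0       # stand-in for loss observations
      train, valid = data[:2000], data[2000:]

      norm_params = stats.norm.fit(train)
      t_params = stats.t.fit(train)

      def log_score(w):
          dens = w * stats.norm.pdf(valid, *norm_params) + (1 - w) * stats.t.pdf(valid, *t_params)
          return np.mean(np.log(dens))

      weights = np.linspace(0.0, 1.0, 101)
      best_w = weights[np.argmax([log_score(w) for w in weights])]
      print(f"optimal weight on the normal component: {best_w:.2f}")
      print(f"validation log score at the optimum   : {log_score(best_w):.4f}")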
  15. By: Luciano Campos; Danilo Leiva-León; Steven Zapata-Álvarez
    Abstract: This paper proposes comprehensive measures of the Latin American business cycle that help to infer the expected depth of recessions, and the strength of expansions, as they unfold in real time. These measures are based on the largest economies in the region and account for intrinsic features of real activity, such as comovement, nonlinearities and asymmetries; they are also robust to unprecedented shocks, like the COVID-19 pandemic. The proposed measures provide timely updates on (i) inferences on the state of the regional economy, (ii) the underlying momentum embedded in short-term fluctuations of real activity, and (iii) the quantification of macroeconomic tail risks. Employing the proposed measures, we also evaluate the time-varying effects of U.S. financial conditions on the Latin American economy and identify periods of persistent international spillovers.
    Keywords: Business Cycles, Factor Model, Nonlinear, Latin America
    JEL: E32 C22 E27
    Date: 2022–06
    URL: http://d.repec.org/n?u=RePEc:bdr:borrec:1201&r=
  16. By: Laurence Barry; Arthur Charpentier
    Abstract: Since the beginning of their history, insurers have been known to use data to classify and price risks. As such, they were confronted early on with the problem of fairness and discrimination associated with data. This issue is becoming increasingly important with access to more granular and behavioural data, and is evolving to reflect current technologies and societal concerns. By looking into earlier debates on discrimination, we show that some algorithmic biases are a renewed version of older ones, while others show a reversal of the previous order. Paradoxically, although insurance practice has not changed deeply and most of these biases are not new, the machine learning era still profoundly shakes the conception of insurance fairness.
    Date: 2022–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2205.08112&r=
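    The paper is conceptual, but one fairness notion it discusses can be made concrete with a short sketch computing the demographic parity gap of an underwriting rule; the simulated scores, the protected attribute and the acceptance threshold are illustrative assumptions, not a metric prescribed by the paper:
      import numpy as np

      # Hypothetical sketch: compare acceptance rates across a protected group.
      rng = np.random.default_rng(9)
      n = 10_000
      group = rng.integers(0, 2, size=n)                      # protected attribute (0/1)
      score = rng.normal(loc=0.1 * group, scale=1.0, size=n)  # model score, slightly shifted by group
      accepted = score > 0.0                                  # underwriting decision

      rate_0 = accepted[group == 0].mean()
      rate_1 = accepted[group == 1].mean()
      print(f"acceptance rate, group 0: {rate_0:.3f}")
      print(f"acceptance rate, group 1: {rate_1:.3f}")
      print(f"demographic parity gap  : {abs(rate_1 - rate_0):.3f}")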

This nep-rmg issue is ©2022 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.