nep-fmk New Economics Papers
on Financial Markets
Issue of 2020‒11‒16
twelve papers chosen by
Kwang Soo Cheong
Johns Hopkins University

  1. r Minus g By Robert J. Barro
  2. Searching for the Equity Premium By Hang Bai; Lu Zhang
  3. Why do mutual funds hold lottery stocks? By Agarwal, Vikas; Jiang, Lei; Wen, Quan
  4. Event-Driven Learning of Systematic Behaviours in Stock Markets By Xianchao Wu
  5. Stock Price Prediction Using CNN and LSTM-Based Deep Learning Models By Sidra Mehtab; Jaydip Sen
  6. Machine learning in credit risk: measuring the dilemma between prediction and supervisory cost By Andrés Alonso; José Manuel Carbó
  7. Deep learning for CVA computations of large portfolios of financial derivatives By Kristoffer Andersson; Cornelis W. Oosterlee
  8. Non-Normal Identification for Price Discovery in High-Frequency Financial Markets By Sebastiano Michele Zema
  9. Inside the regulatory sandbox: effects on fintech funding By Giulio Cornelli; Sebastian Doerr; Leonardo Gambacorta; Ouarda Merrouche
  10. Hedging with commodity futures and the end of normal backwardation By Jochen Güntner; Benjamin Karner
  11. Stock market spillovers via the global production network: Transmission of U.S. monetary policy By Julian di Giovanni; Galina Hale
  12. Realized volatility, jump and beta: evidence from Canadian stock market By Gajurel, Dinesh; Chowdhury, Biplob

  1. By: Robert J. Barro
    Abstract: Long-term data show that the dynamic efficiency condition r > g holds when g is represented by the average growth rate of real GDP if r is the average real rate of return on equity, E(r^e), but not if r is the risk-free rate, r_f. This pattern accords with a simple disaster-risk model calibrated to fit observed equity premia. If Ponzi (chain-letter) finance by private agents and the government is precluded, the equilibrium can feature r_f ≤ E(g), a result that does not signal dynamic inefficiency. In contrast, E(r^e) > E(g) is required for dynamic efficiency, is implied by the model, and is consistent with the data. The model satisfies Ricardian equivalence because, without Ponzi finance by the government, a rise in safe assets from increased public debt is matched by an increase in the safe (that is, certain) present value of liabilities associated with net taxes.
    JEL: E21 G12 O4
    Date: 2020–10
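    The abstract's two contrasted conditions can be restated compactly in its own notation (a restatement of the claims above, not additional results from the paper):

```latex
% Dynamic efficiency requires the expected equity return to exceed expected growth,
% while the risk-free rate may fall below expected growth without signalling inefficiency:
E(r^{e}) > E(g), \qquad \text{whereas possibly} \qquad r_{f} \le E(g).
```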
  2. By: Hang Bai; Lu Zhang
    Abstract: Labor market frictions are crucial for the equity premium in production economies. A dynamic stochastic general equilibrium model with recursive utility, search frictions, and capital accumulation yields a high equity premium of 4.26% per annum, a stock market volatility of 11.8%, and a low average interest rate of 1.59%, while simultaneously retaining plausible business cycle dynamics. The equity premium and stock market volatility are strongly countercyclical, while the interest rate and consumption growth are largely unpredictable. Because of wage inertia, dividends are procyclical despite consumption smoothing via capital investment. The welfare cost of business cycles is huge, at 29%.
    JEL: E32 E44 G12
    Date: 2020–10
  3. By: Agarwal, Vikas; Jiang, Lei; Wen, Quan
    Abstract: We provide evidence regarding mutual funds' motivation to hold lottery stocks. Funds with higher managerial ownership invest less in lottery stocks, suggesting that managers themselves do not prefer such stocks. The evidence instead suggests that managers cater to fund investors' preference for such stocks. In particular, funds with more lottery holdings attract larger flows after portfolio disclosure compared with their peers, and poorly performing funds tend to engage in risk shifting by increasing their lottery holdings toward year-end. Funds' aggregate holdings of lottery stocks contribute to their overpricing.
    Keywords: lottery stocks, risk shifting, fund performance, investor flows, stock mispricing
    JEL: G11 G23
    Date: 2020
  4. By: Xianchao Wu
    Abstract: Financial news, and especially the financial events it reports, is known to inform investors' long/short decisions and to influence stock market movements. Motivated by this, we leverage financial event streams to train a classification neural network that detects latent event-stock linkages and systematic behaviours in the U.S. stock market. Our proposed pipeline includes (1) a combined event extraction method that utilizes Open Information Extraction and neural coreference resolution, (2) a BERT/ALBERT-enhanced representation of events, and (3) an extended hierarchical attention network with attention at the event, news and temporal levels. Our pipeline achieves significantly better accuracies and higher simulated annualized returns than state-of-the-art models when applied to predicting the Standard & Poor's 500, Dow Jones and Nasdaq indices and 10 individual stocks.
    Date: 2020–10
  5. By: Sidra Mehtab; Jaydip Sen
    Abstract: Designing robust and accurate predictive models for stock price prediction has long been an active area of research. While supporters of the efficient market hypothesis claim that it is impossible to forecast stock prices accurately, many researchers believe otherwise, and the literature contains propositions demonstrating that properly designed and optimized predictive models can forecast future stock prices accurately and reliably. This paper presents a suite of deep learning-based models for stock price prediction. We use the historical records of the NIFTY 50 index, listed on the National Stock Exchange of India, from December 29, 2008 to July 31, 2020 for training and testing the models. Our proposition includes two regression models built on convolutional neural networks (CNNs) and three predictive models based on long short-term memory (LSTM) networks. To forecast the open values of the NIFTY 50 index, we adopt a multi-step prediction technique with walk-forward validation. In this approach, the open values of the NIFTY 50 index are predicted on a time horizon of one week; once the week is over, the actual index values are added to the training set, the model is retrained, and the forecasts for the next week are made. We present detailed results on the forecasting accuracy of all proposed models. The results show that while all the models forecast the NIFTY 50 open values very accurately, the univariate encoder-decoder convolutional LSTM with the previous two weeks' data as input is the most accurate model, while a univariate CNN model with the previous week's data as input is the fastest in terms of execution speed.
    Date: 2020–10
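    The walk-forward validation loop described in the abstract can be sketched as follows; the naive last-value forecaster is a hypothetical stand-in for the paper's CNN/LSTM models, and the window and horizon values are illustrative:

```python
import numpy as np

def walk_forward_forecast(series, horizon=5, window=10):
    """Multi-step walk-forward validation: forecast one week (5 trading
    days) ahead, then fold the realized values back into the training set
    before forecasting the next week. The 'repeat the last observed value'
    forecaster is a placeholder for a real CNN/LSTM model."""
    preds, actuals = [], []
    t = window
    while t + horizon <= len(series):
        train = series[:t]                        # history available so far
        forecast = np.repeat(train[-1], horizon)  # placeholder model
        preds.append(forecast)
        actuals.append(series[t:t + horizon])     # realized week, folded back in
        t += horizon                              # retraining point moves one week
    return np.concatenate(preds), np.concatenate(actuals)

# toy index series standing in for NIFTY 50 open values
rng = np.random.default_rng(0)
series = 100 + np.cumsum(rng.normal(0, 1, 60))
preds, actuals = walk_forward_forecast(series)
rmse = np.sqrt(np.mean((preds - actuals) ** 2))
```

    Each pass uses only information available at forecast time, which is what distinguishes walk-forward validation from a single static train/test split.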
  6. By: Andrés Alonso (Banco de España); José Manuel Carbó (Banco de España)
    Abstract: New reports show that the financial sector is increasingly adopting machine learning (ML) tools to manage credit risk. In this environment, supervisors face the challenge of allowing credit institutions to benefit from technological progress and financial innovation while at the same time ensuring compatibility with regulatory requirements and observing technological neutrality. We propose a new framework for supervisors to measure the costs and benefits of evaluating ML models, aiming to shed more light on this technology’s alignment with the regulation. We follow three steps. First, we identify the benefits by reviewing the literature. We observe that ML delivers predictive gains of up to 20% in default classification compared with traditional statistical models. Second, we use the process for validating internal ratings-based (IRB) systems for regulatory capital to detect ML’s limitations in credit risk management. We identify up to 13 factors that might constitute a supervisory cost. Finally, we propose a methodology for evaluating these costs. For illustrative purposes, we compute the benefits by estimating the predictive gains of six ML models using a public database on credit default. We then calculate a supervisory cost function through a scorecard in which we assign weights to each factor for each ML model, based on how the model is used by the financial institution and the supervisor’s risk tolerance. From a supervisory standpoint, having a structured methodology for assessing ML models could increase transparency and remove an obstacle to innovation in the financial industry.
    Keywords: artificial intelligence, machine learning, credit risk, interpretability, bias, IRB models
    JEL: C53 D81 G17
    Date: 2020–10
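    The scorecard step can be illustrated with a minimal sketch: a supervisory cost as a weighted sum of per-factor scores. The factor names, scores and weights below are hypothetical placeholders, not the paper's 13 factors or its calibration:

```python
def supervisory_cost(factor_scores, weights):
    """Weighted scorecard: each supervisory-cost factor gets a score for a
    given ML model, and weights reflect how the model is used and the
    supervisor's risk tolerance. Returns the aggregate cost."""
    assert set(factor_scores) == set(weights), "same factors required"
    return sum(weights[f] * factor_scores[f] for f in factor_scores)

# hypothetical factors and weights for one candidate ML model
scores = {"interpretability": 3, "bias": 2, "data_governance": 1}
weights = {"interpretability": 0.5, "bias": 0.3, "data_governance": 0.2}
cost = supervisory_cost(scores, weights)  # 0.5*3 + 0.3*2 + 0.2*1 = 2.3
```

    Comparing this cost against the model's estimated predictive gain is the cost-benefit trade-off the framework formalizes.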
  7. By: Kristoffer Andersson; Cornelis W. Oosterlee
    Abstract: In this paper, we propose a neural network-based method for CVA computations on a portfolio of derivatives. In particular, we focus on portfolios consisting of a combination of derivatives with and without true optionality, e.g., a mix of European- and Bermudan-type derivatives. CVA is computed, with and without netting, for different levels of wrong-way risk (WWR) and for different levels of counterparty credit quality. We show that CVA is overestimated by up to 25% under the standard procedure of not adjusting the exercise strategy for the default risk of the counterparty. For the Expected Shortfall of the CVA dynamics, the overestimation is found to exceed 100% in some non-extreme cases.
    Date: 2020–10
  8. By: Sebastiano Michele Zema
    Abstract: The ability to measure the relative contributions of agents and exchanges to the price formation process in high-frequency financial markets has acquired increasing importance in the financial econometric literature. In this paper I propose fully data-driven approaches to identify the structural vector error correction models (SVECMs) typically used for price discovery. Exploiting the non-Normal distributions of the variables under consideration, I propose two novel variants of the widespread Information Share (IS) measure that identify the leaders and the followers in the price formation process. The approaches are illustrated from both semiparametric and parametric standpoints, solving the identification problem without the increase in computational complexity that usually arises when working at very short time scales. Finally, an empirical application to IBM intraday data is provided.
    Keywords: Information Shares; Structural VECM; Microstructure noise; Independent Component Analysis; Directed acyclic graphs.
    Date: 2020–10–26
  9. By: Giulio Cornelli; Sebastian Doerr; Leonardo Gambacorta; Ouarda Merrouche
    Abstract: Policymakers around the world are adopting regulatory sandboxes as a tool for spurring innovation in the financial sector while staying alert to emerging risks. Using unique data for the UK, this paper provides initial evidence on the effectiveness of the world's first sandbox in improving fintechs' access to finance. Firms entering the sandbox see a significant increase of 15% in capital raised post-entry, relative to firms that did not enter, and their probability of raising capital increases by 50%. Our results further suggest that the sandbox facilitates access to capital through two channels: reduced asymmetric information and reduced regulatory costs or uncertainty. Our results are confirmed when we exploit the staggered introduction of the sandbox and compare firms in earlier sandbox cohorts to those in later ones, and when we compare participating firms to a matched set of firms that never enter the sandbox.
    Keywords: fintech, regulatory sandbox, startups, venture capital.
    JEL: G32 G38 M13 O3
    Date: 2020–11
  10. By: Jochen Güntner; Benjamin Karner
    Abstract: Using the S&P GSCI and its five component sub-indices, we show that considering each commodity separately yields nontrivial hedging gains in and out of sample. During 1999–2019, the maximum Sharpe ratio portfolio assigns positive weights to the GSCI Energy, Industrial and Precious Metals, whereas only precious metals enter the optimal portfolio after the financial crisis. In out-of-sample optimizations based on dynamic conditional correlations, a subset of commodity futures excluding the GSCI Agriculture and Livestock outperforms conventional stock-bond portfolios with and without the overall GSCI. We argue that the “normal backwardation” in commodity markets has broken down during our sample period.
    Keywords: Commodity futures, Diversification, Hedging, Financial crisis, Normal backwardation
    JEL: C58 G11 G17 Q02
    Date: 2020–11
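    An in-sample maximum Sharpe ratio (tangency) portfolio of the kind used above can be sketched as follows, assuming sample mean excess returns and a sample covariance matrix; the numbers are illustrative, not the GSCI estimates from the paper:

```python
import numpy as np

def max_sharpe_weights(mu, cov):
    """Tangency (maximum Sharpe ratio) portfolio: weights proportional to
    inv(cov) @ mu, normalized to sum to one. Assumes excess returns and
    no short-sale constraint."""
    raw = np.linalg.solve(cov, mu)   # avoids forming the explicit inverse
    return raw / raw.sum()

# hypothetical annualized moments for three commodity sub-indices
mu = np.array([0.04, 0.02, 0.03])
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.02, 0.00],
                [0.00, 0.00, 0.03]])
w = max_sharpe_weights(mu, cov)
```

    A sub-index dropping out of the optimal portfolio, as agriculture and livestock do in the paper, corresponds to its weight shrinking to zero (or going negative and being excluded under a long-only constraint).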
  11. By: Julian di Giovanni; Galina Hale
    Abstract: We quantify the role of global production linkages in explaining spillovers of U.S. monetary policy shocks to stock returns of 54 sectors in 26 countries. We first present a conceptual framework based on a standard open-economy production network model that delivers a spillover pattern consistent with a spatial autoregression (SAR) process. We then use the SAR model to decompose the overall impact of U.S. monetary policy on stock returns into a direct effect and a network effect. We find that up to 80% of the total impact of U.S. monetary policy shocks on average country-sector stock returns is due to the network effect of global production linkages. We further show that U.S. monetary policy shocks have a direct impact predominantly on U.S. sectors and then propagate to the rest of the world through the global production network. Our results are robust to controlling for correlates of the global financial cycle and foreign monetary policy shocks, and to changes in variable definitions and empirical specifications.
    Keywords: Global production network, asset prices, monetary policy shocks
    JEL: G15 F10 F36
    Date: 2020–10
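    The SAR decomposition into a direct and a network effect can be sketched as follows; the linkage matrix W, spillover strength rho and shock vector are illustrative toys, not the paper's estimated production network:

```python
import numpy as np

def sar_decomposition(W, rho, beta):
    """Spatial autoregression: returns satisfy r = rho*W*r + beta, so the
    total impact of a shock is (I - rho*W)^{-1} beta. Split this into the
    direct effect beta and the network effect propagated through linkages."""
    n = len(beta)
    total = np.linalg.solve(np.eye(n) - rho * W, beta)
    direct = beta
    network = total - direct
    return total, direct, network

# toy 3-sector row-normalized linkage matrix; only sector 0 is hit directly
W = np.array([[0.0, 0.5, 0.5],
              [1.0, 0.0, 0.0],
              [0.5, 0.5, 0.0]])
beta = np.array([1.0, 0.0, 0.0])
total, direct, network = sar_decomposition(W, rho=0.4, beta=beta)
share = network.sum() / total.sum()  # fraction of total impact from the network
```

    In this toy economy the shock hits only sector 0 directly, yet sectors 1 and 2 still move through the network term, which is the mechanism behind the paper's 80% figure.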
  12. By: Gajurel, Dinesh (University of New Brunswick); Chowdhury, Biplob (Tasmanian School of Business & Economics, University of Tasmania)
    Abstract: The inclusion of a jump component in the price process has long been debated in the finance literature. In this paper, we identify and characterize jump risks in the Canadian stock market using high-frequency data from the Toronto Stock Exchange. Our results provide strong evidence of jump clustering: about 90% of jumps occur within the first 30 minutes of the trading day, and about 55% of jumps are due to overnight returns. While the average intraday jump is negative, jumps induced by overnight returns have a cancelling effect that drives the average jump size to zero. We show that the economic significance of the jump component in volatility forecasting is minimal. Our results further demonstrate that market jumps and overnight returns bring significant changes in the systematic risk (beta) of stocks. While the average effect of market jumps on beta is not significantly different from zero, the effect of overnight returns on beta is significant. Overall, our results suggest that jump risk is non-systematic in nature.
    Keywords: financial markets, stock price process, jumps, volatility, systematic risk
    JEL: C58 G12
    Date: 2020
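    A common way to separate a jump component from realized volatility in high-frequency data, not necessarily the paper's exact procedure, compares realized variance with jump-robust bipower variation; a minimal sketch with simulated 5-minute returns:

```python
import numpy as np

def jump_component(returns):
    """Realized variance (RV) captures total variation including jumps;
    bipower variation (BV), built from products of adjacent absolute
    returns, is robust to jumps. max(RV - BV, 0) estimates the jump part."""
    r = np.asarray(returns)
    rv = np.sum(r ** 2)
    bv = (np.pi / 2) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))
    return rv, bv, max(rv - bv, 0.0)

rng = np.random.default_rng(1)
r = rng.normal(0, 0.001, 78)  # 78 five-minute returns over one trading day
r[10] += 0.02                 # inject one large intraday jump
rv, bv, jump = jump_component(r)
```

    On a day without jumps, RV and BV estimate the same integrated variance and the difference is near zero, which is how jump days are flagged.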

This nep-fmk issue is ©2020 by Kwang Soo Cheong. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.