nep-cmp New Economics Papers
on Computational Economics
Issue of 2020‒12‒07
28 papers chosen by



  1. Simulation of the drawdown and its duration in Lévy models via stick-breaking Gaussian approximation By Jorge González Cázares; Aleksandar Mijatović
  2. Application of deep quantum neural networks to finance By Takayuki Sakuma
  3. Portfolio Risk Measurement Using a Mixture Simulation Approach By Seyed Mohammad Sina Seyfi; Azin Sharifi; Hamidreza Arian
  4. People Meet People: A Microlevel Approach to Predicting the Effect of Policies on the Spread of COVID-19 By Gabler, Janos; Raabe, Tobias; Röhrl, Klara
  5. Affine-invariant contracting-point methods for Convex Optimization By Doikov, Nikita; Nesterov, Yurii
  6. Deep Neural Networks and Neuro-Fuzzy Networks for Intellectual Analysis of Economic Systems By Alexey Averkin; Sergey Yarushev
  7. Nonparametric Adaptive Bayesian Stochastic Control Under Model Uncertainty By Tao Chen; Jiyoun Myung
  8. Pattern recognition in trading behaviors before stock price jumps: new method based on multivariate time series classification By Ao Kong; Robert Azencott; Hongliang Zhu
  9. Predicting Disaggregated CPI Inflation Components via Hierarchical Recurrent Neural Networks By Oren Barkan; Itamar Caspi; Allon Hammer; Noam Koenigstein
  10. Policy Gradient Methods for the Noisy Linear Quadratic Regulator over a Finite Horizon By Ben Hambly; Renyuan Xu; Huining Yang
  11. Spillover effects in international business cycles By Máximo Camacho; Matías Pacce; Gabriel Pérez-Quirós
  12. Asset Pricing with Realistic Crises Dynamics By Goutham Gopalakrishna
  13. Aplicação do Movimento Browniano Geométrico para Simulação de Preços de Ações do Índice Brasileiro de Small Caps By Marcos Vinícius dos Santos Araújo
  14. A deep-narrative analysis of energy cultures in slum rehabilitation housing of Abuja, Mumbai and Rio de Janeiro for just policy design By Debnath, R.; Bardhan, R.; Darby, S.; Mohaddes, K.; Sunikka-Blank, M.; Coelho, A. C. V.; Isa, A.
  15. Understanding the Distributional Aspects of Microcredit Expansions By Melvyn Weeks; Tobias Gabel Christiansen
  16. Predicting county-scale maize yields with publicly available data By Jiang, Zehui; Liu, Chao; Ganapathysubramanian, Baskar; Hayes, Dermot J.; Sarkar, Soumik
  17. The effects of structural reforms: Evidence from Italy By Emanuela Ciapanna; Sauro Mocetti; Alessandro Notarpietro
  18. Macroprudential capital buffers in heterogeneous banking networks: Insights from an ABM with liquidity crises By Gurgone, Andrea; Iori, Giulia
  19. Back to the past: the historical roots of labour-saving automation By Jacopo Staccioli; Maria Enrica Virgillito
  20. Text-Based Linkages and Local Risk Spillovers in the Equity Market By Ge, S.
  21. Application of text mining to the analysis of climate-related disclosures By Ángel Iván Moreno; Teresa Caminero
  22. Interpreting Big Data in the Macro Economy: A Bayesian Mixed Frequency Estimator By David Kohns; Arnab Bhattacharjee
  23. Predicting the price of second-hand vehicles using data mining techniques By Jafari Kang, Masood; Zohoori, Sepideh; Abbasi, Elahe; Li, Yueqing; Hamidi, Maryam
  24. Financial Conditions and Economic Activity: Insights from Machine Learning By Michael T. Kiley
  25. Nowcasting business cycle turning points with stock networks and machine learning By Azqueta-Gavaldon, Andres; Hirschbühl, Dominik; Onorante, Luca; Saiz, Lorena
  26. Conducting Regression-based Causal Mediation Analysis Using the R Package "regmedint" By Yoshida, Kazuki; Mathur, Maya B; Glynn, Robert J.
  27. Do words hurt more than actions? The impact of trade tensions on financial markets By Ferrari, Massimo Minesso; Pagliari, Maria Sole; Kurcz, Frederik
  28. The gender-dependent structure of wages in Hungary: results using machine learning techniques By Olga Takács; János Vincze

  1. By: Jorge González Cázares; Aleksandar Mijatović
    Abstract: We develop a computational method for expected functionals of the drawdown and its duration in exponential Lévy models. It is based on a novel simulation algorithm for the joint law of the state, the supremum, and the time at which the supremum is attained for the Gaussian approximation of a general Lévy process. We bound the bias for various locally Lipschitz and discontinuous payoffs arising in applications and analyse the computational complexities of the corresponding Monte Carlo and multilevel Monte Carlo estimators. Monte Carlo methods for Lévy processes (using Gaussian approximation) have previously been analysed for Lipschitz payoffs, in which case the computational complexity of our algorithm is up to two orders of magnitude smaller when the jump activity is high. At the core of our approach are bounds on certain Wasserstein distances, obtained via the novel SBG coupling between a Lévy process and its Gaussian approximation. Numerical performance, based on the implementation in the dedicated GitHub repository, exhibits good agreement with our theoretical bounds.
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2011.06618&r=all
  2. By: Takayuki Sakuma
    Abstract: Use of the deep quantum neural network proposed by Beer et al. (2020) could grant new perspectives on solving numerical problems arising in the field of finance. We discuss this potential in the context of simple experiments, such as learning implied volatilities and the differential machine learning approach proposed by Huge and Savine (2020). We consider the deep quantum neural network a promising candidate for developing highly powerful methods in finance.
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2011.07319&r=all
  3. By: Seyed Mohammad Sina Seyfi; Azin Sharifi; Hamidreza Arian
    Abstract: Monte Carlo approaches for calculating Value-at-Risk (VaR) are powerful tools widely used by financial risk managers across the globe. However, they are time-consuming and sometimes inaccurate. In this paper, we introduce a fast and accurate Monte Carlo algorithm for calculating VaR and Expected Shortfall (ES) based on Gaussian Mixture Models (GMMs). Gaussian Mixture Models can cluster input data with respect to market conditions, so no correlation matrices are needed for the risk computation. Sampling from each cluster according to its weight and then calculating volatility-adjusted stock returns yields possible scenarios for asset prices. Our results on a sample of US stocks show that the GMM-based VaR model is computationally efficient and accurate. From a managerial perspective, our model can efficiently mimic the turbulent behavior of the market. As a result, our VaR measures before, during and after crisis periods realistically reflect the highly non-normal behavior and non-linear correlation structure of the market.
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2011.07994&r=all
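    The sample-then-quantile logic described in the abstract can be sketched in a few lines. The mixture weights, means and volatilities below are illustrative placeholders, not the parameters the paper estimates from market data via a GMM fit:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-regime mixture of daily returns (illustrative values).
weights = np.array([0.9, 0.1])        # calm vs. turbulent market regime
mus = np.array([0.0005, -0.002])      # daily mean return per regime
sigmas = np.array([0.01, 0.03])       # daily volatility per regime

def simulate_mixture_returns(n):
    """Draw a regime for each scenario, then a Gaussian return from it."""
    regimes = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(mus[regimes], sigmas[regimes])

def var_es(returns, alpha=0.99):
    """Empirical Value-at-Risk and Expected Shortfall at confidence alpha."""
    losses = np.sort(-returns)[::-1]              # losses, largest first
    k = int(np.ceil((1 - alpha) * len(losses)))   # number of tail scenarios
    return losses[k - 1], losses[:k].mean()

scenarios = simulate_mixture_returns(100_000)
var99, es99 = var_es(scenarios)
```

Because each scenario is drawn regime by regime, no cross-asset correlation matrix is needed at this stage, which is the computational shortcut the abstract highlights.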
  4. By: Gabler, Janos (IZA); Raabe, Tobias (quantilope); Röhrl, Klara (University of Bonn)
    Abstract: Governments worldwide are adopting nuanced policy measures to reduce the number of Covid-19 cases at minimal social and economic cost. Epidemiological models have a hard time predicting the effects of such fine-grained policies. We propose a novel simulation-based model to address this shortcoming. We build on state-of-the-art agent-based simulation models but replace the way contacts between susceptible and infected people take place. First, we allow for heterogeneity in the types of contacts (e.g. recurrent or random) and in the infectiousness of each contact type. Second, we strictly separate the number of contacts from the probability that a contact leads to an infection: the number of contacts changes with social distancing policies, while the infection probabilities remain invariant. This allows us to model many types of fine-grained policies that cannot easily be incorporated into other models. To validate our model, we show that it can accurately predict the effect of the German November lockdown even though no similar policy appears in the time series used to estimate the model parameters.
    Keywords: COVID-19, agent based simulation model, public health measures
    JEL: C63 I18
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp13899&r=all
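    The central modelling choice, separating policy-dependent contact numbers from invariant per-contact infection probabilities, can be illustrated with a toy simulation. All parameter values here are invented for illustration and are not the paper's calibration:

```python
import numpy as np

# Per-contact infection probabilities stay fixed; a policy rescales only
# the number of contacts of each type. Values are illustrative.
INFECT_PROB = {"recurrent": 0.05, "random": 0.02}
BASE_CONTACTS = {"recurrent": 3, "random": 5}

def simulate(policy=None, n_agents=10_000, n_days=60, seed=1):
    """Toy SIR-style simulation; `policy` scales contacts per contact type."""
    rng = np.random.default_rng(seed)
    policy = policy or {"recurrent": 1.0, "random": 1.0}
    infected = np.zeros(n_agents, dtype=bool)
    infected[:50] = True                         # seed infections
    recovered = np.zeros(n_agents, dtype=bool)
    for _ in range(n_days):
        share_inf = (infected & ~recovered).mean()
        # Chance a susceptible agent escapes infection across all contacts.
        p_escape = 1.0
        for ctype, base in BASE_CONTACTS.items():
            n_contacts = base * policy[ctype]
            p_escape *= (1 - INFECT_PROB[ctype] * share_inf) ** n_contacts
        new_inf = ~infected & (rng.random(n_agents) > p_escape)
        recovered |= infected & (rng.random(n_agents) < 0.1)  # recovery
        infected |= new_inf
    return int(infected.sum())

baseline = simulate()
# A distancing policy: random contacts cut to 20%, recurrent ones unchanged.
distancing = simulate(policy={"recurrent": 1.0, "random": 0.2})
```

Changing the policy dictionary changes only contact counts; the infection probabilities never move, which is what lets such a model express fine-grained interventions.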
  5. By: Doikov, Nikita (Université catholique de Louvain); Nesterov, Yurii (Université catholique de Louvain, LIDAM/CORE, Belgium)
    Abstract: In this paper, we develop new affine-invariant algorithms for solving composite convex minimization problems with bounded domain. We present a general framework of Contracting-Point methods, which solve at each iteration an auxiliary subproblem restricting the smooth part of the objective function onto a contraction of the initial domain. This framework provides us with a systematic way of developing optimization methods of different order, endowed with global complexity bounds. We show that, using an appropriate affine-invariant smoothness condition, it is possible to implement one iteration of the Contracting-Point method by one step of the pure tensor method of degree p ≥ 1. The resulting global rate of convergence in functional residual is then O(1/k^p), where k is the iteration counter. Importantly, all constants in our bounds are affine-invariant. For p = 1, our scheme recovers the well-known Frank-Wolfe algorithm, giving it a new interpretation from the general perspective of tensor methods. Finally, within our framework, we present an efficient implementation and total complexity analysis of the inexact second-order scheme (p = 2), called the Contracting Newton method. It can be seen as a proper implementation of the trust-region idea. Preliminary numerical results confirm its good practical performance both in the number of iterations and in computational time.
    Keywords: Convex Optimization, Frank-Wolfe algorithm, Newton method, Tensor Methods, Global Complexity Bounds
    Date: 2020–09–17
    URL: http://d.repec.org/n?u=RePEc:cor:louvco:2020029&r=all
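    For p = 1 the scheme reduces to the classical Frank-Wolfe method, which can be sketched on a synthetic least-squares problem over the probability simplex; the data and the 2/(k+2) step-size schedule are textbook defaults, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(30, 10))          # synthetic problem data
b = rng.normal(size=30)

def f(x):
    """Objective ||Ax - b||^2 over the probability simplex."""
    r = A @ x - b
    return r @ r

x = np.full(10, 0.1)                   # start at the simplex barycentre
f_start = f(x)
for k in range(2000):
    grad = 2 * A.T @ (A @ x - b)
    i = int(np.argmin(grad))           # linear oracle: best simplex vertex
    s = np.zeros(10)
    s[i] = 1.0
    gamma = 2 / (k + 2)                # classical O(1/k) step-size schedule
    x = (1 - gamma) * x + gamma * s    # convex combination keeps x feasible
```

Each iteration only calls a linear minimization oracle (here, picking a vertex), which is what makes the method affine-invariant and projection-free.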
  6. By: Alexey Averkin; Sergey Yarushev
    Abstract: In this paper we consider approaches to time series forecasting based on deep neural networks and neuro-fuzzy networks, and briefly review research on forecasting with various ANFIS models. Deep learning has proven to be an effective method for making highly accurate predictions from complex data sources. We propose our own deep learning and neuro-fuzzy network models for this task and show how these models can be applied to data science problems. The paper also presents an overview of approaches for incorporating rule-based methodology into deep learning neural networks.
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2011.05588&r=all
  7. By: Tao Chen; Jiyoun Myung
    Abstract: In this paper we propose a new methodology for solving a discrete-time stochastic Markovian control problem under model uncertainty. By utilizing the Dirichlet process, we model the unknown distribution of the underlying stochastic process as a random probability measure and achieve online learning in a Bayesian manner. Our approach integrates optimization and dynamic learning. When dealing with model uncertainty, the nonparametric framework allows us to avoid the model misspecification that usually occurs in other classical control methods. We then develop a numerical algorithm to handle the infinite-dimensional state space in this setup, using Gaussian process surrogates to obtain a functional representation of the value function in the Bellman recursion. We also build separate surrogates for the optimal control to eliminate repeated optimizations on out-of-sample paths and speed up computation. Finally, we demonstrate the financial advantages of the nonparametric Bayesian framework over parametric approaches such as the strong robust and time-consistent adaptive approaches.
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2011.04804&r=all
  8. By: Ao Kong; Robert Azencott; Hongliang Zhu
    Abstract: This paper extends the work of Boudt and Petitjean (2014) and investigates trading patterns before price jumps in the stock market based on a new multivariate time series classification technique. Unlike Boudt and Petitjean (2014), our analysis scheme can explore the time-series information embedded in trading-related attributes and provides a set of jump indicators for abnormal pattern recognition. In addition to the commonly used liquidity measures, our analysis also involves a set of technical indicators that describe micro-trading behaviors. An empirical study is conducted on level-2 data for the constituent stocks of the China Securities Index 300. We find that, among all candidate attributes, several volume- and volatility-related attributes exhibit the most significant abnormality before price jumps. Some of these abnormalities start only shortly before the jumps occur, while others start much earlier. We also find that most of our attributes have low mutual dependence from a time-series perspective, which allows market trading behavior to be studied from multiple angles. Our experiments yield a set of jump indicators that can effectively detect stocks with extremely abnormal trading behavior before price jumps. More importantly, our study offers a new framework and potentially useful directions for trading-related pattern recognition problems using time series classification techniques.
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2011.04939&r=all
  9. By: Oren Barkan; Itamar Caspi; Allon Hammer; Noam Koenigstein
    Abstract: We present a hierarchical architecture based on Recurrent Neural Networks (RNNs) for predicting disaggregated inflation components of the Consumer Price Index (CPI). While most existing research focuses on predicting headline inflation, many economic and financial entities are more interested in its disaggregated components. To this end, we developed the novel Hierarchical Recurrent Neural Network (HRNN) model, which utilizes information from higher levels in the CPI hierarchy to improve predictions at the more volatile lower levels. Our evaluations, based on a large dataset from the US CPI-U index, indicate that the HRNN model significantly outperforms a vast array of well-known inflation prediction baselines.
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2011.07920&r=all
  10. By: Ben Hambly; Renyuan Xu; Huining Yang
    Abstract: We explore reinforcement learning methods for finding the optimal policy in the linear quadratic regulator (LQR) problem. In particular, we consider the convergence of policy gradient methods in the setting of known and unknown parameters. We are able to produce a global linear convergence guarantee for this approach in the setting of finite time horizon and stochastic state dynamics under weak assumptions. The convergence of a projected policy gradient method is also established in order to handle problems with constraints. We illustrate the performance of the algorithm with two examples. The first example is the optimal liquidation of a holding in an asset. We show results for the case where we assume a model for the underlying dynamics and where we apply the method to the data directly. The empirical evidence suggests that the policy gradient method can learn the global optimal solution for a larger class of stochastic systems containing the LQR framework and that it is more robust with respect to model mis-specification when compared to a model-based approach. The second example is an LQR system in a higher dimensional setting with synthetic data.
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2011.10300&r=all
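    The idea of running policy gradient directly on the feedback gain of an LQR problem can be illustrated with a scalar toy example; the finite-difference gradient and all parameter values below are simplifications for illustration, not the authors' algorithm:

```python
import numpy as np

# Toy scalar LQR: x_{t+1} = a x_t + b u_t, stage cost q x^2 + r u^2,
# linear policy u = -k x. Parameters are illustrative.
a, b, q, r = 1.1, 1.0, 1.0, 0.1
T, x0 = 20, 1.0

def cost(k):
    """Total finite-horizon cost of the linear feedback policy u = -k x."""
    x, total = x0, 0.0
    for _ in range(T):
        u = -k * x
        total += q * x**2 + r * u**2
        x = a * x + b * u
    return total + q * x**2             # terminal-state cost

# Plain gradient descent on the gain k, with a finite-difference gradient.
k, lr, eps = 0.0, 1e-4, 1e-5
for _ in range(500):
    g = (cost(k + eps) - cost(k - eps)) / (2 * eps)
    k -= lr * g
```

Starting from the unstable open loop (k = 0, closed-loop factor a = 1.1 > 1), the descent moves the gain into the stabilizing region and drives the cost down by orders of magnitude.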
  11. By: Máximo Camacho (University of Murcia and BBVA Research); Matías Pacce (Banco de España); Gabriel Pérez-Quirós (European Central Bank and CEPR)
    Abstract: To analyze the international transmission of business cycle fluctuations, we propose a new multilevel dynamic factor model with a block structure that (i) does not restrict the factors to being orthogonal and (ii) mixes data sampled at quarterly and monthly frequencies. By means of Monte Carlo simulations, we show the high performance of the model in computing inferences of the unobserved factors, accounting for the spillover effects, and estimating the model’s parameters. We apply our proposal to data from the G7 economies by analyzing the responses of national factors to shocks in foreign factors and by quantifying the changes in national GDP expectations in response to unexpected positive changes in foreign GDPs. Although the share of the world factor as a source of the international transmission of fluctuations is still significant, this is partially absorbed by the spillover transmissions. In addition, we document a pro-cyclical channel of international transmission of output growth expectations, with the US and UK being the countries that generate the greatest spillovers and Germany and Japan being the countries that generate the smallest spillovers. Therefore, policymakers should closely monitor the evolution of foreign business cycle expectations.
    Keywords: international business cycles, mixed frequency data, Bayesian estimation, spillover effects
    JEL: E32 C22 F42 F41
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:bde:wpaper:2034&r=all
  12. By: Goutham Gopalakrishna (Swiss Finance Institute (EPFL); Ecole Polytechnique Fédérale de Lausanne)
    Abstract: What causes deep recessions and slow recovery? I revisit this question and develop a macro-finance asset pricing model that quantitatively matches the salient empirical features of financial crises such as a large drop in the output, a high risk premium, reduced financial intermediation, and a long duration of economic distress. The model features leveraged intermediaries who are subjected to both capital and productivity shocks, and face a regime-dependent exit rate. I show that the model without time varying intermediary productivity and exit, which reduces to Brunnermeier and Sannikov (2016), suffers from a tension between the amplification and the persistence of financial crises. In particular, there is a trade-off between the unconditional risk premium, the conditional risk premium, and the probability and duration of crisis. Features that generate high financial amplification also induce faster recovery, at odds with the data. I show that my model resolves this tension and generates realistic crises dynamics. The model is solved using a novel numerical method with active machine learning that is scalable and alleviates the curse of dimensionality.
    Keywords: Financial Intermediation, Intermediary Asset Pricing, Machine Learning
    JEL: E44 G12 C63
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:chf:rpseri:rp2096&r=all
  13. By: Marcos Vinícius dos Santos Araújo
    Abstract: This work addresses the use of geometric Brownian motion to simulate the prices of shares listed in the Small Caps index of the Brazilian stock exchange B3 (Brasil, Bolsa, Balcão). The data used refer to the price history from January 2016 to December 2018; the 2019 price history was used for comparison with the simulated prices. The data were imported from the Yahoo Finance database using the Python programming language, and the simulations were performed for each stock individually and for portfolios formed on the basis of expected return, risk and the Sharpe index. The results were better for portfolios with higher returns, lower risks and higher Sharpe indexes.
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2011.08128&r=all
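    The geometric Brownian motion simulation underlying the paper can be sketched as follows; the drift and volatility are illustrative values, not estimates from B3 data:

```python
import numpy as np

rng = np.random.default_rng(42)

# GBM: dS = mu S dt + sigma S dW, simulated via its exact discretisation
# S_{t+dt} = S_t * exp((mu - sigma^2/2) dt + sigma sqrt(dt) Z).
s0, mu, sigma = 100.0, 0.10, 0.30        # illustrative annual drift and vol
n_paths, n_steps, horizon = 1_000, 252, 1.0
dt = horizon / n_steps

z = rng.standard_normal((n_paths, n_steps))
log_increments = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
paths = s0 * np.exp(np.cumsum(log_increments, axis=1))

expected_terminal = s0 * np.exp(mu * horizon)   # E[S_T] under GBM
simulated_mean = paths[:, -1].mean()
```

Because the discretisation is exact in log-space, the simulated terminal prices are lognormal and their sample mean should sit close to s0·e^{mu·T}.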
  14. By: Debnath, R.; Bardhan, R.; Darby, S.; Mohaddes, K.; Sunikka-Blank, M.; Coelho, A. C. V.; Isa, A.
    Abstract: Slum rehabilitation housing (SRH) comprises critical transitional spaces in urban informality, with deep-rooted implications for poverty alleviation efforts. However, the current literature reports systemic injustices in SRH in access to essential services, including energy injustices. This study investigated distributive injustices in SRH across three cities, Abuja, Mumbai and Rio de Janeiro, developing ‘energy cultures’ narratives. It employed a computational social science methodology that used textual analysis, followed by a constructivist grounded-theoretic approach, to inform just policy design. The analysis was performed at two scales to identify and contrast injustices in the study areas. At the aggregated scale, commonalities centred on the poor design of the built environment, administrative lags of the utilities and high electricity bills. Case-study-specific results showed that poverty penalties were linked to the energy cultures of each SRH. In the Mumbai case, poverty penalties were associated with the aspirational purchase of household appliances following the move from slums to SRH. The Abuja case showed that low power quality and load shedding frequently damaged appliances, increasing maintenance costs for the occupants. In the Rio de Janeiro SRH, injustices were embedded through the adoption of inefficient appliances received as charity from higher-income households. Fuel stacking was also observed in the SRH, illustrating cultural identities associated with cooking energy. We conclude that just policy design should consider the socio-cultural context of the built environment, improve utility governance and promote a cleaner fuel mix at the household level.
    Keywords: Energy justice, poverty, computational social science, energy cultures, machine learning
    Date: 2020–11–11
    URL: http://d.repec.org/n?u=RePEc:cam:camdae:20101&r=all
  15. By: Melvyn Weeks; Tobias Gabel Christiansen
    Abstract: Various poverty reduction strategies are being implemented in the pursuit of eliminating extreme poverty. One such strategy is increased access to microcredit in poor areas around the world. Microcredit, typically defined as the supply of small loans to underserved entrepreneurs and originally aimed at displacing expensive local money-lenders, has been both praised and criticized as a development tool (Banerjee et al., 2015b). This paper presents an analysis of heterogeneous impacts from increased access to microcredit using data from three randomised trials. Recognising that the impact of a policy intervention generally varies conditional on an unknown set of factors, we investigate in particular whether heterogeneity presents itself as groups of winners and losers, and whether such subgroups share characteristics across RCTs. We find no evidence of impacts, either average or distributional, from increased access to microcredit on consumption levels. In contrast, the lack of average effects on profits seems to mask heterogeneous impacts. The findings are, however, not robust to the specific machine learning algorithm applied. Switching from the better-performing Elastic Net to the worse-performing Random Forest leads to a sharp increase in the variance of the estimates. In this context, the methods of Chernozhukov et al. (2019) for evaluating the relative performance of machine learning algorithms provide a disciplined way for the analyst to address the uncertainty over which algorithm to deploy.
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2011.10509&r=all
  16. By: Jiang, Zehui; Liu, Chao; Ganapathysubramanian, Baskar; Hayes, Dermot J.; Sarkar, Soumik
    Abstract: Maize (corn) is the dominant grain grown in the world. Total maize production in 2018 equaled 1.12 billion tons. Maize is used primarily as an animal feed in the production of eggs, dairy, pork and chicken. The US produces 32% of the world’s maize followed by China at 22% and Brazil at 9% (https://apps.fas.usda.gov/psdonline/app/index.html#/app/home). Accurate national-scale corn yield prediction critically impacts mercantile markets by providing essential information about expected production prior to harvest. Publicly available high-quality corn yield prediction can help address emergent information asymmetry problems and in doing so improve price efficiency in futures markets. We build a deep learning model to predict corn yields, specifically focusing on county-level prediction across 10 states of the Corn Belt in the United States, and pre-harvest prediction with monthly updates from August. The results show promising predictive power relative to existing survey-based methods and set the foundation for a publicly available county yield prediction effort that complements existing public forecasts.
    Date: 2020–09–11
    URL: http://d.repec.org/n?u=RePEc:isu:genstf:202009110700001775&r=all
  17. By: Emanuela Ciapanna (Bank of Italy); Sauro Mocetti (Bank of Italy); Alessandro Notarpietro (Bank of Italy)
    Abstract: This paper quantifies the macroeconomic effects of three major structural reforms (i.e., service sector liberalizations, incentives to innovation and civil justice reforms) undertaken in Italy in the last decade. We employ a novel approach that estimates the impact of each reform on total factor productivity and markups in an empirical micro setting and that uses these estimates in a structural general equilibrium model to simulate the macroeconomic impact of the reforms. Our results indicate that, accounting for estimation uncertainty, the increase in the level of GDP as of 2019 due to the sole effect of these reforms (ignoring all the other shocks that the Italian economy suffered in the same period) would be between 3% and 6%. The long-run increase in Italy's potential output would lie between 4% and 8%, with non-negligible effects on the labor market.
    Keywords: structural reforms, DSGE models, liberalization, innovation, civil justice
    JEL: E10 E20 J60 K40 L50 O30
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:bdi:wptemi:td_1303_20&r=all
  18. By: Gurgone, Andrea; Iori, Giulia
    Abstract: To date, macroprudential policy inspired by the Basel III package has been applied irrespective of the network characteristics of the banking system. We study how the implementation of macroprudential policy, in the form of additional capital requirements conditional on systemic-risk measures of banks, should take into account the degree of heterogeneity of financial networks. We adopt a multi-agent approach describing an artificial economy with households, firms and banks in which occasional liquidity crises emerge. We shape the configuration of the financial network to generate two polar worlds: one is characterized by a few banks that lend most of the credit to the real sector while borrowing interbank liquidity; the other shows a higher degree of homogeneity. We focus on a capital buffer for systemically important institutions (SIIs) and on two buffers built on measures of systemic impact and vulnerability. The research suggests that the criteria for identifying systemically important banks may change with network heterogeneity. Capital buffers should therefore be calibrated to the heterogeneity of the financial network to stabilize the system; otherwise they may be ineffective. We argue that prudential regulation should account for the characteristics of banking networks and tune macroprudential tools accordingly.
    Keywords: agent-based model, capital requirements, capital buffers, financial networks, macroprudential policy, systemic risk
    JEL: C63 D85 E44 G01 G21
    Date: 2020
    URL: http://d.repec.org/n?u=RePEc:zbw:bamber:164&r=all
  19. By: Jacopo Staccioli; Maria Enrica Virgillito
    Abstract: This paper, relying on a still relatively unexplored long-term dataset on U.S. patenting activity, provides empirical evidence on the history of labour-saving innovations back to the early 19th century. The identification of mechanisation/automation heuristics, retrieved via textual content analysis of current robotic technologies by Montobbio et al. (2020), allows us to focus on a limited set of CPC codes where mechanisation and automation technologies are most prevalent. We track their evolution over time, their clustering, the eventual emergence of wavy behaviour, and their comovements with long-term GDP growth. Our results challenge both the general-purpose technology approach and the strict 50-year Kondratiev cycle, while providing evidence of the emergence of erratic constellations of heterogeneous technological artefacts, in line with the development-block approach enabled by autocatalytic systems.
    Keywords: Labour-Saving Technologies; Search Heuristics; Industrial Revolutions; Wavelet analysis.
    Date: 2020–11–23
    URL: http://d.repec.org/n?u=RePEc:ssa:lemwps:2020/34&r=all
  20. By: Ge, S.
    Abstract: This paper uses extensive text data to construct firms' links via which local shocks transmit. Using the novel text-based linkages, I estimate a heterogeneous spatial-temporal model which accommodates the contemporaneous and dynamic spillover effects at the same time. I document a considerable degree of local risk spillovers in the market plus sector hierarchical factor model residuals of S&P 500 stocks. The method is found to outperform various previously studied methods in terms of out-of-sample fit. Network analysis of the spatial-temporal model identifies the major systemic risk contributors and receivers, which are of particular interest to microprudential policies. From a macroprudential perspective, a rolling-window analysis reveals that the strength of local risk spillovers increases during periods of crisis, when, on the other hand, the market factor loses its importance.
    Keywords: Excess co-movement, weak and strong cross-sectional dependence, local risk spillovers, networks, textual analysis, big data, systemic risk, heterogeneous spatial auto-regressive model (HSAR)
    JEL: C33 C58 G10 G12
    Date: 2020–11–26
    URL: http://d.repec.org/n?u=RePEc:cam:camdae:20115&r=all
  21. By: Ángel Iván Moreno (Banco de España); Teresa Caminero (Banco de España)
    Abstract: In this article we apply text mining techniques to analyse compliance with the TCFD recommendations on climate-related disclosures by the 12 significant Spanish financial institutions, using publicly available corporate reports from 2014 to 2019. Applying our domain knowledge, we first create a taxonomy of concepts present in disclosures associated with each of the four areas described in the TCFD recommendations. This taxonomy is then linked together by a set of rules, in query form, over selected concepts. The queries are crafted to identify the excerpts most likely to relate to each of the TCFD’s 11 recommended disclosures. By applying these rules we estimate a TCFD compliance index for each of the four main areas over the period 2014-2019 using corporate reports in Spanish. We also describe some challenges in analysing climate-related disclosures. The index gives an overview of the evolution of the level of climate-related financial disclosures in the corporate reports of the Spanish banking sector. The results indicate that the quantity of climate-related disclosures reported by the banking sector is growing each year. Our study also suggests that some disclosures appear only in reports other than annual and ESG reports, such as Pillar 3 reports or reports on the remuneration of directors.
    Keywords: sustainability, sustainability data gaps, text mining, TCFD, Taxonomy and Ontology Management
    JEL: C81 G32 Q54
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:bde:wpaper:2035&r=all
  22. By: David Kohns; Arnab Bhattacharjee (Centre for Energy Economics Research and Policy, Heriot-Watt University)
    Abstract: Big Data sources such as Google Trends are increasingly being used to augment nowcast models. An often neglected issue in the previous literature, especially pertinent in policy environments, is the interpretability of the Big Data source included in the model. We provide a Bayesian modeling framework that handles all the usual econometric issues involved in combining Big Data with traditional macroeconomic time series, such as mixed frequencies and ragged edges, while remaining computationally simple and allowing a high degree of interpretability. In our model, we explicitly account for the possibility that the Big Data and macroeconomic datasets have different degrees of sparsity. We test our methodology by investigating whether Google Trends in real time improve nowcasts of US real GDP growth relative to traditional macroeconomic time series. We find that search terms improve both point forecast accuracy and forecast density calibration, not only before official information is released but also later into the GDP reference quarter. Our transparent methodology shows that the increased fit stems from search terms acting as early warning signals of large turning points in GDP.
    Keywords: Big Data; Machine Learning; Interpretability; Illusion of Sparsity; Density Nowcast; Google Search Terms
    JEL: C31 C53
    Date: 2019–10
    URL: http://d.repec.org/n?u=RePEc:hwc:wpaper:010&r=all
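A heavily simplified, non-Bayesian sketch of the mixed-frequency idea in this abstract: monthly search data are averaged up to quarterly frequency and combined with a quarterly macro predictor in a ridge-penalized regression. The authors' actual model is a Bayesian framework with differing sparsity priors; everything below, including the penalty and the two-predictor setup, is illustrative only.

```python
def quarterly_mean(monthly):
    """Aggregate a monthly series to quarterly by averaging triples
    (assumes the series covers whole quarters)."""
    return [sum(monthly[i:i + 3]) / 3 for i in range(0, len(monthly), 3)]

def ridge_2var(x1, x2, y, lam=0.1):
    """Solve (X'X + lam*I) b = X'y for two predictors, no intercept."""
    a11 = sum(v * v for v in x1) + lam
    a22 = sum(v * v for v in x2) + lam
    a12 = sum(u * v for u, v in zip(x1, x2))
    r1 = sum(u * v for u, v in zip(x1, y))
    r2 = sum(u * v for u, v in zip(x2, y))
    det = a11 * a22 - a12 * a12
    return (a22 * r1 - a12 * r2) / det, (a11 * r2 - a12 * r1) / det
```

Here x1 would be the quarterly macro series and x2 the quarterly-averaged search index; a nowcast is then `b1 * x1[t] + b2 * x2[t]` for the current quarter.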
  23. By: Jafari Kang, Masood; Zohoori, Sepideh; Abbasi, Elahe; Li, Yueqing; Hamidi, Maryam
    Abstract: Electronic commerce, known as “e-commerce”, has grown rapidly in recent years, making it possible to record information such as price, location, customer reviews, search history, discount options, competitors’ prices, and so on. With access to such a rich source of data, companies can analyze their users’ behavior to improve customer satisfaction as well as revenue. This study aims to estimate the price of used light vehicles on a commercial website, Divar, which is a popular website in Iran for trading second-hand goods. First, highlighted features were extracted from the description column using three methods: Bag of Words (BOW), Latent Dirichlet Allocation (LDA), and Hierarchical Dirichlet Process (HDP). Second, a multiple linear regression model was fit to predict the product price based on its attributes and the highlighted features. The Actuals-Predictions Correlation accuracy index, the min-max index, and MAPE were used to validate the proposed methods. Results showed that the BOW model is the best model, with an adjusted R-squared of 0.7841.
    Keywords: Text mining, Topic modeling, BOW, LDA, HDP, Linear regression
    JEL: C5 C8 Y10
    Date: 2019–11–08
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:103933&r=all
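Two of the validation metrics named in this abstract have simple closed forms; the Python sketch below computes MAPE and the min-max accuracy index for a vector of actual vs. predicted prices. The feature-extraction (BOW/LDA/HDP) and regression steps are omitted.

```python
def mape(actual, predicted):
    """Mean absolute percentage error (as a fraction; assumes actual > 0)."""
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

def min_max_accuracy(actual, predicted):
    """Average of min(actual, pred) / max(actual, pred); 1.0 is a perfect fit."""
    return sum(min(a, p) / max(a, p) for a, p in zip(actual, predicted)) / len(actual)
```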
  24. By: Michael T. Kiley
    Abstract: Machine learning (ML) techniques are used to construct a financial conditions index (FCI). The components of the ML-FCI are selected based on their ability to predict the unemployment rate one year ahead. Three lessons for macroeconomics and variable selection/dimension reduction with large datasets emerge. First, variable transformations can drive results, emphasizing the need for transparency in the selection of transformations and robustness to a range of reasonable choices. Second, there is strong evidence of nonlinearity in the relationship between financial variables and economic activity: tight financial conditions are associated with sharp deteriorations in economic activity, while accommodative conditions are associated with only modest improvements in activity. Finally, the ML-FCI places sizable weight on equity prices and term spreads, in contrast to other measures. These lessons yield an ML-FCI showing tightening in financial conditions before the early 1990s and early 2000s recessions, in contrast to the National Financial Conditions Index (NFCI).
    Keywords: Big Data; Recession Prediction; Variable Selection
    JEL: E50 E17 C55 E44
    Date: 2020–11–16
    URL: http://d.repec.org/n?u=RePEc:fip:fedgfe:2020-95&r=all
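A financial conditions index of the kind this abstract describes is, at its core, a weighted combination of standardized financial indicators. The sketch below is a minimal illustration; the indicator names and weights are hypothetical, and in the paper the weights come from the ML variable-selection step rather than being supplied by hand.

```python
def standardize(series):
    """Z-score a series (assumes it is non-constant)."""
    m = sum(series) / len(series)
    sd = (sum((v - m) ** 2 for v in series) / len(series)) ** 0.5
    return [(v - m) / sd for v in series]

def fci(indicators, weights):
    """Weighted sum of standardized indicator series, period by period."""
    cols = {name: standardize(vals) for name, vals in indicators.items()}
    n = len(next(iter(cols.values())))
    return [sum(weights[name] * cols[name][t] for name in cols) for t in range(n)]
```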
  25. By: Azqueta-Gavaldon, Andres; Hirschbühl, Dominik; Onorante, Luca; Saiz, Lorena
    Abstract: We propose a granular framework that makes use of advanced statistical methods to approximate developments in economy-wide expected corporate earnings. In particular, we evaluate the dynamic network structure of stock returns in the United States as a proxy for the transmission of shocks through the economy and identify node positions (firms) whose connectedness provides a signal for economic growth. The nowcasting exercise, with both in-sample and out-of-sample consistent feature selection, highlights which firms are contemporaneously exposed to aggregate downturns and provides a more complete narrative than is usually available from more aggregate data. The two-state model for predicting periods of negative growth predicts future states remarkably well by using information derived from the node positions of manufacturing, transportation and financial (particularly insurance) firms. The three-state model, which identifies high, low and negative growth, successfully predicts economic regimes by making use of information from the financial, insurance, and retail sectors.
    Keywords: early warning signal, Granger-causality networks, real-time, turning point prediction
    JEL: C45 C51 D85 E32 N1
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20202494&r=all
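The "node position" signal in this abstract rests on measuring how connected each firm is in a directed Granger-causality network of stock returns. A minimal sketch of one such connectedness measure, out-degree centrality, is below; the firms and edges are hypothetical, and the paper's actual network statistics may differ.

```python
def out_degree_centrality(edges, nodes):
    """Out-degree of each node in a directed graph (edges are (src, dst)
    pairs), normalized by the maximum possible degree n - 1."""
    deg = {n: 0 for n in nodes}
    for src, _dst in edges:
        deg[src] += 1
    scale = len(nodes) - 1
    return {n: d / scale for n, d in deg.items()}
```

Firms whose returns Granger-cause many others score high and, per the abstract, carry the most signal for aggregate downturns.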
  26. By: Yoshida, Kazuki; Mathur, Maya B; Glynn, Robert J.
    Abstract: The R package regmedint is a complete implementation of regression-based causal mediation analysis.
    Date: 2020–11–14
    URL: http://d.repec.org/n?u=RePEc:osf:osfxxx:6c79f&r=all
  27. By: Ferrari, Massimo Minesso; Pagliari, Maria Sole; Kurcz, Frederik
    Abstract: In this paper, we apply textual analysis and machine learning algorithms to construct an index capturing trade tensions between the US and China. Our indicator matches well-known events in the US-China trade dispute and is exogenous to developments on global financial markets. By means of local projection methods, we show that US markets are largely unaffected by rising trade tensions, with the exception of those firms that are more exposed to China, while the same shock negatively affects stock market indices in EMEs and China. Higher trade tensions also entail: i) an appreciation of the US dollar; ii) a depreciation of EME currencies; iii) muted changes in safe-haven currencies; iv) portfolio re-balancing between stocks and bonds in EMEs. We also show that trade tensions account for around 15% of the variance of Chinese stocks, while their contribution is muted for US markets. These findings suggest that the US-China trade tensions are interpreted as a negative demand shock for the Chinese economy rather than as a global risk shock.
    Keywords: Exchange rates, Machine Learning, Stock indexes, Trade Shocks
    JEL: D53 E44 F13 F14 C55
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20202490&r=all
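The local projection method this abstract relies on estimates, for each horizon h, a separate regression of the outcome at t+h on the shock at t; the sequence of slope coefficients traces out the impulse response. A minimal univariate sketch (no controls, no lags, unlike a full specification):

```python
def ols_slope(x, y):
    """Simple OLS slope of y on x."""
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sum((a - mx) ** 2 for a in x)
    return num / den

def local_projection(shock, y, horizons):
    """One regression per horizon h: y[t+h] on shock[t]."""
    return [ols_slope(shock[: len(y) - h], y[h:]) for h in horizons]
```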
  28. By: Olga Takács (Corvinus University of Budapest, Hungary, H-1093 Budapest, Fővám Square 8); János Vincze (Corvinus University of Budapest, Hungary, H-1093 Budapest, Fővám Square 8, and Centre for Economic and Regional Studies, Institute of Economics (KRTK KTI))
    Abstract: This paper reports the results of a Blinder-Oaxaca style decomposition analysis of Hungarian matched employer-employee data to study the gender pay gap. We carry out the decomposition with Random Forest regressions. The raw gap over our horizon (2008-2016) is increasing, but we find that the wage-structure effects are rather stable; thus the rise in the gap is due to the disappearance of the formerly negative composition effects. Graphical analysis sheds light on interesting non-linear relationships, some of which can be readily interpreted in light of the previous literature. A Classification and Regression Tree analysis suggests that complicated interaction patterns exist in the data. We identify the segments of the Hungarian labour market that are most and least exposed to gender-dependent wage determination. Our findings lend support to the idea that an important part of the gender wage gap is attributable to monopsonistic competition with gender-dependent supply elasticities.
    Keywords: Gender pay gap, Blinder-Oaxaca decomposition, Random Forest regression, Hungary
    JEL: J7 J3 C14
    Date: 2020–11
    URL: http://d.repec.org/n?u=RePEc:has:discpr:2044&r=all
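The Blinder-Oaxaca decomposition this abstract builds on splits a mean wage gap into a composition (endowments) part and a wage-structure (coefficients) part. A textbook two-fold sketch with a single characteristic per group (the paper replaces the linear regressions with Random Forests, which this toy version does not capture):

```python
def oaxaca_two_fold(mean_x_a, beta_a, mean_x_b, beta_b):
    """Two-fold decomposition of the gap mean_x_a*beta_a - mean_x_b*beta_b,
    using group B's coefficients as the reference wage structure.
    Returns (composition effect, wage-structure effect)."""
    composition = (mean_x_a - mean_x_b) * beta_b
    structure = mean_x_a * (beta_a - beta_b)
    return composition, structure
```

The identity gap = composition + structure holds exactly: (xa - xb)*bb + xa*(ba - bb) = xa*ba - xb*bb.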

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.