New Economics Papers on Computational Economics
Issue of 2021‒03‒29
25 papers chosen by
By: | Charles F. Manski; Alan H. Sanstad; Stephen J. DeCanio |
Abstract: | Numerical simulations of the global climate system provide inputs to integrated assessment modeling for estimating the impacts of greenhouse gas mitigation and other policies to address global climate change. While essential tools for this purpose, computational climate models are subject to considerable uncertainty, including inter-model “structural” uncertainty. Structural uncertainty analysis has emphasized simple or weighted averaging of the outputs of multi-model ensembles, sometimes with subjective Bayesian assignment of probabilities across models. However, choosing appropriate weights is problematic. To use climate simulations in integrated assessment, we propose instead framing climate model uncertainty as a problem of partial identification, or “deep” uncertainty. This terminology refers to situations in which the underlying mechanisms, dynamics, or laws governing a system are not completely known and cannot be credibly modeled definitively even in the absence of data limitations in a statistical sense. We propose the min-max regret (MMR) decision criterion to account for deep climate uncertainty in integrated assessment without weighting climate model forecasts. We develop a theoretical framework for cost-benefit analysis of climate policy based on MMR, and apply it computationally with a simple integrated assessment model. We suggest avenues for further research. |
JEL: | D81 Q54 Q58 |
Date: | 2021–02 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:28449&r=all |
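Code sketch: | A minimal illustration of the min-max regret (MMR) criterion described above, applied to a small hypothetical payoff matrix of welfare outcomes for three mitigation policies under three climate models. The numbers and policy labels are illustrative assumptions, not values from the paper or its integrated assessment model.
```python
import numpy as np

# Hypothetical net-welfare outcomes for each candidate mitigation policy
# (rows) under each climate model's projection (columns). Illustrative only.
welfare = np.array([
    [10.0, 4.0, -2.0],   # weak mitigation
    [ 8.0, 7.0,  3.0],   # moderate mitigation
    [ 5.0, 6.0,  5.0],   # strong mitigation
])

# Regret of a policy under a model: shortfall from the best welfare
# achievable had that model been known to be the correct one.
best_per_model = welfare.max(axis=0)
regret = best_per_model - welfare

# Min-max regret: pick the policy whose worst-case regret across climate
# models is smallest; no probability weights over models are needed.
worst_case_regret = regret.max(axis=1)
mmr_policy = int(np.argmin(worst_case_regret))

print("regret matrix:\n", regret)
print("worst-case regret by policy:", worst_case_regret)
print("min-max regret policy index:", mmr_policy)
```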
By: | Javier Oliver Muncharaz (Universidad Politécnica de Valencia) |
Abstract: | In the financial literature, there is great interest in the prediction of stock prices. Stock prediction is necessary for the creation of different investment strategies, both speculative and hedging ones. The application of neural networks has involved a change in the creation of predictive models. In this paper, we analyze the capacity of recurrent neural networks, in particular the long short-term recurrent neural network (LSTM) as opposed to classic time series models such as the Exponential Smooth Time Series (ETS) and the Arima model (ARIMA). These models have been estimated for 284 stocks from the S&P 500 stock market index, comparing the MAE obtained from their predictions. The results obtained confirm a significant reduction in prediction errors when LSTM is applied. These results are consistent with other similar studies applied to stocks included in other stock market indices, as well as other financial assets such as exchange rates. |
Keywords: | S&P 500,Long short-term neural network,Recurrent Neural Network,Arima,Redes neuronales recurrentes |
Date: | 2020 |
URL: | http://d.repec.org/n?u=RePEc:hal:journl:hal-03149342&r=all |
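Code sketch: | A self-contained sketch of the comparison described above: fit ETS, ARIMA and a small LSTM on one series and compare hold-out MAE. The random-walk data, window length, network size and training settings are illustrative assumptions; the paper fits such models to 284 S&P 500 constituents.
```python
import numpy as np
from sklearn.metrics import mean_absolute_error
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.holtwinters import ExponentialSmoothing
from tensorflow import keras

rng = np.random.default_rng(0)
prices = 100 + np.cumsum(rng.normal(0.05, 1.0, 600))    # one synthetic "stock"
train, test = prices[:500], prices[500:]

# Classic time-series benchmarks
ets_fc = ExponentialSmoothing(train, trend="add").fit().forecast(len(test))
arima_fc = ARIMA(train, order=(1, 1, 1)).fit().forecast(len(test))

# Small LSTM trained on sliding windows of past prices
# (a real application would scale the inputs and tune the architecture)
window = 20
X = np.array([prices[i:i + window] for i in range(500 - window)])
y = prices[window:500]
model = keras.Sequential([
    keras.Input(shape=(window, 1)),
    keras.layers.LSTM(16),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mae")
model.fit(X[..., None], y, epochs=20, batch_size=32, verbose=0)

# Recursive one-step-ahead forecasts over the hold-out period
history = list(train[-window:])
lstm_fc = []
for _ in range(len(test)):
    x = np.array(history[-window:]).reshape(1, window, 1)
    pred = float(model.predict(x, verbose=0)[0, 0])
    lstm_fc.append(pred)
    history.append(pred)

for name, fc in [("ETS", ets_fc), ("ARIMA", arima_fc), ("LSTM", lstm_fc)]:
    print(name, "hold-out MAE:", round(mean_absolute_error(test, fc), 3))
```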
By: | Baptiste Barreau (MICS - Mathématiques et Informatique pour la Complexité et les Systèmes - CentraleSupélec, BNPP CIB GM Lab - BNP Paribas CIB Global Markets Data & AI Lab); Laurent Carlier (BNPP CIB GM Lab - BNP Paribas CIB Global Markets Data & AI Lab); Damien Challet (MICS - Mathématiques et Informatique pour la Complexité et les Systèmes - CentraleSupélec) |
Abstract: | We propose a novel deep learning architecture suitable for the prediction of investor interest for a given asset in a given timeframe. This architecture performs both investor clustering and modelling at the same time. We first verify its superior performance on a simulated scenario inspired by real data and then apply it to a large proprietary database from BNP Paribas Corporate and Institutional Banking. |
Keywords: | clustering,investor activity prediction,deep learning,neural networks,mixture of experts |
Date: | 2021–01–07 |
URL: | http://d.repec.org/n?u=RePEc:hal:journl:hal-02276055&r=all |
By: | Céline de Quatrebarbes (FERDI - Fondation pour les Etudes et Recherches sur le Développement International); Bertrand Laporte (CERDI - Centre d'Études et de Recherches sur le Développement International - CNRS - Centre National de la Recherche Scientifique - UCA - Université Clermont Auvergne); Stéphane Calipel (CERDI - Centre d'Études et de Recherches sur le Développement International - CNRS - Centre National de la Recherche Scientifique - UCA - Université Clermont Auvergne) |
Abstract: | As happened in West Africa in 2008, in a context of imported inflation it is common for governments to take short-term tax action to protect the poor: VAT or trade tariff exemptions. As part of the tax-tariff transition, the comparison between trade tariffs and VAT has already been the subject of much work. The introduction of VAT, as a tax on final consumption, is supposed to be optimal because it is economically neutral with respect to production decisions. However, some authors show that in developing countries a large informal sector affects this result. In this paper, we use a CGE model and a micro-simulation model to compare the effects of VAT and trade tariff exemptions in combating rising agricultural food prices.
Keywords: | computable general equilibrium model,imperfect competition,indirect taxes,poverty,Niger |
Date: | 2021–03–01 |
URL: | http://d.repec.org/n?u=RePEc:hal:wpaper:hal-03164636&r=all |
By: | Francesco Cusano (Bank of Italy); Giuseppe Marinelli (Bank of Italy); Stefano Piermattei (Bank of Italy) |
Abstract: | Ensuring and disseminating high-quality data is crucial for central banks to adequately support monetary analysis and the related decision-making process. In this paper we develop a machine learning process for identifying errors in banks’ supervisory reports on loans to the private sector employed in the Bank of Italy’s statistical production of Monetary and Financial Institutions’ (MFI) Balance Sheet Items (BSI). In particular, we model a “Revisions Adjusted – Quantile Regression Random Forest” (RA–QRRF) algorithm in which the predicted acceptance regions of the reported values are calibrated through an individual “imprecision rate” derived from the entire history of each bank’s reporting errors and revisions collected by the Bank of Italy. The analysis shows that our RA-QRRF approach returns very satisfying results in terms of error detection, especially for the loans to the households sector, and outperforms well-established alternative outlier detection procedures based on probit and logit models. |
Keywords: | banks, balance sheet items, outlier detection, machine learning |
JEL: | C63 C81 G21 |
Date: | 2021–03 |
URL: | http://d.repec.org/n?u=RePEc:bdi:opques:qef_611_21&r=all |
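Code sketch: | The paper's RA-QRRF combines quantile regression random forests with bank-specific "imprecision rates" estimated from reporting histories. scikit-learn has no quantile regression forest out of the box, so this sketch swaps in gradient boosting with a quantile loss to build an acceptance interval and widens it by a hypothetical per-bank imprecision rate; the data, features and the `imprecision_rate` variable are simulated placeholders, not the Bank of Italy's supervisory reports.
```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)
n = 2000
# Hypothetical report-level features (bank size, past level, seasonality, ...)
X = rng.normal(size=(n, 3))
clean = 50 + 10 * X[:, 0] + 5 * X[:, 1] + rng.normal(0, 2, n)   # validated loan values
reported = clean.copy()
bad = rng.choice(n, 20, replace=False)
reported[bad] += rng.normal(0, 40, 20)                          # injected reporting errors

# Acceptance region from conditional quantiles, fitted on validated data
# (a gradient-boosting stand-in for the quantile regression random forest).
lo = GradientBoostingRegressor(loss="quantile", alpha=0.05).fit(X, clean)
hi = GradientBoostingRegressor(loss="quantile", alpha=0.95).fit(X, clean)
lower, upper = lo.predict(X), hi.predict(X)

# "Revisions-adjusted" widening: banks with a noisier reporting history
# (hypothetical imprecision_rate in [0, 1]) get a wider acceptance band.
imprecision_rate = rng.uniform(0, 1, n)
half_width = (upper - lower) / 2
lower -= imprecision_rate * half_width
upper += imprecision_rate * half_width

flagged = (reported < lower) | (reported > upper)
print("reports flagged:", int(flagged.sum()),
      "of which injected errors:", int(flagged[bad].sum()))
```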
By: | Cathal O'Donoghue; Denisa M. Sologon; Iryna Kyzyma; John McHale |
Abstract: | This paper relies on a microsimulation framework to undertake an analysis of the distributional implications of the COVID-19 crisis over three waves. Given the lack of real-time survey data during the fast moving crisis, it applies a nowcasting methodology and real-time aggregate administrative data to calibrate an income survey and to simulate changes in the tax benefit system that attempted to mitigate the impacts of the crisis. Our analysis shows how crisis-induced income-support policy innovations combined with existing progressive elements of the tax-benefit system were effective in avoiding an increase in income inequality at all stages of waves 1-3 of the COVID-19 emergency in Ireland. There was, however, a decline in generosity over time as benefits became more targeted. On a methodological level, our paper makes a specific contribution in relation to the choice of welfare measure in assessing the impact of the COVID-19 crisis on inequality. |
Date: | 2021–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2103.08398&r=all |
By: | Olivier Basdevant; John Hooley; Eslem Imamoglu |
Abstract: | This guidance note describes how to use the Excel-based template developed by the Fiscal Affairs Department (FAD) of the IMF that accompanies the note “How to Design a Fiscal Strategy in a Resource-Rich Country.” The template uses data inputs to generate simulations of fiscal policy dynamics. It helps IMF teams and country authorities in resource-rich countries (RRCs) analyze the trade-offs associated with alternative fiscal strategies for the use of public resource wealth. Visualizing these trade-offs and assessing their sensitivity to underlying macroeconomic assumptions can help inform policymakers about the most appropriate fiscal strategy, given country-specific circumstances.
Keywords: | Commodity price indexes;Macro-fiscal framework;Fiscal governance;Fiscal multipliers;Fiscal policy;FADHTN,HTN,resource revenue,calibration method,commodity shock,dropdown menu,shock path |
Date: | 2021–03–09 |
URL: | http://d.repec.org/n?u=RePEc:imf:imfhtn:2021/002&r=all |
By: | Gutin, Gregory; Hirano, Tomohiro; Hwang, Sung-Ha; Neary, Philip R; Toda, Alexis Akira |
Abstract: | How does social distancing affect the reach of an epidemic in social networks? We present Monte Carlo simulation results for a susceptible-infected-removed model with social distancing (the SIRwSD model). The key feature of the model is that individuals are limited in the number of acquaintances they can interact with, thereby constraining disease transmission to an infectious subnetwork of the original social network. While increased social distancing typically reduces the spread of an infectious disease, the magnitude of the reduction varies greatly depending on the topology of the network, indicating the need for network-dependent policies. Our results also reveal the importance of coordinating policies at the 'global' level. In particular, the public health benefits of social distancing within a group (e.g. a country) may be completely undone if that group maintains connections with outside groups that are not following suit.
Keywords: | BA scale-free networks, Infectious subnetwork, SIRwSD model, Social distancing, WS small-world networks, q-bio.PE, physics.soc-ph, Fluids & Plasmas, Applied Economics |
Date: | 2021–03–03 |
URL: | http://d.repec.org/n?u=RePEc:cdl:ucsdec:qt7xv4h5qr&r=all |
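Code sketch: | A minimal Monte Carlo sketch of an SIR epidemic on a network in which social distancing caps the number of acquaintances each individual actually meets, producing the "infectious subnetwork" mentioned above. The network type, cap values and transmission/recovery rates are illustrative assumptions, and the simple reading of distancing used here (each node keeps a random subset of its neighbours) is not necessarily the paper's exact SIRwSD specification.
```python
import random
import networkx as nx

def infectious_subnetwork(G, cap, seed=0):
    """Each node keeps at most `cap` acquaintances; an edge survives only if
    both endpoints keep it (one simple reading of social distancing)."""
    rng = random.Random(seed)
    kept = {v: set(rng.sample(list(G.neighbors(v)), min(cap, G.degree(v))))
            for v in G}
    H = nx.Graph()
    H.add_nodes_from(G)
    H.add_edges_from((u, v) for u, v in G.edges() if v in kept[u] and u in kept[v])
    return H

def sir_final_size(G, beta=0.2, gamma=0.1, n_seeds=5, seed=0):
    """Discrete-time SIR: infected nodes transmit along surviving edges with
    probability beta per step and recover with probability gamma per step."""
    rng = random.Random(seed)
    infected = set(rng.sample(list(G.nodes()), n_seeds))
    removed = set()
    while infected:
        new_cases = {v for u in infected for v in G.neighbors(u)
                     if v not in infected and v not in removed
                     and rng.random() < beta}
        recovered = {u for u in infected if rng.random() < gamma}
        infected = (infected | new_cases) - recovered
        removed |= recovered
    return len(removed)

G = nx.barabasi_albert_graph(5000, 4, seed=0)        # scale-free contact network
for cap in (8, 4, 2):                                # progressively tighter distancing
    H = infectious_subnetwork(G, cap)
    print(f"contact cap {cap}: final epidemic size {sir_final_size(H)}")
```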
By: | Srivastav, Bhanu |
Abstract: | Neural networks are one of the methods of artificial intelligence. They are inspired by the structure of the biological nervous system and its capacity to learn from examples. Neural networks are used to solve problems that could not be modeled with conventional techniques. A neural network can learn, adapt, predict, and classify, and its predictive power is considerable; its results are often more reliable than those of standard mathematical estimation models. It has therefore been used in many fields. This research reviews the most recent advances in the application of artificial neural networks. The reviewed studies were extracted in 2021 from the Web of Science, maintained by Clarivate Analytics. We find that, among the various applications of ANNs, applications to Covid-19 are on the rise.
Keywords: | ANN; Covid-19; Dust; Gas; Organic richness |
JEL: | I1 I10 Q49 Y80 |
Date: | 2021–02–08 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:106499&r=all |
By: | Riccardo (Jack) Lucchetti (Dipartimento di Scienze Economiche e Sociali (DiSES), Università Politecnica delle Marche); Luca Pedini (Dipartimento di Scienze Economiche e Sociali (DiSES), Università Politecnica delle Marche) |
Abstract: | This paper describes the gretl function package ParMA, which provides Bayesian model averaging in generalised linear models. In order to overcome the lack of an analytical specification for many of the models covered, the package features an implementation of the reversible jump Markov chain Monte Carlo technique, following the original idea by Green (1995), as a flexible tool for handling several specifications. Particular attention is devoted to computational aspects such as the automation of the model-building procedure and the parallelisation of the sampling scheme.
Keywords: | BMA, GLM, RJMCMC, parallelisation |
JEL: | C11 C63 C20 |
Date: | 2020 |
URL: | http://d.repec.org/n?u=RePEc:ven:wpaper:2020:28&r=all |
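Code sketch: | The ParMA package implements reversible jump MCMC so that model averaging does not require enumerating every specification. The sketch below instead brute-force enumerates a tiny linear-Gaussian model space with BIC-based weights, purely to illustrate what Bayesian model averaging computes (posterior inclusion probabilities); it is not the package's RJMCMC sampler, and it is written in Python rather than gretl's hansl.
```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 5
X = rng.normal(size=(n, k))
y = 1.0 + 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(0, 1, n)

def bic(cols):
    """BIC of a Gaussian linear model using the regressors in `cols`."""
    Z = np.column_stack([np.ones(n)] + [X[:, j] for j in cols])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    sigma2 = resid @ resid / n
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return -2 * loglik + Z.shape[1] * np.log(n)

# Enumerate all 2^k candidate models and weight them by exp(-BIC/2)
models = [c for r in range(k + 1) for c in itertools.combinations(range(k), r)]
scores = np.array([bic(m) for m in models])
weights = np.exp(-0.5 * (scores - scores.min()))
weights /= weights.sum()

# Posterior inclusion probability of each regressor under the BIC weights
pip = [sum(w for m, w in zip(models, weights) if j in m) for j in range(k)]
print("posterior inclusion probabilities:", np.round(pip, 3))
```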
By: | Dongming Wei; Yogi Ahmad Erlangga; Andrey Pak; Laila Zhexembay |
Abstract: | This paper presents finite element methods for numerically solving the Risk-Adjusted Pricing Methodology (RAPM) Black-Scholes model for option pricing with transaction costs. Spatial finite element models based on P1 and/or P2 elements are formulated using group finite elements and numerical quadrature to handle the nonlinear term, in combination with a Crank-Nicolson-type temporal scheme. The temporal scheme is implemented using the Rannacher approach. Restrictions on the spatial-temporal mesh-size ratio are observed to control the stability of our method. Our results compare favorably with the finite difference results reported in the literature for this model.
Date: | 2021–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2103.08380&r=all |
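Code sketch: | The paper compares its finite element schemes against finite difference results for the nonlinear RAPM model. As a hedged baseline, here is a standard Crank-Nicolson finite-difference solver for the plain (linear) Black-Scholes equation and a European call; the RAPM transaction-cost term, the P1/P2 elements and the Rannacher start-up steps are not included, and all parameter values are illustrative.
```python
import numpy as np

sigma, r, K, T = 0.3, 0.05, 100.0, 1.0          # illustrative parameters
S_max, M, N = 300.0, 300, 200                    # grid sizes
dt = T / N
S = np.linspace(0.0, S_max, M + 1)
V = np.maximum(S - K, 0.0)                       # terminal payoff (European call)

i = np.arange(1, M)                              # interior nodes
alpha = 0.25 * dt * (sigma**2 * i**2 - r * i)
beta = -0.50 * dt * (sigma**2 * i**2 + r)
gamma = 0.25 * dt * (sigma**2 * i**2 + r * i)
M1 = np.diag(beta) + np.diag(alpha[1:], -1) + np.diag(gamma[:-1], 1)
A, B = np.eye(M - 1) - M1, np.eye(M - 1) + M1    # Crank-Nicolson matrices

for n in range(N, 0, -1):                        # march backward in time
    t_old, t_new = n * dt, (n - 1) * dt
    bc_old = S_max - K * np.exp(-r * (T - t_old))   # Dirichlet value at S_max
    bc_new = S_max - K * np.exp(-r * (T - t_new))   # V(0, t) = 0 adds nothing
    rhs = B @ V[1:M]
    rhs[-1] += gamma[-1] * (bc_old + bc_new)
    V[1:M] = np.linalg.solve(A, rhs)
    V[0], V[M] = 0.0, bc_new

print("Crank-Nicolson call price at S = 100:",
      round(float(np.interp(100.0, S, V)), 4))
```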
By: | Axel Anderson (Department of Economics, Georgetown University); Jeremy Rosen (Topspin Shot Research); John Rust (Department of Economics, Georgetown University); Kin-Ping Wong (Digonex) |
Abstract: | Are the serves of the world’s best tennis pros consistent with the theoretical prediction of Nash equilibrium in mixed strategies? We analyze their serve direction choices (to the returner’s left, right or body) with data from an online database called the Match Charting Project. Using a new methodology, we test and decisively reject a key implication of a mixed strategy Nash equilibrium, namely, that the probability of winning a service game is the same for all serve directions. We also use dynamic programming (DP) to numerically solve for the best-response serve strategies to probability models of service game outcomes estimated for individual server-returner pairs, such as Novak Djokovic serving to Rafael Nadal. We show that for most elite pro servers, the DP serve strategy significantly increases their service game win probability compared to the mixed strategies they actually use, which we estimate using flexible reduced-form logit models. Stochastic simulations verify that our results are robust to estimation error.
Keywords: | tennis, games, Nash equilibrium, Minimax theorem, constant sum games, mixed strategies, dynamic directional games, binary Markov games, dynamic programming, structural estimation, muscle memory
JEL: | C61 C73 L21
Date: | 2021–03–16 |
URL: | http://d.repec.org/n?u=RePEc:geo:guwopa:gueconwpa~21-21-07&r=all |
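Code sketch: | A toy version of the dynamic programming idea described above: backward induction over the service game score, choosing at each point the serve direction that maximizes the probability of winning the game. The per-direction point-win probabilities are hypothetical constants, whereas the paper estimates state-dependent logit models for individual server-returner pairs.
```python
from functools import lru_cache

# Hypothetical probabilities that the server wins the point for each serve
# direction (the paper estimates these from data, per server-returner pair).
p_dir = {"left": 0.66, "body": 0.60, "right": 0.63}

@lru_cache(maxsize=None)
def win_prob(s, r):
    """Probability the server wins the game from score (s, r) points,
    choosing the serve direction optimally at every point."""
    if s >= 4 and s - r >= 2:
        return 1.0
    if r >= 4 and r - s >= 2:
        return 0.0
    if s >= 3 and r >= 3 and s == r:
        # Deuce: with a stationary optimal direction, the server must win
        # two points in a row before losing two in a row.
        p = max(p_dir.values())
        return p * p / (p * p + (1 - p) * (1 - p))
    return max(p * win_prob(s + 1, r) + (1 - p) * win_prob(s, r + 1)
               for p in p_dir.values())

def best_direction(s, r):
    return max(p_dir, key=lambda d: p_dir[d] * win_prob(s + 1, r)
               + (1 - p_dir[d]) * win_prob(s, r + 1))

print("P(server wins game from 0-0):", round(win_prob(0, 0), 3))
print("optimal direction at 30-40, i.e. (2, 3):", best_direction(2, 3))
```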
By: | Emmanuel Coffie |
Abstract: | In this paper, we study analytical properties of the solutions to the generalised delay Ait-Sahalia-type interest rate model with Poisson-driven jump. Since this model does not have explicit solution, we employ several new truncated Euler-Maruyama (EM) techniques to investigate finite time strong convergence theory of the numerical solutions under the local Lipschitz condition plus the Khasminskii-type condition. We justify the strong convergence result for Monte Carlo calibration and valuation of some debt and derivative instruments. |
Date: | 2021–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2103.07651&r=all |
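Code sketch: | A simplified sketch of a truncated Euler-Maruyama scheme for an Ait-Sahalia-type rate process with Poisson-driven jumps, followed by a Monte Carlo valuation of a zero-coupon bond. The coefficients, jump specification and truncation bounds are illustrative assumptions; the paper treats a generalised delay version of the model and proves strong convergence, neither of which is reproduced here.
```python
import numpy as np

# Ait-Sahalia-type coefficients and jump parameters (illustrative only)
a_m1, a0, a1, a2, rho = 0.003, 0.1, 1.0, 8.0, 2.0
sigma, theta = 0.2, 1.5
lam, jump_scale = 5.0, 0.01          # Poisson intensity and relative jump size

def drift(r):
    return a_m1 / r - a0 + a1 * r - a2 * r**rho

def diffusion(r):
    return sigma * r**theta

def truncated_em_path(r0=0.05, T=1.0, n=2000, seed=0):
    rng = np.random.default_rng(seed)
    dt = T / n
    # Simplified truncation: evaluate the coefficients at a state clamped to
    # a step-size-dependent interval so the super-linear terms cannot explode.
    lo, hi = 1e-4, 1.0 / np.sqrt(dt)
    r = np.empty(n + 1)
    r[0] = r0
    for k in range(n):
        x = min(max(r[k], lo), hi)                   # truncated state
        dW = rng.normal(0.0, np.sqrt(dt))
        dN = rng.poisson(lam * dt)                   # Poisson-driven jumps
        r[k + 1] = r[k] + drift(x) * dt + diffusion(x) * dW + jump_scale * x * dN
    return r

paths = np.array([truncated_em_path(seed=s) for s in range(200)])
# Monte Carlo value of a zero-coupon bond paying 1 at T under these dynamics
price = np.mean(np.exp(-paths[:, :-1].mean(axis=1) * 1.0))
print("MC zero-coupon bond price:", round(float(price), 4))
```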
By: | Muriel Dal Pont Legrand (Université Côte d'Azur, CNRS, GREDEG, France) |
Abstract: | This paper analyses how the macro agent-based literature which developed intensively during the last decades, analyses the issue of financial instability. This paper focuses its attention on two specific researchers’ communities which, within this new paradigm, specifically emphasize this question. We examine their common analytical foundations, how they have been influenced by anterior research programs, and we distinguish their modeling strategies and how these distinct strategies led them to follow somewhat different objectives. |
Keywords: | Macro agent-based models, financial instability, microeconomic foundations, CATS, K&S, Minsky, Leijonhufvud, Stiglitz |
JEL: | B22 B31 B41 E32 |
Date: | 2021–03 |
URL: | http://d.repec.org/n?u=RePEc:gre:wpaper:2021-14&r=all |
By: | Xiaoyue Li; A. Sinem Uysal; John M. Mulvey |
Abstract: | We employ model predictive control for a multi-period portfolio optimization problem. In addition to the mean-variance objective, we construct a portfolio whose allocation is given by model predictive control with a risk-parity objective, and we provide a successive convex programming algorithm that delivers solutions roughly 30 times faster, and more robustly, in our experiments. Computational results on a multi-asset universe show that multi-period models perform better than their single-period counterparts in the out-of-sample period 2006-2020. The out-of-sample risk-adjusted performance of both the mean-variance and risk-parity formulations beats the fix-mix benchmark, achieving Sharpe ratios of 0.64 and 0.97, respectively.
Date: | 2021–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2103.10813&r=all |
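Code sketch: | A single-period illustration of the risk-parity objective mentioned above: find long-only weights that equalize each asset's contribution to portfolio risk. This uses a generic SLSQP solve rather than the paper's successive convex programming, and it omits the multi-period model predictive control layer; the covariance matrix is an illustrative placeholder.
```python
import numpy as np
from scipy.optimize import minimize

# Illustrative annualized asset covariance matrix (not from the paper)
Sigma = np.array([[0.040, 0.006, 0.002],
                  [0.006, 0.090, 0.010],
                  [0.002, 0.010, 0.160]])

def risk_parity_weights(Sigma):
    """Long-only weights equalizing each asset's risk contribution."""
    n = Sigma.shape[0]

    def objective(w):
        port_var = w @ Sigma @ w
        contrib = w * (Sigma @ w)              # risk contributions
        return np.sum((contrib - port_var / n) ** 2)

    cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
    bounds = [(0.0, 1.0)] * n
    res = minimize(objective, np.full(n, 1.0 / n), bounds=bounds, constraints=cons)
    return res.x

w = risk_parity_weights(Sigma)
print("risk-parity weights:", np.round(w, 3))
print("risk contributions:", np.round(w * (Sigma @ w), 5))
```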
By: | Kea Baret (BETA - Bureau d'Économie Théorique et Appliquée - UL - Université de Lorraine - UNISTRA - Université de Strasbourg - CNRS - Centre National de la Recherche Scientifique - INRAE - Institut National de Recherche pour l’Agriculture, l’Alimentation et l’Environnement); Amélie Barbier-Gauchard (BETA - Bureau d'Économie Théorique et Appliquée - UL - Université de Lorraine - UNISTRA - Université de Strasbourg - CNRS - Centre National de la Recherche Scientifique - INRAE - Institut National de Recherche pour l’Agriculture, l’Alimentation et l’Environnement); Theophilos Papadimitriou (DUTH - Democritus University of Thrace) |
Abstract: | Since the reinforcement of the Stability and Growth Pact (1996), the European Commission has closely monitored public finances in the EU Member States. A country's failure to comply with the 3% limit rule on the public deficit triggers an audit. In this paper, we present a machine-learning-based forecasting model for compliance with the 3% limit rule. To do so, we use data spanning the period from 2006 to 2018 (a turbulent period including the Global Financial Crisis and the Sovereign Debt Crisis) for the 28 EU Member States. A set of eight features is identified as predictors from 141 variables through a feature selection procedure. The forecasting is performed using Support Vector Machines (SVM). The proposed model reaches 91.7% forecasting accuracy and outperforms the Logit model that we use as a benchmark.
Keywords: | Fiscal Rules,Fiscal Compliance,Stability and Growth Pact,Machine learning |
Date: | 2021–01–26 |
URL: | http://d.repec.org/n?u=RePEc:hal:wpaper:hal-03121966&r=all |
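Code sketch: | A minimal version of the pipeline described above: screen candidate predictors down to eight features, fit an SVM classifier for compliance with the 3% deficit rule, and compare it with a Logit benchmark. The simulated panel, the univariate SelectKBest screen and all hyperparameters are stand-ins; the paper applies its own feature selection procedure to 141 real variables.
```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# 28 countries x 13 years of 141 candidate features (simulated placeholders)
X = rng.normal(size=(364, 141))
# 1 = deficit below the 3% of GDP limit (simulated outcome)
y = (X[:, 0] - 0.5 * X[:, 3] + rng.normal(0, 1, 364) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

svm = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=8),
                    SVC(kernel="rbf", C=1.0))
logit = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=8),
                      LogisticRegression(max_iter=1000))

for name, clf in [("SVM", svm), ("Logit", logit)]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", round(clf.score(X_te, y_te), 3))
```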
By: | Marie-Hélène Felt; David Laferrière |
Abstract: | The Bank of Canada’s Currency Department has used the Canadian Financial Monitor (CFM) survey since 2009 to track Canadians’ cash usage, payment card ownership and usage, and the adoption of payment innovations. A new online CFM survey was launched in 2018. Because it uses non-probability sampling for data collection, selection bias is very likely. We outline various methods for obtaining survey weights and discuss the associated conditions necessary for these weights to eliminate selection bias. In the end, we obtain calibration weights for the 2018 and 2019 online CFM samples. Our final weights improve upon the default weights provided by the survey company in several ways: (i) we choose the calibration variables based on a fully documented selection procedure that employs machine learning techniques; (ii) we use very up-to-date calibration totals; (iii) for each survey year we obtain two sets of weights, one for the full yearly sample of CFM respondents, the other for the sub-sample of CFM respondents who also filled in the methods-of-payment module of the survey. |
Keywords: | Econometric and statistical methods |
JEL: | C C8 C81 C83 |
Date: | 2020 |
URL: | http://d.repec.org/n?u=RePEc:bca:bocatr:118&r=all |
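Code sketch: | One standard way to obtain calibration weights for a non-probability sample is raking (iterative proportional fitting): rescale the weights until the weighted sample margins match known population totals. The sketch below is generic, with hypothetical calibration variables and totals; it does not reproduce the machine-learning-based variable selection or the specific CFM calibration totals discussed in the paper.
```python
import numpy as np
import pandas as pd

def rake(sample, margins, max_iter=100, tol=1e-8):
    """Iterative proportional fitting: rescale weights so that weighted
    sample counts match the target totals on each calibration margin."""
    w = np.ones(len(sample), dtype=float)
    for _ in range(max_iter):
        w_old = w.copy()
        for col, targets in margins.items():
            for category, target in targets.items():
                mask = (sample[col] == category).to_numpy()
                total = w[mask].sum()
                if total > 0:
                    w[mask] *= target / total
        if np.max(np.abs(w - w_old)) < tol:
            break
    return w

# Hypothetical online sample that over-represents young urban respondents
sample = pd.DataFrame({
    "age":    ["18-34"] * 600 + ["35-54"] * 250 + ["55+"] * 150,
    "region": ["urban"] * 700 + ["rural"] * 300,
})
# Known totals for the target population (hypothetical, same overall size)
margins = {
    "age":    {"18-34": 280, "35-54": 350, "55+": 370},
    "region": {"urban": 600, "rural": 400},
}
weights = rake(sample, margins)
print(pd.Series(weights).groupby(sample["age"]).sum().round(1))
```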
By: | Nuno José Henriques Baetas da Silva (Centre for Business and Economics CeBER and Faculty of Economics, University of Coimbra); António Manuel Portugal Duarte (University of Coimbra, Centre for Business and Economics CeBER and Faculty of Economics) |
Abstract: | Making use of a two-country, two-sector New Keynesian model with essential and non-essential goods, we assess the macroeconomic consequences of a labor supply shock in the Euro Area. Our model incorporates health status in the households' maximization problem, which depends on the time devoted to leisure. Health status is linked to the consumption of non-essential goods, such that the demand for non-essentials is decreasing with contemporaneous health. After calibrating the model for the case of Portugal and the rest of the Euro Area, our simulations show that a labor supply shock affecting only the latter reduces the demand for non-essential goods, generates inflation in the Portuguese economy and pushes both regions into economic recession, depriving households of essential goods. If the labor supply shock affects both economies, the negative income effect dominates the decreased demand effect for non-essential goods, and stagflation is a plausible scenario. In addition, our calibration scheme allows us to conclude that the asymmetric effects across economies may be due to different price rigidities between sectors and to different production structures between countries.
Keywords: | Essential goods, Non-essential goods, COVID-19, DSGE, Euro Area.
JEL: | E12 E32 F41 F42 |
Date: | 2021–04 |
URL: | http://d.repec.org/n?u=RePEc:gmf:papers:2021-04&r=all |
By: | Jayachandran, Seema (Northwestern University); Biradavolu, Monica (QualAnalytics); Cooper, Jan (Harvard University) |
Abstract: | We propose a new method to design a short survey measure of a complex concept such as women's agency. The approach combines mixed-methods data collection and machine learning. We select the best survey questions based on how strongly correlated they are with a "gold standard" measure of the concept derived from qualitative interviews. In our application, we measure agency for 209 women in Haryana, India, first, through a semi-structured interview and, second, through a large set of close-ended questions. We use qualitative coding methods to score each woman's agency based on the interview, which we treat as her true agency. To identify the close-ended questions most predictive of the "truth," we apply statistical algorithms that build on LASSO and random forest but constrain how many variables are selected for the model (five in our case). The resulting five-question index is as strongly correlated with the coded qualitative interview as is an index that uses all of the candidate questions. This approach of selecting survey questions based on their statistical correspondence to coded qualitative interviews could be used to design short survey modules for many other latent constructs. |
Keywords: | women's empowerment, survey design, feature selection, psychometrics |
JEL: | C83 D13 J16 O12 |
Date: | 2021–03 |
URL: | http://d.repec.org/n?u=RePEc:iza:izadps:dp14221&r=all |
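Code sketch: | A sketch of the selection step described above: given a "gold standard" agency score coded from qualitative interviews, keep the five survey questions that best predict it. Two generic selectors are shown (recursive feature elimination with a random forest, and a LASSO screen); the simulated Likert responses, the coded score and the question indices are placeholders, and these selectors only approximate the constrained LASSO/random-forest algorithms used in the paper.
```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import RFE
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n_women, n_questions = 209, 40                    # 40 candidate questions (illustrative)
X = rng.integers(1, 6, size=(n_women, n_questions)).astype(float)   # Likert answers
# "Gold standard" agency score coded from the qualitative interviews,
# simulated here as a function of a few questions plus noise.
truth = 0.6 * X[:, 2] + 0.5 * X[:, 10] + 0.4 * X[:, 25] + rng.normal(0, 1, n_women)

# Option A: recursive feature elimination down to exactly five questions
rfe = RFE(RandomForestRegressor(n_estimators=200, random_state=0),
          n_features_to_select=5).fit(X, truth)
kept_rfe = np.flatnonzero(rfe.support_)

# Option B: LASSO screen, keeping the five largest absolute coefficients
lasso = LassoCV(cv=5).fit(X, truth)
kept_lasso = np.sort(np.argsort(np.abs(lasso.coef_))[-5:])

index = X[:, kept_rfe].mean(axis=1)               # simple five-question index
print("questions kept by RFE:  ", kept_rfe)
print("questions kept by LASSO:", kept_lasso)
print("correlation of index with coded interviews:",
      round(float(np.corrcoef(index, truth)[0, 1]), 2))
```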
By: | François Durand (Nokia Bell Labs, LINCS - Laboratory of Information, Network and Communication Sciences - Inria - Institut National de Recherche en Informatique et en Automatique - IMT - Institut Mines-Télécom [Paris] - SU - Sorbonne Université); Antonin Macé (PJSE - Paris Jourdan Sciences Economiques - UP1 - Université Paris 1 Panthéon-Sorbonne - ENS Paris - École normale supérieure - Paris - PSL - Université Paris sciences et lettres - EHESS - École des hautes études en sciences sociales - ENPC - École des Ponts ParisTech - CNRS - Centre National de la Recherche Scientifique - INRAE - Institut National de Recherche pour l’Agriculture, l’Alimentation et l’Environnement, PSE - Paris School of Economics - ENPC - École des Ponts ParisTech - ENS Paris - École normale supérieure - Paris - PSL - Université Paris sciences et lettres - UP1 - Université Paris 1 Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique - EHESS - École des hautes études en sciences sociales - INRAE - Institut National de Recherche pour l’Agriculture, l’Alimentation et l’Environnement); Matias Nunez (CREST - Centre de Recherche en Économie et Statistique - ENSAI - Ecole Nationale de la Statistique et de l'Analyse de l'Information [Bruz] - X - École polytechnique - ENSAE ParisTech - École Nationale de la Statistique et de l'Administration Économique - CNRS - Centre National de la Recherche Scientifique) |
Abstract: | We study how voting rules shape voter coordination in large three-candidate elections. We consider three rules, varying according to the number of candidates that voters can support in their ballot: Plurality (one), Anti-Plurality (two) and Approval Voting (one or two). We show that the Condorcet winner—a normatively desirable candidate—can always be elected at equilibrium under Approval Voting. We then numerically study a dynamic process of political tâtonnement. Monte-Carlo simulations of the process deliver rich insights on election outcomes. The Condorcet winner is virtually always elected under Approval Voting, but not under the other rules. The dominance of Approval Voting is robust to alternative welfare criteria and to the introduction of expressive voters. |
Keywords: | Approval voting,Poisson games,Strategic voting,Condorcet consistency,Fictitious play,Expressive voting |
Date: | 2021–03 |
URL: | http://d.repec.org/n?u=RePEc:hal:wpaper:halshs-03162184&r=all |
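Code sketch: | A small Monte Carlo harness for the bookkeeping behind the comparison above: tally three-candidate elections under Plurality, Anti-Plurality and Approval Voting and check how often the Condorcet winner is elected. Voters here vote sincerely from random (impartial-culture) preferences, with approval voters approving their top one or two candidates at random; this does not implement the paper's Poisson games or fictitious-play tâtonnement, so its numbers should not be read as reproducing the paper's results.
```python
import numpy as np

rng = np.random.default_rng(0)
CANDS = (0, 1, 2)

def condorcet_winner(prefs):
    """Candidate that beats every rival in pairwise majority, or None."""
    for c in CANDS:
        if all(sum(p.index(c) < p.index(d) for p in prefs) > len(prefs) / 2
               for d in CANDS if d != c):
            return c
    return None

def winner(prefs, rule):
    scores = np.zeros(3)
    for p in prefs:
        if rule == "plurality":                # support exactly one candidate
            scores[p[0]] += 1
        elif rule == "antiplurality":          # support exactly two candidates
            scores[p[0]] += 1
            scores[p[1]] += 1
        else:                                  # approval: sincerely approve
            for c in p[:rng.integers(1, 3)]:   # the top one or two candidates
                scores[c] += 1
    return int(np.argmax(scores))

hits = {"plurality": 0, "antiplurality": 0, "approval": 0}
n_valid = 0
for _ in range(2000):
    prefs = [list(rng.permutation(CANDS)) for _ in range(101)]
    cw = condorcet_winner(prefs)
    if cw is None:
        continue
    n_valid += 1
    for rule in hits:
        hits[rule] += (winner(prefs, rule) == cw)

for rule, h in hits.items():
    print(f"{rule}: Condorcet winner elected in {100 * h / n_valid:.1f}% of elections")
```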
By: | Gomez, M.; Hairault, J. |
Abstract: | Our paper aims to determine how much monetary policy should deviate from the flexible-price allocation in an economy with a large informal sector. First, the presence of variable taxes in the formal sector generates an inflation bias under discretionary policy, which increases with the size of the informal sector. Second, we find that only the formal sector, through tax distortion fluctuations, is responsible for cost-push shocks, which are amplified in a more informal economy. The trade-off between inflation and the formal output gap then depends on the elasticity of the former with respect to the latter, which is lower in a more informal economy. However, the optimal management of inflation also depends on the elasticity of the informal output gap with respect to the formal output gap. As this elasticity is decreasing with the size of the informal sector, whether inflation volatility (in terms of the aggregate output gap) is lower or higher in a more informal economy is ambiguous. By simulation, we show that economies with a larger informal sector should stabilize inflation more, relative to the two sectoral output gaps.
Keywords: | Informality; optimal monetary policy; New-Keynesian macroeconomics; tax distortion
JEL: | E26 E52 E12 H21 |
Date: | 2020–06–02 |
URL: | http://d.repec.org/n?u=RePEc:col:000561:019125&r=all |
By: | Azi Ben-Rephael; Bruce I. Carlin; Zhi Da; Ryan D. Israelsen |
Abstract: | We use machine learning to analyze minute-by-minute Bloomberg online status data and study how the effort provision of top executives in public corporations affects firm value. While executives likely spend most of their time doing other activities, Bloomberg usage data allows us to characterize their work habits. We document a positive effect of effort on unexpected earnings, cumulative abnormal returns following firm earnings announcements, and credit default swap spreads. We form long-short, calendar-time, effort portfolios and show that they earn significant average daily returns. Finally, we revisit several agency issues that have received attention in the prior academic literature on executive compensation. |
JEL: | D22 D82 G32 M52 |
Date: | 2021–02 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:28441&r=all |
By: | Valentina Aprigliano (Bank of Italy); Simone Emiliozzi (Bank of Italy); Gabriele Guaitoli (University of Warwick); Andrea Luciani (Bank of Italy); Juri Marcucci (Bank of Italy); Libero Monteforte (Ufficio Parlamentare di Bilancio, Bank of Italy) |
Abstract: | Can we use newspaper articles to forecast economic activity? Our answer is yes and, to this end, we propose a brand new economic dictionary in Italian with valence shifters, and we apply it to a corpus of about two million articles from four popular newspapers. We produce a set of high-frequency text-based sentiment and policy uncertainty indicators (TESI and TEPU, respectively), which are constantly updated, never revised, and computed both for the whole economy and for specific sectors or economic topics. To test the predictive power of our text-based indicators, we propose two forecasting exercises. First, by using Bayesian Model Averaging (BMA) techniques, we show that our monthly text-based indicators greatly reduce the uncertainty surrounding the short-term forecasts of the main macroeconomic aggregates, especially during recessions. Secondly, we employ these indices in a weekly GDP growth tracker, achieving sizeable gains in forecasting accuracy in both normal and turbulent times.
Keywords: | Forecasting, Text Mining, Sentiment, Economic Policy Uncertainty, Big data, BMA. |
JEL: | C11 C32 C43 C52 C55 E52 E58 |
Date: | 2021–03 |
URL: | http://d.repec.org/n?u=RePEc:bdi:wptemi:td_1321_21&r=all |
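Code sketch: | A toy dictionary-based sentiment scorer with valence shifters, illustrating the mechanics behind indicators like TESI: negators flip, and amplifiers or dampeners rescale, the polarity of nearby lexicon terms, and the article score is normalized by length. The English mini-lexicon, shifter lists and two-token look-back window are invented for illustration; the paper's Italian dictionary and aggregation scheme are far richer.
```python
import re

# Tiny illustrative lexicon and shifter lists (not the paper's dictionary)
LEXICON = {"growth": 1, "recovery": 1, "expansion": 1,
           "crisis": -1, "recession": -1, "uncertainty": -1}
NEGATORS = {"no", "not", "without"}
AMPLIFIERS = {"strong": 1.5, "sharp": 1.5, "slight": 0.5}

def sentiment(text):
    tokens = re.findall(r"[a-z]+", text.lower())
    score = 0.0
    for i, tok in enumerate(tokens):
        if tok not in LEXICON:
            continue
        polarity = LEXICON[tok]
        weight = 1.0
        for w in tokens[max(0, i - 2):i]:          # look back two tokens
            if w in NEGATORS:
                polarity = -polarity               # valence shifter: negation
            elif w in AMPLIFIERS:
                weight *= AMPLIFIERS[w]            # valence shifter: intensity
        score += weight * polarity
    return score / max(len(tokens), 1)             # normalize by article length

articles = [
    "Analysts see a strong recovery and renewed growth next quarter.",
    "There is no recovery in sight amid sharp uncertainty and recession.",
]
monthly_index = sum(sentiment(a) for a in articles) / len(articles)
print([round(sentiment(a), 3) for a in articles], round(monthly_index, 3))
```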
By: | Doruk Cengiz; Arindrajit Dube; Attila S. Lindner; David Zentler-Munro |
Abstract: | We assess the effect of the minimum wage on labor market outcomes such as employment, unemployment, and labor force participation for most workers affected by the policy. We apply modern machine learning tools to construct demographically-based treatment groups capturing around 75% of all minimum wage workers, a major improvement over the literature, which has focused on fairly narrow subgroups where the policy has a large bite (e.g., teens). By exploiting 172 prominent minimum wages between 1979 and 2019, we find that there is a very clear increase in the average wages of workers in these groups following a minimum wage increase, while there is little evidence of employment loss. Furthermore, we find no indication that the minimum wage has a negative effect on the unemployment rate, on labor force participation, or on labor market transitions. Nor do we detect any employment or participation responses even for sub-groups that are likely to have a high extensive-margin labor supply elasticity, such as teens, older workers, or single mothers. Overall, these findings provide little evidence of changing search effort in response to a minimum wage increase.
JEL: | J08 J2 J3 J38 J8 J88 |
Date: | 2021–01 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:28399&r=all |
By: | Yasuyuki Kusuda |
Abstract: | Omnichannel retailing, a new form of distribution system, seamlessly integrates the Internet and physical stores. This study considers the pricing and fulfillment strategies of a retailer that has two sales channels: online and a single physical store. The retailer offers consumers three purchasing options: delivery from the fulfillment center, buy online and pick up in store (BOPS), and purchasing at the store. Consumers choose one of these options to maximize their utility, which divides them into several segments. Given that the retailer can induce consumers into the profitable segment by adjusting the online and store prices, our analysis shows that it has three optimal strategies: (1) the retailer excludes consumers far from the physical store from the market and lets the others choose BOPS or purchasing at the store; (2) it lets consumers far from the physical store choose delivery from the fulfillment center and the others choose BOPS or purchasing at the store; (3) it lets all consumers choose delivery from the fulfillment center. Finally, we present simple dynamic simulations that consider how the retailer's optimal strategy changes as consumers' subjective probability of believing the product is in stock decreases. The results show that, as this subjective probability decreases, the retailer should offer BOPS in later periods of the selling season to maximize its profit.
Date: | 2021–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2103.07214&r=all |
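Code sketch: | A toy segmentation calculation in the spirit of the model above: for given online and store prices, each consumer at distance d picks the option (home delivery, BOPS, or store purchase) with the highest utility, and the induced segments change with the price pair and the subjective in-stock probability. The utility specification and all parameter values are invented for illustration and are not the paper's model.
```python
import numpy as np

# Hypothetical consumer parameters
v = 10.0          # product valuation
wait_cost = 1.5   # disutility of waiting for home delivery
travel = 0.8      # travel cost per unit of distance to the store
q_stock = 0.9     # subjective probability the store has the item in stock

def best_option(d, p_online, p_store):
    utils = {
        "delivery": v - p_online - wait_cost,
        "BOPS":     v - p_online - travel * d,             # pickup is guaranteed
        "store":    q_stock * (v - p_store) - travel * d,  # risk of a wasted trip
    }
    u_star = max(utils.values())
    return "no purchase" if u_star < 0 else max(utils, key=utils.get)

distances = np.linspace(0, 5, 11)
for p_online, p_store in [(7.0, 7.5), (7.0, 6.5)]:
    segments = [best_option(d, p_online, p_store) for d in distances]
    print(f"p_online={p_online}, p_store={p_store}:", segments)
```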