NEP: New Economics Papers on Computational Economics
Issue of 2015‒06‒13
eleven papers chosen by
By: | Eckley, Peter
Abstract: | We develop a news-media textual measure of aggregate economic uncertainty, defined as the fraction of Financial Times articles that contain uncertainty-related keyphrases, at frequencies from daily to annual, from January 1982 to April 2014. We improve on existing similar measures in several ways. First, we reveal extensive and irregular duplication of articles in the news database most widely used in the literature, and provide a simple but effective de-duplication algorithm. Second, we boost the uncertainty ‘signal strength’ by 14% through the simple addition of the word “uncertainties” to the conventional keyword list of “uncertain” and “uncertainty”, and show that adding further uncertainty-related keyphrases would likely constitute only a second-order adjustment. Third, we demonstrate the importance of normalising article counts by total news volume and provide the first textual uncertainty measure to do so for the UK. We empirically establish the plausibility of our measure as an uncertainty proxy through a detailed narrative analysis and a detailed comparative analysis with another popular uncertainty proxy, stock returns volatility. We show the relationship between these proxies is strong and significant on average, but breaks down periodically. We offer plausible explanations for this behaviour. We also establish the absence of Granger causation between the measures, even down to daily (publication) frequency. |
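A minimal sketch of the kind of measure the abstract describes — de-duplicate articles, then count keyphrase-containing articles per period and normalise by total news volume. Hashing the full text for de-duplication is our simplification, not the paper's algorithm:

```python
import hashlib
import re
from collections import defaultdict

# Keyphrases per the abstract: "uncertain", "uncertainty", and the added
# "uncertainties". A word-boundary regex avoids matching e.g. "ascertain".
UNCERTAINTY_RE = re.compile(r"\buncertain(?:ty|ties)?\b", re.IGNORECASE)

def monthly_uncertainty_index(articles):
    """articles: iterable of (month, text) pairs.

    Returns {month: share of de-duplicated articles containing a
    keyphrase}, i.e. article counts normalised by total news volume."""
    seen = set()               # de-duplication by exact-text hash (our assumption)
    totals = defaultdict(int)  # total articles per month
    hits = defaultdict(int)    # articles containing a keyphrase per month
    for month, text in articles:
        digest = hashlib.sha1(text.encode("utf-8")).hexdigest()
        if digest in seen:     # skip duplicated database entries
            continue
        seen.add(digest)
        totals[month] += 1
        if UNCERTAINTY_RE.search(text):
            hits[month] += 1
    return {m: hits[m] / totals[m] for m in totals}
```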
Keywords: | economic uncertainty; news-media; text-mining; stock returns volatility |
JEL: | C80 D80 E66 G10 |
Date: | 2015–01–30 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:64874&r=cmp |
By: | David Garcia; Frank Schweitzer |
Abstract: | The availability of data on digital traces is growing to unprecedented sizes, but inferring actionable knowledge from large-scale data is far from trivial. This is especially important for computational finance, where digital traces of human behavior offer great potential for driving trading strategies. We contribute to this by providing a consistent approach that integrates various data sources in the design of algorithmic traders. This allows us to derive insights into the principles behind the profitability of our trading strategies. We illustrate our approach through the analysis of Bitcoin, a cryptocurrency known for its large price fluctuations. In our analysis, we include economic signals of volume and price of exchange for USD, adoption of the Bitcoin technology, and transaction volume of Bitcoin. We add social signals related to information search, word-of-mouth volume, emotional valence, and opinion polarization as expressed in tweets related to Bitcoin for more than 3 years. Our analysis reveals that increases in opinion polarization and exchange volume precede rising Bitcoin prices, and that emotional valence precedes opinion polarization and rising exchange volumes. We apply these insights to design algorithmic trading strategies for Bitcoin, reaching profits of more than 200% in less than a year. We verify this high profitability with robust statistical methods that take into account risk and trading costs, confirming the long-standing hypothesis that trading based on social media sentiment can yield positive returns on investment.
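The core idea — trade on lagged signals that precede price moves — can be illustrated with a hedged sketch. The column names, one-day lag, and cost treatment are our assumptions; the paper's actual strategies and risk adjustments are richer:

```python
import pandas as pd

def backtest_signal_strategy(df, signal_col, threshold=0.0, cost=0.001):
    """df: daily DataFrame with columns ['price', signal_col].

    Go long for one day whenever yesterday's signal change exceeds
    `threshold`; subtract proportional trading costs on position
    changes. Returns the cumulative return of the strategy."""
    ret = df["price"].pct_change()
    # Trade on *lagged* signal changes so the strategy is implementable.
    position = (df[signal_col].diff().shift(1) > threshold).astype(float)
    trades = position.diff().abs().fillna(0.0)   # turnover incurs costs
    strat_ret = position * ret - trades * cost
    return (1.0 + strat_ret.fillna(0.0)).prod() - 1.0
```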
Date: | 2015–06 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1506.01513&r=cmp |
By: | Ludwig, Alexander; Schön, Matthias (Munich Center for the Economics of Aging (MEA)) |
Abstract: | This paper investigates extensions of the method of endogenous gridpoints (ENDGM) introduced by Carroll (2006) to higher dimensions with more than one continuous endogenous state variable. We compare three different categories of algorithms: (i) the conventional method with exogenous grids (EXOGM), (ii) the pure method of endogenous gridpoints (ENDGM) and (iii) a hybrid method (HYBGM). ENDGM involves Delaunay interpolation on irregular grids. The methods are compared by evaluating speed and accuracy. We find that HYBGM and ENDGM both dominate EXOGM. In an infinite horizon model, ENDGM also always dominates HYBGM. In a finite horizon model, the choice between HYBGM and ENDGM depends on the number of gridpoints in each dimension: with fewer than 150 gridpoints in each dimension ENDGM is faster than HYBGM, and vice versa. For a standard choice of 25 to 50 gridpoints in each dimension, ENDGM is 1.4 to 1.7 times faster than HYBGM in the finite horizon version and 2.4 to 2.5 times faster in the infinite horizon version of the model.
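For readers unfamiliar with Carroll's (2006) method, here is a one-dimensional sketch of a single EGM backward step. Log utility, a fixed gross return and deterministic income are illustrative simplifications; the paper's setting has several continuous endogenous states:

```python
import numpy as np

def egm_step(a_grid, m_next, c_next, beta=0.96, R=1.03, y=1.0):
    """One backward step of the endogenous gridpoints method for a
    deterministic consumption-savings problem with log utility.

    a_grid: exogenous grid over end-of-period assets a.
    (m_next, c_next): next period's policy c'(m') on its own
    (increasing) grid. Returns the *endogenous* grid over
    cash-on-hand m and the consumption policy c on it."""
    m_prime = R * a_grid + y                      # next-period cash on hand
    c_prime = np.interp(m_prime, m_next, c_next)  # interpolate next policy
    # Euler equation u'(c) = beta * R * u'(c'); with u'(c) = 1/c this
    # inverts in closed form -- no root-finding, the key gain of EGM.
    c = c_prime / (beta * R)
    m = a_grid + c                                # endogenous gridpoints
    return m, c
```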
JEL: | C63 E21 |
Date: | 2014–06–11 |
URL: | http://d.repec.org/n?u=RePEc:mea:meawpa:201309&r=cmp |
By: | Klabunde, Anna |
Abstract: | In this paper an agent-based model of endogenously evolving migrant networks is developed to identify the determinants of migration and return decisions. Individuals are connected by links, the strength of which declines over time and distance. Methodologically, this paper combines parameterization using data from the Mexican Migration Project with calibration. It is shown that expected earnings, an idiosyncratic home bias, network ties to other migrants, strength of links to the home country and age have a significant impact on circular migration patterns. The model can reproduce spatial patterns of migration as well as the distribution of the number of trips taken by migrants. It is also shown how the model can be used for computational experiments and policy analysis.
Abstract: | This study introduces an agent-based model of the circular migration of Mexican migrants to the USA. It is a fully empirically grounded model, i.e. all parameters are based on empirical estimates. In particular, the coefficients of the individuals' behavioural rules were estimated with standard econometric methods, using the Mexican Migration Project (MMP), a large household data set. In a first step it is shown that expected earnings, an idiosyncratic home bias and network ties to other migrants are the most important determinants of the migration decision for one generation of Mexican migrants, whereas the number and strength of ties to the home country influence the return decision. It is further shown that the distribution of migrants across US cities follows a power law, which is explained by a preferential-attachment process in which migrants tend to choose destination cities where they have friends and relatives. The distribution of the number of trips follows a negative binomial distribution, which is explained by the fact that a migrant who has already made a first trip is far more likely to make another one than a non-migrant is to migrate for the first time: the decision to migrate a second time differs strongly from the decision to emigrate for the first time, because migration-specific experience facilitates it. The agent-based model is able to reproduce both distributions and two aggregate time series, and is therefore deemed suitable for policy analysis. It is shown how the model can be used to examine the effects of an increase in Mexican wages and of intensified border enforcement.
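A minimal sketch of the link dynamics described above — strength decays over time and distance, and is refreshed by contact. The functional form, decay rate and contact boost are our assumptions, not the paper's estimates:

```python
import numpy as np

def update_link_strength(strength, same_location, delta=0.1, boost=0.2):
    """One period of network-link dynamics in the spirit of the model.

    strength: (n, n) array of link strengths in [0, 1].
    same_location: (n, n) boolean array, True where two agents
    currently live in the same place (zero distance)."""
    strength = strength * (1.0 - delta)  # decay with time and distance
    strength = np.where(same_location,
                        np.minimum(strength + boost, 1.0),  # contact refresh
                        strength)
    return strength
```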
Keywords: | circular migration, social networks, agent-based computational economics
JEL: | C63 F22 J61 |
Date: | 2014 |
URL: | http://d.repec.org/n?u=RePEc:zbw:rwirep:471&r=cmp |
By: | Alain Chateauneuf (CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics, IPAG - Business School); Mina Mostoufi (CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics); David Vyncke (Universiteit Gent - Vakgroep Toegepaste Wiskunde en Informatica) |
Abstract: | Monte Carlo (MC) simulation is a technique that provides approximate solutions to a broad range of mathematical problems. A drawback of the method is its high computational cost, especially in a high-dimensional setting. Estimating the Tail Value-at-Risk for large portfolios or pricing basket options and Asian options for instance can be quite time-consuming. For these types of problems, one can construct an upper bound in the convex order by replacing the copula by the comonotonic copula. This comonotonic upper bound can be computed very quickly, but it gives only a rough approximation. In this paper we introduce the Comonotonic Monte Carlo (CoMC) simulation, which uses the best features of both approaches. By using the comonotonic approximation as a control variate we get more accurate estimates and hence the simulation is less time-consuming. The CoMC is of broad applicability and numerical results show a remarkable speed improvement. We illustrate the method for estimating Tail Value-at-Risk and pricing basket options and Asian options. |
Abstract: | The Monte Carlo method is a technique for solving a large number of mathematical problems. Its drawback is the computational burden, especially in a multidimensional setting. Estimating the Tail Value-at-Risk of a large portfolio, or pricing basket options and Asian options, can be quite time-consuming. In such cases, a convex-order upper bound can be defined by replacing the copula with a comonotonic copula. In this paper, we introduce the comonotonic Monte Carlo method, which combines the advantages of both approaches. Using the comonotonic approximation as a control variate, we obtain a more precise estimate and hence a shorter simulation. CoMC is widely applicable, and the results obtained show a remarkable improvement in computation speed. We illustrate the method by estimating the Tail Value-at-Risk and pricing basket options and Asian options.
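A hedged sketch of the control-variate idea behind CoMC, for a call on a sum of correlated normals, where the comonotonic counterpart has a closed-form expectation. Driving the comonotonic sum with the standardised simulated sum is one simple coupling choice, not necessarily the paper's:

```python
import numpy as np
from scipy.stats import norm

def comc_call_on_sum(mu, cov, K, n_sims=100_000, seed=0):
    """Control-variate Monte Carlo price of E[(sum_i X_i - K)+] for
    X ~ N(mu, cov), using the comonotonic sum as the control."""
    rng = np.random.default_rng(seed)
    mu, cov = np.asarray(mu, float), np.asarray(cov, float)
    m = mu.sum()
    s_sum = np.sqrt(cov.sum())              # sd of the actual sum
    s_com = np.sqrt(np.diag(cov)).sum()     # sd of the comonotonic sum
    X = rng.multivariate_normal(mu, cov, size=n_sims)
    S = X.sum(axis=1)
    Y = np.maximum(S - K, 0.0)              # crude MC payoff
    W = (S - m) / s_sum                     # standard normal driver
    C = np.maximum(m + s_com * W - K, 0.0)  # comonotonic payoff
    d = (m - K) / s_com                     # closed-form E[C]:
    EC = (m - K) * norm.cdf(d) + s_com * norm.pdf(d)
    b = np.cov(Y, C)[0, 1] / np.var(C, ddof=1)  # optimal CV coefficient
    return Y.mean() - b * (C.mean() - EC)
```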
Date: | 2015–02 |
URL: | http://d.repec.org/n?u=RePEc:hal:cesptp:hal-01159741&r=cmp |
By: | Klima, Grzegorz; Podemski, Karol; Retkiewicz-Wijtiwiak, Kaja; Sowińska, Anna E. |
Abstract: | This paper presents an implementation of the well-known Smets-Wouters (2003) model for the Euro Area using the gEcon package - what we call a "third generation" DSGE modelling toolbox. Our exercise serves three goals. First, we show how gEcon can be used to implement a model that is important from both an applications and a historical perspective. Second, through the rigorous exposition enforced by gEcon's block-agent paradigm, we analyse all of the Smets-Wouters model's building blocks. Last, but not least, the implementation presented here serves as a natural starting point for extensions that are important from an applications point of view, such as opening the economy, introducing non-lump-sum taxes, or adding sectors to the model economy. The full model implementation is attached.
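gEcon models are written in its own .gcn language, so the following is not gEcon syntax; it is a Python/SymPy illustration of the per-block first-order-condition derivation that a block-agent toolbox automates. The one-period household block is our toy example:

```python
import sympy as sp

# A household "block": log utility over consumption minus quadratic
# labour disutility, subject to a budget constraint. The toolbox-style
# step is deriving and solving the block's FOCs mechanically.
C, W, N, lam = sp.symbols("C W N lambda", positive=True)
utility = sp.log(C) - N**2 / 2
budget = W * N - C                          # income minus spending = 0
lagrangian = utility + lam * budget
foc_C = sp.Eq(sp.diff(lagrangian, C), 0)    # 1/C - lambda = 0
foc_N = sp.Eq(sp.diff(lagrangian, N), 0)    # -N + lambda*W = 0
sol = sp.solve([foc_C, foc_N, sp.Eq(budget, 0)], [C, N, lam], dict=True)
print(sol)  # consumption and labour supply implied by the block's FOCs
```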
Keywords: | DSGE; monetary policy; staggered prices; staggered wages |
JEL: | C88 E3 E4 |
Date: | 2015–02–28 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:64440&r=cmp |
By: | Francois J Stofberg and Jan H van Heerden |
Abstract: | TERM[1] is used to analyse the short term regional economic impact of an increase in industries’ transport costs when paying E-Tolls. Market-clearing and accounting equations allow regional economies to be represented as an integrated framework; labour adjusts to accommodate increasing transportation costs, and investments change to accommodate capital that is fixed.[2] We concluded that costs from levying E-Tolls on industries are relatively small in comparison to total transport costs, and the impact on economic aggregates and most industries are negligible: investments (-0.404%), GDP (-0.01), CPI (-0.10%). This is true even when considering costs and benefits on industries as well as consumers. Industries that experienced the greatest decline in output were transport, construction, and gold. Provinces which are closer to Gauteng, and have a greater share of severely impacted industries, experienced larger GDP and real income reductions. Mpumalanga’s decrease in GDP was 17% greater than Gauteng’s.[1] “TERM†is an acronym for “The enormous regional modelâ€, for simplicity we refer to the TERM model.[2] TERM is a bottom-up CGE model designed for highly disaggregated regional data. “CGE†is an acronym for Computable General Equilibrium. TERM models originate from Horridge et al. (2005) which are better explained in Horridge (2011). |
Keywords: | Computable General Equilibrium Models, Regional Economics, Policy Modelling, Transport Cost |
JEL: | C68 L91 R11 R48 |
Date: | 2015 |
URL: | http://d.repec.org/n?u=RePEc:rza:wpaper:515&r=cmp |
By: | Nlemfu Mukoko, J. Blaise; Wabenga Yango, James
Abstract: | This paper examines the implications of agricultural growth for poverty reduction over the period 2013-2020. It also addresses the level of investment required to support such a growth effort, using a dynamic computable general equilibrium model applied to the case of the DR Congo.
Keywords: | Computable General Equilibrium, poverty, agriculture, exogenous productivity shock
JEL: | O11 O21 O55 |
Date: | 2014–08–16 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:58193&r=cmp |
By: | Ole Boysen (Agricultural and Food Policy, University of Hohenheim; Institute for International Integration Studies, Trinity College Dublin); Alan Matthews (Department of Economics, Trinity College Dublin; Institute for International Integration Studies, Trinity College Dublin) |
Abstract: | Economic Partnership Agreements (EPAs) between the EU and ACP countries are frequently criticized because of fears about negative implications for economic development. Using Uganda as a case study, this paper employs an integrated macro-micro framework rich in household-level detail to assess the consequences of the East African Community EPA for economic output and poverty. Simulations of the agreement's tariff liberalization provisions indicate very minor negative economic and poverty impacts mostly affecting the rural poor. The poverty results depend in size and direction on the way the government addresses tariff revenue losses and on labor market assumptions. |
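A hedged sketch of the micro-accounting step in such a macro-micro framework: pass a CGE-simulated consumer-price change down to household data and recompute the poverty headcount. Real frameworks also adjust household incomes through factor markets; the data below are synthetic, not Uganda's:

```python
import numpy as np

def poverty_headcount(expenditure, price_change, poverty_line):
    """Share of households below the poverty line after deflating
    per-capita expenditure by a simulated consumer-price change
    (expenditure-side effect only -- our simplification)."""
    real_exp = expenditure / (1.0 + price_change)
    return np.mean(real_exp < poverty_line)

exp = np.random.default_rng(1).lognormal(mean=0.5, sigma=0.8, size=10_000)
print(poverty_headcount(exp, price_change=0.00, poverty_line=1.0))
print(poverty_headcount(exp, price_change=0.02, poverty_line=1.0))
```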
Keywords: | Economic Partnership Agreements; Uganda; poverty; trade liberalization; computable general equilibrium; microsimulation |
JEL: | D58 F14 O10 O55 |
Date: | 2015–05 |
URL: | http://d.repec.org/n?u=RePEc:tcd:tcduee:tep0315&r=cmp |
By: | Andrea Teglio (Department of Economics, Universitat Jaume I, Castellón, Spain); Andrea Mazzocchetti (Università di Genova, DIME-CINEF, Genova, Italy); Linda Ponta (Università di Genova, DIME-CINEF, Genova, Italy); Marco Raberto (Università di Genova, DIME-CINEF, Genova, Italy); Silvano Cincotti (Università di Genova, DIME-CINEF, Genova, Italy)
Abstract: | The 2008 financial crisis, and the subsequent global recession, triggered a widespread economic and political debate on the proper policy combination to deal with the crisis and to prevent similar ones in the future. The main dispute has arguably been over the use of fiscal instruments to foster growth while keeping public debt under control. The European Union, for instance, endorsed measures for fiscal consolidation but has been sharply criticized by many scholars, including Nobel laureates. This paper contributes to this debate by presenting the outcomes of a computational study performed with the Eurace agent-based model. We set up an experiment with two base policy scenarios, i.e., the stability and growth pact and the fiscal compact, incrementally enriching them with complementary policies that relax fiscal rigidity and introduce quantitative easing. We are therefore able to compare eight policy combinations, spanning different degrees of fiscal and monetary expansion. Results show that budgetary rigour performs well if and only if some mechanism of fiscal relaxation and monetary accommodation operates during bad times, thus confirming, in a richer and more realistic model setting, the fundamental tenet of Keynesian economics about the importance of sustaining aggregate demand during recessions.
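The experimental design — two base fiscal rules, each with and without fiscal relaxation and quantitative easing, giving eight combinations — can be sketched as a simple grid. The `simulate` callable standing in for a run of the Eurace model is hypothetical:

```python
from itertools import product

# Two base fiscal rules, each crossed with two binary complementary
# policies: 2 x 2 x 2 = the eight policy combinations compared.
BASE_RULES = ("stability_and_growth_pact", "fiscal_compact")
SWITCHES = (False, True)

def run_experiment(simulate):
    """simulate: callable (base_rule, fiscal_relaxation, qe) -> outcome."""
    results = {}
    for rule, relax, qe in product(BASE_RULES, SWITCHES, SWITCHES):
        results[(rule, relax, qe)] = simulate(rule, relax, qe)
    return results

# e.g. run_experiment(lambda *cfg: my_eurace_wrapper(*cfg))  # hypothetical
```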
Keywords: | fiscal policy, quantitative easing, financial stability, economic crisis, agent-based modelling |
JEL: | E63 G01 H12 C63 |
Date: | 2015 |
URL: | http://d.repec.org/n?u=RePEc:jau:wpaper:2015/07&r=cmp |
By: | Anthony T. Flegg (University of the West of England, Bristol); Yongming Huang (Wuhan University, Wuhan, China); Timo Tohmo (University of Jyväskylä, Jyväskylä, Finland) |
Abstract: | Data for the Chinese province of Hubei are used to assess the performance of Kronenberg's CHARM, a method that takes explicit account of cross-hauling when constructing regional input-output tables. A key determinant of cross-hauling is held to be the heterogeneity of commodities, which is estimated using national data. However, contrary to the authors' findings for Finland, CHARM does not generate reliable estimates of Hubei's sectoral exports, imports and volume of trade, although it is more successful in estimating sectoral supply multipliers. The poor simulations of regional trade are attributed to the fact that Hubei is a relatively small region with a large divergence between regional and national technology and patterns of final demand. The simulation errors are decomposed into components reflecting differences between regional and national technology, final demand and heterogeneity. The third component is found to be the least important of the three sources of error.
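A sketch of the CHARM trade estimates as we read Kronenberg's method — cross-hauling proportional to a commodity-heterogeneity parameter estimated from national data. Treat this as a hedged outline; see Kronenberg (2009) and the paper for the exact derivation:

```python
import numpy as np

def charm_trade_estimates(x_r, d_r, x_n, d_n, e_n, m_n):
    """Sectoral regional trade estimates in the spirit of CHARM.

    x: output, d: domestic demand, e: exports, m: imports;
    suffix _r = regional, _n = national. All 1-D arrays by sector."""
    b_n = e_n - m_n                   # national trade balance
    q_n = (e_n + m_n) - np.abs(b_n)   # national cross-hauling
    h = q_n / (x_n + d_n)             # commodity heterogeneity
    b_r = x_r - d_r                   # regional trade balance
    q_r = h * (x_r + d_r)             # estimated regional cross-hauling
    v_r = np.abs(b_r) + q_r           # estimated regional trade volume
    e_r = (v_r + b_r) / 2.0           # exports: v = e + m, b = e - m
    m_r = (v_r - b_r) / 2.0           # imports
    return e_r, m_r
```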
Keywords: | Regional input-output tables; Non-survey methods; CHARM; Cross-hauling; China |
Date: | 2015–01–06 |
URL: | http://d.repec.org/n?u=RePEc:uwe:wpaper:20151506&r=cmp |