New Economics Papers on Computational Economics
Issue of 2016‒06‒25
twelve papers chosen by
By: | María Teresa Álvarez-Martínez (European Commission – JRC-IPTS); Montserrat López-Cobo (European Commission – JRC-IPTS) |
Abstract: | This paper describes the construction of a new set of national social accounting matrices (SAMs) for the EU-27 in 2010 that will be regionalized and used in RHOMOLO, the regional computable general equilibrium (CGE) model developed by the European Commission to evaluate the impact of cohesion policies. After a careful analysis of the input-output frameworks available in Eurostat and the World Input-Output Database (WIOD), the latter has been used as the main data source, complemented with information from national accounts in Eurostat. The structure of the SAM is determined by the sectoral disaggregation in WIOD. It includes a useful disaggregation of labour by skill level and a disaggregation of the foreign sector into the EU and the rest of the world. The paper describes in detail how to elaborate a symmetric product-by-product input-output table at purchasers' prices from supply and use tables by applying the industry technology assumption. It also describes the reallocation of social contributions needed to properly assign tax revenues to government and to avoid the problems generated by the second redistribution of income in national accounts. The description of the SAMs and their availability for the EU-27 can be very useful to researchers in applied economics and may help to better understand the structure of RHOMOLO. |
Keywords: | social accounting matrices, national accounts, EU-27 |
JEL: | D57 E16 |
Date: | 2016–06 |
URL: | http://d.repec.org/n?u=RePEc:ipt:iptwpa:jrc101673&r=cmp |
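The industry-technology transformation described in this abstract has a compact matrix form: with a use table U (products × industries), a make table M (industries × products) and industry outputs g, the symmetric product-by-product table is S = U diag(g)⁻¹ M. A minimal sketch of that step, with invented toy supply-use numbers (not the paper's data):

```python
import numpy as np

# Toy supply-use system: 2 products (rows of U) x 2 industries (columns of U).
# U[p, j] = use of product p by industry j (purchasers' prices).
# M[j, p] = supply (make) of product p by industry j.
U = np.array([[20.0, 15.0],
              [10.0, 25.0]])
M = np.array([[60.0,  5.0],
              [ 0.0, 55.0]])

g = M.sum(axis=1)   # industry outputs
q = M.sum(axis=0)   # product outputs

# Industry technology assumption: each industry applies its own input
# structure to every product it makes.  Product-by-product flow table:
#   S = U * diag(g)^-1 * M
S = U @ np.diag(1.0 / g) @ M

print(S)                 # symmetric product-by-product intermediate flows
print(S.sum(axis=0), q)  # column sums = intermediate inputs per product;
                         # q minus these totals is value added etc.
```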
By: | Caiani, Alessandro; Russo, Alberto; Gallegati, Mauro |
Abstract: | The paper builds upon the agent-based stock-flow-consistent model presented in Caiani et al. (2015) to analyze the relationship between income and wealth inequality and economic development. To this end, the original model has been amended along three main dimensions. First, the household sector has been subdivided into workmen, office workers, researchers, and executives, who compete on segmented labor markets; correspondingly, firms are now characterized by a hierarchical organization structure which determines, according to firms' output levels, their demand for each type of worker. Second, in order to account for the impact of income and wealth distribution on consumption patterns, the different household classes - also representing different income groups - have diversified average propensities to consume and save. Finally, the model now embeds technological change of an evolutionary flavor: labor productivity in the consumption sector evolves through product innovation in the capital sector, where firms invest in R&D and produce differentiated vintages of machinery. The model is then calibrated using realistic values for income and wealth distribution across income groups and for their average propensities to consume. Results of the simulation experiments suggest that more progressive tax schemes and labor market policies aiming to increase low- and middle-level workers' coordination and to support their wage levels concur to foster economic development and to reduce inequality, with the latter being more effective in both respects. The model thus provides some evidence in favor of a wage-led growth regime, in which improvements in middle- and low-level workers' conditions create positive systemic effects that eventually trickle up to high-income profit-earning households as well. |
Keywords: | Innovation, Inequality, Agent Based Macroeconomics, Stock Flow Consistent Models. |
JEL: | C63 D31 D33 E32 O33 |
Date: | 2016–06 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:71864&r=cmp |
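Class-specific consumption behaviour of the kind described above typically enters AB-SFC models through a linear rule in disposable income and net wealth, with propensities falling up the income distribution. A minimal sketch; the propensity values below are invented for illustration, not taken from the paper:

```python
# Illustrative class-specific consumption rule:
#   C_h = c_y(class) * disposable income + c_w(class) * net wealth.
# All numbers below are invented for illustration.
PROPENSITIES = {              # (c_y, c_w) by household class
    "workman":    (0.95, 0.05),
    "office":     (0.85, 0.04),
    "researcher": (0.75, 0.03),
    "executive":  (0.55, 0.02),
}

def consumption(h_class: str, income: float, wealth: float) -> float:
    c_y, c_w = PROPENSITIES[h_class]
    return c_y * income + c_w * wealth

for k in PROPENSITIES:
    print(k, consumption(k, income=100.0, wealth=500.0))
```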
By: | Teruyoshi Kobayashi (Graduate School of Economics, Kobe University); Naoki Masuda (Graduate School of Economics, Kobe University) |
Abstract: | A practical approach to protecting networks against epidemic processes such as the spreading of infectious diseases, malware, and harmful viral information is to remove influential nodes beforehand so as to fragment the network into small components. Because determining the optimal order in which to remove nodes is a computationally hard problem, various approximate algorithms have been proposed to efficiently fragment networks by sequential node removal. Morone and Makse proposed an algorithm employing the non-backtracking matrix of a given network, which outperforms various existing algorithms. However, many empirical networks have community structure, which compromises the assumption of a locally tree-like structure on which the original algorithm is based. We develop an immunization algorithm by synergistically combining the Morone-Makse algorithm with a coarse graining of the network in which each community is regarded as a supernode. In this way, we aim to identify nodes that connect different communities at a reasonable computational cost. The proposed algorithm works more efficiently than the Morone-Makse and other algorithms on networks with community structure. |
Date: | 2016–06 |
URL: | http://d.repec.org/n?u=RePEc:koe:wpaper:1616&r=cmp |
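The coarse-graining idea can be sketched mechanically: detect communities, collapse each into a supernode, and rank supernodes by a simple level-1 collective-influence score, CI₁(i) = (kᵢ − 1) Σ_{j∈N(i)} (kⱼ − 1). This only conveys the flavour of the approach; the authors' algorithm uses the non-backtracking matrix, not this crude score:

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.karate_club_graph()            # stand-in for an empirical network

# 1. Coarse-grain: one supernode per detected community.
communities = list(greedy_modularity_communities(G))
node2comm = {v: c for c, comm in enumerate(communities) for v in comm}
H = nx.Graph()
H.add_nodes_from(range(len(communities)))
H.add_edges_from(
    (node2comm[u], node2comm[v]) for u, v in G.edges()
    if node2comm[u] != node2comm[v]
)

# 2. Level-1 collective influence on the coarse-grained graph.
def ci1(graph, i):
    return (graph.degree(i) - 1) * sum(graph.degree(j) - 1
                                       for j in graph.neighbors(i))

ranking = sorted(H.nodes(), key=lambda i: ci1(H, i), reverse=True)
print("supernodes ranked by CI_1:", ranking)
```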
By: | Mattia GUERINI (Scuola Superiore Sant'Anna); Mauro Napoletano (OFCE); Andrea Roventini (Laboratory of Economics and Management (Pisa) (LEM)) |
Abstract: | We develop an agent-based model in which heterogeneous firms and households interact in labor and goods markets according to centralized or decentralized search-and-matching protocols. As the model has a deterministic backbone and a full-employment equilibrium, it can be directly compared to Dynamic Stochastic General Equilibrium (DSGE) models. We study the effects of negative productivity shocks by way of impulse-response functions (IRFs). Simulation results show that when search and matching are centralized, the economy is always able to return to the full-employment equilibrium and IRFs are similar to those generated by DSGE models. However, when search and matching are local, coordination failures emerge and the economy persistently deviates from full employment. Moreover, agents display persistent heterogeneity. Our results suggest that macroeconomic models should explicitly account for agents' heterogeneity and direct interactions. |
Keywords: | Agent-based model; Local interactions; Heterogeneous agents; DSGE Model |
JEL: | E3 E32 E37 |
Date: | 2016–06 |
URL: | http://d.repec.org/n?u=RePEc:spo:wpmain:info:hdl:2441/20d1ncsepb9ssq3b3v4s6nbc41&r=cmp |
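The contrast between the two protocols can be illustrated with a toy labor market: a centralized matcher clears as many worker-vacancy pairs as possible, while under local search each worker only samples a few firms, so vacancies and unemployment coexist. A minimal sketch; the protocol details and numbers are invented for illustration, not the paper's specification:

```python
import random

random.seed(1)
workers, vacancies = list(range(100)), list(range(80))

# Centralized protocol: a Walrasian-style matcher pairs workers with
# vacancies until one side is exhausted -> min(100, 80) = 80 matches.
central_matches = min(len(workers), len(vacancies))

# Decentralized protocol: each worker samples 3 random firms; a vacancy
# goes to the first applicant, later applicants are turned away.
filled = set()
local_matches = 0
for w in workers:
    for f in random.sample(vacancies, 3):
        if f not in filled:
            filled.add(f)
            local_matches += 1
            break

# Local search leaves some vacancies unfilled: a coordination failure.
print(central_matches, local_matches)
```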
By: | Balazs Kiraly (Institute of Physics, Budapest University of Technology, Budapest); Andras Simonovits (Institute of Economics - Centre for Economic and Regional Studies, Hungarian Academy of Sciences, and Mathematical Institute of Budapest University of Technology, Budapest) |
Abstract: | Mandatory pension systems only partially replace old-age income; therefore governments also operate voluntary pension systems in which savings are matched by government grants. Accounting for the resulting tax expenditure, our models describe the income flow from shortsighted to farsighted workers. (1) In rational models, explicit results are obtained, showing the limited learning of shortsighted workers. (2) In agent-based models, this learning is improved, which raises shortsighted workers' saving and reduces the perverse income redistribution. |
Keywords: | life-cycle savings, overlapping generations, mandatory pensions, voluntary pensions, agent-based models |
JEL: | H55 D91 |
Date: | 2016–02 |
URL: | http://d.repec.org/n?u=RePEc:has:discpr:1606&r=cmp |
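The redistribution channel described above is easy to write down: voluntary saving s is topped up at match rate α up to a cap, and the grants are financed by a tax everyone pays, so income flows toward those who save. A minimal sketch with invented parameter values (not the paper's calibration):

```python
# Invented parameters for illustration only.
ALPHA, CAP = 0.25, 100.0          # match rate and maximum matched saving

def matched_grant(saving: float) -> float:
    """Government top-up on voluntary pension saving."""
    return ALPHA * min(saving, CAP)

# A farsighted worker saves 100, a shortsighted worker saves 0; both pay
# the tax that finances the grants -> redistribution toward the farsighted.
grants = [matched_grant(100.0), matched_grant(0.0)]
tax_per_worker = sum(grants) / 2
net = [g - tax_per_worker for g in grants]
print(net)   # [12.5, -12.5]: perverse redistribution from the shortsighted
```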
By: | Maria Nieswand; Stefan Seifert |
Abstract: | Benchmarking methods are widely used in the regulation of firms in network industries operating under heterogeneous exogenous environments. In this paper we compare three recently developed estimators, namely conditional DEA (Daraio and Simar, 2005, 2007b), latent class SFA (Orea and Kumbhakar, 2004; Greene, 2005), and the StoNEZD approach (Johnson and Kuosmanen, 2011), by means of Monte Carlo simulation, focusing on their ability to identify production frontiers in the presence of environmental factors. Data generation replicates regulatory data from the energy sector in terms of sample size, sample dispersion and distribution, and correlations of variables. Although the results show strengths of each of the three estimators in particular settings, latent class SFA performs best in nearly all simulations. Further, the results indicate that the accuracy of the estimators is relatively insensitive to different distributions of the environmental factors, their correlations with inputs, and their impact on the production process, but the performance of all approaches deteriorates with increasing noise. For regulators, this study provides guidance on adopting new benchmarking methods given industry characteristics. |
Keywords: | Monte Carlo Simulation, Environmental Factors, StoNEZD, Latent Class SFA, Conditional DEA, Regulatory Benchmarking |
JEL: | L50 Q50 C63 |
Date: | 2016 |
URL: | http://d.repec.org/n?u=RePEc:diw:diwwpp:dp1585&r=cmp |
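The core of such a Monte Carlo study is the data-generating process: a frontier, one-sided inefficiency, two-sided noise, and an environmental shifter. A minimal sketch of one plausible DGP; the functional form and parameters are invented, not the authors':

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200                                          # regulatory-sized sample

x = rng.lognormal(mean=1.0, sigma=0.5, size=n)   # input
z = rng.normal(size=n)                           # environmental factor
u = np.abs(rng.normal(scale=0.20, size=n))       # inefficiency (one-sided)
v = rng.normal(scale=0.10, size=n)               # noise (two-sided)

# Cobb-Douglas frontier shifted by the environment:
#   y = x^0.6 * exp(0.1*z - u + v)
y = x**0.6 * np.exp(0.1 * z - u + v)

# The three estimators (conditional DEA, latent class SFA, StoNEZD) would
# now be run on (x, y, z) and their frontier estimates compared with the
# true frontier x**0.6 * exp(0.1*z).
print(np.corrcoef(np.log(y), np.log(x))[0, 1])
```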
By: | Paul Gretton (EABER) |
Abstract: | Effective economic reform agendas provide a means of promoting national economic growth, raising living standards and adapting to changes in trading conditions, new technologies and ways of working. Taking the Australia-China economic relationship as its focus, the GTAP model of the global economy is used to project the implications for Australia and China of preferential, unilateral and broader approaches to trade liberalisation, a broad agenda for reform across the services sector, and financial market reform. The simulations show that reform strategies based on non-discriminatory trade liberalisation and broadly based, concerted domestic reforms are likely to deliver substantive economic benefits and contribute to growth. Agendas that are restrictive, either through preferential deals between trading partners or through a narrow sectoral focus domestically, are likely to constrain gains below levels that would otherwise be attainable. |
JEL: | F1 F3 F4 O4 O5 |
Date: | 2016–06 |
URL: | http://d.repec.org/n?u=RePEc:eab:wpaper:25630&r=cmp |
By: | Hänisch, Carsten; Klos, Jonas |
Abstract: | This paper provides a micro-simulation study of the long-run effects of career interruptions in Germany, extending earlier work that generally focuses only on the first few years after an interruption. Using data from the German Socio-Economic Panel, it finds that career interruptions have, for the average individual, lifelong effects on incomes and labor-force participation. It quantifies these effects for the average affected individual as well as for society as a whole, and therefore provides additional information on the total cost of career interruptions. |
Keywords: | micro-simulation, career interruptions, lifetime income |
JEL: | H55 |
Date: | 2016 |
URL: | http://d.repec.org/n?u=RePEc:zbw:wgspdp:201603&r=cmp |
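Why interruptions have lifelong costs is easy to illustrate: if an interruption leaves a wage penalty that decays only slowly, the discounted lifetime loss far exceeds the income missed during the interruption itself. A toy calculation with invented numbers, not the paper's estimates:

```python
# Toy lifetime-income comparison; all numbers are invented for illustration.
YEARS, WAGE, R = 40, 40_000.0, 0.03    # career length, base wage, discount rate

def lifetime_income(interrupt_at=None, gap=3, penalty=0.15, half_life=10):
    total = 0.0
    for t in range(YEARS):
        if interrupt_at is not None and interrupt_at <= t < interrupt_at + gap:
            w = 0.0                                  # out of the labor force
        elif interrupt_at is not None and t >= interrupt_at + gap:
            decay = 0.5 ** ((t - interrupt_at - gap) / half_life)
            w = WAGE * (1 - penalty * decay)         # slowly fading penalty
        else:
            w = WAGE
        total += w / (1 + R) ** t
    return total

loss = lifetime_income() - lifetime_income(interrupt_at=10)
print(f"discounted lifetime loss: {loss:,.0f}")
```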
By: | Holopainen, Markus; Sarlin, Peter |
Abstract: | This paper presents first steps toward robust models for crisis prediction. We conduct a horse race of conventional statistical methods and more recent machine learning methods as early-warning models. As individual models are most often built in isolation from other methods in the literature, the exercise is highly relevant for assessing the relative performance of a wide variety of methods. Further, we test various ensemble approaches to aggregating the information products of the individual models, providing a more robust basis for measuring country-level vulnerabilities. Finally, we provide approaches to estimating model uncertainty in early-warning exercises, particularly model performance uncertainty and model output uncertainty. The approaches put forward in this paper are demonstrated with Europe as a playground. Generally, our results show that the conventional statistical approaches are outperformed by more advanced machine learning methods, such as k-nearest neighbors and neural networks, and particularly by model aggregation through ensemble learning. |
Keywords: | early-warning models, ensembles, financial stability, horse race, model uncertainty |
JEL: | E44 F30 G01 G15 C43 |
Date: | 2016–05 |
URL: | http://d.repec.org/n?u=RePEc:ecb:ecbwps:20161900&r=cmp |
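The ensemble step described above amounts to aggregating crisis probabilities from heterogeneous base learners. A minimal sketch with scikit-learn on synthetic data; the features and model choices are stand-ins, not the authors' specification:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for country-quarter vulnerability indicators,
# with crises as the rare class.
X, y = make_classification(n_samples=1000, n_features=10, weights=[0.9],
                           random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

models = [LogisticRegression(max_iter=1000),      # conventional benchmark
          KNeighborsClassifier(),                 # k-nearest neighbors
          MLPClassifier(max_iter=2000, random_state=0)]  # neural network
probs = []
for m in models:
    m.fit(Xtr, ytr)
    p = m.predict_proba(Xte)[:, 1]
    probs.append(p)
    print(type(m).__name__, round(roc_auc_score(yte, p), 3))

# Simple ensemble: average the predicted crisis probabilities.
print("ensemble", round(roc_auc_score(yte, np.mean(probs, axis=0)), 3))
```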
By: | Xiao Lin |
Abstract: | The aim of this paper is to present a dual-term-structure model of interest rate derivatives that addresses two of the hardest problems in financial modeling: the exact volatility calibration of the entire swaption matrix, and the calculation of bucket vegas for structured products. The model takes a series of long-term zero-coupon rates as basic state variables, driven directly by one or more Brownian motions. The model volatility is specified in a matrix form with two terms. A complete numerical scheme for implementing the model is developed in the paper. Finally, several examples are given of model calibration, the pricing of structured products, and the calculation of bucket vegas. |
Date: | 2016–06 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1606.01343&r=cmp |
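Bucket vegas are, by definition, sensitivities of a product's value to one cell of the volatility matrix at a time; whatever the model, the brute-force version is a bump-and-reprice over buckets. A generic sketch in which the pricer is a smooth placeholder, not the paper's model:

```python
import numpy as np

def price(vol_matrix: np.ndarray) -> float:
    """Placeholder pricer: stands in for the dual-term-structure model."""
    return float(np.sum(np.sqrt(vol_matrix)))    # any smooth functional

def bucket_vegas(vol_matrix: np.ndarray, bump: float = 1e-4) -> np.ndarray:
    """Central finite-difference vega for each (expiry, tenor) bucket."""
    vegas = np.zeros_like(vol_matrix)
    for i in range(vol_matrix.shape[0]):
        for j in range(vol_matrix.shape[1]):
            up, dn = vol_matrix.copy(), vol_matrix.copy()
            up[i, j] += bump
            dn[i, j] -= bump
            vegas[i, j] = (price(up) - price(dn)) / (2 * bump)
    return vegas

vols = np.full((3, 4), 0.20)        # toy swaption-style volatility matrix
print(bucket_vegas(vols))
```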
By: | Gilles Pagès (UPMC); Olivier Pironneau (LJLL); Guillaume Sall (LJLL) |
Abstract: | This paper deals with the computation of second- or higher-order Greeks of financial securities. It combines two methods, Vibrato and automatic differentiation, and compares them with other methods. We show that this combined technique is faster than standard finite differences, more stable than automatic differentiation of second-order derivatives, and more general than Malliavin calculus. We present a generic framework to compute any Greeks and present several applications to different types of financial contracts: European and American options, multidimensional basket calls, and stochastic volatility models such as Heston's model. We also give an algorithm to compute derivatives for the Longstaff-Schwartz Monte Carlo method for American options, and we extend automatic differentiation to second-order derivatives of options with non-twice-differentiable payoffs. |
Date: | 2016–06 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1606.06143&r=cmp |
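For a one-step Black-Scholes model, Vibrato collapses to the likelihood-ratio method, whose second-order weight gives an unbiased gamma even for a non-smooth payoff. A minimal sketch of that building block only, using the standard Glasserman (2004) weights; this is not the authors' full Vibrato-plus-automatic-differentiation scheme:

```python
import numpy as np

rng = np.random.default_rng(0)
S0, K, r, sigma, T, n = 100.0, 100.0, 0.02, 0.2, 1.0, 1_000_000

Z = rng.standard_normal(n)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
payoff = np.exp(-r * T) * np.maximum(ST - K, 0.0)   # non-smooth payoff

# Likelihood-ratio weights for geometric Brownian motion:
w_delta = Z / (S0 * sigma * np.sqrt(T))
w_gamma = ((Z**2 - 1) / (S0**2 * sigma**2 * T)
           - Z / (S0**2 * sigma * np.sqrt(T)))

# Closed-form Black-Scholes values here are roughly 0.579 and 0.020.
print("delta ~", np.mean(payoff * w_delta))
print("gamma ~", np.mean(payoff * w_gamma))
```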
By: | Marco Bianchetti; Davide Galli; Camilla Ricci; Angelo Salvatori; Marco Scaringi |
Abstract: | We applied the Johansen-Ledoit-Sornette (JLS) model to detect possible bubbles and crashes related to the Brexit/Bremain referendum scheduled for 23 June 2016. Our implementation includes an enhanced model calibration using genetic algorithms. We selected a few historical financial series sensitive to the Brexit/Bremain scenario, representative of multiple asset classes. We found that the equity and currency asset classes show no bubble signals, while rates, credit and real estate show the super-exponential behaviour and instabilities typical of a bubble regime. Our study suggests that, under the JLS model, equity and currency markets do not expect crashes or sharp rises following the referendum results. Instead, rates and credit markets consider the referendum a risky event, expecting either a Bremain scenario or a Brexit scenario softened by central bank intervention. In the case of real estate, a crash is expected, but its relationship with the referendum result is unclear. |
Date: | 2016–06 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1606.06829&r=cmp |
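The JLS model the authors calibrate is the standard log-periodic power law, ln p(t) = A + B(tc − t)^m + C(tc − t)^m cos(ω ln(tc − t) − φ). A minimal sketch of the model curve and a least-squares calibration objective; the paper uses genetic algorithms to minimize it, and the synthetic series and parameter values below are invented for illustration:

```python
import numpy as np

def jls_log_price(t, tc, m, omega, A, B, C, phi):
    """Johansen-Ledoit-Sornette log-periodic power law, valid for t < tc."""
    dt = tc - t
    return A + B * dt**m + C * dt**m * np.cos(omega * np.log(dt) - phi)

def sse(params, t, log_prices):
    """Least-squares objective a genetic algorithm would minimize."""
    return np.sum((jls_log_price(t, *params) - log_prices) ** 2)

# Synthetic bubble-like series generated from the model itself (toy check).
t = np.linspace(0.0, 0.9, 200)
true = (1.0, 0.5, 8.0, 4.6, -0.8, 0.05, 1.0)   # (tc, m, omega, A, B, C, phi)
obs = jls_log_price(t, *true) + np.random.default_rng(0).normal(0, 0.01, t.size)

print(sse(true, t, obs))   # near the noise floor at the true parameters
```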