nep-cmp New Economics Papers
on Computational Economics
Issue of 2016‒02‒29
fifteen papers chosen by



  1. The Potential Economic Impacts of the Proposed Development Corridor in Egypt: An Interregional CGE Approach By Diana N. Elshahawany; Eduardo A. Haddad; Michael L. Lahr
  2. Multi-country decentralized agent based model: Macroeconomic dynamics and vulnerability in a simplified currency union By Catullo, Ermanno; Gallegati, Mauro
  3. Modern Trade Theory for CGE Modelling: the Armington, Krugman and Melitz Models By Dixon, Peter; Michael Jerie; Maureen Rimmer
  4. Multi-stage Adjustable Robust Mixed-Integer Optimization via Iterative Splitting of the Uncertainty Set (Revision of CentER Discussion Paper 2014-056) By Postek, K.S.; den Hertog, D.
  5. RHOMOLO Model Manual: A Dynamic Spatial General Equilibrium Model for EU Regions and Sectors By Francesco Di Comite; Olga Diukanova; D'Artis Kancs
  6. Estimating the marginal abatement cost curve of CO2 emissions in China: Provincial panel data analysis By Du, Limin; Hanley, Aoife; Wei, Chu
  7. Calibrating the Dynamic Nelson-Siegel Model: A Practitioner Approach By Francisco Ibáñez
  8. Assessing surface water flood risk and management strategies under future climate change: an agent-based model approach By Katie Jenkins; Swenja Surminski; Jim Hall; Florence Crick
  9. A New Formulation and Benders' Decomposition for Multi-period Facility Location Problem with Server Uncertainty By Vatsa, Amit Kumar; Jayaswal, Sachin
  10. A large scale OLG model for France, Italy and Sweden: assessing the interpersonal and intrapersonal redistributive effects of public policies By Alessandro Bucciol; Laura Cavalli; Igor Fedotenkov; Paolo Pertile; Veronica Polin; Nicola Sartor; Alessandro Sommacal
  11. Breadth-first and Best-first Exact Procedures for Regular Measures of the Multi-mode RCPSP By Dayal Madhukar; Verma, Sanjay
  12. Repairing non-monotone ordinal data sets by changing class labels By Pijls, W.H.L.M.; Potharst, R.
  13. A Historical Welfare Analysis of Social Security: Whom Did the Program Benefit? By Peterman, William B.; Sommer, Kamila
  14. Likelihood Evaluation of High-Dimensional Spatial Latent Gaussian Models with Non-Gaussian Response Variables By Jean-François Richard
  15. Lifting the US Crude Oil Export Ban: A Numerical Partial-Equilibrium Analysis By Lissy Langer; Daniel Huppmann; Franziska Holz

  1. By: Diana N. Elshahawany; Eduardo A. Haddad; Michael L. Lahr
    Abstract: Egypt has proposed a new development corridor. A main component is a desert-based expansion of the current highway network. This network is founded on a 1200-kilometer north-south route that starts at a proposed new port near El-Alemein and runs parallel to the Nile Valley to the border of Sudan. It also includes 21 east-west branches that connect the main axis to densely populated cities on the Nile. The paper is a first attempt at an economic assessment of the impact of this proposed corridor. It uses an interregional computable general equilibrium (CGE) model developed and reported in a prior paper. Here, that model is integrated with a more detailed geo-coded transportation network model to help quantify the spatial effects of transportation cost change due specifically to changes in accessibility induced by the corridor. The paper focuses on the likely structural economic impacts that such a large investment in transportation could enable through a series of simulations related to the operational phase of the project.
    Keywords: Impact analysis; interregional CGE models; transport infrastructure; accessibility; Egypt
    JEL: R13 R42 C68
    Date: 2015–11–10
    URL: http://d.repec.org/n?u=RePEc:spa:wpaper:2015wpecon42&r=cmp
  2. By: Catullo, Ermanno; Gallegati, Mauro
    Abstract: We develop a multi-country agent-based simulation model with endogenous incremental technological change. Macroeconomic dynamics derive from simple behavioral and interaction rules defining the actions of adaptive firms, banks and households (Delli Gatti et al., 2008; Riccetti et al., 2014; Caiani et al., 2015). Countries join a currency union with a perfectly integrated goods market, while labor and capital are not exchanged across countries. We observe that credit dynamics are closely associated with the business cycle: phases of credit growth are accompanied by increasing leverage and connectivity, which create the conditions for crisis. Moreover, we test the effects of different fiscal regimes on output dynamics, showing that in a common currency area restrictive fiscal regimes may increase country inequality and systemic vulnerability. Inequality between countries derives from differences in technological progress patterns, which open competitiveness gaps. Conversely, in fiscal regimes where public deficits are excessively high, the public debt burden tends to increase, transferring risk from the private sector to the public one.
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:zbw:fmpwps:50&r=cmp
  3. By: Dixon, Peter; Michael Jerie; Maureen Rimmer
    Abstract: This paper is for CGE modelers and others interested in modern trade theory. The Armington specification of trade, assuming country-level product differentiation, has been central to CGE modelling for 40 years. Starting in the 1980s with Krugman and more recently Melitz, trade theorists have preferred specifications with firm-level product differentiation. We draw out the connections between the Armington, Krugman and Melitz models, deriving them as successively less restrictive special cases of an encompassing model. We then investigate optimality properties of the Melitz model, demonstrating that a Melitz general equilibrium is the solution to a global, cost-minimizing problem. This suggests that envelope theorems can be used in interpreting results from a Melitz model. Next we explain the Balistreri-Rutherford decomposition in which a Melitz general equilibrium model is broken into Melitz sectoral models combined with an Armington general equilibrium model. Balistreri and Rutherford see their decomposition as a basis of an iterative approach for solving Melitz general equilibrium models. We see it as a means for interpreting Melitz results as the outcome of an Armington simulation with additional shocks to productivity and preferences variables. With CGE modelers in mind, we report computational experience in solving a Melitz general equilibrium model using GEMPACK.
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:gta:techpp:4595&r=cmp
  4. By: Postek, K.S. (Tilburg University, Center For Economic Research); den Hertog, D. (Tilburg University, Center For Economic Research)
    Abstract: In this paper we propose a methodology for constructing decision rules for integer and continuous decision variables in multi-period robust linear optimization problems. This type of problem finds application in, for example, inventory management, lot sizing, and manpower management. We show that by iteratively splitting the uncertainty set into subsets, one can differentiate the later-period decisions based on the revealed uncertain parameters, while the problem's computational complexity stays at the same level as for the static robust problem. This holds also in the non-fixed recourse situation. In the fixed recourse situation, our approach can be combined with linear decision rules for the continuous decision variables. We provide theoretical results on how to split the uncertainty set by identifying sets of uncertain parameter scenarios to be divided for an improvement in the worst-case objective value. Based on this theory, we propose several splitting heuristics. Numerical examples entailing a capital budgeting and a lot-sizing problem illustrate the advantages of the proposed approach.
    Keywords: adjustable; decision rules; integer; multi-stage; robust optimization
    JEL: C61
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:tiu:tiucen:08442e3a-d1eb-42b3-8f13-8af3dc9d85c2&r=cmp
  5. By: Francesco Di Comite (European Commission – JRC - IPTS); Olga Diukanova (European Commission – JRC - IPTS); D'Artis Kancs (European Commission – JRC - IPTS)
    Abstract: This manual explains how to use the RHOMOLO model in practice for policy impact assessment. We explain how to read its modular structure and retrieve its database, and provide a step-by-step guide to performing simulations using either its GAMS-IDE interface (for expert users) or a user-friendly graphical web interface.
    Keywords: RHOMOLO, Macro-Economic Models, Computable General Equilibrium, Impact assessment
    JEL: C68 D24 D58 H50 O31 O32
    Date: 2016–01
    URL: http://d.repec.org/n?u=RePEc:ipt:iptwpa:jrc96776&r=cmp
  6. By: Du, Limin; Hanley, Aoife; Wei, Chu
    Abstract: This paper estimates the Marginal Abatement Cost Curve (MACC) of CO2 emissions in China based on a provincial panel for the period 2001-2010. The provincial marginal abatement cost (MAC) of CO2 emissions is estimated using a parameterized directional output distance function. Four model specifications are applied to fit the MAC-carbon intensity pairs, and the optimal specification controlling for various covariates is identified econometrically. A scenario simulation of China's 40-45 percent carbon intensity reduction based on our MACC shows that China would incur a 559-623 Yuan/ton (roughly 51-57 percent) increase in marginal abatement cost to achieve a corresponding 40-45 percent reduction in carbon intensity relative to its 2005 level.
    Keywords: CO2 Emissions,Marginal Abatement Cost Curve,Model Selection,China
    JEL: Q52 Q54 Q58
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:zbw:ifwkwp:1985&r=cmp
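The curve-fitting step the abstract describes — choosing a functional form for the MAC-carbon-intensity pairs and reading off the marginal cost at a target intensity cut — can be sketched as follows. The synthetic data, the quadratic specification, and the baseline intensity are all illustrative assumptions, not the paper's estimates:

```python
import numpy as np

# Synthetic MAC-carbon-intensity pairs (illustrative only): marginal
# abatement cost is assumed to fall as carbon intensity rises.
intensity = np.linspace(0.5, 3.0, 20)          # tons CO2 per unit output
rng = np.random.default_rng(0)
mac = 800.0 / intensity + rng.normal(0, 10, intensity.shape)

# Fit one candidate functional form (a quadratic) to the pairs.
coeffs = np.polyfit(intensity, mac, deg=2)
macc = np.poly1d(coeffs)

# Marginal cost implied by a 40 percent intensity cut from a
# hypothetical baseline intensity of 2.0.
baseline = 2.0
cost_at_cut = macc(baseline * (1 - 0.40))
```

In the paper, several specifications are fitted and the best one selected econometrically; the sketch shows only the mechanics of fitting a single form and evaluating it at a policy target.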
  7. By: Francisco Ibáñez
    Abstract: The dynamic version of the Nelson-Siegel model has found useful applications in the investment management industry, ranging from yield curve forecasting to portfolio risk management. Because estimating its parameters is complex, some practitioners are unable to benefit from the model. This note presents two approaches to estimating the time series of the model's factors. The first has a more technical aim, focusing on the construction of a representative basis to work with, and uses a genetic algorithm to tackle the optimization problem. The second is aimed at practitioners, focusing on ease of implementation. The results show that both methodologies fit the U.S. Treasury bond market well.
    Date: 2016–01
    URL: http://d.repec.org/n?u=RePEc:chb:bcchwp:774&r=cmp
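For readers unfamiliar with the model, the practitioner-friendly route such notes typically take is a two-step fit: fix the decay parameter and recover the level, slope and curvature factors by ordinary least squares. A minimal sketch (the maturities, factor values and the decay value 0.0609 are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def ns_loadings(tau, lam):
    """Nelson-Siegel factor loadings for maturities tau and decay lam."""
    x = lam * tau
    slope = (1 - np.exp(-x)) / x
    # Columns: level, slope, curvature loadings.
    return np.column_stack([np.ones_like(tau), slope, slope - np.exp(-x)])

def fit_ns(tau, yields, lam=0.0609):
    """Two-step fit: with lam fixed, estimate the three factors by OLS."""
    X = ns_loadings(tau, lam)
    beta, *_ = np.linalg.lstsq(X, yields, rcond=None)
    return beta

# Toy curve generated from known factors; OLS recovers them exactly.
tau = np.array([0.25, 1.0, 2.0, 5.0, 10.0, 30.0])   # maturities in years
true_beta = np.array([4.0, -2.0, 1.5])               # level, slope, curvature
yields = ns_loadings(tau, 0.0609) @ true_beta
beta_hat = fit_ns(tau, yields)
```

Repeating the OLS step date by date yields the time series of factors that the note's second, easier-to-implement approach targets.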
  8. By: Katie Jenkins; Swenja Surminski; Jim Hall; Florence Crick
    Abstract: Flooding is the costliest natural disaster worldwide. In the UK, flooding is listed as a major risk on the National Risk Register, with surface water flooding the most likely cause of damage to properties. Climate change and increasing urbanisation are both projected to increase surface water flood events and their associated damages in the future. In this paper we present an Agent-Based Model (ABM), applied to a London case study of surface water flood risk, designed to assess the interplay between different adaptation options; how risk reduction could be achieved by homeowners and government; and the role of flood insurance and the recently launched flood insurance pool, Flood Re, in the context of climate change. The ABM is novel in its coverage of different combinations of flood risk management options, insurance, and Flood Re, and in its ability to model changing behaviour, decision making, surface water flood events, and surface water flood risk in a dynamic manner. The analysis highlights that while combined investment in property-level protection measures and sustainable urban drainage systems reduces surface water flood risk, the benefits can be outweighed by continued development in high-risk areas and the effects of climate change. Flood Re is beneficial in its function of providing affordable insurance, even under climate change, and is shown to have some positive effects on the housing market in the model. However, in our simulations Flood Re faces increasing pressure due to rising surface water flood risk, which highlights the importance of forward-looking flood risk management interventions that utilize insurance incentives, limit new development, and support resilience measures. Our findings are highly relevant for the ongoing regulatory and political approval process for Flood Re, as well as for the wider flood risk management discussion in the UK.
    Date: 2016–02
    URL: http://d.repec.org/n?u=RePEc:lsg:lsgwps:wp223&r=cmp
  9. By: Vatsa, Amit Kumar; Jayaswal, Sachin
    Abstract: Facility location problems reported in the literature generally assume the parameter values (cost, budget, etc.) to be known with complete certainty, even if they change over time (as in multi-period versions). In reality, however, there may be some uncertainty about the exact values of these parameters. Specifically, in the context of locating primary health centers (PHCs) in developing countries, there is generally a high level of uncertainty in the availability of servers (doctors) joining the facilities in different time periods. For transparency and efficient assignment of doctors to PHCs, it is desirable to decide the facility opening sequence (assigning doctors to unmanned PHCs) at the start of the planning horizon. For this, we present a new formulation for the multi-period maximal coverage location problem with server uncertainty (MMCLPSU). We further demonstrate the superiority of our proposed formulation over the only other formulation reported in the literature. For instances of practical size, we provide a Benders decomposition-based solution method, along with several refinements. On instances that the CPLEX MIP solver could solve within a time limit of 20 hours, our proposed method turns out to be on the order of 150-250 times faster for problems with complete coverage, and around 1000 times faster for gradual coverage.
    URL: http://d.repec.org/n?u=RePEc:iim:iimawp:13185&r=cmp
  10. By: Alessandro Bucciol (Department of Economics (University of Verona)); Laura Cavalli (Department of Economics (University of Verona)); Igor Fedotenkov (Department of Economics (University of Verona)); Paolo Pertile (Department of Economics (University of Verona)); Veronica Polin (Department of Economics (University of Verona)); Nicola Sartor (Department of Economics (University of Verona)); Alessandro Sommacal (Department of Economics (University of Verona))
    Abstract: The paper presents a large-scale overlapping generations model with heterogeneous agents, where the family is the decision unit. We model a large number of tax and public expenditure (cash and in-kind) programmes, so that the equity and efficiency implications of public sector intervention may be assessed in their full complexity. We do this for three European countries that show remarkable differences in the design of most of these programmes: France, Italy and Sweden. We show that the model is able to match relevant aggregate and distributional statistics of the three countries we analyse. To illustrate the workings of the model, we provide examples of policy experiments that can be simulated: we compare our model economies featuring the current set of public policies implemented in France, Italy and Sweden with alternative economies where some (or all) public finance programmes are absent, looking at the effects on both inequality and individual welfare.
    Keywords: Redistribution, Fiscal policy, Computable OLG models
    JEL: H2 H3
    Date: 2014–04
    URL: http://d.repec.org/n?u=RePEc:ver:wpaper:07/2014&r=cmp
  11. By: Dayal Madhukar; Verma, Sanjay
    Abstract: The multi-mode resource-constrained project scheduling problem (MM-RCPSP) is an NP-hard problem generalizing the well-studied RCPSP. The depth-first tree search of Sprecher & Drexl (1998) is the best-known exact tree search procedure for this problem. In this paper we present two exact single-processor approaches: a breadth-first approach and a best-first approach with a monotone heuristic. Comparisons with depth-first search and CPLEX show promising results on small problem sets. We also report an extension of the breadth-first approach that yields exact multi-objective solutions for the PSPLIB (Project Scheduling Problem Library, Kolisch & Sprecher, 1997) problem sets, the first of its kind.
    URL: http://d.repec.org/n?u=RePEc:iim:iimawp:12919&r=cmp
  12. By: Pijls, W.H.L.M.; Potharst, R.
    Abstract: Ordinal data sets often contain a certain amount of non-monotone noise. This paper proposes three algorithms for removing these non-monotonicities by relabeling the noisy instances. The first is a naive algorithm. The second is a refinement of the naive algorithm that minimizes the difference between the old and the new label. The third, a refinement of the second, is optimal in the sense that the number of unchanged instances is maximized. In addition, the runtime complexities are discussed.
    Keywords: Ordinal data sets
    Date: 2014–11–01
    URL: http://d.repec.org/n?u=RePEc:ems:eureir:77641&r=cmp
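For intuition, a naive repair of the kind the abstract mentions can be illustrated in one dimension: sort instances by feature value and raise each label to the running maximum, so that no larger instance keeps a smaller label. This is a deliberately simplified sketch — the paper's algorithms handle general ordinal data and optimize how labels are changed:

```python
def naive_relabel(xs, ys):
    """Naively repair a 1-D ordinal data set: visiting instances in order
    of feature value, raise each label to the running maximum so the
    repaired labels are non-decreasing in the feature."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    repaired = list(ys)
    running = min(ys)
    for i in order:
        running = max(running, repaired[i])
        repaired[i] = running
    return repaired

xs = [1, 2, 3, 4, 5]
ys = [1, 3, 2, 2, 4]   # the instances labelled 2 violate monotonicity
fixed = naive_relabel(xs, ys)   # -> [1, 3, 3, 3, 4]
```

Note that this naive pass only ever raises labels; the paper's refined algorithms instead pick relabelings that minimize label changes or maximize the number of unchanged instances.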
  13. By: Peterman, William B. (Board of Governors of the Federal Reserve System (U.S.)); Sommer, Kamila (Board of Governors of the Federal Reserve System (U.S.))
    Abstract: A well-established result in the literature is that Social Security tends to reduce steady state welfare in a standard life cycle model. However, less is known about the historical effects of the program on agents who were alive when the program was adopted. In a computational life cycle model that simulates the Great Depression and the enactment of Social Security, this paper quantifies the welfare effects of the program's enactment on the cohorts of agents who experienced it. In contrast to the standard steady state results, we find that the adoption of the original Social Security tended to improve these cohorts' welfare. In particular, we estimate that the original program benefited households alive at the time of the program's adoption with a likelihood of over 80 percent, and increased these agents' welfare by the equivalent of 5.9% of their expected future lifetime consumption. The welfare benefit was particularly large for poorer agents and agents who were near retirement age when the program was enacted. Through a series of counterfactual experiments we demonstrate that the difference between the steady state and transitional welfare effects is primarily driven by a slower adoption of payroll taxes and a quicker adoption of benefit payments during the program's phase-in. Overall, the opposite welfare effects experienced by agents in the steady state versus agents who experienced the program's adoption might offer one explanation for why a program that potentially reduces welfare in the steady state was originally adopted.
    Keywords: Social Security; Recessions; Great Depression; Overlapping Generations
    JEL: D91 E21 H55
    Date: 2015–09–24
    URL: http://d.repec.org/n?u=RePEc:fip:fedgfe:2015-92&r=cmp
  14. By: Jean-François Richard
    Abstract: We propose a generic algorithm for numerically accurate likelihood evaluation of a broad class of spatial models characterized by a high-dimensional latent Gaussian process and non-Gaussian response variables. The class of models under consideration includes specifications for discrete choices, event counts and limited dependent variables (truncation, censoring, and sample selection) among others. Our algorithm relies upon a novel implementation of Efficient Importance Sampling (EIS) specifically designed to exploit typical sparsity of high-dimensional spatial precision (or covariance) matrices. It is numerically very accurate and computationally feasible even for very high-dimensional latent processes. Thus Maximum Likelihood (ML) estimation of high-dimensional non-Gaussian spatial models, hitherto considered to be computationally prohibitive, becomes feasible. We illustrate our approach with ML estimation of a spatial probit for US presidential voting decisions and spatial count data models (Poisson and Negbin) for firm location choices.
    Date: 2015–01
    URL: http://d.repec.org/n?u=RePEc:pit:wpaper:5778&r=cmp
  15. By: Lissy Langer; Daniel Huppmann; Franziska Holz
    Abstract: The upheaval in global crude oil markets and the boom in oil production from shale plays in North America have brought scrutiny on the export ban for crude oil in the United States. This paper examines the global flows and strategic refinery adjustments in a spatial, game-theoretic partial-equilibrium model. We consider detailed supply chain infrastructure with multiple crude oil qualities (supply), distinct oil products (demand), as well as specific refinery configurations and modes of transport (mid-stream). Investments in production capacity and infrastructure are endogenous. We compare two development pathways for the global oil market: one projection retaining the US export ban, and a counterfactual scenario lifting the export restrictions. Lifting the US crude ban, we find significant expansion of US sweet crude exports. In the US refinery sector, more heavy sour crude is imported and transformed. While US producers gain, the profits of US refiners decrease, due to reduced market distortions and a more efficient resource allocation. Countries importing US sweet crude benefit from higher product output, while avoiding costly refinery investments. Producers of heavy sour crude (e.g. the Middle East) are incentivised to climb up the value chain to defend their market share and maintain their dominant position.
    Keywords: energy system model, crude oil market, US crude export ban, refining capacity, infrastructure investment
    JEL: Q41 Q47 Q48 C61
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:diw:diwwpp:dp1548&r=cmp

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.