nep-cmp New Economics Papers
on Computational Economics
Issue of 2012‒06‒13
sixteen papers chosen by
Stan Miles
Thompson Rivers University

  1. Self-Organizing Maps and the US Urban Spatial Structure By Daniel Arribas-Bel; Charles R. Schmidt
  2. Spin model with negative absolute temperatures for stock market forecasting By J. L. Subias
  3. Computing Solutions for Matching Games By Peter Biro; Walter Kern; Daniel Paulusma
  4. Error estimates for binomial approximations of game put options By Y. Iron; Y. Kifer
  5. Choosing a retirement income strategy: a new evaluation framework By Pfau, Wade Donald
  6. PHILGEM: A SAM-based Computable General Equilibrium Model of the Philippines By Erwin L. Corong; J. Mark Horridge
  7. ILS-ESP: An Efficient, Simple, and Parameter-Free Algorithm for Solving the Permutation Flow-Shop Problem By Barry B. Barrios; Quim Castellà; Angel A. Juan; Helena R. Lourenço; Manuel Mateo
  8. Improving the Multi-Dimensional Comparison of Simulation Results: A Spatial Visualization Approach By Daniel Arribas-Bel; Julia Koschinsky; Pedro Amaral
  9. Construction and updating of a Ugandan CGE database By Louise Roos; Philip Adams; Jan van Heerden
  10. Tax Morale and Tax Evasion: Social Preferences and Bounded Rationality By Zsombor Z. Méder; András Simonovits; János Vincze
  11. Impacts of large-scale expansion of biofuels on global poverty and income distribution By Cororaton, Caesar B.; Timilsina, Govinda R.
  12. Kriging in Multi-response Simulation, including a Monte Carlo Laboratory By Kleijnen, Jack P.C.; Mehdad, E.
  13. Sale Of Visas: A Smuggler's Final Song? By Emmanuelle Auriol; Alice Mesnard
  14. A DSGE model with Endogenous Term Structure By M. Falagiarda; M. Marzo
  15. Optimal Treatment of an SIS Disease with Two Strains By Telalagic, S.
  16. When is debt sustainable? By Jasper Lukkezen; Hugo Rojas-Romagosa

  1. By: Daniel Arribas-Bel; Charles R. Schmidt (GeoDa Center for Geospatial Analysis and Computation; Arizona State University)
    Abstract: This article considers urban spatial structure in US cities using a multi-dimensional approach. We select six key variables (commuting costs, density, employment dispersion/concentration, land-use mix, polycentricity and size) from the urban literature and define measures to quantify them. We then apply these measures to 359 metropolitan areas from the 2000 US Census. The adopted methodological strategy combines two novel techniques for the social sciences to explore the existence of relevant patterns in such multi-dimensional datasets. Geodesic self-organizing maps (SOM) are used to visualize the whole set of information in a meaningful way, while the recently developed max-p clustering algorithm is applied to draw boundaries within the SOM and analyze which cities fall into each of them.
    Keywords: Urban spatial structure, self-organizing maps, US metropolitan areas
    JEL: C45 R0 R12 R14
    Date: 2011
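The core SOM training loop used in papers like this one can be sketched in a few lines. The version below is a plain rectangular-grid SOM, not the geodesic variant the authors use, and the grid size, decay schedules and random data are all illustrative:

```python
import numpy as np

def train_som(data, grid=(5, 5), epochs=50, lr0=0.5, sigma0=2.0, seed=0):
    """Train a minimal rectangular self-organizing map on `data` (n_samples x n_features)."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    # Codebook: one weight vector per map node, initialized randomly.
    weights = rng.random((rows * cols, data.shape[1]))
    # Node coordinates on the grid, for the neighborhood function.
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                 # decaying learning rate
        sigma = sigma0 * (1 - epoch / epochs) + 0.5     # decaying neighborhood radius
        for x in data:
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)     # grid distance to BMU
            h = np.exp(-d2 / (2 * sigma ** 2))                 # Gaussian neighborhood
            weights += lr * h[:, None] * (x - weights)         # pull nodes toward x
    return weights

# Usage: map 6-dimensional city indicators (here random stand-ins) onto a 5x5 grid.
cities = np.random.default_rng(1).random((40, 6))
som = train_som(cities)
print(som.shape)  # (25, 6)
```

After training, each city is assigned to its best-matching node, and contiguous regions of nodes (as the max-p algorithm draws them) group cities with similar profiles.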
  2. By: J. L. Subias
    Abstract: A spin model relating physical to financial variables is presented. Based on this model, an algorithm evaluating negative temperatures was applied to New York Stock Exchange quotations from May 2005 up to the present. Stylized patterns resembling known processes in phenomenological thermodynamics were found, namely, population inversion and the magnetocaloric effect.
    Date: 2012–06
  3. By: Peter Biro (Institute of Economics - Hungarian Academy of Sciences); Walter Kern (Faculty of Electrical Engineering, Mathematics and Computer Science, University of Twente, P.O.Box 217, NL-7500 AE Enschede); Daniel Paulusma (Department of Computer Science, University of Durham Science Laboratories, South Road, Durham DH1 3EY, England)
    Abstract: A matching game is a cooperative game (N, v) defined on a graph G = (N, E) with an edge weighting w : E → R+. The player set is N and the value of a coalition S ⊆ N is defined as the maximum weight of a matching in the subgraph induced by S. First we present an O(nm + n² log n) algorithm that tests if the core of a matching game defined on a weighted graph with n vertices and m edges is nonempty and that computes a core member if the core is nonempty. This algorithm improves on previous work based on the ellipsoid method and can also be used to compute stable solutions for instances of the stable roommates problem with payments. Second we show that the nucleolus of an n-player matching game with a nonempty core can be computed in O(n⁴) time. This generalizes the corresponding result of Solymosi and Raghavan for assignment games. Third we prove that it is NP-hard to determine an imputation with a minimum number of blocking pairs, even for matching games with unit edge weights, whereas the problem of determining an imputation with minimum total blocking value is shown to be polynomial-time solvable for general matching games.
    Keywords: matching game; nucleolus; cooperative game theory
    JEL: C61 C63 C71 C78
    Date: 2011–12
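The characteristic function of a matching game is concrete enough to compute directly. The sketch below evaluates v(S) by brute force over edge subsets (fine only for tiny instances; the paper's algorithms are polynomial) and illustrates why the unit-weight triangle has an empty core:

```python
from itertools import combinations

def coalition_value(coalition, edges):
    """v(S): maximum weight of a matching in the subgraph induced by coalition S.
    Brute force over edge subsets -- illustrative, for tiny instances only."""
    sub = [(u, v, w) for (u, v, w) in edges if u in coalition and v in coalition]
    best = 0.0
    for k in range(len(sub) + 1):
        for subset in combinations(sub, k):
            endpoints = [n for (u, v, _) in subset for n in (u, v)]
            if len(endpoints) == len(set(endpoints)):   # no shared endpoints: a matching
                best = max(best, sum(w for (_, _, w) in subset))
    return best

# Triangle with unit weights: any matching uses at most one edge, so v(N) = 1,
# yet every two-player coalition also demands 1 -- hence the core is empty.
edges = [(1, 2, 1.0), (2, 3, 1.0), (1, 3, 1.0)]
print(coalition_value({1, 2, 3}, edges))  # 1.0
print(coalition_value({1, 2}, edges))     # 1.0
```

A core allocation x must satisfy x(S) ≥ v(S) for every S with x(N) = v(N); the triangle shows this can fail, which is exactly what the paper's core-membership test detects.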
  4. By: Y. Iron; Y. Kifer
    Abstract: We construct algorithms via binomial approximations for computation of prices of game put options and obtain estimates of approximation errors.
    Date: 2012–06
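A game (Israeli) put adds a writer's cancellation right to the usual American put: the writer may cancel by paying the payoff plus a penalty. A minimal CRR binomial sketch of the dynamic program is below; the penalty structure and parameter values are illustrative, and the paper's error estimates are not reproduced here:

```python
import math

def game_put_binomial(S0, K, r, sigma, T, n, penalty):
    """Price a game put on an n-step CRR binomial tree (illustrative sketch).
    Holder exercises for max(K - S, 0); writer may cancel, paying payoff + penalty.
    Backward recursion: V = max(exercise, min(cancel, continuation))."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))
    d = 1 / u
    p = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up probability
    disc = math.exp(-r * dt)
    # Terminal payoffs at step n.
    values = [max(K - S0 * u**j * d**(n - j), 0.0) for j in range(n + 1)]
    for i in range(n - 1, -1, -1):
        new = []
        for j in range(i + 1):
            cont = disc * (p * values[j + 1] + (1 - p) * values[j])
            payoff = max(K - S0 * u**j * d**(i - j), 0.0)
            cancel = payoff + penalty
            new.append(max(payoff, min(cancel, cont)))
        values = new
    return values[0]

price = game_put_binomial(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, n=100, penalty=5.0)
print(round(price, 4))
```

Because the writer's option only caps the value, the game put is always worth no more than the corresponding American put; the binomial price converges to the continuous-time value as n grows, which is the approximation the paper bounds.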
  5. By: Pfau, Wade Donald
    Abstract: This article presents the initial stages of a new evaluation framework for choosing among retirement income strategies. The investigation includes eight retirement income strategies: constant inflation-adjusted withdrawal amounts, a constant withdrawal percentage of remaining assets, a withdrawal percentage based on remaining life expectancy, a more aggressive hybrid withdrawal percentage, inflation-adjusted and fixed single premium immediate annuities, a variable annuity with a guaranteed living withdrawal benefit rider, and a strategy which annuitizes the flooring level to meet basic needs and uses the hybrid withdrawal percentage for remaining assets. These eight strategies will be analyzed with six retirement outcome measures over a 30-year retirement period: the average amount whereby spending falls below the minimally acceptable level, the average spending amount, the remaining bequest at the end of the retirement period, the minimum spending amount for any year in the retirement period, a measure of whether spending increases or decreases over time defined as spending in the first year divided by spending in the 30th year, and the value of total spending after accounting for diminishing returns from increased spending for a client with somewhat inflexible spending needs. The model is applied to three client scenarios representing a cross-section of RIIA’s client segmentation matrix. It is built using Monte Carlo simulations which reflect current market conditions, so that systematic withdrawals and guaranteed products share compatible underlying assumptions.
    Keywords: retirement planning; retirement income modeling
    JEL: G11 C15 D14
    Date: 2012–06–01
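Two of the withdrawal rules compared above can be contrasted in a small Monte Carlo sketch. The return process, parameter values and failure metric below are illustrative stand-ins, not the article's calibration:

```python
import math
import random

def failure_rate(withdraw, n_paths=2000, years=30, start=100.0,
                 mu=0.05, sigma=0.12, seed=42):
    """Fraction of simulated 30-year retirements in which wealth is exhausted.
    `withdraw(wealth, year)` gives the amount taken at the start of each year;
    annual real returns are lognormal via Gaussian log-returns (illustrative)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_paths):
        wealth = start
        for year in range(years):
            wealth -= min(withdraw(wealth, year), wealth)
            if wealth <= 0:
                failures += 1
                break
            wealth *= math.exp(rng.gauss(mu, sigma))  # one year's real return
    return failures / n_paths

# Constant inflation-adjusted amount vs. constant percentage of remaining assets.
fixed = failure_rate(lambda w, y: 4.0)           # 4 real units per year
proportional = failure_rate(lambda w, y: 0.04 * w)  # 4% of whatever remains
print(fixed >= proportional)  # True: proportional withdrawals cannot exhaust wealth
```

The comparison illustrates the trade-off the article's six outcome measures are designed to capture: the proportional rule never fails but lets spending fall with the market, while the fixed rule stabilizes spending at the cost of a positive ruin probability.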
  6. By: Erwin L. Corong; J. Mark Horridge
    Abstract: This paper describes the structure of PHILGEM, a single-country computable general equilibrium (CGE) model of the Philippine economy. PHILGEM offers a good starting point for model development, especially for researchers who may want to extend their ORANI-G models to draw on supplementary data coming from a social accounting matrix (SAM). A generic version of the model, designed for expository purposes and for adaptation to other countries, is described here. The description of PHILGEM's equations and database is closely integrated with an explanation of how the model is solved using the GEMPACK system. Freely available computer files contain a complete model specification and database.
    Keywords: CGE modelling, Social Accounting Matrix, Philippines
    JEL: C68 D58 O21
    Date: 2012–04
  7. By: Barry B. Barrios; Quim Castellà; Angel A. Juan; Helena R. Lourenço; Manuel Mateo
    Abstract: From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it will be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process since it uses basic 'common sense' rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization process of the initial solution to randomly generate different alternative initial solutions of similar quality, which is attained by applying biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and, thus, allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. Also, the experiments show that, when using parallel computing, it is possible to improve on the top ILS-based metaheuristic simply by incorporating our biased randomization process, with a high-quality pseudo-random number generator, into it.
    Keywords: Flow-Shop Problem, Scheduling, Randomized Algorithms, Iterated Local Search, Metaheuristics, GRASP
    JEL: C63 M11
    Date: 2012–02
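The ILS skeleton the paper builds on is short enough to sketch. The version below uses a random-swap perturbation, an insertion local search, and accept-if-not-worse; these are generic stand-ins, not the ILS-ESP operators themselves:

```python
import random

def makespan(perm, proc):
    """Completion time of the last job on the last machine for permutation `perm`.
    `proc[j][m]` is the processing time of job j on machine m."""
    m = len(proc[0])
    comp = [0.0] * m
    for j in perm:
        comp[0] += proc[j][0]
        for k in range(1, m):
            comp[k] = max(comp[k], comp[k - 1]) + proc[j][k]
    return comp[-1]

def ils_pfsp(proc, iters=200, seed=0):
    """Bare-bones iterated local search for the PFSP (a sketch, not ILS-ESP)."""
    rng = random.Random(seed)
    n = len(proc)
    best = list(range(n))
    best_val = makespan(best, proc)
    cur = best[:]
    for _ in range(iters):
        cand = cur[:]
        i, j = rng.sample(range(n), 2)        # perturbation: swap two jobs
        cand[i], cand[j] = cand[j], cand[i]
        for job in list(cand):                # local search: best reinsertion per job
            cand.remove(job)
            pos = min(range(len(cand) + 1),
                      key=lambda p: makespan(cand[:p] + [job] + cand[p:], proc))
            cand.insert(pos, job)
        val = makespan(cand, proc)
        if val < best_val:
            best, best_val = cand[:], val
        if val <= makespan(cur, proc):        # acceptance: not worse
            cur = cand
    return best, best_val

proc = [[3, 2], [1, 4], [2, 1]]   # 3 jobs x 2 machines (toy instance)
perm, cmax = ils_pfsp(proc)
print(cmax <= makespan([0, 1, 2], proc))  # True: never worse than identity order
```

The paper's contribution is precisely that these three stages can be made parameter-free without losing competitiveness; the sketch shows only the control flow they plug into.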
  8. By: Daniel Arribas-Bel; Julia Koschinsky (GeoDa Center for Geospatial Analysis and Computation; Arizona State University); Pedro Amaral
    Abstract: Results from simulation experiments are important in applied spatial econometrics to, for instance, assess the performance of spatial estimators and tests for finite samples. However, the traditional tabular and graphical formats for displaying simulation results in the literature have several disadvantages. These include loss of results, lack of intuitive synthesis, and difficulty in comparing results across multiple dimensions. We propose to address these challenges through a spatial visualization approach. This approach visualizes model precision and bias as well as the size and power of tests in map format. The advantage of this spatial approach is that these maps can display all results succinctly, enable an intuitive interpretation, and compare results efficiently across multiple dimensions of a simulation experiment. Due to the respective strengths of tables, graphs and maps, we propose this spatial approach as a supplement to traditional tabular and graphical display formats. To allow readers to generate maps such as the ones presented in this article, a package (written in Python) has been made available by the authors as free/libre software. The package includes an example as well as a short tutorial for researchers without programming experience and can be downloaded at:
    Date: 2011
  9. By: Louise Roos; Philip Adams; Jan van Heerden
    Abstract: This paper documents (1) the structure of a CGE database; (2) the data manipulation steps in creating such a database from published data; (3) the updating of a SAM; and (4) features of the updated SAM. The database is constructed for a Ugandan CGE model. The building blocks for creating a database for a CGE model are official data from an input-output (IO) table, a Supply Use Table (SUT), or a SAM. Often the structure of the published data is not in the required format of a CGE database, and so a major task is to transform the official data into the form a CGE database requires. The first step in this task is typically a review of the primary source of data. We then proceed by identifying any implausible, unusual and negative values. We adjust these elements and rebalance the database to ensure that the balancing conditions hold. We then proceed to create the matrices required by the CGE model. Typically we create (1) a source dimension for all user-specific matrices, (2) user- and source-specific margin matrices, (3) user- and source-specific tax matrices and (4) industry-specific land rentals. It is likely that as we adjust data and create the required matrices, we violate the balancing conditions. Therefore at each step in the database construction stage, we check the balancing conditions and when appropriate we rebalance the database to ensure that they hold. Having constructed the 2002 database that conforms to the CGE structure, we update the database to 2009. We then proceed to create an additional sector, namely a RawOil sector; in terms of the database, we create an additional industry and an additional commodity. Our final task is to create, based on the 2009 database, an updated SAM. The CGE database does not provide information on transfers between economic agents, so we adjust the transfer elements based on shares estimated for 2002.
    Keywords: CGE modelling, database construction, Uganda
    JEL: C68 C69
    Date: 2012–03
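The repeated "rebalance so the balancing conditions hold" step is commonly done with a biproportional (RAS) adjustment. The sketch below is that standard technique applied to a toy flow matrix; it is an illustration of the rebalancing idea, not necessarily the authors' exact procedure:

```python
import numpy as np

def ras_balance(matrix, row_totals, col_totals, iters=100, tol=1e-10):
    """RAS (biproportional) adjustment: alternately rescale rows and columns of a
    nonnegative matrix until its margins match the target totals."""
    m = matrix.astype(float).copy()
    for _ in range(iters):
        r = row_totals / m.sum(axis=1)   # row scaling factors
        m *= r[:, None]
        s = col_totals / m.sum(axis=0)   # column scaling factors
        m *= s[None, :]
        if np.allclose(m.sum(axis=1), row_totals, atol=tol):
            break
    return m

# Force a 2x2 flow matrix to hit new row and column totals (which must agree in sum).
flows = np.array([[4.0, 1.0], [2.0, 3.0]])
balanced = ras_balance(flows, row_totals=np.array([6.0, 4.0]),
                       col_totals=np.array([5.0, 5.0]))
print(np.round(balanced.sum(axis=0), 6))  # [5. 5.]
```

RAS preserves the sign pattern and rough structure of the original matrix, which is why it is a common choice after negative or implausible cells have been adjusted by hand.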
  10. By: Zsombor Z. Méder (Maastricht University Department of Economics); András Simonovits (Institute of Economics Research Centre for Economic and Regional Studies Hungarian Academy of Sciences and Budapest University of Technology and Economics Institute of Mathematics and Central European University, Department of Economics); János Vincze (Institute of Economics Research Centre for Economic and Regional Studies Hungarian Academy of Sciences and Corvinus University of Budapest)
    Abstract: We study a family of models of tax evasion, where a flat-rate tax finances only the provision of public goods, neglecting audits and wage differences. We focus on the comparison of two modeling approaches. The first is based on optimizing agents, who are endowed with social preferences, their utility being the sum of private consumption and moral utility. The second approach involves agents acting according to simple heuristics. We find that while we encounter the traditionally shaped Laffer-curve in the optimizing model, the heuristics models exhibit (linearly) increasing Laffer-curves. This difference is related to a peculiar type of behavior emerging within the heuristics based approach: a number of agents lurk in a moral state of limbo, alternating between altruism and selfishness.
    Keywords: tax evasion, tax morale, agent-based simulation
    JEL: H26
    Date: 2012–01
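A heuristics-based tax model of this general flavor can be sketched with a simple imitation rule. The payoff structure and imitation heuristic below are my own illustrative choices, not the paper's specification, so the resulting revenue curve should not be read as reproducing its findings:

```python
import random

def tax_revenue(tax_rate, n_agents=500, rounds=100, seed=0):
    """Agent-based sketch: agents either comply or evade a flat tax that funds an
    equally shared public good; each round one agent imitates another agent's
    choice if that choice yielded a higher private payoff. Illustrative only."""
    rng = random.Random(seed)
    comply = [rng.random() < 0.5 for _ in range(n_agents)]
    for _ in range(rounds):
        public = tax_rate * sum(comply) / n_agents   # per-capita public good

        def payoff(c):
            # Net private income (1 - tax if compliant) plus the public good share.
            return (1 - tax_rate if c else 1.0) + public

        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if payoff(comply[j]) > payoff(comply[i]):
            comply[i] = comply[j]
    return tax_rate * sum(comply) / n_agents

# Revenue as a function of the flat tax rate (a toy Laffer-type curve).
revenues = [tax_revenue(t / 10) for t in range(11)]
print(revenues[0] == 0.0)  # True: a zero tax rate raises no revenue
```

With purely material payoffs, imitation erodes compliance as the rate rises; the paper's point is that adding moral utility or richer heuristics changes the shape of this curve.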
  11. By: Cororaton, Caesar B.; Timilsina, Govinda R.
    Abstract: This paper analyzes the impact of large-scale expansion of biofuels on the global income distribution and poverty. A global computable general equilibrium model is used to simulate the effects of the expansion of biofuels on resource allocation, commodity prices, factor prices and household income. A second model based on world-wide household surveys uses these results to calculate the impacts on poverty and global income inequality. The study finds that the large-scale expansion of biofuels leads to an increase in production and prices of agricultural commodities. The increased prices would cause higher food prices, especially in developing countries. Moreover, wages of unskilled rural labor would also increase, which slows down the rural to urban migration in many developing countries. The study also shows that the effects on poverty vary across regions; it increases in South Asia and Sub-Saharan Africa, whereas it decreases in Latin America. At the global level, the expansion of biofuels increases poverty slightly.
    Keywords: Rural Poverty Reduction, Food & Beverage Industry, Regional Economic Development, Economic Theory & Research, Labor Policies
    Date: 2012–06–01
  12. By: Kleijnen, Jack P.C.; Mehdad, E. (Tilburg University, Center for Economic Research)
    Abstract: To analyze the input/output behavior of simulation models with multiple responses, we may apply either univariate or multivariate Kriging (Gaussian process) models. Univariate Kriging may use a popular MATLAB Kriging toolbox called 'DACE'. Multivariate Kriging faces a major problem: its covariance matrix should remain positive-definite; this problem may be solved through a nonseparable dependence model. To evaluate the performance of these two Kriging models, we develop a Monte Carlo 'laboratory' that simulates Gaussian processes. To verify that this laboratory works correctly, we derive statistics that test whether the Kriging parameters have the correct values. Our Monte Carlo results demonstrate that in general DACE gives smaller Mean Squared Error (MSE); we also explain these results.
    Keywords: positive-definite covariance matrix; nonseparable dependence model; Gaussian process; verification
    JEL: C0 C1 C9 C15 C44
    Date: 2012
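The univariate Kriging predictor at the heart of such comparisons is a short computation. Below is a textbook zero-mean Gaussian-process (simple-kriging) predictor with a Gaussian kernel; it is not the DACE toolbox interface, and the kernel parameters are illustrative:

```python
import numpy as np

def kriging_predict(X, y, Xnew, length=1.0, sig2=1.0, nugget=1e-8):
    """Zero-mean GP (simple kriging) prediction with a Gaussian kernel."""
    def k(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return sig2 * np.exp(-d2 / (2 * length ** 2))
    # Small nugget keeps the covariance matrix positive-definite.
    K = k(X, X) + nugget * np.eye(len(X))
    alpha = np.linalg.solve(K, y)       # weights of the kernel interpolant
    return k(Xnew, X) @ alpha

X = np.array([[0.0], [0.3], [0.7]])
y = np.sin(2 * np.pi * X[:, 0])
pred = kriging_predict(X, y, X, length=0.3)   # predict at the training points
print(np.allclose(pred, y, atol=1e-5))        # True: kriging interpolates (up to the nugget)
```

The positive-definiteness issue the abstract mentions is exactly what the nugget term guards against in the univariate case; in the multivariate case a nonseparable dependence model plays that role for the cross-response covariances.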
  13. By: Emmanuelle Auriol (Toulouse School of Economics (ARQADE and IDEI) and CEPR); Alice Mesnard (City University, Institute for Fiscal Studies and CEPR)
    Abstract: Is there a way of eliminating human smuggling? We set up a model to simultaneously determine the provision of human smuggling services and the demand from would-be migrants. A visa-selling policy may be successful at eliminating human smugglers by eroding their profits but it necessarily increases immigration. In contrast, reinforced repression decreases migration but uses the help of cartelized smugglers. To overcome this trade-off we study how legalisation and repression can be combined to eliminate human smuggling while controlling migration flows. This policy mix also has the advantage that the funds raised by visa sales can be used to finance additional investments in border and internal controls (employer sanctions and deportations). Simulations of the policy implications highlight the complementarities between repression and legalisation and call into question the current policies.
    Keywords: migration, migration policies, market structure, legalisation, human smuggling.
    JEL: F22 I18 L51 O15
    Date: 2012–05
  14. By: M. Falagiarda; M. Marzo
    Abstract: In this paper, we propose a DSGE model with a term structure of interest rates, drawing on the framework introduced by Andrés et al. (2004) and Marzo et al. (2008). In particular, we reproduce segmentation in financial markets by introducing bonds of different maturities and bond adjustment costs that are non-zero at the steady state, thereby introducing structural liquidity frictions among bonds of different maturities: agents are assumed to pay a cost whenever they trade bonds. As a result, the model is able to generate a non-zero demand for bonds of different maturities, which become imperfect substitutes due to differential liquidity conditions. The main properties of the model are analysed through both simulation and estimation exercises. The results are important for two reasons. On the one hand, the calibrated model is able to replicate the stylized facts regarding the yield curve and the term premium in the US over the period 1987:3-2011:3, without compromising its ability to match macro dynamics. On the other hand, the estimation, besides providing empirical support for the theoretical setting, highlights the potential of the model to analyze the term premium in a microfounded macro framework. The results match very closely the behavior of actual yields, reflecting the recent activity of the Fed on longer-maturity bonds.
    JEL: C5 E32 E37 E43 E44
    Date: 2012–06
  15. By: Telalagic, S.
    Abstract: This paper explores optimal treatment of an SIS (Susceptible-Infected-Susceptible) disease that has two strains with different infectivities. When we assume that neither eradication nor full infection is possible, it is shown that there are two categories of equilibria. First, there are two continua of interior equilibria characterised by a fixed, positive total level of infection, where both strains of the disease prevail. It is hypothesised that a Skiba curve of indifference lies between them. Second, there are two sets of equilibria where one strain of the disease is eradicated asymptotically. The feasibility of equilibria depends on parameter assumptions; a combination of a low natural rate of recovery and a large difference between infectivities leaves only a small proportion of equilibria as feasible. Simulations exploring the relationship between cost and optimal policy are carried out. There exists a parameter range such that, counter-intuitively, it is optimal to allow the high-infectivity strain of the disease to prevail, while asymptotically eradicating the low-infectivity strain. Within this parameter range, there is added benefit from policy flexibility. At higher costs, simulations of the interior equilibria demonstrate the existence of a Skiba curve. The curve delineates two regions, each of which has a clear optimal policy.
    Keywords: Epidemiological modelling, Optimal control, Simulations
    JEL: I18 I19 C61 C63
    Date: 2012–05–30
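The two-strain SIS dynamics underlying these simulations can be sketched with an Euler step. The functional form below (additive treatment rates u_i on top of natural recovery) and all parameter values are illustrative, not the paper's exact control specification:

```python
def simulate_two_strain_sis(beta1, beta2, gamma, u1, u2, T=200.0, dt=0.01):
    """Euler sketch of a two-strain SIS model with S + I1 + I2 = 1.
    Strain i spreads at rate beta_i * S * I_i; recovery is natural (gamma)
    plus treatment (u_i). Returns the final infected fractions (i1, i2)."""
    i1, i2 = 0.01, 0.01
    for _ in range(int(T / dt)):
        s = 1.0 - i1 - i2
        di1 = beta1 * s * i1 - (gamma + u1) * i1
        di2 = beta2 * s * i2 - (gamma + u2) * i2
        i1 = max(i1 + dt * di1, 0.0)
        i2 = max(i2 + dt * di2, 0.0)
    return i1, i2

# Treating only the low-infectivity strain drives it out asymptotically,
# while the untreated high-infectivity strain settles at its endemic level.
i1, i2 = simulate_two_strain_sis(beta1=0.8, beta2=0.4, gamma=0.1, u1=0.0, u2=0.3)
print(i1 > 0.1 and i2 < 1e-3)  # True
```

This is the asymptotic-eradication type of equilibrium the abstract describes; which strain it is optimal to eradicate is the paper's cost-dependent question, not something this sketch answers.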
  16. By: Jasper Lukkezen; Hugo Rojas-Romagosa
    Abstract: This CPB Discussion Paper proposes indicators to assess government debt sustainability. Sustainable government finances can be achieved via three main channels: fiscal responses, economic growth and financial repression. The fiscal response provides information on the long-term country-specific attitude towards fiscal sustainability and is estimated using Bohn (2008)'s approach. We combine the estimated fiscal response with a stochastic debt simulation and calculate the probability of debt-to-GDP ratios rising above some threshold. This is applied to historical data for seven OECD countries. In particular, the probability of the debt-to-GDP ratio rising by more than 20% in the next decade clearly separates the countries that have sustainability concerns (Spain, Portugal and Iceland) from those that do not (the US, the UK, the Netherlands and Belgium).
    JEL: E4 E6 H0 H6
    Date: 2012–06
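The probability indicator described above can be sketched as a stochastic debt simulation with a Bohn-style fiscal reaction (primary surplus responding linearly to debt). All parameter values below are illustrative placeholders, not estimates from the paper:

```python
import random

def prob_debt_rises(d0=0.6, threshold=0.2, alpha=0.02, years=10,
                    n_paths=5000, seed=7):
    """Probability that the debt-to-GDP ratio rises by more than `threshold`
    within `years`, under a random interest-growth differential and a fiscal
    response with slope `alpha` (a Bohn-style reaction; illustrative sketch)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_paths):
        d = d0
        for _ in range(years):
            r_minus_g = rng.gauss(0.0, 0.02)   # interest rate minus growth rate
            surplus = alpha * d                # primary surplus responds to debt
            d = (1 + r_minus_g) * d - surplus  # debt accumulation equation
        if d - d0 > threshold:                 # rose by more than 20 pp of GDP
            hits += 1
    return hits / n_paths

p = prob_debt_rises()
print(0.0 <= p <= 1.0)  # True
```

A stronger fiscal response (larger alpha) mechanically lowers this probability, which is why the estimated reaction coefficient is the key country-specific input to the indicator.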

This nep-cmp issue is ©2012 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.