nep-cmp New Economics Papers
on Computational Economics
Issue of 2005‒12‒01
seven papers chosen by
Stan Miles
York University

  1. Solving Dynamic Stochastic Optimization Problems Using the Method of Endogenous Gridpoints By Christopher D. Carroll
  2. Working with the XQC By Wolfgang Härdle; Heiko Lehmann
  3. New Computational Results for the Nurse Scheduling Problem: A Scatter Search Algorithm By B. MAENHOUT; M. VANHOUCKE
  4. Computing Second-Order-Accurate Solutions for Rational Expectation Models Using Linear Solution Methods By Giovanni Lombardo; Alan Sutherland
  5. Building and Linking a Microsimulation Model to a CGE Model : the South African Microsimulation Model By Nicolas Hérault
  6. A CELLULAR AUTOMATA MODEL OF THE GENERAL RATE OF PROFIT By Claudio Castelo Branco Puty
  7. UM MODELO MACRODINÂMICO PÓS-KEYNESIANO DE SIMULAÇÃO COM PROGRESSO TÉCNICO ENDÓGENO [A Post-Keynesian Macrodynamic Simulation Model with Endogenous Technical Progress] By José Luís Oreiro; Breno Pascualote Lemos

  1. By: Christopher D. Carroll (Economics Johns Hopkins University)
    Keywords: Numerical Optimization; Dynamic Programming; Precautionary Saving
    JEL: C61 C63 E2
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:red:sed005:628&r=cmp
  2. By: Wolfgang Härdle; Heiko Lehmann
    Abstract: An enormous number of statistical methods have been developed in quantitative finance during the last decades. Nonparametric methods, bootstrapping time series, wavelets, and estimation of diffusion coefficients are now almost standard in statistical applications. To implement these new methods, the developer usually uses a programming environment he is familiar with. Thus, such methods are only available for preselected software packages, but not for widely used standard packages like MS Excel. To apply these new methods to empirical data, a potential user faces a number of problems, or it may even be impossible for him to use the methods without rewriting them in a different programming language. Even a user who only wants to apply a newly developed method to simulated data, in order to understand the methodology, is confronted with the same drawbacks. A very similar problem occurs in teaching statistics at the undergraduate level. Since students usually have their preferred software and often do not have access to the same statistical software packages as their teacher, illustrating examples have to be executable with standard tools. In general, two statisticians stand on either side of the distribution process of newly implemented methods: the provider (inventor) of a new technique (algorithm) and the user who wants to apply (understand) the new technique. The aim of the XploRe Quantlet client/server architecture is to bring these statisticians closer to each other. The XploRe Quantlet Client (XQC) represents the front end - the user interface (UI) of this architecture - allowing access to the XploRe server, its methods, and its data. The XQC is fully programmed in Java and does not depend on a specific computer platform. It runs on Windows and Mac platforms as well as on Unix and Linux machines.
    Keywords: XploRe Quantlet Client, quantitative finance, application, applet
    JEL: C87 C88
    Date: 2005–03
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2005-010&r=cmp
  3. By: B. MAENHOUT; M. VANHOUCKE
    Abstract: In this paper, we present a scatter search algorithm for the well-known nurse scheduling problem (NSP). This problem aims at constructing roster schedules for nurses, taking both hard and soft constraints into account. The objective is to minimize the total preference cost of the nurses and the total penalty cost from violations of the soft constraints. The problem is known to be NP-hard. The contribution of this paper is threefold. First, we are, to the best of our knowledge, the first to present a scatter search algorithm for the NSP. Second, we investigate two different types of solution combination methods in the scatter search framework, based on four different cost elements. Last, we present detailed computational experiments on a recently published benchmark dataset and solve these problem instances under different assumptions. We show that our procedure performs consistently well under many different circumstances and hence can be considered robust against case-specific constraints.
    Keywords: meta-heuristics; scatter search; nurse scheduling
    Date: 2005–11
    URL: http://d.repec.org/n?u=RePEc:rug:rugwps:05/341&r=cmp
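To make the scatter-search framework in the abstract concrete, here is a toy skeleton applied to a stylized nurse scheduling instance: build a small reference set of improved solutions, repeatedly combine pairs, locally improve the offspring, and update the set. The instance, the penalty weight, and the row-wise combination rule are all invented for illustration; this is not Maenhout and Vanhoucke's algorithm.

```python
import random

random.seed(0)
N_NURSES, N_DAYS, SHIFTS = 6, 7, 3           # shift 0 = off, 1 = day, 2 = night
pref = [[random.randint(0, 9) for _ in range(N_DAYS)] for _ in range(N_NURSES)]
REQUIRED = 2                                  # nurses needed on duty each day

def cost(sched):
    """Preference cost plus a soft-constraint penalty for under-coverage."""
    c = sum(pref[n][d] for n in range(N_NURSES) for d in range(N_DAYS)
            if sched[n][d] != 0)
    for d in range(N_DAYS):
        on_duty = sum(1 for n in range(N_NURSES) if sched[n][d] != 0)
        c += 50 * max(0, REQUIRED - on_duty)  # penalty weight is an assumption
    return c

def random_schedule():
    return [[random.randrange(SHIFTS) for _ in range(N_DAYS)]
            for _ in range(N_NURSES)]

def combine(a, b):
    """Solution combination: inherit each nurse's full row from one parent."""
    return [list(a[n] if random.random() < 0.5 else b[n])
            for n in range(N_NURSES)]

def improve(sched):
    """Simple improvement step over single-cell shift changes."""
    best = cost(sched)
    for n in range(N_NURSES):
        for d in range(N_DAYS):
            old = sched[n][d]
            for s in range(SHIFTS):
                sched[n][d] = s
                c = cost(sched)
                if c < best:
                    best, old = c, s
            sched[n][d] = old                 # keep the best shift found
    return sched

# Scatter search loop: small reference set, combine, improve, update.
refset = sorted((improve(random_schedule()) for _ in range(10)), key=cost)[:5]
for _ in range(20):
    a, b = random.sample(refset, 2)
    child = improve(combine(a, b))
    if cost(child) < cost(refset[-1]):        # replace the worst member
        refset[-1] = child
        refset.sort(key=cost)

print(cost(refset[0]))
```

The real algorithm also maintains diversity in the reference set and uses several combination methods; this sketch only shows the overall control flow.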
  4. By: Giovanni Lombardo; Alan Sutherland
    Abstract: This paper shows how to compute a second-order accurate solution of a non-linear rational expectation model using algorithms developed for the solution of linear rational expectation models. The result is a state-space representation for the realized values of the variables of the model. This state-space representation can easily be used to compute impulse responses as well as conditional and unconditional forecasts.
    Keywords: Second-order approximation; solution method for rational expectation models.
    JEL: E63 E0
    Date: 2005–03
    URL: http://d.repec.org/n?u=RePEc:san:cdmacp:0504&r=cmp
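To illustrate what a second-order state-space representation of the kind described in the abstract buys you, here is a two-state toy recursion with linear transition plus quadratic correction terms, used to compute impulse responses by simulation. The matrices are made up for this example; they are not taken from the paper, and the paper's actual contribution is how to *derive* such terms with linear solution algorithms.

```python
import numpy as np

A = np.array([[0.9, 0.1],
              [0.0, 0.5]])             # first-order (linear) transition
B = np.array([[[0.02, 0.0],           # B[i]: quadratic form feeding state i
               [0.0,  0.01]],
              [[0.0,  0.0],
               [0.0,  0.0]]])
C = np.array([1.0, 0.5])              # shock loading

def step(x, eps):
    """x' = A x + 0.5 * (x' B[i] x) per state + C eps."""
    quad = np.array([0.5 * x @ B[i] @ x for i in range(2)])
    return A @ x + quad + C * eps

def impulse_response(horizon=10, shock=1.0):
    """Response to a one-off shock at t = 0."""
    x, path = np.zeros(2), []
    eps = shock
    for _ in range(horizon):
        x = step(x, eps)
        eps = 0.0
        path.append(x.copy())
    return np.array(path)

irf_small = impulse_response(shock=0.1)
irf_big = impulse_response(shock=1.0)
# The quadratic terms break scale-invariance: the big-shock IRF is not
# exactly 10x the small-shock IRF, unlike in a purely linear model.
print(irf_big[1], 10 * irf_small[1])
```

This size-dependence of impulse responses is one of the practical differences between first- and second-order accurate solutions.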
  5. By: Nicolas Hérault (CED, IFReDE/GRES, Université Montesquieu-Bordeaux IV)
    Abstract: This paper describes the project of building a micro-macro model for South Africa. The aim is to deal with the links between globalisation and poverty or inequality, explaining the effects of trade liberalisation on both. The main issue of interest is the effect of international trade on households (especially their income): some changes may contribute to reducing poverty, while other changes could work against the poor. The approach presented in this paper relies on combining a macro-oriented CGE model with a microsimulation model. By combining these two models, the microeconomic effects (on poverty and inequality) of a macroeconomic policy (trade liberalisation) can be analysed. The paper gives details about the microsimulation model and the "top-down" approach used to link the microsimulation model and the CGE model. In addition, the methodology discussed is applied to South African data, and a selection of preliminary results using this approach is presented and discussed. The main concern regarding poor households is whether the decrease in real (or nominal) earnings for formal low-skilled and skilled workers is offset by the upward trend in formal employment levels. This appears to be the case, implying a decrease in poverty due to trade liberalisation. Although whites emerge as the main winners, the increase in inter-group inequality is more than offset by the decrease in intra-group inequality.
    JEL: C68 E17 O55
    Date: 2005–09
    URL: http://d.repec.org/n?u=RePEc:mon:ceddtr:114&r=cmp
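The "top-down" linkage in the abstract can be sketched in a few lines: the CGE model produces aggregate wage and employment changes, which are passed down to individual household records, and poverty is then re-measured on the adjusted micro data. The household records, the macro shocks, and the entry wage below are entirely invented for illustration; they are not Hérault's South African data or results.

```python
# Stylized micro data: (skill, wage, employed) per household head.
households = [
    ("low", 80.0, True), ("low", 60.0, False), ("high", 200.0, True),
    ("low", 90.0, True), ("high", 150.0, True), ("low", 0.0, False),
]
macro = {"wage_change": {"low": -0.05, "high": 0.02},  # "from the CGE model"
         "new_jobs_low": 1}                            # formal employment expands
POVERTY_LINE = 75.0

def poverty_rate(hh):
    income = [w if e else 0.0 for _, w, e in hh]
    return sum(1 for y in income if y < POVERTY_LINE) / len(income)

def apply_macro(hh, macro):
    """Pass CGE wage and employment changes down to the household records."""
    out = [(s, w * (1 + macro["wage_change"][s]), e) for s, w, e in hh]
    jobs = macro["new_jobs_low"]
    res = []
    for s, w, e in out:
        if jobs > 0 and s == "low" and not e:
            res.append((s, 80.0, True))   # entry wage for new hires: an assumption
            jobs -= 1
        else:
            res.append((s, w, e))
    return res

before = poverty_rate(households)
after = poverty_rate(apply_macro(households, macro))
print(before, after)
```

In this toy example the wage cut for low-skilled workers is outweighed by the employment gain, so measured poverty falls, mirroring the qualitative mechanism the abstract describes.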
  6. By: Claudio Castelo Branco Puty
    Abstract: We present a simulation exercise of classical free competition in which individual capitals, attracted by prospective rates of profit, move across industries in their Moore neighborhoods. Capitals, in general, never settle down to a fully equalized general rate of profit, and the most common characteristic of the series of cross-sectional average rates of profit is a never-ending gravitation around the average rate of profit determined by the number of capitals in the lattice economy. The statistical properties that emerge from the interaction of our agents resemble stable distributions characterized by skewness and heavy tails.
    JEL: C15 D58 E11
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:anp:en2005:006&r=cmp
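The mechanism in the abstract can be sketched as a cellular automaton: each lattice cell holds a capital assigned to an industry, each industry's profit rate is competed down as capital crowds into it, and each capital switches to whichever industry visible in its Moore neighborhood (the 8 surrounding cells) offers the best rate. The lattice size, industries, rates, and adjustment rule below are illustrative assumptions, not Puty's specification.

```python
import random

random.seed(1)
SIZE, N_IND = 12, 4
grid = [[random.randrange(N_IND) for _ in range(SIZE)] for _ in range(SIZE)]
base = [0.10, 0.12, 0.14, 0.16]      # industry base profit rates

def profit_rates(grid):
    """Rate falls as capital crowds into an industry (entry competes it away)."""
    counts = [0] * N_IND
    for row in grid:
        for k in row:
            counts[k] += 1
    total = SIZE * SIZE
    return [base[i] - 0.20 * counts[i] / total for i in range(N_IND)]

def step(grid):
    rates = profit_rates(grid)
    new = [row[:] for row in grid]
    for i in range(SIZE):
        for j in range(SIZE):
            # Moore neighborhood: the 8 surrounding cells (torus wrap-around).
            nbrs = [grid[(i + di) % SIZE][(j + dj) % SIZE]
                    for di in (-1, 0, 1) for dj in (-1, 0, 1)
                    if (di, dj) != (0, 0)]
            # Move to the visible industry with the best prospective rate.
            new[i][j] = max(set(nbrs) | {grid[i][j]}, key=lambda k: rates[k])
    return new

avg_rates = []
for _ in range(50):
    grid = step(grid)
    rates = profit_rates(grid)
    counts = [sum(row.count(k) for row in grid) for k in range(N_IND)]
    avg_rates.append(sum(rates[k] * counts[k] for k in range(N_IND)) / (SIZE * SIZE))

print(min(avg_rates), max(avg_rates))
```

Because entry into a high-rate industry lowers that rate while exit raises the rates left behind, the cross-sectional average rate tends to gravitate rather than lock onto a single equalized value, though this toy version can still freeze if an industry disappears from the lattice entirely.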
  7. By: José Luís Oreiro; Breno Pascualote Lemos
    Abstract: The objective of this article is to present the structure and the first computational simulations of a one-sector macrodynamic model that embeds some elements of the post-Keynesian theoretical framework. The theoretical elements embedded in the model are: (i) determination of the level of output by the principle of effective demand; (ii) differentiated saving propensities of capitalists and workers; (iii) mark-up pricing; (iv) investment decisions based on Minsky's two-price theory; (v) the importance of firms' capital structure for the level of aggregate investment; (vi) inflation based on distributive conflict between capitalists and workers; (vii) endogenous money; and (viii) endogenous technical progress. The computational simulations of the model reproduce some important features of capitalist dynamics, such as "cyclical growth" - i.e., irregular but bounded fluctuations of the growth rate of real GDP - and the occurrence of a single Great Depression over the entire simulation period, which resembles the "rare" nature of great crises in the history of capitalism. The simulations also show that a large reduction of the inflation rate over a short period will be accompanied by great financial fragility of productive firms, which, sooner or later, will generate a great depression. A corollary of these results is that the Central Bank should conduct monetary policy so as to avoid a very rapid reduction of the inflation rate.
    JEL: O41 O11 E12
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:anp:en2005:056&r=cmp
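A drastically simplified loop can show how two of the listed building blocks - mark-up pricing and class-differentiated saving propensities - pin down the accumulation dynamics. All parameter values and functional forms below are illustrative assumptions, and the financial-fragility and distributive-conflict channels that generate cycles and depressions in the authors' model are omitted, so this toy version simply converges to its demand ceiling.

```python
s_w, s_p = 0.1, 0.6        # workers' and capitalists' saving propensities
markup = 0.5               # price = (1 + markup) * unit labor cost
alpha, beta = 0.05, 0.8    # investment: animal spirits + profit-rate response

K, g_hist = 100.0, []
g = 0.03                   # initial accumulation (growth) rate
for t in range(100):
    profit_share = markup / (1 + markup)             # implied by mark-up pricing
    saving_rate = s_p * profit_share + s_w * (1 - profit_share)
    u = g / saving_rate                              # utilization adjusts so S = I
    r = profit_share * u                             # profit rate
    g_next = alpha + beta * r                        # desired accumulation
    g = 0.5 * g + 0.5 * min(g_next, 0.2)             # sluggish adjustment, capped
    K *= 1 + g                                       # capital stock accumulates
    g_hist.append(g)

print(round(g_hist[-1], 4))
```

The cap on `g_next` stands in for a capacity/demand ceiling; in the full model it is the endogenous deterioration of firms' capital structure, not a hard cap, that eventually turns the boom around.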

This nep-cmp issue is ©2005 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.