nep-cmp New Economics Papers
on Computational Economics
Issue of 2008‒11‒11
seven papers chosen by
Stan Miles
Thompson Rivers University

  1. An integrated model for warehouse and inventory planning By STRACK, Geraldine; POCHET, Yves
  2. Towards an understanding of tradeoffs between regional wealth, tightness of a common environmental constraint and the sharing rules By BOUCEKKINE, Raouf; Krawczyk, Jacek B.; VALLÉE, Thomas
  3. Neural Networks and their application in the field of corporate finance By Eric Severin
  4. Exploration in stochastic algorithms: An application on MAX-MIN Ant System By Paola Pellegrini; Elena Moretti; Daniela Favaretto
  5. RBCs and DSGEs: The Computational Approach to Business Cycle Theory and Evidence By Özer Karagedikli; Troy Matheson; Christie Smith; Shaun Vahey
  6. Grid-enabled estimation of structural economic models By Zhorin, Victor; Stef-Praun, Tiberiu
  7. Sensitivity Analysis in Economic Simulations: A Systematic Approach By Hermeling, Claudia; Mennel, Tim

  1. By: STRACK, Geraldine (Université catholique de Louvain (UCL). Center for Operations Research and Econometrics (CORE)); POCHET, Yves (Université catholique de Louvain (UCL). Center for Operations Research and Econometrics (CORE))
    Abstract: We propose a tactical model that integrates the replenishment decision in inventory management, the allocation of products to warehousing systems, and the assignment of products to storage locations in warehouse management. The purpose of this article is to analyse the value of integrating warehouse and inventory decisions. This is achieved by proposing two methods for solving this tactical integrated model, which differ in their level of integration of the inventory and warehousing decisions. A computational analysis is performed on a real-world database, using multiple scenarios that differ in their warehouse capacity limits. We observe that the total cost of the inventory and warehousing systems can be reduced drastically by taking the warehouse capacity restrictions into account, in an aggregate way, in the inventory planning decisions. Moreover, additional inventory and warehouse savings can be achieved by using more sophisticated integration methods for inventory and warehousing decisions.
    Keywords: multi-item inventory model, tactical warehouse model, integrated model, Lagrangian relaxation.
    Date: 2008–02
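The Lagrangian relaxation the authors mention can be illustrated on a deliberately simplified problem. The sketch below is not the authors' model: it assumes hypothetical EOQ-style items coupled only by a single warehouse-space constraint, dualizes that constraint, and searches for the multiplier by bisection, exploiting the fact that space usage is decreasing in the multiplier.

```python
import math

def solve_items(items, lam):
    # Each item's relaxed subproblem has an EOQ closed form:
    # min_q K*d/q + (h + lam*s)*q/2  =>  q* = sqrt(2*K*d / (h + lam*s))
    return [math.sqrt(2 * K * d / (h + lam * s)) for (K, d, h, s) in items]

def space_used(items, q):
    return sum(s * qi for (_, _, _, s), qi in zip(items, q))

def lagrangian_capacity(items, capacity, hi=1e6, tol=1e-9):
    """Dual search for the capacity multiplier by bisection.

    items: list of (setup cost K, demand d, holding cost h, space per unit s).
    Space usage is decreasing in lam, so the multiplier that makes the
    coupling constraint tight can be bracketed and bisected.
    """
    q0 = solve_items(items, 0.0)
    if space_used(items, q0) <= capacity:
        return 0.0, q0          # constraint not binding
    lo = 0.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if space_used(items, solve_items(items, mid)) > capacity:
            lo = mid
        else:
            hi = mid
    return hi, solve_items(items, hi)

# Two hypothetical items sharing 250 units of warehouse space.
lam, q = lagrangian_capacity([(100, 500, 2.0, 1.0), (80, 300, 1.5, 2.0)],
                             capacity=250)
```

At the optimal multiplier the order quantities exactly exhaust the shared capacity, which is the aggregate coupling between inventory and warehouse decisions the abstract refers to.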
  2. By: BOUCEKKINE, Raouf (Université catholique de Louvain (UCL). Center for Operations Research and Econometrics (CORE)); Krawczyk, Jacek B.; VALLÉE, Thomas
    Abstract: Consider a country with two regions that have developed differently, so that their current levels of energy efficiency differ. Each region's production involves the emission of pollutants, on which a regulator might impose restrictions. The restrictions can be related to pollution standards that the regulator perceives as binding for the whole country (e.g., enforced by international agreements like the Kyoto Protocol). We observe that the pollution standards define a common constraint on the joint strategy space of the regions. We propose a game-theoretic model with a coupled constraints equilibrium as a solution to the regulator's problem of avoiding excessive pollution. The regulator can direct the regions to implement the solution through political pressure, or compel them to employ it by using the coupled constraints' Lagrange multipliers as taxation coefficients. We specify a stylised model of the Belgian regions of Flanders and Wallonia that possesses these characteristics. We analyse, analytically and numerically, the equilibrium regional production levels as a function of the pollution standards and of the sharing rules for the satisfaction of the constraint. For the computational results, we use NIRA, a piece of software designed to min-maximise the associated Nikaido-Isoda function.
    Keywords: coupled constraints, generalised Nash equilibrium, Nikaido-Isoda function, regional economics, environmental regulations.
    JEL: C6 C7 D7
    Date: 2008–10
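The Nikaido-Isoda machinery can be sketched on a stylised two-region quadratic game (all parameters are hypothetical, and this is not NIRA itself). With quadratic payoffs, the maximizer of the Nikaido-Isoda function reduces to constrained best responses Z(x), and a relaxation iteration x ← (1−α)x + αZ(x) drives play toward a coupled-constraints equilibrium of the shared emission cap.

```python
def best_response(i, x, p, c, e, E):
    # Region i maximizes p*y - c[i]*y**2 over the coupled feasible set
    # {y >= 0 : e[i]*y + e[1-i]*x[1-i] <= E}, holding the rival at x.
    cap = max(0.0, (E - e[1 - i] * x[1 - i]) / e[i])
    return min(p / (2 * c[i]), cap)

def relaxation(p=10.0, c=(1.0, 2.0), e=(1.0, 1.0), E=4.0,
               alpha=0.5, iters=200):
    """Relaxation-algorithm sketch: repeatedly average the current joint
    strategy with Z(x), the collection of constrained best responses
    (the maximizer of the Nikaido-Isoda function at x)."""
    x = [0.0, 0.0]
    for _ in range(iters):
        z = [best_response(0, x, p, c, e, E),
             best_response(1, x, p, c, e, E)]
        x = [(1 - alpha) * xi + alpha * zi for xi, zi in zip(x, z)]
    return x

x = relaxation()   # joint production levels satisfying the shared cap
```

At the fixed point, no region can gain by deviating within the shared constraint set, which is the coupled constraints equilibrium the abstract describes.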
  3. By: Eric Severin (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, SAMOS - Statistique Appliquée et MOdélisation Stochastique - Université Panthéon-Sorbonne - Paris I, CIS - Lab of Computer and Information Science - Helsinki University of Technology)
    Abstract: This article deals with the usefulness of neural networks in the area of corporate finance. Firstly, we highlight the initial applications of neural networks, of which two main types can be distinguished: multilayer networks and self-organizing maps. As Altman et al. (1994) underlined, the use of multilayer networks has improved the reclassification rate in bankruptcy forecasting models. These first applications improved bankruptcy forecasting by showing a relationship between capital structure and corporate performance. The results highlighted in our second part show the pertinence of applying the Kohonen algorithm to qualitative variables (KACM). More particularly, in line with Altman (1968, 1984), one can suggest the coexistence of negative and positive effects of financial structure on performance. This result allows us to question scoring models and to conclude that the relationship is non-linear. In a larger framework, the Kohonen methodology has allowed a better understanding of the factors able to explain leasing finance (Cottrell et al., 1996). The objective here is to explain the factors behind the choice between leasing and bank loans. Using different variables, we highlight the characteristics of the firms that most often use leasing. Corporate financing policy could be explained by the cost of the financing, the advantages of leasing, or the minimization of agency costs in leasing. Finally, we highlight a relationship between resorting to leasing and credit rationing.
    Keywords: neural networks, SOM, corporate finance
    Date: 2008
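A minimal sketch of the Kohonen self-organizing map underlying these applications (the toy "firm" data and map size are hypothetical): each sample pulls its best matching unit, and that unit's grid neighbours, toward itself, with the learning rate and neighbourhood radius decaying over training.

```python
import math, random

def train_som(data, n_units=4, epochs=50, lr0=0.5, sigma0=1.5, seed=0):
    """Train a 1-D Kohonen map: find each sample's best matching unit
    (BMU) and move the BMU and its grid neighbours toward the sample."""
    rng = random.Random(seed)
    dim = len(data[0])
    w = [[rng.random() for _ in range(dim)] for _ in range(n_units)]
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                   # decaying learning rate
        sigma = max(0.5, sigma0 * (1 - t / epochs))   # shrinking neighbourhood
        for x in data:
            bmu = min(range(n_units),
                      key=lambda u: sum((w[u][j] - x[j]) ** 2
                                        for j in range(dim)))
            for u in range(n_units):
                h = math.exp(-((u - bmu) ** 2) / (2 * sigma ** 2))
                for j in range(dim):
                    w[u][j] += lr * h * (x[j] - w[u][j])
    return w

# Two artificial groups of "firms": low vs high (leverage, performance).
data = [(0.1, 0.2), (0.15, 0.1), (0.9, 0.8), (0.85, 0.95)]
weights = train_som(data)
```

After training, dissimilar observations map to distant units, which is what makes the map usable for visualising groups of firms.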
  4. By: Paola Pellegrini (Department of Applied Mathematics, University of Venice); Elena Moretti (Department of Applied Mathematics, University of Venice); Daniela Favaretto (Department of Applied Mathematics, University of Venice)
    Abstract: In this paper a definition of the exploration performed by stochastic algorithms is proposed. It is based on observing, through cluster analysis, the solutions generated during a run; the probabilities that an algorithm associates with solution components are considered. A method for quantifying the exploration is then provided. This measurement is applied to MAX-MIN Ant System. The results of the experimental analysis allow us to observe the impact of the algorithm's parameters on the exploration.
    Keywords: exploration, cluster analysis, MAX-MIN Ant System
    JEL: C61
    Date: 2008–10
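The idea of measuring exploration by clustering the solutions visited during a run can be sketched as follows (the greedy leader-clustering rule and the normalisation are illustrative assumptions, not the authors' exact method): the more distinct regions of the search space a run touches, the higher the score.

```python
def hamming(a, b):
    # Distance between two binary solution vectors.
    return sum(x != y for x, y in zip(a, b))

def count_clusters(solutions, radius):
    """Greedy leader clustering: a solution starts a new cluster unless
    it lies within `radius` (Hamming distance) of an existing leader."""
    leaders = []
    for s in solutions:
        if not any(hamming(s, leader) <= radius for leader in leaders):
            leaders.append(s)
    return len(leaders)

def exploration(solutions, radius=1):
    # Illustrative exploration index: distinct regions visited,
    # normalised by the number of solutions generated.
    return count_clusters(solutions, radius) / len(solutions)

# Solutions recorded during a hypothetical run of a stochastic algorithm.
run = [(0, 0, 1, 1), (0, 0, 1, 0), (1, 1, 0, 0), (1, 1, 0, 1), (0, 1, 1, 0)]
score = exploration(run)
```

A run that keeps resampling one neighbourhood yields a score near 1/len(run), while a run that scatters across the space approaches 1, so the index can be compared across parameter settings as the experimental analysis does.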
  5. By: Özer Karagedikli (Bank of England); Troy Matheson (Reserve Bank of New Zealand); Christie Smith (Norges Bank (Central Bank of Norway)); Shaun Vahey (Melbourne Business School, Norges Bank (Central Bank of Norway) , and Reserve Bank of New Zealand)
    Abstract: Real Business Cycle (RBC) and Dynamic Stochastic General Equilibrium (DSGE) methods have become essential components of the macroeconomist’s toolkit. This literature review stresses recently developed techniques for computation and inference, providing a supplement to the Romer (2006) textbook, which stresses theoretical issues. Many computational aspects are illustrated with reference to the simple divisible labour RBC model. Code and US data to replicate the computations are provided on the Internet, together with a number of appendices providing background details.
    Keywords: RBC, DSGE, Computation, Bayesian Analysis, Simulation
    JEL: C11 C50 E30
    Date: 2008–10–24
  6. By: Zhorin, Victor; Stef-Praun, Tiberiu
    Abstract: In this paper we present our experiences with the execution of structural economic models over the Grid using “cloud computing”. We describe cases of distributed implementation and execution of occupational choice and financial deepening models of economic growth. We show how the application of Grid technology and resources naturally fits the study of economic systems, by allowing us to capture the effects of computationally challenging real-world characteristics, such as heterogeneity of wealth, talent and access costs among economic agents.
    Keywords: occupational choice; financial deepening; economic growth; cloud computing
    JEL: G11 C63 D58
    Date: 2008–11–04
  7. By: Hermeling, Claudia; Mennel, Tim
    Abstract: Sensitivity analysis studies how the variation in the numerical output of a model can be quantitatively apportioned to different sources of variation in basic input parameters. It thus serves to examine the robustness of numerical results with respect to input parameters, which is a prerequisite for deriving economic conclusions from them. In practice, modellers apply different methods, often chosen ad hoc, to carry out sensitivity analysis. This paper pursues a systematic approach. It formalizes the deterministic and stochastic methods used for sensitivity analysis and presents the numerical algorithms to apply them, in particular an improved version of a Gauss-Quadrature algorithm applicable to one-dimensional as well as multi-dimensional sensitivity analysis. The advantages and disadvantages of the different methods and algorithms are discussed, as well as their applicability.
    Keywords: Sensitivity Analysis, Computational Methods
    JEL: C15 C63 D50
    Date: 2008
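Gauss-type quadrature for stochastic sensitivity analysis can be sketched as follows (the plain 5-point rule and the toy welfare function are illustrative; the paper's improved algorithm is not reproduced here). With a normally distributed input parameter, the expected model output is a weighted sum of model evaluations at a handful of Gauss-Hermite nodes, instead of thousands of Monte Carlo draws.

```python
import math

# 5-point Gauss-Hermite nodes and weights (weight function e^{-x^2}).
GH_NODES = [-2.02018287045609, -0.958572464613819, 0.0,
            0.958572464613819, 2.02018287045609]
GH_WEIGHTS = [0.0199532420590459, 0.393619323152241, 0.945308720482942,
              0.393619323152241, 0.0199532420590459]

def expected_output(model, mu, sigma):
    """Mean model output when the input parameter is N(mu, sigma^2),
    via the change of variables x = mu + sqrt(2)*sigma*t."""
    total = sum(w * model(mu + math.sqrt(2) * sigma * t)
                for t, w in zip(GH_NODES, GH_WEIGHTS))
    return total / math.sqrt(math.pi)

# Hypothetical "model": welfare as a quadratic function of a tax parameter.
welfare = lambda tau: 100 - 5 * (tau - 0.3) ** 2
mean_welfare = expected_output(welfare, mu=0.3, sigma=0.1)
```

Because the rule is exact for polynomials up to degree 9, the quadratic example is integrated exactly: the five model evaluations recover 100 − 5σ² = 99.95. The same loop, run over a grid of (mu, sigma) assumptions, yields the kind of systematic robustness check the abstract advocates.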

This nep-cmp issue is ©2008 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject line; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.