nep-cmp New Economics Papers
on Computational Economics
Issue of 2009‒04‒05
eighteen papers chosen by
Stan Miles
Thompson Rivers University

  1. Resistance to learning and the evolution of cooperation By Raul Jimenez; Haydee Lugo; Maxi San Miguel
  2. Dynamic Regimes of a Multi-agent Stock Market Model By Yu, Tongkui; Li, Honggang
  3. An Artificial Immune System for the Multi-Mode Resource-Constrained Project Scheduling Problem By V. VAN PETEGHEM; M. VANHOUCKE
  4. Who Drives the Market? Estimating a Heterogeneous Agent-based Financial Market Model Using a Neural Network Approach By Klein, A.; Urbig, D.; Kirn, S.
  5. A Model for the Global Crude Oil Market Using a Multi-Pool MCP Approach By Daniel Huppmann; Franziska Holz
  6. Numerical Simulation of Nonoptimal Dynamic Equilibrium Models By Zhigang Feng; Jianjun Miao; Adrian Peralta-Alva; Manuel Santos
  7. Generalized power method for sparse principal component analysis By Journée, Michel; Nesterov, Yurii; Richtarik, Peter; Sepulchre, Rodolphe
  8. Program Equilibria and Discounted Computation Time By Lance Fortnow
  9. Initial infrastructure development strategies for the transition to sustainable mobility By Floris J. Huétink; Alexander van der Vooren; Floortje Alkemade
  10. Portfolio Management: An investigation of the implications of measurement errors in stock prices on the creation, management and evaluation of stock portfolios, using stochastic simulations By Dikaios Tserkezos; Eleni Thanou
  11. The Distributional Effects of Tax-benefit Policies under New Labour: A Shapley Decomposition By Bargain O
  12. Barrier subgradient method By NESTEROV, Y.
  13. Approximate level method By Richtarik, Peter
  14. The (mis)specification of discrete duration models with unobserved heterogeneity: a Monte Carlo study By Concetta Rondinelli; Cheti Nicoletti
  15. Monetary policy rules with financial instability By Sofia Bauducco; Ales Bulir; Martin Cihak
  16. Discrete-continuous analysis of optimal equipment replacement By YATSENKO, Yuri; HRITONENKO, Natali
  17. The impact of the unilateral EU commitment on the stability of international climate agreements By BRECHET, Thierry; EYCKMANS, Johan; GERARD, François; MARBAIX, Philippe
  18. Uncertainty of Multiple Period Risk Measures By Lönnbark, Carl

  1. By: Raul Jimenez; Haydee Lugo; Maxi San Miguel
    Abstract: In many evolutionary algorithms, crossover is the main operator used in generating new individuals from old ones. However, the usual mechanism for generating offspring in spatially structured evolutionary games has to date been cloning. Here we study the effect of incorporating crossover into these models. Our framework is the spatial Continuous Prisoner's Dilemma. For this evolutionary game, it has been reported that occasional errors (mutations) in the cloning process can explain the emergence of cooperation from a non-cooperative initial state. First, we show that this only occurs for particular regimes of low costs of cooperation. Then, we show how crossover widens the range of scenarios in which cooperative mutants can invade selfish populations. In a social context, where crossover embodies a general rule of gradual learning, our results show that the less that is learnt in a single step, the larger the degree of global cooperation finally attained. In general, step-by-step learning can be more effective for the evolution of cooperation than learning in one full step.
    Keywords: Evolutionary games, Continuous prisoner's dilemma, Spatially structured, Crossover, Learning
    Date: 2009–02
    URL: http://d.repec.org/n?u=RePEc:cte:werepe:we092012&r=cmp
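The gradual-learning update the abstract describes, blending a player's continuous cooperation level with that of its best-performing neighbour instead of copying it outright, can be sketched in a few lines. Everything here (grid size, the payoff function, the crossover weight alpha) is an illustrative assumption, not the authors' specification:

```python
def payoff(my_c, neighbour_cs, b=2.0, cost=1.0):
    # Continuous PD: benefit from neighbours' cooperation minus own cost
    return b * sum(neighbour_cs) / len(neighbour_cs) - cost * my_c

def step(grid, n, alpha):
    # One synchronous update on an n-by-n torus; alpha in (0, 1] is the
    # crossover weight: small alpha means little is learnt per step.
    offsets = ((1, 0), (-1, 0), (0, 1), (0, -1))
    def score(x, y):
        ncs = [grid[(x + di) % n][(y + dj) % n] for di, dj in offsets]
        return payoff(grid[x][y], ncs)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            cells = [((i + di) % n, (j + dj) % n) for di, dj in offsets] + [(i, j)]
            bx, by = max(cells, key=lambda p: score(*p))
            # crossover: move only a fraction alpha toward the best strategy seen
            new[i][j] = (1 - alpha) * grid[i][j] + alpha * grid[bx][by]
    return new
```

Cloning corresponds to alpha = 1; the abstract's claim is that smaller alpha (more gradual learning) sustains more cooperation.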
  2. By: Yu, Tongkui; Li, Honggang
    Abstract: This paper presents a stochastic multi-agent model of a stock market. The market dynamics include switches between chartists and fundamentalists and switches in the prevailing opinion (optimistic or pessimistic) among chartists. A nonlinear dynamical system is derived to depict the underlying mechanisms of market evolution. Under different settings of the parameters representing traders' mimetic contagion propensity, price-chasing propensity and strategy-switching propensity, the system exhibits four kinds of dynamic regimes: fundamental equilibrium, non-fundamental equilibrium, periodicity and chaos.
    Keywords: multi-agent stock market model, market dynamic regime, bifurcation analysis
    JEL: G12 C62
    Date: 2008–11
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:14339&r=cmp
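A minimal sketch of such a two-type market map, with an illustrative switching rule and parameter values that are not those of Yu and Li, shows how the chartist/fundamentalist mix feeds back into the price:

```python
import math

def simulate(T=200, v=100.0, beta=0.05, chi=1.2, phi=0.8, p0=90.0):
    # Chartists chase the recent price trend; fundamentalists push the price
    # toward the fundamental value v; traders switch toward whichever group
    # currently generates the stronger signal.
    p, trend, path = p0, 0.0, []
    for _ in range(T):
        d_f = phi * (v - p)   # fundamentalist excess demand
        d_c = chi * trend     # chartist excess demand
        n_c = 1.0 / (1.0 + math.exp(abs(d_f) - abs(d_c)))  # chartist fraction
        new_p = p + beta * (n_c * d_c + (1.0 - n_c) * d_f)
        trend, p = new_p - p, new_p
        path.append(p)
    return path
```

With these values the price settles at the fundamental equilibrium; strengthening the trend-chasing coefficient chi while weakening phi destabilizes that fixed point, which is the kind of regime change that bifurcation analysis classifies.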
  3. By: V. VAN PETEGHEM; M. VANHOUCKE
    Abstract: In this paper, an Artificial Immune System (AIS) for the multi-mode resource-constrained project scheduling problem (MRCPSP), in which multiple execution modes are available for each activity of the project, is presented. The AIS algorithm makes use of mechanisms inspired by the vertebrate immune system, applied to an initial population set. This population set is generated with a controlled search method, based on experimental results that revealed a link between predefined profit values of a mode assignment and its makespan. The impact of the algorithmic parameters and of the initial population generation method is examined, and detailed comparative computational results for the MRCPSP are presented.
    Date: 2009–01
    URL: http://d.repec.org/n?u=RePEc:rug:rugwps:09/555&r=cmp
  4. By: Klein, A.; Urbig, D.; Kirn, S.
    Abstract: Introduction. The objects of investigation of this work are micro-level behaviors in stock markets. We aim at a better understanding of which strategies of market participants drive stock markets. The problem is that micro-level data from real stock markets are largely unobservable. We take an estimation perspective to obtain daily time series of the fractions of chartists and fundamentalists among market participants. We fit the heterogeneous agent-based financial market model introduced by Lux and Marchesi [1] to the S&P 500. This model has more realistic time series properties than less complex econometric and other agent-based models. Such models have a rather complex dependency between micro and macro parameters that has to be mapped to empirical data by the estimation method. This poses heavy computational burdens. Our contribution to this field is a new method for indirectly estimating time-varying micro-parameters of highly complex agent-based models at high frequency. Related work. Due to the high complexity, few authors have published on this topic to date (e.g., [2], [3], and [4]). Recent approaches to directly estimating agent-based models are restricted to simpler models, make simplifying assumptions in the estimation procedure, estimate only non-time-varying parameters, or estimate only low-frequency time series. Approach and computational methods. The indirect estimation method we propose is based on estimating the inverse model of a rich agent-based model that derives realistic macro market behavior from heterogeneous market participants' behaviors. Applying the inverse model, which maps macro parameters back to micro parameters, to widely available macro-level financial market data allows for estimating time series of aggregated real-world micro-level strategy data at daily frequency. 
To estimate the inverse model in the first place, a neural network approach is used, as it allows a large degree of freedom concerning the structure of the mapping to be represented by the neural network. As a basis for learning the mapping, micro and macro time series of the market model are generated artificially using a multi-agent simulation based on RePast [5]. After applying several pre-processing and smoothing methods to these time series, including the Hodrick-Prescott filter [6], a feed-forward multilayer perceptron is trained using a variant of the Levenberg-Marquardt algorithm combined with Bayesian regularization [7]. Finally, the trained network is applied to the S&P 500 to estimate daily time series of the fractions of strategies used by market participants. Results. The main contribution of this work is a model-free indirect estimation approach. It allows estimating micro-parameter time series of the underlying agent-based model of high complexity at high frequency. No simplifying assumptions concerning the model or the estimation process have to be made. Our results also contribute to the understanding of theoretical models. By investigating fundamental dependencies in the Lux and Marchesi model by means of a sensitivity analysis of the resulting neural network inverse model, price volatility is found to be the key driver. This provides additional support for findings in [1]. Some face validity of concrete estimation results obtained from the S&P 500 is shown by comparison to the results of Boswijk et al. [3], the work closest to our approach, although their model is simpler and their estimation frequency is yearly. We find support for Boswijk et al.'s key finding of a large fraction of chartists during the late-1990s price bubble in technology stocks. Ultimately, our work contributes to understanding what kinds of micro-level behaviors drive stock markets. 
Analyzing correlations of our estimation results with historic market events, we find the fraction of chartists to be large at times of crises, crashes, and bubbles.
    Keywords: stock market; heterogeneous agent-based models; indirect estimation; inverse model; trading strategies; chartists; fundamentalists; neural networks
    JEL: C32 G12 C45 C81 C15
    Date: 2008–06–24
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:14382&r=cmp
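The indirect, inverse-model idea, simulating (micro, macro) pairs, learning the map from macro observables back to micro parameters, then applying it to real data, can be illustrated with a toy one-dimensional version. The simulator, the linear volatility relation and all numbers below are invented for illustration, and plain least squares stands in for the paper's neural network:

```python
import random

def simulate_macro(frac_chartists, seed=None):
    # Toy stand-in for the agent-based simulator: observed volatility rises
    # with the (unobservable) fraction of chartists, plus noise.
    rng = random.Random(seed)
    return 0.5 + 2.0 * frac_chartists + rng.gauss(0.0, 0.05)

def fit_inverse(n=500, seed=3):
    # Learn the inverse map macro (volatility) -> micro (chartist fraction)
    # from simulated pairs; ordinary least squares suffices here because the
    # toy relation is linear.
    rng = random.Random(seed)
    fracs = [rng.random() for _ in range(n)]
    vols = [simulate_macro(f, seed=rng.random()) for f in fracs]
    mv, mf = sum(vols) / n, sum(fracs) / n
    slope = (sum((v - mv) * (f - mf) for v, f in zip(vols, fracs))
             / sum((v - mv) ** 2 for v in vols))
    return mf - slope * mv, slope  # intercept and slope of frac ~ vol
```

On real data the macro-to-micro map is nonlinear and high-dimensional, which is why the authors train a multilayer perceptron rather than fitting a line.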
  5. By: Daniel Huppmann; Franziska Holz
    Abstract: This paper proposes a partial equilibrium model to describe the global crude oil market. Pricing on the global crude oil market is strongly influenced by price indices such as WTI (USA) and Brent (Northwest Europe). Adapting an approach for pool-based electricity markets, the model captures the particularities of these benchmark price indices and their influence on the market for physical oil. This approach is compared to a model with bilateral trade relations, as traditionally used in models of energy markets. With these two model approaches, we compute the equilibrium solutions for several market power scenarios to investigate whether the multi-pool approach may be better suited than the bilateral trade model to describe the crude oil market. The pool-based approach yields, in general, results closer to observed quantities and prices, with the best fit obtained by the scenario of an OPEC oligopoly. We conclude that the price indices are indeed important in determining prices and flows on the global crude oil market, and that OPEC effectively exerts market power, but in a non-cooperative way.
    Keywords: crude oil, market structure, cartel, pool market, simulation model
    JEL: L13 L71 Q41
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:diw:diwwpp:dp869&r=cmp
  6. By: Zhigang Feng (Department of Economics, University of Miami); Jianjun Miao (Department of Economics, Boston University); Adrian Peralta-Alva (Research Division, Federal Reserve Bank of Saint Louis); Manuel Santos (Department of Economics, University of Miami)
    Abstract: In this paper we present a recursive method for the computation of dynamic competitive equilibria in models with heterogeneous agents and market frictions. This method is based upon a convergent operator over an expanded set of state variables. The fixed point of this operator defines the set of all Markovian equilibria. We study approximation properties of the operator as well as the convergence of the moments of simulated sample paths. We apply our numerical algorithm to two growth models, an overlapping generations economy with money, and an asset pricing model with financial frictions.
    Keywords: Heterogeneous agents, taxes, externalities, financial frictions, competitive equilibrium, computation, simulation
    JEL: C6 D5 E2
    Date: 2009–02–28
    URL: http://d.repec.org/n?u=RePEc:mia:wpaper:0912&r=cmp
  7. By: Journée, Michel (---); Nesterov, Yurii (Université catholique de Louvain (UCL). Center for Operations Research and Econometrics (CORE)); Richtarik, Peter (Université catholique de Louvain (UCL). Center for Operations Research and Econometrics (CORE)); Sepulchre, Rodolphe
    Abstract: In this paper we develop a new approach to sparse principal component analysis (sparse PCA). We propose two single-unit and two block optimization formulations of the sparse PCA problem, aimed at extracting a single sparse dominant principal component of a data matrix, or more components at once, respectively. While the initial formulations involve nonconvex functions, and are therefore computationally intractable, we rewrite them into the form of an optimization program involving maximization of a convex function on a compact set. The dimension of the search space is decreased enormously if the data matrix has many more columns (variables) than rows. We then propose and analyze a simple gradient method suited for the task. It appears that our algorithm has the best convergence properties when either the objective function or the feasible set is strongly convex, which is the case with our single-unit formulations and can be enforced in the block case. Finally, we demonstrate numerically on a set of random and gene expression test problems that our approach outperforms existing algorithms both in quality of the obtained solution and in computational speed.
    Keywords: sparse PCA, power method, gradient ascent, strongly convex sets, block algorithms.
    Date: 2008–11
    URL: http://d.repec.org/n?u=RePEc:cor:louvco:2008070&r=cmp
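The flavour of the method, power iteration plus a thresholding step that zeroes out small loadings, can be sketched as follows. This is a simplified thresholded power iteration, not the paper's exact single-unit or block algorithm; the relative sparsity parameter gamma and the test matrix are assumptions:

```python
import math, random

def soft_threshold(x, g):
    # Shrink x toward zero by g; exact zeros are what create sparsity
    return math.copysign(max(abs(x) - g, 0.0), x)

def sparse_pc(A, gamma=0.5, iters=50, seed=0):
    # Thresholded power iteration on the covariance A^T A for a sparse
    # leading "principal component" z of the m-by-n data matrix A.
    random.seed(seed)
    m, n = len(A), len(A[0])
    z = [random.random() for _ in range(n)]
    for _ in range(iters):
        Az = [sum(A[i][j] * z[j] for j in range(n)) for i in range(m)]
        w = [sum(A[i][j] * Az[i] for i in range(m)) for j in range(n)]
        g = gamma * max(abs(x) for x in w)   # threshold relative to largest entry
        w = [soft_threshold(x, g) for x in w]
        nrm = math.sqrt(sum(x * x for x in w)) or 1.0
        z = [x / nrm for x in w]
    return z
```

With gamma = 0 this reduces to the classical power method for the leading eigenvector of A^T A.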
  8. By: Lance Fortnow
    Abstract: Tennenholtz (GEB 2004) developed Program Equilibrium to model play in a finite two-player game where each player can base their strategy on the other player's strategies. Tennenholtz's model allowed each player to produce a "loop-free" computer program that had access to the code for both players. He showed a folk theorem where any mixed-strategy individually rational play could be an equilibrium payoff in this model even in a one-shot game. Kalai et al. gave a general folk theorem for correlated play in a more generic commitment model. We develop a new model of program equilibrium using general computational models and discounting the payoffs based on the computation time used. We give an even more general folk theorem giving correlated-strategy payoffs down to the pure minimax of each player. We also show equilibrium in other games not covered by the earlier work.
    Keywords: brokers, applied mechanism design, linear commission fees, optimal indirect mechanisms, internet auctions, auction houses.
    JEL: C72 C78 L13
    Date: 2008–11
    URL: http://d.repec.org/n?u=RePEc:nwu:cmsems:1473&r=cmp
  9. By: Floris J. Huétink; Alexander van der Vooren; Floortje Alkemade
    Abstract: Within the Dutch transition policy framework, the transition to hydrogen-based transport is seen as a promising option towards a sustainable transport system. This transition requires the build-up of a hydrogen infrastructure, as a certain level of refuelling infrastructure is necessary before (even the most innovative or environmentally friendly) consumers will substitute a hydrogen vehicle for their conventional car (Dunn 2002). This is often referred to as the chicken-and-egg problem of infrastructure development. However, the build-up of infrastructure is costly and irreversible, and it is therefore important for policymakers to gain insight into the minimally required levels of initial infrastructure that will still set off the transition. In this paper we therefore present a diffusion model for analyzing the effects of different strategies for hydrogen infrastructure development on hydrogen vehicle fleet penetration. Within the simulation model, diffusion patterns for hydrogen vehicles are created through the interactions of consumers, refuelling stations and technological learning. We compare our results to the benchmark patterns derived from the hydrogen roadmap. The strategies for initial infrastructure development differ with respect to the placement (urban or nationwide) and the number of initial refuelling stations. Simulation results indicate that when social learning between consumers is taken into account, diffusion is generally lower than in the benchmark patterns. Furthermore, simulation results indicate that a nationwide deployment strategy generally leads to faster diffusion of hydrogen vehicles than a strategy focused on urban areas. 
These demand-side aspects of the transition to sustainable mobility are considered especially important in the Netherlands because, besides the high cost associated with infrastructure investment, the Netherlands does not have a domestic car industry, so policy measures will most likely focus on infrastructure and consumers. Increased insight into the relationship between infrastructure development strategies and hydrogen vehicle diffusion is thus necessary to further manage the transition to sustainable mobility.
    Keywords: transition management, sustainable mobility
    Date: 2009–03
    URL: http://d.repec.org/n?u=RePEc:uis:wpaper:0905&r=cmp
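The chicken-and-egg feedback between refuelling coverage and vehicle adoption can be caricatured in a few lines. The thresholds, build-out rule and population size are all invented for illustration and are far cruder than the paper's agent-level model:

```python
def diffusion(initial_stations, periods=50, pop=1000, threshold=0.2):
    # Consumers adopt only once refuelling coverage passes a comfort
    # threshold; stations are added only once demand per station is high
    # enough -- the chicken-and-egg loop.
    stations, adopters = initial_stations, 0
    for _ in range(periods):
        coverage = min(stations / 100.0, 1.0)
        willing = int(pop * 0.1) if coverage > threshold else 0
        adopters = min(pop, adopters + willing)
        if stations and adopters / stations > 20:
            stations += 1 + stations // 10  # build-out accelerates with scale
    return adopters
```

Below the coverage threshold adoption never starts, however long the horizon runs; above it, adoption and station build-out reinforce each other, which is why the minimally required initial infrastructure is the key policy quantity.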
  10. By: Dikaios Tserkezos (Department of Economics, University of Crete, Greece); Eleni Thanou (Hellenic Open University)
    Abstract: In this paper, we investigate the implications of measurement errors in the daily published stock prices on the creation and management of efficient portfolios. Using stochastic simulation techniques and the Markowitz Mean Variance approach in the creation of the weights of the various stocks of a portfolio, we conclude that measurement errors have significant implications on the efficiency of the management of a stock portfolio.
    Keywords: Markowitz Mean Variance, Measurement Errors in Returns, Stochastic Simulation.
    Date: 2009–03–26
    URL: http://d.repec.org/n?u=RePEc:crt:wpaper:0904&r=cmp
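The experiment's logic can be sketched for the two-asset minimum-variance case: compute the portfolio weight from observed returns, then repeat with returns perturbed by i.i.d. measurement error and look at how much the weight moves. The closed-form two-asset weight is standard; the return series and noise levels below are illustrative:

```python
import random

def min_var_weight(r1, r2):
    # Weight on asset 1 in the two-asset minimum-variance portfolio:
    # w1 = (var2 - cov) / (var1 + var2 - 2 cov)
    n = len(r1)
    m1, m2 = sum(r1) / n, sum(r2) / n
    v1 = sum((x - m1) ** 2 for x in r1) / (n - 1)
    v2 = sum((x - m2) ** 2 for x in r2) / (n - 1)
    cov = sum((x - m1) * (y - m2) for x, y in zip(r1, r2)) / (n - 1)
    return (v2 - cov) / (v1 + v2 - 2 * cov)

def weight_dispersion(r1, r2, noise_sd, reps=200, seed=9):
    # Re-compute the weight after perturbing observed returns with i.i.d.
    # measurement error; return the spread of the resulting estimates.
    random.seed(seed)
    ws = []
    for _ in range(reps):
        n1 = [x + random.gauss(0.0, noise_sd) for x in r1]
        n2 = [x + random.gauss(0.0, noise_sd) for x in r2]
        ws.append(min_var_weight(n1, n2))
    return max(ws) - min(ws)
```

The dispersion widening with noise_sd is the paper's point: noisier published prices translate directly into less reliable portfolio weights.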
  11. By: Bargain O
    Abstract: Using counterfactual microsimulations, Shapley decompositions of the change over time in inequality and poverty indices make it possible to disentangle and quantify the relative effect of tax-benefit policy changes compared to all other effects, including shifts in the distribution of market income. Using this approach also helps to clarify the different issues underlying the distributional evaluation of policy reforms. An application to the UK (1998-2001) confirms previous findings that inequality and depth of poverty would have increased under the first New Labour government had important reforms, such as the extensions of income support and tax credits, not been implemented. These reforms also contributed substantially to reducing poverty among families with children and pensioners.
    Keywords: Tax-benefit policy; inequality; poverty; Shapley decomposition; microsimulation
    JEL: H23 H53 I32
    Date: 2009–03–23
    URL: http://d.repec.org/n?u=RePEc:ese:emodwp:em2/09&r=cmp
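For two factors (tax-benefit policy and the market-income distribution) the Shapley decomposition has a simple closed form: each factor's contribution is its marginal effect averaged over the two orders in which the factors can be updated from old to new. The index function below is an arbitrary illustrative stand-in for an inequality or poverty index computed by microsimulation:

```python
def shapley_two_factor(index, p0, p1, y0, y1):
    # Two-factor Shapley decomposition of the change in index(policy, income)
    # between (p0, y0) and (p1, y1): average each factor's marginal effect
    # over both update orders.
    policy_effect = 0.5 * ((index(p1, y0) - index(p0, y0)) +
                           (index(p1, y1) - index(p0, y1)))
    income_effect = 0.5 * ((index(p0, y1) - index(p0, y0)) +
                           (index(p1, y1) - index(p1, y0)))
    return policy_effect, income_effect
```

By construction the two contributions sum exactly to the total change in the index, which is what makes the decomposition attractive for policy evaluation.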
  12. By: NESTEROV, Y. (Université catholique de Louvain (UCL). Center for Operations Research and Econometrics (CORE))
    Abstract: In this paper we develop a new primal-dual subgradient method for nonsmooth convex optimization problems. This scheme is based on a self-concordant barrier for the basic feasible set. It is suitable for finding approximate solutions with certain relative accuracy. We discuss some applications of this technique including fractional covering problem, maximal concurrent flow problem, semidefinite relaxations and nonlinear online optimization.
    Keywords: convex optimization, subgradient methods, non-smooth optimization, minimax problems, saddle points, variational inequalities, stochastic optimization, black-box methods, lower complexity bounds.
    Date: 2008–10
    URL: http://d.repec.org/n?u=RePEc:cor:louvco:2008060&r=cmp
  13. By: Richtarik, Peter (Université catholique de Louvain (UCL). Center for Operations Research and Econometrics (CORE))
    Abstract: In this paper we propose and analyze a variant of the level method [4], which is an algorithm for minimizing nonsmooth convex functions. The main work per iteration is spent on 1) minimizing a piecewise-linear model of the objective function and on 2) projecting onto the intersection of the feasible region and a polyhedron arising as a level set of the model. We show that by replacing exact computations in both cases by approximate computations, in relative scale, the theoretical iteration complexity increases only by the factor of four. This means that while spending less work on the subproblems, we are able to retain the good theoretical properties of the level method.
    Keywords: level method, approximate projections in relative scale, nonsmooth convex optimization, sensitivity analysis, large-scale optimization.
    Date: 2008–12
    URL: http://d.repec.org/n?u=RePEc:cor:louvco:2008083&r=cmp
  14. By: Concetta Rondinelli (Bank of Italy); Cheti Nicoletti (Institute for Social and Economic Research (ISER))
    Abstract: Empirical researchers usually prefer statistical models that can be easily estimated using standard software packages. One such model is the sequential binary model with or without normal random effects; such models can be adopted to estimate discrete duration models with unobserved heterogeneity. But ease of estimation may come at a cost. In this paper we conduct a Monte Carlo simulation to evaluate the consequences of omitting or misspecifying the unobserved heterogeneity distribution in single-spell discrete duration models.
    Keywords: discrete duration models, unobserved heterogeneity, Monte Carlo simulations
    JEL: C23 C25
    Date: 2009–03
    URL: http://d.repec.org/n?u=RePEc:bdi:wptemi:td_705_09&r=cmp
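The mechanism the Monte Carlo study probes can be reproduced directly: give every individual a constant exit hazard that varies across individuals, and the aggregate empirical hazard still appears to decline with duration, because high-hazard individuals exit first. The logistic hazard, frailty scale and sample size below are illustrative:

```python
import math, random

def simulate_durations(n=5000, base=0.3, sigma=1.0, tmax=20, seed=1):
    # Discrete-time hazard h = logistic(logit(base) + u), u ~ N(0, sigma^2):
    # unobserved heterogeneity, but a constant hazard for each individual.
    random.seed(seed)
    durations = []
    for _ in range(n):
        u = random.gauss(0.0, sigma)
        h = 1.0 / (1.0 + math.exp(-(math.log(base / (1 - base)) + u)))
        t = 1
        while random.random() > h and t < tmax:
            t += 1
        durations.append(t)
    return durations

def empirical_hazard(durations, t):
    # Share of those still at risk at t who exit at t
    at_risk = sum(1 for d in durations if d >= t)
    exits = sum(1 for d in durations if d == t)
    return exits / at_risk if at_risk else 0.0
```

A model that omits the heterogeneity would read this spurious negative duration dependence as a genuine feature of the data, which is the misspecification the paper quantifies.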
  15. By: Sofia Bauducco; Ales Bulir; Martin Cihak
    Abstract: To provide a rigorous analysis of monetary policy in the face of financial instability, we extend the standard dynamic stochastic general equilibrium model to include a financial system. Our simulations suggest that if financial instability affects output and inflation with a lag, and if the central bank has privileged information about credit risk, a monetary policy that responds instantly to increased credit risk can trade off more output and inflation instability today for a faster return to trend than a policy that follows the simple Taylor rule. This augmented rule leads in some parameterizations to improved long-term welfare outcomes; however, the welfare impacts of such a rule appear to be negligible.
    Keywords: DSGE models, financial instability, monetary policy rule.
    JEL: E52 E58 G21
    Date: 2008–12
    URL: http://d.repec.org/n?u=RePEc:cnb:wpaper:2008/8&r=cmp
  16. By: YATSENKO, Yuri; HRITONENKO, Natali
    Abstract: In Operations Research, the equipment replacement process is usually modeled in discrete time. The optimal replacement strategies are found from discrete (or integer) programming problems, well known for their analytic and computational complexity. An alternative approach is represented by continuous-time vintage capital models that explicitly involve the equipment lifetime and are described by nonlinear integral equations. The optimal replacement is then determined via the optimal control of such equations. These two alternative techniques describe essentially the same controlled dynamic process. We introduce and analyze a model that unites both approaches. The obtained results allow us to explore such important effects in optimal asset replacement as the transition and long-term dynamics, clustering and splitting of replaced assets, and the impact of improving technology and discounting. In particular, we demonstrate that cluster splitting is possible in our replacement model in the case of increasing demand. Theoretical findings are illustrated with numerical examples.
    Keywords: vintage capital models, optimization, equipment lifetime, discrete-continuous models.
    JEL: E20 O40 C60
    Date: 2008–12
    URL: http://d.repec.org/n?u=RePEc:cor:louvco:2008069&r=cmp
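The discrete-time side of the problem is a standard finite-horizon dynamic program over machine age: each year, keep the machine (maintenance cost rising with age) or replace it (pay the price, reset the age), with technology improving the profit of new vintages. All numbers are illustrative, not from the paper:

```python
def replacement_policy(T=30, maxage=10, price=10.0, disc=0.95, tech=0.02):
    # Backward induction over (year t, machine age); policy[t][age] is True
    # when replacing beats keeping.
    def profit(t, age):
        # Output improves with technology over time; maintenance grows with age
        return (1 + tech) ** t * 5.0 - 1.5 * age
    V = [[0.0] * (maxage + 1) for _ in range(T + 1)]
    policy = [[False] * (maxage + 1) for _ in range(T)]
    for t in range(T - 1, -1, -1):
        for age in range(maxage + 1):
            keep = profit(t, age) + disc * V[t + 1][min(age + 1, maxage)]
            repl = profit(t, 0) - price + disc * V[t + 1][1]
            policy[t][age] = repl > keep
            V[t][age] = max(keep, repl)
    return policy
```

The resulting policy is a threshold in age: new machines are kept and sufficiently old ones are replaced, the discrete analogue of the lifetime rules the continuous-time vintage model delivers.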
  17. By: BRECHET, Thierry (Université catholique de Louvain (UCL). Center for Operations Research and Econometrics (CORE)); EYCKMANS, Johan (Université catholique de Louvain (UCL). Center for Operations Research and Econometrics (CORE)); GERARD, François (Université catholique de Louvain (UCL). Center for Operations Research and Econometrics (CORE)); MARBAIX, Philippe
    Abstract: In this paper we analyze the negotiation strategy of the European Union regarding the formation of an international climate agreement for the post-2012 era. We use game theoretical stability concepts to explore incentives for key players in the climate policy game to join future climate agreements. We compare a minus 20 percent unilateral commitment strategy by the EU with a unilateral minus 30 percent emission reduction strategy for all Annex-B countries. Using a numerical integrated assessment climate-economy simulation model, we find that carbon leakage effects are negligible. The EU strategy to reduce emissions by 30% (compared to 1990 levels) by 2020 if other Annex-B countries follow does not induce participation of the USA with a similar 30% reduction commitment. However, the model shows that an appropriate initial allocation of emission allowances may stabilize a larger and more ambitious climate coalition than the Kyoto Protocol in its first commitment period.
    Keywords: climate change, coalition theory, integrated assessment model, Kyoto protocol.
    JEL: C6 C7 H4 Q5
    Date: 2008–11
    URL: http://d.repec.org/n?u=RePEc:cor:louvco:2008061&r=cmp
  18. By: Lönnbark, Carl (Department of Economics, Umeå University)
    Abstract: In general, the properties of the conditional distribution of multiple period returns do not follow easily from the one-period data generating process. This renders computation of Value-at-Risk and Expected Shortfall for multiple period returns a non-trivial task. In this paper we consider some approximation approaches to computing these measures. Based on the results of a simulation experiment we conclude that, among the studied analytical approaches, the one based on approximating the distribution of the multiple period shocks by a skew-t was the best. It was almost as good as the simulation-based alternative. We also found that the uncertainty due to estimation risk can be quite accurately estimated employing the delta method. In an empirical illustration we computed five-day VaRs for the S&P 500 index. The approaches performed about equally well.
    Keywords: Asymmetry; Estimation Error; Finance; GJR-GARCH; Prediction; Risk Management
    JEL: C16 C46 C52 C53 C63 G10
    Date: 2009–04–01
    URL: http://d.repec.org/n?u=RePEc:hhs:umnees:0768&r=cmp
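The simulation-based alternative mentioned in the abstract can be sketched with a plain GARCH(1,1) with normal shocks (the paper's GJR asymmetry and the skew-t approximation are omitted); all parameter values are illustrative:

```python
import math, random

def mc_var(h, n=20000, alpha=0.05, omega=0.1, a=0.05, b=0.9, seed=7):
    # Monte Carlo h-day Value-at-Risk under a GARCH(1,1) with normal shocks,
    # started at the unconditional variance omega / (1 - a - b).
    random.seed(seed)
    losses = []
    for _ in range(n):
        s2, r = omega / (1 - a - b), 0.0
        for _ in range(h):
            ret = math.sqrt(s2) * random.gauss(0.0, 1.0)
            r += ret
            s2 = omega + a * ret * ret + b * s2  # GARCH variance recursion
        losses.append(-r)
    losses.sort()
    return losses[int((1 - alpha) * n)]  # empirical (1 - alpha) loss quantile
```

Comparing mc_var(h) with the square-root-of-time scaling math.sqrt(h) * mc_var(1) shows how far the naive multi-period rule can drift from the simulated quantile, which is the gap the paper's analytical approximations try to close.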

This nep-cmp issue is ©2009 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.