nep-cmp New Economics Papers
on Computational Economics
Issue of 2011‒03‒26
twelve papers chosen by
Stan Miles
Thompson Rivers University

  1. Building an Artificial Stock Market Populated by Reinforcement-Learning Agents By Tomas Ramanauskas; Aleksandras Vytautas Rutkauskas
  2. The Implementation of Scenarios Using DSGE Models By Igor Vetlov; Ricardo Mourinho Félix; Laure Frey; Tibor Hlédik; Zoltán Jakab; Niki Papadopoulou; Lukas Reiss; Martin Schneider
  3. Can the removal of VAT Exemptions support the Poor? The Case of Niger By Dorothée Boccanfuso; Celine De Quatrebarbes; Luc Savard
  4. Role of Agriculture in Achieving MDG 1 in Asia and the Pacific Region By Katsushi S. Imai; Raghav Gaiha; Ganesh Thapa
  5. Are the Poverty Effects of Trade Policies Invisible? By Monika Verma; Thomas Hertel; Ernesto Valenzuela
  6. How uncertainty reduces greenhouse gas emissions By Schenker, Oliver
  7. Clustering life trajectories: A new divisive hierarchical clustering algorithm for discrete-valued discrete time series By Dlugosz, Stephan
  8. An efficient method of computing higher-order bond price perturbation approximations By Andreasen, Martin; Zabczyk, Pawel
  9. Cardinality versus q-Norm Constraints for Index Tracking By Bjoern Fastrich; Sandra Paterlini; Peter Winker
  10. On the (non-)equivalence of capital adequacy and monetary policy: A response to Cecchetti and Kohler By Stan du Plessis; Gideon du Rand
  11. An empirical Analysis of the Counterfactual: A Merger and Divestiture in the Australian Cigarette Industry By Vivienne Pham; David Prentice
  12. A redux of the workhorse NOEM model with capital accumulation and incomplete asset markets By Enrique Martinez-Garcia

  1. By: Tomas Ramanauskas (Bank of Lithuania); Aleksandras Vytautas Rutkauskas (Vilnius Gediminas Technical University)
    Abstract: In this paper we propose an artificial stock market model based on the interaction of heterogeneous agents whose forward-looking behaviour is driven by a reinforcement-learning algorithm combined with an evolutionary selection mechanism. We use the model to analyse the market's self-regulation abilities, market efficiency and the determinants of emergent properties of the financial market. Distinctive and novel features of the model include a strong emphasis on the economic content of individual decision-making, the application of the Q-learning algorithm for driving individual behaviour, and a rich market setup.
    Keywords: agent-based financial modelling, artificial stock market, complex dynamical system, emergent properties, market efficiency, agent heterogeneity, reinforcement learning
    JEL: G10 G11 G14
    Date: 2009–09–04
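The Q-learning driver named in the abstract can be illustrated with a minimal tabular update rule; the states ("bull", "bear"), actions ("buy", "sell") and reward below are hypothetical placeholders, not the paper's actual market environment.

```python
# One tabular Q-learning step of the kind that drives agent behaviour in
# such models: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a)).
def q_update(Q, state, action, reward, next_state, alpha=0.1, gamma=0.95):
    best_next = max(Q[next_state].values())          # greedy value of next state
    Q[state][action] += alpha * (reward + gamma * best_next - Q[state][action])
    return Q

# Two market states and two actions, all values initialised to zero.
Q = {s: {"buy": 0.0, "sell": 0.0} for s in ("bull", "bear")}
Q = q_update(Q, "bull", "buy", reward=1.0, next_state="bear")
```

After one step with all-zero initial values, the updated entry is simply alpha times the reward.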
  2. By: Igor Vetlov (Bank of Lithuania); Ricardo Mourinho Félix (Banco de Portugal); Laure Frey (Banque de France); Tibor Hlédik (Czech National Bank); Zoltán Jakab (Office of the Fiscal Council, Republic of Hungary); Niki Papadopoulou (Central Bank of Cyprus); Lukas Reiss (Oesterreichische Nationalbank); Martin Schneider (Oesterreichische Nationalbank)
    Abstract: The new generation of dynamic stochastic general equilibrium (DSGE) models seems particularly suited to conducting scenario analysis. These models formalise the behaviour of economic agents on the basis of explicit micro-foundations. As a result, they appear less prone to the Lucas critique than more traditional macroeconometric models. DSGE models provide researchers with powerful tools that allow them to design a broad range of scenarios and tackle a wide range of issues, while offering an appealing structural interpretation of the scenario specification and simulation results. The paper illustrates some of the modelling issues that often arise when implementing scenarios with DSGE models in the context of projection exercises or policy analysis. These issues reflect the sensitivity of DSGE model-based analysis to scenario assumptions that appear less critical in more traditional models, such as the anticipation and duration of scenario events and the treatment of monetary and fiscal policy rules.
    Keywords: business fluctuations, monetary policy, fiscal policy, forecasting and simulation
    JEL: E32 E52 E62 E37
    Date: 2010–08–25
  3. By: Dorothée Boccanfuso (GREDI - Université de Sherbrooke); Celine De Quatrebarbes (CERDI - Centre d'études et de recherches sur le developpement international - CNRS : UMR6587 - Université d'Auvergne - Clermont-Ferrand I); Luc Savard (GREDI - Université de Sherbrooke)
    Abstract: To raise the public funds necessary for its development, Niger is examining the possibility of expanding its VAT base to exempted goods and basic food products. This proposal has prompted violent opposition, raising the question of the social impact of taxation. We develop the first micro-macro computable general equilibrium model of Niger's economy. The model allows a social-impact and distributional analysis of the following VAT structures: a pure VAT structure, a structure maintaining certain exemptions, and a multiple-rate VAT structure. The model's results show that although restoring the VAT rate would be socially costly compared to the initial situation, the distributional impact of the VAT differs according to the system implemented in the country. Maintaining VAT exemptions in food-crop agriculture while expanding the tax base in the remaining sectors would increase public revenue while taking into account the national goal of poverty reduction. The net social impact of an exemption depends on the economic structure of the sector concerned. If the national goal is to end exemptions, the model shows that applying a pure VAT conforming to the theory is preferable in terms of economic growth, whereas applying a reduced rate to food-crop agriculture lightens the social impact of ending exemptions compared to a single rate.
    Keywords: distributional analysis; Value Added Tax; exemptions; micro-simulation; computable general equilibrium model; Niger
    Date: 2011–03–16
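As a rough illustration of why removing exemptions can be socially costly, a first-order incidence calculation shows how the burden scales with the food budget share of the poor; the budget shares and VAT rate below are hypothetical, not results from the paper's CGE model.

```python
# First-order incidence sketch of removing a VAT exemption: the consumer
# price of the newly taxed good rises by the tax, and each household's
# real consumption falls roughly in proportion to its budget share of
# that good. All numbers are hypothetical, not from the Niger model.
def real_burden_pct(budget_share, vat_rate):
    """Approximate % fall in real consumption when an exempt good is taxed."""
    return budget_share * vat_rate * 100.0

# Poor households spend a larger share on food, so a uniform VAT on
# previously exempt food products is regressive in this first-order sense.
poor = real_burden_pct(budget_share=0.45, vat_rate=0.19)
rich = real_burden_pct(budget_share=0.15, vat_rate=0.19)
```

A full CGE analysis such as the paper's also captures producer-price, revenue-recycling and general-equilibrium effects that this back-of-the-envelope calculation ignores.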
  4. By: Katsushi S. Imai (Economics, School of Social Sciences, University of Manchester, UK and Research Institute for Economics & Business Administration, Kobe University, Japan); Raghav Gaiha (Massachusetts Institute of Technology, USA & Faculty of Management Studies, University of Delhi, India); Ganesh Thapa (International Fund for Agricultural Development, Italy)
    Abstract: This paper examines whether agricultural growth through public expenditure, ODA or investment will significantly improve the prospects of achieving MDG 1 of halving poverty in Asia and the Pacific Region. As more than a few countries in this Region recorded impressive economic growth in the early years of the present decade, the case for the widely used poverty threshold of US$1.25 per day (at 2005 PPP) for assessing progress towards MDG 1 is no longer so compelling. Accordingly, the present assessment uses two poverty thresholds: US$2 per day and US$1.25 per day (both at 2005 PPP). Our analysis, based on country panel data, confirms robustly that increases in public agricultural expenditure, agricultural ODA, agricultural investment, or fertiliser use (as a proxy for technology) accelerate agricultural and GDP growth. Consequently, the headcount and depth of poverty indices are reduced substantially. Our simulation results show that, for halving the headcount index at US$2 per day, Asia and the Pacific region as a whole would need in 2007-13 a 56% increase in annual agricultural ODA, a 28% increase in agricultural expenditure, a 23% increase in fertiliser use or a 24% increase in agricultural investment. Aggregation of the simulation results for various groups reveals that countries in the low-income group, with a low level of macro governance or institutional quality, or with low ease of doing business, would need larger increases in agricultural ODA, expenditure or investment to halve poverty. Although the share of agriculture in GDP has declined, our analysis reinforces the case for channelling a substantially larger flow of resources, not just for accelerating growth but also for achieving the more ambitious MDG 1. A policy dilemma, however, is the trade-off between institutional quality and resource transfers. National governments and donors must reflect deeply on triggers for institutional reforms and mechanisms that would ensure larger outlays for agriculture and their allocation between rural infrastructure and sustainable technologies.
    Keywords: Millennium Development Goal, Poverty, Agriculture, ODA, Investment, Public Expenditure, Asia, Panel Data, Simulations
    JEL: C31 C33 H53 I32
    Date: 2011–01
  5. By: Monika Verma (Center for Global Trade Analysis, Department of Agricultural Economics, Purdue University); Thomas Hertel (Center for Global Trade Analysis, Department of Agricultural Economics, Purdue University); Ernesto Valenzuela (Centre for International Economic Studies, School of Economics, University of Adelaide)
    Abstract: Beginning with the WTO's Doha Development Agenda and establishment of the Millennium Development Goal of reducing poverty by 50 percent by 2015, poverty impacts of trade reforms have become central to the global development agenda. This has been particularly true of agricultural trade reforms due to the importance of grains in the diets of the poor, presence of relatively higher protection in agriculture, as well as heavy concentration of global poverty in rural areas where agriculture is the main source of income. Yet some in this debate have argued that, given the extreme volatility in agricultural commodity markets, the additional price and therefore poverty impacts due to trade liberalization might well be barely discernible. This paper formally tests this invisibility hypothesis using the method of stochastic simulation in a trade-poverty modeling framework. The hypothesis test is based on the comparison of two samples of price and poverty distributions. The first originates solely from the inherent variability in global staple grains markets, while the second combines the effects of inherent market variability with those of trade reform in these same markets. Results, at both national and stratum levels, indicate that the short-run poverty impacts of full trade liberalization in staple grains trade worldwide are not distinguishable in the majority of cases, suggesting that the poverty impacts of more modest (and realistic) agricultural trade liberalization are indeed likely to be statistically invisible. This does not mean that such reforms are economically unimportant. Rather it is a direct consequence of the high degree of volatility in agricultural commodity markets.
    Keywords: Trade policy reform, agricultural trade, computable general equilibrium, developing countries, poverty headcount, volatility, stochastic simulation, non-parametric hypothesis testing.
    JEL: C12 C68 F17 I32 Q17 R20
    Date: 2011–03
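The paper's test compares two simulated distributions of poverty outcomes. A non-parametric comparison of that kind can be sketched with a permutation test; the two samples below are synthetic stand-ins, not the paper's simulation output.

```python
import numpy as np

# Permutation test of whether two simulated poverty-headcount samples
# differ in mean, in the spirit of comparing a "market volatility only"
# sample with a "volatility plus trade reform" sample.
def permutation_test(a, b, n_perm=2000, seed=0):
    rng = np.random.default_rng(seed)
    observed = abs(a.mean() - b.mean())
    pooled = np.concatenate([a, b])
    exceed = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(pooled[:len(a)].mean() - pooled[len(a):].mean())
        exceed += diff >= observed
    return exceed / n_perm    # p-value: share of permuted diffs >= observed

rng = np.random.default_rng(1)
baseline = rng.normal(20.0, 2.0, 200)      # volatility only (synthetic)
with_reform = rng.normal(20.1, 2.0, 200)   # volatility plus small reform effect
p = permutation_test(baseline, with_reform)
```

When the reform effect is small relative to the inherent volatility, as here, the test typically cannot reject equality, which is the paper's "invisibility" result in miniature.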
  6. By: Schenker, Oliver
    Abstract: In 2006 China became the world’s largest emitter of greenhouse gases (GHG), responsible for one-fifth of the world’s emissions from power generation, and further strong growth in this sector is to be expected. Providing these additional power generation capacities will require substantial investment in China’s energy infrastructure. But potential investors are confronted with uncertainty about the design of China’s future climate policy, which might affect the profitability of GHG-emitting power plants. The aim of this paper is to investigate the effect of uncertainty about China’s climate policy on investment in the electricity sector and its consequences for GHG emissions. We analyse the topic with a stochastic dynamic computable general equilibrium model with an extended energy sector, calibrated with Chinese data. The results show that uncertainty about the timing and extent of China’s climate policy lowers emissions compared to a world with perfect information: uncertainty lowers the present value of coal-fired electricity in pre-policy periods and thus has a positive effect on the environment.
    Keywords: China; Energy Policy; Climate Policy; Investment under Uncertainty; Stochastic and Dynamic CGE Model
    JEL: O41 C68 Q41 D58 D80
    Date: 2011–02–15
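The valuation channel described in the abstract, that an uncertain future carbon price lowers the present value of coal-fired capacity today, can be sketched with a simple Monte Carlo NPV calculation; all parameters are illustrative and the paper's full CGE structure is not modelled.

```python
import numpy as np

# Monte Carlo sketch: a carbon price of unknown start date cuts the
# plant's annual profit once it arrives. Profits, rates and arrival
# dates are hypothetical, not taken from the paper.
def expected_npv(profit=100.0, taxed_profit=40.0, years=30,
                 discount=0.07, seed=0):
    rng = np.random.default_rng(seed)
    # Policy may start in year 5, year 10, or never, with equal odds.
    starts = rng.choice([5, 10, years + 1], size=10_000)
    npvs = []
    for start in starts:
        cash = [(profit if t < start else taxed_profit) / (1 + discount) ** t
                for t in range(1, years + 1)]
        npvs.append(sum(cash))
    return float(np.mean(npvs))

npv_uncertain = expected_npv()
npv_no_policy = expected_npv(taxed_profit=100.0)  # carbon price never binds
```

The gap between the two values is the investment disincentive that, in the paper's richer model, shifts capacity away from coal and lowers pre-policy emissions.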
  7. By: Dlugosz, Stephan
    Abstract: A new algorithm for clustering life course trajectories is presented and tested with large register data. Life courses are represented as sequences on a monthly timescale for the working life, with an age span from 16 to 65. A meaningful clustering result for this kind of data provides interesting subgroups with similar life course trajectories. The high sampling rate allows precise discrimination of the different subgroups, but it produces a lot of highly correlated data for phases with low variability. The main challenge is to select the variables (points in time) that carry most of the relevant information. The new algorithm deals with this problem by simultaneously clustering and identifying critical junctures for each of the relevant subgroups. The developed divisive algorithm is able to handle large amounts of data with multiple dimensions within reasonable time. This is demonstrated on data from the Federal German pension insurance.
    Keywords: clustering, measures of association, discrete data, time series
    JEL: C33 J00
    Date: 2011
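One divisive step of such an algorithm can be sketched as follows; the entropy-based split criterion here is a simplified stand-in for the paper's association measures and critical-juncture identification, and the toy sequences are hypothetical.

```python
import numpy as np

# One bisecting step of a divisive clustering for discrete-valued
# sequences: pick the time point whose states are most heterogeneous
# (highest entropy) and split the sample on its modal state there.
def entropy(column):
    _, counts = np.unique(column, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def divisive_split(sequences):
    X = np.asarray(sequences)
    # The most informative point in time (ties broken by first occurrence).
    split_t = max(range(X.shape[1]), key=lambda t: entropy(X[:, t]))
    values, counts = np.unique(X[:, split_t], return_counts=True)
    modal = values[counts.argmax()]
    mask = X[:, split_t] == modal
    return split_t, X[mask], X[~mask]

# Toy monthly state sequences (e.g. E = employed, U = unemployed).
seqs = [list("EEEE"), list("EEEU"), list("EEUU"), list("EEUE")]
t, left, right = divisive_split(seqs)
```

Recursing on each subgroup yields a divisive hierarchy, with the chosen time points playing the role of critical junctures.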
  8. By: Andreasen, Martin (Bank of England); Zabczyk, Pawel (Bank of England)
    Abstract: This paper develops a fast method of computing arbitrary order perturbation approximations to bond prices in DSGE models. The procedure is implemented to third order where it can shorten the approximation process by more than 100 times. In a consumption-based endowment model with habits, it is further shown that a third-order perturbation solution is more accurate than the log-normal method and a procedure using consol bonds.
    Keywords: Perturbation method; DSGE models; habit model; higher-order approximation.
    JEL: C63 G12
    Date: 2011–03–15
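For reference, the recursive bond pricing problem being approximated, and the log-normal benchmark the abstract compares against, can be written as follows; the notation is standard, not necessarily the paper's.

```latex
% n-period bond price recursion, m_{t+1} the log stochastic discount factor:
P_t^{(n)} = \mathbb{E}_t\!\left[ e^{m_{t+1}} P_{t+1}^{(n-1)} \right],
\qquad P_t^{(0)} = 1 .
% The log-normal method assumes m_{t+1} + \log P_{t+1}^{(n-1)} is
% conditionally Gaussian, so that
\log P_t^{(n)} = \mathbb{E}_t\!\left[ m_{t+1} + \log P_{t+1}^{(n-1)} \right]
  + \tfrac{1}{2}\,\mathrm{Var}_t\!\left[ m_{t+1} + \log P_{t+1}^{(n-1)} \right].
```

A third-order perturbation instead approximates the policy and pricing functions around the steady state without imposing conditional log-normality, which is where the paper's accuracy comparison bites.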
  9. By: Bjoern Fastrich; Sandra Paterlini; Peter Winker
    Abstract: Index tracking aims at replicating a given benchmark with a smaller number of its constituents. Different quantitative models can be set up to determine the optimal index replicating portfolio. In this paper, we propose an alternative based on imposing a constraint on the q-norm, 0 < q < 1, of the replicating portfolios' asset weights: the q-norm constraint regularises the problem and identifies a sparse model. Both approaches are challenging from an optimisation viewpoint, due to either the presence of the cardinality constraint or a non-convex constraint on the q-norm. The problem can become even more complex when non-convex distance measures or other real-world constraints are considered. We employ a hybrid heuristic as a flexible tool to tackle both optimisation problems. The empirical analysis on real-world financial data allows us to compare the two index tracking approaches. Moreover, we propose a strategy to determine the optimal number of constituents and the corresponding optimal portfolio asset weights.
    Keywords: index tracking, cardinality constraint, q-norm, regularization methods, heuristic algorithms
    JEL: C15 C61 G11
    Date: 2011–01
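The q-norm-penalised tracking objective can be sketched as follows; the crude random local search stands in for the paper's hybrid heuristic, and the return data are synthetic, not the paper's empirical dataset.

```python
import numpy as np

# Minimise mean squared tracking error plus lam * sum(|w|^q), 0 < q < 1.
# The penalty with q < 1 encourages sparse replicating portfolios.
def objective(w, R, bench, lam=0.01, q=0.5):
    tracking_error = np.mean((R @ w - bench) ** 2)
    return tracking_error + lam * np.sum(np.abs(w) ** q)

def local_search(R, bench, iters=3000, step=0.05, seed=0):
    rng = np.random.default_rng(seed)
    n = R.shape[1]
    w = np.full(n, 1.0 / n)               # start from equal weights
    best = objective(w, R, bench)
    for _ in range(iters):
        cand = np.clip(w + step * rng.standard_normal(n), 0.0, None)
        if cand.sum() == 0.0:
            continue
        cand /= cand.sum()                # long-only, fully invested
        val = objective(cand, R, bench)
        if val < best:
            w, best = cand, val
    return w, best

rng = np.random.default_rng(42)
R = rng.normal(0.0, 0.01, size=(250, 10))    # synthetic daily constituent returns
true_w = np.array([0.3, 0.3, 0.2, 0.2, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0])
bench = R @ true_w                           # benchmark driven by 4 constituents
w, best = local_search(R, bench)
```

The non-convexity of the penalty is exactly why heuristic methods such as the paper's are attractive here: gradient-based solvers have no guarantee of finding the sparse optimum.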
  10. By: Stan du Plessis (Department of Economics, University of Stellenbosch); Gideon du Rand (Department of Economics, University of Stellenbosch)
    Abstract: The instrument problem in monetary policy is back on the agenda. Until recently, interest rate policy was widely thought to be sufficient for the attainment of appropriate monetary policy goals. No longer. In the wake of the international financial crisis there is much pressure on monetary authorities to incorporate the goal of financial stability more explicitly in policy. This requires an expansion of the instruments typically used by central banks. Cecchetti and Kohler (2010) recently considered this new version of the instrument problem in monetary policy by analysing the distinct roles of, and the potential for co-ordinating, (i) interest rates and (ii) capital adequacy requirements. In this paper we connect this modern debate with an earlier version of the instrument problem, famously discussed by Poole (1970). Then, as now (we claim), the main message of the analysis is the non-equivalence of these instruments, with the structural features of the economy determining which combination of instruments one would prefer. These results are demonstrated with a set of simulations. We also offer a theoretical criticism of the modelling approach used by Cecchetti and Kohler (2010).
    Keywords: Monetary policy, Instrument problem, Interest rates, Alternative monetary policy instruments, Balance sheet operations, Policy co-ordination
    JEL: E52 E58 E61
    Date: 2011
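The Poole-style non-equivalence result can be illustrated with a stochastic IS-LM simulation: which instrument stabilises output better depends on whether spending or money-demand shocks dominate. The parameter values are illustrative, and the capital-adequacy instrument the paper adds is not modelled here.

```python
import numpy as np

# Poole (1970)-style comparison: IS curve y = -a*r + u, LM curve
# m = b*y - c*r + v, with IS shocks u and money-demand shocks v.
def output_variance(sigma_u, sigma_v, a=1.0, b=0.5, c=1.0, n=100_000, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.normal(0.0, sigma_u, n)   # IS (spending) shocks
    v = rng.normal(0.0, sigma_v, n)   # LM (money demand) shocks
    # Interest-rate instrument: r fixed at 0, so y = u.
    y_rate = u
    # Money-stock instrument: m fixed at 0; solving IS and LM jointly
    # gives y = (u - (a/c)*v) / (1 + a*b/c).
    y_money = (u - (a / c) * v) / (1.0 + a * b / c)
    return float(y_rate.var()), float(y_money.var())

# Large money-demand shocks favour the interest-rate instrument;
# with no money-demand shocks the money-stock instrument does better.
var_r_big_v, var_m_big_v = output_variance(sigma_u=0.5, sigma_v=2.0)
var_r_no_v, var_m_no_v = output_variance(sigma_u=0.5, sigma_v=0.0)
```

The ranking flips with the shock structure, which is the sense in which the instruments are not equivalent.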
  11. By: Vivienne Pham (School of Economics and Finance, La Trobe University); David Prentice (School of Economics and Finance, La Trobe University)
    Abstract: In this paper we empirically analyse two counterfactual situations facing an anti-trust authority following the merger of two of the largest international cigarette companies. First we estimate a nested logit model of demand for cigarettes. The implied elasticity of demand for smoking and implied marginal costs are both broadly consistent with the limited independent estimates available. We then use the model to simulate the proposed merger and the partial divestiture that was accepted by the Australian anti-trust authority. A comparison of the relative price changes predicted by the divestiture simulation with the actual post-divestiture price changes shows the model successfully anticipated the behaviour of the divested brands. This suggests structural econometric analysis using a nested logit may be usefully utilised by anti-trust authorities to assess the welfare implications of proposed mergers and partial divestitures.
    Date: 2010–11
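The nested logit demand system the paper estimates has standard choice probabilities, sketched here with hypothetical brand utilities and nesting parameter, not the paper's estimates.

```python
import math

# Nested logit shares (textbook formulation): choose a nest by its
# inclusive value, then a brand within the nest. delta maps brand to
# mean utility, nests maps brand to its nest, lam is the nesting
# (within-nest correlation) parameter. No outside option is modelled.
def nested_logit_shares(delta, nests, lam=0.6):
    groups = {}
    for j, g in nests.items():
        groups.setdefault(g, []).append(j)
    # Inclusive value of each nest: lam * log sum exp(delta_j / lam).
    iv = {g: lam * math.log(sum(math.exp(delta[j] / lam) for j in js))
          for g, js in groups.items()}
    denom = sum(math.exp(v) for v in iv.values())
    shares = {}
    for g, js in groups.items():
        p_nest = math.exp(iv[g]) / denom
        within = sum(math.exp(delta[j] / lam) for j in js)
        for j in js:
            shares[j] = p_nest * math.exp(delta[j] / lam) / within
    return shares

delta = {"A": 1.0, "B": 0.8, "C": 0.2}                     # hypothetical utilities
nests = {"A": "premium", "B": "premium", "C": "value"}     # hypothetical segments
s = nested_logit_shares(delta, nests)
```

Merger and divestiture simulations of the paper's kind re-solve firms' pricing first-order conditions against demand systems like this one.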
  12. By: Enrique Martinez-Garcia
    Abstract: I build a symmetric two-country model that incorporates nominal rigidities, local-currency pricing and monopolistic competition distorting the goods markets. The model is similar to the framework developed in Martínez-García and Søndergaard (2008a, 2008b), but it also introduces frictions in the asset markets by restricting the financial assets available to two uncontingent nominal bonds in zero-net supply and by adding quadratic costs on international borrowing (see, e.g., Benigno and Thoenissen (2008) and Benigno (2009)). The technical part of the paper contains three basic calculations. First, I derive the equilibrium conditions of the open economy model under local-currency pricing and incomplete asset markets. Second, I compute the zero-inflation (deterministic) steady state and discuss what happens with a non-zero net foreign asset position. Third, I derive the log-linearization of the equilibrium conditions around the deterministic steady state. The quantitative part of the paper aims to give a broad overview of the role that incomplete international asset markets can play in accounting for the persistence and volatility of the real exchange rate. I find that the simulations of the incomplete and complete asset markets models are almost indistinguishable whenever the business cycle is driven primarily by either nonpersistent monetary or persistent (but not permanent) productivity shocks. In turn, asset market incompleteness has more sizeable wealth effects whenever the cycle is driven by persistent (but not permanent) investment-specific technology shocks, resulting in significantly lower real exchange rate volatility.
    Keywords: Foreign exchange ; International finance ; International trade ; Macroeconomics
    Date: 2011
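The log-linearization in the third step follows the standard definition of hatted variables as log deviations from the deterministic steady state; the notation below is the usual one, not necessarily the paper's.

```latex
% Hats denote log deviations from steady-state values \bar{X}:
\hat{x}_t \equiv \log\!\left( X_t / \bar{X} \right)
\;\Rightarrow\;
X_t = \bar{X}\, e^{\hat{x}_t} \approx \bar{X}\,(1 + \hat{x}_t),
% so, to first order, products of variables linearise additively:
X_t Y_t \approx \bar{X}\bar{Y}\,\left(1 + \hat{x}_t + \hat{y}_t\right).
```

Applying this term by term to each equilibrium condition yields the linear system whose simulation underlies the complete- versus incomplete-markets comparison.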

This nep-cmp issue is ©2011 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.