nep-cmp New Economics Papers
on Computational Economics
Issue of 2015‒06‒20
fourteen papers chosen by
Stan Miles
Thompson Rivers University

  1. Q-Learning and SARSA: a comparison between two intelligent stochastic control approaches for financial trading By Marco Corazza; Andrea Sangalli
  2. Regional Analysis of Domestic Integration in Egypt By Eduardo A. Haddad; Michael Lahr; Dina N. Elshahawany; Moises Vassallo
  3. Direct Taxation of Consumption: A Simulation of Revenues and Rates for Quebec By François Vaillancourt; Stefano Polloni
  4. The Slowdown in Economic Growth of Major Trading Partners and Its Implications for the Congolese Economy By Nlemfu Mukoko, Jean Blaise
  5. An interior-point path-following method for computing a perfect stationary point of a polynomial mapping on a polytope By Dang, Chuangyin; Meng, Xiaoxuan; Talman, Dolf
  6. Labour Supply Models By Rolf Aaberge; Ugo Colombino
  7. The SADC Free Trade Area and the Economy of the D.R. Congo: Trade Creation and Welfare? By Nlemfu Mukoko, Jean Blaise; Wabenga Yango, James
  8. The Dynamics of Pollution Permits By Hasegawa, Makoto; Salant, Stephen
  9. A Practical Approach to Financial Crisis Indicators Based on Random Matrices By Antoine Kornprobst; Raphael Douady
  10. An Economy-Wide Evaluation of New Power Generation in South Africa: The Case of Kusile and Medupi By Jessica A. Bohlmann; Heinrich R. Bohlmann; Roula Inglesi-Lotz
  11. The economics of radical uncertainty By Ormerod, Paul
  12. It is a matter of hierarchy: a Nash equilibrium problem perspective on bilevel programming By Lorenzo Lampariello; Simone Sagratella
  13. Remote Sensing Feature Detection and Geoinformation Retrieval Via Multiscale 2D Gabor Wavelet Transform By Zhengmao Ye; Habib Mohamadian
  14. A Microsimulation Model of the Distributional Impacts of Climate Policies By Gordon, Hal; Burtraw, Dallas; Williams, Roberton

  1. By: Marco Corazza (Department of Economics, Ca' Foscari University of Venice); Andrea Sangalli (…)
    Abstract: The purpose of this paper is to solve a stochastic control problem consisting of optimizing the management of a trading system. Two model-free machine learning algorithms based on Reinforcement Learning are compared: Q-Learning and SARSA. Both models optimize their behaviour in real time on the basis of the reactions they get from the environment in which they operate. This idea is grounded in an emerging theory of market efficiency, the Adaptive Market Hypothesis. We apply the algorithms to single stock price time series using simple state variables. The algorithms operate by selecting an action among three possible ones: buy, sell, and stay out of the market. We perform several applications based on different parameter settings, tested on an artificial daily stock price time series and on several real ones from the Italian stock market. Performances are reported both gross and net of transaction costs.
    Keywords: Financial trading system, Adaptive Market Hypothesis, model free machine learning, Reinforcement Learning, Q-Learning, SARSA, Italian stock market.
    JEL: C61 C63 G11
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:ven:wpaper:2015:15&r=cmp
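The two update rules the paper compares differ only in how they bootstrap: Q-Learning uses the greedy action in the next state (off-policy), while SARSA uses the action actually taken (on-policy). A minimal tabular sketch of both, with the three actions named in the abstract; the environment, state encoding, and parameter values are illustrative placeholders, not the authors' specification:

```python
import random
from collections import defaultdict

ACTIONS = ["buy", "sell", "out"]  # the three actions named in the abstract

def q_learning_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.95):
    """Off-policy: bootstrap on the greedy action in the next state."""
    best_next = max(Q[(s_next, b)] for b in ACTIONS)
    Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])

def sarsa_update(Q, s, a, r, s_next, a_next, alpha=0.1, gamma=0.95):
    """On-policy: bootstrap on the action actually taken next."""
    Q[(s, a)] += alpha * (r + gamma * Q[(s_next, a_next)] - Q[(s, a)])

def epsilon_greedy(Q, s, eps=0.1):
    """Exploration policy: random action with probability eps, else greedy."""
    if random.random() < eps:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda b: Q[(s, b)])
```

In a trading loop, the state `s` would be built from the price series (e.g. discretized recent returns) and the reward `r` from realized profit net of transaction costs.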
  2. By: Eduardo A. Haddad; Michael Lahr; Dina N. Elshahawany; Moises Vassallo
    Abstract: We develop an interregional computable general equilibrium model to help assess the ex ante impact of transportation infrastructure policies in Egypt. The model is integrated with a GIS network. We illustrate the analytical capabilities of the model by looking at the domestic integration of the country. Improvements in transportation costs among Egyptian governorates, and in their links to the broader world economy, are considered in stylized simulations. The results provide quantitative and qualitative insights (general equilibrium effects) into trade-offs commonly faced by policy makers when dealing with transportation infrastructure projects in a spatial context. In the case of Egypt, there seems to be an important trade-off between efficiency and regional equity: projects that produce potentially higher impacts on national GDP also tend to contribute more to regional concentration.
    Keywords: Transportation cost; infrastructure; regional analysis; spatial general equilibrium.
    JEL: R11 R13 R4
    Date: 2015–06–10
    URL: http://d.repec.org/n?u=RePEc:spa:wpaper:2015wpecon10&r=cmp
  3. By: François Vaillancourt; Stefano Polloni
    Date: 2015–05–29
    URL: http://d.repec.org/n?u=RePEc:cir:cirpro:2015rp-12&r=cmp
  4. By: Nlemfu Mukoko, Jean Blaise
    Abstract: This work examines the possible impact of slower economic growth in major trading partners on the Congolese economy over the period 2013-2020, and offers an alternative way to finance the economy. A dynamic, microsimulated computable general equilibrium model is applied to the D.R. Congo economy. The results show the implications for the real sector, public finances and the balance of payments, and emphasize the importance of diversifying the structure of exports by focusing on agricultural products as an alternative solution.
    Keywords: Computable general equilibrium, economic growth
    JEL: C68 O40 O55
    Date: 2013–07
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:65089&r=cmp
  5. By: Dang, Chuangyin; Meng, Xiaoxuan; Talman, Dolf (Tilburg University, Center For Economic Research)
    Abstract: As a refinement of the concept of stationary point, the notion of perfect stationary point was formulated in the literature. Although simplicial methods could be applied to approximate such a point, these methods do not make use of the possible differentiability of the problem and can be very time-consuming even for small-scale problems. To fully exploit the differentiability of the problem, this paper develops an interior-point path-following method for computing a perfect stationary point of a polynomial mapping on a polytope. By incorporating a logarithmic barrier term into the linear objective function with an appropriate convex combination, the method closely approximates some stationary points of the mapping on a perturbed polytope, especially when the perturbation is sufficiently small. It is proved that there exists a smooth path which starts from a point in the interior of a polytope and ends at a perfect stationary point. A predictor-corrector method is adopted for numerically following the path. Numerical results further confirm the effectiveness of the method.
    Keywords: variational inequality problem; perfect stationary point; interior-point path-following method; predictor-corrector method
    JEL: C62 C63
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:tiu:tiucen:07b7a0e7-f814-4ec2-a3a7-e6cec16cbab9&r=cmp
  6. By: Rolf Aaberge; Ugo Colombino (Statistics Norway)
    Abstract: This paper is published as Chapter 7 of the Handbook of Microsimulation Modelling, edited by Cathal O'Donoghue and issued in the series Contributions to Economic Analysis by Emerald Publishing Group. Its purpose is to provide a detailed discussion of the development of labour-supply-focused microsimulation models and the methodological choices involved. The paper identifies three methodologies for modelling labour supply: the reduced form approach, the structural "marginalist" approach, and the random utility maximisation approach. It considers issues associated with the reliability of structural models relative to (ex-post) experimental or quasi-experimental analysis. Recognising the need to undertake ex-ante analysis, it asks whether there are alternatives to structural models, how structural models can be evaluated, and how they compare with other approaches. The paper then describes how these models are used for policy simulation, in terms of producing and interpreting simulation outcomes, and outlines an extensive literature of policy analyses using the approach. Labour supply is central not only to modelling behavioural responses but also to modelling optimal tax-benefit systems; the chapter focuses on a computational approach, given some of the challenges of the theoretical one. Combining labour supply results with welfare functions enables the social evaluation of policy simulations, and the chapter shows how doing so allows socially optimal income taxation to be modelled.
    Keywords: inequality; poverty; deprivation; multidimensional well-being; capabilities and functionings
    JEL: D10 D31 H21 H24 J20
    Date: 2015–04
    URL: http://d.repec.org/n?u=RePEc:ssb:dispap:807&r=cmp
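The random utility maximisation approach the chapter surveys treats labour supply as a discrete choice over a few hours alternatives, with i.i.d. extreme-value errors yielding multinomial-logit choice probabilities. A minimal sketch under that logic; the utility specification, hours grid, wage, and tax rule below are all illustrative assumptions, not the chapter's model:

```python
import math

HOURS = [0.0, 20.0, 40.0]  # illustrative discrete alternatives

def net_income(hours, wage, tax_rate):
    """Toy linear tax: net earnings from a given hours choice."""
    return hours * wage * (1.0 - tax_rate)

def utility(consumption, leisure, beta_c=1.0, beta_l=0.05):
    """Illustrative log-utility over consumption plus a linear leisure term."""
    return beta_c * math.log(1.0 + consumption) + beta_l * leisure

def choice_probabilities(wage, tax_rate):
    """Multinomial-logit probabilities implied by i.i.d. EV-1 taste shocks."""
    v = [utility(net_income(h, wage, tax_rate), 80.0 - h) for h in HOURS]
    m = max(v)  # subtract the max for numerical stability
    expv = [math.exp(x - m) for x in v]
    z = sum(expv)
    return [e / z for e in expv]

def expected_hours(wage, tax_rate):
    """Expected labour supply: probability-weighted hours."""
    p = choice_probabilities(wage, tax_rate)
    return sum(pi * h for pi, h in zip(p, HOURS))
```

Simulating a tax reform then amounts to recomputing `expected_hours` under the new tax rule; with these parameters a higher tax rate lowers expected hours.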
  7. By: Nlemfu Mukoko, Jean Blaise; Wabenga Yango, James
    Abstract: This work examines the implications of joining the SADC Free Trade Agreement for the D.R. Congo economy within a computable general equilibrium framework. It first analyses the features of the Congolese economy through the 2005 social accounting matrix, the empirical basis of our model, on which the removal of 85% of import tariffs is simulated. The results give an idea of the expected effects on trade creation and welfare, as well as the adjustments the Congo needs to make to participate fully in exchanges within this Free Trade Area.
    Keywords: Computable general equilibrium, Free Trade Agreement, Trade Creation and Welfare
    JEL: D58 D63 F13 F15
    Date: 2011–02
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:65050&r=cmp
  8. By: Hasegawa, Makoto; Salant, Stephen (Resources for the Future)
    Abstract: We review the literature on bankable emission permits, which has developed over the last two decades. Most articles analyze either theoretical or simulation models. The theoretical literature considers the problem of minimizing the discounted sum of social costs and the possibility of decentralizing the solution through competitive permit markets. In some cases, authors do not explicitly consider pollution damages but instead assume that the planner's goal is to minimize the discounted social cost of reducing cumulative emissions by a given amount. In other cases, authors do not explicitly consider an emissions reduction target but assume that the goal is to minimize the discounted sum of pollution damages and abatement costs. Simulations permit evaluation of alternative government policies under uncertainty. We conclude by pointing out directions for future work.
    Date: 2015–05–26
    URL: http://d.repec.org/n?u=RePEc:rff:dpaper:dp-15-20&r=cmp
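A core result in the banking literature the survey reviews is a Hotelling-type condition: with convex abatement costs, cost-minimizing use of bankable permits makes marginal abatement cost rise at the rate of interest. A two-period worked example under an assumed quadratic cost c(a) = a²/2 (the functional form and numbers are illustrative, not from the surveyed models):

```python
def optimal_abatement(total_abatement, r):
    """Split a fixed abatement total across two periods.

    With c(a) = a^2/2, marginal cost is c'(a) = a, and cost minimization
    requires c'(a2) = (1 + r) * c'(a1), i.e. a2 = (1 + r) * a1.
    """
    a1 = total_abatement / (2.0 + r)
    a2 = total_abatement - a1
    return a1, a2

def discounted_cost(a1, a2, r):
    """Discounted sum of quadratic abatement costs over the two periods."""
    return 0.5 * a1**2 + 0.5 * a2**2 / (1.0 + r)
```

Deferring some abatement (banking permits early, using them later) is cheaper in discounted terms than splitting abatement evenly, and the wedge grows with the interest rate.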
  9. By: Antoine Kornprobst (Centre d'Economie de la Sorbonne, Labex RéFi); Raphael Douady (Centre d'Economie de la Sorbonne, Labex RéFi)
    Abstract: The aim of this work is to build financial crisis indicators based on market data time series. After choosing an optimal size for a rolling window, the market data is seen every trading day as a random matrix from which a covariance and correlation matrix is obtained. Our indicators deal with the spectral properties of these covariance and correlation matrices. Our basic financial intuition is that correlation and volatility are like the heartbeat of the financial market: when correlations between asset prices increase or develop abnormal patterns, and when volatility starts to increase, a crisis event might be around the corner. Our indicators are mainly of two types. The first is based on the Hellinger distance, computed between the distribution of the eigenvalues of the empirical covariance matrix and the distribution of the eigenvalues of a reference covariance matrix. As reference distribution we use the theoretical Marchenko-Pastur distribution and, mainly, simulated ones obtained from a random matrix of the same size as the empirical rolling matrix, made up of Gaussian or Student-t coefficients with some simulated correlations. The idea behind this first type of indicator is that when the empirical distribution of the spectrum of the covariance matrix deviates from the reference in the sense of Hellinger, a crisis may be forthcoming. The second type of indicator is based on the study of the spectral radius and the trace of the covariance and correlation matrices as a means to directly study the volatility and correlations inside the market. The idea behind the second type of indicator is that large eigenvalues are a sign of dynamic instability.
    Keywords: Quantitative Finance; Econometrics; Mathematical Methods; Statistical Simulation Methods; Forecasting and Prediction Methods; Large Data Sets Modeling and Analysis; Computational Techniques; Simulation Modeling; Financial Crises; Random Matrix Theory
    JEL: B16 C01 C02 C15 C53 C58 C63 G01
    Date: 2015–06
    URL: http://d.repec.org/n?u=RePEc:mse:cesdoc:15049&r=cmp
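A stripped-down version of the first indicator family: compare the eigenvalue distribution of a rolling-window correlation matrix against a simulated i.i.d. Gaussian reference via the Hellinger distance between binned spectra. The window size, bin grid, number of simulations, and the choice of correlation (rather than covariance) matrices here are illustrative choices, not the authors' calibration:

```python
import numpy as np

def eigenvalue_hist(returns, bins):
    """Eigenvalue histogram (as probabilities) of the correlation matrix."""
    corr = np.corrcoef(returns, rowvar=False)   # columns are assets
    eig = np.linalg.eigvalsh(corr)
    h, _ = np.histogram(eig, bins=bins)
    return h / h.sum()

def hellinger(p, q):
    """Hellinger distance between two discrete probability vectors (in [0, 1])."""
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def crisis_indicator(window_returns, n_sim=200, seed=0):
    """Distance of the window's spectrum from an i.i.d. Gaussian reference."""
    t, n = window_returns.shape
    bins = np.linspace(0.0, n, 50)  # correlation eigenvalues lie in [0, n]
    rng = np.random.default_rng(seed)
    ref = np.zeros(len(bins) - 1)
    for _ in range(n_sim):         # average spectrum of simulated i.i.d. windows
        ref += eigenvalue_hist(rng.standard_normal((t, n)), bins)
    ref /= n_sim
    return hellinger(eigenvalue_hist(window_returns, bins), ref)
```

When a common factor dominates the window (correlations spiking, one large eigenvalue), the empirical spectrum moves away from the i.i.d. bulk and the indicator rises.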
  10. By: Jessica A. Bohlmann (Department of Economics, University of Pretoria); Heinrich R. Bohlmann (Department of Economics, University of Pretoria); Roula Inglesi-Lotz (Department of Economics, University of Pretoria)
    Abstract: The South African economy has suffered over the past decade due to a lack of adequate electricity supply. With two new coal-fired power stations, Kusile and Medupi scheduled to come online over a six year period (2014-2019), their additional generation capacity is expected to restore electricity reserve margins and facilitate increased growth and investment in the local economy. In this paper, we use a dynamic CGE model for South Africa to evaluate the economy-wide impact that the additional power generation from these two stations will have across a broad range of macroeconomic and industry variables. In terms of the new power generation capacity, our findings suggest that the macroeconomic impact of Kusile and Medupi will be a definite positive. Results show that, in the medium term, investment expenditur is particularly sensitive to the building of these new power plants. Additional costly blackouts are also likely to be avoided, further promoting economic growth and investment. Once Kusile and Medupi are fully operational and able to provide its projected 9600MW of base load electricity supply, old coal-fired power plants may be decommissioned and replaced by cleaner and more efficient generation sources as outlined in the Department of Energy's Integrated Resource Plan. Our analysis also suggests that this outcome provides a good balance between utilising modern clean coal technologies that are cost-effective while laying the foundation to improving our generation-mix and carbon emissions profile.
    Keywords: Computable general equilibrium, UPGEM, electricity supply, Kusile, Medupi
    JEL: C68 Q41 Q43
    Date: 2015–06
    URL: http://d.repec.org/n?u=RePEc:pre:wpaper:201540&r=cmp
  11. By: Ormerod, Paul
    Abstract: In situations of what we now describe as radical uncertainty, the core model of agent behaviour, of rational autonomous agents with stable preferences, is not useful. Instead, a different principle, in which the decisions of an agent are based directly on the decisions and strategies of other agents, becomes the relevant core model. Preferences are not stable, but evolve. In such circumstances this is not a special case, but the general one. The author provides empirical evidence to suggest that, as a description of behaviour in the modern world, economic rationality is applicable in a declining number of situations. He discusses models drawn from the modern literature on cultural evolution in which imitation of others is the basic strategy, and suggests a heuristic way of classifying situations in which the different models are relevant. The key point is that in situations where radical uncertainty is present, we require theoretical 'null' models of agent behaviour which are different from those of economic rationality. Under uncertainty, fundamentally different behavioural rules are 'rational'. The author gives an example of a very simple pure sentiment model of the business cycle, in which agents use very simple heuristic decision rules. It is nevertheless capable of approximating a number of deep features of output growth over the cycle.
    Keywords: uncertainty, imitation, evolution, agent-based model, sentiment, business cycle
    JEL: D81 E32
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:zbw:ifwedp:201540&r=cmp
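A minimal agent-based sketch in the spirit of the imitation-driven sentiment model the abstract describes: each period one agent either copies the sentiment of a randomly chosen other agent or flips its state idiosyncratically, and aggregate optimism drives output growth. All parameter values and the mapping from sentiment to growth are illustrative assumptions, not the paper's specification:

```python
import random

def simulate(n_agents=100, periods=200, p_copy=0.95, seed=0):
    """Simulate a binary-sentiment imitation model; returns the growth path."""
    rng = random.Random(seed)
    state = [rng.choice([0, 1]) for _ in range(n_agents)]  # 1 = optimistic
    growth_path = []
    for _ in range(periods):
        i = rng.randrange(n_agents)
        if rng.random() < p_copy:
            state[i] = state[rng.randrange(n_agents)]  # imitate another agent
        else:
            state[i] = 1 - state[i]                    # idiosyncratic flip
        optimism = sum(state) / n_agents
        growth_path.append(-1.0 + 2.0 * optimism)  # map optimism share to growth
    return growth_path
```

Even this crude rule generates persistent swings in aggregate sentiment, since imitation amplifies whichever state currently dominates while occasional flips reseed the dynamics.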
  12. By: Lorenzo Lampariello (Department of Methods and Models for Economics, Territory and Finance, University of Rome "La Sapienza"); Simone Sagratella (Department of Computer, Control and Management Engineering, University of Rome "La Sapienza")
    Abstract: Inspired by the optimal value approach, we propose a new reformulation of the optimistic Bilevel programming Problem (BP) as a suitable Generalized Nash Equilibrium Problem (GNEP). We provide a complete analysis of the relationship between the original hierarchical BP and the corresponding "more democratic" GNEP. Moreover, we investigate solvability and convexity issues of our reformulation. Finally, relying on the vast literature on solution methods for GNEPs, we devise a new effective algorithmic framework for the solution of significant classes of BPs.
    Keywords: Bilevel programming ; Generalized Nash Equilibrium Problems (GNEP); parametric optimization ; numerical approaches
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:aeg:report:2015-07&r=cmp
  13. By: Zhengmao Ye (Southern University); Habib Mohamadian (Southern University)
    Abstract: Remote sensing involves the observation, acquisition and processing of information detected and measured from airborne or spaceborne platforms. The structural information of objects such as landmarks, highways, bridges, mountains, oceans, rivers, lakes, living creatures and moving targets can be represented by edges and contours based on high-spatial-resolution remote sensing images. The geographic information system (GIS) has been widely adopted to exhibit features, patterns, and trends on the surface of the earth. The data sources from aerial photos and satellite images are typically stored as raster images that contain composite tristimulus color values of red, green and blue (RGB). This allows people to visualize, analyze, process, interpret and archive data to synthesize the boundaries, interactions and relationships of the features, patterns, and trends. Remote sensing can be either passive or active. A passive sensing system identifies natural radiation via visible light imaging or infrared photography. An active sensing system emits its own energy to scan objects so that the reflected or backscattered radiation can be detected and measured, as with Radar or Lidar (Radio or Light Detection and Ranging). To achieve better data abstraction and management with respect to spatial 2D data analysis, integration of geovisualization and geocomputation can be introduced for dynamic data exploration so that latent interactions can be discovered. Edges are critical features of structural information. Numerous approaches have been designed for detecting edges of high-spatial-resolution images using gradient-based algorithms. However, most methods are sensitive to noise.
The 2D Gabor filter is then presented to extract and differentiate the crucial structural information via parametric optimization, where the Gabor wavelet transform is employed for edge detection and contour tracing via convolution of each of the three primary color components of digital images with the 2D Gabor wavelet in the frequency domain. It has been shown that multiscale Gabor wavelets are suitable for segmenting remote sensing images to explore intrinsic geographical information. Numerical simulations on diversified landscape aerial images have been carried out to show the impact of the proposed approach on spatial information analysis.
    Keywords: Remote Sensing, Wavelet Decomposition, Multiscale Gabor Transform, Edge Detection, Soft Thresholding
    URL: http://d.repec.org/n?u=RePEc:sek:iacpro:2504019&r=cmp
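The per-channel frequency-domain convolution the abstract describes can be sketched as follows: build a real 2D Gabor kernel, multiply FFTs of kernel and channel, and aggregate magnitude responses over orientations and over the three RGB channels. The kernel size, wavelength, sigma, and orientation set are illustrative defaults, not the paper's optimized parameters:

```python
import numpy as np

def gabor_kernel(size, wavelength=8.0, theta=0.0, sigma=4.0):
    """Real 2D Gabor kernel: a Gaussian envelope times an oriented cosine."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)  # coordinate along orientation
    g = np.exp(-(x**2 + y**2) / (2.0 * sigma**2)) \
        * np.cos(2.0 * np.pi * xr / wavelength)
    return g - g.mean()  # zero mean: flat image regions give no response

def gabor_response(channel, kernel):
    """Convolve one image channel with the kernel by multiplying FFTs."""
    h, w = channel.shape
    k = np.zeros((h, w))
    kh, kw = kernel.shape
    k[:kh, :kw] = kernel
    k = np.roll(k, (-(kh // 2), -(kw // 2)), axis=(0, 1))  # center at origin
    return np.abs(np.fft.ifft2(np.fft.fft2(channel) * np.fft.fft2(k)).real)

def rgb_gabor_edges(image, thetas=(0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Maximum response over orientations, summed over the three RGB channels."""
    total = np.zeros(image.shape[:2])
    for c in range(3):
        responses = [gabor_response(image[:, :, c], gabor_kernel(15, theta=t))
                     for t in thetas]
        total += np.maximum.reduce(responses)
    return total
```

Varying `wavelength` and `sigma` across calls gives the multiscale decomposition; note that FFT convolution is circular, so responses within a kernel half-width of the image border wrap around.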
  14. By: Gordon, Hal (Resources for the Future); Burtraw, Dallas (Resources for the Future); Williams, Roberton (Resources for the Future)
    Abstract: Carbon policies introduce potentially uneven cost burdens. Anticipating these outcomes is important for policymakers seeking to achieve an equitable outcome and can be politically important as well. This paper describes the details of a microsimulation model that utilizes the price and quantity changes predicted by economic models of carbon policies to estimate economic incidence by income quintile or state, and potentially across other dimensions. Taking as inputs the aggregate outputs of partial or general equilibrium economic models, the microsimulation model uses data from the Consumer Expenditure Survey (CE), the State Energy Data System (SEDS), the National Income and Product Accounts (NIPA), estimates from the Congressional Budget Office (CBO), and the Haiku electricity model. These data sources are used to estimate the share of consumer and producer surplus changes that accrue to households in each income quintile and state. The model is unique among existing incidence models in its ability to drill down to the level of state incidence and to plug into a wide range of economic models.
    Keywords: carbon price, carbon tax, emissions tax, cap and trade, distributional effects, equity, efficiency, incidence
    JEL: H22 H23 Q52 Q54
    Date: 2015–02–26
    URL: http://d.repec.org/n?u=RePEc:rff:dpaper:dp-14-40&r=cmp
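The core incidence calculation the paper describes takes an aggregate price change from an upstream economic model and distributes the burden across groups using survey-style expenditure weights. A toy sketch; every number below is made up for illustration and is not drawn from the CE, SEDS, or NIPA data the model actually uses:

```python
# Hypothetical annual spending on a taxed energy good, and income, by quintile.
QUINTILES = {
    "Q1": {"energy_spend": 1800.0, "income": 15000.0},
    "Q2": {"energy_spend": 2200.0, "income": 32000.0},
    "Q3": {"energy_spend": 2600.0, "income": 55000.0},
    "Q4": {"energy_spend": 3000.0, "income": 85000.0},
    "Q5": {"energy_spend": 3800.0, "income": 160000.0},
}

def incidence_by_quintile(price_increase, rebate_per_household=0.0):
    """Net burden as a share of income, optionally with a lump-sum rebate."""
    out = {}
    for q, d in QUINTILES.items():
        burden = d["energy_spend"] * price_increase - rebate_per_household
        out[q] = burden / d["income"]
    return out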

This nep-cmp issue is ©2015 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.