nep-cmp New Economics Papers
on Computational Economics
Issue of 2017‒06‒04
eleven papers chosen by

  1. Agent-based modelling. History, essence, future By Hanappi, Hardy
  2. Exploiting damped techniques for nonlinear conjugate gradient methods By Mehiddin Al-Baali; Andrea Caliciotti; Giovanni Fasano; Massimo Roma
  3. Financial Time Series Forecasting: Semantic Analysis Of Economic News By Kateryna Kononova; Anton Dek
  4. Comparison of methods of data mining techniques for the predictive accuracy. By Pyzhov, Vladislav; Pyzhov, Stanislav
  5. Using machine learning for financial fraud detection in the accounts of companies investigated for money laundering By José A. Álvarez-Jareño; Elena Badal-Valero; José Manuel Pavía
  6. The impact of environmental regulations on the farmland market and farm structures: An agent-based model applied to the Brittany region of France By Elodie Letort; Pierre Dupraz; Laurent Piet
  7. Modelling corporate tax reform in the EU: New calibration and simulations with the CORTAX model By Joint Research Center of the European Commission - IPTS
  8. A New Heuristic in Mutual Sequential Mate Search By Saglam, Ismail
  9. Stabilizing an Unstable Complex Economy: On the limitations of simple rules By Isabelle Salle; Pascal Seppecher
  10. Hill-Climbing Algorithm for Robust Emergency System Design with Return Preventing Constraints By Marek Kvet; Jaroslav Janáček
  11. The proactive and reactive resource-constrained project scheduling problem: the crucial role of buffer-based reactions By Morteza Davari; Erik Demeulemeester

  1. By: Hanappi, Hardy
    Abstract: The currently fashionable modelling tool of agent-based simulation is characterized. The first part concerns the past. It presents a selection of the major intellectual roots from which this new tool emerged. It is important for social scientists, and for economists in particular, to see that two relevant impacts came from neighbouring disciplines: biology and network theory. The second part concerns the present of ABM. It aims at highlighting the essential features that are characteristic of an agent-based model. Since there are currently several different opinions on this topic, the one presented here also includes some more epistemologically oriented ideas to support its plausibility. In particular, the notion of emergence is scrutinized and extended. This part ends with a short recipe stating how to build an agent-based model. In the last part, some ideas on the future of agent-based modelling are presented. This part follows the sequence of syntax, semantics, and pragmatics. The syntactic challenges, like operators for pattern recognition, will be met by a continuing variety of software packages and programming languages tailored to support ABM. The semantic aspect of future agent-based modelling hinges on the close relationship between the tool ABM and its object of investigation, e.g. evolutionary political economy. The need to model institutional change or communication processes will imply adaptive evolution of ABM. The pragmatics of future agent-based modelling are finally characterized as the most demanding – but also the most influential – element that the new tool will bring about.
    Keywords: Agent-based modelling, economic simulation models
    JEL: B20 B41 C63
    Date: 2017–05–23
  2. By: Mehiddin Al-Baali (Department of Mathematics and Statistics Sultan Qaboos University, P.O. Box 36, Muscat 123, Oman); Andrea Caliciotti (Department of Computer, Control and Management Engineering Antonio Ruberti (DIAG), University of Rome La Sapienza, Rome, Italy); Giovanni Fasano (Department of Management University Ca' Foscari of Venice; S. Giobbe, Cannaregio 873 - 30121 Venice, Italy); Massimo Roma (Department of Computer, Control and Management Engineering Antonio Ruberti (DIAG), University of Rome La Sapienza, Rome, Italy)
    Abstract: In this paper we propose the use of damped techniques within Nonlinear Conjugate Gradient (NCG) methods. Damped techniques were introduced by Powell and recently reproposed by Al-Baali; until now, they have only been applied in the framework of quasi-Newton methods. We extend their use to NCG methods in large scale unconstrained optimization, aiming at possibly improving the efficiency and the robustness of the latter methods, especially when solving difficult problems. We consider both unpreconditioned and preconditioned NCG (PNCG). In the latter case, we embed damped techniques within a class of preconditioners based on quasi-Newton updates. Our purpose is to possibly provide efficient preconditioners which approximate, in some sense, the inverse of the Hessian matrix, while still preserving the information provided by the secant equation or some of its modifications. The results of extensive numerical experiments highlight that the proposed approach is quite promising.
    Keywords: Large scale unconstrained optimization ; Nonlinear Conjugate Gradient methods ; quasi-Newton updates ; damped techniques
    Date: 2017
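For readers unfamiliar with damped techniques: Powell's original damping, which the abstract cites as the starting point, modifies the gradient-difference vector used in quasi-Newton updates so that the update remains well defined. A standard statement of the rule (the paper's NCG embedding may differ in detail) is:

```latex
% Powell's damped modification of y_k = g_{k+1} - g_k (standard form,
% with sigma = 0.2 in Powell's proposal):
\hat{y}_k = \phi_k\, y_k + (1 - \phi_k)\, B_k s_k, \qquad
\phi_k =
\begin{cases}
1, & s_k^{\top} y_k \ge \sigma\, s_k^{\top} B_k s_k,\\[4pt]
\dfrac{(1 - \sigma)\, s_k^{\top} B_k s_k}{s_k^{\top} B_k s_k - s_k^{\top} y_k},
  & \text{otherwise.}
\end{cases}
```

Since the damped vector satisfies \(\hat{y}_k^{\top} s_k \ge \sigma\, s_k^{\top} B_k s_k > 0\), the curvature condition holds and a BFGS-type update with \(\hat{y}_k\) preserves positive definiteness.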
  3. By: Kateryna Kononova; Anton Dek
    Abstract: The paper proposes a method of financial time series forecasting that takes into account the semantics of news. For the semantic analysis of financial news, a sample of words with negative and positive economic connotations was formed based on the Loughran-McDonald Master Dictionary. The sample included words with a high frequency of occurrence in financial market news. For words sharing a single root, only the common stem was kept, which allows several word forms to be covered by one query. Neural networks were chosen for modelling and forecasting. To automate the extraction of information from economic news, a script based on the generated sample of positive and negative words was developed in the MATLAB Simulink programming environment. Experimental studies with different neural network architectures showed that the constructed models fit well and confirmed the feasibility of using information from news feeds to predict stock prices.
    Date: 2017–05
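The stem-matching step described above can be sketched in a few lines of Python. The stem lists below are illustrative stand-ins, not actual Loughran-McDonald entries, and the scoring rule is our assumption:

```python
# Hypothetical mini-lexicons of positive/negative stems. The paper builds
# these from the Loughran-McDonald Master Dictionary; the entries here are
# illustrative only.
POSITIVE_STEMS = ["gain", "profit", "growth", "improv"]
NEGATIVE_STEMS = ["loss", "declin", "crisis", "default"]

def sentiment_score(text: str) -> float:
    """Return (pos - neg) / matched, where a word matches a stem if it
    starts with it; 0.0 when no word matches any stem."""
    words = text.lower().split()
    pos = sum(1 for w in words if any(w.startswith(s) for s in POSITIVE_STEMS))
    neg = sum(1 for w in words if any(w.startswith(s) for s in NEGATIVE_STEMS))
    total = pos + neg
    return (pos - neg) / total if total else 0.0
```

Matching on stems rather than full words is what lets one dictionary entry cover several inflected forms ("loss", "losses").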
  4. By: Pyzhov, Vladislav; Pyzhov, Stanislav
    Abstract: This paper builds on the work of Yeh and Lien (2009). In that paper, the authors used a payment data set from a major bank in Taiwan. To build a model, the whole sample was divided into two subsets - a training set and a testing set - so that each model could be trained on the first and then evaluated on the second. Our motivation was to see whether the same results could be obtained if we repeatedly applied the models to different data sets. To do so, Monte Carlo simulation was implemented to generate these sets.
    Keywords: Monte-Carlo, Data Mining, Neural Networks, k-nearest neighbors, Logistic regression, Random Forest.
    JEL: C53 C81 C87
    Date: 2017–05–23
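The repeated random train/test evaluation can be sketched as follows. This is a generic Python illustration; the function names and the toy 1-NN classifier are ours, not the authors':

```python
import random

def monte_carlo_accuracy(data, classify_fn, n_trials=100, test_frac=0.3, seed=0):
    """Repeatedly draw random train/test splits of (feature, label) pairs
    and return the test accuracy averaged over all trials."""
    rng = random.Random(seed)
    accs = []
    for _ in range(n_trials):
        shuffled = data[:]
        rng.shuffle(shuffled)
        cut = int(len(shuffled) * (1 - test_frac))
        train, test = shuffled[:cut], shuffled[cut:]
        correct = sum(1 for x, y in test if classify_fn(train, x) == y)
        accs.append(correct / len(test))
    return sum(accs) / len(accs)

def nearest_neighbour(train, x):
    """Toy 1-NN classifier on a single numeric feature."""
    return min(train, key=lambda pair: abs(pair[0] - x))[1]
```

Averaging over many random splits gives a less split-dependent accuracy estimate than a single train/test division.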
  5. By: José A. Álvarez-Jareño (Department of Economics, Universitat Jaume I, Castellón, Spain); Elena Badal-Valero (Department of Applied Economics, Universitat de València, Valencia, Spain); José Manuel Pavía (Department of Applied Economics, Universitat de València, Valencia, Spain)
    Abstract: Benford’s Law is a well-known tool used in accountancy for the analysis and detection of anomalies related to money laundering and fraud. On that basis, and using real data from transactions undertaken by more than 600 companies from a particular sector, behavioral patterns can be analyzed using the latest machine learning procedures. Because the dataset is clearly unbalanced, we apply a cost matrix and SMOTE to several pattern-detection methodologies: logistic regression, decision trees, neural networks and random forests. The objective of the cost matrix and SMOTE is to improve the forecasting capabilities of the models so as to identify more easily those companies committing some kind of fraud. The results show that the SMOTE algorithm achieves better true-positive rates, outperforming the cost-matrix implementation. However, since the overall accuracy of the models is very similar, the number of false positives increases under the SMOTE methodology. The aim is to detect the largest number of fraudulent companies while reducing, as far as possible, the number of false positives among companies operating correctly. The results obtained are quite revealing: random forest performs best with the SMOTE transformation, obtaining 96.15% true negatives and 94.98% true positives. Without any doubt, the classification ability of this methodology is very high. This study was developed from the investigation of a real Spanish money laundering case on which this expert team has been collaborating. It is a first step towards using machine learning to detect financial crime in Spanish judicial cases.
    Keywords: Benford’s Law, unbalanced dataset, random forest, fraud, anti-money laundering.
    JEL: C14 C44 C53 M42
    Date: 2017
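SMOTE (Synthetic Minority Over-sampling Technique), used above to rebalance the dataset, can be sketched in a few lines of Python. This is a generic illustration of the interpolation idea, not the implementation used in the study:

```python
import random

def smote(minority, n_new, k=3, seed=0):
    """Generate n_new synthetic minority samples (tuples of floats) by
    interpolating each chosen sample with one of its k nearest
    minority-class neighbours, as in the SMOTE idea."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        # k nearest neighbours of x within the minority class, excluding x
        neighbours = sorted(
            (p for p in minority if p is not x),
            key=lambda p: sum((a - b) ** 2 for a, b in zip(p, x)),
        )[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(a + gap * (b - a) for a, b in zip(x, nb)))
    return synthetic
```

Because each synthetic point lies on the segment between two real minority samples, SMOTE enlarges the minority class without simply duplicating observations, which is why it tends to raise true-positive rates.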
  6. By: Elodie Letort; Pierre Dupraz; Laurent Piet
    Abstract: Nitrate pollution remains a major problem in some parts of France, especially in the Brittany region, which is characterized by intensive livestock production systems. Although farmers must not exceed a regulatory limit on the nitrogen contained in manure per hectare, many farmers in this region exceed this limit. They must therefore treat the excess manure that they produce or export it to be spread on neighbouring farms and/or areas, inducing fierce competition in the land market. Another adaptation strategy consists of modifying production practices or the production system as a whole, i.e., changing the structure of the farm. In this paper, a spatial agent-based model (ABM) is developed to assess policy options in the regulation of manure management practices. The objective is to highlight the potential effects of these policies on the farmland market and the structural changes that they induce. Our results show that the different policies, while yielding similar environmental benefits, induce different changes in the land market and in agricultural structures.
    JEL: Q15 C63 D22
    Date: 2017
  7. By: Joint Research Center of the European Commission - IPTS
    Abstract: This report investigates the economic impact of the European Commission proposal for a common corporate tax base (CCTB) and a common consolidated corporate tax base with formula apportionment (CCCTB) within the EU. Furthermore, on top of the common base, it considers proposals to reduce the debt bias in corporate taxation. To do so, we employ an applied general equilibrium model (CORTAX) covering all EU Member States, featuring different firm types and modelling many key features of corporate tax regimes, including multinational profit shifting, investment decisions, loss compensation and the debt-equity choice of firms.
    Keywords: corporate taxation, CGEM, debt-bias, European Union
    JEL: H25 H26 H68 H87 C68
    Date: 2016–10
  8. By: Saglam, Ismail
    Abstract: In this paper, we propose a new heuristic to be used as a mate search strategy in Todd and Miller's (1999) human mate choice model. This heuristic, which we call Take the Weighted Average with the Next Desiring Date, is a plausible search rule in terms of informational assumptions, while in terms of mating likelihood it is almost as good as the most successful, yet also unrealistic, heuristic of Todd and Miller (1999), namely the Mate Value-5 rule, which assumes that agents in the mating population completely know their own mate values before interacting with any date. The success of our heuristic stems from its strong tendency to lead an average agent in the mating population to underestimate his/her own mate value during the adolescence (learning) phase of the mating process. However, this humble heuristic does not perform well in terms of marital stability. We find that the mean within-pair difference is always higher under our heuristic (possibly due to high estimation errors made in the learning phase) than under any heuristic of Todd and Miller (1999). It seems that being ready to pair up with agents whose mate values are well below one's own pays off in the mating phase but also incurs an increased risk of marital dissolution.
    Keywords: Mate Choice; Mate Search; Simple Heuristics; Agent-Based Simulation
    JEL: C63 J12
    Date: 2017–05–28
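One plausible reading of the Take the Weighted Average with the Next Desiring Date rule, sketched in Python. The weight parameter and the functional form are our assumptions; the abstract does not specify them:

```python
def update_self_estimate(current_estimate, date_mate_value, weight=0.5):
    """Hypothetical sketch of the heuristic: after meeting a date who
    expresses interest, revise one's own mate-value estimate toward a
    weighted average of the current estimate and that date's mate value.
    The weight (here 0.5 by default) is a free parameter."""
    return weight * current_estimate + (1 - weight) * date_mate_value
```

Repeating this update over the adolescence (learning) phase pulls an agent's self-estimate toward the mate values of those who desire them, which is consistent with the underestimation effect the abstract describes.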
  9. By: Isabelle Salle (Utrecht University, School of Economics); Pascal Seppecher (Centre d'Economie de l'Université de Paris Nord (CEPN))
    Abstract: This paper analyzes a range of alternative specifications of the interest rate policy rule within a macroeconomic, stock-flow consistent, agent-based model. In this model, firms’ leverage strategies evolve under the selection pressure of market competition. The resulting process of collective adaptation generates endogenous booms and busts along credit cycles. As feedback loops on aggregate demand affect the goods and labor markets, the real and financial sides of the economy are closely interconnected. The baseline scenario is able to qualitatively reproduce a wide range of stylized facts and to match the quantitative orders of magnitude of the main economic indicators. We find that, despite the implementation of credit- and balance-sheet-related prudential policies, the emerging dynamics feature strong instability. Targeting movements in the net worth of firms helps dampen the credit cycles and simultaneously reduces financial and macroeconomic volatility, but it does not eliminate the occurrence of financial crises, which carry high costs in terms of unemployment.
    Keywords: Agent-based modeling, Credit cycles, Monetary and Macroprudential policies, Leaning against the wind
    JEL: C63 E03 E52
    Date: 2017–04
  10. By: Marek Kvet (University of Žilina, Faculty of Management Science and Informatics, Univerzitná 8215/1, 010 26 Žilina, Slovakia); Jaroslav Janáček (University of Žilina, Faculty of Management Science and Informatics, Univerzitná 8215/1, 010 26 Žilina, Slovakia)
    Abstract: Research background: This paper deals with the smart design of a robust emergency service system. Robustness here means the resistance of the system to various detrimental events, which can randomly occur in the associated transportation network. The consequences of the detrimental events are formalized by establishing a set of detrimental scenarios. A robust emergency service system is usually designed so that the deployment of a given number of service centers minimizes the maximal value of the objective functions corresponding to the specified scenarios. The original approach to the system design using the means of mathematical programming faces computational difficulties caused by the link-up constraints. Purpose of the article: The main purpose of our research is to overcome the computational burden of the branch-and-bound method caused by the min-max constraints in the model. We suggest an iterative hill-climbing algorithm, which outperforms the original approach in both computational time and computer memory demand. Methodology/methods: The methodology consists in approximating the maximum of the original objective functions by a suitable convex combination of them. The previously developed hill-climbing algorithm is extended by return preventing constraints, and their influence on computational effectiveness is studied within this paper. In particular, we focus on finding the most suitable form of the return preventing constraints and the strategy of their implementation. Findings & Value added: We present a comparison of the suggested algorithm to the original approach and to the Lagrangean relaxation of the original approach. We found that the suggested algorithm outperforms the original exact approach in terms of computational time, with at most a two percent deviation from the optimal solution. In addition, the algorithm outperforms the Lagrangean approach in both computational time and deviation.
    Keywords: Emergency system design; robustness; iterative algorithm; convex combination; return preventing constraints
    JEL: C61 C63
    Date: 2017–05
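The general idea of hill climbing with a return-preventing rule can be sketched as follows. This is a generic Python illustration of the principle (excluding already-visited solutions so the search cannot cycle back), not the authors' exact constraint formulation:

```python
def hill_climb(start, neighbours_fn, objective, max_iters=1000):
    """Iterative hill climbing that maximizes `objective`. Previously
    visited solutions are excluded from the candidate moves, acting as a
    return-preventing rule. Solutions must be hashable."""
    current = start
    visited = {start}
    for _ in range(max_iters):
        candidates = [n for n in neighbours_fn(current) if n not in visited]
        if not candidates:
            break  # all neighbours already visited
        best = max(candidates, key=objective)
        if objective(best) <= objective(current):
            break  # local optimum reached
        current = best
        visited.add(best)
    return current
```

Recording visited solutions trades memory for the guarantee that the search never revisits a state, which is what makes each iteration strictly progress.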
  11. By: Morteza Davari; Erik Demeulemeester
    Abstract: The proactive and reactive resource-constrained project scheduling problem (PR-RCPSP), which was introduced recently (Davari and Demeulemeester, 2016a), deals with activity duration uncertainty in a unique way. The optimal solution to an instance of the PR-RCPSP is a proactive and reactive policy (PR-policy) that combines a baseline schedule with a set of required transitions (reactions). In this research, we introduce two interesting classes of reactions, namely the class of selection-based reactions and the class of buffer-based reactions, and discuss their theoretical relevance. We report computational results on the contributions of the selection-based and buffer-based reactions to the optimal solution. The results suggest that although both classes contribute largely to the construction of the optimal PR-policy, the contribution of the buffer-based reactions is of much greater importance.
    Date: 2017–05

General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.