nep-cmp New Economics Papers
on Computational Economics
Issue of 2018‒01‒29
fifteen papers chosen by
Stan Miles
Thompson Rivers University

  1. Tracking economic growth by evolving expectations via genetic programming: A two-step approach By Oscar Claveria; Enric Monte; Salvador Torra
  2. Ensemble Learning or Deep Learning? Application to Default Risk Analysis By Shigeyuki Hamori; Minami Kawai; Takahiro Kume; Yuji Murakami; Chikara Watanabe
  3. Directional Predictability of Daily Stock Returns By Becker, Janis; Leschinski, Christian
  4. Model linkage between CAPRI and MAGNET: An exploratory assessment By PHILIPPIDIS George; Helming John; Tabeau Andrzej
  5. Social Network based Short-Term Stock Trading System By Paolo Cremonesi; Chiara Francalanci; Alessandro Poli; Roberto Pagano; Luca Mazzoni; Alberto Maggioni; Mehdi Elahi
  6. On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions By de Klerk, Etienne; Glineur, Francois; Taylor, Adrien
  7. The European Deposit Insurance Scheme: Assessing risk absorption via SYMBOL By Lucia, Alessi; Giuseppina, Cannas; Sara, Maccaferri; Marco, Petracco Giudici
  8. Nested Branch-and-Price-and-Cut for Vehicle Routing Problems with Multiple Resource Interdependencies By Christian Tilk; Michael Drexl; Stefan Irnich
  9. A Branch-and-Price Framework for Decomposing Graphs into Relaxed Cliques By Timo Gschwind; Stefan Irnich; Fabio Furini; Roberto Wolfler Calvo
  10. A top-down behaviour (TDB) microsimulation toolkit for distributive analysis By Luca Tiberti; John Cockburn; Martín Cicowiez
  11. Assessing potential coupling factors of European decoupled payments with the Modular Agricultural GeNeral Equilibrium Tool (MAGNET) By BOULANGER Pierre; PHILIPPIDIS George; URBAN Kirsten
  12. Product innovation with lumpy investment By Chahim, M.; Grass, D.; Hartl, R.F.; Kort, Peter
  13. Availability Model of a PHM-Equipped Component By Michele Compare; Luca Bellani; Enrico Zio
  14. Technology Classification for the Purposes of Futures Studies By Ilya Kuzminov; Dirk Meissner; Alina Lavrynenko; Elena Tochilina
  15. Can the Paris Deal Boost SDGs Achievement? An Assessment of Climate Mitigation Co-benefits or Side-effects on Poverty and Inequality By Lorenza Campagnolo; Marinella Davide

  1. By: Oscar Claveria (AQR-IREA, University of Barcelona (UB). Tel.: +34-934021825; Fax.: +34-934021821. Department of Econometrics, Statistics and Applied Economics, University of Barcelona, Diagonal 690, 08034 Barcelona, Spain); Enric Monte (Department of Signal Theory and Communications, Polytechnic University of Catalunya (UPC)); Salvador Torra (Riskcenter-IREA, Department of Econometrics and Statistics, University of Barcelona (UB))
    Abstract: The main objective of this study is to present a two-step approach to generate estimates of economic growth based on agents’ expectations from tendency surveys. First, we design a genetic programming experiment to derive mathematical functional forms that approximate the target variable by combining survey data on expectations about different economic variables. We use evolutionary algorithms to estimate a symbolic regression that links survey-based expectations to a quantitative variable used as a yardstick (economic growth). In a second step, this set of empirically generated proxies of economic growth is linearly combined to track the evolution of GDP. To evaluate the forecasting performance of the generated estimates of GDP, we use them to assess the impact of the 2008 financial crisis on the accuracy of agents’ expectations about the evolution of economic activity in 28 OECD countries. While in most economies we find an improvement in the capacity of agents to anticipate the evolution of GDP after the crisis, predictive accuracy worsens relative to the period prior to the crisis. The most accurate GDP forecasts are obtained for Sweden, Austria and Finland.
    Keywords: Evolutionary algorithms; Symbolic regression; Genetic programming; Business and consumer surveys; Expectations; Forecasting
    JEL: C51 C55 C63 C83 C93
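The second step of the approach, linearly combining a set of growth proxies to track GDP, can be sketched in a few lines. This is an illustrative sketch, not the authors' code: the proxy series below are synthetic stand-ins for the GP-evolved functional forms, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: GDP growth and three survey-based proxy series.
T = 120
gdp = rng.normal(2.0, 1.0, T)                  # target variable (the yardstick)
proxies = np.column_stack([                    # stand-ins for the functional forms
    gdp + rng.normal(0, 0.5, T),               # that genetic programming would evolve
    0.5 * gdp + rng.normal(0, 0.5, T),         # from survey expectations
    gdp ** 2 / 10 + rng.normal(0, 0.5, T),
])

# Step 2: combine the proxies linearly to track GDP (ordinary least squares).
X = np.column_stack([np.ones(T), proxies])     # add an intercept column
coef, *_ = np.linalg.lstsq(X, gdp, rcond=None)
fitted = X @ coef

rmse = np.sqrt(np.mean((fitted - gdp) ** 2))
print(f"in-sample RMSE of the combined tracker: {rmse:.3f}")
```

In practice the first step would be carried out with a symbolic-regression library before this combination stage.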
  2. By: Shigeyuki Hamori (Graduate School of Economics, Kobe University); Minami Kawai (Department of Economics, Kobe University); Takahiro Kume (Department of Economics, Kobe University); Yuji Murakami (Department of Economics, Kobe University); Chikara Watanabe (Department of Economics, Kobe University)
    Abstract: Proper credit risk management is essential for lending institutions as substantial losses can be incurred when borrowers default. Consequently, statistical methods that can measure and analyze credit risk objectively are becoming increasingly important. This study analyzed default payment data from Taiwan and compared the prediction accuracy and classification ability of three ensemble learning methods—specifically, Bagging, Random Forest, and Boosting—with those of various neural network methods, each of which has a different activation function. The results indicate that Boosting has a high prediction accuracy, whereas that of Bagging and Random Forest is relatively low. They also indicate that the prediction accuracy and classification performance of Boosting is better than that of deep neural networks, Bagging, and Random Forest.
    Keywords: credit risk; ensemble learning; deep learning; bagging; random forest; boosting; deep neural network.
    Date: 2018–01
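A minimal sketch of this kind of ensemble comparison, assuming scikit-learn and synthetic imbalanced data in place of the Taiwanese default-payment records the study actually uses:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              RandomForestClassifier)
from sklearn.model_selection import train_test_split

# Synthetic stand-in for default data: imbalanced binary labels.
X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.8, 0.2], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "Bagging": BaggingClassifier(random_state=0),
    "Random Forest": RandomForestClassifier(random_state=0),
    "Boosting": AdaBoostClassifier(random_state=0),
}
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te)
          for name, m in models.items()}
for name, acc in scores.items():
    print(f"{name}: accuracy {acc:.3f}")
```

The deep-learning side of the paper's comparison could be added analogously with a neural-network classifier; which method wins on real default data is an empirical question, as the abstract stresses.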
  3. By: Becker, Janis; Leschinski, Christian
    Abstract: The level of daily stock returns is generally regarded as unpredictable. Instead of the level, we focus on the signs of these returns and generate forecasts using various statistical classification techniques, such as logistic regression, generalized additive models, or neural networks. The analysis is carried out using a data set consisting of all stocks that were part of the Dow Jones Industrial Average in 1996. After selecting the relevant explanatory variables in the subsample from 1996 to 2003, forecast evaluations are conducted in an out-of-sample environment for the period from 2004 to 2017. Since the model selection and the forecasting period are strictly separated, the procedure mimics the situation a forecaster would face in real time. It is found that the sign of daily returns is predictable to an extent that is statistically significant. Moreover, trading strategies based on these forecasts generate positive alpha, even after accounting for transaction costs. This underlines the economic significance of the predictability and implies that there are periods during which markets are not fully efficient.
    Keywords: Asset Pricing; Market Efficiency; Directional Predictability; Statistical Classification
    JEL: G12 G14 G17 C38
    Date: 2018–01
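The paper's design, selecting a classifier on an early subsample and evaluating sign forecasts strictly out of sample, can be illustrated as follows. This sketch uses simulated AR(1) returns and three lagged-return predictors, all of which are assumptions for illustration, not the paper's variables.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
T = 3000
eps = rng.normal(0, 1, T)
ret = np.zeros(T)
for t in range(1, T):                 # AR(1) returns: a weak, persistent signal
    ret[t] = 0.3 * ret[t - 1] + eps[t]

X = np.column_stack([ret[2:-1], ret[1:-2], ret[:-3]])  # lags 1, 2, 3
y = (ret[3:] > 0).astype(int)                          # the sign to be predicted

split = len(y) // 2                   # fit strictly before the evaluation window
clf = LogisticRegression().fit(X[:split], y[:split])
hit_rate = clf.score(X[split:], y[split:])
print(f"out-of-sample hit rate: {hit_rate:.3f}")
```

With genuinely unpredictable returns the hit rate would hover around 0.5; the point of the strict split is that any excess over 0.5 mimics what a real-time forecaster could have achieved.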
  4. By: PHILIPPIDIS George (European Commission – JRC); Helming John; Tabeau Andrzej
    Abstract: It is well known that partial equilibrium (PE) and computable general equilibrium (CGE) models have structural differences, both in terms of the data and the behavioural elements (i.e., explicit or implicit elasticities), which can generate divergent results; precedents in the literature even show that CGE and PE models can generate contradictory findings for the same scenario. Although this is well recognized within the modelling community, in the policy arena it can often be hard to reconcile the findings of both models when presenting a consistent storyline for a given policy reform. Previous work commissioned by the Joint Research Centre (JRC), Seville, on behalf of DG Agri forged a ‘soft’ model linkage (Helming et al., 2010; Nowicki et al., 2006, 2009), such that both models generate a mutually consistent storyline. Typically, a soft linkage is driven by a relatively ad hoc assessment of the overall results (i.e., are the models broadly telling the same story?), whilst playing to the strengths of each model so that one serves as a source of input to the other. For example, the CGE model, with an explicit or endogenous treatment of factor markets, world trade and macro aggregates, could conceivably be used as an input to a PE model. Similarly, the sectoral detail and econometric foundation in supply response which serves some PE models well could be employed to assess and improve the veracity of the CGE model results. Under the auspices of project 154208-2014-A08-NL, entitled "Scenar2030, parameters and model chain preparation", the Economics of Agriculture unit of the JRC requested a further look at this issue to better understand the merits of different model linkage options. More specifically, as part of the technical specification for task 5 ('preparation of model chain'), two forms of model linkage, broadly labelled 'soft' and 'hard' linkage, are considered.
The advantage of the soft approach is that it is relatively straightforward to implement in terms of the necessary modelling modifications. On the other hand, the ‘soft’ approach adopted in the Scenar2020 project through linkage of variables was, as noted above, implemented on a more ad hoc basis rather than following a systematic framework. Thus, subject to the particulars of the model scenario (i.e., the scenario design, the type of shocks, etc.), the use of variable linkage could conceivably vary considerably. This, in turn, has led to the alternative choice of a ‘hard’ linkage, which seeks to forge a union between the structural or behavioural elements of the models (see, for example, Britz and Hertel, 2011; Pelikan et al., 2015). Whilst this approach is intuitively appealing because it follows a very specific methodology, it requires considerably more modelling expertise to implement, whilst the robustness of the two models being linked is, at the current time, far from certain. A fuller exposition of the hard linkage approach is given in section four below, with some reflections on its potential suitability for advanced policy analysis using the MAGNET model. For the purposes of the current (tentative) experiments, section two describes a ‘test bed’ study, which considers a more systematic class of ‘soft’ model linkage between two well-known and respected models from the iMAP platform, namely the Common Agricultural Policy Regionalised Impact (CAPRI) PE model and the Modular Applied GeNeral Equilibrium Tool (MAGNET) CGE model. In CAPRI, a standard CAP baseline is run, whilst in the MAGNET model two specific experiments are implemented: the first runs a standard CAP baseline in MAGNET, whilst the second implements the same baseline shocks with the inclusion of output predictions taken from CAPRI.
The aim of the exercise is simply to ascertain the extent to which the MAGNET model results (section three) diverge between the two experiments and assess the degree of compromise required in MAGNET to accommodate said changes. Clearly, if considerable divergences are found, and one considers that the CAPRI sectoral output results are superior, then this could potentially warrant the need for a more extensive research effort to provide a systematic, theoretically consistent and scientifically rigorous approach to model linkage for future policy impact assessments.
    Keywords: modelling, Common agricultural policy
    Date: 2017–08
  5. By: Paolo Cremonesi; Chiara Francalanci; Alessandro Poli; Roberto Pagano; Luca Mazzoni; Alberto Maggioni; Mehdi Elahi
    Abstract: This paper proposes a novel adaptive algorithm for the automated short-term trading of financial instruments. The algorithm adopts a semantic sentiment analysis technique to inspect Twitter posts and uses them to predict the behaviour of the stock market. Indeed, the algorithm is specifically designed to exploit both the sentiment and the past values of a given financial instrument in order to choose the best investment decision, allowing it to maximize the profits obtainable from trading on the stock market. We have conducted an investment simulation and compared the performance of our proposed approach with a well-known benchmark (DJTATO index) and with the optimal result, in which an investor knows the future price of a product in advance. The results show that our approach outperforms the benchmark and achieves a performance close to the optimal result.
    Date: 2018–01
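The core idea, turning a sentiment signal into a trading position, can be sketched minimally. This is not the authors' algorithm (which also exploits past prices and adapts over time): the series, the long/flat rule, and the 0.0 threshold are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 500
sentiment = rng.normal(0, 1, T)                    # stand-in daily Twitter sentiment
ret = 0.002 * sentiment + rng.normal(0, 0.01, T)   # returns mildly tilted by sentiment

position = (sentiment > 0.0).astype(float)         # long when sentiment is positive,
strategy_ret = position * ret                      # flat (zero exposure) otherwise

print(f"buy-and-hold mean daily return:   {ret.mean():.5f}")
print(f"sentiment-rule mean daily return: {strategy_ret.mean():.5f}")
```

A real evaluation would, as in the paper, compare such a rule against a benchmark index and against the perfect-foresight optimum, net of transaction costs.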
  6. By: de Klerk, Etienne (Tilburg University, School of Economics and Management); Glineur, Francois; Taylor, Adrien
    Abstract: We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also give the tight worst-case complexity bound for a noisy variant of the gradient descent method, where exact line search is performed in a search direction that differs from the negative gradient by at most a prescribed relative tolerance. The proofs are computer-assisted, and rely on the solution of semidefinite programming performance estimation problems, as introduced by Drori and Teboulle (Math Progr 145(1–2):451–482, 2014).
    Date: 2017
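The scheme under study is easy to state concretely. On a convex quadratic f(x) = ½xᵀAx − bᵀx (the function class on which the paper's worst case is attained) the exact line-search step has the closed form α = gᵀg / gᵀAg. The matrix A and vector b below are arbitrary illustrative choices.

```python
import numpy as np

A = np.diag([1.0, 10.0])            # strongly convex quadratic, condition number 10
b = np.array([1.0, 1.0])
x = np.zeros(2)
x_star = np.linalg.solve(A, b)      # the unique minimizer

for _ in range(100):
    g = A @ x - b                   # gradient of f(x) = 0.5 x'Ax - b'x
    if g @ g < 1e-30:
        break
    alpha = (g @ g) / (g @ A @ g)   # exact line search along -g (closed form)
    x = x - alpha * g

print(f"distance to minimizer: {np.linalg.norm(x - x_star):.2e}")
```

For such quadratics the classical (Kantorovich) bound says the value gap f(x_k) − f* contracts by at most ((κ−1)/(κ+1))² per iteration, where κ is the condition number; the paper shows this rate is tight over the whole smooth strongly convex class.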
  7. By: Lucia, Alessi (European Commission – JRC); Giuseppina, Cannas (European Commission - JRC); Sara, Maccaferri (European Commission - JRC); Marco, Petracco Giudici (European Commission - JRC)
    Abstract: In November 2015, the European Commission adopted a legislative proposal to set up a European Deposit Insurance Scheme (EDIS), a single deposit insurance system for all bank deposits in the Banking Union. JRC was requested to quantitatively assess the effectiveness of introducing a single deposit insurance scheme and to compare it with other alternative options for the set-up of such insurance at European level. JRC compared national Deposit Guarantee Schemes and EDIS based on their respective ability to cover insured deposits in the face of a banking crisis. Analyses are based upon the results of the SYMBOL model simulation of banks’ failures.
    Keywords: banking regulation; banking crisis; deposit insurance
    JEL: C15 G01 G21 G28
    Date: 2017–12
  8. By: Christian Tilk (Johannes Gutenberg-University Mainz, Germany); Michael Drexl (Faculty of Applied Natural Sciences and Industrial Engineering, Deggendorf, Germany); Stefan Irnich (Johannes Gutenberg-University Mainz, Germany)
    Abstract: This paper considers vehicle routing problems (VRPs) with multiple resource interdependencies and addresses the development and computational evaluation of an exact branch-and-price-and-cut algorithm for their solution. An interdependency between two resources means that the two resource consumptions influence one another in such a way that a tradeoff exists between them. This impacts the feasibility and/or the cost of a solution. The subproblem in branch-and-price-and-cut procedures for VRPs is very often a variant of the shortest-path problem with resource constraints (SPPRC). For the exact solution of many SPPRC variants, dynamic-programming-based labeling algorithms are predominant. The tradeoffs in problems with multiple resource interdependencies, however, render the application of labeling algorithms unpromising. This is because complex data structures for managing the tradeoff curves are necessary and only weak dominance criteria are possible, so that the labeling algorithm becomes almost a pure enumeration. Therefore, we propose to solve also the SPPRC subproblem with branch-and-price-and-cut. This results in a two-level, nested branch-and-price-and-cut algorithm. We analyze different variants of the algorithm to enable the exchange of columns and also rows between the different levels. To demonstrate the computational viability of our approach, we perform computational experiments on a prototypical VRP with time windows, minimal and maximal delivery quantities for each customer, a customer-dependent profit paid for each demand unit delivered, and temporal synchronization constraints between some pairs of customers. In this problem, tradeoffs exist between cost and load and between cost and time.
    Keywords: Routing, Vehicle routing, Synchronization, Nested decomposition, Branch-and-price-and-cut
    Date: 2018–01–11
  9. By: Timo Gschwind (Johannes Gutenberg-Universität Mainz, Germany); Stefan Irnich (Johannes Gutenberg-University Mainz, Germany); Fabio Furini (LAMSADE Université Paris Dauphine, France); Roberto Wolfler Calvo (LIPN Université Paris, France)
    Abstract: We study the family of problems of partitioning and covering a graph into/with a minimum number of relaxed cliques. Relaxed cliques are subsets of vertices of a graph for which a clique-defining property is relaxed, e.g., the degree of the vertices, the distance between the vertices, the density of the edges, or the connectivity between the vertices. These graph partitioning and covering problems have important applications in many areas such as social network analysis, biology, and disease spread prevention. We propose a unified framework based on branch-and-price techniques to compute optimal decompositions. For this purpose, new effective pricing algorithms are developed and new branching schemes are invented. In extensive computational studies, we compare several algorithmic designs, e.g., structure-preserving versus dichotomous branching and their interplay with different pricing algorithms. The final setup chosen for the branch-and-price produces results that demonstrate the effectiveness of all components of the newly developed framework and the validity of our approach when applied to social network instances.
    Keywords: Graph decomposition, clique relaxations, branch-and-price algorithm, social networks
    Date: 2017–12–20
  10. By: Luca Tiberti; John Cockburn; Martín Cicowiez
    Abstract: CGE models are often combined with microsimulation (MS) models to perform distributive impact analysis of fiscal or structural policies, or of external shocks. This paper describes a user-friendly Stata-based toolkit to perform microsimulations combined with CGE models in a top-down fashion. The toolkit is organized in various modules. It first estimates income generation by type of work and skill of workers. It then estimates household-specific price deflators based on individual utility. The changes estimated by a CGE model (or from other sources) in employment (by skill and sector), in the wage payroll (by skill), in the revenues from self-employment activities (by skill), as well as in commodity prices, are fed into the MS model in a consistent way. Once the new vector of real consumption or revenue is estimated, the toolkit performs a series of distributive analyses, such as the computation of standard poverty and inequality indices, their decomposition by income factor, robustness analysis and growth incidence curves, and compares the baseline with the simulation results. This makes it possible to run standard poverty and distributive analyses and to see whether a given shock or policy has had an impact on household welfare and which households are most affected. Based on such information, social protection policies can be accurately designed to minimize, for example, the negative effects of a given shock in a cost-effective manner. An illustrative analysis is run on data from Uganda.
    Keywords: CGE-microsimulation model, poverty and distributive analysis, Uganda
    Date: 2017
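The distributive indices such a toolkit computes in its final step can be illustrated in a few lines of Python (the toolkit itself is Stata-based); the incomes, the shock, and the poverty line below are invented for illustration.

```python
import numpy as np

def headcount_ratio(income, poverty_line):
    """Share of individuals below the poverty line (the FGT(0) index)."""
    return float(np.mean(np.asarray(income, dtype=float) < poverty_line))

def gini(income):
    """Gini coefficient via the sorted-income (rank) formula."""
    x = np.sort(np.asarray(income, dtype=float))
    n = x.size
    ranks = np.arange(1, n + 1)
    return (2.0 * np.sum(ranks * x)) / (n * np.sum(x)) - (n + 1.0) / n

baseline = np.array([40.0, 55.0, 80.0, 120.0, 300.0])     # toy income vector
shocked = baseline * np.array([0.9, 0.95, 1.0, 1.0, 1.05])  # a regressive shock

line = 60.0
print("headcount:", headcount_ratio(baseline, line), headcount_ratio(shocked, line))
print("gini:     ", round(gini(baseline), 3), round(gini(shocked), 3))
```

Comparing the indices before and after the simulated shock, exactly as the toolkit compares baseline and simulation runs, shows who bears the burden: here the regressive shock raises inequality.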
  11. By: BOULANGER Pierre (European Commission – JRC Seville); PHILIPPIDIS George (Aragonese Agency for Research and Development (ARAID) and Wageningen University & Research); URBAN Kirsten (Hohenheim University)
    Abstract: In 2020, decoupled payments will represent about 42% of the CAP budget (green payments excluded). This report assesses the potential effects of European decoupled payments on farmers' production decisions, prior to a sensitivity analysis of different coupling factors using the Modular Applied GeNeral Equilibrium Tool (MAGNET). The scientific literature reveals different coupling channels, such as capitalisation in land value, farmers' risk behaviour, credit accessibility, uncertainty about future policies and labour allocation, through which European decoupled payments influence farm choices and thus output. For each of these channels, the relevant theoretical and empirical literature is evaluated with the aim of deriving parameters that improve the representation of decoupled payments in economic simulation models. Within MAGNET, the traditional representation of decoupled payments is a subsidy to the land factor. Nevertheless, the most suitable approach seems to be to split the value of decoupled payments into two components: one capitalised into the land factor, the other distributed across all factors of production. A sensitivity analysis concludes that the assumed degree of coupling does have implications for output and price results when conducting policy analysis.
    Keywords: Economic analysis, impact assessment, Common Agricultural Policy, agricultural trade, agricultural markets, modelling tools, database
    JEL: C68 F11 Q13 Q18
    Date: 2017–08
  12. By: Chahim, M. (Tilburg University, School of Economics and Management); Grass, D.; Hartl, R.F.; Kort, Peter (Tilburg University, School of Economics and Management)
    Abstract: The paper provides a framework that enables us to analyze the important topic of capital accumulation under technological progress. We describe an algorithm to solve impulse control problems based on a (multipoint) boundary value problem approach. Investment takes place in lumps, and we determine the optimal timing of technology adoptions as well as the size of the corresponding investments. Our numerical approach leads to some guidelines for new technology investments. First, we find that investments are larger and occur at a later stage when more of the old capital stock needs to be scrapped. Moreover, we obtain that the size of the firm's investments increases when the technology produces more profitable products. We see that at the beginning of the planning period the firm adopts new technologies faster as time proceeds, but later on the opposite happens. Furthermore, we find that the firm does not invest such that marginal profit is zero; instead, marginal profit is negative.
    Date: 2017
  13. By: Michele Compare; Luca Bellani; Enrico Zio (Chaire Sciences des Systèmes et Défis Energétiques EDF/ECP/Supélec - LGI - Laboratoire Génie Industriel - EA 2606 - CentraleSupélec - SSEC - Chaire Sciences des Systèmes et Défis Energétiques EDF/ECP/Supélec - Ecole Centrale Paris - SUPELEC - EDF R&D - Electricité de France Recherche et Développement - CentraleSupélec, LGI - Laboratoire Génie Industriel - EA 2606 - CentraleSupélec, SSEC - Chaire Sciences des Systèmes et Défis Energétiques EDF/ECP/Supélec - Ecole Centrale Paris - SUPELEC - EDF R&D - Electricité de France Recherche et Développement - CentraleSupélec)
    Abstract: A variety of prognostic and health management (PHM) algorithms have been developed in recent years, and some metrics have been proposed to evaluate their performance. However, a general framework that allows us to quantify the benefit of PHM depending on these metrics is still lacking. We propose a general, time-variant, analytical model that conservatively evaluates the increase in system availability achievable when a component is equipped with a PHM system of known performance metrics. The availability model builds on metrics from the literature and is applicable to different contexts. A simulated case study is presented concerning crack propagation in a mechanical component. A simplified cost model is used to compare the performance of predictive maintenance based on PHM with corrective and scheduled maintenance.
    Keywords: availability; cost-benefit analysis; Monte Carlo (MC) simulation; prognostics and health management (PHM) metrics
    Date: 2017–06
  14. By: Ilya Kuzminov (National Research University Higher School of Economics); Dirk Meissner (National Research University Higher School of Economics); Alina Lavrynenko (National Research University Higher School of Economics); Elena Tochilina (National Research University Higher School of Economics)
    Abstract: The paper analyses problems associated with the classification of technologies for the purposes of futures studies, in order to ensure the definitive inclusion of technologies in specific classes/types when conventional approaches to classification are applied. The evolution of classification approaches within the philosophy of science is briefly reviewed, together with the latest research on expert-based and computerised (algorithmic) classification, and methodological dilemmas related to the hierarchical aggregation of technological and production processes are analysed. Common problems with classifying technologies and industries frequently encountered in the age of converging technologies are examined, using the agricultural sector and related industries as an example. A case study of computerised classification of agricultural technologies based on clustering algorithms is presented, with a brief analysis of the potential and limitations of the methodology. To this end, a two-stage approach to classifying technologies is suggested, based on distinguishing between platform (multipurpose) and industry-specific technologies. An adaptive approach to analysing technological structures is proposed, based on many-to-many relationships and fuzzy logic principles.
    Keywords: classification, typology, futures studies, science and technology development, technological structure, industry structure, text mining, tagging, network structures, fuzzy logic
    JEL: O14 O32 O33 Q16
    Date: 2018
  15. By: Lorenza Campagnolo (Fondazione Eni Enrico Mattei (FEEM)); Marinella Davide (Fondazione Eni Enrico Mattei (FEEM), Harvard University and Ca’ Foscari University)
    Abstract: The paper analyses the synergies and trade-offs between emission reduction policies and sustainable development objectives. Specifically, it provides an ex-ante assessment of the impacts that the Nationally Determined Contributions (NDCs), submitted under the Paris Agreement, will have on the Sustainable Development Goals (SDGs) of poverty eradication (SDG1) and reduced income inequality (SDG10). By combining an empirical analysis with a modelling exercise, the paper estimates the future trends of poverty prevalence and inequality across countries in a reference scenario and under a climate mitigation policy with alternative revenue recycling schemes. Our results suggest that a full implementation of the emission reduction contributions stated in the NDCs is projected to slow down the effort to reduce poverty by 2030 (+2% of the population below the poverty line compared to the baseline scenario), especially in countries that have proposed relatively more stringent mitigation targets and suffer higher policy costs. Conversely, countries with a stringent mitigation policy experience a reduction of inequality compared to baseline scenario levels. If financial support for mitigation action in developing countries is provided through an international climate fund, the prevalence of poverty will be slightly reduced at the aggregate level (185,000 fewer poor people with respect to the mitigation scenario), but the country-specific effect depends on the relative size of funds flowing to beneficiary countries and on their economic structure.
    Keywords: SDGs, Poverty, Inequality, CGE Model, Mitigation Policy, Paris Agreement
    JEL: C23 C68 Q56
    Date: 2017–09

This nep-cmp issue is ©2018 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.