nep-cmp New Economics Papers
on Computational Economics
Issue of 2018‒03‒12
ten papers chosen by
Stan Miles
Thompson Rivers University

  1. Credit Risk Analysis using Machine and Deep learning models By Peter Martey Addo; Dominique Guegan; Bertrand Hassani
  2. Pricing Options with Exponential Lévy Neural Network By Jeonggyu Huh
  3. Modelling land use, deforestation, and policy: a hybrid optimisation-heterogeneous agent model with application to the Bolivian Amazon By Andersen, Lykke E.; Groom, Ben; Killick, Evan; Ledezma, Juan Carlos; Palmer, Charles; Weinhold, Diana
  4. Firm-level simulation of supply chain disruption triggered by actual and predicted earthquakes By Inoue, Hiroyasu; Todo, Yasuyuki
  5. Big Data, Computational Science, Economics, Finance, Marketing, Management, and Psychology: Connections By Chang, C-L.; McAleer, M.J.; Wong, W.-K.
  6. Improving warehouse responsiveness by job priority management By Kim, T.Y.
  7. A 2015 Social Accounting Matrix (SAM) for Mozambique By António S. Cruz; Fausto Mafambissa; Mónica Magáua; Vincenzo Salvucci; Dirk van Seventer
  8. Truck driver scheduling with combined planning of rest periods, breaks and vehicle refueling By Bernhardt, A.; Melo, Teresa; Bousonville, Thomas; Kopfer, Herbert
  9. Nowcasting economic activity with electronic payments data: A predictive modeling approach By Carlos León; Fabio Ortega
  10. Types of signature analysis in reliability based on Hilbert series By Mohammadi, Fatemeh; Saenz-de-Cabezon, Eduardo; Wynn, Henry P.

  1. By: Peter Martey Addo (Data Scientist (Lead), Expert Synapses, SNCF Mobilite); Dominique Guegan (Université Paris1 Panthéon-Sorbonne, Centre d'Economie de la Sorbonne, LabEx ReFi and Ca' Foscari University of Venezia); Bertrand Hassani (VP, Chief Data Scientist, Capgemini Consulting and LabEx ReFi)
    Abstract: Thanks to the technology associated with Big Data, data availability, and computing power, most banks and lending institutions are renewing their business models. Credit risk prediction, monitoring, model reliability, and effective loan processing are key to decision-making and transparency. In this work, we build binary classifiers based on machine and deep learning models on real data to predict loan default probability. The top 10 important features from these models are selected and then used in the modelling process to test the stability of the binary classifiers by comparing their performance on separate data. We observe that tree-based models are more stable than models based on multilayer artificial neural networks. This observation raises several questions about the intensive use of deep learning systems in enterprises.
    Keywords: Credit risk; Financial regulation; Data Science; Big Data; Deep learning
    JEL: C02 C13 C19 G01 G21 G28 D81 G31
    Date: 2018–02
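    The feature-stability check this abstract describes — rank features by model importance, keep the top 10, and re-evaluate on separate data — can be sketched in outline. The feature names and importance scores below are hypothetical stand-ins for what a fitted tree-based model's importance output would provide:

    ```python
    def select_top_features(importances, k=10):
        """Return the k feature names with the highest importance scores.

        `importances` maps feature name -> importance (e.g. Gini importance
        from a tree-based model); ties are broken by name for determinism.
        """
        ranked = sorted(importances.items(), key=lambda kv: (-kv[1], kv[0]))
        return [name for name, _ in ranked[:k]]

    # Hypothetical importance scores, standing in for a fitted model's output.
    scores = {
        "debt_to_income": 0.22, "credit_history_len": 0.18, "utilization": 0.15,
        "late_payments": 0.12, "loan_amount": 0.09, "employment_years": 0.07,
        "num_accounts": 0.06, "age": 0.05, "income": 0.04, "region": 0.01,
    }
    top = select_top_features(scores, k=3)
    ```

    In the workflow the abstract outlines, the retained features would then be used to retrain each binary classifier and compare its performance on held-out data to gauge stability.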
  2. By: Jeonggyu Huh
    Abstract: In this paper, we propose the exponential Lévy neural network (ELNN) for option pricing, a new non-parametric exponential Lévy model using artificial neural networks (ANNs). The ELNN fully integrates ANNs with the exponential Lévy model, a conventional pricing model, which allows it to avoid several essential issues affecting ANN-based models, such as unacceptable outcomes and inconsistent pricing of over-the-counter products. Moreover, the ELNN is the first practically applicable non-parametric exponential Lévy model, made possible by advances in optimization research in the ANN field; the existing non-parametric models are too vulnerable to be employed in practice. Empirical tests with S&P 500 option prices show that the ELNN outperforms two parametric models, the Merton and Kou models, in terms of fitting performance and stability of estimates.
    Date: 2018–02
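    For context on the parametric backbone: in an exponential Lévy model the log-price is a Lévy process, and European option prices are typically computed from its characteristic function. A standard textbook formulation (not this paper's specific parametrisation, which may differ) is

    ```latex
    S_t = S_0\, e^{X_t}, \qquad
    \varphi_T(u) = \mathbb{E}\!\left[e^{iuX_T}\right] = e^{T\psi(u)}, \qquad
    \psi(u) = i\gamma u - \tfrac{\sigma^2 u^2}{2}
      + \int_{\mathbb{R}} \left(e^{iux} - 1 - iux\,\mathbf{1}_{|x|\le 1}\right)\nu(dx),
    ```

    where ψ is the Lévy–Khintchine characteristic exponent and ν the Lévy measure. A non-parametric variant such as the ELNN would, presumably, replace parametric components of this specification with neural-network outputs.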
  3. By: Andersen, Lykke E.; Groom, Ben; Killick, Evan; Ledezma, Juan Carlos; Palmer, Charles; Weinhold, Diana
    Abstract: We introduce a hybrid simulation model ('SimPachamama') designed to explore the complex socio-environmental trade-offs of alternative policy bundles and policy sequencing options for stemming deforestation and reducing poverty in tropical countries. Designed and calibrated to the initial conditions of a small forest village in rural Bolivia, the model consists of: (a) an optimising agricultural household module of heterogeneous agents that make individually optimal land-use decisions based on factor endowments and market conditions; (b) an encompassing general equilibrium ‘shell’ module that endogenously determines wages and links the agricultural labour market and rural-urban migration rates; and (c) a novel user-controlled policy-maker module that allows the user to make ‘real time’ choices over a variety of public and environmental policies that in turn impact land use, welfare, and migration. Over a 20-year simulation period the results highlight trade-offs between reductions in deforestation and improvements in household welfare that can only be overcome either when international REDD payments are offered or when decentralized deforestation taxes are implemented. The sequencing of policies plays a critical role in the determination of these results.
    Keywords: simulation; Bolivia; deforestation; land use; policy; REDD
    JEL: Q23 Q28 Q56 R14
    Date: 2017–05–01
  4. By: Inoue, Hiroyasu; Todo, Yasuyuki
    Abstract: This paper reports simulations of supply chain disruptions caused by the Great East Japan Earthquake and the predicted Nankai Trough Earthquake. The simulations are based on the actual nationwide supply chains of Japan and on an agent-based model. We obtain the following findings. (1) Using simulations of the Great East Japan Earthquake, we calibrate the parameters of the model. The simulation reproduces the aftermath of the disaster well, which means it captures the propagation of damage through supply chains and the recovery from it. (2) The indirect damage from both earthquakes geographically permeates the entire country within a very short time. Additionally, the damage to firms shows synchronized fluctuations due to the network structure. (3) Simulations of the Nankai Trough Earthquake show that its direct damage is 12 times greater than that of the Great East Japan Earthquake, while its indirect damage over a year is approximately 4.5 times greater. (4) Estimating the indirect damage triggered by the loss of a single firm, we find that approximately 10% of firms each cause more than 10% damage to the entire supply chain.
    Keywords: supply chain, propagation, disaster, agent, simulation, high performance computing
    JEL: L14
    Date: 2017–11–22
  5. By: Chang, C-L.; McAleer, M.J.; Wong, W.-K.
    Abstract: The paper provides a review of the literature that connects Big Data, Computational Science, Economics, Finance, Marketing, Management, and Psychology, and discusses some research related to these seven disciplines. Academics could develop theoretical models and subsequent econometric and statistical models to estimate the parameters in the associated models, as well as conduct simulations to examine whether their estimators and hypothesis tests have good size and high power. Thereafter, academics and practitioners could apply the theory to analyse interesting issues in the seven disciplines and cognate areas.
    Keywords: Big Data, Computational science, Economics, Finance, Management, Theoretical models, Econometric and statistical models, Applications
    JEL: A10 G00 G31 O32
    Date: 2018–01–01
  6. By: Kim, T.Y.
    Abstract: Warehouses employ order cut-off times to ensure sufficient time for fulfilment. To satisfy higher consumer expectations, these cut-off times are gradually being postponed to improve order responsiveness. Warehouses therefore have to allocate jobs more efficiently to meet compressed response times. Priority job management by means of flow-shop models has been used mainly for manufacturing systems but can also be applied to warehouse job scheduling to accommodate tighter cut-off times. This study investigates which priority rule performs best under which circumstances. The performance of each rule is evaluated in terms of a common cost criterion that integrates the objectives of low earliness, low tardiness, low labour idleness, and low work-in-process stocks. A real-world case study of a warehouse distribution centre of an original equipment manufacturer in consumer electronics provides the input parameters for a simulation study. The simulation outcomes validate several strategies for improved responsiveness. In particular, the critical ratio rule yields the shortest flow times and performs best for warehouse scenarios with expensive products and high labour costs.
    Keywords: responsiveness, queuing model, order fulfilment, cut-off operation, flow-shop scheduling
    Date: 2018–01–01
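    The critical ratio rule singled out in this abstract is a standard dispatching rule: jobs are sequenced by the ratio of time remaining until the due date to processing time, smallest first (a ratio below 1 means the job is already behind schedule). A minimal sketch, with hypothetical warehouse jobs:

    ```python
    def critical_ratio_order(jobs, now):
        """Sequence jobs by the critical ratio (due - now) / processing_time.

        `jobs` is a list of (name, due_time, processing_time) tuples;
        the job with the smallest ratio is dispatched first.
        """
        def ratio(job):
            _, due, proc = job
            return (due - now) / proc
        return [name for name, _, _ in sorted(jobs, key=ratio)]

    # Hypothetical jobs: (name, due time, processing time), all in minutes.
    jobs = [("pick_A", 60, 30), ("pick_B", 40, 10), ("pack_C", 90, 20)]
    order = critical_ratio_order(jobs, now=0)
    ```

    Here pick_A is most critical (ratio 2.0) despite pick_B's earlier due date, because pick_A's long processing time leaves less slack. The paper compares this rule against other priority rules under a combined cost criterion.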
  7. By: António S. Cruz; Fausto Mafambissa; Mónica Magáua; Vincenzo Salvucci; Dirk van Seventer
    Abstract: This paper documents a 2015 Social Accounting Matrix (SAM) for Mozambique. The SAM is built using unpublished Instituto Nacional de Estatística (INE) industry-level production accounts, commodity-level supply–demand balances and a supply matrix, together with national accounts, National Directorate of Planning and Budget (DNPO) government statistics and IMF balance of payment statistics (all for the year 2015), INE household and labour market survey data for 2014–15 and a use matrix from a 2007 SAM for Mozambique. It provides a detailed representation of the Mozambican economy and identifies 55 activities and commodities. Labour is disaggregated by education attainment level and household income and expenditure by per capita expenditure quintiles both for urban and rural areas. The SAM features production for home consumption as reported in the unpublished INE accounts and the INE household survey data and also presents government, investment, and foreign accounts. The SAM is a useful database for conducting economy-wide impact assessments, including multiplier analysis and computable general equilibrium (CGE) modelling.
    Date: 2018
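    A defining property of any SAM, including this one, is that it is a square matrix in which each account's row total (receipts) equals its column total (payments). A minimal consistency check, with a toy 3-account matrix standing in for the real 2015 data:

    ```python
    def sam_is_balanced(sam, tol=1e-9):
        """Check that every account's row sum equals its column sum."""
        n = len(sam)
        for i in range(n):
            row_total = sum(sam[i])
            col_total = sum(sam[j][i] for j in range(n))
            if abs(row_total - col_total) > tol:
                return False
        return True

    # Toy 3-account SAM (activities, households, government); values hypothetical.
    toy_sam = [
        [0.0, 50.0, 10.0],
        [40.0, 0.0, 20.0],
        [20.0, 10.0, 0.0],
    ]
    balanced = sam_is_balanced(toy_sam)
    ```

    Balancing (and, in practice, cross-entropy or RAS adjustment when source data conflict) is what makes a SAM usable as the base-year database for multiplier or CGE analysis.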
  8. By: Bernhardt, A.; Melo, Teresa; Bousonville, Thomas; Kopfer, Herbert
    Abstract: Fuel is one of the main cost drivers in the road haulage sector. An analysis of diesel price variations across different European countries showed that a significant potential for cutting fuel expenditure can be found in international long-haul freight transportation. Here, truck drivers are often on the road for several consecutive days or even weeks. During their trips, they must comply with the rules on driving hours and rest periods, which in the European Union are governed by Regulation (EC) No 561/2006. In the literature, refueling problems have attracted limited attention so far. In the present study, we show why a joint consideration of drivers' rest periods and breaks and of refueling is important, and how the choice of time windows, the planning of driver activities, and the determination of refueling stops and quantities can be done accordingly. For a given sequence of customer locations and gas stations with different fuel prices along the route chosen to serve these customers, we propose a mixed integer linear programming (MILP) model and describe the corresponding solution process. In this multicriteria optimization problem, with the goals of minimizing lateness, travel time, and fuel expenditure, we consider multiple soft time windows at customer locations. We extend the MILP model developed by Bernhardt et al. (2016) by integrating refueling decisions. Additionally, a preprocessing heuristic is described which reduces the number of gas stations to be considered along the route and thus the solution space and the computational effort. Numerical experiments were conducted on instances derived from real data that include vehicle routes for one week and information on gas stations along the vehicle routes. Different parameter settings for the preprocessing heuristic were analyzed.
    Keywords: road transportation,refueling,fuel cost,driver scheduling,rest periods,breaks,driving hours,Regulation (EC) No 561/2006,mixed integer linear programming models
    Date: 2017
  9. By: Carlos León; Fabio Ortega (Banco de la República de Colombia; Banco de la República de Colombia)
    Abstract: Economic activity nowcasting (i.e. making current-period estimates) is convenient because most traditional measures of economic activity come with substantial lags. We aim to nowcast ISE, a short-term economic activity indicator in Colombia. Inputs are ISE’s lags and a dataset of payments made with electronic transfers and cheques among individuals, firms, and the central government. Under a predictive modeling approach, we employ a nonlinear autoregressive exogenous (NARX) neural network model. Results suggest that our choice of inputs and predictive method enables us to nowcast economic activity with fair accuracy. Also, we validate that electronic payments data significantly reduce the nowcast error of a benchmark non-linear autoregressive neural network model. Nowcasting economic activity from electronic payment instruments data not only contributes to agents’ decision making and economic modeling, but also supports new research paths on how to use retail payments data to augment current models.
    Keywords: forecasting, machine learning, neural networks, retail payments, NARX
    JEL: C45 C53 E27
    Date: 2018–02
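    The NARX setup this abstract describes predicts the current value of the target (here, the activity index) from its own lags plus lags of exogenous inputs (the payments series). The lag-vector construction can be sketched as follows; the lag orders and toy series are hypothetical, and the nonlinear map itself would be the fitted neural network:

    ```python
    def narx_regressors(y, x, p, q):
        """Build NARX training pairs: [y_{t-1..t-p}, x_{t-1..t-q}] -> y_t.

        y is the target series (e.g. the activity index), x the exogenous
        series (e.g. electronic payments); returns a list of (inputs, target).
        """
        start = max(p, q)
        samples = []
        for t in range(start, len(y)):
            lags_y = [y[t - i] for i in range(1, p + 1)]
            lags_x = [x[t - i] for i in range(1, q + 1)]
            samples.append((lags_y + lags_x, y[t]))
        return samples

    # Tiny hypothetical series; real inputs would be the indicator and payments data.
    y = [1.0, 2.0, 3.0, 4.0]
    x = [10.0, 20.0, 30.0, 40.0]
    pairs = narx_regressors(y, x, p=2, q=1)
    ```

    Dropping the exogenous lags (q = 0) recovers the purely autoregressive benchmark against which the paper measures the value of the payments data.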
  10. By: Mohammadi, Fatemeh; Saenz-de-Cabezon, Eduardo; Wynn, Henry P.
    Abstract: The present paper studies multiple failures and signature analysis of coherent systems using the theory of monomial ideals. While system reliability has been studied using Hilbert series of monomial ideals, this is not enough to understand more deeply the structural features of the ideal that reflect the behavior of the system under multiple simultaneous failures. Therefore, we introduce the lcm-filtration of a monomial ideal, and we study the Hilbert series and resolutions of the corresponding ideals. Given a monomial ideal, we explicitly compute the resolutions of all ideals in the associated lcm-filtration, and we apply this to study coherent systems. Computational results are shown in examples to demonstrate the usefulness of this approach and the computational issues that arise. We also study the failure distribution from a statistical point of view by means of the algebraic tools described.
    JEL: C1
    Date: 2016–08–11
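    For readers unfamiliar with the algebraic machinery: to a coherent system one associates a monomial (failure) ideal I in k[x_1, ..., x_n], one variable per component, and the multigraded Hilbert series encodes the inclusion–exclusion over failure states. Schematically (a standard identity from this literature, not the paper's new material):

    ```latex
    HS_{R/I}(t_1,\dots,t_n) \;=\;
    \frac{\sum_{i \ge 0} (-1)^i \sum_{\mu} \beta_{i,\mu}(R/I)\, t^{\mu}}
         {\prod_{j=1}^{n} (1 - t_j)},
    ```

    where the β_{i,μ} are the multigraded Betti numbers of a free resolution. Substituting component failure probabilities into the numerator yields the system failure probability, and truncating the alternating sum gives bounds; the lcm-filtration introduced in the paper refines this picture to multiple simultaneous failures.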

This nep-cmp issue is ©2018 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at the NEP website. For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject line; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.