nep-cmp New Economics Papers
on Computational Economics
Issue of 2018‒03‒19
eleven papers chosen by



  1. Bitcoin technical trading with artificial neural network By Masafumi Nakano; Akihiko Takahashi; Soichiro Takahashi
  2. Building a better trade model to determine local effects: A regional and intertemporal GTAP model By Pham Van Ha; Tom Kompas; Hoa-Thi-Minh Nguyen; Chu Hoang Long
  3. Credit Risk Analysis using Machine and Deep Learning models By Peter Addo; Dominique Guegan; Bertrand Hassani
  4. Radial Basis Functions Neural Networks for Nonlinear Time Series Analysis and Time-Varying Effects of Supply Shocks By KANAZAWA, Nobuyuki
  5. What If Supply-Side Policies Are Not Enough? The Perverse Interaction Of Flexibility And Austerity By Giovanni Dosi; Marcelo C. Pereira; Andrea Roventini; Maria Enrica Virgillito
  6. Compensating households from carbon tax regressivity and fuel poverty: a microsimulation study By Audrey Berry
  7. "Modern Hypermarket Receiving Yard Utilization: The Implementation of a Simulation Model" By Mohammad Annas
  8. Analysis of Financial Credit Risk Using Machine Learning By Jacky C. K. Chow
  9. Credit Risk Meets Random Matrices: Coping with Non-Stationary Asset Correlations By Andreas Mühlbacher; Thomas Guhr
  10. Algorithmic Collusion in Cournot Duopoly Market: Evidence from Experimental Economics By Nan Zhou; Li Zhang; Shijian Li; Zhijian Wang
  11. The Macroeconomic Effects of Efficiency Gains in Electricity Production in Malta By Noel Rapa

  1. By: Masafumi Nakano (Graduate School of Economics, University of Tokyo); Akihiko Takahashi (Graduate School of Economics, University of Tokyo); Soichiro Takahashi (Graduate School of Economics, University of Tokyo)
    Abstract: This paper explores Bitcoin trading based on artificial neural networks for return prediction. In particular, our deep learning method successfully discovers trading signals through a seven-layered neural network structure for given input data of technical indicators, which are calculated from the past time series of Bitcoin returns at 15-minute intervals. Under feasible settings of execution costs, the numerical experiments demonstrate that our approach significantly improves the performance of a buy-and-hold strategy. Notably, our model performs well over a challenging period from December 2017 to January 2018, during which Bitcoin suffered substantial negative returns.
    URL: http://d.repec.org/n?u=RePEc:cfi:fseres:cf430&r=cmp
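    The network architecture and indicator set are not detailed in this abstract, so the following is only a rough sketch of the general approach: a multi-layer network classifying the sign of the next 15-minute return from simple momentum and volatility indicators, trained here on synthetic data. The indicator choices, layer sizes and data are assumptions, not the authors' specification.
      # Toy sketch (not the authors' code): predict the sign of the next 15-minute
      # Bitcoin return from simple technical indicators with a multi-layer perceptron.
      import numpy as np
      from sklearn.neural_network import MLPClassifier
      from sklearn.metrics import accuracy_score

      rng = np.random.default_rng(0)
      returns = rng.normal(0, 0.01, 5000)          # placeholder for 15-minute returns

      def features(r, t, windows=(5, 10, 20)):
          # momentum and volatility over several look-back windows
          return [f(r[t - k:t]) for k in windows for f in (np.mean, np.std)]

      window = 20
      X = np.array([features(returns, t) for t in range(window, len(returns) - 1)])
      y = (returns[window + 1:] > 0).astype(int)   # 1 if the next return is positive

      split = int(0.8 * len(X))
      model = MLPClassifier(hidden_layer_sizes=(32, 32, 16, 16, 8),   # several hidden layers as a stand-in
                            max_iter=500, random_state=0)
      model.fit(X[:split], y[:split])
      print("out-of-sample accuracy:", accuracy_score(y[split:], model.predict(X[split:])))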
  2. By: Pham Van Ha; Tom Kompas; Hoa-Thi-Minh Nguyen; Chu Hoang Long
    Abstract: Intertemporal CGE models allow agents to respond fully to current and future policy shocks. This property is particularly important for trade policies, where tariff reductions span decades. Nevertheless, intertemporal CGE models are dimensionally large and computationally difficult to solve, thus hindering their development, save for those that are scaled down to only a few regions and commodities. Using a recently developed solution method, we address this problem by building an intertemporal version of a GTAP model that is large in dimension and can easily be scaled to focus on any subset of GTAP countries or regions, without the need for ‘second best’ recursive approaches. Specifically, we solve the model using a new parallel-processing technique and a matrix reordering procedure, and employ a non-steady-state baseline scenario. This provides an effective tool for the dynamic analysis of trade policies. As an application of the model, we simulate a free trade scenario for Vietnam with a focus on the recent Trans-Pacific Partnership (TPP). Our simulation shows that Vietnam gains considerably from the TPP, with 60% of the gains realised within the first 10 years despite our assumption of a gradual and linear removal of trade barriers. We also solve for intertemporal and sector-specific effects on each industry in Vietnam from the trade agreements, showing an added advantage of our approach compared to standard static and recursive GTAP models.
    Keywords: Intertemporal CGE model, GTAP, Trans-Pacific Partnership (TPP), EU-Vietnam Free Trade Agreement, Vietnam
    Date: 2018–01
    URL: http://d.repec.org/n?u=RePEc:een:crwfrp:1802&r=cmp
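    The abstract names a parallel-processing technique and a matrix reordering procedure but does not describe them. The snippet below only illustrates the general idea behind such reordering steps, namely permuting a large sparse system to reduce bandwidth before solving it (here with reverse Cuthill-McKee from SciPy); it is not the authors' algorithm, and the test matrix is synthetic.
      # Illustration only: reorder a sparse system to reduce bandwidth before a solve.
      import numpy as np
      import scipy.sparse as sp
      from scipy.sparse.csgraph import reverse_cuthill_mckee
      from scipy.sparse.linalg import spsolve

      rng = np.random.default_rng(1)
      n = 200
      A = sp.random(n, n, density=0.02, random_state=1, format="csr")
      A = (A + A.T + sp.identity(n) * n).tocsr()    # symmetric, diagonally dominant
      b = rng.normal(size=n)

      perm = reverse_cuthill_mckee(A, symmetric_mode=True)
      A_p = A[perm][:, perm]                        # permuted (banded) system
      x_p = spsolve(A_p.tocsc(), b[perm])

      x = np.empty(n)
      x[perm] = x_p                                 # undo the permutation
      print("max residual:", np.abs(A @ x - b).max())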
  3. By: Peter Addo (Lead Data Scientist - SNCF Mobilité); Dominique Guegan (UP1 - Université Panthéon-Sorbonne, Labex ReFi - UP1 - Université Panthéon-Sorbonne, University of Ca’ Foscari [Venice, Italy], CES - Centre d'économie de la Sorbonne - CNRS - Centre National de la Recherche Scientifique - UP1 - Université Panthéon-Sorbonne, IPAG - IPAG Business School - Ipag); Bertrand Hassani (Labex ReFi - UP1 - Université Panthéon-Sorbonne, Capgemini Consulting [Paris])
    Abstract: Due to the advanced technology associated with Big Data, data availability and computing power, most banks and lending financial institutions are renewing their business models. Credit risk prediction, monitoring, model reliability and effective loan processing are key to decision making and transparency. In this work, we build binary classifiers based on machine and deep learning models on real data to predict loan default probability. The top 10 most important features from these models are selected and then used in the modelling process to test the stability of the binary classifiers by comparing their performance on separate data. We observe that tree-based models are more stable than models based on multilayer artificial neural networks. This opens several questions about the intensive use of deep learning systems in enterprises.
    Keywords: Credit risk, Financial regulation, Data Science, Big data, Deep learning
    Date: 2018–02
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-01719983&r=cmp
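    The loan dataset and exact model settings are not given in this abstract; the sketch below reproduces the comparison in spirit only, on synthetic data: fit a gradient-boosted tree ensemble, keep its 10 most important features, then compare a tree ensemble with a multilayer perceptron restricted to those features. All parameters are placeholders.
      # Toy sketch of the comparison described above, with synthetic data standing in
      # for the real loan book.
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.ensemble import GradientBoostingClassifier
      from sklearn.neural_network import MLPClassifier
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      X, y = make_classification(n_samples=4000, n_features=50, n_informative=10,
                                 weights=[0.9], random_state=0)   # imbalanced defaults
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      gb = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
      top10 = np.argsort(gb.feature_importances_)[-10:]           # 10 most important features

      for name, model in [("tree ensemble", GradientBoostingClassifier(random_state=0)),
                          ("neural net", MLPClassifier(hidden_layer_sizes=(64, 32),
                                                       max_iter=1000, random_state=0))]:
          model.fit(X_tr[:, top10], y_tr)
          auc = roc_auc_score(y_te, model.predict_proba(X_te[:, top10])[:, 1])
          print(f"{name}: test AUC = {auc:.3f}")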
  4. By: KANAZAWA, Nobuyuki
    Abstract: I propose a flexible nonlinear method for studying the time series properties of macroeconomic variables. In particular, I focus on a class of Artificial Neural Networks (ANN) called Radial Basis Functions (RBF). To assess the validity of the RBF approach in macroeconomic time series analysis, I conduct a Monte Carlo experiment using data generated from a nonlinear New Keynesian (NK) model. I find that the RBF estimator can uncover the structure of the nonlinear NK model from simulated data whose length is as small as 300 periods. Finally, I apply the RBF estimator to quarterly US data and show that the response of macroeconomic variables to a positive supply shock exhibits substantial time variation. In particular, positive supply shocks are found to have significantly weaker expansionary effects during the zero lower bound periods as well as between 2003 and 2004. This finding is consistent with a basic NK model, which predicts that the higher real interest rate due to monetary policy inaction weakens the effects of supply shocks.
    Keywords: Neural Networks, Radial Basis Functions, Zero Lower Bound, Supply Shocks
    JEL: C45 E31
    Date: 2018–03
    URL: http://d.repec.org/n?u=RePEc:hit:hiasdp:hias-e-64&r=cmp
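    As a reminder of the building block involved, the snippet below fits a one-dimensional radial-basis-function regression: Gaussian basis functions on a fixed grid of centres, with weights obtained by least squares. The target function, centres and width are illustrative choices, not the paper's estimator.
      # Minimal radial-basis-function (RBF) regression: Gaussian bumps on a grid,
      # weights fitted by ordinary least squares.
      import numpy as np

      rng = np.random.default_rng(0)
      x = np.sort(rng.uniform(-3, 3, 300))
      y = np.sin(2 * x) / (1 + x**2) + rng.normal(0, 0.05, x.size)   # nonlinear target

      centers = np.linspace(-3, 3, 20)
      width = 0.5

      def design(x):
          # one Gaussian basis function per centre, plus an intercept column
          phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width**2))
          return np.hstack([np.ones((x.size, 1)), phi])

      w, *_ = np.linalg.lstsq(design(x), y, rcond=None)
      print("in-sample RMSE:", np.sqrt(np.mean((design(x) @ w - y) ** 2)))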
  5. By: Giovanni Dosi; Marcelo C. Pereira; Andrea Roventini; Maria Enrica Virgillito
    Abstract: In this work we develop a set of labour market and fiscal policy experiments upon the labour- and credit-augmented “Schumpeter meeting Keynes” agent-based model. The labour market is modelled under two institutional variants, the “Fordist” and the “Competitive” setups, meant to capture the historical transition from the Fordist toward the post “Thatcher-Reagan” period. Within these two regimes, we study the different effects of supply-side active labour market policies (ALMPs) vs. demand-management passive labour market policies (PLMPs). In particular, we analyse the effects of ALMPs aimed at promoting job search and at providing training to unemployed people. Next, we compare the effects of these policies with unemployment benefits simply meant to sustain income and therefore aggregate demand. Considering the burden of unemployment benefits on the public budget, we link such provision with the objectives of the European Stability and Growth Pact. Our results show that (i) an appropriate level of skills is not enough to sustain growth when workers face adverse labour demand; (ii) supply-side policies are not able to reverse the perverse interaction between flexibility and austerity; (iii) PLMPs outperform ALMPs in reducing unemployment and workers' skills deterioration; and (iv) demand-management policies are better suited to mitigate inequality and to improve and sustain long-run growth.
    Keywords: Industrial-relation Regimes, Flexibility, Active Labour Market Policies, Austerity, Agent-based models
    JEL: C63 E24 H53 J88
    Date: 2018–02
    URL: http://d.repec.org/n?u=RePEc:ast:wpaper:0031&r=cmp
  6. By: Audrey Berry (CIRED - Centre International de Recherche sur l'Environnement et le Développement - CNRS - Centre National de la Recherche Scientifique - ENPC - École des Ponts ParisTech - AgroParisTech - EHESS - École des hautes études en sciences sociales - CIRAD - Centre de Coopération Internationale en Recherche Agronomique pour le Développement)
    Abstract: For households, taxing carbon raises the cost of the energy they use to heat their home and to travel. This paper studies the distributional impacts of the recently introduced French carbon tax and the design of compensation measures. Using a microsimulation model built on a representative sample of the French population from 2012, I simulate for each household the taxes levied on its consumption of energy for housing and transport. Without recycling, the carbon tax is regressive and increases fuel poverty. However, I show how compensation measures can offset these impacts. A flat cash transfer offsets tax regressivity by redistributing
    Keywords: Carbon tax, Distributional impacts, Fuel poverty, Revenue recycling, Microsimulation
    Date: 2018–01–23
    URL: http://d.repec.org/n?u=RePEc:hal:ciredw:hal-01691088&r=cmp
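    The French household data and the exact tax and recycling rules cannot be reproduced from this abstract; the sketch below only illustrates the mechanics of such a microsimulation on synthetic households: levy a tax proportional to energy spending, compute its weight in income by income decile, and recycle the revenue as a flat per-household transfer. All magnitudes are invented.
      # Stylised microsimulation sketch (synthetic households, not the French survey data).
      import numpy as np

      rng = np.random.default_rng(0)
      n = 10_000
      income = rng.lognormal(mean=10, sigma=0.5, size=n)
      # energy spending rises with income, but less than proportionally
      energy_spend = 600 + 0.03 * income * rng.uniform(0.7, 1.3, n)

      tax_rate = 0.10                              # carbon tax as a share of energy spending
      tax = tax_rate * energy_spend
      transfer = tax.mean()                        # flat per-household recycling of the revenue

      deciles = np.digitize(income, np.quantile(income, np.linspace(0.1, 0.9, 9)))
      for d in range(10):
          m = deciles == d
          gross = (tax[m] / income[m]).mean()
          net = ((tax[m] - transfer) / income[m]).mean()
          print(f"decile {d + 1}: tax burden {gross:.2%} of income, net of transfer {net:+.2%}")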
  7. By: Mohammad Annas (Universitas Multimedia Nusantara, Tangerang, 15153, Indonesia)
    Abstract: "Objective – This research is a direct observation of initial queuing, using data that is categorised into two clusters: the number of people queuing at busy hours, and processing times in the same circumstances. Methodology/Technique – The raw data was converted for use in the Poisson distribution test, as well as the Kolmogorov-Smirnov exponential distribution options. An arena simulation model was also applied to identify the vendor's waiting time and to analyse receiving yard utilization. The average waiting time according to the Poisson distribution, the average serving time per vendor by an exponential distribution, and the number of receiving yards, are all essential factors effecting the utilization of receiving yards. Findings – The study compares the length of queues, serving times, arrival rate, and time in the system using dual and single receiving yard systems. However, the utilization rate on a two receiving yards system is less than the rate on single receiving yard system. As the aim of this study is to identify the utilization rate of the receiving yard, a single receiving yard operation is more representative of modern hypermarkets, and more efficient in terms of resource efficiency. Novelty – This study depends fully on the homogeneous operating hours of the retailers' receiving yards, the type of vehicle used by vendors to unload merchandises, procedures on moving the products to the inspections phase, a generalization of the products delivered by the vendors and the size of the modern hypermarkets business itself. "
    Keywords: Receiving Yard Utilization; Hypermarket Receiving Yard; Queuing Simulation.
    JEL: M1 M10 M19
    Date: 2017–12–12
    URL: http://d.repec.org/n?u=RePEc:gtr:gatrjs:jber147&r=cmp
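    The paper's Arena model and data are not available here, but the Poisson-arrival / exponential-service setup it describes corresponds to the textbook M/M/c queue. The snippet below compares utilization and mean queue wait for one versus two receiving yards using the standard Erlang C formula; the arrival and service rates are made-up numbers.
      # Standard M/M/c queue (Erlang C) comparison for one vs. two receiving yards.
      from math import factorial

      def mmc(lam, mu, c):
          rho = lam / (c * mu)                               # utilisation per yard
          a = lam / mu
          p0 = 1 / (sum(a**k / factorial(k) for k in range(c))
                    + a**c / (factorial(c) * (1 - rho)))
          wait_prob = a**c / (factorial(c) * (1 - rho)) * p0  # Erlang C probability of waiting
          wq = wait_prob / (c * mu - lam)                     # mean wait in queue (hours)
          return rho, wq

      lam, mu = 5.0, 6.0        # vendors arriving per hour / served per yard per hour
      for c in (1, 2):
          rho, wq = mmc(lam, mu, c)
          print(f"{c} yard(s): utilisation {rho:.0%}, mean queue wait {60 * wq:.1f} min")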
  8. By: Jacky C. K. Chow
    Abstract: Corporate insolvency can have a devastating effect on the economy. With an increasing number of companies expanding overseas to capitalize on foreign resources, a multinational corporate bankruptcy can disrupt the world's financial ecosystem. Corporations do not fail instantaneously; objective measures and rigorous analysis of qualitative (e.g. brand) and quantitative (e.g. econometric factors) data can help identify a company's financial risk. Gathering and storing data about a corporation have become less difficult with recent advancements in communication and information technologies. The remaining challenge lies in mining relevant information about a company's health hidden in the vast amounts of data, and using it to forecast insolvency so that managers and stakeholders have time to react. In recent years, machine learning has become a popular field in big data analytics because of its success in learning complicated models. Methods such as support vector machines, adaptive boosting, artificial neural networks, and Gaussian processes can be used to recognize patterns in the data (with a high degree of accuracy) that may not be apparent to human analysts. This thesis studied corporate bankruptcy of manufacturing companies in Korea and Poland using experts' opinions and financial measures, respectively. Using publicly available datasets, several machine learning methods were applied to learn the relationship between a company's current state and its fate in the near future. Results showed that predictions with accuracy greater than 95% were achievable using any machine learning technique when informative features like experts' assessments were used. However, when using purely financial factors to predict whether or not a company will go bankrupt, the correlation is not as strong.
    Date: 2018–02
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1802.05326&r=cmp
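    The Korean and Polish datasets are public but not included here; as a purely illustrative stand-in, the snippet below cross-validates two of the model families mentioned above (a support vector machine and adaptive boosting) on synthetic, imbalanced firm-level data.
      # Toy sketch of the model comparison, on synthetic data rather than the thesis datasets.
      from sklearn.datasets import make_classification
      from sklearn.ensemble import AdaBoostClassifier
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      X, y = make_classification(n_samples=2000, n_features=30, n_informative=8,
                                 weights=[0.95], random_state=1)   # rare bankruptcy events
      models = {
          "support vector machine": make_pipeline(StandardScaler(), SVC()),
          "adaptive boosting": AdaBoostClassifier(random_state=1),
      }
      for name, model in models.items():
          scores = cross_val_score(model, X, y, cv=5, scoring="balanced_accuracy")
          print(f"{name}: balanced accuracy {scores.mean():.3f} +/- {scores.std():.3f}")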
  9. By: Andreas Mühlbacher; Thomas Guhr
    Abstract: We review recent progress in modeling credit risk for correlated assets. We start from the Merton model, in which default events and losses are derived from the asset values at maturity. To estimate the time development of the asset values, stock prices are used, whose correlations have a strong impact on the loss distribution, particularly on its tails. These correlations are non-stationary, which also influences the tails. We account for the asset fluctuations by averaging over an ensemble of random matrices that models the truly existing set of measured correlation matrices. As a most welcome side effect, this approach drastically reduces the parameter dependence of the loss distribution, allowing us to obtain very explicit results which show quantitatively that the heavy tails prevail over diversification benefits even for small correlations. We calibrate our random matrix model with market data and show how it is capable of capturing different market situations. Furthermore, we present numerical simulations for concurrent portfolio risks, i.e., for the joint probability densities of losses for two portfolios. For the convenience of the reader, we give an introduction to the Wishart random matrix model.
    Date: 2018–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1803.00261&r=cmp
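    A schematic Monte Carlo in the spirit of this setup is sketched below: Merton-style defaults for a homogeneous portfolio, comparing a fixed asset correlation with correlation matrices drawn scenario by scenario from a Wishart-type ensemble around it. The parameter values, default threshold and ensemble size are illustrative assumptions, not the calibration in the paper.
      # Schematic sketch: portfolio loss distribution with fixed vs. fluctuating correlations.
      import numpy as np

      rng = np.random.default_rng(0)
      K, n_sims, c = 50, 5000, 0.3               # obligors, scenarios, mean correlation
      threshold = -2.0                           # default if standardised asset value < threshold
      base = np.full((K, K), c) + (1 - c) * np.eye(K)

      # fixed correlation: sample all asset-value scenarios at once
      assets = rng.multivariate_normal(np.zeros(K), base, size=n_sims)
      loss_fixed = (assets < threshold).mean(axis=1)

      # fluctuating correlation: draw a Wishart-type random correlation matrix per scenario
      loss_fluct = np.empty(n_sims)
      n_wishart = 15                             # small value -> strong correlation fluctuations
      for i in range(n_sims):
          W = rng.multivariate_normal(np.zeros(K), base, size=n_wishart)
          cov = W.T @ W / n_wishart
          d = np.sqrt(np.diag(cov))
          a = rng.multivariate_normal(np.zeros(K), cov / np.outer(d, d))
          loss_fluct[i] = (a < threshold).mean() # fraction of the portfolio defaulting

      for label, L in [("fixed correlation", loss_fixed), ("fluctuating (Wishart)", loss_fluct)]:
          print(f"{label}: mean loss {L.mean():.4f}, 99% quantile {np.quantile(L, 0.99):.4f}")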
  10. By: Nan Zhou; Li Zhang; Shijian Li; Zhijian Wang
    Abstract: Algorithmic collusion is an emerging concern in the current artificial intelligence age. Whether algorithmic collusion is a credible threat remains an open question. In this paper, we propose an algorithm that can extort its human rival into colluding in a Cournot duopoly market. In experiments, we show that the algorithm successfully extorts its human rival and earns a higher profit in the long run, while the human rival ends up fully colluding with the algorithm. As a result, social welfare declines rapidly and persistently. Both in theory and in experiment, our work confirms that algorithmic collusion can be a credible threat. We hope that the framework, the algorithm design and the experimental environment illustrated in this work can serve as an incubator or a test bed for researchers and policymakers seeking to handle emerging algorithmic collusion.
    Date: 2018–02
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1802.08061&r=cmp
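    The extortion mechanism itself is not described in this abstract. The snippet below only sets up the underlying stage game with linear inverse demand, showing why collusion is attractive: each firm earns more, and total output (and hence consumer surplus) falls relative to the Cournot-Nash outcome. Demand and cost parameters are arbitrary.
      # Cournot duopoly with inverse demand P = a - b(q1 + q2) and unit cost c:
      # compare the one-shot Nash outcome with the collusive (shared-monopoly) outcome.
      a, b, c = 10.0, 1.0, 2.0

      def profit(q1, q2):
          price = max(a - b * (q1 + q2), 0.0)
          return (price - c) * q1

      q_nash = (a - c) / (3 * b)        # fixed point of q_i = (a - c - b*q_j) / (2b)
      q_collude = (a - c) / (4 * b)     # each firm produces half the monopoly quantity

      for label, q in [("Nash", q_nash), ("collusion", q_collude)]:
          print(f"{label}: q = {q:.2f} each, profit = {profit(q, q):.2f} each, "
                f"total output = {2 * q:.2f}")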
  11. By: Noel Rapa
    Abstract: This note studies the impact that energy market reforms might have on the Maltese economy in the medium to long run using a DSGE model. Contrary to previous studies, this note takes into consideration the changes in the marginal cost of electricity production of Enemalta under a number of energy production setups. Results show that the decommissioning of the Marsa power plant and the installation of an undersea interconnector result in a fall in marginal costs, and therefore an increase in long-run output, under both the baseline and high oil price scenarios, ranging between 1.61% and 2.53%. This energy setup is, however, consistent with an increase in marginal costs, and therefore a fall in long-run output of 0.41%, in the case of a low oil price scenario. The future setup of natural gas fired turbines results in a fall in marginal costs and an increase in long-run output in all oil price scenarios, ranging from 0.81% in the low oil price scenario to 3.00% in the case of high oil prices.
    JEL: E37 D58 Q43
    URL: http://d.repec.org/n?u=RePEc:mlt:ppaper:0517&r=cmp
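    The DSGE model itself is not reproduced in this abstract; as a loose back-of-envelope analogue, the snippet below works out how long-run output responds to a change in the marginal cost of electricity when production is Cobb-Douglas in labour and energy and energy is hired up to its marginal product. The elasticities and cost changes are invented for illustration and do not correspond to the note's scenarios.
      # Back-of-envelope comparative statics (not the Central Bank of Malta model):
      # with Y = A * L**alpha * E**beta and energy hired until its marginal product
      # equals its price, output responds to energy costs with elasticity -beta/(1-beta).
      alpha, beta, A, L = 0.65, 0.05, 1.0, 1.0

      def output(p_e):
          energy = (beta * A * L**alpha / p_e) ** (1 / (1 - beta))   # from p_e = dY/dE
          return A * L**alpha * energy**beta

      for dcost in (-0.20, -0.10, 0.05):          # illustrative marginal-cost changes
          change = output(1.0 + dcost) / output(1.0) - 1
          print(f"marginal cost {dcost:+.0%}  ->  long-run output {change:+.2%}")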

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.