nep-cmp New Economics Papers
on Computational Economics
Issue of 2015‒09‒05
ten papers chosen by



  1. Superannuation within a financial CGE model of the Australian economy By Peter B. Dixon; James A. Giesecke; Maureen T. Rimmer
  2. The Danish Microsimulation Model SMILE – An overview By Peter Stephensen
  3. SBAM: An Algorithm for Pair Matching By Peter Stephensen; Tobias Markeprand
  4. Inefficiency and Self-Determination: Simulation-based Evidence from Meiji Japan By Eric Weese; Masayoshi Hayashi; Masashi Nishikawa
  5. Financial Market Modeling with Quantum Neural Networks By Carlos Pedro Gonçalves
  6. A computational spectral approach to interest rate models By Luca Di Persio; Michele Bonollo; Gregorio Pellegrini
  7. A Microsimulation Model for Educational Forecasting By Niels Erik Kaaber Rasmussen; Peter Stephensen
  8. Optimization Approaches for the Traveling Salesman Problem with Drone By Agatz, N.A.H.; Bouman, P.; Schmidt, M.
  9. Autonomics: an autonomous and intelligent economic platform and next generation money tool By Benjamin Munro; Julia McLachlan
  10. Were the Scandinavian Banking Crises Predictable? A Neural Network Approach By Kim Ristolainen

  1. By: Peter B. Dixon; James A. Giesecke; Maureen T. Rimmer
    Abstract: Australia's superannuation sector has become both a major institution in guiding the allocation of the nation's financial capital across asset classes, regions, and sectors, and a central intermediary in channelling the nation's annual savings into domestic capital formation and foreign financial asset accumulation. To put the industry's scale in context, in 2012 the sector had assets under management of approximately $1.4tn (Australia's GDP in the same year was approximately $1.5tn). Annual inflows to the system represent approximately one third of gross national savings. The sector's influence over the allocation of the nation's physical and financial assets continues to grow. We model this important institution within an economy-wide setting by embedding explicit modelling of the sector within a model of the financial sector which is in turn linked to a dynamic multi-sectoral CGE model of the real side of the economy. We develop the financial CGE model by building on a multi-sectoral dynamic model of the real side of the Australian economy. In particular, we introduce explicit treatment of: (i) financial intermediaries and the agents with which they transact; (ii) financial instruments describing assets and liabilities; (iii) the financial flows related to these instruments; (iv) rates of return on individual assets and liabilities; and (v) links between the real and monetary sides of the economy. We explore the effects of the superannuation sector by simulating a one percentage point increase in the ratio of superannuation contributions to the economy-wide nominal wage bill.
    Keywords: Financial CGE model, superannuation
    JEL: C68 G11 G17 G21
    Date: 2015–07
    URL: http://d.repec.org/n?u=RePEc:cop:wpaper:g-253&r=all
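    A rough back-of-envelope illustration of the policy experiment described above (not the authors' CGE model): the first-round flow effect of raising the contribution-to-wage-bill ratio by one percentage point. The wage bill and baseline contribution rate below are assumed values chosen only to sit alongside the magnitudes quoted in the abstract.
      # Illustrative arithmetic only; not the CGE model. Wage bill and baseline
      # contribution rate are assumptions; assets under management is from the abstract.
      wage_bill = 0.75e12          # assumed economy-wide nominal wage bill, AUD
      contribution_rate = 0.09     # assumed baseline contributions / wage bill
      super_assets = 1.4e12        # sector assets under management, AUD (abstract)

      shock = 0.01                 # one percentage point increase in the ratio
      extra_inflow = shock * wage_bill
      print(f"Additional annual inflow to superannuation: ${extra_inflow / 1e9:.0f}bn")
      print(f"As a share of existing assets: {extra_inflow / super_assets:.2%}")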
  2. By: Peter Stephensen (Danish Rational Economic Agents Model, DREAM)
    Abstract: The SMILE model is a Danish, dynamic, data-driven microsimulation model. The current version forecasts demography, education level, socioeconomic characteristics and housing demand for the period 2010-2050. The basic idea behind SMILE is to unite the pre-models that the Danish institution DREAM already uses into a full dynamic microsimulation model. The new elements of the model are described and the development strategy is outlined. The model is based on a new Event Pump architecture, a Lego-block-like object-oriented technique in which the model is built as an Agent Tree consisting of Agent objects. The model makes extensive use of a method called CTREE, a decision tree technique that has not previously been used for microsimulation modelling. Finally, a matching algorithm called SBAM (Sparse Biproportionate Adjustment Matching) has been developed.
    Keywords: population projections, education, household projections, housing demand, microsimulation
    Date: 2013–12
    URL: http://d.repec.org/n?u=RePEc:dra:wpaper:201305&r=all
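    A minimal Python sketch of what an event-pump loop over an agent tree might look like; the class names, events and probabilities are hypothetical and are not SMILE's actual API.
      # Hypothetical agent-tree / event-pump sketch, not SMILE's implementation.
      import random

      class Person:
          def __init__(self, age):
              self.age = age
              self.alive = True

          def on_year(self, year):
              # Fire simple demographic events; probabilities are illustrative only.
              self.age += 1
              if random.random() < 0.01 * self.age / 80:
                  self.alive = False

      class Household:
          def __init__(self, members):
              self.members = members

          def on_year(self, year):
              for person in self.members:
                  person.on_year(year)
              self.members = [p for p in self.members if p.alive]

      def event_pump(households, start=2010, end=2050):
          # The pump walks the agent tree once per period and dispatches the event.
          for year in range(start, end + 1):
              for hh in households:
                  hh.on_year(year)

      population = [Household([Person(random.randint(0, 80)) for _ in range(3)])
                    for _ in range(1000)]
      event_pump(population)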
  3. By: Peter Stephensen (Danish Rational Economic Agents Model, DREAM); Tobias Markeprand (Danish Rational Economic Agents Model, DREAM)
    Abstract: This paper introduces a new algorithm for pair matching. The method is called SBAM (Sparse Biproportionate Adjustment Matching) and can be characterized as either cross-entropy minimizing or matrix balancing. This implies that we use information efficiently according to the historic observations on pair matching. The advantages of the method are its efficient use of information and its reduced computational requirements. We compare the resulting matching pattern with the harmonic and Choo-Siow matching functions and find that in important cases the SBAM and Choo-Siow methods change the pattern of couples in the same way. We also compare the computational requirements of SBAM with alternative methods used in microsimulation models. The method is demonstrated in the context of a new Danish microsimulation model that has been used for forecasting housing demand.
    Keywords: pair matching, algorithm, SBAM
    Date: 2013–10
    URL: http://d.repec.org/n?u=RePEc:dra:wpaper:201303&r=all
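    The biproportional adjustment at the core of SBAM is a matrix-balancing (iterative proportional fitting) step: rescale a historical matching matrix until its row and column sums hit the new counts of single men and women. The sketch below shows that balancing step only; the paper's SBAM additionally handles sparsity and other details not reproduced here.
      # Plain iterative proportional fitting as a stand-in for the balancing step;
      # numbers are illustrative.
      import numpy as np

      def biproportional_match(hist, men, women, iters=100, tol=1e-10):
          """hist[i, j]: historical matches of man-type i with woman-type j."""
          m = hist.astype(float).copy()
          for _ in range(iters):
              m *= (men / m.sum(axis=1))[:, None]      # scale rows to male marginals
              m *= (women / m.sum(axis=0))[None, :]    # scale columns to female marginals
              if np.allclose(m.sum(axis=1), men, atol=tol):
                  break
          return m

      hist = np.array([[50.0, 10.0], [5.0, 40.0]])     # illustrative historical pattern
      print(biproportional_match(hist, men=np.array([70.0, 30.0]),
                                 women=np.array([40.0, 60.0])))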
  4. By: Eric Weese (Department of Economics, Yale University); Masayoshi Hayashi (Department of Economics, University of Tokyo); Masashi Nishikawa (Department of Economics, Aoyama Gakuin University)
    Abstract: Does the exercise of the right of self-determination lead to inefficiency? This paper considers a set of centrally planned municipal mergers during the Meiji period, with data from Gifu prefecture. The observed merger pattern can be explained as a social optimum based on a very simple individual utility function. If individual villages had been allowed to choose their merger partners, counterfactual simulations show that the core is always non-empty, but core partitions contain about 80% more (postmerger) municipalities than the social optimum. Simulations are possible because core partitions can be calculated using repeated application of a mixed integer program.
    Date: 2015–08
    URL: http://d.repec.org/n?u=RePEc:kob:dpaper:dp2015-35&r=all
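    A toy illustration of the core-partition check (the paper solves it at scale with repeated mixed integer programs): a partition is in the core if no coalition of villages would make all of its members strictly better off by merging on their own. The utility function and population numbers below are made up for illustration.
      # Brute-force core check on a tiny, invented example; not the paper's MIP.
      from itertools import combinations

      villages = ["A", "B", "C", "D"]
      pop = {"A": 900, "B": 400, "C": 300, "D": 1200}

      def utility(village, coalition):
          # Illustrative utility: scale economies minus a cost per merger partner.
          size = sum(pop[v] for v in coalition)
          return size ** 0.5 - 8.0 * (len(coalition) - 1)

      def blocks(coalition, partition):
          current = {v: utility(v, next(c for c in partition if v in c)) for v in coalition}
          return all(utility(v, coalition) > current[v] for v in coalition)

      def in_core(partition):
          all_coalitions = (set(c) for r in range(1, len(villages) + 1)
                            for c in combinations(villages, r))
          return not any(blocks(c, partition) for c in all_coalitions)

      print(in_core([{"A", "B"}, {"C", "D"}]))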
  5. By: Carlos Pedro Gonçalves
    Abstract: Econophysics has developed as a research field that applies the formalism of Statistical Mechanics and Quantum Mechanics to address Economics and Finance problems. The branch of Econophysics that applies Quantum Theory to Economics and Finance is called Quantum Econophysics. In Finance, Quantum Econophysics' contributions have ranged from option pricing to market dynamics modeling, behavioral finance and applications of Game Theory, integrating the empirical finding from human decision analysis that nonlinear update rules in probabilities, leading to non-additive decision weights, can be approached computationally through quantum computation, with the resulting quantum interference terms explaining the non-additive probabilities. The current work draws on these results to introduce new tools from Quantum Artificial Intelligence, namely Quantum Artificial Neural Networks, as a way to build and simulate financial market models with adaptive selection of trading rules, leading to turbulence and excess kurtosis in the returns distributions for a wide range of parameters.
    Date: 2015–08
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1508.06586&r=all
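    A small numerical illustration, unrelated to the paper's specific model, of how quantum amplitudes yield non-additive probabilities: an outcome reachable along two paths has probability |a1 + a2|^2, which differs from |a1|^2 + |a2|^2 by an interference term.
      # Two-path interference toy example; amplitude values are illustrative.
      import numpy as np

      a1 = 0.6 * np.exp(1j * 0.0)     # amplitude of path 1
      a2 = 0.5 * np.exp(1j * 2.0)     # amplitude of path 2, with a relative phase

      classical = abs(a1) ** 2 + abs(a2) ** 2        # additive "decision weights"
      quantum = abs(a1 + a2) ** 2                    # amplitudes add, then square
      interference = 2 * (a1 * np.conj(a2)).real     # the non-additive correction

      print(f"classical sum : {classical:.3f}")
      print(f"quantum       : {quantum:.3f}")
      print(f"interference  : {interference:.3f}")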
  6. By: Luca Di Persio; Michele Bonollo; Gregorio Pellegrini
    Abstract: The Polynomial Chaos Expansion (PCE) technique recovers a finite second order random variable exploiting suitable linear combinations of orthogonal polynomials which are functions of a given stochastic quantity ξ, hence acting as a kind of random basis. The PCE methodology has been developed as a mathematically rigorous Uncertainty Quantification (UQ) method which aims at providing reliable numerical estimates for some uncertain physical quantities defining the dynamics of certain engineering models and their related simulations. In the present paper we exploit the PCE approach to analyze some equity and interest rate models considering, without loss of generality, the one dimensional case. In particular we will take into account those models which are based on the Geometric Brownian Motion (gBm), e.g. the Vasicek model, the CIR model, etc. We also provide several numerical applications and results which are discussed for a set of volatility values. The latter allows us to test the PCE technique on a quite large set of different scenarios, hence providing a rather complete and detailed investigation of the PCE approximation's features and properties, such as the convergence of statistics, distribution and quantiles. Moreover we give results concerning both an efficiency and an accuracy study of our approach by comparing our outputs with the ones obtained adopting the Monte Carlo approach in its standard form as well as in its enhanced version.
    Date: 2015–08
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1508.06236&r=all
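    For readers unfamiliar with PCE, a standard worked example (not taken from the paper): the terminal value of a geometric Brownian motion admits an exact Hermite-chaos expansion in the standard normal ξ,
      S_T \;=\; S_0 \exp\!\Big(\big(\mu - \tfrac{\sigma^2}{2}\big)T + \sigma\sqrt{T}\,\xi\Big)
          \;=\; \sum_{k=0}^{\infty} c_k\,\mathrm{He}_k(\xi),
      \qquad
      c_k \;=\; \frac{S_0\, e^{\mu T}\,(\sigma\sqrt{T})^{k}}{k!},
      \qquad \xi \sim \mathcal{N}(0,1),
    where He_k are the probabilists' Hermite polynomials; truncating the series at a finite order gives the PCE approximation whose statistics, distribution and quantiles can then be compared against Monte Carlo estimates.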
  7. By: Niels Erik Kaaber Rasmussen; Peter Stephensen (Danish Rational Economic Agents Model, DREAM)
    Abstract: A dynamic microsimulation model for forecasting educational patterns is presented. At the level of individuals the model simulates lifetime educational behavior, resulting in a long term forecast of the general educational level in Denmark. The model is a lightweight, dynamic, multithreaded and closed microsimulation model using discrete time. Data on the full Danish population is used as the initial population. Each individual is characterized by age, gender, origin, educational attainment and current educational status. Future demographic events such as births, deaths, immigration and emigration are projected in a separate group-based model and given as input. In the model individuals live their lives independently to decrease time-complexity and to utilize the potential of the multithreaded environment. Transition probabilities are calculated from historical educational behavior using Danish register data. The historical observations are linked to a range of background variables (such as gender, age, origin, current participation in education, study length and educational attainment). Prior to running the model, transition probabilities are computed using conditional inference trees. This data-mining approach groups together observations with similar characteristics and responses based on statistical tests. This paper describes the features of the model, briefly presents some results and points to the potential of the model in terms of policy analysis and already planned extensions to the model.
    Keywords: microsimulation model, education, forecasting, education projection
    Date: 2014–10
    URL: http://d.repec.org/n?u=RePEc:dra:wpaper:201402&r=all
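    A sketch of the pre-computation step described above: turn historical observations, grouped by background variables, into transition probabilities with a tree. Scikit-learn's ordinary CART tree is used here as a stand-in for the conditional inference trees named in the abstract, and the data and variable names are invented for illustration.
      # Illustrative stand-in for the transition-probability step; synthetic data,
      # CART tree instead of conditional inference trees.
      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(0)
      n = 5_000
      X = np.column_stack([
          rng.integers(16, 30, n),        # age
          rng.integers(0, 2, n),          # gender
          rng.integers(0, 3, n),          # current education level
      ])
      # Outcome next year: 0 = stay, 1 = enrol in further education, 2 = drop out.
      y = rng.choice([0, 1, 2], size=n, p=[0.7, 0.2, 0.1])

      tree = DecisionTreeClassifier(min_samples_leaf=200).fit(X, y)

      # Transition probabilities for a 19-year-old woman at education level 1:
      print(dict(zip(tree.classes_, tree.predict_proba([[19, 1, 1]])[0])))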
  8. By: Agatz, N.A.H.; Bouman, P.; Schmidt, M.
    Abstract: The fast and cost-efficient home delivery of goods ordered online is logistically challenging. Many companies are looking for new ways to cross the last-mile to their customers. One technology-enabled opportunity that recently has received much attention is the use of a drone to support deliveries. An innovative last-mile delivery concept in which a truck collaborates with a drone to make deliveries gives rise to a new variant of the traveling salesman problem (TSP) that we call the TSP with drone. In this paper, we formulate this problem as an MIP model and develop several fast route first-cluster second heuristics based on local search and dynamic programming. We prove worst-case approximation ratios for the heuristics and test their performance by comparing the solutions to the optimal solutions for small instances. In addition, we apply our heuristics to several artificial instances with different characteristics and sizes. Our numerical analysis shows that substantial savings are possible with this concept in comparison to truck-only delivery.
    Keywords: traveling salesman problem, vehicle routing, drones, home delivery
    Date: 2015–08–01
    URL: http://d.repec.org/n?u=RePEc:ems:eureri:78472&r=all
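    A simplified route-first, cluster-second sketch in the spirit of the heuristics above (not the authors' implementation): build a truck tour with a nearest-neighbour heuristic, then walk the tour and hand a stop to the drone whenever an out-and-back sortie from the neighbouring truck stops fits within the truck's direct drive. Euclidean distances, a drone twice as fast as the truck, and one delivery per sortie are assumptions of the sketch.
      # Simplified illustration; the paper's heuristics use local search and
      # dynamic programming rather than this greedy pass.
      import math

      def dist(a, b):
          return math.hypot(a[0] - b[0], a[1] - b[1])

      def nearest_neighbour_tour(points):
          tour, rest = [0], set(range(1, len(points)))
          while rest:
              nxt = min(rest, key=lambda j: dist(points[tour[-1]], points[j]))
              tour.append(nxt)
              rest.remove(nxt)
          return tour + [0]                     # return to the depot

      def assign_drone(points, tour, drone_speed=2.0):
          truck, drone = [tour[0]], []
          prev_was_drone = False
          for i in range(1, len(tour) - 1):
              launch, cur, nxt = truck[-1], tour[i], tour[i + 1]
              sortie = (dist(points[launch], points[cur])
                        + dist(points[cur], points[nxt])) / drone_speed
              direct = dist(points[launch], points[nxt])
              # Hand the stop to the drone if its sortie fits within the truck's
              # direct drive and the previous stop was served by the truck.
              if not prev_was_drone and sortie <= direct:
                  drone.append(cur)
                  prev_was_drone = True
              else:
                  truck.append(cur)
                  prev_was_drone = False
          truck.append(tour[-1])
          return truck, drone

      pts = [(0, 0), (2, 1), (4, 0), (5, 3), (1, 4)]   # depot first, then customers
      print(assign_drone(pts, nearest_neighbour_tour(pts)))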
  9. By: Benjamin Munro; Julia McLachlan
    Abstract: We propose a high level network architecture for an economic system that integrates money, governance and reputation. We introduce a method for issuing and redeeming a digital coin using a mechanism to create a sustainable global economy and a free market. To maintain a currency's value over time, and therefore be money proper, we claim it must be issued by the buyer and backed for value by the seller, exchanging the products of labour, in a free market. We also claim that a free market and sustainable economy cannot be maintained using economically arbitrary creation and allocation of money. Nakamoto, with Bitcoin, introduced a new technology called the cryptographic blockchain to operate a decentralised and distributed accounts ledger without the need for a trusted third party. This blockchain technology creates and allocates new digital currency as a reward for "proof-of-work", to secure the network. However, no currency, digital or otherwise, has solved how to create and allocate money in an economically non-arbitrary way, or how to govern and trust a world-scale free enterprise money system. We propose an "Ontologically Networked Exchange" (ONE), with purpose as its highest order domain. Each purpose is defined in a contract, and the entire economy of contracts is structured in a unified ontology. We claim to secure the ONE network using economically non-arbitrary methodologies and economically incentivised human behaviour. Decisions influenced by reputation help to secure the network without a trusted third party. The stack of contracts, organised in a unified ontology, functions as a super recursive algorithm, with individual use programming the algorithm, acting as the "oracle". The state of the algorithm becomes the "memory" of a scalable and trustable artificial intelligence (AI). This AI offers a new platform for what we call the "Autonomy-of-Things" (AoT).
    Date: 2015–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1508.05355&r=all
  10. By: Kim Ristolainen (Department of Economics, University of Turku)
    Abstract: The early warning system literature on banking crises has often relied on linear classifiers such as the logit model, usually estimated on large datasets covering countries from several regions. We construct an EWS based on an artificial neural network model with monthly data from the Scandinavian countries to tackle the poor generalization ability of the usual models, which may be due to regional heterogeneity across countries and a nonlinear decision boundary in the classification problem. We show that the Finnish and Swedish banking crises of 1991 were quite predictable with an artificial neural network model when information from earlier crises in Denmark and Norway was used. We also use cross-validation in the model selection process to choose the optimal level of model complexity. Finally, the area under the ROC curve is used as the model assessment criterion, and in this framework we show that the artificial neural network outperforms the logit regression in predicting banking crises.
    Keywords: Early Warning System, Banking Crises, Scandinavia, Neural Networks, Validation
    JEL: G21 C45 C52
    URL: http://d.repec.org/n?u=RePEc:tkk:dpaper:dp99&r=all
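    An illustrative sketch of the model comparison described above, using synthetic data rather than the paper's Scandinavian dataset: fit a small neural network and a logit classifier and compare cross-validated area under the ROC curve.
      # Synthetic, illustrative comparison; not the paper's data or specification.
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score
      from sklearn.neural_network import MLPClassifier
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      X, y = make_classification(n_samples=600, n_features=12, n_informative=6,
                                 weights=[0.9, 0.1], random_state=1)   # crises are rare

      logit = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
      ann = make_pipeline(StandardScaler(),
                          MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                        random_state=1))

      for name, model in [("logit", logit), ("neural net", ann)]:
          auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
          print(f"{name:>10}: mean AUC = {auc.mean():.3f}")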

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.