nep-cmp New Economics Papers
on Computational Economics
Issue of 2016‒11‒13
thirteen papers chosen by



  1. Matrix Neural Networks By Gao, Junbin; Guo, Yi; Wang, Zhiyong
  2. EXPERIENTIAL LEARNING: LOOKING AT COUNTRY MANAGER SIMULATION FOR TEACHING INTERNATIONAL BUSINESS COURSES By Frank Cotae; Jacqueline Musabende
  3. Integrated model of computable general equilibrium and social cost benefit analysis of an Indian oil refinery: Future projections and macroeconomic effects By Shovan Ray; A. Ganesh Kumar; Sumana Chaudhuri
  4. Supplementary Paper Series for the "Comprehensive Assessment" (3): Policy Effects since the Introduction of Quantitative and Qualitative Monetary Easing (QQE) -- Assessment Based on Bank of Japan's Large-scale Macroeconomic Model (Q-JEM) -- By Kazutoshi Kan; Yui Kishaba; Tomohiro Tsuruga
  5. EM Algorithm and Stochastic Control in Economics By Steven Kou; Xianhua Peng; Xingbo Xu
  6. Neural Nets for Indirect Inference By Michael Creel
  7. CONTROL STRATEGY TO TRADE CRYPTOCURRENCIES By Josef Kokeš; Michal Bejček
  8. Sparse grid high-order ADI scheme for option pricing in stochastic volatility models By Bertram Düring; Christian Hendricks; James Miles
  9. Inventory management under various maintenance policies By Joeri Poppe; Rob Basten; Robert Boute; Marc Lambrecht
  10. LOLA 3.0: Luxembourg OverLapping generation model for policy Analysis By Luca Marchiori; Olivier Pierrard
  11. An Equilibrium Model with Computationally Constrained Agents By Wolfgang Kuhle
  12. A Finite Volume - Alternating Direction Implicit Approach for the Calibration of Stochastic Local Volatility Models By Maarten Wyns; Jacques Du Toit
  13. Quantitative Assessment of Pathways to a Resource-Efficient and Low-Carbon Europe By Martin Distekamp; Mark Meyer

  1. By: Gao, Junbin; Guo, Yi; Wang, Zhiyong
    Abstract: Traditional neural networks assume vectorial inputs, as the network is arranged as layers of single lines of computing units called neurons. This special structure requires non-vectorial inputs such as matrices to be converted into vectors. This process can be problematic. Firstly, the spatial information among elements of the data may be lost during vectorisation. Secondly, the solution space becomes very large, which demands very special treatment of the network parameters and high computational cost. To address these issues, we propose matrix neural networks (MatNet), which take matrices directly as inputs. Each neuron senses summarised information through bilinear mapping from lower-layer units in exactly the same way as in classic feed-forward neural networks. Under this structure, the combination of back propagation and gradient descent can be utilised to obtain network parameters efficiently. Furthermore, it can be conveniently extended for multimodal inputs. We apply MatNet to MNIST handwritten digit classification and image super resolution tasks to show its effectiveness. Without much tweaking, MatNet achieves performance comparable to state-of-the-art methods in both tasks, with considerably reduced complexity.
    Keywords: Image Super Resolution; Pattern Recognition; Machine Learning; Back Propagation; Neural Networks
    Date: 2016–11–02
    URL: http://d.repec.org/n?u=RePEc:syb:wpbsba:2123/15839&r=cmp
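    As a gloss on the bilinear mapping described above, here is a minimal Python sketch; the projection shapes and the tanh nonlinearity are our assumptions, not taken from the paper:

      import numpy as np

      def matnet_layer(X, U, V, B):
          # One MatNet-style hidden layer: the matrix input X is mapped
          # bilinearly, H = tanh(U X V^T + B), so the row/column structure
          # of X is kept instead of being flattened into a vector.
          return np.tanh(U @ X @ V.T + B)

      rng = np.random.default_rng(0)
      X = rng.standard_normal((28, 28))    # e.g. one MNIST digit
      U = rng.standard_normal((10, 28))    # learnable row projection
      V = rng.standard_normal((12, 28))    # learnable column projection
      B = np.zeros((10, 12))               # bias matrix
      H = matnet_layer(X, U, V, B)         # 10 x 12 hidden representation

    A 28 x 28 input thus needs only 10*28 + 12*28 + 10*12 parameters here, against 784*120 for a dense layer producing the same 120 hidden units from the vectorised image, which is the complexity saving the abstract alludes to.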
  2. By: Frank Cotae (Mount Royal University); Jacqueline Musabende (ISM)
    Abstract: The use of simulations in business education started in 1957; since then, hundreds of simulations have been developed. In this paper, we present a literature review of the impact that business simulations have on developing decision-making, integrative and experiential learning, and teamwork skills. Building on generative learning theory, experiential learning theory and Bloom's taxonomy, we tested the simulation Country Manager with a number of students divided into four groups. The objective was to obtain feedback on the applicability and benefits of using this software to teach decision-making in international business courses. Results showed Country Manager to be applicable to senior-level or capstone international business strategy courses and appropriate as an experiential learning tool. However, a series of difficulties and software inflexibilities were noted. We found supporting evidence for implementing a simulation in the international business curriculum as an experiential learning prong, even if Country Manager was not the most pertinent match.
    Keywords: Simulation, Experiential Learning, International Business, Benchmark Competition
    JEL: A22
    URL: http://d.repec.org/n?u=RePEc:sek:ibmpro:4306980&r=cmp
  3. By: Shovan Ray (Indira Gandhi Institute of Development Research); A. Ganesh Kumar (Indira Gandhi Institute of Development Research); Sumana Chaudhuri (Durgadevi Saraf Institute of Management Studies)
    Abstract: Social Cost Benefit Analysis has long been used as a tool to appraise and evaluate the value to society of a range of investment projects. Various important aspects of this method have been subject to scrutiny over the decades, such as the appropriate discount rate, and whether the Ramsey Rule of `pure time preference' should be applied as impatience with a positive rate or zero-rated out of concern for future generations; these are important concerns, since the choice of discount rate deeply affects the valuation of future income streams. Other aspects concerning financial flows and appropriate `shadow prices' have also received considerable attention. However, when a mega-project with the character of a `universal intermediate' is considered, its multiplier effects may be wide-ranging, permeate several economic and social layers, and be captured only in the aggregates. This study, a sequel to a paper that ignores such macro-aggregative benefits, examines the costs and benefits of the Vadinar refinery in Gujarat with a focus on this society-wide welfare dimension of the project. The study allows for this large-scale benefit accrual and examines, through Social Cost Benefit Analysis, the net economic benefit of refining at Vadinar by Essar Oil to the region, the state and the country. The framework thus explores a methodological breakthrough in SCBA studies. In constituting the macroeconomic effects of expansion of the mega oil refinery, the economic impact is estimated using a Computable General Equilibrium (CGE) model and incorporated into the cost benefit analysis. This assimilation of CBA with the macroeconomic externality obtained from the CGE model framework is perhaps the only one of its kind in the economic analysis of major infrastructure projects in any country. SCBA combined with CGE as an analytical tool can be gainfully employed to appraise or evaluate large-scale projects like oil refineries, especially mega-sized ones such as the Essar Oil refinery.
    Keywords: Social Cost Benefit Analysis, Economic Impact, Computable General Equilibrium (CGE) Model, Oil Refinery
    JEL: B41 C51 C52 C53 C54 C55 D50 D58 D60 D61
    Date: 2016–07
    URL: http://d.repec.org/n?u=RePEc:ind:igiwpp:2016-024&r=cmp
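    For readers new to the discounting debate the abstract raises, the textbook objects involved (generic SCBA notation, not quoted from this paper) are the net present value of the project's benefit and cost streams and the Ramsey discount rate:

      \[
        \mathrm{NPV} \;=\; \sum_{t=0}^{T} \frac{B_t - C_t}{(1+r)^t},
        \qquad
        r \;=\; \rho + \eta g,
      \]

    where \rho is the rate of pure time preference, \eta the elasticity of marginal utility of consumption, and g the growth rate of per-capita consumption. Zero-rating impatience (\rho = 0) out of concern for future generations lowers r and raises the weight on distant benefits, which is why the choice of discount rate so deeply affects the valuation of long-lived projects such as a refinery.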
  4. By: Kazutoshi Kan (Bank of Japan); Yui Kishaba (Bank of Japan); Tomohiro Tsuruga (Bank of Japan)
    Abstract: Three and a half years or so have passed since the Bank of Japan introduced Quantitative and Qualitative Monetary Easing (QQE) in April 2013. This paper presents a simulation exercise based on the Bank of Japan's large-scale macroeconomic model (Q-JEM) to assess the impact of policies since the introduction of QQE on Japan's economic activity and prices. In this exercise, we consider hypothetical scenarios assuming that QQE and subsequent easing measures had not been introduced, and conduct counterfactual simulations to examine how the Japanese economy and prices would have evolved under these scenarios. In this setting, we estimate the policy effects as the difference between the actual data and the counterfactual paths. We use two different starting points for the simulation: the introduction of QQE in Q2 2013, and the quarter before the introduction of QQE, when the Bank introduced its inflation target and markets may have anticipated a major policy change. Moreover, for each of the two different starting points, we consider two different cases in terms of what is regarded as part of the monetary policy shock brought about by QQE and subsequent policy measures. Specifically, in the first case, the monetary policy shock includes only the decline in real interest rates, and changes in exchange rates and stock prices are regarded as consequences of the policy shock only to the extent that they are explained within the model. In the second case, it includes all the changes in exchange rates and stock prices (beyond those predicted by the model). The simulation results indicate that in three out of the four scenarios, the year-on-year rate of change in the CPI (all items less fresh food and energy) would have stayed negative or close to zero percent without the introduction of QQE and subsequent policy measures.
    Keywords: Inflation; Inflation expectation; Macroeconomic model; Unconventional monetary policy; Asset purchase; Quantitative easing
    JEL: E17 E37 E52 E58
    Date: 2016–11–07
    URL: http://d.repec.org/n?u=RePEc:boj:bojwps:wp16e15&r=cmp
  5. By: Steven Kou; Xianhua Peng; Xingbo Xu
    Abstract: Generalising the idea of the classical EM algorithm that is widely used for computing maximum likelihood estimates, we propose an EM-Control (EM-C) algorithm for solving multi-period finite time horizon stochastic control problems. The new algorithm sequentially updates the control policies in each time period using Monte Carlo simulation in a forward-backward manner; in other words, the algorithm goes forward in simulation and backward in optimization in each iteration. Similar to the EM algorithm, the EM-C algorithm has the monotonicity of performance improvement in each iteration, leading to good convergence properties. We demonstrate the effectiveness of the algorithm by solving stochastic control problems in the monopoly pricing of perishable assets and in the study of real business cycles.
    Date: 2016–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1611.01767&r=cmp
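    The forward-backward structure is easy to mimic in a toy version of the perishable-asset pricing application; everything below (the linear demand curve, the candidate step size, common random numbers) is our assumption, not the authors' specification:

      import numpy as np

      def revenue(prices, n_paths=2000, stock=100.0, a=50.0, b=2.0):
          # Expected revenue from selling a perishable stock over
          # len(prices) periods at the posted prices (toy monopoly model).
          # A fixed seed gives common random numbers across evaluations,
          # so an accepted move genuinely improves the objective.
          rng = np.random.default_rng(1)
          inv = np.full(n_paths, stock)
          total = np.zeros(n_paths)
          for p in prices:
              demand = np.maximum(a - b * p + rng.normal(0, 2, n_paths), 0)
              sold = np.minimum(demand, inv)
              total += p * sold
              inv -= sold
          return total.mean()

      # EM-C-flavoured loop: simulate forward, then sweep backward over
      # the periods, improving one period's control at a time; as with EM,
      # each accepted move can only raise the Monte Carlo objective.
      prices = [10.0, 10.0, 10.0]
      for _ in range(30):
          for t in reversed(range(len(prices))):
              cands = [prices[t] - 0.5, prices[t], prices[t] + 0.5]
              vals = [revenue(prices[:t] + [c] + prices[t+1:]) for c in cands]
              prices[t] = cands[int(np.argmax(vals))]
      print(prices)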
  6. By: Michael Creel
    Abstract: For simulable models, neural networks are used to approximate the limited information posterior mean, which conditions on a vector of statistics, rather than on the full sample. Because the model is simulable, training and testing samples may be generated with sizes large enough to train well a net that is large enough, in terms of number of hidden layers and neurons, to learn the limited information posterior mean with good accuracy. Targeting the limited information posterior mean using neural nets is simpler, faster, and more successful than is targeting the full information posterior mean, which conditions on the observed sample. The output of the trained net can be used directly as an estimator of the model’s parameters, or as an input to subsequent classical or Bayesian indirect inference estimation. Examples of indirect inference based on the output of the net include a small dynamic stochastic general equilibrium model, estimated using both classical indirect inference methods and approximate Bayesian computing (ABC) methods, and a continuous time jump-diffusion model for stock index returns, estimated using ABC.
    Keywords: neural networks; indirect inference; approximate Bayesian computing; machine learning; DSGE; jump-diffusion
    Date: 2016–11
    URL: http://d.repec.org/n?u=RePEc:bge:wpaper:942&r=cmp
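    The recipe is straightforward to sketch for a toy simulable model; the AR(1) stand-in, the choice of statistics and the net size below are our assumptions, not the paper's DSGE or jump-diffusion applications:

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)

      def simulate_ar1(rho, T=200):
          # A simulable model: y_t = rho * y_{t-1} + e_t.
          y = np.zeros(T)
          for t in range(1, T):
              y[t] = rho * y[t - 1] + rng.standard_normal()
          return y

      def statistics(y):
          # Low-dimensional summary statistics the net conditions on,
          # in place of the full sample.
          return np.array([np.corrcoef(y[:-1], y[1:])[0, 1],
                           y.var(), y.mean()])

      # Training data: draw parameters from the prior, simulate, summarise.
      rhos = rng.uniform(-0.9, 0.9, 5000)
      Z = np.array([statistics(simulate_ar1(r)) for r in rhos])

      # The fitted net approximates the limited information posterior
      # mean E[rho | Z] and is itself an estimator of the parameter.
      net = MLPRegressor(hidden_layer_sizes=(50, 50), max_iter=500).fit(Z, rhos)
      y_obs = simulate_ar1(0.5)
      print(net.predict(statistics(y_obs).reshape(1, -1)))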
  7. By: Josef Kokeš (Czech Technical University); Michal Bejček (Charles University)
    Abstract: The paper deals with cryptocurrencies. First, a general introduction to cryptocurrencies is given from the programmer’s point of view. The paper then describes some basic strategies for automated trading. Also explained is the Floyd-Warshall algorithm and its modification for automating arbitrage. An illustrative example is given and a trading algorithm is listed.
    Keywords: cryptocurrency; arbitrage strategies; Floyd-Warshall algorithm
    JEL: C60 C80 G01
    URL: http://d.repec.org/n?u=RePEc:sek:ibmpro:4407038&r=cmp
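    The arbitrage use of Floyd-Warshall is standard and worth spelling out: replace each exchange rate by minus its logarithm, and a profitable cycle (product of rates above one) becomes a negative cycle, which the algorithm exposes as a negative diagonal entry. A minimal sketch with invented quotes:

      import math

      def arbitrage_currencies(rates):
          # rates[i][j]: units of currency j received per unit of i.
          # Run Floyd-Warshall on -log(rate); any i with a negative
          # dist[i][i] lies on a cycle whose rate product exceeds 1.
          n = len(rates)
          dist = [[-math.log(rates[i][j]) for j in range(n)] for i in range(n)]
          for k in range(n):
              for i in range(n):
                  for j in range(n):
                      if dist[i][k] + dist[k][j] < dist[i][j]:
                          dist[i][j] = dist[i][k] + dist[k][j]
          return [i for i in range(n) if dist[i][i] < 0]

      # Deliberately mispriced toy quotes among (BTC, LTC, USD):
      rates = [[1.0,     120.0,  900.0],
               [1/120.0, 1.0,    8.0],
               [1/900.0, 1/8.0,  1.0]]
      print(arbitrage_currencies(rates))   # non-empty: arbitrage exists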
  8. By: Bertram Düring; Christian Hendricks; James Miles
    Abstract: We present a sparse grid high-order alternating direction implicit (ADI) scheme for option pricing in stochastic volatility models. The scheme is second-order in time and fourth-order in space. Numerical experiments confirm the computational efficiency gains achieved by the sparse grid combination technique.
    Date: 2016–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1611.01379&r=cmp
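    For context, the generic two-dimensional combination technique underlying such schemes (our gloss, not a formula quoted from the paper) assembles the sparse grid solution from solutions computed on anisotropic grids with 2^{l_1} x 2^{l_2} cells:

      \[
        u_n^c \;=\; \sum_{l_1 + l_2 = n} u_{l_1, l_2}
               \;-\; \sum_{l_1 + l_2 = n-1} u_{l_1, l_2},
      \]

    which cuts the number of unknowns from O(4^n) on the full tensor grid to O(n 2^n) while, for sufficiently smooth solutions, largely retaining full-grid accuracy; each u_{l_1, l_2} can be computed by the ADI scheme independently and in parallel.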
  9. By: Joeri Poppe; Rob Basten; Robert Boute; Marc Lambrecht
    Abstract: Capital assets, such as manufacturing equipment, require maintenance to remain functioning. Maintenance can be performed when a component breaks down and needs replacement (i.e., corrective maintenance), or the maintenance and part replacement can be performed preventively. Preventive maintenance can be planned on a periodic basis (periodic maintenance), or it can be triggered by a certain monitored condition (condition-based maintenance). Preventive maintenance policies are gaining traction in the business world, but for many companies it is unclear what their impact is on the resulting inventory requirements for the spare parts that are used for the maintenance interventions. For a realistic setting at an OEM in the compressed-air industry with whom we collaborate, we study the impact of the maintenance policy on the inventory requirements and the corresponding costs. Preventive policies increase the total demand for spare parts compared to corrective maintenance, since the former do not exploit the entire useful life of the components. This leads to higher inventory requirements. At the same time, the preventive policies provide advance demand information, as the interventions, and correspondingly the spare parts demands, are planned in advance. Using a simulation study, we show that by using this advance demand information in managing the spare parts inventory, the increase in inventory requirements of preventive maintenance policies can to a large extent be offset; for condition-based maintenance, we find that inventories can even be lower compared to corrective maintenance, provided that the advance demand information is used correctly when managing inventories. For the OEM with whom we work, our analysis sheds light on the behaviour of the inventory-related costs under various maintenance policies.
    Keywords: Servitisation, Advance demand information, Maintenance, Condition-based maintenance, Spare parts inventory management
    Date: 2016–10
    URL: http://d.repec.org/n?u=RePEc:ete:kbiper:555260&r=cmp
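    The value of advance demand information is easy to illustrate with a single-part toy simulation; the failure probability, lead time and service target below are our assumptions, not the OEM's data. Announcing a replacement `notice` periods ahead shortens the exposure window the stock must cover:

      import numpy as np

      rng = np.random.default_rng(42)

      def min_base_stock(notice, lead=5, p_fail=0.1, target=0.95, T=100_000):
          # Smallest base stock whose lead-time demand is covered with
          # probability `target` (a cycle service level).  Ordering as soon
          # as a demand is announced shrinks the effective lead time from
          # `lead` to max(lead - notice, 0) periods.
          eff_lead = max(lead - notice, 0)
          if eff_lead == 0:
              return 0          # fully pre-announced: order to demand
          window = (rng.random((T, eff_lead)) < p_fail).sum(axis=1)
          for s in range(int(window.max()) + 1):
              if (window <= s).mean() >= target:
                  return s

      for notice in range(6):
          print(f"notice={notice}: base stock {min_base_stock(notice)}")

    The required base stock falls as the notice grows, which is the offsetting effect the abstract describes.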
  10. By: Luca Marchiori; Olivier Pierrard
    Abstract: LOLA 2.0 is a dynamic general equilibrium model for the Luxembourg economy, which features overlapping generations dynamics, labor market frictions à la Diamond-Mortensen-Pissarides and a New Open Economy Macroeconomics structure. This paper presents the model LOLA 3.0, which essentially integrates a financial sector into LOLA 2.0. In contrast to the existing dynamic stochastic general equilibrium (DSGE) literature, the financial sector does not intermediate between resident households and resident firms, but exports wealth management services. We calibrate the model to match the size of the financial sector in terms of employment, value added, net exports and taxes. The 2008 financial crisis has affected Luxembourg's financial sector and slowed inflows of cross-border workers. Because there is a lot of uncertainty surrounding future growth of the Luxembourg financial sector and cross-border worker inflows, we use LOLA 3.0 to study the evolution of the Luxembourg economy between 2015 and 2060 under alternative scenarios (high, medium, low).
    Keywords: Overlapping generations, Long-run projections, Financial sector, Luxembourg.
    JEL: D91 E24 E62 F41 J11
    Date: 2015–11
    URL: http://d.repec.org/n?u=RePEc:bcl:bclwop:bclwp100&r=cmp
  11. By: Wolfgang Kuhle
    Abstract: We study a large economy in which firms cannot compute exact solutions to the non-linear equations that characterize the equilibrium price at which they can sell future output. Instead, firms use polynomial expansions to approximate prices. The precision with which they can compute prices is endogenous and depends on the overall level of supply. At the same time, firms' individual supplies, and thus aggregate supply, depend on the precision with which they approximate prices. This interrelation between supply and price forecasts induces multiple equilibria, with inefficiently low output, in economies that otherwise have a unique, efficient equilibrium. Moreover, exogenous parameter changes, which would increase output were there no computational frictions, can diminish agents' ability to approximate future prices and reduce output. Our model therefore accommodates the intuition that interventions, such as unprecedented quantitative easing, can put agents into "uncharted territory".
    Date: 2016–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1611.01771&r=cmp
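    A generic illustration of the friction (not the paper's model): an agent who fits a low-order polynomial to a non-linear pricing function around familiar conditions forecasts well nearby but badly far away, i.e. in "uncharted territory":

      import numpy as np

      price = lambda s: 1.0 / (1.0 + s ** 2)   # hypothetical true price map
      s0 = 0.5                                 # familiar level of supply
      grid = np.linspace(s0 - 0.2, s0 + 0.2, 50)
      coef = np.polyfit(grid, price(grid), deg=2)  # local quadratic expansion

      for s in np.linspace(0.0, 3.0, 7):       # forecast error grows with
          print(f"s={s:.1f}  true={price(s):.3f}  "   # distance from s0
                f"approx={np.polyval(coef, s):.3f}")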
  12. By: Maarten Wyns; Jacques Du Toit
    Abstract: Calibration of stochastic local volatility (SLV) models to their underlying local volatility model is often performed by numerically solving a two-dimensional non-linear forward Kolmogorov equation. We propose a novel finite volume (FV) discretization in the numerical solution of general 1D and 2D forward Kolmogorov equations. The FV method does not require a transformation of the PDE. This constitutes a main advantage in the calibration of SLV models as the pertinent PDE coefficients are often nonsmooth. Moreover, the FV discretization has the crucial property that the total numerical mass is conserved. Applying the FV discretization in the calibration of SLV models yields a non-linear system of ODEs. Numerical time stepping is performed by the Hundsdorfer-Verwer ADI scheme to increase the computational efficiency. The non-linearity in the system of ODEs is handled by introducing an inner iteration. Ample numerical experiments are presented that illustrate the effectiveness of the calibration procedure.
    Date: 2016–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1611.02961&r=cmp
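    The mass-conservation property is the easiest ingredient to demonstrate. Below is a one-dimensional illustration only (the paper treats the two-dimensional non-linear SLV case with Hundsdorfer-Verwer ADI time stepping; the explicit Euler stepping and the toy drift and diffusion are our simplifications). Writing the forward Kolmogorov equation dp/dt = -d(mu p)/dx + (1/2) d^2(sigma^2 p)/dx^2 in flux form makes the interior fluxes telescope, so total probability is preserved exactly:

      import numpy as np

      N, L, dt, steps = 200, 10.0, 1e-4, 5000
      x = np.linspace(-L / 2, L / 2, N)
      dx = x[1] - x[0]
      mu = -x                    # toy Ornstein-Uhlenbeck drift (assumption)
      sig2 = np.ones(N)          # toy constant diffusion (assumption)
      p = np.exp(-(x - 1.0) ** 2)
      p /= p.sum() * dx          # initial density with unit mass

      for _ in range(steps):
          q = sig2 * p
          mu_f = 0.5 * (mu[:-1] + mu[1:])                # drift at interfaces
          adv = np.where(mu_f > 0, mu_f * p[:-1], mu_f * p[1:])  # upwind
          dif = -0.5 * (q[1:] - q[:-1]) / dx             # diffusion flux
          F = adv + dif                                  # total interface flux
          p[1:-1] += dt * (F[:-1] - F[1:]) / dx          # flux differences...
          p[0]    += dt * (-F[0]) / dx                   # ...telescope, with
          p[-1]   += dt * (F[-1]) / dx                   # zero-flux boundaries

      print(p.sum() * dx)        # stays 1 up to round-off: mass is conserved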
  13. By: Martin Distekamp (GWS - Institute of Economic Structures Research); Mark Meyer (GWS - Institute of Economic Structures Research)
    Abstract: Though the sustainability research community as well as international decision makers seem to share the conviction that, akin to the challenges of climate policy, a great transition will also be needed to decouple human wellbeing from resource use over the next decades, there exist only scarce quantitative assessments of possible transition scenarios that address this matter. Our paper is intended to advance this branch of research by presenting key scenario insights from the global simulation model GINFORS which take account of the complex interrelations between different environmental objectives. Whereas a multitude of publications have already applied various MRIO databases to ex post assessments of resource-related national footprint indicators, ex ante assessments of possible transition scenarios remain scarce. The modelling framework of GINFORS also rests on an MRIO database. Thus, GINFORS is able to map quantitative indicators of material extractions embedded in regional consumption activities over the global supply chain.
    Keywords: raw material consumption, RMC, raw material input, RMI, CO2 emissions, macro-econometric model, GINFORS, MRIO, WIOD, policy simulations, resource-efficiency, low-carbon economy
    JEL: Q34 Q37 Q51 Q56
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:gws:dpaper:16-10&r=cmp
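    The footprint accounting that MRIO databases support is ordinary Leontief algebra; a two-sector toy version (all numbers invented, and GINFORS itself is far richer) reads:

      import numpy as np

      A = np.array([[0.10, 0.20],     # technical coefficients: inputs of
                    [0.30, 0.05]])    # sector i per unit output of sector j
      e = np.array([2.0, 0.5])        # material extraction per unit output
      y = np.array([100.0, 50.0])     # a region's final demand bundle

      x = np.linalg.solve(np.eye(2) - A, y)  # gross output x = (I - A)^-1 y
      print(e @ x)   # material footprint (RMC) embodied in the demand y

    In a multi-regional table the same formula runs over the block-structured world matrices, which is how consumption-based indicators are traced over the global supply chain.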

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.