nep-cmp New Economics Papers
on Computational Economics
Issue of 2015‒05‒30
thirteen papers chosen by

  1. Self-organizing map analysis of agents’ expectations. Different patterns of anticipation of the 2008 financial crisis By Oscar Claveria; Enric Monte; Salvador Torra
  2. Liberalization of trade flows under TTIP from a small country perspective. The case of Poland By Jan Hagemejer
  3. An analytic recursive method for optimal multiple stopping: Canadization and phase-type fitting By Tim Leung; Kazutoshi Yamazaki; Hongzhong Zhang
  4. An Agent-based Model for Financial Vulnerability By Rick Bookstaber; Mark Paddrik; Brian Tivnan
  5. Using support vector machines for measuring democracy By Gründler, Klaus; Krieger, Tommy
  6. A Reformulation of Normative Economics for Models with Endogenous Preferences By Vipul Bhatt; Masao Ogaki; Yuichi Yaguchi
  7. Effects of Limit Order Book Information Level on Market Stability Metrics By Mark Paddrik; Roy Hayes; William Scherer; Peter Beling
  8. Worst-Case Approach To Strategic Optimal Portfolio Selection Under Transaction Costs And Trading Limits By Nikolay A. Andreev
  9. Modelling a Market Stability Reserve in Carbon Markets By Anne Schopp; William Acworth; Daniel Huppmann; Karsten Neuhoff
  10. GECO2015 Global Energy and Climate Outlook: Road to Paris. Assessment of Low Emission Levels under World Action Integrating National Contributions By Ariane Labat; Alban Kitous; Miles Perry; Bert Saveyn; Toon Vandyck; Zoi Vrontisi
  11. Prediction of air pollution peaks generated by urban transport networks By Bell, Margaret; Bergantino, Angela S.; Catalano, Mario; Galatioto, Fabio
  12. How Institutional Arrangements in the National Innovation System Affect Industrial Competitiveness: A study of Japan and the United States with multiagent simulation By KWON Seokbeom; MOTOHASHI Kazuyuki
  13. Contract as Automation: The Computational Representation of Financial Agreements By Mark D. Flood; Oliver R. Goodenough

  1. By: Oscar Claveria (Department of Econometrics. University of Barcelona); Enric Monte (Department of Signal Theory and Communications. Polytechnic University of Catalunya.); Salvador Torra (Department of Econometrics & Riskcenter-IREA. Universitat de Barcelona)
    Abstract: By means of Self-Organizing Maps we cluster fourteen European countries according to the most suitable way to model their agents’ expectations. Using the financial crisis of 2008 as a benchmark, we distinguish between those countries that show a progressive anticipation of the crisis and those where sudden changes in expectations occur. By mapping the trajectory of economic experts’ expectations prior to the recession we find that when there are brisk changes in expectations before impending shocks, Artificial Neural Networks are more suitable than time series models for modelling expectations. Conversely, in countries where expectations show a smooth transition towards recession, ARIMA models show the best forecasting performance. This result demonstrates the usefulness of clustering techniques for selecting the most appropriate method to model and forecast expectations according to their behaviour.
    Keywords: Business surveys; Self-Organizing Maps; Clustering; Forecasting; Neural networks; Time series models; Nonlinear models
    JEL: C02 C22 C45 C63 E27
    Date: 2015–03
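The Self-Organizing Map technique used in the paper above can be illustrated with a minimal sketch: a hypothetical one-dimensional map trained on scalar data in pure Python. The function names, training schedule, and parameters here are our own illustrative choices, not the authors' implementation or data:

```python
import math
import random

def train_som_1d(data, n_units=4, epochs=50, lr0=0.5, radius0=2.0, seed=0):
    """Train a minimal one-dimensional Self-Organizing Map on scalar data.

    Each unit stores one weight; for every input, the best-matching unit
    (BMU) and its neighbours move towards the input, so neighbouring
    units end up representing neighbouring regions of the data.
    """
    rng = random.Random(seed)
    lo, hi = min(data), max(data)
    weights = [rng.uniform(lo, hi) for _ in range(n_units)]
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                    # decaying learning rate
        radius = max(radius0 * (1 - epoch / epochs), 0.5)  # shrinking neighbourhood
        for x in data:
            bmu = min(range(n_units), key=lambda i: abs(x - weights[i]))
            for i in range(n_units):
                h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
                weights[i] += lr * h * (x - weights[i])
    return weights

def cluster(data, weights):
    """Assign each observation to its best-matching unit."""
    return [min(range(len(weights)), key=lambda i: abs(x - weights[i]))
            for x in data]
```

The decaying learning rate and shrinking neighbourhood are the standard Kohonen training schedule; clustering countries then amounts to reading off each observation's best-matching unit.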
  2. By: Jan Hagemejer (Faculty of Economic Sciences, University of Warsaw; National Bank of Poland)
    Abstract: Empirical ex-ante evaluations of the Transatlantic Trade and Investment Partnership (TTIP) are similar in the aggregate but suggest large heterogeneity of the TTIP impact at the individual country level. We aim to provide a comprehensive evaluation of the possible TTIP effects for the economy of Poland using a computable general equilibrium model. In our simulation scenarios we use estimates of non-tariff barriers (NTBs) that allow us to differentiate the impact of NTBs on the trade of Poland, the aggregate of the remaining new member states, Germany (Poland's largest trading partner), and the rest of the EU-15. We show that from the point of view of a small country such as Poland, where most international trade is concentrated on exchange with one or a few neighbouring partners, simultaneous trade liberalization with a third partner will not bring sizeable gains to its economy. We observe US-EU15 trade expansion crowding out some of the trade in the most important Polish trading sectors, such as chemicals and motor vehicles. An unfavorable change in the terms of trade keeps the gains from trade small, while some sectors reduce output by a considerable amount.
    Keywords: TTIP, trade liberalization, computable general equilibrium, Poland
    JEL: F13 C68 D58
    Date: 2015
  3. By: Tim Leung; Kazutoshi Yamazaki; Hongzhong Zhang
    Abstract: We study an optimal multiple stopping problem for call-type payoff driven by a spectrally negative Lévy process. The stopping times are separated by constant refraction times, and the discount rate can be positive or negative. The computation involves a distribution of the Lévy process at a constant horizon and hence the solutions in general cannot be attained analytically. Motivated by the maturity randomization (Canadization) technique by Carr (1998), we approximate the refraction times by independent, identically distributed Erlang random variables. In addition, fitting random jumps to phase-type distributions, our method involves repeated integrations with respect to the resolvent measure written in terms of the scale function of the underlying Lévy process. We derive a recursive algorithm to compute the value function in closed form, and sequentially determine the optimal exercise thresholds. A series of numerical examples is provided to compare our analytic formula to results from Monte Carlo simulation.
    Date: 2015–05
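The maturity-randomization step described above, replacing a constant refraction time by an Erlang random variable, rests on the fact that an Erlang(n, n/t) variable has mean t and variance t²/n, so it concentrates on the constant horizon as n grows. A quick simulation sketch (illustrative only; the function names are ours, not the authors'):

```python
import random

def erlang_sample(n, rate, rng):
    """Draw an Erlang(n, rate) variable as a sum of n exponentials."""
    return sum(rng.expovariate(rate) for _ in range(n))

def erlang_moments(n, t, draws=20000, seed=1):
    """Empirical mean and variance of Erlang(n, n/t), the randomized
    stand-in for the constant refraction time t."""
    rng = random.Random(seed)
    xs = [erlang_sample(n, n / t, rng) for _ in range(draws)]
    mean = sum(xs) / draws
    var = sum((x - mean) ** 2 for x in xs) / draws
    return mean, var
```

With n = 50 and t = 1 the empirical variance is close to t²/n = 0.02; the authors' scheme additionally exploits the resulting phase-type structure to obtain closed-form recursions.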
  4. By: Rick Bookstaber (Office of Financial Research); Mark Paddrik (Office of Financial Research); Brian Tivnan (MITRE Corporation)
    Abstract: This paper describes an agent-based model for analyzing the vulnerability of the financial system to asset- and funding-based fire sales. The model views the dynamic interactions of agents in the financial system extending from the suppliers of funding through the intermediation and transformation functions of the bank/dealer to the financial institutions that use the funds to trade in the asset markets, and that pass collateral in the opposite direction. The model focuses on the intermediation functions of the bank/dealers in order to trace the path of shocks that come from sudden price declines, as well as shocks that come from the various agents, namely funding restrictions imposed by the cash providers, erosion of the credit of the bank/dealers, and investor redemptions by the buy-side financial institutions. The model demonstrates that it is the reaction to initial losses rather than the losses themselves that determine the extent of a crisis. By building on a detailed mapping of the transformations and dynamics of the financial system, the agent-based model provides an avenue toward risk management that can illuminate the pathways for the propagation of key crisis dynamics such as fire sales and funding runs.
    Keywords: Agent-based model, Financial Vulnerability
    Date: 2014–07–29
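The core fire-sale mechanism the abstract describes, initial losses forcing sales that in turn depress prices and force further sales, can be sketched as a simple deleveraging loop. This is an illustrative toy with made-up parameter names, not the OFR model:

```python
def fire_sale_spiral(price, shares, debt, target_leverage, impact, rounds=20):
    """Deleveraging spiral: after a shock, a leveraged investor sells
    assets to restore its leverage target; each sale depresses the
    price through a linear market-impact term, forcing further sales.
    Returns the final price and total shares sold."""
    sold_total = 0.0
    for _ in range(rounds):
        assets = price * shares
        equity = assets - debt
        if equity <= 0:                      # insolvent: spiral ends
            break
        if assets / equity <= target_leverage:
            break                            # back at target: no sale needed
        # Selling s of assets and repaying debt leaves equity unchanged,
        # so (assets - s) / equity = target gives s directly.
        sell_value = assets - target_leverage * equity
        shares_sold = sell_value / price
        shares -= shares_sold
        debt -= sell_value
        sold_total += shares_sold
        price *= (1 - impact * shares_sold)  # linear price impact
    return price, sold_total
```

Even this toy reproduces the paper's central observation: the damage is driven by the reaction to the initial loss (the forced sales and their price impact), not by the loss itself.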
  5. By: Gründler, Klaus; Krieger, Tommy
    Abstract: We present a novel approach for measuring democracy that yields a very detailed and sensitive index. The method is based on Support Vector Machines, a mathematical algorithm for pattern recognition. Our implementation evaluates 188 countries in the period between 1981 and 2011. The Support Vector Machines Democracy Index (SVMDI) is continuous on the [0,1] interval and robust to variations in the numerical process parameters. The algorithm introduced here can be used for any concept of democracy without additional adjustments, and due to its flexibility it is also a valuable tool for comparative studies.
    Keywords: Democracy, Support Vector Machines, Democracy Index
    JEL: C43 C65 C82 H11 P16
    Date: 2015
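A soft-margin linear Support Vector Machine of the general kind underlying the SVMDI can be sketched with stochastic subgradient descent on the hinge loss. This is a generic illustration, not the authors' pipeline; the features, labels, and hyperparameters are placeholders:

```python
import random

def train_linear_svm(X, y, epochs=200, lr=0.05, lam=0.01, seed=0):
    """Fit a soft-margin linear SVM by stochastic subgradient descent
    on the regularised hinge loss; labels must be +1 or -1."""
    rng = random.Random(seed)
    dim = len(X[0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        order = list(range(len(X)))
        rng.shuffle(order)
        for i in order:
            margin = y[i] * (sum(wj * xj for wj, xj in zip(w, X[i])) + b)
            for j in range(dim):
                # Regularisation always shrinks w; the hinge term is
                # active only when the margin constraint is violated.
                hinge = y[i] * X[i][j] if margin < 1 else 0.0
                w[j] -= lr * (lam * w[j] - hinge)
            if margin < 1:
                b += lr * y[i]
    return w, b

def predict(w, b, x):
    """Classify x by the sign of the decision function."""
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1
```

The paper's continuous index would additionally use the distance to the separating hyperplane rather than only its sign.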
  6. By: Vipul Bhatt (James Madison University); Masao Ogaki (Keio University); Yuichi Yaguchi (Chuo University)
    Abstract: This paper proposes a framework to balance considerations of welfarism and virtue ethics in the normative analysis of economic models with endogenous preferences. We introduce the moral evaluation function (MEF), which ranks alternatives based purely on virtue ethics, and define the social objective function (SOF), which combines the Social Welfare Function (SWF) and the MEF. In a model of intergenerational altruism with endogenous time preference, using numerical simulations we show that maximizing the SWF may not yield a socially desirable state if the society values virtue. This problem can be resolved by using the SOF to evaluate alternative social states.
    Date: 2015–02
  7. By: Mark Paddrik (Office of Financial Research); Roy Hayes (University of Virginia); William Scherer (University of Virginia); Peter Beling (University of Virginia)
    Abstract: Using an agent-based model of the limit order book, we explore how the levels of information available to participants, exchanges, and regulators can be used to improve our understanding of the stability and resiliency of a market. Ultimately, we want to know if electronic market data contains previously undetected information that could allow us to better assess market stability. Using data produced in the controlled environment of an agent-based model's limit order book, we examine various resiliency indicators to determine their predictive capabilities. Most of the types of data created have traditionally been available either publicly or on a restricted basis to regulators and exchanges, but other types have never been collected. We confirmed our findings using actual order flow data with user identifications included from the CME (Chicago Mercantile Exchange) and New York Mercantile Exchange (NYMEX). Our findings strongly suggest that high-fidelity microstructure data in combination with price data can be used to define stability indicators capable of reliably signaling a high likelihood for an imminent flash crash event about one minute before it occurs.
    Keywords: Limit Order Book, Market Stability
    Date: 2014–11–25
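The limit order book at the centre of this model is, as a data structure, a pair of price-time-priority queues. A minimal matching-engine sketch (illustrative only; unrelated to the CME/NYMEX data or the paper's agent logic):

```python
import heapq

class LimitOrderBook:
    """Minimal price-time-priority limit order book.

    Bids are kept in a max-heap and asks in a min-heap; an incoming
    order is matched against the opposite side while prices cross,
    and any unfilled remainder rests on the book.
    """
    def __init__(self):
        self._bids = []   # (-price, seq, qty): max price first
        self._asks = []   # (price, seq, qty): min price first
        self._seq = 0     # time-priority tie-breaker

    def submit(self, side, price, qty):
        """Add a limit order; returns a list of (price, qty) fills."""
        fills = []
        if side == "buy":
            while qty > 0 and self._asks and self._asks[0][0] <= price:
                ask_price, seq, ask_qty = heapq.heappop(self._asks)
                traded = min(qty, ask_qty)
                fills.append((ask_price, traded))
                qty -= traded
                if ask_qty > traded:  # partial fill: remainder stays
                    heapq.heappush(self._asks, (ask_price, seq, ask_qty - traded))
            if qty > 0:               # rest on the book
                self._seq += 1
                heapq.heappush(self._bids, (-price, self._seq, qty))
        else:
            while qty > 0 and self._bids and -self._bids[0][0] >= price:
                neg_bid, seq, bid_qty = heapq.heappop(self._bids)
                traded = min(qty, bid_qty)
                fills.append((-neg_bid, traded))
                qty -= traded
                if bid_qty > traded:
                    heapq.heappush(self._bids, (neg_bid, seq, bid_qty - traded))
            if qty > 0:
                self._seq += 1
                heapq.heappush(self._asks, (price, self._seq, qty))
        return fills

    def best_bid(self):
        return -self._bids[0][0] if self._bids else None

    def best_ask(self):
        return self._asks[0][0] if self._asks else None
```

The stability indicators the paper studies are computed over exactly this kind of state (depth, imbalance, resting order flow), at varying levels of visibility.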
  8. By: Nikolay A. Andreev (National Research University Higher School of Economics)
    Abstract: We study a worst-case scenario approach to the problem of strategic portfolio selection in the presence of transaction costs and trading limits under an uncertain stochastic process of market parameters. Unlike classic stochastic programming, the approach is model-free; a solution of the arising Bellman-Isaacs equation can easily be found numerically under some general assumptions. All results hold for a general class of utility functions and several risky assets. For the special case of proportional transaction costs and CRRA utility, we present a numerical scheme which allows us to reduce the dimension of the Bellman-Isaacs equation by the number of risky assets.
    Keywords: portfolio selection, Bellman equation, stochastic dynamic programming, transaction costs, worst-case scenario
    JEL: C61 C63 G11
    Date: 2015
  9. By: Anne Schopp; William Acworth; Daniel Huppmann; Karsten Neuhoff
    Abstract: We examine under which conditions a cap-and-trade mechanism can deliver a dynamically efficient abatement pathway and contribute to a robust investment framework. For this we develop a numerical dynamic partial-equilibrium model that includes differentiated objective functions of different market participants for holding emission allowances based on their banking strategy. If the surplus of allowances is large, as currently observed in the European Union Emissions Trading System, the equilibrium market outcome can deviate from an efficient abatement pathway and performance of the policy is reduced against a set of key criteria (dynamic efficiency, price credibility, price consistency, and robustness to shocks). The model is applied to assess design options of quantity and price based market stability reserves as discussed in Europe. Both price and quantity based mechanisms can improve the performance of the EU ETS against key criteria.
    Keywords: Computational Model, Emissions trading, Environmental Regulation, Market stability reserve
    JEL: D84 G18 Q48
    Date: 2015
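A quantity-based market stability reserve of the kind assessed in the paper follows a simple rule: withdraw allowances into the reserve when the surplus in circulation is above an upper threshold, and release them back when it falls below a lower one. A sketch with parameters loosely modelled on the EU MSR design (the thresholds and rates here are illustrative, not the paper's calibration):

```python
def msr_step(in_circulation, reserve, upper=833.0, lower=400.0,
             intake_rate=0.24, release=100.0):
    """One yearly adjustment of a quantity-based market stability
    reserve (figures in million allowances). Returns the updated
    (in_circulation, reserve) pair."""
    if in_circulation > upper:
        intake = intake_rate * in_circulation   # withdraw into the reserve
        return in_circulation - intake, reserve + intake
    if in_circulation < lower and reserve > 0:
        out = min(release, reserve)             # release back to the market
        return in_circulation + out, reserve - out
    return in_circulation, reserve
```

A price-based reserve would instead condition the intake and release on the allowance price crossing trigger levels; the paper compares both designs against its efficiency and credibility criteria.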
  10. By: Ariane Labat (DG CLIMA, European Commission); Alban Kitous (JRC IPTS, European Commission); Miles Perry (DG CLIMA, European Commission); Bert Saveyn (JRC IPTS, European Commission); Toon Vandyck (JRC IPTS, European Commission); Zoi Vrontisi (JRC IPTS, European Commission)
    Abstract: This report presents the modelling work quoted in the EC communication “The Paris Protocol – a blueprint for tackling global climate change beyond 2020” in the EU’s Energy Union package. It examines the effects of a Baseline scenario where current trends continue beyond 2020, and of a Global Mitigation scenario in line with keeping global warming below 2°C. The analysis uses the POLES and GEM-E3 models in a framework where economic welfare is maximised while tackling climate change. In the Baseline, emissions trigger +3.5°C global warming. In the Global Mitigation scenario, all regions realise domestic emission cuts to stay below 2°C, with various profiles in 2020-2050 depending on their national characteristics. A significant transformation of the energy systems and non-energy measures enable regions at all levels of income to move to a low-emission growth pathway. Sectors linked (directly or indirectly) to carbon-intensive processes adjust their investments to be competitive in a low-emission environment. A significant number of regions draw economic benefits from shifting their expenditures on fossil energy imports to investments. GDP growth rates are marginally affected in most regions by global efforts to reduce emissions. Crucially, high growth rates are maintained in fast-growing low-income regions. Economic costs are reduced further when countries use emission permit auction revenues for other tax reductions. Delaying the actions needed to stay below 2°C adds large economic costs.
    Keywords: GHG emissions, climate mitigation, international negotiations, COP21, IPCC, UNFCCC, modelling, GEM-E3, POLES, Road to Paris
    JEL: C68 D58 Q40 Q54
    Date: 2015–04
  11. By: Bell, Margaret; Bergantino, Angela S.; Catalano, Mario; Galatioto, Fabio
    Abstract: This paper presents the first results of ongoing research into novel methods for analysing and simulating the relationship between transport-related air pollutant concentrations and easily accessible explanatory variables. The ultimate aim is to integrate the new models into traditional traffic management decision-support systems for sustainable mobility of road vehicles in urban areas. This first stage concerns the relationship between the mean hourly concentration of nitrogen dioxide and explanatory factors such as traffic and weather conditions, with particular reference to the prediction of pollution peaks, defined as exceedances of normative concentration limits. Two modelling frameworks are explored: the Artificial Neural Network approach and the ARIMAX model. Furthermore, the benefit of a synergic use of both models for air quality forecasting is investigated. The findings indicate that the prediction of extreme pollutant concentrations is best performed by integrating the two models into an ensemble. The neural network is outperformed by the ARIMAX model in forecasting peaks, but gives a more realistic representation of the relationship between concentration and wind characteristics, so it can be used to guide the ARIMAX model specification. Finally, the study shows that the ability to forecast exceedances of regulatory pollution limits can be enhanced by triggering traffic management actions when the predicted concentration exceeds a threshold that is high but still lower than the normative limit.
    Date: 2015
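The paper's final point, triggering traffic management when the forecast crosses an alarm threshold set below the regulatory limit, is a simple decision rule that can be sketched directly. The function names and the safety factor are our own illustrative choices:

```python
def exceedance_alarm(predictions, limit, safety_factor=0.8):
    """Flag periods whose predicted concentration crosses an alarm
    threshold set below the regulatory limit, trading extra false
    alarms for fewer missed exceedances."""
    threshold = safety_factor * limit
    return [p >= threshold for p in predictions]

def detection_scores(alarms, actual, limit):
    """Hit rate and false-alarm count of the alarms against the
    exceedances actually observed."""
    exceed = [a >= limit for a in actual]
    hits = sum(al and ex for al, ex in zip(alarms, exceed))
    misses = sum(ex and not al for al, ex in zip(alarms, exceed))
    false_alarms = sum(al and not ex for al, ex in zip(alarms, exceed))
    hit_rate = hits / (hits + misses) if hits + misses else 1.0
    return hit_rate, false_alarms
```

Lowering the safety factor raises the hit rate at the cost of more false alarms; the paper's contribution is choosing that trade-off with an ensemble forecast rather than a single model.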
  12. By: KWON Seokbeom; MOTOHASHI Kazuyuki
    Abstract: The Japanese national innovation system (JP NIS) and that of the United States (U.S. NIS) differ. One of the differences is that firms in the JP NIS are likely to collaborate with historical partners for the purpose of innovation or rely on in-house research and development (R&D), approaches that form a "relationship-driven innovation system." In the U.S. NIS, however, firms have a relatively weak reliance on prior partnerships or internal R&D and are likely to seek entities that know about the necessary technology. Thus, U.S. players acquire technologies through market transactions such as mergers and acquisitions (M&A). This paper primarily discusses how this institutional difference affects country-specific industrial sector specialization. Then, by using a multiagent model of the NIS and conducting simulations, we examine what strategy would help Japanese firms in industries dominated by radical innovation. The results show that the JP NIS provides an institutional advantage in industries with fast-changing consumer demand that require incremental innovation. However, the U.S. NIS benefits industries that require frequent radical innovation. Our analysis reveals that extending the partnership network while keeping internal R&D capability would be a beneficial strategy for Japanese firms in industries driven by radical innovation. Therefore, the present research suggests that policymakers need to differentiate policies that emphasize business relationships or market mechanisms according to industry characteristics in order to improve overall national industrial competitiveness. At the same time, Japanese firms need to strengthen their R&D capability while trying to extend their pool of technology partners in order to improve the flexibility of their responses to radical changes in an industry.
    Date: 2015–05
  13. By: Mark D. Flood (Office of Financial Research); Oliver R. Goodenough (Office of Financial Research)
    Abstract: We show that the fundamental legal structure of a well-written financial contract follows a state-transition logic that can be formalized mathematically as a finite-state machine (also known as a finite-state automaton). The automaton defines the states that a financial relationship can be in, such as “default,” “delinquency,” “performing,” etc., and it defines an “alphabet” of events that can trigger state transitions, such as “payment arrives,” “due date passes,” etc. The core of a contract describes the rules by which different sequences of event arrivals trigger particular sequences of state transitions in the relationship between the counterparties. By conceptualizing and representing the legal structure of a contract in this way, we expose it to a range of powerful tools and results from the theory of computation. These allow, for example, automated reasoning to determine whether a contract is internally coherent and whether it is complete relative to a particular event alphabet. We illustrate the process by representing a simple loan agreement as an automaton.
    Keywords: Financial Contracts, Contract Automation
    Date: 2015–03–26
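The loan-agreement automaton the abstract describes can be written down directly as a transition table plus a replay function. The states and events below come from the abstract; the specific transitions are an illustrative guess at a simple loan agreement, not the paper's worked example:

```python
# Transition table: (state, event) -> next state.
LOAN_FSM = {
    ("performing", "payment arrives"): "performing",
    ("performing", "due date passes"): "delinquency",
    ("delinquency", "payment arrives"): "performing",
    ("delinquency", "grace period expires"): "default",
}

def run_contract(events, state="performing"):
    """Replay an event sequence through the automaton.

    A (state, event) pair missing from the table means the contract is
    incomplete for that event alphabet -- exactly the kind of gap that
    automated reasoning over the automaton can detect."""
    for event in events:
        key = (state, event)
        if key not in LOAN_FSM:
            raise ValueError("contract has no rule for %r" % (key,))
        state = LOAN_FSM[key]
    return state
```

Completeness relative to an event alphabet then reduces to checking that every (state, event) pair has an entry, and coherence to checking that every entry maps to a defined state.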

General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.