nep-gth New Economics Papers
on Game Theory
Issue of 2019‒05‒06
fourteen papers chosen by
Sylvain Béal
Université de Franche-Comté

  1. Quasiseparable aggregation in games with common local utilities By Kukushkin, Nikolai
  2. Compactification of Extensive Form Games and Belief in the Opponents' Future Rationality By Shuige Liu
  3. The Declining Price Anomaly is not Universal in Multi-Buyer Sequential Auctions (but almost is) By Vishnu V. Narayan; Enguerrand Prebet; Adrian Vetta
  4. The Optimal Sequence of Prices and Auctions By Zhang, Hanzhe
  5. Mean Field Equilibrium: Uniqueness, Existence, and Comparative Statics By Light, Bar; Weintraub, Gabriel
  6. Efficiency in Truthful Auctions via a Social Network By Seiji Takanashi; Takehiro Kawasaki; Taiki Todo; Makoto Yokoo
  7. Financial Contracts as Coordination Device By Le Coq, Chloe; Schwenen, Sebastian
  8. A strategic tax mechanism By Stamatopoulos, Giorgos
  9. Aligning profit taxation with value creation By Wolfram F. Richter
  10. Of course Collusion Should be Prosecuted. But Maybe... Or (The case for international antitrust agreements) By Filomena Garcia; Jose Manuel Paz y Miño; Gustavo Torrens
  11. Causally Driven Incremental Multi Touch Attribution Using a Recurrent Neural Network By Du, Ruihuan; Zhong, Yu; Nair, Harikesh S.; Cui, Bo; Shou, Ruyang
  12. A Game Theoretic Setting of Capitation Versus Fee-For-Service Payment Systems By Allison Koenecke
  13. Distributive justice and social conflict in an AK model By Chris Tsoukis; Jun-ichi Itaya
  14. Decomposition of intra-household disparity sensitive fuzzy multi-dimensional poverty index: A study of vulnerability through Machine Learning By Sen, Sugata

  1. By: Kukushkin, Nikolai
    Abstract: Strategic games are considered where each player's total utility is an aggregate of local utilities obtained from the use of certain "facilities." All players using a facility obtain the same utility therefrom, which may depend on the identities of users and on their behavior. Individual improvements in such a game are acyclic if a "trimness" condition is satisfied by every facility and all aggregation rules are consistent with a separable ordering. Those conditions are satisfied, for instance, by bottleneck congestion games with an infinite set of facilities. Under appropriate additional assumptions, the existence of a Nash equilibrium is established.
    Keywords: Bottleneck congestion game; Game with structured utilities; Potential game; Aggregation; Separable ordering
    JEL: C72
    Date: 2019–04–30
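As a toy illustration of the acyclicity of individual improvements that this abstract generalizes, the sketch below runs unilateral improvement dynamics in a small Rosenthal-style congestion game until a Nash equilibrium is reached. The two facilities and their delay numbers are hypothetical, and this finite two-facility setting is far simpler than the paper's bottleneck games with infinitely many facilities.

```python
# Toy congestion game: three players each pick one of two facilities, and
# a facility's delay rises with its load.  Finite congestion games admit a
# potential (Rosenthal 1973), so every sequence of unilateral improvements
# terminates in a Nash equilibrium.  Delay numbers are hypothetical.

DELAYS = {"A": [1, 3, 6], "B": [2, 4, 5]}   # delay by load (1, 2, 3 users)

def cost(route, load):
    return DELAYS[route][load - 1]

def load(profile, route):
    return sum(1 for r in profile if r == route)

def best_response_dynamics(profile):
    profile = list(profile)
    improved = True
    while improved:
        improved = False
        for i, r in enumerate(profile):
            other = "B" if r == "A" else "A"
            current = cost(r, load(profile, r))
            deviation = cost(other, load(profile, other) + 1)
            if deviation < current:        # strict unilateral improvement
                profile[i] = other
                improved = True
    return tuple(profile)

equilibrium = best_response_dynamics(("A", "A", "A"))
```

In this instance the dynamics stop with one player on facility B and two on facility A, where no unilateral switch lowers anyone's delay.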
  2. By: Shuige Liu
    Abstract: We introduce an operation, called compactification, that reduces an extensive form to a compact one in which each decision node in the game tree can be assigned to more than one player. Motivated by Thompson's (1952) interchange of decision nodes, we attempt to capture the notion of a faithful representation of the chronological order of moves in dynamic games, which plays an important role in fields like epistemic game theory. The compactification process preserves perfect recall and the unambiguity of the order among information sets. We specify an algorithm, called the leaf-to-root process, which compactifies at least as many information sets as any other compactification process. The compact extensive form provides an approach to avoiding problems in dynamic game theory that stem from a vague definition of the chronological order of moves; for example, belief in the opponents' future rationality (Perea (2014)) is sensitive to the specific extensive form representation. We show that a strategy can rationally be chosen under common belief in future rationality in a minimal compact game if and only if it can be so chosen in every extensive form game related to it via some compactification process.
    Date: 2019–05
  3. By: Vishnu V. Narayan; Enguerrand Prebet; Adrian Vetta
    Abstract: The declining price anomaly states that the price weakly decreases when multiple copies of an item are sold sequentially over time. The anomaly has been observed in a plethora of practical applications. On the theoretical side, Gale and Stegeman proved that the anomaly is guaranteed to hold in full information sequential auctions with exactly two buyers. We prove that the declining price anomaly is not guaranteed in full information sequential auctions with three or more buyers. This result applies to both first-price and second-price sequential auctions. Moreover, it applies regardless of the tie-breaking rule used to generate equilibria in these sequential auctions. To prove this result we provide a refined treatment of subgame perfect equilibria that survive the iterative deletion of weakly dominated strategies and use this framework to experimentally generate a very large number of random sequential auction instances. In particular, our experiments produce an instance with three bidders and eight items that, for a specific tie-breaking rule, induces a non-monotonic price trajectory. Theoretical analysis then shows that this instance can be used to prove that for every possible tie-breaking rule there is a sequential auction on which it induces a non-monotonic price trajectory. On the other hand, our experiments show that non-monotonic price trajectories are extremely rare: in over six million experiments, only a 0.000183 fraction of the instances violated the declining price anomaly.
    Date: 2019–05
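The refinement the abstract relies on, equilibria surviving iterated deletion of weakly dominated strategies, can be sketched in normal form as follows. The 2x2 payoffs are hypothetical (not drawn from the paper), and the deletion order is fixed by the loop; in general weak-dominance deletion is order-dependent, which is one reason the paper needs a refined treatment.

```python
# Iterated deletion of weakly dominated strategies in a two-player game.
# Hypothetical payoffs; a sketch of the refinement, not the paper's code.

def find_weakly_dominated(payoff, own, opp):
    """Return one of `own` weakly dominated by another surviving strategy."""
    for r in own:
        for s in own:
            if s != r and \
               all(payoff[s][c] >= payoff[r][c] for c in opp) and \
               any(payoff[s][c] > payoff[r][c] for c in opp):
                return r
    return None

def iterated_deletion(pay1, pay2):
    # pay1[row][col] for player 1, pay2[col][row] for player 2
    rows, cols = set(pay1), set(pay2)
    while True:
        r = find_weakly_dominated(pay1, rows, cols)
        if r is not None:
            rows.discard(r)
            continue
        c = find_weakly_dominated(pay2, cols, rows)
        if c is not None:
            cols.discard(c)
            continue
        return rows, cols

pay1 = {"T": {"L": 1, "R": 1}, "B": {"L": 0, "R": 1}}
pay2 = {"L": {"T": 1, "B": 0}, "R": {"T": 0, "B": 1}}
survivors = iterated_deletion(pay1, pay2)
```

In this toy game B is weakly dominated by T, after which R is dominated by L, so only the profile (T, L) survives.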
  4. By: Zhang, Hanzhe (Michigan State University, Department of Economics)
    Abstract: A seller chooses to either post a price or run a reserve-price auction each period to sell a good before a deadline. Buyers with independent private values arrive over time. Assume that an auction costs more to the seller than a posted price. For a wide range of auction costs, the profit-maximizing mechanism sequence is to post prices first and then to run auctions. The optimality of the prices-then-auctions mechanism sequence provides a new justification for the use of the buy-it-now selling format on eBay.
    Keywords: buy-it-now; posted price; reserve price auction
    JEL: D44
    Date: 2019–04–24
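The trade-off the abstract describes can be sketched with a small Monte Carlo: per-period revenue from a posted price versus a costlier second-price auction, for buyers with i.i.d. Uniform[0,1] values. The buyer count, price, and auction cost below are illustrative assumptions, and the static comparison ignores the paper's deadline and buyer-arrival dynamics.

```python
import random

# Monte Carlo sketch: per-period seller revenue from a posted price versus
# a second-price auction with a fixed running cost, for n buyers with
# i.i.d. Uniform[0,1] values.  All numbers are hypothetical.

def posted_price_revenue(p, n, trials, rng):
    total = 0.0
    for _ in range(trials):
        if any(rng.random() >= p for _ in range(n)):  # someone accepts p
            total += p
    return total / trials

def auction_revenue(cost, n, trials, rng):
    total = 0.0
    for _ in range(trials):
        vals = sorted(rng.random() for _ in range(n))
        total += vals[-2] - cost      # second-highest value, minus the cost
    return total / trials

rng = random.Random(0)
n, trials = 3, 20000
rev_price = posted_price_revenue(0.5, n, trials, rng)
rev_auction = auction_revenue(0.1, n, trials, rng)
```

With these numbers the posted price earns about 0.44 per period against roughly 0.40 for the auction net of its cost; shrinking the auction cost flips the ranking, which is the kind of margin an optimal mechanism sequence trades off.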
  5. By: Light, Bar (Graduate School of Business, Stanford University); Weintraub, Gabriel (Graduate School of Business, Stanford University)
    Abstract: The standard solution concept for stochastic games is Markov perfect equilibrium (MPE); however, its computation becomes intractable as the number of players increases. Instead, we consider mean field equilibrium (MFE), which has been popularized in the recent literature. MFE takes advantage of averaging effects in models with a large number of agents. We make three main contributions. First, our main result provides conditions that ensure the uniqueness of an MFE. Second, we generalize previous MFE existence results. Third, we provide general comparative statics results. We apply our results to dynamic oligopoly models and to heterogeneous agent macroeconomic models commonly used in previous work. We believe our uniqueness result is the first of its kind in the class of models we study.
    Date: 2018–10
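A minimal fixed-point sketch of a mean field equilibrium, in a static toy rather than the paper's dynamic stochastic games: agents best respond to the population mean action, and an MFE is a fixed point of the best-response map. The payoff function and parameter below are hypothetical; the map happens to be a contraction (modulus beta/2 < 1), a toy analogue of the kind of condition that delivers uniqueness.

```python
# Toy mean-field fixed point: a continuum of agents choose effort a in
# [0, 1], best responding to the population mean m; an MFE is a fixed
# point of the best-response map m -> BR(m).  Hypothetical payoffs.

def best_response(m, beta=0.5):
    # payoff u(a, m) = a*(1 + beta*m) - a**2  =>  BR(m) = (1 + beta*m)/2
    return (1.0 + beta * m) / 2.0

def mean_field_equilibrium(beta=0.5, tol=1e-10, max_iter=1000):
    # iterate the best-response map to its fixed point
    m = 0.0
    for _ in range(max_iter):
        m_new = best_response(m, beta)
        if abs(m_new - m) < tol:
            return m_new
        m = m_new
    return m

mfe = mean_field_equilibrium()
```

Here the iteration converges to the unique fixed point m = 2/3.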
  6. By: Seiji Takanashi; Takehiro Kawasaki; Taiki Todo; Makoto Yokoo
    Abstract: In this paper, we study efficiency in truthful auctions via a social network, where a seller can only spread the information of an auction to the buyers through the buyers' network. In single-item auctions, we show that no mechanism is strategy-proof, individually rational, efficient, and weakly budget balanced. In addition, we propose $\alpha$-APG mechanisms, a class of mechanisms that trade off efficiency against weak budget balance. In multi-item auctions, there already exists a strategy-proof mechanism for the case where each buyer needs only one item. However, we exhibit a counter-example to the strategy-proofness of this mechanism, and to the best of our knowledge, the question of finding a strategy-proof mechanism remains open. Assuming that all buyers have decreasing marginal utility, we propose a generalized APG mechanism that is strategy-proof and individually rational but not efficient. Importantly, we show that this mechanism achieves the largest efficiency measure among all strategy-proof mechanisms.
    Date: 2019–04
  7. By: Le Coq, Chloe (Stockholm Institute of Transition Economics); Schwenen, Sebastian (Technical University of Munich (School of Management))
    Abstract: We study the use of financial contracts as a bid-coordinating device in multi-unit uniform price auctions. Coordination is required whenever firms face a volunteer's dilemma in pricing strategies: one firm (the "volunteer") is needed to increase the market clearing price. Volunteering, however, is costly, as inframarginal suppliers sell their entire capacity whereas the volunteer only sells residual demand. We identify conditions under which signing financial contracts solves this dilemma. We test our framework using data on contract positions of large producers in the New York power market. Using a Monte Carlo simulation, we show that the contracting strategy is payoff dominant and provide estimates of the benefits of such a strategy.
    Keywords: Auctions; Coordination; Volunteer's dilemma; Forward markets; Power market
    JEL: D21 D44 L41
    Date: 2019–04–24
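The volunteer's dilemma the abstract invokes has a standard symmetric mixed-strategy equilibrium, computed below for a generic benefit b and volunteering cost c with 0 < c < b. This is a textbook calculation, not the paper's auction model, where "volunteering" means costly price-setting.

```python
# Symmetric mixed equilibrium of the n-player volunteer's dilemma
# (textbook computation with generic parameters, not the paper's model).
# Payoffs: everyone gets benefit b if at least one player volunteers;
# a volunteer additionally pays cost c, with 0 < c < b.

def volunteer_probability(n, b, c):
    # Indifference: b - c = b * (1 - q**(n-1)), where q is the chance
    # that any given other player abstains; so q = (c/b)**(1/(n-1)).
    q = (c / b) ** (1.0 / (n - 1))
    return 1.0 - q

def prob_no_volunteer(n, b, c):
    # equilibrium probability that nobody volunteers: (c/b)**(n/(n-1))
    return (1.0 - volunteer_probability(n, b, c)) ** n
```

Free-riding worsens with more players: the equilibrium probability that nobody volunteers rises toward c/b as n grows.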
  8. By: Stamatopoulos, Giorgos
    Abstract: We introduce a novel commodity tax mechanism in oligopolies that improves upon the standard tax policies. The government (i) announces an excise tax rate $\tau$ and (ii) auctions off a number of tax exemptions. Namely, it invites the firms in a market to acquire the right to be exempted from the excise tax. The highest bidders are exempted, paying the government their bids, while all other firms remain subject to $\tau$. Depending on the characteristics of the market, the mechanism we suggest has a number of desirable features. First, it allows the government to collect more revenues than the standard commodity tax policies (due to the competition among the firms to acquire the exemptions). Second, in markets where firms have an informational advantage over the government, the mechanism allows for information revelation (via the firms' bids in the auction). Third, it impedes collusive activities in the market, as the mechanism creates an artificial asymmetry among the firms, which hinders collusion. Lastly, the mechanism is voluntary: firms participate in the auction only if they wish, and are hence free to choose how they are taxed.
    Keywords: excise tax; tax exemption; auction; asymmetric information; collusion
    JEL: H21 H25 L1
    Date: 2019–05–01
  9. By: Wolfram F. Richter
    Abstract: The OECD seeks to align transfer pricing and profit taxation with value creation but fails to provide a clear definition of the latter. This paper argues that value creation requires international cooperation and that the profit tax base should therefore be allocated according to standards commonly considered as fair when distributing the surplus of cooperation. The claim that current rules of international profit taxation are aligned with value creation is rejected. If anything, the OECD's objective suggests a tax system in which profits are split between the involved jurisdictions. This result raises the question of implementation, which is discussed in some detail.
    Keywords: international corporate income taxation, intellectual property, value creation, Shapley value, profit splitting
    JEL: H25 F23 M48
    Date: 2019
  10. By: Filomena Garcia (Indiana University, & UECE); Jose Manuel Paz y Miño (Indiana University); Gustavo Torrens (Indiana University)
    Abstract: We study the incentives of competition authorities to prosecute collusive practices of domestic and foreign firms. For that purpose, we develop a model of multi-market contact between two firms that can engage in collusion in two countries. In each country, there is a competition authority with a mandate to maximize national welfare. Each competition authority decides its prosecution policy at the beginning of time and commits to it. In equilibrium, the ownership distribution of the firms (domestic versus foreign) affects prosecution policies. The country that does not own the firms prosecutes them as soon as information of collusion becomes available. By contrast, the country that owns the firms has an incentive to protect their profits in foreign markets by delaying prosecution. This strategic delay is valuable because it slows the spread of information that could trigger prosecution in the foreign country. Prosecution delays, however, are not optimal from the point of view of global welfare, a problem that could be solved through the integration of the competition authorities. The country of origin of the firms would nevertheless oppose integration. Finally, in a multi-industry setting, both countries delay prosecuting domestic firms, which again is not optimal from the point of view of global welfare. Moreover, in a multi-industry setting, both countries can be better off under integration.
    Keywords: Multi-market Collusion, Antitrust Policy, Strategic Prosecution, International Antitrust Agreements
    JEL: F23 F53 L41 K21
    Date: 2018–05
  11. By: Du, Ruihuan (?); Zhong, Yu (?); Nair, Harikesh S. (Stanford University Graduate School of Business); Cui, Bo (?); Shou, Ruyang (?)
    Abstract: This paper describes a practical system for Multi Touch Attribution (MTA) for use by a publisher of digital ads. We developed this system for, an eCommerce company, which is also a publisher of digital ads in China. The approach has two steps. The first step (“response modeling”) fits a user-level model for purchase of a product as a function of the user’s exposure to ads. The second (“credit allocation”) uses the fitted model to allocate the incremental part of the observed purchase due to advertising, to the ads the user is exposed to over the previous T days. To implement step one, we train a Recurrent Neural Network (RNN) on user-level conversion and exposure data. The RNN has the advantage of flexibly handling the sequential dependence in the data in a semi-parametric way. The specific RNN formulation we implement captures the impact of advertising intensity, timing, competition, and user-heterogeneity, which are known to be relevant to ad-response. To implement step two, we compute Shapley Values, which have the advantage of having axiomatic foundations and satisfying fairness considerations. The specific formulation of the Shapley Value we implement respects incrementality by allocating the overall incremental improvement in conversion to the exposed ads, while handling the sequence-dependence of exposures on the observed outcomes. The system is under production at, and scales to handle the high dimensionality of the problem on the platform (attribution of the orders of about 300M users, for roughly 160K brands, across 200+ ad-types, served about 80B ad-impressions over a typical 15-day period).
    Date: 2019–01
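Step two of the pipeline, Shapley-value credit allocation, can be sketched as below. The production system derives coalition values from RNN counterfactuals; here `lift` is a hand-coded, hypothetical conversion-lift function over two ad channels, and the exact enumeration of permutations would not scale to the dimensions the abstract reports.

```python
from itertools import permutations

# Shapley-value credit allocation across ad channels.  A sketch: the
# coalition value function `lift` is hypothetical, standing in for the
# RNN-based incremental-conversion counterfactuals of the real system.

def lift(coalition):
    # incremental conversion probability from a set of exposed channels
    values = {frozenset(): 0.0,
              frozenset({"search"}): 0.10,
              frozenset({"display"}): 0.04,
              frozenset({"search", "display"}): 0.12}
    return values[frozenset(coalition)]

def shapley(channels, value):
    # average each channel's marginal contribution over all arrival orders
    credit = {ch: 0.0 for ch in channels}
    orders = list(permutations(channels))
    for order in orders:
        seen = set()
        for ch in order:
            credit[ch] += value(seen | {ch}) - value(seen)
            seen.add(ch)
    return {ch: v / len(orders) for ch, v in credit.items()}

credit = shapley(["search", "display"], lift)
```

By the efficiency axiom the credits sum to the total incremental lift of the full exposure set (0.09 + 0.03 = 0.12 here), which is what "allocating the overall incremental improvement" requires.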
  12. By: Allison Koenecke
    Abstract: We aim to determine whether a game-theoretic model between an insurer and a healthcare practice yields a predictive equilibrium that incentivizes either player to deviate from a fee-for-service to capitation payment system. Using United States data from various primary care surveys, we find that non-extreme equilibria (i.e., shares of patients, or shares of patient visits, seen under a fee-for-service payment system) can be derived from a Stackelberg game if insurers award a non-linear bonus to practices based on performance. Overall, both insurers and practices can be incentivized to embrace capitation payments somewhat, but potentially at the expense of practice performance.
    Date: 2019–04
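The game structure can be sketched as a Stackelberg computation: the insurer leads by choosing a bonus rate, the practice follows with a capitation share, and the leader optimizes against the follower's best response. The payoff functions and grid below are invented for illustration (the paper calibrates to US primary care survey data); only the qualitative point is reproduced: with a non-linear bonus, the follower's optimum can be interior, i.e., a non-extreme mix of capitation and fee-for-service.

```python
# Stackelberg sketch with hypothetical payoffs (not the paper's model):
# the insurer (leader) picks a bonus rate t; the practice (follower)
# chooses the capitation share x in [0, 1] maximizing its payoff given t.

GRID = [i / 100.0 for i in range(101)]

def practice_payoff(x, t):
    # illustrative: capitation saves effort cost, and the non-linear
    # bonus t*x*(1-x) rewards an interior mix of payment systems
    return 1.0 - 0.5 * x + t * x * (1.0 - x)

def insurer_payoff(x, t):
    # illustrative: the insurer values capitation but pays the bonus
    return x - t * x * (1.0 - x)

def follower_best_response(t):
    return max(GRID, key=lambda x: practice_payoff(x, t))

def stackelberg():
    # leader optimizes anticipating the follower's best response
    t_star = max(GRID, key=lambda t: insurer_payoff(follower_best_response(t), t))
    return t_star, follower_best_response(t_star)

t_star, x_star = stackelberg()
```

Without a sufficiently strong bonus (e.g. t = 0.3 in this toy) the practice's best response is the corner x = 0, i.e., pure fee-for-service; at the leader's optimum the share is interior.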
  13. By: Chris Tsoukis; Jun-ichi Itaya
    Abstract: We introduce distributive justice into a simple model of growth and distribution. Two groups (‘classes’) of otherwise identical, capital-rich and capital-poor individuals (‘capitalists’) and (‘workers’) are in conflict over factor (labour-capital) shares. Capitalists’ (workers’) ideal labour share is low (high) – but always tempered by the recognition that everyone supplies one unit of labour inelastically and desires a wage; and that the labour share impacts growth negatively in our ‘AK’ production economy. Social conflict is defined as the difference between the ideal labour shares of the two classes. This conflict is resolved by the two positive and three normative criteria we consider. Thus, the macroeconomy (growth, factor shares, distribution), social conflict and the methods of its resolution are jointly determined in a complete socio-economic equilibrium. We believe both this approach and our rich set of results are novel. We consider two positive (probabilistic voting and Nash bargaining, encapsulating electoral politics and socio-political bargaining) and two normative (justice) criteria (utilitarian and Rawlsian) of conflict resolution. Greater impatience, intensified status comparisons and negative consumption externalities, greater wealth inequality and a decline in productivity exacerbate social conflict. Status comparisons and wealth inequality tend to raise the labour share under all positive and normative criteria. Finally, we propose and analyse a criterion of ‘justice as minimal social friction’. Under the plausible assumption that the capitalists’ overall socio-political influence (numerical strength aside) is at least as high as that of workers, all positive methods imply a smaller labour share and more inequality than all our three criteria of distributive justice. We offer a numerical illustration of the key points.
    Keywords: growth, factor shares, status, distributive justice, social conflict, social contract
    JEL: O41 O43 E25 P16 Z13
    Date: 2019
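One of the paper's positive resolution criteria, Nash bargaining over the labour share, can be sketched with invented payoffs: a higher labour share raises workers' income, lowers capitalists' income, and depresses growth, which both classes value. The functional forms, weights, and disagreement points below are hypothetical, not the paper's AK calibration.

```python
# Nash-bargaining sketch over the labour share s (hypothetical payoffs).

def growth(s):
    return 0.05 * (1.0 - s)          # labour share depresses growth

def u_workers(s):
    return s + 2.0 * growth(s)

def u_capitalists(s):
    return (1.0 - s) + 2.0 * growth(s)

def nash_bargain(d_w=0.0, d_c=0.0, steps=1000):
    # maximize the Nash product over a grid of interior labour shares
    # (assumes both classes' surpluses are positive at the optimum)
    best, best_val = None, -1.0
    for i in range(1, steps):
        s = i / steps
        val = (u_workers(s) - d_w) * (u_capitalists(s) - d_c)
        if val > best_val:
            best, best_val = s, val
    return best

s_star = nash_bargain()
```

Improving workers' disagreement payoff shifts the bargained labour share up, the comparative-static flavour of the paper's results.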
  14. By: Sen, Sugata
    Abstract: Traditional multi-dimensional measures have failed to properly capture the vulnerability of human beings to poverty. Among the reasons for this failure are the inability of existing measures to recognise the graduality inherent in the concept of poverty, and the disparities in wealth distribution within the household. This work therefore develops a measure of the vulnerability of households to becoming poor in a multidimensional perspective, incorporating intra-household disparities and graduality in the causal factors. Dimensional decomposition of the developed vulnerability measure is also within the purview of this work. An integrated mathematical framework is developed to estimate vulnerability and dimensional influences with the help of artificial intelligence.
    Keywords: Poverty, Vulnerability, Fuzzy logic, Intra-household disparity, Shapley Value Decomposition, Machine Learning, LIME
    JEL: C63 I32
    Date: 2019–04–28
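The "graduality" the abstract refers to is typically modelled with fuzzy membership functions in place of crisp poverty cutoffs. The sketch below is a deliberately small toy: linear membership between hypothetical thresholds, equal dimension weights, and a plain average over household members, whereas the paper's index additionally weights intra-household disparity and uses machine learning for estimation.

```python
# Fuzzy-membership sketch of graded deprivation.  All thresholds,
# weights, and household data are hypothetical illustrations.

def deprivation(x, lo, hi):
    # fuzzy membership: fully deprived below lo, not deprived above hi,
    # linearly graded in between (instead of a crisp 0/1 cutoff)
    if x <= lo:
        return 1.0
    if x >= hi:
        return 0.0
    return (hi - x) / (hi - lo)

def fuzzy_poverty_index(members, dims, weights):
    # weighted deprivation per person, averaged over household members
    scores = []
    for person in members:
        s = sum(w * deprivation(person[d], *dims[d]) for d, w in weights.items())
        scores.append(s / sum(weights.values()))
    return sum(scores) / len(scores)

dims = {"income": (1.9, 3.8), "schooling": (5.0, 10.0)}   # (lo, hi) cutoffs
weights = {"income": 0.5, "schooling": 0.5}
household = [{"income": 1.0, "schooling": 12.0},
             {"income": 1.0, "schooling": 4.0}]
index = fuzzy_poverty_index(household, dims, weights)
```

A crisp cutoff would score both members identically on income; the fuzzy grade preserves how far each indicator falls between "clearly deprived" and "clearly not deprived".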

This nep-gth issue is ©2019 by Sylvain Béal. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.