nep-dcm New Economics Papers
on Discrete Choice Models
Issue of 2020‒04‒13
four papers chosen by
Edoardo Marcucci
Università degli studi Roma Tre

  1. An economist and a psychologist form a line: What can imperfect perception of length tell us about stochastic choice? By Duffy, Sean; Smith, John
  2. Financial Expectations and Household Consumption: Does Middle Inflation Matter? By Brown, Sarah; Harris, Mark N.; Spencer, Christopher; Taylor, Karl
  3. Choice overload and contextual inference: An experimental test By Irene Maria Buso
  4. An Exact Method for Assortment Optimization under the Nested Logit Model By Laurent Alfandari; Alborz Hassanzadeh; Ivana Ljubic

  1. By: Duffy, Sean; Smith, John
    Abstract: Standard choice experiments are hampered by the fact that utility is either unknown or imperfectly measured by experimenters. As a consequence, the inferences available to researchers are limited. By contrast, we design a choice experiment where the objects are valued according to only a single attribute with a continuous measure and we can observe the true preferences of subjects. Subjects have an imperfect perception of the choice objects but can improve the precision of their perception with cognitive effort. Subjects are given a choice set involving several lines of various lengths and are told to select one of them. They strive to select the longest line because they are paid an amount that increases with the length of their choice. Our design allows us to observe the search history and the response times, and to draw unambiguous conclusions about the optimality of choices. We find a negative relationship between the demanding nature of the choice problems and the likelihood that subjects select the optimal lines. We also find a positive relationship between the demanding nature of the choice problems and the response times. However, we find evidence that suboptimal choices are associated with longer response times than are optimal choices. This result appears to be consistent with Fudenberg, Strack, and Strzalecki (2018). Additionally, our experimental design permits a multinomial discrete choice analysis. Our results suggest that the errors in our data are better described as having a Gumbel distribution rather than a normal distribution. We also observe effects consistent with memory decay and attention. Finally, we find evidence that choices in our experiment exhibit the independence from irrelevant alternatives (IIA) property.
    Keywords: judgment, memory, response times, independence from irrelevant alternatives
    JEL: C91 D03
    Date: 2020–04–02
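    The Gumbel-versus-normal comparison in the abstract is the standard logit/probit distinction: when i.i.d. Gumbel errors are added to each alternative's utility, choice probabilities take a closed-form softmax expression and satisfy IIA. A minimal sketch of that mechanism (not the authors' estimation code; the line lengths and scale below are hypothetical):

```python
import math
import random

def logit_choice_probs(utilities, scale=1.0):
    """Multinomial logit choice probabilities.

    With i.i.d. Gumbel(0, scale) errors added to each utility, the
    probability of choosing option j is exp(u_j/scale) / sum_k exp(u_k/scale).
    """
    exps = [math.exp(u / scale) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

def simulate_gumbel_choices(utilities, scale=1.0, draws=100_000, seed=0):
    """Monte Carlo check: add Gumbel noise to each utility, pick the max."""
    rng = random.Random(seed)
    wins = [0] * len(utilities)
    for _ in range(draws):
        # Standard Gumbel draw via inverse CDF: -log(-log(U)), U ~ Uniform(0,1)
        noisy = [u + scale * -math.log(-math.log(rng.random()))
                 for u in utilities]
        wins[noisy.index(max(noisy))] += 1
    return [w / draws for w in wins]

# Hypothetical line lengths (cm) standing in for the single attribute.
lengths = [10.0, 10.5, 11.0]
print(logit_choice_probs(lengths))
print(simulate_gumbel_choices(lengths))
```

    Because the logit ratio P(a)/P(b) depends only on the utilities of a and b, adding or removing a third line leaves the ratio unchanged — the IIA property the abstract refers to.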
  2. By: Brown, Sarah (University of Sheffield); Harris, Mark N. (Curtin University); Spencer, Christopher (Loughborough University); Taylor, Karl (University of Sheffield)
    Abstract: Using British panel data, we explore the finding that households often expect their financial position to remain unchanged compared to other alternatives, using a generalised middle-inflated ordered probit (GMIOP) model. In doing so we account for the tendency of individuals to choose 'neutral' responses when faced with attitudinal and opinion-based questions, which are a common feature of survey data. Our empirical analysis strongly supports the use of a GMIOP model to account for this response pattern. Expectations indices based on competing discrete choice models are then exploited to explore the role that financial expectations play in driving the consumption of different types of durable goods and the amount of expenditure undertaken. Whilst financial optimism is significantly associated with consumption, indices which fail to take middle inflation into account overestimate the impact of financial expectations.
    Keywords: household consumption, financial expectations, survey data, generalised middle-inflated ordered probit model
    JEL: C12 C35
    Date: 2020–03
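    The middle-inflation idea can be sketched as a two-part mixture: with some probability a respondent reports the neutral category regardless of their latent position; otherwise a standard ordered probit applies. This is a stylised illustration only — in the authors' GMIOP model the inflation probability is itself covariate-dependent — and `p_inflate`, the cutpoints, and the index value below are hypothetical:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def ordered_probit_probs(xb, cuts):
    """Ordered probit category probabilities for linear index xb
    and increasing cutpoints `cuts` (len(cuts) + 1 categories)."""
    cdf = [norm_cdf(c - xb) for c in cuts]
    bounds = [0.0] + cdf + [1.0]
    return [bounds[i + 1] - bounds[i] for i in range(len(bounds) - 1)]

def middle_inflated_probs(xb, cuts, p_inflate, middle_idx):
    """Mixture: with probability p_inflate the respondent reports the
    neutral (middle) category regardless of the latent index; with
    probability 1 - p_inflate the ordered probit applies."""
    base = ordered_probit_probs(xb, cuts)
    probs = [(1.0 - p_inflate) * p for p in base]
    probs[middle_idx] += p_inflate
    return probs

# Three categories (worse / unchanged / better), hypothetical values.
print(ordered_probit_probs(0.0, [-1.0, 1.0]))
print(middle_inflated_probs(0.0, [-1.0, 1.0], 0.3, 1))
```

    The mixture mechanically raises the middle-category probability above what the ordered probit alone implies, which is why ignoring it overstates how much expectations move with covariates.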
  3. By: Irene Maria Buso
    Abstract: The paradoxical finding of a preference for small sets of products (Iyengar and Lepper, 2000) has been explained by cognitive costs and regret. Kamenica (2008) instead suggests that the set size conveys payoff-relevant information about the popularity of the products in a set: small sets contain the most popular products. The present experimental analysis tests whether this contextual inference theory can explain the increased willingness to take a product from small sets. The experiment relies on the standard framework in choice-overload experiments: a between-subjects design in which the willingness to purchase a product, rather than accept a fixed monetary payment, is compared across two conditions, one offering an extensive choice set and one offering a small one. The new element with respect to previous studies is that participants do not see the options in the set: the items are presented inside bags. Subjects can choose either to take one product at random from the set or to receive a fixed monetary fee; the choice is offered sequentially for three products: chocolate, yoghurt and crisps. This design rules out alternative explanations such as cognitive costs and regret, since the only information given is the set size. The results show that in two of the three product categories the proportion of people who prefer to take a random product from the small set is higher than from the large one.
    Keywords: Contextual inference theory; Experimental economics
    JEL: C9 D11
    Date: 2020–03
  4. By: Laurent Alfandari (ESSEC Business School - Essec Business School); Alborz Hassanzadeh (ESSEC Business School - Essec Business School); Ivana Ljubic (ESSEC Business School - Essec Business School)
    Abstract: We study the problem of finding an optimal assortment of products maximizing the expected revenue, in which customer preferences are modeled using a Nested Logit choice model. This problem is known to be polynomially solvable in a specific case and NP-hard otherwise, with only approximation algorithms existing in the literature. For the NP-hard cases, we provide a general exact method that embeds a tailored Branch-and-Bound algorithm into a fractional programming framework. Contrary to the existing literature, in which assumptions are imposed on either the structure of nests or the combination and characteristics of products, no assumptions on the input data are imposed, and hence our approach can solve the most general problem setting. We show that the parameterized subproblem of the fractional programming scheme, which is a binary highly non-linear optimization problem, is decomposable by nests, which is a main advantage of the approach. To solve the subproblem for each nest, we propose a two-stage approach. In the first stage, we identify those products that are undoubtedly beneficial to offer, or not, which can significantly reduce the problem size. In the second stage, we design a tailored Branch-and-Bound algorithm with problem-specific upper bounds. Numerical results show that the approach is able to solve assortment instances with up to 5,000 products per nest. The most challenging instances for our approach are those in which the dissimilarity parameters of nests can be either less or greater than one.
    Keywords: nested logit,fractional programming,combinatorial optimization,revenue management,assortment optimization
    Date: 2020–01
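    The objective being maximised in such assortment problems is the standard nested logit expected revenue: within a nest, products compete by preference weight; across nests, each nest's total attraction is raised to its dissimilarity parameter. A minimal sketch of evaluating (not optimising) a fixed assortment, with all weights, revenues, and parameters hypothetical:

```python
def nested_logit_revenue(nests, v0=1.0):
    """Expected revenue of an assortment under the nested logit model.

    `nests` is a list of (gamma, products) pairs: gamma is the nest's
    dissimilarity parameter and products is a list of
    (preference_weight, revenue) tuples for the offered products.
    `v0` is the attraction of the outside (no-purchase) option.
    """
    nest_values = []    # V_i = (sum_j v_ij)^gamma_i
    nest_revenues = []  # R_i = sum_j v_ij * r_ij / sum_j v_ij
    for gamma, products in nests:
        total_v = sum(v for v, _ in products)
        if total_v == 0.0:  # empty nest offers nothing
            nest_values.append(0.0)
            nest_revenues.append(0.0)
            continue
        nest_values.append(total_v ** gamma)
        nest_revenues.append(sum(v * r for v, r in products) / total_v)
    denom = v0 + sum(nest_values)
    return sum(V * R for V, R in zip(nest_values, nest_revenues)) / denom

# Two hypothetical nests; gamma < 1 and gamma > 1 is the hard regime
# the abstract identifies.
assortment = [
    (0.7, [(1.0, 12.0), (2.0, 8.0)]),
    (1.3, [(0.5, 25.0)]),
]
print(nested_logit_revenue(assortment))
```

    With a single nest and gamma = 1 the expression collapses to the plain multinomial logit revenue, which is a handy sanity check; the hard combinatorial question the paper addresses is which subset of products to offer in each nest so as to maximise this quantity.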

This nep-dcm issue is ©2020 by Edoardo Marcucci. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject line; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.