NEP: New Economics Papers on Utility Models and Prospect Theory
Issue of 2019‒07‒15
eighteen papers chosen by
By: | Adrian Bruhin; Maha Manai; Luis Santos-Pinto |
Abstract: | The existing literature on choice under risk suggests that probability weighting and choice set dependence both influence risky choices. However, they have not been tested jointly. We design an incentivized laboratory experiment to assess the relative importance of probability weighting and choice set dependence both non-parametrically and with a structural model. Our design uses binary choices between lotteries that may trigger Allais Paradoxes. To reliably discriminate between probability weighting and choice set dependence, we manipulate the lotteries’ correlation structure while keeping their marginal distributions constant. The non-parametric analysis reveals that probability weighting and choice set dependence jointly play a role in describing aggregate choices. To take potential heterogeneity into account parsimoniously, we estimate a structural model based on a finite mixture approach. The model classifies subjects into three distinct types: a Cumulative Prospect Theory (CPT) type whose choices are primarily driven by probability weighting, a Salience Theory (ST) type whose choices are predominantly driven by choice set dependence, and an Expected Utility Theory (EUT) type. The structural model uncovers substantial heterogeneity in risk preferences: 38% of subjects are CPT-types, 34% are ST-types, and 28% are EUT-types. This classification of subjects into types also predicts preference reversals out-of-sample. Overall, these results show that probability weighting and choice set dependence play a similarly important role in describing risky choices. Beyond the domain of choice under risk, they may also help to improve our understanding of consumer, investor, and judicial choices. |
Keywords: | Choice under Risk, Choice Set Dependence, Probability Weighting, Salience Theory, Preference Reversals
JEL: | D81 C91 C49 |
Date: | 2019–06 |
URL: | http://d.repec.org/n?u=RePEc:lau:crdeep:19.01new&r=all |
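A schematic illustration of the probability weighting that drives the CPT type above — our sketch, using the standard Tversky-Kahneman (1992) functional forms and their published parameter estimates, not the paper's estimated mixture model:

```python
def tk_weight(p: float, gamma: float = 0.61) -> float:
    """Inverse-S probability weighting; gamma=0.61 is TK's gain-domain estimate."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def cpt_value_binary(x_hi: float, x_lo: float, p_hi: float,
                     alpha: float = 0.88, gamma: float = 0.61) -> float:
    """CPT value of a two-outcome gain lottery (x_hi with prob p_hi, else x_lo),
    with power value function v(x) = x**alpha and 0 <= x_lo <= x_hi."""
    w = tk_weight(p_hi, gamma)
    return w * x_hi**alpha + (1 - w) * x_lo**alpha

# Small probabilities are overweighted: w(0.01) is about 0.055, not 0.01,
# which is what generates Allais-type behavior in binary lottery choices.
print(tk_weight(0.01), cpt_value_binary(100, 0, 0.01))
```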
By: | Antoine Billot (LEMMA - Laboratoire d'économie mathématique et de microéconomie appliquée - UP2 - Université Panthéon-Assas - Sorbonne Universités); Jean-Marc Tallon (PSE - Paris School of Economics, PJSE - Paris Jourdan Sciences Economiques - UP1 - Université Panthéon-Sorbonne - ENS Paris - École normale supérieure - Paris - INRA - Institut National de la Recherche Agronomique - EHESS - École des hautes études en sciences sociales - ENPC - École des Ponts ParisTech - CNRS - Centre National de la Recherche Scientifique); Sujoy Mukerji (QMUL - Queen Mary University of London) |
Abstract: | We review some of the (theoretical) economic implications of David Schmeidler's models of decision under uncertainty (Choquet expected utility and maxmin expected utility) in competitive market settings. We start with the portfolio inertia result of Dow and Werlang (1992) and show how it does or does not generalize in an equilibrium setting. We further explore the equilibrium implications (indeterminacies, non-revelation of information) of these decision models. A section is then devoted to studies of Pareto optimal arrangements under these models. We conclude with a discussion of experimental evidence for these models that relates, in particular, to the implications for market behaviour discussed in the preceding sections.
Keywords: | Choquet Expected Utility, Maxmin Expected Utility, No-trade, Risk Sharing, Indeterminacy, Experimental evidence
Date: | 2019–07 |
URL: | http://d.repec.org/n?u=RePEc:hal:psewpa:halshs-02173491&r=all |
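A minimal sketch of the maxmin expected utility criterion the survey covers — our toy example; the finite set of priors and the log utility are illustrative assumptions:

```python
import math

def maxmin_eu(payoffs, priors, u=math.log):
    """Worst-case expected utility over a finite set of candidate priors
    (Gilboa-Schmeidler); payoffs[s] is consumption in state s."""
    return min(sum(p * u(x) for p, x in zip(prior, payoffs)) for prior in priors)

payoffs = [2.0, 1.0]                           # state-contingent consumption
priors = [(0.4, 0.6), (0.5, 0.5), (0.6, 0.4)]  # the DM's set of priors
print(maxmin_eu(payoffs, priors))              # evaluated at the worst prior
```

The kink this worst-case operation creates at riskless positions is what underlies portfolio inertia results of the Dow-Werlang type.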
By: | Thijs Kamma; Antoon Pelsser |
Abstract: | We develop a dual control method for approximating investment strategies in incomplete environments that emerge from the presence of market frictions. Convex duality enables the approximate technology to generate lower and upper bounds on the optimal value function. The mechanism rests on closed-form expressions pertaining to the portfolio composition, from which we are able to derive the near-optimal asset allocation explicitly. In a real financial market, we illustrate the accuracy of our approximate method on a dual CRRA utility function that characterizes the preferences of some finite-horizon investor. Negligible duality gaps and insignificant annual welfare losses substantiate the accuracy of the technique.
Date: | 2019–06 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1906.12317&r=all |
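Welfare losses of this kind are typically reported by converting the duality gap into wealth-equivalent terms; a hedged sketch of the standard conversion for CRRA utility (our construction, not necessarily the authors' exact metric):

```python
# If v_lb <= v* <= v_ub bound the optimal value under CRRA utility
# u(x) = x**(1 - gamma) / (1 - gamma) with gamma > 1 (so values are negative),
# the candidate strategy loses at most this fraction of initial wealth.
def wealth_equivalent_loss(v_lb: float, v_ub: float, gamma: float) -> float:
    assert gamma > 1 and v_lb <= v_ub < 0
    return 1.0 - (v_lb / v_ub) ** (1.0 / (1.0 - gamma))

# A 0.1% duality gap for gamma = 5 translates into a ~0.025% wealth loss:
print(wealth_equivalent_loss(-1.001, -1.0, gamma=5.0))
```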
By: | Gossner, Olivier; Kuzmics, Christoph |
Abstract: | A decision maker (DM) makes choices from different sets of alternatives. The DM is initially ignorant of the payoff associated with each alternative, and learns these payoffs only after a large number of choices have been made. We show that, in the presence of an outside option, once payoffs are learned, the optimal choice rule from sets of alternatives can be rationalized by a DM with strict preferences over all alternatives. Under this model, the DM has preferences for preferences while being ignorant of what preferences are “right”. |
Keywords: | consistency; rationality; weak axiom of revealed preferences; strict preference |
JEL: | C73 D01 D11 |
Date: | 2018–08–16 |
URL: | http://d.repec.org/n?u=RePEc:ehl:lserod:87332&r=all |
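The rationalizability result turns on consistency conditions of this kind; a small self-contained sketch (ours, not the paper's formal apparatus) of checking the weak axiom of revealed preference on observed choice data:

```python
def satisfies_warp(observations):
    """observations: (menu, choice) pairs with choice in menu.
    WARP: if x is ever chosen over y, y must never be chosen over x."""
    revealed = set()   # (x, y): x directly revealed preferred to y
    for menu, choice in observations:
        for y in menu:
            if y != choice:
                revealed.add((choice, y))
    return all((y, x) not in revealed for (x, y) in revealed)

obs = [(frozenset({"a", "b"}), "a"), (frozenset({"a", "b", "c"}), "b")]
print(satisfies_warp(obs))  # False: a chosen over b, then b chosen over a
```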
By: | Stark, Oded (University of Bonn) |
Abstract: | We study the relative risk aversion of an individual with particular social preferences: his wellbeing is influenced by his relative wealth, and by how concerned he is about having low relative wealth. Holding constant the individual's absolute wealth, we obtain two results. First, if the individual's level of concern about low relative wealth does not change, the individual becomes more risk averse when he rises in the wealth hierarchy. Second, if the individual's level of concern about low relative wealth intensifies when he rises in the wealth hierarchy and if, in a precise sense, this intensification is strong enough, then the individual becomes less risk averse: the individual's desire to advance further in the wealth hierarchy is more important to him than the possibility of missing out on a better rank.
Keywords: | relative risk aversion, wealth rank, concern about low relative wealth |
JEL: | D31 D81 G11 |
Date: | 2019–06 |
URL: | http://d.repec.org/n?u=RePEc:iza:izadps:dp12423&r=all |
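The comparative statics rest on the Arrow-Pratt coefficient of relative risk aversion, RRA(w) = -w v''(w) / v'(w). A numeric sketch under a hypothetical wellbeing function of our own choosing (log utility minus a linear low-relative-wealth penalty below a reference level; the paper's specification differs):

```python
import math

def rra(v, w, h=1e-4):
    """Relative risk aversion -w * v''(w) / v'(w) by central differences."""
    d1 = (v(w + h) - v(w - h)) / (2 * h)
    d2 = (v(w + h) - 2 * v(w) + v(w - h)) / h ** 2
    return -w * d2 / d1

w_ref, theta = 100.0, 0.5   # reference wealth, concern for low relative wealth
v = lambda w: math.log(w) - theta * max(0.0, w_ref - w) / w_ref
print(rra(v, 80.0), rra(v, 120.0))  # RRA rises as w climbs past the reference
```

Consistent with the first result, this toy agent becomes more risk averse as he rises in the wealth hierarchy when the concern parameter theta is held fixed.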
By: | Ge, Ge (Department of Health Management and Health Economics); Godager, Geir (Department of Health Management and Health Economics); Wang, Jian (Department of Health Management and Health Economics) |
Abstract: | We ask whether the physician's treatment choices are affected by demand-side cost sharing. To identify and quantify preferences under demand-side cost sharing, we design and conduct an incentivized laboratory experiment in which only medical students are recruited to participate. In our experiment we achieve saliency of all three attributes of treatment alternatives (profit, health benefit, and patient consumption): the choices in the laboratory experiment determine the amount of medical treatment and the future consumption level of a real patient admitted to the nearest hospital. We vary demand-side cost sharing while the preferences and bargaining power of the patient are held fixed. We estimate decision-makers' preference parameters in a variety of random utility models. We find strong evidence that the amount of demand-side cost sharing affects medical decisions.
Keywords: | Physician preferences; Demand-side cost sharing; Incentivized laboratory experiment |
JEL: | C91 I11 J33 |
Date: | 2019–05–13 |
URL: | http://d.repec.org/n?u=RePEc:hhs:oslohe:2019_002&r=all |
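A sketch of the kind of random utility model in which such preference parameters are estimated — a conditional logit of our own construction; the authors' specification may differ:

```python
import math

def choice_probs(alternatives, b_profit, b_benefit):
    """Logit choice probabilities over treatment alternatives, with utility
    linear in physician profit and patient health benefit."""
    utils = [b_profit * pr + b_benefit * hb for pr, hb in alternatives]
    m = max(utils)                       # subtract max for numerical stability
    exps = [math.exp(u - m) for u in utils]
    z = sum(exps)
    return [e / z for e in exps]

# Cost sharing that lowers the patient's net benefit of intensive treatment
# shifts predicted choice probabilities toward the cheaper alternative:
print(choice_probs([(10.0, 5.0), (6.0, 8.0)], b_profit=0.3, b_benefit=0.5))
```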
By: | Maria Arduca; Pablo Koch-Medina; Cosimo Munari |
Abstract: | We describe a general approach to obtaining dual representations for systemic risk measures of the "allocate first, then aggregate" type, which have recently received significant attention in the literature. Our method is based on the possibility of expressing this type of multivariate risk measure as a special case of risk measures with multiple eligible assets. This allows us to apply standard Fenchel-Moreau techniques to tackle duality for systemic risk measures as well. The same approach can also be successfully employed to obtain an elementary proof of the dual representation of "first aggregate, then allocate"-type systemic risk measures. As a final application, we derive a simple proof of the dual representation of univariate utility-based risk measures.
Date: | 2019–06 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1906.10933&r=all |
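The duality machinery invoked is standard Fenchel-Moreau biconjugation; schematically, for a proper, convex, lower semicontinuous risk measure \(\rho\) on a suitable dual pairing,

\[
\rho(X) = \sup_{Z}\big(\langle Z, X\rangle - \rho^{*}(Z)\big),
\qquad
\rho^{*}(Z) = \sup_{X}\big(\langle Z, X\rangle - \rho(X)\big),
\]

with the systemic "allocate first, then aggregate" case obtained by applying this to multivariate positions (a schematic statement; the paper supplies the precise spaces and penalty terms).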
By: | W. Erwin Diewert; Robert C. Feenstra |
Abstract: | A major challenge facing statistical agencies is the problem of adjusting price and quantity indexes for changes in the availability of commodities. This problem arises in the scanner data context as products in a commodity stratum appear and disappear in retail outlets. Hicks suggested a reservation price methodology for dealing with this problem in the context of the economic approach to index number theory. Hausman used a linear approximation to the demand curve to compute the reservation price, while Feenstra used a reservation price of infinity for a CES demand curve, which leads to higher gains. The present paper evaluates these approaches using scanner data on frozen juice products, comparing the CES gains to those obtained from a quadratic utility function. We find that the CES gains from new frozen juice products are about five times greater than those obtained using the quadratic utility function.
JEL: | C43 C81 D11 |
Date: | 2019–06 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:25991&r=all |
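The CES side of this comparison rests on the standard Feenstra (1994) variety adjustment; a hedged sketch of that standard formula (not the paper's quadratic-utility counterpart):

```python
def variety_adjustment(share_continuing_t, share_continuing_prev, sigma):
    """Feenstra (1994): the exact CES price index over continuing goods is
    scaled by (lambda_t / lambda_{t-1}) ** (1 / (sigma - 1)), where lambda_s
    is the expenditure share of continuing products in period s."""
    assert sigma > 1
    return (share_continuing_t / share_continuing_prev) ** (1 / (sigma - 1))

# New products capture 10% of spending (continuing share 0.9), none exit:
print(variety_adjustment(0.9, 1.0, sigma=5.0))  # ~0.974, a 2.6% price decline
```

As sigma approaches 1 the exponent blows up, which is one way to see why CES estimates of new-goods gains can greatly exceed those from a quadratic utility function with finite reservation prices.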
By: | Massimo Guidolin; Alexei Orlov |
Abstract: | We report systematic, out-of-sample evidence on the benefits to an already well-diversified investor that may derive from further diversification into various hedge fund strategies. We investigate dynamic strategic asset allocation decisions that take into account investors' preferences as well as return predictability. Our results suggest that not all hedge fund strategies benefit a long-term investor who is already well diversified across stocks, government and corporate bonds, and REITs. Only strategies whose payoffs are highly nonlinear (e.g., fixed income relative value and convertible arbitrage), and therefore not easily replicable, constitute viable options. Most of the realized economic value comes not from a mean-variance type of improvement but from an improvement in the realized higher-moment properties of optimal portfolios. Medium to highly risk-averse investors benefit the most from this alternative asset class.
Keywords: | Strategic asset allocation, hedge fund strategies, predictive regressions, out-of-sample performance, certainty equivalent return.
JEL: | G11 G17 G12 C53 |
Date: | 2018 |
URL: | http://d.repec.org/n?u=RePEc:baf:cbafwp:cbafwp1890&r=all |
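A sketch of the certainty equivalent return criterion named in the keywords — our standard CRRA-based construction; the paper's exact implementation may differ:

```python
import math

def cer(returns, gamma):
    """Certainty equivalent of a sample of gross realized returns under CRRA
    (power) utility; gamma = 1 is the log-utility case."""
    n = len(returns)
    if gamma == 1.0:
        return math.exp(sum(math.log(r) for r in returns) / n) - 1.0
    eu = sum(r ** (1 - gamma) / (1 - gamma) for r in returns) / n
    return ((1 - gamma) * eu) ** (1 / (1 - gamma)) - 1.0

# Equal-mean return streams: the more volatile one earns a lower CER, so
# higher-moment improvements show up directly in this metric.
print(cer([1.10, 0.94], gamma=5.0), cer([1.04, 1.00], gamma=5.0))
```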
By: | Michaillat, Pascal; Saez, Emmanuel |
Abstract: | The New Keynesian model makes several anomalous predictions at the zero lower bound: collapse of output and inflation, and implausibly large effects of forward guidance and government spending. To resolve these anomalies, we introduce wealth into the utility function. The justification is that wealth is a marker of social status, and people value social status. Since people save not only for future consumption but also to accrue social status, the Euler equation is modified. As a result, when the marginal utility of wealth is sufficiently large, the dynamical system representing the equilibrium at the zero lower bound becomes a source instead of a saddle, which resolves all the anomalies.
JEL: | E32 E52 E62 |
Date: | 2019–06 |
URL: | http://d.repec.org/n?u=RePEc:cpr:ceprdp:13775&r=all |
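Schematically, in a discrete-time sketch with separable utility \(u(c_t) + v(a_t)\) over consumption and wealth (our simplification, not the authors' exact formulation), the Euler equation picks up a marginal-utility-of-wealth term:

\[
u'(c_t) = \beta (1+r)\,\mathbb{E}_t\!\left[\,u'(c_{t+1}) + v'(a_{t+1})\,\right],
\]

and when \(v'\) is large enough, consumption today no longer inherits the explosive forward dynamics that generate the zero-lower-bound anomalies.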
By: | Kate Bundorf; Maria Polyakova; Ming Tai-Seale |
Abstract: | Algorithms increasingly assist consumers in making their purchase decisions across a variety of markets; yet little is known about how humans interact with algorithmic advice. We examine how algorithmic, personalized information affects consumer choice among complex financial products using data from a randomized, controlled trial of decision support software for choosing health insurance plans. The intervention significantly increased plan switching, cost savings, time spent choosing a plan, and choice process satisfaction, particularly when individuals were exposed to an algorithmic expert recommendation. We document systematic selection: individuals who would have responded to treatment the most were the least likely to participate. A model of consumer decision-making suggests that our intervention affected consumers' signals about both product features (learning) and utility weights (interpretation).
JEL: | D1 D12 D8 D81 D82 D83 D9 D90 D91 G22 H51 I13 |
Date: | 2019–06 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:25976&r=all |
By: | Luciano Andreozzi |
Abstract: | We extend Karni and Safra's (2002a) axiomatic model of procedural justice to provide a behavioral characterization of aversion to ex-post inequality. Our characterization distinguishes ex-post inequality aversion from other ’social’ motives like altruism and is independent of attitudes towards risk. It allows for violations of the independence axiom to make room for ex-ante inequality aversion and, more generally, procedural justice. Our axiomatic model naturally lends itself to measuring the strength of aversion to ex-post inequality.
Keywords: | Indirect inference; directional statistics; stable distribution; weighting matrix |
JEL: | D63 D64 D81 |
Date: | 2019 |
URL: | http://d.repec.org/n?u=RePEc:trn:utwprg:2019/10&r=all |
By: | Nicolas Taconet (CIRED, ENPC); Céline Guivarch (CIRED, ENPC); Antonin Pottier (EHESS) |
Abstract: | Carbon dioxide emissions impose a social cost on economies, owing to the damages they will cause in the future. In particular, emissions increase global temperature, which may reach tipping points in the climate or economic system, triggering large economic shocks. Tipping points are uncertain by nature: they induce higher expected damages but also a greater dispersion of possible damages, that is, risk. Both dimensions increase the Social Cost of Carbon (SCC). However, their respective contributions have not been disentangled. We develop a simple method to compare how much of the SCC is explained by expected damages and how much by the risk induced by a stochastic tipping point. We find that expected damages account for more than 90% of the SCC for productivity shocks lower than 10%, the high end of the range of damages commonly assumed in Integrated Assessment Models. It takes both a high productivity shock and high risk aversion for risk to have a significant effect. Our results also shed light on the observation that risk aversion plays a modest role in determining the SCC (the ''risk aversion puzzle''): they suggest that the low levels of damages considered in previous studies could be responsible for the low influence of risk aversion.
Keywords: | Climate Change, Tipping points, Expected Utility, Integrated Assessment Models |
JEL: | C61 H41 Q54 |
Date: | 2019–06 |
URL: | http://d.repec.org/n?u=RePEc:fae:wpaper:2019.11&r=all |
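A stylized version of the decomposition (toy numbers of our own; the paper uses a full integrated assessment model): with a tipping point that hits with probability p and destroys a fraction d of consumption, the CRRA certainty-equivalent loss splits into an expected-damage part (p times d) and a residual risk part:

```python
def ce_loss(c, p, d, gamma):
    """Fraction of consumption a CRRA agent would pay to remove the risk of
    losing a fraction d of consumption c with probability p."""
    eu = (p * (c * (1 - d)) ** (1 - gamma)
          + (1 - p) * c ** (1 - gamma)) / (1 - gamma)
    ce = ((1 - gamma) * eu) ** (1 / (1 - gamma))
    return 1 - ce / c

p, d, gamma = 0.1, 0.10, 2.0        # 10% shock, the paper's high-end cutoff
total = ce_loss(1.0, p, d, gamma)
expected = p * d
print(expected, total - expected)   # risk part is under 10% of the total
```

Even with gamma = 2, the risk component here is about 0.001 against an expected-damage component of 0.01, echoing the finding that expected damages dominate for shocks of 10% or less.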
By: | Branger, Nicole; Konermann, Patrick; Schlag, Christian |
Abstract: | We study the effects of market incompleteness on speculation, investor survival, and asset pricing moments when investors disagree about the likelihood of jumps and have recursive preferences. We consider two models. In a model with jumps in aggregate consumption, incompleteness barely matters, since the consumption claim resembles an insurance product against jump risk and effectively reproduces approximate spanning. In a long-run risk model with jumps in the long-run growth rate, market incompleteness affects speculation and investor survival. Jump and diffusive risks are more balanced in their importance, and therefore the consumption claim cannot reproduce approximate spanning.
Keywords: | market (in)completeness, heterogeneous beliefs, jumps in the long-run growth rate, jumps in aggregate consumption, recursive preferences
JEL: | D51 D52 G12 |
Date: | 2019 |
URL: | http://d.repec.org/n?u=RePEc:zbw:safewp:252&r=all |
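The recursive preferences in question are of the Epstein-Zin type; schematically, with relative risk aversion \(\gamma\), elasticity of intertemporal substitution \(\psi\), and discount factor \(\beta\),

\[
V_t = \Big[(1-\beta)\,c_t^{\,1-1/\psi} + \beta\,\big(\mathbb{E}_t\big[V_{t+1}^{\,1-\gamma}\big]\big)^{\frac{1-1/\psi}{1-\gamma}}\Big]^{\frac{1}{1-1/\psi}},
\]

which separates aversion to jump and diffusive risk (\(\gamma\)) from the willingness to substitute consumption over time (\(\psi\)) — the separation behind the survival and pricing results.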
By: | Akihiko Matsui (Faculty of Economics, The University of Tokyo); Megumi Murakami (Department of Economics, Northwestern University) |
Abstract: | Centralized matching mechanisms and decentralized markets have been widely studied as ways to allocate indivisible objects, but they have been analyzed separately. The present paper proposes a new framework, explicitly formulating a two-stage model where objects are allocated through a matching mechanism in the first stage and traded in the second-stage market. In addition, one divisible good called money may or may not be available in the market. Every player demands at most one unit of an object besides money. Players may face different priorities at each object type in the first stage. Each object type has a limited amount of capacity, called its quota. Each player has a quasi-linear utility function. The present analysis sets forth the equivalence conditions under which stability and efficiency are attained in equilibrium.
Date: | 2019–01 |
URL: | http://d.repec.org/n?u=RePEc:tky:fseres:2019cf1112&r=all |
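A toy sketch of the two-stage logic (a drastic simplification of ours: a serial dictatorship stands in for the first-stage mechanism, and the second stage is a single pairwise trade with money under quasi-linear utility):

```python
def serial_dictatorship(priority, prefs, quota):
    """Stage 1: players in priority order take their best remaining object."""
    left = dict(quota)
    assignment = {}
    for i in priority:
        for obj in prefs[i]:
            if left.get(obj, 0) > 0:
                assignment[i], left[obj] = obj, left[obj] - 1
                break
    return assignment

def pairwise_trade(i, j, assignment, value):
    """Stage 2: i and j swap objects (with a money transfer splitting the
    surplus) whenever the swap raises total quasi-linear value."""
    oi, oj = assignment[i], assignment[j]
    if value[i][oj] + value[j][oi] > value[i][oi] + value[j][oj]:
        assignment[i], assignment[j] = oj, oi
    return assignment

prefs = {1: ["a", "b"], 2: ["a", "b"]}                     # both rank a first
value = {1: {"a": 2.0, "b": 1.0}, 2: {"a": 3.0, "b": 1.0}}
assignment = serial_dictatorship([1, 2], prefs, {"a": 1, "b": 1})
print(pairwise_trade(1, 2, assignment, value))  # money lets 2 buy a from 1
```

The example illustrates the force of the second stage: priorities give object a to player 1, but with money available player 2 can profitably buy it, moving the allocation toward efficiency.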
By: | Hirotake Ito (Graduate School of Media and Governance, Keio University); Makiko Nakamuro (Graduate School of Media and Governance, Keio University); Shintaro Yamaguchi (Faculty of Economics, The University of Tokyo) |
Date: | 2019–01 |
URL: | http://d.repec.org/n?u=RePEc:tky:fseres:2019cf1113&r=all |
By: | Hitoshi Matsushima (Faculty of Economics, The University of Tokyo) |
Abstract: | This study investigates the unique implementation of a social choice function in iterative dominance in the ex-post term. We assume partial ex-post verifiability; that is, after determining an allocation, the central planner can observe only partial information about the state as verifiable. We demonstrate a condition on the state space, termed “full detection,” under which any social choice function is uniquely implementable even if the range of the players’ lies that the ex-post verifiable information directly detects is quite narrow. To prove this, we construct a dynamic mechanism in which each player announces his (or her) private signal at an earlier stage, before the other players observe this signal, and each player also announces the state at a later stage. In this construction, we can impose several severe restrictions, such as boundedness, permission of only tiny transfers off the equilibrium path, and no transfers on the equilibrium path. This study assumes neither expected utility nor quasi-linearity.
Date: | 2019–02 |
URL: | http://d.repec.org/n?u=RePEc:tky:fseres:2019cf1116&r=all |
By: | Matthew Backus; Sida Peng |
Abstract: | Estimation of discontinuities is pervasive in applied economics: from the study of sheepskin effects to prospect theory and “bunching” of reported income on tax returns, models that predict discontinuities in outcomes are uniquely attractive for empirical testing. However, existing empirical methods often rely on assumptions about the number of discontinuities, the type, the location, or the underlying functional form of the model. We develop a nonparametric approach to the study of arbitrary discontinuities — point discontinuities as well as jump discontinuities in the nth derivative, where n = 0,1,... — that does not require such assumptions. Our approach exploits the development of false discovery rate control methods for lasso regression as proposed by G’Sell et al. (2015). This framework affords us the ability to construct valid tests for both the null of continuity as well as the significance of any particular discontinuity without the computation of nonstandard distributions. We illustrate the method with a series of Monte Carlo examples and by replicating prior work detecting and measuring discontinuities, in particular Lee (2008), Card et al. (2008), Reinhart and Rogoff (2010), and Backus et al. (2018b). |
JEL: | C01 C20 C52 |
Date: | 2019–06 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:26016&r=all |
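A simplified sketch of the idea (our illustration, not the authors' implementation): regress the outcome on a dictionary of step functions at candidate locations, let the lasso path order the candidate jumps, and apply a sequential rule such as ForwardStop (G'Sell et al. 2015) to the resulting p-values to control the false discovery rate:

```python
import numpy as np
from sklearn.linear_model import lars_path

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 400))
y = 1.0 * (x > 0.5) + 0.1 * rng.standard_normal(400)  # one true jump at 0.5

candidates = np.linspace(0.05, 0.95, 19)
X = (x[:, None] > candidates[None, :]).astype(float)  # step-function dictionary
X -= X.mean(axis=0)                                   # center the design

# The lasso path activates the strongest jump candidates first; sequential
# testing along this path is where FDR control would be applied.
alphas, active, coefs = lars_path(X, y - y.mean(), method="lasso")
print("first activated jump location:", candidates[active[0]])  # ~0.5
```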