
on Utility Models and Prospect Theory 
By:  Thierry Chauveau (Centre d'Economie de la Sorbonne) 
Abstract:  The theory of disappointment of Loomes and Sugden [1986] has never been given an axiomatics. This article, in which a theory of disappointment is derived from a simple axiomatics, makes up for this omission. The new theory is close to that of Loomes and Sugden, although the functional representing the preferences of the decision-maker is now lottery-dependent. Preferences exhibit four properties of interest: (a) risk-averse and risk-prone investors actually behave differently; (b) risk is defined in a way consistent with risk aversion; (c) the functional is nothing but the opposite of a convex measure of risk (Föllmer and Schied [2002]) when constant marginal utility is assumed; and (d) violations of the second-order stochastic dominance property are allowed for when monetary values are taken into account (but not when "utils" are substituted for them). Moreover, the preorder induced by stochastic dominance over utils is as "close" to the preorder of preferences as possible, and utility functions may be elicited through experimental testing. 
Keywords:  Disappointment; risk aversion; expected utility; risk premium; stochastic dominance; subjective risk 
JEL:  D81 
Date:  2014–06 
URL:  http://d.repec.org/n?u=RePEc:mse:cesdoc:14054r&r=upt 
By:  Vedran Kojić (Faculty of Economics and Business, University of Zagreb) 
Abstract:  This paper presents a new, non-calculus approach to solving the utility maximization problem with a CES utility function, as well as with a Cobb-Douglas utility function, in the case of n≥2 commodities. Instead of using the Lagrange multiplier method or some other method based on differential calculus, these two maximization problems are solved by using Jensen's inequality and the weighted arithmetic-geometric mean (weighted AM-GM) inequality. In comparison with calculus methods, this approach does not require checking the first- and second-order conditions. 
Keywords:  Utility maximization problem, CES and Cobb-Douglas utility functions, mathematical inequalities, without calculus 
JEL:  C69 D11 
Date:  2015–07–15 
URL:  http://d.repec.org/n?u=RePEc:zag:wpaper:1504&r=upt 
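The AM-GM route described in the abstract can be illustrated with a small numeric sketch (not taken from the paper; the exponents, prices, and income below are hypothetical). For a Cobb-Douglas utility $u(x)=\prod_i x_i^{a_i}$ with budget constraint $p\cdot x=m$, the weighted AM-GM inequality yields the closed-form demand $x_i^* = (a_i/\sum_j a_j)\,m/p_i$ with no calculus at all; the grid search below is only a sanity check that no other feasible bundle does better.

```python
# Sketch (hypothetical parameters): Cobb-Douglas demand via the weighted AM-GM
# bound, checked against a brute-force grid over budget shares.
import itertools

a = [0.5, 0.3, 0.2]   # Cobb-Douglas exponents (assumed for illustration)
p = [2.0, 1.0, 4.0]   # prices (assumed)
m = 100.0             # income (assumed)

def u(x):
    """Cobb-Douglas utility prod x_i^{a_i}."""
    out = 1.0
    for xi, ai in zip(x, a):
        out *= xi ** ai
    return out

# Closed form implied by weighted AM-GM: spend the share a_i / sum(a) on good i.
s = sum(a)
x_star = [ai / s * m / pi for ai, pi in zip(a, p)]

# Sanity check: search budget-share grids for a better feasible bundle.
best = 0.0
steps = 20
for w in itertools.product(range(1, steps), repeat=len(a) - 1):
    if sum(w) >= steps:
        continue
    shares = [wi / steps for wi in w]
    shares.append(1.0 - sum(shares))
    x = [sh * m / pi for sh, pi in zip(shares, p)]
    best = max(best, u(x))

assert u(x_star) >= best - 1e-9   # no grid point beats the AM-GM solution
print(x_star)                     # [25.0, 30.0, 5.0]
```

The budget-share form of the solution is what makes the inequality approach attractive: the optimum is read off directly, with no first- or second-order conditions to verify.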
By:  Dmitry Kramkov; Kim Weston 
Abstract:  In the problem of optimal investment with a utility function defined on $(0,\infty)$, we formulate sufficient conditions for the dual optimizer to be a uniformly integrable martingale. Our key requirement is the existence of a martingale measure whose density process satisfies the probabilistic Muckenhoupt $(A_p)$ condition for the power $p=1/(1-a)$, where $a\in (0,1)$ is a lower bound on the relative risk aversion of the utility function. We construct a counterexample showing that this $(A_p)$ condition is sharp. 
Date:  2015–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1507.05865&r=upt 
By:  Ole Peters; Alexander Adamou 
Abstract:  We present a mathematical solution to the insurance puzzle. Our solution uses only time-average growth rates and makes no reference to risk preferences. The insurance puzzle is this: according to the expectation value of wealth, buying insurance is only rational at a price that makes it irrational to sell insurance. There is no price that is beneficial to both the buyer and the seller of an insurance contract. The puzzle of why insurance contracts exist is traditionally resolved by appealing to utility theory, asymmetric information, or a mix of both. Here we note that the expectation value is the wrong starting point: a legacy from the early days of probability theory. It is the wrong starting point because not even the most basic models of wealth (random walks) are stationary, and what the individual experiences over time is not the expectation value. We use the standard model of noisy exponential growth and compute time-average growth rates instead of expectation values of wealth. In this new paradigm, insurance contracts exist that are beneficial for both parties. 
Date:  2015–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1507.04655&r=upt 
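The distinction the abstract turns on can be seen in a short simulation (a minimal sketch, not the authors' code; the drift, volatility, and random seed are arbitrary choices). For noisy exponential (geometric Brownian) wealth, the ensemble average grows at rate $\mu$, while a single long trajectory grows at the time-average rate $\mu-\sigma^2/2$, and it is the latter that the individual experiences:

```python
# Sketch (assumed parameters): time-average vs ensemble-average growth for
# geometric Brownian wealth.
import math
import random

random.seed(0)
mu, sigma, dt, n_steps = 0.05, 0.3, 0.01, 1_000_000

sdt = sigma * math.sqrt(dt)
log_w = 0.0
for _ in range(n_steps):
    # Euler step of the log-wealth process: d(log w) = (mu - sigma^2/2) dt + sigma dW
    log_w += (mu - 0.5 * sigma**2) * dt + sdt * random.gauss(0.0, 1.0)

t = n_steps * dt
g_time = log_w / t   # realized time-average growth rate of this one trajectory
print(g_time)        # near mu - sigma^2/2 = 0.005, far below mu = 0.05
```

An insurance contract priced anywhere between the two parties' certainty costs can raise the time-average growth rate of both parties' wealth at once, even though no price improves both parties' expected wealth; that is the resolution the abstract describes.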
By:  Dahlquist, Magnus; Farago, Adam; Tédongap, Roméo 
Abstract:  We examine the portfolio choice of an investor with generalized disappointment aversion preferences who faces returns described by a normal-exponential model. We derive a three-fund separation strategy: the investor allocates wealth to a risk-free asset, a standard mean-variance efficient fund, and an additional fund reflecting return asymmetries. The optimal portfolio is characterized by the investor's endogenous effective risk aversion and implicit asymmetry aversion. We find that disappointment aversion is associated with much larger asymmetry aversion than standard preferences are. Our model explains patterns in popular portfolio advice and provides a reason for shifting from bonds to stocks as the investment horizon increases. 
Keywords:  Asset allocation; Downside risk 
JEL:  G11 
Date:  2015–07 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:10706&r=upt 
By:  Mikhail Timonin 
Abstract:  We prove a representation theorem for the Choquet integral model. The preference relation is defined on a two-dimensional heterogeneous product set $X = X_1 \times X_2$ where elements of $X_1$ and $X_2$ are not necessarily comparable with each other. However, making such comparisons in a meaningful way is necessary for the construction of the Choquet integral (and of any rank-dependent model). We construct the representation, study its uniqueness properties, and look at applications in multicriteria decision analysis, state-dependent utility theory, and social choice. Previous axiomatizations of this model, developed for decision making under uncertainty, relied heavily on the notions of comonotonicity and of a "constant act". However, that requires $X$ to have a special structure: namely, all factors of this set must be identical. Our characterization does not assume commensurateness of criteria a priori, so defining comonotonicity becomes impossible. 
Date:  2015–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1507.04167&r=upt 
By:  Layson, Stephen (University of North Carolina at Greensboro, Department of Economics) 
Abstract:  Only in the two-good case is a diminishing marginal rate of substitution equivalent to quasiconcavity of the utility function. When there are more than two goods, the conditions for quasiconcavity, expressed in terms of bordered Hessians, are very unintuitive and tedious to implement. This paper demonstrates, however, that a constant or diminishing marginal rate of substitution between any good and a composite good consisting of all other goods is equivalent to quasiconcavity. A new method for checking quasiconcavity is demonstrated that is sometimes easier to use than the traditional method of checking the signs of the bordered Hessians. 
Keywords:  Marginal Rates; Substitution; Quasiconcavity 
JEL:  D01 D11 
Date:  2015–07–17 
URL:  http://d.repec.org/n?u=RePEc:ris:uncgec:2015_006&r=upt 
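The two-good equivalence the abstract starts from is easy to verify numerically (a hypothetical example, not taken from the paper): for $u(x,y)=x^{1/2}y^{1/2}$ the marginal rate of substitution equals $y/x$, and it falls as we move rightward along an indifference curve, exactly as quasiconcavity requires.

```python
# Hypothetical two-good example: u(x, y) = sqrt(x * y).
# Along the indifference curve sqrt(x * y) = u_level we have y = u_level**2 / x,
# so the MRS (= MUx / MUy = y / x) should fall as x rises if u is quasiconcave.
def mrs_along_curve(x, u_level):
    y = u_level ** 2 / x        # solve the indifference curve for y
    return y / x                # MRS of this Cobb-Douglas with equal exponents

vals = [mrs_along_curve(x, u_level=10.0) for x in (1.0, 2.0, 4.0, 8.0)]
assert all(a > b for a, b in zip(vals, vals[1:]))   # diminishing MRS
print(vals)   # [100.0, 25.0, 6.25, 1.5625]
```

The paper's point is that with more than two goods the same one-dimensional check still works once the "other" good is taken to be a composite of all remaining goods, which sidesteps the bordered-Hessian sign conditions.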
By:  Michael Morreau (UiT, The Arctic University of Norway); John A. Weymark (Vanderbilt University) 
Abstract:  The social welfare functional approach to social choice theory fails to distinguish a genuine change in individual well-beings from a merely representational change due to the use of different measurement scales. A generalization of the concept of a social welfare functional is introduced that explicitly takes account of the scales used to measure well-beings, so as to distinguish between these two kinds of changes. This generalization of the standard theoretical framework results in a more satisfactory formulation of welfarism, the doctrine that social alternatives are evaluated and socially ranked solely in terms of the well-beings of the relevant individuals. This scale-dependent form of welfarism is axiomatized using this framework. The implications of this approach for characterizing classes of social welfare orderings are also considered. 
Keywords:  grading; measurement scales; social welfare functionals; utility aggregation; welfarism 
JEL:  D7 D6 
Date:  2015–07–13 
URL:  http://d.repec.org/n?u=RePEc:van:wpaper:vueconsub1500008&r=upt 
By:  Thomas Gall; David Reinstein 
Abstract:  When Al makes an offer to Betty that Betty observes and rejects, Al may “lose face”. This loss of face (LoF) may cost Al utility, either directly or through reputation effects. This can lead to fewer offers and inefficiency in the context of bilateral matching problems, e.g., the marriage market, research partnering, and international negotiations. We offer a simple model with asymmetric information, a continuous signal of an individual’s binary type, and a linear marriage production function. We add a primitive LoF term, characterize the stable equilibria, compare the benchmark without LoF to a case where only one side is vulnerable to LoF, and present comparative statics. A small amount of LoF has no effect on low types’ behavior but will make high types on both sides more selective. A stronger LoF drives high types out of the market and makes low types reverse snobs, further reducing welfare. LoF also makes rejecting strictly preferred to being rejected, making the “high types reject” equilibrium stable. We can eliminate the effects of LoF by letting the vulnerable side move second, or by setting up a “Conditionally Anonymous Environment” that reveals an offer only when both parties say yes. We motivate our model with a variety of empirical examples and suggest policy and managerial implications. 
Date:  2015–07–20 
URL:  http://d.repec.org/n?u=RePEc:esx:essedp:769&r=upt 
By:  Smith, Robert Elliott 
Abstract:  Making decisions under uncertainty is at the core of human decision-making, particularly economic decision-making. In economics, a distinction is often made between quantifiable uncertainty (risk) and unquantifiable uncertainty (Knight, Risk, Uncertainty and Profit, 1921). However, this distinction is often ignored by, in effect, quantifying unquantifiable uncertainty through the assumption of subjective probabilities in the minds of human decision-makers (Savage, The Foundations of Statistics, 1954). This idea is also reflected in developments in artificial intelligence (AI). However, there are serious reasons to doubt this assumption, which are relevant to both AI and economics. Some of the reasons for doubt relate directly to problems that AI has faced historically, which remain unsolved but little regarded. AI can proceed on a prescriptive agenda, making engineered systems that aid humans in decision-making, despite the fact that these problems may mean that the models involved depart seriously from real human decision-making, particularly under uncertainty. However, in descriptive uses of AI and similar ideas (like the modelling of decision-making agents in economics), it is important to have a clear understanding of what has been learned from AI about these issues. This paper looks at AI history in this light to illustrate what can be expected from models of human decision-making under uncertainty that proceed from these assumptions. Alternative models of uncertainty are discussed, along with their implications for examining in vivo human decision-making under uncertainty in economics. 
Keywords:  uncertainty, probability, Bayesian, artificial intelligence 
JEL:  B59 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:zbw:ifwedp:201550&r=upt 
By:  Ina A Taneva 
Abstract:  There are two ways of creating incentives for interacting agents to behave in a desired way. One is by providing appropriate payoff incentives, which is the subject of mechanism design. The other is by choosing the information that agents observe, which we refer to as information design. We consider a model of symmetric information where a designer chooses and announces the information structure about a payoff-relevant state. The interacting agents observe the signal realizations and take actions that affect the welfare of both the designer and the agents. We characterize the general finite approach to deriving the optimal information structure for the designer: the one that maximizes the designer's ex ante expected utility subject to agents playing a Bayes Nash equilibrium. We then apply the general approach to a symmetric two-state, two-agent, two-action environment in a parameterized underlying game and fully characterize the optimal information structure: it is never strictly optimal for the designer to use conditionally independent private signals; the optimal information structure may be a public signal or may consist of correlated private signals. Finally, we examine how changes in the underlying game affect the designer's maximum payoff. This exercise provides a joint mechanism/information design perspective. 
Keywords:  information design, implementation, incomplete information, Bayes correlated equilibrium, sender-receiver games 
JEL:  C72 D72 D82 D83 
Date:  2015–02–11 
URL:  http://d.repec.org/n?u=RePEc:edn:esedps:256&r=upt 
By:  Blanchard, Olivier (International Monetary Fund); Erceg, Christopher J. (Federal Reserve Board); Lindé, Jesper (Research Department, Central Bank of Sweden) 
Abstract:  We show that a fiscal expansion by the core economies of the euro area would have a large and positive impact on periphery GDP, assuming that policy rates remain low for a prolonged period. Under our preferred model specification, an expansion of core government spending equal to one percent of euro area GDP would boost periphery GDP by around 1 percent in a liquidity trap lasting three years, about half as large as the effect on core GDP. Accordingly, under a standard ad hoc loss function involving output and inflation gaps, increasing core spending would generate substantial welfare improvements, especially in the periphery. The benefits are considerably smaller under a utility-based welfare measure, reflecting in part that higher net exports play a material role in raising periphery GDP. 
Keywords:  Monetary Policy; Fiscal Policy; Liquidity Trap; Zero Bound Constraint; DSGE Model; Currency Union 
JEL:  E52 E58 
Date:  2015–07–01 
URL:  http://d.repec.org/n?u=RePEc:hhs:rbnkwp:0304&r=upt 