New Economics Papers on Utility Models and Prospect Theory
Issue of 2018‒08‒20
thirty-one papers chosen by
By: | Barnett, William; Han, Qing; Zhang, Jianbo |
Abstract: | A central tenet of behavioral economics is that the axioms producing expected utility maximization by consumers are too strong to be descriptive of rational behavior. The existing theory of monetary services aggregation under risk assumes expected utility maximization. We extend those results to uncertainty under weaker axiomatic assumptions by using Choquet expectations. Choquet integration reduces to Riemann integration as a special case under the stronger assumption of an additive probability measure, an assumption not accepted in the behavioral economics literature. Our theoretical results on monetary services aggregation generalize prior results, which are nested as special cases of ours under stronger behavioral assumptions. |
Keywords: | Uncertainty Aversion, User Cost, Choquet Expectation, Monetary Aggregation |
JEL: | C43 E41 G12 |
Date: | 2018–07–30 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:88261&r=upt |
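For readers new to the machinery, the Choquet expectation referred to above is, in standard textbook notation (not notation drawn from the paper itself), the integral of a random variable X with respect to a possibly non-additive capacity \nu:

\[
\mathbb{E}_{\nu}[X] \;=\; \int_{-\infty}^{0}\bigl[\nu(\{X \ge t\}) - 1\bigr]\,dt \;+\; \int_{0}^{\infty} \nu(\{X \ge t\})\,dt .
\]

When \nu is an additive probability measure P, \nu(\{X \ge t\}) = P(X \ge t) and the expression collapses to the ordinary expectation \mathbb{E}_P[X], which is the nesting of the classical results that the abstract describes.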
By: | de Groot, Oliver (University of St. Andrews); Richter, Alexander W. (Federal Reserve Bank of Dallas); Throckmorton, Nathaniel (College of William & Mary) |
Abstract: | The recent asset pricing literature finds valuation risk is an important determinant of key asset pricing moments. Valuation risk is modelled as a time preference shock within Epstein-Zin recursive utility preferences. While this form of valuation risk appears to fit the data extremely well, we show the preference specification violates an economically meaningful restriction on the weights in the Epstein-Zin time-aggregator. The same model with the corrected preference specification performs nearly as well at matching asset pricing moments, but only if the risk aversion parameter is well above the accepted range of values used in the literature. When the corrected preference specification is combined with Bansal-Yaron long-run risk, the estimated model significantly downgrades the role of valuation risk in determining asset prices. The only significant contribution of valuation risk is to help match the volatility of the risk-free rate. |
Keywords: | Epstein-Zin Utility; Valuation Risk; Equity Premium Puzzle; Risk-Free Rate Puzzle |
JEL: | D81 G12 |
Date: | 2018–07–20 |
URL: | http://d.repec.org/n?u=RePEc:fip:feddwp:1808&r=upt |
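As a point of reference for the valuation-risk discussion above, a common way of writing Epstein-Zin utility with a time preference shock a_t is

\[
V_t \;=\; \Bigl[\, a_t\,(1-\beta)\,c_t^{\,1-1/\psi} \;+\; \beta\,\bigl(\mathbb{E}_t\bigl[V_{t+1}^{\,1-\gamma}\bigr]\bigr)^{\frac{1-1/\psi}{1-\gamma}} \Bigr]^{\frac{1}{1-1/\psi}} .
\]

This is illustrative notation only: in the benchmark case the time-aggregator weights (1-\beta) and \beta sum to one, and, as we read the abstract, the restriction the authors enforce concerns precisely how the shock is allowed to perturb those weights; the exact specification is in the paper.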
By: | Christopher P. Chambers; Federico Echenique; Nicolas S. Lambert |
Abstract: | An experimenter seeks to learn a subject's preference relation. The experimenter produces pairs of alternatives. For each pair, the subject is asked to choose. We argue that, in general, large but finite data do not give close approximations of the subject's preference, even when the limiting (countably infinite) data are enough to infer the preference perfectly. We provide sufficient conditions on the set of alternatives, preferences, and sequences of pairs so that the observation of finitely many choices allows the experimenter to learn the subject's preference with arbitrary precision. While preferences can be identified under our sufficient conditions, we show that it is harder to identify utility functions. We illustrate our results with several examples, including consumer choice, expected utility, and preferences in the Anscombe-Aumann model. |
Date: | 2018–07 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1807.11585&r=upt |
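A toy illustration (ours, not the authors') of the abstract's final point, that preferences can be identified from choices while utility functions remain under-identified: any two utilities that are increasing transforms of one another generate identical pairwise choices on every finite dataset.

```python
# Two utility functions that agree on all pairwise choices over any finite
# sample of alternatives in [0, 1]: same preference, different utilities.
import random

def u1(x):
    return x        # one candidate utility

def u2(x):
    return x ** 3   # an increasing transform: identical preference order

random.seed(0)
pairs = [(random.random(), random.random()) for _ in range(1000)]

# Both utilities predict the same choice on every observed pair ...
assert all((u1(a) > u1(b)) == (u2(a) > u2(b)) for a, b in pairs)

# ... yet they differ as cardinal objects.
print(u1(0.5), u2(0.5))  # 0.5 versus 0.125
```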
By: | Ali al-Nowaihi; Sanjit Dhami; Mengxing Wei |
Abstract: | We formulate a simple quantum decision model of the Ellsberg paradox and report the results of an experiment we performed to test the matching probabilities predicted by this model using an incentive-compatible method. We find that the theoretical predictions of the model are consistent with our experimental results. We compare the predictions of our quantum model with those of probably the most successful non-quantum model of ambiguity, namely the source dependent model; the predictions of the two models are not statistically significantly different. The source dependent model, however, requires the specification of probability weighting functions in order to fit the evidence, whereas our quantum model makes no use of probability weighting functions. This suggests that much of what is normally attributed to probability weighting may actually be due to quantum probability. |
Keywords: | quantum probability, the Ellsberg paradox, the source dependent model, the law of total probability, the law of reciprocity, the Feynman rules, projective expected utility, bounded rationality, Diebold-Mariano forecasting tests |
JEL: | D03 |
Date: | 2018 |
URL: | http://d.repec.org/n?u=RePEc:ces:ceswps:_7158&r=upt |
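The "law of total probability" in the keywords is exactly what quantum probability relaxes. In standard quantum-probability algebra (a generic sketch, not the authors' specific model), with state \(|\psi\rangle = \sum_i c_i |a_i\rangle\),

\[
P(B) \;=\; \Bigl|\sum_i c_i \langle b \mid a_i\rangle\Bigr|^{2}
\;=\; \underbrace{\sum_i |c_i|^{2}\,|\langle b \mid a_i\rangle|^{2}}_{\text{classical total probability}}
\;+\; \underbrace{2\sum_{i<j} \operatorname{Re}\bigl(c_i \bar{c}_j\, \langle b \mid a_i\rangle\, \overline{\langle b \mid a_j\rangle}\bigr)}_{\text{interference terms}} ,
\]

and it is the interference terms that allow such models to mimic ambiguity attitudes without invoking probability weighting.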
By: | Jetlir Duraj |
Abstract: | Dynamic Random Subjective Expected Utility (DR-SEU) allows us to model choice data observed from an agent, or a population of agents, whose beliefs about objective payoff-relevant states and whose tastes can both evolve stochastically. Our observable, the augmented Stochastic Choice Function (aSCF), allows, in contrast to previous work in decision theory, for a direct test of whether the agent's beliefs reflect the true data-generating process conditional on her private information, as well as identification of possibly incorrect beliefs. We give an axiomatic characterization of when an agent satisfies the model, in both static and dynamic settings. We look at the case where the agent has correct beliefs about the evolution of objective states, as well as the case where her beliefs are incorrect but unforeseen contingencies are impossible. We also distinguish two subvariants of the dynamic model which coincide in the static setting: Evolving SEU, where a sophisticated agent's utility evolves according to a Bellman equation, and Gradual Learning, where the agent is learning about her taste. We prove simple and natural comparative-statics results on the degree of belief incorrectness as well as on the speed of learning about taste. Auxiliary results contained in the online appendix extend previous decision-theoretic work in the menu choice and stochastic choice literatures from both a technical and a conceptual perspective. |
Date: | 2018–08 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1808.00296&r=upt |
By: | Marcelo Veracierto (Federal Reserve Bank of Chicago) |
Abstract: | In this paper I study the optimal provision of unemployment insurance over the business cycle. A novel feature of the paper is that, instead of performing the analysis in an environment with exogenous restrictions on risk sharing such as borrowing constraints, the paper takes a more primitive mechanism design approach. In particular, the restrictions on risk sharing arise endogenously as a consequence of search intensities being private information of the individual agents. The economy is essentially a standard real business cycle model that incorporates unemployed agents similar to those in Hopenhayn and Nicolini (1997). Output, which can be consumed or invested, is produced with a Cobb-Douglas production function that uses capital and labor, subject to an aggregate productivity shock that follows an AR(1) process. This production technology is located in a single "production island" and workers must be located on this island in order to provide their labor services. Workers get separated from the production island at the beginning of the following period with a constant separation probability. Once outside the production island, agents must search in order to get back to it. The probability that a worker arrives at the production island at the beginning of the following period depends on her own search intensity. A crucial assumption is that this search intensity is private information of the agent; only the location of the agent at the beginning of the period (inside or outside the production island) is observed. Agents value consumption and leisure (which is obtained outside the production island) and dislike searching. In order to guarantee stationarity, I assume that agents have a constant probability of surviving between consecutive time periods. When an agent dies, he is immediately replaced by an offspring from which the agent derives no utility. Newborns start their life as unemployed agents (i.e. outside the production island). In this framework the social planner offers dynamic insurance contracts to the agents under full commitment. The state of the contract is given by the location of the agent at the beginning of the period and by the value that the contract promises to the agent at the beginning of the period. Given this state, the contract determines the consumption level of the agent during the current period and the contingent promised values at the beginning of the following period. These contingent promised values depend both on the realized employment status of the agent at the beginning of the following period and on the realized aggregate productivity shock at the beginning of the following period. The contract must deliver an expected lifetime discounted utility equal to the value promised at the beginning of the period (promise keeping constraint). Although search intensities are not directly observed by the social planner, he takes as given the optimal choice of individual search intensities of unemployed agents as a function of the difference that they face between the expected value of becoming employed at the beginning of the following period and the expected value of continuing unemployed (incentive compatibility constraint). The state of the economy for the social planner is the aggregate productivity level, the aggregate stock of capital, and the joint distribution of old agents (i.e. those that are not newborns at the beginning of the current period) across promised values and employment states.
Given this aggregate state, the social planner chooses investment and the dynamic insurance contracts to maximize the weighted expected lifetime utility of the current newborns and of all future newborns (with constant relative Pareto weights), subject to promise keeping, incentive compatibility and aggregate consumption feasibility constraints. Observe that the social planner does not seek to maximize the lifetime utilities of the current old agents since these are predetermined by their dynamic insurance contracts. Computing a solution to this mechanism design problem is a complex task given the high dimensionality of the state space. I use a method that I introduced in a previous paper (Veracierto 2017), which has the important advantage of not imposing an approximation to the law of motion for the distribution of agents across individual states. The method requires carrying as a state variable a long history of spline coefficients for the decision rules that have been chosen in the past. A (large) linear rational expectations model is then obtained by linearizing all first order conditions and aggregate feasibility constraints with respect to those spline coefficients. In that paper, the method was shown to reproduce some key analytical properties that could be derived in the case of logarithmic preferences, even though the computational method did not exploit any particular feature of that case. While this provided considerable confidence about the accuracy of the computational method, the logarithmic preferences corresponded to a case in which cross-sectional heterogeneity did not play an important role in aggregate fluctuations. For other preferences, the computational method still delivered numerical solutions in which the cross-sectional heterogeneity did not play a crucial role. Given those results, the computational method remained to be applied to an environment in which cross-sectional heterogeneity plays an important role in aggregate fluctuations. I expect that the model in this paper will provide such an environment. The reason is that when a positive aggregate productivity shock hits the economy and the planner wants to bring people quickly out of unemployment, the only way that he can induce individual agents to increase their search intensity is by increasing the difference between the expected value of becoming employed and the expected value of continuing unemployed. That is, the planner needs to worsen the insurance that he provides to unemployed agents in order to induce them to search more. This will introduce interesting interactions between social insurance (and, therefore, inequality) and properties of the aggregate business cycle, since the social planner will consequently be induced to respond less to aggregate shocks given the negative insurance effects that such a response entails. I am currently computing the optimal business-cycle fluctuations and will provide a draft of the paper with the results as soon as they are ready. |
Date: | 2018 |
URL: | http://d.repec.org/n?u=RePEc:red:sed018:281&r=upt |
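To fix ideas on the contracting problem sketched in the abstract, a stripped-down Hopenhayn-Nicolini-style recursion for an unemployed agent with promised value w might read as follows (an illustrative sketch; the paper's full problem adds aggregate shocks, capital, and the cross-sectional distribution as state variables):

\[
C(w) \;=\; \min_{c,\,s,\,w_e,\,w_u}\; c \;+\; \tfrac{1}{1+r}\,\bigl[s\,C_e(w_e) + (1-s)\,C(w_u)\bigr]
\]
subject to promise keeping,
\[
w \;=\; u(c) - \psi(s) + \beta\,\bigl[s\,w_e + (1-s)\,w_u\bigr],
\]
and incentive compatibility for the unobserved search intensity s,
\[
s \;\in\; \arg\max_{\tilde{s}}\;\; -\psi(\tilde{s}) + \beta\,\bigl[\tilde{s}\,w_e + (1-\tilde{s})\,w_u\bigr],
\]
where \psi is the disutility of search and w_e, w_u are the contingent promised values after finding a job or remaining unemployed.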
By: | Alejandro Noriega-Campero; Alex Rutherford; Oren Lederman; Yves A. de Montjoye; Alex Pentland |
Abstract: | Today's age of data holds high potential to enhance the way we pursue and monitor progress in the fields of development and humanitarian action. We study the relation between data utility and privacy risk in large-scale behavioral data, focusing on mobile phone metadata as a paradigmatic domain. To measure utility, we survey experts about the value of mobile phone metadata at various spatial and temporal granularity levels. To measure privacy, we propose a formal and intuitive measure of reidentification risk, the information ratio, and compute it at each granularity level. Our results confirm the existence of a stark tradeoff between data utility and reidentifiability, where the most valuable datasets are also most prone to reidentification. When data is specified at ZIP-code and hourly levels, outside knowledge of only 7% of a person's data suffices for reidentification and retrieval of the remaining 93%. In contrast, in the least valuable dataset, specified at municipality and daily levels, reidentification requires on average outside knowledge of 51%, or 31 data points, of a person's data to retrieve the remaining 49%. Overall, our findings show that coarsening data directly erodes its value, and highlight the need for using data-coarsening not as a stand-alone mechanism, but in combination with data-sharing models that provide adjustable degrees of accountability and security. |
Date: | 2018–08 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1808.00160&r=upt |
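Taking the abstract's figures at face value, the implied trace lengths can be backed out with a line of arithmetic; the snippet below is a reading aid only (it assumes average trace length is comparable across granularities), not the authors' computation.

```python
# Back out the implied number of data points per person: in the coarsest
# dataset, 51% of a person's trace corresponds to 31 data points.
points_at_51_pct = 31
total_points = points_at_51_pct / 0.51   # ~61 points per person

# In the finest (ZIP-code / hourly) dataset, 7% of a trace suffices:
print(round(total_points))               # ~61
print(round(total_points * 0.07))        # ~4 points to reidentify
```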
By: | Lijun Bo; Huafu Liao; Yongjin Wang |
Abstract: | This paper studies an optimal investment and risk control problem for an insurer facing default contagion and regime-switching. The insurer in our model allocates his/her wealth across multi-name defaultable stocks and a riskless bond under regime-switching risk. Default events have an impact on the distress state of the surviving stocks in the portfolio. The aim of the insurer is to maximize the expected utility of terminal wealth by selecting optimal investment and risk control strategies. We characterize the insurer's optimal trading strategy for the defaultable stocks and the optimal risk control. By developing a truncation technique, we analyze the existence and uniqueness of global (classical) solutions to the recursive HJB system, and we prove the verification theorem based on these solutions. |
Date: | 2018–07 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1807.05513&r=upt |
By: | Clement A. Tisdell |
Abstract: | This paper provides a sketch of the development of the concept of bounded rationality in economic thought. The concept of rationality has several meanings, and these different meanings are taken into account in considering the further development of economic thought. Different views of ecological rationality are critically examined in the light of these concepts, and whether various theories of behavioral economics can be classified as exhibiting bounded rationality is discussed. Satisficing behavior is commonly associated with bounded rationality but, as demonstrated, bounded rationality is not the only reason for adopting such behavior. The idea of some authors that optimization models under constraints are of little or no relevance to bounded rationality is rejected. The paper stresses that bounded rationality is an important contributor to the diversity of (economic) behaviors. Whether or not a behavior is rational depends to a considerable extent on the situation (the constraints) that decision-makers face; the time constraint is a particularly important influence on the rationality of decisions, and aspects of this are covered. |
Keywords: | Institutional and Behavioral Economics |
Date: | 2017–11–13 |
URL: | http://d.repec.org/n?u=RePEc:ags:uqseet:264873&r=upt |
By: | Frank Riedel (UJ - University of Johannesburg, Bauhaus-Universität Weimar); Jean-Marc Tallon (PJSE - Paris Jourdan Sciences Economiques - UP1 - Université Panthéon-Sorbonne - ENS Paris - École normale supérieure - Paris - INRA - Institut National de la Recherche Agronomique - EHESS - École des hautes études en sciences sociales - ENPC - École des Ponts ParisTech - CNRS - Centre National de la Recherche Scientifique, PSE - Paris School of Economics); Vassili Vergopoulos (PSE - Paris School of Economics, CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique) |
Abstract: | This paper extends decision theory under imprecise probabilistic information to dynamic settings. We explore the relationship between the given objective probabilistic information, an agent's subjective multiple priors, and updating. Dynamic consistency implies rectangular sets of priors at the subjective level. As the objective probabilistic information need not be consistent with rectangularity at the subjective level, agents might select priors outside the objective probabilistic information while respecting the support of the given set of priors. Under suitable additional axioms, the subjective set of priors belongs to the rectangular hull of the objective probabilistic information. |
Keywords: | imprecision aversion, multiple priors, imprecise information, dynamic consistency |
Date: | 2017–04 |
URL: | http://d.repec.org/n?u=RePEc:hal:wpaper:halshs-01513820&r=upt |
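For reference, the rectangularity condition invoked above is usually stated, in a two-stage setting with first-stage partition \(\{E_1,\dots,E_n\}\), as closure of the set of priors under pasting of marginals and conditionals (standard notation, not the paper's own):

\[
\mathcal{P} \text{ is rectangular if, for all } p, q_1, \dots, q_n \in \mathcal{P}: \quad \sum_{k=1}^{n} p(E_k)\, q_k(\,\cdot \mid E_k) \;\in\; \mathcal{P}.
\]

Dynamic consistency of multiple-priors preferences under prior-by-prior Bayesian updating is what forces the subjective set of priors into this class, which is why the paper lands on the rectangular hull of the objective information.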
By: | Tristan Le Cotty (CIRED - Centre International de Recherche sur l'Environnement et le Développement - CNRS - Centre National de la Recherche Scientifique - ENPC - École des Ponts ParisTech - AgroParisTech - EHESS - École des hautes études en sciences sociales - CIRAD - Centre de Coopération Internationale en Recherche Agronomique pour le Développement); Elodie Maitre d'Hotel (UMR MOISA - Marchés, Organisations, Institutions et Stratégies d'Acteurs - CIRAD - Centre de Coopération Internationale en Recherche Agronomique pour le Développement - Montpellier SupAgro - Centre international d'études supérieures en sciences agronomiques - INRA Montpellier - Institut national de la recherche agronomique [Montpellier] - CIHEAM - Centre International des Hautes Études Agronomiques Méditerranéennes - Montpellier SupAgro - Institut national d’études supérieures agronomiques de Montpellier, CIRAD - Centre de Coopération Internationale en Recherche Agronomique pour le Développement); Raphael Soubeyran (LAMETA - Laboratoire Montpelliérain d'Économie Théorique et Appliquée - UM1 - Université Montpellier 1 - UM3 - Université Paul-Valéry - Montpellier 3 - Montpellier SupAgro - Centre international d'études supérieures en sciences agronomiques - INRA Montpellier - Institut national de la recherche agronomique [Montpellier] - UM - Université de Montpellier - CNRS - Centre National de la Recherche Scientifique - Montpellier SupAgro - Institut national d’études supérieures agronomiques de Montpellier); Julie Subervie (LAMETA - Laboratoire Montpelliérain d'Économie Théorique et Appliquée - UM1 - Université Montpellier 1 - UM3 - Université Paul-Valéry - Montpellier 3 - Montpellier SupAgro - Centre international d'études supérieures en sciences agronomiques - INRA Montpellier - Institut national de la recherche agronomique [Montpellier] - UM - Université de Montpellier - CNRS - Centre National de la Recherche Scientifique - Montpellier SupAgro - Institut national d’études supérieures agronomiques de Montpellier) |
Abstract: | This paper investigates whether Burkinabe maize farmers' fertilizer-use decisions are correlated with their risk and time preferences. We conducted a survey and a series of hypothetical experiments on a sample of 1,500 farmers. We find that more patient farmers do use more fertilizer, but it is only because they plant more maize (a fertilizer-intensive crop) rather than because they use more fertilizer per hectare of maize planted. Conversely, we find no statistically significant link between risk aversion and fertilizer use. We use a simple two-period model, which suggests that risk aversion may indeed have an ambiguous effect on fertilizer use. |
Keywords: | agriculture, risk aversion, time preferences, agricultural price, West Africa, Burkina Faso, fertilizer |
Date: | 2017 |
URL: | http://d.repec.org/n?u=RePEc:hal:wpaper:hal-01429099&r=upt |
By: | Jie He (Département d'économique, École de gestion, Université de Sherbrooke); Bing Zhang (Nanjing University) |
Abstract: | Previous psychological and economic studies have observed systematic biases in people's predictions of their future utilities. In this paper, using repeated contingent valuation (CV) surveys conducted at very high frequency (every two weeks, 29 waves in total) in Nanjing, China from July 2014 to June 2015, we tested whether people's expected future utility from better air quality is overly influenced by the air quality at the moment of valuation. As air quality is, in general, subject to high day-to-day variability, its negative impact on people's utility (health, happiness, etc.) should, according to rational logic, be essentially stationary in the long run and depend on the yearly average air quality. Following this logic, and based on the classical random utility model, we should not expect daily air quality to be a determining factor in a rational person's valuation decision. Our results show, however, that people's willingness to pay (WTP) is significantly and positively affected by the level of PM2.5 concentration, one of the key air pollution indicators, which has been well understood for several years and is widely available on different media platforms for almost all large cities in China. We explored a range of rational explanations but found that our results were more consistent with the effects of psychological mechanisms, in particular projection bias. |
Keywords: | Psychological effects, projection bias, contingent valuation, air quality, China |
Date: | 2018–08 |
URL: | http://d.repec.org/n?u=RePEc:shr:wpaper:18-03&r=upt |
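The projection-bias mechanism invoked above is commonly formalized following Loewenstein, O'Donoghue and Rabin (2003): a person in current state s predicts her utility in a future state s' as the convex combination

\[
\tilde{u}(c, s' \mid s) \;=\; (1-\alpha)\,u(c, s') \;+\; \alpha\,u(c, s), \qquad \alpha \in [0,1],
\]

so with \alpha > 0 a respondent surveyed on a high-PM2.5 day overweights today's bad air when valuing future air quality, consistent with the WTP pattern reported above (whether the authors use this exact parametrization is not stated in the abstract).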
By: | Wunsch, Conny (University of Basel); Strobl, Eric (Aix-Marseille University) |
Abstract: | Negative income shocks can be the consequence of either risky choices or random events. A growing literature uses randomized experiments to analyze how responsibility for neediness affects informal financial support of individuals facing negative income shocks. In this paper, we show that studying this question involves a number of challenges that existing studies either have not been aware of or have been unable to address satisfactorily. We show that the average effect of free choice of risk on sharing, i.e. the comparison of mean sharing across randomized treatments, is not informative about the behavioural effects, and that it is not possible to ensure by experimental design that the average treatment effect equals the behavioural effect. Instead, isolating the behavioural effect requires conditioning on risk exposure. We show that a design that measures subjects' preferred level of risk in all treatments allows isolating this effect without additional assumptions. Another advantage of our design is that it allows disentangling changes in giving behaviour due to attributions of responsibility for neediness from other explanations. We implement our design in a lab experiment conducted with slum dwellers in Nairobi that measures subjects' transfers to a worse-off partner in settings where participants either deliberately chose or were randomly assigned to a safe or a risky project. We find that free choice matters for giving and that the effects depend on donors' risk preferences, but that attributions of responsibility play a negligible role in this context. |
Keywords: | solidarity, risk taking, experimental design |
JEL: | C91 D63 D81 O12 |
Date: | 2018–06 |
URL: | http://d.repec.org/n?u=RePEc:iza:izadps:dp11641&r=upt |
By: | Lioudmila Vostrikova (LAREMA - Laboratoire Angevin de REcherche en MAthématiques - UA - Université d'Angers - CNRS - Centre National de la Recherche Scientifique); Yuchao Dong |
Abstract: | This article is devoted to the maximisation of HARA utilities of a Lévy switching process on a finite time interval via the dual method. We give a description of all f-divergence minimal martingale measures in the initially enlarged filtration, the expression of their Radon-Nikodym densities involving Hellinger and Kullback-Leibler processes, the expressions of the optimal strategies in the progressively enlarged filtration for the maximisation of HARA utilities, as well as the values of the corresponding maximal expected utilities. The example of a Brownian switching model is presented to give a financial interpretation of the results. |
Keywords: | Lévy switching models, utility maximisation, dual approach, f-divergence minimal martingale measure, optimal strategy |
Date: | 2018–07–19 |
URL: | http://d.repec.org/n?u=RePEc:hal:wpaper:hal-01844635&r=upt |
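For reference, the HARA family mentioned above covers the standard utility specifications through one parametric form (textbook definition, not specific to this paper):

\[
U(x) \;=\; \frac{1-\gamma}{\gamma}\left(\frac{a\,x}{1-\gamma} + b\right)^{\!\gamma}, \qquad a > 0,
\]

whose absolute risk tolerance is linear in wealth, \(-U'(x)/U''(x) = x/(1-\gamma) + b/a\); power, logarithmic, and exponential utilities arise as special or limiting cases.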
By: | Argenziano, Rossella; Gilboa, Itzhak |
Abstract: | Agents make predictions based on similar past cases, while also learning the relative importance of various attributes in judging similarity. We ask whether the resulting "empirical similarity" is unique, and how easy it is to find it. We show that with many observations and few relevant variables, uniqueness holds. By contrast, when there are many variables relative to observations, non-uniqueness is the rule, and finding the best similarity function is computationally hard. The results are interpreted as providing conditions under which rational agents who have access to the same observations are likely to converge on the same predictions, and conditions under which they may entertain different probabilistic beliefs. |
Keywords: | Empirical Similarity; Belief Formation |
JEL: | A10 |
Date: | 2018–05–15 |
URL: | http://d.repec.org/n?u=RePEc:ebg:heccah:1265&r=upt |
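The "empirical similarity" objects under study are, in the Gilboa-Lieberman-Schmeidler formulation, similarity-weighted averages of past cases. A minimal sketch (variable names and the exponential similarity form are one standard choice, not necessarily the paper's):

```python
# Similarity-weighted prediction: s_w(x, x') = exp(-sum_j w_j |x_j - x'_j|),
# where the weight vector w encodes the relative importance of attributes.
import math

def similarity(w, x, xp):
    return math.exp(-sum(wj * abs(a - b) for wj, a, b in zip(w, x, xp)))

def predict(w, cases, x_new):
    """Predict the outcome at x_new as a similarity-weighted average."""
    s = [similarity(w, x, x_new) for x, _ in cases]
    return sum(si * y for si, (_, y) in zip(s, cases)) / sum(s)

cases = [((0.0, 1.0), 2.0), ((1.0, 0.0), 4.0), ((1.0, 1.0), 3.0)]
print(predict((1.0, 0.5), cases, (0.5, 0.5)))
```

Estimating the weight vector w from the data (e.g., by maximum likelihood) is the step whose uniqueness and computational hardness the paper characterizes.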
By: | Debraj Ray; Rajiv Vohra |
Abstract: | A game of love and hate is one in which a player's payoff is a function of her own action and the payoffs of other players. For each action profile, the associated payoff profile solves an interdependent utility system, and if that solution is bounded and unique for every profile we call the game coherent. Coherent games generate a standard normal form. Our central theorem states that every Nash equilibrium of such a game is Pareto optimal, in sharp contrast to the general prevalence of inefficient equilibria in the presence of externalities. While externalities in our model are restricted to flow only through payoffs, there are no other constraints: they could be positive or negative, or of varying sign. We further show that our coherence and continuity requirements are tight. |
Date: | 2018 |
URL: | http://d.repec.org/n?u=RePEc:bro:econwp:2018-8&r=upt |
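Concretely, the interdependent utility system described verbally above can be written (our notation, following the abstract) as

\[
u_i \;=\; f_i\bigl(a_i,\; u_{-i}\bigr), \qquad i = 1, \dots, n,
\]

for each action profile a; the game is coherent when this fixed-point system has a unique bounded solution u(a) for every profile, and the induced normal form assigns payoffs u_i(a).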
By: | Gilboa, Itzhak; Minardi, Stefania; Samuelson, Larry; Schmeidler, David |
Abstract: | We discuss the notion of a state of the world in axiomatic decision theory, and argue that it should be viewed as an "eventuality" that is implicitly assumed to be independent of the process by which preferences are observed. The distinction between states, which are assumed to resolve all uncertainty, and eventualities suggests certain limitations on the axiomatic approach to defining and measuring mental concepts (such as belief) by observed choices. |
Keywords: | axiomatic decision theory |
JEL: | A10 |
Date: | 2018–05–15 |
URL: | http://d.repec.org/n?u=RePEc:ebg:heccah:1267&r=upt |
By: | Adrian Bruhin; Maha Manai; Luis Santos-Pinto |
Abstract: | We analyze the relative importance of probability weighting and choice set dependence in describing risky choices both non-parametrically and with a structural model. Our experimental design uses binary choices between lotteries that may trigger Allais Paradoxes. We change the choice set by manipulating the correlation structure of the lotteries' payoffs while keeping their marginal distributions constant. This allows us to discriminate between probability weighting and choice set dependence. There are three main results. First, probability weighting and choice set dependence both play a role in describing aggregate choices. Second, the structural model uncovers substantial individual heterogeneity which can be parsimoniously characterized by three types: 38% of subjects engage primarily in probability weighting, 34% are influenced predominantly by choice set dependence, and 28% are mostly rational. Third, the classification of subjects into types predicts preference reversals out-of-sample. These results may not only further our understanding of choice under risk but may also prove valuable for describing the behavior of consumers, investors, and judges. |
Keywords: | Individual Choice under Risk; Choice Set Dependence; Probability Weighting; Latent Heterogeneity; Preference Reversals |
JEL: | D81 C91 C49 |
Date: | 2018–07 |
URL: | http://d.repec.org/n?u=RePEc:lau:crdeep:18.04&r=upt |
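A common one-parameter probability weighting function of the kind such structural models estimate is Tversky and Kahneman's (1992)

\[
w(p) \;=\; \frac{p^{\gamma}}{\bigl(p^{\gamma} + (1-p)^{\gamma}\bigr)^{1/\gamma}},
\]

which for \gamma < 1 overweights small probabilities and underweights large ones, the inverse-S shape implicated in Allais-type behavior. (Whether this is the exact parametric family used in the paper is not stated in the abstract.)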
By: | Mehta, Nirav |
Abstract: | Researchers commonly “shrink” raw quality measures based on statistical criteria. This paper studies when and how this transformation’s statistical properties would confer economic benefits to a utility-maximizing decisionmaker across common asymmetric information environments. I develop the results for an application measuring teacher quality. The presence of a systematic relationship between teacher quality and class size could cause the data transformation to do either worse or better than the untransformed data. I use data from Los Angeles to confirm the presence of such a relationship and show that the simpler raw measure would outperform the one most commonly used in teacher incentive schemes. |
Keywords: | empirical contracts, teacher incentive schemes, teacher quality, economics of education |
JEL: | J01 I21 I28 D81 |
Date: | 2018 |
URL: | http://d.repec.org/n?u=RePEc:zbw:esprep:180846&r=upt |
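The "shrinkage" transformation in question is, in its simplest empirical-Bayes form, a reliability-weighted pull of each raw teacher mean toward the grand mean. A minimal sketch with made-up numbers (the paper's estimator is richer and handles covariates):

```python
# Empirical-Bayes shrinkage of a noisy teacher-level mean toward the prior:
# shrunk = lam * raw + (1 - lam) * prior, where lam is a reliability weight.
def shrink(raw_mean, n_students, var_quality, var_noise, prior_mean=0.0):
    lam = var_quality / (var_quality + var_noise / n_students)
    return lam * raw_mean + (1.0 - lam) * prior_mean

# A teacher with few students is shrunk harder than one with many:
print(shrink(raw_mean=0.30, n_students=5,  var_quality=0.05, var_noise=0.25))
print(shrink(raw_mean=0.30, n_students=50, var_quality=0.05, var_noise=0.25))
```

The paper's point is that when class size is systematically related to true quality, this reliability weighting can serve a utility-maximizing decisionmaker worse than the raw means do.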
By: | Andreas Fuster; Ricardo Perez-Truglia; Basit Zafar |
Abstract: | Information frictions play an important role in many theories of expectation formation and macroeconomic fluctuations. We use a survey experiment to generate direct evidence on how people acquire and process information, in the context of national home price expectations. We let consumers buy different pieces of information that could be relevant for the formation of their expectations about the future median national home price. We use an incentive-compatible mechanism to elicit their maximum willingness to pay. We also introduce exogenous variation in the value of information by randomly assigning individuals to rewards for the ex-post accuracy of their expectations. Consistent with rational inattention, individuals are willing to pay more for information when they stand to gain more from it. However, underscoring the importance of limits on information processing capacity, individuals disagree on which signal they prefer to buy. Individuals with lower education and financial numeracy are less likely to demand information that has ex-ante higher predictive power, independently of stakes. As a result, lowering the information acquisition cost does not decrease the cross-sectional dispersion of expectations. Our findings have implications for models of expectation formation and for the design of information interventions. |
JEL: | C81 C93 D80 D83 D84 E27 E3 |
Date: | 2018–06 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:24767&r=upt |
By: | Wang, Zhen; Vukina, Tomislav |
Abstract: | In this paper we investigate sorting patterns among contract chicken producers. We show that the sub-game perfect Nash equilibrium of this contracting game exhibits positive sorting, whereby higher-ability producers sort themselves into contracts to grow larger chickens and lower-ability types sort themselves into contracts to grow smaller birds. We also show that eliciting this type of sorting behavior is profit maximizing for the principal. In the empirical part of the paper, we first estimate growers' abilities using a two-way fixed effects model and subsequently use these estimated abilities to estimate a random utility model of contract choice. Our results show that higher-ability growers are more likely to self-select into contracts with larger expected outputs (larger chickens), and the opposite is true for growers with lower abilities. The empirical results are strongly supportive of the developed theory. |
Keywords: | Labor and Human Capital |
Date: | 2017–03–01 |
URL: | http://d.repec.org/n?u=RePEc:ags:ncarwp:262930&r=upt |
By: | Zawojska, Ewa; Bartczak, Anna; Czajkowski, Mikołaj |
Abstract: | The stated preference literature suggests that, to be incentivised to reveal preferences truthfully in a survey, respondents need to believe that a response in favour of a policy project of providing a public good increases the chances of actual provision of the good (policy consequentiality) and that the cost of conducting the policy project stated in the survey will actually be collected upon the policy's implementation (payment consequentiality). We investigate the effects of these two aspects of consequentiality beliefs on stated preferences in a field survey concerning renewable energy development in Poland. Using a hybrid choice model to capture unobservable beliefs in consequentiality, we find that latent beliefs in policy consequentiality and in payment consequentiality affect stated preferences differently: respondents believing in policy consequentiality prefer the project implementation to the status quo more than those believing in payment consequentiality, and respondents believing in payment consequentiality state significantly lower willingness to pay for the project than those believing in policy consequentiality. Respondents with no clear opinion on the degree of the survey's consequentiality reveal substantially different preferences; they are much less interested in seeing the proposed project implemented. We also find that respondents' risk attitudes impinge neither on their self-reported perceptions of the survey's consequentiality nor on their preferences. |
Keywords: | Institutional and Behavioral Economics, Resource /Energy Economics and Policy, Risk and Uncertainty |
Date: | 2017–06–21 |
URL: | http://d.repec.org/n?u=RePEc:ags:caes17:258602&r=upt |
By: | André Lapidus (PHARE - Pôle d'Histoire de l'Analyse et des Représentations Economiques - UP1 - Université Panthéon-Sorbonne - UPN - Université Paris Nanterre - CNRS - Centre National de la Recherche Scientifique) |
Abstract: | This paper shows that Hume's theory of passion, as elaborated mainly in Book II of the Treatise of Human Nature (1739-40) and in the Dissertation on the Passions (1757), gives rise to a conception of the decision process which challenges the canonical approach to the rationality of decision, as rationality of preferences or rationality of choice. It shows that, from a Humean perspective, rationality is not embodied in consistency requirements on individual behaviour, but may emerge as a possible outcome of certain dispositions of our mind, which make the world inhabited by our emotions. |
Keywords: | Hume, economic philosophy, rationality, decision, passion, emotion, desire, preference, will, choice |
Date: | 2018 |
URL: | http://d.repec.org/n?u=RePEc:hal:journl:hal-01831901&r=upt |
By: | Jerome L Kreuser (ETH Zurich); Didier Sornette (ETH Zürich and Swiss Finance Institute) |
Abstract: | We propose a dynamic Rational Expectations (RE) bubble model of prices, with the aim of using it to construct and evaluate optimal investment strategies. Our bubble model is defined as a geometric Brownian motion combined with separate crash (and rally) discrete jump distributions associated with positive (and negative) bubbles. We assume that crashes tend to efficiently bring excess bubble prices back close to a "normal" or fundamental value ("efficient crashes"). Then, the RE condition implies that the excess risk premium of the risky asset exposed to crashes is an increasing function of the amplitude of the expected crash, which itself grows with the bubble mispricing: hence, the larger the bubble price, the larger its subsequent growth rate. This positive feedback of price on return is the archetype of super-exponential price dynamics, which has been previously proposed as a general definition of bubbles. Our bubble model also allows for a sequence of small jumps or long-term corrections. We use the RE condition to estimate the real-time crash probability dynamically through an accelerating probability function depending on the increasing expected return. After showing how to estimate the model parameters, we examine the optimal investment problem in the context of the bubble model by obtaining an analytic expression for maximizing the expected log of wealth (Kelly criterion) for the risky asset and a risk-free asset. We also obtain a closed-form approximation for the optimal investment. We demonstrate, on seven historical crashes, the promising outperformance of the method compared to a 60/40 portfolio, the classic Kelly allocation, and the risky asset, and how it mitigates jumps, both positive and negative. |
Keywords: | financial bubbles, efficient crashes, positive feedback, rational expectation, Kelly criterion, optimal investment |
JEL: | C53 G01 G17 |
Date: | 2017–11 |
URL: | http://d.repec.org/n?u=RePEc:chf:rpseri:rp1733&r=upt |
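To illustrate the Kelly step alone (not the authors' full rational-expectations bubble estimation), the expected log growth rate with crash risk can be maximized numerically; all parameter values below are hypothetical.

```python
# Kelly allocation under crash risk: choose the fraction f in the risky
# asset to maximize E[log(1 + rf + f*(r - rf))] with a binary return
# (boom vs. crash). Parameters are illustrative, not paper estimates.
import math

def expected_log_growth(f, r_boom=0.08, r_crash=-0.50, p_crash=0.10, rf=0.01):
    win = math.log(1 + rf + f * (r_boom - rf))
    lose = math.log(1 + rf + f * (r_crash - rf))
    return (1 - p_crash) * win + p_crash * lose

best = max((f / 1000 for f in range(1001)), key=expected_log_growth)
print(round(best, 3))  # optimal risky-asset fraction on the grid
```

In the paper's setting both the crash probability and the expected crash size move with the bubble, so the optimal fraction becomes state-dependent rather than a constant.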
By: | Le Treust, Maël (ENSEA/ETIS - UMR CNRS 8051); Tomala, Tristan (HEC Paris) |
Abstract: | In this article, we investigate strategic information transmission over a noisy channel. This problem has been widely investigated in economics when the communication channel is perfect. Unlike in information theory, the encoder and decoder have distinct objectives and choose their encoding and decoding strategies accordingly. This approach radically differs from the conventional communication paradigm, which assumes transmitters are of two types: either they have a common goal, or they act as opponents, e.g. jammers or eavesdroppers. We formulate a point-to-point source-channel coding problem with state information, in which the encoder and the decoder choose their respective encoding and decoding strategies in order to maximize their long-run utility functions. This strategic coding problem lies at the interplay between Wyner-Ziv's scenario and the Bayesian persuasion game of Kamenica-Gentzkow. We characterize a single-letter solution and relate it to previous results by using the concavification method. This confirms the benefit of sending encoded data bits even if the decoding process is not supervised, e.g. when the decoder is an autonomous device. Our solution has two interesting features: it might be optimal not to use all channel resources, and the informational content impacts the encoding process, since utility functions capture preferences over source symbols. |
Keywords: | strategic information transmission; strategic coding problem |
JEL: | D82 D83 |
Date: | 2018–07–13 |
URL: | http://d.repec.org/n?u=RePEc:ebg:heccah:1288&r=upt |
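The concavification method referenced above is the Kamenica-Gentzkow characterization: writing \(\hat{u}(\mu)\) for the sender's value at posterior belief \mu, the value of optimal persuasion from prior \(\mu_0\) is

\[
V(\mu_0) \;=\; \operatorname{cav}\hat{u}\,(\mu_0) \;=\; \max\Bigl\{ \textstyle\sum_k \lambda_k\,\hat{u}(\mu_k) \;:\; \lambda_k \ge 0,\ \sum_k \lambda_k = 1,\ \sum_k \lambda_k\,\mu_k = \mu_0 \Bigr\},
\]

the smallest concave function above \(\hat{u}\), evaluated at the prior; the paper adapts this logic to coding over a noisy channel.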
By: | Escañuela Romana, Ignacio |
Abstract: | New solutions to the standard New Keynesian model are explored. I extend De Grauwe's model (2012), distinguishing two types of agents and different expectations rules. The central bank fixes the rate of interest, while families and firms determine aggregate demand and supply. None of them follows the hypothesis of perfect rational expectations; however, Popper's principle of rationality is applied. Starting from a situation of limited information, even though they learn through rational processes, they are unable to understand each other's behaviour. Therefore, the expectations in the three equations do not coincide, and as a result the solution does not tend to a single, stationary equilibrium. This conclusion does not depend on the hypothesis of "animal spirits". Finally, the possibility of a successful learning process is studied: could the central bank learn from the data, eventually reaching a stationary optimal equilibrium? The answer is no. The New Keynesian model appears to be fundamentally unstable when agents have limited information. The problem lies in the impossibility of achieving adequate coordination. |
Keywords: | Business Cycles, Imperfect Information, Learning, Monetary Policy. |
JEL: | D83 E10 E32 E52 |
Date: | 2018–07 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:88015&r=upt |
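For orientation, the standard New Keynesian model being extended consists of three equations; the paper's point is precisely that the expectation operators \(\tilde{\mathbb{E}}_t\) appearing in them need not coincide across agents (notation generic):

\[
\begin{aligned}
y_t &= \tilde{\mathbb{E}}_t\, y_{t+1} - \tfrac{1}{\sigma}\bigl(i_t - \tilde{\mathbb{E}}_t\, \pi_{t+1}\bigr) + \varepsilon^{y}_{t} && \text{(aggregate demand / IS)}\\
\pi_t &= \beta\, \tilde{\mathbb{E}}_t\, \pi_{t+1} + \kappa\, y_t + \varepsilon^{\pi}_{t} && \text{(aggregate supply / Phillips curve)}\\
i_t &= \phi_{\pi}\, \pi_t + \phi_{y}\, y_t + \varepsilon^{i}_{t} && \text{(interest rate rule)}
\end{aligned}
\]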
By: | Claudio Michelacci (EIEF); Andrea Pozzi (EIEF); Luigi Paciello (EIEF) |
Abstract: | We document that around one half of the cyclical variation in aggregate non-durable consumption expenditures by US households comes from changes in the products entering their consumption basket. Most of this variation is due to changes in the rate at which households add new products to their basket, while removals from the basket are relatively acyclical. These patterns hold true within narrowly defined sectors of products or quality categories and are only partly driven by changes in the price of products or their availability in the market. We rationalize this evidence by incorporating a conventional random utility model of discrete choice of products into a standard household dynamic optimization problem. The household's preferences over products in her consideration set vary randomly over time, and because of this a larger set reduces the welfare-relevant household price index. The household can save in financial assets and decides how much to spend on experimenting with new products to be added to her consideration set. In response to income shocks the household increases savings and experiments more, which allows her to smooth consumption by persistently reducing her future price index. The calibrated model predicts that experimentation expenditures fluctuate by around 15 percent from peak to trough over the business cycle. This experimentation channel has novel implications for consumption smoothing, the measurement of household-level inflation, and the role of aggregate demand stabilization policies. Motivated by this evidence, we embed a standard discrete choice model of product choice into a macro model. Random shocks to preferences cause products to be temporarily added to and removed from the consumption basket, while experimentation effort by households expands their consideration set, leading to products being added to their consumption basket. This mechanism microfounds love for variety as a household's effort to expand the consideration set to better fit its random preferences. Expansions in the consideration set have long-lasting effects on household welfare, as they persistently reduce the welfare-relevant price paid by the household, providing a substitute to savings in smoothing utility from consumption expenditure over time. We calibrate the model using the scanner data to show that product experimentation is pro-cyclical, acts as a substitute for savings, and accounts for a large fraction of the cyclical behavior of product addition. We validate the predictions of the model by analyzing the response to an exogenous income shock, the 2008 U.S. fiscal stimulus, and showing that it led to a surge in product experimentation. |
Date: | 2018 |
URL: | http://d.repec.org/n?u=RePEc:red:sed018:1008&r=upt |
By: | Ozlem Omer (Middle East Technical University) |
Abstract: | In this article, we demonstrate that a quantal response statistical equilibrium approach to the US housing market, with the help of the maximum entropy method of modeling, is a powerful way of revealing different characteristics of housing market behavior before, during and after the recent housing market crash in the US. In this line, a maximum entropy approach to the quantal response statistical equilibrium model (QRSE), introduced by Scharfenaker and Foley (2017), is employed in order to model housing market dynamics in different phases of the most recent housing market cycle, using the S&P Case-Shiller housing price index for the 20 largest metropolitan regions and the Freddie Mac housing price index (FMHPI) for 367 metropolitan cities in the US between 2000 and 2015. Estimated model parameters provide an alternative way to understand and explain the behavior of economic agents and market dynamics, questioning traditional economic theory, which takes as given the behavior of a rational, utility-maximizing representative agent with self-fulfilled expectations. |
Keywords: | Housing Market Crash, Statistical Equilibrium, Quantal Response, Informational Entropy, Maximum Entropy Method |
JEL: | C18 D89 D90 E30 G01 R39 |
Date: | 2018–08 |
URL: | http://d.repec.org/n?u=RePEc:new:wpaper:1809&r=upt |
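At the core of quantal-response models like QRSE is a logit action rule in which the probability of acting rises smoothly with the payoff signal, governed by a behavioral temperature T; the logit form can be derived as the maximum-entropy choice distribution subject to a constraint on expected payoff. A minimal sketch of that ingredient (parameter names ours, not the authors' estimates):

```python
# Logit quantal response: the probability of taking the action rises
# smoothly in the payoff x relative to a threshold mu, with temperature T.
import math

def quantal_response(x, mu=0.0, T=1.0):
    return 1.0 / (1.0 + math.exp(-(x - mu) / T))

for x in (-2.0, 0.0, 2.0):
    print(x, round(quantal_response(x), 3))
```

Lower T approaches a sharp best response; higher T spreads actions out. Phase-by-phase estimates of such parameters are what characterize the behavioral regimes of the housing cycle in the paper.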
By: | Antler, Yair |
Abstract: | We identify conditions on agents' tendency to spread information by word of mouth under which a principal can design a pyramid scam to exploit a network of boundedly rational agents whose beliefs are coarse. Our main result is that a pyramid scam is sustainable only if its underlying reward scheme compensates the participants based on multiple levels of their downlines (e.g., for recruiting new members to the pyramid and for recruitments made by these new members). Motivated by the growing discussion of the legitimacy of multilevel marketing schemes and their resemblance to pyramid scams, we use our model to compare the two phenomena based on their underlying compensation structures. |
Keywords: | pyramid scams; multilevel marketing; analogy-based expectations; coarse feedback; bounded rationality. |
Date: | 2018–07 |
URL: | http://d.repec.org/n?u=RePEc:cpr:ceprdp:13054&r=upt |
By: | Hector Chade (Arizona State University) |
Abstract: | This paper develops and analyzes a model of imperfect competition under adverse selection with private values among a finite set of heterogeneous principals (henceforth firms) for (a large number of) heterogeneous agents (henceforth workers). Firms differ in the technology that translates the effort of a worker into revenues for the firm, and we assume that firms are ordered by a single-crossing property between effort and the type of technology (better firms have a higher marginal revenue from effort). In turn, workers differ in ability, which determines their disutility of effort. Firms compete with each other by offering menus consisting of wage-effort pairs (contracts), one for each type of worker. Alternatively, one can think of firms offering effort-utility pairs for each type. After observing the menus, workers self-select by choosing the best contract for them. Although we cast the model as a labor market, it is straightforward to change the notation and think about it in terms of firms that produce goods of different qualities for different consumers: instead of a revenue function there will be a cost function, instead of a wage a payment from consumers to firms, instead of a worker's ability a consumer's value for quality, and instead of disutility of effort a utility for quality. The analysis reveals that instead of the standard trade-off between efficiency and information rents, the relevant trade-off under imperfect competition among firms with adverse selection is between efficiency and information rents plus market coverage, since changing the menu offered not only affects efficiency and the information rents given to the workers hired but also affects the measure of workers targeted by a firm. We show that a pure strategy Nash equilibrium (PSNE) exists in this modified game (which is in turn a PSNE of the original game), and that in equilibrium the market segments into contiguous intervals, with each firm hiring only workers whose types belong to a given interval, and with better firms targeting better intervals of worker types and thus having a better workforce composition. In equilibrium, the worst firm (in the single-crossing order defined above) distorts effort provision upward for all the types it serves, while the best firm distorts it downward. All other firms exhibit both types of distortions: downward distortions for the lower types they serve, and upward distortions beyond an interior efficient type. Interestingly, the equilibrium effort function exhibits jumps at transition points between firms. In the firms-customers interpretation, this implies that the quality provided in equilibrium across the entire spectrum of the market will exhibit gaps, a potentially testable implication. Regarding curvature properties of the equilibrium menus, we show that they exhibit 'quantity discounts' in the following sense: the wage per unit of effort is decreasing in the amount of effort induced in each worker hired by the firm (a similar interpretation holds for the other applications mentioned above regarding firms and customers). |
Date: | 2018 |
URL: | http://d.repec.org/n?u=RePEc:red:sed018:1241&r=upt |
By: | Parenti, Mathieu; Sidorov, Alexander V.; Thisse, Jacques-Francois |
Abstract: | Published in: Contributions to Game Theory and Management, vol. X. Collected papers presented at the Tenth International Conference Game Theory and Management / Editors Leon A. Petrosyan, Nikolay A. Zenkevich. SPb.: Saint Petersburg State University, 2017. 404 p. |
Keywords: | Cournot competition, Bertrand competition, free entry, Lerner index, indirect utility |
Date: | 2017 |
URL: | http://d.repec.org/n?u=RePEc:sps:cpaper:10463&r=upt |