nep-upt New Economics Papers
on Utility Models and Prospect Theory
Issue of 2026–02–23
nine papers chosen by
Alexander Harin


  1. Leisure and consumption in three dimensions By Miller, Anne
  2. Optimal choice of crop insurance: The Case of Winter Barley in France By Diana Dorobantu; Gia Hien Pham
  3. Decision-Making when Computational Complexity Drives Uncertainty By Bossaerts, P.
  4. Fat-tailed Distribution under the Smooth Ambiguity Model By Osei, Prince
  5. Sampled-Data Wasserstein Distributionally Robust Control of Multiplicative Systems: A Convex Relaxation with Performance Guarantees By Chung-Han Hsieh
  6. Mises' Regression Theorem and Bitcoin - From a Problem to a Full Program and Methodology for Researching the Non-Monetary Utility of Cryptographic Money By Selinger, Daniel P; Crofton, Isaak
  7. Examination of risks in circular supply chains using transition management lens: towards a circular economy in emerging markets By Divya Choudhary; Ajay Kumar; Yeming Gong; Thanos Papadopoulos
  8. Shrink with Purpose: Optimal Covariance Matrix Estimation for Portfolio Selection By Lassance, Nathan; Vanderveken, Rodolphe; Vrins, Frédéric
  9. The welfare cost of ignoring the beta By Christian Gollier

  1. By: Miller, Anne
    Abstract: The separability rule is unable to distinguish between two commodities fulfilling the same need and those fulfilling different needs, for utilities displaying only diminishing marginal utility. An S-shaped utility, bounded below and above, represents the stages of an individual’s fulfilment of a need, including deprivation (increasing marginal utility), subsistence, sufficiency (diminishing marginal utility) and satiation. A utility function is created by adding two (S-shaped) normal cumulative distribution functions for consumption and leisure, each with a subsistence and an intensity-of-need parameter and satiation at infinity. Its indifference curve map features a straight-line indifference curve separating concave- from convex-to-the-origin indifference curves. The utility function is then maximised subject to a budget constraint to produce consumption demand and labour supply equations. These two functional forms are dependent on only two independent variables – the real wage rate and endowments of unearned consumption. Thus, both consumption demand and labour supply are 3-dimensional figures, which ideally would be presented as 3-D models. The typical demand/supply and Engel diagrams are only two-dimensional, representing a dependent variable as a function of only one of its two independent variables, from which the 3-D figure is very difficult to envisage. The aim of this paper is to present the third 2-D diagram for each dependent variable, presented as contours on a map of the real wage rate vs endowments. They highlight the instability of labour and consumption around the intersection of the ‘survival endowment’ and ‘equilibrium wage/price’ created by the straight-line indifference curve.
    Keywords: S-shaped cardinal utility includes increasing marginal utility expressing deprivation; additive utilities represent separate needs; dysfunctional poverty causes involuntary unemployment and disequilibrium; labour and consumption contours; instability at the conjunction of ‘survival endowment’ and ‘equilibrium wage’.
    JEL: D11 J22
    Date: 2025–12–29
    URL: https://d.repec.org/n?u=RePEc:pra:mprapa:127641
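As a rough illustration of the additive S-shaped utility described in the abstract — a sum of two normal CDFs, one per need — the following sketch uses illustrative parameter names and values (`sub_c`, `need_c`, etc. are assumptions, not taken from the paper):

```python
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def utility(c, l, sub_c=1.0, need_c=0.5, sub_l=8.0, need_l=2.0):
    """Additive S-shaped utility: one normal CDF per need.

    c: consumption, l: leisure; sub_* are subsistence (location)
    parameters, need_* are intensity-of-need (scale) parameters.
    All names and values are illustrative, not from the paper.
    """
    return norm_cdf((c - sub_c) / need_c) + norm_cdf((l - sub_l) / need_l)

# Marginal utility increases below each subsistence level (deprivation),
# diminishes above it, and total utility is bounded above by 2
# (satiation at infinity).
print(round(utility(1.0, 8.0), 3))  # at both subsistence points: 1.0
```

The bounded, S-shaped form is what generates the straight-line indifference curve separating concave from convex regions in the paper's indifference map.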
  2. By: Diana Dorobantu (Institut de Science Financières et d'Assurance, LSAF - Laboratoire de Sciences Actuarielle et Financière - UCBL - Université Claude Bernard Lyon 1 - Université de Lyon); Gia Hien Pham
    Abstract: This paper analyzes how the agricultural insurance market is adapting to climate change, particularly as extreme weather events become more frequent and severe. We focus on the optimal decision faced by a risk-averse farmer who wants to insure their crop while also saving. They can choose between traditional loss-based insurance, index-based insurance, or a mix of both. By maximizing the farmer's CARA utility function, we show that in some cases, a mixed insurance strategy is more advantageous than a single contract. In our model, the farmer insures only part of the crop when the market interest rate is strictly positive. Demand for traditional and index insurance depends on their respective prices. Highly risk-averse farmers prefer traditional insurance. A numerical application to the French agriculture sector indicates that mean spring temperature primarily affects winter barley yield and could therefore be the main indicator for index-based insurance design. Insurance simulations using the theoretical model and the estimated results further illustrate these findings.
    Keywords: Agriculture yields, Loss-Based Insurance, Index Insurance, CARA utility function, Utility maximization
    Date: 2025–12–09
    URL: https://d.repec.org/n?u=RePEc:hal:journl:hal-05318094
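The role of CARA utility in such a comparison can be sketched with a toy certainty-equivalent calculation. This is a generic illustration — the wealth distribution, premium, and indemnity rule below are invented, not the paper's calibrated model:

```python
import random
from math import exp, log

def cara_u(w, a):
    """CARA utility with absolute risk aversion a > 0."""
    return -exp(-a * w) / a

def certainty_equivalent(wealth_draws, a):
    """Sure wealth giving the same expected CARA utility."""
    eu = sum(cara_u(w, a) for w in wealth_draws) / len(wealth_draws)
    return -log(-a * eu) / a

random.seed(0)
# Toy yield revenue: uninsured vs half-indemnified at a flat premium
# (numbers are illustrative, not the French winter-barley data).
risky = [random.gauss(100.0, 30.0) for _ in range(10_000)]
premium = 5.0
insured = [0.5 * w + 0.5 * 100.0 - premium for w in risky]

for a in (0.01, 0.05):
    print(a, round(certainty_equivalent(risky, a), 1),
          round(certainty_equivalent(insured, a), 1))
```

At the higher risk aversion the insured position has the larger certainty equivalent, while at the lower one the premium is not worth paying — mirroring the abstract's point that demand for each contract depends on price and risk aversion.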
  3. By: Bossaerts, P.
    Abstract: This review summarizes research over the last two decades on human attitudes towards computationally "hard" problems. The focus is on the nature of uncertainty that computational complexity generates because humans do not have the cognitive capacity or the resources (time) to fully resolve the problems they are dealing with. Although decision theorists have traditionally labeled this type of uncertainty as ambiguity, behavior under computational complexity shows that humans neither deal with it as prescribed in rational decision theory nor simply avoid it as in traditional accounts of ambiguity aversion. Instead, behavior (effort applied and performance reached) exhibits distinct features that can be rationalized using the theory of computational complexity, originally developed for electronic computers. Although the theory cannot decisively tell us which problems are most difficult, it does provide classifications that allow one to predict human performance and effort. The theory also identifies which instances of a problem are more difficult, and human performance and effort appear to align with this identification. Evidence is discussed that humans do not appear to allocate cognitive effort ex ante when faced with a "hard" choice. Absence of correlation between early neural signals and ex ante metrics of instance difficulty corroborates this finding. Finally, the heterogeneity in ways humans approach "hard" problems suggests that, collectively, much can be gained from incentive mechanisms that promote communication. Particular market designs appear to be extremely effective in helping participants make "hard" choices.
    Keywords: Computational Complexity, NP-Hard, Uncertainty, Ambiguity, Decision-Making, Rationality, Opportunism, Algorithms, Expected Utility, Neuroeconomics, Cognitive Foundations of Decision-Making
    Date: 2026–01–31
    URL: https://d.repec.org/n?u=RePEc:cam:camdae:2611
  4. By: Osei, Prince (Center for Mathematical Economics, Bielefeld University)
    Abstract: We study the ambiguity-adjusted return distribution induced by an investor with smooth ambiguity preferences à la Klibanoff et al. (2005), who faces uncertainty about the variance of asset returns. The variance uncertainty is modeled using a gamma distribution, a second-order prior over the family of normally distributed returns. Our main results present a density distortion that exponentially tilts this prior into an ambiguity-adjusted gamma distribution, characterized by its distorted rate parameter and shape parameter. A smaller distorted rate parameter implies greater weight on high-variance returns. This paper derives the ambiguity-adjusted return distribution as a symmetric variance–gamma distribution reflecting the investor’s risk and ambiguity aversion. The ambiguity-averse investor assigns a variance–gamma distribution with a higher likelihood of extreme returns, while the ambiguity-neutral investor assigns a distribution more peaked around the mean. We obtain the ambiguity-adjusted return variance as an increasing function of risk and ambiguity aversion. An empirical comparison is performed to calibrate the ambiguity aversion parameter of an investor investing in gold.
    Keywords: Asset returns, Smooth ambiguity aversion, Variance uncertainty, Variance–Gamma distribution
    Date: 2026–02–11
    URL: https://d.repec.org/n?u=RePEc:bie:wpaper:764
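The normal–gamma variance mixture the abstract describes can be simulated directly: drawing the variance from a gamma distribution and then the return from a normal with that variance yields a symmetric variance–gamma marginal. The parameters below are illustrative, not the paper's calibration:

```python
import random

random.seed(1)

def sample_vg(shape, rate, n):
    """Normal-gamma mixture: sigma^2 ~ Gamma(shape, rate), then
    r | sigma^2 ~ N(0, sigma^2). Marginally this is a symmetric
    variance-gamma distribution."""
    out = []
    for _ in range(n):
        var = random.gammavariate(shape, 1.0 / rate)  # scale = 1/rate
        out.append(random.gauss(0.0, var ** 0.5))
    return out

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def excess_kurtosis(xs):
    m = sum(xs) / len(xs)
    m2 = var(xs)
    m4 = sum((x - m) ** 4 for x in xs) / len(xs)
    return m4 / m2 ** 2 - 3.0

# A smaller rate parameter raises the mean variance shape/rate, putting
# more weight on high-variance states; the mixture itself is fat-tailed
# (positive excess kurtosis) relative to a normal.
base = sample_vg(2.0, 2.0, 50_000)
tilted = sample_vg(2.0, 1.0, 50_000)
print(round(var(base), 2), round(var(tilted), 2),
      round(excess_kurtosis(base), 2))
```

The positive excess kurtosis is the "fat-tailed" feature in the title; lowering the (distorted) rate, as ambiguity aversion does in the paper, inflates the return variance.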
  5. By: Chung-Han Hsieh
    Abstract: This paper investigates the robust optimal control of sampled-data stochastic systems with multiplicative noise and distributional ambiguity. We consider a class of discrete-time optimal control problems where the controller jointly selects a feedback policy and a sampling period to maximize the worst-case expected concave utility of the inter-sample growth factor. Modeling uncertainty via a Wasserstein ambiguity set, we confront the structural obstacle of “concave-max” geometry arising from maximizing a concave utility against an adversarial distribution. Unlike standard convex loss minimization, the dual reformulation here requires a minimax interchange within the semi-infinite constraints, where the utility's concavity precludes exact strong duality. To address this, we utilize a general minimax inequality to derive a tractable convex relaxation. Our approach yields a rigorous lower bound that functions as a probabilistic performance guarantee. We establish an explicit, non-asymptotic bound on the resulting duality gap, proving that the approximation error is uniformly controlled by the Lipschitz-smoothness of the stage reward and the diameter of the disturbance support. Furthermore, we introduce necessary and sufficient conditions for robust viability, ensuring state positivity invariance across the entire ambiguity set. Finally, we bridge the gap between static optimization and dynamic performance, proving that the optimal value of the relaxation serves as a rigorous deterministic floor for the asymptotic average utility rate almost surely. The framework is illustrated on a log-optimal portfolio control problem, which serves as a canonical instance of multiplicative stochastic control.
    Date: 2026–02
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2602.04219
  6. By: Selinger, Daniel P; Crofton, Isaak
    Abstract: This essay argues that Ludwig von Mises’ regression theorem, when interpreted with full attention to its original logical structure and later Austrian developments, can be elevated from a narrow solution to the monetary circularity problem into a comprehensive methodological framework for analysing the non-monetary utility of cryptographic money. The authors reconstruct the classical Mises–Rothbard formulation and apply it to early Bitcoin history and to Monero’s genesis—especially the first non-coinbase transaction at block 110—to show that these systems exhibited technological, epistemic, and ideological utilities at “zero-day”, prior to any established exchange value. The article also critiques recent misapplications of the regression theorem that dilute its rigor by treating virtually any collectible or idiosyncratically valued object as sufficient to satisfy its requirements, thereby trivializing the theorem. In contrast, the authors propose a disciplined regression-based research programme capable of distinguishing genuine cryptographic innovations from speculative tokens, tracing how early non-monetary utilities bootstrap intersubjective demand, marketability, and eventual monetary roles. The essay concludes that a rigorously applied regression framework provides a powerful tool for evaluating digital assets, advancing cryptographic research, and understanding the emergence of new monetary and institutional forms.
    Keywords: Mises’ regression theorem; non-monetary utility; cryptocurrencies; Bitcoin; Monero; zero-day analysis; Austrian economics; commodity theory of money; cryptographic primitives; privacy technologies; methodological framework; monetary emergence; blockchain history; digital assets evaluation
    JEL: B41 E31 Z1 Z11 Z19
    Date: 2025–09
    URL: https://d.repec.org/n?u=RePEc:pra:mprapa:127140
  7. By: Divya Choudhary (IIM Lucknow - Indian Institute of Management Lucknow); Ajay Kumar (EM - EMLyon Business School); Yeming Gong (EM - EMLyon Business School); Thanos Papadopoulos
    Abstract: We perform a multidimensional and integrated investigation of risks associated with circular supply chains (CSC), drawing on Transition Management Theory (TMT). This research focuses on e-waste from the Indian electronics industry, a waste stream with significant recovery potential and one of the fastest-growing in emerging economies. Drawing on TMT, the study (i) institutionalises risk management activities in circular systems to operationalise the transition towards CE; (ii) quantifies CSC risks at operational, tactical, and strategic levels and measures the total risk exposure of CSCs; (iii) comprehensively considers the operational, socio-environmental, and financial implications of CSC risks; and (iv) considers uncertainty in operations research (OR) models by applying a model based on fuzzy set theory, the evidential reasoning algorithm, and expected utility theory to evaluate and profile CSC risks. The proposed model contributes to the application of decision analysis and risk analysis approaches in the sustainability domain and can efficiently model uncertain, subjective, and incomplete data. Our findings reveal that customers' reluctance to purchase reprocessed products represents the most critical challenge to the effectiveness of CSCs. Furthermore, contrary to conventional perspectives, organizations are strategically shifting toward adopting circular practices. However, they often lack the practical means and resources to implement these strategies effectively.
    Keywords: Circular supply chains, evidential reasoning algorithm (ERA), expected utility theory, risk quantification, transition management theory
    Date: 2025–02–03
    URL: https://d.repec.org/n?u=RePEc:hal:journl:hal-05489667
  8. By: Lassance, Nathan (Université catholique de Louvain, LIDAM/LFIN, Belgium); Vanderveken, Rodolphe (Université catholique de Louvain, LIDAM/LFIN, Belgium); Vrins, Frédéric (Université catholique de Louvain, LIDAM/LFIN, Belgium)
    Abstract: We introduce analytical linear and nonlinear shrinkage estimators of the sample covariance matrix that are optimal for mean-variance portfolio choice. Unlike the classical estimators based on statistical loss functions like the mean squared error, our shrinkage covariance matrices optimize the expected out-of-sample portfolio utility and account for estimation errors in mean returns. Our estimators shrink the sample eigenvalues more intensively than conventional methods, and they especially diminish the contribution of principal components with small squared Sharpe ratios. By jointly estimating the covariance matrix and the optimal portfolio in one step, our method delivers significant empirical performance gains relative to the usual two-step shrinkage approach. Our portfolios also help reduce turnover and outperform recent regularized mean-variance portfolio strategies.
    Keywords: Estimation risk ; linear shrinkage ; mean-variance portfolio ; nonlinear shrinkage ; out-of-sample utility ; parameter uncertainty
    JEL: G11
    Date: 2025–07–11
    URL: https://d.repec.org/n?u=RePEc:ajf:louvlf:2025002
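For readers unfamiliar with covariance shrinkage, the basic mechanics can be sketched with a generic linear-shrinkage estimator toward a scaled identity (in the spirit of Ledoit–Wolf). This is not the utility-optimal estimator derived in the paper; the shrinkage intensity here is fixed by hand rather than chosen optimally:

```python
import numpy as np

def linear_shrinkage(returns, delta):
    """Shrink the sample covariance toward a scaled identity target.

    delta in [0, 1] is the shrinkage intensity (hand-picked here);
    the target preserves the average variance across assets.
    """
    S = np.cov(returns, rowvar=False)      # sample covariance
    mu = np.trace(S) / S.shape[0]          # mean of the eigenvalues
    target = mu * np.eye(S.shape[0])
    return (1.0 - delta) * S + delta * target

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 10))              # 60 observations, 10 assets
Sigma = linear_shrinkage(X, 0.5)

# Shrinkage pulls the sample eigenvalues toward their mean, which
# lowers the condition number of the estimate.
S = np.cov(X, rowvar=False)
print(round(np.linalg.cond(S), 1), round(np.linalg.cond(Sigma), 1))
```

The paper's contribution is to choose the eigenvalue adjustment to maximize expected out-of-sample portfolio utility, shrinking more aggressively than such statistical-loss rules, especially for components with small squared Sharpe ratios.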
  9. By: Christian Gollier (TSE-R - Toulouse School of Economics - UT Capitole - Université Toulouse Capitole - Comue de Toulouse - Communauté d'universités et établissements de Toulouse - EHESS - École des hautes études en sciences sociales - CNRS - Centre National de la Recherche Scientifique - INRAE - Institut National de Recherche pour l’Agriculture, l’Alimentation et l’Environnement)
    Abstract: Because of risk aversion, any sensible investment valuation system should place a lower value on projects that contribute more to the aggregate risk. In theory, this is done by adjusting discount rates to consumption betas. But in reality, most public institutions use a discount rate that is rather insensitive to the risk profile of their investment projects. The economic consequences of the implied misallocation of capital are severe. I calibrate a Lucas model in which the investment opportunity set contains a constellation of projects with different expected returns and risk profiles. The model matches the traditional financial and macro moments, together with the observed heterogeneity of assets' risk profiles. The welfare loss of using a single discount rate is equivalent to a permanent reduction in consumption that lies somewhere between 15% and 45% depending upon which single discount rate is used.
    Keywords: capital budgeting, rare disasters, WACC fallacy, Arrow-Lind theorem, carbon pricing, asset pricing, investment theory, discounting
    Date: 2026–02
    URL: https://d.repec.org/n?u=RePEc:hal:journl:hal-05483623
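The capital misallocation the abstract describes can be illustrated with a minimal CAPM-style example. All numbers below (risk-free rate, premium, betas, cash flows) are invented for illustration, not the paper's Lucas-model calibration:

```python
def project_discount_rate(rf, beta, market_premium):
    """Beta-adjusted discount rate: r = rf + beta * premium (CAPM-style)."""
    return rf + beta * market_premium

def npv(cashflow, years, rate, cost):
    """NPV of a single cash flow received after `years`, net of upfront cost."""
    return cashflow / (1 + rate) ** years - cost

rf, premium = 0.02, 0.04
single_rate = project_discount_rate(rf, 1.0, premium)     # one rate for all
high_beta_rate = project_discount_rate(rf, 2.0, premium)  # beta-adjusted

# A high-beta project that looks profitable under the single discount
# rate (positive NPV) is rejected once its rate reflects its beta
# (negative NPV) -- the misallocation the paper quantifies.
print(round(npv(2.0, 10, single_rate, 1.0), 3),
      round(npv(2.0, 10, high_beta_rate, 1.0), 3))
```

A uniform rate thus systematically over-funds high-beta projects and under-funds low-beta ones, which is the source of the welfare loss the model measures.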

This nep-upt issue is ©2026 by Alexander Harin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.