nep-hpe New Economics Papers
on History and Philosophy of Economics
Issue of 2009‒03‒14
twenty papers chosen by
Erik Thomson
University of Manitoba

  1. The Financial Crisis and the Systemic Failure of Academic Economics By David Colander; Hans Föllmer; Armin Haas; Michael Goldberg; Katarina Juselius; Alan Kirman; Thomas Lux; Brigitte Sloth
  2. Better to be rough and relevant than to be precise and irrelevant. Reddaway's Legacy to Economics By Ajit Singh
  3. Does Macroeconomics Need Microeconomic Foundations? By Da Silva, Sergio
  4. Psychology and Economics rather than Psychology versus Economics: Cultural differences but no barriers! By Hermann Brandstätter; Werner Güth; Hartmut Kliemt
  5. Neuroeconomics: A Critique of ‘Neuroeconomics: A Critical Reconsideration’ By Stanton, Angela A.
  6. La filosofía ética de la Teoría del Equilibrio General By Loaiza Quintero, Osmar Leandro
  7. Eventology versus contemporary theories of uncertainty By Vorobyev, Oleg
  8. Climate Mitigation or Technological Revolution? A Critical Choice of Futures By Graeme Donald Snooks
  9. Economists in the PITS? By Bruno S. Frey
  10. The Tinbergen & Hueting Approach in the Economics of Ecological Survival By Colignatus, Thomas
  11. Constructing a General Theory of Life: The Dynamics of Human and Non-human Systems By Graeme Donald Snooks
  12. The New Global Crisis Makers: Economic Intervention and the Loss of Strategic Leadership By Graeme Donald Snooks
  13. Perceptions of Efficacy, Control, and Risk: A Theory of Mixed Control By Erik Monsen; Diemo Urbig
  14. Should We Discount the Far-Distant Future at Its Lowest Possible Rate? By Gollier, Christian
  15. Assessing trust through social capital? A possible experimental answer. By Migheli, Matteo
  16. "Time to Bail Out-- Alternatives to the Bush-Paulson Plan" By Dimitri B. Papadimitriou; L. Randall Wray
  17. Does Participating in a Collective Decision Affect the Levels of Contributions Provided? An Experimental Investigation By Francesca Bortolami; Luigi Mittone
  18. Group Selection: The quest for social preferences By Salomonsson, Marcus
  19. Corruption: Measuring the Unmeasurable By Zaman, Asad; Rahim, Faizur
  20. Taking Stock of Existing Structural Policy and Outcome Indicators By Davide Furceri; Annabelle Mourougane

  1. By: David Colander; Hans Föllmer; Armin Haas; Michael Goldberg; Katarina Juselius; Alan Kirman; Thomas Lux; Brigitte Sloth
    Abstract: The economics profession appears to have been unaware of the long build-up to the current worldwide financial crisis and to have significantly underestimated its dimensions once it started to unfold. In our view, this lack of understanding is due to a misallocation of research efforts in economics. We trace the deeper roots of this failure to the profession’s insistence on constructing models that, by design, disregard the key elements driving outcomes in real-world markets. The economics profession has failed in communicating the limitations, weaknesses, and even dangers of its preferred models to the public. This state of affairs makes clear the need for a major reorientation of focus in the research economists undertake, as well as for the establishment of an ethical code that would ask economists to understand and communicate the limitations and potential misuses of their models.
    Keywords: financial crisis, academic moral hazard, ethic responsibility of researchers
    JEL: A11 B40 G1
    Date: 2009–02
  2. By: Ajit Singh
    Abstract: W.B. Reddaway was a highly influential figure in Cambridge economics during the second half of the 20th century. His method and style of doing economics - called Reddaway-type economics - were quite distinct. The present paper explains Reddaway's methodology by examining his most important research contributions. The title of this essay conveys his distance from mainstream economists. His essential substantive difference from the latter concerned inferential econometrics. He subscribed to Keynes's critique of Tinbergen's methodology. In summary, Reddaway regarded economics as an empirical, evidence-based subject which, through economic policy, should help improve the world. In his view mathematics could sometimes help, but, more often than not, it obfuscated economic reality. Currently the academic economics profession is dominated by a priori theorising and deductive modelling. Greater attention to Reddaway's legacy to economics, in both research methods and teaching, would do much to rebalance the subject.
    Keywords: Method and style of doing economics, Reddaway-type economics, inferential econometrics
    JEL: A1 A2 C1 B5
    Date: 2008–12
  3. By: Da Silva, Sergio
    Abstract: The author argues that it is microeconomics that needs foundations, not macroeconomics. Preferences need to be built on biology, and, in particular, on neuroscience. In contrast, macroeconomics could benefit from rationalizations of aggregate economic phenomena by non-equilibrium statistical physics.
    Keywords: Microfoundations, neuroeconomics, econophysics
    JEL: B22 B41 C82 D87
    Date: 2009
  4. By: Hermann Brandstätter (University of Linz); Werner Güth (Max Planck Institute of Economics, Jena, Germany); Hartmut Kliemt (Frankfurt School of Finance & Management, Frankfurt am Main, Germany)
    Abstract: During the last three decades the ascent of behavioral economics clearly helped to bring down artificial disciplinary boundaries between psychology and economics. Noting that behavioral economics seems still under the spell of the rational choice tradition - and, indirectly, of behaviorism - we scrutinize in an exemplary manner how the development of some kind of "cognitive economics" might mirror the rise of "cognitive psychology" without endangering the advantages of the division of labor and of disciplinary specialization.
    Keywords: bounded rationality, game theory, satisficing, interdisciplinary research, experimental economics, economic psychology
    JEL: B31 B41 C72 C73 C78 D63
    Date: 2009–03–04
  5. By: Stanton, Angela A.
    Abstract: Some economists believe that neuroeconomics threatens the theory of economics. Glenn Harrison’s paper “Neuroeconomics: A Critical Reconsideration” (2008) provides some support for this view, though some of the points he makes are somewhat disguised. The field of neuroeconomics is barely into its teenage years; is it really trying to criticize and redesign a field of economics developed over hundreds of years? That is not what neuroeconomics is trying to do, in spite of the efforts of some economists to place it into that shoebox (see the argument in great detail in Caplin and Schotter, 2008). Neuroeconomics is a Mendelian economics of sorts: it is a science that generates data by fixing the environment to some degree, varying a single independent variable to observe its effects, and following each individual’s choices from the initiation of the decision-making process to its outcome. Mainstream (standard) economics, on the other hand, looks at the average of the outcomes of many individuals and proposes, retroactively, how people chose those outcomes. The two fields, neuroeconomics and standard economics, are evaluating two sides of the same coin: one with and the other without ceteris paribus; they are not in conflict with one another.
    Keywords: Neuroeconomics; Standard Economics; Ceteris Paribus; Hormones
    JEL: D01 C91 D87
    Date: 2008
  6. By: Loaiza Quintero, Osmar Leandro
    Abstract: Walrasian General Equilibrium Theory puts no restriction on the income distribution that results from the functioning of the market, since the theory purportedly excludes the analysis of distribution-related problems, as they may imply the introduction of normative considerations. However, the absence of any equity criterion, namely, of a normative judgment regarding the desirable distribution of wealth or income, is not proof of the amorality of the theory; on the contrary, this absence is due to the kind of value judgments underlying it. The aim of this paper is therefore to expose the value judgments that lie at the very base of Walrasian General Equilibrium Theory, which explain the absence of equity-related criteria, and to examine some alternatives that aim to remedy this shortcoming.
    Keywords: General Equilibrium Theory; utilitarianism; Arrow's Impossibility Theorem; Moral Philosophy
    JEL: D63 A13 D30
    Date: 2008–10–15
  7. By: Vorobyev, Oleg
    Abstract: The development of probability theory, together with the Bayesian approach, over the last three centuries was driven by two factors: the variability of physical phenomena and our partial ignorance about them. As it is now standard to believe [Dubois, 2007], the nature of these key factors is so varied that describing them requires special uncertainty theories, which differ from probability theory and the Bayesian credo, and which provide a better account of the various facets of uncertainty by combining probabilistic and set-valued representations of information to capture the distinction between variability and ignorance. Eventology [Vorobyev, 2007], a new direction in probability theory and philosophy, offers an original event-based approach to the description of variability and ignorance, placing the agent, together with his or her beliefs, directly within the framework of scientific research in the form of an eventological distribution of his or her own events. This allows eventology, by combining probabilistic and set-event representations of information with the philosophical concept of event as co-being [Bakhtin, 1920], to provide a unified account of various aspects of uncertainty that captures the distinction between variability and ignorance, and opens the possibility of defining an imprecise probability as the probability of an imprecise event within the mathematical framework of Kolmogorov's probability theory [Kolmogorov, 1933].
    Keywords: uncertainty; probability; event; co-being; eventology; imprecise event.
    JEL: C02 C11
    Date: 2009–02–15
  8. By: Graeme Donald Snooks
    Abstract: Mankind currently is not only facing a major environmental challenge, it is embarking on a hugely risky enterprise — that of climate mitigation. This unprecedented global adventure is an attempt to change the nature and shape of human society on the grounds that our traditional market system has failed us. The enterprise is hugely risky because it is based not on what has happened but on what “climate-mitigation engineers” tell us might happen. The argument in this essay is simple but powerful, and can be outlined in the following five propositions:
    • The science of climate change is challenging but compelling, based as it is on an impressive and growing body of expert empirical research. What it shows is that recent climate change is human induced. Hence, further climate change and its mitigation are problems primarily for the social, not the natural, sciences.
    • The “science” of climate mitigation is nonexistent, because orthodox social science has failed to model the dynamics of human society. And it is the dynamics of human society that will largely determine future climate change.
    • Orthodox economics, which has attempted to fill the void, has failed completely. Economic theory is suitable only for the analysis of small, short-run issues that can be accommodated within a static framework — such as the price of a cup of tea; whereas climate mitigation is one of the biggest and most important issues humanity will ever face, it is long-run in nature, and it can only be adequately handled within a dynamic framework. As orthodox economics has been unable to develop a realist general dynamic theory, its practitioners have been forced to employ simplistic historicist models when analyzing future climate change.
    • What we need is a new science of human dynamics. The basis for this new science is provided by the author’s dynamic-strategy theory. It is a realist theory in the sense that it has been derived from long-term, systematic observation of the fluctuating fortunes of both human society over the past 2 million years and life over the past 4,000 million years.
    • Economists have massively underestimated the costs of their proposed climate-mitigation program aimed at stabilizing greenhouse-gas concentrations, because they have employed an inadequate static cost–benefit methodology. This essay takes a very different approach. By estimating the dynamic costs — essentially the costs of suppressing the imminent technological revolution, which can only be identified in a realist dynamic framework — I have found that total costs will be almost 100 times greater than current estimates by the year 2100. This puts a comprehensive mitigation program totally out of the question.
    What, then, is to be done? This essay provides the answer.
    Keywords: climate mitigation, technological revolution, human dynamics, economic intervention, dynamic costs and benefits
    JEL: Q54 Q40 Q20 O30 O40 O47 P1
    Date: 2009–02
  9. By: Bruno S. Frey
    Abstract: Academic economists today are caught in a “Publication Impossibility Theorem System” or PITS. To further their careers, they are required to publish in A-journals, but this is impossible for the vast majority because there are few slots open in such journals. Such academic competition is held to provide the right incentives for hard work, but there may be serious negative consequences: the wrong output may be produced in an inefficient way, the wrong people may be selected, and the losers may react in a harmful way. The paper suggests several ways for improvement.
    Keywords: Academia, economists, publication, journals, incentives, economic methodology
    JEL: A1 D02 I23
    Date: 2009–03
  10. By: Colignatus, Thomas
    Abstract: Tinbergen & Hueting (1991) provide an approach to the economics of ecological survival that is still unsurpassed. Various “green GDPs” have been proposed, such as the ISEW, the Ecological Footprint, Genuine Savings and the Genuine Progress Indicator, and lately there has been increased interest in happiness as a re-interpretation of economic utility and social welfare. With respect both to ecological survival and to the requirements of economic theory, however, these alternatives fail. The Tinbergen & Hueting (1991) approach is (1) rooted in the fundamentals of economic analysis, (2) rooted in the fundamentals of ecology, (3) applicable within the statistical framework of national accounting and hence fully practical, and (4) demanding in economic and environmental expertise, though the resulting indicator of (environmentally) Sustainable National Income (eSNI) is easy for policy makers and the general public to understand. Currently, statistical offices and economic advisory agencies around the world are implementing NAMEA systems for national accounting and derived indicators, both for statistical observation and for projections of the future. Policy discussions on ecological survival will be much served when researchers study in detail what these great economists have wrought. When an economist has not read Tinbergen & Hueting (1991) and Hueting and De Boer (2001), any advice on economic growth and ecological survival risks being misguided – as is indeed shown in the various cases.
    Keywords: Social welfare; national income; sustainable national income; economic growth; sustainable economic growth; sustainability; environment
    JEL: Q01 E01 A11
    Date: 2009–03–09
  11. By: Graeme Donald Snooks
    Abstract: The ultimate objective of theorists studying living systems is to construct a general theory of life that can explain and predict the dynamics of both human and nonhuman systems. Yet little progress has been made in this endeavour. Why? Because of the inappropriate methods adopted by complexity theorists. By assuming that the supply-side physics model – in which local interactions are said to give rise to the emergence of order and complexity – could be transferred either entirely (social physics) or partially (agent-based models, or ABMs) from the physical to the life sciences, we have distorted reality and, thereby, delayed the construction of a general dynamic theory of living systems. Is there a solution? Yes, but only if we abandon the deductive and analogical methods of complexity theorists and adopt the inductive method. With this approach it is possible to construct a realist and demand-side general dynamic theory, as in the case of the dynamic-strategy theory presented in this paper.
    Keywords: complex living systems, unified theory, general theory of life, dynamics, demand-side, methodology
    JEL: A12 B41 C73 O40
    Date: 2009–02
  12. By: Graeme Donald Snooks
    Abstract: The “crisis exaggerators” are telling us that current economic conditions amount to an “unprecedented” global economic recession. This is historically incorrect. What is unprecedented is the degree to which economic commentators and political leaders are talking up the economic downturn. What is their agenda? Could it be an attempt to prepare the way for an “unprecedented” degree of government intervention in the economy? The “new interventionists” – some of whom, like Kevin Rudd, are now calling themselves “social democrats” – have attacked neoliberalism – the prevailing philosophy of Western governments over the past three decades – for failing to provide the direction and regulation needed to prevent the emergence of the global financial crisis. But this misses the real point. Neoliberal governments have in fact been dangerously interventionist. Owing to the inflation-targeting policies they have championed, the dynamic mechanism of modern society has been disrupted, economic growth has slowed dramatically, and unemployment has risen – just as I warned in The Global Crisis Makers in 2000. The new global crisis makers are these new interventionists, who, ironically, not only accept neoliberal policies of inflation targeting but also intend to launch massive Keynesian and climate-mitigation programmes of intervention that will throw our strategic life-system into a downward spiral from which we will recover only with great difficulty and cost. Modern governments have lost the age-old art of strategic leadership, which once facilitated the effective operation of humanity’s dynamic life-system.
    Keywords: global crisis, neoliberalism, social democracy, strategic leadership, economic intervention
    JEL: O11 O47 O56 E31 E32 E42 E50 E60
    Date: 2009–02
  13. By: Erik Monsen (Max Planck Institute of Economics, Jena, Germany); Diemo Urbig (Max Planck Institute of Economics, Jena, Germany)
    Abstract: Based on the aggregated insights of the existing theories related to multiple sources of efficacy and locus of control, we introduce the theory of mixed control, a model of compound-risk perception. This theory considers outcome expectancies as being composed of expectancies regarding three distinct sources of risk (self, others, and chance). This reflects that entrepreneurship is a complex and dynamic activity, involving multiple sources of risk. Beliefs about the efficacy of these elements are weighted by the degree to which these elements are perceived to control the outcome. The interaction of efficacy and control beliefs is therefore at the core of our theory. Further, we discuss that risks are not only subjectively perceived but can be endogenous and depend on future decisions and actions of the entrepreneur.
    Keywords: locus of control, self-efficacy, risk perception
    JEL: D8 D83 D84
    Date: 2009–03–04
  14. By: Gollier, Christian
    Abstract: In this paper, we elaborate on an idea initially developed by Weitzman (1998) that justifies taking the lowest possible discount rate for far-distant future cash flows. His argument relies on the arbitrary assumption that when the future rate of return of capital (RRC) is uncertain, one should invest in any project with a positive expected net present value. We examine an economy with a risk-averse representative agent facing an uncertain evolution of the RRC. In this context, we characterize the socially efficient stochastic consumption path, which allows us in turn to use the Ramsey rule to characterize the term structure of socially efficient discount rates. We show that Weitzman’s claim is qualitatively correct if shocks on the RRC are persistent. On the contrary, in the absence of any serial correlation in the RRC, the term structure of discount rates should be flat.
    Keywords: Discount rate, term structure, certainty equivalent rate, Ramsey rule, sustainable development
    JEL: E43 G12 Q51
    Date: 2009
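The Weitzman (1998) mechanism discussed in this abstract has a simple numerical core: when the future rate of return is uncertain but persistent, the certainty-equivalent discount factor is the expectation of exp(-rt), and the implied discount rate declines with the horizon toward the lowest possible rate. A minimal sketch, with two made-up rate scenarios chosen purely for illustration (they are not values from Gollier's paper):

```python
import math

# Illustrative assumption: the permanent rate of return is either 1% or 7%,
# each with probability 0.5. These numbers are hypothetical.
scenarios = [(0.01, 0.5), (0.07, 0.5)]  # (rate, probability)

def effective_rate(t):
    """Certainty-equivalent discount rate at horizon t (years).

    The certainty-equivalent discount factor is E[exp(-r*t)]; the
    effective rate is the constant rate that yields the same factor.
    """
    factor = sum(p * math.exp(-r * t) for r, p in scenarios)
    return -math.log(factor) / t

for t in (1, 50, 200, 500):
    print(f"t = {t:4d} years: effective rate = {effective_rate(t):.4f}")
```

At short horizons the effective rate is close to the mean (about 4%); as the horizon lengthens, the low-rate scenario dominates the expectation and the rate falls toward 1%. This is the case of fully persistent shocks; as the abstract notes, without serial correlation in the rate of return the term structure would instead be flat.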
  15. By: Migheli, Matteo
    Abstract: Trust is an important variable in economics, as many transactions are based on it; unfortunately, it is difficult to measure. The recent economic literature on social capital shows a positive association between this concept and trust. As social capital is easier to measure than trust, this paper analyzes the possibility of assessing trust by measuring social capital using experimental economics. A basic trust game is played with undergraduate students in three Western European countries; a questionnaire measures their level of social capital, as time spent within social networks. This measure is stronger and more precise than those generally used. In particular, this paper is the first to measure social capital as the intensity of membership in a voluntary organization, while the extant literature generally considers only membership per se. Second, the use of an experiment instead of a questionnaire allows for constructing a measure of trust that is in principle continuous. Third, running an experiment allows the behaviour of the participants to be observed better than by means of a survey. The results support the view that trust can be assessed through social capital, although the presence of a strong geographical effect has to be accounted for.
    Keywords: generalized trust, social capital, gender effect
    JEL: C72 C93
    Date: 2009–02
  16. By: Dimitri B. Papadimitriou; L. Randall Wray
    Abstract: While serving as chairman of the Federal Reserve Board, Alan Greenspan advocated unsupervised securitization, subprime lending, option ARMs, credit-default swaps, and all manner of financial alchemy in the belief that markets "work" to reduce and spread risk, and to allocate it to those best able to assess and bear it--in his view, markets would stabilize in the absence of nasty government intervention. But as Greenspan now admits, he could never have imagined the outcome: a financial and economic crisis of biblical proportions. The problem is, market forces are not stabilizing. Left to their own devices, Wall Street wizards gleefully ran right off the cliff, and took the rest of us with them for good measure. The natural instability of market processes was recognized long ago by John Maynard Keynes, and convincingly updated by Hyman P. Minsky throughout his career. Minsky's theory explained the transformation of the economy over the postwar period from robust to fragile. He pointed his finger at managed money--huge pools of pension funds, hedge funds, sovereign wealth funds, university endowments, money market funds--that are outside traditional banking and therefore largely underregulated and undersupervised. With a large appetite for risk, managed money sought high returns promised by Wall Street's financial engineers, who innovated highly complex instruments that few people understood. In this new Policy Note, President Dimitri B. Papadimitriou and Research Scholar L. Randall Wray take a look back at Wall Street's path to Armageddon, and propose some alternatives to the Bush-Paulson plan to "bail out" both the Street and the American homeowner. Under the existing plan, Treasury would become an owner of troubled financial institutions in exchange for a capital injection--but without exercising any ownership rights, such as replacing the management that created the mess. 
The bailout would be used as an opportunity to consolidate control of the nation's financial system in the hands of a few large (Wall Street) banks, with government funds subsidizing purchases of troubled banks by "healthy" ones. But it is highly unlikely that relieving banks of some of their bad assets, or injecting some equity into them, will increase their willingness to lend. Resolving the liquidity crisis is the best strategy, the authors say, and keeping small-to-medium-size banks open is the best way to ensure access to credit once the economy recovers. A temporary suspension of the collection of payroll taxes would put more income into the hands of households while lowering the employment costs for firms, fueling spending and employment. The government should assume a more active role in helping homeowners saddled with mortgage debt they cannot afford, providing low-cost 30-year loans directly to all comers; in the meantime, a moratorium on foreclosures is necessary. And federal grants to support local spending on needed projects would go a long way toward rectifying our $1.6 trillion public infrastructure deficit. Can the Treasury afford all these measures? The answer, the authors say, is yes--and it is a bargain if one considers the cost of not doing it. It is obvious that there exist unused resources today, as unemployment rises and factories are idled due to lack of demand. Markets are also voting with their dollars for more Treasury debt. This does not mean the Treasury should spend without restraint--whatever rescue plan is adopted should be well planned and targeted, and of the proper size. The point is that setting arbitrary budget constraints is neither necessary nor desired--especially in the worst financial and economic crisis since the Great Depression.
    Date: 2008–11
  17. By: Francesca Bortolami; Luigi Mittone
    Abstract: From a purely theoretical perspective, there is no reason to expect that different levels of contributions in public goods games are associated with the same sanctioning/rewarding rule. The efficiency of a norm should be independent of its enactment procedure. On the contrary, multidisciplinary and empirical considerations suggest that individuals may behave differently according to the level of their direct involvement. The question of whether participation in norm enactment produces a different contributory gap than when the same norm is simply received has not been addressed in the public goods literature so far. Our three experiments show a behavioural regularity: participating in a normative enactment generates different contributory effects than when the sanctioning norm is merely received.
    Keywords: participation, public good games, free riding
    Date: 2009
  18. By: Salomonsson, Marcus (Dept. of Economics, Stockholm School of Economics)
    Abstract: This paper surveys the literature on group selection. I describe the early contributions and the group selection controversy. I also describe the main approaches to group selection in the recent literature; fixation, assortative group formation, and reproductive externalities.
    Keywords: Altruism; spite; externalities; conformity; fixation; signalling
    JEL: C70 D62 D64
    Date: 2009–03–06
  19. By: Zaman, Asad; Rahim, Faizur
    Abstract: While the strategy of measuring and quantifying has been extremely successful and valuable in the progress of science, it does not follow that it is universally useful. We argue that attempts to measure corruption can be counterproductive in several different ways. Qualitative and action-oriented approaches may prove more valuable. A political-economy explanation of why extremely distorted and biased measures of corruption continue to be used is also offered.
    Keywords: Corruption; measurement; quantitative imperative; corruption perception index
    JEL: B40 A14
    Date: 2008–12
  20. By: Davide Furceri; Annabelle Mourougane
    Abstract: This paper reviews and assesses, in terms of availability, reliability and transparency, existing policy and outcome indicators that have been found to be linked both directly and indirectly to economic growth and living standards. Indicators aiming to capture the political and social situation of countries, as well as governance-related issues, are examined (e.g. political system, political stability, corruption, crime and violence). Topics also include product and labour markets, infrastructure, trade, financial indicators and composite indices of reform.
    Keywords: governance, Policy, Politique, gouvernance, structural indicators, indicateurs structurels, outcome, résultats
    JEL: O4 P50
    Date: 2009–02–27

This nep-hpe issue is ©2009 by Erik Thomson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.