nep-cmp New Economics Papers
on Computational Economics
Issue of 2012‒10‒27
nine papers chosen by
Stan Miles
Thompson Rivers University

  1. Imputing Individual Effects in Dynamic Microsimulation Models. An application of the Rank Method By Matteo Richiardi; Ambra Poggi
  2. Exact Draws from the Stationary Distribution of Entry-Exit Models By Takashi Kamihigashi; John Stachurski
  3. Public Pension Benefits Claiming Behavior: New Evidence from the Japanese Study on Aging and Retirement By Shimizutani, Satoshi; Oshio, Takashi
  4. An Impulse Control Approach to Dike Height Optimization (Revised version of CentER DP 2011-097) By Chahim, M.; Brekelmans, R.C.M.; Hertog, D. den; Kort, P.M.
  5. Redistribution spurs growth by using a portfolio effect on human capital By Jan Lorenz; Fabian Paetzel; Frank Schweitzer
  6. Optimal policy for macro-financial stability By Gianluca Benigno; Huigang Chen; Christopher Otrok; Alessandro Rebucci; Eric R. Young
  8. Probability and Asset Updating using Bayesian Networks for Combinatorial Prediction Markets By Wei Sun; Robin Hanson; Kathryn Blackmond Laskey; Charles Twardy
  9. A Framework for Extracting the Probability of Default from Stock Option Prices By Azusa Takeyama; Nick Constantinou; Dmitri Vinogradov

  1. By: Matteo Richiardi; Ambra Poggi
    Abstract: Dynamic microsimulation modeling involves two stages: estimation and forecasting. Unobserved heterogeneity is often considered in estimation, but not in forecasting, beyond trivial cases. Non-trivial cases involve individuals who enter the simulation with a history of previous outcomes. We show that the simple solutions of attributing to these individuals a null effect or a random draw from the estimated unconditional distributions lead to biased forecasts, which are often worse than those obtained by neglecting unobserved heterogeneity altogether. We then present a first implementation of the Rank method, a new algorithm for attributing individual effects to the simulation sample that is far simpler than the algorithms already known in the literature. Out-of-sample validation of our model shows that correctly imputing unobserved heterogeneity significantly improves the quality of the forecasts.
    Keywords: Dynamic microsimulation, Unobserved heterogeneity, Validation, Rank method, Assignment algorithms, Female labor force participation, Italy
    JEL: C53 C18 C23 C25 J11 J12 J21
    Date: 2012
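The Rank method itself is only named in the abstract above; purely as an illustration of the general idea of rank-based assignment (all distributions below are assumed, not taken from the paper), one can match a sorted pool of estimated individual effects to simulated individuals by the rank of a score summarizing their outcome history:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000

# Hypothetical pool of individual effects drawn from an assumed
# estimated distribution (here standard normal), sorted ascending.
effects_pool = np.sort(rng.normal(0.0, 1.0, n))

# Hypothetical score summarizing each individual's history of outcomes
# (e.g. average past labor-force participation).
history_score = rng.uniform(0.0, 1.0, n)

# Rank-based assignment: the individual with the k-th smallest history
# score receives the k-th smallest effect, preserving the rank order.
order = np.argsort(history_score)
assigned_effects = np.empty(n)
assigned_effects[order] = effects_pool
```

By construction the assignment is rank-preserving: the Spearman correlation between history scores and assigned effects is exactly one.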
  2. By: Takashi Kamihigashi (Research Institute for Economics & Business Administration (RIEB), Kobe University, Japan); John Stachurski (Research School of Economics, Australian National University, Australia)
    Abstract: In equilibrium models of firm dynamics, the stationary equilibrium distribution of firms summarizes the predictions of the model for a given set of primitives. Focusing on Hopenhayn's seminal model of firm dynamics with entry and exit (Econometrica, 60:5, 1992, pp. 1127–1150), we provide an algorithm that generates exact draws from the stationary distribution in finite time for any specified exit threshold. The technique is able to rapidly generate large numbers of exact and independent draws.
    Keywords: Simulation, Stationary equilibrium, Firm dynamics
    JEL: C61 C63
    Date: 2012–10
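The exact-draw algorithm is not described in the abstract, so the following is only a naive long-run simulation of a stylized entry-exit model (all primitives assumed): incumbents' log productivity follows an AR(1), and any firm falling below the exit threshold is replaced by a fresh draw from an assumed entry distribution. The empirical distribution of the sample path approximates the stationary distribution, whereas the authors' method samples from it exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed primitives (illustrative only).
rho, sigma = 0.9, 0.2            # AR(1) persistence and shock volatility
exit_threshold = -0.5            # exit when log productivity < threshold
entry_mean, entry_sd = 0.0, 0.3  # assumed entry (new-firm) distribution

def step(x):
    """One period: an exiting firm is replaced by a new entrant;
    incumbents update via the AR(1)."""
    if x < exit_threshold:
        return rng.normal(entry_mean, entry_sd)  # exit -> entrant draw
    return rho * x + rng.normal(0.0, sigma)      # incumbent dynamics

# Burn in, then record a long sample path whose empirical distribution
# approximates the stationary distribution of firm-level log productivity.
x, burn, T = 0.0, 1_000, 50_000
for _ in range(burn):
    x = step(x)
samples = np.empty(T)
for t in range(T):
    x = step(x)
    samples[t] = x
```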
  3. By: Shimizutani, Satoshi; Oshio, Takashi
    Abstract: This paper explores the public pension claiming behavior of the Japanese. First, we perform financial simulations and estimate the expected utility, depicting the typical patterns of pension benefits over a life cycle. We show that the optimal retirement age depends on the beneficiaries’ mortality risk, discount rate, initial wealth, and risk attitude. Second, we use individual-level data from the Japanese Study on Aging and Retirement to examine empirically the determinants of the take-up timing. We find supportive evidence that most of the factors examined in the simulation are indeed significantly associated with early claiming of pension benefits for wage earners.
    Keywords: Claiming behavior, Pension benefit, Survival probability, Risk attitude, Japanese Study on Aging and Retirement
    JEL: H55 J26
    Date: 2012–10
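As a toy counterpart to the paper's financial simulations (all numbers assumed, not the paper's), the following compares the expected discounted value of claiming a reduced benefit at 60 versus the full benefit at 65; a higher discount rate or mortality risk tilts the comparison toward early claiming:

```python
def claim_value(claim_age, benefit, beta, survival, start=60, end=100):
    """Expected discounted benefits from age `start`, claiming at
    `claim_age`: sum of benefit * (discount * survival prob)^(age - start)."""
    x = beta * survival
    return sum(benefit * x ** (a - start) for a in range(claim_age, end + 1))

# Assumed benefit schedule: 0.7 units if claimed at 60, 1.0 if at 65.

# Impatient / high-mortality beneficiary: early claiming dominates.
early_hi = claim_value(60, 0.7, beta=0.90, survival=0.95)
late_hi = claim_value(65, 1.0, beta=0.90, survival=0.95)

# Patient / low-mortality beneficiary: delaying dominates.
early_lo = claim_value(60, 0.7, beta=0.99, survival=0.99)
late_lo = claim_value(65, 1.0, beta=0.99, survival=0.99)
```

With these assumed parameters the orderings flip exactly as the abstract's simulations suggest: mortality risk, discounting, and the benefit schedule jointly determine the optimal claiming age.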
  4. By: Chahim, M.; Brekelmans, R.C.M.; Hertog, D. den; Kort, P.M. (Tilburg University, Center for Economic Research)
    Abstract: This paper determines the optimal timing of dike heightenings, as well as the corresponding optimal heightening sizes, to protect against floods. To derive the optimal policy we design an algorithm based on the Impulse Control Maximum Principle. The paper thus presents one of the first real-life applications of the Impulse Control Maximum Principle developed by Blaquière. We show that the proposed Impulse Control approach outperforms Dynamic Programming with respect to computational time, because the Impulse Control approach does not require discretization in time.
    Keywords: Impulse Control Maximum Principle, Optimal control, Flood prevention, Dikes, Cost-benefit analysis
    JEL: C61 D61 H54 Q54
    Date: 2012
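The impulse-control algorithm is beyond the scope of a short sketch, but the underlying cost-benefit trade-off can be illustrated with a one-shot simplification in the spirit of van Dantzig's classic dike problem (all parameters assumed): pick a single heightening H to minimize linear investment cost plus discounted expected flood damage, which decays exponentially in H. A grid search recovers the closed-form first-order-condition optimum H* = ln(alpha*V*P0/(r*c))/alpha.

```python
import math

# Assumed parameters (illustrative only).
c = 1.0       # investment cost per meter of heightening
P0 = 0.01     # annual flood probability at the current height
alpha = 1.0   # exponential decay of flood probability per meter
V = 1000.0    # damage if a flood occurs
r = 0.02      # discount rate: discounted expected damage is V*P0*exp(-alpha*H)/r

def total_cost(H):
    return c * H + (V * P0 / r) * math.exp(-alpha * H)

# Closed-form optimum from the first-order condition.
H_star = math.log(alpha * V * P0 / (r * c)) / alpha

# Brute-force grid search lands on (approximately) the same heightening.
grid = [i * 0.001 for i in range(12_001)]
H_grid = min(grid, key=total_cost)
```

The actual paper optimizes a sequence of heightenings over time; this one-shot version only shows why exponential flood-risk decay yields an interior optimum.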
  5. By: Jan Lorenz; Fabian Paetzel; Frank Schweitzer
    Abstract: We demonstrate by mathematical analysis and systematic computer simulations that redistribution can lead to sustainable growth in a society. The human capital dynamics of each agent are described by a stochastic multiplicative process which, in the long run, leads to the destruction of individual human capital and the extinction of the individualistic society. When agents are linked by fully redistributive taxation, the situation can turn into individual growth in the long run. We consider a government that collects a proportion of income and reduces it by a fraction to cover administration costs (efficiency losses). The remaining public good is redistributed equally to all agents. We derive conditions under which the destruction of human capital can be turned into sustainable growth, despite the losses from the random growth process and despite the administrative costs. Sustainable growth is induced by redistribution. This effect can be explained by a simple portfolio effect that rebalances the individual stochastic processes. The findings are verified for three different tax schemes: a proportional tax, one taking proportionally more from the rich, and one taking proportionally more from the poor. We discuss which of these tax schemes is optimal with respect to maximizing growth under a fixed rate of administrative costs, or with respect to maximizing government revenue. This leads us to some general conclusions about governmental decisions, the relation to public good games, and the use of taxation in a risk-taking society.
    Date: 2012–10
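The portfolio effect described in the abstract can be reproduced with a minimal sketch (parameters assumed, not the paper's): a multiplicative shock with E[g] > 1 but E[log g] < 0 destroys almost every individual's human capital in the long run, while full per-period redistribution lets the pooled economy grow at roughly E[g] per period.

```python
import numpy as np

rng = np.random.default_rng(1)
n, T = 10_000, 200

# Assumed growth factors: 1.5 or 0.6 with equal probability, so that
# E[g] = 1.05 > 1 but E[log g] = (log 1.5 + log 0.6) / 2 < 0.
def shocks():
    return rng.choice([1.5, 0.6], size=n)

h_ind = np.ones(n)  # individualistic society: each agent compounds alone
h_red = np.ones(n)  # fully redistributive society: pooled each period

for _ in range(T):
    h_ind *= shocks()
    # Full redistribution: everyone receives the average post-shock capital.
    h_red = np.full(n, (h_red * shocks()).mean())

median_individual = float(np.median(h_ind))  # typical agent: near extinction
mean_redistributed = float(h_red.mean())     # pooled economy: sustained growth
```

The contrast is stark: the typical individualistic agent ends up with essentially nothing, while the redistributive economy grows at close to 5% per period, purely by rebalancing independent multiplicative risks.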
  6. By: Gianluca Benigno; Huigang Chen; Christopher Otrok; Alessandro Rebucci; Eric R. Young
    Abstract: In this paper we study whether policy makers should wait to intervene until a financial crisis strikes or rather act in a preemptive manner. We study this question in a relatively simple dynamic stochastic general equilibrium model in which crises are endogenous events induced by the presence of an occasionally binding borrowing constraint, as in Mendoza (2010). First, we show that the same set of taxes that replicates the constrained social planner allocation could be used optimally by a Ramsey planner to achieve the first-best unconstrained equilibrium: in both cases without any precautionary intervention. Second, we show that the extent to which policy makers should intervene preemptively depends critically on the set of policy tools available and on what these instruments can achieve when a crisis strikes. For example, in the context of our model, we find that if the set of policy tools is constrained so that the first best cannot be achieved and the policy maker has access to only one tax instrument, it is always desirable to intervene before the crisis, regardless of the instrument used. If, however, the policy maker has access to two instruments, it is optimal to act only during crisis times. Third and finally, we propose a computational algorithm to solve for Markov-perfect optimal policy in problems in which the policy function is not differentiable.
    Keywords: Monetary policy, Financial stability
    Date: 2012
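The paper's algorithm is not reproduced here; as a generic illustration of why an occasionally binding constraint makes policy functions non-differentiable, the toy value-iteration sketch below (all primitives assumed) solves a savings problem with a no-borrowing constraint: the constraint binds at low wealth, producing a kink in the savings policy.

```python
import numpy as np

# Assumed primitives of a toy savings problem with a no-borrowing constraint.
beta, R, y = 0.95, 1.02, 1.0              # impatience: beta * R < 1
grid = np.linspace(0.0, 10.0, 300)        # wealth grid; a' >= 0 is the constraint

cash = R * grid + y                        # resources in hand at each wealth level
c = cash[:, None] - grid[None, :]          # consumption for each (a, a') pair
util = np.where(c > 1e-10, np.log(np.maximum(c, 1e-10)), -np.inf)

V = np.zeros(grid.size)
for _ in range(1000):                      # value function iteration
    V_new = (util + beta * V[None, :]).max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-9:
        V = V_new
        break
    V = V_new

policy_a = grid[(util + beta * V[None, :]).argmax(axis=1)]  # savings policy a'(a)
binds = policy_a == 0.0          # states where the no-borrowing constraint binds
```

Because beta * R < 1, the agent dissaves: the constraint binds at low wealth (the policy sits at a' = 0) and becomes slack at high wealth, so the policy function has a kink rather than being smooth throughout.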
  7. By: Lilia Cavallari (Università degli Studi di Roma Tre)
    Abstract: This paper provides a DSGE model with firm entry. Simulations show that the model matches the synchronization of markups and entry observed in the data while at the same time reproducing empirically plausible moments for key macroeconomic variables. Sticky prices are essential for these results.
    Keywords: endogenous entry, firm dynamics, monopolistic competition, market power, markups
    JEL: E31 E32 E52
    Date: 2012
  8. By: Wei Sun; Robin Hanson; Kathryn Blackmond Laskey; Charles Twardy
    Abstract: A market-maker-based prediction market lets forecasters aggregate information by editing a consensus probability distribution either directly or by trading securities that pay off contingent on an event of interest. Combinatorial prediction markets allow trading on any event that can be specified as a combination of a base set of events. However, explicitly representing the full joint distribution is infeasible for markets with more than a few base events. A factored representation such as a Bayesian network (BN) can achieve tractable computation for problems with many related variables. Standard BN inference algorithms, such as the junction tree algorithm, can be used to update a representation of the entire joint distribution given a change to any local conditional probability. However, in order to let traders reuse assets from prior trades while never allowing assets to become negative, a BN-based prediction market also needs to update a representation of each user's assets and find the conditional state in which a user has minimum assets. Users also find it useful to see their expected assets given an edit outcome. We show how to generalize the junction tree algorithm to perform all these computations.
    Date: 2012–10
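For a market with only a few base events the computations in the abstract can be done on an explicit joint distribution, which is exactly what becomes infeasible at scale. The sketch below uses the standard logarithmic market scoring rule (numbers assumed): a trader's edit to P(A=1 | B=1) updates state-contingent assets by log(p_new/p_old), the minimum-asset state is found by enumeration, and expected assets under the new distribution equal KL(p_new || p), hence are nonnegative.

```python
import numpy as np
from itertools import product

# Explicit joint over three binary base events (A, B, C): 8 states.
states = list(product([0, 1], repeat=3))
p = np.full(8, 1.0 / 8)                    # assumed initial (uniform) joint

# Trader's edit: raise P(A=1 | B=1) from 0.5 to 0.8, leaving P(B) and the
# distribution outside the B=1 slice untouched.
mask_b1 = np.array([s[1] == 1 for s in states])
mask_a1b1 = np.array([s[0] == 1 and s[1] == 1 for s in states])
p_new = p.copy()
p_new[mask_b1 & mask_a1b1] *= 0.8 / 0.5    # scale the A=1 slice within B=1
p_new[mask_b1 & ~mask_a1b1] *= 0.2 / 0.5   # scale the A=0 slice within B=1
assert abs(p_new.sum() - 1.0) < 1e-12      # still a probability distribution

# Log market scoring rule: the trader's assets in state w change by
# log(p_new(w) / p(w)); states outside the edit are unaffected.
assets = np.log(p_new / p)

worst_state = states[int(np.argmin(assets))]   # minimum-asset state
expected_gain = float((p_new * assets).sum())  # = KL(p_new || p) >= 0
```

The minimum-asset state is one in which B=1 but A=0 (the trader bet against it); checking that this minimum stays above the trader's budget is exactly the computation the junction-tree generalization performs without enumerating all states.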
  9. By: Azusa Takeyama (Deputy Director and Economist, Institute for Monetary and Economic Studies, Bank of Japan); Nick Constantinou (Lecturer, Essex Business School, University of Essex); Dmitri Vinogradov (Lecturer, Essex Business School, University of Essex)
    Abstract: This paper develops a framework to estimate the probability of default (PD) implied in listed stock options. The underlying option pricing model measures PD as the intensity of a jump diffusion process, in which the underlying stock price jumps to zero at default. We adopt a two-stage calibration algorithm to obtain a precise estimator of PD. In the calibration procedure, we improve the fit of the option pricing model by implementing a time-inhomogeneous term structure model within it. Since this term structure model fits the actual term structure perfectly, we resolve the estimation bias caused by the poor fit of a time-homogeneous term structure model. We demonstrate that the PD estimator extracted from listed stock options can provide meaningful insights into the pricing of credit derivatives such as credit default swaps.
    Keywords: probability of default (PD), option pricing under credit risk, perturbation method
    JEL: C12 C53 G13
    Date: 2012–10
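The abstract's core pricing idea, default as a jump of the stock price to zero with intensity lambda, has a compact textbook form (a standard jump-to-default extension of Black-Scholes; parameters are assumed and the paper's two-stage calibration and term-structure refinements are not reproduced): a European call is priced as Black-Scholes with r replaced by r + lambda, so lambda can be backed out from an observed option price by bisection, giving PD = 1 - exp(-lambda*T).

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def call_jump_to_default(S, K, T, r, sigma, lam):
    """European call when the stock jumps to zero at default (intensity lam):
    Black-Scholes with the risk-free rate shifted to r + lam."""
    rr = r + lam
    d1 = (math.log(S / K) + (rr + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-rr * T) * norm_cdf(d2)

def implied_default_intensity(price, S, K, T, r, sigma, hi=2.0):
    """Back out lam from an observed call price by bisection
    (the call price is increasing in lam)."""
    lo_, hi_ = 0.0, hi
    for _ in range(200):
        mid = 0.5 * (lo_ + hi_)
        if call_jump_to_default(S, K, T, r, sigma, mid) < price:
            lo_ = mid
        else:
            hi_ = mid
    return 0.5 * (lo_ + hi_)

# Assumed market data: generate a 'market' price with lam = 0.03, recover it.
S, K, T, r, sigma = 100.0, 100.0, 1.0, 0.01, 0.25
market_price = call_jump_to_default(S, K, T, r, sigma, 0.03)
lam_hat = implied_default_intensity(market_price, S, K, T, r, sigma)
pd_1y = 1.0 - math.exp(-lam_hat * T)   # one-year risk-neutral PD
```

Setting lam = 0 recovers plain Black-Scholes; the paper's contribution lies in calibrating lambda precisely, including a time-inhomogeneous term structure, which this flat-rate sketch deliberately omits.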

This nep-cmp issue is ©2012 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.