NEP: New Economics Papers on Computational Economics
Issue of 2010–06–04
twelve papers chosen by
By: Nadia Belhaj Hassine; Véronique Robichaud; Bernard Decaluwé
Abstract: Computable General Equilibrium (CGE) models have steadily gained popularity as an empirical tool for assessing the impact of trade liberalization on agricultural growth, poverty and income distribution. Conventional models, however, ignore the channels linking technical change in agriculture, trade openness and poverty. This study incorporates econometric evidence of these linkages into a CGE model to estimate the impact of alternative trade liberalization scenarios on poverty and equity. The analysis uses the Latent Class Stochastic Frontier Model (LCSFM) and the metafrontier function to investigate the influence of trade openness on agricultural technological change. The estimated productivity effects induced by higher levels of trade are combined with a general equilibrium analysis of trade liberalization to evaluate income and price changes. These effects are then used to infer the impact on poverty and inequality following the top-down approach. The model is applied to Tunisian data using the 2001 social accounting matrix and the 2000 household expenditure survey. Poverty is found to decline under both agricultural and full trade liberalization, and the decline is much more pronounced when the productivity effects are included.
Keywords: Openness, Agriculture, Productivity, Poverty, CGE modeling
JEL: C24 C33 D24 F43 I32 Q17
Date: 2010
URL: http://d.repec.org/n?u=RePEc:lvl:lacicr:1022&r=cmp
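The top-down step described above can be illustrated with a minimal sketch: feed income changes simulated by a CGE model into household survey data and recompute a poverty measure. Everything below is hypothetical (synthetic incomes, an assumed poverty line, and illustrative income changes); only the Foster-Greer-Thorbecke (FGT) poverty index itself is standard.

```python
import numpy as np

def fgt_index(income, poverty_line, alpha=0):
    """Foster-Greer-Thorbecke poverty index.
    alpha=0: headcount ratio; alpha=1: poverty gap; alpha=2: severity."""
    gap = np.maximum(poverty_line - income, 0.0) / poverty_line
    return np.mean((gap ** alpha) * (income < poverty_line))

# Synthetic stand-in for a household expenditure survey.
rng = np.random.default_rng(0)
income = rng.lognormal(mean=7.0, sigma=0.6, size=10_000)
poverty_line = 800.0

# Illustrative CGE-simulated income changes (not the paper's estimates).
scenarios = [("baseline", 0.00),
             ("trade liberalization", 0.04),
             ("liberalization + productivity effects", 0.09)]
for label, growth in scenarios:
    h = fgt_index(income * (1 + growth), poverty_line, alpha=0)
    print(f"{label:40s} headcount ratio = {h:.3f}")
```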
By: Situngkir, Hokky
Abstract: Psychological states, side by side with bounded rational expectations among social agents, contribute to the pattern of consumption in an economic system. One such psychological state is envy, a tendency to emulate any gap with other agents' possessions. Evolutionary game-theoretic work on conspicuous consumption is explored by growing a micro-view of economic agency in lattice-based populations, the landscape of consumption. The emergent macro-view of multiple equilibria is shown in computational simulations, together with spatially clustered agents grouped by their emergent economic profiles.
Keywords: conspicuous consumption; behavioral economics; agent-based simulations
JEL: D11 C63 D82 C78 B40 C02 C62 E20 A14
Date: 2010–05–07
URL: http://d.repec.org/n?u=RePEc:pra:mprapa:22948&r=cmp
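A minimal sketch of the kind of lattice-based simulation the abstract describes, with assumed dynamics rather than the paper's actual specification: agents on a periodic grid emulate the consumption of their most affluent neighbor (the envy term), subject to a hard budget cap at their own wealth.

```python
import numpy as np

# Assumed dynamics in the spirit of the abstract, not the paper's model.
rng = np.random.default_rng(1)
N, T, envy = 50, 200, 0.3

wealth = rng.lognormal(0.0, 0.5, size=(N, N))          # fixed endowments
consum = wealth * rng.uniform(0.2, 0.8, size=(N, N))   # initial consumption

for _ in range(T):
    # Largest consumption among the four nearest neighbors (periodic lattice).
    neigh_max = np.maximum.reduce([np.roll(consum, s, axis=a)
                                   for a in (0, 1) for s in (-1, 1)])
    # Envy: emulate any upward gap with neighbors' conspicuous consumption,
    # while the budget constraint caps consumption at own wealth.
    consum = np.minimum(
        (1 - envy) * consum + envy * np.maximum(consum, neigh_max),
        wealth)

print("mean consumption:", round(float(consum.mean()), 3))
print("share consuming at budget limit:",
      round(float((consum >= wealth - 1e-9).mean()), 3))
```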
By: Shi, Guanming (University of Wisconsin); Stiegert, Kyle (University of Wisconsin); Chavas, Jean-Paul (University of Wisconsin)
Abstract: In this paper, we investigate substitution/complementarity relationships among products sold with different bundled characteristics and under different vertical arrangements. Our conceptual model demonstrates the interactive price impacts emanating from product differentiation, market concentration and market size. The model is applied to the U.S. cottonseed market using transaction-level data from 2002 to 2007. This market has undergone numerous structural changes owing to advances in and rapid adoption of seeds with differing bundles of biotechnology traits, and to vertical penetration by the biotechnology seed industry. Several interesting findings are reported. The econometric investigation finds evidence of sub-additive pricing in the bundling of patented biotech traits. Vertical organization is found to affect pricing and the exercise of market power. While higher market concentration is associated with higher prices, there is also evidence of cross-product complementarity effects that lead to lower prices. Simulation methods are developed to measure the net price effects. These simulations are applicable to pre-merger analysis of industries producing differentiated products and exhibiting similar market complexities.
JEL: L13 L40 L65
Date: 2009–12
URL: http://d.repec.org/n?u=RePEc:ecl:wisagr:543&r=cmp
By: Arthur Huang; David Levinson (Nexus (Networks, Economics, and Urban Systems) Research Group, Department of Civil Engineering, University of Minnesota)
Abstract: Adopting an agent-based approach, this paper explores the topological evolution of the Minneapolis Skyway System from a microscopic perspective. Under a decentralized decision-making mechanism, skyway segments are built by self-interested building owners. We measure accessibility for the blocks from 1962 to 2002, using the size of office space in each block as an indicator of business opportunities. By building skyway segments, building owners seek to increase their buildings' accessibility, and thus their potential business revenue. The equilibrium skyway network generated by the agent model displays similarity to the actual skyway system. The network topology is evaluated by multiple centrality measures (e.g., degree centrality, closeness centrality, and betweenness centrality) and a measure of road contiguity, roadness. Sensitivity tests on such parameters as the distance-decay parameter and the construction cost per unit length of segment are performed. Our results show that the accessibility-based agent model can provide unique insights into the dynamics of skyway network growth.
JEL: R41 R48 Q41 R51
Date: 2010
URL: http://d.repec.org/n?u=RePEc:nex:wpaper:skywayagents&r=cmp
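The centrality measures named in the abstract are standard graph statistics; a quick sketch with networkx on a toy graph standing in for the skyway network (roadness is the authors' own measure and is not reproduced here):

```python
import networkx as nx

# Toy block-and-skyway graph standing in for the real Minneapolis data.
G = nx.Graph()
G.add_edges_from([("A", "B"), ("B", "C"), ("C", "D"),
                  ("B", "E"), ("E", "F"), ("C", "F")])

# The three centrality measures named in the abstract.
print("degree:     ", nx.degree_centrality(G))
print("closeness:  ", nx.closeness_centrality(G))
print("betweenness:", nx.betweenness_centrality(G))
```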
By: Juan Carlos Hatchondo; Leonardo Martinez
Abstract: We propose a sovereign default framework that allows us to quantify the importance of the debt dilution problem in accounting for the level and volatility of sovereign default risk. We find that debt dilution accounts for almost 100% of the level and volatility of sovereign default risk in the simulations of a baseline model. Even without commitment to future repayment policies and without contingency of sovereign debt, if the sovereign could eliminate the dilution problem, the number of defaults per 100 years in our simulations decreases from 2.72 with debt dilution to 0.01 without debt dilution. This occurs in spite of dilution accounting for only 1% of the mean debt level. Our analysis is also relevant for the study of other credit markets where the debt dilution problem could appear.
Date: 2010
URL: http://d.repec.org/n?u=RePEc:fip:fedrwp:10-08&r=cmp
By: Armstrong, Christopher D. (University of Pennsylvania); Larcker, David F. (Stanford University); Su, Che-Lin (University of Chicago)
Abstract: The two major paradigms in the theoretical agency literature are moral hazard (i.e., hidden action) and adverse selection (i.e., hidden information). Prior research typically solves these problems in isolation rather than incorporating both adverse selection and moral hazard features simultaneously. We formulate two complementary generalized principal-agent models that incorporate features observed in real-world contracting environments (e.g., agents with power utility and limited liability, lognormal stock price distributions, and stock options) as mathematical programs with equilibrium constraints (MPEC). We use state-of-the-art numerical algorithms to solve the resulting models. We find that many of the standard results no longer obtain when wealth effects are present. We also develop a new measure of incentives, calculated as the change in the agent's certainty equivalent under the optimal contract for a change in action, evaluated at the optimal action. This measure facilitates interpretation of the resulting contracts and allows us to compare contracts across different contracting environments.
JEL: C60 C61 J33 M52
Date: 2010–02
URL: http://d.repec.org/n?u=RePEc:ecl:stabus:2049&r=cmp
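The paper's contribution is the MPEC formulation and solver; as a much simpler point of reference, a textbook hidden-action problem with a risk-averse agent can be sketched with scipy.optimize: the principal minimizes the expected wage subject to participation (IR) and incentive-compatibility (IC) constraints, holding the high effort level fixed. All parameter values below are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

# Two effort levels, three outcomes; outcome probabilities given effort.
p_high = np.array([0.1, 0.3, 0.6])
p_low  = np.array([0.5, 0.3, 0.2])
cost_high, cost_low = 1.0, 0.0      # agent's effort costs
u_reserve = 1.0                     # reservation utility
u = np.sqrt                         # agent's utility over wages

def expected_wage(w):               # principal's objective
    return p_high @ w

def ir(w):                          # participation constraint (>= 0)
    return p_high @ u(w) - cost_high - u_reserve

def ic(w):                          # incentive compatibility (>= 0)
    return (p_high @ u(w) - cost_high) - (p_low @ u(w) - cost_low)

res = minimize(expected_wage, x0=np.full(3, 4.0),
               bounds=[(1e-6, None)] * 3,   # limited liability: w > 0
               constraints=[{"type": "ineq", "fun": ir},
                            {"type": "ineq", "fun": ic}])
print("optimal wage schedule:", res.x.round(3))
print("expected wage cost:   ", round(float(res.fun), 3))
```

The classic result that wages rise with the likelihood ratio of the high-effort outcome shows up directly in the solved schedule.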
By: Yu-Min Yen
Abstract: In this short report, we discuss how coordinate-wise descent algorithms can be used to solve minimum variance portfolio (MVP) problems in which the portfolio weights are constrained by $l_{q}$ norms, where $1\leq q \leq 2$. A portfolio whose weights are regularised by such norms is called a sparse portfolio (Brodie et al.), since these constraints promote sparsity (zero components) in the weight vector. We first consider the case in which the portfolio weights are regularised by a weighted $l_{1}$ and squared $l_{2}$ norm. Two benchmark data sets (Fama and French 48 industries and 100 size and BM ratio portfolios) are then used to examine the performance of the sparse portfolios. When the sample size is not large relative to the number of assets, sparse portfolios tend to have lower out-of-sample portfolio variances, turnover rates, numbers of active assets and short-sale positions, but higher Sharpe ratios, than the unregularised MVP. We then show some possible extensions; in particular, we derive an efficient algorithm for solving an MVP problem in which assets are allowed to be selected in groups.
Date: 2010–05
URL: http://d.repec.org/n?u=RePEc:arx:papers:1005.5082&r=cmp
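A sketch of the coordinate-wise descent idea for the weighted $l_{1}$ plus squared $l_{2}$ case. To keep every coordinate update in closed form, the budget constraint sum(w) = 1 is handled below with a quadratic penalty, a simplification relative to the report's treatment; each weight update is then a soft-thresholding step. The toy covariance matrix stands in for the Fama-French data.

```python
import numpy as np

def soft(z, t):
    """Soft-thresholding operator."""
    return np.sign(z) * max(abs(z) - t, 0.0)

def sparse_mvp(Sigma, lam1=0.05, lam2=0.01, gamma=100.0,
               n_iter=500, tol=1e-10):
    """Coordinate descent for min w'Sw + lam1*||w||_1 + lam2*||w||_2^2
    + gamma*(sum(w) - 1)^2.  Minimizing the objective in w[i] alone gives
    A*w[i]^2 + B*w[i] + lam1*|w[i]| with A = S_ii + lam2 + gamma and
    B = 2*(sum_{j!=i} S_ij w_j + gamma*(sum_{j!=i} w_j - 1)), whose
    minimizer is soft(-B, lam1) / (2A)."""
    n = Sigma.shape[0]
    w = np.full(n, 1.0 / n)
    for _ in range(n_iter):
        w_old = w.copy()
        for i in range(n):
            s_other = Sigma[i] @ w - Sigma[i, i] * w[i]   # cross terms
            b = 2.0 * (s_other + gamma * (w.sum() - w[i] - 1.0))
            a = 2.0 * (Sigma[i, i] + lam2 + gamma)
            w[i] = soft(-b, lam1) / a
        if np.abs(w - w_old).max() < tol:
            break
    return w

rng = np.random.default_rng(2)
X = rng.normal(size=(250, 20))        # 250 "daily returns", 20 assets
Sigma = np.cov(X, rowvar=False)
w = sparse_mvp(Sigma)
print("sum of weights:", round(float(w.sum()), 4))
print("active assets :", int(np.count_nonzero(w)), "of", len(w))
```

Raising lam1 forces more weights to exactly zero, which is the sparsity mechanism the abstract refers to.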
By: Pierre Del Moral (INRIA Bordeaux - Sud-Ouest - ALEA - INRIA - Université de Bordeaux - CNRS : UMR5251); Peng Hu (INRIA Bordeaux - Sud-Ouest - ALEA - INRIA - Université de Bordeaux - CNRS : UMR5251); Nadia Oudjane (LAGA - Laboratoire Analyse, Géométrie et Applications - Université Paris 13 - Université Paris-Nord - Paris XIII, EDF R&D - EDF); Bruno Rémillard (MQG - Méthodes Quantitatives de Gestion - HEC-Montréal)
Abstract: We analyze the robustness properties of the Snell envelope backward evolution equation for discrete time models. We provide a general robustness lemma, and we apply this result to a series of approximation methods, including cut-off type approximations, Euler discretization schemes, interpolation models, quantization tree models, and the Stochastic Mesh method of Broadie-Glasserman. In each situation, we provide non-asymptotic convergence estimates, including Lp-mean error bounds and exponential concentration inequalities. In particular, this analysis allows us to recover existing convergence results for the quantization tree method and to significantly improve the rates of convergence obtained for the Stochastic Mesh estimator of Broadie-Glasserman. In the final part of the article, we propose a genealogical tree algorithm based on a mean field approximation of the reference Markov process in terms of a neutral-type genetic model. In contrast to Broadie-Glasserman Monte Carlo models, the computational cost of this new stochastic particle approximation is linear in the number of sampled points.
Keywords: Snell envelope; optimal stopping; American option pricing; genealogical trees; interacting particle model
Date: 2010–05–28
URL: http://d.repec.org/n?u=RePEc:hal:journl:inria-00487103_v2&r=cmp
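The object whose approximations the paper analyzes, the Snell envelope, satisfies the backward recursion U_n = max(f_n, E[U_{n+1} | F_n]). A minimal sketch of that recursion for an American put on a binomial tree (a toy setting; the paper's results concern error bounds for approximation schemes of this recursion in general Markov models):

```python
import numpy as np

# American put on a Cox-Ross-Rubinstein binomial tree: the Snell envelope
# U_n = max(payoff_n, E[U_{n+1} | F_n]) computed by backward induction.
S0, K, r, sigma, T, N = 100.0, 100.0, 0.05, 0.2, 1.0, 200
dt = T / N
u = np.exp(sigma * np.sqrt(dt))
d = 1.0 / u
p = (np.exp(r * dt) - d) / (u - d)     # risk-neutral up probability
disc = np.exp(-r * dt)

# Terminal stock prices (indexed by number of up-moves) and payoffs.
j = np.arange(N + 1)
S = S0 * u**j * d**(N - j)
U = np.maximum(K - S, 0.0)

# Backward evolution of the Snell envelope.
for n in range(N - 1, -1, -1):
    j = np.arange(n + 1)
    S = S0 * u**j * d**(n - j)
    cont = disc * (p * U[1:n + 2] + (1 - p) * U[:n + 1])
    U = np.maximum(K - S, cont)        # exercise value vs. continuation

print("American put value:", round(float(U[0]), 4))
```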
By: Erik Jenelius; Lars-Göran Mattsson; David Levinson (Nexus (Networks, Economics, and Urban Systems) Research Group, Department of Civil Engineering, University of Minnesota)
Abstract: In this paper we introduce an activity-based modeling approach for evaluating the traveler costs of transport network disruptions. The model handles several important aspects of such events: increases in travel time may be very long in relation to normal day-to-day fluctuations; the impact of delay may depend on the flexibility to reschedule activities; lack of information and uncertainty about travel conditions may lead to under- or over-adjustment of the daily schedule in response to the delay; and delays on more than one trip may restrict the gain from rescheduling activities. We derive properties such as the value of time and schedule costs analytically. Numerical calculations show that the average cost per hour of delay increases with the delay duration, so that every additional minute of delay comes at a higher cost. The cost varies depending on adjustment behavior (less adjustment, loosely speaking, giving higher cost) and scheduling flexibility (greater flexibility giving lower cost). The results indicate that existing evaluations of real network disruptions have underestimated the societal costs of such events.
JEL: R41 R48 D63
Date: 2010
URL: http://d.repec.org/n?u=RePEc:nex:wpaper:traveller_disruptions_costs&r=cmp
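The finding that the average cost per hour of delay rises with duration can be illustrated with a stylised schedule (hypothetical values, not the authors' model): a delay first displaces flexible leisure time, then work time, and eventually causes a fixed appointment to be missed, so marginal costs escalate.

```python
# Stylised daily schedule cost: a delay eats low-value leisure slack
# first, then higher-value work time, then triggers a fixed penalty for
# a missed appointment.  All numbers are illustrative assumptions.
V_LEISURE, V_WORK, APPOINTMENT_PENALTY = 10.0, 30.0, 50.0   # $/h, $/h, $
LEISURE_SLACK, WORK_SLACK = 1.0, 2.0                        # hours of slack

def delay_cost(delay_hours):
    leisure_lost = min(delay_hours, LEISURE_SLACK)
    work_lost = min(max(delay_hours - LEISURE_SLACK, 0.0), WORK_SLACK)
    missed = delay_hours > LEISURE_SLACK + WORK_SLACK
    return (V_LEISURE * leisure_lost + V_WORK * work_lost
            + (APPOINTMENT_PENALTY if missed else 0.0))

for d in [0.5, 1.0, 2.0, 4.0]:
    print(f"delay {d:>4.1f} h: total cost {delay_cost(d):6.1f} $, "
          f"average {delay_cost(d) / d:5.1f} $/h")
```

The printed averages (10, 10, 20, 30 $/h) rise with the delay duration, matching the qualitative pattern the abstract reports.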
By: Yves Balasko (University of York); Enrique Kawamura (Department of Economics, Universidad de San Andres)
Abstract: This paper answers the question of whether non-strategic default improves welfare, not only for borrowers with uncertain future income but also for lenders with certain future endowments, relative to no default. We show that the answer is affirmative for a positive-Lebesgue-measure set of individual endowments. Numerical computations show that the size of this endowment set increases with both risk aversion and the probability of default. Other numerical examples show that with defaultable securities, lenders may finance the purchase of the latter by short-selling default-free assets. Such a portfolio is reminiscent of those of hedge funds such as LTCM.
Keywords: macroeconomics, welfare, Pareto
Date: 2010–05
URL: http://d.repec.org/n?u=RePEc:sad:wpaper:102&r=cmp
By: Lukasz Delong
Abstract: In this paper we investigate novel applications of a new class of equations which we call time-delayed backward stochastic differential equations. We show that many pricing and hedging problems concerning structured products, participating products or variable annuities can be handled by these equations. Time-delayed BSDEs arise when we seek a strategy and a portfolio that replicate a liability whose pay-off depends on the investment strategy applied or on the values of the portfolio. This is usually the case for investment funds or life insurance investment contracts which have bonus distribution mechanisms or provide protection against low returns. We consider some life insurance products, derive the corresponding time-delayed BSDEs, and solve them explicitly or at least provide hints on how to solve them numerically. We investigate perfect hedging and quadratic hedging, the latter being crucial for insurance applications. We study the consequences and give an economic interpretation of the fact that a time-delayed BSDE may not have a solution or may have multiple solutions.
Date: 2010–05
URL: http://d.repec.org/n?u=RePEc:arx:papers:1005.4417&r=cmp
By: Georg Müller-Fürstenberger; Gunter Stephan
Abstract: This paper discusses the interplay between the choice of the discount rate, greenhouse gas mitigation and endogenous technological change. Neglecting the issue of uncertainty, it is shown that the green golden rule stock of atmospheric carbon is uniquely determined but is not affected by technological change. More generally, it is shown analytically within the framework of a reduced model of integrated assessment that optimal stationary stocks of atmospheric carbon depend on the choice of the discount rate but are independent of the stock of technological knowledge. These results are then reinforced numerically in a fully specified integrated assessment analysis.
Keywords: Integrated Assessment; discount rate; endogenous technological change; climate change
JEL: Q40 O13
Date: 2010–05
URL: http://d.repec.org/n?u=RePEc:ube:dpvwib:dp1008&r=cmp