
on Computational Economics 
Issue of 2012‒01‒03
24 papers chosen by 
By:  Su, En-Der; Fen, Yu-Gin 
Abstract:  The present study uses the structural equation model (SEM) to analyze the correlations between various economic indices pertaining to latent variables, such as the New Taiwan Dollar (NTD) value, the United States Dollar (USD) value, and the USD index. In addition, a risk factor of volatility of currency returns is considered to develop a risk-controllable fuzzy inference system. The rational and linguistic knowledge-based fuzzy rules are established based on the SEM model and then optimized using the genetic algorithm. The empirical results reveal that the fuzzy logic trading system using the SEM indeed outperforms the buy-and-hold strategy. Moreover, when the risk factor of currency volatility is considered, the performance appears significantly better. Remarkably, the trading strategy is clearly affected when the USD value or the volatility of currency returns shifts into either a higher or lower state. 
Keywords:  Knowledge-based Systems; Fuzzy Sets; Structural Equation Model (SEM); Genetic Algorithm (GA); Currency Volatility 
JEL:  C3 C45 F31 
Date:  2011–12–12 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:35474&r=cmp 
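The abstract above combines fuzzy inference with genetic optimization. As a rough illustration of that general pattern (not the authors' system; the membership shapes, the genome, and the fitness function are all invented for this sketch), a genetic algorithm can tune the midpoints of fuzzy membership functions so that the resulting trading rule maximizes in-sample return:

```python
import random

# Toy sketch: one fuzzy rule whose membership midpoints are tuned by a GA.
# A "dollar strength" signal in [0, 1] is mapped to a position in [-1, 1].

def triangular(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def position(signal, genome):
    """Fuzzy inference: weigh 'weak' vs 'strong' dollar memberships."""
    a, b = genome  # hypothetical tunable membership midpoints
    weak = triangular(signal, 0.0, a, 1.0)
    strong = triangular(signal, 0.0, b, 1.0)
    return strong - weak  # net long/short exposure in [-1, 1]

def fitness(genome, data):
    """Cumulative return of the fuzzy rule on (signal, return) pairs."""
    return sum(position(s, genome) * r for s, r in data)

def evolve(data, pop_size=20, generations=40, seed=0):
    """Elitist GA with one-point crossover and Gaussian mutation."""
    rng = random.Random(seed)
    pop = [(rng.random(), rng.random()) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda g: fitness(g, data), reverse=True)
        elite = pop[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(elite)):
            p, q = rng.sample(elite, 2)
            child = (p[0], q[1])                     # crossover
            if rng.random() < 0.2:                   # mutation, clamped
                child = (min(1.0, max(0.0, child[0] + rng.gauss(0, 0.1))),
                         child[1])
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda g: fitness(g, data))
```

The paper's actual system is far richer (SEM-derived rules, a volatility risk factor); the sketch only shows how a GA searches the parameter space of fuzzy rules.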
By:  Carfì, David; Perrone, Emanuele 
Abstract:  In this paper we apply the Complete Analysis of Differentiable Games (introduced by D. Carfì in [3], [6], [8] and [9], and already employed by himself and others in [4], [5] and [7]) and some new algorithms employing the software wxMaxima 11.04.0 in order to reach a total knowledge of the classic Bertrand Duopoly (1883), viewed as a complex interaction between two competitive subjects, in a particularly difficult asymmetric case. The software wxMaxima is an interface for the computer algebra system Maxima. Maxima is a system for the manipulation of symbolic and numerical expressions, including differentiation, systems of linear equations, polynomials, sets, vectors, and matrices. Maxima yields high-precision numeric results by using exact fractions, arbitrary-precision integers, and variable-precision floating-point numbers. Maxima can plot functions and data in two and three dimensions. The Bertrand Duopoly is a classic oligopolistic market in which two enterprises produce the same commodity and sell it in the same market. In this classic model, in a competitive background, the two enterprises employ the unit prices of their products as strategies, contrary to the Cournot duopoly, in which the enterprises use the quantities of the commodity produced as strategies. The main solutions proposed in the literature for this kind of duopoly (as in the case of the Cournot duopoly) are the Nash equilibrium and the Collusive Optimum, without any subsequent critical examination of these two kinds of solutions. The absence of any critical quantitative analysis is due to the relevant lack of knowledge regarding the set of all possible outcomes of this strategic interaction. On the contrary, by considering the Bertrand Duopoly as a differentiable game (a game with differentiable payoff functions) and studying it by the new topological methodologies introduced by D. Carfì, we obtain an exhaustive and complete vision of the entire payoff space of the Bertrand game (also in asymmetric cases, with the help of wxMaxima 11.04.0), and this total view allows us to analyze the classic solutions critically and to find other ways of action to select Pareto strategies, in the asymmetric cases too. In order to illustrate the application of this topological methodology to the considered infinite game, we show how the complete study gives an extremely extended comprehension of the classic model. 
Keywords:  Asymmetric Bertrand Duopoly; Normal-form games; software algorithms in microeconomic policy; Complete Analysis of a normal-form complex interaction; Pareto optima; valuation of Nash equilibria; bargaining solutions 
JEL:  C88 C7 C8 D43 D4 L1 C72 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:35417&r=cmp 
By:  Sören Koch (Faculty of Economics and Management, Otto-von-Guericke University Magdeburg); Gerhard Wäscher (Faculty of Economics and Management, Otto-von-Guericke University Magdeburg) 
Abstract:  Order picking is a warehouse function that deals with the retrieval of articles from their storage locations in order to satisfy certain customer demands. Combining several single customer orders into one (more substantial) picking order can increase the efficiency of warehouse operations. The Order Batching Problem considered in this paper deals with the question of how different customer orders should be grouped into picking orders such that the total length of all tours through the warehouse necessary to collect all requested articles is minimized. For the solution of this problem, the authors introduce a Grouping Genetic Algorithm. This genetic algorithm is combined with a local search procedure, which results in a highly competitive hybrid algorithm. In a series of extensive numerical experiments, the algorithm is benchmarked against a genetic algorithm with a standard item-oriented encoding scheme. The results show that the new genetic algorithm based on the group-oriented encoding scheme is preferable for the Order Batching Problem, and that the algorithm provides high-quality solutions in reasonable computing times. 
Keywords:  Warehouse Management, Order Picking, Order Batching, Genetic Algorithms 
Date:  2011–11 
URL:  http://d.repec.org/n?u=RePEc:mag:wpaper:110026&r=cmp 
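A group-oriented encoding, as used in the paper above, represents a solution directly as a partition of orders into batches. The sketch below is illustrative only (the tour-length proxy and the relocation move are simplified inventions, not the authors' algorithm): it shows such an encoding, a first-fit construction heuristic, and the kind of local-search improvement step a hybrid algorithm can add:

```python
# Orders are lists of item ids; aisle_of maps each item to its aisle.

def batch_cost(batch, aisle_of):
    """Toy tour-length proxy: number of distinct aisles the picker visits."""
    return len({aisle_of[item] for order in batch for item in order})

def total_cost(batches, aisle_of):
    return sum(batch_cost(b, aisle_of) for b in batches)

def first_fit(orders, capacity):
    """Initial partition: assign each order to the first batch with room."""
    batches = []
    for order in orders:
        for b in batches:
            if sum(len(o) for o in b) + len(order) <= capacity:
                b.append(order)
                break
        else:
            batches.append([order])
    return batches

def improve(batches, aisle_of, capacity):
    """One local-search pass: relocate an order if it shortens the tours."""
    for i, src in enumerate(batches):
        for order in list(src):
            for j, dst in enumerate(batches):
                if i == j or sum(len(o) for o in dst) + len(order) > capacity:
                    continue
                before = batch_cost(src, aisle_of) + batch_cost(dst, aisle_of)
                src.remove(order)
                dst.append(order)
                after = batch_cost(src, aisle_of) + batch_cost(dst, aisle_of)
                if after >= before:          # revert non-improving move
                    dst.remove(order)
                    src.append(order)
    return [b for b in batches if b]
```

In the paper, the genetic operators (crossover, mutation) act on whole groups rather than on individual orders; the relocation move above stands in for the local search that the hybrid combines with the GA.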
By:  Nwaobi, Godwin 
Abstract:  In recent years, the governments of African countries have assumed major responsibilities for economic reforms and growth. In attempting to describe their economies, economists (policymakers) in many African countries have applied certain models that are by now widely known: linear programming models, input-output models, macroeconometric models, vector autoregression models and computable general equilibrium models. Unfortunately, economies are complicated systems encompassing micro behaviors, interaction patterns and global regularities. Whether partial or general in scope, studies of economic systems must consider how to handle difficult real-world aspects such as asymmetric information, imperfect competition, strategic interaction, collective learning and the possibility of multiple equilibria. This paper therefore argues for the adoption of an alternative modeling (bottom-up culture-dish) approach known as Agent-based Computational Economics (ACE), which is the computational study of African economies modeled as evolving systems of autonomous interacting agents. However, the software bottleneck (what rules to write for our agents) remains the primary challenge ahead. 
Keywords:  artificial intelligence; computational laboratory; complex networks; multi-agent systems; agent-based computational economics; social networks; macroeconometric model; linear programming; input-output; vector autoregression; ACE models; VAR models; neural networks; gene networks; derivatives; financial contagion; African economies; ACEGES models; energy 
JEL:  C9 C6 C5 C0 B4 C7 C1 C4 D5 C3 C8 C2 
Date:  2011–12–14 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:35414&r=cmp 
By:  Patricia Deflorin (Department of Business Administration, University of Zurich); Helmut Dietl (Department of Business Administration, University of Zurich); Markus Lang (Department of Business Administration, University of Zurich); Eric Lucas (Department of Business Administration, University of Zurich) 
Abstract:  This paper develops a simulation model to compare the performance of two stylized manufacturing networks: the lead factory network (LFN) and the archetype network (AN). The model identifies the optimal network configuration and its implications for coordination mechanisms. Using an NK simulation model to differentiate between exogenous factors (configuration) and endogenous factors (coordination), we find that low complexity of the production process, low transfer costs and high search costs, as well as a larger number of manufacturing plants, benefit the LFN compared to the AN. Optimally coordinating the chosen network configuration of the LFN might require fully transferring knowledge in the short run but transferring nothing in the long run. Moreover, a late knowledge transfer from the lead factory to the plants increases the pre-transfer performance of the LFN but results in a larger performance drop, yielding a lower short-run but a higher long-run performance of the LFN. 
Keywords:  Manufacturing network, manufacturing plant, global operations management, lead factory, knowledge transfer 
Date:  2011–12 
URL:  http://d.repec.org/n?u=RePEc:iso:wpaper:0152&r=cmp 
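The NK model referenced above is a standard device for tuning the ruggedness of a performance landscape through the interdependence parameter K. A minimal generic implementation (a textbook construct, not the authors' calibrated model; parameter names are invented) looks like this:

```python
import random

# N binary choices; each choice's contribution depends on itself and its
# K right-hand neighbours via a random lookup table (drawn lazily).

def make_nk(n, k, seed=0):
    rng = random.Random(seed)
    table = [{} for _ in range(n)]
    def fitness(config):
        total = 0.0
        for i in range(n):
            key = tuple(config[(i + j) % n] for j in range(k + 1))
            if key not in table[i]:
                table[i][key] = rng.random()  # contribution in [0, 1)
            total += table[i][key]
        return total / n
    return fitness

def local_search(fitness, n, steps=200, seed=1):
    """Hill-climb by single-bit flips, mimicking local adaptation."""
    rng = random.Random(seed)
    config = [rng.randint(0, 1) for _ in range(n)]
    best = fitness(config)
    for _ in range(steps):
        i = rng.randrange(n)
        config[i] ^= 1
        f = fitness(config)
        if f >= best:
            best = f
        else:
            config[i] ^= 1  # revert worsening flip
    return config, best
```

Higher K makes contributions more interdependent, so the landscape grows more rugged and local search is more likely to stall on a local peak; this is the mechanism behind the configuration/coordination trade-offs the paper studies.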
By:  Ketter, W.; Collins, J.; Reddy, P.; Flath, C.; Weerdt, M.M. de 
Abstract:  This is the specification for the Power Trading Agent Competition for 2012 (Power TAC 2012). Power TAC is a competitive simulation that models a "liberalized" retail electrical energy market, where competing business entities or "brokers" offer energy services to customers through tariff contracts, and must then serve those customers by trading in a wholesale market. Brokers are challenged to maximize their profits by buying and selling energy in the wholesale and retail markets, subject to fixed costs and constraints. Costs include fees for publication and withdrawal of tariffs, and distribution fees for transporting energy to their contracted customers. Costs are also incurred whenever there is an imbalance between a broker's total contracted energy supply and demand within a given timeslot. The simulation environment models a wholesale market, a regulated distribution utility, and a population of energy customers, situated in a real location on Earth during a specific period for which weather data is available. The wholesale market is a relatively simple call market, similar to many existing wholesale electric power markets, such as Nord Pool in Scandinavia or FERC markets in North America; unlike the FERC markets, however, we model a single region, and therefore we do not model locational marginal pricing. Customer models include households and a variety of commercial and industrial entities, many of which have production capacity (such as solar panels or wind turbines) as well as electric vehicles. All have "real-time" metering to support allocation of their hourly supply and demand to their subscribed brokers, and all are approximate utility maximizers with respect to tariff selection, although the factors making up their utility functions may include aversion to change and complexity that can retard uptake of marginally better tariff offers. 
The distribution utility models the regulated natural monopoly that owns the regional distribution network, and is responsible for maintenance of its infrastructure and for real-time balancing of supply and demand. The balancing process is a market-based mechanism that uses economic incentives to encourage brokers to achieve balance within their portfolios of tariff subscribers and wholesale market positions, in the face of stochastic customer behaviors and weather-dependent renewable energy sources. The broker with the highest bank balance at the end of the simulation wins. 
Keywords:  power; portfolio management; sustainability; preferences; energy; trading agent competition; electronic commerce; autonomous agents; policy guidance 
Date:  2011–12–14 
URL:  http://d.repec.org/n?u=RePEc:dgr:eureri:1765030683&r=cmp 
By:  Lajos Gergely Gyurko; Ben Hambly; Jan Hendrik Witte 
Abstract:  We consider a class of discrete time stochastic control problems motivated by some financial applications. We use a pathwise stochastic control approach to provide a dual formulation of the problem. This enables us to develop a numerical technique for obtaining an estimate of the value function which improves on purely regression based methods. We demonstrate the competitiveness of the method on the example of a gas storage valuation problem. 
Date:  2011–12 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1112.4351&r=cmp 
By:  Pickhardt, Michael; Seibold, Goetz 
Abstract:  We analyze income tax evasion dynamics in a standard model of statistical mechanics, the Ising model of ferromagnetism. However, in contrast to previous research, we use an inhomogeneous multidimensional Ising model where the local degrees of freedom (agents) are subject to a specific social temperature and coupled to external fields which govern their social behavior. This new modeling frame allows for analyzing large societies of four different and interacting agent types. As a second novelty, our model may reproduce results from agent-based models that incorporate standard Allingham and Sandmo tax evasion features as well as results from existing two-dimensional Ising-based tax evasion models. We then use our model for analyzing income tax evasion dynamics under different enforcement scenarios and point to some policy implications. 
Keywords:  tax evasion, tax compliance, Ising model, econophysics, numerical simulation 
JEL:  H26 O17 C15 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:zbw:cawmdp:53&r=cmp 
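For readers unfamiliar with the physics analogy, the core of an Ising-type compliance model is Metropolis dynamics over agents' binary states. The sketch below is a generic one-dimensional version with a single agent type (the paper's model is inhomogeneous, multidimensional, and has four interacting agent types; all parameter names here are invented):

```python
import math
import random

# Spin +1 = compliant, -1 = evading. `field` is a behavioural incentive
# (e.g. enforcement pressure), `temperature` a social temperature.

def simulate(n=100, temperature=2.0, coupling=1.0, field=0.5,
             sweeps=200, seed=0):
    rng = random.Random(seed)
    spins = [rng.choice([-1, 1]) for _ in range(n)]
    for _ in range(sweeps):
        for _ in range(n):
            i = rng.randrange(n)
            # energy change of flipping agent i on a ring of neighbours
            neighbours = spins[(i - 1) % n] + spins[(i + 1) % n]
            delta = 2 * spins[i] * (coupling * neighbours + field)
            # Metropolis rule: always accept downhill, sometimes uphill
            if delta <= 0 or rng.random() < math.exp(-delta / temperature):
                spins[i] = -spins[i]
    return sum(spins) / n  # mean compliance level in [-1, 1]
```

Raising the field (stronger enforcement incentive) drives mean compliance up, while a high social temperature randomizes behavior; scenario analysis in this family of models amounts to varying those knobs.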
By:  Philipp Doersek; Josef Teichmann 
Abstract:  We introduce efficient numerical methods for generic HJM equations of interest rate theory by means of high-order weak approximation schemes. These schemes allow for QMC implementations due to the relatively low-dimensional integration space. The complexity of the resulting algorithm is considerably lower than the complexity of multilevel MC algorithms as long as the optimal order of QMC convergence is guaranteed. In order to make the methods applicable to real-world problems, we introduce and use the setting of weighted function spaces, such that unbounded payoffs and unbounded characteristics of the equations in question are still allowed. We also provide an implementation, where we efficiently calibrate an HJM equation to caplet data. 
Date:  2011–12 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1112.5330&r=cmp 
By:  Stefan Boeters (CPB, Netherlands Bureau for Economic Policy Analysis); Luc Savard (Département d’économique and GRÉDI, Université de Sherbrooke) 
Abstract:  This chapter reviews options for labour market modelling in a CGE framework. On the labour supply side, two principal modelling options are distinguished and discussed: aggregated, representative households and microsimulation based on individual household data. On the labour demand side, we focus on the substitution possibilities between different types of labour in production. With respect to labour market coordination, we discuss several wage-forming mechanisms and involuntary unemployment. 
Keywords:  computable general equilibrium model, labour market, labour supply, labour demand, microsimulation, involuntary unemployment 
JEL:  C68 D58 J20 J64 
Date:  2011–10 
URL:  http://d.repec.org/n?u=RePEc:shr:wpaper:1120&r=cmp 
By:  Tetsuji Tanaka (School of Oriental and African Studies, University of London); Nobuhiro Hosoe (National Graduate Institute for Policy Studies) 
Abstract:  In the late 2000s, the world grain markets experienced severe turbulence, with rapid crop price rises caused by bad crops, oil price hikes, export restrictions, and the emergence of biofuels, as well as financial speculation. We review the impacts of the first four real-side factors using a world trade computable general equilibrium model. Our simulation results show that oil- and biofuels-related shocks were the major factors among these four in the crop price hikes, but that these real-side factors in total can explain only about 10% of the actual crop price rises. 
Date:  2011–12 
URL:  http://d.repec.org/n?u=RePEc:ngi:dpaper:1116&r=cmp 
By:  Patricia Apps; Ngo Van Long; Ray Rees 
Abstract:  Given its significance in practice, piecewise linear taxation has received relatively little attention in the literature. This paper offers a simple and transparent analysis of its main characteristics. We fully characterize optimal tax parameters for the cases in which budget sets are convex and non-convex respectively. A numerical analysis of a discrete version of the model shows the circumstances under which each of these cases will hold as a global optimum. We find that, given plausible parameter values and wage distributions, the globally optimal tax system is convex, and marginal rate progressivity increases with rising inequality. 
Keywords:  piecewise linear; income; taxation 
JEL:  H21 H31 J22 
Date:  2011–12 
URL:  http://d.repec.org/n?u=RePEc:auu:dpaper:655&r=cmp 
By:  Carlo Mazzaferro; Marcello Morciano 
Abstract:  Reforms to the Italian social security system, carried out from 1992 onwards, will dramatically change its structure in the long run. So far, empirical research has devoted more attention to their macroeconomic and financial effects, while relatively less attention has been paid to analysing their redistributional implications. We pursue this line of research using CAPP_DYN, a population-based dynamic microsimulation model. The model stochastically simulates the socio-demographic and economic evolution of a representative sample of the Italian population over the period 2010-2050. The initial sample is subjected to a large number of demographic and economic events such as partnership formation/dissolution, birth, education, work, retirement, health and disability, and death. While acknowledging the rather complex phasing-in of the Notional Defined Contribution (NDC) system, introduced into the Italian social security system from 1995, a set of indexes (net present value ratio, Gini index, replacement ratios, etc.) is used to evaluate the distributional properties of the reformed pension system in each of the simulated years as well as in a lifetime/cohort perspective. Two main critical distributional aspects emerge. Firstly, the model predicts an increase in the dispersion of old-age pensions in the transitional phase (2015-2030) due to the coexistence of different pension regimes and rules for calculating pensions. Secondly, a problem of adequacy in the public pension system emerges from 2035, as the NDC system will be almost completely phased in. 
JEL:  H20 H30 H55 
Date:  2011–12 
URL:  http://d.repec.org/n?u=RePEc:itt:wpaper:wp201111&r=cmp 
By:  Diana Barro (Department of Economics, University of Venice Cà Foscari); Elio Canestrelli (Department of Economics, University of Venice Cà Foscari) 
Abstract:  In this contribution we propose an approach to solving a multistage stochastic programming problem which allows us to obtain a time and nodal decomposition of the original problem. This double decomposition is achieved by applying a discrete-time optimal control formulation to the original stochastic programming problem in arborescent form. Combining the arborescent formulation of the problem with the point of view of optimal control theory naturally yields, as a first result, the time decomposability of the optimality conditions, which can be organized according to the terminology and structure of a discrete-time optimal control problem into the systems of equations for the state and adjoint variable dynamics and the optimality conditions for the generalized Hamiltonian. Moreover, these conditions, due to the arborescent formulation of the stochastic programming problem, further decompose with respect to the nodes in the event tree. The optimal solution is obtained by solving small decomposed subproblems and combining them by means of a mean-valued fixed-point iterative scheme. To enhance convergence we suggest an optimization step in which the weights are chosen in an optimal way at each iteration. 
Keywords:  Stochastic programming, discrete time control problem, decomposition methods, iterative scheme 
JEL:  C61 C63 D81 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:ven:wpaper:2011_24&r=cmp 
By:  Zerdani, Ouiza; Moulai, Mustapha 
Abstract:  The problem of optimizing a real-valued function over an efficient set of the Multiple Objective Linear Fractional Programming problem (MOLFP) is an important field of research and has not received as much attention as did the problem of optimizing a linear function over an efficient set of the Multiple Objective Linear Programming problem (MOLP). In this work an algorithm is developed that optimizes an arbitrary linear function over an integer efficient set of the problem (MOLFP) without explicitly having to enumerate all the efficient solutions. The proposed method is based on a simple selection technique that improves the linear objective value at each iteration. A numerical illustration is included to explain the proposed method. 
Keywords:  Integer programming; Optimization over the efficient set; Multiple objective linear fractional programming; Global optimization 
JEL:  I23 C61 
Date:  2011–02–10 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:35579&r=cmp 
By:  Ana Corina Miller (IIIS and Department of Economics, Trinity College Dublin); Alan Matthews (IIIS and Department of Economics, Trinity College Dublin); Trevor Donnellan (Rural Economy Research Centre, Teagasc); Cathal O'Donoghue (Rural Economy Research Centre, Teagasc) 
Abstract:  This paper describes the construction of a Social Accounting Matrix (SAM) for Ireland for the year 2005. The SAM describes the full circular flow of money and goods in the Irish economy. The SAM includes 55 activities, 55 commodities, two factors of production (capital and labour), one account each for households, enterprises, government and investment/saving, three tax-related accounts (direct and indirect taxes and custom duties), a trade and transport margin account and three external sectors (UK, Rest of the EU and Rest of the World). Its construction takes place in three steps: (1) building the macro-SAM; (2) building an initial unbalanced SAM making use of a variety of additional data sources; and (3) balancing the SAM using the cross-entropy method. By treating the SAM as a linear model of the Irish economy and by specifying certain accounts as exogenous, the SAM can be used to simulate the effect of shocks to the exogenous variables or accounts. Examples of such multiplier analysis are presented at the end of this paper. 
Keywords:  social accounting matrix, macro-SAM, cross-entropy method, input-output table, Ireland 
JEL:  C02 C67 D57 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:iis:dispap:iiisdp365&r=cmp 
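Step (3) above reconciles an unbalanced SAM with known row and column totals. Cross-entropy balancing generalises the classic RAS biproportional scaling; as a compact illustration of the underlying idea (this is RAS, not the cross-entropy method the paper uses), iterative row and column scaling looks like this:

```python
# Balance a non-negative matrix to given row and column totals by
# alternately rescaling rows and columns (RAS / iterative proportional
# fitting). The margins must be consistent (same grand total).

def ras(matrix, row_totals, col_totals, iters=100):
    m = [row[:] for row in matrix]          # work on a copy
    for _ in range(iters):
        for i, target in enumerate(row_totals):      # scale rows
            s = sum(m[i])
            if s:
                m[i] = [x * target / s for x in m[i]]
        for j, target in enumerate(col_totals):      # scale columns
            s = sum(row[j] for row in m)
            if s:
                for row in m:
                    row[j] *= target / s
    return m
```

Cross-entropy balancing solves the same reconciliation as a minimization of the Kullback-Leibler divergence from the prior matrix, which lets it handle additional constraints and measurement error that plain RAS cannot.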
By:  Dorothée Boccanfuso (Département d’économique and GRÉDI, Université de Sherbrooke); Luc Savard (Département d’économique and GRÉDI, Université de Sherbrooke) 
Abstract:  Labour market analysis is an important element in understanding inequality and poverty within a given population. The literature reveals that the informal sector is characterised by a great deal of flexibility and is exempt from formal market rigidities; on the other hand, this sector can constitute a trap from which it is difficult to exit for workers active in the sector at low wages. In this paper we aim to identify the main characteristics differentiating the labor supply of workers in the informal and formal markets in the Philippines while estimating these two labor supplies, capturing discrete choices or changes in employment status. We use these estimates to construct a labor supply model that can serve as an input for a broader macro-microsimulation model applied to the Philippines. The results of the estimation provide relatively intuitive findings, highlighting some differences between the two markets. We also contribute to shedding some light on this macro-microsimulation modelling framework, which is generally opaque in describing how to construct a microsimulation model with an endogenous discrete choice model linked to a CGE model. 
Keywords:  labor supply, informal sector, microsimulation, discrete choice model, Philippines 
JEL:  C35 O53 J24 C81 O17 
Date:  2011–11 
URL:  http://d.repec.org/n?u=RePEc:shr:wpaper:1119&r=cmp 
By:  Bobashev, Georgiy; Cropper, Maureen (Resources for the Future); Epstein, Joshua; Goedecke, Michael; Hutton, Stephen; Over, Mead 
Abstract:  This paper examines positive externalities and complementarities between countries in the use of antiviral pharmaceuticals to mitigate pandemic influenza. It demonstrates the presence of treatment externalities in simple SIR (susceptible-infectious-recovered) models and in simulations of a Global Epidemiological Model. In these simulations, the pandemic spreads from city to city through the international airline network and from cities to rural areas through ground transport. While most treatment benefits are private, spillovers suggest that it is in the self-interest of high-income countries to pay for some antiviral treatment in low-income countries. The most cost-effective policy is to donate doses to the country where the outbreak originates; however, donating doses to low-income countries in proportion to their populations may also be cost-effective. These results depend on the transmissibility of the flu strain, its start date, the efficacy of antivirals in reducing transmissibility, and the proportion of infectious people who can be identified and treated. 
Keywords:  pandemic influenza, disease control externalities 
JEL:  I18 C63 D62 
Date:  2011–09–22 
URL:  http://d.repec.org/n?u=RePEc:rff:dpaper:dp1141&r=cmp 
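The treatment externality the paper demonstrates can already be seen in a textbook SIR model: scaling down the transmission rate in proportion to antiviral coverage lowers the attack rate for treated and untreated alike. A minimal sketch (generic SIR with invented parameter names, not the paper's Global Epidemiological Model):

```python
# S-I-R with antiviral coverage `tau` scaling down effective transmission.
# Fractions of the population; Euler integration for simplicity.

def sir_final_infections(beta, gamma, tau=0.0, efficacy=0.5,
                         days=300, dt=0.1):
    """Return the cumulative fraction ever infected."""
    beta_eff = beta * (1 - efficacy * tau)  # treatment lowers transmission
    s, i, r = 0.999, 0.001, 0.0
    for _ in range(int(days / dt)):
        new_inf = beta_eff * s * i * dt
        recov = gamma * i * dt
        s -= new_inf
        i += new_inf - recov
        r += recov
    return 1.0 - s
```

Because `beta_eff` falls with coverage, people who never receive a dose still face a smaller epidemic; that external benefit is what makes it rational for high-income countries to fund treatment abroad.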
By:  Kapustenko, Oleg 
Abstract:  An analysis of falsified election results is presented. A model of the falsification process is proposed and simulations are performed. The model fits the data of the parliamentary elections in Russia on December 4, 2011 well. It is shown that the "noise" of false votes is well separated from the fair "signal", which can be extracted with high statistical accuracy (less than 1%), allowing quantitative reconstruction of the falsification patterns. 
Keywords:  election; ballot stuffing; statistical analysis 
JEL:  D72 
Date:  2011–12–23 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:35543&r=cmp 
By:  ChangShuai Li 
Abstract:  This article analyzes the relationship between co-persistence and hedging, showing that the co-persistence ratio is just the long-term hedging ratio. A new exhaustive-search algorithm for deriving the co-persistence ratio is derived in the article. We also develop a new hedging strategy combining co-persistence with dynamic hedging, which can enhance hedging effectiveness and reduce the persistence of the hedged portfolio. Finally, our strategy is illustrated in a study of hedging the JIASHI300 index with the HS300 stock index future. 
Date:  2011–12 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1112.4027&r=cmp 
By:  Gioia De Melo 
Abstract:  This paper provides evidence on peer effects in educational achievement, exploiting for the first time a unique data set on social networks within primary schools in Uruguay. The relevance of peer effects in education is still largely debated due to the identification challenges that the study of social interactions poses. I adopt a recently developed identification method that exploits detailed information on social networks, i.e. individual-specific peer groups. This method enables me to disentangle endogenous effects from contextual effects via instrumental variables that emerge naturally from the network structure. Correlated effects are controlled for, to some extent, by classroom fixed effects. I find significant endogenous effects in standardized tests for reading and math. A one standard deviation increase in peers' test scores increases the individual's test score by 40% of a standard deviation. This magnitude is comparable to the effect of having a mother who completed college. By means of a simulation I illustrate that, when schools are stratified by socioeconomic status, peer effects may operate as amplifiers of educational inequalities. 
JEL:  I21 I24 O1 
Date:  2011–11 
URL:  http://d.repec.org/n?u=RePEc:usi:wpaper:627&r=cmp 
By:  Ive Marx (Centre for Social Policy, University of Antwerp); Pieter Vandenbroucke (Centre for Social Policy Herman Deleeck, University of Antwerp); Gerlinde Verbist 
Abstract:  At the European level and in most EU member states, higher employment levels are seen as key to better poverty outcomes. But what can we expect the actual impact to be? Up until now, shift-share analysis has been used to estimate the impact of rising employment on relative income poverty. This method has serious limitations. We propose a more sophisticated simulation model that builds on regression-based estimates of employment probabilities and wages. We use this model to estimate the impact on relative income poverty of moving towards the Europe 2020 target of 75 percent of the working-age population in work. Two sensitivity checks are included: giving priority in job allocation to jobless households and imputing low instead of estimated wages. This article shows that employment growth does not necessarily result in lower relative poverty shares, a result that is largely consistent with observed outcomes over the past decade. 
Date:  2011–10 
URL:  http://d.repec.org/n?u=RePEc:aia:ginidp:dp15&r=cmp 
By:  Han Chen; Vasco Cúrdia; Andrea Ferrero 
Abstract:  The effects of asset purchase programs on macroeconomic variables are likely to be moderate. We reach this conclusion after simulating the impact of the Federal Reserve's second large-scale asset purchase program (LSAP II) in a DSGE model enriched with a preferred habitat framework and estimated on U.S. data. Our simulations suggest that such a program increases GDP growth by less than half a percentage point, although the effect on the level of GDP is very persistent. The program's marginal contribution to inflation is very small. One key reason for our findings is that we estimate a small degree of financial market segmentation. If we enrich the set of observables with a measure of long-term debt, the semi-elasticity of the risk premium to the amount of debt in private-sector hands is substantially smaller than that reported in the recent empirical literature. In this case, our baseline estimates of the effects of LSAP II on the macroeconomy decrease by at least a factor of two. Throughout the analysis, a commitment to an extended period at the zero lower bound for nominal interest rates increases the effects of asset purchase programs on GDP growth and inflation. 
Keywords:  Macroeconomics ; Gross domestic product ; Federal Reserve System ; Inflation (Finance) ; Debt ; Stochastic analysis 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:fip:fednsr:527&r=cmp 
By:  Michael W.L. Elsby; Bart Hobijn; Aysegul Sahin; Robert G. Valletta 
Abstract:  Since the end of the Great Recession in mid-2009, the unemployment rate has recovered slowly, falling by only one percentage point from its peak. We find that the lackluster labor market recovery can be traced in large part to weakness in aggregate demand; only a small part seems attributable to increases in labor market frictions. This continued labor market weakness has led to the highest level of long-term unemployment in the U.S. in the postwar period, and a blurring of the distinction between unemployment and nonparticipation. We show that flows from nonparticipation to unemployment are important for understanding the recent evolution of the duration distribution of unemployment. Simulations that account for these flows suggest that the U.S. labor market is unlikely to be subject to high levels of structural long-term unemployment after aggregate demand recovers. PowerPoint supplement available at http://www.frbsf.org/economics/economists/wp1129bk_supplement.pdf 
Keywords:  Labor market ; Recessions 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:fip:fedfwp:201129&r=cmp 