
on Computational Economics 
Issue of 2009‒09‒26
twelve papers chosen by 
By:  Adrian Peralta-Alva; Manuel S. Santos 
Abstract:  Our work has been concerned with the numerical simulation of dynamic economies with heterogeneous agents and economic distortions. Recent research has drawn attention to inherent difficulties in the computation of competitive equilibria for these economies: A continuous Markovian solution may fail to exist, and some commonly used numerical algorithms may not deliver accurate approximations. We consider a reliable algorithm set forth in Feng et al. (2009), and discuss problems related to the existence and computation of Markovian equilibria, as well as convergence and accuracy properties. We offer new insights into numerical simulation. 
Keywords:  Econometric models 
Date:  2009 
URL:  http://d.repec.org/n?u=RePEc:fip:fedlwp:2009036&r=cmp 
By:  Hélène Maisonnave; Bernard Decaluwé; Margaret Chitiga 
Abstract:  This paper presents a computable general equilibrium model (CGEM) able to measure the impacts of the affirmative action policy set up in South Africa. In order to decrease inequalities inherited from the former regime, the government encourages firms to employ Historically Disadvantaged Persons (HDP). Through this study, we evaluate the impact of this policy on employment, poverty and inequality. To evaluate the impacts on poverty and inequality, we use a CGE Top Down approach. The paper analyses two scenarios: the first deals with the impact of affirmative action on skilled jobs; the second adds to the first by including semi-skilled workers in the simulation. Both scenarios show a deep decrease in unemployment as well as a fall in poverty for every population group. 
Keywords:  Computable General Equilibrium Model, Top Down Analysis, South Africa, Poverty, Inequality, Labor Market 
JEL:  D58 E27 I32 O11 O55 
Date:  2009 
URL:  http://d.repec.org/n?u=RePEc:lvl:lacicr:0936&r=cmp 
By:  John Gilbert (Department of Economics and Finance, Utah State University) 
Abstract:  We present a numerical version of the factor proportions (Heckscher-Ohlin-Samuelson) model of production in a small economy, built in Excel, that features tax interventions at the input, output, consumption and trade levels. The model incorporates the most common graphical devices used to explain its properties, integrating partial equilibrium geometry with the corresponding general equilibrium system. The solution is embedded in the sheet, making use of the Solver add-in unnecessary. The model can be used to demonstrate a wide variety of results from the neoclassical theory of commercial policy. 
Keywords:  Heckscher-Ohlin-Samuelson model, Factor proportions, Tariffs, Excel 
JEL:  A2 D5 F1 
Date:  2009–07–26 
URL:  http://d.repec.org/n?u=RePEc:usu:wpaper:200905&r=cmp 
By:  Breisinger, Clemens; Diao, Xinshen; Schweickert, Rainer; Wiebelt, Manfred 
Abstract:  "Contemporary policy debates on the macroeconomics of resource booms often concentrate on the short-run Dutch disease effects of public expenditure, ignoring the possible long-term effects of alternative revenue-allocation options and the supply-side impact of royalty-financed public investments. In a simple model applied here, the government decides the level and timing of resource-rent spending. This model also considers productivity spillovers over time, which may exhibit a sector bias toward domestic production or exports. A dynamic computable general equilibrium (DCGE) model is used to simulate the effect of temporary oil revenue inflows to Ghana. The simulations show that beyond the short-run Dutch disease effects, the relationship between windfall profits, growth, and households' welfare is less straightforward than what the simple model of the “resource curse” suggests. The DCGE model results suggest that designing a rule that allocates oil revenues to both productivity-enhancing investments and an oil fund is crucial to achieving shared growth and macroeconomic stability." from authors' abstract 
Keywords:  Oil fund, Public expenditures, Growth, Computable general equilibrium (CGE) analysis, Development strategies 
Date:  2009 
URL:  http://d.repec.org/n?u=RePEc:fpr:ifprid:893&r=cmp 
By:  Manoj Atolia; Yoshinori Kurokawa 
Abstract:  It can be shown theoretically that variety trade is a possible source of the increased skill premium in wages. No past studies, however, have empirically quantified how much of the increase in the skill premium can be accounted for by the increase in variety trade. This paper formulates a static general equilibrium model and calibrates it to the Mexican input-output matrix for 1987. In the calibrated model, our numerical experiments show that the increase in U.S.-Mexican variety trade can explain approximately 12 percent of the actual increase in the skill premium in Mexico from 1987 to 2000. 
Date:  2009 
URL:  http://d.repec.org/n?u=RePEc:tsu:tewpjp:2009006&r=cmp 
By:  Veelenturf, L.P.; Potthoff, D.; Huisman, D.; Kroon, L.G. (Erasmus Econometric Institute) 
Abstract:  Railway operations are disrupted frequently; the Dutch railway network, for example, experiences about three large disruptions per day on average. In such a disrupted situation, railway operators need to quickly adjust their resource schedules. Nowadays, the timetable, the rolling stock schedule and the crew schedule are recovered sequentially. In this paper, we model and solve the crew rescheduling problem with retiming. This problem extends the crew rescheduling problem by allowing the departure of some trains to be delayed, thereby partly integrating timetable adjustment and crew rescheduling. The algorithm is based on column generation techniques combined with Lagrangian heuristics. In order to prevent a large increase in computation time, retiming is allowed only for a limited number of trains where it seems very promising. Computational experiments with real-life disruption data show that, compared to the classical approach, it is possible to find better solutions by using crew rescheduling with retiming. 
Date:  2009–09–15 
URL:  http://d.repec.org/n?u=RePEc:dgr:eureir:1765016746&r=cmp 
By:  Juan-Pablo Ortega; Rainer Pullirsch; Josef Teichmann; Julian Wergieluk 
Abstract:  We provide a new dynamic approach to scenario generation for the purposes of risk management in the banking industry. We connect ideas from conventional techniques, like historical and Monte Carlo simulation, and come up with a hybrid method that shares the advantages of standard procedures but eliminates several of their drawbacks. Instead of considering the static problem of constructing one- or ten-day-ahead distributions for vectors of risk factors, we embed the problem into a dynamic framework in which any time horizon can be consistently simulated. Additionally, we use standard models from mathematical finance for each risk factor, thereby bridging the worlds of trading and risk management. Our approach is based on stochastic differential equations (SDEs), like the HJM equation or the Black-Scholes equation, governing the time evolution of risk factors; on an empirical method for calibrating the chosen SDEs to the market; and on an Euler scheme (or higher-order schemes) for the numerical evaluation of the respective SDEs. The empirical calibration procedure presented in this paper can be seen as the SDE counterpart of the so-called Filtered Historical Simulation method; in our case, the behavior of volatility stems from the assumptions on the underlying SDEs. Furthermore, we are able to easily incorporate "middle-size" and "large-size" events within our framework, always making a precise distinction between the information obtained from the market and that coming from the necessary a priori intuition of the risk manager. Results of one concrete implementation are provided. 
Date:  2009–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0904.0624&r=cmp 
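The Euler scheme the abstract mentions can be sketched for the Black-Scholes case. This is a minimal illustration, not the authors' implementation; the function name, parameters, and the plain Euler-Maruyama discretization are assumptions:

```python
import numpy as np

def euler_black_scholes(s0, mu, sigma, horizon, n_steps, n_paths, seed=0):
    """Simulate Black-Scholes paths dS = mu*S dt + sigma*S dW
    with the Euler-Maruyama scheme."""
    rng = np.random.default_rng(seed)
    dt = horizon / n_steps
    paths = np.empty((n_paths, n_steps + 1))
    paths[:, 0] = s0
    for t in range(n_steps):
        # One Gaussian increment per path for this time step
        dw = rng.normal(0.0, np.sqrt(dt), size=n_paths)
        paths[:, t + 1] = paths[:, t] * (1.0 + mu * dt + sigma * dw)
    return paths
```

Because whole paths are generated rather than a single fixed-horizon distribution, any intermediate time horizon can be read off the same simulation, which is the point of the dynamic framework described above.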
By:  V. Aquaro; M. Bardoscia; R. Bellotti; A. Consiglio; F. De Carlo; G. Ferri 
Abstract:  A system for Operational Risk management based on the computational paradigm of Bayesian Networks is presented. The algorithm allows the construction of a Bayesian Network targeted for each bank using only internal loss data, and takes into account, in a simple and realistic way, the correlations among the different processes of the bank. The internal losses are averaged over a variable time horizon, so that correlations at different times are removed while correlations at the same time are kept; the averaged losses are thus suitable for learning the network topology and parameters. The algorithm has been validated on synthetic time series. It should be stressed that the practical implementation of the proposed algorithm has a small impact on the organizational structure of a bank and requires an investment in human resources limited to the computational area. 
Date:  2009–06 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0906.3968&r=cmp 
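The averaging step described in the abstract can be sketched as follows. This is only an illustration of the idea, not the authors' algorithm; the non-overlapping-window choice and the function name are assumptions:

```python
import numpy as np

def average_losses(losses, window):
    """Average each process's loss series over non-overlapping windows.

    losses : array of shape (n_periods, n_processes), one column per
             bank process; any trailing periods that do not fill a
             complete window are dropped.
    Returns an array of shape (n_periods // window, n_processes).
    """
    n_periods, n_proc = losses.shape
    n_blocks = n_periods // window
    trimmed = losses[:n_blocks * window]
    # Averaging within each block suppresses correlation across time
    # while preserving same-time correlation across processes.
    return trimmed.reshape(n_blocks, window, n_proc).mean(axis=1)
```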
By:  Francesco Ciardiello; Crescenzio Gallo 
Abstract:  We propose an algorithm for computing the "main stable sets" recently introduced by Ciardiello and Di Liddo (2009) for effectiveness-form coalitional games modeled through a directed pseudograph. The algorithm is based upon a graph-traversing method that explores extended paths minimal in coalitions, and we study some of its computational aspects with a view to making these stability concepts useful tools for decision theory. 
Keywords:  Algorithmic game theory; coalitional games; dominance relations; stable sets; graph theory. 
Date:  2009–05 
URL:  http://d.repec.org/n?u=RePEc:ufg:qdsems:082009&r=cmp 
By:  Hiroshi Iyetomi; Hideaki Aoyama; Yoshi Fujiwara; Yuichi Ikeda; Wataru Souma 
Abstract:  An agent-based model of firms' dynamics is developed. The model consists of firm agents with identical characteristic parameters and a bank agent. The dynamics of these agents are described by their balance sheets. Each firm tries to maximize its expected profit under possible risks in the market. Infinite growth of a firm directed by the "profit maximization" principle is suppressed by the concept of a "going concern". The possibility of bankruptcy of firms is also introduced by incorporating a retardation effect of information on firms' decisions. The firms, mutually interacting through the monopolistic bank, become heterogeneous in the course of temporal evolution. Statistical properties of firms' dynamics obtained from simulations of the model are discussed in light of observations of the real economy. 
Date:  2009–01 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0901.1794&r=cmp 
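The balance-sheet dynamics described above can be sketched in miniature. This is a heavily simplified toy, not the authors' model: the return and interest parameters, the single-bank interest channel, and the bankruptcy rule (negative equity) are all assumptions for illustration:

```python
import random

class Firm:
    """Toy firm: assets = equity + bank debt; dies when equity < 0."""

    def __init__(self, equity, debt):
        self.equity = equity
        self.debt = debt
        self.alive = True

    def step(self, rate, rng):
        if not self.alive:
            return
        assets = self.equity + self.debt
        # Stochastic return on assets minus interest paid to the bank
        profit = assets * rng.gauss(0.05, 0.15) - self.debt * rate
        self.equity += profit
        if self.equity < 0:  # bankruptcy
            self.alive = False

def simulate(n_firms=100, periods=50, rate=0.03, seed=1):
    rng = random.Random(seed)
    firms = [Firm(equity=1.0, debt=1.0) for _ in range(n_firms)]
    for _ in range(periods):
        for f in firms:
            f.step(rate, rng)
    return firms
```

Even with identical initial parameters, the idiosyncratic shocks make the surviving firms' equity heterogeneous over time, which is the qualitative effect the abstract describes.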
By:  Didier Rullière; Diana Dorobantu 
Abstract:  The present paper provides a multi-period contagion model in the credit risk field. Our model is an extension of Davis and Lo's infectious default model. We consider an economy of $n$ firms which may default directly or may be infected by another defaulting firm (a domino effect also being possible). The spontaneous defaults without external influence and the infections are described by not necessarily independent Bernoulli-type random variables. Moreover, several contaminations may be necessary to infect another firm. In this paper we compute the probability distribution function of the total number of defaults in a dependency context. We also give a simple recursive algorithm to compute this distribution in an exchangeability context. Numerical applications illustrate the impact of exchangeability, among direct defaults and among contaminations, on different indicators calculated from the law of the total number of defaults. 
Date:  2009–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0904.1653&r=cmp 
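The basic (single-period, independent) Davis-Lo mechanism that the paper extends can be sketched by Monte Carlo. This is only the classic starting point, not the authors' multi-period recursive algorithm; the function names and parameter letters are assumptions:

```python
import random

def davis_lo_defaults(n, p, q, rng):
    """One draw of the total number of defaults in the basic Davis-Lo
    model: firm i defaults directly with probability p; a surviving
    firm is infected by each directly defaulting firm independently
    with probability q."""
    direct = [rng.random() < p for _ in range(n)]
    total = 0
    for i in range(n):
        if direct[i]:
            total += 1
        else:
            infected = any(direct[j] and rng.random() < q
                           for j in range(n) if j != i)
            total += infected
    return total

def default_distribution(n, p, q, n_sims=20000, seed=0):
    """Empirical law of the total number of defaults."""
    rng = random.Random(seed)
    counts = [0] * (n + 1)
    for _ in range(n_sims):
        counts[davis_lo_defaults(n, p, q, rng)] += 1
    return [c / n_sims for c in counts]
```

The paper's contribution is to replace the independence assumptions with exchangeable Bernoulli variables, allow multiple periods, and require several contaminations before infection, for which the authors give an exact recursive computation rather than simulation.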
By:  Alejandro Reveiz; Carlos Léon 
Abstract:  Operational Risk (OR) results from endogenous and exogenous risk factors, as diverse and complex to assess as human resources and technology, which may not be properly measured using traditional quantitative approaches. Engineering has faced the same challenges when designing practical solutions to complex multi-factor and nonlinear systems where human reasoning, expert knowledge or imprecise information are valuable inputs. One of the solutions provided by engineering is a Fuzzy Logic Inference System (FLIS). Although the goal of the FLIS model for OR is its assessment, assessment is not an end in itself. The choice of a FLIS results in a convenient and sound use of qualitative and quantitative inputs, capable of effectively articulating risk management's identification, assessment, monitoring and mitigation stages. Unlike traditional approaches, the proposed model allows mitigation efforts to be evaluated ex-ante, thus avoiding concealed OR sources from system complexity buildup and optimizing risk management resources. Furthermore, because the model contrasts effective with expected OR data, it is able to constantly validate its outcome, recognize environment shifts and issue warning signals. 
Date:  2009–09–13 
URL:  http://d.repec.org/n?u=RePEc:col:000094:005841&r=cmp 
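The general shape of a fuzzy inference system can be illustrated with a toy two-input example. This is generic fuzzy-logic machinery, not the authors' FLIS: the input variables, membership ranges, rule base and outputs are all invented for illustration:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def operational_risk_score(frequency, severity):
    """Toy fuzzy inference: inputs in [0, 10], crisp risk score in [0, 10].
    Illustrative rules: low freq & low sev -> low risk (2);
    exactly one high -> medium risk (5); both high -> high risk (9)."""
    low_f, high_f = tri(frequency, -10, 0, 10), tri(frequency, 0, 10, 20)
    low_s, high_s = tri(severity, -10, 0, 10), tri(severity, 0, 10, 20)
    rules = [
        (min(low_f, low_s), 2.0),
        (max(min(high_f, low_s), min(low_f, high_s)), 5.0),
        (min(high_f, high_s), 9.0),
    ]
    # Weighted-average defuzzification of the fired rules
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0 else 0.0
```

The point of the construction is that qualitative expert judgments ("low", "high") enter through the membership functions and rule base, while the output is still a crisp, comparable number, which is what lets a FLIS combine qualitative and quantitative inputs in one assessment.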