New Economics Papers
on Computational Economics
Issue of 2005‒04‒16
25 papers chosen by



  1. A Genetic Algorithm for the Structural Estimation of Games with Multiple Equilibria By Victor Aguirregabiria; Pedro Mira
  2. A Toolbox for the Numerical Study of Linear Dynamic Rational Expectations Models By P. Marcelo Oviedo
  3. Modeling the risk process in the XploRe computing environment By Krzysztof Burnecki; Rafal Weron
  4. SimCode: Agent-based Simulation Modelling of Open-Source Software Development By Jean-Michel Dalle; Paul A. David
  5. The Allocation of Software Development Resources In ‘Open Source’ Production Mode By Jean-Michel Dalle; Paul David
  6. Multi-agent modeling and simulation of a sequential monetary production economy By Marco Raberto; Andrea Teglio; Silvano Cincotti
  7. Proxy simulation schemes using likelihood ratio weighted Monte Carlo for generic robust Monte-Carlo sensitivities and high accuracy drift approximation (with applications to the LIBOR Market Model) By Christian P. Fries; Joerg Kampen
  8. Fast Ejection Chain Algorithms for Vehicle Routing with Time Windows By Sontrop,H.M.J.; Horn,S.P.,van der; Teeuwen,G.; Uetz,M.
  9. Can a carbon permit system reduce Spanish unemployment? By Fæhn, Taran; Gómez-Plana, Antonio G.; Kverndokk, Snorre
  10. Consumption and population age structure By Erlandsen, Solveig; Nymoen, Ragnar
  11. Labor supply when tax evasion is an option By Jørgensen, Øystein; Ognedal, Tone; Strøm, Steinar
  12. Shocks and Business Cycles By Frankel, David M.; Burdzy, Krzysztof
  13. Monte Carlo Tests with Nuisance Parameters: A General Approach to Finite-Sample Inference and Nonstandard Asymptotics By DUFOUR, Jean-Marie
  14. The Impact of Fiscal Policy on Income Distribution and Poverty: A Computable General Equilibrium Approach for Indonesia By Yose Rizal Damuri; Ari A. Perdana
  15. Economic Crisis and Trade Liberalization: A CGE Analysis On The Forestry Sector By Tubagus Feridhanusetyawan; Yose Rizal Damuri
  16. The Impact of Heterogeneous Trading Rules on the Limit Order Book and Order Flows By Carl Chiarella; Giulia Iori
  17. Modeling electricity prices with regime switching models By Michael Bierbrauer; Stefan Trueck; Rafal Weron
  18. Dynamic Conditional Correlation with Elliptical Distributions By Matteo M. Pelagatti; Stefania Rondena
  19. Mean Reversion Expectations and the 1987 Stock Market Crash: An Empirical Investigation By Eric Hillebrand
  20. Fast drift approximated pricing in the BGM model By Raoul Pietersz; Antoon Pelsser; Marcel van Regenmortel
  21. Rank Reduction of Correlation Matrices by Majorization By Raoul Pietersz; Patrick J. F. Groenen
  22. The Effects of Foreign Trade Liberalization and Financial Flows between Slovenia and the EU after the Accession By Boris Majcen; Miroslav Verbic; Sasa Knezevic
  23. Finance Matters By Pedro S. Amaral; Erwan Quintin
  24. Monetary Policy Shifts, Indeterminacy and Inflation Dynamics By Paolo Surico
  25. Credit Risk, Systemic Uncertainties and Economic Capital Requirements for an Artificial Bank Loan Portfolio By Alexis Derviz; Narcisa Kadlčáková; Lucie Kobzová

  1. By: Victor Aguirregabiria (Boston University); Pedro Mira (CEMFI)
    Abstract: This paper proposes an algorithm to obtain maximum likelihood estimates of structural parameters in discrete games with multiple equilibria. The method combines a genetic algorithm (GA) with a pseudo maximum likelihood (PML) procedure. The GA searches efficiently over the huge space of possible combinations of equilibria in the data. The PML procedure avoids the repeated computation of equilibria for each trial value of the parameters of interest. To test the ability of this method to get maximum likelihood estimates, we present a Monte Carlo experiment in the context of a game of price competition and collusion.
    Keywords: Empirical games; Maximum likelihood estimation; Multiple equilibria; Genetic algorithms.
    JEL: C13 C35
    Date: 2005–02–28
    URL: http://d.repec.org/n?u=RePEc:wpa:wuwpem:0502017&r=cmp
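    Sketch (editorial illustration, not from the paper): the abstract above combines a genetic algorithm with a pseudo maximum likelihood step. The minimal Python sketch below shows only the GA layer, searching over assignments of one equilibrium label per market; the placeholder fitness function, the number of markets and equilibria, and all GA settings are assumptions made up for illustration, not the authors' implementation.

      # Illustrative GA over equilibrium selections (NOT the authors' code).
      # Each chromosome picks one equilibrium label per market; the toy
      # fitness below stands in for the pseudo-likelihood criterion.
      import random

      N_MARKETS, N_EQUILIBRIA, POP, GENERATIONS = 50, 3, 40, 200
      random.seed(0)
      # Hypothetical "true" selection, used only to give the toy fitness a target.
      true_selection = [random.randrange(N_EQUILIBRIA) for _ in range(N_MARKETS)]

      def pseudo_likelihood(chromosome):
          # Placeholder: counts matches with the hypothetical target selection.
          return sum(c == t for c, t in zip(chromosome, true_selection))

      def crossover(a, b):
          cut = random.randrange(1, N_MARKETS)
          return a[:cut] + b[cut:]

      def mutate(chromosome, rate=0.02):
          return [random.randrange(N_EQUILIBRIA) if random.random() < rate else g
                  for g in chromosome]

      population = [[random.randrange(N_EQUILIBRIA) for _ in range(N_MARKETS)]
                    for _ in range(POP)]
      for _ in range(GENERATIONS):
          population.sort(key=pseudo_likelihood, reverse=True)
          parents = population[:POP // 2]        # truncation selection
          children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                      for _ in range(POP - len(parents))]
          population = parents + children

      best = max(population, key=pseudo_likelihood)
      print("best pseudo-likelihood:", pseudo_likelihood(best), "of", N_MARKETS)

    In the paper the inner evaluation is a pseudo maximum likelihood step rather than the toy count used here; the sketch only illustrates how a GA can search the combinatorial space of equilibrium selections.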
  2. By: P. Marcelo Oviedo (Iowa State University)
    Abstract: By simplifying the computational tasks and by providing step-by-step explanations of the procedures required to study a linear dynamic rational expectations (LDRE) model, this paper and the accompanying "LDRE Toolbox" of Matlab functions guide a researcher with almost no experience in computational work to solve and study his own model. After coding the model following specific guidelines, a single function call is all that is needed to log-linearize the model; simulate it under exogenous sequences of shocks; compute sample and population moment conditions; and obtain impulse-response functions. Three classical models in the Real-Business-Cycles literature are solved and studied throughout to give detailed examples of the steps involved in solving and studying LDRE models using the LDRE Toolbox. Namely, the economies in Brock and Mirman (Optimal Growth and Uncertainty: the Discounted Case, Journal of Economic Theory, 4(3): 479-513; 1972); King, Plosser, and Rebelo (Production, Growth and Business Cycles I: The Basic Neoclassical Model, Journal of Monetary Economics 21: 195-232; 1988); and Mendoza (Real Business Cycles in a Small Open Economy, American Economic Review 81(4): 797-818; 1991).
    Keywords: RBC models; Solution method; Toolbox of Matlab functions; Log-linear approximation techniques
    JEL: C63 C68 E32 F41
    Date: 2005–01–26
    URL: http://d.repec.org/n?u=RePEc:wpa:wuwpge:0501004&r=cmp
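    Sketch (editorial illustration, not the LDRE Toolbox): the abstract describes simulating a log-linearized model and computing impulse responses. The generic Python sketch below does this for a one-equation law of motion with an AR(1) shock; the coefficients a, b, rho and sigma are illustrative assumptions, not output of the toolbox.

      # Generic sketch of the kind of output an LDRE solver produces:
      #   k_{t+1} = a*k_t + b*z_t,   z_{t+1} = rho*z_t + eps_{t+1},
      # simulated under Gaussian shocks, plus an impulse response.
      import numpy as np

      a, b, rho, sigma, T = 0.95, 0.10, 0.90, 0.01, 200   # illustrative parameters
      rng = np.random.default_rng(1)

      # Stochastic simulation under an exogenous shock sequence.
      z = np.zeros(T); k = np.zeros(T)
      eps = rng.normal(0.0, sigma, T)
      for t in range(T - 1):
          z[t + 1] = rho * z[t] + eps[t + 1]
          k[t + 1] = a * k[t] + b * z[t]
      print("sample std of k (percent deviation):", k.std())

      # Impulse response to a one-standard-deviation shock at t = 0.
      z_irf = np.zeros(T); k_irf = np.zeros(T); z_irf[0] = sigma
      for t in range(T - 1):
          z_irf[t + 1] = rho * z_irf[t]
          k_irf[t + 1] = a * k_irf[t] + b * z_irf[t]
      print("IRF of k, first 5 periods:", np.round(k_irf[:5], 6))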
  3. By: Krzysztof Burnecki (Hugo Steinhaus Center); Rafal Weron (Hugo Steinhaus Center)
    Abstract: A user-friendly approach to modeling the risk process is presented. It utilizes the insurance library of the XploRe computing environment, which is accompanied by on-line, hyperlinked manuals and e-books that are freely downloadable from the web. An empirical analysis of Danish fire losses for the years 1980-90 is conducted and the best fit of the risk process to the data is illustrated.
    Keywords: Risk process, Monte Carlo simulation, XploRe computing environment
    JEL: G22
    Date: 2005–02–07
    URL: http://d.repec.org/n?u=RePEc:wpa:wuwpri:0502001&r=cmp
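    Sketch (editorial illustration, not the XploRe insurance library): a minimal Monte Carlo simulation of a classical risk process R(t) = u + c*t - sum of claims, with Poisson arrivals and lognormal claim sizes; the initial capital, premium rate and claim-size parameters below are illustrative assumptions.

      # Minimal Monte Carlo sketch of a classical risk process.
      import numpy as np

      u, c, lam, horizon, n_paths = 10.0, 6.0, 5.0, 1.0, 10_000   # assumed values
      mu, sigma = 0.0, 1.0                                        # lognormal claims
      rng = np.random.default_rng(42)

      ruined = 0
      for _ in range(n_paths):
          n_claims = rng.poisson(lam * horizon)
          arrival_times = np.sort(rng.uniform(0.0, horizon, n_claims))
          claims = rng.lognormal(mu, sigma, n_claims)
          # Surplus just after each claim; ruin if it ever drops below zero.
          surplus = u + c * arrival_times - np.cumsum(claims)
          if n_claims and surplus.min() < 0:
              ruined += 1
      print("estimated ruin probability:", ruined / n_paths)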
  4. By: Jean-Michel Dalle (University Pierre-et-Marie-Curie & IMRI-Dauphine); Paul A. David (Stanford University & Oxford Internet Institute)
    Abstract: We present an original modeling tool, which can be used to study the mechanisms by which free/libre and open source software developers’ code-writing efforts are allocated within open source projects. It is first described analytically in a discrete choice framework, and then simulated using agent-based experiments. Contributions are added sequentially either to existing modules or to create new modules out of existing ones; as a consequence, the global emerging architecture forms a hierarchical tree. Choices among modules reflect expectations of peer-regard, i.e. developers are more attracted a) to generic modules, b) to launching new ones, and c) to contributing their work to currently active development sites in the project. In this context we are able, particularly by allowing for the attractiveness of “hot spots”, to replicate the high degree of concentration (measured by Gini coefficients) in the distributions of module sizes. The latter has been found by empirical studies to be a characteristic typical of the code of large projects, such as the Linux kernel. Introducing further a simple social utility function for evaluating the morphology of “software trees,” it turns out that the hypothesized developers’ incentive structure that generates high Gini coefficients is not particularly conducive to producing self-organized software code that yields high utility to end-users who want a large and diverse range of applications. Allowing for a simple governance mechanism by the introduction of maintenance rules reveals that “early release” rules can have a positive effect on the social utility rating of the resulting software trees.
    JEL: L
    Date: 2005–02–09
    URL: http://d.repec.org/n?u=RePEc:wpa:wuwpio:0502008&r=cmp
  5. By: Jean-Michel Dalle (Université Paris VI & IMRI-Université Paris Dauphine); Paul David (Stanford University & Oxford Internet Institute)
    Abstract: This paper aims to develop a stochastic simulation structure capable of describing the decentralized, micro-level decisions that allocate programming resources both within and among open source/free software (OS/FS) projects, and that thereby generate an array of OS/FS system products each of which possesses particular qualitative attributes. The core or behavioral kernel of the simulation tool presented here represents the effects of the reputational reward structure of OS/FS communities (as characterized by Raymond 1998) to be the key mechanism governing the probabilistic allocation of agents’ individual contributions among the constituent components of an evolving software system. In this regard, our approach follows the institutional analysis approach associated with studies of academic researchers in “open science” communities. For the purposes of this first step, the focus of the analysis is confined to showing the ways in which the specific norms of the reward system and organizational rules can shape emergent properties of successive releases of code for a given project, such as its range of functions and reliability. The global performance of the OS/FS mode, in matching the functional and other characteristics of the variety of software systems that are produced with the needs of users in various sectors of the economy and polity, obviously, is a matter of considerable importance that will bear upon the long-term viability and growth of this mode of organizing production and distribution. Our larger objective, therefore, is to arrive at a parsimonious characterization of the workings of OS/FS communities engaged across a number of projects, and their collective productive performance in dimensions that are amenable to “social welfare” evaluation. Seeking that goal will pose further new and interesting problems for study, a number of which are identified in the essay’s conclusion. Yet, it is argued that these too will be found to be tractable within the framework provided by refining and elaborating on the core (“proof of concept”) model that is presented in this paper.
    JEL: L
    Date: 2005–02–10
    URL: http://d.repec.org/n?u=RePEc:wpa:wuwpio:0502011&r=cmp
  6. By: Marco Raberto (DIBE-CINEF, University of Genoa); Andrea Teglio (DIBE-CINEF, University of Genoa); Silvano Cincotti (DIBE-CINEF, University of Genoa)
    Abstract: This paper presents a heterogeneous agent model of a sequential monetary production economy. A deterministic dynamic flow model is employed. The model is characterized by three classes of agents: a single homogeneous representative consumer, heterogeneous firms and a banking sector. There are three asset classes (or debts): a single homogeneous physical good, money and debt securities. The homogeneous commodity is produced by firms and, if saved, increases their capital stock. Firms issue debt to finance growth. Firms are homogeneous with regard to production technology but heterogeneous in their expected inflation. Consumers provide labor and decide how to divide their income between consumption and saving. They own all the equities of firms and banks. The banking sector collects consumer savings and provides credit to firms. The main result of the model is that real economic variables are strongly affected by the level of credit supply relative to the level of savings.
    Keywords: Heterogeneous agents, financial markets and the macroeconomy, computer simulation
    JEL: D92 E17 E44
    Date: 2005–03–12
    URL: http://d.repec.org/n?u=RePEc:wpa:wuwpco:0503002&r=cmp
  7. By: Christian P. Fries (Universität Heidelberg); Joerg Kampen (Universität Heidelberg)
    Abstract: We consider a generic framework for generating likelihood ratio weighted Monte Carlo simulation paths, where we use one simulation scheme K° (proxy scheme) to generate realizations and then reinterpret them as realizations of another scheme K* (target scheme) by adjusting the measure (via a likelihood ratio) to match the distribution of K*, such that E( f(K*) | F_t ) = E( f(K°) w | F_t ). This is done numerically in every time step, on every path. This makes the approach independent of the product (the function f) and even of the model; it only depends on the numerical scheme. The approach is essentially a numerical version of the likelihood ratio method [Broadie & Glasserman, 1996] and Malliavin's Calculus [Fournie et al., 1999; Malliavin, 1997] reconsidered on the level of the discrete numerical simulation scheme. Since the numerical scheme represents a time-discrete stochastic process sampled on a discrete probability space, the essence of the method may be motivated without a deeper mathematical understanding of the time-continuous theory (e.g. Malliavin's Calculus). The framework is completely generic and may be used for high-accuracy drift approximations and the robust calculation of partial derivatives of expectations w.r.t. model parameters (i.e. sensitivities, a.k.a. Greeks) by applying finite differences, i.e. by reevaluating the expectation under a model with shifted parameters. We present numerical results using a Monte Carlo simulation of the LIBOR Market Model for benchmarking.
    Keywords: Monte-Carlo, Likelihood Ratio, Malliavin Calculus, Sensitivities, Greeks
    JEL: C15 G13
    Date: 2005–04–12
    URL: http://d.repec.org/n?u=RePEc:wpa:wuwpfi:0504010&r=cmp
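    Sketch (editorial illustration, not the authors' framework): the relation E( f(K*) | F_t ) = E( f(K°) w | F_t ) can be mimicked for one-dimensional Euler schemes by weighting each step with the ratio of the target-scheme to the proxy-scheme Gaussian transition density. The drifts, volatility and payoff below are illustrative assumptions.

      # Hedged sketch of the likelihood-ratio reweighting idea for Euler schemes.
      import numpy as np

      def gauss_pdf(x, mean, std):
          return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2 * np.pi))

      T, n_steps, n_paths, sigma, x0 = 1.0, 50, 50_000, 0.2, 1.0
      mu0, mu1 = 0.00, 0.03          # proxy drift and (assumed) target drift
      dt = T / n_steps
      rng = np.random.default_rng(7)

      x = np.full(n_paths, x0)
      w = np.ones(n_paths)           # running likelihood-ratio weight per path
      for _ in range(n_steps):
          dw = rng.normal(0.0, np.sqrt(dt), n_paths)
          x_new = x + mu0 * x * dt + sigma * x * dw          # proxy Euler step
          std = sigma * x * np.sqrt(dt)
          # Ratio of target-scheme to proxy-scheme transition densities.
          w *= gauss_pdf(x_new, x + mu1 * x * dt, std) / gauss_pdf(x_new, x + mu0 * x * dt, std)
          x = x_new

      payoff = np.maximum(x - 1.0, 0.0)                      # example payoff f
      print("weighted proxy estimate of E[f(K*)]:", (payoff * w).mean())

      # Direct simulation under the target drift, for comparison.
      x_t = np.full(n_paths, x0)
      for _ in range(n_steps):
          dw = rng.normal(0.0, np.sqrt(dt), n_paths)
          x_t = x_t + mu1 * x_t * dt + sigma * x_t * dw
      print("direct estimate under target scheme:", np.maximum(x_t - 1.0, 0.0).mean())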
  8. By: Sontrop,H.M.J.; Horn,S.P.,van der; Teeuwen,G.; Uetz,M. (METEOR)
    Abstract: This paper introduces new ejection chain strategies to effectively target vehicle routing problems with time window constraints (VRPTW). Ejection chain procedures are based on the idea of compound moves that allow a variable number of solution components to be modified within any single iteration of a local search algorithm. The yardstick behind such procedures is the underlying reference structure, which is used to coordinate the moves available to the local search algorithm. The main contribution of the paper is a new reference structure that is particularly suited to handling the asymmetric aspects of a VRPTW. The new reference structure is a generalization of the doubly rooted reference structure introduced by Glover, resulting in a new, powerful neighborhood for the VRPTW. We use tabu search for the generation of the ejection chains. On a higher algorithmic level, we study the effect of different metaheuristics used to steer the tabu ejection chain process. Computational results confirm that our approach leads to very fast algorithms that can compete with the current state-of-the-art algorithms for the VRPTW.
    Keywords: operations research and management science;
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:dgr:umamet:2005012&r=cmp
  9. By: Fæhn, Taran (Research Department, Statistics Norway); Gómez-Plana, Antonio G. (Dept. of Economics, University of Oslo); Kverndokk, Snorre (Ragnar Frisch Centre for Economic Research)
    Abstract: This paper analyses whether recycling revenues from carbon emission permit auctions can reduce unemployment in the Spanish economy. Spain's deviation from the EU's intermediate emission goals is more serious than that of most other EU countries, and its unemployment is also well above the EU average. We use a CGE model that includes a matching model with two types of labour, and which allows for different pricing rules and returns-to-scale assumptions. We find that abatement reduces unemployment due to the beneficial impacts of recycling the revenue from permit sales. Unemployment is more effectively abated when revenues are used to reduce labour taxes rather than indirect taxes. Contrary to other studies of Europe, we find that the best option is to reduce payroll taxes on skilled labour. This reform is the most successful both in increasing demand and in dampening the supply response to rising wages. All the recycling schemes also generate dividends in terms of welfare, but none offsets the abatement costs entirely.
    Keywords: Spanish unemployment; Tax reform; Emission Permit Auctions; Employment dividend; Matching functions; Increasing returns to scale; Computable general equilibrium models
    JEL: D58 J68 Q38
    Date: 2005–12–15
    URL: http://d.repec.org/n?u=RePEc:hhs:osloec:2004_026&r=cmp
  10. By: Erlandsen, Solveig (Norges Bank); Nymoen, Ragnar (Dept. of Economics, University of Oslo)
    Abstract: In this paper the effects on aggregate consumption of changes in the age distribution of the population are analysed empirically. Economic theories predict that age influences individuals’ saving and consumption behaviour. Despite this, age structure effects are rarely controlled for in empirical consumption functions. Our findings suggest that they should. By analysing Norwegian quarterly time series data we find that changes in the age distribution of the population have significant and life cycle consistent effects on aggregate consumption. Furthermore, controlling for age structure effects stabilizes the other parameters of the consumption function and reveals significant real interest rate effects. Simulation experiments show that the numerical effect on the savings rate of age structure changes is substantial when the indirect effects via wealth and income are accounted for.
    Keywords: Consumption; demography; savings; time series models; cointegration.
    JEL: C51 C53 E21 J10
    Date: 2005–04–06
    URL: http://d.repec.org/n?u=RePEc:hhs:osloec:2004_027&r=cmp
  11. By: Jørgensen, Øystein (Dept. of Economics, University of Oslo); Ognedal, Tone (Dept. of Economics, University of Oslo); Strøm, Steinar (Dept. of Economics, University of Oslo)
    Abstract: We estimate labor supply when tax evasion is an option, using a discrete choice model on pooled Norwegian survey data from 1980 and 2001. Direct labor supply elasticities, conditional on sectors, are in the range of 0.2-0.4. The elasticities are higher for work that is not registered for taxation than for registered work. Overall wage increases have a positive impact on the supply of registered work and a negative impact on the supply of unregistered work. In addition to economic factors such as wages and tax rates, social norms and opportunities for tax evasion at the workplace also have an impact on the supply of unregistered labor. The model is used to simulate the impact on labor supply of changes in the tax structure, such as the lowering of marginal tax rates. The fraction of the population who did unreported work declined from 1980 to 2001. Lower and less progressive tax rates after 1980 have contributed to this reduction. Although taxes matter for the supply of both reported and non-reported labor, the impact is not strong. Social norms and opportunities for tax evasion at the workplace are also important in explaining the change.
    Keywords: Labor supply; tax evasion; survey data; microeconometrics
    JEL: C25 D12 D81 H26 J22
    Date: 2005–04–06
    URL: http://d.repec.org/n?u=RePEc:hhs:osloec:2005_006&r=cmp
  12. By: Frankel, David M.; Burdzy, Krzysztof
    Abstract: A popular theory of business cycles is that they are driven by animal spirits: shifts in expectations brought on by sunspots. A prominent example is Howitt and McAfee (AER, 1992). We show that this model has a unique equilibrium if there are payoff shocks of any size. This equilibrium still has the desirable property that recessions and expansions can occur without any large exogenous shocks. We give an algorithm for computing the equilibrium and study its comparative statics properties. This work generalizes Burdzy, Frankel, and Pauzner (2000) to the case of endogenous frictions and seasonal and mean-reverting shocks.
    JEL: C7 E3
    Date: 2005–04–08
    URL: http://d.repec.org/n?u=RePEc:isu:genres:12274&r=cmp
  13. By: DUFOUR, Jean-Marie
    Abstract: The technique of Monte Carlo (MC) tests [Dwass (1957), Barnard (1963)] provides an attractive method of building exact tests from statistics whose finite sample distribution is intractable but can be simulated (provided it does not involve nuisance parameters). We extend this method in two ways: first, by allowing for MC tests based on exchangeable possibly discrete test statistics; second, by generalizing the method to statistics whose null distributions involve nuisance parameters (maximized MC tests, MMC). Simplified asymptotically justified versions of the MMC method are also proposed and it is shown that they provide a simple way of improving standard asymptotics and dealing with nonstandard asymptotics (e.g., unit root asymptotics). Parametric bootstrap tests may be interpreted as a simplified version of the MMC method (without the general validity properties of the latter).
    Keywords: Monte Carlo test; maximized Monte Carlo test; finite sample test; exact test; nuisance parameter; bounds; bootstrap; parametric bootstrap; simulated annealing; asymptotics; nonstandard asymptotic distribution.
    JEL: C12 C15 C2 C52 C22
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:mtl:montde:2005-03&r=cmp
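    Sketch (editorial illustration): the abstract builds on the classical Monte Carlo test of Dwass (1957) and Barnard (1963). The minimal Python sketch below shows only that basic MC test for a statistic whose null distribution is simulated; the example statistic and sample are illustrative assumptions, and the paper's maximized MC extension over nuisance parameters is not shown.

      # Plain Monte Carlo test: exact p-value from simulated null statistics.
      import numpy as np

      rng = np.random.default_rng(0)

      def statistic(sample):
          # Example statistic: absolute studentized mean (test of zero mean).
          return abs(sample.mean()) / (sample.std(ddof=1) / np.sqrt(len(sample)))

      data = rng.normal(0.3, 1.0, 30)        # observed sample (illustrative)
      s_obs = statistic(data)

      N = 999                                 # number of Monte Carlo replications
      s_sim = np.array([statistic(rng.normal(0.0, 1.0, len(data))) for _ in range(N)])

      # Exact MC p-value: (1 + #{simulated >= observed}) / (N + 1).
      p_value = (1 + (s_sim >= s_obs).sum()) / (N + 1)
      print("MC test p-value:", p_value)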
  14. By: Yose Rizal Damuri (Department of Economics, Centre for Strategic and International Studies); Ari A. Perdana (Department of Economics, Centre for Strategic and International Studies)
    Abstract: The paper seeks to quantitatively measure the impact of fiscal policy on income distribution and poverty in Indonesia using WAYANG, the CGE model for the Indonesian economy. We find that scenarios for fiscal expansion significantly influence income distribution and poverty. Fiscal expansion mainly benefits urban households and non-labour rural households – basically, the wealthiest segments of the society. We offer several explanations. First, the factors of production owned by these segments allowed them to reap the most benefits from fiscal expansions. Second, these households are least affected by price increases due to their consumption structure. Finally, we find that, in real terms, Indonesia's taxation system burdens poorer households more than richer ones.
    Keywords: Indonesia, income distribution, poverty, economic modelling, fiscal policy
    Date: 2003–05
    URL: http://d.repec.org/n?u=RePEc:eab:macroe:58&r=cmp
  15. By: Tubagus Feridhanusetyawan (Department of Economics, Centre for Strategic and International Studies); Yose Rizal Damuri (Department of Economics, Centre for Strategic and International Studies)
    Abstract: This paper uses simulations based on a GTAP model to reproduce the economic crisis in Southeast Asia, and in particular in Indonesia. The model is a static real-sector model, so the focus of the simulation is on the declining investment and the declining prices of non-traded goods during the crisis. The simulation is conducted by creating an exogenous shock to the risk premium in Indonesia, Thailand and Malaysia, which leads to a smaller allocation of regional investment in these countries, a lower stock of capital goods, and lower production. The second shock, the declining price of land and natural resources, opens the possibility of resource reallocation between sectors in the economy. The results of the crisis simulation show that the decline in overall GDP during the crisis is accompanied by declining production of capital- and labor-intensive commodities, and expansion of natural-resource- and land-based sectors. Based on the simulation, the economic crisis is expected to lower production in forestry and forestry-related manufacturing sectors, mainly because these sectors are more capital or labor intensive, rather than land or natural resource intensive. Consistent with the modeling exercise, the output of these sectors also declined in reality during the worst period of the crisis in 1997-99. The simulation results also show that the negative impact of the crisis on welfare, measured as the change in equivalent variation, is serious. The second simulation in this study measures the impact of trade liberalization on the economy after the crisis. The results show that the potential benefit from trade liberalization is large, and larger than the welfare loss during the crisis. In other words, pursuing more progressive trade liberalization would speed up the economic recovery after the crisis by creating more opportunities to benefit from the global economy.
    Keywords: Southeast Asia, Indonesia, Asian crisis, forestry sector, computable general equilibrium (CGE)
    Date: 2004–02
    URL: http://d.repec.org/n?u=RePEc:eab:macroe:62&r=cmp
  16. By: Carl Chiarella (School of Finance and Economics, University of Technology, Sydney); Giulia Iori
    Abstract: In this paper we develop a model of an order-driven market where traders set bids and asks and post market or limit orders according to exogenously fixed rules. The model seeks to capture a number of features suggested by recent empirical analysis of limit order data, such as: the fat-tailed distribution of limit order placement from the current bid/ask; the fat-tailed distribution of order execution time; the fat-tailed distribution of orders stored in the order book; and long memory in the signs (buy or sell) of trades. The model developed here extends the earlier one of Chiarella and Iori (2002) in several important respects; in particular, agents have heterogeneous time horizons and can submit orders of sizes larger than one, determined either by utility maximisation or by a random selection procedure. We analyze the impact of chartist and fundamentalist strategies on the determination of both the placement level and the placement size, on the shape of the book, the distribution of orders at different prices, and the distribution of their execution times. We compare the results of model simulations with real market data.
    Date: 2005–02–01
    URL: http://d.repec.org/n?u=RePEc:uts:rpaper:152&r=cmp
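    Sketch (editorial illustration, far simpler than the model in the paper): a bare-bones order book in which unit-size limit and market orders arrive at random; the order-arrival rules, tick size and starting price are illustrative assumptions only.

      # Deliberately minimal order-book simulation (illustrative, not the paper's model).
      import random, bisect

      random.seed(3)
      bids, asks = [], []                 # price-sorted lists of resting limit orders
      last_price, tick = 100.0, 0.1

      for t in range(10_000):
          side = random.choice(("buy", "sell"))
          if random.random() < 0.5:                      # limit order
              offset = tick * random.randint(1, 20)      # distance from last price
              if side == "buy":
                  bisect.insort(bids, last_price - offset)
              else:
                  bisect.insort(asks, last_price + offset)
          else:                                          # market order
              if side == "buy" and asks:
                  last_price = asks.pop(0)               # hit best ask
              elif side == "sell" and bids:
                  last_price = bids.pop()                # hit best bid

      best_bid = bids[-1] if bids else None
      best_ask = asks[0] if asks else None
      print("last trade:", round(last_price, 2), "best bid/ask:", best_bid, best_ask)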
  17. By: Michael Bierbrauer (University of Karlsruhe); Stefan Trueck (University of Karlsruhe); Rafal Weron (Hugo Steinhaus Center)
    Abstract: We address the issue of modeling spot electricity prices with regime switching models. After reviewing the stylized facts about power markets we propose and fit various models to spot prices from the Nordic power exchange. Afterwards we assess their performance by comparing simulated and market prices.
    Keywords: Power market, Electricity price modeling, Regime switching model
    JEL: C51 L94 Q40
    Date: 2005–02–07
    URL: http://d.repec.org/n?u=RePEc:wpa:wuwpem:0502005&r=cmp
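    Sketch (editorial illustration, not the models fitted in the paper): a schematic two-regime Markov switching simulation of log spot prices, with a mean-reverting "base" regime and a higher-mean, higher-variance "spike" regime; all parameter values are illustrative assumptions.

      # Two-regime Markov switching log-price simulation (illustrative values).
      import numpy as np

      rng = np.random.default_rng(11)
      T = 1_000
      P = np.array([[0.98, 0.02],      # transition matrix: base  -> {base, spike}
                    [0.60, 0.40]])     #                     spike -> {base, spike}
      mu, kappa, sigma = [3.3, 4.2], [0.3, 0.9], [0.05, 0.35]   # per-regime parameters

      regime = 0
      logp = np.full(T, mu[0])
      regimes = np.zeros(T, dtype=int)
      for t in range(T - 1):
          regime = rng.choice(2, p=P[regime])
          regimes[t + 1] = regime
          # Mean-reverting log-price within the current regime.
          logp[t + 1] = logp[t] + kappa[regime] * (mu[regime] - logp[t]) \
                        + sigma[regime] * rng.normal()

      prices = np.exp(logp)
      print("share of spike-regime periods:", regimes.mean())
      print("mean / max simulated price:", prices.mean().round(1), prices.max().round(1))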
  18. By: Matteo M. Pelagatti (University of Milan-Bicocca); Stefania Rondena (University of Milan-Bicocca)
    Abstract: The Dynamic Conditional Correlation model of Engle has made the estimation of multivariate GARCH models feasible for reasonably large vectors of securities’ returns. In the present paper we show how Engle’s two-step estimation of the model can be easily extended to elliptical conditional distributions, and apply different leptokurtic DCC models to some stocks listed on the Milan Stock Exchange. Free software written by the authors to carry out all the required computations is presented as well.
    Keywords: Multivariate GARCH, Dynamic conditional correlation, Generalized method of moments
    JEL: C1 C2 C3 C4 C5 C8
    Date: 2005–03–11
    URL: http://d.repec.org/n?u=RePEc:wpa:wuwpem:0503007&r=cmp
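    Sketch (editorial illustration): the DCC(1,1) correlation recursion of Engle, Q_t = (1-a-b)S + a e_{t-1} e_{t-1}' + b Q_{t-1} with R_t obtained by rescaling Q_t to unit diagonal, applied here to randomly generated stand-ins for GARCH-standardized residuals. The parameters a and b are illustrative, and the paper's elliptical-distribution extension is not shown.

      # DCC(1,1) correlation recursion on pre-standardized residuals.
      import numpy as np

      rng = np.random.default_rng(5)
      T, k = 1_000, 3
      eps = rng.standard_normal((T, k))          # stand-ins for standardized residuals
      a, b = 0.05, 0.90                          # illustrative DCC parameters (a + b < 1)

      S = np.corrcoef(eps, rowvar=False)         # unconditional correlation target
      Q = S.copy()
      R_last = None
      for t in range(T):
          e = eps[t][:, None]
          Q = (1 - a - b) * S + a * (e @ e.T) + b * Q
          d = np.sqrt(np.diag(Q))
          R_last = Q / np.outer(d, d)            # conditional correlation matrix R_t

      print("last conditional correlation matrix:\n", np.round(R_last, 3))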
  19. By: Eric Hillebrand (Louisiana State University, Department of Economics)
    Abstract: After the stock market crash of 1987, Fischer Black proposed a model in which he explained the crash by inconsistencies in the formation of expectations of mean reversion in stock returns. Following this explanation, a model that allows for mean reversion in stock returns is estimated on daily stock index data around the crash of 1987. The results strongly support Black’s hypothesis. Simulations show that on Friday Oct 16, 1987, a crash of 20 percent or more had a probability of more than seven percent.
    Keywords: stock-market crash, mean reversion, stock return predictability, change-points
    JEL: G10 C22
    Date: 2005–01–31
    URL: http://d.repec.org/n?u=RePEc:wpa:wuwpfi:0501015&r=cmp
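    Sketch (editorial illustration): a generic way to estimate a crash probability by simulating a mean-reverting log-price over a short horizon and counting paths that fall by 20 percent or more. The reversion speed, volatility and starting gap below are illustrative assumptions and do not reproduce the paper's estimate.

      # Crash probability from a simulated mean-reverting log-price (illustrative).
      import numpy as np

      rng = np.random.default_rng(1987)
      kappa, mu, sigma = 5.0, 0.0, 0.8         # reversion speed, long-run level, volatility
      x0 = 0.30                                # current log-price gap above the long-run level
      horizon_days, n_paths = 5, 100_000
      dt = 1.0 / 252.0

      x = np.full(n_paths, x0)
      for _ in range(horizon_days):
          x += kappa * (mu - x) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)

      # A "crash" here means the log-price falls by 20% or more from its start.
      crash_prob = (x - x0 <= np.log(0.8)).mean()
      print("estimated crash probability over the horizon:", crash_prob)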
  20. By: Raoul Pietersz (Erasmus University Rotterdam); Antoon Pelsser (Erasmus University Rotterdam); Marcel van Regenmortel (ABN AMRO Bank)
    Abstract: This paper shows that the forward rate process, discretized by a single time step together with a separability assumption on the volatility function, allows for representation by a low-dimensional Markov process. This in turn leads to efficient pricing by, for example, finite differences. We then develop a discretization based on the Brownian bridge, especially designed to have high accuracy for single time stepping. The scheme is proven to converge weakly with order 1. We compare the single time step method for pricing on a grid with multi-step Monte Carlo simulation for a Bermudan swaption, reporting a computational speed increase of a factor of 10 while pricing remains sufficiently accurate.
    Keywords: BGM model, predictor-corrector, Brownian bridge, Markov processes, separability, Feynman-Kac, Bermudan swaption
    JEL: G13
    Date: 2005–02–11
    URL: http://d.repec.org/n?u=RePEc:wpa:wuwpfi:0502005&r=cmp
  21. By: Raoul Pietersz (Erasmus University Rotterdam); Patrick J. F. Groenen (Erasmus University Rotterdam)
    Abstract: A novel algorithm is developed for the problem of finding a low-rank correlation matrix nearest to a given correlation matrix. The algorithm is based on majorization and, therefore, it is globally convergent. The algorithm is computationally efficient, is straightforward to implement, and can handle arbitrary weights on the entries of the correlation matrix. A simulation study suggests that majorization compares favourably with competing approaches in terms of the quality of the solution within a fixed computational time. The problem of rank reduction of correlation matrices occurs when pricing a derivative dependent on a large number of assets, where the asset prices are modelled as correlated log-normal processes. Mainly, such an application concerns interest rates.
    Keywords: rank, correlation matrix, majorization, lognormal price processes
    JEL: G13
    Date: 2005–02–11
    URL: http://d.repec.org/n?u=RePEc:wpa:wuwpfi:0502006&r=cmp
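    Sketch (editorial illustration): the problem is to approximate a given correlation matrix by one of low rank. The Python sketch below uses a simple eigenvalue-truncation-plus-rescaling heuristic as a baseline for the same problem; it is NOT the majorization algorithm of the paper, and the example matrix is randomly generated.

      # Baseline rank reduction of a correlation matrix (not the paper's algorithm).
      import numpy as np

      def low_rank_corr(C, rank):
          """Crude rank-'rank' correlation approximation of correlation matrix C."""
          vals, vecs = np.linalg.eigh(C)
          idx = np.argsort(vals)[::-1][:rank]                    # keep largest eigenvalues
          F = vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))   # n x rank factor
          F /= np.linalg.norm(F, axis=1, keepdims=True)          # unit rows -> unit diagonal
          return F @ F.T

      rng = np.random.default_rng(2)
      n = 8
      A = rng.standard_normal((n, n))
      C = np.corrcoef(A @ A.T)                                   # an example correlation matrix
      C_low = low_rank_corr(C, rank=3)
      print("rank of approximation:", np.linalg.matrix_rank(C_low))
      print("Frobenius distance to C:", np.linalg.norm(C - C_low))

    The paper's majorization approach instead minimizes a (possibly weighted) distance to C over the set of rank-constrained correlation matrices with guaranteed global convergence; the heuristic above only illustrates the setup.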
  22. By: Boris Majcen (Institute for Economic Research Ljubljana); Miroslav Verbic (Institute for Economic Research Ljubljana); Sasa Knezevic (Institute for Economic Research Ljubljana)
    Abstract: The new version of the CGE model of the Slovenian economy, based on the 1998 SAM, was used to simulate the consequences of further foreign trade liberalization after 1998: the completed implementation of the Free Trade Agreements and the European Agreement, the adaptation of the Customs Tariff to the EU Common External Tariff for manufacturing products, the adoption of the EU Common External Tariff after the accession of Slovenia to the EU, and the estimated transfers between the two budgets. The results obtained show a positive net outcome of Slovenian accession to the EU in the long run. On the other hand, rational behaviour of the government will certainly moderate possible short-run negative effects and improve favourable long-run effects.
    Keywords: Computable General Equilibrium Model, EU-Accession, Financial Flows, Trade Liberalization, Transition Country, Regionalism
    JEL: D58 F15 F43 E2
    Date: 2005–01–29
    URL: http://d.repec.org/n?u=RePEc:wpa:wuwpit:0501011&r=cmp
  23. By: Pedro S. Amaral (Southern Methodist University); Erwan Quintin (Federal Reserve Bank of Dallas)
    Abstract: We present a model in which the importance of financial intermediation for development can be measured. We generate financial differences by varying the degree to which contracts can be enforced. Economies where enforcement is poor employ less capital and less efficient technologies. Calibrated simulations reveal that both effects are important. Yet, accounting for all the observed dispersion in output requires a higher capital share or a lower elasticity of substitution between capital and labor than usually assumed. We find that the effects of changes in those technological parameters on output are markedly larger when financial frictions are present. Finance, that is, matters.
    JEL: E
    Date: 2005–02–01
    URL: http://d.repec.org/n?u=RePEc:wpa:wuwpma:0502007&r=cmp
  24. By: Paolo Surico (Bank of England & University of Bari)
    Abstract: The New-Keynesian Phillips curve plays a central role in modern macroeconomic theory. A vast empirical literature has estimated this structural relationship over various postwar full samples. While it is well known that in a New-Keynesian model a weak central bank response to inflation generates sunspot fluctuations, the consequences of pooling observations from different monetary policy regimes for the estimates of the Phillips curve had not been investigated. Using Monte Carlo simulations from a purely forward-looking model, this paper shows that indeterminacy can introduce sizable persistence into the estimated process of inflation. This persistence, however, is not an intrinsic feature of the economy; rather, it is the result of self-fulfilling expectations. By neglecting indeterminacy, the estimates of the forward-looking term of the Phillips curve are shown to be biased downward. The implications are in line with the empirical evidence for the UK and US.
    Keywords: indeterminacy, New-Keynesian Phillips curve, Monte Carlo, bias, persistence
    JEL: E58 E31 E32
    Date: 2005–04–08
    URL: http://d.repec.org/n?u=RePEc:wpa:wuwpma:0504014&r=cmp
  25. By: Alexis Derviz; Narcisa Kadlčáková; Lucie Kobzová
    Abstract: This paper analyses the impact of different credit risk-based capital requirement implementations on banks’ need for capital. The capital requirements for an artificially constructed risky loan portfolio are calculated by applying the BIS approach, two widespread commercial risk-measurement models, CreditMetrics and CreditRisk+, and, finally, an original synthetic model similar to KMV. In the first three cases we closely follow the methodologies proposed by the regulatory or credit risk models. Economic capital requirements for the latter are obtained by means of Monte Carlo simulations. In the context of CreditMetrics, we additionally perform a Monte Carlo-based stress test of the monetary policy changes reflected in the term structure of interest rates. Our model of KMV type combines elements of the structural and reduced-form methods of risky debt pricing, and the possibilities of its numerical solution are outlined.
    Keywords: credit risk, economic capital, market risk, New Basel Capital Accord, systemic uncertainty.
    JEL: G21 G28 G33
    Date: 2003–12
    URL: http://d.repec.org/n?u=RePEc:cnb:wpaper:2003/09&r=cmp
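    Sketch (editorial illustration, not any of the models compared in the paper): a stylized Monte Carlo calculation of economic capital for a loan portfolio using a generic one-factor default model, with capital defined as a high loss quantile minus the expected loss; all exposures, default probabilities and correlation values are illustrative assumptions.

      # Stylized Monte Carlo economic capital for an artificial loan portfolio.
      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(9)
      n_loans, n_scenarios = 200, 20_000
      exposure = rng.uniform(0.5, 2.0, n_loans)      # illustrative exposures
      pd = rng.uniform(0.005, 0.05, n_loans)         # illustrative default probabilities
      lgd, rho = 0.45, 0.20                          # loss given default, asset correlation
      threshold = norm.ppf(pd)                       # default thresholds per loan

      losses = np.empty(n_scenarios)
      for s in range(n_scenarios):
          z = rng.standard_normal()                                  # systemic factor
          eps = rng.standard_normal(n_loans)                         # idiosyncratic factors
          asset = np.sqrt(rho) * z + np.sqrt(1 - rho) * eps
          defaulted = asset < threshold
          losses[s] = (exposure * lgd * defaulted).sum()

      expected_loss = losses.mean()
      var_999 = np.quantile(losses, 0.999)
      print("expected loss:", round(expected_loss, 2))
      print("economic capital (99.9% VaR minus EL):", round(var_999 - expected_loss, 2))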

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.