nep-cmp New Economics Papers
on Computational Economics
Issue of 2012‒05‒02
fifteen papers chosen by
Stan Miles
Thompson Rivers University

  1. Support Vector Machines with Evolutionary Feature Selection for Default Prediction By Wolfgang Karl Härdle; Dedy Dwi Prastyo; Christian Hafner
  2. Costs and benefits of logistics pooling for urban freight distribution: scenario simulation and assessment for strategic decision support By Jesus Gonzalez-Feliu
  3. To Tune or not to Tune: Rule Evaluation for Metaheuristic-based Sequential Covering Algorithms By B. MINNAERT; D. MARTENS; M. DE BACKER; B. BAESENS
  4. Optimal simulation schemes for Lévy driven stochastic differential equations By Arturo Kohatsu-Higa; Salvador Ortiz-Latorre; Peter Tankov
  5. The Prospects of the Baby Boomers: Methodological Challenges in Projecting the Lives of an Aging Cohort By Christian Westermeier; Anika Rasner; Markus M. Grabka
  6. French e-grocery models: a comparison of delivery performances By Durand Bruno; Jesus Gonzalez-Feliu
  7. Transmission of distress in a bank credit network By Yoshiharu Maeno; Satoshi Morinaga; Hirokazu Matsushima; Kenichi Amagai
  8. An Economic Assessment of Biogas Production and Land Use under the German Renewable Energy Source Act By Ruth Delzeit; Wolfgang Britz
  9. Government spending in a model where debt affects output gap By Bell, Peter N
  10. Pricing Decisions and Insider Trading in Horse Betting Markets By A. SCHNYTZER; V. MAKROPOULOU; M. LAMERS
  11. A Class of Adaptive Importance Sampling Weighted EM Algorithms for Efficient and Robust Posterior and Predictive Simulation By Lennart Hoogerheide; Anne Opschoor; Herman K. van Dijk
  12. Signing distortions in optimal tax or other adverse selection models with random participation By Laurence Jacquet; Etienne Lehmann; Bruno Van Der Linden
  13. Equilibres Multiples, Croissance Endogène et Politiques Publiques (Multiple Equilibria, Endogenous Growth and Public Policies) By Ali Abcha
  14. Public Sector Wage Bargaining, Unemployment, and Inequality By Gabriele Cardullo
  15. Estimating financial institutions’ intraday liquidity risk: a Monte Carlo simulation approach By Carlos Léon

  1. By: Wolfgang Karl Härdle; Dedy Dwi Prastyo; Christian Hafner
    Abstract: Predicting default probabilities is at the core of credit risk management and is becoming more and more important for banks in order to measure their clients' degree of risk, and for firms to operate successfully. The SVM with evolutionary feature selection is applied to the CreditReform database. We use classical methods such as discriminant analysis (DA), logit and probit models as benchmarks. Overall, GA-SVM outperforms the benchmark models on both the training and testing datasets.
    Keywords: SVM, Genetic algorithm, global optimum, default prediction
    JEL: C14 C45 C61 C63 G33
    Date: 2012–04
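For readers unfamiliar with evolutionary feature selection, the GA wrapper loop the abstract refers to can be sketched as follows. This is a minimal illustration, not the authors' CreditReform setup: the feature count, the "informative" feature set and all GA parameters are invented, and a toy fitness function stands in for the SVM's cross-validated accuracy.

```python
import random

random.seed(0)

N_FEATURES = 10
INFORMATIVE = {1, 3, 7}  # hypothetical "true" predictors of default


def fitness(mask):
    """Stand-in for cross-validated SVM accuracy on the selected features:
    rewards informative features and penalises subset size."""
    selected = {i for i, bit in enumerate(mask) if bit}
    return len(selected & INFORMATIVE) - 0.05 * len(selected)


def mutate(mask, rate=0.1):
    # flip each bit independently with probability `rate`
    return [bit ^ (random.random() < rate) for bit in mask]


def crossover(a, b):
    # single-point crossover of two parent bit masks
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]


def ga_select(pop_size=30, generations=40):
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]          # truncation selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children               # elitist replacement
    return max(pop, key=fitness)


best = ga_select()
print(sorted(i for i, bit in enumerate(best) if bit))
```

In a real application the fitness call would train and score an SVM for each candidate feature subset, which is what makes the wrapper approach expensive and the GA's population-based search attractive.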
  2. By: Jesus Gonzalez-Feliu (LET - Laboratoire d'économie des transports - CNRS : UMR5593 - Université Lumière - Lyon II - Ecole Nationale des Travaux Publics de l'Etat)
    Abstract: Collaborative transportation and logistics pooling are relatively new concepts in research, but are very popular in practice. In recent years, collaborative transportation has emerged as a good city logistics alternative to classical urban consolidation centres, but it is still at a development stage. This paper proposes a framework for ex-ante evaluation of urban logistics pooling. This framework is developed with two purposes: the first is to generate comparable contrasted or progressive scenarios representing realistic situations; the second is to simulate and assess them in a "before-after" comparative analysis. In this framework, a demand generation model is combined with a route optimization algorithm to simulate the routes resulting from the individual or collaborative distribution schemes assumed by each scenario. Several indicators can then be obtained, mainly travelled distances, working times, road occupancy rates and operational monetary costs. To illustrate possible applications of the framework, several scenarios for the urban area of Lyon (France) are simulated and discussed.
    Keywords: urban logistics; resource sharing; freight transport pooling; policy-oriented modelling; simulation-based comparative analysis
    Date: 2011–11–09
  3. By: B. MINNAERT; D. MARTENS; M. DE BACKER; B. BAESENS
    Abstract: While many papers propose innovative methods for constructing individual rules in separate-and-conquer rule learning algorithms, comparatively few study the heuristic rule evaluation functions used in these algorithms to ensure that the selected rules combine into a good rule set. Underestimating the impact of this component has led to suboptimal design choices in many algorithms. The main goal of this paper is to demonstrate the importance of heuristic rule evaluation functions by improving existing rule induction techniques, and to provide guidelines for algorithm designers. We first select optimal heuristic rule learning functions for several metaheuristic-based algorithms and empirically compare the resulting heuristics across algorithms. This results in large and significant improvements in predictive accuracy for two techniques. We find that despite the absence of a globally optimal choice for all algorithms, good default choices seem to exist for families of algorithms. A near-optimal selection can thus be found for new algorithms with minor experimental tuning. A major contribution is made towards balancing a model's predictive accuracy with its comprehensibility, as the parametrized heuristics offer unmatched flexibility when it comes to setting the trade-off between accuracy and comprehensibility.
    Keywords: Classification · Rule Induction · Heuristics · Rule Evaluation · Sequential Covering
    Date: 2012–01
  4. By: Arturo Kohatsu-Higa; Salvador Ortiz-Latorre; Peter Tankov
    Abstract: We consider a general class of high order weak approximation schemes for stochastic differential equations driven by Lévy processes with infinite activity. These schemes combine a compound Poisson approximation for the jump part of the Lévy process with a high order scheme for the Brownian driven component, applied between the jump times. The overall approximation is analyzed using a stochastic splitting argument. The resulting error bound involves separate contributions of the compound Poisson approximation and of the discretization scheme for the Brownian part, and allows, on the one hand, balancing the two contributions in order to minimize the computational time and, on the other hand, studying the optimal design of the approximating compound Poisson process. For driving processes whose Lévy measure explodes near zero in a regularly varying way, this procedure yields discretization schemes with arbitrary order of convergence.
    Date: 2012–04
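The splitting idea can be illustrated in its simplest first-order form. A hedged sketch: the paper treats infinite-activity Lévy processes (approximating the small jumps), whereas the code below assumes a directly simulable finite-activity compound Poisson jump part; the coefficients, the jump-size law and the multiplicative jump form are all illustrative choices.

```python
import math
import random

random.seed(1)


def simulate_path(x0=1.0, T=1.0, n_steps=200, jump_rate=3.0,
                  drift=lambda x: 0.05 * x,
                  vol=lambda x: 0.2 * x,
                  jump_size=lambda: random.gauss(0.0, 0.1)):
    """Euler scheme for dX = drift dt + vol dW + dJ with J compound Poisson.
    Jump times are drawn first; between consecutive times the Brownian-driven
    part is discretised on a regular grid, and jumps are applied at jump
    times, mirroring the splitting of the two components."""
    # draw the compound Poisson jump times on [0, T]
    jump_times = []
    t = random.expovariate(jump_rate)
    while t < T:
        jump_times.append(t)
        t += random.expovariate(jump_rate)

    # merge jump times into the regular discretisation grid
    grid = sorted(set(i * T / n_steps for i in range(n_steps + 1))
                  | set(jump_times))
    jumps = set(jump_times)
    x, prev = x0, 0.0
    for t in grid[1:]:
        dt = t - prev
        if dt > 0:  # Euler-Maruyama step for the continuous part
            x += drift(x) * dt + vol(x) * math.sqrt(dt) * random.gauss(0, 1)
        if t in jumps:  # apply the jump (multiplicative, by assumption here)
            x += x * jump_size()
        prev = t
    return x


print(simulate_path())
```

A higher-order scheme in the sense of the paper would replace the plain Euler step between jumps with a higher-order discretisation of the Brownian component, leaving the jump handling unchanged.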
  5. By: Christian Westermeier; Anika Rasner; Markus M. Grabka
    Abstract: In most industrialized countries, the work and family patterns of the baby boomers, characterized by more heterogeneous working careers and less stable family lives, set them apart from preceding cohorts. Thus, it is of crucial importance to understand how these different work and family lives are linked to the boomers' prospective material well-being as they retire. This paper presents a new and unique matching-based approach for the projection of the life courses of German baby boomers, called the LAW-Life Projection Model. The basis for the projection is data from 27 waves of the German Socio-Economic Panel linked with administrative pension records from the German Statutory Pension Insurance that cover lifecycle pension-relevant earnings. Unlike model-based micro simulations that age the data year by year, our matching-based projection uses sequences from older birth cohorts to complete the life courses of statistically similar baby boomers through to retirement. An advantage of this approach is that it coherently projects the work-life and family trajectories as well as lifecycle earnings. The authors present a benchmark analysis to assess the validity and accuracy of the projection. For this purpose, they cut a significant portion of already lived lives and test different combinations of matching algorithms and donor pool specifications to identify the combination that produces the best fit between previously cut but observed and projected life-course information. Exploiting the advantages of the projected data, the authors compare the returns to education, measured in terms of pension entitlements, across cohorts. The results indicate that within cohorts, differences between individuals with low and high educational attainment increase over time for men and women in East and West Germany. East German boomer women with low educational attainment face the most substantial losses in pension entitlements, which put them at a high risk of being poor as they retire.
    Keywords: Forecasting Models, simulation methods, SOEP, baby boomers, education, public pensions
    JEL: C53 H55 I24
    Date: 2012
  6. By: Durand Bruno (LEMNA - Laboratoire d'économie et de management de Nantes Atlantique - Université de Nantes : EA4272); Jesus Gonzalez-Feliu (LET - Laboratoire d'économie des transports - CNRS : UMR5593 - Université Lumière - Lyon II - Ecole Nationale des Travaux Publics de l'Etat)
    Abstract: This paper proposes a discussion of three scenarios related to French e-grocery developments, in order to identify and analyze the impacts of new forms of proximity deliveries on households' shopping trip flows. One of our objectives is to consider the logistics solutions adopted by online retailers. First, we present the two basic models of B2C: order-picking on a dedicated site and in-store picking. Second, we evaluate three distribution systems adopted by French e-grocery retailers. We focus in particular on the impacts of these systems on consumers' purchasing trips and, to this end, we use an empirical simulation approach to compare the systems studied.
    Keywords: e-Grocery, Warehouse-picking, Store-picking, Home Delivery (HD), Out of Home Delivery (OHD)
    Date: 2012–04–05
  7. By: Yoshiharu Maeno; Satoshi Morinaga; Hirokazu Matsushima; Kenichi Amagai
    Abstract: The European sovereign debt crisis has impaired many European banks. Distress in European banks may be transmitted worldwide and result in a large-scale knock-on default of financial institutions. This study presents a computer simulation model to analyze the risk of insolvency of banks and defaults in a bank credit network. Simulation experiments reproduce the knock-on default, and quantify the impact on the number of bank defaults of the heterogeneity of the bank credit network, the equity capital ratio of banks, and the capital surcharge on big banks.
    Date: 2012–04
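The knock-on mechanism the abstract describes is easy to sketch. The code below is a minimal Furfine-style cascade on a toy network, not the authors' model: the network, equity levels and the full-loss-given-default assumption are all invented for illustration.

```python
def knockon_defaults(equity, exposures, initially_failed):
    """Iteratively propagate defaults through a bank credit network.

    equity:    dict bank -> equity capital
    exposures: dict (creditor, debtor) -> amount lent
    A bank defaults once its accumulated credit losses reach its equity;
    losses are the full exposure to each failed debtor (an assumption)."""
    failed = set(initially_failed)
    losses = {bank: 0.0 for bank in equity}
    frontier = set(failed)
    while frontier:  # breadth-first rounds of contagion
        new_frontier = set()
        for debtor in frontier:
            for (creditor, d), amount in exposures.items():
                if d == debtor and creditor not in failed:
                    losses[creditor] += amount
                    if losses[creditor] >= equity[creditor]:
                        failed.add(creditor)
                        new_frontier.add(creditor)
        frontier = new_frontier
    return failed


# A chain A -> B -> C of interbank loans: C's failure wipes out B, then A.
equity = {"A": 5.0, "B": 4.0, "C": 3.0}
exposures = {("A", "B"): 6.0, ("B", "C"): 5.0}
print(sorted(knockon_defaults(equity, exposures, ["C"])))  # ['A', 'B', 'C']
```

A simulation study like the paper's would repeat such cascades over many randomly generated networks, varying the degree heterogeneity, equity ratios, and surcharges on large banks.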
  8. By: Ruth Delzeit; Wolfgang Britz
    Abstract: The Renewable Energy Source Act (EEG) promotes German biogas production in order to substitute fossil fuels, protect the environment, and prevent climate change. As a consequence, green maize production has increased significantly in recent years, causing negative environmental effects on soil, water and biodiversity. In this paper we quantitatively analyse the 2012 EEG reform by applying the simulation tool ReSI-M (Regionalised Location Information System – Maize). Comparing the EEG 2012 with a former version of the legislation, the results imply that the reform contributes to an expansion of biogas electricity generation, and thus to substitution of fossil fuels. Furthermore, given a restriction on the share of green maize input, its production is reduced and the crop mix is diversified. However, since maize provides the highest energy output per area, total land requirement for biogas production increases. An alternative analysis shows that an EEG with tariffs independent of plant types would provide the highest subsidy efficiency, but slightly lower land efficiency compared to the EEG 2012.
    Keywords: bioenergy, biogas, land use, policy analysis, simulation model
    JEL: C61 Q16 Q42
    Date: 2012–04
  9. By: Bell, Peter N
    Abstract: In this paper I present a simple model of government spending where the level of government debt affects the output gap. The structure of the economy is specified such that the output gap has a structural part, which is a function of debt. Based on empirical research, the structural part is assigned a specific functional form. The government faces an optimization problem in which it attempts to close the output gap. The optimal change in government debt is found by solving a nonlinear equation. Numerical results show that the optimal change in debt has nonlinear behaviour. The solution to the unconstrained problem is an alternating equilibrium, whereas the solution to the constrained problem is a nonlinear cycle around the government's upper bound of admissible debt.
    Keywords: Debt; Macroeconomy; Fiscal; Government Spending; Output Gap; Nonlinear; Numerical Method
    JEL: H60 E00
    Date: 2012–04–12
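Solving such a nonlinear optimality condition numerically is routine; a hedged sketch follows. The quadratic output-gap function g and its coefficients are purely hypothetical stand-ins, not the paper's specification; the bisection routine is the generic technique.

```python
import math


def bisect(f, lo, hi, tol=1e-10):
    """Root of f on [lo, hi] by bisection; f(lo) and f(hi) must differ in sign."""
    flo = f(lo)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        fmid = f(mid)
        if abs(fmid) < tol or hi - lo < tol:
            return mid
        if (flo < 0) == (fmid < 0):
            lo, flo = mid, fmid  # root lies in the upper half
        else:
            hi = mid             # root lies in the lower half
    return 0.5 * (lo + hi)


# Hypothetical condition: choose the change in debt d that closes the
# output gap, g(d) = a - b*d + c*d**2 = 0 (illustrative functional form).
a, b, c = 1.0, 2.0, 0.5


def g(d):
    return a - b * d + c * d ** 2


d_star = bisect(g, 0.0, 1.0)
print(round(d_star, 4))  # 0.5858, i.e. 2 - sqrt(2)
```

Bisection is slow but robust; a production solver would typically use Newton's method or Brent's method, falling back to bracketing when the derivative misbehaves.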
  10. By: A. SCHNYTZER; V. MAKROPOULOU; M. LAMERS
    Abstract: This paper builds on a theoretical model by Schnytzer, Lamers, and Makropoulou (2010) that conceptualizes fixed-odds horse betting markets as implicit call option markets. We model the decision-making process of a bookmaker who sets his prices under uncertainty. We extend the paper of Schnytzer et al. (2010) by relaxing some assumptions and allowing for betting at multiple time periods. We show that when a bookmaker follows this pricing process built upon implicit options, the returns will exhibit a favorite-longshot bias. By performing Monte Carlo simulations we generate the option values and are able to measure the degree of insider trading, which we find to be around 60% in our dataset.
    Keywords: Betting, Insider Trading, Contingent Pricing
    JEL: D81 D82 G13
    Date: 2012–02
  11. By: Lennart Hoogerheide (VU University Amsterdam); Anne Opschoor (Erasmus University Amsterdam); Herman K. van Dijk (Erasmus University Rotterdam, and VU University Amsterdam)
    Abstract: A class of adaptive sampling methods is introduced for efficient posterior and predictive simulation. The proposed methods are robust in the sense that they can handle target distributions that exhibit non-elliptical shapes such as multimodality and skewness. The basic method makes use of sequences of importance weighted Expectation Maximization steps in order to efficiently construct a mixture of Student-t densities that approximates the target distribution accurately - typically a posterior distribution, of which we only require a kernel - in the sense that the Kullback-Leibler divergence between target and mixture is minimized. We label this approach Mixture of t by Importance Sampling and Expectation Maximization (MitISEM). The constructed mixture is used as a candidate density for quick and reliable application of either Importance Sampling (IS) or the Metropolis-Hastings (MH) method. We also introduce three extensions of the basic MitISEM approach. First, we propose a method for applying MitISEM in a sequential manner. Second, we introduce a permutation-augmented MitISEM approach. Third, we propose a partial MitISEM approach, which aims at approximating the joint distribution by estimating a product of marginal and conditional distributions. This division can substantially reduce the dimension of the approximation problem, which facilitates the application of adaptive importance sampling for posterior simulation in more complex models with larger numbers of parameters. Our results indicate that the proposed methods can substantially reduce the computational burden in econometric models such as DCC or mixture GARCH models and a mixture instrumental variables model.
    Keywords: mixture of Student-t distributions; importance sampling; Kullback-Leibler divergence; Expectation Maximization; Metropolis-Hastings algorithm; predictive likelihood; DCC GARCH; mixture GARCH; instrumental variables
    JEL: C11 C22 C26
    Date: 2012–03–23
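The importance-weighting step at the heart of the method can be sketched with a single Student-t candidate and a toy target. This is only the weighting building block, not MitISEM itself, which fits a mixture of t densities by EM on these weights; the target kernel (a N(0.5, 1) posterior) and candidate parameters are invented for illustration.

```python
import math
import random

random.seed(7)


def t_sample(df, loc, scale):
    """Draw from a location-scale Student-t via normal / sqrt(chi-square/df)."""
    z = random.gauss(0.0, 1.0)
    chi2 = random.gammavariate(df / 2.0, 2.0)  # chi-square with df d.o.f.
    return loc + scale * z / math.sqrt(chi2 / df)


def t_logpdf(x, df, loc, scale):
    """Log density of the location-scale Student-t."""
    u = (x - loc) / scale
    return (math.lgamma((df + 1.0) / 2.0) - math.lgamma(df / 2.0)
            - 0.5 * math.log(df * math.pi) - math.log(scale)
            - (df + 1.0) / 2.0 * math.log1p(u * u / df))


def target_logkernel(x):
    """Unnormalised log posterior kernel; here a N(0.5, 1) kernel as a toy."""
    return -0.5 * x * x + 0.5 * x


def is_posterior_mean(n=20000, df=5.0, loc=0.0, scale=2.0):
    """Self-normalised importance sampling of the posterior mean with a
    Student-t candidate; only a kernel of the target is needed."""
    xs = [t_sample(df, loc, scale) for _ in range(n)]
    logw = [target_logkernel(x) - t_logpdf(x, df, loc, scale) for x in xs]
    m = max(logw)
    w = [math.exp(lw - m) for lw in logw]  # subtract max for stability
    return sum(wi * xi for wi, xi in zip(w, xs)) / sum(w)


print(is_posterior_mean())
```

For this toy target the estimate should land near the true posterior mean of 0.5. The heavy tails of the t candidate are what keep the importance weights well behaved, which is precisely why the paper builds its mixtures out of Student-t components.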
  12. By: Laurence Jacquet; Etienne Lehmann; Bruno Van Der Linden (THEMA, Universite de Cergy-Pontoise; CREST; IRES - Université Catholique de Louvain and FNRS)
    Abstract: We develop a methodology to sign output distortions in the random participation framework. We apply our method to the monopoly nonlinear pricing problem, to the regulatory monopoly problem and, mainly, to the optimal income tax problem. In the latter framework, individuals are heterogeneous across two unobserved dimensions: their skill and their disutility of participation in the labor market. We derive a fairly mild condition for optimal marginal tax rates to be non-negative everywhere, implying that in-work effort is distorted downwards. Numerical simulations for the U.S. confirm this property. Moreover, it is typically optimal to provide a distinct level of transfer to the non-employed and to workers with zero or negligible earnings.
    Keywords: Adverse selection, Optimal taxation, Random participation.
    JEL: H21 H23
    Date: 2012
  13. By: Ali Abcha
    Abstract: Public policies can change the number of equilibria in an endogenous growth model. This work shows that in a growth model with monopolistic competition, the existence of externalities not internalized by the agents can result in a multiplicity of equilibria. Government intervention in the economy can affect this multiplicity through the management of externalities and of the adverse effects of imperfect competition. However, inefficient public intervention can make the economy converge to a suboptimal equilibrium. To this end, we develop a macroeconomic model of a closed economy with a perfectly competitive final goods sector and a monopolistically competitive intermediate goods sector. In order to identify the effects of a public policy in this model, we run simulations.
    Keywords: multiple equilibria, endogenous growth, public policies
    Date: 2012
  14. By: Gabriele Cardullo (DIEM, Faculty of Economics, University of Genoa, Italy)
    Abstract: In many countries, the government pays almost identical nominal wages to workers living in regions with notable economic disparities. In most cases this is the result of highly centralized pay systems. By developing a two-region general equilibrium model with unions and search frictions in the labour market, I study the differences in terms of unemployment, real wages, and inequality between a regional wage bargaining process and a national one in the public sector. Adopting the former lowers public sector real salaries but it also decreases unemployment and jacks up private sector real earnings. Simulations conducted on the basis of Italian data show that, compared to a national negotiation process, a regional one also increases inequality both within and between regions.
    Keywords: public sector wages; unemployment; economic integration; local labour markets
    JEL: H53 J38 J64 R12 R13
    Date: 2012–02
  15. By: Carlos Léon
    Abstract: The most recent financial crisis revealed that liquidity risk is far more important and intricate than regulation had conceived. The shift from bank-based to market-based financial systems, and from Deferred Net Systems to liquidity-demanding Real-Time Gross Settlement of payments, explains some of the shortcomings of traditional liquidity risk management. Although liquidity regulations do exist, they are still at an early stage of development and discussion. Moreover, not all connotations of liquidity are equally addressed. Unlike market and funding liquidity, intraday liquidity has been absent from financial regulation, and has appeared only recently, after the crisis. This paper addresses the measurement of a Large-Value Payment System's intraday liquidity risk. Based on the generation of bivariate Poisson random numbers to simulate the minute-by-minute arrival of received and executed payments, the time-varying volume and degree of synchrony (i.e. timing) of each financial institution's intraday payments are modeled. Modeling the uncertainty of intraday payments allows for (i) overseeing participants' intraday behavior; (ii) assessing their ability to fulfill intraday payments at a certain confidence level; (iii) identifying participants that are not resilient to changes in payment timing mismatches; and (iv) estimating intraday liquidity buffers. Given the increasing importance of liquidity risk as a source of systemic risk, and the recent regulatory amendments, the results are useful for financial authorities and institutions.
    Date: 2012–04–11
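Bivariate Poisson draws of the kind the abstract mentions are usually generated by trivariate reduction: X = A + C and Y = B + C with A, B, C independent Poissons, so that X and Y are each Poisson and their covariance equals C's rate. The sketch below uses this standard construction with purely illustrative per-minute rates, not the paper's calibration to any payment system.

```python
import math
import random

random.seed(42)


def poisson(lam):
    """Poisson draw via Knuth's multiplication method (fine for small rates)."""
    if lam <= 0:
        return 0
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1


def bivariate_poisson(lam_recv, lam_exec, lam_common):
    """Trivariate reduction: X = A + C, Y = B + C with independent Poissons.
    Cov(X, Y) = lam_common, capturing the synchrony ('timing') between
    received and executed payments; requires lam_common <= both rates."""
    c = poisson(lam_common)
    return (poisson(lam_recv - lam_common) + c,
            poisson(lam_exec - lam_common) + c)


# One 480-minute trading day of payment arrivals for one institution,
# with illustrative per-minute rates.
received = executed = 0
for minute in range(480):
    r, e = bivariate_poisson(0.8, 0.7, 0.3)
    received += r
    executed += e
print(received, executed)
```

In a liquidity-risk simulation, many such days would be generated per institution and the distribution of the cumulative net position (received minus executed) would then yield intraday liquidity buffers at a chosen confidence level.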

This nep-cmp issue is ©2012 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.