New Economics Papers
on Computational Economics
Issue of 2013‒02‒03
eight papers chosen by



  1. On the Efficacy of Fourier Series Approximations for Pricing European and Digital Options By A. S. Hurn; K. A. Lindsay; A. J. Mcclelland
  2. Households heterogeneity in a global CGE model: an illustration with the MIRAGE-HH (MIRAGE-HouseHolds) model By Antoine Bouet; Carmen Estrades; David Laborde
  3. Kernel Factory: An Ensemble of Kernel Machines By M. BALLINGS; D. VAN DEN POEL
  4. Portfolio Optimization over a Finite Horizon with Fixed and Proportional Transaction Costs and Liquidity Constraints By Stefano Baccarin; Daniele Marazzina
  5. Embarrassingly Easy Embarrassingly Parallel Processing in R: Implementation and Reproducibility By Michael S. Delgado; Christopher F. Parmeter
  6. Navigating a Changing World Economy: ASEAN, the People’s Republic of China, and India By Petri, Peter A.; Zhai, Fan
  7. Using Premia and Nsp for Constructing a Risk Management Benchmark for Testing Parallel Architecture By Jean-Philippe Chancelier; Jérôme Lelong; Bernard Lapeyre
  8. Protecting Research and Technology from Espionage By D. THORLEUCHTER; D. VAN DEN POEL

  1. By: A. S. Hurn (QUT); K. A. Lindsay (Glasgow and QUT); A. J. Mcclelland (Sydney Numerix)
    Abstract: This paper investigates several competing procedures for computing the price of European and digital options in which the underlying model has a characteristic function that is known in at least semi-closed form. The algorithms for pricing the options investigated here are the half-range Fourier cosine series, the half-range Fourier sine series and the full-range Fourier series. The performance of the algorithms is assessed in simulation experiments which price options in a Black-Scholes world, where an analytical solution is available, and for a simple affine model of stochastic volatility, in which there is no closed-form solution. The results suggest that the half-range sine series approximation is the least effective of the three proposed algorithms. It is rather more difficult to distinguish between the performance of the half-range cosine series and the full-range Fourier series. There are, however, two clear differences. First, when the interval over which the density is approximated is relatively large, the full-range Fourier series is at least as good as the half-range Fourier cosine series, and outperforms the latter in pricing out-of-the-money call options, in particular those with maturities of three months or less. Second, the computational time required by the half-range Fourier cosine series is uniformly longer than that required by the full-range Fourier series for an interval of fixed length. Taken together, these two conclusions make a strong case for the merit of pricing options using a full-range Fourier series as opposed to a half-range Fourier cosine series. (An illustrative numerical sketch follows this entry.)
    Keywords: Fourier transform, Fourier series, characteristic function, option price
    Date: 2013–01–22
    URL: http://d.repec.org/n?u=RePEc:qut:auncer:2013_02&r=cmp
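    A minimal numerical sketch of the half-range Fourier cosine series idea discussed above, under assumed Black-Scholes parameters: the risk-neutral density of x = ln(S_T/K) is recovered from its characteristic function and a European call is then priced by plain numerical integration. The parameters, truncation interval and integration scheme are illustrative assumptions, not taken from the paper.

      import numpy as np
      from scipy.stats import norm

      # Illustrative (assumed) Black-Scholes parameters
      S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 0.25
      mu = np.log(S0 / K) + (r - 0.5 * sigma**2) * T                   # mean of x = ln(S_T/K)
      phi = lambda u: np.exp(1j * u * mu - 0.5 * sigma**2 * T * u**2)  # characteristic function of x

      # Half-range cosine series coefficients of the density on a truncation interval [a, b]
      a = mu - 8.0 * sigma * np.sqrt(T)
      b = mu + 8.0 * sigma * np.sqrt(T)
      N = 256
      u = np.arange(N) * np.pi / (b - a)
      A = (2.0 / (b - a)) * np.real(phi(u) * np.exp(-1j * u * a))
      A[0] *= 0.5                                                      # first term of the series is halved

      x = np.linspace(a, b, 4001)
      f = A @ np.cos(np.outer(u, x - a))                               # series approximation of the density
      payoff = np.maximum(K * (np.exp(x) - 1.0), 0.0)                  # call payoff written in terms of x
      cos_price = np.exp(-r * T) * np.sum(payoff * f) * (x[1] - x[0])  # discounted expected payoff

      # Closed-form Black-Scholes price for comparison
      d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
      d2 = d1 - sigma * np.sqrt(T)
      bs_price = S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)
      print(cos_price, bs_price)                                       # the two prices should agree closely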
  2. By: Antoine Bouet (Larefi - Laboratoire d'analyse et de recherche en économie et finance internationales - Université Montesquieu - Bordeaux IV : EA2954); Carmen Estrades (IFPRI - International Food Policy Research Institute); David Laborde (IFPRI - International Food Policy Research Institute)
    Abstract: The objective of this paper is to develop a version of the MIRAGE model with household heterogeneity and a public agent, to better analyze the impact of trade liberalization and other trade reforms on real income and welfare at the household level. In a first step, the model disaggregates the representative household into 13 to 39 households in each of five developing countries (Brazil, Pakistan, Tanzania, Uruguay and Vietnam). The sources of income and the consumption structure reflect disaggregated statistical information coming from household surveys. The new model better captures the behavior of the public agent in terms of revenues collected and of expenditures. Since domestic remittances may constitute an important determinant of income redistribution, the new version also endogenizes private inter-household transfers. This new version of MIRAGE takes into account the reaction of households to these shocks in an integrated and consistent framework. We study the impact of full trade liberalization on these households. This study concludes that: (i) while the impact of full trade liberalization may be small at the macroeconomic level, the effect on households' real income may be quite substantial at the household level, with great heterogeneity in the results; (ii) the major channel through which trade liberalization affects households' real income heterogeneously is the remuneration of productive factors, while the consumption-price channel has a limited impact; (iii) various domestic policies implemented alongside trade liberalization, such as modifications of public transfers to households or changes in income taxation, may significantly change the picture, either compensating for the negative effects of the shock or amplifying the direct impact of full trade liberalization; (iv) the impact of trade reform on poverty and inequality is significant and varies from one country to another.
    Keywords: CGE modeling, poverty, trade liberalization, household surveys
    Date: 2013–01–09
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-00780103&r=cmp
  3. By: M. BALLINGS; D. VAN DEN POEL
    Abstract: We propose an ensemble method for kernel machines. The training data is randomly split into a number of mutually exclusive partitions defined by a row and a column parameter. Each partition forms an input space and is transformed by a kernel function into a kernel matrix K. Subsequently, each K is used as training data for a base binary classifier (Random Forest). This results in a number of predictions equal to the number of partitions. A weighted average combines the predictions into one final prediction. To optimize the weights, a genetic algorithm is used. This approach has the advantage of simultaneously promoting (1) diversity, (2) accuracy, and (3) computational speed. (1) Diversity is fostered because the individual K's are based on a subset of features and observations, (2) accuracy is sought by optimizing the weights with the genetic algorithm, and (3) computational speed is obtained because the computation of each K can be parallelized. Using five times twofold cross-validation we benchmark the classification performance of Kernel Factory against Random Forest and Kernel-Induced Random Forest (KIRF). We find that Kernel Factory has significantly better performance than Kernel-Induced Random Forest. When the right kernel is specified, Kernel Factory is also significantly better than Random Forest. In addition, an open-source R software package implementing the algorithm (kernelFactory) is available from CRAN. (An illustrative sketch of the approach follows this entry.)
    Date: 2012–12
    URL: http://d.repec.org/n?u=RePEc:rug:rugwps:12/825&r=cmp
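    A minimal sketch of the Kernel Factory idea described above, with simplifying assumptions: an RBF kernel, a 2x2 row/column partition grid, and validation-accuracy weights standing in for the paper's genetic-algorithm weighting. The kernelFactory CRAN package itself is not used here; this is only an illustration of the partition-kernel-ensemble structure.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics.pairwise import rbf_kernel
      from sklearn.model_selection import train_test_split

      # Synthetic binary classification data (illustrative only)
      X, y = make_classification(n_samples=800, n_features=20, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
      X_fit, X_val, y_fit, y_val = train_test_split(X_tr, y_tr, test_size=0.25, random_state=0)

      n_row, n_col = 2, 2                                   # row x column partition grid
      rng = np.random.RandomState(0)
      row_parts = np.array_split(rng.permutation(len(X_fit)), n_row)
      col_parts = np.array_split(np.arange(X_fit.shape[1]), n_col)

      models, weights = [], []
      for rows in row_parts:
          for cols in col_parts:
              Xp, yp = X_fit[np.ix_(rows, cols)], y_fit[rows]
              K = rbf_kernel(Xp, Xp)                        # kernel matrix of this partition
              rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(K, yp)
              models.append((rf, Xp, cols))
              # weight each base learner by validation accuracy (stand-in for the GA step)
              weights.append(rf.score(rbf_kernel(X_val[:, cols], Xp), y_val))

      weights = np.asarray(weights) / np.sum(weights)
      probs = sum(w * rf.predict_proba(rbf_kernel(X_te[:, cols], Xp))[:, 1]
                  for w, (rf, Xp, cols) in zip(weights, models))
      print("weighted-ensemble accuracy:", np.mean((probs >= 0.5).astype(int) == y_te))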
  4. By: Stefano Baccarin (Department of Economics and Statistics (Dipartimento di Scienze Economico-Sociali e Matematico-Statistiche), University of Torino, Italy); Daniele Marazzina (Department of Mathematics, Polytechnic University of Milano, Italy)
    Abstract: We investigate a portfolio optimization problem for an agent who invests in two assets, a risk-free asset and a risky asset modeled by a geometric Brownian motion. The investor faces both fixed and proportional transaction costs and liquidity constraints. His objective is to maximize the expected utility from the portfolio liquidation at a finite terminal horizon. The model is formulated as a parabolic impulse control problem and we characterize the value function as the unique constrained viscosity solution of the associated quasi-variational inequality. We compute the optimal policy numerically by an iterative finite element discretization technique, presenting extended numerical results in the case of a constant relative risk aversion utility function. Our results show that, even with small transaction costs and distant horizons, the optimal strategy is essentially a buy-and-hold strategy in which the agent recalibrates his portfolio very few times. This contrasts sharply with the continuous interventions of Merton's model without transaction costs. (A schematic form of the quasi-variational inequality follows this entry.)
    Keywords: Portfolio Optimization, Quasi-variational Inequalities, Transaction Costs, Viscosity Solutions
    JEL: G11 D92 C61
    Date: 2013–01
    URL: http://d.repec.org/n?u=RePEc:tur:wpapnw:017&r=cmp
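    A schematic statement of the kind of parabolic quasi-variational inequality referred to above, written in LaTeX with assumed notation (value function V, cash position x, risky position y, fixed cost c_f, proportional cost c_p, utility U, liquidation value l, and the sup taken over admissible transactions delta). This is a generic impulse-control formulation that omits the liquidity constraints; it is not the paper's exact statement.

      % Schematic impulse-control QVI with fixed and proportional transaction costs (notation assumed)
      \begin{align*}
      &\min\Big\{\, -\partial_t V - \mathcal{L}V,\; V - \mathcal{M}V \,\Big\} = 0
          \quad \text{on } [0,T)\times\mathbb{R}^2, \\
      &\mathcal{L}V = r x\,\partial_x V + \mu y\,\partial_y V
          + \tfrac{1}{2}\sigma^2 y^2\,\partial_{yy} V
          \quad \text{(generator of the uncontrolled state)}, \\
      &\mathcal{M}V(t,x,y) = \sup_{\delta}\, V\bigl(t,\ x - \delta - c_f - c_p\lvert\delta\rvert,\ y + \delta\bigr)
          \quad \text{(value of the best immediate rebalancing)}, \\
      &V(T,x,y) = U\bigl(\ell(x,y)\bigr)
          \quad \text{(utility of the terminal liquidation value)}.
      \end{align*}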
  5. By: Michael S. Delgado (Department of Agricultural Economics, Purdue University); Christopher F. Parmeter (Department of Economics, University of Miami)
    Keywords: Parallel processing, reproducibility, computational efficiency, bootstrap, nonlinear optimization, Monte Carlo
    Date: 2013–01–17
    URL: http://d.repec.org/n?u=RePEc:mia:wpaper:2013-06&r=cmp
  6. By: Petri, Peter A. (Asian Development Bank Institute); Zhai, Fan (Asian Development Bank Institute)
    Abstract: Most projections envision continued rapid growth in the members of the Association of Southeast Asian Nations (ASEAN), the People’s Republic of China (PRC), and India (collectively, ACI) over the next two decades. By 2030, they could quadruple their output, virtually eliminate extreme poverty, and dramatically transform the lives of their more than 3 billion citizens. The impact will be felt across the world. This study—a background paper to an Asian Development Bank report—used a computable general equilibrium (CGE) model to examine the likely effects of the region's growth on trade, resources, and the environment, as well as the implications of the many risks that the region's growth path faces from its internal and external environment.
    Keywords: asean; prc; india; world economy; aci; great transformation; growth engines
    JEL: F02 F13 F33 F53
    Date: 2013–01–23
    URL: http://d.repec.org/n?u=RePEc:ris:adbiwp:0404&r=cmp
  7. By: Jean-Philippe Chancelier (CERMICS - Centre d'Enseignement et de Recherche en Mathématiques et Calcul Scientifique - Ecole des Ponts ParisTech); Jérôme Lelong (LJK - Laboratoire Jean Kuntzmann - CNRS : UMR5224 - Université Joseph Fourier - Grenoble I - Université Pierre Mendès-France - Grenoble II - Institut Polytechnique de Grenoble - Grenoble Institute of Technology); Bernard Lapeyre (CERMICS - Centre d'Enseignement et de Recherche en Mathématiques et Calcul Scientifique - Ecole des Ponts ParisTech)
    Abstract: Financial institutions have massive computations to carry out overnight which are very demanding in terms of CPU consumption. The challenge is to price many different products on a cluster-like architecture. We have used the Premia software to price the financial derivatives. In this work, we explain how Premia can be embedded into Nsp, a scientific software package similar to Matlab, to provide a powerful tool for valuing a whole portfolio. Finally, we have integrated an MPI toolbox into Nsp so that Premia can be used to solve a large set of pricing problems on a cluster. This unified framework can then be used to test different parallel architectures. (An illustrative sketch of the parallel pattern follows this entry.)
    Keywords: Premia; Mpi; Nsp
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:hal:journl:hal-00447845&r=cmp
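    A minimal sketch of the embarrassingly parallel pattern described above: each product in a portfolio is an independent pricing task dispatched to a pool of workers. The Premia/Nsp/MPI specifics are not reproduced here; a toy Monte Carlo Black-Scholes pricer and Python's multiprocessing pool stand in purely for illustration.

      import numpy as np
      from multiprocessing import Pool

      def mc_call_price(task):
          """Price one European call by Monte Carlo under Black-Scholes dynamics."""
          S0, K, r, sigma, T, n_paths, seed = task
          rng = np.random.default_rng(seed)
          z = rng.standard_normal(n_paths)
          ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
          return np.exp(-r * T) * np.mean(np.maximum(ST - K, 0.0))

      if __name__ == "__main__":
          # a "portfolio" of independent pricing problems (one tuple per product; illustrative values)
          portfolio = [(100.0, float(K), 0.03, 0.25, 1.0, 200_000, i)
                       for i, K in enumerate(range(80, 121, 5))]
          with Pool(processes=4) as pool:
              prices = pool.map(mc_call_price, portfolio)   # one independent task per product
          for (S0, K, *_), p in zip(portfolio, prices):
              print(f"K={K}: price={p:.4f}")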
  8. By: D. THORLEUCHTER; D. VAN DEN POEL
    Abstract: In recent years, governmental and industrial espionage has become an increasing problem for governments and corporations. Information about current technology development and research activities is a particularly attractive target for espionage. Thus, we introduce a new, automated methodology that investigates the information leakage risk, with respect to governmental or industrial espionage, of research and technology (R&T) projects processed by an organization. Latent semantic indexing is applied together with machine learning and prediction modeling. This identifies semantic textual patterns representing technologies and their corresponding application fields that are of high relevance for the organization’s strategy. These patterns are used to estimate the organization’s cost of an information leakage for each project. Further, a web mining approach is applied to identify the worldwide distribution of knowledge within the relevant technologies and corresponding application fields. This information is used to estimate the probability that an information leakage occurs. A risk assessment methodology then calculates the information leakage risk for each project. In a case study, the information leakage risk of defense-related R&T projects is investigated, because defense-related R&T is of particular interest to espionage agents. Overall, it can be shown that the proposed methodology is successful in calculating the espionage-related information leakage risk of projects, and thus supports an organization in its espionage risk management. (An illustrative sketch of the latent semantic indexing step follows this entry.)
    Keywords: Latent semantic indexing, SVD, Espionage, Risk assessment
    Date: 2012–12
    URL: http://d.repec.org/n?u=RePEc:rug:rugwps:12/824&r=cmp
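    A minimal sketch of the latent semantic indexing step described above: project descriptions are mapped into a low-rank semantic space via a truncated SVD of the TF-IDF term-document matrix, and a simple classifier is fitted on the semantic factors to flag strategically relevant textual patterns. The tiny corpus, the labels, and the logistic-regression step are illustrative assumptions, not the paper's data or prediction model.

      from sklearn.decomposition import TruncatedSVD
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      # Toy project descriptions and relevance labels (illustrative only)
      docs = [
          "radar signal processing for airborne surveillance",
          "composite materials for lightweight vehicle armour",
          "catering and facility management services",
          "encryption protocols for secure tactical communications",
          "office furniture procurement and logistics",
      ]
      relevant = [1, 1, 0, 1, 0]           # 1 = strategically relevant R&T project

      lsi_model = make_pipeline(
          TfidfVectorizer(),               # term-document matrix
          TruncatedSVD(n_components=3),    # latent semantic indexing via rank-3 SVD
          LogisticRegression(),            # prediction model on the semantic factors
      ).fit(docs, relevant)

      new_doc = ["secure waveform design for military radar communications"]
      print("estimated relevance probability:", lsi_model.predict_proba(new_doc)[0, 1])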

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.