nep-cmp New Economics Papers
on Computational Economics
Issue of 2017‒01‒22
seven papers chosen by



  1. A New Calibration for CORTAX: A computable general equilibrium model for simulating corporate tax reforms By Álvarez-Martínez, María Teresa; Barrios, Salvador; Bettendorf, Leon; d'Andria, Diego; Gesualdo, Maria; Loretz, Simon; Pontikakis, Dimitrios; Pycroft, Jonathan
  2. A Spatial Interpolation Framework for Efficient Valuation of Large Portfolios of Variable Annuities By Seyed Amir Hejazi; Kenneth R. Jackson; Guojun Gan
  3. Parallelizing Computation of Expected Values in Recombinant Binomial Trees By Sai K. Popuri; Andrew M. Raim; Nagaraj K. Neerchal; Matthias K. Gobbert
  4. A Balancing Act; Reform Options for Paraguay’s Fiscal Responsibility Law By Antonio David; Natalija Novta
  5. Optimal Trading with a Trailing Stop By Tim Leung; Hongzhong Zhang
  6. Economic impacts of road investments under different financing alternatives By Tales Rozenfeld; Eduardo A. Haddad
  7. Parsimonious modeling with information filtering networks By Wolfram Barfuss; Guido Previde Massara; T. Di Matteo; Tomaso Aste

  1. By: Álvarez-Martínez, María Teresa (European Commission - JRC); Barrios, Salvador (European Commission - JRC); Bettendorf, Leon (CPB Netherlands); d'Andria, Diego (European Commission - JRC); Gesualdo, Maria (European Commission - JRC); Loretz, Simon (Institute of Advanced Studies (Vienna)); Pontikakis, Dimitrios (European Commission - JRC); Pycroft, Jonathan (European Commission - JRC)
    Abstract: The paper presents a new calibration for CORTAX (short for CORporate TAXation), a computable general equilibrium (CGE) model covering all EU member states, the US, Japan and a tax haven. The CORTAX model was originally built by the Centraal Planbureau (CPB) in the Netherlands based on the earlier OECDTAX model of Sorensen (2001). The calibration presented in this paper updates the base year to 2012. As the previous calibration was for 2007, the two calibrations represent pre- and post-crisis data. CORTAX models many key features of corporate tax regimes, including multinational profit shifting, investment decisions, loss compensation and the debt-equity choice of firms. The model is designed to investigate many aspects of corporate income taxation (CIT), including adjustment or harmonisation of the CIT rate or base and reforms to address the debt bias in CIT. Furthermore, it can examine consolidation of the multinational CIT base, which inter alia addresses some of the issues concerning base erosion and profit shifting (BEPS). Given the choices companies have when confronted with changes in their environments, it is important to assess the effects of reform within a general framework that takes into account the interactions between different parts of internationally open economies, such as the impact of CIT reforms on firms' investment decisions. Indeed, as a computable general equilibrium model, CORTAX simulates all main macroeconomic variables, including GDP, investment and employment. The paper explains the model structure, describes the data used and the calibration method, and provides descriptive statistics for the baseline values of the model, comparing those for 2012 with the previous 2007 values.
    Keywords: Corporate taxation; Computable general equilibrium model; Model calibration; CORTAX
    Date: 2016–12
    URL: http://d.repec.org/n?u=RePEc:ipt:taxref:201609&r=cmp
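    One of the CIT features mentioned in the abstract, the debt bias arising from the deductibility of interest payments, can be illustrated with a small stand-alone calculation. The Python sketch below is not part of CORTAX; it uses the standard textbook cost-of-capital argument, and all parameter values are hypothetical.

# Minimal illustrative sketch (not CORTAX itself): the "debt bias" in corporate
# income taxation arises because interest payments are deductible from the CIT
# base while the return on equity is not. All parameter values are hypothetical.

def cost_of_capital(cit_rate, interest_rate, required_return, debt_share):
    """Weighted after-tax financing cost of a marginal investment.

    Debt-financed portion: interest is deductible, so its after-tax cost is
    interest_rate * (1 - cit_rate). Equity-financed portion: the required
    return must be earned out of taxed profits, so its pre-tax cost is
    required_return / (1 - cit_rate).
    """
    debt_cost = interest_rate * (1.0 - cit_rate)
    equity_cost = required_return / (1.0 - cit_rate)
    return debt_share * debt_cost + (1.0 - debt_share) * equity_cost

if __name__ == "__main__":
    for cit in (0.20, 0.30):
        for b in (0.0, 0.5, 1.0):
            c = cost_of_capital(cit_rate=cit, interest_rate=0.04,
                                required_return=0.04, debt_share=b)
            print(f"CIT={cit:.0%}  debt share={b:.0%}  cost of capital={c:.3%}")
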
  2. By: Seyed Amir Hejazi; Kenneth R. Jackson; Guojun Gan
    Abstract: Variable Annuity (VA) products expose insurance companies to considerable risk because of the guarantees they provide to buyers of these products. Managing and hedging these risks requires insurers to find the value of key risk metrics for a large portfolio of VA products. In practice, many companies rely on nested Monte Carlo (MC) simulations to find key risk metrics. MC simulations are computationally demanding, forcing insurance companies to invest hundreds of thousands of dollars in computational infrastructure per year. Moreover, existing academic methodologies are focused on fair valuation of a single VA contract, exploiting ideas in option theory and regression. In most cases, the computational complexity of these methods surpasses the computational requirements of MC simulations. Therefore, academic methodologies cannot scale well to large portfolios of VA contracts. In this paper, we present a framework for valuing such portfolios based on spatial interpolation. We provide a comprehensive study of this framework and compare existing interpolation schemes. Our numerical results show superior performance, in terms of both computational efficiency and accuracy, for these methods compared to nested MC simulations. We also present insights into the challenge of finding an effective interpolation scheme in this framework, and suggest guidelines that help us build a fully automated scheme that is efficient and accurate.
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1701.04134&r=cmp
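    The core idea of the framework, valuing a small set of representative contracts exactly and interpolating the rest in the contract-feature space, can be sketched as follows. This is a generic illustration using inverse-distance weighting on synthetic data, not the paper's specific interpolation schemes or valuation engine; the contract features and the stand-in valuation function are hypothetical.

# Sketch of the spatial-interpolation idea: value a small set of representative
# VA contracts by (expensive) Monte Carlo, then estimate the rest of the
# portfolio by interpolating in the contract-feature space. Inverse-distance
# weighting is used here for simplicity; all data below are synthetic.

import numpy as np

rng = np.random.default_rng(0)

def expensive_mc_value(features):
    # Deterministic stand-in for a nested Monte Carlo valuation of one contract.
    age, account_value, guarantee = features
    return 0.05 * account_value * max(guarantee / account_value - 1.0, 0.0) \
        + 0.001 * age * account_value

def idw_estimate(rep_scaled, rep_values, query_scaled, power=2.0, eps=1e-12):
    """Inverse-distance-weighted estimate of one contract's value."""
    d = np.linalg.norm(rep_scaled - query_scaled, axis=1)
    w = 1.0 / (d ** power + eps)
    return float(w @ rep_values / w.sum())

# Synthetic portfolio: columns are (age, account value, guarantee level).
portfolio = np.column_stack([
    rng.uniform(40, 70, 1000),
    rng.uniform(50_000, 500_000, 1000),
    rng.uniform(50_000, 500_000, 1000),
])
scaled = (portfolio - portfolio.mean(axis=0)) / portfolio.std(axis=0)

# Value only a small set of representative contracts exactly; interpolate the rest.
rep_idx = rng.choice(len(portfolio), size=50, replace=False)
rep_values = np.array([expensive_mc_value(f) for f in portfolio[rep_idx]])

estimate = sum(idw_estimate(scaled[rep_idx], rep_values, q) for q in scaled)
exact = sum(expensive_mc_value(f) for f in portfolio)
print(f"interpolated portfolio value: {estimate:,.0f}  (exact: {exact:,.0f})")
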
  3. By: Sai K. Popuri; Andrew M. Raim; Nagaraj K. Neerchal; Matthias K. Gobbert
    Abstract: Recombinant binomial trees are binary trees in which each non-leaf node has two child nodes, but adjacent parents share a common child node. Such trees arise in finance when pricing an option. For example, valuation of a European option can be carried out by evaluating the expected value of asset payoffs with respect to random paths in the tree. In many variants of the option valuation problem, a closed-form solution cannot be obtained and computational methods are needed. The cost of exactly computing expected values over random paths grows exponentially in the depth of the tree, rendering a serial computation of one branch at a time impractical. We propose a parallelization method that transforms the calculation of the expected value into an "embarrassingly parallel" problem by mapping the branches of the binomial tree to the processes in a multiprocessor computing environment. We also propose a parallel Monte Carlo method which takes advantage of this mapping to achieve a reduced variance over the basic Monte Carlo estimator. Performance results from R and Julia implementations of the parallelization method on a distributed computing cluster indicate that both implementations are scalable, but the Julia implementation is significantly faster than similarly written R code. A simulation study is carried out to verify the convergence and the variance-reduction behavior of the proposed Monte Carlo method.
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1701.03512&r=cmp
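    The branch-to-process mapping described in the abstract can be illustrated with a simplified Python sketch: the first k moves of the tree define 2^k branches, each handed to a worker process that enumerates its sub-paths exactly. The example prices a European call under a CRR tree with hypothetical parameters; it is not the authors' R or Julia code, and for a path-independent payoff backward induction would of course be cheaper, so the path enumeration stands in for the path-dependent variants the paper targets.

# Simplified sketch of the "embarrassingly parallel" branch mapping: the first
# k moves of the binomial tree define 2**k branches, each sent to a worker
# process that enumerates its sub-paths exactly. Parameter values are
# hypothetical.

import math
from itertools import product
from multiprocessing import Pool

S0, K, r, sigma, T, n = 100.0, 100.0, 0.05, 0.2, 1.0, 20   # n tree steps
dt = T / n
u = math.exp(sigma * math.sqrt(dt))
d = 1.0 / u
p = (math.exp(r * dt) - d) / (u - d)        # risk-neutral up probability

def branch_expectation(prefix):
    """Exact expected discounted payoff over all paths sharing this prefix."""
    total = 0.0
    for tail in product((0, 1), repeat=n - len(prefix)):
        path = prefix + tail
        ups = sum(path)
        prob = p ** ups * (1.0 - p) ** (n - ups)
        payoff = max(S0 * u ** ups * d ** (n - ups) - K, 0.0)
        total += prob * payoff
    return math.exp(-r * T) * total

if __name__ == "__main__":
    k = 4                                    # split the first k moves across workers
    prefixes = list(product((0, 1), repeat=k))
    with Pool() as pool:
        price = sum(pool.map(branch_expectation, prefixes))
    print(f"European call price (exact path enumeration): {price:.4f}")
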
  4. By: Antonio David; Natalija Novta
    Abstract: Paraguay faces a trade-off between building fiscal credibility and amending the existing fiscal rule to accommodate infrastructure investment and provide space for countercyclical policies. In this paper, we discuss several alternative fiscal rules for Paraguay and present simulations of debt trajectories in each case, assuming a baseline and three deterministic shock scenarios. We provide a supplementary Excel file to replicate debt simulations under different fiscal rules. The results suggest that potential modifications to make the fiscal rules more flexible in Paraguay should be accompanied by a number of safeguards that enhance credibility of the fiscal anchor and preserve sustainability.
    Keywords: Fiscal responsibility law; Paraguay; Fiscal rules; Fiscal policy; Fiscal governance
    Date: 2016–11–16
    URL: http://d.repec.org/n?u=RePEc:imf:imfwpa:16/226&r=cmp
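    Debt-trajectory simulations of the kind described in the abstract follow standard debt dynamics, d_{t+1} = d_t (1+r)/(1+g) - pb_t, with the primary balance pb implied by the fiscal rule. The sketch below is a generic version under a simple headline-deficit ceiling; the paper's own rule variants are implemented in its supplementary Excel file, and all numbers here are hypothetical rather than Paraguay's actual data.

# Generic debt-dynamics sketch (not the paper's supplementary Excel file).
# Debt evolves as d_{t+1} = d_t * (1 + r) / (1 + g) - pb_t, where pb is the
# primary balance implied by the fiscal rule. All numbers are hypothetical.

def simulate_debt(d0, years, r, g, deficit_ceiling, interest_share):
    """Debt-to-GDP path under a simple headline-deficit ceiling (in % of GDP)."""
    path, d = [d0], d0
    for _ in range(years):
        interest = interest_share * d                 # rough interest bill, % of GDP
        primary_balance = interest - deficit_ceiling  # rule assumed to bind each year
        d = d * (1 + r) / (1 + g) - primary_balance
        path.append(d)
    return path

baseline = simulate_debt(d0=25.0, years=10, r=0.05, g=0.04,
                         deficit_ceiling=1.5, interest_share=0.05)
shock = simulate_debt(d0=25.0, years=10, r=0.07, g=0.01,
                      deficit_ceiling=1.5, interest_share=0.07)
print("baseline debt path:", [round(x, 1) for x in baseline])
print("shock debt path:   ", [round(x, 1) for x in shock])
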
  5. By: Tim Leung; Hongzhong Zhang
    Abstract: A trailing stop is a popular stop-loss trading strategy whereby the investor sells the asset once its price experiences a pre-specified percentage drawdown. In this paper, we study the problem of timing the purchase and subsequent sale of an asset subject to a trailing stop. Under a general linear diffusion framework, we study an optimal double stopping problem with a random path-dependent maturity. Specifically, we first derive the optimal liquidation strategy prior to a given trailing stop, and prove the optimality of using a sell limit order in conjunction with the trailing stop. Our analytic results for the liquidation problem are then used to solve for the optimal strategy to acquire the asset and simultaneously initiate the trailing stop. The method of solution also lends itself to an efficient numerical method for computing the optimal acquisition and liquidation regions. For illustration, we implement an example and conduct a sensitivity analysis under the exponential Ornstein-Uhlenbeck model.
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1701.03960&r=cmp
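    The trailing-stop mechanism itself is easy to simulate: hold the asset, track its running maximum since entry, and sell once the price falls a fixed percentage below that maximum. The sketch below does only this, under an exponential Ornstein-Uhlenbeck price with hypothetical parameters; it does not solve the paper's optimal double stopping problem for the acquisition and liquidation boundaries.

# Simulation of the trailing-stop exit rule under an exponential
# Ornstein-Uhlenbeck price. Illustrative only; parameters are hypothetical.

import math, random

def exp_ou_path(x0, mu, theta, sigma, dt, n_steps, rng):
    """Price S_t = exp(X_t), where X follows dX = mu*(theta - X)dt + sigma dW."""
    x, path = math.log(x0), [x0]
    for _ in range(n_steps):
        x += mu * (theta - x) * dt + sigma * math.sqrt(dt) * rng.gauss(0, 1)
        path.append(math.exp(x))
    return path

def trailing_stop_exit(path, drawdown_pct):
    """Return (exit_index, exit_price): sell once the price falls drawdown_pct
    below its running maximum since entry."""
    running_max = path[0]
    for i, s in enumerate(path):
        running_max = max(running_max, s)
        if s <= running_max * (1.0 - drawdown_pct):
            return i, s
    return len(path) - 1, path[-1]           # never triggered: exit at the end

rng = random.Random(42)
path = exp_ou_path(x0=10.0, mu=2.0, theta=math.log(11.0), sigma=0.3,
                   dt=1 / 252, n_steps=252, rng=rng)
i, price = trailing_stop_exit(path, drawdown_pct=0.10)
print(f"trailing stop triggered at step {i}, exit price {price:.2f}")
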
  6. By: Tales Rozenfeld; Eduardo A. Haddad
    Abstract: This study explores the issue of road infrastructure funding, analyzing the impact of financing a road improvement project through a toll tariff charged to final users vis-à-vis financing through an increase in the country's payroll tax rate. Using a transport model integrated with an interregional computable general equilibrium model, this research simulated alternative arrangements for financing investments in BR-040, a Brazilian road granted in concession by the Federal Government, which serves as the case study for this research. The results indicate that the way the investment is financed matters for the regionally distributed impacts of the project and is decisive in defining which regions benefit from the improvement project. At the aggregate national level, the scenario with the greatest impact on Brazil's GDP growth is the one in which the investment is paid for by road users through the toll tariff. From a regional perspective, a clear area of influence that benefits from the improvements to the road can be identified and, when the costs of executing such improvements are shared with the whole country through a tax increase, these benefits are accentuated.
    Keywords: regional economics; transport policy; general equilibrium
    JEL: R13 R42 C68
    Date: 2016–12–16
    URL: http://d.repec.org/n?u=RePEc:spa:wpaper:2016wpecon42&r=cmp
  7. By: Wolfram Barfuss; Guido Previde Massara; T. Di Matteo; Tomaso Aste
    Abstract: We introduce a methodology to construct parsimonious probabilistic models. This method makes use of information filtering networks to produce a robust estimate of the global sparse inverse covariance from a simple sum of local inverse covariances computed on small subparts of the network. Being based on local and low-dimensional inversions, this method is computationally very efficient and statistically robust, even for the estimation of the inverse covariance of high-dimensional, noisy and short time series. Applied to financial data, our method is computationally more efficient than state-of-the-art methodologies such as Glasso, producing, in a fraction of the computation time, models that can have equivalent or better performance but with a sparser inference structure. We also discuss performance with sparse factor models, where we observe that relative performance decreases with the number of factors. The local nature of this approach allows us to perform computations in parallel and provides a tool for dynamical adaptation by partial updating when the properties of some variables change, without the need to recompute the whole model. This makes the approach particularly suitable for handling big data sets with large numbers of variables. Examples of practical application for forecasting, stress testing and risk allocation in financial systems are also provided.
    JEL: F3 G3
    Date: 2016–12–13
    URL: http://d.repec.org/n?u=RePEc:ehl:lserod:68860&r=cmp
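    The "sum of local inverse covariances" construction can be sketched directly: for a decomposable (chordal) graph, the sparse inverse covariance estimate is the sum of the inverted clique covariances minus the inverted separator covariances, each padded back into the full matrix. In the paper the cliques and separators come from an information filtering network (a TMFG); the sketch below instead uses a small hand-picked chordal structure on synthetic data.

# Sketch of the local-inversion idea: for a chordal graph, the sparse inverse
# covariance is the sum of inverted clique covariances minus inverted separator
# covariances, each padded into the full matrix. The clique/separator structure
# below is hand-picked for illustration (the paper derives it from a TMFG).

import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((500, 5))           # synthetic observations (T x N)
X[:, 1] += 0.5 * X[:, 0]                    # introduce some correlation
X[:, 3] += 0.5 * X[:, 2]
S = np.cov(X, rowvar=False)                 # sample covariance

cliques = [(0, 1, 2), (1, 2, 3), (2, 3, 4)]
separators = [(1, 2), (2, 3)]

def embed_inverse(S, idx, n):
    """Invert the sub-covariance on idx and pad it into an n x n matrix."""
    J = np.zeros((n, n))
    J[np.ix_(idx, idx)] = np.linalg.inv(S[np.ix_(idx, idx)])
    return J

n = S.shape[0]
J = sum(embed_inverse(S, c, n) for c in cliques) \
    - sum(embed_inverse(S, s, n) for s in separators)

print("sparsity pattern of the estimated inverse covariance:")
print((np.abs(J) > 1e-10).astype(int))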

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.