New Economics Papers
on Computational Economics
Issue of 2012‒11‒03
sixteen papers chosen by



  1. The comparison of optimization algorithms on unit root testing with smooth transition By Omay, Tolga
  2. Forecasting the Index of Financial Safety (IFS) of South Africa using neural networks By Matkovskyy, Roman
  3. Determination the Parameters of Markowitz Portfolio Optimization Model By Ertugrul Bayraktar; Ayse Humeyra Bilge
  4. Efficient simulation of DSGE models with inequality constraints By Tom Holden; Michael Paetz
  5. Approximating the Price Effects of Mergers: Numerical Evidence and an Empirical Application By Nathan H. Miller; Conor Ryan; Marc Remer; Gloria Sheu
  6. Integrating Water Resources into Computable General Equilibrium Models - A Survey By Roberto Ponce; Francesco Bosello; Carlo Giupponi
  7. Pricing Interest Rate Derivatives in a Multifactor HJM Model with Time-Dependent Volatility By Ingo Beyna; Carl Chiarella
  8. An Agent Based Decentralized Matching Macroeconomic Model By Riccetti, Luca; Russo, Alberto; Gallegati, Mauro
  9. The Epistemology of Simulation, Computation and Dynamics in Economics By K.Vela Velupillai
  10. How Do World Agricultural Commodity Price Spikes Affect the Income Distribution in Israel? By Grethe, Harald; Siddig, Khalid; Goetz, Linde; Ihle, Rico
  11. The Relevance of Computation Irreducibility as Computation Universality in Economics By K. Vela Velupillai
  12. Can Numerical Models Estimate Indirect Land-use Change? By Thierry Brunelle; Patrice Dumas
  13. The Ricardo-Lemke parametric algorithm on oddity and uniqueness By Christian Bidard
  14. Reconciling Performance and Interpretability in Customer Churn Prediction using Ensemble Learning based on Generalized Additive Models By K. W. DE BOCK; D. VAN DEN POEL
  15. Pension reform in an OLG model with heterogeneous abilities By T. BUYSE; F. HEYLEN; R. VAN DE KERCKHOVE
  16. Assessing the Resilience of ASEAN Banking Systems: the Case of the Philippines By Albert, Jose Ramon G.; Ng, Thiam Hee

  1. By: Omay, Tolga
    Abstract: The aim of this study is to search for a better optimization algorithm for applying unit root tests that inherit nonlinear models in the testing process. The algorithms analyzed include Broyden, Fletcher, Goldfarb and Shanno (BFGS), Gauss-Jordan, Simplex, Genetic, and Extensive Grid-Search. The simulation results indicate that the derivative-free methods, such as Genetic and Simplex, have advantages over hill-climbing methods, such as BFGS and Gauss-Jordan, in obtaining accurate critical values for the Leybourne, Newbold and Vougas (1996, 1998) (LNV) and Sollis (2004) unit root tests. Moreover, when parameters are estimated under the alternative hypothesis of the LNV-type unit root tests, the derivative-free methods lead to unbiased and efficient estimators, as opposed to those obtained from other algorithms. Finally, the empirical analyses show that the derivative-free methods, hill climbing and simple grid search can be used interchangeably when testing for a unit root, since all three optimization methods lead to the same empirical test results.
    Keywords: Nonlinear trend; Deterministic smooth transition; Structural change; Estimation methods
    JEL: C15 C22 C01
    Date: 2012–10–22
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:42129&r=cmp
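    A minimal sketch of the estimation step being compared (simulated data and illustrative parameter values, not the paper's Monte Carlo design): fitting the logistic smooth transition trend of an LNV-type test by nonlinear least squares, once with SciPy's derivative-free simplex (Nelder-Mead) and once with gradient-based BFGS.

      import numpy as np
      from scipy.optimize import minimize

      # Sum of squared errors for the smooth transition trend
      # y_t = a1 + a2 * S_t(gamma, tau), with S_t logistic in scaled time t/T.
      def sse(params, y):
          a1, a2, gamma, tau = params
          t = np.arange(1, len(y) + 1) / len(y)
          S = 1.0 / (1.0 + np.exp(-gamma * (t - tau)))
          return np.sum((y - a1 - a2 * S) ** 2)

      rng = np.random.default_rng(1)
      t = np.arange(1, 201) / 200
      y = 1.0 + 2.0 / (1.0 + np.exp(-10.0 * (t - 0.5))) + rng.normal(0.0, 0.3, 200)

      x0 = np.array([0.0, 1.0, 5.0, 0.5])
      for method in ("Nelder-Mead", "BFGS"):  # derivative-free vs. hill-climbing
          res = minimize(sse, x0, args=(y,), method=method)
          print(method, np.round(res.x, 3), round(res.fun, 3))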
  2. By: Matkovskyy, Roman
    Abstract: This paper investigates neural network tools, especially the nonlinear autoregressive model with exogenous input (NARX), to forecast the future conditions of the Index of Financial Safety (IFS) of South Africa. Based on the time series used to construct the IFS for South Africa (Matkovskyy, 2012), a NARX model was built to forecast the future values of this index, and the results are benchmarked against those of Bayesian vector autoregressive (BVAR) models. The results show that the NARX model applied to the IFS of South Africa and trained by the Levenberg-Marquardt algorithm can deliver forecasts of adequate quality at lower computational expense than BVAR models with different priors.
    Keywords: Index of Financial Safety (IFS); neural networks; nonlinear dynamic network (NDN); nonlinear autoregressive model with exogenous input (NARX); forecast
    JEL: C45 E44 G01
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:42153&r=cmp
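    A hedged sketch of the forecasting setup (illustrative data; scikit-learn's MLPRegressor stands in for the NARX network, and its 'lbfgs' solver for Levenberg-Marquardt training, which scikit-learn does not provide): predict the next IFS value from lags of the index and of exogenous inputs.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      # Build a NARX-style design matrix: lags of the target plus lags of
      # exogenous inputs feed a small feedforward network.
      def make_lagged(y, X, lags=4):
          rows = [np.concatenate([y[t - lags:t], X[t - lags:t].ravel()])
                  for t in range(lags, len(y))]
          return np.array(rows), y[lags:]

      rng = np.random.default_rng(0)
      y = rng.normal(size=200).cumsum()      # placeholder for the IFS series
      X = rng.normal(size=(200, 2))          # placeholder exogenous inputs
      features, target = make_lagged(y, X)

      model = MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                           max_iter=2000, random_state=0).fit(features, target)
      print("one-step-ahead forecast:", model.predict(features[-1:]))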
  3. By: Ertugrul Bayraktar; Ayse Humeyra Bilge
    Abstract: The main purpose of this study is to determine the optimal length of the historical data window for estimating the statistical parameters in Markowitz portfolio optimization. We present a trading simulation using the Markowitz method for a portfolio consisting of foreign currency exchange rates and selected assets from the Istanbul Stock Exchange ISE 30, over the period 2001-2009. In the simulation, the expected returns and the covariance matrix are computed from historical data observed over the past n days, and the target returns are chosen as multiples of the return of the market index. The trading strategy is to buy a stock if the simulation results in a feasible solution and to sell the stock after exactly m days, independently of market conditions. The actual returns are computed for n and m equal to 21, 42, 63, 84 and 105 days, and we find that the best return is obtained when the observation period is 2 or 3 times the investment period.
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1210.5859&r=cmp
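    For concreteness, a sketch of one simulation step (illustrative returns; the closed form below handles only the two equality constraints, so a feasibility check with further restrictions such as no short sales would need a QP solver instead):

      import numpy as np

      # Minimum-variance weights for a target return: solve
      # min w'Σw  s.t.  w'μ = target, w'1 = 1 (Lagrangian closed form).
      def markowitz_weights(returns, target):
          mu = returns.mean(axis=0)
          inv = np.linalg.inv(np.cov(returns, rowvar=False))
          ones = np.ones(len(mu))
          A, B, C = ones @ inv @ ones, ones @ inv @ mu, mu @ inv @ mu
          D = A * C - B * B
          lam = (C - B * target) / D
          gam = (A * target - B) / D
          return inv @ (lam * ones + gam * mu)

      rng = np.random.default_rng(2)
      hist = rng.normal(0.0005, 0.02, size=(63, 5))  # past n = 63 days, 5 assets
      w = markowitz_weights(hist, target=0.001)
      print("weights:", np.round(w, 3), "sum:", round(w.sum(), 6))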
  4. By: Tom Holden (University of Surrey); Michael Paetz (University of Hamburg)
    Abstract: This paper presents a fast, simple and intuitive algorithm for the simulation of linear dynamic stochastic general equilibrium models with inequality constraints. The algorithm handles both the computation of impulse responses and stochastic simulation, and can deal with arbitrarily many bounded variables. Furthermore, it is able to capture the precautionary motive associated with the risk of hitting such a bound. To illustrate the usefulness and efficiency of the algorithm we provide a variety of applications, including models incorporating a zero lower bound (ZLB) on nominal interest rates. Our procedure is much faster than comparable methods and can readily handle large models. We therefore expect this algorithm to be useful in a wide variety of applications.
    JEL: C63 E32 E43 E52
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:sur:surrec:1612&r=cmp
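    A stylized sketch of the underlying idea (emphatically not the authors' algorithm, which also captures the precautionary motive): treat the bound as a set of anticipated 'news' shocks whose magnitudes are raised until the simulated linear path respects the constraint. The matrix M and the path below are illustrative.

      import numpy as np

      # path: unconstrained linear path of the bounded variable;
      # M[i, j]: response at horizon i to an anticipated shock at horizon j.
      # The projected-Jacobi loop only enforces the bound; a proper LCP
      # solver would also guarantee exact complementarity.
      def enforce_bound(path, M, tol=1e-10, max_iter=10_000):
          alpha = np.zeros(len(path))       # shock magnitudes, kept >= 0
          for _ in range(max_iter):
              y = path + M @ alpha          # candidate bounded path
              viol = np.minimum(y, 0.0)
              if (viol > -tol).all():
                  return y, alpha
              alpha = np.maximum(alpha - viol / np.diag(M), 0.0)
          raise RuntimeError("no bounded path found")

      H = 8
      M = 0.2 * 0.5 ** np.abs(np.subtract.outer(np.arange(H), np.arange(H))) \
          + 0.8 * np.eye(H)
      path = np.array([-1.0, -0.6, -0.2, 0.1, 0.3, 0.4, 0.5, 0.6])
      y, alpha = enforce_bound(path, M)
      print("bounded path:", np.round(y, 3))
      print("news shocks: ", np.round(alpha, 3))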
  5. By: Nathan H. Miller (Economic Analysis Group, Antitrust Division, U.S. Department of Justice); Conor Ryan (Economic Analysis Group, Antitrust Division, U.S. Department of Justice); Marc Remer (Economic Analysis Group, Antitrust Division, U.S. Department of Justice); Gloria Sheu (Economic Analysis Group, Antitrust Division, U.S. Department of Justice)
    Abstract: We analyze the accuracy of first order approximation, a method developed theoretically in Jaffe and Weyl (2012) for predicting the price effects of mergers, and provide an empirical application. Approximation is an alternative to the model-based simulations commonly employed in industrial economics. It provides predictions that are free from functional form assumptions, using data on either cost pass-through or demand curvature in the neighborhood of the initial equilibrium. Our numerical experiments indicate that approximation is more accurate than simulations that use incorrect structural assumptions on demand. For instance, when the true underlying demand system is logit, approximation is more accurate than almost ideal demand system (AIDS) simulation in 79.1 percent of the randomly-drawn industries and more accurate than linear simulation in 90.3 percent of these industries. We also show, among other results, (i) how accuracy changes across a variety of economic environments, (ii) how accuracy is affected by incomplete data on cost pass-through, and (iii) that a simplified version of approximation provides conservative predictions of price increases.
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:doj:eagpap:201208&r=cmp
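    The flavor of a first-order calculation can be conveyed in a few lines (a stylized sketch in the spirit of the approach, with made-up inputs; the paper's method is more general and handles multi-product pass-through):

      # Approximate a merging firm's price increase as the pass-through of
      # the upward pricing pressure created by recapturing diverted sales.
      # All inputs are illustrative, estimated near the initial equilibrium.
      def approx_price_change(diversion, margin, price_other, pass_through):
          upp = diversion * margin * price_other   # opportunity-cost wedge
          return pass_through * upp

      dp = approx_price_change(diversion=0.2, margin=0.4,
                               price_other=10.0, pass_through=0.5)
      print(f"approximate price increase: {dp:.2f}")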
  6. By: Roberto Ponce (Department of Economics, Ca’ Foscari University, Italy); Francesco Bosello (Fondazione Eni Enrico Mattei, Italy); Carlo Giupponi (Department of Economics, Ca’ Foscari University, Italy)
    Abstract: Water resources face several stresses in terms of quantity and quality. These pressures are closely related to human interventions in fields such as agriculture, land use and land-use change, construction and management of reservoirs, pollutant emissions, and water and wastewater treatment. Considering the critical role that water plays in agricultural production, any shock to water availability will have great implications for agricultural production, and through agricultural markets these impacts will spread to the whole economy. The aim of this report is to present a literature review of the state-of-the-art methodology for studying water issues with the CGE approach at the global and national scales. The analysis of the different studies confirms the economy-wide consequences of changes in water allocation, irrigation policies, and climate change, among other water-related issues.
    Keywords: Computable General Equilibrium Models, Water, Irrigation, Agricultural Policy, Water Allocation
    JEL: C68 Q18 Q25 Q54
    Date: 2012–09
    URL: http://d.repec.org/n?u=RePEc:fem:femwpa:2012.57&r=cmp
  7. By: Ingo Beyna (Centre for Practical Quantitative Finance, Frankfurt School of Finance and Management); Carl Chiarella (Finance Discipline Group, UTS Business School, University of Technology, Sydney)
    Abstract: We investigate the partial differential equation (PDE) for pricing interest rate derivatives in the multi-factor Cheyette model, which involves time-dependent volatility functions with a special structure. The resulting high-dimensional parabolic PDE is solved numerically via a modified sparse grid approach, which turns out to be accurate and efficient. In addition we study the corresponding Monte Carlo simulation, which is fast since the distribution of the state variables can be calculated explicitly. The results obtained from both methodologies are compared to the known analytical solutions for bonds and caplets. Where no analytical solution exists, European and Bermudan swaptions are evaluated using the sparse grid PDE approach, which is shown to outperform the Monte Carlo simulation.
    Keywords: Cheyette model; Gaussian HJM; multi-factor model; PDE valuation; sparse grid; Monte Carlo simulation
    Date: 2012–10–01
    URL: http://d.repec.org/n?u=RePEc:uts:rpaper:317&r=cmp
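    A toy cross-check in the spirit of the paper's bond comparison (a one-factor Vasicek model as a drastically simplified Gaussian stand-in for the multi-factor Cheyette setting; parameters are illustrative): price a zero-coupon bond by Monte Carlo and against the closed form.

      import numpy as np

      kappa, theta, sigma, r0, T = 0.5, 0.03, 0.01, 0.02, 5.0
      n_paths, n_steps = 50_000, 500
      dt = T / n_steps

      # Euler simulation of dr = kappa*(theta - r)dt + sigma dW,
      # discounting with the integrated short rate along each path.
      rng = np.random.default_rng(5)
      r = np.full(n_paths, r0)
      integral = np.zeros(n_paths)
      for _ in range(n_steps):
          integral += r * dt
          r += kappa * (theta - r) * dt \
               + sigma * np.sqrt(dt) * rng.normal(size=n_paths)
      mc_price = np.exp(-integral).mean()

      # Closed-form Vasicek zero-coupon bond price for comparison.
      B = (1 - np.exp(-kappa * T)) / kappa
      A = np.exp((theta - sigma**2 / (2 * kappa**2)) * (B - T)
                 - sigma**2 * B**2 / (4 * kappa))
      print(f"Monte Carlo: {mc_price:.5f}  analytic: {A * np.exp(-B * r0):.5f}")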
  8. By: Riccetti, Luca; Russo, Alberto; Gallegati, Mauro
    Abstract: In this paper we present a microfounded macroeconomic framework with heterogeneous agents (households, firms, banks) which interact through a decentralized matching process with common features across four markets: goods, labor, credit and deposits. We study the dynamics of the model by means of computer simulation. Several macroeconomic properties emerge, such as endogenous business cycles, nominal GDP growth, unemployment rate fluctuations, the Phillips curve, leverage cycles and credit constraints, bank defaults and financial instability, and the importance of government as an acyclical sector that stabilizes the economy. The model highlights that even extended crises can emerge endogenously. In these cases, the system may remain trapped in a state of high unemployment, with no possibility of a quick recovery absent exogenous intervention.
    Keywords: agent-based macroeconomics; business cycle; crisis; unemployment; leverage
    JEL: E32 C63
    Date: 2012–10
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:42211&r=cmp
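    A toy version of one decentralized matching round, for the labor market only (the paper runs a similar protocol across all four markets; the numbers below are illustrative, not the authors' calibration):

      import numpy as np

      rng = np.random.default_rng(4)
      n_workers, n_firms = 100, 20
      reservation = rng.uniform(0.8, 1.2, n_workers)  # reservation wages
      offers = rng.uniform(0.9, 1.3, n_firms)         # firms' wage offers
      unmatched = list(range(n_workers))

      # Each firm samples a handful of applicants and hires the cheapest
      # one whose reservation wage it can afford.
      for f in range(n_firms):
          if not unmatched:
              break
          sample = rng.choice(unmatched, size=min(5, len(unmatched)),
                              replace=False)
          affordable = [w for w in sample if reservation[w] <= offers[f]]
          if affordable:
              unmatched.remove(min(affordable, key=lambda w: reservation[w]))

      print(f"unemployment rate: {len(unmatched) / n_workers:.2%}")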
  9. By: K.Vela Velupillai
    Abstract: Computation and simulation have always played a role in economics, whether in pure economic theory or in any variant of applied, especially policy-oriented, macro- and microeconomics, or in what has increasingly come to be called empirical economics. This is a tradition that can, without too much difficulty, be traced to the spirit and vision of William Petty, the founding father of Political Economy as Political Arithmetic, whose finest exponent in modern times was, in my opinion, Richard Stone, and whose living custodian is Lance Taylor. In this paper their spirit is the driving force, but it is given new theoretical foundations, mainly as a result of developments in the mathematical underpinnings of the tremendous advances in the potential of computing, especially digital technology. A running theme in this essay is the recognition, never neglected by Petty, Stone or Taylor, that the development of economic theory increasingly goes hand in hand with advances in the theory and practice of computing, which is, in turn, a catalyst for the move away from excessive reliance on any kind of mathematics for the formalisation of economic entities that is inconsistent with the mathematical, philosophical and epistemological foundations of the digital computer.
    Keywords: Simulation, Computation, Computable Analysis, Dynamics, Proof, Algorithm
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:trn:utwpas:1218&r=cmp
  10. By: Grethe, Harald; Siddig, Khalid; Goetz, Linde; Ihle, Rico
    Abstract: We assess the distributional effects of the transmission of world market price shocks for the highly import-dependent economy of Israel. We combine a CGE simulation with an empirical cointegration analysis to assess the direction and extent of the connectedness of Israeli and world market prices. The Israeli and world markets for wheat are found to be integrated: price shocks are completely transmitted from the world market to the domestic Israeli market. We find negative effects on domestic household income, consumption and welfare. Regressive expenditure effects dominate progressive income effects, so that the resulting domestic income distribution becomes more unequal.
    Keywords: Agricultural trade, CGE, commodity prices, income distribution, Israel, Middle East, price transmission, Agricultural and Food Policy, International Relations/Trade
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:ags:gewi12:137154&r=cmp
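    A sketch of the price-transmission step (simulated series standing in for the paper's Israeli and world wheat prices): an Engle-Granger test of whether the two prices share a common stochastic trend.

      import numpy as np
      from statsmodels.tsa.stattools import coint

      rng = np.random.default_rng(3)
      world = np.cumsum(rng.normal(size=300))        # world price, random walk
      domestic = 0.9 * world + rng.normal(size=300)  # transmitted plus noise

      # Engle-Granger cointegration test: small p-value -> integrated markets
      tstat, pvalue, _ = coint(domestic, world)
      print(f"Engle-Granger t = {tstat:.2f}, p = {pvalue:.3f}")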
  11. By: K. Vela Velupillai
    Abstract: Stephen Wolfram’s A New Kind of Science should have made a greater impact in economics, at least in its theorising and computational modes, than it seems to have. There are those who subscribe to varieties of agent-based modelling and who do refer to Wolfram’s paradigms (a word I use with the utmost trepidation) whenever simulational exercises within a framework of cellular automata are invoked to make claims about complexity, emergence, holism, reduction and many such buzzwords. Very few of these exercises, and their practitioners, seem to be aware of the deep mathematical, and even metamathematical, underpinnings of Wolfram’s innovative concepts, particularly of computational equivalence and computational irreducibility, in the works of Turing and Ulam. Some threads of these foundational underpinnings are woven together here to form a possible tapestry for economic theorising and modelling in computable modes.
    Keywords: Computational equivalence, Computational irreducibility, Computation universality.
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:trn:utwpas:1212&r=cmp
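    As a minimal illustration of the kind of simulation the essay discusses, here is an elementary cellular automaton (Rule 110, proven computation-universal): for a computationally irreducible rule, there is no shortcut to running the dynamics forward step by step.

      import numpy as np

      # Evolve a one-dimensional binary CA: each cell's next state is the
      # rule-table entry indexed by its (left, centre, right) neighbourhood.
      def run_ca(rule=110, width=64, steps=32):
          table = [(rule >> i) & 1 for i in range(8)]
          state = np.zeros(width, dtype=int)
          state[width // 2] = 1                # single seed cell
          history = [state.copy()]
          for _ in range(steps):
              left, right = np.roll(state, 1), np.roll(state, -1)
              state = np.array([table[4 * l + 2 * c + r]
                                for l, c, r in zip(left, state, right)])
              history.append(state.copy())
          return np.array(history)

      for row in run_ca(steps=12):
          print("".join("#" if c else "." for c in row))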
  12. By: Thierry Brunelle (Centre International de Recherche sur l'Environnement et le Développement); Patrice Dumas (Centre International de Recherche sur l'Environnement et le Développement and Centre de Coopération Internationale en Recherche Agronomique pour le Développement)
    Abstract: Motivated by the conclusions of various modelling studies, modifications to bioenergy sector regulations are under way in Europe and the USA to account for emissions from indirect land-use change (ILUC). Despite their influence on policy-making, evaluations of the capacity of numerical models to estimate ILUC are sparse. To fill this void, this paper reviews recent developments in land-use modelling, with a particular focus on the solutions adopted to estimate ILUC due to biofuel production. As indirect effects of bioenergy result from the interplay of various mechanisms, modelling them is a major challenge for land-use science. In recent years, numerical models have been significantly upgraded to provide a more comprehensive vision of the agricultural system. This has been done by improving the representation of land supply and the biofuel production process in general equilibrium models (e.g., GTAP, MIRAGE, DART). At the same time, modelling systems coupling partial equilibrium models with CGE models (e.g., KLUM@GTAP) or economic modules with spatially explicit models (e.g., MAgPIE, GLOBIOM, LEITAP), and modelling architectures combining land-use and life-cycle assessment models (e.g., FASOM/FAPRI/GREET), have been developed. In spite of these advances, some limitations remain and uncertainties are still numerous.
    Keywords: Indirect, Land-Use Change, Modelling, Biofuel
    JEL: O13 Q15 Q16 Q17 Q18 D58
    Date: 2012–09
    URL: http://d.repec.org/n?u=RePEc:fem:femwpa:2012.65&r=cmp
  13. By: Christian Bidard
    Abstract: The parametric Lemke algorithm finds an odd number of solutions to the linear complementarity problem LCP(q, M) for a matrix M with zero blocks on the diagonal and a vector q within a certain domain. A criterion for monotonicity and uniqueness is given. The algorithm applies to the determination of a long-run equilibrium in the presence of scarce resources, and its first description can be traced back to the nineteenth-century economist David Ricardo.
    Keywords: Oddity, parametric Lemke algorithm, Ricardo, uniqueness.
    JEL: B12 C61 C63
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:drm:wpaper:2012-41&r=cmp
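    The complementarity problem itself is easy to state in code. A brute-force check for tiny instances (illustrative data, not the paper's model; Lemke's pivoting method does the same job efficiently) enumerates which coordinates are z-basic and counts the solutions, illustrating the oddity result:

      import numpy as np
      from itertools import product

      # Find all solutions of LCP(q, M): w = M z + q, w >= 0, z >= 0, w.z = 0.
      def lcp_solutions(M, q, tol=1e-9):
          n = len(q)
          sols = []
          for basis in product([0, 1], repeat=n):  # 1 means z_i basic, w_i = 0
              z = np.zeros(n)
              idx = [i for i in range(n) if basis[i]]
              if idx:
                  try:
                      z[idx] = np.linalg.solve(M[np.ix_(idx, idx)], -q[idx])
                  except np.linalg.LinAlgError:
                      continue
              w = M @ z + q
              if (z >= -tol).all() and (w >= -tol).all():
                  sols.append((z, w))
          return sols

      M = np.array([[2.0, 1.0], [1.0, 2.0]])
      q = np.array([-1.0, -1.0])
      for z, w in lcp_solutions(M, q):
          print("z =", z, "w =", w)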
  14. By: K. W. DE BOCK; D. VAN DEN POEL
    Abstract: To build a successful customer churn prediction model, a classification algorithm should be chosen that fulfills two requirements: strong classification performance and a high level of model interpretability. In recent literature, ensemble classifiers have demonstrated superior performance in a multitude of applications and data mining contests. However, due to their increased complexity, they result in models that are often difficult to interpret. In this study, GAMensPlus, an ensemble classifier based upon generalized additive models (GAMs), in which performance and interpretability are reconciled, is presented and evaluated in the context of churn prediction modeling. The recently proposed GAMens, based upon Bagging, the Random Subspace Method and semiparametric GAMs as constituent classifiers, is extended to include two instruments for model interpretability: generalized feature importance scores and bootstrap confidence bands for smoothing splines. In an experimental comparison on data sets from six real-life churn prediction projects, the competitive performance of the proposed algorithm over a set of well-known benchmark algorithms is demonstrated in terms of four evaluation metrics. Further, the ability of the technique to deliver valuable insight into the drivers of customer churn is illustrated in a case study on data from a European bank. Firstly, it is shown how the generalized feature importance scores allow the analyst to identify the importance of churn predictors as a function of the criterion used to measure the quality of the model predictions. Secondly, the ability of GAMensPlus to identify nonlinear relationships between predictors and churn probabilities is demonstrated.
    Keywords: Database marketing, customer churn prediction, ensemble classification, generalized additive models (GAMs), GAMens, model interpretability
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:rug:rugwps:12/805&r=cmp
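    A minimal sketch of the GAMens idea the paper extends (not the authors' implementation; it assumes the open-source pygam package for the semiparametric base learners): Bagging plus the Random Subspace Method over LogisticGAM classifiers, averaging predicted churn probabilities.

      import numpy as np
      from pygam import LogisticGAM

      # Fit an ensemble: each member sees a bootstrap sample of customers
      # and a random subset of the predictors.
      def fit_gam_ensemble(X, y, n_members=10, subspace=0.5, seed=None):
          rng = np.random.default_rng(seed)
          n, p = X.shape
          k = max(1, int(subspace * p))
          members = []
          for _ in range(n_members):
              rows = rng.integers(0, n, size=n)            # bootstrap sample
              cols = rng.choice(p, size=k, replace=False)  # feature subspace
              gam = LogisticGAM().fit(X[np.ix_(rows, cols)], y[rows])
              members.append((gam, cols))
          return members

      # Average the members' churn probabilities for new customers.
      def predict_churn(members, X):
          return np.mean([gam.predict_proba(X[:, cols])
                          for gam, cols in members], axis=0)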
  15. By: T. BUYSE; F. HEYLEN; R. VAN DE KERCKHOVE
    Abstract: We study the effects of pension reform in a four-period OLG model for an open economy where hours worked by three active generations, education of the young, the retirement decision of older workers, and aggregate growth are all endogenous. Within each generation we distinguish individuals with high, medium or low ability to build human capital. This extension allows us to investigate also the effects of pension reform on the income and welfare levels of different ability groups. Particular attention goes to old-age income and the welfare level of low-ability individuals. Our simulation results favor an intelligent pay-as-you-go pension system over a fully-funded private system. When it comes to promoting employment, human capital, growth, and aggregate welfare, the positive effects of a pay-as-you-go system are strongest when it includes a tight link between individual labor income (and contributions) and the pension, and when it attaches a high weight to labor income earned as an older worker in computing the pension assessment base. Such a regime does, however, imply welfare losses for the current low-ability generations, and rising inequality in welfare. Complementing or replacing this ‘intelligent’ pay-as-you-go system with basic and/or minimum pension components is negative for aggregate welfare, employment and growth. It is better to maintain the tight link between individual labor income and the pension also for low-ability individuals, but to strongly raise their replacement rate.
    Keywords: employment by age; endogenous growth; retirement; pension reform; heterogeneous abilities; overlapping generations
    JEL: E62 H55 J22 J24
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:rug:rugwps:12/810&r=cmp
  16. By: Albert, Jose Ramon G.; Ng, Thiam Hee
    Abstract: Since the global financial crisis in 2008/09 there has been heightened concern about the resilience of banking systems in Southeast Asia. This paper proposes a methodology that uses a macroprudential perspective to assess the resilience of banking systems in member countries of the Association of Southeast Asian Nations. It then proceeds to apply this methodology to examine the resilience of the Philippine banking system. Data on financial soundness in the Philippine banking system are utilized in a vector autoregression model to study the dynamic relationships that exist among financial and macroeconomic indicators. Using impulse response functions, a simulation of financial ratios in the banking system is conducted by assuming unlikely but plausible stress scenarios to determine whether banking system credit and capital could withstand the impact of such circumstances. In the stress scenarios, the estimated impact of macroeconomic shocks on nonperforming loan and capital adequacy ratios is generally minimal. The results, however, do suggest that the Philippine banking system has some vulnerability to interest rate and stock market shocks. The results of such stress testing provide a better understanding of the level of preparedness required for managing risks in the financial system, especially in the wake of continuing global economic uncertainty.
    Keywords: banking system, Philippines, macroprudential, stress testing, panel VAR
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:phd:dpaper:dp_2012-23&r=cmp
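    A hedged sketch of the general approach (the file and column names are hypothetical placeholders for the paper's financial soundness and macro indicators): fit a VAR and trace how a stress shock to one variable feeds into the nonperforming-loan ratio.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.api import VAR

      # Hypothetical input file with banking-soundness and macro series.
      df = pd.read_csv("philippine_banking_indicators.csv",
                       parse_dates=["date"], index_col="date")
      data = df[["npl_ratio", "capital_adequacy",
                 "interest_rate", "stock_index"]]

      results = VAR(data).fit(maxlags=4, ic="aic")
      irf = results.irf(periods=12)

      # Orthogonalized response of the NPL ratio to a one-standard-deviation
      # interest rate shock, cumulated over twelve periods.
      names = list(data.columns)
      resp = irf.orth_irfs[:, names.index("npl_ratio"),
                           names.index("interest_rate")]
      print("cumulative NPL response:", np.round(resp.cumsum(), 4))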

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.