New Economics Papers on Computational Economics
Issue of 2015‒08‒30
Eighteen papers chosen by
By: | Mehdad, Ehsan (Tilburg University, Center For Economic Research); Kleijnen, J.P.C. (Tilburg University, Center For Economic Research) |
Abstract: | Efficient Global Optimization (EGO) is a popular method that searches sequentially for the global optimum of a simulated system. EGO treats the simulation model as a black-box, and balances local and global searches. In deterministic simulation, EGO uses ordinary Kriging (OK), which is a special case of universal Kriging (UK). In our EGO variant we use intrinsic Kriging (IK), which eliminates the need to estimate the parameters that quantify the trend in UK. In random simulation, EGO uses stochastic Kriging (SK), but we use stochastic IK (SIK). Moreover, in random simulation, EGO needs to select the number of replications per simulated input combination, accounting for the heteroscedastic variances of the simulation outputs. A popular selection method uses optimal computer budget allocation (OCBA), which allocates the available total number of replications over simulated combinations. We derive a new allocation algorithm. We perform several numerical experiments with deterministic simulations and random simulations. These experiments suggest that (1) in deterministic simulations, EGO with IK outperforms classic EGO; (2) in random simulations, EGO with SIK and our allocation rule does not differ significantly from EGO with SK combined with the OCBA allocation rule. |
Keywords: | global optimization; Gaussian process; Kriging; intrinsic Kriging; metamodel |
JEL: | C0 C1 C9 C15 C44 |
Date: | 2015 |
URL: | http://d.repec.org/n?u=RePEc:tiu:tiucen:5e785713-146c-4e5b-b671-f7eb4a8b7a41&r=all |
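As a rough illustration of the EGO machinery this abstract builds on, the sketch below fits a simple Kriging (Gaussian process) predictor to a toy 1-D deterministic simulator and picks the next design point by maximising expected improvement. The squared-exponential kernel, length scale, and test function are invented for illustration; the paper's intrinsic Kriging variant is not reproduced here.

```python
import numpy as np
from math import erf, sqrt, pi

def rbf(A, B, length=0.25):
    # squared-exponential correlation between two point sets
    d2 = (A[:, None, :] - B[None, :, :]) ** 2
    return np.exp(-d2.sum(-1) / (2 * length ** 2))

def krige(X, y, Xnew, nugget=1e-10):
    # simple-Kriging-style predictor (zero trend, unit process variance)
    K = rbf(X, X) + nugget * np.eye(len(X))
    k = rbf(X, Xnew)
    mu = k.T @ np.linalg.solve(K, y)
    var = np.clip(1.0 - np.sum(k * np.linalg.solve(K, k), axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, f_min):
    # EI(x) = (f_min - mu) * Phi(z) + sigma * phi(z), z = (f_min - mu)/sigma
    z = (f_min - mu) / sigma
    Phi = 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2)))
    phi = np.exp(-z ** 2 / 2) / sqrt(2 * pi)
    return (f_min - mu) * Phi + sigma * phi

f = lambda x: (x - 0.3) ** 2          # toy deterministic simulator
X = np.array([[0.0], [0.5], [1.0]])   # design points already simulated
y = f(X[:, 0])
grid = np.linspace(0.0, 1.0, 101)[:, None]
mu, sd = krige(X, y, grid)
ei = expected_improvement(mu, sd, y.min())
x_next = float(grid[np.argmax(ei), 0])  # next point to simulate
```

In a full EGO loop, `x_next` would be simulated, appended to the design, and the model refitted until the improvement budget is exhausted.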
By: | Mehdad, Ehsan (Tilburg University, Center For Economic Research); Kleijnen, J.P.C. (Tilburg University, Center For Economic Research) |
Abstract: | Kriging provides metamodels for deterministic and random simulation models. There are several types of Kriging; the classic type is so-called universal Kriging, which includes ordinary Kriging. These classic types require estimation of the trend in the input-output data of the underlying simulation model, and this estimation deteriorates the Kriging metamodel. We therefore consider so-called intrinsic Kriging, originating in geostatistics, and derive intrinsic Kriging for deterministic and random simulations. Moreover, for random simulations we derive experimental designs that specify a number of replications that varies with the input combination of the simulation model. To compare the performance of intrinsic Kriging and classic Kriging, we use several numerical experiments with deterministic simulations and random simulations. These experiments show that intrinsic Kriging gives better metamodels in most experiments. |
Keywords: | simulation; Gaussian process; Kriging; intrinsic random functions; metamodel |
JEL: | C0 C1 C9 C15 C44 |
Date: | 2015 |
URL: | http://d.repec.org/n?u=RePEc:tiu:tiucen:00bed9cb-d34c-4e98-93ef-e805fce63fa0&r=all |
By: | Jesus Gonzalez-Feliu (LET - Laboratoire d'économie des transports - CNRS - UL2 - Université Lumière - Lyon 2 - École Nationale des Travaux Publics de l'État [ENTPE]); Josep-Maria Salanova Grau (Hellenic Institute of Transport - Centre for Research and Technology Hellas) |
Abstract: | This paper compares genetic and semi-greedy algorithms for a collaborative VRP in city logistics. In order to compare the performance of both algorithms on real-size test cases, we develop a cluster-first, route-second algorithm. The clustering phase is performed by a sweep algorithm, which defines the number of vehicles used and assigns a set of customers to each. Then, for each vehicle, we build a min-cost route by two methods: the first is a semi-greedy algorithm; the second is a genetic algorithm. We test both approaches on real-size instances. Computational results are presented and discussed. |
Abstract: | (Translated from French.) This article compares genetic and semi-greedy algorithms for a collaborative vehicle routing problem in city logistics. To compare the two algorithms, we propose sequential algorithms based on the same initial phase, after which the routes are constructed by different procedures: the first is a semi-greedy procedure; the second is a genetic algorithm. Results are presented and discussed. |
Date: | 2015 |
URL: | http://d.repec.org/n?u=RePEc:hal:journl:halshs-00986648&r=all |
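The cluster-first, route-second pipeline described in this abstract can be sketched in a few lines: a sweep pass clusters customers by polar angle around the depot, then one route is built per cluster. The plain nearest-neighbour constructor below stands in for the authors' semi-greedy and genetic route builders; coordinates, unit demands, and the capacity value are invented for illustration.

```python
import math

def sweep_clusters(depot, customers, capacity):
    # sweep phase: sort customers by polar angle around the depot, then
    # cut the ordered list into capacity-feasible clusters (unit demands)
    order = sorted(customers,
                   key=lambda c: math.atan2(c[1] - depot[1], c[0] - depot[0]))
    return [order[i:i + capacity] for i in range(0, len(order), capacity)]

def greedy_route(depot, cluster):
    # nearest-neighbour construction; a semi-greedy variant would pick at
    # random among the k cheapest candidates instead of the single cheapest
    route, rest = [depot], list(cluster)
    while rest:
        nxt = min(rest, key=lambda c: math.dist(route[-1], c))
        rest.remove(nxt)
        route.append(nxt)
    return route + [depot]

depot = (0.0, 0.0)
customers = [(1, 1), (2, 0), (-1, 2), (-2, -1), (0, -2), (3, 1)]
routes = [greedy_route(depot, cl)
          for cl in sweep_clusters(depot, customers, capacity=3)]
```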
By: | Jonas Zangenberg Hansen (Danish Rational Economic Agents Model, DREAM); Peter Stephensen (Danish Rational Economic Agents Model, DREAM) |
Abstract: | We utilize the newly developed dynamic microsimulation model SMILE (Simulation Model for Individual Lifecycle Evaluation) to make a long-term forecast of Danish housing demand, both in terms of key aggregate figures and in terms of its compositional features. SMILE simulates the life course of the full Danish population with respect to three main types of events: demographic, socioeconomic, and housing-related events. Demographic events include ageing, births, deaths, migration, leaving home, and couple formation and dissolution, all of which are key indirect drivers of future housing demand. Socioeconomic events, such as education attendance and attainment and labour market events, are also important indirect drivers of housing demand because they are closely linked to the timing and direction of households' moving patterns. Finally, households move spatially and between dwelling types based on historically observed moving patterns and transition probabilities estimated with a tree-based classification model. The key results from the simulations are the following: changing patterns of cohabitation with a decreasing average household size are projected to increase the number of households by roughly one-third above what the general increase in population indicates; increasing urbanization leads to rising demand for multi-dwelling houses; and an ageing population is expected to increase demand for smaller dwellings, especially rental housing. |
Keywords: | population projections, education, household projections, housing demand, microsimulation |
Date: | 2013–12 |
URL: | http://d.repec.org/n?u=RePEc:dra:wpaper:201304&r=all |
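The core simulation step of a model like the one in this abstract, moving each simulated person between states according to estimated transition probabilities, can be sketched as below. The states and the transition matrix here are invented placeholders, not SMILE's estimated probabilities.

```python
import random

# illustrative annual transitions between dwelling types; rows sum to 1
TYPES = ["rental", "owner_flat", "owner_house"]
P = {
    "rental":      {"rental": 0.80, "owner_flat": 0.15, "owner_house": 0.05},
    "owner_flat":  {"rental": 0.05, "owner_flat": 0.85, "owner_house": 0.10},
    "owner_house": {"rental": 0.02, "owner_flat": 0.03, "owner_house": 0.95},
}

def step(state, rng):
    # draw the next state by inverse-CDF sampling over the current row
    r, acc = rng.random(), 0.0
    for nxt, p in P[state].items():
        acc += p
        if r < acc:
            return nxt
    return state

rng = random.Random(0)
population = ["rental"] * 1000
for year in range(30):
    population = [step(s, rng) for s in population]
share_rental = population.count("rental") / len(population)
```

A real dynamic microsimulation would condition the probabilities on age, income, household type, and the other covariates listed in the abstract.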
By: | Diamond, John W. (James A Baker III Institute for Public Policy, Rice University); Zodrow, George R. (Rice University and Centre for Business Taxation, Oxford University) |
Abstract: | The reports of several recent commissions focusing on deficit and debt reduction have suggested curtailing or eliminating the home mortgage interest deduction (MID). This paper examines the economic effects of such proposals to eliminate or curtail the MID. We use a dynamic, overlapping generations, computable general equilibrium (CGE) model of the U.S. economy to simulate both the short run and long run macroeconomic effects of such proposals, including their effects on the housing market, such as changes in housing prices, housing investment and the housing capital stock, and the mix of owner-occupied and rental housing. We also estimate the changes in tax liability by age and income group due to these changes in the MID, taking into account differences across households in whether they itemize and in the marginal tax rate at which the MID is taken, as well as the portfolio reallocations that would be expected to occur as households decide to pay down mortgage debt once the tax advantages of the MID are reduced or eliminated. In addition, we estimate how the reforms would affect the housing user cost of capital, and include estimates of the effects of eliminating or curtailing the MID for a few representative households. Finally, we also perform some rough supplemental "off-model" calculations to estimate the effects of the simulated reform-induced reductions in housing prices on the number of households with negative equity and the numbers of these homes that might be expected to end up in foreclosure proceedings. |
JEL: | H24 H31 R21 |
Date: | 2014 |
URL: | http://d.repec.org/n?u=RePEc:ecl:riceco:14-011&r=all |
By: | Kurt Kratena |
Abstract: | A significant reduction of the global environmental consequences of European consumption and production activities is the main objective of the policy simulations carried out in this paper. For this purpose, three different modelling approaches have been chosen. Two macroeconomic models following the philosophy of consistent stock-flow accounting for the main institutional sectors (households, firms, banks, central bank, and government) are used to quantify the impact of several different policies. These policies comprise classical tax reforms (pricing of resources and emissions) as well as policies aiming at behavioural change in private and public consumption and at technological change (energy and resource efficiency and renewable sources). A Dynamic New Keynesian (DYNK) model is used to compare a classical green tax reform with taxing the direct and indirect (footprint) energy and resource use of consumers. An important guiding principle of the modelling work is the simultaneous treatment of economic (GDP, employment), social (income distribution, unemployment), and environmental issues. The paper briefly describes the different modelling approaches and highlights the features most important for evaluating the impacts of different policies. It then describes the policy scenarios carried out with each model. The policy scenarios are not directly comparable across models, but show some similarities. The simulation results of the different policy scenarios are then analyzed and discussed. Two important conclusions can be drawn from the simulation results: (i) important trade-offs and synergies exist between the different economic, social, and environmental goals; and (ii) simple policy scenarios that put all the effort into one single instrument (e.g. a tax reform) are not likely to achieve an optimal result. A combination of instruments is most likely to achieve results satisfying the different economic, social, and environmental goals. |
Keywords: | Behavioural economics, Ecological innovation, Economic growth path, Innovation policy, New technologies |
JEL: | C54 Q54 B52 |
Date: | 2015–07 |
URL: | http://d.repec.org/n?u=RePEc:feu:wfedel:y:2015:m:7:d:0:i:8&r=all |
By: | Philip Adams; Janine Dixon; Mark Horridge |
Abstract: | The Victoria University Regional Model (VURM, formerly known as MMRF) is a dynamic model of Australia's six states and two territories. It models each region as an economy in its own right, with region-specific prices, region-specific consumers, region-specific industries, and so on. Based on the model's current database, in each region 79 industries produce 83 commodities. Capital is industry and region specific. In each region, there is a single household and a regional government. There is also a Federal government. Finally, there are foreigners, whose behaviour is summarised by demand curves for international exports and supply curves for international imports. In recursive-dynamic mode, VURM produces sequences of annual solutions connected by dynamic relationships such as physical capital accumulation. Policy analysis with VURM conducted in a dynamic setting involves the comparison of two alternative sequences of solutions, one generated without the policy change and the other with the policy change in place. The first sequence, called the base case projection, serves as a control path from which deviations are measured to assess the effects of the policy shock. The model includes a number of satellite modules providing more detail on the model's government finance accounts, household income accounts, population and demography, and energy and greenhouse gas emissions. Each of the satellite modules is linked into other parts of the model, so that projections from the model core can feed through into relevant parts of a module and changes in a module can feed back into the model core. The model also includes extensions to the core model theory dealing with links between demography and government consumption, the supply and interstate mobility of labour, and export supplies. |
Keywords: | CGE modelling, dynamics, regional economics |
JEL: | C68 D58 R13 |
Date: | 2015–07 |
URL: | http://d.repec.org/n?u=RePEc:cop:wpaper:g-254&r=all |
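The recursive-dynamic bookkeeping described in this abstract, annual solutions linked by capital accumulation and policy effects read off as deviations from a base-case path, reduces to a simple recursion. The depreciation rate and investment paths below are illustrative numbers, not VURM's.

```python
def capital_path(K0, investment, delta=0.05):
    # K(t+1) = (1 - delta) * K(t) + I(t): the dynamic link between years
    K = [K0]
    for I in investment:
        K.append((1 - delta) * K[-1] + I)
    return K

base = capital_path(100.0, [10.0] * 20)     # base-case projection (control path)
policy = capital_path(100.0, [12.0] * 20)   # alternative path: policy raises investment
# policy effects are reported as percentage deviations from the base case
deviation = [(p - b) / b * 100.0 for b, p in zip(base, policy)]
```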
By: | Eric Weese (Department of Economics, Yale University); Masayoshi Hayashi (Faculty of Economics, The University of Tokyo); Akihiko Takahashi (College of Economics, Aoyama Gakuin University) |
Abstract: | Does the exercise of the right of self-determination lead to inefficiency? This paper considers a set of centrally planned municipal mergers during the Meiji period, with data from Gifu prefecture. The observed merger pattern can be explained as a social optimum based on a very simple individual utility function. If individual villages had been allowed to choose their merger partners, counterfactual simulations show that the core is always non-empty, but core partitions contain about 80% more (post-merger) municipalities than the social optimum. Simulations are possible because core partitions can be calculated using repeated application of a mixed integer program. |
Date: | 2015–08 |
URL: | http://d.repec.org/n?u=RePEc:tky:fseres:2015cf989&r=all |
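The counterfactual exercise in this abstract, comparing the social optimum with core partitions when villages choose their own partners, can be illustrated on a toy symmetric example. Exhaustive enumeration stands in for the paper's mixed-integer-programming approach, and the per-member utility schedule is invented; in this tiny example the core happens to coincide with the social optimum, unlike the paper's empirical finding.

```python
from itertools import combinations

# invented per-member utility as a function of (post-merger) municipality
# size, peaked at size 2; symmetric across the four villages
U = {1: 1.0, 2: 1.5, 3: 1.2, 4: 0.8}

def partitions(items):
    # enumerate all set partitions (first element anchors the first block)
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for k in range(len(rest) + 1):
        for others in combinations(rest, k):
            remaining = [x for x in rest if x not in others]
            for p in partitions(remaining):
                yield [(first,) + others] + p

def payoff(partition):
    return {v: U[len(b)] for b in partition for v in b}

def in_core(partition, players):
    # blocked if some coalition of size s could make all its members
    # strictly better off; with symmetric players this is a counting test
    pay = payoff(partition)
    for s in range(1, len(players) + 1):
        if sum(1 for v in players if pay[v] < U[s]) >= s:
            return False
    return True

villages = [0, 1, 2, 3]
parts = list(partitions(villages))
optimum = max(parts, key=lambda p: sum(payoff(p).values()))
core = [p for p in parts if in_core(p, villages)]
```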
By: | Yuri Biondi; Simone Righi |
Abstract: | Our computational economic analysis investigates the relationship between inequality, mobility and the financial accumulation process. Extending the baseline model by Levy et al., we characterise the economic process through stylised return structures generating alternative evolutions of income and wealth through historical time. First we explore the limited heuristic contribution of one- and two-factor models comprising one single stock factor (capital wealth) and one single flow factor (labour) as pure drivers of income and wealth generation and allocation over time. Then we introduce heuristic modes of taxation in line with the baseline approach. Our computational economic analysis corroborates that the financial accumulation process featuring compound returns plays a significant role as a source of inequality, while institutional configurations including taxation play a significant role in framing and shaping the aggregate economic process that evolves over socioeconomic space and time. |
Keywords: | inequality, economic process, compound interest, simple interest, taxation, minimal institution, computational economics, econophysics |
JEL: | C46 C63 D31 E02 E21 E27 D63 H22 |
Date: | 2015–07 |
URL: | http://d.repec.org/n?u=RePEc:mod:dembwp:0058&r=all |
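The central mechanism in this abstract, compound (multiplicative) returns generating more inequality than simple (additive) accumulation, is easy to reproduce in a toy simulation. The return values, horizon, and population size below are invented; the Gini coefficient summarises the resulting dispersion.

```python
import random

def gini(values):
    # Gini coefficient via the sorted-rank formula
    v = sorted(values)
    n = len(v)
    cum = sum((i + 1) * x for i, x in enumerate(v))
    return 2.0 * cum / (n * sum(v)) - (n + 1.0) / n

rng = random.Random(42)
n_agents, periods = 1000, 200

wealth = [1.0] * n_agents
for _ in range(periods):
    # compound returns: wealth is scaled by an idiosyncratic gross return
    wealth = [w * rng.choice([1.10, 0.95]) for w in wealth]
g_compound = gini(wealth)

wealth_add = [1.0] * n_agents
for _ in range(periods):
    # additive (simple-interest-like) increments for contrast
    wealth_add = [w + rng.choice([0.10, -0.05]) for w in wealth_add]
g_additive = gini(wealth_add)
```

Under multiplicative accumulation log-wealth follows a random walk, so dispersion grows without bound, whereas the additive process keeps relative dispersion modest.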
By: | L. C. G. Rogers |
Abstract: | The aim of this study is to devise numerical methods for dealing with very high-dimensional Bermudan-style derivatives. For such problems, we quickly see that we can at best hope for price bounds, and we can only use a simulation approach. We use the approach of Barraquand & Martineau, which proposes that the reward process be treated as if it were Markovian, and then uses this to generate a stopping rule and hence a lower bound on the price. Using the dual approach introduced by Rogers and by Haugh & Kogan, this approximate Markov process leads us to hedging strategies and upper bounds on the price. The methodology is generic, and is illustrated on eight examples of varying levels of difficulty. Run times are largely insensitive to dimension. |
Date: | 2015–08 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1508.06117&r=all |
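The lower-bound half of this program can be sketched as follows: backward induction on simulated paths, with the continuation value regressed on the current payoff alone (treating the reward process as if it were Markovian), then the resulting stopping rule applied to fresh paths. The toy zero-rate price dynamics, strike, and regression basis are invented for illustration and much simpler than the paper's eight examples.

```python
import numpy as np

rng = np.random.default_rng(1)
n, T, s0, vol, dt, K = 20000, 4, 100.0, 0.2, 0.25, 100.0

def simulate(m):
    # toy zero-interest-rate price paths (discretised multiplicative noise)
    z = rng.standard_normal((m, T))
    s = s0 * np.cumprod(1 + vol * np.sqrt(dt) * z, axis=1)
    return np.hstack([np.full((m, 1), s0), s])

payoff = lambda s: np.maximum(K - s, 0.0)   # Bermudan put reward

# pass 1: backward induction; regress continuation value on current payoff
S = simulate(n)
V = payoff(S[:, T])
betas = {}
for t in range(T - 1, 0, -1):
    h = payoff(S[:, t])
    itm = h > 0
    A = np.column_stack([np.ones(itm.sum()), h[itm], h[itm] ** 2])
    betas[t] = np.linalg.lstsq(A, V[itm], rcond=None)[0]
    ex_idx = np.where(itm)[0][h[itm] >= A @ betas[t]]
    V[ex_idx] = h[ex_idx]

# pass 2: apply the estimated stopping rule to fresh paths -> a lower bound
S2 = simulate(n)
cash = payoff(S2[:, T])
stopped = np.zeros(n, dtype=bool)
for t in range(1, T):
    h = payoff(S2[:, t])
    itm = (h > 0) & ~stopped
    A = np.column_stack([np.ones(itm.sum()), h[itm], h[itm] ** 2])
    ex_idx = np.where(itm)[0][h[itm] >= A @ betas[t]]
    cash[ex_idx] = h[ex_idx]
    stopped[ex_idx] = True
lower_bound = cash.mean()
```

The dual upper bound would additionally require constructing a martingale from the approximate value function, which is omitted here.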
By: | Jordan Mann; J. Nathan Kutz |
Abstract: | We demonstrate the application of an algorithmic trading strategy based upon the recently developed dynamic mode decomposition (DMD) on portfolios of financial data. The method is capable of characterizing complex dynamical systems, in this case financial market dynamics, in an equation-free manner by decomposing the state of the system into low-rank terms whose temporal coefficients are known. By extracting key temporal coherent structures (portfolios) in its sampling window, it provides a regression to a best-fit linear dynamical system, allowing for a predictive assessment of the market dynamics and informing an investment strategy. The data-driven analysis capitalizes on stock market patterns, either real or perceived, to inform buy/sell/hold investment decisions. Critical to the method is an associated learning algorithm that optimizes the sampling and prediction windows by discovering trading hot-spots. The underlying mathematical structure of the algorithms is rooted in methods from nonlinear dynamical systems and shows that the decomposition is an effective mathematical tool for data-driven discovery of market patterns. |
Date: | 2015–08 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1508.04487&r=all |
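The DMD step itself, fitting a best-fit linear map between successive snapshots via a truncated SVD, is compact enough to sketch. The "market" below is a synthetic five-asset panel driven by two oscillating latent portfolios, so that the recovered eigenvalues lie on the unit circle; real price data and the paper's window-learning algorithm are not reproduced.

```python
import numpy as np

def dmd(X, r):
    # exact DMD: X2 ~= A X1, with A approximated through a rank-r SVD of X1
    X1, X2 = X[:, :-1], X[:, 1:]
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    U, s, Vh = U[:, :r], s[:r], Vh[:r]
    Atilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
    lam, W = np.linalg.eig(Atilde)
    Phi = X2 @ Vh.conj().T @ np.diag(1.0 / s) @ W   # DMD modes
    return lam, Phi

# synthetic "market": five assets mixing two oscillators (rank-4 dynamics)
t = np.linspace(0.0, 10.0, 101)                     # dt = 0.1
state = np.vstack([np.sin(1.3 * t), np.cos(1.3 * t),
                   np.sin(0.7 * t), np.cos(0.7 * t)])
mix = np.random.default_rng(0).standard_normal((5, 4))
X = mix @ state
lam, Phi = dmd(X, r=4)

# one-step-ahead prediction from the last snapshot via the fitted modes
b = np.linalg.lstsq(Phi, X[:, -1].astype(complex), rcond=None)[0]
x_next = (Phi @ (lam * b)).real
```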
By: | Simone Farinelli; Luisa Tibiletti |
Abstract: | Hydro storage system optimization is becoming one of the most challenging tasks in energy finance. Following the Blomvall and Lindberg (2002) interior point model, we set up a stochastic multiperiod optimization procedure by means of a "bushy" recombining tree that provides fast computational results. Inequality constraints are packed into the objective function by the logarithmic barrier approach, and the utility function is approximated by its second-order Taylor polynomial. The optimal solution of the original problem is obtained as a diagonal sequence, where the first diagonal dimension is the parameter controlling the logarithmic penalty and the second is the parameter for the Newton step in the construction of the approximated solution. Optimal intraday electricity trading and water values for hydro assets are computed. The algorithm is implemented in Mathematica. |
Date: | 2015–08 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1508.05837&r=all |
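The interior-point ingredients named in this abstract, packing inequality constraints into the objective with a logarithmic barrier and taking damped Newton steps while the barrier parameter is driven to zero, can be shown on a one-dimensional toy problem. The objective and bounds are invented; the paper's setting is a multiperiod stochastic tree.

```python
def barrier_minimise(mu_seq, x=0.5):
    # minimise f(x) = (x - 2)^2 subject to 0 <= x <= 1 via the barrier
    # F_mu(x) = f(x) - mu * (log(x) + log(1 - x)); Newton steps on F_mu
    for mu in mu_seq:
        for _ in range(50):
            g = 2.0 * (x - 2.0) - mu / x + mu / (1.0 - x)    # F_mu'(x)
            h = 2.0 + mu / x ** 2 + mu / (1.0 - x) ** 2       # F_mu''(x)
            step = g / h
            while not (0.0 < x - step < 1.0):   # damp to stay strictly interior
                step *= 0.5
            x -= step
    return x

# drive the barrier parameter to zero along a geometric sequence,
# warm-starting each subproblem at the previous solution
x_star = barrier_minimise([0.5 ** k for k in range(30)])
```

The constrained minimum sits on the boundary at x = 1, and the barrier iterates approach it from the interior as mu shrinks.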
By: | Okitonyumbe Y.F., Joseph; Ulungu, Berthold E.-L. |
Abstract: | Solving the multi-objective vehicle routing problem (MOVRP) by exact methods presents many difficulties for medium- and large-size instances. Inspired by the methodological approach, one of the three approaches identified by Ulungu & Teghem for solving multi-objective combinatorial optimization problems, and by the web-weaving behaviour of spiders, we conceive in this paper a hybridization of four heuristics dedicated to the single-objective VRP, using the dominance preferential reference mark method: the cobweb algorithm. A didactic example validates our approach. |
Keywords: | savings, heuristic, hybridization, multi-objective vehicle routing problem, efficient solutions, dominance preferential reference mark method |
JEL: | C61 |
Date: | 2014–12 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:66193&r=all |
By: | Michel Alexandre da Silva; Gilberto Tadeu Lima |
Abstract: | The aim of this paper is to study the interaction between monetary policy and prudential regulation in an agent-based modeling framework. In the model proposed here, firms borrow funds from the banking system in an economy regulated by a central bank. The central bank is responsible for carrying out monetary policy, by setting the interest rate, and prudential regulation, by establishing the banking capital requirement. Different combinations of interest rate and capital requirement rules are evaluated with regard to both macroeconomic and financial stability. Several relevant policy implications are drawn from this analysis. First, the implementation of a cyclical capital component as proposed in Basel III, while successful in reducing financial instability when applied alone, loses its efficacy when combined with some interest rate rules. Second, interest rate smoothing is more effective than the other interest rate rules assessed, as it outperforms them with respect to financial stability and performs as well as them with respect to macroeconomic stability. Finally, there is no long-run tradeoff between monetary and financial stability with respect to either the sensitivity of the cyclical capital component to the credit-to-output ratio or the interest rate smoothing parameter. |
Date: | 2015–08 |
URL: | http://d.repec.org/n?u=RePEc:bcb:wpaper:394&r=all |
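The two policy rules compared in this abstract, an interest-rate rule with smoothing and a capital requirement with a countercyclical component, have standard textbook forms that can be sketched directly. All coefficients below (smoothing, Taylor weights, buffer sensitivity and cap) are illustrative, not the paper's calibration.

```python
def taylor_smoothed(i_prev, inflation, output_gap,
                    rho=0.8, i_star=0.04, phi_pi=1.5, phi_y=0.5, pi_star=0.02):
    # partial-adjustment (smoothed) Taylor rule:
    # i_t = rho * i_{t-1} + (1 - rho) * [i* + phi_pi(pi - pi*) + phi_y * gap]
    i_target = i_star + phi_pi * (inflation - pi_star) + phi_y * output_gap
    return rho * i_prev + (1.0 - rho) * i_target

def capital_requirement(credit_to_output, base=0.08, sens=0.5, trend=0.6):
    # Basel-III-style countercyclical buffer on top of the base requirement,
    # increasing in the credit-to-output gap and capped at 2.5 p.p.
    buffer = max(0.0, sens * (credit_to_output - trend))
    return base + min(buffer, 0.025)

i = 0.04
for infl, gap in [(0.03, 0.01), (0.05, 0.02), (0.02, -0.01)]:
    i = taylor_smoothed(i, infl, gap)
req_boom = capital_requirement(0.70)   # credit boom: buffer binds at the cap
req_bust = capital_requirement(0.50)   # credit below trend: base requirement only
```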
By: | Hans Bækgaard (Danish Rational Economic Agents Model, DREAM) |
Abstract: | The paper presents a novel approach to modelling labour market processes in dynamic microsimulation. The method combines and integrates Bayesian simulation-based estimation and simulation of the dependent variables. The approach is applied to a dynamic panel model of hourly wage rates for Danish employees, using a large panel data set with 17 years of data covering 1995 to 2011. The wage rate model and a parallel model for annual work hours are currently being implemented in SMILE (Simulation Model for Individual Lifecycle Evaluation), a new dynamic microsimulation model for the Danish household sector. The application benefits from the richness of Danish administrative panel data. Nevertheless, the results and the approach have several features that should be of interest to microsimulators and others. Indeed, the model features both an extraordinarily comprehensive list of dependencies and a rich dynamic structure. Together, these features help ensure that simulations produce realistic cross-sectional distributions and interactions as well as inter-temporal mobility, the key determinants of the quality of a dynamic microsimulation model. In addition to the usual socio-demographic variables (gender, age, ethnicity, experience, education, etc.), the dependencies include a more novel set of variables that represent a person's labour market history, secondary school grades, and social heritage (represented by the parents' education level). The dynamic model structure includes a lagged dependent variable, an auto-correlated error term with a mixed Gaussian distribution for the white noise component, an individual random effect with a mixed Gaussian distribution, and a permanent effect of a person's first wage after leaving the education system. The estimation sample is identical to the simulation sample, which allows us to use the same historical detail as well as estimated individual effects, i.e. random effect components, for the simulation of future wage rates. The Bayesian estimation method handles missing observations of the dependent variable, due to either non-employment or temporary non-participation, by treating them as latent variables that are simulated alongside the Bayesian iterations. As a by-product, the estimation produces model-consistent latent wage rates for the unemployed that are useful for labour supply analysis. |
Keywords: | labour market, modelling, dynamic microsimulation |
Date: | 2013–12 |
URL: | http://d.repec.org/n?u=RePEc:dra:wpaper:201301&r=all |
By: | Marco Corazza (Dept. of Economics, Università Ca' Foscari Venice); Giacomo Di Tollo (Dept. of Management, Università Ca' Foscari Venice); Giovanni Fasano (Dept. of Management, Università Ca' Foscari Venice); Raffaele Pesenti (Dept. of Management, Università Ca' Foscari Venice) |
Abstract: | In this paper we propose an efficient initialization of a deterministic Particle Swarm Optimization (PSO) scheme. PSO has proved promising for solving several unconstrained global optimization problems arising in real applications, where derivatives are unavailable and evaluations of the objective function tend to be costly. Here we provide a theoretical framework that motivates the use of a deterministic version of PSO, in place of the standard stochastic iteration currently adopted in the literature. Then, to test our proposal, we report numerical experiments on a realistic complex portfolio selection problem. These experiments apply PSO to a parameter-dependent unconstrained reformulation of the portfolio selection problem. The parameters are either adaptively updated as in an exact penalty framework, or tuned by the code REVAC. We show that in both settings our PSO initialization is preferable to the standard proposal from the literature. |
Keywords: | Deterministic PSO, Global Optimization, Portfolio Selection Problems, Exact Penalty functions. |
JEL: | G11 C44 C61 |
Date: | 2015–07 |
URL: | http://d.repec.org/n?u=RePEc:vnm:wpdman:105&r=all |
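A deterministic PSO iteration of the kind this abstract discusses replaces the uniform random multipliers on the cognitive and social terms with a fixed value. The sketch below uses their expected value 1/2 and a simple non-random seeding of the swarm; the paper's efficient initialization scheme and the exact-penalty portfolio reformulation are not reproduced, and the test function is a toy sphere.

```python
import numpy as np

def deterministic_pso(f, lo, hi, n=16, iters=300, w=0.7, c1=1.5, c2=1.5):
    dim = lo.size
    # deterministic initialisation: particles spread along the diagonal of
    # the box, plus small fixed per-dimension offsets (one possible scheme)
    frac = (np.arange(n)[:, None] + 0.5) / n
    x = lo + (hi - lo) * frac * np.ones((1, dim))
    x += (hi - lo) * 0.01 * np.sin(np.outer(np.arange(n), np.arange(1, dim + 1)))
    v = np.zeros_like(x)
    pbest, pval = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[pval.argmin()].copy()
    for _ in range(iters):
        # standard PSO velocity update with the random factors fixed at 1/2
        v = w * v + c1 * 0.5 * (pbest - x) + c2 * 0.5 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.apply_along_axis(f, 1, x)
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[pval.argmin()].copy()
    return g, pval.min()

sphere = lambda z: float(np.sum((z - 0.3) ** 2))
g, fmin = deterministic_pso(sphere, np.full(3, -1.0), np.full(3, 1.0))
```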
By: | Marianne Frank Hansen (Danish Rational Economic Agents Model, DREAM); Marie Louise Schultz-Nielsen (Rockwool Foundation Research Unit and IZA); Torben Tranæs (Rockwool Foundation Research Unit and IZA) |
Abstract: | All over Europe, ageing populations threaten nations’ financial sustainability. In this paper we examine the potential of immigration to strengthen financial sustainability. We look at a particularly challenging case, namely that of Denmark, which has extensive tax-financed welfare programmes that provide a high social safety net. The analysis is based on a forecast for the entire Danish economy made using a dynamic computable general equilibrium model with overlapping generations. Net contributions to the public purse are presented both as cross-sectional figures for a long time horizon and as average individual life-cycle contributions. The main conclusion is that immigrants from richer countries have a positive fiscal impact, while immigrants from poorer countries have a large negative one. The negative effect is caused by both a weak labour market performance and early retirement in combination with the universal Danish welfare schemes. |
Keywords: | immigration, public finances, forecasting, Denmark |
Date: | 2015–02 |
URL: | http://d.repec.org/n?u=RePEc:dra:wpaper:201501&r=all |
By: | Zura Kakushadze |
Abstract: | We give a complete algorithm and source code for constructing what we refer to as heterotic risk models (for equities), which combine: i) the granularity of an industry classification; ii) the diagonality of the principal component factor covariance matrix for any sub-cluster of stocks; and iii) a dramatic reduction of the factor covariance matrix size in the Russian-doll risk model construction. This appears to be a powerful approach for constructing out-of-sample stable short-lookback risk models. Thus, for intraday mean-reversion alphas based on overnight returns, Sharpe ratio optimization using our heterotic risk models sizably improves performance characteristics compared with weighted regressions based on principal components or industry classification. We also give source code for: a) building statistical risk models; and b) Sharpe ratio optimization with homogeneous linear constraints and position bounds. |
Date: | 2015–08 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1508.04883&r=all |
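The two building blocks this abstract mentions, a principal-component factor covariance model and Sharpe-ratio-maximising weights, can be sketched on simulated data. The factor structure, dimensions, and seed below are invented; the heterotic construction that nests industry and PCA factors is not reproduced, and the weights here are unconstrained (no position bounds).

```python
import numpy as np

rng = np.random.default_rng(7)
N, T, F = 30, 100, 3

# simulated returns with a 3-factor structure (stand-in for real data)
B_true = rng.standard_normal((N, F))
fac = rng.standard_normal((T, F)) * 0.01
eps = rng.standard_normal((T, N)) * 0.005
R = fac @ B_true.T + eps

# statistical (PCA) risk model: Cov ~= B B' + D with D diagonal
X = R - R.mean(0)
U, s, Vh = np.linalg.svd(X, full_matrices=False)
B = Vh[:F].T * (s[:F] / np.sqrt(T))                    # factor loadings
D = np.maximum(np.var(X, axis=0) - np.sum(B ** 2, axis=1), 1e-8)
Cov = B @ B.T + np.diag(D)

# unconstrained Sharpe-maximising weights: w proportional to Cov^{-1} mu
mu = R.mean(0)
w = np.linalg.solve(Cov, mu)
w /= np.abs(w).sum()                                   # normalise gross exposure
sharpe = (w @ mu) / np.sqrt(w @ Cov @ w)
```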