nep-cmp New Economics Papers
on Computational Economics
Issue of 2019‒04‒29
twenty-six papers chosen by
Stan Miles
Thompson Rivers University

  1. A neural network-based framework for financial model calibration By Shuaiqiang Liu; Anastasia Borovykh; Lech A. Grzelak; Cornelis W. Oosterlee
  2. Modeling for Local Impact Analysis By Ioannou, Petros
  3. Continuous-Time Mean-Variance Portfolio Optimization via Reinforcement Learning By Haoran Wang; Xun Yu Zhou
  4. Automated Vehicle Scenarios: Simulation of System-Level Travel Effects Using Agent-Based Demand and Supply Models in the San Francisco Bay Area By Rodier, Caroline; Jaller, Miguel; Pourrahmani, Elham; Bischoff, Joschka; Freedman, Joel; Pahwa, Anmol
  5. Forecasting in Big Data Environments: an Adaptable and Automated Shrinkage Estimation of Neural Networks (AAShNet) By Ali Habibnia; Esfandiar Maasoumi
  6. MOVES-Matrix for High-Performance Emission Rate Model Applications By Guensler, Randall; Liu, Haobing; Xu, Xiaodan; Lu, Hongyu; Rodgers, Michael O.
  7. Reducing Truck Emissions and Improving Truck Fuel Economy via Intelligent Transportation System Technologies By Ioannou, Petros; Zhang, Yihang
  8. Deep Q-Learning for Nash Equilibria: Nash-DQN By Philippe Casgrain; Brian Ning; Sebastian Jaimungal
  9. The Hypothetical Household Tool (HHoT) in EUROMOD: a new instrument for comparative research on tax-benefit policies in Europe By Tine Hufkens; Tim Goedeme; Katrin Gasior; Chrysa Leventi; Kostas Manios; Olga Rastrigina; Pasquale Recchia; Holly Sutherland; Natascha Van Mechelen; Gerlinde Verbist
  10. Integrating Management of Truck and Rail Systems in Los Angeles By Dessouky, Maged; Fu, Lunce; Hu, Shichun
  11. Energy and Air Quality Impacts of Truck-Only Lanes: A Case Study of Interstate 75 Between Macon and McDonough, Georgia By Kim, Daejin; Guin, Angshuman; Rodgers, Michael O; Guensler, Randall
  12. Winter is possibly not coming: Mitigating financial instability in an agent-based model with interbank market By Lilit Popoyan; Mauro Napoletano; Andrea Roventini
  13. Dynamic Scheduling of Chassis Movements with Chassis Processing Facilities in the Loop By Chassiakos, Anastasios; Jula, Hossein; VanderBeek, Timothy
  14. Micro-founded tax policy effects in a heterogenenous-agent macro-model By Diego d'Andria; Jason DeBacker; Richard Evans; Jonathan Pycroft; Magdalena Zachlod-Jelec
  15. DyMH_LU: a simple tool for modelling and simulating the health status of the Luxembourgish elderly in the longer run By GENEVOIS Anne-Sophie; LIEGEOIS Philippe; PI ALPERIN Maria Noel
  16. The welfare effects of trade policy experiments in quantitative trade models: The role of solution methods and baseline calibration By Bekkers, Eddy
  17. City-wide traffic control: modeling impacts of cordon queues By Ni, Wei; Cassidy, Michael J
  18. Simulation-based Value-at-Risk for Nonlinear Portfolios By Junyao Chen; Tony Sit; Hoi Ying Wong
  19. A Tool to Predict Fleet-Wide Heavy-Duty Vehicle Fuel-Saving Benefits from Low Rolling Resistance Tires By Gbologah, Franklin E.; Rodgers, Michael O.; Li, Hanyan "Ann"
  20. Development of Key Enabling Technologies for a Variable-blend Natural Gas Vehicle By Park, Chan Seung; Roy, Partho
  21. Potential economic effects of a global trade conflict: Projecting the medium-run effects with the WTO global trade model By Bekkers, Eddy; Teh, Robert
  22. Inefficiency of the Brazilian Stock Market: the IBOVESPA Future Contracts By Tarcisio M. Rocha Filho; Paulo M. M. Rocha
  23. Using Cooperative Adaptive Cruise Control (CACC) to Form High-Performance Vehicle Streams: Simulation Results Analysis By Liu, Hao; Kan, Xingan David; Shladover, Steven E.; Lu, Xiao-Yun
  24. Analysis of Sustainable Procurement in SMEs in Developing Countries By MUKHERJEE, KRISHNENDU
  25. HOT Lane Simulation Tools By Horowitz, Roberto; Kurzhanskiy, Alex A.; Wright, Mathew
  26. Interactive macroeconomics: A pluralist simulator By Prante, Franz J.; Barmucci, Alessandro; Hein, Eckhard; Truger, Achim

  1. By: Shuaiqiang Liu; Anastasia Borovykh; Lech A. Grzelak; Cornelis W. Oosterlee
    Abstract: A data-driven approach called CaNN (Calibration Neural Network) is proposed to calibrate financial asset price models using an Artificial Neural Network (ANN). Determining optimal values of the model parameters is formulated as training hidden neurons within a machine learning framework, based on available financial option prices. The framework consists of two parts: a forward pass in which we train the weights of the ANN off-line, valuing options under many different asset model parameter settings; and a backward pass, in which we evaluate the trained ANN-solver on-line, aiming to find the weights of the neurons in the input layer. The rapid on-line learning of implied volatility by ANNs, in combination with the use of an adapted parallel global optimization method, tackles the computational bottleneck and provides a fast and reliable technique for calibrating model parameters while avoiding, as much as possible, getting stuck in local minima. Numerical experiments confirm that this machine-learning framework can be employed to calibrate parameters of high-dimensional stochastic volatility models efficiently and accurately.
    Date: 2019–04
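To make the forward/backward two-pass idea concrete, here is a minimal Python sketch. A toy surrogate pricing function stands in for the trained forward-pass network, and the calibration step performs a global grid search over the model parameter for the best fit to observed prices. The pricing formula, parameter grid, and function names are illustrative assumptions, not the paper's model or code.

```python
import numpy as np

def surrogate_prices(sigma, strikes):
    # Toy stand-in for the pre-trained "forward pass" ANN: a smooth map
    # from a volatility-like parameter to a vector of option prices.
    # This is NOT a real option-pricing formula.
    return np.exp(-strikes) * sigma + 0.5 * sigma ** 2

def calibrate(market_prices, strikes, grid):
    # "Backward pass": search the parameter space for the value whose
    # surrogate prices best match the observed market prices, mimicking
    # global optimization over the network's input layer.
    errors = [np.sum((surrogate_prices(s, strikes) - market_prices) ** 2)
              for s in grid]
    return grid[int(np.argmin(errors))]

strikes = np.linspace(0.8, 1.2, 5)
market = surrogate_prices(0.2, strikes)            # synthetic "market" quotes
sigma_hat = calibrate(market, strikes, np.linspace(0.05, 0.5, 451))
```

With a richer surrogate, the grid search would be replaced by the parallel global optimizer the authors describe; the structure of the two passes is the same.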
  2. By: Ioannou, Petros
    Abstract: The Los Angeles/Long Beach area is important for freight as it encompasses the twin ports, warehouses, and freight hubs. The way freight is consolidated and distributed affects what happens within the terminals and the roadway and rail networks. The complexity and dynamics of the multimodal transportation networks in the Los Angeles/Long Beach region, which are also shared by passengers, together with the unpredictability of incidents, disruptions, and demand in temporal and spatial coordinates, make the local impact analysis of freight transportation a very challenging task despite recent advances in information technologies. Under this project, the researchers developed a set of traffic simulation models for the Los Angeles/Long Beach region that allowed them to evaluate the impact of new traffic flow control systems, vehicle routing, policy interventions such as land use changes, and other ITS technologies on the efficiency of the transportation system and on the environment. The developed simulation models include a macroscopic simulation model for studying and evaluating large traffic networks and a microscopic simulation model for smaller networks. The macroscopic model focuses on flows and covers a much larger area, as it is computationally much more efficient than the microscopic one. The microscopic model captures the motion of each truck and vehicle, traffic lights, stop signs, speed limits, traffic rules, etc., and resembles the real situation as closely as possible. The developed simulation models have been used to evaluate different systems and application scenarios, including freight-priority traffic signal control, multimodal freight routing, and the impact analysis of spatial pattern changes in warehousing and distribution. View the NCST Project Webpage
    Keywords: Engineering, Freight traffic, Freight transportation, Macroscopic traffic flow, Microscopic traffic flow, Multimodal transportation, Routing, Traffic models, Traffic simulation
    Date: 2018–12–01
  3. By: Haoran Wang; Xun Yu Zhou
    Abstract: We consider the continuous-time mean-variance (MV) portfolio optimization problem in the reinforcement learning (RL) setting. The problem falls into the entropy-regularized relaxed stochastic control framework recently introduced in Wang et al. (2019). We derive the feedback exploration policy as a Gaussian distribution with time-decaying variance. Close connections between the entropy-regularized MV and the classical MV are also discussed, including the solvability equivalence and the convergence as exploration decays. Finally, we prove a policy improvement theorem (PIT) for the continuous-time MV problem under both entropy regularization and control relaxation. The PIT leads to an implementable RL algorithm for the continuous-time MV problem. Our algorithm outperforms, by a large margin in nearly all simulations, both an adaptive-control-based method that estimates the underlying parameters in real time and a state-of-the-art RL method that uses deep neural networks for continuous control problems.
    Date: 2019–04
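The paper's headline result, a Gaussian exploration policy whose variance decays over time, can be sketched in a few lines. The linear decay schedule, the parameter `lam`, and the function name below are illustrative assumptions, not the paper's exact closed form.

```python
import numpy as np

def exploratory_weight(t, T, mean_weight, lam=0.5, rng=None):
    # Sample a portfolio weight from a Gaussian exploration policy whose
    # variance shrinks linearly as the horizon T approaches. In the paper
    # the mean and variance come from the entropy-regularized solution;
    # here they are toy stand-ins.
    var = lam * (T - t) / T
    rng = rng if rng is not None else np.random.default_rng()
    return rng.normal(mean_weight, np.sqrt(var))

rng = np.random.default_rng(0)
early = [exploratory_weight(0.0, 1.0, 0.5, rng=rng) for _ in range(2000)]
late = [exploratory_weight(0.9, 1.0, 0.5, rng=rng) for _ in range(2000)]
```

Early in the horizon the sampled weights are widely dispersed (exploration); near the terminal date they concentrate around the mean allocation (exploitation).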
  4. By: Rodier, Caroline; Jaller, Miguel; Pourrahmani, Elham; Bischoff, Joschka; Freedman, Joel; Pahwa, Anmol
    Abstract: In much in the same way that the automobile disrupted horse and cart transportation in the 20th century, automated vehicles hold the potential to disrupt our current system of transportation in the 21st century. Experts predict that vehicles could be fully automated by as early as 2025 or as late as 2035. Methods are needed to help the public and private sector understand automated vehicle technologies and their system-level effects. First, we explore the effects of automated vehicles using the San Francisco Bay Area Metropolitan Transportation Commission’s activity-based travel demand model (MTC-ABM). The simulation is unique in that it articulates the size and direction of change on travel for a wide range of automated vehicles scenarios. Second, we simulate the effects of the introduction of an automated taxi service on conventional personal vehicle and transit travel in the San Francisco Bay Area region and use new research on the costs of automated vehicles to represent plausible per mile automated taxi fares. We use an integrated model for the San Francisco Bay Area that includes the MTC-ABM combined with the agent-based MATSim model customized for the region. This model set uses baseline travel demand data from the region’s official activity-based travel model and dynamically assigns vehicles on road and transit networks by the time of day. Third, we use the MTC-ABM and the MATSim dynamic assignment model to simulate different “first†mile transit access services, including ride-hailing (Uber and Lyft) and ridesharing (Uber Pool/Lyft Line and Via) with and without automated vehicles. The results provide insight into the relative benefits of each service and automated vehicle technology and the potential market for these services. View the NCST Project Webpage
    Keywords: Business, Automated Vehicles, Travel Demand Modeling, Agent-Based Models, Transit Access
    Date: 2018–09–01
  5. By: Ali Habibnia (Virginia Tech); Esfandiar Maasoumi (Emory University)
    Abstract: This paper considers improved forecasting in possibly nonlinear dynamic settings, with high-dimension predictors ("big data" environments). To overcome the curse of dimensionality and manage data and model complexity, we examine shrinkage estimation of a back-propagation algorithm of a deep neural net with skip-layer connections. We expressly include both linear and nonlinear components. This is a high-dimensional learning approach including both sparsity L1 and smoothness L2 penalties, allowing high-dimensionality and nonlinearity to be accommodated in one step. This approach selects significant predictors as well as the topology of the neural network. We estimate optimal values of shrinkage hyperparameters by incorporating a gradient-based optimization technique resulting in robust predictions with improved reproducibility. The latter has been an issue in some approaches. This is statistically interpretable and unravels some network structure, commonly left to a black box. An additional advantage is that the nonlinear part tends to get pruned if the underlying process is linear. In an application to forecasting equity returns, the proposed approach captures nonlinear dynamics between equities to enhance forecast performance. It offers an appreciable improvement over current univariate and multivariate models by RMSE and actual portfolio performance.
    Date: 2019–04
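The joint L1 (sparsity) and L2 (smoothness) shrinkage the paper applies to network weights can be illustrated on a deliberately simplified linear model. The subgradient-descent loop, data-generating process, and penalty values below are illustrative assumptions, not the AAShNet estimator itself.

```python
import numpy as np

def l1_l2_shrinkage(X, y, l1=0.2, l2=0.1, lr=0.01, steps=3000):
    # Subgradient descent on 0.5/n * ||y - Xw||^2 + l1*||w||_1 + 0.5*l2*||w||^2.
    # A linear model stands in for the network weights; the same combined
    # penalty drives irrelevant coefficients toward zero (selection) while
    # the L2 term keeps the surviving ones stable.
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / n + l1 * np.sign(w) + l2 * w
        w -= lr * grad
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = 2.0 * X[:, 0] + 0.05 * rng.standard_normal(200)  # only predictor 0 matters
w = l1_l2_shrinkage(X, y)
```

The fitted vector keeps a large coefficient on the single relevant predictor and shrinks the four irrelevant ones toward zero, which is the "selects significant predictors" behaviour the abstract describes, in miniature.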
  6. By: Guensler, Randall; Liu, Haobing; Xu, Xiaodan; Lu, Hongyu; Rodgers, Michael O.
    Abstract: The MOtor Vehicle Emission Simulator (MOVES) model was developed by the U.S. Environmental Protection Agency (USEPA) to estimate emissions from on-road and off-road vehicles in the United States. The MOVES model represents a significant improvement over the older MOBILE series of models, primarily because emission rates are now truly modal in nature. Emission rates are now a function of power surrogates, which depend on speed and acceleration. Traffic simulation model outputs and smartphone GPS data can provide second-by-second vehicle activity data in time and space, including vehicle speed and acceleration. Coupling high-resolution vehicle activity data with appropriate MOVES emission rates further advances research efforts designed to assess the environmental impacts of transportation design and operation strategies. However, the MOVES interface is complicated, and the structure of input variables and algorithms involved in running MOVES to assess operational improvements makes analyses cumbersome and time consuming. The MOVES interface also makes it difficult to assess complicated transportation networks and to undertake analyses of large-scale systems that are dynamic in nature. The MOVES-Matrix system developed by the research team can be used to perform emissions modeling activities in a fraction of the time it takes to perform even a single MOVES run. The MOVES-Matrix approach involves running the MOVES model iteratively, across all potential input variable combinations, and using the resulting multidimensional array of pre-run MOVES outputs in emissions modeling. The research team configured MOVES to run on a distributed computing cluster, obtaining MOVES energy consumption and emission rate outputs for each vehicle class, model year, and operating condition, by calendar year, fuel composition (summer, winter, and transition fuels), local Inspection/Maintenance (I/M) program, meteorology, and other variables of interest. 
The team ran MOVES 146,853 times to generate the on-road emission rate matrices for Atlanta. More than 90 billion emission rates populate the primary output matrix, but implementation tools developed by the team generate matrix subsets for specific applications to speed up the analytical processes. In 2017-2018, the team developed MOVES-Matrix 2.0, which now integrates engine start, soak, evaporative, and truck hoteling emissions. The resulting emission rate matrices allow users to link emission rates to assess big data projects (such as regional emissions for emission inventory development) and to support near-real-time evaluations of changes in emissions for large, dynamic transportation systems. In the case study applications performed by the team, emission rate generation with MOVES-Matrix is 200 times faster than using the batch mode of the MOVES graphical user interface in the same computing environment, and the process predicts exactly the same emissions results. View the NCST Project Webpage
    Keywords: Engineering, Computer programs, Energy consumption, Greenhouse gases, Matrices (Mathematics), Pollutants, Simulation, Traffic data, Traffic simulation, Travel patterns
    Date: 2018–10–01
  7. By: Ioannou, Petros; Zhang, Yihang
    Abstract: The aim of this project is to use intelligent transportation system (ITS) technologies that take into account the presence of trucks in the traffic flow, in order to reduce environmental impact by lowering fuel consumption and pollution levels in areas where truck volume is relatively high. The work is divided into two parts. In the first part, we propose an integrated variable speed limit (VSL), ramp metering (RM) and lane change (LC) controller using feedback linearization. The proposed integrated controller keeps the bottleneck flow at the maximum level and homogenizes the density and speed of the traffic flow along the highway sections. This improvement in traffic flow characteristics leads to improved fuel economy and reduced tailpipe emissions for both trucks and passenger vehicles. In order to evaluate the performance of the integrated traffic controller, a microscopic traffic simulation network of the I-710 highway, which is connected to the Ports of Long Beach/Los Angeles and has high truck volume, is developed. We use Monte-Carlo traffic flow simulations to demonstrate that the integrated traffic controller can generate consistent improvements with respect to travel time, safety, fuel economy and emissions under different traffic conditions. In the second part, we compared the proposed feedback linearization controller with the widely-used model predictive traffic controller in terms of performance and robustness with respect to perturbations on traffic demand, model parameters and measurement noise. Results show that both controllers are able to improve the total time spent, which leads to improvements in fuel economy and emissions, under different levels of perturbation and noise. The feedback linearization controller, however, delivers better performance and robustness than the model predictive controller with much less computational effort. View the NCST Project Webpage
    Keywords: Engineering, Physical Sciences and Mathematics, Feedback control, Fuel consumption, Lane changing, Monte Carlo method, Pollutants, Ramp metering, Traffic flow, Trucks, Variable speed limits
    Date: 2018–01–01
  8. By: Philippe Casgrain; Brian Ning; Sebastian Jaimungal
    Abstract: Model-free learning for multi-agent stochastic games is an active area of research. Existing reinforcement learning algorithms, however, are often restricted to zero-sum games, and are applicable only in small state-action spaces or other simplified settings. Here, we develop a new data efficient Deep-Q-learning methodology for model-free learning of Nash equilibria for general-sum stochastic games. The algorithm uses a local linear-quadratic expansion of the stochastic game, which leads to analytically solvable optimal actions. The expansion is parametrized by deep neural networks to give it sufficient flexibility to learn the environment without the need to experience all state-action pairs. We study symmetry properties of the algorithm stemming from label-invariant stochastic games and as a proof of concept, apply our algorithm to learning optimal trading strategies in competitive electronic markets.
    Date: 2019–04
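The key computational trick in the abstract, a local linear-quadratic expansion that makes the optimal action analytically solvable, reduces to maximizing a quadratic in the action. A minimal sketch, where the expansion coefficients are fixed toy values rather than outputs of the paper's deep networks:

```python
import numpy as np

def greedy_action(q_lin, q_quad):
    # For a locally linear-quadratic value Q(s, a) = c + q_lin·a + 0.5*a'Ha
    # with H (= q_quad) negative definite, the maximizer is available in
    # closed form: setting the gradient q_lin + H a to zero gives
    # a* = -H^{-1} q_lin, so no numerical argmax over actions is needed.
    return -np.linalg.solve(q_quad, q_lin)

q_lin = np.array([1.0, -2.0])
q_quad = np.array([[-2.0, 0.0], [0.0, -1.0]])  # negative definite curvature
a_star = greedy_action(q_lin, q_quad)           # -> [0.5, -2.0]
```

In the paper, `q_lin` and `q_quad` would be network outputs evaluated at the current state, which is what lets the Deep-Q update avoid sweeping the joint action space.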
  9. By: Tine Hufkens (European Commission - JRC); Tim Goedeme; Katrin Gasior; Chrysa Leventi; Kostas Manios; Olga Rastrigina; Pasquale Recchia; Holly Sutherland; Natascha Van Mechelen; Gerlinde Verbist
    Abstract: This paper introduces the Hypothetical Household Tool (HHoT), a new extension of EUROMOD, the tax-benefit microsimulation model for the European Union. With HHoT, users can easily create their own hypothetical data, which enables them to better understand how policies work for households with specific characteristics. The tool creates unique possibilities for an enhanced analysis of taxes and social benefits in Europe by integrating results from microsimulations and hypothetical household simulations in a single modelling framework. Furthermore, the flexibility of HHoT facilitates an advanced use of hypothetical household simulations to create new comparative policy indicators in the context of multi-country and longitudinal analyses. In this paper, we highlight the main features of HHoT, its strengths and limitations, and illustrate how it can be used for comparative policy purposes.
    Keywords: HHoT, microsimulation model, hypothetical household simulations, European Union
    Date: 2019–03
  10. By: Dessouky, Maged; Fu, Lunce; Hu, Shichun
    Abstract: This project establishes models to optimize the balance of freight demand across rail and truck modes. In real-life situations, trains often travel at different speeds (e.g., passenger trains and freight trains share the same rail network). This incurs train delay, thereby reducing the efficiency of the rail network. To address this problem, we develop heuristic algorithms that improve conventional dispatching rules to reduce average train delay. We then build a control model and provide the solution procedure to adopt a dynamic headway concept inspired by new signaling technology such as Positive Train Control (PTC). Rail network data for the Southern California region were collected to perform a detailed simulation analysis. The simulation results show significant improvements in network efficiency from our model and algorithms: up to a 21% reduction in average train delay with our best dispatching policy, and a 40% reduction with the dynamic headway control model. The railway network is therefore shown to have the potential to increase throughput capacity by 20%. View the NCST Project Webpage
    Keywords: Engineering, Delays, Freight trains, Freight transportation, Headways, Passenger trains, Positive train control, Railroad tracks, Switches (Railroads)
    Date: 2018–10–01
  11. By: Kim, Daejin; Guin, Angshuman; Rodgers, Michael O; Guensler, Randall
    Abstract: Since heavy-duty truck operations can significantly affect traffic congestion, especially on road grades, the creation of exclusive lanes for trucks has been viewed as a potential alternative to reduce congestion delay, fuel consumption, and emissions. However, few studies have rigorously evaluated the effectiveness of truck-only lanes in achieving these benefits. This study demonstrates a model framework that combines a microscopic traffic simulation with emissions and microscale dispersion models to quantify the potential impacts of truck-only lanes on fuel consumption, emissions, and near-road pollutant concentrations. As a case study, the framework was used to evaluate a proposed $2 billion project to construct 40 miles of truck-only lanes on Interstate 75 (I-75) between Atlanta and Macon, Georgia (USA). The findings of this study suggest that truck-only lanes could significantly improve traffic flow and reduce energy use, emissions, and pollutant concentrations. The research team expects that the extensive simulation results of this study will help stakeholders understand the performance of truck-only lanes on a large-scale network with a heavy mixture of truck and general-purpose lane traffic. The methodology and framework developed in this study can be effectively and efficiently applied to a wide variety of scenarios to evaluate the environmental impacts of other transportation projects under various conditions. View the NCST Project Webpage
    Keywords: Engineering, Air quality, Fuel consumption, Genetic algorithms, Greenhouse gases, Pollutants, Traffic flow, Traffic models, Traffic simulation, Truck lanes, Trucks
    Date: 2018–11–01
  12. By: Lilit Popoyan; Mauro Napoletano; Andrea Roventini
    Abstract: We develop a macroeconomic agent-based model to study how financial instability can emerge from the co-evolution of interbank and credit markets, and the policy responses to mitigate its impact on the real economy. The model is populated by heterogeneous firms, consumers, and banks that locally interact in different markets. In particular, banks provide credit to firms according to Basel II or III macro-prudential frameworks and manage their liquidity in the interbank market. The Central Bank performs monetary policy according to different types of Taylor rules. We find that the model endogenously generates market freezes in the interbank market, which interact with the financial accelerator, possibly leading to firm bankruptcies, banking crises and the emergence of deep downturns. This requires the timely intervention of the Central Bank as a liquidity lender of last resort. Moreover, we find that the joint adoption of a three-mandate Taylor rule tackling credit growth and the Basel III macro-prudential framework is the best policy mix to stabilize financial and real economic dynamics. However, as the Liquidity Coverage Ratio spurs financial instability by increasing the pro-cyclicality of banks' liquid reserves, a new counter-cyclical liquidity buffer should be added to Basel III to improve its performance further. Finally, we find that the Central Bank can also dampen financial instability by employing a new unconventional monetary-policy tool involving active management of the interest-rate corridor in the interbank market.
    Keywords: financial instability; interbank market freezes; monetary policy; macro-prudential policy; Basel III regulation; Tinbergen principle; agent-based models.
    Date: 2019–04–24
  13. By: Chassiakos, Anastasios; Jula, Hossein; VanderBeek, Timothy
    Abstract: This work studies the optimization of scheduling of chassis and container movements at the operational level for individual trucking companies when Chassis Processing Facilities (CPFs) are available for use in the vicinity of a container port within a major metropolitan area. A multi-objective optimization problem is formulated in which the weighted combination of the total travel time for the schedules of all vehicles in the company fleet and the maximum work span across all vehicle drivers during the day is minimized. Time-varying dynamic models for the movements of chassis and containers are developed to be used in the optimization process. The optimal solution is obtained through a genetic algorithm, and the effectiveness of the developed methodology is evaluated through a case study which focuses on the Los Angeles/Long Beach port complex. The case study uses a trucking company located in the Los Angeles region, which can utilize three candidate CPFs for exchange of chassis. The company assigns container movement tasks to its fleet of trucks, with warehouse locations spread across the region. In the simulation scenarios developed for the case study, the use of CPFs at the trucking company level can provide improvements of up to 13% (depending on the specific scenario) over the cases of not using any CPFs. This work found that, for typical cases where the number of jobs is much larger than the number of vehicles in the company fleet, the greatest benefit from CPF use arises when there are significant job-to-job differences with respect to chassis usage. View the NCST Project Webpage
    Keywords: Engineering, Algorithms, Chassis, Container handling, Containerization, Freight handling, Port traffic, Scheduling, Trucking
    Date: 2018–11–01
  14. By: Diego d'Andria (European Commission - JRC); Jason DeBacker (University of South Carolina – Darla Moore School of Business); Richard Evans (University of Chicago - Becker Friedman Institute); Jonathan Pycroft (European Commission - JRC); Magdalena Zachlod-Jelec (European Commission - JRC)
    Abstract: Microsimulation models are increasingly used to calibrate macro models for tax policy analysis. Yet their potential remains underexploited, especially for representing the non-linearity of the tax and social benefit system and the interactions between capital and labour incomes, which play a key role in understanding behavioural effects. Following DeBacker et al. (2018b), we use a microsimulation model to provide the output with which to estimate the parameters of bivariate non-linear tax functions in a macro model. In doing so we make marginal and average tax rates bivariate functions of capital income and labour income. We estimate the parameters of tax functions in order to capture the most important non-linearities of the actual tax schedule, together with interaction effects between labour and capital incomes. To illustrate the methodology, we simulate a reduction in marginal personal income tax rates in Italy with a microsimulation model, translating the microsimulation results into the shock for a dynamic overlapping generations model. Our results show that this policy change affects households differently depending on their age and ability type.
    Keywords: computable models, general equilibrium, overlapping generations, taxation, microsimulation models
    JEL: H24 H31 D58
    Date: 2019–04
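The step of fitting a bivariate tax function to microsimulation output can be sketched with synthetic data. The quadratic-interaction functional form, coefficient values, and income ranges below are illustrative assumptions, not the paper's estimated tax functions.

```python
import numpy as np

# Synthetic "microsimulation output": effective tax rates tau over labour
# income x and capital income y, generated from a known bivariate rule.
rng = np.random.default_rng(0)
x = rng.uniform(10.0, 100.0, 500)   # labour income
y = rng.uniform(0.0, 50.0, 500)     # capital income
tau = 0.10 + 0.002 * x + 0.001 * y + 1e-5 * x * y  # "observed" tax rates

# Recover the parameters of the bivariate tax function tau ~ a + b*x + c*y + d*x*y
# by ordinary least squares; the macro model would then evaluate this fitted
# surface instead of the full microsimulation.
A = np.column_stack([np.ones_like(x), x, y, x * y])
coef, *_ = np.linalg.lstsq(A, tau, rcond=None)
```

The interaction term `d*x*y` is what lets marginal rates on labour income depend on capital income, which is the cross-effect the abstract emphasizes.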
  15. By: GENEVOIS Anne-Sophie; LIEGEOIS Philippe; PI ALPERIN Maria Noel
    Abstract: We are facing one of the most important demographic events of recent decades in Europe: population ageing. This process will have significant economic effects, particularly on health. As most diseases are age-related, it might imply a proportionally higher share of individuals with declining health. Being able to forecast the health status of the population can help address concerns about the financial and social sustainability of several public policies, including health policy. In this paper, we present the DyMH_LU model, a dynamic microsimulation model focused exclusively on the health status of the Luxembourgish population. One of its major characteristics is that it simulates more than sixty different diseases and limitations in activities of daily living. All this simulated information can be aggregated to compute, for each period, the overall health status of each individual, the marginal distribution of each disease in the total population, and the global health status of the entire population. The starting point of the DyMH_LU model is the information collected in 2015 in Wave 6 of the SHARE database, which targets individuals aged 51 or older. The simulation period covers 2017 to 2045.
    Keywords: Dynamic microsimulation; Health; SHARE; Luxembourg
    JEL: C01 C02 I10
    Date: 2019–04
  16. By: Bekkers, Eddy
    Abstract: This paper compares the solution methods and baseline calibration of three different quantitative trade models (QTMs): computable general equilibrium (CGE) models, structural gravity (SG) models and models employing exact hat algebra (EHA). The different solution methods generate identical results on counterfactual experiments if baseline trade shares or baseline trade costs are identical. SG models, calibrating the baseline to gravity-predicted shares, potentially suffer from bias in the predicted welfare effects as a result of misspecification of the gravity equation, whereas the other methods, calibrating to actual shares, potentially suffer from bias as a result of random variation and measurement error of trade flows. Simulations show that predicted shares calibration can generate large biases in predicted welfare effects if the gravity equation does not contain pairwise fixed effects or is estimated without domestic trade flows. Calibration to actual shares and to fitted shares based on gravity estimation including pairwise fixed effects display similar performance in terms of robustness to the different sources of bias.
    Keywords: quantitative trade models,baseline calibration,free trade agreements,gravity estimation
    JEL: F13 F14 F15
    Date: 2019
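The exact-hat-algebra (EHA) solution method the paper compares can be illustrated in a one-sector Armington setting: given baseline trade shares and incomes, counterfactual changes are solved entirely in proportional-change ("hat") terms. The two-country numbers, trade elasticity, and damping scheme below are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

def hat_counterfactual(pi, income, tau_hat, theta=4.0, tol=1e-12, damp=0.5):
    # Minimal exact-hat-algebra loop: pi[i, j] is the baseline share of
    # importer j's spending on goods from i, income[j] baseline income,
    # tau_hat[i, j] the proportional change in trade costs, theta the trade
    # elasticity. Solve for wage changes w_hat by damped fixed-point
    # iteration; welfare change is pi_hat_jj ** (-1/theta) (ACR formula).
    w_hat = np.ones_like(income)
    for _ in range(10000):
        cost = (w_hat[:, None] * tau_hat) ** (-theta)
        pi_new = pi * cost / (pi * cost).sum(axis=0, keepdims=True)
        sales = pi_new @ (w_hat * income)   # counterfactual sales by exporter
        w_new = sales / income              # market clearing: wages = sales/labor
        w_new = w_new / w_new.mean()        # normalize the wage index
        if np.max(np.abs(w_new - w_hat)) < tol:
            break
        w_hat = damp * w_hat + (1.0 - damp) * w_new
    welfare_hat = (np.diag(pi_new) / np.diag(pi)) ** (-1.0 / theta)
    return w_hat, welfare_hat

# Symmetric two-country baseline; then a uniform rise in bilateral trade costs.
pi = np.array([[0.7, 0.3], [0.3, 0.7]])
income = np.array([1.0, 1.0])
w_same, welf_same = hat_counterfactual(pi, income, np.ones((2, 2)))
tau_up = np.array([[1.0, 1.1], [1.1, 1.0]])
w_up, welf_up = hat_counterfactual(pi, income, tau_up)
```

Note that the loop needs only baseline shares and the elasticity, which is exactly why, as the paper stresses, the choice between actual, fitted, and gravity-predicted baseline shares drives the welfare predictions.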
  17. By: Ni, Wei; Cassidy, Michael J
    Abstract: Optimal cordon-metering rates are obtained using Macroscopic Fundamental Diagrams (MFDs) in combination with flow conservation laws. A model-predictive control algorithm is also used so that time-varying metering rates are generated based on their forecasted impacts. Our scalable algorithm can do this for an arbitrary number of cordoned neighborhoods within a city. Unlike its predecessors, the proposed model accounts for the constraining effects that cordon queues impose on a neighborhood's circulating traffic. It does so at every time step by approximating the neighborhood's street space occupied by cordon queues and re-scaling the MFD downward to describe the resulting state of circulating traffic. The model is also unique in that it differentiates between saturated and under-saturated cordon-metering operations. Computer simulations show that these enhancements can substantially improve the predictions of both the trip completion rates in a neighborhood and the rates at which vehicles cross metered cordons. Optimal metering policies generated as a result are similarly shown to do a better job of reducing the Vehicle Hours Traveled (VHT) in a city. The VHT reductions stemming from the proposed model and from its predecessors differed by as much as 18%.
    Keywords: Engineering, Traffic control, Traffic models, Algorithms, Urban transportation
    Date: 2018–03–01
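    The MFD re-scaling idea described above can be sketched in a few lines. This is an illustrative toy, not the authors' model: the parabolic MFD shape, the jam accumulation `n_jam`, and all parameter values are assumptions.

    ```python
    def rescaled_production(n_circ, queue_veh, n_jam=800.0, p_max=120.0):
        """Toy MFD re-scaling (illustrative assumption, not the paper's model).

        A parabolic MFD gives trip production p(n) = 4*p_max*x*(1-x), where
        x = n/n_jam is the normalized accumulation. Cordon queues occupy
        `queue_veh` vehicles' worth of the neighborhood's street space, so
        both the usable jam accumulation and the peak production are scaled
        by the free-space fraction before evaluating circulating traffic.
        """
        free_frac = max(0.0, 1.0 - queue_veh / n_jam)
        if free_frac == 0.0:
            return 0.0  # queues fill the neighborhood; nothing circulates
        x = min(n_circ / (n_jam * free_frac), 1.0)
        return 4.0 * p_max * free_frac * x * (1.0 - x)
    ```

    With no queue, production peaks at `p_max` at half the jam accumulation; as queues grow, the same circulating accumulation yields less production, which is the constraining effect the model captures.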
  18. By: Junyao Chen; Tony Sit; Hoi Ying Wong
    Abstract: Value-at-risk (VaR) has served as a standard risk measure since its introduction. In practice, the delta-normal approach is usually adopted to approximate the VaR of portfolios with option positions. Its effectiveness, however, diminishes substantially when the portfolios involve a large number of derivative positions with nonlinear payoffs; the lack of closed-form pricing solutions for these potentially highly correlated, American-style derivatives further complicates the problem. This paper proposes a generic simulation-based algorithm for VaR estimation that can be easily applied to any existing procedure. Our proposal leverages cross-sectional information and applies variable selection techniques to simplify the existing simulation framework. Asymptotic properties of the new approach demonstrate faster convergence due to the additional model selection component. We also present sets of numerical experiments that verify the effectiveness of our approach in comparison with existing strategies.
    Date: 2019–04
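    For reference, the delta-normal approximation that the paper benchmarks against can be written in a few lines; the function name and all numbers in the usage below are illustrative assumptions, not the paper's data.

    ```python
    import math

    def delta_normal_var(deltas, cov, z=1.645, horizon_days=1.0):
        """Delta-normal VaR: portfolio P&L is linearized as sum_i delta_i*dS_i,
        and the risk-factor changes dS are assumed jointly normal with daily
        covariance `cov`, giving VaR = z * sqrt(horizon * delta' cov delta).
        It is this linear approximation that breaks down for large books of
        nonlinear, American-style derivatives, motivating simulation-based
        estimators such as the one the paper proposes.
        """
        quad = sum(di * cov[i][j] * dj
                   for i, di in enumerate(deltas)
                   for j, dj in enumerate(deltas))
        return z * math.sqrt(horizon_days * quad)
    ```

    For a single position with delta 1 and daily variance 4, the one-day 95% VaR is 1.645 * 2 = 3.29.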
  19. By: Gbologah, Franklin E.; Rodgers, Michael O.; Li, Hanyan "Ann"
    Abstract: The cost of fuel represents a major portion of the cost of operating on-road heavy-duty vehicles (HDVs). Over the next couple of decades, total energy demand from the HDV sector will likely increase due to forecasted growth in freight demand in many global markets, including the United States, and much of this energy will continue to be provided by fossil fuels. Carbon dioxide emissions from the HDV sector are therefore also expected to increase in the absence of effective measures to reduce the sector's reliance on fossil fuels. Among other fuel-saving technologies, the United States Environmental Protection Agency has identified Low Rolling Resistance (LRR) tires as an effective method of reducing fuel consumption: LRR tires are estimated to improve HDV fuel economy by about 10 percent. Adoption of LRR tires nevertheless faces many barriers, the most fundamental of which relate to potential performance uncertainties under real-world operating conditions. Previously published decision-support tools developed to help fleet operators and other stakeholders estimate the fuel savings from LRR tires have been found to have limited accuracy because of the transient speed profiles inherent in real-world operating cycles. In this study, we develop a tool to predict the fleet-wide fuel-saving benefits of low rolling resistance tires. Unlike previous tools, it is applicable to both stabilized and transient speed operations. The tool is based on empirical models that estimate the fuel consumption contribution of tires as a function of vehicle payload, aerodynamic drag, road grade, duration of acceleration, duration of deceleration, and road facility type (freeway, major arterial, and minor arterial/local road). We limited the scope of the tool to tractor-trailers in the U.S. heavy-duty vehicle market, because the United States has the second largest HDV market in the world and tractor-trailers account for the largest share of that market. The tool was developed with data generated by simulating real-world heavy-duty vehicle operating cycles in Autonomie®, a state-of-the-art model for automotive control-system design and for simulating vehicle energy consumption and performance, and a preferred vehicle simulation tool of the United States Department of Energy. The primary purpose of the Tool to Predict Fleet-Wide Heavy-Duty Vehicle Fuel-Saving from Low Rolling Resistance Tires is to assist fleet operators, regulatory agencies, and policy analysts in assessing the fuel consumption savings from low rolling resistance tires. To facilitate ease of use, the statistical empirical models are embedded in a Microsoft Excel® spreadsheet. Fleet managers can customize the tool to their specific fleet, and the tool is designed to inform fleet operators about the benefits and costs of investing in low rolling resistance tires. In addition to fuel consumption estimates, the spreadsheet tool estimates related emission reductions. In the future, the tool can be extended to other vehicle segments, and the spreadsheet algorithms can be developed into a web-based program for online use. The HDV Low Rolling Resistance Tire Fuel and Emission Reduction Calculator is available for download as a spreadsheet tool from the NCST project webpage.
    Keywords: Engineering, Physical Sciences and Mathematics, Data analysis, Fleet management, Fuel consumption, Heavy duty vehicles, Rolling resistance, Simulation, Tires, Tractor trailer combinations, Traffic speed, Vehicle fleets
    Date: 2018–10–01
  20. By: Park, Chan Seung; Roy, Partho
    Abstract: A portable, economical, and reliable sensor for natural gas (NG) fuel quality has been developed. The Wobbe Index (WI), Methane Index (MI), and inert gas content (inert%) of the NG fuel can all be measured in real time within 5% accuracy. The sensor is intended for use in any equipment that involves NG combustion, including NG vehicles, boilers, building HVAC, various consumer-level gas appliances, and the Variable Natural Gas Vehicle (VNGV). The VNGV is an NG vehicle that can operate on any arbitrary mixture of CH4 and CO2, thus allowing the use of Renewable Natural Gas (RNG), including biogas, for transportation without comprehensive gas cleanup/upgrading. The underlying approach is to predict the “values of interest” (WI, MI, and inert%) from the signals of easily measurable physical properties (such as thermal conductivity and temperature), as shown in the figure. Predicting the values of interest by data mining (especially multivariate analysis and/or artificial neural networks) is the key idea of the concept. The technology is non-invasive, rugged, and small in size, promising to overcome limitations of conventional measurement technology such as bulky size and intrusive nature. VNGV technology will enable widespread use of RNG as a transportation fuel, resulting in significant reductions in GHG emissions in the transportation sector.
    Keywords: Engineering
    Date: 2017–12–01
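    The core "predict values of interest from measurable properties" step can be sketched as a one-variable least-squares fit; the paper itself uses multivariate analysis and/or a neural network, and the data in the usage below are synthetic placeholders, not real gas measurements.

    ```python
    def fit_linear(xs, ys):
        """Ordinary least squares for y = a*x + b, mapping one measurable
        physical property (e.g. thermal conductivity) to one value of
        interest (e.g. Wobbe Index). A minimal stand-in for the multivariate
        models described in the abstract."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxx = sum((x - mx) ** 2 for x in xs)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        a = sxy / sxx
        return a, my - a * mx
    ```

    Fitting exact line data such as (1, 3), (2, 5), (3, 7) recovers slope 2 and intercept 1; a calibrated fit like this (or its multivariate/neural extension) is then evaluated on new sensor readings in real time.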
  21. By: Bekkers, Eddy; Teh, Robert
    Abstract: The WTO Global Trade Model is employed to project the medium-run economic effects of a global trade conflict. The trade conflict scenario is based on recent estimates in the literature of the difference between cooperative and non-cooperative tariffs. The study provides three main insights. First, the projected macroeconomic effects in the medium run are considerable. A global trade conflict starting in 2019 would reduce global GDP in 2022 by about 1.96% and global trade by about 17% relative to the baseline. For context, global GDP fell about 2.1% and global trade 12.4% during the global financial crisis of 2009. Second, behind the single-digit aggregate production effects lie much larger, double-digit sectoral production effects in many countries, implying a painful adjustment process. In general, a global trade conflict reallocates resources away from the most efficient allocation based on comparative advantage. Third, the large swings in sectoral production lead to substantial labour displacement. On average, 1.15% of high-skilled and 1.74% of low-skilled workers would leave their initial sector of employment.
    Keywords: computable general equilibrium (CGE) model, Nash tariffs, revealed comparative advantage (RCA), labour displacement
    JEL: B41 C63 F13 F16 F51 F53
    Date: 2019
  22. By: Tarcisio M. Rocha Filho; Paulo M. M. Rocha
    Abstract: We present indications of inefficiency in the Brazilian stock market based on the existence of strong long-time cross-correlations with foreign markets and indices. Our results show a strong dependence on foreign market indices such as the S&P 500 and CAC 40, but not on the Shanghai SSE 180, indicating an intricate interdependence. We also show that the distribution of log-returns of the Brazilian BOVESPA index has a discrete fat tail at the daily time scale, which is also a deviation from what is expected of an efficient, equilibrated market. As a final argument for the inefficiency of the Brazilian stock market, we use a neural network approach to forecast the direction of movement of the value of IBOVESPA futures contracts, with an accuracy that allows financial returns above passive strategies.
    Date: 2019–04
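    The evaluation behind the last claim, scoring a direction-of-movement forecaster, reduces to a hit-rate computation; the series in the usage below are made-up illustrations, not the paper's data.

    ```python
    def directional_accuracy(predicted, realized):
        """Fraction of periods in which the forecast sign matches the sign
        of the realized log-return -- the natural score for a classifier
        that predicts the direction of futures-price movements."""
        hits = sum(1 for p, r in zip(predicted, realized) if (p > 0) == (r > 0))
        return hits / len(realized)
    ```

    An accuracy reliably above the base rate of the more frequent direction is what would allow a trading rule to beat a passive strategy before costs.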
  23. By: Liu, Hao; Kan, Xingan David; Shladover, Steven E.; Lu, Xiao-Yun
    Keywords: Engineering
    Date: 2018–05–17
  24.
    Abstract: The purpose of the paper is to integrate supply base consolidation, rationalization, and the buyer's perspective on its suppliers in order to provide more insight into implementing sustainable procurement in small and medium enterprises (SMEs) in developing countries such as India. The paper integrates Constrained Optimization of the Frobenius Norm by Genetic Algorithm (COFGA) with traditional spend analysis and value risk analysis to consolidate and rationalize the supply base with respect to fifteen triple bottom line (TBL) indicators. It shows that spend analysis is justified in the crisp domain but becomes myopic in limited-data environments, and is even less effective at handling imprecise and vague qualitative data. An integrated approach combining multiple criteria decision analysis, spend analysis, and value risk analysis thus offers better insight into sustainable procurement in a fuzzy environment. Finally, a case study illustrating the proposed method is discussed.
    Keywords: Sustainable supplier selection; small and medium enterprises (SMEs); genetic algorithm (GA); spend analysis; triple bottom line (TBL); multiple criteria decision analysis; value risk analysis
    JEL: C61 C63
    Date: 2019–03–31
  25. By: Horowitz, Roberto; Kurzhanskiy, Alex A.; Wright, Mathew
    Keywords: Engineering
    Date: 2018–06–25
  26. By: Prante, Franz J.; Barmucci, Alessandro; Hein, Eckhard; Truger, Achim
    Abstract: The aim of our contribution is to present an innovative instrument for teaching macroeconomics at the undergraduate and master's level. We develop a digital learning platform to present and explore some controversies at the very foundations of macroeconomic theory. For this purpose, we explicitly present two competing paradigms, the new-Keynesian and the post-Keynesian one. Several interactive scenarios are made available in which the user can take control of different economic policy instruments and is guided through a set of problems that require appropriate actions in the context of the different approaches.
    Keywords: macroeconomics teaching, simulations, pluralism, new Keynesian macroeconomics, post-Keynesian macroeconomics
    JEL: A22 A23 E12 E17 E60
    Date: 2019

This nep-cmp issue is ©2019 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.