on Computational Economics
Issue of 2018‒02‒26
eight papers chosen by
By: | Kohn, Robert; Nguyen, Nghia; Nott, David; Tran, Minh-Ngoc |
Abstract: | Deep neural networks (DNNs) are a powerful tool for functional approximation. We describe flexible versions of generalized linear and generalized linear mixed models incorporating basis functions formed by a deep neural network. Neural networks with random effects appear little used in the literature, perhaps because of the computational challenges of incorporating subject-specific parameters into already complex models. Efficient computational methods for Bayesian inference are developed based on Gaussian variational approximation methods. A parsimonious but flexible factor parametrization of the covariance matrix is used in the Gaussian variational approximation. We implement natural gradient methods for the optimization, exploiting the factor structure of the variational covariance matrix to perform fast matrix-vector multiplications in the iterative conjugate gradient linear solvers used in the natural gradient computations. The method can be implemented in high dimensions, and the use of the natural gradient allows faster and more stable convergence of the variational algorithm. In the case of random effects, we compute unbiased estimates of the gradient of the lower bound in the model with the random effects integrated out by making use of Fisher's identity. The proposed methods are illustrated in several examples for DNN random effects models and high-dimensional logistic regression with sparse signal shrinkage priors.
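As a sketch of the machinery described above, the following illustrative code fits a Gaussian variational approximation with factor covariance Sigma = BB' + diag(d^2) to a toy Bayesian logistic regression, using the reparametrization gradient. Plain SGD replaces the paper's natural gradient, and the dimensions, learning rate, and prior are assumptions chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: Bayesian logistic regression with a N(0, I) prior.
n, p, k = 200, 10, 2                      # observations, parameters, factors
X = rng.normal(size=(n, p))
beta_true = rng.normal(size=p)
y = (rng.random(n) < 1 / (1 + np.exp(-X @ beta_true))).astype(float)

def log_joint(theta):
    """log p(y | theta) + log p(theta)."""
    eta = X @ theta
    return np.sum(y * eta - np.log1p(np.exp(eta))) - 0.5 * theta @ theta

def grad_log_joint(theta):
    return X.T @ (y - 1 / (1 + np.exp(-X @ theta))) - theta

def sigma_inv_mult(B, d, v):
    """Sigma^{-1} v for Sigma = B B' + diag(d^2) via the Woodbury identity,
    so the cost stays linear in p, as in the factor parametrization."""
    Dinv2B = B / d[:, None] ** 2
    M = np.eye(B.shape[1]) + B.T @ Dinv2B
    w = v / d ** 2
    return w - Dinv2B @ np.linalg.solve(M, B.T @ w)

# Variational parameters: q(theta) = N(mu, B B' + diag(exp(rho)^2))
mu = np.zeros(p)
B = 0.01 * rng.normal(size=(p, k))
rho = np.log(0.1) * np.ones(p)

lr = 0.01
for it in range(5000):
    d = np.exp(rho)
    z, eps = rng.normal(size=k), rng.normal(size=p)
    theta = mu + B @ z + d * eps          # reparametrization trick
    # Unbiased ELBO gradient in theta: grad log p(theta, y) - grad log q(theta)
    h = grad_log_joint(theta) + sigma_inv_mult(B, d, theta - mu)
    mu += lr * h
    B += lr * np.outer(h, z)
    rho += lr * h * eps * d               # chain rule for d = exp(rho)
```

Parametrizing the diagonal part on the log scale keeps it positive and the sketch numerically stable; the paper's natural gradient and conjugate gradient machinery would replace the plain SGD updates here.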
Keywords: | Variational approximation; Stochastic optimization; Reparametrization gradient; Factor models |
Date: | 2017 |
URL: | http://d.repec.org/n?u=RePEc:syb:wpbsba:2123/17877&r=cmp |
By: | Gioele Figus (Centre for Energy Policy, University of Strathclyde); Kim Swales (Department of Economics, University of Strathclyde) |
Abstract: | This paper demonstrates the importance of considering both energy and non-energy efficiency improvements in the provision of energy-intensive household services. Using the example of private transport, we analyse whether vehicle efficiency can beat fuel efficiency in cutting fuel use. We find that this ultimately depends on the elasticity of demand for transport, the substitutability between vehicles and fuels, and the initial share of fuel use in private transport. The framework also allows us to identify ‘multiple benefits’ of technical progress in private transport by considering both the ability of such policy to reduce fuel demand and to increase the consumer’s surplus. We extend the partial equilibrium framework by using computable general equilibrium (CGE) simulations to identify the system-wide impacts on total fuel use of the two alternative efficiency changes. Simulation results suggest that the substitution effects identified in the partial equilibrium analysis are an important element in determining the change in total fuel use resulting from these consumption efficiency changes. However, the identification of associated changes in intermediate fuel demand, plus the potential expansionary effects of the improvements in household efficiency transmitted through the labour market, can generate general equilibrium effects that vary substantially from those derived using partial equilibrium analysis.
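The partial-equilibrium dependence on the demand elasticity, the vehicle-fuel substitutability, and the fuel share can be illustrated numerically. The CES functional form, the parameter values, and the 10% efficiency improvements below are assumptions for illustration, not the paper's calibration:

```python
import numpy as np

def fuel_use(gamma_v=1.0, gamma_f=1.0, sigma=0.5, eta=0.3, a=0.7,
             p_v=1.0, p_f=1.0):
    """Household fuel demand when transport services are a CES composite of
    effective vehicle services (gamma_v * V) and effective fuel (gamma_f * F).
    sigma: vehicle-fuel elasticity of substitution; eta: price elasticity of
    transport demand.  All functional forms and numbers are illustrative."""
    pv_eff, pf_eff = p_v / gamma_v, p_f / gamma_f        # effective input prices
    c = (a**sigma * pv_eff**(1 - sigma)
         + (1 - a)**sigma * pf_eff**(1 - sigma))**(1 / (1 - sigma))  # unit cost
    T = c**(-eta)                                        # transport demand
    F_eff = (1 - a)**sigma * (pf_eff / c)**(-sigma) * T  # Shephard's lemma
    return F_eff / gamma_f                               # physical fuel

base = fuel_use()
print(fuel_use(gamma_v=1.1) / base - 1)   # fuel change, 10% vehicle efficiency
print(fuel_use(gamma_f=1.1) / base - 1)   # fuel change, 10% fuel efficiency
```

In this toy setup, a vehicle-efficiency gain raises total fuel use whenever the transport demand elasticity eta exceeds the substitution elasticity sigma (the rebound in transport demand dominates), while a fuel-efficiency gain still cuts fuel use at the parameter values above.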
Keywords: | energy services, technical progress, energy efficiency, partial equilibrium, general equilibrium
JEL: | C68 D58 Q43 Q48 |
Date: | 2018–02 |
URL: | http://d.repec.org/n?u=RePEc:str:wpaper:1802&r=cmp |
By: | Hans Bühler; Lukas Gonon; Josef Teichmann; Ben Wood
Abstract: | We present a framework for hedging a portfolio of derivatives in the presence of market frictions such as transaction costs, market impact, liquidity constraints or risk limits, using modern deep reinforcement machine learning methods. We discuss how standard reinforcement learning methods can be applied to non-linear reward structures, i.e., in our case, convex risk measures. As a general contribution to the use of deep learning for stochastic processes, we also show that the set of constrained trading strategies used by our algorithm is large enough to $\epsilon$-approximate any optimal solution. Our algorithm can be implemented efficiently even in high-dimensional situations using modern machine learning tools. Its structure does not depend on specific market dynamics, and generalizes across hedging instruments, including the use of liquid derivatives. Its computational performance is largely invariant to the size of the portfolio, as it depends mainly on the number of hedging instruments available. We illustrate our approach by showing the effect on hedging under transaction costs in a synthetic market driven by the Heston model, where we outperform the standard "complete market" solution.
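The ingredients of this approach (simulated paths, a parametrized trading strategy, transaction costs, and a convex risk measure as the training objective) can be sketched without a neural network by optimizing a single policy parameter. Geometric Brownian motion stands in for the paper's Heston dynamics, and everything here is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated market: GBM paths stand in for the paper's Heston dynamics,
# and a one-parameter policy stands in for the neural network.
n_paths, n_steps = 5000, 30
S0, K, vol, dt = 100.0, 100.0, 0.2, 1 / 365
cost = 0.002                               # proportional transaction cost

Z = rng.normal(size=(n_paths, n_steps))
logret = -0.5 * vol**2 * dt + vol * np.sqrt(dt) * Z
S = np.concatenate([np.full((n_paths, 1), S0),
                    S0 * np.exp(np.cumsum(logret, axis=1))], axis=1)

def entropic_risk(pnl, lam=1.0):
    """Convex risk measure rho(X) = (1/lam) * log E[exp(-lam * X)]."""
    return np.log(np.mean(np.exp(-lam * pnl))) / lam

def hedged_pnl(width):
    """PnL of a short call hedged by delta_t = sigmoid((S_t - K) / width);
    'width' is the single trainable knob a network would generalize."""
    delta = 1.0 / (1.0 + np.exp(-(S[:, :-1] - K) / width))
    trades = np.diff(np.concatenate([np.zeros((n_paths, 1)), delta], axis=1),
                     axis=1)
    gains = np.sum(delta * np.diff(S, axis=1), axis=1)
    fees = cost * np.sum(np.abs(trades) * S[:, :-1], axis=1)
    payoff = np.maximum(S[:, -1] - K, 0.0)
    return gains - fees - payoff

# "Training": minimize the risk measure over the policy parameter.
widths = np.linspace(0.5, 20.0, 40)
risks = [entropic_risk(hedged_pnl(w)) for w in widths]
best = widths[int(np.argmin(risks))]
```

The optimized policy carries far less entropic risk than the unhedged short call; the paper replaces the single parameter with a deep network and the grid search with stochastic gradient training.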
Date: | 2018–02 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1802.03042&r=cmp |
By: | Matt Taddy |
Abstract: | We have seen in the past decade a sharp increase in the extent that companies use data to optimize their businesses. Variously called the 'Big Data' or 'Data Science' revolution, this has been characterized by massive amounts of data, including unstructured and nontraditional data like text and images, and the use of fast and flexible Machine Learning (ML) algorithms in analysis. With recent improvements in Deep Neural Networks (DNNs) and related methods, application of high-performance ML algorithms has become more automatic and robust to different data scenarios. That has led to the rapid rise of an Artificial Intelligence (AI) that works by combining many ML algorithms together – each targeting a straightforward prediction task – to solve complex problems. We will define a framework for thinking about the ingredients of this new ML-driven AI. Having an understanding of the pieces that make up these systems and how they fit together is important for those who will be building businesses around this technology. Those studying the economics of AI can use these definitions to remove ambiguity from the conversation on AI's projected productivity impacts and data requirements. Finally, this framework should help clarify the role for AI in the practice of modern business analytics and economic measurement.
JEL: | C01 C1 O33 |
Date: | 2018–02 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:24301&r=cmp |
By: | Ana Arencibia Pareja (Banco de España); Samuel Hurtado (Banco de España); Mercedes de Luis López (Banco de España); Eva Ortega (Banco de España) |
Abstract: | The Quarterly Model of Banco de España (MTBE, Modelo Trimestral del Banco de España) is a large-scale macro-econometric model used for medium-term macroeconomic forecasting of the Spanish economy, as well as for performing scenario simulations. The model is specified as a large set of error correction equations and, especially in the short run, is mostly demand driven. This paper presents an update of the model, estimated with data from 1995 to 2014. In this iteration, the econometric techniques used in estimation have been substantially revamped. Despite this, changes in coefficients and simulation results with respect to the previous version of the model are smaller than what we saw in earlier updates. Compared with MTBE-2014, this new version (MTBE-2017) shows less response of demand to interest rates and stock market prices but more to credit, less response of GDP to world demand but more to world prices and to the price of oil, more positive effects on output and employment from price and wage moderation, and slightly faster and bigger fiscal multipliers for some shocks (government consumption and investment, direct taxes to households) but smaller for others (indirect taxes, direct taxes to firms).
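The error-correction form underlying the model can be illustrated with a two-step Engle-Granger sketch on simulated data. The variables, coefficients (long-run slope 0.8, adjustment speed -0.3), and sample size are hypothetical and are not taken from the MTBE:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated quarterly series: y (e.g. log consumption) is tied to x (e.g. log
# income) by a long-run relation y* = 0.8 x; all numbers are hypothetical,
# chosen only to illustrate one error-correction equation.
T = 400
x = np.cumsum(0.005 + 0.01 * rng.normal(size=T))       # integrated driver
y = np.zeros(T)
for t in range(1, T):
    ecm = y[t - 1] - 0.8 * x[t - 1]                    # lagged disequilibrium
    y[t] = (y[t - 1] - 0.3 * ecm + 0.5 * (x[t] - x[t - 1])
            + 0.005 * rng.normal())

# Step 1: long-run relation estimated by OLS in levels
beta, intercept = np.linalg.lstsq(np.column_stack([x, np.ones(T)]), y,
                                  rcond=None)[0]
resid = y - beta * x - intercept

# Step 2: short-run dynamics with the lagged residual as the
# error-correction term
dy, dx = np.diff(y), np.diff(x)
Z = np.column_stack([resid[:-1], dx, np.ones(T - 1)])
alpha, gamma, const = np.linalg.lstsq(Z, dy, rcond=None)[0]
print(f"long-run slope {beta:.2f}, adjustment {alpha:.2f}, impact {gamma:.2f}")
```

A negative estimated adjustment coefficient is what pulls the short-run dynamics back toward the long-run relation; the MTBE stacks many such equations across sectors.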
Keywords: | Spanish economy, macroeconometric model |
JEL: | E10 E17 E20 E60 |
Date: | 2017–12 |
URL: | http://d.repec.org/n?u=RePEc:bde:opaper:1709&r=cmp |
By: | Adam M. Guren (Boston University); Timothy J. McQuade (Stanford University) |
Abstract: | We present a dynamic search model in which foreclosures exacerbate housing busts and delay the housing market's recovery. By eroding lender equity, destroying the credit of potential buyers, and making buyers more selective, foreclosures freeze the market for non-foreclosures and can cause price-default spirals that amplify an initial shock. To quantitatively assess these channels, the model is calibrated to the recent bust. The amplification is significant: ruined credit and choosy buyers account for 22.5 percent of the total decline in non-distressed prices, and lender losses account for an additional 30 percent. We use our model to evaluate foreclosure mitigation policies and find that payment reduction is quite effective, but creating a single seller of foreclosures that holds them off the market until demand picks up is the most effective policy. Policies that slow down the pace of foreclosures can be counterproductive.
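The price-default spiral can be caricatured as a fixed point in which lower prices put more borrowers underwater, generating foreclosures that depress prices further. The functional form and numbers below are purely illustrative and are not the paper's search model:

```python
def equilibrium_price(shock, feedback=0.5, tol=1e-10):
    """Solve p = 1 - shock - feedback * foreclosures(p) by fixed-point
    iteration, where the foreclosure rate rises as prices fall."""
    p = 1.0
    for _ in range(10_000):
        foreclosures = max(0.0, 1.0 - p)   # stylized share of underwater sales
        p_new = 1.0 - shock - feedback * foreclosures
        if abs(p_new - p) < tol:
            break
        p = p_new
    return p

direct = 0.10                              # 10% initial demand shock
p_no_spiral = equilibrium_price(direct, feedback=0.0)   # -> 0.90
p_spiral = equilibrium_price(direct, feedback=0.5)      # -> 0.80
amplification = (1.0 - p_spiral) / (1.0 - p_no_spiral)  # total / direct decline
```

Here a feedback of 0.5 turns a 10% shock into a 20% equilibrium decline, an amplification factor of 2; the paper's decomposition (22.5% from ruined credit and choosy buyers, 30% from lender losses) comes from the full calibrated search model, not from this toy iteration.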
Keywords: | Housing Prices & Dynamics, Foreclosures, Search, Great Recession |
JEL: | E30 R31 |
URL: | http://d.repec.org/n?u=RePEc:bos:wpaper:wp2018-007&r=cmp |
By: | Chia-Lin Chang (National Chung Hsing University); Michael McAleer (Asia University; University of Sydney Business School; Erasmus University); Wing-Keung Wong (Asia University; China Medical University Hospital; Lingnan University)
Abstract: | The paper provides a review of the literature that connects Big Data, Computational Science, Economics, Finance, Marketing, Management, and Psychology, and discusses some research related to the seven disciplines. Academics could develop theoretical models and subsequent econometric and statistical models to estimate the parameters in the associated models, as well as conduct simulations to examine whether the resulting estimators and hypothesis tests have good size and high power. Thereafter, academics and practitioners could apply the theory to analyse some interesting issues in the seven disciplines and cognate areas.
Keywords: | Big Data; Computational science; Economics; Finance; Management; Theoretical models; Econometric and statistical models; Applications. |
JEL: | A10 G00 G31 O32 |
Date: | 2018–02–03 |
URL: | http://d.repec.org/n?u=RePEc:tin:wpaper:20180011&r=cmp |
By: | Rui Fan (School of Management, Swansea University); Oleksandr Talavera (School of Management, Swansea University); Vu Tran (School of Management, Swansea University) |
Abstract: | This study investigates whether investors react to signals of an association between a firm and Donald Trump, indicated by tweets containing both the word 'Trump' (or '@realDonaldTrump') and an S&P 500 firm name. Our results reveal that a large number of such tweets ignite speculation about a political connection between a firm and the US President, thus affecting investors' trading behavior. In particular, a rise in these tweets induces significant increases in trading volatility and trading volume. However, we do not find comparably large impacts on stock returns, suggesting that the speculation is more likely to spread among small non-institutional investors. Further investigation shows that the sentiments embedded in these tweets have limited influence. There is evidence of a change in market behavior towards such tweets centered on the President's inauguration.
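The first step of such a study, flagging tweets that mention both Trump and a specific firm, can be mimicked with simple keyword matching. The firm aliases and sample tweets below are invented for illustration and are not the paper's S&P 500 name lists or its text classifier:

```python
import re
from collections import Counter

# Hypothetical ticker -> name aliases; the real study maps S&P 500 firms.
firms = {"AAPL": ["apple"], "F": ["ford"], "BA": ["boeing"]}
trump = re.compile(r"trump|@realdonaldtrump", re.IGNORECASE)

tweets = [
    ("2017-01-03", "Trump tweets about Ford moving jobs back to the US"),
    ("2017-01-03", "@realDonaldTrump slams Boeing over Air Force One costs"),
    ("2017-01-04", "Apple earnings preview"),          # no Trump reference
    ("2017-01-04", "Trump meets Apple CEO"),
]

def daily_counts(tweets):
    """Count, per (day, ticker), tweets naming both Trump and the firm."""
    counts = Counter()
    for day, text in tweets:
        if not trump.search(text):
            continue
        low = text.lower()
        for ticker, aliases in firms.items():
            if any(alias in low for alias in aliases):
                counts[(day, ticker)] += 1
    return counts
```

Daily firm-level counts like these are the kind of regressor the paper relates to trading volatility and volume around the inauguration.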
Keywords: | Twitter, US Election, stock market, investor sentiment, text classification, computational linguistics. |
JEL: | D72 G12 G14 L86 |
Date: | 2018–02–19 |
URL: | http://d.repec.org/n?u=RePEc:swn:wpaper:2018-07&r=cmp |