on Computational Economics
Issue of 2019‒03‒25
nineteen papers chosen by
By: | Martin Christensen (European Commission - JRC); Andrea Conte (European Commission - JRC); Simone Salotti (European Commission - JRC) |
Abstract: | In 2018 the European Commission published its proposal for its future research and innovation (R&I) programme, Horizon Europe, a €100 billion programme that will succeed Horizon 2020. Horizon Europe is designed around three pillars: supporting researchers and projects (Open Science), pursuing industrial leadership related to societal issues (Global Challenges), and boosting market-creating innovation (Open Innovation). The RHOMOLO dynamic CGE model has been used for policy simulations to estimate the economic impact of Horizon Europe. The analysis compares three alternative policy designs to a scenario without the policy: Continuation, in which Horizon Europe is implemented similarly to the previous Horizon 2020; Centralisation, in which the programme is reinforced by centralising at the EU level a third of the national competition-based project funding; and Decentralisation, in which the programme is implemented at the national level. The RHOMOLO simulations suggest that Horizon Europe can contribute to higher aggregate GDP and employment, with considerable potential regional heterogeneity. |
Keywords: | rhomolo, region, growth, impact assessment, modelling, R&D, R&I, Horizon Europe, Horizon 2020, investment |
JEL: | C68 E61 R12 |
Date: | 2019–03 |
URL: | http://d.repec.org/n?u=RePEc:ipt:iptwpa:jrc115437&r=all |
By: | Masoud Fekri; Babak Barazandeh |
Abstract: | Optimal capital allocation between different assets is an important financial problem, which is generally framed as the portfolio optimization problem. General models include the single-period and multi-period cases. The traditional mean-variance model introduced by Harry Markowitz has been the basis of many models used to solve the portfolio optimization problem. The overall goal is to achieve the highest return at the lowest risk. In this paper, we present an optimal portfolio based on the Markowitz mean-variance-skewness model with weight constraints for short-term investment opportunities in Iran's stock market. We use a neural-network-based predictor to forecast stock returns and measure the risk of stocks via the prediction errors of the network. We perform a series of experiments on our portfolio optimization model with real data from Iran's stock market indices, including the Bank, Insurance, Investment, Petroleum Products and Chemicals indices. Finally, eight portfolios with low, medium and high risk, targeted at different types of investors (risk-averse or risk-taking), are designed and analyzed using a genetic algorithm. |
Date: | 2019–02 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1903.06632&r=all |
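The mean-variance-skewness objective described above can be sketched numerically. This is a minimal illustration under stated assumptions, not the paper's model: the weight constraints are reduced to long-only weights summing to one, the neural-network return predictor is omitted, and a naive random search over the simplex stands in for the genetic algorithm (function names such as `mvs_objective` are our own):

```python
import numpy as np

def portfolio_moments(weights, returns):
    """Mean, variance and skewness of a portfolio.

    returns: (T, n) array of historical asset returns; weights: (n,) summing to 1.
    """
    port = returns @ weights                      # portfolio return series
    mean = port.mean()
    var = port.var()
    skew = ((port - mean) ** 3).mean() / var ** 1.5
    return mean, var, skew

def mvs_objective(weights, returns, lam=1.0, gamma=0.5):
    """Mean-variance-skewness score: reward mean and skewness, penalize variance."""
    m, v, s = portfolio_moments(weights, returns)
    return m - lam * v + gamma * s

def random_search(returns, n_iter=2000, seed=0):
    """Toy stand-in for the paper's genetic algorithm: sample simplex weights."""
    rng = np.random.default_rng(seed)
    n = returns.shape[1]
    best_w, best_score = None, -np.inf
    for _ in range(n_iter):
        w = rng.dirichlet(np.ones(n))             # long-only weights summing to 1
        score = mvs_objective(w, returns)
        if score > best_score:
            best_w, best_score = w, score
    return best_w, best_score
```

A genetic algorithm would replace `random_search` with selection, crossover and mutation over a population of weight vectors, but the objective being maximized is the same.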
By: | Sylvain Barde (Sciences Po); Sander Van Der Hoog (Universität Bielefeld) |
Abstract: | Despite recent advances in bringing agent-based models (ABMs) to the data, the estimation or calibration of model parameters remains a challenge, especially when it comes to large-scale agent-based macroeconomic models. Most methods, such as the method of simulated moments (MSM), require in-the-loop simulation of new data, which may not be feasible for such computationally heavy simulation models. The purpose of this paper is to provide a proof-of-concept of a generic empirical validation methodology for such large-scale simulation models. We introduce an alternative ‘large-scale’ empirical validation approach, and apply it to the Eurace@Unibi macroeconomic simulation model (Dawid et al., 2016). This model was selected because it displays strong emergent behaviour and is able to generate a wide variety of nonlinear economic dynamics, including endogenous business and financial cycles. In addition, it is a computationally heavy simulation model, so it fits our targeted use-case. The validation protocol consists of three stages. At the first stage we use Nearly-Orthogonal Latin Hypercube sampling (NOLH) in order to generate a set of 513 parameter combinations with good space-filling properties. At the second stage we use the recently developed Markov Information Criterion (MIC) to score the simulated data against empirical data. Finally, at the third stage we use stochastic kriging to construct a surrogate model of the MIC response surface, resulting in an interpolation of the response surface as a function of the parameters. The parameter combinations providing the best fit to the data are then identified as the local minima of the interpolated MIC response surface. The Model Confidence Set (MCS) procedure of Hansen et al. (2011) is used to restrict the set of model calibrations to those models that cannot be rejected to have equal predictive ability, at a given confidence level.
Validation of the surrogate model is carried out by re-running the second stage of the analysis on the optima identified in this way and cross-checking that the realised MIC scores equal the MIC scores predicted by the surrogate model. The results we obtain so far look promising as a first proof-of-concept for the empirical validation methodology, since we are able to validate the model using empirical data series for 30 OECD countries and the euro area. The internal validation procedure of the surrogate model also suggests that the combination of NOLH sampling, MIC measurement and stochastic kriging yields reliable predictions of the MIC scores for samples not included in the original NOLH sample set. In our opinion, this is a strong indication that the method we propose could provide a viable statistical machine learning technique for the empirical validation of (large-scale) ABMs. |
Keywords: | Statistical machine learning; Surrogate modelling; Empirical validation |
Date: | 2017–07 |
URL: | http://d.repec.org/n?u=RePEc:spo:wpmain:info:hdl:2441/4pa18fd9lf9h59m4vfavfcf61e&r=all |
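The three-stage protocol (space-filling sampling, scoring, surrogate interpolation) can be sketched as follows. This is a simplified stand-in under stated assumptions: a plain Latin hypercube replaces the NOLH design, a cheap analytic function replaces the expensive MIC score, and the kriging step is ordinary Gaussian-process interpolation rather than stochastic kriging:

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """Plain Latin hypercube design on [0, 1]^d (simple stand-in for NOLH)."""
    samples = np.empty((n, d))
    for j in range(d):
        samples[:, j] = (rng.permutation(n) + rng.random(n)) / n
    return samples

def rbf_kernel(A, B, length):
    """Squared-exponential covariance between two sets of points."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * length ** 2))

def fit_kriging(X, y, length=0.3, nugget=1e-8):
    """Kriging surrogate of an expensive score evaluated at design points X."""
    K = rbf_kernel(X, X, length) + nugget * np.eye(len(X))
    alpha = np.linalg.solve(K, y - y.mean())
    return X, alpha, length, y.mean()

def kriging_predict(model, X_new):
    """Interpolate the response surface at new parameter combinations."""
    X, alpha, length, mu = model
    return mu + rbf_kernel(X_new, X, length) @ alpha
```

Local minima of `kriging_predict` over the parameter space would then play the role of the candidate calibrations that the paper re-scores in its internal validation step.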
By: | Zura Kakushadze; Willie Yu |
Abstract: | We give an explicit algorithm and source code for constructing risk models based on machine learning techniques. The resultant covariance matrices are not factor models. Based on empirical backtests, we compare the performance of these machine learning risk models to other constructions, including statistical risk models, risk models based on fundamental industry classifications, and risk models utilizing multilevel-clustering-based industry classifications. |
Date: | 2019–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1903.06334&r=all |
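The paper provides its own algorithm and source code; the sketch below only illustrates the general idea of a clustering-based covariance estimate, using a minimal k-means (our own choice, not the paper's method) in place of multilevel clustering, and replacing noisy pairwise correlations with within/between-cluster averages:

```python
import numpy as np

def kmeans_labels(X, k, rng, iters=100):
    """Minimal k-means, used here to cluster assets by their correlation profiles."""
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(0)
    return labels

def clustered_risk_model(returns, k, rng):
    """Covariance estimate built from cluster-averaged correlations.

    Clusters are learned from the data rather than taken from a fundamental
    industry classification.
    """
    vols = returns.std(axis=0)
    C = np.corrcoef(returns, rowvar=False)
    labels = kmeans_labels(C, k, rng)   # each asset's correlation profile is a feature vector
    C_hat = np.empty_like(C)
    for a in range(k):
        for b in range(k):
            block = C[np.ix_(labels == a, labels == b)]
            if block.size:
                C_hat[np.ix_(labels == a, labels == b)] = block.mean()
    np.fill_diagonal(C_hat, 1.0)
    return C_hat * np.outer(vols, vols)
```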
By: | Dat Thanh Tran; Juho Kanniainen; Moncef Gabbouj; Alexandros Iosifidis |
Abstract: | Forecasting based on financial time series is a challenging task, since most real-world data exhibit nonstationarity and nonlinear dependencies. In addition, different data modalities often embed different nonlinear relationships which are difficult to capture by human-designed models. To tackle the supervised learning task in financial time-series prediction, we propose the application of a recently formulated algorithm that adaptively learns a mapping function, realized by a heterogeneous neural architecture composed of Generalized Operational Perceptrons, given a set of labeled data. With a modified objective function, the proposed algorithm can accommodate the frequently observed problem of imbalanced data distributions. Experiments on a large-scale limit order book dataset demonstrate that the proposed algorithm outperforms related algorithms, including tensor-based methods which have access to a broader set of input information. |
Date: | 2019–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1903.06751&r=all |
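The paper's modified objective function is specific to its GOP-based architecture. As a generic illustration of the underlying idea of accommodating imbalanced data distributions, the sketch below reweights a cross-entropy loss inversely to class frequency, so that rare classes (e.g. mid-price jumps in limit order book data) are not drowned out by the majority class:

```python
import numpy as np

def weighted_cross_entropy(probs, labels, eps=1e-12):
    """Class-weighted cross-entropy loss.

    probs: (N, C) predicted class probabilities; labels: (N,) integer labels.
    Each class is reweighted inversely to its frequency ("balanced" weighting).
    """
    classes, counts = np.unique(labels, return_counts=True)
    weights = len(labels) / (len(classes) * counts)        # rare class => large weight
    w = weights[np.searchsorted(classes, labels)]          # per-sample weight
    p = np.clip(probs[np.arange(len(labels)), labels], eps, 1.0)
    return float(np.mean(-w * np.log(p)))
```

With balanced labels the weights are all one and the loss reduces to the plain cross-entropy, so the modification only bites when the distribution is skewed.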
By: | Hufkens, Tine; Goedemé, Tim; Gasior, Katrin; Leventi, Chrysa; Manios, Kostas; Rastrigina, Olga; Recchia, Pasquale; Sutherland, Holly; Van Mechelen, Natascha; Verbist, Gerlinde |
Abstract: | This paper introduces the Hypothetical Household Tool (HHoT), a new extension of EUROMOD, the tax-benefit microsimulation model for the European Union. With HHoT, users can easily create their own hypothetical data, which enables them to better understand how policies work for households with specific characteristics. The tool creates unique possibilities for an enhanced analysis of taxes and social benefits in Europe by integrating results from microsimulations and hypothetical household simulations in a single modelling framework. Furthermore, the flexibility of HHoT facilitates an advanced use of hypothetical household simulations to create new comparative policy indicators in the context of multi-country and longitudinal analyses. In this paper, we highlight the main features of HHoT, its strengths and limitations, and illustrate how it can be used for comparative policy purposes. |
Date: | 2019–03–12 |
URL: | http://d.repec.org/n?u=RePEc:ese:emodwp:em5-19&r=all |
By: | Annarita Colasante (LEE and Department of Economics, Universitat Jaume I, Castellón, Spain); Simone Alfarano (LEE and Department of Economics, Universitat Jaume I, Castellón, Spain); Eva Camacho-Cuena (LEE and Department of Economics, Universitat Jaume I, Castellón, Spain) |
Abstract: | We compare the performance of two learning algorithms in replicating individual short- and long-run expectations: the Exploration-Exploitation Algorithm (EEA) and the Heuristic Switching Model (HSM). Individual expectations are elicited in a series of Learning-to-Forecast Experiments (LtFEs) with different feedback mechanisms between expectations and the market price: positive and negative feedback markets. We implement the EEA proposed by Colasante et al. (2018c). Moreover, we modify the existing version of the HSM in order to incorporate long-run predictions. Although both algorithms provide a fairly good description of market prices in the short run, the EEA outperforms the HSM in replicating the main characteristics of individual expectations in the long run, both in terms of coordination of individual expectations and convergence of expectations to the fundamental value. |
Keywords: | Expectations, Experiment, Evolutionary Learning |
JEL: | D03 G12 C91 |
Date: | 2019 |
URL: | http://d.repec.org/n?u=RePEc:jau:wpaper:2019/02&r=all |
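The Heuristic Switching Model family that the paper modifies can be sketched in a few lines: forecasters choose among simple prediction rules, with weights given by a logit of each rule's discounted past performance. This is a generic two-heuristic illustration in the spirit of that literature; the parameters `beta` and `memory` are illustrative, and the paper's long-run extension is not reproduced:

```python
import numpy as np

def heuristic_switching_forecast(prices, beta=1.0, memory=0.7):
    """Two-heuristic switching model: adaptive vs. trend-extrapolating rules,
    weighted by a logit of (negative) discounted past squared forecast errors."""
    adaptive = lambda p: p[-1]                         # naive/adaptive rule
    trend = lambda p: p[-1] + (p[-1] - p[-2])          # trend extrapolation
    rules = [adaptive, trend]
    perf = np.zeros(2)                                 # running performance scores
    forecasts, weights_path = [], []
    for t in range(2, len(prices)):
        preds = np.array([r(prices[:t]) for r in rules])
        w = np.exp(beta * perf)
        w /= w.sum()                                   # logit choice weights
        forecasts.append(float(w @ preds))
        weights_path.append(w)
        err = (preds - prices[t]) ** 2
        perf = memory * perf - (1 - memory) * err      # discounted performance update
    return np.array(forecasts), np.array(weights_path)
```

In positive-feedback markets the trend rule tends to gain weight after sustained price movements, which is the mechanism the HSM uses to reproduce coordination of expectations.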
By: | Weilong Fu; Ali Hirsa |
Abstract: | We investigate methods for pricing American options under the variance gamma model. The variance gamma process is a pure jump process constructed by replacing calendar time with gamma time in a Brownian motion with drift, which makes it a time-changed Brownian motion. In general, the finite difference method and simulation can be used for pricing under this model, but their speed is not satisfactory, so there is a need for fast but accurate approximation methods. In the case of the Black-Merton-Scholes model, fast approximation methods exist, but they cannot be applied to the variance gamma model. We develop a new fast method inspired by the quadratic approximation method, reducing the error by applying a machine learning technique to pre-calculated quantities. We compare the performance of our proposed method with that of existing methods and show that it is efficient and accurate enough for practical use. |
Date: | 2019–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1903.07519&r=all |
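The time-changed-Brownian-motion construction described above can be simulated directly. The sketch below uses the standard risk-neutral variance gamma dynamics with the usual martingale drift correction; it illustrates the model, not the paper's fast approximation method:

```python
import numpy as np

def variance_gamma_paths(S0, r, sigma, theta, nu, T, n_steps, n_paths, rng):
    """Simulate risk-neutral variance gamma price paths.

    The VG process is Brownian motion with drift run on a gamma clock:
    X = theta * G + sigma * W(G), with gamma increments of mean dt and
    variance nu * dt, and drift correction omega so that E[S_t] = S0 * e^{rt}.
    """
    dt = T / n_steps
    omega = np.log(1 - theta * nu - 0.5 * sigma**2 * nu) / nu  # martingale correction
    dG = rng.gamma(shape=dt / nu, scale=nu, size=(n_paths, n_steps))
    dX = theta * dG + sigma * np.sqrt(dG) * rng.standard_normal((n_paths, n_steps))
    logS = np.log(S0) + np.cumsum((r + omega) * dt + dX, axis=1)
    return np.exp(logS)
```

Pricing an American option on top of such paths (e.g. by Longstaff-Schwartz regression) is exactly the slow simulation route the paper aims to shortcut.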
By: | Peiyang Guo (Department of Electrical and Electronic Engineering, The University of Hong Kong, Hong Kong); Jacqueline CK Lam (Department of Electrical and Electronic Engineering, The University of Hong Kong, Hong Kong); Victor OK Li (Department of Electrical and Electronic Engineering, The University of Hong Kong, Hong Kong) |
Keywords: | Time-based electricity pricing, price responsiveness, high-potential users, variable selection, Time of Use, machine learning |
JEL: | Q41 |
Date: | 2018–08 |
URL: | http://d.repec.org/n?u=RePEc:enp:wpaper:eprg1824&r=all |
By: | Çağın Ararat; Nurtai Meimanjanov |
Abstract: | Systemic risk is concerned with the instability of a financial system whose members are interdependent, in the sense that the failure of a few institutions may trigger a chain of defaults throughout the system. Recently, several systemic risk measures have been proposed in the literature to determine capital requirements for the members subject to joint risk considerations. We address the problem of computing systemic risk measures for systems with sophisticated clearing mechanisms. In particular, we consider the Eisenberg-Noe network model and the Rogers-Veraart network model, where the former is extended to the case in which operating cash flows in the system are unrestricted in sign. We propose novel mixed-integer linear programming problems that can be used to compute clearing vectors for these models. Due to the binary variables in these problems, the corresponding (set-valued) systemic risk measures fail to have convex values in general. We associate nonconvex vector optimization problems to these systemic risk measures and solve them by a recent nonconvex variant of Benson's algorithm, which requires solving two types of scalar optimization problems. We provide a detailed analysis of the theoretical features of these problems for the extended Eisenberg-Noe and Rogers-Veraart models. We test the proposed formulations on computational examples and perform sensitivity analyses with respect to some model-specific and structural parameters. |
Date: | 2019–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1903.08367&r=all |
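In the basic Eisenberg-Noe model with nonnegative operating cash flows, the clearing vector that the paper computes by mixed-integer programming can also be obtained by the classical fictitious-default fixed-point iteration, sketched here (the paper's MILP formulation is needed precisely for the extensions, e.g. sign-unrestricted cash flows and Rogers-Veraart bankruptcy costs, that this simple iteration does not cover):

```python
import numpy as np

def eisenberg_noe_clearing(liabilities, external_assets, tol=1e-10, max_iter=1000):
    """Clearing payment vector of the basic Eisenberg-Noe model.

    liabilities[i, j] = nominal amount bank i owes bank j;
    external_assets[i] = bank i's (nonnegative) operating cash flow.
    """
    p_bar = liabilities.sum(axis=1)                          # total nominal obligations
    with np.errstate(invalid="ignore", divide="ignore"):
        Pi = np.where(p_bar[:, None] > 0, liabilities / p_bar[:, None], 0.0)
    p = p_bar.copy()                                         # start from full payment
    for _ in range(max_iter):
        wealth = external_assets + Pi.T @ p                  # inflows given payments p
        p_new = np.minimum(p_bar, np.maximum(wealth, 0.0))   # limited liability, pro rata
        if np.max(np.abs(p_new - p)) < tol:
            return p_new
        p = p_new
    return p
```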
By: | Sang Il Lee; Seong Joon Yoo |
Abstract: | Stock prices are influenced by numerous factors. We present a method to combine these factors and we validate the method by taking the international stock market as a case study. In today's increasingly international economy, return and volatility spillover effects across international equity markets are major macroeconomic drivers of stock dynamics. Thus, foreign market information is one of the most important factors in forecasting domestic stock prices. However, the cross-correlation between domestic and foreign markets is so complex that it would be extremely difficult to express it explicitly with a dynamical equation. In this study, we develop stock return prediction models that can jointly consider international markets, using multimodal deep learning. Our contributions are three-fold: (1) we visualize the transfer information between South Korea and US stock markets using scatter plots; (2) we incorporate the information into stock prediction using multimodal deep learning; (3) we conclusively show that both early and late fusion models achieve a significant performance boost in comparison with single modality models. Our study indicates that considering international stock markets jointly can improve prediction accuracy, and deep neural networks are very effective for such tasks. |
Date: | 2019–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1903.06478&r=all |
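The early-versus-late fusion distinction drawn above can be illustrated with linear models in place of deep networks. This toy version (our own simplification, not the paper's architecture) concatenates the two markets' features for early fusion and averages two per-market models for late fusion:

```python
import numpy as np

def early_fusion_fit(X_dom, X_for, y):
    """Early fusion: concatenate domestic and foreign features, fit one model."""
    X = np.hstack([X_dom, X_for, np.ones((len(y), 1))])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def early_fusion_predict(coef, X_dom, X_for):
    X = np.hstack([X_dom, X_for, np.ones((len(X_dom), 1))])
    return X @ coef

def late_fusion_fit(X_dom, X_for, y):
    """Late fusion: one model per market, predictions combined afterwards."""
    fit = lambda X: np.linalg.lstsq(
        np.hstack([X, np.ones((len(y), 1))]), y, rcond=None)[0]
    return fit(X_dom), fit(X_for)

def late_fusion_predict(coefs, X_dom, X_for):
    aug = lambda X: np.hstack([X, np.ones((len(X), 1))])
    return 0.5 * (aug(X_dom) @ coefs[0] + aug(X_for) @ coefs[1])
```

With deep networks the same split applies one layer higher: early fusion shares hidden representations across modalities, late fusion only shares the final combination step.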
By: | M. R. Hesamzadeh (Electricity Market Research Group (EMReG), KTH Royal Institute of Technology, Sweden); P. Holmberg (Research Institute of Industrial Economics (IFN), Sweden -Energy Policy Research Group (EPRG), University of Cambridge); M. Sarfati (Energy Policy Research Group (EPRG), University of Cambridge) |
Keywords: | Two-stage game, Zonal pricing, Wholesale electricity market, Bilinear programming |
JEL: | C61 C63 C72 D43 L13 L94 |
Date: | 2018–05 |
URL: | http://d.repec.org/n?u=RePEc:enp:wpaper:eprg1813&r=all |
By: | Adrien Ehrhardt; Christophe Biernacki; Vincent Vandewalle; Philippe Heinrich |
Abstract: | For regulatory and interpretability reasons, logistic regression is still widely used. To improve prediction accuracy and interpretability, a preprocessing step quantizing both continuous and categorical data is usually performed: continuous features are discretized and, if numerous, levels of categorical features are grouped. An even better predictive accuracy can be reached by embedding this quantization estimation step directly into the predictive estimation step itself. In doing so, however, the predictive loss has to be optimized over a huge set. To overcome this difficulty, we introduce a specific two-step optimization strategy: first, the optimization problem is relaxed by approximating discontinuous quantization functions by smooth functions; second, the resulting relaxed optimization problem is solved via a particular neural network. The good performance of this approach, which we call glmdisc, is illustrated on simulated and real data from the UCI library and Crédit Agricole Consumer Finance (a major European historic player in the consumer credit market). |
Date: | 2019–03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1903.08920&r=all |
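The core relaxation, approximating discontinuous quantization functions by smooth ones, can be sketched with sigmoid-based soft bin memberships. This shows only the relaxation step; glmdisc itself learns the cutpoints jointly with the logistic regression via a neural network, which is omitted here (`tau` is an illustrative smoothness parameter; `tau -> 0` recovers hard binning):

```python
import numpy as np

def soft_binning(x, cutpoints, tau=0.01):
    """Smooth relaxation of hard discretization.

    x: (N,) continuous feature; cutpoints: sorted bin boundaries.
    Returns (N, len(cutpoints) + 1) differentiable bin memberships,
    each row summing to one.
    """
    s = 1.0 / (1.0 + np.exp(-(x[:, None] - np.asarray(cutpoints)[None, :]) / tau))
    left = np.hstack([np.ones((len(x), 1)), s])     # P(x above previous cutpoint)
    right = np.hstack([s, np.zeros((len(x), 1))])   # P(x above this cutpoint)
    return left - right                             # soft membership per bin
```

A logistic regression fitted on these memberships (one coefficient per bin) is then differentiable in the cutpoints, which is what makes joint gradient-based training possible.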
By: | Tammik, Miko |
Abstract: | This paper presents baseline results from the latest version of EUROMOD (version I1.0+), the tax-benefit microsimulation model for the EU. First, we briefly report the process of updating EUROMOD. We then present indicators for income inequality and risk of poverty using EUROMOD and discuss the main reasons for differences between these and EU-SILC based indicators. We further compare EUROMOD distributional indicators across all EU 28 countries and over time between 2015 and 2018. Finally, we provide estimates of marginal effective tax rates (METR) for all 28 EU countries in order to explore the effect of tax and benefit systems on work incentives at the intensive margin. Throughout the paper, we highlight both the potential of EUROMOD as a tool for policy analysis and the caveats that should be borne in mind when using it and interpreting results. This paper updates the work reported in Tammik (2018). |
Date: | 2019–03–13 |
URL: | http://d.repec.org/n?u=RePEc:ese:emodwp:em6-19&r=all |
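The marginal effective tax rate reported in the paper has a simple finite-difference definition, sketched here with a hypothetical net-income function (EUROMOD itself computes net incomes from the full tax-benefit rules of each country):

```python
def marginal_effective_tax_rate(net_income, gross, delta=100.0):
    """METR at earnings level `gross`: the share of a small rise in gross
    earnings that is taxed away or lost through benefit withdrawal.

    net_income: callable mapping gross earnings to disposable income.
    """
    return 1.0 - (net_income(gross + delta) - net_income(gross)) / delta

# Hypothetical example: flat 30% tax plus a benefit withdrawn at 50 cents
# per euro of gross earnings below a threshold.
flat_tax = lambda g: 0.7 * g
tax_and_benefit = lambda g: 0.7 * g + max(0.0, 5000.0 - 0.5 * g)
```

At low earnings the benefit withdrawal stacks on top of the tax, so the METR is 0.3 + 0.5 = 0.8, which is exactly the kind of work-incentive effect at the intensive margin the paper measures.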
By: | Tadamasa Sawada (National Research University Higher School of Economics) |
Abstract: | It is commonly believed that the visual system requires oculomotor information to perceive depth from binocular disparity. However, any effect of oculomotor information on depth perception is too restricted to explain depth perception under natural viewing conditions. In this study, I describe a computational model that can recover depth from a stereo pair of retinal images without using any oculomotor information. The model shows that, at least from a computational perspective, oculomotor information is not necessary for perceiving depth from stereo retinal images. |
Keywords: | binocular disparity; stereo vision; P3P problem; multiple view geometry |
JEL: | Z |
Date: | 2019 |
URL: | http://d.repec.org/n?u=RePEc:hig:wpaper:106psy2019&r=all |
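A much simpler cousin of the paper's model makes the point concrete: plain pinhole-stereo triangulation already recovers metric depth from horizontal disparity given the camera geometry, with no vergence (oculomotor) signal. This is an illustration of the principle only, not the paper's P3P-based computation:

```python
def depth_from_disparity(x_left, x_right, focal_px, baseline_m):
    """Pinhole-stereo triangulation: Z = f * B / d.

    x_left, x_right: horizontal image coordinates (pixels) of the same point;
    focal_px: focal length in pixels; baseline_m: inter-camera distance in metres.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("non-positive disparity: point at or beyond infinity")
    return focal_px * baseline_m / disparity
```

For example, with a 1000-pixel focal length and a 6.5 cm baseline (roughly the human interocular distance), a 13-pixel disparity corresponds to a depth of 5 m.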
By: | Marchal, Sarah; Siöland, Linus; Goedemé, Tim |
Abstract: | This paper aims to show how the newly developed Hypothetical Household Tool of the EUROMOD microsimulation model can be used to generate institutional minimum income protection indicators. It does so by updating the CSB’s Minimum Income Protection Indicators (CSB-MIPI) dataset using EUROMOD and HHoT. We discuss the necessary assumptions for this exercise, and describe, present and validate the obtained indicators. In doing so, we provide and discuss both an updated minimum income protection indicator dataset, and give guidance to researchers who want to use the flexibility of HHoT to calculate purpose designed minimum income protection indicators. |
Date: | 2019–03–11 |
URL: | http://d.repec.org/n?u=RePEc:ese:emodwp:em4-19&r=all |
By: | Guillaume Allegre; Hélène Périvier (Observatoire français des conjonctures économiques); Muriel Pucci (Université Paris 1 Panthéon-Sorbonne) |
Abstract: | The marital quotient (quotient conjugal) requires married and civil-union (PACS) couples to declare their incomes jointly and grants them two tax shares. This scheme raises issues of tax fairness, efficiency and redistributive choice that receive little attention in the democratic debate. Yet it is the subject of much controversy among economists: it is not adapted to new family configurations; it potentially discourages the labour-market participation of married women; it does not conform to the principle of households' ability to pay; and it provides an advantage that grows with income. Using the Ines microsimulation model, we simulate three reforms: individualisation of the income tax; a reduction of the marital quotient to 1.5 shares, with the option for married/PACS couples to choose individual taxation; and a cap on the marital quotient at the same level as the family quotient (quotient familial). Individualisation yields the largest fiscal gain (about 7 billion euros), against 5 billion for the 1.5-share marital quotient and 3 billion for the cap. Under individualisation, 46% of households lose, and for half of the losers the loss is below 1.5% of disposable income; 60% of the losers are in the top three deciles, against 6% in the bottom three. With a 1.5-share marital quotient, 45% of couples lose (about 5.8 million), with a median loss of 680 euros, or 1.3% of disposable income; 64% of the losers are in the top three deciles. Finally, with the cap, 7% of couples lose (about 895,000), with an average loss of 3,200 euros per year and a median loss of 1,800 euros, or 2.6% of disposable income; 83% of the losers are in the top three deciles. |
Keywords: | Microsimulation; Income tax; Marital quotient (quotient conjugal) |
JEL: | H24 H31 D31 |
Date: | 2019–02 |
URL: | http://d.repec.org/n?u=RePEc:spo:wpmain:info:hdl:2441/1lc919l7sm8pt87iaubt8n43lo&r=all |
By: | Adams, Abigail |
Abstract: | Revealed preference restrictions are increasingly used to predict demand behaviour at new budgets of interest and as shape restrictions in nonparametric estimation exercises. However, the restrictions imposed are not sufficient for rationality when predictions are made at multiple budgets. I highlight the nonconvexities in the set of predictions that arise when making multiple predictions. I develop a mixed integer programming characterisation of the problem that can be used to impose rationality on multiple predictions. The approach is applied to the UK Family Expenditure Survey to recover rational demand predictions with substantially reduced computational resources compared to known alternatives. |
Keywords: | Demand estimation; mixed integer programming; Revealed Preference |
JEL: | C60 D11 D12 |
Date: | 2019–03 |
URL: | http://d.repec.org/n?u=RePEc:cpr:ceprdp:13580&r=all |
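The rationality requirement underlying the prediction exercise is the Generalized Axiom of Revealed Preference (GARP). The sketch below tests GARP on observed data via a transitive closure; the paper's mixed-integer programme, which imposes rationality on out-of-sample *predictions* rather than testing observed choices, is considerably more involved:

```python
import numpy as np

def satisfies_garp(P, Q):
    """Test GARP for T observed price vectors P and chosen bundles Q (T x n arrays).

    t is directly revealed preferred to s if bundle s was affordable when t was
    chosen; GARP forbids a revealed-preference chain from t to s when s is
    strictly revealed preferred to t.
    """
    E = P @ Q.T                                   # E[t, s]: cost of bundle s at prices t
    budget = np.diag(E)[:, None]                  # cost of the bundle actually chosen
    R = budget >= E                               # direct (weak) revealed preference
    for k in range(len(P)):                       # Warshall transitive closure
        R = R | (R[:, [k]] & R[[k], :])
    strict = budget > E                           # strict direct revealed preference
    return not (R & strict.T).any()               # no cycle with a strict link
```

This O(T^3) closure is cheap; the computational burden in the paper comes from the nonconvex set of rational multi-budget predictions, which is why integer variables enter.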
By: | Bonin, Holger (IZA); Sommer, Eric (IZA); Buhlmann, Florian (ZEW Mannheim); Stichnoth, Holger (ZEW Mannheim) |
Abstract: | Study commissioned by the German Federal Ministry for Economic Affairs and Energy (BMWi), Bonn 2019 (60 pages) |
Date: | 2019–03–11 |
URL: | http://d.repec.org/n?u=RePEc:iza:izarrs:88&r=all |