nep-cmp New Economics Papers
on Computational Economics
Issue of 2014‒05‒24
nine papers chosen by
Stan Miles
Thompson Rivers University

  1. A multivariate neural network approach to tourism demand forecasting By Oscar Claveria; Enric Monte; Salvador Torra
  2. A computational model of optimal commodity taxation By John T. Revesz
  3. Information Risk, Market Stress and Institutional Herding in Financial Markets: New Evidence Through the Lens of a Simulated Model By Christopher Boortz; Stephanie Kremer; Simon Jurkatis; Dieter Nautz
  4. Simulation of multivariate diffusion bridges By Mogens Bladt; Samuel Finch; Michael Sørensen
  5. Further development of SAMPERS and modeling of urban congestion By Almroth, Andreas; Berglund, Svante; Canella, Olivier; Engelson, Leonid; Flötteröd, Gunnar; Jonsson, Daniel; Kristoffersson, Ida; West, Jens
  6. Integrating Timetabling and Crew Scheduling at a Freight Railway Operator By Bach, L.; Dollevoet, T.A.B.; Huisman, D.
  7. Small Business Credit Scoring and Its Pitfalls: Evidence from Japan By Ryo Hasumi; Hideaki Hirata
  8. Analyse économique des impacts et de l’adaptation aux changements climatiques de l’industrie forestière québécoise à l’aide d’un modèle d’équilibre général calculable de type micro-simulation By Dorothée Boccanfuso; Luc Savard; Jonathan Goyette; Véronique Gosselin; Clovis Tanekou Mangoua
  9. A statistical analysis of reliability of audit opinions as bankruptcy predictors. By Caserio Carlo; Panaro Delio; Trucco Sara

  1. By: Oscar Claveria (Faculty of Economics, University of Barcelona); Enric Monte (Department of Signal Theory and Communications, Polytechnic University of Catalunya); Salvador Torra (Faculty of Economics, University of Barcelona)
    Abstract: This study compares the performance of different Artificial Neural Network models for tourism demand forecasting in a multiple-output framework. We test the forecasting accuracy of three different types of architectures: a multi-layer perceptron network, a radial basis function network and an Elman neural network. We use official statistical data on inbound international tourism demand to Catalonia (Spain) from 2001 to 2012. By means of cointegration analysis we find that growth rates of tourist arrivals from all different countries share a common stochastic trend, which leads us to apply a multivariate out-of-sample forecasting comparison. When comparing the forecasting accuracy of the different techniques for each visitor market and for different forecasting horizons, we find that radial basis function models outperform multi-layer perceptron and Elman networks. We repeat the experiment assuming different topologies regarding the number of lags used for concatenation, so as to evaluate the effect of memory on the forecasting results, and we find no significant differences when additional lags are incorporated. These results reveal the suitability of hybrid models such as radial basis functions, which combine supervised and unsupervised learning, for economic forecasting with seasonal data.
    Keywords: forecasting; tourism demand; cointegration; multiple-output; artificial neural networks
    JEL: L83 C53 C45 R11
    Date: 2014–05
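The abstract's closing point, that radial basis function networks are hybrid models combining unsupervised and supervised learning, can be sketched in a few lines: centres are placed by k-means (unsupervised), then a linear readout is fitted by ridge regression (supervised). Everything below, including the synthetic seasonal series standing in for tourist arrivals, is an illustrative assumption, not the authors' code or data.

```python
# Minimal RBF-network sketch: unsupervised centre placement + supervised readout.
# Synthetic seasonal data only; illustrative, not the paper's model.
import numpy as np

rng = np.random.default_rng(0)

def kmeans(X, k, iters=50):
    """Plain k-means (the unsupervised step) to place the RBF centres."""
    centres = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centres[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = X[labels == j].mean(axis=0)
    return centres

def rbf_features(X, centres, gamma):
    """Gaussian basis activations for each lag vector."""
    d2 = ((X[:, None] - centres[None]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Synthetic monthly-seasonal "arrivals" series, cast as a lag-embedding task:
t = np.arange(300, dtype=float)
y = np.sin(2 * np.pi * t / 12) + 0.1 * rng.standard_normal(300)
lags = 4
X = np.stack([y[i:i + lags] for i in range(len(y) - lags)])
target = y[lags:]

centres = kmeans(X, k=10)
Phi = rbf_features(X, centres, gamma=1.0)
# Supervised step: ridge-regularised linear readout.
w = np.linalg.solve(Phi.T @ Phi + 1e-6 * np.eye(Phi.shape[1]), Phi.T @ target)
pred = Phi @ w
print("in-sample RMSE:", np.sqrt(np.mean((pred - target) ** 2)))
```

The two-stage structure is the point: only the final linear solve sees the targets, which is what makes training cheap compared with a fully supervised multi-layer perceptron.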
  2. By: John T. Revesz (Australian Public Service)
    Abstract: This report examines the structure of optimal commodity tax rates in a many-person, many-goods static computational model using segmented LES utility. One of the major findings is that with non-linear Engel curves and linear income tax, optimal commodity tax rates tend to be progressive and highly dispersed under logarithmic utility specifications. However, the dispersion of tax rates is considerably reduced if the inequality aversion of society is low or if tax evasion depends, among other things, on disparities between commodity tax rates. With exogenously given non-optimal and non-linear income tax schedules, there is usually still a need for differentiated and progressive commodity taxation. Tax evasion tends to reduce optimal tax rates for necessities but increases them for luxuries. Private compliance costs and government administration costs reduce optimal tax rates by roughly the share these costs represent of tax revenue. The results indicate that in a redistributive model the effect of externalities on optimal tax rates exceeds the corresponding Pigovian tax rates or subsidies. The main benefit of higher taxes on leisure complements than on leisure substitutes appears to relate to increased tax revenue for redistribution rather than to improvement in the utility position of those paying the taxes. The effect of complexities such as tax evasion, administrative costs, externalities and leisure complements/substitutes on redistribution is not neutral. Generally, these complexities tend to increase the progressivity of optimal commodity tax rates. Explanations are provided as to why the numerical results presented here do not contradict the Laroque-Kaplow proposition, which advocates uniform commodity taxation. Some practical application problems and logical weaknesses of the Laroque-Kaplow proposition are noted.
    Keywords: optimal taxation, computational models
    JEL: C63 H21
    Date: 2014–05
  3. By: Christopher Boortz; Stephanie Kremer; Simon Jurkatis; Dieter Nautz
    Abstract: This paper employs numerical simulations of the Park and Sabourian (2011) herd model to derive new theory-based predictions for how information risk and market stress influence aggregate herding intensity. We test these predictions empirically using a comprehensive data set of high-frequency and investor-specific trading data from the German stock market. Exploiting intra-day patterns of institutional trading behavior, we confirm that higher information risk increases both buy and sell herding. The model also explains why buy, not sell, herding is more pronounced during the financial crisis.
    Keywords: Herd behavior, information risk, financial crisis, institutional trading, model simulation
    JEL: D81 D82 G14
    Date: 2014–05
  4. By: Mogens Bladt (Universidad Nacional Autónoma de México); Samuel Finch (University of Copenhagen, Dept. of Mathematical Sciences); Michael Sørensen (University of Copenhagen, Dept. of Mathematical Sciences and CREATES)
    Abstract: We propose simple methods for multivariate diffusion bridge simulation, which plays a fundamental role in simulation-based likelihood and Bayesian inference for stochastic differential equations. By a novel application of classical coupling methods, the new approach generalizes a previously proposed simulation method for one-dimensional bridges to the multivariate setting. First, a method of simulating approximate, but often very accurate, diffusion bridges is proposed. These approximate bridges are used as proposals for easily implementable MCMC algorithms that produce exact diffusion bridges. The new method is much more generally applicable than previous methods. Another advantage is that the new method works well for diffusion bridges over long intervals, because the computational complexity of the method is linear in the length of the interval. In a simulation study the new method performs well, and its usefulness is illustrated by an application to Bayesian estimation for the multivariate hyperbolic diffusion model.
    Keywords: Bayesian inference, coupling, discretely sampled diffusions, likelihood inference, stochastic differential equation, time-reversal.
    JEL: C22 C15
    Date: 2014–05–13
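For readers unfamiliar with the object being simulated: a diffusion bridge is a path of a stochastic differential equation conditioned to hit a fixed endpoint. The sketch below uses the textbook drift-adjusted ("guided") proposal in an Euler-Maruyama scheme, not the paper's coupling construction, and all parameters are illustrative assumptions.

```python
# Illustrative guided proposal for a multivariate diffusion bridge:
# a forward Euler-Maruyama path with an extra drift term pulling it
# toward the required endpoint. NOT the paper's method; a textbook sketch.
import numpy as np

rng = np.random.default_rng(1)

def guided_bridge(x0, xT, T, n, drift, sigma):
    """Euler-Maruyama path from x0 guided toward xT (approximate bridge)."""
    dt = T / n
    x = np.array(x0, dtype=float)
    path = [x.copy()]
    for i in range(n):
        t = i * dt
        pull = (np.asarray(xT) - x) / (T - t)   # guiding term toward the endpoint
        noise = sigma * np.sqrt(dt) * rng.standard_normal(x.shape)
        x = x + (drift(x) + pull) * dt + noise
        path.append(x.copy())
    path[-1] = np.asarray(xT, dtype=float)      # pin the endpoint (guided step lands only approximately)
    return np.stack(path)

# Two-dimensional mean-reverting drift, bridged from (0, 0) to (1, -1):
path = guided_bridge([0.0, 0.0], [1.0, -1.0], T=1.0, n=200,
                     drift=lambda x: -0.5 * x, sigma=0.3)
print(path[0], path[-1])
```

In MCMC schemes of the kind the abstract describes, such approximate bridges serve only as proposals; an accept/reject step (omitted here) corrects them to exact bridges.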
  5. By: Almroth, Andreas (SWECO); Berglund, Svante (KTH / TLA); Canella, Olivier (WSP); Engelson, Leonid (KTH / TLA); Flötteröd, Gunnar (KTH); Jonsson, Daniel (KTH / TLA); Kristoffersson, Ida (SWECO); West, Jens (KTH / SWECO)
    Abstract: The need to represent more precisely the consequences of congestion mitigation policies in urban transport systems calls for replacing the static equilibrium assignment with DTA in integrated travel demand and traffic assignment models. Despite the availability of DTA models, and despite the conceptual clarity of how such integration should take place, only a few operational model systems have been developed for large-scale applications. We report on the replacement of the static traffic assignment by two different DTAs in the four-stage demand model for the Greater Stockholm region: the macroscopic analytic Visum DUE and the microscopic simulation Transmodeler. First results show that even without systematic calibration the DTA is in reasonable agreement with observed traffic counts and travel times. The presented experiments did not reveal striking differences between the macroscopic and microscopic assignment packages. However, given the clear trend toward microscopic modeling and simulation on the travel demand side, the use of a micro-simulation-based DTA package appears more natural from a system integration perspective.
    Keywords: Dynamic traffic assignment; DTA; Microscopic simulation; Travel demand models
    JEL: R40
    Date: 2014–05–22
  6. By: Bach, L.; Dollevoet, T.A.B.; Huisman, D.
    Abstract: We investigate to what degree we can integrate a Train Timetabling / Engine Scheduling Problem with a Crew Scheduling Problem. In the Timetabling Problem we design a timetable for the desired lines by fixing the departure and arrival times. We also allocate time-slots in the network to secure a feasible timetable. Next, we assign engines in the Engine Scheduling Problem to the lines in accordance with the timetable. The overall integration is achieved by obtaining an optimal solution for the Timetabling / Engine Scheduling Problem. We exploit the fact that numerous optimal and near-optimal solutions exist. We consider all solutions that can be obtained from the optimal engine schedule by altering the timetable while keeping the order of demands in the schedules intact. The Crew Scheduling model is allowed to re-time the service of demands if the additional cost is outweighed by the crew savings. This information is implemented in a mathematical model for the Crew Scheduling Problem, which is solved using a column generation scheme. This makes it possible for the Crew Scheduling algorithm to adjust the timetable and achieve a better overall solution. We perform computational experiments based on a case at a freight railway operator, DB Schenker Rail Scandinavia, and show that significant cost savings can be achieved.
    Keywords: railway crew planning, vehicle and crew scheduling, partial integration, time windows, branch-and-price
    Date: 2014–04–01
  7. By: Ryo Hasumi; Hideaki Hirata
    Abstract: This paper studies the Japanese credit scoring market using data on 2,000 small and medium-sized enterprises and a small business credit scoring (SBCS) model widely used in the market. After constructing a model of a bank's profit maximization, some simulation exercises are conducted, and pitfalls of lending based on SBCS are indicated. The simulation results suggest that SBCS loan losses arise from a combination of adverse selection and window-dressing problems. In addition, omitted-variable bias and the transparency of financial statements are important.
  8. By: Dorothée Boccanfuso (Département d'Économique, Université de Sherbrooke); Luc Savard (Département d'Économique, Université de Sherbrooke); Jonathan Goyette (Département d'Économique, Université de Sherbrooke); Véronique Gosselin (GREDI, Université de Sherbrooke); Clovis Tanekou Mangoua (GREDI, Université de Sherbrooke)
    Abstract: Quebec's forests represent 20% of Canada's forests and 2% of the world's forests. They fulfill many essential roles, such as providing habitat for numerous species, supplying goods and services, generating socioeconomic benefits, and offering a way and place of life for Quebecers. Whether through droughts, hotter summers, milder winters, or more specific phenomena such as the mountain pine beetle crisis in British Columbia since the early 2000s, many events demonstrate the vulnerability of Canadian and Quebec forests to climate change (CC). Our study has two objectives. The first is to analyze the potential impact of climate change on Quebec's forest industry and on the Quebec economy. The second is to study the effects of climate change adaptation programs and/or policies that public decision-makers could implement. A distributive impact analysis was also carried out. A dynamic macro-micro framework was used for the analysis. The dynamic elements were integrated into a computable general equilibrium (CGE) model with sequential dynamics, and also into a micro-simulation model. The models, solved over a 40-year horizon, illustrate the transmission mechanisms from CC and adaptation programs to the economy in general and to changes in poverty measures. Our results show that the impacts of climate change on forestry have very little effect on macroeconomic variables, even though the branches of the forest industry themselves experience larger effects. For the distributive analysis, the short-term effects (20 years) are weak but negative in the case of a productivity or supply shock, increasing poverty relative to the BAU (Business as Usual) situation. Our long-term results (2050) converge toward a reduction in poverty in all three of its dimensions, whatever the simulation and area of residence. The same holds for the inequality analysis.
    Keywords: distributive analysis, dynamic computable general equilibrium model, microsimulation model, climate change, adaptation policies
    JEL: C68 D58 I32 O13 Q54 Q56
    Date: 2014–02
  9. By: Caserio Carlo; Panaro Delio; Trucco Sara
    Abstract: This research measures the reliability of audit firms in predicting bankruptcy for US-listed financial institutions. The object of the analysis is the Going Concern Opinion (GCO), widely considered a bankruptcy warning signal to stakeholders. The sample is composed of 42 US-listed financial companies that filed Chapter 11 between 1998 and 2011. To highlight differences between bankrupting and healthy firms, a matching sample composed of 42 randomly picked healthy US-listed financial companies is collected. We concentrate on financial institutions, whereas the existing literature pays considerably heavier attention to the industrial sector. This research imbalance is remarkable and particularly unexpected in the wake of recent financial scandals. The literature points out two main approaches to bankruptcy prediction: 1) purely mathematical approaches; 2) approaches based on a combination of auditor knowledge, expertise and experience. The use of data mining techniques allows us to benefit from the best features of both. The statistical tools used in the analysis are logit regression, Support Vector Machines and an AdaBoost meta-algorithm. Findings show a rather low reliability of GCOs in predicting bankruptcy. It is likely that auditors consider further information in supporting their audit opinions, aside from financial-economic ratios. The scant predictive ability of auditors might be due to critical relationships with distressed clients, as suggested by recent literature.
    Keywords: Bankruptcy; Financial institutions; Going Concern Opinion; Data Mining.
    JEL: M42 G33
    Date: 2014–01–01
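The three classifiers this abstract names (logit regression, support vector machines, AdaBoost) can be run side by side with scikit-learn. The balanced synthetic dataset below merely stands in for the paper's 84-firm sample of financial ratios; all names and settings are illustrative defaults, not the authors' specification.

```python
# Side-by-side sketch of the three classifiers the abstract lists,
# on a synthetic balanced "bankrupt vs. healthy" dataset (illustrative only).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Two balanced classes standing in for bankrupting vs. healthy firms:
X, y = make_classification(n_samples=400, n_features=8, n_informative=4,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "logit": LogisticRegression(max_iter=1000),
    "svm": SVC(kernel="rbf"),
    "adaboost": AdaBoostClassifier(n_estimators=100, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, "test accuracy:", round(model.score(X_te, y_te), 3))
```

Comparing several classifiers on the same held-out split, as here, is what lets a study of this kind separate the information content of the GCO from the choice of statistical tool.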

This nep-cmp issue is ©2014 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.