nep-cmp New Economics Papers
on Computational Economics
Issue of 2015‒07‒25
seven papers chosen by
Stan Miles
Thompson Rivers University

  1. Modelling Financial Markets by Self-Organized Criticality By A. E. Biondo; A. Pluchino; A. Rapisarda
  2. Risk Assessment of Input Uncertainty in Stochastic Simulation By Helin Zhu; Enlu Zhou
  3. Comparing Policies to Confront Permit Over-allocation By Fell, Harrison
  4. Forecasting Accuracy Evaluation of Tourist Arrivals: Evidence from Parametric and Non-Parametric Techniques By Hossein Hassani; Emmanuel Sirimal Silva; Nikolaos Antonakakis; George Filis; Rangan Gupta
  5. Factorisable Sparse Tail Event Curves By Shih-Kang Chao; Wolfgang K. Härdle; Ming Yuan
  6. Housing Market Forecasts with Factor Combinations By Charles Rahal
  7. Evaluating Insurer's Risk embedded in the Korean Reverse Mortgage Program Using Concurrent Simulation Method By Keunock Lew; Seungryul Ma

  1. By: A. E. Biondo; A. Pluchino; A. Rapisarda
    Abstract: We present a financial market model, characterized by self-organized criticality, that endogenously generates realistic price dynamics and reproduces well-known stylized facts. We consider a community of heterogeneous traders, composed of chartists and fundamentalists, and focus on the role of informative pressure on market participants, showing how the spreading of information, based on a realistic imitative behavior, drives contagion and causes market fragility. In this model imitation is not intended as a change in the agent's group of origin, but refers only to the price formation process. We also introduce into the community a variable number of random traders in order to study their possible beneficial role in stabilizing the market, as found in other studies. Finally, we suggest some counterintuitive policy strategies able to dampen fluctuations by means of a partial reduction of information.
    Date: 2015–07
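    The interplay the abstract describes, mean-reverting fundamentalists, trend-chasing chartists, and random traders supplying noise, can be illustrated with a deliberately simplified toy price process. Every name and parameter value below is an illustrative assumption; this is not the authors' self-organized-criticality model.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def toy_market(n_steps=1000, n_fund=50, n_chart=50,
                   fundamental=100.0, k_fund=0.02, k_chart=0.05, noise=0.5):
        """Toy heterogeneous-agent price process: fundamentalists push the
        price toward a fundamental value, chartists extrapolate the last
        return, and a noise term stands in for random traders."""
        prices = [fundamental, fundamental]
        for _ in range(n_steps):
            p, p_prev = prices[-1], prices[-2]
            d_fund = n_fund * k_fund * (fundamental - p)   # mean reversion
            d_chart = n_chart * k_chart * (p - p_prev)     # trend following
            shock = noise * rng.standard_normal()          # random traders
            prices.append(p + (d_fund + d_chart + shock) / (n_fund + n_chart))
        return np.array(prices)
    ```

    In this sketch, increasing the chartist gain relative to the fundamentalist gain moves the process toward instability, which is the mechanism (information-driven imitation amplifying trends) that the paper studies in a far richer setting.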
  2. By: Helin Zhu; Enlu Zhou
    Abstract: When simulating a complex stochastic system, the behavior of the output response depends on the input parameters estimated from finite real-world data, and the finiteness of data brings input uncertainty into the output response. The quantification of the impact of input uncertainty on output response has been extensively studied. Most of the existing literature focuses on providing inferences on the mean output response with respect to input uncertainty, including point estimation and confidence interval construction of the mean response. However, risk assessment of the mean response with respect to input uncertainty often plays an important role in system evaluation and control, because it quantifies the behavior of the mean response under extreme input models. To the best of our knowledge, this problem has rarely been systematically studied in the literature. In the present paper, we fill this gap and introduce risk measures for input uncertainty in output analysis. We develop nested Monte Carlo estimators and construct (asymptotically valid) confidence intervals for risk measures of the mean response. We further study the associated budget allocation problem for more efficient nested simulation of the estimators, and propose a novel method to solve it.
    Date: 2015–07
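    A minimal sketch of such a nested estimator, assuming a toy system: an exponential output whose rate is estimated from finite data, with input-model uncertainty represented by the bootstrap. All of these modelling choices are illustrative, not the paper's setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def nested_risk_of_mean(real_data, n_outer=2000, n_inner=500, alpha=0.95):
        """Nested Monte Carlo: the outer loop resamples the input model
        (bootstrap estimates of an exponential mean), the inner loop
        simulates outputs under each candidate input model, and the risk
        measures (VaR/CVaR) are taken over the conditional means."""
        n = len(real_data)
        means = np.empty(n_outer)
        for i in range(n_outer):
            # outer: input-model uncertainty via the bootstrap
            boot = rng.choice(real_data, size=n, replace=True)
            # inner: simulate outputs under this plug-in input model
            outputs = rng.exponential(boot.mean(), size=n_inner)
            means[i] = outputs.mean()        # conditional mean response
        var = np.quantile(means, alpha)      # value-at-risk of mean response
        cvar = means[means >= var].mean()    # conditional value-at-risk
        return var, cvar
    ```

    The key point the abstract makes is visible in the structure: the tail statistics are computed over the distribution of *conditional mean responses* induced by input uncertainty, not over raw simulation outputs.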
  3. By: Fell, Harrison
    Abstract: Instability in cap-and-trade markets, particularly with respect to permit price collapses, has been an area of concern for regulators. To that end, several policies, including hybrid price-quantity mechanisms and the newly introduced "market stability reserve" (MSR) systems, have been proposed and in some cases implemented. I develop a stochastic dynamic model of a cap-and-trade system, parameterized to values relevant to the European Union's Emissions Trading System (EU ETS), to analyze the performance of these policies aimed at adding stability to the system, or at least at reducing perceived over-allocations of permits. Results suggest that adaptive allocation mechanisms such as a price collar or an MSR can reduce permit over-allocations and permit price volatility more cost-effectively than simply reducing scheduled permit allocations. However, it is also found that the performance of these adaptive allocation policies, and in particular of the MSR, is greatly affected by assumed discount rates and policy parameters.
    Keywords: cap-and-trade, market stability reserve, price collar, EU ETS
    Date: 2015–06–25
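    The two adaptive allocation mechanisms the paper compares can be illustrated with simple allocation rules. The thresholds and intake rate below echo the EU ETS MSR design (in million permits), but should be read as illustrative assumptions rather than the paper's calibration.

    ```python
    def msr_allocation(scheduled, bank, upper=833.0, lower=400.0,
                       intake_rate=0.24, release=100.0):
        """Market-stability-reserve style rule: withhold a share of the
        permit bank when it is large, release permits when it is small."""
        if bank > upper:
            return scheduled - intake_rate * bank   # move permits into reserve
        if bank < lower:
            return scheduled + release              # release from the reserve
        return scheduled

    def collar_allocation(scheduled, price, floor=5.0, ceiling=50.0,
                          adjustment=50.0):
        """Price-collar style rule: cut the allocation when the permit
        price falls below the floor, add permits above the ceiling."""
        if price < floor:
            return scheduled - adjustment
        if price > ceiling:
            return scheduled + adjustment
        return scheduled
    ```

    Note the design difference the paper exploits: the MSR conditions on a quantity signal (the bank of unused permits), while the collar conditions on the price itself.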
  4. By: Hossein Hassani (Statistical Research Centre, Bournemouth University, 89 Holdenhurst Road, Bournemouth BH8 8EB, UK); Emmanuel Sirimal Silva (Statistical Research Centre, Bournemouth University, 89 Holdenhurst Road, Bournemouth BH8 8EB, UK); Nikolaos Antonakakis (Vienna University of Economics and Business, Department of Economics, Institute for International Economics, Welthandelsplatz 1, 1020, Vienna, Austria and University of Portsmouth, Economics and Finance Subject Group, Portsmouth Business School, Portland Street, Portsmouth, PO1 3DE, United Kingdom and Johannes Kepler University, Department of Economics, Altenbergerstrae 69, Linz, 4040, Austria); George Filis (Bournemouth University, Accounting, Finance and Economics Department, 89 Holdenhurst Road, Bournemouth, Dorset, BH8 8EB, United Kingdom); Rangan Gupta (Department of Economics, University of Pretoria)
    Abstract: This paper evaluates the use of several parametric and nonparametric forecasting techniques for predicting tourism demand in selected European countries. ARIMA, Exponential Smoothing (ETS), Neural Networks (NN), Trigonometric Box-Cox ARMA Trend Seasonal (TBATS), fractionally integrated ARIMA (ARFIMA) and both Singular Spectrum Analysis algorithms, i.e. recurrent SSA (SSA-R) and vector SSA (SSA-V), are adopted to forecast tourist arrivals in Germany, Greece, Spain, Cyprus, the Netherlands, Austria, Portugal, Sweden and the United Kingdom. This paper not only marks the introductory application of the TBATS model to tourism demand forecasting, but also the first instance in which the SSA-R model is utilized for forecasting tourist arrivals. The data are tested rigorously for normality, seasonal unit roots and break points, whilst the out-of-sample forecasts are tested for statistical significance. Our findings show that no single model provides the best forecasts for any of the countries considered here across the short, medium and long run. Moreover, forecasts from the NN and ARFIMA models provide the least accurate predictions for European tourist arrivals, yet, interestingly, the ARFIMA forecasts are better than those of the more powerful NN model. SSA-R, SSA-V, ARIMA and TBATS are found to be viable options for modelling European tourist arrivals, ranked in that order by the number of times each model outperforms its competitors. The results enable forecasters to choose the most suitable model (from those evaluated here) based on the country and horizon for forecasting tourism demand. Should a single model be of interest, then across all selected countries and horizons the SSA-R model is found to be the most efficient, based on the lowest overall forecasting error.
    Keywords: Tourist arrivals, Tourism demand, Forecasting, Singular Spectrum Analysis, ARIMA, Exponential Smoothing, Neural Networks, TBATS, ARFIMA.
    Date: 2015–07
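    Recurrent SSA (SSA-R), the best-performing method above, embeds the series into a Hankel trajectory matrix, keeps the leading eigentriples, Hankelises back to a smoothed series, and continues it with the linear recurrence implied by the retained eigenvectors. A bare-bones numpy sketch, with the window length and rank as illustrative tuning choices:

    ```python
    import numpy as np

    def ssa_reconstruct(x, window, rank):
        """Rank-`rank` SSA reconstruction: SVD of the trajectory matrix
        followed by diagonal averaging (Hankelisation)."""
        x = np.asarray(x, dtype=float)
        N, K = len(x), len(x) - window + 1
        X = np.column_stack([x[i:i + window] for i in range(K)])
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        rec, counts = np.zeros(N), np.zeros(N)
        for j in range(K):                      # average each anti-diagonal
            rec[j:j + window] += Xr[:, j]
            counts[j:j + window] += 1
        return rec / counts

    def ssa_r_forecast(x, window, rank, steps):
        """Continue the reconstructed series with the linear recurrence
        built from the leading left singular vectors (SSA-R)."""
        x = np.asarray(x, dtype=float)
        K = len(x) - window + 1
        X = np.column_stack([x[i:i + window] for i in range(K)])
        U = np.linalg.svd(X, full_matrices=False)[0][:, :rank]
        pi = U[-1, :]                           # last components
        R = (U[:-1, :] @ pi) / (1.0 - pi @ pi)  # recurrence coefficients
        series = list(ssa_reconstruct(x, window, rank))
        for _ in range(steps):
            series.append(R @ np.asarray(series[-(window - 1):]))
        return np.asarray(series[len(x):])
    ```

    On a noise-free seasonal signal of rank 2 (for instance a pure monthly sinusoid) this recurrence continues the series essentially exactly; on real arrivals data, window and rank selection does the heavy lifting.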
  5. By: Shih-Kang Chao; Wolfgang K. Härdle; Ming Yuan
    Abstract: In this paper, we propose a multivariate quantile regression method which enables localized analysis of conditional quantiles and global comovement analysis of conditional ranges for high-dimensional data. The proposed method, hereafter referred to as FActorisable Sparse Tail Event Curves, or FASTEC for short, exploits the potential factor structure of multivariate conditional quantiles through nuclear norm regularization and is particularly suitable for dealing with extreme quantiles. We study both theoretical properties and computational aspects of the estimating procedure for FASTEC. In particular, we derive nonasymptotic oracle bounds for the estimation error, and develop an efficient proximal gradient algorithm for the non-smooth optimization problem arising in our estimating procedure. Merits of the proposed methodology are further demonstrated through applications to Conditional Autoregressive Value-at-Risk (CAViaR) (Engle and Manganelli, 2004) and a Chinese temperature dataset.
    Keywords: High-dimensional data analysis, multivariate quantile regression, quantile regression, value-at-risk, nuclear norm, multi-task learning
    JEL: C38 C63 G17 G20
    Date: 2015–07
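    The computational core, a proximal gradient iteration in which the nuclear-norm prox is singular-value soft-thresholding applied after a (sub)gradient step on the stacked check losses, can be sketched as follows. The step size, penalty level, and iteration count are illustrative, and the paper's actual algorithm differs in detail.

    ```python
    import numpy as np

    def svt(A, tau):
        """Prox of tau * nuclear norm: soft-threshold the singular values."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        return (U * np.maximum(s - tau, 0.0)) @ Vt

    def check_subgrad(residual, taus):
        """Subgradient of the quantile check loss; column k at level taus[k]."""
        return np.where(residual < 0, taus - 1.0, taus)

    def fastec_sketch(X, Y, taus, lam=0.1, iters=500):
        """Nuclear-norm-regularised multivariate quantile regression:
        min_G (1/n) sum_k rho_{tau_k}(Y_k - X G_k) + lam * ||G||_*."""
        n, p = X.shape
        step = 1.0 / np.linalg.norm(X, 2) ** 2      # conservative step size
        G = np.zeros((p, Y.shape[1]))
        for _ in range(iters):
            grad = -X.T @ check_subgrad(Y - X @ G, taus) / n
            G = svt(G - step * grad, step * lam)    # gradient step, then prox
        return G
    ```

    Soft-thresholding the singular values is what induces the low-rank (factorisable) structure of the coefficient matrix across the response dimensions.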
  6. By: Charles Rahal
    Abstract: In this paper we take a computational approach to forecasting a macroeconometric model of housing markets across six original data sets with large cross-sectional dimensions. We compare a large number of models which vary by the choice of factors, 'observable endogenous variables' and the number of lags, in addition to classical and modern (factor-based) specifications. We utilize various optimal model selection and model averaging techniques, comparing them against classical benchmarks. Within a 'pseudo real-time' out-of-sample forecasting context, results show that the approximate BMA method is the best weighting and selection technique, generating forecasts able to outperform the automated univariate benchmark of Hyndman and Khandakar (2008) upwards of 58% of the time. However, the average forecast error is lower in magnitude over all recursions and countries for the benchmark compared with all models for all variables. We also provide results on the biased nature of this class of models in general, and show that the forecast error increases as a function of the underlying variance of the series being forecast.
    Keywords: Housing Markets, Forecasting, Factor Error Correction Models, FAVARs
    JEL: C53 R30
    Date: 2015–06
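    An "approximate BMA" weighting of the kind the paper finds best can be sketched with the standard BIC-based approximation to posterior model probabilities; the exact weighting scheme used in the paper may differ.

    ```python
    import numpy as np

    def approximate_bma_weights(bics):
        """Approximate Bayesian model averaging: weight each model by
        exp(-BIC/2), normalised, as a standard proxy for posterior model
        probabilities."""
        bics = np.asarray(bics, dtype=float)
        w = np.exp(-0.5 * (bics - bics.min()))   # subtract min for stability
        return w / w.sum()

    def combine_forecasts(forecasts, bics):
        """Weighted combination of competing models' point forecasts."""
        return approximate_bma_weights(bics) @ np.asarray(forecasts, dtype=float)
    ```

    With many candidate specifications (here, models varying by factors, observables, and lags), this soft weighting hedges across models rather than betting on a single selected one.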
  7. By: Keunock Lew (Seoul National University of Science & Technology); Seungryul Ma (Government Employees Pension Foundation)
    Abstract: This paper conducts a concurrent simulation analysis to evaluate the guarantor's risk in the Korean reverse mortgage annuity program, allowing the program's key variables to change simultaneously, each following its own stochastic process. Using data covering September 2004 to December 2014, the analysis reveals that the probability of the guarantor incurring a net liability (net loss) is almost negligible (merely 3.65%). The current program therefore appears to be designed very conservatively in the guarantor's interest, leaving room to increase the monthly payment to annuitants. We also evaluate the effect of each individual variable's volatility on the magnitude of the guarantor's total risk. The analysis confirms that the current reverse mortgage program offsets longevity risk, which may increase under the period mortality rates of the 2013 life table, against market risk, which can decrease under assumptions of low housing-price growth and high loan rates. Concurrent simulation is a more realistic way to evaluate the guarantor's risk because it lets the key variables change simultaneously while preserving their interdependence. The results could therefore offer more rational guidance to government policy makers as well as to the reverse mortgage annuity market.
    Keywords: Reverse Mortgage Annuity, Stochastic Process, Guarantor Risk Evaluation, Concurrent Simulation
    JEL: G20 G22
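    The idea of "concurrent" simulation, letting house prices, the loan rate, and hence the loan balance evolve together with correlated shocks rather than stressing one variable at a time, can be sketched as follows. The dynamics, correlation, and every parameter below are illustrative assumptions, not the paper's calibration.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def concurrent_paths(n_paths=10000, n_years=30, mu=(0.02, 0.0),
                         sigma=(0.08, 0.01), rho=-0.3,
                         house0=300.0, rate0=0.04):
        """Evolve house-price and loan-rate paths together, with correlated
        annual shocks drawn via a Cholesky factor, and track a hypothetical
        reverse-mortgage loan balance against the collateral value."""
        L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
        houses = np.full(n_paths, house0)
        rates = np.full(n_paths, rate0)
        balance = np.full(n_paths, 0.5 * house0)   # hypothetical initial loan
        for _ in range(n_years):
            z = rng.standard_normal((n_paths, 2)) @ L.T   # correlated shocks
            houses *= np.exp(mu[0] - 0.5 * sigma[0] ** 2 + sigma[0] * z[:, 0])
            rates = np.maximum(rates + mu[1] + sigma[1] * z[:, 1], 0.0)
            balance *= (1.0 + rates)                      # interest accrues
            balance += 0.01 * house0                      # annual payout
        shortfall = np.maximum(balance - houses, 0.0)     # guarantor's net loss
        return shortfall.mean(), (shortfall > 0).mean()
    ```

    Because the shocks are drawn jointly, the simulated loss distribution reflects the interdependence between housing prices and loan rates that one-variable-at-a-time stress tests ignore.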

This nep-cmp issue is ©2015 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.