nep-ets New Economics Papers
on Econometric Time Series
Issue of 2016‒05‒28
twelve papers chosen by
Yong Yin
SUNY at Buffalo

  1. The Local Fractional Bootstrap By Mikkel Bennedsen; Ulrich Hounyo; Asger Lunde; Mikko S. Pakkanen
  2. Comparing Predictive Accuracy under Long Memory - With an Application to Volatility Forecasting By Robinson Kruse; Christian Leschinski; Michael Will
  3. Mean-correction and Higher Order Moments for a Stochastic Volatility Model with Correlated Errors By Sujay Mukhoti; Pritam Ranjan
  4. The Jacobi Stochastic Volatility Model By Damien Ackerer; Damir Filipović; Sergio Pulido
  5. A Functional Approach to Test Trending Volatility By Hernández del Valle Gerardo; Juárez-Torres Miriam; Guerrero Santiago
  6. Inference on nonstationary time series with moving mean By Jiti Gao; Peter M. Robinson
  7. Learning Time-Varying Forecast Combinations By Antoine Mandel; Amir Sani
  8. Estimation and filtering of nonlinear MS-DSGE models By Sergey Ivashchenko
  9. Auxiliary Likelihood-Based Approximate Bayesian Computation in State Space Models By Gael M. Martin; Brendan P.M. McCabe; David T. Frazier; Worapree Maneesoonthorn; Christian P. Robert
  10. A New Method for Working With Sign Restrictions in SVARs By S Ouliaris; A R Pagan
  11. Testing for Deterministic Seasonality in Mixed-Frequency VARs By Tomás del Barrio Castro; Alain Hecq
  12. Forecasting with Neural Networks Models. By Francis Bismans; Igor N. Litvine

  1. By: Mikkel Bennedsen (Aarhus University and CREATES); Ulrich Hounyo (Aarhus University and CREATES); Asger Lunde (Aarhus University and CREATES); Mikko S. Pakkanen (Imperial College London and CREATES)
    Abstract: We introduce a bootstrap procedure for high-frequency statistics of Brownian semistationary processes. More specifically, we focus on a hypothesis test on the roughness of sample paths of Brownian semistationary processes, which uses an estimator based on a ratio of realized power variations. Our new resampling method, the local fractional bootstrap, relies on simulating an auxiliary fractional Brownian motion that mimics the fine properties of high frequency differences of the Brownian semistationary process under the null hypothesis. We prove the first order validity of the bootstrap method and in simulations we observe that the bootstrap-based hypothesis test provides considerable finite-sample improvements over an existing test that is based on a central limit theorem. This is important when studying the roughness properties of time series data; we illustrate this by applying the bootstrap method to two empirical data sets: we assess the roughness of a time series of high-frequency asset prices and we test the validity of Kolmogorov's scaling law in atmospheric turbulence data.
    Keywords: Brownian semistationary process; roughness; fractal index; Hölder regularity; fractional Brownian motion; bootstrap; stochastic volatility; turbulence JEL Classification: C12, C22, C63, G12 MSC 2010 Classification: 60G10, 60G15, 60G17, 60G22, 62M07, 62M09, 65C05
    Date: 2016–05–03
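The hypothesis test in this paper compares realized power variations of the path at two sampling frequencies. A minimal sketch of that underlying change-of-frequency idea (illustrative only; the paper's contribution is the bootstrap built around such statistics, and the function name is ours):

```python
import math
import random

def cof_roughness_estimate(x):
    """Change-of-frequency estimate of the roughness (Holder) index H:
    realized quadratic variations at two sampling frequencies are
    compared, since E|x(t+k) - x(t)|^2 scales like k^(2H)."""
    n = len(x)
    v1 = sum((x[i] - x[i - 1]) ** 2 for i in range(1, n))   # lag-1 variation
    v2 = sum((x[i] - x[i - 2]) ** 2 for i in range(2, n))   # lag-2 variation
    return 0.5 * math.log2(v2 / v1)

# Sanity check on a simulated Brownian path, for which H = 1/2.
rng = random.Random(42)
bm = [0.0]
for _ in range(20000):
    bm.append(bm[-1] + rng.gauss(0.0, 1.0))
h_hat = cof_roughness_estimate(bm)
```

The bootstrap test in the paper addresses the finite-sample distribution of ratio statistics of exactly this kind.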
  2. By: Robinson Kruse (Rijksuniversiteit Groningen and CREATES); Christian Leschinski (Leibniz University Hannover); Michael Will (Leibniz University Hannover)
    Abstract: This paper extends the popular Diebold-Mariano test to situations when the forecast error loss differential exhibits long memory. It is shown that this situation can arise frequently, since long memory can be transmitted from forecasts and the forecast objective to forecast error loss differentials. The nature of this transmission mainly depends on the (un)biasedness of the forecasts and whether the involved series share common long memory. Further results show that the conventional Diebold-Mariano test is invalidated under these circumstances. Robust statistics based on a memory and autocorrelation consistent estimator and an extended fixed-bandwidth approach are considered. The subsequent Monte Carlo study provides a novel comparison of these robust statistics. As empirical applications, we conduct forecast comparison tests for the realized volatility of the Standard and Poor's 500 index among recent extensions of the heterogeneous autoregressive model. While we find that forecasts improve significantly if jumps in the log-price process are considered separately from continuous components, improvements achieved by the inclusion of implied volatility turn out to be insignificant in most situations.
    Keywords: Equal Predictive Ability, Long Memory, Diebold-Mariano Test, Long-run Variance Estimation, Realized Volatility
    JEL: C22 C52 C53
    Date: 2016–05–19
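A sketch of the conventional Diebold-Mariano statistic that the paper starts from, with a Bartlett-kernel (Newey-West) long-run variance estimate; the memory-robust versions studied in the paper replace this variance estimator:

```python
import math
import random

def dm_test(e1, e2, bandwidth=None):
    """Classical Diebold-Mariano statistic for equal predictive accuracy
    under squared-error loss. Uses a Bartlett-weighted (Newey-West)
    long-run variance estimate, which is invalidated when the loss
    differential has long memory."""
    d = [a * a - b * b for a, b in zip(e1, e2)]     # loss differential
    n = len(d)
    dbar = sum(d) / n
    m = bandwidth if bandwidth is not None else int(n ** (1 / 3))
    gamma0 = sum((x - dbar) ** 2 for x in d) / n
    lrv = gamma0
    for k in range(1, m + 1):
        gk = sum((d[t] - dbar) * (d[t - k] - dbar) for t in range(k, n)) / n
        lrv += 2.0 * (1.0 - k / (m + 1)) * gk       # Bartlett weights
    return dbar / math.sqrt(lrv / n)

rng = random.Random(0)
y = [rng.gauss(0, 1) for _ in range(500)]
f1 = [v + rng.gauss(0, 0.5) for v in y]   # more accurate forecast
f2 = [v + rng.gauss(0, 1.5) for v in y]   # less accurate forecast
stat = dm_test([a - b for a, b in zip(y, f1)],
               [a - b for a, b in zip(y, f2)])
```

A large negative statistic favors the first forecast, as expected here.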
  3. By: Sujay Mukhoti; Pritam Ranjan
    Abstract: In an efficient stock market, the log-returns and their time-dependent variances are often jointly modelled by stochastic volatility models (SVMs). Many SVMs assume that errors in log-return and latent volatility process are uncorrelated, which is unrealistic. It turns out that if a non-zero correlation is included in the SVM (e.g., Shephard (2005)), then the expected log-return at time t conditional on the past returns is non-zero, which is not a desirable feature of an efficient stock market. In this paper, we propose a mean-correction for such an SVM for discrete-time returns with non-zero correlation. We also find closed form analytical expressions for higher moments of log-return and its lead-lag correlations with the volatility process. We compare the performance of the proposed and classical SVMs on S&P 500 index returns obtained from NYSE.
    Date: 2016–05
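A minimal simulation sketch of a discrete-time SV model with correlated return and volatility errors, under one common timing convention (an illustrative assumption, not necessarily the exact model of the paper):

```python
import math
import random

def simulate_sv(n, mu=-1.0, phi=0.95, sigma_eta=0.2, rho=-0.5, seed=1):
    """Simulate a discrete-time stochastic volatility model with
    correlated ("leverage") errors:
        r_t     = exp(h_t / 2) * eps_t
        h_{t+1} = mu + phi * (h_t - mu) + sigma_eta * eta_t
    with corr(eps_t, eta_t) = rho."""
    rng = random.Random(seed)
    h = mu
    returns, logvols = [], []
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        eps = z1
        eta = rho * z1 + math.sqrt(1 - rho * rho) * z2   # correlated shock
        returns.append(math.exp(h / 2.0) * eps)
        logvols.append(h)
        h = mu + phi * (h - mu) + sigma_eta * eta
    return returns, logvols

r, h = simulate_sv(5000)
```

With rho < 0, negative returns tend to coincide with positive volatility shocks, which is the correlation structure whose consequences for the conditional mean the paper corrects.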
  4. By: Damien Ackerer; Damir Filipović; Sergio Pulido
    Abstract: We introduce a novel stochastic volatility model where the squared volatility of the asset return follows a Jacobi process. It contains the Heston model as a limit case. We show that the finite-dimensional distributions of the log price process admit a Gram--Charlier A expansion in closed-form. We use this to derive closed-form series representations for option prices whose payoff is a function of the underlying asset price trajectory at finitely many time points. This includes European call, put, and digital options, forward start options, and forward start options on the underlying return. We derive sharp analytical and numerical bounds on the truncation errors. We illustrate the performance by numerical examples, which show that our approach offers a viable alternative to Fourier transform techniques.
    Date: 2016–05
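An illustrative Euler discretization of a Jacobi process for the squared volatility (all parameter values are ours; the paper works with the exact law via Gram-Charlier expansions rather than simulation):

```python
import math
import random

def simulate_jacobi_vol(n, dt=1 / 252, kappa=2.0, theta=0.04,
                        sigma=0.5, vmin=0.01, vmax=0.09, seed=7):
    """Euler scheme for a Jacobi process for squared volatility:
        dv = kappa*(theta - v) dt + sigma*sqrt((v - vmin)*(vmax - v)) dW
    The diffusion coefficient vanishes at the boundaries, which keeps v
    in [vmin, vmax]; the discretized path is clipped to stay inside."""
    rng = random.Random(seed)
    v = theta
    path = []
    for _ in range(n):
        diff = sigma * math.sqrt(max((v - vmin) * (vmax - v), 0.0))
        v += kappa * (theta - v) * dt + diff * math.sqrt(dt) * rng.gauss(0, 1)
        v = min(max(v, vmin), vmax)    # enforce the bounded state space
        path.append(v)
    return path

vol2 = simulate_jacobi_vol(2000)
```

The bounded state space is what distinguishes the Jacobi model from the Heston model, which it contains as a limit case.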
  5. By: Hernández del Valle Gerardo; Juárez-Torres Miriam; Guerrero Santiago
    Abstract: In this paper we extend the traditional GARCH(1,1) model by including a functional trend term in the conditional volatility of a time series. We derive the main properties of the model and apply it to all agricultural commodities in the Mexican CPI basket, as well as to the international prices of maize, wheat, pork, poultry and beef products for three different time periods that implied changes in price regulations and behavior. The proposed model seems to adequately fit the volatility process and, according to homoscedasticity tests, outperforms the ARCH(1) and GARCH(1,1) models, some of the most popular approaches used in the literature to analyze price volatility.
    Keywords: Agricultural prices; volatility; GARCH models.
    JEL: C22 C51 E31 Q18
    Date: 2016–04
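A sketch of a GARCH(1,1) conditional-variance recursion augmented with a deterministic functional trend g(t); where exactly the trend enters is our illustrative assumption, not necessarily the paper's specification:

```python
import random

def garch_trend_filter(returns, omega=0.05, alpha=0.1, beta=0.85,
                       trend=lambda t: 0.0):
    """GARCH(1,1) conditional variance with an additive functional
    trend term (illustrative placement):
        sigma2_t = omega + g(t) + alpha * r_{t-1}^2 + beta * sigma2_{t-1}
    """
    # Start at the trend-free unconditional variance.
    sigma2 = [omega / max(1e-12, 1.0 - alpha - beta)]
    for t in range(1, len(returns)):
        sigma2.append(omega + trend(t)
                      + alpha * returns[t - 1] ** 2
                      + beta * sigma2[t - 1])
    return sigma2

rng = random.Random(3)
r = [rng.gauss(0, 1) for _ in range(1000)]
s2_flat = garch_trend_filter(r)                        # plain GARCH(1,1)
s2_up = garch_trend_filter(r, trend=lambda t: 0.001 * t)  # trending volatility
```

The trending specification produces a systematically higher variance path for the same return series, which is the feature the paper's homoscedasticity tests pick up.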
  6. By: Jiti Gao; Peter M. Robinson
    Abstract: A semiparametric model is proposed in which a parametric filtering of a nonstationary time series, incorporating fractionally differencing with short memory correction, removes correlation but leaves a nonparametric deterministic trend. Estimates of the memory parameter and other dependence parameters are proposed, and shown to be consistent and asymptotically normally distributed with parametric rate. Tests with standard asymptotics for I(1) and other hypotheses are thereby justified. Estimation of the trend function is also considered. We include a Monte Carlo study of finite-sample performance.
    JEL: J1
    Date: 2014–12
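The parametric filtering rests on fractional differencing. A minimal implementation of the truncated (1 - L)^d filter via its binomial expansion:

```python
def frac_diff(x, d):
    """Apply the fractional difference operator (1 - L)^d using its
    binomial expansion: pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k,
    so that y_t = sum_k pi_k * x_{t-k}, truncated at the sample start
    (pre-sample values treated as zero)."""
    n = len(x)
    pi = [1.0]
    for k in range(1, n):
        pi.append(pi[-1] * (k - 1 - d) / k)
    return [sum(pi[k] * x[t - k] for k in range(t + 1)) for t in range(n)]

# d = 1 reduces to ordinary first differencing (with x_{-1} taken as 0).
y = frac_diff([1.0, 2.0, 3.0, 4.0], d=1.0)
```

Non-integer values of d, e.g. d = 0.4, give the long-memory filters relevant to the memory-parameter estimates in the paper.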
  7. By: Antoine Mandel (CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics); Amir Sani (CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics)
    Abstract: Non-parametric forecast combination methods choose a set of static weights to combine over candidate forecasts as opposed to traditional forecasting approaches, such as ordinary least squares, that combine over information (e.g. exogenous variables). While they are robust to noise, structural breaks, inconsistent predictors and changing dynamics in the target variable, sophisticated combination methods fail to outperform the simple mean. Time-varying weights have been suggested as a way forward. Here we address the challenge to “develop methods better geared to the intermittent and evolving nature of predictive relations” in Stock and Watson (2001) and propose a data-driven machine learning approach to learn time-varying forecast combinations for output, inflation or any macroeconomic time series of interest. Further, the proposed procedure “hedges” combination weights against poor performance relative to the mean, while optimizing weights to minimize the performance gap to the best candidate forecast in hindsight. Theoretical results are reported along with empirical performance on a standard macroeconomic dataset for predicting output and inflation.
    Keywords: Forecast combinations,Machine Learning,Econometrics,Forecasting,Forecast Combination Puzzle,Apprentissage statistique,Combinaison de prédicteurs,Econométrie
    Date: 2016–04
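A toy version of the multiplicative-weights ("Hedge") style of online forecast combination that this strand of work builds on; the paper's actual algorithm and guarantees differ in detail:

```python
import math

def hedge_combination(forecasts, outcomes, eta=0.5):
    """Online forecast combination via exponential weights: each
    expert's weight decays exponentially in its cumulative squared-error
    loss, so the combination tracks the best forecaster in hindsight."""
    k = len(forecasts[0])
    weights = [1.0 / k] * k
    combined = []
    for f_t, y_t in zip(forecasts, outcomes):
        combined.append(sum(w * f for w, f in zip(weights, f_t)))
        losses = [(f - y_t) ** 2 for f in f_t]
        weights = [w * math.exp(-eta * l) for w, l in zip(weights, losses)]
        s = sum(weights)
        weights = [w / s for w in weights]   # renormalize
    return combined, weights

# Expert 0 always predicts the truth; expert 1 is persistently biased.
outcomes = [1.0] * 50
forecasts = [[1.0, 3.0] for _ in outcomes]
comb, w = hedge_combination(forecasts, outcomes)
```

The weight mass migrates to the accurate expert, so the combined forecast converges to the best candidate in hindsight.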
  8. By: Sergey Ivashchenko (National Research University Higher School of Economics)
    Abstract: This article proposes and compares the properties of several nonlinear Markov-switching filters. Two of them are sigma-point filters: the Markov-switching central difference Kalman filter (MSCDKF) and MSCDKFA. Two are Gaussian-assumed filters: the Markov-switching quadratic Kalman filter (MSQKF) and MSQKFA. A small-scale financial MS-DSGE model is used for the tests. MSQKF greatly outperforms the other filters in terms of computational cost, and it is also the first- or second-best filter according to most measures of filtering quality (including the quality of quasi-maximum likelihood estimation using the filter, and the RMSE and LPS of the unobserved variables).
    Keywords: regime switching, second-order approximation, non-linear MS-DSGE estimation, MSQKF, MSCDKF
    JEL: C13 C32 E32
    Date: 2016
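For reference, the scalar linear Kalman filter that the Markov-switching variants extend; this is the textbook building block, not any of the MS filters compared in the paper:

```python
import math
import random

def kalman_filter(ys, a, c, q, r, x0=0.0, p0=1.0):
    """Scalar linear Kalman filter for
        x_t = a * x_{t-1} + w_t,  w_t ~ N(0, q)
        y_t = c * x_t + v_t,      v_t ~ N(0, r)
    Returns the filtered state path and the Gaussian log-likelihood."""
    x, p = x0, p0
    filtered, loglik = [], 0.0
    for y in ys:
        x_pred = a * x
        p_pred = a * a * p + q
        s = c * c * p_pred + r               # innovation variance
        k = p_pred * c / s                   # Kalman gain
        innov = y - c * x_pred
        loglik += -0.5 * (math.log(2 * math.pi * s) + innov * innov / s)
        x = x_pred + k * innov
        p = (1 - k * c) * p_pred
        filtered.append(x)
    return filtered, loglik

rng = random.Random(8)
xs, ys = [], []
x = 0.0
for _ in range(500):
    x = 0.9 * x + rng.gauss(0, 0.5)
    xs.append(x)
    ys.append(x + rng.gauss(0, 1.0))
xf, ll = kalman_filter(ys, a=0.9, c=1.0, q=0.25, r=1.0)
```

The MS filters in the paper run regime-conditional versions of this recursion and mix them with regime probabilities, adding higher-order terms for the nonlinear DSGE dynamics.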
  9. By: Gael M. Martin; Brendan P.M. McCabe; David T. Frazier; Worapree Maneesoonthorn; Christian P. Robert
    Abstract: A new approach to inference in state space models is proposed, using approximate Bayesian computation (ABC). ABC avoids evaluation of an intractable likelihood by matching summary statistics computed from observed data with statistics computed from data simulated from the true process, based on parameter draws from the prior. Draws that produce a ‘match’ between observed and simulated summaries are retained, and used to estimate the inaccessible posterior. With no reduction to sufficiency possible in the state space setting, we pursue summaries via the maximization of an auxiliary likelihood function. We derive conditions under which this auxiliary likelihood-based approach achieves Bayesian consistency and show that – in a precise limiting sense – results yielded by the auxiliary maximum likelihood estimator are replicated by the auxiliary score. Particular attention is given to a structure in which the state variable is driven by a continuous time process, with exact inference typically infeasible in this case due to intractable transitions. Two models for continuous time stochastic volatility are used for illustration, with auxiliary likelihoods constructed by applying computationally efficient filtering methods to discrete time approximations. The extent to which the conditions for consistency are satisfied is demonstrated in both cases, and the accuracy of the proposed technique when applied to a square root volatility model is also demonstrated numerically. In multiple parameter settings a separate treatment of each parameter, based on integrated likelihood techniques, is advocated as a way of avoiding the curse of dimensionality associated with ABC methods.
    Keywords: Likelihood-free methods, latent diffusion models, Bayesian consistency, asymptotic sufficiency, unscented Kalman filter, stochastic volatility
    JEL: C11 C22 C58
    Date: 2016
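A toy rejection-ABC sampler for a normal-mean model, with the sample mean standing in for the paper's auxiliary-likelihood-based summaries:

```python
import random
import statistics

def abc_rejection(data, n_draws=5000, keep=0.02, seed=5):
    """Plain rejection ABC for the mean of a N(theta, 1) model with a
    U(-5, 5) prior, using the sample mean as the summary statistic:
    the draws whose simulated summary lies closest to the observed
    summary are retained as an approximate posterior sample."""
    rng = random.Random(seed)
    s_obs = statistics.fmean(data)
    n = len(data)
    draws = []
    for _ in range(n_draws):
        theta = rng.uniform(-5.0, 5.0)                    # prior draw
        sim = statistics.fmean(rng.gauss(theta, 1.0) for _ in range(n))
        draws.append((abs(sim - s_obs), theta))
    draws.sort()                                          # nearest first
    return [theta for _, theta in draws[: int(keep * n_draws)]]

rng = random.Random(11)
obs = [rng.gauss(2.0, 1.0) for _ in range(50)]
posterior = abc_rejection(obs)
```

In the state space models of the paper, the sample mean is replaced by summaries obtained from maximizing an auxiliary likelihood, which is what delivers the consistency results.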
  10. By: S Ouliaris; A R Pagan (UniSyd)
    Abstract: Structural VARs are used to compute impulse responses to shocks. One problem that has arisen involves the information needed to perform this task, i.e. how the shocks are to be separated into those representing technology, monetary effects, etc. Increasingly, the signs of impulse responses are used for this task. However, it is often desirable to impose some parametric assumption as well, e.g. that monetary shocks have no long-run impact on output. Existing methods for combining sign and parametric restrictions are not well developed. In this paper we provide a relatively simple way to allow for these combinations and show how it works in a number of different contexts.
    Keywords: VAR
    Date: 2015–05–11
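A toy illustration of pure sign-restriction identification in a bivariate setting, sweeping rotations of the Cholesky factor; the paper's contribution is combining such sign restrictions with parametric (e.g. long-run) restrictions:

```python
import math

def sign_restricted_impacts(sigma, checks, n_grid=360):
    """For a bivariate SVAR with reduced-form covariance sigma, sweep
    rotations Q(theta) of the lower-triangular Cholesky factor L and
    keep the candidate impact matrices B = L Q(theta) whose entries
    satisfy the supplied sign checks. Each B satisfies B B' = sigma."""
    a = math.sqrt(sigma[0][0])
    b = sigma[1][0] / a
    c = math.sqrt(sigma[1][1] - b * b)        # 2x2 Cholesky by hand
    accepted = []
    for i in range(n_grid):
        th = 2 * math.pi * i / n_grid
        q = [[math.cos(th), -math.sin(th)],
             [math.sin(th), math.cos(th)]]    # rotation matrix
        B = [[a * q[0][0], a * q[0][1]],
             [b * q[0][0] + c * q[1][0], b * q[0][1] + c * q[1][1]]]
        if checks(B):
            accepted.append(B)
    return accepted

sigma = [[1.0, 0.3], [0.3, 1.0]]
# Restriction: the first shock moves both variables up on impact.
keep = sign_restricted_impacts(sigma,
                               lambda B: B[0][0] > 0 and B[1][0] > 0)
```

The accepted set is a whole interval of rotation angles, which is the well-known set-identification feature that motivates adding parametric restrictions.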
  11. By: Tomás del Barrio Castro (Universitat de les Illes Balears); Alain Hecq (Maastricht University)
    Abstract: This paper investigates the presence of deterministic seasonal features within a mixed frequency vector autoregressive model. A strategy based on Wald tests is proposed.
    Keywords: deterministic seasonal features, mixed frequency VARs
    JEL: C32
    Date: 2016
  12. By: Francis Bismans; Igor N. Litvine
    Abstract: This paper deals with the so-called feedforward neural network model, which we consider from a statistical and econometric viewpoint. We show how this model can be estimated by maximum likelihood. Finally, we apply the ANN methodology to model demand for electricity in South Africa. A comparison of forecasts based on a linear model and an ANN model, respectively, shows the usefulness of the latter.
    Keywords: Artificial neural networks (ANN), electricity consumption, forecasting, linear and non-linear models, recessions.
    JEL: C45 C53 E17 E27 Q43 Q47
    Date: 2016
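A minimal one-hidden-layer feedforward network trained by gradient descent on squared error (equivalent to maximum likelihood under Gaussian errors, the framing used in the paper); all sizes and learning rates are illustrative:

```python
import math
import random

def train_ffnn(xs, ys, hidden=8, lr=0.05, epochs=2000, seed=4):
    """One-hidden-layer feedforward network (tanh units, linear output)
    fit by full-batch gradient descent on mean squared error."""
    rng = random.Random(seed)
    w1 = [rng.uniform(-0.5, 0.5) for _ in range(hidden)]
    b1 = [rng.uniform(-0.5, 0.5) for _ in range(hidden)]
    w2 = [rng.uniform(-0.5, 0.5) for _ in range(hidden)]
    b2 = 0.0

    def predict(x):
        return b2 + sum(w2[j] * math.tanh(w1[j] * x + b1[j])
                        for j in range(hidden))

    n = len(xs)
    for _ in range(epochs):
        g1 = [0.0] * hidden
        gb1 = [0.0] * hidden
        g2 = [0.0] * hidden
        gb2 = 0.0
        for x, y in zip(xs, ys):
            hs = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
            err = (b2 + sum(w2[j] * hs[j] for j in range(hidden))) - y
            gb2 += err
            for j in range(hidden):
                g2[j] += err * hs[j]
                dh = err * w2[j] * (1 - hs[j] ** 2)   # backprop through tanh
                g1[j] += dh * x
                gb1[j] += dh
        b2 -= lr * gb2 / n
        for j in range(hidden):
            w2[j] -= lr * g2[j] / n
            w1[j] -= lr * g1[j] / n
            b1[j] -= lr * gb1[j] / n
    return predict

xs = [i / 10.0 for i in range(-10, 11)]
ys = [x * x for x in xs]                # nonlinear target a line cannot fit
net = train_ffnn(xs, ys)
mse = sum((net(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
```

The nonlinear target here is the kind of pattern where, as in the paper's electricity-demand application, an ANN improves on a linear model.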

This nep-ets issue is ©2016 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.