nep-ets New Economics Papers
on Econometric Time Series
Issue of 2019‒01‒07
fourteen papers chosen by
Jaqueson K. Galimberti
KOF Swiss Economic Institute

  1. Mild-explosive and Local-to-mild-explosive Autoregressions with Serially Correlated Errors By Lui, Yiu Lim; Xiao, Weilin; Yu, Jun
  2. A supreme test for periodic explosive GARCH By Stefan Richter; Weining Wang; Wei Biao Wu
  3. Generating Univariate Fractional Integration within a Large VAR(1) By Guillaume Chevillon; Alain Hecq; Sébastien Laurent
  4. Panel Cointegration Techniques and Open Challenges By Peter Pedroni
  5. Robust inference in models identified via heteroskedasticity By Lewis, Daniel J.
  6. Benchmarking Deep Sequential Models on Volatility Predictions for Financial Time Series By Qiang Zhang; Rui Luo; Yaodong Yang; Yuanyuan Liu
  7. Sequential test for unit root in AR(1) model By Keiji Nagai; Yoshihiko Nishiyama; Kohtaro Hitomi
  8. Volatility Estimation and Jump Detection for drift-diffusion Processes By Sébastien Laurent; Shuping Shi
  9. Nowcasting private consumption: traditional indicators, uncertainty measures, credit cards and some internet data By María Gil; Javier J. Pérez; A. Jesús Sánchez; Alberto Urtasun
  10. The ETS challenges: a machine learning approach to the evaluation of simulated financial time series for improving generation processes By Javier Franco-Pedroso; Joaquin Gonzalez-Rodriguez; Maria Planas; Jorge Cubero; Rafael Cobo; Fernando Pablos
  11. Bootstrapping Structural Change Tests By Otilia Boldea; Adriana Cornea-Madeira; Alastair R. Hall
  12. Estimation of Structural Break Point in Linear Regression Models By Yaein Baek
  13. Temporal disaggregation of overlapping noisy quarterly data using state space models: Estimation of monthly business sector output from Value Added Tax data in the UK By Paul Labonne; Martin Weale
  14. Nowcasting real GDP growth with business tendency surveys data: A cross country analysis By Evzen Kocenda; Karen Poghosyan

  1. By: Lui, Yiu Lim (School of Economics, Singapore Management University); Xiao, Weilin (School of Management, Zhejiang University); Yu, Jun (School of Economics, Singapore Management University)
    Abstract: This paper first extends the results of Phillips and Magdalinos (2007a) by allowing for anti-persistent errors in mildly explosive autoregressive models. It is shown that the Cauchy asymptotic theory remains valid for the least squares (LS) estimator. The paper then extends the results of Phillips, Magdalinos and Giraitis (2010) by allowing for serially correlated errors of various forms in local-to-mild-explosive autoregressive models. It is shown that the smooth transition in the limit theory between local-to-unity and mild explosiveness remains valid for the LS estimator. Finally, the limit theory for autoregressions with an intercept is developed.
    Keywords: Anti-persistent; unit root; mildly explosive; limit theory; bubble; fractional integration; Young integral
    JEL: C22
    Date: 2018–12–15
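As a quick illustration of the mildly explosive setting, the sketch below simulates an AR(1) with root ρ_n = 1 + c/n^α (with plain i.i.d. Gaussian errors rather than the anti-persistent or serially correlated errors the paper allows; n, c, and α are arbitrary illustrative choices) and shows how sharply the least-squares estimator concentrates around the true root:

```python
import numpy as np

rng = np.random.default_rng(0)

def ls_rho(y):
    """Least-squares estimator of the AR(1) coefficient."""
    return np.sum(y[:-1] * y[1:]) / np.sum(y[:-1] ** 2)

# mildly explosive root: rho_n = 1 + c / n**alpha with c > 0 and alpha in (0, 1)
n, c, alpha = 2000, 1.0, 0.7
rho = 1.0 + c / n**alpha

y = np.zeros(n)
for t in range(1, n):
    y[t] = rho * y[t - 1] + rng.standard_normal()

rho_hat = ls_rho(y)   # concentrates very tightly around rho (super-consistency)
```

The estimation error shrinks at the rate n^α ρ^n, far faster than the usual root-n rate, which is the setting in which the Cauchy limit theory operates.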
  2. By: Stefan Richter; Weining Wang; Wei Biao Wu
    Abstract: We develop a uniform test for detecting and dating explosive behavior of a strictly stationary GARCH$(r,s)$ (generalized autoregressive conditional heteroskedasticity) process. Namely, we test the null hypothesis of a globally stable GARCH process with constant parameters against an alternative where there is an 'abnormal' period with changed parameter values. During this period, the change may lead to an explosive behavior of the volatility process. It is assumed that both the magnitude and the timing of the breaks are unknown. We develop a double supreme test for the existence of a break, and then provide an algorithm to identify the period of change. Our theoretical results hold under mild moment assumptions on the innovations of the GARCH process. Technically, the existing properties of the QMLE in the GARCH model need to be reinvestigated to hold uniformly over all possible periods of change. The key results involve a uniform weak Bahadur representation for the estimated parameters, which leads to weak convergence of the test statistic to the supremum of a Gaussian process. In simulations we show that the test has good size and power for reasonably large time series lengths. We apply the test to Apple asset returns and Bitcoin returns.
    Date: 2018–12
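The sketch below mimics the paper's setting: a globally stable GARCH(1,1) path containing one 'abnormal' window in which the parameters turn explosive. The detection step here is a crude CUSUM-of-squares scan, a deliberately simplified stand-in for the authors' QMLE-based double supreme statistic; all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

n = 1000
omega = np.full(n, 0.10)
alpha = np.full(n, 0.05)
beta = np.full(n, 0.90)
alpha[400:600] = 0.50   # 'abnormal' window with alpha + beta > 1,
beta[400:600] = 0.85    # so volatility behaves explosively inside it

x = np.zeros(n)
sigma2 = omega[0] / (1.0 - alpha[0] - beta[0])   # stationary variance of the stable regime
for t in range(n):
    x[t] = np.sqrt(sigma2) * rng.standard_normal()
    sigma2 = omega[t] + alpha[t] * x[t] ** 2 + beta[t] * sigma2

# crude CUSUM-of-squares scan over all candidate dates: the statistic spikes
# where the accumulation of squared returns departs from a uniform rate
s = np.cumsum(x ** 2)
stat = np.abs(s / s[-1] - np.arange(1, n + 1) / n)
k_hat = int(np.argmax(stat))   # lands near the abnormal window
```

The paper's actual test instead compares QMLE parameter estimates uniformly over candidate change periods, which is what makes both the break magnitude and its timing detectable.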
  3. By: Guillaume Chevillon (Department of Information Systems, Decision Sciences and Statistics, ESSEC Business School); Alain Hecq (Department of Quantitative Economics, School of Business and Economics, Maastricht University); Sébastien Laurent (Aix-Marseille Univ., CNRS, EHESS, Centrale Marseille, AMSE & Aix-Marseille Graduate School of Management)
    Abstract: This paper shows that a large dimensional vector autoregressive model (VAR) of finite order can generate fractional integration in the marginalized univariate series. We derive high-level assumptions under which the final equation representation of a VAR(1) leads to univariate fractional white noises and verify the validity of these assumptions for two specific models.
    Keywords: long memory, vector autoregressive model, marginalization, final equation representation
    JEL: C10 C32
    Date: 2018–12
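The paper derives fractional integration from the final equation representation of a single large VAR(1). A closely related classical mechanism, Granger's (1980) cross-sectional aggregation of many AR(1) components, is easy to simulate and conveys the same intuition of long memory emerging from short-memory building blocks. The Beta(2, 0.7) coefficient law and the sample sizes below are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(10)

# N independent AR(1) components whose squared coefficients follow a Beta(2, 0.7)
# law, so coefficients pile up near unity (the clip guards against phi == 1)
N, T = 500, 20_000
phi = np.sqrt(np.clip(rng.beta(2.0, 0.7, N), 0.0, 1.0 - 1e-8))

y = rng.standard_normal(N) / np.sqrt(1.0 - phi**2)   # start each component at stationarity
agg = np.empty(T)
for t in range(T):
    y = phi * y + rng.standard_normal(N)
    agg[t] = y.sum()

def acf(x, k):
    """Sample autocorrelation of x at lag k."""
    x = x - x.mean()
    return np.sum(x[:-k] * x[k:]) / np.sum(x**2)

# the aggregate's autocorrelation decays far more slowly than any typical
# component's phi**k would suggest: the signature of long memory
slow_decay = (acf(agg, 1), acf(agg, 50), acf(agg, 200))
```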
  4. By: Peter Pedroni (Williams College)
    Abstract: This chapter discusses the challenges that shape panel cointegration techniques, with an emphasis on the challenge of maintaining the robustness of cointegration methods when temporal dependencies interact with both cross sectional heterogeneities and dependencies. It also discusses some of the open challenges that lie ahead, including the challenge of generalizing to nonlinear and time varying cointegrating relationships. The chapter is written in a nontechnical style that is intended to be accessible to nonspecialists, with an emphasis on conveying the underlying concepts and intuition.
    Keywords: Panel Time Series, Cointegration, Nonstationary Panels, Nonlinear Panels
    JEL: C33
    Date: 2018–10
  5. By: Lewis, Daniel J. (Federal Reserve Bank of New York)
    Abstract: Identification via heteroskedasticity exploits differences in variances across regimes to identify parameters in simultaneous equations. I study weak identification in such models, which arises when variances change very little or the variances of multiple shocks change close to proportionally. I show that this causes standard inference to become unreliable, propose two tests to detect weak identification, and develop nonconservative methods for robust inference on a subset of the parameter vector. I apply these tools to monetary policy shocks, identified using heteroskedasticity in high frequency data. I detect weak identification in daily data, causing standard inference methods to be invalid. However, using intraday data instead allows the shocks to be strongly identified.
    Keywords: heteroskedasticity; weak identification; robust inference; pretesting; monetary policy; impulse response function
    JEL: C12 C32 E43
    Date: 2018–12–01
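The identification idea the paper builds on can be sketched in its simplest textbook form (in the spirit of Rigobon's moment-difference estimator, not the paper's robust inference procedure): in a simultaneous system where only the variance of one structural shock changes across regimes, the ratio of changes in second moments recovers the structural coefficient. All parameter values below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

b, c = 0.5, -0.3      # structural system: y = b*x + u,  x = c*y + v
n = 200_000           # large samples so moment estimates are tight

def simulate(sig_v, n):
    u = rng.standard_normal(n)            # var(u) held fixed across regimes
    v = sig_v * rng.standard_normal(n)    # var(v) shifts across regimes
    x = (c * u + v) / (1 - b * c)         # reduced form of the simultaneous system
    y = (u + b * v) / (1 - b * c)
    return x, y

x1, y1 = simulate(1.0, n)   # regime 1: low variance of v
x2, y2 = simulate(3.0, n)   # regime 2: high variance of v

# since only var(v) changes, the ratio of moment differences identifies b
b_hat = (np.cov(x2, y2)[0, 1] - np.cov(x1, y1)[0, 1]) / (np.var(x2) - np.var(x1))
```

Weak identification, the paper's subject, is exactly the situation in which the denominator of this ratio is close to zero, so small sampling errors in the moments translate into large errors in b_hat.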
  6. By: Qiang Zhang; Rui Luo; Yaodong Yang; Yuanyuan Liu
    Abstract: Volatility is a measure of the dispersion of price movements in stocks or options and indicates the level of uncertainty within financial markets. As an indicator of the level of risk or the degree of variation, volatility is important for analysing financial markets, and it enters various decision-making processes in financial activities. Meanwhile, recent advances in deep learning have shown strong capabilities in modelling sequential data, such as speech and natural language. In this paper, we empirically study the applicability of the latest deep architectures to the volatility modelling problem, aiming to provide empirical guidance for future theoretical analysis of the marriage between deep learning techniques and financial applications. We examine both traditional approaches and deep sequential models on the task of volatility prediction, including the most recent variants of convolutional and recurrent networks, such as dilated architectures. Experiments with real-world stock price data are performed on a set of 1,314 daily stock series spanning 2,018 trading days. The evaluation and comparison are based on the negative log-likelihood (NLL) of the stock price time series. The results show that the dilated neural models, including the dilated CNN and dilated RNN, produce the most accurate estimates and predictions, outperforming various widely used deterministic models in the GARCH family and several recently proposed stochastic models. In addition, the study validates the high flexibility and rich expressive power of these deep models.
    Date: 2018–11
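The NLL evaluation criterion can be sketched as follows: score each volatility forecaster by the average Gaussian negative log-likelihood it assigns to the observed returns. The toy data and the two simple forecasters below (a constant-variance baseline and a rolling average of squared returns) are illustrative stand-ins, not the models benchmarked in the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

def gaussian_nll(r, sigma2):
    """Average negative log-likelihood of returns r under N(0, sigma2_t)."""
    return np.mean(0.5 * (np.log(2.0 * np.pi * sigma2) + r**2 / sigma2))

# toy returns with smoothly time-varying volatility
n = 5000
true_sigma2 = 0.2 + 1.8 * np.sin(np.linspace(0.0, 20.0, n)) ** 2
r = np.sqrt(true_sigma2) * rng.standard_normal(n)

# two volatility 'forecasters': a constant-variance baseline and a
# 100-observation rolling average of squared returns (floored to stay positive)
const_fc = np.full(n, r.var())
roll_fc = np.maximum(np.convolve(r**2, np.ones(100) / 100, mode="same"), 1e-6)

nll_const = gaussian_nll(r, const_fc)
nll_roll = gaussian_nll(r, roll_fc)   # lower NLL: tracking volatility pays off
```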
  7. By: Keiji Nagai (Yokohama National University); Yoshihiko Nishiyama (Institute of Economic Research, Kyoto University); Kohtaro Hitomi (Kyoto Institute of Technology)
    Abstract: We consider unit root tests under sequential sampling for an AR(1) process against both stationary and explosive alternatives. We propose three kinds of tests, namely t-type, stopping-time, and Bonferroni tests, using the sequential coefficient estimator and the stopping time of Lai and Siegmund (1983). To examine their statistical properties, we obtain their joint weak limit by approximating the processes in D[0;∞) and using a time change and a DDS (Dambis and Dubins-Schwarz) Brownian motion. The distribution of the stopping time is characterized by a Bessel process of dimension 3/2, with and without drift, while the estimator is asymptotically normally distributed. We implement Monte Carlo simulations and numerical computations to examine their small-sample properties.
    Date: 2018–10
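The key device, stopping once the observed information Σ y_{t-1}² crosses a threshold c so that the normalized coefficient estimator is approximately standard normal even at the unit root, can be sketched as follows. This is a minimal illustration of the Lai and Siegmund (1983) stopping rule, not the authors' three tests; c and the replication count are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(4)

def sequential_ls(rho, c, rng):
    """Observe y_t = rho * y_{t-1} + e_t until the observed information
    sum(y_{t-1}**2) first exceeds c, then return the LS estimate of rho."""
    y_prev = info = cross = 0.0
    while info < c:
        y = rho * y_prev + rng.standard_normal()
        info += y_prev**2
        cross += y_prev * y
        y_prev = y
    return cross / info

# at the unit root (rho = 1), sqrt(c) * (rho_hat - 1) is approximately N(0, 1),
# unlike the skewed Dickey-Fuller distribution under fixed-sample-size sampling
c = 2000.0
z = np.array([np.sqrt(c) * (sequential_ls(1.0, c, rng) - 1.0) for _ in range(2000)])
```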
  8. By: Sébastien Laurent (Aix-Marseille Univ., CNRS, EHESS, Centrale Marseille, AMSE & Aix-Marseille Graduate School of Management); Shuping Shi (Department of Economics, Macquarie University & Centre for Applied Macroeconomic Analysis (CAMA))
    Abstract: Logarithms of prices of financial assets are conventionally assumed to follow drift-diffusion processes. While the drift term is typically ignored in the infill asymptotic theory and applications, the presence of nonzero drifts is an undeniable fact. The finite sample theory and extensive simulations provided in this paper reveal that the drift component has a nonnegligible impact on the estimation accuracy of volatility and leads to a dramatic power loss of a class of jump identification procedures. We propose an alternative construction of volatility estimators and jump tests and observe significant improvement of both in the presence of nonnegligible drift. As an illustration, we apply the new volatility estimators and jump tests, along with their original versions, to 21 years of 5-minute log-returns of the NASDAQ stock price index.
    Keywords: diffusion process, nonzero drift, finite sample theory, volatility estimation, jumps
    JEL: C12 C14
    Date: 2018–12
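The mechanism is easy to reproduce in a toy setting: with a nonzero drift, the standard realized variance is biased upward by roughly μ²/n per day, and demeaning the intraday returns removes most of that bias. This is only an illustration of the effect (the drift is deliberately exaggerated), not the estimator proposed in the paper:

```python
import numpy as np

rng = np.random.default_rng(5)

# dX = mu dt + sigma dW sampled at n = 78 five-minute intervals over one 'day'
mu, sigma, n, reps = 2.0, 0.5, 78, 20_000
dt = 1.0 / n
r = mu * dt + sigma * np.sqrt(dt) * rng.standard_normal((reps, n))   # log-returns

rv_raw = np.sum(r**2, axis=1)                    # standard realized variance
r_dm = r - r.mean(axis=1, keepdims=True)         # demean to strip the drift
rv_dm = np.sum(r_dm**2, axis=1)                  # drift-adjusted version

true_iv = sigma**2                    # integrated variance over the day
bias_raw = rv_raw.mean() - true_iv    # approximately mu**2 / n, nonnegligible here
bias_dm = rv_dm.mean() - true_iv      # close to zero
```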
  9. By: María Gil (Banco de España); Javier J. Pérez (Banco de España); A. Jesús Sánchez (Instituto Complutense de Estudios Internacionales (UCM) and GEN); Alberto Urtasun (Banco de España)
    Abstract: The focus of this paper is on nowcasting and forecasting quarterly private consumption. The selection of real-time, monthly indicators focuses on standard (“hard” / “soft” indicators) and less-standard variables. Among the latter group we analyze: i) proxy indicators of economic and policy uncertainty; ii) payment card transactions, as measured at “Point-of-sale” (POS) and ATM withdrawals; iii) indicators based on consumption-related search queries retrieved by means of the Google Trends application. We estimate a suite of mixed-frequency, time series models at the monthly frequency, on a real-time database with Spanish data, and conduct out-of-sample forecasting exercises to assess the relative merits of the different groups of indicators. Some results stand out: i) “hard” and payment card indicators are the best performers when taken individually, and more so when combined; ii) nonetheless, “soft” indicators are helpful to detect qualitative signals in the nowcasting horizon; iii) Google-based and uncertainty indicators add value when combined with traditional indicators, most notably at estimation horizons beyond the nowcasting one, which would be consistent with capturing information about future consumption decisions; iv) the combinations of models that include the best performing indicators tend to beat broader-based combinations.
    Keywords: private consumption, nowcasting, forecasting, uncertainty, Google Trends.
    JEL: E27 C32 C53
    Date: 2018–12
  10. By: Javier Franco-Pedroso; Joaquin Gonzalez-Rodriguez; Maria Planas; Jorge Cubero; Rafael Cobo; Fernando Pablos
    Abstract: This paper presents an evaluation framework that attempts to quantify the "degree of realism" of simulated financial time series, whatever the simulation method, with the aim of discovering unknown characteristics that are not being properly reproduced by such methods in order to improve them. For that purpose, the evaluation framework is posed as a machine learning problem in which given time series examples have to be classified as simulated or real financial time series. The "challenge" is proposed as an open competition, similar to those hosted on the Kaggle platform, in which participants must submit their classification results along with a description of the features and classifiers used. The results of these "challenges" have revealed some interesting properties of financial data and have led to substantial improvements in the simulation methods under research, some of which are described in this work.
    Date: 2018–11
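The evaluation framework can be sketched as a tiny version of such a challenge: generate 'real-like' series (here GARCH, so they exhibit volatility clustering) and naive simulated series (i.i.d. Gaussian), then classify them with a single stylized-fact feature. All models, features, and thresholds below are illustrative choices, not those used in the ETS challenges:

```python
import numpy as np

rng = np.random.default_rng(6)

def acf1_sq(x):
    """Lag-1 autocorrelation of squared returns, a volatility-clustering feature."""
    s = x**2 - np.mean(x**2)
    return np.sum(s[:-1] * s[1:]) / np.sum(s**2)

def garch_series(n, rng):
    """A GARCH(1,1) path standing in for a 'real' financial return series."""
    x = np.zeros(n)
    sigma2 = 1.0
    for t in range(n):
        x[t] = np.sqrt(sigma2) * rng.standard_normal()
        sigma2 = 0.05 + 0.15 * x[t] ** 2 + 0.80 * sigma2
    return x

n_obs, m = 1000, 200
real = [garch_series(n_obs, rng) for _ in range(m)]     # clustered volatility
fake = [rng.standard_normal(n_obs) for _ in range(m)]   # naive i.i.d.-Gaussian simulator

# single-feature classifier: label a series 'real' if volatility clustering is detected
threshold = 0.05
correct = sum(acf1_sq(x) > threshold for x in real) + sum(acf1_sq(x) <= threshold for x in fake)
accuracy = correct / (2 * m)   # well above chance: the naive simulator is exposed
```

A simulator that reproduced volatility clustering would push this accuracy back toward 0.5, which is exactly the feedback loop the challenge framework is designed to create.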
  11. By: Otilia Boldea (Tilburg University); Adriana Cornea-Madeira (University of York); Alastair R. Hall (University of Manchester)
    Abstract: This paper analyses the use of bootstrap methods to test for parameter change in linear models estimated via Two Stage Least Squares (2SLS). Two types of test are considered: one where the null hypothesis is of no change and the alternative hypothesis involves discrete change at k unknown break-points in the sample; and a second test where the null hypothesis is that there is discrete parameter change at l break-points in the sample against an alternative in which the parameters change at l + 1 break-points. In both cases, we consider inferences based on a sup-Wald-type statistic using either the wild recursive bootstrap or the wild fixed bootstrap. We establish the asymptotic validity of these bootstrap tests under a set of general conditions that allow the errors to exhibit conditional and/or unconditional heteroskedasticity, and report results from a simulation study that indicate the tests yield reliable inferences in the sample sizes often encountered in macroeconomics. The analysis covers the cases where the first-stage estimation of 2SLS involves a model whose parameters are either constant or themselves subject to discrete parameter change. If the errors exhibit unconditional heteroskedasticity and/or the reduced form is unstable then the bootstrap methods are particularly attractive because the limiting distributions of the test statistics are not pivotal.
    Date: 2018–11
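The wild fixed bootstrap idea can be sketched in a much simpler model than the paper's 2SLS setting: a sup-Wald test for a mean break with heteroskedastic errors, where bootstrap samples are built by randomly flipping the signs of residuals so that each observation keeps its own variance. This is an illustrative miniature, not the authors' procedure:

```python
import numpy as np

rng = np.random.default_rng(7)

def sup_wald(y, trim=0.15):
    """Sup over candidate break dates of the Wald statistic for a mean shift."""
    n = len(y)
    lo, hi = int(trim * n), int((1.0 - trim) * n)
    best = 0.0
    for k in range(lo, hi):
        m1, m2 = y[:k].mean(), y[k:].mean()
        v = y[:k].var(ddof=1) / k + y[k:].var(ddof=1) / (n - k)
        best = max(best, (m1 - m2) ** 2 / v)
    return best

n, B = 200, 199
u = rng.standard_normal(n) * np.linspace(0.5, 2.0, n)   # unconditionally heteroskedastic errors
y = 1.0 + u                                             # null model: constant mean, no break

stat = sup_wald(y)
resid = y - y.mean()
# wild fixed bootstrap: flip residual signs at random; each draw preserves
# the heteroskedasticity pattern of the original sample
boot = np.array([sup_wald(y.mean() + resid * rng.choice([-1.0, 1.0], size=n)) for _ in range(B)])
p_value = np.mean(boot >= stat)
```

Because the errors are unconditionally heteroskedastic, the limiting null distribution of the sup-Wald statistic is not pivotal, which is precisely why a bootstrap critical value is attractive here.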
  12. By: Yaein Baek
    Abstract: This paper proposes a point estimator of the break location for a one-time structural break in linear regression models. If the break magnitude is small, the least-squares estimator of the break date has two modes at the ends of the finite sample period, regardless of the true break location. I suggest a modification of the least-squares objective function to solve this problem. The modified objective function incorporates estimation uncertainty that varies across potential break dates. The new break-point estimator is consistent and has a unimodal finite-sample distribution under a small break magnitude. A limit distribution is provided under an in-fill asymptotic framework, which verifies that the new estimator outperforms the least-squares estimator.
    Date: 2018–11
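For context, the standard least-squares break-date estimator that the paper modifies can be sketched as follows: it minimizes the sum of squared residuals over candidate break dates. The break magnitude here is chosen large so the estimator concentrates near the true date; the paper targets the small-magnitude case, where this estimator's finite-sample distribution becomes bimodal at the sample ends:

```python
import numpy as np

rng = np.random.default_rng(8)

n, k0, delta = 200, 120, 1.5   # true break date k0 and a large break magnitude
y = rng.standard_normal(n)
y[k0:] += delta                # mean shifts by delta at k0

def ssr_at(k, y):
    """Sum of squared residuals when a mean shift is imposed at date k."""
    return np.sum((y[:k] - y[:k].mean()) ** 2) + np.sum((y[k:] - y[k:].mean()) ** 2)

candidates = range(10, n - 10)                        # trim the sample ends
k_hat = min(candidates, key=lambda k: ssr_at(k, y))   # LS break-date estimate
```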
  13. By: Paul Labonne; Martin Weale
    Abstract: This paper derives monthly estimates of turnover for small and medium size businesses in the UK from rolling quarterly VAT-based turnover data. We develop a state space approach for filtering and temporally disaggregating the VAT figures, which are noisy and exhibit dynamic unobserved components. In particular, we derive multivariate and nonlinear methods to make use of indicator series and of data in logarithms, respectively. After illustrating our temporal disaggregation method and estimation strategy using an example industry, we estimate monthly seasonally adjusted figures for the seventy-five industries for which the data are available. We thus produce an aggregate series representing approximately a quarter of gross value added in the economy. We compare our estimates with those derived from the Monthly Business Survey and find that the VAT-based estimates show a different time profile and are less volatile. In addition to this empirical work, our contribution to the literature on temporal disaggregation is twofold. First, we provide a discussion of the effect that noise in aggregate figures has on the estimation of disaggregated model components. Second, we illustrate a new temporal aggregation strategy suited to overlapping data. The technique we adopt is more parsimonious than the seminal method of Harvey and Pierse (1984) and can easily be generalised to non-overlapping data.
    Keywords: Temporal disaggregation, State space models, Structural time series models, Administrative data, Monthly GDP
    JEL: E01 C32 P44
    Date: 2018–12
  14. By: Evzen Kocenda (Institute of Economic Studies, Faculty of Social Sciences, Charles University in Prague, Czech Republic); Karen Poghosyan (Central Bank of Armenia, Economic Research Department, Yerevan, Armenia)
    Abstract: We use nowcasting methodology to forecast the dynamics of real GDP growth in real time based on business tendency survey data. Nowcasting is important because key macroeconomic variables on the current state of the economy are available only with a certain lag. This is particularly true for those variables that are collected on a quarterly basis. To conduct an out‐of‐sample forecast evaluation we use business tendency survey data for 22 European countries. Using this dataset and an out‐of‐sample recursive regression scheme, we conclude that the nowcasting model outperforms several alternative short‐term statistical forecasting models, even when the volatility of real GDP growth is increasing both over time and across countries. Based on the Diebold‐Mariano test statistics, we conclude that nowcasting strongly outperforms the BVAR and BFAVAR models, but comparison with the AR, FAAR and FAVAR models does not produce sufficient evidence to prefer one over another.
    Keywords: Nowcasting, short‐term forecasting, dynamic and static principal components, Bayesian VAR, Factor Augmented VAR, real GDP growth, European OECD countries
    JEL: E52 C33 C38 C52 C53 E37
    Date: 2018–09
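The Diebold‐Mariano comparison used above can be sketched for one-step-ahead squared-error loss (no long-run variance correction, which is only needed for multi-step forecasts; the two 'forecasters' below are toy constructs, not the models compared in the paper):

```python
import numpy as np

rng = np.random.default_rng(9)

def diebold_mariano(e1, e2):
    """DM statistic for equal squared-error accuracy of two one-step-ahead
    forecasts (no long-run variance correction is needed at horizon one)."""
    d = e1**2 - e2**2                      # loss differential
    return d.mean() / np.sqrt(d.var(ddof=1) / len(d))

# toy forecast errors: forecaster 1 is clearly more accurate than forecaster 2
n = 500
e_good = 0.5 * rng.standard_normal(n)
e_bad = 1.5 * rng.standard_normal(n)

dm = diebold_mariano(e_good, e_bad)   # a large negative value favours forecaster 1
```

Under the null of equal accuracy the statistic is approximately standard normal, so values outside roughly ±2 indicate a significant accuracy difference.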

This nep-ets issue is ©2019 by Jaqueson K. Galimberti. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject line, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.