nep-ets New Economics Papers
on Econometric Time Series
Issue of 2018‒11‒05
eleven papers chosen by
Jaqueson K. Galimberti
KOF Swiss Economic Institute

  1. Principal component analysis for second-order stationary vector time series By Chang, Jinyuan; Guo, Bin; Yao, Qiwei
  2. Forecasting Time Series with VARMA Recursions on Graphs By Elvin Isufi; Andreas Loukas; Nathanael Perraudin; Geert Leus
  3. Seasonal adjustment of daily time series By Ollech, Daniel
  4. Deep calibration of rough stochastic volatility models By Christian Bayer; Benjamin Stemper
  5. Using Deep Learning for price prediction by exploiting stationary limit order book features By Avraam Tsantekidis; Nikolaos Passalis; Anastasios Tefas; Juho Kanniainen; Moncef Gabbouj; Alexandros Iosifidis
  6. On LASSO for Predictive Regression By Ji Hyung Lee; Zhentao Shi; Zhan Gao
  7. Robust performance hypothesis testing with smooth functions of population moments By Olivier Ledoit; Michael Wolf
  8. Factor-Driven Two-Regime Regression By Sokbae Lee; Yuan Liao; Myung Hwan Seo; Youngki Shin
  9. The Shale Oil Boom and the U.S. Economy: Spillovers and Time-Varying Effects By Hilde C. Bjørnland; Julia Zhulanova
  10. Agnostic structural disturbances (ASDs): detecting and reducing misspecification in empirical macroeconomic models By Den Haan, Wouter J.; Drechsel, Thomas
  11. Weather-induced Short-term Fluctuations of Economic Output By Schreiber, Sven

  1. By: Chang, Jinyuan; Guo, Bin; Yao, Qiwei
    Abstract: We extend principal component analysis (PCA) to second-order stationary vector time series, in the sense that we seek a contemporaneous linear transformation of a p-variate time series such that the transformed series is segmented into several lower-dimensional subseries that are uncorrelated with each other both contemporaneously and serially. Those lower-dimensional series can therefore be analyzed separately as far as the linear dynamic structure is concerned. Technically, this boils down to an eigenanalysis of a positive definite matrix. When p is large, an additional step is required to perform a permutation in terms of either maximum cross-correlations or a false discovery rate (FDR) criterion based on multiple tests. The asymptotic theory is established for both fixed p and diverging p as the sample size n tends to infinity. Numerical experiments with both simulated and real data sets indicate that the proposed method is an effective initial step in analyzing multiple time series data, leading to substantial dimension reduction in modelling and forecasting high-dimensional linear dynamical structures. Unlike PCA for independent data, there is no guarantee that the required linear transformation exists. When it does not, the proposed method provides an approximate segmentation, which still offers advantages in, for example, forecasting future values. The method can also be adapted to segment multiple volatility processes.
    JEL: C1
    Date: 2017–07–09
    URL: http://d.repec.org/n?u=RePEc:ehl:lserod:84106&r=ets
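    The eigenanalysis step described in the abstract can be sketched numerically. This is an illustrative reading only: the function name, the lag choice, and the exact construction of the positive definite matrix are assumptions, not taken from the paper's software.

```python
import numpy as np

def ts_pca_transform(X, max_lag=5):
    """Illustrative sketch: eigenanalysis of a positive definite
    matrix built from lagged autocovariances of a p-variate series
    X (n observations x p variables)."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    # W accumulates Sigma_k @ Sigma_k.T over lags k = 0..max_lag,
    # which is positive semi-definite by construction.
    W = np.zeros((p, p))
    for k in range(max_lag + 1):
        Sigma_k = Xc[k:].T @ Xc[:n - k] / n
        W += Sigma_k @ Sigma_k.T
    # The orthogonal eigenvectors of W define a candidate
    # contemporaneous linear transformation; the transformed series
    # is then inspected for a block (segmented) correlation pattern.
    eigvals, A = np.linalg.eigh(W)
    return X @ A, A
```

    The segmentation itself (grouping transformed components by cross-correlation) would follow as the paper's second, permutation step.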
  2. By: Elvin Isufi; Andreas Loukas; Nathanael Perraudin; Geert Leus
    Abstract: Graph-based techniques have emerged as a way to deal with the dimensionality issues in modeling multivariate time series. However, there is as yet no complete understanding of how the underlying structure can be exploited to ease this task. This work contributes in this direction by considering the forecasting of a process evolving over a graph. We make use of the (approximate) time-vertex stationarity assumption, i.e., time-varying graph signals whose first and second order statistical moments are invariant over time and correlated to a known graph topology. The latter is combined with VAR and VARMA models to tackle the dimensionality issues present in predicting the temporal evolution of multivariate time series. We find that by projecting the data onto the graph spectral domain: (i) the multivariate model estimation reduces to fitting a number of uncorrelated univariate ARMA models and (ii) an optimal low-rank data representation can be exploited to further reduce the estimation costs. When the multivariate process can be observed only at a subset of nodes, the proposed models extend naturally to Kalman filtering on graphs, allowing for optimal tracking. Numerical experiments with both synthetic and real data validate the proposed approach and highlight its benefits over state-of-the-art alternatives.
    Date: 2018–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1810.08581&r=ets
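    Finding (i) above can be illustrated with a minimal sketch: project the graph signals onto the Laplacian's spectral basis and fit one univariate model per graph frequency. The AR(1) choice and all names here are simplifying assumptions; the paper's models (VARMA recursions, joint time-vertex filters) are substantially richer.

```python
import numpy as np

def graph_var_forecast(X, L):
    """Sketch: graph Fourier transform via the Laplacian eigenbasis,
    an independent least-squares AR(1) per graph frequency, and a
    one-step forecast mapped back to the vertex domain.
    X is T x N (time by nodes); L is the N x N graph Laplacian."""
    _, U = np.linalg.eigh(L)      # graph Fourier basis
    Xf = X @ U                    # spectral coefficients per time step
    T, N = Xf.shape
    coef = np.empty(N)
    for i in range(N):
        y, ylag = Xf[1:, i], Xf[:-1, i]
        coef[i] = (ylag @ y) / (ylag @ ylag)   # univariate AR(1) fit
    xf_next = coef * Xf[-1]       # one-step forecast, spectral domain
    return U @ xf_next            # back to the vertex domain
```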
  3. By: Ollech, Daniel
    Abstract: Currently, the methods used by producers of official statistics do not facilitate the seasonal and calendar adjustment of daily time series, even though an increasing number of series with daily observations are available. The aim of this paper is the development of a procedure to estimate and adjust for periodically recurring systematic effects and the influence of moving holidays in time series with daily observations. To this end, an iterative STL-based seasonal adjustment routine is combined with a RegARIMA model for the estimation of calendar and outlier effects. The procedure is illustrated and validated using the currency in circulation in Germany and a set of simulated time series. A comparison with established methods used for the adjustment of monthly data shows that the procedures estimate similar seasonally adjusted series. Thus, the developed procedure closes a gap by facilitating the seasonal and calendar adjustment of daily time series.
    Keywords: Seasonal adjustment,STL,Daily time series,Seasonality
    JEL: C14 C22 C53
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:zbw:bubdps:412018&r=ets
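    The iterative idea — strip one seasonal component per pass, shortest period first — can be caricatured in a few lines. This is not the paper's STL/RegARIMA procedure; it is a crude seasonal-means stand-in, and the function name and default periods are assumptions.

```python
import numpy as np

def adjust_daily(y, periods=(7, 365)):
    """Crude illustration of iterative seasonal adjustment for daily
    data: remove period-specific means for each seasonal frequency
    in turn (here weekly, then annual)."""
    y = np.asarray(y, float)
    adjusted = y.copy()
    for p in periods:
        idx = np.arange(len(adjusted)) % p
        # Seasonal component = mean per position in the cycle,
        # re-centred so the adjustment does not shift the level.
        means = np.array([adjusted[idx == k].mean() for k in range(p)])
        means -= means.mean()
        adjusted = adjusted - means[idx]
    return adjusted
```

    A real implementation would replace the per-position means with STL's loess smoothing and add the RegARIMA step for moving holidays and outliers.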
  4. By: Christian Bayer; Benjamin Stemper
    Abstract: Sparked by Alòs, León, and Vives (2007); Fukasawa (2011, 2017); and Gatheral, Jaisson, and Rosenbaum (2018), so-called rough stochastic volatility models such as the rough Bergomi model of Bayer, Friz, and Gatheral (2016) constitute the latest evolution in option price modeling. Unlike standard bivariate diffusion models such as Heston (1993), these non-Markovian models with fractional volatility drivers parsimoniously recover key stylized facts of market implied volatility surfaces, such as the exploding power-law behaviour of the at-the-money volatility skew as time to maturity goes to zero. Standard model calibration routines rely on repeated evaluation of the map from model parameters to Black-Scholes implied volatility, rendering calibration of many (rough) stochastic volatility models prohibitively expensive, since the map can often only be approximated by costly Monte Carlo (MC) simulations (Bennedsen, Lunde, & Pakkanen, 2017; McCrickerd & Pakkanen, 2018; Bayer et al., 2016; Horvath, Jacquier, & Muguruza, 2017). As a remedy, we propose to combine a standard Levenberg-Marquardt calibration routine with neural network regression, replacing expensive MC simulations with cheap forward runs of a neural network trained to approximate the implied volatility map. Numerical experiments confirm the high accuracy and speed of our approach.
    Date: 2018–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1810.03399&r=ets
  5. By: Avraam Tsantekidis; Nikolaos Passalis; Anastasios Tefas; Juho Kanniainen; Moncef Gabbouj; Alexandros Iosifidis
    Abstract: The surge in Deep Learning (DL) research over the past decade has provided solutions to many difficult problems. The field of quantitative analysis has been slowly adopting the new methods, but due to obstacles such as the non-stationary nature of financial data, significant challenges must be overcome before DL is fully utilized. In this work a new method to construct stationary features, which allows DL models to be applied effectively, is proposed. These features are thoroughly tested on the task of predicting mid-price movements of the Limit Order Book. Several DL models are evaluated, such as recurrent Long Short-Term Memory (LSTM) networks and Convolutional Neural Networks (CNNs). Finally, a novel model that combines the ability of CNNs to extract useful features with the ability of LSTMs to analyze time series is proposed and evaluated. The combined model outperforms the individual LSTM and CNN models over the prediction horizons tested.
    Date: 2018–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1810.09965&r=ets
  6. By: Ji Hyung Lee; Zhentao Shi; Zhan Gao
    Abstract: A typical predictive regression employs a multitude of potential regressors with various degrees of persistence, while their signal strength in explaining the dependent variable is often low. Variable selection in such a context is of great importance. In this paper, we explore the pitfalls and possibilities of the LASSO methods in this predictive regression framework with mixed degrees of persistence. In the presence of stationary, unit-root and cointegrated predictors, we show that the adaptive LASSO maintains consistent variable selection and the oracle property, thanks to a penalty scheme that accommodates the system of regressors. By contrast, the conventional LASSO does not have this desirable feature, as its penalty is imposed according to the marginal behavior of each individual regressor. We demonstrate this theoretical property via extensive Monte Carlo simulations, and evaluate its empirical performance for short- and long-horizon stock return predictability.
    Date: 2018–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1810.03140&r=ets
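    The adaptive LASSO's distinguishing feature — a data-driven weight on each coordinate's penalty, built from an initial consistent estimate — can be sketched as follows. This is a generic textbook version, not the paper's estimator for persistent regressors; all names and the OLS first stage are assumptions.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Plain LASSO via cyclic coordinate descent (illustrative)."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding coordinate j.
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r / n
            z = X[:, j] @ X[:, j] / n
            # Soft-thresholding update.
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z
    return beta

def adaptive_lasso(X, y, lam, gamma=1.0):
    """Adaptive LASSO sketch: penalize each coordinate in inverse
    proportion to an initial OLS estimate, so strong regressors are
    penalized less, then map the solution back to the original scale."""
    beta0 = np.linalg.lstsq(X, y, rcond=None)[0]
    w = 1.0 / np.maximum(np.abs(beta0), 1e-8) ** gamma
    Xs = X / w                      # rescale columns by 1/weight
    beta_s = lasso_cd(Xs, y, lam)
    return beta_s / w               # undo the rescaling
```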
  7. By: Olivier Ledoit; Michael Wolf
    Abstract: Applied researchers often want to make inference for the difference of a given performance measure for two investment strategies. In this paper, we consider the class of performance measures that are smooth functions of population means of the underlying returns; this class is very rich and contains many performance measures of practical interest (such as the Sharpe ratio and the variance). Unfortunately, many of the inference procedures that have been suggested previously in the applied literature make unreasonable assumptions that do not apply to real-life return data, such as normality and independence over time. We will discuss inference procedures that are asymptotically valid under very general conditions, allowing for heavy tails and time dependence in the return data. In particular, we will promote a studentized time series bootstrap procedure. A simulation study demonstrates the improved finite-sample performance compared to existing procedures. Applications to real data are also provided.
    Keywords: Bootstrap, HAC inference, kurtosis, Sharpe ratio, skewness, variance
    JEL: C12 C14 C22
    Date: 2018–10
    URL: http://d.repec.org/n?u=RePEc:zur:econwp:305&r=ets
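    The ingredients of the paper's test — a smooth function of population moments (here the Sharpe ratio difference) resampled with a circular block bootstrap that preserves time dependence — can be sketched minimally. This is a basic, non-studentized variant for illustration only; the paper advocates the studentized version, and the function names and block length are assumptions.

```python
import numpy as np

def sharpe_diff(R):
    """Difference of Sharpe ratios of two return series (T x 2):
    a smooth function of the population means of (r, r^2)."""
    mu = R.mean(axis=0)
    sd = R.std(axis=0)
    return mu[0] / sd[0] - mu[1] / sd[1]

def block_bootstrap_pvalue(R, b=10, n_boot=499, seed=0):
    """Circular block bootstrap p-value for H0: equal Sharpe ratios.
    Resampling whole blocks of length b preserves serial dependence."""
    rng = np.random.default_rng(seed)
    T = R.shape[0]
    stat = sharpe_diff(R)
    boot = np.empty(n_boot)
    for i in range(n_boot):
        starts = rng.integers(0, T, size=int(np.ceil(T / b)))
        # Wrap block indices around the sample (circular scheme).
        idx = (starts[:, None] + np.arange(b)[None, :]).ravel()[:T] % T
        boot[i] = sharpe_diff(R[idx]) - stat   # centre at observed stat
    return np.mean(np.abs(boot) >= np.abs(stat))
```

    Studentizing would divide both the observed and resampled statistics by HAC standard errors computed within each sample, which is what delivers the improved finite-sample accuracy the paper reports.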
  8. By: Sokbae Lee; Yuan Liao; Myung Hwan Seo; Youngki Shin
    Abstract: We propose a novel two-regime regression model where the switching between the regimes is driven by a vector of possibly unobservable factors. When the factors are latent, we estimate them by the principal component analysis of a much larger panel data set. Our approach enriches conventional threshold models in that a vector of factors may represent economy-wide shocks more realistically than a scalar observed random variable. Estimating our model brings new challenges as well as opportunities in terms of both computation and asymptotic theory. We show that the optimization problem can be reformulated as mixed integer optimization and present two alternative computational algorithms. We derive the asymptotic distributions of the resulting estimators under the scheme that the threshold effect shrinks to zero. In particular, with latent factors, not only do we establish the conditions on factor estimation for a strong oracle property, which are different from those for smooth factor augmented models, but we also identify semi-strong and weak oracle cases and establish a phase transition that describes the effect of first stage factor estimation as the cross-sectional dimension of panel data increases relative to the time-series dimension. Moreover, we develop a consistent factor selection procedure with a penalty term on the number of factors and present a complementary bootstrap testing procedure for linearity with the aid of efficient computational algorithms. Finally, we illustrate our methods via Monte Carlo experiments and by applying them to factor-driven threshold autoregressive models of US macro data.
    Keywords: threshold regression, factors, mixed integer optimization, panel data, phase transition, oracle properties, l0-penalization.
    JEL: C13 C51
    Date: 2018–10
    URL: http://d.repec.org/n?u=RePEc:mcm:deptwp:2018-14&r=ets
  9. By: Hilde C. Bjørnland; Julia Zhulanova
    Abstract: We analyze if the transmission of oil price shocks on the U.S. economy has changed as a result of the shale oil boom. To do so we allow for spillovers at the state level, as well as aggregate country level effects. We identify and quantify these spillovers using a factor-augmented vector autoregressive (VAR) model, allowing for time-varying changes. In contrast to previous results, we find considerable changes in the way oil price shocks are transmitted: there are now positive spillovers to non-oil investment, employment and production in many U.S. states from an increase in the oil price - effects that were not present before the shale oil boom.
    Keywords: Shale oil boom, FAVAR model, Time-varying changes, Geographical dispersion
    Date: 2018–10
    URL: http://d.repec.org/n?u=RePEc:bny:wpaper:0066&r=ets
  10. By: Den Haan, Wouter J.; Drechsel, Thomas
    Abstract: Exogenous random structural disturbances are the main driving force behind fluctuations in most business cycle models and typically a wide variety is used. This paper documents that a minor misspecification regarding structural disturbances can lead to large distortions for parameter estimates and implied model properties, such as impulse response functions with a wrong shape and even an incorrect sign. We propose a novel concept, namely an agnostic structural disturbance (ASD), that can be used to both detect and correct for misspecification of the structural disturbances. In contrast to regular disturbances and wedges, ASDs do not impose additional restrictions on policy functions. When applied to the Smets-Wouters (SW) model, we find that its risk-premium disturbance and its investment-specific productivity disturbance are rejected in favor of our ASDs. While agnostic in nature, studying the estimated associated coefficients and the impulse response functions of these ASDs allows us to interpret them economically as a risk-premium/preference and an investment-specific productivity type disturbance as in SW, but our results indicate that they enter the model quite differently than the original SW disturbances. Our procedure also selects an additional wage mark-up disturbance that is associated with increased capital efficiency.
    Keywords: DSGE; full-information model estimation; structural disturbances
    JEL: C13 C52 E30
    Date: 2018–08–23
    URL: http://d.repec.org/n?u=RePEc:ehl:lserod:90384&r=ets
  11. By: Schreiber, Sven
    Abstract: We contribute to the recent literature on the economic effects of weather conditions that deviate from their regular seasonal pattern. To this end we use local temperature and snow measurements across Germany to analyze their impact on German monthly total industrial and construction-sector production. We find noticeable effects of the various (linear and nonlinear, contemporaneous and dynamic) weather regressors, which in the (seasonally adjusted) construction-sector growth data imply extra explanatory power of more than 50% of the variation, compared to benchmark predictive regressions. As expected, the impact is considerably smaller for total industrial production. From our estimates we obtain weather-adjusted (as well as seasonally adjusted) production series, and our regression-based approach also yields confidence intervals for these adjustments. The estimated adjustments are quantitatively relevant for broad output (quarterly GDP) as well. In a mixed-frequency framework we find some value of the estimated monthly weather impact for quarterly GDP nowcasts in (quasi) real time.
    Keywords: weather,business cycle,nowcasting,MIDAS
    JEL: E32 E27
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:zbw:vfsc18:181622&r=ets

This nep-ets issue is ©2018 by Jaqueson K. Galimberti. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.