nep-ets New Economics Papers
on Econometric Time Series
Issue of 2019‒10‒07
nine papers chosen by
Jaqueson K. Galimberti
KOF Swiss Economic Institute

  1. Stationarity of the detrended time series of S&P500 By Karina Arias-Calluari; Morteza N. Najafi; Michael S. Harré; Fernando Alonso-Marroquin
  2. The Identification Problem for Linear Rational Expectations Models By Majid M. Al-Sadoon; Piotr Zwiernik
  3. Estimating the Exchange Rate Pass-Through: A Time-Varying Vector Auto-Regression with Residual Stochastic Volatility Approach By Juan Manuel Julio-Román
  4. A Non-Elliptical Orthogonal GARCH Model for Portfolio Selection under Transaction Costs By Marc S. Paolella; Pawel Polak; Patrick S. Walker
  5. Data Smashing 2.0: Sequence Likelihood (SL) Divergence For Fast Time Series Comparison By Yi Huang; Ishanu Chattopadhyay
  6. Not so Particular about Calibration: Smile Problem Resolved By Aitor Muguruza
  7. "Particle Rolling MCMC" By Naoki Awaya; Yasuhiro Omori
  8. A tale of two sentiment scales: Disentangling short-run and long-run components in multivariate sentiment dynamics By Danilo Vassallo; Giacomo Bormetti; Fabrizio Lillo
  9. Using Machine Learning to Predict Realized Variance By Peter Carr; Liuren Wu; Zhibai Zhang

  1. By: Karina Arias-Calluari; Morteza N. Najafi; Michael S. Harré; Fernando Alonso-Marroquin
    Abstract: Our study presents an analysis of S&P500 stock market data before and after detrending. The analysis is based on two types of returns: simple returns and log-returns. Both are non-stationary time series, meaning that their statistical distributions change over time. Consequently, a detrending procedure is applied to neutralize the non-stationary effects. The detrended series is obtained by decomposing the financial time series into a deterministic trend and random fluctuations. We present an alternative detrending method based on classical moving average (MA) models, where the kurtosis is used to determine the window size. Detrended fluctuation analysis (DFA) is then used to show that the detrended part is stationary. This is done by considering the autocorrelation of the detrended price returns and the power spectrum analysis of the detrended price.
    Date: 2019–10
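The detrending recipe the abstract describes can be sketched in a few lines. This is a simplified stand-in for the paper's procedure, not its actual implementation: the candidate windows, the "kurtosis closest to Gaussian" selection rule, and the lag-1 autocorrelation check (in place of a full DFA) are all illustrative assumptions.

```python
import numpy as np

def moving_average_detrend(x, window):
    """Decompose a series into a moving-average trend and residual fluctuations."""
    trend = np.convolve(x, np.ones(window) / window, mode="same")
    return trend, x - trend

def excess_kurtosis(x):
    x = x - x.mean()
    return np.mean(x**4) / np.mean(x**2) ** 2 - 3.0

def lag1_autocorr(x):
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

# Toy non-stationary series: a Gaussian random walk standing in for log-prices
rng = np.random.default_rng(0)
log_price = np.cumsum(rng.normal(0.0, 0.01, 2000))

# Choose the MA window whose residuals have excess kurtosis closest to zero --
# a crude stand-in for the paper's kurtosis-based window selection.
windows = [5, 10, 20, 50, 100]
best = min(windows,
           key=lambda w: abs(excess_kurtosis(moving_average_detrend(log_price, w)[1])))
trend, fluct = moving_average_detrend(log_price, best)

# Crude stationarity check: the detrended part loses the near-unit
# lag-1 autocorrelation of the raw random walk.
print(best, round(lag1_autocorr(log_price), 3), round(lag1_autocorr(fluct), 3))
```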
  2. By: Majid M. Al-Sadoon; Piotr Zwiernik
    Abstract: We consider the problem of the identification of stationary solutions to linear rational expectations models from the second moments of observable data. Observational equivalence is characterized and necessary and sufficient conditions are provided for: (i) identification under affine restrictions, (ii) generic identification under affine restrictions of analytically parametrized models, and (iii) local identification under non-linear restrictions. The results strongly resemble the classical theory for VARMA models although significant points of departure are also documented.
    Keywords: identification, linear rational expectations models, linear systems, vector autoregressive moving average models
    JEL: C10 C22 C32
    Date: 2019–09
  3. By: Juan Manuel Julio-Román (Banco de la República de Colombia)
    Abstract: The adoption of a Time-Varying Vector Auto-Regression with residual Stochastic Volatility approach is proposed to address the state and time dependency of the exchange rate pass-through (ERPT). This procedure is employed to estimate the size, duration and stability of the ERPT to flexible relative price changes in Colombia through a fairly simple Phillips curve. For this, the generalized impulse responses, i.e. pass-throughs, from different periods of time are compared. It was found that the ERPT is bigger and faster than previous estimates for broader price indexes. It was also found that, regardless of the existence of time-varying shock sizes, i.e. time-varying standard deviations, the ERPT is markedly and significantly larger before full Inflation Targeting (IT) than during it, and that the ERPT relates to real exchange rate volatility. This second result relates to the benefits derived from the adoption of full IT in this country. It was finally found that the residual volatilities of the output gap and of flexible relative price changes dropped permanently and substantially in 1998Q3, emphasizing the role of the free float regime adoption in the success of IT in this country.
    Keywords: Pass-Through, Price Stickiness, Phillips Curve
    JEL: C22 F31 F41
    Date: 2019–10
  4. By: Marc S. Paolella (University of Zurich - Department of Banking and Finance; Swiss Finance Institute); Pawel Polak (Stevens Institute of Technology, Department of Mathematical Sciences); Patrick S. Walker (University of Zurich, Department of Banking and Finance)
    Abstract: Covariance matrix forecasts for portfolio optimization have to balance sensitivity to new data points with stability in order to avoid excessive rebalancing. To achieve this, a new robust orthogonal GARCH model for a multivariate set of non-Gaussian asset returns is proposed. The conditional return distribution is multivariate generalized hyperbolic and the dispersion matrix dynamics are driven by the leading factors in a principal component decomposition. Each of these leading factors is endowed with a univariate GARCH structure, while the remaining eigenvalues are kept constant over time. Joint maximum likelihood estimation of all model parameters is performed via an expectation maximization algorithm, and is applicable in high dimensions. The new model generates realistic correlation forecasts even for large asset universes and captures rising pairwise correlations in periods of market distress better than numerous competing models. Moreover, it leads to improved forecasts of an eigenvalue-based financial systemic risk indicator. Crucially, it generates portfolios with much lower turnover and superior risk-adjusted returns net of transaction costs, outperforming the equally weighted strategy even under high transaction fees.
    Keywords: Dynamic Conditional Correlations; Multivariate GARCH; Multivariate Generalized Hyperbolic Distribution; Principal Component Analysis; Financial Systemic Risk
    JEL: C32 C53 G11 G17
    Date: 2019–09
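The covariance recursion behind an orthogonal GARCH model of this kind can be sketched as follows. This is only an illustration of the general technique: the GARCH(1,1) parameters are fixed by hand rather than estimated by the paper's EM algorithm, the returns are synthetic, and the generalized hyperbolic return distribution is omitted.

```python
import numpy as np

rng = np.random.default_rng(4)
n, d, k = 1000, 5, 2                      # observations, assets, leading factors
# Synthetic returns with heterogeneous scales across assets
returns = rng.normal(0.0, 0.01, (n, d)) @ np.diag([3.0, 2.0, 1.0, 1.0, 1.0])

# Principal component decomposition of the sample covariance
evals, evecs = np.linalg.eigh(np.cov(returns.T))
order = np.argsort(evals)[::-1]           # sort eigenvalues descending
evals, evecs = evals[order], evecs[:, order]
factors = returns @ evecs                 # orthogonal factor returns

def garch_variance(x, omega, alpha, beta):
    """GARCH(1,1) variance filter: h_t = omega + alpha*x_{t-1}^2 + beta*h_{t-1}."""
    h = np.empty(len(x))
    h[0] = np.var(x)
    for t in range(1, len(x)):
        h[t] = omega + alpha * x[t - 1] ** 2 + beta * h[t - 1]
    return h

# Time-varying variances for the k leading factors, constant eigenvalues
# for the rest (omega chosen so the unconditional variance matches evals[j]).
H = np.tile(evals, (n, 1))
for j in range(k):
    H[:, j] = garch_variance(factors[:, j], omega=0.1 * evals[j],
                             alpha=0.05, beta=0.85)

# One-step covariance forecast: rotate the factor variances back, V Lambda_t V'
cov_t = evecs @ np.diag(H[-1]) @ evecs.T
print(cov_t.shape)
```

Keeping the trailing eigenvalues constant is what limits the number of univariate GARCH recursions to the k leading factors, which is why the approach stays tractable in high dimensions.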
  5. By: Yi Huang; Ishanu Chattopadhyay
    Abstract: Recognizing subtle historical patterns is central to modeling and forecasting problems in time series analysis. Here we introduce and develop a new approach to quantify deviations in the underlying hidden generators of observed data streams, resulting in a new, efficiently computable universal metric for time series. The proposed metric is universal in the sense that we can compare and contrast data streams regardless of where and how they were generated and without any feature engineering step. The approach proposed in this paper is conceptually distinct from our previous work on data smashing, and vastly improves discrimination performance and computing speed. The core idea here is the generalization of the notion of KL divergence, often used to compare probability distributions, to a notion of divergence in time series. We call this the sequence likelihood (SL) divergence, which may be used to measure deviations within a well-defined class of discrete-valued stochastic processes. We devise efficient estimators of SL divergence from finite sample paths and subsequently formulate a universal metric useful for computing distance between time series produced by hidden stochastic generators.
    Date: 2019–09
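The core idea, comparing the hidden generators of two discrete streams through a KL-style divergence estimated from sample paths, can be illustrated for the simplest case of first-order Markov chains. The paper's SL divergence is considerably more general; the estimator below (smoothed transition counts, empirical state weights) is only an illustrative assumption.

```python
import numpy as np

def transition_matrix(seq, n_symbols, alpha=0.5):
    """Smoothed maximum-likelihood transition matrix of a discrete sequence."""
    counts = np.full((n_symbols, n_symbols), alpha)
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def kl_rate(P, Q, pi):
    """KL divergence rate between Markov chains P and Q under state weights pi."""
    return float(np.sum(pi[:, None] * P * np.log(P / Q)))

def sl_divergence(seq_p, seq_q, n_symbols=2):
    """Estimate a KL-style divergence between the hidden generators of two streams."""
    P = transition_matrix(seq_p, n_symbols)
    Q = transition_matrix(seq_q, n_symbols)
    pi = np.bincount(seq_p, minlength=n_symbols) / len(seq_p)
    return kl_rate(P, Q, pi)

rng = np.random.default_rng(1)
a = (rng.random(5000) < 0.5).astype(int)   # fair coin
b = (rng.random(5000) < 0.5).astype(int)   # same generator, different stream
c = (rng.random(5000) < 0.8).astype(int)   # biased generator
# Streams from the same hidden generator should be closer than streams
# from different generators, with no feature engineering involved.
print(sl_divergence(a, b) < sl_divergence(a, c))
```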
  6. By: Aitor Muguruza
    Abstract: We present a novel Monte Carlo based LSV calibration algorithm that applies to all stochastic volatility models, including the non-Markovian rough volatility family. Our framework overcomes the limitations of the particle method proposed by Guyon and Henry-Labord\`ere (2012) and theoretically guarantees a variance reduction without additional computational complexity. Specifically, we obtain a closed-form and exact calibration method that allows us to remove the dependency on both the kernel function and bandwidth parameter. This makes the algorithm more robust and less prone to errors or instabilities in a production environment. We test the efficiency of our algorithm on various hybrid (rough) local stochastic volatility models.
    Date: 2019–09
  7. By: Naoki Awaya (Graduate School of Economics, The University of Tokyo); Yasuhiro Omori (Faculty of Economics, The University of Tokyo)
    Abstract: An efficient simulation-based methodology is proposed for the rolling window estimation of state space models, called particle rolling Markov chain Monte Carlo (MCMC) with double block sampling. In our method, which is based on Sequential Monte Carlo (SMC), particles are sequentially updated to approximate the posterior distribution for each window by learning new information and discarding old information from observations. The particles are refreshed with an MCMC algorithm when the importance weights degenerate. To avoid degeneracy, which is crucial for reducing the computation time, we introduce a block sampling scheme and generate multiple candidates by the algorithm based on the conditional SMC. The theoretical discussion shows that the proposed methodology with a nested structure is expressed as SMC sampling for the augmented space to provide the justification. The computational performance is evaluated in illustrative examples, showing that the posterior distributions of the model parameters are accurately estimated. The proofs and additional discussions (algorithms and experimental results) are provided in the Supplementary Material.
    Date: 2019–09
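The SMC ingredients the method builds on, namely propagating particles, reweighting on each new observation, and resampling once the importance weights degenerate, can be illustrated with a minimal bootstrap particle filter for a local-level model. The rolling-window and MCMC-refresh machinery of the paper is not reproduced here, and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
T_len, N = 200, 500
sig_x, sig_y = 0.1, 0.5

# Simulate a latent random walk and noisy observations of it
x_true = np.cumsum(rng.normal(0.0, sig_x, T_len))
y = x_true + rng.normal(0.0, sig_y, T_len)

particles = rng.normal(0.0, 1.0, N)
weights = np.full(N, 1.0 / N)
estimates = np.zeros(T_len)

for t in range(T_len):
    particles = particles + rng.normal(0.0, sig_x, N)            # propagate
    weights *= np.exp(-0.5 * ((y[t] - particles) / sig_y) ** 2)  # reweight
    weights /= weights.sum()
    estimates[t] = np.dot(weights, particles)                    # filtered mean
    # Degeneracy check: resample when the effective sample size drops below N/2
    if 1.0 / np.sum(weights**2) < N / 2:
        idx = rng.choice(N, size=N, p=weights)                   # multinomial resampling
        particles, weights = particles[idx], np.full(N, 1.0 / N)

# The filtered mean should track the latent state better than the raw data
print(np.mean((estimates - x_true) ** 2) < np.mean((y - x_true) ** 2))
```

The effective-sample-size trigger is the standard diagnostic for the weight degeneracy the abstract refers to; the paper's contribution is what to do at that point in a rolling-window setting.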
  8. By: Danilo Vassallo; Giacomo Bormetti; Fabrizio Lillo
    Abstract: We propose a novel approach to sentiment data filtering for a portfolio of assets. In our framework, a dynamic factor model drives the evolution of the observed sentiment and allows us to identify two distinct components: a long-term component, modeled as a random walk, and a short-term component driven by a stationary VAR(1) process. Our model encompasses alternative approaches available in the literature and can be readily estimated by means of Kalman filtering and expectation maximization. This feature makes it convenient when the cross-sectional dimension of the sentiment increases. By applying the model to a portfolio of Dow Jones stocks, we find that the long-term component co-integrates with the market principal factor, while the short-term one captures transient swings of the market associated with the idiosyncratic components and captures the correlation structure of returns. Finally, using quantile regressions, we assess the significance of the contemporaneous and lagged explanatory power of sentiment on returns, finding strong statistical evidence when extreme returns, especially negative ones, are considered.
    Date: 2019–10
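A univariate sketch conveys the idea of the decomposition: the observed sentiment is a random-walk long-run component plus an AR(1) short-run component, and a Kalman filter separates the two. The variances and AR coefficient below are assumed known for illustration, whereas the paper estimates them, in a multivariate VAR(1) setting, via expectation maximization.

```python
import numpy as np

# Assumed-known parameters (the paper estimates these via EM)
phi, q_l, q_s, r = 0.8, 0.01, 0.25, 0.1

T = np.array([[1.0, 0.0], [0.0, phi]])   # state transition: random walk, AR(1)
Q = np.diag([q_l, q_s])                  # state innovation covariance
Z = np.array([1.0, 1.0])                 # observation loads on both components

def kalman_filter(y):
    """Return filtered means of the long-run and short-run components."""
    s = np.zeros(2)
    P = np.eye(2) * 10.0                 # diffuse-ish initial covariance
    out = np.zeros((len(y), 2))
    for t, yt in enumerate(y):
        s = T @ s                        # predict
        P = T @ P @ T.T + Q
        f = Z @ P @ Z + r                # update
        k = P @ Z / f
        s = s + k * (yt - Z @ s)
        P = P - np.outer(k, Z @ P)
        out[t] = s
    return out

# Simulate data from the model itself
rng = np.random.default_rng(2)
n = 500
long_run = np.cumsum(rng.normal(0.0, np.sqrt(q_l), n))
short_run = np.zeros(n)
for t in range(1, n):
    short_run[t] = phi * short_run[t - 1] + rng.normal(0.0, np.sqrt(q_s))
y = long_run + short_run + rng.normal(0.0, np.sqrt(r), n)

filt = kalman_filter(y)
# The filtered long-run component should track the true random walk closely
print(round(np.corrcoef(filt[:, 0], long_run)[0, 1], 3))
```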
  9. By: Peter Carr; Liuren Wu; Zhibai Zhang
    Abstract: In this paper we formulate a regression problem to predict realized volatility using option price data, and enhance the predictability and liquidity of VIX-style volatility indices. We test algorithms including regularized regression and machine learning methods such as feedforward neural networks (FNN) on the S&P 500 Index and its options data. By conducting a time series validation, we find that both ridge regression and FNN can improve volatility indexing with higher prediction performance and fewer options required. The best approach found is to predict the difference between the realized volatility and the VIX-style index's prediction rather than to predict the realized volatility directly, representing a successful combination of human learning and machine learning. We also discuss the suitability of different regression algorithms for volatility indexing and applications of our findings.
    Date: 2019–09

This nep-ets issue is ©2019 by Jaqueson K. Galimberti. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.