nep-ets New Economics Papers
on Econometric Time Series
Issue of 2017‒01‒15
fifteen papers chosen by
Yong Yin
SUNY at Buffalo

  1. Impulse Response Estimation By Smooth Local Projections By Barnichon, Régis; Brownlees, Christian
  2. Automatic Signal Extraction for Stationary and Non-Stationary Time Series by Circulant SSA By Bógalo, Juan; Poncela, Pilar; Senra, Eva
  3. Long Memory, Breaks, and Trends: On the Sources of Persistence in Inflation Rates By Rinke, Saskia; Busch, Marie; Leschinski, Christian
  4. Changes in Persistence in Outlier Contaminated Time Series By Hirsch, Tristan; Rinke, Saskia
  5. Stochastic processes of limited frequency and the effects of oversampling By D.S.G. Pollock
  6. Trends Cycles And Seasons: Econometric Methods Of Signal Extraction By D.S.G. Pollock
  7. Econometric Filters By D.S.G. Pollock
  8. Truncated sum of squares estimation of fractional time series models with deterministic trends By Javier Hualde; Morten Ørregaard Nielsen
  9. Identification-robust moment-based tests for Markov-switching in autoregressive models By Jean-Marie Dufour; Richard Luger
  10. Estimation of Possibly Non-Stationary First-Order Auto-Regressive Processes By Ana Paula Martins
  11. Adaptive Shrinkage in Bayesian Vector Autoregressive Models By Feldkircher, Martin; Huber, Florian
  12. Should I stay or should I go? Bayesian inference in the threshold time varying parameter (TTVP) model By Huber, Florian; Kastner, Gregor; Feldkircher, Martin
  13. The perils of Counterfactual Analysis with Integrated Processes By Carlos Viana de Carvalho; Ricardo Masini; Marcelo Cunha Medeiros
  14. "Multivariate Stochastic Volatility Model with Realized Volatilities and Pairwise Realized Correlations " By Yuta Yamauchi; Yasuhiro Omori
  15. A Markov switching factor-augmented VAR model for analyzing US business cycles and monetary policy By Huber, Florian; Fischer, Manfred M.

  1. By: Barnichon, Régis; Brownlees, Christian
    Abstract: Vector Autoregressions (VAR) and Local Projections (LP) are well established methodologies for the estimation of Impulse Responses (IR). These techniques have complementary features: The VAR approach is more efficient when the model is correctly specified whereas the LP approach is less efficient but more robust to model misspecification. We propose a novel IR estimation methodology -- Smooth Local Projections (SLP) -- to strike a balance between these approaches. SLP consists in estimating LP under the assumption that the IR is a smooth function of the forecast horizon. Inference is carried out using semi-parametric techniques based on Penalized B-splines, which are straightforward to implement in practice. SLP preserves the flexibility of standard LP and at the same time can increase precision substantially. A simulation study shows the large gains in IR estimation accuracy of SLP over LP. We show how SLP may be used with common identification schemes such as timing restrictions and instrumental variables to directly recover structural IRs. We illustrate our technique by studying the effects of monetary shocks.
    Keywords: impulse response; local projections; semiparametric estimation
    JEL: C14 C32 C53 E47
    Date: 2016–12
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:11726&r=ets
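    A minimal Python sketch of plain local projections, the building block that Smooth Local Projections refine with a penalized B-spline smoothing step (not reproduced here); the AR(1) data-generating process and all names are illustrative assumptions, not the authors' code.

      import numpy as np

      rng = np.random.default_rng(0)
      T, H, rho = 500, 12, 0.7
      eps = rng.standard_normal(T)
      y = np.zeros(T)
      for t in range(1, T):
          y[t] = rho * y[t - 1] + eps[t]      # AR(1): true impulse response at horizon h is rho**h

      irf = np.empty(H + 1)
      for h in range(H + 1):
          # local projection: regress y_{t+h} on the shock eps_t (plus a constant)
          Y = y[h:]
          X = np.column_stack([np.ones(T - h), eps[:T - h]])
          irf[h] = np.linalg.lstsq(X, Y, rcond=None)[0][1]

      print(np.round(irf, 2))                 # roughly tracks 0.7**h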
  2. By: Bógalo, Juan; Poncela, Pilar; Senra, Eva
    Abstract: Singular Spectrum Analysis (SSA) is a nonparametric technique for signal extraction in time series based on principal components. However, it requires the intervention of the analyst to identify the frequencies associated with the extracted principal components. We propose a new variant of SSA, Circulant SSA (CSSA), that makes this association automatically. We also prove the validity of CSSA for the nonstationary case. Through several sets of simulations, we show the good properties of our approach: it is reliable, fast, automatic and produces strongly separable elementary components by frequency. Finally, we apply Circulant SSA to the Industrial Production Index of six countries. We use it to deseasonalize the series and to illustrate that it also reproduces a cycle in accordance with the dated recessions from the OECD.
    Keywords: circulant matrices, signal extraction, singular spectrum analysis, non-parametric, time series, Toeplitz matrices.
    JEL: C22 E32
    Date: 2017–01–05
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:76023&r=ets
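    A minimal Python sketch of basic Singular Spectrum Analysis (embedding, singular value decomposition, diagonal averaging); Circulant SSA replaces the decomposition step with one based on circulant matrices so that components map to frequencies automatically, which is not reproduced here. The window length and toy series are assumptions.

      import numpy as np

      rng = np.random.default_rng(1)
      N, L = 240, 24                            # series length and SSA window length
      t = np.arange(N)
      x = 0.05 * t + np.sin(2 * np.pi * t / 12) + 0.3 * rng.standard_normal(N)

      K = N - L + 1
      X = np.column_stack([x[i:i + L] for i in range(K)])   # L x K trajectory matrix
      U, s, Vt = np.linalg.svd(X, full_matrices=False)

      def reconstruct(indices):
          # sum the selected elementary matrices, then average over anti-diagonals
          Xr = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in indices)
          out, counts = np.zeros(N), np.zeros(N)
          for i in range(L):
              for j in range(K):
                  out[i + j] += Xr[i, j]
                  counts[i + j] += 1
          return out / counts

      trend = reconstruct([0])        # leading component: the smooth trend in this toy series
      print(np.round(trend[:5], 2))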
  3. By: Rinke, Saskia; Busch, Marie; Leschinski, Christian
    Abstract: The persistence of inflation rates is of major importance to central banks because it determines the costs of monetary policy according to the Phillips curve. This article is motivated by newly available econometric methods which allow for consistent estimation of the persistence parameter under low frequency contaminations and consistent break point estimation under long memory without a priori assumptions on the presence of breaks. In contrast to previous studies, we allow for smooth trends in addition to breaks as a source of spurious long memory. We support the finding of reduced memory parameters in monthly inflation rates of the G7 countries as well as spurious long memory, except for the US. Nevertheless, only a few breaks can be located. Instead, all countries exhibit significant trends at the 5 percent level, with the exception of the US.
    Keywords: Spurious Long Memory; Breaks; Trends; Inflation; G7 countries
    JEL: C13 E58
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:han:dpaper:dp-584&r=ets
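    For illustration, a minimal Python sketch of the classical log-periodogram (GPH) estimator of the memory parameter d; the paper relies on estimators that are robust to low-frequency contamination, which this plain version is not. The bandwidth m = n^0.65 is an assumption.

      import numpy as np

      def gph_estimate(x, power=0.65):
          n = len(x)
          m = int(n ** power)                   # number of Fourier frequencies used
          freqs = 2 * np.pi * np.arange(1, m + 1) / n
          dft = np.fft.fft(x - x.mean())[1:m + 1]
          I = np.abs(dft) ** 2 / (2 * np.pi * n)        # periodogram ordinates
          # regress log I(lambda_j) on a constant and -log(4 sin^2(lambda_j / 2));
          # the slope is the estimate of the memory parameter d
          X = np.column_stack([np.ones(m), -np.log(4 * np.sin(freqs / 2) ** 2)])
          return np.linalg.lstsq(X, np.log(I), rcond=None)[0][1]

      rng = np.random.default_rng(2)
      print(round(gph_estimate(rng.standard_normal(2000)), 2))   # white noise: d close to 0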
  4. By: Hirsch, Tristan; Rinke, Saskia
    Abstract: Outlying observations in time series influence parameter estimation and testing procedures, leading to biased estimates and spurious test decisions. Further inference based on these results will be misleading. In this paper the effects of outliers on the performance of ratio-based tests for a change in persistence are investigated. We consider two types of outliers, additive outliers and innovative outliers. Our simulation results show that the effect of outliers crucially depends on the outlier type and on the degree of persistence of the underlying process. Additive outliers deteriorate the performance of the tests for high degrees of persistence. In contrast, innovative outliers do not negatively influence the performance of the tests. Since additive outliers lead to severe size distortions when the null hypothesis under consideration is described by a nonstationary process, we apply an outlier detection method designed for unit-root testing. The adjustment of the series results in size improvements and power gains. In an empirical example we apply the tests and the outlier detection method to the G7 inflation rates.
    Keywords: Additive Outliers; Innovative Outliers; Change in Persistence; Outlier Detection; Monte Carlo
    JEL: C15 C22
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:han:dpaper:dp-583&r=ets
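    A minimal Python sketch contrasting the two outlier types studied above: additive outliers perturb the observed series only, while innovative outliers enter the innovations and propagate through the autoregressive dynamics. The outlier dates, magnitude and rho are assumptions.

      import numpy as np

      rng = np.random.default_rng(3)
      T, rho, size = 300, 0.9, 8.0
      eps = rng.standard_normal(T)
      outlier_dates = [100, 200]

      # innovative outliers: contaminate the shocks, then build the AR(1)
      eps_io = eps.copy()
      eps_io[outlier_dates] += size
      y_io = np.zeros(T)
      for t in range(1, T):
          y_io[t] = rho * y_io[t - 1] + eps_io[t]

      # additive outliers: build the clean AR(1), then contaminate the observations
      y = np.zeros(T)
      for t in range(1, T):
          y[t] = rho * y[t - 1] + eps[t]
      y_ao = y.copy()
      y_ao[outlier_dates] += size

      # the AO series snaps back immediately; the IO series decays at rate rho
      print(np.round(y_ao[100:104] - y[100:104], 2))   # [8, 0, 0, 0]
      print(np.round(y_io[100:104] - y[100:104], 2))   # approx. [8, 7.2, 6.48, 5.83]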
  5. By: D.S.G. Pollock
    Abstract: Discrete-time ARMA processes can be placed in a one-to-one correspondence with a set of continuous-time processes that are bounded in frequency by the Nyquist value of π radians per sample period. It is well known that, if data are sampled from a continuous process of which the maximum frequency exceeds the Nyquist value, then there will be a problem of aliasing. However, if the sampling is too rapid, then other problems will arise that may cause the ARMA estimates to be severely biased. The paper reveals the nature of these problems and it shows how they may be overcome.
    Keywords: ARMA Modelling, Stochastic Differential Equations, Frequency-Limited Stochastic Processes, Oversampling
    JEL: C22 C32 E32
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:lec:leecon:17/03&r=ets
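    A minimal Python sketch of the aliasing half of the problem: once sampled at unit intervals, a sinusoid above the Nyquist frequency of π radians (0.5 cycles) per sample period is indistinguishable from one folded back below it. The biases caused by oversampling, which are the paper's focus, are not reproduced here, and the chosen frequencies are assumptions.

      import numpy as np

      n = np.arange(200)                 # unit sampling interval
      f_true = 0.8                       # cycles per sample period, above the Nyquist limit of 0.5
      f_alias = 1.0 - f_true             # folds back to 0.2 cycles per sample period

      x_true = np.cos(2 * np.pi * f_true * n)
      x_alias = np.cos(2 * np.pi * f_alias * n)
      print(np.allclose(x_true, x_alias))      # True: the sampled sequences coincide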
  6. By: D.S.G. Pollock
    Abstract: Alternative methods of trend extraction and of seasonal adjustment are described that operate in the time domain and in the frequency domain. The time-domain methods that are implemented in the TRAMO–SEATS and the STAMP programs are compared. An abbreviated time-domain method of seasonal adjustment that is implemented in the IDEOLOG program is also presented. Finite-sample versions of the Wiener–Kolmogorov filter are described that can be used to implement the methods in a common way. The frequency-domain method, which is also implemented in the IDEOLOG program, employs an ideal frequency selective filter that depends on identifying the ordinates of the Fourier transform of a detrended data sequence that should lie in the pass band of the filter and those that should lie in its stop band. Filters of this nature can be used both for extracting a low-frequency cyclical component of the data and for extracting the seasonal component.
    Keywords: Time series, Spectral analysis, Business cycles, Turning points, Seasonality.
    JEL: C22 C32 E32
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:lec:leecon:17/02&r=ets
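    A minimal Python sketch of the ideal frequency-selective filtering described above: Fourier-transform the series, keep only the ordinates in the chosen pass band, and invert. The toy series and the cut-off frequencies are assumptions.

      import numpy as np

      rng = np.random.default_rng(4)
      N = 240
      t = np.arange(N)
      # trend-free toy series: a 60-period cycle, a 12-period seasonal and noise
      x = np.sin(2 * np.pi * t / 60) + np.sin(2 * np.pi * t / 12) + 0.3 * rng.standard_normal(N)

      freqs = np.fft.rfftfreq(N, d=1.0)         # in cycles per observation
      X = np.fft.rfft(x)

      lowpass = X.copy()
      lowpass[freqs > 1 / 24] = 0               # keep periods longer than 24 observations
      cycle = np.fft.irfft(lowpass, n=N)        # low-frequency cyclical component

      seasonal_band = X.copy()
      seasonal_band[~((freqs > 1 / 14) & (freqs < 1 / 10))] = 0   # band around the 12-period seasonal
      seasonal = np.fft.irfft(seasonal_band, n=N)
      print(np.round(cycle[:3], 2), np.round(seasonal[:3], 2))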
  7. By: D.S.G. Pollock
    Abstract: A variety of filters that are commonly employed by econometricians are analysed with a view to determining their effectiveness in extracting well-defined components of economic data sequences. These components can be defined in terms of their spectral structures—i.e. their frequency content—and it is argued that the process of econometric signal extraction should be guided by a careful appraisal of the periodogram of the detrended data sequence. Whereas it is true that many annual and quarterly economic data sequences are amenable to relatively unsophisticated filtering techniques, it is often the case that monthly data that exhibit strong seasonal fluctuations require a far more delicate approach. In such cases, it may be appropriate to use filters that work directly in the frequency domain by selecting or modifying the spectral ordinates of a Fourier decomposition of data that have been subject to a preliminary detrending.
    Keywords: Time series, Spectral analysis, Business cycles, Turning points, Seasonality.
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:lec:leecon:17/01&r=ets
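    A minimal Python sketch of the periodogram appraisal advocated above: remove a linear trend by least squares and inspect which Fourier frequencies of the residual carry the power. The simulated monthly series is an assumption.

      import numpy as np

      rng = np.random.default_rng(5)
      N = 360                                      # 30 "years" of monthly data
      t = np.arange(N)
      x = 0.02 * t + 2 * np.sin(2 * np.pi * t / 12) + rng.standard_normal(N)

      # remove a linear trend by OLS before computing the periodogram
      X = np.column_stack([np.ones(N), t])
      detrended = x - X @ np.linalg.lstsq(X, x, rcond=None)[0]

      freqs = np.fft.rfftfreq(N, d=1.0)
      pgram = np.abs(np.fft.rfft(detrended)) ** 2 / N
      peak = freqs[np.argmax(pgram[1:]) + 1]       # skip the zero frequency
      print(round(1 / peak, 1))                    # dominant period: about 12 months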
  8. By: Javier Hualde (Universidad Publica de Navarra); Morten Ørregaard Nielsen (Queen's University and CREATES)
    Abstract: We consider truncated (or conditional) sum of squares estimation of a parametric model composed of a fractional time series and an additive generalized polynomial trend. Both the memory parameter, which characterizes the behaviour of the stochastic component of the model, and the exponent parameter, which drives the shape of the deterministic component, are considered not only unknown real numbers, but also lying in arbitrarily large (but finite) intervals. Thus, our model captures different forms of nonstationarity and noninvertibility. As in related settings, the proof of consistency (which is a prerequisite for proving asymptotic normality) is challenging due to non-uniform convergence of the objective function over a large admissible parameter space, but, in addition, our framework is substantially more involved due to the competition between stochastic and deterministic components. We establish consistency and asymptotic normality under quite general circumstances, finding that results differ crucially depending on the relative strength of the deterministic and stochastic components.
    Keywords: Asymptotic normality, consistency, deterministic trend, fractional process, generalized polynomial trend, noninvertibility, nonstationarity, truncated sum of squares estimation
    JEL: C22
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:qed:wpaper:1376&r=ets
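    A minimal Python sketch of truncated (conditional) sum-of-squares estimation of the memory parameter d in a pure fractional model, leaving out the deterministic trend component that the paper treats jointly; the grid bounds and the simulated example are assumptions.

      import numpy as np

      def frac_diff(x, d):
          # truncated fractional difference (1-L)^d x using binomial weights
          n = len(x)
          w = np.empty(n)
          w[0] = 1.0
          for k in range(1, n):
              w[k] = w[k - 1] * (k - 1 - d) / k
          out = np.empty(n)
          for t in range(n):
              out[t] = np.dot(w[:t + 1], x[t::-1])
          return out

      def css_d(x, grid=np.linspace(-0.4, 1.4, 181)):
          # pick the d that minimizes the sum of squared fractionally differenced residuals
          sums = [np.sum(frac_diff(x, d) ** 2) for d in grid]
          return grid[int(np.argmin(sums))]

      rng = np.random.default_rng(6)
      eps = rng.standard_normal(1000)
      x = frac_diff(eps, -0.3)              # (1-L)^{-0.3} eps has memory d = 0.3
      print(round(css_d(x), 2))             # should be close to 0.3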
  9. By: Jean-Marie Dufour; Richard Luger
    Abstract: This paper develops tests of the null hypothesis of linearity in the context of autoregressive models with Markov-switching means and variances. These tests are robust to the identification failures that plague conventional likelihood-based inference methods. The approach exploits the moments of normal mixtures implied by the regime-switching process and uses Monte Carlo test techniques to deal with the presence of an autoregressive component in the model specification. The proposed tests have very respectable power in comparison to the optimal tests for Markov-switching parameters of Carrasco et al. (2014) and they are also quite attractive owing to their computational simplicity. The new tests are illustrated with an empirical application to an autoregressive model of U.S. output growth.
    Keywords: Mixture distributions; Markov chains; Regime switching; Parametric bootstrap; Monte Carlo tests; Exact inference
    JEL: C12 C15 C22 C52
    Date: 2016–12–31
    URL: http://d.repec.org/n?u=RePEc:cir:cirwor:2016s-63&r=ets
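    A minimal Python sketch of a Monte Carlo test p-value in the spirit of the paper: a moment-based statistic computed on AR(1) residuals (skewness and excess kurtosis, which a normal mixture inflates) is ranked against replications simulated under the linear Gaussian null. Fixing the autoregressive parameter at its estimate is a simplification of the paper's treatment of nuisance parameters, and all choices here are assumptions.

      import numpy as np

      def ar1_residuals(y):
          X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
          beta = np.linalg.lstsq(X, y[1:], rcond=None)[0]
          return y[1:] - X @ beta, beta

      def moment_stat(e):
          # Jarque-Bera-type statistic based on skewness and excess kurtosis
          z = (e - e.mean()) / e.std()
          return len(e) * ((z ** 3).mean() ** 2 / 6 + ((z ** 4).mean() - 3) ** 2 / 24)

      rng = np.random.default_rng(7)
      y = np.zeros(300)
      for t in range(1, 300):                       # data generated under the linear null
          y[t] = 0.5 * y[t - 1] + rng.standard_normal()

      e, beta = ar1_residuals(y)
      stat = moment_stat(e)

      N = 99                                        # number of Monte Carlo replications
      null_stats = np.empty(N)
      for i in range(N):
          ysim = np.zeros(len(y))
          for t in range(1, len(y)):
              ysim[t] = beta[0] + beta[1] * ysim[t - 1] + rng.standard_normal()
          null_stats[i] = moment_stat(ar1_residuals(ysim)[0])

      p_value = (1 + np.sum(null_stats >= stat)) / (N + 1)   # Monte Carlo p-value
      print(round(p_value, 2))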
  10. By: Ana Paula Martins
    Abstract: This paper inspects a grid search algorithm to estimate the AR(1) process, based on the joint estimation of the canonical AR(1) equation along with its reverse form. The method relies on the GLS principle, accounting for the covariance error structure of the special estimable system; a potential improvement is to rely on across-equation-restricted system estimation with a free covariance structure. The algorithm is implemented computationally and applied to inference on the AR(1) parameter of simulated series, some stationary and others non-stationary. Additionally, it is argued, and illustrated by simulation, that non-stationary AR(1) processes appear to be consistently estimable by OLS. It is also suggested that the parameter of a stationary AR(1) process is estimable by OLS from the AR(2) representation of its non-stationary “first-integrated” series, or from the joint OLS estimate of the canonical and reverse forms of the AR(1) process. The paper concludes by stressing the importance of further study of differenced, D(p), processes, i.e. those that are stationary after being integrated p times.
    Keywords: Nonlinear Estimation; Grid Search Methods; AR(1) Processes; Integrated Series; Differenced Processes; Factored AR(1) Processes; Unit Roots.
    JEL: C22 C13 C12 C63
    Date: 2016–11–21
    URL: http://d.repec.org/n?u=RePEc:eei:rpaper:eeri_rp_2016_21&r=ets
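    A minimal Python sketch of ordinary least squares applied to the canonical AR(1) regression for a stationary and a unit-root case; the grid-search GLS system estimator proposed in the paper is not reproduced here, and the simulation settings are assumptions.

      import numpy as np

      def ols_ar1(y):
          # slope coefficient of y_t on a constant and y_{t-1}
          X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
          return np.linalg.lstsq(X, y[1:], rcond=None)[0][1]

      rng = np.random.default_rng(8)
      T = 2000
      for rho in (0.8, 1.0):                        # stationary and non-stationary cases
          y = np.zeros(T)
          for t in range(1, T):
              y[t] = rho * y[t - 1] + rng.standard_normal()
          print(rho, round(ols_ar1(y), 3))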
  11. By: Feldkircher, Martin; Huber, Florian
    Abstract: Vector autoregressive (VAR) models are frequently used for forecasting and impulse response analysis. For both applications, shrinkage priors can help improve inference. In this paper we derive the shrinkage prior of Griffin et al. (2010) for the VAR case and its relevant conditional posterior distributions. This framework imposes a set of normally distributed priors on the autoregressive coefficients and the covariances of the VAR along with Gamma priors on a set of local and global prior scaling parameters. This prior setup is then generalized by introducing another layer of shrinkage with scaling parameters that push certain regions of the parameter space to zero. A simulation exercise shows that the proposed framework yields more precise estimates of the model parameters and impulse response functions. In addition, a forecasting exercise applied to US data shows that the proposed prior outperforms other specifications in terms of point and density predictions. (authors' abstract)
    Keywords: Normal-Gamma prior; density predictions; hierarchical modeling
    Date: 2016–03
    URL: http://d.repec.org/n?u=RePEc:wiw:wus005:4933&r=ets
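    A minimal Python sketch of the kind of global-local hierarchy described above: each coefficient receives a Gamma-distributed local prior variance governed by a global shape and a global shrinkage parameter, so most draws concentrate near zero while a few remain large. The parameterization and values are illustrative assumptions, not the paper's exact prior.

      import numpy as np

      rng = np.random.default_rng(9)
      K = 1000                                   # number of VAR coefficients (illustrative)
      theta, lam = 0.3, 10.0                     # global shape and global shrinkage parameters

      psi = rng.gamma(shape=theta, scale=2.0 / (theta * lam), size=K)   # local prior variances
      beta = rng.normal(loc=0.0, scale=np.sqrt(psi))                    # coefficient draws

      print(round(np.mean(np.abs(beta) < 0.05), 2))   # share of draws shrunk close to zero
      print(round(np.max(np.abs(beta)), 2))           # yet some large coefficients survive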
  12. By: Huber, Florian; Kastner, Gregor; Feldkircher, Martin
    Abstract: We provide a flexible means of estimating time-varying parameter models in a Bayesian framework. By specifying the state innovations to be characterized through a threshold process that is driven by the absolute size of parameter changes, our model detects at each point in time whether a given regression coefficient is constant or time-varying. Moreover, our framework accounts for model uncertainty in a data-based fashion through Bayesian shrinkage priors on the initial values of the states. In a simulation, we show that our model reliably identifies regime shifts in cases where the data generating processes display high, moderate, and low numbers of movements in the regression parameters. Finally, we illustrate the merits of our approach by means of two applications. In the first application we forecast the US equity premium and in the second application we investigate the macroeconomic effects of a US monetary policy shock. (authors' abstract)
    Keywords: Change point model; Threshold mixture innovations; Structural breaks; Shrinkage; Bayesian statistics; Monetary policy
    Date: 2016–09
    URL: http://d.repec.org/n?u=RePEc:wiw:wus005:5178&r=ets
  13. By: Carlos Viana de Carvalho (Department of Economics, PUC-Rio); Ricardo Masini (São Paulo School of Economics, Getúlio Vargas Foundation); Marcelo Cunha Medeiros (Department of Economics, PUC-Rio)
    Abstract: Recently, there has been a growing interest in developing econometric tools to conduct counterfactual analysis with aggregate data when a “treated” unit suffers an intervention, such as a policy change, and there is no obvious control group. Usually, the proposed methods are based on the construction of an artificial counterfactual from a pool of “untreated” peers, organized in a panel data structure. In this paper, we investigate the consequences of applying such methodologies when the data are formed by integrated processes of order 1. We find that without a cointegration relation (the spurious case) the intervention estimator diverges, resulting in the rejection of the hypothesis of no intervention effect regardless of its existence. By contrast, when at least one cointegration relation exists, we obtain a √T-consistent estimator for the intervention effect, albeit with a non-standard distribution. However, even in this case, the test of no intervention effect is extremely oversized if nonstationarity is ignored. When a drift is present in the data generating processes, the estimator for both cases (cointegrated and spurious) either diverges or is not well defined asymptotically. As a final recommendation, we suggest working in first differences to avoid spurious results.
    Date: 2016–12
    URL: http://d.repec.org/n?u=RePEc:rio:texdis:654&r=ets
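    A minimal Python sketch of the spurious-regression problem behind the paper's final recommendation: two independent random walks appear strongly related in levels, but not once the regression is run in first differences. The simulation settings are assumptions.

      import numpy as np

      rng = np.random.default_rng(10)
      T = 500
      y = np.cumsum(rng.standard_normal(T))     # two independent I(1) processes
      x = np.cumsum(rng.standard_normal(T))

      def tstat(yv, xv):
          # t-statistic of the slope in an OLS regression of yv on a constant and xv
          X = np.column_stack([np.ones(len(xv)), xv])
          beta = np.linalg.lstsq(X, yv, rcond=None)[0]
          e = yv - X @ beta
          s2 = e @ e / (len(yv) - 2)
          se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
          return beta[1] / se

      print(round(tstat(y, x), 1))                   # typically large despite no true relation
      print(round(tstat(np.diff(y), np.diff(x)), 1)) # first differences: typically insignificant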
  14. By: Yuta Yamauchi (Graduate School of Economics, The University of Tokyo); Yasuhiro Omori (Faculty of Economics, The University of Tokyo)
    Abstract: Although stochastic volatility and GARCH models have been successful in describing the volatility dynamics of univariate asset returns, their natural extension to multivariate models with dynamic correlations has been difficult due to several major problems. Firstly, there are too many parameters to estimate if the available data are only daily returns, which results in unstable estimates. One solution to this problem is to incorporate additional observations based on intraday asset returns, such as realized covariances. However, secondly, since multivariate asset returns are not traded synchronously, we have to use the largest time interval over which all asset returns are observed to compute the realized covariance matrices, and we thus fail to make full use of the available intraday information when some assets are traded less frequently. Thirdly, it is not straightforward to guarantee that the estimated (and the realized) covariance matrices are positive definite. Our contributions are: (1) we obtain stable parameter estimates for dynamic correlation models using the realized measures, (2) we make full use of intraday information by using pairwise realized correlations, (3) the covariance matrices are guaranteed to be positive definite, (4) we avoid the arbitrariness of the ordering of asset returns, (5) we propose a flexible correlation structure model (e.g. setting some correlations to be identically zero if necessary), and (6) we propose a parsimonious specification for the leverage effect. Our proposed models are applied to daily returns of nine U.S. stocks with their realized volatilities and pairwise realized correlations, and are shown to outperform existing models with regard to portfolio performance.
    Date: 2016–11
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2016cf1029&r=ets
  15. By: Huber, Florian; Fischer, Manfred M.
    Abstract: This paper develops a multivariate regime switching monetary policy model for the US economy. To exploit a large dataset we use a factor-augmented VAR with discrete regime shifts, capturing distinct business cycle phases. The transition probabilities are modelled as time-varying, depending on a broad set of indicators that influence business cycle movements. The model is used to investigate the relationship between business cycle phases and monetary policy. Our results indicate that the effects of monetary policy are stronger in recessions, whereas the responses are more muted in expansionary phases. Moreover, lagged prices serve as good predictors for business cycle transitions. (authors' abstract)
    Keywords: Non-linear FAVAR; business cycles; monetary policy; structural model
    Date: 2015–08
    URL: http://d.repec.org/n?u=RePEc:wiw:wus005:4626&r=ets

This nep-ets issue is ©2017 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.