nep-ets New Economics Papers
on Econometric Time Series
Issue of 2019‒03‒11
ten papers chosen by
Jaqueson K. Galimberti
KOF Swiss Economic Institute

  1. A Nonparametric Dynamic Causal Model for Macroeconometrics By Ashesh Rambachan; Neil Shephard
  2. A Generalised Fractional Differencing Bootstrap for Long Memory Processes By Kapetanios, George; Papailias, Fotis; Taylor, AM Robert
  3. Testing for Episodic Predictability in Stock Returns By Demetrescu, Matei; Georgiev, Iliyan; Rodrigues, Paulo MM; Taylor, AM Robert
  4. Nonparametric Recovery of the Yield Curve Evolution from Cross-Section and Time Series Information By Koo, B.; La Vecchia, D.; Linton, O.
  5. Approximation Properties of Variational Bayes for Vector Autoregressions By Reza Hajargasht
  6. Response surface regressions for critical value bounds and approximate p-values in equilibrium correction models By Sebastian Kripfganz; Daniel C. Schneider
  7. Multivariate Filter Estimation of Potential Output for the United States: An Extension with Labor Market Hysteresis By Ali Alichi; Hayk Avetisyan; Douglas Laxton; Shalva Mkhatrishvili; Armen Nurbekyan; Lusine Torosyan; Hou Wang
  8. Asymptotics for volatility derivatives in multi-factor rough volatility models By Chloe Lacombe; Aitor Muguruza; Henry Stone
  9. Ancillarity-Sufficiency Interweaving Strategy (ASIS) for Boosting MCMC Estimation of Stochastic Volatility Models By Gregor Kastner; Sylvia Frühwirth-Schnatter
  10. Score-driven time series models with dynamic shape: an application to the Standard & Poor's 500 index By Escribano Sáez, Álvaro; Blazsek, Szabolcs Istvan; Ayala, Astrid

  1. By: Ashesh Rambachan; Neil Shephard
    Abstract: This paper uses potential outcome time series to provide a nonparametric framework for quantifying dynamic causal effects in macroeconometrics. It provides sufficient conditions for the nonparametric identification of dynamic causal effects and clarifies the causal content of several common assumptions and methods in macroeconomics. Our key identifying assumption is shown to be that treatments are non-anticipating, which enables nonparametric inference on dynamic causal effects. Next, we provide a formal definition of a 'shock', which leads to a shocked potential outcome time series; this is a nonparametric statement of the Frisch-Slutzky paradigm. The common additional assumptions that the causal effects are additive and that the treatments are shocks place substantial restrictions on the underlying dynamic causal estimands. We use this structure to causally interpret several common estimation strategies. We provide sufficient conditions under which local projections are causally interpretable and show that the standard assumptions for local projections with an instrument are not sufficient to identify dynamic causal effects. Finally, we show that the structural vector moving average form is causally equivalent to a restricted potential outcome time series under the usual invertibility assumption.
    Date: 2019–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1903.01637&r=all
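To make the local projection discussion above concrete, the following minimal sketch estimates horizon-by-horizon coefficients from a regression of y_{t+h} on an observed shock series. The toy data-generating process, the HAC standard errors and the variable names are illustrative choices, not the authors' estimator or identification setup.

```python
# Local projection sketch: regress y_{t+h} on the shock at t for each horizon h.
# Hypothetical data; the horizon-h slope is the usual local projection estimate
# of the dynamic effect under the identification conditions discussed above.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
T = 500
shock = rng.standard_normal(T)              # observed shock series (toy)
y = np.zeros(T)
for t in range(1, T):                       # toy data-generating process
    y[t] = 0.5 * y[t - 1] + 1.0 * shock[t] + rng.standard_normal()

H = 8                                       # maximum horizon
irf = []
for h in range(H + 1):
    yh = y[h:]                              # y_{t+h}, t = 0, ..., T-1-h
    X = sm.add_constant(shock[:T - h])      # shock_t
    res = sm.OLS(yh, X).fit(cov_type="HAC", cov_kwds={"maxlags": h + 1})
    irf.append(res.params[1])               # dynamic effect of the shock at horizon h
print(np.round(irf, 2))
```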
  2. By: Kapetanios, George; Papailias, Fotis; Taylor, AM Robert
    Abstract: A bootstrap methodology, first proposed in a restricted form by Kapetanios and Papailias (2011) and suitable for use with stationary and nonstationary fractionally integrated time series, is further developed in this paper. The resampling algorithm involves estimating the degree of fractional integration, applying the fractional differencing operator, resampling the resulting approximation to the underlying short memory series and, finally, cumulating to obtain a resample of the original fractionally integrated process. While a similar approach based on differencing has been independently proposed in the literature for stationary fractionally integrated processes using the sieve bootstrap by Poskitt, Grose and Martin (2015), we extend it to allow for general bootstrap schemes including blockwise bootstraps. Further, we show that it can also be validly used for nonstationary fractionally integrated processes. We establish asymptotic validity results for the general method and provide simulation evidence which highlights a number of favourable aspects of its finite sample performance, relative to other commonly used bootstrap methods.
    Date: 2019–02–27
    URL: http://d.repec.org/n?u=RePEc:esy:uefcwp:24136&r=all
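The resampling algorithm summarised in the abstract can be sketched as follows, assuming a log-periodogram (GPH) estimate of the memory parameter and a moving-block resampling scheme; the estimator of d, the block length and the function names are illustrative assumptions rather than the authors' exact choices.

```python
import numpy as np

def frac_diff(x, d):
    """Apply the fractional differencing filter (1 - L)^d."""
    n = len(x)
    pi = np.zeros(n)
    pi[0] = 1.0
    for k in range(1, n):
        pi[k] = pi[k - 1] * (k - 1 - d) / k
    return np.array([pi[: t + 1][::-1] @ x[: t + 1] for t in range(n)])

def gph_estimate(x, m=None):
    """Log-periodogram (GPH) estimate of the memory parameter d (illustrative)."""
    n = len(x)
    m = m or int(n ** 0.5)
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    fft = np.fft.fft(x - x.mean())
    I = (np.abs(fft[1 : m + 1]) ** 2) / (2 * np.pi * n)   # periodogram ordinates
    X = np.column_stack([np.ones(m), -2 * np.log(2 * np.sin(freqs / 2))])
    return np.linalg.lstsq(X, np.log(I), rcond=None)[0][1]

def frac_diff_bootstrap(x, n_boot=199, block=20, rng=None):
    """Sketch of the fractional differencing bootstrap loop described above."""
    rng = rng or np.random.default_rng(0)
    d_hat = gph_estimate(x)
    u = frac_diff(x, d_hat)                  # approximate short-memory series
    n = len(u)
    resamples = []
    for _ in range(n_boot):
        # moving-block resample of the short-memory component
        starts = rng.integers(0, n - block + 1, size=int(np.ceil(n / block)))
        u_star = np.concatenate([u[s : s + block] for s in starts])[:n]
        # cumulate: apply (1 - L)^{-d_hat} to obtain a resample of x
        resamples.append(frac_diff(u_star, -d_hat))
    return np.array(resamples)
```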
  3. By: Demetrescu, Matei; Georgiev, Iliyan; Rodrigues, Paulo MM; Taylor, AM Robert
    Abstract: Standard tests based on predictive regressions estimated over the full available sample data have tended to find little evidence of predictability in stock returns. Recent approaches based on the analysis of subsamples of the data have been considered, suggesting that predictability, where it occurs, might exist only within so-called 'pockets of predictability' rather than across the entire sample. However, these methods are prone to the criticism that the sub-sample dates are endogenously determined, such that the use of standard critical values appropriate for full sample tests will result in incorrectly sized tests, leading to spurious findings of stock return predictability. To avoid the problem of endogenously-determined sample splits, we propose new tests derived from sequences of predictability statistics systematically calculated over sub-samples of the data. Specifically, we base our tests on the maximum of such statistics from sequences of forward and backward recursive, rolling, and double-recursive predictive sub-sample regressions. We develop our approach using the over-identified instrumental variable-based predictability test statistics of Breitung and Demetrescu (2015). This approach is based on partial-sum asymptotics and so, unlike many other popular approaches including, for example, those based on Bonferroni corrections, can be readily adapted to implementation over sequences of subsamples. We show that the limiting distributions of our proposed tests are robust to both the degree of persistence and endogeneity of the regressors in the predictive regression, but not to any heteroskedasticity present, even if the sub-sample statistics are based on heteroskedasticity-robust standard errors. We therefore develop fixed regressor wild bootstrap implementations of the tests, which we demonstrate to be first-order asymptotically valid. Finite sample behaviour against a variety of temporarily predictable processes is considered. An empirical application to US stock returns illustrates the usefulness of the new predictability testing methods we propose.
    Keywords: predictive regression; rolling and recursive IV estimation; persistence; endogeneity; conditional and unconditional heteroskedasticity
    Date: 2019–02–27
    URL: http://d.repec.org/n?u=RePEc:esy:uefcwp:24137&r=all
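A toy version of the sup-type construction described above: t-statistics from predictive regressions computed over forward-recursive subsamples and then maximised. Plain OLS statistics with heteroskedasticity-robust errors are used purely for illustration; the paper employs IV-based statistics and a fixed regressor wild bootstrap for critical values, neither of which is implemented here.

```python
# Sup-statistic over forward-recursive subsamples: t-statistics from the
# predictive regressions r_{t+1} = a + b * x_t + e_{t+1} on expanding windows,
# then maximised. Data are simulated with a late 'pocket' of predictability.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
T = 400
x = np.zeros(T)
for t in range(1, T):                        # persistent predictor (toy)
    x[t] = 0.98 * x[t - 1] + rng.standard_normal()
r = 0.05 * x[:-1] * (np.arange(T - 1) > 300) + rng.standard_normal(T - 1)

t_min = 80                                   # minimum subsample length
sup_stat = -np.inf
for t_end in range(t_min, T - 1):
    X = sm.add_constant(x[:t_end])           # predictor on the subsample
    res = sm.OLS(r[:t_end], X).fit(cov_type="HC0")
    sup_stat = max(sup_stat, res.tvalues[1]) # running maximum of the t-statistics
print(round(sup_stat, 2))
```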
  4. By: Koo, B.; La Vecchia, D.; Linton, O.
    Abstract: We develop estimation methodology for an additive nonparametric panel model that is suitable for capturing the pricing of coupon-paying government bonds followed over many time periods. We use our model to estimate the discount function and yield curve of nominally riskless government bonds. The novelty of our approach is the combination of two different techniques: cross-sectional nonparametric methods and kernel estimation for time-varying dynamics in the time series context. The resulting estimator is able to capture the yield curve shapes and dynamics commonly observed in the fixed income markets. We establish the consistency, the rate of convergence, and the asymptotic normality of the proposed estimator. A Monte Carlo exercise illustrates the good performance of the method under different scenarios. We apply our methodology to the daily CRSP bond dataset, and compare it with the popular Diebold and Li (2006) method.
    Keywords: nonparametric inference, panel data, time varying, yield curve dynamics
    JEL: C13 C14 C22 G12
    Date: 2019–02–27
    URL: http://d.repec.org/n?u=RePEc:cam:camdae:1916&r=all
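As a highly simplified illustration of the cross-sectional smoothing ingredient, the sketch below applies a Nadaraya-Watson kernel estimator to a toy cross-section of yields over maturity; it does not reproduce the paper's additive panel estimator for discount functions from coupon-bond prices, and all data and bandwidth choices are hypothetical.

```python
# Cross-sectional kernel smoothing of yields over maturity (Nadaraya-Watson),
# as a rough stand-in for the nonparametric yield curve ideas discussed above.
import numpy as np

def nw_smooth(maturities, yields, grid, bandwidth=1.0):
    """Nadaraya-Watson estimate of the yield at each grid maturity."""
    out = []
    for m in grid:
        w = np.exp(-0.5 * ((maturities - m) / bandwidth) ** 2)  # Gaussian kernel weights
        out.append(w @ yields / w.sum())
    return np.array(out)

# toy cross-section of zero-coupon-style yields (in percent), hypothetical values
mats = np.array([0.25, 0.5, 1, 2, 3, 5, 7, 10, 20, 30], dtype=float)
ylds = np.array([2.4, 2.5, 2.6, 2.7, 2.8, 3.0, 3.1, 3.2, 3.5, 3.6])
grid = np.linspace(0.25, 30, 60)
curve = nw_smooth(mats, ylds, grid, bandwidth=2.0)   # smoothed yield curve on the grid
```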
  5. By: Reza Hajargasht
    Abstract: Variational Bayes (VB) is a recent approximate method for Bayesian inference. It has the merit of being a fast and scalable alternative to Markov Chain Monte Carlo (MCMC), but its approximation error is often unknown. In this paper, we derive the approximation error of VB in terms of mean, mode, variance, predictive density and KL divergence for the linear Gaussian multi-equation regression. Our results indicate that VB approximates the posterior mean perfectly. Factors affecting the magnitude of underestimation in posterior variance and mode are revealed. Importantly, we demonstrate that VB estimates predictive densities accurately.
    Date: 2019–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1903.00617&r=all
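A textbook mean-field (coordinate ascent) variational Bayes sketch for a single-equation Gaussian regression with a Normal prior on the coefficients and a Gamma prior on the noise precision; this is a toy version of the setting analysed in the paper, not its VAR implementation, and all priors and data are illustrative.

```python
# Mean-field VB (CAVI) for y = X beta + e, e ~ N(0, 1/tau), beta ~ N(0, I/alpha),
# tau ~ Gamma(a0, b0). The factorised posterior q(beta) q(tau) is updated in turn.
import numpy as np

rng = np.random.default_rng(2)
N, K = 200, 3
X = rng.standard_normal((N, K))
beta_true = np.array([1.0, -0.5, 0.25])
y = X @ beta_true + rng.standard_normal(N)

alpha = 1.0                      # fixed prior precision on the coefficients
a0, b0 = 2.0, 2.0                # Gamma prior on the noise precision
E_tau = a0 / b0
for _ in range(50):              # coordinate ascent updates
    S = np.linalg.inv(alpha * np.eye(K) + E_tau * X.T @ X)    # q(beta) covariance
    m = E_tau * S @ X.T @ y                                    # q(beta) mean
    a_n = a0 + N / 2
    resid = y - X @ m
    b_n = b0 + 0.5 * (resid @ resid + np.trace(X @ S @ X.T))   # E||y - X beta||^2
    E_tau = a_n / b_n
print(np.round(m, 2))            # VB posterior mean of the coefficients
```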
  6. By: Sebastian Kripfganz (University of Exeter); Daniel C. Schneider (Max Planck Institute for Demographic Research)
    Abstract: Single-equation conditional equilibrium correction models can be used to test for the existence of a level relationship among the variables of interest. The distributions of the respective test statistics are nonstandard under the null hypothesis of no such relationship and critical values need to be obtained with stochastic simulations. We compute more than 95 billion F-statistics and 57 billion t-statistics for a large number of specifications of the Pesaran, Shin, and Smith (2001, Journal of Applied Econometrics 16: 289–326) bounds test. Our large-scale simulations enable us to draw smooth density functions and to estimate response surface models that improve upon and substantially extend the set of available critical values for the bounds test. Besides covering the full range of possible sample sizes and lag orders, our approach notably allows for any number of variables in the long-run level relationship by exploiting the diminishing effect on the distributions of adding another variable to the model. The computation of approximate p-values enables fine-grained statistical inference and allows us to quantify the finite-sample distortions from using asymptotic critical values. We find that the bounds test can easily be oversized by more than 5 percentage points in small samples.
    Keywords: Bounds test, Cointegration, Error correction model, Generalized Dickey-Fuller regression, Level relationship, Unit roots
    JEL: C12 C15 C32 C46 C63
    Date: 2019
    URL: http://d.repec.org/n?u=RePEc:exe:wpaper:1901&r=all
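The response surface idea can be sketched as follows: simulate finite-sample quantiles of a nonstandard statistic at several sample sizes and regress them on powers of 1/T to approximate critical values at any T. A Dickey-Fuller t-ratio is used purely as a convenient stand-in for the bounds-test statistics, and the numbers of replications and sample sizes are far smaller than in the paper's large-scale simulations.

```python
# Response surface sketch: empirical 5% critical values of a Dickey-Fuller
# t-ratio at several sample sizes, regressed on 1/T and 1/T^2.
import numpy as np
import statsmodels.api as sm

def df_tstat(T, rng):
    y = np.cumsum(rng.standard_normal(T))          # random walk under the null
    dy, ylag = np.diff(y), y[:-1]
    return sm.OLS(dy, sm.add_constant(ylag)).fit().tvalues[1]

rng = np.random.default_rng(3)
sizes, reps, level = [50, 100, 200, 400], 2000, 0.05
quantiles = []
for T in sizes:
    stats = [df_tstat(T, rng) for _ in range(reps)]
    quantiles.append(np.quantile(stats, level))    # finite-sample 5% critical value

# response surface: cv(T) ~ theta0 + theta1/T + theta2/T^2
Tinv = 1 / np.array(sizes, dtype=float)
Z = np.column_stack([np.ones(len(sizes)), Tinv, Tinv ** 2])
theta = np.linalg.lstsq(Z, np.array(quantiles), rcond=None)[0]
cv_75 = theta @ np.array([1, 1 / 75, 1 / 75 ** 2]) # approximate critical value at T = 75
```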
  7. By: Ali Alichi; Hayk Avetisyan; Douglas Laxton; Shalva Mkhatrishvili; Armen Nurbekyan; Lusine Torosyan; Hou Wang
    Abstract: This paper extends the multivariate filter approach of estimating potential output developed by Alichi and others (2018) to incorporate labor market hysteresis. This extension captures the idea that long and deep recessions (expansions) cause persistent damage (improvement) to the labor market, thereby reducing (increasing) potential output. Applying the model to U.S. data results in significantly smaller estimates of output gaps, and higher estimates of the NAIRU, after the global financial crisis, compared to estimates without hysteresis. The smaller output gaps partly explain the absence of persistent deflation despite the slow recovery during 2010-2017. Going forward, if strong growth performance continues well beyond 2018, hysteresis is expected to result in a structural improvement in growth and employment.
    Date: 2019–02–19
    URL: http://d.repec.org/n?u=RePEc:imf:imfwpa:19/35&r=all
  8. By: Chloe Lacombe; Aitor Muguruza; Henry Stone
    Abstract: We present small-time implied volatility asymptotics for Realised Variance (RV) and VIX options for a number of (rough) stochastic volatility models via a large deviations principle. We provide numerical results along with efficient and robust numerical recipes to compute the rate function, the backbone of our theoretical framework. Based on our results, we further develop approximation schemes for the density of RV, which in turn allow us to express the volatility swap in closed form. Lastly, we investigate different constructions of multi-factor models and how each of them affects the convexity of the implied volatility smile. Interestingly, we identify the class of models that generate non-linear smiles around the money.
    Date: 2019–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1903.02833&r=all
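A Monte Carlo sketch of realised-variance option pricing under a simplified rough-volatility-style model, with variance driven by a fractional Brownian motion (H < 1/2) simulated exactly via Cholesky factorisation; this lognormal-fBm specification and all parameter values are illustrative assumptions and do not reproduce the paper's multi-factor models or its large deviations asymptotics.

```python
# Simplified rough-volatility-style variance process: v_t = xi0 * exp(eta*B^H_t
# - 0.5*eta^2*t^{2H}), with B^H a fractional Brownian motion; realised variance
# is approximated by the average of v over the horizon, and an RV call is priced
# by Monte Carlo (undiscounted).
import numpy as np

H, eta, xi0 = 0.1, 1.5, 0.04           # Hurst index, vol-of-vol, initial variance level
T, n_steps, n_paths = 0.1, 50, 2000
t = np.linspace(T / n_steps, T, n_steps)

# exact simulation of fBm on the grid via the Cholesky factor of its covariance
cov = 0.5 * (t[:, None] ** (2 * H) + t[None, :] ** (2 * H)
             - np.abs(t[:, None] - t[None, :]) ** (2 * H))
L = np.linalg.cholesky(cov + 1e-12 * np.eye(n_steps))
rng = np.random.default_rng(4)
fbm = rng.standard_normal((n_paths, n_steps)) @ L.T

v = xi0 * np.exp(eta * fbm - 0.5 * eta ** 2 * t ** (2 * H))   # variance paths, E[v_t] = xi0
rv = v.mean(axis=1)                                           # realised variance proxy
rv_call = np.maximum(rv - xi0, 0.0).mean()                    # at-the-money RV call price
```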
  9. By: Gregor Kastner; Sylvia Frühwirth-Schnatter
    Abstract: The sampling efficiency of Bayesian inference for stochastic volatility models using MCMC methods depends heavily on the actual parameter values. While draws from the posterior utilizing the standard centered parameterization break down when the volatility of volatility parameter in the latent state equation is small, non-centered versions of the model show deficiencies for highly persistent latent variable series. The novel approach of ancillarity-sufficiency interweaving has recently been shown to aid in overcoming these issues for a broad class of multilevel models. In this paper, we demonstrate how such an interweaving strategy can be applied to stochastic volatility models in order to greatly improve sampling efficiency for all parameters and throughout the entire parameter range. Moreover, this method of "combining the best of different worlds" allows for inference on parameter constellations that have previously been infeasible to estimate, without the need to select a particular parameterization beforehand.
    Date: 2017–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1706.05280&r=all
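The interweaving idea is easiest to see on a toy Gaussian level model, for which every full conditional is available in closed form; the sketch below alternates between the centered (sufficient) and non-centered (ancillary) parameterisations when drawing the level parameter. It illustrates the ASIS recipe on a simple multilevel model of the kind mentioned in the abstract, not the stochastic volatility sampler developed in the paper.

```python
# ASIS on a toy level model: y_i = theta_i + e_i with theta_i ~ N(mu, v), flat
# prior on mu, known variances. The level mu is drawn twice per sweep, once in
# each parameterisation, with deterministic transforms in between.
import numpy as np

rng = np.random.default_rng(5)
n, v, mu_true = 50, 0.5, 2.0
theta_true = mu_true + np.sqrt(v) * rng.standard_normal(n)
y = theta_true + rng.standard_normal(n)

mu = 0.0
draws = []
for it in range(5000):
    # 1. centered step: theta_i | mu, y_i  (sufficient augmentation for mu)
    post_var = 1.0 / (1.0 + 1.0 / v)
    theta = post_var * (y + mu / v) + np.sqrt(post_var) * rng.standard_normal(n)
    # 2. mu | theta under a flat prior
    mu = theta.mean() + np.sqrt(v / n) * rng.standard_normal()
    # 3. switch to the ancillary parameterisation theta_tilde = theta - mu
    theta_tilde = theta - mu
    # 4. redraw mu | theta_tilde, y  (since y_i = mu + theta_tilde_i + e_i)
    mu = (y - theta_tilde).mean() + np.sqrt(1.0 / n) * rng.standard_normal()
    # 5. map back to the centered parameterisation with the new mu
    theta = theta_tilde + mu
    draws.append(mu)
print(round(np.mean(draws[500:]), 2))   # posterior mean of mu after burn-in
```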
  10. By: Escribano Sáez, Álvaro; Blazsek, Szabolcs Istvan; Ayala, Astrid
    Abstract: We introduce new dynamic conditional score (DCS) volatility models with dynamic scale and shape parameters for the effective measurement of volatility. In the new models, we use the EGB2 (exponential generalized beta of the second kind), NIG (normal-inverse Gaussian) and Skew-Gen-t (skewed generalized-t) probability distributions. Those distributions involve several shape parameters that control the dynamic skewness, tail shape and peakedness of financial returns. We use daily return data from the Standard & Poor's 500 (S&P 500) index for the period from January 4, 1950 to December 30, 2017. We estimate all models by the maximum likelihood (ML) method, and we present conditions for consistency and asymptotic normality of the ML estimates. We study those conditions for the S&P 500 and also perform diagnostic tests on the residuals. The statistical performance of several DCS specifications with dynamic shape is superior to that of the DCS specification with constant shape. Outliers in the shape parameters are associated with important announcements that affected the United States (US) stock market. Our results motivate the application of the new DCS models to volatility measurement, the pricing of financial derivatives, and the estimation of value-at-risk (VaR) and expected shortfall (ES) metrics.
    Keywords: score-driven shape parameters; Dynamic conditional score (DCS) models
    JEL: C58 C52 C22
    Date: 2019–01–28
    URL: http://d.repec.org/n?u=RePEc:cte:werepe:28133&r=all
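For readers unfamiliar with score-driven recursions, the sketch below filters a log-scale parameter with a Student-t (Beta-t-EGARCH-type) score update with constant shape, the basic DCS building block that the paper's EGB2, NIG and Skew-Gen-t models with dynamic shape generalise; the parameter values are illustrative, not estimates for the S&P 500.

```python
# Score-driven (DCS) volatility filter with Student-t errors and constant shape:
# y_t = exp(lambda_t) * eps_t with eps_t ~ t_nu, and lambda_t updated by the
# scaled score of the t log-density.
import numpy as np

def dcs_t_filter(y, omega, phi, kappa, nu):
    """Filter the log-scale lambda_t with the Beta-t-EGARCH-type score update."""
    lam = np.empty(len(y))
    lam[0] = omega
    for t in range(len(y) - 1):
        # score of the t log-density with respect to lambda_t
        u = (nu + 1) * y[t] ** 2 / (nu * np.exp(2 * lam[t]) + y[t] ** 2) - 1
        lam[t + 1] = omega * (1 - phi) + phi * lam[t] + kappa * u
    return lam

rng = np.random.default_rng(6)
returns = 0.01 * rng.standard_t(df=5, size=1000)   # toy daily return series
lam = dcs_t_filter(returns, omega=np.log(0.01), phi=0.97, kappa=0.05, nu=5.0)
vol = np.exp(lam)                                   # filtered volatility (scale) path
```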

This nep-ets issue is ©2019 by Jaqueson K. Galimberti. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.