nep-ets New Economics Papers
on Econometric Time Series
Issue of 2013‒01‒07
twenty papers chosen by
Yong Yin
SUNY at Buffalo

  1. A test for the rank of the volatility process: the random perturbation approach By Jean Jacod; Mark Podolskij
  2. A Smooth Transition Long-Memory Model By Marcel Aloy; Gilles Dufrénot; Charles Lai Tong; Anne Péguin-Feissolle
  3. Modeling Financial Volatility in the Presence of Abrupt Changes By Gordon J. Ross
  4. Filtering with heavy tails By Harvey, A.; Luati, A.
  5. ON THE BEHAVIOR OF NONPARAMETRIC DENSITY AND SPECTRAL DENSITY ESTIMATORS AT ZERO POINTS OF THEIR SUPPORT By Politis, Dimitris
  6. Testing for Common GARCH Factors By Prosper Dovonon; Éric Renault
  7. Transformed Polynomials for Nonlinear Autoregressive Models of the Conditional Mean By Francisco Blasques
  8. Bayesian analysis of recursive SVAR models with overidentifying restrictions By Andrzej Kociecki; Michał Rubaszek; Michele Ca' Zorzi
  9. Prior selection for vector autoregressions By Domenico Giannone; Michele Lenza; Giorgio E. Primiceri
  10. Nonparametric HAC estimation for time series data with missing observations By Deepa Dhume Datta; Wenxin Du
  11. Multi-step ahead forecasting of vector time series By Tucker McElroy; Michael W. McCracken
  12. Qual VAR Revisited: Good Forecast, Bad Story By Makram El-Shagi; Gregor von Schweinitz
  13. Forecasting and Signal Extraction with Regularised Multivariate Direct Filter Approach By Ginters Buss
  14. Testing for Breaks in Cointegrated Panels By Chihwa Kao; Lorenzo Trapani; Giovanni Urga
  15. The Hausman-Taylor Panel Data Model with Serial Correlation By Badi H. Baltagi; Long Liu
  16. Estimation and Prediction in the Random Effects Model with AR(p) Remainder Disturbances By Badi H. Baltagi; Long Liu
  17. A Robust Hausman-Taylor Estimator By Badi H. Baltagi; Georges Bresson
  18. Small Sample Properties and Pretest Estimation of A Spatial Hausman-Taylor Model By Badi H. Baltagi; Peter H. Egger; Michaela Kesina
  19. Improved Variance Estimation of Maximum Likelihood Estimators in Stable First-Order Dynamic Regression Models By Jan F. KIVIET; Garry D.A. PHILLIPS
  20. Volatility modelling of foreign exchange rate: discrete GARCH family versus continuous GARCH By ARI, YAKUP

  1. By: Jean Jacod (University Paris VI); Mark Podolskij (Heidelberg University and CREATES)
    Abstract: In this paper we present a test for the maximal rank of the matrix-valued volatility process in the continuous Itô semimartingale framework. Our idea is based upon a random perturbation of the original high frequency observations of an Itô semimartingale, which opens the way for rank testing. We develop the complete limit theory for the test statistic and apply it to various null and alternative hypotheses. Finally, we demonstrate a homoscedasticity test for the rank process.
    Keywords: central limit theorem, high frequency data, homoscedasticity testing, Itô semimartingales, rank estimation, stable convergence.
    JEL: C10 C13 C14
    Date: 2012–12–14
    URL: http://d.repec.org/n?u=RePEc:aah:create:2012-57&r=ets
  2. By: Marcel Aloy (Aix-Marseille University (Aix-Marseille School of Economics), CNRS & EHESS); Gilles Dufrénot (Aix-Marseille University (Aix-Marseille School of Economics), CNRS & EHESS, Banque de France and CEPII); Charles Lai Tong (Aix-Marseille University (Aix-Marseille School of Economics), CNRS & EHESS); Anne Péguin-Feissolle (Aix-Marseille University (Aix-Marseille School of Economics), CNRS & EHESS)
    Abstract: This paper proposes a new fractional model with a time-varying long-memory parameter. The latter evolves nonlinearly according to a transition variable through a logistic function. We present an LR-based test that allows one to discriminate between the standard fractional model and our model. We further apply the nonlinear least squares method to estimate the long-memory parameter. We present an application to the unemployment rate in the United States from 1948 to 2012.
    Keywords: Long-memory, nonlinearity, time varying parameter, logistic.
    JEL: C22 C51 C58
    Date: 2012–12
    URL: http://d.repec.org/n?u=RePEc:aim:wpaimx:1240&r=ets
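    A compact way to write the model class sketched above (illustrative notation only; the authors' exact parameterization may differ) is a fractional filter whose memory parameter moves through a logistic transition:

        (1 - L)^{d_t} y_t = u_t,
        d_t = d_0 + d_1 F(s_t; \gamma, c),   F(s; \gamma, c) = 1 / (1 + exp(-\gamma (s - c))),   \gamma > 0,

    so d_t varies smoothly between d_0 and d_0 + d_1 as the transition variable s_t crosses the location c.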
  3. By: Gordon J. Ross
    Abstract: The volatility of financial instruments is rarely constant, and usually varies over time. This creates a phenomenon called volatility clustering, where large price movements on one day are followed by similarly large movements on successive days, creating temporal clusters. The GARCH model, which treats volatility as a drift process, is commonly used to capture this behavior. However, research suggests that volatility is often better described by a structural break model, where the volatility undergoes abrupt jumps in addition to drift. Most efforts to integrate these jumps into the GARCH methodology have resulted in models which are either very computationally demanding, or which make problematic assumptions about the distribution of the instruments, often assuming that they are Gaussian. We present a new approach which uses ideas from nonparametric statistics to identify structural break points without making such distributional assumptions, and then models drift separately within each identified regime. Using our method, we investigate the volatility of several major stock indexes, and find that our approach can potentially give an improved fit compared to more commonly used techniques.
    Date: 2012–12
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1212.6016&r=ets
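    A toy two-stage illustration of the idea above in Python: a rank-based (Mood-type) scale statistic locates a single variance break without distributional assumptions, and a GARCH(1,1) is then fitted within each regime. This is a minimal stand-in for, not a reproduction of, the paper's change-point machinery; the single-break setting and the 'arch' package are assumptions of the sketch.

        # Stand-in for paper 3's two-stage approach: nonparametric break
        # detection followed by GARCH estimation within each regime.
        import numpy as np
        from arch import arch_model  # assumes the 'arch' package is installed

        def mood_break_point(x, trim=30):
            """Split point maximising a Mood-type rank statistic for scale."""
            n = len(x)
            ranks = np.argsort(np.argsort(np.abs(x))) + 1   # ranks of |x|
            stats = []
            for k in range(trim, n - trim):
                m = ((ranks[:k] - (n + 1) / 2) ** 2).sum()  # Mood statistic
                mu = k * (n**2 - 1) / 12                    # its mean ...
                var = k * (n - k) * (n + 1) * (n**2 - 4) / 180  # ... and variance
                stats.append(abs((m - mu) / np.sqrt(var)))
            return trim + int(np.argmax(stats))

        rng = np.random.default_rng(0)
        r = np.concatenate([rng.standard_t(6, 500) * 0.5,
                            rng.standard_t(6, 500) * 1.5])  # abrupt variance jump
        k = mood_break_point(r)
        for seg in (r[:k], r[k:]):                          # GARCH drift per regime
            print(arch_model(seg, p=1, q=1).fit(disp="off").params)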
  4. By: Harvey, A.; Luati, A.
    Abstract: An unobserved components model in which the signal is buried in noise that is non-Gaussian may throw up observations that, when judged by the Gaussian yardstick, are outliers. We describe an observation-driven model, based on a conditional Student t-distribution, that is tractable and retains some of the desirable features of the linear Gaussian model. Letting the dynamics be driven by the score of the conditional distribution leads to a specification that is not only easy to implement, but which also facilitates the development of a comprehensive and relatively straightforward theory for the asymptotic distribution of the ML estimator. The methods are illustrated with an application to rail travel in the UK. The final part of the article shows how the model may be extended to include explanatory variables.
    Keywords: Outlier; robustness; score; seasonal; t-distribution; trend
    JEL: C22
    Date: 2012–12–19
    URL: http://d.repec.org/n?u=RePEc:cam:camdae:1255&r=ets
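    The key mechanism in the model above is that the score of a conditional Student t automatically down-weights outliers. A minimal local-level sketch with fixed illustrative parameters (the paper estimates them by ML and covers much more, e.g. seasonals and explanatory variables):

        # Score-driven local level with conditional Student-t innovations:
        # the update weight w shrinks toward zero for large prediction errors,
        # so a gross outlier barely moves the filtered level.
        import numpy as np

        def t_score_filter(y, kappa=0.2, nu=5.0, sigma=1.0, mu0=0.0):
            mu = np.empty(len(y) + 1)
            mu[0] = mu0
            for t, yt in enumerate(y):
                v = yt - mu[t]                            # prediction error
                w = (nu + 1.0) / (nu + (v / sigma) ** 2)  # t-score weight
                mu[t + 1] = mu[t] + kappa * w * v         # score-driven update
            return mu[1:]

        rng = np.random.default_rng(1)
        y = np.cumsum(rng.normal(0, 0.1, 300)) + rng.standard_t(3, 300)
        y[150] += 15.0                      # inject a single gross outlier
        print(t_score_filter(y)[148:153])   # level barely reacts at t = 150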
  5. By: Politis, Dimitris
    Abstract: The asymptotic behavior of nonparametric estimators of the probability density function of an i.i.d. sample and of the spectral density function of a stationary time series has been studied in some detail over the last 50-60 years. Nevertheless, an open problem remains to date, namely the behavior of the estimator when the target function happens to vanish at the point of interest. In the paper at hand we fill this gap, and show that asymptotic normality still holds true, but with a super-efficient rate of convergence. We also provide two possible applications where these new results can be useful in practice.
    Keywords: Econometrics and Quantitative Economics, Nonparametric Density
    Date: 2012–12–01
    URL: http://d.repec.org/n?u=RePEc:cdl:ucsdec:qt40g0z0tz&r=ets
  6. By: Prosper Dovonon; Éric Renault
    Abstract: This paper proposes a test for common conditionally heteroskedastic (CH) features in asset returns. Following Engle and Kozicki (1993), the common CH features property is expressed in terms of testable overidentifying moment restrictions. However, as we show, these moment conditions have a degenerate Jacobian matrix at the true parameter value and therefore the standard asymptotic results of Hansen (1982) do not apply. We show in this context that Hansen's (1982) J-test statistic is asymptotically distributed as the minimum of the limit of a certain empirical process with a markedly nonstandard distribution. If two assets are considered, this asymptotic distribution is a half-half mixture of χ²(H-1) and χ²(H), where H is the number of moment conditions, as opposed to a χ²(H-1). With more than two assets, this distribution lies between the χ²(H-p) and χ²(H) (p being the number of parameters). These results show that ignoring the lack of first-order identification of the moment condition model leads to oversized tests whose over-rejection rate may increase with the number of assets. A Monte Carlo study illustrates these findings.
    Keywords: Common features, GARCH factors, Nonstandard asymptotics, GMM, GMM overidentification test, identification, first-order identification
    Date: 2012–12–01
    URL: http://d.repec.org/n?u=RePEc:cir:cirwor:2012s-34&r=ets
  7. By: Francisco Blasques (VU University Amsterdam)
    Abstract: This paper proposes a new set of transformed polynomial functions that provide a flexible setting for nonlinear autoregressive modeling of the conditional mean while at the same time ensuring the strict stationarity, ergodicity, fading memory and existence of moments of the implied stochastic sequence. The great flexibility of the transformed polynomial functions makes them interesting for both parametric and semi-nonparametric autoregressive modeling. This flexibility is established by showing that transformed polynomial sieves are sup-norm-dense on the space of continuous functions and offer appropriate convergence speeds on Hölder function spaces.
    Keywords: time-series; nonlinear autoregressive models; semi-nonparametric models; method of sieves.
    JEL: C01 C13 C14 C22
    Date: 2012–12–05
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20120133&r=ets
  8. By: Andrzej Kociecki (National Bank of Poland); Michał Rubaszek (National Bank of Poland; Warsaw School of Economics); Michele Ca' Zorzi (European Central Bank)
    Abstract: The paper provides a novel Bayesian methodological framework to estimate structural VAR (SVAR) models with recursive identification schemes, allowing for the inclusion of overidentifying restrictions. The proposed framework enables the researcher to (i) elicit the prior on the non-zero contemporaneous relations between economic variables and (ii) derive an analytical expression for the posterior distribution and marginal data density. We illustrate our methodological framework by estimating a backward-looking New Keynesian model, taking into account prior beliefs about the contemporaneous coefficients in the Phillips curve and Taylor rule.
    Keywords: Structural VAR, Bayesian inference, overidentifying restrictions
    JEL: C11 C32 E47
    Date: 2012–11
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20121492&r=ets
  9. By: Domenico Giannone (Université Libre de Bruxelles; CEPR - Centre for Economic Policy Research); Michele Lenza (European Central Bank); Giorgio E. Primiceri (Northwestern University; CEPR - Centre for Economic Policy Research; NBER - the National Bureau of Economic Research)
    Abstract: Vector autoregressions (VARs) are flexible time series models that can capture complex dynamic interrelationships among macroeconomic variables. However, their dense parameterization leads to unstable inference and inaccurate out-of-sample forecasts, particularly for models with many variables. A solution to this problem is to use informative priors in order to shrink the richly parameterized unrestricted model towards a parsimonious naïve benchmark, and thus reduce estimation uncertainty. This paper studies the optimal choice of the informativeness of these priors, which we treat as additional parameters, in the spirit of hierarchical modeling. This approach is theoretically grounded, easy to implement, and greatly reduces the number and importance of subjective choices in the setting of the prior. Moreover, it performs very well both in terms of out-of-sample forecasting (even when compared to factor models) and accuracy in the estimation of impulse response functions.
    Keywords: Forecasting, Bayesian methods, Marginal likelihood, Hierarchical modeling, Impulse responses
    JEL: C11 C32 C53 E37
    Date: 2012–11
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20121494&r=ets
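    The hierarchical idea above can be seen in miniature in a conjugate Gaussian regression, where the marginal likelihood of the prior tightness is available in closed form and can simply be maximised. This is only an analogy for the paper's BVAR setting with Minnesota-type priors:

        # Empirical-Bayes flavor of hierarchical prior selection: choose the
        # prior standard deviation tau of beta ~ N(0, tau^2 I) by maximising
        # the closed-form marginal likelihood y | tau ~ N(0, s^2 I + tau^2 XX').
        import numpy as np
        from scipy.optimize import minimize_scalar
        from scipy.stats import multivariate_normal

        rng = np.random.default_rng(2)
        n, k, s = 80, 15, 1.0
        X = rng.normal(size=(n, k))
        beta = np.zeros(k)
        beta[:3] = [1.0, -0.5, 0.25]        # mostly-zero truth: shrinkage helps
        y = X @ beta + rng.normal(0, s, n)

        def neg_log_ml(log_tau):
            tau2 = np.exp(2 * log_tau)
            cov = s**2 * np.eye(n) + tau2 * X @ X.T
            return -multivariate_normal(np.zeros(n), cov).logpdf(y)

        opt = minimize_scalar(neg_log_ml, bounds=(-5, 3), method="bounded")
        print("marginal-likelihood-optimal prior sd:", np.exp(opt.x))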
  10. By: Deepa Dhume Datta; Wenxin Du
    Abstract: The Newey and West (1987) estimator has become the standard way to estimate a heteroskedasticity and autocorrelation consistent (HAC) covariance matrix, but it does not immediately apply to time series with missing observations. We demonstrate that the intuitive approach of estimating the true spectrum of the underlying process using only the observed data leads to incorrect inference. Instead, we propose two simple consistent HAC estimators for time series with missing data. First, we develop the Amplitude Modulated estimator by applying the Newey-West estimator and treating the missing observations as non-serially correlated. Second, we develop the Equal Spacing estimator by applying the Newey-West estimator to the series formed by treating the data as equally spaced. We show asymptotic consistency of both estimators for inference purposes and discuss the finite-sample variance and bias tradeoff. In Monte Carlo simulations, we demonstrate that the Equal Spacing estimator is preferred in most cases due to its lower bias, while the Amplitude Modulated estimator is preferred for small sample sizes and low autocorrelation due to its lower variance.
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:fip:fedgif:1060&r=ets
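    Both estimators above reduce to running Newey-West on a suitably rearranged series, which makes a sketch straightforward. Kernel, centering and scaling details below are my reading of the abstract, not the paper's exact formulas:

        # Equal Spacing vs. Amplitude Modulated HAC for a series with gaps.
        import numpy as np

        def newey_west(v, L):
            """Bartlett-kernel long-run variance of a demeaned vector v."""
            n = len(v)
            omega = v @ v / n
            for j in range(1, L + 1):
                omega += 2.0 * (1.0 - j / (L + 1.0)) * (v[j:] @ v[:-j]) / n
            return omega

        def es_hac(x, L):
            """Equal Spacing: close the gaps, treat data as equally spaced."""
            v = x[~np.isnan(x)]
            return newey_west(v - v.mean(), L)

        def am_hac(x, L):
            """Amplitude Modulated: zero-fill the gaps (missing treated as
            non-serially correlated), rescale by the share observed."""
            obs = ~np.isnan(x)
            v = np.where(obs, x - np.nanmean(x), 0.0)
            return newey_west(v, L) * len(x) / obs.sum()

        rng = np.random.default_rng(3)
        z = np.zeros(2000)
        for t in range(1, 2000):
            z[t] = 0.5 * z[t - 1] + rng.normal()    # AR(1), phi = 0.5
        z[rng.random(2000) < 0.2] = np.nan          # 20% missing at random
        print(es_hac(z, 10), am_hac(z, 10))         # truth: 1/(1-0.5)^2 = 4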
  11. By: Tucker McElroy; Michael W. McCracken
    Abstract: This paper develops the theory of multi-step ahead forecasting for vector time series that exhibit temporal nonstationarity and co-integration. We treat the case of a semi-infinite past, developing the forecast filters and the forecast error filters explicitly, and also provide formulas for forecasting from a finite-sample of data. This latter application can be accomplished by the use of large matrices, which remains practicable when the total sample size is moderate. Expressions for Mean Square Error of forecasts are also derived, and can be implemented readily. Three diverse data applications illustrate the flexibility and generality of these formulas: forecasting Euro Area macroeconomic aggregates; backcasting fertility rates by racial category; and forecasting regional housing starts using a seasonally co-integrated model.
    Keywords: Econometric models ; Economic forecasting
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:fip:fedlwp:2012-060&r=ets
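    For contrast with the general treatment above (semi-infinite past, cointegration, exact finite-sample matrix formulas), the textbook special case iterates the companion matrix of a stationary VAR(p) h times:

        # h-step forecast of a VAR(p) via its companion form: stack the last
        # p observations, multiply by the companion matrix h times, read off
        # the first k entries. Coefficients below are made up for illustration.
        import numpy as np

        def var_forecast(A_list, y_hist, h):
            k, p = A_list[0].shape[0], len(A_list)
            C = np.zeros((k * p, k * p))            # companion matrix
            C[:k, :] = np.hstack(A_list)
            C[k:, :-k] = np.eye(k * (p - 1))
            z = np.concatenate([y_hist[-i] for i in range(1, p + 1)])
            for _ in range(h):
                z = C @ z
            return z[:k]                            # forecast of y_{T+h}

        A1 = np.array([[0.5, 0.1], [0.0, 0.3]])
        A2 = np.array([[0.2, 0.0], [0.1, 0.2]])
        y = np.array([[1.0, 0.5], [0.8, 0.4], [0.9, 0.6]])
        print(var_forecast([A1, A2], y, h=4))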
  12. By: Makram El-Shagi; Gregor von Schweinitz
    Abstract: Due to the recent financial crisis, interest in econometric models that can incorporate binary variables (such as the occurrence of a crisis) has surged. This paper evaluates the performance of the Qual VAR, i.e. a VAR model including a latent variable that governs the behavior of an observable binary variable. While we find that the Qual VAR performs reasonably well in forecasting (outperforming a probit benchmark), there are substantial identification problems. Therefore, when the economic interpretation of the dynamic behavior of the latent variable and the chain of causality matter, the Qual VAR is inadvisable.
    Keywords: binary choice model, Gibbs sampling, latent variable, MCMC, method evaluation
    JEL: C15 C35 E37
    Date: 2012–12
    URL: http://d.repec.org/n?u=RePEc:iwh:dispap:12-12&r=ets
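    The latent variable above is typically handled by Gibbs sampling with data augmentation. The snippet shows only the elementary building block (drawing the latent index from a normal truncated to the side implied by the observed binary outcome); the surrounding VAR steps of the sampler are omitted:

        # One data-augmentation draw: z | y ~ N(mean, sd^2) truncated at 0,
        # with the truncation side determined by the binary observation y.
        from scipy.stats import truncnorm

        def draw_latent(mean, sd, crisis):
            if crisis:   # y = 1  =>  z > 0
                a, b = (0.0 - mean) / sd, float("inf")
            else:        # y = 0  =>  z <= 0
                a, b = float("-inf"), (0.0 - mean) / sd
            return truncnorm.rvs(a, b, loc=mean, scale=sd)

        print(draw_latent(mean=-0.3, sd=1.0, crisis=True))  # positive draw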
  13. By: Ginters Buss
    Abstract: The paper studies the regularised direct filter approach as a tool for high-dimensional filtering and real-time signal extraction. It is shown that the regularised filter is able to process high-dimensional data sets by controlling for effective degrees of freedom, and that it is computationally fast. The paper illustrates the features of the filter by tracking the medium-to-long-run component in GDP growth for the euro area, including replication of Eurocoin-type behavior as well as producing more timely indicators. A further robustness check is performed on a less homogeneous dataset for Latvia. The resulting real-time indicators are found to track economic activity in a timely and robust manner. The regularised direct filter approach can thus be considered a promising tool for both concurrent estimation and forecasting with high-dimensional datasets, and a decent alternative to the dynamic factor methodology.
    Keywords: high-dimensional filtering, real-time estimation, coincident indicator, leading indicator, parameter shrinkage, business cycles, dynamic factor model
    JEL: C13 C32 E32 E37
    Date: 2012–12–27
    URL: http://d.repec.org/n?u=RePEc:ltv:wpaper:201206&r=ets
  14. By: Chihwa Kao (Center for Policy Research, Maxwell School, Syracuse University, Syracuse, NY 13244-1020); Lorenzo Trapani (Cass Business School); Giovanni Urga (Cass Business School, City University London)
    Abstract: We investigate the issue of testing for structural breaks in large cointegrated panels with common and idiosyncratic regressors. We prove a panel Functional Central Limit Theorem. We show that the estimated coefficients of the common regressors have a mixed normal distribution, whilst the estimated coefficients of the idiosyncratic regressors have a normal distribution. We consider strong dependence across the idiosyncratic regressors by allowing for the presence of (stationary and nonstationary) common factors. We show that tests based on transformations of Wald-type statistics have power versus alternatives of order
    Keywords: Structural change, Panel cointegration, Common trends
    JEL: C23
    Date: 2012–02
    URL: http://d.repec.org/n?u=RePEc:max:cprwps:135&r=ets
  15. By: Badi H. Baltagi (Center for Policy Research, Maxwell School, Syracuse University, 426 Eggers Hall, Syracuse, NY 13244-1020); Long Liu (The University of Texas at San Antonio)
    Abstract: This paper modifies the Hausman and Taylor (1981) panel data estimator to allow for serial correlation in the remainder disturbances. It demonstrates the gains in efficiency of this estimator versus the standard panel data estimators that ignore serial correlation using Monte Carlo experiments.
    Keywords: Panel Data, Fixed Effects, Random Effects, Instrumental Variables, Serial Correlation
    JEL: C32
    Date: 2012–03
    URL: http://d.repec.org/n?u=RePEc:max:cprwps:136&r=ets
  16. By: Badi H. Baltagi (Center for Policy Research, Maxwell School, Syracuse University, 426 Eggers Hall, Syracuse, NY 13244-1020); Long Liu (The University of Texas at San Antonio)
    Abstract: This paper considers the problem of estimation and forecasting in a panel data model with random individual effects and AR(p) remainder disturbances. It utilizes a simple exact transformation for the AR(p) time series process derived by Baltagi and Li (1994) and obtains the generalized least squares estimator for this panel model as a least squares regression. This exact transformation is also used in conjunction with Goldberger's (1962) result to derive an analytic expression for the best linear unbiased predictor. The performance of this predictor is investigated using Monte Carlo experiments and illustrated using an empirical example.
    Keywords: Prediction, Panel Data, Random Effects, Serial Correlation, AR(p)
    JEL: C32
    Date: 2012–07
    URL: http://d.repec.org/n?u=RePEc:max:cprwps:138&r=ets
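    The Baltagi-Li transformation referred to above generalizes the familiar Prais-Winsten idea to AR(p); the AR(1) special case below is only a reminder of what such an exact transformation does:

        # Exact AR(1) whitening: scale the first observation, quasi-difference
        # the rest; the transformed series is serially uncorrelated.
        import numpy as np

        def ar1_transform(x, rho):
            out = np.empty_like(x, dtype=float)
            out[0] = np.sqrt(1.0 - rho**2) * x[0]   # exact first observation
            out[1:] = x[1:] - rho * x[:-1]          # quasi-differencing
            return out

        rng = np.random.default_rng(4)
        u = np.zeros(500)
        for t in range(1, 500):
            u[t] = 0.7 * u[t - 1] + rng.normal()    # AR(1) disturbance
        e = ar1_transform(u, 0.7)
        print(np.corrcoef(e[1:], e[:-1])[0, 1])     # near zero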
  17. By: Badi H. Baltagi (Center for Policy Research, Maxwell School, Syracuse University, 426 Eggers Hall, Syracuse, NY 13244-1020); Georges Bresson (Université Panthéon-Assas (Paris II))
    Abstract: This paper suggests a robust Hausman and Taylor (1981) estimator, hereafter HT, that deals with the possible presence of outliers. This entails two modifications of the classical HT estimator. The first modification uses the Bramati and Croux (2007) robust Within MS estimator instead of the Within estimator in the first stage of the HT estimator. The second modification uses the robust Wagenvoort and Waldmann (2002) two-stage generalized MS estimator instead of the 2SLS estimator in the second step of the HT estimator. Monte Carlo simulations show that, in the presence of vertical outliers or bad leverage points, the robust HT estimator yields large gains in MSE as compared to its classical Hausman-Taylor counterpart. We illustrate this robust version of the Hausman-Taylor estimator using an empirical application.
    Keywords: Bad leverage points, Hausman-Taylor, panel data, two-stage generalized MS estimator, vertical outliers
    JEL: C23 C26
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:max:cprwps:140&r=ets
  18. By: Badi H. Baltagi (Center for Policy Research, Maxwell School, Syracuse University, 426 Eggers Hall, Syracuse, NY 13244-1020); Peter H. Egger (Centre for Economic Policy Research (CEPR); London, United Kingdom); Michaela Kesina (ETH Zurich)
    Abstract: This paper considers a Hausman and Taylor (1981) panel data model that exhibits a Cliff and Ord (1973) spatial error structure. We analyze the small sample properties of a generalized moments estimation approach for that model. This spatial Hausman-Taylor estimator allows for endogeneity of the time-varying and time-invariant variables with the individual effects. For this model, the spatial effects estimator is known to be consistent, but its disadvantage is that it wipes out the effects of time-invariant variables, which are important for most empirical studies. Monte Carlo results show that the spatial Hausman-Taylor estimator performs well in small samples.
    Keywords: Hausman-Taylor estimator, Spatial random effects, Small sample properties
    JEL: C23 C31
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:max:cprwps:141&r=ets
  19. By: Jan F. KIVIET (Division of Economics, Nanyang Technological University, Singapore 637332, Singapore); Garry D.A. PHILLIPS (Cardiff Business School, Aberconway Building, Colum Drive, CF10 3EU, Cardiff, Wales, UK)
    Abstract: In dynamic regression models, conditional maximum likelihood (least-squares) coefficient and variance estimators are biased. From expansions of the coefficient variance and its estimator we obtain an approximation to the bias in variance estimation and a bias-corrected variance estimator, for both the standard and a bias-corrected coefficient estimator. These enable a comparison of their mean squared errors to second order. We formally derive sufficient conditions for admissibility of these approximations. Illustrative numerical and simulation results are presented on bias reduction of coefficient and variance estimation for three relevant classes of first-order autoregressive models, supplemented by effects on mean squared errors, test size and size-corrected power. These indicate that substantial biases do occur in moderately large samples, but they can be mitigated substantially and may also yield mean squared error reduction. Crude asymptotic tests are cursed by huge size distortions. However, operational bias corrections of both the estimates of coefficients and their estimated variance are shown to curb type I errors reasonably well.
    Keywords: higher-order asymptotic expansions, bias correction, efficiency gains, lagged dependent variables, finite sample moments, size improvement
    JEL: C13 C22
    Date: 2012–06
    URL: http://d.repec.org/n?u=RePEc:nan:wpaper:1206&r=ets
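    A quick Monte Carlo conveys the size of the finite-sample bias the paper corrects, using least squares on a stable AR(1) with intercept; the -(1 + 3*rho)/T line is Kendall's classic first-order approximation, quoted here only as a familiar benchmark, not as the paper's expansion:

        # Simulated bias of the LS autoregressive coefficient vs. Kendall's
        # first-order approximation E(rho_hat) - rho ~ -(1 + 3 rho)/T.
        import numpy as np

        rng = np.random.default_rng(5)
        rho, T, reps = 0.8, 50, 20000
        est = np.empty(reps)
        for r in range(reps):
            y = np.zeros(T)
            for t in range(1, T):
                y[t] = rho * y[t - 1] + rng.normal()
            x, z = y[:-1] - y[:-1].mean(), y[1:] - y[1:].mean()  # intercept out
            est[r] = (x @ z) / (x @ x)

        print("simulated bias:", est.mean() - rho)
        print("Kendall approx:", -(1 + 3 * rho) / T)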
  20. By: ARI, YAKUP
    Abstract: Non-linearity is a general characteristic of financial series; thus, common non-linear models such as GARCH, EGARCH and TGARCH are used to model the volatility of the data. In addition, the continuous-time GARCH (COGARCH) model, an extension and analogue of the discrete-time GARCH process, is a newer approach to volatility modelling and derivative pricing. Like GARCH, COGARCH has a single source of variability, but it is built on a driving Lévy process, whose increments replace the innovations of the discrete-time model. In this study, the model that best represents the volatility of the USD/TRY foreign exchange rate is identified over different periods from January 2009 to December 2011.
    Keywords: Volatility, Lévy process, GARCH, EGARCH, TGARCH, COGARCH
    JEL: C01
    Date: 2012–05
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:43330&r=ets
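    One possible toolchain for the discrete-time side of the comparison above is the Python 'arch' package (the author's software is not stated, and COGARCH needs specialised tools such as the R package 'yuima', so it is not fitted here). Replace the simulated series with USD/TRY returns:

        # Fit GARCH, EGARCH and a threshold GARCH (TARCH/ZARCH: power=1, o=1)
        # to a return series and compare information criteria.
        import numpy as np
        from arch import arch_model

        rng = np.random.default_rng(6)
        ret = rng.standard_t(8, 750) * 0.6   # stand-in for daily FX returns

        specs = {
            "GARCH":  dict(vol="GARCH", p=1, q=1),
            "EGARCH": dict(vol="EGARCH", p=1, q=1),
            "TGARCH": dict(vol="GARCH", p=1, o=1, q=1, power=1.0),
        }
        for name, kw in specs.items():
            res = arch_model(ret, **kw).fit(disp="off")
            print(name, "AIC =", round(res.aic, 2))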

This nep-ets issue is ©2013 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.