nep-ets New Economics Papers
on Econometric Time Series
Issue of 2008‒11‒18
ten papers chosen by
Yong Yin
SUNY at Buffalo

  1. Optimal Linear Filtering, Smoothing and Trend Extraction for m-period Differences of Processes with a Unit Root By Dimitrios Thomakos
  2. Opening the Black Box: Structural Factor Models with Large Cross-Sections By Mario Forni; Domenico Giannone; Marco Lippi; Lucrezia Reichlin
  3. Large Bayesian VARs By Marta Banbura; Domenico Giannone; Lucrezia Reichlin
  4. A Quasi Maximum Likelihood Approach for Large Approximate Dynamic Factor Models By Catherine Doz; Domenico Giannone; Lucrezia Reichlin
  5. Nonlinear Adjustment in US Bond Yields: an Empirical Analysis with Conditional Heteroskedasticity By Lucchetti, Riccardo; Palomba, Giulio
  6. On the Correlation Structure of Microstructure Noise in Theory and Practice By Francis X. Diebold; Georg H. Strasser
  7. Temporal aggregation of univariate and multivariate time series models: A survey By Andrea Silvestrini; David Veredas
  8. Short-Term Forecasts of Euro Area GDP Growth By Elena Angelini; Gonzalo Camba-Mendez; Domenico Giannone; Lucrezia Reichlin; Gerhard Rünstler
  9. Calibration of local volatility using the local and implied instantaneous variance By Gabriel Turinici
  10. Estimation of Dynamic Models with Nonparametric Simulated Maximum Likelihood By Dennis Kristensen; Yongseok Shin

  1. By: Dimitrios Thomakos
    Abstract: In this paper I consider the problem of optimal linear filtering, smoothing and trend extraction for m-period differences of processes with a unit root. Such processes arise naturally in economics and finance, in the form of rates of change (price inflation, economic growth, financial returns) and finding an appropriate smoother is thus of immediate practical interest. The filter and resulting smoother are based on the methodology of Singular Spectrum Analysis (SSA) and their form and properties are examined in detail. In particular, I find explicit representations for the asymptotic decomposition of the covariance matrix and show that the first two leading eigenvalues of the decomposition account for over 90% of the variability of the process. I examine the structure of the impulse and frequency response functions, finding that the optimal filter has a “permanent” and a “transitory” component, with the corresponding smoother being the sum of two such components. I also find explicit representations for the extrapolation coefficients that can be used in out-of-sample prediction. The methodology of the paper is illustrated with three short empirical applications using data on U.S. inflation and real GDP growth and data on the Euro/US dollar exchange rate. Finally, the paper contains a new technical result: I derive explicit representations for the filtering weights in the context of SSA for an arbitrary covariance matrix. This result allows one to examine specific effects of smoothing in any situation and has not appeared so far, to the best of my knowledge, in the related literature.
    Keywords: core inflation, business cycles, differences, euro, linear filtering, singular spectrum analysis, smoothing, trading strategies, trend extraction and prediction, unit root.
    Date: 2008
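    As a rough illustration of the SSA machinery the abstract builds on (not the paper's own filter, and with hypothetical parameter choices), the basic embed–decompose–reconstruct cycle can be sketched as follows: form the Hankel trajectory matrix, keep the leading singular components, and map back to a series by diagonal averaging.

    ```python
    import numpy as np

    def ssa_smooth(x, window, n_components):
        """Basic SSA reconstruction: embed, decompose, reconstruct."""
        n = len(x)
        k = n - window + 1
        # Trajectory (Hankel) matrix: columns are lagged windows of the series.
        X = np.column_stack([x[i:i + window] for i in range(k)])
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        # Keep the leading components (the paper finds that two account
        # for most of the variability in its setting).
        X_hat = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
        # Diagonal averaging maps the rank-reduced matrix back to a series.
        smoothed = np.zeros(n)
        counts = np.zeros(n)
        for i in range(window):
            for j in range(k):
                smoothed[i + j] += X_hat[i, j]
                counts[i + j] += 1
        return smoothed / counts

    rng = np.random.default_rng(0)
    t = np.arange(300)
    trend = 0.01 * t
    x = trend + rng.normal(0, 0.5, size=300)
    sm = ssa_smooth(x, window=24, n_components=2)
    ```

    The window length and number of retained components here are illustrative; the paper derives the optimal choices for m-period differences of unit-root processes analytically.
    
    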
  2. By: Mario Forni; Domenico Giannone; Marco Lippi; Lucrezia Reichlin
    Abstract: This paper shows how large-dimensional dynamic factor models are suitable for structural analysis. We argue that all identification schemes employed in SVAR analysis can be easily adapted in dynamic factor models. Moreover, the “problem of fundamentalness”, which is intractable in structural VARs, can be solved, provided that the impulse-response functions are sufficiently heterogeneous. We provide consistent estimators for the impulse-response functions, as well as (n, T) rates of convergence. An exercise with US macroeconomic data shows that our solution of the fundamentalness problem may have important empirical consequences.
    Keywords: Dynamic factor models, structural VARs, identification, fundamentalness
    JEL: E0 C1
    Date: 2008
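    The simplest identification scheme the abstract says carries over from SVARs to factor models is a Cholesky (recursive) ordering. As a minimal sketch, assuming a bivariate reduced-form VAR(1) with known coefficients (the numbers are made up for illustration), orthogonalized impulse responses are:

    ```python
    import numpy as np

    # Reduced-form VAR(1): y_t = A y_{t-1} + u_t, Cov(u_t) = Sigma.
    A = np.array([[0.5, 0.2],
                  [0.0, 0.4]])
    Sigma = np.array([[1.0, 0.3],
                      [0.3, 1.0]])
    # Cholesky factor maps orthogonal structural shocks to reduced-form errors.
    P = np.linalg.cholesky(Sigma)

    H = 12
    irf = np.zeros((H, 2, 2))   # irf[h, i, j]: response of var i to shock j at horizon h
    Ah = np.eye(2)
    for h in range(H):
        irf[h] = Ah @ P          # impact matrix propagated through the VAR dynamics
        Ah = A @ Ah
    ```

    In the factor-model case the same logic applies to the impulse responses of the common components rather than of the observables directly.
    
    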
  3. By: Marta Banbura; Domenico Giannone; Lucrezia Reichlin
    Abstract: This paper shows that Vector Autoregression with Bayesian shrinkage is an appropriate tool for large dynamic models. We build on the results by De Mol, Giannone, and Reichlin (2008) and show that, when the degree of shrinkage is set in relation to the cross-sectional dimension, the forecasting performance of small monetary VARs can be improved by adding additional macroeconomic variables and sectoral information. In addition, we show that large VARs with shrinkage produce credible impulse responses and are suitable for structural analysis.
    Keywords: Bayesian VAR, Forecasting, Monetary VAR, large cross-sections
    JEL: C11 C13 C33 C53
    Date: 2008
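    The key idea — shrinkage that scales with the cross-section — can be sketched with a ridge-penalized VAR. This is only a stand-in for the Normal/Minnesota-type priors used in the paper (a zero-mean Gaussian prior on the coefficients is equivalent to ridge regression); the simulated system and the penalty value below are illustrative.

    ```python
    import numpy as np

    def ridge_var(Y, p, lam):
        """VAR(p) estimated equation-by-equation with a ridge penalty,
        i.e. a zero-centered Gaussian prior on the coefficients."""
        T, n = Y.shape
        # Regressor matrix: intercept plus p lags of all variables.
        Z = np.column_stack([Y[p - j - 1:T - j - 1] for j in range(p)])
        Z = np.column_stack([np.ones(T - p), Z])
        Yp = Y[p:]
        penalty = lam * np.eye(Z.shape[1])
        penalty[0, 0] = 0.0              # do not shrink the intercept
        return np.linalg.solve(Z.T @ Z + penalty, Z.T @ Yp)

    rng = np.random.default_rng(1)
    A = np.array([[0.5, 0.1, 0.0],
                  [0.0, 0.4, 0.1],
                  [0.1, 0.0, 0.3]])
    T, n = 500, 3
    Y = np.zeros((T, n))
    for t in range(1, T):
        Y[t] = Y[t - 1] @ A.T + rng.normal(size=n)
    B = ridge_var(Y, p=1, lam=0.1)       # columns: coefficients per equation
    ```

    The paper's point is that as more variables are added, the degree of shrinkage (here `lam`) must rise with the cross-sectional dimension to keep forecasts well behaved.
    
    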
  4. By: Catherine Doz; Domenico Giannone; Lucrezia Reichlin
    Abstract: Is maximum likelihood suitable for factor models in large cross-sections of time series? We answer this question from both an asymptotic and an empirical perspective. We show that estimates of the common factors based on maximum likelihood are consistent for the size of the cross-section (n) and the sample size (T) going to infinity along any path of n and T and that therefore maximum likelihood is viable for n large. The estimator is robust to misspecification of the cross-sectional and time series correlation of the idiosyncratic components. In practice, the estimator can be easily implemented using the Kalman smoother and the EM algorithm as in traditional factor analysis.
    Keywords: Factor Model, large cross-sections, Quasi Maximum Likelihood
    JEL: C51 C32 C33
    Date: 2008
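    The filtering step at the heart of the approach can be illustrated with a one-factor Kalman filter on a simulated panel. This is a toy version: the parameters are treated as known here, whereas in the paper they would be estimated by the EM algorithm, and the dimensions are made up.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    T, n = 300, 20
    a, q = 0.8, 1.0                       # factor AR coefficient and innovation variance
    lam = rng.uniform(0.5, 1.5, size=n)   # factor loadings
    f = np.zeros(T)
    for t in range(1, T):
        f[t] = a * f[t - 1] + rng.normal(scale=np.sqrt(q))
    X = np.outer(f, lam) + rng.normal(size=(T, n))   # idiosyncratic variance 1

    # Kalman filter for the scalar factor given the parameters.
    R = np.eye(n)
    f_filt = np.zeros(T)
    P = q / (1 - a**2)                    # stationary prior variance
    f_hat = 0.0
    for t in range(T):
        if t > 0:                         # prediction step
            f_hat, P = a * f_hat, a**2 * P + q
        S = P * np.outer(lam, lam) + R    # innovation covariance
        K = P * np.linalg.solve(S, lam)   # Kalman gain
        f_hat = f_hat + K @ (X[t] - lam * f_hat)
        P = P * (1 - K @ lam)
        f_filt[t] = f_hat
    ```

    With even a moderate cross-section the filtered factor tracks the true one closely, which is the intuition behind consistency as n grows.
    
    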
  5. By: Lucchetti, Riccardo; Palomba, Giulio
    Abstract: Starting from the work by Campbell and Shiller (1987), empirical analysis of interest rates has been conducted in the framework of cointegration. However, parts of this approach have been questioned recently, as the adjustment mechanism may not follow a simple linear rule; another line of criticism points out that stationarity of the spreads is difficult to maintain empirically. In this paper, we analyse data on US bond yields by means of an augmented VAR specification which approximates a generic nonlinear adjustment model. We argue that nonlinearity captures macro information via the shape of the yield curve and thus provides an alternative explanation for some findings that have recently appeared in the literature. Moreover, we show how conditional heteroskedasticity can be taken into account via GARCH specifications for the conditional variance, both univariate and multivariate.
    Keywords: interest rates; cointegration; nonlinear adjustment; conditional heteroskedasticity
    JEL: C51 C32 E43
    Date: 2008
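    The univariate building block of the conditional-heteroskedasticity modelling mentioned in the abstract is the GARCH(1,1) variance recursion. A minimal sketch (parameter values are illustrative, not from the paper):

    ```python
    import numpy as np

    def garch11_variance(eps, omega, alpha, beta):
        """GARCH(1,1) recursion: h_t = omega + alpha*eps_{t-1}^2 + beta*h_{t-1}."""
        h = np.empty_like(eps)
        h[0] = omega / (1 - alpha - beta)   # start at the unconditional variance
        for t in range(1, len(eps)):
            h[t] = omega + alpha * eps[t - 1] ** 2 + beta * h[t - 1]
        return h

    # Simulate a GARCH(1,1) process and recover its conditional variance path.
    rng = np.random.default_rng(3)
    T = 5000
    omega, alpha, beta = 0.1, 0.1, 0.8
    eps = np.zeros(T)
    h = np.zeros(T)
    h[0] = omega / (1 - alpha - beta)
    eps[0] = np.sqrt(h[0]) * rng.normal()
    for t in range(1, T):
        h[t] = omega + alpha * eps[t - 1] ** 2 + beta * h[t - 1]
        eps[t] = np.sqrt(h[t]) * rng.normal()
    h_rec = garch11_variance(eps, omega, alpha, beta)
    ```

    The multivariate extensions used in the paper generalize the same recursion to a conditional covariance matrix.
    
    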
  6. By: Francis X. Diebold (Department of Economics, University of Pennsylvania); Georg H. Strasser (Department of Economics, Boston College)
    Abstract: We argue for incorporating the financial economics of market microstructure into the financial econometrics of asset return volatility estimation. In particular, we use market microstructure theory to derive the cross-correlation function between latent returns and market microstructure noise, which feature prominently in the recent volatility literature. The cross-correlation at zero displacement is typically negative, and cross-correlations at nonzero displacements are positive and decay geometrically. If market makers are sufficiently risk averse, however, the cross-correlation pattern is inverted. Our results are useful for assessing the validity of the frequently-assumed independence of latent price and microstructure noise, for explaining observed cross-correlation patterns, for predicting as-yet undiscovered patterns, and for making informed conjectures as to improved volatility estimation methods.
    Keywords: Realized volatility, Market microstructure theory, High-frequency data, Financial econometrics
    JEL: G14 G20 D82 D83 C51
    Date: 2008–10–09
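    The benchmark case the paper departs from — iid noise independent of the latent price — is easy to simulate, and already shows why noise matters for volatility estimation: it induces a negative first-order autocovariance in observed returns. A sketch (the variances are arbitrary illustrative values):

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    T = 100_000
    sigma_r, sigma_u = 0.01, 0.005
    latent = np.cumsum(rng.normal(0, sigma_r, T))   # efficient log-price (random walk)
    noise = rng.normal(0, sigma_u, T)               # iid microstructure noise
    obs = latent + noise
    r = np.diff(obs)                                # observed returns

    def autocov(x, lag):
        x = x - x.mean()
        return np.mean(x[:-lag] * x[lag:]) if lag else np.mean(x * x)

    # In this iid-independent-noise benchmark, gamma(1) = -sigma_u^2 and
    # higher-lag autocovariances vanish; the paper derives the richer
    # patterns that arise when noise and latent returns are correlated.
    gamma1 = autocov(r, 1)
    ```
    
    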
  7. By: Andrea Silvestrini (Bank of Italy, Economics and International Relations and CORE, Université catholique de Louvain.); David Veredas (ECARES, Université Libre de Bruxelles and CORE, Université catholique de Louvain, Belgium)
    Abstract: We present a unified and up-to-date overview of temporal aggregation techniques for univariate and multivariate time series models, explaining in detail how these techniques are employed. Some empirical applications illustrate the main issues.
    Keywords: Temporal aggregation, ARIMA, Seasonality, GARCH, Vector ARMA, Spurious causality, Multivariate GARCH
    JEL: C10 C22 C32 C43
    Date: 2008–08
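    A small sketch of the basic operation the survey studies: aggregating a flow variable by summing non-overlapping blocks, here a monthly AR(1) taken to quarterly frequency (the AR coefficient is illustrative). Aggregation changes the dynamics — the quarterly series is less persistent at lag one than the monthly one, and in fact becomes an ARMA process.

    ```python
    import numpy as np

    def aggregate_flow(x, m):
        """Temporal aggregation of a flow: sum non-overlapping blocks of m periods."""
        k = len(x) // m
        return x[:k * m].reshape(k, m).sum(axis=1)

    rng = np.random.default_rng(7)
    T, phi = 30_000, 0.9
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = phi * x[t - 1] + rng.normal()
    q = aggregate_flow(x, 3)             # monthly -> quarterly

    def acf1(z):
        z = z - z.mean()
        return np.mean(z[:-1] * z[1:]) / np.mean(z * z)
    ```

    For a stock variable one would instead sample every m-th observation, which the survey treats separately.
    
    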
  8. By: Elena Angelini; Gonzalo Camba-Mendez; Domenico Giannone; Lucrezia Reichlin; Gerhard Rünstler
    Abstract: This paper evaluates models that exploit timely monthly releases to compute early estimates of current quarter GDP (now-casting) in the euro area. We compare traditional methods used at institutions with a new method proposed by Giannone, Reichlin, and Small (2005). The method consists in bridging quarterly GDP with monthly data via a regression on factors extracted from a large panel of monthly series with different publication lags. We show that bridging via factors produces more accurate estimates than traditional bridge equations. We also show that survey data and other ‘soft’ information are valuable for now-casting.
    Keywords: Forecasting, Monetary Policy, Factor Model, Real Time Data, Large data-sets, News
    JEL: E52 C33 C53
    Date: 2008
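    The bridging-via-factors idea can be sketched in three steps on simulated data: extract a common factor from a monthly panel by principal components, aggregate it to quarterly frequency, and regress quarterly GDP growth on it. All dimensions and coefficients below are made up; the actual method handles ragged publication lags, which this toy ignores.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    months, n = 240, 30
    f = np.zeros(months)
    for t in range(1, months):
        f[t] = 0.7 * f[t - 1] + rng.normal()
    panel = np.outer(f, rng.uniform(0.5, 1.5, n)) + rng.normal(size=(months, n))

    # Step 1: extract a factor from the standardized monthly panel by PCA.
    Z = (panel - panel.mean(0)) / panel.std(0)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    f_hat = U[:, 0] * s[0]

    # Step 2: aggregate the monthly factor to quarterly frequency (3-month mean).
    fq = f_hat.reshape(-1, 3).mean(axis=1)

    # Quarterly GDP growth driven by the quarterly average of the true factor.
    gdp = 0.5 * f.reshape(-1, 3).mean(axis=1) + 0.2 * rng.normal(size=months // 3)

    # Step 3: bridge regression of GDP growth on the estimated quarterly factor.
    X = np.column_stack([np.ones(len(fq)), fq])
    beta = np.linalg.lstsq(X, gdp, rcond=None)[0]
    fitted = X @ beta
    ```

    The paper's finding is that this factor-based bridge beats traditional bridge equations built on a handful of individual monthly indicators.
    
    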
  9. By: Gabriel Turinici (CEREMADE - CEntre de REcherches en MAthématiques de la DEcision - CNRS : UMR7534 - Université Paris Dauphine - Paris IX)
    Abstract: We document the calibration of the local volatility through a functional to be optimized; our calibration variables are the local and implied instantaneous variances whose theoretical properties are fully explored within a particular class of volatilities. We confirm the theoretical results through a numerical procedure where we separate the parametric optimization (performed with any suitable optimization algorithm) from the computation of the functional by the use of an adjoint to obtain an approximation. The procedure performs well on benchmarks from the literature and on FOREX data.
    Keywords: calibration; local volatility; implied volatility; Dupire formula; adjoint; instantaneous local variance; instantaneous implied variance; implied variance
    Date: 2008–11–11
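    As background for the quantities being calibrated, the Dupire formula recovers the local variance from the call price surface; with zero rates it reads sigma_loc^2(K,T) = 2 (dC/dT) / (K^2 d2C/dK2). A sanity-check sketch (not the paper's adjoint-based procedure): price calls under Black–Scholes with constant volatility and verify that finite-difference Dupire recovers that volatility.

    ```python
    from math import erf, sqrt, log

    def norm_cdf(x):
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    def bs_call(S, K, T, sigma):
        """Black-Scholes call price with zero rates and dividends."""
        d1 = (log(S / K) + 0.5 * sigma**2 * T) / (sigma * sqrt(T))
        return S * norm_cdf(d1) - K * norm_cdf(d1 - sigma * sqrt(T))

    S0, sigma, K, T = 100.0, 0.2, 100.0, 1.0
    dK, dT = 0.5, 0.01
    # Finite-difference derivatives of the call surface.
    C_T = (bs_call(S0, K, T + dT, sigma) - bs_call(S0, K, T - dT, sigma)) / (2 * dT)
    C_KK = (bs_call(S0, K + dK, T, sigma) - 2 * bs_call(S0, K, T, sigma)
            + bs_call(S0, K - dK, T, sigma)) / dK**2
    # Dupire local volatility (zero-rate case).
    local_vol = sqrt(2 * C_T / (K**2 * C_KK))
    ```

    Under a flat Black–Scholes surface the recovered local volatility equals the input volatility; the paper works with the general case, where the surface must be calibrated from market quotes.
    
    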
  10. By: Dennis Kristensen; Yongseok Shin (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: We propose a simulated maximum likelihood estimator for dynamic models based on nonparametric kernel methods. Our method is designed for models without latent dynamics from which one can simulate observations but cannot obtain a closed-form representation of the likelihood function. Using the simulated observations, we nonparametrically estimate the density - which is unknown in closed form - by kernel methods, and then construct a likelihood function that can be maximized. We prove for dynamic models that this nonparametric simulated maximum likelihood (NPSML) estimator is consistent and asymptotically efficient. NPSML is applicable to general classes of models and is easy to implement in practice.
    Keywords: dynamic models, estimation, kernel density estimation, maximum-likelihood, simulation
    JEL: C13 C14 C15 C32 C35
    Date: 2008–11–13
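    The NPSML recipe — simulate from the model, kernel-estimate the unknown transition density, maximize the resulting pseudo-likelihood — can be sketched on a toy AR(1) whose density we pretend is unavailable in closed form. The bandwidth rule, simulation size, and grid search below are illustrative choices, not the paper's; fixing the simulation seed across parameter values (common random numbers) keeps the objective smooth.

    ```python
    import numpy as np

    def npsml_loglik(y, theta, S=500, seed=0):
        """Nonparametric simulated log-likelihood for a toy AR(1):
        for each t, simulate S draws of y_t given y_{t-1} under theta and
        evaluate a Gaussian-kernel density estimate at the observed y_t."""
        rng = np.random.default_rng(seed)        # common random numbers across theta
        h = 1.06 * S ** (-0.2)                   # Silverman-style bandwidth (unit scale)
        ll = 0.0
        for t in range(1, len(y)):
            sims = theta * y[t - 1] + rng.normal(size=S)   # simulate from the model
            u = (y[t] - sims) / h
            dens = np.mean(np.exp(-0.5 * u**2)) / (h * np.sqrt(2 * np.pi))
            ll += np.log(max(dens, 1e-300))
        return ll

    rng = np.random.default_rng(6)
    T, theta0 = 400, 0.6
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = theta0 * y[t - 1] + rng.normal()

    # Crude maximization by grid search (any optimizer would do).
    grid = np.arange(0.0, 1.0, 0.05)
    theta_hat = grid[np.argmax([npsml_loglik(y, th) for th in grid])]
    ```
    
    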

This nep-ets issue is ©2008 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at the NEP website. For comments, please write to the director of NEP, Marco Novarese. Put “NEP” in the subject line; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.