
on Econometric Time Series 
By:  Gautier Marti; Frank Nielsen; Philippe Donnat; Sébastien Andler 
Abstract:  The following working document summarizes our work on the clustering of financial time series. It was written for a workshop on information geometry and its application to image and signal processing. This workshop brought together several experts in pure and applied mathematics with applied researchers from medical imaging, radar signal processing and finance. The authors belong to the latter group. This document was written as a long introduction to the further development of geometric tools in financial applications such as risk or portfolio analysis. Indeed, risk and portfolio analysis essentially rely on covariance matrices. Besides the fact that the Gaussian assumption is known to be inaccurate, covariance matrices are difficult to estimate from empirical data. To filter noise from the empirical estimate, Mantegna proposed using hierarchical clustering. In this work, we first show that this procedure is statistically consistent. Then, we propose to use clustering for much broader applications than the filtering of empirical covariance matrices from the estimated correlation coefficients. To be able to do that, we need to obtain distances between the financial time series that incorporate all the available information in these cross-dependent random processes. 
Date:  2016–03 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1603.07822&r=ets 
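Mantegna's filtering step mentioned in the abstract can be sketched in a few lines: the correlation matrix of the series is mapped to the distance d_ij = sqrt(2(1 - rho_ij)) and fed to hierarchical clustering. The simulated two-factor data, the average linkage and the cluster count below are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)

# Two groups of return series driven by two independent common factors
# (simulated stand-in for financial time series).
n_obs, block = 500, 5
f1, f2 = rng.standard_normal((2, n_obs))
returns = np.column_stack(
    [f1 + 0.5 * rng.standard_normal(n_obs) for _ in range(block)]
    + [f2 + 0.5 * rng.standard_normal(n_obs) for _ in range(block)]
)

# Correlation-based distance d_ij = sqrt(2 * (1 - rho_ij)).
rho = np.corrcoef(returns, rowvar=False)
dist = np.sqrt(np.clip(2.0 * (1.0 - rho), 0.0, None))

# Average-linkage hierarchical clustering on the condensed distances.
Z = linkage(squareform(dist, checks=False), method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)
```

With a strong common factor in each block, the two groups separate cleanly; on real returns the linkage choice materially affects the filtered correlation structure.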
By:  Fabrizio Cipollini; Robert F. Engle; Giampiero M. Gallo 
Abstract:  The Multiplicative Error Model (Engle (2002)) for nonnegative valued processes is specified as the product of a (conditionally autoregressive) scale factor and an innovation process with nonnegative support. A multivariate extension allows for the innovations to be contemporaneously correlated. We overcome the lack of sufficiently flexible probability density functions for such processes by suggesting a copula function approach to estimate the parameters of the scale factors and of the correlations of the innovation processes. We illustrate this vector MEM with an application to the interactions between realized volatility, volume and the number of trades. We show that significantly superior realized volatility forecasts are delivered in the presence of other trading activity indicators and contemporaneous correlations. 
Date:  2016–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1604.01338&r=ets 
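The copula device described in the abstract can be illustrated by simulation: a Gaussian copula induces contemporaneous correlation between nonnegative innovations with unit-mean marginals. The Gamma marginals, the copula correlation of 0.6 and the fixed scale factors below are illustrative choices, not the authors' specification.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Gaussian copula linking two nonnegative innovation processes, as in the
# vector MEM x_t = diag(mu_t) eps_t with E[eps_t] = 1.
R = np.array([[1.0, 0.6],
              [0.6, 1.0]])              # copula correlation matrix
shape = 4.0                             # Gamma shape; scale 1/shape gives unit mean

z = rng.multivariate_normal(np.zeros(2), R, size=100_000)
u = stats.norm.cdf(z)                                    # correlated uniforms
eps = stats.gamma.ppf(u, a=shape, scale=1.0 / shape)     # unit-mean innovations

mu = np.array([1.5, 0.8])               # conditional scale factors (held fixed here)
x = mu * eps                            # contemporaneously correlated MEM draws
print(eps.mean(axis=0), np.corrcoef(eps, rowvar=False)[0, 1])
```

In the model proper, mu would be a conditionally autoregressive scale process rather than a constant; the copula step is what frees the marginal choice from the dependence structure.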
By:  Davide Pettenuzzo (Brandeis University); Gary Koop (University of Strathclyde); Dimitris Korobilis (University of Glasgow) 
Abstract:  Macroeconomists are increasingly working with large Vector Autoregressions (VARs) where the number of parameters vastly exceeds the number of observations. Existing approaches either involve prior shrinkage or the use of factor methods. In this paper, we develop an alternative based on ideas from the compressed regression literature. It involves randomly compressing the explanatory variables prior to analysis. A huge dimensional problem is thus turned into a much smaller, more computationally tractable one. Bayesian model averaging can be done over various compressions, attaching greater weight to compressions which forecast well. In a macroeconomic application involving up to 129 variables, we find compressed VAR methods to forecast better than either factor methods or large VAR methods involving prior shrinkage. 
Keywords:  multivariate time series, random projection, forecasting 
JEL:  C11 C32 C53 
Date:  2016–03 
URL:  http://d.repec.org/n?u=RePEc:brd:wpaper:103&r=ets 
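The compression idea can be sketched as follows: with far more regressors than observations, project the regressor matrix onto a few random directions, fit least squares in the compressed space, and combine many such compressions with fit-based weights. The dimensions, the sparse coefficient vector and the crude exponential weights are illustrative stand-ins for the paper's Bayesian model averaging.

```python
import numpy as np

rng = np.random.default_rng(2)

# k >> n: ordinary least squares on X itself is infeasible.
n, k, m = 200, 500, 20
X = rng.standard_normal((n, k))
beta = np.zeros(k)
beta[:5] = 1.0                                   # only a few regressors matter
y = X @ beta + 0.1 * rng.standard_normal(n)

def compressed_fit(X, y, m, rng):
    """One random compression followed by OLS in the compressed space."""
    Phi = rng.standard_normal((m, X.shape[1])) / np.sqrt(m)
    Xc = X @ Phi.T                               # n x m compressed regressors
    gamma, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    return Xc @ gamma

fits = np.array([compressed_fit(X, y, m, rng) for _ in range(100)])
sse = ((y - fits) ** 2).sum(axis=1)
w = np.exp(-0.5 * (sse - sse.min()))             # crude fit-based weights
w /= w.sum()
y_bar = w @ fits                                 # model-averaged in-sample fit
print(((y - y_bar) ** 2).sum() / (y ** 2).sum())
```

A single compression typically fits poorly, since a random m-dimensional subspace captures only a small share of the signal; the averaging across compressions, with more weight on those that fit well, is what makes the approach competitive.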
By:  Forni, Mario; Gambetti, Luca; Sala, Luca 
Abstract:  A shock of interest can be recovered, either exactly or with a good approximation, by means of standard VAR techniques even when the structural MA representation is non-invertible. We propose a measure of how informative a VAR model is for a specific shock of interest. We show how to use such a measure for validating the transmission mechanism of shocks in DSGE models through VARs. In an application, we validate a theory of news shocks. The theory does fairly well for all variables, but understates the long-run effects of technology news on TFP. 
Date:  2016–03 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:11178&r=ets 
By:  Helmut Lütkepohl; Anna Staszewska-Bystrova; Peter Winker 
Abstract:  This paper proposes a new nonparametric method of constructing joint confidence bands for impulse response functions of vector autoregressive models. The estimation uncertainty is captured by means of bootstrapping and the highest density region (HDR) approach is used to construct the bands. A Monte Carlo comparison of the HDR bands with existing alternatives shows that the former are competitive with the bootstrap-based Bonferroni and Wald confidence regions. The relative tightness of the HDR bands, combined with their good coverage properties, makes them attractive for applications. An application to corporate bond spreads for Germany highlights the potential for empirical work. 
Keywords:  Impulse responses, joint confidence bands, highest density region, vector autoregressive process 
JEL:  C32 
Date:  2016 
URL:  http://d.repec.org/n?u=RePEc:diw:diwwpp:dp1564&r=ets 
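A stylized version of the HDR band construction can be sketched for an AR(1) impulse response: bootstrap the IRF, rank the bootstrap paths by how dense they are, keep the (1 - alpha) densest paths, and take their pointwise envelope. As a simplification, the density is approximated here by a Gaussian (Mahalanobis) proxy rather than the kernel density estimate used in the paper; the model, sample size and coverage level are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate an AR(1) process and estimate it by OLS.
n, H, B, alpha = 300, 10, 999, 0.10
phi = 0.6
y = np.zeros(n)
eps = rng.standard_normal(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + eps[t]

def ar1_irf(series, H):
    """OLS AR(1) coefficient, its IRF phi^h, and the residuals."""
    x, z = series[:-1], series[1:]
    phi_hat = (x @ z) / (x @ x)
    return phi_hat ** np.arange(H + 1), z - phi_hat * x

irf_hat, resid = ar1_irf(y, H)

# Residual bootstrap of the impulse response paths.
paths = np.empty((B, H + 1))
for b in range(B):
    e = rng.choice(resid, size=n)
    yb = np.zeros(n)
    for t in range(1, n):
        yb[t] = irf_hat[1] * yb[t - 1] + e[t]
    paths[b] = ar1_irf(yb, H)[0]

# Density proxy: Mahalanobis distance of each path; small distance = dense.
mu = paths.mean(axis=0)
cov = np.cov(paths, rowvar=False) + 1e-10 * np.eye(H + 1)  # ridge: h=0 is constant
d2 = np.einsum("bi,ij,bj->b", paths - mu, np.linalg.inv(cov), paths - mu)
keep = paths[d2 <= np.quantile(d2, 1 - alpha)]   # densest 90% of paths
lo, hi = keep.min(axis=0), keep.max(axis=0)      # joint HDR-style band
print(np.round(lo, 2), np.round(hi, 2))
```

Because whole paths are retained or discarded, the envelope is a joint band rather than a collection of pointwise intervals, which is the feature the Monte Carlo comparison in the paper is about.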
By:  Mario Forni; Alessandro Giovannelli; Marco Lippi; Stefano Soccorsi 
Abstract:  The paper compares the pseudo real-time forecasting performance of three Dynamic Factor Models: (i) the standard principal-component model of Stock and Watson (2002a); (ii) the model based on generalized principal components of Forni et al. (2005); (iii) the model recently proposed in Forni et al. (2015) and Forni et al. (2016). We employ a large monthly dataset of macroeconomic and financial time series for the US economy, which includes the Great Moderation, the Great Recession and the subsequent recovery. Using a rolling window for estimation and prediction, we find that (iii) neatly outperforms (i) and (ii) in the Great Moderation period for both Industrial Production and Inflation, and for Inflation over the full sample. However, (iii) is outperformed by (i) and (ii) over the full sample for Industrial Production. 
Date:  2016–03 
URL:  http://d.repec.org/n?u=RePEc:eca:wpaper:2013/228908&r=ets 
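The baseline model (i), the standard principal-component Dynamic Factor Model, reduces to a diffusion-index regression: extract factors from a large standardized panel by SVD and forecast a target with lagged estimated factors. The panel size, factor persistence and loadings below are illustrative simulation choices.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate persistent latent factors and a large observed panel.
T, N, r = 240, 100, 2
F = np.zeros((T, r))
for t in range(1, T):
    F[t] = 0.8 * F[t - 1] + rng.standard_normal(r)
Lam = rng.standard_normal((N, r))
X = F @ Lam.T + rng.standard_normal((T, N))          # observed panel
y = F[:, 0] + 0.5 * rng.standard_normal(T)           # target series

# Principal-component factor estimates via SVD of the standardized panel.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
F_hat = U[:, :r] * s[:r]

# Diffusion-index forecast: regress the target on lagged estimated factors.
Z = np.column_stack([np.ones(T - 1), F_hat[:-1]])
b, *_ = np.linalg.lstsq(Z, y[1:], rcond=None)
y_fore = np.concatenate(([1.0], F_hat[-1])) @ b      # one-step-ahead forecast
print(y_fore)
```

The estimated factors are only identified up to rotation, but the forecasting regression absorbs the rotation, which is why principal components suffice for prediction.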
By:  Barigozzi, Matteo; Lippi, Marco; Luciani, Matteo 
Abstract:  We develop the econometric theory for Non-Stationary Dynamic Factor models for large panels of time series, with a particular focus on building estimators of impulse response functions to unexpected macroeconomic shocks. We derive conditions for consistent estimation of the model as both the cross-sectional size, n, and the time dimension, T, go to infinity, and whether or not cointegration is imposed. We also propose a new estimator for the nonstationary common factors, as well as an information criterion to determine the number of common trends. Finally, the numerical properties of our estimator are explored by means of a Monte Carlo exercise and of a real-data application, in which we study the effects of monetary policy and supply shocks on the US economy. 
Keywords:  Dynamic Factor model ; common trends ; impulse response functions ; unit root processes 
JEL:  C00 C01 E00 
Date:  2016–03–04 
URL:  http://d.repec.org/n?u=RePEc:fip:fedgfe:201624&r=ets 
By:  Ossola, Elisa; Gagliardini, Patrick; Scaillet, Olivier 
Abstract:  We develop an econometric methodology to infer the path of risk premia from a large unbalanced panel of individual stock returns. We estimate the time-varying risk premia implied by conditional linear asset pricing models where the conditioning includes both instruments common to all assets and asset-specific instruments. The estimator uses simple weighted two-pass cross-sectional regressions, and we show its consistency and asymptotic normality under increasing cross-sectional and time series dimensions. We address consistent estimation of the asymptotic variance by hard thresholding, and testing for asset pricing restrictions induced by the no-arbitrage assumption. We derive the restrictions given by a continuum of assets in a multi-period economy under an approximate factor structure robust to asset repackaging. The empirical analysis on returns for about ten thousand US stocks from July 1964 to December 2009 shows that risk premia are large and volatile in crisis periods. They exhibit large positive and negative strays from time-invariant estimates, follow the macroeconomic cycles, and do not match risk premia estimates on standard sets of portfolios. The asset pricing restrictions are rejected for a conditional four-factor model capturing market, size, value and momentum effects. 
JEL:  C12 C13 C23 C51 C52 G12 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:gnv:wpgsem:unige:76321&r=ets 
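The two-pass logic can be sketched in its simplest, unweighted Fama-MacBeth form (the authors' estimator adds weighting, conditioning instruments and an unbalanced panel): a first pass estimates each asset's beta on the factor, a second pass regresses returns on betas period by period, yielding a time series of risk-premium estimates. The single-factor setup and true premium below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate a one-factor panel of excess returns.
T, N = 600, 50
lam_true = 0.5                                       # true factor risk premium
f = lam_true + rng.standard_normal(T)                # factor realizations
beta = rng.uniform(0.5, 1.5, N)
R = np.outer(f, beta) + rng.standard_normal((T, N))  # asset excess returns

# First pass: time-series regression of each asset on the factor.
X = np.column_stack([np.ones(T), f])
B = np.linalg.lstsq(X, R, rcond=None)[0]
beta_hat = B[1]

# Second pass: cross-sectional regression of returns on betas, each period.
Z = np.column_stack([np.ones(N), beta_hat])
lam_t = np.linalg.lstsq(Z, R.T, rcond=None)[0][1]    # premium path, length T
print(lam_t.mean())                                  # close to lam_true
```

The period-by-period second-pass slopes are exactly the "path of risk premia" the abstract refers to; their sample average estimates the unconditional premium.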
By:  Martyna Marczak (University of Hohenheim); Tommaso Proietti (CEIS and DEF, University of Rome "Tor Vergata"); Stefano Grassi (University of Kent) 
Abstract:  This article presents a robust augmented Kalman filter that extends the data-cleaning filter (Masreliez and Martin, 1977) to the general state space model featuring nonstationary and regression effects. The robust filter shrinks the observations towards their one-step-ahead prediction based on the past, by bounding the effect of the information carried by a new observation according to an influence function. When maximum likelihood estimation is carried out on the replacement data, an M-type estimator is obtained. We investigate the performance of the robust AKF in two applications using as a modeling framework the basic structural time series model, a popular unobserved components model in the analysis of seasonal time series. First, a Monte Carlo experiment is conducted in order to evaluate the comparative accuracy of the proposed method for estimating the variance parameters. Second, the method is applied in a forecasting context to a large set of European trade statistics series. 
Keywords:  robust filtering, augmented Kalman filter, structural time series model, additive outlier, innovation outlier 
JEL:  C32 C53 C63 
Date:  2016–03–31 
URL:  http://d.repec.org/n?u=RePEc:rtv:ceisrp:374&r=ets 
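The bounded-influence update can be illustrated in the simplest special case, a scalar local-level model with a Huber influence function; the paper's filter handles the general state space model with nonstationary and regression effects. The variance parameters, bound c and injected outliers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

def huber_psi(x, c):
    """Huber influence function: identity near zero, bounded in the tails."""
    return np.clip(x, -c, c)

def local_level_filter(y, q=0.1, r=1.0, c=2.0):
    """Kalman filter for a scalar local-level model in which the standardized
    innovation is passed through a bounded influence function, shrinking each
    observation toward its one-step-ahead prediction (c=np.inf recovers the
    standard filter)."""
    a, p = y[0], 1.0
    states = np.empty(len(y))
    for t, obs in enumerate(y):
        p_pred = p + q                                   # predicted state variance
        f = p_pred + r                                   # innovation variance
        v = (obs - a) / np.sqrt(f)                       # standardized innovation
        a = a + p_pred / np.sqrt(f) * huber_psi(v, c)    # bounded update
        p = p_pred * (1.0 - p_pred / f)                  # variance update
        states[t] = a
    return states

# Random-walk level observed with noise, plus injected additive outliers.
T = 300
mu = np.cumsum(0.3 * rng.standard_normal(T))
y = mu + rng.standard_normal(T)
y[25::50] += 12.0                                        # additive outliers

a_rob = local_level_filter(y)                            # bounded update
a_std = local_level_filter(y, c=np.inf)                  # standard Kalman filter
print(np.abs(a_rob - mu).mean(), np.abs(a_std - mu).mean())
```

At outlier dates the bounded filter moves by at most a fixed step, while the standard filter is dragged toward the contaminated observation; between outliers the two coincide in behavior.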
By:  Klaus Neusser 
Abstract:  The notion of the group of orthogonal matrices acting on the set of all feasible identification schemes is used to characterize the identification problem arising in structural vector autoregressions. This approach presents several conceptual advantages. First, it provides a fundamental justification for the use of the normalized Haar measure as the natural uninformative prior. Second, it allows one to derive the joint distribution of blocks of parameters defining an identification scheme. Finally, it provides a coherent way of studying perturbations of identification schemes, which becomes relevant, among other things, for the specification of vector autoregressions with time-varying covariance matrices. 
Keywords:  SVAR; identification; group action; Haar measure; perturbation 
JEL:  C1 C18 C32 
Date:  2016–03 
URL:  http://d.repec.org/n?u=RePEc:ube:dpvwib:dp1604&r=ets 
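Sampling from the normalized Haar measure on the orthogonal group is standard: QR-decompose a Gaussian matrix and fix the column signs. The sketch below also shows, with an illustrative 2x2 covariance, how rotating a Cholesky factor by such a draw traces out observationally equivalent SVAR identification schemes.

```python
import numpy as np

rng = np.random.default_rng(7)

def haar_orthogonal(n, rng):
    """Draw from the normalized Haar measure on the orthogonal group O(n):
    QR-decompose a Gaussian matrix; the sign correction on the columns is
    what makes the draw exactly Haar-distributed."""
    Z = rng.standard_normal((n, n))
    Q, R = np.linalg.qr(Z)
    d = np.sign(np.diag(R))
    d[d == 0.0] = 1.0
    return Q * d                         # scales column j of Q by d[j]

Q = haar_orthogonal(4, rng)

# Rotating any factorization of the reduced-form covariance by a Haar draw
# yields another valid impact matrix: B B' = Sigma for every orthogonal Q.
Sigma = np.array([[1.0, 0.3],
                  [0.3, 1.0]])
B = np.linalg.cholesky(Sigma) @ haar_orthogonal(2, rng)
print(np.allclose(B @ B.T, Sigma))
```

Without the sign correction, the QR output would concentrate on a subset of O(n) determined by the QR convention, so the correction matters whenever the draws are meant to represent an uninformative prior over identification schemes.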
By:  Kufenko, Vadim 
Abstract:  In this paper we revisit the methodological aspects of the issue of spurious cycles: using well-established cliometric data, we apply an empirical strategy to identify spurious periodicities and cross-validate the results. The analysis of cyclical fluctuations involves numerous challenges, including data preparation and detrending. As a result, there is a risk that statistical artifacts arise: it is known that summation operators and filtering yield a red-noise-like spectral signature, amplifying lower frequencies and thus longer periodicities, whereas detrending by differencing yields a blue-noise-like spectral signature, amplifying higher frequencies and thus shorter periodicities. In our paper we explicitly address this issue. In order to derive the stationary signals to be tested, we perform outlier adjustment, derive cycles from the series with the asymmetric band-pass Christiano-Fitzgerald filter using the upper bands of the Kuznets and the Juglar cycles as cutoffs, and obtain detrended pre-filtered signals by differencing the series in the absence of fractional integration. Afterwards, we simultaneously test whether the spectral densities of the filtered and of the detrended pre-filtered signals are significantly different from the spectral density of the related noise. The periodicities in the Kuznets range were not simultaneously significant, and thus are likely to be spurious, whereas those in the Juglar and Kitchin ranges were simultaneously significant. The simultaneous significance test helps to identify spurious periodicities, and the results, in general, accord with the durations of the business cycles found in other works. 
Keywords:  business cycles, spectral analysis, spurious cycles, fractional integration, simultaneous testing 
JEL:  E02 E32 E39 F44 
Date:  2016–03 
URL:  http://d.repec.org/n?u=RePEc:zbw:hohpro:482016&r=ets 
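Testing a spectral peak against a red-noise background can be sketched as follows: the periodogram at each Fourier frequency is compared with the AR(1) background spectrum scaled by an exponential (chi-squared(2)/2) critical value. The AR(1) parameter and innovation variance are treated as known here, whereas in practice the noise benchmark is estimated from the series; the injected cycle and significance level are illustrative.

```python
import numpy as np

rng = np.random.default_rng(8)

# AR(1) red noise plus a genuine deterministic cycle at frequency 0.25.
T = 1024
phi = 0.7
x = np.zeros(T)
e = rng.standard_normal(T)
for t in range(1, T):
    x[t] = phi * x[t - 1] + e[t]
x += 2.0 * np.sin(2 * np.pi * 0.25 * np.arange(T))

# Periodogram at the Fourier frequencies (zero and Nyquist dropped).
freq = np.fft.rfftfreq(T)[1:-1]
pgram = np.abs(np.fft.rfft(x - x.mean())[1:-1]) ** 2 / T

# Theoretical AR(1) spectrum with unit innovation variance. Under the null,
# pgram / red is approximately Exp(1) at each frequency, so the 0.1%
# critical value is -log(0.001).
red = 1.0 / (1 + phi**2 - 2 * phi * np.cos(2 * np.pi * freq))
crit = -np.log(0.001)
peaks = freq[pgram > crit * red]
print(peaks)
```

Because the red-noise baseline rises at low frequencies, a long-periodicity bump must clear a much higher bar than a high-frequency one, which is exactly the spurious-cycle concern the paper addresses for the Kuznets range.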