nep-ets New Economics Papers
on Econometric Time Series
Issue of 2011‒02‒19
eleven papers chosen by
Yong Yin
SUNY at Buffalo

  1. An extension of cointegration to fractional autoregressive processes By Søren Johansen
  2. Mathematical Models and Economic Forecasting: Some Uses and Mis-Uses of Mathematics in Economics By David F. Hendry
  3. Model Selection in Equations with Many 'Small' Effects By Jennifer L. Castle; Jurgen A. Doornik; David F. Hendry
  4. Residual-based tests for cointegration and multiple deterministic structural breaks: A Monte Carlo study By Matteo Mogliani
  5. FaMIDAS: A Mixed Frequency Factor Model with MIDAS structure By Cecilia Frale; Libero Monteforte
  6. Out-Of-Sample Comparisons of Overfit Models By Calhoun, Gray
  7. Modelling asset correlations: A nonparametric approach By Aslanidis, Nektarios; Casas, Isabel
  8. Estimation and evaluation of DSGE models: progress and challenges By Frank Schorfheide
  9. Quantifying and Modeling Long-Range Cross-Correlations in Multiple Time Series with Applications to World Stock Indices By Duan Wang; Boris Podobnik; Davor Horvatić; H. Eugene Stanley
  10. A Copula Approach on the Dynamics of Statistical Dependencies in the US Stock Market By Michael C. Münnix; Rudi Schäfer
  11. Spectral Analysis Informs the Proper Frequency in the Sampling of Financial Time Series Data By Taufemback, Cleiton; Da Silva, Sergio

  1. By: Søren Johansen (University of Copenhagen and CREATES)
    Abstract: This paper contains an overview of some recent results on the statistical analysis of cofractional processes, see Johansen and Nielsen (2010). We first give a brief summary of the analysis of cointegration in the vector autoregressive model and then show how this can be extended to fractional processes. The model allows the process X(t) to be fractional of order d and cofractional of order d-b>0; that is, there exist vectors beta for which beta'X(t) is fractional of order d-b. We analyse the Gaussian likelihood function to derive estimators and test statistics. The asymptotic properties are derived without the Gaussian assumption, under suitable moment conditions. We assume that the initial values are bounded and show that they do not influence the asymptotic analysis. The estimator of beta is asymptotically mixed Gaussian and estimators of the remaining parameters are asymptotically Gaussian. The asymptotic distribution of the likelihood ratio test for cointegration rank is a functional of fractional Brownian motion.
    Keywords: Cofractional processes, cointegration rank, fractional cointegration, likelihood inference, vector autoregressive model.
    JEL: C32
    Date: 2011–01–31
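    As an illustrative aside (not from the paper): a process is fractional of order d when applying the fractional difference operator (1-L)^d renders it stationary. The sketch below, with an illustrative function name and the simplifying assumption of zero pre-sample values, applies that operator via its binomial expansion:

```python
import numpy as np

def frac_diff(x, d):
    """Apply the fractional difference operator (1 - L)^d via its
    binomial expansion, assuming zero pre-sample values."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # weights: w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    # y_t = sum_{k=0}^{t} w_k * x_{t-k}
    return np.array([w[:t + 1] @ x[t::-1] for t in range(n)])
```

    For d = 1 the weights collapse to (1, -1, 0, ...) and the operator reduces to ordinary first differencing; non-integer d gives the slowly decaying weights that characterize long-memory processes.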
  2. By: David F. Hendry
    Abstract: We consider three ‘case studies’ of the uses and mis-uses of mathematics in economics and econometrics. The first concerns economic forecasting, where a mathematical analysis is essential, and is independent of the specific forecasting model and how the process being forecast behaves. The second concerns model selection with more candidate variables than the number of observations. Again, an understanding of the properties of extended general-to-specific procedures is impossible without advanced mathematical analysis. The third concerns inter-temporal optimization and the formation of ‘rational expectations’, where misleading results follow from present mathematical approaches for realistic economies. The appropriate mathematics remains to be developed, and may end up ‘problem specific’ rather than generic.
    Keywords: Economic forecasting, structural breaks, model selections, expectations, impulse-indicator saturation, mathematical analyses
    JEL: C02 C22
    Date: 2011
  3. By: Jennifer L. Castle; Jurgen A. Doornik; David F. Hendry
    Abstract: General unrestricted models (GUMs) may include important individual determinants, many small relevant effects, and irrelevant variables. Automatic model selection procedures can handle perfect collinearity and more candidate variables than observations, allowing substantial dimension reduction from GUMs with salient regressors, lags, non-linear transformations, and multiple location shifts, together with all the principal components representing ‘factor’ structures, which can also capture small influences that selection may not retain individually. High dimensional GUMs and even the final model can implicitly include more variables than observations entering via ‘factors’. We simulate selection in several special cases to illustrate.
    Keywords: Model selection, high dimensionality, principal components, non-linearity, Monte Carlos
    JEL: C51 C22
    Date: 2011
  4. By: Matteo Mogliani (PSE - Paris-Jourdan Sciences Economiques - CNRS : UMR8545 - Ecole des Hautes Etudes en Sciences Sociales (EHESS) - Ecole des Ponts ParisTech - Ecole Normale Supérieure de Paris - ENS Paris - INRA, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris)
    Abstract: The aim of this paper is to study the performance of residual-based tests for cointegration in the presence of multiple deterministic structural breaks via Monte Carlo simulations. We consider the KPSS-type LM tests proposed in Carrion-i-Silvestre and Sansò (2006) and in Bartley, Lee and Strazicich (2001), as well as the Schmidt and Phillips-type LM tests proposed in Westerlund and Edgerton (2007). This exercise allows us to cover a wide set of single-equation cointegration estimators. Monte Carlo experiments reveal a trade-off between size and power distortions across tests and models. KPSS-type tests display large size distortions under multiple breaks scenarios, while Schmidt and Phillips-type tests appear well-sized across all simulations. However, when regressors are endogenous, the former group of tests displays quite high power against the alternative hypothesis, while the latter shows severely low power.
    Keywords: cointegration ; single-equation ; structural breaks ; Monte Carlo simulations
    Date: 2010–08
  5. By: Cecilia Frale (MEF-Ministry of the Economy and Finance-Italy, Treasury Department); Libero Monteforte (Bank of Italy and MEF-Ministry of the Economy and Finance-Italy, Treasury Department)
    Abstract: In this paper a dynamic factor model with mixed frequency is proposed (FaMIDAS), where the past observations of high frequency indicators are used following the MIDAS approach. This structure is able to represent with richer dynamics the information content of the economic indicators and produces smoothed factors and forecasts. In addition, the Kalman filter is applied, which is particularly suited for dealing with unbalanced data sets and revisions in the preliminary data. In the empirical application for the Italian quarterly GDP the short-term forecasting performance is evaluated against other mixed frequency models in a pseudo-real time experiment, also allowing for pooled forecasts from factor models.
    Keywords: mixed frequency models, dynamic factor models, MIDAS, forecasting.
    JEL: E32 E37 C53
    Date: 2011–01
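    As an illustrative aside (not from the paper): the MIDAS idea is to regress a low-frequency variable on a tightly parameterized weighted sum of past high-frequency observations. The sketch below uses the standard exponential Almon weighting scheme; the function names, parameters, and the simple aggregation loop are choices of this illustration, not FaMIDAS itself, which embeds the weights in a state-space factor model:

```python
import numpy as np

def exp_almon_weights(n_lags, theta1, theta2):
    """Normalized exponential Almon lag weights, a standard MIDAS
    parameterization of the high-frequency lag polynomial."""
    k = np.arange(n_lags)
    w = np.exp(theta1 * k + theta2 * k**2)
    return w / w.sum()

def midas_aggregate(x_high, n_lags, theta1, theta2, m):
    """Collapse a high-frequency series into one low-frequency regressor:
    at every m-th observation, take a weighted sum of the last n_lags values."""
    w = exp_almon_weights(n_lags, theta1, theta2)
    idx = np.arange(n_lags - 1, len(x_high), m)
    return np.array([w @ x_high[t - n_lags + 1:t + 1][::-1] for t in idx])
```

    With theta1 = theta2 = 0 the weights are flat (simple averaging); negative theta2 makes them decay, so recent high-frequency observations dominate.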
  6. By: Calhoun, Gray
    Abstract: This paper uses dimension asymptotics to study why overfit linear regression models should be compared out-of-sample; we let the number of predictors used by the larger model increase with the number of observations so that their ratio remains uniformly positive. Under this limit theory, the naive Diebold-Mariano-West out-of-sample test can test hypotheses about a key quantity for evaluating forecasting models---a time series analogue to the generalization error---as long as the out-of-sample period is small relative to the total sample size. Moreover, tests that are designed to reject if the larger model is true, such as the usual in-sample Wald and LM tests and also Clark and McCracken's (2001, 2005a), McCracken's (2007) and Clark and West's (2006, 2007) out-of-sample statistics, will choose the larger model too often when the smaller model is more accurate.
    Keywords: Generalization Error; Forecasting; Model Selection; t-test; Dimension Asymptotics
    JEL: C01 C12 C22 C52 C53
    Date: 2011–02–10
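    As an illustrative aside (not from the paper): the "naive" Diebold-Mariano statistic mentioned in the abstract is a t-statistic on the mean loss differential between two forecasts. The sketch below uses squared-error loss and, for simplicity, ignores serial correlation in the differentials (a long-run variance estimator would normally replace the lag-0 variance):

```python
import math

def diebold_mariano(e1, e2):
    """Naive Diebold-Mariano statistic on squared-error loss
    differentials, using only the lag-0 variance for simplicity."""
    d = [a * a - b * b for a, b in zip(e1, e2)]  # loss differentials
    n = len(d)
    dbar = sum(d) / n
    var = sum((x - dbar) ** 2 for x in d) / n
    return dbar / math.sqrt(var / n)  # compare to a standard normal
```

    A large positive value indicates the first model's forecast errors are systematically larger; swapping the arguments flips the sign.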
  7. By: Aslanidis, Nektarios; Casas, Isabel
    Abstract: This article proposes a time-varying nonparametric estimator and a time-varying semiparametric estimator of the correlation matrix. We discuss representation, estimation based on kernel smoothing and inference. An extensive Monte Carlo simulation study is performed to compare the semiparametric and nonparametric models with the DCC specification. Our bivariate simulation results show that the semiparametric and nonparametric models are best in DGPs with gradual changes or structural breaks in correlations. However, in DGPs with rapid changes or constancy in correlations the DCC delivers the best outcome. Moreover, in multivariate simulations the semiparametric and nonparametric models fare the best in DGPs with substantial time-variability in correlations, while when allowing for little variability in the correlations the DCC is the dominant specification. The methodologies are illustrated by estimating the correlations for two interesting portfolios. The first portfolio consists of the equity sectors SPDRs and the S&P 500 composite, while the second one contains major currencies that are actively traded in the foreign exchange market. Portfolio evaluation results show that the nonparametric estimator generally dominates its competitors, with a statistically significant lower portfolio variance.
    Keywords: Portfolio Evaluation; DCC; Local Linear Estimator; Nonparametric Correlations; Semiparametric Conditional Correlation Model
    Date: 2011–01
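    As an illustrative aside (not from the paper): a nonparametric time-varying correlation can be obtained by kernel-smoothing the second moments over rescaled time. The sketch below is a minimal Nadaraya-Watson version with a Gaussian kernel; the paper's estimators (local linear, semiparametric) are more refined:

```python
import numpy as np

def kernel_correlation(x, y, bandwidth):
    """Time-varying correlation via Nadaraya-Watson smoothing of the
    first and second moments with a Gaussian kernel in rescaled time."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    grid = np.arange(n) / n          # rescaled time t/n in [0, 1)
    rho = np.empty(n)
    for i, t in enumerate(grid):
        w = np.exp(-0.5 * ((grid - t) / bandwidth) ** 2)
        w /= w.sum()
        mx, my = w @ x, w @ y        # locally weighted means
        cov = w @ (x * y) - mx * my  # locally weighted covariance
        vx = w @ (x * x) - mx * mx
        vy = w @ (y * y) - my * my
        rho[i] = cov / np.sqrt(vx * vy)
    return rho
```

    A small bandwidth tracks rapid correlation changes but is noisy; a large one smooths toward the unconditional correlation, which mirrors the gradual-change versus rapid-change trade-off the simulations document.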
  8. By: Frank Schorfheide
    Abstract: Estimated dynamic stochastic general equilibrium (DSGE) models are now widely used for empirical research in macroeconomics as well as for quantitative policy analysis and forecasting at central banks around the world. This paper reviews recent advances in the estimation and evaluation of DSGE models, discusses current challenges, and provides avenues for future research.
    Keywords: Econometric models ; Stochastic analysis
    Date: 2011
  9. By: Duan Wang; Boris Podobnik; Davor Horvatić; H. Eugene Stanley
    Abstract: We propose a modified time lag random matrix theory in order to study time lag cross-correlations in multiple time series. We apply the method to 48 world indices, one for each of 48 different countries. We find long-range power-law cross-correlations in the absolute values of returns that quantify risk, and find that they decay much more slowly than cross-correlations between the returns. The magnitude of the cross-correlations constitutes "bad news" for international investment managers who may believe that risk is reduced by diversifying across countries. We find that when a market shock is transmitted around the world, the risk decays very slowly. We explain these time lag cross-correlations by introducing a global factor model (GFM) in which all index returns fluctuate in response to a single global factor. For each pair of individual time series of returns, the cross-correlations between returns (or magnitudes) can be modeled with the auto-correlations of the global factor returns (or magnitudes). We estimate the global factor using principal component analysis, which minimizes the variance of the residuals after removing the global trend. Using random matrix theory, a significant fraction of the world index cross-correlations can be explained by the global factor, which supports the utility of the GFM. We demonstrate applications of the GFM in forecasting risks at the world level, and in finding uncorrelated individual indices. We find that 10 indices are practically uncorrelated with the global factor and with the remainder of the world indices, which is relevant information for world managers in reducing their portfolio risk. Finally, we argue that this general method can be applied to a wide range of phenomena in which time series are measured, ranging from seismology and physiology to atmospheric geophysics.
    Date: 2011–02
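    As an illustrative aside (not from the paper): extracting a single global factor by principal component analysis amounts to taking the leading eigenvector of the correlation matrix of standardized returns. The sketch below, with illustrative names, shows the basic construction:

```python
import numpy as np

def global_factor(returns):
    """Extract a single global factor as the first principal component
    of standardized returns (a T x N matrix), one-factor-model style."""
    r = np.asarray(returns, float)
    z = (r - r.mean(0)) / r.std(0)     # standardize each series
    corr = z.T @ z / len(z)            # sample correlation matrix
    vals, vecs = np.linalg.eigh(corr)  # eigenvalues in ascending order
    v = vecs[:, -1]                    # leading eigenvector
    factor = z @ v                     # global factor time series
    explained = vals[-1] / vals.sum()  # variance share it explains
    return factor, explained
```

    The explained-variance share is what makes the factor interpretable as "global": when cross-correlations are strong, the leading eigenvalue dominates the spectrum.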
  10. By: Michael C. Münnix; Rudi Schäfer
    Abstract: We analyze the statistical dependency structure of the S&P 500 constituents in the 4-year period from 2007 to 2010 using intraday data from the New York Stock Exchange's TAQ database. With a copula-based approach, we find that the statistical dependencies are very strong in the tails of the marginal distributions. This tail dependence is higher than in a bivariate Gaussian distribution, which is implied in the calculation of many correlation coefficients. We compare the tail dependence to the market's average correlation level as a commonly used quantity and reveal a nearly linear relation.
    Date: 2011–02
  11. By: Taufemback, Cleiton; Da Silva, Sergio
    Abstract: Applied econometricians have long neglected the proper frequency at which to sample time series data. The present study shows how spectral analysis can be usefully employed to fix this problem. The case is illustrated with ultra-high-frequency data and daily prices of four selected stocks listed on the Sao Paulo stock exchange.
    Keywords: Econophysics; Spectral analysis; Aliasing; Sampling; Financial time series
    JEL: C81
    Date: 2011
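    As an illustrative aside (not from the paper): the spectral-analysis logic is that the periodogram reveals the highest frequency carrying real signal, and the sampling interval must keep that frequency below the Nyquist limit to avoid aliasing. A minimal periodogram sketch:

```python
import numpy as np

def periodogram(x):
    """Raw periodogram: squared FFT magnitudes at the Fourier
    frequencies 0, 1/n, ..., up to the Nyquist frequency 1/2."""
    x = np.asarray(x, float) - np.mean(x)   # remove the mean (zero frequency)
    n = len(x)
    spec = np.abs(np.fft.rfft(x)) ** 2 / n
    freqs = np.fft.rfftfreq(n)              # in cycles per observation
    return freqs, spec
```

    If the spectrum peaks at f cycles per observation, sampling every m-th observation is safe only while f stays below the new Nyquist frequency 1/(2m); a peak above it would be aliased into a spurious lower frequency.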

This nep-ets issue is ©2011 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at <>. For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.