nep-ecm New Economics Papers
on Econometrics
Issue of 2008‒04‒04
nine papers chosen by
Sune Karlsson
Orebro University

  1. A Test for Dependence and Covariance Estimator of Market Microstructure Noise By Masato Ubukata; Kosuke Oya
  2. A Shrinkage Instrumental Variable Estimator for Large Datasets By Andrea Carriero; George Kapetanios; Massimiliano Marcellino
  3. A Review of Forecasting Techniques for Large Data Sets By Jana Eklund; George Kapetanios
  4. Bootstrap prediction intervals in State Space models By Alejandro Rodriguez; Esther Ruiz
  5. Cross-sectional Averaging and Instrumental Variable Estimation with Many Weak Instruments By George Kapetanios; Massimiliano Marcellino
  6. Negative Volatility Spillovers in the Unrestricted ECCC-GARCH Model By Christian Conrad; Menelaos Karanasos
  7. Comparing the DSGE model with the factor model: an out-of-sample forecasting experiment By Wang, Mu-Chun
  8. Assessing the Effectiveness of a Stochastic Regression Imputation Method for Ordered Categorical Data By Isabella Sulis; Mariano Porcu
  9. Volatility Threshold Dynamic Conditional Correlations: An International Analysis By Maria Kasch; Massimiliano Caporin

  1. By: Masato Ubukata (Graduate School of Economics, Osaka University); Kosuke Oya (Graduate School of Economics, Osaka University)
    Abstract: There are many approaches for estimating the integrated variance and covariance in the presence of market microstructure noise, and constructing such estimators requires knowledge of the dependence structure of the noise. In this paper we study the time dependence of bivariate noise processes. We propose a test statistic for the dependence of the noises and an autocovariance estimator of the noises, and derive its asymptotic distribution. The asymptotic distribution of the autocovariance estimator yields another test statistic, which tests the significance of the autocovariances and detects whether noise is present at all. Monte Carlo simulations show that the test statistics and the autocovariance estimator perform well in finite samples. In an empirical illustration, we confirm that the proposed statistics and estimators capture various dependence patterns of market microstructure noise.
    Keywords: test statistic; market microstructure noise; time-dependence; nonsynchronous observations; high frequency data.
    JEL: C12 D49
    Date: 2008–03
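The flavor of such a dependence test can be illustrated with a generic sketch. This is not the authors' statistic, which is tailored to nonsynchronous high-frequency noise; it is only the textbook z-test that a lag-q sample autocovariance is zero, assuming the series is observed directly.

```python
import numpy as np

def autocov_test(x, q):
    """z-test that the lag-q autocovariance of a series is zero.

    Generic illustration, not the paper's statistic: under the null of
    no dependence, the lag-q sample autocorrelation of an i.i.d. series
    is approximately N(0, 1/n), giving a simple z-statistic.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    gamma_q = (xc[:-q] * xc[q:]).sum() / n   # lag-q sample autocovariance
    gamma_0 = (xc * xc).sum() / n            # sample variance (lag 0)
    z = np.sqrt(n) * gamma_q / gamma_0       # z-statistic for H0: gamma_q = 0
    return gamma_q, z
```

For white noise the statistic stays near zero, while for a series with genuine lag-1 dependence (e.g. an MA(1)) it is far in the tails.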
  2. By: Andrea Carriero (Queen Mary, University of London); George Kapetanios (Queen Mary, University of London); Massimiliano Marcellino (Bocconi University and EUI)
    Abstract: This paper proposes and discusses an instrumental variable estimator that can be of particular relevance when many instruments are available. Intuition and recent work (see, e.g., Hahn (2002)) suggest that parsimonious devices used in the construction of the final instruments may provide effective estimation strategies. Shrinkage is a well known approach that promotes parsimony. We consider a new shrinkage 2SLS estimator. We derive a consistency result for this estimator under general conditions, and via Monte Carlo simulation show that this estimator has good potential for inference in small samples.
    Keywords: Instrumental variable estimation, 2SLS, Shrinkage, Bayesian regression
    JEL: C13 C23 C51
    Date: 2008–03
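A minimal sketch of the shrinkage idea, assuming a single endogenous regressor: shrink the first-stage coefficients with a ridge penalty, then run the usual second stage. The penalty parameter `lam` and the ridge form are illustrative assumptions, not necessarily the authors' exact estimator.

```python
import numpy as np

def shrinkage_2sls(y, x, Z, lam=1.0):
    """Illustrative ridge-shrunk 2SLS with one endogenous regressor.

    Hypothetical sketch: the first stage regresses x on the instruments
    Z with a ridge penalty `lam` (the shrinkage device), and the fitted
    values are then used as the instrument in the second stage.
    """
    n, k = Z.shape
    # First stage: ridge regression of x on the instruments Z.
    pi_hat = np.linalg.solve(Z.T @ Z + lam * np.eye(k), Z.T @ x)
    x_hat = Z @ pi_hat
    # Second stage: use the shrunk fitted values as the instrument.
    return (x_hat @ y) / (x_hat @ x)
```

With `lam = 0` this collapses to ordinary 2SLS; larger `lam` trades first-stage fit for parsimony, which is the point when instruments are many.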
  3. By: Jana Eklund (Bank of England); George Kapetanios (Queen Mary, University of London)
    Abstract: This paper provides a review which focuses on forecasting using statistical/econometric methods designed for dealing with large data sets.
    Keywords: Macroeconomic forecasting, Factor models, Forecast combination, Principal components
    JEL: C22 C53 E37 E47
    Date: 2008–03
  4. By: Alejandro Rodriguez; Esther Ruiz
    Abstract: Prediction intervals in State Space models can be obtained by assuming Gaussian innovations and using the prediction equations of the Kalman filter, where the true parameters are substituted by consistent estimates. This approach has two limitations. First, it does not incorporate the uncertainty due to parameter estimation. Second, the Gaussianity assumption of future innovations may be inaccurate. To overcome these drawbacks, Wall and Stoffer (2002) propose to obtain prediction intervals by using a bootstrap procedure that requires the backward representation of the model. Obtaining this representation increases the complexity of the procedure and limits its implementation to models for which it exists. The bootstrap procedure proposed by Wall and Stoffer (2002) is further complicated by the fact that the intervals are obtained for the prediction errors instead of for the observations. In this paper, we propose a bootstrap procedure for constructing prediction intervals in State Space models that does not need the backward representation of the model and is based on obtaining the intervals directly for the observations. Therefore, its application is much simpler, without losing the good behavior of bootstrap prediction intervals. We study its finite sample properties and compare them with those of the standard and the Wall and Stoffer (2002) procedures for the Local Level Model. Finally, we illustrate the results by implementing the new procedure to obtain prediction intervals for future values of a real time series.
    Keywords: Backward representation, Kalman filter, Local Level Model, Unobserved Components
    Date: 2008–03
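The idea of obtaining intervals directly for the observations can be sketched for the Local Level Model. This sketch assumes known variances (so it omits the parameter-uncertainty step that is central to the paper) and resamples standardized filter innovations to simulate future observations; it is an illustration of the general approach, not the authors' exact procedure.

```python
import numpy as np

def bootstrap_interval_local_level(y, sigma_eps, sigma_eta, h=1, B=999, level=0.95):
    """Simulation-based h-step prediction interval for a local level model.

    Sketch under assumed-known variances sigma_eps, sigma_eta: run the
    Kalman filter, collect standardized one-step innovations, then
    simulate B future paths by resampling those innovations and read
    the interval off the percentiles of the simulated observations.
    """
    # Kalman filter for y_t = alpha_t + eps_t, alpha_{t+1} = alpha_t + eta_t.
    a, p = y[0], sigma_eps**2
    resid = []
    for t in range(1, len(y)):
        p = p + sigma_eta**2                 # prediction step
        f = p + sigma_eps**2                 # innovation variance
        v = y[t] - a                         # one-step forecast error
        resid.append(v / np.sqrt(f))         # standardized innovation
        k = p / f                            # Kalman gain
        a, p = a + k * v, (1 - k) * p        # update step
    resid = np.array(resid)
    resid = resid - resid.mean()             # recenter the resampling pool
    # Simulate B future paths via the innovations form of the model.
    sims = np.empty(B)
    for b in range(B):
        ab, pb = a, p
        for _ in range(h):
            pb = pb + sigma_eta**2
            f = pb + sigma_eps**2
            yb = ab + np.random.choice(resid) * np.sqrt(f)
            kb = pb / f
            ab, pb = ab + kb * (yb - ab), (1 - kb) * pb
        sims[b] = yb
    alpha = 1 - level
    return np.quantile(sims, [alpha / 2, 1 - alpha / 2])
```

Because the percentiles are taken over simulated observations, no backward representation is needed, which is the simplification the paper emphasizes.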
  5. By: George Kapetanios (Queen Mary, University of London); Massimiliano Marcellino (Bocconi University and EUI)
    Abstract: Instrumental variable estimation is central to econometric analysis and has justifiably received considerable and consistent attention in the literature. Recent developments have focused on cases where instruments are weak, in terms of correlations with the endogenous variables, or many, or both. The present paper suggests a new way to deal with many, possibly weak, instruments. Our suggestion is to cross-sectionally average the instruments and use these averages as instruments. Intuition and interesting recent work by Hahn (2002) suggest that parsimonious devices used in the construction of the final instruments may provide effective estimation strategies. Our use of cross-sectional averaging promotes parsimony and therefore falls within the context of such arguments. We provide a theoretical analysis of this approach in terms of its consistency properties and also show, via a Monte Carlo study, that the approach can provide improved estimation compared to standard instrumental variables estimation.
    Keywords: Instrumental variable estimation, 2SLS, Cross-sectional average
    JEL: C13 C23 C51
    Date: 2008–03
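The core device is simple enough to sketch in a few lines, assuming a single endogenous regressor: collapse the many instruments into their cross-sectional (row-wise) average and apply the simple IV formula with that one instrument. Details such as weighting or multiple averages are left out.

```python
import numpy as np

def averaged_iv(y, x, Z):
    """IV estimate using the cross-sectional average of the instruments.

    Sketch of the basic idea: the n-by-k instrument matrix Z is reduced
    to the single instrument z_bar (its row-wise average), which is then
    plugged into the just-identified IV formula.
    """
    z_bar = Z.mean(axis=1)           # average across the k instruments
    z_bar = z_bar - z_bar.mean()     # demean the constructed instrument
    return (z_bar @ y) / (z_bar @ x) # simple IV estimator
```

Averaging keeps the instrument strong when the individual first-stage coefficients share a common sign, while reducing the instrument count to one, which is the parsimony argument the abstract appeals to.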
  6. By: Christian Conrad (KOF Swiss Economic Institute, ETH Zurich); Menelaos Karanasos (Economics and Finance, Brunel University, Uxbridge, West London)
    Abstract: This paper considers a formulation of the extended constant or time-varying conditional correlation GARCH model which allows for volatility feedback of either sign, i.e., positive or negative. In the previous literature, negative volatility spillovers were ruled out by the assumption that all the coefficients of the model are non-negative, which is a sufficient condition for ensuring the positive definiteness of the conditional covariance matrix. In order to allow for negative feedback, we show that the positive definiteness of the conditional covariance matrix can be guaranteed even if some of the parameters are negative. Thus, we extend the results of Nelson and Cao (1992) and Tsai and Chan (2008) to a multivariate setting. For the bivariate case of order one we look into the consequences of adopting these less severe restrictions and find that the flexibility of the process is substantially increased. Our results are helpful for the model-builder, who can consider the unrestricted formulation as a tool for testing various economic theories.
    Keywords: Inequality constraints, multivariate GARCH processes, volatility feedback
    JEL: C32 C51 C52 C53
    Date: 2008–02
  7. By: Wang, Mu-Chun
    Abstract: In this paper, we put DSGE forecasts in competition with factor forecasts. We focus on these two models since they neatly represent the two opposing forecasting philosophies: the DSGE model has a strong theoretical economic background, whereas the factor model is mainly data-driven. We show that incorporating a large information set using factor analysis can indeed improve short-horizon predictive ability, as claimed by many researchers. The micro-founded DSGE model can provide reasonable forecasts for inflation, especially as the forecast horizon grows. To a certain extent, our results are consistent with the prevailing view that simple time series models should be used in short-horizon forecasting and structural models in long-horizon forecasting. Our paper compares both state-of-the-art data-driven and theory-based modelling in a rigorous manner.
    Keywords: DSGE models, factor models, forecasting, forecast evaluation
    JEL: C2 C3 C53 E37
    Date: 2008
  8. By: Isabella Sulis; Mariano Porcu
    Abstract: The main aim of this paper is to describe a workable method based on stochastic regression and multiple imputation analysis (MISR) to correct for missingness in surveys where multi-item Likert-type scales are used to measure a latent attribute (namely, the quality of university teaching). A simulation analysis has been carried out, and the results have been compared in terms of bias and efficiency with other missing-data handling methods, specifically Complete Cases Analysis (CCA) and Multiple Imputation by Chained Equations (MICE). The authors also provide functions (implemented in the R language) to apply the procedure to a matrix of ordered categorical items. The functions described allow users (i) to simulate data missing at random and missing completely at random, and (ii) to replicate the simulation study presented in this work in order to assess the accuracy in distribution and in estimation of a multiple imputation procedure.
    Keywords: Multiple Imputation Analysis, Validation Process, MAR, MCAR, MICE
    JEL: C15
    Date: 2008
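The two building blocks of such a simulation study can be sketched as follows. These are hypothetical stand-ins (in Python, whereas the authors' functions are in R): one masks entries completely at random (MCAR), the other fills the holes stochastically by drawing from each item's observed category frequencies rather than fitting the paper's regression-based imputation model.

```python
import numpy as np

def make_mcar(X, prop, rng=None):
    """Mask a proportion of entries completely at random (MCAR).

    Hypothetical stand-in for step (i) of the simulation: each entry of
    the item matrix X is set to NaN independently with probability prop.
    """
    rng = np.random.default_rng(rng)
    X = X.astype(float).copy()
    X[rng.random(X.shape) < prop] = np.nan
    return X

def stochastic_impute(X, rng=None):
    """Fill NaNs in each item by drawing from that item's observed
    category frequencies.

    A stochastic (not deterministic) fill in the spirit of stochastic
    regression imputation, but without covariates; the paper's method
    conditions on the other items.
    """
    rng = np.random.default_rng(rng)
    X = X.copy()
    for j in range(X.shape[1]):
        col = X[:, j]
        obs = col[~np.isnan(col)]        # observed categories for item j
        miss = np.isnan(col)
        col[miss] = rng.choice(obs, size=miss.sum())
    return X
```

Drawing rather than plugging in a single predicted category preserves the spread of the item distribution, which is what the "accuracy in distribution" criterion in the abstract assesses.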
  9. By: Maria Kasch (University of Bonn); Massimiliano Caporin (Università di Padova)
    Abstract: We extend the Dynamic Conditional Correlation multivariate GARCH specification to investigate the dynamic contemporaneous relationship between correlations and variances of the underlying assets. We present a generalization of the DCC model where the dynamic behavior depends on the assets variances through a threshold structure. Our purpose is to analyze the behavior of correlations in periods of high volatility. The application of the proposed specification to a sample of markets heterogeneous in the levels of their development allows the identification of market pairs whose correlations show low sensitivity to high underlying volatility.
    Keywords: dynamic correlations, thresholds, volatility thresholds, spillovers
    JEL: C50 F37 G11 G15
    Date: 2008

This nep-ecm issue is ©2008 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.