nep-ecm New Economics Papers
on Econometrics
Issue of 2012‒09‒16
fifteen papers chosen by
Sune Karlsson
Orebro University

  1. Model Specification between Parametric and Nonparametric Cointegration By Jiti Gao; Dag Tjøstheim; Jiying Yin
  2. Series Estimation of Stochastic Processes: Recent Developments and Econometric Applications By Peter C.B. Phillips; Zhipeng Liao
  3. A Flexible Semiparametric Model for Time Series By Degui Li; Oliver Linton; Zudi Lu
  4. Automated Estimation of Vector Error Correction Models By Zhipeng Liao; Peter C.B. Phillips
  5. On Geometric Ergodicity of Skewed-SVCHARME models By Jerzy P. Rydlewski; Małgorzata Snarska
  6. Inference on Structural Breaks using Information Criteria By Alastair R. Hall; Denise R. Osborn; Nikolaos D. Sakkas
  7. Consistent testing for structural change at the ends of the sample By Michael W. McCracken
  8. Real-Time Forecast Density Combinations (Forecasting US GDP Growth Using Mixed-Frequency Data) By Götz Thomas B.; Hecq Alain; Urbain Jean-Pierre
  9. Capital adjustment cost and bias in income based dynamic panel models with fixed effects By Yoseph Yilma Getachew; Keshab Bhattarai; Parantap Basu
  10. An Introduction to Spatial Econometrics: An Application to the Study of Fertility in Argentina Using R By Herrera Gómez, Marcos; Cid, Juan Carlos; Paz, Jorge Augusto
  11. The Determinants of VAT Introduction: A Spatial Duration Analysis By Cizek, P.; Lei, J.; Ligthart, J.E.
  12. Real-time forecasting with a mixed-frequency VAR By Frank Schorfheide; Dongho Song
  13. Dynamic factor models with macro, frailty and industry effects for US default counts: the credit crisis of 2008 By Siem Jan Koopman; André Lucas; Bernd Schwaab
  14. The stability of feature selection and class prediction from ensemble tree classifiers. By Paul, Jérôme
  15. Assessing the evidence on neighborhood effects from Moving to Opportunity By Dionissi Aliprantis

  1. By: Jiti Gao; Dag Tjøstheim; Jiying Yin
    Abstract: This paper considers a general model specification between a parametric co-integrating model and a nonparametric co-integrating model in a multivariate regression model, which involves a univariate integrated time series regressor and a vector of stationary time series regressors. A new and simple test is proposed and the resulting asymptotic theory is established. The test statistic is constructed based on a natural distance function between a nonparametric estimate and a smoothed parametric counterpart. The asymptotic distribution of the test statistic under the parametric specification is proportional to that of a local-time random variable with a known distribution. In addition, the finite sample performance of the proposed test is evaluated using both simulated and real data examples.
    Keywords: Cointegration, nonparametric kernel estimation, parametric model specification, time series.
    JEL: C12 C14 C22
    Date: 2012–08–24
  2. By: Peter C.B. Phillips (Cowles Foundation, Yale University); Zhipeng Liao (Dept. of Economics, UCLA)
    Abstract: This paper overviews recent developments in series estimation of stochastic processes and some of their applications in econometrics. Underlying this approach is the idea that a stochastic process may under certain conditions be represented in terms of a set of orthonormal basis functions, giving a series representation that involves deterministic functions. Several applications of this series approximation method are discussed. The first shows how a continuous function can be approximated by a linear combination of Brownian motions (BMs), which is useful in the study of spurious regression. The second application utilizes the series representation of BM to investigate the effect of the presence of deterministic trends in a regression on traditional unit-root tests. The third uses basis functions in the series approximation as instrumental variables (IVs) to perform efficient estimation of the parameters in cointegrated systems. The fourth application proposes alternative estimators of long-run variances in some econometric models with dependent data, thereby providing autocorrelation robust inference methods in these models. We review some work related to these applications and some ongoing research involving series approximation methods.
    Keywords: Cointegrated system, HAC estimation, Instrumental variables, Lasso regression, Karhunen-Loeve representation, Long-run variance, Reproducing kernel Hilbert space, Oracle efficiency, Orthonormal system, Trend basis
    JEL: C22
    Date: 2012–09
  3. By: Degui Li; Oliver Linton; Zudi Lu
    Abstract: We consider approximating a multivariate regression function by an affine combination of one-dimensional conditional component regression functions. The weight parameters involved in the approximation are estimated by least squares on the first-stage nonparametric kernel estimates. We establish asymptotic normality for the estimated weights and the regression function in two cases: the number of the covariates is finite, and the number of the covariates is diverging. As the observations are assumed to be stationary and near epoch dependent, the approach in this paper is applicable to estimation and forecasting issues in time series analysis. Furthermore, the methods and results are supported by a simulation study and illustrated by an application to the analysis of the Australian annual mean temperature anomaly series. We also apply our methods to high frequency volatility forecasting, where we obtain superior results to parametric methods.
    Keywords: Asymptotic normality, model averaging, Nadaraya-Watson kernel estimation, near epoch dependence, semiparametric method
    JEL: C14 C22
    Date: 2012–08–04
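The two-stage idea above can be sketched in a few lines (a minimal illustration, not the authors' procedure; the simulated data, Gaussian kernel, and bandwidths are assumptions): fit a one-dimensional Nadaraya-Watson regression on each covariate, then estimate the combination weights by least squares on those first-stage fits.

```python
import math
import random

def nw(x0, xs, ys, h):
    # One-dimensional Nadaraya-Watson estimate at x0 with a Gaussian kernel
    w = [math.exp(-0.5 * ((x0 - x) / h) ** 2) for x in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

def solve3(a, b):
    # Gauss-Jordan elimination for a 3x3 system (enough for the weight step)
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(m[r][i]))
        m[i], m[p] = m[p], m[i]
        for r in range(3):
            if r != i:
                f = m[r][i] / m[i][i]
                m[r] = [mr - f * mi for mr, mi in zip(m[r], m[i])]
    return [m[i][3] / m[i][i] for i in range(3)]

random.seed(1)
n = 400
x1 = [random.uniform(-2, 2) for _ in range(n)]
x2 = [random.uniform(-2, 2) for _ in range(n)]
y = [math.sin(a) + 0.5 * b + random.gauss(0, 0.1) for a, b in zip(x1, x2)]

# Stage 1: one-dimensional component regressions m1(x1) and m2(x2)
m1 = [nw(a, x1, y, 0.3) for a in x1]
m2 = [nw(b, x2, y, 0.3) for b in x2]

# Stage 2: least-squares weights for the affine combination y ~ w0 + w1*m1 + w2*m2
cols = [[1.0] * n, m1, m2]
xtx = [[sum(u * v for u, v in zip(cols[i], cols[j])) for j in range(3)] for i in range(3)]
xty = [sum(u * v for u, v in zip(cols[i], y)) for i in range(3)]
w0, w1, w2 = solve3(xtx, xty)

fit = [w0 + w1 * a + w2 * b for a, b in zip(m1, m2)]
mse = sum((f - yi) ** 2 for f, yi in zip(fit, y)) / n
```

Because the toy regression function is additive, both weights come out near one and the combined fit explains most of the variance of y.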
  4. By: Zhipeng Liao (Dept. of Economics, UCLA); Peter C.B. Phillips (Cowles Foundation, Yale University)
    Abstract: Model selection and associated issues of post-model selection inference present well known challenges in empirical econometric research. These modeling issues are manifest in all applied work but they are particularly acute in multivariate time series settings such as cointegrated systems where multiple interconnected decisions can materially affect the form of the model and its interpretation. In cointegrated system modeling, empirical estimation typically proceeds in a stepwise manner that involves the determination of cointegrating rank and autoregressive lag order in a reduced rank vector autoregression followed by estimation and inference. This paper proposes an automated approach to cointegrated system modeling that uses adaptive shrinkage techniques to estimate vector error correction models with unknown cointegrating rank structure and unknown transient lag dynamic order. These methods enable simultaneous order estimation of the cointegrating rank and autoregressive order in conjunction with oracle-like efficient estimation of the cointegrating matrix and transient dynamics. As such they offer considerable advantages to the practitioner as an automated approach to the estimation of cointegrated systems. The paper develops the new methods, derives their limit theory, reports simulations and presents an empirical illustration with macroeconomic aggregates.
    Keywords: Adaptive shrinkage, Automation, Cointegrating rank, Lasso regression, Oracle efficiency, Transient dynamics, Vector error correction
    JEL: C22
    Date: 2012–09
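The adaptive shrinkage step underlying the approach above can be sketched in a toy regression (an illustration only, not the authors' VECM procedure; the data, penalty level, and coordinate-descent details are assumptions): an adaptive Lasso with weights 1/|OLS estimate| penalizes an irrelevant coefficient heavily while leaving the relevant one nearly untouched, which is the source of the oracle-like behavior.

```python
import random

random.seed(2)
n = 1000
x1 = [random.uniform(-1, 1) for _ in range(n)]
x2 = [random.uniform(-1, 1) for _ in range(n)]
y = [2.0 * a + random.gauss(0, 0.5) for a in x1]  # x2 is irrelevant

def ols_slope(x, ys):
    # Simple-regression slope through the origin (regressors are mean-zero)
    return sum(a * b for a, b in zip(x, ys)) / sum(a * a for a in x)

def soft(z, t):
    # Soft-thresholding operator used in Lasso coordinate descent
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

lam = 2.0
xs = [x1, x2]
w = [1.0 / abs(ols_slope(x, y)) for x in xs]  # adaptive penalty weights
b = [0.0, 0.0]
for _ in range(50):  # coordinate descent
    for j in (0, 1):
        other = 1 - j
        partial = [yi - b[other] * xs[other][i] for i, yi in enumerate(y)]
        rho = sum(xs[j][i] * partial[i] for i in range(n))
        b[j] = soft(rho, lam * w[j]) / sum(a * a for a in xs[j])
```

The irrelevant coefficient has a small OLS estimate, hence a large penalty weight, and is typically shrunk all the way to zero; the relevant coefficient is barely penalized.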
  5. By: Jerzy P. Rydlewski; Ma{\l}gorzata Snarska
    Abstract: Markov chain Monte Carlo is widely used to analyze the properties of intractable distributions in a convenient way. In this paper we derive conditions for geometric ergodicity of a general class of nonparametric stochastic volatility models with skewness, driven by a hidden Markov chain with switching.
    Date: 2012–09
  6. By: Alastair R. Hall; Denise R. Osborn; Nikolaos D. Sakkas
    Abstract: This paper investigates the usefulness of information criteria for inference on the number of structural breaks in a standard linear regression model. In particular, we propose a modified penalty function for such criteria based on theoretical arguments, which implies each break is equivalent to the estimation of three individual regression coefficients. A Monte Carlo analysis compares information criteria to sequential testing, with the modified BIC and HQIC criteria performing well overall, for DGPs both with and without breaks. The methods are also used to examine changes in Euro area monetary policy between 1971 and 2007.
    Date: 2012
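The information-criterion idea above can be made concrete with a stylized sketch (simplifying assumptions throughout: a mean-shift model, at most one break, and a BIC-type penalty that charges three coefficients per break, the modification the paper motivates):

```python
import math
import random

random.seed(3)
n = 200
# DGP: the mean shifts from 0 to 2 halfway through the sample
y = [random.gauss(0.0, 0.5) for _ in range(100)] + \
    [random.gauss(2.0, 0.5) for _ in range(100)]

def ssr(seg):
    # Sum of squared residuals around the segment mean
    m = sum(seg) / len(seg)
    return sum((v - m) ** 2 for v in seg)

# Zero breaks: one common mean; one break: best split over interior dates
ssr0 = ssr(y)
ssr1 = min(ssr(y[:k]) + ssr(y[k:]) for k in range(20, n - 20))

def bic(ssr_m, n_means, n_breaks):
    # Modified penalty: each break counts as three estimated coefficients
    k = n_means + 3 * n_breaks
    return n * math.log(ssr_m / n) + k * math.log(n)

breaks = 0 if bic(ssr0, 1, 0) < bic(ssr1, 2, 1) else 1
```

With a break this large the one-break fit wins despite the stiffer penalty; the penalty matters when breaks are small relative to the noise.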
  7. By: Michael W. McCracken
    Abstract: In this paper we provide analytical and Monte Carlo evidence that Chow and Predictive tests can be consistent against alternatives that allow structural change to occur at either end of the sample. Attention is restricted to linear regression models that may have a break in the intercept. The results are based on a novel reparameterization of the actual and potential break point locations. Standard methods parameterize both of these locations as fixed fractions of the sample size. We parameterize these locations as more general integer valued functions. Power at the ends of the sample is evaluated by letting both locations, as a percentage of the sample size, converge to zero or one. We find that for a potential break point function, the tests are consistent against alternatives that converge to zero or one at sufficiently slow rates and are inconsistent against alternatives that converge sufficiently quickly. Monte Carlo evidence supports the theory though large samples are sometimes needed for reasonable power.
    Keywords: Econometric models
    Date: 2012
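The end-of-sample setting above can be illustrated with a textbook Chow test for an intercept break near the end of the sample (the simulated data and break location are illustrative assumptions; the paper's contribution is the asymptotic framework, not this statistic itself):

```python
import random

random.seed(4)
n, tail = 200, 10
# Intercept-only model; the last `tail` observations shift upward by 3
y = [random.gauss(0.0, 1.0) for _ in range(n - tail)] + \
    [random.gauss(3.0, 1.0) for _ in range(tail)]

def ssr(seg):
    # Sum of squared residuals around the segment mean
    m = sum(seg) / len(seg)
    return sum((v - m) ** 2 for v in seg)

# Restricted model: one intercept; unrestricted: intercepts differ pre/post break
ssr_r = ssr(y)
ssr_u = ssr(y[:n - tail]) + ssr(y[n - tail:])
q, p = 1, 2  # one restriction tested; two parameters under the alternative
f_stat = ((ssr_r - ssr_u) / q) / (ssr_u / (n - p))
```

A large F statistic flags the late break; the paper's question is when such tests remain consistent as the break location drifts toward the sample endpoint.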
  8. By: Götz Thomas B.; Hecq Alain; Urbain Jean-Pierre (METEOR)
    Abstract: We combine the issues of dealing with variables sampled at mixed frequencies and the use of real-time data. In particular, the repeated observations forecasting (ROF) analysis of Stark and Croushore (2002) is extended to an autoregressive distributed lag setting in which the regressors may be sampled at higher frequencies than the regressand. For the US GDP quarterly growth rate, we compare the forecasting performance of an AR model with several mixed-frequency models, among them the MIDAS approach. The additional dimension provided by different vintages allows us to compute several forecasts for a given calendar date and to use them to construct forecast densities. Scoring rules are employed to test for their equality and to construct combinations of them. Given the change of the implied weights over time, we propose time-varying ROF-based weights using vintage data, which present an alternative to traditional weighting schemes.
    Keywords: Macroeconomics
    Date: 2012
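The density-combination step above can be sketched as follows (a minimal illustration, not the authors' ROF scheme; two Gaussian forecast densities are assumed, with combination weights proportional to exponentiated average log scores, one common recursive-weighting choice):

```python
import math
import random

def log_score(y, mu, sigma):
    # Log predictive density of outcome y under a N(mu, sigma^2) forecast
    return -0.5 * math.log(2 * math.pi * sigma ** 2) \
           - (y - mu) ** 2 / (2 * sigma ** 2)

random.seed(5)
realized = [random.gauss(0.0, 1.0) for _ in range(100)]

# Forecaster A is well calibrated; forecaster B is biased upward
score_a = sum(log_score(y, 0.0, 1.0) for y in realized) / len(realized)
score_b = sum(log_score(y, 2.0, 1.0) for y in realized) / len(realized)

# Log-score weights: a softmax over the average historical scores
za, zb = math.exp(score_a), math.exp(score_b)
w_a, w_b = za / (za + zb), zb / (za + zb)

def combined_density(y):
    # Combined forecast density: the mixture w_a*N(0,1) + w_b*N(2,1)
    return w_a * math.exp(log_score(y, 0.0, 1.0)) \
         + w_b * math.exp(log_score(y, 2.0, 1.0))
```

The better-calibrated density earns the larger weight; with vintage data these weights would be recomputed as each new release arrives.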
  9. By: Yoseph Yilma Getachew (Durham Business School); Keshab Bhattarai (University of Hull); Parantap Basu (Durham Business School)
    Abstract: The fixed effects (FE) estimator of "conditional convergence" in income based dynamic panel models could be biased downward when capital adjustment cost is present. Such a capital adjustment cost means a rising marginal cost of investment which could slow down the convergence. The standard FE regression fails to take this capital adjustment cost into account and thus could overestimate the rate of convergence. Using a Ramsey model with long-run adjustment cost of capital, we characterize this bias, which does not go away even for a longer time dimension. The size of the bias is greater in economies with a higher adjustment cost. The cross-country regression suggests that the size of this bias could be substantial.
    Keywords: Dynamic panel model, fixed effects, adjustment cost of capital, downward bias
    JEL: C5 D2 D9 O5
    Date: 2012–09–07
  10. By: Herrera Gómez, Marcos; Cid, Juan Carlos; Paz, Jorge Augusto
    Abstract: Spatial econometrics is a relatively young branch of econometrics, but one that has grown considerably in recent decades. The complexity of spatial analysis and of the estimation of spatial models has been the major obstacle for applied studies. The aim of this paper is to contribute to the diffusion of the spatial tools that have been developed. Specifically, the paper gives a concise review of the theoretical aspects involved in the spatial treatment. We also present an empirical application of the techniques discussed: using the statistical program R, we analyze the determinants of fertility in Argentina.
    Keywords: Spatial econometrics; Spatial autocorrelation; Fertility; Statistical program R
    JEL: J13 R12 C21
    Date: 2012
  11. By: Cizek, P.; Lei, J.; Ligthart, J.E. (Tilburg University, Center for Economic Research)
    Abstract: Spatial survival models typically require the frailties, which characterize unobserved heterogeneity, to be spatially correlated. This specification relies heavily on a predetermined covariance structure of the errors. However, the spatial effect may not only exist in the unobserved errors; it can also be present in the baseline hazards and the dependent variables. A new spatial survival model with these three possible spatial correlation structures is explored and used to investigate the determinants of value-added tax implementation in 92 countries over the period 1970–2008. The estimation results suggest the presence of a significant copycat effect among neighboring countries for both contiguity and distance weight matrices.
    Keywords: Spatial duration; MCMC; Metropolis-Hastings algorithm; Value-added tax
    JEL: C11 C23 C41 H20 H70
    Date: 2012
  12. By: Frank Schorfheide; Dongho Song
    Abstract: This paper develops a vector autoregression (VAR) for macroeconomic time series which are observed at mixed frequencies – quarterly and monthly. The mixed-frequency VAR is cast in state-space form and estimated with Bayesian methods under a Minnesota-style prior. Using a real-time data set, we generate and evaluate forecasts from the mixed-frequency VAR and compare them to forecasts from a VAR that is estimated based on data time-aggregated to quarterly frequency. We document how information that becomes available within the quarter improves the forecasts in real time.
    Keywords: Bayesian statistical decision theory ; Forecasting ; Vector autoregression
    Date: 2012
  13. By: Siem Jan Koopman (VU University Amsterdam, Department of Econometrics, De Boelelaan 1105, 1081 HV Amsterdam, The Netherlands and Tinbergen Institute); André Lucas (VU University Amsterdam, Department of Finance, De Boelelaan 1105, 1081 HV Amsterdam, The Netherlands, Tinbergen Institute and Duisenberg school of finance); Bernd Schwaab (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt, Germany)
    Abstract: We develop a high-dimensional and partly nonlinear non-Gaussian dynamic factor model for the decomposition of systematic default risk conditions into a set of latent components that correspond with macroeconomic/financial, default-specific (frailty), and industry-specific effects. Discrete default counts together with macroeconomic and financial variables are modeled simultaneously in this framework. In our empirical study based on defaults of U.S. firms, we find that approximately 35 percent of default rate variation is due to systematic and industry factors. Approximately one third of systematic variation is captured by macroeconomic/financial factors. The remainder is captured by frailty (about 40 percent) and industry (about 25 percent) effects. The default-specific effects are particularly relevant before and during times of financial turbulence. For example, we detect a build-up of systematic risk over the period preceding the 2008 credit crisis.
    Keywords: Financial crisis, default risk, credit portfolio models, frailty-correlated defaults, state space methods
    JEL: C33 G21
    Date: 2012–08
  14. By: Paul, Jérôme
    Abstract: The bootstrap aggregating procedure at the core of ensemble tree classifiers reduces, in most cases, the variance of such models while offering good generalization capabilities. The average predictive performance of those ensembles is known to improve up to a certain point while increasing the ensemble size. The present work studies this convergence in contrast to the stability of the class prediction and the variable selection performed while and after growing the ensemble. Experiments on several biomedical datasets, using random forests or bagging of decision trees, show that class prediction and, most notably, variable selection typically require orders of magnitude more trees to stabilize.
    Date: 2012
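The bootstrap aggregating procedure above can be illustrated with a small pure-Python experiment (decision stumps stand in for full trees; the data and ensemble size are illustrative assumptions): each tree is fit on a bootstrap resample and the class prediction is a majority vote, which settles down as trees are added.

```python
import random

random.seed(6)
n = 300
x = [random.uniform(-1, 1) for _ in range(n)]
# Noisy threshold rule: class 1 when x > 0, with 10% label noise
y = [(1 if xi > 0 else 0) ^ (1 if random.random() < 0.1 else 0) for xi in x]

def fit_stump(xs, ys):
    # Best single-threshold rule "predict 1 when x > t" (a depth-1 tree)
    best_t, best_err = -1.0, len(ys) + 1
    for t in [i / 10.0 for i in range(-10, 11)]:
        err = sum((1 if xi > t else 0) != yi for xi, yi in zip(xs, ys))
        if err < best_err:
            best_t, best_err = t, err
    return best_t

def bagged_predict(x0, n_trees):
    # Bootstrap aggregating: refit on resamples, then take a majority vote
    votes = 0
    for _ in range(n_trees):
        idx = [random.randrange(n) for _ in range(n)]  # bootstrap sample
        t = fit_stump([x[i] for i in idx], [y[i] for i in idx])
        votes += 1 if x0 > t else 0
    return 1 if 2 * votes > n_trees else 0

test_points = [-0.8, -0.4, 0.4, 0.8]
preds = [bagged_predict(x0, 25) for x0 in test_points]
```

Points far from the decision boundary get stable votes almost immediately; the paper's finding is that prediction and especially variable-selection stability near the boundary requires far larger ensembles.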
  15. By: Dionissi Aliprantis
    Abstract: This paper investigates the assumptions under which various parameters can be identified by the Moving to Opportunity (MTO) housing mobility experiment. Joint models of potential outcomes and selection into treatment are used to clarify the current interpretation of empirical evidence, distinguishing program effects from neighborhood effects. It is shown that MTO only identifies a restricted subset of the neighborhood effects of interest, with empirical evidence presented that MTO does not identify effects from moving to high-quality neighborhoods. One implication is that programs designed around measures other than poverty might have larger effects than MTO.
    Keywords: Housing policy ; Poverty
    Date: 2012

This nep-ecm issue is ©2012 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found on the NEP website. For comments, please write to the director of NEP, Marco Novarese. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.