nep-ecm New Economics Papers
on Econometrics
Issue of 2011‒07‒13
twenty-one papers chosen by
Sune Karlsson
Orebro University

  1. Monte Carlo Maximum Likelihood Estimation for Generalized Long-Memory Time Series Models By Geert Mesters; Siem Jan Koopman; Marius Ooms
  2. Bootstrap LR tests of stationarity, common trends and cointegration By Fabio Busetti; Silvestro di Sanzo
  3. Common Correlated Effects Estimation of Dynamic Panels with Cross-Sectional Dependence By T. DE GROOTE; G. EVERAERT
  4. Distributional results for thresholding estimators in high-dimensional Gaussian regression models By Pötscher, Benedikt M.; Schneider, Ulrike
  5. Testing for Multivariate Cointegration in the Presence of Structural Breaks: p-Values and Critical Values By David E. Giles; Ryan T. Godwin
  6. Testing for IIA with the Hausman-McFadden Test By Vijverberg, Wim P.
  7. Testing for Weak Identification in Possibly Nonlinear Models By Barbara Rossi; Atsushi Inoue
  8. A goodness-of-fit test based on ranks for ARMA models By Ferretti, Nélida E.; Kelmansky, Diana M.; Yohai, Victor J.
  9. Practical Proposals for Specifying k-Nearest Neighbours Weights Matrices By Gerkman, Linda; Ahlgren, Niklas
  10. Bayesian Adaptive Hamiltonian Monte Carlo with an Application to High-Dimensional BEKK GARCH Models By Martin Burda; John Maheu
  11. Calculating Confidence Intervals for Continuous and Discontinuous Functions of Estimated Parameters By Ham, John C.; Woutersen, Tiemen
  12. A nonparametric hypothesis test via the Bootstrap resampling By Temel, Tugrul
  13. Interpolation, outliers and inverse autocorrelations. By Peña, Daniel; Maravall, Agustín
  14. Automated model selection in finance: General-to-specific modelling of the mean and volatility specifications By Alvaro Escribano; Genaro Sucarrat
  15. A Consistency Theorem for Regular Conditional Distributions By Patrizia Berti; Pietro Rigo
  16. Band Spectrum Regressions using Wavelet Analysis By Andersson, Fredrik N. G.
  17. A Comprehensive Comparison of Alternative Tests for Jumps in Asset Prices By Marina Theodosiou; Filip Zikes
  18. Accounting For Endogenous Search Behavior in Matching Function Estimation By Borowczyk-Martins, Daniel; Jolivet, Grégory; Postel-Vinay, Fabien
  19. Limit Theorems for Empirical Processes Based on Dependent Data By Patrizia Berti; Luca Pratelli; Pietro Rigo
  20. Cycle Extraction: A Comparison of the Phase-Average Trend Method, the Hodrick-Prescott and Christiano-Fitzgerald Filters By Ronny Nilsson; Gyorgy Gyomai
  21. How Large is Large? Preliminary and relative guidelines for interpreting partial correlations in economics By Hristos Doucouliagos

  1. By: Geert Mesters (Netherlands Institute for the Study of Crime and Law Enforcement); Siem Jan Koopman (VU University Amsterdam); Marius Ooms (VU University Amsterdam)
    Abstract: An exact maximum likelihood method is developed for the estimation of parameters in a non-Gaussian nonlinear log-density function that depends on a latent Gaussian dynamic process with long-memory properties. Our method relies on importance sampling and on a linear Gaussian approximating model from which the latent process can be simulated. Given the presence of a latent long-memory process, we require a modification of the importance sampling technique. In particular, the long-memory process needs to be approximated by a finite dynamic linear process. Two possible approximations are discussed and compared. We show that an auto-regression obtained from minimizing mean squared prediction errors leads to an effective and feasible method. In our empirical study we analyze ten log-return series from the S&P 500 stock index using univariate and multivariate long-memory stochastic volatility models.
    Keywords: Fractional Integration; Importance Sampling; Kalman Filter; Latent Factors; Stochastic Volatility
    JEL: C33 C43
    Date: 2011–06–27
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20110090&r=ecm
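    A sketch of the second approximation idea (my own illustration in Python, not the authors' code): the autocovariances of stationary fractional noise with memory parameter d are available in closed form, and the Durbin-Levinson recursion converts them into the AR(p) coefficients that minimize mean squared prediction error.

      import numpy as np
      from scipy.special import gammaln

      def frac_noise_acov(d, nlags, sigma2=1.0):
          # ARFIMA(0,d,0) autocovariances:
          # gamma(k) = sigma2 * G(1-2d) G(k+d) / (G(d) G(1-d) G(k+1-d))
          k = np.arange(nlags + 1)
          logg = (gammaln(1 - 2 * d) + gammaln(k + d)
                  - gammaln(d) - gammaln(1 - d) - gammaln(k + 1 - d))
          return sigma2 * np.exp(logg)

      def durbin_levinson(gamma, p):
          # Order-p best linear predictor: the MSE-minimizing AR approximation
          phi = np.zeros((p + 1, p + 1))
          v = gamma[0]
          for n in range(1, p + 1):
              phi[n, n] = (gamma[n] - phi[n - 1, 1:n] @ gamma[n - 1:0:-1]) / v
              phi[n, 1:n] = phi[n - 1, 1:n] - phi[n, n] * phi[n - 1, n - 1:0:-1]
              v *= 1 - phi[n, n] ** 2
          return phi[p, 1:], v  # AR coefficients and prediction error variance

      ar_coefs, mse = durbin_levinson(frac_noise_acov(d=0.3, nlags=50), p=20)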
  2. By: Fabio Busetti (Bank of Italy); Silvestro di Sanzo (Confcommercio)
    Abstract: The paper considers likelihood ratio (LR) tests of stationarity, common trends and cointegration for multivariate time series. As the distribution of these tests is not known, a bootstrap version is proposed via a state space representation. The bootstrap samples are obtained from the Kalman filter innovations under the null hypothesis. Monte Carlo simulations for the Gaussian univariate random walk plus noise model show that the bootstrap LR test achieves higher power for medium-sized deviations from the null hypothesis than a locally optimal and one-sided LM test, which has a known asymptotic distribution. The power gains of the bootstrap LR test are significantly larger for testing the hypothesis of common trends and cointegration in multivariate time series, as the alternative asymptotic procedure, obtained as an extension of the LM test of stationarity, does not possess properties of optimality. Finally, it is shown that the (pseudo) LR tests maintain good size and power properties also for non-Gaussian series. As an empirical illustration, we find evidence of two common stochastic trends in the volatility of the US dollar exchange rate against European and Asian/Pacific currencies.
    Keywords: Kalman filter, state-space models, unit roots
    JEL: C12 C22
    Date: 2011–03
    URL: http://d.repec.org/n?u=RePEc:bdi:wptemi:td_799_11&r=ecm
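    To illustrate the flavour of the procedure (a simplified sketch, not the authors' method: they resample Kalman filter innovations, whereas this toy version simulates Gaussian data directly under the null of stationarity in the local level model):

      import numpy as np
      from scipy import stats
      from statsmodels.tsa.statespace.structural import UnobservedComponents

      def lr_stat(y):
          # H1: random walk plus noise (local level); H0: constant level plus noise
          llf_alt = UnobservedComponents(y, level='llevel').fit(disp=False).llf
          llf_null = stats.norm.logpdf(y, y.mean(), y.std()).sum()
          return 2.0 * (llf_alt - llf_null)

      def bootstrap_pvalue(y, n_boot=199, seed=42):
          rng = np.random.default_rng(seed)
          lr_obs = lr_stat(y)
          lr_boot = [lr_stat(y.mean() + y.std() * rng.standard_normal(len(y)))
                     for _ in range(n_boot)]
          return (1 + sum(b >= lr_obs for b in lr_boot)) / (n_boot + 1)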
  3. By: T. DE GROOTE; G. EVERAERT
    Abstract: We study estimation of dynamic panel data models with error cross-sectional dependence generated by an unobserved common factor. We show that for a temporally dependent factor, the standard within groups (WG) estimator is inconsistent even as both N and T tend to infinity. Next we investigate the properties of the common correlated effects pooled (CCEP) estimator of Pesaran [Econometrica, 2006], which eliminates the cross-sectional dependence using cross-sectional averages of the data. In contrast to the static case, the CCEP estimator is only consistent if T, in addition to N, tends to infinity. It is shown that for the most relevant parameter settings, the asymptotic bias of the CCEP estimator is larger than that of the infeasible WG estimator, which includes the common factors as regressors. Restricting the CCEP estimator results in a somewhat smaller asymptotic bias. The small sample properties of the various estimators are analysed using Monte Carlo experiments. The simulation results suggest that the CCEP estimator can be used to estimate dynamic panel data models provided T is not too small. The size of N is of less importance.
    Keywords: Cross-Sectional Dependence; Dynamic Panel; Common Correlated Effects
    JEL: C13 C15 C23
    Date: 2011–06
    URL: http://d.repec.org/n?u=RePEc:rug:rugwps:11/723&r=ecm
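    For intuition, a stripped-down static-case sketch of the CCEP idea (my own illustration, single regressor): cross-sectional averages proxy the unobserved factor, so each unit's series are projected off (1, ybar_t, xbar_t) before pooling.

      import numpy as np

      def ccep_slope(y, x):
          # y, x: (N, T) arrays
          ybar, xbar = y.mean(axis=0), x.mean(axis=0)        # factor proxies
          Z = np.column_stack([np.ones_like(ybar), ybar, xbar])
          M = np.eye(len(ybar)) - Z @ np.linalg.pinv(Z)      # annihilator of Z
          num = sum(xi @ M @ yi for xi, yi in zip(x, y))
          den = sum(xi @ M @ xi for xi in x)
          return num / den

      rng = np.random.default_rng(0)
      N, T = 50, 40
      f = np.cumsum(rng.standard_normal(T))                  # common factor
      x = rng.standard_normal((N, 1)) * f + rng.standard_normal((N, T))
      y = 0.5 * x + rng.standard_normal((N, 1)) * f + rng.standard_normal((N, T))
      print(ccep_slope(y, x))                                # close to 0.5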
  4. By: Pötscher, Benedikt M.; Schneider, Ulrike
    Abstract: We study the distribution of hard-, soft-, and adaptive soft-thresholding estimators within a linear regression model where the number of parameters k can depend on sample size n and may diverge with n. In addition to the case of known error-variance, we define and study versions of the estimators when the error-variance is unknown. We derive the finite-sample distribution of each estimator and study its behavior in the large-sample limit, also investigating the effects of having to estimate the variance when the degrees of freedom n-k do not tend to infinity or tend to infinity very slowly. Our analysis encompasses both the case where the estimators are tuned to perform consistent model selection and the case where the estimators are tuned to perform conservative model selection. Furthermore, we discuss consistency, uniform consistency and derive the minimax rate under either type of tuning.
    Keywords: Thresholding; Lasso; adaptive Lasso; penalized maximum likelihood; finite-sample distribution; asymptotic distribution; variance estimation; minimax rate; high-dimensional model; oracle property
    JEL: C13 C20 C52
    Date: 2011–06
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:31882&r=ecm
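    In the orthogonal-design case the three estimators reduce to simple componentwise maps of the least-squares estimate; a sketch (eta > 0 is the tuning parameter, and the adaptive form shown is the usual adaptive LASSO solution):

      import numpy as np

      def hard_threshold(b, eta):
          return np.where(np.abs(b) > eta, b, 0.0)

      def soft_threshold(b, eta):
          return np.sign(b) * np.maximum(np.abs(b) - eta, 0.0)

      def adaptive_soft_threshold(b, eta):
          # Zero for |b| <= eta; the shrinkage vanishes as |b| grows
          with np.errstate(divide='ignore', invalid='ignore'):
              return np.where(np.abs(b) > eta, b * (1.0 - eta**2 / b**2), 0.0)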
  5. By: David E. Giles (Department of Economics, University of Victoria); Ryan T. Godwin
    Abstract: Testing for multivariate cointegration when the data exhibit structural breaks is a problem that is encountered frequently in empirical economic analysis. The standard tests must be modified in this situation, and the asymptotic distributions of the test statistics change accordingly. We supply code that allows practitioners to easily calculate both p-values and critical values for the trace tests of Johansen et al. (2000). Access is also provided to tables of critical values for a broad selection of situations.
    Keywords: Cointegration, structural breaks, trace test, p-values, critical values
    JEL: C12 C32 C87
    Date: 2011–07–04
    URL: http://d.repec.org/n?u=RePEc:vic:vicewp:1110&r=ecm
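    For the no-break baseline, the trace statistics and critical values are available in statsmodels (a usage sketch; the break-adjusted p-values and critical values of Johansen et al. (2000) require the code supplied by the authors):

      import numpy as np
      from statsmodels.tsa.vector_ar.vecm import coint_johansen

      rng = np.random.default_rng(1)
      trend = np.cumsum(rng.standard_normal(500))       # one shared stochastic trend
      data = np.column_stack([trend + rng.standard_normal(500),
                              trend + rng.standard_normal(500)])

      res = coint_johansen(data, det_order=0, k_ar_diff=1)
      print(res.lr1)   # trace statistics for r = 0 and r <= 1
      print(res.cvt)   # 90/95/99% critical values, no structural breaks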
  6. By: Vijverberg, Wim P. (CUNY Graduate Center)
    Abstract: The Independence of Irrelevant Alternatives assumption inherent in multinomial logit models is most frequently tested with a Hausman-McFadden test. As is confirmed by many findings in the literature, this test sometimes produces negative outcomes, in contradiction of its asymptotic χ² distribution. This problem is caused by the use of an improper variance matrix and may lead to invalid statistical inference even when the test value is positive. With a correct specification of the variance, the sampling distribution for small samples is indeed close to a χ² distribution.
    Keywords: multinomial logit, IIA assumption, Hausman-McFadden test
    JEL: C12 C35
    Date: 2011–06
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp5826&r=ecm
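    In symbols, with b_r the estimates from a restricted choice set and b_f the corresponding full-set estimates, the statistic is H = (b_r − b_f)' (V_r − V_f)^(−1) (b_r − b_f). A hypothetical sketch of the computation, which makes visible how a variance difference that is not positive definite can turn H negative:

      import numpy as np
      from scipy import stats

      def hausman_mcfadden(b_r, V_r, b_f, V_f):
          # b_r, V_r: estimates/covariance with one alternative dropped
          # b_f, V_f: the same coefficient subvector from the full choice set
          d = b_r - b_f
          dV = V_r - V_f     # need not be positive semi-definite in finite samples
          H = float(d @ np.linalg.solve(dV, d))
          p = stats.chi2.sf(H, df=len(d)) if H >= 0 else np.nan  # negative H: undefined
          return H, p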
  7. By: Barbara Rossi; Atsushi Inoue
    Abstract: In this paper we propose a chi-square test for identification. Our proposed test statistic is based on the distance between two shrinkage extremum estimators. The two estimators converge in probability to the same limit when identification is strong, and their asymptotic distributions are different when identification is weak. The proposed test is consistent not only for the alternative hypothesis of no identification but also for the alternative of weak identification, which is confirmed by our Monte Carlo results. We apply the proposed technique to test whether the structural parameters of a representative Taylor-rule monetary policy reaction function are identified.
    Keywords: GMM, Shrinkage, Weak Identification
    JEL: C12
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:duk:dukeec:10-92&r=ecm
  8. By: Ferretti, Nélida E.; Kelmansky, Diana M.; Yohai, Victor J.
    Abstract: In this paper we introduce a goodness-of-fit test based on ranks for ARMA models. The classical portmanteau statistic is generalized to a class of estimators based on ranks. The asymptotic distributions of the proposed statistics are derived. Simulation results suggest that the proposed statistics have good robustness properties for an adequate choice of the score functions.
    Keywords: ARMA models; Ranks; Goodness-of-fit
    URL: http://d.repec.org/n?u=RePEc:ner:carlos:info:hdl:10016/5817&r=ecm
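    One member of such a class, as a hypothetical illustration (my own sketch, not the paper's statistics): van der Waerden normal scores of the residual ranks in a Ljung-Box-type quadratic form.

      import numpy as np
      from scipy import stats

      def rank_portmanteau(resid, m, n_fitted=0):
          n = len(resid)
          scores = stats.norm.ppf(stats.rankdata(resid) / (n + 1))  # normal scores
          scores -= scores.mean()
          denom = np.sum(scores**2)
          r = np.array([np.sum(scores[k:] * scores[:-k]) / denom
                        for k in range(1, m + 1)])
          Q = n * (n + 2) * np.sum(r**2 / (n - np.arange(1, m + 1)))
          return Q, stats.chi2.sf(Q, m - n_fitted)  # heuristic df correction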
  9. By: Gerkman, Linda (Department of Finance and Statistics); Ahlgren, Niklas (Department of Finance and Statistics)
    Abstract: In this article we introduce and evaluate testing procedures for specifying the number k of nearest neighbours in the weights matrix of spatial econometric models. The spatial J-test is used for specification search. Two testing procedures are suggested: an increasing neighbours testing procedure and a decreasing neighbours testing procedure. Simulations show that the increasing neighbours testing procedure can be used in large samples to determine k. The decreasing neighbours testing procedure is found to have low power, and is not recommended for use in practice. An empirical example involving house price data is provided to show how to use the testing procedures with real data.
    Keywords: k-nearest neighbours; model specification; spatial j-test; weights matrix
    Date: 2011–05–18
    URL: http://d.repec.org/n?u=RePEc:hhb:hanken:0555&r=ecm
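    Building a row-standardized k-nearest-neighbours weights matrix is mechanical once k is chosen; a sketch using Euclidean distances on point coordinates (hypothetical helper):

      import numpy as np
      from scipy.spatial import cKDTree

      def knn_weights(coords, k):
          # coords: (n, 2) locations. Returns the row-standardized k-nearest-
          # neighbours spatial weights matrix W with a zero diagonal.
          n = len(coords)
          _, idx = cKDTree(coords).query(coords, k=k + 1)  # self is first neighbour
          W = np.zeros((n, n))
          rows = np.repeat(np.arange(n), k)
          W[rows, idx[:, 1:].ravel()] = 1.0 / k
          return W

      W5 = knn_weights(np.random.default_rng(0).random((100, 2)), k=5)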
  10. By: Martin Burda; John Maheu
    Abstract: Hamiltonian Monte Carlo (HMC) is a recent statistical procedure to sample from complex distributions. Distant proposal draws are taken in a sequence of steps following the Hamiltonian dynamics of the underlying parameter space, often yielding superior mixing properties of the resulting Markov chain. However, its performance can deteriorate sharply with the degree of irregularity of the underlying likelihood due to its lack of local adaptability in the parameter space. Riemann Manifold HMC (RMHMC), a locally adaptive version of HMC, alleviates this problem, but at a substantially increased computational cost that can become prohibitive in high-dimensional scenarios. In this paper we propose the Adaptive HMC (AHMC), an alternative inferential method based on HMC that is both fast and locally adaptive, combining the advantages of both HMC and RMHMC. The benefits become more pronounced with higher dimensionality of the parameter space and with the degree of irregularity of the underlying likelihood surface. We show that AHMC satisfies detailed balance for a valid MCMC scheme and provide a comparison with RMHMC in terms of effective sample size, highlighting substantial efficiency gains of AHMC. Simulation examples and an application to the BEKK GARCH model show the usefulness of the new posterior sampler.
    Keywords: High-dimensional joint sampling; Markov chain Monte Carlo; Multivariate GARCH
    JEL: C01 C11 C15 C32
    Date: 2011–06–21
    URL: http://d.repec.org/n?u=RePEc:tor:tecipa:tecipa-438&r=ecm
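    As background, plain HMC with a leapfrog integrator fits in a few lines (a textbook sketch with an identity mass matrix; AHMC adds local adaptation on top of this, and RMHMC replaces the identity metric with a Riemann metric):

      import numpy as np

      def hmc_step(logp, grad_logp, theta, eps, n_steps, rng):
          # One transition: draw momentum, run leapfrog, accept/reject
          p = rng.standard_normal(theta.shape)
          th, pn = theta.copy(), p + 0.5 * eps * grad_logp(theta)
          for i in range(n_steps):
              th = th + eps * pn
              pn = pn + (eps if i < n_steps - 1 else 0.5 * eps) * grad_logp(th)
          log_alpha = logp(th) - 0.5 * pn @ pn - logp(theta) + 0.5 * p @ p
          return th if np.log(rng.random()) < log_alpha else theta

      # Example: sampling a standard bivariate normal
      rng = np.random.default_rng(0)
      theta, draws = np.zeros(2), []
      for _ in range(1000):
          theta = hmc_step(lambda t: -0.5 * t @ t, lambda t: -t,
                           theta, eps=0.2, n_steps=10, rng=rng)
          draws.append(theta)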
  11. By: Ham, John C. (University of Maryland); Woutersen, Tiemen (Johns Hopkins University)
    Abstract: The delta method is commonly used to calculate confidence intervals of functions of estimated parameters that are differentiable with non-zero, bounded derivatives. When the delta method is inappropriate, researchers usually first use a bootstrap procedure where they i) repeatedly take a draw from the asymptotic distribution of the parameter values and ii) calculate the function value for this draw. They then trim the bottom and top of the distribution of function values to obtain their confidence interval. This note first provides several examples where this procedure and/or delta method fail to provide an appropriate confidence interval. It next presents a method that is appropriate for constructing confidence intervals for functions that are discontinuous or are continuous but have zero or unbounded derivatives. In particular the coverage probabilities for our method converge uniformly to their nominal values, which is not necessarily true for the other methods discussed above.
    Keywords: confidence intervals, simulation, structural models, policy effects
    JEL: C12 C15
    Date: 2011–06
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp5816&r=ecm
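    The common procedure being critiqued, in code form (a sketch of that practice, not of the authors' proposed method):

      import numpy as np

      def naive_simulation_ci(theta_hat, V_hat, func, alpha=0.05, n_draws=10000, seed=0):
          # i) draw from the estimated asymptotic distribution of the parameters,
          # ii) evaluate the function on each draw, then trim the tails
          rng = np.random.default_rng(seed)
          draws = rng.multivariate_normal(theta_hat, V_hat, size=n_draws)
          values = np.apply_along_axis(func, 1, draws)
          return np.quantile(values, [alpha / 2, 1 - alpha / 2])

      # Example: a ratio of coefficients, problematic when the denominator is near zero
      ci = naive_simulation_ci(np.array([1.0, 2.0]),
                               np.diag([0.04, 0.09]),
                               lambda t: t[0] / t[1])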
  12. By: Temel, Tugrul
    Abstract: This paper adapts an existing nonparametric hypothesis test to the bootstrap framework. The test uses nonparametric kernel regression to estimate a measure of distance between the models stated under the null hypothesis. The bootstrapped version of the test makes it possible to approximate the errors involved in the asymptotic hypothesis test. The paper also develops Mathematica code for the test algorithm.
    Keywords: Hypothesis test; the bootstrap; nonparametric regression; omitted variables
    JEL: C14 C12 C15
    Date: 2011–06–28
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:31880&r=ecm
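    The general recipe, as a hypothetical sketch (my own names and simplifications: Nadaraya-Watson fits, a distance statistic for an omitted regressor, and a residual bootstrap for its null distribution):

      import numpy as np

      def nw_fit(x, y, grid, h):
          # Nadaraya-Watson kernel regression with a Gaussian kernel
          K = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2)
          return (K * y).sum(axis=1) / K.sum(axis=1)

      def bootstrap_omitted_test(x1, x2, y, h, n_boot=199, seed=0):
          # H0: E[y | x1, x2] = E[y | x1], i.e. x2 is irrelevant
          rng = np.random.default_rng(seed)
          m0 = nw_fit(x1, y, x1, h)                      # restricted fit
          resid = y - m0
          def dist(yy):
              e = yy - nw_fit(x1, yy, x1, h)
              return np.mean(nw_fit(x2, e, x2, h) ** 2)  # structure left in x2
          d_obs = dist(y)
          d_boot = [dist(m0 + rng.choice(resid, len(y), replace=True))
                    for _ in range(n_boot)]
          return (1 + sum(b >= d_obs for b in d_boot)) / (n_boot + 1)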
  13. By: Peña, Daniel; Maravall, Agustín
    Abstract: The paper addresses the problem of estimating missing observations in linear, possibly nonstationary, stochastic processes when the model is known. The general case of any possible distribution of missing observations in the time series is considered, and analytical expressions for the optimal estimators and their associated mean squared errors are obtained. These expressions involve solely the elements of the inverse or dual autocorrelation function of the series. This optimal estimator, the conditional expectation of the missing observations given the available ones, is equal to the estimator that results from filling the missing values in the series with arbitrary numbers, treating these numbers as additive outliers, and removing the outlier effects from the invented numbers using intervention analysis.
    Keywords: Missing observations; Outliers; Intervention analysis; ARIMA models; Inverse autocorrelation function;
    URL: http://d.repec.org/n?u=RePEc:ner:carlos:info:hdl:10016/2770&r=ecm
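    For a single interior missing value the result is strikingly simple. A sketch for a pure AR(p) process (hypothetical helper: the inverse autocorrelations of an AR are the autocorrelations of its dual MA, so they follow directly from the AR polynomial):

      import numpy as np

      def interpolate_missing(y, t, ar_coefs):
          # Optimal interpolator for an AR(p) with coefficients phi_1..phi_p:
          #   yhat_t = -sum_k rho_inv(k) * (y[t-k] + y[t+k])
          # where rho_inv are the inverse (dual) autocorrelations.
          a = np.r_[1.0, -np.asarray(ar_coefs, dtype=float)]   # dual MA coefficients
          p = len(a) - 1
          g = np.array([a[:len(a) - k] @ a[k:] for k in range(p + 1)])
          rho_inv = g / g[0]
          return -sum(rho_inv[k] * (y[t - k] + y[t + k]) for k in range(1, p + 1))

      # AR(1) check: yhat_t = phi / (1 + phi^2) * (y[t-1] + y[t+1])
      y = np.array([0.0, 1.0, np.nan, 0.5, 0.2])
      print(interpolate_missing(y, 2, [0.6]))   # 0.6 / 1.36 * 1.5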
  14. By: Alvaro Escribano (Universidad Carlos III de Madrid); Genaro Sucarrat (BI Norwegian School of Management)
    Abstract: General-to-Specific (GETS) modelling has witnessed major advances over the last decade thanks to the automation of multi-path GETS specification search. However, several scholars have argued that the estimation complexity associated with financial models constitutes an obstacle to multi-path GETS modelling in finance. Making use of a recent result on log-GARCH models, we provide and study simple but general and flexible methods that automate financial multi-path GETS modelling. Starting from a general model where the mean specification can contain autoregressive (AR) terms and explanatory variables, and where the exponential volatility specification can include log-ARCH terms, asymmetry terms, volatility proxies and other explanatory variables, the algorithm we propose returns parsimonious mean and volatility specifications. The finite sample properties of the methods are studied by means of extensive Monte Carlo simulations, and two empirical applications suggest the methods are very useful in practice.
    Keywords: general-to-specific; specification search; model selection; finance; volatility
    JEL: C32 C51 C52 E44
    Date: 2011–06–23
    URL: http://d.repec.org/n?u=RePEc:imd:wpaper:wp2011-09&r=ecm
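    The single-path core of any GETS search is easy to state; a deliberately naive sketch (backward elimination on p-values from a general OLS mean specification; the algorithm in the paper searches multiple deletion paths and also prunes the volatility equation):

      import numpy as np
      import statsmodels.api as sm

      def gets_single_path(y, X, names, alpha=0.05):
          # Start from the general model; repeatedly drop the least significant
          # regressor until all remaining ones are significant at level alpha.
          keep = list(range(X.shape[1]))
          while keep:
              res = sm.OLS(y, sm.add_constant(X[:, keep])).fit()
              pvals = res.pvalues[1:]           # skip the intercept
              worst = int(np.argmax(pvals))
              if pvals[worst] <= alpha:
                  break
              keep.pop(worst)
          return [names[j] for j in keep]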
  15. By: Patrizia Berti (Department of Mathematics, University of Modena and Reggio Emilia); Pietro Rigo (Department of Economics and Quantitative Methods, University of Pavia)
    Abstract: Let (Ω, B) be a measurable space, A_n ⊂ B a sub-σ-field and µ_n a random probability measure, n ≥ 1. In various frameworks, one looks for a probability P on B such that µ_n is a regular conditional distribution for P given A_n for all n. Conditions for such a P to exist are given. The conditions are quite simple when (Ω, B) is a compact Hausdorff space equipped with the Borel or the Baire σ-field (as well as under other similar assumptions). Such conditions are then applied to Bayesian statistics.
    Keywords: Posterior distribution, Random probability measure, Regular conditional distribution.
    Date: 2010–12
    URL: http://d.repec.org/n?u=RePEc:pav:wpaper:259&r=ecm
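    For concreteness, the standard definition involved: µ_n is a regular conditional distribution for P given A_n when µ_n(ω) is a probability measure for each ω, ω → µ_n(ω)(B) is A_n-measurable for each B, and

      P(A \cap B) = \int_A \mu_n(\omega)(B)\, P(d\omega)
      \quad \text{for all } A \in \mathcal{A}_n,\ B \in \mathcal{B}.

    The question studied is when a single P can satisfy this simultaneously for every n.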
  16. By: Andersson, Fredrik N. G. (Department of Economics, Lund University)
    Abstract: In economics it is common to distinguish between different time horizons (i.e. short run, medium run, and long run). Engle (1974) proposed combining the discrete Fourier transform with a band spectrum regression to estimate models that separate different time horizons. In this paper we discuss possibilities and challenges of using the maximal overlap discrete wavelet transform instead of the Fourier transform when estimating band spectrum regressions.
    Keywords: band spectrum regression; wavelet transform; frequency domain; economic modeling
    JEL: C14 C32 C51
    Date: 2011–06–29
    URL: http://d.repec.org/n?u=RePEc:hhs:lunewp:2011_022&r=ecm
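    To make the idea concrete, a one-level Haar MODWT sketch (my own minimal version: the detail series carries the short-run variation and the smooth series the long-run, with a separate regression on each band):

      import numpy as np

      def haar_modwt_level1(z):
          # One-level circular Haar MODWT: detail (high frequency) plus
          # smooth (low frequency); detail + smooth = z
          lagged = np.roll(z, 1)
          return (z - lagged) / 2.0, (z + lagged) / 2.0

      def slope(a, b):
          a, b = a - a.mean(), b - b.mean()
          return (a @ b) / (a @ a)

      def band_slopes(y, x):
          dy, sy = haar_modwt_level1(y)
          dx, sx = haar_modwt_level1(x)
          return slope(dx, dy), slope(sx, sy)   # short-run, long-run slopes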
  17. By: Marina Theodosiou (Central Bank of Cyprus); Filip Zikes (Imperial College London)
    Abstract: This paper presents a comprehensive comparison of the existing tests for the presence of jumps in prices of financial assets. The relative performance of the tests is examined in a Monte Carlo simulation, covering scenarios of both finite and infinite activity jumps, stochastic volatility models with continuous and discontinuous volatility sample paths, microstructure noise, infrequent trading and deterministic diurnal volatility. The simulation results reveal important differences in terms of size and power across the different data generating processes and sensitivity to the presence of zero returns and microstructure frictions in the data. An empirical application to assets from different classes complements the analysis.
    Keywords: Quadratic variation, jumps, stochastic volatility, realized measures,high-frequency data
    JEL: E31 C12 C14 G10
    Date: 2011–07
    URL: http://d.repec.org/n?u=RePEc:cyb:wpaper:2011-1&r=ecm
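    One benchmark in such comparisons is the bipower-variation ratio test of Barndorff-Nielsen and Shephard; a sketch from one day of intraday returns (standard formulas, my own coding, with the usual max adjustment):

      import numpy as np
      from scipy import stats
      from scipy.special import gamma as G

      def bns_ratio_test(r):
          # Realized variance picks up jumps; bipower variation does not
          n, a = len(r), np.abs(r)
          mu1 = np.sqrt(2.0 / np.pi)
          mu43 = 2 ** (2 / 3) * G(7 / 6) / G(1 / 2)
          rv = np.sum(r ** 2)
          bv = mu1 ** -2 * np.sum(a[1:] * a[:-1])
          tq = n * mu43 ** -3 * np.sum((a[2:] * a[1:-1] * a[:-2]) ** (4 / 3))
          theta = np.pi ** 2 / 4 + np.pi - 5
          z = (1.0 - bv / rv) / np.sqrt(theta / n * max(1.0, tq / bv ** 2))
          return z, stats.norm.sf(z)   # large z signals a jump on this day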
  18. By: Borowczyk-Martins, Daniel; Jolivet, Grégory; Postel-Vinay, Fabien
    Abstract: We show that equilibrium matching models imply that standard estimates of the matching function elasticities are exposed to an endogeneity bias, which arises from the search behavior of agents on either side of the market. We offer an estimation method which, under certain assumptions, is immune from that bias. Application of our method to the estimation of a basic version of the matching function using aggregate U.S. data from the Job Openings and Labor Turnover Survey (JOLTS) suggests that the bias is quantitatively important.
    Keywords: Job Finding; Matching Function Estimation; Unemployment; Vacancies
    JEL: J63 J64
    Date: 2011–07
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:8471&r=ecm
  19. By: Patrizia Berti (Department of Mathematics, University of Modena and Reggio Emilia); Luca Pratelli (Accademia Navale di Livorno); Pietro Rigo (Department of Economics and Quantitative Methods, University of Pavia)
    Abstract: Empirical processes for non-ergodic data are investigated under uniform distance. Some CLTs, both uniform and non-uniform, are proved. In particular, conditions for B_n = n^(1/2)(µ_n − b_n) and C_n = n^(1/2)(µ_n − a_n) to converge in distribution are given, where µ_n is the empirical measure, a_n the predictive measure, and b_n = (1/n) Σ_{i=0}^{n−1} a_i. Such conditions can be applied to any adapted sequence of random variables. Various examples and a characterization of conditionally identically distributed sequences are given as well.
    Keywords: Conditional identity in distribution, empirical process, exchangeability, predictive measure, stable convergence.
    Date: 2010–11
    URL: http://d.repec.org/n?u=RePEc:pav:wpaper:255&r=ecm
  20. By: Ronny Nilsson; Gyorgy Gyomai
    Abstract: This paper reports on the revision properties of different de-trending and smoothing methods (cycle estimation methods), including PAT with MCD smoothing, a double Hodrick-Prescott (HP) filter and the Christiano-Fitzgerald (CF) filter. The different cycle estimation methods are rated on their revision performance in a simulated real time experiment. Our goal is to find a robust method that gives early and stable turning point signals. The revision performance of the methods has been evaluated according to bias, overall revision size and signal stability measures. In a second phase, we investigate whether revision performance is improved using stabilizing forecasts or by changing the cycle estimation window from the baseline 6 and 96 months (i.e. filtering out high-frequency noise with a cycle length shorter than 6 months and removing trend components with a cycle length longer than 96 months) to 12 and 120 months. The results show that, for all tested time series, the PAT de-trending method is outperformed by both the HP and CF filters. In addition, the results indicate that the HP filter outperforms the CF filter in turning point signal stability but has a weaker performance in absolute numerical precision. Short horizon stabilizing forecasts tend to improve the revision characteristics of both methods, and the changed filter window also delivers more robust turning point estimates.
    Date: 2011–05–27
    URL: http://d.repec.org/n?u=RePEc:oec:stdaaa:2011/3-en&r=ecm
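    Both reference filters are available in statsmodels; a sketch for monthly data (the double-HP construction is my own reading: one pass with a large lambda removes the trend, a second with a small lambda strips the noise; the lambda values shown approximate the 12- and 120-month cut-offs of the alternative window discussed above):

      import numpy as np
      from statsmodels.tsa.filters.hp_filter import hpfilter
      from statsmodels.tsa.filters.cf_filter import cffilter

      y = np.cumsum(np.random.default_rng(0).standard_normal(600))  # monthly series

      # Christiano-Fitzgerald: keep cycles between 12 and 120 months directly
      cf_cycle, _ = cffilter(y, low=12, high=120)

      # Double HP: hpfilter returns (cycle, trend)
      detrended, _ = hpfilter(y, lamb=133107.94)     # remove trend beyond ~120 months
      _, hp_cycle = hpfilter(detrended, lamb=13.93)  # smooth away noise below ~12 months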
  21. By: Hristos Doucouliagos
    Abstract: An essential part of empirical economics research is the identification of the size of an empirical effect. Partial correlations offer a convenient statistically based measure of the strength of an economic relationship. A key question arises in their interpretation: When is a partial correlation large? This paper draws upon the observed distribution of 22,000 partial correlations from a diverse group of economics fields. The median absolute partial correlation from these fields is 0.173, which under Cohen’s (1988) conventional guidelines for zero-order correlations is a small to moderate effect. The paper develops new guidelines for key qualitative categories (small, medium and large). According to the new guidelines, partial correlations larger than ±0.33 can be deemed large. This is considerably different from Cohen’s guideline of ±0.50 for zero-order correlations. Researchers and meta-analysts should exercise caution when applying Cohen’s guidelines to describe the importance of partial correlations in economics.
    Keywords: partial correlations; guidelines; empirical economics; meta-analysis
    JEL: C01 C50
    Date: 2011–07–04
    URL: http://d.repec.org/n?u=RePEc:dkn:econwp:eco_2011_5&r=ecm
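    Meta-analysts typically recover partial correlations from reported t-statistics; the conversion and the proposed guideline in code (the conversion formula is standard):

      import math

      def partial_correlation(t_stat, dof):
          # r_p = t / sqrt(t^2 + dof), the partial correlation implied by a
          # regression t-statistic with the given residual degrees of freedom
          return t_stat / math.sqrt(t_stat ** 2 + dof)

      def is_large(r_p):
          # The paper's guideline: |partial correlation| above 0.33 counts as large
          return abs(r_p) > 0.33

      print(partial_correlation(2.5, 200))   # about 0.174, near the reported median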

This nep-ecm issue is ©2011 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.