nep-ecm New Economics Papers
on Econometrics
Issue of 2011‒07‒21
sixteen papers chosen by
Sune Karlsson
Orebro University

  1. Unit Root Testing with Stationary Covariates and a Structural Break in the Trend Function By Fossati, Sebastian
  2. Mixtures of g-priors for bayesian model averaging with economic applications By Eduardo Ley; Mark F.J. Steel
  3. Enhanced Objective Bayesian Testing for the Equality of two Proportions By Guido Consonni; Jonathan J. Forster; Luca La Rocca
  4. Specification Sensitivities in Right-Tailed Unit Root Testing for Financial Bubbles By Shu-Ping Shi; Peter C. B. Phillips; Jun Yu
  5. A new approach for estimation of long-run relationships in economic analysis using Engle-Granger and artificial intelligence methods By Arshia Amiri; Ulf-G Gerdtham; Bruno Ventelou
  6. Tests for neglected heterogeneity in moment condition models By Jinyong Hahn; Whitney Newey; Richard Smith
  7. Hierarchical hidden Markov structure for dynamic correlations: the hierarchical RSDC model (version révisée) By Philippe Charlot; Vêlayoudom Marimoutou
  8. The Robustness of the Hyperbolic Efficiency Estimator By Christopher Bruffaerts; Bram De Rock; Catherine Dehon
  9. Summability of stochastic processes: a generalization of integration and co-integration valid for non-linear processes By Vanessa Berenguer Rico; Jesus Gonzalo
  10. Avoid Filling Swiss Cheese with Whipped Cream: Imputation Techniques and Evaluation Procedures for Cross-Country Time Series By Michaela Denk; Michael Weber
  11. Predicting Bid-Ask Spreads Using Long Memory Autoregressive Conditional Poisson Models By Axel Groß-Klußmann; Nikolaus Hautsch
  12. The Contribution of Structural Break Models to Forecasting Macroeconomic Series By Luc Bauwens; Gary Koop; Dimitris Korobilis; Jeroen V.K. Rombouts
  13. The identification of a mixture of first order binary Markov Chains By Martin Browning; Jesús M. Carro
  14. Testing Hardy-Weinberg Equilibrium: an Objective Bayesian Analysis By Guido Consonni; Elias Moreno; Sergio Venturini
  15. State dependence and heterogeneity in health using a bias corrected fixed effects estimator By Jesús M. Carro; Alejandra Traferri
  16. When Can We Trust Population Thresholds in Regression Discontinuity Designs? By Florian Ade; Ronny Freier

  1. By: Fossati, Sebastian (University of Alberta, Department of Economics)
    Abstract: The issue of testing for a unit root allowing for a structural break in the trend function is considered. The focus is on the construction of more powerful tests using the information in relevant multivariate data sets. The proposed test adopts the GLS detrending approach and uses correlated stationary covariates to improve power. As is standard in the literature, the break date is treated as unknown. Asymptotic distributions are derived and a set of asymptotic and finite sample critical values are tabulated. Asymptotic local power functions show that power gains can be large. Finite sample results show that the test exhibits small size distortions and power that can be far beyond what is achievable by univariate tests.
    Keywords: unit root test; GLS detrending; structural break
    JEL: C22 C32
    Date: 2011–05–01
    URL: http://d.repec.org/n?u=RePEc:ris:albaec:2011_010&r=ecm
  2. By: Eduardo Ley; Mark F.J. Steel
    Abstract: We examine the issue of variable selection in linear regression, where we have a potentially large amount of possible covariates and economic theory offers insufficient guidance on how to select the appropriate subset. In this context, Bayesian Model Averaging presents a formal Bayesian solution to dealing with model uncertainty. Our main interest here is the effect of the prior on the results, such as posterior inclusion probabilities of regressors and predictive performance. We combine a Binomial-Beta prior on model size with a g-prior on the coefficients of each model. In addition, we assign a hyperprior to g, as the choice of g has been found to have a large impact on the results. For the prior on g, we examine a class of Beta shrinkage priors, which covers most choices in the recent literature. We propose a benchmark Beta prior, inspired by earlier findings with fixed g, and show it leads to consistent model selection. Inference is conducted through a Markov chain Monte Carlo sampler over model space and g. We examine the performance of the various priors in the context of simulated and real data. For the latter, we consider two important applications in economics, namely cross-country growth regression and returns to schooling. Recommendations to applied users are provided.
    Keywords: Consistency, Model uncertainty, Posterior odds, Prediction, Robustness
    JEL: C11 O47
    Date: 2011–07
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws112116&r=ecm
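The model-space sampling the abstract describes can be illustrated with a minimal MC3-style sketch. For brevity this fixes g and uses a uniform model prior — the paper's contributions are precisely the hyperprior on g and the Binomial-Beta prior on model size — and all function names are illustrative, not the authors' code:

```python
import numpy as np

def log_ml(y, X, gamma, g):
    """Log marginal likelihood (up to a constant) of the model including the
    columns of X flagged by `gamma`, under a Zellner g-prior with fixed g."""
    n = len(y)
    yc = y - y.mean()
    k = int(gamma.sum())
    if k == 0:
        r2 = 0.0
    else:
        Xg = X[:, gamma.astype(bool)]
        Xc = Xg - Xg.mean(axis=0)
        beta, *_ = np.linalg.lstsq(Xc, yc, rcond=None)
        r2 = 1.0 - np.sum((yc - Xc @ beta) ** 2) / np.sum(yc ** 2)
    return 0.5 * (n - 1 - k) * np.log(1 + g) - 0.5 * (n - 1) * np.log(1 + g * (1 - r2))

def mc3(y, X, g=None, iters=5000, seed=1):
    """MC3 over model space: propose flipping one inclusion indicator,
    accept by Metropolis-Hastings; returns posterior inclusion probabilities."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    g = n if g is None else g          # unit-information choice as a default
    gamma = np.zeros(p)
    cur = log_ml(y, X, gamma, g)
    incl = np.zeros(p)
    for _ in range(iters):
        j = rng.integers(p)
        prop = gamma.copy()
        prop[j] = 1 - prop[j]
        new = log_ml(y, X, prop, g)
        if np.log(rng.random()) < new - cur:
            gamma, cur = prop, new
        incl += gamma
    return incl / iters
```

With g fixed, each model's marginal likelihood is available in closed form, so the sampler only has to walk over inclusion indicators; adding the hyperprior on g would add a second Metropolis step.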
  3. By: Guido Consonni (Department of Economics and Quantitative Methods, University of Pavia); Jonathan J. Forster (School of Mathematics, University of Southampton); Luca La Rocca (Dipartimento di Comunicazione e Economia, University of Modena and Reggio Emilia)
    Abstract: We develop a new class of prior distributions for Bayesian comparison of nested models, which we call intrinsic moment priors, by combining the well-established notion of intrinsic prior with the recently introduced idea of non-local priors, and in particular of moment priors. Specifically, we aim at testing the equality of two proportions, based on independent samples, and thus focus on discrete data models. Given two nested models, each equipped with a default prior, we first construct a moment prior under the larger model. In this way, the asymptotic learning behavior of the Bayes factor is strengthened, relative to currently used local priors, when the smaller model holds; remarkably, this effect is already apparent for moderate sample sizes. On the other hand, the asymptotic learning behavior of the Bayes factor when the larger model holds is unchanged. However, without appropriate tuning, a moment prior does not provide enough evidence for the larger model when the sample size is small and the data only moderately support the smaller one. For this reason, we apply to the moment prior an intrinsic prior procedure, which amounts to pulling the moment prior towards the subspace specified by the smaller model; we provide general guidelines for determining the training sample size necessary to implement this step. Thus, by joining the virtues of moment and intrinsic priors, we obtain an enhanced objective Bayesian testing procedure: i) our evidence for small samples is broadly comparable to that given by current objective methods; ii) we achieve a superior learning performance as the sample size increases (when the smaller model holds). We first illustrate our methodology in a running Bernoulli example, where we test a sharp null hypothesis, then we implement our procedure to test the equality of two proportions. A detailed analysis of the properties of our method, including a comparison with standard intrinsic priors, is presented together with an application to a collection of real-world 2 × 2 tables involving a sensitivity analysis and a cross-validation study.
    Keywords: Bayes factor, intrinsic prior, model choice, moment prior, non-local prior, training sample size.
    Date: 2010–10
    URL: http://d.repec.org/n?u=RePEc:pav:wpaper:248&r=ecm
  4. By: Shu-Ping Shi (The Australian National University); Peter C. B. Phillips (Yale University, University of Auckland, University of Southampton and Singapore Management University); Jun Yu (Singapore Management University and Hong Kong Institute for Monetary Research)
    Abstract: Right-tailed unit root tests have proved promising for detecting exuberance in economic and financial activities. As with left-tailed tests, the limit theory and test performance are sensitive to the null hypothesis and the model specification used in parameter estimation. This paper aims to provide some empirical guidelines for the practical implementation of right-tailed unit root tests, focusing on the sup ADF test of Phillips, Wu and Yu (2011), which implements a right-tailed ADF test repeatedly on a sequence of forward sample recursions. We analyze and compare the limit theory of the sup ADF test under different hypotheses and model specifications. The size and power properties of the test under various scenarios are examined in simulations and some recommendations for empirical practice are given. An empirical application to Nasdaq data reveals the practical importance of model specification for test outcomes.
    Keywords: Unit Root Test, Mildly Explosive Process, Recursive Regression, Size and Power
    JEL: C15 C22
    Date: 2011–06
    URL: http://d.repec.org/n?u=RePEc:hkm:wpaper:172011&r=ecm
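The forward recursion behind the sup ADF test can be sketched in a few lines of NumPy: compute a right-tailed ADF t-statistic on each forward-expanding subsample y[0:k] and take the maximum. This is an illustrative sketch (zero lag augmentation by default, function names our own), not the authors' implementation:

```python
import numpy as np

def adf_stat(y, lags=0):
    """OLS t-statistic on rho in: dy_t = a + rho*y_{t-1} + sum_i b_i*dy_{t-i} + e_t."""
    dy = np.diff(y)
    X = [np.ones(len(dy) - lags), y[lags:-1]]
    for i in range(1, lags + 1):
        X.append(dy[lags - i:len(dy) - i])
    X = np.column_stack(X)
    Y = dy[lags:]
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ beta
    s2 = resid @ resid / (len(Y) - X.shape[1])
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

def sup_adf(y, r0=0.1):
    """Sup ADF: max of ADF t-stats over forward-expanding samples y[0:k],
    starting from an initial fraction r0 of the sample."""
    n = len(y)
    k0 = int(np.floor(r0 * n))
    return max(adf_stat(y[:k]) for k in range(k0, n + 1))
```

Under the null the statistic follows a sup of Dickey-Fuller-type functionals, so inference requires the right-tailed critical values tabulated in Phillips, Wu and Yu (2011), not standard ADF tables.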
  5. By: Arshia Amiri (Department of Agricultural Economics - Shiraz University); Ulf-G Gerdtham (Lund University - Lund University); Bruno Ventelou (GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille - Université de la Méditerranée - Aix-Marseille II - Université Paul Cézanne - Aix-Marseille III - Ecole des Hautes Etudes en Sciences Sociales (EHESS) - CNRS : UMR6579)
    Abstract: In time series analysis, most estimation of relationships and tests are typically based on linear estimators, and most classical co-integration methods and causality tests are based on OLS regressions. However, the linear functional specification is not necessarily the most appropriate form. This paper breaks with the ordinary rules in econometrics and combines time series methods with artificial intelligence methods, testing for the existence of nonlinear relationships. We illustrate the testing exercise using two examples based on OECD health data. In our illustration we confirm that the improved nonlinear AEG and VEC methods have a significantly better ability to identify long-run co-integration and causal relationships than the ordinary linear ones. Ordinary and improved nonlinear methods give similar results if the variables in a model are approximately linear.
    Keywords: Cointegration; Non-linear time series analysis; Augmented Engle-Granger; Vector error correction method; Artificial intelligence; Health economics
    Date: 2011–07–05
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:halshs-00606048&r=ecm
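As context for the nonlinear AEG procedure, the linear Engle-Granger benchmark is a two-step test: regress one variable on the other by OLS, then apply an ADF-type regression to the residuals. A minimal sketch (illustrative only; the statistic must be compared with Engle-Granger critical values, e.g. about -3.34 at the 5% level for two variables, not standard normal ones):

```python
import numpy as np

def engle_granger(y, x):
    """Two-step Engle-Granger statistic: (1) OLS of y on x with a constant;
    (2) no-constant ADF-type t-stat on the residuals, du_t = rho*u_{t-1} + e_t.
    A large negative value suggests cointegration."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    u = y - X @ beta                       # step 1: long-run residuals
    du, ul = np.diff(u), u[:-1]            # step 2: ADF regression on residuals
    rho = (ul @ du) / (ul @ ul)
    resid = du - rho * ul
    s2 = resid @ resid / (len(du) - 1)
    se = np.sqrt(s2 / (ul @ ul))
    return rho / se
```

The paper's nonlinear variant replaces the linear step-1 regression with a flexible (artificial-intelligence-based) functional form before testing the residuals.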
  6. By: Jinyong Hahn; Whitney Newey (Institute for Fiscal Studies and Massachusetts Institute of Technology); Richard Smith (Institute for Fiscal Studies and University of Cambridge)
    Abstract: The central concern of the paper is with the formulation of tests of neglected parameter heterogeneity appropriate for model environments specified by a number of unconditional or conditional moment conditions. We initially consider the unconditional moment restrictions framework. Optimal m-tests against moment condition parameter heterogeneity are derived with the relevant Jacobian matrix obtained as the second order derivative of the moment indicator in a leading case. GMM and GEL tests of specification based on generalized information matrix equalities appropriate for moment-based models are described and their relation to the optimal m-tests against moment condition parameter heterogeneity examined. A fundamental and important difference is noted between GMM and GEL constructions. The paper is concluded by a generalization of these tests to the conditional moment context.
    Date: 2011–07
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:26/11&r=ecm
  7. By: Philippe Charlot (GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille - Université de la Méditerranée - Aix-Marseille II - Université Paul Cézanne - Aix-Marseille III - Ecole des Hautes Etudes en Sciences Sociales (EHESS) - CNRS : UMR6579); Vêlayoudom Marimoutou (IFP - Institut Français de Pondichéry - Ministère des Affaires étrangères et européennes - CNRS : UMIFRE21)
    Abstract: This paper presents a new multivariate GARCH model with time-varying conditional correlation structure, which is a special case of the Regime Switching Dynamic Correlation (RSDC) of Pelletier (2006). This model, which we have named Hierarchical RSDC (HRSDC), is built on the hierarchical generalization of the hidden Markov model introduced by Fine et al. (1998). It can be viewed graphically as a tree structure with two types of states. The former, called production states, can emit observations, as in the classical Markov-switching approach. The latter, called "abstract" states, cannot emit observations but establish the vertical and horizontal probabilities that define the dynamics of the hidden hierarchical structure. The main advantage of this approach, compared to the classical Markov-switching model, is that it improves the granularity of the regimes. Our model is also comparable to the new Double Smooth Transition Conditional Correlation GARCH model (DSTCC), a STAR approach for dynamic correlations proposed by Silvennoinen and Terasvirta (2007), since, under certain assumptions, the DSTCC and our model represent the two classical competing approaches to modeling regime switching. We perform Monte Carlo simulations and apply the model to two empirical applications studying the conditional correlations of selected stock returns. Results show that the HRSDC provides a good measure of the correlations and possesses interesting explanatory power.
    Keywords: Multivariate GARCH - Dynamic correlations - Regime switching - Markov chain - Hidden Markov models - Hierarchical Hidden Markov models
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-00605965&r=ecm
  8. By: Christopher Bruffaerts; Bram De Rock; Catherine Dehon
    Abstract: In this paper we examine the robustness properties of a specific type of orientation in the context of efficiency measurement using partial frontiers. This so-called unconditional hyperbolic α-quantile estimator of efficiency was recently studied by Wheelock and Wilson (2008) and can be seen as an extension of the input/output methodology of partial frontiers introduced by Aragon, Daouia and Thomas-Agnan (2005). The influence function of this fully non-parametric and unconditional estimator is here derived for a complete multivariate setup (multiple inputs and outputs). Like the input- and output-quantile estimators, the hyperbolic α-quantile estimator is B-robust. The asymptotic variance of this estimator is recovered from the influence function. Some examples are given to assess the relevance of this type of estimator and to show the differences with the input- and output-quantile estimators of efficiency from both a robustness and a statistical efficiency point of view.
    Keywords: asymptotic variance; efficiency measurement; hyperbolic orientation; influence function; gross-error sensitivity
    Date: 2011–06
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2013/91263&r=ecm
  9. By: Vanessa Berenguer Rico; Jesus Gonzalo
    Abstract: The order of integration is valid to characterize linear processes; but it is not appropriate for non-linear worlds. We propose the concept of summability (a re-scaled partial sum of the process being Op(1)) to handle non-linearities. The paper shows that this new concept, S (d): (i) generalizes I (d); (ii) measures the degree of persistence as well as of the evolution of the variance; (iii) controls the balancedness of non-linear relationships; (iv) opens the door to the concept of co-summability which represents a generalization of co-integration for non-linear processes. To make this concept empirically applicable, an estimator for d and its asymptotic properties are provided. The finite sample performance of subsampling confidence intervals is analyzed via a Monte Carlo experiment. The paper finishes with the estimation of the degree of summability of the macroeconomic variables in an extended version of the Nelson-Plosser database.
    Keywords: Co-integration, Co-summability, Integrated processes, Non-linear balanced relationships, Non-linear processes, Summability
    JEL: C01 C22
    Date: 2011–06
    URL: http://d.repec.org/n?u=RePEc:cte:werepe:we1115&r=ecm
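The intuition behind S(d) can be illustrated numerically: if the re-scaled partial sums S_k of a process are Op(1) after dividing by k^(1/2+d), then |S_k| grows roughly like k^(1/2+d), so the slope of log|S_k| on log k recovers 1/2+d. The toy sketch below uses this growth-rate idea for illustration only; it is not the estimator whose asymptotic properties the paper derives:

```python
import numpy as np

def summability_order(y):
    """Toy gauge of d in S(d): regress log|S_k| on log k, where S_k is the
    k-th partial sum, and subtract the 1/2 contributed by CLT-type scaling.
    Illustrative only -- not the paper's formal estimator."""
    S = np.cumsum(y)
    k = np.arange(1, len(y) + 1)
    mask = np.abs(S) > 1e-12          # guard against log(0)
    slope = np.polyfit(np.log(k[mask]), np.log(np.abs(S[mask])), 1)[0]
    return slope - 0.5
```

In this scheme white noise is S(0) (its partial sums grow like k^(1/2)), while a random walk is S(1), matching I(1) in the linear world.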
  10. By: Michaela Denk; Michael Weber
    Abstract: International organizations collect data from national authorities to create multivariate cross-sectional time series for their analyses. As data from countries with not yet well-established statistical systems may be incomplete, the bridging of data gaps is a crucial challenge. This paper investigates data structures and missing data patterns in the cross-sectional time series framework, reviews missing value imputation techniques used for micro data in official statistics, and discusses their applicability to cross-sectional time series. It presents statistical methods and quality indicators that enable the (comparative) evaluation of imputation processes and completed datasets.
    Date: 2011–06–30
    URL: http://d.repec.org/n?u=RePEc:imf:imfwpa:11/151&r=ecm
  11. By: Axel Groß-Klußmann; Nikolaus Hautsch
    Abstract: We introduce a long memory autoregressive conditional Poisson (LMACP) model for highly persistent time series of counts. The model is applied to forecast quoted bid-ask spreads, a key parameter in stock trading operations. It is shown that the LMACP nicely captures salient features of bid-ask spreads like the strong autocorrelation and discreteness of observations. We discuss theoretical properties of LMACP models and evaluate rolling window forecasts of quoted bid-ask spreads for stocks traded at NYSE and NASDAQ. We show that Poisson time series models significantly outperform forecasts from ARMA, ARFIMA, ACD and FIACD models. The economic significance of our results is supported by the evaluation of a trade schedule. Scheduling trades according to spread forecasts, we realize cost savings of up to 13% of spread transaction costs.
    Keywords: Bid-ask spreads, forecasting, high-frequency data, stock market liquidity, count data time series, long memory Poisson autoregression
    JEL: G14 C32
    Date: 2011–07
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2011-044&r=ecm
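The conditional Poisson mechanism underlying the model can be illustrated by simulating a plain short-memory ACP(1,1); the paper's LMACP replaces the geometric-decay intensity recursion below with a long-memory (fractionally integrated) one. Parameter values are arbitrary and the function name is ours:

```python
import numpy as np

def simulate_acp(n, omega=0.2, alpha=0.3, beta=0.6, seed=0):
    """Simulate an ACP(1,1) count process:
    x_t ~ Poisson(lam_t),  lam_t = omega + alpha*x_{t-1} + beta*lam_{t-1}.
    Requires alpha + beta < 1 for stationarity; the unconditional mean
    is omega / (1 - alpha - beta)."""
    rng = np.random.default_rng(seed)
    lam = omega / (1 - alpha - beta)   # start at the unconditional mean
    x = np.zeros(n, dtype=int)
    lams = np.zeros(n)
    for t in range(n):
        lams[t] = lam
        x[t] = rng.poisson(lam)
        lam = omega + alpha * x[t] + beta * lam
    return x, lams
```

The high alpha + beta gives the strong, slowly decaying autocorrelation that motivates the long-memory extension for spread data.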
  12. By: Luc Bauwens (Université catholique de Louvain, CORE); Gary Koop (University of Strathclyde); Dimitris Korobilis (Université catholique de Louvain, CORE); Jeroen V.K. Rombouts (Institute of Applied Economics at HEC Montréal, CIRANO, CIRPEE; Université catholique de Louvain, CORE)
    Abstract: This paper compares the forecasting performance of different models which have been proposed for forecasting in the presence of structural breaks. These models differ in their treatment of the break process, the model which applies in each regime and the out-of-sample probability of a break occurring. In an extensive empirical evaluation involving many important macroeconomic time series, we demonstrate the presence of structural breaks and their importance for forecasting in the vast majority of cases. We find no single forecasting model consistently works best in the presence of structural breaks. In many cases, the formal modeling of the break process is important in achieving good forecast performance. However, there are also many cases where simple, rolling window based forecasts perform well.
    Keywords: Forecasting, change-points, Markov switching, Bayesian inference
    JEL: C11 C22 C53
    Date: 2011–07
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:38_11&r=ecm
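The trade-off the paper highlights — formally modeling the break process versus simply discarding old data — can be seen in a small simulation: after a level shift, an AR(1) estimated on a rolling window adapts, while an expanding window keeps averaging over both regimes. A hedged sketch with made-up parameter values, not the paper's experimental design:

```python
import numpy as np

def ar1_forecast(y):
    """One-step forecast from an OLS-fitted AR(1): y_t = a + b*y_{t-1} + e_t."""
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    a, b = np.linalg.lstsq(X, y[1:], rcond=None)[0]
    return a + b * y[-1]

def forecast_mse(y, start, window=None):
    """Mean squared error of recursive one-step forecasts from `start` on;
    window=None uses an expanding sample, otherwise a rolling window."""
    errs = []
    for t in range(start, len(y)):
        hist = y[:t] if window is None else y[max(0, t - window):t]
        errs.append(y[t] - ar1_forecast(hist))
    return float(np.mean(np.square(errs)))
```

When the evaluation period lies well after the break, the rolling estimator uses only post-break data and so avoids the contaminated intercept of the expanding fit.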
  13. By: Martin Browning; Jesús M. Carro
    Abstract: Let S be the number of components in a finite discrete mixing distribution. We prove that the number of waves of the panel being greater than or equal to 2S is a sufficient condition for global identification of a dynamic binary choice model in which all the parameters are heterogeneous. This model results in a mixture of S binary first-order Markov chains.
    Keywords: Discrete choice, Markov processes, Global identification
    JEL: C23 C24 J64
    Date: 2011–01
    URL: http://d.repec.org/n?u=RePEc:cte:werepe:we1117&r=ecm
  14. By: Guido Consonni (Department of Economics and Quantitative Methods, University of Pavia); Elias Moreno (University of Granada); Sergio Venturini (Bocconi University of Milan)
    Abstract: We analyze the general (multiallelic) Hardy-Weinberg equilibrium problem from an objective Bayesian testing standpoint. We argue that for small or moderate sample sizes the answer is rather sensitive to the prior chosen, and this suggests carrying out a sensitivity analysis with respect to the prior. This goal is achieved through the identification of a class of priors specifically designed for this testing problem. In this paper we consider the class of intrinsic priors under the full model, indexed by a tuning quantity, the training sample size. These priors are objective, satisfy Savage’s continuity condition and have proved to behave extremely well for many statistical testing problems. We compute the posterior probability of the Hardy-Weinberg equilibrium model for the class of intrinsic priors, assess robustness over the range of plausible answers, as well as stability of the decision in favor of either hypothesis.
    Keywords: Bayes factor; Hardy-Weinberg equilibrium; Intrinsic prior; Model posterior probability; Robustness.
    Date: 2010–08
    URL: http://d.repec.org/n?u=RePEc:pav:wpaper:244&r=ecm
  15. By: Jesús M. Carro; Alejandra Traferri
    Abstract: This paper considers the estimation of a dynamic ordered probit of self-assessed health status with two fixed effects: one in the linear index equation and one in the cut points. The two fixed effects allow us to robustly control for heterogeneity in unobserved health status and in reporting behaviour, even though we cannot separate the two sources of heterogeneity. The contributions of this paper are twofold. First, it contributes to the literature that studies the determinants and dynamics of self-assessed health measures. Second, it contributes to the recent literature on bias correction in nonlinear panel data models with fixed effects by applying two of the existing proposals to our model and studying their finite sample properties. The most direct and easily applicable correction for our model turns out not to be the best one, and has important biases at our sample sizes.
    Keywords: Dynamic ordered probit, Fixed effects, Self-assessed health, Reporting bias, Panel data, Unobserved heterogeneity, Incidental parameters, Bias correction
    JEL: C23 C25 I19
    Date: 2011–05
    URL: http://d.repec.org/n?u=RePEc:cte:werepe:we1118&r=ecm
  16. By: Florian Ade; Ronny Freier
    Abstract: A recent literature has used variation just around deterministic legislative population thresholds to identify the causal effects of institutional changes. This paper reviews the use of regression discontinuity designs using such population thresholds. Our concern involves three arguments: (1) simultaneous exogenous (co-)treatment, (2) simultaneous endogenous choices and (3) manipulation and precise control over population measures. Revisiting the study by Egger and Koethenbuerger (2010), who analyse the relationship between council size and government spending, we present new evidence that these three concerns do matter for causal analysis. Our results suggest that empirical designs using population thresholds are only to be used with utmost care and confidence in the precise institutional setting.
    Keywords: Regression discontinuity design, population thresholds, local elections, government spending
    JEL: C2 D7 H7
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:diw:diwwpp:dp1136&r=ecm

This nep-ecm issue is ©2011 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.