nep-ecm New Economics Papers
on Econometrics
Issue of 2006‒02‒05
thirteen papers chosen by
Sune Karlsson
Örebro University

  1. Bayesian Inference in a Cointegrating Panel Data Model By Gary Koop; Roberto Leon-Gonzalez; Rodney Strachan
  2. UNEMPLOYMENT AND HYSTERESIS: A NONLINEAR UNOBSERVED COMPONENTS APPROACH By Alicia Pérez Alonso; Silvestro Di Sanzo
  3. THE TWO-SAMPLE PROBLEM WITH REGRESSION ERRORS: AN EMPIRICAL PROCESS APPROACH By Juan Mora
  4. TWO-STAGE HUBER ESTIMATION By Christophe Muller; Tae-Hwan Kim
  5. Structure and asymptotic theory for STAR(1)-GARCH(1,1) models By Marcelo Cunha Medeiros; Felix Chan; Michael McAleer
  6. Testing Unit Root in Threshold Cointegration By Krishnakumar, Jaya; Neto, David
  7. THE PROCESS FOLLOWED BY PPP DATA. ON THE PROPERTIES OF LINEARITY TESTS By Ivan Paya; David A. Peel
  8. Models for Dynamic Panels in Space and Time - an Application to Regional Unemployment in the EU By J. Paul Elhorst
  9. Estimation of the Stylized Facts of a Stochastic Cascade Model. By Céline Azizieh; Wolfgang Breymann
  10. Unit root and cointegration tests for cross-sectionally correlated panels - Estimating regional production functions By Roberto Basile; Sergio Destefanis; Mauro Costantini
  11. SPECIFICATION OF A MODEL TO MEASURE THE VALUE OF TRAVEL TIME SAVINGS FROM BINOMIAL DATA By Mogens Fosgerau
  12. A closer look at the Spatial Durbin Model By Jesus Mur; Ana Angulo
  13. ECOLOGICAL INFERENCE AND SPATIAL HETEROGENEITY - A NEW APPROACH BASED ON ENTROPY ECONOMETRICS By Ludo Peeters; Coro Chasco-Yrigoyen

  1. By: Gary Koop; Roberto Leon-Gonzalez; Rodney Strachan
    Abstract: This paper develops methods of Bayesian inference in a cointegrating panel data model. This model involves each cross-sectional unit having a vector error correction representation. It is flexible in the sense that different cross-sectional units can have different cointegration ranks and cointegration spaces. Furthermore, the parameters which characterize short-run dynamics and deterministic components are allowed to vary over cross-sectional units. In addition to a noninformative prior, we introduce an informative prior which allows for information about the likely location of the cointegration space and about the degree of similarity in coefficients in different cross-sectional units. A collapsed Gibbs sampling algorithm is developed which allows for efficient posterior inference. Our methods are illustrated using real and artificial data.
    Keywords: Bayesian; panel data cointegration; error correction model; reduced rank regression; Markov Chain Monte Carlo
    JEL: C11 C32 C33
    Date: 2006–01
    URL: http://d.repec.org/n?u=RePEc:lec:leecon:06/2&r=ecm
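    For reference only, and not the paper's exact specification or priors, the per-unit vector error correction form described above can be written in standard notation as
        \Delta y_{i,t} = \alpha_i \beta_i' y_{i,t-1} + \sum_{l=1}^{p_i-1} \Gamma_{i,l} \Delta y_{i,t-l} + \Phi_i d_t + \varepsilon_{i,t},
    where i indexes cross-sectional units, rank(\alpha_i \beta_i') = r_i is the unit-specific cointegration rank, the space spanned by \beta_i is the unit-specific cointegration space, and \Gamma_{i,l} and \Phi_i capture the short-run dynamics and deterministic terms that are allowed to vary across units.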
  2. By: Alicia Pérez Alonso (Universidad de Alicante); Silvestro Di Sanzo (Universidad de Alicante)
    Abstract: The aim of this paper is to test for a possible hysteresis effect in the unemployment rate series of Italy, France and the United States. We propose a definition of hysteresis taken from physics that allows for nonlinearities. To test for the presence of hysteresis we use a nonlinear unobserved components model for the unemployment series. The estimation methodology can be cast as a threshold autoregressive representation within a Kalman filter framework. To obtain an appropriate p-value for the hysteresis test we propose two alternative bootstrap procedures: the first is valid under homoskedastic errors, while the second allows for general heteroskedasticity. We investigate the performance of both bootstrap procedures using Monte Carlo simulation.
    Keywords: Hysteresis; Unobserved Components Model; Threshold Autoregressive Models; Nuisance parameters; Bootstrap
    JEL: C12 C13 C15 C32 E24
    Date: 2005–12
    URL: http://d.repec.org/n?u=RePEc:ivi:wpasad:2005-34&r=ecm
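    To fix ideas, a minimal Python sketch of a generic bootstrap p-value under a null model is given below; it is not the paper's procedure, and stat_fn and fit_null_fn are hypothetical placeholders for the hysteresis test statistic and the null-model fit. The i.i.d. residual resample corresponds to the homoskedastic scheme, the Rademacher wild resample to the heteroskedasticity-robust one.

      import numpy as np

      def bootstrap_pvalue(y, stat_fn, fit_null_fn, n_boot=999, wild=False, seed=0):
          # stat_fn(y) -> scalar test statistic; fit_null_fn(y) -> (fitted, residuals) under H0.
          rng = np.random.default_rng(seed)
          stat_obs = stat_fn(y)
          fitted, resid = fit_null_fn(y)
          resid = resid - resid.mean()          # recentre the residuals
          exceed = 0
          for _ in range(n_boot):
              if wild:                          # Rademacher wild bootstrap (heteroskedastic errors)
                  e = resid * rng.choice([-1.0, 1.0], size=resid.shape)
              else:                             # i.i.d. residual resampling (homoskedastic errors)
                  e = rng.choice(resid, size=resid.size, replace=True)
              y_star = fitted + e               # regenerate the series under the null model
              if stat_fn(y_star) >= stat_obs:
                  exceed += 1
          return (1 + exceed) / (1 + n_boot)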
  3. By: Juan Mora (Universidad de Alicante)
    Abstract: We describe how to test the null hypothesis that the errors from two parametrically specified regression models have the same distribution against a general alternative. First we derive the asymptotic properties of test statistics based on the difference between the two residual-based empirical distribution functions. Under the null hypothesis these statistics are not asymptotically distribution-free, so a consistent bootstrap procedure is proposed to compute critical values. As an alternative, we describe how to perform the test with statistics based on martingale-transformed empirical processes, which are asymptotically distribution-free. Some Monte Carlo experiments are performed to compare the behaviour of all the statistics in moderate sample sizes.
    Keywords: Two-Sample Problem; Residual-Based Empirical Process; Smooth Bootstrap; Martingale Transform
    Date: 2005–05
    URL: http://d.repec.org/n?u=RePEc:ivi:wpasad:2005-18&r=ecm
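    As an illustration of the kind of statistic involved, the Python sketch below computes a Kolmogorov-Smirnov-type sup-distance between the empirical distribution functions of two residual samples (assumed to come from the two fitted parametric regressions). Because the residuals depend on estimated parameters, standard critical values do not apply, which is why the paper relies on a smooth bootstrap or a martingale transform.

      import numpy as np

      def ks_residual_statistic(res1, res2):
          # Scaled sup-distance between the two residual-based empirical distribution functions.
          pooled = np.sort(np.concatenate([res1, res2]))
          F1 = np.searchsorted(np.sort(res1), pooled, side="right") / res1.size
          F2 = np.searchsorted(np.sort(res2), pooled, side="right") / res2.size
          n1, n2 = res1.size, res2.size
          return np.sqrt(n1 * n2 / (n1 + n2)) * np.max(np.abs(F1 - F2))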
  4. By: Christophe Muller (Universidad de Alicante); Tae-Hwan Kim (Yonsei University)
    Abstract: In this paper we study how the Huber estimator can be adapted to the presence of endogeneity in a two-stage setting similar to that of 2SLS. We propose an estimation procedure that is at the same time relatively (i) simple, (ii) robust and (iii) efficient. Moreover, we deal with the case of random regressors and asymmetric errors, two extensions rarely present in this literature. The preliminary scale correction is implemented with the median absolute deviation estimator, which is consistent with the above criteria and is a very robust estimator of scale. The resulting estimator is termed the Two-Stage Huber (2SH) estimator. We explicitly establish the conditions for consistency and asymptotic normality of the 2SH estimator and derive the formula of its asymptotic covariance matrix. We conduct Monte Carlo simulations whose results indicate that the 2SH estimator has smaller standard errors than the Two-Stage Least Squares (2SLS) and Two-Stage Least Absolute Deviations (2SLAD) estimators in many situations. On the whole, the 2SH estimator appears to be a simple and useful alternative to 2SLS and 2SLAD for two-stage estimation under endogeneity when both robustness and efficiency are of concern.
    Keywords: Two-stage estimation, Huber estimation, robustness, endogeneity
    Date: 2005–04
    URL: http://d.repec.org/n?u=RePEc:ivi:wpasad:2005-17&r=ecm
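    A schematic two-stage procedure in the spirit of the abstract, but not the paper's exact 2SH estimator (in particular, naive second-stage standard errors ignore the generated regressor and the covariance correction derived in the paper), could be sketched in Python as follows; statsmodels' RLM with the Huber norm uses a MAD scale estimate by default.

      import numpy as np
      import statsmodels.api as sm

      def two_stage_huber(y, X_exog, x_endog, Z):
          # Stage 1: project the endogenous regressor on exogenous regressors and instruments by OLS.
          Z1 = sm.add_constant(np.column_stack([X_exog, Z]))
          x_hat = sm.OLS(x_endog, Z1).fit().fittedvalues
          # Stage 2: Huber M-estimation, with MAD preliminary scale, on exogenous regressors plus fitted values.
          X2 = sm.add_constant(np.column_stack([X_exog, x_hat]))
          return sm.RLM(y, X2, M=sm.robust.norms.HuberT()).fit()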
  5. By: Marcelo Cunha Medeiros (Department of Economics, PUC-Rio); Felix Chan (University of Western Australia); Michael McAleer (University of Western Australia)
    Abstract: Nonlinear time series models, especially those with regime-switching and GARCH errors, have become increasingly popular in the economics and finance literature. However, much of the research has concentrated on empirical applications of various models, with little theoretical or statistical analysis of the structure of the processes or the associated asymptotic theory. In this paper we derive necessary and sufficient conditions for strict stationarity and ergodicity of three different specifications of the first-order STAR-GARCH model, and sufficient conditions for the existence of moments. This is important, among other things, for establishing the conditions under which the traditional LM linearity tests based on Taylor expansions are valid. Finally, we provide sufficient conditions for consistency and asymptotic normality of the Quasi-Maximum Likelihood Estimator.
    Keywords: Nonlinear time series, regime-switching, STAR, GARCH, log-moment, moment conditions, asymptotic theory.
    Date: 2005–11
    URL: http://d.repec.org/n?u=RePEc:rio:texdis:506&r=ecm
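    One common parameterization of a logistic STAR(1)-GARCH(1,1) process, given here for reference only (the paper analyses three related specifications), is
        y_t = \phi_{1,0} + \phi_{1,1} y_{t-1} + (\phi_{2,0} + \phi_{2,1} y_{t-1}) G(y_{t-1}; \gamma, c) + \varepsilon_t,
        \varepsilon_t = \eta_t \sqrt{h_t}, \quad \eta_t \sim iid(0,1), \quad h_t = \omega + \alpha \varepsilon_{t-1}^2 + \beta h_{t-1},
    with transition function G(s; \gamma, c) = [1 + \exp\{-\gamma (s - c)\}]^{-1}; the stationarity, ergodicity and moment conditions derived in the paper restrict the \phi's and (\omega, \alpha, \beta).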
  6. By: Krishnakumar, Jaya; Neto, David
    Abstract: This paper proposes a simple procedure to test the hypothesis of no cointegration against both threshold cointegration and an intermediate possibility that we call partial cointegration. Asymptotic theory is developed, the power of the proposed test is analysed through simulations and a successful empirical example is provided.
    Date: 2005–11
    URL: http://d.repec.org/n?u=RePEc:gen:geneem:2005.04&r=ecm
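    For orientation only (the paper's precise definitions, in particular of partial cointegration, should be taken from the text), threshold cointegration is typically formulated through regime-dependent adjustment of the equilibrium error u_t = y_t - \beta' x_t:
        \Delta u_t = \rho_1 u_{t-1} 1\{u_{t-1} \le \tau\} + \rho_2 u_{t-1} 1\{u_{t-1} > \tau\} + e_t,
    where the null of no cointegration corresponds to \rho_1 = \rho_2 = 0, threshold cointegration to mean-reverting adjustment in both regimes, and an intermediate case, plausibly the partial cointegration of the abstract, to adjustment in only one regime.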
  7. By: Ivan Paya (Universidad de Alicante); David A. Peel (Lancaster University Management School)
    Abstract: Recent research has reported a lack of correct size in stationarity tests for PPP deviations within a linear framework. However, theoretically well-motivated nonlinear models, such as the ESTAR, appear to fit the PPP data parsimoniously and provide an explanation for the PPP 'puzzle'. Employing Monte Carlo experiments, we analyze the size and power of nonlinear tests against a variety of nonstationary hypotheses. We also fit the ESTAR model to data from high-inflation economies. Our results provide further support for the ESTAR specification.
    Keywords: ESTAR, Real Exchange Rate, Size, Linearity Test.
    JEL: C15 C22 F31
    Date: 2005–06
    URL: http://d.repec.org/n?u=RePEc:ivi:wpasad:2005-23&r=ecm
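    A widely used ESTAR specification for deviations y_t of the real exchange rate from its equilibrium level, stated here as a reference form rather than the exact model fitted in the paper, is
        y_t = \pi y_{t-1} + \pi^* y_{t-1} [1 - \exp\{-\gamma (y_{t-d} - c)^2\}] + \varepsilon_t,
    where the inner regime may contain a unit root (\pi = 1) while |\pi + \pi^*| < 1 delivers mean reversion in the outer regime; linearity tests in this setting typically test \gamma = 0 against \gamma > 0 using a Taylor approximation of the transition function.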
  8. By: J. Paul Elhorst
    Abstract: One of the central questions in modelling space-time data is the choice of the right econometric model. At least three problems must be tackled: (i) the observations on each spatial unit might be correlated over time, (ii) the observations at each point in time might be correlated over space, and (iii) the omission of time-invariant and/or space-invariant background variables could bias the regression coefficients in a typical cross-section or time-series model. As there is no a priori reason to believe that one problem is more important than another, this paper presents a general model that encompasses a wide range of simpler models frequently used in the time-series, spatial and panel data econometrics literature. A framework is developed to determine which model is the most likely candidate for studying space-time data.
    Date: 2005–08
    URL: http://d.repec.org/n?u=RePEc:wiw:wiwrsa:ersa05p81&r=ecm
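    One encompassing specification of the kind described above (the paper's general model may differ in detail) is the dynamic spatial panel
        y_t = \tau y_{t-1} + \delta W y_t + \eta W y_{t-1} + X_t \beta + W X_t \theta + \mu + \alpha_t \iota_N + \varepsilon_t,
    where W is a spatial weights matrix, \mu collects spatial fixed effects and \alpha_t time effects; setting subsets of (\tau, \delta, \eta, \theta, \mu, \alpha_t) to zero recovers the familiar time-series, spatial and panel data models nested within it.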
  9. By: Céline Azizieh (Centre Emile Bernheim, Solvay Business School, Université Libre de Bruxelles, Brussels); Wolfgang Breymann (Department of Mathematics, ETH Zürich, Switzerland)
    Abstract: We present a time series model that combines properties of Lévy-type and multifractal models. Formally, it is a stochastic volatility model with discrete time steps, t-distributed return innovations and a stochastic cascade for the volatility process. The model reproduces very well a set of stylized facts that cannot be reproduced jointly by other classes of models. We also present an estimation procedure based on the reproduction of stylized facts. The procedure is general and can easily be adapted and/or extended to other models; it may be considered an extension of the generalized method of moments.
    Keywords: stochastic cascade, multifractal models, stochastic volatility
    JEL: G13 G19 C13
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:sol:wpaper:05-009&r=ecm
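    The estimation idea, choosing parameters so that the model reproduces the stylized facts of the data, can be sketched as a simulated-moment-type criterion. The Python sketch below is generic rather than the authors' procedure; simulate and facts are hypothetical placeholders for the cascade-model simulator and the chosen vector of stylized facts.

      import numpy as np
      from scipy.optimize import minimize

      def fit_by_stylized_facts(returns, simulate, facts, theta0, n_sim=50, seed=0):
          # Minimize the distance between the data's stylized facts and the
          # average stylized facts of model simulations at parameter theta.
          rng = np.random.default_rng(seed)
          target = facts(returns)

          def loss(theta):
              sims = np.array([facts(simulate(theta, returns.size, rng))
                               for _ in range(n_sim)])
              return float(np.sum((sims.mean(axis=0) - target) ** 2))

          return minimize(loss, theta0, method="Nelder-Mead")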
  10. By: Roberto Basile; Sergio Destefanis; Mauro Costantini
    Abstract: There is a plethora of studies of regional production functions using stationary panel data, and only some recent works consider non-stationary panel data. All of them assume cross-section independence. Here we argue that the independence assumption is too strong when regional data are used. In this paper the cross-section independence assumption is relaxed and cross-sectional dependence is allowed for. First, the unit root and cointegration properties of the panel dataset are investigated using newly developed tests for cross-sectionally dependent panels. Second, dynamic OLS (DOLS) and recent regression models for cross-sectionally correlated panels are used to estimate the cointegrating relationship between value added and physical and human capital for the Italian regions over the period 1970-1998.
    Date: 2005–08
    URL: http://d.repec.org/n?u=RePEc:wiw:wiwrsa:ersa05p171&r=ecm
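    For reference, the panel DOLS regression mentioned above augments the cointegrating regression with leads and lags of the differenced regressors (the notation is generic; the paper combines this with estimators robust to cross-sectional dependence):
        y_{i,t} = \alpha_i + \beta' x_{i,t} + \sum_{j=-p}^{p} \gamma_{i,j}' \Delta x_{i,t+j} + u_{i,t},
    where the leads and lags of \Delta x_{i,t} absorb the feedback between the regressors and the error term.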
  11. By: Mogens Fosgerau
    Abstract: This paper develops a semiparametric methodology for evaluating the distribution of the value of travel time savings (VTTS) from binary choice data. Fosgerau (2004) deals with the case of just one time component; this paper extends the approach to the case of several time components. The methodology is applied to a recent large dataset of about 2,200 car drivers who undertook a series of stated choice experiments. The VTTS is a fundamental concept in transport economics, being the main yardstick against which transport investments are measured; the methodology presented is, however, generally applicable to the evaluation of willingness to pay from binary choice data. Current standard-practice methodology applies a mixing distribution to a binary choice model in order to account for individual heterogeneity. While this is definitely progress, there remains the problem of deciding which mixing distribution to apply. That problem is avoided here by using a nonparametric distribution. For the prediction of choices the choice of mixing distribution may matter less, but it is crucial for evaluating willingness to pay; even so, it is rare to see a justification for the choice of mixing distribution. The paper tests a range of parametric distributions against the semiparametric alternative.
    Date: 2005–08
    URL: http://d.repec.org/n?u=RePEc:wiw:wiwrsa:ersa05p77&r=ecm
  12. By: Jesus Mur; Ana Angulo
    Abstract: The spatial Durbin model occupies an interesting position in spatial econometrics. It is the reduced form of a model with cross-sectional dependence in the errors, and it may also be used as the nesting model in a more general model selection approach. In the first case, it is the equation on which the Likelihood Ratio test of Common Factors is based, the objective being to discriminate between substantive and residual dependence in a misspecified equation. When discussing the specification of the model, it is also of great value as a gateway to a static model, a dynamic model or a model with residual dependence. Our paper goes further into the interpretation of this intermediate equation in both respects. We include a small Monte Carlo study related to the LR tests and present some new results that facilitate the use, and the interpretation, of the Durbin equation in the more general process of econometric model selection in a spatial context.
    Date: 2005–08
    URL: http://d.repec.org/n?u=RePEc:wiw:wiwrsa:ersa05p392&r=ecm
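    For reference, the spatial Durbin model and the common factor restriction discussed above can be written as
        y = \rho W y + X \beta + W X \theta + \varepsilon,
    where imposing \theta = -\rho \beta reduces the model to the spatial error specification y = X \beta + u with u = \rho W u + \varepsilon; the Likelihood Ratio test of Common Factors tests exactly this restriction, which is how the equation discriminates between substantive and residual spatial dependence.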
  13. By: Ludo Peeters; Coro Chasco-Yrigoyen
    Abstract: In this paper we compare the results obtained from three alternative methods of ecological inference. The data are per capita household disposable income for the 50 Spanish provinces and the 78 municipalities of Asturias, Spain. The first method is based on an Ordinary Least Squares regression model, which assumes constancy or homogeneity of the coefficients. The second is based on a spatial autocorrelation model, which allows heterogeneity across two spatial regimes. The third is based on a varying-coefficients model, which assumes full heterogeneity. The second model is estimated by Maximum Likelihood, whereas the third is estimated using Generalized Maximum Entropy or Cross Entropy.
    Date: 2005–08
    URL: http://d.repec.org/n?u=RePEc:wiw:wiwrsa:ersa05p705&r=ecm

This nep-ecm issue is ©2006 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.