nep-ecm New Economics Papers
on Econometrics
Issue of 2013‒06‒09
fifteen papers chosen by
Sune Karlsson
Orebro University

  1. Testing for Time-Invariant Unobserved Heterogeneity in Generalized Linear Models for Panel Data By Francesco Bartolucci; Federico Belotti; Franco Peracchi
  2. Regressions with Berkson errors in covariates: a nonparametric approach By Susanne Schennach
  3. Bayesian generalized additive models for location, scale and shape for zero-inflated and overdispersed count data By Nadja Klein; Thomas Kneib; Stefan Lang
  4. Analysis of Deviance in Generalized Partial Linear Models By Wolfgang Karl Härdle; Li-Shan Huang
  5. Nonparametric tests for event studies under cross-sectional dependence By Matteo Pelagatti
  6. Rejection Probabilities for a Battery of Unit-Root Tests By Maican, Florin G.; Sweeney, Richard J.
  7. The relationship between DSGE and VAR models By Raffaella Giacomini
  8. Inference and forecasting in the age-period-cohort model with unknown exposure with an application to mesothelioma mortality By Bent Nielsen; Maria Dolores Martinez Miranda; Jens Perch Nielsen
  9. The Taylor Decomposition: A Unified Generalization of the Oaxaca Method to Nonlinear Models By Stephen Bazen; Xavier Joutard
  10. B-spline techniques for volatility modeling By Sylvain Corlay
  11. Identifying Age-Cohort-Time Effects, Their Curvature and Interactions from Polynomials: Examples Related to Sickness Absence By Biørn, Erik
  12. Measuring Economic Growth from Outer Space: A Comment By Berliant, Marcus; Weiss, Adam
  13. Are your data really Pareto distributed? By Pasquale Cirillo
  14. Estimating and Identifying Empirical BVAR-DSGE Models for Small Open Economies By Tim Robinson
  15. Trend-cycle decomposition: implications from an exact structural identification By Mardi Dungey; Jan P.A.M. Jacobs; Jing Tian; Simon van Norden

  1. By: Francesco Bartolucci (University of Perugia); Federico Belotti (University of Rome "Tor Vergata"); Franco Peracchi (University of Rome "Tor Vergata" and EIEF)
    Abstract: Recent literature on panel data has emphasized the importance of accounting for time-varying unobserved heterogeneity, which may stem either from time-varying omitted variables or from macro-level shocks that affect each individual unit differently. In this paper, we propose a computationally convenient test for the null hypothesis of time-invariant individual effects. The proposed test is an application of Hausman's (1978) specification test procedure and can be applied to generalized linear models for panel data, a wide class of models that includes the Gaussian linear model and a variety of nonlinear models typically employed for discrete or categorical outcomes. The basic idea is to compare fixed effects estimators defined as the maximizers of full and pairwise conditional likelihood functions. Thus, the proposed approach requires no assumptions on the distribution of the individual effects and, most importantly, it does not require them to be independent of the covariates in the model. We investigate the finite sample properties of the test through a set of Monte Carlo experiments. Our results show that the test performs quite well, with small size distortions and good power properties. A health economics example based on data from the Health and Retirement Study is used to illustrate the proposed test.
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:eie:wpaper:1312&r=ecm
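    The generic recipe behind such a comparison is the Hausman (1978) statistic. The sketch below is a hypothetical illustration of that generic form only; the paper's specific construction of the full and pairwise conditional-likelihood estimators is model-specific and not shown, and the function name and inputs are assumptions.

```python
import numpy as np
from scipy.stats import chi2

def hausman(b_eff, b_rob, V_eff, V_rob):
    """Generic Hausman (1978) statistic.

    b_eff/V_eff: estimate and covariance of the estimator efficient
    under H0 (here: the full conditional-likelihood one);
    b_rob/V_rob: the estimator consistent under both hypotheses
    (the pairwise one). Under H0 the statistic is asymptotically
    chi-square with len(b_eff) degrees of freedom.
    """
    d = b_rob - b_eff
    V_d = V_rob - V_eff                      # covariance of the contrast
    stat = float(d @ np.linalg.solve(V_d, d))
    df = len(d)
    return stat, df, float(chi2.sf(stat, df))

# Illustrative (made-up) estimates for a two-parameter model
stat, df, p = hausman(
    b_eff=np.array([1.0, 2.1]), b_rob=np.array([1.1, 2.0]),
    V_eff=0.01 * np.eye(2), V_rob=0.02 * np.eye(2),
)
```

    A large statistic (small p) rejects time-invariant individual effects.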
  2. By: Susanne Schennach (Institute for Fiscal Studies and Brown University)
    Abstract: This paper establishes that so-called instrumental variables enable the identification and the estimation of a fully nonparametric regression model with Berkson-type measurement error in the regressors. An estimator is proposed and proven to be consistent. Its practical performance and feasibility are investigated via Monte Carlo simulations as well as through an epidemiological application investigating the effect of particulate air pollution on respiratory health. These examples illustrate that Berkson errors clearly cannot be neglected in nonlinear regression models and that the proposed method represents an effective remedy.
    Date: 2013–05
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:22/13&r=ecm
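    The reason Berkson errors cannot be neglected in nonlinear models can be seen in a small simulation. The setup below (variable names, parameter values) is an illustrative assumption, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

# Berkson error: the true covariate x scatters around the observed w,
# x = w + u with u independent of w (think: assigned dose w, actual
# exposure x). In a linear model regressing y on w is still unbiased,
# but in a nonlinear one it is not: for y = x**2 + e,
# E[y | w] = w**2 + var(u), not w**2.
w = rng.normal(0.0, 1.0, n)
u = rng.normal(0.0, 0.5, n)            # Berkson error, var(u) = 0.25
y = (w + u) ** 2 + rng.normal(0.0, 0.1, n)

# Naive nonparametric estimate of the regression at w = 0: it converges
# to var(u) = 0.25, whereas the true curve satisfies g(0) = 0.
naive_at_zero = y[np.abs(w) < 0.05].mean()
```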
  3. By: Nadja Klein; Thomas Kneib; Stefan Lang
    Abstract: Frequent problems in applied research that prevent the application of the classical Poisson log-linear model for analyzing count data include overdispersion, an excess of zeros compared to the Poisson distribution, correlated responses, as well as complex predictor structures comprising nonlinear effects of continuous covariates, interactions or spatial effects. We propose a general class of Bayesian generalized additive models for zero-inflated and overdispersed count data within the framework of generalized additive models for location, scale and shape where semiparametric predictors can be specified for several parameters of a count data distribution. As special instances, we consider the zero-inflated Poisson, the negative binomial and the zero-inflated negative binomial distribution as standard options for applied work. The additive predictor specifications rely on basis function approximations for the different types of effects in combination with Gaussian smoothness priors. We develop Bayesian inference based on Markov chain Monte Carlo simulation techniques where suitable proposal densities are constructed based on iteratively weighted least squares approximations to the full conditionals. To ensure practicability of the inference we consider theoretical properties like the involved question of whether the joint posterior is proper. The proposed approach is evaluated in simulation studies and applied to count data arising from patent citations and claim frequencies in car insurances. For the comparison of models with respect to the distribution, we consider quantile residuals as an effective graphical device and scoring rules that allow us to quantify the predictive ability of the models. The deviance information criterion is used for further model specification.
    Keywords: iteratively weighted least squares, Markov chain Monte Carlo, penalized splines, zero-inflated negative binomial, zero-inflated Poisson
    Date: 2013–06
    URL: http://d.repec.org/n?u=RePEc:inn:wpaper:2013-12&r=ecm
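    As a point of reference, the zero-inflated Poisson distribution used as a standard option above has a simple mixture representation. The minimal simulation below (with illustrative parameter values) shows only the distribution itself, not the paper's Bayesian additive-model machinery.

```python
import numpy as np

rng = np.random.default_rng(0)

def rzip(n, pi, lam, rng):
    """Zero-inflated Poisson draws: with probability pi the observation
    is a structural zero, otherwise it is Poisson(lam)."""
    y = rng.poisson(lam, n)
    y[rng.random(n) < pi] = 0
    return y

y = rzip(100_000, pi=0.3, lam=2.0, rng=rng)

# Implied moments: P(Y = 0) = pi + (1 - pi)*exp(-lam), E[Y] = (1 - pi)*lam.
p_zero = (y == 0).mean()    # excess zeros relative to a Poisson(2) fit
mean_y = y.mean()
```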
  4. By: Wolfgang Karl Härdle; Li-Shan Huang
    Abstract: We develop analysis of deviance tools for generalized partial linear models based on local polynomial fitting. Assuming a canonical link, we propose expressions for both local and global analysis of deviance, which admit an additivity property that reduces to ANOVA decompositions in the Gaussian case. Chi-square tests based on integrated likelihood functions are proposed to formally test whether the nonparametric term is significant. Simulation results are shown to illustrate the proposed chi-square tests. The methodology is applied to German Bundesbank Federal Reserve data.
    Keywords: ANOVA decomposition, Integrated likelihood, Link function, Local polynomial. AMS 2000 subject classifications: Primary 62G08; secondary 62J12
    JEL: C00 C14 C50 C58
    Date: 2013–05
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2013-028&r=ecm
  5. By: Matteo Pelagatti
    Abstract: We propose three nonparametric tests for the null of no event-induced shifts in the distribution of stock returns. One test is the natural extension of the popular Corrado rank test to the case of cross-sectionally dependent returns, while the other two are based on new ideas. Unfortunately, a solid theory for approximating the distribution of the statistic can be derived for only one of these tests, but simulation experiments confirm that normality is a good approximation for the other two as well. The new tests are compared to a widely used parametric test (Patell) through simulation experiments and are shown to compare favourably in terms of power. Simulation results are based on bootstrapping daily stock returns from the S&P100 and NASDAQ indexes.
    Keywords: Rank test, Event study, Abnormal returns, Cross-sectional dependence
    JEL: G14 C12 C14
    Date: 2013–05
    URL: http://d.repec.org/n?u=RePEc:mib:wpaper:244&r=ecm
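    The classical Corrado rank statistic that the first test extends can be sketched as follows. This is the textbook version that assumes cross-sectional independence (the assumption the paper relaxes); the function name and the simulated data are illustrative assumptions.

```python
import numpy as np
from scipy.stats import rankdata

def corrado_stat(ar, event_day):
    """Corrado (1989) rank statistic for a single event day.

    ar: (N firms x T days) abnormal returns; ranks are taken within
    each firm across all T days. Under H0 (no event effect) and
    cross-sectional independence, the statistic is approximately N(0,1).
    """
    N, T = ar.shape
    K = np.apply_along_axis(rankdata, 1, ar)   # within-firm ranks 1..T
    dev = K - (T + 1) / 2.0
    mean_dev = dev.mean(axis=0)                # per-day cross-sectional mean
    s_K = np.sqrt((mean_dev ** 2).mean())
    return mean_dev[event_day] / s_K

rng = np.random.default_rng(3)
ar = rng.normal(size=(200, 101))
z_null = corrado_stat(ar, event_day=50)        # no event effect
ar[:, 50] += 3.0                               # inject an event-day shift
z_event = corrado_stat(ar, event_day=50)       # strongly significant
```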
  6. By: Maican, Florin G. (Department of Economics, School of Business, Economics and Law, Göteborg University); Sweeney, Richard J. (Georgetown University, Washington, D.C.)
    Abstract: If the researcher tests each model in a battery at the α% significance level, the probability that at least one test rejects is generally larger than α%. For five unit-root models, this paper uses Monte Carlo simulation and the inclusion-exclusion principle to show that, with α% = 5% for each test, the probability that at least one test rejects is 16.2% rather than the upper bound of 25% from the Bonferroni inequality. It also gives estimated probabilities that any combination of two, three, four or five models all reject.
    Keywords: Real Exchange Rates; Unit root; Monte Carlo; Break models
    JEL: C15 C22 C32 C33 E31 F31
    Date: 2013–06–03
    URL: http://d.repec.org/n?u=RePEc:hhs:gunwpe:0568&r=ecm
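    For comparison with the paper's Monte Carlo figure, the two analytical benchmarks mentioned in the abstract are easy to compute; the snippet below illustrates those bounds only and does not reproduce the paper's simulations.

```python
def family_reject_prob(alpha, k):
    """P(at least one of k independent level-alpha tests rejects)."""
    return 1.0 - (1.0 - alpha) ** k

indep = family_reject_prob(0.05, 5)   # independence benchmark, ~0.226
bonferroni = 5 * 0.05                 # Bonferroni upper bound, 0.25
# The paper's Monte Carlo estimate of 16.2% lies below both, reflecting
# the strong positive dependence between unit-root tests applied to the
# same series.
```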
  7. By: Raffaella Giacomini (Institute for Fiscal Studies and UCL)
    Abstract: This chapter reviews the literature on the econometric relationship between DSGE and VAR models from the point of view of estimation and model validation. The mapping between DSGE and VAR models is broken down into three stages: 1) from DSGE to state-space model; 2) from state-space model to VAR(∞); 3) from VAR(∞) to finite-order VAR. The focus is on discussing what can go wrong at each step of this mapping and on critically highlighting the hidden assumptions. I also point out some open research questions and interesting new research directions in the literature on the econometrics of DSGE models. These include, in no particular order: understanding the effects of log-linearisation on estimation and identification; dealing with multiplicity of equilibria; estimating nonlinear DSGE models; incorporating into DSGE models information from atheoretical models and from survey data; adopting flexible modelling approaches that combine the theoretical rigor of DSGE models and the econometric model's ability to fit the data.
    Date: 2013–05
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:21/13&r=ecm
  8. By: Bent Nielsen; Maria Dolores Martinez Miranda; Jens Perch Nielsen
    Abstract: It is of considerable interest to forecast future mesothelioma mortality. No measures for exposure are available, so it is not straightforward to apply a dose-response model. It is proposed to model the counts of deaths directly using a Poisson regression with an age-period-cohort structure, but without offset. The age-period-cohort model is traditionally viewed as suffering from an identification problem. It is shown how to re-parameterize the model in terms of freely varying parameters, so as to avoid this problem. It is shown how to conduct inference and how to construct distribution forecasts.
    Date: 2013–03–26
    URL: http://d.repec.org/n?u=RePEc:oxf:wpaper:2013-w05&r=ecm
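    The identification problem referred to above stems from the exact linear dependence cohort = period − age. A short check on a hypothetical 5×5 Lexis grid makes the rank deficiency visible:

```python
import numpy as np

# On any age-period grid, cohort = period - age, so a design matrix with
# an intercept and linear terms in age, period and cohort has rank 3,
# not 4: the three linear slopes are not separately identified, which is
# what re-parameterizations like the paper's are designed to avoid.
age, period = np.meshgrid(np.arange(5), np.arange(5))
age, period = age.ravel(), period.ravel()
cohort = period - age
X = np.column_stack([np.ones_like(age), age, period, cohort])
rank = np.linalg.matrix_rank(X)    # 3
```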
  9. By: Stephen Bazen (AMSE - Aix-Marseille School of Economics - Aix-Marseille Univ. - Centre national de la recherche scientifique (CNRS) - École des Hautes Études en Sciences Sociales [EHESS] - Ecole Centrale Marseille (ECM)); Xavier Joutard (AMSE - Aix-Marseille School of Economics - Aix-Marseille Univ. - Centre national de la recherche scientifique (CNRS) - École des Hautes Études en Sciences Sociales [EHESS] - Ecole Centrale Marseille (ECM))
    Abstract: The widely used Oaxaca decomposition applies to linear models. Extending it to commonly used nonlinear models such as binary choice and duration models is not straightforward. This paper shows that the original decomposition using a linear model can be obtained as a first order Taylor expansion. This basis provides a means of obtaining a coherent and unified approach which applies to nonlinear models, which we refer to as a Taylor decomposition. Explicit formulae are provided for the Taylor decomposition for the main nonlinear models used in applied econometrics including the Probit binary choice and Weibull duration models. The detailed decomposition of the explained component is expressed in terms of what are usually referred to as marginal effects and a remainder. Given Jensen's inequality, the latter will always be present in nonlinear models unless an ad hoc or tautological basis for decomposition is used.
    Keywords: Oaxaca decomposition; nonlinear models
    Date: 2013–05
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:halshs-00828790&r=ecm
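    For the probit case, a first-order version of such a decomposition can be sketched as follows. The evaluation point, function name and numbers are illustrative assumptions; the paper's explicit formulae differ in detail.

```python
import numpy as np
from scipy.stats import norm

def taylor_probit_gap(xbar_A, xbar_B, beta_A, beta_B):
    """Decompose Phi(xbar_A'beta_A) - Phi(xbar_B'beta_B) into an
    'explained' part (covariate gap weighted by marginal effects), an
    'unexplained' part (coefficient gap), and a Taylor remainder, which
    is generally nonzero in nonlinear models (Jensen's inequality)."""
    gap = norm.cdf(xbar_A @ beta_A) - norm.cdf(xbar_B @ beta_B)
    w = norm.pdf(xbar_B @ beta_B)          # derivative at group B's index
    explained = w * ((xbar_A - xbar_B) @ beta_B)
    unexplained = w * (xbar_A @ (beta_A - beta_B))
    return gap, explained, unexplained, gap - explained - unexplained

gap, expl, unexpl, rem = taylor_probit_gap(
    np.array([0.5, 1.0]), np.array([0.3, 0.8]),
    np.array([0.6, -0.2]), np.array([0.5, -0.1]),
)
```

    By construction the four pieces sum back to the gap; with identical coefficients in both groups the unexplained part vanishes exactly.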
  10. By: Sylvain Corlay
    Abstract: This paper is devoted to the application of B-splines to volatility modeling, specifically the calibration of the leverage function in stochastic local volatility models and the parameterization of an arbitrage-free implied volatility surface calibrated to sparse option data. We use an extension of the classical B-splines obtained by including basis functions of infinite support. We first revisit the application of shape-constrained B-splines to the estimation of conditional expectations, not merely from a scatter plot but also given the marginal distributions. An application is the Monte Carlo calibration of stochastic local volatility models by Markov projection. Then we present a new technique for the calibration of an implied volatility surface to sparse option data. We use a B-spline parameterization of the Radon-Nikodym derivative of the underlying's risk-neutral probability density with respect to a roughly calibrated base model. We show that the method provides smooth arbitrage-free implied volatility surfaces. Finally, we propose a Galerkin method with B-spline finite elements for the solution of the PDE satisfied by the Radon-Nikodym derivative.
    Date: 2013–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1306.0995&r=ecm
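    As background, the classical finite-support B-spline basis that the paper extends can be constructed with scipy; the knot layout below is an arbitrary illustration, chosen only to exhibit the basis properties.

```python
import numpy as np
from scipy.interpolate import BSpline

deg = 3                                        # cubic B-splines
knots = np.concatenate([np.zeros(deg + 1),     # clamped left end
                        [0.25, 0.5, 0.75],     # interior knots
                        np.ones(deg + 1)])     # clamped right end
n_basis = len(knots) - deg - 1                 # 7 basis functions

x = np.linspace(0.0, 1.0 - 1e-9, 200)
B = np.column_stack([BSpline(knots, np.eye(n_basis)[i], deg)(x)
                     for i in range(n_basis)])

# Key properties: local support, non-negativity and partition of unity,
# which make shape constraints (monotonicity, convexity) easy to impose
# through linear constraints on the coefficients.
unity = np.allclose(B.sum(axis=1), 1.0)
```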
  11. By: Biørn, Erik (Dept. of Economics, University of Oslo)
    Abstract: The paper considers identification of coefficients in equations explaining a continuous variable, say the number of sickness absence days of an individual per year, by cohort, time and age, subject to their definitional identity. Extensions of a linear equation to polynomials, including additive polynomials, are explored. The cohort+time=age identity makes the treatment of interactions important. If no interactions between the three variables are included, only the coefficients of the linear terms remain unidentified unless additional information is available. Illustrations using a large data set for individual long-term sickness absence in Norway are given. The sensitivity of the estimated marginal effects of cohort and age at the sample mean, as well as of conclusions about the equations’ curvature, is illustrated. We find notable differences in this respect between linear and quadratic equations on the one hand and cubic and fourth-order polynomials on the other.
    Keywords: Age-cohort-time problem; Identification; Polynomial regression; Interaction; Age-cohort curvature; Panel data; Sickness absence
    JEL: C23 C24 C25 C52 H55 I18 J21
    Date: 2013–03–21
    URL: http://d.repec.org/n?u=RePEc:hhs:osloec:2013_008&r=ecm
  12. By: Berliant, Marcus; Weiss, Adam
    Abstract: We examine spatial econometric issues arising from the model specification in Henderson, Storeygard and Weil (2012), that uses night light data to proxy for missing or unreliable GDP growth data.
    Keywords: GDP, Night light data, Spatial autocorrelation
    JEL: C21 C23
    Date: 2013–06–01
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:47340&r=ecm
  13. By: Pasquale Cirillo
    Abstract: Pareto distributions, and power laws in general, have proven to be very useful models for describing very different phenomena, from physics to finance. In recent years, the econophysical literature has proposed a large number of papers and models justifying the presence of power laws in economic data. Most of the time, this Paretianity is inferred from the observation of some plots, such as the Zipf plot and the mean excess plot. If the Zipf plot looks almost linear, then everything is deemed fine and the parameters of the Pareto distribution are estimated, often with OLS. Unfortunately, as we show in this paper, these heuristic graphical tools are not reliable. More precisely, we show that only a combination of plots can give some degree of confidence about the real presence of Paretianity in the data. We start by reviewing some of the most important plots, discussing their points of strength and weakness, and then we propose some additional tools that can be used to refine the analysis.
    Date: 2013–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1306.0100&r=ecm
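    The Zipf-plot heuristic criticized above is easy to reproduce. The sketch below (illustrative sample and function name) fits the usual OLS line to a genuine Pareto sample, the one case in which the plot really is near-linear; the paper's warning is that near-linearity alone is weak evidence.

```python
import numpy as np

rng = np.random.default_rng(1)

def zipf_slope(x):
    """OLS slope of the Zipf plot: log rank (in descending order of the
    data) regressed on log value. For an exact Pareto(alpha) sample the
    slope is close to -alpha."""
    xs = np.sort(x)[::-1]
    ranks = np.arange(1, len(xs) + 1)
    slope, _ = np.polyfit(np.log(xs), np.log(ranks), 1)
    return slope

# Pareto(alpha = 2) draws via inverse transform: X = (1 - U)**(-1/alpha)
x = (1.0 - rng.random(50_000)) ** (-0.5)
```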
  14. By: Tim Robinson (Reserve Bank of Australia)
    Abstract: Different approaches to modelling the macroeconomy vary in the emphasis they place on coherence with theory relative to their ability to match the data. Dynamic stochastic general equilibrium (DSGE) models place greater emphasis on theory, while vector autoregression (VAR) models tend to provide a better fit of the data. Del Negro and Schorfheide (2004) develop a method of using a DSGE model to inform the priors of a Bayesian VAR. The resulting BVAR-DSGE model partially relaxes the relationships in the DSGE so as to fit the data better. However, their approach does not accommodate the typical restriction of small open economy models which ensures that developments in the small economy cannot affect the large economy. I develop a method that allows this restriction to be imposed and introduce a simple way, suitable for small open economies, of identifying the empirical BVAR-DSGE using information from the DSGE model. These methods are demonstrated using the Justiniano and Preston (2010a) DSGE model. Compared to the DSGE model, the empirical BVAR-DSGE model estimates that there is a larger role for foreign shocks in the small economy's business cycle.
    Keywords: BVAR-DSGE; small open economy
    JEL: C11 C32 C51 E30
    Date: 2013–06
    URL: http://d.repec.org/n?u=RePEc:rba:rbardp:rdp2013-06&r=ecm
  15. By: Mardi Dungey; Jan P.A.M. Jacobs; Jing Tian; Simon van Norden
    Abstract: A well-documented property of the Beveridge-Nelson trend-cycle decomposition is the perfect negative correlation between trend and cycle innovations. We show how this may be consistent with a structural model where trend shocks enter the cycle, or cyclic shocks enter the trend and that identification restrictions are necessary to make this structural distinction. A reduced-form unrestricted version such as that of Morley, Nelson and Zivot (2003) is compatible with either option, but cannot distinguish which is relevant. We discuss economic interpretations and implications using US real GDP data.
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:fip:fedpwp:13-22&r=ecm

This nep-ecm issue is ©2013 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.