nep-ecm New Economics Papers
on Econometrics
Issue of 2014‒08‒16
eleven papers chosen by
Sune Karlsson
Örebro universitet

  1. Alternative estimating procedures for multiple membership logit models with mixed effects: indirect inference and data cloning By Anna Gottard; Giorgio Calzolari
  2. Inference in VARs with Conditional Heteroskedasticity of Unknown Form By Ralf Brüggemann; Carsten Jentsch; Carsten Trenkler
  3. Powerful nonparametric checks for quantile regression By Maistre, Samuel; Lavergne, Pascal; Patilea, Valentin
  4. A Significance Test for Covariates in Nonparametric Regression By Lavergne, Pascal; Maistre, Samuel; Patilea, Valentin
  5. A J-Test for Panel Models with Fixed Effects, Spatial and Time Dependence By Harry H. Kelejian; Gianfranco Piras
  6. Unit root tests for dependent and heterogeneous micropanels By In Choi
  7. Corrections to: Multivariate normal distribution approaches for dependently truncated data By Pan, Chi-Hung; Emura, Takeshi
  8. ROM Simulation with Rotation Matrices By Daniel Ledermann; Carol Alexander
  9. Monetary Policy Indeterminacy and Identification Failures in the U.S.: Results from a Robust Test By Efrem Castelnuovo; Luca Fanelli
  10. Specification and Estimation of Gravity Models: A Review of the Issues in the Literature By Fatima Olanike Kareem; Olayinka Idowu Kareem
  11. A Dual Least-Squares Estimator of the Errors-In-Variables Model Using Only First And Second Moments By Paris, Quirino

  1. By: Anna Gottard (Dipartimento di Statistica, Informatica, Applicazioni "G. Parenti", Università di Firenze); Giorgio Calzolari (Dipartimento di Statistica, Informatica, Applicazioni "G. Parenti", Università di Firenze)
    Abstract: Multiple-membership logit models with random effects are logit models for clustered binary data in which each statistical unit can belong to more than one group. For these models, the likelihood function is analytically intractable. We propose two different approaches to parameter estimation: data cloning and indirect inference. Data cloning computes maximum likelihood estimates through the posterior distribution of an adequate Bayesian model fitted to cloned data. We implement a data cloning algorithm specific to the case of multiple-membership models. Indirect inference is a non-likelihood-based method that uses an auxiliary model to select sensible estimates. We propose an auxiliary model with the same parameter-space dimension as the target model, which is particularly convenient for reaching good estimates quickly. A Monte Carlo experiment compares the two approaches on a set of simulated data. We also report Bayesian posterior mean and INLA hybrid data cloning estimates for comparison. Simulations show a negligible loss of efficiency for the indirect inference estimator, offset by a substantial computational gain. The approaches are then illustrated with a real example on matched paired data.
    Keywords: Binary data, Bradley Terry models, intractable likelihood, integrated nested Laplace approximation, non-hierarchical random effects models
    JEL: C51
    Date: 2014–07
    URL: http://d.repec.org/n?u=RePEc:fir:econom:wp2014_07&r=ecm
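The data-cloning principle the abstract invokes can be seen in a toy conjugate case. The sketch below is an illustration of the general idea only, not the paper's multiple-membership algorithm, and all names in it are mine: replicating the sample k times makes the posterior concentrate on the maximum likelihood estimate.

```python
import numpy as np

def data_cloning_posterior(x, k, sigma=1.0):
    """Toy illustration of data cloning for a normal mean (known sigma).

    With a flat prior, fitting the Bayesian model to k copies of the
    data gives the posterior N(xbar, sigma^2 / (k * n)). As k grows,
    the posterior mean stays at the MLE xbar while the posterior
    variance shrinks at rate 1/k, so k * (posterior variance) recovers
    the MLE's asymptotic variance sigma^2 / n.
    """
    n = len(x)
    xbar = float(np.mean(x))
    post_var = sigma ** 2 / (k * n)
    return xbar, post_var
```

In the intractable multiple-membership case the same limit is reached by Monte Carlo sampling from the posterior of the cloned data rather than in closed form.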
  2. By: Ralf Brüggemann (Department of Economics, University of Konstanz, Germany); Carsten Jentsch (Department of Economics, University of Mannheim, Germany); Carsten Trenkler (Department of Economics, University of Mannheim, Germany)
    Abstract: We derive a framework for asymptotically valid inference in stable vector autoregressive (VAR) models with conditional heteroskedasticity of unknown form. We prove a joint central limit theorem for the VAR slope parameter and innovation covariance parameter estimators and address bootstrap inference as well. Our results are important for correct inference on VAR statistics that depend on both the VAR slope and the variance parameters, as, e.g., structural impulse response functions (IRFs) do. We also show that wild and pairwise bootstrap schemes fail in the presence of conditional heteroskedasticity if inference on (functions of) the unconditional variance parameters is of interest, because they do not correctly replicate the relevant fourth-moment structure of the error terms. In contrast, the residual-based moving block bootstrap results in asymptotically valid inference. We illustrate the practical implications of our theoretical results by providing simulation evidence on the finite sample properties of different inference methods for IRFs. Our results indicate that estimation uncertainty may increase dramatically in the presence of conditional heteroskedasticity. Moreover, most inference methods are likely to understate the true estimation uncertainty substantially in finite samples.
    Keywords: VAR, Conditional heteroskedasticity, Residual-based moving block bootstrap, Pairwise bootstrap, Wild bootstrap
    JEL: C30 C32
    Date: 2014–08–04
    URL: http://d.repec.org/n?u=RePEc:knz:dpteco:1413&r=ecm
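The resampling step behind the residual-based moving block bootstrap can be sketched as follows. This is a generic sketch (function name and block-length choice are illustrative, and the authors' full procedure typically also centers the resampled residuals and re-estimates the VAR), not their exact algorithm:

```python
import numpy as np

def moving_block_bootstrap(residuals, block_length, rng=None):
    """One moving-block-bootstrap draw of a (T x k) residual matrix.

    Overlapping blocks of consecutive residuals are drawn with
    replacement and concatenated. Keeping neighboring residuals
    together preserves short-range dependence, such as the conditional
    heteroskedasticity an i.i.d. or wild resampling scheme destroys.
    """
    rng = np.random.default_rng(rng)
    T = residuals.shape[0]
    n_blocks = int(np.ceil(T / block_length))
    # A block may start at any index 0 .. T - block_length (inclusive)
    starts = rng.integers(0, T - block_length + 1, size=n_blocks)
    blocks = [residuals[s:s + block_length] for s in starts]
    # Trim the concatenation back to the original sample length
    return np.concatenate(blocks)[:T]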
  3. By: Maistre, Samuel; Lavergne, Pascal; Patilea, Valentin
    Abstract: We address the issue of lack-of-fit testing for a parametric quantile regression. We propose a simple test that involves one-dimensional kernel smoothing, so that the rate at which it detects local alternatives is independent of the number of covariates. The test has asymptotically Gaussian critical values, and wild bootstrap can be applied to obtain more accurate ones in small samples. Our procedure appears to be competitive with existing ones in simulations. We illustrate the usefulness of our test on birthweight data.
    Keywords: Goodness-of-fit test, U-statistics, Smoothing.
    JEL: C14 C52
    Date: 2014–06
    URL: http://d.repec.org/n?u=RePEc:tse:wpaper:28289&r=ecm
  4. By: Lavergne, Pascal; Maistre, Samuel; Patilea, Valentin
    Abstract: We consider testing the significance of a subset of covariates in a nonparametric regression. These covariates can be continuous and/or discrete. We propose a new kernel-based test that smooths only over the covariates appearing under the null hypothesis, so that the curse of dimensionality is mitigated. The test statistic is asymptotically pivotal, and the rate at which the test detects local alternatives depends only on the dimension of the covariates under the null hypothesis. We show the validity of wild bootstrap for the test. In small samples, our test is competitive with existing procedures.
    Keywords: Testing, Bootstrap, Kernel Smoothing, U-statistic.
    JEL: C14 C52
    Date: 2014–03
    URL: http://d.repec.org/n?u=RePEc:tse:wpaper:28290&r=ecm
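Both this test and the one in the previous entry rely on the wild bootstrap for small-sample critical values. One wild-bootstrap draw can be sketched generically as below; the Rademacher weight distribution is an assumption on my part (the papers may use other weight schemes), and the function name is illustrative:

```python
import numpy as np

def wild_bootstrap_residuals(residuals, rng=None):
    """One wild-bootstrap draw of regression residuals.

    Each residual is multiplied by an independent Rademacher weight
    (+1 or -1 with probability 1/2). Squared residuals are unchanged,
    so the draw preserves conditional heteroskedasticity while
    breaking any signal under the null.
    """
    rng = np.random.default_rng(rng)
    w = rng.choice([-1.0, 1.0], size=residuals.shape[0])
    return residuals * w
```

Recomputing the test statistic on many such draws and taking empirical quantiles of the bootstrap statistics gives the small-sample critical values.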
  5. By: Harry H. Kelejian (Department of Economics, University of Maryland); Gianfranco Piras (Regional Research Institute, West Virginia University)
    Abstract: In this paper we suggest a J-test, in a spatial panel framework, of a null model against one or more alternatives. The null model we consider has fixed effects, along with spatial and time dependence. The alternatives can have either fixed or random effects. We implement our procedure to test specifications of a model of cigarette demand. We find that the most appropriate specification is one that contains the average price of cigarettes in neighboring states, as well as the spatial lag of the dependent variable. Along with formal large sample results, we also give small sample Monte Carlo results. Our large sample results are based on the assumption that N → ∞ with T fixed. Our Monte Carlo results suggest that our proposed J-test has good power, and proper size, even for small to moderately sized samples.
    Keywords: spatial panel models, fixed effects, time and spatial lags, non-nested j-test
    JEL: C01 C12
    Date: 2013–03
    URL: http://d.repec.org/n?u=RePEc:rri:wpaper:201303&r=ecm
  6. By: In Choi (Department of Economics, Sogang University, Seoul)
    Abstract: This paper proposes a panel unit root test for micropanels with a short time dimension (T) and a large cross section (N). The test has several distinctive features. First, it is based on a panel AR(1) model that allows for cross-sectional dependence, introduced through a factor structure assumed for the initial conditions. Second, it employs a panel AR(1) model with heterogeneous AR(1) coefficients. Third, it does not use the AR(1) coefficient estimator. The effectiveness of the test rests on the fact that the initial condition has permanent effects on the trajectory of a time series in the presence of a unit root. To measure the effects of the initial condition, this paper employs a cross-sectional regression using the first time-series observation of each unit as the regressor and the last as the dependent variable. If there is a unit root in every individual time series, the coefficient of the regressor is equal to one. The t-ratio for this coefficient is the paper's test statistic and has a standard normal distribution in the limit. The t-ratio is based on an instrumental variables estimator that uses a reshuffled regressor as an instrument. The test proposed in this paper makes it possible to test for a unit root even at T=2, as long as N is large. Simulation results show that the test has reasonable empirical size and power. The test is applied to college graduates' monthly real wages in South Korea, for which only two time-series observations are available. The test rejects the null hypothesis of a unit root.
    Keywords: Unit root, panel data, factor model, internal instrument, earnings dynamics
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:sgo:wpaper:1404&r=ecm
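A stylized sketch of the idea the abstract describes — regress the last observation on the first across units and test whether the slope equals one, using a reshuffled copy of the regressor as the instrument — is given below. This is not the paper's exact statistic: the function name and the robust standard error formula are my assumptions.

```python
import numpy as np

def micropanel_unit_root_t(first_obs, last_obs, rng=None):
    """t-ratio for H0: unit root in every series of a short micropanel.

    Cross-sectional IV regression of the last observation on the
    first; the instrument is a random permutation of the regressor.
    Under a unit root in every series the slope is one.
    """
    rng = np.random.default_rng(rng)
    z = rng.permutation(first_obs)              # reshuffled regressor as instrument
    beta = (z @ last_obs) / (z @ first_obs)     # IV slope estimate
    resid = last_obs - beta * first_obs
    # Heteroskedasticity-robust IV standard error (an assumed form)
    se = np.sqrt(np.sum((z * resid) ** 2)) / abs(z @ first_obs)
    return (beta - 1.0) / se
```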
  7. By: Pan, Chi-Hung; Emura, Takeshi
    Abstract: We provide corrections for Emura and Konno (2010) and numerically verify the corrected formulae. An appendix gives the real data set used in the numerical analysis.
    Keywords: Dependent truncation, Information matrix, Maximum likelihood, Multivariate analysis
    JEL: C02 C13 C16 C34 C83
    Date: 2014–08–09
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:57852&r=ecm
  8. By: Daniel Ledermann (ICMA Centre, Henley Business School, University of Reading); Carol Alexander (ICMA Centre, Henley Business School, University of Reading)
    Abstract: This paper explores the properties of random orthogonal matrix (ROM) simulation when the random matrix is drawn from the class of rotation matrices. We describe the characteristics of ROM simulated samples generated using random Hessenberg, Cayley and exponential matrices and compare the computational efficiency of parametric ROM simulation with standard Monte Carlo techniques.
    Keywords: Computational efficiency, L matrices, Ledermann matrix, Random Orthogonal Matrix (ROM), Rotation matrix, Simulation
    URL: http://d.repec.org/n?u=RePEc:rdg:icmadp:icma-dp2011-06&r=ecm
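One of the constructions the abstract mentions, the Cayley transform, turns random skew-symmetric matrices into random rotations. The sketch below shows that generic construction only (the function name is mine, and the paper's full ROM procedure involves more than this step):

```python
import numpy as np

def cayley_rotation(n, rng=None):
    """Random n x n rotation matrix via the Cayley transform.

    For skew-symmetric S (S.T == -S), R = (I - S) @ inv(I + S) is
    orthogonal with determinant +1, i.e. a rotation matrix; I + S is
    always invertible because S has purely imaginary eigenvalues.
    """
    rng = np.random.default_rng(rng)
    a = rng.standard_normal((n, n))
    s = (a - a.T) / 2.0                 # skew-symmetric part of a random matrix
    i = np.eye(n)
    return (i - s) @ np.linalg.inv(i + s)
```

Orthogonality is what ROM simulation exploits: premultiplying a data matrix by R leaves the cross-product matrix unchanged, since (R @ X).T @ (R @ X) = X.T @ X, so suitably constructed rotated samples keep their sample moments fixed across simulations.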
  9. By: Efrem Castelnuovo (Melbourne Institute of Applied Economic and Social Research, The University of Melbourne; and Department of Economics and Management, University of Padova); Luca Fanelli (Department of Statistical Sciences, University of Bologna)
    Abstract: We propose a novel identification-robust test for the null hypothesis that an estimated new-Keynesian model has a reduced form consistent with the unique stable solution, against the alternative of sunspot-driven multiple equilibria. Our strategy is designed to handle identification failures as well as misspecification of the relevant propagation mechanisms. We invert a likelihood ratio test for the cross-equation restrictions (CER) that the new-Keynesian system places on its reduced form solution under determinacy. If the CER are not rejected, sunspot-driven expectations can be ruled out from the model equilibrium and we accept the structural model. Otherwise, we move to a second step and invert an Anderson and Rubin-type test for the orthogonality restrictions (OR) implied by the system of Euler equations. The hypothesis of indeterminacy and the structural model are accepted if the OR are not rejected. We investigate the finite sample performance of the suggested identification-robust two-step testing strategy through Monte Carlo experiments and then apply it to a new-Keynesian AD/AS model estimated with actual U.S. data. In spite of some evidence of weak identification for the ‘Great Moderation’ period, our results offer formal support to the hypothesis of a switch from indeterminacy to a scenario consistent with uniqueness in the late 1970s. Our identification-robust full-information confidence set for the structural parameters computed on the ‘Great Moderation’ regime turns out to be more precise than the intervals previously reported in the literature through ‘limited-information’ methods.
    Keywords: Confidence set, determinacy, identification failures, indeterminacy, misspecification, new-Keynesian business cycle model, VAR system
    JEL: C31 C22 E31 E52
    Date: 2014–07
    URL: http://d.repec.org/n?u=RePEc:iae:iaewps:wp2014n18&r=ecm
  10. By: Fatima Olanike Kareem; Olayinka Idowu Kareem
    Abstract: The gravity model has become an efficient tool in the analysis of international economic relations owing to its theoretical derivation and its ability to explain these relationships. The contending issue now is the appropriate specification and estimation techniques. This paper reviews the current controversy surrounding the specification and estimation of the gravity model with zero trade data, which we call the ‘gravity modeling estimation debate’. Different positions in the literature are set out with the aim of bringing readers to the frontier of knowledge on empirical strategies for gravity modeling in the presence of zero trade. By and large, the identification of the most appropriate estimation technique in the presence of zero trade remains an empirical issue. The paper concludes that the choice of estimation technique should largely be based on the research questions, the model specification and the data used for the analysis.
    Keywords: Gravity Model, Specification, Estimation, Debate
    JEL: C13 C51 F10
    Date: 2014–06
    URL: http://d.repec.org/n?u=RePEc:rsc:rsceui:2014/74&r=ecm
  11. By: Paris, Quirino
    Abstract: The paper presents an estimator of the errors-in-variables model in multiple regressions using only first- and second-order moments. The consistency property of the estimator is explored by Monte Carlo experiments. Based on these results, we conjecture that the estimator is consistent. The proof of consistency, to be dealt with in another paper, is based upon the assumptions of Kiefer and Wolfowitz (1956). The novel treatment of the errors-in-variables model relies crucially upon a neutral parameterization of the error terms of the dependent and the explanatory variables. The estimator does not have a closed form solution. It requires the maximization of a dual least-squares objective function that guarantees a global optimum. This estimator therefore includes the naïve least-squares method (when only the dependent variable is measured with error) as a special case.
    Keywords: errors-in-variables, measurement errors, dual least squares, first moments, second moments, Monte Carlo, Demand and Price Analysis, Research Methods/Statistical Methods
    JEL: C30
    Date: 2014–06–30
    URL: http://d.repec.org/n?u=RePEc:ags:ucdavw:181288&r=ecm

This nep-ecm issue is ©2014 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.