Econometrics
http://lists.repec.org/mailman/listinfo/nep-ecm
Issue date: 2014-08-16
Editor: Sune Karlsson

Alternative estimating procedures for multiple membership logit models with mixed effects: indirect inference and data cloning
http://d.repec.org/n?u=RePEc:fir:econom:wp2014_07&r=ecm
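The entry below estimates an intractable-likelihood model by data cloning. As illustrative context only, here is a minimal sketch of the data-cloning idea on a hypothetical conjugate Bernoulli model (not the paper's multiple-membership logit): replicating the data K times and taking the Bayesian posterior mean recovers the maximum likelihood estimate as K grows, regardless of the prior.

```python
import numpy as np

# Toy illustration of data cloning (hypothetical Bernoulli example, not the
# paper's model): with K clones, the conjugate posterior is
# Beta(a + K*s, b + K*(n - s)), whose mean converges to the MLE s/n.

rng = np.random.default_rng(0)
n, p_true = 200, 0.3
y = rng.binomial(1, p_true, size=n)
s = y.sum()

a, b = 2.0, 2.0  # Beta(2, 2) prior -- deliberately not flat

def cloned_posterior_mean(K):
    # Posterior mean of p after cloning the data K times.
    return (a + K * s) / (a + b + K * n)

mle = s / n
approx = {K: cloned_posterior_mean(K) for K in (1, 10, 100, 1000)}
# approx[K] approaches the MLE s/n as the number of clones K grows,
# washing out the influence of the prior.
```

In non-conjugate settings such as the paper's, the cloned posterior is explored by MCMC rather than in closed form; the convergence mechanism is the same.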
Multiple-membership logit models with random effects are logit models for clustered binary data in which each statistical unit can belong to more than one group. For these models, the likelihood function is analytically intractable. We propose two different approaches for parameter estimation: data cloning and indirect inference. Data cloning computes maximum likelihood estimates through the posterior distribution of an adequate Bayesian model fitted to cloned data. We implement a data cloning algorithm specific to multiple-membership models. Indirect inference is a non-likelihood-based method that uses an auxiliary model to select sensible estimates. We propose an auxiliary model whose parameter space has the same dimension as that of the target model, which is particularly convenient for reaching good estimates quickly. A Monte Carlo experiment compares the two approaches on a set of simulated data. We also report Bayesian posterior mean and INLA hybrid data cloning estimates for comparison. Simulations show a negligible loss of efficiency for the indirect inference estimator, compensated by a substantial computational gain. The approaches are then illustrated with a real example on matched paired data.
Anna Gottard, Giorgio Calzolari (2014-07)
Keywords: binary data, Bradley-Terry models, intractable likelihood, integrated nested Laplace approximation, non-hierarchical random effects models

Inference in VARs with Conditional Heteroskedasticity of Unknown Form
http://d.repec.org/n?u=RePEc:knz:dpteco:1413&r=ecm
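The entry below advocates a residual-based moving block bootstrap for VARs under conditional heteroskedasticity. As a hedged illustration of that resampling scheme, here is a univariate AR(1) sketch (the model, block length, and sample size are illustrative assumptions, not the authors' choices): overlapping blocks of residuals are drawn and concatenated, preserving local dependence in higher moments that i.i.d. resampling would destroy.

```python
import numpy as np

# Univariate sketch of a residual-based moving block bootstrap
# (toy AR(1) stand-in for the paper's VAR setting).

rng = np.random.default_rng(1)
T, phi = 400, 0.5
u = rng.standard_normal(T) * (1.0 + 0.5 * (rng.standard_normal(T) > 0))
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + u[t]

# OLS fit of y_t on y_{t-1}, then centered residuals.
X, Y = y[:-1], y[1:]
phi_hat = (X @ Y) / (X @ X)
resid = Y - phi_hat * X
resid -= resid.mean()

def mbb_resample(e, block_len, rng):
    # Draw overlapping blocks of residuals and concatenate them.
    n = len(e)
    starts = rng.integers(0, n - block_len + 1, size=int(np.ceil(n / block_len)))
    return np.concatenate([e[s:s + block_len] for s in starts])[:n]

B, block_len = 200, 10
phi_stars = []
for _ in range(B):
    e_star = mbb_resample(resid, block_len, rng)
    y_star = np.zeros(T)
    for t in range(1, T):
        y_star[t] = phi_hat * y_star[t - 1] + e_star[t - 1]
    Xs, Ys = y_star[:-1], y_star[1:]
    phi_stars.append((Xs @ Ys) / (Xs @ Xs))

se_boot = np.std(phi_stars)  # bootstrap standard error of the slope
```

The block structure is what lets this scheme replicate the fourth-moment behavior of the errors, which the paper shows wild and pairwise schemes fail to do for variance-dependent statistics.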
We derive a framework for asymptotically valid inference in stable vector autoregressive (VAR) models with conditional heteroskedasticity of unknown form. We prove a joint central limit theorem for the VAR slope parameter and innovation covariance parameter estimators and address bootstrap inference as well. Our results are important for correct inference on VAR statistics that depend on both the VAR slope and the variance parameters, as, e.g., structural impulse response functions (IRFs). We also show that wild and pairwise bootstrap schemes fail in the presence of conditional heteroskedasticity if inference on (functions of) the unconditional variance parameters is of interest, because they do not correctly replicate the relevant fourth-moment structure of the error terms. In contrast, the residual-based moving block bootstrap results in asymptotically valid inference. We illustrate the practical implications of our theoretical results by providing simulation evidence on the finite sample properties of different inference methods for IRFs. Our results show that estimation uncertainty may increase dramatically in the presence of conditional heteroskedasticity. Moreover, most inference methods are likely to understate the true estimation uncertainty substantially in finite samples.
Ralf Brüggemann, Carsten Jentsch, Carsten Trenkler (2014-08-04)
Keywords: VAR, conditional heteroskedasticity, residual-based moving block bootstrap, pairwise bootstrap, wild bootstrap

Powerful nonparametric checks for quantile regression
http://d.repec.org/n?u=RePEc:tse:wpaper:28289&r=ecm
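The entry below obtains small-sample critical values by wild bootstrap. As context, here is a generic wild-bootstrap sketch on a hypothetical linear-mean model (not the paper's quantile-regression statistic): residuals are multiplied by independent Rademacher weights, so the regenerated data mimic the conditional heteroskedasticity of the original errors.

```python
import numpy as np

# Generic wild bootstrap sketch (hypothetical example): resample by flipping
# residual signs with Rademacher weights, preserving conditional variance.

rng = np.random.default_rng(4)
n = 300
x = rng.uniform(-1, 1, n)
y = 1.0 + 2.0 * x + np.abs(x) * rng.standard_normal(n)  # heteroskedastic errors

X = np.column_stack([np.ones(n), x])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat

B = 500
slopes = np.empty(B)
for b in range(B):
    w = rng.choice([-1.0, 1.0], size=n)      # Rademacher weights
    y_star = X @ beta_hat + w * resid        # regenerated sample
    b_star, *_ = np.linalg.lstsq(X, y_star, rcond=None)
    slopes[b] = b_star[1]

se_wild = slopes.std()  # wild-bootstrap standard error of the slope
```

In a testing context such as the paper's, the same resampling loop is applied to the test statistic itself, and the bootstrap distribution of the statistic supplies the critical values.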
We address the issue of lack-of-fit testing for a parametric quantile regression. We propose a simple test that involves one-dimensional kernel smoothing, so that the rate at which it detects local alternatives is independent of the number of covariates. The test has asymptotically Gaussian critical values, and wild bootstrap can be applied to obtain more accurate ones in small samples. Our procedure appears to be competitive with existing ones in simulations. We illustrate the usefulness of our test on birthweight data.
Samuel Maistre, Pascal Lavergne, Valentin Patilea (2014-06)
Keywords: goodness-of-fit test, U-statistics, smoothing

A Significance Test for Covariates in Nonparametric Regression
http://d.repec.org/n?u=RePEc:tse:wpaper:28290&r=ecm
We consider testing the significance of a subset of covariates in a nonparametric regression. These covariates can be continuous and/or discrete. We propose a new kernel-based test that smoothes only over the covariates appearing under the null hypothesis, so that the curse of dimensionality is mitigated. The test statistic is asymptotically pivotal, and the rate at which the test detects local alternatives depends only on the dimension of the covariates under the null hypothesis. We show the validity of the wild bootstrap for the test. In small samples, our test is competitive with existing procedures.
Pascal Lavergne, Samuel Maistre, Valentin Patilea (2014-03)
Keywords: testing, bootstrap, kernel smoothing, U-statistic

A J-Test for Panel Models with Fixed Effects, Spatial and Time
http://d.repec.org/n?u=RePEc:rri:wpaper:201303&r=ecm
In this paper we suggest a J-test in a spatial panel framework of a null model against one or more alternatives. The null model we consider has fixed effects, along with spatial and time dependence. The alternatives can have either fixed or random effects. We implement our procedure to test the specifications of a cigarette demand model. We find that the most appropriate specification is one that contains the average price of cigarettes in neighboring states, as well as the spatial lag of the dependent variable. Along with formal large sample results, we also give small sample Monte Carlo results. Our large sample results are based on the assumption that N → ∞ while T is fixed. Our Monte Carlo results suggest that our proposed J-test has good power, and proper size even for small to moderately sized samples.
Harry H. Kelejian, Gianfranco Piras (2013-03)
Keywords: spatial panel models, fixed effects, time and spatial lags, non-nested J-test

Unit root tests for dependent and heterogeneous micropanels
http://d.repec.org/n?u=RePEc:sgo:wpaper:1404&r=ecm
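The entry below builds a unit root test on a cross-sectional regression of the last observation on the first. As a toy check of that mechanism (a hypothetical simulation using plain OLS; the paper's reshuffled-regressor instrument and factor dependence structure are not reproduced here): under a unit root the slope is 1, while under a stationary AR(1) with coefficient rho it is rho^(T-1).

```python
import numpy as np

# Toy check: cross-sectional regression of the last observation on the first.
# Under a unit root the initial condition has a permanent, one-for-one effect.

rng = np.random.default_rng(2)
N, T = 20000, 5

def slope_last_on_first(rho):
    # Simulate N independent AR(1) paths of length T with coefficient rho,
    # then regress the last cross-section on the first (no intercept).
    y = rng.standard_normal(N)        # heterogeneous initial observations
    first = y.copy()
    for _ in range(T - 1):
        y = rho * y + rng.standard_normal(N)
    return (first @ y) / (first @ first)

b_unit = slope_last_on_first(1.0)  # random walk: slope near 1
b_stat = slope_last_on_first(0.5)  # stationary AR(1): slope near 0.5**4
```

Because only the first and last cross-sections enter the regression, the idea works even at T = 2 when N is large, which is the feature the paper exploits for the two-period wage data.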
This paper proposes a panel unit root test for micropanels with a short time dimension (T) and a large cross section (N). The test has several distinctive features. First, it is based on a panel AR(1) model that allows for cross-sectional dependence, introduced by assuming a factor structure for the initial condition. Second, it employs the panel AR(1) model with heterogeneous AR(1) coefficients. Third, it does not use the AR(1) coefficient estimator. The effectiveness of the test rests on the fact that the initial condition has permanent effects on the trajectory of a time series in the presence of a unit root. To measure the effects of the initial condition, this paper employs a cross-sectional regression using the first time series observations as a regressor and the last as a dependent variable. If there is a unit root in every individual time series, the coefficient of the regressor is equal to one. The t-ratio for the coefficient is this paper's test statistic and has a standard normal distribution in the limit. The t-ratio is based on the instrumental variables estimator that uses a reshuffled regressor as an instrument. The test proposed in this paper makes it possible to test for a unit root even at T=2 as long as N is large. Simulation results show that the test has reasonable empirical size and power. The test is applied to college graduates' monthly real wages in South Korea, for which the number of time series observations is only two. The test rejects the null hypothesis of a unit root.
In Choi (2014)
Keywords: unit root, panel data, factor model, internal instrument, earnings dynamics

Corrections to: Multivariate normal distribution approaches for dependently truncated data
http://d.repec.org/n?u=RePEc:pra:mprapa:57852&r=ecm
We provide corrections for Emura and Konno (2010) and numerically verify the corrected formulae. An appendix gives the real data set used for the numerical analysis.
Chi-Hung Pan, Takeshi Emura (2014-08-09)
Keywords: dependent truncation, information matrix, maximum likelihood, multivariate analysis

ROM Simulation with Rotation Matrices
http://d.repec.org/n?u=RePEc:rdg:icmadp:icma-dp2011-06&r=ecm
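The entry below studies ROM simulation with rotation matrices. As a hedged sketch of the underlying idea (illustrative only; it does not reproduce the authors' Hessenberg, Cayley, or exponential constructions, and it matches only the second-moment matrix, not the mean): if Z has exactly orthonormal columns, then X = sqrt(m) Z L' reproduces the target covariance Sigma = L L' exactly, and pre-multiplying Z by any rotation R yields a fresh sample with the same exact moments, since (RZ)'(RZ) = Z'Z = I.

```python
import numpy as np

# Sketch of random orthogonal matrix (ROM) simulation: exact second moments,
# with new samples generated by random rotations rather than new random draws.

rng = np.random.default_rng(3)
m, n = 50, 3                        # observations x variables
Sigma = np.array([[1.0, 0.3, 0.1],
                  [0.3, 1.0, 0.2],
                  [0.1, 0.2, 1.0]])
L = np.linalg.cholesky(Sigma)

# Z with exactly orthonormal columns, via reduced QR of a Gaussian matrix.
Z, _ = np.linalg.qr(rng.standard_normal((m, n)))

# Random m x m rotation: orthogonal factor from QR, sign-fixed to det(R) = +1.
R, _ = np.linalg.qr(rng.standard_normal((m, m)))
if np.linalg.det(R) < 0:
    R[:, 0] = -R[:, 0]

X1 = np.sqrt(m) * Z @ L.T
X2 = np.sqrt(m) * (R @ Z) @ L.T     # a different sample, same exact covariance

C1 = X1.T @ X1 / m
C2 = X2.T @ X2 / m
# Both C1 and C2 equal Sigma up to floating-point error.
```

Generating new samples by cheap rotations of a fixed Z, instead of fresh Monte Carlo draws, is the source of the computational efficiency the paper investigates.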
This paper explores the properties of random orthogonal matrix (ROM) simulation when the random matrix is drawn from the class of rotation matrices. We describe the characteristics of ROM simulated samples that are generated using random Hessenberg, Cayley and exponential matrices, and compare the computational efficiency of parametric ROM simulations with standard Monte Carlo techniques.
Daniel Ledermann, Carol Alexander
Keywords: computational efficiency, L matrices, Ledermann matrix, random orthogonal matrix (ROM), rotation matrix, simulation

Monetary Policy Indeterminacy and Identification Failures in the U.S.: Results from a Robust Test
http://d.repec.org/n?u=RePEc:iae:iaewps:wp2014n18&r=ecm
We propose a novel identification-robust test for the null hypothesis that an estimated new-Keynesian model has a reduced form consistent with the unique stable solution, against the alternative of sunspot-driven multiple equilibria. Our strategy is designed to handle identification failures as well as misspecification of the relevant propagation mechanisms. We invert a likelihood ratio test for the cross-equation restrictions (CER) that the new-Keynesian system places on its reduced form solution under determinacy. If the CER are not rejected, sunspot-driven expectations can be ruled out from the model equilibrium and we accept the structural model. Otherwise, we move to a second step and invert an Anderson and Rubin-type test for the orthogonality restrictions (OR) implied by the system of Euler equations. The hypothesis of indeterminacy and the structural model are accepted if the OR are not rejected. We investigate the finite sample performance of the suggested identification-robust two-step testing strategy with some Monte Carlo experiments and then apply it to a new-Keynesian AD/AS model estimated with actual U.S. data. In spite of some evidence of weak identification for the ‘Great Moderation’ period, our results offer formal support to the hypothesis of a switch from indeterminacy to a scenario consistent with uniqueness that occurred in the late 1970s. Our identification-robust full-information confidence set for the structural parameters computed on the ‘Great Moderation’ regime turns out to be more precise than the intervals previously reported in the literature through ‘limited-information’ methods.
Efrem Castelnuovo, Luca Fanelli (2014-07)
Keywords: confidence set, determinacy, identification failures, indeterminacy, misspecification, new-Keynesian business cycle model, VAR system

Specification and Estimation of Gravity Models: A Review of the Issues in the Literature
http://d.repec.org/n?u=RePEc:rsc:rsceui:2014/74&r=ecm
The gravity model has become an effective tool in the analysis of international economic relations, owing to its theoretical derivation and its ability to explain these relationships. The contentious issue now is the appropriate specification and estimation technique. This paper reviews the current controversy surrounding the specification and estimation of the gravity model with zero trade data, which we call the ‘gravity modeling estimation debate’. We set out the different positions in the literature with a view to bringing readers to the frontier of knowledge on empirical strategies for gravity modeling in the presence of zero trade. By and large, identifying the most appropriate estimation technique in the presence of zero trade remains an empirical issue. We conclude that the choice of estimation technique should largely be based on the research questions, the model specification, and the data to be used for the analysis.
Fatima Olanike Kareem, Olayinka Idowu Kareem (2014-06)
Keywords: gravity model, specification, estimation, debate

A Dual Least-Squares Estimator of the Errors-In-Variables Model Using Only First and Second Moments
http://d.repec.org/n?u=RePEc:ags:ucdavw:181288&r=ecm
The paper presents an estimator of the errors-in-variables model in multiple regression using only first- and second-order moments. The consistency property of the estimator is explored by Monte Carlo experiments. Based on these results, we conjecture that the estimator is consistent. The proof of consistency, to be dealt with in another paper, is based upon the assumptions of Kiefer and Wolfowitz (1956). The novel treatment of the errors-in-variables model relies crucially upon a neutral parameterization of the error terms of the dependent and the explanatory variables. The estimator does not have a closed-form solution; it requires the maximization of a dual least-squares objective function that guarantees a global optimum. This estimator therefore includes the naïve least-squares method (when only the dependent variable is measured with error) as a special case.
Quirino Paris (2014-06-30)
Keywords: errors-in-variables, measurement errors, dual least squares, first moments, second moments, Monte Carlo, demand and price analysis, research methods/statistical methods, C30