New Economics Papers on Econometrics
By: | Juan Carlos Escanciano (Indiana University) |
Abstract: | This paper investigates estimation of linear regression models with strictly exogenous instruments under minimal identifying assumptions. It is known that under this setting the commonly used Instrumental Variables (IV) estimators are not uniformly consistent (uniformity is in the underlying data generating process). This negative result is due to the lack of "continuity" in the identification of IV caused by weak instruments. This paper introduces a uniformly consistent estimator in this setting. The proposed estimator, called here the Integrated Instrumental Variables (IIV) estimator, is a weighted least squares estimator with trivial implementation. Monte Carlo evidence supports the theoretical claims and suggests that the IIV estimator is a robust and reliable alternative to IV and optimal IV in finite samples under weak identification and strictly exogenous instruments. An application to estimating the elasticity of intertemporal substitution highlights the merits of the proposed approach over classical IV methods. |
Keywords: | Identification; Instrumental variables; Weak instruments; Efficient IV; Intertemporal elasticity of substitution
Date: | 2015–08 |
URL: | http://d.repec.org/n?u=RePEc:inu:caeprp:2015023&r=ecm |
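Not the paper's IIV estimator (its weights are defined in the paper itself), but a minimal simulation of the weak-instrument setting the abstract describes, contrasting OLS with the classical IV estimator whose fragility motivates the proposal; all data-generating values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta, pi = 500, 1.0, 0.05             # pi near zero: weak instrument

z = rng.normal(size=n)                   # strictly exogenous instrument
u = rng.normal(size=n)                   # first-stage error
e = 0.8 * u + 0.6 * rng.normal(size=n)   # corr(e, u) != 0: x is endogenous
x = pi * z + u
y = beta * x + e

beta_ols = (x @ y) / (x @ x)   # inconsistent under endogeneity
beta_iv = (z @ y) / (z @ x)    # consistent pointwise, erratic when pi ~ 0
print(beta_ols, beta_iv)
```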
By: | Marco Bee; Roberto Benedetti; Giuseppe Espa |
Abstract: | Likelihood inference for the Bingham distribution is difficult because the density function contains a normalization constant that cannot be computed in closed form. We propose to estimate the parameters by means of Approximate Maximum Likelihood Estimation (AMLE), thus bypassing the problem of evaluating the likelihood function. We study the impact of the input parameters of the AMLE algorithm and suggest some methods for choosing their numerical values. Moreover, we compare AMLE to the standard approach of numerically maximizing the (approximate) likelihood obtained with the normalization constant estimated via the Holonomic Gradient Method (HGM). For the Bingham distribution on the sphere, simulation experiments and real-data applications produce similar outcomes for both methods. On the other hand, AMLE outperforms HGM when the dimension increases.
Keywords: | Directional data; Simulation; Intractable Likelihood; Sufficient statistics |
Date: | 2015 |
URL: | http://d.repec.org/n?u=RePEc:trn:utwprg:2015/02&r=ecm |
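A minimal sketch of the AMLE idea the abstract relies on: sample candidate parameters, simulate data per candidate, and retain the candidates whose sufficient statistics land closest to those observed. A normal model stands in for the Bingham density (whose simulation and normalizing constant are the hard part); all tuning values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy observed sample; under the stand-in normal model, the mean and
# standard deviation are its sufficient statistics.
x_obs = rng.normal(2.0, 1.5, size=200)
s_obs = np.array([x_obs.mean(), x_obs.std()])

# AMLE loop: draw candidates from a flat instrumental distribution,
# simulate one dataset per candidate, keep the closest candidates.
m = 20_000
mu = rng.uniform(-5, 5, size=m)
sig = rng.uniform(0.1, 5, size=m)
sims = rng.normal(mu[:, None], sig[:, None], size=(m, x_obs.size))
s_sim = np.stack([sims.mean(axis=1), sims.std(axis=1)], axis=1)

d = np.linalg.norm(s_sim - s_obs, axis=1)
keep = d <= np.quantile(d, 0.005)          # retain the closest 0.5%
print(mu[keep].mean(), sig[keep].mean())   # approximate ML estimates
```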
By: | Drew Creal (The University of Chicago Booth School of Business, United States); Siem Jan Koopman (VU University Amsterdam, the Netherlands); André Lucas (VU University Amsterdam, the Netherlands); Marcin Zamojski (VU University Amsterdam, the Netherlands) |
Abstract: | We introduce a new estimation framework which extends the Generalized Method of Moments (GMM) to settings where a subset of the parameters vary over time with unknown dynamics. To filter out the dynamic path of the time-varying parameter, we approximate the dynamics by an autoregressive process driven by the score of the local GMM criterion function. Our approach is completely observation driven, rendering estimation and inference straightforward. It provides a unified framework for modeling parameter instability in a context where the model and its parameters are only specified through (conditional) moment conditions, thus generalizing approaches built on fully specified parametric models. We provide examples of increasing complexity to highlight the advantages of our method. |
Keywords: | dynamic models; time-varying parameters; generalized method of moments; non-linearity |
JEL: | C10 C22 C32 C51 |
Date: | 2015–12–24 |
URL: | http://d.repec.org/n?u=RePEc:tin:wpaper:20150138&r=ecm |
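A toy version of the score-driven filtering idea, assuming the simplest possible moment condition E[y_t − mu_t] = 0 (a time-varying mean); the autoregression coefficients are fixed here for illustration, whereas the paper estimates them from the GMM criterion:

```python
import numpy as np

rng = np.random.default_rng(2)

# Series with a slowly drifting mean.
T = 1000
true_mu = np.cumsum(rng.normal(0, 0.05, size=T))
y = true_mu + rng.normal(0, 1, size=T)

# Moment condition E[y_t - mu_t] = 0; its local score is y_t - mu_t.
# The time-varying parameter follows an autoregression driven by it.
omega, a, b = 0.0, 0.10, 0.98   # fixed here; estimated in the paper
mu = np.zeros(T)
for t in range(T - 1):
    score = y[t] - mu[t]
    mu[t + 1] = omega + b * mu[t] + a * score

print(np.mean((mu - true_mu) ** 2))   # filter tracking error
```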
By: | Monokroussos, George |
Abstract: | This paper proposes a Bayesian nowcasting approach that utilizes information coming both from large real-time data sets and from priors constructed using internet search popularity measures. Exploiting rich information sets has been shown to deliver significant gains in nowcasting contexts, whereas popularity priors can lead to better nowcasts in the face of model and data uncertainty in real time, challenges which can be particularly relevant during turning points. It is shown, for a period centered on the latest recession in the United States, that this approach has the potential to deliver particularly good real-time nowcasts of GDP growth. |
Keywords: | Nowcasting, Gibbs Sampling, Factor Models, Kalman Filter, Real-Time Data, Google Trends, Monetary Policy, Great Recession. |
JEL: | C11 C22 C53 E37 E52 |
Date: | 2015–11–01 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:68594&r=ecm |
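A stylized sketch of how a popularity prior could enter: a conjugate normal update for a single nowcasting coefficient, with the prior mean set from a search-popularity index. The paper's actual machinery (factor model, Kalman filter, Gibbs sampler) is far richer; every number below is a made-up assumption:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy nowcast: quarterly growth regressed on one activity indicator.
n = 40
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(0, 0.4, size=n)

# Popularity prior: centre the slope's prior on a value derived from
# a search-popularity index (0.6 is a made-up number).
b0, tau2, sigma2 = 0.6, 0.25, 0.16

# Conjugate normal posterior for the slope, error variance known.
prec = 1 / tau2 + (x @ x) / sigma2
b_post = (b0 / tau2 + (x @ y) / sigma2) / prec
print(b_post, 1 / prec)   # posterior mean and variance
```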
By: | Hiroaki Chigira; Tsunemasa Shiba |
Abstract: | We propose a Bayesian procedure to estimate heteroskedastic variances of the regression error term ω, when the form of heteroskedasticity is unknown. The prior information on ω is based on a Dirichlet distribution, and in the Markov Chain Monte Carlo sampling, the parameters of its proposal density are elicited from the well-known Eicker-White Heteroskedasticity Consistent Variance-Covariance Matrix Estimator. We present an empirical example to show that our scheme works.
Date: | 2015–12–22 |
URL: | http://d.repec.org/n?u=RePEc:toh:tergaa:341&r=ecm |
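The Eicker-White estimator from which the abstract elicits its proposal-density information is standard; a minimal sketch of the HC0 sandwich form (the Bayesian sampler itself is specified in the paper), with an illustrative data-generating process:

```python
import numpy as np

rng = np.random.default_rng(4)

# Regression whose error variance grows with |x|: heteroskedastic.
n = 300
X = np.column_stack([np.ones(n), rng.normal(size=n)])
e = rng.normal(size=n) * (0.5 + np.abs(X[:, 1]))
y = X @ np.array([1.0, 2.0]) + e

beta = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta

# HC0 sandwich: (X'X)^{-1} X' diag(e_i^2) X (X'X)^{-1}.
XtX_inv = np.linalg.inv(X.T @ X)
meat = X.T @ (X * resid[:, None] ** 2)
V_hc0 = XtX_inv @ meat @ XtX_inv
print(np.sqrt(np.diag(V_hc0)))   # heteroskedasticity-robust std. errors
```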
By: | Fisher, Mark (Federal Reserve Bank of Atlanta) |
Abstract: | This note presents a nonparametric Bayesian approach to fitting a distribution to the survey data provided in Kilian and Zha (2002) regarding the prior for the half-life of deviations from purchasing power parity (PPP). A point mass at infinity is included. The unknown density is represented as an average of shape-restricted Bernstein polynomials, each of which has been skewed according to a preliminary parametric fit. A sparsity prior is adopted for regularization. |
Keywords: | nonparametric Bayesian estimation; Bernstein polynomials; simplex regression; importance sampling; PPP half-life deviations |
JEL: | C11 C14 F31 |
Date: | 2015–12–15 |
URL: | http://d.repec.org/n?u=RePEc:fip:fedawp:2015-15&r=ecm |
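A minimal sketch of a Bernstein-polynomial density, a simplex-weighted mixture of Beta(k, m−k+1) basis densities, with naive histogram weights standing in for the paper's sparsity prior, skewing step, and point mass at infinity:

```python
import numpy as np
from math import comb

def bernstein_density(x, w):
    """Simplex-weighted mixture of Beta(k, m-k+1) densities, k = 1..m."""
    m = len(w)
    basis = np.array([
        m * comb(m - 1, k - 1) * x ** (k - 1) * (1 - x) ** (m - k)
        for k in range(1, m + 1)
    ])
    return w @ basis

rng = np.random.default_rng(5)
data = rng.beta(2, 5, size=1000)

# Naive weights from bin proportions; the paper instead places a
# sparsity prior on w and skews each basis element according to a
# preliminary parametric fit.
m = 10
counts, _ = np.histogram(data, bins=m, range=(0.0, 1.0))
w = counts / counts.sum()

print(bernstein_density(np.linspace(0.05, 0.95, 5), w))
```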
By: | Marco Bee; Giuseppe Espa; Diego Giuliani; Flavio Santi |
Abstract: | In this paper we use the cross-entropy method for noisy optimisation to fit generalised linear multilevel models through maximum likelihood. We propose specifications of the instrumental distributions for positive and bounded parameters that improve the computational performance. We also introduce a new stopping criterion, which has the advantage of being problem-independent. In a second step we find, by means of extensive Monte Carlo experiments, the most suitable values of the input parameters of the algorithm. Finally, we compare the method to a benchmark estimation technique based on numerical integration. The cross-entropy approach turns out to be preferable from both the statistical and the computational point of view. In the last part of the paper, the method is used to model the death probability of firms in the healthcare industry in Italy.
Date: | 2015 |
URL: | http://d.repec.org/n?u=RePEc:trn:utwprg:2015/04&r=ecm |
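A minimal cross-entropy loop for noisy maximization, assuming a Gaussian instrumental distribution and a stand-in noisy objective; the stopping rule below is the naive one, not the problem-independent criterion the paper introduces:

```python
import numpy as np

rng = np.random.default_rng(6)

def noisy_loglik(theta):
    """Stand-in for a simulated, hence noisy, log-likelihood (max at 3)."""
    return -(theta - 3.0) ** 2 + rng.normal(0, 0.5, size=theta.shape)

mu, sigma = 0.0, 5.0                  # Gaussian instrumental distribution
n_draws, elite_frac = 200, 0.1
for it in range(50):
    theta = rng.normal(mu, sigma, size=n_draws)
    scores = noisy_loglik(theta)
    elite = theta[scores >= np.quantile(scores, 1 - elite_frac)]
    mu, sigma = elite.mean(), elite.std()   # refit to the elite draws
    if sigma < 1e-3:   # naive stopping rule, unlike the paper's
        break
print(mu)
```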
By: | Zdravko Botev (The University of New South Wales, Sydney, Australia); Michel Mandjes (University of Amsterdam, the Netherlands); Ad Ridder (VU University Amsterdam, the Netherlands) |
Abstract: | In this article we consider the efficient estimation of the tail distribution of the maximum of correlated normal random variables. We show that the currently recommended Monte Carlo estimator has difficulties in quantifying its precision, because its sample variance estimator is an inefficient estimator of the true variance. We propose a simple remedy: to still use this estimator, but to rely on an alternative quantification of its precision. In addition, we consider a completely new sequential importance sampling estimator of the desired tail probability. Numerical experiments suggest that the sequential importance sampling estimator can be significantly more efficient than its competitor.
Keywords: | Rare event simulation; Correlated Gaussian; Tail probabilities; Sequential importance sampling |
JEL: | C61 C63 |
Date: | 2015–12–15 |
URL: | http://d.repec.org/n?u=RePEc:tin:wpaper:20150132&r=ecm |
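A sketch of the crude Monte Carlo estimator in question, together with the sample-variance-based precision measure whose reliability the paper questions; the equicorrelated covariance and threshold are illustrative assumptions (the sequential importance sampling alternative is developed in the paper):

```python
import numpy as np

rng = np.random.default_rng(7)

# P(max_i X_i > gamma) for X ~ N(0, Sigma), equicorrelated Sigma.
d, rho, gamma, n = 10, 0.5, 3.0, 200_000
Sigma = rho * np.ones((d, d)) + (1 - rho) * np.eye(d)
L = np.linalg.cholesky(Sigma)

X = rng.standard_normal((n, d)) @ L.T
hits = X.max(axis=1) > gamma

p_hat = hits.mean()
se = hits.std() / np.sqrt(n)   # naive precision estimate of p_hat
print(p_hat, se)
```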
By: | Van de gaer, Dirk (Ghent University); Ramos, Xavi (Universitat Autònoma de Barcelona) |
Abstract: | The theoretical literature on inequality of opportunity formulates basic properties that measures of inequality of opportunity should have. Standard methods for the measurement of inequality of opportunity require the construction of counterfactual outcome distributions through statistical methods. We show that, when standard parametric procedures are used to construct the counterfactuals, the specification used determines whether the resulting measures of inequality of opportunity satisfy the basic properties. |
Keywords: | counterfactuals, inequality measurement, opportunities |
JEL: | D3 D63 C1 |
Date: | 2015–12 |
URL: | http://d.repec.org/n?u=RePEc:iza:izadps:dp9582&r=ecm |
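A toy version of the parametric counterfactual construction at issue: outcomes are replaced by their predictions from circumstances alone, and inequality (mean log deviation here) is computed on those predictions. The specification below, log-linear with no interactions, is exactly the kind of choice the paper shows to matter:

```python
import numpy as np

rng = np.random.default_rng(8)

# Toy data: log income driven by a circumstance category plus luck.
n = 5000
circ = rng.integers(0, 3, size=n)
logy = 2.0 + 0.3 * circ + rng.normal(0, 0.5, size=n)

# Parametric counterfactual: predict outcomes from circumstances alone,
# then measure inequality of the predictions.
X = np.column_stack([np.ones(n), circ])
b = np.linalg.lstsq(X, logy, rcond=None)[0]
y_cf = np.exp(X @ b)

mld = np.log(y_cf.mean()) - np.log(y_cf).mean()   # mean log deviation
print(mld)
```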
By: | Franc Klaassen (University of Amsterdam); Rutger Teulings (University of Amsterdam, the Netherlands) |
Abstract: | Fixed effects (FE) in panel data models overlap each other and prohibit the identification of the impact of "constant" regressors. Think of regressors that are constant across countries in a country-time panel with time FE. The traditional approach is to drop some FE and constant regressors by normalizing their impact to zero. We introduce "untangling normalization", meaning that we orthogonalize the FE and, if present, the constant regressors. The untangled FE are much easier to interpret. Moreover, the impact of constant regressors can now be estimated, and the untangled FE indicate to what extent the estimates reflect the true value. Our untangled estimates are a linear transformation of the traditional, zero-normalized estimates; no new estimation is needed. We apply the approach to a gravity model for OECD countries' exports to the US. The constant regressors US GDP, world GDP and the US effective exchange rate explain 90% of the time FE, making the latter redundant, so the estimated impacts indeed reflect the true value.
Keywords: | gravity model; fixed effects; multicollinearity; normalization; orthogonalization |
JEL: | C18 C23 F14 |
Date: | 2015–12–24 |
URL: | http://d.repec.org/n?u=RePEc:tin:wpaper:20150137&r=ecm |
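A sketch of the untangling step on simulated data: project estimated time FE on a "constant" regressor, recovering its impact and the share of the FE it explains. The paper's linear transformation of the full zero-normalized estimates is more general; all values here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(9)

# Time dimension of a country-time panel: one "constant" regressor g
# (constant across countries, varying over time, like US GDP).
T = 40
g = rng.normal(size=T)
fe_time = 0.9 * g + 0.1 * rng.normal(size=T)   # estimated time FE

# Untangling: orthogonalize the time FE against g.  The projection
# coefficient estimates the impact of g; the explained share shows
# how far the time FE become redundant.
G = np.column_stack([np.ones(T), g])
coef = np.linalg.lstsq(G, fe_time, rcond=None)[0]
resid = fe_time - G @ coef
r2 = 1 - resid.var() / fe_time.var()
print(coef[1], r2)   # impact of g and share of the FE it explains
```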
By: | Zuhe Zhang (University of Melbourne); Benjamin Rubinstein (University of Melbourne); Christos Dimitrakakis (Université de Lille Sciences humaines et sociales; Chalmers University of Technology, Gothenburg)
Abstract: | We study how to communicate findings of Bayesian inference to third parties, while preserving the strong guarantee of differential privacy. Our main contributions are four different algorithms for private Bayesian inference on probabilistic graphical models. These include two mechanisms for adding noise to the Bayesian updates, either directly to the posterior parameters, or to their Fourier transform so as to preserve update consistency. We also utilise a recently introduced posterior sampling mechanism, for which we prove bounds for the specific but general case of discrete Bayesian networks; and we introduce a maximum-a-posteriori private mechanism. Our analysis includes utility and privacy bounds, with a novel focus on the influence of graph structure on privacy. Worked examples and experiments with Bayesian naïve Bayes and Bayesian linear regression illustrate the application of our mechanisms.
Keywords: | posterior sampling, Bayesian inference, differential privacy
Date: | 2016–02–10 |
URL: | http://d.repec.org/n?u=RePEc:hal:journl:hal-01234215&r=ecm |
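A minimal instance of the noise-on-updates idea for the simplest conjugate pair, assuming a Beta-Bernoulli model: Laplace noise calibrated to the unit sensitivity of the count gives an ε-differentially-private posterior (the paper's graphical-model mechanisms generalize this):

```python
import numpy as np

rng = np.random.default_rng(10)

# Beta-Bernoulli posterior released under epsilon-differential privacy
# by adding Laplace noise to the sufficient statistic (count of ones),
# whose sensitivity to changing one record is 1.
data = rng.binomial(1, 0.3, size=100)
a0, b0 = 1.0, 1.0        # Beta(1, 1) prior
eps = 1.0                # privacy budget

k = data.sum()
k_priv = np.clip(k + rng.laplace(0, 1 / eps), 0, data.size)

a_post, b_post = a0 + k_priv, b0 + data.size - k_priv
print(a_post / (a_post + b_post))   # private posterior mean
```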