on Econometrics
By: | Badi Baltagi (Center for Policy Research, Maxwell School, Syracuse University, 426 Eggers Hall, Syracuse, NY 13244); Peter Egger (ETH Zürich, KOF Konjunkturforschungsstelle); Michaela Kesina (ETH Zürich, KOF Konjunkturforschungsstelle) |
Abstract: | This paper formulates and analyzes Bayesian model variants for the analysis of systems of spatial panel data with binary dependent variables. The paper focuses on cases where the latent variables of cross-sectional units in one equation of the system contemporaneously depend on the values of the same and, possibly, other latent variables of other cross-sectional units. Moreover, the paper discusses cases where time-invariant effects are exogenous versus endogenous. Such models have numerous potential applications in industrial, public, and international economics. The paper illustrates that Bayesian estimation methods for such models perform well enough to support their use even with relatively small panel data sets.
Keywords: | Spatial Econometrics; Panel Probit; Multivariate Probit
JEL: | C11 C31 C35 |
Date: | 2016–01 |
URL: | http://d.repec.org/n?u=RePEc:max:cprwps:187&r=ecm |
By: | Anna Kormilitsina (Southern Methodist University); Denis Nekipelov (University of Virginia) |
Abstract: | The Laplace-type estimator has become popular in applied macroeconomics, in particular for the estimation of DSGE models. It is often obtained as the mean and variance of a parameter's quasi-posterior distribution, which is defined using a classical estimation objective. We demonstrate that the objective must be properly scaled; otherwise, arbitrarily small confidence intervals can be obtained if they are calculated directly from the quasi-posterior distribution. We estimate a standard DSGE model and find that scaling up the objective may be useful in estimation with problematic parameter identification. In this case, however, it is important to adjust the quasi-posterior variance to obtain valid confidence intervals. (A minimal illustrative sketch of the scaling issue follows this entry.)
Keywords: | Laplace-type estimator, GMM, DSGE model |
JEL: | C11 C13 C15 E30 |
Date: | 2015–05 |
URL: | http://d.repec.org/n?u=RePEc:smu:ecowpa:1510&r=ecm |
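The scaling point can be illustrated with a toy quasi-Bayesian (Laplace-type) calculation. The sketch below is not the authors' code: it assumes a simple scalar GMM-type objective for a mean, draws from the quasi-posterior with a random-walk Metropolis sampler, and shows that multiplying the objective by an arbitrary constant kappa shrinks the quasi-posterior standard deviation by roughly 1/sqrt(kappa), which is why intervals read off an improperly scaled quasi-posterior can be made arbitrarily small.

```python
# Illustrative sketch only (not the authors' code): a scalar Laplace-type
# estimator for a mean, with the objective multiplied by a scaling constant kappa.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=1.5, scale=2.0, size=400)        # toy data
n, xbar, s2 = x.size, x.mean(), x.var()

def quasi_loglik(theta, kappa):
    """GMM-type objective for the mean, times the scaling constant kappa."""
    return -0.5 * kappa * n * (xbar - theta) ** 2 / s2

def quasi_posterior_draws(kappa, n_draws=20000, step=0.5, burn=2000):
    """Random-walk Metropolis draws from the quasi-posterior exp(quasi_loglik)."""
    theta, draws = xbar, []
    for _ in range(n_draws):
        prop = theta + step * rng.normal()
        if np.log(rng.uniform()) < quasi_loglik(prop, kappa) - quasi_loglik(theta, kappa):
            theta = prop
        draws.append(theta)
    return np.array(draws[burn:])

for kappa in (1.0, 10.0):
    d = quasi_posterior_draws(kappa)
    print(f"kappa={kappa:4.1f}  quasi-posterior mean={d.mean():.3f}  sd={d.std():.3f}")
# With kappa=1 the sd is close to the usual s/sqrt(n); with kappa=10 it is
# roughly 1/sqrt(10) as large, so intervals taken directly from the
# quasi-posterior would be far too narrow unless the variance is adjusted.
```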
By: | Timothy B. Armstrong (Cowles Foundation, Yale University); Michal Kolesar (Princeton University) |
Abstract: | We consider the problem of constructing confidence intervals (CIs) for a linear functional of a regression function, such as its value at a point, the regression discontinuity parameter, or a regression coefficient in a linear or partly linear regression. Our main assumption is that the regression function is known to lie in a convex function class, which covers most smoothness and/or shape assumptions used in econometrics. We derive finite-sample optimal CIs and sharp efficiency bounds under normal errors with known variance. We show that these results translate to uniform (over the function class) asymptotic results when the error distribution is not known. When the function class is centrosymmetric, these efficiency bounds imply that minimax CIs are close to efficient at smooth regression functions. This implies, in particular, that it is impossible to form tighter CIs by using data-dependent tuning parameters while maintaining coverage over the whole function class. We specialize our results to inference in a linear regression and to inference on the regression discontinuity parameter, and we illustrate them in simulations and an empirical application.
Keywords: | Nonparametric inference, efficiency bounds |
JEL: | C12 C14 |
Date: | 2016–05 |
URL: | http://d.repec.org/n?u=RePEc:cwl:cwldpp:2043&r=ecm |
By: | Sundström, David (Department of Economics, Umeå University) |
Abstract: | In Paper [I] we use data on Swedish public procurement auctions for internal regular cleaning service contracts to provide novel empirical evidence regarding green public procurement (GPP) and its effect on the potential suppliers’ decision to submit a bid and their probability of being qualified for supplier selection. We find only a weak effect on supplier behavior, which suggests that GPP does not live up to its political expectations. However, several environmental criteria appear to be associated with increased complexity, as indicated by the reduced probability of a bid being qualified in the post-qualification process. As such, GPP appears to have limited or no potential to function as an environmental policy instrument. In Paper [II] the observation is made that empirical evaluations of the effect of policies transmitted through public procurements on bid sizes are made using linear regressions or by more involved non-linear structural models. The aspiration is typically to determine a marginal effect. Here, I compare marginal effects generated under both types of specifications. I study how a political initiative to make firms less environmentally damaging, implemented through public procurement, influences Swedish firms’ behavior. The collected evidence points to a statistically as well as economically significant effect on firms’ bids and costs. Paper [III] begins by noting that auction theory suggests that as the number of bidders (competition) increases, the sizes of the participants’ bids decrease. An issue in the empirical literature on auctions is which measurement(s) of competition to use. Utilizing a dataset on public procurements containing measurements of both the actual and the potential number of bidders, I find that a workhorse model of public procurements is best fitted to data using only the actual number of bidders as the measurement of competition. Acknowledging that all measurements of competition may be erroneous, I propose an instrumental variable estimator that (given my data) yields a competition effect bounded by those generated by specifications using the actual and potential number of bidders, respectively. Also, some asymptotic results are provided for non-linear least squares estimators obtained from a dependent variable transformation model. Paper [IV] introduces a novel method to measure bidders’ costs (valuations) in descending (ascending) auctions. Based on two bounded rationality constraints, bidders’ costs (valuations) are given an imperfect-measurements interpretation robust to behavioral deviations from traditional rationality assumptions. Theory provides no guidance as to the shape of the cost (valuation) distributions, while empirical evidence suggests them to be positively skewed. Consequently, a flexible distribution is employed in an imperfect-measurements framework. An illustration of the proposed method on Swedish public procurement data is provided along with a comparison to a traditional Bayesian Nash Equilibrium approach.
Keywords: | auctions; dependent variable transformation model; green public procurement; indirect inference; instrumental variable; latent variable; log-generalized gamma distribution; maximum likelihood; measurement error; non-linear least squares; objective effectiveness; orthogonal polynomial regression; prediction; simulation estimation; structural estimation |
JEL: | C15 C24 C26 C51 C57 D22 D44 H57 Q01 Q28 |
Date: | 2016–06–08 |
URL: | http://d.repec.org/n?u=RePEc:hhs:umnees:0931&r=ecm |
By: | Bartalotti, Otávio C.; Calhoun, Gray; He, Yang |
Abstract: | This paper develops a novel bootstrap procedure to obtain robust bias-corrected confidence intervals in regression discontinuity (RD) designs using the uniform kernel. The procedure uses a residual bootstrap from a second-order local polynomial to estimate the bias of the local linear RD estimator; the bias is then subtracted from the original estimator. The bias-corrected estimator is then bootstrapped itself to generate valid confidence intervals. The confidence intervals generated by this procedure are valid under conditions similar to Calonico, Cattaneo and Titiunik's (2014, Econometrica) analytical correction, i.e. when the bias of the naive regression discontinuity estimator would otherwise prevent valid inference. This paper also provides simulation evidence that our method is as accurate as the analytical corrections, and we demonstrate its use through a reanalysis of Ludwig and Miller's (2008) Head Start dataset. (An illustrative sketch of the bias-correction step follows this entry.)
Date: | 2016–05–01 |
URL: | http://d.repec.org/n?u=RePEc:isu:genres:3394&r=ecm |
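The core bias-correction step can be sketched in a few lines. The code below is an illustration under stated assumptions, not the authors' implementation: it takes the two bandwidths h (local linear) and b (local quadratic) as given, uses a uniform kernel, and omits the second bootstrap layer that the paper applies to the bias-corrected estimator to form the confidence interval.

```python
# Illustrative sketch (not the authors' code) of bias-correcting a local linear
# RD estimate with a residual bootstrap from a second-order local polynomial.
import numpy as np

rng = np.random.default_rng(1)

def poly_fit(x, y, degree):
    """OLS polynomial fit; returns coefficients and design matrix."""
    X = np.vander(x, degree + 1, increasing=True)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta, X

def rd_local_linear(x, y, h):
    """Local linear RD estimate of the jump at the cutoff x = 0 (uniform kernel)."""
    tau = 0.0
    for sign in (1, -1):
        m = (sign * x >= 0) & (np.abs(x) <= h)
        beta, _ = poly_fit(x[m], y[m], degree=1)
        tau += sign * beta[0]
    return tau

def bias_corrected_rd(x, y, h, b, n_boot=499):
    tau_ll = rd_local_linear(x, y, h)
    # Local quadratic fits on each side (bandwidth b) define the bootstrap DGP.
    dgp, tau_lq = {}, 0.0
    for sign in (1, -1):
        m = (sign * x >= 0) & (np.abs(x) <= b)
        beta, X = poly_fit(x[m], y[m], degree=2)
        dgp[sign] = (m, X @ beta, y[m] - X @ beta)        # mask, fitted, residuals
        tau_lq += sign * beta[0]
    # Residual bootstrap of the local linear estimator around the quadratic DGP.
    boot = np.empty(n_boot)
    for i in range(n_boot):
        y_star = np.full_like(y, np.nan, dtype=float)
        for sign in (1, -1):
            m, mu, res = dgp[sign]
            y_star[m] = mu + rng.choice(res, size=mu.size, replace=True)
        keep = ~np.isnan(y_star)
        boot[i] = rd_local_linear(x[keep], y_star[keep], h)
    bias_hat = boot.mean() - tau_lq                        # estimated bias of tau_ll
    # The paper then bootstraps this bias-corrected estimator again to build
    # the confidence interval; that outer layer is omitted here.
    return tau_ll - bias_hat

# Toy example: quadratic conditional mean with a true jump of 0.5 at the cutoff.
x = rng.uniform(-1, 1, 1000)
y = 0.5 * (x >= 0) + 1.2 * x - 0.8 * x ** 2 + rng.normal(0, 0.3, x.size)
print(round(bias_corrected_rd(x, y, h=0.3, b=0.5), 3))
```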
By: | Raphael Studer; Rainer Winkelmann |
Abstract: | We propose a new non-linear regression model for rating dependent variables. The rating scale model accounts for the upper and lower bounds of ratings. Parametric and semi-parametric estimation is discussed. An application investigates the relationship between stated health satisfaction and physical and mental health scores derived from self-reports of various health impairments, using data from the German Socio-Economic Panel. We compare our new approach to modeling ratings with ordinary least squares (OLS). In one specification, OLS average effects exceed those from our rating scale model by up to 50 percent. Also, OLS in-sample mean predictions violate the upper bound of the dependent variable in a number of cases.
Keywords: | Quasi maximum likelihood, bounded dependent variable, German Socio-Economic Panel |
JEL: | C25 I10 |
Date: | 2016 |
URL: | http://d.repec.org/n?u=RePEc:diw:diwsop:diw_sp846&r=ecm |
By: | Ferman, Bruno; Pinto, Cristine; Possebom, Vitor |
Abstract: | The synthetic control (SC) method has been recently proposed as an alternative method to estimate treatment effects in comparative case studies. Abadie et al. [2010] and Abadie et al. [2015] argue that one of the advantages of the SC method is that it imposes a data-driven process to select the comparison units, providing more transparency and less discretionary power to the researcher. However, an important limitation of the SC method is that it does not provide clear guidance on the choice of predictor variables used to estimate the SC weights. We show that this lack of specific guidance provides significant opportunities for the researcher to search for specifications with statistically significant results, undermining one of the main advantages of the method. Considering six alternative specifications commonly used in SC applications, we calculate in Monte Carlo simulations the probability of finding a statistically significant result at 5% in at least one specification. We find that this probability can be as high as 13% (23% for a 10% significance test) when there are 12 pre-intervention periods and that it decays slowly with the number of pre-intervention periods. With 230 pre-intervention periods, this probability is still around 10% (18% for a 10% significance test). We show that the specification that uses the average pre-treatment outcome values to estimate the weights performed particularly badly in our simulations. However, the specification-searching problem remains relevant even when we do not consider this specification. We also show that this specification-searching problem is relevant in simulations with real datasets looking at placebo interventions in the Current Population Survey (CPS). In order to mitigate this problem, we propose a criterion to select among different SC specifications based on the prediction error of each specification in placebo estimations. (A brief illustrative sketch of the weight estimation follows this entry.)
Date: | 2016–06–08 |
URL: | http://d.repec.org/n?u=RePEc:fgv:eesptd:420&r=ecm |
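To fix ideas, the sketch below shows how synthetic control weights are computed for one given specification (one choice of predictor matrix), using a generic constrained least-squares solver. It is an illustration under simplifying assumptions, not the authors' code: it omits the diagonal predictor-weighting matrix of Abadie et al. and any outer optimization over it, and the data are simulated placeholders.

```python
# Illustrative sketch: synthetic control weights for one specification as a
# constrained least-squares problem (nonnegative weights summing to one).
# The predictor-weighting (V) matrix of Abadie et al. is omitted.
import numpy as np
from scipy.optimize import minimize

def sc_weights(X1, X0):
    """X1: (k,) treated-unit predictors; X0: (k, J) donor-pool predictors."""
    J = X0.shape[1]
    loss = lambda w: float(np.sum((X1 - X0 @ w) ** 2))
    cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
    res = minimize(loss, np.full(J, 1.0 / J), method="SLSQP",
                   bounds=[(0.0, 1.0)] * J, constraints=cons)
    return res.x

rng = np.random.default_rng(2)
X0 = rng.normal(size=(4, 20))               # 4 predictors, 20 donor units
true_w = rng.dirichlet(np.ones(20))         # treated unit is a convex combination
X1 = X0 @ true_w
w = sc_weights(X1, X0)
print(round(float(np.sum((X1 - X0 @ w) ** 2)), 6), round(w.sum(), 3))
# Changing which rows enter X1/X0 (pre-treatment outcome averages, covariates,
# individual pre-treatment outcomes, ...) changes the estimated weights, which
# is the specification-searching margin the paper documents.
```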
By: | Anderson, James (Department of Economics, Boston College); Larch, Mario (University of Bayreuth); Yotov, Yoto (School of Economics, Drexel University)
Abstract: | We develop a simple estimation procedure for general equilibrium (GE) comparative static analysis of gravity models. Non-linear solvers of estimated models are replaced by (constrained) regressions, so applied economists can generate results more readily and with more intuition about the workings of the model. We illustrate with a worldwide border-removal application using the Poisson Pseudo-Maximum-Likelihood (PPML) estimator in Stata, iterated to deliver conditional and full general equilibrium responses. The method works by fully exploiting the combined properties of structural gravity and PPML. Our procedures readily extend to a wide class of general equilibrium production models. (An illustrative sketch of the PPML building block follows this entry.)
Keywords: | Structural Gravity; General Equilibrium Effects; Counterfactuals; Estimation |
JEL: | F10 F14 F16 |
Date: | 2015–11–09 |
URL: | http://d.repec.org/n?u=RePEc:ris:drxlwp:2016_006&r=ecm |
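The regression building block that the procedure iterates on can be sketched as follows. This is a hedged stand-in using Python and statsmodels rather than the paper's Stata implementation: it estimates a PPML gravity equation with exporter and importer fixed effects on simulated placeholder data, and it does not reproduce the iteration that re-solves the fixed effects to obtain conditional and full GE responses.

```python
# Illustrative sketch only: PPML gravity regression with exporter and importer
# fixed effects on simulated data. The paper's GE iteration (re-solving the
# fixed effects after a counterfactual trade-cost change) is not reproduced.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
countries = [f"c{i}" for i in range(10)]
rows = []
for i, o in enumerate(countries):
    for j, d in enumerate(countries):
        if o != d:
            dist = rng.uniform(1.0, 10.0)
            mu = np.exp(3.0 - 1.0 * np.log(dist) + 0.2 * i / 10 + 0.1 * j / 10)
            rows.append({"exporter": o, "importer": d,
                         "ln_dist": np.log(dist), "trade": rng.poisson(mu)})
df = pd.DataFrame(rows)

ppml = smf.glm("trade ~ ln_dist + C(exporter) + C(importer)", data=df,
               family=sm.families.Poisson()).fit(cov_type="HC1")
print(round(ppml.params["ln_dist"], 3))   # estimated trade-cost (distance) elasticity
# Under structural gravity, the exporter and importer fixed effects absorb the
# multilateral resistance terms, which the iterative procedure updates under
# the counterfactual.
```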
By: | Xiaohong Chen (Institute for Fiscal Studies and Yale University); Oliver Linton (Institute for Fiscal Studies and University of Cambridge); Stefan Schneeberger (Institute for Fiscal Studies); Yanping Yi (Institute for Fiscal Studies) |
Abstract: | We propose new methods for estimating the bid-ask spread from observed transaction prices alone. Our methods are based on the empirical characteristic function instead of the sample autocovariance function used by the method of Roll (1984). As in Roll (1984), we have a closed-form expression for the spread, but this is based on only a limited subset of the model-implied identification restrictions. We also provide methods that take account of more identification information. We compare our methods theoretically and numerically with the Roll method as well as with its best-known competitor, the Hasbrouck (2004) method, which uses a Bayesian Gibbs methodology under a Gaussian assumption. Our estimators are competitive with Roll's and Hasbrouck's when the latent true fundamental return distribution is Gaussian, and perform much better when this distribution is far from Gaussian. Our methods are applied to the E-mini futures contract on the S&P 500 during the Flash Crash of May 6, 2010. Extensions to models allowing for unbalanced order flow, Hidden Markov trade direction indicators, trade direction indicators with general asymmetric support, or adverse selection are also presented, without requiring additional data. (An illustrative sketch of the Roll benchmark follows this entry.)
Date: | 2016–03–18 |
URL: | http://d.repec.org/n?u=RePEc:ifs:cemmap:12/16&r=ecm |
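For reference, the Roll (1984) benchmark against which the new estimators are compared has a simple closed form: under Roll's model the first-order autocovariance of price changes equals -s^2/4, so the spread is s = 2*sqrt(-cov). The sketch below implements only this benchmark on simulated data; the paper's empirical-characteristic-function estimators are not reproduced.

```python
# Roll (1984) benchmark only: spread from the first-order autocovariance of
# transaction price changes.
import numpy as np

def roll_spread(prices):
    dp = np.diff(np.asarray(prices, dtype=float))
    gamma1 = np.cov(dp[1:], dp[:-1])[0, 1]                    # cov(dp_t, dp_{t-1})
    return 2.0 * np.sqrt(-gamma1) if gamma1 < 0 else np.nan   # undefined if cov >= 0

# Simulate Roll's model: efficient price random walk plus bid-ask bounce.
rng = np.random.default_rng(3)
s = 0.10                                            # true spread
mid = np.cumsum(rng.normal(0.0, 0.05, 10000))       # efficient (mid) price
q = rng.choice([-1.0, 1.0], size=10000)             # i.i.d. trade direction
p = mid + q * s / 2.0                               # observed transaction price
print(round(roll_spread(p), 3))                     # close to the true 0.10
```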
By: | Ivan A. Canay (Institute for Fiscal Studies and Northwestern University); Azeem Shaikh (Institute for Fiscal Studies) |
Abstract: | This paper surveys some of the recent literature on inference in partially identified models. After reviewing some basic concepts, including the definition of a partially identified model and the identified set, we turn our attention to the construction of confidence regions in partially identified settings. In our discussion, we emphasize the importance of requiring confidence regions to be uniformly consistent in level over relevant classes of distributions. Due to space limitations, our survey is mainly limited to the class of partially identified models in which the identified set is characterized by a finite number of moment inequalities, or the closely related class of partially identified models in which the identified set is a function of such a set. The latter class of models arises most commonly when interest focuses on a subvector of a vector-valued parameter whose values are limited by a finite number of moment inequalities. We then rapidly review some important parts of the broader literature on inference in partially identified models and conclude by providing some thoughts on fruitful directions for future research.
Keywords: | Partially Identified Model, Confidence Regions, Uniform Asymptotic Validity, Moment Inequalities, Subvector Inference |
Date: | 2016–01–29 |
URL: | http://d.repec.org/n?u=RePEc:ifs:cemmap:05/16&r=ecm |
By: | Susanne M. Schennach (Institute for Fiscal Studies) |
Abstract: | The traditional approach to obtaining valid confidence intervals for nonparametric quantities is to select a smoothing parameter such that the bias of the estimator is negligible relative to its standard deviation. While this approach is apparently simple, it has two drawbacks: First, the question of optimal bandwidth selection is no longer well-defined, as it is not clear what ratio of bias to standard deviation should be considered negligible. Second, since the bandwidth choice necessarily deviates from the optimal (mean squared error minimizing) bandwidth, such a confidence interval is very inefficient. To address these issues, we construct valid confidence intervals that account for the presence of a nonnegligible bias and thus make it possible to perform inference with optimal mean squared error minimizing bandwidths. The key difficulty in achieving this involves finding a strict, yet feasible, bound on the bias of a nonparametric estimator. It is well known that it is not possible to consistently estimate the point-wise bias of an optimal nonparametric estimator (for otherwise, one could subtract it and obtain a faster convergence rate, violating Stone's bounds on optimal convergence rates). Nevertheless, we find that, under minimal primitive assumptions, it is possible to consistently estimate an upper bound on the magnitude of the bias, which is sufficient to deliver a valid confidence interval whose length decreases at the optimal rate and which does not contradict Stone's results. (A schematic version of the resulting interval follows this entry.)
Date: | 2015–11–25 |
URL: | http://d.repec.org/n?u=RePEc:ifs:cemmap:71/15&r=ecm |
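Schematically, and in notation that is illustrative rather than the paper's own, the resulting bias-aware interval widens a conventional interval by an estimated bound on the bias:

```latex
% Schematic bias-aware confidence interval (illustrative notation): \hat\theta_h
% is the estimator at the MSE-optimal bandwidth h, \hat\sigma_h its standard
% error, and \widehat{B}_h a consistent upper bound on the magnitude of its bias.
\[
  \mathrm{CI}_{1-\alpha}
  \;=\;
  \Bigl[\, \hat\theta_h - \widehat{B}_h - z_{1-\alpha/2}\,\hat\sigma_h ,\;
           \hat\theta_h + \widehat{B}_h + z_{1-\alpha/2}\,\hat\sigma_h \,\Bigr].
\]
% Coverage is at least 1-\alpha asymptotically whenever
% |\mathrm{Bias}(\hat\theta_h)| \le \widehat{B}_h with probability approaching
% one, and the interval's length still shrinks at the optimal rate.
```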
By: | Christian M. Hafner (Institute for Fiscal Studies); Oliver Linton (Institute for Fiscal Studies and University of Cambridge); Haihan Tang (Institute for Fiscal Studies) |
Abstract: | We consider a Kronecker product structure for large covariance matrices, which has the feature that the number of free parameters increases logarithmically with the dimension of the matrix. We propose an estimation method for the free parameters based on the log-linear property of this structure, as well as a quasi-likelihood method. We establish the rate of convergence of the estimated parameters when the size of the matrix diverges, and we also establish a CLT for our method. We apply the method to portfolio choice for S&P 500 daily returns and compare it with sample-covariance-based methods and with the recent Fan et al. (2013) method. (A small numerical illustration of the parameter count follows this entry.)
Keywords: | Correlation Matrix; Kronecker Product; MTMM; Portfolio Choice |
Date: | 2016–05–17 |
URL: | http://d.repec.org/n?u=RePEc:ifs:cemmap:23/16&r=ecm |
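The parsimony of the structure is easy to see numerically. The sketch below is an illustration only (not the authors' estimator): it builds a covariance matrix as a Kronecker product of k small positive definite blocks, so that the free-parameter count grows roughly linearly in k, i.e. logarithmically in the matrix dimension n = 2^k, versus n(n+1)/2 for an unrestricted covariance.

```python
# Numerical illustration (not the authors' estimation method): a Kronecker
# product of k 2x2 SPD blocks gives an n x n covariance with n = 2^k and only
# about 3k free parameters, versus n(n+1)/2 unrestricted.
import numpy as np
from functools import reduce

rng = np.random.default_rng(4)

def random_spd2():
    a = rng.normal(size=(2, 2))
    return a @ a.T + 0.5 * np.eye(2)        # symmetric positive definite 2x2 block

k = 6
blocks = [random_spd2() for _ in range(k)]
sigma = reduce(np.kron, blocks)             # (2^k) x (2^k) covariance matrix
n = sigma.shape[0]
print("dimension n            :", n)
print("unrestricted parameters:", n * (n + 1) // 2)
print("Kronecker parameters   :", 3 * k, "(up to scale normalizations)")
print("positive definite      :", bool(np.linalg.eigvalsh(sigma).min() > 0))
```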
By: | Shelton Peiris (University of Sydney, Australia); Manabu Asai (Soka University, Japan); Michael McAleer (National Tsing Hua University, Taiwan; Erasmus University Rotterdam, the Netherlands; Complutense University of Madrid, Spain) |
Abstract: | In recent years fractionally differenced processes have received a great deal of attention due to their flexibility in financial applications with long memory. This paper considers a class of models generated by Gegenbauer polynomials, incorporating long memory in the stochastic volatility (SV) component in order to develop the General Long Memory SV (GLMSV) model. We examine the statistical properties of the new model, suggest spectral likelihood estimation for long memory processes, and investigate the finite sample properties via Monte Carlo experiments. We apply the model to three exchange rate return series. Overall, the results of the out-of-sample forecasts show the adequacy of the new GLMSV model. (An illustrative sketch of the Gegenbauer long-memory filter follows this entry.)
Keywords: | Stochastic volatility; GARCH models; Gegenbauer Polynomial; Long Memory; Spectral Likelihood; Estimation; Forecasting |
JEL: | C18 C21 C58 |
Date: | 2016–06–06 |
URL: | http://d.repec.org/n?u=RePEc:tin:wpaper:20160044&r=ecm |
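The long-memory building block of Gegenbauer-type models can be sketched directly. The code below is an illustration under stated assumptions rather than the paper's GLMSV estimator (which uses the spectral likelihood): it computes the coefficients of the expansion (1 - 2uB + B^2)^(-d) = sum_j psi_j B^j, where psi_j = C_j^(d)(u) are Gegenbauer polynomials obtained from their three-term recursion, and uses them to generate a truncated long-memory component.

```python
# Minimal sketch of the Gegenbauer long-memory filter used by GARMA/GLMSV-type
# models; the paper's spectral-likelihood estimation is not reproduced here.
import numpy as np

def gegenbauer_coefs(d, u, n):
    """First n coefficients of (1 - 2uB + B^2)^(-d) (|u| <= 1, 0 < d < 0.5)."""
    psi = np.zeros(n)
    psi[0] = 1.0
    if n > 1:
        psi[1] = 2.0 * d * u
    for j in range(2, n):
        psi[j] = (2.0 * u * (j + d - 1.0) * psi[j - 1]
                  - (j + 2.0 * d - 2.0) * psi[j - 2]) / j
    return psi

# Long-memory component of a volatility process: h_t = sum_j psi_j * eta_{t-j}.
rng = np.random.default_rng(5)
psi = gegenbauer_coefs(d=0.3, u=np.cos(np.pi / 6), n=2000)
eta = rng.normal(size=6000)
h = np.convolve(eta, psi, mode="valid")     # truncated MA(inf) representation
print(psi[:5].round(4))                      # slowly decaying filter weights
```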
By: | Alexander N. Bogin (Federal Housing Finance Agency); Stephen D. Bruestle (Penn State Erie); William M. Doerner (Federal Housing Finance Agency) |
Abstract: | We develop a theoretically based statistical technique to identify a conservative lower bound for house prices. Leveraging a model based upon consumer and investor incentives, we are able to explain the depth of housing market downturns at both the national and state level over a variety of market environments. This approach performs well in several historical back-tests and has strong out-of-sample predictive ability. When back-tested, our estimation approach does not understate house price declines in any state over the 1987 to 2001 housing cycle and only understates declines in three states during the most recent financial crisis. This latter result is particularly noteworthy given that the post-2001 estimates are performed out-of-sample. Our measure of a conservative lower bound is attractive because it (1) provides a leading indicator of the severity of future downturns and (2) allows trough estimates to adjust dynamically as market conditions change. This estimation technique could prove particularly helpful in measuring the credit risk associated with portfolios of mortgage assets as part of evaluating static stress tests or designing dynamic stress tests.
Keywords: | house prices, trough, lower bound, trend, financial stress testing |
JEL: | G21 C58 R31 |
Date: | 2015–05 |
URL: | http://d.repec.org/n?u=RePEc:hfa:wpaper:15-01&r=ecm |
By: | Monika Hadas-Dyduch (University of Economics in Katowice) |
Abstract: | The aim of this article is to predict the GDP of Poland and other selected European countries. For this purpose, econometric methods and wavelet analysis are integrated into a single algorithm. Econometric methods and the wavelet transform are combined with the goal of constructing an original model for predicting macroeconomic indicators. For estimating macroeconomic indicators, taking GDP as an example, the article proposes an original algorithm that combines the following methods: the creeping trend method, exponential smoothing, and multiresolution analysis. The econometric methods used, namely the creeping trend and exponential smoothing, have been modified in several major stages. The aim of merging these methods is to construct an algorithm for short-term time series prediction. The algorithm employs continuous, compactly supported wavelets, specifically Daubechies wavelets. Daubechies wavelets are a family of orthogonal wavelets characterized by a maximal number of vanishing moments for a given support. Each wavelet of this class has an associated scaling function that generates an orthogonal multiresolution analysis. (A hedged sketch of the general approach follows this entry.)
Keywords: | prediction, wavelets, wavelet transform |
JEL: | F37 C13 G15 |
Date: | 2016–06 |
URL: | http://d.repec.org/n?u=RePEc:pes:wpaper:2016:no30&r=ecm |
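A hedged sketch of the general idea (not the author's exact algorithm, whose modified creeping-trend and smoothing steps are not specified in the abstract): decompose a short series into Daubechies multiresolution components with PyWavelets, extrapolate each component with simple exponential smoothing, and add the component forecasts back together. The wavelet ('db4'), the smoothing constant, and the toy series are placeholders.

```python
# Hedged sketch: Daubechies multiresolution decomposition plus simple
# exponential smoothing of each component; placeholders, not the paper's model.
import numpy as np
import pywt

def mra_components(y, wavelet="db4", level=2):
    """Time-domain multiresolution components that sum back to y."""
    coeffs = pywt.wavedec(y, wavelet, level=level, mode="periodization")
    comps = []
    for i in range(len(coeffs)):
        kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        comps.append(pywt.waverec(kept, wavelet, mode="periodization")[: len(y)])
    return comps                              # [smooth, detail_level, ..., detail_1]

def ses_forecast(x, alpha=0.4):
    """One-step-ahead simple exponential smoothing forecast."""
    level = x[0]
    for v in x[1:]:
        level = alpha * v + (1 - alpha) * level
    return level

rng = np.random.default_rng(6)
gdp_growth = 2.0 + 0.5 * np.sin(np.arange(32) / 4.0) + rng.normal(0, 0.2, 32)  # toy series
components = mra_components(gdp_growth)
forecast = sum(ses_forecast(c) for c in components)   # recombine component forecasts
print(round(forecast, 3))
```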
By: | YAMAMOTO, Yohei |
Abstract: | In this paper, we consider residual-based bootstrap methods à la Gonçalves and Perron (2014) to construct the confidence interval for structural impulse response functions in factor-augmented vector autoregressions. In particular, we compare the bootstrap with factor estimation (Procedure A) with the bootstrap without factor estimation (Procedure B). In theory, both procedures are asymptotically valid under the condition √T/N → 0, where N and T are the cross-sectional dimension and the time dimension, respectively. Even when the condition √T/N → 0 does not hold, Procedure A still accounts for the effect of the factor estimation errors on the impulse response function estimate and achieves good coverage rates in most cases. On the contrary, Procedure B is invalid in such cases and tends to undercover if N is much smaller than T. However, Procedure B is implemented more straightforwardly from standard structural VARs, and the length of its confidence interval is shorter than that of Procedure A in finite samples. Given that Procedure B still gives a satisfactory coverage rate unless N is very small, it remains worth considering for empirical use, although Procedure A is safer as it correctly accounts for the effect of the factor estimation errors.
Keywords: | factor-augmented vector autoregression, structural identification, coverage rate, impulse response function
JEL: | C14 C22 |
Date: | 2016–05–28 |
URL: | http://d.repec.org/n?u=RePEc:hit:hiasdp:hias-e-26&r=ecm |