on Econometrics |
By: | Michael Vogt; Oliver Linton (Institute for Fiscal Studies and Cambridge University) |
Abstract: | In this paper, we study a nonparametric regression model including a periodic component, a smooth trend function, and a stochastic error term. We propose a procedure to estimate the unknown period and the function values of the periodic component as well as the nonparametric trend function. The theoretical part of the paper establishes the asymptotic properties of our estimators. In particular, we show that our estimator of the period is consistent. In addition, we derive the convergence rates as well as the limiting distributions of our estimators of the periodic component and the trend function. The asymptotic results are complemented with a simulation study that investigates the small sample behaviour of our procedure. Finally, we illustrate our method by applying it to a series of global temperature anomalies. |
Keywords: | Nonparametric estimation; penalized least squares; periodic sequence; temperature anomaly data. |
Date: | 2012–09 |
URL: | http://d.repec.org/n?u=RePEc:ifs:cemmap:23/12&r=ecm |
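The period-estimation step can be illustrated with a toy sketch: ignoring the trend component, grid-search the candidate period and penalize the residual sum of squares, since the RSS alone is non-increasing in the number of fitted phase means. This is an illustrative simplification, not the authors' penalized least-squares procedure; the penalty constant `lam` and the simulated series are hypothetical.

```python
import numpy as np

def estimate_period(y, max_period, lam=1.0):
    """Grid-search estimate of the period of a noisy periodic sequence.

    For each candidate period p, fit the phase-wise means and penalize
    the residual sum of squares by lam * p; the penalty is needed
    because RSS alone cannot decrease as p (and hence the number of
    free parameters) grows.
    """
    n = len(y)
    best_p, best_crit = 1, np.inf
    for p in range(1, max_period + 1):
        phases = np.arange(n) % p
        rss = 0.0
        for k in range(p):
            block = y[phases == k]
            rss += np.sum((block - block.mean()) ** 2)
        crit = rss + lam * p
        if crit < best_crit:
            best_p, best_crit = p, crit
    return best_p

rng = np.random.default_rng(0)
s = np.array([0.0, 2.0, -1.0, 1.0, 3.0])          # periodic component, true period 5
y = np.tile(s, 40) + 0.3 * rng.standard_normal(200)
p_hat = estimate_period(y, max_period=20)
```

Without the penalty, any multiple of the true period fits at least as well; the `lam * p` term breaks those ties in favor of the smallest period.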
By: | Ioannis Kasparis; Elena Andreou; Peter C. B. Phillips |
Abstract: | A unifying framework for inference is developed in predictive regressions where the predictor has unknown integration properties and may be stationary or nonstationary. Two easily implemented nonparametric F-tests are proposed. The test statistics are related to those of Kasparis and Phillips (2012) and are obtained by kernel regression. The limit distribution of these predictive tests holds for a wide range of predictors including stationary as well as nonstationary fractional and near unit root processes. In this sense the proposed tests provide a unifying framework for predictive inference, allowing for possibly nonlinear relationships of unknown form, and offering robustness to integration order and functional form. Under the null of no predictability the limit distributions of the tests involve functionals of independent χ² variates. The tests are consistent and divergence rates are faster when the predictor is stationary. Asymptotic theory and simulations show that the proposed tests are more powerful than existing parametric predictability tests when deviations from unity are large or the predictive regression is nonlinear. Some empirical illustrations to monthly S&P 500 stock returns data are provided. |
Keywords: | Functional regression, Nonparametric predictability test, Nonparametric regression, Stock returns, Predictive regression |
Date: | 2012–09 |
URL: | http://d.repec.org/n?u=RePEc:ucy:cypeua:14-2012&r=ecm |
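The tests above are obtained by kernel regression; as background, a minimal Nadaraya-Watson estimator (not the authors' F-statistics) can be sketched as follows, with a Gaussian kernel and a hypothetical bandwidth:

```python
import numpy as np

def nw_regression(x, y, grid, h):
    """Nadaraya-Watson kernel regression with a Gaussian kernel.

    Estimates E[y | x = g] for each g in `grid` as a locally
    weighted average of y with bandwidth h.
    """
    g = np.asarray(grid, dtype=float)[:, None]   # shape (m, 1)
    w = np.exp(-0.5 * ((g - x) / h) ** 2)        # kernel weights, shape (m, n)
    return (w @ y) / w.sum(axis=1)

rng = np.random.default_rng(1)
x = rng.uniform(-2, 2, 500)
y = np.sin(x) + 0.2 * rng.standard_normal(500)
fit = nw_regression(x, y, grid=[0.0, 1.0], h=0.25)
```

The bandwidth `h` governs the bias-variance trade-off; in practice it would be chosen by cross-validation rather than fixed as here.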
By: | Bertille Antoine (Simon Fraser University); Eric Renault (Brown University) |
Abstract: | We consider models defined by a set of moment restrictions that may be subject to weak identification. Following the recent literature, the identification of the structural parameters is characterized by the Jacobian of the moment conditions. We unify several definitions of identification that have been used in the literature, and show how they are linked to the consistency and asymptotic normality of GMM estimators. We then develop two tests to assess the identification strength of the structural parameters. Both tests are straightforward to apply. In simulations, our tests are well-behaved when compared to contenders, both in terms of size and power. |
Keywords: | GMM, Weak IV, Test, Misspecification |
JEL: | C32 C12 C13 C51 |
Date: | 2012–09 |
URL: | http://d.repec.org/n?u=RePEc:sfu:sfudps:dp12-17&r=ecm |
By: | Liu, Chu-An |
Abstract: | This paper proposes a new model averaging estimator for the linear regression model with heteroskedastic errors. We address the issues of how to optimally assign the weights for candidate models and how to make inference based on the averaging estimator. We derive the asymptotic mean squared error (AMSE) of the averaging estimator in a local asymptotic framework, and then choose the optimal weights by minimizing the AMSE. We propose a plug-in estimator of the optimal weights and use these estimated weights to construct a plug-in averaging estimator of the parameter of interest. We derive the asymptotic distribution of the plug-in averaging estimator and suggest a plug-in method to construct confidence intervals. Monte Carlo simulations show that the plug-in averaging estimator has much smaller expected squared error, maximum risk, and maximum regret than other existing model selection and model averaging methods. As an empirical illustration, the proposed methodology is applied to cross-country growth regressions. |
Keywords: | Local asymptotic theory; Model averaging; Model selection; Plug-in estimators |
JEL: | C51 C52 |
Date: | 2012–08–10 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:41414&r=ecm |
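The plug-in AMSE-minimizing weights are specific to the paper's local asymptotic framework; as a simpler, related illustration, the sketch below averages two nested OLS fits with a weight chosen by a Mallows-type criterion in the spirit of Hansen (2007). This is not the paper's plug-in estimator, and the data are hypothetical.

```python
import numpy as np

def mallows_weight(y, X_small, X_full):
    """Average two nested OLS fits with a Mallows-type weight.

    Minimizes over a grid of w in [0, 1]:
        C(w) = ||y - w*fit_full - (1-w)*fit_small||^2
               + 2 * sigma2_hat * (w*k_full + (1-w)*k_small),
    where sigma2_hat is the error variance estimate from the full model.
    """
    def ols_fit(X):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return X @ beta

    f_small, f_full = ols_fit(X_small), ols_fit(X_full)
    n, k_small, k_full = len(y), X_small.shape[1], X_full.shape[1]
    sigma2 = np.sum((y - f_full) ** 2) / (n - k_full)
    grid = np.linspace(0.0, 1.0, 101)
    crits = [np.sum((y - w * f_full - (1 - w) * f_small) ** 2)
             + 2 * sigma2 * (w * k_full + (1 - w) * k_small) for w in grid]
    w = grid[int(np.argmin(crits))]
    return w, w * f_full + (1 - w) * f_small

rng = np.random.default_rng(3)
x = rng.standard_normal(200)
y = 1.0 + 2.0 * x + rng.standard_normal(200)
X_small = np.ones((200, 1))                      # intercept-only candidate
X_full = np.column_stack([np.ones(200), x])      # full candidate
w, fit = mallows_weight(y, X_small, X_full)
```

When the extra regressor carries a strong signal, as here, the criterion pushes the weight toward the full model; with a weak signal it shrinks toward the restricted fit.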
By: | Startz, Richard |
Abstract: | Use of heteroskedasticity-robust standard errors has become common in frequentist regressions. I offer here a Bayesian analog. The Bayesian version is derived by first focusing on the likelihood function for the sample values of the identifying moment conditions of least squares and then formulating a convenient prior for the variances of the error terms. The first step introduces a sandwich estimator into the posterior calculations, while the second step allows the investigator to set the sandwich for either heteroskedastic or homoskedastic error variances. If desired, the Bayesian estimator can be made to look very similar to the usual heteroskedasticity-robust frequentist estimator. Bayesian estimation is easily accomplished by a standard MCMC procedure. |
Keywords: | Econometrics and Quantitative Economics, robust standard errors, bayesian |
Date: | 2012–08–15 |
URL: | http://d.repec.org/n?u=RePEc:cdl:ucsbec:qt69c4x8m9&r=ecm |
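The frequentist estimator the paper takes as its point of departure is the White sandwich; a minimal HC0 sketch (simulated heteroskedastic data, for illustration only):

```python
import numpy as np

def ols_with_robust_se(X, y):
    """OLS with White (HC0) heteroskedasticity-robust standard errors.

    The sandwich covariance is (X'X)^{-1} X' diag(e^2) X (X'X)^{-1},
    where e are the OLS residuals.
    """
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    e = y - X @ beta
    meat = X.T @ (X * e[:, None] ** 2)           # X' diag(e^2) X
    cov = XtX_inv @ meat @ XtX_inv
    return beta, np.sqrt(np.diag(cov))

rng = np.random.default_rng(4)
x = rng.standard_normal(500)
X = np.column_stack([np.ones(500), x])
y = 1.0 + 2.0 * x + (0.5 + 0.5 * np.abs(x)) * rng.standard_normal(500)
beta, se = ols_with_robust_se(X, y)
```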
By: | Michael Vogt |
Abstract: | In this paper, we study nonparametric models allowing for locally stationary regressors and a regression function that changes smoothly over time. These models are a natural extension of time series models with time-varying coefficients. We introduce a kernel-based method to estimate the time-varying regression function and provide asymptotic theory for our estimates. Moreover, we show that the main conditions of the theory are satisfied for a large class of nonlinear autoregressive processes with a time-varying regression function. Finally, we examine structured models where the regression function splits up into time-varying additive components. As will be seen, estimation in these models does not suffer from the curse of dimensionality. We complement the technical analysis of the paper by an application to financial data. |
Keywords: | local stationarity, nonparametric regression, smooth backfitting |
Date: | 2012–09 |
URL: | http://d.repec.org/n?u=RePEc:ifs:cemmap:22/12&r=ecm |
By: | Arnold Zellner (University of Chicago); Tomohiro Ando (Keio University); Nalan Basturk (Erasmus University Rotterdam); Lennart Hoogerheide (VU University Amsterdam); Herman K. van Dijk (Erasmus University Rotterdam, and VU University Amsterdam) |
Abstract: | We discuss Bayesian inferential procedures within the family of instrumental variables regression models and focus on two issues: existence conditions for posterior moments of the parameters of interest under a flat prior and the potential of Direct Monte Carlo (DMC) approaches for efficient evaluation of such possibly highly non-elliptical posteriors. We show that, for the general case of m endogenous variables under a flat prior, posterior moments of order r exist for the coefficients reflecting the endogenous regressors' effect on the dependent variable, if the number of instruments is greater than m+r, even though there is an issue of local non-identification that causes non-elliptical shapes of the posterior. This stresses the need for efficient Monte Carlo integration methods. We introduce an extension of DMC that incorporates an acceptance-rejection sampling step within DMC. This Acceptance-Rejection within Direct Monte Carlo (ARDMC) method has the attractive property that the generated random drawings are independent, which greatly helps the fast convergence of simulation results, and which facilitates the evaluation of the numerical accuracy. We note that ARDMC is an analogue to the well-known 'Metropolis-Hastings within Gibbs' sampling. We compare the ARDMC approach with the Gibbs sampler using simulated data and two empirical data sets, involving the settler mortality instrument of Acemoglu et al. (2001) and the father's education instrument used by Hoogerheide et al. (2012a). An efficiency gain is observed both under strong and weak instruments. |
Keywords: | Instrumental variables; Bayesian inference; Direct Monte Carlo; Acceptance-Rejection; numerical standard errors |
JEL: | C11 C15 C26 C36 |
Date: | 2012–09–17 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:20120095&r=ecm |
By: | H. Peter Boswijk (University of Amsterdam); Michael Jansson (UC Berkeley and CREATES); Morten Ørregaard Nielsen (Queen's University and CREATES) |
Abstract: | We suggest improved tests for cointegration rank in the vector autoregressive (VAR) model and develop asymptotic distribution theory and local power results. The tests are (quasi-)likelihood ratio tests based on a Gaussian likelihood, but of course the asymptotic results apply more generally. The power gains relative to existing tests are due to two factors. First, instead of basing our tests on the conditional (with respect to the initial observations) likelihood, we follow the recent unit root literature and base our tests on the full likelihood as in, e.g., Elliott, Rothenberg, and Stock (1996). Secondly, our tests incorporate a 'sign' restriction which generalizes the one-sided unit root test. We show that the asymptotic local power of the proposed tests dominates that of existing cointegration rank tests. |
Keywords: | Cointegration rank, efficiency, likelihood ratio test, vector autoregression |
JEL: | C12 C32 |
Date: | 2012–09–19 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2012-39&r=ecm |
By: | Harding, Matthew (Stanford University); Lamarche, Carlos (University of Kentucky) |
Abstract: | This paper proposes a quantile regression estimator for a panel data model with interactive effects potentially correlated with the independent variables. We provide conditions under which the slope parameter estimator is asymptotically Gaussian. Monte Carlo studies are carried out to investigate the finite sample performance of the proposed method in comparison with other candidate methods. We discuss an approach to testing the model specification against a competing fixed effects specification. The paper presents an empirical application of the method to study the effect of class size and class composition on educational attainment. The findings show that (i) a change in the gender composition of a class impacts differently low- and high-performing students; (ii) while smaller classes are beneficial for low performers, larger classes are beneficial for high performers; (iii) reductions in class size do not seem to impact mean and median student performance; (iv) the fixed effects specification is rejected in favor of the interactive effects specification. |
Keywords: | quantile regression, panel data, interactive effects, instrumental variables, class size, educational attainment |
JEL: | C23 C33 I21 I28 |
Date: | 2012–08 |
URL: | http://d.repec.org/n?u=RePEc:iza:izadps:dp6802&r=ecm |
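The paper's estimator handles panel interactive effects; as background, plain cross-sectional quantile regression can be sketched via its linear-programming form. The data below are hypothetical and the sketch omits everything panel-specific.

```python
import numpy as np
from scipy.optimize import linprog

def quantile_regression(X, y, tau):
    """Quantile regression via its linear-programming formulation.

    min  tau * 1'u + (1 - tau) * 1'v
    s.t. X b + u - v = y,  u >= 0, v >= 0,  b free,
    where u and v are the positive and negative parts of the residuals.
    """
    n, k = X.shape
    c = np.concatenate([np.zeros(k), tau * np.ones(n), (1 - tau) * np.ones(n)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    bounds = [(None, None)] * k + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:k]

rng = np.random.default_rng(5)
x = rng.standard_normal(200)
X = np.column_stack([np.ones(200), x])
y = 1.0 + 2.0 * x + rng.standard_normal(200)
beta_med = quantile_regression(X, y, tau=0.5)    # median regression
```

Setting `tau` to other values (e.g. 0.1 or 0.9) traces out the conditional quantile function, which is what lets the class-size effects above differ between low- and high-performing students.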
By: | Mika Meitz (Department of Economics, Koç University); Pentti Saikkonen (Department of Mathematics and Statistics, University of Helsinki) |
Abstract: | We consider maximum likelihood estimation of a particular noninvertible ARMA model with autoregressive conditionally heteroskedastic (ARCH) errors. The model can be seen as an extension to so-called all-pass models in that it allows for autocorrelation and for more flexible forms of conditional heteroskedasticity. These features may be attractive especially in economic and financial applications. Unlike in previous literature on maximum likelihood estimation of noncausal and/or noninvertible ARMA models and all-pass models, our estimation theory does allow for Gaussian innovations. We give conditions under which a strongly consistent and asymptotically normally distributed solution to the likelihood equations exists, and we also provide a consistent estimator of the limiting covariance matrix. |
Keywords: | Maximum likelihood estimation, autoregressive moving average, ARMA, autoregressive conditional heteroskedasticity, ARCH, noninvertible, noncausal, all-pass, nonminimum phase. |
JEL: | C22 C51 |
Date: | 2012–09 |
URL: | http://d.repec.org/n?u=RePEc:koc:wpaper:1226&r=ecm |
By: | Francq, Christian; Wintenberger, Olivier; Zakoian, Jean-Michel |
Abstract: | This paper studies the probabilistic properties and the estimation of the asymmetric log-GARCH(p, q) model. In this model, the log-volatility is written as a linear function of past values of the log-squared observations, with coefficients depending on the sign of the observations, and past log-volatility values. Conditions are obtained for the existence of solutions and finiteness of their log-moments. We also study the tail properties of the solution. Under mild assumptions, we show that the quasi-maximum likelihood estimation of the parameters is strongly consistent and asymptotically normal. Simulations illustrating the theoretical results and an application to real financial data are proposed. |
Keywords: | log-GARCH; Quasi-Maximum Likelihood; Strict stationarity; Tail index |
JEL: | C13 C22 |
Date: | 2012–09–16 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:41373&r=ecm |
By: | Oliver Linton (Institute for Fiscal Studies and Cambridge University); Yoon-Jae Whang (Institute for Fiscal Studies and Seoul National University); Yu-Min Yen |
Abstract: | The so-called leverage hypothesis is that negative shocks to prices/returns affect volatility more than equal positive shocks. Whether this is attributable to changing financial leverage is still subject to dispute but the terminology is in wide use. There are many tests of the leverage hypothesis using discrete time data. These typically involve fitting of a general parametric or semiparametric model to conditional volatility and then testing the implied restrictions on parameters or curves. We propose an alternative way of testing this hypothesis, using realised volatility as a direct nonparametric measure. Our null hypothesis is of conditional distributional dominance and so is much stronger than the usual hypotheses considered previously. We implement our test on a number of stock return datasets using intraday data over a long span. We find powerful evidence in favour of our hypothesis. |
Keywords: | Distribution Function, Leverage Effect, Gaussian Process |
JEL: | C14 C15 |
Date: | 2012–09 |
URL: | http://d.repec.org/n?u=RePEc:ifs:cemmap:24/12&r=ecm |
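The realised-volatility measure used as the test input is, in its simplest form, the sum of squared intraday log returns over a day; a minimal sketch, ignoring microstructure-noise corrections:

```python
import numpy as np

def realized_volatility(intraday_prices):
    """Daily realised variance: the sum of squared intraday log returns."""
    r = np.diff(np.log(np.asarray(intraday_prices, dtype=float)))
    return np.sum(r ** 2)
```

For a constant price path the measure is exactly zero; any intraday movement makes it strictly positive, and under standard conditions it converges to the day's integrated variance as the sampling frequency grows.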
By: | Cogley, Timothy; Startz, Richard |
Abstract: | We set out a Gibbs sampler for the linear instrumental-variable model with normal errors and normal priors, and we show how to compute the marginal likelihood. |
Keywords: | Econometrics and Quantitative Economics, instrumental variables, bayesian, gibbs sampling |
Date: | 2012–09–01 |
URL: | http://d.repec.org/n?u=RePEc:cdl:ucsbec:qt40v0x246&r=ecm |
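The paper's sampler targets the IV posterior; the underlying Gibbs idea of cycling through exact conditional draws can be illustrated on a standard bivariate normal (a toy target, not the IV model):

```python
import numpy as np

def gibbs_bivariate_normal(rho, n_draws, burn=500, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each coordinate is drawn from its exact conditional distribution:
    x1 | x2 ~ N(rho * x2, 1 - rho^2), and symmetrically for x2 | x1.
    """
    rng = np.random.default_rng(seed)
    s = np.sqrt(1.0 - rho ** 2)
    x1 = x2 = 0.0
    draws = np.empty((n_draws, 2))
    for i in range(n_draws + burn):
        x1 = rho * x2 + s * rng.standard_normal()
        x2 = rho * x1 + s * rng.standard_normal()
        if i >= burn:
            draws[i - burn] = (x1, x2)
    return draws

draws = gibbs_bivariate_normal(rho=0.8, n_draws=5000)
```

Unlike the independent draws of Direct Monte Carlo, Gibbs output is a dependent chain, which is why burn-in and autocorrelation diagnostics matter when assessing numerical accuracy.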
By: | Rulon D. Pope; Jeffrey LaFrance |
Abstract: | Economists who estimate demand or supply systems are often faced with the issue of whether to estimate with shares or quantities as the dependent variables. This paper reviews the implications of making the wrong choice in the context of normalized profit functions of competitive behavior. The implication is that inconsistent estimates are obtained if one makes the wrong choice. A robust structure is proposed which nests these forms (shares or quantities) in a general system and allows the data to suggest which form is preferable. An application to the U.S. agricultural sector follows. Our results suggest that shares and quantities are rejected in favor of an alternative functional form. |
Keywords: | Netputs, robust estimation |
JEL: | Q11 C51 |
Date: | 2012–09 |
URL: | http://d.repec.org/n?u=RePEc:mos:moswps:2012-17&r=ecm |
By: | Behaghel, Luc (Paris School of Economics); Crépon, Bruno (CREST); Gurgand, Marc (Paris School of Economics); Le Barbanchon, Thomas (CREST) |
Abstract: | We propose a novel selectivity correction procedure to deal with survey attrition, at the crossroads of the "Heckit" and the bounding approach of Lee (2009). As a substitute for the instrument needed in sample selectivity correction models, we use information on the number of attempts that were made to obtain response to the survey from each individual who responded. We obtain set identification, but if the number of attempts to reach each individual is high enough, we can come closer to point identification. We apply our sample selection correction in the context of a job-search experiment with low and unbalanced response rates. |
Keywords: | survey non response, sample selectivity, treatment effect model, randomized controlled trial |
JEL: | C31 C93 J6 |
Date: | 2012–07 |
URL: | http://d.repec.org/n?u=RePEc:iza:izadps:dp6751&r=ecm |
By: | Dominique, C-Rene; Rivera-Solis, Luis Eduardo |
Abstract: | The capital market is a reflexive dynamical input/output construct whose output (time series) is usually assessed by an index of roughness known as Hurst’s exponent (H). Oddly enough, H has no theoretical foundation, but recently it has been found experimentally to vary from persistence (H > 1/2) or long-term dependence to anti-persistence (H < 1/2) or short-term dependence. This paper uses the thrown-offs of quadratic maps (modeled asymptotically) and singularity spectra of fractal sets to characterize H, the alternateness of dependence, and market crashes while proposing a simpler method of computing the correlation dimension than the Grassberger-Procaccia procedure. |
Keywords: | Hurst Exponent; anti-persistence; fractal attractors; SDIC; chaos; inherent noise; market crashes; Renyi’s generalized fractal dimensions |
JEL: | G1 C6 A1 G01 |
Date: | 2012–03–01 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:41408&r=ecm |
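H is commonly estimated by rescaled-range (R/S) analysis, a simpler route than the singularity-spectrum characterization pursued above; a minimal R/S sketch follows, with illustrative window sizes and simulated white noise (for which H should be near 1/2, up to the well-known upward small-sample bias of uncorrected R/S):

```python
import numpy as np

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
    """Rescaled-range (R/S) estimate of the Hurst exponent H.

    For each window size n, average R/S over non-overlapping windows,
    then regress log(R/S) on log(n); the slope estimates H.
    """
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            dev = np.cumsum(w - w.mean())        # cumulative deviations
            r = dev.max() - dev.min()            # range
            s = w.std()                          # scale
            if s > 0:
                rs_vals.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_vals)))
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope

rng = np.random.default_rng(2)
h = hurst_rs(rng.standard_normal(4096))
```

H > 1/2 then signals persistence (long-term dependence) and H < 1/2 anti-persistence, matching the classification used in the abstract.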
By: | Christopher F. Parmeter (Department of Economics, University of Miami); Jaren C. Pope (Department of Economics, Brigham Young University) |
Abstract: | There has recently been a dramatic increase in the number of papers that have combined quasi-experimental methods with hedonic property models. This is largely due to the concern that cross-sectional hedonic methods may be severely biased by omitted variables. While the empirical literature has developed extensively, there has not been a consistent treatment of the theory and methods of combining hedonic property models with quasi-experiments. The purpose of this chapter is to fill this void. An effort is made to provide background information on the traditional hedonic theory and the traditional cross-sectional hedonic methods, as well as the newer quasi-experimental hedonic methods that use program evaluation techniques. By connecting these two literatures, the underlying theoretical and empirical assumptions necessary to estimate the marginal willingness to pay for a housing characteristic are highlighted. The chapter also provides a practical how-to guide on implementing a quasi-experimental hedonic analysis. This is done by focusing on a series of steps that can help to ensure the reliability of a quasi-experimental identification strategy. We illustrate this process using several recent papers from the literature. |
Keywords: | Regression Discontinuity, Differences-in-Differences, Property Value, Program Evaluation, Marginal Willingness to Pay, Capitalization |
JEL: | C9 D6 Q5 R0 |
Date: | 2012–02–09 |
URL: | http://d.repec.org/n?u=RePEc:mia:wpaper:2012-7&r=ecm |
By: | Mohsen Sadatsafavi; Carlo Marra; Lawrence McCandless; Stirling Bryan |
Abstract: | Cost-effectiveness analyses (CEAs) that use patient-specific data on costs and health outcomes from a randomized controlled trial (RCT) are popular, yet such CEAs are often criticized because they neglect to incorporate evidence external to the trial. Although evidence directly defined on cost and health outcomes is often not transferable across jurisdictions, evidence on biologic aspects of treatments such as the treatment effect can be transferred, and incorporating such evidence in the CEA can conceivably affect the results. Fully parametric Bayesian evidence synthesis for RCT-based CEAs is possible, but there are challenges involved in parametric modeling of cost and health outcomes and their relation with external evidence. A popular method for quantifying uncertainty in a RCT-based CEA is the bootstrap. It would be attractive to extend this method to incorporate external evidence. To this end, we utilize the Bayesian interpretation of the bootstrap and derive the distribution for the cost and effectiveness outcomes after observing the current RCT data and the external evidence. We propose simple modifications of the bootstrap for sampling from such posterior distributions. We use data from a clinical trial and incorporate external evidence on the effect size of treatments to illustrate the method in action. Compared to the parametric models of evidence synthesis, the proposed approach requires fewer distributional assumptions, does not require explicit modeling of the relation between external evidence and outcomes of interest, and is generally easier to implement. |
Keywords: | Cost-Benefit Analysis; Bayes Theorem; Clinical Trial; Statistics, Nonparametric |
JEL: | C15 C11 C14 C18 |
Date: | 2012–08 |
URL: | http://d.repec.org/n?u=RePEc:yor:hectdg:12/24&r=ecm |
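The Bayesian interpretation of the bootstrap (Rubin's Bayesian bootstrap) replaces equal resampling weights with flat Dirichlet weights; a minimal sketch for a posterior over a mean outcome, without the paper's external-evidence step, and with hypothetical data:

```python
import numpy as np

def bayesian_bootstrap_mean(x, n_draws=2000, seed=0):
    """Bayesian bootstrap posterior for the mean of x.

    Each draw reweights the observations with Dirichlet(1, ..., 1)
    weights (Rubin 1981) instead of resampling with replacement.
    """
    rng = np.random.default_rng(seed)
    w = rng.dirichlet(np.ones(len(x)), size=n_draws)   # shape (n_draws, n)
    return w @ x                                       # one posterior draw per row

x = np.array([3.0, 1.5, 2.2, 4.1, 2.8, 3.6, 1.9, 2.4])  # hypothetical per-patient costs
post = bayesian_bootstrap_mean(x)
```

The returned array is a sample from the posterior of the mean; quantiles of `post` give credible intervals without any parametric distributional assumption, which is the property the proposed method builds on.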