on Econometrics |
By: | Matias D. Cattaneo (University of Michigan); Michael Jansson (UC Berkeley and CREATES); Whitney K. Newey (MIT Department of Economics) |
Abstract: | Non-standard distributional approximations have received considerable attention in recent years. They often provide more accurate approximations in small samples, and theoretical improvements in some cases. This paper shows that the seemingly unrelated "many instruments asymptotics" and "small bandwidth asymptotics" share a common structure, where the object determining the limiting distribution is a V-statistic with a remainder that is an asymptotically normal degenerate U-statistic. This general structure can be used to derive new results. We employ it to obtain a new asymptotic distribution of a series estimator of the partially linear model when the number of terms in the series approximation possibly grows as fast as the sample size. This alternative asymptotic experiment implies a larger asymptotic variance than usual. When the disturbance is homoskedastic, this larger variance is consistently estimated by any of the usual homoskedasticity-consistent estimators provided a "degrees-of-freedom correction" is used. Under heteroskedasticity of unknown form, however, none of the commonly used heteroskedasticity-robust standard-error estimators is consistent under the "many regressors asymptotics". We characterize the source of this failure, and we also propose a new standard-error estimator that is consistent under both heteroskedasticity and "many regressors asymptotics". A small simulation study shows that these new confidence intervals have reasonably good empirical size in finite samples. (An illustrative sketch of the degrees-of-freedom correction follows this entry.) |
Keywords: | partially linear model, many terms, adjusted variance. |
JEL: | C13 C31 |
Date: | 2012–01–20 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2012-02&r=ecm |
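The "degrees-of-freedom correction" highlighted in this abstract is easy to illustrate numerically. The following is a minimal sketch, not the authors' implementation: it fits a partially linear model y = x*beta + g(z) + e by OLS on x plus a polynomial series in z, and computes the usual homoskedastic variance of beta using residual degrees of freedom n - K - 1 rather than n. The simulated design, the function name, and the choice of powers of z as series terms are all assumptions of mine.

import numpy as np

def partially_linear_series(y, x, z, K):
    """OLS of y on x and a K-term polynomial series in z; returns beta_hat
    and its homoskedastic variance with a degrees-of-freedom correction."""
    n = len(y)
    P = np.column_stack([z**k for k in range(K)])   # series terms approximating g(z)
    W = np.column_stack([x, P])                     # full regressor matrix
    coef, *_ = np.linalg.lstsq(W, y, rcond=None)
    resid = y - W @ coef
    sigma2_hat = resid @ resid / (n - W.shape[1])   # divide by n - K - 1, not n
    var_beta = sigma2_hat * np.linalg.inv(W.T @ W)[0, 0]
    return coef[0], var_beta

rng = np.random.default_rng(0)
n = 500
z = rng.uniform(-1, 1, n)
x = z + rng.normal(size=n)
y = 1.0 * x + np.sin(3 * z) + rng.normal(size=n)
beta_hat, var_hat = partially_linear_series(y, x, z, K=15)
print(beta_hat, var_hat ** 0.5)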
By: | Díaz-Emparanza Herrero, Ignacio |
Abstract: | When working with time series data observed at intervals shorter than a year, it is often necessary to test for the presence of seasonal unit roots. One of the most widely used methods for testing seasonal unit roots is that of HEGY, which provides test statistics with non-standard distributions. This paper describes a generalisation of this method to any periodicity and uses a response surface regression approach to calculate the critical values and p-values of the HEGY statistics, whatever the periodicity and sample size of the data. The algorithms are implemented in the Gretl open-source econometrics package, and some new tables of critical values for daily, hourly and half-hourly data are presented. (An illustrative response-surface sketch follows this entry.) |
Keywords: | seasonality, unit roots, response surface analysis |
Date: | 2011–12 |
URL: | http://d.repec.org/n?u=RePEc:ehu:biltok:5568&r=ecm |
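The response-surface idea described above can be sketched in a few lines. The code below is a toy illustration under my own assumptions, not the paper's Gretl implementation: it uses an ordinary Dickey-Fuller t-statistic in place of the HEGY statistics, simulates its 5% quantile at several sample sizes, and regresses those quantiles on 1/T and 1/T^2 so that a critical value can be read off for any sample size.

import numpy as np

def df_tstat(y):
    """t-statistic from the regression of diff(y) on lagged y (no constant)."""
    dy, ylag = np.diff(y), y[:-1]
    rho = (ylag @ dy) / (ylag @ ylag)
    resid = dy - rho * ylag
    s2 = resid @ resid / (len(dy) - 1)
    return rho / np.sqrt(s2 / (ylag @ ylag))

rng = np.random.default_rng(1)
sizes, level, reps = [25, 50, 100, 200, 400], 0.05, 2000
quantiles = []
for T in sizes:
    # Simulate the statistic under the unit-root null (a pure random walk).
    stats = [df_tstat(np.cumsum(rng.normal(size=T))) for _ in range(reps)]
    quantiles.append(np.quantile(stats, level))

# Response-surface regression: cv(T) ~ theta0 + theta1/T + theta2/T^2
sizes_arr = np.array(sizes, float)
X = np.column_stack([np.ones(len(sizes)), 1 / sizes_arr, 1 / sizes_arr**2])
theta, *_ = np.linalg.lstsq(X, np.array(quantiles), rcond=None)

def critical_value(T):
    """Interpolated 5% critical value for an arbitrary sample size T."""
    return theta[0] + theta[1] / T + theta[2] / T**2

print(critical_value(137))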
By: | Jozef Barunik; Ladislav Kristoufek |
Abstract: | In this paper, we show how the sampling properties of Hurst exponent estimation methods change in the presence of heavy tails. We run extensive Monte Carlo simulations to find out how rescaled range analysis (R/S), multifractal detrended fluctuation analysis (MF-DFA), the detrending moving average (DMA) and the generalized Hurst exponent approach (GHE) estimate the Hurst exponent of independent series with different heavy tails. For this purpose, we generate independent random series from stable distributions with stability exponent α ranging from 1.1 (heaviest tails) to 2 (the Gaussian case) and estimate the Hurst exponent using the different methods. R/S and GHE prove to be robust to heavy tails in the underlying process. GHE provides the lowest variance and bias in comparison with the other methods, regardless of the presence of heavy tails in the data or of the sample size. Utilizing this result, we apply a novel approach based on an intraday time-dependent Hurst exponent and estimate the Hurst exponent on high-frequency data for each trading day separately. We obtain Hurst exponents for the S&P 500 index for the period from 1983 to November 2009, and we discuss the surprising result, which uncovers how the market's behavior changed over this long period. (An illustrative R/S sketch follows this entry.) |
Date: | 2012–01 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1201.4786&r=ecm |
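As a concrete illustration of one of the methods compared, the sketch below (my own minimal code, not the authors') applies rescaled range (R/S) analysis to independent draws from alpha-stable distributions with stability exponents in the range studied; for an independent series the benchmark Hurst exponent is 0.5. The use of scipy's levy_stable generator, the block sizes and the series length are assumptions of mine.

import numpy as np
from scipy.stats import levy_stable

def rs_hurst(x, block_sizes=(16, 32, 64, 128, 256)):
    """Estimate the Hurst exponent by regressing log mean(R/S) on log block size."""
    log_n, log_rs = [], []
    for n in block_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            block = x[start:start + n]
            dev = np.cumsum(block - block.mean())     # cumulative deviations from the block mean
            R, S = dev.max() - dev.min(), block.std(ddof=1)
            if S > 0:
                rs_vals.append(R / S)                 # rescaled range of the block
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_vals)))
    slope, _ = np.polyfit(log_n, log_rs, 1)           # slope of the log-log fit is H
    return slope

rng = np.random.default_rng(2)
for alpha in (2.0, 1.5, 1.1):                         # Gaussian down to very heavy tails
    x = levy_stable.rvs(alpha, 0.0, size=2048, random_state=rng)
    print(alpha, round(rs_hurst(x), 3))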
By: | Tara M. Sinclair (George Washington University) |
Abstract: | This article provides a discussion of Clements and Galvão’s “Forecasting with Vector Autoregressive Models of Data Vintages: US output growth and inflation.” The authors argue that a multiple-vintage VAR model can be useful for forecasting data that are subject to revisions. They draw a “distinction between forecasting future observations and revisions to past data,” which brings yet another real-time data issue to the attention of forecasters. This comment discusses the importance of taking data revisions into consideration and compares the multiple-vintage VAR approach of Clements and Galvão to a state-space approach. |
Keywords: | Real time data, Evaluating forecasts, Forecasting practice, Time series, Econometric models |
JEL: | C53 |
Date: | 2012–01 |
URL: | http://d.repec.org/n?u=RePEc:gwc:wpaper:2012-001&r=ecm |
By: | Zárraga Alonso, Ainhoa; Nieto Domenech, Belén; Orbe Mandaluniz, Susan |
Abstract: | This paper compares the performance of three different time-varying beta estimators that have never previously been compared: the rolling OLS estimator, a nonparametric estimator and an estimator based on GARCH models. The study is conducted using returns from the Mexican stock market, grouped into six portfolios, for the period 2003-2009. The comparison, based on an asset pricing perspective and on returns in mean-variance space, concludes that GARCH-based beta estimators outperform the others in the time-series dimension, while the nonparametric estimator is more appropriate in the cross-sectional context. (An illustrative rolling-OLS sketch follows this entry.) |
Keywords: | time-varying beta, nonparametric estimator, GARCH-based beta estimator |
JEL: | G15 C12 C14 |
Date: | 2011 |
URL: | http://d.repec.org/n?u=RePEc:ehu:biltok:5283&r=ecm |
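The simplest of the three estimators compared, the rolling OLS beta, can be sketched as follows. This is a minimal illustration with simulated data; the 60-observation window, the data-generating process and the function name are my own assumptions, not details taken from the paper.

import numpy as np

def rolling_ols_beta(port_ret, mkt_ret, window=60):
    """Time-varying beta: OLS slope over a moving window of past returns."""
    betas = np.full(len(port_ret), np.nan)
    for t in range(window, len(port_ret) + 1):
        y = port_ret[t - window:t]
        x = mkt_ret[t - window:t]
        x_dm, y_dm = x - x.mean(), y - y.mean()
        betas[t - 1] = (x_dm @ y_dm) / (x_dm @ x_dm)   # slope of y on x in the window
    return betas

# Simulated monthly returns with a slowly drifting true beta
rng = np.random.default_rng(3)
T = 240
mkt = rng.normal(0.01, 0.04, T)
true_beta = 0.8 + 0.4 * np.sin(np.linspace(0, 2 * np.pi, T))
port = true_beta * mkt + rng.normal(0, 0.02, T)
print(rolling_ols_beta(port, mkt)[-5:])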
By: | Matt P. Dziubinski (Aarhus University and CREATES) |
Abstract: | We present and evaluate a numerical optimization method, together with an algorithm for choosing the starting values, for the constrained optimization problem arising in the estimation of GARCH models with inequality constraints, in particular the Simplified Component GARCH model (SCGARCH). We also provide algorithms for computing the objective function and its analytical gradient for SCGARCH. (An illustrative constrained-estimation sketch follows this entry.) |
Keywords: | Constrained optimization, GARCH, infeasibility, inference under constraints, nonlinear programming, performance of numerical algorithms, SCGARCH, sequential quadratic programming |
JEL: | C32 C51 C58 C61 C63 C88 |
Date: | 2012–01–25 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2012-03&r=ecm |
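A flavour of the constrained-estimation problem can be given with a much simpler model than SCGARCH. The sketch below is my own illustrative example, not the paper's algorithm: it estimates a plain GARCH(1,1) by Gaussian quasi-maximum likelihood under the usual positivity and stationarity inequality constraints, using scipy's SLSQP routine, a sequential quadratic programming method of the class the paper evaluates. The simulated data, starting values and parameterisation are assumptions of mine.

import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, r):
    """Negative Gaussian log-likelihood of a GARCH(1,1): h_t = w + a*r_{t-1}^2 + b*h_{t-1}."""
    w, a, b = params
    h = np.empty_like(r)
    h[0] = r.var()
    for t in range(1, len(r)):
        h[t] = w + a * r[t - 1] ** 2 + b * h[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi * h) + r ** 2 / h)

# Simulate a GARCH(1,1) series with known parameters
rng = np.random.default_rng(4)
w0, a0, b0, T = 0.05, 0.10, 0.85, 2000
r, h = np.zeros(T), np.zeros(T)
h[0] = w0 / (1 - a0 - b0)
r[0] = np.sqrt(h[0]) * rng.normal()
for t in range(1, T):
    h[t] = w0 + a0 * r[t - 1] ** 2 + b0 * h[t - 1]
    r[t] = np.sqrt(h[t]) * rng.normal()

# Inequality constraints: positivity via bounds, stationarity a + b < 1 via SLSQP constraint
constraints = [{"type": "ineq", "fun": lambda p: 1.0 - p[1] - p[2]}]
bounds = [(1e-8, None), (0.0, 1.0), (0.0, 1.0)]
res = minimize(neg_loglik, x0=[0.1, 0.05, 0.8], args=(r,),
               method="SLSQP", bounds=bounds, constraints=constraints)
print(res.x)   # estimates of (w, a, b)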