New Economics Papers on Econometrics |
By: | Taoufik Bouezmarni; Jeroen V.K. Rombouts (IEA, HEC Montréal) |
Abstract: | The Gaussian kernel density estimator is known to have substantial problems for bounded random variables with high density at the boundaries. For i.i.d. data several solutions have been put forward to solve this boundary problem. In this paper we propose the gamma kernel estimator as a density estimator for positive data from a stationary α-mixing process. We derive the mean integrated squared error, almost sure convergence and asymptotic normality. In a Monte Carlo study, where we generate data from an autoregressive conditional duration model and a stochastic volatility model, we find that the gamma kernel outperforms the local linear density estimator. An application to data from financial transaction durations, realized volatility and electricity price data is provided. (A short code sketch follows this entry.) |
Keywords: | Gamma kernel, nonparametric density estimation, mixing process, transaction durations, realised volatility. |
JEL: | C11 C22 C52 |
Date: | 2006–09 |
URL: | http://d.repec.org/n?u=RePEc:iea:carech:0609&r=ecm |
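
A minimal sketch of a gamma kernel density estimator in the spirit of Chen (2000), the building block used in the entry above; the simple shape parameter x/b + 1 and the bandwidth value are illustrative assumptions rather than the paper's exact specification:

import numpy as np
from scipy.stats import gamma

def gamma_kde(x_grid, data, b):
    """Gamma kernel density estimate at positive grid points x_grid."""
    x = np.asarray(x_grid)[:, None]          # (m, 1) evaluation points
    # Gamma kernel with shape x/b + 1 and scale b, evaluated at the sample.
    return gamma.pdf(np.asarray(data)[None, :], a=x / b + 1.0, scale=b).mean(axis=1)

# Usage on positive, serially dependent data such as trade durations:
rng = np.random.default_rng(0)
durations = rng.gamma(shape=0.8, scale=1.0, size=1000)
grid = np.linspace(0.01, 5.0, 200)
fhat = gamma_kde(grid, durations, b=0.1)

Because the gamma kernel's support is the positive half-line, no probability mass leaks below zero, which is the source of the boundary problem for the Gaussian kernel.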
By: | Qiying Wang (School of Mathematics and Statistics, University of Sydney); Peter C.B. Phillips (Cowles Foundation, Yale University) |
Abstract: | We provide a new asymptotic theory for local time density estimation for a general class of functionals of integrated time series. This result provides a convenient basis for developing an asymptotic theory for nonparametric cointegrating regression and autoregression. Our treatment directly involves the density function of the processes under consideration and avoids Fourier integral representations and Markov process theory which have been used in earlier research on this type of problem. The approach provides results of wide applicability to important practical cases and involves rather simple derivations that should make the limit theory more accessible and usable in econometric applications. Our main result is applied to offer an alternative development of the asymptotic theory for nonparametric estimation of a nonlinear cointegrating regression involving nonstationary time series. In place of the framework of null recurrent Markov chains as developed in recent work of Karlsen, Myklebust and Tjostheim (2007), the direct local time density argument used here more closely resembles conventional nonparametric arguments, making the conditions simpler and more easily verified. (A short code sketch follows this entry.) |
Keywords: | Brownian Local time, Cointegration, Integrated process, Local time density estimation, Nonlinear functionals, Nonparametric regression, Unit root |
JEL: | C14 C22 |
Date: | 2006–12 |
URL: | http://d.repec.org/n?u=RePEc:cwl:cwldpp:1594&r=ecm |
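
The limit theory above is applied to kernel regression with an integrated regressor. A minimal sketch of such a Nadaraya-Watson estimator, with a Gaussian kernel and a fixed bandwidth as illustrative choices:

import numpy as np

def nw_regression(x_grid, x, y, h):
    """Nadaraya-Watson estimate of E[y | x] at the points in x_grid."""
    K = np.exp(-0.5 * ((x_grid[:, None] - x[None, :]) / h) ** 2)  # Gaussian kernel weights
    return (K @ y) / K.sum(axis=1)

# Usage: x is an integrated (unit-root) regressor, y = f(x) + error.
rng = np.random.default_rng(1)
x = np.cumsum(rng.standard_normal(2000))          # integrated time series
y = np.sin(x / 5.0) + 0.2 * rng.standard_normal(2000)
grid = np.linspace(x.min(), x.max(), 100)
fhat = nw_regression(grid, x, y, h=1.0)

The local time of the limit process at a point governs how many observations fall near that point, and hence the (random) effective sample size of the kernel estimate there.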
By: | Ke-Li Xu (Dept. of Economics, Yale University); Peter C.B. Phillips (Cowles Foundation, Yale University) |
Abstract: | Stable autoregressive models of known finite order are considered with martingale difference errors scaled by an unknown nonparametric time-varying function generating heterogeneity. An important special case involves structural change in the error variance, but in most practical cases the pattern of variance change over time is unknown and may involve shifts at unknown discrete points in time, continuous evolution or combinations of the two. This paper develops kernel-based estimators of the residual variances and associated adaptive least squares (ALS) estimators of the autoregressive coefficients. These are shown to be asymptotically efficient, having the same limit distribution as infeasible generalized least squares (GLS). Comparisons of the efficient procedure and ordinary least squares (OLS) reveal that least squares can be extremely inefficient in some cases while nearly optimal in others. Simulations show that, when least squares works well, the adaptive estimators perform comparably well, whereas when least squares works poorly, major efficiency gains are achieved by the new estimators. (A short code sketch follows this entry.) |
Keywords: | Adaptive estimation, Autoregression, Heterogeneity, Weighted regression |
JEL: | C14 C22 |
Date: | 2006–10 |
URL: | http://d.repec.org/n?u=RePEc:cwl:cwldpp:1585r&r=ecm |
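
A minimal sketch of the two-step adaptive idea described above, for an AR(1): kernel-smooth squared OLS residuals over rescaled time to estimate the variance function, then reweight. The Gaussian kernel and the bandwidth are illustrative assumptions:

import numpy as np

def als_ar1(y, h):
    """Adaptive least squares estimate of the AR(1) coefficient."""
    x, yy = y[:-1], y[1:]
    n = len(yy)
    # Step 1: OLS and squared residuals.
    rho_ols = (x @ yy) / (x @ x)
    u2 = (yy - rho_ols * x) ** 2
    # Step 2: kernel-smoothed variance estimates over rescaled time t/n.
    t = np.arange(n) / n
    K = np.exp(-0.5 * ((t[:, None] - t[None, :]) / h) ** 2)
    sig2 = (K @ u2) / K.sum(axis=1)
    # Step 3: weighted least squares with weights 1/sigma_t^2.
    w = 1.0 / sig2
    return ((w * x) @ yy) / ((w * x) @ x)

Smoothing over rescaled time accommodates both discrete variance breaks and smooth evolution without requiring the researcher to specify which is present.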
By: | Jonathan Hill (Department of Economics, Florida International University) |
Abstract: | This paper develops a consistent test of best Lp-predictor functional form for a time series process. By functionally relating two moment conditions with different nuisance parameters we are able to construct a vector moment condition in which at least one element must be non-zero under the alternative. Specifically, we provide a sufficient condition for moment conditions of the type characterized by Stinchcombe and White (1998) to reveal model mis-specification for any nuisance parameter value. When the sufficient condition fails, an alternative moment condition is guaranteed to work. A simulation study clearly demonstrates the superiority of a randomized test: randomly selecting the nuisance parameter leads to more power than average- and supremum-test functionals, and obtains empirical power nearly equivalent to uniformly most powerful tests in most cases. (A short code sketch follows this entry.) |
Keywords: | consistent test; conditional moment test; best Lp-predictor; nonlinear model. |
JEL: | C12 C45 C52 |
Date: | 2006–09 |
URL: | http://d.repec.org/n?u=RePEc:fiu:wpaper:0610&r=ecm |
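
An illustrative sketch of the randomized-test idea: rather than taking a supremum or average over the nuisance parameter, draw a single value at random and evaluate one conditional moment statistic. The exponential weight and the uniform draw are assumptions standing in for the paper's exact construction, and parameter-estimation effects are ignored for brevity:

import numpy as np

def randomized_cm_stat(x, resid, rng):
    """One draw of a randomized conditional moment test statistic."""
    gamma_draw = rng.uniform(-2.0, 2.0, size=x.shape[1])  # randomly drawn nuisance parameter
    w = np.exp(x @ gamma_draw)                            # Stinchcombe-White type test weight
    m = resid * w                                         # moment condition, mean zero under the null
    return np.sqrt(len(m)) * m.mean() / m.std(ddof=1)     # approximately N(0,1) under the null

One possible intuition for the reported power gains is that a single random draw keeps standard normal critical values, whereas supremum functionals must pay for searching over the whole nuisance parameter space.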
By: | Siem Jan Koopman (Vrije Universiteit Amsterdam); Soon Yip Wong (Vrije Universiteit Amsterdam) |
Abstract: | A growing number of empirical studies provide evidence that the dynamic properties of macroeconomic time series have been changing over time. Model-based procedures for the measurement of business cycles should therefore allow model parameters to adapt over time. In this paper the time dependence of the parameters is implied by a time dependent sample spectrum; explicit model specifications for the parameters are therefore not required. Parameter estimation is carried out in the frequency domain by maximising the spectral likelihood function. The time dependent spectrum is specified as a semi-parametric smoothing spline ANOVA function that can be formulated in state space form. Since the resulting spectral likelihood function is time-varying, model parameter estimates become time-varying as well. This new and simple approach to business cycle extraction includes bootstrap procedures for the computation of confidence intervals and real-time procedures for forecasting the spectrum and the business cycle. We illustrate the methodology by presenting a complete business cycle analysis for two U.S. macroeconomic time series. The empirical results are promising and provide significant evidence for the Great Moderation of the U.S. business cycle. (A short code sketch follows this entry.) |
Keywords: | Frequency domain estimation; frequency domain bootstrap; time-varying parameters; unobserved components models |
JEL: | C13 C14 C22 E32 |
Date: | 2006–11–29 |
URL: | http://d.repec.org/n?u=RePEc:dgr:uvatin:20060105&r=ecm |
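
The frequency-domain estimation step can be illustrated with a minimal Whittle (spectral) likelihood for a stationary AR(1); the paper's time-varying spline spectrum is not reproduced here, and the AR(1) spectrum with unit innovation variance is an illustrative assumption:

import numpy as np
from scipy.optimize import minimize_scalar

def whittle_ar1(y):
    """Estimate an AR(1) coefficient by maximizing the spectral likelihood."""
    n = len(y)
    I = np.abs(np.fft.rfft(y - y.mean())) ** 2 / (2 * np.pi * n)  # periodogram
    j = np.arange(1, len(I))                                      # skip frequency zero
    w, I = 2 * np.pi * j / n, I[1:]                               # Fourier frequencies
    def neg_loglik(phi):
        f = 1.0 / (2 * np.pi * (1.0 - 2.0 * phi * np.cos(w) + phi ** 2))  # AR(1) spectrum
        return np.sum(np.log(f) + I / f)                          # negative Whittle likelihood
    return minimize_scalar(neg_loglik, bounds=(-0.99, 0.99), method="bounded").x

Replacing the fixed periodogram with a time-dependent sample spectrum, as the paper does, makes the maximizer of this likelihood time-varying as well.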
By: | Jean-Bernard Chatelain (PSE - Paris-Jourdan Sciences Economiques, CNRS UMR 8545, EHESS, Ecole Nationale des Ponts et Chaussées, Ecole Normale Supérieure de Paris; EconomiX, CNRS UMR 7166, Université de Paris X - Nanterre) |
Abstract: | This paper proposes consistent moment selection procedures for generalized method of moments estimation based on the J test of over-identifying restrictions (Hansen [1982]) and on the Eichenbaum, Hansen and Singleton [1988] test of the validity of a subset of moment conditions. (A short code sketch follows this entry.) |
Keywords: | Generalized method of moments, test of over-identifying restrictions, test of subset of over-identifying restrictions, Consistent Moment Selection |
Date: | 2006–11–30 |
URL: | http://d.repec.org/n?u=RePEc:hal:papers:halshs-00112514_v2&r=ecm |
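
A hypothetical sketch of J-test-based moment selection: compute Hansen's J statistic for candidate instrument sets and keep the largest set that is not rejected. The two-step GMM weighting and the downward-testing loop are simplifying assumptions:

import numpy as np
from scipy.stats import chi2

def j_stat(y, X, Z):
    """Two-step GMM for y = X b + u with instruments Z; returns (J, p-value)."""
    n, k = X.shape
    b = np.linalg.lstsq(Z.T @ X, Z.T @ y, rcond=None)[0]   # first step, identity weight
    g = Z * (y - X @ b)[:, None]
    W = np.linalg.inv(g.T @ g / n)                         # efficient weight matrix
    A = X.T @ Z @ W @ Z.T @ X
    b = np.linalg.solve(A, X.T @ Z @ W @ Z.T @ y)          # second step
    gbar = Z.T @ (y - X @ b) / n
    J = n * gbar @ W @ gbar
    return J, chi2.sf(J, Z.shape[1] - k)

def select_moments(y, X, candidates, level=0.05):
    """Return the largest candidate instrument set whose J test is not rejected."""
    for Z in sorted(candidates, key=lambda Z: -Z.shape[1]):
        if j_stat(y, X, Z)[1] > level:
            return Z
    return None

Consistency of such selection procedures typically requires the significance level to shrink with the sample size, which is the kind of condition the paper makes precise.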
By: | Patrik Guggenberger |
URL: | http://d.repec.org/n?u=RePEc:cla:uclaol:402&r=ecm |
By: | Stefano Iacus (Department of Economics, Business and Statistics, University of Milan, IT); Masayuki Uchida (Department of Mathematical Sciences, Faculty of Mathematics, Kyushu University, Ropponmatsu, Fukuoka 810-8560, Japan); Nakahiro Yoshida (Graduate School of Mathematical Sciences, University of Tokyo, 3-8-1 Komaba, Meguro-ku, Tokyo 153-8914 Japan) |
Abstract: | A one dimensional diffusion process $X=\{X_t, 0\leq t \leq T\}$ is observed only when its path lies above some threshold $\tau$. On the basis of the observable part of the trajectory, the problem is to estimate a finite dimensional parameter in both the drift and diffusion coefficients under a discrete sampling scheme. It is assumed that the sampling occurs at regularly spaced time intervals of length $h_n$ such that $h_n \cdot n = T$. The asymptotics are considered as $T\to\infty$, $n\to\infty$, $n h_n^2\to 0$. Consistency and asymptotic normality for estimators of the parameters in both the drift and diffusion coefficients are proved. (A short code sketch follows this entry.) |
Keywords: | discrete observations, partially observed systems, diffusion processes |
Date: | 2006–11–12 |
URL: | http://d.repec.org/n?u=RePEc:bep:unimip:1042&r=ecm |
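
An illustrative Euler-type contrast for this setting: with a simulated path, keep only consecutive pairs whose values both exceed the threshold, mimicking the observable part of the trajectory. The generic Gaussian quasi-likelihood below stands in for the paper's exact estimating function:

import numpy as np

def euler_contrast(params, x, h, tau, drift, diffusion):
    """Euler contrast over observable transitions of a threshold-observed diffusion."""
    keep = (x[:-1] > tau) & (x[1:] > tau)          # both endpoints observable
    x0, x1 = x[:-1][keep], x[1:][keep]
    m = x0 + drift(x0, params) * h                 # Euler conditional mean
    v = diffusion(x0, params) ** 2 * h             # Euler conditional variance
    return np.sum(np.log(v) + (x1 - m) ** 2 / v)   # Gaussian quasi-likelihood (up to constants)

# Usage, e.g. for an Ornstein-Uhlenbeck model dX = -p0 X dt + p1 dW:
# from scipy.optimize import minimize
# minimize(euler_contrast, [0.5, 0.5],
#          args=(x, h, tau, lambda x, p: -p[0] * x, lambda x, p: p[1]))

The sampling conditions quoted in the abstract ($n h_n = T \to \infty$ with $n h_n^2 \to 0$) are the standard regime in which Euler approximations of this kind identify both the drift and the diffusion parameters.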
By: | John F. Geweke (University of Iowa); Joel L. Horowitz (Northwestern University); M. Hashem Pesaran (CIMF, University of Cambridge and IZA Bonn) |
Abstract: | As a unified discipline, econometrics is still relatively young and has been transforming and expanding very rapidly over the past few decades. Major advances have taken place in the analysis of cross sectional data by means of semi-parametric and non-parametric techniques. Heterogeneity of economic relations across individuals, firms and industries is increasingly acknowledged, and attempts have been made to take it into account either by integrating out its effects or by modeling the sources of heterogeneity when suitable panel data exist. The counterfactual considerations that underlie policy analysis and treatment evaluation have been given a more satisfactory foundation. New time series econometric techniques have been developed and employed extensively in the areas of macroeconometrics and finance. Non-linear econometric techniques are used increasingly in the analysis of cross section and time series observations. Applications of Bayesian techniques to econometric problems have been given new impetus largely thanks to advances in computer power and computational techniques. The use of Bayesian techniques has in turn provided investigators with a unifying framework where the tasks of forecasting, decision making, model evaluation and learning can be considered as parts of the same interactive and iterative process, thus paving the way for establishing the foundation of "real time econometrics". This paper attempts to provide an overview of some of these developments. |
Keywords: | history of econometrics, microeconometrics, macroeconometrics, Bayesian econometrics, nonparametric and semi-parametric analysis |
JEL: | C1 C2 C3 C4 C5 |
Date: | 2006–11 |
URL: | http://d.repec.org/n?u=RePEc:iza:izadps:dp2458&r=ecm |
By: | Davide La Torre (University of Milan); Herb Kunze; Ed Vrscay |
Abstract: | Most natural phenomena, or the experiments that explore them, are subject to small variations in the environment within which they take place. As a result, data gathered from many runs of the same experiment may well show differences that are most suitably accounted for by a model that incorporates some randomness. Differential equations with random coefficients are one such class of useful models. In this paper we consider such equations as random fixed point equations $T(\omega,x(\omega)) = x(\omega)$, where $T : \Omega \times X \to X$ is a random integral operator, $\Omega$ is a probability space and $X$ is a complete metric space. We consider the following inverse problem for such equations: given a set of realizations of the fixed point of $T$ (possibly the interpolations of different observational data sets), determine the operator $T$ or the mean value of its random components, as appropriate. We solve the inverse problem for this class of equations by using the collage theorem. (A short code sketch follows this entry.) |
Keywords: | random fixed point equations, random differential equations |
Date: | 2006–09–07 |
URL: | http://d.repec.org/n?u=RePEc:bep:unimip:1036&r=ecm |
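
A small sketch of the collage-theorem approach to the inverse problem: choose the operator's parameter to minimize the collage distance ||x - Tx|| for an observed trajectory. The Picard integral operator for x' = a x, x(0) = x0 is an illustrative assumption:

import numpy as np
from scipy.optimize import minimize_scalar

t = np.linspace(0.0, 1.0, 200)
x_obs = 2.0 * np.exp(0.7 * t)          # observed fixed point (true a = 0.7, x0 = 2)

def collage_distance(a):
    """Sup-norm collage error for (T_a x)(t) = x(0) + integral_0^t a x(s) ds."""
    integral = np.concatenate(([0.0], np.cumsum(
        a * 0.5 * (x_obs[1:] + x_obs[:-1]) * np.diff(t))))     # trapezoid rule
    return np.max(np.abs(x_obs - (x_obs[0] + integral)))

a_hat = minimize_scalar(collage_distance, bounds=(0.0, 2.0), method="bounded").x  # close to 0.7

The collage theorem bounds the distance between the observed trajectory and the true fixed point of T_a by the collage distance (up to a contractivity factor), so minimizing the latter is a tractable proxy for the inverse problem; with random coefficients one works realization by realization, or matches the mean of the random component, as the abstract indicates.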
By: | Lubos Briatka |
Abstract: | Kocenda (2001) introduced a test for nonlinear dependencies in time series data based on the correlation integral. The idea of the test is to estimate the correlation dimension by integrating over a range of the proximity parameter epsilon. The test's behavior is unexplored, however, when it is applied to non-normal data. Using Monte Carlo studies, we show that the test over-rejects the null hypothesis for non-normal data, since a rejection can stem from either of two departures: the data are not iid, or the data are not normal. It is shown that even a very small deviation from normality can lead to a rejection of the null hypothesis and hence to a wrong conclusion. Therefore, a bootstrap method is introduced, and it is shown to avoid the over-rejection problem; moreover, the power of the test increases by a significant amount. These findings extend the use of the test to many other fields that deal with nonlinear data that are not necessarily normal, e.g. financial economics, stock price volatility, stock market efficiency, stock exchanges, behavior of equity indices, nonlinear dynamics in foreign exchange rates, or interest rates. (A short code sketch follows this entry.) |
Keywords: | Chaos, nonlinear dynamics, correlation integral, Monte Carlo, power tests, high-frequency economic and financial data |
JEL: | C14 C15 C52 C87 F31 G12 |
Date: | 2006–09 |
URL: | http://d.repec.org/n?u=RePEc:cer:papers:wp308&r=ecm |
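
A sketch of the correlation integral and the bootstrap remedy described above: resampling the data destroys any dependence while preserving the non-normal marginal, giving a null distribution that does not over-reject for non-normal iid data. The embedding dimension, the proximity parameter, and the use of C(epsilon) itself as the statistic are simplifying assumptions:

import numpy as np

def correlation_integral(x, eps, m=2):
    """Fraction of pairs of m-histories within max-norm distance eps."""
    emb = np.column_stack([x[i:len(x) - m + 1 + i] for i in range(m)])  # delay embedding
    d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)       # pairwise max-norm
    n = len(emb)
    return (np.sum(d < eps) - n) / (n * (n - 1))                        # exclude self-pairs

def bootstrap_pvalue(x, eps, B=499, seed=0):
    """Bootstrap p-value for the iid null, resampling under the empirical marginal."""
    rng = np.random.default_rng(seed)
    t0 = correlation_integral(x, eps)
    tb = np.array([correlation_integral(rng.permutation(x), eps) for _ in range(B)])
    return np.mean(tb >= t0)                                            # one-sided p-value

Permuting the series enforces the iid null while keeping the marginal distribution fixed, which is precisely what separates non-normality from genuine nonlinear dependence.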
By: | Roger Klein (Rutgers University); Francis Vella (Georgetown University and IZA Bonn) |
Abstract: | This paper employs conditional second moments to identify the impact of education in wage regressions where education is treated as endogenous. This approach avoids the use of instrumental variables in a setting where instruments are frequently not available. We employ this methodology to estimate the returns to schooling for a sample of Australian workers. We find that accounting for the endogeneity of education in this manner increases the estimated return to education from 6 percent to 10 percent. (A short code sketch follows this entry.) |
Keywords: | returns to schooling, endogeneity, heteroskedasticity |
JEL: | J2 C31 |
Date: | 2006–10 |
URL: | http://d.repec.org/n?u=RePEc:iza:izadps:dp2407&r=ecm |
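
A hedged sketch of heteroskedasticity-based identification, using Lewbel-type generated instruments ((W - mean(W)) times first-stage residuals) as a stand-in for Klein and Vella's semiparametric conditional-second-moment procedure; all variable names are hypothetical:

import numpy as np

def het_iv(y, educ, W):
    """2SLS for a wage equation with endogenous schooling, W a matrix of exogenous covariates."""
    n = len(y)
    Wc = np.column_stack([np.ones(n), W])
    v = educ - Wc @ np.linalg.lstsq(Wc, educ, rcond=None)[0]  # first-stage residuals
    Z = (W - W.mean(axis=0)) * v[:, None]                     # heteroskedasticity-generated instruments
    X = np.column_stack([np.ones(n), educ, W])                # structural regressors
    Zf = np.column_stack([np.ones(n), Z, W])                  # full instrument set
    P = Zf @ np.linalg.lstsq(Zf, X, rcond=None)[0]            # first-stage projection of X
    return np.linalg.lstsq(P, y, rcond=None)[0]               # [const, return to schooling, controls]

Identification here comes from the error variances varying with W rather than from an excluded instrument, which is what makes this family of approaches usable when conventional instruments are unavailable.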
By: | Spyros Konstantopoulos (Northwestern University and IZA Bonn) |
Abstract: | Field experiments that involve nested structures may assign treatment conditions either to entire groups (such as classrooms or schools) or to individuals within groups (such as students). Since field experiments involve clustering, key aspects of their design include knowledge of the intraclass correlation structure and the sample sizes necessary to achieve adequate power to detect the treatment effect. This study provides methods for computing power in three-level designs, where, for example, students are nested within classrooms and classrooms are nested within schools. The power computations take into account clustering effects at the classroom and at the school level, sample size effects (e.g., number of students, classrooms, and schools), and covariate effects (e.g., pre-treatment measures). The methods are generalizable to quasi-experimental studies that examine group differences in an outcome, or associations between predictors and outcomes. (A short code sketch follows this entry.) |
Keywords: | nested designs, statistical power, randomized trials, treatment effects, random effects, experiments, clustering |
JEL: | C9 |
Date: | 2006–10 |
URL: | http://d.repec.org/n?u=RePEc:iza:izadps:dp2412&r=ecm |
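
A minimal power computation for a three-level design in which schools are randomized (half to treatment), with J classrooms per school and n students per classroom; the normal approximation (in place of a noncentral t) and the omission of covariates are simplifying assumptions:

import numpy as np
from scipy.stats import norm

def power_three_level(delta, K, J, n, rho_school, rho_class, alpha=0.05):
    """Two-sided power to detect a standardized effect delta with K schools."""
    # Variance of the treatment contrast with school- and classroom-level ICCs
    # (total outcome variance normalized to one).
    var = (4.0 / K) * (rho_school + rho_class / J +
                       (1.0 - rho_school - rho_class) / (J * n))
    lam = delta / np.sqrt(var)                    # noncentrality parameter
    z = norm.ppf(1.0 - alpha / 2.0)
    return norm.sf(z - lam) + norm.cdf(-z - lam)

# e.g. power_three_level(0.25, K=40, J=3, n=25, rho_school=0.15, rho_class=0.10)

The variance decomposition makes the design trade-off explicit: once the school-level intraclass correlation dominates, adding students or classrooms buys little power compared with adding schools.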