Econometric Time Series
By: | Antonio Cosma; Olivier Scaillet; Rainer von Sachs |
Abstract: | We present a new approach to shape-preserving estimation of probability distribution and density functions, using wavelet methodology for multivariate dependent data. Our estimators preserve shape constraints such as monotonicity, positivity and integration to one, and allow for low spatial regularity of the underlying functions. As an important application, we discuss conditional quantile estimation for financial time series data. We show that our methodology can be easily implemented with B-splines and performs well in finite samples, as demonstrated through Monte Carlo simulations.
Keywords: | Conditional quantile; time series; shape preserving wavelet estimation; B-splines; multivariate process |
JEL: | C14 C15 C32 |
Date: | 2005–05 |
URL: | http://d.repec.org/n?u=RePEc:fam:rpseri:rp144&r=ets |
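Sketch: | The paper's wavelet/B-spline machinery is beyond a short example, but the core idea, a conditional-CDF estimate that is monotone and lies in [0,1] by construction, inverted to obtain conditional quantiles, can be illustrated with a Nadaraya-Watson smoother. A minimal Python sketch, not the authors' method; the Gaussian kernel, bandwidth and simulated AR(1) data are illustrative assumptions:

```python
import numpy as np

def cond_quantile_nw(x0, X, Y, tau, h, y_grid):
    """tau-th conditional quantile of Y given X = x0, obtained by inverting
    a Nadaraya-Watson conditional-CDF estimate. The weighted ECDF is
    monotone and lies in [0, 1] by construction (the shape constraints
    the paper preserves with wavelet/B-spline estimators)."""
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)        # Gaussian kernel weights
    w /= w.sum()
    F = np.array([w @ (Y <= y) for y in y_grid])  # estimated F(y | x0)
    return y_grid[min(np.searchsorted(F, tau), len(y_grid) - 1)]

# toy use: 5% conditional quantile (a VaR-type object) for an AR(1) series
rng = np.random.default_rng(0)
r = np.zeros(1000)
for t in range(1, 1000):
    r[t] = 0.3 * r[t - 1] + rng.standard_normal()
X, Y = r[:-1], r[1:]                              # condition on the lagged value
grid = np.linspace(Y.min(), Y.max(), 400)
print(cond_quantile_nw(0.0, X, Y, tau=0.05, h=0.5, y_grid=grid))
```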
By: | Olivier Scaillet |
Abstract: | We study a test statistic based on the integrated squared difference between a kernel estimator of the copula density and a kernel-smoothed estimator of the parametric copula density. We show for fixed smoothing parameters that the test is consistent and that the asymptotic properties are driven by a U-statistic of order 4 with degeneracy of order 3. For practical implementation we suggest computing the critical values through a semiparametric bootstrap. Monte Carlo results show that the bootstrap procedure performs well in small samples. In particular, size and power are less sensitive to the smoothing parameter choice than they are under the asymptotic approximation obtained for a vanishing bandwidth.
Keywords: | Nonparametric; Copula density; Goodness-of-fit test; U-statistic. |
JEL: | C12 D18 G10 G21 G22 |
Date: | 2005–05 |
URL: | http://d.repec.org/n?u=RePEc:fam:rpseri:rp145&r=ets |
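Sketch: | A Monte Carlo rendering of the test idea for a bivariate Gaussian copula: compare a kernel density estimate on the pseudo-observations with the same kernel applied to draws from the fitted copula, and calibrate through a semiparametric bootstrap, as the abstract suggests. The bandwidth, interior grid and Kendall's-tau fit of the correlation are illustrative assumptions, not the paper's choices:

```python
import numpy as np
from scipy import stats

def pseudo_obs(x):
    """Normalized ranks: map each margin to (0, 1)."""
    return stats.rankdata(x, axis=0) / (x.shape[0] + 1)

def kde2(U, grid, h):
    """Product-Gaussian kernel density estimate on the unit square."""
    d = (grid[:, None, :] - U[None, :, :]) / h
    return np.exp(-0.5 * d ** 2).prod(axis=2).mean(axis=1) / (2 * np.pi * h ** 2)

def gof_stat(U, rho, grid, h, rng, m=5000):
    """Integrated squared difference between the kernel copula density and a
    kernel-smoothed Gaussian-copula density (the latter by Monte Carlo)."""
    z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=m)
    diff = kde2(U, grid, h) - kde2(stats.norm.cdf(z), grid, h)
    return (diff ** 2).mean()            # crude quadrature over the interior grid

def boot_pvalue(x, h=0.1, B=199, seed=0):
    """Semiparametric bootstrap: resample from the fitted copula, refit, recompute."""
    rng = np.random.default_rng(seed)
    g = np.linspace(0.05, 0.95, 20)
    grid = np.array([(u, v) for u in g for v in g])
    fit = lambda u: np.sin(0.5 * np.pi * stats.kendalltau(u[:, 0], u[:, 1])[0])
    rho0 = fit(x)                        # Gaussian-copula rho via Kendall's tau
    T0 = gof_stat(pseudo_obs(x), rho0, grid, h, rng)
    Tb = []
    for _ in range(B):
        z = rng.multivariate_normal([0, 0], [[1, rho0], [rho0, 1]], size=len(x))
        xb = stats.norm.cdf(z)           # draw from the fitted copula
        Tb.append(gof_stat(pseudo_obs(xb), fit(xb), grid, h, rng))
    return np.mean(np.array(Tb) >= T0)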
By: | Osmani Teixeira de Carvalho Guillén; João Victor Issler; George Athanasopoulos |
Abstract: | Using vector autoregressive (VAR) models and Monte Carlo simulation methods, we investigate the potential gains in forecasting accuracy and estimation uncertainty from two commonly used restrictions arising from economic relationships. The first reduces the parameter space by imposing long-term restrictions on the behavior of economic variables, as discussed in the literature on cointegration; the second reduces the parameter space by imposing short-term restrictions, as discussed in the literature on serial-correlation common features (SCCF). Our simulations cover three important issues in model building, estimation, and forecasting. First, we examine the performance of standard and modified information criteria in choosing the lag length for cointegrated VARs with SCCF restrictions. Second, we compare the forecasting accuracy of fitted VARs when only cointegration restrictions are imposed and when cointegration and SCCF restrictions are imposed jointly. Third, we propose a new estimation algorithm in which short- and long-term restrictions interact to estimate the cointegrating and cofeature spaces, respectively. We have three basic results. First, ignoring SCCF restrictions has a high cost in terms of model selection, because standard information criteria too frequently choose inconsistent models with too small a lag length; criteria selecting lag and rank simultaneously perform better in this case. Second, this translates into a superior forecasting performance of the restricted VECM over the VECM without SCCF restrictions, with improvements in forecasting accuracy reaching more than 100% in extreme cases. Third, the new algorithm proposed here fares very well in terms of parameter estimation, even for the long-term parameters, opening up the discussion of joint estimation of short- and long-term parameters in VAR models.
Keywords: | Reduced rank models, model selection criteria, forecasting accuracy. |
JEL: | C32 C53 |
Date: | 2005–05 |
URL: | http://d.repec.org/n?u=RePEc:msh:ebswps:2005-15&r=ets |
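Sketch: | statsmodels covers the cointegration-restricted side of the comparison (lag selection, Johansen rank selection, VECM forecasts); it does not impose SCCF restrictions, so this only illustrates the VECM baseline on simulated data with one common trend. The API calls and deterministic-term choices below are assumptions about a standard statsmodels workflow, not the authors' code:

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import VECM, select_order, select_coint_rank

# simulate a bivariate I(1) system sharing one stochastic trend (coint. rank 1)
rng = np.random.default_rng(0)
trend = np.cumsum(rng.standard_normal(250))
y = np.column_stack([trend + rng.standard_normal(250),
                     0.5 * trend + rng.standard_normal(250)])

lag = select_order(y, maxlags=8, deterministic="co").aic      # VAR-in-differences lag
rank = select_coint_rank(y, det_order=0, k_ar_diff=lag).rank  # Johansen trace test

res = VECM(y, k_ar_diff=lag, coint_rank=rank, deterministic="co").fit()
print(res.beta)               # estimated cointegrating vector(s)
print(res.predict(steps=12))  # forecasts from the cointegration-restricted model
```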
By: | Yacine Ait-Sahalia; Per A. Mykland; Lan Zhang |
Abstract: | We analyze the impact of time series dependence in market microstructure noise on the properties of estimators of the integrated volatility of an asset price based on data sampled at frequencies high enough for that noise to be a dominant consideration. We show that combining two time scales for that purpose will work even when the noise exhibits time series dependence, analyze in that context a refinement of this approach based on multiple time scales, and compare our different estimators empirically with the standard realized volatility.
JEL: | G12 C22 |
Date: | 2005–05 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:11380&r=ets |
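Sketch: | The basic two-scale construction is short: average K subsampled realized variances and bias-correct with the full-frequency realized variance, which at the top sampling frequency mainly measures the noise. This is the i.i.d.-noise version; the paper's multiple-time-scale refinement for dependent noise is not shown. The simulated inputs are illustrative assumptions:

```python
import numpy as np

def tsrv(logp, K=30):
    """Two-scales realized volatility: average of K subsampled realized
    variances, bias-corrected by the all-frequency realized variance
    (which at the top frequency is dominated by microstructure noise)."""
    n = len(logp) - 1
    rv_all = np.sum(np.diff(logp) ** 2)
    rv_sub = np.mean([np.sum(np.diff(logp[k::K]) ** 2) for k in range(K)])
    nbar = (n - K + 1) / K
    return (rv_sub - (nbar / n) * rv_all) / (1 - nbar / n)  # small-sample adjusted

# toy check: efficient log-price plus i.i.d. noise, one observation per second
rng = np.random.default_rng(0)
n = 23400
p = np.cumsum(0.01 / np.sqrt(n) * rng.standard_normal(n))   # daily vol of 1%
noisy = p + 0.0005 * rng.standard_normal(n)                 # microstructure noise
print(tsrv(noisy), np.sum(np.diff(p) ** 2))  # TSRV vs true-price realized variance
```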
By: | Javier Hualde (School of Economics and Business Administration, University of Navarra) |
Abstract: | Recently, increasing interest in the issue of fractional cointegration has emerged from both theoretical and empirical viewpoints. Here, as opposed to the traditional prescription of unit-root observables with weakly dependent cointegrating errors, the orders of integration of these series are allowed to take real values, but, as in the traditional framework, equality of the orders of at least two observable series is necessary for cointegration. In view of the real-valued nature of these orders, this assumption could pose some difficulties, and in the present paper we explore some ideas related to this issue in a simple bivariate framework. First, in a situation of "near-cointegration", where the only difference with respect to the "usual" fractional cointegration is that the orders of the two observable series differ in an asymptotically negligible way, we analyse the properties of standard estimates of the cointegrating parameter. Second, we discuss the estimation of the cointegrating parameter in a situation where the orders of integration of the two observables are truly different, but their corresponding balanced versions (with the same order of integration) are cointegrated in the usual sense. A Monte Carlo study of finite-sample performance on simulated series is included.
JEL: | C22 |
Date: | 2005–05 |
URL: | http://d.repec.org/n?u=RePEc:una:unccee:wp0605&r=ets |
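Sketch: | The second setting of the paper, observables of genuinely different memory whose balanced versions cointegrate, can be mimicked with a truncated fractional-difference filter: bring the regressor down to the regressand's order, then run the cointegrating regression. Integration orders are treated as known here (the paper estimates them), and all numerical choices are illustrative assumptions:

```python
import numpy as np

def fracdiff(x, d):
    """Truncated (type II) fractional difference (1 - L)^d of a series.
    Negative d fractionally integrates instead of differencing."""
    n = len(x)
    w = np.ones(n)
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k       # expansion coefficients of (1-L)^d
    return np.array([w[:t + 1][::-1] @ x[:t + 1] for t in range(n)])

rng = np.random.default_rng(0)
n = 1000
x = fracdiff(rng.standard_normal(n), -1.4)      # observable of order 1.4
u = fracdiff(rng.standard_normal(n), -0.3)      # cointegrating error of order 0.3
y = 0.7 * fracdiff(x, 0.4) + u                  # order-1 regressand

xb = fracdiff(x, 1.4 - 1.0)                     # balance x down to y's order
print((xb @ y) / (xb @ xb))                     # OLS slope, close to 0.7
```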
By: | J. Carlos Escanciano (School of Economics and Business Administration, University of Navarra) |
Abstract: | Economic theories in dynamic contexts usually impose certain restrictions on the conditional mean of the underlying economic variables. Omnibus specification tests are the primary tools for testing such restrictions when there is no information on the possible alternative. In this paper we study in detail the power properties of a large class of omnibus specification tests for parametric conditional means under time series processes. We show that all omnibus specification tests have a preference for a finite-dimensional space of alternatives (usually unknown to the practitioner), and we characterize such a space for Cramér-von Mises (CvM) tests. This fact motivates the use of optimal tests against such preferred spaces instead of the omnibus tests. We propose new asymptotically optimal directional and smooth tests designed for cases in which a finite-dimensional space of alternatives is in mind. The new optimal procedures are asymptotically distribution-free and are valid under weak assumptions on the underlying data generating process. In particular, they are valid under possibly time-varying higher conditional moments of unknown form, e.g., conditional heteroskedasticity. A Monte Carlo experiment shows that the asymptotic results provide good approximations in small samples. Finally, an application of our theory to testing the martingale difference hypothesis for some exchange rates provides new information on the rejections of omnibus tests and illustrates the relevance of our results for practitioners.
JEL: | C12 C14 C52 |
Date: | 2005–05 |
URL: | http://d.repec.org/n?u=RePEc:una:unccee:wp0705&r=ets |
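Sketch: | A CvM omnibus statistic for a conditional mean is a quadratic functional of the marked empirical process, and a wild bootstrap keeps it valid under conditional heteroskedasticity. Below is a simplified version applied to the martingale-difference hypothesis (constant conditional mean, one lag as the conditioning variable); re-estimating the model on each bootstrap draw, which the theory calls for, is omitted for brevity, and the simulated data are an assumption:

```python
import numpy as np

def cvm_stat(e, X):
    """CvM statistic n^{-2} * sum_j ( sum_i e_i 1{X_i <= X_j} )^2, based on
    the marked empirical process of residuals e marked at the regressor X."""
    ind = (X[:, None] <= X[None, :]).astype(float)  # ind[i, j] = 1{X_i <= X_j}
    R = e @ ind                                     # cumulated marks at each X_j
    return (R ** 2).mean() / len(e)

def wild_boot_pvalue(e, X, B=499, seed=0):
    """Wild bootstrap with Rademacher weights, robust to time-varying
    higher conditional moments of unknown form."""
    rng = np.random.default_rng(seed)
    T0 = cvm_stat(e, X)
    Tb = [cvm_stat(e * rng.choice([-1.0, 1.0], size=len(e)), X) for _ in range(B)]
    return (np.sum(np.array(Tb) >= T0) + 1) / (B + 1)

# martingale-difference check on a simulated return series
rng = np.random.default_rng(0)
r = rng.standard_normal(500)              # true MDS: expect a large p-value
e = r[1:] - r[1:].mean()                  # residuals from a constant mean
print(wild_boot_pvalue(e, X=r[:-1]))      # condition on the lagged return
```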
By: | Joseph P. Romano; Michael Wolf |
Abstract: | Consider the problem of testing s hypotheses simultaneously. The usual approach to dealing with the multiplicity problem is to restrict attention to procedures that control the probability of even one false rejection, the familiar familywise error rate (FWER). In many applications, particularly if s is large, one might be willing to tolerate more than one false rejection provided the number of such cases is controlled, thereby increasing the ability of the procedure to reject false null hypotheses. One possibility is to replace control of the FWER by control of the probability of k or more false rejections, called the k-FWER. We derive both single-step and stepdown procedures that control the k-FWER in finite samples or asymptotically, depending on the situation. Lehmann and Romano (2005a) derive some exact methods for this purpose, which apply whenever p-values are available for the individual tests; no assumptions are made on the joint dependence of the p-values. In contrast, we construct methods that implicitly take into account the dependence structure of the individual test statistics in order to further increase the ability to detect false null hypotheses. We also consider the false discovery proportion (FDP), defined as the number of false rejections divided by the total number of rejections (and defined to be 0 if there are no rejections). The false discovery rate proposed by Benjamini and Hochberg (1995) controls E(FDP).
Keywords: | Bootstrap, False Discovery Proportion, False Discovery Rate, Generalized Familywise Error Rates, Multiple Testing, Stepdown Procedure. |
JEL: | E43 |
URL: | http://d.repec.org/n?u=RePEc:zur:iewwpx:245&r=ets |
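Sketch: | The p-value benchmark the abstract contrasts with, the Lehmann and Romano (2005a) stepdown procedure controlling the k-FWER under arbitrary dependence, fits in a few lines; the authors' bootstrap methods, which exploit the dependence structure of the test statistics, are more involved and not shown:

```python
import numpy as np

def kfwer_stepdown(pvals, k=2, alpha=0.05):
    """Lehmann-Romano stepdown control of the k-FWER (probability of k or
    more false rejections) from p-values alone, under arbitrary dependence.
    Critical values: k*alpha/s for the k smallest p-values, then
    k*alpha/(s + k - i) for the i-th smallest, i > k."""
    pvals = np.asarray(pvals)
    s = len(pvals)
    order = np.argsort(pvals)
    i = np.arange(1, s + 1)
    crit = k * alpha / np.where(i <= k, s, s + k - i)
    below = pvals[order] <= crit
    n_rej = s if below.all() else int(below.argmin())  # stop at first failure
    reject = np.zeros(s, dtype=bool)
    reject[order[:n_rej]] = True
    return reject

# example: 10 hypotheses, tolerating up to k-1 = 1 false rejection
print(kfwer_stepdown([.001, .004, .01, .02, .2, .4, .5, .6, .7, .9], k=2))
```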