
on Econometrics 
By:  Einmahl,John H.J.; Van Keilegom,Ingrid (Tilburg University, Center for Economic Research) 
Abstract:  Consider the nonparametric regression model Y = m(X) + e, where the function m is smooth but unknown. We construct tests for the independence of e and X, based on n independent copies of (X, Y). The testing procedures are based on differences of neighboring Y's. We establish asymptotic results for the proposed test statistics, investigate their finite sample properties through a simulation study, and present an econometric application to household data. The proofs are based on delicate empirical process theory. 
Keywords:  empirical process;model diagnostics;nonparametric regression;test for independence;weak convergence 
JEL:  C12 C14 C52 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:dgr:kubcen:200680&r=ecm 
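The differencing idea in this abstract can be illustrated with a small sketch. This is a simple rank-based stand-in, not the authors' empirical-process statistics: after sorting on X, differencing neighboring Y's removes the smooth trend m, leaving error differences whose scale can be checked against X.

```python
import numpy as np

def neighbor_difference_proxies(x, y):
    """After sorting on X, successive differences (Y[i+1] - Y[i]) / sqrt(2)
    approximate differences of the errors, because m(X[i+1]) - m(X[i]) is
    negligible for smooth m when neighboring X's are close."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    return xs[:-1], np.diff(ys) / np.sqrt(2.0)

def rank_corr(a, b):
    """Spearman rank correlation, computed from scratch."""
    ra = np.argsort(np.argsort(a)).astype(float)
    rb = np.argsort(np.argsort(b)).astype(float)
    ra = (ra - ra.mean()) / ra.std()
    rb = (rb - rb.mean()) / rb.std()
    return float(np.mean(ra * rb))

def dependence_stat(x, y):
    """Rank correlation between X and the absolute difference proxies:
    close to zero when the errors are independent of X, large when the
    error scale varies with X."""
    xs, d = neighbor_difference_proxies(x, y)
    return rank_corr(xs, np.abs(d))
```

Under independence the statistic hovers near zero; when the error scale grows with X, the absolute differences co-move with X and the statistic is clearly positive.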
By:  Einmahl,John H.J.; Van Keilegom,Ingrid (Tilburg University, Center for Economic Research) 
Abstract:  Consider the location-scale regression model Y = m(X) + σ(X)e, where the error e is independent of the covariate X, and m and σ are smooth but unknown functions. We construct tests for the validity of this model and show that the asymptotic limits of the proposed test statistics are distribution free. We also investigate the finite sample properties of the tests through a simulation study, and we apply the tests in the analysis of data on food expenditures. 
Keywords:  62G08;62G10;62G20;62G30;60F17; Bootstrap;empirical process;goodness-of-fit;location-scale regression;model diagnostics;nonparametric regression;test for independence;weak convergence 
JEL:  C12 C14 C52 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:dgr:kubcen:200679&r=ecm 
By:  Michiel D. de Pooter (Faculty of Economics, Erasmus Universiteit Rotterdam); René Segers (Faculty of Economics, Erasmus Universiteit Rotterdam); Herman K. van Dijk (Faculty of Economics, Erasmus Universiteit Rotterdam) 
Abstract:  Several lessons learned from a Bayesian analysis of basic economic time series models by means of the Gibbs sampling algorithm are presented. Models include the Cochrane-Orcutt model for serial correlation, the Koyck distributed lag model, the Unit Root model, the Instrumental Variables model and, as Hierarchical Linear Mixed Models, the State-Space model and the Panel Data model. We discuss issues involved when drawing Bayesian inference on regression parameters and variance components, in particular when some parameters have substantial posterior probability near the boundary of the parameter region, and show that one should carefully scan the shape of the posterior density function. Analytical, graphical and empirical results are used along the way. 
Keywords:  Gibbs sampler; MCMC; serial correlation; nonstationarity; reduced rank models; state-space models; random effects panel data models 
JEL:  C11 C15 C22 C23 C30 
Date:  2006–08–31 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:20060076&r=ecm 
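The Gibbs sampling algorithm the abstract refers to alternates draws from each parameter's full conditional distribution. A minimal toy example, not one of the paper's models: sampling the mean and variance of a normal sample with a flat prior on the mean and an assumed inverse-gamma prior on the variance.

```python
import numpy as np

def gibbs_normal(y, n_iter=5000, burn=1000, seed=0):
    """Gibbs sampler for y_i ~ N(mu, sigma^2) with a flat prior on mu and
    an inverse-gamma(a0, b0) prior on sigma^2 (hyperparameters assumed
    here for illustration)."""
    rng = np.random.default_rng(seed)
    n, ybar = len(y), float(np.mean(y))
    a0, b0 = 2.0, 1.0                 # assumed prior hyperparameters
    mu, sig2 = ybar, float(np.var(y))
    draws = []
    for it in range(n_iter):
        # mu | sigma^2, y  ~  N(ybar, sigma^2 / n)
        mu = rng.normal(ybar, np.sqrt(sig2 / n))
        # sigma^2 | mu, y  ~  InvGamma(a0 + n/2, b0 + 0.5 * sum((y - mu)^2))
        a = a0 + n / 2.0
        b = b0 + 0.5 * np.sum((y - mu) ** 2)
        sig2 = 1.0 / rng.gamma(a, 1.0 / b)
        if it >= burn:
            draws.append((mu, sig2))
    return np.array(draws)
```

Scanning the retained draws (e.g. histogramming each column) is exactly the kind of posterior-shape inspection the abstract recommends, especially when mass piles up near a boundary of the parameter region.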
By:  Giulietti, Monica (Aston Business School, University of Aston); Otero, Jesus (Facultad de Economia, Colombia); Smith, Jeremy (Department of Economics, University of Warwick) 
Abstract:  The panel variant of the KPSS tests developed by Hadri (2000) for the null of stationarity suffers from size distortions in the presence of cross section dependence. However, by applying the bootstrap methodology, we find that these tests are approximately correctly sized. 
Keywords:  Heterogeneous dynamic panels ; Monte Carlo ; bootstrap ; unit root tests ; cross section dependence. 
JEL:  C12 C15 C22 C23 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:wrk:warwec:758&r=ecm 
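The bootstrap logic can be sketched as follows. This is a simplified illustration, not the paper's procedure: the KPSS statistic below uses a naive variance in place of a long-run variance estimator, and the bootstrap resamples entire time periods (rows) of the demeaned panel, which preserves cross-section dependence within each period but ignores serial dependence.

```python
import numpy as np

def kpss_stat(y):
    """Simplified KPSS level statistic: partial sums of the demeaned
    series, scaled by a naive variance (illustrative sketch only)."""
    e = y - y.mean()
    s = np.cumsum(e)
    T = len(y)
    return np.sum(s ** 2) / (T ** 2 * np.var(e))

def hadri_bootstrap_pvalue(panel, n_boot=199, seed=0):
    """panel: T x N array. Bootstrap the Hadri-type mean-KPSS statistic
    by drawing whole rows of the demeaned panel with replacement, so the
    cross-section dependence structure is kept intact."""
    rng = np.random.default_rng(seed)
    T, N = panel.shape
    stat = np.mean([kpss_stat(panel[:, i]) for i in range(N)])
    resid = panel - panel.mean(axis=0)       # demeaned under the null
    count = 0
    for _ in range(n_boot):
        idx = rng.integers(0, T, size=T)     # iid rows: stationary by construction
        boot = resid[idx, :]
        bstat = np.mean([kpss_stat(boot[:, i]) for i in range(N)])
        if bstat >= stat:
            count += 1
    return (count + 1) / (n_boot + 1)
```

A panel of random walks is decisively rejected, while a cross-sectionally dependent stationary panel is not pushed mechanically toward rejection.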
By:  Alfredo García Hiernaux (Universidad Complutense de Madrid. Facultad de CC. Económicas y Empresariales. Dpto. Fundamentos del Análisis Económico II.); Miguel Jerez (Universidad Complutense de Madrid. Facultad de CC. Económicas y Empresariales. Dpto. Fundamentos del Análisis Económico II.); José Casals (Universidad Complutense de Madrid. Facultad de CC. Económicas y Empresariales. Dpto. Fundamentos del Análisis Económico II.) 
Abstract:  We propose two fast, stable and consistent methods to estimate time series models expressed in their equivalent state-space form. They are useful both to obtain adequate initial conditions for a maximum-likelihood iteration and to provide final estimates when maximum likelihood is considered inadequate or costly. The state-space foundation of these procedures implies that they can estimate any linear fixed-coefficients model, such as ARIMA, VARMAX or structural time series models. The computational and finite-sample performance of both methods is very good, as a simulation exercise shows. 
Date:  2005 
URL:  http://d.repec.org/n?u=RePEc:ucm:doicae:0504&r=ecm 
By:  Karl H. Schlag 
Abstract:  We show how to derive nonparametric estimates from results for Bernoulli distributions, provided the means are the only parameters of interest. The only information is that the support of each random variable is contained in a known bounded set. Examples include presenting minimax risk properties of the sample mean and a minimax regret estimate for costly treatment. With the same method we are able to design nonparametric exact statistical inference tests for means using existing uniformly most powerful (unbiased) tests for Bernoulli distributions. These tests are parameter most powerful in the sense that there is no alternative test with the same size that yields higher power over any set of alternatives that only depends on the means. As examples we present for the first time an exact unbiased nonparametric test for a single mean and for the equality of two means (both for independent samples and for paired experiments). We also show how to improve performance of Hannan consistent rules. 
Keywords:  exact, distribution-free, nonparametric inference, binomial average, finite sample theory, Hannan consistency, universal consistency 
JEL:  C12 C44 C14 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:eui:euiwps:eco2006/26&r=ecm 
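The reduction to Bernoulli distributions can be sketched for a one-sided test of a single mean: each bounded observation is replaced by a Bernoulli draw whose success probability is the observation rescaled to [0, 1], after which an exact binomial test applies. This is an illustrative sketch of the randomization idea, not the paper's exact (unbiased, two-sided) constructions.

```python
import numpy as np
from math import comb

def binom_sf(k, n, p):
    """Exact upper tail P(Bin(n, p) >= k)."""
    return sum(comb(n, j) * p ** j * (1.0 - p) ** (n - j) for j in range(k, n + 1))

def randomized_binomial_test(x, p0, lo=0.0, hi=1.0, seed=0):
    """One-sided exact test for the mean of X with support in [lo, hi]:
    replace each X_i by a Bernoulli draw with success probability
    (X_i - lo) / (hi - lo), then apply the exact binomial test against
    p0, the hypothesized mean rescaled to [0, 1]. Returns the p-value."""
    rng = np.random.default_rng(seed)
    z = (np.asarray(x) - lo) / (hi - lo)
    b = rng.random(len(z)) < z          # Bernoulli randomization step
    return binom_sf(int(b.sum()), len(z), p0)
```

The randomization injects extra noise, but it is exactly what lets the known finite-sample binomial theory carry over to arbitrary bounded distributions.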
By:  Ole E. Barndorff-Nielsen; Peter R. Hansen; Asger Lunde; Neil Shephard 
Abstract:  In a recent paper we have introduced the class of realised kernel estimators of the increments of quadratic variation in the presence of noise. We showed that this estimator is consistent and derived its limit distribution under various assumptions on the kernel weights. In this paper we extend our analysis, looking at the class of subsampled realised kernels and we derive the limit theory for this class of estimators. We find that subsampling is highly advantageous for estimators based on discontinuous kernels, such as the truncated kernel. For kinked kernels, such as the Bartlett kernel, we show that subsampling is impotent, in the sense that subsampling has no effect on the asymptotic distribution. Perhaps surprisingly, for the efficient smooth kernels, such as the Parzen kernel, we show that subsampling is harmful as it increases the asymptotic variance. We also study the performance of subsampled realised kernels in simulations and in empirical work. 
Keywords:  Bipower variation; Long run variance estimator; Market frictions; Quadratic variation; Realised kernel; Realised variance; Subsampling. 
JEL:  C13 C22 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:sbs:wpsefe:2006fe06&r=ecm 
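A member of the realised kernel class mentioned here, with the kinked Bartlett weights, can be written in a few lines. This is a generic sketch of the estimator class (no subsampling, and not the authors' preferred implementation): realised autocovariances of the high-frequency returns are added to the realised variance with weights w_h = 1 - h/(H+1).

```python
import numpy as np

def realised_kernel(returns, H):
    """Realised kernel with Bartlett weights w_h = 1 - h/(H+1):
    RK = gamma_0 + sum_h 2 * w_h * gamma_h, where gamma_h is the h-th
    realised autocovariance of the intraday returns."""
    x = np.asarray(returns)
    rk = np.sum(x * x)                      # gamma_0: realised variance
    for h in range(1, H + 1):
        w = 1.0 - h / (H + 1.0)
        gamma_h = np.sum(x[h:] * x[:-h])    # realised autocovariance at lag h
        rk += 2.0 * w * gamma_h
    return rk
```

In a simulation with market-microstructure noise, the kernel largely cancels the noise-induced bias that inflates the plain realised variance.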
By:  Massimo Franchi (Department of Economics, University of Copenhagen) 
Abstract:  We study the algebraic structure of an I(d) vector autoregressive process, where d is restricted to be an integer. This is useful to characterize its polynomial cointegrating relations and its moving average representation, that is, to prove a version of the Granger representation theorem valid for I(d) vector autoregressive processes. 
Keywords:  vector autoregressive processes; unit roots; Granger representation theorem; cointegration 
JEL:  C32 
Date:  2006–08 
URL:  http://d.repec.org/n?u=RePEc:kud:kuiedp:0616&r=ecm 
By:  Raymond Kan; Cesare Robotti 
Abstract:  We discuss the impact of different formulations of asset pricing models on the outcome of specification tests that are performed using excess returns. It is generally believed that when only excess returns are used for testing asset pricing models, the mean of the stochastic discount factor (SDF) does not matter. We show that the mean of the candidate SDF is only irrelevant when the model is correct. When the model is misspecified, the mean of the SDF can be a very important determinant of the specification test statistic, and it also heavily influences the relative rankings of competing asset pricing models. We point out that the popular way of specifying the SDF as a linear function of the factors is problematic because the specification test statistic is not invariant to an affine transformation of the factors and the SDFs of competing models can have very different means. In contrast, an alternative specification that defines the SDF as a linear function of the demeaned factors is free from these two problems and is more appropriate for model comparison. In addition, we suggest that a modification of the traditional Hansen-Jagannathan distance (HJ distance) is needed when only excess returns are used. The modified HJ distance uses the inverse of the covariance matrix (instead of the second moment matrix) of excess returns as the weighting matrix to aggregate pricing errors. We provide asymptotic distributions of the modified HJ distance and of the traditional HJ distance based on the demeaned SDF under the correctly specified model and the misspecified models. Finally, we propose a simple methodology for computing the standard errors of the estimated SDF parameters that are robust to model misspecification. 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:fip:fedawp:200610&r=ecm 
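The modified HJ distance described above has a direct sample analogue. The sketch below (a plain sample-analogue illustration, with simulated names and no standard errors) uses the SDF linear in demeaned factors, m_t = 1 - (f_t - fbar)'λ, and weights the pricing errors of the excess returns by the inverse of their sample covariance matrix.

```python
import numpy as np

def modified_hj_distance(R, f):
    """Sample modified HJ distance. R: T x N excess returns, f: T x K
    factors. With m_t = 1 - (f_t - fbar)' lam, the pricing errors are
    e(lam) = mean(R) - D @ lam with D = (1/T) R' (f - fbar); the distance
    minimizes e' W e with W the inverse covariance matrix of R."""
    T, N = R.shape
    fd = f - f.mean(axis=0)                         # demeaned factors
    W = np.linalg.inv(np.cov(R, rowvar=False))      # weighting matrix
    D = R.T @ fd / T
    mbar = R.mean(axis=0)
    A = D.T @ W @ D                                 # GLS normal equations
    lam = np.linalg.solve(A, D.T @ W @ mbar)
    e = mbar - D @ lam                              # minimized pricing errors
    return float(np.sqrt(e @ W @ e)), lam
```

A correctly priced factor model yields a distance near zero, while adding pricing errors (alphas) that the single factor cannot absorb pushes the distance up, which is what the specification test exploits.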
By:  Kleijnen,Jack P.C. (Tilburg University, Center for Economic Research) 
Abstract:  Generalized Response Surface Methodology (GRSM) is a novel general-purpose metaheuristic based on Box and Wilson's Response Surface Methodology (RSM). Both GRSM and RSM estimate local gradients to search for the optimal solution. These gradients are estimated from local first-order polynomials. GRSM, however, uses these gradients to estimate a better search direction than the steepest ascent direction used by RSM. Moreover, GRSM allows multiple responses, selecting one response as the goal and the other responses as constrained variables. Finally, these estimated gradients may be used to test whether the estimated solution is indeed optimal. The focus of this paper is the optimization of simulated systems. 
Keywords:  experimental design;multivariate regression analysis;least squares;Karush-Kuhn-Tucker conditions;bootstrap 
JEL:  C0 C1 C9 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:dgr:kubcen:200677&r=ecm 
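The classic RSM building block that GRSM modifies can be sketched in a few lines: fit a local first-order polynomial to simulated responses around the current point and move along the estimated gradient (steepest ascent). The design and step size below are illustrative choices, not the paper's.

```python
import numpy as np

def estimated_gradient(X, y):
    """Fit the local first-order polynomial y ~ b0 + x'b by least squares
    over the design points X and return b, the estimated gradient."""
    Z = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return coef[1:]

def steepest_ascent_step(x0, X, y, step=0.1):
    """One steepest-ascent move from the current centre point x0
    (RSM's search direction; GRSM would adjust this direction)."""
    g = estimated_gradient(X, y)
    return x0 + step * g / np.linalg.norm(g)
```

With a 2^2 factorial design plus a centre point around the current solution, the fitted slopes recover the local gradient and the step moves toward the optimum.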
By:  Sumon Kumar Bhaumik (Brunel University and IZA Bonn); Ira N. Gang (Rutgers University and IZA Bonn); MyeongSu Yun (Tulane University and IZA Bonn) 
Abstract:  This paper decomposes differences in poverty incidence (head count ratio) using estimates from a regression equation, synthesizing the approaches proposed in World Bank (2003) and Yun (2004). A significance test is developed for characteristics and coefficients effects when decomposing differences in poverty incidence. The proposed method is implemented for studying differences in poverty incidence between Serbians and Albanians in Kosovo using the Living Standard Measurement Survey. 
Keywords:  poverty incidence, head count ratio, OLS, probit, decomposition 
JEL:  C20 I30 
Date:  2006–08 
URL:  http://d.repec.org/n?u=RePEc:iza:izadps:dp2262&r=ecm 
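The characteristics/coefficients split can be illustrated with a simple two-fold decomposition of predicted head count ratios from a probit index. This is a generic sketch in the spirit of the abstract, not the paper's synthesized method, and it omits the significance tests the paper develops.

```python
import numpy as np
from math import erf

def probit_cdf(z):
    """Standard normal CDF, elementwise."""
    return 0.5 * (1.0 + np.vectorize(erf)(np.asarray(z) / np.sqrt(2.0)))

def headcount(X, beta):
    """Predicted poverty incidence: average probit probability of being poor."""
    return float(np.mean(probit_cdf(X @ beta)))

def decompose_headcount(XA, betaA, XB, betaB):
    """Split the difference in predicted head count ratios between groups
    A and B into a characteristics effect (different X's, evaluated at A's
    coefficients) and a coefficients effect (different betas, evaluated at
    B's characteristics)."""
    characteristics = headcount(XA, betaA) - headcount(XB, betaA)
    coefficients = headcount(XB, betaA) - headcount(XB, betaB)
    return characteristics, coefficients
```

By construction the two effects sum exactly to the total difference in predicted incidence, which is the accounting identity any such decomposition must satisfy.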
By:  Bruce Mizrach (Rutgers University) 
Abstract:  This entry for the New Palgrave covers developments in nonlinear time series analysis over the last 25 years. 
Keywords:  nonlinear, time series, analysis 
JEL:  C22 C14 C45 
Date:  2006–02–25 
URL:  http://d.repec.org/n?u=RePEc:rut:rutres:200604&r=ecm 
By:  Marco Corazza (Department of Applied Mathematics, University of Venice); A.G. Malliaris (Department of Economics, Loyola University of Chicago); Elisa Scalco (Department of Applied Mathematics, University of Venice) 
Abstract:  Comovements among asset prices have received a lot of attention for several reasons. For example, comovements are important in cross-hedging and cross-speculation; they determine capital allocation both domestically and in international mean-variance portfolios, and they are also useful in investigating the extent of integration among financial markets. In this paper we propose a new methodology for the non-linear modelling of bivariate comovements. Our approach extends the ones presented in the recent literature. In fact, our methodology, outlined in three steps, allows the evaluation and the statistical testing of non-linearly driven comovements between two given random variables. Moreover, when such a bivariate dependence relationship is detected, our approach solves for a polynomial approximation. We apply our three-step methodology to the time series of energy-related asset prices. Finally, we exploit this dependence relationship and its polynomial approximation to obtain analytical approximations of the Greeks for European call and put options in terms of an asset whose price comoves with the price of the underlying asset. 
Keywords:  Comovement, asset prices, bivariate dependence, nonlinearity, t-test, polynomial approximation, energy asset, (vanilla) European call and put options, cross-Greeks. 
JEL:  C59 G19 Q49 
Date:  2006–09 
URL:  http://d.repec.org/n?u=RePEc:vnm:wpaper:137&r=ecm 
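The polynomial approximation and t-testing step can be illustrated with a plain least-squares fit. This is a generic sketch, not the authors' exact three-step scheme: regress one series on powers of the other and report each coefficient with its t-statistic, so the contribution of each power to the comovement can be tested.

```python
import numpy as np

def polynomial_comovement(x, y, degree=2):
    """Fit y ~ p(x) for a degree-`degree` polynomial by least squares and
    return (coefficients, t-statistics), highest power first."""
    Z = np.vander(np.asarray(x), degree + 1)      # columns: x^d, ..., x, 1
    y = np.asarray(y)
    coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ coef
    n, k = Z.shape
    s2 = resid @ resid / (n - k)                  # residual variance
    se = np.sqrt(np.diag(s2 * np.linalg.inv(Z.T @ Z)))
    return coef, coef / se
```

A significant higher-order coefficient is evidence of a non-linearly driven comovement, which is exactly when a linear correlation measure understates the dependence.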
By:  Aßmann, Christian; Hogrefe, Jens; Liesenfeld, Roman 
Abstract:  Empirical evidence suggests a sharp volatility decline of the growth in U.S. gross domestic product (GDP) in the mid-1980s. Using Bayesian methods, we analyze whether a volatility reduction can also be detected for the German GDP. Since statistical inference for volatility processes critically depends on the specification of the conditional mean, we assume different time series models for GDP growth in our volatility analysis. We find evidence across all specifications for an output stabilization around 1993, after the downturn following the boom associated with the German reunification. However, the different GDP models lead to alternative characterizations of this stabilization: in a linear AR model it shows up as smaller shocks hitting the economy, while regime-switching models reveal further sources of stabilization, namely a narrowing gap between growth rates during booms and recessions, or flatter trajectories characterizing the GDP growth rates. Furthermore, it appears that the reunification interrupted an output stabilization already emerging around 1987. 
Keywords:  business cycle models, Gibbs sampling, Markov Chain Monte Carlo, regime switching, structural breaks 
JEL:  C11 C15 C32 E32 
Date:  2005 
URL:  http://d.repec.org/n?u=RePEc:zbw:cauewp:4134&r=ecm 
By:  Kenichi Mitsui (Doctoral Candidate, Osaka University); Yoshio Tabata (Graduate School of Business Administration, Nanzan University) 
Abstract:  In finance, the modeling of a correlation matrix is an important problem. In particular, correlation matrices obtained from market data are contaminated by noise. Here we apply denoising based on wavelet analysis to a noisy correlation matrix, which is generated by a parametric function with random parameters. First, we show that two properties of the correlation matrix, symmetry and unit diagonal elements, are preserved by the denoising procedure, and we demonstrate the efficiency of the denoising procedure through numerical experiments. We propose wavelet denoising as an effective method for reducing the noise in a noisy correlation matrix. 
Keywords:  correlation matrix, calibration, rank reduction, denoising, wavelet analysis 
JEL:  C51 C61 C63 G32 
Date:  2006–09 
URL:  http://d.repec.org/n?u=RePEc:osk:wpaper:0626&r=ecm 
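Wavelet denoising of a correlation matrix while preserving symmetry and the unit diagonal can be sketched with a one-level 2-D Haar transform and soft thresholding. This is a toy stand-in for the shrinkage described in the abstract; the authors' wavelet and threshold choices may differ.

```python
import numpy as np

def haar2d(a):
    """One level of the 2-D Haar transform (both axes); a has even size."""
    def step(m):
        avg = (m[:, 0::2] + m[:, 1::2]) / np.sqrt(2.0)
        dif = (m[:, 0::2] - m[:, 1::2]) / np.sqrt(2.0)
        return np.hstack([avg, dif])
    return step(step(a).T).T

def ihaar2d(c):
    """Exact inverse of haar2d."""
    def istep(m):
        h = m.shape[1] // 2
        avg, dif = m[:, :h], m[:, h:]
        out = np.empty_like(m)
        out[:, 0::2] = (avg + dif) / np.sqrt(2.0)
        out[:, 1::2] = (avg - dif) / np.sqrt(2.0)
        return out
    return istep(istep(c.T).T)

def denoise_correlation(corr, thresh=0.05):
    """Soft-threshold the Haar detail coefficients of a noisy correlation
    matrix, invert the transform, then restore symmetry and the unit
    diagonal."""
    c = haar2d(corr)
    n = corr.shape[0] // 2
    detail = np.ones_like(c, dtype=bool)
    detail[:n, :n] = False                  # keep the approximation block
    c[detail] = np.sign(c[detail]) * np.maximum(np.abs(c[detail]) - thresh, 0.0)
    out = ihaar2d(c)
    out = (out + out.T) / 2.0               # re-symmetrize
    np.fill_diagonal(out, 1.0)              # restore unit diagonal
    return out
```

With a zero threshold the round trip reproduces the input exactly (the Haar transform is orthogonal), and for any threshold the output is symmetric with a unit diagonal, the two preservation properties the abstract highlights.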