
on Econometric Time Series 
By:  Calista Cheung; Frédérick Demers 
Abstract:  This paper evaluates the performance of static and dynamic factor models for forecasting Canadian real output growth and core inflation on a quarterly basis. We extract the common component from a large number of macroeconomic indicators, and use the estimates to compute out-of-sample forecasts under a recursive and a rolling scheme with different window sizes. Factor-based forecasts are compared with AR(p) models as well as IS- and Phillips-curve models. We find that factor models can improve forecast accuracy relative to standard benchmark models for horizons of up to 8 quarters. Forecasts from our proposed factor models are also less prone to large errors, in particular as the horizon increases. We further show that the choice of sampling scheme has a large influence on overall forecast accuracy, with the smallest rolling-window samples generating superior results to larger samples, implying that using "limited-memory" estimators contributes to improving the quality of the forecasts. 
Keywords:  Econometric and statistical methods 
JEL:  C32 E37 
Date:  2007 
URL:  http://d.repec.org/n?u=RePEc:bca:bocawp:078&r=ets 
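As a rough illustration of the recursive versus rolling sampling schemes compared in the abstract above (this is not the paper's factor models: the AR(1) target, the structural break, and the window size are invented for the sketch), a limited-memory rolling window can adapt to instability that an expanding recursive sample averages away:

```python
import numpy as np

rng = np.random.default_rng(0)

# AR(1) with a mean shift halfway through the sample: a setting where
# "limited-memory" (rolling) estimators can beat recursive (expanding) ones.
T = 300
y = np.empty(T)
y[0] = 0.0
for t in range(1, T):
    mu_prev = 0.0 if t - 1 < 150 else 2.0
    mu = 0.0 if t < 150 else 2.0
    y[t] = mu + 0.5 * (y[t - 1] - mu_prev) + rng.normal()

def ar1_forecast(sample, x_last):
    """One-step AR(1) forecast with intercept, fitted by OLS on `sample`."""
    x, z = sample[:-1], sample[1:]
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    return beta[0] + beta[1] * x_last

errs = {"recursive": [], "rolling": []}
w = 60  # rolling-window size (illustrative)
for t in range(200, T - 1):
    f_rec = ar1_forecast(y[:t + 1], y[t])          # expanding sample
    f_rol = ar1_forecast(y[t - w + 1:t + 1], y[t])  # last w observations only
    errs["recursive"].append((y[t + 1] - f_rec) ** 2)
    errs["rolling"].append((y[t + 1] - f_rol) ** 2)

rmse = {k: float(np.sqrt(np.mean(v))) for k, v in errs.items()}
print(rmse)
```

With the break in the data-generating process, the rolling scheme discards the pre-break observations that bias the recursive fit.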
By:  Juan José Dolado; Jesús Gonzalo; Laura Mayoral 
Abstract:  This paper analyses how to test I(1) against I(d), d<1, in the presence of deterministic components in the DGP, by extending a Wald-type test, i.e., the (Efficient) Fractional Dickey-Fuller (EFDF) test, to this case. Tests of these hypotheses are important in many economic applications where it is crucial to distinguish between permanent and transitory shocks, because I(d) processes with d<1 are mean-reverting. In addition, the inclusion of deterministic components is a necessary extension for analyzing most macroeconomic variables. We show how simple the implementation of the EFDF test is in these situations and argue that, in general, it has better properties than LM tests. Finally, an empirical application is provided where the EFDF approach allowing for deterministic components is used to test for long memory in the GDP p.c. of several OECD countries, an issue that has important consequences for discriminating between growth theories, and on which there has been some controversy. 
Date:  2006–12 
URL:  http://d.repec.org/n?u=RePEc:cte:werepe:we20061221&r=ets 
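The I(d) processes at issue above are built from the fractional difference operator (1-L)^d. A minimal sketch of its binomial expansion, and of simulating an I(d) series by inverting the filter (this is generic fractional-integration machinery, not the EFDF test itself; the truncation at the sample start and the choice d=0.4 are illustrative):

```python
import numpy as np

def fracdiff_weights(d, n):
    """First n coefficients of (1 - L)^d = sum_k w_k L^k,
    via the recursion w_k = w_{k-1} * (k - 1 - d) / k."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

rng = np.random.default_rng(1)
n, d = 1000, 0.4

# Simulate y_t = (1 - L)^{-d} eps_t, truncated at the sample start.
eps = rng.normal(size=n)
w_inv = fracdiff_weights(-d, n)  # weights of the inverse filter (1 - L)^{-d}
y = np.array([w_inv[:t + 1][::-1] @ eps[:t + 1] for t in range(n)])

# For 0 < d < 1 the weights of (1 - L)^d sum toward (1 - 1)^d = 0,
# decaying hyperbolically: the hallmark of mean-reverting long memory.
print(fracdiff_weights(d, 5000).sum())
```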
By:  Kleijnen,Jack P.C. (Tilburg University, Center for Economic Research) 
Abstract:  In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. Statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic theory assumes a single simulation response that is normally and independently distributed with a constant variance; moreover, the regression (meta)model of the simulation model's I/O behaviour is assumed to have residuals with zero means. This article addresses the following questions: (i) How realistic are these assumptions, in practice? (ii) How can these assumptions be tested? (iii) If assumptions are violated, can the simulation's I/O data be transformed such that the assumptions do hold? (iv) If not, which alternative statistical methods can then be applied? 
Keywords:  metamodels; experimental designs; generalized least squares; multivariate analysis; normality; jackknife; bootstrap; heteroscedasticity; common random numbers; validation 
JEL:  C0 C1 C9 
Date:  2007 
URL:  http://d.repec.org/n?u=RePEc:dgr:kubcen:20079&r=ets 
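A minimal sketch of the DOE-plus-regression approach the abstract advocates over one-factor-at-a-time analysis (the 2^2 factorial design, the stand-in "simulation model", and the replicate count are all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# 2^2 factorial design in coded -1/+1 levels, m replicates per design point.
design = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
m = 50

def simulate(x1, x2):
    """Stand-in for a simulation model: true I/O behaviour is linear + noise."""
    return 3.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(scale=0.5)

X_rows, y = [], []
for x1, x2 in design:
    for _ in range(m):
        X_rows.append([1.0, x1, x2])
        y.append(simulate(x1, x2))
X, y = np.array(X_rows), np.array(y)

# First-order regression metamodel fitted by ordinary least squares.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Classic assumption check: residual means per design point should be near
# zero; a large value flags lack of fit (e.g. an omitted interaction term).
per_point_means = resid.reshape(4, m).mean(axis=1)
print(beta, per_point_means)
```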
By:  Bent Nielsen (Nuffield College, Oxford University); Eric Engler (Dept of Economics, Oxford University) 
Abstract:  The empirical process of the residuals from general autoregressions is investigated. If an intercept is included in the regression, the empirical process is asymptotically Gaussian and free of nuisance parameters. This contrasts with the known result that in the unit root case without intercept the empirical process is asymptotically non-Gaussian. The result is used to establish asymptotic theory for the Kolmogorov-Smirnov test, Probability-Probability plots, and Quantile-Quantile plots. The link between sample moments and the empirical process of the residuals is established and used to derive the properties of cumulant-based tests for normality, such as the Jarque-Bera test. 
Keywords:  Autoregression, Empirical process, Kolmogorov-Smirnov test, Probability-Probability plots, Quantile-Quantile plots, Test for normality. 
Date:  2007–01–17 
URL:  http://d.repec.org/n?u=RePEc:nuf:econwp:0701&r=ets 
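A minimal sketch of the setting above: fit an autoregression with an intercept by OLS and compute the Kolmogorov-Smirnov distance and Jarque-Bera statistic from the residuals (the AR(1) design and all parameter values are illustrative; the statistics are computed from their textbook formulas, not from the paper's asymptotic theory):

```python
import math
import numpy as np

rng = np.random.default_rng(3)

# AR(1) with intercept and Gaussian errors: y_t = c + a*y_{t-1} + e_t.
n, c, a = 500, 1.0, 0.7
y = np.empty(n)
y[0] = c / (1 - a)
for t in range(1, n):
    y[t] = c + a * y[t - 1] + rng.normal()

# OLS fit with an intercept included, as the asymptotics above require.
X = np.column_stack([np.ones(n - 1), y[:-1]])
beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
e = y[1:] - X @ beta
z = (e - e.mean()) / e.std()  # standardized residuals

# Kolmogorov-Smirnov distance between the residual EDF and the N(0,1) CDF.
phi = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
zs = np.sort(z)
F = np.array([phi(v) for v in zs])
emp_hi = np.arange(1, len(zs) + 1) / len(zs)
emp_lo = np.arange(0, len(zs)) / len(zs)
ks = max(np.abs(F - emp_hi).max(), np.abs(F - emp_lo).max())

# Jarque-Bera statistic from sample skewness S and kurtosis K.
S = (z ** 3).mean()
K = (z ** 4).mean()
jb = len(z) / 6 * (S ** 2 + (K - 3) ** 2 / 4)
print(ks, jb)
```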
By:  Bent Nielsen (Nuffield College, Oxford University); Carlos Caceres (Nuffield College, Oxford University) 
Abstract:  In this paper we present a general result concerning the convergence to stochastic integrals with nonlinear integrands. The key finding is a generalization of Chan and Wei's (1988) Theorem 2.4 and of Ibragimov and Phillips' (2004) Theorem 8.2. This result is necessary for analysing the asymptotic properties of misspecification tests when applied to a unit root process, for which Wooldridge (1999) noted that the existing results in the literature were not sufficient. 
Keywords:  nonstationarity, unit roots, convergence, autoregressive processes, martingales, stochastic integrals, nonlinearity. 
Date:  2007–02–12 
URL:  http://d.repec.org/n?u=RePEc:nuf:econwp:0702&r=ets 
By:  Collet J.J.; Fadili J.M. (School of Economics and Finance, Queensland University of Technology) 
Abstract:  In this paper, we propose to study the synthesis of Gegenbauer processes using the wavelet packet transform. In order to simulate a 1-factor Gegenbauer process, we introduce an original algorithm, inspired by the one proposed by Coifman and Wickerhauser [CW92], to adaptively search for the best ortho-basis in the wavelet packet library, in which the covariance matrix of the transformed process is nearly diagonal. Our method clearly outperforms the one recently proposed by [Whi01], is very fast, does not depend on the wavelet choice, and is not very sensitive to the length of the time series. From these first results we propose an algorithm to build bases to simulate k-factor Gegenbauer processes. Given how simple it is to program and run, we believe the general practitioner will be attracted to our simulator. Finally, we evaluate the approximation error introduced by treating the wavelet packet coefficients as uncorrelated. An empirical study is carried out which supports our results. 
Keywords:  Gegenbauer process, Wavelet packet transform, Best basis, Autocovariance 
URL:  http://d.repec.org/n?u=RePEc:qut:dpaper:190&r=ets 
By:  Adam Clements; Scott White (School of Economics and Finance, Queensland University of Technology) 
Abstract:  This paper considers the size effect, where volatility dynamics are dependent upon the current level of volatility, within a stochastic volatility framework. A nonlinear filtering algorithm is proposed in which the dynamics of the latent variable are conditioned on its current level. This allows for the estimation of a stochastic volatility model whose dynamics depend on the level of volatility. Empirical results suggest that volatility dynamics are in fact influenced by the level of prevailing volatility: when volatility is relatively low, it is extremely persistent with little noise; when volatility is high, it is less persistent and considerably noisier. 
Keywords:  Nonlinear filtering, stochastic volatility, size effect, threshold 
URL:  http://d.repec.org/n?u=RePEc:qut:dpaper:191&r=ets 
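A minimal simulation sketch of the size effect described above: log-volatility follows different dynamics on either side of a threshold (this is not the paper's filtering algorithm or estimated model; the two-regime parameterization and threshold are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)

# Threshold SV sketch: log-volatility h_t is highly persistent with little
# noise below the threshold, less persistent and noisier above it.
n, thresh = 2000, 0.0
phi_lo, sig_lo = 0.98, 0.05  # low-volatility regime
phi_hi, sig_hi = 0.80, 0.40  # high-volatility regime

h = np.zeros(n)
r = np.zeros(n)
for t in range(1, n):
    if h[t - 1] < thresh:
        h[t] = phi_lo * h[t - 1] + sig_lo * rng.normal()
    else:
        h[t] = phi_hi * h[t - 1] + sig_hi * rng.normal()
    r[t] = np.exp(h[t] / 2) * rng.normal()  # returns with stochastic volatility

print(h.min(), h.max(), r.std())
```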
By:  Adam Clements; Scott White (School of Economics and Finance, Queensland University of Technology) 
Abstract:  This paper develops a computationally efficient filtering-based procedure for the estimation of the heavy-tailed SV model with leverage. While there are many accepted techniques for the estimation of standard SV models, incorporating these effects into an SV framework is difficult. Simulation evidence provided in this paper indicates that the proposed procedure outperforms competing approaches in terms of the accuracy of parameter estimation. In an empirical setting, it is shown how the individual effects of heavy tails and leverage can be isolated using standard likelihood ratio tests. 
URL:  http://d.repec.org/n?u=RePEc:qut:dpaper:192&r=ets 
By:  Anthony Murphy; Marwan Izzeldin 
Abstract:  We investigate the bootstrapped size and power properties of five long memory tests, including the modified R/S, KPSS and GPH tests. In small samples, the moving block bootstrap controls the empirical size of the tests. However, for these sample sizes, the power of bootstrapped tests against fractionally integrated alternatives is often a good deal lower than that of the asymptotic tests. In larger samples, the power of the five tests is good against common fractionally integrated alternatives: the pure FI case and FI with a stochastic volatility error. 
Keywords:  Moving block bootstrap; fractional integration 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:lan:wpaper:003091&r=ets 
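A minimal sketch of the moving block bootstrap used above, applied here to the lag-1 autocorrelation of an AR(1) rather than to the paper's long memory tests (the block length, sample size, and target statistic are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)

def moving_block_bootstrap(x, block_len, rng):
    """Resample overlapping blocks of length block_len, preserving
    short-range dependence within each block."""
    n = len(x)
    starts = rng.integers(0, n - block_len + 1,
                          size=int(np.ceil(n / block_len)))
    out = np.concatenate([x[s:s + block_len] for s in starts])
    return out[:n]

# Data: an AR(1) with moderate persistence.
n = 400
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.normal()

def acf1(z):
    """Sample lag-1 autocorrelation."""
    z = z - z.mean()
    return (z[:-1] @ z[1:]) / (z @ z)

boot = np.array([acf1(moving_block_bootstrap(x, 20, rng))
                 for _ in range(500)])
print(acf1(x), boot.mean(), boot.std())
```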
By:  Ivan Paya; Ioannis A. Venetis; A Duarte 
Abstract:  This paper finds evidence of fractional integration for a number of monthly ex post real interest rate series, using the GPH semiparametric estimator on data from fourteen European countries and the US. However, we pose empirical questions about certain time series requirements that emerge from fractional integration, and we find that they do not hold, pointing to "spurious" long memory and casting doubt on the theoretical origins of long memory in our sample. Common stochastic trends expressed as the sum of stationary past errors do not seem appropriate as an explanation of real interest rate covariation. From an economic perspective, our results suggest that most European countries exhibit faster real interest rate equalization with Germany than with the US. 
Keywords:  Real interest rate; Long memory; Fractional integration 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:lan:wpaper:004341&r=ets 
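A minimal sketch of the GPH log-periodogram estimator used in the paper above: regress the log periodogram on the log of 4 sin^2(lambda/2) over the first m Fourier frequencies, and take minus the slope as the estimate of d (the bandwidth rule m = n^{1/2} and the white-noise sanity check are illustrative choices, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(6)

def gph_estimate(x, m):
    """Log-periodogram (GPH) estimate of the memory parameter d,
    using the first m Fourier frequencies."""
    n = len(x)
    lam = 2 * np.pi * np.arange(1, m + 1) / n
    # Periodogram at the Fourier frequencies (frequency zero excluded).
    dft = np.fft.fft(x - x.mean())
    I = (np.abs(dft[1:m + 1]) ** 2) / (2 * np.pi * n)
    # Regress log I(lambda_j) on a constant and log(4 sin^2(lambda_j / 2));
    # the slope estimates -d.
    reg = np.log(4 * np.sin(lam / 2) ** 2)
    X = np.column_stack([np.ones(m), reg])
    beta, *_ = np.linalg.lstsq(X, np.log(I), rcond=None)
    return -beta[1]

# Sanity check on white noise, for which the true d is 0.
x = rng.normal(size=2048)
d_hat = gph_estimate(x, m=int(len(x) ** 0.5))
print(d_hat)
```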