nep-ets New Economics Papers
on Econometric Time Series
Issue of 2007‒02‒17
ten papers chosen by
Yong Yin
SUNY at Buffalo

  1. Evaluating Forecasts from Factor Models for Canadian GDP Growth and Core Inflation By Calista Cheung; Frédérick Demers
  2. Testing I(1) against I(d) alternatives with Wald Tests in the presence of deterministic components By Juan José Dolado; Jesús Gonzalo; Laura Mayoral
  3. Simulation experiments in practice: statistical design and regression analysis By Kleijnen, Jack P.C.
  4. The empirical process of autoregressive residuals By Bent Nielsen; Eric Engler
  5. Convergence to Stochastic Integrals with Non-linear integrands By Bent Nielsen; Carlos Caceres
  6. Simulation of Gegenbauer processes using wavelet packets By Collet J.J.; Fadili J.M.
  7. Non-linear filtering with state-dependent transition probabilities: A threshold (size effect) SV model By Adam Clements; Scott White
  8. Nonlinear Filtering for Stochastic Volatility Models with Heavy Tails and Leverage By Adam Clements; Scott White
  9. Bootstrapping long memory tests: some Monte Carlo results By Anthony Murphy; Marwan Izzeldin
  10. The long memory story of real interest rates. Can it be supported? By Ivan Paya; Ioannis A. Venetis; A Duarte

  1. By: Calista Cheung; Frédérick Demers
    Abstract: This paper evaluates the performance of static and dynamic factor models for forecasting Canadian real output growth and core inflation on a quarterly basis. We extract the common component from a large number of macroeconomic indicators and use the estimates to compute out-of-sample forecasts under a recursive and a rolling scheme with different window sizes. Factor-based forecasts are compared with AR(p) models as well as IS- and Phillips-curve models. We find that factor models can improve forecast accuracy relative to standard benchmark models for horizons of up to 8 quarters. Forecasts from our proposed factor models are also less prone to large errors, in particular as the horizon increases. We further show that the choice of sampling scheme has a large influence on overall forecast accuracy, with the smallest rolling-window samples generating superior results to larger samples, implying that "limited-memory" estimators help improve the quality of the forecasts.
    Keywords: Econometric and statistical methods
    JEL: C32 E37
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:bca:bocawp:07-8&r=ets
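    To fix ideas, here is a minimal Python sketch of the diffusion-index approach underlying such factor forecasts: principal components are extracted from a standardized indicator panel and the target is regressed on the current factors plus its own lags. This is a generic sketch, not the authors' model; the panel X, target y, and the factor/lag counts are illustrative assumptions.
```python
import numpy as np

def factor_forecast(X, y, h=1, k=2, p=1):
    """Direct h-step diffusion-index forecast:
    y_{t+h} = a + b'F_t + c_0 y_t + ... + c_{p-1} y_{t-p+1} + e_{t+h},
    with F_t the first k principal components of the panel X (T x N)."""
    Xs = (X - X.mean(0)) / X.std(0)           # standardize the indicators
    U, S, _ = np.linalg.svd(Xs, full_matrices=False)
    F = U[:, :k] * S[:k]                      # estimated common factors
    T, t0 = len(y), p - 1                     # t0: first usable observation
    Z = np.column_stack(
        [np.ones(T - h - t0), F[t0:T - h]] +
        [y[t0 - j:T - h - j] for j in range(p)])
    beta, *_ = np.linalg.lstsq(Z, y[t0 + h:], rcond=None)
    z_last = np.concatenate(([1.0], F[-1], y[T - p:][::-1]))
    return z_last @ beta                      # point forecast h steps ahead
```
    A rolling-window version simply re-estimates on y[-w:] and X[-w:] each period, which is the "limited-memory" estimation the abstract refers to.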
  2. By: Juan José Dolado; Jesús Gonzalo; Laura Mayoral
    Abstract: This paper analyses how to test I(1) against I(d), d<1, in the presence of deterministic components in the DGP, by extending a Wald-type test, the (Efficient) Fractional Dickey-Fuller (EFDF) test, to this case. Tests of these hypotheses are important in many economic applications where it is crucial to distinguish between permanent and transitory shocks, because I(d) processes with d<1 are mean-reverting. Moreover, the inclusion of deterministic components is a necessary addition for the analysis of most macroeconomic variables. We show how simple the implementation of the EFDF test is in these situations and argue that, in general, it has better properties than LM tests. Finally, an empirical application is provided in which the EFDF approach allowing for deterministic components is used to test for long memory in the per-capita GDP of several OECD countries, an issue with important consequences for discriminating between growth theories, and on which there has been some controversy.
    Date: 2006–12
    URL: http://d.repec.org/n?u=RePEc:cte:werepe:we20061221&r=ets
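    For readers new to Fractional Dickey-Fuller testing, the sketch below implements the simple (non-efficient) FDF t-test with deterministic components removed by prior OLS detrending; the authors' EFDF variant refines the regressor construction but has the same Wald-test structure. This is a sketch under stated assumptions, not the paper's exact procedure.
```python
import numpy as np

def fracdiff(x, d):
    """(1 - L)^d x via the binomial expansion:
    pi_0 = 1, pi_j = pi_{j-1} * (j - 1 - d) / j."""
    n = len(x)
    pi = np.ones(n)
    for j in range(1, n):
        pi[j] = pi[j - 1] * (j - 1 - d) / j
    return np.array([pi[:t + 1] @ x[t::-1] for t in range(n)])

def fdf_tstat(y, d, trend=False):
    """t-statistic of phi in  dy_t = phi * (Delta^d y)_{t-1} + e_t,
    after removing a constant (and optionally a linear trend).
    Under the I(1) null the statistic is asymptotically N(0, 1);
    large negative values favour the I(d), d < 1, alternative."""
    t = np.arange(len(y), dtype=float)
    D = np.column_stack([np.ones(len(y)), t] if trend else [np.ones(len(y))])
    y_adj = y - D @ np.linalg.lstsq(D, y, rcond=None)[0]
    dy = np.diff(y_adj)
    x = fracdiff(y_adj, d)[:-1]               # (Delta^d y)_{t-1}
    phi = (x @ dy) / (x @ x)
    resid = dy - phi * x
    se = np.sqrt(resid @ resid / (len(dy) - 1) / (x @ x))
    return phi / se
```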
  3. By: Kleijnen, Jack P.C. (Tilburg University, Center for Economic Research)
    Abstract: In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. Statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic theory assumes a single simulation response that is normally and independently distributed with a constant variance; moreover, the regression (meta)model of the simulation model's I/O behaviour is assumed to have residuals with zero means. This article addresses the following questions: (i) How realistic are these assumptions, in practice? (ii) How can these assumptions be tested? (iii) If assumptions are violated, can the simulation's I/O data be transformed such that the assumptions do hold? (iv) If not, which alternative statistical methods can then be applied?
    Keywords: metamodels;experimental designs;generalized least squares;multivariate analysis;normality;jackknife;bootstrap;heteroscedasticity;common random numbers; validation
    JEL: C0 C1 C9
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:20079&r=ets
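    A minimal sketch of the DOE-plus-regression workflow the paper scrutinizes: a 2^k factorial design, replicated runs, and an OLS first-order metamodel whose residuals and per-point variances can then be checked against the classic assumptions (normality, constant variance). The toy simulation function is purely illustrative.
```python
import itertools
import numpy as np

def full_factorial(k):
    """2^k design matrix with factor levels coded -1/+1."""
    return np.array(list(itertools.product([-1, 1], repeat=k)), dtype=float)

def fit_metamodel(design, sim, reps=5):
    """Run the simulation `reps` times per design point and fit the
    first-order metamodel y = b0 + sum_j b_j x_j + e by OLS.
    Replicates give model-free per-point variance estimates, useful
    for checking the constant-variance and normality assumptions."""
    Y = np.array([[sim(x) for _ in range(reps)] for x in design])
    ybar = Y.mean(axis=1)
    X = np.column_stack([np.ones(len(design)), design])
    beta, *_ = np.linalg.lstsq(X, ybar, rcond=None)
    return beta, ybar - X @ beta, Y.var(axis=1, ddof=1)

# Toy example: a 'simulation' with 3 factors, two of which matter
rng = np.random.default_rng(0)
sim = lambda x: 2 + 1.5 * x[0] - 0.5 * x[2] + rng.normal(scale=0.3)
beta, resid, s2 = fit_metamodel(full_factorial(3), sim)
```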
  4. By: Bent Nielsen (Nuffield College, Oxford University); Eric Engler (Dept of Economics, Oxford University)
    Abstract: The empirical process of the residuals from general autoregressions is investigated. If an intercept is included in the regression, the empirical process is asymptotically Gaussian and free of nuisance parameters. This contrasts with the known result that in the unit-root case without an intercept the empirical process is asymptotically non-Gaussian. The result is used to establish asymptotic theory for the Kolmogorov-Smirnov test, Probability-Probability plots, and Quantile-Quantile plots. The link between sample moments and the empirical process of the residuals is derived and used to obtain the properties of cumulant-based tests for normality such as the Jarque-Bera test.
    Keywords: Autoregression, Empirical process, Kolmogorov-Smirnov test, Probability-Probability plots, Quantile-Quantile plots, Test for normality.
    Date: 2007–01–17
    URL: http://d.repec.org/n?u=RePEc:nuf:econwp:0701&r=ets
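    A sketch of the practical recipe this result licenses: fit an AR(p) with intercept by OLS and apply distributional tests directly to the standardized residuals, using the usual critical values even for (near-)unit-root data. Generic code, not the authors' implementation.
```python
import numpy as np
from scipy import stats

def ar_residual_tests(y, p=1):
    """OLS AR(p) with intercept; Kolmogorov-Smirnov and Jarque-Bera
    statistics computed from the standardized residuals."""
    n = len(y)
    X = np.column_stack([np.ones(n - p)] +
                        [y[p - 1 - j:n - 1 - j] for j in range(p)])
    beta, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    e = y[p:] - X @ beta
    z = (e - e.mean()) / e.std(ddof=p + 1)
    ks = stats.kstest(z, 'norm').statistic
    m, s, k = len(z), stats.skew(z), stats.kurtosis(z)  # k: excess kurtosis
    jb = m / 6.0 * (s ** 2 + k ** 2 / 4.0)              # Jarque-Bera statistic
    return ks, jb
```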
  5. By: Bent Nielsen (Nuffield College, Oxford University); Carlos Caceres (Nuffield College, Oxford University)
    Abstract: In this paper we present a general result concerning the convergence to stochastic integrals with non-linear integrands. The key finding generalizes Chan and Wei's (1988) Theorem 2.4 and Ibragimov and Phillips' (2004) Theorem 8.2. This result is necessary for analysing the asymptotic properties of mis-specification tests applied to a unit-root process, a setting for which Wooldridge (1999) noted that the existing results in the literature were not sufficient.
    Keywords: non-stationarity, unit roots, convergence, autoregressive processes, martingales, stochastic integrals, non-linearity.
    Date: 2007–02–12
    URL: http://d.repec.org/n?u=RePEc:nuf:econwp:0702&r=ets
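    The flavour of such results is easy to check by simulation: for smooth f, the normalized sum below converges weakly to the stochastic integral int_0^1 f(W) dW. The linear case has the closed form (W(1)^2 - 1)/2 by Ito's formula, which the sketch uses as a sanity check; the scalings are the standard ones and the code is illustrative, not from the paper.
```python
import numpy as np

rng = np.random.default_rng(0)

def integral_draws(f, n, reps=20000):
    """Monte Carlo draws of  sum_t f(S_{t-1}/sqrt(n)) * eps_t / sqrt(n),
    where S_t is a random walk: converges weakly to int_0^1 f(W) dW."""
    eps = rng.standard_normal((reps, n))
    S = np.cumsum(eps, axis=1)
    S_lag = np.hstack([np.zeros((reps, 1)), S[:, :-1]]) / np.sqrt(n)
    return (f(S_lag) * eps).sum(axis=1) / np.sqrt(n)

# Linear sanity check: int_0^1 W dW = (W(1)^2 - 1) / 2
draws = integral_draws(lambda x: x, n=1000)
exact = (rng.standard_normal(20000) ** 2 - 1) / 2
print(np.quantile(draws, [0.05, 0.5, 0.95]))
print(np.quantile(exact, [0.05, 0.5, 0.95]))   # should nearly match

# Non-linear integrand, the case the paper's generalization covers
draws_cube = integral_draws(lambda x: x ** 3, n=1000)
```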
  6. By: Collet J.J.; Fadili J.M. (School of Economics and Finance, Queensland University of Technology)
    Abstract: In this paper, we study the synthesis of Gegenbauer processes using the wavelet packet transform. In order to simulate a 1-factor Gegenbauer process, we introduce an original algorithm, inspired by the one proposed by Coifman and Wickerhauser [CW92], that adaptively searches the wavelet packet library for the best orthogonal basis, i.e. the one in which the covariance matrix of the transformed process is nearly diagonal. Our method clearly outperforms the one recently proposed by [Whi01], is very fast, does not depend on the choice of wavelet, and is not very sensitive to the length of the time series. Building on these first results, we propose an algorithm to construct bases for simulating k-factor Gegenbauer processes. Given how simple it is to program and run, we expect the general practitioner will find our simulator attractive. Finally, we evaluate the approximation incurred by treating the wavelet packet coefficients as uncorrelated. An empirical study is carried out which supports our results.
    Keywords: Gegenbauer process, Wavelet packet transform, Best-basis, Autocovariance
    URL: http://d.repec.org/n?u=RePEc:qut:dpaper:190&r=ets
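    The wavelet-packet synthesizer itself is not reproduced here; as a point of comparison, the sketch below is the standard truncated moving-average simulator for a 1-factor Gegenbauer process, built from the three-term recurrence for the Gegenbauer coefficients. The truncation length m is an illustrative choice.
```python
import numpy as np

def gegenbauer_coeffs(d, u, m):
    """First m coefficients of (1 - 2uz + z^2)^(-d) = sum_j C_j(u) z^j,
    the Gegenbauer polynomials, via their three-term recurrence."""
    c = np.empty(m)
    c[0] = 1.0
    if m > 1:
        c[1] = 2.0 * d * u
    for j in range(2, m):
        c[j] = (2.0 * u * (j + d - 1) * c[j - 1]
                - (j + 2.0 * d - 2) * c[j - 2]) / j
    return c

def simulate_gegenbauer(n, d, u, m=2000, rng=None):
    """Truncated-MA simulation of (1 - 2uB + B^2)^d y_t = eps_t:
    y_t ~= sum_{j<m} C_j(u) eps_{t-j}."""
    rng = rng or np.random.default_rng()
    eps = rng.standard_normal(n + m)
    c = gegenbauer_coeffs(d, u, m)
    return np.convolve(eps, c, mode='full')[m:m + n]

# Long memory concentrated at the Gegenbauer frequency arccos(u)
y = simulate_gegenbauer(n=1024, d=0.3, u=np.cos(np.pi / 5))
```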
  7. By: Adam Clements; Scott White (School of Economics and Finance, Queensland University of Technology)
    Abstract: This paper considers the size effect, where volatility dynamics are dependent upon the current level of volatility, within a stochastic volatility framework. A non-linear filtering algorithm is proposed in which the dynamics of the latent variable are conditioned on its current level. This allows for the estimation of a stochastic volatility model whose dynamics depend on the level of volatility. Empirical results suggest that volatility dynamics are in fact influenced by the level of prevailing volatility: when volatility is relatively low (high), volatility is extremely (not) persistent with little (a great deal of) noise.
    Keywords: Non-linear filtering, stochastic volatility, size effect, threshold
    URL: http://d.repec.org/n?u=RePEc:qut:dpaper:191&r=ets
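    A minimal sketch of a bootstrap particle filter with a threshold transition of the kind described: the level, persistence, and noise of log-volatility switch on whether h_{t-1} exceeds a cut-off c. The parameterization and the plain multinomial resampler are illustrative assumptions, not the authors' algorithm.
```python
import numpy as np

def threshold_sv_filter(r, gamma, phi, sigma, c, n_part=2000, rng=None):
    """Bootstrap particle filter for a threshold SV model:
        h_t = gamma_i + phi_i * h_{t-1} + sigma_i * eta_t, i = 1{h_{t-1} > c}
        r_t = exp(h_t / 2) * eps_t,  eps_t ~ N(0, 1).
    gamma, phi, sigma are (low-regime, high-regime) pairs.
    Returns the log-likelihood and the filtered mean of h_t."""
    rng = rng or np.random.default_rng(0)
    h = rng.normal(0.0, 1.0, n_part)
    loglik, h_mean = 0.0, []
    for rt in r:
        hi = h > c                            # regime of each particle
        h = (np.where(hi, gamma[1], gamma[0])
             + np.where(hi, phi[1], phi[0]) * h
             + np.where(hi, sigma[1], sigma[0]) * rng.standard_normal(n_part))
        # measurement density p(r_t | h_t) up to the 1/sqrt(2 pi) factor
        w = np.exp(-0.5 * rt ** 2 * np.exp(-h) - 0.5 * h)
        loglik += np.log(w.mean()) - 0.5 * np.log(2 * np.pi)
        w /= w.sum()
        h_mean.append(h @ w)
        h = rng.choice(h, size=n_part, p=w)   # multinomial resampling
    return loglik, np.array(h_mean)
```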
  8. By: Adam Clements; Scott White (School of Economics and Finance, Queensland University of Technology)
    Abstract: This paper develops a computationally efficient filtering based procedure for the estimation of the heavy tailed SV model with leverage. While there are many accepted techniques for the estimation of standard SV models, incorporating these effects into an SV framework is difficult. Simulation evidence provided in this paper indicates that the proposed procedure outperforms competing approaches in terms of the accuracy of parameter estimation. In an empirical setting, it is shown how the individual effects of heavy tails and leverage can be isolated using standard likelihood ratio tests.
    URL: http://d.repec.org/n?u=RePEc:qut:dpaper:192&r=ets
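    To fix ideas, the sketch below simulates the model class in question: SV with standardized Student-t return innovations (heavy tails) and shocks correlated across the state and measurement equations (leverage). The chi-square mixing construction is one standard choice, not necessarily the paper's; the nested restrictions rho = 0 and nu -> infinity are what the likelihood ratio tests mentioned above compare.
```python
import numpy as np

def simulate_svt_leverage(n, mu=-0.5, phi=0.95, sigma=0.2,
                          rho=-0.5, nu=8, rng=None):
    """Simulate r_t = exp(h_t/2) * sqrt(lam_t) * z_t with
    h_{t+1} = mu + phi (h_t - mu) + sigma eta_t, corr(z_t, eta_t) = rho,
    and lam_t = (nu - 2) / Chi2(nu), so sqrt(lam_t) z_t is a
    unit-variance Student-t(nu) innovation."""
    rng = rng or np.random.default_rng(0)
    z_eta = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
    lam = (nu - 2) / rng.chisquare(nu, size=n)
    h = np.empty(n)
    h[0] = mu
    for t in range(n - 1):            # leverage: today's z moves tomorrow's h
        h[t + 1] = mu + phi * (h[t] - mu) + sigma * z_eta[t, 1]
    r = np.exp(h / 2) * np.sqrt(lam) * z_eta[:, 0]
    return r, h
```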
  9. By: Anthony Murphy; Marwan Izzeldin
    Abstract: We investigate the bootstrapped size and power properties of five long memory tests, including the modified R/S, KPSS and GPH tests. In small samples, the moving block bootstrap controls the empirical size of the tests. However, for these sample sizes, the power of bootstrapped tests against fractionally integrated alternatives is often a good deal less than that of asymptotic tests. In larger samples, the power of the five tests is good against common fractionally integrated alternatives: the FI case and the FI with a stochastic volatility error case.
    Keywords: Moving block bootstrap; fractional integration
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:lan:wpaper:003091&r=ets
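    A sketch of the moving block bootstrap at the heart of the experiments: overlapping blocks preserve short-range dependence, and resampled series provide a null distribution for whatever long-memory statistic is passed in. Block length and replication count are illustrative choices.
```python
import numpy as np

def moving_block_resample(x, block_len, rng):
    """One moving-block-bootstrap resample: glue together randomly
    chosen overlapping blocks of length block_len, then trim to len(x)."""
    n = len(x)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    return np.concatenate([x[s:s + block_len] for s in starts])[:n]

def mbb_pvalue(x, stat, block_len=25, n_boot=499, seed=0):
    """Right-tailed bootstrap p-value for a long-memory statistic `stat`
    (e.g. a rescaled-range or KPSS-type statistic)."""
    rng = np.random.default_rng(seed)
    t0 = stat(x)
    t_boot = [stat(moving_block_resample(x, block_len, rng))
              for _ in range(n_boot)]
    return (1 + sum(t >= t0 for t in t_boot)) / (n_boot + 1)
```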
  10. By: Ivan Paya; Ioannis A. Venetis; A Duarte
    Abstract: This paper finds evidence of fractional integration for a number of monthly ex post real interest rate series, using the GPH semiparametric estimator on data from fourteen European countries and the US. However, we pose empirical questions about certain time-series requirements that emerge from fractional integration, and we find that they do not hold, pointing to "spurious" long memory and casting doubt on the theoretical origins of long memory in our sample. Common stochastic trends expressed as the sum of stationary past errors do not seem appropriate as an explanation of real interest rate covariation. From an economic perspective, our results suggest that most European countries show faster real interest rate equalization with Germany than with the US.
    Keywords: Real interest rate; Long memory; Fractional integration
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:lan:wpaper:004341&r=ets
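    For reference, a compact implementation of the GPH log-periodogram estimator the paper relies on: regress the log-periodogram at the first m Fourier frequencies on log(4 sin^2(w_j / 2)); the slope estimates -d. The bandwidth m = sqrt(n) is a common illustrative default, not necessarily the authors' choice.
```python
import numpy as np

def gph_estimate(x, m=None):
    """GPH semiparametric estimate of the memory parameter d, with the
    asymptotic standard error based on the pi^2/6 error variance."""
    n = len(x)
    m = m or int(np.sqrt(n))                   # bandwidth (illustrative)
    w = 2 * np.pi * np.arange(1, m + 1) / n    # Fourier frequencies
    I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2 * np.pi * n)
    X = np.log(4 * np.sin(w / 2) ** 2)
    Xc = X - X.mean()                          # centred regressor
    d = -(Xc @ np.log(I)) / (Xc @ Xc)          # minus the OLS slope
    se = np.sqrt(np.pi ** 2 / 6 / (Xc @ Xc))
    return d, se
```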

This nep-ets issue is ©2007 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.