nep-ets New Economics Papers
on Econometric Time Series
Issue of 2008‒11‒04
seventeen papers chosen by
Yong Yin
SUNY at Buffalo

  1. Estimation of k-factor GIGARCH process : a Monte Carlo study By Abdou Kâ Diongue; Dominique Guegan
  2. Effect of noise filtering on predictions : on the routes of chaos By Dominique Guegan
  3. Changing regime volatility: A fractionally integrated SETAR model By Gilles Dufrenot; Dominique Guegan; Anne Peguin-Feissolle
  4. Business surveys modelling with Seasonal-Cyclical Long Memory models By Laurent Ferrara; Dominique Guegan
  5. Exact Maximum Likelihood estimation for the BL-GARCH model under elliptical distributed innovations By Abdou Kâ Diongue; Dominique Guegan; Rodney C. Wolff
  6. A multi-horizon scale for volatility By Alexander Subbotin
  7. Testing fractional order of long memory processes : a Monte Carlo study By Laurent Ferrara; Dominique Guegan; Zhiping Lu
  8. Analysis of the dependence structure in econometric time series By Aurélien Hazan; Vincent Vigneron
  9. Non-stationarity and meta-distribution By Dominique Guegan
  10. Wavelets unit root test vs DF test : A further investigation based on Monte Carlo experiments By Ibrahim Ahamada; Philippe Jolivaldt
  11. Estimating and Forecasting GARCH Volatility in the Presence of Outliers By M. Angeles Carnero; Daniel Peña; Esther Ruiz
  12. A Powerful Test of the Autoregressive Unit Root Hypothesis Based on a Tuning Parameter Free Statistic By Morten Ørregaard Nielsen
  13. Markov-chain approximations of vector autoregressions: application of general multivariate-normal integration techniques By Edward S. Knotek II; Stephen Terry
  14. Seeing inside the black box: Using diffusion index methodology to construct factor proxies in large scale macroeconomic time series environments By Nii Ayi Armah; Norman R. Swanson
  15. Combining forecasts from nested models By Todd E. Clark; Michael W. McCracken
  16. Clustering techniques applied to outlier detection of financial market series using a moving window filtering algorithm. By Josep Maria Puigvert Gutiérrez; Josep Fortiana Gregori
  17. Is forecasting with large models informative? Assessing the role of judgement in macro-economic forecasts. By Ricardo Mestre; Peter McAdam

  1. By: Abdou Kâ Diongue (UFR SAT - Université Gaston Berger - Université Gaston Berger de Saint-Louis, School of Economics and Finance - Queensland University of Technology); Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris)
    Abstract: In this paper, we discuss parameter estimation for a k-factor generalized long-memory process with conditionally heteroskedastic noise. Two estimation methods are proposed: the first is based on the conditional distribution of the process, and the second is an extension of Whittle's estimation approach. For comparison purposes, Monte Carlo simulations are used to evaluate the finite-sample performance of these estimation techniques. [A toy simulation sketch follows this entry.]
    Keywords: Long memory, Gegenbauer polynomial, heteroskedasticity, conditional sum of squares, Whittle estimation.
    Date: 2008–01
    URL: http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00235179_v1&r=ets
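    The Gegenbauer filter underlying the k-factor process can be expanded into moving-average coefficients by a simple recursion. Below is a minimal Python sketch, not the authors' code, that simulates a 1-factor Gegenbauer process driven by GARCH(1,1) noise; all parameter values and the truncation length are illustrative assumptions.
```python
import numpy as np

def gegenbauer_ma_coeffs(d, u, n):
    """MA coefficients psi_j of (1 - 2*u*B + B^2)^(-d),
    via the standard recursion for Gegenbauer polynomials."""
    psi = np.zeros(n)
    psi[0] = 1.0
    if n > 1:
        psi[1] = 2.0 * d * u
    for j in range(2, n):
        psi[j] = 2.0 * u * ((d - 1.0) / j + 1.0) * psi[j - 1] \
                 - (2.0 * (d - 1.0) / j + 1.0) * psi[j - 2]
    return psi

def simulate_gigarch(T=2000, d=0.3, u=0.8, omega=0.05, alpha=0.1,
                     beta=0.85, trunc=1000, seed=0):
    """1-factor Gegenbauer long-memory process with GARCH(1,1) noise."""
    rng = np.random.default_rng(seed)
    eps = np.zeros(T + trunc)
    h = omega / (1.0 - alpha - beta)       # start at unconditional variance
    for t in range(T + trunc):
        eps[t] = np.sqrt(h) * rng.standard_normal()
        h = omega + alpha * eps[t] ** 2 + beta * h
    psi = gegenbauer_ma_coeffs(d, u, trunc)
    return np.convolve(eps, psi)[trunc:trunc + T]  # truncated filter

x = simulate_gigarch()
```
    A conditional-sum-of-squares estimator would minimize the squared one-step residuals implied by inverting this filter, while the Whittle variant works with the periodogram in the frequency domain.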
  2. By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris)
    Abstract: The detection of chaotic behaviors in commodities, stock markets and weather data is usually complicated by the large noise perturbations inherent to the underlying system. It is well known that predictions from purely deterministic chaotic systems can be accurate mainly in the short term. Thus, it is important to be able to reconstruct, in a robust way, the attractor in which the data evolve, if such an attractor exists. In chaos theory, deconvolution methods have been widely studied, and the different existing approaches are competitive and complementary. In this work we apply two of them: the singular value method and the wavelet approach. The latter has not been investigated much for filtering chaotic systems. Using very large Monte Carlo simulations, we show the ability of this deconvolution method. We then use the de-noised data set to produce forecasts, and we discuss in depth the possibility of long-term forecasting with chaotic systems. [A generic wavelet de-noising sketch follows this entry.]
    Keywords: Deconvolution, chaos, SVD, state space method, wavelets method.
    Date: 2008–01
    URL: http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00235448_v1&r=ets
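    For the wavelet leg of the comparison, a standard de-noising recipe is soft thresholding of the detail coefficients. The sketch below uses PyWavelets with the Donoho-Johnstone universal threshold on a noisy logistic map; it is a generic illustration under assumed settings (db4 wavelet, 4 levels), not the paper's exact deconvolution scheme.
```python
import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(x, wavelet='db4', level=4):
    """Soft-threshold wavelet shrinkage (universal threshold).
    A generic recipe, not the paper's exact scheme."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    # Noise scale estimated from the finest detail coefficients
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(x)))
    denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode='soft')
                              for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[:len(x)]

# Example: noisy logistic-map data, a standard chaotic test case
rng = np.random.default_rng(1)
x = np.empty(1024)
x[0] = 0.3
for t in range(1023):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])
x_clean = wavelet_denoise(x + 0.05 * rng.standard_normal(1024))
```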
  3. By: Gilles Dufrenot (GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille - Université de la Méditerranée - Aix-Marseille II - Université Paul Cézanne - Aix-Marseille III - Ecole des Hautes Etudes en Sciences Sociales - CNRS : UMR6579); Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I); Anne Peguin-Feissolle (GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille - Université de la Méditerranée - Aix-Marseille II - Université Paul Cézanne - Aix-Marseille III - Ecole des Hautes Etudes en Sciences Sociales - CNRS : UMR6579)
    Abstract: This paper presents a 2-regime SETAR model with a different long-memory process in each regime. We briefly present the memory properties of this model and propose an estimation method. The process is applied to the absolute and squared returns of five stock indices, and a comparison with simple FARIMA models is made using several forecastability criteria. Our empirical results suggest that our model offers an interesting competing framework for describing the persistent dynamics of stock index returns. [A simplified SETAR simulation sketch follows this entry.]
    Keywords: SETAR - Long-memory - Stock indices - Forecasting
    Date: 2008–04
    URL: http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00185369_v1&r=ets
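    The switching mechanism of a 2-regime SETAR model is easy to illustrate. The sketch below simulates a SETAR(1) whose autoregressive coefficient depends on the lagged value crossing a threshold; the paper's regimes are fractionally integrated, which this short-memory toy omits, and all parameter values are assumptions.
```python
import numpy as np

def simulate_setar(T=1000, threshold=0.0, delay=1,
                   phi_low=0.8, phi_high=-0.3, sigma=1.0, seed=0):
    """2-regime SETAR(1): the AR coefficient switches when the lagged
    value crosses the threshold. The paper uses fractionally integrated
    regimes; plain AR(1) regimes are used here for brevity."""
    rng = np.random.default_rng(seed)
    y = np.zeros(T)
    for t in range(max(1, delay), T):
        phi = phi_low if y[t - delay] <= threshold else phi_high
        y[t] = phi * y[t - 1] + sigma * rng.standard_normal()
    return y

y = simulate_setar()
print(y.mean(), y.std())
```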
  4. By: Laurent Ferrara (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, DGEI-DAMEP - Banque de France); Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris)
    Abstract: Business surveys are an important element in the analysis of the short-term economic situation because of the timeliness and nature of the information they convey. In particular, surveys often enter econometric models designed to provide an early assessment of the current state of the economy, which is of great interest to policy-makers. In this paper, we focus on non-seasonally adjusted business surveys released by the European Commission. We introduce an innovative way of modelling these series that takes the persistence of the seasonal roots into account through seasonal-cyclical long memory models. We show empirically that such models produce more accurate forecasts than classical seasonal linear models.
    Keywords: Euro area, nowcasting, business surveys, seasonal, long memory.
    Date: 2008–05
    URL: http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00277379_v1&r=ets
  5. By: Abdou Kâ Diongue (UFR SAT - Université Gaston Berger - Université Gaston Berger de Saint-Louis); Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris); Rodney C. Wolff (School of Mathematical Sciences - Queensland University of Technology)
    Abstract: In this paper, we discuss the class of Bilinear GARCH (BL-GARCH) models, which are capable of capturing simultaneously two key properties of non-linear time series: volatility clustering and leverage effects. It has often been observed that the marginal distributions of such time series have heavy tails; thus we examine the BL-GARCH model in a general setting under some non-Normal distributions. We investigate some probabilistic properties of this model, and we propose and implement a maximum likelihood estimation (MLE) methodology. To evaluate the small-sample performance of this method for the various models, a Monte Carlo study is conducted. Finally, within-sample estimation properties are studied using S&P 500 daily returns, where the features of interest manifest as volatility clustering and leverage effects. [A toy Gaussian-likelihood sketch follows this entry.]
    Keywords: BL-GARCH process, elliptical distribution, leverage effects, Maximum Likelihood, Monte Carlo method, volatility clustering.
    Date: 2008–03
    URL: http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00270719_v1&r=ets
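    As a rough illustration of the estimation step, the sketch below writes the Gaussian log-likelihood of a BL-GARCH(1,1) under the recursion h_t = omega + alpha*eps^2_{t-1} + gamma*eps_{t-1}*sqrt(h_{t-1}) + beta*h_{t-1} (the parameterization of Storti and Vitale, 2003) and maximizes it numerically. The paper also treats elliptical innovations; Gaussian innovations, the starting values, and the placeholder data are assumptions here.
```python
import numpy as np
from scipy.optimize import minimize

def bl_garch_negloglik(params, eps):
    """Gaussian negative log-likelihood of a BL-GARCH(1,1):
    h_t = omega + alpha*eps_{t-1}^2 + gamma*eps_{t-1}*sqrt(h_{t-1})
          + beta*h_{t-1}.
    The paper also treats elliptical innovations; Gaussian only here."""
    omega, alpha, gamma, beta = params
    T = len(eps)
    h = np.empty(T)
    h[0] = np.var(eps)
    for t in range(1, T):
        h[t] = (omega + alpha * eps[t - 1] ** 2
                + gamma * eps[t - 1] * np.sqrt(h[t - 1]) + beta * h[t - 1])
        if h[t] <= 0:          # positivity can fail for bad parameter draws
            return 1e10
    return 0.5 * np.sum(np.log(2 * np.pi * h) + eps ** 2 / h)

# Illustrative fit to simulated Gaussian noise (placeholder data)
rng = np.random.default_rng(2)
eps = rng.standard_normal(1500)
res = minimize(bl_garch_negloglik, x0=[0.05, 0.05, -0.05, 0.85],
               args=(eps,), method='Nelder-Mead')
print(res.x)
```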
  6. By: Alexander Subbotin (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, Higher School of Economics - State University)
    Abstract: We decompose the volatility of a stock market index both in time and in scale using wavelet filters, and design a probabilistic indicator for volatilities, analogous to the Richter scale in geophysics. The peak-over-threshold method is used to fit the generalized Pareto probability distribution to the extreme values of the realized variances of wavelet coefficients. The indicator is computed for daily Dow Jones Industrial Average index data from 1986 to 2007 and for intraday CAC 40 data from 1995 to 2006. The results are used for comparison and structural multi-resolution analysis of extreme events on the stock market and for the detection of financial crises. [A minimal peaks-over-threshold sketch follows this entry.]
    Keywords: Stock market, volatility, wavelets, multi-resolution analysis, financial crisis.
    Date: 2008–03
    URL: http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00261514_v1&r=ets
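    The peak-over-threshold step can be sketched with scipy's generalized Pareto distribution: fit the GPD to exceedances above a high quantile, then convert new observations into tail probabilities. The 95% threshold and the chi-squared placeholder data are assumptions.
```python
import numpy as np
from scipy.stats import genpareto

def pot_fit(x, q=0.95):
    """Peaks-over-threshold: fit a generalized Pareto distribution to
    exceedances above the q-quantile of x (here standing in for
    realized variances of wavelet coefficients)."""
    u = np.quantile(x, q)
    exceed = x[x > u] - u
    xi, loc, scale = genpareto.fit(exceed, floc=0.0)  # fix location at 0
    return u, xi, scale

def exceedance_prob(x_new, u, xi, scale, p_u):
    """Unconditional tail probability P(X > x_new) for x_new > u,
    where p_u = P(X > u)."""
    return p_u * genpareto.sf(x_new - u, xi, loc=0.0, scale=scale)

rng = np.random.default_rng(3)
rv = rng.standard_normal(5000) ** 2       # placeholder 'realized variances'
u, xi, scale = pot_fit(rv)
print(exceedance_prob(8.0, u, xi, scale, p_u=0.05))
```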
  7. By: Laurent Ferrara (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, DGEI-DAMEP - Banque de France); Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris); Zhiping Lu (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, ECNU - East China Normal University)
    Abstract: Testing the fractional integration order of seasonal and non-seasonal unit roots is quite important for economic and financial time series modelling. In this paper, the Robinson (1994) test is applied to various well-known long memory models. Via Monte Carlo experiments, we study and compare the performance of this test across several sample sizes.
    Keywords: Long memory processes, test, Monte Carlo simulations.
    Date: 2008–02
    URL: http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00259193_v1&r=ets
  8. By: Aurélien Hazan (IBISC - Informatique, Biologie Intégrative et Systèmes Complexes - CNRS : FRE2873 - Université d'Evry-Val d'Essonne); Vincent Vigneron (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, SAMOS - Statistique Appliquée et MOdélisation Stochastique - Université Panthéon-Sorbonne - Paris I)
    Abstract: The various scales of a signal maintain dependence relations with one another. These relations can vary in time and reveal speed changes in the studied phenomenon. To detect such changes, we first compute the wavelet transform of a signal at various scales, and then study the statistical dependences between these transforms by means of a mutual information estimator. We propose to summarize the resulting network of dependences by a dependence graph, obtained by thresholding or quantizing the mutual information values. The method can be applied to several types of signals, such as fluctuations of market indexes, for instance the S&P 500, or high-frequency foreign exchange (FX) rates. [A mutual-information sketch follows this entry.]
    Keywords: Wavelet, dependence, mutual information, financial time series, FX.
    Date: 2008–06–05
    URL: http://d.repec.org/n?u=RePEc:hal:paris1:hal-00287463_v1&r=ets
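    A minimal version of the scale-dependence idea: take an undecimated wavelet transform so every scale has full length, estimate mutual information between each pair of detail series, and threshold the matrix into a graph. The sketch below uses PyWavelets and scikit-learn; the estimator, wavelet, and threshold are assumptions rather than the authors' choices.
```python
import numpy as np
import pywt
from sklearn.feature_selection import mutual_info_regression

def scale_dependence_graph(x, wavelet='db2', level=4, thresh=0.05):
    """Mutual information between wavelet detail scales, thresholded
    into a 0/1 dependence graph. A generic sketch of the idea; the
    threshold value is arbitrary."""
    # Stationary (undecimated) transform keeps all scales at length n;
    # the input length must be a multiple of 2**level.
    details = [d for (_, d) in pywt.swt(x, wavelet, level=level)]
    k = len(details)
    mi = np.zeros((k, k))
    for i in range(k):
        for j in range(k):
            if i != j:
                mi[i, j] = mutual_info_regression(
                    details[i].reshape(-1, 1), details[j],
                    random_state=0)[0]
    return mi, (mi > thresh).astype(int)

rng = np.random.default_rng(4)
x = rng.standard_normal(1024)      # placeholder for FX or index returns
mi, graph = scale_dependence_graph(x)
print(graph)
```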
  9. By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris)
    Abstract: In this paper we deal with the problem of non-stationarity encountered in many data sets, mainly in the financial and economic domains, arising from the presence of multiple seasonalities, jumps, volatility, distortion, aggregation, etc. The existence of non-stationarity induces spurious behavior in estimated statistics whenever we work with finite samples. We illustrate this fact using Markov switching processes, Stopbreak models and SETAR processes. Thus, working within a theoretical framework based on the existence of an invariant measure for the whole sample is not satisfactory. Empirically, alternative strategies have been developed that introduce dynamics into the modelling, mainly through the parameters, with the use of rolling windows. A specific framework has not yet been proposed for studying such non-invariant data sets. The question is difficult. Here, we open a discussion of this topic by proposing the concept of meta-distribution, which can be used to improve risk management strategies or forecasts.
    Keywords: Non-stationarity, switching processes, SETAR processes, jumps, forecast, risk management, copula, probability distribution function.
    Date: 2008–03
    URL: http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00270708_v1&r=ets
  10. By: Ibrahim Ahamada (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris); Philippe Jolivaldt (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris)
    Abstract: A unit root test based on wavelet theory has recently been defined (Gençay and Fan, 2007). While the new test is supposed to be robust to the initial value, we bring out, by contrast, the significant effect of the initial value on its size and power. We also find that the wavelet unit root test and the ADF test are equally efficient once the data are corrected for the initial value. Our approach is based on Monte Carlo experiments. [A Monte Carlo sketch of the ADF leg follows this entry.]
    Keywords: Unit root tests, wavelets, Monte Carlo experiments, size-power curve.
    Date: 2008–03
    URL: http://d.repec.org/n?u=RePEc:hal:paris1:halshs-00275767_v1&r=ets
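    The Monte Carlo design is easy to reproduce for the ADF side: simulate random walks under the null with different initial values and record rejection frequencies. The sketch below uses statsmodels; the wavelet test of Gençay and Fan (2007) has no standard library implementation, so it is omitted, and the design choices (T, replications, initial values) are assumptions.
```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

def adf_size(T=200, n_rep=1000, y0=0.0, alpha=0.05, seed=5):
    """Empirical size of the ADF test under a pure random-walk null,
    as a function of the initial value y0."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_rep):
        y = y0 + np.cumsum(rng.standard_normal(T))
        pvalue = adfuller(y, regression='c', autolag='AIC')[1]
        rejections += (pvalue < alpha)
    return rejections / n_rep

for y0 in (0.0, 5.0, 10.0):
    print(y0, adf_size(y0=y0))
```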
  11. By: M. Angeles Carnero (Universidad de Alicante); Daniel Peña (Universidad Carlos III de Madrid); Esther Ruiz (Universidad Carlos III de Madrid)
    Abstract: The main goal when fitting GARCH models to conditionally heteroscedastic time series is to estimate the underlying volatilities. It is well known that outliers affect the estimation of the GARCH parameters. However, little is known about their effects on the estimated volatilities. In this paper, we show that when the volatility is estimated using Maximum Likelihood estimates of the parameters, the biases incurred can be very large even if the estimated parameters have small biases. Consequently, we propose to use robust procedures. In particular, a simple robust estimator of the parameters is proposed and shown to have properties comparable with those of more complicated ones available in the literature. The properties of the estimated and predicted volatilities obtained by using robust filters based on robust parameter estimates are analyzed. All the results are illustrated using daily S&P500 and IBEX35 returns. [A small sketch illustrating the outlier effect follows this entry.]
    Keywords: Heteroscedasticity, M-estimator, QML estimator, Robustness, Financial Markets
    JEL: C22
    Date: 2008–10
    URL: http://d.repec.org/n?u=RePEc:ivi:wpasad:2008-13&r=ets
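    The abstract's central point, that a single outlier can distort QML-estimated volatilities, can be checked with the Python arch package: fit a Gaussian GARCH(1,1) to clean and contaminated versions of the same simulated series and compare. The simulation parameters and outlier size are assumptions; the paper's robust estimator is not reproduced here.
```python
import numpy as np
from arch import arch_model   # pip install arch

# Simulate a Gaussian GARCH(1,1) series (illustrative parameters)
rng = np.random.default_rng(9)
T, omega, alpha, beta = 2000, 0.05, 0.08, 0.90
eps = np.zeros(T)
h = omega / (1 - alpha - beta)
for t in range(T):
    eps[t] = np.sqrt(h) * rng.standard_normal()
    h = omega + alpha * eps[t] ** 2 + beta * h

contaminated = eps.copy()
contaminated[1000] += 15 * eps.std()      # a single additive outlier

for name, y in [("clean", eps), ("outlier", contaminated)]:
    res = arch_model(y, mean='Zero', vol='Garch', p=1, q=1).fit(disp='off')
    print(name, res.params.values.round(4),
          "mean fitted vol:", res.conditional_volatility.mean().round(3))
```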
  12. By: Morten Ørregaard Nielsen (Queen's University)
    Abstract: This paper presents a family of simple nonparametric unit root tests indexed by one parameter, d, and containing Breitung's (2002) test as the special case d=1. It is shown that (i) each member of the family with d>0 is consistent, (ii) the asymptotic distribution depends on d, and thus reflects the parameter chosen to implement the test, and (iii) since the asymptotic distribution depends on d and the test remains consistent for all d>0, it is possible to analyze the power of the test for different values of d. The usual Phillips-Perron or Dickey-Fuller type tests are indexed by bandwidth, lag length, etc., but have none of these three properties. It is shown that members of the family with d<1 have higher asymptotic local power than the Breitung (2002) test, and when d is small the asymptotic local power of the proposed nonparametric test is relatively close to the parametric power envelope, particularly in the case with a linear time-trend. Furthermore, GLS detrending is shown to improve power when d is small, which is not the case for Breitung's (2002) test. Simulations demonstrate that when applying a sieve bootstrap procedure, the proposed variance ratio test has very good size properties, with finite sample power that is higher than that of Breitung's (2002) test and even rivals the (nearly) optimal parametric GLS detrended augmented Dickey-Fuller test with lag length chosen by an information criterion. [A sketch of the d=1 variance ratio statistic follows this entry.]
    Keywords: Augmented Dickey-Fuller test, fractional integration, GLS detrending, nonparametric, nuisance parameter, tuning parameter, power envelope, unit root test, variance ratio
    JEL: C22
    Date: 2008–10
    URL: http://d.repec.org/n?u=RePEc:qed:wpaper:1185&r=ets
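    The d=1 member of the family is Breitung's (2002) variance ratio, which needs no bandwidth or lag-length choice: the scaled sum of squared partial sums divided by the sum of squares of the (demeaned) series. The sketch below computes it, together with the fractional-difference weights one would use to build the d<1 members; critical values are not included and the demonstration data are simulated.
```python
import numpy as np

def breitung_vr(y, demean=True):
    """Breitung's (2002) tuning-parameter-free variance ratio,
    the d=1 member of the family studied in the paper. Small values
    are evidence against the unit root null; critical values must be
    taken from the relevant asymptotic distribution."""
    y = np.asarray(y, dtype=float)
    if demean:
        y = y - y.mean()
    T = len(y)
    U = np.cumsum(y)               # partial sums = integration of order 1
    return (U @ U) / (T ** 2 * (y @ y))

def frac_diff_weights(d, n):
    """Coefficients of (1-B)^d; passing -d gives the fractional
    integration filter used to build the d<1 members of the family."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

rng = np.random.default_rng(6)
print(breitung_vr(np.cumsum(rng.standard_normal(500))))  # unit root: larger
print(breitung_vr(rng.standard_normal(500)))             # stationary: near 0
```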
  13. By: Edward S. Knotek II; Stephen Terry
    Abstract: Discrete Markov chains can be useful approximations to vector autoregressive processes for economists doing computational work. One such approximation method, first presented by Tauchen (1986), operates under the general theoretical assumption of a transformed VAR with a diagonal covariance structure for the process error term. We demonstrate a simple method of treating this approximation problem more conveniently in practice, using readily available multivariate-normal integration techniques to allow for arbitrary positive-semidefinite covariance structures. Examples are provided using processes with non-diagonal and singular non-diagonal error covariances. [A scalar Tauchen sketch follows this entry.]
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:fip:fedkrw:rwp08-02&r=ets
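    For reference, the scalar Tauchen (1986) construction the paper generalizes: place a grid over the state space and fill each transition row with conditional-normal cell probabilities. The VAR extension with general error covariances would replace the univariate normal CDF with multivariate-normal integration (e.g. scipy's multivariate_normal); only the scalar case is sketched, with illustrative parameters.
```python
import numpy as np
from scipy.stats import norm

def tauchen(rho, sigma, n=9, m=3.0):
    """Tauchen (1986) finite-state Markov approximation of
    y_t = rho*y_{t-1} + eps_t, eps ~ N(0, sigma^2).
    Returns the grid and the n x n transition matrix."""
    std_y = sigma / np.sqrt(1.0 - rho ** 2)   # unconditional std dev
    grid = np.linspace(-m * std_y, m * std_y, n)
    step = grid[1] - grid[0]
    P = np.empty((n, n))
    for i in range(n):
        cond_mean = rho * grid[i]
        # mass of the conditional normal over each grid cell
        upper = norm.cdf((grid + step / 2 - cond_mean) / sigma)
        lower = norm.cdf((grid - step / 2 - cond_mean) / sigma)
        P[i, :] = upper - lower
        P[i, 0] = norm.cdf((grid[0] + step / 2 - cond_mean) / sigma)
        P[i, -1] = 1.0 - norm.cdf((grid[-1] - step / 2 - cond_mean) / sigma)
    return grid, P

grid, P = tauchen(rho=0.9, sigma=0.1)
print(P.sum(axis=1))   # each row sums to 1
```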
  14. By: Nii Ayi Armah; Norman R. Swanson
    Abstract: In economics, common factors are often assumed to underlie the co-movements of a set of macroeconomic variables. For this reason, many authors have used estimated factors in the construction of prediction models. In this paper, we begin by surveying the extant literature on diffusion indexes. We then outline a number of approaches to the selection of factor proxies (observed variables that proxy unobserved estimated factors) using the statistics developed in Bai and Ng (2006a,b). Our approach to factor proxy selection is examined via a small Monte Carlo experiment, where evidence supporting our proposed methodology is presented, and via a large set of prediction experiments using the panel dataset of Stock and Watson (2005). One of our main empirical findings is that our “smoothed” approaches to factor proxy selection appear to yield predictions that are often superior not only to a benchmark factor model, but also to simple linear time series models which are generally difficult to beat in forecasting competitions. In some sense, by using our approach to predictive factor proxy selection, one is able to open up the “black box” often associated with factor analysis, and to identify actual variables that can serve as primitive building blocks for (prediction) models of a host of macroeconomic variables, and that can also serve as policy instruments, for example. Our findings suggest that important observable variables include various S&P500 variables, including stock price indices and dividend series; a 1-year Treasury bond rate; various housing activity variables; industrial production; and exchange rates. [A simplified factor-proxy sketch follows this entry.]
    Keywords: Macroeconomics
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:fip:fedpwp:08-25&r=ets
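    A stripped-down version of the factor proxy idea: estimate factors by principal components from a standardized panel, then pick, for each factor, the observed series most correlated with it. This simple correlation rule is a stand-in for the Bai and Ng (2006a,b) statistics actually used in the paper; the simulated panel is a placeholder.
```python
import numpy as np

def factor_proxies(X, r=2):
    """Principal-components factors from a T x N panel X, plus the
    column index of the observed series most correlated with each
    factor (a simplified stand-in for the Bai-Ng selection rules)."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize columns
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    F = np.sqrt(len(Z)) * U[:, :r]               # T x r estimated factors
    proxies = []
    for k in range(r):
        corr = [abs(np.corrcoef(F[:, k], Z[:, j])[0, 1])
                for j in range(Z.shape[1])]
        proxies.append(int(np.argmax(corr)))
    return F, proxies

rng = np.random.default_rng(7)
f = rng.standard_normal((200, 2))                 # two latent factors
lam = rng.standard_normal((2, 30))
X = f @ lam + 0.5 * rng.standard_normal((200, 30))  # simulated panel
F, proxies = factor_proxies(X)
print("proxy column indices:", proxies)
```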
  15. By: Todd E. Clark; Michael W. McCracken
    Abstract: Motivated by the common finding that linear autoregressive models often forecast better than models that incorporate additional information, this paper presents analytical, Monte Carlo, and empirical evidence on the effectiveness of combining forecasts from nested models. In our analytics, the unrestricted model is true, but a subset of the coefficients is treated as being local-to-zero. This approach captures the practical reality that the predictive content of variables of interest is often low. We derive MSE-minimizing weights for combining the restricted and unrestricted forecasts. Monte Carlo and empirical analyses verify the practical effectiveness of our combination approach. [A rolling-weight combination sketch follows this entry.]
    Keywords: Econometric models ; Economic forecasting
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:fip:fedlwp:2008-037&r=ets
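    The combination itself can be sketched as f_c = (1 - lambda) * f_R + lambda * f_U, with the weight fitted on a rolling training window by least squares. The paper derives analytic MSE-minimizing weights in a local-to-zero framework; the empirical rolling-window weight below is an assumption standing in for that derivation.
```python
import numpy as np

def combine_nested(y, f_r, f_u, window=40):
    """Combine forecasts from a restricted model (f_r) and a nested
    unrestricted model (f_u) with a weight that minimizes squared
    error over a rolling training window."""
    T = len(y)
    f_c = np.full(T, np.nan)
    for t in range(window, T):
        d = f_u[t - window:t] - f_r[t - window:t]
        e_r = y[t - window:t] - f_r[t - window:t]
        denom = d @ d
        lam = (d @ e_r) / denom if denom > 0 else 0.0
        lam = min(max(lam, 0.0), 1.0)      # keep the weight in [0, 1]
        f_c[t] = (1.0 - lam) * f_r[t] + lam * f_u[t]
    return f_c

# Tiny illustration on placeholder forecasts
rng = np.random.default_rng(10)
y = rng.standard_normal(200)
f_r = np.zeros(200)                          # restricted: zero-mean forecast
f_u = f_r + 0.1 * rng.standard_normal(200)   # unrestricted adds noisy signal
print(np.nanmean((y - combine_nested(y, f_r, f_u)) ** 2))
```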
  16. By: Josep Maria Puigvert Gutiérrez (European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.); Josep Fortiana Gregori (Universitat de Barcelona, Departament de Probabilitat, Lògica i Estadística, Gran Via de les Corts Catalanes, 585, 08007 Barcelona, Spain.)
    Abstract: In this study we combine clustering techniques with a moving window algorithm in order to filter financial market data outliers. We apply the algorithm to a set of financial market data consisting of 25 series selected from a larger dataset using a cluster analysis technique that takes into account the daily behaviour of the market; each of these series is an element of a cluster representing a different segment of the market. We set up a framework of possible algorithm parameter combinations that detect most of the outliers by market segment. In addition, the algorithm parameters that have been found can also be used to detect outliers in other series with similar economic behaviour in the same cluster. Moreover, the cross-checking of the behaviour of different series within each cluster reduces the possibility of observations being misclassified as outliers. [A generic moving-window outlier sketch follows this entry.]
    JEL: C19, C49, G19
    Keywords: Outliers, financial market, cluster analysis, moving filtering window algorithm.
    Date: 2008–10
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20080948&r=ets
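    A generic instance of the moving-window filter: flag a point as an outlier when it lies more than k robust standard deviations (median absolute deviation units) from the median of its surrounding window. The window length and cut-off below are placeholders for the parameter combinations the paper calibrates per market segment.
```python
import numpy as np

def moving_window_outliers(x, window=21, k=5.0):
    """Flag x[t] as an outlier when it lies more than k MAD-based
    robust standard deviations from the median of the surrounding
    window (the point itself excluded)."""
    x = np.asarray(x, dtype=float)
    half = window // 2
    flags = np.zeros(len(x), dtype=bool)
    for t in range(len(x)):
        lo, hi = max(0, t - half), min(len(x), t + half + 1)
        win = np.delete(x[lo:hi], t - lo)     # exclude the point itself
        med = np.median(win)
        mad = 1.4826 * np.median(np.abs(win - med))
        if mad > 0 and abs(x[t] - med) > k * mad:
            flags[t] = True
    return flags

rng = np.random.default_rng(8)
series = rng.standard_normal(300)
series[150] += 12.0                           # inject an outlier
print(np.where(moving_window_outliers(series))[0])
```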
  17. By: Ricardo Mestre (Corresponding author: European Central Bank, DG Research, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.); Peter McAdam (European Central Bank, DG Research, Kaiserstrasse 29, D-60311 Frankfurt am Main, Germany.)
    Abstract: We evaluate residual projection strategies in the context of a large-scale macro model of the euro area and smaller benchmark time-series models. The exercises attempt to measure the accuracy of model-based forecasts simulated both out-of-sample and in-sample. Both exercises incorporate alternative residual-projection methods, to assess the importance of unaccounted-for breaks in forecast accuracy and of off-model judgement. We conclude that simple mechanical residual adjustments have a significant impact on forecasting accuracy irrespective of the model in use, ostensibly due to the presence of breaks in trends in the data. The testing procedure and conclusions are applicable to a wide class of models and are thus of general interest.
    JEL: C52, E30, E32, E37
    Keywords: Macro-model, Forecast Projections, Out-of-Sample, In-Sample, Forecast Accuracy, Structural Break.
    Date: 2008–10
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20080950&r=ets

This nep-ets issue is ©2008 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.