nep-ets New Economics Papers
on Econometric Time Series
Issue of 2005‒02‒13
twenty-two papers chosen by
Yong Yin
SUNY at Buffalo

  1. Permanent vs Transitory Components and Economic Fundamentals By Anthony Garratt; Donald Robertson; Stephen Wright
  2. UK Real-Time Macro Data Characteristics By Anthony Garratt; Shaun P Vahey
  3. Monte Carlo tests with nuisance parameters: a general approach to finite-sample inference and non-standard asymptotics By Jean-Marie Dufour
  4. Exact Multivariate Tests of Asset Pricing Models with Stable Asymmetric Distributions By Marie-Claude Beaulieu; Jean-Marie Dufour; Lynda Khalaf
  5. Interpolation and Backdating with A Large Information Set By Angelini, Elena; Henry, Jérôme; Marcellino, Massimiliano
  6. Small Sample Confidence Intervals for Multivariate Impulse Response Functions at Long Horizons By Pesavento, Elena; Rossi, Barbara
  7. Forecasting Time Series Subject to Multiple Structural Breaks By Pesaran, M Hashem; Pettenuzzo, Davide; Timmermann, Allan G
  8. Optimal Forecast Combination Under Regime Switching By Elliott, Graham; Timmermann, Allan G
  9. Model-based Clustering of Multiple Time Series By Frühwirth-Schnatter, Sylvia; Kaufmann, Sylvia
  10. A Variance Ratio Related Prediction Tool with Application to the NYSE Index 1825-2002 By Kandel, Shmuel; Zilca, Shlomo
  11. Business Cycle Turning Points: Mixed-Frequency Data with Structural Breaks By Konstantin A., KHOLODILIN; Wension Vincent, YAO
  12. Generalized dynamic linear models for financial time series By Campagnoli Patrizia; Muliere Pietro; Petrone Sonia
  13. The power of lambda max By Paruolo Paolo
  14. On Metropolis-Hastings algorithms with delayed rejection By Mira Antonietta
  15. Asymptotic standard errors for common trends linear combinations in I(2) VAR systems By Paruolo Paolo
  16. LR cointegration tests when some cointegrating relations are known By Paruolo Paolo
  17. On Monte Carlo Estimation of Relative Power By Paruolo Paolo
  18. Using Out-of-Sample Mean Squared Prediction Errors to Test the Martingale Difference By Todd E. Clark; Kenneth D. West
  19. Multiple time scales in volatility and leverage correlation: A stochastic volatility model By Jean-Philippe Bouchaud; Josep Perello; Jaume Masoliver
  20. Microscopic models for long ranged volatility correlations By Jean-Philippe Bouchaud; Marc Mezard; Irene Giardina
  21. Bootstrapping GMM Estimators for Time Series By Atsushi Inoue; Mototsugu Shintani
  22. Evaluating Density Forecasts via the Copula Approach By Xiaohong Chen; Yanqin Fan

  1. By: Anthony Garratt (School of Economics, Mathematics & Statistics, Birkbeck College); Donald Robertson; Stephen Wright (School of Economics, Mathematics & Statistics, Birkbeck College)
    Abstract: Any non-stationary series can be decomposed into permanent (or "trend") and transitory (or "cycle") components. Typically some atheoretic pre-filtering procedure is applied to extract the permanent component. This paper argues that analysis of the fundamental underlying stationary economic processes should instead be central to this process. We present a new derivation of multivariate Beveridge-Nelson permanent and transitory components, whereby the latter can be derived explicitly as a weighting of observable stationary processes. This allows far clearer economic interpretations. Different assumptions on the fundamental stationary processes result in distinctly different results; but this reflects deep economic uncertainty. We illustrate with an example using Garratt et al.'s (2003a) small VECM model of the UK economy.
    Keywords: Multivariate Beveridge-Nelson, VECM, Economic Fundamentals, Decomposition.
    JEL: C1 C32 E0 E32 E37
    Date: 2005–02
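The paper's contribution is a multivariate derivation; as a rough point of reference, the textbook univariate Beveridge-Nelson decomposition can be sketched in a few lines. This is an illustrative special case (growth rates assumed AR(1)), not the authors' method; the function name and estimation shortcut are mine.

```python
import numpy as np

def bn_decompose_ar1(y):
    """Univariate Beveridge-Nelson decomposition assuming the first
    difference of y follows an AR(1) around a drift mu.  The permanent
    component is the long-horizon forecast:
        trend_t = y_t + phi/(1-phi) * (dy_t - mu),
    and the transitory component is the remainder."""
    y = np.asarray(y, dtype=float)
    dy = np.diff(y)
    mu = dy.mean()
    d = dy - mu
    phi = np.dot(d[:-1], d[1:]) / np.dot(d[:-1], d[:-1])  # AR(1) slope via OLS
    trend = y[1:] + (phi / (1.0 - phi)) * (dy - mu)
    cycle = y[1:] - trend
    return trend, cycle

# Demo: growth rates follow an AR(1) around a drift of 0.01.
rng = np.random.default_rng(1)
growth = np.empty(500)
growth[0] = 0.01
for t in range(1, 500):
    growth[t] = 0.01 + 0.5 * (growth[t - 1] - 0.01) + 0.05 * rng.standard_normal()
level = np.cumsum(growth)
trend, cycle = bn_decompose_ar1(level)
```

By construction trend + cycle reproduces the observed series, and the cycle is a stationary weighting of the (demeaned) growth rate, which is the feature the abstract emphasizes.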
  2. By: Anthony Garratt (School of Economics, Mathematics & Statistics, Birkbeck College); Shaun P Vahey
    Abstract: We characterise the relationships between preliminary and subsequent measurements for 16 commonly-used UK macroeconomic indicators drawn from two existing real-time data sets and a new nominal variable database. Most preliminary measurements are biased predictors of subsequent measurements, with some revision series affected by multiple structural breaks. To illustrate how these findings facilitate real-time forecasting, we use a vector autoregression to generate real-time one-step-ahead probability event forecasts for 1990Q1 to 1999Q2. Ignoring the predictability in initial measurements understates considerably the probability of above trend output growth.
    Keywords: real-time data, structural breaks, probability event forecasts
    JEL: C22 C82 E00
    Date: 2005–02
  3. By: Jean-Marie Dufour
    Abstract: The technique of Monte Carlo (MC) tests [Dwass (1957), Barnard (1963)] provides an attractive method of building exact tests from statistics whose finite-sample distribution is intractable but can be simulated (provided it does not involve nuisance parameters). We extend this method in two ways: first, by allowing for MC tests based on exchangeable, possibly discrete, test statistics; second, by generalizing the method to statistics whose null distributions involve nuisance parameters (maximized MC tests, MMC). Simplified, asymptotically justified versions of the MMC method are also proposed, and it is shown that they provide a simple way of improving standard asymptotics and dealing with nonstandard asymptotics (e.g., unit root asymptotics). Parametric bootstrap tests may be interpreted as a simplified version of the MMC method (without the general validity properties of the latter).
    Keywords: Monte Carlo test, maximized Monte Carlo test, finite sample test, exact test, nuisance parameter, bounds, bootstrap, parametric bootstrap, simulated annealing, asymptotics, nonstandard asymptotic distribution
    JEL: C12 C15 C2 C52 C22
    Date: 2005–02–01
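The basic Dwass/Barnard device the abstract builds on is simple to state: simulate the statistic under the null and count exceedances, giving a p-value that is exact in finite samples when the null distribution is nuisance-parameter free. A minimal sketch (the helper names and the toy t-test demo are mine, not from the paper):

```python
import numpy as np

def mc_pvalue(observed, simulate_null, n_sims=99, seed=0):
    """Monte Carlo p-value in the spirit of Dwass (1957)/Barnard (1963):
    simulate the test statistic under the null and count how many simulated
    values are at least as extreme.  The resulting test has exact level
    alpha whenever alpha * (n_sims + 1) is an integer and the null
    distribution involves no nuisance parameters."""
    rng = np.random.default_rng(seed)
    sims = np.array([simulate_null(rng) for _ in range(n_sims)])
    return (1 + int(np.sum(sims >= observed))) / (n_sims + 1)

# Demo: a two-sided test of H0: mean = 0 built purely by simulation.
def abs_t(x):
    n = len(x)
    return abs(x.mean() / (x.std(ddof=1) / np.sqrt(n)))

data = np.array([0.9, 1.4, -0.2, 2.1, 0.7, 1.1, 0.3, 1.8, -0.5, 1.2])
p = mc_pvalue(abs_t(data),
              lambda rng: abs_t(rng.standard_normal(len(data))),
              n_sims=999)
```

The MMC extension of the paper maximizes such a simulated p-value over the nuisance-parameter space, which this sketch does not attempt.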
  4. By: Marie-Claude Beaulieu; Jean-Marie Dufour; Lynda Khalaf
    Abstract: In this paper, we propose exact inference procedures for asset pricing models that can be formulated in the framework of a multivariate linear regression (CAPM), allowing for stable error distributions. The normality assumption on the distribution of stock returns is usually rejected in empirical studies, due to excess kurtosis and asymmetry. To model such data, we propose a comprehensive statistical approach which allows for alternative, possibly asymmetric, heavy-tailed distributions without the use of large-sample approximations. The methods suggested are based on Monte Carlo test techniques. Goodness-of-fit tests are formally incorporated to ensure that the error distributions considered are empirically sustainable, from which exact confidence sets for the unknown tail area and asymmetry parameters of the stable error distribution are derived. Tests for the efficiency of the market portfolio (zero intercepts) which explicitly allow for the presence of (unknown) nuisance parameters in the stable error distribution are derived. The methods proposed are applied to monthly returns on 12 portfolios of the New York Stock Exchange over the period 1926-1995 (5-year subperiods). We find that stable, possibly skewed, distributions provide a statistically significant improvement in goodness-of-fit and lead to fewer rejections of the efficiency hypothesis.
    Keywords: capital asset pricing model; mean-variance efficiency; non-normality; multivariate linear regression; stable distribution; skewness; kurtosis; asymmetry; uniform linear hypothesis; exact test; Monte Carlo test; nuisance parameter; specification test; diagnostics
    JEL: C3 C12 C33 C15 G1 G12 G14
    Date: 2005–02–01
  5. By: Angelini, Elena; Henry, Jérôme; Marcellino, Massimiliano
    Abstract: Existing methods for data interpolation or backdating are either univariate or based on a very limited number of series, due to data and computing constraints that were binding until the recent past. Nowadays large datasets are readily available, and models with hundreds of parameters are quickly estimated. We model these large datasets with a factor model, and develop an interpolation method that exploits the estimated factors as an efficient summary of all the available information. The method is compared with existing standard approaches from a theoretical point of view, by means of Monte Carlo simulations, and also when applied to actual macroeconomic series. The results indicate that our method is more robust to model misspecification, although traditional multivariate methods also work well, while univariate approaches are systematically outperformed. When interpolated series are subsequently used in econometric analyses, biases can emerge, depending on the type of interpolation, but these biases are again reduced with multivariate approaches, including factor-based ones.
    Keywords: Factor model; Interpolation; Kalman filter; spline
    JEL: C32 C43 C82
    Date: 2004–10
  6. By: Pesavento, Elena; Rossi, Barbara
    Abstract: Existing methods for constructing confidence bands for multivariate impulse response functions depend on auxiliary assumptions on the order of integration of the variables. Thus, they may have poor coverage at long lead times when variables are highly persistent. Solutions that have been proposed in the literature may be computationally challenging. The goal of this Paper is to propose a simple method for constructing confidence bands for impulse response functions that is not pointwise and that is robust to the presence of highly persistent processes. The method uses alternative approximations based on local-to-unity asymptotic theory and allows the lead time of the impulse response function to be a fixed fraction of the sample size. These devices provide better approximations in small samples. Monte Carlo simulations show that our method tends to have better coverage properties at long horizons than existing methods. We also investigate the properties of the various methods in terms of the length of their confidence bands. Finally, we show, with empirical applications, that our method may provide different economic interpretations of the data. Applications to real GDP and to nominal versus real sources of fluctuations in exchange rates are discussed.
    Keywords: impulse response functions; local to unity asymptotics; persistence; VARs
    JEL: C12 C32 F40
    Date: 2004–09
  7. By: Pesaran, M Hashem; Pettenuzzo, Davide; Timmermann, Allan G
    Abstract: This Paper provides a novel approach to forecasting time series subject to discrete structural breaks. We propose a Bayesian estimation and prediction procedure that allows for the possibility of new breaks over the forecast horizon, taking account of the size and duration of past breaks (if any) by means of a hierarchical hidden Markov chain model. Predictions are formed by integrating over the hyperparameters from the meta distributions that characterize the stochastic break point process. In an application to US Treasury bill rates, we find that the method leads to better out-of-sample forecasts than alternative methods that ignore breaks, particularly at long horizons.
    Keywords: Bayesian model averaging; forecasting; hierarchical hidden Markov Chain Model; structural breaks
    JEL: C11 C15 C53
    Date: 2004–09
  8. By: Elliott, Graham; Timmermann, Allan G
    Abstract: This Paper proposes a new forecast combination method that lets the combination weights be driven by regime switching in a latent state variable. An empirical application that combines forecasts from survey data and time series models finds that the proposed regime switching combination scheme performs well for a variety of macroeconomic variables. Monte Carlo simulations shed light on the type of data generating processes for which the proposed combination method can be expected to perform better than a range of alternative combination schemes. Finally, we show how time-variations in the combination weights arise when the target variable and the predictors share a common factor structure driven by a hidden Markov process.
    Keywords: forecast combination; Markov switching; survey data; time-varying combination weights
    JEL: C53
    Date: 2004–10
  9. By: Frühwirth-Schnatter, Sylvia; Kaufmann, Sylvia
    Abstract: We propose to exploit the attractiveness of pooling relatively short time series that display similar dynamics, without restricting all series to a single group. We suggest estimating the appropriate grouping of time series simultaneously with the group-specific model parameters. We cast estimation into the Bayesian framework and use Markov chain Monte Carlo simulation methods. We discuss model identification and base model selection on marginal likelihoods. A simulation study documents the efficiency gains in estimation and forecasting that are realized when appropriately grouping the time series of a panel. Two economic applications illustrate the usefulness of the method, covering extensions to Markov switching within clusters and to heterogeneity within clusters, respectively.
    Keywords: clustering; Markov chain Monte Carlo; Markov Switching; mixture modelling; panel data
    JEL: C11 C33 E32
    Date: 2004–09
  10. By: Kandel, Shmuel; Zilca, Shlomo
    Abstract: Cochrane’s variance ratio is a leading tool for detection of deviations from random walks in financial asset prices. This Paper develops a variance ratio related regression model that can be used for prediction. We suggest a comprehensive framework for our model, including model identification, model estimation and selection, bias correction, model diagnostic check, and an inference procedure. We use our model to study and model mean reversion in the NYSE index in the period 1825-2002. We demonstrate that in addition to mean reversion, our model can generate other characteristic properties of financial asset prices, such as short-term persistence and volatility clustering of unconditional returns.
    Keywords: mean reversion; persistence; variance ratio
    Date: 2004–11
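The variance ratio the abstract starts from compares the variance of multi-period returns to a multiple of the one-period variance; values below one at long horizons are the classic symptom of mean reversion. A minimal sketch of the statistic itself (not the paper's regression model; the demo data are synthetic):

```python
import numpy as np

def variance_ratio(returns, q):
    """Cochrane-type variance ratio
        VR(q) = Var(q-period return) / (q * Var(1-period return)),
    computed from overlapping q-period sums of demeaned returns.
    VR(q) = 1 under a random walk; VR(q) < 1 suggests mean reversion."""
    r = np.asarray(returns, dtype=float)
    r = r - r.mean()
    var1 = r.var(ddof=1)
    rq = np.convolve(r, np.ones(q), mode="valid")  # overlapping q-period sums
    return rq.var(ddof=1) / (q * var1)

# Demo: i.i.d. returns give VR near 1; returns with negative serial
# correlation (an MA(1) with negative coefficient) give VR well below 1.
rng = np.random.default_rng(0)
eps = rng.standard_normal(5000)
iid_returns = eps
mr_returns = eps[1:] - 0.4 * eps[:-1]
vr_iid = variance_ratio(iid_returns, 5)
vr_mr = variance_ratio(mr_returns, 5)
```

For the MA(1) demo the population value is VR(5) = (5(1.16) - 3.2)/(5(1.16)) ≈ 0.45, so the gap from 1 is large even in moderate samples.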
  11. By: Konstantin A., KHOLODILIN; Wension Vincent, YAO
    Abstract: This paper develops dynamic factor models with regime switching to account for the decreasing volatility of the U.S. economy observed since the mid-1980s. Apart from the Markov switching capturing cyclical fluctuations, an additional type of regime switching is introduced to allow variances to switch between distinct regimes. The resulting four-regime models extend the univariate analysis currently used in the literature on the structural break in conditional volatility to multivariate time series. Besides the dynamic factor model using data with a single (monthly) frequency, we employ the additional information in mixed-frequency data, which include not only the monthly component series but also an important quarterly series, real GDP. The evaluation of six different nonlinear models suggests that the probabilities derived from all the models comply with NBER business cycle dating and detect a one-time shift from the high-variance to the low-variance state in February 1984. In addition, we find that mixed-frequency models outperform single-frequency models; restricted models outperform unrestricted models; and four-regime switching models outperform two-regime switching models.
    Keywords: Volatility; Structural break; Composite coincident indicator; Dynamic factor model; Markov switching; Mixed-frequency data
    JEL: E32 C10
    Date: 2004–09–15
  12. By: Campagnoli Patrizia (University of Pavia, Italy); Muliere Pietro (University of Bocconi, Italy); Petrone Sonia (Department of Economics, University of Insubria, Italy)
    Abstract: In this paper we consider a class of conditionally Gaussian state space models and discuss how they can provide a flexible and fairly simple tool for modelling financial time series, even in the presence of different components in the series or of stochastic volatility. Estimates can be computed by recursive equations, which provide the optimal solution under rather mild assumptions. In more general models, the filter equations can still provide approximate solutions. We also discuss how some models traditionally employed for analysing financial time series can be regarded in the state-space framework. Finally, we illustrate the models with two applications to real data sets.
    Keywords: dynamic linear models; conditionally Gaussian models; Kalman filter; stochastic regressors; stochastic volatility; GARCH models
  13. By: Paruolo Paolo (Department of Economics, University of Insubria, Italy)
    Abstract: This paper considers likelihood ratio (LR) cointegration rank tests in vector autoregressive (VAR) models; the local power of the most widely used LR 'trace' test is compared with that of the LR 'lambda max' test. It is found that neither test uniformly dominates the other. Moreover, it is shown that the asymptotic properties of the estimator of the cointegration rank based on the trace test are shared by a similar estimator based on the lambda max test. These results indicate that both tests are admissible.
    Keywords: Cointegration, Likelihood Ratio, Unit roots, Local Power
  14. By: Mira Antonietta (Department of Economics, University of Insubria, Italy)
    Abstract: The class of Metropolis-Hastings algorithms can be modified by delaying the rejection of proposed moves. The new samplers are proved to perform better than the original ones in terms of asymptotic variance of the estimates on a sweep-by-sweep basis. The delayed-rejection algorithms also leave some room for local adaptation of the proposal distribution. We give an iterative formula for the acceptance probability at the i-th stage of the delaying process. A special case is discussed in detail: the delayed-rejection algorithm with a symmetric proposal distribution.
    Keywords: Markov chain Monte Carlo Methods, Metropolis-Hastings algorithm, Asymptotic variance, Peskun ordering
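To make the idea concrete, here is a minimal one-stage delayed-rejection random-walk Metropolis sampler targeting a standard normal. This is a sketch of the general mechanism under assumed Gaussian proposals, not the paper's derivation; the proposal scales and target are arbitrary choices for the demo.

```python
import numpy as np

def dr_mh_normal(n_steps=10000, sigma1=2.5, sigma2=0.5, seed=0):
    """Delayed-rejection Metropolis sampler for N(0,1).  If the first
    (wide) symmetric proposal y1 is rejected, a second (narrow) symmetric
    proposal y2 is tried, with the stage-2 acceptance probability chosen
    so that detailed balance with respect to the target is preserved."""
    rng = np.random.default_rng(seed)

    def log_pi(z):                # unnormalized N(0,1) log density
        return -0.5 * z * z

    def q1_pdf(a, b):             # density of the first-stage proposal b given a
        return np.exp(-0.5 * ((b - a) / sigma1) ** 2) / (sigma1 * np.sqrt(2 * np.pi))

    x = 0.0
    chain = np.empty(n_steps)
    for t in range(n_steps):
        y1 = x + sigma1 * rng.standard_normal()
        a1 = min(1.0, np.exp(log_pi(y1) - log_pi(x)))
        if rng.random() < a1:
            x = y1
        else:
            # Delayed rejection: a second, smaller move from the current state.
            # The second-stage proposal density cancels by symmetry; the
            # first-stage densities and reverse-path rejection term do not.
            y2 = x + sigma2 * rng.standard_normal()
            a1_rev = min(1.0, np.exp(log_pi(y1) - log_pi(y2)))
            num = np.exp(log_pi(y2)) * q1_pdf(y2, y1) * (1.0 - a1_rev)
            den = np.exp(log_pi(x)) * q1_pdf(x, y1) * (1.0 - a1)
            if den > 0 and rng.random() < min(1.0, num / den):
                x = y2
        chain[t] = x
    return chain

chain = dr_mh_normal()
```

The second stage recycles information from the rejected move (here, trying a smaller step), which is the source of the variance reduction the abstract claims.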
  15. By: Paruolo Paolo (Department of Economics, University of Insubria, Italy)
    Abstract: This paper provides asymptotic standard errors for the moving average (MA) impact matrix for the second differences of a vector autoregressive (VAR) process integrated of order 2, I(2). Standard errors of the row space of the MA impact matrix are also provided; bases of this row space define the common I(2) trends linear combinations. These standard errors are then used to formulate Wald-type tests. The MA impact matrix is shown to be linked to impact factors which measure the total effect of disequilibrium errors on the growth rate of the system. Most of the relevant limit distributions are Gaussian, and we report artificial regressions that can be used to calculate the estimators of the asymptotic variances. The use of the techniques proposed in the paper is illustrated on UK money data.
    Keywords: Cointegration, Common trends, VAR, I(2), ML, 2SI2
  16. By: Paruolo Paolo (Department of Economics, University of Insubria, Italy)
    Abstract: This paper considers the asymptotic analysis of the likelihood ratio (LR) cointegration (CI) rank test in vector autoregressive (VAR) models when some CI vectors are known and fixed. It is shown that the limit law is free of nuisance parameters. In the case of LR tests against the alternative of a completely unrestricted CI space, the limit law can be expressed as the convolution of known distributions. This deconvolution is employed to approximate the quantiles of the distribution, without resorting to new simulations.
    Keywords: cointegration, likelihood ratio, unit roots
  17. By: Paruolo Paolo (Department of Economics, University of Insubria, Italy)
    Abstract: This paper derives standard errors for Monte Carlo (MC) estimators of (relative) power of tests when the critical values under the null have also been estimated. This situation is common e.g. in unit root and cointegration tests. The associated issue of MC design is discussed. The results are illustrated on likelihood based tests for cointegration rank determination.
    Keywords: Monte Carlo, design of experiments, (local) power, cointegration, likelihood ratio, unit roots
  18. By: Todd E. Clark; Kenneth D. West
    Abstract: We consider using out-of-sample mean squared prediction errors (MSPEs) to evaluate the null that a given series follows a zero mean martingale difference against the alternative that it is linearly predictable. Under the null of no predictability, the population MSPE of the null “no change” model equals that of the linear alternative. We show analytically and via simulations that despite this equality, the alternative model’s sample MSPE is expected to be greater than the null’s. For rolling regression estimators of the alternative model’s parameters, we propose and evaluate an asymptotically normal test that properly accounts for the upward shift of the sample MSPE of the alternative model. Our simulations indicate that our proposed procedure works well.
    JEL: C22 C53
    Date: 2005–01
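The upward shift the abstract describes is easy to reproduce by simulation: under a white-noise null, an estimated AR(1) alternative usually posts a larger out-of-sample MSPE than the zero forecast, purely because of parameter-estimation noise. This sketch only illustrates that phenomenon; it does not implement the authors' adjusted test statistic, and all tuning constants are arbitrary.

```python
import numpy as np

def mspe_gap_under_null(n=400, window=100, n_reps=100, seed=0):
    """Fraction of replications in which a rolling-window AR(1) forecast has
    a larger sample MSPE than the 'no change' (zero) forecast, when the data
    are a zero-mean martingale difference (here: Gaussian white noise) so
    that the two population MSPEs are equal."""
    rng = np.random.default_rng(seed)
    alt_larger = 0
    for _ in range(n_reps):
        y = rng.standard_normal(n)          # no predictability by construction
        se_null = se_alt = 0.0
        for t in range(window, n - 1):
            yw = y[t - window:t]
            x0, x1 = yw[:-1], yw[1:]
            d0 = x0 - x0.mean()
            b = float(d0 @ (x1 - x1.mean())) / float(d0 @ d0)  # rolling OLS AR(1)
            a = x1.mean() - b * x0.mean()
            se_null += y[t + 1] ** 2                  # null forecast is zero
            se_alt += (y[t + 1] - (a + b * y[t])) ** 2
        alt_larger += se_alt > se_null
    return alt_larger / n_reps

frac = mspe_gap_under_null()
```

Because the fitted intercept and slope are pure noise under the null, the alternative's squared errors carry an extra estimation-variance term, so `frac` sits well above one half; a naive MSPE comparison is therefore biased toward the null model, which is what the Clark-West adjustment corrects.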
  19. By: Jean-Philippe Bouchaud (Science & Finance, Capital Fund Management; CEA Saclay;); Josep Perello; Jaume Masoliver
    Abstract: Financial time series exhibit two different types of nonlinear correlations: (i) volatility autocorrelations that have a very long range memory, on the order of years, and (ii) asymmetric return-volatility (or `leverage') correlations that are much shorter ranged. Different stochastic volatility models have been proposed in the past to account for both these correlations. However, in these models, the decay of the correlations is exponential, with a single time scale for both the volatility and the leverage correlations, at variance with observations. We extend the linear Ornstein-Uhlenbeck stochastic volatility model by assuming that the mean reverting level is itself random. We find that the resulting three-dimensional diffusion process can account for different correlation time scales. We show that the results are in good agreement with a century of the Dow Jones index daily returns (1900-2000), with the exception of crash days.
    JEL: G10
  20. By: Jean-Philippe Bouchaud (Science & Finance, Capital Fund Management; CEA Saclay;); Marc Mezard (Universite Paris Sud (Orsay)); Irene Giardina
    Abstract: We propose a general interpretation for long-range correlation effects in the activity and volatility of financial markets. This interpretation is based on the fact that the choice between `active' and `inactive' strategies is subordinated to random-walk like processes. We numerically demonstrate our scenario in the framework of simplified market models, such as the Minority Game model with an inactive strategy, or a more sophisticated version that includes some price dynamics. We show that real market data can be surprisingly well accounted for by these simple models.
    JEL: G10
  21. By: Atsushi Inoue (North Carolina State University); Mototsugu Shintani (Department of Economics, Vanderbilt University)
    Abstract: This paper establishes that the bootstrap provides asymptotic refinements for the generalized method of moments estimator of overidentified linear models when autocorrelation structures of moment functions are unknown. When moment functions are uncorrelated after finite lags, Hall and Horowitz (1996) showed that errors in the rejection probabilities of the symmetrical t test and the test of overidentifying restrictions based on the bootstrap are O(T^(-1)). In general, however, such a parametric rate cannot be obtained with the heteroskedasticity and autocorrelation consistent (HAC) covariance matrix estimator since it converges at a nonparametric rate that is slower than T^(-1/2). By taking into account the HAC covariance matrix estimator in the Edgeworth expansion, we show that the bootstrap provides asymptotic refinements when kernels whose characteristic exponent is greater than two are used. Moreover, we find that the order of the bootstrap approximation error can be made arbitrarily close to o(T^(-1)) provided moment conditions are satisfied. The bootstrap approximation thus improves upon the first-order asymptotic approximation even when there is a general autocorrelation. A Monte Carlo experiment shows that the bootstrap improves the accuracy of inference on regression parameters in small samples. We apply our bootstrap method to inference about the parameters in the monetary policy reaction function.
    Keywords: Asymptotic refinements, block bootstrap, dependent data, Edgeworth expansions, instrumental variables
    JEL: C12 C22 C32
    Date: 2001–12
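The generic resampling device behind bootstrap methods for dependent data is the moving-block bootstrap: resample overlapping blocks so that short-range dependence survives inside each block. This sketch only bootstraps a univariate sample mean, a far simpler object than the GMM statistics of the paper; block length and sizes are arbitrary demo choices.

```python
import numpy as np

def moving_block_bootstrap(x, block_len, n_boot, seed=0):
    """Moving-block bootstrap of the sample mean for a dependent series:
    draw blocks of consecutive observations (with replacement, from all
    overlapping starting points), concatenate them, truncate to the
    original length, and record the resampled mean."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n = len(x)
    n_blocks = int(np.ceil(n / block_len))
    max_start = n - block_len + 1
    means = np.empty(n_boot)
    for b in range(n_boot):
        starts = rng.integers(0, max_start, size=n_blocks)
        sample = np.concatenate([x[s:s + block_len] for s in starts])[:n]
        means[b] = sample.mean()
    return means

# Demo: estimate the standard error of the mean of a (here i.i.d.) series.
rng_demo = np.random.default_rng(42)
x = rng_demo.standard_normal(500)
boot_means = moving_block_bootstrap(x, block_len=10, n_boot=500)
se_hat = boot_means.std(ddof=1)
```

The refinement results of the paper concern how the bootstrap distribution of studentized statistics, built from resamples like these, approximates the finite-sample distribution better than the normal limit.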
  22. By: Xiaohong Chen (Department of Economics, New York University); Yanqin Fan (Department of Economics, Vanderbilt University)
    Abstract: In this paper, we develop a general approach for constructing simple tests for the correct density forecasts, or equivalently, for i.i.d. uniformity of appropriately transformed random variables. It is based on nesting a series of i.i.d. uniform random variables into a class of copula-based stationary Markov processes. As such, it can be used to test for i.i.d. uniformity against alternative processes that exhibit a wide variety of marginal properties and temporal dependence properties, including skewed and fat-tailed marginal distributions, asymmetric dependence, and positive tail dependence. In addition, we develop tests for the dependence structure of the forecasting model that are robust to possible misspecification of the marginal distribution.
    Keywords: Density forecasts, Gaussian copula, probability integral transform, nonlinear time series
    JEL: C22 C52 C53
    Date: 2002–10
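The transformation underlying such tests is the probability integral transform: under a correct density forecast, the transformed observations are i.i.d. Uniform(0,1). A minimal diagnostic sketch, checking uniformity via a Kolmogorov-Smirnov distance and serial dependence via a lag-1 autocorrelation (the paper's copula-based tests are considerably more powerful; this is only the starting point):

```python
import math
import numpy as np

def pit_check(y, cdf):
    """Probability integral transform (PIT) diagnostics.  If the forecast
    CDF is correct, u_t = F(y_t) is i.i.d. Uniform(0,1).  Returns the KS
    distance of the PITs from the uniform CDF and their lag-1
    autocorrelation (a crude independence check)."""
    u = np.asarray([cdf(v) for v in y], dtype=float)
    us = np.sort(u)
    n = len(us)
    grid = np.arange(1, n + 1) / n
    ks = max(float(np.max(grid - us)), float(np.max(us - (grid - 1.0 / n))))
    ac1 = float(np.corrcoef(u[:-1], u[1:])[0, 1])
    return ks, ac1

# Demo: the density forecast is N(0,1) and the data really are N(0,1),
# so both diagnostics should be small.
def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

rng = np.random.default_rng(3)
y = rng.standard_normal(2000)
ks, ac1 = pit_check(y, norm_cdf)
```

The copula approach of the paper replaces the crude autocorrelation check with tests against specific stationary Markov alternatives, including asymmetric and tail-dependent ones.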

This nep-ets issue is ©2005 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.