on Econometric Time Series |
By: | Fabio Busetti (Bank of Italy); Andrew Harvey (Cambridge University) |
Abstract: | The paper examines various tests for assessing whether a time series model requires a slope component. We first consider the simple t-test on the mean of first differences and show that it achieves high power against the alternative hypothesis of a stochastic nonstationary slope as well as against a purely deterministic slope. The test may be modified, parametrically or nonparametrically, to deal with serial correlation. Using both local limiting power arguments and finite sample Monte Carlo results, we compare the t-test with the nonparametric tests of Vogelsang (1998) and with a modified stationarity test. Overall, the t-test seems a good choice, particularly if it is implemented by fitting a parametric model to the data. When standardized by the square root of the sample size, the simple t-statistic, with no correction for serial correlation, has a limiting distribution if the slope is stochastic. We investigate whether it is a viable test for the null hypothesis of a stochastic slope and conclude that its value may be limited by an inability to reject a small deterministic slope. Empirical illustrations are provided using series of relative prices in the euro area and data on global temperature. |
Keywords: | Cramér-von Mises distribution, stationarity test, stochastic trend, unit root, unobserved component. |
JEL: | C22 C52 |
URL: | http://d.repec.org/n?u=RePEc:bdi:wptemi:td_614_07&r=ets |
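The slope test in the Busetti-Harvey entry above boils down to a t-statistic on the mean of first differences, with an optional correction for serial correlation. The sketch below is only an illustration of those two headline cases (plain, and Bartlett/Newey-West corrected); the simulated series, lag choice and function names are assumptions, not the authors' code.

```python
import numpy as np

def slope_t_stat(y, hac_lags=None):
    """t-statistic on the mean of the first differences of y.

    With hac_lags=None the plain t-statistic is returned; otherwise the
    long-run variance is estimated with Bartlett (Newey-West) weights.
    """
    d = np.diff(y)                     # first differences
    n = d.size
    dbar = d.mean()
    u = d - dbar
    if hac_lags is None:
        lrv = u @ u / (n - 1)          # ordinary sample variance
    else:
        lrv = u @ u / n
        for k in range(1, hac_lags + 1):
            w = 1.0 - k / (hac_lags + 1.0)   # Bartlett kernel weight
            lrv += 2.0 * w * (u[k:] @ u[:-k]) / n
    return dbar / np.sqrt(lrv / n)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(200)
    y = 0.05 * t + rng.standard_normal(200).cumsum()   # random walk plus small drift
    print("t-stat (no correction):", round(slope_t_stat(y), 2))
    print("t-stat (HAC, 4 lags):  ", round(slope_t_stat(y, hac_lags=4), 2))
```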
By: | Fabio Busetti; Andrew Harvey |
Abstract: | Quantiles provide a comprehensive description of the properties of a variable and tracking changes in quantiles over time using signal extraction methods can be informative. It is shown here how stationarity tests can be generalized to test the null hypothesis that a particular quantile is constant over time by using weighted indicators. Corresponding tests based on expectiles are also proposed; these might be expected to be more powerful for distributions that are not heavy-tailed. Tests for changing dispersion and asymmetry may be based on contrasts between particular quantiles or expectiles. We report Monte Carlo experiments investigating the effectiveness of the proposed tests and then move on to consider how to test for relative time invariance, based on residuals from fitting a time-varying level or trend. Empirical examples, using stock returns and U.S. inflation, provide an indication of the practical importance of the tests. |
Keywords: | Dispersion; expectiles; quantiles; skewness; stationarity tests; stochastic volatility; value at risk. |
JEL: | C12 C22 |
Date: | 2007–03 |
URL: | http://d.repec.org/n?u=RePEc:cam:camdae:0701&r=ets |
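The core device in the quantile-constancy paper above is a stationarity test applied to weighted quantile indicators. As a rough picture of the idea, here is a generic KPSS-type statistic computed from the indicator series for a chosen quantile; the exact weighting, serial-correlation corrections and critical values studied in the paper are not reproduced, and the simulated data are illustrative.

```python
import numpy as np

def quantile_stationarity_stat(y, tau=0.5):
    """KPSS-type statistic applied to the tau-quantile indicator series.

    Under the null of a constant tau-quantile the indicators
    tau - 1{y_t < q_tau} have mean zero, so large values of the
    normalised cumulated sum point towards a time-varying quantile.
    """
    y = np.asarray(y, dtype=float)
    n = y.size
    q = np.quantile(y, tau)               # full-sample quantile estimate
    ind = tau - (y < q).astype(float)     # quantile indicator ("check" score)
    s = np.cumsum(ind)                    # partial sums
    variance = tau * (1.0 - tau)          # indicator variance under the null
    return (s @ s) / (n ** 2 * variance)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    stable = rng.standard_normal(500)
    shifted = np.concatenate([rng.standard_normal(250),
                              1.5 + rng.standard_normal(250)])  # median shifts mid-sample
    print("constant median:", round(quantile_stationarity_stat(stable), 3))
    print("shifting median:", round(quantile_stationarity_stat(shifted), 3))
```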
By: | Giuliano De Rossi; Andrew Harvey |
Abstract: | A time-varying quantile can be fitted to a sequence of observations by formulating a time series model for the corresponding population quantile and iteratively applying a suitably modified state space signal extraction algorithm. It is shown that such time-varying quantiles satisfy the defining property of fixed quantiles in having the appropriate number of observations above and below. Expectiles are similar to quantiles except that they are defined by tail expectations. Like quantiles, time-varying expectiles can be estimated by a state space signal extraction algorithm and they satisfy properties that generalize the moment conditions associated with fixed expectiles. Time-varying quantiles and expectiles provide information on various aspects of a time series, such as dispersion and asymmetry, while estimates at the end of the series provide the basis for forecasting. Because the state space form can handle irregularly spaced observations, the proposed algorithms can be easily adapted to provide a viable means of computing spline-based non-parametric quantile and expectile regressions. |
Keywords: | Asymmetric least squares; cubic splines; dispersion; non-parametric regression; quantile regression; signal extraction; state space smoother. |
JEL: | C14 C22 |
Date: | 2007–02 |
URL: | http://d.repec.org/n?u=RePEc:cam:camdae:0702&r=ets |
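The entry above estimates time-varying expectiles with a state space smoother, which is beyond a short sketch; the fragment below only computes a fixed expectile by asymmetric (iteratively re-weighted) least squares, to make the tail-expectation idea concrete. The function name and tolerance settings are assumptions for illustration.

```python
import numpy as np

def expectile(y, tau=0.9, tol=1e-10, max_iter=200):
    """Fixed tau-expectile by iteratively re-weighted least squares.

    The expectile solves an asymmetric least-squares problem: observations
    above the current value get weight tau, those below get 1 - tau.
    """
    y = np.asarray(y, float)
    mu = y.mean()
    for _ in range(max_iter):
        w = np.where(y > mu, tau, 1.0 - tau)
        mu_new = np.sum(w * y) / np.sum(w)
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

if __name__ == "__main__":
    rng = np.random.default_rng(14)
    y = rng.standard_normal(10000)
    print("0.5-expectile (the mean):  ", round(expectile(y, 0.5), 3))
    print("0.9-expectile:             ", round(expectile(y, 0.9), 3))
    print("0.9-quantile, for contrast:", round(np.quantile(y, 0.9), 3))
```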
By: | Adrian Pagan; M. Hashem Pesaran |
Abstract: | This paper considers the implications of the permanent/transitory decomposition of shocks for identification of structural models in the general case where the model might contain more than one permanent structural shock. It provides a simple and intuitive generalization of the influential work of Blanchard and Quah (1989), and shows that structural equations for which there are known permanent shocks must have no error correction terms present in them, thereby freeing up the latter to be used as instruments in estimating their parameters. The proposed approach is illustrated by a re-examination of the identification scheme used in a monetary model by Wickens and Motta (2001), and in a well known paper by Gali (1992) which deals with the construction of an IS-LM model with supply-side effects. We show that the latter imposes more short-run restrictions than are needed because of a failure to fully utilize the cointegration information. |
Keywords: | Permanent shocks, structural identification, error correction models, IS-LM models. |
JEL: | C30 C32 E10 |
Date: | 2007–01 |
URL: | http://d.repec.org/n?u=RePEc:cam:camdae:0704&r=ets |
By: | Gernot Doppelhofer; Jesus Crespo Cuaresma |
Abstract: | We propose a Bayesian Averaging of Thresholds (BAT) approach for assessing the existence and quantifying the effect of threshold effects in cross-country growth regressions in the presence of model uncertainty. The BAT method extends the Bayesian Averaging of Classical Estimates (BACE) approach proposed by Sala-i-Martin, Doppelhofer, and Miller (2004) by allowing for uncertainty over nonlinear threshold effects. We apply our method to a set of determinants of long-term economic growth in a cross section of 88 countries. Our results suggest that when model uncertainty is taken into account there is no evidence for robust threshold effects caused by the Initial Income, measured by GDP per capita in 1960, but that the Number of Years an Economy Has Been Open is an important source of nonlinear effects on growth. |
Keywords: | Model Uncertainty, Model Averaging, Threshold Estimation, Non-Linearities, Determinants of Economic Growth |
JEL: | C11 C15 O20 O50 |
Date: | 2007–02 |
URL: | http://d.repec.org/n?u=RePEc:cam:camdae:0706&r=ets |
By: | Elliott, Graham; Timmermann, Allan G |
Abstract: | Forecasts guide decisions in all areas of economics and finance and their value can only be understood in relation to, and in the context of, such decisions. We discuss the central role of the loss function in helping determine the forecaster's objectives and use this to present a unified framework for both the construction and evaluation of forecasts. Challenges arise from the explosion in the sheer volume of predictor variables under consideration and the forecaster's ability to entertain an endless array of functional forms and time-varying specifications, none of which may coincide with the `true' model. Methods for comparing the forecasting performance of pairs of models or evaluating the ability of the best of many models to beat a benchmark specification are also reviewed. |
Keywords: | economic forecasting; forecast evaluation; loss function |
JEL: | C53 |
Date: | 2007–03 |
URL: | http://d.repec.org/n?u=RePEc:cpr:ceprdp:6158&r=ets |
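A small numerical illustration of the survey's central point, that the loss function determines the forecaster's objective: under asymmetric lin-lin loss the optimal point forecast is a quantile of the predictive distribution, not its mean. The code is a self-contained toy example, not drawn from the paper.

```python
import numpy as np

def optimal_point_forecast(draws, a=0.75):
    """Optimal forecast under lin-lin loss with asymmetry a: the a-quantile."""
    return np.quantile(draws, a)

def linlin_loss(outcome, forecast, a=0.75):
    """L(e) = a*e if e > 0 else (a-1)*e, with e = outcome - forecast."""
    e = outcome - forecast
    return np.where(e > 0, a * e, (a - 1) * e)

if __name__ == "__main__":
    rng = np.random.default_rng(13)
    draws = rng.normal(2.0, 1.0, 100000)       # predictive distribution
    outcomes = rng.normal(2.0, 1.0, 100000)    # realisations from the same law
    for f, label in [(np.mean(draws), "mean (optimal under squared loss)"),
                     (optimal_point_forecast(draws, 0.75), "0.75-quantile forecast")]:
        avg = linlin_loss(outcomes, f, 0.75).mean()
        print(f"{label:35s} average lin-lin loss: {avg:.4f}")
```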
By: | George Kapetanios (Queen Mary, University of London); Zacharias Psaradakis (Birkbeck, University of London) |
Abstract: | This paper considers the problem of statistical inference in linear regression models whose stochastic regressors and errors may exhibit long-range dependence. A time-domain sieve-type generalized least squares (GLS) procedure is proposed based on an autoregressive approximation to the generating mechanism of the errors. The asymptotic properties of the sieve-type GLS estimator are established. A Monte Carlo study examines the finite-sample properties of the method for testing regression hypotheses. |
Keywords: | Autoregressive approximation, Generalized least squares, Linear regression, Long-range dependence, Spectral density |
JEL: | C12 C13 C22 |
Date: | 2007–03 |
URL: | http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp587&r=ets |
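A bare-bones version of the sieve-type GLS idea described above, assuming a fixed autoregressive order rather than the data-driven truncation analysed in the paper: estimate by OLS, fit an AR(p) to the residuals, quasi-difference the data with the fitted filter and re-estimate. The lag order and simulated design are assumptions for illustration.

```python
import numpy as np

def sieve_gls(y, X, p=4):
    """Feasible GLS with an AR(p) approximation to the error process."""
    y = np.asarray(y, float)
    X = np.asarray(X, float)
    beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    u = y - X @ beta_ols

    # AR(p) regression of the OLS residuals on their own lags
    Z = np.column_stack([u[p - j: len(u) - j] for j in range(1, p + 1)])
    phi, *_ = np.linalg.lstsq(Z, u[p:], rcond=None)

    def quasi_difference(v):
        out = v[p:].astype(float).copy()
        for j in range(1, p + 1):
            out -= phi[j - 1] * v[p - j: len(v) - j]
        return out

    y_star = quasi_difference(y)
    X_star = np.column_stack([quasi_difference(X[:, k]) for k in range(X.shape[1])])
    beta_gls, *_ = np.linalg.lstsq(X_star, y_star, rcond=None)
    return beta_gls

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    n = 400
    x = rng.standard_normal(n)
    e = np.zeros(n)
    for t in range(1, n):                  # persistent AR(1) errors
        e[t] = 0.8 * e[t - 1] + rng.standard_normal()
    y = 1.0 + 2.0 * x + e
    X = np.column_stack([np.ones(n), x])
    print("sieve-GLS estimates:", sieve_gls(y, X, p=4).round(3))
```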
By: | George Kapetanios (Queen Mary, University of London); Andrew P. Blake (Bank of England) |
Abstract: | This paper develops theoretical results for the estimation of radial basis function neural network specifications, for dependent data, that do not require iterative estimation techniques. The approach exploits the properties of regression-based boosting algorithms. Both consistency and rate results are derived. An application to nonparametric specification testing illustrates the usefulness of the results. |
Keywords: | Neural Networks, Boosting |
JEL: | C12 C13 C22 |
Date: | 2007–03 |
URL: | http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp588&r=ets |
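One way to fit an RBF specification without nonlinear optimisation of centres and widths is L2 boosting over a dictionary of data-centred Gaussian basis functions. The sketch below is in that spirit but is not the authors' algorithm; the fixed width, shrinkage factor and stopping rule are arbitrary choices.

```python
import numpy as np

def rbf_boost(x, y, n_steps=200, width=0.5, shrink=0.1):
    """L2 boosting with Gaussian radial basis functions as base learners.

    At each step the RBF centred at the sample point that best explains the
    current residuals is selected and added with a small shrinkage factor.
    """
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    fit = np.zeros_like(y)
    centres, coefs = [], []
    # n x n dictionary: column j is the RBF centred at observation j
    basis = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * width ** 2))
    for _ in range(n_steps):
        resid = y - fit
        b = (basis.T @ resid) / (basis ** 2).sum(axis=0)      # LS coefficient per candidate
        sse = ((resid[:, None] - basis * b) ** 2).sum(axis=0)  # fit of each candidate
        j = int(np.argmin(sse))
        fit += shrink * b[j] * basis[:, j]
        centres.append(x[j])
        coefs.append(shrink * b[j])
    return fit, np.array(centres), np.array(coefs)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    x = np.sort(rng.uniform(-3, 3, 300))
    y = np.sin(2 * x) + 0.3 * rng.standard_normal(300)
    fitted, _, _ = rbf_boost(x, y)
    print("in-sample RMSE:", round(np.sqrt(np.mean((y - fitted) ** 2)), 3))
```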
By: | Andrea Carriero (Queen Mary, University of London); Massimiliano Marcellino (IEP-Bocconi University, IGIER and CEPR) |
Abstract: | In this paper we provide an overview of recent developments in the methodology for the construction of composite coincident and leading indexes, and apply them to the UK. In particular, we evaluate the relative merits of factor-based models and Markov switching specifications for the construction of coincident and leading indexes. For the leading indexes we also evaluate the performance of probit models and pooling. The results indicate that alternative methods produce similar coincident indexes, while there are more marked differences in the leading indexes. |
Keywords: | Forecasting, Business cycles, Leading indicators, Coincident indicators, Turning points |
JEL: | E32 E37 C53 |
Date: | 2007–03 |
URL: | http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp590&r=ets |
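A toy factor-based coincident index, of the kind compared in the entry above, can be formed as the first principal component of standardised coincident indicators; the simulated four-variable panel below stands in for actual UK series and is purely illustrative.

```python
import numpy as np

def coincident_index(panel):
    """First principal component of standardised coincident indicators.

    `panel` is a (T x k) array of series (e.g. production, employment,
    sales, income).  The sign is normalised so that the index is
    positively correlated with the first indicator.
    """
    z = (panel - panel.mean(axis=0)) / panel.std(axis=0)
    cov = np.cov(z, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    w = eigvec[:, -1]                      # loadings of the largest eigenvalue
    index = z @ w
    if np.corrcoef(index, z[:, 0])[0, 1] < 0:
        index = -index
    return index

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    T, k = 240, 4
    common = np.cumsum(rng.standard_normal(T))            # latent business-cycle factor
    panel = common[:, None] + rng.standard_normal((T, k)) # noisy indicators
    idx = coincident_index(panel)
    print("correlation with latent factor:", round(np.corrcoef(idx, common)[0, 1], 2))
```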
By: | Andrea Carriero (Queen Mary, University of London) |
Abstract: | Even though there is fairly extensive evidence against the Expectations Hypothesis (EH) of the term structure of interest rates, there still seems to be an element of truth in the theory which may be exploited for forecasting and simulation. This paper formalizes this idea by proposing a way to use the EH without imposing it dogmatically. It does so by using a Bayesian framework such that the extent to which the EH is imposed on the data is under the control of the researcher. This makes it possible to study a continuum of models ranging from one in which the EH holds exactly to one in which it does not hold at all. In between these two extremes, the EH features transitory deviations which may be explained by time-varying (but stationary) term premia and errors in expectations. Once cast in this framework, the EH holds on average (i.e. after integrating out the effect of the transitory deviations) and can be safely and effectively used for forecasting and simulation. |
Keywords: | Bayesian VARs, Expectations theory, Term structure |
JEL: | C11 E43 E44 E47 |
Date: | 2007–03 |
URL: | http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp591&r=ets |
By: | Norman Swanson (Rutgers University); Geetesh Bhardwaj (Rutgers University) |
Abstract: | This chapter builds on previous work by Bhardwaj and Swanson (2004) who address the notion that many fractional I(d) processes may fall into the “empty box” category, as discussed in Granger (1999). However, rather than focusing primarily on linear models, as do Bhardwaj and Swanson, we analyze the business cycle effects on the forecasting performance of ARFIMA, AR, MA, ARMA, GARCH, and STAR models. This is done via examination of ex ante forecasting evidence based on an updated version of the absolute returns series examined by Ding, Granger and Engle (1993), and via the use of Diebold and Mariano (1995) and Clark and McCracken (2001) predictive accuracy tests. Results are presented for a variety of forecast horizons and for recursive and rolling estimation schemes. We find that the business cycle does not seem to have an effect on the relative forecasting performance of ARFIMA models. |
Keywords: | fractional integration, long horizon prediction, long memory, parameter estimation error, stock returns |
JEL: | C15 C22 C53 |
Date: | 2006–09–22 |
URL: | http://d.repec.org/n?u=RePEc:rut:rutres:200613&r=ets |
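The predictive accuracy comparisons in the entry above rely in part on the Diebold-Mariano (1995) test. A compact version of that statistic is sketched below; the Bartlett long-run variance with h-1 lags is one common implementation choice, and the simulated forecast errors are illustrative.

```python
import numpy as np
from scipy import stats

def diebold_mariano(e1, e2, h=1, loss=np.square):
    """Diebold-Mariano test of equal predictive accuracy.

    e1, e2 are forecast-error series from two competing models; the
    long-run variance of the loss differential uses a Bartlett window
    with h-1 lags for h-step-ahead forecasts.
    """
    d = loss(np.asarray(e1, float)) - loss(np.asarray(e2, float))
    T = d.size
    dbar = d.mean()
    u = d - dbar
    lrv = u @ u / T
    for k in range(1, h):
        lrv += 2.0 * (1.0 - k / h) * (u[k:] @ u[:-k]) / T
    dm = dbar / np.sqrt(lrv / T)
    pval = 2 * (1 - stats.norm.cdf(abs(dm)))
    return dm, pval

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    e1 = rng.standard_normal(300)          # errors of model 1
    e2 = 1.2 * rng.standard_normal(300)    # noisier errors of model 2
    dm, p = diebold_mariano(e1, e2)
    print(f"DM statistic = {dm:.2f}, p-value = {p:.3f}")
```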
By: | Valentina Corradi (Queen Mary, University of London); Norman Swanson (Rutgers University); Geetesh Bhardwaj (Rutgers University) |
Abstract: | This paper makes two contributions. First, we outline a simple simulation based framework for constructing conditional distributions for multi-factor and multi-dimensional diffusion processes, for the case where the functional form of the conditional density is unknown. The distributions can be used, for example, to form conditional confidence intervals for time period t + τ, say, given information up to period t. Second, we use the simulation based approach to construct a test for the correct specification of a diffusion process. The suggested test is in the spirit of the conditional Kolmogorov test of Andrews (1997). However, in the present context the null conditional distribution is unknown and is replaced by its simulated counterpart. The limiting distribution of the test statistic is not nuisance parameter free. In light of this, asymptotically valid critical values are obtained via appropriate use of the block bootstrap. The suggested test has power against a larger class of alternatives than tests that are constructed using marginal distributions/densities, such as those in Aït-Sahalia (1996) and Corradi and Swanson (2005). The findings of a small Monte Carlo experiment underscore the good finite sample properties of the proposed test, and an empirical illustration underscores the ease with which the proposed simulation and testing methodology can be applied. |
Keywords: | block bootstrap, diffusion processes, parameter estimation error, simulated GMM, stochastic volatility |
JEL: | C22 C51 |
Date: | 2006–09–22 |
URL: | http://d.repec.org/n?u=RePEc:rut:rutres:200614&r=ets |
By: | Valentina Corradi (Queen Mary, University of London); Norman Swanson (Rutgers University); Walter Distaso (Imperial College) |
Abstract: | In recent years, numerous volatility-based derivative products have been engineered. This has led to interest in constructing conditional predictive densities and confidence intervals for integrated volatility. In this paper, we propose nonparametric kernel estimators of the aforementioned quantities. The kernel functions used in our analysis are based on different realized volatility measures, which are constructed using the ex post variation of asset prices. A set of sufficient conditions under which the estimators are asymptotically equivalent to their unfeasible counterparts, based on the unobservable volatility process, is provided. Asymptotic normality is also established. The efficacy of the estimators is examined via Monte Carlo experimentation, and an empirical illustration based upon data from the New York Stock Exchange is provided. |
Keywords: | conditional confidence intervals, Diffusions, integrated volatility, kernels, microstructure noise, realized volatility measures |
JEL: | C14 C22 C53 |
Date: | 2006–09–22 |
URL: | http://d.repec.org/n?u=RePEc:rut:rutres:200616&r=ets |
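The following sketch conveys the kernel idea in the entry above in its simplest form: a Nadaraya-Watson estimate of the conditional distribution of next-day realized volatility given today's, from which conditional intervals can be read off. The toy volatility process, bandwidth and grid are assumptions; the paper's treatment of microstructure noise and the asymptotic theory are not reproduced.

```python
import numpy as np

def nw_conditional_cdf(rv, x, grid, bandwidth):
    """Nadaraya-Watson estimate of P(RV_{t+1} <= v | RV_t = x).

    rv is a series of daily realized-volatility measures; the estimator
    smooths the indicator 1{RV_{t+1} <= v} with a Gaussian kernel in the
    conditioning variable RV_t.
    """
    rv = np.asarray(rv, float)
    x_t, x_next = rv[:-1], rv[1:]
    w = np.exp(-0.5 * ((x_t - x) / bandwidth) ** 2)     # kernel weights
    w = w / w.sum()
    return np.array([(w * (x_next <= v)).sum() for v in grid])

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    # toy persistent log-volatility process in place of real NYSE data
    h = np.zeros(1000)
    for t in range(1, 1000):
        h[t] = 0.95 * h[t - 1] + 0.2 * rng.standard_normal()
    rv = np.exp(h)
    grid = np.linspace(rv.min(), rv.max(), 200)
    cdf = nw_conditional_cdf(rv, x=np.median(rv), grid=grid, bandwidth=0.3)
    lo, hi = grid[np.searchsorted(cdf, 0.05)], grid[np.searchsorted(cdf, 0.95)]
    print(f"90% conditional interval for next-day RV: [{lo:.2f}, {hi:.2f}]")
```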
By: | Norman Swanson (Rutgers University); Valentina Corradi (Queen Mary, University of London) |
Abstract: | Our objectives in this paper are twofold. First, we introduce block bootstrap techniques that are (first order) valid in recursive estimation frameworks. Thereafter, we present two examples where predictive accuracy tests are made operational using our new bootstrap procedures. In one application, we outline a consistent test for out-of-sample nonlinear Granger causality, and in the other we outline a test for selecting amongst multiple alternative forecasting models, all of which are possibly misspecified. More specifically, our examples extend the White (2000) reality check to the case of non vanishing parameter estimation error, and extend the integrated conditional moment tests of Bierens (1982, 1990) and Bierens and Ploberger (1997) to the case of out-of-sample prediction. In both examples, appropriate re-centering of the bootstrap score is required in order to ensure that the tests have asymptotically correct size, and the need for such re-centering is shown to arise quite naturally when testing hypotheses of predictive accuracy. In a Monte Carlo investigation, we compare the finite sample properties of our block bootstrap procedures with the parametric bootstrap due to Kilian (1999); all within the context of various encompassing and predictive accuracy tests. An empirical illustration is also discussed, in which it is found that unemployment appears to have nonlinear marginal predictive content for inflation. |
Keywords: | block bootstrap, nonlinear causality, parameter estimation error, reality check, recursive estimation scheme |
JEL: | C22 C51 |
Date: | 2006–09–22 |
URL: | http://d.repec.org/n?u=RePEc:rut:rutres:200618&r=ets |
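The paper's contribution is the (first-order) validity, with appropriate re-centering, of block bootstrap methods under recursive estimation; the snippet below only shows the basic moving-block resampling step that such procedures build on, with an arbitrary block length and simulated AR(1) data.

```python
import numpy as np

def moving_block_bootstrap(x, block_length, n_boot, rng=None):
    """Moving-block bootstrap resamples of a (possibly dependent) series.

    Overlapping blocks of fixed length are drawn with replacement and
    concatenated until a series of the original length is obtained.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, float)
    n = x.size
    n_blocks = int(np.ceil(n / block_length))
    starts = rng.integers(0, n - block_length + 1, size=(n_boot, n_blocks))
    return np.stack([
        np.concatenate([x[s:s + block_length] for s in row])[:n]
        for row in starts
    ])

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    e = np.zeros(500)
    for t in range(1, 500):                # AR(1) data, so blocks matter
        e[t] = 0.6 * e[t - 1] + rng.standard_normal()
    boot = moving_block_bootstrap(e, block_length=25, n_boot=999, rng=rng)
    means = boot.mean(axis=1)
    print("bootstrap 95% interval for the mean:",
          np.round(np.percentile(means, [2.5, 97.5]), 3))
```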
By: | Valentina Corradi (Queen Mary, University of London); Norman Swanson (Rutgers University); Walter Distaso (Imperial College) |
Abstract: | The main objective of this paper is to propose a feasible, model free estimator of the predictive density of integrated volatility. In this sense, we extend recent papers by Andersen, Bollerslev, Diebold and Labys (2003), and by Andersen, Bollerslev and Meddahi (2004, 2005), who address the issue of pointwise prediction of volatility via ARMA models, based on the use of realized volatility. Our approach is to use a realized volatility measure to construct a nonparametric (kernel) estimator of the predictive density of daily volatility. We show that, by choosing an appropriate realized measure, one can achieve consistent estimation, even in the presence of jumps and microstructure noise in prices. More precisely, we establish that four well-known realized measures, i.e. realized volatility, bipower variation, and two measures robust to microstructure noise, satisfy the conditions required for the uniform consistency of our estimator. Furthermore, we outline an alternative simulation based approach to predictive density construction. Finally, we carry out a simulation experiment in order to assess the accuracy of our estimators, and provide an empirical illustration that underscores the importance of using microstructure robust measures when using high frequency data. |
Keywords: | Diffusions, integrated volatility, kernels, microstructure noise, realized volatility measures |
JEL: | C14 C22 C53 |
Date: | 2006–10–02 |
URL: | http://d.repec.org/n?u=RePEc:rut:rutres:200620&r=ets |
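Two of the realized measures referred to above are straightforward to compute from intraday returns: realized volatility (the sum of squared returns) and bipower variation (scaled products of adjacent absolute returns, robust to jumps). A minimal sketch with simulated intraday returns; the noise-robust measures mentioned in the abstract are not implemented here.

```python
import numpy as np

def realized_measures(intraday_returns):
    """Daily realized volatility and bipower variation from intraday returns.

    Realized volatility sums squared returns and therefore picks up jumps;
    bipower variation (pi/2 times the sum of adjacent absolute-return
    products) is robust to jumps.
    """
    r = np.asarray(intraday_returns, float)
    rv = np.sum(r ** 2)
    bv = (np.pi / 2.0) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))
    return rv, bv

if __name__ == "__main__":
    rng = np.random.default_rng(8)
    m = 288                                   # number of intraday returns (illustrative)
    r = 0.01 * rng.standard_normal(m) / np.sqrt(m)
    r[100] += 0.02                            # add one jump
    rv, bv = realized_measures(r)
    print(f"RV = {rv:.6f}, BV = {bv:.6f} (the gap reflects the jump)")
```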
By: | Valentina Corradi (Queen Mary, University of London); Norman Swanson (Rutgers University) |
Abstract: | This chapter discusses estimation, specification testing, and model selection of predictive density models. In particular, predictive density estimation is briefly discussed, and a variety of different specification and model evaluation tests due to various authors including Christoffersen and Diebold (2000), Diebold, Gunther and Tay (1998), Diebold, Hahn and Tay (1999), White (2000), Bai (2003), Corradi and Swanson (2005a,b,c,d), Hong and Li (2003), and others are reviewed. Extensions of some existing techniques to the case of out-of-sample evaluation are also provided, and asymptotic results associated with these extensions are outlined. |
Keywords: | block bootstrap, density and conditional distribution, forecast accuracy testing, mean square error, parameter estimation error |
JEL: | C22 C51 |
Date: | 2006–10–02 |
URL: | http://d.repec.org/n?u=RePEc:rut:rutres:200621&r=ets |
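A standard ingredient of the evaluation methods surveyed above is the probability integral transform of Diebold, Gunther and Tay (1998): under a correctly specified predictive density the PITs are i.i.d. uniform. The check below (a Kolmogorov-Smirnov test plus a first-order autocorrelation) is a simple in-sample illustration, not the out-of-sample machinery developed in the chapter.

```python
import numpy as np
from scipy import stats

def pit_evaluation(y, mu, sigma):
    """Probability integral transforms of y under N(mu, sigma^2) forecasts.

    Under a correctly specified predictive density the PITs are i.i.d.
    uniform on (0,1); a KS test and the first-order autocorrelation give
    a quick diagnostic.
    """
    z = stats.norm.cdf(y, loc=mu, scale=sigma)
    ks_stat, ks_pval = stats.kstest(z, "uniform")
    acf1 = np.corrcoef(z[1:], z[:-1])[0, 1]
    return z, ks_pval, acf1

if __name__ == "__main__":
    y = stats.t.rvs(df=3, size=1000, random_state=9)     # fat-tailed data
    _, pval, acf1 = pit_evaluation(y, mu=0.0, sigma=1.0)  # Gaussian forecasts are misspecified
    print(f"KS p-value = {pval:.4f}, PIT autocorrelation = {acf1:.3f}")
```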
By: | Kakamu, Kazuhiko (Graduate School of Economics, Osaka University, Osaka, Japan); Polasek, Wolfgang (Department of Economics and Finance, Institute for Advanced Studies, Vienna, Austria) |
Abstract: | We suggest a new class of cross-sectional space-time models based on local AR models and nearest neighbors using distances between observations. For the estimation we use a tightness prior for the prediction of regional GDP. We extend the model to include exogenous variables and hierarchical priors. The approach is demonstrated for a dynamic panel model for regional data in Central Europe. Finally, we find that an ARNN(1, 3) model with travel time data is best selected by marginal likelihood and that the spatial correlation is usually stronger than the time correlation. |
Keywords: | Dynamic panel data, hierarchical models, marginal likelihoods, nearest neighbors, tightness prior, spatial econometrics |
JEL: | C11 C15 C21 R11 |
Date: | 2007–02 |
URL: | http://d.repec.org/n?u=RePEc:ihs:ihsesp:203&r=ets |
By: | Urzúa, Carlos M. (Tecnológico de Monterrey, Campus Ciudad de México) |
Abstract: | Vector autoregressive models are often used in Macroeconomics to draw conclusions about the effects of policy innovations. However, those results depend on the researcher’s priors about the particular ordering of the variables. As an alternative, this paper presents a simple rule based on the Maximum Entropy principle that can be used to find the “most likely” ordering. The proposal is illustrated in the case of a VAR model of the U.S. economy. It is found that monetary policy shocks are better represented by innovations in the federal funds rate rather than by innovations in non-borrowed reserves. |
Keywords: | VAR, impulse-response functions, varimin, maximum entropy, monetary policy shocks |
JEL: | C32 C51 E52 |
Date: | 2007–02 |
URL: | http://d.repec.org/n?u=RePEc:ega:docume:200703&r=ets |
By: | Archontakis, Theofanis; Lemke, Wolfgang |
Abstract: | This paper studies a nonlinear one-factor term structure model in discrete time. The single factor is the short-term interest rate, which is modeled as a self-exciting threshold autoregressive (SETAR) process. Our specification allows for shifts in the intercept and the variance. The process is stationary but mimics the nearly I(1) dynamics typically encountered with interest rates. In comparison with a linear model, we find empirical evidence in favor of the threshold model for Germany and the US. Based on the estimated short-rate dynamics we derive the implied arbitrage-free term structure of interest rates. Since analytical solutions are not feasible, bond prices are computed by means of Monte Carlo integration. The resulting term structure exhibits properties that are qualitatively similar to those observed in the data and which cannot be captured by the linear Gaussian one-factor model. In particular, our model captures the nonlinear relation between long rates and the short rate found in the data. |
Keywords: | Non-affine term structure models, SETAR models, Asset pricing |
JEL: | C22 E43 G12 |
Date: | 2007 |
URL: | http://d.repec.org/n?u=RePEc:zbw:bubdp1:5405&r=ets |
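To fix ideas on the SETAR short-rate model and Monte Carlo bond pricing described above, here is a schematic two-regime simulation and a crude zero-coupon yield computed as a discounted expectation. The regime parameters are made up, risk adjustment is ignored, and nothing here corresponds to the paper's German or US estimates.

```python
import numpy as np

def simulate_setar(r0, n_periods, n_paths, c=4.0,
                   regime_lo=(0.02, 0.995, 0.08), regime_hi=(0.10, 0.98, 0.15),
                   rng=None):
    """Simulate a SETAR(1) short rate whose parameters switch when r_t crosses c.

    Each regime is (intercept, AR coefficient, shock std. dev.); the numbers
    are purely illustrative, with rates measured in percent.
    """
    rng = np.random.default_rng() if rng is None else rng
    r = np.full((n_paths, n_periods + 1), float(r0))
    for t in range(n_periods):
        lo = r[:, t] <= c
        a = np.where(lo, regime_lo[0], regime_hi[0])
        b = np.where(lo, regime_lo[1], regime_hi[1])
        s = np.where(lo, regime_lo[2], regime_hi[2])
        r[:, t + 1] = a + b * r[:, t] + s * rng.standard_normal(n_paths)
    return r

def mc_zero_coupon_yield(r0, maturity, n_paths=20000, rng=None):
    """Monte Carlo yield from P = E[exp(-sum of future short rates)]."""
    paths = simulate_setar(r0, maturity, n_paths, rng=rng) / 100.0  # percent -> decimal
    price = np.mean(np.exp(-paths[:, :maturity].sum(axis=1)))
    return -np.log(price) / maturity * 100.0

if __name__ == "__main__":
    rng = np.random.default_rng(10)
    for r0 in (2.0, 6.0):
        print(f"short rate {r0:.0f}% -> 5-period yield "
              f"{mc_zero_coupon_yield(r0, 5, rng=rng):.2f}%")
```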
By: | Razzak, Weshah |
Abstract: | I discuss econometric issues of high relevance to economists in central banks whose job is to interpret the permanency of shocks and provide policy advice to policymakers. Trend, unit root, and persistence are difficult to interpret. There are numerous econometric tests, which vary in their power and usefulness. I provide a set of strategies for dealing with macro time series. |
Keywords: | Unit root; trend; persistence; cointegration |
JEL: | F4 C22 C13 |
Date: | 2003 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:1970&r=ets |
By: | Figueiredo, Annibal; Gleria, Iram; Matsushita, Raul; Da Silva, Sergio |
Abstract: | This paper puts forward a technique based on the characteristic function to tackle the problem of the sum of stochastic variables. We consider independent processes whose reduced variables are identically distributed, including those that violate the conditions for the central limit theorem to hold. We also consider processes that are correlated and analyze the role of nonlinear autocorrelations in their convergence to a Gaussian. We demonstrate that nonidentity in independent processes is related to autocorrelations in nonindependent processes. We exemplify our approach with data from foreign exchange rates. |
Keywords: | econophysics; central limit theorem; characteristic function; reduced variables; autocorrelation |
JEL: | C1 |
Date: | 2006 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:1984&r=ets |
By: | López, Fernando; Chasco, Coro |
Abstract: | The purpose of this article is to analyze whether spatial dependence is a synchronic effect in the first-order spatial autoregressive model, SAR(1). Spatial dependence can be not only contemporary but also time-lagged in many socio-economic phenomena. In this paper, we use three Moran-based space-time autocorrelation statistics to evaluate the simultaneity of this spatial effect. A simulation study sheds some light upon these issues, demonstrating the capacity of these tests to identify the structure of spatial dependence (only instantaneous, only time-lagged, or both) in most cases. |
Keywords: | Space-time dependence; Spatial autoregressive models; Moran’s I |
JEL: | C15 C51 C21 |
Date: | 2007–03–03 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:1985&r=ets |
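Moran-based space-time statistics of the sort used above contrast a cross-section with its own spatial lag and with the spatially lagged values from the previous period. The functions below use one common normalisation (several exist) and a simulated distance-based contiguity matrix; they are not the article's exact statistics.

```python
import numpy as np

def moran_i(x, W):
    """Moran's I spatial autocorrelation statistic for a cross-section x."""
    z = np.asarray(x, float) - np.mean(x)
    return (len(z) / W.sum()) * (z @ W @ z) / (z @ z)

def space_time_moran(x_now, x_lag, W):
    """Space-time Moran statistic: current values against spatially lagged
    values from the previous period (one common normalisation)."""
    z1 = np.asarray(x_now, float) - np.mean(x_now)
    z0 = np.asarray(x_lag, float) - np.mean(x_lag)
    return (len(z1) / W.sum()) * (z1 @ W @ z0) / np.sqrt((z1 @ z1) * (z0 @ z0))

if __name__ == "__main__":
    rng = np.random.default_rng(11)
    n = 50
    coords = rng.uniform(0, 1, size=(n, 2))
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    W = (d < 0.2).astype(float)
    np.fill_diagonal(W, 0.0)                   # contiguity-style weights
    x_lag = rng.standard_normal(n)
    # current values driven by the spatial lag of the previous period
    x_now = 0.8 * (W @ x_lag) / np.maximum(W.sum(axis=1), 1) + 0.5 * rng.standard_normal(n)
    print("Moran's I (contemporaneous):", round(moran_i(x_now, W), 3))
    print("space-time Moran:           ", round(space_time_moran(x_now, x_lag, W), 3))
```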
By: | Mishra, SK |
Abstract: | Correlation matrices have many applications, particularly in marketing and financial economics. The need to forecast demand for a group of products in order to realize savings by properly managing inventories requires the use of correlation matrices. In many cases, due to paucity of data/information or the dynamic nature of the problem at hand, it is not possible to obtain a complete correlation matrix, and some elements of the matrix are unknown. Several methods exist that obtain valid complete correlation matrices from incomplete correlation matrices. In view of the non-unique solutions admissible to the problem of completing the correlation matrix, some authors have suggested numerical methods that provide ranges for the different unknown elements. However, they are limited to very small matrices up to order 4. Our objective in this paper is to suggest a method (and provide a Fortran program) that completes a given incomplete correlation matrix of an arbitrary order. The method proposed here has an advantage over other algorithms due to its ability to present a scenario of valid correlation matrices that might be obtained from a given incomplete matrix of an arbitrary order. The analyst may choose some particular matrices, most suitable to his purpose, from among those output matrices. Further, unlike other methods, it places no restriction on the distribution of holes over the entire matrix, nor does the analyst have to interactively feed elements of the matrix sequentially, which might be quite inconvenient for larger matrices. It is flexible, and by merely choosing a larger population size one might obtain a more exhaustive scenario of valid matrices. |
Keywords: | Incomplete; complete; correlation matrix; valid; semi-definite; eigenvalues; Differential Evolution; global optimization; computer program; fortran; financial economics; arbitrary order |
JEL: | G10 C88 C63 C61 |
Date: | 2007–03–05 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:2000&r=ets |
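The completion problem above can be cast as a search over the unknown entries, accepting any fill-in that keeps the matrix positive semi-definite. The Fortran program is not reproduced here; instead, a small Python sketch uses scipy's differential evolution with a penalty on the most negative eigenvalue. Because valid completions are not unique, different seeds return different admissible matrices, echoing the paper's idea of generating a scenario of valid matrices.

```python
import numpy as np
from scipy.optimize import differential_evolution

def complete_correlation_matrix(R, holes):
    """Fill missing entries of a correlation matrix so that it is valid
    (symmetric, unit diagonal, positive semi-definite).

    `holes` lists (i, j) index pairs with unknown correlations; the
    unknowns are optimised with differential evolution so that the
    smallest eigenvalue of the completed matrix is non-negative.
    """
    R = np.array(R, dtype=float)

    def build(v):
        M = R.copy()
        for (i, j), val in zip(holes, v):
            M[i, j] = M[j, i] = val
        return M

    def objective(v):
        lam_min = np.linalg.eigvalsh(build(v)).min()
        return max(0.0, -lam_min)          # zero once the matrix is PSD

    bounds = [(-1.0, 1.0)] * len(holes)
    result = differential_evolution(objective, bounds, seed=0, tol=1e-10)
    return build(result.x), result.fun

if __name__ == "__main__":
    R = np.array([[1.0,    0.7,    np.nan, 0.2],
                  [0.7,    1.0,    0.5,    np.nan],
                  [np.nan, 0.5,    1.0,    0.4],
                  [0.2,    np.nan, 0.4,    1.0]])
    holes = [(0, 2), (1, 3)]
    completed, penalty = complete_correlation_matrix(R, holes)
    print(completed.round(3))
    print("smallest eigenvalue:", round(np.linalg.eigvalsh(completed).min(), 4))
```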
By: | Everts, Martin |
Abstract: | In the following article the ideal band-pass filter is derived and explained in order to subsequently analyze the approximations by Baxter and King (1999) and Christiano and Fitzgerald (2003). It can be shown that the filters by Baxter and King and Christiano and Fitzgerald differ primarily in two assumptions, namely the assumption about the spectral density of the analyzed variables and the assumption about the symmetry of the weights of the band-pass filter. It is then shown that these different assumptions lead to filters whose characteristics differ in three respects: the accuracy of the approximation with respect to the length of the cycles considered, the number of data points that can be computed towards the ends of the data series, and the removal of the trend of the original time series. |
Keywords: | Business Cycle; Band-Pass Filter |
JEL: | C1 E3 |
Date: | 2006–01 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:2049&r=ets |
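For reference, the truncated ideal band-pass weights with the Baxter-King zero-sum adjustment discussed above can be written down in a few lines. The period bounds (6 and 32) and the truncation K=12 are the usual quarterly conventions, the parameter names are mine, and the Christiano-Fitzgerald variant is not implemented here.

```python
import numpy as np

def baxter_king_weights(low_period=6, high_period=32, K=12):
    """Truncated ideal band-pass weights with the Baxter-King adjustment
    that forces the weights to sum to zero (removing trends)."""
    w_hi = 2 * np.pi / low_period     # highest frequency kept
    w_lo = 2 * np.pi / high_period    # lowest frequency kept
    j = np.arange(1, K + 1)
    b = np.empty(K + 1)
    b[0] = (w_hi - w_lo) / np.pi
    b[1:] = (np.sin(w_hi * j) - np.sin(w_lo * j)) / (np.pi * j)
    weights = np.concatenate([b[:0:-1], b])      # symmetric, length 2K+1
    weights -= weights.mean()                    # zero-sum adjustment
    return weights

def bk_filter(y, **kwargs):
    """Apply the filter; K observations are lost at each end of the series."""
    w = baxter_king_weights(**kwargs)
    K = (len(w) - 1) // 2
    cycle = np.convolve(y, w, mode="valid")      # length T - 2K
    return np.r_[np.full(K, np.nan), cycle, np.full(K, np.nan)]

if __name__ == "__main__":
    rng = np.random.default_rng(12)
    t = np.arange(200)
    y = 0.05 * t + np.sin(2 * np.pi * t / 20) + 0.2 * rng.standard_normal(200)
    cycle = bk_filter(y, low_period=6, high_period=32, K=12)
    print("std. dev. of extracted cycle:", round(np.nanstd(cycle), 3))
```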