
on Econometrics 
By:  Niels Haldrup; Antonio Montañés; Andreu Sansó (Department of Economics, University of Aarhus, Denmark) 
Abstract:  The detection of additive outliers in integrated variables has attracted some attention recently, see e.g. Shin et al. (1996), Vogelsang (1999) and Perron and Rodriguez (2003). This paper serves several purposes. We prove the inconsistency of the test proposed by Vogelsang, we extend the tests proposed by Shin et al. and Perron and Rodriguez to the seasonal case, and we consider alternative ways of computing their tests. We also study the effects of periodically varying variances on the previous tests and demonstrate that these can be seriously size distorted. Subsequently, some new tests that allow for periodic heteroskedasticity are proposed. 
Keywords:  Additive outliers, outlier detection, integrated processes, periodic heteroscedasticity, seasonality. 
JEL:  C12 C2 C22 
Date:  2004–12–21 
URL:  http://d.repec.org/n?u=RePEc:aah:aarhec:200414&r=ecm 
By:  Peter M Robinson 
Abstract:  Asymptotic inference on nonstationary fractional time series models, including cointegrated ones, is proceeding along two routes, determined by alternative definitions of nonstationary processes. We derive bounds for the mean squared error of the difference between (possibly tapered) discrete Fourier transforms under two regimes. We apply the results to deduce limit theory for estimates of memory parameters, including ones for cointegrated errors, with mention also of implications for estimates of cointegrating coefficients. 
Keywords:  Nonstationary fractional processes, memory parameter estimation, fractional cointegration, rates of convergence. 
Date:  2004–03 
URL:  http://d.repec.org/n?u=RePEc:cep:stiecm:/2004/468&r=ecm 
By:  Peter M Robinson 
Abstract:  Smoothed nonparametric estimates of the spectral density matrix at zero frequency have been widely used in econometric inference, because they can consistently estimate the covariance matrix of a partial sum of a possibly dependent vector process. When elements of the vector process exhibit long memory or antipersistence such estimates are inconsistent. We propose estimates which are still consistent in such circumstances, adapting automatically to memory parameters that can vary across the vector and be unknown. 
Keywords:  Covariance matrix estimation, long memory, antipersistence correction, "HAC" estimates, vector process, spectral density. 
Date:  2004–03 
URL:  http://d.repec.org/n?u=RePEc:cep:stiecm:/2004/471&r=ecm 
By:  Oliver Linton 
Abstract:  This paper is concerned with the practical problem of conducting inference in a vector time series setting when the data is unbalanced or incomplete. In this case, one can work only with the common sample, to which a standard HAC/Bootstrap theory applies, but at the expense of throwing away data and perhaps losing efficiency. An alternative is to use some sort of imputation method, but this requires additional modelling assumptions, which we would rather avoid. We show how the sampling theory changes and how to modify the resampling algorithms to accommodate the problem of missing data. We also discuss efficiency and power. Unbalanced data of the type we consider are quite common in financial panel data, see, for example, Connor and Korajczyk (1993). These data also occur in cross-country studies. 
Keywords:  Bootstrap, efficient, HAC estimation, missing data, subsampling. 
Date:  2004–04 
URL:  http://d.repec.org/n?u=RePEc:cep:stiecm:/2004/474&r=ecm 
By:  Fabrizio Iacone; Peter M Robinson 
Abstract:  We consider a cointegrated system generated by processes that may be fractionally integrated, and by additive polynomial and generalized polynomial trends. In view of the consequent competition between stochastic and deterministic trends, we consider various estimates of the cointegrating vector and develop relevant asymptotic theory, including the situation where fractional orders of integration are unknown. 
Keywords:  Fractional cointegration, deterministic trends, ordinary least squares estimation, generalized least squares estimation, Wald tests. 
Date:  2004–05 
URL:  http://d.repec.org/n?u=RePEc:cep:stiecm:/2004/476&r=ecm 
By:  Trino-Manuel Niguez; Javier Perote 
Abstract:  In this paper we introduce a transformation of the Edgeworth-Sargan series expansion of the Gaussian distribution, which we call the Positive Edgeworth-Sargan (PES). The main advantage of this new density is that it is well defined for all values in the parameter space and integrates to one. We include an illustrative empirical application comparing its performance with other distributions, including the Gaussian and the Student's t, for forecasting the full density of daily exchange-rate returns by using graphical procedures. Our results show that the proposed function outperforms the other two models for density forecasting, thus providing more reliable value-at-risk forecasts. 
Keywords:  Density forecasting, Edgeworth-Sargan distribution, probability integral transformations, P-value plots, VaR 
JEL:  C16 C53 G12 
Date:  2004–10 
URL:  http://d.repec.org/n?u=RePEc:cep:stiecm:/2004/479&r=ecm 
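The graphical density-forecast evaluation mentioned in this abstract rests on the probability integral transform (PIT): if the forecast density is correct, the PITs of the realized returns are i.i.d. uniform on (0, 1). A minimal stdlib sketch of that idea follows, applying a normal forecast density to heavier-tailed simulated data; the data and parameters are hypothetical illustrations, not the authors' PES density.

```python
import math
import random

def normal_cdf(x, mu=0.0, sigma=1.0):
    """Normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def pit_values(returns, mu, sigma):
    """Probability integral transforms under a N(mu, sigma^2) forecast density.
    If the forecast density is correct, these are i.i.d. Uniform(0, 1)."""
    return [normal_cdf(r, mu, sigma) for r in returns]

random.seed(0)
# Heavier-tailed "returns": a scale mixture of normals (illustrative data).
returns = [random.gauss(0.0, 2.0 if random.random() < 0.1 else 1.0)
           for _ in range(5000)]

u = pit_values(returns, 0.0, 1.0)
# A too-thin-tailed forecast density piles PIT values up near 0 and 1.
tail_share = sum(1 for v in u if v < 0.05 or v > 0.95) / len(u)
print(f"share of PITs in the outer 10%: {tail_share:.3f}")
```

Under a correct density the outer share would be close to 0.10; the excess here is exactly the kind of departure that P-value plots make visible.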
By:  Peter M Robinson 
Abstract:  We consider a time series model involving a fractional stochastic component, whose integration order can lie in the stationary/invertible or nonstationary regions and be unknown, and an additive deterministic component consisting of a generalised polynomial. The model can thus incorporate competing descriptions of trending behaviour. The stationary input to the stochastic component has parametric autocorrelation, but innovation with distribution of unknown form. The model is thus semiparametric, and we develop estimates of the parametric component which are asymptotically normal and achieve an M-estimation efficiency bound, equal to that found in work using an adaptive LAM/LAN approach. A major technical feature which we treat is the effect of truncating the autoregressive representation in order to form innovation proxies. This is relevant also when the innovation density is parameterised, and we provide a result for that case also. Our semiparametric estimates employ nonparametric series estimation, which avoids some complications and conditions in kernel approaches featured in much work on adaptive estimation of time series models; our work thus also contributes to methods and theory for non-fractional time series models, such as autoregressive moving averages. A Monte Carlo study of finite sample performance of the semiparametric estimates is included. 
Keywords:  fractional processes, efficient semiparametric estimation, adaptive estimation, nonstationary processes, series estimation, M-estimation 
JEL:  C22 
Date:  2004–11 
URL:  http://d.repec.org/n?u=RePEc:cep:stiecm:/2004/480&r=ecm 
By:  M. Hashem Pesaran; Paolo Zaffaroni 
Abstract:  This paper considers the problem of model uncertainty in the case of multi-asset volatility models and discusses the use of model averaging techniques as a way of dealing with the risk of inadvertently using false models in portfolio management. In particular, it is shown that under certain conditions portfolio returns based on an average model will be more fat-tailed than if based on an individual underlying model with the same average volatility. Evaluation of volatility models is also considered and a simple Value-at-Risk (VaR) diagnostic test is proposed for individual as well as ‘average’ models, and its exact and asymptotic properties are established. The model averaging idea and the VaR diagnostic tests are illustrated by an application to portfolios of daily returns based on twenty-two of Standard & Poor’s 500 industry group indices over the period January 2, 1995 to October 13, 2003, inclusive. 
Keywords:  model averaging, value-at-risk, decision-based evaluation 
JEL:  C32 C52 C53 G11 
Date:  2004 
URL:  http://d.repec.org/n?u=RePEc:ces:ceswps:_1358&r=ecm 
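As a hedged illustration of the kind of VaR diagnostic discussed in this abstract (not the authors' exact statistic), one can count VaR breaches and standardize the hit rate against the nominal level, an unconditional-coverage check in the spirit of Kupiec. All data below are simulated and illustrative.

```python
import math
import random

def var_hit_rate_z(returns, var_forecasts, alpha):
    """z-statistic for the VaR exceedance frequency.

    A correctly specified alpha-level VaR should be breached on roughly
    an alpha fraction of days; under that null the standardized hit rate
    is asymptotically N(0, 1)."""
    n = len(returns)
    hits = sum(1 for r, v in zip(returns, var_forecasts) if r < v)
    p_hat = hits / n
    se = math.sqrt(alpha * (1.0 - alpha) / n)
    return (p_hat - alpha) / se, p_hat

random.seed(1)
returns = [random.gauss(0.0, 1.0) for _ in range(2000)]
# 1% VaR under the correct N(0,1) model: the 1% quantile, about -2.326.
z_ok, rate_ok = var_hit_rate_z(returns, [-2.326] * len(returns), 0.01)
# A too-optimistic VaR from a model with understated volatility.
z_bad, rate_bad = var_hit_rate_z(returns, [-1.163] * len(returns), 0.01)
print(f"correct model: hit rate {rate_ok:.3f}, z = {z_ok:.2f}")
print(f"understated volatility: hit rate {rate_bad:.3f}, z = {z_bad:.2f}")
```

The misspecified model is breached far more often than 1% of the time, so the z-statistic rejects it while staying small for the correct model.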
By:  Rombouts, J.V.K.; Verbeek, M. (Erasmus Research Institute of Management (ERIM), Erasmus University Rotterdam) 
Abstract:  In this paper we examine the usefulness of multivariate semiparametric GARCH models for portfolio selection under a Value-at-Risk (VaR) constraint. First, we specify and estimate several alternative multivariate GARCH models for daily returns on the S&P 500 and Nasdaq indexes. Examining the within-sample VaRs of a set of given portfolios shows that the semiparametric model performs uniformly well, while parametric models in several cases have unacceptable failure rates. Interestingly, distributional assumptions appear to have a much larger impact on the performance of the VaR estimates than the particular parametric specification chosen for the GARCH equations. Finally, we examine the economic value of the multivariate GARCH models by determining optimal portfolios based on maximizing expected returns subject to a VaR constraint, over a period of 500 consecutive days. Again, the superiority and robustness of the semiparametric model is confirmed. 
Keywords:  Multivariate GARCH; semiparametric estimation; Value-at-Risk; asset allocation 
Date:  2004–12–22 
URL:  http://d.repec.org/n?u=RePEc:dgr:eureri:30001977&r=ecm 
By:  Rubens Penha Cysne (EPGE/FGV) 
Date:  2004–10 
URL:  http://d.repec.org/n?u=RePEc:fgv:epgewp:570&r=ecm 
By:  Villani, Mattias (Research Department, Central Bank of Sweden); Larsson, Rolf (Department of Information Science, Uppsala University) 
Abstract:  The multivariate split normal distribution extends the usual multivariate normal distribution by a set of parameters which allows for skewness in the form of contraction/dilation along a subset of the principal axes. The paper derives some properties of this distribution, including its moment generating function, multivariate skewness and kurtosis. Maximum likelihood estimation is discussed and a complete Bayesian analysis of the multivariate split normal distribution is developed. 
Keywords:  Bayesian inference; Elicitation; Estimation; Maximum likelihood; Multivariate analysis; Skewness 
JEL:  C11 C16 
Date:  2004–12–01 
URL:  http://d.repec.org/n?u=RePEc:hhs:rbnkwp:0175&r=ecm 
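In the univariate case, the split (two-piece) normal glues two half-normal densities with different scales at a common mode, the left piece carrying probability sigma1/(sigma1 + sigma2). A small stdlib sketch of sampling from that standard construction follows; the paper's multivariate contraction/dilation parameterisation is not reproduced here, and all parameter values are illustrative.

```python
import random

def sample_split_normal(mu, sigma1, sigma2):
    """Draw from a univariate split normal: the density is proportional
    to N(mu, sigma1^2) below the mode mu and N(mu, sigma2^2) above it.
    The left piece carries probability sigma1 / (sigma1 + sigma2)."""
    z = abs(random.gauss(0.0, 1.0))
    if random.random() < sigma1 / (sigma1 + sigma2):
        return mu - sigma1 * z   # left half, scale sigma1
    return mu + sigma2 * z       # right half, scale sigma2

random.seed(2)
draws = [sample_split_normal(0.0, 1.0, 3.0) for _ in range(20000)]
left = sum(1 for x in draws if x < 0.0) / len(draws)
print(f"mass left of the mode: {left:.3f}  (theory: 1/4)")
```

With sigma2 > sigma1 the right tail is dilated, producing the positive skewness the distribution is designed to capture.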
By:  Katarina Juselius (Institute of Economics, University of Copenhagen) 
Abstract:  The paper discusses the dynamics of inflation and money growth in a stochastic framework, allowing for double unit roots in the nominal variables. It gives some examples of typical I(2) ’symptoms’ in empirical I(1) models and provides both a nontechnical and a technical discussion of the basic differences between the I(1) and the I(2) model. The notion of long-run and medium-run price homogeneity is discussed in terms of testable restrictions on the I(2) model. The Brazilian high-inflation period of 1977:1-1985:5 illustrates the applicability of the I(2) model and its usefulness for addressing questions related to inflation dynamics. 
Keywords:  cointegrated VAR; price homogeneity; Cagan model; hyperinflation 
JEL:  C32 E41 E31 
Date:  2004–12 
URL:  http://d.repec.org/n?u=RePEc:kud:kuiedp:0431&r=ecm 
By:  David E. Giles (Department of Economics, University of Victoria); Chad N. Stroomer (Department of Economics, University of Victoria) 
Abstract:  This paper presents a new method for extracting the cycle from an economic time series. This method uses the fuzzy c-means clustering algorithm, drawn from the pattern recognition literature, to identify groups of observations. The time series is modeled over each of these subsamples, and the results are combined using the “degrees of membership” of each data point in each cluster. The result is a totally flexible model that readily captures complex nonlinearities in the data. This type of “fuzzy regression” analysis has been shown by Giles and Draeseke (2003) to be highly effective in a broad range of situations with economic data. The fuzzy filter that we develop here is compared with the well-known Hodrick-Prescott (HP) filter in a Monte Carlo experiment, and the new filter is found to perform as well as, or better than, the HP filter. The advantage of the fuzzy filter is especially pronounced when the data have a deterministic, rather than stochastic, trend. Applications with real time series illustrate the different conclusions that can emerge when the fuzzy regression filter and the HP filter are each applied to extract the cycle. 
Keywords:  Fuzzy filter, fuzzy clustering, business cycle, trend extraction, HP filter 
JEL:  C19 C22 E32 
Date:  2004–12–29 
URL:  http://d.repec.org/n?u=RePEc:vic:vicewp:0406&r=ecm 
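The fuzzy c-means step underlying the filter assigns every observation a degree of membership in each cluster and recomputes centers as membership-weighted means. A minimal one-dimensional sketch with hypothetical data follows; this is only the clustering ingredient, not the authors' filter, which additionally models the series over each fuzzy subsample.

```python
def fcm_1d(xs, centers, m=2.0, iters=50):
    """One-dimensional fuzzy c-means: each point gets a degree of
    membership in every cluster, and centers are membership-weighted means."""
    for _ in range(iters):
        # Membership update: u[i] = 1 / sum_j (d_i / d_j)^(2/(m-1)).
        u = []
        for x in xs:
            d = [abs(x - c) or 1e-12 for c in centers]  # guard zero distance
            u.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                                for j in range(len(centers)))
                      for i in range(len(centers))])
        # Center update: weighted mean of the data with weights u^m.
        centers = [sum(u[k][i] ** m * xs[k] for k in range(len(xs)))
                   / sum(u[k][i] ** m for k in range(len(xs)))
                   for i in range(len(centers))]
    return centers, u

# Two well-separated groups of observations (illustrative data).
xs = [0.9, 1.0, 1.1, 4.9, 5.0, 5.1]
centers, u = fcm_1d(xs, centers=[0.0, 6.0])
print("centers:", [round(c, 2) for c in centers])
print("membership of x=1.0:", [round(w, 2) for w in u[1]])
```

The memberships for each point sum to one, which is what lets the filter blend the per-cluster models smoothly rather than switching between them.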
By:  Gultekin Isiklar (State University of New York at Albany) 
Abstract:  This note shows that problems due to aggregation in fixed-event forecast efficiency tests are not as severe as they are in unbiasedness tests. We also show that first lags of consensus revisions should be avoided in the tests. 
Keywords:  Aggregation bias; fixed-event; weak efficiency 
JEL:  C53 
Date:  2004–12–27 
URL:  http://d.repec.org/n?u=RePEc:wpa:wuwpem:0412011&r=ecm 
By:  Francis X. Diebold (Department of Economics, University of Pennsylvania and NBER) 
Abstract:  Engle’s footsteps range widely. His major contributions include early work on band-spectral regression, development and unification of the theory of model specification tests (particularly Lagrange multiplier tests), clarification of the meaning of econometric exogeneity and its relationship to causality, and his later stunningly influential work on common trend modeling (cointegration) and volatility modeling (ARCH, short for AutoRegressive Conditional Heteroskedasticity). More generally, Engle’s cumulative work is a fine example of best-practice applied time-series econometrics: he identifies important dynamic economic phenomena, formulates precise and interesting questions about those phenomena, constructs sophisticated yet simple econometric models for measurement and testing, and consistently obtains results of widespread substantive interest in the scientific, policy, and financial communities. 
Keywords:  Econometric Theory, Finance 
JEL:  B31 C10 
Date:  2004–02–01 
URL:  http://d.repec.org/n?u=RePEc:pen:papers:04010&r=ecm 
By:  Richard Harrison; George Kapetanios; Tony Yates 
Abstract:  This paper explores the effects of measurement error on dynamic forecasting models. It illustrates a tradeoff that confronts forecasters and policymakers when they use data that are measured with error. On the one hand, observations on recent data give valuable clues as to the shocks that are hitting the system and that will be propagated into the variables to be forecast. But on the other, those recent observations are likely to be those least well measured. The paper studies two classes of forecasting problem. The first class includes cases where the forecaster takes the coefficients in the data-generating process as given, and has to choose how much of the historical time series of data to use to form a forecast. We show that if recent data are sufficiently badly measured, relative to older data, it can be optimal not to use recent data at all. The second class of problems we study is more general. We show that for a general class of linear autoregressive forecasting models, the optimal weight to place on a data observation of some age, relative to the weight in the true data-generating process, will depend on the measurement error in that observation. We illustrate the gains in forecasting performance using a model of UK business investment growth. 
URL:  http://d.repec.org/n?u=RePEc:boe:boeewp:237&r=ecm 
By:  Torben G. Andersen (Department of Economics, Northwestern University); Tim Bollerslev (Department of Economics, Duke University); Francis X. Diebold (Department of Economics, University of Pennsylvania); Jin Wu (Department of Economics, University of Pennsylvania) 
Abstract:  A large literature over several decades reveals both extensive concern with the question of time-varying betas and an emerging consensus that betas are in fact time-varying, leading to the prominence of the conditional CAPM. Set against that background, we assess the dynamics in realized betas, vis-à-vis the dynamics in the underlying realized market variance and individual equity covariances with the market. Working in the recently popularized framework of realized volatility, we are led to a framework of nonlinear fractional cointegration: although realized variances and covariances are very highly persistent and well approximated as fractionally integrated, realized betas, which are simple nonlinear functions of those realized variances and covariances, are less persistent and arguably best modeled as stationary I(0) processes. We conclude by drawing implications for asset pricing and portfolio management. 
Keywords:  Quadratic variation and covariation, realized volatility, asset pricing, CAPM, equity betas, long memory, nonlinear fractional cointegration, continuous-time methods 
JEL:  C1 G1 
Date:  2003–01–03 
URL:  http://d.repec.org/n?u=RePEc:pen:papers:04018&r=ecm 
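A realized beta of the kind studied in this abstract is the ratio of the realized covariance between asset and market returns to the realized market variance, summed over high-frequency returns within the measurement window. A minimal sketch with simulated 5-minute returns; the true beta of 1.5 and the sampling frequency are assumptions of the illustration.

```python
import random

def realized_beta(asset_returns, market_returns):
    """Realized beta: realized covariance of asset and market returns
    divided by the realized variance of the market, summing over
    intraday observations within the measurement window."""
    rcov = sum(a * m for a, m in zip(asset_returns, market_returns))
    rvar = sum(m * m for m in market_returns)
    return rcov / rvar

random.seed(3)
# Simulated 5-minute returns: asset = 1.5 * market + idiosyncratic noise.
market = [random.gauss(0.0, 0.001) for _ in range(78 * 22)]  # ~one month
asset = [1.5 * m + random.gauss(0.0, 0.0005) for m in market]
print(f"realized beta: {realized_beta(asset, market):.3f}")
```

Because beta is this nonlinear ratio of two highly persistent series, its own persistence can differ sharply from theirs, which is the paper's central observation.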
By:  Jesus FernandezVillaverde (Department of Economics, University of Pennsylvania); Juan F. RubioRamirez (Federal Reserve Bank of Atlanta); Manuel Santos (College of Business, Arizona State University) 
Abstract:  This paper studies the econometrics of computed dynamic models. Since these models generally lack a closed-form solution, economists approximate the policy functions of the agents in the model with numerical methods. But this implies that, instead of the exact likelihood function, the researcher can evaluate only an approximated likelihood associated with the approximated policy function. What are the consequences for inference of the use of approximated likelihoods? First, we show that as the approximated policy function converges to the exact policy, the approximated likelihood also converges to the exact likelihood. Second, we prove that the approximated likelihood converges at the same rate as the approximated policy function. Third, we find that the error in the approximated likelihood gets compounded with the size of the sample. Fourth, we discuss convergence of Bayesian and classical estimates. We complete the paper with three applications to document the quantitative importance of our results. 
Keywords:  computed dynamic models, likelihood inference, asymptotic properties 
JEL:  C1 C5 E1 
Date:  2004–08–31 
URL:  http://d.repec.org/n?u=RePEc:pen:papers:04034&r=ecm 
By:  Annaert J.; Claes A.; De Ceuster M.J.K. 
Abstract:  Some years ago, Crack and Ledoit (1996) discovered a strikingly geometric structure when plotting US stock returns against themselves. Since this pattern, in which lines radiating from the origin pop up, resembles the navigating tool, it was named the “Compass Rose”. Although authors differ in opinion when explaining the causes of the phenomenon, the discreteness of price jumps is unanimously indicated as the driver of the structure. This paper first documents the presence of a Compass Rose structure within the illiquid Belgian stock market, looking at both individual stocks and stock indices. We then examine whether the presence of a Compass Rose, i.e. the discreteness of prices, affects normality tests. Based on simulated Brownian motions with rounded price increments, we notice that two commonly used normality tests react differently to discreteness in the underlying data. As the tick size increases, the popular Jarque-Bera test is not able to detect the deviations from normality. The Lilliefors test, however, clearly rejects the normality assumption when the data exhibit tick/volatility ratios in excess of 2.5. 
Date:  2003–06 
URL:  http://d.repec.org/n?u=RePEc:ant:wpaper:2003020&r=ecm 
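The ingredients of such a simulation design, rounding Gaussian increments to a tick grid and computing the Jarque-Bera statistic JB = n/6 * (S^2 + (K - 3)^2 / 4) from sample skewness S and kurtosis K, can be sketched with the stdlib alone. This toy version rounds the increments directly rather than the Brownian price path, so it shows the mechanics only and does not reproduce the paper's comparison with the Lilliefors test.

```python
import random

def jarque_bera(xs):
    """Jarque-Bera statistic, n/6 * (S^2 + (K - 3)^2 / 4), from sample
    skewness S and kurtosis K; asymptotically chi-squared(2) under normality."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    skew = m3 / m2 ** 1.5
    kurt = m4 / m2 ** 2
    return n / 6.0 * (skew ** 2 + (kurt - 3.0) ** 2 / 4.0)

def rounded_increments(n, sigma, tick):
    """Gaussian increments rounded to a tick grid, mimicking the price
    discreteness behind the compass rose pattern."""
    return [round(random.gauss(0.0, sigma) / tick) * tick for _ in range(n)]

random.seed(4)
jb_fine = jarque_bera(rounded_increments(5000, sigma=1.0, tick=0.01))
jb_coarse = jarque_bera(rounded_increments(5000, sigma=1.0, tick=2.5))
print(f"JB, small tick/volatility ratio:  {jb_fine:.1f}")
print(f"JB, tick/volatility ratio of 2.5: {jb_coarse:.1f}")
```

With a tiny tick the statistic stays near its chi-squared(2) range; coarse rounding visibly distorts the moments in this toy setup.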
By:  Brys G.; Hubert M.; Struyf A. 
Abstract:  In this paper we propose several goodness-of-fit tests based on robust measures of skewness and tail weight. They can be seen as generalisations of the Jarque-Bera test (Bera and Jarque, 1981) based on the classical skewness and kurtosis, and as an alternative to the approach of Moors et al. (1996) using quantiles. The power values and the robustness properties of the different tests are investigated by means of simulations and applications to real data. We conclude that MCLR, one of our proposed tests, shows the best overall power and that it is moderately influenced by outlying values. 
Date:  2004–10 
URL:  http://d.repec.org/n?u=RePEc:ant:wpaper:2004018&r=ecm 
By:  George Kapetanios; Tony Yates 
Abstract:  Over time, economic statistics are refined. This means that newer data are typically less well measured than old data. Time- or vintage-variation in measurement error like this influences how forecasts should be made. Measurement error is obviously not directly observable. This paper shows that modelling the behaviour of the statistics agency generates an estimate of this time-variation. This provides an alternative to assuming that the final releases of variables are true. The paper applies the method to UK aggregate expenditure data, and demonstrates the gains in forecasting from exploiting these model-based estimates of measurement error. 
URL:  http://d.repec.org/n?u=RePEc:boe:boeewp:238&r=ecm 
By:  David Mandy (Department of Economics, University of MissouriColumbia); Sandor Fridli 
Abstract:  We show under very parsimonious assumptions that FGLS and GLS are asymptotically equivalent when errors follow an invertible MA(1) process. Although the linear regression model with MA errors has been studied for many years, asymptotic equivalence of FGLS and GLS has never been established for this model. We do not require anything beyond a finite second moment of the conditional white noise, uniformly bounded fourth moments and independence of the regressor vectors, consistency of the estimator for the MA parameter, and a finite nonsingular probability limit for the (transformed) averages of the regressors. These assumptions are analogous to assumptions typically used to prove asymptotic equivalence of FGLS and GLS in SUR models, models with AR(p) errors, and models of parametric heteroscedasticity. 
JEL:  L5 
Date:  2004–12–16 
URL:  http://d.repec.org/n?u=RePEc:umc:wpaper:0405&r=ecm 
By:  Michael W. McCracken (Department of Economics, University of MissouriColumbia) 
Abstract:  This paper presents analytical, Monte Carlo and empirical evidence concerning out-of-sample tests of Granger causality. The environment is one in which the relative predictive ability of two nested parametric regression models is of interest. Results are provided for three statistics: a regression-based statistic suggested by Granger and Newbold (1977), a t-type statistic comparable to those suggested by Diebold and Mariano (1995) and West (1996), and an F-type statistic akin to Theil’s U. Since the asymptotic distributions under the null are nonstandard, tables of asymptotically valid critical values are provided. Monte Carlo evidence supports the theoretical results. An empirical example relating the predictive content of an interest spread to growth shows that the tests can provide a useful model selection tool for forecasting. 
Keywords:  Granger causality, forecast evaluation, hypothesis testing, model selection 
JEL:  C12 C32 C52 C53 
Date:  2004–12–23 
URL:  http://d.repec.org/n?u=RePEc:umc:wpaper:0406&r=ecm 
By:  Michael W. McCracken (Department of Economics, University of MissouriColumbia); Todd E. Clark (Federal Reserve Bank of Kansas City) 
Abstract:  This paper presents analytical, Monte Carlo, and empirical evidence on the effectiveness of combining recursive and rolling forecasts when linear predictive models are subject to structural change. We first provide a characterization of the bias-variance tradeoff faced when choosing between either the recursive and rolling schemes or a scalar convex combination of the two. From that, we derive pointwise-optimal, time-varying and data-dependent observation windows and combining weights designed to minimize mean square forecast error. We then proceed to consider other methods of forecast combination, including Bayesian methods that shrink the rolling forecast to the recursive and Bayesian model averaging. Monte Carlo experiments and several empirical examples indicate that although the recursive scheme is often difficult to beat, when gains can be obtained, some form of shrinkage can often provide improvements in forecast accuracy relative to forecasts made using the recursive scheme or the rolling scheme with a fixed window width. 
Keywords:  structural breaks, forecasting, model averaging. 
JEL:  C53 C12 C52 
Date:  2004–12–23 
URL:  http://d.repec.org/n?u=RePEc:umc:wpaper:0420&r=ecm 
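The recursive/rolling tradeoff can be sketched with simple mean forecasts under a single break: the recursive (expanding-window) mean is less noisy but stays biased after the break, the rolling mean adapts faster at the cost of extra variance, and a convex combination sits in between. The window width, combining weight, and break below are illustrative assumptions, not the paper's derived optimal values.

```python
import random

def msfe(ys, window, weight):
    """Mean squared one-step-ahead forecast error. The recursive forecast
    is the mean of all past data, the rolling forecast the mean of the
    last `window` observations, and the combined forecast is
    weight * recursive + (1 - weight) * rolling."""
    errs = []
    for t in range(window, len(ys) - 1):
        recursive = sum(ys[: t + 1]) / (t + 1)
        rolling = sum(ys[t + 1 - window : t + 1]) / window
        combined = weight * recursive + (1.0 - weight) * rolling
        errs.append(ys[t + 1] - combined)
    return sum(e * e for e in errs) / len(errs)

random.seed(5)
# The mean shifts from 0 to 1 halfway through: a single structural break.
ys = [random.gauss(0.0, 0.5) for _ in range(150)] + \
     [random.gauss(1.0, 0.5) for _ in range(150)]

for w, label in [(1.0, "recursive"), (0.0, "rolling"), (0.5, "combined")]:
    print(f"{label:9s} MSFE: {msfe(ys, window=30, weight=w):.3f}")
```

With a large break relative to the noise, the rolling scheme wins here; absent a break the ranking reverses, which is the bias-variance tension the paper's optimal weights trade off.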
By:  Alan Manning 
Abstract:  This note provides a simple exposition of what IV can and cannot estimate in a model with a binary treatment variable and heterogeneous treatment effects. It shows how linear IV is a misspecification of functional form, and that linear IV estimates for this model will always depend on the instrument used because of this misspecification. It shows that if one can estimate the correct functional form (nonlinear IV) then the treatment effects are independent of the instrument used. However, the data may not be rich enough in practice to be able to identify these treatment effects without strong distributional assumptions. In this case, one will have to settle for estimates of treatment effects that are instrument-dependent. 
Keywords:  Instrumental Variables, treatment effects, identification 
JEL:  C2 
Date:  2004–02 
URL:  http://d.repec.org/n?u=RePEc:cep:cepdps:dp0619&r=ecm 