
on Econometric Time Series 
By:  Anders Bredahl Kock (CREATES, Aarhus University); Timo Teräsvirta (CREATES, Aarhus University) 
Abstract:  In this paper, nonlinear models are restricted to mean nonlinear parametric models. Several such models popular in time series econometrics are presented and some of their properties discussed. This includes two models based on universal approximators: the Kolmogorov-Gabor polynomial model and two versions of a simple artificial neural network model. Techniques for generating multiperiod forecasts from nonlinear models recursively are considered, and the direct (nonrecursive) method for this purpose is mentioned as well. Forecasting with complex dynamic systems, albeit less frequently applied to economic forecasting problems, is briefly highlighted. A number of large published studies comparing macroeconomic forecasts obtained using different time series models are discussed, and the paper also contains a small simulation study comparing recursive and direct forecasts in a particular case where the data-generating process is a simple artificial neural network model. Suggestions for further reading conclude the paper. 
Keywords:  forecast accuracy, Kolmogorov-Gabor, nearest neighbour, neural network, nonlinear regression 
JEL:  C22 C45 C52 C53 
Date:  2010–01–01 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201001&r=ets 
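The recursive versus direct forecasting distinction in the abstract can be sketched in Python. The single-hidden-unit neural network model form and all parameter values below are illustrative assumptions, not taken from the paper:

```python
import math
import random

# Hypothetical single-hidden-unit neural network AR(1) model
# (illustrative parameters, not taken from the paper):
#   y_t = 0.5 * y_{t-1} + 0.8 * tanh(2.0 * y_{t-1}) + e_t
def g(y_prev):
    return 0.5 * y_prev + 0.8 * math.tanh(2.0 * y_prev)

def recursive_forecast(y_last, horizon, n_draws=5000, sigma=0.1, seed=42):
    """Recursive multi-period forecast by Monte Carlo: iterate the model
    one step at a time, integrating out future shocks by simulation."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_draws):
        y = y_last
        for _ in range(horizon):
            y = g(y) + rng.gauss(0.0, sigma)
        total += y
    return total / n_draws

def naive_recursive_forecast(y_last, horizon):
    """Plug-in recursion that ignores the shocks. Biased when g is
    nonlinear, because E[g(y)] != g(E[y])."""
    y = y_last
    for _ in range(horizon):
        y = g(y)
    return y
```

The direct method would instead regress y_{t+h} on y_t and forecast in a single step, avoiding the recursion entirely.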
By:  Ole E. Barndorff-Nielsen (Aarhus University and CREATES); José Manuel Corcuera (University of Barcelona); Mark Podolskij (ETH Zürich and CREATES) 
Abstract:  We present some new asymptotic results for functionals of higher order differences of Brownian semistationary processes. In an earlier work [4] we derived a similar asymptotic theory for first order differences. However, the central limit theorems were valid only for certain values of the smoothness parameter of a Brownian semistationary process, and the parameter values which appear in typical applications, e.g. in modeling turbulent flows in physics, were excluded. The main goal of the current paper is the derivation of the asymptotic theory for the whole range of the smoothness parameter by using second order differences. We present the law of large numbers for the multipower variation of the second order differences of Brownian semistationary processes and show the associated central limit theorem. Finally, we demonstrate some estimation methods for the smoothness parameter of a Brownian semistationary process as an application of our probabilistic results. 
Keywords:  Brownian semistationary processes, central limit theorem, Gaussian processes, high frequency observations, higher order differences, multipower variation, stable convergence. 
JEL:  C10 C13 C14 
Date:  2009–12–21 
URL:  http://d.repec.org/n?u=RePEc:aah:create:200960&r=ets 
By:  Russell Davidson (GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille - Université de la Méditerranée - Aix-Marseille II - Université Paul Cézanne - Aix-Marseille III - École des Hautes Études en Sciences Sociales (EHESS) - CNRS : UMR 6579; CIREQ - Centre interuniversitaire de recherche en économie quantitative - Université de Montréal; Department of Economics, McGill University) 
Abstract:  Testing for a unit root in a series obtained by summing a stationary MA(1) process with a parameter close to 1 leads to serious size distortions under the null, on account of the near cancellation of the unit root by the MA component in the driving stationary series. The situation is analysed from the point of view of bootstrap testing, and an exact quantitative account is given of the error in rejection probability of a bootstrap test. A particular method of estimating the MA parameter is recommended, as it leads to very little distortion even when the MA parameter is close to 1. A new bootstrap procedure with still better properties is proposed. While more computationally demanding than the usual bootstrap, it is much less so than the double bootstrap. 
Keywords:  Unit root test, bootstrap, MA(1), size distortion 
Date:  2009–12–30 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:halshs00443561_v1&r=ets 
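The error-in-rejection-probability analysis presumes the basic bootstrap p-value construction, which can be sketched generically as follows (this is not Davidson's specific resampling scheme, only the standard p-value formula any such test uses):

```python
def bootstrap_pvalue(stat_hat, boot_stats, left_tail=True):
    """Bootstrap p-value: the share of bootstrap statistics at least as
    extreme as the observed one, with the usual +1 finite-B correction.
    Unit-root t-type tests reject in the left tail."""
    if left_tail:
        extreme = sum(1 for s in boot_stats if s <= stat_hat)
    else:
        extreme = sum(1 for s in boot_stats if s >= stat_hat)
    return (1 + extreme) / (1 + len(boot_stats))
```

The size distortion the paper quantifies arises when the bootstrap DGP, built from an estimated MA parameter near 1, misrepresents the null distribution of the statistic.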
By:  James G. MacKinnon (Queen's University) 
Abstract:  This paper provides tables of critical values for some popular tests of cointegration and unit roots. Although these tables are necessarily based on computer simulations, they are much more accurate than those previously available. The results of the simulation experiments are summarized by means of response surface regressions in which critical values depend on the sample size. From these regressions, asymptotic critical values can be read off directly, and critical values for any finite sample size can easily be computed with a hand calculator. Added in 2010 version: A new appendix contains additional results that are more accurate and cover more cases than the ones in the original paper. 
Keywords:  unit root test, Dickey-Fuller test, Engle-Granger test, ADF test 
JEL:  C16 C22 C32 C12 C15 
Date:  2010–01 
URL:  http://d.repec.org/n?u=RePEc:qed:wpaper:1227&r=ets 
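The response-surface idea can be sketched in Python. The functional form CV(T) = b_inf + b1/T + b2/T^2 is the generic shape of such regressions; the coefficients below are illustrative placeholders, NOT MacKinnon's published values:

```python
def response_surface_cv(T, b_inf, b1, b2=0.0):
    """Finite-sample critical value from a response-surface regression:
    CV(T) = b_inf + b1/T + b2/T^2, where b_inf is the asymptotic
    critical value and T is the sample size."""
    return b_inf + b1 / T + b2 / T ** 2

# Illustrative coefficients only -- not MacKinnon's published values.
cv_100 = response_surface_cv(100, b_inf=-2.86, b1=-2.7, b2=-3.4)
```

As T grows, the 1/T terms vanish and the asymptotic critical value b_inf can be read off directly, exactly as the abstract describes.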
By:  Zaichao Du (Indiana University) 
Abstract:  In this paper, we develop a general method of testing for independence when unobservable generalized errors are involved. Our method can be applied to testing for serial independence of generalized errors, and testing for independence between the generalized errors and observable covariates. The former can serve as a unified approach to testing adequacy of time series models, as model adequacy often implies that the generalized errors obtained after a suitable transformation are independent and identically distributed. The latter is a key identification assumption in many nonlinear economic models. Our tests are based on a classical sample dependence measure, the Hoeffding-Blum-Kiefer-Rosenblatt-type empirical process applied to generalized residuals. We establish a uniform expansion of the process, thereby deriving an explicit expression for the parameter estimation effect, which causes our tests not to be nuisance-parameter-free. To circumvent this problem, we propose a multiplier-type bootstrap to approximate the limit distribution. Our bootstrap procedure is computationally very simple as it does not require a re-estimation of the parameters in each bootstrap replication. In a simulation study, we apply our method to test the adequacy of ARMA-GARCH and Hansen (1994) skewed t models, and document a good finite sample performance of our test. Finally, an empirical application to some daily exchange rate data highlights the merits of our approach. 
Date:  2009–12 
URL:  http://d.repec.org/n?u=RePEc:inu:caeprp:2009023&r=ets 
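A toy version of the Blum-Kiefer-Rosenblatt-type dependence measure underlying the tests can be sketched as follows. This is a naive O(n^2) implementation for raw bivariate data; the paper applies the corresponding process to generalized residuals with estimated parameters, which is what makes the nuisance-parameter problem arise:

```python
def bkr_statistic(x, y):
    """Blum-Kiefer-Rosenblatt-type dependence statistic: the average
    squared distance between the joint empirical CDF and the product of
    the marginal empirical CDFs, evaluated at the sample points."""
    n = len(x)
    total = 0.0
    for i in range(n):
        fxy = sum(1 for j in range(n) if x[j] <= x[i] and y[j] <= y[i]) / n
        fx = sum(1 for j in range(n) if x[j] <= x[i]) / n
        fy = sum(1 for j in range(n) if y[j] <= y[i]) / n
        total += (fxy - fx * fy) ** 2
    return total / n
```

Under independence the joint CDF factorizes and the statistic is small; strong dependence makes it strictly positive.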
By:  J. Carlos Escanciano (Indiana University) 
Abstract:  This article investigates model checks for a class of possibly nonlinear heteroskedastic time series models, including but not restricted to ARMA-GARCH models. We propose omnibus tests based on functionals of certain weighted standardized residual empirical processes. The new tests are asymptotically distribution-free, suitable when the conditioning set is infinite-dimensional, and consistent against a class of Pitman's local alternatives converging at the parametric rate n^(-1/2), with n the sample size. A Monte Carlo study shows that the simulated level of the proposed tests is close to the asymptotic level already for moderate sample sizes and that the tests have a satisfactory power performance. Finally, we illustrate our methodology with an application to the well-known S&P 500 daily stock index. The paper also contains an asymptotic uniform expansion for weighted residual empirical processes when initial conditions are considered, a result of independent interest. 
Date:  2009–09 
URL:  http://d.repec.org/n?u=RePEc:inu:caeprp:2009019&r=ets 
By:  Le, Vo Phuong Mai (Cardiff Business School); Minford, Patrick (Cardiff Business School); Wickens, Michael 
Abstract:  We review the methods used in many papers to evaluate DSGE models by comparing their simulated moments and other features with data equivalents. We note that they select, scale and characterise the shocks without reference to the data; crucially, they fail to use the joint distribution of the features under comparison. We illustrate this point by recomputing an assessment of a two-country model in a recent paper; we find that the paper's conclusions are essentially reversed. 
Keywords:  Bootstrap; US-EU model; DSGE; VAR; indirect inference; Wald statistic; anomaly; puzzle 
JEL:  C12 C32 C52 E1 
Date:  2009–12 
URL:  http://d.repec.org/n?u=RePEc:cdf:wpaper:2009/31&r=ets 
By:  Ramses H. Mena; Stephen G. Walker 
Abstract:  This paper studies a novel idea for constructing continuous-time stationary Markov models. The approach undertaken is based on a latent representation of the corresponding transition probabilities that leads to appealing ways to study and simulate the dynamics of the constructed processes. Some well-known models are shown to fall within this construction, shedding some light on both their theoretical and applied properties. As an illustration of the capabilities of our proposal, a simple estimation problem is posed. 
Keywords:  Gibbs sampler; Markov process; Stationary process 
Date:  2009–12 
URL:  http://d.repec.org/n?u=RePEc:icr:wpmath:252009&r=ets 
By:  Ramses H. Mena; Matteo Ruggiero; Stephen G. Walker 
Abstract:  This paper is concerned with the construction of a continuous parameter sequence of random probability measures and its application for modeling random phenomena evolving in continuous time. At each time point we have a random probability measure which is generated by a Bayesian nonparametric hierarchical model, and the dependence structure is induced through a Wright-Fisher diffusion with mutation. The sequence is shown to be a stationary and reversible diffusion taking values on the space of probability measures. A simple estimation procedure for discretely observed data is presented and illustrated with simulated and real data sets. 
Keywords:  Bayesian nonparametric inference, continuous time dependent random measure, Markov process, measure-valued process, stationary process, stick-breaking process 
Date:  2009–12 
URL:  http://d.repec.org/n?u=RePEc:icr:wpmath:262009&r=ets 
By:  Clifford R. Wymer 
Abstract:  The dynamics of economic behaviour is often developed in theory as a continuous time system. Rigorous estimation and testing of such systems, and the analysis of some aspects of their properties, is of particular importance in distinguishing between competing hypotheses and the resulting models. The consequences for the international economy during the past eighteen months of failures in the financial sector, and particularly the banking sector, make it essential that the dynamics of financial and commodity markets and of macroeconomic policy are well understood. The nonlinearity of the economic system means that its properties are heavily dependent on its parameter values. The estimators discussed here are tools to provide those parameter estimates. 
Keywords:  Continuous time; Dynamics. 
Date:  2009–09 
URL:  http://d.repec.org/n?u=RePEc:des:wpaper:16&r=ets 
By:  JeanPhilippe Cayen; MarcAndré Gosselin; Sharon Kozicki 
Abstract:  The workhorse DSGE model used for monetary policy evaluation is designed to capture business cycle fluctuations in an optimization-based format. It is commonplace to log-linearize models and express them with variables in deviation-from-steady-state format. Structural parameters are either calibrated, or estimated using data prefiltered to extract trends. Such procedures treat past and future trends as fully known by all economic agents or, at least, as independent of cyclical behaviour. With such a setup, in a forecasting environment it seems natural to add forecasts from DSGE models to trend forecasts. While this may be an intuitive starting point, efficiency can be improved in multiple dimensions. Ideally, the behaviour of trends and cycles should be jointly modeled. However, for computational reasons it may not be feasible to do so, particularly with medium- or large-scale models. Nevertheless, marginal improvements on the standard framework can still be made. First, prefiltering of data can be amended to incorporate structural links between the various trends that are implied by the economic theory on which the model is based, improving the efficiency of trend estimates. Second, forecast efficiency can be improved by building a forecast model for model-consistent trends. Third, decomposition of shocks into permanent and transitory components can be endogenized to also be model-consistent. This paper proposes a unified framework for introducing these improvements. Application of the methodology validates the existence of considerable deviations between trends used for detrending data prior to structural parameter estimation and model-consistent estimates of trends, implying the potential for efficiency gains in forecasting. Such deviations also provide information on aspects of the model that are least coherent with the data, possibly indicating model misspecification. 
Additionally, the framework provides a structure for examining cyclical responses to trend shocks, among other extensions. 
Keywords:  Business fluctuations and cycles; Econometric and statistical methods 
JEL:  E3 D52 C32 
Date:  2009 
URL:  http://d.repec.org/n?u=RePEc:bca:bocawp:0935&r=ets 
By:  Stefano Iacus (Department of Economics, Business and Statistics, University of Milan, IT); Nakahiro Yoshida (Graduate School of Mathematical Sciences, Tokyo University, Tokyo) 
Abstract:  We consider a multidimensional Itô process Y=(Y_t), t in [0,T], with some unknown drift coefficient process b_t and volatility coefficient sigma(X_t,theta) with covariate process X=(X_t), t in [0,T], the function sigma(x,theta) being known up to theta in Theta. For this model we consider a change point problem for the parameter theta in the volatility component. The change is supposed to occur at some point t* in (0,T). Given discrete time observations from the process (X,Y), we propose quasi-maximum likelihood estimation of the change point. We present the rate of convergence of the change point estimator and limit theorems of asymptotically mixed type. 
Keywords:  Itô processes, discrete time observations, change point estimation, volatility 
Date:  2009–06–18 
URL:  http://d.repec.org/n?u=RePEc:bep:unimip:1084&r=ets 
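A much-simplified discrete-time analogue of volatility change-point estimation can be sketched as follows: a Gaussian quasi-likelihood split search for a single variance shift in a zero-mean series. The paper's setting, with covariates and a parametric sigma(x, theta), is considerably more general:

```python
import math

def variance_change_point(x):
    """Gaussian quasi-likelihood change-point estimator for a variance
    shift in a zero-mean series: choose the split k minimizing
    k*log(s1^2) + (n-k)*log(s2^2), where s1^2 and s2^2 are the pre- and
    post-split mean squares."""
    n = len(x)
    best_k, best_val = None, float("inf")
    for k in range(2, n - 1):
        s1 = sum(v * v for v in x[:k]) / k          # pre-split variance
        s2 = sum(v * v for v in x[k:]) / (n - k)    # post-split variance
        val = k * math.log(s1) + (n - k) * math.log(s2)
        if val < best_val:
            best_k, best_val = k, val
    return best_k
```

The objective is (twice) the negative profiled Gaussian log-likelihood up to constants, so the minimizing split is the quasi-maximum likelihood change point.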
By:  Javier Mencía (Banco de España); Enrique Sentana (CEMFI) 
Abstract:  We derive Lagrange Multiplier and Likelihood Ratio specification tests for the null hypotheses of multivariate normal and Student t innovations using the Generalised Hyperbolic distribution as our alternative hypothesis. We decompose the corresponding Lagrange Multiplier-type tests into skewness and kurtosis components, from which we obtain more powerful one-sided Kuhn-Tucker versions that are equivalent to the Likelihood Ratio test, whose asymptotic distribution we provide. We conduct detailed Monte Carlo exercises to study our proposed tests in finite samples. Finally, we present an empirical application to ten US sectoral stock returns, which indicates that their conditional distribution is mildly asymmetric and strongly leptokurtic. 
Keywords:  Bootstrap, Inequality Constraints, Kurtosis, Normality Tests, Skewness, Supremum Test, Underidentified parameters 
JEL:  C12 C52 C32 
Date:  2009–12 
URL:  http://d.repec.org/n?u=RePEc:bde:wpaper:0929&r=ets 
By:  D. Sornette; A. Saichev; V. Filimonov 
Abstract:  We present a new theory of homogeneous volatility (and variance) estimators for arbitrary stochastic processes. The main tool of our theory is the parsimonious encoding of all the information contained in the OHLC prices for a given time interval by the joint distributions of the high-minus-open, low-minus-open and close-minus-open values, whose analytical expression is derived exactly for Wiener processes with drift. The efficiency of the new proposed estimators is favorably compared with that of the Garman-Klass, Rogers-Satchell and maximum likelihood estimators. 
Keywords:  Variance and volatility estimators, efficiency, homogeneous functions, Schwarz inequality, extremes of Wiener processes 
JEL:  C13 C51 
Date:  2009–10–25 
URL:  http://d.repec.org/n?u=RePEc:stz:wpaper:ccss0900007&r=ets 
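Of the estimators compared, the Garman-Klass estimator has a simple closed form that can be sketched directly. This is the standard zero-drift per-interval version, shown only to make concrete what an OHLC-based variance estimator looks like:

```python
import math

def garman_klass_var(o, h, l, c):
    """Garman-Klass per-interval variance estimate from OHLC prices
    (assumes zero drift): 0.5*ln(H/L)^2 - (2*ln 2 - 1)*ln(C/O)^2."""
    return (0.5 * math.log(h / l) ** 2
            - (2.0 * math.log(2.0) - 1.0) * math.log(c / o) ** 2)
```

Using the intraperiod high and low in addition to open and close is what makes such estimators far more efficient than the squared close-to-close return.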
By:  Jin Seo Cho (Korea University); Chirok Han (Korea University); Peter C. B. Phillips (Yale University, University of Auckland, University of Southampton & Singapore Management University) 
Abstract:  Least absolute deviations (LAD) estimation of linear time-series models is considered under conditional heteroskedasticity and serial correlation. The limit theory of the LAD estimator is obtained without assuming the finite density condition for the errors that is required in standard LAD asymptotics. The results are particularly useful in application of LAD estimation to financial time-series data. 
Keywords:  Asymptotic leptokurtosis, Convex function, Infinite density, Least absolute deviations, Median, Weak convergence. 
JEL:  C12 G11 
Date:  2009 
URL:  http://d.repec.org/n?u=RePEc:iek:wpaper:0917&r=ets 
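In the simplest location model, the LAD estimator reduces to the sample median, which illustrates why the estimator itself is well defined regardless of the error density (a minimal sketch, not the paper's time-series setting):

```python
def lad_location(y):
    """LAD estimate of a location parameter: the sample median, which
    minimizes sum_i |y_i - m| over m."""
    s = sorted(y)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else 0.5 * (s[mid - 1] + s[mid])
```

Like the median, LAD is robust to heavy tails, which is why the asymptotic theory without a finite-density condition matters for financial data.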
By:  Adolfo Alvarez; Daniel Peña 
Abstract:  This article discusses the problem of forming groups from previously split data. Algorithms for cluster analysis such as SAR, proposed by Peña, Rodriguez and Tiao (2004), divide the sample into small, very homogeneous groups and then recombine them to form the definitive data configuration. This kind of splitting leads to dependent data in the sense that the groups are disjoint, so traditional tests of homogeneity of means or variances cannot be used. We propose an alternative based on order statistics. By studying the distribution and some moments of linear combinations of order statistics, it is possible to recombine disjoint data groups when they arise from the same population. 
Keywords:  SAR, Cluster Analysis, Order Statistics, L-statistics, Bootstrapping 
Date:  2009–12 
URL:  http://d.repec.org/n?u=RePEc:cte:wsrepe:ws098526&r=ets 
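The L-statistics (linear combinations of order statistics) on which the proposal rests can be sketched minimally as:

```python
def l_statistic(y, weights):
    """L-statistic: a linear combination sum_i w_i * y_(i) of the order
    statistics y_(1) <= ... <= y_(n) of the sample y."""
    s = sorted(y)
    return sum(w * v for w, v in zip(weights, s))
```

The median and trimmed means are special cases obtained by particular weight vectors; the paper's contribution lies in the distribution theory of such statistics across disjoint groups, not in the statistic itself.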
By:  Emanuel Moench; Serena Ng; Simon Potter 
Abstract:  This paper uses multilevel factor models to characterize within- and between-block variations as well as idiosyncratic noise in large dynamic panels. Block-level shocks are distinguished from genuinely common shocks, and the estimated block-level factors are easy to interpret. The framework achieves dimension reduction and yet explicitly allows for heterogeneity between blocks. The model is estimated using a Markov chain Monte Carlo algorithm that takes into account the hierarchical structure of the factors. We organize a panel of 447 series into blocks according to the timing of data releases and use a four-level model to study the dynamics of real activity at both the block and aggregate levels. While the effect of the economic downturn of 2007-09 is pervasive, growth cycles are synchronized only loosely across blocks. The state of the leading and the lagging sectors, as well as that of the overall economy, is monitored in a coherent framework. 
Keywords:  Econometric models ; Economic forecasting ; Economic indicators ; Markov processes 
Date:  2009 
URL:  http://d.repec.org/n?u=RePEc:fip:fednsr:412&r=ets 
By:  Jouchi Nakajima (Department of Statistical Science, Duke University and Bank of Japan); Yasuhiro Omori (Faculty of Economics, University of Tokyo) 
Abstract:  Bayesian analysis of a stochastic volatility model with a generalized hyperbolic (GH) skew Student's t error distribution is described, where we consider asymmetric heavy-tailedness as well as leverage effects. An efficient Markov chain Monte Carlo estimation method is described, exploiting a normal variance-mean mixture representation of the error distribution with an inverse gamma distribution as the mixing distribution. The proposed method is illustrated using simulated data and daily TOPIX and S&P 500 stock returns. The model comparison for stock returns is conducted based on the marginal likelihood in the empirical study. Strong evidence of leverage and asymmetric heavy-tailedness is found in the stock returns. Further, a prior sensitivity analysis is conducted to investigate whether the obtained results are robust with respect to the choice of priors. 
Date:  2009–12 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2009cf701&r=ets 
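The normal variance-mean mixture representation with inverse gamma mixing that the sampler exploits can be sketched as a draw-generating routine. This is a simplified i.i.d. sketch with hypothetical parameter values, ignoring the SV dynamics and the remaining GH parameters:

```python
import math
import random

def gh_skew_t_draw(mu, beta, nu, rng):
    """One draw from a GH skew Student's t via its normal variance-mean
    mixture representation: z = mu + beta*W + sqrt(W)*eps, where
    W ~ InverseGamma(nu/2, nu/2) and eps ~ N(0, 1). beta = 0 recovers
    the symmetric Student's t with nu degrees of freedom."""
    # InverseGamma(a, b) draw as b-scaled reciprocal of a Gamma(a, 1):
    # gammavariate(alpha, scale) -> 1/Gamma(nu/2, scale=2/nu) ~ IG(nu/2, nu/2)
    w = 1.0 / rng.gammavariate(nu / 2.0, 2.0 / nu)
    return mu + beta * w + math.sqrt(w) * rng.gauss(0.0, 1.0)

# Hypothetical parameters, chosen only to show the negative skew:
rng = random.Random(1)
draws = [gh_skew_t_draw(0.0, -0.5, 10.0, rng) for _ in range(20000)]
```

Conditioning on W makes the draw Gaussian, which is exactly what lets an MCMC sampler update the latent quantities with standard normal-theory steps.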
By:  WeiXing Zhou (ECUST) 
Abstract:  Many financial variables are found to exhibit multifractal nature, which is usually attributed to the influence of temporal correlations and fat-tailedness in the probability distribution function (PDF). Based on the partition function approach of multifractal analysis, we show that there is a marked finite-size effect in the detection of multifractality, and the effective multifractality is the apparent multifractality after removing the finite-size effect. We find that the effective multifractality can be further decomposed into two components, the PDF component and the nonlinearity component. Referring to the normal distribution, we can determine the PDF component by comparing the effective multifractality of the original time series and the surrogate data that have a normal distribution and keep the same linear and nonlinear correlations as the original data. We demonstrate our method by taking the daily volatility data of the Dow Jones Industrial Average from 26 May 1896 to 27 April 2007 as an example. Extensive numerical experiments show that a time series exhibits effective multifractality only if it possesses nonlinearity, and the PDF has an impact on the effective multifractality only when the time series possesses nonlinearity. Our method can also be applied to judge the presence of multifractality and determine its components in multifractal time series in other complex systems. 
Date:  2009–12 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:0912.4782&r=ets 
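The partition function approach referred to in the abstract can be sketched for a normalized measure built from the absolute values of a series (a minimal box-counting sketch; the multifractal analysis then examines how chi_q(s) scales across box sizes s):

```python
def partition_function(x, scale, q):
    """Partition function chi_q(s) of the normalized measure built from
    |x|: split the series into boxes of length `scale`, sum the measure
    in each box, and return the sum of the q-th powers of box masses."""
    total = sum(abs(v) for v in x)
    n_boxes = len(x) // scale
    chi = 0.0
    for i in range(n_boxes):
        box_mass = sum(abs(v) for v in x[i * scale:(i + 1) * scale]) / total
        chi += box_mass ** q
    return chi
```

The mass exponent tau(q) is estimated as the slope of log chi_q(s) against log s over a range of scales; nonlinearity of tau(q) in q signals multifractality, and the finite-size effect the paper documents concerns the bias in that slope at finite sample lengths.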
By:  Tetsuya Takaishi 
Abstract:  The hybrid Monte Carlo (HMC) algorithm is applied to the Bayesian inference of the stochastic volatility (SV) model. We use the HMC algorithm for the Markov chain Monte Carlo updates of the volatility variables of the SV model. First we estimate the parameters of the SV model using artificial financial data and compare the results from the HMC algorithm with those from the Metropolis algorithm. We find that the HMC algorithm decorrelates the volatility variables faster than the Metropolis algorithm. Second we carry out an empirical study of the time series of the Nikkei 225 stock index using the HMC algorithm. We find correlation behavior for the sampled data similar to the results from the artificial financial data and obtain a $\phi$ value close to one ($\phi \approx 0.977$), which indicates that the time series has strong persistence of volatility shocks. 
Date:  2009–12 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1001.0024&r=ets 
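A minimal one-dimensional HMC update of the kind applied to the volatility variables can be sketched as follows. This is a toy sketch targeting a standard normal; the SV application would replace the target with the volatilities' full conditional log-density:

```python
import math
import random

def hmc_step(x, logp, grad, eps, n_leap, rng):
    """One hybrid Monte Carlo update: draw a momentum, integrate the
    Hamiltonian dynamics with the leapfrog scheme, then accept or reject
    with a Metropolis test on the change in the Hamiltonian."""
    p0 = rng.gauss(0.0, 1.0)
    x_new, p = x, p0
    p += 0.5 * eps * grad(x_new)           # initial momentum half-step
    for i in range(n_leap):
        x_new += eps * p                   # full position step
        if i != n_leap - 1:
            p += eps * grad(x_new)         # full momentum step
    p += 0.5 * eps * grad(x_new)           # final momentum half-step
    h_old = -logp(x) + 0.5 * p0 * p0
    h_new = -logp(x_new) + 0.5 * p * p
    if rng.random() < math.exp(min(0.0, h_old - h_new)):
        return x_new
    return x

# Toy target: standard normal, logp(t) = -t^2/2, grad(t) = -t.
rng = random.Random(7)
x, samples = 0.0, []
for _ in range(5000):
    x = hmc_step(x, lambda t: -0.5 * t * t, lambda t: -t, 0.3, 10, rng)
    samples.append(x)
```

The long leapfrog trajectories are what decorrelate successive draws faster than a random-walk Metropolis update, which is the effect the paper measures on the SV volatility variables.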