
New Economics Papers on Econometric Time Series
By:  Masahiko Shibamoto (Research Institute for Economics and Business Administration, Kobe University); Yoshiro Tsutsui (Graduate School of Economics, Osaka University) 
Abstract:  Studies using the dynamic panel regression approach have found the speed of income convergence among world and regional economies to be high. For example, Lee et al. (1997, 1998) report an income convergence speed of 30% per annum. This note argues that their estimates may be seriously overstated. Using a factor model, we show that the coefficient on lagged income in their specification may measure not the long-run convergence speed but the speed at which short-run deviations from the long-run equilibrium path are corrected. We give an example of an empirical analysis in which the short-run adjustment speed is about 40%. 
Keywords:  convergence speed, dynamic panel regression, factor model 
JEL:  O40 
Date:  2011–01 
URL:  http://d.repec.org/n?u=RePEc:kob:dpaper:dp201104&r=ets 
By:  Prasad S Bhattacharya; Dimitrios D Thomakos 
Abstract:  This study presents extensive results on the benefits of rolling window and model averaging. Building on the recent work on rolling window averaging by Pesaran et al. (2009, 2010) and on exchange rate forecasting by Molodtsova and Papell (2009), we explore whether rolling window averaging can be considered beneficial on a priori grounds. We investigate whether rolling window averaging can improve the performance of model averaging, especially when ‘simpler’ models are used. The analysis provides strong support for rolling window averaging, which outperforms the best single-window forecasts more than 50% of the time across all rolling windows. Furthermore, rolling window averaging smooths out the forecast path, improves robustness, and minimizes the pitfalls associated with potential structural breaks. 
Keywords:  Exchange rate forecasting, inflation forecasting, output growth forecasting, rolling window, model averaging, short horizon, robustness. 
JEL:  C22 C53 F31 F47 E31 
Date:  2011–02–21 
URL:  http://d.repec.org/n?u=RePEc:dkn:econwp:eco_2011_1&r=ets 
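The core idea, averaging one-step forecasts produced from estimation windows of different lengths, can be sketched in a few lines. This is an illustrative sketch with a simple AR(1) forecasting rule and equal weights, not the authors' exact setup; the window sizes are assumptions for the example.

```python
import numpy as np

def rolling_window_forecasts(y, windows):
    """One-step-ahead AR(1) forecasts of y, where each window size w
    uses only the last w observations to estimate the slope
    (OLS without intercept; illustrative, not the authors' models)."""
    t0 = max(windows)              # first origin where every window fits
    forecasts = {}
    for w in windows:
        preds = []
        for t in range(t0, len(y)):
            seg = y[t - w:t]
            x, z = seg[:-1], seg[1:]
            beta = x @ z / (x @ x)  # OLS slope on the window
            preds.append(beta * y[t - 1])
        forecasts[w] = np.array(preds)
    return forecasts

def averaged_forecast(forecasts):
    """Equal-weight average of the forecasts across window sizes."""
    return np.mean(list(forecasts.values()), axis=0)
```

By Jensen's inequality the squared error of the averaged forecast can never exceed the average of the individual windows' squared errors at any date, which is one a priori rationale for window averaging.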
By:  Dong Jin Lee (University of Connecticut) 
Abstract:  This paper considers tests for structural breaks in linear models when the regressors and the serially dependent error process are unstable. The set of models covers various economic circumstances, such as structural breaks in the regressors and/or the error variance, and a linear trend model with I(0)/I(1) errors. We show that the existing heteroscedasticity-robust tests and the fixed regressor bootstrap method of Hansen (2000) suffer severe size distortions even asymptotically. We suggest a method that combines the fixed regressor bootstrap and the sieve-wild bootstrap to nonparametrically approximate the serially dependent, unstable error process. The suggested method is shown to asymptotically replicate the true distribution of the existing tests under various circumstances. Monte Carlo experiments show significant improvements in both size and power properties. Once the size is controlled by the bootstrap, Wald-type tests have better power properties than LM-type tests. 
Keywords:  structural break, sieve bootstrap, fixed regressor bootstrap, robust test, break in linear trend 
JEL:  C10 C12 C22 
Date:  2011–03 
URL:  http://d.repec.org/n?u=RePEc:uct:uconnp:201105&r=ets 
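The sieve-wild step of such a procedure, approximating a serially dependent and possibly heteroskedastic error process, can be sketched as one bootstrap draw. This is a minimal illustration assuming an AR(p) sieve with Rademacher wild weights; the fixed regressor part and the choice of p are omitted.

```python
import numpy as np

def sieve_wild_draw(resid, p, rng):
    """One sieve-wild bootstrap draw of an error process.
    Sieve step: fit AR(p) to the residuals by OLS.
    Wild step: multiply the AR innovations by Rademacher weights,
    preserving (conditional) heteroskedasticity.
    The bootstrap series is then rebuilt recursively."""
    n = len(resid)
    # lag matrix: column j holds resid[t-1-j] for rows t = p..n-1
    X = np.column_stack([resid[p - 1 - j:n - 1 - j] for j in range(p)])
    phi = np.linalg.lstsq(X, resid[p:], rcond=None)[0]
    innov = resid[p:] - X @ phi
    wild = innov * rng.choice([-1.0, 1.0], size=n - p)
    e = np.empty(n)
    e[:p] = resid[:p]              # initialize with observed values
    for t in range(p, n):
        lags = e[t - p:t][::-1]    # e[t-1], ..., e[t-p]
        e[t] = phi @ lags + wild[t - p]
    return e
```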
By:  Amélie Charles (Audencia Nantes, School of Management); Olivier Darné (LEMNA, University of Nantes); Jae H Kim (School of Economics and Finance, La Trobe University) 
Abstract:  A Monte Carlo experiment is conducted to compare power properties of alternative tests for the martingale difference hypothesis. Overall, we find that the wild bootstrap automatic variance ratio test shows the highest power against linear dependence, while the generalized spectral test performs most desirably under nonlinear dependence. 
Keywords:  Monte Carlo experiment; Nonlinear dependence; Portmanteau test; Variance ratio test 
JEL:  C12 C14 
Date:  2010–11 
URL:  http://d.repec.org/n?u=RePEc:ltr:wpaper:2010.07&r=ets 
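A minimal version of the ingredients being compared, a variance ratio statistic with a wild-bootstrap p-value, might look as follows. The Lo-MacKinlay statistic and normal wild weights are standard choices; the automatic lag selection of the actual test is omitted, and the names below are illustrative.

```python
import numpy as np

def variance_ratio(r, k):
    """Lo-MacKinlay variance ratio: variance of overlapping k-period
    sums of returns r divided by k times the one-period variance.
    Close to 1 under the martingale difference hypothesis."""
    r = r - r.mean()
    ksums = np.convolve(r, np.ones(k), 'valid')   # overlapping k-sums
    return np.mean(ksums ** 2) / (k * np.mean(r ** 2))

def wild_bootstrap_pvalue(r, k, n_boot=499, rng=None):
    """Wild-bootstrap p-value for |VR(k) - 1|: each replication uses
    r_t * eta_t with iid standard normal eta_t, which imposes the
    null (no serial correlation) while preserving heteroskedasticity."""
    rng = np.random.default_rng(0) if rng is None else rng
    stat = abs(variance_ratio(r, k) - 1)
    boot = [abs(variance_ratio(r * rng.standard_normal(len(r)), k) - 1)
            for _ in range(n_boot)]
    return np.mean([b >= stat for b in boot])
```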
By:  Charles S. Bos 
Abstract:  Estimation of the volatility of time series has taken off since the introduction of the GARCH and stochastic volatility models. While variants of the GARCH model are applied in scores of articles, use of the stochastic volatility model is less widespread. In this article it is argued that one reason for this difference is the relative difficulty of estimating the unobserved stochastic volatility, and the varying approaches that have been taken for such estimation. In order to simplify the comprehension of these estimation methods, the main methods for estimating stochastic volatility are discussed, with focus on their commonalities. In this manner, the advantages of each method are investigated, resulting in a comparison of the methods in terms of efficiency, difficulty of implementation, and precision. 
Keywords:  Stochastic volatility; estimation; methodology 
JEL:  C13 C51 
Date:  2011–03–03 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:20110049&r=ets 
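As a concrete example of why the latent volatility is awkward, consider the basic SV model and one of the simplest linearization ideas: work with log squared returns, whose autocovariances identify the persistence of log volatility. The moment-based estimator below is a crude illustrative sketch under that linearization, not one of the article's recommended methods, and the parameter values are assumptions.

```python
import numpy as np

def simulate_sv(n, mu, phi, sigma_eta, rng):
    """Basic stochastic volatility model:
    h_t = mu + phi*(h_{t-1} - mu) + sigma_eta*eta_t,
    r_t = exp(h_t / 2) * eps_t, with eta, eps iid standard normal."""
    eta = rng.standard_normal(n)
    h = np.empty(n)
    h[0] = mu
    for t in range(1, n):
        h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * eta[t]
    return np.exp(h / 2) * rng.standard_normal(n)

def crude_phi_estimate(r):
    """Method-of-moments estimate of phi from x_t = log r_t^2
    = h_t + noise, where the log-chi-square(1) noise has variance
    pi^2/2: the lag-1 autocovariance of x equals phi * Var(h)."""
    x = np.log(r ** 2 + 1e-300)
    x = x - x.mean()
    gamma0 = np.mean(x * x)
    gamma1 = np.mean(x[1:] * x[:-1])
    return gamma1 / (gamma0 - np.pi ** 2 / 2)
```

The point of the sketch is the difficulty the abstract alludes to: the volatility h_t never appears in the data, so every estimator has to integrate it out or approximate it away.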
By:  Antonietta Mira (Department of Economics, University of Insubria, Italy); Daniele Imparato (Department of Economics, University of Insubria, Italy); Reza Solgi (Istituto di Finanza, Università di Lugano) 
Abstract:  A general-purpose variance reduction technique for Markov chain Monte Carlo (MCMC) estimators, based on the zero-variance principle introduced in the physics literature, is proposed to evaluate the expected value of a function f with respect to a possibly unnormalized probability distribution. In this context, a control variate approach, commonly used in Monte Carlo simulation, is exploited by replacing f with a different function f̃. The function f̃ is constructed so that its expectation under the target distribution equals that of f, but its variance is much smaller. Theoretically, an optimal renormalized function exists that may lead to zero variance; in practice, a suitable approximation to it must be found. In this paper, an efficient class of renormalized functions f̃ is investigated, based on a polynomial parametrization. We find that a low-degree polynomial (first, second or third degree) can lead to dramatic variance reduction in the resulting zero-variance MCMC estimator. General formulas for the construction of the control variates in this context are given. These allow for an easy implementation of the method in very general settings, regardless of the form of the target/posterior distribution (only differentiability is required) and of the MCMC algorithm implemented (in particular, no reversibility is needed). 
Keywords:  Control variates, GARCH models, Logistic regression, Metropolis-Hastings algorithm, Variance reduction 
Date:  2011–03 
URL:  http://d.repec.org/n?u=RePEc:ins:quaeco:qf1109&r=ets 
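The construction can be made concrete in one dimension. For a differentiable target with score s(x) = d log pi(x)/dx and a trial polynomial P, the quantity u(x) = P''(x)/2 + P'(x)s(x)/2 integrates to zero against the target (integration by parts), so it is a valid control variate, and its coefficient can be fitted by least squares. The sketch below, with a standard normal target and P(x) = x (illustrative choices, not the paper's general formulas), reproduces the zero-variance phenomenon.

```python
import numpy as np

def rw_metropolis(log_pi, n, step, rng, x0=0.0):
    """Random-walk Metropolis chain targeting exp(log_pi)."""
    x, out = x0, np.empty(n)
    for i in range(n):
        prop = x + step * rng.standard_normal()
        if np.log(rng.random()) < log_pi(prop) - log_pi(x):
            x = prop
        out[i] = x
    return out

def zv_samples(fx, u):
    """Replace f(x_i) by f(x_i) + b*u(x_i), where u is a zero-mean
    control variate and b is the variance-minimizing least-squares
    coefficient, estimated from the same draws."""
    fc, uc = fx - fx.mean(), u - u.mean()
    b = -np.mean(fc * uc) / np.mean(uc * uc)
    return fx + b * u

rng = np.random.default_rng(0)
xs = rw_metropolis(lambda x: -0.5 * x * x, 5000, 2.0, rng)  # N(0,1) target
# f(x) = x with trial polynomial P(x) = x: u = P''/2 + P'*s/2 = -x/2
fzv = zv_samples(xs, -xs / 2)
```

For a Gaussian target and a first-degree P the correction is exact, so f̃ is constant: this is the "zero variance" that gives the principle its name. For general targets the polynomial only approximates the optimal renormalization.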
By:  Hong, Seung Hyun (Korea Institute of Public Finance, Seoul, Korea); Wagner, Martin (Department of Economics and Finance, Institute for Advanced Studies, Vienna, Austria) 
Abstract:  This paper develops a fully modified OLS estimator for cointegrating polynomial regressions, i.e. for regressions including deterministic variables, integrated processes and powers of integrated processes as explanatory variables, with stationary errors. The errors are allowed to be serially correlated and the regressors are allowed to be endogenous. The paper thus extends the fully modified approach developed in Phillips and Hansen (1990). The FM-OLS estimator has a zero-mean Gaussian mixture limiting distribution, which is the basis for standard asymptotic inference. In addition, Wald and LM tests for specification as well as a KPSS-type test for cointegration are derived. The theoretical analysis is complemented by a simulation study, which shows that the developed FM-OLS estimator and tests based upon it perform well, in the sense that their performance advantages over OLS are by and large similar to those of FM-OLS over OLS in standard cointegrating regressions. 
Keywords:  Cointegrating polynomial regression, fully modified OLS estimation, integrated process, testing 
JEL:  C12 C13 C32 
Date:  2011–03 
URL:  http://d.repec.org/n?u=RePEc:ihs:ihsesp:264&r=ets 
By:  Pierre Guerin; Massimiliano Marcellino 
Abstract:  This paper introduces a new regression model, Markov-switching mixed data sampling (MS-MIDAS), that incorporates regime changes in the parameters of the mixed data sampling (MIDAS) models and allows for the use of mixed-frequency data in Markov-switching models. After a discussion of estimation and inference for MS-MIDAS, and a small-sample simulation-based evaluation, the MS-MIDAS model is applied to the prediction of US and UK economic activity, both in terms of quantitative forecasts of aggregate economic activity and in terms of predicting business cycle regimes. Both simulation and empirical results indicate that MS-MIDAS is a very useful specification. 
Keywords:  Business cycle, Mixed-frequency data, Nonlinear models, Forecasting, Nowcasting 
JEL:  C22 C53 E37 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:eui:euiwps:eco2011/03&r=ets 
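The mixed-frequency ingredient of such models is the MIDAS weighting scheme that maps many high-frequency lags into one low-frequency regressor. Below is a sketch of the commonly used exponential Almon parametrization and the aggregation step; the Markov-switching layer (regime-dependent parameters) and the estimation are omitted, and the lag length and parameter values are illustrative.

```python
import numpy as np

def exp_almon_weights(n_lags, theta1, theta2):
    """Exponential Almon lag polynomial used in MIDAS regressions:
    normalized weights on high-frequency lags j = 1..n_lags."""
    j = np.arange(1, n_lags + 1)
    w = np.exp(theta1 * j + theta2 * j ** 2)
    return w / w.sum()

def midas_aggregate(x_high, n_lags, m, theta1, theta2):
    """Weighted sums of the last n_lags high-frequency observations,
    taken every m periods (e.g. m = 3 months per quarter); the most
    recent observation receives weight w[0]."""
    w = exp_almon_weights(n_lags, theta1, theta2)
    ends = np.arange(n_lags, len(x_high) + 1, m)
    return np.array([w @ x_high[e - n_lags:e][::-1] for e in ends])
```

Because the weights are a smooth function of only two parameters, many high-frequency lags can enter the regression without a proliferation of coefficients, which is the reason the scheme scales to mixed-frequency settings.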
By:  ZhiQiang Jiang; WeiXing Zhou 
Abstract:  There are a number of situations in which several signals are simultaneously recorded in complex systems that exhibit long-term power-law cross-correlations. Multifractal detrended cross-correlation analysis (MF-DCCA) approaches can be used to quantify such cross-correlations, such as the MF-DCCA method based on detrended fluctuation analysis (MF-X-DFA). We develop in this work a class of MF-DCCA algorithms based on detrending moving average analysis, called MF-X-DMA. The performance of the MF-X-DMA algorithms is compared with the MF-X-DFA method by extensive numerical experiments on pairs of time series generated from bivariate fractional Brownian motions, two-component autoregressive fractionally integrated moving average processes and binomial measures, whose multifractal nature is known analytically. In all cases, the scaling exponents $h_{xy}$ extracted from the MF-X-DMA and MF-X-DFA algorithms are very close to the theoretical values. For bivariate fractional Brownian motions, the scaling exponent of the cross-correlation is independent of the cross-correlation coefficient between the two time series, and the MF-X-DFA and centered MF-X-DMA algorithms have comparable performance, both outperforming the forward and backward MF-X-DMA algorithms. We apply these algorithms to the return time series of two stock market indexes and to their volatilities. For the returns, the centered MF-X-DMA algorithm gives the best estimates of $h_{xy}(q)$, since its $h_{xy}(2)$ is closest to 0.5 as expected, and the MF-X-DFA algorithm has the second-best performance. For the volatilities, the forward and backward MF-X-DMA algorithms give similar results, while the centered MF-X-DMA and MF-X-DFA algorithms fail to extract the rational multifractal nature. 
Date:  2011–03 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1103.2577&r=ets 
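The common core of the DFA/DMA family is a fluctuation function computed from detrended profiles in boxes of varying size, whose power-law scaling in the box size gives the exponents $h_{xy}(q)$. The following is a compact sketch of the q = 2, linear-detrending (DFA-style) cross-correlation version; the moving-average detrending of MF-X-DMA and the full multifractal q-spectrum are omitted.

```python
import numpy as np

def dcca_fluctuation(x, y, s):
    """Level-2 detrended cross-correlation fluctuation F(s): the
    profiles (cumulative sums) of the two demeaned series are split
    into boxes of length s, a linear trend is removed in each box,
    and the box-wise covariance of the residuals is averaged.
    For x == y this reduces to ordinary DFA; for distinct series the
    mean covariance can be negative, in which case F(s) is nan."""
    X = np.cumsum(x - np.mean(x))
    Y = np.cumsum(y - np.mean(y))
    t = np.arange(s)
    covs = []
    for start in range(0, len(X) - s + 1, s):
        xb, yb = X[start:start + s], Y[start:start + s]
        rx = xb - np.polyval(np.polyfit(t, xb, 1), t)
        ry = yb - np.polyval(np.polyfit(t, yb, 1), t)
        covs.append(np.mean(rx * ry))
    return np.sqrt(np.mean(covs))
```

Fitting log F(s) against log s over a range of box sizes yields the scaling exponent $h_{xy}(2)$, which equals about 0.5 for uncorrelated increments.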
By:  Di Iorio, Francesca; Triacca, Umberto 
Abstract:  A new noncausality test based on the notion of distance between ARMA models is proposed in this paper. The advantage of this test is that it can be used in possibly integrated and cointegrated systems, without pretesting for unit roots and cointegration. The Monte Carlo experiments indicate that the proposed method performs reasonably well in finite samples. The empirical relevance of the test is illustrated via two applications. 
Keywords:  AR metric; Bootstrap test; Granger noncausality; VAR 
JEL:  C12 C15 C22 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:29637&r=ets 
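The distance the test is built on can be sketched directly: fit a truncated AR representation to each series and take the Euclidean distance between the coefficient vectors, in the spirit of the AR metric listed in the keywords. The lag order and series below are illustrative assumptions, and the bootstrap step that turns the distance into a noncausality test is omitted.

```python
import numpy as np

def ar_coefficients(x, p):
    """OLS estimates of AR(p) coefficients of a demeaned series,
    used as a truncated AR(infinity) representation."""
    x = x - x.mean()
    n = len(x)
    # lag matrix: column j holds x[t-1-j] for rows t = p..n-1
    X = np.column_stack([x[p - 1 - j:n - 1 - j] for j in range(p)])
    return np.linalg.lstsq(X, x[p:], rcond=None)[0]

def ar_distance(x, y, p=5):
    """AR-metric-style distance: Euclidean distance between the
    truncated AR representations of the two series."""
    return np.linalg.norm(ar_coefficients(x, p) - ar_coefficients(y, p))
```

Two series generated by the same process should sit close together under this metric, while series with different dynamics should be far apart, which is what a bootstrap null distribution of the distance can then formalize.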