
on Econometric Time Series 
By:  Javier Hualde (Departamento de Economía, UPNA)
Abstract:  A necessary condition for two time series to be nontrivially cointegrated is the equality of their respective integration orders. Thus, it is standard practice to test for order homogeneity prior to testing for cointegration. Tests for the equality of integration orders are particular cases of more general tests of linear restrictions among memory parameters of different time series, for which asymptotic theory has been developed in parametric and semiparametric settings. However, most tests have been developed in stationary and invertible settings, and, more importantly, many of them are invalid when the observables are cointegrated, because they usually involve inversion of an asymptotically singular matrix. We propose a general testing procedure which does not suffer from this serious drawback and which, in addition, is very simple to compute, covers the stationary/nonstationary and invertible/noninvertible ranges, and, as we show in a Monte Carlo experiment, works well in finite samples.
Keywords:  integration orders, fractional differencing, fractional cointegration. 
JEL:  C32 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:nav:ecupna:1206&r=ets 
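As a rough illustration of the memory-parameter estimation that underlies such equality-of-integration-orders tests, the sketch below simulates a fractionally integrated series and recovers its integration order d with a standard log-periodogram (GPH) regression. This is a generic textbook estimator, not the testing procedure proposed in the paper, and the sample size and bandwidth choices are arbitrary.

```python
import numpy as np

def frac_integrate(eps, d):
    """Fractionally integrate white noise: apply (1-L)^{-d} via its binomial weights."""
    n = len(eps)
    w = np.ones(n)
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 + d) / k   # psi_k = psi_{k-1} * (k-1+d) / k
    return np.convolve(eps, w)[:n]

def gph_estimate(x, m):
    """Log-periodogram (GPH) estimate of the memory parameter d,
    regressing log I(lambda_j) on -2*log(2*sin(lambda_j/2)) for j = 1..m."""
    n = len(x)
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    I = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2 * np.pi * n)  # periodogram ordinates
    xreg = -2 * np.log(2 * np.sin(freqs / 2))
    return np.polyfit(xreg, np.log(I), 1)[0]  # slope estimates d

rng = np.random.default_rng(0)
n = 4096
x = frac_integrate(rng.standard_normal(n), 0.3)  # true d = 0.3
d_hat = gph_estimate(x, m=int(n ** 0.6))         # bandwidth m = n^0.6 (arbitrary)
```

Estimating d for each observable series this way, and comparing the estimates, is the informal version of the order-homogeneity check the abstract refers to; the paper's contribution is a test statistic that stays valid when the series are cointegrated.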
By:  Wojciech Charemza; Yuriy Kharin; Vladislav Maevskiy 
Abstract:  The paper aims at assessing the forecast risk and the maximum admissible forecast horizon for the nonsystematic component of inflation modeled autoregressively, where a distortion is caused by a simple first-order bilinear process. The concept of the guaranteed upper risk of forecasting and the d-admissible distortion level is defined here. In order to make this concept operational we propose a method of evaluation of the p-maximum admissible forecast risk, on the basis of the maximum likelihood estimates of the bilinear coefficient. It has been found that for the majority of developed countries (in terms of average GDP per capita) the maximum admissible forecast horizon is between 5 and 12 months, while for the poorer countries it is either shorter than 5 or longer than 12. There is also a negative correlation of the maximum admissible forecast horizon with the average GDP growth.
Keywords:  Forecasting; Inflation; Bilinear Processes 
JEL:  C22 C53 E31 E37 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:lec:leecon:12/22&r=ets 
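A minimal sketch of the mechanism the abstract describes: a first-order bilinear distortion X_t = a X_{t-1} + b X_{t-1} e_{t-1} + e_t inflates the risk of the naive AR(1) forecast a^h X_0 as the horizon h grows, which is what bounds the admissible horizon. The coefficients a and b below are hypothetical, and this Monte Carlo check is not the paper's analytical risk bound.

```python
import numpy as np

rng = np.random.default_rng(1)
a, b = 0.5, 0.3            # AR coefficient and bilinear distortion (hypothetical values)
n_rep, horizon = 20000, 12

def simulate(h, x0):
    """Simulate n_rep paths of X_t = a*X_{t-1} + b*X_{t-1}*e_{t-1} + e_t for h steps,
    conditioning on X_0 = x0 and e_0 = 0."""
    x = np.full(n_rep, x0)
    e_prev = np.zeros(n_rep)
    for _ in range(h):
        e = rng.standard_normal(n_rep)
        x = a * x + b * x * e_prev + e
        e_prev = e
    return x

x0 = 1.0
mse = []
for h in range(1, horizon + 1):
    x_h = simulate(h, x0)
    forecast = a ** h * x0          # AR(1) point forecast ignoring the bilinear term
    mse.append(np.mean((x_h - forecast) ** 2))
# mse[h-1] grows with h: the admissible horizon is where it crosses a tolerated level
```

The paper's contribution is to bound this risk analytically and to read off the maximum admissible horizon from ML estimates of b, rather than by simulation.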
By:  Blazej Mazur (Economic Institute, National Bank of Poland and Department of Econometrics and Operations Research, Cracow University of Economics); Mateusz Pipien (Economic Institute, National Bank of Poland and Department of Econometrics and Operations Research, Cracow University of Economics) 
Abstract:  We discuss the empirical importance of long-term cyclical effects in the volatility of financial returns. Following Čížek and Spokoiny (2009), Amado and Teräsvirta (2012) and others, we consider a general conditionally heteroscedastic process whose stationarity property is distorted by a deterministic function that governs the possible time variation of the unconditional variance. The function proposed in this paper can be interpreted as a finite Fourier approximation of an Almost Periodic (AP) function as defined by Corduneanu (1989). The resulting model has a particular form of a GARCH process with time-varying parameters, intensively discussed in the recent literature. In the empirical analyses we apply a generalisation of the Bayesian AR(1)-t-GARCH(1,1) model for daily returns of the S&P500, covering sixty years of the US post-war economy, including the recently observed global financial crisis. The results of a formal Bayesian model comparison clearly indicate the existence of significant long-term cyclical patterns in volatility, with a strongly supported periodic component corresponding to a 14-year cycle. This may be interpreted as empirical evidence in favour of a linkage between the business cycle in the US economy and long-term changes in the volatility of the basic stock market index.
Keywords:  Periodically correlated stochastic processes, GARCH models, Bayesian inference, volatility, unconditional variance 
JEL:  C58 C11 G10 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:nbp:nbpmis:124&r=ets 
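The model class the abstract describes can be sketched as a standard GARCH(1,1) core multiplicatively modulated by a deterministic finite Fourier sum g(t). The simulation below is an assumption-laden illustration of that structure, not the authors' specification: the frequencies, amplitudes, and GARCH parameters are all invented, and the Bayesian estimation step is omitted.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000

# deterministic almost-periodic scale: a finite Fourier sum (illustrative frequencies)
t = np.arange(n)
g = np.exp(0.5 * np.cos(2 * np.pi * t / 3500) + 0.3 * np.cos(2 * np.pi * t / 900))

# standard GARCH(1,1) core for the stationary component
omega, alpha, beta = 0.05, 0.08, 0.9
z = rng.standard_normal(n)
sigma2 = np.empty(n)
r = np.empty(n)
sigma2[0] = omega / (1 - alpha - beta)   # start at the unconditional variance
for i in range(n):
    # return: unconditional variance modulated by the deterministic function g(t)
    r[i] = np.sqrt(g[i] * sigma2[i]) * z[i]
    if i + 1 < n:
        # recursion driven by the standardized (g-free) shock, so g only rescales
        sigma2[i + 1] = omega + alpha * sigma2[i] * z[i] ** 2 + beta * sigma2[i]
```

Squared returns then co-move with g over the long cycles, which is the kind of low-frequency volatility pattern the paper's model comparison detects (with a 14-year component) in S&P500 data.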
By:  Bai, Jushan; Li, Kunpeng 
Abstract:  An approximate factor model of high dimension has two key features. First, the idiosyncratic errors are correlated and heteroskedastic over both the cross-section and time dimensions; the correlations and heteroskedasticities are of unknown forms. Second, the number of variables is comparable to or even greater than the sample size. Thus a large number of parameters exist under a high-dimensional approximate factor model. Most widely used approaches to estimation are principal-component based. This paper considers maximum-likelihood-based estimation of the model. Consistency, rate of convergence, and limiting distributions are obtained under various identification restrictions. Comparison with the principal component method is made. The likelihood-based estimators are more efficient than the principal-component-based ones. Monte Carlo simulations show the method is easy to implement, and an application to U.S. yield curves is considered.
Keywords:  Factor analysis; Approximate factor models; Maximum likelihood; Kalman smoother; Principal components; Inferential theory 
JEL:  C51 C33 
Date:  2012–01–10 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:42099&r=ets 
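For context, the principal-component benchmark that the paper compares against can be sketched in a few lines: with data X (T periods by N series), the estimated factors are, up to scale, the leading eigenvectors of XX'. The dimensions and noise level below are arbitrary, and the idiosyncratic errors are kept i.i.d. for simplicity, so this does not exercise the "approximate" (correlated, heteroskedastic) error structure that motivates the likelihood approach.

```python
import numpy as np

rng = np.random.default_rng(3)
T, N = 200, 100                       # sample size and cross-section dimension

f = rng.standard_normal(T)            # one latent factor
lam = rng.standard_normal(N)          # factor loadings
e = 0.5 * rng.standard_normal((T, N)) # idiosyncratic errors (i.i.d. here for simplicity)
X = np.outer(f, lam) + e              # T x N panel

# principal-component estimate: sqrt(T) times the top eigenvector of X X' / (T N)
vals, vecs = np.linalg.eigh(X @ X.T / (T * N))
f_hat = np.sqrt(T) * vecs[:, -1]      # eigenvector for the largest eigenvalue

# factors are identified only up to sign/scale, so compare by absolute correlation
corr = abs(np.corrcoef(f, f_hat)[0, 1])
```

The paper's point is that a (quasi-)maximum-likelihood estimator, e.g. implemented via the Kalman smoother, can weight the series by their error variances and thereby beat this unweighted eigendecomposition in efficiency.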
By:  Domenico Giannone; Michele Lenza; Giorgio E. Primiceri 
Abstract:  Vector autoregressions (VARs) are flexible time series models that can capture complex dynamic interrelationships among macroeconomic variables. However, their dense parameterization leads to unstable inference and inaccurate out-of-sample forecasts, particularly for models with many variables. A solution to this problem is to use informative priors, in order to shrink the richly parameterized unrestricted model towards a parsimonious naïve benchmark, and thus reduce estimation uncertainty. This paper studies the optimal choice of the informativeness of these priors, which we treat as additional parameters, in the spirit of hierarchical modeling. This approach is theoretically grounded, easy to implement, and greatly reduces the number and importance of subjective choices in the setting of the prior. Moreover, it performs very well both in terms of out-of-sample forecasting, as well as factor models do, and in the accuracy of estimated impulse response functions.
JEL:  C11 C32 C53 E37 E47 
Date:  2012–10 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:18467&r=ets 
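The hierarchical idea, treating the prior's tightness as a parameter and choosing it by its marginal likelihood, can be illustrated in a deliberately stripped-down conjugate setting: one regression equation, a Gaussian shrinkage prior b ~ N(0, τ²I) with known noise variance, and a grid search over τ². This is only a sketch of the principle; the paper works with full VAR systems and Minnesota-type priors, neither of which appears here.

```python
import numpy as np

rng = np.random.default_rng(4)
T, p = 120, 8
X = rng.standard_normal((T, p))
b_true = np.r_[0.8, np.zeros(p - 1)]          # sparse truth, so shrinkage should help
sigma = 1.0                                    # noise std, assumed known for simplicity
y = X @ b_true + sigma * rng.standard_normal(T)

def log_marglik(tau2):
    """log p(y | tau2): integrating b out gives y ~ N(0, tau2 * X X' + sigma^2 * I)."""
    S = tau2 * X @ X.T + sigma ** 2 * np.eye(T)
    _, logdet = np.linalg.slogdet(S)
    return -0.5 * (logdet + y @ np.linalg.solve(S, y) + T * np.log(2 * np.pi))

# choose the prior tightness by maximizing the marginal likelihood over a grid
grid = np.logspace(-3, 2, 50)
tau2_hat = grid[np.argmax([log_marglik(t2) for t2 in grid])]

# posterior mean under the selected tightness (the familiar ridge formula)
b_post = np.linalg.solve(X.T @ X + (sigma ** 2 / tau2_hat) * np.eye(p), X.T @ y)
```

Maximizing (or integrating over) the marginal likelihood of the hyperparameter is exactly the "informativeness as an additional parameter" step the abstract describes, here in its simplest empirical-Bayes form.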
By:  Mohamed Chikhi (Université de Ouargla and Université Montpellier I, Lameta); Anne Péguin-Feissolle (CNRS, Greqam); Michel Terraza (Université Montpellier I, Lameta)
Abstract:  This paper analyzes the cyclical behavior of the Dow Jones by testing for the existence of long memory through a new class of semiparametric ARFIMA models with HYGARCH errors (SEMIFARMA-HYGARCH); this class includes a nonparametric deterministic trend, a stochastic trend, short-range and long-range dependence, and long-memory heteroscedastic errors. We study the daily returns of the Dow Jones from 1896 to 2006. We estimate several models and find that the coefficients of the SEMIFARMA-HYGARCH model, including the long-memory coefficients in the equations for the mean and the conditional variance, are highly significant. The forecasting results show that informational shocks have permanent effects on volatility and that the SEMIFARMA-HYGARCH model outperforms several other models at long and/or short horizons. The predictions from this model are also better than those of the random walk model; accordingly, the weak-form efficiency assumption of financial markets seems violated for the Dow Jones returns studied over this long period.
Keywords:  SEMIFARMA model, HYGARCH model, nonparametric deterministic trend, kernel methodology, long memory.
JEL:  C14 C22 C58 G17 
Date:  2012–06 
URL:  http://d.repec.org/n?u=RePEc:aim:wpaimx:1214&r=ets 
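The long-memory component common to the ARFIMA/SEMIFARMA family rests on the fractional-differencing filter (1-L)^d, which expands into binomial weights. The sketch below implements that filter and checks the round trip (differencing then integrating recovers the series); it illustrates only this one building block of the model class, with an arbitrary d, not the paper's full SEMIFARMA-HYGARCH specification.

```python
import numpy as np

def frac_diff(x, d):
    """Apply (1-L)^d via its binomial expansion:
    pi_0 = 1, pi_k = pi_{k-1} * (k-1-d) / k, truncated at the sample length."""
    n = len(x)
    w = np.ones(n)
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return np.convolve(x, w)[:n]

rng = np.random.default_rng(5)
x = rng.standard_normal(500)
d = 0.35                                      # illustrative memory parameter
# (1-L)^{-d} (1-L)^{d} x = x, and the identity survives truncation term by term
roundtrip = frac_diff(frac_diff(x, d), -d)
err = np.max(np.abs(roundtrip - x))
```

With 0 < d < 0.5 the weights decay hyperbolically rather than geometrically, which is precisely the slow decay of autocorrelations that the paper's significance tests on the long-memory coefficients detect in Dow Jones returns.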