nep-ets New Economics Papers
on Econometric Time Series
Issue of 2005‒12‒09
sixteen papers chosen by
Yong Yin
SUNY at Buffalo

  1. Total Factor Productivity: An Unobserved Components Approach By Raul Crespo
  2. Non-Linearities in the Relation between the Exchange Rate and its Fundamentals By Carlo Altavilla; Paul De Grauwe
  3. Unit Roots and Cointegration in Panels By Joerg Breitung; M. Hashem Pesaran
  4. Forecast Combination and Model Averaging Using Predictive Measures By Eklund, Jana; Karlsson, Sune
  5. Where Are We Now? Real-Time Estimates of the Macro Economy By Evans, Martin D.D.
  6. Data Revisions Are Not Well-Behaved By Aruoba, Boragan
  7. Pooling-based data interpolation and backdating By Marcellino, Massimiliano
  8. How Useful is Bagging in Forecasting Economic Time Series? A Case Study of US CPI Inflation By Inoue, Atsushi; Kilian, Lutz
  9. Reconciling the Return Predictability Evidence: In-Sample Forecasts, Out-of-Sample Forecasts, and Parameter Instability By Lettau, Martin; van Nieuwerburgh, Stijn
  10. Forecast Combinations By Timmermann, Allan G
  11. Testing Linearity in Cointegrating Relations with an Application to Purchasing Power Parity By Seung Hyun Hong; Peter C. B. Phillips
  12. Econometric Models of Asymmetric Price Transmission By Matteo Manera; Giliola Frey
  13. Oil Prices, Inflation and Interest Rates in a Structural Cointegrated VAR Model for the G-7 Countries By Matteo Manera; Alessandro Cologni
  14. A Test of Cointegration Rank Based on Principal Component Analysis By Hiroaki Chigira
  15. Mind your Ps and Qs! Improving ARMA forecasts with RBC priors By Kirdan Lees; Troy Matheson
  16. Optimal Time Interval Selection in Long-Run Correlation Estimation By Pedro H. Albuquerque

  1. By: Raul Crespo
    Abstract: This work examines the presence of unobserved components in the time series of Total Factor Productivity, an idea central to modern Macroeconomics. The main approaches in both the study of economic growth and the study of business cycles rely on certain properties of the different components of the time series of Total Factor Productivity. In the study of economic growth, the Neoclassical growth model explains growth in terms of technical progress as measured by the secular component of Total Factor Productivity, while in the study of business cycles, the Real Business Cycle approach explains short-run fluctuations in the economy as determined by temporary movements in the production function, which are reflected by the cyclical component of the same variable. The econometric methodology employed in the estimation of these different components is the structural time series approach developed by Harvey (1989), Harvey and Shephard (1993), and others. An application to the time series of Total Factor Productivity for the 1948-2002 U.S. private non-farm business sector is presented. Technical progress in this economy is characterised by strong growth in the period immediately after World War II, which peaks at the beginning of the 1960s and then declines until the early 1980s, when it shows a modest rebound. The cyclical component of the series, on the other hand, seems to be better described by two cycles with periodicities of six and twelve years, respectively.
    Keywords: Productivity, Business Cycles, Structural Time Series Models, Unobserved Components.
    JEL: E23 E32 C22
    Date: 2005–12
    URL: http://d.repec.org/n?u=RePEc:bri:uobdis:05/579&r=ets
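    Sketch: a minimal illustration of the structural time series idea described above, fitting a local linear trend plus a stochastic, damped cycle with statsmodels' UnobservedComponents. The data are simulated and the specification is an assumption for illustration, not the paper's exact model.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 55                                   # annual span, e.g. 1948-2002
      secular = np.cumsum(0.02 + 0.005 * rng.standard_normal(n))
      cyclical = 0.05 * np.sin(2 * np.pi * np.arange(n) / 8.0)
      y = pd.Series(secular + cyclical + 0.01 * rng.standard_normal(n))

      # Secular component as a local linear trend, cyclical component as a
      # stochastic damped cycle, estimated jointly via the Kalman filter.
      mod = sm.tsa.UnobservedComponents(y, level="local linear trend",
                                        cycle=True, stochastic_cycle=True,
                                        damped_cycle=True)
      res = mod.fit(disp=False)
      print(res.summary())
      print("implied cycle period:", 2 * np.pi / res.params["frequency.cycle"])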
  2. By: Carlo Altavilla; Paul De Grauwe
    Abstract: This paper investigates the relationship between the euro-dollar exchange rate and its underlying fundamentals. First, we develop a simple theoretical model in which chartists and fundamentalists interact. This model predicts the existence of different regimes, and thus non-linearities in the link between the exchange rate and its fundamentals. Second, we account for non-linearity in the exchange rate process by adopting a Markov-switching vector error correction model (MSVECM). Finally, the paper investigates the out-of-sample forecast performance of three competing models of exchange rate determination. The results suggest the presence of nonlinear mean reversion in the nominal exchange rate process. The implications are that different sets of macroeconomic fundamentals act as driving forces of the exchange rates during different time periods. More interestingly, the nonlinear specification significantly improves the forecast accuracy during periods when the deviation between exchange rate and fundamentals is large. Conversely, when the exchange rate is close to its equilibrium value it tends to be better approximated by a naïve random walk.
    Keywords: non-linearity, Markov-switching model, fundamentals
    JEL: C32 F31
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:ces:ceswps:_1561&r=ets
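    Sketch: a two-regime Markov-switching regression as a simplified stand-in for the paper's MSVECM, in which the speed of reversion toward fundamentals differs across regimes. Data are simulated and the specification is illustrative only.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n, dev, state = 400, np.zeros(400), 0
      for t in range(1, n):
          if rng.random() < 0.05:                 # occasional regime switch
              state = 1 - state
          phi = 0.98 if state == 0 else 0.80      # weak vs strong mean reversion
          dev[t] = phi * dev[t - 1] + rng.standard_normal()

      # Regress the change in the deviation on its lagged level, letting the
      # error-correction coefficient switch between two Markov regimes.
      mod = sm.tsa.MarkovRegression(pd.Series(np.diff(dev)), k_regimes=2,
                                    exog=pd.Series(dev[:-1]))
      print(mod.fit().summary())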
  3. By: Joerg Breitung; M. Hashem Pesaran
    Abstract: This paper provides a review of the literature on unit roots and cointegration in panels where the time dimension (T) and the cross section dimension (N) are relatively large. It distinguishes between the first generation tests, developed under the assumption of cross section independence, and the second generation tests that allow, in a variety of forms and degrees, for the dependence that might prevail across the different units in the panel. In the analysis of cointegration, the hypothesis testing and estimation problems are further complicated by the possibility of cross section cointegration, which could arise if the unit roots in the different cross section units are due to common random walk components.
    Keywords: panel unit roots, panel cointegration, cross section dependence, common effects
    JEL: C12 C15 C22 C23
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:ces:ceswps:_1565&r=ets
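    Sketch: a first-generation panel unit root test of the Fisher/Maddala-Wu type, which assumes cross section independence: combine unit-by-unit ADF p-values, so that -2*sum(log p_i) is chi-squared with 2N degrees of freedom under the joint unit-root null. Simulated data.

      import numpy as np
      from scipy import stats
      from statsmodels.tsa.stattools import adfuller

      rng = np.random.default_rng(2)
      N, T = 10, 200
      panel = np.cumsum(rng.standard_normal((N, T)), axis=1)   # N random walks

      pvals = [adfuller(panel[i])[1] for i in range(N)]        # unit-by-unit ADF
      fisher = -2.0 * np.sum(np.log(pvals))
      print("Fisher stat %.2f, p-value %.3f" % (fisher, stats.chi2.sf(fisher, 2 * N)))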
  4. By: Eklund, Jana; Karlsson, Sune
    Abstract: We extend the standard approach to Bayesian forecast combination by forming the weights for the model averaged forecast from the predictive likelihood rather than the standard marginal likelihood. The use of predictive measures of fit offers greater protection against in-sample overfitting and improves forecast performance. For the predictive likelihood we show analytically that the forecast weights have good large and small sample properties. This is confirmed in a simulation study and an application to forecasts of the Swedish inflation rate where forecast combination using the predictive likelihood outperforms standard Bayesian model averaging using the marginal likelihood.
    Keywords: Bayesian model averaging; inflation rate; partial Bayes factor; predictive likelihood; training sample
    JEL: C11 C51 C52 C53
    Date: 2005–10
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:5268&r=ets
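    Sketch: combination weights formed from a predictive likelihood rather than the marginal likelihood: each model is fitted on a training sample and scored by the Gaussian predictive density of the hold-out observations. A simplification of the paper's partial-Bayes-factor approach, on simulated data.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      n, m = 200, 120                       # full sample and training sample
      x1, x2 = rng.standard_normal((2, n))
      y = 0.8 * x1 + 0.5 * rng.standard_normal(n)    # x2 is irrelevant

      def predictive_loglik(X):
          # Fit on the first m observations, score the remaining hold-out.
          beta, *_ = np.linalg.lstsq(X[:m], y[:m], rcond=None)
          resid = y[:m] - X[:m] @ beta
          sigma = np.sqrt(resid @ resid / (m - X.shape[1]))
          return stats.norm.logpdf(y[m:], loc=X[m:] @ beta, scale=sigma).sum()

      models = {"x1 only": np.column_stack([np.ones(n), x1]),
                "x1 and x2": np.column_stack([np.ones(n), x1, x2])}
      ll = {k: predictive_loglik(X) for k, X in models.items()}
      mx = max(ll.values())
      w = {k: np.exp(v - mx) for k, v in ll.items()}
      total = sum(w.values())
      print({k: round(v / total, 3) for k, v in w.items()})    # forecast weights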
  5. By: Evans, Martin D.D.
    Abstract: This paper describes a method for calculating daily real-time estimates of the current state of the US economy. The estimates are computed from data on scheduled US macroeconomic announcements using an econometric model that allows for variable reporting lags, temporal aggregation, and other complications in the data. The model can be applied to find real-time estimates of GDP, inflation, unemployment or any other macroeconomic variable of interest. In this paper I focus on the problem of estimating the current level of and growth rate in GDP. I construct daily real-time estimates of GDP that incorporate public information known on the day in question. The real-time estimates produced by the model are uniquely suited to studying how perceived developments in the macro economy are linked to asset prices over a wide range of frequencies. The estimates also provide, for the first time, daily time series that can be used in practical policy decisions.
    Keywords: forecasting GDP; Kalman filtering; real-time data
    JEL: C32 E37
    Date: 2005–10
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:5270&r=ets
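    Sketch: the real-time filtering idea in miniature: a latent local-level state updated daily by a Kalman filter, with observations arriving only on scheduled release days. This shows only the filtering mechanics, not the paper's model of reporting lags and temporal aggregation; all parameter values are assumed.

      import numpy as np

      rng = np.random.default_rng(4)
      T = 120
      state = np.cumsum(0.1 * rng.standard_normal(T))   # latent daily "GDP growth"
      obs = state + 0.5 * rng.standard_normal(T)
      obs[np.arange(T) % 30 != 0] = np.nan              # released once a "month"

      q, r = 0.01, 0.25      # state and measurement noise variances (assumed)
      x, p = 0.0, 1.0        # filtered mean and variance
      for t in range(T):
          p += q                               # daily prediction step
          if not np.isnan(obs[t]):             # update only on release days
              k = p / (p + r)
              x += k * (obs[t] - x)
              p *= 1 - k
      print("real-time estimate %.3f vs truth %.3f" % (x, state[-1]))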
  6. By: Aruoba, Boragan
    Abstract: We document the empirical properties of revisions to major macroeconomic variables in the United States. Our findings suggest that they do not satisfy simple desirable statistical properties. In particular, we find that these revisions do not have a zero mean, which indicates that the initial announcements by statistical agencies are biased. We also find that the revisions are quite large compared to the original variables and they are predictable using the information set at the time of the initial announcement, which means that the initial announcements of statistical agencies are not rational forecasts. We also provide evidence that professional forecasters ignore this predictability.
    Keywords: forecasting; news and noise; NIPA variables; real-time data
    JEL: C22 C53 C82
    Date: 2005–10
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:5271&r=ets
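    Sketch: the two checks described above in miniature: whether revisions have zero mean and whether they are predictable from the initial announcement. The simulated first release is deliberately biased and noisy; real NIPA vintages would take its place.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(5)
      n = 160
      final = rng.standard_normal(n)                        # "true" value
      initial = final - 0.2 + 0.5 * rng.standard_normal(n)  # biased, noisy release
      revision = final - initial

      # A nonzero constant indicates biased announcements; a significant slope
      # on the initial announcement indicates predictable ("noise") revisions.
      res = sm.OLS(revision, sm.add_constant(initial)).fit()
      print(res.params)
      print(res.pvalues)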
  7. By: Marcellino, Massimiliano
    Abstract: Pooling forecasts obtained from different procedures typically reduces the mean square forecast error and more generally improves the quality of the forecast. In this paper we evaluate whether pooling interpolated or backdated time series obtained from different procedures can also improve the quality of the generated data. Both simulation results and empirical analyses with macroeconomic time series indicate that pooling also plays a positive and important role in this context.
    Keywords: factor Model; interpolation; Kalman Filter; pooling; spline
    JEL: C32 C43 C82
    Date: 2005–10
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:5295&r=ets
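    Sketch: pooling two interpolators in the spirit of the paper: interpolate a quarterly series to a monthly grid with two different procedures and average the results. The paper pools model-based interpolators; linear and cubic-spline interpolation are used here purely for illustration, on simulated data.

      import numpy as np
      from scipy.interpolate import CubicSpline

      rng = np.random.default_rng(6)
      tq = np.arange(0, 48, 3)                    # quarterly observation dates
      yq = np.cumsum(rng.standard_normal(tq.size))
      tm = np.arange(46)                          # monthly target grid

      linear = np.interp(tm, tq, yq)
      spline = CubicSpline(tq, yq)(tm)
      pooled = 0.5 * (linear + spline)            # equal-weight pool
      print(pooled[:6])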
  8. By: Inoue, Atsushi; Kilian, Lutz
    Abstract: This paper explores the usefulness of bagging methods in forecasting economic time series from linear multiple regression models. We focus on the widely studied question of whether the inclusion of indicators of real economic activity lowers the prediction mean squared error of forecast models of US consumer price inflation. We study bagging methods for linear regression models with correlated regressors and for factor models. We compare the accuracy of simulated out-of-sample forecasts of inflation based on these bagging methods to that of alternative forecast methods, including factor model forecasts, shrinkage estimator forecasts, combination forecasts and Bayesian model averaging. We find that bagging methods in this application are almost as accurate as, or more accurate than, the best alternatives. Our empirical analysis demonstrates that large reductions in the prediction mean squared error are possible relative to existing methods, a result that is also suggested by the asymptotic analysis of some stylized linear multiple regression examples.
    Keywords: Bayesian model averaging; bootstrap aggregation; factor models; forecast combination; forecast model selection; pre-testing; shrinkage estimation
    JEL: C22 C52 C53
    Date: 2005–10
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:5304&r=ets
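    Sketch: bagging a pre-test forecast, the device studied in the paper: on each bootstrap sample, estimate the regression, drop predictors with |t| below 1.96, re-estimate, forecast, and average across replications. Simulated data stand in for the inflation application.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(7)
      n, k = 120, 6
      X = rng.standard_normal((n, k))
      y = 0.5 * X[:, 0] + 0.2 * X[:, 1] + rng.standard_normal(n)
      x_new = rng.standard_normal(k)            # predictor values for the forecast

      forecasts = []
      for _ in range(200):                      # bootstrap replications
          idx = rng.integers(0, n, n)
          res = sm.OLS(y[idx], sm.add_constant(X[idx])).fit()
          keep = np.abs(res.tvalues[1:]) > 1.96         # the pre-test step
          Xk = np.column_stack([np.ones(n), X[idx][:, keep]])
          beta, *_ = np.linalg.lstsq(Xk, y[idx], rcond=None)
          forecasts.append(np.r_[1.0, x_new[keep]] @ beta)
      print("bagged pre-test forecast: %.3f" % np.mean(forecasts))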
  9. By: Lettau, Martin; van Nieuwerburgh, Stijn
    Abstract: Evidence of stock return predictability by financial ratios is still controversial as documented by inconsistent results for in-sample and out-of-sample regressions as well as substantial parameter instability. This paper shows that these seemingly incompatible results can be reconciled if the assumption of a fixed steady state mean of the economy is relaxed. We find strong empirical evidence in support of shifts in the steady state and propose simple methods to adjust financial ratios for such shifts. The forecasting relationship between adjusted price ratios and future returns is statistically significant, stable over time and present in out-of-sample tests. We also show that shifts in the steady state are responsible for parameter instability and poor out-of-sample performance of unadjusted price ratios that is found in the data. Our conclusions hold for a variety of financial ratios and are robust to changes in the econometric technique used to estimate shifts in the steady state.
    Keywords: price ratios; dividend price ratio; out-of-sample test; predictability; Stock returns
    JEL: C12 C22 G1
    Date: 2005–11
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:5355&r=ets
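    Sketch: adjusting a price ratio for a shift in its steady-state mean: locate a single mean break by least squares, demean each regime, and use the adjusted ratio in the predictive regression. Simulated data; the paper's break-estimation methods are more elaborate.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(8)
      n, brk = 300, 180
      mu = np.where(np.arange(n) < brk, -3.0, -3.8)     # shifting steady state
      dp = mu + 0.3 * rng.standard_normal(n)            # log dividend-price ratio
      ret_next = 0.2 * (dp - mu)[:-1] + 0.5 * rng.standard_normal(n - 1)

      # Choose the break date that minimizes the two-regime sum of squares.
      sse = [(((dp[:b] - dp[:b].mean()) ** 2).sum()
              + ((dp[b:] - dp[b:].mean()) ** 2).sum(), b)
             for b in range(30, n - 30)]
      b_hat = min(sse)[1]
      adj = np.r_[dp[:b_hat] - dp[:b_hat].mean(), dp[b_hat:] - dp[b_hat:].mean()]
      res = sm.OLS(ret_next, sm.add_constant(adj[:-1])).fit()
      print("break at %d; predictive slope %.3f" % (b_hat, res.params[1]))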
  10. By: Timmermann, Allan G
    Abstract: Forecast combinations have frequently been found in empirical studies to produce better forecasts on average than methods based on the ex-ante best individual forecasting model. Moreover, simple combinations that ignore correlations between forecast errors often dominate more refined combination schemes aimed at estimating the theoretically optimal combination weights. In this paper we analyse theoretically the factors that determine the advantages from combining forecasts (for example, the degree of correlation between forecast errors and the relative size of the individual models’ forecast error variances). Although the reasons for the success of simple combination schemes are poorly understood, we discuss several possibilities related to model misspecification, instability (non-stationarities) and estimation error in situations where the number of models is large relative to the available sample size. We discuss the role of combinations under asymmetric loss and consider combinations of point, interval and probability forecasts.
    Keywords: diversification gains; forecast combinations; model misspecification; pooling and trimming; shrinkage methods
    JEL: C22 C53
    Date: 2005–11
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:5361&r=ets
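    Sketch: the classical combination result underlying the analysis: with forecast error covariance matrix Sigma, the variance-minimizing weights are w = Sigma^{-1} 1 / (1' Sigma^{-1} 1), and equal weights are optimal only in the symmetric case. The covariance entries below are assumed values for illustration.

      import numpy as np

      sigma = np.array([[1.0, 0.6],        # error variances and covariance of
                        [0.6, 1.5]])       # two competing forecasts (assumed)
      ones = np.ones(2)
      w = np.linalg.solve(sigma, ones)
      w /= w.sum()                         # optimal combination weights
      eq = np.full(2, 0.5)                 # naive equal weights
      print("optimal", w, "variance %.3f" % (w @ sigma @ w))
      print("equal-weight variance %.3f" % (eq @ sigma @ eq))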
  11. By: Seung Hyun Hong (Dept. of Economics, Concordia University); Peter C. B. Phillips (Cowles Foundation, Yale University; University of Auckland & University of York)
    Abstract: This paper develops a linearity test that can be applied to cointegrating relations. We consider the widely used RESET specification test and show that when this test is applied to nonstationary time series its asymptotic distribution involves a mixture of noncentral chi^2 distributions, which leads to severe size distortions in conventional testing based on the central chi^2. Nonstationarity is shown to introduce two bias terms in the limit distribution, which are the source of the size distortion in testing. Appropriate corrections for this asymptotic bias lead to a modified version of the RESET test which has a central chi^2 limit distribution under linearity. The modified test has power not only against nonlinear cointegration but also against the absence of cointegration. Simulation results reveal that the modified test has good size in finite samples and reasonable power against many nonlinear models as well as models with no cointegration, confirming the analytic results. In an empirical illustration, the linear purchasing power parity (PPP) specification is tested using monthly US, Japanese, and Canadian data from the post-Bretton Woods period. While commonly used ADF and PP cointegration tests give mixed results on the presence of linear cointegration in the series, the modified test rejects the null of linear PPP cointegration.
    Keywords: Nonlinear cointegration, Specification test, RESET test, Noncentral chi^2 distribution
    JEL: C12 C22
    Date: 2005–12
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1541&r=ets
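    Sketch: the standard RESET test that the paper modifies: regress y on X, then test whether powers of the fitted values enter significantly. On stationary data the central chi^2 (or F) limit applies; the paper's contribution is the bias correction needed when the regression is a cointegrating one. Simulated data.

      import numpy as np
      import statsmodels.api as sm
      from statsmodels.stats.diagnostic import linear_reset

      rng = np.random.default_rng(9)
      x = rng.standard_normal(200)
      y = 1.0 + 2.0 * x + rng.standard_normal(200)   # truly linear relation
      res = sm.OLS(y, sm.add_constant(x)).fit()
      print(linear_reset(res, power=3, use_f=True))  # should not reject linearity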
  12. By: Matteo Manera (University of Milan-Bicocca and Fondazione Eni Enrico Mattei); Giliola Frey (Fondazione Eni Enrico Mattei)
    Abstract: In this paper we review the existing empirical literature on price asymmetries in commodities, providing a way to classify and compare different studies which are highly heterogeneous in terms of econometric models, type of asymmetries and empirical findings. Relative to the previous literature, this paper is novel in several respects. First, it presents a detailed and updated survey of the existing empirical contributions on the existence of price asymmetries in the transmission mechanism linking input prices to output prices. Second, this paper presents an extension of the traditional distinction between long-run and short-run asymmetries to new categories of asymmetries, such as contemporaneous impact, distributed lag effect, cumulated impact, reaction time, equilibrium and momentum equilibrium adjustment path, regime effect, and regime equilibrium adjustment path. Third, each empirical study is critically discussed in the light of this new classification of asymmetries. Fourth, this paper evaluates the relative merits of the most popular econometric models for price asymmetries, namely autoregressive distributed lags, partial adjustments, error correction models, regime switching and vector autoregressive models.
    Keywords: Price asymmetries, Cointegration, Partial adjustment, Threshold regime switching
    JEL: C22 D40 Q40
    Date: 2005–09
    URL: http://d.repec.org/n?u=RePEc:fem:femwpa:2005.100&r=ets
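    Sketch: the simplest asymmetric price transmission regression of the kind surveyed above: input-price changes are split into positive and negative parts, each with its own pass-through coefficient, and symmetry is tested as equality of the two. Simulated data.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(10)
      n = 300
      d_in = rng.standard_normal(n)                  # input price changes
      d_out = (0.8 * np.maximum(d_in, 0) + 0.3 * np.minimum(d_in, 0)
               + 0.2 * rng.standard_normal(n))       # rises pass through faster

      X = sm.add_constant(np.column_stack([np.maximum(d_in, 0),
                                           np.minimum(d_in, 0)]))
      res = sm.OLS(d_out, X).fit()
      print(res.params)                # pass-through of rises vs falls
      print(res.t_test("x1 = x2"))     # H0: symmetric transmission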
  13. By: Matteo Manera (University of Milan-Bicocca and Fondazione Eni Enrico Mattei); Alessandro Cologni (Fondazione Eni Enrico Mattei)
    Abstract: Sharp increases in the price of oil are generally seen as a major contributor to business cycle asymmetries. Moreover, the very recent highs registered in the world oil market are causing concern about possible slowdowns in the economic performance of the most developed countries. While several authors have considered the direct channels of transmission of energy price increases, other authors have argued that the economic downturns arose from the monetary policy response to the inflation presumably caused by oil price increases. In this paper a structural cointegrated VAR model has been considered for the G-7 countries in order to study the direct effects of oil price shocks on output and prices and the reaction of monetary variables to external shocks. Empirical analysis shows that, for most of the countries considered, there seems to be an impact of unexpected oil price shocks on interest rates, suggesting a contractionary monetary policy response directed to fight inflation. In turn, increases in interest rates are transmitted to the real economy by reducing output growth and the inflation rate.
    Keywords: Oil price shocks, Monetary policy response, Structural VAR models
    JEL: E31 E32 E52 Q41
    Date: 2005–09
    URL: http://d.repec.org/n?u=RePEc:fem:femwpa:2005.101&r=ets
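    Sketch: a reduced-form cointegrated VAR (VECM) of the kind underlying the paper's structural analysis; the structural identification step is omitted. A simulated cointegrated pair stands in for the G-7 macro series.

      import numpy as np
      from statsmodels.tsa.vector_ar.vecm import VECM

      rng = np.random.default_rng(11)
      n = 300
      common = np.cumsum(rng.standard_normal(n))     # shared stochastic trend
      y1 = common + rng.standard_normal(n)
      y2 = 0.5 * common + rng.standard_normal(n)

      res = VECM(np.column_stack([y1, y2]), k_ar_diff=2, coint_rank=1).fit()
      print(res.alpha)    # adjustment coefficients
      print(res.beta)     # cointegrating vector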
  14. By: Hiroaki Chigira
    Abstract: This paper considers a test of cointegration rank. The test is based on the fact that in an m-variate system the (m-r)-th principal component is I(1) under the null of cointegration rank r but I(0) under the alternative of rank r+1. Exploiting this fact, we construct a cointegration rank test that is less restrictive than Johansen's tests, easy to calculate, and independent of the dimension of the process. Monte Carlo simulations indicate that the proposed test outperforms Johansen's tests, even in the case of a model that satisfies the assumptions required for Johansen's tests and when the sample size is small.
    Keywords: Cointegration test, Unit roots
    JEL: C12 C22 C32
    Date: 2005–11
    URL: http://d.repec.org/n?u=RePEc:hst:hstdps:d05-126&r=ets
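    Sketch: the mechanism behind the test: with cointegration rank r in an m-variate system, the r smallest-variance principal components behave as I(0) and the remaining m-r as I(1), so unit root tests on the ordered components reveal the rank. A schematic version with m=3 and one common trend (hence r=2); the paper's procedure differs in detail.

      import numpy as np
      from statsmodels.tsa.stattools import adfuller

      rng = np.random.default_rng(12)
      n = 400
      trend = np.cumsum(rng.standard_normal(n))          # single common trend
      Y = np.column_stack([trend + rng.standard_normal(n),
                           trend + rng.standard_normal(n),
                           2 * trend + rng.standard_normal(n)])

      Yc = Y - Y.mean(axis=0)
      _, _, vt = np.linalg.svd(Yc, full_matrices=False)
      pcs = Yc @ vt.T                                    # ordered by variance
      for j in range(3):
          print("PC%d: ADF p-value %.3f" % (j + 1, adfuller(pcs[:, j])[1]))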
  15. By: Kirdan Lees; Troy Matheson (Reserve Bank of New Zealand)
    Abstract: We utilise prior information from a simple RBC model to improve ARMA forecasts of post-war US GDP. We develop three alternative ARMA forecasting processes that use varying degrees of information from the Campbell (1994) flexible labour model. Directly calibrating the model produces poor forecasting performance, whereas a model that uses a Bayesian framework to take the model to the data yields forecasting performance comparable to a purely statistical ARMA process. A final model that uses theory only to restrict the order of the ARMA process (the ps and qs), but that estimates the ARMA parameters using maximum likelihood, yields improved forecasting performance.
    JEL: C11 C22 E37
    Date: 2005–10
    URL: http://d.repec.org/n?u=RePEc:nzb:nzbdps:2005/02&r=ets
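    Sketch: the third strategy described above in miniature: theory pins down the ARMA order (the ps and qs) while the parameters are estimated by maximum likelihood. ARMA(1,1) on simulated data is assumed here purely for illustration, not as the order implied by the Campbell model.

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA
      from statsmodels.tsa.arima_process import arma_generate_sample

      rng = np.random.default_rng(13)
      y = arma_generate_sample(ar=[1, -0.7], ma=[1, 0.4], nsample=300,
                               distrvs=rng.standard_normal)

      res = ARIMA(y, order=(1, 0, 1)).fit()   # order restricted, parameters by ML
      print(res.params)
      print(res.forecast(steps=4))            # out-of-sample forecasts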
  16. By: Pedro H. Albuquerque (Texas A&M International University)
    Abstract: This paper presents an asymptotically optimal time interval selection criterion for the long-run correlation block estimator (Bartlett kernel estimator) based on the Newey-West approach. An alignment criterion that enhances finite-sample performance is also proposed. The procedure offers an optimal yet unobtrusive alternative to the common practice in finance and economics of arbitrarily choosing time intervals or lags in correlation studies. A Monte Carlo experiment using parameters derived from Dow Jones returns data confirms that the procedure is MSE-superior to typical alternatives such as aggregation over arbitrary time intervals, VAR estimation, and Newey-West automatic lag selection.
    Keywords: Long-Run Correlation, Bartlett, Lag Selection, Time Interval, Alignment, Newey-West
    JEL: C14
    Date: 2005–11–23
    URL: http://d.repec.org/n?u=RePEc:wpa:wuwpem:0511017&r=ets
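    Sketch: the Bartlett kernel (Newey-West) long-run correlation estimator whose bandwidth choice the paper optimizes: cross-covariances are weighted by Bartlett weights out to lag L. The rule-of-thumb bandwidth below is the common 4*(n/100)^(2/9), not the paper's criterion; data are simulated.

      import numpy as np

      def long_run_corr(x, y, lags):
          x, y = x - x.mean(), y - y.mean()
          def lrcov(a, b):
              s = (a * b).mean()
              for k in range(1, lags + 1):
                  w = 1 - k / (lags + 1)       # Bartlett weight
                  s += w * ((a[k:] * b[:-k]).mean() + (a[:-k] * b[k:]).mean())
              return s
          return lrcov(x, y) / np.sqrt(lrcov(x, x) * lrcov(y, y))

      rng = np.random.default_rng(14)
      e = rng.standard_normal(1000)
      x = e + 0.5 * np.r_[0.0, e[:-1]]         # x and y share a shock at a lag,
      y = e + 0.5 * np.r_[e[1:], 0.0]          # so the long-run correlation
      L = int(4 * (1000 / 100) ** (2 / 9))     # exceeds the contemporaneous one
      print("lags %d, long-run correlation %.3f" % (L, long_run_corr(x, y, L)))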

This nep-ets issue is ©2005 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.