
on Econometric Time Series 
By:  Guglielmo Maria Caporale; Luis A. Gil-Alana 
Abstract:  This paper examines aggregate money demand relationships in five industrial countries by employing a two-step strategy for testing the null hypothesis of no cointegration against alternatives which are fractionally cointegrated. Fractional cointegration would imply that, although there exists a long-run relationship, the equilibrium errors exhibit slow reversion to zero, i.e. that the error correction term possesses long memory, and hence deviations from equilibrium are highly persistent. It is found that the null hypothesis of no cointegration cannot be rejected for Japan. By contrast, there is some evidence of fractional cointegration for the remaining countries, i.e., Germany, Canada, the US, and the UK (where, however, the negative income elasticity which is found is not theory-consistent). Consequently, it appears that money targeting might be the appropriate policy framework for monetary authorities in the first three countries, but not in Japan or in the UK. 
Date:  2005–01 
URL:  http://d.repec.org/n?u=RePEc:bru:bruedp:0501&r=ets 
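The two-step idea can be illustrated with a minimal sketch: an Engle-Granger-style first-stage OLS regression, followed by a log-periodogram (GPH) estimate of the memory parameter d of the residuals, where d < 1 points toward (possibly fractional) cointegration. This is an assumption-laden illustration on simulated data, not the authors' actual test statistics:

```python
import numpy as np

def gph_estimate(x, m=None):
    """Log-periodogram (GPH) estimate of the memory parameter d of series x."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if m is None:
        m = int(np.sqrt(n))  # a common bandwidth choice
    # Periodogram at the first m Fourier frequencies
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    fft = np.fft.fft(x)
    periodogram = (np.abs(fft[1:m + 1]) ** 2) / (2 * np.pi * n)
    # Regress log I(lambda_j) on -2*log(2*sin(lambda_j/2)); the slope estimates d
    regressor = -2 * np.log(2 * np.sin(freqs / 2))
    X = np.column_stack([np.ones(m), regressor])
    coef, *_ = np.linalg.lstsq(X, np.log(periodogram), rcond=None)
    return coef[1]

rng = np.random.default_rng(0)
T = 512
# Step 1: OLS "cointegrating" regression of money on income (simulated data)
income = np.cumsum(rng.standard_normal(T))      # an I(1) regressor
money = 1.0 * income + rng.standard_normal(T)   # equilibrium error is I(0) here
X = np.column_stack([np.ones(T), income])
beta, *_ = np.linalg.lstsq(X, money, rcond=None)
residuals = money - X @ beta
# Step 2: estimate d of the residuals; d significantly below 1 suggests
# cointegration, with 0 < d < 1 the fractional case of slow reversion
d_hat = gph_estimate(residuals)
```

With a truly I(0) equilibrium error, as simulated here, the estimate should be close to zero; persistent but mean-reverting equilibrium errors would give d between 0 and 1.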
By:  Sarai Criado Nuevo 
URL:  http://d.repec.org/n?u=RePEc:fda:fdaeee:0501&r=ets 
By:  Apel, Mikael (Monetary Policy Department, Central Bank of Sweden); Jansson, Per (Monetary Policy Department, Central Bank of Sweden) 
Abstract:  It has been suggested that interest-rate smoothing may be partly explained by an omitted variable that relates to conditions in financial markets. We propose an alternative interpretation that suggests that it relates to measurement errors in the output gap. 
Keywords:  Interest-rate smoothing; Measurement errors; Output gap 
JEL:  E43 E44 E52 
Date:  2005–03–01 
URL:  http://d.repec.org/n?u=RePEc:hhs:rbnkwp:0178&r=ets 
By:  Adolfson, Malin (Research Department, Central Bank of Sweden); Laséen, Stefan (Monetary Policy Department, Central Bank of Sweden); Lindé, Jesper (Research Department, Central Bank of Sweden); Villani, Mattias (Research Department, Central Bank of Sweden) 
Abstract:  In this paper we develop a dynamic stochastic general equilibrium (DSGE) model for an open economy, and estimate it on Euro area data using Bayesian estimation techniques. The model incorporates several open economy features, as well as a number of nominal and real frictions that have proven to be important for the empirical fit of closed economy models. The paper offers: i) a theoretical development of the standard DSGE model into an open economy setting, ii) Bayesian estimation of the model, including assessments of the relative importance of various shocks and frictions for explaining the dynamic development of an open economy, and iii) an evaluation of the model's empirical properties using standard validation methods. 
Keywords:  DSGE model; Open economy; Monetary Policy; Bayesian Inference; Business cycle 
JEL:  C11 E40 E47 E52 
Date:  2005–03–01 
URL:  http://d.repec.org/n?u=RePEc:hhs:rbnkwp:0179&r=ets 
By:  Adolfson, Malin (Research Department, Central Bank of Sweden); Laséen, Stefan (Monetary Policy Department, Central Bank of Sweden); Lindé, Jesper (Research Department, Central Bank of Sweden); Villani, Mattias (Research Department, Central Bank of Sweden) 
Abstract:  This paper uses an estimated open economy DSGE model to examine whether constant interest rate forecasts one and two years ahead can be regarded as modest policy interventions during the period 1993Q4–2002Q4. An intervention is here defined to be modest if it does not lead agents to revise their expectations about the inflation targeting policy. Using univariate modesty statistics, we show that the modesty of the policy interventions depends on the assumptions about the uncertainty in the future shock realizations. In 1998Q4–2002Q4, the two-year constant interest rate projections turn out to be immodest when assuming uncertainty only about monetary policy shocks during the conditioning period. However, allowing non-policy shocks to influence the forecasts makes the interventions more modest, at least one year ahead. Using a multivariate statistic, however, which takes the joint effects of the policy interventions into consideration, we find that the conditional policy shifts all projections beyond what is plausible in the latter part of the sample (1998Q4–2002Q4), and thereby affects the expectations formation of the agents. Consequently, the constant interest rate assumption has arguably led to conditional forecasts at the two-year horizon that cannot be considered economically meaningful during this period. 
Keywords:  Forecasting; Monetary policy; Open economy DSGE model; Policy interventions; Bayesian inference 
JEL:  C11 C53 E47 E52 
Date:  2005–03–01 
URL:  http://d.repec.org/n?u=RePEc:hhs:rbnkwp:0180&r=ets 
By:  Villani, Mattias (Research Department, Central Bank of Sweden) 
Abstract:  Vector autoregressions have steadily gained in popularity since their introduction in econometrics 25 years ago. A drawback of the otherwise fairly well developed methodology is the inability to incorporate prior beliefs regarding the system's steady state in a satisfactory way. Such prior information is typically readily available and may be crucial for forecasts at long horizons. This paper develops easily implemented numerical simulation algorithms for analyzing stationary and cointegrated VARs in a parametrization where prior beliefs on the steady state may be adequately incorporated. The analysis is illustrated on macroeconomic data for the Euro area. 
Keywords:  Cointegration; Bayesian inference; Forecasting; Unconditional mean; VARs 
JEL:  C11 C32 C53 E50 
Date:  2005–03–16 
URL:  http://d.repec.org/n?u=RePEc:hhs:rbnkwp:0181&r=ets 
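The role of the steady state in long-horizon forecasts can be seen in the mean-adjusted parametrization of a VAR, where the unconditional mean mu appears explicitly and a prior can be placed on it directly. A minimal sketch, assuming a stationary VAR(1) with known parameters (the paper's simulation algorithms are not reproduced):

```python
import numpy as np

# Mean-adjusted VAR(1): y_t = mu + A (y_{t-1} - mu) + eps_t.
# In this parametrization a prior sits directly on the steady state mu,
# and long-horizon point forecasts shrink toward it.
mu = np.array([2.0, 1.0])            # steady state (e.g. a prior mean)
A = np.array([[0.5, 0.1],
              [0.0, 0.8]])           # stationary coefficient matrix

def forecast(y_last, mu, A, h):
    """h-step-ahead point forecast from a mean-adjusted VAR(1)."""
    dev = y_last - mu
    for _ in range(h):
        dev = A @ dev                # deviations from mu decay geometrically
    return mu + dev

y_T = np.array([5.0, -3.0])
short = forecast(y_T, mu, A, 1)      # still far from the steady state
long_run = forecast(y_T, mu, A, 200) # converges to mu as h grows
```

This is why prior beliefs on the steady state matter most for forecasts at long horizons: the point forecast is eventually pinned down by mu alone.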
By:  Hosung Jung 
Abstract:  This paper presents an autocorrelation test that is applicable to dynamic panel data models with serially correlated errors. Our residual-based GMM t-test (hereafter: t-test) differs from the m2 and Sargan's overidentifying restriction (hereafter: Sargan test) in Arellano and Bond (1991), both of which are based on residuals from the first-difference equation. It is a significance test which is applied after estimating a dynamic model by the instrumental variable (IV) method and is directly applicable to any other consistently estimated residual. Two interesting points are found: the test depends only on the consistency of the first-step estimation, not on its efficiency; and the test is applicable to both forms of serial correlation (i.e., AR(1) or MA(1)). Monte Carlo simulations are also performed to study the practical performance of these three tests, the m2, the Sargan and the t-test, for models with first-order autoregressive AR(1) and first-order moving-average MA(1) serial correlation. The m2 and Sargan test statistics appear to accept too often in small samples even when the autocorrelation coefficient approaches unity in the AR(1) disturbance. Overall, our residual-based t-test has considerably more power than the m2 test or the Sargan test. 
Keywords:  Dynamic panel data; Residual-based GMM t-test; m2 and Sargan tests 
Date:  2005–02 
URL:  http://d.repec.org/n?u=RePEc:hst:hstdps:d0477&r=ets 
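The basic idea of a residual-based significance test for serial correlation can be sketched as follows: after any consistent first-step estimation, regress the residual on its own lag and inspect the t-statistic. This illustration works directly on a residual series and is not the paper's GMM t-test:

```python
import numpy as np

def serial_corr_tstat(resid):
    """t-statistic for first-order serial correlation: regress e_t on e_{t-1}."""
    e_lag, e = resid[:-1], resid[1:]
    rho = (e_lag @ e) / (e_lag @ e_lag)
    u = e - rho * e_lag
    se = np.sqrt((u @ u) / (len(e) - 1) / (e_lag @ e_lag))
    return rho / se

rng = np.random.default_rng(1)
white = rng.standard_normal(400)            # no serial correlation (null)
ar1 = np.empty(400)
ar1[0] = rng.standard_normal()
for t in range(1, 400):
    # strong AR(1) correlation, which the test should detect
    ar1[t] = 0.6 * ar1[t - 1] + rng.standard_normal()
t_white = serial_corr_tstat(white)          # roughly standard normal
t_ar1 = serial_corr_tstat(ar1)              # large and significant
```

Under the null the statistic is approximately standard normal, so values beyond about two in absolute value signal serial correlation.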
By:  Massimiliano Marcellino; James Stock; Mark Watson 
Abstract:  “Iterated” multi-period-ahead time series forecasts are made using a one-period-ahead model, iterated forward for the desired number of periods, whereas “direct” forecasts are made using a horizon-specific estimated model, where the dependent variable is the multi-period-ahead value being forecasted. Which approach is better is an empirical matter: in theory, iterated forecasts are more efficient if correctly specified, but direct forecasts are more robust to model misspecification. This paper compares empirical iterated and direct forecasts from linear univariate and bivariate models by applying simulated out-of-sample methods to 171 U.S. monthly macroeconomic time series spanning 1959–2002. The iterated forecasts typically outperform the direct forecasts, particularly if the models can select long lag specifications. The relative performance of the iterated forecasts improves with the forecast horizon. 
URL:  http://d.repec.org/n?u=RePEc:igi:igierp:285&r=ets 
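The two forecasting approaches can be sketched in a few lines: fit a one-step model and iterate it h periods forward, versus regress y_{t+h} directly on y_t. A minimal simulated out-of-sample comparison, assuming an AR(1) data generating process (the paper's 171-series exercise is far richer):

```python
import numpy as np

rng = np.random.default_rng(2)
T, h, train = 400, 4, 300
# AR(1) data generating process
y = np.empty(T)
y[0] = rng.standard_normal()
for t in range(1, T):
    y[t] = 0.7 * y[t - 1] + rng.standard_normal()

def ols(x, z):
    """OLS of z on a constant and x; returns (intercept, slope)."""
    X = np.column_stack([np.ones(len(x)), x])
    return np.linalg.lstsq(X, z, rcond=None)[0]

# Iterated: fit a one-period-ahead model, then iterate it h periods forward
a0, a1 = ols(y[:train - 1], y[1:train])
def iterate(y_last, steps):
    f = y_last
    for _ in range(steps):
        f = a0 + a1 * f
    return f

# Direct: regress y_{t+h} on y_t and forecast in a single step
b0, b1 = ols(y[:train - h], y[h:train])

# Pseudo-out-of-sample mean squared errors over the hold-out period
errs_iter = [y[t + h] - iterate(y[t], h) for t in range(train, T - h)]
errs_dir = [y[t + h] - (b0 + b1 * y[t]) for t in range(train, T - h)]
mse_iter = float(np.mean(np.square(errs_iter)))
mse_dir = float(np.mean(np.square(errs_dir)))
```

Because the one-step model is correctly specified here, the iterated forecast should be at least competitive; under misspecification the direct forecast tends to be more robust.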
By:  Torben G. Andersen; Tim Bollerslev; Peter F. Christoffersen; Francis X. Diebold 
Abstract:  Volatility has been one of the most active and successful areas of research in time series econometrics and economic forecasting in recent decades. This chapter provides a selective survey of the most important theoretical developments and empirical insights to emerge from this burgeoning literature, with a distinct focus on forecasting applications. Volatility is inherently latent, and Section 1 begins with a brief intuitive account of various key volatility concepts. Section 2 then discusses a series of different economic situations in which volatility plays a crucial role, ranging from the use of volatility forecasts in portfolio allocation to density forecasting in risk management. Sections 3, 4 and 5 present a variety of alternative procedures for univariate volatility modeling and forecasting based on the GARCH, stochastic volatility and realized volatility paradigms, respectively. Section 6 extends the discussion to the multivariate problem of forecasting conditional covariances and correlations, and Section 7 discusses volatility forecast evaluation methods in both univariate and multivariate cases. Section 8 concludes briefly. 
JEL:  C1 G1 
Date:  2005–03 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:11188&r=ets 
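A standard building block of the GARCH forecasting paradigm surveyed above is the multi-step variance recursion of a GARCH(1,1), whose forecasts mean-revert to the unconditional variance. A minimal sketch with illustrative (assumed) parameter values:

```python
# GARCH(1,1): sigma2_{t+1} = omega + alpha * r_t**2 + beta * sigma2_t.
# Multi-step forecasts mean-revert to the unconditional variance
# omega / (1 - alpha - beta) at geometric rate (alpha + beta).
omega, alpha, beta = 0.05, 0.08, 0.90    # illustrative parameter values

def garch_forecast(sigma2_t, r_t, h):
    """h-step-ahead conditional variance forecast from a GARCH(1,1)."""
    f = omega + alpha * r_t ** 2 + beta * sigma2_t   # one-step forecast
    for _ in range(h - 1):
        # beyond one step, E[r^2] equals the variance forecast itself
        f = omega + (alpha + beta) * f
    return f

uncond = omega / (1 - alpha - beta)                  # 2.5 with these values
near = garch_forecast(sigma2_t=1.0, r_t=0.5, h=1)    # 0.97
far = garch_forecast(sigma2_t=1.0, r_t=0.5, h=500)   # close to uncond
```

The persistence alpha + beta governs how slowly the forecast reverts, which is precisely why forecast horizon matters so much in the volatility applications the chapter surveys.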
By:  Jaroslava Hlouskova; Martin Wagner 
Abstract:  This paper presents results concerning the size and power of first generation panel unit root and stationarity tests obtained from a large scale simulation study, with in total about 290 million test statistics computed. The tests developed in the following papers are included: Levin, Lin and Chu (2002), Harris and Tzavalis (1999), Breitung (2000), Im, Pesaran and Shin (1997 and 2003), Maddala and Wu (1999), Hadri (2000) and Hadri and Larsson (2002). Our simulation setup is designed to address, inter alia, the following issues. First, we assess the performance as a function of the time and the cross-section dimension. Second, we analyze the impact of positive MA roots on the test performance. Third, we investigate the power of the panel unit root tests (and the size of the stationarity tests) for a variety of first-order autoregressive coefficients. Fourth, we consider both of the usual specifications of deterministic variables in the unit root literature. 
Keywords:  Panel Unit Root Test; Panel Stationarity Test; Size; Power; Simulation Study 
JEL:  C12 C15 C23 
Date:  2005–03 
URL:  http://d.repec.org/n?u=RePEc:ube:dpvwib:dp0503&r=ets 
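Several of the tests compared above, notably Im, Pesaran and Shin, build on individual Dickey-Fuller t-statistics averaged across the cross-section. A minimal sketch of that ingredient under the unit root null (the standardization constants and the full test battery are omitted):

```python
import numpy as np

def df_tstat(y):
    """Dickey-Fuller t-statistic (with constant): regress dy_t on [1, y_{t-1}]."""
    dy, y_lag = np.diff(y), y[:-1]
    X = np.column_stack([np.ones(len(y_lag)), y_lag])
    coef, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ coef
    s2 = (resid @ resid) / (len(dy) - 2)
    cov = s2 * np.linalg.inv(X.T @ X)
    return coef[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(3)
N, T = 200, 100
# Individual DF t-statistics for N independent random walks (the null)
tstats = np.array([df_tstat(np.cumsum(rng.standard_normal(T)))
                   for _ in range(N)])
# An IPS-type panel statistic standardizes this cross-section average
t_bar = float(tstats.mean())
```

Under the null the individual statistics follow the non-standard Dickey-Fuller distribution (mean around -1.5 with a constant included), and a simulation study of the kind described here maps how size and power vary with N and T.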
By:  Patrick Marsh 
Abstract:  This paper proposes and analyses a measure of distance for the unit root hypothesis tested against stochastic stationarity. It applies over a family of distributions, for any sample size, for any specification of deterministic components and under additional autocorrelation, here parameterised by a finite-order moving average. The measure is shown to obey a set of inequalities involving the measures of distance of Gibbs and Su (2002), which are also extended to include power. It is also shown to be a convex function of both the degree of the time polynomial regressor and the moving-average parameters. Thus it is minimisable with respect to either. Implicitly, therefore, we find that linear trends and innovations having a moving-average negative unit root will necessarily make power small. In the context of the Nelson and Plosser (1982) data, the distance is used to measure the impact that specification of the deterministic trend has on our ability to make unit root inferences. For certain series it highlights how imposition of a linear trend can lead to estimated models indistinguishable from unit root processes, while freely estimating the degree of the trend yields a model very different in character. 
URL:  http://d.repec.org/n?u=RePEc:yor:yorken:05/02&r=ets 