New Economics Papers on Econometric Time Series
By: | Smets, Frank; Wouters, Rafael |
Abstract: | Using a Bayesian likelihood approach, we estimate a dynamic stochastic general equilibrium model for the US economy using seven macro-economic time series. The model incorporates many types of real and nominal frictions and seven types of structural shocks. We show that this model is able to compete with Bayesian Vector Autoregression models in out-of-sample prediction. We investigate the relative empirical importance of the various frictions. Finally, using the estimated model we address a number of key issues in business cycle analysis: What are the sources of business cycle fluctuations? Can the model explain the cross-correlation between output and inflation? What are the effects of productivity on hours worked? What are the sources of the “Great Moderation”? |
Keywords: | business cycle; DSGE models; monetary policy |
JEL: | E4 E5 |
Date: | 2007–02 |
URL: | http://d.repec.org/n?u=RePEc:cpr:ceprdp:6112&r=ets |
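The entry above rests on Bayesian likelihood estimation of a structural model. As a minimal sketch of that estimation step only, the Python code below runs a random-walk Metropolis sampler for the single parameter of a toy AR(1) model; the toy model, the Beta prior and the tuning constants are illustrative assumptions and are not the seven-shock DSGE model estimated in the paper.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: one simulated AR(1) series stands in for the seven macro series of the paper.
T, rho_true, sigma = 200, 0.8, 1.0
y = np.zeros(T)
for t in range(1, T):
    y[t] = rho_true * y[t - 1] + sigma * rng.standard_normal()

def log_posterior(rho):
    # Gaussian AR(1) likelihood plus a Beta(3, 3) prior kernel keeping rho in (0, 1).
    if not 0.0 < rho < 1.0:
        return -np.inf
    resid = y[1:] - rho * y[:-1]
    return -0.5 * np.sum(resid**2) / sigma**2 + 2.0 * np.log(rho) + 2.0 * np.log(1.0 - rho)

# Random-walk Metropolis: propose rho' = rho + step * N(0, 1),
# accept with probability min(1, posterior ratio).
draws, rho, step = [], 0.5, 0.05
for _ in range(20000):
    prop = rho + step * rng.standard_normal()
    if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(rho):
        rho = prop
    draws.append(rho)

posterior = np.array(draws[5000:])                     # discard burn-in draws
print("posterior mean of rho:", posterior.mean().round(3))
print("90% credible interval:", np.quantile(posterior, [0.05, 0.95]).round(3))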
By: | Matteo Manera (University of Milan Bicocca); Chiara Longo (Fondazione Eni Enrico Mattei); Anil Markandya (University of Bath and Fondazione Eni Enrico Mattei); Elisa Scarpa (Risk Management Department, Intesa-San Paolo) |
Abstract: | The relevance of oil in the world economy explains why considerable effort has been devoted to the development of different types of econometric models for oil price forecasting. Several specifications have been proposed in the economic literature. Some are based on financial theory and concentrate on the relationship between spot and futures prices (“financial” models). Others assign a key role to variables explaining the characteristics of the physical oil market (“structural” models). The empirical literature is far from any consensus about which model is most appropriate for oil price forecasting. Relative to the previous literature, this paper is novel in several respects. First of all, we test and systematically evaluate the ability of several alternative econometric specifications proposed in the literature to capture the dynamics of oil prices. Second, we analyse the effects of different data frequencies on the coefficient estimates and forecasts obtained using each selected econometric specification. Third, we compare different models at different data frequencies on a common sample and common data. Fourth, we evaluate the forecasting performance of each selected model using static and dynamic forecasts, as well as different measures of forecast errors. Finally, we propose a new class of models which combine the relevant aspects of the financial and structural specifications proposed in the literature (“mixed” models). Our empirical findings can be summarized as follows. Financial models in levels do not produce satisfactory forecasts for the WTI spot price. The financial error correction model yields accurate in-sample forecasts. Real and strategic variables alone are insufficient to capture the oil spot price dynamics in the forecasting sample. Our proposed mixed models are statistically adequate and exhibit accurate forecasts. Different data frequencies seem to affect the forecasting ability of the models under analysis. |
Keywords: | Oil Price, WTI Spot And Futures Prices, Forecasting, Econometric Models |
JEL: | C52 C53 Q32 Q43 |
Date: | 2007–01 |
URL: | http://d.repec.org/n?u=RePEc:fem:femwpa:2007.4&r=ets |
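The financial specifications discussed above link spot and futures prices through an error correction mechanism and are judged by out-of-sample forecast error measures. The sketch below, on simulated data, estimates a single-equation error correction model by OLS and reports the RMSE and MAE of static one-step-ahead forecasts; the variable names, lag choices and sample split are illustrative assumptions, not the specifications estimated in the paper.

import numpy as np

rng = np.random.default_rng(1)

# Simulated cointegrated log spot and futures prices (a common stochastic trend).
T = 300
common = np.cumsum(rng.standard_normal(T))
spot = common + 0.3 * rng.standard_normal(T)
fut = common + 0.3 * rng.standard_normal(T)

# ECM: d_spot_t = a + b * d_fut_{t-1} + g * (spot_{t-1} - fut_{t-1}) + e_t
d_spot = np.diff(spot)
d_fut = np.diff(fut)
ect = (spot - fut)[:-1]                               # spot minus futures, lagged in X below

y = d_spot[1:]                                        # dependent variable
X = np.column_stack([np.ones(T - 2), d_fut[:-1], ect[1:]])

split = 250                                           # estimation / hold-out split
beta, *_ = np.linalg.lstsq(X[: split - 1], y[: split - 1], rcond=None)

# Static one-step-ahead forecasts over the hold-out sample.
pred_d_spot = X[split - 1:] @ beta
pred_spot = spot[split:-1] + pred_d_spot              # forecast level = lagged spot + predicted change
actual_spot = spot[split + 1:]

errors = actual_spot - pred_spot
print("RMSE:", np.sqrt(np.mean(errors**2)).round(3))
print("MAE :", np.mean(np.abs(errors)).round(3))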
By: | Badi H. Baltagi (Center for Policy Research, Maxwell School, Syracuse University, Syracuse, NY 13244-1020) |
Abstract: | This paper gives a brief survey of forecasting with panel data. It starts with a simple error component regression model and surveys best linear unbiased prediction (BLUP) under various assumptions on the disturbance term, including various ARMA models as well as spatial autoregressive models. The paper also surveys how these forecasts have been used in panel data applications, running horse races between heterogeneous and homogeneous panel data models using out-of-sample forecasts. |
Keywords: | forecasting; BLUP; panel data; spatial dependence; serial correlation; heterogeneous panels. |
JEL: | C33 |
URL: | http://d.repec.org/n?u=RePEc:max:cprwps:91&r=ets |
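The survey above centres on the best linear unbiased predictor for the error component model. A minimal sketch of the one-way case follows: the Goldberger-style forecast adds theta * ubar_i, with theta = T*sig_mu^2 / (T*sig_mu^2 + sig_nu^2), to the regression forecast. Pooled OLS stands in for feasible GLS and the data are simulated, so this illustrates the idea rather than the estimators surveyed in the paper.

import numpy as np

rng = np.random.default_rng(2)
N, T = 50, 10
beta_true, sig_mu, sig_nu = 1.5, 1.0, 0.5

# One-way error component panel: y_it = beta * x_it + mu_i + nu_it.
x = rng.standard_normal((N, T + 1))                  # last period held out for forecasting
mu = sig_mu * rng.standard_normal(N)                 # individual effects
y = beta_true * x + mu[:, None] + sig_nu * rng.standard_normal((N, T + 1))

# Estimate beta by pooled OLS on the first T periods (a simplification of feasible GLS).
X_est, y_est = x[:, :T].ravel(), y[:, :T].ravel()
beta_hat = np.sum(X_est * y_est) / np.sum(X_est**2)

resid = y[:, :T] - beta_hat * x[:, :T]               # pooled residuals, N x T
ubar = resid.mean(axis=1)                            # individual mean residuals

# Simple variance-component estimates from within and between variation.
sig_nu2_hat = ((resid - ubar[:, None])**2).sum() / (N * (T - 1))
sig_mu2_hat = max(ubar.var(ddof=1) - sig_nu2_hat / T, 0.0)

theta = T * sig_mu2_hat / (T * sig_mu2_hat + sig_nu2_hat)

naive_fc = beta_hat * x[:, T]                        # ignores the individual effects
blup_fc = naive_fc + theta * ubar                    # BLUP correction using past residuals

actual = y[:, T]
for name, fc in [("naive", naive_fc), ("BLUP", blup_fc)]:
    print(name, "RMSE:", np.sqrt(np.mean((actual - fc)**2)).round(3))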
By: | Alex Maynard (Wilfrid Laurier University); Katsumi Shimotsu (Queen's University) |
Abstract: | This paper develops a new test of orthogonality based on a zero restriction on the covariance between the dependent variable and the predictor. The test provides a useful alternative to regression-based tests when conditioning variables have roots close to, or equal to, unity. In this case standard predictive regression tests can suffer from well-documented size distortion. Moreover, under the alternative hypothesis, they force the dependent variable to share the same order of integration as the predictor, whereas in practice the dependent variable often appears stationary while the predictor may be near-nonstationary. By contrast, the new test does not enforce the same orders of integration and is therefore capable of detecting alternatives to orthogonality that are excluded by the standard predictive regression model. Moreover, the test statistic has a standard normal limit distribution for both unit root and local-to-unity conditioning variables, without prior knowledge of the local-to-unity parameter. If the conditioning variable is stationary, the test remains conservative and consistent. Thus the new test requires neither size correction nor a unit root pre-test. Simulations suggest good small-sample performance. As an empirical application, we test for the predictability of stock returns using two persistent predictors, the dividend-price ratio and the short-term interest rate. |
Keywords: | orthogonality test, covariance estimation, local-to-unity, unit roots, market efficiency, predictive regression, regression imbalance |
JEL: | C12 C22 |
Date: | 2007–02 |
URL: | http://d.repec.org/n?u=RePEc:qed:wpaper:1122&r=ets |
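The test described above replaces the predictive regression coefficient with a zero restriction on a covariance. The sketch below illustrates the flavour of such a covariance-based check on simulated data: it standardises the sample covariance between returns and the lagged persistent predictor with a Newey-West long-run variance and compares it with N(0,1) critical values. The studentisation in Maynard and Shimotsu's statistic is constructed differently so as to remain valid for unit-root and local-to-unity predictors, so this is an illustrative stand-in, not their test.

import numpy as np

rng = np.random.default_rng(3)

# Simulated data under the null: a near-unit-root predictor and unpredictable returns.
T = 500
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.98 * x[t - 1] + rng.standard_normal()
y = rng.standard_normal(T)                            # returns, orthogonal to x by construction

yy = y[1:] - y[1:].mean()                             # y_t
xx = x[:-1] - x[:-1].mean()                           # x_{t-1}
prod = yy * xx                                        # products whose mean is the sample covariance

n = prod.size
cov_hat = prod.mean()

# Newey-West long-run variance of the product series (illustrative bandwidth rule).
L = int(4 * (n / 100) ** (2 / 9))
pc = prod - cov_hat
lrv = np.sum(pc**2) / n
for k in range(1, L + 1):
    w = 1.0 - k / (L + 1)                             # Bartlett kernel weight
    lrv += 2.0 * w * np.sum(pc[k:] * pc[:-k]) / n

t_stat = np.sqrt(n) * cov_hat / np.sqrt(lrv)
print("covariance-based t statistic:", t_stat.round(3))  # compare with N(0, 1) critical values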
By: | Rodney C Wolff; Peter Hall; Qiwei Yao (School of Economics and Finance, Queensland University of Technology) |
Abstract: | Motivated by the problem of setting prediction intervals in time series analysis, we suggest two new methods for conditional distribution estimation. The first method is based on locally fitting a logistic model and is in the spirit of recent work on locally parametric techniques in density estimation. It produces distribution estimators that may be of arbitrarily high order but nevertheless always lie between 0 and 1. The second method involves an adjusted form of the Nadaraya--Watson estimator. It preserves the bias and variance properties of a class of second-order estimators introduced by Yu and Jones but has the added advantage of always being a distribution itself. Our methods also have application outside the time series setting; for example, to quantile estimation for independent data. This problem motivated the work of Yu and Jones. |
Keywords: | Absolutely regular; bandwidth; biased bootstrap; conditional distribution; kernel methods; local linear methods; local logistic methods; Nadaraya-Watson estimator; prediction; quantile estimation; time series analysis; weighted bootstrap |
Date: | 2006–06–15 |
URL: | http://d.repec.org/n?u=RePEc:qut:rwolff:2006-11&r=ets |
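Both estimators above target the conditional distribution function F(y | x). As a point of reference, the sketch below implements the plain Nadaraya-Watson estimator of F(y | x) for an AR(1) series with X_t = Y_{t-1}; the bandwidth, kernel and simulated data are illustrative assumptions, and the adjusted (reweighted) estimator proposed in the paper is not reproduced here.

import numpy as np

rng = np.random.default_rng(4)

# Simulated AR(1) series; estimate the conditional distribution of Y_t given Y_{t-1}.
T = 500
ys = np.zeros(T)
for t in range(1, T):
    ys[t] = 0.6 * ys[t - 1] + rng.standard_normal()

X, Y = ys[:-1], ys[1:]

def nw_conditional_cdf(x0, y_grid, h):
    # Nadaraya-Watson estimate of P(Y <= y | X = x0) on a grid of y values.
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)            # Gaussian kernel weights
    w /= w.sum()
    return np.array([(w * (Y <= y)).sum() for y in y_grid])

y_grid = np.linspace(-3, 3, 7)
cdf_hat = nw_conditional_cdf(x0=1.0, y_grid=y_grid, h=0.4)

# The estimate is monotone and lies in [0, 1] by construction; a prediction
# interval for Y_t given Y_{t-1} = 1 can be read off by inverting it on a finer grid.
for yv, p in zip(y_grid, cdf_hat):
    print(f"P(Y <= {yv:5.2f} | X = 1.0) ~ {p:.3f}")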
By: | Rodney C Wolff; Darfiana Nur; Kerrie L Mengersen (School of Economics and Finance, Queensland University of Technology) |
Abstract: | Most MCMC users address the convergence problem by applying diagnostic tools to the output produced by running their samplers. Potentially useful diagnostics may be borrowed from diverse areas such as time series analysis. One such method is phase randomisation. The aim of this paper is to describe this method in the context of MCMC, summarise its characteristics, and contrast its performance with that of the more common diagnostic tests for MCMC. It is observed that the new tool contributes information about third- and higher-order cumulant behaviour, which is important in characterising certain forms of nonlinearity and nonstationarity. |
Keywords: | Convergence diagnostics; higher cumulants; Markov Chain Monte Carlo; non-linear time series; stationarity; surrogate series |
Date: | 2006–06–15 |
URL: | http://d.repec.org/n?u=RePEc:qut:rwolff:2006-4&r=ets |
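Phase randomisation builds surrogate series that keep the chain's spectrum (its second-order structure) while destroying phase information, so third- and higher-order structure shows up as a discrepancy between the chain and its surrogates. The sketch below applies this idea to a simulated bilinear series standing in for sampler output; the choice of third-order statistic and the stand-in series are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(5)

def phase_randomise(x, rng):
    # Surrogate with the same amplitude spectrum as x but random Fourier phases.
    n = x.size
    X = np.fft.rfft(x - x.mean())
    phases = rng.uniform(0.0, 2.0 * np.pi, X.size)
    phases[0] = 0.0                                   # keep the zero-frequency term real
    if n % 2 == 0:
        phases[-1] = 0.0                              # Nyquist term must stay real
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=n) + x.mean()

def third_order_stat(x):
    # Sample third-order moment E[z_t * z_{t+1}^2] of the centred series.
    z = x - x.mean()
    return np.mean(z[:-1] * z[1:] ** 2)

# Toy "chain": a bilinear process, so third-order structure is present.
n = 2000
e = rng.standard_normal(n)
chain = np.zeros(n)
for t in range(1, n):
    chain[t] = 0.5 * chain[t - 1] + 0.4 * chain[t - 1] * e[t - 1] + e[t]

observed = third_order_stat(chain)
surrogates = np.array([third_order_stat(phase_randomise(chain, rng)) for _ in range(500)])

# A chain statistic far out in the tails of the surrogate distribution signals
# higher-order (third-cumulant) structure that second-order diagnostics miss.
p_value = np.mean(np.abs(surrogates - surrogates.mean()) >= abs(observed - surrogates.mean()))
print("observed third-order moment:", observed.round(4))
print("surrogate-based two-sided p-value:", p_value.round(3))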
By: | Rodney C Wolff; Adrian G Barnett (School of Economics and Finance, Queensland University of Technology) |
Abstract: | The bispectrum and the third-order moment can be viewed as equivalent tools for testing for the presence of nonlinearity in stationary time series, since the bispectrum is the Fourier transform of the third-order moment. An advantage of the bispectrum is that its estimator comprises terms that are asymptotically independent at distinct bifrequencies under the null hypothesis of linearity. An advantage of the third-order moment is that its values in any subset of joint lags can be used in the test, whereas when using the bispectrum the entire (or truncated) third-order moment is required to construct the Fourier transform. In this paper, we propose a test for nonlinearity based upon the estimated third-order moment. We use the phase scrambling bootstrap method to give a nonparametric estimate of the variance of our test statistic under the null hypothesis. Using a simulation study, we demonstrate that the test attains its target significance level and has large power when compared with an existing standard parametric test that uses the bispectrum. Further, we show how the proposed test can be used to identify the source of nonlinearity due to interactions at specific frequencies. We also investigate implications for heuristic diagnosis of nonstationarity. |
Keywords: | Third-order moment; bispectrum; non-linear; non-stationary; time series; bootstrap; phase scrambling |
Date: | 2006–06–15 |
URL: | http://d.repec.org/n?u=RePEc:qut:rwolff:2006-5&r=ets |
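The test above combines sample third-order moments over a chosen subset of joint lags with a phase scrambling bootstrap under the null of linearity. The sketch below follows that recipe in a simplified form: it aggregates squared third-order moments over three lag pairs and compares the statistic with its phase-scrambled null distribution. The lag subset, the sum-of-squares aggregation and the bilinear test series are illustrative assumptions rather than the exact statistic of the paper.

import numpy as np

rng = np.random.default_rng(6)

def third_moment(x, j, k):
    # Sample third-order moment C(j, k) = E[z_t * z_{t+j} * z_{t+k}] of the centred series.
    z = x - x.mean()
    m = max(j, k)
    return np.mean(z[: z.size - m] * z[j: z.size - m + j] * z[k: z.size - m + k])

def phase_scramble(x, rng):
    # Surrogate sharing the amplitude spectrum of x, with randomised phases.
    n = x.size
    X = np.fft.rfft(x - x.mean())
    phases = rng.uniform(0.0, 2.0 * np.pi, X.size)
    phases[0] = 0.0
    if n % 2 == 0:
        phases[-1] = 0.0
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=n) + x.mean()

# Test series: a bilinear process, which is linear at second order but not at third.
n = 1000
e = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.4 * x[t - 1] + 0.4 * x[t - 1] * e[t - 1] + e[t]

lags = [(1, 1), (1, 2), (2, 2)]                       # chosen subset of joint lags
stat = sum(third_moment(x, j, k) ** 2 for j, k in lags)

null_stats = np.array([
    sum(third_moment(phase_scramble(x, rng), j, k) ** 2 for j, k in lags)
    for _ in range(500)
])
p_value = np.mean(null_stats >= stat)
print("test statistic:", round(float(stat), 4), " phase-scrambling p-value:", p_value.round(3))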
By: | Rodney C Wolff (School of Economics and Finance, Queensland University of Technology) |
Abstract: | Of much interest in financial econometrics is the recovery of joint distributional behaviour of collections of contemporaneous financial time series, e.g., two related commodity price series, or two asset returns series. An approach to model their joint behaviour is to use copulas. Essentially, copulas are selected on the basis of a measure of correlation between the two series and are made to match their marginal properties. Of course, generalisations exist for more than two series. A possible limitation of this approach is that only linear correlations between series might be captured. We consider incorporating more general dependence structures, through the use of the correlation integral (as in the BDS test), as a means to refine the choice of candidate copulas in an empirical situation. |
Keywords: | Archimedean copula; copula; correlation integral; dependence; Poisson convergence |
Date: | 2006–06–15 |
URL: | http://d.repec.org/n?u=RePEc:qut:rwolff:2006-6&r=ets |
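One way to operationalise the idea above is to compare the correlation integral (the pair-proximity measure underlying the BDS test) of the observed pair of series, transformed to uniform margins, with that of draws from a candidate copula. The sketch below does this for a simulated pair with tail dependence and a Gaussian-copula candidate; the data, the candidate family and the distance thresholds are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(7)

def to_uniform(x):
    # Probability-integral transform via ranks (empirical copula margins).
    return (np.argsort(np.argsort(x)) + 1.0) / (x.size + 1.0)

def correlation_integral(z, eps):
    # Fraction of point pairs within sup-norm distance eps (Grassberger-Procaccia).
    d = np.max(np.abs(z[:, None, :] - z[None, :, :]), axis=2)
    iu = np.triu_indices(z.shape[0], k=1)
    return np.mean(d[iu] < eps)

# Simulated pair of series with joint heavy tails that a Gaussian copula understates.
n = 400
common = rng.standard_t(df=3, size=n)
x = common + 0.5 * rng.standard_normal(n)
y = common + 0.5 * rng.standard_normal(n)
u_obs = np.column_stack([to_uniform(x), to_uniform(y)])

# Candidate: Gaussian copula parameterised by the observed rank correlation (a simplification).
rho = np.corrcoef(u_obs[:, 0], u_obs[:, 1])[0, 1]
g = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
u_cand = np.column_stack([to_uniform(g[:, 0]), to_uniform(g[:, 1])])

# Matching correlation integrals across eps is an extra requirement beyond matching correlation.
for eps in (0.05, 0.10, 0.20):
    print(f"eps = {eps:.2f}: C_obs = {correlation_integral(u_obs, eps):.3f}, "
          f"C_gauss = {correlation_integral(u_cand, eps):.3f}")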
By: | Mauro Costantini (ISAE - Institute for Studies and Economic Analyses); Roy Cerqueti (Università degli Studi di Roma “La Sapienza”, Italy) |
Abstract: | This paper provides a theoretical fractional cointegration analysis in a nonparametric framework. We solve a generalized eigenvalue problem. To this end, a pair of random matrices is constructed, taking into account the stationarity properties of the differences of a fractional p-variate integrated process. These difference orders are assumed to vary over a continuous and discrete range. The random matrices are defined by some weight functions. The asymptotic behaviour of these random matrices is obtained by stating some conditions on the weight functions, and by using the results of Bierens (1997) and Andersen et al. (1983). In this way, a nonparametric analysis is provided. Starting from the solution of the generalized eigenvalue problem, a fractional nonparametric VAR model for cointegration is also presented. |
Keywords: | Fractional integrated process, Nonparametric methods, Cointegration, Asymptotic distribution, Generalized eigenvalues problem. |
JEL: | C14 C22 C65 |
Date: | 2007–02 |
URL: | http://d.repec.org/n?u=RePEc:isa:wpaper:78&r=ets |
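A stylised version of the generalized eigenvalue construction, in the spirit of the Bierens (1997) framework that the paper extends to the fractional case, is sketched below for an ordinary bivariate I(1) system: two random matrices are built from cosine-weighted means of the levels and of the first differences, and small eigenvalues of the pencil point to cointegrating directions. The weight functions, scalings and integer-order setup are illustrative simplifications of the paper's fractional framework.

import numpy as np

rng = np.random.default_rng(8)

# Simulate a bivariate I(1) system with one cointegrating relation: z1 - z2 is I(0).
T = 1000
trend = np.cumsum(rng.standard_normal(T))             # common stochastic trend
z = np.column_stack([trend + rng.standard_normal(T),
                     trend + rng.standard_normal(T)])
dz = np.diff(z, axis=0)

t_idx = np.arange(1, T + 1)
m = 5                                                  # number of weight functions
A = np.zeros((2, 2))
B = np.zeros((2, 2))
for k in range(1, m + 1):
    w = np.cos(2.0 * np.pi * k * (t_idx - 0.5) / T)    # smooth cosine weight function
    F = (w @ z) / T                                    # weighted mean of the levels
    G = (w[1:] @ dz) / T                               # weighted mean of the differences
    A += np.outer(F, F)
    B += np.outer(G, G)

# Generalized eigenvalue problem A v = lambda B v, solved via the eigenvalues of B^{-1} A.
eigvals, eigvecs = np.linalg.eig(np.linalg.solve(B, A))
order = np.argsort(eigvals.real)
print("eigenvalues (small = cointegration):", np.round(eigvals.real[order], 2))
beta = eigvecs[:, order[0]].real
print("estimated cointegrating vector (normalised):", np.round(beta / beta[0], 3))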