nep-ets New Economics Papers
on Econometric Time Series
Issue of 2010–05–15
thirteen papers chosen by
Yong Yin
SUNY at Buffalo

  1. Are Some Forecasters Really Better Than Others? By Antonello D’Agostino; Kieran McQuinn; Karl Whelan
  2. Forecasting from Mis-specified Models in the Presence of Unanticipated Location Shifts By Michael P. Clements; David F. Hendry
  3. Spot Variance Path Estimation and its Application to High Frequency Jump Testing By Charles S. Bos; Pawel Janus; Siem Jan Koopman
  4. Do Jumps Matter? Forecasting Multivariate Realized Volatility Allowing for Common Jumps By Yin Liao; Heather Anderson; Farshid Vahid
  5. Are policy counterfactuals based on structural VARs reliable? By Luca Benati
  6. Forecasting with DSGE models By Kai Christoffel; Günter Coenen; Anders Warne
  7. Detrending moving average algorithm for multifractals By Gao-Feng Gu; Wei-Xing Zhou
  8. Persistent collective trend in stock markets By Emeric Balogh; Ingve Simonsen; Balint Zs. Nagy; Zoltan Neda
  9. Geometrical Approximation method and stochastic volatility market models By Dell'Era, Mario
  10. Characterizing economic trends by Bayesian stochastic model specification search By Grassi, Stefano; Proietti, Tommaso
  11. A Threshold Stochastic Volatility Model with Realized Volatility By Dinghai Xu
  12. Modeling Asymmetric Volatility Clusters Using Copulas and High Frequency Data By Cathy Ning; Dinghai Xu; Tony Wirjanto
  13. Empirical Evidence of the Leverage Effect in a Stochastic Volatility Model: A Realized Volatility Approach By Dinghai Xu; Yuying Li

  1. By: Antonello D’Agostino (Central Bank of Ireland); Kieran McQuinn (Central Bank of Ireland); Karl Whelan (University College Dublin)
    Abstract: In any dataset with individual forecasts of economic variables, some forecasters will perform better than others. However, it is possible that these ex post differences reflect sampling variation and thus overstate the ex ante differences between forecasters. In this paper, we present a simple test of the null hypothesis that all forecasters in the US Survey of Professional Forecasters have equal ability. We construct a test statistic that reflects both the relative and absolute performance of the forecaster and use bootstrap techniques to compare the empirical results with the equivalents obtained under the null hypothesis of equal forecaster ability. Results suggest limited evidence for the idea that the best forecasters are actually innately better than others, though there is evidence that a relatively small group of forecasters perform very poorly.
    Keywords: Forecasting, Bootstrap
    Date: 2010–04–15
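The bootstrap logic described in the abstract above can be illustrated with a toy version: under the null that all forecasters share one error distribution, resample pooled errors and compare the observed cross-forecaster dispersion of mean squared errors against its bootstrap distribution. This is a minimal sketch, not the authors' actual statistic; the function name, the dispersion measure, and all parameter values are illustrative assumptions.

```python
import numpy as np

def equal_ability_pvalue(errors, n_boot=2000, seed=0):
    # errors: (T, K) array of forecast errors for K forecasters over T periods.
    # Test statistic: cross-forecaster dispersion (std) of mean squared errors.
    rng = np.random.default_rng(seed)
    t, k = errors.shape
    stat = np.std(np.mean(errors**2, axis=0))
    pooled = errors.ravel()                      # null: one common error distribution
    boot = np.empty(n_boot)
    for b in range(n_boot):
        e = rng.choice(pooled, size=(t, k), replace=True)
        boot[b] = np.std(np.mean(e**2, axis=0))
    return np.mean(boot >= stat)                 # bootstrap p-value of observed dispersion

rng = np.random.default_rng(3)
equal = rng.normal(0, 1, size=(80, 20))          # all 20 forecasters equally able
unequal = equal.copy()
unequal[:, :3] *= 3                              # three clearly worse forecasters
print(equal_ability_pvalue(equal), equal_ability_pvalue(unequal))
```

With equal ability the observed dispersion is a typical bootstrap draw, so the p-value is unremarkable; with three poor forecasters the dispersion is far out in the bootstrap tail.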
  2. By: Michael P. Clements; David F. Hendry
    Abstract: This chapter describes the issues confronting any realistic context for economic forecasting, which is inevitably based on unknowingly mis-specified models, usually estimated from mis-measured data, facing intermittent and often unanticipated location shifts. We focus on mitigating the systematic forecast failures that result in such settings, and describe the background to our approach, the difficulties of evaluating forecasts, and the devices that are more robust when change occurs.
    Keywords: Economic forecasting, Location shifts, Mis-specified models, Robust forecasts
    JEL: C51 C22
    Date: 2010
  3. By: Charles S. Bos (VU University Amsterdam); Pawel Janus (VU University Amsterdam); Siem Jan Koopman (VU University Amsterdam)
    Abstract: This paper considers spot variance path estimation from datasets of intraday high frequency asset prices in the presence of diurnal variance patterns, jumps, leverage effects and microstructure noise. We rely on parametric and nonparametric methods. The estimated spot variance path can be used to extend an existing high frequency jump test statistic, to detect arrival times of jumps and to obtain distributional characteristics of detected jumps. The effectiveness of our approach is explored through Monte Carlo simulations. It is shown that sparse sampling for mitigating the impact of microstructure noise has an adverse effect on both spot variance estimation and jump detection. In our approach we can analyze high frequency price observations that are contaminated with microstructure noise without the need for sparse sampling, say, at fifteen-minute intervals. An empirical illustration is presented for the intraday EUR/USD exchange rates. Our main finding is that fewer jumps are detected when sampling intervals increase.
    Keywords: high frequency; intraday periodicity; jump testing; leverage effect; microstructure noise; pre-averaged bipower variation; spot variance
    JEL: C12 C13 C22 G10 G14
    Date: 2009–12–04
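The jump-testing idea mentioned in the keywords above builds on comparing realized variance (sensitive to jumps) with bipower variation (robust to jumps). A minimal sketch on synthetic returns, using the plain bipower estimator rather than the paper's pre-averaged, noise-robust version; all names and parameter values are illustrative:

```python
import numpy as np

def realized_variance(r):
    # Sum of squared intraday returns: estimates integrated variance plus jumps
    return np.sum(r**2)

def bipower_variation(r):
    # (pi/2) * sum |r_i||r_{i-1}|: jump-robust estimate of integrated variance
    return (np.pi / 2) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))

def relative_jump(r):
    # Share of realized variance attributable to jumps (floored at zero)
    rv, bv = realized_variance(r), bipower_variation(r)
    return max(rv - bv, 0.0) / rv

rng = np.random.default_rng(0)
r = rng.normal(0.0, 0.001, size=288)   # 5-minute returns over one trading day
r_jump = r.copy()
r_jump[100] += 0.02                    # inject a single large jump
print(relative_jump(r), relative_jump(r_jump))
```

On the jump-free day RV and BV nearly coincide; after the injected jump the relative jump measure rises sharply, which is the signal a formal test statistic would scale and compare to a normal critical value.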
  4. By: Yin Liao; Heather Anderson; Farshid Vahid
    Abstract: Realized volatility of stock returns is often decomposed into two distinct components that are attributed to continuous price variation and jumps. This paper proposes a tobit multivariate factor model for the jumps coupled with a standard multivariate factor model for the continuous sample path to jointly forecast volatility in three Chinese Mainland stocks. Out-of-sample forecast analysis shows that separate multivariate factor models for the two volatility processes outperform a single multivariate factor model of realized volatility, and that a single multivariate factor model of realized volatility outperforms univariate models.
    JEL: C13 C32 C52 C53 G32
    Date: 2010–05
  5. By: Luca Benati (European Central Bank, Monetary Policy Strategy Division, Kaiserstrasse 29, 60311 Frankfurt am Main, Germany.)
    Abstract: Based on standard New Keynesian models, I show that policy counterfactuals based on the theoretical structural VAR representations of the models fail to reliably capture the impact of changes in the parameters of the Taylor rule on the (reduced-form) properties of the economy. Based on estimated models for the Great Inflation and the most recent period, I show that, as a practical matter, the problem appears to be non-negligible. These results imply that the outcomes of SVAR-based policy counterfactuals should be regarded with caution, as their informativeness for the specific issue at hand (e.g., understanding the role played by monetary policy in exacerbating the Great Depression, causing the Great Inflation, or fostering the Great Moderation) is, in principle, open to question. Finally, I argue that SVAR-based policy counterfactuals suffer from a crucial logical shortcoming: given that their reliability crucially depends on unknown structural characteristics of the underlying data generation process, such reliability cannot simply be assumed, and can instead only be ascertained with a reasonable degree of confidence by estimating structural (DSGE) models.
    Keywords: Lucas critique, structural VARs, policy counterfactuals, DSGE models, Taylor rules, monetary policy, Great Depression, Great Inflation, Great Moderation.
    JEL: E30 E32
    Date: 2010–05
  6. By: Kai Christoffel (Directorate General Research, European Central Bank, Kaiserstrasse 29, 60311 Frankfurt am Main, Germany.); Günter Coenen (Directorate General Research, European Central Bank, Kaiserstrasse 29, 60311 Frankfurt am Main, Germany.); Anders Warne (Directorate General Research, European Central Bank, Kaiserstrasse 29, 60311 Frankfurt am Main, Germany.)
    Abstract: In this paper we review the methodology of forecasting with log-linearised DSGE models using Bayesian methods. We focus on the estimation of their predictive distributions, with special attention being paid to the mean and the covariance matrix of h-step ahead forecasts. In the empirical analysis, we examine the forecasting performance of the New Area-Wide Model (NAWM) that has been designed for use in the macroeconomic projections at the European Central Bank. The forecast sample covers the period following the introduction of the euro and the out-of-sample performance of the NAWM is compared to nonstructural benchmarks, such as Bayesian vector autoregressions (BVARs). Overall, the empirical evidence indicates that the NAWM compares quite well with the reduced-form models and the results are therefore in line with previous studies. Yet there is scope for improving the NAWM’s forecasting performance. For example, the model is not able to explain the moderation in wage growth over the forecast evaluation period and, therefore, it tends to overestimate nominal wages. As a consequence, both the multivariate point and density forecasts using the log determinant and the log predictive score, respectively, suggest that a large BVAR can outperform the NAWM.
    Keywords: Bayesian inference, DSGE models, euro area, forecasting, open-economy macroeconomics, vector autoregression.
    JEL: C11 C32 E32 E37
    Date: 2010–05
  7. By: Gao-Feng Gu; Wei-Xing Zhou
    Abstract: Detrending moving average (DMA) is a widely used method to quantify the correlations of non-stationary signals. We generalize DMA to multifractal detrending moving average (MFDMA), and then extend the one-dimensional MFDMA to a two-dimensional version. In the paper, we elaborate one-dimensional and two-dimensional MFDMA theoretically and apply the methods to synthetic multifractal measures. We find that the numerical estimates of the multifractal scaling exponent $\tau(q)$ and the multifractal spectrum $f(\alpha)$ are in good agreement with the theoretical values. We also compare the performance of MFDMA with MFDFA, and report that MFDMA is superior to MFDFA when applied to the analysis of one-dimensional and two-dimensional multifractal measures.
    Date: 2010–05
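A one-dimensional (backward) DMA fluctuation function, the building block the abstract above generalizes, can be sketched as follows: detrend the cumulative profile by its moving average and measure how the q-th order fluctuation $F_q(n)$ scales with the window size $n$, $F_q(n) \sim n^{h(q)}$. This is an illustrative implementation under those standard definitions, not the authors' code:

```python
import numpy as np

def mfdma_fluctuation(x, n, q):
    # Profile: cumulative sum of the demeaned signal
    y = np.cumsum(x - np.mean(x))
    # Backward moving average of the profile over a window of size n
    kernel = np.ones(n) / n
    ma = np.convolve(y, kernel, mode="valid")    # ma[i] = mean of y[i:i+n]
    resid = y[n - 1:] - ma                       # profile minus its moving average
    # Segment the residuals into non-overlapping windows of length n
    m = len(resid) // n
    seg = resid[:m * n].reshape(m, n)
    f2 = np.mean(seg**2, axis=1)                 # per-segment squared fluctuation
    if q == 0:
        return np.exp(0.5 * np.mean(np.log(f2)))  # logarithmic average for q = 0
    return np.mean(f2 ** (q / 2)) ** (1.0 / q)

# Rough check on white noise: h(2) should be close to 0.5
rng = np.random.default_rng(42)
x = rng.normal(size=2**14)
ns = np.array([16, 32, 64, 128, 256])
fs = np.array([mfdma_fluctuation(x, n, q=2) for n in ns])
h2 = np.polyfit(np.log(ns), np.log(fs), 1)[0]
print(round(h2, 2))
```

For a monofractal signal h(q) is flat in q; multifractality shows up as a q-dependent h(q), from which $\tau(q)$ and $f(\alpha)$ follow by the usual Legendre transform.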
  8. By: Emeric Balogh; Ingve Simonsen; Balint Zs. Nagy; Zoltan Neda
    Abstract: Empirical evidence is given for a significant difference in the collective trend of share prices during stock index rising and falling periods. Data on the Dow Jones Industrial Average and its stock components are studied between 1991 and 2008. Pearson-type correlations are computed between the stocks and averaged over stock pairs and time. The results indicate a general trend: whenever the stock index is falling, the stock prices change in a more correlated manner than when the stock index is ascending. A thorough statistical analysis of the data shows that the observed difference is significant, suggesting a constant fear factor among stockholders.
    Date: 2010–05
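The core computation in the abstract above — averaging Pearson correlations over stock pairs, separately for rising and falling index periods — can be sketched on synthetic data. The stronger co-movement on down days is built into the toy factor structure purely for illustration; this is not the DJIA dataset:

```python
import numpy as np

def mean_pairwise_corr(returns):
    # Average the upper triangle of the Pearson correlation matrix over stock pairs
    c = np.corrcoef(returns, rowvar=False)
    iu = np.triu_indices_from(c, k=1)
    return c[iu].mean()

rng = np.random.default_rng(7)
t, n = 1000, 10
common = rng.normal(size=t)                       # market-wide factor
idio = rng.normal(size=(t, n))                    # stock-specific noise
beta = np.where(common < 0, 1.5, 0.5)[:, None]    # stronger loading on down days (toy)
stocks = beta * common[:, None] + idio
index = stocks.mean(axis=1)                       # equal-weight "index"

up, down = index > 0, index <= 0
print(mean_pairwise_corr(stocks[down]), mean_pairwise_corr(stocks[up]))
```

Splitting the sample on the index's sign and comparing the two averages is the same conditioning step the paper applies before its significance analysis.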
  9. By: Dell'Era, Mario
    Abstract: We propose a new technique to derive a good approximate solution for the price of European vanilla options in a market model with stochastic volatility. In particular, the models that we have considered are Heston and SABR (for beta = 1). These models allow arbitrary correlation between volatility and spot asset returns. We are able to write the price of a European call and put in the same form as in the Black-Scholes model. The solution technique is based upon coordinate transformations that reduce the initial PDE to a straightforward one-dimensional heat equation.
    Keywords: Financial pricing method
    JEL: C0 C02 I22
    Date: 2010–05–05
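The reduction the abstract above alludes to can be illustrated on the plain Black-Scholes equation; this is the standard textbook change of variables, not the paper's specific transformation for the Heston or SABR models:

```latex
% Black-Scholes PDE for the option value V(S,t):
\[
\frac{\partial V}{\partial t}
  + \tfrac{1}{2}\sigma^2 S^2 \frac{\partial^2 V}{\partial S^2}
  + rS\,\frac{\partial V}{\partial S} - rV = 0 .
\]
% Substitute S = K e^{x}, \tau = \tfrac{1}{2}\sigma^2 (T-t),
% V = K v(x,\tau), and k = 2r/\sigma^2:
\[
\frac{\partial v}{\partial \tau}
  = \frac{\partial^2 v}{\partial x^2}
  + (k-1)\frac{\partial v}{\partial x} - k v .
\]
% A final exponential substitution removes the lower-order terms:
\[
v = e^{-\frac{1}{2}(k-1)x - \frac{1}{4}(k+1)^2 \tau}\, u(x,\tau)
\quad\Longrightarrow\quad
\frac{\partial u}{\partial \tau} = \frac{\partial^2 u}{\partial x^2} .
\]
```

The resulting one-dimensional heat equation is solved by the Gaussian heat kernel, and undoing the substitutions yields the familiar Black-Scholes form the paper targets.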
  10. By: Grassi, Stefano; Proietti, Tommaso
    Abstract: We apply a recently proposed Bayesian model selection technique, known as stochastic model specification search, for characterising the nature of the trend in macroeconomic time series. We illustrate that the methodology can be quite successfully applied to discriminate between stochastic and deterministic trends. In particular, we formulate autoregressive models with stochastic trend components and decide whether a specific feature of the series, i.e. the underlying level and/or the rate of drift, is fixed or evolving.
    Keywords: Bayesian model selection; stationarity; unit roots; stochastic trends; variable selection.
    JEL: E32 C52 C22
    Date: 2010–05–07
  11. By: Dinghai Xu (Department of Economics, University of Waterloo)
    Abstract: Rapid developments in computer technology have made financial transaction data available at extremely fine time scales. The realized volatility, as a proxy for the "true" volatility, can be constructed using such high frequency data. This paper extends the threshold stochastic volatility specification proposed in So, Li and Lam (2002) by incorporating high frequency volatility measures. Because the volatility time series is then observable, parameter estimation can be easily implemented via standard maximum likelihood estimation (MLE) rather than simulated Bayesian methods. In the Monte Carlo section, several mis-specification and sensitivity experiments are conducted. The proposed methodology shows good performance according to the Monte Carlo results. In the empirical study, three stock indices are examined under the threshold stochastic volatility structure. Empirical results show that in different regimes, the returns and volatilities exhibit asymmetric behavior. In addition, this paper allows the threshold in the model to be flexible and uses a sequential optimization based on MLE to search for the "optimal" threshold value. We find that the model with a flexible threshold is always preferred to the model with a fixed threshold according to the log-likelihood measure. Interestingly, the "optimal" threshold is found to be stable across realized volatility measures constructed at different sampling frequencies.
    JEL: C01 C51
    Date: 2010–05
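The realized volatility series that stands in for the latent volatility in the MLE step above is, in its simplest form, the sum of squared intraday log returns over each day. A minimal sketch comparing two sampling frequencies on simulated prices; all parameter values are illustrative:

```python
import numpy as np

def realized_volatility(intraday_prices):
    # Daily realized variance: sum of squared intraday log returns
    r = np.diff(np.log(intraday_prices))
    return np.sum(r**2)

# Simulate one day of 1-minute prices from a constant-volatility random walk
rng = np.random.default_rng(1)
sigma = 0.01                                  # daily return volatility (assumed)
n = 390                                       # trading minutes in one day
r = rng.normal(0.0, sigma / np.sqrt(n), n)
prices = np.concatenate(([100.0], 100.0 * np.exp(np.cumsum(r))))

rv_1min = realized_volatility(prices)         # all 1-minute observations
rv_5min = realized_volatility(prices[::5])    # sparser 5-minute sampling
print(rv_1min, rv_5min)
```

Both estimates target the same integrated variance (here sigma squared, 1e-4); the sparser grid simply trades efficiency for robustness to microstructure noise, which is why the paper examines a range of sampling frequencies.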
  12. By: Cathy Ning; Dinghai Xu; Tony Wirjanto (Department of Economics, University of Waterloo)
    Abstract: Volatility clustering is a well-known stylized feature of financial asset returns. In this paper, we investigate the asymmetric pattern of volatility clustering on both the stock and foreign exchange rate markets. To this end, we employ copula-based semi-parametric univariate time-series models that accommodate the clusters of both large and small volatilities in the analysis. Using daily realized volatilities of individual company stocks, stock indices and foreign exchange rates constructed from high frequency data, we find that volatility clustering is strongly asymmetric in the sense that clusters of large volatilities tend to be much stronger than those of small volatilities. In addition, the asymmetric pattern of volatility clusters continues to be visible even when the clusters are allowed to change over time, and the volatility clusters themselves remain persistent even after forty days.
    JEL: C51 G32
    Date: 2010–01
  13. By: Dinghai Xu; Yuying Li (Department of Economics, University of Waterloo)
    Abstract: Increasing attention has been focused on the analysis of realized volatility, which can be treated as a proxy for the true volatility. In this paper, we study the potential use of realized volatility as a proxy in stochastic volatility model estimation. We estimate the leveraged stochastic volatility model using realized volatility computed from five popular methods across six sampling frequencies of transaction data (from 1-minute to 60-minute). Availability of the realized volatility allows us to estimate the model parameters via the MLE and thus avoids the computational challenge of high-dimensional integration. Six stock indices are considered in the empirical investigation. We discover some consistent findings and interesting patterns from the empirical results. In general, a significant leverage effect is consistently detected at each sampling frequency. The volatility persistence becomes weaker at lower sampling frequencies. We also find that the consistent-scaling and "optimal"-weighted realized volatility methods proposed by Hansen and Lunde (2005) provide relatively better performance compared to the other methods considered.
    Length: 26 pages
    JEL: C01 C51
    Date: 2010–05

This nep-ets issue is ©2010 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.