nep-ets New Economics Papers
on Econometric Time Series
Issue of 2012‒11‒17
ten papers chosen by
Yong Yin
SUNY at Buffalo

  1. Robust estimation and forecasting of the long-term seasonal component of electricity spot prices By Jakub Nowotarski; Jakub Tomczyk; Rafal Weron
  2. Estimates of Uncertainty around the RBA's Forecasts By Peter Tulip; Stephanie Wallace
  3. Bayesian forecasting with highly correlated predictors By Dimitris Korobilis
  4. Estimation of long memory in integrated variance By Eduardo Rossi; Paolo Santucci de Magistris
  5. Long memory and Periodicity in Intraday Volatility By Eduardo Rossi; Dean Fantazzini
  6. Memory Parameter Estimation in the Presence of Level Shifts and Deterministic Trends By Adam McCloskey; Pierre Perron
  7. Estimation of the Long-Memory Stochastic Volatility Model Parameters that is Robust to Level Shifts and Deterministic Trends By Adam McCloskey
  8. Modeling Time-Varying Dependencies between Positive-Valued High-Frequency Time Series By Nikolaus Hautsch; Julia Schaumburg; Melanie Schienle
  9. Nonlinear Forecasting Using Large Datasets: Evidences on US and Euro Area Economies By Alessandro Giovannelli
  10. Realized stochastic volatility with leverage and long memory By Shinichiro Shirota; Takayuki Hizu; Yasuhiro Omori

  1. By: Jakub Nowotarski; Jakub Tomczyk; Rafal Weron
    Abstract: When building stochastic models for electricity spot prices the problem of utmost importance is the estimation and consequent forecasting of a component to deal with trends and seasonality in the data. While the short-term seasonal components (daily, weekly) are more regular and less important for valuation of typical power derivatives, the long-term seasonal components (LTSC; seasonal, annual) are much more difficult to tackle. Surprisingly, in many academic papers dealing with electricity spot price modeling the importance of the seasonal decomposition is neglected and the problem of forecasting it is not considered. With this paper we want to fill the gap and present a thorough study on estimation and forecasting of the LTSC of electricity spot prices. We consider a battery of models based on Fourier or wavelet decomposition combined with linear or exponential decay. We find that all considered wavelet-based models are significantly better in terms of forecasting spot prices up to a year ahead than all considered sine-based models. This result questions the validity and usefulness of stochastic models of spot electricity prices built on sinusoidal long-term seasonal components.
    Keywords: Electricity spot price; Long-term seasonal component; Robust modeling; Forecasting; Wavelets;
    JEL: C45 C53 C80 Q47
    Date: 2012
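    For concreteness, a minimal pure-Python sketch of the simplest sine-based LTSC specification compared in the paper: a mean plus one annual sinusoid fitted by least squares. The wavelet-based models the authors find superior, and the linear/exponential decay terms, are not shown; the function names and the 365-day default period are illustrative.

```python
import math

def fit_annual_sine(prices, period=365):
    """Least-squares fit of a mean plus one annual sinusoid.
    With regular sampling over whole periods the sine and cosine
    regressors are orthogonal, so the least-squares coefficients
    reduce to simple projections onto the basis functions."""
    n = len(prices)
    mean = sum(prices) / n
    a = 2 / n * sum((p - mean) * math.sin(2 * math.pi * t / period)
                    for t, p in enumerate(prices))
    b = 2 / n * sum((p - mean) * math.cos(2 * math.pi * t / period)
                    for t, p in enumerate(prices))
    return mean, a, b

def ltsc(t, mean, a, b, period=365):
    """Evaluate the fitted long-term seasonal component at time t."""
    return (mean + a * math.sin(2 * math.pi * t / period)
            + b * math.cos(2 * math.pi * t / period))
```

    Deseasonalizing then amounts to subtracting `ltsc(t, ...)` from each observed price before fitting a stochastic model to the residual.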
  2. By: Peter Tulip (Reserve Bank of Australia); Stephanie Wallace (Reserve Bank of Australia)
    Abstract: We use past forecast errors to construct confidence intervals and other estimates of uncertainty around the Reserve Bank of Australia's forecasts of key macroeconomic variables. Our estimates suggest that uncertainty about forecasts is high. We find that the RBA's forecasts have substantial explanatory power for the inflation rate but not for GDP growth.
    Keywords: forecast errors; confidence intervals
    JEL: E17 E27 E37
    Date: 2012–11
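    The mechanics of turning past forecast errors into a confidence band can be sketched in a few lines; this is a generic illustration assuming roughly mean-zero, normal errors, not the RBA's exact procedure, and `error_band` is an illustrative name.

```python
from statistics import NormalDist

def error_band(forecast_errors, coverage=0.70):
    """Half-width of a symmetric confidence interval built from past
    forecast errors: the RMSE of the historical errors scaled by the
    two-sided quantile of a standard normal distribution."""
    n = len(forecast_errors)
    rmse = (sum(e * e for e in forecast_errors) / n) ** 0.5
    z = NormalDist().inv_cdf(0.5 + coverage / 2)
    return z * rmse
```

    A point forecast f then gets the interval [f - band, f + band]; wider historical errors or higher requested coverage both widen the band.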
  3. By: Dimitris Korobilis
    Abstract: This paper considers Bayesian variable selection in regressions with a large number of possibly highly correlated macroeconomic predictors. I show that acknowledging the correlation structure in the predictors can improve forecasts over existing popular Bayesian variable selection algorithms.
    Keywords: Bayesian semiparametric selection; Dirichlet process prior; correlated predictors; clustered coefficients
    JEL: C11 C14 C32 C52 C53
    Date: 2012–07
  4. By: Eduardo Rossi (Department of Economics and Management, University of Pavia); Paolo Santucci de Magistris (School of Economics and Management, Aarhus University, CREATES)
    Abstract: A stylized fact is that realized variance has long memory. We show that, when the instantaneous volatility is a long memory process of order d, the integrated variance is characterized by the same long-range dependence. We prove that the spectral density of realized variance is given by the sum of the spectral density of the integrated variance plus that of a measurement error, due to the sparse sampling and market microstructure noise. Hence, the realized volatility has the same degree of long memory as the integrated variance. The additional term in the spectral density induces a finite-sample bias in the semiparametric estimates of the long memory. A Monte Carlo simulation provides evidence that the corrected local Whittle estimator of Hurvich et al. (2005) is much less biased than the standard local Whittle estimator and the empirical application shows that it is robust to the choice of the sampling frequency used to compute the realized variance. Finally, the empirical results suggest that the volatility series are more likely to be generated by a nonstationary fractional process.
    Keywords: Realized variance, Long memory stochastic volatility, Measurement error, local Whittle estimator.
    JEL: C14 C22 C58
    Date: 2012–11
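    The standard local Whittle estimator discussed above can be sketched as follows: minimize Robinson's objective over the first m Fourier frequencies. This is the plain (uncorrected) version, not the noise-corrected estimator of Hurvich et al. (2005), and it uses a naive O(n·m) DFT plus a grid search for brevity; the function names are illustrative.

```python
import cmath
import math

def periodogram_ordinate(x, j):
    """Periodogram at the j-th Fourier frequency, via a naive DFT."""
    n = len(x)
    lam = 2 * math.pi * j / n
    dft = sum(x[t] * cmath.exp(-1j * lam * t) for t in range(n))
    return abs(dft) ** 2 / (2 * math.pi * n)

def local_whittle_d(x, m):
    """Grid-search minimizer of the local Whittle objective
    R(d) = log((1/m) sum_j lam_j^(2d) I_j) - (2d/m) sum_j log(lam_j)
    over the first m Fourier frequencies; returns the estimate of d."""
    n = len(x)
    lams = [2 * math.pi * j / n for j in range(1, m + 1)]
    I = [periodogram_ordinate(x, j) for j in range(1, m + 1)]
    mean_log_lam = sum(math.log(l) for l in lams) / m

    def objective(d):
        g = sum(l ** (2 * d) * i for l, i in zip(lams, I)) / m
        return math.log(g) - 2 * d * mean_log_lam

    grid = [k / 200 - 0.5 for k in range(201)]   # d in [-0.5, 0.5]
    return min(grid, key=objective)
```

    The choice of bandwidth m drives the bias-variance tradeoff the abstract alludes to: more frequencies reduce variance but pick up more of the measurement-error spectrum.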
  5. By: Eduardo Rossi (Department of Economics and Management, University of Pavia); Dean Fantazzini (Moscow School of Economics, M.V. Lomonosov Moscow State University)
    Abstract: Intraday return volatilities are characterized by the contemporaneous presence of periodicity and long memory. This paper proposes two new parameterizations of the intraday volatility: the Fractionally Integrated Periodic EGARCH and the Seasonal Fractional Integrated Periodic EGARCH, which provide the required flexibility to account for both features. The periodic kurtosis and periodic autocorrelations of power transformations of the absolute returns are computed for both models. The empirical application shows that the volatility of the hourly E-mini S&P 500 futures returns is characterized by a periodic leverage effect coupled with a statistically significant long-range dependence. An out-of-sample forecasting comparison with alternative models shows that a constrained version of the FI-PEGARCH provides superior forecasts. A simulation experiment is carried out to investigate the effects that sample frequency has on the fractional differencing parameter estimate.
    Keywords: Intraday volatility, Long memory, FI-PEGARCH, SFI-PEGARCH, Periodic models.
    JEL: C22 C58 G13
    Date: 2012–11
  6. By: Adam McCloskey; Pierre Perron
    Abstract: We propose estimators of the memory parameter of a time series that are robust to a wide variety of random level shift processes, deterministic level shifts and deterministic time trends. The estimators are simple trimmed versions of the popular log-periodogram regression estimator that employ certain sample size-dependent and, in some cases, data-dependent trimmings which discard low-frequency components. We also show that a previously developed trimmed local Whittle estimator is robust to the same forms of data contamination. Regardless of whether the underlying long/short-memory process is contaminated by level shifts or deterministic trends, the estimators are consistent and asymptotically normal with the same limiting variance as their standard untrimmed counterparts. Simulations show that the trimmed estimators serve their intended purpose quite well, substantially decreasing both finite sample bias and root mean-squared error in the presence of these contaminating components. Furthermore, we assess the tradeoffs involved with their use when such components are not present but the underlying process exhibits strong short-memory dynamics or is contaminated by noise. To balance the potential finite sample biases involved in estimating the memory parameter, we recommend a particular adaptive version of the trimmed log-periodogram estimator that performs well in a wide variety of circumstances. We apply the estimators to stock market volatility data to find that various time series typically thought to be long-memory processes actually appear to be short or very weak long-memory processes contaminated by level shifts or deterministic trends.
    Keywords: long-memory processes, semiparametric estimators, level shifts, structural change, deterministic trends
    Date: 2012
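    The core idea is compact: the log-periodogram (GPH) regression estimates d as minus half the slope of log-periodogram ordinates on log frequency, and trimming simply drops the lowest ordinates. A pure-Python sketch follows; the fixed-count `trim` argument is an illustration, not the paper's sample-size-dependent or adaptive trimming rules.

```python
import cmath
import math

def trimmed_gph(x, m, trim=0):
    """Log-periodogram (GPH) regression estimate of the memory
    parameter d using Fourier frequencies j = trim+1, ..., m.
    Discarding the lowest frequencies (trim > 0) removes the
    ordinates most distorted by level shifts and trends."""
    n = len(x)
    logs_I, regressors = [], []
    for j in range(trim + 1, m + 1):
        lam = 2 * math.pi * j / n
        dft = sum(x[t] * cmath.exp(-1j * lam * t) for t in range(n))
        logs_I.append(math.log(abs(dft) ** 2 / (2 * math.pi * n)))
        # standard GPH regressor: -2 log(2 sin(lam/2))
        regressors.append(-2 * math.log(2 * math.sin(lam / 2)))
    rbar = sum(regressors) / len(regressors)
    ybar = sum(logs_I) / len(logs_I)
    num = sum((r - rbar) * (y - ybar) for r, y in zip(regressors, logs_I))
    den = sum((r - rbar) ** 2 for r in regressors)
    return num / den   # OLS slope = estimate of d
```

    For an uncontaminated short-memory series both the trimmed and untrimmed versions should hover near d = 0; level-shift contamination pushes the untrimmed estimate upward while the trimmed one stays closer to the truth.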
  7. By: Adam McCloskey
    Abstract: I provide conditions under which the trimmed FDQML estimator, advanced by McCloskey (2010) in the context of fully parametric short-memory models, can be used to estimate the long-memory stochastic volatility model parameters in the presence of additive low-frequency contamination in log-squared returns. The types of low-frequency contamination covered include level shifts as well as deterministic trends. I establish consistency and asymptotic normality in the presence or absence of such low-frequency contamination under certain conditions on the growth rate of the trimming parameter. I also provide theoretical guidance on the choice of trimming parameter by heuristically obtaining its asymptotic MSE-optimal rate under certain types of low-frequency contamination. A simulation study examines the finite sample properties of the robust estimator, showing substantial gains from its use in the presence of level shifts. The finite sample analysis also explores how different levels of trimming affect the parameter estimates in the presence and absence of low-frequency contamination and long-memory.
    Keywords: stochastic volatility, frequency domain estimation, robust estimation, spurious persistence, long-memory, level shifts, structural change, deterministic trends
    Date: 2012
  8. By: Nikolaus Hautsch; Julia Schaumburg; Melanie Schienle
    Abstract: Multiplicative error models (MEM) have become a standard tool for modeling conditional durations of intraday transactions, realized volatilities and trading volumes. Parametric estimation of the corresponding multivariate model, the so-called vector MEM (VMEM), requires a specification of the joint error term distribution, which, due to the lack of suitable multivariate distribution functions on R^d_+, is defined via a copula. Maximum likelihood estimation is based on the assumption of constant copula parameters and therefore leads to invalid inference if the dependence exhibits time variations or structural breaks. Hence, we suggest testing for time-varying dependence by calibrating a time-varying copula model and re-estimating the VMEM on identified intervals of homogeneous dependence. This paper summarizes the important aspects of the (V)MEM, its estimation, and a sequential test for changes in the dependence structure. The techniques are applied in an empirical example.
    Keywords: vector multiplicative error model, copula, time-varying copula, high-frequency data
    JEL: C32 C51
    Date: 2012–09
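    The univariate MEM building block is easy to sketch: a positive series is the product of a conditional mean, which follows a GARCH-style recursion, and a positive unit-mean innovation. The VMEM couples several such equations and ties the innovations together through the copula discussed above, which this sketch omits; parameter values and the function name are illustrative.

```python
import random

def simulate_mem(n, omega=0.1, alpha=0.1, beta=0.8, seed=0):
    """Simulate a univariate multiplicative error model
    x_t = mu_t * eps_t with unit-mean exponential innovations and
    the recursion mu_t = omega + alpha * x_{t-1} + beta * mu_{t-1}."""
    rng = random.Random(seed)
    mu = omega / (1 - alpha - beta)   # start at the unconditional mean
    xs = []
    for _ in range(n):
        eps = rng.expovariate(1.0)    # positive, E[eps] = 1
        x = mu * eps
        xs.append(x)
        mu = omega + alpha * x + beta * mu
    return xs
```

    With alpha + beta < 1 the process is stationary with unconditional mean omega / (1 - alpha - beta), which is why durations, volumes and realized volatilities simulated this way cluster around a positive level with persistent swings.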
  9. By: Alessandro Giovannelli (Department of Economics and Finance, University of Rome "Tor Vergata")
    Abstract: The primary objective of this paper is to propose two nonlinear extensions for macroeconomic forecasting using large datasets. First, we propose an alternative technique for factor estimation, i.e., kernel principal component analysis, which allows the factors to have a nonlinear relationship to the input variables. Second, we propose artificial neural networks as an alternative to the factor augmented linear forecasting equation. These two extensions allow us to determine whether, in general, there is empirical evidence in favor of nonlinear methods and, in particular, to verify whether the nonlinearity occurs in the estimation of the factors or in the functional form that links the target variable to the factors. To assess the empirical performance of the proposed methods, we conducted several pseudo forecasting exercises on the industrial production index and consumer price index for the Euro area and US economies. These methods were employed to construct the forecasts at 1-, 3-, 6-, and 12-month horizons using a large dataset containing 259 predictors for the Euro area and 131 predictors for the US economy. The results obtained from the empirical study suggest that the estimation of nonlinear factors, using kernel principal components, significantly improves the quality of forecasts compared to the linear method, while artificial neural networks show the same forecasting ability as the factor augmented linear forecasting equation.
    Keywords: Kernel Principal Component Analysis; Large Dataset; Artificial Neural Networks; QuickNet; Forecasting
    JEL: C45 C53 C13 C33
    Date: 2012–11–07
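    Kernel principal components replace the covariance matrix of linear PCA with a centered kernel matrix; the leading eigenvector's entries are the observations' scores on the first nonlinear factor. A minimal pure-Python sketch using an RBF kernel and power iteration (a full implementation would extract several components and orthogonalize; names and the gamma parameter are illustrative):

```python
import math

def rbf_kernel_matrix(X, gamma=1.0):
    """Gaussian (RBF) kernel matrix of a list of feature vectors."""
    return [[math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(xi, xj)))
             for xj in X] for xi in X]

def center_kernel(K):
    """Double-center the kernel matrix (demeaning in feature space)."""
    n = len(K)
    row = [sum(K[i]) / n for i in range(n)]
    tot = sum(row) / n
    return [[K[i][j] - row[i] - row[j] + tot for j in range(n)]
            for i in range(n)]

def leading_component(K, iters=200):
    """Power iteration for the top eigenvector of the centered kernel.
    The start vector must not be constant: the centered kernel
    annihilates the all-ones direction."""
    n = len(K)
    v = [float(i + 1) for i in range(n)]
    norm = math.sqrt(sum(c * c for c in v))
    v = [c / norm for c in v]
    for _ in range(iters):
        w = [sum(K[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(c * c for c in w))
        v = [c / norm for c in w]
    return v
```

    The resulting scores would then enter the forecasting equation exactly where linear principal-component factors otherwise would.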
  10. By: Shinichiro Shirota (Graduate School of Economics, University of Tokyo); Takayuki Hizu (Mitsubishi UFJ Trust and Banking); Yasuhiro Omori (Faculty of Economics, University of Tokyo)
    Abstract: The daily return and the realized volatility are simultaneously modeled in the stochastic volatility model with leverage and long memory. The dependent variable in the stochastic volatility model is the logarithm of the squared return, and its error distribution is approximated by a mixture of normals. In addition, we incorporate the logarithm of the realized volatility into the measurement equation, assuming that the latent log volatility follows an Autoregressive Fractionally Integrated Moving Average (ARFIMA) process to describe its long memory property. Using a state space representation, we propose an efficient Bayesian estimation method implemented via Markov chain Monte Carlo (MCMC). Model comparisons are performed based on the marginal likelihood, and the volatility forecasting performances are investigated using S&P500 stock index returns.
    Date: 2012–11
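    The long-memory ingredient here, the fractional difference operator in ARFIMA, is compact enough to sketch: (1 - L)^d expands into moving-average weights via a simple recursion. A pure-Python illustration (the truncation argument `k_max` is an implementation convenience, not part of the model):

```python
def fracdiff_weights(d, k_max):
    """Coefficients of (1 - L)^d = sum_k pi_k L^k, computed by the
    standard recursion pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k."""
    w = [1.0]
    for k in range(1, k_max + 1):
        w.append(w[-1] * (k - 1 - d) / k)
    return w

def fracdiff(series, d, k_max=None):
    """Apply the truncated fractional difference filter to a series."""
    n = len(series)
    w = fracdiff_weights(d, k_max or n)
    return [sum(w[k] * series[t - k] for k in range(min(t + 1, len(w))))
            for t in range(n)]
```

    For d = 1 the weights collapse to ordinary first differencing; for 0 < d < 0.5 they decay hyperbolically, which is what gives the latent log volatility its slowly decaying autocorrelations.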

This nep-ets issue is ©2012 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at <>. For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.