
on Econometric Time Series 
By:  Jakub Nowotarski; Jakub Tomczyk; Rafal Weron 
Abstract:  When building stochastic models for electricity spot prices, a problem of utmost importance is the estimation and consequent forecasting of a component to deal with trends and seasonality in the data. While the short-term seasonal components (daily, weekly) are more regular and less important for the valuation of typical power derivatives, the long-term seasonal components (LTSC; seasonal, annual) are much more difficult to tackle. Surprisingly, many academic papers dealing with electricity spot price modeling neglect the importance of the seasonal decomposition and do not consider the problem of forecasting it. With this paper we want to fill the gap and present a thorough study on estimation and forecasting of the LTSC of electricity spot prices. We consider a battery of models based on Fourier or wavelet decomposition combined with linear or exponential decay. We find that all considered wavelet-based models are significantly better in terms of forecasting spot prices up to a year ahead than all considered sine-based models. This result questions the validity and usefulness of stochastic models of spot electricity prices built on sinusoidal long-term seasonal components.
Keywords:  Electricity spot price; Long-term seasonal component; Robust modeling; Forecasting; Wavelets;
JEL:  C45 C53 C80 Q47 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:wuu:wpaper:hsc1206&r=ets 
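As a rough illustration of the wavelet route (not the authors' exact filters; this sketch uses the Haar wavelet and assumes simple power-of-two padding), the LTSC can be obtained by zeroing all detail coefficients of a multilevel decomposition and keeping only the coarse approximation:

```python
import numpy as np

def haar_ltsc(prices, levels=6):
    """Estimate a long-term seasonal component (LTSC) as the coarse
    approximation of a multilevel Haar wavelet decomposition: discard
    all detail coefficients, then invert the transform."""
    x = np.asarray(prices, dtype=float)
    n = len(x)
    size = 1 << int(np.ceil(np.log2(n)))           # pad to a power of two
    approx = np.concatenate([x, np.full(size - n, x[-1])])

    for _ in range(levels):                        # forward Haar transform;
        even, odd = approx[0::2], approx[1::2]     # details are discarded
        approx = (even + odd) / np.sqrt(2.0)

    for _ in range(levels):                        # inverse with zero details
        out = np.empty(2 * len(approx))
        out[0::2] = approx / np.sqrt(2.0)
        out[1::2] = approx / np.sqrt(2.0)
        approx = out
    return approx[:n]
```

With `levels=5` the result is piecewise constant over blocks of 32 observations, smoothing out daily and weekly components while tracking the annual cycle.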
By:  Peter Tulip (Reserve Bank of Australia); Stephanie Wallace (Reserve Bank of Australia) 
Abstract:  We use past forecast errors to construct confidence intervals and other estimates of uncertainty around the Reserve Bank of Australia's forecasts of key macroeconomic variables. Our estimates suggest that uncertainty about forecasts is high. We find that the RBA's forecasts have substantial explanatory power for the inflation rate but not for GDP growth. 
Keywords:  forecast errors; confidence intervals 
JEL:  E17 E27 E37 
Date:  2012–11 
URL:  http://d.repec.org/n?u=RePEc:rba:rbardp:rdp201207&r=ets 
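A minimal sketch of the error-based approach (function names are illustrative): collect past forecast errors at a given horizon and use their empirical quantiles to form an interval around the current point forecast.

```python
import numpy as np

def error_based_interval(point_forecast, past_errors, coverage=0.7):
    """Confidence interval from the empirical distribution of past
    forecast errors (error = outcome - forecast)."""
    errs = np.asarray(past_errors, dtype=float)
    lo, hi = np.quantile(errs, [(1 - coverage) / 2, (1 + coverage) / 2])
    return point_forecast + lo, point_forecast + hi
```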
By:  Dimitris Korobilis 
Abstract:  This paper considers Bayesian variable selection in regressions with a large number of possibly highly correlated macroeconomic predictors. I show that acknowledging the correlation structure in the predictors can improve forecasts over existing popular Bayesian variable selection algorithms.
Keywords:  Bayesian semiparametric selection; Dirichlet process prior; correlated predictors; clustered coefficients 
JEL:  C11 C14 C32 C52 C53 
Date:  2012–07 
URL:  http://d.repec.org/n?u=RePEc:gla:glaewp:2012_12&r=ets 
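The paper's Dirichlet-process approach is more elaborate, but the basic mechanics of Bayesian variable selection can be sketched with a textbook stand-in: enumerate predictor subsets under Zellner's g-prior and report posterior inclusion probabilities. All names here are illustrative, and full enumeration is only feasible for a handful of predictors.

```python
import itertools
import numpy as np

def g_prior_inclusion_probs(y, X, g=None):
    """Posterior inclusion probabilities by enumerating all subsets of
    predictors under Zellner's g-prior with a uniform model prior.
    Bayes factor of model M (with p_M predictors) vs. the null model:
        BF = (1+g)^((n-1-p_M)/2) / (1 + g*(1-R2_M))^((n-1)/2)."""
    y = np.asarray(y, dtype=float)
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    g = float(n) if g is None else g            # unit-information default
    yc = y - y.mean()
    tss = yc @ yc
    log_bf = {}
    for k in range(p + 1):
        for subset in itertools.combinations(range(p), k):
            if k == 0:
                log_bf[subset] = 0.0            # null model as benchmark
                continue
            Xs = X[:, subset] - X[:, subset].mean(axis=0)
            beta = np.linalg.lstsq(Xs, yc, rcond=None)[0]
            r2 = 1.0 - np.sum((yc - Xs @ beta) ** 2) / tss
            log_bf[subset] = (0.5 * (n - 1 - k) * np.log1p(g)
                              - 0.5 * (n - 1) * np.log1p(g * (1.0 - r2)))
    lse = np.logaddexp.reduce(list(log_bf.values()))
    incl = np.zeros(p)
    for model, lbf in log_bf.items():
        for j in model:
            incl[j] += np.exp(lbf - lse)        # sum model probabilities
    return incl
```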
By:  Eduardo Rossi (Department of Economics and Management, University of Pavia); Paolo Santucci de Magistris (School of Economics and Management, Aarhus University, CREATES) 
Abstract:  A stylized fact is that realized variance has long memory. We show that, when the instantaneous volatility is a long memory process of order d, the integrated variance is characterized by the same long-range dependence. We prove that the spectral density of realized variance is given by the sum of the spectral density of the integrated variance plus that of a measurement error, due to the sparse sampling and market microstructure noise. Hence, the realized volatility has the same degree of long memory as the integrated variance. The additional term in the spectral density induces a finite-sample bias in the semiparametric estimates of the long memory. A Monte Carlo simulation provides evidence that the corrected local Whittle estimator of Hurvich et al. (2005) is much less biased than the standard local Whittle estimator and the empirical application shows that it is robust to the choice of the sampling frequency used to compute the realized variance. Finally, the empirical results suggest that the volatility series are more likely to be generated by a nonstationary fractional process.
Keywords:  Realized variance, Long memory stochastic volatility, Measurement error, local Whittle estimator. 
JEL:  C14 C22 C58 
Date:  2012–11 
URL:  http://d.repec.org/n?u=RePEc:pav:demwpp:017&r=ets 
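A minimal numpy sketch of the standard (uncorrected) local Whittle estimator discussed above, together with a crude fractional-noise simulator for checking it; this is not the corrected estimator of Hurvich et al. (2005):

```python
import numpy as np

def local_whittle_d(x, m=None):
    """Local Whittle estimate of the memory parameter d: minimize
    R(d) = log(mean_j lam_j^(2d) I_j) - 2d * mean_j log(lam_j)
    over the first m Fourier frequencies (plain grid search)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    m = int(n ** 0.65) if m is None else m
    lam = 2.0 * np.pi * np.arange(1, m + 1) / n
    I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2.0 * np.pi * n)
    mean_log_lam = np.mean(np.log(lam))
    grid = np.linspace(-0.49, 0.99, 500)
    obj = [np.log(np.mean(lam ** (2.0 * d) * I)) - 2.0 * d * mean_log_lam
           for d in grid]
    return grid[int(np.argmin(obj))]

def fracdiff_noise(n, d, rng):
    """Fractional noise via a truncated MA expansion of (1-L)^(-d)."""
    psi = np.ones(n)
    for k in range(1, n):
        psi[k] = psi[k - 1] * (k - 1 + d) / k   # binomial coefficients
    eps = rng.normal(size=2 * n)
    return np.convolve(eps, psi)[n - 1:2 * n - 1]
```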
By:  Eduardo Rossi (Department of Economics and Management, University of Pavia); Dean Fantazzini (Moscow School of Economics, M.V. Lomonosov Moscow State University) 
Abstract:  Intraday return volatilities are characterized by the contemporaneous presence of periodicity and long memory. This paper proposes two new parameterizations of the intraday volatility: the Fractionally Integrated Periodic EGARCH and the Seasonal Fractionally Integrated Periodic EGARCH, which provide the required flexibility to account for both features. The periodic kurtosis and periodic autocorrelations of power transformations of the absolute returns are computed for both models. The empirical application shows that the volatility of hourly E-mini S&P 500 futures returns is characterized by a periodic leverage effect coupled with a statistically significant long-range dependence. An out-of-sample forecasting comparison with alternative models shows that a constrained version of the FI-PEGARCH provides superior forecasts. A simulation experiment is carried out to investigate the effects that sample frequency has on the fractional differencing parameter estimate.
Keywords:  Intraday volatility, Long memory, FI-PEGARCH, SFI-PEGARCH, Periodic models.
JEL:  C22 C58 G13 
Date:  2012–11 
URL:  http://d.repec.org/n?u=RePEc:pav:demwpp:015&r=ets 
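A common first step with intraday models of this kind is to pin down the deterministic periodic pattern before fitting the long-memory dynamics. A simple (illustrative, not the paper's) estimator averages absolute returns by intraday interval:

```python
import numpy as np

def periodicity_factor(returns, period):
    """Diurnal (periodic) volatility factor: average absolute return for
    each intraday interval, normalized to mean one. Dividing returns by
    their interval's factor removes the deterministic periodic pattern."""
    r = np.asarray(returns, dtype=float)
    n = (len(r) // period) * period          # drop any incomplete final day
    factor = np.abs(r[:n]).reshape(-1, period).mean(axis=0)
    return factor / factor.mean()
```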
By:  Adam McCloskey; Pierre Perron 
Abstract:  We propose estimators of the memory parameter of a time series that are robust to a wide variety of random level shift processes, deterministic level shifts and deterministic time trends. The estimators are simple trimmed versions of the popular log-periodogram regression estimator that employ certain sample size-dependent and, in some cases, data-dependent trimmings which discard low-frequency components. We also show that a previously developed trimmed local Whittle estimator is robust to the same forms of data contamination. Regardless of whether the underlying long/short-memory process is contaminated by level shifts or deterministic trends, the estimators are consistent and asymptotically normal with the same limiting variance as their standard untrimmed counterparts. Simulations show that the trimmed estimators perform their intended purpose quite well, substantially decreasing both finite sample bias and root mean-squared error in the presence of these contaminating components. Furthermore, we assess the tradeoffs involved with their use when such components are not present but the underlying process exhibits strong short-memory dynamics or is contaminated by noise. To balance the potential finite sample biases involved in estimating the memory parameter, we recommend a particular adaptive version of the trimmed log-periodogram estimator that performs well in a wide variety of circumstances. We apply the estimators to stock market volatility data to find that various time series typically thought to be long-memory processes actually appear to be short or very weak long-memory processes contaminated by level shifts or deterministic trends.
Keywords:  long-memory processes, semiparametric estimators, level shifts, structural change, deterministic trends
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:bro:econwp:201215&r=ets 
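The trimming idea can be sketched as follows (an illustrative numpy version of the log-periodogram regression, not the authors' adaptive estimator): discard the lowest frequencies, where a level shift dominates the periodogram, before running the GPH regression.

```python
import numpy as np

def gph_d(x, m, trim=0):
    """Log-periodogram (GPH) estimate of the memory parameter d from
    frequencies j = trim+1, ..., m: regress log I_j on -2*log(lam_j)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    j = np.arange(trim + 1, m + 1)
    lam = 2.0 * np.pi * j / n
    I = np.abs(np.fft.fft(x - x.mean())[j]) ** 2 / (2.0 * np.pi * n)
    return np.polyfit(-2.0 * np.log(lam), np.log(I), 1)[0]
```

Applied to white noise contaminated by a single level shift (true d = 0), the untrimmed estimate is spuriously positive while the trimmed one is much closer to zero, which is the pattern the abstract describes.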
By:  Adam McCloskey 
Abstract:  I provide conditions under which the trimmed FDQML estimator, advanced by McCloskey (2010) in the context of fully parametric short-memory models, can be used to estimate the long-memory stochastic volatility model parameters in the presence of additive low-frequency contamination in log-squared returns. The types of low-frequency contamination covered include level shifts as well as deterministic trends. I establish consistency and asymptotic normality in the presence or absence of such low-frequency contamination under certain conditions on the growth rate of the trimming parameter. I also provide theoretical guidance on the choice of trimming parameter by heuristically obtaining its asymptotic MSE-optimal rate under certain types of low-frequency contamination. A simulation study examines the finite sample properties of the robust estimator, showing substantial gains from its use in the presence of level shifts. The finite sample analysis also explores how different levels of trimming affect the parameter estimates in the presence and absence of low-frequency contamination and long-memory.
Keywords:  stochastic volatility, frequency domain estimation, robust estimation, spurious persistence, long-memory, level shifts, structural change, deterministic trends
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:bro:econwp:201217&r=ets 
By:  Nikolaus Hautsch; Julia Schaumburg; Melanie Schienle
Abstract:  Multiplicative error models (MEM) have become a standard tool for modeling conditional durations of intraday transactions, realized volatilities and trading volumes. Parametric estimation of the corresponding multivariate model, the so-called vector MEM (VMEM), requires a specification of the joint error term distribution, which, due to the lack of standard multivariate distribution functions on R^d_+, is defined via a copula. Maximum likelihood estimation is based on the assumption of constant copula parameters and therefore leads to invalid inference if the dependence exhibits time variation or structural breaks. Hence, we suggest testing for time-varying dependence by calibrating a time-varying copula model and re-estimating the VMEM based on identified intervals of homogeneous dependence. This paper summarizes the important aspects of (V)MEM, its estimation, and a sequential test for changes in the dependence structure. The techniques are applied in an empirical example.
Keywords:  vector multiplicative error model, copula, time-varying copula, high-frequency data
JEL:  C32 C51 
Date:  2012–09 
URL:  http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2012054&r=ets 
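To fix ideas on the univariate building block, here is a sketch of a MEM(1,1) fitted by exponential quasi-maximum likelihood with scipy (illustrative names; the paper's VMEM and copula machinery are not reproduced):

```python
import numpy as np
from scipy.optimize import minimize

def mem_neg_loglik(params, x):
    """Negative exponential quasi-log-likelihood of a MEM(1,1):
    x_t = mu_t * eps_t,  mu_t = omega + alpha * x_{t-1} + beta * mu_{t-1}."""
    omega, alpha, beta = params
    if omega <= 0.0 or alpha < 0.0 or beta < 0.0 or alpha + beta >= 1.0:
        return 1e10                       # crude stationarity penalty
    mu = np.empty_like(x)
    mu[0] = x.mean()                      # initialize at the sample mean
    for t in range(1, len(x)):
        mu[t] = omega + alpha * x[t - 1] + beta * mu[t - 1]
    return float(np.sum(np.log(mu) + x / mu))

def fit_mem(x):
    res = minimize(mem_neg_loglik, x0=np.array([0.1, 0.1, 0.8]), args=(x,),
                   method="Nelder-Mead", options={"maxiter": 2000})
    return res.x
```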
By:  Alessandro Giovannelli (Department of Economics and Finance, University of Rome "Tor Vergata") 
Abstract:  The primary objective of this paper is to propose two nonlinear extensions for macroeconomic forecasting using large datasets. First, we propose an alternative technique for factor estimation, i.e., kernel principal component analysis, which allows the factors to have a nonlinear relationship to the input variables. Second, we propose artificial neural networks as an alternative to the factor augmented linear forecasting equation. These two extensions allow us to determine whether, in general, there is empirical evidence in favor of nonlinear methods and, in particular, to verify whether the nonlinearity occurs in the estimation of the factors or in the functional form that links the target variable to the factors. In an effort to verify the empirical performances of the methods proposed, we conducted several pseudo forecasting exercises on the industrial production index and consumer price index for the Euro area and US economies. These methods were employed to construct the forecasts at 1, 3, 6, and 12-month horizons using a large dataset containing 259 predictors for the Euro area and 131 predictors for the US economy. The results obtained from the empirical study suggest that the estimation of nonlinear factors, using kernel principal components, significantly improves the quality of forecasts compared to the linear method, while artificial neural networks show the same forecasting ability as the factor augmented linear forecasting equation.
Keywords:  Kernel Principal Component Analysis; Large Dataset; Artificial Neural Networks; QuickNet; Forecasting 
JEL:  C45 C53 C13 C33 
Date:  2012–11–07 
URL:  http://d.repec.org/n?u=RePEc:rtv:ceisrp:255&r=ets 
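A self-contained sketch of the kernel PCA factor-extraction step (RBF kernel, numpy eigendecomposition; the bandwidth default here is an arbitrary illustration, not the paper's choice):

```python
import numpy as np

def kernel_pca_factors(X, n_factors, gamma=None):
    """Nonlinear factors via kernel PCA: eigendecompose the double-centered
    RBF kernel matrix and return the leading components, a drop-in
    replacement for linear principal-component factors."""
    X = np.asarray(X, dtype=float)
    n = X.shape[0]
    gamma = 1.0 / X.shape[1] if gamma is None else gamma
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    J = np.eye(n) - np.ones((n, n)) / n        # centering in feature space
    Kc = J @ K @ J
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_factors]   # largest eigenvalues first
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))
```

The returned columns can then enter a forecasting regression exactly where linear principal-component factors would.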
By:  Shinichiro Shirota (Graduate School of Economics, University of Tokyo); Takayuki Hizu (Mitsubishi UFJ Trust and Banking); Yasuhiro Omori (Faculty of Economics, University of Tokyo) 
Abstract:  The daily return and the realized volatility are simultaneously modeled in the stochastic volatility model with leverage and long memory. The dependent variable in the stochastic volatility model is the logarithm of the squared return, and its error distribution is approximated by a mixture of normals. In addition, we incorporate the logarithm of the realized volatility into the measurement equation, assuming that the latent log volatility follows an Autoregressive Fractionally Integrated Moving Average (ARFIMA) process to describe its long memory property. Using a state space representation, we propose an efficient Bayesian estimation method implemented using Markov chain Monte Carlo (MCMC) methods. Model comparisons are performed based on the marginal likelihood, and the volatility forecasting performances are investigated using S&P500 stock index returns.
Date:  2012–11 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2012cf869&r=ets 
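The linearization behind the measurement equation can be checked numerically: for r_t = exp(h_t/2) * eps_t, log r_t^2 = h_t + log eps_t^2, and log eps_t^2 is a log chi-squared(1) variable with mean about -1.27 and variance pi^2/2, which is what the mixture-of-normals approximation targets.

```python
import numpy as np

# For eps ~ N(0,1), log(eps^2) is log chi-squared(1): non-Gaussian, with
# mean psi(1/2) + log(2) = -1.2704 and variance psi'(1/2) = pi^2/2 = 4.93.
# Approximating it by a normal mixture makes the state space conditionally
# Gaussian, enabling efficient MCMC sampling of the latent log volatility.
rng = np.random.default_rng(0)
z = rng.normal(size=200_000)
log_eps2 = np.log(z ** 2)
print(round(log_eps2.mean(), 2), round(log_eps2.var(), 2))
```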