
New Economics Papers on Econometric Time Series 
Issue of 2021–02–01
Twelve papers chosen by Jaqueson K. Galimberti, Auckland University of Technology 
By:  Wang, Xiaohu (Fudan University); Xiao, Weilin (Zhejiang University); Yu, Jun (School of Economics, Singapore Management University) 
Abstract:  This paper derives asymptotic properties of the least squares estimator of the autoregressive parameter in local to unity processes whose errors are fractional Gaussian noises with Hurst parameter H. It is shown that the estimator is consistent when H ∈ (0, 1). Moreover, the rate of convergence is n when H ∈ [0.5, 1), and n^{2H} when H ∈ (0, 0.5). Furthermore, the limit distribution of the centered least squares estimator depends on H. When H = 0.5, the limit distribution is the same as that obtained in Phillips (1987a) for the local to unity model with errors to which the standard functional central limit theorem is applicable. When H > 0.5 or when H … 
Keywords:  Least squares; Local to unity; Fractional Brownian motion; Fractional Ornstein–Uhlenbeck process 
JEL:  C22 
Date:  2020–12–23 
URL:  http://d.repec.org/n?u=RePEc:ris:smuesw:2020_027&r=all 
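A minimal sketch of the local-to-unity setup in the abstract above, using i.i.d. Gaussian errors (the H = 0.5 special case) rather than the paper's fractional Gaussian noise; all parameter values are illustrative, not the paper's:

```python
import numpy as np

def ls_autoregressive(y):
    """Least squares estimate of rho in y_t = rho * y_{t-1} + u_t (no intercept)."""
    return np.dot(y[:-1], y[1:]) / np.dot(y[:-1], y[:-1])

def simulate_local_to_unity(n, c, rng):
    """Local-to-unity AR(1): rho_n = 1 + c/n, here with i.i.d. Gaussian errors
    (the H = 0.5 case; the paper allows general Hurst parameters H in (0, 1))."""
    rho = 1.0 + c / n
    u = rng.standard_normal(n)
    y = np.empty(n)
    y[0] = u[0]
    for t in range(1, n):
        y[t] = rho * y[t - 1] + u[t]
    return y, rho

rng = np.random.default_rng(0)
y, rho = simulate_local_to_unity(5000, -5.0, rng)
rho_hat = ls_autoregressive(y)
```

With rate-n convergence in this H = 0.5 case, the estimation error shrinks quickly as the sample grows.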
By:  Justyna Wróblewska 
Abstract:  The paper aims to develop a Bayesian seasonally cointegrated model for quarterly data. We propose the prior structure, derive the set of full conditional posterior distributions, and propose the sampling scheme. The identification of cointegrating spaces is obtained via orthonormality restrictions imposed on the vectors spanning them. In the case of annual frequency, the cointegrating vectors are complex, which should be taken into account when identifying them. The point estimation of the cointegrating spaces is also discussed. The presented methods are illustrated by a simulation experiment and are employed in an analysis of money and prices in the Polish economy. 
Date:  2020–12 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2012.14820&r=all 
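The orthonormality identification mentioned above can be sketched in the real-valued case: any basis of a cointegrating space can be replaced by an orthonormal one spanning the same space. This is a generic illustration (QR-based), not the paper's sampling scheme:

```python
import numpy as np

def orthonormal_basis(beta):
    """Orthonormal basis for the space spanned by the columns of beta —
    the kind of normalization used to identify a cointegrating space."""
    q, _ = np.linalg.qr(beta)
    return q

beta = np.array([[1.0, 0.0],
                 [2.0, 1.0],
                 [0.0, 3.0]])
b = orthonormal_basis(beta)

# b spans the same plane in R^3 as beta: the two projection matrices coincide
proj_beta = beta @ np.linalg.pinv(beta)
proj_b = b @ b.T
```

Only the space, not the particular basis, is identified; the orthonormal representative pins down one member of each equivalence class.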
By:  Pacifico, Antonio 
Abstract:  The paper suggests and develops a computational approach to improve hierarchical fuzzy clustering time-series analysis when accounting for high-dimensionality and noise problems in dynamic data. A Robust Weighted Distance measure between pairs of sets of AutoRegressive Integrated Moving Average models is used. It is robust because Bayesian Model Selection methodology is performed with a set of conjugate informative priors in order to discover the most probable set of clusters capturing different dynamics and interconnections among time-varying data, and weighted because each time-series is 'adjusted' by its own Posterior Model Size distribution in order to group dynamic data objects into 'ad hoc' homogeneous clusters. Monte Carlo methods are used to compute exact posterior probabilities for each chosen cluster and thus avoid the problem of increasing the overall probability of errors that plagues classical statistical methods based on significance tests. Empirical and simulated examples describe the functioning and the performance of the procedure. Comparisons with related works and possible extensions of the methodology to jointly deal with endogeneity issues and misspecified dynamics in high-dimensional multi-country setups are also presented. 
Keywords:  Distance Measures; Fuzzy Clustering; ARIMA TimeSeries; Bayesian Model Selection; MCMC Integrations. 
JEL:  C1 C52 C61 
Date:  2020 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:104379&r=all 
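A toy sketch of a weighted distance between the dynamics of two series. For illustration the full ARIMA fit is collapsed to a single AR(1) coefficient and the posterior-model-size weights are replaced by plain scalars — both are simplifying assumptions, not the paper's construction:

```python
import numpy as np

def ar1_coef(x):
    """OLS AR(1) coefficient — a one-number stand-in for a full ARIMA fit."""
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])

def weighted_distance(x, y, w_x=1.0, w_y=1.0):
    """Toy weighted distance between two series' dynamics; in the paper the
    weights come from each series' Posterior Model Size distribution."""
    return abs(w_x * ar1_coef(x) - w_y * ar1_coef(y))

rng = np.random.default_rng(3)
a = np.zeros(500)
b = np.zeros(500)
e = rng.standard_normal((2, 500))
for t in range(1, 500):
    a[t] = 0.8 * a[t - 1] + e[0, t]   # persistent series
    b[t] = -0.5 * b[t - 1] + e[1, t]  # oscillating series
d = weighted_distance(a, b)
```

Series with clearly different dynamics end up far apart under such a measure, which is what the clustering step exploits.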
By:  Eiji Kurozumi; Anton Skrobotov; Alexey Tsarev 
Abstract:  This paper is devoted to testing for an explosive bubble under time-varying nonstationary volatility. Because the limiting distribution of the seminal Phillips et al. (2011) test depends on the variance function and usually requires a bootstrap implementation under heteroskedasticity, we construct the test based on a deformation of the time domain. The proposed test is asymptotically pivotal under the null hypothesis, and its limiting distribution coincides with that of the standard test under homoskedasticity, so that the test does not require computationally extensive methods for inference. Appealing finite sample properties are demonstrated through Monte Carlo simulations. An empirical application demonstrates that the upsurge behavior of cryptocurrency time series in the middle of the sample is partially explained by the volatility change. 
Date:  2020–12 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2012.13937&r=all 
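The time-deformation idea can be sketched as re-reading the sample on a 'clock' under which cumulative variance accrues linearly, so a heteroskedastic series looks homoskedastic on the new time scale. This is a generic illustration of the idea, not the paper's exact construction:

```python
import numpy as np

def deformed_clock(u):
    """New 'clock' tau_t = sum_{s<=t} u_s^2 / sum_s u_s^2: cumulative variance
    accrues linearly in tau (a sketch of variance-based time deformation)."""
    v = np.cumsum(u ** 2)
    return v / v[-1]

rng = np.random.default_rng(4)
# volatility doubles half-way through the sample
sigma = np.where(np.arange(1000) < 500, 1.0, 2.0)
u = sigma * rng.standard_normal(1000)
tau = deformed_clock(u)
```

The calm first half of the sample occupies far less than half of the deformed clock, because the clock ticks in units of realized variance.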
By:  Sam Ouliaris; Adrian Pagan 
Abstract:  When sign restrictions are used in SVARs, impulse responses are only set identified. If sign restrictions are given for just a single shock, the shocks may not be separated, and the resulting structural equations can be unacceptable. Thus, in a supply-demand model, if signs are given only for the impulse responses to a demand shock, the SVAR may end up containing two supply curves. One needs to find the identified set so that this effect is excluded. Granziera et al.'s (2018) frequentist approach to inference potentially suffers from this issue. One also has to recognize that the identified set should be adjusted so that it produces responses to the same-sized shock. Finally, because researchers are often unwilling to set out sign restrictions to separate all shocks, we describe how this can be done with a SVAR/VAR system rather than a straight SVAR. 
Keywords:  SVAR, Sign Restrictions, Identified Set 
JEL:  E37 C51 C52 
Date:  2020–11 
URL:  http://d.repec.org/n?u=RePEc:een:camaaa:2020101&r=all 
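The standard sign-restriction algorithm underlying the discussion above can be sketched in a bivariate model: draw random rotations of the Cholesky factor and keep only impact matrices whose restricted column satisfies the signs. Parameter values are illustrative:

```python
import numpy as np

def candidate_impacts(sigma, n_draws, rng):
    """Draw rotations Q and keep impact matrices B = chol(sigma) @ Q whose first
    column (the 'demand shock') moves both variables up — a toy sign-restriction
    check; the unrestricted second column is left free, which is exactly how an
    unwanted second 'supply curve' can slip into the accepted set."""
    p = np.linalg.cholesky(sigma)
    kept = []
    for _ in range(n_draws):
        th = rng.uniform(0.0, 2.0 * np.pi)
        q = np.array([[np.cos(th), -np.sin(th)],
                      [np.sin(th),  np.cos(th)]])
        b = p @ q
        if np.all(b[:, 0] > 0):
            kept.append(b)
    return kept

rng = np.random.default_rng(5)
sigma = np.array([[1.0, 0.4],
                  [0.4, 1.0]])
accepted = candidate_impacts(sigma, 500, rng)
```

Every accepted B reproduces the reduced-form covariance (B B' = sigma), which is why signs on one shock alone cannot pin the system down.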
By:  Lopes Moreira Da Veiga, María Helena; Rue, Havard; Marín Díazaraque, Juan Miguel; Zea Bermudez, P. De 
Abstract:  The aim of the paper is to implement the integrated nested Laplace approximations (INLA), known to be very fast and efficient, for a threshold stochastic volatility model. INLA replaces MCMC simulations with accurate deterministic approximations. We use proper, although not very informative, priors and Penalizing Complexity (PC) priors. The simulation results favor the use of PC priors, especially when the sample size varies from small to moderate. For these sample sizes, they provide more accurate estimates of the model's parameters, but as the sample size increases both types of priors lead to reliable estimates of the parameters. We also validate the estimation method in-sample and out-of-sample by applying it to six series of returns, including stock market, commodity, and cryptocurrency returns, and by forecasting their one-day-ahead volatilities, respectively. Our empirical results support that the TSV model does a good job in forecasting the one-day-ahead volatility of stock market and gold returns but faces difficulties when the volatility of returns is extreme, which occurs in the case of cryptocurrencies. 
Keywords:  Threshold Stochastic Volatility Model; PC Priors; INLA 
JEL:  C58 C52 C32 C13 
Date:  2021–01–27 
URL:  http://d.repec.org/n?u=RePEc:cte:wsrepe:31804&r=all 
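A threshold stochastic volatility process of the general kind studied above can be simulated as follows: log-volatility persistence switches with the sign of the previous return. The parameterization is illustrative and is not taken from the paper:

```python
import numpy as np

def simulate_tsv(n, mu, phi_pos, phi_neg, sigma_eta, rng):
    """Threshold SV sketch: h_t follows an AR(1) in logs whose persistence
    depends on the sign of the previous return y_{t-1}; returns are
    y_t = exp(h_t / 2) * z_t with standard normal z_t."""
    h = np.zeros(n)
    y = np.zeros(n)
    for t in range(1, n):
        phi = phi_pos if y[t - 1] >= 0 else phi_neg
        h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.standard_normal()
        y[t] = np.exp(h[t] / 2.0) * rng.standard_normal()
    return y, h

rng = np.random.default_rng(6)
y, h = simulate_tsv(2000, -1.0, 0.95, 0.90, 0.2, rng)
```

The regime-switching persistence is what lets the model capture asymmetric volatility responses to good and bad news.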
By:  Sayar Karmakar; Marek Chudy; Wei Biao Wu 
Abstract:  Accurate forecasting is one of the fundamental focuses of the econometric time-series literature. Practitioners and policy makers often want to predict outcomes over an entire future time horizon rather than just a single $k$-step-ahead prediction. These series, apart from their own possible nonlinear dependence, are often also influenced by many external predictors. In this paper, we construct prediction intervals of time-aggregated forecasts in a high-dimensional regression setting. Our approach is based on quantiles of residuals obtained by the popular LASSO routine. We allow for general heavy-tailed, long-memory, and nonlinear stationary error processes and stochastic predictors. Through a series of systematically arranged consistency results, we provide theoretical guarantees for our proposed quantile-based method in all of these scenarios. After validating our approach using simulations, we also propose a novel bootstrap-based method that can boost the coverage of the theoretical intervals. Finally, analyzing the EPEX Spot data, we construct prediction intervals for hourly electricity prices over horizons spanning 17 weeks and contrast them with selected Bayesian and bootstrap interval forecasts. 
Date:  2020–12 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2012.08223&r=all 
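The residual-quantile idea above can be sketched in a few lines. As simplifying assumptions, OLS stands in for the paper's LASSO fit and a single-step interval stands in for the time-aggregated construction:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 400, 5
X = rng.standard_normal((n, p))
beta = np.array([1.0, -0.5, 0.0, 0.0, 2.0])
y = X @ beta + rng.standard_normal(n)

# fit on a training window (OLS stand-in for the LASSO step)
ntr = 300
bhat, *_ = np.linalg.lstsq(X[:ntr], y[:ntr], rcond=None)
resid = y[:ntr] - X[:ntr] @ bhat

# nominal 90% prediction interval from the empirical residual quantiles
lo, hi = np.quantile(resid, [0.05, 0.95])
pred = X[ntr:] @ bhat
coverage = np.mean((y[ntr:] >= pred + lo) & (y[ntr:] <= pred + hi))
```

On well-behaved data the empirical coverage sits near the nominal level; the paper's bootstrap step is aimed at boosting coverage in harder settings.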
By:  Boeckelmann Lukas; Stalla-Bourdillon Arthur 
Abstract:  We propose a novel approach to quantify spillovers on financial markets based on a structural version of the Diebold-Yilmaz framework. Key to our approach is a SVAR-GARCH model that is statistically identified by heteroskedasticity, economically identified by maximum shock contribution, and that allows for time-varying forecast error variance decompositions. We analyze credit risk spillovers between EZ sovereign and bank CDS. Methodologically, we find the model to better match economic narratives compared with common spillover approaches and to be more reactive than models relying on rolling window estimations. We find that, on average, spillovers explain 37% of the variation in our sample, although this share varies strongly over time. 
Keywords:  CDS, spillover, sovereign debt, systemic risk, SVAR, identification by heteroskedasticity 
JEL:  C58 G01 G18 G21 
Date:  2021 
URL:  http://d.repec.org/n?u=RePEc:bfr:banfra:798&r=all 
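The time-varying variance decomposition above can be sketched at the one-step horizon: given an impact matrix B and structural shock variances, each shock's share of a variable's forecast error variance follows directly, and letting the shock variances move over time (as a GARCH process would) makes the shares time-varying. Numbers are illustrative:

```python
import numpy as np

def fevd_shares(B, lam):
    """One-step forecast-error-variance shares, given impact matrix B and
    structural shock variances lam. Entry (i, j) is shock j's share of the
    forecast error variance of variable i."""
    contrib = (B ** 2) * lam
    return contrib / contrib.sum(axis=1, keepdims=True)

B = np.array([[1.0, 0.2],
              [0.5, 1.0]])
shares_calm = fevd_shares(B, np.array([1.0, 1.0]))
shares_crisis = fevd_shares(B, np.array([1.0, 9.0]))  # shock 2's variance spikes
```

When one shock's variance spikes, its spillover share rises mechanically even though the impact matrix is unchanged — the channel through which such models generate time variation.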
By:  Isao Shoji; Masahiro Nozawa 
Abstract:  A geometric method to analyze nonlinear oscillations is discussed. We consider a nonlinear oscillation modeled by a second order ordinary differential equation without specifying the functional form. By transforming the differential equation into a system of first order ordinary differential equations, the trajectory is embedded in $R^3$ as a curve, and thereby the time evolution of the original state can be translated into the behavior of the curve in $R^3$, or the vector field along the curve. We analyze the vector field to investigate the dynamic properties of a nonlinear oscillation. Although the functional form of the model is unspecified, the vector fields and their associated quantities can be estimated by a nonparametric filtering method. We apply the proposed analysis to the time series of the Japanese stock price index. The application shows that the vector field and its derivative can be used as tools for picking up various signals that aid understanding of the dynamic properties of the stock price index. 
Date:  2020–12 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2012.11825&r=all 
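The embedding step above can be sketched with finite differences: a scalar oscillation x(t) becomes the curve (t, x, x') in R^3, and the vector field along the curve is its tangent. A sine wave stands in for the observed series; the paper's nonparametric filtering is not reproduced here:

```python
import numpy as np

t = np.linspace(0.0, 10.0, 1001)
x = np.sin(t)                              # stand-in for the observed oscillation
v = np.gradient(x, t)                      # numerical x'(t)
curve = np.column_stack([t, x, v])         # trajectory embedded in R^3
field = np.gradient(curve, t, axis=0)      # vector field (tangent) along the curve
```

The first component of the tangent is identically one (time flows at unit speed), so all the dynamics live in the other two components.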
By:  Antonio Martin Arroyo; Aranzazu de Juan Fernandez 
Abstract:  This paper considers the Split-Then-Combine (STC) approach (Arroyo and de Juan, 2014) to combine forecasts inside the simplex space, the sample space of positive weights adding up to one. As it turns out, the simplicial statistic given by the center of the simplex compares favorably against the fixed-weight, average forecast. In addition, we develop a Combine-After-Selection (CAS) method to get rid of redundant forecasters. We apply these two approaches to make out-of-sample one-step-ahead combinations and subcombinations of forecasts for several economic variables. This methodology is particularly useful when the sample size is smaller than the number of forecasts, a case where other methods (e.g., Least Squares (LS) or Principal Component Analysis (PCA)) are not applicable. 
Date:  2020–12 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2012.11935&r=all 
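Combination weights living in the simplex, and the center-of-simplex combination, can be sketched as follows; the numbers are illustrative and the CAS selection step is not shown:

```python
import numpy as np

def simplex_center(k):
    """Center of the (k-1)-simplex: equal positive weights summing to one."""
    return np.full(k, 1.0 / k)

# rows: forecast periods; columns: individual forecasters
forecasts = np.array([[1.0, 1.2, 0.9],
                      [2.0, 2.1, 1.8]])
w = simplex_center(forecasts.shape[1])
combined = forecasts @ w
```

At the center of the simplex the combination reduces to the equal-weight average; the approach becomes non-trivial once weights are treated as compositional data away from the center.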
By:  Niko Hauzenberger; Florian Huber; Karin Klieber 
Abstract:  In this paper, we assess whether using nonlinear dimension reduction techniques pays off for forecasting inflation in real time. Several recent methods from the machine learning literature are adopted to map a large dimensional dataset into a lower-dimensional set of latent factors. We model the relationship between inflation and these latent factors using state-of-the-art time-varying parameter (TVP) regressions with shrinkage priors. Using monthly real-time data for the US, our results suggest that adding such nonlinearities yields forecasts that are, on average, highly competitive with those obtained from methods using linear dimension reduction techniques. Zooming into model performance over time, moreover, reveals that controlling for nonlinear relations in the data is of particular importance during recessionary episodes of the business cycle. 
Date:  2020–12 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2012.08155&r=all 
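The linear benchmark in the comparison above — extracting principal-component factors from a large dataset and regressing the target on them — can be sketched via the SVD. The data are synthetic and the TVP-with-shrinkage step is replaced by plain least squares:

```python
import numpy as np

rng = np.random.default_rng(7)
X = rng.standard_normal((200, 30))         # stand-in for a large monthly dataset
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
F = U[:, :3] * S[:3]                       # first three principal-component factors

y = F[:, 0] + 0.1 * rng.standard_normal(200)   # toy target driven by factor 1
coef, *_ = np.linalg.lstsq(F, y, rcond=None)
```

The nonlinear methods the paper adopts replace this linear map with a learned one, while the downstream regression on the latent factors stays the same in spirit.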
By:  Hubert Gabrisch (The Vienna Institute for International Economic Studies, wiiw) 
Abstract:  This study attempts to identify uncertainty in the long-term rate of interest based on the controversial interest rate theories of Keynes and Kalecki. While Keynes stated that the future of the rate of interest is uncertain because it is numerically incalculable, Kalecki was convinced that it could be predicted. The theories are empirically tested using a reduced-form GARCH-in-mean model applied to six globally leading financial markets. The obtained results support Keynes's theory – the long-term rate of interest is a non-ergodic financial phenomenon. Analyses of the relation between the interest rate and macroeconomic variables without interest uncertainty are thus seriously incomplete. 
Keywords:  uncertainty, interest rate, Keynes, Kalecki, GARCH 
JEL:  B26 C58 E43 E47 
Date:  2021–01 
URL:  http://d.repec.org/n?u=RePEc:wii:wpaper:191&r=all 
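The GARCH-in-mean structure used above can be sketched by simulation: the conditional variance follows a GARCH(1,1) recursion and also enters the mean equation, so uncertainty feeds back into the level of the rate. The parameterization is illustrative, not estimated:

```python
import numpy as np

def simulate_garch_in_mean(n, mu, delta, omega, alpha, beta, rng):
    """GARCH(1,1)-in-mean sketch: r_t = mu + delta*h_t + e_t with
    e_t = sqrt(h_t)*z_t and h_t = omega + alpha*e_{t-1}^2 + beta*h_{t-1}."""
    h = np.empty(n)
    r = np.empty(n)
    h[0] = omega / (1.0 - alpha - beta)    # unconditional variance as start value
    e_prev = 0.0
    for t in range(n):
        if t > 0:
            h[t] = omega + alpha * e_prev ** 2 + beta * h[t - 1]
        e = np.sqrt(h[t]) * rng.standard_normal()
        r[t] = mu + delta * h[t] + e
        e_prev = e
    return r, h

rng = np.random.default_rng(8)
r, h = simulate_garch_in_mean(3000, 0.0, 0.1, 0.05, 0.08, 0.9, rng)
```

A significant in-mean coefficient (delta) is the kind of evidence such models use to argue that uncertainty itself moves the rate.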