on Econometric Time Series |
By: | Guochang Wang; Ke Zhu; Guodong Li; Wai Keung Li |
Abstract: | Asymmetric power GARCH models have been widely used to study the higher-order moments of financial returns, but their quantile estimation has rarely been investigated. This paper introduces a simple monotonic transformation of the conditional quantile function that makes quantile regression tractable. The asymptotic normality of the resulting quantile estimators is established under either stationarity or non-stationarity. Moreover, based on the estimation procedure, new tests for strict stationarity and asymmetry are also constructed. This is the first attempt in the literature at quantile estimation for non-stationary ARCH-type models. The usefulness of the proposed methodology is illustrated by simulation results and real data analysis. |
Date: | 2019–11 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1911.09343&r=all |
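A minimal numpy sketch of the object being estimated in the abstract above: the conditional quantile implied by an asymmetric power GARCH(1,1) when the innovation distribution is known. The parameter values and the simulated series are illustrative, and the paper's transformation-based quantile-regression estimator is not reproduced.

```python
# Minimal sketch: conditional quantiles implied by an asymmetric power
# GARCH(1,1) process when the innovation distribution is known.
# Parameter values and the simulated series are illustrative only; this is
# NOT the transformation-based quantile-regression estimator of the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Asymmetric power GARCH(1,1):
#   sigma_t^delta = omega + alpha * (|r_{t-1}| - gamma * r_{t-1})^delta
#                   + beta * sigma_{t-1}^delta,   r_t = sigma_t * eps_t
omega, alpha, gamma, beta, delta = 0.05, 0.10, 0.30, 0.85, 1.5
n, tau = 1000, 0.05                      # sample size and quantile level

eps = rng.standard_normal(n)
sigma = np.empty(n)
r = np.empty(n)
sigma[0] = (omega / (1 - beta)) ** (1 / delta)   # simple initialization
r[0] = sigma[0] * eps[0]
for t in range(1, n):
    sigma[t] = (omega
                + alpha * (abs(r[t - 1]) - gamma * r[t - 1]) ** delta
                + beta * sigma[t - 1] ** delta) ** (1 / delta)
    r[t] = sigma[t] * eps[t]

# With zero conditional mean, the tau-th conditional quantile is
# Q_t(tau) = sigma_t * F_eps^{-1}(tau).
q_tau = sigma * stats.norm.ppf(tau)
print("share of returns below the 5% conditional quantile:",
      np.mean(r < q_tau))
```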
By: | Peter Boswijk (University of Amsterdam); Giuseppe Cavaliere (University of Bologna and Exeter Business School); Iliyan Georgiev (University of Bologna); Anders Rahbek (University of Copenhagen) |
Abstract: | To what extent can the bootstrap be applied to conditional mean models – such as regression or time series models – when the volatility of the innovations is random and possibly non-stationary? In fact, the volatility of many economic and financial time series displays persistent changes and possible non-stationarity. However, the theory of the bootstrap for such models has focused on deterministic changes of the unconditional variance and little is known about the performance and the validity of the bootstrap when the volatility is driven by a non-stationary stochastic process. This includes near-integrated exogenous volatility processes as well as near-integrated GARCH processes, where the conditional variance has a diffusion limit; a further important example is the case where volatility exhibits infrequent jumps. This paper fills this gap in the literature by developing conditions for bootstrap validity in time series and regression models with non-stationary, stochastic volatility. We show that in such cases the distribution of bootstrap statistics (conditional on the data) is random in the limit. Consequently, the conventional approaches to proofs of bootstrap consistency, based on the notion of weak convergence in probability of the bootstrap statistic, fail to deliver the required validity results. Instead, we use the concept of `weak convergence in distribution' to develop and establish novel conditions for validity of the wild bootstrap, conditional on the volatility process. We apply our results to several testing problems in the presence of non-stationary stochastic volatility, including testing in a location model, testing for structural change using CUSUM-type functionals, and testing for a unit root in autoregressive models. Importantly, we show that sufficient conditions for conditional wild bootstrap validity include the absence of statistical leverage effects, i.e., correlation between the error process and its future conditional variance. The results of the paper are illustrated using Monte Carlo simulations, which indicate that a wild bootstrap approach leads to size control even in small samples. |
Keywords: | Bootstrap, Non-stationary stochastic volatility, Random limit measures, Weak convergence in distribution |
JEL: | C32 |
Date: | 2019–12–01 |
URL: | http://d.repec.org/n?u=RePEc:tin:wpaper:20190083&r=all |
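A minimal sketch, under illustrative parameters, of the simplest testing problem mentioned in the abstract above: a wild bootstrap t-test of a zero mean in a location model with persistent stochastic volatility, using Rademacher multipliers. It illustrates the mechanics only, not the paper's validity theory.

```python
# Minimal sketch: wild bootstrap t-test of H0: mu = 0 in a location model
# y_t = mu + sigma_t * e_t with (possibly non-stationary) volatility.
# Illustrative only; it does not reproduce the paper's validity theory.
import numpy as np

rng = np.random.default_rng(1)
n, B = 200, 999

# Simulated data with a persistent, stochastic volatility path.
sigma = np.exp(np.cumsum(0.05 * rng.standard_normal(n)))
y = sigma * rng.standard_normal(n)          # true mu = 0

def t_stat(x):
    return np.sqrt(len(x)) * x.mean() / x.std(ddof=1)

t_obs = t_stat(y)
resid = y - y.mean()                        # centred residuals

# Wild bootstrap: multiply residuals by i.i.d. Rademacher draws, which
# preserves the heteroskedasticity pattern of the data.
t_boot = np.empty(B)
for b in range(B):
    w = rng.choice([-1.0, 1.0], size=n)
    t_boot[b] = t_stat(resid * w)

p_value = np.mean(np.abs(t_boot) >= np.abs(t_obs))
print("bootstrap p-value:", p_value)
```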
By: | Aastha M. Sathe; N. S. Upadhye |
Abstract: | In this article, we first propose the modified Hannan-Rissanen method for estimating the parameters of the autoregressive moving average (ARMA) process with symmetric stable noise and symmetric stable generalized autoregressive conditional heteroskedastic (GARCH) noise. Next, we propose the modified empirical characteristic function method for the estimation of GARCH parameters with symmetric stable noise. Further, we show the efficiency, accuracy, and simplicity of our methods through Monte Carlo simulations. Finally, we apply our proposed methods to model financial data. |
Date: | 2019–11 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1911.09985&r=all |
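A minimal sketch of the classical two-step Hannan-Rissanen regression for an ARMA(1,1), the procedure that the abstract above modifies for symmetric stable and stable-GARCH noise; the modification itself is not reproduced, and the data and lag orders are illustrative.

```python
# Minimal sketch of the classical two-step Hannan-Rissanen regression for an
# ARMA(1,1); the paper's modification for symmetric stable / stable-GARCH
# noise is not reproduced here. Data and orders are illustrative.
import numpy as np
from statsmodels.tsa.arima_process import arma_generate_sample
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(2)
# True model: y_t = 0.6 y_{t-1} + e_t + 0.3 e_{t-1}
y = arma_generate_sample(ar=[1, -0.6], ma=[1, 0.3], nsample=1000,
                         distrvs=rng.standard_normal)

# Step 1: fit a long autoregression to obtain proxy innovations.
k = 20
long_ar = AutoReg(y, lags=k).fit()
e_hat = long_ar.resid                    # e_hat[i] proxies the innovation at t = k + i

# Step 2: regress y_t on y_{t-1} and the lagged proxy innovation e_{t-1}.
y_t = y[k + 1:]
y_l1 = y[k:-1]
e_l1 = e_hat[:-1]
X = np.column_stack([y_l1, e_l1])
phi_hat, theta_hat = np.linalg.lstsq(X, y_t, rcond=None)[0]
print("AR estimate:", phi_hat, "MA estimate:", theta_hat)
```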
By: | Nikolaos Kourentzes; George Athanasopoulos |
Abstract: | Intermittent demand forecasting has been widely researched in the context of spare parts management. However, it is becoming increasingly relevant to many other areas, such as retailing, where at the very disaggregate level time series may be highly intermittent, but at more aggregate levels are likely to exhibit trends and seasonal patterns. The vast majority of intermittent demand forecasting methods are ill-suited to producing forecasts with such features. We propose using temporal hierarchies to produce forecasts that demonstrate these traits at the various aggregation levels, effectively informing the resulting intermittent forecasts of patterns that are identifiable only at higher levels. We conduct an empirical evaluation on real data and demonstrate statistically significant gains for both point and quantile forecasts. |
Keywords: | forecasting, temporal aggregation, temporal hierarchies, forecast combination, forecast reconciliation |
Date: | 2019 |
URL: | http://d.repec.org/n?u=RePEc:msh:ebswps:2019-27&r=all |
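A minimal sketch of the temporal-hierarchy idea described in the entry above, for an intermittent monthly series: build annual, quarterly and monthly base forecasts and reconcile them by a simple OLS projection. The base forecasts are naive averages and the weighting is the simplest possible; the paper's forecasting models and reconciliation weights are not reproduced.

```python
# Minimal sketch of temporal-hierarchy reconciliation for one future year of
# an intermittent monthly series, using annual, quarterly and monthly levels
# and simple OLS reconciliation (identity weights). Base forecasts are naive
# historical means; data are illustrative.
import numpy as np

rng = np.random.default_rng(3)

# Intermittent monthly history: mostly zeros with occasional demand.
y = rng.binomial(1, 0.3, size=60) * rng.poisson(4, size=60)

# Summing matrix S mapping 12 future months to [annual | quarters | months].
S = np.vstack([
    np.ones((1, 12)),                              # annual total
    np.kron(np.eye(4), np.ones((1, 3))),           # quarterly totals
    np.eye(12),                                    # monthly values
])

# Naive base forecasts at each level: the historical mean at that level.
monthly_base = np.full(12, y.mean())
quarterly_base = np.full(4, y.reshape(-1, 3).sum(axis=1).mean())
annual_base = np.array([y.reshape(-1, 12).sum(axis=1).mean()])
base = np.concatenate([annual_base, quarterly_base, monthly_base])

# OLS reconciliation: project base forecasts onto the coherent subspace.
beta = np.linalg.solve(S.T @ S, S.T @ base)        # reconciled monthly path
reconciled = S @ beta                              # coherent at all levels
print("reconciled monthly forecasts:", np.round(reconciled[-12:], 2))
```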
By: | Sebastian Ankargren; Måns Unosson; Yukai Yang |
Abstract: | We propose a Bayesian vector autoregressive (VAR) model for mixed-frequency data. Our model is based on the mean-adjusted parametrization of the VAR and allows for an explicit prior on the 'steady states' (unconditional means) of the included variables. Based on recent developments in the literature, we discuss extensions of the model that improve the flexibility of the modeling approach. These extensions include a hierarchical shrinkage prior for the steady-state parameters, and the use of stochastic volatility to model heteroskedasticity. We put the proposed model to use in a forecast evaluation using US data consisting of 10 monthly and 3 quarterly variables. The results show that the predictive ability typically benefits from using mixed-frequency data, and that improvements can be obtained for both monthly and quarterly variables. We also find that the steady-state prior generally enhances the accuracy of the forecasts, and that accounting for heteroskedasticity by means of stochastic volatility usually provides additional improvements, although not for all variables. |
Date: | 2019–11 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1911.09151&r=all |
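A small numpy/statsmodels sketch of the mean-adjusted ('steady-state') VAR parametrization referred to in the abstract above, showing how the intercept form maps into the steady state mu = (I - Pi_1)^{-1} c. It is single-frequency and frequentist; the mixed-frequency sampler, shrinkage prior and stochastic volatility are not reproduced.

```python
# Minimal sketch of the mean-adjusted ('steady-state') VAR parametrization:
# the intercept form  y_t = c + Pi_1 y_{t-1} + e_t  and the form
# (y_t - mu) = Pi_1 (y_{t-1} - mu) + e_t  are linked by mu = (I - Pi_1)^{-1} c.
# Single frequency, OLS only; data are illustrative.
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(4)

# Simulate a bivariate VAR(1) with known steady state mu = (2, -1).
Pi = np.array([[0.5, 0.1],
               [0.0, 0.4]])
mu = np.array([2.0, -1.0])
c = (np.eye(2) - Pi) @ mu
y = np.zeros((500, 2))
y[0] = mu
for t in range(1, 500):
    y[t] = c + Pi @ y[t - 1] + 0.2 * rng.standard_normal(2)

res = VAR(y).fit(1)
c_hat = res.intercept                      # estimated intercept vector
Pi_hat = res.coefs[0]                      # estimated lag-1 coefficient matrix
mu_hat = np.linalg.solve(np.eye(2) - Pi_hat, c_hat)
print("implied steady state:", mu_hat)     # should be close to (2, -1)
```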
By: | Karel Janda (Institute of Economic Studies, Faculty of Social Sciences, Charles University, Opletalova 26, 110 00, Prague, Czech Republic; Department of Banking and Insurance, Faculty of Finance and Accounting, University of Economics, Prague, Namesti Winstona Churchilla 4, 130 67 Prague, Czech Republic); Binyi Zhang (Institute of Economic Studies, Faculty of Social Sciences, Charles University, Opletalova 26, 110 00, Prague, Czech Republic) |
Abstract: | In this paper, we analyse the dynamic relationship among the Chinese renewable energy stock prices, the U.S. renewable energy stock prices, oil prices and technology stock prices. We apply a four-variable Lag Augmented Vector Autoregressive (LA-VAR) model to study the return interactions among the variables. Moreover, we also use Generalized Autoregressive Conditional Heteroskedasticity (GARCH) models to study the dynamic conditional volatility of the Chinese renewable energy stock prices. The empirical results indicate that both the return and the conditional volatility of the Chinese renewable energy stock prices can be explained by past movements of the U.S. renewable energy stock prices and technology stock prices. In addition, we find significant evidence of GARCH effects in the Chinese renewable energy stock prices. However, we find only weak statistical evidence of leverage effects in the Chinese renewable energy stock market. |
Keywords: | Renewable energy, Financial modeling, China |
JEL: | Q20 G15 |
Date: | 2019–05 |
URL: | http://d.repec.org/n?u=RePEc:fau:wpaper:wp2019_07&r=all |
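A minimal sketch of how the GARCH and leverage effects mentioned in the abstract above can be checked with the `arch` package: fit a GJR-GARCH(1,1) and inspect the asymmetry term. The series `china_re_returns` is a placeholder for index returns, not the paper's data, and the LA-VAR return analysis is not shown.

```python
# Minimal sketch: checking for GARCH and leverage effects in a return series
# with a GJR-GARCH(1,1) from the `arch` package. `china_re_returns` is a
# placeholder for percentage returns on a renewable-energy index.
import numpy as np
from arch import arch_model

rng = np.random.default_rng(5)
china_re_returns = rng.standard_normal(1500)   # replace with real returns

# o=1 adds the asymmetry (leverage) term; its significance is a simple check
# for leverage effects, while alpha/beta capture the GARCH effects.
model = arch_model(china_re_returns, mean="Constant", vol="GARCH",
                   p=1, o=1, q=1, dist="normal")
result = model.fit(disp="off")
print(result.summary())
print("leverage (gamma) estimate:", result.params["gamma[1]"])
```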
By: | Moura, Guilherme V.; Ruiz, Esther; Santos, André A. P. |
Abstract: | Modelling and forecasting high dimensional covariance matrices is a key challenge in data-rich environments involving even thousands of time series, since most of the available models suffer from the curse of dimensionality. In this paper, we challenge some popular multivariate GARCH (MGARCH) and Stochastic Volatility (MSV) models by fitting them to forecast the conditional covariance matrices of financial portfolios with dimension up to 1000 assets observed daily over a 30-year time span. The time evolution of the conditional variances and covariances estimated by the different models is compared and evaluated in the context of a portfolio selection exercise. We conclude that, in a realistic context in which transaction costs are taken into account, modelling the covariance matrices as latent Wishart processes delivers more stable optimal portfolio compositions and, consequently, higher Sharpe ratios. |
Keywords: | Stochastic Volatility; Risk-Adjusted Return; Portfolio Turnover; Minimum-Variance Portfolio; GARCH; Covariance Forecasting |
JEL: | G17 C53 |
Date: | 2019–11–30 |
URL: | http://d.repec.org/n?u=RePEc:cte:wsrepe:29291&r=all |
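A minimal sketch of the portfolio-selection step described in the abstract above: minimum-variance weights from a covariance forecast and the turnover that transaction costs penalize. A rolling sample covariance stands in for the MGARCH / MSV / Wishart forecasts compared in the paper.

```python
# Minimal sketch: minimum-variance weights from a forecast covariance matrix
# and the turnover that transaction costs penalize. The covariance forecast
# is a rolling sample covariance stand-in; data are illustrative.
import numpy as np

rng = np.random.default_rng(6)
T, N = 500, 50                               # days, assets
returns = 0.01 * rng.standard_normal((T, N))

def min_var_weights(cov):
    """w = Sigma^{-1} 1 / (1' Sigma^{-1} 1), the minimum-variance portfolio."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

window = 250
prev_w = np.full(N, 1.0 / N)
turnovers = []
for t in range(window, T):
    cov_forecast = np.cov(returns[t - window:t].T)   # stand-in forecast
    w = min_var_weights(cov_forecast)
    turnovers.append(np.abs(w - prev_w).sum())       # one-way turnover
    prev_w = w

print("average daily turnover:", np.mean(turnovers))
```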
By: | Riveros Gavilanes, John Michael |
Abstract: | This article performs simulations with different small samples, considering the regression techniques of OLS, Jackknife, Bootstrap, Lasso and Robust Regression, in order to establish the best approach in terms of lower bias and statistical significance under a pre-specified data generating process (DGP). The methodology consists of a DGP with 5 variables and 1 constant parameter, which was regressed across the simulations on a set of random normally distributed variables for sample sizes of 6, 10, 20 and 500. Using the expected values for each sample size, the accuracy of the estimators was calculated in terms of the relative bias of each technique. The results indicate that the Jackknife approach is more suitable for smaller sample sizes, as stated by Speed (1994), while the Bootstrap approach proved sensitive to small sample sizes, indicating that it might not be suitable for establishing significant relationships in the regressions. The Monte Carlo simulations also showed that when a significant relationship is found in small samples, this relationship tends to remain significant when the sample size is increased. |
Keywords: | Small sample size; Statistical significance; Regression; Simulations; Bias |
JEL: | C15 C19 C63 |
Date: | 2019–11–17 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:97017&r=all |
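A minimal sketch of the simulation design described in the abstract above: a linear DGP with five slopes and a constant, small samples, and relative bias of OLS versus a delete-one jackknife. The bootstrap, Lasso and robust-regression legs are omitted, the parameter values are illustrative, and the n = 6 case is skipped here because deleting an observation would leave fewer observations than parameters.

```python
# Minimal sketch: relative bias of OLS and the delete-one jackknife under a
# linear DGP with five slopes and a constant, across small sample sizes.
# Parameter values and the number of replications are illustrative.
import numpy as np

rng = np.random.default_rng(7)
beta = np.array([1.0, 0.5, -0.3, 0.8, -0.6, 0.2])    # constant + 5 slopes

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

def jackknife(X, y):
    # Bias-corrected jackknife: n * theta_hat - (n - 1) * mean(leave-one-out).
    n = len(y)
    loo = np.array([ols(np.delete(X, i, 0), np.delete(y, i))
                    for i in range(n)])
    return n * ols(X, y) - (n - 1) * loo.mean(axis=0)

def relative_bias(estimates):
    return np.abs(np.mean(estimates, axis=0) - beta) / np.abs(beta)

for n in (10, 20, 500):                               # sample sizes
    est_ols, est_jack = [], []
    for _ in range(500):                              # replications
        X = np.column_stack([np.ones(n), rng.standard_normal((n, 5))])
        y = X @ beta + rng.standard_normal(n)
        est_ols.append(ols(X, y))
        est_jack.append(jackknife(X, y))
    print(n, "OLS:", relative_bias(est_ols).round(3),
          "Jackknife:", relative_bias(est_jack).round(3))
```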