New Economics Papers on Econometric Time Series
By: Paweł Fiedor; Odd Magnus Trondrud
Abstract: Modelling financial time series as a time change of a simpler process has been proposed in various forms over the years. One such recent approach, called volatility homogenisation decomposition, was designed specifically to aid the forecasting of price changes on financial markets. The authors of this method attempted to prove its usefulness by applying a specific forecasting procedure and comparing its effectiveness on the decomposed time series with its effectiveness on the original time series. This is problematic in at least two ways. First, the choice of forecasting procedure obviously affects the results, rendering them non-exhaustive. Second, the results obtained were not completely convincing, with some values falling below the 50% guessing rate. Additionally, only nine Australian stocks were investigated, which further limits the scope of this proof. In this study we propose to assess the usefulness of volatility homogenisation by calculating the predictability of the decomposed time series and comparing it with the predictability of the original time series. We apply the information-theoretic notion of entropy rate to quantify predictability, which guarantees that the result is not tied to a specific method of prediction, and we base our calculations on a large number of stocks from the Warsaw Stock Exchange. (A brief illustrative sketch of entropy-rate estimation follows this entry.)
Date: 2014-06
URL: http://d.repec.org/n?u=RePEc:arx:papers:1406.7526&r=ets
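To make the entropy-rate idea concrete, below is a minimal sketch, not the authors' actual procedure, of how predictability can be quantified without committing to a forecasting method: log returns are discretized into quartiles and the entropy rate is estimated from the Lempel-Ziv (1976) phrase count. The simulated price series, the quartile discretization, and the LZ76 normalization are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch: entropy-rate estimation of a discretized return series.
# Assumptions (not from the paper): simulated prices, quartile discretization,
# and the classic LZ76 phrase-count normalization as the entropy-rate proxy.
import numpy as np

def lz76_word_count(s):
    """Number of phrases in the Lempel-Ziv (1976) parsing of sequence s."""
    n = len(s)
    i, k, l = 0, 1, 1
    c, k_max = 1, 1
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1
            if l + k > n:
                c += 1
                break
        else:
            k_max = max(k, k_max)
            i += 1
            if i == l:  # all candidate match positions exhausted: new phrase
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

def entropy_rate(symbols):
    """Asymptotic LZ76 estimate of the entropy rate in bits per symbol."""
    n = len(symbols)
    return lz76_word_count(symbols) * np.log2(n) / n

rng = np.random.default_rng(42)
prices = 100 * np.exp(np.cumsum(0.01 * rng.standard_normal(2000)))  # toy random walk
returns = np.diff(np.log(prices))
symbols = list(np.digitize(returns, np.quantile(returns, [0.25, 0.5, 0.75])))
print(f"estimated entropy rate: {entropy_rate(symbols):.2f} bits/symbol")
```

For a 4-symbol alphabet the maximum is 2 bits per symbol; a lower entropy rate for the decomposed series than for the original one would indicate higher predictability, independently of any particular forecasting procedure.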
By: Helmut Lütkepohl; Aleksei Netsunajev
Abstract: In structural vector autoregressive analysis, identifying the shocks of interest via heteroskedasticity has become a standard tool. Unfortunately, the approaches currently used for modelling heteroskedasticity all have drawbacks. For instance, assuming known dates for variance changes is often unrealistic, while more flexible models based on GARCH or Markov switching residuals are difficult to handle from a statistical and computational point of view. Therefore we propose a model based on a smooth change in variance that is flexible as well as relatively easy to estimate. The model is applied to a five-dimensional system of U.S. variables to explore the interaction between monetary policy and the stock market. It is found that the conventional identification schemes previously used in this context are rejected by the data if heteroskedasticity is allowed for. Shocks identified via heteroskedasticity have a different economic interpretation than the shocks identified using conventional methods. (A brief illustrative sketch of the smooth-transition variance idea follows this entry.)
Keywords: Structural vector autoregressions, heteroskedasticity, smooth transition VAR models, identification via heteroskedasticity
JEL: C32
Date: 2014
URL: http://d.repec.org/n?u=RePEc:diw:diwwpp:dp1388&r=ets
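As a rough illustration of the smooth-transition idea (the paper's exact multivariate parametrization may differ), the sketch below simulates residuals whose variance moves between two regimes through a logistic transition function. All parameter values, including the transition speed gamma, location c, and the regime variances, are arbitrary assumptions.

```python
# Minimal sketch: residuals with a logistic smooth transition in variance.
# All parameter values (gamma, c, regime variances) are illustrative assumptions.
import numpy as np

def logistic_transition(t, T, gamma, c):
    """G(t) in [0, 1]: weight on the second variance regime at time t."""
    return 1.0 / (1.0 + np.exp(-gamma * (t / T - c)))

rng = np.random.default_rng(0)
T = 500
t = np.arange(1, T + 1)
G = logistic_transition(t, T, gamma=25.0, c=0.5)  # transition centred mid-sample
var1, var2 = 1.0, 9.0                             # regime variances (assumed)
sigma_t = np.sqrt((1.0 - G) * var1 + G * var2)    # smoothly changing std. dev.
u = sigma_t * rng.standard_normal(T)              # heteroskedastic residuals

# Identification logic: with u_t = B e_t and distinct regime covariances,
# Sigma_1 = B B' and Sigma_2 = B Lambda B' jointly pin down B (up to sign/scale).
print(f"sample variance, first vs last quarter: "
      f"{u[:T//4].var():.2f} vs {u[-T//4:].var():.2f}")
```

Because the transition is smooth rather than abrupt, parameters like gamma and c can be estimated jointly with the VAR coefficients instead of imposing known break dates.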
By: Guglielmo Maria Caporale; Marinko Skare
Abstract: This paper analyses the long-memory properties of both the conditional mean and the conditional variance of UK real GDP over the period 1851-2013 by estimating a multivariate ARFIMA-FIGARCH model (with the unemployment rate and inflation as explanatory variables). The results suggest that this series is non-stationary and non-mean-reverting, the null hypotheses of I(0), I(1) and I(2) being rejected in favour of fractional integration: shocks appear to have permanent effects, and therefore policy actions are required to restore equilibrium. The estimate of the long-memory parameter (1.37) is similar to that reported by Candelon and Gil-Alana (2004), implying that aggregate output is not an I(1) process. The presence of long memory in output volatility (d = 0.80) is also confirmed. (A brief illustrative sketch of fractional differencing follows this entry.)
Keywords: ARFIMA-(FI)GARCH, Dual long memory, Volatility, Fractional impulse-response, Unemployment, Inflation
JEL: B23 C14 C32 C53 C54
Date: 2014
URL: http://d.repec.org/n?u=RePEc:diw:diwwpp:dp1395&r=ets
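To illustrate the fractional-integration arithmetic behind an estimate like d = 1.37, here is a minimal sketch of the fractional differencing filter (1 - L)^d computed via its binomial expansion. The recursion for the coefficients is standard; the input series is a made-up assumption, not the UK GDP data used in the paper.

```python
# Minimal sketch: fractional differencing (1 - L)^d via binomial coefficients.
# The input series below is simulated, not the paper's UK real GDP data.
import numpy as np

def frac_diff_weights(d, n):
    """First n coefficients of (1 - L)^d: pi_0 = 1, pi_k = pi_{k-1} (k-1-d)/k."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def frac_diff(y, d):
    """Apply (1 - L)^d to y, truncating the expansion at the sample start."""
    n = len(y)
    w = frac_diff_weights(d, n)
    return np.array([np.dot(w[:t + 1], y[t::-1]) for t in range(n)])

rng = np.random.default_rng(1)
y = np.cumsum(rng.standard_normal(300))  # toy I(1) series
assert np.allclose(frac_diff(y, 1.0)[1:], np.diff(y))  # d = 1 is first differencing
z = frac_diff(y, 1.37)  # differencing by the estimated d should leave roughly I(0) noise
```

An estimate of d above 1, as reported in the abstract, implies non-mean-reversion: the weights decay too slowly for shocks to be absorbed, which is exactly the sense in which shocks have permanent effects.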