on Econometric Time Series |
By: | Christian Hotz-Behofsits; Florian Huber; Thomas O. Zörner |
Abstract: | In this paper we forecast daily returns of crypto-currencies using a wide variety of different econometric models. To capture salient features commonly observed in financial time series, such as rapid changes in the conditional variance, non-normality of the measurement errors and sharply increasing trends, we develop a time-varying parameter VAR with t-distributed measurement errors and stochastic volatility. To control for overparameterization, we rely on the Bayesian literature on shrinkage priors, which enables us to shrink coefficients associated with irrelevant predictors and/or perform model specification in a flexible manner. Using around one year of daily data, we perform a real-time forecasting exercise and investigate whether any of the proposed models is able to outperform the naive random walk benchmark. To assess the economic relevance of the forecasting gains produced by the proposed models, we also run a simple trading exercise. |
Date: | 2018–01 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1801.06373&r=ets |
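The state-space structure behind a model of this kind can be illustrated in a few lines. The following is a minimal univariate sketch, assuming random-walk coefficients, an AR(1) log-volatility process and Student-t measurement errors; all parameter values and variable names are illustrative and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
T, p = 250, 2                      # sample length, number of predictors

# Time-varying coefficients follow independent random walks
beta = np.zeros((T, p))
for t in range(1, T):
    beta[t] = beta[t - 1] + rng.normal(scale=0.05, size=p)

# Log-volatility follows an AR(1) process (stochastic volatility)
h = np.zeros(T)
for t in range(1, T):
    h[t] = 0.95 * h[t - 1] + rng.normal(scale=0.2)

# Student-t measurement errors capture fat tails in daily returns
nu = 5
X = rng.normal(size=(T, p))
eps = rng.standard_t(df=nu, size=T) * np.exp(h / 2)
y = np.sum(X * beta, axis=1) + eps   # observed (log-)returns
```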
By: | Vincent Vandenberghe (UNIVERSITE CATHOLIQUE DE LOUVAIN, Institut de Recherches Economiques et Sociales (IRES)) |
Abstract: | A common problem with differences-in-differences (DD) estimates is the failure of the parallel-trend assumption. To cope with this, most authors include polynomial (linear, quadratic…) trends among the regressors and estimate the treatment effect as a once-in-a-time trend shift. In practice that strategy does not work very well because, inter alia, the estimation of the trend uses post-treatment data. An extreme case is when the sample covers only one period before treatment and many after. The trend's estimate then relies almost completely on post-treatment developments and absorbs most of the treatment effect. What is needed is a method that i) uses pre-treatment observations to capture linear or non-linear trend differences, and ii) extrapolates these to compute the treatment effect. This paper shows how this can be achieved using a fully flexible version of the canonical DD equation. It also contains an illustration using data on a 1994-2000 EU programme that was implemented in the Belgian province of Hainaut. |
Keywords: | Treatment-Effect Analysis, Differences-in-Differences Models, Correction for trend differences |
JEL: | C21 C4 C5 |
Date: | 2018–01–25 |
URL: | http://d.repec.org/n?u=RePEc:ctl:louvir:2018001&r=ets |
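The two-step logic described in the abstract above (fit trends on pre-treatment data only, then extrapolate them to the post-treatment period) can be sketched as follows. The data, the linear trend specification and the variable names are purely illustrative and are not the paper's fully flexible DD equation.

```python
import numpy as np

# Illustrative panel: yearly mean outcomes for treated and control groups
years      = np.arange(1988, 2001)
treat_year = 1994
pre        = years < treat_year

# hypothetical outcome series (replace with real group means)
y_treated = 10 + 0.30 * (years - 1988) + (years >= treat_year) * 1.5
y_control = 12 + 0.25 * (years - 1988)

# Step i): fit (possibly non-linear) trends on PRE-treatment data only
trend_t = np.poly1d(np.polyfit(years[pre], y_treated[pre], deg=1))
trend_c = np.poly1d(np.polyfit(years[pre], y_control[pre], deg=1))

# Step ii): extrapolate both trends and compare post-treatment deviations
dev_t = y_treated[~pre] - trend_t(years[~pre])
dev_c = y_control[~pre] - trend_c(years[~pre])
att   = (dev_t - dev_c).mean()     # treatment effect net of differential trends
print(f"estimated effect: {att:.2f}")
```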
By: | Ghouse, Ghulam; Khan, Saud Ahmed; Rehman, Atiq Ur |
Abstract: | Spurious regression has played a vital role in the construction of contemporary time series econometrics and has led to the development of many tools employed in applied macroeconomics. Conventional econometrics has limitations in the treatment of spurious regression in non-stationary time series. While reviewing the well-established study of Granger and Newbold (1974), we realized that the experiments conducted in that paper lacked lag dynamics, which is what led to the spurious regression. Partly as a result of that paper, unit root and cointegration analysis have become the standard ways in conventional econometrics to circumvent spurious regression. These procedures are, however, equally capricious because of specification decisions such as the choice of the deterministic part, structural breaks, the autoregressive lag length and the distribution of the innovation process. This study explores an alternative treatment for spurious regression. We conclude that missing variables (lagged values) are the major cause of spurious regression; an alternative way to look at the problem therefore takes us back to the missing variables, which in turn leads to the ARDL model. The study mainly relies on Monte Carlo simulations. The results provide justification that the ARDL model can be used as an alternative tool to avoid the spurious regression problem. |
Keywords: | Spurious regression, misspecification, stationarity, unit root, cointegration, ARDL
JEL: | B41 C4 C5 C53 |
Date: | 2018–01–10 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:83973&r=ets |
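A minimal Monte Carlo in the spirit of the abstract above compares a static regression of two independent random walks with an ARDL-type regression that adds lagged values of both variables. The sample size, number of replications and rejection rule are illustrative choices and do not reproduce the study's actual design.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
T, R = 200, 500
reject_static, reject_ardl = 0, 0

for _ in range(R):
    # two independent random walks: any relation found between them is spurious
    x = np.cumsum(rng.normal(size=T))
    y = np.cumsum(rng.normal(size=T))

    # static regression y_t = a + b x_t + e_t
    res = sm.OLS(y, sm.add_constant(x)).fit()
    reject_static += res.pvalues[1] < 0.05

    # ARDL(1,1): add y_{t-1} and x_{t-1} as regressors
    X = np.column_stack([np.ones(T - 1), y[:-1], x[1:], x[:-1]])
    res = sm.OLS(y[1:], X).fit()
    reject_ardl += res.pvalues[2] < 0.05   # test the coefficient on x_t

print(reject_static / R, reject_ardl / R)  # spurious rejections: high vs roughly nominal
```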
By: | Tatsushi Oka; Pierre Perron |
Abstract: | The issue addressed in this paper is that of testing for common breaks across or within equations of a multivariate system. Our framework is very general and allows integrated regressors and trends as well as stationary regressors. The null hypothesis is that breaks in different parameters occur at common locations and are separated by some positive fraction of the sample size unless they occur across different equations. Under the alternative hypothesis, the break dates across parameters are not the same and also need not be separated by a positive fraction of the sample size whether within or across equations. The test considered is the quasi-likelihood ratio test assuming normal errors, though as usual the limit distribution of the test remains valid with non-normal errors. Of independent interest, we provide results about the rate of convergence of the estimates when searching over all possible partitions subject only to the requirement that each regime contains at least as many observations as some positive fraction of the sample size, allowing break dates not separated by a positive fraction of the sample size across equations. Simulations show that the test has good finite sample properties. We also provide an application to issues related to level shifts and persistence for various measures of inflation to illustrate its usefulness. |
Date: | 2016–05 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1606.00092&r=ets |
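A stripped-down version of the common-break idea, for single mean shifts in two series rather than the paper's general multivariate framework with integrated regressors and trends, can be sketched as follows. The statistic and the trimming choice below are illustrative and are not the paper's quasi-likelihood ratio test.

```python
import numpy as np

def ssr_with_break(y, k):
    """Sum of squared residuals from separate means before/after break date k."""
    return ((y[:k] - y[:k].mean())**2).sum() + ((y[k:] - y[k:].mean())**2).sum()

rng = np.random.default_rng(2)
T, trim = 200, 15
y1 = np.r_[rng.normal(0, 1, 100), rng.normal(1.0, 1, 100)]   # break at 100
y2 = np.r_[rng.normal(0, 1, 110), rng.normal(1.0, 1,  90)]   # break at 110

grid = range(trim, T - trim)
# unrestricted: each equation picks its own break date
ssr_u = min(ssr_with_break(y1, k) for k in grid) + \
        min(ssr_with_break(y2, k) for k in grid)
# restricted: a single common break date imposed across both equations
ssr_r = min(ssr_with_break(y1, k) + ssr_with_break(y2, k) for k in grid)

lr_stat = T * np.log(ssr_r / ssr_u)   # quasi-LR-type statistic for common breaks
print(lr_stat)
```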
By: | Fernando Fernandes Neto |
Abstract: | The present paper aims to demonstrate the use of Convolutional Neural Networks as a generative model for stochastic processes, enabling researchers from a wide range of fields (such as quantitative finance and physics) to develop a general tool for forecasts and simulations without the need to identify/assume a specific system structure or estimate its parameters. |
Date: | 2018–01 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1801.03523&r=ets |
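As a rough illustration of the idea, a causal convolution applied to white noise already defines a simple generative mechanism for a stochastic process; a convolutional network stacks many such layers, with kernels learned from data rather than fixed. The kernels and transformations below are arbitrary illustrative choices and do not reflect the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(3)

def causal_conv(x, kernel):
    """1-D causal convolution: the output at t depends only on x[t], x[t-1], ..."""
    k = len(kernel)
    xp = np.r_[np.zeros(k - 1), x]            # left-pad so no future values leak in
    return np.array([xp[t:t + k] @ kernel[::-1] for t in range(len(x))])

# A toy stand-in for a convolutional generative model of a stochastic process:
# stacked causal convolutions with a nonlinearity, driven by latent white noise.
z    = rng.normal(size=500)                    # latent noise input
h    = np.tanh(causal_conv(z, np.array([0.2, 0.5, 0.9])))
path = np.cumsum(causal_conv(h, np.array([0.1, 0.3, 0.6])))  # simulated sample path
```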
By: | Antolin-Diaz, Juan; Petrella, Ivan; Rubio-Ramírez, Juan Francisco |
Abstract: | In the context of vector autoregressions, conditional forecasts are typically constructed by specifying the future path of one or more variables while remaining silent about the structural shocks that might have caused the path. However, in many cases, researchers may be interested in identifying a structural vector autoregression and choosing which structural shock is driving the path of the conditioning variables. This would allow researchers to create a "structural scenario" that can be given an economic interpretation. In this paper we show how to construct structural scenarios and develop efficient algorithms to implement our methods. We show how structural scenario analysis can lead to results that are very different from, but complementary to, those of the traditional conditional forecasting exercises. We also propose an approach to assess and compare the plausibility of alternative scenarios. We illustrate our methods by applying them to two examples: comparing alternative monetary policy options and stress testing the reaction of bank profitability to an economic recession. |
Keywords: | Bayesian methods; Conditional forecasts; probability distribution; SVARs |
JEL: | C32 C53 E47 |
Date: | 2018–01 |
URL: | http://d.repec.org/n?u=RePEc:cpr:ceprdp:12579&r=ets |
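A toy version of a structural scenario, assuming a bivariate VAR(1) identified recursively, conditions the path of one variable, attributes it entirely to one structural shock, and reads off the implied path of the other variable. All matrices and the conditioning path are made up for illustration; the paper's algorithms handle general SVARs and also assess scenario plausibility.

```python
import numpy as np

# Toy structural VAR(1): y_t = A y_{t-1} + B eps_t, eps_t ~ N(0, I)
A = np.array([[0.7, 0.2],
              [0.1, 0.5]])
B = np.linalg.cholesky(np.array([[1.0, 0.3],
                                 [0.3, 0.5]]))   # recursive identification

y0      = np.array([0.0, 0.0])        # last observed state
path_y2 = np.array([0.5, 0.5, 0.25])  # conditioning path for the second variable

# Structural scenario: the conditioning path is generated ONLY by the second shock
y, scen = y0.copy(), []
for target in path_y2:
    eps = np.zeros(2)
    eps[1] = (target - A[1] @ y) / B[1, 1]   # back out the required structural shock
    y = A @ y + B @ eps
    scen.append(y.copy())
print(np.array(scen))   # implied joint path under the structural scenario
```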
By: | Lu, Yang |
Abstract: | Computing the likelihood function and the term structure of probabilistic forecasts in higher-order INAR(p) models is commonly regarded as numerically intractable, and the literature has considered various approximations. Using the notion of a compound autoregressive process, we propose an exact and fast algorithm for both quantities. We find that existing approximation schemes induce significant errors for forecasting. |
Keywords: | compound autoregressive process, probabilistic forecast of counts, matrix arithmetic. |
JEL: | C22 C25 |
Date: | 2018–01–01 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:83682&r=ets |
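For the first-order special case, the exact transition probabilities of an INAR model (binomial thinning plus, here, Poisson innovations) can be computed directly, which is the building block of exact likelihood evaluation; the paper extends this to higher-order INAR(p) via the compound autoregressive representation. The innovation distribution, truncation level and data below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import binom, poisson

def inar1_transition(alpha, lam, max_count=50):
    """Exact transition matrix P[i, j] = P(X_t = j | X_{t-1} = i) for an INAR(1)."""
    P = np.zeros((max_count + 1, max_count + 1))
    for i in range(max_count + 1):
        surv = binom.pmf(np.arange(i + 1), i, alpha)        # binomial thinning of i survivors
        for j in range(max_count + 1):
            k = np.arange(min(i, j) + 1)
            P[i, j] = np.sum(surv[k] * poisson.pmf(j - k, lam))  # plus Poisson innovations
    return P

def inar1_loglik(x, alpha, lam):
    P = inar1_transition(alpha, lam, max_count=max(x.max() + 20, 30))
    return np.sum(np.log(P[x[:-1], x[1:]]))    # exact conditional log-likelihood

x = np.array([2, 3, 1, 0, 2, 4, 3, 2, 1, 1])    # toy count series
print(inar1_loglik(x, alpha=0.4, lam=1.2))
```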
By: | M. Angeles Carnero Fernández (Universidad de Alicante); Ana Pérez Espartero (Dpto. Economía Aplicada) |
Abstract: | This paper illustrates how outliers can affect both the estimation and testing of the leverage effect, focusing on the TGARCH model. Three estimation methods are compared through Monte Carlo experiments: Gaussian Quasi-Maximum Likelihood, Quasi-Maximum Likelihood based on the Student-t likelihood, and the Least Absolute Deviation method. The empirical behavior of the t-ratio and Likelihood Ratio tests for the significance of the leverage parameter is also analyzed. Our results highlight the unreliability of Gaussian Quasi-Maximum Likelihood methods in the presence of outliers. In particular, we show that one isolated outlier can hide a true leverage effect, whereas two consecutive outliers bias the estimated leverage coefficient in a direction that crucially depends on the sign of the first outlier and can lead to wrongly rejecting the null of no leverage effect or to estimating asymmetries of the wrong sign. By contrast, we highlight the good performance of the robust estimators in the presence of an isolated outlier. However, when there are patches of outliers, our findings suggest that the sizes and powers of the tests, as well as the estimated parameters based on robust methods, may still be distorted in some cases. We illustrate these results with two series of daily returns, namely the Spanish IGBM Consumer Goods index and natural gas futures contracts. |
Keywords: | Conditional heteroscedasticity, QMLE, Robust estimators, TGARCH, AVGARCH |
JEL: | C22 G10 Q40 |
Date: | 2018–01 |
URL: | http://d.repec.org/n?u=RePEc:ivi:wpasad:2018-01&r=ets |
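The kind of experiment described in the abstract above can be sketched as follows: simulate a threshold GARCH process, contaminate it with a single additive outlier, and re-estimate the leverage parameter by Gaussian quasi-maximum likelihood. The Zakoian-type specification, parameter values and outlier size are illustrative assumptions, not the paper's exact design.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)

def simulate_tgarch(T, omega, alpha, gamma, beta):
    """Zakoian-type TGARCH on the conditional standard deviation."""
    e, sig = np.zeros(T), np.zeros(T)
    sig[0] = omega / (1 - beta)
    e[0] = sig[0] * rng.normal()
    for t in range(1, T):
        sig[t] = omega + alpha * abs(e[t-1]) + gamma * max(-e[t-1], 0.0) + beta * sig[t-1]
        e[t] = sig[t] * rng.normal()
    return e

def neg_gaussian_qml(theta, e):
    """Gaussian quasi-log-likelihood (negative, up to constants) of the TGARCH model."""
    omega, alpha, gamma, beta = theta
    sig = np.empty_like(e)
    sig[0] = np.std(e)
    for t in range(1, len(e)):
        sig[t] = omega + alpha * abs(e[t-1]) + gamma * max(-e[t-1], 0.0) + beta * sig[t-1]
    return np.sum(np.log(sig**2) + e**2 / sig**2)

e = simulate_tgarch(2000, omega=0.05, alpha=0.08, gamma=0.10, beta=0.85)
e_out = e.copy()
e_out[1000] += 15 * e.std()        # one isolated additive outlier

x0, bnds = [0.05, 0.05, 0.05, 0.8], [(1e-6, 1.0)] * 3 + [(0.0, 0.999)]
for series in (e, e_out):
    fit = minimize(neg_gaussian_qml, x0, args=(series,), bounds=bnds, method="L-BFGS-B")
    print("estimated leverage gamma:", fit.x[2])   # the outlier can distort the leverage estimate
```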
By: | Manabu Asai (Soka University, Japan); Michael McAleer (Asia University, Taiwan; University of Sydney Business School, Australia; Erasmus School of Economics, Erasmus University Rotterdam, The Netherlands; Complutense University of Madrid, Spain; Yokohama National University, Japan) |
Abstract: | The paper develops a new realized matrix-exponential GARCH (MEGARCH) model, which uses the information in returns and in a realized measure of the co-volatility matrix simultaneously. The paper also considers an alternative multivariate asymmetric function to develop news impact curves. We consider Bayesian MCMC estimation to allow non-normal posterior distributions. For three US financial assets, we compare the realized MEGARCH models with existing multivariate GARCH class models. The empirical results indicate that the realized MEGARCH models outperform the other models in terms of in-sample and out-of-sample performance. The news impact curves based on the posterior densities provide reasonable results. |
JEL: | C11 C32
Date: | 2018–01–17 |
URL: | http://d.repec.org/n?u=RePEc:tin:wpaper:20180005&r=ets |
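The core of the matrix-exponential parameterization is that dynamics are specified on the unconstrained log-matrix scale, so the implied co-volatility matrix is positive definite by construction. The sketch below shows only this mapping, with made-up matrices and parameters; it omits the realized MEGARCH dynamics, the asymmetric news impact terms and the Bayesian MCMC estimation.

```python
import numpy as np

def sym_logm(S):
    """Matrix logarithm of a symmetric positive-definite matrix via eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.log(w)) @ V.T

def sym_expm(A):
    """Matrix exponential of a symmetric matrix (always positive definite)."""
    w, V = np.linalg.eigh(A)
    return V @ np.diag(np.exp(w)) @ V.T

H_prev = np.array([[1.00, 0.30, 0.10],
                   [0.30, 0.80, 0.20],
                   [0.10, 0.20, 0.60]])   # lagged conditional co-volatility matrix
RM     = np.array([[1.20, 0.35, 0.15],
                   [0.35, 0.90, 0.25],
                   [0.15, 0.25, 0.70]])   # realized covariance measure

# Dynamics specified on the unrestricted log-matrix scale (illustrative parameters)
A_t = 0.05 * np.eye(3) + 0.70 * sym_logm(H_prev) + 0.25 * sym_logm(RM)
H_t = sym_expm(A_t)                       # valid covariance forecast by construction
print(np.linalg.eigvalsh(H_t))            # all eigenvalues are positive
```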