New Economics Papers on Econometric Time Series
By: | José Casals (Departamento de Fundamentos del Análisis Económico II. Facultad de Ciencias Económicas. Campus de Somosaguas. 28223 Madrid (SPAIN).); Sonia Sotoca (Departamento de Fundamentos del Análisis Económico II. Facultad de Ciencias Económicas. Campus de Somosaguas. 28223 Madrid (SPAIN).); Miguel Jerez (Departamento de Fundamentos del Análisis Económico II. Facultad de Ciencias Económicas. Campus de Somosaguas. 28223 Madrid (SPAIN).) |
Abstract: | Computing the Gaussian likelihood for a nonstationary state-space model is a difficult problem that the literature has tackled with two main strategies: data transformation and diffuse likelihood. The data transformation approach is cumbersome, as it requires nonstandard filtering. On the other hand, in some nontrivial cases the diffuse likelihood value depends on the scale of the diffuse states, so one can obtain different likelihood values for observationally equivalent models. In this paper we discuss the properties of the minimally-conditioned likelihood function, as well as two efficient methods to compute its terms, with computational advantages for specific models. Three convenient features of the minimally-conditioned likelihood are: (a) it can be computed with standard Kalman filters, (b) it is scale-free, and (c) its values are coherent with those resulting from differencing, the most popular approach to dealing with nonstationary data. |
Keywords: | State-space models, Conditional likelihood, Diffuse likelihood, Diffuse initial conditions, Kalman filter, Nonstationarity. |
JEL: | C32 C51 C10 |
Date: | 2012 |
URL: | http://d.repec.org/n?u=RePEc:ucm:doicae:1204&r=ets |
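To make the likelihood discussion above concrete, here is a minimal sketch, not the authors' minimally-conditioned likelihood: it evaluates a Gaussian log-likelihood for a simple nonstationary local-level model with a standard Kalman filter, conditioning on the first observation instead of assuming a diffuse prior. The function name conditional_loglik and all parameter values are illustrative.

```python
# Minimal sketch (not the authors' algorithm): Gaussian log-likelihood of a
# local-level (random walk plus noise) model via the prediction-error
# decomposition of a standard Kalman filter, conditioning on the first
# observation instead of assuming a diffuse prior.
import numpy as np

def conditional_loglik(y, sigma_eta2, sigma_eps2):
    """Log-likelihood of y[1:] conditional on y[0] for the local-level model
       alpha_t = alpha_{t-1} + eta_t,  y_t = alpha_t + eps_t."""
    a, P = y[0], sigma_eps2           # posterior of alpha_0 given y_0
    loglik = 0.0
    for t in range(1, len(y)):
        a_pred = a                    # state prediction (random-walk transition)
        P_pred = P + sigma_eta2
        v = y[t] - a_pred             # one-step-ahead prediction error
        F = P_pred + sigma_eps2       # prediction-error variance
        loglik += -0.5 * (np.log(2 * np.pi * F) + v**2 / F)
        K = P_pred / F                # Kalman gain
        a = a_pred + K * v            # measurement update
        P = (1.0 - K) * P_pred
    return loglik

rng = np.random.default_rng(0)
alpha = np.cumsum(rng.normal(0, 1.0, 200))        # nonstationary state
y = alpha + rng.normal(0, 0.5, 200)
print(conditional_loglik(y, sigma_eta2=1.0, sigma_eps2=0.25))
```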
By: | Mehmet Pinar (Fondazione Eni Enrico Mattei); Thanasis Stengos (University of Guelph); M. Ege Yazgan (Istanbul Bilgi University) |
Abstract: | The forecast combination puzzle refers to the finding that a simple average forecast combination outperforms more sophisticated weighting schemes and/or the best individual model. This paper derives optimal (worst) forecast combinations based on stochastic dominance (SD) analysis with differential forecast weights. For the optimal (worst) forecast combination, this index minimizes (maximizes) forecast errors by combining time-series model-based forecasts at a given probability level. By weighting each forecast differently, we find the optimal (worst) forecast combination without relying on arbitrary weights. Using weekly data on two exchange rate series, the Japanese Yen/U.S. Dollar and the U.S. Dollar/Great Britain Pound, for the period from 1975 to 2010, we find that the simple average forecast combination is neither the worst nor the best forecast combination, which provides partial support for the forecast combination puzzle. In that context, the random walk model consistently contributes considerably more than an equal weight to the worst forecast combination for all variables being forecasted and for all forecast horizons, whereas a flexible neural network autoregressive model and a self-exciting threshold autoregressive model always enter the best forecast combination with much greater than equal weights. |
Keywords: | Nonparametric stochastic dominance; Mixed integer programming; Forecast combinations |
JEL: | C53 C61 C63 |
Date: | 2011 |
URL: | http://d.repec.org/n?u=RePEc:gue:guelph:2012-06.&r=ets |
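As a rough illustration of the combination puzzle, not the paper's stochastic dominance optimization, the sketch below compares an equal-weight combination of two toy forecasts with the individual models and with a grid of alternative weights; the data, the two "models" and the weight grid are all made up.

```python
# Hedged sketch of the forecast-combination idea (not the paper's SD-based
# optimization): compare a simple-average combination with individual models
# on synthetic one-step-ahead forecasts.
import numpy as np

rng = np.random.default_rng(1)
T = 300
y = np.cumsum(rng.normal(0, 1, T))                 # target series (random walk)

# Two toy "model" forecasts: a random-walk forecast and a noisy AR-style forecast.
f_rw = np.r_[y[0], y[:-1]]
f_ar = 0.9 * np.r_[y[0], y[:-1]] + rng.normal(0, 0.5, T)

def rmse(f):
    return np.sqrt(np.mean((y[1:] - f[1:]) ** 2))

f_avg = 0.5 * (f_rw + f_ar)                        # equal-weight combination
w = np.linspace(0, 1, 101)                         # grid over combination weights
grid_rmse = [rmse(wi * f_rw + (1 - wi) * f_ar) for wi in w]

print("RW:", rmse(f_rw), "AR:", rmse(f_ar), "equal weights:", rmse(f_avg))
print("best weight on RW:", w[int(np.argmin(grid_rmse))],
      "worst weight on RW:", w[int(np.argmax(grid_rmse))])
```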
By: | Hecq Alain; Laurent Sébastien; Palm Franz C. (METEOR) |
Abstract: | First, we investigate the minimal-order univariate representation of some well-known n-dimensional conditional volatility models. Even simple low-order systems (e.g. a multivariate GARCH(0,1)) for the joint behavior of several variables imply individual processes with a lot of persistence in the form of high-order lags. However, we show that in the presence of common GARCH factors, parsimonious univariate representations (e.g. GARCH(1,1)) can result from large multivariate models generating the conditional variances and conditional covariances/correlations. The trivial diagonal model without any contagion effects in conditional volatilities gives rise to the same conclusions, though. Consequently, we then propose an approach to detect the presence of these commonalities in multivariate GARCH processes. The factor we extract is the volatility of a portfolio made up of the original assets, whose weights are determined by the reduced-rank analysis. We compare the small-sample performances of two strategies. First, extending Engle and Marcucci (2006), we use reduced-rank regressions in a multivariate system for squared returns and cross-returns. Second, we investigate a likelihood ratio approach, where under the null the matrix parameters of the BEKK have a reduced-rank structure (Lin, 1992). The latter approach turns out to have quite good properties, enabling us to discriminate between a system with seemingly unrelated assets (e.g. a diagonal model) and a model with few common sources of volatility. |
Keywords: | econometrics |
Date: | 2012 |
URL: | http://d.repec.org/n?u=RePEc:dgr:umamet:2012018&r=ets |
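The common-factor idea can be illustrated with a hedged simulation, which is not the authors' reduced-rank or likelihood ratio test: two asset returns load on a single GARCH(1,1) factor, and a fixed-weight portfolio inherits the GARCH-type persistence visible in the autocorrelation of squared returns. The loadings and GARCH parameters are invented.

```python
# Hedged illustration (not the authors' test): two asset returns driven by a
# single common GARCH(1,1) factor plus idiosyncratic noise; the portfolio then
# inherits GARCH-type persistence in its squared returns.
import numpy as np

rng = np.random.default_rng(2)
T, omega, alpha, beta = 2000, 0.05, 0.08, 0.90

h = np.empty(T)                      # conditional variance of the common factor
f = np.empty(T)                      # common GARCH factor
h[0] = omega / (1 - alpha - beta)
f[0] = np.sqrt(h[0]) * rng.normal()
for t in range(1, T):
    h[t] = omega + alpha * f[t - 1] ** 2 + beta * h[t - 1]
    f[t] = np.sqrt(h[t]) * rng.normal()

r1 = 1.0 * f + 0.3 * rng.normal(size=T)    # asset 1: loading 1.0 on the factor
r2 = 0.7 * f + 0.3 * rng.normal(size=T)    # asset 2: loading 0.7 on the factor
port = 0.5 * r1 + 0.5 * r2                 # equally weighted portfolio

def acf1_sq(x):
    """Lag-1 autocorrelation of squared (demeaned-square) returns."""
    s = x ** 2 - np.mean(x ** 2)
    return np.sum(s[1:] * s[:-1]) / np.sum(s ** 2)

print("lag-1 autocorrelation of squared returns:",
      acf1_sq(r1), acf1_sq(r2), acf1_sq(port))
```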
By: | David E. Giles (Department of Economics, University of Victoria) |
Abstract: | By noting that the Hodrick-Prescott filter can be expressed as the solution to a particular regression problem, we show how to construct confidence bands for the filtered time series. This procedure requires that the data are stationary. The construction of such confidence bands is illustrated using annual U.S. data for real value-added output and monthly U.S. data for the unemployment rate. |
Keywords: | Hodrick-Prescott filter; time-series decomposition; confidence bands |
JEL: | C13 C20 E3 |
Date: | 2012–04–19 |
URL: | http://d.repec.org/n?u=RePEc:vic:vicewp:1202&r=ets |
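The regression view of the Hodrick-Prescott filter lends itself to a short sketch. The code below is an assumption-laden illustration rather than Giles' exact band construction: it computes the HP trend as a penalized least-squares fit, tau_hat = (I + lambda D'D)^(-1) y, and forms approximate pointwise bands from the implied smoother matrix under i.i.d. stationary noise. The function name, data and lambda choice are illustrative.

```python
# Hedged sketch of the regression view of the HP filter: the trend solves
#   min_tau ||y - tau||^2 + lam * ||D2 tau||^2
# so tau_hat = S y with S = (I + lam * D2'D2)^{-1}; approximate pointwise
# bands follow from Var(tau_hat) ~ sigma2 * S S' under i.i.d. noise.
import numpy as np

def hp_filter_with_bands(y, lam=1600.0, z=1.96):
    n = len(y)
    D2 = np.diff(np.eye(n), n=2, axis=0)          # second-difference matrix
    S = np.linalg.inv(np.eye(n) + lam * (D2.T @ D2))
    trend = S @ y
    resid = y - trend                             # cyclical component
    sigma2 = resid.var(ddof=1)                    # crude noise-variance estimate
    se = np.sqrt(sigma2 * np.diag(S @ S.T))       # pointwise standard errors
    return trend, trend - z * se, trend + z * se

rng = np.random.default_rng(3)
t = np.arange(120)
y = 0.05 * t + np.sin(t / 10.0) + rng.normal(0, 0.3, t.size)
trend, lo, hi = hp_filter_with_bands(y, lam=129600.0)   # common monthly lambda
print(trend[:3], lo[:3], hi[:3])
```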
By: | Fabio Bellini; Franco Pellerey; Carlo Sgarra; Salimeh Yasaei Sekeh |
Abstract: | We consider the problem of stochastic comparison of general GARCH-like processes, for different parameters and different distributions of the innovations. We identify several stochastic orders that are propagated from the innovations to the GARCH process itself, and discuss their interpretations. We focus on the convex order and show that in the case of symmetric innovations it is also propagated to the cumulated sums of the GARCH process. More generally, we discuss multivariate comparison results related to the multivariate convex and supermodular orders. Finally, we discuss ordering with respect to the parameters in the GARCH(1,1) case. |
Keywords: | GARCH, Convex order, Peakedness, Kurtosis, Supermodularity |
Date: | 2012–04 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1204.3786&r=ets |
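A hedged Monte Carlo illustration of the convex-order comparison follows; it is not a proof and not the authors' analytical results. It compares E[f(r_T)] for a few convex functions f under GARCH(1,1) returns with Gaussian innovations versus variance-matched Student-t innovations. Sample sizes, parameters and the chosen test functions are arbitrary.

```python
# Hedged Monte Carlo illustration of the convex-order idea: compare E[f(r_T)]
# for convex f under GARCH(1,1) returns with Gaussian vs. rescaled Student-t
# innovations (both with unit variance and zero mean).
import numpy as np

def simulate_garch_last(innov, omega=0.05, alpha=0.08, beta=0.90):
    """Return the final-date GARCH(1,1) return for each simulated path."""
    n_paths, T = innov.shape
    h = np.full(n_paths, omega / (1 - alpha - beta))
    r = np.zeros(n_paths)
    for t in range(T):
        r = np.sqrt(h) * innov[:, t]
        h = omega + alpha * r ** 2 + beta * h
    return r

rng = np.random.default_rng(4)
n_paths, T, nu = 20_000, 250, 5
z_gauss = rng.normal(size=(n_paths, T))
z_t = rng.standard_t(nu, size=(n_paths, T)) / np.sqrt(nu / (nu - 2))  # unit variance

r_gauss = simulate_garch_last(z_gauss)
r_t = simulate_garch_last(z_t)

for name, f in [("x^2", lambda x: x ** 2),
                ("|x|^3", lambda x: np.abs(x) ** 3),
                ("(x-1)+", lambda x: np.maximum(x - 1, 0))]:
    print(name, f(r_gauss).mean(), f(r_t).mean())
```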
By: | Jordi Camprodon; Josep Perelló |
Abstract: | Volatility measures the amplitude of price fluctuations. Although it is one of the most important quantities in finance, volatility is not directly observable. Here we apply a maximum likelihood method which assumes that price and volatility follow a two-dimensional diffusion process in which volatility is the stochastic diffusion coefficient of the log-price dynamics. We apply this method to the expOU, the OU and the Heston stochastic volatility models and study their performance in terms of the log-price probability, the volatility probability, and the mean first-passage time of the log-price. The approach has some predictive power on the amplitude of future returns from knowledge of the current volatility alone. The assumed models do not incorporate long-range volatility autocorrelation or the asymmetric return-volatility cross-correlation, yet the method still recovers these two important stylized facts very naturally. We apply the method to different market indices with good performance in all cases. |
Date: | 2012–04 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1204.3556&r=ets |
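As a rough sketch of the modelling setup, not the paper's maximum likelihood estimator, the code below simulates an expOU stochastic volatility model, in which log-volatility follows an Ornstein-Uhlenbeck process, and checks that the current volatility is informative about the size of the next return. All parameter values are illustrative.

```python
# Hedged sketch (not the paper's estimator): simulate an expOU stochastic
# volatility model and check that today's volatility helps predict the
# amplitude of tomorrow's return.
import numpy as np

rng = np.random.default_rng(5)
T, dt = 5000, 1.0 / 252
k, m, xi = 5.0, np.log(0.2), 1.0          # OU mean reversion, level, vol-of-vol

x = np.empty(T)                           # log-volatility (OU process)
x[0] = m
for t in range(1, T):
    x[t] = x[t - 1] + k * (m - x[t - 1]) * dt + xi * np.sqrt(dt) * rng.normal()

vol = np.exp(x)                           # expOU: volatility is exp of an OU
ret = vol * np.sqrt(dt) * rng.normal(size=T)   # daily log-returns

# Correlation between current volatility and next-day absolute return.
print(np.corrcoef(vol[:-1], np.abs(ret[1:]))[0, 1])
```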