Econometric Time Series
http://lists.repec.org/mailman/listinfo/nep-ets
2016-04-30
Random factor approach for large sets of equity time-series
http://d.repec.org/n?u=RePEc:arx:papers:1604.05896&r=ets
Factor models are commonly used in financial applications to analyze portfolio risk and to decompose it into loadings on risk factors. A linear factor model typically depends on a small number of carefully chosen factors, and it has been assumed that an arbitrary selection of factors does not yield a feasible factor model. We develop a statistical factor model, the random factor model, in which factors are chosen at random based on the random projection method. Random selection of factors has the important consequence that the factors are almost orthogonal to one another. The random factor model is expected to preserve the covariances between time series. We derive probabilistic bounds for the accuracy of the random factor representation of time series, their cross-correlations and covariances. As an application of the random factor model, we analyze how well it reproduces the correlation coefficients in the well-diversified Russell 3000 equity index. Comparison with principal component analysis (PCA) shows that the random factor model requires significantly fewer factors to provide an equally accurate reproduction of correlation coefficients. This occurs despite the finding that PCA reproduces single equity return time series more faithfully than the random factor model. The accuracy of a random factor model is not very sensitive to which particular set of randomly chosen factors is used. A more general kind of universality of random factor models is also present: it matters little which particular method is used to construct the random factor model; the accuracy of the resulting factor model is almost identical across methods.
Antti Tanskanen
Jani Lukkarinen
Kari Vatanen
2016-04
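A minimal sketch of the random projection idea behind the abstract above, on simulated data (the panel, factor counts, and noise level are all hypothetical, not the paper's setup): random linear combinations of the return series serve as nearly orthogonal factors, and the pairwise correlations of the fitted panel stay close to the originals.

```python
# Hedged illustration (not the paper's implementation): random-projection
# factors approximately preserving pairwise correlations of a return panel.
import numpy as np

rng = np.random.default_rng(0)
T, N, k = 500, 50, 10            # observations, assets, random factors (hypothetical)

# Simulated returns with a 3-factor structure plus idiosyncratic noise.
common = rng.standard_normal((T, 3))
loadings = rng.standard_normal((3, N))
X = common @ loadings + 0.5 * rng.standard_normal((T, N))

# Random factors: random Gaussian combinations of the series themselves;
# for large T such columns are almost orthogonal to one another.
R = rng.standard_normal((N, k))
F = X @ R / np.sqrt(k)

# Fit loadings by least squares and reconstruct the panel from k factors.
B, *_ = np.linalg.lstsq(F, X, rcond=None)
X_hat = F @ B

# Compare off-diagonal correlation coefficients of original vs reconstruction.
C, C_hat = np.corrcoef(X.T), np.corrcoef(X_hat.T)
mask = ~np.eye(N, dtype=bool)
err = np.abs(C[mask] - C_hat[mask]).mean()
print(f"mean abs. correlation error with {k} random factors: {err:.3f}")
```

Because the factors are random projections of the data, the particular draw of `R` matters little, echoing the universality claim in the abstract.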
Efficient Two-Step Estimation via Targeting
http://d.repec.org/n?u=RePEc:cir:cirwor:2016s-16&r=ets
The standard description of two-step extremum estimation amounts to plugging in a first-step estimator of nuisance parameters to simplify the optimization problem and then deducing a user-friendly, but potentially inefficient, estimator for the parameters of interest. In this paper, we consider a more general setting of two-step estimation where we do not necessarily have “nuisance parameters” but rather awkward occurrences of the parameters of interest. The efficiency problem associated with two-step estimators in this context is more difficult than with standard nuisance parameters: even if the true unknown value of the parameters were plugged in to alleviate the awkward occurrences, the resulting second-step estimator may not be efficient. In addition, standard approaches to restoring efficiency for two-step procedures may not work due to a consistency issue. To alleviate this potential issue, we propose a new, computationally simple two-step estimation procedure that relies on targeting and penalization to enforce consistency, with the second-step estimators maintaining asymptotic efficiency. We compare this new method with existing iterative methods in the framework of copula models and asset pricing models. Simulation results illustrate that the new method performs better than existing iterative procedures and is (nearly) computationally equivalent.
David T. Frazier
Éric Renault
Targeting, Penalization, Multivariate Time Series Models, Asset Pricing
2016-04-08
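The basic plug-in idea the abstract starts from can be sketched in a few lines (a toy bivariate Gaussian example with hypothetical parameters, illustrating only the generic two-step plug-in principle, not the paper's targeted, penalized procedure): step one estimates the marginal parameters, step two plugs them in so that the dependence parameter can be estimated from a much simpler problem.

```python
# Hedged sketch of generic two-step plug-in estimation: marginals first,
# then the dependence parameter from standardized residuals.
import numpy as np

rng = np.random.default_rng(1)
n, true_rho = 2000, 0.6          # sample size and dependence (hypothetical)

# Simulated bivariate Gaussian data with nonzero means.
cov = np.array([[1.0, true_rho], [true_rho, 1.0]])
xy = rng.multivariate_normal([0.5, -0.2], cov, size=n)

# Step 1: first-step estimates of the marginal means and std. deviations.
mu_hat = xy.mean(axis=0)
sd_hat = xy.std(axis=0, ddof=1)

# Step 2: plug the first-step estimates in; the correlation then has a
# simple closed-form second-step estimator on standardized residuals.
z = (xy - mu_hat) / sd_hat
rho_hat = (z[:, 0] * z[:, 1]).mean()
print(f"two-step estimate of rho: {rho_hat:.3f} (true {true_rho})")
```

In this toy case the plug-in estimator is consistent; the paper's contribution concerns settings where plugging in is not enough and targeting plus penalization is needed to restore consistency and efficiency.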
A new combination approach to reducing forecast errors with an application to volatility forecasting
http://d.repec.org/n?u=RePEc:cqe:wpaper:4616&r=ets
This paper formally establishes a new forecast combination approach, which is based on VAR modeling of the forecast errors resulting from alternative forecast models. We apply our approach to volatility forecasting by combining several structural time series models with implied volatility. Using a multi-currency data set, we conduct in-sample and out-of-sample forecasting analyses in order (a) to demonstrate the statistical significance of our approach, and (b) to assess its forecasting superiority over alternative forecasting models and combinations.
Till Weigt
Bernd Wilfling
Forecast combination, volatility forecasting, realized volatility, implied volatility, exchange rates
2016-04
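The combination idea described above can be sketched as follows (a stylized simulation with hypothetical dynamics; the paper's exact VAR specification and volatility application may differ): fit a VAR(1) to the joint forecast errors of the competing models, use it to predict each model's next error, and subtract the predicted error before combining.

```python
# Hedged sketch: VAR(1)-based bias correction of two competing forecasts,
# then equal-weight combination of the corrected forecasts.
import numpy as np

rng = np.random.default_rng(2)
T = 300
y = np.cumsum(rng.standard_normal(T)) * 0.1 + 1.0   # target series (hypothetical)

# Two forecasts whose errors are persistent AR(1) processes.
e = np.zeros((T, 2))
for t in range(1, T):
    e[t] = 0.6 * e[t - 1] + 0.2 * rng.standard_normal(2)
f = y[:, None] + e                                   # forecast = truth + error

# Fit a VAR(1) to the forecast errors by least squares.
E_lag, E_now = e[:-1], e[1:]
A, *_ = np.linalg.lstsq(E_lag, E_now, rcond=None)    # 2x2 coefficient matrix

# Predict each period's errors from lagged errors and correct the forecasts.
e_hat = E_lag @ A                                    # predicted errors, periods 1..T-1
f_corr = f[1:] - e_hat
combined = f_corr.mean(axis=1)                       # equal-weight combination
naive = f[1:].mean(axis=1)                           # uncorrected combination

mse_comb = np.mean((combined - y[1:]) ** 2)
mse_naive = np.mean((naive - y[1:]) ** 2)
print(f"MSE naive: {mse_naive:.4f}, VAR-corrected: {mse_comb:.4f}")
```

In-sample, the corrected errors are least-squares residuals, so the corrected combination cannot do worse than the naive average; the paper's out-of-sample analyses test whether this carries over to real volatility forecasts.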
Inference on Self-Exciting Jumps in Prices and Volatility using High Frequency Measures
http://d.repec.org/n?u=RePEc:msh:ebswps:2016-8&r=ets
Dynamic jumps in the price and volatility of an asset are modelled using a joint Hawkes process in conjunction with a bivariate jump diffusion. A state space representation is used to link observed returns, plus nonparametric measures of integrated volatility and price jumps, to the specified model components, with Bayesian inference conducted using a Markov chain Monte Carlo algorithm. An evaluation of marginal likelihoods for the proposed model relative to a large number of alternative models, including some that have featured in the literature, is provided. An extensive empirical investigation is undertaken using data on the S&P500 market index over the 1996 to 2014 period, with substantial support for dynamic jump intensities, including in terms of predictive accuracy, documented.
Worapree Maneesoonthorn
Catherine S. Forbes
Gael M. Martin
Dynamic price and volatility jumps, stochastic volatility, Hawkes process, nonlinear state space model, Bayesian Markov chain Monte Carlo, global financial crises
2016
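The self-exciting jump intensity at the heart of the model above can be simulated with Ogata's thinning algorithm; this is a generic Hawkes sketch with hypothetical parameter values, not the paper's bivariate price-volatility specification.

```python
# Hedged sketch: simulating a univariate Hawkes jump process by thinning.
# Each jump raises the intensity by alpha, which then decays at rate beta.
import numpy as np

rng = np.random.default_rng(3)
mu, alpha, beta, horizon = 0.5, 0.8, 1.5, 100.0   # baseline, excitation, decay (hypothetical)

def intensity(t, events):
    """Conditional intensity: baseline plus exponentially decaying kicks."""
    return mu + alpha * sum(np.exp(-beta * (t - s)) for s in events if s < t)

events, t = [], 0.0
while t < horizon:
    # Intensity decays between events, so the current level (plus alpha,
    # to cover a jump at exactly t) bounds the intensity until the next event.
    lam_bar = intensity(t, events) + alpha
    t += rng.exponential(1.0 / lam_bar)            # candidate event time
    if t < horizon and rng.uniform() < intensity(t, events) / lam_bar:
        events.append(t)                           # accept (thinning step)

# Branching ratio alpha/beta < 1, so the process is stationary.
print(f"{len(events)} jumps; long-run mean rate approx {mu / (1 - alpha / beta):.2f}")
```

With alpha/beta about 0.53, roughly half of all jumps are "children" triggered by earlier jumps, which is the clustering of jump arrivals that the dynamic-intensity specification is designed to capture.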
Density Forecast Evaluation in Unstable Environments
http://d.repec.org/n?u=RePEc:ucr:wpaper:201606&r=ets
Gloria Gonzalez-Rivera
Yingying Sun
2016-04