Econometric Time Series
By: | Sancetta, A. |
Abstract: | Given the sequential-update nature of Bayes' rule, Bayesian methods find natural application to prediction problems. Advances in computational methods make it routine to use Bayesian methods in econometrics, so there is a strong case for feasible prediction in a Bayesian framework. This paper studies the theoretical properties of Bayesian predictions and shows that, under minimal conditions, finite sample bounds can be derived for the loss incurred by Bayesian predictions under the Kullback-Leibler divergence. In particular, the concept of universality of predictions is discussed, and universality is established for Bayesian predictions in a variety of settings. These include prediction under almost arbitrary loss functions, model averaging, prediction in a non-stationary environment, and prediction under model misspecification. Given the possibility of regime switches and multiple breaks in economic series, as well as the need to choose among forecasting models that may inevitably be misspecified, the finite sample results derived here are of interest for economic and financial forecasting.
Keywords: | Bayesian prediction, model averaging, universal prediction
JEL: | C11 C44 C53 |
Date: | 2007–11 |
URL: | http://d.repec.org/n?u=RePEc:cam:camdae:0755&r=ets |
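The universality results above concern the cumulative log (Kullback-Leibler) loss of Bayesian predictions relative to a reference class of models. A minimal sketch of the underlying mechanism, not taken from the paper: with a uniform prior over a finite set of hypothetical Gaussian forecasting models, the Bayesian predictive mixture incurs at most log(M) more cumulative log loss than the best single model. All models, data, and parameters below are illustrative assumptions.

```python
# Minimal sketch (not from the paper): Bayesian model averaging of predictive
# densities under log loss. With a uniform prior over M models, the cumulative
# log loss of the mixture exceeds that of the best model by at most log(M).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
T = 500
data = rng.normal(loc=0.3, scale=1.0, size=T)   # hypothetical DGP, true mean 0.3

# Candidate models: Gaussian predictive densities with fixed means (assumed)
means = np.array([-0.5, 0.0, 0.3, 1.0])
M = len(means)

log_weights = np.full(M, -np.log(M))   # uniform prior over models
mix_loss = 0.0
model_loss = np.zeros(M)

for y in data:
    log_preds = norm.logpdf(y, loc=means, scale=1.0)
    # log predictive density of the Bayesian mixture at this observation
    mix_loss -= np.logaddexp.reduce(log_weights + log_preds)
    model_loss -= log_preds
    # Bayes-rule update of the model weights
    log_weights += log_preds
    log_weights -= np.logaddexp.reduce(log_weights)

regret = mix_loss - model_loss.min()
print(f"regret vs. best model: {regret:.3f} <= log(M) = {np.log(M):.3f}")
```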
By: | Alain Chaboud; Benjamin Chiquoine; Erik Hjalmarsson; Mico Loretan |
Abstract: | Using two newly available ultrahigh-frequency datasets, we investigate empirically how frequently one can sample certain foreign exchange and U.S. Treasury security returns without contaminating estimates of their integrated volatility with market microstructure noise. Using volatility signature plots and a recently proposed formal decision rule to select the sampling frequency, we find that FX returns can be sampled as frequently as once every 15 to 20 seconds without contaminating volatility estimates; bond returns may be sampled as frequently as once every 2 to 3 minutes on days without U.S. macroeconomic announcements, and as frequently as once every 40 seconds on announcement days. With a simple realized kernel estimator, the sampling frequencies can be increased to once every 2 to 5 seconds for FX returns and to about once every 30 to 40 seconds for bond returns. These sampling frequencies, especially in the case of FX returns, are much higher than those often recommended in the empirical literature on realized volatility in equity markets. We suggest that the generally superior depth and liquidity of trading in FX and government bond markets contributes importantly to this difference.
Date: | 2007 |
URL: | http://d.repec.org/n?u=RePEc:fip:fedgif:905&r=ets |
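A minimal simulated sketch of the volatility signature idea the authors use (not their data, decision rule, or realized kernel estimator): realized variance is computed from noisy prices at several sampling intervals, and microstructure noise visibly inflates the estimate at the highest frequencies. All parameters below are assumed for illustration.

```python
# Minimal sketch (simulated, not the paper's data): a volatility signature plot
# computes realized variance at different sampling intervals; microstructure
# noise inflates the estimate as the sampling interval shrinks.
import numpy as np

rng = np.random.default_rng(1)
n_sec = int(6.5 * 3600)       # one "day" of one-second observations
sigma_day = 0.01              # true daily volatility (assumed)
noise_sd = 5e-5               # microstructure noise std (assumed)

# Efficient log price: Brownian motion; observed price adds i.i.d. noise
eff = np.cumsum(rng.normal(0, sigma_day / np.sqrt(n_sec), n_sec))
obs = eff + rng.normal(0, noise_sd, n_sec)

print("true integrated variance:", sigma_day**2)
for k in (1, 5, 15, 60, 300):           # sampling interval in seconds
    sampled = obs[::k]
    rv = np.sum(np.diff(sampled) ** 2)  # realized variance at this frequency
    print(f"interval {k:4d}s  realized variance {rv:.6e}")
```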
By: | Kuswanto, Heri; Sibbertsen, Philipp |
Abstract: | We show that specific nonlinear time series models such as SETAR, LSTAR, ESTAR and Markov switching, which are common in econometric practice, can hardly be distinguished from long memory by standard methods such as the GPH estimator of the memory parameter or linearity tests, whether general or directed against a specific nonlinear model. We show by Monte Carlo simulation that, under certain conditions, such nonlinear data generating processes can spuriously exhibit either stationary or non-stationary long memory properties.
Keywords: | Nonlinear models, long-range dependencies |
JEL: | C12 C22 |
Date: | 2007–11 |
URL: | http://d.repec.org/n?u=RePEc:han:dpaper:dp-380&r=ets |
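A minimal sketch of one of the standard tools mentioned above, the GPH log-periodogram estimator of the memory parameter d, applied to a simulated Markov-switching-in-mean series; the design and parameters are illustrative assumptions, not the paper's Monte Carlo setup. Such a series has no true long memory but can produce a spuriously positive estimate of d.

```python
# Minimal sketch (assumed design, not the paper's Monte Carlo): the GPH
# log-periodogram estimator of d, applied to a Markov-switching-in-mean
# series with no true long memory.
import numpy as np

def gph_estimate(x, power=0.5):
    """GPH estimator: regress the log periodogram on log(4*sin^2(freq/2))."""
    n = len(x)
    m = int(n ** power)                       # number of low frequencies used
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    dft = np.fft.fft(x - x.mean())[1:m + 1]
    periodogram = np.abs(dft) ** 2 / (2 * np.pi * n)
    regressor = np.log(4 * np.sin(freqs / 2) ** 2)
    slope = np.polyfit(regressor, np.log(periodogram), 1)[0]
    return -slope                             # estimate of d

rng = np.random.default_rng(2)
n = 4096
# Persistent two-state Markov chain switching the mean between 0 and 1
p_stay = 0.99
state = np.zeros(n, dtype=int)
for t in range(1, n):
    state[t] = state[t - 1] if rng.random() < p_stay else 1 - state[t - 1]
y = state + rng.normal(0, 1, n)

print("estimated d for Markov-switching mean series:", round(gph_estimate(y), 3))
```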
By: | Sibbertsen, Philipp; Kruse, Robinson |
Abstract: | We show that tests for a break in the persistence of a time series in the classical I(0)-I(1) framework suffer serious size distortions when the actual data generating process exhibits long-range dependence. We prove that the limiting distribution of a CUSUM of squares based test depends on the true memory parameter if the DGP exhibits long memory. We propose adjusted critical values for the test and give finite sample response curves that allow the practitioner to implement the test easily and to compute the relevant critical values. We furthermore prove consistency of the test, as well as of a simple break point estimator, under long memory. We show that the test has satisfactory power properties when the correct critical values are used.
Keywords: | break in persistence, long memory, CUSUM of squares based test
JEL: | C12 C22 |
Date: | 2007–11 |
URL: | http://d.repec.org/n?u=RePEc:han:dpaper:dp-381&r=ets |
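A minimal sketch of a generic CUSUM-of-squares statistic, not the authors' exact test statistic, critical values, or response curves: the cumulated squared (demeaned) series is compared with a uniform benchmark and reacts to a shift from I(0) to I(1) behaviour. The paper's point is that the correct critical values for such statistics depend on the memory parameter under long memory. The series and parameters below are assumed.

```python
# Minimal sketch (a generic CUSUM-of-squares statistic, not the paper's exact
# test): the statistic tracks the cumulated squared series against a uniform
# benchmark; under long memory its null distribution, and hence the correct
# critical values, depend on the memory parameter d.
import numpy as np

def cusum_of_squares(x):
    """Scaled max deviation of the cumulated squares from the uniform line."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    s = np.cumsum(x ** 2) / np.sum(x ** 2)
    k = np.arange(1, len(x) + 1) / len(x)
    return np.max(np.abs(s - k)) * np.sqrt(len(x))

rng = np.random.default_rng(3)
T = 1000
iid_series = rng.normal(size=T)
# Series whose persistence breaks: I(0) first half, I(1) second half
broken = np.concatenate([rng.normal(size=T // 2),
                         np.cumsum(rng.normal(size=T // 2))])

print("statistic, no break:  ", round(cusum_of_squares(iid_series), 3))
print("statistic, with break:", round(cusum_of_squares(broken), 3))
```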
By: | Nakatani, Tomoaki (Dept. of Economic Statistics, Stockholm School of Economics); Teräsvirta, Timo (CREATES, School of Economics and Management, University of Aarhus) |
Abstract: | In this article, we derive a set of necessary and sufficient conditions for the positive definiteness of the conditional covariance matrix in the conditional correlation (CC) GARCH models. Under the new conditions, it is possible to introduce negative interdependence among volatilities even in the simplest CC-GARCH(1,1) formulation. An empirical example illustrates how the conditions are imposed and verified in practice. |
Keywords: | Multivariate GARCH; positivity constraints; conditional correlation |
JEL: | C12 |
Date: | 2007–10–15 |
URL: | http://d.repec.org/n?u=RePEc:hhs:hastef:0675&r=ets |
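A minimal numerical sketch of the kind of check described above, with illustrative parameters rather than the paper's derived conditions: in a constant conditional correlation GARCH model the conditional covariance is H_t = D_t R D_t, and the code verifies along a simulated path that the conditional variances stay positive and H_t stays positive definite even though the variance recursion includes a small negative volatility spillover.

```python
# Minimal sketch (illustrative parameters, not the paper's derived conditions):
# in a CC-GARCH model H_t = D_t R D_t, positive definiteness of H_t requires
# positive conditional variances and a positive definite correlation matrix R.
# The extended variance recursion below allows a negative volatility spillover.
import numpy as np

rng = np.random.default_rng(4)
R = np.array([[1.0, 0.4],
              [0.4, 1.0]])                      # constant conditional correlation
omega = np.array([0.05, 0.05])
A = np.array([[0.08, 0.02],
              [0.02, 0.08]])                    # ARCH coefficient matrix
B = np.array([[0.85, -0.02],
              [-0.02, 0.85]])                   # GARCH matrix, negative spillover

h = np.array([0.5, 0.5])                        # initial conditional variances
chol_R = np.linalg.cholesky(R)
ok = True
for t in range(2000):
    eps = np.sqrt(h) * (chol_R @ rng.normal(size=2))   # correlated shocks
    h = omega + A @ eps**2 + B @ h                     # extended variance recursion
    if np.any(h <= 0):
        ok = False
        break
    H = np.diag(np.sqrt(h)) @ R @ np.diag(np.sqrt(h))  # conditional covariance
    if np.any(np.linalg.eigvalsh(H) <= 0):
        ok = False
        break
print("conditional covariances stayed positive definite:", ok)
```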
By: | Muhammad Akram; Rob J. Hyndman; J. Keith Ord |
Abstract: | We consider the properties of nonlinear exponential smoothing state space models under various assumptions about the innovations, or error, process. Our interest is restricted to those models that are used to describe non-negative observations, because many series of practical interest are so constrained. We first demonstrate that when the innovations process is assumed to be Gaussian, the resulting prediction distribution may have an infinite variance beyond a certain forecasting horizon. Further, such processes may converge almost surely to zero; an examination of purely multiplicative models reveals the circumstances under which this condition arises. We then explore the effects of using an (invalid) Gaussian distribution to describe the innovations process when the underlying distribution is lognormal. Our results suggest that this approximation causes no serious problems for parameter estimation or for forecasting one or two steps ahead. However, for longer-term forecasts the true prediction intervals become increasingly skewed, whereas those based on the Gaussian approximation may have a progressively larger negative component. In addition, the Gaussian approximation is clearly inappropriate for simulation purposes. The performance of the Gaussian approximation is compared with that of two lognormal models for short-term forecasting using data on the weekly sales of over three hundred items of costume jewelry.
Keywords: | Forecasting; time series; exponential smoothing; positive-valued processes; seasonality; state space models. |
JEL: | C53 C22 C51 |
Date: | 2007–11 |
URL: | http://d.repec.org/n?u=RePEc:msh:ebswps:2007-14&r=ets |
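A minimal sketch of the issue the abstract raises, using a standard purely multiplicative local-level recursion (ETS(M,N,N)) with assumed parameters, not the authors' code or data: simulating the multi-step prediction distribution under Gaussian innovations shows that the simulated observations can turn negative, one symptom of why the Gaussian assumption is questionable for non-negative series.

```python
# Minimal sketch (a standard ETS(M,N,N) recursion with assumed parameters):
# simulating the multi-step prediction distribution of a purely multiplicative
# local-level model, y_t = l_{t-1}(1 + e_t), l_t = l_{t-1}(1 + alpha * e_t).
# Gaussian innovations put positive probability on negative observations.
import numpy as np

rng = np.random.default_rng(5)
alpha, level0, sigma = 0.3, 100.0, 0.3
horizon, n_paths = 20, 10000

paths = np.empty((n_paths, horizon))
level = np.full(n_paths, level0)
for h in range(horizon):
    e = rng.normal(0.0, sigma, n_paths)       # Gaussian innovations (assumed)
    paths[:, h] = level * (1 + e)             # observation equation
    level = level * (1 + alpha * e)           # level update

q = np.percentile(paths[:, -1], [2.5, 50, 97.5])
print("h=20 prediction interval (2.5%, 50%, 97.5%):", np.round(q, 2))
print("share of negative simulated observations:",
      np.round(np.mean(paths < 0), 4))
```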
By: | Francesco Audrino; Dominik Colagelo |
Abstract: | We propose a new semi-parametric model for the implied volatility surface that incorporates machine learning algorithms. Given a starting model, a tree-boosting algorithm sequentially minimizes the residuals between observed and estimated implied volatilities. To overcome the poor predictive power of existing models, we include a grid in the region of interest and implement a cross-validation strategy to find an optimal stopping value for the tree boosting. Backtesting the out-of-sample performance of our model on a large data set of implied volatilities on S&P 500 options, we provide empirical evidence of its strong predictive potential and compare it with other standard approaches in the literature.
Keywords: | Implied Volatility, Implied Volatility Surface, Forecasting, Tree Boosting, Regression Tree, Functional Gradient Descent |
JEL: | C13 C14 C51 C53 C63 G12 G13 |
Date: | 2007–11 |
URL: | http://d.repec.org/n?u=RePEc:usg:dp2007:2007-42&r=ets |
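A minimal sketch of L2 tree boosting of implied volatility residuals on synthetic data; the starting model, data generating surface, and shrinkage are assumptions, and the cross-validated stopping rule and S&P 500 data set used by the authors are omitted.

```python
# Minimal sketch (synthetic data, not the authors' data set or exact functional
# gradient descent setup): L2 tree boosting of the residuals between observed
# and model-implied volatilities over a (moneyness, maturity) grid.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(6)
n = 2000
moneyness = rng.uniform(0.8, 1.2, n)
maturity = rng.uniform(0.05, 2.0, n)
X = np.column_stack([moneyness, maturity])

# "Observed" implied vols: smile + term structure + noise (assumed shapes)
iv_obs = (0.20 + 0.5 * (moneyness - 1.0) ** 2
          + 0.03 * np.sqrt(maturity) + rng.normal(0, 0.01, n))
iv_start = np.full(n, 0.20)                     # flat starting model

fit = iv_start.copy()
shrinkage, n_rounds = 0.1, 200                  # fixed stopping value (assumed)
for _ in range(n_rounds):
    residual = iv_obs - fit                     # current residuals
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residual)
    fit += shrinkage * tree.predict(X)          # boosting update

rmse_start = np.sqrt(np.mean((iv_obs - iv_start) ** 2))
rmse_boost = np.sqrt(np.mean((iv_obs - fit) ** 2))
print(f"RMSE flat start {rmse_start:.4f} -> after boosting {rmse_boost:.4f}")
```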
By: | McCauley, Joseph L. |
Abstract: | We show that Ito processes imply the Fokker-Planck (K2) and Kolmogorov backward-time (K1) partial differential equations (PDEs) for transition densities, which in turn imply the Chapman-Kolmogorov equation without approximation. This result is not restricted to Markov processes. We define 'finite memory' and show that Ito processes admit finitely many states of memory. We then provide an example of a Gaussian transition density depending on two past states that satisfies K1, K2, and the Chapman-Kolmogorov equation. Finally, we show that transition densities of Black-Scholes-type PDEs with finite memory are martingales and also satisfy the Chapman-Kolmogorov equation. This leads to the shortest possible proof that the transition density of the Black-Scholes PDE provides the so-called 'martingale measure' of option pricing.
Keywords: | Ito process; martingale; stochastic differential equation; Langevin equation; memory; non-Markov process; Fokker-Planck equation; Kolmogorov's backward-time equation; Chapman-Kolmogorov equation; Black-Scholes equation
JEL: | G1 C20 |
Date: | 2007–11–16 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:5811&r=ets |
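A minimal numerical check of the Chapman-Kolmogorov equation for the simplest case, the Gaussian transition density of a driftless unit-diffusion Ito process (the Wiener process), not the paper's two-state-memory example; the time points and integration grid are arbitrary assumptions.

```python
# Minimal sketch (Wiener-process transition density, a special case, not the
# paper's example): numerical check of the Chapman-Kolmogorov equation
# p(y,t|x,s) = ∫ p(y,t|z,u) p(z,u|x,s) dz for s < u < t.
import numpy as np
from scipy.stats import norm

def p(y, t, x, s):
    """Gaussian transition density of a driftless unit-diffusion Ito process."""
    return norm.pdf(y, loc=x, scale=np.sqrt(t - s))

x, s, u, t, y = 0.0, 0.0, 1.0, 2.5, 0.7
z = np.linspace(-15, 15, 4001)                 # integration grid (assumed range)
integrand = p(y, t, z, u) * p(z, u, x, s)
lhs = p(y, t, x, s)
rhs = np.sum(integrand) * (z[1] - z[0])        # simple Riemann sum
print(f"direct density {lhs:.8f}  vs  Chapman-Kolmogorov integral {rhs:.8f}")
```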
By: | Francis X. Diebold (University of Pennsylvania and NBER); Kamil Yılmaz
Abstract: | Notwithstanding its impressive contributions to empirical financial economics, there remains a significant gap in the volatility literature, namely its relative neglect of the connection between macroeconomic fundamentals and asset return volatility. We make progress by analyzing a broad international cross section of stock markets. We find a clear link between macroeconomic fundamentals and stock market volatilities, with volatile fundamentals translating into volatile stock markets.
Keywords: | Financial market, equity market, asset return, risk, variance, asset pricing |
JEL: | G1 E0 |
Date: | 2004–03 |
URL: | http://d.repec.org/n?u=RePEc:koc:wpaper:0711&r=ets |