nep-ets New Economics Papers
on Econometric Time Series
Issue of 2009‒09‒26
29 papers chosen by
Yong Yin
SUNY at Buffalo

  1. Modelling and Forecasting Liquidity Supply Using Semiparametric Factor Dynamics By Wolfgang Karl Härdle; Nikolaus Hautsch; Andrija Mihoci
  2. ADL tests for threshold cointegration By Jing Li; Junsoo Lee
  3. The Multistep Beveridge-Nelson Decomposition By Tommaso Proietti
  4. A Nonparametric Copula Based Test for Conditional Independence with Applications to Granger Causality By Taoufik Bouezmarni; Jeroen V.K. Rombouts; Abderrahim Taamouti
  5. Nonparametric time series forecasting with dynamic updating By Han Lin Shang; Rob J Hyndman
  6. Using Panel Data to Construct Simple and Efficient Unit Root Tests in the Presence of GARCH By Westerlund, Joakim; Narayan, Paresh
  7. Seasonal Unit Root Tests for Trending and Breaking Series with Application to Industrial Production By Westerlund, Joakim; Costantini, Mauro; Narayan, Paresh; Popp, Stephan
  8. Myths and Facts about Panel Unit Root Tests By Westerlund, Joakim; Breitung, Jörg
  9. Contemporaneous-Threshold Smooth Transition GARCH Models By Michael Dueker; Zacharias Psaradakis; Martin Sola; Fabio Spagnolo
  10. Multivariate Contemporaneous Threshold Autoregressive Models By Michael Dueker; Zacharias Psaradakis; Martin Sola; Fabio Spagnolo
  11. Bootstrap Unit Root Tests for Nonlinear Threshold Models By Dilem Yildirim; Ralf Becker; Denise R Osborn
  12. Finite Sample Correction Factors for Panel Cointegration Tests By Hlouskova, Jaroslava; Wagner, Martin
  13. "Modelling and Forecasting Noisy Realized Volatility" By Manabu Asai; Michael McAleer; Marcelo C. Medeiros
  14. "A General Asymptotic Theory for Time Series Models" By Shiqing Ling; Michael McAleer
  15. Matching Theory and Data: Bayesian Vector Autoregression and Dynamic Stochastic General Equilibrium Models By Alexander Kriwoluzky
  16. MIDAS vs. mixed-frequency VAR: Nowcasting GDP in the Euro Area By Vladimir Kuzin; Massimiliano Marcellino; Christian Schumacher
  17. Forecasting Large Datasets with Bayesian Reduced Rank Multivariate Models By Andrea Carriero; George Kapetanios; Massimiliano Marcellino
  18. An Adaptive Markov Chain Monte Carlo Method for GARCH Model By Tetsuya Takaishi
  19. Inference on multivariate ARCH processes with large sizes By Gilles Zumbach
  20. High order discretization schemes for stochastic volatility models By Benjamin Jourdain; Mohamed Sbai
  21. Long-range memory stochastic model of the return in financial markets By V. Gontis; J. Ruseckas; A. Kononovicius
  22. Asymptotic Behavior of the Stock Price Distribution Density and Implied Volatility in Stochastic Volatility Models By A. Gulisashvili; E. M. Stein
  23. Most Efficient Homogeneous Volatility Estimators By A. Saichev; D. Sornette; V. Filimonov
  24. Bayesian Inference on QGARCH Model Using the Adaptive Construction Scheme By Tetsuya Takaishi
  25. Optimisation of Stochastic Programming by Hidden Markov Modelling based Scenario Generation By Sovan Mitra
  26. New procedures for testing whether stock price processes are martingales By Kei Takeuchi; Akimichi Takemura; Masayuki Kumon
  27. Modeling non-Markovian, nonstationary scaling dynamics By Fulvio Baldovin; Dario Bovina; Attilio L. Stella
  28. Bayesian Estimation of the GARCH(1,1) Model with Student-t Innovations in R By Ardia, David
  29. Forecast performance of implied volatility and the impact of the volatility risk premium By Ralf Becker; Adam Clements; Christopher Coleman-Fenn

  1. By: Wolfgang Karl Härdle (Humboldt Universität zu Berlin and National Central University, Taiwan); Nikolaus Hautsch (Humboldt Universität zu Berlin, Quantitative Products Laboratory, Berlin, and CFS); Andrija Mihoci (Humboldt Universität zu Berlin and University of Zagreb, Croatia)
    Abstract: We model the dynamics of ask and bid curves in a limit order book market using a dynamic semiparametric factor model. The shape of the curves is captured by a factor structure which is estimated nonparametrically. Corresponding factor loadings are assumed to follow multivariate dynamics and are modelled using a vector autoregressive model. Applying the framework to four stocks traded at the Australian Stock Exchange (ASX) in 2002, we show that the suggested model captures the spatial and temporal dependencies of the limit order book. Relating the shape of the curves to variables reflecting the current state of the market, we show that the recent liquidity demand has the strongest impact. In an extensive forecasting analysis we show that the model is successful in forecasting the liquidity supply over various time horizons during a trading day. Moreover, it is shown that the model’s forecasting power can be used to improve optimal order execution strategies.
    Keywords: Limit Order Book, Liquidity Risk, Semiparametric Model, Factor Structure, Prediction
    JEL: C14 C32 C53 G1
    Date: 2009–09–15
    URL: http://d.repec.org/n?u=RePEc:cfs:cfswop:wp200918&r=ets
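    To make the modelling idea concrete, the following is a minimal sketch of a factor-plus-VAR treatment of order-book curves: the curve shapes are compressed into a few factors (plain PCA here, whereas the paper estimates the factor functions nonparametrically) and the factor loadings are forecast with a VAR(1). All data and dimensions below are hypothetical.

```python
import numpy as np

# Hypothetical data: each row is a snapshot of the cumulative ask curve
# evaluated on a fixed grid of relative price levels (T snapshots x M grid points).
rng = np.random.default_rng(0)
T, M, K = 500, 40, 3
curves = np.cumsum(rng.gamma(2.0, 1.0, size=(T, M)), axis=1)

# Step 1: extract a K-factor structure from the demeaned curves (plain PCA here;
# the paper estimates the factor functions nonparametrically).
mean_curve = curves.mean(axis=0)
U, s, Vt = np.linalg.svd(curves - mean_curve, full_matrices=False)
factors = Vt[:K]                               # K x M factor "curves"
loadings = (curves - mean_curve) @ factors.T   # T x K time series of loadings

# Step 2: fit a VAR(1) to the loadings by least squares.
Y, X = loadings[1:], np.hstack([np.ones((T - 1, 1)), loadings[:-1]])
B, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Step 3: one-step-ahead forecast of the loadings and the implied curve.
next_loading = np.hstack([1.0, loadings[-1]]) @ B
forecast_curve = mean_curve + next_loading @ factors
print(forecast_curve[:5])
```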
  2. By: Jing Li (Department of Economics, South Dakota State University); Junsoo Lee (Department of Economics, Finance, and Legal Studies, University of Alabama)
    Abstract: In this paper, we propose new tests for threshold cointegration in the autoregressive distributed lag (ADL) model. The indicators in the threshold model are based on either a nonstationary or stationary threshold variable. The cointegrating vector in this paper is not pre-specified. We adopt a supremum Wald type test to account for the so-called Davies problem. The asymptotic null distributions of the proposed tests are free of nuisance parameters. As such, a bootstrap procedure is not required and critical values of the proposed tests are tabulated. A Monte Carlo experiment shows good finite-sample performance of the proposed tests.
    Keywords: Econometric Theory, Time Series
    JEL: C12 C15 C32
    Date: 2009–04
    URL: http://d.repec.org/n?u=RePEc:sda:workpa:22009&r=ets
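    The sup-Wald idea can be illustrated with a generic threshold regression: compute a Wald statistic for the regime-specific adjustment coefficients at every candidate threshold in a trimmed grid and take the supremum, which is how the unidentified-nuisance-parameter (Davies) problem is handled. The sketch below is illustrative only; it is not the paper's exact ADL statistic, the choice of threshold variable is an assumption, and its value cannot be compared with the tabulated critical values.

```python
import numpy as np

def sup_wald_threshold(y, x, trim=0.15):
    """Generic sup-Wald statistic for a threshold effect in an ADL-type
    regression dy_t = a + (r1*I + r2*(1-I))*y_{t-1} + b*x_{t-1} + e_t,
    where I = 1{q_{t-1} <= tau} and q is the lagged dependent variable.
    Illustrative only: not the paper's exact test or critical values."""
    dy = np.diff(y)
    y1, x1 = y[:-1], x[:-1]
    q = y1                               # threshold variable (an assumption)
    grid = np.quantile(q, np.linspace(trim, 1 - trim, 50))
    best = -np.inf
    for tau in grid:
        ind = (q <= tau).astype(float)
        X = np.column_stack([np.ones_like(y1), ind * y1, (1 - ind) * y1, x1])
        beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
        resid = dy - X @ beta
        sigma2 = resid @ resid / (len(dy) - X.shape[1])
        XtX_inv = np.linalg.inv(X.T @ X)
        # Wald test of r1 = r2 = 0 (the two regime-specific adjustment terms).
        R = np.zeros((2, X.shape[1])); R[0, 1] = R[1, 2] = 1.0
        Rb = R @ beta
        W = Rb @ np.linalg.solve(R @ XtX_inv @ R.T * sigma2, Rb)
        best = max(best, W)
    return best

rng = np.random.default_rng(1)
x = np.cumsum(rng.standard_normal(300))
y = x + rng.standard_normal(300)        # toy cointegrated pair
print(sup_wald_threshold(y, x))
```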
  3. By: Tommaso Proietti
    Abstract: The Beveridge-Nelson decomposition defines the trend component in terms of the eventual forecast function, as the value the series would take if it were on its long-run path. The paper introduces the multistep Beveridge-Nelson decomposition, which arises when the forecast function is obtained by the direct autoregressive approach, which optimizes the predictive ability of the AR model at forecast horizons greater than one. We compare our proposal with the standard Beveridge-Nelson decomposition, for which the forecast function is obtained by iterating the one-step-ahead predictions via the chain rule. We illustrate that the multistep Beveridge-Nelson trend is more efficient than the standard one in the presence of model misspecification and we subsequently assess the predictive validity of the extracted transitory component with respect to future growth.
    Keywords: Trend and Cycle, Forecasting, Filtering.
    JEL: C22 C52 E32
    Date: 2009–09–24
    URL: http://d.repec.org/n?u=RePEc:eei:rpaper:eeri_rp_2009_24&r=ets
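    For reference, the standard (iterated) Beveridge-Nelson trend can be computed from an AR(p) fitted to first differences by accumulating the demeaned multistep forecasts; the paper's multistep variant replaces these iterated forecasts with direct h-step autoregressions. A minimal sketch with simulated data:

```python
import numpy as np

def bn_trend(y, p=4, horizon=200):
    """Standard (iterated) Beveridge-Nelson trend at the last observation:
    trend_T = y_T + sum_{j>=1} (E_T[dy_{T+j}] - mean(dy)).
    The paper's multistep variant instead obtains each E_T[dy_{T+j}] from a
    direct h-step autoregression; this sketch uses the iterated chain rule."""
    dy = np.diff(y)
    mu = dy.mean()
    z = dy - mu
    # Fit AR(p) to the demeaned differences by OLS.
    X = np.column_stack([z[p - k - 1: len(z) - k - 1] for k in range(p)])
    Y = z[p:]
    phi, *_ = np.linalg.lstsq(X, Y, rcond=None)
    # Iterate forecasts of the demeaned differences and accumulate them.
    state = list(z[-p:][::-1])           # most recent value first
    cycle = 0.0
    for _ in range(horizon):
        f = float(np.dot(phi, state))
        cycle += f
        state = [f] + state[:-1]
    return y[-1] + cycle                 # BN trend; y[-1] minus this is the cycle

rng = np.random.default_rng(2)
y = np.cumsum(0.2 + rng.standard_normal(400))
print(bn_trend(y))
```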
  4. By: Taoufik Bouezmarni; Jeroen V.K. Rombouts; Abderrahim Taamouti
    Abstract: This paper proposes a new nonparametric test for conditional independence, which is based on the comparison of Bernstein copula densities using the Hellinger distance. The test is easy to implement because it does not involve a weighting function in the test statistic, and it can be applied in general settings since there is no restriction on the dimension of the data. In fact, to apply the test, only a bandwidth is needed for the nonparametric copula. We prove that the test statistic is asymptotically pivotal under the null hypothesis, establish local power properties, and motivate the validity of the bootstrap technique that we use in finite sample settings. A simulation study illustrates the good size and power properties of the test. We illustrate the empirical relevance of our test by focusing on Granger causality using financial time series data to test for nonlinear leverage versus volatility feedback effects and to test for causality between stock returns and trading volume. In a third application, we investigate Granger causality between macroeconomic variables.
    Keywords: Nonparametric tests, conditional independence, Granger non-causality, Bernstein density copula, bootstrap, finance, volatility asymmetry, leverage effect, volatility feedback effect, macroeconomics
    JEL: C12 C14 C15 C19 G1 G12 E3 E4 E52
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:lvl:lacicr:0927&r=ets
  5. By: Han Lin Shang; Rob J Hyndman
    Abstract: We present a nonparametric method to forecast a seasonal univariate time series, and propose four dynamic updating methods to improve point forecast accuracy. Our methods consider a seasonal univariate time series as a functional time series. We propose first to reduce the dimensionality by applying functional principal component analysis to the historical observations, and then to use univariate time series forecasting and functional principal component regression techniques. When data in the most recent year are partially observed, we improve point forecast accuracy using dynamic updating methods. We also introduce a nonparametric approach to construct prediction intervals of updated forecasts, and compare the empirical coverage probability with an existing parametric method. Our approaches are data-driven and computationally fast, and hence they are feasible for real-time, high-frequency dynamic updating. The methods are demonstrated using monthly sea surface temperatures from 1950 to 2008.
    Keywords: Functional time series, Functional principal component analysis, Ordinary least squares, Penalized least squares, Ridge regression, Sea surface temperatures, Seasonal time series.
    JEL: C14 C23
    Date: 2009–08
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2009-8&r=ets
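    A stripped-down version of the functional approach: arrange the monthly series as year-by-month curves, extract principal components by SVD, forecast the component scores with a simple univariate model, and rebuild next year's curve. The paper's dynamic updating and prediction intervals are not attempted here, and the AR(1) score forecasts stand in for its richer univariate methods.

```python
import numpy as np

# Hypothetical monthly series arranged as a (years x 12) matrix: each row is
# one year treated as a curve, in the spirit of a functional time series.
rng = np.random.default_rng(3)
n_years, K = 40, 2
seasonal = 10 + 3 * np.sin(2 * np.pi * np.arange(12) / 12)
data = seasonal + 0.05 * np.arange(n_years)[:, None] + rng.normal(0, 0.3, (n_years, 12))

# Functional PCA via the SVD of the centred data matrix.
mean_curve = data.mean(axis=0)
U, s, Vt = np.linalg.svd(data - mean_curve, full_matrices=False)
components = Vt[:K]                            # K basis "curves" of length 12
scores = (data - mean_curve) @ components.T    # (n_years x K) score series

# Forecast each score series one year ahead with a simple AR(1) (the paper uses
# richer univariate forecasting methods).
next_scores = np.empty(K)
for k in range(K):
    z = scores[:, k]
    phi, *_ = np.linalg.lstsq(z[:-1, None], z[1:], rcond=None)
    next_scores[k] = phi[0] * z[-1]

forecast_year = mean_curve + next_scores @ components
print(np.round(forecast_year, 2))
```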
  6. By: Westerlund, Joakim (Department of Economics, School of Business, Economics and Law, Göteborg University); Narayan, Paresh (Deakin University)
    Abstract: In the search for more efficient unit root tests in the presence of GARCH, some researchers have recently turned their attention to estimation by maximum likelihood. However, although theoretically appealing, the new test is difficult to implement, which has made it quite uncommon in the empirical literature. The current paper offers a panel data based solution to this problem.
    Keywords: Panel Data; Unit Root Tests; GARCH
    JEL: C23 G00
    Date: 2009–09–11
    URL: http://d.repec.org/n?u=RePEc:hhs:gunwpe:0379&r=ets
  7. By: Westerlund, Joakim (Department of Economics, School of Business, Economics and Law, Göteborg University); Costantini, Mauro (University of Vienna); Narayan, Paresh (Deakin University); Popp, Stephan (University of Duisburg–Essen)
    Abstract: Some unit root testing situations are more difficult than others. In the case of quarterly industrial production there is not only the seasonal variation that needs to be considered but also the occasionally breaking linear trend. In the current paper we take this as our starting point to develop three new seasonal unit root tests that allow for a break in both the seasonal mean and linear trend of a quarterly time series. The asymptotic properties of the tests are derived and investigated in small samples using simulations. In the empirical part of the paper we consider as an example the industrial production of 13 European countries. The results suggest that for most of the series there is evidence of stationary seasonality around an otherwise nonseasonal unit root.
    Keywords: Seasonal unit root tests; Structural breaks; Linear time trend; Industrial production
    JEL: C12 C22
    Date: 2009–09–11
    URL: http://d.repec.org/n?u=RePEc:hhs:gunwpe:0377&r=ets
  8. By: Westerlund, Joakim (Department of Economics, School of Business, Economics and Law, Göteborg University); Breitung, Jörg (University of Bonn)
    Abstract: This paper points to some of the common myths and facts that have emerged from 20 years of research into the analysis of unit roots in panel data. Some of these are well known, others are not. But they all have in common that, if ignored, the effects can be very serious. This is demonstrated using both simulations and theoretical reasoning.
    Keywords: Non-stationary panel data; Unit root tests; Cross-section dependence; Multidimensional limits
    JEL: C13 C33
    Date: 2009–09–11
    URL: http://d.repec.org/n?u=RePEc:hhs:gunwpe:0380&r=ets
  9. By: Michael Dueker; Zacharias Psaradakis; Martin Sola; Fabio Spagnolo
    Abstract: This paper proposes a contemporaneous-threshold smooth transition GARCH (or C-STGARCH) model for dynamic conditional heteroskedasticity. The C-STGARCH model is a generalization to second conditional moments of the contemporaneous smooth transition threshold autoregressive model of Dueker et al. (2007), in which the regime weights depend on the ex ante probability that a contemporaneous latent regime-specific variable exceeds a threshold value. A key feature of the C-STGARCH model is that its transition function depends on all the parameters of the model as well as on the data. These characteristics allow the model to account for the large persistence and regime shifts that are often observed in the conditional second moments of economic and financial time series.
    Keywords: Conditional heteroskedasticity; Smooth transition GARCH; Threshold; Stock returns.
    JEL: C22 E31 G12
    Date: 2009–06
    URL: http://d.repec.org/n?u=RePEc:udt:wpecon:2009-06&r=ets
  10. By: Michael Dueker; Zacharias Psaradakis; Martin Sola; Fabio Spagnolo
    Abstract: In this paper we propose a contemporaneous threshold multivariate smooth transition autoregressive (C-MSTAR) model in which the regime weights depend on the ex ante probabilities that latent regime-specific variables exceed certain threshold values. A key feature of the model is that the transition function depends on all the parameters of the model as well as on the data. Since the mixing weights are a function of the regime-specific contemporaneous variance-covariance matrix, the model can account for contemporaneous regime-specific co-movements of the variables. The stability and distributional properties of the proposed model are discussed, as well as issues of estimation, testing and forecasting. The practical usefulness of the C-MSTAR model is illustrated by examining the relationship between US stock prices and interest rates and discussing the regime-specific Granger causality relationships.
    Keywords: Nonlinear autoregressive models; Smooth transition; Stability; Threshold.
    JEL: C32 G12
    Date: 2009–03
    URL: http://d.repec.org/n?u=RePEc:udt:wpecon:2009-03&r=ets
  11. By: Dilem Yildirim; Ralf Becker; Denise R Osborn
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:man:sespap:0915&r=ets
  12. By: Hlouskova, Jaroslava (Department of Economics and Finance, Institute for Advanced Studies, Vienna, Austria); Wagner, Martin (Department of Economics and Finance, Institute for Advanced Studies, Vienna, Austria)
    Abstract: In this paper we present finite T mean and variance correction factors and corresponding response surface regressions for the panel cointegration tests presented in Pedroni (1999, 2004), Westerlund (2005), Larsson et al. (2001), and Breitung (2005). For the single equation tests we consider up to 12 regressors and for the system tests vector autoregression dimensions up to 12 variables. All commonly used specifications for the deterministic components are considered. The time dimension sample sizes are 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200, 500.
    Keywords: Panel cointegration test, correction factor, response surface, simulation
    JEL: C12 C15 C23
    Date: 2009–09
    URL: http://d.repec.org/n?u=RePEc:ihs:ihsesp:244&r=ets
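    The response-surface idea is simply to regress simulated finite-T correction factors on functions of 1/T so that a factor can be interpolated for sample sizes not covered by the simulations. A hedged sketch with made-up numbers (the paper's actual factors and functional forms differ):

```python
import numpy as np

# Hypothetical simulated mean-correction factors for a panel cointegration
# statistic at the T values used in the paper (the numbers are made up).
T = np.array([10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200, 500], dtype=float)
mean_factor = np.array([2.91, 2.31, 2.12, 2.03, 1.98, 1.94, 1.92, 1.90,
                        1.88, 1.87, 1.81, 1.78])

# Response surface: factor(T) ~ a0 + a1/T + a2/T^2, estimated by OLS.
X = np.column_stack([np.ones_like(T), 1.0 / T, 1.0 / T**2])
coef, *_ = np.linalg.lstsq(X, mean_factor, rcond=None)

def correction(T_new):
    """Interpolated finite-T correction factor for an arbitrary sample size."""
    return coef[0] + coef[1] / T_new + coef[2] / T_new**2

print(correction(150.0))   # factor for a T not included in the simulations
```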
  13. By: Manabu Asai (Faculty of Economics, Soka University); Michael McAleer (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam and Tinbergen Institute and Center for International Research on the Japanese Economy (CIRJE), Faculty of Economics, University of Tokyo); Marcelo C. Medeiros (Department of Economics, Pontifical Catholic University of Rio de Janeiro)
    Abstract: Several methods have recently been proposed in the ultra high frequency financial literature to remove the effects of microstructure noise and to obtain consistent estimates of the integrated volatility (IV) as a measure of ex-post daily volatility. Even bias-corrected and consistent (modified) realized volatility (RV) estimates of the integrated volatility can contain residual microstructure noise and other measurement errors. Such noise is called "realized volatility error". Since such measurement errors are usually ignored, we need to take account of them in estimating and forecasting IV. This paper investigates through Monte Carlo simulations the effects of RV errors on estimating and forecasting IV with RV data. It is found that: (i) neglecting RV errors can lead to serious bias in estimators due to model misspecification; (ii) the effects of RV errors on one-step ahead forecasts are minor when consistent estimators are used and when the number of intraday observations is large; and (iii) even the partially corrected R2 recently proposed in the literature should be fully corrected for evaluating forecasts. This paper proposes a full correction of R2, which can be applied to linear and nonlinear, short and long memory models. An empirical example for S&P 500 data is used to demonstrate that neglecting RV errors can lead to serious bias in estimating the model of integrated volatility, and that the new method proposed here can eliminate the effects of the RV noise. The empirical results also show that the full correction for R2 is necessary for an accurate description of goodness-of-fit.
    Date: 2009–09
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2009cf669&r=ets
  14. By: Shiqing Ling (Department of Mathematics, Hong Kong University of Science and Technology); Michael McAleer (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam and Tinbergen Institute and Center for International Research on the Japanese Economy (CIRJE), Faculty of Economics, University of Tokyo)
    Abstract: This paper develops a general asymptotic theory for the estimation of strictly stationary and ergodic time series models. Under simple conditions that are straightforward to check, we establish the strong consistency, the rate of strong convergence and the asymptotic normality of a general class of estimators that includes LSE, MLE, and some M-type estimators. As an application, we verify the assumptions for the long-memory fractional ARIMA model. Other examples include the GARCH(1,1) model, random coefficient AR(1) model and the threshold MA(1) model.
    Date: 2009–09
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2009cf670&r=ets
  15. By: Alexander Kriwoluzky
    Abstract: This paper shows how to identify the structural shocks of a Vector Autoregression (VAR) while simultaneously estimating a dynamic stochastic general equilibrium (DSGE) model that is not assumed to replicate the data-generating process. It proposes a framework for estimating the parameters of the VAR model and the DSGE model jointly: the VAR model is identified by sign restrictions derived from the DSGE model; the DSGE model is estimated by matching the corresponding impulse response functions.
    Keywords: Bayesian Model Estimation, Vector Autoregression, Identification
    JEL: C51
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2009/29&r=ets
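    The sign-restriction step can be sketched with the usual rotation algorithm: rotate the Cholesky factor of the VAR residual covariance by random orthogonal matrices and keep rotations whose impact responses carry the required signs. In the paper the signs are derived from the estimated DSGE model; below they are supplied by hand, and only impact (horizon-zero) responses are checked.

```python
import numpy as np

def draw_sign_identified_impact(resid_cov, sign_pattern, rng, max_tries=10000):
    """Draw an impact matrix consistent with sign restrictions: rotate the
    Cholesky factor of the residual covariance by random orthogonal matrices
    until the impact responses match the required signs. In the paper the
    signs come from the estimated DSGE model; here they are supplied directly."""
    P = np.linalg.cholesky(resid_cov)
    n = resid_cov.shape[0]
    for _ in range(max_tries):
        A = rng.standard_normal((n, n))
        Q, R = np.linalg.qr(A)
        Q = Q @ np.diag(np.sign(np.diag(R)))      # normalize the rotation
        impact = P @ Q
        mask = sign_pattern != 0
        if np.all(np.sign(impact)[mask] == sign_pattern[mask]):
            return impact
    raise RuntimeError("no rotation satisfied the sign restrictions")

rng = np.random.default_rng(10)
resid_cov = np.array([[1.0, 0.3], [0.3, 0.5]])
# Require shock 1 to move both variables up on impact; leave shock 2 free.
signs = np.array([[1, 0],
                  [1, 0]])
print(draw_sign_identified_impact(resid_cov, signs, rng))
```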
  16. By: Vladimir Kuzin; Massimiliano Marcellino; Christian Schumacher
    Abstract: This paper compares the mixed-data sampling (MIDAS) and mixed-frequency VAR (MF-VAR) approaches to model specification in the presence of mixed-frequency data, e.g., monthly and quarterly series. MIDAS leads to parsimonious models based on exponential lag polynomials for the coefficients, whereas MF-VAR does not restrict the dynamics and therefore can suffer from the curse of dimensionality. But if the restrictions imposed by MIDAS are too stringent, the MF-VAR can perform better. Hence, it is difficult to rank MIDAS and MF-VAR a priori, and their relative ranking is better evaluated empirically. In this paper, we compare their performance in a relevant case for policy making, i.e., nowcasting and forecasting quarterly GDP growth in the euro area, on a monthly basis and using a set of 20 monthly indicators. It turns out that the two approaches are more complementary than substitutes, since MF-VAR tends to perform better for longer horizons, whereas MIDAS for shorter horizons.
    Keywords: nowcasting, mixed-frequency data, mixed-frequency VAR, MIDAS
    JEL: E37 C53
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2009/32&r=ets
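    The MIDAS side of the comparison rests on the exponential Almon lag polynomial, which maps two parameters into a full set of monthly lag weights. A minimal nonlinear-least-squares sketch on simulated data (one indicator, 12 monthly lags; the paper's specifications are richer):

```python
import numpy as np
from scipy.optimize import minimize

def exp_almon(theta1, theta2, n_lags):
    """Normalized exponential Almon lag weights used in MIDAS regressions."""
    k = np.arange(1, n_lags + 1)
    w = np.exp(theta1 * k + theta2 * k**2)
    return w / w.sum()

def midas_rss(params, y_q, x_m_lags):
    """Residual sum of squares of quarterly y on the weighted monthly lags."""
    const, beta, t1, t2 = params
    w = exp_almon(t1, t2, x_m_lags.shape[1])
    fitted = const + beta * (x_m_lags @ w)
    return np.sum((y_q - fitted) ** 2)

# Hypothetical data: quarterly growth y_q and, for each quarter, the 12 most
# recent monthly observations of one indicator (rows are quarters).
rng = np.random.default_rng(4)
n_q, n_lags = 80, 12
x_m_lags = rng.standard_normal((n_q, n_lags))
true_w = exp_almon(0.1, -0.05, n_lags)
y_q = 0.5 + 2.0 * (x_m_lags @ true_w) + rng.normal(0, 0.2, n_q)

# Estimate the (nonlinear) MIDAS parameters by least squares.
res = minimize(midas_rss, x0=[0.0, 1.0, 0.0, 0.0], args=(y_q, x_m_lags),
               method="Nelder-Mead")
print(res.x)
```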
  17. By: Andrea Carriero; George Kapetanios; Massimiliano Marcellino
    Abstract: The paper addresses the issue of forecasting a large set of variables using multivariate models. In particular, we propose three alternative reduced rank forecasting models and compare their predictive performance for US time series with the most promising existing alternatives, namely, factor models, large scale Bayesian VARs, and multivariate boosting. Specifically, we focus on classical reduced rank regression, a two-step procedure that applies, in turn, shrinkage and reduced rank restrictions, and the reduced rank Bayesian VAR of Geweke (1996). We find that using shrinkage and rank reduction in combination rather than separately improves substantially the accuracy of forecasts, both when the whole set of variables is to be forecast, and for key variables such as industrial production growth, inflation, and the federal funds rate. The robustness of this finding is confirmed by a Monte Carlo experiment based on bootstrapped data. We also provide a consistency result for the reduced rank regression valid when the dimension of the system tends to infinity, which opens the ground to use large scale reduced rank models for empirical analysis.
    Keywords: Bayesian VARs, factor models, forecasting, reduced rank
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2009/31&r=ets
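    Classical reduced rank regression with identity weighting has a closed form: project the OLS coefficient matrix onto the leading right singular space of the OLS fitted values. A small sketch (the paper's two-step shrinkage-plus-rank procedure and Bayesian reduced rank VAR are not reproduced here):

```python
import numpy as np

def reduced_rank_regression(X, Y, rank):
    """Classical reduced rank regression with identity weighting:
    minimize ||Y - X C||_F subject to rank(C) <= rank.
    The solution projects the OLS coefficients onto the leading right
    singular space of the OLS fitted values."""
    C_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
    fitted = X @ C_ols
    _, _, Vt = np.linalg.svd(fitted, full_matrices=False)
    V_r = Vt[:rank].T                       # leading right singular vectors
    return C_ols @ V_r @ V_r.T              # rank-restricted coefficient matrix

# Toy system: 20 variables driven by a rank-2 coefficient matrix.
rng = np.random.default_rng(5)
T, N, r = 300, 20, 2
X = rng.standard_normal((T, N))
C_true = rng.standard_normal((N, r)) @ rng.standard_normal((r, N))
Y = X @ C_true + rng.normal(0, 0.5, (T, N))

C_rr = reduced_rank_regression(X, Y, rank=2)
print(np.linalg.matrix_rank(C_rr))          # 2
```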
  18. By: Tetsuya Takaishi
    Abstract: We propose a method to construct a proposal density for the Metropolis-Hastings algorithm in Markov Chain Monte Carlo (MCMC) simulations of the GARCH model. The proposal density is constructed adaptively by using the data sampled by the MCMC method itself. It turns out that autocorrelations between the data generated with our adaptive proposal density are greatly reduced. Thus it is concluded that the adaptive construction method is very efficient and works well for the MCMC simulations of the GARCH model.
    Date: 2009–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0901.0992&r=ets
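    A simplified stand-in for the adaptive construction: run an independence Metropolis-Hastings sampler for a GARCH(1,1) likelihood and periodically refit the proposal's mean and covariance to the chain history. The paper constructs a multivariate Student-t proposal; the Gaussian proposal and flat prior below are simplifying assumptions.

```python
import numpy as np

def log_mvn(x, m, cov):
    """Multivariate normal log density up to an additive constant."""
    d = x - m
    return -0.5 * (d @ np.linalg.solve(cov, d) + np.log(np.linalg.det(cov)))

def garch_loglik(params, y):
    """Gaussian GARCH(1,1) log-likelihood; -inf outside the admissible region."""
    omega, alpha, beta = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return -np.inf
    h = np.empty_like(y)
    h[0] = y.var()
    for t in range(1, len(y)):
        h[t] = omega + alpha * y[t - 1] ** 2 + beta * h[t - 1]
    return -0.5 * np.sum(np.log(2 * np.pi * h) + y**2 / h)

def adaptive_mh(y, n_iter=5000, start=(0.1, 0.1, 0.8), refit_every=500):
    """Independence Metropolis-Hastings with a Gaussian proposal whose mean and
    covariance are refitted to the chain history (a simplified stand-in for the
    paper's adaptively constructed Student-t proposal); flat prior assumed."""
    rng = np.random.default_rng(0)
    theta = np.array(start)
    logp = garch_loglik(theta, y)
    mean, cov = theta.copy(), np.diag([0.01, 0.01, 0.01])
    chain = []
    for it in range(n_iter):
        prop = rng.multivariate_normal(mean, cov)
        logp_prop = garch_loglik(prop, y)
        if logp_prop > -np.inf:
            # Independence sampler acceptance ratio.
            log_ratio = (logp_prop - logp
                         + log_mvn(theta, mean, cov) - log_mvn(prop, mean, cov))
            if np.log(rng.uniform()) < log_ratio:
                theta, logp = prop, logp_prop
        chain.append(theta.copy())
        if (it + 1) % refit_every == 0:       # adapt the proposal to the chain
            draws = np.array(chain)
            mean = draws.mean(axis=0)
            cov = np.cov(draws.T) + 1e-8 * np.eye(3)
    return np.array(chain)

rng = np.random.default_rng(6)
y = rng.standard_normal(500) * 0.5            # toy return series
draws = adaptive_mh(y)
print(draws[-1000:].mean(axis=0))
```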
  19. By: Gilles Zumbach
    Abstract: The covariance matrix is formulated in the framework of a linear multivariate ARCH process with long memory, where the natural cross product structure of the covariance is generalized by adding two linear terms with their respective parameter. The residuals of the linear ARCH process are computed using historical data and the (inverse square root of the) covariance matrix. Simple measures of quality assessing the independence and unit magnitude of the residual distributions are proposed. The salient properties of the computed residuals are studied for three data sets of size 54, 55 and 330. Both new terms introduced in the covariance help in producing uncorrelated residuals, but the residual magnitudes are very different from unity. The large sizes of the inferred residuals are due to the limited information that can be extracted from the empirical data when the number of time series is large, and this points to a fundamental limitation to the inference that can be achieved.
    Date: 2009–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0903.1531&r=ets
  20. By: Benjamin Jourdain (CERMICS); Mohamed Sbai (CERMICS)
    Abstract: In usual stochastic volatility models, the process driving the volatility of the asset price evolves according to an autonomous one-dimensional stochastic differential equation. We assume that the coefficients of this equation are smooth. Using Itô's formula, we get rid, in the asset price dynamics, of the stochastic integral with respect to the Brownian motion driving this SDE. Taking advantage of this structure, we propose (i) a scheme, based on the Milstein discretization of this SDE, with order one of weak trajectorial convergence for the asset price, and (ii) a scheme, based on the Ninomiya-Victoir discretization of this SDE, with order two of weak convergence for the asset price. We also propose a specific scheme with improved convergence properties when the volatility of the asset price is driven by an Ornstein-Uhlenbeck process. We confirm the theoretical rates of convergence by numerical experiments and show that our schemes are well adapted to the multilevel Monte Carlo method introduced by Giles [2008a,b].
    Date: 2009–08
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0908.1926&r=ets
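    The flavour of a Milstein-type step for a stochastic volatility model can be shown on a square-root variance process combined with a log-Euler step for the asset price; the correction term 0.5*sigma*sigma'*(dW^2 - dt) is what distinguishes Milstein from Euler. This is a generic illustration, not the paper's order-one trajectorial or order-two Ninomiya-Victoir schemes.

```python
import numpy as np

def simulate_sv_milstein(s0=100.0, v0=0.04, mu=0.0, kappa=2.0, theta=0.04,
                         xi=0.3, rho=-0.7, T=1.0, n_steps=252, seed=0):
    """Milstein discretization of a square-root variance process combined
    with a log-Euler step for the asset price. A generic illustration of
    Milstein-type schemes, not the paper's specific discretizations."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    s, v = np.empty(n_steps + 1), np.empty(n_steps + 1)
    s[0], v[0] = s0, v0
    for n in range(n_steps):
        z1, z2 = rng.standard_normal(2)
        dw_v = np.sqrt(dt) * z1
        dw_s = np.sqrt(dt) * (rho * z1 + np.sqrt(1 - rho**2) * z2)
        # Milstein step for dv = kappa*(theta - v)dt + xi*sqrt(v) dW:
        # the extra 0.25*xi^2*(dW^2 - dt) term comes from 0.5*sigma*sigma'.
        v[n + 1] = max(v[n] + kappa * (theta - v[n]) * dt
                       + xi * np.sqrt(v[n]) * dw_v
                       + 0.25 * xi**2 * (dw_v**2 - dt), 1e-12)
        # Log-Euler step for the asset price given the current variance.
        s[n + 1] = s[n] * np.exp((mu - 0.5 * v[n]) * dt + np.sqrt(v[n]) * dw_s)
    return s, v

s_path, v_path = simulate_sv_milstein()
print(s_path[-1], v_path[-1])
```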
  21. By: V. Gontis; J. Ruseckas; A. Kononovicius
    Abstract: We present a nonlinear stochastic differential equation (SDE) which mimics the probability density function (PDF) of return and the power spectrum of absolute return in the financial markets. Absolute return, as a measure of market volatility, is considered in the proposed model as a long-range memory stochastic variable. The SDE is obtained by analogy with an earlier proposed model of trading activity in the financial markets and is generalized within the nonextensive statistical mechanics framework. The proposed stochastic model generates time series of returns with two power-law statistics, i.e., the PDF and the power spectral density, reproducing the empirical data for one-minute trading returns on the NYSE.
    Date: 2009–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0901.0903&r=ets
  22. By: A. Gulisashvili; E. M. Stein
    Abstract: We study the asymptotic behavior of distribution densities arising in stock price models with stochastic volatility. The main objects of our interest in the present paper are the density of time averages of the squared volatility process and the density of the stock price process in the Stein-Stein and the Heston model. We find explicit formulas for leading terms in asymptotic expansions of these densities and give error estimates. As an application of our results, sharp asymptotic formulas for the implied volatility in the Stein-Stein and the Heston model are obtained.
    Date: 2009–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0906.0392&r=ets
  23. By: A. Saichev; D. Sornette; V. Filimonov
    Abstract: We present a comprehensive theory of homogeneous volatility (and variance) estimators of arbitrary stochastic processes that fully exploit the OHLC (open, high, low, close) prices. For this, we develop the theory of most efficient point-wise homogeneous OHLC volatility estimators, valid for any price processes. We introduce the "quasi-unbiased estimators", which can address any type of desirable constraints. The main tool of our theory is the parsimonious encoding of all the information contained in the OHLC prices for a given time interval in the form of the joint distributions of the high-minus-open, low-minus-open and close-minus-open values, whose analytical expression is derived exactly for Wiener processes with drift. The distributions can be calculated to yield the most efficient estimators associated with any statistical properties of the underlying log-price stochastic process. Applied to Wiener processes for log-prices with drift, we provide explicit analytical expressions for the most efficient point-wise volatility and variance estimators, based on the analytical expression of the joint distribution of the high-minus-open, low-minus-open and close-minus-open values. The efficiency of the new proposed estimators is favorably compared with that of the Garman-Klass, Rogers-Satchell and maximum likelihood estimators.
    Date: 2009–08
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0908.1677&r=ets
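    The classical OHLC estimators the paper benchmarks against are easy to state: the Garman-Klass and Rogers-Satchell per-bar variance estimators use only open, high, low and close prices. A quick sanity check on simulated geometric Brownian motion bars (the paper's most efficient estimators are not implemented here):

```python
import numpy as np

def garman_klass_var(o, h, l, c):
    """Garman-Klass per-bar variance estimate from OHLC prices
    (assumes a driftless log-price diffusion)."""
    return 0.5 * np.log(h / l) ** 2 - (2 * np.log(2) - 1) * np.log(c / o) ** 2

def rogers_satchell_var(o, h, l, c):
    """Rogers-Satchell per-bar variance estimate; unbiased under drift."""
    return np.log(h / c) * np.log(h / o) + np.log(l / c) * np.log(l / o)

# Toy check against simulated driftless geometric Brownian motion bars.
rng = np.random.default_rng(7)
n_bars, n_intra, sigma = 2000, 390, 0.2 / np.sqrt(252)
paths = np.exp(np.cumsum(sigma / np.sqrt(n_intra)
                         * rng.standard_normal((n_bars, n_intra)), axis=1))
o, c = np.ones(n_bars), paths[:, -1]
h = np.maximum(paths.max(axis=1), 1.0)   # include the open in the high/low
l = np.minimum(paths.min(axis=1), 1.0)

print("true per-bar variance:", sigma**2)
print("Garman-Klass         :", garman_klass_var(o, h, l, c).mean())
print("Rogers-Satchell      :", rogers_satchell_var(o, h, l, c).mean())
```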
  24. By: Tetsuya Takaishi
    Abstract: We study the performance of the adaptive construction scheme for Bayesian inference on the Quadratic GARCH (QGARCH) model, which introduces asymmetry in time series dynamics. In the adaptive construction scheme a proposal density in the Metropolis-Hastings algorithm is constructed adaptively by changing the parameters of the density to fit the posterior density. Using artificial QGARCH data we infer the QGARCH parameters by applying the adaptive construction scheme to the Bayesian inference of the QGARCH model. We find that the adaptive construction scheme samples QGARCH parameters effectively, i.e., correlations between the sampled data are very small. We conclude that the adaptive construction scheme is an efficient method for the Bayesian estimation of the QGARCH model.
    Date: 2009–07
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0907.5276&r=ets
  25. By: Sovan Mitra
    Abstract: This paper formed part of a preliminary research report for a risk consultancy and academic research. Stochastic Programming models provide a powerful paradigm for decision making under uncertainty. In these models the uncertainties are represented by a discrete scenario tree and the quality of the solutions obtained is governed by the quality of the scenarios generated. We propose a new technique to generate scenarios based on Gaussian Mixture Hidden Markov Modelling. We show that our approach explicitly captures important time varying dynamics of stochastic processes (such as autoregression and jumps) as well as non-Gaussian distribution characteristics (such as skewness and kurtosis). Our scenario generation method enables richer robustness and scenario analysis through exploiting the tractable properties of Markov models and Gaussian mixture distributions. We demonstrate the benefits of our scenario generation method by conducting numerical experiments on FTSE-100 data.
    Date: 2009–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0904.1131&r=ets
  26. By: Kei Takeuchi; Akimichi Takemura; Masayuki Kumon
    Abstract: We propose procedures for testing whether stock price processes are martingales based on limit-order-type betting strategies. We first show that the null hypothesis of the martingale property of a stock price process can be tested based on the capital process of a betting strategy. In particular, with high-frequency Markov-type strategies we find that martingale null hypotheses are rejected for many stock price processes.
    Date: 2009–07
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0907.3273&r=ets
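    The capital-process logic can be illustrated with a toy Markov-type strategy: bet a fixed fraction of capital on the sign of the previous return. Under the martingale null (zero conditional mean returns) the capital is a nonnegative martingale, so by Ville's inequality a capital of 1/alpha permits rejection at level alpha. The strategy below is not one of the paper's limit-order-type strategies.

```python
import numpy as np

def capital_process(returns, fraction=0.5):
    """Capital of a simple Markov-type betting strategy that stakes a fixed
    fraction of current capital on the sign of the previous return.
    Under the martingale null the capital is a nonnegative martingale, so
    reaching 1/alpha allows rejection at level alpha (Ville's inequality).
    A toy strategy, not one of the paper's limit-order-type strategies."""
    capital = 1.0
    path = [capital]
    for t in range(1, len(returns)):
        bet = fraction * np.sign(returns[t - 1])   # momentum bet
        capital *= 1.0 + bet * returns[t]
        path.append(capital)
    return np.array(path)

rng = np.random.default_rng(8)
iid_ret = 0.01 * rng.standard_normal(5000)         # martingale-like returns
ar_ret = np.zeros(5000)                            # autocorrelated returns
for t in range(1, 5000):
    ar_ret[t] = 0.3 * ar_ret[t - 1] + 0.01 * rng.standard_normal()

print("final capital, martingale-like series:", capital_process(iid_ret)[-1])
print("final capital, predictable series    :", capital_process(ar_ret)[-1])
# A final capital above 20 corresponds to rejection at the 5% level.
```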
  27. By: Fulvio Baldovin; Dario Bovina; Attilio L. Stella
    Abstract: Financial data give an opportunity to uncover the non-stationarity which may be hidden in many single time-series. Five years of daily Euro/Dollar trading records in the about three hours following the New York opening session are shown to give an accurate ensemble representation of the self-similar, non-Markovian stochastic process with nonstationary increments recently conjectured to generally underlie financial assets dynamics [PNAS 104, 19741 (2007)]. Introducing novel quantitative tools in the analysis of non-Markovian time-series we show that empirical non-linear correlators are in remarkable agreement with model predictions based only on the anomalous scaling form of the logarithmic return distribution.
    Date: 2009–09
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0909.3244&r=ets
  28. By: Ardia, David
    Abstract: This paper presents the R package bayesGARCH which provides functions for the Bayesian estimation of the parsimonious but effective GARCH(1,1) model with Student-t innovations. The estimation procedure is fully automatic and thus avoids the time-consuming and difficult task of tuning a sampling algorithm. The usage of the package is shown in an empirical application to exchange rate log-returns.
    Keywords: GARCH; Bayesian; MCMC; Student-t; R software
    JEL: C52 C22 C15 C11
    Date: 2009–09–20
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:17414&r=ets
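    A hedged Python counterpart of the model the package estimates: the log posterior of a GARCH(1,1) with Student-t innovations under simple illustrative priors. The bayesGARCH package uses its own priors and its own block MCMC sampler in R; any generic sampler (for instance the Metropolis-Hastings sketch after entry 18) could target the function below.

```python
import numpy as np
from scipy.special import gammaln

def garch_t_logpost(params, y):
    """Log posterior of a GARCH(1,1) model with Student-t innovations under
    vague illustrative priors (flat on the admissible GARCH region, mild
    exponential prior on nu - 2). A generic sketch; the bayesGARCH package
    uses its own priors and its own block MCMC sampler."""
    omega, alpha, beta, nu = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1 or nu <= 2:
        return -np.inf
    h = np.empty_like(y)
    h[0] = y.var()
    for t in range(1, len(y)):
        h[t] = omega + alpha * y[t - 1] ** 2 + beta * h[t - 1]
    # Student-t log density parameterized so that the conditional variance is h.
    scale2 = h * (nu - 2.0) / nu
    loglik = np.sum(gammaln((nu + 1) / 2) - gammaln(nu / 2)
                    - 0.5 * np.log(np.pi * nu * scale2)
                    - (nu + 1) / 2 * np.log1p(y**2 / (nu * scale2)))
    logprior = -0.01 * (nu - 2.0)          # mild exponential prior on nu - 2
    return loglik + logprior

rng = np.random.default_rng(9)
y = rng.standard_t(df=8, size=750) * 0.01  # toy log-return series
print(garch_t_logpost([1e-5, 0.05, 0.9, 8.0], y))
```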
  29. By: Ralf Becker (Manchester); Adam Clements (QUT); Christopher Coleman-Fenn (QUT)
    Abstract: Forecasting volatility has received a great deal of research attention, with the relative performance of econometric models based on time-series data and option implied volatility forecasts often being considered. While many studies find that implied volatility is the preferred approach, a number of issues remain unresolved. Implied volatilities are risk-neutral forecasts of spot volatility, whereas time-series models are estimated on risk-adjusted or real world data of the underlying. Recently, an intuitive method has been proposed to adjust these risk-neutral forecasts into their risk-adjusted equivalents, possibly improving on their forecast accuracy. By utilising recent econometric advances, this paper considers whether these risk-adjusted forecasts are statistically superior to the unadjusted forecasts, as well as a wide range of model based forecasts. It is found that an unadjusted risk-neutral implied volatility is an inferior forecast. However, after adjusting for the risk premia it is of equal predictive accuracy relative to a number of model based forecasts.
    Keywords: Implied volatility, volatility forecasts, volatility models, volatility risk premium, model confidence sets
    JEL: C12 C22 G00
    Date: 2009–07–21
    URL: http://d.repec.org/n?u=RePEc:qut:auncer:2009_58&r=ets

This nep-ets issue is ©2009 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.