New Economics Papers on Econometric Time Series
By: | Robinson Kruse (School of Economics and Management, Aarhus University and CREATES); Rickard Sandberg (Department of Economic Statistics, Stockholm School of Economics) |
Abstract: | Building upon the work of Vogelsang (1998) and Harvey and Leybourne (2007) we derive tests that are invariant to the order of integration when the null hypothesis of linearity is tested in time-varying smooth transition models. As heteroscedasticity may lead to spurious rejections of the null hypothesis, a White correction is also considered. The asymptotic properties of the tests are studied. Our Monte Carlo simulations suggest that the newly proposed tests exhibit good size and competitive power properties. An empirical application to US inflation data from the Post-Bretton Woods period underlines the empirical usefulness of our tests. |
Keywords: | Linearity testing, Linear I(0) and I(1) models, Non-linear I(0) and I(1) models, White correction.
JEL: | C22 |
Date: | 2010–07–26 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2010-36&r=ets |
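The White correction mentioned in the abstract above is, in its classical form, the heteroskedasticity-robust sandwich covariance estimator. A minimal sketch of that generic estimator for an OLS regression follows; it illustrates the idea only and is not the paper's actual test statistic for the smooth transition model:

```python
import numpy as np

def white_cov(X, y):
    """OLS with White's (HC0) heteroskedasticity-robust covariance.

    Returns (beta_hat, robust_cov).  This is the textbook sandwich
    estimator (X'X)^{-1} X' diag(e_i^2) X (X'X)^{-1}.
    """
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    e = y - X @ beta                      # OLS residuals
    meat = (X * e[:, None] ** 2).T @ X    # X' diag(e^2) X
    return beta, XtX_inv @ meat @ XtX_inv
```

Under heteroskedasticity, test statistics built from this covariance avoid the spurious rejections that the ordinary OLS covariance produces.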
By: | Fernando Baltazar-Larios (Universidad Nacional Autónoma de México); Michael Sørensen (University of Copenhagen and CREATES) |
Abstract: | We propose a method for obtaining maximum likelihood estimates of parameters in diffusion models when the data are a discrete-time sample of the integral of the process, while no direct observations of the process itself are available. The data are, moreover, assumed to be contaminated by measurement errors. Integrated volatility is an example of this type of observations. Another example is ice-core data on oxygen isotopes used to investigate paleo-temperatures. The data can be viewed as incomplete observations of a model with a tractable likelihood function. Therefore we propose a simulated EM-algorithm to obtain maximum likelihood estimates of the parameters in the diffusion model. As part of the algorithm, we use a recent simple method for approximate simulation of diffusion bridges. In simulation studies for the Ornstein-Uhlenbeck process and the CIR process the proposed method works well.
Keywords: | Diffusion bridge, discretely sampled diffusions, EM-algorithm, likelihood inference, measurement error, stochastic differential equation, stochastic volatility. |
JEL: | C22 C51 |
Date: | 2010–08–05 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2010-33&r=ets |
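The observation scheme in the abstract above (discrete samples of the integral of a diffusion, contaminated by measurement error) can be sketched for the Ornstein-Uhlenbeck case via an Euler discretization. Parameter names below are illustrative, not the paper's notation, and this generates data only; it does not implement the simulated EM-algorithm:

```python
import numpy as np

def simulate_integrated_ou(theta, mu, sigma, tau, n_obs, n_fine=200, seed=0):
    """Discrete-time samples of the integral of an OU process
    dX_t = -theta (X_t - mu) dt + sigma dW_t over unit intervals,
    plus i.i.d. N(0, tau^2) measurement error, via Euler steps."""
    rng = np.random.default_rng(seed)
    dt = 1.0 / n_fine
    x = mu                                          # start at the mean
    obs = np.empty(n_obs)
    for k in range(n_obs):
        integral = 0.0
        for _ in range(n_fine):
            integral += x * dt                      # left-point Riemann sum
            x += -theta * (x - mu) * dt + sigma * np.sqrt(dt) * rng.normal()
        obs[k] = integral + tau * rng.normal()      # add measurement error
    return obs
```

Each observation is then an incomplete, noisy functional of the latent path, which is exactly what makes the EM approach of the paper natural.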
By: | Mogens Bladt (Universidad Nacional Autónoma de México); Michael Sørensen (University of Copenhagen and CREATES) |
Abstract: | With a view to likelihood inference for discretely observed diffusion type models, we propose a simple method of simulating approximations to diffusion bridges. The method is applicable to all one-dimensional diffusion processes and has the advantage that simple simulation methods like the Euler scheme can be applied to bridge simulation. Another advantage over other bridge simulation methods is that the proposed method works well when the diffusion bridge is defined in a long interval because the computational complexity of the method is linear in the length of the interval. In a simulation study we investigate the accuracy and efficiency of the new method and compare it to exact simulation methods. In the study the method provides a very good approximation to the distribution of a diffusion bridge for bridges that are likely to occur in applications to likelihood inference. To illustrate the usefulness of the new method, we present an EM-algorithm for a discretely observed diffusion process. We demonstrate how this estimation method simplifies for exponential families of diffusions and very briefly consider Bayesian inference. |
Keywords: | Bayesian inference, diffusion bridge, discretely sampled diffusions, EM-algorithm, Euler scheme, likelihood inference, time-reversion |
JEL: | C22 C15 |
Date: | 2010–08–05 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2010-32&r=ets |
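A stated advantage of the method in the abstract above is that simple schemes like the Euler scheme can be used for bridge simulation. The following is only that basic building block, an Euler path of a one-dimensional diffusion; the bridge construction itself (via time-reversal) is not reproduced here:

```python
import numpy as np

def euler_path(drift, diffusion, x0, T, n, rng):
    """Euler scheme for dX_t = drift(X_t) dt + diffusion(X_t) dW_t
    on [0, T] with n steps, started at x0.  Returns the path array."""
    dt = T / n
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        dw = np.sqrt(dt) * rng.normal()             # Brownian increment
        x[i + 1] = x[i] + drift(x[i]) * dt + diffusion(x[i]) * dw
    return x
```

For example, an Ornstein-Uhlenbeck path is obtained with `drift = lambda x: -theta * x` and a constant `diffusion`; the linear-in-interval-length cost the abstract highlights comes precisely from such step-by-step simulation.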
By: | Rasmus Tangsgaard Varneskov (School of Economics and Management, Aarhus University and CREATES) |
Abstract: | This paper considers the performance of different long-memory dynamic models when forecasting volatility in the stock market using implied volatility as an exogenous variable in the information set. Observed volatility is separated into its continuous and jump components in a framework that allows for consistent estimation in the presence of market microstructure noise. A comparison between a class of HAR and ARFIMA models is facilitated on the basis of out-of-sample forecasting performance. Implied volatility conveys incremental information about future volatility in both specifications, improving performance both in- and out-of-sample for all models. Furthermore, the ARFIMA class of models dominates the HAR specifications in terms of out-of-sample performance both with and without implied volatility in the information set. A vectorized ARFIMA (vecARFIMA) model is introduced to control for possible endogeneity issues. This model is compared to a vecHAR specification, reinforcing the results from the single-equation framework.
Keywords: | ARFIMA, HAR, Implied Volatility, Jumps, Market Microstructure Noise, VecARFIMA, Volatility Forecasting |
JEL: | C14 C22 C32 C53 G10 |
Date: | 2010–08–19 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2010-39&r=ets |
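The HAR class compared in the abstract above can be sketched in its generic Corsi-type form: next-day realized volatility regressed on daily, weekly and monthly averages, optionally augmented with implied volatility. This is the standard specification, not necessarily the paper's exact model:

```python
import numpy as np

def har_forecast(rv, iv=None):
    """One-step-ahead HAR-RV forecast,
        RV_{t+1} = b0 + bd*RV_t + bw*RV_week_t + bm*RV_month_t (+ bi*IV_t),
    estimated by OLS.  `iv` optionally adds implied volatility to the
    information set, as in the paper's augmented specifications."""
    rv = np.asarray(rv, float)
    t = np.arange(21, len(rv) - 1)                  # days with a 22-day history
    weekly = np.array([rv[i - 4:i + 1].mean() for i in t])
    monthly = np.array([rv[i - 21:i + 1].mean() for i in t])
    X = np.column_stack([np.ones(len(t)), rv[t], weekly, monthly])
    if iv is not None:
        X = np.column_stack([X, np.asarray(iv, float)[t]])
    beta, *_ = np.linalg.lstsq(X, rv[t + 1], rcond=None)
    i = len(rv) - 1                                 # forecast from the last day
    x_new = [1.0, rv[i], rv[i - 4:i + 1].mean(), rv[i - 21:i + 1].mean()]
    if iv is not None:
        x_new.append(float(np.asarray(iv, float)[i]))
    return float(np.array(x_new) @ beta)
```

The heavy-tailed persistence that HAR captures through the three averaging horizons is what the ARFIMA class instead models through fractional integration.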
By: | Leonidas Tsiaras (Department of Business Studies, ASB, Aarhus University and CREATES) |
Abstract: | This study examines the information content of alternative implied volatility measures for the 30 components of the Dow Jones Industrial Average Index from 1996 until 2007. Along with the popular Black-Scholes and "model-free" implied volatility expectations, the recently proposed corridor implied volatility (CIV) measures are explored. For all pair-wise comparisons, it is found that a CIV measure closely related to the model-free implied volatility nearly always delivers the most accurate forecasts for the majority of the firms. This finding remains consistent for different forecast horizons, volatility definitions, loss functions and forecast evaluation settings.
Keywords: | Model-Free Implied Volatility, Corridor Implied Volatility, Volatility Forecasting |
JEL: | C22 C53 G13 G14 |
Date: | 2010–02–01 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2010-34&r=ets |
By: | Christian M. Dahl (Department of Business and Economics, University of Southern Denmark and CREATES); Emma M. Iglesias (Department of Economics, Michigan State University) |
Abstract: | In this paper, consistency and asymptotic normality of the quasi-maximum likelihood estimator in the level-effect ARCH model of Chan, Karolyi, Longstaff and Sanders (1992) are established. We consider explicitly the case where the parameters of the conditional heteroskedastic process are in the stationary region and discuss carefully how the results can be extended to the region where the conditional heteroskedastic process is nonstationary. The results illustrate that Jensen and Rahbek's (2004a, 2004b) approach can be extended beyond traditional ARCH and GARCH models.
Keywords: | Level-effect ARCH, QMLE, Asymptotics, Stationarity, Nonstationarity. |
JEL: | C12 C13 C22 |
Date: | 2010–08–25 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2010-48&r=ets |
By: | Rasmus Tangsgaard Varneskov (School of Economics and Management, Aarhus University and CREATES); Valeri Voev (School of Economics and Management, Aarhus University and CREATES) |
Abstract: | Recently, consistent measures of the ex-post covariation of financial assets based on noisy high-frequency data have been proposed. A related strand of literature focuses on dynamic models and covariance forecasting for high-frequency data based covariance measures. The aim of this paper is to investigate whether more sophisticated estimation approaches lead to more precise covariance forecasts, both in a statistical precision sense and in terms of economic value. A further issue we address is the relative importance of the quality of the realized measure as an input in a given forecasting model vs. the model’s dynamic specification. The main finding is that the largest gains result from switching from daily to high-frequency data. Further gains are achieved if a simple sparse-sampling covariance measure is replaced with a more efficient and noise-robust estimator.
Keywords: | Forecast evaluation, Volatility forecasting, Portfolio optimization, Mean-variance analysis. |
JEL: | C32 C53 G11 |
Date: | 2010–08–26 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2010-45&r=ets |
By: | Robinson Kruse (School of Economics and Management, Aarhus University and CREATES); Philipp Sibbertsen (Leibniz University Hannover, School of Economics and Management, Institute of Statistics) |
Abstract: | We study the empirical behaviour of semi-parametric log-periodogram estimation for long memory models when the true process exhibits a change in persistence. Simulation results confirm theoretical arguments which suggest that evidence for long memory is likely to be found. A recently proposed test by Sibbertsen and Kruse (2009) is shown to exhibit noticeable power to discriminate between long memory and a structural change in autoregressive parameters. |
Keywords: | Long memory, changing persistence, structural break, semi-parametric estimation |
JEL: | C12 C22 |
Date: | 2010–08–01 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2010-42&r=ets |
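The semi-parametric log-periodogram estimation studied in the abstract above is, in its classical (GPH) form, a regression of the log periodogram on a trigonometric function of the first m Fourier frequencies. A minimal sketch of that standard estimator, used here only to illustrate the object of the simulation study:

```python
import numpy as np

def gph_estimate(x, m=None):
    """Log-periodogram (GPH) estimate of the long-memory parameter d:
    regress log I(lambda_j) on -log(4 sin^2(lambda_j / 2)) for the
    first m Fourier frequencies; the slope estimates d."""
    x = np.asarray(x, float)
    n = len(x)
    if m is None:
        m = int(n ** 0.5)                           # a common bandwidth choice
    lam = 2 * np.pi * np.arange(1, m + 1) / n       # Fourier frequencies
    fx = np.fft.fft(x - x.mean())
    I = (np.abs(fx[1:m + 1]) ** 2) / (2 * np.pi * n)   # periodogram
    reg = -np.log(4 * np.sin(lam / 2) ** 2)
    reg = reg - reg.mean()                          # demeaned regressor
    return float((reg @ np.log(I)) / (reg @ reg))   # OLS slope = d_hat
```

The abstract's point is precisely that such an estimator reports d > 0 not only under true long memory but also under a change in persistence, which is why a discriminating test is needed.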
By: | Morten Ørregaard Nielsen (Queen's University and CREATES); Per Frederiksen (Nordea Markets)
Abstract: | We consider estimation of the cointegrating relation in the weak fractional cointegration model, where the strength of the cointegrating relation (difference in memory parameters) is less than one-half. A special case is the stationary fractional cointegration model, which has found important application recently, especially in financial economics. Previous research on this model has considered a semiparametric narrow-band least squares (NBLS) estimator in the frequency domain, but in the stationary case its asymptotic distribution has been derived only under a condition of non-coherence between regressors and errors at the zero frequency. We show that in the absence of this condition, the NBLS estimator is asymptotically biased, and also that the bias can be consistently estimated. Consequently, we introduce a fully modified NBLS estimator which eliminates the bias, and indeed enjoys a faster rate of convergence than NBLS in general. We also show that local Whittle estimation of the integration order of the errors can be conducted consistently based on NBLS residuals, but the estimator has the same asymptotic distribution as if the errors were observed only under the condition of non-coherence. Furthermore, compared to much previous research, the development of the asymptotic distribution theory is based on a different spectral density representation, which is relevant for multivariate fractionally integrated processes, and the use of this representation is shown to result in lower asymptotic bias and variance of the narrow-band estimators. We present simulation evidence and a series of empirical illustrations to demonstrate the feasibility and empirical relevance of our methodology.
Keywords: | Fractional cointegration, frequency domain, fully modified estimation, long memory, semiparametric.
JEL: | C22 |
Date: | 2010–05–12 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2010-31&r=ets |
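The NBLS estimator the abstract above starts from averages the (cross-)periodogram over a narrow band of low frequencies. A textbook sketch of that baseline estimator follows; it is not the fully modified variant the paper introduces:

```python
import numpy as np

def nbls(x, y, m):
    """Narrow-band least squares: the slope of y on x estimated from the
    discrete Fourier transforms at the first m Fourier frequencies,
    beta_hat = sum_j Re(conj(w_x(j)) w_y(j)) / sum_j |w_x(j)|^2."""
    wx = np.fft.fft(x - np.mean(x))[1:m + 1]
    wy = np.fft.fft(y - np.mean(y))[1:m + 1]
    return float(np.sum(np.real(np.conj(wx) * wy)) / np.sum(np.abs(wx) ** 2))
```

Restricting the sum to low frequencies is what lets the estimator focus on the long-run (zero-frequency) relation; the paper's contribution concerns its bias when regressors and errors are coherent at that frequency.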
By: | Chang, C-L.; McAleer, M.J.; Franses, Ph.H.B.F. |
Abstract: | Macro-economic forecasts are often based on the interaction between econometric models and experts. A forecast that is based only on an econometric model is replicable and may be unbiased, whereas a forecast that is not based only on an econometric model, but also incorporates an expert’s touch, is non-replicable and is typically biased. In this paper we propose a methodology to analyze the qualities of combined non-replicable forecasts. One part of the methodology seeks to retrieve a replicable component from the non-replicable forecasts, and compares this component against the actual data. A second part modifies the estimation routine to account for the assumption that the difference between a replicable and a non-replicable forecast involves a measurement error. An empirical example to forecast economic fundamentals for Taiwan shows the relevance of the methodological approach.
Keywords: | combined forecasts, efficient estimation, generated regressors, replicable forecasts, non-replicable forecasts, expert’s intuition
JEL: | C22 E27 E37
Date: | 2010–07–28 |
URL: | http://d.repec.org/n?u=RePEc:dgr:eureir:1765020156&r=ets |
By: | Renee Fry; Adrian Pagan |
Abstract: | Structural Vector Autoregressions (SVARs) have become one of the major ways of extracting information about the macro economy. One might cite three major uses of them in macro-econometric research. 1. For quantifying impulse responses to macroeconomic shocks. 2. For measuring the degree of uncertainty about the impulse responses or other quantities formed from them. 3. For deciding on the contribution of different shocks to fluctuations and forecast errors through variance decompositions. To determine this information a VAR is first fitted to summarize the data and then a structural VAR (SVAR) is proposed whose structural equation errors are taken to be the economic shocks. The parameters of these structural equations are then estimated by utilizing the information in the VAR. The VAR is a reduced form which summarizes the data; the SVAR provides an interpretation of the data. As for any set of structural equations, recovery of the structural equation parameters (shocks) requires the use of identification restrictions that reduce the number of "free" parameters in the structural equations to the number that can be recovered from the information in the reduced form. |
Date: | 2010–07 |
URL: | http://d.repec.org/n?u=RePEc:acb:camaaa:2010-22&r=ets |
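The VAR-to-SVAR pipeline described in the abstract above can be sketched in its simplest form: fit a reduced-form VAR(1) by OLS, then impose a recursive (Cholesky) identification to recover structural shocks and impulse responses. This is one example of the identification restrictions the text describes, not the only scheme:

```python
import numpy as np

def var1_cholesky_irf(y, horizons=10):
    """Fit a VAR(1) by OLS and compute impulse responses under a
    recursive (Cholesky) identification.  y: (T, k) array.  Returns
    irf of shape (horizons+1, k, k), where irf[h, i, j] is the
    response of variable i to structural shock j after h steps."""
    y = np.asarray(y, float)
    Y, X = y[1:], np.column_stack([np.ones(len(y) - 1), y[:-1]])
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)       # reduced-form OLS
    A = B[1:].T                                     # lag coefficient matrix
    U = Y - X @ B                                   # reduced-form residuals
    Sigma = U.T @ U / len(U)
    P = np.linalg.cholesky(Sigma)                   # structural impact matrix
    k = y.shape[1]
    irf = np.empty((horizons + 1, k, k))
    Ah = np.eye(k)
    for h in range(horizons + 1):
        irf[h] = Ah @ P                             # response at horizon h
        Ah = A @ Ah
    return irf
```

The Cholesky factor makes the impact matrix lower triangular, which is exactly a set of zero restrictions reducing the free structural parameters to what the reduced-form covariance can identify.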
By: | Torben G. Andersen; Dobrislav Dobrev; Ernst Schaumburg |
Abstract: | We propose two new jump-robust estimators of integrated variance based on high-frequency return observations. These MinRV and MedRV estimators provide an attractive alternative to the prevailing bipower and multipower variation measures. Specifically, the MedRV estimator has better theoretical efficiency properties than the tripower variation measure and displays better finite-sample robustness to both jumps and the occurrence of “zero” returns in the sample. Unlike the bipower variation measure, the new estimators allow for the development of an asymptotic limit theory in the presence of jumps. Finally, they retain the local nature associated with the low-order multipower variation measures. This proves essential for alleviating finite sample biases arising from the pronounced intraday volatility pattern that afflicts alternative jump-robust estimators based on longer blocks of returns. An empirical investigation of the Dow Jones 30 stocks and an extensive simulation study corroborate the robustness and efficiency properties of the new estimators. |
Keywords: | Stocks - Rate of return ; Stock market ; Stock - Prices |
Date: | 2010 |
URL: | http://d.repec.org/n?u=RePEc:fip:fednsr:465&r=ets |
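The MinRV and MedRV estimators proposed in the abstract above replace products of adjacent absolute returns (as in bipower variation) with minima and medians over short blocks. A sketch with the scaling constants as usually stated for these estimators:

```python
import numpy as np

def min_rv(r):
    """MinRV jump-robust integrated variance estimator from a vector of
    M intraday returns: scaled sum of squared pairwise minima of
    adjacent absolute returns."""
    r = np.abs(np.asarray(r, float))
    M = len(r)
    return (np.pi / (np.pi - 2)) * (M / (M - 1)) * np.sum(
        np.minimum(r[:-1], r[1:]) ** 2)

def med_rv(r):
    """MedRV estimator: scaled sum of squared medians of three adjacent
    absolute returns, robust to a single jump in any block."""
    r = np.abs(np.asarray(r, float))
    M = len(r)
    med = np.median(np.column_stack([r[:-2], r[1:-1], r[2:]]), axis=1)
    return (np.pi / (6 - 4 * np.sqrt(3) + np.pi)) * (M / (M - 2)) * np.sum(
        med ** 2)
```

A single large jump enters at most three overlapping blocks, where the min/median picks a neighboring diffusive return instead, which is the source of the jump robustness the abstract describes.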
By: | Naoto Kunitomo (Faculty of Economics, University of Tokyo); Seisho Sato (Institute of Statistical Mathematics) |
Abstract: | For estimating realized volatility and covariance from high frequency data, we have introduced the Separating Information Maximum Likelihood (SIML) method for settings where micro-market noise may be present (Kunitomo and Sato, 2008a, 2008b, 2010a, 2010b). The resulting estimator is simple and has a representation as a specific quadratic form of returns. We show that the SIML estimator has reasonable asymptotic properties: it is consistent and asymptotically normal (with stable convergence in the general case) when the sample size is large, under general conditions that include some non-Gaussian processes and some volatility models. Based on simulations, we find that the SIML estimator has reasonable finite-sample properties and thus would be useful in practice. The SIML estimator is asymptotically robust in the sense that it remains consistent when the noise terms are weakly dependent and endogenously correlated with the efficient market price process. We also apply our method to an analysis of Nikkei-225 Futures, based on the major stock index in the Japanese financial sector.
Date: | 2010–08 |
URL: | http://d.repec.org/n?u=RePEc:tky:fseres:2010cf758&r=ets |
By: | Luís Francisco Aguiar (Universidade do Minho - NIPE); Maria Joana Soares (Universidade do Minho) |
Abstract: | Wavelet analysis is becoming more popular in the economics discipline. Until recently, most work made use of tools associated with the Discrete Wavelet Transform. However, after 2005 there has been a growing body of work in economics and finance that makes use of the Continuous Wavelet Transform tools. In this article, we give a self-contained summary of the most relevant theoretical results associated with the Continuous Wavelet Transform, the Cross-Wavelet Transform, the Wavelet Coherency and the Wavelet Phase-Difference. We describe how the transforms are usually implemented in practice and provide some examples. We also introduce economists to a new class of analytic wavelets, the Generalized Morse Wavelets, which have some desirable properties and provide an alternative to the Morlet Wavelet. Finally, we provide a user-friendly toolbox that will allow any researcher to replicate our results and to use it in their own research.
Keywords: | Economic cycles, Continuous Wavelet Transform, Cross-Wavelet Transform, Wavelet Coherency, Wavelet Phase-Difference, The Great Moderation.
Date: | 2010 |
URL: | http://d.repec.org/n?u=RePEc:nip:nipewp:23/2010&r=ets |
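The Continuous Wavelet Transform with the Morlet wavelet, the baseline tool the abstract above surveys, is commonly implemented in the frequency domain via the FFT. A standard textbook sketch of that implementation (Torrence-Compo style normalization; not the authors' toolbox):

```python
import numpy as np

def morlet_cwt(x, scales, dt=1.0, omega0=6.0):
    """Continuous Wavelet Transform of x with the analytic Morlet
    wavelet, computed scale by scale in the frequency domain.
    Returns a complex array of shape (len(scales), len(x))."""
    x = np.asarray(x, float)
    n = len(x)
    omega = 2 * np.pi * np.fft.fftfreq(n, d=dt)     # angular frequencies
    fx = np.fft.fft(x)
    out = np.empty((len(scales), n), complex)
    for i, s in enumerate(scales):
        # Fourier transform of the Morlet wavelet at scale s; the
        # analytic wavelet has support on positive frequencies only.
        psi_hat = (np.pi ** -0.25) * np.exp(-0.5 * (s * omega - omega0) ** 2)
        psi_hat *= np.sqrt(2 * np.pi * s / dt) * (omega > 0)
        out[i] = np.fft.ifft(fx * psi_hat)
    return out
```

The squared modulus of the output gives wavelet power localized in both time and scale; cross-wavelet, coherency and phase-difference quantities are built from transforms of two series.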