on Econometric Time Series
By: | Federico Carlini (Aarhus University and CREATES); Paolo Santucci de Magistris (Aarhus University and CREATES) |
Abstract: | This paper discusses identification problems in the fractionally cointegrated system of Johansen (2008) and Johansen and Nielsen (2012). The identification problem arises when the lag structure is over-specified, such that there exist several equivalent reparametrizations of the model associated with different fractional integration and cointegration parameters. The properties of these multiple non-identified sub-models are studied and a necessary and sufficient condition for the identification of the fractional parameters of the system is provided. The condition is named F(d). The assessment of the F(d) condition in the empirical analysis is relevant for the determination of the fractional parameters as well as the lag structure. |
Keywords: | Fractional Cointegration; Cofractional Models; Identification; Lag |
JEL: | C19 C32 |
Date: | 2013–11–12 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2013-44&r=ets |
By: | Ting Zhang; Hwai-Chung Ho; Martin Wendler; Wei Biao Wu |
Abstract: | The paper considers the block sampling method for long-range dependent processes. Our theory generalizes earlier ones by Hall, Jing and Lahiri (1998) on functionals of Gaussian processes and Nordman and Lahiri (2005) on linear processes. In particular, we allow nonlinear transforms of linear processes. Under suitable conditions on physical dependence measures, we prove the validity of the block sampling method. The problem of estimating the self-similar index is also studied. |
Date: | 2013–12 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1312.5807&r=ets |
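The resampling step of a block sampling method like the one studied above can be illustrated with a generic moving-block bootstrap. This is only a textbook sketch of the mechanics, not the authors' procedure; the delicate issues under long-range dependence (block length growth, validity conditions) are exactly what the paper addresses. The AR(1) example series is purely illustrative.

```python
import numpy as np

def block_bootstrap(x, block_len, rng=None):
    """One moving-block bootstrap resample: draw overlapping blocks of
    length block_len with replacement and concatenate them until the
    resample matches the original series length."""
    rng = np.random.default_rng(rng)
    n = len(x)
    n_blocks = -(-n // block_len)  # ceil(n / block_len)
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    return np.concatenate([x[s:s + block_len] for s in starts])[:n]

# Illustrative dependent series: an AR(1) with coefficient 0.7.
rng = np.random.default_rng(0)
e = rng.standard_normal(500)
x = np.empty(500)
x[0] = e[0]
for t in range(1, 500):
    x[t] = 0.7 * x[t - 1] + e[t]

# Bootstrap distribution of the sample mean.
boot_means = np.array(
    [block_bootstrap(x, block_len=25, rng=b).mean() for b in range(200)]
)
```

Blocks preserve the short-run dependence within each block; how fast `block_len` must grow with the sample under long memory is the theoretical question treated in the paper.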
By: | Adam D. Bull |
Abstract: | In quantitative finance, we often wish to recover the volatility of asset prices given by a noisy Itô semimartingale. Existing estimates, however, lose accuracy when the jumps are of infinite variation, as is suggested by empirical evidence. In this paper, we show that when the efficient prices are given by an unknown time-changed Lévy process, the rate of time change, which plays the role of the volatility, can be estimated well under arbitrary jump activity. We further show that our estimate remains valid for the volatility in the general semimartingale model, obtaining convergence rates as good as any previously implied in the literature. |
Date: | 2013–12 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1312.5911&r=ets |
By: | Jia Li; Andrew J. Patton |
Abstract: | This paper provides a general framework that enables many existing inference methods for predictive accuracy to be used in applications that involve forecasts of latent target variables. Such applications include the forecasting of volatility, correlation, beta, quadratic variation, jump variation, and other functionals of an underlying continuous-time process. We provide primitive conditions under which a "negligibility" result holds, and thus the asymptotic size of standard predictive accuracy tests, implemented using a high-frequency proxy for the latent variable, is controlled. An extensive simulation study verifies that the asymptotic results apply in a range of empirically relevant applications, and an empirical application to correlation forecasting is presented. |
Keywords: | Forecast evaluation, realized variance, volatility, jumps, semimartingale |
JEL: | C53 C22 C58 C52 C32 |
Date: | 2013 |
URL: | http://d.repec.org/n?u=RePEc:duk:dukeec:13-26&r=ets |
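A standard predictive accuracy test of the kind covered by the negligibility result is the Diebold-Mariano statistic, computed from per-period losses in which a high-frequency proxy stands in for the latent target. The sketch below is generic, not the paper's framework; the chi-squared "proxy" series and the two constant forecasts are illustrative placeholders.

```python
import numpy as np

def dm_test(loss1, loss2, h=1):
    """Diebold-Mariano statistic for equal predictive accuracy of two
    forecasts, given their per-period losses. The long-run variance uses
    Bartlett weights with h - 1 lags (h = forecast horizon)."""
    d = np.asarray(loss1, dtype=float) - np.asarray(loss2, dtype=float)
    n = len(d)
    u = d - d.mean()
    lrv = u @ u / n
    for lag in range(1, h):
        lrv += 2.0 * (1.0 - lag / h) * (u[lag:] @ u[:-lag]) / n
    return d.mean() / np.sqrt(lrv / n)

# Illustrative latent-target setup: squared returns act as a noisy proxy
# for the variance; two constant variance forecasts are compared.
rng = np.random.default_rng(0)
proxy = rng.standard_normal(250) ** 2
stat = dm_test((proxy - 1.0) ** 2, (proxy - 2.0) ** 2)
```

The paper's contribution is precisely the conditions under which replacing the latent target by such a proxy leaves the asymptotic size of this kind of test controlled.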
By: | Irving Arturo De Lira Salvatierra; Andrew J. Patton |
Abstract: | This paper proposes a new class of dynamic copula models for daily asset returns that exploits information from high frequency (intra-daily) data. We augment the generalized autoregressive score (GAS) model of Creal et al. (2012) with high frequency measures such as realized correlation to obtain a "GRAS" model. We find that the inclusion of realized measures significantly improves the in-sample fit of dynamic copula models across a range of U.S. equity returns. Moreover, we find that out-of-sample density forecasts from our GRAS models are superior to those from simpler models. Finally, we consider a simple portfolio choice problem to illustrate the economic gains from exploiting high frequency data for modeling dynamic dependence. |
Keywords: | Realized correlation, realized volatility, dependence, forecasting, tail risk |
JEL: | C32 C51 C58 |
Date: | 2013 |
URL: | http://d.repec.org/n?u=RePEc:duk:dukeec:13-28&r=ets |
By: | Dong Hwan Oh; Andrew J. Patton |
Abstract: | This paper proposes a new class of copula-based dynamic models for high-dimensional conditional distributions, facilitating the estimation of a wide variety of measures of systemic risk. Our proposed models draw on successful ideas from the literature on modeling high-dimensional covariance matrices and on recent work on models for general time-varying distributions. Our use of copula-based models enables the estimation of the joint model in stages, greatly reducing the computational burden. We use the proposed new models to study a collection of daily credit default swap (CDS) spreads on 100 U.S. firms over the period 2006 to 2012. We find that while the probability of distress for individual firms has fallen greatly since the financial crisis of 2008-09, the joint probability of distress (a measure of systemic risk) is substantially higher now than in the pre-crisis period. |
Keywords: | correlation, tail risk, financial crises, DCC |
JEL: | C32 C58 G01 |
Date: | 2013 |
URL: | http://d.repec.org/n?u=RePEc:duk:dukeec:13-30&r=ets |
By: | Miguel, Belmonte; Gary, Koop |
Abstract: | This paper investigates the usefulness of switching Gaussian state space models as a tool for implementing dynamic model selection (DMS) or dynamic model averaging (DMA) in time-varying parameter regression models. DMS methods allow for model switching, where a different model can be chosen at each point in time. Thus, they allow the explanatory variables in the time-varying parameter regression model to change over time. DMA carries out model averaging in a time-varying manner. We compare our exact approach to DMA/DMS with a popular existing procedure that relies on forgetting factor approximations. In an application to inflation forecasting, we use DMS to select different predictors over time. We also compare different ways of implementing DMA/DMS and investigate whether they lead to similar results. |
Keywords: | Model switching, forecast combination, switching state space model, inflation forecasting |
Date: | 2013 |
URL: | http://d.repec.org/n?u=RePEc:edn:sirdps:440&r=ets |
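The forgetting-factor approximation that the paper's exact switching state-space approach is compared against can be sketched as a simple recursion over model probabilities, in the style of Raftery-type DMA. This is a generic benchmark sketch under assumed flat initial probabilities, not the authors' exact method; the predictive densities fed in below are placeholder inputs.

```python
import numpy as np

def dma_weights(pred_dens, alpha=0.99):
    """Filtered model probabilities under the forgetting-factor
    approximation: pi_{t|t-1,k} is proportional to pi_{t-1|t-1,k}**alpha,
    then updated by each model's one-step predictive density p_k(y_t).

    pred_dens: (T, K) array of predictive density evaluations for K models.
    Returns the (T, K) array of filtered model probabilities."""
    T, K = pred_dens.shape
    probs = np.empty((T, K))
    p = np.full(K, 1.0 / K)        # flat initial model probabilities
    for t in range(T):
        pred = p ** alpha          # forgetting (flattening) step
        pred /= pred.sum()
        post = pred * pred_dens[t] # Bayes update with predictive densities
        probs[t] = post / post.sum()
        p = probs[t]
    return probs

# Placeholder densities: model 0 fits twice as well at every date.
dens = np.ones((50, 3))
dens[:, 0] = 2.0
w = dma_weights(dens)
```

DMS then picks the highest-probability model at each date; DMA averages forecasts with these weights. The forgetting factor `alpha < 1` lets the weights adapt over time rather than converge permanently to one model.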
By: | Gary, Koop |
Abstract: | This paper discusses the challenges faced by the empirical macroeconomist and methods for surmounting them. These challenges arise due to the fact that macroeconometric models potentially include a large number of variables and allow for time variation in parameters. These considerations lead to models which have a large number of parameters to estimate relative to the number of observations. A wide range of approaches are surveyed which aim to overcome the resulting problems. We stress the related themes of prior shrinkage, model averaging and model selection. Subsequently, we consider a particular modelling approach in detail. This involves the use of dynamic model selection methods with large TVP-VARs. A forecasting exercise involving a large US macroeconomic data set illustrates the practicality and empirical success of our approach. |
Keywords: | Bayesian VAR, forecasting, time-varying coefficients, state-space model |
Date: | 2013 |
URL: | http://d.repec.org/n?u=RePEc:edn:sirdps:443&r=ets |
By: | Jonas E. Arias; Juan Rubio-Ramirez; Daniel F. Waggoner |
Abstract: | Are optimism shocks an important source of business cycle fluctuations? Are deficit-financed tax cuts better than deficit-financed spending to increase output? These questions have been previously studied using SVARs identified with sign and zero restrictions and the answers have been positive and definite in both cases. While the identification of SVARs with sign and zero restrictions is theoretically attractive because it allows the researcher to remain agnostic with respect to the responses of the key variables of interest, we show that current implementation algorithms do not respect the agnosticism of the theory. These algorithms impose additional sign restrictions on variables that are seemingly unrestricted that bias the results and produce misleading confidence intervals. We provide an alternative and efficient algorithm that does not introduce any additional sign restriction, hence preserving the agnosticism of the theory. Without the additional restrictions, it is hard to support the claim that either optimism shocks are an important source of business cycle fluctuations or deficit-financed tax cuts work best at improving output. Our algorithm is not only correct but also faster than current ones. |
Date: | 2013–12 |
URL: | http://d.repec.org/n?u=RePEc:fda:fdaddt:2013-24&r=ets |
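Sign-restriction algorithms of the kind discussed typically generate candidate structural impact matrices by rotating a recursive factorization of the reduced-form covariance with a Haar-distributed orthogonal matrix. Below is a minimal sketch of that draw only; it is not the authors' corrected algorithm for combined sign and zero restrictions, whose construction of the rotation is exactly the paper's contribution.

```python
import numpy as np

def haar_rotation(n, rng):
    """Draw an orthogonal matrix uniformly (Haar measure) via the QR
    decomposition of a Gaussian matrix, with a sign normalization on the
    columns so the draw does not depend on the QR sign convention."""
    q, r = np.linalg.qr(rng.standard_normal((n, n)))
    return q * np.sign(np.diag(r))

# A candidate impact matrix is then B = chol(Sigma) @ Q; in accept/reject
# schemes the draw is kept only if the implied impulse responses satisfy
# the sign restrictions.
rng = np.random.default_rng(0)
Q = haar_rotation(4, rng)
```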
By: | Fernandes, Marcelo; Medeiros, Marcelo C.; Veiga, Alvaro |
Abstract: | In this paper, we propose a class of ACD-type models that accommodates overdispersion, intermittent dynamics, multiple regimes, and sign and size asymmetries in financial durations. In particular, our functional coefficient autoregressive conditional duration (FC-ACD) model relies on a smooth-transition autoregressive specification. The motivation lies in the fact that the latter yields a universal approximation if one lets the number of regimes grow without bound. After establishing that the sufficient conditions for strict stationarity do not exclude explosive regimes, we address model identifiability as well as the existence, consistency, and asymptotic normality of the quasi-maximum likelihood (QML) estimator for the FC-ACD model with a fixed number of regimes. In addition, we discuss how to consistently estimate, using a sieve approach, a semiparametric variant of the FC-ACD model that takes the number of regimes to infinity. An empirical illustration indicates that our functional coefficient model is flexible enough to model IBM price durations. |
Date: | 2013–12–09 |
URL: | http://d.repec.org/n?u=RePEc:fgv:eesptd:343&r=ets |
By: | H. Peter Boswijk (University of Amsterdam, Amsterdam School of Economics, Tinbergen Institute); Giuseppe Cavaliere (Department of Statistics, University of Bologna); Anders Rahbek (Department of Statistics and Operations Research, Copenhagen University); A.M. Robert Taylor (University of Essex) |
Abstract: | It is well established that the shocks driving many key macro-economic and financial variables display time-varying volatility. In this paper we consider estimation and hypothesis testing on the coefficients of the co-integrating relations and the adjustment coefficients in vector autoregressions driven by both conditional and unconditional heteroskedasticity of a quite general and unknown form in the shocks. We show that the conventional results in Johansen (1996) for the maximum likelihood estimators and associated likelihood ratio tests derived under homoskedasticity do not in general hold in the presence of heteroskedasticity. As a consequence, standard confidence intervals and tests of hypothesis on these coefficients are potentially unreliable. Solutions to this inference problem based on Wald tests (using a "sandwich" estimator of the variance matrix) and on the use of the wild bootstrap are discussed. These do not require the practitioner to specify a parametric model for volatility, or to assume that the pattern of volatility is common to, or independent across, the vector of series under analysis. We formally establish the conditions under which these methods are asymptotically valid. A Monte Carlo simulation study demonstrates that significant improvements in finite sample size can be obtained by the bootstrap over the corresponding asymptotic tests in both heteroskedastic and homoskedastic environments. An application to the term structure of interest rates in the US illustrates the difference between standard and bootstrap inferences regarding hypotheses on the co-integrating vectors and adjustment coefficients. |
Keywords: | Co-integration, adjustment coefficients, (un)conditional heteroskedasticity, heteroskedasticity-robust inference, wild bootstrap |
JEL: | C30 C32 |
Date: | 2013–11–14 |
URL: | http://d.repec.org/n?u=RePEc:kud:kuiedp:1313&r=ets |
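The wild bootstrap the authors rely on can be sketched in its simplest Rademacher form: each time-t residual vector is flipped in sign as a whole, which preserves both the time-t scale (the heteroskedasticity pattern) and the contemporaneous correlation across equations. This is only the resampling step under an assumed Rademacher multiplier distribution, not the paper's full bootstrap procedure for the cointegrated VAR.

```python
import numpy as np

def wild_draw(resid, rng):
    """One Rademacher wild-bootstrap draw: every time-t residual vector is
    multiplied by an independent +/-1 variate, so each draw keeps the
    time-t scale and the cross-equation dependence of the residuals."""
    resid = np.asarray(resid, dtype=float)
    signs = rng.choice([-1.0, 1.0], size=resid.shape[0])
    return resid * signs[:, None]

# Illustrative bivariate residuals with a deterministic volatility trend.
rng = np.random.default_rng(0)
resid = rng.standard_normal((100, 2)) * np.linspace(0.5, 3.0, 100)[:, None]
draw = wild_draw(resid, np.random.default_rng(1))
```

Because only signs are randomized, each bootstrap sample inherits whatever (un)conditional volatility pattern the data displayed, which is why no parametric volatility model needs to be specified.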
By: | Degui Li; Peter C. B. Phillips; Jiti Gao |
Abstract: | We obtain uniform consistency results for kernel-weighted sample covariances in a nonstationary multiple regression framework that allows for both fixed design and random design coefficient variation. In the fixed design case these nonparametric sample covariances have different uniform convergence rates depending on direction, a result that differs fundamentally from the random design and stationary cases. The uniform convergence rates derived are faster than the corresponding rates in the stationary case and confirm the existence of uniform super-consistency. The modelling framework and convergence rates allow for endogeneity and thus broaden the practical econometric import of these results. As a specific application, we establish uniform consistency of nonparametric kernel estimators of the coefficient functions in nonlinear cointegration models with time varying coefficients and provide sharp convergence rates in that case. For the fixed design models, in particular, there are two uniform convergence rates that apply in two different directions, both rates exceeding the usual rate in the stationary case. |
Keywords: | Cointegration; Functional coefficients; Kernel degeneracy; Nonparametric kernel smoothing; Random coordinate rotation; Super-consistency; Uniform convergence rates; Time varying coefficients |
Date: | 2013 |
URL: | http://d.repec.org/n?u=RePEc:msh:ebswps:2013-27&r=ets |
By: | Worapree Maneesoonthorn; Catherine S. Forbes; Gael M. Martin |
Abstract: | This paper investigates the dynamic behaviour of jumps in financial prices and volatility. The proposed model is based on a standard jump diffusion process for price and volatility augmented by a bivariate Hawkes process for the two jump components. The latter process specifies a joint dynamic structure for the price and volatility jump intensities, with the intensity of a volatility jump also directly affected by a jump in the price. The impact of certain aspects of the model on the higher-order conditional moments for returns is investigated. In particular, the differential effects of the jump intensities and the random process for latent volatility itself, are measured and documented. A state space representation of the model is constructed using both financial returns and non-parametric measures of integrated volatility and price jumps as the observable quantities. Bayesian inference, based on a Markov chain Monte Carlo algorithm, is used to obtain a posterior distribution for the relevant model parameters and latent variables, and to analyze various hypotheses about the dynamics in, and the relationship between, the jump intensities. An extensive empirical investigation using data based on the S&P500 market index over a period ending in early 2013 is conducted. Substantial empirical support for dynamic jump intensities is documented, with predictive accuracy enhanced by the inclusion of this type of specification. In addition, movements in the intensity parameter for volatility jumps are found to track key market events closely over this period. |
Keywords: | Dynamic price and volatility jumps; Stochastic volatility; Hawkes process; Nonlinear state space model; Bayesian Markov chain Monte Carlo; Global financial crisis |
Date: | 2013 |
URL: | http://d.repec.org/n?u=RePEc:msh:ebswps:2013-28&r=ets |
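The self-exciting intensity at the heart of a Hawkes process can be illustrated with the textbook univariate version and Ogata's thinning algorithm. The paper uses a bivariate analogue for price and volatility jumps with cross-excitation; the exponential kernel and all parameter values below are assumptions for illustration only.

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, horizon, rng=None):
    """Ogata's thinning algorithm for a self-exciting (Hawkes) process with
    intensity lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i)).
    Stationarity requires alpha < beta. O(N^2) textbook version."""
    rng = np.random.default_rng(rng)
    t, events = 0.0, []
    while True:
        # The current intensity dominates lambda until the next event,
        # because the exponential kernel only decays between events.
        lam_bar = mu + alpha * sum(np.exp(-beta * (t - s)) for s in events)
        t += rng.exponential(1.0 / lam_bar)
        if t >= horizon:
            return np.array(events)
        lam_t = mu + alpha * sum(np.exp(-beta * (t - s)) for s in events)
        if rng.uniform() <= lam_t / lam_bar:  # accept with prob lam_t/lam_bar
            events.append(t)

events = simulate_hawkes(mu=0.5, alpha=0.6, beta=1.2, horizon=100.0, rng=0)
```

Each accepted event raises the intensity by `alpha`, which then decays at rate `beta`; this clustering of events is what makes the process a natural model for dynamic jump intensities.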
By: | Simone D. Grose; Gael M. Martin; Donald S. Poskitt |
Abstract: | This paper investigates the accuracy of bootstrap-based bias correction of persistence measures for long memory fractionally integrated processes. The bootstrap method is based on the semi-parametric sieve approach, with the dynamics in the long memory process captured by an autoregressive approximation. With a view to improving accuracy, the sieve method is also applied to data pre-filtered by a semi-parametric estimate of the long memory parameter. Both versions of the bootstrap technique are used to estimate the finite sample distributions of the sample autocorrelation coefficients and the impulse response coefficients and, in turn, to bias-adjust these statistics. The accuracy of the resultant estimators in the case of the autocorrelation coefficients is also compared with that yielded by analytical bias adjustment methods when available. |
Keywords: | Long memory, ARFIMA, sieve bootstrap, bootstrap-based bias correction, sample autocorrelation function, impulse response function |
Date: | 2013 |
URL: | http://d.repec.org/n?u=RePEc:msh:ebswps:2013-29&r=ets |
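The sieve bootstrap described above rests on capturing the dependence with an autoregressive approximation and resampling its residuals. A minimal generic sketch follows; the AR order, the AR(1) example data, and the absence of the pre-filtering step are all simplifications relative to the paper, whose setting is long memory ARFIMA processes.

```python
import numpy as np

def sieve_bootstrap(x, p, n_boot, rng=None):
    """Sieve bootstrap: approximate the dependence in x with an OLS-fitted
    AR(p), resample the centred AR residuals i.i.d., and rebuild each
    bootstrap series from the fitted recursion."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    n = len(x)
    Y = x[p:]
    # Column k holds lag k+1 of the series, aligned with Y.
    X = np.column_stack([x[p - 1 - k:n - 1 - k] for k in range(p)])
    phi, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ phi
    resid -= resid.mean()
    samples = np.empty((n_boot, n))
    for b in range(n_boot):
        e = rng.choice(resid, size=n, replace=True)
        xb = np.empty(n)
        xb[:p] = x[:p]  # initialize with the observed values
        for t in range(p, n):
            xb[t] = xb[t - p:t][::-1] @ phi + e[t]
        samples[b] = xb
    return samples

# Illustrative persistent series (an AR(1); the paper's setting is ARFIMA).
rng = np.random.default_rng(0)
z = rng.standard_normal(300)
x = np.empty(300)
x[0] = z[0]
for t in range(1, 300):
    x[t] = 0.8 * x[t - 1] + z[t]

boot = sieve_bootstrap(x, p=5, n_boot=100, rng=1)
```

Statistics such as sample autocorrelations are then recomputed on each row of `boot`, and the gap between their bootstrap mean and the sample value gives the bias adjustment.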
By: | Trojan, Sebastian |
Abstract: | A very general stochastic volatility (SV) model specification with leverage, heavy tails, skew and switching regimes is proposed, using realized volatility (RV) as an auxiliary time series to improve inference on latent volatility. Asymmetry in the observation error is modeled by the Generalized Hyperbolic skew Student-t distribution, whose heavy and light tails enable modeling of substantial skewness. The information content of the range and of implied volatility using the VIX index is also investigated. Up to four regimes are identified from S&P 500 index data using RV as additional time series. The resulting number of regimes and their dynamics differ depending on the auxiliary volatility proxy, and are investigated in-sample for the financial crash period 2008/09. An out-of-sample study comparing the predictive ability of various model variants for a calm and a volatile period yields insights about the gains in forecasting performance that can be expected by incorporating different volatility proxies into the model. Results indicate that including RV pays off mostly in more volatile market conditions, whereas in calmer environments SV specifications using no auxiliary series appear to be the models of choice. Results for the VIX as a measure of implied volatility point in a similar direction. The range as volatility proxy provides a superior in-sample fit, but its predictive performance is found to be weak. |
Keywords: | Stochastic volatility, realized volatility, non-Gaussian and nonlinear state space model, Generalized Hyperbolic skew Student-t distribution, mixing distribution, regime switching, Markov chain Monte Carlo, particle filter |
JEL: | C11 C15 C32 C58 |
Date: | 2013–12 |
URL: | http://d.repec.org/n?u=RePEc:usg:econwp:2013:41&r=ets |
By: | Lance A. Fisher (Macquarie University); Hyeon-seung Huh (Yonsei University); Adrian R. Pagan (University of Sydney) |
Abstract: | This paper considers structural models when both I(1) and I(0) variables are present. It is necessary to extend the traditional classification of shocks as permanent and transitory, and we do this by introducing a mixed shock. The extra shocks coming from introducing I(0) variables into a system are then classified as either mixed or transitory. Conditions are derived upon the nature of the SVAR in the event that these extra shocks are transitory. We then analyse what happens when there are mixed shocks, finding that it changes a number of ideas that have become established from the cointegration literature. The ideas are illustrated using a well-known SVAR where there are mixed shocks. This SVAR is re-formulated so that the extra shocks coming from the introduction of I(0) variables do not affect relative prices in the long-run and it is found that this has major implications for whether there is a price puzzle. It is also shown how to handle long-run parametric restrictions when some shocks are identified using sign restrictions. |
Keywords: | Mixed models, transitory shocks, mixed shocks, long-run restrictions, sign restrictions, instrumental variables, business cycles |
JEL: | C32 C36 C51 |
Date: | 2013–12 |
URL: | http://d.repec.org/n?u=RePEc:yon:wpaper:2013rwp-61&r=ets |