New Economics Papers on Econometrics |
By: | Luc BAUWENS (UNIVERSITE CATHOLIQUE DE LOUVAIN, Center for Operations Research and Econometrics (CORE)); C.M. HAFNER; J.V.K. ROMBOUTS |
Abstract: | We propose a new multivariate volatility model where the conditional distribution of a vector time series is given by a mixture of multivariate normal distributions. Each of these distributions is allowed to have a time-varying covariance matrix. The process can be globally covariance-stationary even though some components are not covariance-stationary. We derive some theoretical properties of the model, such as the unconditional covariance matrix and the autocorrelations of squared returns. The complexity of the model requires a powerful estimation algorithm. In a simulation study we compare estimation by maximum likelihood with the EM algorithm and Bayesian estimation with a Gibbs sampler. Finally, we apply the model to daily U.S. stock returns. |
Keywords: | Multivariate volatility; Finite mixture; EM algorithm; Bayesian inference |
JEL: | C11 C22 C52 |
Date: | 2006–02–20 |
URL: | http://d.repec.org/n?u=RePEc:ctl:louvec:2006007&r=ecm |
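The conditional distribution in the Bauwens–Hafner–Rombouts model is a mixture of multivariate normals whose component covariance matrices vary over time. A minimal sketch of evaluating such a mixture log-density follows; the function name and interface are illustrative (the time-t covariances are taken as given inputs), not the authors' code:

```python
import numpy as np

def mixture_mvn_logpdf(y, weights, means, covs):
    """Log-density of a K-component multivariate normal mixture at the
    vector y. In the model above, each matrix in `covs` would be one
    component's time-t conditional covariance matrix."""
    d = y.shape[0]
    comps = []
    for w, mu, S in zip(weights, means, covs):
        diff = y - mu
        _, logdet = np.linalg.slogdet(S)        # log|S|, numerically stable
        quad = diff @ np.linalg.solve(S, diff)  # Mahalanobis quadratic form
        comps.append(np.log(w) - 0.5 * (d * np.log(2 * np.pi) + logdet + quad))
    m = max(comps)                              # log-sum-exp over components
    return m + np.log(sum(np.exp(c - m) for c in comps))
```

Summing these log-densities over t gives the log-likelihood that either the EM algorithm or the Gibbs sampler mentioned in the abstract would target.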
By: | Juan Carlos Escanciano |
Abstract: | This article proposes a general class of joint diagnostic tests for parametric conditional mean and variance models of possibly nonlinear and/or non-Markovian time series sequences. The new tests are based on a generalized spectral approach and, contrary to existing procedures, they require neither choosing a lag order that depends on the sample size nor smoothing the data. Moreover, they are robust to higher-order dependence of unknown form. It turns out that the asymptotic null distributions of the new tests depend on the data generating process, so a bootstrap procedure is proposed and theoretically justified. A simulation study compares the finite sample performance of the proposed and competing tests and shows that our tests can play a valuable role in time series modelling. An application to the S&P500 highlights the merits of our approach. |
JEL: | C12 C14 C52 |
URL: | http://d.repec.org/n?u=RePEc:una:unccee:wpwp0206&r=ecm |
By: | Luc BAUWENS (UNIVERSITE CATHOLIQUE DE LOUVAIN, Center for Operations Research and Econometrics (CORE)); Arie PREMINGER (UNIVERSITE CATHOLIQUE DE LOUVAIN, Center for Operations Research and Econometrics (CORE)); Jeroen ROMBOUTS |
Abstract: | We develop univariate regime-switching GARCH (RS-GARCH) models wherein the conditional variance switches in time from one GARCH process to another. The switching is governed by a time-varying probability, specified as a function of past information. We provide sufficient conditions for stationarity and existence of moments. Because of path dependence, maximum likelihood estimation is infeasible. By enlarging the parameter space to include the state variables, Bayesian estimation using a Gibbs sampling algorithm is feasible. We apply this model to the NASDAQ daily returns series. |
Keywords: | GARCH; regime switching; Bayesian inference |
JEL: | C11 C22 C52 |
Date: | 2006–02–20 |
URL: | http://d.repec.org/n?u=RePEc:ctl:louvec:2006006&r=ecm |
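The switching mechanism can be illustrated with a short simulation. The logistic transition probability in the past absolute return below is one assumed functional form of the "time-varying probability" described in the abstract; the parameterization is hypothetical, and `omega`, `alpha`, `beta`, `gamma` are expected as length-2 NumPy arrays:

```python
import numpy as np

def simulate_rs_garch(T, omega, alpha, beta, gamma, seed=0):
    """Simulate T returns from a two-regime RS-GARCH process. Each regime
    keeps its own GARCH(1,1) recursion, updated with the common return;
    the regime drawn at t depends on the past return through a logistic
    probability (an assumed specification)."""
    rng = np.random.default_rng(seed)
    y = np.zeros(T)
    s = np.zeros(T, dtype=int)          # regime indicator, 0 or 1
    h = omega.copy()                    # start each variance at its intercept
    for t in range(1, T):
        h = omega + alpha * y[t - 1] ** 2 + beta * h
        p1 = 1.0 / (1.0 + np.exp(-(gamma[0] + gamma[1] * abs(y[t - 1]))))
        s[t] = rng.random() < p1        # draw regime 1 with probability p1
        y[t] = np.sqrt(h[s[t]]) * rng.standard_normal()
    return y, s
```

Because the likelihood of such a path-dependent process integrates over all regime histories, simulation is easy while maximum likelihood is not, which is the motivation for the Gibbs sampler in the paper.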
By: | Anindya Banerjee (European University Institute, Department of Economics, Villa San Paolo, Via della Piazzuola 43, 50133 Florence, Italy.); Josep Lluís (University of Barcelona, Department of Econometrics, Statistics and Spanish Economy, Av. Diagonal 690, 08034 Barcelona, Spain.) |
Abstract: | The power of standard panel cointegration statistics may be affected by misspecification errors if proper account is not taken of the presence of structural breaks in the data. We propose modifications that allow for one structural break when testing the null hypothesis of no cointegration and that retain good properties in terms of empirical size and power. We provide response surfaces to approximate the finite-sample moments required to implement the statistics. Since panel cointegration statistics rely on the assumption of cross-section independence, we also generalise the tests to a common-factor framework in order to allow for dependence among the units of the panel. |
Keywords: | Panel cointegration; structural break; common factors; cross-section dependence |
JEL: | C12 C22 |
Date: | 2006–02 |
URL: | http://d.repec.org/n?u=RePEc:ecb:ecbwps:20060591&r=ecm |
By: | Ekkehart Schlicht (University of Munich and IZA Bonn); Johannes Ludsteck (Institute for Employment Research (IAB)) |
Abstract: | This paper describes an estimator for a standard state-space model with coefficients generated by a random walk that is statistically superior to the Kalman filter as applied to this particular class of models. Two closely related estimators for the variances are introduced: a maximum likelihood estimator and a moments estimator that equates certain sample moments to their expectations. These estimators perform quite similarly in many cases. In some cases, however, the moments estimator is preferable both to the proposed likelihood estimator and to the Kalman filter as implemented in the program package EViews. |
Keywords: | time-varying coefficients, adaptive estimation, Kalman filter, state-space |
JEL: | C2 C22 C51 C52 |
Date: | 2006–03 |
URL: | http://d.repec.org/n?u=RePEc:iza:izadps:dp2031&r=ecm |
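For reference, the benchmark the authors compare against — the Kalman filter applied to a regression whose coefficients follow a random walk — looks like this in a textbook implementation (this is the standard filter, not the authors' proposed estimator; the variance arguments are assumed known here):

```python
import numpy as np

def kalman_tvc(y, X, sigma_e2, sigma_v2, P0=1e4):
    """Kalman filter for the time-varying-coefficient regression
        y_t = x_t' b_t + e_t,   b_t = b_{t-1} + v_t,
    with Var(e_t) = sigma_e2 and Var(v_t) = sigma_v2 * I.
    Returns the filtered coefficient path (T x k)."""
    T, k = X.shape
    b = np.zeros(k)                     # diffuse start for the state
    P = P0 * np.eye(k)
    Q = sigma_v2 * np.eye(k)
    out = np.zeros((T, k))
    for t in range(T):
        x = X[t]
        P = P + Q                       # predict: random-walk state
        f = x @ P @ x + sigma_e2        # prediction-error variance
        K = P @ x / f                   # Kalman gain
        b = b + K * (y[t] - x @ b)      # update with the prediction error
        P = P - np.outer(K, x) @ P      # posterior state covariance
        out[t] = b
    return out
```

The paper's point is that for this particular model class the filtered path, and EViews' implementation of it, can be improved upon by the proposed likelihood and moments estimators.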
By: | Bruno Crépon (CREST-INSEE, CEPR and IZA Bonn) |
Abstract: | The control function in the semiparametric selection model is zero at infinity. This paper proposes additional restrictions of the same type and shows how to use them to test the assumed exclusion restrictions that are necessary for root-N estimation of the model. The test is based on the estimated control function and its derivative and takes the form of a GMM step that occurs at infinity. Alternative estimators of the parameters, which do not rely on exclusion restrictions, are proposed, extending available results on estimation of the intercept at infinity. Simulation evidence is provided. |
Keywords: | policy evaluation, sample selection, exclusion restriction, semi parametric estimation |
JEL: | C14 C31 C34 |
Date: | 2006–03 |
URL: | http://d.repec.org/n?u=RePEc:iza:izadps:dp2035&r=ecm |
By: | Luc BAUWENS (UNIVERSITE CATHOLIQUE DE LOUVAIN, Center for Operations Research and Econometrics (CORE)); J.V.K. ROMBOUTS |
Abstract: | We estimate by Bayesian inference the mixed conditional heteroskedasticity model of Haas, Mittnik and Paolella (2004a). We construct a Gibbs sampler algorithm to compute posterior and predictive densities. The number of mixture components is selected by the marginal likelihood criterion. We apply the model to the S&P500 daily returns. |
Keywords: | Finite mixture; ML estimation; Bayesian inference; Value at Risk |
JEL: | C11 C15 C32 |
Date: | 2005–12–01 |
URL: | http://d.repec.org/n?u=RePEc:ctl:louvec:2005058&r=ecm |
By: | T. Brenner; C. Werker |
Abstract: | This paper introduces a categorization of simulation models and provides an explicit overview of the steps that lead to a simulation model. We highlight the advantages and disadvantages of various simulation approaches by examining how they advocate different ways of constructing simulation models. To this end, we discuss a number of relevant methodological issues, such as how realistic simulation models are obtained and which kinds of inference can be used in a simulation approach. Finally, the paper presents a practical guide to how simulation should and can be conducted. |
Keywords: | Methodology, Simulation Models, Practical Guide |
JEL: | B41 B52 C63 |
Date: | 2006–03 |
URL: | http://d.repec.org/n?u=RePEc:esi:evopap:2006-02&r=ecm |
By: | Susan Athey (Dept. of Economics, Stanford University); Philip A. Haile (Dept. of Economics and Cowles Foundation, Yale University) |
Abstract: | Many important economic questions arising in auctions can be answered only with knowledge of the underlying primitive distributions governing bidder demand and information. An active literature has developed aiming to estimate these primitives by exploiting restrictions from economic theory as part of the econometric model used to interpret auction data. We review some highlights of this recent literature, focusing on identification and empirical applications. We describe three insights that underlie much of the recent methodological progress in this area and discuss some of the ways these insights have been extended to richer models allowing more convincing empirical applications. We discuss several recent empirical studies using these methods to address a range of important economic questions. |
Keywords: | Auctions, Identification, Estimation, Testing |
JEL: | C5 L1 D4 |
Date: | 2006–03 |
URL: | http://d.repec.org/n?u=RePEc:cwl:cwldpp:1562&r=ecm |
By: | Pesavento, Elena; Rossi, Barbara |
Abstract: | This paper is a comprehensive comparison of existing methods for constructing confidence bands for univariate impulse response functions in the presence of high persistence. Monte Carlo results show that the methods of Kilian (1998a), Wright (2000), Gospodinov (2004) and Pesavento and Rossi (2005) have favorable coverage properties, although they differ in terms of robustness at various horizons, median unbiasedness, and reliability in the possible presence of a unit or mildly explosive root. On the other hand, methods like Runkle's (1987) bootstrap, Andrews and Chen (1994), and regressions in levels or first differences (even when based on pre-tests) may not have accurate coverage properties. The paper makes recommendations as to the appropriateness of each method in empirical work. |
Keywords: | Local to unity asymptotics, persistence, impulse response functions |
JEL: | C1 C2 |
Date: | 2006 |
URL: | http://d.repec.org/n?u=RePEc:duk:dukeec:06-03&r=ecm |
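To fix ideas, a bare-bones percentile bootstrap for the impulse response of an AR(1) is sketched below. This naive band is a stand-in for the more refined procedures compared in the paper; with a highly persistent root it is exactly the kind of method whose coverage can be poor, which is the paper's point:

```python
import numpy as np

def bootstrap_irf_bands(y, horizons=10, n_boot=500, level=0.95, seed=0):
    """Percentile bootstrap bands for the AR(1) impulse response
    rho**h, h = 0..horizons. Resamples residuals, rebuilds the series,
    and re-estimates rho by OLS on each bootstrap path."""
    rng = np.random.default_rng(seed)

    def ar1_ols(z):
        # OLS slope of z_t on z_{t-1}, no intercept
        return (z[:-1] @ z[1:]) / (z[:-1] @ z[:-1])

    rho_hat = ar1_ols(y)
    resid = y[1:] - rho_hat * y[:-1]
    irfs = np.empty((n_boot, horizons + 1))
    for b in range(n_boot):
        e = rng.choice(resid, size=len(y) - 1, replace=True)
        z = np.empty(len(y))
        z[0] = y[0]
        for t in range(1, len(y)):
            z[t] = rho_hat * z[t - 1] + e[t - 1]
        irfs[b] = ar1_ols(z) ** np.arange(horizons + 1)
    lo, hi = np.percentile(irfs, [100 * (1 - level) / 2,
                                  100 * (1 + level) / 2], axis=0)
    return rho_hat ** np.arange(horizons + 1), lo, hi
```

Because the OLS estimate of a near-unit root is biased downward, bands built this way undercover at long horizons, motivating the bias-corrected and local-to-unity methods the paper recommends.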
By: | John Creedy; Guyonne Kalb; Hsein Kew |
Abstract: | This paper addresses the need for a measure of the uncertainty associated with the results calculated through tax policy behavioural microsimulation modelling. Deriving an analytical measure would be extremely complicated; we therefore propose a simulation approach that generates a pseudo sampling distribution of aggregate measures based on the sampling distribution of the estimated labour supply parameters. This approach, which is very computer intensive, is compared to a more time-efficient approach in which the functional form of the sampling distribution is assumed to be normal. The results show that in many instances the two approaches give quite similar results. The exception is when aggregate measures for minor types of payments, involving relatively small groups of the population, are examined. |
Date: | 2005 |
URL: | http://d.repec.org/n?u=RePEc:mlb:wpaper:936&r=ecm |
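The simulated approach described above can be sketched in a few lines: draw parameter vectors from the (asymptotically normal) estimated sampling distribution of the labour supply parameters and re-evaluate the aggregate at each draw. The microsimulation step is stood in for by a user-supplied `aggregate_fn`; that interface is hypothetical, not the authors' code:

```python
import numpy as np

def pseudo_sampling_distribution(theta_hat, V_hat, aggregate_fn,
                                 n_draws=1000, seed=0):
    """Pseudo sampling distribution of an aggregate measure.

    theta_hat : estimated parameter vector
    V_hat     : estimated covariance matrix of theta_hat
    aggregate_fn : maps a parameter vector to the aggregate of interest
                   (in the paper, a full microsimulation run per draw,
                   hence the heavy computational cost)."""
    rng = np.random.default_rng(seed)
    draws = rng.multivariate_normal(theta_hat, V_hat, size=n_draws)
    return np.array([aggregate_fn(th) for th in draws])
```

Percentiles of the returned array give a confidence interval for the aggregate without assuming its distribution is normal, which is where the two approaches compared in the paper can diverge for small population subgroups.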
By: | Martin Spieß |
URL: | http://d.repec.org/n?u=RePEc:diw:diwwpp:dp564&r=ecm |
By: | Michael C. Lovell (Wesleyan University) |
Date: | 2005–12 |
URL: | http://d.repec.org/n?u=RePEc:wes:weswpa:2005-012&r=ecm |