nep-ecm New Economics Papers
on Econometrics
Issue of 2009‒03‒07
thirteen papers chosen by
Sune Karlsson
Orebro University

  1. Nonstationary-Volatility Robust Panel Unit Root Tests and the Great Moderation By Hanck Christoph
  2. Efficient Estimation of Copula-based Semiparametric Markov Models By Xiaohong Chen; Wei Biao Wu; Yanping Yi
  3. Testing for Unit Root against LSTAR Model: Wavelet Improvement under GARCH Distortion By Li, Yushu; Shukur, Ghazi
  4. "Asymptotic Expansions and Higher Order Properties of Semi-Parametric Estimators in a System of Simultaneous Equations" By Naoto Kunitomo; Yukitoshi Matsushita
  5. Duration-Based Volatility Estimation By Torben G. Andersen; Dobrislav Dobrev; Ernst Schaumburg
  6. Volatility Forecasting: The Jumps Do Matter By Fulvio Corsi; Davide Pirino; Roberto Reno
  7. Wavelet Improvement of the Over-rejection of Unit root test under GARCH errors By Li, Yushu; Shukur, Ghazi
  8. Nonparametric Stochastic Volatility By Federico M. Bandi; Roberto Reno
  9. Automated Variable Selection in Vector Multiplicative Error Models By Fabrizio Cipollini; Giampiero M. Gallo
  10. Semiparametric vector MEM By Fabrizio Cipollini; Robert F. Engle; Giampiero M. Gallo
  11. Further simulation evidence on the performance of the Poisson pseudo-maximum likelihood estimator By J.M.C. Santos Silva; Silvana Tenreyro
  12. An Optimal Weight for Realized Variance Based on Intermittent High-Frequency Data By Hiroki Masuda; Takayuki Morimoto
  13. Econometric reduction theory and philosophy By Genaro Sucarrat

  1. By: Hanck Christoph (METEOR)
    Abstract: This paper proposes a new testing approach for panel unit roots that is, unlike previously suggested tests, robust to nonstationarity in the volatility process of the innovations of the time series in the panel. Nonstationary volatility arises, for instance, when there are structural breaks in the innovation variances. A prominent example is the reduction in GDP growth variances enjoyed by many industrialized countries, known as the 'Great Moderation'. The panel test is based on Simes' [Biometrika 1986, "An Improved Bonferroni Procedure for Multiple Tests of Significance"] classical multiple test, which combines evidence from time series unit root tests of the series in the panel. As time series unit root tests, we employ the recently proposed tests of Cavaliere and Taylor [Journal of Time Series Analysis 2008, "Time-Transformed Unit Root Tests for Models with Non-Stationary Volatility"]. The panel test is robust to general patterns of cross-sectional dependence and yet straightforward to implement, requiring only valid p-values of time series unit root tests and no resampling. Monte Carlo experiments show that other panel unit root tests suffer from sometimes severe size distortions in the presence of nonstationary volatility, and that this defect can be remedied using the test proposed here. The new test is applied to test for a unit root in an OECD panel of gross domestic products, yielding inference robust to the 'Great Moderation'. We find little evidence of trend stationarity.
    Keywords: macroeconomics
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:dgr:umamet:2009009&r=ecm
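    Illustration (Python): the Simes (1986) step-up rule on which the panel test in item 1 builds, applied to hypothetical p-values from the individual time-series unit root tests. This is only the combination step; the Cavaliere-Taylor time-transformed tests themselves are not reproduced here.
      import numpy as np

      def simes_reject(pvals, alpha=0.05):
          # Reject the joint null (all series have a unit root) if
          # p_(i) <= i * alpha / N for at least one ordered p-value.
          p = np.sort(np.asarray(pvals))
          n = len(p)
          return bool(np.any(p <= alpha * np.arange(1, n + 1) / n))

      # Hypothetical p-values from unit root tests on each panel member
      print(simes_reject([0.003, 0.04, 0.20, 0.51, 0.77]))   # True at the 5% level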
  2. By: Xiaohong Chen (Cowles Foundation, Yale University); Wei Biao Wu (Dept. of Statistics, University of Chicago); Yanping Yi (Dept of Economics, New York University)
    Abstract: This paper considers efficient estimation of copula-based semiparametric strictly stationary Markov models. These models are characterized by nonparametric invariant distributions and parametric copula functions, where the copulas capture all scale-free temporal dependence and tail dependence of the processes. The Markov models generated via tail-dependent copulas may look highly persistent and are useful for financial and economic applications. We first show that Markov processes generated via Clayton, Gumbel and Student's t copulas (with tail dependence) are all geometrically ergodic. We then propose a sieve maximum likelihood estimation (MLE) procedure for the copula parameter, the invariant distribution and the conditional quantiles. We show that the sieve MLEs of any smooth functionals are root-n consistent, asymptotically normal and efficient, and that the sieve likelihood ratio statistic is chi-square distributed. We present Monte Carlo studies comparing the finite sample performance of the sieve MLE, the two-step estimator of Chen and Fan (2006), the correctly specified parametric MLE and the incorrectly specified parametric MLE. The simulation results indicate that our sieve MLEs perform very well, having much smaller biases and variances than the two-step estimator for Markov models generated by Clayton, Gumbel and other copulas with strong tail dependence.
    Keywords: Copula, Tail dependence, Nonlinear Markov models, Geometric ergodicity, Sieve MLE, Semiparametric efficiency, Sieve likelihood ratio statistics, Value-at-Risk
    JEL: C14 C22
    Date: 2009–02
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1691&r=ecm
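    Illustration (Python): a minimal sketch of the two-step benchmark estimator mentioned in item 2 (in the spirit of Chen and Fan, 2006) for a Markov model generated by a Clayton copula: rescaled empirical CDF in step one, copula pseudo-MLE on consecutive pairs in step two. The sieve MLE proposed in the paper is not reproduced.
      import numpy as np
      from scipy.optimize import minimize_scalar
      from scipy.stats import rankdata

      def clayton_loglik(theta, u, v):
          # Log-density of the Clayton copula at the points (u, v), theta > 0
          s = u ** -theta + v ** -theta - 1.0
          return np.sum(np.log1p(theta)
                        - (theta + 1.0) * (np.log(u) + np.log(v))
                        - (2.0 + 1.0 / theta) * np.log(s))

      def two_step_clayton(x):
          u = rankdata(x) / (len(x) + 1.0)          # rescaled empirical CDF
          nll = lambda th: -clayton_loglik(th, u[:-1], u[1:])
          return minimize_scalar(nll, bounds=(1e-4, 20.0), method="bounded").x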
  3. By: Li, Yushu (Centre for Labour Market Policy Research (CAFO)); Shukur, Ghazi (Centre for Labour Market Policy Research (CAFO))
    Abstract: In this paper, we propose a nonlinear Dickey-Fuller F test for a unit root against a first-order Logistic Smooth Transition Autoregressive, LSTAR(1), model with time as the transition variable. The test statistic is established under the null hypothesis of a random walk without drift, with a nonlinear LSTAR(1) model as the alternative. The asymptotic distribution of the test is derived analytically, while its small-sample size and power properties are investigated by Monte Carlo experiments. The results show a serious size distortion of the nonlinear Dickey-Fuller F test when GARCH errors appear in the data generating process (DGP), leading to over-rejection of the unit root null hypothesis. To solve this problem, we use the wavelet technique to counteract the GARCH distortion and improve the size properties of the test under GARCH errors. We also discuss the asymptotic distributions of the test statistics in GARCH and wavelet environments. Finally, an empirical example is used to compare our test with the traditional Dickey-Fuller F test.
    Keywords: Unit root test; Dickey-Fuller F test; STAR model; GARCH(1,1); Wavelet method; MODWT
    JEL: C15 C52
    Date: 2009–02–26
    URL: http://d.repec.org/n?u=RePEc:hhs:vxcafo:2009_006&r=ecm
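    Illustration (Python): a stylized Monte Carlo set-up for checking the empirical size of a Dickey-Fuller-type unit root test when the random-walk innovations follow a GARCH(1,1). The paper's nonlinear F statistic and wavelet correction are not reproduced; the standard statsmodels ADF test stands in for them.
      import numpy as np
      from statsmodels.tsa.stattools import adfuller

      rng = np.random.default_rng(0)

      def rw_garch(n, omega=0.05, alpha=0.3, beta=0.65):
          # Random walk without drift, GARCH(1,1) innovations
          h, e, y = omega / (1 - alpha - beta), 0.0, np.zeros(n)
          for t in range(1, n):
              h = omega + alpha * e ** 2 + beta * h
              e = np.sqrt(h) * rng.standard_normal()
              y[t] = y[t - 1] + e
          return y

      # Fraction of 5%-level rejections of the true unit root null
      rej = np.mean([adfuller(rw_garch(200), maxlag=0)[1] < 0.05 for _ in range(500)])
      print(rej)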
  4. By: Naoto Kunitomo (Faculty of Economics, University of Tokyo); Yukitoshi Matsushita (JSPS and Graduate School of Economics, University of Tokyo)
    Abstract: Asymptotic expansions are made for the distributions of the Maximum Empirical Likelihood (MEL) estimator and the Estimating Equation (EE) estimator (or the Generalized Method of Moments (GMM) estimator in econometrics) for the coefficients of a single structural equation in a system of linear simultaneous equations, which corresponds to a reduced rank regression model. The expansions in terms of the sample size, when the non-centrality parameters increase proportionally, are carried out to O(n^{-1}). Comparisons of the distributions of the MEL and GMM estimators are made. We also relate the asymptotic expansions of the distributions of the MEL and GMM estimators to the corresponding expansions for the Limited Information Maximum Likelihood (LIML) and Two-Stage Least Squares (TSLS) estimators. We give useful information on the higher order properties of alternative estimators, including the semi-parametric inefficiency factor under the homoscedasticity assumption.
    Date: 2009–02
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2009cf611&r=ecm
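    Illustration (Python): two of the single-equation estimators compared in item 4, TSLS and a two-step GMM (EE) estimator with a heteroskedasticity-robust weight matrix, as a sketch under standard linear-IV assumptions. The higher-order expansions and the MEL estimator itself are beyond a short example.
      import numpy as np

      def tsls(y, X, Z):
          # beta = (X' P_Z X)^{-1} X' P_Z y, with P_Z the projection on the instruments Z
          Pz = Z @ np.linalg.solve(Z.T @ Z, Z.T)
          return np.linalg.solve(X.T @ Pz @ X, X.T @ Pz @ y)

      def gmm_two_step(y, X, Z):
          b1 = tsls(y, X, Z)                            # first-step estimate
          u = y - X @ b1
          S = (Z * (u ** 2)[:, None]).T @ Z / len(y)    # robust variance of the moments
          W = np.linalg.inv(S)
          A = X.T @ Z @ W @ Z.T
          return np.linalg.solve(A @ X, A @ y)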
  5. By: Torben G. Andersen; Dobrislav Dobrev; Ernst Schaumburg
    Abstract: We develop a novel approach to estimating the integrated variance of a general jump-diffusion with stochastic volatility. Our approach exploits the relationship between the speed (distance traveled per fixed time unit) and passage time (time taken to travel a fixed distance) of the Brownian motion. The new class of duration-based IV estimators derived in this paper is shown to be robust to both jumps and market microstructure noise. Moreover, their asymptotic and finite sample properties compare favorably to those of commonly used robust IV estimators.
    Date: 2009–03
    URL: http://d.repec.org/n?u=RePEc:hst:ghsdps:gd08-034&r=ecm
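    Illustration (Python): the passage-time identity behind duration-based estimation in item 5, in its simplest form. For a driftless Brownian motion with volatility sigma, the expected first exit time from a band (-h, h) is h^2/sigma^2, so averaging observed exit durations gives a variance estimate. The paper's actual estimator class, and its robustness to jumps and microstructure noise, are not reproduced.
      import numpy as np

      rng = np.random.default_rng(1)

      def passage_time_variance(h=0.01, sigma=0.2, dt=1e-4, n_steps=500_000):
          x, t, taus = 0.0, 0.0, []
          for z in rng.standard_normal(n_steps):
              x += sigma * np.sqrt(dt) * z            # simulated (log-)price path
              t += dt
              if abs(x) >= h:                         # band of half-width h crossed
                  taus.append(t)
                  x, t = 0.0, 0.0
          return h ** 2 / np.mean(taus)               # uses E[tau] = h^2 / sigma^2

      print(passage_time_variance())                  # roughly sigma^2 = 0.04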
  6. By: Fulvio Corsi; Davide Pirino; Roberto Reno
    Abstract: This study reconsiders the role of jumps in volatility forecasting by showing that jumps have a positive and mostly significant impact on future volatility. This result becomes apparent once volatility is correctly separated into its continuous and discontinuous components. To this purpose, we introduce the concept of threshold multipower variation (TMPV), which is based on the joint use of bipower variation and threshold estimation. With respect to alternative methods, our TMPV estimator provides less biased and more robust estimates of the continuous quadratic variation and of the jumps. This technique also provides a new test for jump detection which has substantially more power than traditional tests. We use this separation to forecast volatility with a heterogeneous autoregressive (HAR) model, which parsimoniously models the long memory of realized volatility time series. Empirical analysis shows that the proposed techniques significantly improve the accuracy of volatility forecasts for the S&P 500 index, single stocks and US bond yields, especially in periods following the occurrence of a jump.
    Keywords: volatility forecasting, jumps, bipower variation, threshold estimation, stock, bond
    JEL: G1 C1 C22 C53
    Date: 2009–03
    URL: http://d.repec.org/n?u=RePEc:hst:ghsdps:gd08-036&r=ecm
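    Illustration (Python): the continuous/jump separation and the HAR forecasting regression of item 6 in their textbook form, using plain bipower variation in place of the paper's threshold multipower variation and jump test.
      import numpy as np
      import statsmodels.api as sm

      def realized_measures(r):
          # r: matrix of intraday returns, one day per row
          rv = np.sum(r ** 2, axis=1)
          m = r.shape[1]
          bv = (np.pi / 2) * (m / (m - 1)) * np.sum(np.abs(r[:, 1:]) * np.abs(r[:, :-1]), axis=1)
          j = np.maximum(rv - bv, 0.0)                # jump part
          return rv, rv - j, j                        # total, continuous, jump

      def har_cj(rv, c, j):
          # Regress next-day RV on daily/weekly/monthly continuous terms and the daily jump
          roll = lambda x, k: np.array([x[t - k + 1:t + 1].mean() for t in range(21, len(x) - 1)])
          X = sm.add_constant(np.column_stack([roll(c, 1), roll(c, 5), roll(c, 22), roll(j, 1)]))
          return sm.OLS(rv[22:], X).fit()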
  7. By: Li, Yushu (Centre for Labour Market Policy Research (CAFO)); Shukur, Ghazi (Centre for Labour Market Policy Research (CAFO))
    Abstract: In this paper, we use the wavelet technique to alleviate the over-rejection problem of the traditional Dickey-Fuller test for a unit root when the data suffer from a GARCH(1,1) effect. The logic is that the wavelet spectrum decomposition can separate out information at different frequencies in the data series. We prove that the asymptotic distribution of our test is similar to that of the traditional Dickey-Fuller (1979, 1981) type of tests. The small-sample distribution of the new test is assessed by means of Monte Carlo simulation. An empirical example with data on immigration to Sweden during the period 1950 to 2000 is used to illustrate the test. The results reveal that the traditional Dickey-Fuller type of test rejects the unit root, while our wavelet-improved test does not.
    Keywords: Dickey-Fuller test; GARCH(1,1); Wavelet spectrum decomposition; MODWT
    JEL: C15 C52
    Date: 2009–02–26
    URL: http://d.repec.org/n?u=RePEc:hhs:vxcafo:2009_007&r=ecm
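    Illustration (Python): the basic idea in item 7 of testing on a wavelet-smoothed series, using a hand-rolled level-1 MODWT Haar scaling (low-pass) filter and the standard ADF test. The paper's full MODWT treatment and the asymptotics of the resulting test are not reproduced.
      import numpy as np
      from statsmodels.tsa.stattools import adfuller

      def haar_modwt_smooth(y):
          # Level-1 MODWT scaling component, Haar filter {1/2, 1/2}, circular filtering
          y = np.asarray(y, dtype=float)
          return 0.5 * (y + np.roll(y, 1))

      y = np.cumsum(np.random.default_rng(2).standard_normal(300))   # toy random walk
      print(adfuller(y)[1], adfuller(haar_modwt_smooth(y))[1])       # p-values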
  8. By: Federico M. Bandi; Roberto Reno
    Abstract: Using recent advances in the nonparametric estimation of continuous-time processes under mild statistical assumptions, as well as recent developments in nonparametric volatility estimation from market-microstructure-noise-contaminated high-frequency asset price data, we provide (i) a theory of spot variance estimation and (ii) functional methods for stochastic volatility modelling. Our methods allow for the joint evaluation of return and volatility dynamics with nonlinear drift and diffusion functions, nonlinear leverage effects, jumps in returns and volatility with possibly state-dependent jump intensities, as well as nonlinear risk-return trade-offs. Our identification approach and asymptotic results apply under weak recurrence assumptions and, hence, accommodate the persistence properties of variance in finite samples. Functional estimation of a generalized (i.e., nonlinear) version of the square-root stochastic variance model with jumps in both volatility and returns for the S&P 500 index suggests the need for richer variance dynamics than in existing work. We find a linear specification for the diffusive variance of variance to be misspecified (and inferior to a more flexible CEV specification), even when allowing for jumps in the variance dynamics.
    Keywords: Spot variance, stochastic volatility, jumps in returns, jumps in volatility, leverage effects, risk-return trade-offs, kernel methods, recurrence, market microstructure noise.
    Date: 2009–03
    URL: http://d.repec.org/n?u=RePEc:hst:ghsdps:gd08-035&r=ecm
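    Illustration (Python): textbook Nadaraya-Watson estimates of the drift and squared diffusion functions from a discretely sampled spot-variance series, the basic building block behind the functional approach of item 8. The paper's noise-robust spot-variance step, jump treatment and asymptotics are not reproduced.
      import numpy as np

      def nw_drift_diffusion(v, dt, grid, h):
          # Kernel-weighted first and second infinitesimal moments of the series v
          x, dx = v[:-1], np.diff(v)
          K = np.exp(-0.5 * ((x[None, :] - grid[:, None]) / h) ** 2)   # Gaussian kernel
          w = K / K.sum(axis=1, keepdims=True)
          mu = w @ (dx / dt)               # drift function evaluated on the grid
          s2 = w @ (dx ** 2 / dt)          # squared diffusion function on the grid
          return mu, s2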
  9. By: Fabrizio Cipollini (Università di Firenze, Dipartimento di Statistica "G. Parenti"); Giampiero M. Gallo (Università degli Studi di Firenze, Dipartimento di Statistica "G. Parenti")
    Abstract: Multiplicative Error Models (MEM) can be used to trace the dynamics of non-negative valued processes. Interactions between several such processes are accommodated by the vector MEM and estimated by maximum likelihood (Gamma marginals with copula functions) or by the Generalized Method of Moments. In choosing the relevant variables, one can follow an automated procedure in which the full specification is successively pruned in a general-to-specific approach. An efficient and fast algorithm is presented in this paper and evaluated by means of a simulation and a real-world example of volatility spillovers in European markets.
    Keywords: Multiplicative Error Model, GMM, Simultaneous Equations, Volatility, Market Activity
    JEL: C22 C52 C53
    Date: 2009–02
    URL: http://d.repec.org/n?u=RePEc:fir:econom:wp2009_02&r=ecm
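    Illustration (Python): the general-to-specific pruning loop of item 9 in its simplest form, backward elimination by p-value in an OLS regression. The paper applies the same logic inside a vector MEM, which is not reproduced here.
      import numpy as np
      import statsmodels.api as sm

      def prune_general_to_specific(y, X, names, alpha=0.05):
          keep = list(range(X.shape[1]))
          while True:
              res = sm.OLS(y, sm.add_constant(X[:, keep])).fit()
              pvals = np.asarray(res.pvalues)[1:]    # skip the constant
              worst = int(np.argmax(pvals))
              if pvals[worst] <= alpha or len(keep) == 1:
                  return [names[i] for i in keep], res
              del keep[worst]                        # drop the least significant regressor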
  10. By: Fabrizio Cipollini (Università di Firenze, Dipartimento di Statistica "G. Parenti"); Robert F. Engle (New York University - Leonard Stern School of Business); Giampiero M. Gallo (Università degli Studi di Firenze, Dipartimento di Statistica "G. Parenti")
    Abstract: In financial time series analysis we encounter several instances of non-negative valued processes (volumes, trades, durations, realized volatility, daily range, and so on) which exhibit clustering and can be modeled as the product of a vector of conditionally autoregressive scale factors and a multivariate iid innovation process (vector Multiplicative Error Model). Two novel points are introduced in this paper relative to previous suggestions: a more general specification, which sets this vector MEM apart from an equation-by-equation specification; and the adoption of a GMM-based approach, which bypasses the complicated issue of specifying a general multivariate non-negative valued innovation process. A vMEM for volumes, number of trades and realized volatility reveals empirical support for a dynamically interdependent pattern of relationships among the variables for a number of NYSE stocks.
    Keywords: Multiplicative Error Model, GMM, Simultaneous Equations, Volatility, Market Activity
    JEL: C22 C52 C53
    Date: 2009–02
    URL: http://d.repec.org/n?u=RePEc:fir:econom:wp2009_03&r=ecm
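    Illustration (Python): a univariate MEM(1,1) estimated equation by equation with an exponential quasi-likelihood, i.e. the baseline that the joint vMEM/GMM treatment of item 10 improves upon. Parameter names and starting values are illustrative.
      import numpy as np
      from scipy.optimize import minimize

      def mem_filter(x, omega, alpha, beta):
          x = np.asarray(x, dtype=float)
          mu = np.empty_like(x)
          mu[0] = x.mean()
          for t in range(1, len(x)):
              mu[t] = omega + alpha * x[t - 1] + beta * mu[t - 1]
          return mu

      def neg_qml(params, x):
          # Exponential QML: minimize sum(log mu_t + x_t / mu_t)
          omega, alpha, beta = params
          if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
              return np.inf
          mu = mem_filter(x, omega, alpha, beta)
          return np.sum(np.log(mu) + x / mu)

      def fit_mem(x):
          start = np.array([0.1 * np.mean(x), 0.1, 0.8])
          return minimize(neg_qml, start, args=(x,), method="Nelder-Mead").x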
  11. By: J.M.C. Santos Silva; Silvana Tenreyro
    Abstract: We extend the simulation results given in Santos Silva and Tenreyro (2006, “The log of gravity,” The Review of Economics and Statistics, 88, 641-658) by considering data generated as a finite mixture of gamma variates. Data generated in this way can naturally have a large proportion of zeros and is fully compatible with constant elasticity models such as the gravity equation. Our results confirm that the Poisson pseudo maximum likelihood estimator is generally well behaved.
    Date: 2009–03–02
    URL: http://d.repec.org/n?u=RePEc:esx:essedp:666&r=ecm
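    Illustration (Python): Poisson pseudo-maximum likelihood for an exponential (constant-elasticity) conditional mean with many zeros, on a hypothetical overdispersed data-generating process rather than the gamma-mixture design of item 11. Only the conditional mean needs to be correctly specified; robust standard errors are requested.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      n = 5000
      x = rng.normal(size=(n, 2))
      mu = np.exp(0.5 + x @ np.array([1.0, -1.0]))          # true conditional mean
      y = rng.negative_binomial(1.0, 1.0 / (1.0 + mu))      # overdispersed, many zeros
      ppml = sm.GLM(y, sm.add_constant(x), family=sm.families.Poisson()).fit(cov_type="HC1")
      print(ppml.params)                                    # roughly [0.5, 1.0, -1.0]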
  12. By: Hiroki Masuda; Takayuki Morimoto
    Abstract: In Japanese stock markets there are two kinds of non-trading breaks, the overnight period and the lunch break, which inevitably increase the variance of the naive realized variance (RV) as an estimator of daily volatility. To obtain a more stable estimate, we consider a modification of the weighting technique of Hansen and Lunde (2005). As an empirical study, we estimate optimal weights, in a certain sense, for Japanese stock data listed on the Tokyo Stock Exchange. We find that, for most stocks, appropriate use of the optimally weighted RV can lead to a remarkably smaller estimation variance than naive RV, and hence to substantially more accurate forecasts of daily volatility.
    Keywords: high-frequency data, market microstructure noise, realized volatility, Japanese stock markets, variance of realized variance
    JEL: C19 C22 C51
    Date: 2009–02
    URL: http://d.repec.org/n?u=RePEc:hst:ghsdps:gd08-033&r=ecm
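    Illustration (Python): the generic variance-minimizing weight for combining two (approximately) unbiased estimators of daily variance, e.g. a rescaled squared overnight return and the intraday RV. The rescaling and the specific modification of Hansen and Lunde (2005) studied in item 12 are assumed to have been handled separately.
      import numpy as np

      def optimal_weight(a, b):
          # Weight w minimizing Var(w*a + (1-w)*b) for two unbiased estimators a, b
          va, vb = np.var(a, ddof=1), np.var(b, ddof=1)
          cab = np.cov(a, b, ddof=1)[0, 1]
          return (vb - cab) / (va + vb - 2.0 * cab)

      # a: squared overnight returns rescaled to be unbiased for total daily variance
      # b: intraday realized variance, rescaled likewise (both are assumed inputs)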
  13. By: Genaro Sucarrat
    Abstract: Econometric reduction theory provides a comprehensive probabilistic framework for the analysis and classification of the reductions (simplifications) associated with empirical econometric models. However, the available approaches to econometric reduction theory are unable to satisfactorily accommodate a commonplace theory of social reality, namely that the course of history is indeterministic, that history does not repeat itself and that the future depends on the past. Using concepts from philosophy, this paper proposes a solution to these shortcomings, which in addition permits new reductions, interpretations and definitions.
    Keywords: Theory of reduction, DGP, Possible worlds, Econometrics and philosophy
    JEL: B40 C50
    Date: 2009–02
    URL: http://d.repec.org/n?u=RePEc:cte:werepe:we091005&r=ecm

This nep-ecm issue is ©2009 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.