nep-ets New Economics Papers
on Econometric Time Series
Issue of 2005‒04‒03
ten papers chosen by
Yong Yin
SUNY at Buffalo

  1. A Panel Unit Root Test with Good Size and Power in Small Samples By Claude Lopez
  3. Univariate nonlinear time series models By Terasvirta, Timo
  4. Transition Variables in the Markov-switching Model: Some Small Sample Properties By Erlandsson, Ulf
  5. Nonparametric inference for unbalanced time series data By Oliver Linton
  6. Automatic positive semi-definite HAC covariance matrix and GMM estimation By Richard Smith
  7. GEL Criteria for Moment Condition Models By Richard Smith
  8. On the Long-Run Variance Ratio Test for a Unit Root By Ye Cai; Mototsugu Shintani
  9. Volatility clustering, leverage effects, and jumps dynamics in emerging Asian equity markets By Daal, Elton; Naka, Atsuyuki; Yu, Jung-Suk
  10. Reducing Bias of MLE in a Dynamic Panel Model By Jinyong Hahn; Hyungsik Roger Moon

  1. By: Claude Lopez
    Abstract: This paper offers a panel extension of the unit root test proposed by Elliott, Rothenberg and Stock (1996). More specifically, the proposed approach allows for heterogeneous serial and contemporaneous correlation, while fixing the rate of convergence to be homogeneous across series. The new test demonstrates significantly better finite-sample power properties than the Levin, Lin and Chu (2002) or the Moon, Perron and Phillips (2003) tests, especially for highly persistent series. An application to real exchange rate convergence illustrates the impact of such improvements. Analyzing the post-Bretton Woods period, the new test provides strong and reliable evidence of Purchasing Power Parity among industrialized countries.
    Date: 2005
  2. By: Rebeca Albacete; Antoni Espasa
    Abstract: Economic agents and financial authorities require frequent updates of a path of accurate inflation forecasts and need those forecasts to be accompanied by an explanation of the factors that determine them. This paper studies how to meet this need, developing a method for analysing inflation in the euro area, measured according to the HICP. Time series models using the most recent information on prices and an extensive functional and geographical disaggregation can provide monthly forecasts which are reasonably accurate, but they do not explain the factors by which the forecast is determined. In this respect, it is important to enlarge the data set with explanatory variables and to build congruent econometric models including variables which, following previous work by D. Hendry, capture disequilibria in different markets: goods and services, labour, monetary and international. The final result of this work shows that very accurate forecasts are obtained by combining the forecasts from a monthly time series vector model, constructed on price subindexes from a disaggregation of the HICP by countries and sectors, with the forecasts derived from a quarterly econometric vector model on aggregate inflation and other economic variables. Both vector models are specified with empirical cointegration restrictions, which in the first case capture the constraints necessarily present between the trends of the price subindexes and in the second approximate the long-run restrictions postulated by economic theory.
    Date: 2005–01
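    The abstract does not specify how the monthly and quarterly forecasts are weighted, so the combination step can only be sketched with a standard rule; the inverse-MSE weighting below is an illustrative assumption, not the paper's method:

```python
import numpy as np

def combine_forecasts(f1, f2, e1, e2):
    """Combine two forecast series with weights inversely proportional to
    their historical mean squared errors.  f1, f2 are the competing
    forecasts; e1, e2 are their past forecast errors."""
    mse1, mse2 = np.mean(np.square(e1)), np.mean(np.square(e2))
    w1 = (1 / mse1) / (1 / mse1 + 1 / mse2)   # weight on the first forecast
    return w1 * np.asarray(f1) + (1 - w1) * np.asarray(f2)
```

    The combined forecast leans toward whichever model has been more accurate historically, which is the usual rationale for pooling a disaggregated monthly model with an aggregate quarterly one.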
  3. By: Terasvirta, Timo (Dept. of Economic Statistics, Stockholm School of Economics)
    Abstract: In this paper developments in the analysis of univariate nonlinear time series are considered. First a number of commonly used nonlinear models are presented. The next section is devoted to methods of testing linearity, which is an important part of nonlinear model building. Techniques of modelling nonlinear series within a predetermined family of models are discussed thereafter. Forecasting with nonlinear models also has its own section. A brief set of final remarks closes the chapter.
    Keywords: Hidden Markov model; linearity test; neural network; nonlinear model building; threshold autoregressive model; smooth transition autoregressive model
    JEL: C22 C52
    Date: 2005–03–29
  4. By: Erlandsson, Ulf (Department of Economics, Lund University)
    Abstract: This paper studies the small-sample properties of the Markov-switching model with time-varying transition probabilities. By means of simulation, it is shown that the likelihood ratio statistic is over-sized for sample sizes relevant in many empirical applications. The number of regime switches occurring in the sample, rather than the total number of observations, is central to the magnitude of the distortion, with other factors, such as persistence in the transition-equation variables and the precision with which states are inferred, also influencing size. In an application to possible predictors of switches to recessions in U.S. data, it is shown that critical values for the likelihood ratio statistic need to be adjusted far upwards to reflect true confidence levels.
    Keywords: regime switching; transition probability; small-sample
    JEL: C13 C32 E32
    Date: 2005–03–21
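    The size simulation the abstract describes can be illustrated with a toy stand-in: simulate data under the null, compute a likelihood ratio statistic, and compare the rejection rate at the asymptotic critical value with the nominal level. The sketch below uses a simple Gaussian regression LR test, not the paper's Markov-switching likelihood, so only the simulation logic carries over:

```python
import numpy as np

rng = np.random.default_rng(1)
T, reps, crit = 50, 2000, 3.841        # 3.841 = chi2(1) 5% critical value

def lr_stat(y, x):
    # LR statistic for H0: slope = 0 in a Gaussian linear regression --
    # a toy stand-in for the Markov-switching transition-variable test
    xc, yc = x - x.mean(), y - y.mean()
    rss0 = yc @ yc                      # restricted residual sum of squares
    b = (xc @ yc) / (xc @ xc)
    e = yc - b * xc
    return len(y) * np.log(rss0 / (e @ e))

rejections = sum(
    lr_stat(rng.normal(size=T), rng.normal(size=T)) > crit for _ in range(reps)
)
empirical_size = rejections / reps      # compare with the nominal 5% level
```

    In the paper's setting the distortion is far larger than in this toy case, since it is driven by the small number of regime switches rather than by T alone.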
  5. By: Oliver Linton (Institute for Fiscal Studies and London School of Economics)
    Abstract: Estimation of heteroskedasticity and autocorrelation consistent covariance matrices (HACs) is a well established problem in time series. Results have been established under a variety of weak conditions on temporal dependence and heterogeneity that allow one to conduct inference on a variety of statistics, see Newey and West (1987), Hansen (1992), de Jong and Davidson (2000), and Robinson (2004). Indeed there is an extensive literature on automating these procedures starting with Andrews (1991). Alternative methods for conducting inference include the bootstrap, for which there is also now a very active research program in time series especially, see Lahiri (2003) for an overview. One convenient method for time series is the subsampling approach of Politis, Romano, and Wolf (1999). This method was used by Linton, Maasoumi, and Whang (2003) (henceforth LMW) in the context of testing for stochastic dominance. This paper is concerned with the practical problem of conducting inference in a vector time series setting when the data are unbalanced or incomplete. In this case, one can work only with the common sample, to which a standard HAC/bootstrap theory applies, but at the expense of throwing away data and perhaps losing efficiency. An alternative is to use some sort of imputation method, but this requires additional modelling assumptions, which we would rather avoid. We show how the sampling theory changes and how to modify the resampling algorithms to accommodate the problem of missing data. We also discuss efficiency and power. Unbalanced data of the type we consider are quite common in financial panel data, see for example Connor and Korajczyk (1993). These data also occur in cross-country studies.
    Date: 2004–04
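    The data-alignment trade-off the abstract raises can be seen in a minimal example. Restricting a vector of series to their common balanced window discards observations; a pairwise alternative uses every jointly observed pair for each covariance. This sketch illustrates only the alignment issue, not the paper's modified HAC or resampling theory:

```python
import numpy as np

def pairwise_cov(x, y):
    # use every jointly observed (non-NaN) pair for this covariance rather
    # than truncating all series in the vector to one common balanced
    # window, which would throw away data (as the paper notes)
    m = ~(np.isnan(x) | np.isnan(y))
    return np.cov(x[m], y[m], ddof=1)[0, 1]
```

    For two series this coincides with the common sample; the gain appears once several series with different missing patterns are stacked, since each pair keeps its own overlap.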
  6. By: Richard Smith (Institute for Fiscal Studies and University of Warwick)
    Abstract: This paper proposes a new class of HAC covariance matrix estimators. The standard HAC estimation method re-weights estimators of the autocovariances. Here we initially smooth the data observations themselves using kernel function based weights. The resultant HAC covariance matrix estimator is the normalised outer product of the smoothed random vectors and is therefore automatically positive semi-definite. A corresponding efficient GMM criterion may also be defined as a quadratic form in the smoothed moment indicators whose normalised minimand provides a test statistic for the over-identifying moment conditions.
    Keywords: GMM, HAC Covariance Matrix Estimation, Overidentifying Moments
    JEL: C13 C30
    Date: 2004–12
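    The construction described above — smooth the observations themselves, then take a normalised outer product — can be sketched in a few lines. The flat kernel and the normalisation below are simplifying assumptions (the paper's kernel class and scale constants may differ); the point is that a Gram matrix of smoothed vectors is positive semi-definite by construction:

```python
import numpy as np

def smoothed_hac(g, bandwidth=4):
    # g: (T, k) array of mean-zero moment indicators.
    # Smooth the observations with flat kernel weights, then take the
    # normalised outer product of the smoothed vectors.  Being a scaled
    # Gram matrix, the result is automatically positive semi-definite.
    T, k = g.shape
    w = np.ones(2 * bandwidth + 1)      # flat (truncated) kernel -- a sketch
    gs = np.column_stack(
        [np.convolve(g[:, j], w, mode="same") for j in range(k)]
    )
    return gs.T @ gs / (T * w.sum())
```

    Contrast this with the standard route of re-weighting estimated autocovariances, where a poor kernel choice can break positive semi-definiteness.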
  7. By: Richard Smith (Institute for Fiscal Studies and University of Warwick)
    Abstract: GEL methods which generalize and extend previous contributions are defined and analysed for moment condition models specified in terms of weakly dependent data. These procedures offer alternative one-step estimators and tests that are asymptotically equivalent to their efficient two-step GMM counterparts. The basis for GEL estimation is via a smoothed version of the moment indicators using kernel function weights which incorporate a bandwidth parameter. Examples for the choice of bandwidth parameter and kernel function are provided. Efficient moment estimators based on implied probabilities derived from the GEL method are proposed, a special case of which is estimation of the stationary distribution of the data. The paper also presents a unified set of test statistics for over-identifying moment restrictions and combinations of parametric and moment restriction hypotheses.
    Keywords: GMM, Generalized Empirical Likelihood, Efficient Moment Estimation
    JEL: C13 C30
    Date: 2004–12
  8. By: Ye Cai (Graduate Student, Department of Economics, Vanderbilt University); Mototsugu Shintani (Department of Economics, Vanderbilt University)
    Abstract: This paper investigates the effects of consistent and inconsistent long-run variance estimation on a unit root test based on the generalization of the von Neumann ratio. The results from the Monte Carlo experiments suggest that the tests based on an inconsistent estimator have less size distortion and more stability of size across different autocorrelation specifications as compared to the tests based on a consistent estimator. This improvement in size, however, comes at the cost of a loss in power. The finite sample power, as well as the local asymptotic power, of the tests with an inconsistent estimator is shown to be much lower than that of conventional tests. This finding resembles the case of the autocorrelation robust test in the standard regression context. The paper also points out that combining consistent and inconsistent estimators in the long-run variance ratio test for a unit root is one way to balance size and power.
    Keywords: Bandwidth, local asymptotic power, von Neumann ratio
    JEL: C12 C22
    Date: 2005–03
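    The building blocks of such a test can be sketched as follows: the classic von Neumann ratio compares the variability of first differences to the variability of levels, and a long-run variance estimate replaces a sample variance in the generalized version. The exact statistic and the fixed-bandwidth ("inconsistent") variant studied in the paper are not reproduced here; this is only the raw material:

```python
import numpy as np

def von_neumann_ratio(y):
    # mean squared first difference over the sample variance; for white
    # noise this is close to 2, while values near 0 point toward a unit root
    dy = np.diff(y)
    dev = y - y.mean()
    return (dy @ dy / len(dy)) / (dev @ dev / len(y))

def bartlett_lrv(x, bandwidth):
    # long-run variance estimate with Bartlett kernel weights; letting the
    # bandwidth grow slowly with the sample gives a consistent estimator,
    # while a bandwidth proportional to the sample size gives the
    # "inconsistent" fixed-b variant the paper compares against
    x = x - x.mean()
    T = len(x)
    lrv = x @ x / T
    for j in range(1, bandwidth + 1):
        lrv += 2 * (1 - j / (bandwidth + 1)) * (x[j:] @ x[:-j]) / T
    return lrv
```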
  9. By: Daal, Elton; Naka, Atsuyuki; Yu, Jung-Suk
    Abstract: This paper proposes a mixed GARCH-Jump model that is tailored to the specific circumstances arising in emerging equity markets. Our model accommodates lagged currency returns as a local information variable in the autoregressive jump intensity function, incorporates jumps in the returns and volatility, and allows volatility to respond asymmetrically to both normal innovations and jump shocks. The model captures the distinguishing features of the Asian index returns and significantly improves the fit for those markets that were affected by the 1997 Asian crisis. Our proposed model yields higher levels of conditional kurtosis and superior forecasts of the expected arrival rate of jumps.
    Keywords: Jumps, Volatility, Leverage effects, Emerging markets, Asia, Equity markets
    Date: 2004–09–30
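    The flavour of a GARCH-Jump process with a local information variable in the jump intensity can be conveyed by a simulation. The sketch below is deliberately simplified: it omits the asymmetric (leverage) response of volatility and the autoregressive jump-intensity dynamics of the paper's specification, and all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
T = 500
omega, a, b = 0.05, 0.05, 0.90         # GARCH(1,1) parameters (illustrative)
lam0, lam1 = 0.1, 0.5                  # jump-intensity intercept and loading
theta, delta = -0.5, 1.0               # jump-size mean and std

fx = rng.normal(size=T)                # stand-in for lagged currency returns
h = np.empty(T)                        # conditional variance
r = np.empty(T)                        # returns
h[0] = omega / (1 - a - b)             # start at the unconditional variance
r[0] = np.sqrt(h[0]) * rng.normal()
for t in range(1, T):
    h[t] = omega + a * r[t - 1] ** 2 + b * h[t - 1]
    lam = lam0 + lam1 * abs(fx[t - 1])           # intensity driven by the
    k = rng.poisson(lam)                         # local information variable
    jump = rng.normal(theta, delta, size=k).sum()
    r[t] = np.sqrt(h[t]) * rng.normal() + jump
```

    Returns generated this way mix smooth GARCH volatility clustering with occasional jump shocks, which is what produces the heavy tails the paper captures in the Asian index data.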
  10. By: Jinyong Hahn; Hyungsik Roger Moon
    Abstract: This paper investigates a simple dynamic linear panel regression model with both fixed effects and time effects. Using "large n and large T" asymptotics, we approximate the distribution of the fixed effect estimator of the autoregressive parameter in the dynamic linear panel model and derive its asymptotic bias. We find that the same higher order bias correction approach proposed by Hahn and Kuersteiner (2002) can be applied to the dynamic linear panel model even when time-specific effects are present.
    Date: 2004–12
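    The bias in question is the familiar incidental-parameter (Nickell) bias of the within estimator, of order roughly -(1+rho)/T, and the Hahn-Kuersteiner correction rescales the estimate to remove its leading term. The simulation below illustrates this for a model with unit fixed effects only (the time effects of the paper are omitted, and the DGP values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
n, T, rho, burn = 200, 10, 0.5, 50     # small T, so the bias is visible

alpha = rng.normal(size=n)             # unit fixed effects
y = np.zeros((n, burn + T + 1))
for t in range(1, burn + T + 1):
    y[:, t] = alpha + rho * y[:, t - 1] + rng.normal(size=n)
y = y[:, burn:]                        # drop burn-in; shape (n, T+1)

y_lag, y_cur = y[:, :-1], y[:, 1:]
yl = y_lag - y_lag.mean(axis=1, keepdims=True)   # within transformation
yc = y_cur - y_cur.mean(axis=1, keepdims=True)
rho_fe = (yl * yc).sum() / (yl * yl).sum()       # biased down by ~ (1+rho)/T

# Hahn-Kuersteiner (2002) higher-order bias correction
rho_bc = (T + 1) / T * rho_fe + 1 / T
```

    With T = 10 the within estimate sits well below the true 0.5, and the corrected estimate recovers most of the gap, which is the effect the paper extends to models with time effects.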

This nep-ets issue is ©2005 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.