nep-ets New Economics Papers
on Econometric Time Series
Issue of 2010‒11‒27
ten papers chosen by
Yong Yin
SUNY at Buffalo

  1. Semiparametric Quantile Regression Estimation in Dynamic Models with Partially Varying Coefficients By Zongwu Cai; Zhijie Xiao
  2. Forecasting Compositional Time Series with Exponential Smoothing Methods By Anne B. Koehler; Ralph D. Snyder; J. Keith Ord; Adrian Beaumont
  3. Alternative Asymmetric Stochastic Volatility Models By Manabu Asai; Michael McAleer
  4. Spectral Analysis of Non-Stationary Time Series By D M NACHANE
  5. Simulation-based Estimation Methods for Financial Time Series Models By Jun Yu
  6. A New Bayesian Unit Root Test in Stochastic Volatility Models By Yong Li; Jun Yu
  7. Asymptotic Distributions of the Least Squares Estimator for Diffusion Processes By Qiankun Zhou; Jun Yu
  8. Estimating the GARCH Diffusion: Simulated Maximum Likelihood in Continuous Time By Tore Selland Kleppe; Jun Yu; Hans J. Skaug
  9. Bias-Corrected Estimation for Spatial Autocorrelation By Zhenlin Yang
  10. Can We Trust Cluster-Corrected Standard Errors? An Application of Spatial Autocorrelation with Exact Locations Known By John Gibson; Bonggeun Kim; Susan Olivia

  1. By: Zongwu Cai (Department of Mathematics & Statistics, University of North Carolina at Charlotte; Fujian Key Laboratory of Statistical Sciences, Xiamen University); Zhijie Xiao (Boston College)
    Abstract: We study quantile regression estimation for dynamic models with partially varying coefficients, in which some coefficients may be functions of informative covariates. Estimators of both the parametric and nonparametric functional coefficients are proposed; in particular, we propose a three-stage semiparametric procedure. Both consistency and asymptotic normality of the proposed estimators are derived. We demonstrate that the parametric estimators are root-n consistent and that the estimation of the functional coefficients has the oracle property. In addition, the efficiency of parameter estimation is discussed and a simple efficient estimator is proposed. A simple, easily implemented test for the hypothesis of varying coefficients is also proposed. A Monte Carlo experiment is conducted to evaluate the performance of the proposed estimators.
    Keywords: Efficiency; nonlinear time series; partially linear; partially varying coefficients; quantile regression; semiparametric
    Date: 2010–11–22
  2. By: Anne B. Koehler; Ralph D. Snyder; J. Keith Ord; Adrian Beaumont
    Abstract: Compositional time series are formed from measurements of proportions that sum to one in each period of time. We might be interested in forecasting the proportion of home loans that have adjustable rates, the proportion of nonagricultural jobs in manufacturing, the proportion of a rock's geochemical composition that is a specific oxide, or the proportion of an election betting market choosing a particular candidate. A problem may involve many related time series of proportions. There could be several categories of nonagricultural jobs or several oxides in the geochemical composition of a rock that are of interest. In this paper we provide a statistical framework for forecasting these special kinds of time series. We build on the innovations state space framework underpinning the widely used methods of exponential smoothing. We couple this with a generalized logistic transformation to convert the measurements from the unit interval to the entire real line. The approach is illustrated with two applications: the proportion of new home loans in the U.S. that have adjustable rates; and four probabilities for specified candidates winning the 2008 Democratic presidential nomination.
    Keywords: compositional time series, innovations state space models, exponential smoothing, forecasting proportions
    JEL: C22
    Date: 2010–11
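The paper's generalized logistic transformation works in the spirit of the additive log-ratio transform from compositional data analysis; a minimal round-trip sketch (the base-category choice and the function names are mine, not the authors'):

```python
import numpy as np

def alr(p):
    """Additive log-ratio transform: map K proportions (summing to one)
    to K-1 unconstrained real values, using the last component as base."""
    p = np.asarray(p, dtype=float)
    return np.log(p[:-1] / p[-1])

def alr_inverse(y):
    """Inverse transform: map K-1 real values back to K proportions."""
    e = np.exp(np.append(y, 0.0))
    return e / e.sum()

p = np.array([0.2, 0.3, 0.5])
y = alr(p)                  # forecast y with exponential smoothing on the real line
p_back = alr_inverse(y)     # map forecasts back to valid proportions
```

Forecasting the transformed series and inverting guarantees that every forecast is a valid composition: each component lies in (0, 1) and the components sum to one.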
  3. By: Manabu Asai (Faculty of Economics, Soka University); Michael McAleer (Erasmus University Rotterdam, Tinbergen Institute, The Netherlands, and Institute of Economic Research, Kyoto University)
    Abstract: Stochastic volatility models usually incorporate asymmetric effects by introducing a negative correlation between the innovations in returns and volatility. In this paper, we propose a new asymmetric stochastic volatility model based on the leverage and size effects. The model is a generalization of the exponential GARCH (EGARCH) model of Nelson (1991). We consider categories of asymmetric effects, which describe the differences among the asymmetric effect of the EGARCH model, the threshold-effects indicator function of Glosten, Jagannathan and Runkle (1993), and the negative correlation between the innovations in returns and volatility. The new model is estimated by the efficient importance sampling method of Liesenfeld and Richard (2003), and the finite sample properties of the estimator are investigated using numerical simulations. Four financial time series are used to estimate the alternative asymmetric SV models, with empirical asymmetric effects found to be statistically significant in each case. The empirical results for S&P 500 and Yen/USD returns indicate that the leverage and size effects are significant, supporting the general model. For TOPIX and USD/AUD returns, the size effect is insignificant, favoring the negative correlation between the innovations in returns and volatility. We also consider a standardized t distribution for capturing tail behavior. The results for Yen/USD returns show that the model is correctly specified, while the results for the three other data sets suggest there is scope for improvement.
    Keywords: Stochastic volatility, asymmetric effects, leverage, threshold, indicator function, importance sampling, numerical simulations.
    Date: 2010–10
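A minimal simulation sketch of the kind of asymmetric SV model discussed here, with leverage entering as a negative correlation between the return and volatility innovations (parameter values and names are illustrative assumptions, not the authors' specification):

```python
import numpy as np

def simulate_sv_leverage(n, mu=-1.0, phi=0.95, sigma_eta=0.2, rho=-0.5, seed=0):
    """Simulate a basic asymmetric SV model:
         r_t      = exp(h_t / 2) * eps_t,
         h_{t+1}  = mu + phi * (h_t - mu) + eta_t,
       with corr(eps_t, eta_t) = rho < 0 (the leverage effect)."""
    rng = np.random.default_rng(seed)
    cov = [[1.0, rho * sigma_eta], [rho * sigma_eta, sigma_eta**2]]
    shocks = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    h = np.empty(n)
    h[0] = mu
    r = np.empty(n)
    for t in range(n):
        r[t] = np.exp(h[t] / 2) * shocks[t, 0]
        if t + 1 < n:
            h[t + 1] = mu + phi * (h[t] - mu) + shocks[t, 1]
    return r, h

r, h = simulate_sv_leverage(500)
```

With rho < 0, a negative return shock today tends to be paired with a positive volatility shock, so volatility rises after bad news, which is the asymmetry all of the competing specifications try to capture.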
  4. By: D M NACHANE
    Abstract: The aim of this paper is to take stock of important recent contributions to spectral analysis, especially as they apply to non-stationary processes. Non-stationary processes are particularly relevant in the empirical sciences, where most phenomena exhibit pronounced departures from stationarity.
    Keywords: spectral analysis, non-stationary, empirical sciences, time series
    Date: 2010
  5. By: Jun Yu (School of Economics, Singapore Management University)
    Abstract: This chapter overviews some recent advances in simulation-based methods for estimating financial time series models that are widely used in financial economics. Simulation-based methods have proven particularly useful when the likelihood function and moments do not have tractable forms, and hence the maximum likelihood (ML) method and the generalized method of moments (GMM) are difficult to use. They are also capable of improving the finite sample performance of the traditional methods. Both frequentist and Bayesian simulation-based methods are reviewed. Frequentist simulation-based methods cover various forms of simulated maximum likelihood (SML), the simulated generalized method of moments (SGMM), the efficient method of moments (EMM), and the indirect inference (II) method. Bayesian simulation-based methods cover various MCMC algorithms. Each simulation-based method is discussed in the context of a specific financial time series model as a motivating example. Empirical applications, based on real exchange rates, interest rates and equity data, illustrate how the simulation-based methods are implemented. In particular, SML is applied to a discrete time stochastic volatility model, EMM to a continuous time stochastic volatility model, MCMC to a credit risk model, and the II method to a term structure model.
    Keywords: Generalized method of moments, Maximum likelihood, MCMC, Indirect inference, Credit risk, Stock price, Exchange rate, Interest rate.
    Date: 2010–10
  6. By: Yong Li (Business School, Sun Yat-Sen University); Jun Yu (School of Economics, Singapore Management University)
    Abstract: A new posterior odds analysis is proposed to test for a unit root in volatility dynamics in the context of stochastic volatility models. This analysis extends the Bayesian unit root test of So and Li (1999, Journal of Business & Economic Statistics) in two important ways. First, a numerically more stable algorithm is introduced to compute the Bayes factor, taking into account the special structure of the competing models. Owing to its numerical stability, the algorithm overcomes the problem of divergent “size” in the marginal likelihood approach. Second, to improve the “power” of the unit root test, a mixed prior specification with random weights is employed. It is shown that the posterior odds ratio is a by-product of Bayesian estimation and can be easily computed by MCMC methods. A simulation study examines the “size” and “power” performance of the new method. An empirical study, based on time series data covering the subprime crisis, reveals some interesting results.
    Keywords: Bayes factor; Mixed Prior; Markov Chain Monte Carlo; Posterior odds ratio; Stochastic volatility models; Unit root testing.
    Date: 2010–10
  7. By: Qiankun Zhou (School of Economics, Singapore Management University); Jun Yu (School of Economics, Singapore Management University)
    Abstract: The asymptotic distributions of the least squares estimator of the mean reversion parameter (κ) are developed in a general class of diffusion models under three sampling schemes, namely, long-span, in-fill, and the combination of long-span and in-fill. The models have an affine structure in the drift function, but allow for nonlinearity in the diffusion function. The limiting distributions are quite different under the alternative sampling schemes. In particular, the in-fill limiting distribution is non-standard and depends on the initial condition and the time span, whereas the other two are Gaussian. Moreover, while the other two distributions are discontinuous at κ = 0, the in-fill distribution is continuous in κ. This property provides an answer to the Bayesian criticism of the unit root asymptotics. Monte Carlo simulations suggest that the in-fill asymptotic distribution provides a more accurate approximation to the finite sample distribution than the other two distributions in empirically realistic settings. The empirical application using the U.S. federal funds rate highlights the difference in statistical inference based on the alternative asymptotic distributions and suggests strong evidence of a unit root in the data.
    Keywords: Vasicek Model, One-factor Model, Mean Reversion, In-fill Asymptotics, Long-span Asymptotics, Unit Root Test
    JEL: C12 C22 G12
    Date: 2010–01
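A minimal sketch of the least squares estimator of the mean reversion parameter in the simplest member of this class, the Vasicek model (the simulation design and names are mine): the AR(1) slope of the exact discretization is exp(-κΔ), so κ can be recovered from a linear regression of x_{t+1} on x_t.

```python
import numpy as np

def simulate_vasicek(kappa, mu, sigma, x0, delta, n, seed=0):
    """Exact discretization of the Vasicek model dX = kappa*(mu - X) dt + sigma dW."""
    rng = np.random.default_rng(seed)
    a = np.exp(-kappa * delta)
    sd = sigma * np.sqrt((1.0 - a**2) / (2.0 * kappa))
    x = np.empty(n + 1)
    x[0] = x0
    for t in range(n):
        x[t + 1] = mu + a * (x[t] - mu) + sd * rng.standard_normal()
    return x

def ls_kappa(x, delta):
    """LS estimate of kappa: fit x_{t+1} = c + b*x_t and set kappa = -log(b)/delta."""
    b = np.polyfit(x[:-1], x[1:], 1)[0]
    return -np.log(b) / delta

# Daily observations over a long span; kappa_hat should be near the true kappa = 2.
x = simulate_vasicek(kappa=2.0, mu=0.05, sigma=0.1, x0=0.05, delta=1/252, n=50000)
kappa_hat = ls_kappa(x, 1/252)
```

Shrinking delta with the time span fixed (the in-fill scheme the paper studies) does not make this estimator consistent for κ, which is why the in-fill limiting distribution differs so sharply from the long-span one.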
  8. By: Tore Selland Kleppe (Department of Mathematics, University of Bergen); Jun Yu (School of Economics, Singapore Management University); Hans J. Skaug (Department of Mathematics, University of Bergen)
    Abstract: A new algorithm is developed to provide simulated maximum likelihood estimation of the GARCH diffusion model of Nelson (1990) based on return data alone. The method combines two accurate approximation procedures, namely, the polynomial expansion of Aït-Sahalia (2008) to approximate the transition probability density of return and volatility, and the Efficient Importance Sampler (EIS) of Richard and Zhang (2007) to integrate out the volatility. The first and second order terms in the polynomial expansion are used to generate a base-line importance density for an EIS algorithm. The higher order terms are included when evaluating the importance weights. Monte Carlo experiments show that the new method works well and that the discretization error is well controlled by the polynomial expansion. In the empirical application, we fit the GARCH diffusion to equity data, perform diagnostics on the model fit, and test the finiteness of the importance weights.
    Keywords: Efficient importance sampling; GARCH diffusion model; Simulated maximum likelihood; Stochastic volatility
    JEL: C11 C15 G12
    Date: 2010–01
  9. By: Zhenlin Yang (School of Economics, Singapore Management University)
    Abstract: The bias issue arising from maximum likelihood estimation of the spatial autoregressive (SAR) model is further investigated under a broader set-up than that in Bao and Ullah (2007a). A major difficulty in analytically evaluating the expectations of ratios of quadratic forms is overcome by a simple bootstrap procedure. With that, the corrections for bias and variance of the spatial estimator can easily be made up to third order, and once this is done, the estimators of the other model parameters become nearly unbiased. Compared with the analytical approach, the new approach is much simpler and can easily be extended to other models of a similar structure. Extensive Monte Carlo results show that the new approach performs very well in general.
    Keywords: Third-order bias; Third-order variance; Bootstrap; Concentrated estimating equation; Monte Carlo; Quasi-MLE; Spatial layout.
    JEL: C10 C21
    Date: 2010–10
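The general first-order bootstrap bias correction that this kind of procedure builds on can be sketched in a simple i.i.d. setting (this is not the paper's concentrated-estimating-equation procedure for the spatial model; names and the example estimator are mine):

```python
import numpy as np

def bootstrap_bias_correct(estimator, data, n_boot=2000, seed=0):
    """First-order bootstrap bias correction:
    theta_bc = 2 * theta_hat - mean of the bootstrap replicates,
    i.e. theta_hat minus the bootstrap estimate of its bias."""
    rng = np.random.default_rng(seed)
    theta_hat = estimator(data)
    reps = np.empty(n_boot)
    for b in range(n_boot):
        sample = rng.choice(data, size=len(data), replace=True)
        reps[b] = estimator(sample)
    return 2 * theta_hat - reps.mean()

# Example: the MLE of the standard deviation is biased downward in small samples.
rng = np.random.default_rng(1)
data = rng.standard_normal(30)
mle = lambda x: np.sqrt(np.mean((x - x.mean()) ** 2))
corrected = bootstrap_bias_correct(mle, data)
```

The point of the paper's bootstrap is the same in spirit: it replaces intractable analytical expectations (there, of ratios of quadratic forms) with resampling averages, pushing the correction to higher order.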
  10. By: John Gibson (University of Waikato); Bonggeun Kim (Seoul National University); Susan Olivia (Monash University)
    Abstract: Standard error corrections for clustered samples impose untested restrictions on spatial correlations. Our example shows these are too conservative, compared with a spatial error model that exploits information on exact locations of observations, causing inference errors when cluster corrections are used.
    Keywords: clustered samples; GPS; spatial correlation
    JEL: C31 C81
    Date: 2011–08–18

This nep-ets issue is ©2010 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.