nep-ets New Economics Papers
on Econometric Time Series
Issue of 2010‒06‒11
fourteen papers chosen by
Yong Yin
SUNY at Buffalo

  1. Likelihood inference for a fractionally cointegrated vector autoregressive model By Søren Johansen; Morten Ørregaard Nielsen
  2. Comparing Penalized Splines and Fractional Polynomials for Flexible Modelling of the Effects of Continuous Predictor Variables By Alexander Strasak; Nikolaus Umlauf; Ruth Pfeiffer; Stefan Lang
  3. High order discretization schemes for stochastic volatility models By Benjamin Jourdain; Mohamed Sbai
  4. Testing for the Existence of a Generalized Wiener Process - the Case of Stock Prices By Melvin J. Hinich; Phillip Wild; John Foster
  5. Combining Non-Replicable Forecasts By Chia-Lin Chang; Philip Hans Franses; Michael McAleer
  6. Closed-Form Likelihood Expansions for Multivariate Time-Inhomogeneous Diffusions By Seungmoon Choi
  7. Semiparametric Trending Panel Data Models with Cross-Sectional Dependence By Jia Chen; Jiti Gao; Degui Li
  8. Nonparametric Time-Varying Coefficient Panel Data Models with Fixed Effects By Degui Li; Jia Chen; Jiti Gao
  9. Bias Correction and Out-of-Sample Forecast Accuracy By Hyeongwoo Kim; Nazif Durmaz
  10. Moment-based estimation of smooth transition regression models with endogenous variables By Waldyr Dutra Areosa; Michael McAleer; Marcelo Cunha Medeiros
  11. Forecasting Realized Volatility with Linear and Nonlinear Models By Francesco Audrino; Marcelo Cunha Medeiros
  12. "Efficient Bayesian Estimation of a Multivariate Stochastic Volatility Model with Cross Leverage and Heavy-Tailed Errors" By Tsunehiro Ishihara; Yasuhiro Omori
  13. Forecasting Government Bond Yields with Large Bayesian VARs By A. Carriero; G. Kapetanios; M. Marcellino
  14. Empirical Simultaneous Confidence Regions for Path-Forecasts By Òscar Jordà; Malte Knüppel; Massimiliano Marcellino

  1. By: Søren Johansen (University of Copenhagen and CREATES); Morten Ørregaard Nielsen (Queen's University and CREATES)
    Abstract: We consider model based inference in a fractionally cointegrated (or cofractional) vector autoregressive model based on the conditional Gaussian likelihood. The model allows the process X_{t} to be fractional of order d and cofractional of order d-b; that is, there exist vectors β for which β'X_{t} is fractional of order d-b. The parameters d and b satisfy either d ≥ b ≥ 1/2, d = b ≥ 1/2, or d = d_{0} ≥ b ≥ 1/2, where d_{0} is the true value of d. Our main technical contribution is the proof of consistency of the maximum likelihood estimators on the set 1/2 ≤ b ≤ d ≤ d_{1} for any d_{1} ≥ d_{0}. To this end, we consider the conditional likelihood as a stochastic process in the parameters, and prove that it converges in distribution when errors are i.i.d. with suitable moment conditions and initial values are bounded. We then prove that the estimator of β is asymptotically mixed Gaussian and estimators of the remaining parameters are asymptotically Gaussian. We also find the asymptotic distribution of the likelihood ratio test for cointegration rank, which is a functional of fractional Brownian motion of type II.
    Keywords: Cofractional processes, cointegration rank, fractional cointegration, likelihood inference, vector autoregressive model
    JEL: C32
    Date: 2010–05–18
    URL: http://d.repec.org/n?u=RePEc:aah:create:2010-24&r=ets
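    Sketch: the orders d and b refer to the fractional difference operator (1-L)^d. A minimal numpy illustration of the type II (truncated, zero pre-sample values) fractional difference used in this literature; the function and example values are illustrative, not from the paper.
```python
import numpy as np

def frac_diff(x, d):
    """Type II (truncated) fractional difference (1-L)^d of a series x,
    via the binomial expansion with zero pre-sample values."""
    n = len(x)
    # Coefficients of (1-L)^d: pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k
    pi = np.ones(n)
    for k in range(1, n):
        pi[k] = pi[k - 1] * (k - 1 - d) / k
    # y_t = sum_{k=0}^{t} pi_k * x_{t-k}
    return np.array([pi[:t + 1] @ x[t::-1] for t in range(n)])

rng = np.random.default_rng(0)
eps = rng.standard_normal(500)
x = frac_diff(eps, -0.7)                    # x is fractional of order d = 0.7
print(np.allclose(frac_diff(x, 0.7), eps))  # truncated filters invert exactly
```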
  2. By: Alexander Strasak; Nikolaus Umlauf; Ruth Pfeiffer; Stefan Lang
    Abstract: P(enalized)-splines and fractional polynomials (FPs) have emerged as powerful smoothing techniques with increasing popularity in several fields of applied research. Both approaches provide considerable flexibility, but only limited comparative evaluations of the performance and properties of the two methods have been conducted to date. We thus performed extensive simulations to compare FPs of degree 2 (FP2) and degree 4 (FP4) and P-splines that used generalized cross validation (GCV) and restricted maximum likelihood (REML) for smoothing parameter selection. We evaluated the ability of P-splines and FPs to recover the “true” functional form of the association between continuous, binary and survival outcomes and exposure for linear, quadratic and more complex, non-linear functions, using different sample sizes and signal to noise ratios. We found that for more curved functions FP2, the current default implementation in standard software, showed considerable bias and consistently higher mean squared error (MSE) compared to spline-based estimators (REML, GCV) and FP4, which performed equally well in most simulation settings. FPs, however, are prone to artefacts due to the specific choice of the origin, while P-splines based on GCV sometimes reveal wiggly estimates, in particular for small sample sizes. Finally, we highlight the specific features of the approaches in a real dataset.
    Keywords: generalized additive models; GAMs; simulation; smoothing
    JEL: C14
    Date: 2010–05
    URL: http://d.repec.org/n?u=RePEc:inn:wpaper:2010-11&r=ets
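    Sketch: a hedged illustration of a degree-2 fractional polynomial (FP2) fit by grid search over the conventional power set, with the usual convention that power 0 means log(x) and a repeated power adds a log term. The helper names and toy data are mine.
```python
import itertools
import numpy as np

POWERS = (-2, -1, -0.5, 0, 0.5, 1, 2, 3)      # conventional FP power grid

def fp_basis(x, p):
    """x**p, with the FP convention that power 0 means log(x)."""
    return np.log(x) if p == 0 else x ** p

def fit_fp2(x, y):
    """FP2 by least squares: pick the best pair of powers from the grid;
    a repeated power (p, p) uses x**p and x**p * log(x)."""
    best = None
    for p1, p2 in itertools.combinations_with_replacement(POWERS, 2):
        b1 = fp_basis(x, p1)
        b2 = b1 * np.log(x) if p1 == p2 else fp_basis(x, p2)
        X = np.column_stack([np.ones_like(x), b1, b2])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
        if best is None or rss < best[0]:
            best = (rss, (p1, p2), beta)
    return best

rng = np.random.default_rng(1)
x = rng.uniform(0.5, 5.0, 300)                # predictor must be positive
y = np.sqrt(x) - 0.2 * x + rng.normal(0, 0.1, 300)
print("selected powers:", fit_fp2(x, y)[1])
```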
  3. By: Benjamin Jourdain (CERMICS - Centre d'Enseignement et de Recherche en Mathématiques, Informatique et Calcul Scientifique - INRIA - Ecole Nationale des Ponts et Chaussées); Mohamed Sbai (CERMICS - Centre d'Enseignement et de Recherche en Mathématiques, Informatique et Calcul Scientifique - INRIA - Ecole Nationale des Ponts et Chaussées)
    Abstract: In usual stochastic volatility models, the process driving the volatility of the asset price evolves according to an autonomous one-dimensional stochastic differential equation. We assume that the coefficients of this equation are smooth. Using Itô's formula, we eliminate from the asset price dynamics the stochastic integral with respect to the Brownian motion driving this SDE. Taking advantage of this structure, we propose (i) a scheme, based on the Milstein discretization of this SDE, with order one of weak trajectorial convergence for the asset price, and (ii) a scheme, based on the Ninomiya-Victoir discretization of this SDE, with order two of weak convergence for the asset price. We also propose a specific scheme with improved convergence properties when the volatility of the asset price is driven by an Ornstein-Uhlenbeck process. We confirm the theoretical rates of convergence by numerical experiments and show that our schemes are well adapted to the multilevel Monte Carlo method introduced by Giles [2008a, 2008b].
    Keywords: discretization schemes, stochastic volatility models, weak trajectorial convergence, multilevel Monte Carlo
    Date: 2009–08–07
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-00409861_v2&r=ets
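    Sketch: the Milstein discretization of a one-dimensional SDE dY = b(Y) dt + sigma(Y) dW is the building block of the authors' first scheme; below is a generic sketch of that discretization (not the authors' full asset-price scheme), with arbitrary toy parameters.
```python
import numpy as np

def milstein(y0, b, sigma, dsigma, T, n, rng):
    """Milstein scheme for dY = b(Y) dt + sigma(Y) dW on [0, T]:
    Y_{k+1} = Y_k + b dt + sigma dW + 0.5 * sigma * sigma' * (dW^2 - dt)."""
    dt = T / n
    y = np.empty(n + 1)
    y[0] = y0
    for k in range(n):
        dw = rng.normal(0.0, np.sqrt(dt))
        s = sigma(y[k])
        y[k + 1] = (y[k] + b(y[k]) * dt + s * dw
                    + 0.5 * s * dsigma(y[k]) * (dw * dw - dt))
    return y

# Toy mean-reverting volatility factor: dY = kappa*(theta - Y) dt + xi*Y dW
kappa, theta, xi = 2.0, 0.04, 0.5
path = milstein(0.04, lambda y: kappa * (theta - y),
                lambda y: xi * y, lambda y: xi,
                T=1.0, n=252, rng=np.random.default_rng(2))
print(path[-1])
```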
  4. By: Melvin J. Hinich; Phillip Wild; John Foster (School of Economics, The University of Queensland)
    Abstract: In this article, we present two nonparametric trispectrum based tests for testing the hypothesis that an observed time series was generated by what we call a generalized Wiener process (GWP). Assuming the existence of a Wiener process for asset rates of return is critical to the Black-Scholes model and its extension by Merton (BSM). The Hinich trispectrum-based test of linearity and the trispectrum extension of the Hinich-Rothman bispectrum test for time reversibility are used to test the validity of BSM. We apply the tests to a selection of high frequency NYSE and Australian (ASX) stocks.
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:qld:uq2004:408&r=ets
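    Sketch: the null model under test is a generalized Wiener process dX = mu dt + sigma dW for log prices, under which returns are i.i.d. Gaussian; simulating that null takes a few lines (drift and volatility values are arbitrary).
```python
import numpy as np

# Under the BSM null, log prices follow dX = mu*dt + sigma*dW,
# so returns over a fixed sampling interval are i.i.d. Gaussian.
mu, sigma, dt, n = 0.05, 0.2, 1 / 252, 252
rng = np.random.default_rng(3)
log_price = np.cumsum(mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(n))
returns = np.diff(log_price)      # what the tests probe for nonlinearity
print(returns.mean(), returns.std())
```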
  5. By: Chia-Lin Chang; Philip Hans Franses; Michael McAleer (University of Canterbury)
    Abstract: Macro-economic forecasts are often based on the interaction between econometric models and experts. A forecast that is based only on an econometric model is replicable and may be unbiased, whereas a forecast that is not based only on an econometric model, but also incorporates an expert’s touch, is non-replicable and is typically biased. In this paper we propose a methodology to analyze the qualities of combined non-replicable forecasts. One part of the methodology seeks to retrieve a replicable component from the non-replicable forecasts, and compares this component against the actual data. A second part modifies the estimation routine due to the assumption that the difference between a replicable and a non-replicable forecast involves a measurement error. An empirical example to forecast economic fundamentals for Taiwan shows the relevance of the methodological approach.
    Keywords: Combined forecasts; efficient estimation; generated regressors; replicable forecasts; non-replicable forecasts; expert’s intuition
    JEL: C53 C22 E27 E37
    Date: 2010–05–01
    URL: http://d.repec.org/n?u=RePEc:cbt:econwp:10/35&r=ets
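    Sketch: one way to read "retrieve a replicable component" is to project the expert-adjusted forecast on the model forecast and take the fitted values; this toy regression is my simplification of the idea, not the paper's exact procedure.
```python
import numpy as np

rng = np.random.default_rng(4)
model_fc = rng.normal(2.0, 1.0, 100)    # replicable econometric forecast
# Expert-adjusted forecast: a biased transform of the model forecast plus noise
expert_fc = 0.8 * model_fc + 0.5 + rng.normal(0, 0.3, 100)

# Project the non-replicable forecast on the replicable one; the fitted
# values are the replicable component, the residual is the expert's touch.
X = np.column_stack([np.ones_like(model_fc), model_fc])
beta, *_ = np.linalg.lstsq(X, expert_fc, rcond=None)
replicable_part = X @ beta
expert_touch = expert_fc - replicable_part
print(beta)
```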
  6. By: Seungmoon Choi (School of Economics, University of Adelaide)
    Abstract: The aim of this paper is to find approximate log-transition density functions for multivariate time-inhomogeneous diffusions in closed form. There is ample empirical evidence that the underlying data generating processes for many economic variables might change over time. One possible way to explain the time-dependent behavior of state variables is to model the drift or volatility terms as functions of time t as well as state variables. Closed-form likelihood expansions for multivariate time-homogeneous diffusions have been obtained by Ait-Sahalia (2008). This research builds on his work and extends his results to time-inhomogeneous cases. A simulation study reveals that our method yields a very accurate approximate likelihood function that can be a good candidate when the true likelihood function is unavailable.
    Keywords: Likelihood function; Multivariate time-inhomogeneous diffusion; Reducible diffusions; Irreducible diffusions
    Date: 2010–05
    URL: http://d.repec.org/n?u=RePEc:adl:wpaper:2010-11&r=ets
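    Sketch: the expansions refine the crude first-order Euler approximation, under which X_{t+dt} given X_t is Gaussian; a sketch of that Euler baseline for a time-inhomogeneous diffusion (drift, diffusion and data below are illustrative assumptions).
```python
import numpy as np

def euler_loglik(x, t, mu, sigma):
    """Euler approximation to the log-likelihood of discrete observations
    of dX = mu(t, X) dt + sigma(t, X) dW:
    X_{t+dt} | X_t ~ N(X_t + mu*dt, sigma^2 * dt)."""
    dt = np.diff(t)
    m = x[:-1] + mu(t[:-1], x[:-1]) * dt
    v = sigma(t[:-1], x[:-1]) ** 2 * dt
    return np.sum(-0.5 * (np.log(2 * np.pi * v) + (x[1:] - m) ** 2 / v))

# Toy data and a time-varying mean-reverting drift mu(t,x) = (1+0.5t)(0.05-x)
t = np.linspace(0.0, 1.0, 253)
rng = np.random.default_rng(5)
x = 0.05 + 0.01 * np.cumsum(rng.standard_normal(253) * np.sqrt(1 / 252))
print(euler_loglik(x, t,
                   lambda t, x: (1 + 0.5 * t) * (0.05 - x),
                   lambda t, x: 0.1 * np.ones_like(x)))
```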
  7. By: Jia Chen (School of Economics, University of Adelaide); Jiti Gao (School of Economics, University of Adelaide); Degui Li (School of Economics, University of Adelaide)
    Abstract: A semiparametric fixed effects model is introduced to describe the nonlinear trending phenomenon in panel data analysis; it allows for cross-sectional dependence in both the regressors and the residuals. A semiparametric profile likelihood approach based on first-stage local linear fitting is developed to estimate both the parameter vector and the time trend function.
    Keywords: Cross-sectional dependence, nonlinear time trend, panel data, profile likelihood, semiparametric regression
    JEL: C13 C14 C23
    Date: 2010–05
    URL: http://d.repec.org/n?u=RePEc:adl:wpaper:2010-10&r=ets
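    Sketch: the first-stage local linear fitting that the profile approach builds on is a standard kernel smoother; a self-contained illustration (kernel and bandwidth choices are mine, and the profile step for the parametric part is omitted).
```python
import numpy as np

def local_linear(t_grid, t_obs, y, h):
    """Local linear kernel estimate of a trend m(t), Epanechnikov kernel."""
    est = np.empty(len(t_grid))
    for i, t0 in enumerate(t_grid):
        u = (t_obs - t0) / h
        w = np.where(np.abs(u) <= 1, 0.75 * (1 - u ** 2), 0.0)
        X = np.column_stack([np.ones_like(t_obs), t_obs - t0])
        WX = w[:, None] * X
        beta = np.linalg.solve(X.T @ WX, WX.T @ y)
        est[i] = beta[0]          # local intercept = trend level at t0
    return est

rng = np.random.default_rng(6)
t = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * t) + rng.normal(0, 0.3, 200)
print(local_linear(np.array([0.25, 0.5, 0.75]), t, y, h=0.1))
```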
  8. By: Degui Li (School of Economics, University of Adelaide); Jia Chen (School of Economics, University of Adelaide); Jiti Gao (School of Economics, University of Adelaide)
    Abstract: This paper is concerned with developing a nonparametric time-varying coefficient model with fixed effects to characterize the nonstationarity and trending phenomena in nonlinear panel data analysis. We develop two methods to estimate the trend function and the coefficient function without taking the first difference to eliminate the fixed effects. The first one eliminates the fixed effects by taking cross-sectional averages, and then uses a nonparametric local linear approach to estimate the trend function and the coefficient function. The asymptotic theory for this approach reveals that although the estimates of both the trend function and the coefficient function are consistent, the estimate of the coefficient function has a rate of convergence that is slower than that of the trend function. To estimate the coefficient function more efficiently, we propose a pooled local linear dummy variable approach. This is motivated by a least squares dummy variable method proposed in parametric panel data analysis. This method removes the fixed effects by subtracting a smoothed version of the cross-time average from each individual. The asymptotic distributions of both estimates are established when T tends to infinity and N is fixed, or when both T and N tend to infinity. Simulation results are provided to illustrate the finite sample behavior of the proposed estimation methods.
    Keywords: Fixed effects, local linear estimation, nonstationarity, panel data, specification testing, time-varying coefficient function
    JEL: C13 C14 C23
    Date: 2010–05
    URL: http://d.repec.org/n?u=RePEc:adl:wpaper:2010-08&r=ets
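    Sketch: a toy illustration of the first method's key step: under the usual normalization that the fixed effects sum to zero, the cross-sectional average removes them exactly, leaving the trend plus averaged noise (all data-generating values are arbitrary).
```python
import numpy as np

rng = np.random.default_rng(7)
N, T = 20, 200
t = np.linspace(0, 1, T)
trend = np.sin(2 * np.pi * t)            # common time trend f(t)
alpha = rng.normal(0, 1, N)
alpha -= alpha.mean()                    # normalization: sum_i alpha_i = 0
y = alpha[:, None] + trend[None, :] + rng.normal(0, 0.5, (N, T))

# The cross-sectional average wipes out the fixed effects exactly,
# leaving the trend plus averaged noise, ready for local linear smoothing.
ybar = y.mean(axis=0)
print(np.corrcoef(ybar, trend)[0, 1])
```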
  9. By: Hyeongwoo Kim; Nazif Durmaz
    Abstract: We evaluate the usefulness of bias-correction methods for autoregressive (AR) models in terms of out-of-sample forecast accuracy, employing two popular methods proposed by Hansen (1999) and So and Shin (1999). Our Monte Carlo simulations show that these methods do not necessarily achieve better forecasting performance than the bias-uncorrected Least Squares (LS) method, because bias correction tends to increase the variance of the estimator. There is a gain from correcting for bias only when the true data generating process is sufficiently persistent. Though the bias arises in finite samples, the sample size (N) is not a crucial factor in the gains from bias correction, because both the bias and the variance tend to decrease as N goes up. We also provide a real data application with seven commodity price indices, which confirms our findings.
    Keywords: Small-Sample Bias, Grid Bootstrap, Recursive Mean Adjustment, Out-of-Sample Forecast
    JEL: C52 C53
    Date: 2010–05
    URL: http://d.repec.org/n?u=RePEc:abn:wpaper:auwp2010-02&r=ets
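    Sketch: a hedged illustration of the recursive-mean-adjustment idea of So and Shin (1999) for an AR(1), next to plain LS; Hansen's (1999) grid bootstrap is omitted here.
```python
import numpy as np

def ar1_ls(y):
    """Plain LS estimate of rho in y_t = c + rho*y_{t-1} + e_t."""
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    return np.linalg.lstsq(X, y[1:], rcond=None)[0][1]

def ar1_rma(y):
    """Recursive mean adjustment: demean y_{t-1} (and y_t) by the mean of
    observations up to t-1 only, which removes the correlation between the
    regressor and the demeaning term that drives the downward LS bias."""
    rec_mean = np.cumsum(y[:-1]) / np.arange(1, len(y))
    x = y[:-1] - rec_mean
    return np.sum(x * (y[1:] - rec_mean)) / np.sum(x ** 2)

rng = np.random.default_rng(8)
rho, n = 0.95, 100
y = np.zeros(n)
for t in range(1, n):
    y[t] = rho * y[t - 1] + rng.standard_normal()
print(ar1_ls(y), ar1_rma(y))   # RMA is typically closer to 0.95
```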
  10. By: Waldyr Dutra Areosa (Department of Economics, PUC-Rio, and Banco Central do Brasil); Michael McAleer (Erasmus School of Economics, Tinbergen Institute, and Center for International Research on the Japanese Economy (CIRJE)); Marcelo Cunha Medeiros (Department of Economics, PUC-Rio)
    Abstract: Nonlinear regression models have been widely used in practice for a variety of time series and cross-section datasets. For purposes of analyzing univariate and multivariate time series data, in particular, Smooth Transition Regression (STR) models have been shown to be very useful for representing and capturing asymmetric behavior. Most STR models have been applied to univariate processes, and have made a variety of assumptions, including stationary or cointegrated processes, uncorrelated, homoskedastic or conditionally heteroskedastic errors, and weakly exogenous regressors. Under the assumption of exogeneity, the standard method of estimation is nonlinear least squares. The primary purpose of this paper is to relax the assumption of weakly exogenous regressors and to discuss moment based methods for estimating STR models. The paper analyzes the properties of the STR model with endogenous variables by providing a diagnostic test of linearity of the underlying process under endogeneity, developing an estimation procedure and a misspecification test for the STR model, presenting the results of Monte Carlo simulations to show the usefulness of the model and estimation method, and providing an empirical application for inflation rate targeting in Brazil. We show that STR models with endogenous variables can be specified and estimated by a straightforward application of existing results in the literature.
    Keywords: Smooth transition, nonlinear models, nonlinear instrumental variables, generalized method of moments, endogeneity, inflation targeting.
    Date: 2010–03
    URL: http://d.repec.org/n?u=RePEc:rio:texdis:571&r=ets
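    Sketch: the STR building block is a logistic transition between two regimes; a minimal illustration of the fitted value (parameter values are arbitrary), with the endogeneity point noted in the closing comment.
```python
import numpy as np

def lstr_fitted(x, s, beta0, beta1, gamma, c):
    """Logistic STR: y = x'beta0 + (x'beta1) * G(s) + u, with transition
    G(s) = 1 / (1 + exp(-gamma * (s - c))) moving between two regimes."""
    G = 1.0 / (1.0 + np.exp(-gamma * (s - c)))
    return x @ beta0 + (x @ beta1) * G

rng = np.random.default_rng(9)
n = 500
x = np.column_stack([np.ones(n), rng.standard_normal(n)])
s = rng.standard_normal(n)               # transition variable
y = lstr_fitted(x, s, np.array([0.5, 1.0]), np.array([1.0, -2.0]),
                gamma=5.0, c=0.0) + rng.normal(0, 0.2, n)
# Under exogeneity, NLS minimizes sum(u_t^2); with endogenous x the paper
# instead uses moment conditions E[z_t * u_t] = 0 for instruments z_t.
```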
  11. By: Francesco Audrino (University of St. Gallen); Marcelo Cunha Medeiros (Department of Economics PUC-Rio)
    Abstract: In this paper we propose a smooth transition tree model for both the conditional mean and variance of the short-term interest rate process. The estimation of such models is addressed and the asymptotic properties of the quasi-maximum likelihood estimator are derived. Model specification is also discussed. When the model is applied to the US short-term interest rate we find (1) leading indicators for inflation and real activity are the most relevant predictors in characterizing the multiple regimes’ structure; (2) the optimal model has three limiting regimes. Moreover, we provide empirical evidence of the power of the model in forecasting the first two conditional moments when it is used in connection with bootstrap aggregation (bagging).
    Keywords: short-term interest rate, regression tree, smooth transition, conditional variance, bagging, asymptotic theory
    Date: 2010–03
    URL: http://d.repec.org/n?u=RePEc:rio:texdis:570&r=ets
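    Sketch: an illustration of the bagging step using scikit-learn's generic tools; this is an assumption on my part, since the paper's smooth transition tree is not sklearn's CART. It only shows how bagging averages trees fit on bootstrap resamples.
```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(10)
log_rv = np.zeros(600)              # persistent toy log realized volatility
for t in range(1, 600):
    log_rv[t] = 0.9 * log_rv[t - 1] + rng.normal(0, 0.3)

lags = 5                            # predict from the last 5 lags
X = np.column_stack([log_rv[k:len(log_rv) - lags + k] for k in range(lags)])
y = log_rv[lags:]

# Bagging: average many trees fit on bootstrap resamples of the sample
model = BaggingRegressor(DecisionTreeRegressor(max_depth=4),
                         n_estimators=200, random_state=0).fit(X[:-50], y[:-50])
print(np.mean((model.predict(X[-50:]) - y[-50:]) ** 2))  # out-of-sample MSE
```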
  12. By: Tsunehiro Ishihara (Graduate School of Economics, University of Tokyo); Yasuhiro Omori (Faculty of Economics, University of Tokyo)
    Abstract: An efficient Bayesian estimation using a Markov chain Monte Carlo method is proposed for a multivariate stochastic volatility model, as a natural extension of the univariate stochastic volatility model with leverage and heavy-tailed errors. We further incorporate cross-leverage effects among stock returns. Our method is based on a multi-move sampler that samples a block of latent volatility vectors. Its high sampling efficiency is shown using numerical examples, in comparison with a single-move sampler that samples one latent volatility vector at a time, given the other latent vectors and parameters. To illustrate the method, empirical analyses are provided based on five-dimensional S&P500 sector index returns.
    Date: 2010–05
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2010cf746&r=ets
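    Sketch: for intuition, here is the kind of inefficient single-move baseline the multi-move sampler is compared against, for a much simpler univariate SV model with Gaussian errors and no leverage; this is not the authors' multivariate method.
```python
import numpy as np

def single_move_sv(y, mu, phi, sig2, n_iter, rng):
    """Single-move sampler for the log-volatilities h_t of a basic SV model:
    y_t = exp(h_t/2)*eps_t,  h_t = mu + phi*(h_{t-1} - mu) + eta_t.
    Each h_t is proposed from its conditional prior N(m, v) given its
    neighbors, so the Metropolis acceptance ratio is the likelihood ratio."""
    T = len(y)
    h = np.full(T, mu)
    for _ in range(n_iter):
        for t in range(T):
            if t == 0:
                m, v = mu + phi * (h[1] - mu), sig2
            elif t == T - 1:
                m, v = mu + phi * (h[t - 1] - mu), sig2
            else:
                m = mu + phi * ((h[t - 1] - mu) + (h[t + 1] - mu)) / (1 + phi ** 2)
                v = sig2 / (1 + phi ** 2)
            prop = rng.normal(m, np.sqrt(v))
            # log p(y_t | h) = -0.5 * (h + y_t^2 * exp(-h)) + const
            log_acc = 0.5 * ((h[t] - prop)
                             + y[t] ** 2 * (np.exp(-h[t]) - np.exp(-prop)))
            if np.log(rng.uniform()) < log_acc:
                h[t] = prop
    return h

rng = np.random.default_rng(11)
y = np.exp(rng.normal(-1.0, 0.5, 200) / 2) * rng.standard_normal(200)
print(single_move_sv(y, mu=-1.0, phi=0.95, sig2=0.05, n_iter=200, rng=rng)[:5])
```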
  13. By: A. Carriero; G. Kapetanios; M. Marcellino
    Abstract: We propose a new approach to forecasting the term structure of interest rates, which allows us to efficiently extract the information contained in a large panel of yields. In particular, we use a large Bayesian Vector Autoregression (BVAR) with an optimal amount of shrinkage towards univariate AR models. Focusing on the U.S., we provide an extensive study of the forecasting performance of our proposed model relative to most of the existing alternative specifications. While most of the existing evidence focuses on statistical measures of forecast accuracy, we also evaluate the performance of the alternative forecasts when used within trading schemes or as a basis for portfolio allocation. We extensively check the robustness of our results via subsample analysis and via a data-based Monte Carlo simulation. We find that: i) our proposed BVAR approach produces forecasts systematically more accurate than the random walk forecasts, though the gains are small; ii) some models beat the BVAR for a few selected maturities and forecast horizons, but they perform much worse than the BVAR in the remaining cases; iii) predictive gains with respect to the random walk have decreased over time; iv) different loss functions (i.e., "statistical" vs "economic") lead to different rankings of specific models; v) modelling time variation in term premia is important and useful for forecasting.
    Keywords: Bayesian methods, Forecasting, Term Structure.
    JEL: C11 C53 E43 E47
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2010/17&r=ets
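    Sketch: the flavor of the shrinkage in its simplest conjugate form: ridge-style shrinkage of VAR(1) coefficients toward univariate dynamics, with a unit own-lag prior mean as is common for persistent yields. The setup and all values are illustrative simplifications of the paper's BVAR.
```python
import numpy as np

def bvar_shrink(Y, lam):
    """Ridge-style VAR(1) estimate shrunk toward the prior mean B0 = I
    (each yield follows its own univariate unit-root AR, no cross effects):
    B = argmin ||Z - X B||^2 + lam * ||B - B0||^2."""
    X, Z = Y[:-1], Y[1:]
    k = Y.shape[1]
    B0 = np.eye(k)
    B = np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ Z + lam * B0)
    return B      # column j: lag coefficients in the equation for variable j

rng = np.random.default_rng(12)
Y = np.cumsum(rng.normal(0, 0.1, (300, 10)), axis=0)   # 10 toy yield series
print(np.round(np.diag(bvar_shrink(Y, lam=50.0)), 2))  # own lags near 1
```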
  14. By: Òscar Jordà; Malte Knüppel; Massimiliano Marcellino
    Abstract: Measuring and displaying uncertainty around path-forecasts, i.e. forecasts made in period T about the expected trajectory of a random variable in periods T+1 to T+H, is a key ingredient for decision making under uncertainty. The probabilistic assessment about the set of possible trajectories that the variable may follow over time is summarized by the simultaneous confidence region generated from its forecast generating distribution. However, if the null model is only approximate or altogether unavailable, one cannot derive analytic expressions for this confidence region, and its non-parametric estimation is impractical given commonly available predictive sample sizes. Instead, this paper derives the approximate rectangular confidence regions that control false discovery rate error, which are a function of the predictive sample covariance matrix and the empirical distribution of the Mahalanobis distance of the path-forecast errors. These rectangular regions are simple to construct and appear to work well in a variety of cases explored empirically and by simulation. The proposed techniques are applied to provide confidence bands around the Fed and Bank of England real-time path-forecasts of growth and inflation.
    Keywords: path forecast, forecast uncertainty, simultaneous confidence region, Scheffé’s S-method, Mahalanobis distance, false discovery rate.
    JEL: C32 C52 C53
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2010/18&r=ets
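    Sketch: a stylized version of the construction described above: scale per-horizon standard deviations by an empirical quantile of the Mahalanobis distances of past path-forecast errors, spread Scheffé-style over the H horizons; this is a simplification, not the paper's exact false-discovery-rate calibration.
```python
import numpy as np

def path_bands(errors, point_path, coverage=0.68):
    """Rectangular simultaneous bands around a path forecast.
    errors: (n, H) historical path-forecast errors. The empirical quantile
    of their squared Mahalanobis distances is spread over the H horizons
    to scale the per-horizon standard deviations."""
    n, H = errors.shape
    cov = np.cov(errors, rowvar=False)
    prec = np.linalg.inv(cov)
    d2 = np.einsum('ij,jk,ik->i', errors, prec, errors)  # squared distances
    c = np.sqrt(np.quantile(d2, coverage) / H)
    half_width = c * np.sqrt(np.diag(cov))
    return point_path - half_width, point_path + half_width

rng = np.random.default_rng(13)
errs = rng.multivariate_normal(np.zeros(4), 0.05 + 0.1 * np.eye(4), size=120)
lo, hi = path_bands(errs, point_path=np.array([2.0, 2.1, 2.2, 2.3]))
print(np.round(lo, 2), np.round(hi, 2))
```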

This nep-ets issue is ©2010 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.