
on Econometric Time Series 
By:  Søren Johansen (University of Copenhagen and CREATES); Morten Ørregaard Nielsen (Queen's University and CREATES) 
Abstract:  We consider model-based inference in a fractionally cointegrated (or cofractional) vector autoregressive model based on the conditional Gaussian likelihood. The model allows the process X_{t} to be fractional of order d and cofractional of order d−b; that is, there exist vectors β for which β'X_{t} is fractional of order d−b. The parameters d and b satisfy either d≥b≥1/2, d=b≥1/2, or d=d_{0}≥b≥1/2. Our main technical contribution is the proof of consistency of the maximum likelihood estimators on the set 1/2≤b≤d≤d_{1} for any d_{1}≥d_{0}. To this end, we consider the conditional likelihood as a stochastic process in the parameters, and prove that it converges in distribution when errors are i.i.d. with suitable moment conditions and initial values are bounded. We then prove that the estimator of β is asymptotically mixed Gaussian and estimators of the remaining parameters are asymptotically Gaussian. We also find the asymptotic distribution of the likelihood ratio test for cointegration rank, which is a functional of fractional Brownian motion of type II. 
Keywords:  Cofractional processes, cointegration rank, fractional cointegration, likelihood inference, vector autoregressive model 
JEL:  C32 
Date:  2010–05–18 
URL:  http://d.repec.org/n?u=RePEc:aah:create:201024&r=ets 
By:  Alexander Strasak; Nikolaus Umlauf; Ruth Pfeiffer; Stefan Lang 
Abstract:  P(enalized) splines and fractional polynomials (FPs) have emerged as powerful smoothing techniques with increasing popularity in several fields of applied research. Both approaches provide considerable flexibility, but only limited comparative evaluations of the performance and properties of the two methods have been conducted to date. We thus performed extensive simulations to compare FPs of degree 2 (FP2) and degree 4 (FP4) and P-splines that used generalized cross-validation (GCV) and restricted maximum likelihood (REML) for smoothing parameter selection. We evaluated the ability of P-splines and FPs to recover the “true” functional form of the association between continuous, binary and survival outcomes and exposure for linear, quadratic and more complex, nonlinear functions, using different sample sizes and signal-to-noise ratios. We found that for more curved functions FP2, the current default implementation in standard software, showed considerable bias and consistently higher mean squared error (MSE) compared to spline-based estimators (REML, GCV) and FP4, which performed equally well in most simulation settings. FPs, however, are prone to artefacts due to the specific choice of the origin, while P-splines based on GCV sometimes reveal wiggly estimates, in particular for small sample sizes. Finally, we highlight the specific features of the two approaches in a real dataset. 
Keywords:  generalized additive models; GAMs; simulation; smoothing 
JEL:  C14 
Date:  2010–05 
URL:  http://d.repec.org/n?u=RePEc:inn:wpaper:201011&r=ets 
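The P-spline side of the comparison can be sketched in a few lines: a cubic B-spline basis with a second-order difference penalty (Eilers–Marx style), with the smoothing parameter chosen by GCV over a small grid. The data generating function, knot placement, and grid below are hypothetical illustration choices, not the paper's simulation design.

```python
import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(0)
n = 200
x = np.sort(rng.uniform(0, 1, n))
f_true = np.sin(2 * np.pi * x)          # a curved "true" function (hypothetical)
y = f_true + rng.normal(0, 0.3, n)

# Cubic B-spline basis on equally spaced interior knots
k, n_inner = 3, 20
t = np.concatenate(([0.0] * (k + 1), np.linspace(0, 1, n_inner)[1:-1], [1.0] * (k + 1)))
B = BSpline.design_matrix(x, t, k).toarray()

# Second-order difference penalty matrix (the "P" in P-spline)
D = np.diff(np.eye(B.shape[1]), n=2, axis=0)
P = D.T @ D

def gcv(lam):
    # GCV(lambda) = n * RSS / (n - tr(H))^2 with hat matrix H
    H = B @ np.linalg.solve(B.T @ B + lam * P, B.T)
    resid = y - H @ y
    return n * (resid @ resid) / (n - np.trace(H)) ** 2

lams = 10.0 ** np.arange(-4, 4)
lam_hat = lams[np.argmin([gcv(l) for l in lams])]
coef = np.linalg.solve(B.T @ B + lam_hat * P, B.T @ y)
fit = B @ coef
mse = np.mean((fit - f_true) ** 2)
```

A finer grid (or a numerical optimizer over log-lambda) would normally replace the coarse grid used here.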
By:  Benjamin Jourdain (CERMICS - Centre d'Enseignement et de Recherche en Mathématiques, Informatique et Calcul Scientifique - INRIA - Ecole Nationale des Ponts et Chaussées); Mohamed Sbai (CERMICS - Centre d'Enseignement et de Recherche en Mathématiques, Informatique et Calcul Scientifique - INRIA - Ecole Nationale des Ponts et Chaussées) 
Abstract:  In usual stochastic volatility models, the process driving the volatility of the asset price evolves according to an autonomous one-dimensional stochastic differential equation. We assume that the coefficients of this equation are smooth. Using Itô's formula, we get rid, in the asset price dynamics, of the stochastic integral with respect to the Brownian motion driving this SDE. Taking advantage of this structure, we propose (i) a scheme, based on the Milstein discretization of this SDE, with order one of weak trajectorial convergence for the asset price, and (ii) a scheme, based on the Ninomiya-Victoir discretization of this SDE, with order two of weak convergence for the asset price. We also propose a specific scheme with improved convergence properties when the volatility of the asset price is driven by an Ornstein-Uhlenbeck process. We confirm the theoretical rates of convergence by numerical experiments and show that our schemes are well adapted to the multilevel Monte Carlo method introduced by Giles [2008a, 2008b]. 
Keywords:  discretization schemes, stochastic volatility models, weak trajectorial convergence, multilevel Monte Carlo 
Date:  2009–08–07 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal00409861_v2&r=ets 
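The Milstein correction underlying the first scheme is easiest to see on a toy SDE with a known exact solution. The sketch below uses geometric Brownian motion with hypothetical parameters (not the authors' stochastic volatility setting) and compares the strong error of Euler and Milstein discretizations driven by the same Brownian increments.

```python
import numpy as np

rng = np.random.default_rng(42)
mu, sigma, x0, T = 0.05, 0.4, 1.0, 1.0   # hypothetical GBM parameters
n_paths, n_steps = 20000, 50
dt = T / n_steps

x_eul = np.full(n_paths, x0)
x_mil = np.full(n_paths, x0)
w = np.zeros(n_paths)                     # running Brownian path
for _ in range(n_steps):
    dw = rng.normal(0.0, np.sqrt(dt), n_paths)
    w += dw
    x_eul = x_eul + mu * x_eul * dt + sigma * x_eul * dw
    # Milstein adds 0.5 * sigma(x) * sigma'(x) * (dw^2 - dt);
    # for dX = mu X dt + sigma X dW this is 0.5 * sigma^2 * X * (dw^2 - dt)
    x_mil = (x_mil + mu * x_mil * dt + sigma * x_mil * dw
             + 0.5 * sigma**2 * x_mil * (dw**2 - dt))

# Exact GBM solution driven by the same Brownian increments
x_exact = x0 * np.exp((mu - 0.5 * sigma**2) * T + sigma * w)
err_eul = np.mean(np.abs(x_eul - x_exact))   # strong (pathwise) error, Euler
err_mil = np.mean(np.abs(x_mil - x_exact))   # strong error, Milstein
```

Halving dt should roughly halve the Milstein error (strong order 1) but shrink the Euler error only by a factor of about sqrt(2) (strong order 1/2).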
By:  Melvin J. Hinich; Phillip Wild; John Foster (School of Economics, The University of Queensland) 
Abstract:  In this article, we present two nonparametric trispectrum-based tests of the hypothesis that an observed time series was generated by what we call a generalized Wiener process (GWP). Assuming the existence of a Wiener process for asset rates of return is critical to the Black-Scholes model and its extension by Merton (BSM). The Hinich trispectrum-based test of linearity and the trispectrum extension of the Hinich-Rothman bispectrum test for time reversibility are used to test the validity of BSM. We apply the tests to a selection of high-frequency NYSE and Australian (ASX) stocks. 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:qld:uq2004:408&r=ets 
By:  Chia-Lin Chang; Philip Hans Franses; Michael McAleer (University of Canterbury) 
Abstract:  Macroeconomic forecasts are often based on the interaction between econometric models and experts. A forecast that is based only on an econometric model is replicable and may be unbiased, whereas a forecast that is not based only on an econometric model, but also incorporates an expert’s touch, is non-replicable and is typically biased. In this paper we propose a methodology to analyze the qualities of combined non-replicable forecasts. One part of the methodology seeks to retrieve a replicable component from the non-replicable forecasts, and compares this component against the actual data. A second part modifies the estimation routine to account for the assumption that the difference between a replicable and a non-replicable forecast involves a measurement error. An empirical example on forecasting economic fundamentals for Taiwan shows the relevance of the methodological approach. 
Keywords:  Combined forecasts; efficient estimation; generated regressors; replicable forecasts; non-replicable forecasts; expert’s intuition 
JEL:  C53 C22 E27 E37 
Date:  2010–05–01 
URL:  http://d.repec.org/n?u=RePEc:cbt:econwp:10/35&r=ets 
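The idea of retrieving a replicable component can be sketched by projecting the expert-adjusted (non-replicable) forecast on the model-based (replicable) forecast: the fitted value is a replicable proxy that strips out the expert's idiosyncratic noise. The data generating process and coefficients below are hypothetical and illustrate only this projection step, not the paper's full methodology.

```python
import numpy as np

rng = np.random.default_rng(5)
T = 300
actual = rng.normal(2.0, 1.0, T)
model_fc = actual + rng.normal(0, 0.8, T)                  # replicable model forecast
# Expert forecast: a biased transformation of the model forecast plus expert noise
expert_fc = 0.5 + 0.9 * model_fc + rng.normal(0, 0.4, T)   # non-replicable

# Retrieve a replicable component by OLS projection on the model forecast
X = np.column_stack([np.ones(T), model_fc])
beta = np.linalg.lstsq(X, expert_fc, rcond=None)[0]
replicable_part = X @ beta                                 # replicable proxy

mse_expert = np.mean((expert_fc - actual) ** 2)
mse_repl = np.mean((replicable_part - actual) ** 2)
```

Here the projection removes the expert's idiosyncratic noise while preserving the systematic (biased) transformation, so the replicable proxy forecasts no worse than the raw expert forecast in this stylized setup.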
By:  Seungmoon Choi (School of Economics, University of Adelaide) 
Abstract:  The aim of this paper is to find approximate log-transition density functions for multivariate time-inhomogeneous diffusions in closed form. There is substantial empirical evidence that the underlying data generating processes for many economic variables change over time. One possible way to capture the time-dependent behavior of state variables is to model the drift or volatility terms as functions of time t as well as of the state variables. Closed-form likelihood expansions for multivariate time-homogeneous diffusions have been obtained by Aït-Sahalia (2008). This research builds on his work and extends his results to time-inhomogeneous cases. A simulation study reveals that our method yields a very accurate approximate likelihood function that can be a good substitute when the true likelihood function is unavailable. 
Keywords:  Likelihood function; Multivariate time-inhomogeneous diffusion; Reducible diffusions; Irreducible diffusions 
Date:  2010–05 
URL:  http://d.repec.org/n?u=RePEc:adl:wpaper:201011&r=ets 
By:  Jia Chen (School of Economics, University of Adelaide); Jiti Gao (School of Economics, University of Adelaide); Degui Li (School of Economics, University of Adelaide) 
Abstract:  A semiparametric fixed effects model is introduced to describe the nonlinear trending phenomenon in panel data analysis, and it allows for cross-sectional dependence in both the regressors and the residuals. A semiparametric profile likelihood approach based on first-stage local linear fitting is developed to estimate both the parameter vector and the time trend function. 
Keywords:  Cross-sectional dependence, nonlinear time trend, panel data, profile likelihood, semiparametric regression 
JEL:  C13 C14 C23 
Date:  2010–05 
URL:  http://d.repec.org/n?u=RePEc:adl:wpaper:201010&r=ets 
By:  Degui Li (School of Economics, University of Adelaide); Jia Chen (School of Economics, University of Adelaide); Jiti Gao (School of Economics, University of Adelaide) 
Abstract:  This paper is concerned with developing a nonparametric time-varying coefficient model with fixed effects to characterize nonstationarity and the trending phenomenon in nonlinear panel data analysis. We develop two methods to estimate the trend function and the coefficient function without taking first differences to eliminate the fixed effects. The first method eliminates the fixed effects by taking cross-sectional averages, and then uses a nonparametric local linear approach to estimate the trend function and the coefficient function. The asymptotic theory for this approach reveals that although the estimates of both the trend function and the coefficient function are consistent, the estimate of the coefficient function has a rate of convergence that is slower than that of the trend function. To estimate the coefficient function more efficiently, we propose a pooled local linear dummy variable approach. This is motivated by the least squares dummy variable method proposed in parametric panel data analysis. This method removes the fixed effects by subtracting a smoothed version of the cross-time average from each individual. The asymptotic distributions of both estimates are established when T tends to infinity and N is fixed, or when both T and N tend to infinity. Simulation results are provided to illustrate the finite sample behavior of the proposed estimation methods. 
Keywords:  Fixed effects, local linear estimation, nonstationarity, panel data, specification testing, time-varying coefficient function 
JEL:  C13 C14 C23 
Date:  2010–05 
URL:  http://d.repec.org/n?u=RePEc:adl:wpaper:201008&r=ets 
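The first estimation idea (remove the fixed effects by cross-sectional averaging, then smooth) can be sketched for the trend-only special case as follows; the data generating process, kernel, and bandwidth are illustrative choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 50, 200
tau = np.arange(1, T + 1) / T
f = np.sin(np.pi * tau)                       # hypothetical trend function f(t/T)
alpha = rng.normal(0, 1, N)
alpha -= alpha.mean()                         # fixed effects, normalized to sum to zero
y = f[None, :] + alpha[:, None] + rng.normal(0, 0.5, (N, T))

# Step 1: cross-sectional averaging removes the fixed effects,
# leaving ybar_t = f(t/T) + noise of order 1/sqrt(N)
ybar = y.mean(axis=0)

# Step 2: local linear regression of ybar on t/T
def local_linear(x0, x, yv, h):
    u = (x - x0) / h
    w = np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)   # Epanechnikov kernel
    X = np.column_stack([np.ones_like(x), x - x0])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ yv)
    return beta[0]                            # intercept = estimate of f(x0)

h = 0.1
f_hat = np.array([local_linear(x0, tau, ybar, h) for x0 in tau])
mse = np.mean((f_hat - f) ** 2)
```

The paper's pooled dummy variable approach achieves a faster rate for the coefficient function by using all N×T observations rather than the T averaged ones; this sketch covers only the averaging step.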
By:  Hyeongwoo Kim; Nazif Durmaz 
Abstract:  We evaluate the usefulness of bias-correction methods for autoregressive (AR) models in terms of out-of-sample forecast accuracy, employing two popular methods proposed by Hansen (1999) and So and Shin (1999). Our Monte Carlo simulations show that these methods do not necessarily achieve better forecasting performance than the bias-uncorrected least squares (LS) method, because bias correction tends to increase the variance of the estimator. There is a gain from correcting for bias only when the true data generating process is sufficiently persistent. Although the bias arises in finite samples, the sample size (N) is not a crucial factor in the gains from bias correction, because both the bias and the variance tend to decrease as N increases. We also provide a real data application with 7 commodity price indices that confirms our findings. 
Keywords:  Small-Sample Bias, Grid Bootstrap, Recursive Mean Adjustment, Out-of-Sample Forecast 
JEL:  C52 C53 
Date:  2010–05 
URL:  http://d.repec.org/n?u=RePEc:abn:wpaper:auwp201002&r=ets 
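The bias-variance trade-off discussed above is easy to reproduce with the classic first-order analytical correction ρ̂ + (1 + 3ρ̂)/T (a Kendall-type adjustment, not the grid bootstrap or recursive mean adjustment studied in the paper). The sketch below shows the correction removing most of the downward LS bias in a persistent AR(1) while mechanically inflating the estimator's variance.

```python
import numpy as np

rng = np.random.default_rng(7)
rho, T, n_sims = 0.9, 50, 5000   # persistent AR(1), small sample (hypothetical values)

ls, corrected = [], []
for _ in range(n_sims):
    e = rng.normal(0, 1, T + 100)
    y = np.zeros(T + 100)
    for t in range(1, T + 100):
        y[t] = rho * y[t - 1] + e[t]
    y = y[100:]                              # drop burn-in
    x = y - y.mean()                         # demean (LS with intercept)
    r = (x[1:] @ x[:-1]) / (x[:-1] @ x[:-1]) # LS estimate of rho
    ls.append(r)
    corrected.append(r + (1 + 3 * r) / T)    # first-order analytical bias correction
ls, corrected = np.array(ls), np.array(corrected)

bias_ls = ls.mean() - rho        # strongly negative: LS is downward biased
bias_c = corrected.mean() - rho  # much closer to zero
```

Note that corrected = r(1 + 3/T) + 1/T, so its standard deviation exceeds that of the LS estimator by construction, which is exactly the variance cost the abstract refers to.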
By:  Waldyr Dutra Areosa (Department of Economics, PUC-Rio, and Banco Central do Brasil); Michael McAleer (Erasmus School of Economics, Tinbergen Institute, and Center for International Research on the Japanese Economy (CIRJE)); Marcelo Cunha Medeiros (Department of Economics, PUC-Rio) 
Abstract:  Nonlinear regression models have been widely used in practice for a variety of time series and cross-section datasets. For purposes of analyzing univariate and multivariate time series data, in particular, Smooth Transition Regression (STR) models have been shown to be very useful for representing and capturing asymmetric behavior. Most STR models have been applied to univariate processes, and have made a variety of assumptions, including stationary or cointegrated processes, uncorrelated, homoskedastic or conditionally heteroskedastic errors, and weakly exogenous regressors. Under the assumption of exogeneity, the standard method of estimation is nonlinear least squares. The primary purpose of this paper is to relax the assumption of weakly exogenous regressors and to discuss moment-based methods for estimating STR models. The paper analyzes the properties of the STR model with endogenous variables by providing a diagnostic test of linearity of the underlying process under endogeneity, developing an estimation procedure and a misspecification test for the STR model, presenting the results of Monte Carlo simulations to show the usefulness of the model and estimation method, and providing an empirical application for inflation rate targeting in Brazil. We show that STR models with endogenous variables can be specified and estimated by a straightforward application of existing results in the literature. 
Keywords:  Smooth transition, nonlinear models, nonlinear instrumental variables, generalized method of moments, endogeneity, inflation targeting. 
Date:  2010–03 
URL:  http://d.repec.org/n?u=RePEc:rio:texdis:571&r=ets 
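For readers unfamiliar with STR models, the sketch below simulates a simple logistic smooth transition regression and recovers its parameters by nonlinear least squares, i.e., the exogenous-regressor baseline the paper generalizes; the model, parameter values, and starting values are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(11)
T = 500
x = rng.normal(0, 1, T)            # exogenous regressor
s = rng.normal(0, 1, T)            # transition variable

def str_model(X, b1, b2, gamma, c):
    x, s = X
    # Logistic transition function: smooth move between regimes b1 and b1 + b2
    G = 1.0 / (1.0 + np.exp(-gamma * (s - c)))
    return b1 * x + b2 * x * G

# Hypothetical true parameters: b1=1.0, b2=-2.0, gamma=4.0, c=0.0
y = str_model((x, s), 1.0, -2.0, 4.0, 0.0) + rng.normal(0, 0.3, T)

# Nonlinear least squares with reasonable starting values
p_hat, _ = curve_fit(str_model, (x, s), y, p0=[0.5, -1.0, 2.0, 0.1])
```

With endogenous regressors, as in the paper, this NLS step is replaced by moment-based (GMM / nonlinear IV) estimation.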
By:  Francesco Audrino (University of St. Gallen); Marcelo Cunha Medeiros (Department of Economics, PUC-Rio) 
Abstract:  In this paper we propose a smooth transition tree model for both the conditional mean and the conditional variance of the short-term interest rate process. The estimation of such models is addressed, and the asymptotic properties of the quasi-maximum likelihood estimator are derived. Model specification is also discussed. When the model is applied to the US short-term interest rate, we find that (1) leading indicators for inflation and real activity are the most relevant predictors in characterizing the multiple regimes’ structure, and (2) the optimal model has three limiting regimes. Moreover, we provide empirical evidence of the power of the model in forecasting the first two conditional moments when it is used in connection with bootstrap aggregation (bagging). 
Keywords:  short-term interest rate, regression tree, smooth transition, conditional variance, bagging, asymptotic theory 
Date:  2010–03 
URL:  http://d.repec.org/n?u=RePEc:rio:texdis:570&r=ets 
By:  Tsunehiro Ishihara (Graduate School of Economics, University of Tokyo); Yasuhiro Omori (Faculty of Economics, University of Tokyo) 
Abstract:  An efficient Bayesian estimation method using Markov chain Monte Carlo is proposed for a multivariate stochastic volatility model with cross-leverage and heavy-tailed errors, a natural extension of the univariate stochastic volatility model with leverage and heavy-tailed errors that further incorporates cross-leverage effects among stock returns. Our method is based on a multi-move sampler that samples a block of latent volatility vectors at once. Its high sampling efficiency is shown using numerical examples, in comparison with a single-move sampler that samples one latent volatility vector at a time, given the other latent vectors and parameters. To illustrate the method, empirical analyses are provided based on five-dimensional S&P 500 sector index returns. 
Date:  2010–05 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2010cf746&r=ets 
By:  A. Carriero; G. Kapetanios; M. Marcellino 
Abstract:  We propose a new approach to forecasting the term structure of interest rates, which allows us to efficiently extract the information contained in a large panel of yields. In particular, we use a large Bayesian Vector Autoregression (BVAR) with an optimal amount of shrinkage towards univariate AR models. Focusing on the U.S., we provide an extensive study of the forecasting performance of our proposed model relative to most of the existing alternative specifications. While most of the existing evidence focuses on statistical measures of forecast accuracy, we also evaluate the performance of the alternative forecasts when used within trading schemes or as a basis for portfolio allocation. We extensively check the robustness of our results via subsample analysis and via a data-based Monte Carlo simulation. We find that: i) our proposed BVAR approach produces forecasts systematically more accurate than the random walk forecasts, though the gains are small; ii) some models beat the BVAR for a few selected maturities and forecast horizons, but they perform much worse than the BVAR in the remaining cases; iii) predictive gains with respect to the random walk have decreased over time; iv) different loss functions (i.e., "statistical" vs "economic") lead to different rankings of specific models; v) modelling time variation in term premia is important and useful for forecasting. 
Keywords:  Bayesian methods, Forecasting, Term Structure. 
JEL:  C11 C53 E43 E47 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:eui:euiwps:eco2010/17&r=ets 
By:  Òscar Jordà; Malte Knüppel; Massimiliano Marcellino 
Abstract:  Measuring and displaying uncertainty around path forecasts, i.e., forecasts made in period T about the expected trajectory of a random variable in periods T+1 to T+H, is a key ingredient for decision making under uncertainty. The probabilistic assessment of the set of possible trajectories that the variable may follow over time is summarized by the simultaneous confidence region generated from its forecast generating distribution. However, if the null model is only approximate or altogether unavailable, one cannot derive analytic expressions for this confidence region, and its nonparametric estimation is impractical given commonly available predictive sample sizes. Instead, this paper derives approximate rectangular confidence regions that control the false discovery rate, which are a function of the predictive sample covariance matrix and the empirical distribution of the Mahalanobis distance of the path-forecast errors. These rectangular regions are simple to construct and appear to work well in a variety of cases explored empirically and by simulation. The proposed techniques are applied to provide confidence bands around the Fed and Bank of England real-time path forecasts of growth and inflation. 
Keywords:  path forecast, forecast uncertainty, simultaneous confidence region, Scheffé’s S-method, Mahalanobis distance, false discovery rate. 
JEL:  C32 C52 C53 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:eui:euiwps:eco2010/18&r=ets 
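The rectangular-region idea can be sketched generically: take the empirical 95% quantile of the squared Mahalanobis distance of path-forecast errors and use it to scale per-horizon half-widths. The resulting box contains the corresponding Mahalanobis ellipsoid, so its simultaneous coverage is conservative. The covariance matrix below is simulated and hypothetical; this is not the paper's exact Scheffé-type construction.

```python
import numpy as np

rng = np.random.default_rng(3)
H = 4                                        # forecast horizon (path length)
A = rng.normal(0, 1, (H, H))
Sigma = A @ A.T + H * np.eye(H)              # a hypothetical positive definite error covariance

# Empirical distribution of the squared Mahalanobis distance from a predictive sample
errs = rng.multivariate_normal(np.zeros(H), Sigma, size=5000)
Sinv = np.linalg.inv(Sigma)
d2 = np.einsum("ij,jk,ik->i", errs, Sinv, errs)
c = np.quantile(d2, 0.95)                    # 95% quantile of the squared distance

# Rectangular band: half-width sqrt(c * Sigma_hh) at horizon h.
# Any error path inside the {d2 <= c} ellipsoid satisfies |e_h| <= sqrt(c * Sigma_hh),
# so the box contains the ellipsoid and simultaneous coverage is at least nominal.
half_width = np.sqrt(c * np.diag(Sigma))
test_errs = rng.multivariate_normal(np.zeros(H), Sigma, size=5000)
inside = np.all(np.abs(test_errs) <= half_width, axis=1)
coverage = inside.mean()
```

In practice one would replace the simulated errors with the predictive sample of realized path-forecast errors, centered on the published path forecast.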