nep-ets New Economics Papers
on Econometric Time Series
Issue of 2006–01–24
twenty-six papers chosen by
Yong Yin
SUNY at Buffalo

  1. Testing for Additive Outliers in Seasonally Integrated Time Series By Niels Haldrup; Andreu Sansó
  2. Forecasting Core Inflation in Canada: Should We Forecast the Aggregate or the Components? By Frédérick Demers; Annie De Champlain
  3. Bootstrapping a linear estimator of the ARCH parameters By Arup Bose
  4. Economic and VAR Shocks: What Can Go Wrong? By Jesús Fernández-Villaverde; Juan F. Rubio-Ramírez; Thomas J. Sargent
  5. EFFICIENT WALD TESTS FOR FRACTIONAL UNIT ROOTS By Ignacio N. Lobato; Carlos Velasco
  6. ARE FEEDBACK FACTORS IMPORTANT IN MODELLING FINANCIAL DATA? By Helena Veiga
  7. Optimal Bandwidth Selection in Heteroskedasticity-Autocorrelation Robust Testing By Yixiao Sun; Peter C. B. Phillips; Sainan Jin
  8. Gaussian Inference in AR(1) Time Series with or without a Unit Root By Peter C. B. Phillips; Chirok Han
  9. Optimal Estimation of Cointegrated Systems with Irrelevant Instruments By Peter C. B. Phillips
  10. Refined Inference on Long Memory in Realized Volatility By Offer Lieberman; Peter C. B. Phillips
  11. Indirect Inference for Dynamic Panel Models By Christian Gourieroux; Peter C. B. Phillips; Jun Yu
  12. Nonparametric Tests for Serial Independence Based on Quadratic Forms By Cees Diks; Valentyn Panchenko
  13. Measuring Asymmetric Stochastic Cycle Components in U.S. Macroeconomic Time Series By Siem Jan Koopman; Kai Ming Lee
  14. Series Expansions for Finite-State Markov Chains By Bernd Heidergott; Arie Hordijk; Miranda van Uitert
  15. Periodic Seasonal Reg-ARFIMA-GARCH Models for Daily Electricity Spot Prices By Siem Jan Koopman; Marius Ooms; M. Angeles Carnero
  16. Outlier Detection in GARCH Models By Jurgen A. Doornik; Marius Ooms
  17. On Importance Sampling for State Space Models By Borus Jungbacker; Siem Jan Koopman
  18. A Simple Multiple Variance-Ratio Test Based on Ranks By Gilbert Colletaz
  19. Asymptotically Efficient Estimation of the Change Point for Semiparametric GARCH models By Takayuki Shiohama
  20. Structural Breakpoints in Volatility in International Markets By Viviana Fernandez
  21. A Complete VARMA Modelling Methodology Based on Scalar Components By George Athanasopoulos; Farshid Vahid
  22. Some Nonlinear Exponential Smoothing Models are Unstable By Rob J Hyndman; Muhammad Akram
  23. New Variance Ratio Tests to Identify Random Walk from the General Mean Reversion Model By Kin Lam; May Chun Mei Wong; Wing-Keung Wong
  24. Macroeconometric Modelling with a Global Perspective By M. Hashem Pesaran; Ron Smith
  25. Structural Change and the Order of Integration in By Luis Alberiko Gil-Alana
  26. Forecasting Accuracy and Estimation Uncertainty using VAR Models with Short- and Long-Term Economic Restrictions: A Monte-Carlo Study By Osmani Teixeira de Carvalho Guillén; João Victor Issler; George Athanasopoulos

  1. By: Niels Haldrup; Andreu Sansó (Department of Economics, University of Aarhus, Denmark)
    Abstract: The role of additive outliers in integrated time series has attracted some attention recently and research shows that outlier detection should be an integral part of unit root testing procedures. Recently, Vogelsang (1999) suggested an iterative procedure for the detection of multiple additive outliers in integrated time series. However, the procedure appears to suffer from serious size distortions towards the finding of too many outliers, as has been shown by Perron and Rodriguez (2003). In this note we prove the inconsistency of the test in each step of the iterative procedure; hence alternative routes need to be taken to detect outliers in nonstationary time series.
    Keywords: Additive outliers, outlier detection, integrated processes
    JEL: C12 C2 C22
    Date: 2006–01–16
    URL: http://d.repec.org/n?u=RePEc:aah:aarhec:2006-01&r=ets
  2. By: Frédérick Demers; Annie De Champlain
    Abstract: The authors investigate the behaviour of core inflation in Canada to analyze three key issues: (i) homogeneity in the response of various price indexes to demand or real exchange rate shocks relative to the response of aggregate core inflation; (ii) whether using disaggregate data helps to improve the forecast of core inflation; and (iii) whether using monthly data helps to improve quarterly forecasts. The authors show that the response of inflation to output-gap or real exchange rate shocks varies considerably across the components, although the average response remains low; they also show that the average response has decreased over time. To forecast monthly inflation, the use of disaggregate data is a significant improvement over the use of aggregate data. However, the improvements in forecasts of quarterly rates of inflation are only minor. Overall, it remains difficult to properly model and forecast monthly core inflation in Canada.
    Keywords: Econometric and statistical methods; Inflation and prices
    JEL: E37 C5
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:bca:bocawp:05-44&r=ets
  3. By: Arup Bose
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:cin:ucecwp:2006-03&r=ets
  4. By: Jesús Fernández-Villaverde; Juan F. Rubio-Ramírez; Thomas J. Sargent
    Date: 2005–12–31
    URL: http://d.repec.org/n?u=RePEc:cla:levrem:122247000000000990&r=ets
  5. By: Ignacio N. Lobato; Carlos Velasco
    Abstract: In this article we introduce efficient Wald tests for testing the null hypothesis of unit root against the alternative of fractional unit root. In a local alternative framework, the proposed tests are locally asymptotically equivalent to the optimal Robinson (1991, 1994a) Lagrange Multiplier tests. Our results contrast with the tests for fractional unit roots introduced by Dolado, Gonzalo and Mayoral (2002), which are inefficient. In the presence of short range serial correlation, we propose a simple and efficient two-step test that avoids the estimation of a nonlinear regression model. In addition, the first order asymptotic properties of the proposed tests are not affected by the pre-estimation of short or long memory parameters.
    Date: 2005–11
    URL: http://d.repec.org/n?u=RePEc:cte:werepe:we056935&r=ets
  6. By: Helena Veiga
    Abstract: This paper provides empirical evidence that continuous time models with one factor of volatility are, in some circumstances, able to fit the main characteristics of financial data and reports insights about the importance of introducing feedback factors for capturing the strong persistence caused by the presence of changes in the variance. We use the Efficient Method of Moments (EMM) by Gallant and Tauchen (1996) to estimate and to select among logarithmic models with one and two stochastic volatility factors (with and without feedback).
    Date: 2006–01
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws060101&r=ets
  7. By: Yixiao Sun (Department of Economics, University of California, San Diego); Peter C. B. Phillips (Cowles Foundation, Yale University; University of Auckland & University of York); Sainan Jin (Guanghua School of Management, Peking University)
    Abstract: In time series regressions with nonparametrically autocorrelated errors, it is now standard empirical practice to use kernel-based robust standard errors that involve some smoothing function over the sample autocorrelations. The underlying smoothing parameter b, which can be defined as the ratio of the bandwidth (or truncation lag) to the sample size, is a tuning parameter that plays a key role in determining the asymptotic properties of the standard errors and associated semiparametric tests. Small-b asymptotics involve standard limit theory such as standard normal or chi-squared limits, whereas fixed-b asymptotics typically lead to nonstandard limit distributions involving Brownian bridge functionals. The present paper shows that the nonstandard fixed-b limit distributions of such nonparametrically studentized tests provide more accurate approximations to the finite sample distributions than the standard small-b limit distribution. In particular, using asymptotic expansions of both the finite sample distribution and the nonstandard limit distribution, we confirm that the second-order corrected critical value based on the expansion of the nonstandard limiting distribution is also second-order correct under the standard small-b asymptotics. We further show that, for typical economic time series, the optimal bandwidth that minimizes a weighted average of type I and type II errors is larger by an order of magnitude than the bandwidth that minimizes the asymptotic mean squared error of the corresponding long-run variance estimator. A plug-in procedure for implementing this optimal bandwidth is suggested and simulations confirm that the new plug-in procedure works well in finite samples.
    Keywords: Asymptotic expansion, Bandwidth choice, Kernel method, Long-run variance, Loss function, Nonstandard asymptotics, Robust standard error, Type I and Type II errors
    JEL: C13 C14 C22 C51
    Date: 2006–01
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1545&r=ets
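As context for readers unfamiliar with the setting (not code from the paper), the object the bandwidth rules target is the kernel-based long-run variance estimate behind HAC standard errors. A minimal Bartlett-kernel (Newey-West-style) sketch, with an illustrative function name and interface:

```python
import numpy as np

def bartlett_lrv(u, M):
    """Bartlett-kernel (Newey-West form) long-run variance estimate with
    truncation lag M.  The smoothing parameter b discussed in the paper
    is the ratio b = M / len(u).  Interface is illustrative only."""
    u = np.asarray(u, dtype=float)
    u = u - u.mean()
    T = len(u)
    lrv = u @ u / T                       # gamma_0, the sample variance
    for j in range(1, M + 1):
        w = 1.0 - j / (M + 1)             # Bartlett kernel weight
        lrv += 2.0 * w * (u[j:] @ u[:-j]) / T  # j-th sample autocovariance
    return lrv

rng = np.random.default_rng(0)
omega = bartlett_lrv(rng.standard_normal(5000), M=10)  # ~1 for white noise
```

For white noise the long-run variance equals the variance, so the estimate is near 1 here; the paper's question is how to choose M (equivalently b) when the errors are autocorrelated.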
  8. By: Peter C. B. Phillips (Cowles Foundation, Yale University; University of Auckland & University of York); Chirok Han (Victoria University of Wellington)
    Abstract: This note introduces a simple first-difference-based approach to estimation and inference for the AR(1) model. The estimates have virtually no finite sample bias, are not sensitive to initial conditions, and the approach has the unusual advantage that a Gaussian central limit theory applies and is continuous as the autoregressive coefficient passes through unity, with a uniform √n rate of convergence. En route, a useful CLT for sample covariances of linear processes is given, following Phillips and Solo (1992). The approach also has useful extensions to dynamic panels.
    Keywords: Autoregression, Differencing, Gaussian limit, Mildly explosive processes, Uniformity, Unit root
    JEL: C22
    Date: 2006–01
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1546&r=ets
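The paper's difference-based estimator is not reproduced here; as motivation, a small Monte Carlo (with arbitrary illustrative parameter values) showing the finite-sample bias of plain OLS in a near-unit-root AR(1) — the problem the proposed approach is designed to avoid:

```python
import numpy as np

# Not the paper's estimator: a simulation showing that OLS estimates of
# the AR(1) coefficient are biased downward in small samples when the
# true coefficient is close to unity.  rho, T and reps are arbitrary.
rng = np.random.default_rng(0)
rho, T, reps = 0.9, 50, 2000
estimates = np.empty(reps)
for i in range(reps):
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = rho * y[t - 1] + rng.standard_normal()
    estimates[i] = (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])  # OLS slope
mean_est = estimates.mean()   # noticeably below the true rho = 0.9
```

The average estimate falls visibly short of 0.9 at T = 50, which is why bias-free procedures with Gaussian limit theory near unity are attractive.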
  9. By: Peter C. B. Phillips (Cowles Foundation, Yale University; University of Auckland & University of York)
    Abstract: It has been known since Phillips and Hansen (1990) that cointegrated systems can be consistently estimated using stochastic trend instruments that are independent of the system variables. A similar phenomenon occurs with deterministically trending instruments. The present work shows that such “irrelevant” deterministic trend instruments may be systematically used to produce asymptotically efficient estimates of a cointegrated system. The approach is convenient in practice, involves only linear instrumental variables estimation, and is a straightforward one-step procedure with no loss of degrees of freedom in estimation. Simulations reveal that the procedure works well in practice, having little finite sample bias and less finite sample dispersion than other popular cointegrating regression procedures such as reduced rank VAR regression, fully modified least squares, and dynamic OLS. The procedure is shown to be a form of maximum likelihood estimation where the likelihood is constructed for data projected onto the trending instruments. This “trend likelihood” is related to the notion of the local Whittle likelihood but avoids frequency domain issues altogether. Correspondingly, the approach developed here has many potential applications beyond conventional cointegrating regression, such as the estimation of long memory and fractional cointegrating relationships.
    Keywords: Asymptotic efficiency, Cointegrated system, Instrumental variables, Irrelevant instrument, Karhunen-Loeve representation, Long memory, Optimal estimation, Orthonormal basis, Trend basis, Trend likelihood
    JEL: C22
    Date: 2006–01
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1547&r=ets
  10. By: Offer Lieberman (Technion-Israel Institute of Technology); Peter C. B. Phillips (Cowles Foundation, Yale University; University of Auckland & University of York)
    Abstract: There is an emerging consensus in empirical finance that realized volatility series typically display long range dependence with a memory parameter (d) around 0.4 (Andersen et al. (2001), Martens et al. (2004)). The present paper provides some analytical explanations for this evidence and shows how recent results in Lieberman and Phillips (2004a, 2004b) can be used to refine statistical inference about d with little computational effort. In contrast to the standard asymptotic normal theory now used in the literature, which has an O(n^(-1/2)) error rate on rejection probabilities, the asymptotic approximation used here has an error rate of o(n^(-1/2)). The new formula is independent of unknown parameters, is simple to calculate and highly user-friendly. The method is applied to test whether the reported long memory parameter estimates of Andersen et al. (2001) and Martens et al. (2004) differ significantly from the lower boundary (d = 0.5) of nonstationary long memory.
    Keywords: ARFIMA; Edgeworth expansion; Fourier integral expansion; Fractional differencing; Improved inference; Long memory; Pivotal statistic; Realized volatility; Singularity
    JEL: C13 C22
    Date: 2006–01
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1549&r=ets
  11. By: Christian Gourieroux (CREST-INSEE); Peter C. B. Phillips (Cowles Foundation, Yale University; University of Auckland & University of York); Jun Yu (School of Economics and Social Science, Singapore Management University)
    Abstract: It is well-known that maximum likelihood (ML) estimation of the autoregressive parameter of a dynamic panel data model with fixed effects is inconsistent under fixed time series sample size (T) and large cross section sample size (N) asymptotics. The estimation bias is particularly relevant in practical applications when T is small and the autoregressive parameter is close to unity. The present paper proposes a general, computationally inexpensive method of bias reduction that is based on indirect inference (Gouriéroux et al., 1993), shows unbiasedness and analyzes efficiency. The method is implemented in a simple linear dynamic panel model, but has wider applicability and can, for instance, be easily extended to more complicated frameworks such as nonlinear models. Monte Carlo studies show that the proposed procedure achieves substantial bias reductions with only mild increases in variance, thereby substantially reducing root mean square errors. The method is compared with certain consistent estimators and bias-corrected ML estimators previously proposed in the literature and is shown to have superior finite sample properties to GMM and the bias-corrected ML of Hahn and Kuersteiner (2002). Finite sample performance is compared with that of a recent estimator proposed by Han and Phillips (2005).
    Keywords: Autoregression, Bias reduction, Dynamic panel, Fixed effects, Indirect inference
    JEL: C33
    Date: 2006–01
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1550&r=ets
  12. By: Cees Diks (CeNDEF, Faculty of Economics, University of Amsterdam); Valentyn Panchenko (CeNDEF, Faculty of Economics, University of Amsterdam)
    Abstract: Tests for serial independence and goodness-of-fit based on divergence notions between probability distributions, such as the Kullback-Leibler divergence or Hellinger distance, have recently received much interest in time series analysis. The aim of this paper is to introduce tests for serial independence using kernel-based quadratic forms. This separates the problem of consistently estimating the divergence measure from that of consistently estimating the underlying joint densities, the existence of which is no longer required. Exact level tests are obtained by implementing a Monte Carlo procedure using permutations of the original observations. The bandwidth selection problem is addressed by introducing a multiple bandwidth procedure based on a range of different bandwidth values. After numerically establishing that the tests perform well compared to existing nonparametric tests, applications to estimated time series residuals are considered. The approach is illustrated with an application to financial returns data.
    Keywords: Bandwidth selection; Nonparametric tests; Serial independence; Quadratic forms
    JEL: C14 C15
    Date: 2005–08–02
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20050076&r=ets
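The exact-level permutation logic the abstract mentions is easy to illustrate. In this sketch the |lag-1 autocorrelation| statistic is a simple stand-in for the paper's kernel-based quadratic form; only the permutation mechanics (compare the statistic to its distribution under reshuffling) match the paper:

```python
import numpy as np

def perm_independence_pvalue(y, n_perm=999, seed=0):
    """Permutation test of serial independence.  Statistic: |lag-1
    autocorrelation| (a stand-in for the paper's quadratic form).
    Permuting the observations destroys any serial dependence, giving
    an exact-level Monte Carlo p-value."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y, dtype=float)

    def stat(x):
        x = x - x.mean()
        return abs(x[:-1] @ x[1:]) / (x @ x)

    t0 = stat(y)
    count = sum(stat(rng.permutation(y)) >= t0 for _ in range(n_perm))
    return (1 + count) / (1 + n_perm)

rng = np.random.default_rng(1)
e = rng.standard_normal(200)
ar1 = np.zeros(200)
for t in range(1, 200):
    ar1[t] = 0.7 * ar1[t - 1] + e[t]
p_dep = perm_independence_pvalue(ar1)   # small: dependence detected
```

For a strongly autocorrelated AR(1) series the permutation p-value is tiny, while for i.i.d. data it is uniformly distributed by construction.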
  13. By: Siem Jan Koopman (Faculty of Economics and Business Administration, Vrije Universiteit Amsterdam); Kai Ming Lee (Faculty of Economics and Business Administration, Vrije Universiteit Amsterdam)
    Abstract: To gain insights in the current status of the economy, macroeconomic time series are often decomposed into trend, cycle and irregular components. This can be done by nonparametric band-pass filtering methods in the frequency domain or by model-based decompositions based on autoregressive moving average models or unobserved components time series models. In this paper we consider the latter and extend the model to allow for asymmetric cycles. In theoretical and empirical studies, the asymmetry of cyclical behavior is often discussed and considered for series such as unemployment and gross domestic product (GDP). The number of attempts to model asymmetric cycles is limited and it is regarded as intricate and nonstandard. In this paper we show that a limited modification of the standard cycle component leads to a flexible device for asymmetric cycles. The presence of asymmetry can be tested using classical likelihood based test statistics. The trend-cycle decomposition model is applied to three key U.S. macroeconomic time series. It is found that cyclical asymmetry is a salient feature of the U.S. economy.
    Keywords: Asymmetric business cycles; Unobserved Components; Nonlinear state space models; Monte Carlo likelihood; Importance sampling
    JEL: C13 C22 E32
    Date: 2005–08–15
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20050081&r=ets
  14. By: Bernd Heidergott (Faculty of Economics, Vrije Universiteit Amsterdam); Arie Hordijk (Leiden University, Mathematical Institute); Miranda van Uitert (Faculty of Economics, Vrije Universiteit Amsterdam)
    Abstract: This paper provides series expansions of the stationary distribution of a finite Markov chain. This leads to an efficient numerical algorithm for computing the stationary distribution of a finite Markov chain. Numerical examples are given to illustrate the performance of the algorithm.
    Keywords: finite-state Markov chain; (Taylor) series expansion; measure-valued derivatives; coupled processors
    JEL: C63 C44
    Date: 2005–09–20
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20050086&r=ets
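For orientation, the quantity the paper's series expansions approximate can be computed directly by linear algebra for small chains. A baseline sketch (the paper's contribution is an efficient expansion-based alternative to this kind of direct solve):

```python
import numpy as np

def stationary_distribution(P):
    """Stationary vector pi with pi @ P = pi and sum(pi) = 1, obtained by
    a direct least-squares solve of the stacked linear system.  This is
    the baseline computation that series-expansion methods approximate."""
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones(n)])   # (P' - I) pi = 0, 1' pi = 1
    b = np.concatenate([np.zeros(n), [1.0]])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

# Illustrative two-state chain
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
pi = stationary_distribution(P)   # [0.8, 0.2]
```

Here pi1 = 4*pi2 follows from balance (0.1*pi1 = 0.4*pi2), giving [0.8, 0.2]; for large chains, direct solves become costly, which motivates efficient numerical algorithms such as the one in the paper.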
  15. By: Siem Jan Koopman (Faculty of Economics and Business Administration, Vrije Universiteit Amsterdam); Marius Ooms (Faculty of Economics and Business Administration, Vrije Universiteit Amsterdam); M. Angeles Carnero (Dpt. Fundamentos del Analisis Economico, University of Alicante)
    Abstract: Novel periodic extensions of dynamic long memory regression models with autoregressive conditional heteroskedastic errors are considered for the analysis of daily electricity spot prices. The parameters of the model with mean and variance specifications are estimated simultaneously by the method of approximate maximum likelihood. The methods are implemented for time series of 1,200 to 4,400 daily price observations. Apart from persistence, heteroskedasticity and extreme observations in prices, a novel empirical finding is the importance of day-of-the-week periodicity in the autocovariance function of electricity spot prices. In particular, daily log prices from the Nord Pool power exchange of Norway are modeled effectively by our framework, which is also extended with explanatory variables. For the daily log prices of three European emerging electricity markets (EEX in Germany, Powernext in France, APX in The Netherlands), which are less persistent, periodicity is also highly significant.
    Keywords: Autoregressive fractionally integrated moving average model; Generalised autoregressive conditional heteroskedasticity model; Long memory process; Periodic autoregressive model; Volatility
    JEL: C22 C51 G10
    Date: 2005–10–12
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20050091&r=ets
  16. By: Jurgen A. Doornik (Nuffield College, University of Oxford); Marius Ooms (Department of Econometrics, Vrije Universiteit Amsterdam)
    Abstract: We present a new procedure for detecting multiple additive outliers in GARCH(1,1) models at unknown dates. The outlier candidates are the observations with the largest standardized residual. First, a likelihood-ratio based test determines the presence and timing of an outlier. Next, a second test determines the type of additive outlier (volatility or level). The tests are shown to be similar with respect to the GARCH parameters. Their null distribution can be easily approximated from an extreme value distribution, so that computation of p-values does not require simulation. The procedure outperforms alternative methods, especially when it comes to determining the date of the outlier. We apply the method to returns of the Dow Jones index, using monthly, weekly, and daily data. The procedure is extended and applied to GARCH models with Student-t distributed errors.
    Keywords: Dummy variable; Generalized Autoregressive Conditional Heteroskedasticity; GARCH-t; Outlier detection; Extreme value distribution
    JEL: C22 C52 G10
    Date: 2005–10–13
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20050092&r=ets
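The first step of the abstract — flagging the observation with the largest standardized residual as the outlier candidate — can be sketched with a GARCH(1,1) variance filter. The parameter values below are simply assumed for illustration, not estimated, and the tests that follow in the paper's procedure are omitted:

```python
import numpy as np

def garch_std_residuals(r, omega, alpha, beta):
    """Standardized residuals z_t = r_t / sigma_t from a GARCH(1,1)
    variance filter with given (not estimated) parameters.  In the
    paper's procedure the observation with the largest |z_t| is the
    outlier candidate to be tested."""
    sig2 = np.empty(len(r))
    sig2[0] = omega / (1 - alpha - beta)          # unconditional variance
    for t in range(1, len(r)):
        sig2[t] = omega + alpha * r[t - 1] ** 2 + beta * sig2[t - 1]
    return r / np.sqrt(sig2)

rng = np.random.default_rng(1)
r = 0.01 * rng.standard_normal(500)               # simulated daily returns
r[250] += 0.15                                    # planted additive outlier
z = garch_std_residuals(r, omega=5e-6, alpha=0.05, beta=0.90)
candidate = int(np.argmax(np.abs(z)))             # flags the planted date
```

The planted jump at t = 250 dominates the standardized residuals; the paper's contribution is the likelihood-ratio tests and extreme-value critical values that make this flagging rigorous.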
  17. By: Borus Jungbacker (Vrije Universiteit Amsterdam); Siem Jan Koopman (Vrije Universiteit Amsterdam)
    Abstract: We consider likelihood inference and state estimation by means of importance sampling for state space models with a nonlinear non-Gaussian observation y ~ p(y|alpha) and a linear Gaussian state alpha ~ p(alpha). The importance density is chosen to be the Laplace approximation of the smoothing density p(alpha|y). We show that computationally efficient state space methods can be used to perform all necessary computations in all situations. This requires new derivations of the Kalman filter, the smoother and the simulation smoother which do not rely on a linear Gaussian observation equation. Furthermore, results are presented that lead to a more effective implementation of importance sampling for state space models. An illustration is given for the stochastic volatility model with leverage.
    Keywords: Kalman filter; Likelihood function; Monte Carlo integration; Newton-Raphson; Posterior mode estimation; Simulation smoothing; Stochastic volatility model
    JEL: C15 C32
    Date: 2005–12–19
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20050117&r=ets
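The weighting idea underlying the paper can be shown in a toy one-dimensional setting. This is self-normalized importance sampling for a Gaussian target with a wider Gaussian proposal — the state space methods in the paper apply the same idea with a Laplace approximation of p(alpha|y) as the importance density:

```python
import numpy as np

# Toy self-normalized importance sampling: estimate E_p[x^2] = 1 for the
# target p = N(0, 1) using draws from a wider proposal q = N(0, 2^2).
# Normalizing constants cancel in the self-normalized weights.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 2.0, size=100_000)
log_w = -0.5 * x**2 + 0.5 * (x / 2.0) ** 2   # log p - log q, up to constants
w = np.exp(log_w - log_w.max())              # numerically stabilized weights
est = np.sum(w * x**2) / np.sum(w)           # approximately 1
```

Choosing a proposal with heavier tails than the target keeps the weights well behaved; for state space models the hard part, addressed in the paper, is constructing such a density over the whole state path efficiently.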
  18. By: Gilbert Colletaz (LEO - Laboratoire d'économie d'Orleans - http://www.univ-orleans.fr/DEG/LEO - CNRS : FRE2783 - Université d'Orléans)
    Abstract: Using Chow and Denning's arguments applied to the individual hypothesis test methodology of Wright (2000), I propose a multiple variance-ratio test based on ranks to investigate the hypothesis of no serial correlation. This joint rank test can be exact if the data are i.i.d. Some Monte Carlo simulations show that its size distortions are small for observations obeying the martingale hypothesis while not being an i.i.d. process. Also, regarding size and power, it compares favorably with other popular tests.
    Keywords: Random walk hypothesis ; non parametric test ; variance-ratio test
    Date: 2006–01–13
    URL: http://d.repec.org/n?u=RePEc:hal:papers:halshs-00007801_v1&r=ets
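The building block of the proposed joint test is a variance ratio computed on standardized ranks in the style of Wright (2000). A sketch of one such statistic (the paper's joint test combines several values of k in the spirit of Chow and Denning; that step is omitted here):

```python
import numpy as np

def rank_variance_ratio(y, k):
    """Variance ratio on Wright (2000)-style standardized ranks: the raw
    observations are replaced by standardized ranks, then the usual
    k-period variance ratio is formed.  Assumes no ties in y."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    ranks = np.argsort(np.argsort(y)) + 1               # ranks 1..T
    r = (ranks - (T + 1) / 2) / np.sqrt((T - 1) * (T + 1) / 12)
    ksum = np.convolve(r, np.ones(k), mode="valid")     # k-period partial sums
    return np.mean(ksum**2) / (k * np.mean(r**2))

rng = np.random.default_rng(0)
vr = rank_variance_ratio(rng.standard_normal(1000), k=5)  # near 1 under i.i.d.
```

Under serial independence the ratio is close to 1 for every k; a joint test rejects when the most extreme ratio across the chosen horizons is too far from 1.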
  19. By: Takayuki Shiohama
    Abstract: Instability of volatility parameters in GARCH models is an important issue for analyzing financial time series. In this paper we investigate the asymptotic theory for change point estimators in semiparametric GARCH models. When the parameters of a GARCH model have changed within an observed realization, two types of estimators, the maximum likelihood estimator (MLE) and the Bayesian estimator (BE), are proposed. We then derive the asymptotic distributions of these estimators. The MLE and BE have different limit laws, and the BE is asymptotically efficient. Monte Carlo studies of the finite sample behaviors are conducted.
    Keywords: GARCH process, change point, maximum likelihood estimator, Bayesian estimator, asymptotic efficiency
    Date: 2006–01
    URL: http://d.repec.org/n?u=RePEc:hit:hituec:a471&r=ets
  20. By: Viviana Fernandez
    Abstract: In this article, we test for the presence of structural breaks in volatility by two alternative approaches: the Iterative Cumulative Sum of Squares (ICSS) algorithm and wavelet analysis. Specifically, we look at the effect of the outbreak of the Asian crisis and the terrorist attacks of September 11, 2001 on Emerging Asia, Europe, Latin America and North America's stock markets. In addition, we focus on the behavior of interest rates in Chile after the Central Bank switched its monetary policy interest rate from an inflation-indexed to a nominal target in August 2001. Our estimation results show that the number of shifts detected by the two methods is substantially reduced when filtering out the data for both conditional heteroskedasticity and serial correlation. In addition, we conclude that the wavelet-based test tends to be more robust.
    Keywords: ICSS algorithm, wavelet analysis, volatility breakpoints.
    Date: 2005–12–15
    URL: http://d.repec.org/n?u=RePEc:iis:dispap:iiisdp076&r=ets
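The core statistic of the ICSS algorithm is the centered cumulative sum of squares of Inclán and Tiao; its extreme point locates a candidate variance breakpoint. A minimal sketch (the iterative segmentation and significance thresholds of the full algorithm are omitted):

```python
import numpy as np

def icss_dk(e):
    """Centered cumulative sum of squares D_k = C_k / C_T - k / T, with
    C_k the cumulative sum of e_t^2 up to k: the building block of the
    Inclan-Tiao ICSS algorithm.  argmax |D_k| is a candidate breakpoint."""
    e2 = np.asarray(e, dtype=float) ** 2
    C = np.cumsum(e2)
    k = np.arange(1, len(e2) + 1)
    return C / C[-1] - k / len(e2)

rng = np.random.default_rng(0)
e = np.concatenate([rng.standard_normal(300),          # variance 1
                    3.0 * rng.standard_normal(300)])   # variance 9 after break
bp = int(np.argmax(np.abs(icss_dk(e))))                # near the true break at 300
```

With a ninefold variance jump, |D_k| peaks sharply near the true break date; the paper's point is that such counts of detected shifts shrink once heteroskedasticity and serial correlation are filtered out first.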
  21. By: George Athanasopoulos; Farshid Vahid
    Abstract: This paper proposes an extension to scalar component methodology for the identification and estimation of VARMA models. The complete methodology determines the exact positions of all free parameters in any VARMA model with a predetermined embedded scalar component structure. This leads to an exactly identified system of equations that is estimated using full information maximum likelihood.
    Keywords: Identification, Multivariate time series, Scalar components, VARMA models.
    JEL: C32 C51
    Date: 2006–01
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2006-2&r=ets
  22. By: Rob J Hyndman; Muhammad Akram
    Abstract: This paper discusses the instability of eleven nonlinear state space models that underlie exponential smoothing. Hyndman et al. (2002) proposed a framework of 24 state space models for exponential smoothing, including the well-known simple exponential smoothing, Holt's linear and Holt-Winters' additive and multiplicative methods. This was extended to 30 models with Taylor's (2003) damped multiplicative methods. We show that eleven of these 30 models are unstable, having infinite forecast variances. The eleven models are those with additive errors and either multiplicative trend or multiplicative seasonality, as well as the models with multiplicative errors, multiplicative trend and additive seasonality. The multiplicative Holt-Winters' model with additive errors is among the eleven unstable models. We conclude that: (1) a model with a multiplicative trend or a multiplicative seasonal component should also have a multiplicative error; and (2) a multiplicative trend should not be mixed with additive seasonality.
    Keywords: exponential smoothing, forecast variance, nonlinear models, prediction intervals, stability, state space models.
    JEL: C53 C22
    Date: 2006–01
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2006-3&r=ets
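For readers outside the forecasting literature, the baseline member of the taxonomy discussed above is simple exponential smoothing, which is one of the stable cases. A minimal sketch of its recursion (the unstable models flagged in the paper mix additive errors with multiplicative trend or seasonal components, which this baseline does not):

```python
import numpy as np

def simple_exponential_smoothing(y, alpha, level0):
    """Simple exponential smoothing, a *stable* member of the Hyndman et
    al. (2002) taxonomy: l_t = alpha * y_t + (1 - alpha) * l_{t-1}.
    Returns one-step-ahead forecasts and the final level."""
    level = level0
    forecasts = []
    for obs in y:
        forecasts.append(level)                   # forecast before update
        level = alpha * obs + (1 - alpha) * level
    return np.array(forecasts), level

y = np.array([10.0, 12.0, 11.0, 13.0])
forecasts, final_level = simple_exponential_smoothing(y, alpha=0.5, level0=10.0)
# forecasts = [10., 10., 11., 11.], final_level = 12.0
```

Instability in the paper's sense means the forecast variance of a model diverges; the exponentially weighted level above damps past observations and does not have that problem.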
  23. By: Kin Lam (Department of Finance & Decision Sciences, Hong Kong Baptist University); May Chun Mei Wong (Dental Public Health, The University of Hong Kong); Wing-Keung Wong (Department of Economics, The National University of Singapore)
    Abstract: We develop some properties of the autocorrelation of the k-period returns for the general mean reversion (GMR) process, in which the stationary component is not restricted to the AR(1) process but takes the form of a general ARMA process. We then derive some properties of the GMR process and three new non-parametric tests comparing the relative variability of returns over different horizons to validate the GMR process as an alternative to the random walk. We further examine the asymptotic properties of these tests, which can then be applied to identify random walk models from the GMR processes.
    Keywords: mean reversion, variance ratio test, random walk, stock price, stock return
    JEL: G12 G14
    URL: http://d.repec.org/n?u=RePEc:nus:nusewp:wp0514&r=ets
  24. By: M. Hashem Pesaran; Ron Smith
    Abstract: This paper provides a synthesis and further development of a global modelling approach introduced in Pesaran, Schuermann and Weiner (2004), where country-specific models in the form of VARX* structures are estimated relating a vector of domestic variables, xit, to their foreign counterparts, x*it, and then consistently combined to form a Global VAR (GVAR). It is shown that the VARX* models can be derived as the solution to a dynamic stochastic general equilibrium (DSGE) model where over-identifying long-run theoretical relations can be tested and imposed if acceptable. This gives the system a transparent long-run theoretical structure. Similarly, short-run over-identifying theoretical restrictions can be tested and imposed if accepted. Alternatively, if one has less confidence in the short-run theory the dynamics can be left unrestricted. The assumption of the weak exogeneity of the foreign variables for the long-run parameters can be tested, and the x*it variables can be interpreted as proxies for global factors. Rather than using deviations from ad hoc statistical trends, the equilibrium values of the variables reflecting the long-run theory embodied in the model can be calculated. This approach has been used in a wide variety of contexts and for a wide variety of purposes. The paper also provides some new results.
    Keywords: Global VAR (GVAR), DSGE models, VARX
    JEL: C32 E17 F42
    URL: http://d.repec.org/n?u=RePEc:scp:wpaper:05-43&r=ets
  25. By: Luis Alberiko Gil-Alana (Facultad de Ciencias Económicas y Empresariales)
    Abstract: In this article we investigate whether the presence of structural breaks affects inference on the order of integration in univariate time series. For this purpose, we make use of a version of the tests of Robinson (1994) which allows us to test unit and fractional roots in the presence of deterministic changes. Several Monte Carlo experiments conducted throughout the paper show that the tests perform relatively well in the presence of both mean and slope breaks. The tests are applied to annual data on German real GDP, the results showing that the series may be well described in terms of a fractional model with a structural slope break due to World War II.
    JEL: C15 C22
    URL: http://d.repec.org/n?u=RePEc:una:unccee:wp2005&r=ets
  26. By: Osmani Teixeira de Carvalho Guillén (IBMEC Business School - Rio de Janeiro and Banco Central do Brasil); João Victor Issler (Graduate School of Economics - EPGE, Getulio Vargas Foundation); George Athanasopoulos (Department of Economics and Business Statistics, Monash University)
    Abstract: Using vector autoregressive (VAR) models and Monte-Carlo simulation methods we investigate the potential gains for forecasting accuracy and estimation uncertainty of two commonly used restrictions arising from economic relationships. The first reduces parameter space by imposing long-term restrictions on the behavior of economic variables as discussed by the literature on cointegration, and the second reduces parameter space by imposing short-term restrictions as discussed by the literature on serial-correlation common features (SCCF). Our simulations cover three important issues on model building, estimation, and forecasting. First, we examine the performance of standard and modified information criteria in choosing lag length for cointegrated VARs with SCCF restrictions. Second, we provide a comparison of forecasting accuracy of fitted VARs when only cointegration restrictions are imposed and when cointegration and SCCF restrictions are jointly imposed. Third, we propose a new estimation algorithm where short- and long-term restrictions interact to estimate the cointegrating and the cofeature spaces respectively. We have three basic results. First, ignoring SCCF restrictions has a high cost in terms of model selection, because standard information criteria too frequently choose inconsistent models with too small a lag length. Criteria selecting lag and rank simultaneously have a superior performance in this case. Second, this translates into a superior forecasting performance of the restricted VECM over the VECM, with important improvements in forecasting accuracy - reaching more than 100% in extreme cases. Third, the new algorithm proposed here fares very well in terms of parameter estimation, even when we consider the estimation of long-term parameters, opening up the discussion of joint estimation of short- and long-term parameters in VAR models.
    Keywords: reduced rank models, model selection criteria, forecasting accuracy
    JEL: C32 C53
    Date: 2006–01–02
    URL: http://d.repec.org/n?u=RePEc:ibr:dpaper:2006-01&r=ets

This nep-ets issue is ©2006 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.