nep-ecm New Economics Papers
on Econometrics
Issue of 2006‒12‒16
seventeen papers chosen by
Sune Karlsson
Orebro University

  1. The Role of "Leads" in the Dynamic OLS Estimation of Cointegrating Regression Models By Kazuhiko Hayakawa; Eiji Kurozumi
  2. Dynamic modeling under linear-exponential loss By Stanislav Anatolyev
  3. Asymptotic Properties of the Efficient Estimators for Cointegrating Regression Models with Serially Dependent Errors By Eiji Kurozumi; Kazuhiko Hayakawa
  4. Methods for inference in large multiple-equation Markov-switching models By Christopher A. Sims; Daniel F. Waggoner; Tao Zha
  5. Predictive regressions with panel data By Erik Hjalmarsson
  6. An introduction to univariate GARCH models By Teräsvirta, Timo
  7. Stability Tests for Heterogeneous Panel Data By Felix Chan; Tommaso Mancini-Griffoli; Laurent L. Pauwels
  8. Some Cautions on the Use of the LLC Panel Unit Root Test By Joakim Westerlund
  9. Hybrid and size-corrected subsample methods (joint with D.W.K. Andrews; June 2005, this version December 2006) By Patrik Guggenberger
  10. Modelling Financial High Frequency Data Using Point Processes By Luc, BAUWENS; Nikolaus, HAUTSCH
  11. The (Mis)Specification of Discrete Time Duration Models with Unobserved Heterogeneity: a Monte Carlo study By Cheti Nicoletti; Concetta Rondinelli
  12. The economic and statistical value of forecast combinations under regime switching: an application to predictable U.S. returns By Massimo Guidolin; Carrie Fangzhou Na
  13. Panel Cointegration Tests of the Fisher Effect By Joakim Westerlund
  14. Investing Under Model Uncertainty: Decision Based Evaluation of Exchange Rate and Interest Rate Forecasts in the US, UK and Japan By Anthony Garratt; Kevin Lee
  15. Real Time Representation of the UK Output Gap in the Presence of Trend Uncertainty By Anthony Garratt; Kevin Lee; Emi Mise; Kalvinder Shields
  16. Real Time Representations of the Output Gap By Anthony Garratt; Kevin Lee; Emi Mise; Kalvinder Shields
  17. A Further Look into the Demography-based GDP Forecasting Method By Tapas K. Mishra

  1. By: Kazuhiko Hayakawa; Eiji Kurozumi
    Abstract: In this paper, we consider the role of "leads" of the first difference of integrated variables in the dynamic OLS estimation of cointegrating regression models. We demonstrate that the role of leads is related to the concept of Granger causality and that in some cases leads are unnecessary in the dynamic OLS estimation of cointegrating regression models. Based on a Monte Carlo simulation, we find that the dynamic OLS estimator without leads substantially outperforms that with leads and lags; we therefore recommend testing for Granger noncausality before estimating models.
    Keywords: Cointegration, dynamic ordinary least squares estimator, Granger causality
    JEL: C13 C22
    Date: 2006–12
    URL: http://d.repec.org/n?u=RePEc:hst:hstdps:d06-194&r=ecm
  2. By: Stanislav Anatolyev (New Economic School)
    Abstract: We develop a methodology of parametric modeling of time series dynamics when the underlying loss function is linear-exponential (Linex). We propose to directly model the dynamics of the conditional expectation that determines the optimal predictor. The procedure hinges on the exponential quasi maximum likelihood interpretation of the Linex loss and nicely fits the multiple error modeling framework. Many conclusions relating to estimation, inference and forecasting follow from results already available in the econometric literature. The methodology is illustrated using data on United States GNP growth and Treasury bill returns.
    Keywords: Linear-exponential loss, optimal predictor, quasi maximum likelihood, multiple error model, autoregressive conditional durations
    JEL: C22 C51 C52
    Date: 2006–12
    URL: http://d.repec.org/n?u=RePEc:cfr:cefirw:w0092&r=ecm
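The Linex idea in this abstract can be made concrete with a small simulation. This is an illustrative sketch, not the paper's code: the function names and parameter values are ours, and it relies only on the standard result that under Gaussian errors the Linex-optimal forecast shifts the conditional mean by a*sigma^2/2.

```python
import math
import random

def linex(e, a=1.0):
    """Linex loss: exp(a*e) - a*e - 1; asymmetric in the sign of the error e."""
    return math.exp(a * e) - a * e - 1.0

# Under y ~ N(mu, sigma^2), the Linex-optimal predictor is mu + a*sigma^2/2,
# not the conditional mean itself (standard result; parameters illustrative).
random.seed(0)
a, mu, sigma = 1.0, 0.0, 1.0
draws = [random.gauss(mu, sigma) for _ in range(50_000)]

def expected_loss(pred):
    # Monte Carlo estimate of E[linex(y - pred)]
    return sum(linex(y - pred, a) for y in draws) / len(draws)

loss_mean = expected_loss(mu)                      # forecast = conditional mean
loss_opt = expected_loss(mu + a * sigma ** 2 / 2)  # Linex-optimal forecast
print(loss_mean > loss_opt)  # the shifted predictor attains lower average loss
```

Directly modeling the dynamics of this optimal predictor, rather than the conditional mean, is the core of the proposed methodology.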
  3. By: Eiji Kurozumi; Kazuhiko Hayakawa
    Abstract: In this paper, we analytically investigate three efficient estimators for cointegrating regression models: Phillips and Hansen's (1990) fully modified OLS estimator, Park's (1992) canonical cointegrating regression estimator, and Saikkonen's (1991) dynamic OLS estimator. First, by the Monte Carlo simulations, we demonstrate that these efficient methods do not work well when the regression errors are strongly serially correlated. In order to explain this result, we assume that the regression errors are generated from a nearly integrated autoregressive (AR) process with the AR coefficient approaching 1 at a rate of 1/T , where T is the sample size. We derive the limiting distributions of the three efficient estimators as well as the OLS estimator and show that they have the same limiting distribution under this assumption. This implies that the three efficient methods no longer work well when the regression errors are strongly serially correlated. Further, we consider the case where the AR coefficient in the regression errors approaches 1 at a rate slower than 1/T . In this case, the limiting distributions of the efficient estimators depend on the approaching rate. If the rate is slow enough, the efficiency is established for the three estimators; however, if the approaching rate is relatively fast, they have the same limiting distribution as the OLS estimator. This result explains why the effect of the efficient methods diminishes as the serial correlation in the regression errors gets stronger.
    Keywords: Cointegration, second-order bias, fully modified regressions, canonical cointegrating regressions, dynamic ordinary least squares regressions
    JEL: C13 C22
    Date: 2006–12
    URL: http://d.repec.org/n?u=RePEc:hst:hstdps:d06-197&r=ecm
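The local-to-unity device used in the paper's asymptotics can be sketched as follows (an assumption-laden illustration; the function name and the choice c = 5 are ours):

```python
import random

def near_integrated_errors(T, c=5.0, seed=1):
    """AR(1) errors u_t = rho*u_{t-1} + e_t with rho = 1 - c/T,
    so the AR coefficient approaches 1 at rate 1/T as T grows."""
    random.seed(seed)
    rho = 1.0 - c / T
    u, series = 0.0, []
    for _ in range(T):
        u = rho * u + random.gauss(0.0, 1.0)
        series.append(u)
    return series, rho

# As the sample size grows, rho approaches 1 and the regression errors
# become nearly indistinguishable from a unit-root process in finite samples.
for T in (50, 500, 5000):
    _, rho = near_integrated_errors(T)
    print(T, rho)
```

Errors generated this way are the setting in which the paper shows the efficient estimators collapse to the same limiting distribution as OLS.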
  4. By: Christopher A. Sims; Daniel F. Waggoner; Tao Zha
    Abstract: The inference for hidden Markov chain models in which the structure is a multiple-equation macroeconomic model raises a number of difficulties that are not as likely to appear in smaller models. One is likely to want to allow for many states in the Markov chain without allowing the number of free parameters in the transition matrix to grow as the square of the number of states but also without losing a convenient form for the posterior distribution of the transition matrix. Calculation of marginal data densities for assessing model fit is often difficult in high-dimensional models and seems particularly difficult in these models. This paper gives a detailed explanation of methods we have found to work to overcome these difficulties. It also makes suggestions for maximizing posterior density and initiating Markov chain Monte Carlo simulations that provide some robustness against the complex shape of the likelihood in these models. These difficulties and remedies are likely to be useful generally for Bayesian inference in large time-series models. The paper includes some discussion of model specification issues that apply particularly to structural vector autoregressions with a Markov-switching structure.
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:fip:fedawp:2006-22&r=ecm
  5. By: Erik Hjalmarsson
    Abstract: This paper analyzes panel data inference in predictive regressions with endogenous and nearly persistent regressors. The standard fixed effects estimator is shown to suffer from a second order bias; analytical results, as well as Monte Carlo evidence, show that the bias and resulting size distortions can be severe. New estimators, based on recursive demeaning as well as direct bias correction, are proposed and methods for dealing with cross sectional dependence in the form of common factors are also developed. Overall, the results show that the econometric issues associated with predictive regressions when using time-series data to a large extent also carry over to the panel case. However, practical solutions are more readily available when using panel data. The results are illustrated with an application to predictability in international stock indices.
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:fip:fedgif:869&r=ecm
  6. By: Teräsvirta, Timo (School of Economics and Management, University of Aarhus)
    Abstract: This paper surveys univariate models of conditional heteroskedasticity. The classical ARCH model is discussed, and various extensions of the standard GARCH model, including the exponential GARCH model, are highlighted. Stochastic volatility models remain outside this review.
    Keywords: ARCH; conditional heteroskedasticity; GARCH; nonlinear GARCH; volatility modelling
    JEL: C22
    Date: 2006–12–03
    URL: http://d.repec.org/n?u=RePEc:hhs:hastef:0646&r=ecm
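The standard GARCH(1,1) recursion surveyed here can be simulated in a few lines. A minimal sketch with illustrative parameter values of our choosing (the survey itself contains no code):

```python
import math
import random

def simulate_garch11(n, omega=0.1, alpha=0.1, beta=0.8, seed=42):
    """Simulate GARCH(1,1): sigma2_t = omega + alpha*eps_{t-1}^2 + beta*sigma2_{t-1},
    with standard normal innovations."""
    random.seed(seed)
    sigma2 = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    eps = []
    for _ in range(n):
        e = math.sqrt(sigma2) * random.gauss(0.0, 1.0)
        eps.append(e)
        sigma2 = omega + alpha * e * e + beta * sigma2
    return eps

returns = simulate_garch11(100_000)
sample_var = sum(e * e for e in returns) / len(returns)
print(sample_var)  # close to the unconditional variance omega/(1-alpha-beta) = 1.0
```

The persistence parameter alpha + beta = 0.9 here produces the volatility clustering that motivates the extensions the survey discusses.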
  7. By: Felix Chan; Tommaso Mancini-Griffoli; Laurent L. Pauwels (School of Economics and Finance, Curtin University of Technology; Paris-Jourdan Sciences Economiques (PSE), CEPREMAP; Hong Kong Monetary Authority and Graduate Institute of International Studies, Geneva)
    Abstract: This paper proposes a new test for structural instability in heterogeneous panels. The test builds on the seminal work of Andrews (2003) originally developed for time series. It is robust to non-normal, heteroskedastic and serially correlated errors, and allows for the number of post break observations to be small. Importantly, the test considers the alternative of a break affecting only some - and not all - individuals of the panel. Under mild assumptions the test statistic is shown to be asymptotically normal, thanks to the additional cross sectional dimension of panel data. This greatly facilitates the calculation of critical values. Monte Carlo experiments show that the test has good size and power under a wide range of circumstances. The test is then applied to investigate the effect of the Euro on trade.
    Keywords: Structural change, end-of-sample instability tests, heterogeneous panels, Monte Carlo, Euro effect on trade.
    JEL: C23 C52
    Date: 2006–10
    URL: http://d.repec.org/n?u=RePEc:gii:giihei:heiwp24-2006&r=ecm
  8. By: Joakim Westerlund (METEOR)
    Abstract: One of the most cited studies within the field of nonstationary panel data analysis is that of LLC (Levin, Lin and Chu, 2002. Unit Root Tests in Panel Data: Asymptotic and Finite-Sample Properties. Journal of Econometrics 108, 1-24), in which the authors propose a test for a common unit root in the panel. Using both theoretical arguments and simulation evidence, we show that this test generally suffers from serious bias when combined with most commonly used rules for lag length and bandwidth selection. To remedy this bias effect, we propose a slightly modified test that performs well in small samples and that is computationally more convenient than the LLC test.
    Keywords: econometrics;
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:dgr:umamet:2006056&r=ecm
  9. By: Patrik Guggenberger
    URL: http://d.repec.org/n?u=RePEc:cla:uclaol:400&r=ecm
  10. By: Luc, BAUWENS (UNIVERSITE CATHOLIQUE DE LOUVAIN, Department of Economics); Nikolaus, HAUTSCH
    Abstract: In this chapter, written for a forthcoming Handbook of Financial Time Series to be published by Springer-Verlag, we review the econometric literature on dynamic duration and intensity processes applied to high-frequency financial data, a literature boosted by the work of Engle and Russell (1997) on autoregressive duration models.
    Keywords: Duration, Intensity, Point process, High frequency data, ACD models
    JEL: C41 C32
    Date: 2006–09–18
    URL: http://d.repec.org/n?u=RePEc:ctl:louvec:2006039&r=ecm
  11. By: Cheti Nicoletti (Institute for Social and Economic Research); Concetta Rondinelli (Bocconi University)
    Abstract: The most popular statistical models among empirical researchers are usually those that can be estimated easily with commonly available software packages. Sequential binary models, with or without normal random effects, are one such example, because they can be used to estimate discrete time duration models in the presence of unobserved heterogeneity. But ease of estimation may come at a cost. In this paper we use Monte Carlo methods to analyze the consequences of omitting or misspecifying unobserved heterogeneity in single-spell discrete time duration models.
    Date: 2006–11
    URL: http://d.repec.org/n?u=RePEc:ese:iserwp:2006-53&r=ecm
  12. By: Massimo Guidolin; Carrie Fangzhou Na
    Abstract: We address one interesting case - the predictability of excess US asset returns from macroeconomic factors within a flexible regime switching VAR framework - in which the presence of regimes may lead to superior forecasting performance from forecast combinations. After having documented that forecast combinations provide gains in prediction accuracy and these gains are statistically significant, we show that combinations may substantially improve portfolio selection. We find that the best performing forecast combinations are those that either avoid estimating the pooling weights or that minimize the need for estimation. In practice, we report that the best performing combination schemes are based on the principle of relative, past forecasting performance. The economic gains from combining forecasts in portfolio management applications appear to be large.
    Keywords: Forecasting
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:fip:fedlwp:2006-059&r=ecm
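Combination schemes based on relative past forecasting performance, of the kind this abstract reports as best-performing, are often implemented as inverse-MSE weighting. A hedged sketch of that generic idea (function name and numbers are ours, not the paper's):

```python
def combination_weights(past_sq_errors):
    """Inverse-MSE weights (Bates-Granger style): each model's weight is
    proportional to the inverse of its past mean squared forecast error."""
    inv_mse = [len(errs) / sum(errs) for errs in past_sq_errors]
    total = sum(inv_mse)
    return [w / total for w in inv_mse]

# Two hypothetical models whose past squared errors give MSEs of 1.0 and 2.0:
weights = combination_weights([[0.5, 1.5], [2.0, 2.0]])
print(weights)  # roughly [0.667, 0.333]: weight tilts toward the better model
```

Schemes like this minimize the need to estimate pooling weights, which is the property the paper associates with good performance.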
  13. By: Joakim Westerlund (METEOR)
    Abstract: Most empirical evidence suggests that the Fisher effect, stating that inflation and nominal interest rates should cointegrate with a unit slope on inflation, does not hold, a finding at odds with many theoretical models. This paper argues that these results can be attributed in part to the low power of univariate tests, and that the use of panel data can generate more powerful tests. For this purpose, we propose two new panel cointegration tests that can be applied under very general conditions, and that are shown by simulation to be more powerful than other existing tests. These tests are applied to a panel of quarterly data covering 20 OECD countries between 1980 and 2004. The evidence suggests that the Fisher effect cannot be rejected once the panel evidence on cointegration has been taken into account.
    Keywords: econometrics;
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:dgr:umamet:2006054&r=ecm
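The Fisher relation the paper tests can be illustrated with a toy simulation of cointegration with a unit slope. This is a stylized sketch only (the paper uses panel cointegration tests, not a single OLS regression; all values below are ours):

```python
import random

# Inflation follows a random walk (I(1)); the nominal rate equals inflation
# plus a stationary error, i.e. the two series cointegrate with a unit slope.
random.seed(3)
T = 2000
level, inflation, nominal = 0.0, [], []
for _ in range(T):
    level += random.gauss(0.0, 1.0)
    inflation.append(level)
    nominal.append(level + random.gauss(0.0, 1.0))

# Under cointegration the OLS slope of the nominal rate on inflation is
# superconsistent and should sit very close to the unit Fisher slope.
mx = sum(inflation) / T
my = sum(nominal) / T
beta = sum((x - mx) * (y - my) for x, y in zip(inflation, nominal)) / \
       sum((x - mx) ** 2 for x in inflation)
print(beta)
```

The empirical difficulty the paper addresses is that univariate tests of this relation have low power, which panel data can restore.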
  14. By: Anthony Garratt (School of Economics, Mathematics & Statistics, Birkbeck); Kevin Lee
    Abstract: We evaluate the forecast performance of a range of theory-based and atheoretical models explaining exchange rates and interest rates in the US, the UK and Japan. The decision-making environment is fully described for an investor who optimally allocates portfolio shares to domestic and foreign assets. Methods necessary to compute and use forecasts in this context are proposed, including the means of combining density forecasts to deal with model uncertainty. An out-of-sample evaluation exercise covering the 1990s is described, comparing statistical criteria with decision-based criteria. The theory-based models are found to perform relatively well when their forecasts are judged by their economic value.
    Keywords: Model Averaging, Buy and Hold, Exchange rate and interest rate forecasts.
    JEL: C32 C53 E17
    Date: 2006–12
    URL: http://d.repec.org/n?u=RePEc:bbk:bbkefp:0616&r=ecm
  15. By: Anthony Garratt (School of Economics, Mathematics & Statistics, Birkbeck); Kevin Lee; Emi Mise; Kalvinder Shields
    Abstract: This paper describes an approach that accommodates in a coherent way three types of uncertainty when measuring the output gap. These are trend uncertainty (associated with the choice of model and de-trending technique), estimation uncertainty (with a given model) and data uncertainty (associated with the reliability of data). The approach employs VAR models to explain real time measures and realisations of output series jointly along with Bayesian-style 'model averaging' procedures. Probability forecasts provide a comprehensive representation of the output gap and the associated uncertainties in real time. The approach is illustrated using a real time dataset for the UK over 1961q2–2005q4.
    Keywords: Output gap, real time data, revisions, Hodrick-Prescott trend, exponential smoothing trend, moving average trend, model uncertainty, probability forecasts.
    JEL: E52 E58
    Date: 2006–12
    URL: http://d.repec.org/n?u=RePEc:bbk:bbkefp:0618&r=ecm
  16. By: Anthony Garratt (School of Economics, Mathematics & Statistics, Birkbeck); Kevin Lee; Emi Mise; Kalvinder Shields
    Abstract: Methods are described for the appropriate use of data obtained and analysed in real time to represent the output gap. The methods employ cointegrating VAR techniques to model real time measures and realisations of output series jointly. The model is used to mitigate the impact of data revisions; to generate appropriate forecasts that can deliver economically-meaningful output trends and that can take into account the end-of-sample problems associated with the use of the Hodrick-Prescott filter in measuring these trends; and to calculate probability forecasts that convey in a clear way the uncertainties associated with the gap measures. The methods are applied to data for the US 1965q4-2004q4 and the improvements over standard methods are illustrated.
    Keywords: Output gap measurement, real time data, data revision, HP end-points, probability forecasts.
    JEL: E52 E58
    Date: 2006–12
    URL: http://d.repec.org/n?u=RePEc:bbk:bbkefp:0619&r=ecm
  17. By: Tapas K. Mishra
    Abstract: Demography-based income forecasting has recently gained enormous popularity. Malmberg and Lindh (ML, 2005), in an important contribution, forecast global income by incorporating demographic age information, with the variables assumed to be stationary. Drawing on insights from recent theoretical and empirical advances, in this paper we re-examine the stationarity assumption and argue in favour of a more flexible framework in which 'stationarity' is a limiting condition of stochastic demographic behavior. Building on Mishra and Urbain (2005), where we showed that age-specific population series display varied long-term and short-term dynamics, we apply this idea to long-term projections (to 2050) of per capita income for a set of developed and developing countries and of world income. We find that GDP forecasts that incorporate demographic information are higher than those without it, a result consistent with ML, but that embedding the 'memory' features of demographic variables leads to higher forecasts than ML. The paper also draws out the relevance of stochastic shocks for GDP forecasting and discusses the implications of these forecasts in the presence of fluctuating age shares in those countries.
    Keywords: Global income forecasting, Long memory, Demographic components, Economic growth.
    JEL: C13 E32 E43 E63 J11 C33 O47
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:ulp:sbbeta:2006-17&r=ecm

This nep-ecm issue is ©2006 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.