nep-ecm New Economics Papers
on Econometrics
Issue of 2006‒08‒05
twenty-two papers chosen by
Sune Karlsson
Örebro University

  1. Time Series of Count Data : Modelling and Estimation By Jung, Robert; Kukuk, Martin; Liesenfeld, Roman
  2. A BOOTSTRAP APPROACH TO TEST THE CONDITIONAL SYMMETRY IN TIME SERIES MODELS By Alicia Pérez Alonso
  3. Forecasting German GDP using alternative factor models based on large datasets By Schumacher, Christian
  4. Alternative distributions for observation driven count series models By Drescher, Daniel
  5. "Empirical Likelihood Methods in Econometrics: Theory and Practice" By Yuichi Kitamura
  6. CONSISTENT SPECIFICATION TEST FOR ORDERED DISCRETE CHOICE MODELS By Juan Mora; Ana I. Moro
  7. Improving MCMC Using Efficient Importance Sampling By Liesenfeld, Roman; Richard, Jean-François
  8. Panels with Nonstationary Multifactor Error Structures By G. Kapetanios; M. Hashem Pesaran; T. Yamagata
  9. Unit roots and cointegration in panels By Breitung, Jörg; Pesaran, M. Hashem
  10. "A Two-Stage Plug-In Bandwidth Selection and Its Implementation for Covariance Estimation" By Masayuki Hirukawa
  11. Dynamic factor models By Breitung, Jörg; Eickmeier, Sandra
  12. Forecasting and Combining Competing Models of Exchange Rate Determination By Carlo Altavilla; Paul De Grauwe
  13. Asymptotic distribution of linear unbiased estimators in the presence of heavy-tailed stochastic regressors and residuals By Kurz-Kim, Jeong-Ryeol; Rachev, Svetlozar T.; Samorodnitsky, Gennady
  14. Testing Temporal Disaggregation By Christian Müller
  15. An Evaluation of the World Economic Outlook Forecasts By Allan Timmermann
  16. Using Monthly Indicators to Predict Quarterly GDP By Isabel Yi Zheng; James Rossiter
  17. BEVERIDGE-NELSON DECOMPOSITION WITH MARKOV SWITCHING By Chin Nam Low; Heather Anderson; Ralph Snyder
  18. Testing for Parameter Stability in Dynamic Models Across Frequencies By Bertrand Candelon; Gianluca Cubadda
  19. Forecasting stock market volatility with macroeconomic variables in real time By Döpke, Jörg; Hartmann, Daniel; Pierdzioch, Christian
  20. Forecasting the price of crude oil via convenience yield predictions By Knetsch, Thomas A.
  21. The forecast ability of risk-neutral densities of foreign exchange By Craig, Ben; Keller, Joachim
  22. On the Model Based Interpretation of Filters and the Reliability of Trend-Cycle Estimates By Tommaso Proietti

  1. By: Jung, Robert; Kukuk, Martin; Liesenfeld, Roman
    Abstract: This paper compares various models for time series of counts which can account for discreteness, overdispersion and serial correlation. Besides observation- and parameter-driven models based upon corresponding conditional Poisson distributions, we also consider a dynamic ordered probit model as a flexible specification to capture the salient features of time series of counts. For all models, we present appropriate efficient estimation procedures. For parameter-driven specifications this requires Monte Carlo procedures like simulated maximum likelihood or Markov chain Monte Carlo. The methods, including corresponding diagnostic tests, are illustrated with data on daily admissions for asthma to a single hospital.
    Keywords: Efficient Importance Sampling, GLARMA, Markov Chain Monte Carlo, Observation-driven model, Parameter-driven model, Ordered Probit
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:zbw:cauewp:3194&r=ecm
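As a rough illustration of the observation-driven idea discussed above, here is a minimal Poisson count model with GLARMA-style feedback; the recursion and all parameter values are invented for the example and are not the authors' specification:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative GLARMA(1,0)-style recursion: the log-intensity reacts to
# the past Pearson residual, creating serial correlation in the counts.
T, omega, gamma = 500, 1.0, 0.4
y = np.zeros(T, dtype=int)
z = 0.0                                      # lagged residual feedback
for t in range(T):
    lam = np.exp(omega + gamma * z)          # conditional Poisson mean
    y[t] = rng.poisson(lam)
    z = (y[t] - lam) / np.sqrt(lam)          # standardized (Pearson) residual

print("mean:", y.mean(), "variance:", y.var())  # marginal overdispersion
```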
  2. By: Alicia Pérez Alonso (Universidad de Alicante)
    Abstract: This paper discusses how to test for conditional symmetry in time series regression models. To that end, we utilize the Bai and Ng test. We also examine the performance of some popular (unconditional) symmetry tests for observations when applied to regression residuals. The tests considered include the coefficient of skewness, a joint test of the third and fifth moments, the Runs test, the Wilcoxon signed-rank test and the Triples test. An easy-to-implement symmetric bootstrap procedure is proposed to calculate critical values for these tests. Consistency of the bootstrap procedure is shown. A simple Monte Carlo experiment is conducted to explore the finite-sample properties of all the tests.
    Keywords: Near Epoch Dependence; Nonparametric tests; Conditional symmetry; Bootstrap; Monte Carlo simulation
    JEL: C12 C15 C22
    Date: 2006–07
    URL: http://d.repec.org/n?u=RePEc:ivi:wpasad:2006-18&r=ecm
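The core of a symmetric bootstrap can be sketched in a few lines: multiplying residuals by random signs imposes symmetry, so the resampled data satisfy the null by construction. The data-generating process and the choice of a skewness statistic below are illustrative assumptions only:

```python
import numpy as np

rng = np.random.default_rng(1)

def skew_stat(e):
    e = e - e.mean()
    return abs(np.mean(e**3) / np.mean(e**2) ** 1.5)

# Stand-in for regression residuals: a symmetric heavy-tailed sample.
e = rng.standard_t(df=8, size=300)
stat = skew_stat(e)

# Symmetric bootstrap: random sign flips impose the symmetry null.
B = 999
boot = np.array([skew_stat(rng.choice([-1.0, 1.0], size=e.size) * e)
                 for _ in range(B)])
print("bootstrap p-value:", (1 + np.sum(boot >= stat)) / (B + 1))
```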
  3. By: Schumacher, Christian
    Abstract: This paper discusses the forecasting performance of alternative factor models based on a large panel of quarterly time series for the German economy. The first model extracts factors by static principal components analysis, the second is based on dynamic principal components obtained using frequency domain methods, and the third is based on a subspace algorithm for state space models. Out-of-sample forecasts show that the prediction errors of the factor models are generally smaller than the errors of simple autoregressive benchmark models. Among the factor models, either the dynamic principal component model or the subspace factor model ranks highest in terms of forecast accuracy in most cases. However, neither of the dynamic factor models provides better forecasts than the static model over all forecast horizons and specifications of the simulation design. Therefore, the dynamic factor models seem to provide only small forecasting improvements over the static factor model for forecasting German GDP.
    Keywords: Factor models, static and dynamic factors, principal components, forecasting accuracy
    JEL: C43 C51 E32
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:zbw:bubdp1:4218&r=ecm
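A minimal sketch of the static principal components benchmark (Stock-Watson style), with invented data standing in for the large German panel; only the mechanics are shown:

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented stand-in for the large quarterly panel: N series driven by
# two common factors, plus a GDP growth series loading on the factors.
T, N, r = 120, 80, 2
F = rng.standard_normal((T, r))
X = F @ rng.standard_normal((r, N)) + 0.5 * rng.standard_normal((T, N))
gdp = F @ np.array([0.8, -0.3]) + 0.2 * rng.standard_normal(T)

# Static principal components: factor estimates are the leading left
# singular vectors of the standardized panel.
Z = (X - X.mean(0)) / X.std(0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
Fhat = U[:, :r] * np.sqrt(T)

# One-step-ahead forecast: regress GDP growth on lagged factor estimates.
W = np.column_stack([np.ones(T - 1), Fhat[:-1]])
beta = np.linalg.lstsq(W, gdp[1:], rcond=None)[0]
print("next-quarter forecast:", np.r_[1.0, Fhat[-1]] @ beta)
```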
  4. By: Drescher, Daniel
    Abstract: Observation-driven models provide a flexible framework for modelling time series of counts and are able to capture a wide range of dependence structures. Many applications in this field are concerned with count series whose conditional distribution, given past observations and explanatory variables, is assumed to follow a Poisson distribution. This assumption is very convenient since the Poisson distribution is simple and leads to models which are easy to implement. On the other hand, it is often too restrictive since it implies equidispersion, the property that the conditional mean equals the conditional variance, which is frequently violated in empirical applications. Therefore more flexible distributions which allow for overdispersion or underdispersion should be used. This paper is concerned with the use of alternative distributions in the framework of observation-driven count series models. Different count distributions and their properties are reviewed and used for modelling. The models under consideration are applied to a time series of daily counts of asthma presentations at a Sydney hospital, a data set already analyzed by Davis et al. (1999, 2000). The Poisson-GLARMA model proposed by these authors is used as a benchmark. This paper extends the work of Davis et al. (1999) to distributions which are nested in either the generalized negative binomial or the generalized Poisson distribution. Additionally, maximum likelihood estimation for observation-driven models with generalized distributions is presented.
    Keywords: Count series, observation-driven models, GLARMA, discrete distributions
    JEL: C13 C22 C25
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:zbw:cauewp:3197&r=ecm
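One of the families mentioned above, Consul's generalized Poisson distribution, shows how a single extra parameter buys over- or underdispersion; a small sketch with invented parameter values:

```python
from math import exp, lgamma, log

def gen_poisson_logpmf(x, theta, lam):
    """Log-pmf of the generalized Poisson: lam > 0 gives overdispersion,
    lam < 0 underdispersion, lam = 0 the ordinary Poisson(theta)."""
    return (log(theta) + (x - 1) * log(theta + lam * x)
            - (theta + lam * x) - lgamma(x + 1))

theta, lam = 2.0, 0.3
print("mean:", theta / (1 - lam))              # ~2.857
print("variance:", theta / (1 - lam) ** 3)     # ~5.831, exceeds the mean
print("P(X=x), x=0..3:",
      [round(exp(gen_poisson_logpmf(x, theta, lam)), 4) for x in range(4)])
```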
  5. By: Yuichi Kitamura (Department of Economics, Yale University)
    Abstract: Recent developments in empirical likelihood (EL) methods are reviewed. First, to put the method in perspective, two interpretations of empirical likelihood are presented, one as a nonparametric maximum likelihood estimation method (NPMLE) and the other as a generalized minimum contrast estimator (GMC). The latter interpretation provides a clear connection between EL, GMM, GEL and other related estimators. Second, EL is shown to have various advantages over other methods. The theory of large deviations demonstrates that EL emerges naturally in achieving asymptotic optimality both for estimation and testing. Interestingly, higher order asymptotic analysis also suggests that EL is generally a preferred method. Third, extensions of EL are discussed in various settings, including estimation of conditional moment restriction models, nonparametric specification testing and time series models. Finally, practical issues in applying EL to real data, such as computational algorithms for EL, are discussed. Numerical examples to illustrate the efficacy of the method are presented.
    Date: 2006–06
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2006cf430&r=ecm
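For the simplest moment restriction, a scalar mean, empirical likelihood reduces to one-dimensional root finding for the Lagrange multiplier; a minimal sketch on simulated data:

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def el_ratio(x, mu0):
    """Empirical likelihood ratio statistic for H0: E[X] = mu0.
    Implied probabilities are p_i = 1 / (n * (1 + lam * (x_i - mu0)))."""
    d = x - mu0
    lo, hi = -1 / d.max() + 1e-10, -1 / d.min() - 1e-10  # keep all p_i > 0
    lam = brentq(lambda l: np.sum(d / (1 + l * d)), lo, hi)
    return 2 * np.sum(np.log1p(lam * d))      # asymptotically chi2(1)

rng = np.random.default_rng(3)
x = rng.exponential(scale=1.0, size=200)      # true mean is 1
stat = el_ratio(x, mu0=1.0)
print("ELR:", stat, "p-value:", chi2.sf(stat, df=1))
```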
  6. By: Juan Mora (Universidad de Alicante); Ana I. Moro (Universidad de Granada)
    Abstract: We discuss how to test consistently the specification of an ordered discrete choice model. Two approaches are considered: tests based on conditional moment restrictions and tests based on comparisons between parametric and nonparametric estimations. Following these approaches, various statistics are proposed and their asymptotic properties are discussed. The performance of the statistics is compared by means of simulations. A variant of the standard conditional moment statistic and a generalization of Horowitz-Spokoiny's statistic perform best.
    Keywords: Specification Tests; Ordered Discrete Choice Models; Statistical Simulation
    JEL: C25 C52 C15
    Date: 2006–07
    URL: http://d.repec.org/n?u=RePEc:ivi:wpasad:2006-17&r=ecm
  7. By: Liesenfeld, Roman; Richard, Jean-François
    Abstract: This paper develops a systematic Markov Chain Monte Carlo (MCMC) framework based upon Efficient Importance Sampling (EIS) which can be used for the analysis of a wide range of econometric models involving integrals without an analytical solution. EIS is a simple, generic and yet accurate Monte Carlo integration procedure based on sampling densities which are chosen to be global approximations to the integrand. By embedding EIS within MCMC procedures based on Metropolis-Hastings (MH), one can significantly improve their numerical properties, essentially by providing a fully automated selection of critical MCMC components such as auxiliary sampling densities, normalizing constants and starting values. The potential of this integrated MCMC-EIS approach is illustrated with simple univariate integration problems and with the Bayesian posterior analysis of stochastic volatility models and stationary autoregressive processes.
    Keywords: Autoregressive models, Bayesian posterior analysis, Dynamic latent variables, Gibbs sampling, Metropolis Hastings, Stochastic volatility
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:zbw:cauewp:4349&r=ecm
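The core EIS step can be sketched for a one-dimensional integral: iteratively refit a Gaussian sampler by least squares so that its log-kernel globally approximates the log-integrand. The integrand below is an invented toy example, not one of the paper's applications:

```python
import numpy as np

rng = np.random.default_rng(4)

def log_phi(x):
    # Toy "intractable" unnormalized density; target: Z = integral of phi.
    return -0.5 * x**4 + 2.0 * x

# EIS iterations: fit the Gaussian log-kernel a + b*x + c*x^2 to log phi
# by least squares on draws from the current sampler, then update it.
m, s = 0.0, 1.0
for _ in range(5):
    x = m + s * rng.standard_normal(2000)
    A = np.column_stack([np.ones_like(x), x, x**2])
    a, b, c = np.linalg.lstsq(A, log_phi(x), rcond=None)[0]
    s = np.sqrt(-0.5 / c)      # Gaussian log-kernel has c = -1/(2 s^2)
    m = b * s**2               # ... and b = m / s^2

# Importance-sampling estimate of Z using the fitted EIS sampler.
x = m + s * rng.standard_normal(20000)
log_q = -0.5 * ((x - m) / s) ** 2 - np.log(s * np.sqrt(2 * np.pi))
w = np.exp(log_phi(x) - log_q)
print("Z estimate:", w.mean(), "weight CV:", w.std() / w.mean())
```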
  8. By: G. Kapetanios; M. Hashem Pesaran; T. Yamagata
    Abstract: The presence of cross-sectionally correlated error terms invalidates much inferential theory of panel data models. Recent work by Pesaran (2006) suggests a method which makes use of cross-sectional averages to provide valid inference for stationary panel regressions with multifactor error structure. This paper extends this work and examines the important case where the unobserved common factors follow unit root processes and could be cointegrated. It is found that the presence of unit roots does not affect most theoretical results which continue to hold irrespective of the integration and the cointegration properties of the unobserved factors. This finding is further supported for small samples via an extensive Monte Carlo study. In particular, the results of the Monte Carlo study suggest that the cross-sectional average based method is robust to a wide variety of data generation processes and has lower biases than all of the alternative estimation methods considered in the paper.
    Keywords: Cross Section Dependence, Large Panels, Unit Roots, Principal Components, Common Correlated Effects
    JEL: C12 C13 C33
    Date: 2006–08
    URL: http://d.repec.org/n?u=RePEc:cam:camdae:0651&r=ecm
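A minimal sketch of the cross-sectional-average idea (the CCE mean group estimator of Pesaran (2006)) with a simulated unit-root common factor; dimensions and parameter values are invented:

```python
import numpy as np

rng = np.random.default_rng(5)
N, T = 30, 100
f = np.cumsum(rng.standard_normal(T))            # unobserved I(1) factor

beta = 1.0 + 0.1 * rng.standard_normal(N)        # heterogeneous slopes
gx, gy = rng.standard_normal(N), rng.standard_normal(N)  # factor loadings
x = gx[:, None] * f + rng.standard_normal((N, T))
y = beta[:, None] * x + gy[:, None] * f + rng.standard_normal((N, T))

# CCE: cross-sectional averages of y and x proxy the unobserved factor;
# augment each unit's regression with them, then average the slopes.
ybar, xbar = y.mean(axis=0), x.mean(axis=0)
b = np.empty(N)
for i in range(N):
    Z = np.column_stack([np.ones(T), x[i], ybar, xbar])
    b[i] = np.linalg.lstsq(Z, y[i], rcond=None)[0][1]
print("CCE mean-group slope:", b.mean(), "(true average: 1.0)")
```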
  9. By: Breitung, Jörg; Pesaran, M. Hashem
    Abstract: This paper provides a review of the literature on unit roots and cointegration in panels where the time dimension (T) and the cross section dimension (N) are relatively large. It distinguishes between the first generation tests, developed on the assumption of cross section independence, and the second generation tests that allow, in a variety of forms and degrees, for the dependence that might prevail across the different units in the panel. In the analysis of cointegration, the hypothesis testing and estimation problems are further complicated by the possibility of cross section cointegration, which could arise if the unit roots in the different cross section units are due to common random walk components.
    Keywords: Panel Unit Roots, Panel Cointegration, Cross Section Dependence, Common Effects
    JEL: C12 C15 C22 C23
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:zbw:bubdp1:4236&r=ecm
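The basic ingredient of a first-generation panel unit root test can be sketched as follows: average the individual ADF t-statistics across (assumed independent) units, as in the Im-Pesaran-Shin t-bar. The standardization against tabulated moments is omitted, so this is only the raw statistic:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(6)
N, T = 20, 200

# Panel of independent random walks: the unit-root null is true here.
y = np.cumsum(rng.standard_normal((N, T)), axis=1)

# t-bar: the cross-sectional average of individual ADF t-statistics.
t_bar = np.mean([adfuller(y[i], regression="c", autolag="AIC")[0]
                 for i in range(N)])
print("t-bar:", t_bar)
```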
  10. By: Masayuki Hirukawa (Department of Economics, Concordia University and CIREQ)
    Abstract: To improve two existing bandwidth choice rules for kernel HAC estimation by Andrews (1991) and Newey and West (1994), this paper proposes to estimate an unknown quantity in the optimal bandwidth (called the normalized curvature) with a general class of kernels and derives the bandwidth that minimizes the asymptotic mean squared error of this estimator. The theory of the two-stage plug-in bandwidth selection and a reliable implementation method are developed. It is shown that the optimal bandwidth for the kernel-smoothed normalized curvature estimator should diverge at a slower rate than the one for the HAC estimator with the same kernel. Finite sample performances of the new HAC estimator are assessed through Monte Carlo simulations.
    Date: 2006–06
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2006cf431&r=ecm
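For reference, a minimal Bartlett-kernel HAC (long-run variance) estimator with a common first-stage bandwidth rule of thumb; the paper's data-driven second stage for the bandwidth is not reproduced here:

```python
import numpy as np

def hac_lrv(u, bandwidth):
    """Bartlett-kernel HAC estimate of the long-run variance of u."""
    u = u - u.mean()
    T = len(u)
    lrv = u @ u / T
    for j in range(1, int(bandwidth) + 1):
        w = 1 - j / (bandwidth + 1)          # Bartlett weights
        lrv += 2 * w * (u[j:] @ u[:-j]) / T
    return lrv

rng = np.random.default_rng(7)
e = rng.standard_normal(1001)
u = e[1:] + 0.7 * e[:-1]                     # MA(1): true LRV = (1+0.7)^2

bw = int(4 * (len(u) / 100) ** (2 / 9))      # simple rule-of-thumb bandwidth
print("HAC LRV:", hac_lrv(u, bw), "true:", (1 + 0.7) ** 2)
```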
  11. By: Breitung, Jörg; Eickmeier, Sandra
    Abstract: Factor models can cope with many variables without running into scarce degrees of freedom problems often faced in a regression-based analysis. In this article we review recent work on dynamic factor models that have become popular in macroeconomic policy analysis and forecasting. By means of an empirical application we demonstrate that these models turn out to be useful in investigating macroeconomic problems.
    Keywords: Principal components, dynamic factors, forecasting
    JEL: C13 C33 C51
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:zbw:bubdp1:4232&r=ecm
  12. By: Carlo Altavilla; Paul De Grauwe
    Abstract: This paper investigates the out-of-sample forecast performance of a set of competing models of exchange rate determination. We compare standard linear models with models that characterize the relationship between the exchange rate and its underlying fundamentals by nonlinear dynamics. Linear models tend to outperform at short forecast horizons, especially when deviations from long-term equilibrium are small. In contrast, nonlinear models with more elaborate mean-reverting components dominate at longer horizons, especially when deviations from long-term equilibrium are large. The results also suggest that combining different forecasting procedures generally produces more accurate forecasts than can be attained from a single model.
    Keywords: non-linearity, exchange rate modelling, forecasting
    JEL: C53 F31
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:ces:ceswps:_1747&r=ecm
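A minimal sketch of forecast combination with inverse-MSE (Bates-Granger style) weights; the two simulated "models" below merely stand in for the paper's linear and nonlinear specifications:

```python
import numpy as np

def combine(forecasts, actual_past, forecasts_past):
    """Weight competing forecasts inversely to their past MSEs."""
    mse = np.mean((forecasts_past - actual_past[:, None]) ** 2, axis=0)
    w = (1 / mse) / np.sum(1 / mse)
    return forecasts @ w, w

rng = np.random.default_rng(8)
actual = rng.standard_normal(100)
f_lin = actual + 0.5 * rng.standard_normal(100)   # "linear" model errors
f_nl = actual + 0.8 * rng.standard_normal(100)    # "nonlinear" model errors

past = np.column_stack([f_lin, f_nl])[:80]        # training window
comb, w = combine(np.array([f_lin[80], f_nl[80]]), actual[:80], past)
print("weights:", w, "combined forecast:", comb)
```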
  13. By: Kurz-Kim, Jeong-Ryeol; Rachev, Svetlozar T.; Samorodnitsky, Gennady
    Abstract: Under the symmetric α-stable distributional assumption for the disturbances, Blattberg et al. (1971) consider unbiased linear estimators for a regression model with non-stochastic regressors. We consider both the rate of convergence to the true value and the asymptotic distribution of the normalized error of the linear unbiased estimators. In doing so, we allow the regressors to be stochastic and the disturbances to be heavy-tailed with either finite or infinite variances, where the tail-thickness parameters of the regressors and disturbances may differ.
    Keywords: Asymptotic distribution, rate of convergence, stochastic regressor, stable non-Gaussian, finite or infinite variance, heavy tails
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:zbw:bubdp1:4215&r=ecm
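A small simulation sketch of the setting: the OLS slope when both the regressor and the disturbance are symmetric α-stable with different tail indices (infinite variance for α < 2). All parameter values are invented, and scipy's stable sampler is assumed available:

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(9)

# Monte Carlo distribution of the OLS slope under heavy tails.
a_x, a_u, n, reps = 1.8, 1.5, 200, 200
slopes = np.empty(reps)
for r in range(reps):
    x = levy_stable.rvs(a_x, 0.0, size=n, random_state=rng)
    u = levy_stable.rvs(a_u, 0.0, size=n, random_state=rng)
    slopes[r] = np.polyfit(x, 1.0 + 2.0 * x + u, 1)[0]

# The usual Gaussian asymptotics fail; note the heavy-tailed spread.
print("median:", np.median(slopes),
      "1%/99% quantiles:", np.quantile(slopes, [0.01, 0.99]))
```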
  14. By: Christian Müller (Swiss Institute for Business Cycle Research (KOF), Swiss Federal Institute of Technology Zurich (ETH))
    Abstract: Economists and econometricians very often work with data which have been temporally disaggregated prior to use. Hence, the quality of the disaggregation clearly affects the quality of the analyses. Building on Chow and Lin's (1971) disaggregation model, this paper proposes a new estimation approach and a specification test which assesses the quality of the disaggregation model. An advantage of the proposal is that estimation and testing can both be pursued using the aggregated data, while the standard method requires a mixture of high- and low-frequency data. A small simulation study shows that the test indeed provides useful information.
    Keywords: temporal disaggregation, restricted ARMA
    JEL: F31 F47 C53
    Date: 2006–04
    URL: http://d.repec.org/n?u=RePEc:kof:wpskof:06-134&r=ecm
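A sketch of Chow-Lin-type disaggregation, simplified to the white-noise-residual case (the original model uses AR(1) GLS residuals); the data are simulated for illustration:

```python
import numpy as np

def chow_lin_wn(y_low, X_high, agg=3):
    """Temporal disaggregation with white-noise residuals: regress the
    aggregated indicator on the low-frequency totals, then distribute
    each low-frequency residual over its sub-periods via C'(CC')^{-1}."""
    n = len(y_low)
    C = np.kron(np.eye(n), np.ones(agg))          # aggregation matrix
    Xl = C @ X_high
    beta = np.linalg.lstsq(Xl, y_low, rcond=None)[0]
    resid = y_low - Xl @ beta
    return X_high @ beta + C.T @ np.linalg.solve(C @ C.T, resid)

rng = np.random.default_rng(10)
x = np.cumsum(rng.standard_normal(36)) + 10       # monthly indicator
y_m = 2.0 * x + rng.standard_normal(36)           # true monthly series
y_q = y_m.reshape(12, 3).sum(axis=1)              # observed quarterly sums
X = np.column_stack([np.ones(36), x])
print("max abs error:", np.abs(chow_lin_wn(y_q, X) - y_m).max())
```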
  15. By: Allan Timmermann
    Abstract: The World Economic Outlook (WEO) is a key source of forecasts of global economic conditions. It is therefore important to review the performance of these forecasts against both actual outcomes and alternative forecasts. This paper conducts a series of statistical tests to evaluate the quality of the WEO forecasts for a very large cross section of countries, with particular emphasis on the recent recession and recovery. It assesses whether forecasts were unbiased and informationally efficient, and characterizes the process whereby WEO forecasts get revised as the forecast date draws closer. Finally, the paper assesses whether forecasts can be improved by combining WEO forecasts with the Consensus forecasts. The results suggest that the performance of the WEO forecasts is similar to that of the Consensus forecasts. While WEO forecasts for many variables in many countries meet basic quality standards in some, if not all, dimensions, the paper raises a number of concerns about current forecasting performance.
    Keywords: World Economic Outlook, Economic forecasting, Economic conditions
    Date: 2006–03–15
    URL: http://d.repec.org/n?u=RePEc:imf:imfwpa:06/59&r=ecm
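Forecast unbiasedness and efficiency are commonly checked with a Mincer-Zarnowitz regression; a minimal sketch on simulated data (not the WEO data):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
forecast = 2.0 + rng.standard_normal(60)
actual = 0.2 + 0.9 * forecast + 0.5 * rng.standard_normal(60)  # biased DGP

# Mincer-Zarnowitz regression: actual = a + b * forecast + error.
# Unbiasedness corresponds to the joint hypothesis a = 0, b = 1.
res = sm.OLS(actual, sm.add_constant(forecast)).fit()
print(res.params)
print(res.f_test("const = 0, x1 = 1"))
```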
  16. By: Isabel Yi Zheng; James Rossiter
    Abstract: The authors build a model for predicting current-quarter real gross domestic product (GDP) growth using anywhere from zero to three months of indicators from that quarter. Their equation links quarterly Canadian GDP growth with monthly data on retail sales, housing starts, consumer confidence, total hours worked, and U.S. industrial production. The authors use time-series methods to forecast missing observations of the monthly indicators; this allows them to assess the performance of the method under various amounts of monthly information. The authors' model forecasts GDP growth as early as the first month of the reference quarter, and its accuracy generally improves with incremental monthly data releases. The final forecast from the model, available five to six weeks before the release of the National Income and Expenditure Accounts, delivers improved accuracy relative to those of several macroeconomic models used for short-term forecasting of Canadian output. The implications of real-time versus pseudo-real-time forecasting are investigated, and the authors find that the choice between real-time and latest-available data affects the performance ranking among alternative models.
    Keywords: Economic models; Econometric and statistical methods
    JEL: C22 C53
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:bca:bocawp:06-26&r=ecm
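A stylized bridge-equation sketch of the mechanics: fill in the missing months of an indicator with an AR(1) forecast, aggregate to the quarter, and plug into the quarterly regression. The single indicator and all parameters are invented, not the authors' specification:

```python
import numpy as np

rng = np.random.default_rng(12)

# Invented monthly indicator with AR(1) dynamics; quarterly GDP growth
# loads on its within-quarter average.
m = np.zeros(120)
for t in range(1, 120):
    m[t] = 0.7 * m[t - 1] + rng.standard_normal()
q_avg = m.reshape(40, 3).mean(axis=1)
gdp = 0.5 + 0.8 * q_avg + 0.3 * rng.standard_normal(40)

# Bridge equation estimated on the 39 complete quarters.
A = np.column_stack([np.ones(39), q_avg[:39]])
a, b = np.linalg.lstsq(A, gdp[:39], rcond=None)[0]

# Current quarter: only its first month (t = 117) is observed; forecast
# the two missing months with a fitted AR(1), aggregate, and plug in.
phi = np.linalg.lstsq(m[:117, None], m[1:118], rcond=None)[0][0]
qhat = m[117] * (1 + phi + phi**2) / 3
print("nowcast:", a + b * qhat, "actual:", gdp[39])
```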
  17. By: Chin Nam Low; Heather Anderson; Ralph Snyder
    Abstract: This paper considers Beveridge-Nelson decomposition in a context where the permanent and transitory components both follow a Markov switching process. Our approach incorporates Markov switching into a single source of error state-space framework, allowing business cycle asymmetries and regime switches in the long-run multiplier.
    JEL: C22 C51 E32
    Date: 2006–07
    URL: http://d.repec.org/n?u=RePEc:pas:camaaa:2006-18&r=ecm
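A sketch of the single-regime building block: the Beveridge-Nelson decomposition when output growth is AR(1). The paper's contribution, letting these parameters follow a Markov chain, is not reproduced here:

```python
import numpy as np

def bn_trend_ar1(y):
    """Beveridge-Nelson permanent component when the first difference
    of y is AR(1): tau_t = y_t + phi/(1-phi) * (dy_t - mu)."""
    dy = np.diff(y)
    d = dy - dy.mean()
    phi = (d[1:] @ d[:-1]) / (d[:-1] @ d[:-1])   # AR(1) coefficient
    return y[1:] + phi / (1 - phi) * d

rng = np.random.default_rng(13)
dy = np.zeros(400)
for t in range(1, 400):
    dy[t] = 0.3 + 0.4 * dy[t - 1] + rng.standard_normal()
y = np.cumsum(dy)                                # integrated series
trend = bn_trend_ar1(y)
print("std of transitory component:", (y[1:] - trend).std())
```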
  18. By: Bertrand Candelon (University of Maastricht - Department of Economics); Gianluca Cubadda (University of Rome II - Department of Financial and Quantitative Economics)
    Abstract: This paper contributes to the econometric literature on structural breaks by proposing a test for parameter stability in VAR models at a particular frequency ω, where ω ∈ [0, π]. When a dynamic model is affected by a structural break, the new tests allow for detecting which frequencies of the data are responsible for parameter instability. If the model is locally stable at the frequencies of interest, the whole sample size can then be exploited despite the presence of a break. Two empirical examples illustrate that local stability can concern only the lower frequencies (the change in U.S. monetary policy in the early 1980s) or the higher frequencies (the decrease in postwar U.S. productivity).
    Keywords: Structural breaks, spectral analysis, productivity slowdown, yield curve
    JEL: C32 E43
    Date: 2006–05–31
    URL: http://d.repec.org/n?u=RePEc:rtv:ceisrp:82&r=ecm
  19. By: Döpke, Jörg; Hartmann, Daniel; Pierdzioch, Christian
    Abstract: We compared forecasts of stock market volatility based on real-time and revised macroeconomic data. To this end, we used a new dataset on monthly real-time macroeconomic variables for Germany. The dataset covers the period 1994-2005. We used a statistical, a utility-based, and an options-based criterion to evaluate volatility forecasts. Our main result is that the statistical and economic value of volatility forecasts based on real-time data is comparable to the value of forecasts based on revised macroeconomic data.
    Keywords: Forecasting stock market volatility, Real-time macroeconomic data, Evaluation of forecasting accuracy
    JEL: C53 E44 G11
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:zbw:bubdp2:4357&r=ecm
  20. By: Knetsch, Thomas A.
    Abstract: The paper develops an oil price forecasting technique which is based on the present value model of rational commodity pricing. The approach suggests shifting the forecasting problem to the marginal convenience yield which can be derived from the cost-of-carry relationship. In a recursive out-of-sample analysis, forecast accuracy at horizons within one year is checked by the root mean squared error as well as the mean error and the frequency of a correct direction-of-change prediction. For all criteria employed, the proposed forecasting tool outperforms the approach of using futures prices as direct predictors of future spot prices. Vis-à-vis the random-walk model, it does not significantly improve forecast accuracy but provides valuable statements on the direction of change.
    Keywords: oil price forecasts, rational commodity pricing, convenience yield, single-equation model
    JEL: C22 E37 G12 G13 Q40
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:zbw:bubdp1:4353&r=ecm
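The cost-of-carry relation F = S·exp((r − δ)τ) can be inverted for the net convenience yield δ, the quantity the paper shifts the forecasting problem to; a small sketch with invented numbers (not taken from the paper):

```python
import numpy as np

def convenience_yield(spot, futures, r, tau):
    """Net (of storage cost) convenience yield implied by the
    cost-of-carry relation F = S * exp((r - delta) * tau)."""
    return r - np.log(futures / spot) / tau

# Illustrative numbers only: spot at 70 USD, 12-month future at 66 USD
# (backwardation), 5% annual interest rate.
delta = convenience_yield(spot=70.0, futures=66.0, r=0.05, tau=1.0)
print(f"implied convenience yield: {delta:.3f} per year")  # ~0.109
```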
  21. By: Craig, Ben; Keller, Joachim
    Abstract: We estimate the process underlying the pricing of American options by using higher-order lattices combined with a multigrid method. This paper also tests whether the risk-neutral densities given from American options provide a good forecasting tool. We use a nonparametric test of the densities that is based on the inverse probability functions and is modified to account for correlation across time between our random variables, which are uniform under the null hypothesis. We find that the densities based on the American option markets for foreign exchange do quite well for the forecasting period over which the options are thickly traded. Further, simple models that fit the densities do about as well as more sophisticated models.
    Keywords: Risk-neutral densities from option prices, American exchange rate options, Evaluating density forecasts, Pentanomial tree, Density evaluation, Overlapping data problem
    JEL: C52 C63 F31 F47
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:zbw:bubdp2:4260&r=ecm
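The basic inverse-probability (probability integral transform) idea behind the density test can be sketched as follows; the paper's correction for overlapping-data correlation is omitted and the data are invented:

```python
import numpy as np
from scipy.stats import kstest, norm

rng = np.random.default_rng(14)

# If the forecast densities F_t are correct, u_t = F_t(y_t) is i.i.d.
# uniform on [0, 1]; test uniformity of the transforms.
y = rng.standard_normal(250)                 # realized exchange rate moves
u = norm.cdf(y, loc=0.0, scale=1.0)          # forecast density: N(0, 1)
print(kstest(u, "uniform"))                  # correct density: no rejection

u_bad = norm.cdf(y, loc=0.0, scale=2.0)      # too-dispersed forecast
print(kstest(u_bad, "uniform"))              # should reject
```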
  22. By: Tommaso Proietti (Università degli Studi di Udine - Dipartimento di Scienze Statistiche)
    Abstract: The paper explores and illustrates some of the typical trade-offs which arise in designing filters for the measurement of trends and cycles in economic time series, focusing in particular on the fundamental trade-off between the reliability of the estimates and the magnitude of the revisions as new observations become available. This assessment is made possible by a novel model-based approach, according to which an important class of highpass and bandpass filters, encompassing the Hodrick-Prescott filter, is adapted to the particular time series under investigation. Via a suitable decomposition of the innovation process, it is shown that any linear time series with an ARIMA representation can be broken down into orthogonal trend and cycle components for which the class of filters is optimal. The main results then follow from Wiener-Kolmogorov signal extraction theory, whereas exact finite sample inferences are provided by the Kalman filter and smoother for the relevant state space representation of the decomposition.
    Keywords: Signal Extraction, Revisions, Kalman filter and Smoother, Bandpass
    Date: 2006–05–31
    URL: http://d.repec.org/n?u=RePEc:rtv:ceisrp:84&r=ecm
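For reference, the Hodrick-Prescott filter that the paper's filter class encompasses can be written as an exact penalized least squares smoother; a minimal sketch on simulated quarterly data:

```python
import numpy as np

def hp_filter(y, lam=1600.0):
    """HP trend: minimize sum (y - tau)^2 + lam * sum (dd tau)^2,
    solved exactly as (I + lam * D'D) tau = y, with D the
    second-difference matrix. lam = 1600 is standard for quarterly data."""
    T = len(y)
    D = np.diff(np.eye(T), n=2, axis=0)      # (T-2, T) second differences
    tau = np.linalg.solve(np.eye(T) + lam * D.T @ D, y)
    return tau, y - tau                      # trend, cycle

rng = np.random.default_rng(15)
y = np.cumsum(0.2 + rng.standard_normal(160))   # simulated log-output
trend, cycle = hp_filter(y)
print("cycle std:", cycle.std())
```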

This nep-ecm issue is ©2006 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.