nep-ecm New Economics Papers
on Econometrics
Issue of 2005‒11‒09
eleven papers chosen by
Sune Karlsson
Orebro University

  1. Subsampling Cointegration Ranks in Large Systems By Chen Pu; Hsiao Chihying
  2. Robust Standard Error Estimation in Fixed-Effects Panel Models By Gabor Kezdi
  3. A maximal moment inequality for long range dependent time series with applications to estimation and model selection By Ching-Kang Ing; Ching-Zong Wei
  4. Bayesian Stochastic Frontier Analysis Using WinBUGS By Jim Griffin; Mark Steel
  5. Estimating Short and Long Run Relationships: A Guide to the Applied Economist By Bhaskara Rao
  6. Robustness or Efficiency, A Test to Solve the Dilemma By Catherine Dehon; Marjorie Gassner; Vincenzo Verardi
  7. Grid-Bootstrap Methods vs. Bayesian Analysis. Testing for Structural Breaks in the Conditional Variance of Nominal Interest Rate Spreads - Four Cases in Europe By Pierangelo De Pace
  8. Dirichlet-Multinomial Regression By Paulo Guimaraes; Richard Lindrooth
  9. A Statistical Model for the Identification of Key Sectors in I-O Models By Marco Percoco
  10. State Space Modelling of Cointegrated Systems using Subspace Algorithms By Segismundo Izquierdo; Cesáreo Hernández; Javier Pajares
  11. The smooth transition autoregressive target zone model with the Gaussian stochastic volatility and TGARCH error terms with applications By Oleg Korenok; Stanislav Radchenko

  1. By: Chen Pu (University Bielefeld); Hsiao Chihying (University Bielefeld)
    Abstract: In this paper we investigate the possibility of applying a subsampling procedure to testing cointegration relations in large multivariate systems. The subsampling technique is applied to overcome the difficulties of nonstandard distributions and nuisance parameters in testing for cointegration rank without an explicitly formulated structural model. The contribution of this paper is twofold: theoretically, it shows that the subsampling testing procedure is consistent and has asymptotic power 1; practically, it demonstrates that the subsampling procedure can be applied to determine the cointegration rank in large-scale models, where the standard procedures have already reached their limits. For empirically relevant cases, our simulation studies show that centered subsampling decisively improves the performance of the subsampling test procedure and makes it applicable even when the number of independent stochastic trends is very large.
    Keywords: Cointegration, Large Systems, Nonparametric Tests, Subsampling
    JEL: C1 C2 C3 C4 C5 C8
    Date: 2005–08–07
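The general subsampling idea (in the Politis-Romano-Wolf sense) can be sketched in a few lines. This is only an illustration of the technique the abstract builds on: the statistic, block size, and random-walk data below are hypothetical placeholders, not the authors' cointegration-rank test.

```python
import numpy as np

def subsample_quantile(x, stat, b, q=0.95):
    """Approximate the q-quantile of a statistic's null distribution
    from overlapping subsamples of size b (illustrative sketch only)."""
    sub = [stat(x[i:i + b]) for i in range(len(x) - b + 1)]
    return float(np.quantile(sub, q))

rng = np.random.default_rng(0)
x = np.cumsum(rng.standard_normal(500))          # toy I(1) series
stat = lambda s: np.var(s - s.mean()) / len(s)   # placeholder statistic
crit = subsample_quantile(x, stat, b=50)         # approximate 95% critical value
```

The appeal of the approach, as the abstract notes, is that the empirical distribution of the subsample statistics approximates a nonstandard limiting distribution without tabulated critical values.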
  2. By: Gabor Kezdi (Central European University)
    Keywords: Fixed-Effects Panel Models, Serial Correlation, Robust Standard Error Estimation
    JEL: C23
    Date: 2005–08–25
  3. By: Ching-Kang Ing (Institute of Statistical Science, Academia Sinica); Ching-Zong Wei (Institute of Statistical Science, Academia Sinica)
    Abstract: We establish a maximal moment inequality for the weighted sum of a long-range dependent process. An extension of the Hájek–Rényi and Chow-type inequalities is then obtained. It enables us to deduce a strong law for the weighted sum of a stationary long-range dependent time series. To illustrate its usefulness, applications of the inequality to estimation and model selection in multiple regression models with long-range dependent errors are given.
    Keywords: Autoregressive fractionally integrated moving average, long range dependence, maximal inequality, model selection, convergence system, strong consistency.
    JEL: C1 C2 C3 C4 C5 C8
    Date: 2005–08–07
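For orientation, the classical Hájek–Rényi maximal inequality for independent, mean-zero random variables X_k with partial sums S_k and a nonincreasing positive weight sequence c_k can be stated roughly as follows; this is the textbook independent-variable benchmark, while the paper's contribution is the long-range dependent analogue:

```latex
P\Bigl(\max_{m \le k \le n} c_k \lvert S_k \rvert \ge \varepsilon\Bigr)
\;\le\; \frac{1}{\varepsilon^{2}}
\Bigl( c_m^{2} \sum_{k=1}^{m} \operatorname{E}X_k^{2}
\;+\; \sum_{k=m+1}^{n} c_k^{2}\, \operatorname{E}X_k^{2} \Bigr)
```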
  4. By: Jim Griffin (University of Warwick); Mark Steel (University of Warwick)
    Abstract: Markov chain Monte Carlo (MCMC) methods have become a ubiquitous tool in Bayesian analysis. This paper implements MCMC methods for Bayesian analysis of stochastic frontier models using the WinBUGS package, freely available software. General code for cross-sectional and panel data is presented, and various ways of summarizing posterior inference are discussed. Several examples illustrate that analyses with models of genuine practical interest can be performed straightforwardly, and model changes are easily implemented.
    Keywords: Efficiency, Markov chain Monte Carlo, Model comparison, Regularity, Software
    JEL: C11 C14 C23
    Date: 2005–09–04
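As context only: this is a classical maximum-likelihood sketch of the normal/half-normal stochastic frontier density (Aigner-Lovell-Schmidt form), not the paper's Bayesian WinBUGS implementation, and the simulated data are hypothetical.

```python
import numpy as np
from scipy.stats import norm

def frontier_loglik(params, y, X):
    """Log-likelihood of the normal/half-normal stochastic production
    frontier: y = X b + v - u, v ~ N(0, sv^2), u ~ |N(0, su^2)|."""
    k = X.shape[1]
    beta = params[:k]
    sv, su = np.exp(params[k]), np.exp(params[k + 1])   # log-parametrized scales
    sigma = np.hypot(sv, su)                            # sqrt(sv^2 + su^2)
    lam = su / sv
    eps = y - X @ beta
    # density of eps = v - u: (2/sigma) * phi(eps/sigma) * Phi(-eps*lam/sigma)
    return float(np.sum(np.log(2.0 / sigma)
                        + norm.logpdf(eps / sigma)
                        + norm.logcdf(-eps * lam / sigma)))

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
y = X @ np.array([1.0, 0.8]) + 0.3 * rng.standard_normal(n) \
    - np.abs(0.5 * rng.standard_normal(n))              # noise minus inefficiency
ll = frontier_loglik(np.array([1.0, 0.8, np.log(0.3), np.log(0.5)]), y, X)
```

In a Bayesian treatment such as the paper's, this same likelihood is combined with priors and explored by MCMC rather than maximized.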
  5. By: Bhaskara Rao (University of the South Pacific)
    Abstract: Many applied economists face problems in selecting an appropriate technique to estimate short and long run relationships with time series methods. This paper reviews three alternative approaches, viz. the general to specific approach (GETS), vector autoregressions (VAR) and vector error correction models (VECM). As in other methodological controversies, definite answers are difficult. It is suggested that if these techniques are seen as tools to summarize data, as in Smith (2000), there may often be only minor differences in their estimates. Therefore a computationally attractive technique is likely to be popular.
    Keywords: VAR, Cointegration, General to Specific Approach
    JEL: C1 C2 C3 C4 C5 C8
    Date: 2005–08–13
  6. By: Catherine Dehon (ECARES-ULB); Marjorie Gassner (ECARES-ULB); Vincenzo Verardi (ECARES-ULB)
    Abstract: When dealing with the presence of outliers in a dataset, the problem of choosing between the classical ordinary least squares and robust regression methods is sometimes addressed inadequately. In this article, we propose using a Hausman-type test to determine whether a robust S-estimator is more appropriate than an ordinary least squares one in a multiple linear regression framework, on the basis of the trade-off between robustness and efficiency. An economic example is provided to illustrate the usefulness of the test.
    Keywords: Efficiency, Hausman Test, Linear Regression, Robustness, S-estimator
    JEL: C12 C13
    Date: 2005–08–08
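The Hausman-type contrast can be sketched generically: under the null of no contamination both estimators are consistent, the efficient one attains a smaller variance, and the quadratic form in their difference is approximately chi-squared. As a loud caveat, the robust estimator below is a Huber M-estimator fitted by IRLS, a simplified stand-in for the S-estimator of the paper, and all data are simulated placeholders.

```python
import numpy as np
from scipy import stats

def ols(y, X):
    """OLS estimate and its classical covariance matrix."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ beta
    s2 = e @ e / (len(y) - X.shape[1])
    return beta, s2 * np.linalg.inv(X.T @ X)

def huber(y, X, c=1.345, iters=100):
    """Huber M-estimator via IRLS with a sandwich-style covariance --
    an illustrative stand-in, NOT the paper's S-estimator."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(iters):
        e = y - X @ beta
        s = np.median(np.abs(e - np.median(e))) / 0.6745 + 1e-12  # MAD scale
        w = np.minimum(1.0, c * s / np.maximum(np.abs(e), 1e-12)) # Huber weights
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    u = (y - X @ beta) / s
    psi, dpsi = np.clip(u, -c, c), (np.abs(u) <= c).astype(float)
    V = s**2 * np.mean(psi**2) / np.mean(dpsi)**2 * np.linalg.inv(X.T @ X)
    return beta, V

def hausman(b_eff, V_eff, b_rob, V_rob):
    """Hausman-type statistic: d'(V_rob - V_eff)^+ d ~ chi2(k) under the null."""
    d = b_rob - b_eff
    H = float(d @ np.linalg.pinv(V_rob - V_eff) @ d)
    return H, 1.0 - stats.chi2.cdf(H, len(d))

rng = np.random.default_rng(0)
n = 300
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
y = X @ np.array([1.0, 2.0]) + rng.standard_normal(n)   # clean data: null holds
H, p = hausman(*ols(y, X), *huber(y, X))
```

A large H (small p) would favour the robust estimator; on clean data both estimates coincide up to sampling noise.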
  7. By: Pierangelo De Pace (Johns Hopkins University)
    Abstract: I use numerical methods to test for the presence of one-time structural breaks in the conditional variance of nominal interest rate spreads in four European countries over a period of eleven years (Jan 1988 to Dec 1998). I start with an intuitive approach consisting of a sequence of breakpoint Chow tests performed at subsequent dates over a given subsample of the squared residuals of the autoregressions used to model the yield spreads. Results from this procedure are misleading and spurious to some extent because of the incorrect critical values produced, which make the interpretation of the test statistics basically unreliable. I then switch to large Monte Carlo simulations and to a fixed-regressor grid-bootstrap method to derive the right critical values and refine the previous conclusions. Finally, I utilize classical Bayesian econometrics to estimate alternative models for the series of nominal spreads and to detect potential shifts in the innovation variances of the equations describing the data. Outcomes need some interpretation: in the cases of Germany and Spain a break might have occurred in 1990 and 1994 respectively, as derived from the grid-bootstrap approach. Likewise, there is evidence of a shift in the case of France in 1996 according to the Bayesian techniques employed, which also validate the hypothesis of a break for Italian yield spreads in 1995.
    Keywords: Conditional Variance, Chow Test, Structural Breaks, Fixed-Regressor Grid-Bootstrap Method, Classical Bayesian Analysis
    JEL: C11 C12 C15 E3 E44
    Date: 2005–09–06
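The textbook breakpoint Chow test that the abstract starts from (and whose critical values it then argues are unreliable in this setting) is a simple F-statistic comparing pooled and split-sample residual sums of squares. The data below are a hypothetical simulation with an injected break, purely to show the mechanics.

```python
import numpy as np
from scipy import stats

def chow_test(y, X, split):
    """Classical Chow F-test for a structural break at a known date:
    F = [(RSS_pooled - RSS_split)/k] / [RSS_split/(n - 2k)]."""
    def rss(y, X):
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        return float(np.sum((y - X @ beta) ** 2))
    n, k = X.shape
    rss_pooled = rss(y, X)
    rss_split = rss(y[:split], X[:split]) + rss(y[split:], X[split:])
    F = ((rss_pooled - rss_split) / k) / (rss_split / (n - 2 * k))
    return F, 1.0 - stats.f.cdf(F, k, n - 2 * k)

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
y = X @ np.array([0.0, 1.0]) + 0.5 * rng.standard_normal(n)
y[n // 2:] += 5.0                    # inject a large one-time break
F, p = chow_test(y, X, n // 2)
```

The paper's point is that when such tests are run on squared autoregression residuals, the nominal F critical values no longer apply, which motivates the grid-bootstrap and Bayesian alternatives.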
  8. By: Paulo Guimaraes (Medical University of South Carolina); Richard Lindrooth (Medical University of South Carolina)
    Abstract: In this paper we provide a Random-Utility based derivation of the Dirichlet-Multinomial regression and posit it as a convenient alternative for dealing with overdispersed multinomial data. We show that this model is a natural extension of McFadden's conditional logit for grouped data and show how it relates to count models. Finally, we use a data set on patient choice of hospitals to illustrate an application of the Dirichlet-Multinomial regression.
    Keywords: dirichlet-multinomial, grouped conditional logit, hospital choice, overdispersion
    JEL: C25 C21 I11
    Date: 2005–09–01
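The Dirichlet-multinomial likelihood at the core of such a regression has a closed form via log-gamma functions; in the regression version the alpha parameters are typically linked to covariates, e.g. alpha_j = exp(z_j'theta). The function and the parameter values below are an illustrative sketch, not the authors' specification.

```python
import numpy as np
from scipy.special import gammaln

def dm_logpmf(x, alpha):
    """Log-pmf of the Dirichlet-multinomial: multinomial counts x whose
    cell probabilities are Dirichlet(alpha), integrated out analytically;
    it allows more variance than the plain multinomial (overdispersion)."""
    x, alpha = np.asarray(x, float), np.asarray(alpha, float)
    n, a0 = x.sum(), alpha.sum()
    return (gammaln(n + 1) - gammaln(x + 1).sum()        # multinomial coefficient
            + gammaln(a0) - gammaln(n + a0)
            + gammaln(x + alpha).sum() - gammaln(alpha).sum())

# sanity check: probabilities over all two-cell outcomes with n = 2 sum to 1
total = sum(np.exp(dm_logpmf([k, 2 - k], [0.7, 1.3])) for k in range(3))
```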
  9. By: Marco Percoco
    Abstract: Following the seminal work by Bullard and Sebald [Effects of Parametric Uncertainty and Technological Change on Input-Output Models, Rev. of Ec. and Stat., vol. 59, 75-81], in this paper we present an innovative approach to sensitivity analysis in Input-Output models. In particular, we propose a statistical model capable of computing a sensitivity index associated with each technical coefficient. We call the ordered set of these indices the Importance Matrix. Finally, in order to show a simple example of this methodology, we consider the case of the Chicago economy.
    Keywords: Input-Output Models, Sensitivity Analysis, Importance Matrix
    JEL: C15 C67 D5
    Date: 2004–08
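One simple deterministic sensitivity index for technical coefficients, a plausible precursor to the statistical index proposed in the paper, follows from the Leontief model x = (I - A)^{-1} f: the derivative of total output with respect to a_ij factors into an output multiplier times a gross output. The matrix A and final demand f below are hypothetical numbers for illustration.

```python
import numpy as np

def importance_matrix(A, f):
    """Sensitivity of total output to each technical coefficient a_ij in
    the Leontief model x = (I - A)^{-1} f.  Using the matrix identity
    d(I-A)^{-1}/da_ij = L E_ij L with L = (I-A)^{-1}, the derivative of
    total output 1'x is (column sum of L at i) * x_j.  This is a simple
    analytic index, NOT the statistical model proposed in the paper."""
    n = A.shape[0]
    L = np.linalg.inv(np.eye(n) - A)
    x = L @ f                       # gross outputs
    m = L.sum(axis=0)               # output multipliers (column sums of L)
    return np.outer(m, x)           # S[i, j] = d(total output) / d a_ij

A = np.array([[0.2, 0.3], [0.1, 0.4]])   # hypothetical technical coefficients
f = np.array([10.0, 20.0])               # hypothetical final demand
S = importance_matrix(A, f)
```

Ranking the entries of S then singles out the coefficients (and hence sectors) to which aggregate output is most sensitive.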
  10. By: Segismundo Izquierdo (University of Valladolid); Cesáreo Hernández (University of Valladolid); Javier Pajares (University of Valladolid)
    Abstract: The use of subspace algorithms for the identification of non-stationary cointegrated stochastic systems is a promising technique that is currently under discussion. A review of the literature provides two distinct algorithms: the State Space Aoki Time Series (SSATS) identification algorithm (Aoki and Havenner 1991) and the Adapted Canonical Correlations Analysis (ACCA) of Bauer and Wagner (2002). Aoki’s method is intuitively appealing, but lacks statistical foundation. In contrast, ACCA has a sound statistical basis, though intuition is somewhat lost. Both algorithms are revisited and commented on. The study of the underlying ideas and properties of both previous algorithms leads us to propose a new method for subspace identification of non-stationary cointegrated stochastic systems, trying to combine the best features of each. This new method provides a state space trend-cycle representation of a cointegrated system. Some preliminary simulation results are summarised, comparing these subspace methods with Johansen’s maximum likelihood approach.
    Keywords: system identification, state space, subspace, cointegration, CCA
    JEL: C32
    Date: 2005–09–06
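The core numerical step behind CCA-type subspace identification is computing canonical correlations between two blocks of data (in system identification, typically "past" and "future" stacks of observations). A minimal batch sketch, with hypothetical inputs and none of the weighting choices that distinguish SSATS from ACCA:

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between two data matrices (rows are
    observations), via QR orthonormalization and an SVD of the
    cross-product -- an illustrative sketch only."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    qx, _ = np.linalg.qr(Xc)        # orthonormal basis of the X column space
    qy, _ = np.linalg.qr(Yc)
    s = np.linalg.svd(qx.T @ qy, compute_uv=False)
    return np.clip(s, 0.0, 1.0)

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
same = canonical_correlations(X, X)   # identical column spaces: all ones
```

In the cointegration setting, canonical correlations close to one flag the common non-stationary (trend) directions, which is what makes the method attractive for rank determination.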
  11. By: Oleg Korenok (Virginia Commonwealth University); Stanislav Radchenko (University of North Carolina at Charlotte)
    Abstract: This paper proposes to model the error term in the smooth transition autoregressive target zone model as Gaussian with stochastic volatility (STARTZ-SV) or as Student-t with GARCH volatility (STARTZ-TGARCH). Using the dynamics of the Norwegian krone exchange rate index, we show that both models produce standardized residuals that are closer to the assumed distributions and do not produce a hump in the estimated marginal distribution of the exchange rate, which is more consistent with theoretical predictions. We apply the developed models to test whether the dynamics of the oil price can be well approximated by Krugman’s target zone model. Our estimates of conditional volatility and marginal distribution reject the target zone hypothesis.
    Keywords: target zone, oil price, exchange rate, stochastic volatility, griddy Gibbs, smooth transition
    JEL: C52 Q38 F31
    Date: 2005–08–18
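The smooth transition autoregressive backbone of these models can be sketched with a logistic transition function. This toy simulator uses plain Gaussian errors and hypothetical parameter values; it omits precisely the stochastic-volatility and TGARCH error structures that are the paper's contribution.

```python
import numpy as np

def simulate_lstar(n, phi1, phi2, gamma, c, sigma=1.0, seed=0):
    """Simulate a first-order logistic smooth transition autoregression:
        G_t = 1 / (1 + exp(-gamma * (y_{t-1} - c)))
        y_t = [phi1 * (1 - G_t) + phi2 * G_t] * y_{t-1} + sigma * eps_t
    The persistence blends smoothly from phi1 (y below c) to phi2
    (y above c); gamma controls how sharp the transition is."""
    rng = np.random.default_rng(seed)
    y = np.zeros(n)
    for t in range(1, n):
        G = 1.0 / (1.0 + np.exp(-gamma * (y[t - 1] - c)))
        y[t] = (phi1 * (1.0 - G) + phi2 * G) * y[t - 1] \
               + sigma * rng.standard_normal()
    return y

y = simulate_lstar(500, phi1=0.95, phi2=0.2, gamma=5.0, c=0.0)
```

In a target zone application the transition instead strengthens mean reversion near the band edges; the richer error terms studied in the paper replace the constant sigma above.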

This nep-ecm issue is ©2005 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.