nep-ecm New Economics Papers
on Econometrics
Issue of 2005‒05‒29
ten papers chosen by
Sune Karlsson
Orebro University

  1. On Testing Sample Selection Bias under the Multicollinearity Problem By Takashi Yamagata
  4. Identifying and Forecasting the Turning Points of the Belgian Business Cycle with Regime-Switching and Logit Models By Vincent, BODART; Konstantin, KHOLODILIN; Fati, SHADMAN-MEHTA
  5. Standard errors as weights in multilateral price indices By Hill, R.; Timmer, M.
  6. Shock Identification of Macroeconomic Forecasts based on Daily Panels By Marlene Amstad; Andreas Fischer
  7. Optimal Conditionally Unbiased Bounded-Influence Inference in Dynamic Location and Scale Models By Fabio Trojani; Elvezio Ronchetti; Loriano Mancini
  8. A general multivariate threshold GARCH model with dynamic conditional correlations By Fabio Trojani; Francesco Audrino
  9. Efficient Derivative Pricing by Extended Method of Moments By Patrick Gagliardini; C. Gourieroux; E. Renault
  10. Tests for cointegration in panels with regime shifts By Luciano Gutierrez

  1. By: Takashi Yamagata
    Abstract: This paper examines and compares the finite sample performance of the existing tests for sample selection bias, especially under the multicollinearity problem pointed out by Nawata (1993). The results show that under such a multicollinearity problem, (i) the t-test for sample selection bias based on the Heckman and Greene variance estimator can be unreliable; (ii) the standard t-test (Heckman 1979) and the asymptotically efficient Lagrange multiplier test (Melino 1982) have correct size but very little power; (iii) however, the likelihood ratio test following maximum likelihood estimation remains powerful.
    Keywords: Sample selection bias; t-test; Wald test; likelihood ratio test; Lagrange multiplier test
    JEL: C12 C24
    Date: 2005–05
  2. By: María Concepción Ausin; Michael Peter Wiper; Rosa Elvira Lillo
    Abstract: In this paper, we describe how to make Bayesian inference for the transient behaviour and busy period in a single server system with a general and unknown distribution for the service and interarrival times. The dense family of Coxian distributions is used for the service and arrival processes. This distribution model is reparametrized so that it is possible to define a non-informative prior which allows for the approximation of heavy-tailed distributions. Reversible jump Markov chain Monte Carlo methods are used to estimate the predictive distributions of the interarrival and service times. Our procedure for estimating the system measures is based on recent results for known parameters, which are frequently implemented using symbolic packages. Alternatively, we propose a simple numerical technique that can be performed at every MCMC iteration so that we can estimate interesting measures, such as the transient queue length distribution. We illustrate our approach with simulated and real queues.
    Date: 2005–05
  3. By: María Concepcion Ausin; Pedro Galeano
    Abstract: In this paper, we perform Bayesian inference and prediction for a GARCH model where the innovations are assumed to follow a mixture of two Gaussian distributions. This GARCH model can capture the patterns usually exhibited by many financial time series such as volatility clustering, large kurtosis and extreme observations. A Griddy-Gibbs sampler implementation is proposed for parameter estimation and volatility prediction. The method is illustrated using the Swiss Market Index.
    Date: 2005–05
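The innovation structure in the abstract above is easy to illustrate: a GARCH(1,1) variance recursion driven by a two-component Gaussian mixture reproduces the volatility clustering and fat tails it mentions. The sketch below only simulates that data-generating process under hypothetical parameter values (omega, alpha, beta, the mixture weight p and scales s1, s2 are all made up); it does not reproduce the paper's Griddy-Gibbs estimation step.

```python
import numpy as np

def simulate_mixture_garch(n, omega=0.05, alpha=0.10, beta=0.85,
                           p=0.9, s1=0.7, s2=2.5, seed=0):
    """Simulate a GARCH(1,1) process whose standardized innovations
    come from a two-component Gaussian mixture (weight p on the
    low-variance component), rescaled to unit variance so that
    sigma2 keeps its usual conditional-variance interpretation."""
    rng = np.random.default_rng(seed)
    mix_sd = np.sqrt(p * s1**2 + (1 - p) * s2**2)  # sd of the raw mixture
    y = np.empty(n)
    sigma2 = omega / (1.0 - alpha - beta)          # unconditional variance
    for t in range(n):
        comp_sd = s1 if rng.random() < p else s2   # pick a mixture component
        z = rng.normal(0.0, comp_sd) / mix_sd      # unit-variance innovation
        y[t] = np.sqrt(sigma2) * z
        sigma2 = omega + alpha * y[t]**2 + beta * sigma2
    return y
```

The occasional draws from the wide component generate the extreme observations and excess kurtosis the abstract refers to.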
  4. By: Vincent, BODART (UNIVERSITE CATHOLIQUE DE LOUVAIN, Department of Economics); Konstantin, KHOLODILIN; Fati, SHADMAN-MEHTA (UNIVERSITE CATHOLIQUE DE LOUVAIN, Department of Economics)
    Abstract: This paper seeks to elaborate econometric models that can be used to forecast the turning points of the Belgian business cycle. We begin by suggesting three reference cycle chronologies, which we hope will fill the void of an official reference chronology for Belgium. We then construct two different types of model to estimate the probabilities of recession: Markov-switching models and Logit models. We apply each approach to a limited set of data, which are a good representation of the economy, are available early and are subject to only minor revisions. We then select the best performing model for each chronology and type of approach. The out-of-sample results show that the models provide useful indicators of business cycle turning points. They are, however, far from perfect forecasting tools, especially when it comes to forecasting periods of classical recession.
    Keywords: Reference chronologies; Markov-switching and Logit models; forecasting business cycle turning points
    JEL: C5 E32 E37
    Date: 2005–03–15
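In its simplest form, the Logit branch of the approach above reduces to a binary model for the probability of recession given a few early-available indicators. The sketch below shows only that mechanical form; the intercept, slopes, and the two regressors are hypothetical placeholders, not the paper's estimates.

```python
import math

def recession_probability(x, beta0=-1.5, betas=(-0.8, 0.6)):
    """Logit turning-point model: P(recession | x) is the logistic
    transform of a linear index in the indicators x. All coefficient
    values here are illustrative, not estimated from Belgian data."""
    z = beta0 + sum(b * xi for b, xi in zip(betas, x))
    return 1.0 / (1.0 + math.exp(-z))

# e.g. standardized output growth and a second indicator, both hypothetical
p = recession_probability((-1.2, -0.5))
signal = p > 0.5  # flag a recession when the probability exceeds one half
```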
  5. By: Hill, R.; Timmer, M. (Groningen University)
    Abstract: A number of multilateral methods for computing price indexes use bilateral comparisons as their basic building blocks. Some of these methods, such as the weighted-EKS and minimum-spanning-tree (MST) methods, give greater weight to those bilateral comparisons that are deemed more reliable (an adjustment that is particularly important for a heterogeneous set of countries). No consensus currently exists in the literature as to the best measure of reliability. Diewert (2002), in particular, proposes a number of reliability measures in an axiomatic setting. Existing measures (including all of Diewert's), however, fail to penalize bilateral comparisons when there is only a small overlap in the products priced by each country. It is exactly in such situations that weighted methods are potentially most useful, but only if the reliability measure penalizes bilateral comparisons containing many gaps. Using a stochastic model, we show how the standard errors on bilateral price indexes provide a natural measure of reliability that automatically penalizes comparisons containing many gaps. Furthermore, we link these standard errors with the existing literature by showing that they are a generalization of one of Diewert's reliability measures. This finding provides an interesting new link between the axiomatic and stochastic approaches to index numbers. Also, these standard errors can be modified for use in consumer data sets below the basic-heading level (where no expenditure shares are available), a scenario of direct relevance to the latest round of the International Comparison Program (ICP) currently being undertaken at the World Bank. Finally, we apply our methodology to an international data set on agricultural production that contains many gaps. Our results clearly demonstrate the appeal of weighted methods and the importance of adjusting the reliability measures for gaps in the data. Failure to do so may compromise weighted methods precisely in situations where they are most needed.
    Date: 2004
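The core idea of the abstract above, that the standard error of a bilateral comparison automatically penalizes sparse product overlap, can be sketched with the stochastic approach in its simplest unweighted form (the below-basic-heading case, where no expenditure shares are available). The function and the toy data are illustrative assumptions, not the paper's estimator.

```python
import math

def bilateral_log_index(prices_a, prices_b):
    """Unweighted stochastic-approach bilateral comparison: the log
    price index from country a to country b is the mean log price
    relative over the products BOTH countries price. Its standard
    error grows as the matched sample shrinks, so comparisons with
    many gaps are automatically flagged as unreliable."""
    matched = [k for k in prices_a if k in prices_b]
    logs = [math.log(prices_b[k] / prices_a[k]) for k in matched]
    n = len(logs)
    mean = sum(logs) / n
    var = sum((x - mean) ** 2 for x in logs) / (n - 1)
    return mean, math.sqrt(var / n)

# two hypothetical countries pricing partially overlapping product lists
index, se = bilateral_log_index({"rice": 1.0, "wheat": 2.0, "milk": 4.0},
                                {"rice": 2.0, "wheat": 4.0, "beef": 9.0})
```

In a weighted-EKS or MST setting, a decreasing function of this standard error would then serve as the weight on the bilateral link.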
  6. By: Marlene Amstad (Swiss National Bank); Andreas Fischer (Swiss National Bank)
    Abstract: This paper proposes a new procedure for shock identification of macroeconomic forecasts based on factor analysis. Our identification scheme for information shocks relies on data reduction techniques for daily panels and the recognition that macroeconomic releases exhibit a high level of clustering. A large number of data releases on a single day is of considerable practical interest not only for the estimation but also for the identification of the factor model. The clustering of cross-sectional information facilitates the interpretation of the forecast innovations as real or as nominal information shocks. An empirical application is provided for Swiss inflation. We show that (i) monetary policy shocks generate an asymmetric response in inflation, (ii) the pass-through for consumer price index inflation is weak, and (iii) information shocks to inflation are not synchronized.
    Date: 2005–02
  7. By: Fabio Trojani; Elvezio Ronchetti; Loriano Mancini
    Abstract: This paper studies the local robustness of estimators and tests for the conditional location and scale parameters in a strictly stationary time series model. We first derive optimal bounded-influence estimators for such settings under a conditionally Gaussian reference model. Based on these results, optimal bounded-influence versions of the classical likelihood-based tests for parametric hypotheses are obtained. We propose a feasible and efficient algorithm for the computation of our robust estimators, which makes use of analytical Laplace approximations to estimate the auxiliary recentering vectors ensuring Fisher consistency in robust estimation. This strongly reduces the necessary computation time by avoiding the simulation of multidimensional integrals, a task that typically has to be addressed in the robust estimation of nonlinear time series models. In some Monte Carlo simulations of an AR(1)-ARCH(1) process, we show that our robust procedures maintain a very high efficiency under ideal model conditions and at the same time perform very satisfactorily under several forms of departure from conditional normality. By contrast, classical Pseudo Maximum Likelihood inference procedures are found to be highly inefficient under such local model misspecifications. These patterns are confirmed by an application to robust testing for ARCH.
    JEL: C1 C13 C14 C15 C22
    Date: 2005–01
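The bounded-influence idea in the abstract above can be hinted at with the classical Huber weight: observations whose standardized residual exceeds a bound are downweighted so that no single point can dominate the fit. The paper's estimator is considerably more elaborate (conditionally unbiased, with recentering for Fisher consistency); this is only the basic weighting device, with a conventional tuning constant.

```python
def huber_weight(r, c=1.345):
    """Huber-type bounded-influence weight for a standardized
    residual r: unit weight inside [-c, c], then decaying as c/|r|,
    which caps each observation's influence on the estimator.
    c = 1.345 is the usual choice giving roughly 95% efficiency at
    the Gaussian model."""
    a = abs(r)
    return 1.0 if a <= c else c / a
```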
  8. By: Fabio Trojani; Francesco Audrino
    Abstract: We propose a new multivariate DCC-GARCH model that extends existing approaches by admitting multivariate thresholds in conditional volatilities and conditional correlations. Model estimation is numerically feasible in large dimensions and positive semi-definiteness of conditional covariance matrices is naturally ensured by the pure model structure. Conditional thresholds in volatilities and correlations are estimated from the data, together with all other model parameters. We study the performance of our approach in some Monte Carlo simulations, where it is shown that the model is able to correctly fit GARCH-type dynamics and a complex threshold structure in conditional volatilities and correlations of simulated data. In a real data application to international equity markets, we observe estimated conditional volatilities that are strongly influenced by GARCH-type and multivariate threshold effects. Conditional correlations, instead, are determined by simple threshold structures where no GARCH-type effect could be identified.
    JEL: C12 C13 C51 C53 C61
    Date: 2005–01
  9. By: Patrick Gagliardini; C. Gourieroux; E. Renault
    Abstract: In this paper we consider an incomplete market framework and explain how to use jointly observed prices of the underlying asset and of some derivatives written on this asset for an efficient pricing of other derivatives. This question involves two types of moment restrictions, which can be written either for a given value of the conditioning variable, or can be uniform with respect to this conditioning variable. This distinction between local and uniform conditional moment restrictions leads to an extension of the Generalized Method of Moments (GMM), a method in which all restrictions are assumed uniform. The Extended Method of Moments (XMM) provides estimators of the parameters with different rates of convergence: the rate is the standard parametric one for the parameters which are identifiable from the uniform restrictions, whereas the rate can be nonparametric for the risk premium parameters. We derive the (kernel) nonparametric efficiency bounds for estimating a conditional moment of interest and prove the asymptotic efficiency of XMM. To avoid misleading arbitrage opportunities in estimated derivative prices, an XMM estimator based on an information criterion is introduced. The general results are applied in a stochastic volatility model to get efficient derivative prices, to measure the uncertainty of estimated prices and to estimate the risk premium parameters.
    JEL: C13 C14 G12
    Date: 2005–01
  10. By: Luciano Gutierrez (University of Sassari)
    Abstract: In the paper we extend Gregory and Hansen's (1996) ADF, Za, Zt cointegration tests to panel data, using the method proposed in Maddala and Wu (1999). We test the null hypothesis of no cointegration for all the units in the panel against the alternative hypothesis of cointegration, while allowing for a one-time regime shift of unknown timing in at least some regressions. We derive the panel versions of the ADF, Za, Zt tests and compare them with Pedroni's (1999) panel cointegration tests. We show that Gregory and Hansen's (1996) panel tests have higher power to reject the null when there is a structural change in the cointegration vector. We apply the statistics to the analysis of the well-known Feldstein-Horioka puzzle for a sample of sixteen OECD countries. Once we allow for a structural break in the cointegration regression, we find strong evidence of cointegration between saving and investment rates.
    Keywords: Panel data, Panel cointegration tests, Structural breaks, Feldstein-Horioka puzzle
    JEL: C22 C23 F32 F41
    Date: 2005–05–24
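The Maddala-Wu (1999) device used in the abstract above is a Fisher-type combination of the N unit-level tests: under the joint null of no cointegration (and independence across units), -2 times the sum of the log p-values is chi-square with 2N degrees of freedom. The unit-level p-values below are invented for illustration; in the paper they would come from Gregory-Hansen ADF, Za or Zt tests allowing for a break.

```python
import math

def fisher_panel_statistic(p_values):
    """Maddala-Wu combination statistic: -2 * sum(ln p_i), which is
    chi-square with 2N degrees of freedom under the null that no
    unit in the panel is cointegrated (p_i independent across units)."""
    return -2.0 * sum(math.log(p) for p in p_values)

p_vals = [0.03, 0.20, 0.01, 0.08]      # hypothetical unit-level p-values
stat = fisher_panel_statistic(p_vals)  # about 24.49
df = 2 * len(p_vals)                   # chi-square degrees of freedom = 8
# the 5% critical value of chi-square(8) is about 15.51, so this
# hypothetical panel would reject the joint null of no cointegration
```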

This nep-ecm issue is ©2005 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.