nep-ecm New Economics Papers
on Econometrics
Issue of 2008‒03‒01
thirteen papers chosen by
Sune Karlsson
Orebro University

  1. Testing fractional order of long memory processes: a Monte Carlo study By Laurent Ferrara; Dominique Guegan; Zhiping Lu
  2. Robust misspecification tests for the Heckman’s two-step estimator By Gabriel V. Montes-Rojas
  3. A Test For Monotone Comparative Statics By Ivana Komunjer; Federico Echenique
  4. Forecasting chaotic systems: the role of local Lyapunov exponents By Dominique Guegan; Justin Leroux
  5. Accurately Sized Test Statistics with Misspecified Conditional Homoskedasticity By Douglas Steigerwald; Jack Erb
  6. Multivariate Forecast Evaluation And Rationality Testing By Ivana Komunjer; Michael Owyang
  7. On the performance of small-area estimators: fixed vs. random area parameters By Alex Costa; Albert Satorra; Eva Ventura
  8. When does Heckman’s two-step procedure for censored data work and when does it not? By Jonsson, Robert
  9. Multivariate Regime–Switching GARCH with an Application to International Stock Markets By Markus Haas; Stefan Mittnik
  10. The Shorth Plot By Einmahl, J.H.J.; Gantner, M.; Sawitzki, G.
  11. Performance Evaluation Based on the Robust Mahalanobis Distance and Multilevel Modelling Using Two New Strategies By Hussain, S.; Mohamed, M. A.; Holder, R.; Almasri, A.; Shukur, G.
  12. The k-factor Gegenbauer asymmetric Power GARCH approach for modelling electricity spot price dynamics By Abdou Kâ Diongue; Dominique Guegan
  13. Dynamic econometric models and economic development: An analysis of real wages, productivity and employment in OECD countries, 1965-2005 By Guisan, M.C.

  1. By: Laurent Ferrara (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, DGEI-DAMEP - Banque de France); Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, Ecole d'économie de Paris - Paris School of Economics - Université Panthéon-Sorbonne - Paris I); Zhiping Lu (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, ECNU - East China Normal University)
    Abstract: Testing the order of fractional integration at seasonal and non-seasonal unit roots is important for economic and financial time series modelling. In this paper, the Robinson (1994) test is applied to various well-known long memory models. Via Monte Carlo experiments, we study and compare the performance of this test across several sample sizes.
    Keywords: Long memory processes, test, Monte Carlo simulations.
    Date: 2008–02
    URL: http://d.repec.org/n?u=RePEc:hal:papers:halshs-00259193_v1&r=ecm
  2. By: Gabriel V. Montes-Rojas (Department of Economics, City University, London)
    Abstract: We construct and evaluate LM and Neyman’s C(a) tests, based on bivariate Edgeworth expansions, for the consistency of Heckman’s two-step estimator in selection models, that is, for the marginal normality and the linearity of the conditional expectation of the error terms. The proposed tests are robust to local misspecification in nuisance distributional parameters. Monte Carlo results show that testing marginal normality and linearity of the conditional expectations separately has better size performance than testing bivariate normality directly. Moreover, the robust variants of the tests have better size and similar power relative to the non-robust tests, so they can be successfully applied to detect specific departures from the null model of bivariate normality. We apply the test procedures to women’s labor supply data.
    Keywords: Heckman’s two-step, LM tests, Neyman’s C(a) tests
    Date: 2008–02
    URL: http://d.repec.org/n?u=RePEc:cty:dpaper:0801&r=ecm
  3. By: Ivana Komunjer (Dept. of Economics, University of California, San Diego); Federico Echenique (Caltech)
    Abstract: In this paper we design an econometric test for the monotone comparative statics (MCS) prediction often found in models with multiple equilibria. Our test exploits the observable implications of the MCS prediction: that the extreme (high and low) conditional quantiles of the dependent variable increase monotonically with the explanatory variable. The main contribution of the paper is to derive a likelihood-ratio test which, to the best of our knowledge, is the first econometric test of MCS proposed in the literature. The test is an asymptotic "chi-bar squared" test for order restrictions on intermediate conditional quantiles. The key features of our approach are: (1) it does not require estimating the underlying nonparametric model relating the dependent and explanatory variables to the latent disturbances; (2) it makes few assumptions on the cardinality, location or probabilities over equilibria. In particular, one can implement our test without assuming an equilibrium selection rule.
    Keywords: multiple equilibria, comparative statics, quantiles, "chi-bar squared"
    Date: 2007–10–01
    URL: http://d.repec.org/n?u=RePEc:cdl:ucsdec:2007-07&r=ecm
  4. By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, Ecole d'économie de Paris - Paris School of Economics - Université Panthéon-Sorbonne - Paris I); Justin Leroux (Institute for Applied Economics - HEC MONTRÉAL, CIRPEE - Centre Interuniversitaire sur le Risque, les Politiques Economiques et l'Emploi)
    Abstract: We propose a novel methodology for forecasting chaotic systems which is based on the nearest-neighbor predictor and improves upon it by incorporating local Lyapunov exponents to correct for its inevitable bias. Using simulated data, we show that gains in prediction accuracy can be substantial. The general intuition behind the proposed method can readily be applied to other non-parametric predictors.
    Keywords: Chaos theory, Lyapunov exponent, logistic map, Monte Carlo simulations.
    Date: 2008–02
    URL: http://d.repec.org/n?u=RePEc:hal:papers:halshs-00259238_v1&r=ecm
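    The baseline the paper improves upon is easy to sketch. Below is a minimal, hypothetical Python illustration (not the authors' code) of a nearest-neighbour one-step forecast on the logistic map, together with the local Lyapunov exponent log|f'(x)|, which measures how fast prediction errors are amplified at a given state:

    ```python
    import numpy as np

    # Trajectory of the chaotic logistic map x_{t+1} = 4 x (1 - x).
    def logistic_traj(x0, n):
        x = np.empty(n)
        x[0] = x0
        for t in range(n - 1):
            x[t + 1] = 4.0 * x[t] * (1.0 - x[t])
        return x

    series = logistic_traj(0.2, 2001)
    train, target = series[:2000], series[2000]   # target = true next value

    # Nearest-neighbour predictor: find the past state closest to the current
    # one and use its observed successor as the one-step forecast.
    current = train[-1]
    j = int(np.argmin(np.abs(train[:-1] - current)))
    forecast = train[j + 1]

    # Local Lyapunov exponent at the neighbour, log|f'(x)| with f'(x) = 4 - 8x:
    # large values flag states where the nearest-neighbour bias grows fastest.
    local_lyap = np.log(np.abs(4.0 - 8.0 * train[j]))

    print(abs(forecast - target))  # small, since a close neighbour exists
    ```

    The paper's contribution is to use such local exponents to correct the predictor's bias; the sketch only shows the ingredients.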
  5. By: Douglas Steigerwald (University of California, Santa Barbara); Jack Erb (University of California, Santa Barbara)
    Abstract: We study the problem of obtaining accurately sized test statistics in finite samples for linear regression models where the error dependence is of unknown form. With an unknown dependence structure there is traditionally a trade-off between the maximum lag over which the correlation is estimated (the bandwidth) and the decision to introduce conditional heteroskedasticity. In consequence, the correlation at far lags is generally omitted and the resultant inflation of the empirical size of test statistics has long been recognized. To allow for correlation at far lags we study test statistics constructed under the possibly misspecified assumption of conditional homoskedasticity. To improve the accuracy of the test statistics, we employ the second-order asymptotic refinement in Rothenberg (1988) to determine critical values. We find substantial size improvements resulting from the second-order theory across a wide range of specifications, including substantial conditional heteroskedasticity. We also find that the size gains result in only moderate increases in the length of the associated confidence interval, which yields an increase in size-adjusted power. Finally, we note that the proposed test statistics do not require that the researcher specify the bandwidth or the kernel.
    Keywords: test size, confidence interval estimation, heteroskedasticity, autocorrelation
    Date: 2007–07–01
    URL: http://d.repec.org/n?u=RePEc:cdl:ucsbec:09-07&r=ecm
  6. By: Ivana Komunjer (University of California - San Diego); Michael Owyang (Federal Reserve Bank of Saint Louis)
    Abstract: In this paper, we propose a new family of multivariate loss functions that can be used to test the rationality of vector forecasts without assuming independence across individual variables. When only one variable is of interest, the loss function reduces to the flexible asymmetric family recently proposed by Elliott, Komunjer, and Timmermann (2005). Following their methodology, we derive a GMM test for multivariate forecast rationality that allows the forecast errors to be dependent and takes into account forecast estimation uncertainty. We use our test to study the rationality of macroeconomic vector forecasts of the growth rate of nominal output, the CPI inflation rate, and a short-term interest rate.
    Keywords: multivariate forecast rationality, multivariate loss, asymmetries, Fed transparency
    Date: 2007–11–01
    URL: http://d.repec.org/n?u=RePEc:cdl:ucsdec:2007-08&r=ecm
  7. By: Alex Costa; Albert Satorra; Eva Ventura
    Abstract: Most methods for small-area estimation are based on composite estimators derived from design- or model-based methods. A composite estimator is a linear combination of a direct and an indirect estimator, with weights that usually depend on unknown parameters which need to be estimated. Although model-based small-area estimators are usually based on random-effects models, the assumption of fixed effects is at face value more appropriate. Model-based estimators are justified by the assumption of random (interchangeable) area effects; in practice, however, areas are not interchangeable. In the present paper we empirically assess the quality of several small-area estimators in the setting in which the area effects are treated as fixed. We consider two settings: one that draws samples from a theoretical population, and another that draws samples from an empirical population of a labor force register maintained by the National Institute of Social Security (NISS) of Catalonia. We distinguish two types of composite estimators: (a) those whose weights involve area-specific estimates of bias and variance; and (b) those whose weights involve a common variance and a common squared-bias estimate for all the areas. We assess their precision and discuss alternatives for optimizing composite estimation in applications.
    Keywords: Small area estimation, composite estimator, Monte Carlo study, random effect model, BLUP, empirical BLUP
    Date: 2008–02
    URL: http://d.repec.org/n?u=RePEc:upf:upfgen:1069&r=ecm
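    The composite estimators studied above combine a direct and an indirect estimator. A minimal simulated sketch, using illustrative numbers and the textbook MSE-minimising weight for independent components (not necessarily the weightings the paper evaluates):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    theta = 10.0                       # true area parameter (illustrative)

    # Direct estimator: unbiased but noisy; indirect (synthetic) estimator:
    # precise but biased. Moments chosen for illustration only.
    V_dir, V_ind, bias_ind = 4.0, 0.25, 1.0

    # MSE-minimising weight for independent components:
    # w = (V_ind + bias_ind^2) / (V_dir + V_ind + bias_ind^2)
    w = (V_ind + bias_ind**2) / (V_dir + V_ind + bias_ind**2)

    n_rep = 20000
    direct = theta + rng.normal(0.0, np.sqrt(V_dir), n_rep)
    indirect = theta + bias_ind + rng.normal(0.0, np.sqrt(V_ind), n_rep)
    composite = w * direct + (1 - w) * indirect

    mse = lambda est: np.mean((est - theta) ** 2)
    print(mse(direct), mse(indirect), mse(composite))  # composite is lowest
    ```

    In practice the weight must itself be estimated, which is exactly the distinction the paper draws between area-specific and common bias/variance estimates.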
  8. By: Jonsson, Robert (Department of Economics, School of Business, Economics and Law, University of Gothenburg)
    Abstract: Heckman’s two-step procedure (Heckit) for estimating the parameters in linear models from censored data is frequently used by econometricians, despite the fact that earlier studies cast doubt on the procedure. In this paper it is shown that estimates of the hazard h of approaching the censoring limit, which is used as an explanatory variable in the second step of the Heckit, can induce multicollinearity. The influence of the censoring proportion and the sample size upon bias and variance in three types of random linear models is studied by simulations. From these results a simple relation is established that describes how absolute bias depends on the censoring proportion and the sample size. It is also shown that the Heckit may work with non-normal (Laplace) distributions, but it collapses if h deviates too much from that of the normal distribution. Data from a study of work resumption after sick-listing are used to demonstrate that the Heckit can be very risky.
    Keywords: Censoring; Cross-sectional and panel data; Hazard; Multicollinearity
    JEL: C10
    Date: 2008–02–22
    URL: http://d.repec.org/n?u=RePEc:hhs:gunsru:2008_002&r=ecm
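    The two-step procedure discussed in entries 2 and 8 is standard: a first-step probit for selection, then OLS on the selected sample augmented with the inverse Mills ratio (the hazard h above). A minimal simulated sketch, with a hand-rolled probit likelihood and illustrative parameter values (hypothetical code, not either paper's setup):

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    # Simulated selection model: y = 1 + 2 x + e is observed only when
    # 0.5 + z + u > 0, with corr(u, e) = 0.5 (illustrative values).
    rng = np.random.default_rng(1)
    n = 20000
    z = rng.normal(size=n)
    x = rng.normal(size=n)
    u = rng.normal(size=n)
    e = 0.5 * u + np.sqrt(0.75) * rng.normal(size=n)
    sel = (0.5 + z + u) > 0
    y = 1.0 + 2.0 * x + e

    # Step 1: probit of the selection indicator on (1, z), fit by maximum
    # likelihood with a generic optimiser.
    Z = np.column_stack([np.ones(n), z])
    def negll(g):
        p = np.clip(norm.cdf(Z @ g), 1e-10, 1 - 1e-10)
        return -np.sum(sel * np.log(p) + (~sel) * np.log(1 - p))
    g_hat = minimize(negll, np.zeros(2), method="BFGS").x

    # Step 2: OLS of y on (1, x, lambda) over the selected sample, where
    # lambda is the inverse Mills ratio at the fitted probit index.
    idx = Z[sel] @ g_hat
    imr = norm.pdf(idx) / norm.cdf(idx)
    X2 = np.column_stack([np.ones(int(sel.sum())), x[sel], imr])
    beta, *_ = np.linalg.lstsq(X2, y[sel], rcond=None)
    print(beta)  # roughly [1.0, 2.0, 0.5]; the last entry estimates cov(u, e)
    ```

    When the index varies little, `imr` is nearly linear in the index and thus nearly collinear with the other regressors, which is the multicollinearity problem the abstract describes.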
  9. By: Markus Haas (University of Munich, Institute of Statistics); Stefan Mittnik (Department of Statistics, University of Munich, Center for Financial Studies, Frankfurt, and Ifo Institute for Economic Research, Munich)
    Abstract: We develop a multivariate generalization of the Markov–switching GARCH model introduced by Haas, Mittnik, and Paolella (2004b) and derive its fourth–moment structure. An application to international stock markets illustrates the relevance of accounting for volatility regimes from both a statistical and economic perspective, including out–of–sample portfolio selection and computation of Value–at–Risk.
    Keywords: Conditional Volatility, Markov–Switching, Multivariate GARCH
    JEL: C32 C51 G10 G11
    Date: 2008–01
    URL: http://d.repec.org/n?u=RePEc:cfs:cfswop:wp200808&r=ecm
  10. By: Einmahl, J.H.J.; Gantner, M.; Sawitzki, G. (Tilburg University, Center for Economic Research)
    Abstract: The shorth plot is a tool to investigate probability mass concentration. It is a graphical representation of the length of the shorth, the shortest interval covering a certain fraction of the distribution, localized by forcing the intervals considered to contain a given point x. It is easy to compute, avoids bandwidth selection problems and allows scanning for local as well as global features of the probability distribution. We prove functional central limit theorems for the empirical shorth plot. The good rate of convergence of the empirical shorth plot makes it useful even for moderate sample sizes.
    Keywords: Data analysis; distribution diagnostics; functional central limit theorem; probability mass concentration.
    JEL: C13 C14
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:200824&r=ecm
  11. By: Hussain, S. (Division of Primary Care and General Practice, School of Medicine, University of Birmingham); Mohamed, M. A. (Department of Public Health, University of Birmingham, UK); Holder, R. (Division of Primary Care and General Practice, School of Medicine, University of Birmingham); Almasri, A. (Department of Economics and Statistics, Karlstad University, Sweden); Shukur, G. (Jönköping International Business School)
    Abstract: In this paper we propose a general framework for the performance evaluation of organisations and individuals over time using routinely collected performance variables or indicators. Such variables or indicators are often correlated over time, have missing observations, and often come from heavy-tailed distributions shaped by outliers. Two doubly robust strategies are used for the evaluation (ranking) of sampling units. Strategy 1 handles missing data using residual maximum likelihood (RML) at stage two, while Strategy 2 handles missing data at stage one. Strategy 2 has the advantage of overcoming the problem of multicollinearity. Strategy 1 requires independent indicators for the construction of the distances, whereas Strategy 2 does not. Two examples from different domains are used to illustrate the application of the two strategies: the first considers performance monitoring of gynaecologists, and the second the performance of industrial firms.
    Keywords: Ranking indicators; performance; robust statistics; multilevel estimation; Mahalanobis distance
    JEL: C40 C51 C52
    Date: 2008–02–26
    URL: http://d.repec.org/n?u=RePEc:hhs:cesisp:0114&r=ecm
  12. By: Abdou Kâ Diongue (UFR SAT - Université Gaston Berger - Université Gaston Berger de Saint-Louis, CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, School of Economics and Finance - Queensland University of Technology); Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, Ecole d'économie de Paris - Paris School of Economics - Université Panthéon-Sorbonne - Paris I)
    Abstract: Electricity spot prices exhibit a number of typical features that are not found in most financial time series, such as complex seasonality patterns, persistence (hyperbolic decay of the autocorrelation function), mean reversion, spikes, asymmetric behavior and leptokurtosis. Efforts have been made worldwide to model the behaviour of electricity market prices. In this paper, we propose a new approach based on the stationary k-factor Gegenbauer process with asymmetric power GARCH noise under a conditional Student-t distribution, which can accommodate the previous features. We derive the stationarity and invertibility conditions as well as the δth-order moment of this model, which we call the GGk-APARCH model. We then focus on parameter estimation and provide the analytical form of the likelihood, which permits consistent estimates to be obtained. In order to characterize the properties of these estimates we perform a Monte Carlo experiment. Finally, the approach is applied to electricity spot prices from the Leipzig Power Exchange (LPX) in Germany, Powernext in France, Operadora del Mercado Español de Electricidad (OMEL) in Spain and the Pennsylvania-New Jersey-Maryland (PJM) interconnection in the United States. In terms of forecasting criteria we obtain very good results compared with models using heteroskedastic asymmetric errors.
    Keywords: Asymmetric distribution function, electricity spot prices, leptokurtosis, persistence, seasonality, GARMA, A-PARCH.
    Date: 2008–02
    URL: http://d.repec.org/n?u=RePEc:hal:papers:halshs-00259225_v1&r=ecm
  13. By: Guisan, M.C.
    Abstract: In order to analyse the propagation effect, which is of great importance in the analysis of dynamic relationships in studies of economic development, we present a classification of dynamic models, taking into account several situations regarding this effect, including not only models with lags but also models without explicit lags in which the propagation effect is transmitted through a stock variable, as in the case of the production function when real GDP is explained from the supply side by primary inputs. We compare several dynamic specifications, including models in levels, models in first differences, mixed dynamic models and error correction models, applied to wages, productivity, employment and GDP in Spain and other OECD countries. The main conclusion regarding dynamic model specification is that the mixed dynamic model is a good choice for many econometric applications.
    JEL: B41 C51 C52 O51 E2 E24 O52 O57
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:eaa:ecodev:96&r=ecm

This nep-ecm issue is ©2008 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.