nep-ecm New Economics Papers
on Econometrics
Issue of 2007‒05‒19
twenty-two papers chosen by
Sune Karlsson
Orebro University

  1. Forecasting Time Series with Long Memory and Level Shifts, A Bayesian Approach By Silvestro Di Sanzo
  2. Asymptotic Properties of Estimators for the Linear Panel Regression Model with Individual Effects and Serially Correlated Errors: The Case of Stationary and Non-Stationary Regressors and Residuals By Badi H. Baltagi; Chihwa Kao; Long Liu
  3. Some Flexible Parametric Models for Partially Adaptive Estimators of Econometric Models By Hansen, Christian B.; McDonald, James B.; Theodossiou, Panayiotis
  4. Stability of nonlinear AR-GARCH models By Mika Meitz; Pentti Saikkonen
  5. A Monte Carlo Evaluation of Some Common Panel Data Estimators when Serial Correlation and Cross-sectional Dependence are Both Present By W. Robert Reed; Haichun Ye
  6. The vector innovation structural time series framework: a simple approach to multivariate forecasting By Ashton de Silva; Rob J. Hyndman; Ralph D. Snyder
  7. Inference on Categorical Survey Response: A Predictive Approach By Adhya Sumanta; Banerjee Tathagata; Chattopadhyay Gaurangadeb
  8. Ergodicity, mixing, and existence of moments of a class of Markov models with applications to GARCH and ACD models By Mika Meitz; Pentti Saikkonen
  9. The Performance of Panel Cointegration Methods. Results from a Large Scale Simulation Study By Wagner, Martin; Hlouskova, Jaroslava
  10. On Econometric Analysis of Structural Systems with Permanent and Transitory Shocks and Exogenous Variables By Pagan, A.; Pesaran, M.H.
  11. Tests of time-invariance By Busetti, F.; Harvey, A.
  12. Quantiles, Expectiles and Splines By DeRossi, G.; Harvey, A.
  13. Mixed Signals Among Tests for Panel Cointegration By Westerlund, Joakim; Basher, Syed A.
  14. On the Effect of Prior Assumptions in Bayesian Model Averaging with Applications to Growth Regression By Ley, Eduardo; Steel, Mark F.J.
  15. Changes in Predictive Ability with Mixed Frequency Data By Ana Beatriz Galvão
  16. A Low-Dimension Collinearity-Robust Test for Non-linearity By Jennifer L. Castle; David F. Hendry
  17. Information Criteria for Impulse Response Function Matching Estimation of DSGE Models By Hall, Alastair; Inoue, Atsushi; Nason, James M.; Rossi, Barbara
  18. Learning Causal Relations in Multivariate Time Series Data By Chen, Pu; Chihying, Hsiao
  19. Spatial models for flood risk assessment By Marco Bee; Roberto Benedetti; Giuseppe Espa
  20. Taking a DSGE Model to the Data Meaningfully By Juselius, Katarina; Franchi, Massimo
  21. The Importance of Interest Rate Volatility in Empirical Tests of Uncovered Interest Parity By Metodij Hadzi-Vaskov; Clemens Kool
  22. Worldwide Econometrics Rankings: 1989-2005 By Badi H. Baltagi

  1. By: Silvestro Di Sanzo (Department of Economics, University Of Alicante)
    Abstract: Recent studies have shown that it is troublesome, in practice, to distinguish between long memory and nonlinear processes. It is therefore of obvious interest to capture both long memory and non-linearity in a single time series model and to assess their relative importance. In this paper we put forward such a model, combining long memory with Markov nonlinearity. A Markov Chain Monte Carlo algorithm is proposed to estimate the model and to evaluate its forecasting performance using Bayesian predictive densities. The resulting forecasts are a significant improvement over those obtained from linear long memory and Markov-switching models.
    Keywords: Markov-Switching models, Bootstrap, Gibbs Sampling
    JEL: C11 C15 C22
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:ven:wpaper:03_07&r=ecm
  2. By: Badi H. Baltagi (Center for Policy Research, Maxwell School, Syracuse University, Syracuse, NY 13244-1020); Chihwa Kao (Center for Policy Research, Maxwell School, Syracuse University, Syracuse, NY 13244-1020); Long Liu (Department of Economics, Maxwell School, Syracuse University, Syracuse, NY 13244-1020)
    Abstract: This paper studies the asymptotic properties of standard panel data estimators in a simple panel regression model with error component disturbances. Both the regressor and the remainder disturbance term are assumed to be autoregressive and possibly non-stationary. Asymptotic distributions are derived for the standard panel data estimators, including ordinary least squares, fixed effects, first-difference, and generalized least squares (GLS) estimators, when both T and n are large. We show that all the estimators have asymptotic normal distributions, with convergence rates that depend on the non-stationarity of the regressors and the remainder disturbances. Using Monte Carlo experiments, we show that the loss in efficiency of the OLS, FE and FD estimators relative to true GLS can be substantial.
    Keywords: Panel data, OLS, Fixed-effects, First-difference, GLS.
    JEL: C33
    Date: 2007–05
    URL: http://d.repec.org/n?u=RePEc:max:cprwps:93&r=ecm
  3. By: Hansen, Christian B.; McDonald, James B.; Theodossiou, Panayiotis
    Abstract: This paper discusses three families of flexible parametric probability density functions: the skewed generalized t, the exponential generalized beta of the second kind, and the inverse hyperbolic sine distributions. These families allow quite flexible modeling of the first four moments of a distribution and could be considered in modeling a wide variety of economic problems. We illustrate their use in a simple regression model with a simulation study that demonstrates that the use of the flexible distributions may result in significant efficiency gains relative to more conventional regression procedures, such as ordinary least squares or least absolute deviations regression, without suffering a large efficiency loss when errors are Gaussian.
    Keywords: Partially Adaptive Estimation, Econometric Models
    JEL: C13 C14 C15
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:zbw:ifwedp:5527&r=ecm
  4. By: Mika Meitz; Pentti Saikkonen
    Abstract: This paper studies the stability of nonlinear autoregressive models with conditionally heteroskedastic errors. We consider a nonlinear autoregression of order p (AR(p)) with the conditional variance specified as a nonlinear first order generalized autoregressive conditional heteroskedasticity (GARCH(1,1)) model. Conditions under which the model is stable in the sense that its Markov chain representation is geometrically ergodic are provided. This implies the existence of an initial distribution such that the process is strictly stationary and β-mixing. Conditions under which the stationary distribution has finite moments are also given. The results cover several nonlinear specifications recently proposed for both the conditional mean and conditional variance, and only require mild moment conditions.
    Keywords: Nonlinear Autoregression, Generalized Autoregressive Conditional Heteroskedasticity, Nonlinear Time Series Models, Geometric Ergodicity, Mixing, Strict Stationarity, Existence of Moments, Markov Models
    JEL: C10 C22
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:oxf:wpaper:328&r=ecm
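[Editor's note: the strict-stationarity condition for the plain GARCH(1,1) special case — Nelson's Lyapunov-type condition E[log(αz² + β)] < 0 — is easy to check numerically. The sketch below is illustrative only and is not taken from the paper; the function name is my own.]

```python
import numpy as np

def lyapunov_garch11(alpha, beta, reps=100_000, seed=0):
    """Monte Carlo estimate of E[log(alpha*z^2 + beta)] with z ~ N(0,1).
    Nelson's condition: a GARCH(1,1) process is strictly stationary
    iff this expectation is negative."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(reps)
    return float(np.mean(np.log(alpha * z ** 2 + beta)))
```

For example, the common calibration alpha=0.05, beta=0.90 satisfies the condition, while the integrated case alpha=beta=1 violates it.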
  5. By: W. Robert Reed (University of Canterbury); Haichun Ye
    Abstract: This study employs Monte Carlo experiments to evaluate the performances of a number of common panel data estimators when serial correlation and cross-sectional dependence are both present. It focuses on fixed effects models with less than 100 cross-sectional units and between 10 and 25 time periods (such as are commonly employed in empirical growth studies). Estimator performance is compared on two dimensions: (i) root mean square error and (ii) accuracy of estimated confidence intervals. An innovation of our study is that our simulated panel data sets are designed to look like “real-world” panel data. We find large differences in the performances of the respective estimators. Further, estimators that perform well on efficiency grounds may perform poorly when estimating confidence intervals, and vice versa. Our experimental results form the basis for a set of estimator recommendations. These are applied to “out of sample” simulated panel data sets and found to perform well.
    Keywords: Panel Data estimation; Monte Carlo analysis; FGLS; PCSE; Groupwise Heteroscedasticity; Serial Correlation; Cross-sectional Dependence; Stata; EViews
    JEL: C23 C15
    Date: 2007–04–30
    URL: http://d.repec.org/n?u=RePEc:cbt:econwp:07/01&r=ecm
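[Editor's note: the Monte Carlo design in this item can be sketched in a few lines. The code below is a minimal illustrative harness — a simple fixed-effects DGP with AR(1) errors and the within estimator's RMSE — not the authors' calibrated "real-world" design; all function names are my own.]

```python
import numpy as np

def simulate_panel(n=20, t=15, beta=1.0, rho=0.5, seed=None):
    """Fixed-effects panel DGP with AR(1) errors (illustrative only)."""
    rng = np.random.default_rng(seed)
    alpha = rng.normal(size=(n, 1))                      # individual effects
    x = rng.normal(size=(n, t))
    e = np.empty((n, t))
    e[:, 0] = rng.normal(scale=1.0 / np.sqrt(1.0 - rho ** 2), size=n)
    for s in range(1, t):                                # AR(1) error recursion
        e[:, s] = rho * e[:, s - 1] + rng.normal(size=n)
    return x, alpha + beta * x + e

def within_estimator(x, y):
    """Fixed-effects (within) estimator of the slope."""
    xd = x - x.mean(axis=1, keepdims=True)
    yd = y - y.mean(axis=1, keepdims=True)
    return float((xd * yd).sum() / (xd * xd).sum())

def mc_rmse(reps=300, beta=1.0, **kw):
    """Root mean square error of the within estimator over `reps` draws."""
    est = np.array([within_estimator(*simulate_panel(beta=beta, seed=r, **kw))
                    for r in range(reps)])
    return float(np.sqrt(np.mean((est - beta) ** 2)))
```

Extending this harness with cross-sectionally dependent errors and alternative estimators (FGLS, PCSE) reproduces the kind of comparison the paper carries out.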
  6. By: Ashton de Silva; Rob J. Hyndman; Ralph D. Snyder
    Abstract: The vector innovation structural time series framework is proposed as a way of modelling a set of related time series. Like all multi-series approaches, the aim is to exploit potential inter-series dependencies to improve the fit and forecasts. A key feature of the framework is that the series are decomposed into common components such as trend and seasonal effects. Equations that describe the evolution of these components through time are used as the sole way of representing the inter-temporal dependencies. The approach is illustrated on a bivariate data set comprising Australian exchange rates of the UK pound and US dollar. Its forecasting capacity is compared to other common single- and multi-series approaches in an experiment using time series from a large macroeconomic database.
    Keywords: Vector innovation structural time series, state space model, multivariate time series, exponential smoothing, forecast comparison, vector autoregression.
    JEL: C32 C51 C53
    Date: 2007–05
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2007-3&r=ecm
  7. By: Adhya Sumanta; Banerjee Tathagata; Chattopadhyay Gaurangadeb
    Abstract: We consider the estimation of finite population proportions of categorical survey responses obtained by probability sampling. The customary design-based estimator does not make use of the auxiliary data available for all the population units at the estimation stage. We adopt a model-based predictive approach to incorporate this information and make the estimates more efficient. In the first part of our paper we consider a multinomial logit type model in which the logit function is a known parametric function of the covariates. We then use it for the prediction of non-sampled responses. These predictions, together with the sampled responses, are used to obtain the estimates of the proportions. The asymptotic biases and variances of these estimators are obtained. The main drawback of this approach is that, being parametric, the model may suffer from misspecification and thus lose its efficiency advantage over the usual design-based estimates. To overcome this drawback, in the next part of this paper we replace the multinomial logit type model by a nonparametric model using recently developed random coefficients splines models. Finally, we carry out a simulation study. It shows that the nonparametric approach may lead to an appreciable improvement over both parametric and design-based approaches when the regression function is quite different from multinomial logit.
    Keywords: Auxiliary information, Model-based inference, Finite population estimation, Multinomial logit, Random coefficients splines models, Laplace approximation
    Date: 2007–05–14
    URL: http://d.repec.org/n?u=RePEc:iim:iimawp:2007-05-07&r=ecm
  8. By: Mika Meitz; Pentti Saikkonen
    Abstract: This paper studies a class of Markov models which consist of two components. Typically, one of the components is observable and the other is unobservable or 'hidden'. Conditions under which geometric ergodicity of the unobservable component is inherited by the joint process formed of the two components are given. This implies the existence of initial values such that the joint process is strictly stationary and β-mixing. In addition to this, conditions for the existence of moments are also obtained and extensions to the case of nonstationary initial values are provided. All these results are applied to a general model which includes as special cases various first order generalized autoregressive conditional heteroskedasticity (GARCH) and autoregressive conditional duration (ACD) models with possibly complicated non-linear structures. The results only require mild moment assumptions and in some cases provide necessary and sufficient conditions for geometric ergodicity.
    Keywords: Generalized Autoregressive Conditional Heteroskedasticity, Autoregressive Conditional Duration, GARCH-in-mean, Nonlinear Time Series Models, Geometric Ergodicity, Mixing, Strict Stationarity, Existence of Moments, Markov Models
    JEL: C10 C22
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:oxf:wpaper:327&r=ecm
  9. By: Wagner, Martin (Department of Economics and Finance, Institute for Advanced Studies, Vienna, Austria); Hlouskova, Jaroslava (Department of Economics and Finance, Institute for Advanced Studies, Vienna, Austria)
    Abstract: This paper presents results concerning the performance of both single equation and system panel cointegration tests and estimators. The study considers the tests developed in Pedroni (1999, 2004), Westerlund (2005), Larsson, Lyhagen, and Löthgren (2001) and Breitung (2005); and the estimators developed in Phillips and Moon (1999), Pedroni (2000), Kao and Chiang (2000), Mark and Sul (2003), Pedroni (2001) and Breitung (2005). We study the impact of stable autoregressive roots approaching the unit circle, of I(2) components, of short-run cross-sectional correlation and of cross-unit cointegration on the performance of the tests and estimators. The data are simulated from three-dimensional individual specific VAR systems with cointegrating ranks varying from zero to two for fourteen different panel dimensions. The usual specifications of deterministic components are considered.
    Keywords: Cross-sectional dependence, estimator, panel cointegration, simulation study, test
    JEL: C12 C15 C23
    Date: 2007–05
    URL: http://d.repec.org/n?u=RePEc:ihs:ihsesp:210&r=ecm
  10. By: Pagan, A.; Pesaran, M.H.
    Abstract: This paper considers the implications of the permanent/transitory decomposition of shocks for identification of structural models in the general case where the model might contain more than one permanent structural shock. It provides a simple and intuitive generalization of the influential work of Blanchard and Quah (1989), and shows that structural equations for which there are known permanent shocks must have no error correction terms present in them, thereby freeing up the latter to be used as instruments in estimating their parameters. The proposed approach is illustrated by a re-examination of the identification scheme used in a monetary model by Wickens and Motta (2001), and in a well known paper by Gali (1992) which deals with the construction of an IS-LM model with supply-side effects. We show that the latter imposes more short-run restrictions than are needed because of a failure to fully utilize the cointegration information.
    Keywords: Permanent shocks, structural identification, error correction models, IS-LM models.
    JEL: C30 C32 E10
    Date: 2007–01
    URL: http://d.repec.org/n?u=RePEc:cam:camdae:0662&r=ecm
  11. By: Busetti, F.; Harvey, A.
    Abstract: Quantiles provide a comprehensive description of the properties of a variable and tracking changes in quantiles over time using signal extraction methods can be informative. It is shown here how stationarity tests can be generalized to test the null hypothesis that a particular quantile is constant over time by using weighted indicators. Corresponding tests based on expectiles are also proposed; these might be expected to be more powerful for distributions that are not heavy-tailed. Tests for changing dispersion and asymmetry may be based on contrasts between particular quantiles or expectiles. We report Monte Carlo experiments investigating the effectiveness of the proposed tests and then move on to consider how to test for relative time invariance, based on residuals from fitting a time-varying level or trend. Empirical examples, using stock returns and U.S. inflation, provide an indication of the practical importance of the tests.
    Keywords: Dispersion; expectiles; quantiles; skewness; stationarity tests; stochastic volatility; value at risk.
    JEL: C12 C22
    Date: 2007–03
    URL: http://d.repec.org/n?u=RePEc:cam:camdae:0657&r=ecm
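[Editor's note: for readers unfamiliar with expectiles, the tau-expectile is the minimizer of an asymmetrically weighted squared loss, and the fixed-sample version is easy to compute by iteratively reweighted least squares. The sketch below is illustrative only and is not from the paper; the function name is my own.]

```python
import numpy as np

def expectile(y, tau, tol=1e-10, max_iter=200):
    """Sample tau-expectile by iteratively reweighted least squares.
    Solves the first-order condition sum_i |tau - 1{y_i < m}| (y_i - m) = 0."""
    y = np.asarray(y, dtype=float)
    m = y.mean()                                   # tau = 0.5 gives the mean
    for _ in range(max_iter):
        w = np.where(y < m, 1.0 - tau, tau)        # asymmetric weights
        m_new = np.sum(w * y) / np.sum(w)
        if abs(m_new - m) < tol:
            break
        m = m_new
    return float(m)
```

As expected, the 0.5-expectile coincides with the sample mean, and expectiles are monotone in tau.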
  12. By: DeRossi, G.; Harvey, A.
    Abstract: A time-varying quantile can be fitted to a sequence of observations by formulating a time series model for the corresponding population quantile and iteratively applying a suitably modified state space signal extraction algorithm. It is shown that such time-varying quantiles satisfy the defining property of fixed quantiles in having the appropriate number of observations above and below. Expectiles are similar to quantiles except that they are defined by tail expectations. Like quantiles, time-varying expectiles can be estimated by a state space signal extraction algorithm and they satisfy properties that generalize the moment conditions associated with fixed expectiles. Time-varying quantiles and expectiles provide information on various aspects of a time series, such as dispersion and asymmetry, while estimates at the end of the series provide the basis for forecasting. Because the state space form can handle irregularly spaced observations, the proposed algorithms can be easily adapted to provide a viable means of computing spline-based non-parametric quantile and expectile regressions.
    Keywords: Asymmetric least squares; cubic splines; dispersion; non-parametric regression; quantile regression; signal extraction; state space smoother.
    JEL: C14 C22
    Date: 2007–02
    URL: http://d.repec.org/n?u=RePEc:cam:camdae:0660&r=ecm
  13. By: Westerlund, Joakim; Basher, Syed A.
    Abstract: In this paper, we study the effect that different serial correlation adjustment methods can have on panel cointegration testing. As an example, we consider the very popular tests developed by Pedroni (1999, 2004). Results based on both simulated and real data suggest that different adjustment methods can lead to significant variations in test outcome, and thus also in the conclusions.
    Keywords: Panel Data; Cointegration Testing; Parametric and Semiparametric Methods.
    JEL: C32 C33 C14 C15
    Date: 2007–05–16
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:3261&r=ecm
  14. By: Ley, Eduardo; Steel, Mark F.J.
    Abstract: We consider the problem of variable selection in linear regression models. Bayesian model averaging has become an important tool in empirical settings with large numbers of potential regressors and relatively limited numbers of observations. We examine the effect of a variety of prior assumptions on the inference concerning model size, posterior inclusion probabilities of regressors and on predictive performance. We illustrate these issues in the context of cross-country growth regressions using three datasets with 41 to 67 potential drivers of growth and 72 to 93 observations. Finally, we recommend priors for use in this and related contexts.
    Keywords: Model size; Model uncertainty; Posterior odds; Prediction; Prior odds; Robustness
    JEL: C11 O47
    Date: 2007–05–12
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:3214&r=ecm
  15. By: Ana Beatriz Galvão (Queen Mary, University of London)
    Abstract: This paper proposes a new regression model – a smooth transition mixed data sampling (STMIDAS) approach – that captures recurrent changes in the ability of a high-frequency variable to predict a low-frequency variable. The STMIDAS regression is employed to test for changes in the ability of financial variables to forecast US output growth. The estimation of the optimal weights for aggregating weekly data inside the quarter improves the measurement of the predictive ability of the yield curve slope for output growth. Allowing for changes in the impact of the short rate and stock returns on future growth is decisive for finding in-sample and out-of-sample evidence of their predictive ability at horizons longer than one year.
    Keywords: Smooth transition, MIDAS, Predictive ability, Asset prices, Output growth
    JEL: C22 C53 E44
    Date: 2007–05
    URL: http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp595&r=ecm
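[Editor's note: the core MIDAS device — aggregating high-frequency observations with a parsimonious weight function, commonly the exponential Almon polynomial — can be sketched as follows. This is a generic illustration, not the paper's STMIDAS specification; the function names are my own.]

```python
import numpy as np

def exp_almon_weights(k, theta1, theta2):
    """Exponential Almon lag weights used in MIDAS regressions:
    w_j proportional to exp(theta1*j + theta2*j^2), normalized to sum to one."""
    j = np.arange(1, k + 1, dtype=float)
    w = np.exp(theta1 * j + theta2 * j ** 2)
    return w / w.sum()

def midas_aggregate(x_high, k, theta1, theta2):
    """Collapse the last k high-frequency observations into one regressor."""
    w = exp_almon_weights(k, theta1, theta2)
    return float(w @ np.asarray(x_high, dtype=float)[-k:])
```

With theta1 = theta2 = 0 the weights are flat (a simple average); a negative theta2 puts more weight on the most recent observations, and the two parameters are estimated jointly with the regression slope.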
  16. By: Jennifer L. Castle; David F. Hendry
    Abstract: A new test for non-linearity is developed using weighted combinations of regressor powers based on the eigenvectors of the variance-covariance matrix. The test extends the ingenious test for heteroskedasticity proposed by White (1980), but both circumvents problems of high dimensionality and collinearity, and allows inclusion of cubic functions to ensure power against asymmetry or skewness. A Monte Carlo analysis compares the performance of the test to the optimal infeasible test and to a variant of White's test. The relative performance of the test is encouraging: the test has the appropriate size and has high power in many situations. Furthermore, collinearity between regressors can increase the power of the test.
    Keywords: Functional Form Test, Non-linearity, Collinearity
    JEL: C51 C52
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:oxf:wpaper:326&r=ecm
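[Editor's note: the dimension-reduction idea — testing powers of principal components of the regressors rather than all cross-products — can be sketched as an F-type statistic. This is my own illustrative construction in the spirit of the paper, not the authors' exact test.]

```python
import numpy as np

def nonlinearity_fstat(x, y):
    """F-type statistic for a low-dimensional White-type test: regress y on x,
    then jointly test squares and cubes of the principal components of x."""
    n, k = x.shape
    xc = x - x.mean(axis=0)
    _, _, vt = np.linalg.svd(xc, full_matrices=False)    # principal directions
    z = xc @ vt.T                                        # principal components
    base = np.column_stack([np.ones(n), x])
    aug = np.column_stack([base, z ** 2, z ** 3])        # 2k added regressors
    def rss(a):
        coef = np.linalg.lstsq(a, y, rcond=None)[0]
        return float(np.sum((y - a @ coef) ** 2))
    q = 2 * k
    df = n - aug.shape[1]
    return ((rss(base) - rss(aug)) / q) / (rss(aug) / df)
```

Only 2k extra regressors are needed regardless of how many cross-products a full White-style expansion would generate, which is the source of the robustness to high dimensionality.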
  17. By: Hall, Alastair; Inoue, Atsushi; Nason, James M.; Rossi, Barbara
    Abstract: We propose a new Information Criterion for Impulse Response Function Matching estimators of the structural parameters of macroeconomic models. The main advantage of our procedure is that it allows the researcher to select the impulse responses that are most informative about the deep parameters, therefore reducing the bias and improving the efficiency of the estimates of the model's parameters. We show that our method substantially changes key parameter estimates of representative Dynamic Stochastic General Equilibrium models, thus reconciling their empirical results with the existing literature. Our criterion is general enough to apply to impulse responses estimated by VARs, local projections, as well as simulation methods.
    Keywords: impulse responses, matching, information criteria, DSGE models, structural macroeconomic models, estimation
    JEL: C32 E47 C52 C53
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:duk:dukeec:07-04&r=ecm
  18. By: Chen, Pu; Chihying, Hsiao
    Abstract: Applying a probabilistic causal approach, we define a class of time series causal models (TSCM) based on stationary Bayesian networks. A TSCM can be seen as a structural VAR identified by the causal relations among the variables. We classify TSCMs into observationally equivalent classes by providing a necessary and sufficient condition for the observational equivalence. Applying an automated learning algorithm, we are able to consistently identify the data-generating causal structure up to the class of observational equivalence. In this way we can characterize the empirically testable causal orders among variables based on their observed time series data. It is shown that while an unconstrained VAR model does not imply any causal orders in the variables, a TSCM that contains some empirically testable causal orders implies a restricted SVAR model. We also discuss the relation between the probabilistic causal concept presented in TSCMs and the concept of Granger causality. It is demonstrated in an application example that this methodology can be used to construct structural equations with causal interpretations.
    Keywords: Automated Learning, Bayesian Network, Inferred Causation, VAR, Wage-Price Spiral
    JEL: C1
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:zbw:ifwedp:5529&r=ecm
  19. By: Marco Bee; Roberto Benedetti; Giuseppe Espa
    Abstract: The problem of computing risk measures associated with flood events is extremely important, not only from the point of view of civil protection systems but also because municipalities need to insure against the damages. In this work we propose, in the framework of an integrated strategy, an operating solution which merges in a conditional approach the information usually available in this setup. First we use a Logistic Auto-Logistic (LAM) model for the estimation of the univariate conditional probabilities of flood events. This approach has two fundamental advantages: it allows us to incorporate auxiliary information and does not require the target variables to be independent. Then we simulate the joint distribution of flood events by means of the Gibbs sampler. Finally we propose an algorithm to increase ex post the spatial autocorrelation of the simulated events. The methodology is shown to be effective by means of an application to the estimation of the flood probability of Italian hydrographic regions.
    Keywords: Flood Risk, Conditional Approach, LAM Model, Pseudo-Maximum Likelihood Estimation, Spatial Autocorrelation, Gibbs Sampler.
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:trn:utwpde:0710&r=ecm
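[Editor's note: Gibbs sampling from a binary auto-logistic model is conceptually simple — each site is resampled from its conditional logistic probability given its neighbours. The sketch below uses a toy grid with wrap-around neighbourhoods; it is illustrative only and does not reproduce the paper's LAM specification or covariates.]

```python
import numpy as np

def gibbs_autologistic(n=20, alpha=-1.0, eta=0.5, sweeps=200, seed=0):
    """Gibbs sampler for a binary auto-logistic model on an n x n torus:
    P(z_ij = 1 | neighbours) = logistic(alpha + eta * sum of 4 neighbours)."""
    rng = np.random.default_rng(seed)
    z = rng.integers(0, 2, size=(n, n))
    for _ in range(sweeps):
        for i in range(n):
            for j in range(n):
                s = (z[(i - 1) % n, j] + z[(i + 1) % n, j]
                     + z[i, (j - 1) % n] + z[i, (j + 1) % n])
                p = 1.0 / (1.0 + np.exp(-(alpha + eta * s)))
                z[i, j] = rng.random() < p       # resample site from its full conditional
    return z
```

A positive eta induces positive spatial autocorrelation in the simulated field; in the paper's setting the intercept would additionally depend on site-level auxiliary covariates.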
  20. By: Juselius, Katarina; Franchi, Massimo
    Abstract: All economists say that they want to take their model to the data. But with incomplete and highly imperfect data, doing so is difficult and requires carefully matching the assumptions of the model with the statistical properties of the data. The cointegrated VAR (CVAR) offers a way of doing so. In this paper we outline a method for translating the assumptions underlying a DSGE model into a set of testable assumptions on a cointegrated VAR model and illustrate the ideas with the RBC model in Ireland (2004). Accounting for unit roots (near unit roots) in the model is shown to provide a powerful robustification of the statistical and economic inference about persistent and less persistent movements in the data. We propose that all basic assumptions underlying the theory model should be formulated as a set of testable hypotheses on the long-run structure of a CVAR model, a so-called 'theory-consistent hypothetical scenario'. The advantage of such a scenario is that it forces us to formulate all testable implications of the basic hypotheses underlying a theory model. We demonstrate that most assumptions underlying the DSGE model and, hence, the RBC model are rejected when properly tested. Leaving the RBC model aside, we then report a structured CVAR analysis that summarizes the main features of the data in terms of long-run relations and common stochastic trends. We argue that structuring the data in this way offers a number of 'sophisticated' stylized facts that a theory model has to replicate in order to claim empirical relevance.
    Keywords: DSGE, RBC, cointegrated VAR
    JEL: C32 C52 E32
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:zbw:ifwedp:5520&r=ecm
  21. By: Metodij Hadzi-Vaskov; Clemens Kool
    Abstract: Uncovered interest rate parity provides a crucial theoretical underpinning for many models in international finance and international monetary economics. Though theoretically sound, this concept has not been supported by the empirical evidence. Typically, econometric tests not only reject the null hypothesis, but also find significant slope coefficients with the wrong sign. Following the approach employed in Kool and Thornton (2004), we show that the empirical procedure conventionally used to test for UIP may produce biased slope coefficients if the true data-generating process slightly differs from the theoretically expected one. Using monthly data for ten industrial countries during the period 1975-2004, we estimate the UIP relation for all possible bilateral country pairs for each of the six five-year sub-periods. The evidence supports the biasedness hypothesis: when the interest rate volatility of the anchor country is very high (very low), this estimation procedure reports significantly higher (lower) slope coefficients.
    Keywords: International Financial Markets, Estimation Bias, Exchange Rate Volatility
    JEL: F31 G15 C5
    Date: 2006–10
    URL: http://d.repec.org/n?u=RePEc:use:tkiwps:0616&r=ecm
  22. By: Badi H. Baltagi (Center for Policy Research, Maxwell School, Syracuse University, Syracuse, NY 13244-1020)
    Abstract: This paper updates Baltagi's (2003, Econometric Theory 19, 165-224) rankings of academic institutions by publication activity in econometrics from 1989-1999 to 1989-2005. This ranking is based on 16 leading international journals that publish econometrics articles. It is compared with the prior rankings by Hall (1980, 1987) for the period 1980-1988. In addition, a list of the top 150 individual producers of econometrics in these 16 journals over this 17-year period is provided. This is done for theoretical econometrics as well as all contributions in econometrics. Sensitivity analysis is provided using (i) alternative weighting factors given to the 16 journals taking into account impact citations, excluding self-citations, size and age of the journal, (ii) alternative time intervals, namely, (2000-2005), (1995-2005), and (1989-2005), (iii) alternative ranking using the number of articles published in these journals, (iv) separate rankings for both institutions and individuals by journal, (v) rankings for institutions and individuals based on publications in three core econometrics journals. This paper is forthcoming in Econometric Theory.
    Keywords: Econometrics rankings, Econometrics journals, Econometric theory, Applied econometrics.
    JEL: C01
    Date: 2007–05
    URL: http://d.repec.org/n?u=RePEc:max:cprwps:94&r=ecm

This nep-ecm issue is ©2007 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.