nep-ecm New Economics Papers
on Econometrics
Issue of 2006‒10‒07
seventeen papers chosen by
Sune Karlsson
Orebro University

  1. Tests of Independence in Separable Econometric Models: Theory and Application By Donald J. Brown; Rahul Deb; Marten H. Wegkamp
  2. Wake me up before you GO-GARCH By H. Peter Boswijk; Roy van der Weide
  3. Taking the temperature – forecasting GDP growth for mainland China By Curran, Declan; Funke, Michael
  4. Panels with Nonstationary Multifactor Error Structures By George Kapetanios; M. Hashem Pesaran; Takashi Yamagata
  5. Spatial Time-Series Modeling: A review of the proposed methodologies By Yiannis Kamarianakis; Poulicos Prastacos
  6. The Uniformly Most Powerful Invariant Test for the Shoulder Condition in Point Transect Sampling By Piero Quatto; Riccardo Borgoni
  8. Prediction in the Panel Data Model with Spatial Correlation: The Case of Liquor By Badi H. Baltagi; Dong Li
  9. Testing for Cointegrating Rank via Model Selection: Evidence from 165 Data Sets By Badi H. Baltagi; Zijun Wang
  10. Forecast errors and the macroeconomy — a non-linear relationship? By Ulrich Fritsche; Joerg Doepke
  11. On the Specification of Propensity Scores: with an Application to the WTO-Environment Debate By Daniel Millimet; Rusty Tchernis
  12. On the evaluation of the cost efficiency of nonresponse rate reduction efforts - some general considerations By Tångdahl, Sara
  13. Time-Series Models in Marketing By Dekimpe, M.G.; Franses, Ph.H.B.F.; Hanssens, D.M.; Naik, P.
  14. Scenario Based Principal Component Value-at-Risk: an Application to Italian Banks' Interest Rate Risk Exposure By Roberta Fiori; Simonetta Iannotti
  15. Measuring Technical Efficiency under Factor Non-Substitution: A Stochastic von Liebig Crop Response Model By Margarita Genius; Maria Mavraki; Vangelis Tzouvelekas
  16. A Large-Scale Validation Study of Measurement Errors in Longitudinal Survey Data By Nicolai Kristensen; Niels Westergaard-Nielsen
  17. Why so (Un)certain? Calibration in Contingent Valuation Using the Certainty Approach By Svensson, Mikael

  1. By: Donald J. Brown (Cowles Foundation, Yale University); Rahul Deb (Dept. of Economics, Yale University); Marten H. Wegkamp (Dept. of Statistics, Florida State University)
    Abstract: A common stochastic restriction in econometric models separable in the latent variables is the assumption of stochastic independence between the unobserved and observed exogenous variables. Both simple and composite tests of this assumption are derived from properties of independence empirical processes and the consistency of these tests is established. As an application, we simulate estimation of a random quasilinear utility function, where we apply our tests of independence.
    Keywords: Cramer–von Mises distance, Empirical independence processes, Random utility models, Semiparametric econometric models, Specification test of independence
    JEL: C12 C13 C14
    Date: 2003–01
  2. By: H. Peter Boswijk (Universiteit van Amsterdam); Roy van der Weide (World Bank)
    Abstract: In this paper we present a new three-step approach to the estimation of Generalized Orthogonal GARCH (GO-GARCH) models, as proposed by van der Weide (2002). The approach only requires (non-linear) least-squares methods in combination with univariate GARCH estimation, and as such is computationally attractive, especially in larger-dimensional systems, where a full likelihood optimization is often infeasible. The effectiveness of the method is investigated using Monte Carlo simulations as well as a number of empirical applications.
    Keywords: Multivariate GARCH; Non-Linear Least-Squares; Maximum Likelihood
    JEL: C13 C32
    Date: 2006–09–19
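The linking-map idea behind GO-GARCH can be sketched in a few lines: whiten the observed returns with the inverse square root of their unconditional covariance, then fit univariate GARCH models to the implied factors. The sketch below uses made-up data and fixes the orthogonal rotation to the identity (the O-GARCH special case); it shows only the whitening step and is an illustration, not the paper's three-step estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate two independent "factors" with different scales, then mix them
# through a hypothetical linking map A (all numbers invented).
T = 4000
f = rng.standard_normal((T, 2)) * np.array([1.0, 0.5])
A = np.array([[1.0, 0.4],
              [0.3, 1.0]])
y = f @ A.T                          # observed returns y_t = A f_t

# First step of the recipe: estimate the unconditional covariance and
# whiten the data with its inverse square root.
S = np.cov(y, rowvar=False)
eigval, eigvec = np.linalg.eigh(S)
whitener = eigvec @ np.diag(eigval ** -0.5) @ eigvec.T

fhat = y @ whitener.T                # implied factors

# The implied factors are unconditionally uncorrelated with unit variance;
# univariate GARCH models would be fitted to each column next.
C = np.cov(fhat, rowvar=False)
print(np.round(C, 2))                # ~ identity matrix
```

In the full GO-GARCH model the remaining orthogonal rotation is what the paper's non-linear least-squares step estimates; the whitening above is the part shared with plain O-GARCH.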
  3. By: Curran, Declan; Funke, Michael (Hamburg University, Germany)
    Abstract: We present a new composite leading indicator of economic activity in mainland China, estimated using a dynamic factor model. Our leading indicator is constructed from three series: exports, a real estate climate index, and the Shanghai Stock Exchange index. These series are found to share a common, unobservable element from which our indicator can be identified. This indicator is then incorporated into out-of-sample one-step-ahead forecasts of Chinese GDP growth. Recursive out-of-sample accuracy tests indicate that the small-scale factor model approach leads to a successful representation of the sample data and provides an appropriate tool for forecasting Chinese business conditions.
    Keywords: forecasting; China; leading indicator; factor model; growth cycles
    JEL: C32 C52 E32 E37
    Date: 2006–06–02
  4. By: George Kapetanios; M. Hashem Pesaran; Takashi Yamagata
    Abstract: The presence of cross-sectionally correlated error terms invalidates much of the inferential theory for panel data models. Recent work by Pesaran (2006) suggests a method that makes use of cross-sectional averages to provide valid inference for stationary panel regressions with a multifactor error structure. This paper extends this work and examines the important case where the unobserved common factors follow unit root processes and could be cointegrated. It is found that the presence of unit roots does not affect most theoretical results, which continue to hold irrespective of the integration and cointegration properties of the unobserved factors. This finding is further supported for small samples via an extensive Monte Carlo study. In particular, the results of the Monte Carlo study suggest that the cross-sectional average based method is robust to a wide variety of data generation processes and has lower biases than all of the alternative estimation methods considered in the paper.
    Keywords: cross section dependence, large panels, unit roots, principal components, common correlated effects
    JEL: C12 C13 C33
    Date: 2006
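The cross-sectional average device can be illustrated in a small simulation: averages of y and x across units proxy the unobserved common factor, and projecting them out of each unit's regression yields a consistent pooled estimate. All numbers below are invented for illustration; this is a sketch of the idea, not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated panel with one unobserved common factor entering both the
# errors and the regressor (all settings illustrative).
N, T = 50, 100
f = rng.standard_normal(T)                    # common factor
gam = rng.uniform(0.5, 1.5, N)                # factor loadings
beta = 1.0
x = rng.standard_normal((N, T)) + f           # regressor loads on the factor
y = beta * x + gam[:, None] * f + 0.5 * rng.standard_normal((N, T))

# Common Correlated Effects idea: augment each unit's regression with the
# cross-sectional averages of y and x, which proxy the unobserved factor.
ybar, xbar = y.mean(axis=0), x.mean(axis=0)
Z = np.column_stack([np.ones(T), ybar, xbar])
M = np.eye(T) - Z @ np.linalg.pinv(Z)         # projects out the proxies

num = den = 0.0
for i in range(N):
    num += x[i] @ M @ y[i]
    den += x[i] @ M @ x[i]

print(round(num / den, 2))   # pooled estimate, close to beta = 1.0
```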
  5. By: Yiannis Kamarianakis (Department of Economics, University of Crete); Poulicos Prastacos (Regional Analysis Division, Institute of Applied and Computational Mathematics, Foundation for Research and Technology-Hellas)
    Abstract: This paper discusses three modelling techniques, which apply to multiple time series data that correspond to different spatial locations (spatial time series). The first two methods, namely the Space-Time ARIMA (STARIMA) and the Bayesian Vector Autoregressive (BVAR) model with spatial priors, apply when interest lies in the spatio-temporal evolution of a single variable. The former is better suited for applications of large spatial and temporal dimension whereas the latter can be realistically performed when the number of locations of the study is rather small. Next, we consider models that aim to describe relationships between variables with a spatio-temporal reference and discuss the general class of dynamic space-time models in the framework presented by Elhorst (2001). Each model class is introduced through a motivating application.
    Keywords: spatial time-series, space-time models, STARIMA, Bayesian Vector Autoregressions
    Date: 2006–03
  6. By: Piero Quatto; Riccardo Borgoni
    Abstract: Estimating population abundance is of primary interest in wildlife population studies. Point transect sampling is a well established methodology for this purpose. The usual approach for estimating the density or the size of the population of interest is to assume a particular model for the detection function (the conditional probability of detecting an animal given that it is at a given distance from the observer). The two most popular models for this function are the half-normal model and the negative exponential model. However, it appears that the estimates are extremely sensitive to the shape of the detection function, particularly to the so-called shoulder condition, which ensures that an animal is almost certain to be detected if it is at a small distance from the observer. The half-normal model satisfies this condition whereas the negative exponential does not. Therefore, testing whether such a hypothesis is consistent with the data at hand should be a primary concern in every study concerning the estimation of animal abundance. In this paper we propose a test for this purpose. This is the uniformly most powerful test in the class of scale invariant tests. The asymptotic distribution of the test statistic is calculated under both the half-normal and negative exponential models, while the critical values and the power are tabulated via Monte Carlo simulations for small samples. Finally, the procedure is applied to two datasets of chipping sparrows collected at the Rocky Mountain Bird Observatory, Colorado.
    Keywords: Point Transect Sampling, Shoulder Condition, Uniformly Most Powerful Invariant Test, Asymptotic Critical Values, Monte Carlo Critical Values
    JEL: C12
    Date: 2006–09
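The shoulder condition itself is easy to check numerically: it requires g'(0) = 0 for the detection function g. A minimal sketch with illustrative scale parameters:

```python
import numpy as np

# Detection functions g(r): probability of detecting an animal at
# distance r. Scale parameters are illustrative, not from the paper.
sigma, lam = 1.0, 1.0
half_normal = lambda r: np.exp(-r**2 / (2 * sigma**2))
neg_exponential = lambda r: np.exp(-r / lam)

# Shoulder condition: g'(0) = 0, i.e. detection stays near-certain close
# to the observer. Check with a one-sided finite difference at r = 0.
h = 1e-6
slope_hn = (half_normal(h) - half_normal(0.0)) / h
slope_ne = (neg_exponential(h) - neg_exponential(0.0)) / h

print(slope_hn)   # ~0: the half-normal model satisfies the condition
print(slope_ne)   # ~-1/lam: the negative exponential does not
```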
  7. By: Yiannis Kamarianakis (Regional Analysis Division, Institute of Applied and Computational Mathematics, Foundation for Research and Technology-Hellas)
    Abstract: Although the number of datasets containing long economic time series with a spatial reference has increased significantly over the years, integrated techniques that aim to describe the temporal evolution of the series while accounting for the location of the measurements and their neighboring relations remain sparse in the econometric literature. This paper shows how the Hierarchical Bayesian Space Time model presented by Wikle, Berliner and Cressie (Environmental and Ecological Statistics, 1998) for temperature modeling can be tailored to model relationships between variables that have both a spatial and a temporal reference. The first stage of the hierarchical model includes a set of regression equations (each one corresponding to a different location) coupled with a dynamic space-time process that accounts for the unexplained variation. At the second stage, the regression parameters are endowed with priors that reflect the neighboring relations of the locations under study; moreover, the spatio-temporal dependencies in the dynamic process for the unexplained variation are established. Putting hyperpriors on the previous stages' parameters completes the Bayesian formulation, which can be implemented in a Markov Chain Monte Carlo framework. The proposed modeling strategy is useful in quantifying the temporal evolution of relations between economic variables, and this quantification may yield improved forecasting accuracy.
    Keywords: space-time models
    Date: 2006–03
  8. By: Badi H. Baltagi (Center for Policy Research, Maxwell School, Syracuse University, Syracuse, NY 13244-1020); Dong Li
    Abstract: This paper considers the problem of prediction in a panel data regression model with spatial autocorrelation in the context of a simple demand equation for liquor. This is based on a panel of 43 states over the period 1965-1994. The spatial autocorrelation due to neighboring states and the individual heterogeneity across states are taken explicitly into account. We compare the performance of several predictors of the states' demand for liquor for one year and five years ahead. The estimators whose predictions are compared include OLS, fixed effects ignoring spatial correlation, fixed effects with spatial correlation, the random effects GLS estimator ignoring spatial correlation, and the random effects estimator accounting for spatial correlation. Based on RMSE forecast performance, estimators that take into account spatial correlation and heterogeneity across the states perform best for one-year-ahead forecasts. However, for two to five years ahead, estimators that take into account the heterogeneity across the states yield the best forecasts.
    Keywords: prediction, spatial correlation, panel data, liquor demand
    JEL: C21 C23 C53
    Date: 2006–07
  9. By: Badi H. Baltagi (Center for Policy Research, Maxwell School, Syracuse University, Syracuse, NY 13244-1020); Zijun Wang
    Abstract: The model selection approach has been proposed as an alternative to the popular tests for cointegration such as the residual-based ADF test and the system-based trace test. Using information criteria, we conduct cointegration tests on 165 data sets used in published studies. The empirical results demonstrate the usefulness of the model selection approach for applied researchers.
    JEL: C21 C23
    Date: 2006–07
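The model-selection principle at work is the familiar one: fit each candidate model and keep the one that minimizes an information criterion. The sketch below applies it to the simpler problem of choosing an autoregressive lag order (not the paper's cointegrating-rank setting), with simulated data; all settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate an AR(2) series; the data-generating order is what we hope
# the information criterion recovers.
T = 600
y = np.zeros(T)
for t in range(2, T):
    y[t] = 0.5 * y[t-1] - 0.3 * y[t-2] + rng.standard_normal()

def bic_ar(y, p, pmax):
    """BIC of an AR(p) fit by OLS, on a common effective sample."""
    yy = y[pmax:]
    n = len(yy)
    X = np.column_stack([np.ones(n)] + [y[pmax-k:-k] for k in range(1, p+1)])
    resid = yy - X @ np.linalg.lstsq(X, yy, rcond=None)[0]
    sigma2 = resid @ resid / n
    return n * np.log(sigma2) + (p + 1) * np.log(n)

pmax = 6
best = min(range(pmax + 1), key=lambda p: bic_ar(y, p, pmax))
print(best)   # BIC typically recovers the true order, 2
```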
  10. By: Ulrich Fritsche (Department for Economics and Politics, University of Hamburg, and DIW Berlin); Joerg Doepke (Fachhochschule Merseburg)
    Abstract: The paper analyses reasons for departures from strong rationality of growth and inflation forecasts based on annual observations from 1963 to 2004. We rely on forecasts from the joint forecast of the so-called "six leading" forecasting institutions in Germany and argue that violations of the rationality hypothesis are due to relatively few large forecast errors. These large errors are shown - based on evidence from probit models - to correlate with macroeconomic fundamentals, especially monetary factors. We test for a non-linear relation between forecast errors and macroeconomic fundamentals and find evidence for such a non-linearity for inflation forecasts.
    Keywords: forecast error evaluation, non-linearities, business cycles
    JEL: E32 E37 C52 C53
    Date: 2006–02
  11. By: Daniel Millimet (Southern Methodist University); Rusty Tchernis (Indiana University Bloomington)
    Abstract: The use of propensity score methods for program evaluation with non-experimental data typically requires the propensity score be estimated, often with a model whose specification is unknown. While theoretical results suggest that estimators utilizing more flexible propensity score specifications perform better, this has not filtered into applied research. Here, we provide Monte Carlo evidence indicating the benefits of over-specifying the propensity score when using weighting estimators, as well as using normalized weights. We illustrate these results with an application assessing the environmental effects of GATT/WTO membership. We find that membership has a mixed impact, and that under-fitting the propensity score yields misleading inference in several cases.
    Keywords: Treatment Effects, Propensity score, Specification, WTO, Environment
    JEL: C21 C52 F18
    Date: 2006–09
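The normalization at issue is the difference between Horvitz-Thompson and Hajek-type inverse-propensity weighting. A simulation sketch, with a known propensity score and a constant treatment effect of 2.0 (both invented for illustration; in practice the score would be estimated):

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated evaluation problem: treatment D depends on covariate x; the
# outcome carries a constant treatment effect of 2.0.
n = 200_000
x = rng.standard_normal(n)
e = 1.0 / (1.0 + np.exp(-x))            # true propensity score
d = (rng.uniform(size=n) < e).astype(float)
y = x + 2.0 * d + rng.standard_normal(n)

# Unnormalized (Horvitz-Thompson) IPW estimate of the average treatment effect.
ate_ht = np.mean(d * y / e) - np.mean((1 - d) * y / (1 - e))

# Normalized (Hajek) version: divide by the sum of the weights instead of n.
w1, w0 = d / e, (1 - d) / (1 - e)
ate_hajek = np.sum(w1 * y) / np.sum(w1) - np.sum(w0 * y) / np.sum(w0)

print(ate_ht, ate_hajek)   # both near the true effect of 2.0
```

The normalized weights sum to one within each treatment arm, which typically reduces the variance inflation caused by extreme propensity scores.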
  12. By: Tångdahl, Sara (Department of Business, Economics, Statistics and Informatics)
    Abstract: Virtually every survey today suffers from nonresponse to some extent. To counter this, survey administrators and researchers have a host of methods at their disposal, many of which are both expensive and time consuming. Reduction efforts, aiming at reducing the nonresponse rate, are an important part of the data collection process, but commonly also a substantial part of the available survey budget. We propose that the efficiency of the reduction efforts be evaluated in relation to the costs. In this paper we point in the direction of an evaluation procedure, using a measure of cost efficiency, that can be used in an "ideal" situation where all relevant quantities are known. It cannot be applied directly in practice, but will serve as a point of reference when practically feasible approaches are developed.
    Keywords: resource allocation; cost efficiency; evaluation; nonresponse error; data collection
    JEL: C13 C14
    Date: 2006–09–27
  13. By: Dekimpe, M.G.; Franses, Ph.H.B.F.; Hanssens, D.M.; Naik, P. (Erasmus Research Institute of Management (ERIM), RSM Erasmus University)
    Abstract: Marketing data appear in a variety of forms. An often-seen form is time-series data, like sales per month, prices over the last few years, market shares per week. Time-series data can be summarized in time-series models. In this chapter we review a few of these, focusing in particular on domains that have received considerable attention in the marketing literature. These are (1) the use of persistence modelling and (2) the use of state space models.
    Keywords: Time Series;Marketing;Persistence;State Space;
    Date: 2006–09–20
  14. By: Roberta Fiori (Banca d'Italia); Simonetta Iannotti (Banca d'Italia)
    Abstract: The paper develops a Value-at-Risk methodology to assess Italian banks' interest rate risk exposure. Using 5 years of daily data, the exposure is evaluated through a Principal Component VaR based on Monte Carlo simulation according to two different approaches (parametric and non-parametric). The main contribution of the paper is a methodology for modelling interest rate changes when the underlying risk factors are skewed and heavy-tailed. The methodology is then implemented over a one-year holding period in order to compare the results with those of the Basel II standardized approach. We find that the risk measure proposed by Basel II gives an adequate description of risk, provided that duration parameters are changed to reflect market conditions. Finally, the methodology is used to perform a stress testing analysis.
    Keywords: Interest rate risk, VAR, PCA, Non-normality, Non parametric methods
    JEL: C14 C19 G21
    Date: 2006–09
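The parametric variant of a principal-component Monte Carlo VaR can be sketched as: extract principal components of historical rate changes, simulate the leading components, map back to rates, and take a loss quantile. Everything below (tenors, sensitivities, sample sizes) is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical history of daily changes in 4 yield-curve tenors (in bp);
# a common factor plus noise stands in for real rate data.
T, k = 1250, 4
common = rng.standard_normal(T)
dr = common[:, None] * np.array([1.0, 0.9, 0.8, 0.7]) \
     + 0.3 * rng.standard_normal((T, k))

# Principal components of the rate changes.
C = np.cov(dr, rowvar=False)
eigval, eigvec = np.linalg.eigh(C)
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

# Parametric Monte Carlo: simulate the leading components as independent
# normals, map back to rate changes, and revalue a linear position.
m = 3                                        # components retained
sims = rng.standard_normal((50_000, m)) * np.sqrt(eigval[:m])
dr_sim = sims @ eigvec[:, :m].T
pv01 = np.array([100.0, 80.0, 60.0, 40.0])   # made-up sensitivities
pnl = -dr_sim @ pv01

var_99 = -np.quantile(pnl, 0.01)             # 99% one-day Value-at-Risk
print(var_99)
```

The paper's non-parametric approach would instead resample the historical component scores, which is what preserves skewness and heavy tails.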
  15. By: Margarita Genius (Department of Economics, University of Crete); Maria Mavraki (Department of Economics, University of Crete); Vangelis Tzouvelekas (Department of Economics, University of Crete)
    Abstract: The present paper develops an econometric model for measuring input-oriented technical efficiency when the underlying technology is characterized by the lack of substitution between inputs. In such instances, Farrell's radial measure of technical inefficiency is inappropriate, as it may identify a technically inefficient bundle as technically efficient. Instead, Russell's non-radial indices can adequately measure technical inefficiency in factor limitation models. To this end, a disequilibrium model augmented with a regime-specific technical inefficiency term is proposed, and its likelihood function is derived together with the computation of technical efficiency under specific distributional assumptions. The framework under which the model is proposed is the well known von Liebig hypothesis that analyses crop response to different levels of fertilizer nutrients. Application of the proposed stochastic von Liebig crop response model to the experimental data of Heady and Pesek (1954) indicates that technical inefficiency can arise for a subset of the nutrients considered.
    Date: 2006–03
  16. By: Nicolai Kristensen (Aarhus School of Business and CCP); Niels Westergaard-Nielsen (Aarhus School of Business, CCP and IZA Bonn)
    Abstract: In this paper, we analyze measurement and classification errors in several key variables, including earnings and educational attainment, in a matched sample of survey and administrative longitudinal data. The data, spanning 1994-2001 and covering all sectors in the Danish economy, are much more comprehensive than usually seen in validation studies. Measurement errors in earnings are found to be much larger than reported in previous studies limited to one single firm. Individuals who attrite from the panel report their earnings significantly less accurately than individuals who are observed throughout the entire sampling period. Furthermore, females are found to report their earnings significantly more precisely than males, part-time workers report significantly less accurately than full-time workers, and low-income workers report significantly less accurately than workers with relatively higher incomes. Classification errors in categorical variables are found to be of about the same magnitude as previously found in the literature. We analyze whether response error in one variable makes it more likely that the same respondent will report other variables with error, but do not find support for this hypothesis.
    Keywords: measurement error, classification error, validation
    JEL: J24 J31 I2 J28
    Date: 2006–09
  17. By: Svensson, Mikael (Department of Business, Economics, Statistics and Informatics)
    Abstract: Hypothetical bias has been forcefully documented in the contingent valuation method. Some authors have argued and given empirical evidence that answers can be calibrated to adjust for hypothetical bias by only including respondents that are most certain of their answers. However, this paper, using data from a CV study in Sweden eliciting willingness to pay for traffic risk reduction, shows that the most certain respondents are also significantly older. Hence, estimates based on only these answers will add another type of bias to the CV study. Even if hypothetical bias is partially or fully removed, there is evidence that mean willingness to pay should be lower for older respondents since they have fewer remaining life years.
    Keywords: Contingent Valuation; Hypothetical Bias; Calibration; Certainty Approach; Value of a Statistical Life
    JEL: D80 I18 Q51
    Date: 2006–09–27

This nep-ecm issue is ©2006 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.