nep-ecm New Economics Papers
on Econometrics
Issue of 2006‒03‒05
six papers chosen by
Sune Karlsson
Orebro University

  1. Efficient robust estimation of regression models By Cizek,Pavel
  2. Regression models and experimental designs : a tutorial for simulation analysts By Kleijnen,Jack P.C.
  3. Model Uncertainty and Bayesian Model Averaging in Vector Autoregressive Processes By Rodney W. Strachan; Herman K. van Dijk
  4. Investigating distance effects on environmental values: A choice modelling approach By Giovanni B. Concu
  5. The Integration Order of Vector Autoregressive Processes By Massimo Franchi
  6. Analysis of data from nonorthogonal multi-stratum designs By Gilmour S.G.; Goos P.

  1. By: Cizek,Pavel (Tilburg University, Center for Economic Research)
    Abstract: This paper introduces a new class of regression estimators robust to outliers, measurement errors, and other data irregularities. The estimators are based on the two-step least weighted squares method, where weights are adaptively computed using the empirical distribution function of regression residuals obtained from an initial robust fit. The asymptotic distribution of the proposed estimators is derived under general conditions, allowing for time-series applications. Further, it is shown that the breakdown point of the proposed estimators equals that of the initial robust estimate. The main contribution of the work is that the proposed two-step procedures combine several desirable properties, which different existing estimators possess separately but not jointly: asymptotic efficiency if the errors are normally distributed, a high breakdown point achieved without rejecting (trimming) observations, and independence of auxiliary tuning parameters. A Monte Carlo study shows that in most situations the two-step least weighted squares estimator outperforms both least squares and existing robust estimators in finite samples.
    Keywords: least weighted squares;linear regression;robust statistics;two-step estimation
    JEL: C13 C20 C21 C22
    Date: 2006
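The two-step scheme summarized in the abstract (an initial robust fit, then weights driven by the empirical distribution function of its residuals, then weighted least squares) can be sketched roughly as follows. This is a loose illustration only: the crude LAD initial fit and the linear weight function are my stand-ins, not the authors' estimator.

```python
import numpy as np

def two_step_lws(X, y):
    """Loose sketch of a two-step least weighted squares fit.

    Step 1: an initial robust fit (here a crude LAD fit via iteratively
    reweighted least squares, standing in for the paper's initial
    robust estimator).
    Step 2: weights assigned via the empirical distribution function of
    the absolute step-1 residuals, followed by weighted least squares.
    """
    n = len(y)
    # Step 1: LAD-type fit via IRLS, started from OLS
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(50):
        w = 1.0 / np.maximum(np.abs(y - X @ beta), 1e-6)
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    # Step 2: weights decreasing in the residual ranks
    # (an ad hoc choice; the paper computes the weighting adaptively)
    ranks = np.argsort(np.argsort(np.abs(y - X @ beta)))
    u = (ranks + 0.5) / n                  # empirical d.f. values in (0, 1)
    w = np.clip(2.0 * (1.0 - u), 0.0, 1.0)
    return np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = 1.0 + 2.0 * X[:, 1] + 0.1 * rng.normal(size=n)
y[:10] += 20.0                             # gross outliers in 5% of the sample
beta = two_step_lws(X, y)
```

On this contaminated sample the fitted coefficients stay close to the true values (1, 2), while plain OLS would be pulled away by the outliers.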
  2. By: Kleijnen,Jack P.C. (Tilburg University, Center for Economic Research)
    Abstract: This tutorial explains the basics of linear regression models, especially low-order polynomials, and the corresponding statistical designs, namely designs of resolution III, IV, and V, and Central Composite Designs (CCDs). This tutorial assumes 'white noise', which means that the residuals of the fitted linear regression model are normally, independently, and identically distributed with zero mean. The tutorial gathers statistical results that are scattered throughout the literature on mathematical statistics, and presents these results in a form that is understandable to simulation analysts.
    Keywords: metamodels;fractional factorial designs;Plackett-Burman designs;factor interactions;validation;cross-validation
    JEL: C0 C1 C9 C15 C44
    Date: 2006
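As a concrete instance of the designs the tutorial covers, the fragment below builds the textbook two-level half fraction with defining relation I = ABC, the standard resolution III example (a generic illustration, not taken from the tutorial itself):

```python
from itertools import product

# Full 2^3 factorial in coded -1/+1 levels
full = [dict(zip("ABC", lv)) for lv in product((-1, 1), repeat=3)]

# Half fraction 2^(3-1) with defining relation I = ABC (resolution III):
# keep the runs with A*B*C == +1, so factor C equals the two-factor
# interaction AB on every run -- main effects aliased with two-factor
# interactions is the hallmark of resolution III.
half = [r for r in full if r["A"] * r["B"] * r["C"] == 1]
```

The four retained runs still form orthogonal columns for the main effects, which is what makes such fractions attractive for screening experiments.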
  3. By: Rodney W. Strachan; Herman K. van Dijk
    Abstract: Economic forecasts and policy decisions are often informed by empirical analysis based on econometric models. However, when several viable models exist, inference based upon a single model limits the usefulness of such analysis. Taking account of model uncertainty, a Bayesian model averaging procedure is presented which allows for unconditional inference within the class of vector autoregressive (VAR) processes. Several features of VAR processes are investigated. Measures on manifolds are employed in order to elicit uniform priors on subspaces defined by particular structural features of VARs. The features considered are the number and form of the equilibrium economic relations and deterministic processes. Posterior probabilities of these features are used in a model-averaging approach to forecasting and impulse response analysis. The methods are applied to investigate the stability of the "Great Ratios" in U.S. consumption, investment and income, and the presence and effects of permanent shocks in these series. The results obtained indicate the feasibility of the proposed method.
    Keywords: Posterior probability; Grassmann manifold; Orthogonal group; Cointegration; Model averaging; Stochastic trend; Impulse response; Vector autoregressive model.
    JEL: C11 C32 C52
    Date: 2006–02
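The averaging step itself is simple once posterior model probabilities are in hand. The sketch below combines point forecasts using probabilities computed from hypothetical log marginal likelihoods; it shows only the generic averaging logic, not the paper's VAR-specific machinery or priors on manifolds.

```python
import numpy as np

def bma_forecast(log_marginal_liks, forecasts, prior_probs=None):
    """Average model forecasts by posterior model probabilities.

    log_marginal_liks : log p(data | M_i) for each candidate model
    forecasts         : each model's point forecast
    prior_probs       : prior model probabilities (uniform if omitted)
    """
    m = len(log_marginal_liks)
    prior = np.full(m, 1.0 / m) if prior_probs is None else np.asarray(prior_probs)
    # Posterior model probabilities, computed stably on the log scale
    log_post = np.log(prior) + np.asarray(log_marginal_liks)
    log_post -= log_post.max()
    post = np.exp(log_post)
    post /= post.sum()
    # Unconditional (model-averaged) forecast
    return post, post @ np.asarray(forecasts)

# Hypothetical numbers: three models with different fit to the data
post, fc = bma_forecast([-10.0, -12.0, -30.0], [1.0, 2.0, 10.0])
```

The badly fitting third model receives essentially zero weight, so its extreme forecast barely moves the average; this is the sense in which averaging guards against conditioning on a single model.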
  4. By: Giovanni B. Concu (Risk and Sustainable Management Group, University of Queensland)
    Abstract: This paper describes a Choice Modelling experiment set up to investigate the relationship between distance and willingness to pay for environmental quality changes. The issue is important for the estimation and transfer of benefits. So far the problem has been analysed with Contingent Valuation-type experiments, producing mixed results. The Choice Modelling experiment allows testing distance effects on parameters of environmental attributes that imply different trade-offs between use and non-use values. The sampling procedure is designed to provide a "geographically balanced" sample. Several specifications of the distance covariate are compared, and distance effects are shown to take complex shapes. Welfare analysis also shows that disregarding distance produces under-estimation of individual and aggregated benefits and losses, seriously hindering the reliability of cost-benefit analyses.
    Keywords: Choice Modelling techniques, distance, aggregation, sampling, functional forms
    JEL: Q51 Q58
    Date: 2005–12
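One way a distance covariate can enter such a model is through the marginal utility of an environmental attribute. The sketch below is entirely hypothetical (the log specification, the coefficient names `b_attr`, `b_dist`, `b_cost`, and the numbers are my illustration, not the paper's estimates): it computes conditional logit choice probabilities and a distance-dependent willingness to pay (WTP).

```python
import math

def choice_probs(attrs, costs, dist, b_attr, b_dist, b_cost):
    """Conditional logit choice probabilities where the marginal utility
    of the attribute varies with distance (one possible functional form)."""
    v = [b_cost * c + (b_attr + b_dist * math.log(dist)) * a
         for a, c in zip(attrs, costs)]
    mx = max(v)                      # subtract max for numerical stability
    e = [math.exp(x - mx) for x in v]
    s = sum(e)
    return [x / s for x in e]

def wtp(dist, b_attr, b_dist, b_cost):
    """WTP per unit of the attribute: minus the (distance-dependent)
    attribute coefficient over the cost coefficient."""
    return -(b_attr + b_dist * math.log(dist)) / b_cost

# Two alternatives: status quo vs. an improvement costing 5 units
p = choice_probs([0.0, 1.0], [0.0, -5.0], 10.0, 2.0, -0.5, -0.1)
```

With a negative distance interaction, WTP for the improvement falls as respondents live farther away, which is why ignoring distance distorts aggregated benefit estimates.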
  5. By: Massimo Franchi (Department of Economics, University of Copenhagen)
    Abstract: We show that the order of integration of a vector autoregressive process is equal to the difference between the multiplicity of the unit root in the characteristic equation and the multiplicity of the unit root in the adjoint matrix polynomial. The equivalence with the standard I(1) and I(2) conditions (Johansen, 1996) is proved, and polynomial cointegration is discussed in the general setup.
    Keywords: unit roots; order of integration; polynomial cointegration
    JEL: C32
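The characterization can be checked on small examples. The toy implementation below (my own, not the paper's) represents a 2x2 AR matrix polynomial as nested coefficient lists, counts the multiplicity of z = 1 in the determinant and in the adjugate, and returns the difference:

```python
# Polynomials are coefficient lists [c0, c1, ...] for c0 + c1*z + ...

def p_mul(p, q):
    out = [0.0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def p_sub(p, q):
    n = max(len(p), len(q))
    return [(p[i] if i < len(p) else 0.0) - (q[i] if i < len(q) else 0.0)
            for i in range(n)]

def p_eval(p, x):
    return sum(c * x ** k for k, c in enumerate(p))

def p_deriv(p):
    return [k * c for k, c in enumerate(p)][1:] or [0.0]

def mult_at_one(p):
    """Multiplicity of the root z = 1, counted via successive derivatives."""
    m = 0
    while any(abs(c) > 1e-9 for c in p) and abs(p_eval(p, 1.0)) < 1e-9:
        p, m = p_deriv(p), m + 1
    return m

def integration_order(P):
    """Multiplicity of z = 1 in det P(z) minus its multiplicity in adj P(z),
    for a 2x2 matrix polynomial P given as nested coefficient lists."""
    det = p_sub(p_mul(P[0][0], P[1][1]), p_mul(P[0][1], P[1][0]))
    adj = [P[1][1], [-c for c in P[0][1]], [-c for c in P[1][0]], P[0][0]]
    nonzero = [e for e in adj if any(abs(c) > 1e-9 for c in e)]
    return mult_at_one(det) - min(mult_at_one(e) for e in nonzero)

one_minus_z = [1.0, -1.0]
zero = [0.0]
# Bivariate random walk, Pi(z) = (1 - z) I: the unit root has multiplicity
# 2 in the determinant and 1 in the adjugate, so the process is I(1).
d = integration_order([[one_minus_z, zero], [zero, one_minus_z]])
```

The same function gives order 2 for a system containing a (1 - z)^2 component and order 0 for a stationary polynomial, matching the I(1)/I(2) conditions the abstract refers to.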
  6. By: Gilmour S.G.; Goos P.
    Abstract: Split-plot and other multi-stratum structures are widely used in factorial and response surface experiments, and residual maximum likelihood (REML) with generalized least squares (GLS) estimation is seen as the state-of-the-art method of data analysis for nonorthogonal designs. We analyze data from an experiment run to study the effects of five process factors on the drying rate for freeze-dried coffee, and find that the main-plot variance component is estimated to be zero. We show that this is a typical property of REML-GLS estimation, which is highly undesirable and can give misleading conclusions. In the classical approach it is possible to fix the main-plot variance at some positive value, but this is not satisfactory either. Instead, we recommend a Bayesian analysis, using an informative prior distribution for the main-plot variance component and implemented using Markov chain Monte Carlo sampling. Paradoxically, the Bayesian analysis is less dependent on prior assumptions than the REML-GLS analysis. Bayesian analyses of the coffee freeze-drying data give more realistic conclusions than the REML-GLS analysis, providing support for our recommendation.
    Date: 2005–02
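The zero-variance pathology and the Bayesian remedy can be illustrated on a toy balanced one-way random-effects layout (a stand-in for the split-plot structure; the exponential prior, the plug-in error variance, and the grid posterior below are my simplifications, not the authors' MCMC analysis):

```python
import math

# Toy layout: 6 "main plots", 4 observations each. All group means
# coincide, so the between-group mean square falls below the
# within-group mean square and the ANOVA/REML-type estimate of the
# main-plot variance component is truncated at zero.
I, J = 6, 4
y = [[-1.0, 1.0, -1.0, 1.0] for _ in range(I)]

gm = [sum(g) / J for g in y]                       # group means
mu = sum(gm) / I
msa = J * sum((m - mu) ** 2 for m in gm) / (I - 1)
mse = sum((x - gm[i]) ** 2 for i, g in enumerate(y) for x in g) / (I * (J - 1))
s2a_hat = max(0.0, (msa - mse) / J)                # boundary estimate: zero

# Bayesian alternative: grid posterior for the variance component s2a
# under an informative exponential prior, plugging mse in for the error
# variance. Each group mean is N(mu, s2a + s2e/J), which gives the
# likelihood below; the posterior mean stays strictly positive.
def weight(s2a):
    v = s2a + mse / J                              # variance of a group mean
    return (math.exp(-5.0 * s2a) * v ** (-I / 2.0)
            * math.exp(-sum((m - mu) ** 2 for m in gm) / (2.0 * v)))

grid = [k * 0.001 for k in range(2001)]
w = [weight(s) for s in grid]
post_mean = sum(s * wi for s, wi in zip(grid, w)) / sum(w)
```

The classical estimate collapses to the boundary while the posterior mean remains positive, which is the qualitative contrast the abstract draws between REML-GLS and the Bayesian analysis.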

This nep-ecm issue is ©2006 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.