nep-ecm New Economics Papers
on Econometrics
Issue of 2007‒12‒15
fifteen papers chosen by
Sune Karlsson
Orebro University

  1. Efficient Robust Estimation of Time-Series Regression Models By Cizek, P.
  2. A Convolution Estimator for the Density of Nonlinear Regression Observations By Støve, Bård; Tjøstheim, Dag
  3. Tests of Independence in Separable Econometric Models: Theory and Application By Donald J. Brown; Rahul Deb; Marten H. Wegkamp
  4. Inference in the Presence of Stochastic and Deterministic Trends By Chevillon, Guillaume
  5. Infinite Dimensional VARs and Factor Models By Chudik, A.; Pesaran, M.H.
  6. Exact Rational Expectations, Cointegration, and Reduced Rank Regression By Soren Johansen; Anders Rygh Swensen
  7. Importance Sampling for Sums of Lognormal Distributions, with Applications to Operational Risk By Marco Bee
  8. Monte Carlo Investigation of the Initial Values Problem in Censored Dynamic Random-Effects Panel Data Models By Akay, Alpaslan
  9. Estimating TFP in the Presence of Outliers and Leverage Points: An Examination of the KLEMS Dataset By Macdonald, Ryan
  10. Integrating latent variables in discrete choice models – How higher-order values and attitudes determine consumer choice By Dirk Temme; Marcel Paulssen; Till Dannewald
  11. Estimating Gravity Equations: To Log or not to Log? By Boriss Siliverstovs; Dieter Schumacher
  12. Correlation vs. Causality in Stock Market Comovement By Enzo Weber
  13. The Accuracy and Efficiency of the Consensus Forecasts: A Further Application and Extension of the Pooled Approach By Ager, Philipp; Kappler, Marcus; Osterloh, Steffen
  14. Mind the Break! Accounting for Changing Patterns of Growth during Transition By Jan Fidrmuc; Ariane Tichit
  15. Explaining Bootstraps and Robustness By Tony Lancaster

  1. By: Cizek, P. (Tilburg University, Center for Economic Research)
    Abstract: This paper studies a new class of robust regression estimators based on the two-step least weighted squares (2S-LWS) estimator, which employs data-adaptive weights determined from the empirical distribution or quantile functions of regression residuals obtained from an initial robust fit. Like many existing two-step robust methods, the proposed 2S-LWS estimator preserves the robust properties of the initial robust estimate. However, contrary to existing methods, the first-order asymptotic behavior of 2S-LWS is fully independent of the initial estimate under mild conditions. We propose data-adaptive weighting schemes that perform well for both cross-sectional and time-series data and prove the asymptotic normality and efficiency of the resulting procedure. A simulation study documents these theoretical properties in finite samples.
    Keywords: Asymptotic efficiency;least weighted squares;robust regression;time series
    JEL: C13 C21 C22
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:200795&r=ecm
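    As a rough illustration of the two-step weighting idea (a minimal sketch, not the paper's actual 2S-LWS estimator or weighting scheme): an initial robust fit, here least absolute deviations approximated by iteratively reweighted least squares, supplies residuals whose ranks determine data-adaptive weights for a second, weighted least-squares step. The weight function, the `keep` fraction, and the simulated data are all illustrative assumptions.

```python
import numpy as np

def lad_irls(X, y, iters=100, eps=1e-6):
    """Initial robust fit: least absolute deviations, approximated by
    iteratively reweighted least squares (an illustrative choice)."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(iters):
        w = 1.0 / np.maximum(np.abs(y - X @ beta), eps)
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta

def lws(X, y, keep=0.8):
    """Least-weighted-squares sketch: weights decrease in the rank of the
    absolute residuals from the initial fit; the worst (1 - keep) share
    of observations gets weight zero."""
    r = np.abs(y - X @ lad_irls(X, y))
    ranks = np.argsort(np.argsort(r)) / len(y)     # residual ranks scaled to [0, 1)
    w = np.clip((keep - ranks) / keep, 0.0, 1.0)   # linearly decreasing weights
    return np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))

# contaminated linear data: 10% of responses are gross outliers
rng = np.random.default_rng(2)
n = 500
x = rng.normal(0.0, 1.0, n)
y = 1.0 + 2.0 * x + 0.5 * rng.standard_normal(n)
y[:50] += 15.0                                     # shift 50 responses upward
X = np.column_stack([np.ones(n), x])
beta = lws(X, y)                                   # close to (1, 2) despite contamination
```

    Plain OLS on the same data would be pulled well away from the true intercept by the shifted responses; the rank-based weights simply zero them out after the robust first stage.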
  2. By: Støve, Bård (Dept. of Finance and Management Science, Norwegian School of Economics and Business Administration); Tjøstheim, Dag (Dept. of Mathematics, University of Bergen)
    Abstract: The problem of estimating an unknown density function has been widely studied. In this paper we present a convolution estimator for the density of the responses in a nonlinear regression model. The rate of convergence for the variance of the convolution estimator is of order 1/n. This is faster than the rate for the kernel density method. The intuition behind this result is that the convolution estimator uses model information, and thus an improvement can be expected. We also derive the bias of the new estimator and conduct simulation experiments to check the finite sample properties. The proposed estimator performs substantially better than the kernel density estimator for well-behaved noise densities.
    Keywords: Convergence rate; Convolution estimator; Kernel function; Mean squared error; Nonparametric density estimation
    JEL: C13
    Date: 2007–11–30
    URL: http://d.repec.org/n?u=RePEc:hhs:nhhfms:2007_025&r=ecm
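    The construction can be sketched concretely. For Y = m(X) + eps, one version of a convolution estimator combines the fitted values with a kernel density of the residuals, f_hat(y) = n^-2 sum_i sum_j K_h(y - m_hat(x_i) - eps_hat_j); the cubic model, Gaussian kernel, and bandwidth below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def gaussian_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def convolution_density(y_grid, x, y, fit_fn, h):
    """Density estimate for Y in Y = m(X) + eps: average the kernel over
    all pairs of fitted value m_hat_i and residual eps_hat_j."""
    m_hat = fit_fn(x)
    resid = y - m_hat
    shifts = m_hat[:, None] + resid[None, :]       # n x n grid of m_hat_i + eps_hat_j
    out = np.empty(len(y_grid))
    for k, yv in enumerate(y_grid):
        out[k] = gaussian_kernel((yv - shifts) / h).mean() / h
    return out

# toy model: cubic regression with Gaussian noise
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 500)
y = x**3 + 0.3 * rng.standard_normal(500)
coefs = np.polyfit(x, y, 3)                        # parametric fit, for simplicity
grid = np.linspace(-2.0, 2.0, 41)
dens = convolution_density(grid, x, y, lambda z: np.polyval(coefs, z), h=0.2)
```

    Because the estimator exploits the regression structure, its variance can shrink at the 1/n rate noted in the abstract, rather than the slower nonparametric rate of a plain kernel density of Y.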
  3. By: Donald J. Brown (Cowles Foundation, Yale University); Rahul Deb (Dept. of Economics, Yale University); Marten H. Wegkamp (Dept. of Statistics, Florida State University)
    Abstract: A common stochastic restriction in econometric models separable in the latent variables is the assumption of stochastic independence between the unobserved and observed exogenous variables. Both simple and composite tests of this assumption are derived from properties of independence empirical processes and the consistency of these tests is established. As an application, we simulate estimation of a random quasilinear utility function, where we apply our tests of independence.
    Keywords: Cramer-von Mises distance, Empirical independence processes, Random utility models, Semiparametric econometric models, Specification test of independence
    JEL: C01 C13 C14 C15 D12
    Date: 2003–01
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1395rr&r=ecm
  4. By: Chevillon, Guillaume (ESSEC Business School)
    Abstract: The focus of this paper is inference about stochastic and deterministic trends when both types are present. We show that, contrary to asymptotic theory and the existing literature, the parameters of the deterministic components must be taken into account in finite samples. We analyze the ubiquitous Likelihood Ratio test for the rank of cointegration in vector processes. Here, we directly control the parameters of the data generating process so that a local-asymptotic framework accounts for small sample interactions between stochastic and deterministic trends. We show that the usual corrections are invalid as they take no account of the relative magnitudes of these two types of trends. Block-local models constitute an embedding framework that provides a rationale for consistent estimation and testing of the whole set of parameters. In an empirical application to European GDP series, we show that using the usual corrections leads to underestimating the number of stochastic trends.
    Keywords: Block Local Models; Cointegration; Finite Samples; Likelihood Ratio; Weak Trends
    JEL: C12 C32 C51
    Date: 2007–08
    URL: http://d.repec.org/n?u=RePEc:ebg:essewp:dr-07021&r=ecm
  5. By: Chudik, A.; Pesaran, M.H.
    Abstract: This paper introduces a novel approach for dealing with the "curse of dimensionality" in the case of large linear dynamic systems. Restrictions on the coefficients of an unrestricted VAR are proposed that are binding only in the limit as the number of endogenous variables tends to infinity. It is shown that under such restrictions, an infinite-dimensional VAR (or IVAR) can be arbitrarily well characterized by a large number of finite-dimensional models in the spirit of the global VAR model proposed in Pesaran et al. (JBES, 2004). The paper also considers IVAR models with dominant individual units and shows that this will lead to a dynamic factor model with the dominant unit acting as the factor. The problems of estimation and inference in a stationary IVAR with an unknown number of unobserved common factors are also investigated. A cross section augmented least squares estimator is proposed and its asymptotic distribution is derived. Satisfactory small sample properties are documented by Monte Carlo experiments. An empirical application to modelling of real GDP growth and investment-output ratios provides an illustration of the proposed approach. Considerable heterogeneities across countries and a significant presence of dominant effects are found. The results also suggest that increases in investment as a share of GDP predict higher growth rates of GDP per capita for a non-negligible fraction of countries, and vice versa.
    Keywords: Large N and T Panels, Weak and Strong Cross Section Dependence, VAR, Global VAR, Factor Models, Capital Accumulation and Growth.
    JEL: C10 C33 C51 O40
    Date: 2007–11
    URL: http://d.repec.org/n?u=RePEc:cam:camdae:0757&r=ecm
  6. By: Soren Johansen (Department of Economics, University of Copenhagen); Anders Rygh Swensen (University of Oslo)
    Abstract: We interpret the linear relations from exact rational expectations models as restrictions on the parameters of the statistical model called the cointegrated vector autoregressive model for non-stationary variables. We then show how reduced rank regression, Anderson (1951), plays an important role in the computation of the maximum likelihood estimates of the restricted parameters.
    Date: 2007–11
    URL: http://d.repec.org/n?u=RePEc:kud:kuiedp:0729&r=ecm
  7. By: Marco Bee
    Abstract: In this paper we estimate tail probabilities for the sum of Lognormal distributions. We propose to use a defensive mixture, and develop a method of finding the optimal density via the EM algorithm; we also consider the technique which assumes the importance sampling density to belong to the same parametric family as the distribution of the random variables to be summed. Optimality is defined in terms of minimal Cross-Entropy. Several simulation experiments show that the defensive mixture has the best performance. Finally, we study the compound distribution framework, and present a real-data application concerning the Poisson-Lognormal compound distribution.
    Keywords: Tail Probability, Importance Sampling, Cross-Entropy, Defensive Mixtures, Compound Distributions
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:trn:utwpde:0728&r=ecm
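    As an illustration of the defensive-mixture idea (a simplified sketch, not the paper's EM/cross-entropy procedure): each lognormal component is drawn from a mixture of the nominal density and a mean-shifted proposal, and the estimator reweights by the likelihood ratio, each factor of which is bounded by 1/alpha by construction. The shift `mu_shift`, mixture weight `alpha`, and threshold `t` are arbitrary illustrative choices.

```python
import numpy as np

def lognorm_pdf(x, mu, sigma):
    """Density of exp(N(mu, sigma^2))."""
    return (np.exp(-(np.log(x) - mu) ** 2 / (2 * sigma**2))
            / (x * sigma * np.sqrt(2 * np.pi)))

def tail_prob_is(t, k=5, mu=0.0, sigma=1.0, mu_shift=1.5, alpha=0.2,
                 n=100_000, seed=0):
    """Estimate P(X_1 + ... + X_k > t), X_i iid Lognormal(mu, sigma), by
    importance sampling from the componentwise defensive mixture
    q = alpha * p + (1 - alpha) * p_shifted."""
    rng = np.random.default_rng(seed)
    pick = rng.random((n, k)) < alpha                # which mixture component
    x = np.where(pick,
                 rng.lognormal(mu, sigma, (n, k)),
                 rng.lognormal(mu_shift, sigma, (n, k)))
    p = lognorm_pdf(x, mu, sigma)
    q = alpha * p + (1 - alpha) * lognorm_pdf(x, mu_shift, sigma)
    w = np.prod(p / q, axis=1)                       # likelihood ratio, factors <= 1/alpha
    return float(np.mean(w * (x.sum(axis=1) > t)))

est = tail_prob_is(t=50.0)    # a small tail probability, far beyond the mean of ~8.2
```

    The defensive component keeps the weights bounded, which is what protects the estimator's variance when a sampled point happens to fall where the shifted proposal is thin.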
  8. By: Akay, Alpaslan (Department of Economics, School of Business, Economics and Law, Göteborg University)
    Abstract: Three designs of Monte Carlo experiments are used to investigate the initial-values problem in censored dynamic random-effects (Tobit type 1) models. We compare three widely used solution methods: the naive method based on the assumption of exogenous initial values; Heckman's approximation; and the simple method of Wooldridge. The results suggest that the initial-values problem is a serious issue: using a method that misspecifies the conditional distribution of the initial values can yield misleading results on the magnitude of true (structural) and spurious state dependence. The naive exogenous method is substantially biased for panels of short duration. Heckman's approximation works well. The simple method of Wooldridge works better than the naive exogenous method in short panels, but it is not as good as Heckman's approximation. All three methods perform equally well for panels of long duration.
    Keywords: Initial value problem; Dynamic Tobit model; Monte Carlo experiment; Heckman's approximation; Simple method of Wooldridge
    JEL: C23 C25
    Date: 2007–12–05
    URL: http://d.repec.org/n?u=RePEc:hhs:gunwpe:0278&r=ecm
  9. By: Macdonald, Ryan
    Abstract: This paper examines the effect of aberrant observations in the Capital, Labour, Energy, Materials and Services (KLEMS) database and a method for dealing with them. The level of disaggregation, data construction and economic shocks all potentially lead to aberrant observations that can influence estimates and inference if care is not exercised. Commonly applied pre-tests, such as the augmented Dickey-Fuller and the Kwiatkowski, Phillips, Schmidt and Shin tests, need to be used with caution in this environment because they are sensitive to unusual data points. Moreover, widely known methods for generating statistical estimates, such as Ordinary Least Squares, may not work well when confronted with aberrant observations. To address this, a robust method for estimating statistical relationships is illustrated.
    Keywords: Statistical methods, Economic accounts, Time series, Data analysis, Productivity accounts
    Date: 2007–12–05
    URL: http://d.repec.org/n?u=RePEc:stc:stcp5e:2007047e&r=ecm
  10. By: Dirk Temme; Marcel Paulssen; Till Dannewald
    Abstract: Integrated choice and latent variable (ICLV) models represent a promising new class of models which merge classic choice models with the structural equation approach (SEM) for latent variables. Despite their conceptual appeal, to date applications of ICLV models in marketing are still rare. The present study on travel mode choice clearly demonstrates the value of ICLV models for enhancing the understanding of choice processes. In addition to the usually studied directly observable variables such as travel time, we show how abstract motivations such as power and hedonism as well as attitudes such as a desire for flexibility affect travel mode choice. Further, we show that it is possible to estimate ICLV models with the widely available structural equation modeling package Mplus. This finding is likely to encourage wider usage of this appealing model class in the marketing field.
    Keywords: Hybrid choice models; Mode choice; Values; Value-attitude hierarchy; Mplus
    JEL: C25 C51 C87 M31 R41
    Date: 2007–12
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2007-065&r=ecm
  11. By: Boriss Siliverstovs; Dieter Schumacher
    Abstract: This study compares two alternative approaches to estimating parameters in gravity equations. We compare the traditional OLS approach applied to the log-linear form of the gravity model with the Poisson Quasi Maximum Likelihood (PQML) estimation procedure applied to the non-linear multiplicative specification of the gravity model. We use the trade flows for all products, for all manufacturing products as well as for manufacturing products broken down by three-digit ISIC Rev.2 categories. We base our conclusions on the generalised gravity model of Bergstrand (1989) that allows us to investigate differences in factor-proportions and home-market effects at the industry level. In addition, we compare the effects of other explanatory variables such as exporter and importer total income, distance, preferential trade agreements, common border, historical ties, and common language on the volume of trade. Our study provides comprehensive evidence on likely qualitative and/or quantitative differences in the values of estimated coefficients as a result of applying an alternative estimation method. Our main conclusion is that both the estimation results and the results of the regression misspecification tests provide supporting evidence for the PQML estimation approach over the OLS estimation method.
    Keywords: generalised gravity equation, Poisson regression
    JEL: F12
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:diw:diwwpp:dp739&r=ecm
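    The contrast between the two approaches can be sketched on toy data (an illustrative simulation, not the paper's trade data or specification). Poisson quasi-ML requires only that the conditional mean E[y|X] = exp(X beta) be correct, so it remains consistent under multiplicative errors whose variance depends on the regressors; OLS on logs does not, because E[log error | X] is then not constant. The coefficient values and the IRLS fitting routine below are illustrative assumptions.

```python
import numpy as np

def poisson_qml(X, y, iters=50):
    """Poisson quasi-ML for the multiplicative model E[y|X] = exp(X @ beta),
    fitted by iteratively reweighted least squares (Newton's method)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(np.clip(X @ beta, -30.0, 30.0))
        z = X @ beta + (y - mu) / mu               # working response
        beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))
    return beta

# toy gravity-style data: multiplicative, heteroskedastic errors with
# E[error | X] = 1, the setting in which log-linear OLS goes wrong
rng = np.random.default_rng(1)
n = 5000
log_gdp = rng.normal(0.0, 0.5, n)
log_dist = rng.normal(0.0, 0.5, n)
mu = np.exp(1.0 + 0.8 * log_gdp - 0.6 * log_dist)
sig = 0.3 + 0.3 * (log_gdp > 0)                    # error spread varies with a regressor
trade = mu * rng.lognormal(-sig**2 / 2, sig)       # conditional mean of error is 1
X = np.column_stack([np.ones(n), log_gdp, log_dist])

b_pqml = poisson_qml(X, trade)                     # near (1.0, 0.8, -0.6)
b_ols = np.linalg.lstsq(X, np.log(trade), rcond=None)[0]
```

    In this design the log-OLS slope on `log_gdp` is pulled away from 0.8 because the heteroskedasticity enters E[log trade | X], while the PQML estimate stays centered on the true value.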
  12. By: Enzo Weber
    Abstract: This paper seeks to disentangle the sources of correlations between high-, mid- and low-cap stock indexes from the German prime standard. In principle, such comovement can arise from direct spillover between the variables or due to common factors. By standard means, these different components are obviously not identifiable. As a solution, the present study proposes specifying ARCH-type models for both the idiosyncratic innovations and a common factor, so that the model structure can be identified through heteroscedasticity. The seemingly surprising result that smaller caps have higher influence than larger ones is explained by asymmetric information processing in financial markets. Broad macroeconomic information is shown to enter the common factor rather than the segment-specific shocks.
    Keywords: Identification, Spillover, Common Factor, Structural EGARCH, DAX
    JEL: C32 G10
    Date: 2007–12
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2007-064&r=ecm
  13. By: Ager, Philipp; Kappler, Marcus; Osterloh, Steffen
    Abstract: In this paper we analyze the macroeconomic forecasts of the Consensus Forecasts for 12 countries over the period from 1996 to 2006 regarding bias and information efficiency. A pooled approach is employed which permits the evaluation of all forecasts for each target variable over 24 horizons simultaneously. It is shown how the pooled approach needs to be adjusted in order to accommodate the forecasting scheme of the Consensus Forecasts. Furthermore, the pooled approach is extended by a sequential test with the purpose of detecting the critical horizon after which the forecast should be regarded as biased. Moreover, heteroscedasticity in the form of year-specific variances of macroeconomic shocks is taken into account. The results show that in the analyzed period, which was characterized by pronounced macroeconomic shocks, several countries show biased forecasts, especially for forecasts covering more than 12 months. In addition, information efficiency has to be rejected in almost all cases.
    Keywords: business cycle forecasting, forecast evaluation, Consensus Forecasts
    JEL: C52 E32 E37
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:zbw:zewdip:6655&r=ecm
  14. By: Jan Fidrmuc; Ariane Tichit
    Abstract: We argue that econometric analyses based on transition countries’ data can be vulnerable to structural breaks across time and/or countries. We demonstrate this argument by identifying structural breaks in growth regressions estimated with data for 25 countries and 16 years. Our method allows identification of structural breaks at a priori unknown points in space or time. The only prior assumption is that breaks occur in relation to progress in implementing market-oriented reforms. We find robust evidence that the pattern of growth in transition has changed at least twice, thus yielding three different models of growth associated with different stages of reform. The speed with which individual countries progress through these stages differs considerably.
    Date: 2007–06
    URL: http://d.repec.org/n?u=RePEc:edb:cedidp:07-06&r=ecm
  15. By: Tony Lancaster
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:bro:econwp:2007-17&r=ecm

This nep-ecm issue is ©2007 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.