nep-ecm New Economics Papers
on Econometrics
Issue of 2008‒05‒05
fourteen papers chosen by
Sune Karlsson
Orebro University

  1. Maximum Empirical Likelihood Estimation of the Spectral Measure of an Extreme Value Distribution By Einmahl, J.H.J.; Segers, J.J.J.
  2. Estimation of Nonparametric Conditional Moment Models with Possibly Nonsmooth Moments By Xiaohong Chen; Demian Pouzo
  3. Bayesian semiparametric stochastic volatility modeling By Mark J Jensen; John M Maheu
  4. The Impact of a Hausman Pretest on the Size of Hypothesis Tests By Patrik Guggenberger
  5. The Dynamics of Economic Functions: Modelling and Forecasting the Yield Curve By Clive Bowsher; Roland Meeks
  6. The Effects of Small Sample Bias in Threshold Autoregressive Models By Yamin Ahmad
  7. Non-parametric Identification of the Mixed Hazards Model with Interval-Censored Durations By Christian N. Brinch
  8. Consumer Preferences and Demand Systems By William A. Barnett; Apostolos Serletis
  9. Identifiability of the Stochastic Frontier Models By Bandyopadhyay, Debdas; Das, Arabinda
  10. Forecasting the Swiss Economy Using VECX* Models: An Exercise in Forecast Combination Across Models and Observation Windows By Assenmacher-Wesche, Katrin; Pesaran, M. Hashem
  11. Urn-based models for dependent credit risks and their calibration through EM algorithm By Riccardo Gusso; Uwe Schmock
  12. A Unique Orthogonal Variance Decomposition By Wong, Woon K
  13. A valid theory on probabilistic causation By Jose M. Vidal-Sanz
  14. Are Weekly Inflation Forecasts Informative? By Amstad, Marlene; Fischer, Andreas

  1. By: Einmahl, J.H.J.; Segers, J.J.J. (Tilburg University, Center for Economic Research)
    Abstract: AMS 2000 subject classifications: Primary 62G05, 62G30, 62G32; secondary 60G70, 60F05, 60F17, JEL: C13, C14.
    Keywords: functional central limit theorem; local empirical process; moment constraint; multivariate extremes; nonparametric maximum likelihood estimator; tail dependence
    Date: 2008
  2. By: Xiaohong Chen (Cowles Foundation, Yale University); Demian Pouzo (Dept. of Economics, New York University)
    Abstract: This paper studies nonparametric estimation of conditional moment models in which the residual functions could be nonsmooth with respect to the unknown functions of endogenous variables. It is a problem of nonparametric nonlinear instrumental variables (IV) estimation, and a difficult nonlinear ill-posed inverse problem with an unknown operator. We first propose a penalized sieve minimum distance (SMD) estimator of the unknown functions that are identified via the conditional moment models. We then establish its consistency and convergence rate (in strong metric), allowing for possibly non-compact function parameter spaces, possibly non-compact finite or infinite dimensional sieves with flexible lower semicompact or convex penalty, or finite dimensional linear sieves without penalty. Under relatively low-level sufficient conditions, and for both mildly and severely ill-posed problems, we show that the convergence rates for the nonlinear ill-posed inverse problems coincide with the known minimax optimal rates for the nonparametric mean IV regression. We illustrate the theory by two important applications: root-n asymptotic normality of the plug-in penalized SMD estimator of a weighted average derivative of a nonparametric nonlinear IV regression, and the convergence rate of a nonparametric additive quantile IV regression. We also present a simulation study and an empirical estimation of a system of nonparametric quantile IV Engel curves.
    Keywords: Nonsmooth residuals, Nonlinear ill-posed inverse, Penalized sieve minimum distance, Modulus of continuity, Average derivative of a nonparametric nonlinear IV regression, Nonparametric additive quantile IV regression
    JEL: C13 C14 D12
    Date: 2008–04
  3. By: Mark J Jensen; John M Maheu
    Abstract: This paper extends the existing fully parametric Bayesian literature on stochastic volatility to allow for more general return distributions. Instead of specifying a particular distribution for the return innovation, nonparametric Bayesian methods are used to flexibly model the skewness and kurtosis of the distribution while the dynamics of volatility continue to be modeled with a parametric structure. Our semiparametric Bayesian approach provides a full characterization of parametric and distributional uncertainty. A Markov chain Monte Carlo sampling approach to estimation is presented with theoretical and computational issues for simulation from the posterior predictive distributions. The new model is assessed based on simulation evidence, an empirical example, and comparison to parametric models.
    Keywords: Dirichlet process mixture, MCMC, block sampler
    JEL: C22 C11
    Date: 2008–04–25
  4. By: Patrik Guggenberger (Dept. of Economics, UCLA)
    Abstract: This paper investigates the size properties of a two-stage test in the linear instrumental variables model when in the first stage a Hausman (1978) specification test is used as a pretest of exogeneity of a regressor. In the second stage, a simple hypothesis about a component of the structural parameter vector is tested, using a t-statistic that is based on either the ordinary least squares (OLS) or the two-stage least squares estimator (2SLS) depending on the outcome of the Hausman pretest. The asymptotic size of the two-stage test is derived in a model where weak instruments are ruled out by imposing a lower bound on the strength of the instruments. The asymptotic size is a function of this lower bound and the pretest and second stage nominal sizes. The asymptotic size increases as the lower bound and the pretest size decrease. It equals 1 for empirically relevant choices of the parameter space. It is also shown that, asymptotically, the conditional size of the second stage test, conditional on the pretest not rejecting the null of regressor exogeneity, is 1 even for a large lower bound on the strength of the instruments. The size distortion is caused by a discontinuity of the asymptotic distribution of the test statistic in the correlation parameter between the structural and reduced form error terms. The Hausman pretest does not have sufficient power against correlations that are local to zero while the OLS t-statistic takes on large values for such nonzero correlations. Instead of using the two-stage procedure, the recommendation then is to use a t-statistic based on the 2SLS estimator or, if weak instruments are a concern, the conditional likelihood ratio test by Moreira (2003).
    Keywords: Asymptotic size, Exogeneity, Hausman specification test, Pretest, Size distortion
    JEL: C12
    Date: 2008–04
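The size distortion described in the abstract can be reproduced in a small Monte Carlo. The sketch below (illustrative parameter values, not the paper's design) simulates a single-regressor IV model with an endogeneity correlation "local to zero", applies the Hausman chi-squared pretest, and then runs the second-stage t-test based on OLS or 2SLS depending on the pretest outcome:

```python
import numpy as np

rng = np.random.default_rng(1)

def one_rep(n=200, beta=0.0, pi=0.5, rho=0.15):
    """One replication: Hausman pretest, then a t-test of beta = 0 based on
    OLS or 2SLS depending on the pretest outcome. rho is a small
    endogeneity correlation ('local to zero')."""
    z = rng.standard_normal(n)                       # instrument
    u, v = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], n).T
    x = pi * z + v                                   # mildly endogenous regressor
    y = beta * x + u

    b_ols = (x @ y) / (x @ x)                        # OLS estimate
    b_iv = (z @ y) / (z @ x)                         # 2SLS (just-identified IV)

    s2_ols = np.sum((y - b_ols * x) ** 2) / (n - 1)
    s2_iv = np.sum((y - b_iv * x) ** 2) / (n - 1)
    var_ols = s2_ols / (x @ x)
    var_iv = s2_iv * (z @ z) / (z @ x) ** 2

    # Hausman statistic (b_iv - b_ols)^2 / (var_iv - var_ols), chi2(1) under H0
    hausman = (b_iv - b_ols) ** 2 / max(var_iv - var_ols, 1e-12)
    b, var = (b_iv, var_iv) if hausman > 3.841 else (b_ols, var_ols)

    return abs(b / np.sqrt(var)) > 1.96              # reject beta = 0 at 5%?

rejection_rate = np.mean([one_rep() for _ in range(2000)])
print(f"null rejection rate (nominal 5%): {rejection_rate:.3f}")
```

With a mild correlation the pretest rarely rejects, so the OLS t-statistic is used in most replications and the null rejection rate far exceeds the nominal 5%, in line with the abstract's conclusion.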
  5. By: Clive Bowsher; Roland Meeks
    Abstract: The class of Functional Signal plus Noise (FSN) models is introduced that provides a new, general method for modelling and forecasting time series of economic functions. The underlying, continuous economic function (or 'signal') is a natural cubic spline whose dynamic evolution is driven by a cointegrated vector autoregression for the ordinates (or 'y-values') at the knots of the spline. The natural cubic spline provides flexible cross-sectional fit and results in a linear, state space model. This FSN model achieves dimension reduction, provides a coherent description of the observed yield curve and its dynamics as the cross-sectional dimension N becomes large, and can feasibly be estimated and used for forecasting when N is large. The integration and cointegration properties of the model are derived. The FSN models are then applied to forecasting 36-dimensional yield curves for US Treasury bonds at the one month ahead horizon. The method consistently outperforms the Diebold and Li (2006) and random walk forecasts on the basis of both mean square forecast error criteria and economically relevant loss functions derived from the realised profits of pairs trading algorithms. The analysis also highlights in a concrete setting the dangers of attempts to infer the relative economic value of model forecasts on the basis of their associated mean square forecast errors.
    Keywords: FSN-ECM models, functional time series, term structure, forecasting interest rates, natural cubic spline, state space form.
    JEL: C33 C51 C53 E47 G12
    Date: 2008
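The cross-sectional ingredient of the FSN model, a natural cubic spline through a small set of knot ordinates, can be sketched as follows (the maturities and yields below are hypothetical, and the VECM dynamics for the knot ordinates are omitted):

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical zero-coupon yields (percent) at a handful of knot maturities
knots = np.array([1, 3, 12, 36, 120])            # maturities in months
knot_yields = np.array([4.8, 4.9, 5.1, 5.4, 5.6])

# Natural cubic spline: zero second derivative at the end knots. The whole
# curve is determined by the knot ordinates, which is what lets the FSN
# model drive the curve's dynamics through a VAR for those ordinates alone.
curve = CubicSpline(knots, knot_yields, bc_type='natural')

maturities = np.arange(1, 121)
fitted = curve(maturities)
print(fitted[[0, 11, 35, 119]].round(2))         # yields at 1m, 12m, 36m, 120m
```

The dimension reduction is visible here: a 120-point curve is summarized by five knot ordinates.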
  6. By: Yamin Ahmad (Department of Economics, University of Wisconsin - Whitewater)
    Abstract: This paper investigates the properties of a class of models that incorporate nonlinear dynamics, known as Threshold Autoregressive (TAR) models. Simulations show that, within the context of the real exchange rate literature, a threshold model of exchange rates exhibits significant small sample bias even with long time series. The results have serious implications for the properties of estimated coefficients in TAR models.
    Keywords: Threshold Autoregressive Models, Nonlinear Models, Small Sample Bias, Real Exchange Rates, Simulation
    JEL: F47 C15 C32
    Date: 2007–05
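A minimal Monte Carlo in the spirit of the abstract (a stylized band-TAR for a real exchange rate, with illustrative parameters rather than the paper's design) shows how threshold estimates behave in samples of a couple of hundred observations:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_tar(n, rho_out=0.7, threshold=1.0):
    """Stylized band-TAR(1): a unit root inside the band, mean reversion
    outside, as often posited for real exchange rates."""
    y = np.zeros(n)
    for t in range(1, n):
        rho = rho_out if abs(y[t - 1]) > threshold else 1.0
        y[t] = rho * y[t - 1] + rng.standard_normal()
    return y

def estimate_threshold(y, grid):
    """Least-squares grid search for the threshold (illustrative)."""
    best_ssr, best_thr = np.inf, None
    for thr in grid:
        outer = np.abs(y[:-1]) > thr
        if outer.sum() < 10 or (~outer).sum() < 10:
            continue                      # require enough points per regime
        ssr = 0.0
        for mask in (outer, ~outer):
            x, z = y[:-1][mask], y[1:][mask]
            rho = (x @ z) / (x @ x)       # regime-specific AR(1) slope
            ssr += np.sum((z - rho * x) ** 2)
        if ssr < best_ssr:
            best_ssr, best_thr = ssr, thr
    return best_thr

grid = np.linspace(0.2, 2.5, 47)
raw = [estimate_threshold(simulate_tar(200), grid) for _ in range(200)]
estimates = [e for e in raw if e is not None]
print(f"true threshold 1.0, mean estimate over {len(estimates)} "
      f"replications: {np.mean(estimates):.2f}")
```

Comparing the mean estimate with the true threshold across replications gives a direct read on small sample bias in this class of models.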
  7. By: Christian N. Brinch (Statistics Norway)
    Abstract: Econometric duration data are typically interval-censored, that is, not directly observed, but observed to fall within a known interval. Known non-parametric identification results for duration models with unobserved heterogeneity rely crucially on exact observation of durations at a continuous scale. Here, it is established that the mixed hazards model is non-parametrically identified through covariates that vary over time within durations as well as between observations when durations are interval-censored. The results hold for the mixed proportional hazards model as a special case.
    Keywords: duration analysis; interval-censoring; non-parametric identification
    JEL: C41
    Date: 2008–04
  8. By: William A. Barnett; Apostolos Serletis
    Abstract: This paper is an up-to-date survey of the state of the art in consumer demand modelling. We review and evaluate advances in a number of related areas, including different approaches to empirical demand analysis, such as the differential approach, the locally flexible functional forms approach, the semi-nonparametric approach, and the nonparametric approach. We address estimation issues, including sampling-theoretic and Bayesian estimation methods, and discuss the limitations of the currently common approaches. We also highlight the challenge inherent in achieving economic regularity, for consistency with the assumptions of the underlying neoclassical economic theory, as well as econometric regularity when variables are nonstationary.
    JEL: D12 E21
    Date: 2008–01–29
  9. By: Bandyopadhyay, Debdas; Das, Arabinda
    Abstract: This paper examines the identifiability of the standard single-equation stochastic frontier models with uncorrelated and correlated error components, giving, inter alia, mathematical content to the notion of “near-identifiability” of a statistical model. It is seen that these models are at least locally identifiable but suffer from the “near-identifiability” problem. Our results also highlight the pivotal role played by the signal-to-noise ratio in the “near-identifiability” of the stochastic frontier models.
    Keywords: Identification; Stochastic frontier model; Information Matrix; Signal to Noise Ratio
    JEL: C10 C52 C31
    Date: 2007–06
  10. By: Assenmacher-Wesche, Katrin (Swiss National Bank); Pesaran, M. Hashem (University of Cambridge)
    Abstract: This paper uses vector error correction models of Switzerland for forecasting output, inflation and the short-term interest rate. It considers three different ways of dealing with forecast uncertainties. First, it investigates the effect on forecasting performance of averaging over forecasts from different models. Second, it considers averaging forecasts from different estimation windows. It is found that averaging over estimation windows is at least as effective as averaging over different models and both complement each other. Third, it examines whether using weighting schemes from the machine learning literature improves the average forecast. Compared to equal weights the effect of alternative weighting schemes on forecast accuracy is small in the present application.
    Keywords: Bayesian model averaging; choice of observation window; long-run structural vector autoregression
    JEL: C32 C53
    Date: 2008–04–29
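The idea of averaging forecasts over estimation windows can be illustrated with a toy mean-forecasting exercise on a series with a structural break (this is a generic stand-in, not the paper's VECX* setup): no single window length is best when the break date is unknown, and the equal-weight average hedges across them.

```python
import numpy as np

rng = np.random.default_rng(3)

# Series with a break in the mean at t = 120 (illustrative)
n = 220
y = np.concatenate([rng.normal(0.0, 1.0, 120), rng.normal(1.0, 1.0, 100)])

def window_forecast(y, t, w):
    """One-step-ahead mean forecast using only the last w observations."""
    return y[t - w:t].mean()

windows = [20, 40, 80, 160]
errors_single = {w: [] for w in windows}
errors_avg = []
for t in range(160, n - 1):
    forecasts = [window_forecast(y, t, w) for w in windows]
    for w, f in zip(windows, forecasts):
        errors_single[w].append((y[t + 1] - f) ** 2)
    # equal-weight combination across estimation windows
    errors_avg.append((y[t + 1] - np.mean(forecasts)) ** 2)

for w in windows:
    print(f"window {w:3d}: MSFE = {np.mean(errors_single[w]):.3f}")
print(f"average   : MSFE = {np.mean(errors_avg):.3f}")
```

By convexity, the combined forecast's squared error in each period is at most the average of the individual squared errors, so the combination can never do worse than the worst window on average.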
  11. By: Riccardo Gusso; Uwe Schmock (Department of Applied Mathematics, University of Venice)
    Abstract: In this contribution we analyze two models for the joint probability of defaults of dependent credit risks that are based on a generalisation of the Pólya urn scheme. In particular, we focus our attention on the problems related to maximum likelihood estimation of the parameters involved, and to this purpose we introduce an approach based on the Expectation-Maximization (EM) algorithm. We show how to implement it in this context and then analyze the results obtained, comparing them with those of other approaches.
    JEL: C13 C16
    Date: 2008–04
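The paper's urn-based models are not reproduced here, but the EM idea for latent-mixture default data can be sketched with a generic two-component binomial mixture (all parameter values and the model itself are illustrative stand-ins, not the paper's specification):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic portfolio data: 500 groups of m obligors; defaults are binomial
# with probability p1 or p2 depending on a latent regime.
m = 50
true_w, true_p = 0.3, (0.02, 0.10)
regime = rng.random(500) < true_w
defaults = rng.binomial(m, np.where(regime, true_p[1], true_p[0]))

def em_binomial_mixture(k, m, n_iter=200):
    """EM for a two-component binomial mixture: a generic illustration of
    latent-variable maximum likelihood, not the paper's urn model."""
    w, p1, p2 = 0.5, 0.01, 0.2            # starting values
    for _ in range(n_iter):
        # E-step: posterior probability that each group is in regime 2
        def loglik(p):                    # binomial log-likelihood up to a constant
            return k * np.log(p) + (m - k) * np.log(1 - p)
        log_r = np.log(w) + loglik(p2) - np.log(1 - w) - loglik(p1)
        r = 1.0 / (1.0 + np.exp(-log_r))
        # M-step: update mixture weight and default probabilities
        w = r.mean()
        p2 = np.sum(r * k) / np.sum(r * m)
        p1 = np.sum((1 - r) * k) / np.sum((1 - r) * m)
    return w, p1, p2

w_hat, p1_hat, p2_hat = em_binomial_mixture(defaults, m)
print(f"weight ~ {w_hat:.2f}, p1 ~ {p1_hat:.3f}, p2 ~ {p2_hat:.3f}")
```

The E-step softly assigns each group to a regime and the M-step re-estimates the parameters from those soft assignments; the same two-step structure applies when the latent mixture comes from an urn scheme.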
  12. By: Wong, Woon K (Cardiff Business School)
    Abstract: Let e and Σ be, respectively, the vector of shocks and its variance-covariance matrix in a linear system of equations in reduced form. This article shows that a unique orthogonal variance decomposition can be obtained if we impose a restriction that maximizes the trace of A, a positive definite matrix such that Az = e, where z is a vector of uncorrelated shocks with unit variance. Such a restriction is meaningful in that it associates the largest possible weight for each element of e with its corresponding element of z. It turns out that A = Σ^{1/2}, the square root of Σ.
    Keywords: Variance decomposition; Cholesky decomposition; unique orthogonal decomposition and square root matrix
    JEL: C01
    Date: 2008–04
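The result can be checked numerically: the symmetric square root Σ^{1/2} is positive definite, reproduces Σ, and differs from the order-dependent Cholesky factor (the covariance matrix below is illustrative):

```python
import numpy as np

# Illustrative variance-covariance matrix of the reduced-form shocks e
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])

# Symmetric square root A = Sigma^{1/2} via the eigendecomposition
# Sigma = V diag(w) V', so A = V diag(sqrt(w)) V'
w, V = np.linalg.eigh(Sigma)
A = V @ np.diag(np.sqrt(w)) @ V.T

print(np.allclose(A, A.T))          # A is symmetric ...
print(np.allclose(A @ A, Sigma))    # ... and A A = Sigma

# The Cholesky factor also satisfies L L' = Sigma, but it is lower
# triangular and depends on the ordering of the variables
L = np.linalg.cholesky(Sigma)
print(np.allclose(L @ L.T, Sigma))

# With z a vector of uncorrelated unit-variance shocks, e = A z has
# covariance A I A' = Sigma
rng = np.random.default_rng(0)
z = rng.standard_normal((2, 100_000))
e = A @ z
print(np.round(np.cov(e), 2))       # sample covariance close to Sigma
```

Unlike the Cholesky-based decomposition, the square-root factor is invariant to how the variables are ordered, which is what delivers the uniqueness claimed in the abstract.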
  13. By: Jose M. Vidal-Sanz
    Abstract: In this paper several definitions of probabilistic causation are considered and their main drawbacks discussed. Current notions of probabilistic causality have symmetry limitations (e.g., correlation and statistical dependence are symmetric notions). To avoid the symmetry problem, non-reciprocal causality is often defined in terms of dynamic asymmetry, but such notions are liable to pick up spurious regularities. In this paper we present a definition of causality that does not have symmetry inconsistencies. It is a natural extension of propositional causality in formal logics, and it can easily be analyzed with statistical inference. The modeling problems are also discussed using empirical processes.
    Keywords: Causality, Empirical Processes and Classification Theory, 62M30, 62M15, 62G20
    Date: 2008–04
  14. By: Amstad, Marlene (Swiss National Bank); Fischer, Andreas (CEPR)
    Abstract: Are weekly inflation forecasts informative? Although several central banks review and discuss monetary policy issues on a bi-weekly basis, there have been few attempts by analysts to construct systematic estimates of core inflation that support such a decision-making schedule. The timeliness of news releases and macroeconomic revisions is recognized to be an important information source in real-time estimation. We incorporate real-time information from macroeconomic releases and revisions into our weekly updates of monthly Swiss core inflation using a common factor procedure. Our weekly estimates of Swiss core inflation show that it is worthwhile to update the forecast at least twice a month.
    Keywords: Inflation; Common Factors; Sequential Information Flow
    JEL: E52 E58
    Date: 2008–01–29

This nep-ecm issue is ©2008 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.