
on Econometrics 
By:  Einmahl, J.H.J.; Segers, J.J.J. (Tilburg University, Center for Economic Research) 
Abstract:  AMS 2000 subject classifications: Primary 62G05, 62G30, 62G32; secondary 60G70, 60F05, 60F17, JEL: C13, C14. 
Keywords:  functional central limit theorem; local empirical process; moment constraint; multivariate extremes; nonparametric maximum likelihood estimator; tail dependence 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:dgr:kubcen:200842&r=ecm 
By:  Xiaohong Chen (Cowles Foundation, Yale University); Demian Pouzo (Dept. of Economics, New York University) 
Abstract:  This paper studies nonparametric estimation of conditional moment models in which the residual functions could be nonsmooth with respect to the unknown functions of endogenous variables. It is a problem of nonparametric nonlinear instrumental variables (IV) estimation, and a difficult nonlinear ill-posed inverse problem with an unknown operator. We first propose a penalized sieve minimum distance (SMD) estimator of the unknown functions that are identified via the conditional moment models. We then establish its consistency and convergence rate (in strong metric), allowing for possibly noncompact function parameter spaces, possibly noncompact finite or infinite dimensional sieves with flexible lower semicompact or convex penalty, or finite dimensional linear sieves without penalty. Under relatively low-level sufficient conditions, and for both mildly and severely ill-posed problems, we show that the convergence rates for the nonlinear ill-posed inverse problems coincide with the known minimax optimal rates for the nonparametric mean IV regression. We illustrate the theory by two important applications: root-n asymptotic normality of the plug-in penalized SMD estimator of a weighted average derivative of a nonparametric nonlinear IV regression, and the convergence rate of a nonparametric additive quantile IV regression. We also present a simulation study and an empirical estimation of a system of nonparametric quantile IV Engel curves. 
Keywords:  Nonsmooth residuals, Nonlinear ill-posed inverse, Penalized sieve minimum distance, Modulus of continuity, Average derivative of a nonparametric nonlinear IV regression, Nonparametric additive quantile IV regression 
JEL:  C13 C14 D12 
Date:  2008–04 
URL:  http://d.repec.org/n?u=RePEc:cwl:cwldpp:1650&r=ecm 
By:  Mark J Jensen; John M Maheu 
Abstract:  This paper extends the existing fully parametric Bayesian literature on stochastic volatility to allow for more general return distributions. Instead of specifying a particular distribution for the return innovation, nonparametric Bayesian methods are used to flexibly model the skewness and kurtosis of the distribution, while the dynamics of volatility continue to be modeled with a parametric structure. Our semiparametric Bayesian approach provides a full characterization of parametric and distributional uncertainty. A Markov chain Monte Carlo sampling approach to estimation is presented, along with the theoretical and computational issues involved in simulating from the posterior predictive distributions. The new model is assessed based on simulation evidence, an empirical example, and comparison to parametric models. 
Keywords:  Dirichlet process mixture, MCMC, block sampler 
JEL:  C22 C11 
Date:  2008–04–25 
URL:  http://d.repec.org/n?u=RePEc:tor:tecipa:tecipa314&r=ecm 
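As an illustrative aside, the Dirichlet process underlying the mixture model in the keywords above is commonly simulated by truncated stick-breaking. The sketch below is the generic textbook construction, not code from the paper; all names and parameter values are hypothetical:

```python
import numpy as np

def stick_breaking(alpha, n_atoms, rng):
    """Truncated stick-breaking draw of Dirichlet-process weights:
    w_k = v_k * prod_{j<k} (1 - v_j), with v_k ~ Beta(1, alpha)."""
    v = rng.beta(1.0, alpha, n_atoms)
    w = v * np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    return w

# Example: 50 atoms with concentration parameter alpha = 2
weights = stick_breaking(2.0, 50, np.random.default_rng(0))
```

With a moderate truncation level the weights sum to almost exactly one, so the truncation error is negligible in practice.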
By:  Patrik Guggenberger (Dept. of Economics, UCLA) 
Abstract:  This paper investigates the size properties of a two-stage test in the linear instrumental variables model when in the first stage a Hausman (1978) specification test is used as a pretest of exogeneity of a regressor. In the second stage, a simple hypothesis about a component of the structural parameter vector is tested, using a t-statistic that is based on either the ordinary least squares (OLS) or the two-stage least squares (2SLS) estimator depending on the outcome of the Hausman pretest. The asymptotic size of the two-stage test is derived in a model where weak instruments are ruled out by imposing a lower bound on the strength of the instruments. The asymptotic size is a function of this lower bound and the pretest and second stage nominal sizes. The asymptotic size increases as the lower bound and the pretest size decrease. It equals 1 for empirically relevant choices of the parameter space. It is also shown that, asymptotically, the conditional size of the second stage test, conditional on the pretest not rejecting the null of regressor exogeneity, is 1 even for a large lower bound on the strength of the instruments. The size distortion is caused by a discontinuity of the asymptotic distribution of the test statistic in the correlation parameter between the structural and reduced form error terms. The Hausman pretest does not have sufficient power against correlations that are local to zero, while the OLS t-statistic takes on large values for such nonzero correlations. Instead of using the two-stage procedure, the recommendation then is to use a t-statistic based on the 2SLS estimator or, if weak instruments are a concern, the conditional likelihood ratio test by Moreira (2003). 
Keywords:  Asymptotic size, Exogeneity, Hausman specification test, Pretest, Size distortion 
JEL:  C12 
Date:  2008–04 
URL:  http://d.repec.org/n?u=RePEc:cwl:cwldpp:1651&r=ecm 
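As an aside, the two-stage procedure the abstract describes can be sketched in a few lines for the scalar, single-instrument case. This is a hedged illustration with hypothetical data and deliberately simplified (homoskedastic, no-intercept) variance formulas, not the paper's implementation:

```python
import numpy as np

def hausman_two_stage(y, x, z, crit_pre=3.841, crit_t=1.96):
    """Two-stage test: Hausman pretest of exogeneity of x, then a
    t-test of H0: beta = 0 using OLS or 2SLS depending on the pretest.
    Scalar regressor x, single instrument z; names are illustrative."""
    n = y.size
    # OLS slope and (homoskedastic) variance estimate
    b_ols = (x @ y) / (x @ x)
    e_ols = y - x * b_ols
    v_ols = (e_ols @ e_ols) / (n - 1) / (x @ x)
    # 2SLS: instrument x by its projection on z
    xhat = z * ((z @ x) / (z @ z))
    b_iv = (xhat @ y) / (xhat @ x)
    e_iv = y - x * b_iv
    v_iv = (e_iv @ e_iv) / (n - 1) / (xhat @ xhat)
    # Hausman statistic: contrast of the two estimators
    H = (b_iv - b_ols) ** 2 / max(v_iv - v_ols, 1e-12)
    # Second stage: OLS t-test if pretest accepts exogeneity, else 2SLS
    b, v = (b_ols, v_ols) if H < crit_pre else (b_iv, v_iv)
    return abs(b / np.sqrt(v)) > crit_t  # reject H0: beta = 0?
```

The paper's point is about what Monte Carlo draws of this procedure do under correlations local to zero; the sketch only fixes ideas about the mechanics.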
By:  Clive Bowsher; Roland Meeks 
Abstract:  The class of Functional Signal plus Noise (FSN) models is introduced that provides a new, general method for modelling and forecasting time series of economic functions. The underlying, continuous economic function (or 'signal') is a natural cubic spline whose dynamic evolution is driven by a cointegrated vector autoregression for the ordinates (or 'y-values') at the knots of the spline. The natural cubic spline provides flexible cross-sectional fit and results in a linear, state space model. This FSN model achieves dimension reduction, provides a coherent description of the observed yield curve and its dynamics as the cross-sectional dimension N becomes large, and can feasibly be estimated and used for forecasting when N is large. The integration and cointegration properties of the model are derived. The FSN models are then applied to forecasting 36-dimensional yield curves for US Treasury bonds at the one-month-ahead horizon. The method consistently outperforms the Diebold and Li (2006) and random walk forecasts on the basis of both mean square forecast error criteria and economically relevant loss functions derived from the realised profits of pairs trading algorithms. The analysis also highlights in a concrete setting the dangers of attempts to infer the relative economic value of model forecasts on the basis of their associated mean square forecast errors. 
Keywords:  FSN-ECM models, functional time series, term structure, forecasting interest rates, natural cubic spline, state space form. 
JEL:  C33 C51 C53 E47 G12 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:sbs:wpsefe:2008fe24&r=ecm 
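For illustration, the cross-sectional building block of the FSN model, a natural cubic spline pinned down entirely by its ordinates at the knots, can be sketched with the standard tridiagonal construction. This is a generic textbook routine, not the authors' code:

```python
import numpy as np

def natural_cubic_spline(knots, ordinates):
    """Return a callable natural cubic spline through (knots, ordinates).
    Natural boundary conditions: zero second derivative at both ends."""
    x, y = np.asarray(knots, float), np.asarray(ordinates, float)
    n = x.size
    h = np.diff(x)
    # Solve the tridiagonal system for the knot second derivatives m
    A = np.zeros((n, n)); r = np.zeros(n)
    A[0, 0] = A[-1, -1] = 1.0  # natural boundary: m[0] = m[-1] = 0
    for i in range(1, n - 1):
        A[i, i-1], A[i, i], A[i, i+1] = h[i-1], 2*(h[i-1] + h[i]), h[i]
        r[i] = 6*((y[i+1] - y[i])/h[i] - (y[i] - y[i-1])/h[i-1])
    m = np.linalg.solve(A, r)

    def s(t):
        # Piecewise cubic evaluation on the interval containing t
        i = int(np.clip(np.searchsorted(x, t) - 1, 0, n - 2))
        d = t - x[i]
        b = (y[i+1] - y[i])/h[i] - h[i]*(2*m[i] + m[i+1])/6
        return y[i] + d*b + d**2*m[i]/2 + d**3*(m[i+1] - m[i])/(6*h[i])
    return s
```

In the FSN setting the ordinates at the knots are the state vector; the spline then delivers the whole curve, which is what makes the dimension reduction work.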
By:  Yamin Ahmad (Department of Economics, University of Wisconsin  Whitewater) 
Abstract:  This paper investigates the properties of a class of models which incorporate nonlinear dynamics, known as Threshold Autoregressive (TAR) models. Simulations show that, within the context of the real exchange rate literature, a threshold model of exchange rates exhibits significant small-sample bias even with long time series data. The results of this paper have serious implications for the properties of estimated coefficients in TAR models. 
Keywords:  Threshold Autoregressive Models, Nonlinear Models, Small Sample Bias, Real Exchange Rates, Simulation 
JEL:  F47 C15 C32 
Date:  2007–05 
URL:  http://d.repec.org/n?u=RePEc:uww:wpaper:0701&r=ecm 
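For illustration, the kind of two-regime threshold model and profile least-squares estimator studied in such Monte Carlo exercises can be sketched as below. All parameter values, grid choices, and names are hypothetical, not taken from the paper:

```python
import numpy as np

def simulate_tar(n, c=0.0, rho1=0.9, rho2=0.5, rng=None):
    """Simulate a two-regime TAR(1): the AR coefficient switches at
    threshold c depending on the lagged level."""
    rng = rng or np.random.default_rng(0)
    y = np.zeros(n)
    for t in range(1, n):
        rho = rho1 if y[t-1] <= c else rho2
        y[t] = rho * y[t-1] + rng.standard_normal()
    return y

def estimate_tar(y, thresholds):
    """Profile least squares: grid-search the threshold, OLS within
    each regime; return (threshold, rho_low, rho_high)."""
    y0, y1 = y[:-1], y[1:]
    best = (np.inf, None)
    for c in thresholds:
        lo = y0 <= c
        if lo.sum() < 5 or (~lo).sum() < 5:
            continue  # skip thresholds leaving a regime nearly empty
        sse, coef = 0.0, []
        for mask in (lo, ~lo):
            rho = (y0[mask] @ y1[mask]) / (y0[mask] @ y0[mask])
            sse += ((y1[mask] - rho * y0[mask]) ** 2).sum()
            coef.append(rho)
        if sse < best[0]:
            best = (sse, (c, *coef))
    return best[1]
```

Repeating the estimation over many short simulated samples and comparing the average estimates to the true coefficients is the standard way to measure the small-sample bias the abstract refers to.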
By:  Christian N. Brinch (Statistics Norway) 
Abstract:  Econometric duration data are typically interval-censored, that is, not directly observed, but observed to fall within a known interval. Known nonparametric identification results for duration models with unobserved heterogeneity rely crucially on exact observation of durations at a continuous scale. Here, it is established that the mixed hazards model is nonparametrically identified through covariates that vary over time within durations as well as between observations when durations are interval-censored. The results hold for the mixed proportional hazards model as a special case. 
Keywords:  duration analysis; interval-censoring; nonparametric identification 
JEL:  C41 
Date:  2008–04 
URL:  http://d.repec.org/n?u=RePEc:ssb:dispap:539&r=ecm 
By:  William A. Barnett; Apostolos Serletis 
Abstract:  This paper is an up-to-date survey of the state of the art in consumer demand modelling. We review and evaluate advances in a number of related areas, including different approaches to empirical demand analysis, such as the differential approach, the locally flexible functional forms approach, the semi-nonparametric approach, and a nonparametric approach. We also address estimation issues, including sampling-theoretic and Bayesian estimation methods, and discuss the limitations of the currently common approaches. Finally, we highlight the challenge inherent in achieving economic regularity, for consistency with the assumptions of the underlying neoclassical economic theory, as well as econometric regularity, when variables are nonstationary. 
JEL:  D12 E21 
Date:  2008–01–29 
URL:  http://d.repec.org/n?u=RePEc:clg:wpaper:200825&r=ecm 
By:  Bandyopadhyay, Debdas; Das, Arabinda 
Abstract:  This paper examines the identifiability of the standard single-equation stochastic frontier models with uncorrelated and correlated error components, giving, inter alia, mathematical content to the notion of “near-identifiability” of a statistical model. It is seen that these models are at least locally identifiable but suffer from the “near-identifiability” problem. Our results also highlight the pivotal role played by the signal-to-noise ratio in the “near-identifiability” of the stochastic frontier models. 
Keywords:  Identification; Stochastic frontier model; Information Matrix; Signal to Noise Ratio 
JEL:  C10 C52 C31 
Date:  2007–06 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:8032&r=ecm 
By:  Assenmacher-Wesche, Katrin (Swiss National Bank); Pesaran, M. Hashem (University of Cambridge) 
Abstract:  This paper uses vector error correction models of Switzerland for forecasting output, inflation and the short-term interest rate. It considers three different ways of dealing with forecast uncertainties. First, it investigates the effect on forecasting performance of averaging over forecasts from different models. Second, it considers averaging forecasts from different estimation windows. It is found that averaging over estimation windows is at least as effective as averaging over different models, and that the two approaches complement each other. Third, it examines whether using weighting schemes from the machine learning literature improves the average forecast. Compared to equal weights, the effect of alternative weighting schemes on forecast accuracy is small in the present application. 
Keywords:  Bayesian model averaging; choice of observation window; long-run structural vector autoregression 
JEL:  C32 C53 
Date:  2008–04–29 
URL:  http://d.repec.org/n?u=RePEc:ris:snbwpa:2008_003&r=ecm 
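As an aside, the paper's second strategy, averaging forecasts across estimation windows, can be sketched generically. The AR(1) forecaster and window sizes below are illustrative assumptions, not the paper's VECM setup:

```python
import numpy as np

def ar1_forecast(y):
    """One-step-ahead AR(1) forecast fitted on the sample y
    (no intercept, for brevity)."""
    rho = (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])
    return rho * y[-1]

def window_average_forecast(y, window_sizes):
    """Equal-weight average of forecasts from different estimation
    windows (the most recent w observations for each w)."""
    return np.mean([ar1_forecast(y[-w:]) for w in window_sizes])
```

The averaged forecast always lies between the most optimistic and most pessimistic window-specific forecasts, which is one intuition for why window averaging hedges against structural breaks near the start of the longest sample.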
By:  Riccardo Gusso; Uwe Schmock (Department of Applied Mathematics, University of Venice) 
Abstract:  In this contribution we analyze two models for the joint probability of defaults of dependent credit risks that are based on a generalisation of the Pólya urn scheme. In particular, we focus our attention on the problems related to maximum likelihood estimation of the parameters involved, and to this purpose we introduce an approach based on the Expectation-Maximization algorithm. We show how to implement it in this context and then analyze the results, comparing them with those obtained by other approaches. 
JEL:  C13 C16 
Date:  2008–04 
URL:  http://d.repec.org/n?u=RePEc:vnm:wpaper:163&r=ecm 
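For illustration, the Expectation-Maximization template the authors apply to their urn-based default model can be shown in a much simpler setting, a two-component Gaussian mixture with unit variances. This sketch is the generic EM recipe, not the paper's model; all names are illustrative:

```python
import numpy as np

def em_mixture(x, iters=200):
    """EM for a two-component Gaussian mixture with unit variances.
    Returns (p, mu): weight of component 0 and the two means."""
    mu = np.array([x.min(), x.max()])  # crude but workable start
    p = 0.5
    for _ in range(iters):
        # E-step: posterior responsibility of component 1 for each point
        d0 = p * np.exp(-0.5 * (x - mu[0]) ** 2)
        d1 = (1 - p) * np.exp(-0.5 * (x - mu[1]) ** 2)
        r = d1 / (d0 + d1)
        # M-step: re-estimate the weight and both means
        p = (1 - r).mean()
        mu = np.array([((1 - r) @ x) / (1 - r).sum(),
                       (r @ x) / r.sum()])
    return p, mu
```

In the credit-risk setting the E-step instead computes posterior expectations of the latent urn quantities, but the alternation between an expectation and a maximization step is the same.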
By:  Wong, Woon K (Cardiff Business School) 
Abstract:  Let e and Σ be respectively the vector of shocks and its variance-covariance matrix in a linear system of equations in reduced form. This article shows that a unique orthogonal variance decomposition can be obtained if we impose a restriction that maximizes the trace of A, a positive definite matrix such that Az = e, where z is a vector of uncorrelated shocks with unit variance. Such a restriction is meaningful in that it associates the largest possible weight for each element in e with its corresponding element in z. It turns out that A = Σ^{1/2}, the square root of Σ. 
Keywords:  Variance decomposition; Cholesky decomposition; unique orthogonal decomposition and square root matrix 
JEL:  C01 
Date:  2008–04 
URL:  http://d.repec.org/n?u=RePEc:cdf:wpaper:2008/10&r=ecm 
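The result stated in this abstract is easy to check numerically: the symmetric square root Σ^{1/2}, computed from the eigendecomposition, factors Σ and has (weakly) larger trace than other factorizations such as the Cholesky factor. A minimal sketch with a made-up covariance matrix, not the author's code:

```python
import numpy as np

def sym_sqrt(sigma):
    """Symmetric positive definite square root of a covariance matrix,
    via the eigendecomposition Sigma = Q diag(w) Q'."""
    w, q = np.linalg.eigh(sigma)
    return (q * np.sqrt(w)) @ q.T

# Illustrative 2x2 covariance matrix
sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])
A = sym_sqrt(sigma)        # A A' = Sigma, A symmetric
L = np.linalg.cholesky(sigma)  # also L L' = Sigma, but lower triangular
```

Here `np.linalg.inv(A) @ e` would deliver the uncorrelated unit-variance shocks z, and comparing `A.trace()` with `L.trace()` illustrates the trace-maximizing property of the symmetric root.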
By:  Jose M. VidalSanz 
Abstract:  In this paper several definitions of probabilistic causation are considered and their main drawbacks discussed. Current notions of probabilistic causality have symmetry limitations (e.g., correlation and statistical dependence are symmetric notions). To avoid the symmetry problem, nonreciprocal causality is often defined in terms of dynamic asymmetry, but such notions are likely to pick up spurious regularities. In this paper we present a definition of causality that does not have symmetry inconsistencies. It is a natural extension of propositional causality in formal logic, and it can easily be analyzed with statistical inference. The associated modeling problems are also discussed using empirical processes. 
Keywords:  Causality, Empirical Processes and Classification Theory, 62M30, 62M15, 62G20 
Date:  2008–04 
URL:  http://d.repec.org/n?u=RePEc:cte:wbrepe:wb081702&r=ecm 
By:  Amstad, Marlene (Swiss National Bank); Fischer, Andreas (CEPR) 
Abstract:  Are weekly inflation forecasts informative? Although several central banks review and discuss monetary policy issues on a biweekly basis, there have been few attempts by analysts to construct systematic estimates of core inflation that support such a decision-making schedule. The timeliness of news releases and macroeconomic revisions is recognized to be an important information source in real-time estimation. We incorporate real-time information from macroeconomic releases and revisions into our weekly updates of monthly Swiss core inflation using a common factor procedure. Our weekly estimates of Swiss core inflation show that it is worthwhile to update the forecast at least twice a month. 
Keywords:  Inflation; Common Factors; Sequential Information Flow 
JEL:  E52 E58 
Date:  2008–01–29 
URL:  http://d.repec.org/n?u=RePEc:ris:snbwpa:2008_005&r=ecm 