
on Econometrics 
By:  Alexander Chudik (European Central Bank, Kaiserstrasse 29, 60311 Frankfurt am Main, Germany.); M. Hashem Pesaran (University of Cambridge, CIMF and USC; Faculty of Economics, Austin Robinson Building, Sidgwick Avenue, Cambridge, CB3 9DD, United Kingdom.) 
Abstract:  This paper introduces a novel approach for dealing with the 'curse of dimensionality' in the case of large linear dynamic systems. Restrictions on the coefficients of an unrestricted VAR are proposed that are binding only in the limit as the number of endogenous variables tends to infinity. It is shown that under such restrictions, an infinite-dimensional VAR (or IVAR) can be arbitrarily well characterized by a large number of finite-dimensional models in the spirit of the global VAR model proposed in Pesaran et al. (JBES, 2004). The paper also considers IVAR models with dominant individual units and shows that these lead to a dynamic factor model with the dominant unit acting as the factor. The problems of estimation and inference in a stationary IVAR with an unknown number of unobserved common factors are also investigated. A cross-section augmented least squares estimator is proposed and its asymptotic distribution is derived. Satisfactory small-sample properties are documented by Monte Carlo experiments. JEL Classification: C10, C33, C51. 
Keywords:  Large N and T Panels, Weak and Strong Cross Section Dependence, VAR, Global VAR, Factor Models. 
Date:  2009–01 
URL:  http://d.repec.org/n?u=RePEc:ecb:ecbwps:20090998&r=ecm 
By:  Chen, Xiaohong (Yale U); Pouzo, Demian (New York U) 
Abstract:  For semi/nonparametric conditional moment models containing unknown parametric components (theta) and unknown functions of endogenous variables (h), Newey and Powell (2003) and Ai and Chen (2003) propose sieve minimum distance (SMD) estimation of (theta, h) and derive its large-sample properties. This paper greatly extends their results by establishing the following: (1) The penalized SMD (PSMD) estimator (hat{theta}, hat{h}) can simultaneously achieve root-n asymptotic normality of hat{theta} and the nonparametric optimal convergence rate of hat{h}, allowing for models with possibly nonsmooth residuals and/or noncompact infinite-dimensional parameter spaces. (2) A simple weighted bootstrap procedure can consistently estimate the limiting distribution of the PSMD hat{theta}. (3) The semiparametric efficiency bound results of Ai and Chen (2003) remain valid for conditional models with nonsmooth residuals, and the optimally weighted PSMD estimator achieves the bounds. (4) The profiled optimally weighted PSMD criterion is asymptotically chi-square distributed, which yields an alternative consistent estimate of the confidence region for the efficient PSMD estimator of theta. All the theoretical results are stated in terms of any consistent nonparametric estimator of conditional mean functions. We illustrate our general theory using a partially linear quantile instrumental variables regression, a Monte Carlo study, and an empirical estimation of shape-invariant quantile Engel curves with endogenous total expenditure. 
JEL:  C14 
Date:  2008–02 
URL:  http://d.repec.org/n?u=RePEc:ecl:yaleco:38&r=ecm 
By:  Chen, Xiaohong (New York U); Hong, Han (Duke U); Tarozzi, Alessandro 
Abstract:  We study semiparametric efficiency bounds and efficient estimation of parameters defined through general nonlinear, possibly nonsmooth and overidentified moment restrictions, where the sampling information consists of a primary sample and an auxiliary sample. The variables of interest in the moment conditions are not directly observable in the primary data set, but the primary data set contains proxy variables which are correlated with the variables of interest. The auxiliary data set contains information about the conditional distribution of the variables of interest given the proxy variables. Identification is achieved by the assumption that this conditional distribution is the same in both the primary and auxiliary data sets. We provide semiparametric efficiency bounds for both the "verify-out-of-sample" case, where the two samples are independent, and the "verify-in-sample" case, where the auxiliary sample is a subset of the primary sample; and the bounds are derived when the propensity score is unknown, or known, or belongs to a correctly specified parametric family. These efficiency variance bounds indicate that the propensity score is ancillary for the "verify-in-sample" case, but is not ancillary for the "verify-out-of-sample" case. We show that sieve conditional expectation projection based GMM estimators achieve the semiparametric efficiency bounds for all of the above-mentioned cases, and establish their asymptotic efficiency under mild regularity conditions. Although inverse probability weighting based GMM estimators are also shown to be semiparametrically efficient, they need stronger regularity conditions and clever combinations of nonparametric and parametric estimates of the propensity score to achieve the efficiency bounds for the various cases. Our results contribute to the literature on nonclassical measurement error models, missing data and treatment effects. 
JEL:  C1 
Date:  2008–03 
URL:  http://d.repec.org/n?u=RePEc:ecl:yaleco:42&r=ecm 
By:  Nielsen, Morten (Cornell U and CREATES) 
Abstract:  This paper presents a family of simple nonparametric unit root tests indexed by one parameter, d, and containing Breitung's (2002) test as the special case d = 1. It is shown that (i) each member of the family with d > 0 is consistent, (ii) the asymptotic distribution depends on d, and thus reflects the parameter chosen to implement the test, and (iii) since the asymptotic distribution depends on d and the test remains consistent for all d > 0, it is possible to analyze the power of the test for different values of d. The usual Phillips-Perron or Dickey-Fuller type tests are characterized by tuning parameters (bandwidth, lag length, etc.), i.e., parameters which change the test statistic but are not reflected in the asymptotic distribution, and thus have none of these three properties. It is shown that members of the family with d < 1 have higher asymptotic local power than the Breitung (2002) test, and when d is small the asymptotic local power of the proposed nonparametric test is relatively close to the parametric power envelope, particularly in the case with a linear time trend. Furthermore, GLS detrending is shown to improve power when d is small, which is not the case for Breitung's (2002) test. Simulations demonstrate that, apart from some size distortion in the presence of large negative AR or MA coefficients, the proposed test has good finite sample properties in the presence of both linear and nonlinear short-run dynamics. When applying a sieve bootstrap procedure, the proposed test has very good size properties, with finite sample power that is higher than that of Breitung's (2002) test and even rivals the (nearly) optimal parametric GLS detrended augmented Dickey-Fuller test with lag length chosen by an information criterion. 
JEL:  C22 
Date:  2008–05 
URL:  http://d.repec.org/n?u=RePEc:ecl:corcae:0805&r=ecm 
By:  Chen, Xiaohong (Yale U); Pouzo, Demian (New York U) 
Abstract:  This paper studies nonparametric estimation of conditional moment models in which the residual functions could be nonsmooth with respect to the unknown functions of endogenous variables. It is a problem of nonparametric nonlinear instrumental variables (IV) estimation, and a difficult nonlinear ill-posed inverse problem with an unknown operator. We first propose a penalized sieve minimum distance (SMD) estimator of the unknown functions that are identified via the conditional moment models. We then establish its consistency and convergence rate (in strong metric), allowing for possibly noncompact function parameter spaces, possibly noncompact finite or infinite dimensional sieves with flexible lower semicompact or convex penalty, or finite dimensional linear sieves without penalty. Under relatively low-level sufficient conditions, and for both mildly and severely ill-posed problems, we show that the convergence rates for the nonlinear ill-posed inverse problems coincide with the known minimax optimal rates for the nonparametric mean IV regression. We illustrate the theory with two important applications: root-n asymptotic normality of the plug-in penalized SMD estimator of a weighted average derivative of a nonparametric nonlinear IV regression, and the convergence rate of a nonparametric additive quantile IV regression. We also present a simulation study and an empirical estimation of a system of nonparametric quantile IV Engel curves. 
JEL:  C13 
Date:  2008–04 
URL:  http://d.repec.org/n?u=RePEc:ecl:yaleco:47&r=ecm 
By:  Jorda, Oscar (U of California, Davis); Marcellino, Massimiliano (Universita Bocconi) 
Abstract:  A path forecast refers to the sequence of forecasts 1 to H periods into the future. A summary of the range of possible paths the predicted variable may follow for a given confidence level requires construction of simultaneous confidence regions that adjust for any covariance between the elements of the path forecast. This paper shows how to construct such regions with the joint predictive density and Scheffé's (1953) S-method. In addition, the joint predictive density can be used to construct simple statistics to evaluate the local internal consistency of a forecasting exercise of a system of variables. Monte Carlo simulations demonstrate that these simultaneous confidence regions provide approximately correct coverage in situations where traditional error bands, based on the collection of marginal predictive densities for each horizon, are vastly off the mark. The paper showcases these methods with an application to the most recent monetary episode of interest rate hikes in the U.S. macroeconomy. 
JEL:  C32 
Date:  2008–07 
URL:  http://d.repec.org/n?u=RePEc:ecl:ucdeco:085&r=ecm 
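The Scheffé construction behind these simultaneous regions is easy to sketch: given a point path forecast and the covariance of the path forecast errors, the joint (1 - alpha) ellipsoid projects onto rectangular bands whose half-widths use a chi-square critical value with H degrees of freedom in place of the normal quantile used by horizon-by-horizon error bands. A minimal Python sketch (the function name and interface are ours, not the paper's):

```python
import numpy as np
from scipy.stats import chi2

def scheffe_path_bands(path_forecast, sigma, alpha=0.05):
    """Rectangular bands that contain the joint (1 - alpha) Scheffe ellipsoid.

    path_forecast : (H,) point forecasts for horizons 1..H
    sigma         : (H, H) covariance matrix of the path forecast errors
    """
    H = len(path_forecast)
    crit = chi2.ppf(1 - alpha, df=H)              # chi-square critical value, H dof
    half_width = np.sqrt(crit * np.diag(sigma))   # per-horizon half-widths
    return path_forecast - half_width, path_forecast + half_width
```

With H = 1 the band collapses to the familiar +/-1.96 standard-error band; for H > 1 it is wider at every horizon, which is the price of joint rather than pointwise coverage.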
By:  Jinyong Hahn (Department of Economics, UCLA); Keisuke Hirano (University of Arizona); Dean Karlan (Economic Growth Center, Yale University) 
Abstract:  Many social experiments are run in multiple waves, or are replications of earlier social experiments. In principle, the sampling design can be modified in later stages or replications to allow for more efficient estimation of causal effects. We consider the design of a two-stage experiment for estimating an average treatment effect, when covariate information is available for experimental subjects. We use data from the first stage to choose a conditional treatment assignment rule for units in the second stage of the experiment. This amounts to choosing the propensity score, the conditional probability of treatment given covariates. We propose to select the propensity score to minimize the asymptotic variance bound for estimating the average treatment effect. Our procedure can be implemented simply using standard statistical software and has attractive large-sample properties. 
Keywords:  experimental design, propensity score, efficiency bound 
JEL:  C1 C14 C9 C93 C13 
Date:  2009–01 
URL:  http://d.repec.org/n?u=RePEc:egc:wpaper:969&r=ecm 
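The variance-minimization step has a well-known closed form in the simplest case: if s1(x) and s0(x) denote the conditional standard deviations of treated and control outcomes, the pointwise contribution s1(x)^2/p(x) + s0(x)^2/(1 - p(x)) to the asymptotic variance bound is minimized by the Neyman allocation p*(x) = s1(x)/(s1(x) + s0(x)). A sketch under that simplification (first-stage estimation of the conditional variances is omitted; the names are ours):

```python
import numpy as np

def variance_bound_term(p, s1, s0):
    """Pointwise contribution s1^2/p + s0^2/(1-p) to the ATE variance bound."""
    return s1 ** 2 / p + s0 ** 2 / (1.0 - p)

def neyman_propensity(s1, s0):
    """Pointwise minimizer of the term above: p*(x) = s1(x) / (s1(x) + s0(x))."""
    s1, s0 = np.asarray(s1, dtype=float), np.asarray(s0, dtype=float)
    return s1 / (s1 + s0)
```

For example, if the treated outcome is twice as noisy as the control outcome at some x, the rule assigns treatment there with probability 2/3 rather than 1/2.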
By:  Jing, Li 
Abstract:  This paper examines the performance of prediction intervals based on bootstrap for threshold autoregressive models. We consider four bootstrap methods to account for the variability of estimates, correct the small-sample bias of autoregressive coefficients and allow for heterogeneous errors. Simulation shows that (1) accounting for the sampling variability of estimated threshold values is necessary despite super-consistency, (2) bias-correction leads to better prediction intervals under certain circumstances, and (3) the two-sample bootstrap can improve long-term forecasts when errors are regime-dependent. 
Keywords:  Bootstrap; Interval Forecasting; Threshold Autoregressive Models; Time Series; Simulation 
JEL:  C53 C22 C15 
Date:  2009–01 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:13086&r=ecm 
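As an illustration of the simplest of these schemes, the sketch below builds a residual-bootstrap forecast interval for a two-regime SETAR(1) model. It conditions on the estimated coefficients and a known threshold, so it deliberately omits the refinements the paper studies (threshold-estimation variability, bias correction, regime-dependent resampling); the function name and interface are hypothetical.

```python
import numpy as np

def setar_forecast_interval(y, threshold, horizon, n_boot=999, alpha=0.05, seed=0):
    """Residual-bootstrap forecast interval for a two-regime SETAR(1) model.

    Model: y_t = a_j + b_j * y_{t-1} + e_t, regime j = 1 if y_{t-1} <= threshold,
    else j = 2. Assumes both regimes are populated in the sample.
    """
    rng = np.random.default_rng(seed)
    y = np.asarray(y, dtype=float)
    x, z = y[:-1], y[1:]
    resid, coefs = np.empty_like(z), {}
    for j, mask in enumerate([x <= threshold, x > threshold]):
        X = np.column_stack([np.ones(mask.sum()), x[mask]])
        beta, *_ = np.linalg.lstsq(X, z[mask], rcond=None)   # per-regime OLS
        coefs[j] = beta
        resid[mask] = z[mask] - X @ beta
    paths = np.empty((n_boot, horizon))
    for b in range(n_boot):                                  # simulate future paths
        cur = y[-1]
        for h in range(horizon):
            a, slope = coefs[0] if cur <= threshold else coefs[1]
            cur = a + slope * cur + rng.choice(resid)        # resample residuals
            paths[b, h] = cur
    lo = np.quantile(paths, alpha / 2, axis=0)
    hi = np.quantile(paths, 1 - alpha / 2, axis=0)
    return lo, hi
```

The paper's point (1) is precisely that intervals like this one, which treat the threshold as known, can undercover; bootstrapping the estimation step as well addresses that.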
By:  James J. Heckman; Sergio Urzua 
Abstract:  This paper compares the economic questions addressed by instrumental variables estimators with those addressed by structural approaches. We discuss Marschak's Maxim: estimators should be selected on the basis of their ability to answer well-posed economic problems with minimal assumptions. A key identifying assumption that allows structural methods to be more informative than IV can be tested with data and does not have to be imposed. 
JEL:  C31 
Date:  2009–02 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:14706&r=ecm 
By:  Wagner Piazza Gaglianone; João Victor Issler 
Abstract:  This paper investigates an intertemporal optimization model to analyze the current account through Campbell & Shiller's (1987) approach. In this setup, a Wald test is conducted to analyze a set of restrictions imposed on a VAR used to forecast the current account for a set of countries. We focus on three estimation procedures: OLS, SUR and the two-way error decomposition of Fuller & Battese (1974). We also provide an original note on Granger causality, which is a necessary condition for performing the Wald test. Theoretical results show that, in the presence of global shocks, OLS and SUR estimators might lead to a biased covariance matrix, with serious implications for the validation of the model. A small Monte Carlo simulation confirms these findings and supports the Fuller & Battese procedure in the presence of global shocks. An empirical exercise for the G7 countries is also provided, and the results of the Wald test change substantially with different estimation techniques. In addition, global shocks can account for up to 40% of the total residuals of the G7. 
Date:  2009–01 
URL:  http://d.repec.org/n?u=RePEc:bcb:wpaper:178&r=ecm 
By:  Markus Baldauf; J.M.C. Santos Silva 
Abstract:  The use of robust regression estimators obtained by iteratively reweighted least squares (IRLS) is gaining popularity among applied econometricians. The main argument invoked to justify the use of the robust IRLS estimators is that they provide efficiency gains in the presence of outliers or nonnormal errors. Unfortunately, most practitioners seem to be unaware of the fact that heteroskedastic and skewed errors can dramatically affect the properties of these estimators. In this paper we reconsider the interpretation of the robust IRLS estimators when used in typical econometric problems, and conclude that their use cannot be generally recommended. 
Date:  2009–01–28 
URL:  http://d.repec.org/n?u=RePEc:esx:essedp:664&r=ecm 
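For readers unfamiliar with the estimators under discussion, a minimal IRLS implementation of the Huber M-estimator, a standard robust regression estimator of this family, looks as follows. This is a generic textbook sketch, not the specific procedure the paper examines; the tuning constant 1.345 is the conventional choice for 95% Gaussian efficiency.

```python
import numpy as np

def huber_irls(X, y, k=1.345, tol=1e-8, max_iter=100):
    """Huber M-estimator of a linear regression via IRLS."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]               # OLS starting values
    for _ in range(max_iter):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745      # robust scale (MAD)
        u = r / (s + 1e-12)
        w = np.where(np.abs(u) <= k, 1.0, k / np.abs(u))      # Huber weights
        WX = X * w[:, None]
        beta_new = np.linalg.solve(X.T @ WX, WX.T @ y)        # weighted LS step
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta
```

The paper's warning applies directly to this construction: the weights depend on the residuals, so heteroskedastic or skewed errors change what the estimator converges to, not just its efficiency.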
By:  Chen, Xiaohong (Yale U); Hansen, Lars Peter (U of Chicago); Carrasco, Marine (U of Montreal) 
Abstract:  Nonlinearities in the drift and diffusion coefficients influence temporal dependence in scalar diffusion models. We study this link using two notions of temporal dependence: beta-mixing and rho-mixing. We show that beta-mixing and rho-mixing with exponential decay are essentially equivalent concepts for scalar diffusions. For stationary diffusions that fail to be rho-mixing, we show that they are still beta-mixing except that the decay rates are slower than exponential. For such processes we find transformations of the Markov states that have finite variances but infinite spectral densities at frequency zero. Some have spectral densities that diverge at frequency zero in a manner similar to that of stochastic processes with long memory. Finally we show how nonlinear, state-dependent, Poisson sampling alters the unconditional distribution as well as the temporal dependence. 
JEL:  C12 
Date:  2008–05 
URL:  http://d.repec.org/n?u=RePEc:ecl:yaleco:48&r=ecm 
By:  Jian Wang; Jason J. Wu 
Abstract:  This paper attacks the Meese-Rogoff (exchange rate disconnect) puzzle from a different perspective: out-of-sample interval forecasting. Most studies in the literature focus on point forecasts. In this paper, we apply Robust Semiparametric (RS) interval forecasting to a group of Taylor rule models. Forecast intervals for twelve OECD exchange rates are generated and modified tests of Giacomini and White (2006) are conducted to compare the performance of Taylor rule models and the random walk. Our contribution is twofold. First, we find that in general, Taylor rule models generate tighter forecast intervals than the random walk, given that their intervals cover out-of-sample exchange rate realizations equally well. This result is more pronounced at longer horizons. Our results suggest a connection between exchange rates and economic fundamentals: economic variables contain information useful in forecasting the distributions of exchange rates. The benchmark Taylor rule model is also found to perform better than the monetary and PPP models. Second, the inference framework proposed in this paper for forecast-interval evaluation can be applied in a broader context, such as inflation forecasting, not just to the models and interval forecasting methods used in this paper. 
Date:  2009 
URL:  http://d.repec.org/n?u=RePEc:fip:fedgif:963&r=ecm 
By:  Aït-Sahalia, Yacine (Princeton U); Kimmel, Robert L. (Ohio State U) 
Abstract:  We develop and implement a technique for maximum likelihood estimation in closed form of multivariate affine yield models of the term structure of interest rates. We derive closed-form approximations to the likelihood functions for all nine of the Dai and Singleton (2000) canonical affine models with one, two, or three underlying factors. Monte Carlo simulations reveal that this technique very accurately approximates true maximum likelihood, which is, in general, infeasible for affine models. We also apply the method to a dataset consisting of synthetic US Treasury strips, and find parameter estimates for nine different affine yield models, each using two different market price of risk specifications. One advantage of maximum likelihood estimation is the ability to compare non-nested models using likelihood ratio tests. We find, using these tests, that the choice of preferred canonical model depends on the market price of risk specification. Comparison with other approximation methods, Euler and QML, on both simulated and real data suggests that our approximation technique is much closer to true MLE than the alternatives. 
Date:  2008–10 
URL:  http://d.repec.org/n?u=RePEc:ecl:ohidic:200819&r=ecm 
By:  Miguel A. León-Ledesma (Department of Economics, Keynes College, University of Kent, Canterbury, Kent CT2 7NP, United Kingdom.); Peter McAdam (Corresponding author: Research Department, European Central Bank, Kaiserstrasse 29, D-60311 Frankfurt am Main.); Alpo Willman (Research Department, European Central Bank, Kaiserstrasse 29, 60311 Frankfurt am Main, Germany.) 
Abstract:  Although these are critical parameters in many economic fields, the received wisdom in the theoretical and empirical literatures states that joint identification of the elasticity of capital-labor substitution and technical bias is infeasible. This paper challenges that pessimistic interpretation. Putting the new approach of "normalized" production functions at the heart of a Monte Carlo analysis, we identify the conditions under which identification is feasible and robust. The key result is that jointly modeling the production function and first-order conditions is superior to single-equation approaches in terms of robustly capturing production and technical parameters, especially when merged with "normalization". Our results have fundamental implications for production-function estimation under non-neutral technical change, for understanding the empirical relevance of normalization, and for the variability underlying past empirical studies. JEL Classification: C22, E23, O30, O51. 
Keywords:  Constant Elasticity of Substitution, Factor-Augmenting Technical Change, Normalization, Factor Income Share, Identification, Monte Carlo. 
Date:  2009–01 
URL:  http://d.repec.org/n?u=RePEc:ecb:ecbwps:200901001&r=ecm 
By:  Kiefer, Nicholas M. (Cornell U and CREATES, The Danish Science Foundation, U of Aarhus); Racine, Jeffrey S. (McMaster U) 
Abstract:  Kernel smoothing techniques have attracted much attention and some notoriety in recent years. The attention is well deserved as kernel methods free researchers from having to impose rigid parametric structure on their data. The notoriety arises from the fact that the amount of smoothing (i.e., local averaging) that is appropriate for the problem at hand is under the control of the researcher. In this paper we provide a deeper understanding of kernel smoothing methods for discrete data by leveraging the unexplored links between hierarchical Bayes models and kernel methods for discrete processes. A number of potentially useful results are thereby obtained, including bounds on when kernel smoothing can be expected to dominate nonsmooth (e.g., parametric) approaches in mean squared error and suggestions for thinking about the appropriate amount of smoothing. 
Date:  2008–05 
URL:  http://d.repec.org/n?u=RePEc:ecl:corcae:0801&r=ecm 
By:  Benjamin Jourdain (CERMICS  Centre d'Enseignement et de Recherche en Mathématiques, Informatique et Calcul Scientifique  INRIA  Ecole Nationale des Ponts et Chaussées); Mohamed Sbai (CERMICS  Centre d'Enseignement et de Recherche en Mathématiques, Informatique et Calcul Scientifique  INRIA  Ecole Nationale des Ponts et Chaussées) 
Abstract:  In this paper, we are interested in continuous time models in which the index level induces some feedback on the dynamics of its composing stocks. More precisely, we propose a model in which the log-returns of each stock may be decomposed into a systemic part proportional to the log-returns of the index plus an idiosyncratic part. We show that, when the number of stocks in the index is large, this model may be approximated by a local volatility model for the index and a stochastic volatility model for each stock with volatility driven by the index. We address calibration of both the limit and the original models. 
Keywords:  Index modeling, calibration, nonparametric estimation 
Date:  2008–12–17 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal00350652_v1&r=ecm 
By:  Martellosio, Federico 
Abstract:  This paper investigates how the correlations implied by a first-order simultaneous autoregressive (SAR(1)) process are affected by the weights matrix W and the autocorrelation parameter. We provide an interpretation of the covariances between the random variables observed at two spatial units, based on a particular type of walk connecting the two units. The interpretation serves to explain a number of correlation properties of SAR(1) models, and clarifies why it is impossible to control the correlations through the specification of W. 
Keywords:  simultaneous autoregressions; spatial autocorrelation; spatial weights matrices; walks in graphs. 
JEL:  C50 C21 
Date:  2008–10 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:13141&r=ecm 
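The walk interpretation rests on the Neumann expansion (I - rho*W)^-1 = sum_k rho^k W^k, whose (i, j) entry accumulates rho-weighted walks from i to j, so the SAR(1) covariance sigma^2 (I - rho*W)^-1 (I - rho*W')^-1 can be recovered by summing over walks. A sketch verifying that the two representations agree (valid when |rho| times the spectral radius of W is below one; the function names are ours):

```python
import numpy as np

def sar1_covariance(W, rho, sigma2=1.0):
    """Exact SAR(1) covariance: sigma2 * (I - rho W)^-1 (I - rho W')^-1."""
    n = W.shape[0]
    A = np.linalg.inv(np.eye(n) - rho * W)
    return sigma2 * A @ A.T

def sar1_covariance_walks(W, rho, sigma2=1.0, max_len=60):
    """Same covariance via the walk expansion A ~= sum_k rho^k W^k."""
    n = W.shape[0]
    A, P = np.zeros((n, n)), np.eye(n)
    for _ in range(max_len + 1):
        A += P                 # P currently holds rho^k W^k
        P = (rho * W) @ P
    return sigma2 * A @ A.T
```

Truncating the walk sum at a modest length already matches the exact covariance to machine precision when |rho| is well inside the stability region, which is the sense in which the covariances are "built from" walks through W.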
By:  Kiefer, Nicholas M. (Cornell U and US Department of the Treasury) 
Abstract:  Capital allocation decisions are made on the basis of an assessment of creditworthiness. Default is a rare event for most segments of a bank's portfolio and data information can be minimal. Inference about default rates is essential for efficient capital allocation, for risk management and for compliance with the requirements of the Basel II rules on capital standards for banks. Expert information is crucial in inference about defaults. A Bayesian approach is proposed and illustrated using prior distributions assessed from industry experts. A maximum entropy approach is used to represent expert information. The binomial model, most common in applications, is extended to allow correlated defaults yet remain consistent with Basel II. The application shows that probabilistic information can be elicited from experts and econometric methods can be useful even when data information is sparse. 
Date:  2008–04 
URL:  http://d.repec.org/n?u=RePEc:ecl:corcae:0802&r=ecm 
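The simplest conjugate special case of this approach is a Beta prior on the default probability combined with a binomial likelihood; the paper's maximum-entropy priors and correlated-default extension generalize this, but the sketch below conveys the mechanics (the function name is ours):

```python
def default_rate_posterior(a, b, defaults, n):
    """Posterior for a default probability under a Beta(a, b) expert prior.

    With d defaults observed among n exposures, the binomial likelihood gives
    the conjugate update Beta(a + d, b + n - d). Returns the posterior mean
    and the updated parameters.
    """
    a_post, b_post = a + defaults, b + n - defaults
    return a_post / (a_post + b_post), (a_post, b_post)
```

For instance, an expert prior Beta(1, 99) (mean default rate 1%) combined with 2 defaults among 100 exposures yields a posterior mean of 1.5%: the sparse data move the assessment, but the expert information still dominates.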