nep-ecm New Economics Papers
on Econometrics
Issue of 2014‒10‒17
twelve papers chosen by
Sune Karlsson
Örebro universitet

  1. Initial-Condition Free Estimation of Fixed Effects Dynamic Panel Data Models By Zhenlin Yang
  2. The wild tapered block bootstrap By Ulrich Hounyo
  3. Large Bayesian VARMAs By Joshua C C Chan; Eric Eisenstat; Gary Koop
  4. Moment-based estimation of nonlinear regression models with boundary outcomes and endogeneity, with applications to nonnegative and fractional responses By Esmeralda A. Ramalho; Joaquim J.S. Ramalho
  5. A J-Test for Panel Models with Fixed Effects, Spatial and Time By Harry H. Kelejian; Gianfranco Piras
  6. Testing for Neglected Nonlinearity Using Regularized Artificial Neural Networks By Tae-Hwy Lee; Zhou Xi; Ru Zhang
  7. A Nonparametric Test of Exogenous Participation in First-Price Auctions By Nianqing Liu; Yao Luo
  8. Identification of DSGE Models - the Effect of Higher-Order Approximation and Pruning By Willi Mutschler
  9. Minimax estimation of jump activity in semimartingales By Adam D. Bull
  10. The zero lower bound and parameter bias in an estimated DSGE model By Yasuo Hirose; Atsushi Inoue
  11. Signal Diffusion Mapping: Optimal Forecasting with Time Varying Lags By Paul Gaskell; Frank McGroarty; Thanassis Tiropanis
  12. Discrete choice estimation of time preferences By Jose Apesteguia; Miguel A. Ballester

  1. By: Zhenlin Yang (School of Economics, Singapore Management University, Singapore, 178903)
    Abstract: It is well known that the (quasi) MLE of dynamic panel data (DPD) models with short panels depends on the assumptions made about the initial values; ignoring them, or treating them incorrectly, results in inconsistency or serious bias. This paper introduces an initial-condition free method for estimating fixed-effects DPD models, through a simple modification of the quasi-score. An outer-product-of-gradients (OPG) method is also proposed for robust inference. The MLE of Hsiao, Pesaran and Tahmiscioglu (2002, Journal of Econometrics), in which the initial observations are modeled, is extended to a quasi MLE, and an OPG method is proposed for robust inference. Consistency and asymptotic normality are established for both estimation strategies, and the two methods are compared through Monte Carlo simulations. The proposed method performs well in general, whether the panel is short or not. The quasi MLE performs comparably, except when the model contains no time-varying regressor, or when the panel is not short and the dynamic parameter is small. The proposed method is much simpler and easier to apply.
    Keywords: Bias reduction; Consistency; Asymptotic normality; Dynamic panel; Fixed effects; Modified quasi-score; Robust standard error; Short panel
    JEL: C10 C13 C23 C15
    Date: 2014–09
  2. By: Ulrich Hounyo (Oxford-Man Institute, University of Oxford, and Aarhus University and CREATES)
    Abstract: In this paper, a new resampling procedure, called the wild tapered block bootstrap, is introduced as a means of calculating standard errors of estimators and constructing confidence regions for parameters based on dependent heterogeneous data. The method consists of first tapering each overlapping block of the series, then applying the standard wild bootstrap for independent, heteroskedastically distributed observations to the overlapping tapered blocks in an appropriate way. It preserves the favorable bias and mean squared error properties of the tapered block bootstrap, which is the state-of-the-art block-based method in terms of asymptotic accuracy of variance estimation and distribution approximation. For stationary time series, the asymptotic validity and the favorable bias properties of the new bootstrap method are shown in two important cases: smooth functions of means, and M-estimators. The first-order asymptotic validity of the tapered block bootstrap, as well as of the wild tapered block bootstrap approximation to the actual distribution of the sample mean, is also established when the data are assumed to satisfy a near epoch dependence condition. The consistency of the bootstrap variance estimator for the sample mean is shown to be robust against heteroskedasticity and dependence of unknown form. Simulation studies illustrate the finite-sample performance of the wild tapered block bootstrap. This easy-to-implement alternative bootstrap method works very well even for moderate sample sizes.
    Keywords: Block bootstrap, Near epoch dependence, Tapering, Variance estimation
    JEL: C15 C22
    Date: 2014–09–24
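    To fix ideas, the two-step recipe the abstract describes — taper each overlapping block, then perturb the tapered blocks with external wild multipliers — can be sketched as follows. This is an illustrative sketch only: the triangular taper, the scaling constants, and the standard-normal multipliers are simple generic choices, not the exact specification of the paper.

```python
import numpy as np

def wild_tapered_block_bootstrap(x, block_len, n_boot=999, seed=None):
    """Sketch of a wild tapered block bootstrap for the sample mean.

    Each overlapping block of the (centred) series is tapered and then
    multiplied by an external standard-normal "wild" multiplier;
    aggregating the perturbed blocks yields bootstrap replicates of
    the centred sample mean. Taper and scaling are illustrative.
    Assumes block_len > 1.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n, l = len(x), block_len
    w = 1.0 - np.abs(2.0 * np.arange(l) / (l - 1) - 1.0)   # triangular taper
    w /= np.sqrt(np.mean(w ** 2))                          # variance normalisation
    xc = x - x.mean()
    blocks = np.lib.stride_tricks.sliding_window_view(xc, l)  # all overlapping blocks
    tapered = blocks * w                                   # taper each block
    eta = rng.standard_normal((n_boot, tapered.shape[0]))  # one wild draw per block
    # each replicate: wild-weighted sum of tapered block sums, rescaled
    reps = eta @ tapered.sum(axis=1) / (n * np.sqrt(l))
    return reps  # draws of (mean* - mean); reps.std() estimates se of the mean
```

    The standard deviation of the returned replicates then serves as a heteroskedasticity- and dependence-robust standard error for the sample mean.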
  3. By: Joshua C C Chan (Australian National University); Eric Eisenstat (University of Bucharest); Gary Koop (Department of Economics, University of Strathclyde)
    Abstract: Vector Autoregressive Moving Average (VARMA) models have many theoretical properties which should make them popular among empirical macroeconomists. However, they are rarely used in practice due to over-parameterization concerns, difficulties in ensuring identification, and computational challenges. With the growing interest in multivariate time series models of high dimension, these problems with VARMAs become even more acute, accounting for the dominance of VARs in this field. In this paper, we develop a Bayesian approach for inference in VARMAs which surmounts these problems. It jointly ensures identification and parsimony in the context of an efficient Markov chain Monte Carlo (MCMC) algorithm. We use this approach in a macroeconomic application involving up to twelve dependent variables. We find our algorithm to work successfully and to provide insights beyond those provided by VARs.
    Keywords: VARMA identification, Markov Chain Monte Carlo, Bayesian, stochastic search variable selection
    JEL: C11 C32 E37
    Date: 2014–09
  4. By: Esmeralda A. Ramalho (Department of Economics and CEFAGE-UE, Universidade de Evora); Joaquim J.S. Ramalho (Department of Economics and CEFAGE-UE, Universidade de Evora)
    Abstract: In this paper we suggest simple moment-based estimators to deal with unobserved heterogeneity in a special class of nonlinear regression models, whose main particular cases are exponential models for nonnegative responses and logit and complementary loglog models for fractional responses. The proposed estimators: (i) treat observed and omitted covariates in a similar manner; (ii) can deal with boundary outcomes; (iii) accommodate endogenous explanatory variables without requiring knowledge of the reduced-form model, although such information may easily be incorporated in the estimation process; (iv) do not require distributional assumptions on the unobservables, a conditional mean assumption being enough for consistent estimation of the structural parameters; and (v) under the additional assumption that the dependence between observables and unobservables is restricted to the conditional mean, produce consistent estimators of partial effects conditional only on observables.
    Keywords: Unobserved heterogeneity; Endogeneity; Boundary outcomes; Fractional regression; Exponential regression; Transformation regression models.
    JEL: C25 C51 C26
    Date: 2014
  5. By: Harry H. Kelejian (Department of Economics, University of Maryland); Gianfranco Piras (Regional Research Institute, West Virginia University)
    Abstract: In this paper we suggest a J-test in a spatial panel framework of a null model against one or more alternatives. The null model we consider has fixed effects, along with spatial and time dependence. The alternatives can have either fixed or random effects. We implement our procedure to test the specification of a demand-for-cigarettes model. We find that the most appropriate specification is one that contains the average price of cigarettes in neighboring states, as well as the spatial lag of the dependent variable. Along with formal large sample results, we also give small sample Monte Carlo results. Our large sample results are based on the assumption that N → ∞ and T is fixed. Our Monte Carlo results suggest that our proposed J-test has good power, and proper size even for small to moderately sized samples.
    Keywords: spatial panel models, fixed effects, time and spatial lags, non-nested j-test
    JEL: C01 C12
    Date: 2013–03
  6. By: Tae-Hwy Lee (Department of Economics, University of California Riverside); Zhou Xi (University of California, Riverside); Ru Zhang (University of California, Riverside)
    Abstract: The artificial neural network (ANN) test of Lee et al. (Journal of Econometrics 56, 269–290, 1993) uses the ability of the ANN activation functions in the hidden layer to detect neglected functional misspecification. As the estimation of the ANN model is often quite difficult, LWG suggested activating the ANN hidden units based on randomly drawn activation parameters. To be robust to the random activations, a large number of activations is desirable. This leads to a situation in which the dimensionality needs to be regularized by techniques such as principal component analysis (PCA), Lasso, Pretest, and partial least squares (PLS), among others. However, some regularization methods can lead to selection bias in testing if the dimensionality reduction is conducted by supervising the relationship between the ANN hidden layer activations of inputs and the output variable. This paper demonstrates that while supervised regularization methods such as Lasso, Pretest, and PLS may be useful for forecasting, they should not be used for testing, because supervised regularization would create the post-sample inference or post-selection inference (PoSI) problem. Our Monte Carlo simulation shows that the PoSI problem is especially severe with PLS and Pretest, while it seems relatively mild or even negligible with Lasso. This paper also demonstrates that the use of unsupervised regularization does not lead to the PoSI problem. Lee et al. (Journal of Econometrics 56, 269–290, 1993) suggested a regularization by principal components, which is an unsupervised regularization. While supervised regularizations may be useful in forecasting, regularization should not be supervised in inference.
    Keywords: Randomized ANN activations; Dimension reduction; Supervised regularization; Unsupervised regularization; PCA; Lasso; PLS; Pretest; PoSI problem
    JEL: C12 C45
    Date: 2013–09
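    The unsupervised variant the abstract endorses — randomly activated hidden units reduced by PCA without looking at the outcome, then an LM-type test — can be sketched in a few lines. The nR² Lagrange-multiplier form, the logistic activations, and all tuning constants below are generic illustrations under stated assumptions, not the paper's exact statistic.

```python
import numpy as np

def lwg_pca_test(y, X, n_units=50, n_pc=3, seed=0):
    """Sketch of an LWG-style neglected-nonlinearity test with
    unsupervised (PCA) regularization of randomly activated hidden
    units. Illustrative nR^2 form; not the paper's exact statistic.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    Z = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    e = y - Z @ beta                                  # null-model residuals
    G = rng.uniform(-2.0, 2.0, size=(Z.shape[1], n_units))
    H = 1.0 / (1.0 + np.exp(-Z @ G))                  # random logistic activations
    Hc = H - H.mean(axis=0)
    U, s, _ = np.linalg.svd(Hc, full_matrices=False)  # PCA: never looks at y
    P = U[:, :n_pc] * s[:n_pc]                        # leading principal components
    W = np.column_stack([Z, P])
    g, *_ = np.linalg.lstsq(W, e, rcond=None)
    r2 = 1.0 - ((e - W @ g) ** 2).sum() / (e ** 2).sum()
    return n * r2    # approx. chi-square(n_pc) under the linear null
```

    Because the principal components are computed from the activations alone, the reduction step never "supervises" on y, which is what keeps the test free of the PoSI problem the abstract describes.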
  7. By: Nianqing Liu; Yao Luo
    Abstract: This paper proposes a nonparametric test of exogenous participation in first-price auctions. Exogenous participation means that the valuation distribution does not depend on the number of bidders. Our test is motivated by the fact that two valuation distributions are the same if and only if their generalized Lorenz curves are the same. Our method avoids estimating unobserved valuations and does not require smooth estimation of bid density. We show that our test is consistent against all fixed alternatives and has power against root-n local alternatives. Monte Carlo experiments show that our test performs well in finite samples. We implement our method on data from the U.S. Forest Service timber auctions. We also discuss how our test can be adapted to other testing problems in auctions.
    Keywords: Auctions, Exogenous Participation, Nonparametric, Hypothesis Test, Generalized Lorenz Curve
    JEL: D44 D82 C12 C14
    Date: 2014–10–01
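    The building block of the test — the empirical generalized Lorenz curve, GL(p) = ∫₀ᵖ F⁻¹(t) dt — is straightforward to compute, and a sup-norm distance between two such curves gives the flavor of the comparison. This is a sketch of the curve and distance only; the paper's critical values and asymptotic calibration are not reproduced here.

```python
import numpy as np

def generalized_lorenz(sample, grid):
    """Empirical generalized Lorenz curve GL(p) = integral_0^p F^{-1}(t) dt,
    evaluated by linear interpolation at the probabilities in `grid`."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    cum = np.concatenate([[0.0], np.cumsum(x)]) / n   # GL at p = i/n
    grid = np.asarray(grid, dtype=float)
    idx = np.clip(np.floor(grid * n).astype(int), 0, n - 1)
    frac = grid * n - idx
    return cum[idx] + frac * x[idx] / n

def gl_sup_distance(sample_a, sample_b, n_grid=200):
    """Sup-norm distance between two empirical GL curves. By the
    equivalence in the abstract, the population version is zero iff
    the two valuation distributions coincide."""
    grid = np.linspace(0.0, 1.0, n_grid)
    return np.abs(generalized_lorenz(sample_a, grid)
                  - generalized_lorenz(sample_b, grid)).max()
```

    In the auction setting the two samples would be (transformations of) bids from auctions with different numbers of bidders; a large distance is evidence against exogenous participation.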
  8. By: Willi Mutschler
    Abstract: Several formal methods have been proposed to check local identification in linearized DSGE models using rank criteria. Recently there has been huge progress in the estimation of non-linear DSGE models, yet formal identification criteria are missing. The contribution of the paper is threefold: First, we extend existing methods to higher-order approximations and establish rank criteria for local identification given the pruned state-space representation. It is shown that this may improve overall identification of a DSGE model by imposing additional restrictions on the moments and spectrum. Second, we derive analytical derivatives of the reduced-form matrices, unconditional moments and spectral density for the pruned state-space system. Third, using a second-order approximation, we are able to identify previously non-identifiable parameters: namely the parameters governing the investment adjustment costs in the Kim (2003) model and all parameters in the An and Schorfheide (2007) model, including the coefficients of the Taylor rule.
    Keywords: non-linear DSGE, rank condition, analytical derivatives, pruned state-space
    JEL: C10 C51 C52 E1
    Date: 2014–10
  9. By: Adam D. Bull
    Abstract: In quantitative finance, we often model asset prices as semimartingales, with drift, diffusion and jump components. The jump activity index measures the strength of the jumps at high frequencies, and is of interest both in model selection and fitting, and in volatility estimation. In this paper, we give a novel estimate of the jump activity, together with corresponding confidence intervals. Our estimate improves upon previous work, achieving near-optimal rates of convergence, and good finite-sample performance in Monte-Carlo experiments.
    Date: 2014–09
  10. By: Yasuo Hirose (Keio University); Atsushi Inoue (Vanderbilt University)
    Abstract: This paper examines how and to what extent parameter estimates can be biased in a dynamic stochastic general equilibrium (DSGE) model that omits the zero lower bound (ZLB) constraint on the nominal interest rate. Our Monte Carlo experiments using a standard sticky-price DSGE model show that no significant bias is detected in the parameter estimates and that the estimated impulse response functions are quite similar to the true ones. However, as the probability of hitting the ZLB increases, the parameter bias becomes larger, leading to substantial differences between the estimated and true impulse responses. It is also demonstrated that the model that omits the ZLB yields biased estimates of the structural shocks even when the parameter estimates are virtually unbiased.
    JEL: E3 E5
    Date: 2014–09–09
  11. By: Paul Gaskell; Frank McGroarty; Thanassis Tiropanis
    Abstract: We introduce a new methodology for forecasting which we call Signal Diffusion Mapping. Our approach accommodates features of real world financial data which have been ignored historically in existing forecasting methodologies. Our method builds upon well-established and accepted methods from other areas of statistical analysis. We develop and adapt those models for use in forecasting. We also present tests of our model on data in which we demonstrate the efficacy of our approach.
    Date: 2014–09
  12. By: Jose Apesteguia; Miguel A. Ballester
    Abstract: Discrete choice methods are often used for the estimation of time preferences. We show that these methods have pervasive problems when based on random utility models, in which case our results establish that the probability of selecting a later option over an earlier one may be greater for higher levels of impatience. This could have profound implications, not only for the experimental estimation of time preferences, but also for a wide variety of empirical papers using such models in dynamic settings. Alternatively, we show that discrete choice methods built on random preference models are always free of such problems.
    Keywords: Discrete Choice; Structural Estimation; Time; Discounting; Random Utility Models; Random Preference Models.
    JEL: C25 D90
    Date: 2014–08
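    The anomaly the abstract describes is easy to reproduce numerically in a logit random-utility model with exponential discounting: when both rewards are dated in the future, heavy discounting shrinks the utility difference toward zero, pushing the choice probability back toward one half. The payoffs, dates and noise scale below are illustrative choices, not taken from the paper.

```python
import numpy as np

def p_later_logit(delta, x_early=100.0, t_early=1, x_late=120.0, t_late=2,
                  scale=1.0):
    """P(choose the later reward) under a logit random-utility model
    with exponentially discounted utility u(x, t) = delta**t * x.
    All payoff and scale values are illustrative assumptions."""
    u_early = delta ** t_early * x_early
    u_late = delta ** t_late * x_late
    return 1.0 / (1.0 + np.exp(-(u_late - u_early) / scale))
```

    With these payoffs, a very impatient agent (delta = 0.05) chooses the later reward with a higher probability than a moderately impatient one (delta = 0.5), because discounting has compressed both utilities toward zero and the logit noise dominates — precisely the non-monotonicity in impatience that the paper identifies, and that random preference models avoid.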

This nep-ecm issue is ©2014 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at <>. For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.