nep-ecm New Economics Papers
on Econometrics
Issue of 2014‒12‒13
fourteen papers chosen by
Sune Karlsson
Örebro universitet

  1. Approximate Bayesian Computation in State Space Models By Gael M. Martin; Brendan P.M. McCabe; Worapree Maneesoonthorn; Christian P. Robert
  2. Choice of Spectral Density Estimator in Ng-Perron Test: Comparative Analysis By Malik, Muhammad Irfan; Rehman, Atiq-ur-
  3. Local Polynomial Order in Regression Discontinuity Designs By David Card; Zhuan Pei; David S. Lee; Andrea Weber
  4. Optimal Rank Tests for Symmetry against Edgeworth-Type Alternatives By Delphine Cassart; Marc Hallin; Davy Paindaveine
  5. A Multivariate Model for Multinomial Choices By Bel, K.; Paap, R.
  6. "Tests for Covariance Matrices in High Dimension with Less Sample Size" By Muni S. Srivastava; Hirokazu Yanagihara; Tatsuya Kubokawa
  7. "On Predictive Density Estimation for Location Families under Integrated <em>L</em><sub>2</sub> and<em>L</em><sub>1</sub> Losses" By Tatsuya Kubokawa; Éric Marchand; William E. Strawderman
  8. Dealing with unobservable common trends in small samples: a panel cointegration approach By Francesca Di Iorio; Stefano Fachin
  9. A Performance Comparison of Large-n Factor Estimators By Gregory Connor; Zhuo Chen; Robert A. Korajczyk
  10. Group Interaction in Research and the Use of General Nesting Spatial Models By Peter Burridge; J. Paul Elhorst; Katarina Zigova
  11. . . . and the Cross-Section of Expected Returns By Campbell R. Harvey; Yan Liu; Heqing Zhu
  12. Multi-curve HJM modelling for risk management By Chiara Sabelli; Michele Pioppi; Luca Sitzia; Giacomo Bormetti
  13. An Application of Kernel Density Estimation via Diffusion to Group Yield Insurance By Ramsey, Ford
  14. A Bayesian Latent Variable Mixture Model for Filtering Firm Profit Rate By Gregor Semieniuk; Ellis Scharfenaker

  1. By: Gael M. Martin; Brendan P.M. McCabe; Worapree Maneesoonthorn; Christian P. Robert
    Abstract: A new approach to inference in state space models is proposed, based on approximate Bayesian computation (ABC). ABC avoids evaluation of the likelihood function by matching observed summary statistics with statistics computed from data simulated from the true process; exact inference being feasible only if the statistics are sufficient. With finite sample sufficiency unattainable in the state space setting, we seek asymptotic sufficiency via the maximum likelihood estimator (MLE) of the parameters of an auxiliary model. We prove that this auxiliary model-based approach achieves Bayesian consistency, and that - in a precise limiting sense - the proximity to (asymptotic) sufficiency yielded by the MLE is replicated by the score. In multiple parameter settings a separate treatment of scalar parameters, based on integrated likelihood techniques, is advocated as a way of avoiding the curse of dimensionality. Some attention is given to a structure in which the state variable is driven by a continuous time process, with exact inference typically infeasible in this case as a result of intractable transitions. The ABC method is demonstrated using the unscented Kalman filter as a fast and simple way of producing an approximation in this setting, with a stochastic volatility model for financial returns used for illustration.
    Keywords: Likelihood-free methods, latent diffusion models, linear Gaussian state space models, asymptotic sufficiency, unscented Kalman filter, stochastic volatility.
    JEL: C11 C22 C58
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2014-20&r=ecm
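To make the matching step of entry 1 concrete, here is a minimal likelihood-free rejection sampler for a toy stochastic volatility model, using the OLS estimates of an auxiliary AR(1) model fitted to log squared returns as summary statistics. The model, priors, and acceptance rule are illustrative; the paper's score-based and integrated-likelihood refinements are not shown.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_sv(phi, sigma_v, T=500):
    """Toy SV model: h_t = phi*h_{t-1} + sigma_v*eta_t, y_t = exp(h_t/2)*eps_t."""
    h = np.zeros(T)
    for t in range(1, T):
        h[t] = phi * h[t - 1] + sigma_v * rng.standard_normal()
    return np.exp(h / 2) * rng.standard_normal(T)

def auxiliary_stats(y):
    """OLS estimates of an AR(1) fitted to log y_t^2 -- the auxiliary model."""
    x = np.log(y ** 2 + 1e-12)
    X = np.column_stack([np.ones(len(x) - 1), x[:-1]])
    beta, *_ = np.linalg.lstsq(X, x[1:], rcond=None)
    resid = x[1:] - X @ beta
    return np.array([beta[0], beta[1], resid.std()])

y_obs = simulate_sv(phi=0.95, sigma_v=0.3)     # stands in for observed data
s_obs = auxiliary_stats(y_obs)

draws = 5000
params = np.column_stack([rng.uniform(0.5, 0.999, draws),   # prior for phi
                          rng.uniform(0.05, 1.0, draws)])   # prior for sigma_v
dist = np.array([np.linalg.norm(auxiliary_stats(simulate_sv(*p)) - s_obs)
                 for p in params])
keep = params[dist <= np.quantile(dist, 0.01)]  # ABC: keep the closest 1%
print("approximate posterior means (phi, sigma_v):", keep.mean(axis=0))
```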
  2. By: Malik, Muhammad Irfan; Rehman, Atiq-ur-
    Abstract: Ng and Perron (2001) designed a unit root test which incorporates the properties of the DF-GLS and Phillips-Perron tests. Ng and Perron claim that the test performs exceptionally well, especially in the presence of a negative moving average. However, the performance of the test depends heavily on the choice of the spectral density estimator used in its construction. Various estimators of the spectral density are available in the literature, and they have a crucial impact on the outcome of the test, yet there is no clarity on which of them gives optimal size and power properties. This study evaluates the performance of the Ng-Perron test for different choices of spectral density estimator in the presence of negative and positive moving averages, using Monte Carlo simulations. The results for large samples show that: (a) in the presence of a positive moving average, the test with a kernel-based estimator gives good effective power and no size distortion; (b) in the presence of a negative moving average, the autoregressive estimator gives better effective power, but huge size distortion is observed in several specifications of the data generating process.
    Keywords: Ng-Perron test, Monte Carlo, Spectral Density, Unit Root Testing
    JEL: C01 C15 C63
    Date: 2014–11–17
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:59973&r=ecm
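The choice entry 2 studies is between estimators of the spectral density at frequency zero that enter the Ng-Perron statistics. Below is a minimal sketch of the two candidate classes, with illustrative lag and bandwidth choices; GLS detrending, the full test statistics, and data-driven lag selection such as MAIC are omitted.

```python
import numpy as np

def ar_spectral_density_zero(y, k=4):
    """Autoregressive estimator: regress dy_t on y_{t-1} and k lags of dy,
    then s2 = sigma_hat^2 / (1 - sum of the lag coefficients)^2."""
    dy = np.diff(y)
    T = len(dy)
    X = [y[k:-1]]                        # level term y_{t-1}
    for j in range(1, k + 1):
        X.append(dy[k - j:T - j])        # lagged difference dy_{t-j}
    X = np.column_stack(X)
    yy = dy[k:]
    beta, *_ = np.linalg.lstsq(X, yy, rcond=None)
    resid = yy - X @ beta
    sigma2 = resid @ resid / len(yy)
    return sigma2 / (1.0 - beta[1:].sum()) ** 2

def bartlett_lrv(u, bandwidth=8):
    """Kernel (Bartlett) estimator of the long-run variance of u."""
    u = u - u.mean()
    T = len(u)
    s2 = u @ u / T
    for j in range(1, bandwidth + 1):
        w = 1.0 - j / (bandwidth + 1.0)
        s2 += 2.0 * w * (u[j:] @ u[:-j]) / T
    return s2

# Toy usage: a unit-root series with a negative MA component
rng = np.random.default_rng(11)
e = rng.standard_normal(501)
y = np.cumsum(e[1:] - 0.8 * e[:-1])
print(ar_spectral_density_zero(y), bartlett_lrv(np.diff(y)))
```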
  3. By: David Card (UC Berkeley, NBER, and IZA); Zhuan Pei (Brandeis University); David S. Lee (Princeton University and NBER); Andrea Weber (University of Mannheim and IZA)
    Abstract: The local linear estimator has become the standard in the regression discontinuity design literature, but we argue that it should not be presumed to dominate other local polynomial estimators in empirical studies. We show that the local linear estimator in the data generating processes (DGPs) based on two well-known empirical examples does not always have the lowest (asymptotic) mean squared error (MSE). Therefore, we advocate for a more flexible view towards the choice of the polynomial order, p, and suggest two complementary approaches for picking p: comparing the MSE of alternative estimators from Monte Carlo simulations based on an approximating DGP, and comparing the estimated asymptotic MSE using actual data.
    Keywords: Regression Discontinuity Design; Regression Kink Design; Local Polynomial Estimation; Polynomial Order
    Date: 2014–10
    URL: http://d.repec.org/n?u=RePEc:brd:wpaper:81&r=ecm
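A sketch of the estimator whose order entry 3 is about: a sharp-RD local polynomial fit of order p on each side of the cutoff, with a triangular kernel and a hand-picked bandwidth. Bandwidth selection and the MSE comparisons the authors propose are omitted, and all names are illustrative.

```python
import numpy as np

def rd_estimate(x, y, cutoff=0.0, p=1, h=1.0):
    """Sharp RD: weighted polynomial fit of order p on each side of the
    cutoff, then the difference of the fitted values at the cutoff.
    Triangular kernel weights within bandwidth h."""
    def side_fit(mask):
        xs, ys = x[mask] - cutoff, y[mask]
        w = np.maximum(1 - np.abs(xs) / h, 0)            # triangular kernel
        keep = w > 0
        X = np.vander(xs[keep], p + 1, increasing=True)  # [1, x, ..., x^p]
        sw = np.sqrt(w[keep])                            # WLS via row scaling
        beta, *_ = np.linalg.lstsq(X * sw[:, None], ys[keep] * sw, rcond=None)
        return beta[0]                                   # intercept at cutoff
    return side_fit(x >= cutoff) - side_fit(x < cutoff)

# Toy usage: true jump of 0.5 at the cutoff, compared across orders p
rng = np.random.default_rng(2)
x = rng.uniform(-2, 2, 2000)
y = 0.3 * x + 0.5 * (x >= 0) + 0.2 * rng.standard_normal(2000)
for p in (1, 2, 3):
    print(p, rd_estimate(x, y, p=p, h=1.0))
```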
  4. By: Delphine Cassart; Marc Hallin; Davy Paindaveine
    Keywords: test of symmetry; skewness; Edgeworth expansion; local asymptotic normality; signed rank test
    Date: 2014–11
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2013/177105&r=ecm
  5. By: Bel, K.; Paap, R.
    Abstract: Multinomial choices of individuals are likely to be correlated. Nonetheless, econometric models for this phenomenon are scarce. A problem of multivariate multinomial choice models is that the number of potential outcomes can become very large, which makes parameter interpretation and inference difficult. We propose a novel Multivariate Multinomial Logit specification, where (i) the number of parameters stays limited; (ii) there is a clear interpretation of the parameters in terms of odds ratios; (iii) zero restrictions on parameters result in independence between the multinomial choices; and (iv) parameter inference is feasible using a composite likelihood approach even if the multivariate dimension is large. Finally, these nice properties are also valid in a fixed-effects panel version of the model.
    Keywords: Discrete Choices, Multivariate analysis, Multinomial Logit, Composite Likelihood
    Date: 2014–10–13
    URL: http://d.repec.org/n?u=RePEc:ems:eureir:77168&r=ecm
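To make property (iv) of entry 5 concrete, here is a stylized composite likelihood for three correlated multinomial choices, using a log-linear joint with baseline-category normalization and one shared association matrix. The composite objective below is built from full conditionals (a pseudo-likelihood), so each term needs only J evaluations and the method stays feasible as the number of choice dimensions grows; this is an illustration of the composite-likelihood idea, not the authors' exact specification.

```python
import numpy as np
from itertools import combinations, product
from scipy.optimize import minimize

rng = np.random.default_rng(3)
D, J, n = 3, 3, 2000            # choice dimensions, alternatives, individuals

def unpack(theta):
    a = theta[:D * (J - 1)].reshape(D, J - 1)
    g = theta[D * (J - 1):].reshape(J - 1, J - 1)
    alpha = np.column_stack([np.zeros(D), a])      # alternative 0 as baseline
    gamma = np.zeros((J, J)); gamma[1:, 1:] = g    # zero gamma => independence
    return alpha, gamma

# Simulate from the log-linear joint (feasible here because J**D is only 27)
theta_true = rng.normal(0.0, 0.5, D * (J - 1) + (J - 1) ** 2)
alpha, gamma = unpack(theta_true)
cells = np.array(list(product(range(J), repeat=D)))
logp = np.array([alpha[range(D), c].sum()
                 + sum(gamma[c[d], c[e]] for d, e in combinations(range(D), 2))
                 for c in cells])
p = np.exp(logp - logp.max()); p /= p.sum()
Y = cells[rng.choice(len(cells), size=n, p=p)]

def neg_composite_loglik(theta):
    """Composite likelihood from full conditionals: sum_d log P(y_d | y_-d)."""
    alpha, gamma = unpack(theta)
    nll = 0.0
    for d in range(D):
        t = np.tile(alpha[d], (n, 1))              # n x J logits for y_d
        for e in range(D):
            if e != d:
                t += gamma[:, Y[:, e]].T if d < e else gamma[Y[:, e], :]
        t -= np.log(np.exp(t).sum(axis=1, keepdims=True))
        nll -= t[np.arange(n), Y[:, d]].sum()
    return nll

fit = minimize(neg_composite_loglik, np.zeros_like(theta_true), method="BFGS")
print("max abs error (shrinks as n grows):", np.abs(fit.x - theta_true).max())
```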
  6. By: Muni S. Srivastava (Department of Statistics, University of Toronto); Hirokazu Yanagihara (Department of Mathematics, Hiroshima University); Tatsuya Kubokawa (Faculty of Economics, The University of Tokyo)
    Abstract: In this article, we propose tests for covariance matrices of high dimension with fewer observations than the dimension, for a general class of distributions with positive definite covariance matrices. In the one-sample case, tests are proposed for sphericity and for testing the hypothesis that the covariance matrix Σ is an identity matrix, by providing an unbiased estimator of tr[Σ^2] under the general model which requires no more computing time than the one available in the literature for the normal model. In the two-sample case, tests for the equality of two covariance matrices are given. The asymptotic distributions of the proposed tests in the one-sample case are derived under the assumption that the sample size N = O(p^δ), 1/2 < δ < 1, where p is the dimension of the random vector, and N = O(p^δ) means that N/p goes to zero as N and p go to infinity. Similar assumptions are made in the two-sample case.
    Date: 2014–06
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2014cf933&r=ecm
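The key ingredient in entry 6 is an unbiased estimate of tr[Σ^2] that works without normality and with N much smaller than p. The paper's estimator is more efficient than the following, but a simple distribution-free unbiased estimator built from disjoint quadruples illustrates why the quantity is estimable at all: for iid rows, E[((X1 - X2)'(X3 - X4))^2] = 4 tr[Σ^2].

```python
import numpy as np

def tr_sigma2_unbiased(X):
    """Distribution-free unbiased estimator of tr(Sigma^2) from disjoint
    quadruples of rows, using E[((X1-X2)'(X3-X4))^2] = 4*tr(Sigma^2) for iid
    rows with covariance Sigma.  Inefficient (the paper's estimator reuses
    the data far better) but valid even when N is much smaller than p."""
    N = X.shape[0] - X.shape[0] % 4
    A = X[0:N:4] - X[1:N:4]
    B = X[2:N:4] - X[3:N:4]
    return np.mean(np.einsum('ij,ij->i', A, B) ** 2) / 4.0

# Toy check with p > N: Sigma = I_p, so tr(Sigma^2) = p (estimate is noisy)
rng = np.random.default_rng(4)
N, p = 200, 500
print(tr_sigma2_unbiased(rng.standard_normal((N, p))), "vs true", p)
```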
  7. By: Tatsuya Kubokawa (Faculty of Economics, The University of Tokyo); Éric Marchand (Université de Sherbrooke, Departement de mathématiques); William E. Strawderman (Rutgers University, Department of Statistics and Biostatistics,)
    Abstract: Our investigation concerns the estimation of predictive densities and a study of efficiency as measured by the frequentist risk of such predictive densities with integrated L2 and L1 losses. Our findings relate to a p-variate spherically symmetric observable X ~ p_X(||x - μ||^2) and the objective of estimating the density of Y ~ q_Y(||y - μ||^2) based on X. For L2 loss, we describe Bayes estimation, minimum risk equivariant (MRE) estimation, and minimax estimation. We focus on the risk performance of the benchmark minimum risk equivariant estimator, plug-in estimators, and plug-in type estimators with expanded scale. For the multivariate normal case, we make use of a duality result with a point estimation problem bringing into play reflected normal loss. In three or more dimensions (i.e., p ≥ 3), we show that the MRE estimator is inadmissible under L2 loss and provide dominating estimators. This brings into play Stein-type results for estimating a multivariate normal mean with a loss which is a concave and increasing function of ||δ - μ||^2. We also study the phenomenon of improvement on the plug-in density estimator of the form q_Y(||y - aX||^2), 0 < a ≤ 1, by a subclass of scale expansions c^{-p} q_Y(||(y - aX)/c||^2) with c > 1, showing in some cases, inevitably for large enough p, that all choices c > 1 are dominating estimators. Extensions are obtained for scale mixtures of normals, including a general inadmissibility result for the MRE estimator when p ≥ 3. Finally, we describe and expand on analogous plug-in dominance results for spherically symmetric distributions with p ≥ 4 under L1 loss.
    Date: 2014–07
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2014cf935&r=ecm
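Entry 7's scale-expansion phenomenon can be checked numerically in the simplest normal case. Using the identity that the integral of the product of two normal densities N(a, s^2) and N(b, t^2) equals the N(0, s^2 + t^2) density at a - b, the integrated L2 risk of the plug-in N(X, 1) versus an expanded-scale N(X, c^2) estimator of the N(μ, 1) density of Y has a closed form that is free of μ. This univariate toy version is only suggestive: the paper's dominance results concern p ≥ 3 and general spherically symmetric families.

```python
import numpy as np

def phi0(v):
    """Value at zero of the N(0, v) density."""
    return 1.0 / np.sqrt(2 * np.pi * v)

def l2_risk(c, sigma2=1.0):
    """Integrated L2 risk of the predictive density N(X, c^2*sigma2) for the
    N(mu, sigma2) density of Y, with X ~ N(mu, sigma2).  From the identity
    int N(y;a,s2) N(y;b,t2) dy = N(a-b; 0, s2+t2), the risk is free of mu:
      risk(c) = phi0(2 c^2 s2) - 2 phi0((c^2 + 2) s2) + phi0(2 s2)."""
    s2 = sigma2
    return phi0(2 * c**2 * s2) - 2 * phi0((c**2 + 2) * s2) + phi0(2 * s2)

for c in (1.0, 1.2, np.sqrt(2), 1.6):
    print(f"c = {c:.3f}  risk = {l2_risk(c):.5f}")  # minimized at c = sqrt(2)
```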
  8. By: Francesca Di Iorio (Università di Napoli Federico II); Stefano Fachin (Università di Roma "La Sapienza")
    Abstract: Non-stationary panel models allowing for unobservable common trends have recently become very popular. However, standard methods, which are based on factor extraction or on models augmented with cross-section averages, require large sample sizes that are not always available in practice. In these cases we propose the simple and robust alternative of augmenting the panel regression with common time dummies. The underlying assumption of additive effects can be tested by means of a panel cointegration test, with no need to estimate a general interactive effects model. An application to modelling labour productivity growth in the four major European economies (France, Germany, Italy and the UK) illustrates the method.
    Keywords: Common trends, Panel cointegration, TFP.
    JEL: C23 C15 E2
    Date: 2014–11
    URL: http://d.repec.org/n?u=RePEc:sas:wpaper:20145&r=ecm
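The proposal in entry 8 amounts to a simple transformation: additive common trends are absorbed by time dummies, which in a balanced panel is equivalent to subtracting the cross-section average period by period. A minimal sketch under that assumption, with illustrative variable names:

```python
import numpy as np

rng = np.random.default_rng(5)
N, T = 10, 30                       # a small panel, as in the paper's setting
f = np.cumsum(rng.standard_normal(T))                 # unobservable common trend
x = np.cumsum(rng.standard_normal((N, T)), axis=1)    # I(1) regressor
y = 0.5 * x + f + 0.3 * rng.standard_normal((N, T))   # additive common effect

# Augmenting with common time dummies == per-period demeaning (balanced panel)
y_d = y - y.mean(axis=0)
x_d = x - x.mean(axis=0)
beta = (x_d * y_d).sum() / (x_d ** 2).sum()   # pooled OLS on transformed data
print("beta_hat =", beta)                     # close to the true 0.5
```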
  9. By: Gregory Connor (Department of Economics, Finance and Accounting, National University of Ireland, Maynooth); Zhuo Chen (PBC School of Finance, Tsinghua University); Robert A. Korajczyk (Kellogg School of Management, Northwestern University, IL 60208-2001, USA)
    Abstract: This paper uses simulations to evaluate the performance of various methods for estimating factor returns in an approximate factor model when the cross-sectional sample (n) is large relative to the time-series sample (T). We study the performance of the estimators under a variety of alternative specifications of the underlying factor structure. We find that 1) all of the estimators perform well, even when they do not accommodate the form of heteroskedasticity present in the data; 2) for the sample sizes considered here, accommodating heteroskedasticity does not deteriorate performance much when simple forms of heteroskedasticity are present; 3) estimators that handle missing data by substituting fitted returns from the factor model converge to the true factors more slowly than the other estimators.
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:may:mayecw:n255-14.pdf&r=ecm
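One natural large-n estimator in entry 9's comparison is asymptotic principal components in the spirit of Connor and Korajczyk, which extracts factor returns from eigenvectors of the T x T cross-product matrix rather than an n x n covariance matrix; that switch is what makes large n cheap. A minimal sketch (scaling convention and names are illustrative):

```python
import numpy as np

def apc_factors(R, k):
    """Asymptotic principal components: with returns R (T x n), take the top-k
    eigenvectors of the T x T matrix R R'/n as the factor return series,
    scaled so that F'F/T = I."""
    T = R.shape[0]
    omega = R @ R.T / R.shape[1]                # T x T, cheap when n >> T
    vals, vecs = np.linalg.eigh(omega)          # ascending eigenvalues
    return vecs[:, ::-1][:, :k] * np.sqrt(T)    # keep the top k

# Toy check: one true factor, large n, short T
rng = np.random.default_rng(6)
T, n = 60, 5000
f = rng.standard_normal(T)
beta = rng.standard_normal(n)
R = np.outer(f, beta) + rng.standard_normal((T, n))
fhat = apc_factors(R, 1)[:, 0]
print("abs corr with true factor:", abs(np.corrcoef(fhat, f)[0, 1]))
```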
  10. By: Peter Burridge (Department of Economics and Related Studies, University of York, UK); J. Paul Elhorst (Faculty of Economics and Business, University of Groningen, The Netherlands); Katarina Zigova (Department of Economics, University of Konstanz, Germany)
    Abstract: This paper tests the feasibility and empirical implications of a spatial econometric model with a full set of interaction effects and weight matrix defined as an equally weighted group interaction matrix applied to research productivity of individuals. We also elaborate two extensions of this model, namely with group fixed effects and with heteroskedasticity. In our setting the model with a full set of interaction effects is overparameterised: only the SDM and SDEM specifications produce acceptable results. They imply comparable spillover effects, but by applying a Bayesian approach taken from LeSage (2014), we are able to show that the SDEM specification is more appropriate and thus that colleague interaction effects work through observed and unobserved exogenous characteristics common to researchers within a group.
    Keywords: Spatial econometrics, identification, heteroskedasticity, group fixed effects, interaction effects, research productivity
    JEL: C21 D8 I23 J24
    Date: 2014–09–17
    URL: http://d.repec.org/n?u=RePEc:knz:dpteco:1419&r=ecm
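The weight matrix in entry 10 is simple to construct: within each group, every pair of distinct members gets weight 1/(group size - 1), and weights are zero across groups. A short sketch with illustrative labels:

```python
import numpy as np

def group_interaction_matrix(groups):
    """Equally weighted group interaction matrix: W[i, j] = 1/(m - 1) if i and
    j (i != j) share a group of size m, else 0.  Rows sum to one; members of
    singleton groups get all-zero rows."""
    groups = np.asarray(groups)
    n = len(groups)
    W = np.zeros((n, n))
    for g in np.unique(groups):
        idx = np.where(groups == g)[0]
        m = len(idx)
        if m > 1:
            W[np.ix_(idx, idx)] = 1.0 / (m - 1)
    np.fill_diagonal(W, 0.0)
    return W

print(group_interaction_matrix(["a", "a", "b", "b", "b"]))
```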
  11. By: Campbell R. Harvey; Yan Liu; Heqing Zhu
    Abstract: Hundreds of papers and hundreds of factors attempt to explain the cross-section of expected returns. Given this extensive data mining, it does not make any economic or statistical sense to use the usual significance criteria for a newly discovered factor, e.g., a t-ratio greater than 2.0. However, what hurdle should be used for current research? Our paper introduces a multiple testing framework and provides a time series of historical significance cutoffs from the first empirical tests in 1967 to today. Our new method allows for correlation among the tests as well as missing data. We also project forward 20 years assuming the rate of factor production remains similar to the experience of the last few years. The estimation of our model suggests that a newly discovered factor needs to clear a much higher hurdle, with a t-ratio greater than 3.0. Echoing a recent disturbing conclusion in the medical literature, we argue that most claimed research findings in financial economics are likely false.
    JEL: C01 C58 G0 G1 G12 G3
    Date: 2014–10
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:20592&r=ecm
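The logic behind entry 11's hurdle is easy to reproduce in its simplest (Bonferroni) form: if M factors have been tried, controlling the family-wise error rate at 5% requires each individual two-sided p-value to clear 0.05/M, which maps into a t-ratio hurdle well above 2. The paper's preferred adjustments allow for correlation among tests and missing data, and are less conservative than this sketch.

```python
from scipy.stats import norm

def bonferroni_t_hurdle(n_tests, fwer=0.05):
    """t-ratio a new factor must clear so that a two-sided test controls the
    family-wise error rate at `fwer` across `n_tests` tried factors."""
    return norm.ppf(1 - fwer / (2 * n_tests))

for m in (1, 50, 300, 1000):
    print(f"{m:5d} factors tried -> t hurdle {bonferroni_t_hurdle(m):.2f}")
```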
  12. By: Chiara Sabelli; Michele Pioppi; Luca Sitzia; Giacomo Bormetti
    Abstract: We present an HJM approach to the projection of multiple yield curves, developed to capture the volatility content of historical term structures for risk management purposes. Since we observe the empirical data at daily frequency and only for a finite number of time-to-maturity buckets, we propose a modelling framework which is inherently discrete. In particular, we show how to approximate the HJM continuous-time description of the multi-curve dynamics by a Vector Autoregressive process of order one. The resulting dynamics lends itself to a feasible estimation of the model volatility-correlation structure. Then, resorting to Principal Component Analysis, we further simplify the dynamics by reducing the number of covariance components. Applying the constant volatility version of our model to a sample of curves from the Euro area, we demonstrate its forecasting ability through an out-of-sample test.
    Date: 2014–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1411.3977&r=ecm
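The dimension-reduction step in entry 12 can be sketched directly: stack daily changes of the term structure, estimate their covariance, and keep the leading principal components. A minimal version, assuming a matrix of historical rates with one column per time-to-maturity bucket; the VAR(1) approximation of the HJM dynamics and the multi-curve bookkeeping are omitted.

```python
import numpy as np

def curve_pca(rates, n_components=3):
    """PCA of daily term-structure increments: returns the loadings
    (buckets x k) and the share of variance the k components capture."""
    d = np.diff(rates, axis=0)                  # daily curve changes
    d = d - d.mean(axis=0)
    cov = d.T @ d / (len(d) - 1)
    vals, vecs = np.linalg.eigh(cov)
    vals, vecs = vals[::-1], vecs[:, ::-1]      # descending order
    share = vals[:n_components].sum() / vals.sum()
    return vecs[:, :n_components], share

# Toy usage on simulated curves (10 maturity buckets, 500 business days)
rng = np.random.default_rng(7)
rates = np.cumsum(0.01 * rng.standard_normal((500, 10)), axis=0)
loadings, share = curve_pca(rates)
print(f"first 3 components capture {share:.1%} of variance")
```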
  13. By: Ramsey, Ford
    Abstract: The recent priority given to Federal Crop Insurance as an agricultural policy instrument has increased the importance of rate-making procedures. Actuarial soundness requires rates that are actuarially fair: the premium is set equal to the expected loss. Formation of this expectation depends, in the case of group or area yield insurance, on precise estimation of the probability density of the crop yield in question. This paper applies kernel density estimation via diffusion to the estimation of crop yield probability densities and determines the ensuing premium rates. The diffusion estimator improves on existing methods by providing a cogent answer to some of the issues that plague both parametric and nonparametric techniques. Application shows that premium rates can vary significantly depending on the underlying distributional assumptions; from a practical point of view there is value to be had in proper specification.
    Keywords: crop insurance, yield distributions, density estimation via diffusion, nonparametric density estimation, Agricultural and Food Policy, Research Methods/Statistical Methods, Risk and Uncertainty, C520, Q180, C140
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:ags:aaea14:170173&r=ecm
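Once a yield density estimate is in hand, the actuarially fair premium rate in entry 13 is the expected indemnity over the liability. The sketch below uses scipy's Gaussian KDE as a stand-in for the diffusion estimator (which scipy does not provide) and an illustrative area-yield contract paying max(guarantee - yield, 0); all numbers and names are made up for the example.

```python
import numpy as np
from scipy.stats import gaussian_kde

def fair_premium_rate(yields, coverage=0.9, grid_size=2000):
    """Actuarially fair rate = E[max(guarantee - y, 0)] / guarantee, with the
    yield density estimated by KDE and the expectation taken by numerical
    integration on a grid."""
    guarantee = coverage * np.mean(yields)
    kde = gaussian_kde(yields)               # stand-in for the diffusion KDE
    y = np.linspace(min(yields) - 3 * yields.std(),
                    max(yields) + 3 * yields.std(), grid_size)
    indemnity = np.maximum(guarantee - y, 0.0)
    expected_loss = np.trapz(indemnity * kde(y), y)
    return expected_loss / guarantee

rng = np.random.default_rng(8)
county_yields = 150 + 20 * rng.standard_normal(30)   # illustrative bushels/acre
print(f"fair premium rate at 90% coverage: {fair_premium_rate(county_yields):.3%}")
```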
  14. By: Gregor Semieniuk; Ellis Scharfenaker (Schwartz Center for Economic Policy Analysis (SCEPA))
    Abstract: Using Bayesian Markov chain Monte Carlo methods, we select the proper subset of competitive firms and find striking evidence for Laplace-shaped firm profit rate distributions. Our approach enables us to extract more information from data than previous research. We filter US firm-level data into signal and noise distributions by Gibbs-sampling from a latent variable mixture distribution, extracting a sharply peaked, negatively skewed Laplace-type profit rate distribution. A Bayesian change point analysis yields the subset of large firms with symmetric and stationary Laplace distributed profit rates, adding to the evidence for statistical equilibrium at the economy-wide and sectoral levels.
    Keywords: Firm competition, Laplace distribution, Gibbs sampler, Profit rate, Statistical equilibrium
    JEL: C15 D20 E10 L11
    Date: 2014–02
    URL: http://d.repec.org/n?u=RePEc:epa:cepawp:2014-1&r=ecm
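A reduced version of entry 14's filtering idea: treat each observed profit rate as coming either from a Laplace "signal" or a diffuse Gaussian "noise" component, and Gibbs-sample the component indicators. Here the Laplace scale has a conjugate inverse-gamma-type update, while its location (flat prior) is updated by a random-walk Metropolis step; the fixed noise component and all priors are illustrative simplifications of the paper's model.

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic data: Laplace signal contaminated with diffuse Gaussian noise
n = 2000
z_true = rng.random(n) < 0.8
x = np.where(z_true, rng.laplace(0.1, 0.05, n), rng.normal(0.0, 0.5, n))

def laplace_logpdf(x, mu, b):
    return -np.log(2 * b) - np.abs(x - mu) / b

mu, b, pi = 0.0, 0.1, 0.5      # initial values
noise_sd = 0.5                 # noise component held fixed (simplification)
for it in range(2000):
    # 1) Gibbs step: component indicators given parameters
    ls = np.log(pi) + laplace_logpdf(x, mu, b)
    ln = (np.log(1 - pi) - 0.5 * np.log(2 * np.pi * noise_sd**2)
          - 0.5 * (x / noise_sd)**2)
    p_signal = 1.0 / (1.0 + np.exp(ln - ls))
    z = rng.random(n) < p_signal
    xs = x[z]
    # 2) Gibbs step: mixture weight, Beta(1, 1) prior
    pi = rng.beta(1 + z.sum(), 1 + n - z.sum())
    if xs.size == 0:
        continue
    # 3) Gibbs step: Laplace scale b | mu, prior p(b) ~ 1/b gives
    #    b ~ InverseGamma(len(xs), sum |x - mu|)
    b = np.abs(xs - mu).sum() / rng.gamma(xs.size, 1.0)
    # 4) Metropolis step: Laplace location mu (no conjugate update exists)
    prop = mu + 0.01 * rng.standard_normal()
    if np.log(rng.random()) < (laplace_logpdf(xs, prop, b).sum()
                               - laplace_logpdf(xs, mu, b).sum()):
        mu = prop

# Last draw only; in practice average draws after a burn-in period
print(f"posterior draw: mu={mu:.3f}, b={b:.3f}, pi={pi:.2f}")  # ~ (0.1, 0.05, 0.8)
```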

This nep-ecm issue is ©2014 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.