nep-ecm New Economics Papers
on Econometrics
Issue of 2015‒09‒05
fourteen papers chosen by
Sune Karlsson
Örebro universitet

  1. Factor structural time series models for official statistics with an application to hours worked in Germany By Weigand, Roland; Wanger, Susanne; Zapf, Ines
  2. Optimal Two-Sided Tests for Instrumental Variables Regression with Heteroskedastic and Autocorrelated Errors By Moreira, Humberto; Moreira, Marcelo J.
  3. Testing the "Separability" Condition in Two-Stage Nonparametric Models of Production By Cinzia Daraio; Leopold Simar; Paul W. Wilson
  4. Supervision in Factor Models Using a Large Number of Predictors By Lorenzo Boldrini; Eric Hillebrand
  5. Estimation of Dynastic Life-Cycle Discrete Choice Models By Gayle, George-Levi; Golan, Limor; Soytas, Mehmet A.
  6. Robust non-parametric estimation of cost efficiency with an application to banking industry By Galina Besstremyannaya; Jaak Simm
  7. How to Choose the Level of Significance: A Pedagogical Note By Kim, Jae
  8. Symbolic Multidimensional Scaling By Groenen, P.J.F.; Terada, Y.
  9. How to use economic theory to improve estimators, with an application to labor demand and wage inequality in Europe By Fessler, Pirmin; Kasy, Maximilian
  10. High dimensional Global Minimum Variance Portfolio By Li, Hua; Bai, Zhi Dong; Wong, Wing Keung
  11. Time-dependent scaling patterns in high frequency financial data By Noemi Nava; Tiziana Di Matteo; Tomaso Aste
  12. Early warning of large volatilities based on recurrence interval analysis in Chinese stock markets By Zhi-Qiang Jiang; Askery A. Canabarro; Boris Podobnik; H. Eugene Stanley; Wei-Xing Zhou
  13. Identification in Differentiated Products Markets By Steven T. Berry; Philip Haile
  14. Forecasting the Global Mean Sea Level, a Continuous-Time State-Space Approach By Lorenzo Boldrini

  1. By: Weigand, Roland (Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany]); Wanger, Susanne (Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany]); Zapf, Ines (Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany])
    Abstract: "We introduce a high-dimensional structural time series model, where co-movement between the components is due to common factors. A two-step estimation strategy is presented, which is based on principal components in differences in a first step and state space methods in a second step. The methods add to the toolbox of official statisticians, constructing timely regular statistics from different data sources. In this context, we discuss typical measurement features such as survey errors, statistical breaks, different sampling frequencies and irregular observation patterns, and describe their statistical treatment. The methods are applied to the estimation of paid and unpaid overtime work as well as flows on working-time accounts in Germany, which enter the statistics on hours worked in the national accounts." (Author's abstract, IAB-Doku) ((en))
    Keywords: IAB working-time accounting (methods), working time, volume of work, time series analysis, estimation, methodological literature, overtime, working-time accounts
    JEL: C14 C32 C51 C53 C58
    Date: 2015–08–13
    URL: http://d.repec.org/n?u=RePEc:iab:iabdpa:201522&r=all
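    A minimal sketch of the two-step strategy described in the abstract, on simulated data (the dimensions, the local-level second step, and the use of statsmodels are illustrative assumptions, not the authors' implementation):

      # Step 1: principal components in differences; Step 2: state-space
      # (Kalman filter) estimation on the extracted factors. Illustrative only.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      T, N, k = 200, 30, 2                      # periods, series, factors

      # simulate N nonstationary series driven by k common random-walk factors
      f = np.cumsum(rng.normal(size=(T, k)), axis=0)
      lam = rng.normal(size=(k, N))
      X = f @ lam + rng.normal(scale=0.5, size=(T, N))

      # step 1: principal components of the (standardized) differenced data
      dX = np.diff(X, axis=0)
      dX = (dX - dX.mean(0)) / dX.std(0)
      eigval, eigvec = np.linalg.eigh(np.cov(dX, rowvar=False))
      pc = dX @ eigvec[:, -k:]                  # k largest components

      # step 2: a local-level state-space model on each cumulated component
      for j in range(k):
          level = np.cumsum(pc[:, j])
          res = sm.tsa.UnobservedComponents(level, level='local level').fit(disp=False)
          print(f"factor {j}: signal-to-noise ratio {res.params[1] / res.params[0]:.2f}")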
  2. By: Moreira, Humberto; Moreira, Marcelo J.
    Abstract: This paper considers two-sided tests for the parameter of an endogenous variable in an instrumental variable (IV) model with heteroskedastic and autocorrelated errors. We develop the finite-sample theory of weighted-average power (WAP) tests with normal errors and a known long-run variance. We introduce two weights which are invariant to orthogonal transformations of the instruments; e.g., changing the order in which the instruments appear. While tests using the MM1 weight can be severely biased, optimal tests based on the MM2 weight are naturally two-sided when errors are homoskedastic. We propose two boundary conditions that yield two-sided tests whether errors are homoskedastic or not. The locally unbiased (LU) condition is related to the power around the null hypothesis and is a weaker requirement than unbiasedness. The strongly unbiased (SU) condition is more restrictive than LU, but the associated WAP tests are easier to implement. Several tests are SU in finite samples or asymptotically, including tests robust to weak IV (such as the Anderson-Rubin, score, conditional quasi-likelihood ratio, and I. Andrews' (2015) PI-CLC tests) and two-sided tests which are optimal when the sample size is large and instruments are strong. We refer to the WAP-SU tests based on our weights as MM1-SU and MM2-SU tests. Dropping the restrictive assumptions of normality and known variance, the theory is shown to remain valid at the cost of asymptotic approximations. The MM2-SU test is optimal under the strong IV asymptotics, and outperforms other existing tests under the weak IV asymptotics.
    Date: 2015–05–21
    URL: http://d.repec.org/n?u=RePEc:fgv:epgewp:764&r=all
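    Of the weak-IV-robust tests named in the abstract, the Anderson-Rubin test is the simplest to illustrate. A hedged sketch of its textbook homoskedastic version (not the paper's WAP-SU tests), on simulated data:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      n = 500
      Z = rng.normal(size=(n, 3))                    # three instruments
      v = rng.normal(size=n)
      x = Z @ np.array([0.5, 0.3, 0.2]) + v          # first stage
      u = 0.8 * v + rng.normal(size=n)               # endogeneity through v
      y = 1.0 * x + u                                # true beta = 1

      def anderson_rubin(y, x, Z, beta0):
          """AR statistic: regress y - x*beta0 on Z, test joint significance."""
          e = y - x * beta0
          Pe = Z @ np.linalg.solve(Z.T @ Z, Z.T @ e)  # projection of e on Z
          kz, n = Z.shape[1], len(y)
          ar = (Pe @ Pe / kz) / ((e - Pe) @ (e - Pe) / (n - kz))
          return ar, 1 - stats.f.cdf(ar, kz, n - kz)

      for b0 in (1.0, 0.5):
          ar, p = anderson_rubin(y, x, Z, b0)
          print(f"H0: beta = {b0}: AR = {ar:.2f}, p = {p:.3f}")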
  3. By: Cinzia Daraio (Department of Computer, Control and Management Engineering Antonio Ruberti (DIAG), University of Rome La Sapienza, Rome, Italy); Leopold Simar (Institut de Statistique, Biostatistique et Sciences Actuarielles, Universite' Catholique de Louvain, Louvain-la-Neuve, Belgium); Paul W. Wilson (Department of Economics and School of Computing, Clemson University, Clemson, SC 29634)
    Abstract: Simar and Wilson (J. Econometrics, 2007) provided a statistical model that can rationalize two-stage estimation of technical efficiency in nonparametric settings. Two-stage estimation has been widely used, but requires a strong assumption: the second-stage environmental variables cannot affect the support of the input and output variables in the first stage. In this paper, we provide a fully nonparametric test of this assumption. The test relies on new central limit theorem (CLT) results for unconditional efficiency estimators developed by Kneip et al. (Econometric Theory, 2015a) and new CLTs for conditional efficiency estimators developed in this paper. The test can be implemented relying on either asymptotic normality of the test statistics or using bootstrap methods to obtain critical values. Our simulation results indicate that our tests perform well both in terms of size and power. We present a real-world empirical example by updating the analysis performed by Aly et al. (R. E. Stat., 1990) on U.S. commercial banks; our tests easily reject the assumption required for two-stage estimation, calling into question results that appear in hundreds of papers that have been published in recent years.
    Keywords: technical efficiency ; conditional efficiency ; two-stage estimation ; bootstrap ; separability ; data envelopment analysis (DEA) ; free-disposal hull (FDH).
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:aeg:report:2015-08&r=all
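    To fix ideas, a minimal free-disposal hull (FDH) sketch of unconditional input-oriented efficiency with one input and one output; the test itself compares such unconditional estimates with conditional ones, computed only over units with similar environmental variables. The data below are simulated:

      import numpy as np

      rng = np.random.default_rng(2)
      n = 100
      x = rng.uniform(1, 10, n)                      # input
      y = np.sqrt(x) * rng.uniform(0.5, 1.0, n)      # output with inefficiency

      def fdh_input_eff(x, y):
          """theta_i = min over dominating units j (y_j >= y_i) of x_j / x_i."""
          theta = np.empty(len(x))
          for i in range(len(x)):
              dominating = y >= y[i]
              theta[i] = np.min(x[dominating] / x[i])
          return theta

      theta = fdh_input_eff(x, y)
      print("mean FDH input efficiency:", theta.mean().round(3))
      # Under separability, conditioning on an environmental variable would
      # change the units entering the comparison set but not the frontier.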
  4. By: Lorenzo Boldrini (Aarhus University and CREATES); Eric Hillebrand (Aarhus University and CREATES)
    Abstract: In this paper we investigate the forecasting performance of a particular factor model (FM) in which the factors are extracted from a large number of predictors. We use a semi-parametric state-space representation of the FM in which the forecast objective, as well as the factors, is included in the state vector. The factors incorporate information on the forecast target (supervision) through the state-equation dynamics. We propose a way to assess the contribution of the forecast objective to the extracted factors that exploits the Kalman filter recursions. We forecast one target at a time based on the filtered states and estimated parameters of the state-space system. We assess the out-of-sample forecast performance of the proposed method in a simulation study and in an empirical application, comparing its forecasts to the ones delivered by other popular multivariate and univariate approaches, e.g. a standard dynamic factor model with separate forecast and state equations.
    Keywords: state-space system, Kalman filter, factor model, supervision, forecasting
    JEL: C32 C38 C55
    Date: 2015–08–24
    URL: http://d.repec.org/n?u=RePEc:aah:create:2015-38&r=all
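    A stylized sketch of the supervision idea: the forecast target y enters the state vector jointly with the factor f, so filtering the factor uses information about y. The dimensions, dynamics, and hand-rolled Kalman filter below are illustrative assumptions, not the paper's model:

      import numpy as np

      rng = np.random.default_rng(3)
      T, N = 300, 10
      phi, gamma = 0.7, 0.5
      f, y = np.zeros(T), np.zeros(T)
      for t in range(1, T):
          f[t] = phi * f[t-1] + rng.normal(scale=0.5)
          y[t] = gamma * f[t-1] + rng.normal(scale=0.3)   # target led by the factor
      lam = rng.normal(size=N)
      X = np.outer(f, lam) + rng.normal(scale=1.0, size=(T, N))

      # state a_t = (y_t, f_t)', observations z_t = (y_t, X_t')'
      Tr = np.array([[0.0, gamma], [0.0, phi]])            # transition matrix
      Q = np.diag([0.3 ** 2, 0.5 ** 2])                    # state noise
      Z = np.vstack([[1.0, 0.0], np.column_stack([np.zeros(N), lam])])
      H = np.diag(np.r_[1e-8, np.ones(N)])                 # y observed (almost) exactly

      a, P = np.zeros(2), np.eye(2)
      for t in range(T - 1):                               # filter through T-2
          a, P = Tr @ a, Tr @ P @ Tr.T + Q                 # predict
          z = np.r_[y[t], X[t]]
          F = Z @ P @ Z.T + H
          K = P @ Z.T @ np.linalg.inv(F)
          a, P = a + K @ (z - Z @ a), P - K @ Z @ P        # update

      print(f"one-step forecast of y[T-1]: {(Tr @ a)[0]:.3f}, realized: {y[T-1]:.3f}")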
  5. By: Gayle, George-Levi (Federal Reserve Bank of St. Louis); Golan, Limor (Federal Reserve Bank of St. Louis); Soytas, Mehmet A. (Graduate School of Business, Ozyegin University)
    Abstract: This paper explores the estimation of a class of life-cycle discrete choice intergenerational models. It proposes a new semiparametric estimator and shows that it is root-N-consistent and asymptotically normally distributed. We compare our estimator with a modified version of the full-solution maximum likelihood estimator (MLE) in a Monte Carlo study. Our estimator performs comparably to the MLE in a finite sample but greatly reduces the computational cost. The paper documents that quantity-quality trade-offs depend on household composition and specialization in the household. Using the proposed estimator, we estimate a dynastic model that rationalizes these observed patterns.
    JEL: C13 J13 J22 J62
    Date: 2015–08–13
    URL: http://d.repec.org/n?u=RePEc:fip:fedlwp:2015-020&r=all
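    A generic Monte Carlo sketch of what root-N consistency means in practice, using the sample mean as a stand-in (not the paper's dynastic estimator): the RMSE should shrink at rate 1/sqrt(N):

      import numpy as np

      rng = np.random.default_rng(4)
      for N in (500, 2000, 8000):
          errs = [rng.normal(loc=2.0, size=N).mean() - 2.0 for _ in range(500)]
          rmse = np.sqrt(np.mean(np.square(errs)))
          print(f"N = {N:5d}: RMSE = {rmse:.4f}, sqrt(N)*RMSE = {np.sqrt(N) * rmse:.3f}")
      # sqrt(N)*RMSE stabilizes near the asymptotic standard deviation (here 1).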
  6. By: Galina Besstremyannaya (CEFIR at New Economic School); Jaak Simm (University of Leuven)
    Abstract: The paper modifies the methodology of Simar and Wilson 2007 [J Econometrics 136] and 1998 [Manage Sci 44] to propose a new algorithm for robust estimation of cost efficiency in data envelopment analysis, in terms of bias correction and estimation of returns to scale. Simulation analyses with a multi-input, multi-output Cobb-Douglas production function, correlated outputs, and correlated technical and cost efficiency demonstrate the consistency of the new algorithm both in the absence and in the presence of environmental variables. Finally, we offer real-data estimates for the Japanese banking industry. An R package, `rDea', developed for the computations, is available from the GitHub and CRAN repositories.
    Keywords: data envelopment analysis, cost efficiency, bias correction, bootstrap
    JEL: C44 C61
    Date: 2015–08
    URL: http://d.repec.org/n?u=RePEc:cfr:cefirw:w0217&r=all
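    A hedged sketch of the underlying DEA cost-efficiency estimate for a single unit, posed as a linear program under variable returns to scale; the paper's contribution, the bootstrap bias correction, would be layered on top. Data and prices below are simulated:

      import numpy as np
      from scipy.optimize import linprog

      rng = np.random.default_rng(5)
      n, m, s = 50, 2, 1                        # units, inputs, outputs
      X = rng.uniform(1, 10, (n, m))            # observed inputs
      Y = np.sqrt(X.sum(1, keepdims=True)) * rng.uniform(0.6, 1.0, (n, s))
      w = np.array([1.0, 2.0])                  # common input prices

      def cost_efficiency(i):
          """min w'x s.t. Y'lam >= Y[i], X'lam <= x, sum(lam) = 1, lam >= 0."""
          c = np.r_[w, np.zeros(n)]                        # vars: (x, lam)
          A_ub = np.block([[np.zeros((s, m)), -Y.T],       # -Y'lam <= -y_i
                           [-np.eye(m),       X.T]])       # X'lam - x <= 0
          b_ub = np.r_[-Y[i], np.zeros(m)]
          A_eq = np.r_[np.zeros(m), np.ones(n)][None, :]   # sum(lam) = 1 (VRS)
          res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                        bounds=[(0, None)] * (m + n))
          return res.fun / (w @ X[i])                      # cost efficiency in (0, 1]

      print("cost efficiency of unit 0:", round(cost_efficiency(0), 3))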
  7. By: Kim, Jae
    Abstract: The level of significance should be chosen with careful consideration of key factors such as the sample size, the power of the test, and the expected losses from Type I and Type II errors. While the conventional levels may still serve as practical benchmarks, they should not be adopted mindlessly and mechanically for every application.
    Keywords: Expected Loss, Statistical Significance, Sample Size, Power of the test
    JEL: A2 C12
    Date: 2015–08–31
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:66373&r=all
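    A worked sketch of the paper's point for a one-sided z-test: choose alpha to minimize expected loss P(H0)*alpha*L1 + P(H1)*(1 - power)*L2. The effect size, losses, and prior weights below are illustrative assumptions:

      import numpy as np
      from scipy import stats

      def expected_loss(alpha, n, effect, L1=1.0, L2=1.0, p_h0=0.5):
          crit = stats.norm.ppf(1 - alpha)            # one-sided critical value
          power = 1 - stats.norm.cdf(crit - effect * np.sqrt(n))
          return p_h0 * alpha * L1 + (1 - p_h0) * (1 - power) * L2

      alphas = np.linspace(0.001, 0.30, 600)
      for n in (25, 100, 1000):
          losses = [expected_loss(a, n, effect=0.3) for a in alphas]
          print(f"n = {n:4d}: loss-minimizing alpha = {alphas[np.argmin(losses)]:.3f}")
      # The optimal alpha falls as n grows, rather than staying fixed at 0.05.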
  8. By: Groenen, P.J.F.; Terada, Y.
    Abstract: Multidimensional scaling (MDS) is a technique that visualizes dissimilarities between pairs of objects as distances between points in a low-dimensional space. In symbolic MDS, a dissimilarity is not just a value but can represent an interval or even a histogram. Here, we present an overview of developments in symbolic MDS. We discuss how interval dissimilarities can be represented by (concentric) circles or rectangles, how replications can be represented by a three-way MDS version, and show how nested intervals of distances can be obtained for representing histogram dissimilarities. The various models are illustrated by empirical examples.
    Keywords: MDS, Multidimensional Scaling
    Date: 2015–05–27
    URL: http://d.repec.org/n?u=RePEc:ems:eureir:78189&r=all
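    A minimal sketch of the symbolic-MDS idea for interval dissimilarities: locate circle centres by classical MDS on the interval midpoints and use the spreads as crude radii. The paper fits circles and rectangles by proper least-squares criteria; the data below are simulated:

      import numpy as np

      rng = np.random.default_rng(6)
      n = 6
      pts = rng.uniform(0, 10, (n, 2))                 # latent configuration
      d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
      lo, hi = 0.9 * d, 1.1 * d                        # interval dissimilarities

      mid = (lo + hi) / 2                              # classical MDS on midpoints
      J = np.eye(n) - np.ones((n, n)) / n
      B = -0.5 * J @ (mid ** 2) @ J                    # double-centred matrix
      eigval, eigvec = np.linalg.eigh(B)
      centres = eigvec[:, -2:] * np.sqrt(eigval[-2:])  # top two dimensions
      radii = (hi - lo).mean(1) / 4                    # crude per-object spread

      for i in range(n):
          print(f"object {i}: centre {np.round(centres[i], 2)}, radius {radii[i]:.2f}")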
  9. By: Fessler, Pirmin; Kasy, Maximilian
    Abstract: Economic theory, when it has empirical content, provides testable restrictions on empirically identified objects. These empirical objects might be estimated in an unrestricted way, leading to estimates of potentially large variance, or subject to the theoretical restrictions, leading to estimates of lower variance which are potentially biased, inconsistent, and non-robust. We propose an alternative approach, based on the empirical Bayes paradigm, which avoids both large variance and bias, and which performs particularly well when the theory is approximately correct. We characterize the geometry and the risk-function (mean squared error) of the proposed estimator. Simulations confirm that our estimator uniformly dominates unrestricted estimation over a large space of parameter values, and dominates structural estimation under modest violations of the theory. We apply our approach to models of labor demand which are used to analyze to what extent changes in the distribution of wages can be explained by changes in labor supply (due to demographic change, migration, or expanded access to education), as opposed to other factors (technical and institutional change). We study changes since 2003 of the wage distribution in the countries of the European Union, using the EU-SILC data. We find inverse elasticities of substitution which are negative but much smaller than comparable estimates for the US.
    Date: 2015–08
    URL: http://d.repec.org/n?u=RePEc:qsh:wpaper:309271&r=all
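    A stylized sketch of the estimator's logic, not the authors' exact formula: shrink unrestricted estimates toward a theory-restricted value with a data-driven (James-Stein-type) weight that grows when the theory fits well:

      import numpy as np

      rng = np.random.default_rng(7)
      k = 20
      theta = rng.normal(scale=0.3, size=k) + 1.0           # truth near theory value 1
      sigma = 0.5
      theta_hat = theta + rng.normal(scale=sigma, size=k)   # unrestricted estimates
      theta_theory = np.ones(k)                             # restriction: all equal 1

      # positive-part James-Stein weight: how much of the spread around the
      # theory value is just sampling noise?
      resid = theta_hat - theta_theory
      shrink = max(0.0, 1 - (k - 2) * sigma ** 2 / (resid @ resid))
      theta_eb = theta_theory + shrink * resid

      mse = lambda est: np.mean((est - theta) ** 2)
      print(f"MSE unrestricted {mse(theta_hat):.3f}, "
            f"restricted {mse(theta_theory):.3f}, shrunk {mse(theta_eb):.3f}")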
  10. By: Li, Hua; Bai, Zhi Dong; Wong, Wing Keung
    Abstract: This paper proposes a spectral corrected methodology to estimate the Global Minimum Variance Portfolio (GMVP) for high-dimensional data. We analyze the limiting properties of the spectral corrected GMVP estimator as the dimension and the sample size increase to infinity proportionally. In addition, we compare the spectral corrected estimator with the linear and nonlinear shrinkage estimators and find in a simulation study that the spectral corrected methodology performs best.
    Keywords: Global Minimum Variance Portfolio, Spectral Corrected Covariance, Sample Covariance
    JEL: C02
    Date: 2015–08–26
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:66284&r=all
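    A hedged sketch of the GMVP problem the paper studies: the weights are w = S^{-1} 1 / (1' S^{-1} 1), and the choice of covariance estimator S matters when the dimension is large relative to the sample size. Ledoit-Wolf linear shrinkage stands in for the comparison estimators; the spectral corrected estimator itself is not reproduced here:

      import numpy as np
      from sklearn.covariance import LedoitWolf

      rng = np.random.default_rng(8)
      p, n = 50, 100                                  # many assets, few observations
      true_cov = 0.3 * np.ones((p, p)) + 0.7 * np.eye(p)
      R = rng.multivariate_normal(np.zeros(p), true_cov, size=n)

      def gmvp(S):
          w = np.linalg.solve(S, np.ones(p))
          return w / w.sum()

      for name, S in [("sample", np.cov(R, rowvar=False)),
                      ("Ledoit-Wolf", LedoitWolf().fit(R).covariance_)]:
          w = gmvp(S)
          print(f"{name:12s}: true portfolio variance {w @ true_cov @ w:.4f}")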
  11. By: Noemi Nava; Tiziana Di Matteo; Tomaso Aste
    Abstract: We measure the influence of different time-scales on the dynamics of financial market data. This is obtained by decomposing financial time series into simple oscillations associated with distinct time-scales. We propose two new time-varying measures: 1) an amplitude scaling exponent and 2) an entropy-like measure. We apply these measures to intra-day, 30-second sampled prices of various stock indices. Our results reveal intra-day trends where different time-horizons contribute with variable relative amplitudes over the course of the trading day. Our findings indicate that the time series we analysed have a non-stationary, multi-fractional nature, with predominantly persistent behaviour in the middle of the trading session and anti-persistent behaviour at the open and close. We demonstrate that these deviations are statistically significant and robust.
    Date: 2015–08
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1508.07428&r=all
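    A generic sketch of a scaling-exponent estimate on a single window; the paper constructs time-varying versions from a decomposition into oscillations, whereas here we simply fit the slope of the log standard deviation of increments against the log lag, on simulated data:

      import numpy as np

      rng = np.random.default_rng(9)
      prices = np.cumsum(rng.normal(size=5000))        # stand-in for 30s-sampled prices

      lags = np.array([1, 2, 4, 8, 16, 32, 64])
      sigmas = [np.std(prices[lag:] - prices[:-lag]) for lag in lags]
      H = np.polyfit(np.log(lags), np.log(sigmas), 1)[0]
      print(f"scaling exponent H = {H:.3f} (0.5 = diffusive; >0.5 persistent, "
            f"<0.5 anti-persistent)")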
  12. By: Zhi-Qiang Jiang (ECUST, BU); Askery A. Canabarro (UFAL, BU); Boris Podobnik (UR); H. Eugene Stanley (BU); Wei-Xing Zhou (ECUST)
    Abstract: Being able to forecast extreme volatility is a central issue in financial risk management. We present a method for predicting large volatilities based on the distribution of recurrence intervals between volatilities exceeding a certain threshold $Q$ for a fixed expected recurrence time $\tau_Q$. We find that the recurrence intervals are well approximated by the $q$-exponential distribution for all stocks and all $\tau_Q$ values. Thus an analytical formula for the hazard probability $W(\Delta t |t)$, the probability that a volatility above $Q$ will occur within a short interval $\Delta t$ given that the last volatility exceeding $Q$ happened $t$ periods ago, can be derived directly from the $q$-exponential distribution; it is found to be in good agreement with the empirical hazard probability from real stock data. Using these results, we adopt a decision-making algorithm that triggers an alarm for the occurrence of the next volatility above $Q$ based on the hazard probability. Using a "receiver operator characteristic" (ROC) analysis, we find that this method efficiently forecasts the occurrence of large volatility events in real stock data. Our analysis may help us better understand recurring large volatilities and more accurately quantify financial risks in stock markets.
    Date: 2015–08
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1508.07505&r=all
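    A sketch of the recurrence-interval machinery in the abstract: collect waiting times between volatilities above a threshold Q and estimate the hazard W(dt|t) empirically. The paper instead derives it from a fitted q-exponential distribution; the heavy-tailed data below are simulated:

      import numpy as np

      rng = np.random.default_rng(10)
      vol = np.abs(rng.standard_t(df=4, size=100_000))   # heavy-tailed "volatility"
      Q = np.quantile(vol, 0.99)                         # threshold: top 1%

      exceed = np.flatnonzero(vol > Q)
      intervals = np.diff(exceed)                        # recurrence intervals

      def hazard(intervals, t, dt):
          """P(next exceedance within dt more periods | t periods elapsed)."""
          alive = intervals[intervals > t]
          return np.mean(alive <= t + dt) if alive.size else np.nan

      for t in (10, 50, 200):
          print(f"t = {t:3d}: W(dt=10 | t) = {hazard(intervals, t, 10):.3f}")
      # Volatility clustering in real data makes W vary with t; for i.i.d.
      # data like this simulation it is roughly flat.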
  13. By: Steven T. Berry; Philip Haile
    Abstract: Empirical models of demand for (and, often, supply of) differentiated products are widely used in practice, typically employing parametric functional forms and distributions of consumer heterogeneity. We review some recent work studying identification in a broad class of such models. This work shows that parametric functional forms and distributional assumptions are not essential for identification. Rather, identification relies primarily on the standard requirement that instruments be available for the endogenous variables, here typically prices and quantities. We discuss the kinds of instruments needed for identification and how the reliance on instruments can be reduced by nonparametric functional form restrictions or better data. We also discuss results on discrimination between alternative models of oligopoly competition.
    JEL: C3 C35 C36 C52 D12 D22 D43 L13
    Date: 2015–08
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:21500&r=all
  14. By: Lorenzo Boldrini (Aarhus University and CREATES)
    Abstract: In this paper we propose a continuous-time, Gaussian, linear, state-space system to model the relation between global mean sea level (GMSL) and the global mean temperature (GMT), with the aim of making long-term projections for the GMSL. We provide a justification for the model specification based on popular semi-empirical methods present in the literature and on zero-dimensional energy balance models. We show that some of the models developed in the literature on semi-empirical models can be analysed within this framework. We use the sea-level data reconstruction developed in Church and White (2011) and the temperature reconstruction from Hansen et al. (2010). We compare the forecasting performance of the proposed specification to the procedures developed in Rahmstorf (2007b) and Vermeer and Rahmstorf (2009). Finally, we compute projections for the sea-level rise conditional on the 21st century SRES temperature scenarios of the IPCC fourth assessment report. Furthermore, we propose a bootstrap procedure to compute confidence intervals for the projections, based on the method introduced in Rodriguez and Ruiz (2009).
    Keywords: energy balance model, semi-empirical model, state-space system, Kalman filter, forecasting, temperature, sea level, bootstrap
    JEL: C32
    Date: 2015–08–24
    URL: http://d.repec.org/n?u=RePEc:aah:create:2015-40&r=all
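    A sketch of the semi-empirical relation this literature builds on (Rahmstorf 2007: dH/dt = a(T - T0)), discretized and integrated forward under a stylized temperature path. The parameter values are invented for illustration; the paper embeds such a relation in a continuous-time state-space system estimated with the Kalman filter:

      import numpy as np

      a, T0 = 3.4, -0.5          # mm/yr per deg C and equilibrium temperature (invented)
      years = np.arange(2000, 2101)
      T = 0.7 + 0.02 * (years - 2000)           # stylized warming scenario

      H = np.zeros(len(years))                  # sea-level anomaly in mm
      for i in range(1, len(years)):
          H[i] = H[i-1] + a * (T[i-1] - T0)     # Euler step, dt = 1 year

      print(f"projected rise 2000-2100: {H[-1] / 10:.1f} cm")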

This nep-ecm issue is ©2015 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.