nep-ecm New Economics Papers
on Econometrics
Issue of 2011‒08‒15
thirteen papers chosen by
Sune Karlsson
Orebro University

  1. Asymptotic Power of Sphericity Tests for High-Dimensional Data By Alexei Onatski; Marcelo J. Moreira; Marc Hallin
  2. Bayesian inference with monotone instrumental variables By Qian, Hang
  3. Sampling Variation, Monotone Instrumental Variables and the Bootstrap Bias Correction By Qian, Hang
  4. Consistent Tests for Conditional Treatment Effects By Yu-Chin Hsu
  5. One-Sided Representations of Generalized Dynamic Factor Models By Mario Forni; Marc Hallin; Marco Lippi; Paolo Zaffaroni
  6. Structural Vector Autoregressions By Kilian, Lutz
  7. UNDERSTANDING THE FUNCTIONAL CENTRAL LIMIT THEOREMS WITH SOME APPLICATIONS TO UNIT ROOT TESTING WITH STRUCTURAL CHANGE By Juan Carlos Aquino; Gabriel Rodríguez
  8. Continuous-Time Modelling with Spatial Dependence By Johan H.L. Oud; Henk Folmer; Roberto Patuelli; Peter Nijkamp
  9. Goodness-of-fit testing for the marginal distribution of regime-switching models By Janczura, Joanna; Weron, Rafal
  10. Similar-on-the-Boundary Tests for Moment Inequalities Exist, But Have Poor Power By Donald W.K. Andrews
  11. A hybrid approach to efficiency measurement with empirical illustrations from education and health By Wagstaff, Adam; Wang, L. Choon
  12. A Multiple State Duration Model with Endogenous Treatment By Mroz, T.; Picone, G.
  13. Measuring overfitting and misspecification in nonlinear models By Bilger, M.; Manning, W.G.

  1. By: Alexei Onatski; Marcelo J. Moreira; Marc Hallin
    Abstract: This paper studies the asymptotic power of tests of sphericity against perturbations in a single unknown direction as both the dimensionality of the data and the number of observations go to infinity. We establish the convergence, under the null hypothesis and the alternative, of the log ratio of the joint densities of the sample covariance eigenvalues to a Gaussian process indexed by the norm of the perturbation. When the perturbation norm is larger than the phase transition threshold studied in Baik et al. (2005), the limiting process is degenerate and discrimination between the null and the alternative is asymptotically certain. When the norm is below the threshold, the process is non-degenerate, so that the joint eigenvalue densities under the null and alternative hypotheses are mutually contiguous. Using the asymptotic theory of statistical experiments, we obtain asymptotic power envelopes and derive the asymptotic power for various sphericity tests in the contiguity region. In particular, we show that the asymptotic power of the Tracy-Widom-type tests is trivial, whereas that of the eigenvalue-based likelihood ratio test is strictly larger than the size, and close to the power envelope.
    Keywords: sphericity tests; large dimensionality; asymptotic power; spiked covariance; contiguity; power envelope; steepest descent; contour integral representation
    Date: 2011–08
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2013/94952&r=ecm
  2. By: Qian, Hang
    Abstract: Sampling variation complicates classical inference on the analogue bounds under the monotone instrumental variables assumption, since point estimators are biased and confidence intervals are difficult to construct. This paper offers a solution from the Bayesian perspective. Using a conjugate Dirichlet prior, we derive some analytic results on the posterior distribution of the two bounds of the conditional mean response. The bounds of the unconditional mean response and the average treatment effect can be obtained with Bayesian simulation techniques. Our Bayesian inference is applied to an empirical problem which quantifies the effects of taking extra classes on high school students' test scores. The two MIVs are chosen as the education levels of their fathers and mothers. The empirical results suggest that the MIV assumption in conjunction with the monotone treatment response assumption yields good identification power.
    Keywords: Monotone instrumental variables; Bayesian; Dirichlet
    JEL: C31 C11
    Date: 2011–08
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:32672&r=ecm
  3. By: Qian, Hang
    Abstract: This paper discusses the finite sample bias of analogue bounds under the monotone instrumental variables assumption. By analyzing the bias function, we first propose a conservative estimator which is biased downwards (upwards) when the analogue estimator is biased upwards (downwards). Using the bias function, we then show the mechanism of the parametric bootstrap correction procedure, which can reduce but not eliminate the bias, and there is also a possibility of overcorrection. This motivates us to propose a simultaneous multi-level bootstrap procedure so as to further correct the remaining bias. The procedure is justified under the assumption that the bias function can be well approximated by a polynomial. Our multi-level bootstrap algorithm is feasible and does not suffer from the curse of dimensionality. Monte Carlo evidence supports the usefulness of this approach and we apply it to the disability misreporting problem studied by Kreider and Pepper (2007).
    Keywords: Monotone instrumental variables; Bootstrap; Bias correction
    JEL: C63 C31
    Date: 2011–08
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:32634&r=ecm
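The finite-sample bias the abstract describes, and the single-level bootstrap correction it builds on, can be sketched in a few lines. This is a hypothetical toy (a max of group means stands in for the MIV bound; `analogue_bound`, the data, and all names are invented for illustration), not the paper's multi-level procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

def analogue_bound(sample):
    """Sample-analogue of a lower bound: here a max of group means across
    instrument levels (illustrative stand-in, not the paper's estimand)."""
    return max(x.mean() for x in sample)

def bootstrap_bias_corrected(sample, B=500):
    """One level of bootstrap bias correction:
    theta_corrected = 2*theta_hat - mean(bootstrap replicates)."""
    theta_hat = analogue_bound(sample)
    boot = np.empty(B)
    for b in range(B):
        resampled = [rng.choice(x, size=len(x), replace=True) for x in sample]
        boot[b] = analogue_bound(resampled)
    return 2 * theta_hat - boot.mean()

# The max operator biases the analogue bound upward in finite samples
# (E[max] >= max of expectations); the correction subtracts an estimate
# of that bias, but a single level cannot remove it entirely.
sample = [rng.normal(0.0, 1.0, 50) for _ in range(4)]  # 4 levels, equal true means
raw = analogue_bound(sample)
corrected = bootstrap_bias_corrected(sample)
print(raw, corrected)
```

With equal true means the raw max is biased upward, so the corrected estimate comes out below the raw one; the abstract's multi-level scheme iterates on the bias that remains after this step.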
  4. By: Yu-Chin Hsu (Department of Economics, University of Missouri-Columbia)
    Abstract: We construct a Kolmogorov-Smirnov test for the null hypothesis that the average treatment effect is non-negative conditional on all possible values of the covariates. The null hypothesis of interest can be characterized as a conditional moment inequality under the unconfoundedness assumption, and we employ the instrumental variable method to convert the conditional moment inequality into an infinite number of unconditional moment inequalities without information loss. A Kolmogorov-Smirnov test is constructed based on these unconditional moment inequalities. It is shown that our test can control the size asymptotically, is consistent against fixed alternatives and is unbiased against some N^(−1/2) local alternatives. Furthermore, our test is more powerful than the test of Lee and Whang (2009) against a broad set of N^(−1/2) local alternatives. Monte-Carlo simulation results confirm our theoretical findings. Several interesting extensions are also discussed.
    Keywords: Hypothesis testing, treatment effects, test consistency, propensity score.
    JEL: C01 C12 C21
    Date: 2011–05–23
    URL: http://d.repec.org/n?u=RePEc:umc:wpaper:1107&r=ecm
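The conversion step the abstract describes, a conditional moment inequality implied by an infinite family of unconditional ones, can be sketched with indicator instrument functions indexed by cutoffs on a scalar covariate. The data, the scalar-covariate setup, and the exact form of the statistic below are illustrative assumptions, not the paper's construction:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: per-unit effect proxy tau_i and scalar covariate x_i.
# E[tau | X = x] >= 0 for all x implies E[tau * 1{X <= c}] >= 0 for every
# cutoff c; letting c range over the sample recovers the conditional
# restriction without information loss.
n = 200
x = rng.uniform(0, 1, n)
tau = 0.5 + rng.normal(0, 1, n)  # effect positive on average

def ks_statistic(tau, x):
    """Scaled most-negative sample moment over all cutoffs:
    sqrt(n) * min_c (1/n) sum_i tau_i * 1{x_i <= c}, illustrative form.
    Values far below zero indicate a violated inequality."""
    order = np.argsort(x)
    partial = np.cumsum(tau[order]) / len(tau)  # sample moment at each cutoff
    return np.sqrt(len(tau)) * min(partial.min(), 0.0)

print(ks_statistic(tau, x))  # near zero when the inequality holds in-sample
```

A real test would compare this statistic to a simulated or bootstrapped critical value; only the moment-conversion mechanics are shown here.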
  5. By: Mario Forni; Marc Hallin; Marco Lippi; Paolo Zaffaroni
    Abstract: Factor model methods have recently become extremely popular in the theory and practice of large panels of time series data. Those methods rely on various factor models which all are particular cases of the Generalized Dynamic Factor Model (GDFM) introduced in Forni, Hallin, Lippi and Reichlin (2000). In that paper, however, estimation relies on Brillinger's concept of dynamic principal components, which produces filters that are in general two-sided and therefore yield poor performances at the end of the observation period and hardly can be used for forecasting purposes. In the present paper, we remedy this problem, and show how, based on recent results on singular stationary processes with rational spectra, one-sided estimators are possible for the parameters and the common shocks in the GDFM. Consistency is obtained, along with rates. An empirical section, based on US macroeconomic time series, compares estimates based on our model with those based on the usual static-representation restriction, and provides convincing evidence that the assumptions underlying the latter are not supported by the data.
    Keywords: generalized dynamic factor models; vector processes with singular spectral density; one-sided representations for dynamic factor models; consistency and rates for estimators of dynamic factor models
    JEL: C00 C01 E00
    Date: 2011–08
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2013/94959&r=ecm
  6. By: Kilian, Lutz
    Abstract: Structural vector autoregressive (VAR) models were introduced in 1980 as an alternative to traditional large-scale macroeconometric models when the theoretical and empirical support for those models became increasingly doubtful. Initial applications of the structural VAR methodology often were atheoretical in that users paid insufficient attention to the conditions required for identifying causal effects in the data. In response to ongoing questions about the validity of widely used identifying assumptions, the structural VAR literature has continuously evolved since the 1980s. This survey traces the evolution of this literature. It focuses on alternative approaches to the identification of structural shocks within the framework of a reduced-form VAR model, highlighting the conditions under which each approach is valid and discussing potential limitations of commonly employed methods.
    Keywords: Identification; Structural model; VAR
    JEL: C32 C51
    Date: 2011–08
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:8515&r=ecm
  7. By: Juan Carlos Aquino; Gabriel Rodríguez (Departamento de Economía - Pontificia Universidad Católica del Perú)
    Abstract: This paper analyzes and employs two versions of the Functional Central Limit Theorem within the framework of a unit root with a structural break. Initial attention is focused on the probabilistic structure of the time series to be considered. Later, attention is placed on the asymptotic theory for nonstationary time series proposed by Phillips (1987a), which is applied by Perron (1989) to study the effects of an (assumed) exogenous structural break on the power of the augmented Dickey-Fuller test and by Zivot and Andrews (1992) to criticize the exogeneity assumption and propose a method for estimating an endogenous breakpoint. A systematic method for dealing with efficiency issues is introduced by Perron and Rodríguez (2003), which extends the Generalized Least Squares detrending approach due to Elliott, Rothenberg, and Stock (1996).
    Keywords: Hypothesis Testing, Unit Root, Structural Break, Functional Central Limit Theorem, Weak Convergence, Wiener Process, Ornstein-Uhlenbeck Process
    JEL: C12 C22
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:pcp:pucwps:wp00319&r=ecm
  8. By: Johan H.L. Oud (Radboud University Nijmegen); Henk Folmer (University of Groningen); Roberto Patuelli (University of Lugano); Peter Nijkamp (VU University Amsterdam)
    Abstract: (Spatial) panel data are routinely modelled in discrete time (DT). However, there are compelling arguments for continuous time (CT) modelling of (spatial) panel data. In particular, most social processes evolve in CT, so that statistical analysis in DT is an oversimplification, gives an incomplete representation of reality and may lead to misinterpretation of estimation results. The most compelling reason for a CT approach is that, in contrast to DT modelling, it allows adequate modelling of dynamic adjustment processes. The paper introduces spatial dependence in a CT modelling framework. We propose a nonlinear Structural Equation Model (SEM) with latent variables for estimation of the Exact Discrete Model (EDM), which links the CT model parameters to the DT observations. The use of a SEM with latent variables makes it possible to take measurement errors in the variables into account, leading to a reduction of attenuation bias (i.e., disattenuation). The SEM-CT model with spatial dependence developed here is the first dynamic structural equation model with spatial dependence. The spatial econometric SEM-CT framework is illustrated on the basis of a simple regional labour market model for Germany made up of the endogenous state variables unemployment change and population change and of the exogenous input variables change in regional average wage and change in the structure of the manufacturing sector.
    Keywords: continuous-time modelling; structural equation modelling; latent variables; spatial dependence; panel data; disattenuation; measurement errors; unemployment change; population change; Germany
    JEL: C33 E24 O18 R11
    Date: 2011–08–09
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20110117&r=ecm
  9. By: Janczura, Joanna; Weron, Rafal
    Abstract: In this paper we propose a new goodness-of-fit testing scheme for the marginal distribution of regime-switching models. We consider models with an observable state process (as in threshold autoregressions) as well as a latent one (as in Markov regime-switching). The test is based on the Kolmogorov-Smirnov supremum-distance statistic and the concept of the weighted empirical distribution function. The motivation for this research comes from a recent stream of literature in energy economics concerning electricity spot price models. While the existence of distinct regimes in such data is generally unquestionable (due to the supply stack structure), the actual goodness-of-fit of the models requires statistical validation. We illustrate the proposed scheme by testing whether a commonly used Markov regime-switching model fits deseasonalized electricity prices from the NEPOOL (U.S.) day-ahead market.
    Keywords: Regime-switching; marginal distribution; goodness-of-fit; weighted empirical distribution function; Kolmogorov-Smirnov test
    JEL: C52 C12 Q40
    Date: 2011–07–09
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:32532&r=ecm
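The weighted-EDF idea behind the test can be sketched for a latent-regime case: each observation enters a regime's empirical distribution function with a weight equal to its regime probability, and the KS statistic is the sup distance to the candidate marginal CDF. The toy data, the fixed weights (which in practice would come from a smoothing/EM pass), and the function names below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(2)

def norm_cdf(z, mu=0.0, sigma=1.0):
    """Normal CDF via the error function (no SciPy needed)."""
    return 0.5 * (1.0 + erf((z - mu) / (sigma * sqrt(2.0))))

def weighted_ks(y, w, cdf):
    """KS sup distance between the weighted EDF and a candidate marginal CDF.
    Each observation contributes to the EDF with its regime weight."""
    order = np.argsort(y)
    y_s, w_s = y[order], w[order]
    edf = np.cumsum(w_s) / w_s.sum()  # weighted EDF evaluated at sorted points
    return float(np.max(np.abs(edf - np.array([cdf(v) for v in y_s]))))

# Toy two-regime sample: weights stand in for smoothed regime probabilities.
y = np.concatenate([rng.normal(0, 1, 300), rng.normal(4, 1, 100)])
w = np.concatenate([np.full(300, 0.95), np.full(100, 0.05)])

d_good = weighted_ks(y, w, lambda v: norm_cdf(v, 0, 1))  # candidate N(0,1)
d_bad = weighted_ks(y, w, lambda v: norm_cdf(v, 4, 1))   # candidate N(4,1)
print(d_good, d_bad)
```

Weighting by regime probabilities keeps observations likely generated by the other regime from contaminating the fit check; the correct candidate yields a much smaller sup distance than the wrong one.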
  10. By: Donald W.K. Andrews (Cowles Foundation, Yale University)
    Abstract: This paper shows that moment inequality tests that are asymptotically similar on the boundary of the null hypothesis exist, but have poor power. Hence, existing tests in the literature, which are asymptotically non-similar on the boundary, are not deficient. The results are obtained by first establishing results for the finite-sample multivariate normal one-sided testing problem. Then, these results are shown to have implications for more general moment inequality tests that are used in the literature on partial identification.
    Keywords: Moment inequality, One-sided test, Power, Similar, Test
    JEL: C12 C15
    Date: 2011–08
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1815&r=ecm
  11. By: Wagstaff, Adam; Wang, L. Choon
    Abstract: Inefficiency is commonplace, yet efforts to date to measure inefficiency and use it in benchmarking exercises aimed at improving provider performance have not been altogether satisfactory. This paper proposes a new approach that blends the themes of Data Envelopment Analysis and the Stochastic Frontier Approach to measure overall efficiency. The hybrid approach nonparametrically estimates inefficiency by comparing actual performance with comparable real-life "best practice" on the frontier and could be useful in exercises aimed at improving provider performance. Four applications in the education and health sectors are used to illustrate the features and strengths of this hybrid approach.
    Keywords: Health Monitoring & Evaluation, Health Systems Development & Reform, Tertiary Education, Disease Control & Prevention, Education For All
    Date: 2011–08–01
    URL: http://d.repec.org/n?u=RePEc:wbk:wbrwps:5751&r=ecm
  12. By: Mroz, T.; Picone, G.
    Abstract: This study develops a discrete multiple state duration model that allows for duration dependence, unmeasured heterogeneity, partial observability of the state and endogenous time-varying treatment. Our econometric strategy has numerous potential empirical applications. We apply our duration model to the progression of diabetic neuropathy, a complication of diabetes with four levels of progression, which if left untreated may lead to amputation. Our results show that the longer a person has diabetes without being diagnosed (and treated), the higher the probabilities of transitioning to a worse stage, death or amputation.
    Keywords: Multiple state duration model, endogenous treatment, discrete factors
    JEL: C41 C14 C51 I12
    Date: 2011–07
    URL: http://d.repec.org/n?u=RePEc:yor:hectdg:11/19&r=ecm
  13. By: Bilger, M.; Manning, W.G.
    Abstract: We start by proposing a new measure of overfitting expressed on the untransformed scale of the dependent variable, which is generally the scale of interest to the analyst. We then show that with nonlinear models, shrinkage due to overfitting gets confounded by shrinkage (or expansion) arising from model misspecification. Out-of-sample predictive calibration can in fact be expressed as in-sample calibration times 1 minus this new measure of overfitting. We finally argue that re-calibration should be performed on the scale of interest and provide both a simulation study and a real-data illustration based on health care expenditure data.
    Keywords: overfitting, shrinkage, misspecification, forecasting, health care expenditure
    Date: 2011–08
    URL: http://d.repec.org/n?u=RePEc:yor:hectdg:11/25&r=ecm
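The decomposition in the abstract, out-of-sample calibration equal to in-sample calibration times (1 − overfitting), can be made concrete with calibration slopes on a deliberately overfit toy model. The polynomial setup and the `calib_slope` helper are invented for illustration and are not the authors' estimator:

```python
import numpy as np

rng = np.random.default_rng(3)

def calib_slope(y, yhat):
    """Calibration slope: OLS coefficient from regressing actual y on
    predictions yhat (slope 1 = well calibrated, < 1 = shrinkage)."""
    yh = yhat - yhat.mean()
    return (yh @ (y - y.mean())) / (yh @ yh)

# Overfit a degree-9 polynomial on a small training sample, then check
# calibration on fresh data drawn from the same linear process.
n = 40
true_fn = lambda x: 1.0 + 0.5 * x
x_tr, x_te = rng.uniform(-2, 2, n), rng.uniform(-2, 2, 4000)
y_tr = true_fn(x_tr) + rng.normal(0, 1, n)
y_te = true_fn(x_te) + rng.normal(0, 1, 4000)

coefs = np.polyfit(x_tr, y_tr, 9)                # deliberately overfit
in_slope = calib_slope(y_tr, np.polyval(coefs, x_tr))
out_slope = calib_slope(y_te, np.polyval(coefs, x_te))
overfit = 1.0 - out_slope / in_slope             # the decomposition, rearranged
print(in_slope, out_slope, overfit)
```

By construction the in-sample slope of an OLS fit is 1, so any shortfall of the out-of-sample slope registers as overfitting; the abstract's point is that with nonlinear models this shortfall mixes genuine overfitting with misspecification-driven shrinkage or expansion, which is why they measure both on the raw scale.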

This nep-ecm issue is ©2011 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.