nep-ecm New Economics Papers
on Econometrics
Issue of 2021‒07‒12
fourteen papers chosen by
Sune Karlsson
Örebro universitet

  1. Breusch and Pagan’s (1980) Test Revisited By Jinyong Hahn; Ruoyao Shi
  2. Pooled Bewley Estimator of Long-Run Relationships in Dynamic Heterogenous Panels By Alexander Chudik; M. Hashem Pesaran; Ron P. Smith
  3. Semiparametric inference for partially linear regressions with Box-Cox transformation By Daniel Becker; Alois Kneip; Valentin Patilea
  4. Macroeconomic Forecasting with Large Stochastic Volatility in Mean VARs By Jamie L. Cross; Chenghan Hou; Gary Koop
  5. Generalized Spatial and Spatiotemporal ARCH Models By Philipp Otto; Wolfgang Schmid
  6. Estimation of Common Factors for Microstructure Noise and Efficient Price in a High-frequency Dual Factor Model By Li, Y-N.; Chen, J.; Linton, O.
  7. Detecting multiple generalized change-points by isolating single ones By Anastasiou, Andreas; Fryzlewicz, Piotr
  8. Instrumental Variable Network Difference-in-Differences (IV-NDID) Estimator: Model and Application By Dall’erba, Sandy; Chagas, André; Ridley, William; Xu, Yilan; Yuan, Lilin
  9. BAYESIAN MODEL AVERAGING FOR PROPENSITY SCORE MATCHING IN TAX REBATE By Riccardo Lucchetti; Luca Pedini; Claudia Pigini
  10. Searching for Hysteresis By Luca Benati; Thomas Lubik
  11. Adjustment coefficients and exact rational expectations in cointegrated vector autoregressive models By Soeren Johansen; Anders Rygh Swensen
  12. Binary Endogenous Treatment in Stochastic Frontier Models with an Application to Soil Conservation in El Salvador By Centorrino, Samuele; Perez Urdiales, María; Bravo-Ureta, Boris; Wall, Alan
  13. Volatility Bursts: A discrete-time option model with multiple volatility components By Francesca Lilla
  14. Policy Evaluation Using Causal Inference Methods By Denis Fougère; Nicolas Jacquemet

  1. By: Jinyong Hahn (Department of Economics, UCLA); Ruoyao Shi (Department of Economics, University of California Riverside)
    Abstract: We consider the local asymptotic power of Breusch and Pagan’s (1980) test in general nonlinear models. Although the test is motivated by random effects, we take fixed effects as the alternative hypothesis, derive the local power, and show that the test has power to detect fixed effects. We also examine how the estimation noise of the maximum likelihood estimator changes the asymptotic distribution of the test under the null, and show that this noise can be ignored when both n and T are large, which may prove convenient for applying the test to network models.
    Keywords: Lagrange multiplier test, fixed effects, local power, error component model
    Date: 2021–06
    URL: http://d.repec.org/n?u=RePEc:ucr:wpaper:202110&r=
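    The following minimal sketch computes the textbook Breusch and Pagan (1980) LM statistic for random effects in a linear balanced panel from pooled OLS residuals; it is only a reference point, since the paper's analysis concerns general nonlinear models, local power under fixed-effects alternatives, and the effect of MLE estimation noise. The simulated data and all parameter values are hypothetical.
```python
# Illustrative sketch: classical Breusch-Pagan (1980) LM test for random
# effects in a linear balanced panel (not the paper's nonlinear setting).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, T = 200, 6                                 # cross-section units and time periods
x = rng.normal(size=(n, T))
alpha = rng.normal(scale=0.5, size=(n, 1))    # individual effects are present, so the test should tend to reject
y = 1.0 + 2.0 * x + alpha + rng.normal(size=(n, T))

# pooled OLS residuals
X = np.column_stack([np.ones(n * T), x.ravel()])
beta = np.linalg.lstsq(X, y.ravel(), rcond=None)[0]
e = (y.ravel() - X @ beta).reshape(n, T)

# LM = nT / (2(T-1)) * [ sum_i (sum_t e_it)^2 / sum_it e_it^2 - 1 ]^2  ~  chi2(1) under H0
ratio = (e.sum(axis=1) ** 2).sum() / (e ** 2).sum()
LM = n * T / (2 * (T - 1)) * (ratio - 1) ** 2
pval = stats.chi2.sf(LM, df=1)
print(f"LM = {LM:.2f}, p-value = {pval:.4f}")
```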
  2. By: Alexander Chudik; M. Hashem Pesaran; Ron P. Smith
    Abstract: This paper, using the Bewley (1979) transformation of the autoregressive distributed lag model, proposes a pooled Bewley (PB) estimator of long-run coefficients for dynamic panels with heterogeneous short-run dynamics, in the same setting as the widely used Pooled Mean Group (PMG) estimator. The Bewley transform enables us to obtain an analytical closed-form expression for the PB estimator, which is not available under the maximum likelihood approach. This lets us establish asymptotic normality of PB as n,T→∞ jointly, allowing for applications with n and T large and of the same order of magnitude, but excluding panels where T is short relative to n. In contrast, the asymptotic distribution of the PMG estimator was obtained for fixed n and T→∞. Allowing both n and T to be large seems to be the more relevant empirical setting, as revealed by the numerous applications of the PMG estimator in the literature. Dynamic panel estimators are biased when T is not sufficiently large. Three bias corrections (simulation-based, split-panel jackknife, and a combined procedure) are investigated using Monte Carlo experiments, of which the combined procedure works best in reducing bias. In contrast to PMG, PB does not weight by estimated variances, which can make it more robust in small samples, though less efficient asymptotically. The PB estimator is illustrated with an application to the aggregate consumption function estimated in the original PMG paper.
    Keywords: Heterogeneous dynamic panels; I(1) regressors; pooled mean group estimator (PMG); Autoregressive-Distributed Lag model (ARDL); Bewley transform; bias correction; split-panel jackknife
    JEL: C12 C13 C23 C33
    Date: 2021–05–27
    URL: http://d.repec.org/n?u=RePEc:fip:feddgw:92809&r=
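    For reference, the Bewley (1979) transform the abstract refers to, written out for a heterogeneous ARDL(1,1) relation between y_it and a single regressor x_it; the notation is illustrative, and the paper's pooled estimator, bias corrections, and joint (n,T) asymptotics go beyond this sketch.
```latex
% ARDL(1,1) with heterogeneous coefficients and its long-run coefficient
y_{it} = c_i + \phi_i y_{i,t-1} + \beta_{0i} x_{it} + \beta_{1i} x_{i,t-1} + u_{it},
\qquad
\theta_i = \frac{\beta_{0i} + \beta_{1i}}{1 - \phi_i}.

% Bewley transform: the long-run coefficient \theta_i appears directly as the
% coefficient on x_{it}; \Delta y_{it} is typically instrumented by y_{i,t-1}.
y_{it} = \frac{c_i}{1 - \phi_i} + \theta_i x_{it}
       - \frac{\phi_i}{1 - \phi_i}\,\Delta y_{it}
       - \frac{\beta_{1i}}{1 - \phi_i}\,\Delta x_{it}
       + \frac{u_{it}}{1 - \phi_i}.
```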
  3. By: Daniel Becker (University of Bonn); Alois Kneip (University of Bonn); Valentin Patilea (CREST)
    Abstract: In this paper, a semiparametric partially linear model in the spirit of Robinson (1988) with a Box-Cox transformed dependent variable is studied. Transformation regression models are widely used in applied econometrics to avoid misspecification, while a partially linear semiparametric model is an intermediate strategy that seeks to balance the advantages and disadvantages of fully parametric and fully nonparametric models. Combining a transformation with a partially linear semiparametric model is therefore a natural strategy. The model parameters are estimated by a semiparametric extension of the so-called smooth minimum distance (SmoothMD) approach proposed by Lavergne and Patilea (2013). SmoothMD is suitable for models defined by conditional moment conditions and allows the variance of the error terms to depend on the covariates. In addition, we allow here for infinite-dimensional nuisance parameters. The asymptotic behavior of the new SmoothMD estimator is studied under general conditions and new inference methods are proposed. A simulation experiment illustrates the finite-sample performance of the methods.
    Date: 2021–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2106.10723&r=
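    For reference, the Box-Cox transformation and a generic partially linear specification of the kind described in the abstract; the paper's conditional moment conditions and the SmoothMD objective are not reproduced here.
```latex
% Box-Cox transformation of the dependent variable
T_\lambda(Y) =
  \begin{cases}
    \dfrac{Y^{\lambda} - 1}{\lambda}, & \lambda \neq 0,\\[1ex]
    \log Y, & \lambda = 0,
  \end{cases}

% generic partially linear regression for the transformed outcome,
% with parametric part X'\beta and nonparametric part g(Z)
T_\lambda(Y) = X^{\top}\beta + g(Z) + \varepsilon,
\qquad
\mathbb{E}[\varepsilon \mid X, Z] = 0.
```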
  4. By: Jamie L. Cross; Chenghan Hou; Gary Koop
    Abstract: Vector autoregressions with stochastic volatility in both the conditional mean and the conditional variance are commonly used to estimate the macroeconomic effects of uncertainty shocks. Despite their popularity, the intensive computational demands of estimating such models have made out-of-sample forecasting exercises impractical, particularly when working with large data sets. In this article, we propose an efficient Markov chain Monte Carlo (MCMC) algorithm for posterior and predictive inference in such models that makes these exercises feasible. The key insight underlying the algorithm is that the (log-)conditional densities of the log-volatilities have banded Hessian matrices. This enables us to build on recent advances in band and sparse matrix algorithms for state space models. In a simulation exercise, we evaluate the new algorithm numerically and establish its computational and statistical efficiency over a conventional particle-filter-based algorithm. Using macroeconomic data for the US, we find that such models generally deliver more accurate point and density forecasts than a conventional benchmark in which stochastic volatility enters only the variance of the model.
    Keywords: Bayesian VARs, Macroeconomic Forecasting, Stochastic Volatility in Mean, State Space Models, Uncertainty
    Date: 2021–06
    URL: http://d.repec.org/n?u=RePEc:bny:wpaper:0100&r=
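    As a reminder of what "stochastic volatility in mean" means in the simplest case, the toy simulation below generates a univariate process whose log-volatility follows an AR(1) and enters both the conditional mean and the conditional variance; the paper's large Bayesian VARs, banded-Hessian MCMC sampler, and forecasting exercise are not reproduced here, and all parameter values are hypothetical.
```python
# Toy simulation of a univariate stochastic-volatility-in-mean process.
import numpy as np

rng = np.random.default_rng(1)
T = 500
mu_h, phi_h, sigma_eta = -1.0, 0.95, 0.2   # log-volatility AR(1) parameters (hypothetical)
b0, alpha = 0.5, 0.3                       # intercept and "in-mean" coefficient (hypothetical)

h = np.empty(T)
y = np.empty(T)
h[0] = mu_h
for t in range(T):
    if t > 0:
        h[t] = mu_h + phi_h * (h[t - 1] - mu_h) + sigma_eta * rng.normal()
    # the volatility level feeds back into the conditional mean of y_t
    y[t] = b0 + alpha * np.exp(h[t]) + np.exp(h[t] / 2) * rng.normal()

print(y[:5])
```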
  5. By: Philipp Otto; Wolfgang Schmid
    Abstract: In time-series analysis, particularly in finance, generalized autoregressive conditional heteroscedasticity (GARCH) models are widely applied statistical tools for modelling volatility clusters (i.e., periods of increased or decreased risk). By contrast, modelling spatial dependence in the conditional second moments has so far received little attention, and only a few models have been proposed for capturing local clusters of increased risk. In this paper, we introduce a novel spatial GARCH process within a unified spatial and spatiotemporal GARCH framework, which also covers all previously proposed spatial ARCH models, exponential spatial GARCH, and time-series GARCH models. In contrast to previous spatiotemporal and time-series models, this spatial GARCH allows for instantaneous spill-overs across all spatial units. For this common modelling framework, estimators are derived based on a non-linear least-squares approach. Finally, the use of the model is demonstrated through a Monte Carlo simulation study and an empirical example on real estate prices from 1995 to 2014 across the ZIP-code areas of Berlin. A spatial autoregressive model is applied to the data to illustrate how locally varying model uncertainties (e.g., due to latent regressors) can be captured by spatial GARCH-type models.
    Date: 2021–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2106.10477&r=
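    The sketch below simulates a deliberately simplified spatiotemporal ARCH-type recursion in which conditional variances depend on lagged squared observations of spatial neighbours; it avoids the instantaneous spill-overs that are the paper's main contribution (these require a simultaneous-equations treatment) and uses a random weight matrix and hypothetical parameter values.
```python
# Simplified spatiotemporal ARCH-type recursion: the conditional variance of
# unit i at time t depends on the lagged squared observations of its neighbours.
import numpy as np

rng = np.random.default_rng(2)
N, T = 25, 200                              # spatial units and time periods
W = rng.random((N, N))
np.fill_diagonal(W, 0.0)
W /= W.sum(axis=1, keepdims=True)           # row-standardised spatial weights (hypothetical)

alpha0, alpha1 = 0.1, 0.5                   # hypothetical ARCH parameters
y = np.zeros((T, N))
for t in range(1, T):
    h_t = alpha0 + alpha1 * W @ (y[t - 1] ** 2)   # neighbourhood-driven volatility
    y[t] = np.sqrt(h_t) * rng.normal(size=N)
```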
  6. By: Li, Y-N.; Chen, J.; Linton, O.
    Abstract: We develop the Double Principal Component Analysis (DPCA) based on a dual factor structure for high-frequency intraday returns data contaminated with microstructure noise. The dual factor structure allows a factor structure for the microstructure noise in addition to the factor structure for efficient log-prices. We construct estimators of the factors for both efficient log-prices and microstructure noise, as well as their common components, and establish uniform consistency of these estimators as the number of assets and the sampling frequency go to infinity. In a Monte Carlo exercise, we compare our DPCA method to a PCA-VECM method. Finally, an empirical analysis of the intraday returns of S&P 500 Index constituents provides evidence of co-movement in the microstructure noise that is distinct from latent systematic risk factors.
    Keywords: Cointegration, Factor model, High-frequency data, Microstructure noise, Non-stationarity
    JEL: C10 C13 C14 C33 C38
    Date: 2021–06–30
    URL: http://d.repec.org/n?u=RePEc:cam:camdae:2150&r=
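    The illustration below simulates a dual factor structure (random-walk factors for efficient log-prices, stationary factors for microstructure noise) and applies a plain PCA via the SVD to the observed returns; the paper's DPCA estimator, its treatment of non-stationarity, and its asymptotic theory are considerably more involved. All dimensions and scales are hypothetical.
```python
# Stylized simulation of a dual factor structure and a plain PCA on observed returns.
import numpy as np

rng = np.random.default_rng(3)
n_assets, n_obs, k_price, k_noise = 100, 2000, 2, 1

F = rng.normal(scale=1e-3, size=(n_obs, k_price)).cumsum(axis=0)   # random-walk price factors
Lam = rng.normal(size=(n_assets, k_price))                         # price-factor loadings
G = rng.normal(scale=5e-4, size=(n_obs, k_noise))                  # stationary noise factors
Gam = rng.normal(size=(n_assets, k_noise))                         # noise-factor loadings

X = F @ Lam.T                                                      # efficient log-prices
U = G @ Gam.T + rng.normal(scale=2e-4, size=(n_obs, n_assets))     # microstructure noise
Y = X + U                                                          # observed log-prices

r = np.diff(Y, axis=0)                        # observed intraday returns
r_c = r - r.mean(axis=0)
_, s, _ = np.linalg.svd(r_c, full_matrices=False)
explained = s ** 2 / (s ** 2).sum()
print("variance shares of the leading principal components:", explained[:5].round(3))
```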
  7. By: Anastasiou, Andreas; Fryzlewicz, Piotr
    Abstract: We introduce a new approach, called Isolate-Detect (ID), for the consistent estimation of the number and locations of multiple generalized change-points in noisy data sequences. Examples of signal changes that ID can handle are changes in the mean of a piecewise-constant signal and changes, continuous or not, in the linear trend. The number of change-points can increase with the sample size. Our method is based on an isolation technique that prevents the consideration of intervals containing more than one change-point. This isolation enhances ID’s accuracy, as it allows detection in the presence of frequent changes of possibly small magnitude. In ID, model selection is carried out via thresholding, an information criterion, SDLL, or a hybrid involving the first two. The hybrid model selection leads to a general method with very good practical performance and minimal parameter choice. In the scenarios tested, ID is at least as accurate as the state-of-the-art methods, and most of the time it outperforms them. ID is implemented in the R packages IDetect and breakfast, available from CRAN.
    Keywords: segmentation; symmetric interval expansion; threshold criterion; Schwarz information criterion; SDLL
    JEL: C1
    Date: 2021–05–24
    URL: http://d.repec.org/n?u=RePEc:ehl:lserod:110258&r=
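    For orientation, the snippet below implements only the standard CUSUM contrast for a single change in mean; the Isolate-Detect methodology (available in the R packages IDetect and breakfast) combines such contrasts with interval isolation and the model selection rules described in the abstract, none of which is reproduced here.
```python
# Minimal CUSUM contrast for a single change in mean (not the ID algorithm).
import numpy as np

def cusum_changepoint(x):
    """Return the most likely single change-in-mean location and its contrast value."""
    x = np.asarray(x, dtype=float)
    n = x.size
    b = np.arange(1, n)                                 # candidate change locations
    left_mean = np.cumsum(x)[:-1] / b
    right_mean = (x.sum() - np.cumsum(x)[:-1]) / (n - b)
    contrast = np.sqrt(b * (n - b) / n) * np.abs(left_mean - right_mean)
    k = contrast.argmax()
    return b[k], contrast[k]

rng = np.random.default_rng(4)
signal = np.concatenate([np.zeros(300), np.full(200, 1.5)])   # true change at 300
bhat, stat = cusum_changepoint(signal + rng.normal(size=500))
print(f"estimated change-point: {bhat}, contrast: {stat:.2f}")
```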
  8. By: Dall’erba, Sandy (Dept. of Agricultural and Consumer Economics, University of Illinois at Urbana-Champaign); Chagas, André (Departamento de Economia, Universidade de São Paulo); Ridley, William (Dept. of Agricultural and Consumer Economics, University of Illinois at Urbana-Champaign); Xu, Yilan (Dept. of Agricultural and Consumer Economics, University of Illinois at Urbana-Champaign); Yuan, Lilin (School of Economics, Nankai University, China)
    Abstract: The difference-in-difference (DID) framework is now a well-accepted method in quasi-experimental research. However, DID does not account for treatment-induced changes to a network linking treated and control units. Our instrumental variable network DID methodology controls, first, for the endogeneity of the network to the treatment and, second, for the direct and indirect roles of the treatment on any network member. Monte Carlo simulations and an estimation of the impact of drought on global wheat trade and production demonstrate the performance of our new estimator. The results show that a DID that disregards the network and its changes significantly underestimates the overall treatment effects.
    Keywords: International Trade; Climate Change; Crop Yield.
    JEL: C21 F14 Q54
    Date: 2021–07–02
    URL: http://d.repec.org/n?u=RePEc:ris:nereus:2021_005&r=
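    A stylized two-way fixed-effects specification with a network-weighted treatment exposure term, included only to fix ideas; the notation is illustrative, and the endogenous network weights w_{ij,t}, their instruments, and the exact estimating equations are those developed in the paper, not shown here.
```latex
% stylized network DID: unit and time fixed effects, own treatment D_{it},
% and a network-weighted exposure to the treatment of connected units;
% the weights w_{ij,t} are endogenous to the treatment and hence instrumented.
y_{it} = \alpha_i + \lambda_t + \beta D_{it}
       + \gamma \sum_{j \neq i} w_{ij,t} D_{jt} + \varepsilon_{it}.
```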
  9. By: Riccardo Lucchetti (Dipartimento di Scienze Economiche e Sociali, Facoltà di Economia "Giorgio Fuà", Università Politecnica delle Marche); Luca Pedini (Dipartimento di Scienze Economiche e Sociali, Facoltà di Economia "Giorgio Fuà", Università Politecnica delle Marche); Claudia Pigini (Dipartimento di Scienze Economiche e Sociali, Facoltà di Economia "Giorgio Fuà", Università Politecnica delle Marche)
    Abstract: Propensity Score Matching is a popular approach to evaluating treatment effects in observational studies. However, when building the underlying propensity score model, practitioners often overlook the issue of model uncertainty and its consequences. We tackle this problem by Bayesian Model Averaging (BMA), with an application to the 2014 Italian tax credit reform (the so-called "Renzi bonus"). Model uncertainty has a great impact on the estimated treatment effects: BMA-based estimates point towards a significant effect of the rebate on food consumption only for liquidity-constrained households; conversely, model selection procedures sometimes produce results incompatible with the consumption smoothing hypothesis.
    Keywords: 2014 Italian Tax Credit Reform, Bayesian Model Averaging, Model Uncertainty, Propensity Score Matching, Reversible Jump Markov Chain Monte Carlo, Tax Rebate Policies
    JEL: C11 C52 D12
    Date: 2021–06
    URL: http://d.repec.org/n?u=RePEc:anc:wpaper:457&r=
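    A rough sketch of the idea of averaging propensity score models before matching: candidate logit specifications are weighted by exp(-BIC/2) and the averaged score is used for nearest-neighbour matching on simulated data. The paper performs full Bayesian Model Averaging via reversible-jump MCMC; BIC weights are only a crude stand-in, and all variable names and data below are hypothetical.
```python
# BIC-weighted averaging of propensity score models, followed by nearest-neighbour matching.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def bma_propensity(df, treat, candidate_sets):
    """Average propensity scores over candidate covariate sets using BIC weights."""
    scores, bics = [], []
    for cols in candidate_sets:
        X = sm.add_constant(df[list(cols)])
        res = sm.Logit(df[treat], X).fit(disp=0)
        bics.append(-2 * res.llf + res.params.size * np.log(len(df)))   # BIC by hand
        scores.append(np.asarray(res.predict(X)))
    w = np.exp(-0.5 * (np.array(bics) - min(bics)))
    w /= w.sum()
    return sum(wi * si for wi, si in zip(w, scores))

# hypothetical simulated data (true treatment effect = 1.0)
rng = np.random.default_rng(5)
n = 1000
df = pd.DataFrame({"x1": rng.normal(size=n), "x2": rng.normal(size=n), "x3": rng.normal(size=n)})
df["treat"] = (df["x1"] + 0.5 * df["x2"] + rng.normal(size=n) > 0).astype(int)
df["outcome"] = 1.0 * df["treat"] + df["x1"] + rng.normal(size=n)

ps = bma_propensity(df, "treat", [("x1",), ("x1", "x2"), ("x1", "x2", "x3")])
t_mask = df["treat"].values == 1
ps_t, ps_c = ps[t_mask], ps[~t_mask]
y_t, y_c = df["outcome"].values[t_mask], df["outcome"].values[~t_mask]
idx = np.abs(ps_c[None, :] - ps_t[:, None]).argmin(axis=1)   # nearest neighbour on the averaged score
att = (y_t - y_c[idx]).mean()
print(f"matched ATT estimate: {att:.3f}")
```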
  10. By: Luca Benati; Thomas Lubik
    Abstract: Taking a standard DSGE model as the data-generation process, we show via Monte Carlo experiments that reliably detecting hysteresis, defined as the presence of aggregate demand shocks with a permanent impact on output, is a significant challenge: model-consistent identification schemes (i) spuriously detect it with non-negligible probability when the data-generation process in fact features none, and (ii) have low power to discriminate between alternative extents of hysteresis. We propose a simple approach to testing for the presence of hysteresis and estimating its extent: specific statistics (e.g., the fraction of the frequency-zero variance of GDP due to hysteresis shocks) are simulated conditional on alternative extents of hysteresis imposed upon the VAR, and the resulting Monte Carlo distributions are then compared, via the Kullback-Leibler divergence, to the corresponding distributions computed from the actual data. Based on two alternative identification schemes, the evidence suggests that post-WWII U.S. data are compatible with the absence of hysteresis, although the most plausible estimate points towards a modest extent, equal to 7 per cent of the frequency-zero variance of GDP.
    Keywords: Hysteresis, permanent shocks, long-run restrictions, sign restrictions, Bayesian methods; Kullback-Leibler divergence
    JEL: E2 E3
    Date: 2021–05
    URL: http://d.repec.org/n?u=RePEc:ube:dpvwib:dp2107&r=
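    The snippet below illustrates only the distribution-comparison step: a discretised Kullback-Leibler divergence between two Monte Carlo distributions of a scalar statistic. The statistics themselves (e.g., the fraction of the frequency-zero variance of GDP due to hysteresis shocks) come from the paper's VAR machinery and are mocked up here with normal draws.
```python
# Discretised KL divergence between two Monte Carlo distributions of a statistic.
import numpy as np
from scipy.stats import entropy

rng = np.random.default_rng(6)
stat_data = rng.normal(0.07, 0.03, size=5000)   # mock distribution based on the actual data
stat_sim = rng.normal(0.00, 0.03, size=5000)    # mock distribution under an imposed extent of hysteresis

bins = np.linspace(min(stat_data.min(), stat_sim.min()),
                   max(stat_data.max(), stat_sim.max()), 60)
p, _ = np.histogram(stat_data, bins=bins, density=True)
q, _ = np.histogram(stat_sim, bins=bins, density=True)
eps = 1e-10                                     # avoid zero bins in the ratio
kl = entropy(p + eps, q + eps)                  # KL(P || Q) on the common discretised support
print(f"discretised KL divergence: {kl:.3f}")
```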
  11. By: Soeren Johansen; Anders Rygh Swensen (Department of Mathematics, University of Oslo)
    Abstract: In cointegrated vector autoregressive models, exact linear rational expectations relations can imply restrictions on the adjustment parameters. We show how such restrictions can be tested, in particular when they imply weak exogeneity of some variables.
    Keywords: Exact rational expectations; Cointegrated VAR model; Reduced rank regression; Adjustment coefficients
    JEL: C23
    Date: 2021–06
    URL: http://d.repec.org/n?u=RePEc:kud:kuiedp:2107&r=
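    For reference, the standard error-correction form behind the abstract, in which weak exogeneity of a block of variables corresponds to zero rows of the adjustment matrix α; the specific restrictions implied by exact rational expectations are derived in the paper.
```latex
% cointegrated VAR(k) in vector error-correction form
\Delta y_t = \alpha \beta' y_{t-1}
           + \sum_{i=1}^{k-1} \Gamma_i \Delta y_{t-i}
           + \mu + \varepsilon_t,

% weak exogeneity of the second block of variables: zero rows in \alpha
\alpha = \begin{pmatrix} \alpha_1 \\ 0 \end{pmatrix}.
```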
  12. By: Centorrino, Samuele; Perez Urdiales, María; Bravo-Ureta, Boris; Wall, Alan
    Keywords: Production Economics, Research Methods/ Statistical Methods
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:ags:aesc21:312058&r=
  13. By: Francesca Lilla
    Abstract: I propose an affine discrete-time model, called the Vector Autoregressive Gamma with volatility Bursts (VARG-B) model, in which volatility experiences, in addition to frequent small changes, periods of sudden and extreme movements generated by a latent factor that evolves according to the Autoregressive Gamma Zero process. A key advantage of the discrete-time specification is that it makes it possible to estimate the model via the Extended Kalman Filter. Moreover, the VARG-B model leads to a fully analytic conditional Laplace transform and, as a result, to a closed-form option pricing formula. When estimated on S&P 500 index options and returns, the new model provides more accurate option pricing and modelling of the implied volatility surface than some alternative models.
    Keywords: volatility bursts, ARG-zero, option pricing, Kalman filter, realized volatility
    JEL: C13 G12 G13
    Date: 2021–06
    URL: http://d.repec.org/n?u=RePEc:bdi:wptemi:td_1336_21&r=
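    Since the abstract highlights estimation via the Extended Kalman Filter, the sketch below spells out one generic EKF prediction-update step; the state transition and measurement equations of the VARG-B model are not reproduced, and the functions f and h below are hypothetical placeholders.
```python
# Generic extended Kalman filter step (prediction + measurement update).
import numpy as np

def ekf_step(x, P, y, f, F_jac, h, H_jac, Q, R):
    """One EKF iteration: x, P are the prior state mean and covariance."""
    # prediction
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q
    # measurement update
    H = H_jac(x_pred)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)            # Kalman gain
    x_new = x_pred + K @ (y - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# hypothetical one-dimensional example: AR(1)-type state, nonlinear observation
f = lambda x: 0.9 * x
F_jac = lambda x: np.array([[0.9]])
h = lambda x: np.exp(x)
H_jac = lambda x: np.array([[np.exp(x[0])]])
x, P = np.array([0.0]), np.eye(1)
x, P = ekf_step(x, P, y=np.array([1.2]), f=f, F_jac=F_jac, h=h, H_jac=H_jac,
                Q=0.1 * np.eye(1), R=0.05 * np.eye(1))
print(x, P)
```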
  14. By: Denis Fougère (CNRS - Centre National de la Recherche Scientifique; OSC - Observatoire sociologique du changement, Sciences Po; LIEPP - Laboratoire interdisciplinaire d'évaluation des politiques publiques, Sciences Po; CEPR - Center for Economic Policy Research; IZA - Institute of Labor Economics); Nicolas Jacquemet (PSE - Paris School of Economics; UP1 - Université Paris 1 Panthéon-Sorbonne; CES - Centre d'économie de la Sorbonne)
    Abstract: This chapter describes the main impact evaluation methods, both experimental and quasi-experimental, and the statistical model underlying them. It presents some of the most important methodological advances recently put forward in this field of research. We emphasize not only the need to pay particular attention to the accuracy of the estimated effects, but also the importance of replicating evaluations, whether experimental or quasi-experimental, in order to distinguish false positives from proven effects.
    Keywords: Causal inference,Evaluation methods,Causal effects
    Date: 2021
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:hal-03098058&r=

This nep-ecm issue is ©2021 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.