nep-ecm New Economics Papers
on Econometrics
Issue of 2020‒08‒10
eleven papers chosen by
Sune Karlsson
Örebro universitet

  1. Predicting the global minimum variance portfolio By Reh, Laura; Krüger, Fabian; Liesenfeld, Roman
  2. Intertemporal Collective Household Models: Identification in Short Panels with Unobserved Heterogeneity in Resource Shares By Irene Botosaru; Chris Muris; Krishna Pendakur
  3. Agnostic Output Gap Estimation and Decomposition in Large Cross-Sections By Florian Eckert; Samad Sarferaz
  4. Fully Distribution-free Center-outward Rank Tests for Multiple-output Regression and Manova By Marc Hallin; Daniel Hlubinka; Sarka Hudecova
  5. Bootstrap Bartlett Adjustment for Hypotheses Testing on Cointegrating Vectors. By Canepa, Alessandra
  6. Testing Financial Hierarchy Based on A PDQ-CRE Model By Zongwu Cai; Meng Shi; Yue Zhao; Wuqing Wu
  7. Partially Linear Models with Endogeneity: a conditional moment based approach By Bertille Antoine; Xiaolin Sun
  8. Smoothing time fixed effects By Gösser, Niklas; Moshgbar, Nima
  9. Bounding Program Benefits When Participation is Misreported By Denni Tommasi; Lina Zhang
  10. Inference in Differences-in-Differences with Few Treated Units and Spatial Correlation By Bruno Ferman
  11. Difference-in-Difference Hedonics By Banzhaf, H. Spencer

  1. By: Reh, Laura; Krüger, Fabian; Liesenfeld, Roman
    Abstract: We propose a novel dynamic approach to forecast the weights of the global minimum variance portfolio (GMVP). The GMVP weights are the population coefficients of a linear regression of a benchmark return on a vector of return differences. This representation enables us to derive a consistent loss function from which we can infer the optimal GMVP weights without imposing any distributional assumptions on the returns. In order to capture time variation in the returns' conditional covariance structure, we model the portfolio weights through a recursive least squares (RLS) scheme as well as by generalized autoregressive score (GAS) type dynamics. Sparse parameterizations combined with targeting towards nonlinear shrinkage estimates of the long-run GMVP weights ensure scalability with respect to the number of assets. An empirical analysis of daily and monthly financial returns shows that the proposed models perform well in- and out-of-sample in comparison to existing approaches.
    Keywords: Consistent loss function, Elicitability, Forecasting, Generalized autoregressive score, Nonlinear shrinkage, Recursive least squares
    JEL: C14 C32 C51 C53 C58 G11 G17
    Date: 2020
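  The regression representation of the GMVP weights mentioned in the abstract (due to Kempf and Memmel, 2006) can be checked numerically. The sketch below is illustrative and not from the paper: it simulates returns, computes the plug-in GMVP weights from the sample covariance, and recovers the same weights by regressing a benchmark return on return differences.

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 500, 4
Sigma = np.diag([1.0, 2.0, 3.0, 4.0]) + 0.3      # positive-definite covariance
returns = rng.multivariate_normal(np.zeros(N), Sigma, size=T)

# Closed-form plug-in GMVP weights: w = S^{-1} 1 / (1' S^{-1} 1)
S = np.cov(returns, rowvar=False)
w_closed = np.linalg.solve(S, np.ones(N))
w_closed /= w_closed.sum()

# Regression representation: regress the benchmark return r_N on the
# differences (r_N - r_j), j = 1..N-1, with an intercept.  The slope on
# (r_N - r_j) is the GMVP weight of asset j; the benchmark asset's weight
# is one minus the sum of the slopes.
r_bench = returns[:, -1]
X = np.column_stack([np.ones(T), r_bench[:, None] - returns[:, :-1]])
beta = np.linalg.lstsq(X, r_bench, rcond=None)[0]
w_reg = np.append(beta[1:], 1.0 - beta[1:].sum())

print(np.allclose(w_reg, w_closed))  # the two estimates coincide in-sample
```

  The identity holds because minimizing the regression's residual sum of squares is exactly minimizing the sample variance of a portfolio whose weights sum to one.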
  2. By: Irene Botosaru; Chris Muris; Krishna Pendakur
    Abstract: We provide a new full-commitment intertemporal collective household model to estimate resource shares, defined as the fraction of household expenditure enjoyed by household members. Our model implies nonlinear time-varying household quantity demand functions that depend on fixed effects. We provide new econometric results showing identification of a large class of models that includes our household model. We cover fixed-T panel models where the response variable is an unknown monotonic function of a linear latent variable with fixed effects, regressors, and a nonparametric error term. The function may be weakly monotonic and time-varying, and the fixed effects are unrestricted. We identify the structural parameters and features of the distribution of fixed effects. In our household model, these correspond to features of the distribution of resource shares. Using Bangladeshi data, we show: women’s resource shares decline with household budgets; and, half the variation in women’s resource shares is due to unobserved heterogeneity.
    Keywords: panel data; fixed effects; incidental parameter; time-varying transformation function; collective household; full commitment; resource shares; gender inequality
    JEL: C14 C23 C41
    Date: 2020–06
  3. By: Florian Eckert (KOF Swiss Economic Institute, ETH Zurich, Switzerland); Samad Sarferaz (KOF Swiss Economic Institute, ETH Zurich, Switzerland)
    Abstract: This paper uses a Bayesian non-stationary dynamic factor model to extract common trends and cycles from large datasets. An important but often neglected feature of Bayesian statistics makes it possible to treat stationary and non-stationary time series equally in terms of parameter estimation. Based on this feature we show how to extract common trends and cycles from the data by ex-post processing the posterior output, and describe how to derive an agnostic output gap measure. We apply the procedure to a large panel of quarterly time series that covers 158 macroeconomic and financial series for the United States. We find that our derived output gap measure tracks the U.S. business cycle well, exhibiting a high correlation with alternative estimates of the output gap. Since the factors are extracted from a comprehensive dataset, the resulting output gap estimates are stable at the current edge and can be decomposed in a new and meaningful way.
    Keywords: Non-Stationary Dynamic Factor Model, Potential Output Estimation, Output Gap Decomposition
    JEL: C11 C32 E32
    Date: 2019–12
  4. By: Marc Hallin; Daniel Hlubinka; Sarka Hudecova
    Abstract: Extending rank-based inference to a multivariate setting such as multiple-output regression or MANOVA with unspecified d-dimensional error density has remained an open problem for more than half a century. None of the many solutions proposed so far is enjoying the combination of distribution-freeness and efficiency that makes rank-based inference a successful tool in the univariate setting. A concept of center-outward multivariate ranks and signs based on measure transportation ideas has been introduced recently. Center-outward ranks and signs are not only distribution-free but achieve in dimension d > 1 the (essential) maximal ancillarity property of traditional univariate ranks, hence carry all the “distribution-free information" available in the sample. We derive here the Hájek representation and asymptotic normality results required in the construction of center-outward rank tests for multiple-output regression and MANOVA. When based on appropriate spherical scores, these fully distribution-free tests achieve parametric efficiency in the corresponding models.
    Keywords: Multivariate ranks; Multivariate signs; Multiple output regression; MANOVA; Rank test; Hájek representation
    Date: 2020–07
  5. By: Canepa, Alessandra (University of Turin)
    Abstract: Johansen's (2000) Bartlett correction factor for the LR test of linear restrictions on cointegrating vectors is derived under the i.i.d. Gaussian assumption for the innovation terms. However, the distributions of most financial variables are fat-tailed and often skewed, so there is a need for small-sample inference procedures that require weaker assumptions on the innovation term. This paper suggests that using a non-parametric bootstrap to approximate a Bartlett-type correction provides a statistic that does not require specification of the innovation distribution and can be used by applied econometricians to perform small-sample inference that is less computationally demanding than estimating the p-value of the observed statistic.
    Date: 2020–03
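  The bootstrap Bartlett-type adjustment can be illustrated on a much simpler testing problem than cointegration. The sketch below is an illustrative assumption, not the paper's procedure: it applies the idea to a Gaussian LR test of a zero mean, rescaling the statistic so its bootstrap-estimated mean matches the mean of its chi-square limit.

```python
import numpy as np

def lr_stat(x):
    # Gaussian LR statistic for H0: mean = 0, variance unknown
    T = len(x)
    s2_restricted = np.mean(x ** 2)                   # variance MLE under H0
    s2_unrestricted = np.mean((x - x.mean()) ** 2)    # unrestricted MLE
    return T * np.log(s2_restricted / s2_unrestricted)

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, size=25)                     # deliberately small sample
lr = lr_stat(x)

# Bootstrap Bartlett-type adjustment: estimate E[LR] under the null by
# resampling from the recentered data, then rescale LR so its mean matches
# the mean (= df) of its chi-square limit.
B, df = 999, 1
x0 = x - x.mean()                                     # impose H0 on the bootstrap world
lr_boot = np.array([lr_stat(rng.choice(x0, size=len(x0), replace=True))
                    for _ in range(B)])
lr_adj = lr * df / lr_boot.mean()

# Compare both statistics to the chi-square(1) 5% critical value, 3.841
print(f"LR = {lr:.3f}, Bartlett-adjusted LR = {lr_adj:.3f}")
```

  Because only the bootstrap mean of the statistic is needed (rather than the full bootstrap p-value at every desired significance level), this adjustment is the computationally lighter route the abstract alludes to.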
  6. By: Zongwu Cai (Department of Economics, The University of Kansas, Lawrence, KS 66045, USA); Meng Shi (Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing, Beijing 100190, and School of Mathematical Sciences, University of Chinese Academy of Sciences, Beijing, Beijing 100049, China); Yue Zhao (School of Business, Renmin University of China, Beijing, Beijing 100872, China); Wuqing Wu (School of Business, Renmin University of China, Beijing, Beijing 100872, China)
    Abstract: This paper investigates the relative importance of internal and external sources of funds in financing activities across different levels of investment activity by proposing a panel data quantile regression model with correlated random effects (PDQ-CRE), which accounts for heteroscedasticity across both firm-specific individual effects and the distribution of investment. A new estimation method is proposed using the quasi-likelihood function for the conditional quantile model and a Laplace approximation. We show that the proposed estimator is consistent and normally distributed. A Monte Carlo simulation is conducted to examine the performance of the estimator in finite samples. Finally, the empirical results provide strong evidence that the financing hierarchy of U.S. firms is in accordance with the first rung of the pecking order theory across all levels of investment from 10% to 90%.
    Keywords: Correlated Random Effects; Panel Data; Pecking Order Theory; Quantile Regression Model; Quasi-Maximum Likelihood Estimator
    JEL: C33 C31
    Date: 2020–07
  7. By: Bertille Antoine (Simon Fraser University); Xiaolin Sun (Simon Fraser University)
    Abstract: In a partially linear conditional moment model, we propose a new estimator for the slope parameter of the endogenous variable of interest which combines a Robinson’s transformation (Robinson (1988)), to partial out the non-linear part of the model, with a smooth minimum distance approach (Lavergne and Patilea (2013)), to exploit all the information of the conditional mean independence restriction. Our estimator is easy to compute, consistent and √n-asymptotically normal under standard regularity conditions. Simulations show that our estimator is competitive with GMM-type estimators, and often displays a smaller bias and variance, as well as better coverage rates for confidence intervals. We revisit and extend some of the empirical results in Dinkelman (2011) who estimates the impact of electrification on employment growth in South Africa: overall, we obtain estimates that are smaller in magnitude, more precise, and still economically relevant.
    Keywords: Robinson’s transformation; Conditional mean independence; Nonlinearity; Minimum distance estimation; Instrument.
    JEL: C13 C21 C51 D04
    Date: 2020–07
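  The Robinson (1988) partialling-out step that the estimator builds on can be sketched in a few lines. The code below is a minimal illustration under assumed data (an exogenous partially linear model with a Nadaraya-Watson smoother); the paper's smooth minimum distance treatment of endogeneity is not shown.

```python
import numpy as np

def nw_smooth(z, v, h):
    # Nadaraya-Watson estimate of E[v | z] with a Gaussian kernel, bandwidth h
    w = np.exp(-0.5 * ((z[:, None] - z[None, :]) / h) ** 2)
    return w @ v / w.sum(axis=1)

rng = np.random.default_rng(2)
n, beta_true = 2000, 1.5
z = rng.uniform(-2, 2, n)
x = np.sin(z) + rng.normal(0, 1, n)                # regressor related to z
y = beta_true * x + z ** 2 + rng.normal(0, 1, n)   # g(z) = z^2 plays the "unknown" part

# Robinson's transformation: partial E[.|z] out of y and x, then regress
# the residualized y on the residualized x to estimate the slope.
h = 0.3
y_res = y - nw_smooth(z, y, h)
x_res = x - nw_smooth(z, x, h)
beta_hat = (x_res @ y_res) / (x_res @ x_res)
print(beta_hat)  # close to beta_true = 1.5
```

  Partialling out removes the nonparametric component g(z) without ever estimating its functional form, which is why the slope estimate remains root-n consistent.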
  8. By: Gösser, Niklas; Moshgbar, Nima
    Abstract: Controlling for time fixed effects in analyses of longitudinal data by means of time-dummy variables has long been a standard tool in every applied econometrician's toolbox. In order to obtain unbiased estimates, time fixed effects are typically put forward to control for macroeconomic shocks and are (almost) automatically included when longitudinal data are analyzed. However, the toolbox contains no standard method to control for time fixed effects when time-dummy variables are not applicable. A number of empirical applications both suffer from bias when time is omitted and cannot use time dummies. This paper introduces a simple and readily available parametric approach to approximate time fixed effects when time-dummy variables are not applicable. Using Monte Carlo simulations, we show that under certain regularity conditions, trend polynomials (smoothing time fixed effects) yield consistent estimates by controlling for time fixed effects, even in cases where time-dummy variables are inapplicable. As the introduced approach implies testing nested hypotheses, a standard testing procedure enables identification of the order of the trend polynomial. Applications that may suffer considerably from bias when time fixed effects are neglected include cartel overcharge estimations, merger and regulation analyses, and analyses of economic and financial crises. These applications typically divide time into event and control periods, so that standard time dummies may not be applicable due to perfect multicollinearity. At the same time, their estimates of interest most crucially need to be purged of other (unobserved) time-dependent factors to be consistent, as time may by construction induce omitted-variable bias.
    Date: 2020
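  The basic mechanism can be shown in a stylized simulation (an illustrative assumption, not the paper's Monte Carlo design): when the regressor of interest drifts with time and a smooth macro trend is omitted, OLS is biased, while adding a low-order trend polynomial absorbs the time effect.

```python
import numpy as np

rng = np.random.default_rng(3)
n_units, n_periods, beta_true = 100, 40, 2.0
t = np.tile(np.arange(n_periods), n_units) / n_periods   # scaled time index per observation
f_t = -3 * t                                             # smooth, unobserved macro trend
d = (rng.random(n_units * n_periods) + 0.5 * t > 0.75).astype(float)  # regressor drifts with time
y = beta_true * d + f_t + rng.normal(0, 1, len(t))

def ols_slope(y, d, controls):
    # OLS slope on d with an intercept and the given time controls
    X = np.column_stack([np.ones(len(y)), d] + controls)
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

b_naive = ols_slope(y, d, [])              # time omitted: omitted-variable bias
b_poly = ols_slope(y, d, [t, t ** 2])      # quadratic trend polynomial absorbs f_t
print(b_naive, b_poly)                     # b_poly is close to 2.0; b_naive is not
```

  In applications where event and control periods make full time dummies perfectly collinear with the regressor of interest, the polynomial terms remain estimable because they vary smoothly within both period types.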
  9. By: Denni Tommasi; Lina Zhang
    Abstract: In empirical research, correctly measuring the benefits of welfare interventions is highly relevant for policymakers as well as academic researchers. Unfortunately, endogenous program participation is often misreported in survey data, and standard instrumental variable techniques are not sufficient to point identify and consistently estimate the effects of interest. In this paper, we focus on the weighted average of local average treatment effects (LATE) and (i) derive a simple relationship between the causal parameter and the identifiable parameter that can be recovered from the observed data, (ii) provide an instrumental variable method to partially identify the heterogeneous treatment effects, and (iii) formalize a strategy for incorporating administrative data on the misclassification probabilities of treated individuals to further tighten the bounds. Finally, we use our method to reassess the benefits of participating in the 401(k) pension plan on savings.
    Keywords: heterogeneous treatment effects, causality, binary treatment, endogenous measurement error, discrete or multiple instruments, weighted average of LATEs, endogeneity, program evaluation
    JEL: C14 C21 C26 C35 C51
    Date: 2020
  10. By: Bruno Ferman
    Abstract: We consider the problem of inference in Differences-in-Differences models when there are few treated units and errors are spatially correlated. We first show that, when there is a single treated unit, existing inference methods designed for settings with few treated and many control units remain asymptotically valid when errors are strongly mixing. However, these methods are invalid with more than one treated unit. We propose an asymptotically valid, though generally conservative, inference method for settings with more than one treated unit.
    Date: 2020–06
  11. By: Banzhaf, H. Spencer
    Abstract: Traditional cross-sectional estimates of hedonic price functions theoretically can recover marginal willingness to pay for characteristics, but face endogeneity problems when some characteristics are unobserved. To help overcome such problems, economists have introduced difference-in-differences and other quasi-experimental econometric methods into the hedonic model. Unfortunately, the welfare interpretation of the estimands has not been clear. This paper shows that, when these methods condition on baseline data, they identify the "average direct unmediated effect" on prices from a change in characteristics. It further shows that this effect is a lower bound on welfare, specifically Hicksian equivalent surplus plus the change in profits. The paper illustrates these results with an application to toxic facilities' effects on housing prices.
    Keywords: Hedonic Pricing, Housing Markets, Differentiated Products, Nonlinear Pricing
    JEL: D4 D6 Q5 R3
    Date: 2019–08

This nep-ecm issue is ©2020 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found on its homepage. For comments, please write to the director of NEP, Marco Novarese. Put “NEP” in the subject line, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.